Researchers in the National Exposure Research Laboratory (NERL) have performed a number of large human exposure measurement studies during the past decade. It is the goal of the NERL to make the data available to other researchers for analysis in order to further the scientific ...
Environment/Health/Safety (EHS): Databases
Hazard Documents Database; Biosafety Authorization System; CATS (Corrective Action Tracking System) (for findings 12/2005 to present); Chemical Management System; Electrical Safety; Ergonomics Database; Lessons Learned / Best Practices; REMS (Radiation Exposure Monitoring System); SJHA Database - Subcontractor Job
EXPOSURES AND INTERNAL DOSES OF ...
The National Center for Environmental Assessment (NCEA) has released a final report that presents and applies a method to estimate distributions of internal concentrations of trihalomethanes (THMs) in humans resulting from residential drinking water exposure. The report presents simulations of oral, dermal, and inhalation exposures and demonstrates the feasibility of linking the US EPA's Information Collection Rule database with other databases on external exposure factors and with physiologically based pharmacokinetic modeling to refine population-based estimates of exposure. Review Draft: by 2010, develop scientifically sound data and approaches to assess and manage risks to human health posed by exposure to specific regulated waterborne pathogens and chemicals, including those addressed by the Arsenic, M/DBP, and Six-Year Review Rules.
Zilaout, Hicham; Vlaanderen, Jelle; Houba, Remko; Kromhout, Hans
2017-07-01
In 2000, a prospective Dust Monitoring Program (DMP) was started in which measurements of workers' exposure to respirable dust and quartz are collected in member companies of the European Industrial Minerals Association (IMA-Europe). After 15 years, the resulting IMA-DMP database allows a detailed overview of exposure levels of respirable dust and quartz over time within this industrial sector. Our aim is to describe the IMA-DMP and the current state of the corresponding database, which, owing to the continuation of the IMA-DMP, is still growing. The future use of the database is also highlighted, including its utility for the industrial minerals producing sector. Exposure data are obtained following a common protocol comprising a standardized sampling strategy, standardized sampling and analytical methods, and a data management system. Following strict quality control procedures, exposure data are subsequently added to a central database. The data comprise personal exposure measurements together with auxiliary information on work and other conditions during sampling. Currently, the IMA-DMP database consists of almost 28,000 personal measurements performed from 2000 to 2015, representing 29 half-yearly sampling campaigns. The exposure data have been collected from 160 different worksites owned by 35 industrial mineral companies and come from 23 European countries and approximately 5000 workers. The IMA-DMP database provides the European minerals sector with reliable data on workers' personal exposure to respirable dust and quartz. The database can be used as a powerful tool to address outstanding scientific questions on long-term exposure trends and exposure variability and, importantly, as a surveillance tool to evaluate exposure control measures. The database will also be valuable for future epidemiological studies of respiratory health effects and will allow estimation of quantitative exposure-response relationships.
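Exposure databases of this kind are typically summarized by campaign-level geometric means, the usual summary for lognormally distributed exposure data. As a minimal illustrative sketch (the campaign labels and concentrations below are invented, not IMA-DMP data):

```python
import math
from collections import defaultdict

# Hypothetical personal measurements: (sampling campaign, respirable quartz, mg/m3).
measurements = [
    ("2000-1", 0.080), ("2000-1", 0.120), ("2000-2", 0.095),
    ("2007-1", 0.040), ("2007-1", 0.055), ("2015-2", 0.020),
]

def geometric_means(samples):
    """Group concentrations by campaign and return each group's geometric mean."""
    groups = defaultdict(list)
    for campaign, conc in samples:
        groups[campaign].append(conc)
    return {c: math.exp(sum(math.log(x) for x in vals) / len(vals))
            for c, vals in sorted(groups.items())}

gms = geometric_means(measurements)
```

Plotting such campaign summaries against time is one straightforward way to examine the long-term exposure trends the database is intended to support.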
Copyright © 2017 The Authors. Published by Elsevier GmbH. All rights reserved.
Clinical Databases for Chest Physicians.
Courtwright, Andrew M; Gabriel, Peter E
2018-04-01
A clinical database is a repository of patient medical and sociodemographic information focused on one or more specific health conditions or exposures. Although clinical databases may be used for research purposes, their primary goal is to collect and track patient data for quality improvement, quality assurance, and/or actual clinical management. This article aims to provide an introduction and practical advice on the development of small-scale clinical databases for chest physicians and practice groups. Through example projects, we discuss the pros and cons of available technical platforms, including Microsoft Excel and Access, relational database management systems such as Oracle and PostgreSQL, and Research Electronic Data Capture (REDCap). We consider approaches to deciding the base unit of data collection, creating consensus around variable definitions, and structuring routine clinical care to complement database aims. We conclude with an overview of regulatory and security considerations for clinical databases. Copyright © 2018 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.
Surveillance of occupational noise exposures using OSHA's Integrated Management Information System.
Middendorf, Paul J
2004-11-01
Exposure to noise has long been known to cause hearing loss and is a ubiquitous problem in workplaces. Occupational noise exposures for industries stored in the Occupational Safety and Health Administration's (OSHA) Integrated Management Information System (IMIS) can be used to identify temporal and industrial trends of noise exposure and to anticipate changes in rates of hearing loss. The noise records in OSHA's IMIS database for 1979-1999 were extracted by major industry division and measurement criteria. The noise exposures were summarized by year, industry, and employment size. The majority of records are from Manufacturing and Services. Exposures in Manufacturing and Services decreased during the period, except that PEL exposures measured by federal enforcement increased from 1995 to 1999. Noise exposures in manufacturing have been reduced since the late 1970s, except those documented by federal enforcement. Noise exposure data outside manufacturing are not well represented in IMIS. Copyright 2004 Wiley-Liss, Inc.
Egbring, Marco; Kullak-Ublick, Gerd A; Russmann, Stefan
2010-01-01
To develop a software solution that supports management and clinical review of patient data from electronic medical records databases or claims databases for pharmacoepidemiological drug safety studies. We used open-source software to build a data management system and an internet application with a Flex client on a Java application server with a MySQL database backend. The application is hosted on Amazon Elastic Compute Cloud. This solution, named Phynx, supports data management, Web-based display of electronic patient information, and interactive review of patient-level information in the individual clinical context. The system was applied to a dataset from the UK General Practice Research Database (GPRD). Our solution can be set up and customized with limited programming resources, and there is almost no extra cost for software. Access times are short, the displayed information is structured in chronological order and visually attractive, and selected information such as drug exposure can be blinded. External experts can review patient profiles and save evaluations and comments via a common Web browser. Phynx provides a flexible and economical solution for patient-level review of electronic medical information from databases, taking the individual clinical context into account. It can therefore make an important contribution to the efficient validation of outcome assessment in drug safety database studies.
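The blinding feature mentioned above, hiding selected fields (such as drug exposure) from an outcome reviewer while keeping the chronological profile intact, can be sketched in a few lines. This is a hypothetical illustration of the general idea, not the Phynx implementation; the field names are invented:

```python
# Fields to mask before an external expert reviews the profile (hypothetical).
BLINDED = {"drug_exposure"}

def blind_profile(events, blinded=BLINDED):
    """Return a chronologically sorted copy of a patient's event list with
    blinded fields masked so the reviewer cannot see them."""
    masked = []
    for event in sorted(events, key=lambda e: e["date"]):
        masked.append({k: ("***" if k in blinded else v) for k, v in event.items()})
    return masked

profile = [
    {"date": "2009-03-01", "drug_exposure": "drug A", "lab": "ALT 35"},
    {"date": "2009-01-10", "drug_exposure": "drug B", "lab": "ALT 22"},
]
review_view = blind_profile(profile)
```

Masking at display time rather than deleting the data preserves the underlying record for the unblinded analysis stage.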
Seifert, Steven A; Oakes, Jennifer A; Boyer, Leslie V
2007-01-01
Non-native (exotic) snake exposures in the United States have not been systematically characterized. The Toxic Exposure Surveillance System (TESS) database of the American Association of Poison Control Centers was analyzed to quantify the number and types, demographic associations, clinical presentations, managements and outcomes, and health resource utilization of non-native snake exposures. From 1995 through 2004, there were 399 non-native snake exposures in the TESS database. Of these, 350 snakes (87%) were identified by genus and species, comprising at least 77 different varieties. Roughly equal percentages of snakes originated in Asia, Africa, and Latin America, with smaller numbers from the Middle East, Australia, and Europe. Nearly half were viperids, and a little more than a third were elapids. The vast majority of exposed individuals were adults; however, almost 15% were aged 17 years or younger, and almost 7% were children aged 5 years or younger. Eighty-four percent were males. The vast majority of exposures occurred at the victim's own residence. Over 50% were evaluated at a healthcare facility, with 28.7% admitted to an ICU. Overall, 26% of patients were coded as receiving antivenom treatment. Coded outcomes were similar between viperid and elapid envenomations. There were three deaths, two involving viperid snakes and one an elapid. Enhancements to the TESS database are required for more precise and complete characterization of non-native snake envenomations.
Cowan, Dallas M; Cheng, Thales J; Ground, Matthew; Sahmel, Jennifer; Varughese, Allysha; Madl, Amy K
2015-08-01
The United States Occupational Safety and Health Administration (OSHA) maintains the Chemical Exposure Health Data (CEHD) and Integrated Management Information System (IMIS) databases, which contain quantitative and qualitative data resulting from compliance inspections conducted from 1984 to 2011. This analysis aimed to evaluate trends in workplace asbestos concentrations over time and across industries by combining the samples from these two databases. From 1984 to 2011, personal air samples ranged from 0.001 to 175 f/cc. Asbestos compliance sampling data associated with the construction, automotive repair, manufacturing, and chemical/petroleum/rubber industries included measurements in excess of 10 f/cc and above the permissible exposure limit from 2001 to 2011. The utility of combining the databases was limited by the completeness and accuracy of the data recorded; in this analysis, 40% of the data overlapped between the two databases. Other limitations included sampling bias associated with compliance sampling and errors arising from user-entered data. A clear decreasing trend in both airborne fiber concentrations and the number of asbestos samples collected parallels historically decreasing trends in the consumption of asbestos and declining mesothelioma incidence rates. Although air sampling data indicated that airborne fiber exposure potential was high (>10 f/cc for short- and long-term samples) in some industries (e.g., construction, manufacturing), airborne concentrations have declined significantly over the past 30 years. Recommendations for improving the existing OSHA exposure databases are provided. Copyright © 2015. Published by Elsevier Inc.
Mbaeyi, Chukwuma; Panlilio, Adelisa L; Hobbs, Cynthia; Patel, Priti R; Kuhar, David T
2012-10-01
Occupational exposure management is an important element in preventing the transmission of bloodborne pathogens in health care settings. In 2008, the US Centers for Disease Control and Prevention conducted a survey to assess procedures for managing occupational bloodborne pathogen exposures in outpatient dialysis facilities in the United States. The study was a cross-sectional survey of 339 randomly selected outpatient dialysis facilities drawn from the 2006 US end-stage renal disease database. Facility characteristics examined were hospital affiliation (free-standing vs hospital-based facilities), profit status (for-profit vs not-for-profit facilities), and number of health care personnel (≥100 vs <100). Outcomes were exposures to hepatitis B virus (HBV), hepatitis C virus (HCV), and human immunodeficiency virus (HIV), and provision of HBV and HIV postexposure prophylaxis. We calculated the proportion of facilities reporting occupational bloodborne pathogen exposures and offering occupational exposure management services, and analyzed bloodborne pathogen exposures and provision of postexposure prophylaxis by facility type. Nearly all respondents (99.7%) had written policies, and 95% provided occupational exposure management services to health care personnel during the daytime on weekdays, but services were provided infrequently during other periods of the week. Approximately 10%-15% of facilities reported HIV, HBV, or HCV exposures in health care personnel in the 12 months prior to the survey, but inconsistencies were noted in procedures for managing such exposures. Although 86% of facilities provided HIV prophylaxis for exposed health care personnel, only 37% designated a primary HIV postexposure prophylaxis regimen. For-profit and free-standing facilities reported fewer exposures but were less likely to offer HBV prophylaxis or to designate a primary HIV postexposure prophylaxis regimen than not-for-profit and hospital-based facilities.
The survey response rate was low (37%) and familiarity of individuals completing the survey with facility policies or national guidelines could not be ascertained. Significant improvements are required in the implementation of guidelines for managing occupational exposures to bloodborne pathogens in outpatient dialysis facilities. Published by Elsevier Inc.
Dell’Acqua, F.; Gamba, P.; Jaiswal, K.
2012-01-01
This paper discusses spatial aspects of the global exposure dataset and mapping needs for earthquake risk assessment. We discuss this in the context of development of a Global Exposure Database for the Global Earthquake Model (GED4GEM), which requires compilation of a multi-scale inventory of assets at risk, for example, buildings, populations, and economic exposure. After defining the relevant spatial and geographic scales of interest, different procedures are proposed to disaggregate coarse-resolution data, to map them, and if necessary to infer missing data by using proxies. We discuss the advantages and limitations of these methodologies and detail the potentials of utilizing remote-sensing data. The latter is used especially to homogenize an existing coarser dataset and, where possible, replace it with detailed information extracted from remote sensing using the built-up indicators for different environments. Present research shows that the spatial aspects of earthquake risk computation are tightly connected with the availability of datasets of the resolution necessary for producing sufficiently detailed exposure. The global exposure database designed by the GED4GEM project is able to manage datasets and queries of multiple spatial scales.
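The disaggregation step described above, spreading a coarse regional total over finer grid cells using a remote-sensing proxy such as built-up area, reduces to a proportional allocation. A minimal sketch under that assumption (the numbers are invented for illustration, not GED4GEM data):

```python
def disaggregate(regional_total, builtup_fractions):
    """Allocate a coarse-resolution total (e.g., building count) to grid cells
    in proportion to each cell's built-up fraction from remote sensing."""
    weight_sum = sum(builtup_fractions)
    if weight_sum == 0:
        return [0.0] * len(builtup_fractions)  # no proxy signal: nothing to allocate
    return [regional_total * w / weight_sum for w in builtup_fractions]

# A district of 10,000 buildings spread over four cells with known built-up fractions.
cells = disaggregate(10_000, [0.6, 0.3, 0.1, 0.0])
```

The allocation conserves the regional total by construction, which is the key property for a multi-scale exposure inventory: aggregating the fine grid back up reproduces the coarse data.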
Pathobiology and management of laboratory rodents administered CDC category A agents.
He, Yongqun; Rush, Howard G; Liepman, Rachel S; Xiang, Zuoshuang; Colby, Lesley A
2007-02-01
The Centers for Disease Control and Prevention Category A infectious agents include Bacillus anthracis (anthrax), Clostridium botulinum toxin (botulism), Yersinia pestis (plague), variola major virus (smallpox), Francisella tularensis (tularemia), and the filoviruses and arenaviruses that induce viral hemorrhagic fevers. These agents are regarded as having the greatest potential for adverse impact on public health and therefore are a focus of renewed attention in infectious disease research. Frequently rodent models are used to study the pathobiology of these agents. Although much is known regarding naturally occurring infections in humans, less is documented on the sources of exposures and potential risks of infection to researchers and animal care personnel after the administration of these hazardous substances to laboratory animals. Failure to appropriately manage the animals can result both in the creation of workplace hazards if human exposures occur and in disruption of the research if unintended animal exposures occur. Here we review representative Category A agents, with a focus on comparing the biologic effects in naturally infected humans and rodent models and on considerations specific to the management of infected rodent subjects. The information reviewed for each agent has been curated manually and stored in a unique Internet-based database system called HazARD (Hazards in Animal Research Database, http://helab.bioinformatics.med.umich.edu/hazard/) that is designed to assist researchers, administrators, safety officials, Institutional Biosafety Committees, and veterinary personnel seeking information on the management of risks associated with animal studies involving hazardous substances.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Munganahalli, D.
Sedco Forex is a drilling contractor that operates approximately 80 rigs on land and offshore worldwide. The HSE management system developed by Sedco Forex is an effort to prevent accidents and minimize losses. An integral part of the HSE management system is establishing risk profiles and thereby minimizing risk and reducing loss exposures. Risk profiles are established based on accident reports, potential accident reports, and other risk identification reports (RIR) such as the Du Pont STOP system. A rig could fill in as many as 30 accident reports, 30 potential accident reports, and 500 STOP cards each year. Statistics are important for an HSE management system, since they are indicators of success or failure of HSE systems. It is, however, difficult to establish risk profiles based on statistical information unless tools are available at the rig site to aid with the analysis. Risk profiles are then used to identify important areas in the operation that may require specific attention to minimize the loss exposure. Programs to address the loss exposure can then be identified and implemented with either a local or corporate approach. In January 1995, Sedco Forex implemented a uniform HSE Database on all the rigs worldwide. In one year companywide, the HSE database would contain information on approximately 500 accident and potential accident reports and 10,000 STOP cards. This paper demonstrates the salient features of the database and describes how it has helped in establishing key risk profiles. It also shows a recent example of how risk profiles have been established at the corporate level and used to identify the key contributing factors to hand and finger injuries. Based on this information, a campaign was launched to minimize the frequency of occurrence and associated loss attributed to hand and finger accidents.
Tennant, David Robin; Bruyninckx, Chris
2018-03-01
Consumer exposure assessments for food additives are incomplete without information about the proportion of foods in each authorised category that contain the additive. Such information has been difficult to obtain, but the Mintel Global New Products Database (GNPD) provides information about product launches across Europe over the past 20 years. These data can be searched to identify products with specific additives listed on product labels, and the numbers compared with total product launches for food and drink categories in the same database to determine the frequency of occurrence. There are uncertainties associated with the data, but these can be managed by adopting a cautious and conservative approach. GNPD data can be mapped to authorised food categories and to the food descriptions used in the EFSA Comprehensive European Food Consumption Surveys Database for exposure modelling. The data, when presented as percent occurrence, could be incorporated quantitatively into the EFSA ANS Panel's 'brand-loyal'/'non-brand-loyal' exposure model. Case studies of preservative, antioxidant, colour, and sweetener additives showed that the impact of including occurrence data is greatest in the non-brand-loyal scenario. Recommendations for future research include identifying occurrence data for alcoholic beverages, linking regulatory food codes, FoodEx, and GNPD product descriptions, developing the use of occurrence data for carry-over foods, and improving understanding of brand loyalty in consumer exposure models.
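The occurrence adjustment described above can be sketched with invented numbers (these are not EFSA or GNPD figures, and the scaling shown is a simplified reading of the brand-loyal/non-brand-loyal distinction): in a non-brand-loyal scenario, mean intake from a category scales with the fraction of products containing the additive, whereas a brand-loyal consumer of a product that does contain it is assumed exposed at the full category intake.

```python
def occurrence(launches_with_additive, total_launches):
    """Fraction of launched products in a category listing the additive."""
    return launches_with_additive / total_launches

def adjusted_intake(category_intake_mg, occ, brand_loyal):
    """Scale category intake by occurrence in the non-brand-loyal scenario;
    leave it unscaled for a brand-loyal consumer of a containing product."""
    return category_intake_mg if brand_loyal else category_intake_mg * occ

occ = occurrence(120, 480)                            # 25% of launches list it
non_loyal = adjusted_intake(8.0, occ, brand_loyal=False)
loyal = adjusted_intake(8.0, occ, brand_loyal=True)
```

The sketch makes the case-study finding above intuitive: occurrence data bites hardest in the non-brand-loyal scenario, because that is where intake is scaled down by the occurrence fraction.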
Can control banding be useful for the safe handling of nanomaterials? A systematic review
NASA Astrophysics Data System (ADS)
Eastlake, Adrienne; Zumwalde, Ralph; Geraci, Charles
2016-06-01
Control banding (CB) is a risk management strategy that has been used to identify and recommend exposure control measures for potentially hazardous substances for which toxicological information is limited. The application of CB and the level of expertise required for implementation and management can differ depending on knowledge of the hazard potential, the likelihood of exposure, and the ability to verify the effectiveness of exposure control measures. A number of different strategies have been proposed for using CB in workplaces where exposure to engineered nanomaterials (ENMs) can occur. However, it is unclear whether the use of CB can effectively reduce worker exposure to nanomaterials. A systematic review of studies was conducted to answer the question "can control banding be useful to ensure adequate controls for the safe handling of nanomaterials?" A variety of databases were searched to identify relevant studies pertaining to CB. Database search terms included 'control,' 'hazard,' 'exposure,' and 'risk' banding, as well as the use of these terms in the context of nanotechnology or nanomaterials. Other potentially relevant studies were identified during the review of articles obtained in the systematic review process. Identification of studies and extraction of data were conducted independently by the reviewers. Quality of the studies was assessed using the Methodological Index for Non-Randomized Studies (MINORS). The quality of the evidence was evaluated using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach. A total of 235 records were identified in the database search, of which 70 were determined to be eligible for full-text review. Only two studies were identified that met the inclusion criteria. These studies evaluated the application of the CB Nanotool in workplaces where ENMs were being handled.
A total of 32 different nanomaterial handling activities were evaluated in these studies by comparing the exposure controls recommended using CB to existing exposure controls previously recommended by an industrial hygienist. The selection of exposure controls using CB was consistent with those recommended by an industrial hygienist for 19 of 32 (59.4%) job activities. A higher level of exposure control was recommended for nine of 32 (28.1%) job activities using CB, while four of 32 (12.5%) job activities had in-place exposure controls that were more stringent than those recommended using CB. After evaluation using GRADE, the evidence indicated that the CB Nanotool can recommend exposure controls for many ENM job activities that are consistent with those recommended by an experienced industrial hygienist. The use of CB for reducing exposures to ENMs has the potential to be an effective risk management strategy when information on the health risks of a nanomaterial is limited and/or there is no occupational exposure limit. However, there remains a lack of evidence to conclude that the use of CB can provide adequate exposure control in all work environments. Additional validation work is needed to provide more data to support the use of CB for the safe handling of ENMs.
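The concordance figures above are simple proportions of the 32 evaluated job activities; a quick arithmetic check (the category labels are ours, not the review's) confirms the reported percentages:

```python
def shares(counts):
    """Percentage share of each category out of the total count."""
    total = sum(counts.values())
    return {name: 100 * n / total for name, n in counts.items()}

# 19 consistent, 9 where CB recommended stricter controls,
# 4 where the in-place controls were stricter than CB's recommendation.
counts = {"consistent": 19, "CB stricter": 9, "in-place stricter": 4}
pct = shares(counts)  # 59.375, 28.125, 12.5 -> reported as 59.4%, 28.1%, 12.5%
```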
Rattner, B.A.; Pearson, J.L.; Golden, N.H.; Cohen, J.B.; Erwin, R.M.; Ottinger, M.A.
2000-01-01
In order to examine the condition of biota in Atlantic coast estuaries, a "Contaminant Exposure and Effects--Terrestrial Vertebrates" (CEE-TV) database has been compiled through computerized search of the published literature, review of existing databases, and solicitation of unpublished reports from conservation agencies, private groups, and universities. Summary information has been entered into the database, including species, collection date (1965-present), site coordinates, estuary name, hydrologic unit catalogue code, sample matrix, contaminant concentrations, biomarker and bioindicator responses, and reference source, utilizing a 98-field character and numeric format. Currently, the CEE-TV database contains 3699 georeferenced records representing 190 vertebrate species and >145,000 individuals residing in estuaries from Maine through Florida. This relational database can be directly queried or imported into a Geographic Information System to examine spatial patterns, identify data gaps and areas of concern, generate hypotheses, and focus ecotoxicological field assessments. Information on birds made up the vast majority (83%) of the database, with only a modicum of data on amphibians, reptiles, and mammals. Of the >75,000 chemical compounds in commerce, only 118 commonly measured environmental contaminants were quantified in tissues of terrestrial vertebrates. There were no CEE-TV data records in 15 of the 67 estuaries located along the Atlantic coast and Florida Gulf coast. The CEE-TV database has a number of potential applications, including focusing biomonitoring efforts to generate critically needed ecotoxicological data in the numerous "gaps" along the coast, reducing uncertainty about contaminant risk, identifying areas for mitigation, restoration, or special management, and ranking the ecological condition of estuaries.
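The "data gap" application mentioned above amounts to comparing an estuary inventory against the set of estuaries that actually have records. A minimal sketch of that query (the estuary names and records below are examples, not the CEE-TV inventory):

```python
# Hypothetical inventory of Atlantic coast estuaries of interest.
atlantic_estuaries = {"Penobscot Bay", "Chesapeake Bay", "Pamlico Sound",
                      "Indian River Lagoon", "Tampa Bay"}

# Hypothetical georeferenced monitoring records.
records = [
    {"estuary": "Chesapeake Bay", "species": "Haliaeetus leucocephalus"},
    {"estuary": "Tampa Bay", "species": "Pelecanus occidentalis"},
]

def data_gaps(inventory, recs):
    """Estuaries in the inventory with no monitoring records at all."""
    covered = {r["estuary"] for r in recs}
    return sorted(inventory - covered)

gaps = data_gaps(atlantic_estuaries, records)
```

In a GIS workflow the same set difference would be computed spatially, but the logic, inventory minus covered sites, is identical.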
Conwell, J L; Creek, K L; Pozzi, A R; Whyte, H M
2001-02-01
The Industrial Hygiene and Safety Group at Los Alamos National Laboratory (LANL) developed a database application known as IH DataView, which manages industrial hygiene monitoring data. IH DataView replaces a LANL legacy system, IHSD, which restricted user access to a single point of data entry, needed enhancements to support new operational requirements, and was not Year 2000 (Y2K) compliant. IH DataView features a comprehensive suite of data collection and tracking capabilities. Through the use of Oracle database management and application development tools, the system is Y2K compliant and Web enabled for easy deployment and user access via the Internet. System accessibility is particularly important because LANL operations are spread over 43 square miles, and industrial hygienists (IHs) located across the laboratory will use the system. IH DataView eliminates the legacy system's problems and shows promise of continued usefulness: it has a flexible architecture and sophisticated capabilities to collect, track, and analyze data in an easy-to-use form.
Burstyn, I; Kromhout, H; Cruise, P J; Brennan, P
2000-01-01
The objective of this project was to construct a database of exposure measurements to be used to retrospectively assess the intensity of various exposures in an epidemiological study of cancer risk among asphalt workers. The database was developed as a stand-alone Microsoft Access 2.0 application that could work in each of the national centres. Exposure data included in the database comprised measurements of exposure levels plus supplementary information on production characteristics, analogous to that used to describe companies enrolled in the study. The database has been successfully implemented in eight countries, demonstrating flexibility and data security features adequate to the task. It allowed retrieval and consistent coding of 38 data sets, of which 34 had never been described in the peer-reviewed scientific literature. We were able to collect most of the data intended: as of February 1999, the database consisted of 2007 sets of measurements from persons or locations. The measurements appeared to be free from any obvious bias. The methodology embodied in the creation of the database can be usefully employed to develop exposure assessment tools in epidemiological studies.
Impact of generalist care managers on patients with diabetes.
Dorr, David A; Wilcox, Adam; Donnelly, Steven M; Burns, Laurie; Clayton, Paul D
2005-10-01
To determine how the addition of generalist care managers and collaborative information technology to an ambulatory team affects the care of patients with diabetes. The setting was multiple ambulatory clinics within Intermountain Health Care (IHC), a large integrated delivery network. A retrospective cohort study comparing diabetic patients treated by generalist care managers with matched controls was completed. Patients in the exposure group had one or more contacts with a care manager; controls were matched on utilization, demographics, testing, and baseline glucose control. Using role-specific information technology to support their efforts, care managers assessed patients' readiness for change, followed guidelines, and educated and motivated patients. Patient data collected as part of an electronic patient record were combined with care manager-created databases to assess timely testing of glycosylated hemoglobin (HbA1c) and low-density lipoprotein (LDL) levels and changes in LDL and HbA1c levels. In a multivariable model, the odds of being overdue for HbA1c testing decreased by 21 percent in the exposure group (n=1,185) versus the control group (n=4,740). The odds of being tested when overdue for HbA1c or LDL increased by 49 and 26 percent, respectively, and the odds of HbA1c <7.0 percent increased by 19 percent in the exposure group. Average HbA1c levels decreased more in the exposure group than in the controls. The effect on LDL was not significant. Generalist care managers using computer-supported diabetes management helped increase adherence to guidelines for testing and control of HbA1c levels, leading to improved health status of patients with diabetes.
Rattner, B.A.; Pearson, J.L.; Garrett, L.J.; Erwin, R.M.; Walz, A.; Ottinger, M.A.; Barrett, H.R.
1997-01-01
The Biomonitoring of Environmental Status and Trends (BEST) program of the Department of the Interior focuses on identifying and understanding the effects of contaminant stressors on biological resources under its stewardship. Despite the desire of many to continuously monitor the environmental health of our estuaries, much can be learned by summarizing existing temporal, geographic, and phylogenetic contaminant information. To this end, retrospective contaminant exposure and effects data for amphibians, reptiles, birds, and mammals residing within 30 km of Atlantic coast estuaries are being assembled through searches of published literature (e.g., Fisheries Review, Wildlife Review, BIOSIS Previews) and databases (e.g., US EPA Ecological Incident Information System; USGS Diagnostic and Epizootic Databases), and compilation of summary data from unpublished reports of government natural resource agencies, private conservation groups, and universities. These contaminant exposure and effect data for terrestrial vertebrates (CEE-TV) are being summarized using Borland dBASE in a 96-field format, including species, collection time and site coordinates, sample matrix, contaminant concentration, biomarker and bioindicator responses, and source of information (N>1500 records). The CEE-TV database has been imported into the ARC/INFO geographic information system (GIS) to examine geographic coverage and trends and to identify critical data gaps. A preliminary risk assessment will be conducted to identify and characterize contaminants and other stressors potentially affecting terrestrial vertebrates that reside in, migrate through, or reproduce in these estuaries. Evaluations are underway, using specific measurement and assessment endpoints, to rank and prioritize estuarine ecosystems in which terrestrial vertebrates are potentially at risk, for purposes of prediction and focusing future biomonitoring efforts.
Lee, Hunjoo; Lee, Kiyoung; Park, Ji Young; Min, Sung-Gi
2017-05-01
With support from the Korean Ministry of the Environment (ME), our interdisciplinary research staff developed the COnsumer Product Exposure and Risk assessment system (COPER). This system includes various databases and features that enable the calculation of exposure and the determination of risk arising from consumer product use. COPER is divided into three tiers: the integrated database layer (IDL), the domain specific service layer (DSSL), and the exposure and risk assessment layer (ERAL). IDL is organized by the form of the raw data (mostly non-aggregated data) and includes four sub-databases: a toxicity profile, an inventory of Korean consumer products, the weight fractions of chemical substances in consumer products determined by chemical analysis, and national representative exposure factors. DSSL provides web-based information services corresponding to each database within IDL. Finally, ERAL enables risk assessors to perform various exposure and risk assessments, including exposure scenario design for either inhalation or dermal contact, by using and organizing each database in an intuitive manner. This paper outlines the overall architecture of the system and highlights some of the unique features of COPER, including its visual, dynamic rendering engine for web-based exposure assessment modeling.
Goldsmith, M-R; Grulke, C M; Brooks, R D; Transue, T R; Tan, Y M; Frame, A; Egeghy, P P; Edwards, R; Chang, D T; Tornero-Velez, R; Isaacs, K; Wang, A; Johnson, J; Holm, K; Reich, M; Mitchell, J; Vallero, D A; Phillips, L; Phillips, M; Wambaugh, J F; Judson, R S; Buckley, T J; Dary, C C
2014-03-01
Consumer products are a primary source of chemical exposures, yet little structured information is available on the chemical ingredients of these products and the concentrations at which ingredients are present. To address this data gap, we created a database of chemicals in consumer products using product Material Safety Data Sheets (MSDSs) publicly provided by a large retailer. The resulting database represents 1797 unique chemicals mapped to 8921 consumer products and a hierarchy of 353 consumer product "use categories" within a total of 15 top-level categories. We examine the utility of this database and discuss ways in which it will support (i) exposure screening and prioritization, (ii) generic or framework formulations for several indoor/consumer product exposure modeling initiatives, (iii) candidate chemical selection for monitoring near field exposure from proximal sources, and (iv) as activity tracers or ubiquitous exposure sources using "chemical space" map analyses. Chemicals present at high concentrations and across multiple consumer products and use categories that hold high exposure potential are identified. Our database is publicly available to serve regulators, retailers, manufacturers, and the public for predictive screening of chemicals in new and existing consumer products on the basis of exposure and risk. Published by Elsevier Ltd.
Release of ToxCastDB and ExpoCastDB databases
EPA has released two databases - the Toxicity Forecaster database (ToxCastDB) and a database of chemical exposure studies (ExpoCastDB) - that scientists and the public can use to access chemical toxicity and exposure data. ToxCastDB users can search and download data from over 50...
Sauvé, Jean-François; Beaudry, Charles; Bégin, Denis; Dion, Chantal; Gérin, Michel; Lavoué, Jérôme
2012-09-01
A quantitative determinants-of-exposure analysis of respirable crystalline silica (RCS) levels in the construction industry was performed using a database compiled from an extensive literature review. Statistical models were developed to predict work-shift exposure levels by trade. Monte Carlo simulation was used to recreate exposures derived from summarized measurements which were combined with single measurements for analysis. Modeling was performed using Tobit models within a multimodel inference framework, with year, sampling duration, type of environment, project purpose, project type, sampling strategy and use of exposure controls as potential predictors. 1346 RCS measurements were included in the analysis, of which 318 were non-detects and 228 were simulated from summary statistics. The model containing all the variables explained 22% of total variability. Apart from trade, sampling duration, year and strategy were the most influential predictors of RCS levels. The use of exposure controls was associated with an average decrease of 19% in exposure levels compared to none, and increased concentrations were found for industrial, demolition and renovation projects. Predicted geometric means for year 1999 were the highest for drilling rig operators (0.238 mg m(-3)) and tunnel construction workers (0.224 mg m(-3)), while the estimated exceedance fraction of the ACGIH TLV by trade ranged from 47% to 91%. The predicted geometric means in this study indicated important overexposure compared to the TLV. However, the low proportion of variability explained by the models suggests that the construction trade is only a moderate predictor of work-shift exposure levels. The impact of the different tasks performed during a work shift should also be assessed to provide better management and control of RCS exposure levels on construction sites.
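The Monte Carlo step described above, recreating individual exposures from summarized measurements so they can be pooled with single measurements, can be sketched as follows. This is a hypothetical illustration assuming each summary reports a geometric mean, GSD, and sample size; it is not the authors' actual code.

```python
import math
import random

random.seed(42)  # reproducible draws for the illustration

def simulate_from_summary(gm, gsd, n):
    """Draw n lognormal exposure values consistent with a reported
    geometric mean (gm) and geometric standard deviation (gsd)."""
    mu, sigma = math.log(gm), math.log(gsd)
    return [random.lognormvariate(mu, sigma) for _ in range(n)]

# e.g. a summarized record: GM = 0.05 mg/m3, GSD = 2.5, n = 12 samples
simulated = simulate_from_summary(0.05, 2.5, 12)
```

The simulated values can then be appended to the set of individual measurements before modeling, which is the essence of the pooling strategy described.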
Peters, Susan; Vermeulen, Roel; Olsson, Ann; Van Gelder, Rainer; Kendzia, Benjamin; Vincent, Raymond; Savary, Barbara; Williams, Nick; Woldbæk, Torill; Lavoué, Jérôme; Cavallo, Domenico; Cattaneo, Andrea; Mirabelli, Dario; Plato, Nils; Dahmann, Dirk; Fevotte, Joelle; Pesch, Beate; Brüning, Thomas; Straif, Kurt; Kromhout, Hans
2012-01-01
SYNERGY is a large pooled analysis of case-control studies on the joint effects of occupational carcinogens and smoking in the development of lung cancer. A quantitative job-exposure matrix (JEM) will be developed to assign exposures to five major lung carcinogens [asbestos, chromium, nickel, polycyclic aromatic hydrocarbons (PAH), and respirable crystalline silica (RCS)]. We assembled an exposure database, called ExpoSYN, to enable such a quantitative exposure assessment. Existing exposure databases were identified and European and Canadian research institutes were approached to identify pertinent exposure measurement data. Results of individual air measurements were entered anonymized according to a standardized protocol. The ExpoSYN database currently includes 356 551 measurements from 19 countries. In total, 140 666 personal and 215 885 stationary data points were available. Measurements were distributed over the five agents as follows: RCS (42%), asbestos (20%), chromium (16%), nickel (15%), and PAH (7%). The measurement data cover the time period from 1951 to present. However, only a small portion of measurements (1.4%) were performed prior to 1975. The major contributing countries for personal measurements were Germany (32%), UK (22%), France (14%), and Norway and Canada (both 11%). ExpoSYN is a unique occupational exposure database with measurements from 18 European countries and Canada covering a time period of >50 years. This database will be used to develop a country-, job-, and time period-specific quantitative JEM. This JEM will enable data-driven quantitative exposure assessment in a multinational pooled analysis of community-based lung cancer case-control studies.
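A country-, job-, and time-period-specific JEM of the kind described above can be thought of as a lookup table mapping those three keys to agent-specific exposure estimates. The sketch below is purely illustrative; the keys, agents, and values are invented, not ExpoSYN data.

```python
# Hypothetical JEM: (country, job, period) -> {agent: estimate, ...}
jem = {
    ("DE", "welder", "1990-1999"): {"chromium": 0.8, "nickel": 0.5},
    ("UK", "miner", "1980-1989"): {"rcs": 0.12},
}

def assign_exposure(country, job, period, agent):
    """Look up the JEM estimate for a study subject's job history
    entry; returns None when the JEM cell is empty."""
    return jem.get((country, job, period), {}).get(agent)
```

In a pooled case-control analysis, each reported job spell would be translated into such a key and assigned the corresponding quantitative estimate.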
HEDS - EPA DATABASE SYSTEM FOR PUBLIC ACCESS TO HUMAN EXPOSURE DATA
Human Exposure Database System (HEDS) is an Internet-based system developed to provide public access to human-exposure-related data from studies conducted by EPA's National Exposure Research Laboratory (NERL). HEDS was designed to work with the EPA Office of Research and Devel...
THE HUMAN EXPOSURE DATABASE SYSTEM (HEDS)-PUTTING THE NHEXAS DATA ON-LINE
The EPA's National Exposure Research Laboratory (NERL) has developed an Internet accessible Human Exposure Database System (HEDS) to provide the results of NERL human exposure studies to both the EPA and the external scientific communities. The first data sets that will be ava...
Magin, Parker; Pond, Dimity; Smith, Wayne; Watson, Alan
2005-02-01
Lay perceptions that diet, hygiene and sunlight exposure are strongly associated with acne causation and exacerbation are common but at variance with the consensus of current dermatological opinion. The objective of this study was to carry out a review of the literature to assess the evidence for diet, face-washing and sunlight exposure in acne management. Original studies were identified by searches of the Medline, EMBASE, AMED (Allied and Complementary Medicine), CINAHL, Cochrane, and DARE databases. Methodological information was extracted from identified articles but, given the paucity of high quality studies found, no studies were excluded from the review on methodological grounds. Given the prevalence of lay perceptions, and the confidence of dermatological opinion in rebutting these perceptions as myths and misconceptions, surprisingly little evidence exists for the efficacy or lack of efficacy of dietary factors, face-washing and sunlight exposure in the management of acne. Much of the available evidence has methodological limitations. Based on the present state of evidence, clinicians cannot be didactic in their recommendations regarding diet, hygiene and face-washing, and sunlight to patients with acne. Advice should be individualized, and both clinician and patient cognizant of its limitations.
Lead and cadmium in public health in Nigeria: physicians neglect and pitfall in patient management.
Orisakwe, Orish Ebere
2014-02-01
Low-level heavy metal exposure may contribute much more to the causation of chronic disease and impaired functioning than previously thought. Among the suggested preventive and intervention measures for the control of renal diseases is the reduction of exposure to heavy metals. Although such measures indicate that Nigerian physicians are aware of the possible role of some heavy metals in the etiogenesis of some chronic diseases, heavy metal assays as a diagnostic guide in patient management are often omitted in most healthcare settings. This review synoptically captures the increased incidence and prevalence of some metabolic disorders in which heavy metals may be implicated. A search of the terms heavy metal exposure, source, toxicity, metabolic disorders, and poisoning in Nigeria was conducted in bibliographical databases (in English) such as PubMed, Scopus, Google Scholar, and the Africa Journal Online (AJOL) digital library. Leaded gasoline, refuse dumping, the absence of poison information centers, and poor record keeping characterize environmental health in Nigeria. Lead and cadmium are of the most significant public health importance in Nigeria. The recognition and inclusion of heavy metal assays in the diagnosis of metabolic disorders may ensure early diagnosis and improve management.
Xiao, Da; Tan, Xiaoling; Wang, Wenjuan; Zhang, Fan; Desneux, Nicolas; Wang, Su
2017-01-01
Biological control is usually used in combination with chemical control in practical agricultural applications. Thus, the influence of insecticides on the natural predators used for biological control should be investigated for integrated pest management. The ladybird Harmonia axyridis is an effective predator of aphids and coccids. Beta-cypermethrin is a broad-spectrum insecticide used worldwide for controlling insect pests, and H. axyridis is becoming increasingly threatened by it. Here, we investigated the effect of a sublethal dose of beta-cypermethrin on the flight, locomotion, respiration, and detoxification systems of H. axyridis. After exposure to beta-cypermethrin, female adults flew more often, over longer distances, and for longer periods. Exposure to a sublethal dose of beta-cypermethrin also increased walking rate, walking distance, and walking duration, as well as respiratory quotient and respiratory rate. To investigate the effects of beta-cypermethrin on the H. axyridis detoxification system, we analyzed the transcriptome of H. axyridis adults, focusing on genes related to detoxification systems. De novo assembly generated 65,509 unigenes with a mean length of 799 bp. Of these, 26,020 unigenes (40.91% of all unigenes) exhibited clear homology to known genes in the NCBI non-redundant database. In addition, 10,402 unigenes were annotated in the Cluster of Orthologous Groups database, 12,088 unigenes were assigned to the Gene Ontology database, and 12,269 unigenes were in the Kyoto Encyclopedia of Genes and Genomes (KEGG) database. Exposure to beta-cypermethrin had significant effects on the transcriptome profile of H. axyridis adults. Based on uniquely mapped reads, 3,296 unigenes were differentially expressed; 868 unigenes were up-regulated and 2,248 were down-regulated. We identified differentially expressed unigenes related to general detoxification systems in H. axyridis.
This assembled, annotated transcriptome provides a valuable genomic resource for further understanding the molecular basis of detoxification mechanisms in H. axyridis. PMID:28239355
Vila, Javier; Bowman, Joseph D.; Richardson, Lesley; Kincl, Laurel; Conover, Dave L.; McLean, Dave; Mann, Simon; Vecchia, Paolo; van Tongeren, Martie; Cardis, Elisabeth
2016-01-01
Introduction: To date, occupational exposure assessment of electromagnetic fields (EMF) has relied on occupation-based measurements and exposure estimates. However, misclassification due to between-worker variability remains an unsolved challenge. A source-based approach, supported by detailed subject data on determinants of exposure, may allow for a more individualized exposure assessment. Detailed information on the use of occupational sources of exposure to EMF was collected as part of the INTERPHONE-INTEROCC study. To support a source-based exposure assessment effort within this study, this work aimed to construct a measurement database for the occupational sources of EMF exposure identified, assembling available measurements from the scientific literature. Methods: First, a comprehensive literature search was performed for published and unpublished documents containing exposure measurements for the EMF sources identified, a priori as well as from answers of study subjects. Then, the measurements identified were assessed for quality and relevance to the study objectives. Finally, the measurements selected and complementary information were compiled into an Occupational Exposure Measurement Database (OEMD). Results: Currently, the OEMD contains 1624 sets of measurements (>3000 entries) for 285 sources of EMF exposure, organized by frequency band (0 Hz to 300 GHz) and dosimetry type. Ninety-five documents were selected from the literature (almost 35% of them are unpublished technical reports), containing measurements which were considered informative and valid for our purpose. Measurement data and complementary information collected from these documents came from 16 different countries and cover the time period between 1974 and 2013. Conclusion: We have constructed a database with measurements and complementary information for the most common sources of exposure to EMF in the workplace, based on the responses to the INTERPHONE-INTEROCC study questionnaire. 
This database covers the entire EMF frequency range and represents the most comprehensive resource of information on occupational EMF exposure. It is available at www.crealradiation.com/index.php/en/databases. PMID:26493616
Hein, Misty J.; Waters, Martha A.; Ruder, Avima M.; Stenzel, Mark R.; Blair, Aaron; Stewart, Patricia A.
2010-01-01
Objectives: Occupational exposure assessment for population-based case–control studies is challenging due to the wide variety of industries and occupations encountered by study participants. We developed and evaluated statistical models to estimate the intensity of exposure to three chlorinated solvents—methylene chloride, 1,1,1-trichloroethane, and trichloroethylene—using a database of air measurement data and associated exposure determinants. Methods: A measurement database was developed after an extensive review of the published industrial hygiene literature. The database of nearly 3000 measurements or summary measurements included sample size, measurement characteristics (year, duration, and type), and several potential exposure determinants associated with the measurements: mechanism of release (e.g. evaporation), process condition, temperature, usage rate, type of ventilation, location, presence of a confined space, and proximity to the source. The natural log-transformed measurement levels in the exposure database were modeled as a function of the measurement characteristics and exposure determinants using maximum likelihood methods. Assuming a single lognormal distribution of the measurements, an arithmetic mean exposure intensity level was estimated for each unique combination of exposure determinants and decade. Results: The proportions of variability in the measurement data explained by the modeled measurement characteristics and exposure determinants were 36, 38, and 54% for methylene chloride, 1,1,1-trichloroethane, and trichloroethylene, respectively. Model parameter estimates for the exposure determinants were in the anticipated direction. Exposure intensity estimates were plausible and exhibited internal consistency, but the ability to evaluate validity was limited. 
Conclusions: These prediction models can be used to estimate chlorinated solvent exposure intensity for jobs reported by population-based case–control study participants that have sufficiently detailed information regarding the exposure determinants. PMID:20418277
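Under the single-lognormal assumption used in the modeling above, the arithmetic mean exposure intensity follows directly from the fitted log-scale parameters. A minimal sketch with illustrative values, not the study's estimates:

```python
import math

def lognormal_arithmetic_mean(mu, sigma):
    """Arithmetic mean of a lognormal distribution with log-scale
    mean mu and log-scale standard deviation sigma:
    AM = exp(mu + sigma**2 / 2)."""
    return math.exp(mu + sigma ** 2 / 2)

# Illustrative: fitted log-scale mean ln(0.1) and SD 1.0 for one
# combination of exposure determinants and decade.
am = lognormal_arithmetic_mean(math.log(0.1), 1.0)
```

Because the arithmetic mean exceeds the geometric mean whenever sigma > 0, this is the quantity of interest when estimating cumulative exposure for study participants.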
Adverse Events Associated with Prolonged Antibiotic Use
Meropol, Sharon B.; Chan, K. Arnold; Chen, Zhen; Finkelstein, Jonathan A.; Hennessy, Sean; Lautenbach, Ebbing; Platt, Richard; Schech, Stephanie D.; Shatin, Deborah; Metlay, Joshua P.
2014-01-01
Purpose The Infectious Diseases Society of America and the US CDC recommend 60 days of ciprofloxacin, doxycycline or amoxicillin for anthrax prophylaxis. It is not possible to determine severe adverse drug event (ADE) risks from the few people thus far exposed to anthrax prophylaxis. This study's objective was to estimate risks of severe ADEs associated with long-term ciprofloxacin, doxycycline and amoxicillin exposure using 3 large databases: one electronic medical record (General Practice Research Database) and two claims databases (UnitedHealthcare, HMO Research Network). Methods We included office visit, hospital admission and prescription data for 1/1/1999–6/30/2001. The exposure variable was oral antibiotic person-days (pds). The primary outcome was hospitalization during exposure with ADE diagnoses: anaphylaxis, phototoxicity, hepatotoxicity, nephrotoxicity, seizures, ventricular arrhythmia or infectious colitis. Results We randomly sampled 999,773, 1,047,496 and 1,819,004 patients from Databases A, B and C respectively. 33,183 amoxicillin, 15,250 ciprofloxacin and 50,171 doxycycline prescriptions continued ≥30 days. ADE hospitalizations during long-term exposure were not observed in Database A. ADEs during long-term amoxicillin were seen only in Database C, with 5 ADEs or 1.2(0.4–2.7) ADEs/100,000 pds of exposure. Long-term ciprofloxacin showed 3 and 4 ADEs with 5.7(1.2–16.6) and 3.5(1.0–9.0) ADEs/100,000 pds in Databases B and C, respectively. Only Database B had ADEs during long-term doxycycline, with 3 ADEs or 0.9(0.2–2.6) ADEs/100,000 pds. For most events, the incidence rate ratio comparing >28 vs. 1–28 pds of exposure was <1, showing limited evidence for cumulative dose-related ADEs from long-term exposure. Conclusions Long-term amoxicillin, ciprofloxacin and doxycycline use appears safe, supporting use of these medications if needed for large-scale post-exposure anthrax prophylaxis. PMID:18215001
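The ADE rates reported above are simple person-time rates (events per 100,000 person-days of exposure). A minimal sketch of the calculation, with illustrative numbers rather than the study data:

```python
def ade_rate_per_100k(events, person_days):
    """Crude adverse-drug-event rate per 100,000 person-days of
    antibiotic exposure: events / person-days * 100,000."""
    return events / person_days * 100_000

# Illustrative: 3 hospitalizations over 333,000 person-days of
# long-term exposure gives roughly 0.9 ADEs per 100,000 pds.
rate = ade_rate_per_100k(3, 333_000)
```

Confidence intervals around such rates are typically exact Poisson intervals, which is consistent with the small event counts quoted in the abstract.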
The risky business of being an entomologist: A systematic review.
Stanhope, Jessica; Carver, Scott; Weinstein, Philip
2015-07-01
Adverse work-related health outcomes are a significant problem worldwide. Entomologists, including arthropod breeders, are a unique occupational group exposed to potentially harmful arthropods, pesticides, and other more generic hazards. These exposures may place them at risk of a range of adverse work-related health outcomes. We aimed to determine what adverse work-related health outcomes entomologists have experienced, the incidence/prevalence of these outcomes, what occupational management strategies have been employed by entomologists, and how effective these strategies are. A systematic search of eight databases was undertaken to identify studies informing the review objectives. Data pertaining to country, year, design, work exposure, adverse work-related health outcomes, incidence/prevalence of these outcomes, and occupational management strategies were extracted and reported descriptively. Results showed that entomologists experienced work-related allergies, venom reactions, infections, infestations, and delusional parasitosis. These outcomes were related to exposure to insects, arachnids, chilopods, and entognathans, as well as non-arthropod exposures, e.g. arthropod feed. Few studies reported the incidence/prevalence of such conditions or the work-related management strategies utilised by entomologists. No studies specifically investigated the effectiveness of potential management strategies for entomologists as a population. Indeed, critical appraisal indicated poor research quality in this area, which is a significant research gap. Entomologists are a diverse, unique occupational group at risk of a range of adverse work-related health outcomes. This study represents the first systematic review of their work-related health risks. Future studies investigating the prevalence of adverse work-related health outcomes for entomologists and the effectiveness of management strategies are warranted to decrease the disease burden of this otherwise understudied group.
Copyright © 2015 Elsevier Inc. All rights reserved.
Most of the existing arsenic dietary databases were developed from the analysis of total arsenic in water and dietary samples. These databases have been used to estimate arsenic exposure and in turn human health risk. However, these dietary databases are becoming obsolete as the ...
Quinot, Catherine; Amsellem-Dubourget, Sylvie; Temam, Sofia; Sevin, Etienne; Barreto, Christine; Tackin, Arzu; Félicité, Jérémy; Lyon-Caen, Sarah; Siroux, Valérie; Girard, Raphaële; Descatha, Alexis; Le Moual, Nicole; Dumas, Orianne
2018-05-14
Healthcare workers are highly exposed to various types of disinfectants and cleaning products, and assessment of exposure to these products remains a challenge. We aimed to investigate the feasibility of a method, based on a smartphone application and bar codes, to improve occupational exposure assessment among hospital/cleaning workers in epidemiological studies. A database of disinfectants and cleaning products used in French hospitals, including their names, bar codes, and composition, was developed using several sources: ProdHyBase (a database of disinfectants managed by hospital hygiene experts), and specific regulatory agencies and industrial websites. A smartphone application was created to scan the bar codes of products and fill in a short questionnaire, and the ease of use and ability to record information through this new approach were estimated. The method was tested in a French hospital (7 units, 14 participants). Through the application, 126 records (one record referring to one product entered by one participant/unit) were registered, the majority of which were liquids (55.5%) or sprays (23.8%); 20.6% were used to clean surfaces and 15.9% to clean toilets. Workers mostly used products containing alcohol and quaternary ammonium compounds (>90% with weekly use), followed by hypochlorite bleach and hydrogen peroxide (28.6%). For most records, information was available on the name (93.7%) and bar code (77.0%). Information on product compounds was available for all products and recorded in the database. This innovative and easy-to-use method could help to improve the assessment of occupational exposure to disinfectants/cleaning products in epidemiological studies. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
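The bar-code lookup at the core of such an application can be sketched as a simple keyed database. The product name, bar code, and composition below are invented for illustration; they are not ProdHyBase entries.

```python
# Hypothetical product database keyed by bar code (EAN-13 strings),
# mapping each code to a name and a list of compounds.
products = {
    "3057640257773": {
        "name": "Surface disinfectant",
        "compounds": ["ethanol", "quaternary ammonium"],
    },
}

def scan(bar_code):
    """Return the composition record for a scanned bar code, or
    None when the product is not yet in the database."""
    return products.get(bar_code)
```

In the study design, an unrecognized code would trigger the short questionnaire so the missing product can be added to the central database.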
Koh, Dong-Hee; Locke, Sarah J.; Chen, Yu-Cheng; Purdue, Mark P.; Friesen, Melissa C.
2016-01-01
Background Retrospective exposure assessment of occupational lead exposure in population-based studies requires historical exposure information from many occupations and industries. Methods We reviewed published US exposure monitoring studies to identify lead exposure measurement data. We developed an occupational lead exposure database from the 175 identified papers containing 1,111 sets of lead concentration summary statistics (21% area air, 47% personal air, 32% blood). We also extracted ancillary exposure-related information, including job, industry, task/location, year collected, sampling strategy, control measures in place, and sampling and analytical methods. Results Measurements were published between 1940 and 2010 and represented 27 2-digit standardized industry classification codes. The majority of the measurements were related to lead-based paint work, joining or cutting metal using heat, primary and secondary metal manufacturing, and lead acid battery manufacturing. Conclusions This database can be used in future statistical analyses to characterize differences in lead exposure across time, jobs, and industries. PMID:25968240
United States Army Medical Materiel Development Activity: 1997 Annual Report.
1997-01-01
business planning and execution information management system (Project Management Division Database (PMDD) and Product Management Database System (PMDS...MANAGEMENT • Project Management Division Database (PMDD), Product Management Database System (PMDS), and Special Users Database System: The existing...System (FMS), were investigated. New Product Managers and Project Managers were added into PMDS and PMDD. A separate division, Support, was
Petit, Pascal; Bicout, Dominique J; Persoons, Renaud; Bonneterre, Vincent; Barbeau, Damien; Maître, Anne
2017-05-01
Similar exposure groups (SEGs) are needed to reliably assess occupational exposures and health risks. However, the construction of SEGs can be challenging because of the multifactorial variability of exposures. The objective of this study is to put forward a semi-empirical approach developed to construct and implement a SEG database for exposure assessments. An occupational database of airborne levels of polycyclic aromatic hydrocarbons (PAHs) was used as an illustrative and working example. The approach that was developed consisted of four steps. The first three steps addressed the construction and implementation of the occupational Exporisq-HAP database (E-HAP). E-HAP was structured into three hierarchical levels of exposure groups, each of which was based on exposure determinants, along 16 dimensions that represented the sampled PAHs. A fourth step was implemented to identify and generate SEGs using the geometric standard deviation (GSD) of PAH concentrations. E-HAP was restructured into 16 3 × 3 matrices (one per sampled PAH): three hierarchical levels of description versus three degrees of dispersion, which included low (the SEG database: GSD ≤ 3), medium (3 < GSD ≤ 6), and high (GSD > 6). Benzo[a]pyrene (BaP) was the least dispersed particulate PAH, with 41.5% of groups that could be considered as SEGs, 48.5% of groups of medium dispersion, and only 8% with high dispersion. These results were comparable for BaP, BaP toxic equivalents, or the sum of all carcinogenic PAHs, but were different when individual gaseous PAHs or ∑PAHG were chosen. Within the framework of risk assessment, such an approach, based on groundwork studies, allows for both the construction of an SEG database and the identification of exposure groups that require improvements in either the description level or the degree of homogeneity toward an SEG. © The Author 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
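The fourth step of the approach above, screening exposure groups by their geometric standard deviation, can be sketched in Python. Only the GSD cut-offs (≤ 3, 3-6, > 6) come from the abstract; the function names and sample data are illustrative assumptions.

```python
import math

def geometric_sd(values):
    """Geometric standard deviation (GSD) of a set of positive measurements."""
    logs = [math.log(v) for v in values]
    mean = sum(logs) / len(logs)
    var = sum((x - mean) ** 2 for x in logs) / (len(logs) - 1)
    return math.exp(math.sqrt(var))

def dispersion_class(gsd):
    """Classify an exposure group using the abstract's GSD cut-offs."""
    if gsd <= 3:
        return "low (SEG candidate)"
    if gsd <= 6:
        return "medium"
    return "high"
```

A group whose measured concentrations yield a GSD of at most 3 would be retained as a SEG candidate; more dispersed groups would need a finer description level or further splitting.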
The Global Earthquake Model - Past, Present, Future
NASA Astrophysics Data System (ADS)
Smolka, Anselm; Schneider, John; Stein, Ross
2014-05-01
The Global Earthquake Model (GEM) is a unique collaborative effort that aims to provide organizations and individuals with tools and resources for transparent assessment of earthquake risk anywhere in the world. By pooling data, knowledge and people, GEM acts as an international forum for collaboration and exchange. Sharing data, risk information, best practices, and approaches across the globe is key to assessing risk more effectively. Through consortium-driven global projects, open-source IT development, and collaborations with more than 10 regions, leading experts are developing unique global datasets, best practices, open tools, and models for seismic hazard and risk assessment. The year 2013 saw the completion of ten global datasets or components addressing various aspects of earthquake hazard and risk, as well as two GEM-related but independently managed regional projects, SHARE and EMME. Notably, the International Seismological Centre (ISC) led the development of a new ISC-GEM global instrumental earthquake catalogue, which was made publicly available in early 2013. It has set a new standard for global earthquake catalogues and has found widespread acceptance and application in the global earthquake community. By the end of 2014, GEM's OpenQuake computational platform will provide the OpenQuake hazard/risk assessment software and integrate all GEM data and information products.
The public release of OpenQuake is planned for the end of 2014 and will comprise the following datasets and models:
• ISC-GEM Instrumental Earthquake Catalogue (released January 2013)
• Global Earthquake History Catalogue [1000-1903]
• Global Geodetic Strain Rate Database and Model
• Global Active Fault Database
• Tectonic Regionalisation Model
• Global Exposure Database
• Buildings and Population Database
• Earthquake Consequences Database
• Physical Vulnerabilities Database
• Socio-Economic Vulnerability and Resilience Indicators
• Seismic Source Models
• Ground Motion (Attenuation) Models
• Physical Exposure Models
• Physical Vulnerability Models
• Composite Index Models (social vulnerability, resilience, indirect loss)
• Repository of national hazard models
• Uniform global hazard model
Armed with these tools and databases, stakeholders worldwide will be able to calculate, visualise, and investigate earthquake risk, capture new data, and share their findings for joint learning. Earthquake hazard information can then be combined with data on exposure (buildings, population) and on their vulnerability for risk assessment around the globe. Furthermore, for a truly integrated view of seismic risk, users will be able to add social vulnerability and resilience indices and estimate the costs and benefits of different risk management measures. Having finished its first five-year Work Program at the end of 2013, GEM has entered its second five-year Work Program for 2014-2018. Beyond maintaining and enhancing the products developed in Work Program 1, the second phase will have a stronger focus on regional hazard and risk activities, and on seeing GEM products used for risk assessment and risk management practice at regional, national, and local scales.
Furthermore, GEM intends to partner with similar initiatives underway for other natural perils; together, these are needed to meet the demand for advanced risk assessment methods, tools, and data that underpin global disaster risk reduction efforts under the Hyogo Framework for Action #2, to be launched in Sendai, Japan, in spring 2015.
Occupational exposure to silica in construction workers: a literature-based exposure database.
Beaudry, Charles; Lavoué, Jérôme; Sauvé, Jean-François; Bégin, Denis; Senhaji Rhazi, Mounia; Perrault, Guy; Dion, Chantal; Gérin, Michel
2013-01-01
We created an exposure database of respirable crystalline silica levels in the construction industry from the literature. We extracted silica and dust exposure levels in publications reporting silica exposure levels or quantitative evaluations of control effectiveness published in or after 1990. The database contains 6118 records (2858 of respirable crystalline silica) extracted from 115 sources, summarizing 11,845 measurements. Four hundred and eighty-eight records represent summarized exposure levels instead of individual values. For these records, the reported summary parameters were standardized into a geometric mean and a geometric standard deviation. Each record is associated with 80 characteristics, including information on trade, task, materials, tools, sampling strategy, analytical methods, and control measures. Although the database was constructed in French, 38 essential variables were standardized and translated into English. The data span the period 1974-2009, with 92% of the records corresponding to personal measurements. Thirteen standardized trades and 25 different standardized tasks are associated with at least five individual silica measurements. Trade-specific respirable crystalline silica geometric means vary from 0.01 (plumber) to 0.30 mg/m³ (tunnel construction skilled labor), while tasks vary from 0.01 (six categories, including sanding and electrical maintenance) to 1.59 mg/m³ (abrasive blasting). Despite limitations associated with the use of literature data, this database can be analyzed using meta-analytical and multivariate techniques and currently represents the most important source of exposure information about silica exposure in the construction industry. It is available on request to the research community.
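The standardization step described above, converting reported summary parameters into a geometric mean and geometric standard deviation, can be sketched under a lognormal assumption. The moment-conversion formulas below are the standard ones for lognormal data and are an assumption about, not a quote of, the authors' exact procedure.

```python
import math

def lognormal_gm_gsd(am, sd):
    """Convert a reported arithmetic mean/standard deviation pair into a
    geometric mean (GM) and geometric standard deviation (GSD), assuming
    the underlying exposure measurements are lognormally distributed."""
    cv2 = (sd / am) ** 2                      # squared coefficient of variation
    sigma = math.sqrt(math.log(1.0 + cv2))    # sigma of the log-scale normal
    gm = am / math.sqrt(1.0 + cv2)
    gsd = math.exp(sigma)
    return gm, gsd
```

For example, a study reporting an arithmetic mean of 0.30 mg/m³ with a standard deviation of 0.30 mg/m³ would be stored with a GM below 0.30 mg/m³ and a GSD above 1, reflecting the right skew of the assumed lognormal distribution.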
The CTEPP (Children's Total Exposure to Persistent Pesticides and Other Persistent Organic Pollutants) database contains a wealth of data on children's aggregate exposures to pollutants in their everyday surroundings. Chemical analysis data for the environmental media and ques...
Anabolic androgen use in the management of hereditary angioedema: Not so cheap after all.
Tse, Kevin Y; Zuraw, Bruce L; Chen, Qiaoling; Christiansen, Sandra C
2017-04-01
Hereditary angioedema due to C1 inhibitor deficiency (HAE) is a rare, life-threatening disease that imposes a significant burden on affected patients. 17α-alkylated androgens (anabolic androgens) decrease attack frequency and severity but carry the risk of potentially serious dose-related adverse effects. Despite the emergence of targeted therapies for HAE, continued anabolic androgen use has been driven in part by their low cost. To examine the hidden cost of anabolic androgen use related to the risk of developing non-HAE comorbidities. Patients with HAE were identified in the Southern California Kaiser Permanente database using clinical and laboratory findings compatible with HAE. These patients were stratified into anabolic androgen exposed and nonexposed groups. Matched controls without HAE or anabolic androgen exposure were selected from the Kaiser database. Using multivariate analysis, we determined the number of non-HAE comorbidities linked to anabolic androgen use. We next determined the association between dosing and increasing exposure to anabolic androgens and the likelihood of having various comorbidities. Patients with HAE exposed to anabolic androgens had a 28% increase (P = .04) in non-HAE comorbidities when compared with their matched (nonexposed) controls. With each gram per month increase in exposure, a 12% increase in non-HAE comorbidities was observed (P < .01). The most commonly occurring non-HAE comorbidities were psychiatric conditions, muscle cramps, obesity, and hyperlipidemia. Our data suggest that long-term anabolic androgen use enhances the risk of developing comorbid health conditions, thus amplifying the cost of care. Our report provides additional support for the preferred use of newer, targeted therapies for the management of HAE. Copyright © 2017 American College of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.
Calogero, Rachel M; Jost, John T
2011-02-01
Despite extensive evidence confirming the negative consequences of self-objectification, direct experimental evidence concerning its environmental antecedents is scarce. Incidental exposure to sexist cues was employed in 3 experiments to investigate its effect on self-objectification variables. Consistent with system justification theory, exposure to benevolent and complementary forms of sexism, but not hostile or no sexism, increased state self-objectification, self-surveillance, and body shame among women but not men in Experiment 1. In Experiment 2, we replicated these effects and demonstrated that they are specific to self-objectification and not due to a more general self-focus. In addition, following exposure to benevolent sexism only, women planned more future behaviors pertaining to appearance management than did men; this effect was mediated by self-surveillance and body shame. Experiment 3 revealed that the need to avoid closure might afford women some protection against self-objectification in the context of sexist ideology. (PsycINFO Database Record (c) 2010 APA, all rights reserved).
An Introduction to Database Structure and Database Machines.
ERIC Educational Resources Information Center
Detweiler, Karen
1984-01-01
Enumerates principal management objectives of database management systems (data independence, quality, security, multiuser access, central control) and criteria for comparison (response time, size, flexibility, other features). Conventional database management systems, relational databases, and database machines used for backend processing are…
Feuerstein, Michael; Huang, Grant D; Ortiz, Jose M; Shaw, William S; Miller, Virginia I; Wood, Patricia M
2003-08-01
An integrated case management (ICM) approach (ergonomic and problem-solving intervention) to work-related upper-extremity disorders was examined in relation to patient satisfaction, future symptom severity, function, and return to work (RTW). Federal workers with work-related upper-extremity disorder workers' compensation claims (n = 205) were randomly assigned to usual care or ICM intervention. Patient satisfaction was assessed after the 4-month intervention period. Questionnaires on clinical outcomes and ergonomic exposure were administered at baseline and at 6- and 12-months postintervention. Time from intervention to RTW was obtained from an administrative database. ICM group assignment was significantly associated with greater patient satisfaction. Regression analyses found higher patient satisfaction levels predicted decreased symptom severity and functional limitations at 6 months and a shorter RTW. At 12 months, predictors of positive outcomes included male gender, lower distress, lower levels of reported ergonomic exposure, and receipt of ICM. Findings highlight the utility of targeting workplace ergonomic and problem solving skills.
Gorman Ng, Melanie; Semple, Sean; Cherrie, John W; Christopher, Yvette; Northage, Christine; Tielemans, Erik; Veroughstraete, Violaine; Van Tongeren, Martie
2012-11-01
Occupational inadvertent ingestion exposure is ingestion exposure due to contact between the mouth and contaminated hands or objects. Although individuals are typically oblivious to their exposure by this route, it is a potentially significant source of occupational exposure for some substances. Due to the continual flux of saliva through the oral cavity and the non-specificity of biological monitoring to routes of exposure, direct measurement of exposure by the inadvertent ingestion route is challenging; predictive models may be required to assess exposure. The work described in this manuscript has been carried out as part of a project to develop a predictive model for estimating inadvertent ingestion exposure in the workplace. As inadvertent ingestion exposure mainly arises from hand-to-mouth contact, it is closely linked to dermal exposure. We present a new integrated conceptual model for dermal and inadvertent ingestion exposure that should help to increase our understanding of ingestion exposure and our ability to simultaneously estimate exposure by the dermal and ingestion routes. The conceptual model consists of eight compartments (source, air, surface contaminant layer, outer clothing contaminant layer, inner clothing contaminant layer, hands and arms layer, perioral layer, and oral cavity) and nine mass transport processes (emission, deposition, resuspension or evaporation, transfer, removal, redistribution, decontamination, penetration and/or permeation, and swallowing) that describe event-based movement of substances between compartments (e.g. emission, deposition, etc.). This conceptual model is intended to guide the development of predictive exposure models that estimate exposure from both the dermal and the inadvertent ingestion pathways. For exposure by these pathways the efficiency of transfer of materials between compartments (for example from surfaces to hands, or from hands to the mouth) are important determinants of exposure. 
A database of transfer efficiency data relevant for dermal and inadvertent ingestion exposure was developed, containing 534 empirically measured transfer efficiencies measured between 1980 and 2010 and reported in the peer-reviewed and grey literature. The majority of the reported transfer efficiencies (84%) relate to transfer between surfaces and hands, but the database also includes efficiencies for other transfer scenarios, including surface-to-glove, hand-to-mouth, and skin-to-skin. While the conceptual model can provide a framework for a predictive exposure assessment model, the database provides detailed information on transfer efficiencies between the various compartments. Together, the conceptual model and the database provide a basis for the development of a quantitative tool to estimate inadvertent ingestion exposure in the workplace.
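The event-based movement of contaminant mass between compartments that the conceptual model describes can be sketched as repeated applications of a transfer efficiency. The compartment names and efficiency values below are hypothetical illustrations, not figures from the transfer-efficiency database.

```python
def transfer(compartments, src, dst, efficiency):
    """Move a fraction of the contaminant mass from one compartment to
    another, mimicking a single event-based transfer in the model."""
    moved = compartments[src] * efficiency
    compartments[src] -= moved
    compartments[dst] += moved
    return compartments

# Hypothetical event sequence: surface -> hands -> oral cavity.
c = {"surface": 100.0, "hands": 0.0, "oral_cavity": 0.0}
transfer(c, "surface", "hands", 0.10)      # surface-to-hand contact
transfer(c, "hands", "oral_cavity", 0.05)  # hand-to-mouth contact
```

Chaining such events is why the surface-to-hand and hand-to-mouth transfer efficiencies in the database are key determinants of estimated inadvertent ingestion exposure.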
EPA scientists have compiled detailed data on human behavior from 22 separate exposure and time-use studies into CHAD. The database includes more than 54,000 individual study days of detailed human behavior.
USE OF EXISTING DATABASES FOR THE PURPOSE OF HAZARD IDENTIFICATION: AN EXAMPLE
Keywords: existing databases, hazard identification, cancer mortality, birth malformations
Background: Associations between adverse health effects and environmental exposures are difficult to study, because exposures may be widespread, low-dose in nature, and common thro...
Applications of GIS and database technologies to manage a Karst Feature Database
Gao, Y.; Tipping, R.G.; Alexander, E.C.
2006-01-01
This paper describes the management of a Karst Feature Database (KFD) in Minnesota. Two sets of applications in both GIS and Database Management System (DBMS) have been developed for the KFD of Minnesota. These applications were used to manage and to enhance the usability of the KFD. Structured Query Language (SQL) was used to manipulate transactions of the database and to facilitate the functionality of the user interfaces. The Database Administrator (DBA) authorized users with different access permissions to enhance the security of the database. Database consistency and recovery are accomplished by creating data logs and maintaining backups on a regular basis. The working database provides guidelines and management tools for future studies of karst features in Minnesota. The methodology of designing this DBMS is applicable to develop GIS-based databases to analyze and manage geomorphic and hydrologic datasets at both regional and local scales. The short-term goal of this research is to develop a regional KFD for the Upper Mississippi Valley Karst and the long-term goal is to expand this database to manage and study karst features at national and global scales.
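The SQL transaction handling described above can be illustrated with Python's sqlite3 as a stand-in DBMS; the table and column names here are invented for the example and do not come from the Minnesota KFD schema.

```python
import sqlite3

# In-memory stand-in for a karst feature table (names are illustrative).
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE karst_feature (
                    feature_id   INTEGER PRIMARY KEY,
                    feature_type TEXT NOT NULL,
                    county       TEXT NOT NULL,
                    depth_m      REAL)""")

# The connection context manager wraps the inserts in one transaction:
# either all rows are committed or, on error, none are, which is the kind
# of consistency guarantee the paper attributes to its DBMS design.
with conn:
    conn.executemany(
        "INSERT INTO karst_feature (feature_type, county, depth_m) VALUES (?, ?, ?)",
        [("sinkhole", "Fillmore", 4.2), ("spring", "Olmsted", None)])

rows = conn.execute(
    "SELECT county, COUNT(*) FROM karst_feature "
    "GROUP BY county ORDER BY county").fetchall()
```

In a production DBMS the same pattern would be combined with per-user access permissions and regular backups, as the paper describes.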
Gamba, P.; Cavalca, D.; Jaiswal, K.S.; Huyck, C.; Crowley, H.
2012-01-01
In order to quantify earthquake risk of any selected region or a country of the world within the Global Earthquake Model (GEM) framework (www.globalquakemodel.org/), a systematic compilation of building inventory and population exposure is indispensable. Through the consortium of leading institutions and by engaging the domain-experts from multiple countries, the GED4GEM project has been working towards the development of a first comprehensive publicly available Global Exposure Database (GED). This geospatial exposure database will eventually facilitate global earthquake risk and loss estimation through GEM’s OpenQuake platform. This paper provides an overview of the GED concepts, aims, datasets, and inference methodology, as well as the current implementation scheme, status and way forward.
Negative Effects of Learning Spreadsheet Management on Learning Database Management
ERIC Educational Resources Information Center
Vágner, Anikó; Zsakó, László
2015-01-01
A lot of students learn spreadsheet management before database management. Their similarities can cause a lot of negative effects when learning database management. In this article, we consider these similarities and explain what can cause problems. First, we analyse the basic concepts such as table, database, row, cell, reference, etc. Then, we…
Rattner, B.A.; Ackerson, B.K.; Eisenreich, K.M.; McKernan, M.A.; Harmon, David
2006-01-01
The Biomonitoring of Environmental Status and Trends (BEST) Project of the U.S. Geological Survey (USGS) assesses the exposure and effects of environmental contaminants on select species and habitats in the United States. One of the many BEST Project activities entails the development of decision-support tools to assist in the identification of chemical threats to species and lands under the stewardship of the Department of the Interior. Although there are many ecotoxicological monitoring programs that focus on aquatic species and habitats, there are currently no large-scale efforts focused on terrestrial vertebrates in the United States. Nonetheless, organochlorine contaminants, metals, and new pollutants continue to pose hazards to terrestrial vertebrates at many spatial scales (ranging from small hazardous-waste-site point sources to entire watersheds). To evaluate and prioritize pollutant hazards for terrestrial vertebrates, a "Contaminant Exposure and Effects - Terrestrial Vertebrates" (CEE-TV) database (www.pwrc.usgs.gov/contaminants-online) was developed. The CEE-TV database has been used to conduct simple searches for exposure and biological effects information for a given species or location, identification of temporal contaminant exposure trends, information gap analyses for national wildlife refuge and national park units, and ranking of terrestrial vertebrate ecotoxicological information needs based on data density and water quality problems. Despite widespread concerns about environmental contamination, during the past decade only about one-half of the coastal National Park units appear to have terrestrial vertebrate ecotoxicological data. Based upon known environmental contaminant hazards, it is recommended that regionalized monitoring programs or efforts focused on lands managed by the Department of the Interior be undertaken to prevent serious natural resource problems.
Caron, Alexandre; Clement, Guillaume; Heyman, Christophe; Aernout, Eva; Chazard, Emmanuel; Le Tertre, Alain
2015-01-01
Incompleteness of epidemiological databases is a major drawback when it comes to analyzing data. We conceived an epidemiological study to assess the association between newborn thyroid function and exposure to perchlorates found in the tap water of the mother's home. Only 9% of newborns' perchlorate exposure was known. The aim of our study was to design, test, and evaluate an original method for imputing the perchlorate exposure of newborns based on their maternity ward of birth. In a first database, an exhaustive collection of newborn thyroid function measurements from a systematic neonatal screening was collected. In this database, the municipality of residence of the newborn's mother was available only for 2012. Between 2004 and 2011, the closest available data was the municipality of the maternity ward of birth. Exposure was assessed using a second database, which contained the perchlorate levels for each municipality. We computed the catchment area of every maternity ward based on the French nationwide exhaustive database of inpatient stays. The municipality, and consequently the perchlorate exposure, was imputed by a weighted draw in the catchment area. Missing values for the remaining covariates were imputed by chained equations. A linear mixed model was computed on each imputed dataset. We compared odds ratios (ORs) and 95% confidence intervals (95% CI) estimated on real versus imputed 2012 data. The same model was then carried out for the whole imputed database. The ORs estimated on 36,695 observations by our multiple imputation method are comparable to those from the real 2012 data. On the 394,979 observations of the whole database, the ORs remain stable but the 95% CIs tighten considerably. The model estimates computed on imputed data are similar to those calculated on real data. The main advantage of multiple imputation is to provide unbiased estimates of the ORs while maintaining their variances.
Thus, our method will be used to increase the statistical power of future studies by including all 394,979 newborns.
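The core imputation step, a weighted draw of the mother's municipality from the maternity ward's catchment area, can be sketched as follows. The catchment shares below are hypothetical, and this shows only one draw of a multiple-imputation scheme, not the authors' full pipeline.

```python
import random

def impute_municipality(catchment, rng):
    """Draw a municipality of residence for a newborn, weighted by the
    share of the maternity ward's deliveries from each municipality."""
    municipalities = list(catchment)
    weights = [catchment[m] for m in municipalities]
    return rng.choices(municipalities, weights=weights, k=1)[0]

# Hypothetical catchment area: municipality -> share of deliveries.
catchment = {"A": 0.7, "B": 0.2, "C": 0.1}
rng = random.Random(0)
draws = [impute_municipality(catchment, rng) for _ in range(10_000)]
share_a = draws.count("A") / len(draws)
```

Repeating the draw across imputed datasets and pooling the model estimates is what preserves the extra uncertainty introduced by imputation, which is why the ORs stay unbiased while their variances remain realistic.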
THE NATIONAL EXPOSURE RESEARCH LABORATORY'S CONSOLIDATED HUMAN ACTIVITY DATABASE
EPA's National Exposure Research Laboratory (NERL) has combined data from 12 U.S. studies related to human activities into one comprehensive data system that can be accessed via the Internet. The data system is called the Consolidated Human Activity Database (CHAD), and it is ...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-23
... Services, Inc., Corporate/EIT/CTO Database Management Division, Hartford, CT; Notice of Negative... Services, Inc., Corporate/EIT/CTO Database Management Division, Hartford, Connecticut (The Hartford, Corporate/EIT/CTO Database Management Division). The negative determination was issued on August 19, 2011...
Exploring Global Exposure Factors Resources for Use in Consumer Exposure Assessments.
Zaleski, Rosemary T; Egeghy, Peter P; Hakkinen, Pertti J
2016-07-22
This publication serves as a global comprehensive resource for readers seeking exposure factor data and information relevant to consumer exposure assessment. It describes the types of information that may be found in various official surveys and online and published resources. The relevant exposure factors cover a broad range, including general exposure factor data found in published compendia and databases and resources about specific exposure factors, such as human activity patterns and housing information. Also included are resources on exposure factors related to specific types of consumer products and the associated patterns of use, such as for a type of personal care product or a type of children's toy. Further, a section on using exposure factors for designing representative exposure scenarios is included, along with a look into the future for databases and other exposure science developments relevant for consumer exposure assessment.
Exploring consumer exposure pathways and patterns of use for chemicals in the environment through the Chemical/Product Categories Database (CPCat) (presented by Kathie Dionisio, Sc.D., NERL, US EPA, Research Triangle Park, NC, 1/23/2014).
THE NATIONAL EXPOSURE RESEARCH LABORATORY'S COMPREHENSIVE HUMAN ACTIVITY DATABASE
EPA's National Exposure Research Laboratory (NERL) has combined data from nine U.S. studies related to human activities into one comprehensive data system that can be accessed via the world-wide web. The data system is called CHAD-Consolidated Human Activity Database-and it is ...
TWRS technical baseline database manager definition document
DOE Office of Scientific and Technical Information (OSTI.GOV)
Acree, C.D.
1997-08-13
This document serves as a guide for using the TWRS Technical Baseline Database Management Systems Engineering (SE) support tool in performing SE activities for the Tank Waste Remediation System (TWRS). This document will provide a consistent interpretation of the relationships between the TWRS Technical Baseline Database Management software and the present TWRS SE practices. The Database Manager currently utilized is the RDD-1000 System manufactured by the Ascent Logic Corporation. In other documents, the term RDD-1000 may be used interchangeably with TWRS Technical Baseline Database Manager.
Exploring Global Exposure Factors Resources for Use in ...
Review article in the International Journal of Environmental Research and Public Health.
[Quality management and participation into clinical database].
Okubo, Suguru; Miyata, Hiroaki; Tomotaki, Ai; Motomura, Noboru; Murakami, Arata; Ono, Minoru; Iwanaka, Tadashi
2013-07-01
Quality management is necessary for establishing a useful clinical database in cooperation with healthcare professionals and facilities. The ways of management are 1) progress management of data entry, 2) liaison with database participants (healthcare professionals), and 3) modification of the data collection form. In addition, healthcare facilities should consider ethical issues and information security before joining clinical databases. Database participants should check ethical review boards and consultation services for patients.
Total Human Exposure Risk Database and Advanced Simulation Environment
THERdbASE is no longer supported by EPA and is no longer available as a download.
THERdbASE is a collection of databases and models that are useful to assist in conducting assessments of human exposure to chemical pollutants, especial...
Consumer products are a primary source of chemical exposures, yet little structured information is available on the chemical ingredients of these products and the concentrations at which ingredients are present. To address this data gap, we created a database of chemicals in cons...
Exposure Modeling Tools and Databases for Consideration for Relevance to the Amended TSCA (ISES)
The Agency’s Office of Research and Development (ORD) has a number of ongoing exposure modeling tools and databases. These efforts are anticipated to be useful in supporting ongoing implementation of the amended Toxic Substances Control Act (TSCA). Under ORD’s Chemic...
AAPCC database characterization of native U.S. venomous snake exposures, 2001-2005.
Seifert, Steven A; Boyer, Leslie V; Benson, Blaine E; Rogers, Jody J
2009-04-01
Differences in victim demographics, clinical effects, managements, and outcomes among native viperid (rattlesnake, copperhead, and cottonmouth) and elapid (coral snake) species have not been systematically characterized. The database of the American Association of Poison Control Centers from 2001 through 2005 was analyzed. Between 2001 and 2005, there were 23,676 human exposures (average = 4,735/year) to native venomous snakes in the United States reported to U.S. poison centers in all states except Hawaii: 98% were to viperid snakes and 2% to elapids. Overall, 77% of victims were male, 70% were adults >20 years, and 12% were aged less than 10 years. Sixty-five cases involved pregnant women, with rattlesnake bites resulting in moderate or greater effects in over 70%. The overall hospital admission rate was 53%. Outcomes were generally more severe with rattlesnake and copperhead envenomations and in children <6 years of age. The fatality rate of reported cases was 0.06%. Native U.S. venomous snakebite results in considerable morbidity and mortality. Rattlesnake and copperhead envenomations, and those in children <6 years of age, produce the most severe outcomes, but coral snakebites result in similar hospital admission rates.
Human Exposure Modeling - Databases to Support Exposure Modeling
Human exposure modeling relates pollutant concentrations in the larger environmental media to pollutant concentrations in the immediate exposure media. The models described here are available on other EPA websites.
Fähnrich, C; Denecke, K; Adeoye, O O; Benzler, J; Claus, H; Kirchner, G; Mall, S; Richter, R; Schapranow, M P; Schwarz, N; Tom-Aba, D; Uflacker, M; Poggensee, G; Krause, G
2015-03-26
In the context of controlling the current outbreak of Ebola virus disease (EVD), the World Health Organization claimed that the 'critical determinant of epidemic size appears to be the speed of implementation of rigorous control measures', i.e. immediate follow-up of contact persons during 21 days after exposure, isolation and treatment of cases, decontamination, and safe burials. We developed the Surveillance and Outbreak Response Management System (SORMAS) to improve the efficiency and timeliness of these measures. We used the Design Thinking methodology to systematically analyse experiences from field workers and the Ebola Emergency Operations Centre (EOC) after successful control of the EVD outbreak in Nigeria. We developed a process model with seven personas representing the procedures of EVD outbreak control. The SORMAS system architecture combines the latest In-Memory Database (IMDB) technology via SAP HANA (an in-memory, relational database management system), enabling interactive data analyses, with established SAP cloud tools, such as SAP Afaria (a mobile device management software). The user interface consists of specific front-ends for smartphones and tablet devices, which are independent of physical configurations. SORMAS allows real-time, bidirectional information exchange between field workers and the EOC, ensures supervision of contact follow-up, and provides automated status reports and GPS tracking. SORMAS may become a platform for outbreak management and improved routine surveillance of any infectious disease. Furthermore, the SORMAS process model may serve as a framework for EVD outbreak modeling.
Complications of oral exposure to fentanyl transdermal delivery system patches.
Prosser, Jane M; Jones, Brent E; Nelson, Lewis
2010-12-01
Fentanyl is a synthetic opioid available therapeutically as an intravenous, transbuccal, or transdermal preparation. It is also used as a drug of abuse through a variety of methods, including the oral abuse of transdermal fentanyl patches. This is a series of patients with oral fentanyl patch exposure reported to our center and represents the first series of oral fentanyl patch exposures collected outside of the postmortem setting. In this series, we examined the New York Poison Control Center database for all cases of oral abuse of fentanyl reported between January 2000 and April 2008. Twenty cases were reported: nine were asymptomatic or had symptoms of opioid withdrawal; 11 had symptoms of opioid intoxication. Eight patients were administered naloxone and all showed improvement in clinical status. Only one case resulted in a confirmed fatality; this patient had an orally adherent patch discovered at intubation. Oral exposure may result in life-threatening toxicity. Patients should be closely assessed and monitored for the opioid toxidrome and, if symptomatic, should be managed with opioid antagonists and ventilatory support.
Short Fiction on Film: A Relational DataBase.
ERIC Educational Resources Information Center
May, Charles
Short Fiction on Film is a database built to run on DataRelator, a relational database manager created by Bill Finzer for the California State Department of Education in 1986. DataRelator was designed for use in teaching students database management skills and to provide teachers with examples of how a database manager might be…
47 CFR 0.241 - Authority delegated.
Code of Federal Regulations, 2012 CFR
2012-10-01
... database functions for unlicensed devices operating in the television broadcast bands (TV bands) as set... methods that will be used to designate TV bands database managers, to designate these database managers; to develop procedures that these database managers will use to ensure compliance with the...
47 CFR 0.241 - Authority delegated.
Code of Federal Regulations, 2013 CFR
2013-10-01
... database functions for unlicensed devices operating in the television broadcast bands (TV bands) as set... methods that will be used to designate TV bands database managers, to designate these database managers; to develop procedures that these database managers will use to ensure compliance with the...
47 CFR 0.241 - Authority delegated.
Code of Federal Regulations, 2011 CFR
2011-10-01
... database functions for unlicensed devices operating in the television broadcast bands (TV bands) as set... methods that will be used to designate TV bands database managers, to designate these database managers; to develop procedures that these database managers will use to ensure compliance with the...
Microcomputer Database Management Systems for Bibliographic Data.
ERIC Educational Resources Information Center
Pollard, Richard
1986-01-01
Discusses criteria for evaluating microcomputer database management systems (DBMS) used for storage and retrieval of bibliographic data. Two popular types of microcomputer DBMS--file management systems and relational database management systems--are evaluated with respect to these criteria. (Author/MBR)
The Data Base and Decision Making in Public Schools.
ERIC Educational Resources Information Center
Hedges, William D.
1984-01-01
Describes generic types of databases--file management systems, relational database management systems, and network/hierarchical database management systems--with their respective strengths and weaknesses; discusses factors to be considered in determining whether a database is desirable; and provides evaluative criteria for use in choosing…
de Jonge, Linda; Garne, Ester; Gini, Rosa; Jordan, Susan E; Klungsoyr, Kari; Loane, Maria; Neville, Amanda J; Pierini, Anna; Puccini, Aurora; Thayer, Daniel S; Tucker, David; Vinkel Hansen, Anne; Bakker, Marian K
2015-11-01
Research on associations between medication use during pregnancy and congenital anomalies is important for assessing the safe use of a medicine in pregnancy. Congenital anomaly (CA) registries do not have optimal information on medicine exposure, in contrast to prescription databases. Linkage of prescription databases to CA registries is a potentially effective method of obtaining accurate information on medicine use in pregnancies and the risk of congenital anomalies. We linked data from primary care and prescription databases to five European Surveillance of Congenital Anomalies (EUROCAT) CA registries. The linkage was evaluated by examining the linkage rate, the characteristics of linked and non-linked cases, first-trimester exposure rates for six groups of medicines according to the prescription data and the information on medication use registered in the CA databases, and the agreement of exposure between the two sources. Of the 52,619 cases registered in the CA databases, 26,552 could be linked. The linkage rate varied between registries over time and by type of birth. The first-trimester exposure rates and the agreement between the databases varied for the different medicine groups. Information on anti-epileptic drugs and on insulins and analogues recorded by CA registries was of good quality. For selective serotonin reuptake inhibitors, anti-asthmatics, antibacterials for systemic use, and gonadotropins and other ovulation stimulants, the recorded information was less complete. Linkage of primary care or prescription databases to CA registries improved the quality of information on maternal use of medicines in pregnancy, especially for medicine groups that are less fully registered in CA registries.
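The linkage-and-agreement evaluation described above can be sketched as a simple join between a registry table and a prescription table, then counting how often the two sources agree on first-trimester exposure. All record fields, the drug group, and the sample data below are hypothetical, chosen only to illustrate the idea:

```python
# Hypothetical sketch: link congenital anomaly (CA) registry cases to a
# prescription database and measure agreement on first-trimester exposure
# to one medicine group (anti-epileptic drugs, AEDs, as an example).

# CA registry records: whether the registry itself recorded AED use.
ca_registry = [
    {"case_id": 1, "mother_id": "A", "aed_recorded": True},
    {"case_id": 2, "mother_id": "B", "aed_recorded": False},
    {"case_id": 3, "mother_id": "C", "aed_recorded": False},
    {"case_id": 4, "mother_id": "D", "aed_recorded": True},
]

# Prescription database: mothers with a first-trimester AED prescription.
prescriptions = {"A", "C", "D"}

# Link on mother_id and compare the two exposure sources per case.
linked = [(c["aed_recorded"], c["mother_id"] in prescriptions)
          for c in ca_registry]
agree = sum(1 for registry_says, rx_says in linked
            if registry_says == rx_says)
agreement_rate = agree / len(linked)

print(f"linked cases: {len(linked)}, exposure agreement: {agreement_rate:.0%}")
# → linked cases: 4, exposure agreement: 75%
```

Here case C shows the pattern the study highlights: a prescription exists but the registry recorded no exposure, so linkage adds information the registry alone would miss.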
Oakes, Jennifer; Seifert, Steven
2008-12-01
Tilmicosin is a veterinary antibiotic with significant human toxicity at doses commonly used in animals, but the parenteral dose-response relationship has not been well characterized. Human exposures to tilmicosin in the database of the American Association of Poison Control Centers (AAPCC) from 2001 to 2005 were analyzed for demographic associations, exposure dose, clinical effects and outcomes. Over the 5-year period, there were 1,291 single-substance human exposures to tilmicosin. The mean age was 39.1 years, and 80% were male. By route there were 768 (54%) parenteral exposures. Patients with parenteral exposures had a significantly increased likelihood of being seen at a healthcare facility, admission, and admission to an ICU. With nonparenteral exposure, most had no clinical effects or minor effects, and there were no major effects or deaths. With parenteral exposure, moderate effects occurred in 46 (6%), major effects in 2 (0.3%) and there were 4 (0.5%) deaths, two of which were suicides. A dose-response relationship could be demonstrated. Clinical effect durations of up to a week occurred at even the lowest dose range. Over 250 cases of human tilmicosin exposure are reported to poison centers per year and over 150 of those are parenteral. Most exposures produce no or minor effects, but fatalities have occurred with parenteral exposure. The case fatality rate in parenteral exposures is 10 times the case fatality rate for all human exposures in the AAPCC database. Significant adverse and prolonged effects are reported at parenteral doses > 0.5 mL, suggesting that all parenteral exposures should be referred for healthcare facility evaluation.
Coleman, Jennifer A; Delahanty, Douglas L; Schwartz, Joseph; Murani, Kristina; Brondolo, Elizabeth
2016-11-01
Prior research has examined the incidence of posttraumatic stress stemming from either direct or indirect trauma exposure in employees of high-risk occupations. However, few studies have examined the contribution of both direct and indirect trauma exposure in high-risk groups. One particularly salient indirect trauma often endorsed as the most stressful by many occupational groups is interacting with distressed family members of victims of crime, illness, or accidents. The present study examined the extent to which interacting with distressed families moderated the impact of cumulative potentially traumatic event (PTE) exposure on depression and posttraumatic stress disorder (PTSD) symptoms in 245 employees of medical examiner (ME) offices. Employees from 9 ME office sites in the United States participated in an online survey investigating the frequency of work place PTE exposures (direct and indirect) and mental health outcomes. Results revealed that cumulative PTE exposure was associated with higher PTSD symptoms (PTSS) for employees who had higher frequency of exposure to distressed family members. After controlling for cumulative and direct PTE exposure, gender, and office site, exposure to distressed families was significantly associated with depressive symptoms, but not PTSS. Findings of our research underscore the need for training employees in high-risk occupations to manage their reactions to exposure to distraught family members. Employee training may buffer risk for developing PTSD and depression. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
47 CFR 52.101 - General definitions.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Center (“NASC”). The entity that provides user support for the Service Management System database and administers the Service Management System database on a day-to-day basis. (b) Responsible Organization (“Resp... regional databases in the toll free network. (d) Service Management System Database (“SMS Database”). The...
47 CFR 52.101 - General definitions.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Center (“NASC”). The entity that provides user support for the Service Management System database and administers the Service Management System database on a day-to-day basis. (b) Responsible Organization (“Resp... regional databases in the toll free network. (d) Service Management System Database (“SMS Database”). The...
47 CFR 52.101 - General definitions.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Center (“NASC”). The entity that provides user support for the Service Management System database and administers the Service Management System database on a day-to-day basis. (b) Responsible Organization (“Resp... regional databases in the toll free network. (d) Service Management System Database (“SMS Database”). The...
47 CFR 52.101 - General definitions.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Center (“NASC”). The entity that provides user support for the Service Management System database and administers the Service Management System database on a day-to-day basis. (b) Responsible Organization (“Resp... regional databases in the toll free network. (d) Service Management System Database (“SMS Database”). The...
47 CFR 52.101 - General definitions.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Center (“NASC”). The entity that provides user support for the Service Management System database and administers the Service Management System database on a day-to-day basis. (b) Responsible Organization (“Resp... regional databases in the toll free network. (d) Service Management System Database (“SMS Database”). The...
47 CFR 0.241 - Authority delegated.
Code of Federal Regulations, 2014 CFR
2014-10-01
... individual database managers; and to perform other functions as needed for the administration of the TV bands... database functions for unlicensed devices operating in the television broadcast bands (TV bands) as set... methods that will be used to designate TV bands database managers, to designate these database managers...
Coppola, Danielle; Russo, Leo J; Kwarta, Robert F; Varughese, Ruana; Schmider, Juergen
2007-01-01
A significant number of women of childbearing age have schizophrenia or other psychoses. This means that there is a considerable risk of in utero exposure to risperidone due to maternal use. To determine whether in utero exposure to the atypical antipsychotic risperidone is associated with poor pregnancy and fetal/neonatal outcomes. A search of the Benefit Risk Management Worldwide Safety database, using a selection of preferred terms from the Medical Dictionary of Regulatory Activities, was performed to identify all cases of pregnancy or fetal/neonatal outcomes reported in association with risperidone treatment from its first market launch (international birth date, 1 June 1993) to 31 December 2004. The main measures were the patterns and reporting rates of pregnancy outcomes (stillbirth and spontaneous and induced abortion) and fetal/neonatal outcomes (congenital abnormalities, perinatal syndromes and withdrawal symptoms) for women administered risperidone during pregnancy. Overall, 713 pregnancies were identified in women who were receiving risperidone. Data were considered prospective in 516 of these, and retrospective in the remaining 197 cases. The majority of the known adverse pregnancy and fetal/neonatal outcomes were retrospectively reported. Of the 68 prospectively reported pregnancies with a known outcome, organ malformations and spontaneous abortions occurred in 3.8% and 16.9% of cases, respectively (when the 15 induced abortions were excluded from the denominator, as they were predominantly undertaken for nonmedical reasons), a finding consistent with background rates in the general population. There were 12 retrospectively reported pregnancies involving major organ malformations, the most frequently reported of which affected the heart, brain, lip and/or palate. There were 37 retrospectively reported pregnancies involving perinatal syndromes, of which 21 cases involved behavioural or motor disorders.
In particular, there was a cluster of cases reporting tremor, jitteriness, irritability, feeding problems and somnolence, which may represent a withdrawal-emergent syndrome. This comprehensive review of the Benefit Risk Management Worldwide Safety database for case reports of risperidone exposure during pregnancy represents the largest published dataset documenting pregnancy outcomes for women taking the atypical antipsychotic risperidone. It indicates that in utero exposure to risperidone does not appear to increase the risk of spontaneous abortion, structural malformations or fetal teratogenicity above that of the general population. Self-limited extrapyramidal effects in neonates were observed after maternal exposure to risperidone during the third trimester of pregnancy. Risperidone should only be used during pregnancy if the benefits outweigh the potential risks.
Database Management Systems: New Homes for Migrating Bibliographic Records.
ERIC Educational Resources Information Center
Brooks, Terrence A.; Bierbaum, Esther G.
1987-01-01
Assesses bibliographic databases as part of visionary text systems such as hypertext and scholars' workstations. Downloading is discussed in terms of the capability to search records and to maintain unique bibliographic descriptions, and relational database management systems, file managers, and text databases are reviewed as possible hosts for…
23 CFR 970.204 - Management systems requirements.
Code of Federal Regulations, 2010 CFR
2010-04-01
... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...
23 CFR 970.204 - Management systems requirements.
Code of Federal Regulations, 2012 CFR
2012-04-01
... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...
23 CFR 970.204 - Management systems requirements.
Code of Federal Regulations, 2011 CFR
2011-04-01
... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...
Selecting Data-Base Management Software for Microcomputers in Libraries and Information Units.
ERIC Educational Resources Information Center
Pieska, K. A. O.
1986-01-01
Presents a model for the evaluation of database management systems software from the viewpoint of librarians and information specialists. The properties of data management systems, database management systems, and text retrieval systems are outlined and compared. (10 references) (CLB)
23 CFR 970.204 - Management systems requirements.
Code of Federal Regulations, 2013 CFR
2013-04-01
... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...
Ruttenber, A J; McCrea, J S; Wade, T D; Schonbeck, M F; LaMontagne, A D; Van Dyke, M V; Martyny, J W
2001-02-01
We outline methods for integrating epidemiologic and industrial hygiene data systems for the purpose of exposure estimation, exposure surveillance, worker notification, and occupational medicine practice. We present examples of these methods from our work at the Rocky Flats Plant--a former nuclear weapons facility that fabricated plutonium triggers for nuclear weapons and is now being decontaminated and decommissioned. The weapons production processes exposed workers to plutonium, gamma photons, neutrons, beryllium, asbestos, and several hazardous chemical agents, including chlorinated hydrocarbons and heavy metals. We developed a job exposure matrix (JEM) for estimating exposures to 10 chemical agents in 20 buildings for 120 different job categories over a production history spanning 34 years. With the JEM, we estimated lifetime chemical exposures for about 12,000 of the 16,000 former production workers. We show how the JEM database is used to estimate cumulative exposures over different time periods for epidemiological studies and to provide notification and determine eligibility for a medical screening program developed for former workers. We designed an industrial hygiene data system for maintaining exposure data for current cleanup workers. We describe how this system can be used for exposure surveillance and linked with the JEM and databases on radiation doses to develop lifetime exposure histories and to determine appropriate medical monitoring tests for current cleanup workers. We also present time-line-based graphical methods for reviewing and correcting exposure estimates and reporting them to individual workers.
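The cumulative-exposure calculation that a job exposure matrix (JEM) supports can be sketched as a lookup table keyed by (building, job, year), summed over a worker's employment history. The keys, intensity values, and sample history below are purely illustrative, not actual Rocky Flats data:

```python
# Minimal sketch of a job-exposure matrix (JEM) for one chemical agent.
# Each cell maps (building, job, year) to an exposure intensity score;
# all values here are invented for illustration.
jem = {
    ("B771", "machinist", 1965): 2.0,
    ("B771", "machinist", 1966): 2.0,
    ("B444", "welder",    1967): 3.5,
}

# One worker's history: one (building, job, year) entry per year worked.
history = [
    ("B771", "machinist", 1965),
    ("B771", "machinist", 1966),
    ("B444", "welder",    1967),
]

# Cumulative exposure is the sum of matrix intensities over the history;
# years with no matching JEM cell contribute zero.
cumulative = sum(jem.get(entry, 0.0) for entry in history)
print(cumulative)  # → 7.5
```

Restricting the sum to a subset of years gives the period-specific estimates used for epidemiological analyses, and thresholds on the same total can drive notification or screening eligibility.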
Construction of databases: advances and significance in clinical research.
Long, Erping; Huang, Bingjie; Wang, Liming; Lin, Xiaoyu; Lin, Haotian
2015-12-01
Widely used in clinical research, the database is a data management automation technology and the most efficient tool for data management. In this article, we first explain some basic concepts, such as the definition, classification, and establishment of databases. Afterward, the workflow for establishing databases, inputting data, verifying data, and managing databases is presented. Meanwhile, by discussing the application of databases in clinical research, we illustrate the important role of databases in clinical research practice. Lastly, we introduce the reanalysis of randomized controlled trials (RCTs) and cloud computing techniques, showing the most recent advancements of databases in clinical research.
[The future of clinical laboratory database management system].
Kambe, M; Imidy, D; Matsubara, A; Sugimoto, Y
1999-09-01
To assess the present status of the clinical laboratory database management system, the difference between the Clinical Laboratory Information System and Clinical Laboratory System was explained in this study. Although three kinds of database management systems (DBMS) were shown including the relational model, tree model and network model, the relational model was found to be the best DBMS for the clinical laboratory database based on our experience and developments of some clinical laboratory expert systems. As a future clinical laboratory database management system, the IC card system connected to an automatic chemical analyzer was proposed for personal health data management and a microscope/video system was proposed for dynamic data management of leukocytes or bacteria.
23 CFR 971.204 - Management systems requirements.
Code of Federal Regulations, 2011 CFR
2011-04-01
... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...
23 CFR 971.204 - Management systems requirements.
Code of Federal Regulations, 2010 CFR
2010-04-01
... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...
23 CFR 971.204 - Management systems requirements.
Code of Federal Regulations, 2012 CFR
2012-04-01
... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...
23 CFR 971.204 - Management systems requirements.
Code of Federal Regulations, 2013 CFR
2013-04-01
... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...
How to handle 6GBytes a night and not get swamped
NASA Technical Reports Server (NTRS)
Allsman, R.; Alcock, C.; Axelrod, T.; Bennett, D.; Cook, K.; Park, H.-S.; Griest, K.; Marshall, S.; Perlmutter, S.; Stubbs, C.
1992-01-01
The MACHO Project has undertaken a 5-year effort to search for dark matter in the halo of the Galaxy by scanning the Magellanic Clouds for micro-lensing events. Each evening's raw image data will be reduced in real time into the observed stars' photometric measurements. The actual search for micro-lensing events will be a post-processing operation. The theoretical prediction of the rate of such events necessitates the collection of a large number of repeated exposures. The project-designed camera subsystem delivers 64 Mbytes per exposure, with exposures typically occurring every 500 seconds. An ideal evening's observing will provide 6 Gbytes of raw image data and 40 Mbytes of reduced photometric measurements. Recognizing the difficulty of digging out from a snowballing cascade of raw data, the project requires the real-time reduction of each evening's data. The software team's implementation strategy centered on this non-negotiable mandate. Accepting the reality that two full-time people needed to implement the core real-time control and data management system within 6 months, off-the-shelf vendor components were explored to provide quick solutions to the classic needs for file management, data management, and process control. Where vendor solutions were lacking, state-of-the-art models were used for hand-tailored subsystems. In particular, Petri nets manage process control, memory-mapped bulletin boards provide interprocess communication between the multi-tasked processes, and C++ class libraries provide memory-mapped, disk-resident databases. The differences between the implementation strategy and the final implementation reality are presented. The necessity of validating vendor product claims is explored. Both the successful and hindsight decisions enabling the collection and processing of the nightly data barrage are reviewed.
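The data rates quoted above are internally consistent, as a quick back-of-envelope check shows: at 64 Mbytes per exposure and one exposure every 500 seconds, roughly 6 Gbytes of raw imagery corresponds to a long night of observing (using the 1 GB = 1024 MB convention here; the paper's figures are approximate):

```python
# Sanity-check the MACHO nightly data volume from the per-exposure figures.
mb_per_exposure = 64          # raw image size per exposure (Mbytes)
seconds_per_exposure = 500    # typical cadence between exposures

# How many exposures fit in ~6 Gbytes of raw data, and how long that takes.
exposures_per_night = 6 * 1024 // mb_per_exposure   # ≈ 6 GB / 64 MB
observing_seconds = exposures_per_night * seconds_per_exposure
observing_hours = observing_seconds / 3600

print(exposures_per_night, round(observing_hours, 1))  # → 96 13.3
```

About 96 exposures over roughly 13 hours, i.e. a full winter night of continuous observing, which is why the real-time reduction mandate was non-negotiable: there is no idle time in which to catch up.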
ERIC Educational Resources Information Center
Freeman, Carla; And Others
In order to understand how the database software or online database functioned in the overall curricula, the use of database management systems (DBMS) was studied at eight elementary and middle schools through classroom observation and interviews with teachers, administrators, librarians, and students. Three overall areas were addressed:…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-16
... Excluded Parties Listing System (EPLS) databases into the System for Award Management (SAM) database. DATES... combined the functional capabilities of the CCR, ORCA, and EPLS procurement systems into the SAM database... identification number and the type of organization from the System for Award Management database. 0 3. Revise the...
An Examination of Selected Software Testing Tools: 1992
1992-12-01
... Figure 27-17. Metrics Manager Database Full Report ... historical test database, the test management and problem reporting tools were examined using the sample test database provided by each supplier ... track the impact of new methods, organizational structures, and technologies. Metrics Manager is supported by an industry database that allows
Database Searching by Managers.
ERIC Educational Resources Information Center
Arnold, Stephen E.
Managers and executives need easy, quick access to the business and management information that online databases can provide, but many have difficulty articulating their search needs to an intermediary. One possible solution would be to encourage managers and their immediate support staff members to search textual databases directly as they now…
Barlow, J H; Ellard, D R; Hainsworth, J M; Jones, F R; Fisher, A
2005-04-01
To review current evidence for the clinical and cost-effectiveness of self-management interventions for panic disorder, phobias and obsessive-compulsive disorder (OCD). Papers were identified through computerized searches of databases for the years between 1995 and 2003, manual searches and personal contacts. Only randomized-controlled trials were reviewed. Ten studies were identified (one OCD, five panic disorder, four phobias). Effective self-management interventions included cognitive-behavioural therapy (CBT) and exposure to the trigger stimuli for phobias and panic disorders. All involved homework. There was evidence of effectiveness in terms of improved symptoms and psychological wellbeing when compared with standard care, waiting list or relaxation. Brief interventions and computer-based interventions were effective for most participants. In terms of quality, studies were mainly based on small samples, lacked long-term follow-up, and failed to address cost-effectiveness. Despite the limitations of reviewed studies, there appears to be sufficient evidence to warrant greater exploration of self-management in these disorders. Copyright 2005 Blackwell Munksgaard.
Exploring consumer exposure pathways and patterns of use for chemicals in the environment.
Dionisio, Kathie L; Frame, Alicia M; Goldsmith, Michael-Rock; Wambaugh, John F; Liddell, Alan; Cathey, Tommy; Smith, Doris; Vail, James; Ernstoff, Alexi S; Fantke, Peter; Jolliet, Olivier; Judson, Richard S
2015-01-01
Humans are exposed to thousands of chemicals in the workplace, home, and via air, water, food, and soil. A major challenge in estimating chemical exposures is to understand which chemicals are present in these media and microenvironments. Here we describe the Chemical/Product Categories Database (CPCat), a new, publicly available (http://actor.epa.gov/cpcat) database of information on chemicals mapped to "use categories" describing the usage or function of the chemical. CPCat was created by combining multiple and diverse sources of data on consumer- and industrial-process-based chemical uses from regulatory agencies, manufacturers, and retailers in various countries. The database uses a controlled vocabulary of 833 terms and a novel nomenclature to capture and streamline descriptors of chemical use for 43,596 chemicals from the various sources. Examples of potential applications of CPCat are provided, including identifying chemicals to which children may be exposed and supporting the prioritization of chemicals for toxicity screening. CPCat is expected to be a valuable resource for regulators, risk assessors, and exposure scientists seeking to identify potential sources of human exposures and exposure pathways, particularly for use in high-throughput chemical exposure assessment.
A retrospective cohort study of Parkinson's disease in Korean shipbuilders.
Park, Jungsun; Yoo, Cheol-In; Sim, Chang Sun; Kim, Jae Woo; Yi, Yunjeong; Shin, Yong Chul; Kim, Dae-Hyun; Kim, Yangho
2006-05-01
We performed a retrospective cohort study in South Korea to clarify the role of occupational exposure, especially to welding, in the etiology of Parkinson's disease (PD). We constructed a database of subjects classified into an exposure group (blue-collar workers) and a non-exposure group (white-collar workers) in two shipbuilding companies. Jobs of blue-collar workers were categorized into a first group (welding), a second group (fitting, grinding and finishing, and cutting), and a third group (all other jobs). To determine new cases of PD during the follow-up period (1992-2003), we used the physician billing claims database of the National Health Insurance Corporation. For the PD patients detected in the physician billing claims database, a neurologist on our research team confirmed the appropriateness of each diagnosis by reviewing medical charts. Based on this review, we confirmed the number of new cases of PD and calculated the relative risk (RR) and 95% confidence intervals (CI) by Cox regression analysis. In a backward selection procedure, 'age' was a significant independent variable but exposure was not. Furthermore, the RR in welders (the high-exposure group) was also insignificant and less than that in others (the very-low-exposure group). This longitudinal study of shipbuilding workers supports our previous case-control studies suggesting that exposure to manganese does not increase the risk of PD.
Eisenstein, Neil; Kasavkar, Ganesh; Bhavsar, Dhruva; Khan, Faisal Shehzaad; Paskins, Zoe
2017-01-23
Atypical femoral fractures (AFFs) are rare events associated with increased duration of bisphosphonate exposure. Recommended management of AFFs includes cessation of bisphosphonates and imaging of the contralateral femur. The aims of this study were to identify the local incidence of AFFs in bisphosphonate users and to audit the medical management of AFFs against published recommendations. A retrospective analysis of the admissions database for a major trauma centre identified all femoral fractures (3150) in a five-year period (July 2009 to June 2014). Electronic health records and radiographs were reviewed using the 2013 American Society for Bone and Mineral Research (ASBMR) diagnostic criteria for AFF to establish the number of cases. To estimate incidence, the total number of bisphosphonate users was derived from primary care prescription and secondary care day-case records. Medical management of cases with AFF on bisphosphonates was audited against guidance from the ASBMR and the Medicines & Healthcare Products Regulatory Agency. 10 of the 3150 femoral fractures met the criteria for AFF; 7 of these patients had a history of exposure to bisphosphonates (6 oral, 1 intravenous). There were 19.1 AFFs per 100,000 years of bisphosphonate use in our region. Bisphosphonates were stopped and the contralateral femur imaged in only 2 of the 7 patients treated with bisphosphonates. Our local incidence is in line with published figures; however, this is the first published evidence suggesting that identification and medical management of AFF may be suboptimal. Managing these patients remains challenging due to their rarity and a possible lack of awareness.
Generalized Database Management System Support for Numeric Database Environments.
ERIC Educational Resources Information Center
Dominick, Wayne D.; Weathers, Peggy G.
1982-01-01
This overview of potential for utilizing database management systems (DBMS) within numeric database environments highlights: (1) major features, functions, and characteristics of DBMS; (2) applicability to numeric database environment needs and user needs; (3) current applications of DBMS technology; and (4) research-oriented and…
Exploring consumer pathways and patterns of use for ...
Background: Humans may be exposed to thousands of chemicals through contact in the workplace, home, and via air, water, food, and soil. A major challenge is estimating exposures to these chemicals, which requires understanding potential exposure routes directly related to how chemicals are used. Objectives: We aimed to assign “use categories” to a database of chemicals, including ingredients in consumer products, to help prioritize which chemicals will be given more scrutiny relative to human exposure potential and target populations. The goal was to identify (a) human activities that result in increased chemical exposure while (b) simplifying the dimensionality of hazard assessment for risk characterization. Methods: Major data sources on consumer- and industrial-process based chemical uses were compiled from multiple countries, including from regulatory agencies, manufacturers, and retailers. The resulting categorical chemical use and functional information are presented through the Chemical/Product Categories Database (CPCat). Results: CPCat contains information on 43,596 unique chemicals mapped to 833 terms categorizing their usage or function. Examples presented demonstrate potential applications of the CPCat database, including the identification of chemicals to which children may be exposed (including those that are not identified on product ingredient labels), and prioritization of chemicals for toxicity screening. The CPCat database is availabl
Tong, Shu-Hui; Liu, Yi-Ting; Liu, Yang
2013-02-01
To investigate the association between paternal exposure to occupational electromagnetic radiation and the sex ratio of the offspring. We searched various databases, including PubMed, Embase, Cochrane Library, OVID, Bioscience Information Service (BIOSIS), China National Knowledge Infrastructure, VIP Database for Chinese Technical Periodicals and Wanfang Database, for literature relevant to the association of paternal exposure to occupational electromagnetic radiation with the sex ratio of the offspring. We conducted a meta-analysis of their correlation using Stata 11.0. There was no statistically significant difference in the sex ratio between the offspring with paternal exposure to occupational electromagnetic radiation and those without (pooled OR = 1.00 [95% CI: 0.95-1.05], P = 0.875). Subgroup analyses of both case-control and cohort studies revealed no significant difference (pooled OR = 1.03 [95% CI: 0.99-1.08], P = 0.104 and pooled OR = 0.98 [95% CI: 0.99-1.08], P = 0.186, respectively). Paternal exposure to occupational electromagnetic radiation is not correlated with the sex ratio of the offspring.
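As background on how such pooled odds ratios are formed, the standard fixed-effect (inverse-variance) calculation combines each study's OR and 95% CI on the log scale. This is a generic textbook sketch with invented study numbers, not the authors' Stata analysis:

```python
import math

def pooled_or(studies):
    """Fixed-effect (inverse-variance) pooled odds ratio.

    Each study is (OR, lower 95% CI, upper 95% CI); the weight of a
    study is 1/SE^2, with SE recovered from the CI width on the log
    scale (CI half-width = 1.96 * SE).
    """
    num = den = 0.0
    for or_, lo, hi in studies:
        log_or = math.log(or_)
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2
        num += w * log_or
        den += w
    return math.exp(num / den)

# Two invented studies, not taken from the meta-analysis above.
print(round(pooled_or([(1.10, 0.90, 1.34), (0.95, 0.80, 1.13)]), 2))
```

Narrower confidence intervals yield larger weights, so precise studies dominate the pooled estimate.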
Data Model for Multi Hazard Risk Assessment Spatial Support Decision System
NASA Astrophysics Data System (ADS)
Andrejchenko, Vera; Bakker, Wim; van Westen, Cees
2014-05-01
The goal of the CHANGES Spatial Decision Support System is to support end-users in making decisions related to risk reduction measures for areas at risk from multiple hydro-meteorological hazards. The crucial parts in the design of the system are the user requirements, the data model, the data storage and management, and the relationships between the objects in the system. The data model is implemented entirely with an open source database management system with a spatial extension. The web application is implemented using open source geospatial technologies, with PostGIS as the database, Python for scripting, and GeoServer and JavaScript libraries for visualization and the client-side user interface. The model can handle information from different study areas (currently, study areas from France, Romania, Italy and Poland are considered). Furthermore, the data model handles information about administrative units; projects accessible by different types of users; user-defined hazard types (floods, snow avalanches, debris flows, etc.); hazard intensity maps of different return periods; spatial probability maps; elements-at-risk maps (buildings, land parcels, linear features, etc.); and economic and population vulnerability information dependent on the hazard type and the type of element at risk, in the form of vulnerability curves. The system has an inbuilt database of vulnerability curves, but users can also add their own. Included in the model is the management of combinations of different scenarios (e.g. related to climate change, land use change or population change) and alternatives (possible risk-reduction measures), as well as data structures for saving the calculated economic or population loss or exposure per element at risk, aggregating the loss and exposure using the administrative unit maps, and finally, producing the risk maps. The risk data can be used for cost-benefit analysis (CBA) and spatial multi-criteria evaluation (SMCE).
The data model includes data structures for CBA and SMCE. The model is at the stage where risk and cost-benefit calculations can be stored; the remaining parts are currently under development. Incorporating multi-criteria information and user management, and relating them to the rest of the model, is our next step. A carefully designed data model plays a crucial role in the development of the whole system: it enables rapid development, keeps the data consistent, and, in the end, supports the end-user in making good decisions on risk-reduction measures related to multiple natural hazards. This work is part of the EU FP7 Marie Curie ITN "CHANGES" project (www.changes-itn.edu).
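The loss calculation at the heart of such a system, combining a hazard intensity, an element at risk, and a vulnerability curve specific to the hazard and element type, can be sketched as follows. The class and field names are assumptions for illustration, not the CHANGES schema:

```python
# Minimal sketch of the risk calculation: a hazard intensity, an element
# at risk, and a piecewise-linear vulnerability curve combine into an
# expected loss per element. Names and numbers are illustrative only.
from dataclasses import dataclass

@dataclass
class ElementAtRisk:
    element_id: int
    value: float          # economic value, e.g. replacement cost

@dataclass
class VulnerabilityCurve:
    points: list          # [(intensity, damage_fraction), ...], sorted

    def damage(self, intensity):
        """Linearly interpolate the damage fraction at an intensity."""
        pts = self.points
        if intensity <= pts[0][0]:
            return pts[0][1]
        if intensity >= pts[-1][0]:
            return pts[-1][1]
        for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
            if x0 <= intensity <= x1:
                return y0 + (y1 - y0) * (intensity - x0) / (x1 - x0)

def element_loss(element, intensity, curve):
    """Expected loss = element value times interpolated damage fraction."""
    return element.value * curve.damage(intensity)

# e.g. flood depth 1.5 m against a curve reaching total damage at 2.0 m
curve = VulnerabilityCurve([(0.0, 0.0), (1.0, 0.3), (2.0, 1.0)])
house = ElementAtRisk(1, 100000.0)
print(element_loss(house, 1.5, curve))  # ~65000 (65% damage)
```

Per-element losses would then be aggregated over the administrative unit maps to produce the risk maps described above.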
Building Databases for Education. ERIC Digest.
ERIC Educational Resources Information Center
Klausmeier, Jane A.
This digest provides a brief explanation of what a database is; explains how a database can be used; identifies important factors that should be considered when choosing database management system software; and provides citations to sources for finding reviews and evaluations of database management software. The digest is concerned primarily with…
The purpose of this SOP is to describe the database storage organization, as well as describe the sources of data for each database used during the Arizona NHEXAS project and the "Border" study. Keywords: data; database; organization.
The National Human Exposure Assessment Sur...
Bujak-Pietrek, Stella; Mikołajczyk, Urszula; Szadkowska-Stańczyk, Irena; Stroszejn-Mrowca, Grazyna
2008-01-01
To evaluate occupational exposure to dusts, the Nofer Institute of Occupational Medicine in Łódź, in collaboration with the Chief Sanitary Inspectorate, has developed a national database to store the results of routine dust exposure measurements performed by occupational hygiene and environmental protection laboratories in Poland in the years 2001-2005. It was assumed that the collected information would be useful in analyzing workers' exposure to free crystalline silica (WKK)-containing dusts in Poland, identifying exceeded hygiene standards and showing relevant trends, which illustrate the dynamics of exposure in the years under study. Inhalable and respirable dust measurements using personal dosimetry were performed according to Polish standards PN-91/Z-04030/05 and PN-91/Z-04030/06. In total, 148 638 measurement records, provided by sanitary inspection services from all over Poland, were entered into the database. The database enables the estimation of occupational exposure to dust by the sectors of the national economy, according to the Polish Classification of Activity (PKD), and by kinds of dust. The highest exposure levels of inhalable and respirable dusts were found in coal mining. Also in this sector, almost 60% of surveys demonstrated exceeded current hygiene standards. High concentrations of both dust fractions (inhalable and respirable) and a considerable percentage of measurements exceeding hygiene standards were found in the manufacture of transport equipment (except cars), as well as in the chemical, mining (rock, sand, gravel and clay mines) and construction industries. The highest percentages of surveys (inhalable and respirable dust) showing exceeded hygiene standards were observed for coal dust with different contents of crystalline silica, organic dust containing more than 10% of SiO2, and highly fibrogenic dust containing more than 50% of SiO2.
ERIC Educational Resources Information Center
American Society for Information Science, Washington, DC.
This document contains abstracts of papers on database design and management which were presented at the 1986 mid-year meeting of the American Society for Information Science (ASIS). Topics considered include: knowledge representation in a bilingual art history database; proprietary database design; relational database design; in-house databases;…
The Network Configuration of an Object Relational Database Management System
NASA Technical Reports Server (NTRS)
Diaz, Philip; Harris, W. C.
2000-01-01
The networking and implementation of the Oracle Database Management System (ODBMS) requires developers to have knowledge of the UNIX operating system as well as all the features of the Oracle Server. The server is an object relational database management system (DBMS). By using distributed processing, processes are split up between the database server and client application programs. The DBMS handles all the responsibilities of the server. The workstations running the database application concentrate on the interpretation and display of data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bower, J.C.; Burford, M.J.; Downing, T.R.
The Integrated Baseline System (IBS) is an emergency management planning and analysis tool that is being developed under the direction of the US Army Nuclear and Chemical Agency (USANCA). The IBS Data Management Guide provides the background, as well as the operations and procedures needed to generate and maintain a site-specific map database. Data and system managers use this guide to manage the data files and database that support the administrative, user-environment, database management, and operational capabilities of the IBS. This document provides a description of the data files and structures necessary for running the IBS software and using the site map database.
ANALYSIS OF HUMAN ACTIVITY DATA FOR USE IN MODELING ENVIRONMENTAL EXPOSURES
Human activity data are a critical part of exposure models being developed by the US EPA's National Exposure Research Laboratory (NERL). An analysis of human activity data within NERL's Consolidated Human Activity Database (CHAD) was performed in two areas relevant to exposure ...
EXPOSURE TO PESTICIDES BY MEDIUM AND ROUTE: THE 90TH PERCENTILE AND RELATED UNCERTAINTIES
This study investigates distributions of exposure to chlorpyrifos and diazinon using the database generated in the state of Arizona by the National Human Exposure Assessment Survey (NHEXAS-AZ). Exposure to pesticide and associated uncertainties are estimated using probabilistic...
[Selected aspects of computer-assisted literature management].
Reiss, M; Reiss, G
1998-01-01
We report our own experience with a database manager. Bibliographic database managers are used to manage information resources: specifically, to maintain a database of references and to create bibliographies and reference lists for written works. A database manager allows the user to enter summary information (a record) for articles, book sections, books, dissertations, conference proceedings, and so on. Other features may include the ability to import references from different sources, such as MEDLINE. The word processing components generate reference lists and bibliographies in a variety of styles, and can build a reference list from a word-processor manuscript. The function and use of the software package EndNote 2 for Windows are described. Its advantages in meeting different requirements for citation style and the sort order of reference lists are emphasized.
Life After Press: The Role of the Picture Library in Communicating Astronomy to the Public
NASA Astrophysics Data System (ADS)
Evans, G. S.
2005-12-01
Science communication is increasingly led by the image, providing opportunities for 'visual' disciplines such as astronomy to receive greater public exposure. In consequence, there is a huge demand for good and exciting images within the publishing media. The picture library is a conduit linking image makers of all kinds to image buyers of all kinds. The image maker benefits from the exposure of their pictures to the people who want to use them, with minimal time investment, and with the safeguards of effective rights management. The image buyer benefits from a wide choice of images available from a single point of contact, stored in a database that offers a choice between subject-based and conceptual searches. By forming this link between astronomer, professional or amateur, and the publishing media, the picture library helps to make the wonders of astronomy visible to a wider public audience.
NASA Astrophysics Data System (ADS)
Wan, Meng; Wu, Chao; Wang, Jing; Qiu, Yulei; Xin, Liping; Mullender, Sjoerd; Mühleisen, Hannes; Scheers, Bart; Zhang, Ying; Nes, Niels; Kersten, Martin; Huang, Yongpan; Deng, Jinsong; Wei, Jianyan
2016-11-01
The ground-based wide-angle camera array (GWAC), a part of the SVOM space mission, will search for various types of optical transients by continuously imaging a field of view (FOV) of 5000 square degrees every 15 s. Each exposure consists of 36 × 4k × 4k pixels, typically resulting in 36 × ~175,600 extracted sources. For a modern time-domain astronomy project like GWAC, which produces massive amounts of data with a high cadence, it is challenging to search for short-timescale transients in both real-time and archived data, and to build long-term light curves for variable sources. Here, we develop a high-cadence, high-density light curve pipeline (HCHDLP) to process the GWAC data in real-time, and design a distributed shared-nothing database to manage the massive amount of archived data, which will be used to generate a source catalog with more than 100 billion records during 10 years of operation. First, we develop HCHDLP based on the column-store DBMS MonetDB, taking advantage of MonetDB's high performance when applied to massive data processing. To realize the real-time functionality of HCHDLP, we optimize the pipeline in its source association function, including both time and space complexity from outside the database (SQL semantics) and inside (RANGE-JOIN implementation), as well as in its strategy for building complex light curves. The optimized source association function is accelerated by three orders of magnitude. Second, we build a distributed database using a two-level time partitioning strategy via the MERGE TABLE and REMOTE TABLE technology of MonetDB. Intensive tests validate that our database architecture achieves both linear scalability in response time and concurrent access by multiple users. In summary, our studies provide guidance for a solution to GWAC real-time data processing and management of massive data.
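The source-association step the pipeline optimizes amounts to matching each newly extracted source against catalog positions within a small radius. A simplified pure-Python sketch of that logic follows; the real system performs it inside MonetDB with a range join, and the tolerance, data, and function names here are assumptions (the RA window also ignores the cos(dec) factor for brevity):

```python
# Toy source association: match detections to a catalog sorted by RA,
# using a bisect window on RA and a box cut on Dec. Illustrative only.
import bisect

def associate(catalog, detections, tol=0.0003):  # tol in degrees (~1")
    """catalog: sorted list of (ra, dec, source_id); detections: (ra, dec).
    Returns one matched source_id per detection (None if no counterpart)."""
    ras = [c[0] for c in catalog]
    matches = []
    for ra, dec in detections:
        lo = bisect.bisect_left(ras, ra - tol)
        hi = bisect.bisect_right(ras, ra + tol)
        best = None
        for cra, cdec, sid in catalog[lo:hi]:
            if abs(cdec - dec) <= tol:
                best = sid
                break
        matches.append(best)
    return matches

cat = sorted([(10.0001, 20.0001, 42), (10.5, 21.0, 43)])
print(associate(cat, [(10.0002, 20.0002), (11.0, 22.0)]))  # [42, None]
```

Matched detections extend an existing light curve; unmatched ones either create a new source or flag a transient candidate.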
Keeping Track of Our Treasures: Managing Historical Data with Relational Database Software.
ERIC Educational Resources Information Center
Gutmann, Myron P.; And Others
1989-01-01
Describes the way a relational database management system manages a large historical data collection project. Shows that such databases are practical to construct. States that the programming tasks involved are not for beginners, but the rewards of having data organized are worthwhile. (GG)
Content Independence in Multimedia Databases.
ERIC Educational Resources Information Center
de Vries, Arjen P.
2001-01-01
Investigates the role of data management in multimedia digital libraries, and its implications for the design of database management systems. Introduces the notions of content abstraction and content independence. Proposes a blueprint of a new class of database technology, which supports the basic functionality for the management of both content…
Development of expert systems for analyzing electronic documents
NASA Astrophysics Data System (ADS)
Abeer Yassin, Al-Azzawi; Shidlovskiy, S.; Jamal, A. A.
2018-05-01
The paper analyses a Database Management System (DBMS). Expert systems, databases, and database technology have become an essential component of everyday life in modern society. As databases are widely used in every organization with a computer system, data resource control and data management are very important [1]. A DBMS is the most significant tool developed to serve multiple users in a database environment, consisting of programs that enable users to create and maintain a database. This paper focuses on the development of a database management system for the General Directorate for Education of Diyala in Iraq (GDED) using CLIPS, Java NetBeans, and Alfresco, together with system components previously developed at Tomsk State University at the Faculty of Innovative Technology.
Hewett, Paul; Morey, Sandy Z; Holen, Brian M; Logan, Perry W; Olsen, Geary W
2012-01-01
A study was conducted to construct a job exposure matrix for the roofing granule mine and mill workers at four U.S. plants. Each plant mined different minerals and had unique departments and jobs. The goal of the study was to generate accurate estimates of the mean exposure to respirable crystalline silica for each cell of the job exposure matrix, that is, every combination of plant, department, job, and year represented in the job histories of the study participants. The objectives of this study were to locate, identify, and collect information on all exposure measurements ever collected at each plant; statistically analyze the data to identify deficiencies in the database; identify and resolve questionable measurements; identify all important process and control changes for each plant-department-job combination; construct a time line for each plant-department combination indicating periods where the equipment and conditions were unchanged; and finally, construct a job exposure matrix. After evaluation, 1871 respirable crystalline silica measurements and estimates remained. The primary statistic of interest was the mean exposure for each job exposure matrix cell. The average exposure for each of the four plants was 0.042 mg/m³ (Belle Mead, N.J.), 0.106 mg/m³ (Corona, Calif.), 0.051 mg/m³ (Little Rock, Ark.), and 0.152 mg/m³ (Wausau, Wis.), suggesting that there may be substantial differences in employee cumulative exposures. Using the database and the available plant information, the study team assigned an exposure category and mean exposure for every plant-department-job and time interval combination. Despite a fairly large database, the mean exposures for >95% of the job exposure matrix cells, or specific plant-department-job-year combinations, were estimated by analogy to similar jobs in the plant for which sufficient data were available. This approach preserved plant specificity, hopefully improving the usefulness of the job exposure matrix.
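The central computation, a mean exposure per matrix cell with sparse cells estimated by analogy to the same job across years, can be sketched as follows. The measurements and the specific fallback rule here are invented for illustration, not the study team's actual procedure:

```python
# Toy job exposure matrix (JEM): mean exposure per
# (plant, department, job, year) cell, with unmeasured cells filled from
# the mean of the same plant/department/job over all years ("analogy").
from collections import defaultdict
from statistics import mean

samples = [  # (plant, dept, job, year, mg_per_m3) -- invented data
    ("Corona", "mill", "operator", 1990, 0.12),
    ("Corona", "mill", "operator", 1990, 0.10),
    ("Corona", "mill", "operator", 1991, 0.08),
]

cells = defaultdict(list)
for plant, dept, job, year, x in samples:
    cells[(plant, dept, job, year)].append(x)

jem = {cell: mean(vals) for cell, vals in cells.items()}

def estimate(plant, dept, job, year):
    """Direct cell mean if measured; else the mean over all years for
    the same plant/dept/job; else None (no analog available)."""
    if (plant, dept, job, year) in jem:
        return jem[(plant, dept, job, year)]
    analog = [x for (p, d, j, _), vals in cells.items()
              for x in vals if (p, d, j) == (plant, dept, job)]
    return mean(analog) if analog else None

print(round(estimate("Corona", "mill", "operator", 1990), 4))  # 0.11
print(round(estimate("Corona", "mill", "operator", 1992), 4))  # 0.1
```

The second call illustrates the ">95% of cells" situation described above: no 1992 samples exist, so the cell is filled by analogy.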
ANALYSIS OF DISCRIMINATING FACTORS IN HUMAN ACTIVITIES THAT AFFECT EXPOSURE
Accurately modeling exposure to particulate matter (PM) and other pollutants ultimately involves the utilization of human location-activity databases to assist in understanding the potential variability of microenvironmental exposures. This paper critically considers and stati...
NASA Technical Reports Server (NTRS)
Moroh, Marsha
1988-01-01
A methodology for building interfaces of resident database management systems to a heterogeneous distributed database management system under development at NASA, the DAVID system, was developed. The feasibility of that methodology was demonstrated by construction of the software necessary to perform the interface task. The interface terminology developed in the course of this research is presented. The work performed and the results are summarized.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grahn, D.; Wright, B.J.; Carnes, B.A.
A research reactor for exclusive use in experimental radiobiology was designed and built at Argonne National Laboratory in the 1960s. It was located in a special addition to Building 202, which housed the Division of Biological and Medical Research. Its location assured easy access for all users to the animal facilities, and it was also near the existing gamma-irradiation facilities. The water-cooled, heterogeneous 200-kW(th) reactor, named JANUS, became the focal point for a range of radiobiological studies gathered under the rubric of "the JANUS program". The program ran from about 1969 to 1992 and included research at all levels of biological organization, from subcellular to organism. More than a dozen moderate- to large-scale studies with the B6CF1 mouse were carried out; these focused on the late effects of whole-body exposure to gamma rays or fission neutrons, in matching exposure regimes. In broad terms, these studies collected data on survival and on the pathology observed at death. A deliberate effort was made to establish the cause of death. This archive describes these late-effects studies and their general findings. The database includes exposure parameters, time of death, and the gross pathology and histopathology in codified form. A series of appendices describes all pathology procedures and codes, treatment or irradiation codes, and the manner in which the data can be accessed in the ORACLE database management system. A series of tables also presents summaries of the individual experiments in terms of radiation quality, sample sizes at entry, mean survival times by sex, and number of gross pathology and histopathology records.
Computer Security Products Technology Overview
1988-10-01
The security products this paper addresses fall into the areas of multi-user hosts, database management systems (DBMS), workstations, networks, and guards and gateways. Individual products may provide a portion of that protection, for example a password scheme, a file protection mechanism, or a secure database management system.
An Introduction to Database Management Systems.
ERIC Educational Resources Information Center
Warden, William H., III; Warden, Bette M.
1984-01-01
Description of database management systems for microcomputers highlights system features and factors to consider in microcomputer system selection. A method for ranking database management systems is explained and applied to a defined need, i.e., software support for indexing a weekly newspaper. A glossary of terms and 32-item bibliography are…
Uddin, Md Jamal; Groenwold, Rolf H H; de Boer, Anthonius; Gardarsdottir, Helga; Martin, Elisa; Candore, Gianmario; Belitser, Svetlana V; Hoes, Arno W; Roes, Kit C B; Klungel, Olaf H
2016-03-01
Instrumental variable (IV) analysis can control for unmeasured confounding, yet it has not been widely used in pharmacoepidemiology. We aimed to assess the performance of IV analysis using different IVs in multiple databases in a study of antidepressant use and hip fracture. Information on adults with at least one prescription of a selective serotonin reuptake inhibitor (SSRI) or tricyclic antidepressant (TCA) during 2001-2009 was extracted from the THIN (UK), BIFAP (Spain), and Mondriaan (Netherlands) databases. IVs were created using the proportion of SSRI prescriptions per practice or using the one, five, or ten previous prescriptions by a physician. Data were analysed using conventional Cox regression and two-stage IV models. In the conventional analysis, SSRI (vs. TCA) was associated with an increased risk of hip fracture, which was consistently found across databases: the adjusted hazard ratio (HR) was approximately 1.35 for time-fixed and 1.50 to 2.49 for time-varying SSRI use, while the IV analysis based on the IVs that appeared to satisfy the IV assumptions showed conflicting results, e.g. the adjusted HRs ranged from 0.55 to 2.75 for time-fixed exposure. IVs for time-varying exposure violated at least one IV assumption and were therefore invalid. This multiple database study shows that the performance of IV analysis varied across the databases for time-fixed and time-varying exposures and strongly depends on the definition of IVs. It remains challenging to obtain valid IVs in pharmacoepidemiological studies, particularly for time-varying exposure, and IV analysis should therefore be interpreted cautiously. Copyright © 2016 John Wiley & Sons, Ltd.
Li, Yuanfang; Zhou, Zhiwei
2016-02-01
Precision medicine is a new medical concept and medical model based on personalized medicine, the rapid progress of genome sequencing technology, and the cross-application of bioinformatics and big data science. Precision medicine improves the diagnosis and treatment of gastric cancer through deeper analyses of its characteristics, pathogenesis, and other core issues. A clinical cancer database is important for promoting the development of precision medicine, so close attention must be paid to the construction and management of such databases. The clinical database of Sun Yat-sen University Cancer Center is composed of a medical record database, a blood specimen bank, a tissue bank, and a medical imaging database. To ensure good data quality, the design and management of the database follow a strict standard operating procedure (SOP) model. Data sharing is an important way to advance medical research in the era of medical big data, and the construction and management of clinical databases must likewise be strengthened and innovated.
Exposure to chemicals in consumer products has been identified as a significant source of human exposure. To predict such exposures, information about the ingredients and their quantities in consumer products is required, but is often not available. The Chemicals and Products Dat...
Occupational radiation Exposure at Agreement State-Licensed Materials Facilities, 1997-2010
DOE Office of Scientific and Technical Information (OSTI.GOV)
U.S. Nuclear Regulatory Commission, Office of Nuclear Regulatory Research
The purpose of this report is to examine occupational radiation exposures received under Agreement State licensees. As such, this report reflects the occupational radiation exposure data contained in the Radiation Exposure Information and Reporting System (REIRS) database, for 1997 through 2010, from Agreement State-licensed materials facilities.
The future application of GML database in GIS
NASA Astrophysics Data System (ADS)
Deng, Yuejin; Cheng, Yushu; Jing, Lianwen
2006-10-01
In 2004, the Geography Markup Language (GML) Implementation Specification (version 3.1.1) was published by the Open Geospatial Consortium, Inc. More and more applications in geospatial data sharing and interoperability now depend on GML. The primary purpose of GML is the exchange and transport of geo-information through standard modeling and encoding of geographic phenomena. However, applications face the problem of how to organize and access large volumes of GML data effectively, and research on GML databases focuses on these problems. The effective storage of GML data is a hot topic in the GIS community today. A GML Database Management System (GDBMS) mainly deals with the storage and management of GML data. Two types of XML database can be distinguished: native XML databases and XML-enabled databases. Since GML is an application of the XML standard to geographic data, XML database systems can also be used for the management of GML. In this paper, we review the state of the art of XML databases, including storage, indexing, query languages, management systems and so on, and then move on to the GML database. At the end, the future prospects of GML databases in GIS applications are presented.
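Because GML is an XML application, a minimal "XML-enabled" storage step can be shown with the standard library alone: parse a GML point feature and flatten it into a row-like record. The GML namespace URI follows common GML usage; the feature element names (`city`, `name`) and coordinates are invented:

```python
# Parse a small GML fragment and flatten it to a row-like dict, the
# first step an XML-enabled database would take before relational storage.
import xml.etree.ElementTree as ET

GML = "http://www.opengis.net/gml"
doc = f"""
<city xmlns:gml="{GML}">
  <name>Utrecht</name>
  <gml:Point srsName="EPSG:4326">
    <gml:pos>52.09 5.12</gml:pos>
  </gml:Point>
</city>
"""

root = ET.fromstring(doc)
pos = root.find(f"{{{GML}}}Point/{{{GML}}}pos").text.split()
row = {
    "name": root.findtext("name"),
    "lat": float(pos[0]),
    "lon": float(pos[1]),
    "srs": root.find(f"{{{GML}}}Point").get("srsName"),
}
print(row)
```

A native XML database would instead index the document tree itself; the flattening above is the trade-off an XML-enabled (relational) store makes.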
Lavoue, J; Friesen, M C; Burstyn, I
2013-01-01
Inspectors from the US Occupational Safety and Health Administration (OSHA) have been collecting industrial hygiene samples since 1972 to verify compliance with Permissible Exposure Limits. Starting in 1979, these measurements were computerized into the Integrated Management Information System (IMIS). In 2010, a dataset of over 1 million personal sample results analysed at OSHA's central laboratory in Salt Lake City [Chemical Exposure Health Data (CEHD)], only partially overlapping the IMIS database, was placed into the public domain via the internet. We undertook this study to inform potential users about the relationship between this newly available OSHA data and IMIS, and to offer insight about the opportunities and challenges associated with the use of OSHA measurement data for occupational exposure assessment. We conducted a literature review of previous uses of IMIS in occupational health research and performed a descriptive analysis of the data recently made available, comparing them to the IMIS database for lead, the most frequently sampled agent. The literature review yielded 29 studies reporting use of IMIS data, but none using the CEHD data. Most studies focused on a single contaminant, with silica and lead being most frequently analysed. Sixteen studies addressed potential bias in IMIS, mostly by examining the association between exposure levels and ancillary information. Although no biases of appreciable magnitude were consistently reported across studies and agents, these assessments may have been obscured by selective under-reporting of non-detectable measurements. The CEHD data comprised 1 450 836 records from 1984 to 2009, not counting analytical blanks and erroneous records. Seventy-eight agents with >1000 personal samples yielded 1 037 367 records. Unlike IMIS, which contains administrative information (company size, job description), ancillary information in the CEHD data is mostly analytical.
When the IMIS and CEHD measurements of lead were merged, 23 033 records (39.2%) were common to both datasets, 10 681 (18.2%) appeared only in IMIS, and 25 012 (42.6%) appeared only in the CEHD database. While IMIS-only records represent data analysed in other laboratories, CEHD-only records suggest partial reporting of sampling results by OSHA inspectors into IMIS. For lead, the percentage of non-detects in the CEHD-only data was 71%, compared with 42% and 46% in the both-IMIS-and-CEHD and IMIS-only datasets, respectively, suggesting differential under-reporting of non-detects in IMIS. IMIS and the CEHD datasets represent the largest source of multi-industry exposure data in the USA and should be considered a valuable source of information for occupational exposure assessment. The lack of empirical data on biases and the difficulty of adequately interpreting non-detects in OSHA data, complicated by suspected differential under-reporting, remain the principal challenges to valid estimation of average exposure conditions. We advocate additional comparisons between IMIS and CEHD data and discuss analytical strategies that may play a key role in meeting these challenges.
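The record-linkage arithmetic described above is, at its core, set intersection and difference over matched sample identifiers. A toy sketch with hypothetical IDs (real linkage keys are more involved than a single ID, but the both/IMIS-only/CEHD-only partition follows the same logic):

```python
# Toy sketch of the IMIS/CEHD record-linkage arithmetic, using
# hypothetical sample IDs.
imis = {"s1", "s2", "s3", "s4"}
cehd = {"s3", "s4", "s5", "s6", "s7"}

both = imis & cehd          # records common to both datasets
imis_only = imis - cehd     # analysed in other labs, reported only to IMIS
cehd_only = cehd - imis     # sampled but never entered into IMIS

total = len(imis | cehd)
shares = {name: round(100 * len(s) / total, 1)
          for name, s in [("both", both), ("imis_only", imis_only),
                          ("cehd_only", cehd_only)]}
print(shares)  # {'both': 28.6, 'imis_only': 28.6, 'cehd_only': 42.9}
```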
Database Systems. Course Three. Information Systems Curriculum.
ERIC Educational Resources Information Center
O'Neil, Sharon Lund; Everett, Donna R.
This course is the third of seven in the Information Systems curriculum. The purpose of the course is to familiarize students with database management concepts and standard database management software. Databases and their roles, advantages, and limitations are explained. An overview of the course sets forth the condition and performance standard…
23 CFR 972.204 - Management systems requirements.
Code of Federal Regulations, 2012 CFR
2012-04-01
... to operate and maintain the management systems and their associated databases; and (5) A process for... systems will use databases with a geographical reference system that can be used to geolocate all database...
23 CFR 972.204 - Management systems requirements.
Code of Federal Regulations, 2011 CFR
2011-04-01
... to operate and maintain the management systems and their associated databases; and (5) A process for... systems will use databases with a geographical reference system that can be used to geolocate all database...
23 CFR 972.204 - Management systems requirements.
Code of Federal Regulations, 2010 CFR
2010-04-01
... to operate and maintain the management systems and their associated databases; and (5) A process for... systems will use databases with a geographical reference system that can be used to geolocate all database...
23 CFR 972.204 - Management systems requirements.
Code of Federal Regulations, 2013 CFR
2013-04-01
... to operate and maintain the management systems and their associated databases; and (5) A process for... systems will use databases with a geographical reference system that can be used to geolocate all database...
Carbon Dioxide - Our Common "Enemy"
NASA Technical Reports Server (NTRS)
James, John T.; Macatangay, Ariel
2009-01-01
Health effects of brief and prolonged exposure to carbon dioxide continue to be a concern for those of us who manage this pollutant in closed volumes, such as spacecraft and submarines. In both settings, considerable resources are required to scrub the atmosphere to levels considered totally safe for maintenance of crew health and performance. Defining safe levels is not a simple task because of many confounding factors, including the lack of a robust database on human exposures, suspected significant variations in individual susceptibility, variations in the endpoints used to assess potentially adverse effects, the added effects of stress, and the fluid shifts associated with microgravity (astronauts only). In 2007 the National Research Council proposed revised Continuous Exposure Guidance Levels (CEGLs) and Emergency Exposure Guidance Levels (EEGLs) to the U.S. Navy. Similarly, in 2008 the NASA Toxicology Group, in cooperation with another subcommittee of the National Research Council, revised the Spacecraft Maximum Allowable Concentrations (SMACs). In addition, a 1000-day exposure limit was set for long-duration spaceflights to celestial bodies. Herein we examine the rationale for the levels proposed to the U.S. Navy and compare it with the rationale used by NASA to set its limits. We include a critical review of previous studies on the effects of exposure to carbon dioxide and attempt to dissect out the challenges associated with setting fully defensible limits. We also describe recent experiences with management of carbon dioxide aboard the International Space Station with 13 persons aboard, including the tandem operation of the Russian Vozdukh and the U.S. Carbon Dioxide Removal System. A third removal system is present while the station is docked to the Shuttle spacecraft, so our experience also includes the lithium hydroxide system aboard Shuttle for the removal of carbon dioxide.
We discuss strategies for highly-efficient, regenerable removal of carbon dioxide that could meet the 1000-day SMAC of 0.5%, which would apply to long-duration voyages to Mars.
Wildland fire smoke and human health.
Cascio, Wayne E
2018-05-15
The natural cycle of landscape fire maintains the ecological health of the land, yet adverse health effects associated with exposure to emissions from wildfire pose public health and clinical challenges. Systematic reviews conclude that a positive association exists between exposure to wildfire smoke or wildfire particulate matter (PM2.5) and all-cause mortality and respiratory morbidity. Respiratory morbidity includes asthma, chronic obstructive pulmonary disease (COPD), bronchitis and pneumonia. The epidemiological data linking wildfire smoke exposure to cardiovascular mortality and morbidity are mixed and inconclusive, and more studies are needed to define the risk for common and costly clinical cardiovascular outcomes. Susceptible populations include people with respiratory and possibly cardiovascular diseases, middle-aged and older adults, children, pregnant women and the fetus. The increasing frequency of large wildland fires, the expansion of the wildland-urban interface (the area between unoccupied land and human development), and an increasing and aging U.S. population are enlarging the number of people at risk from wildfire smoke, highlighting the necessity of broadening stakeholder cooperation to address the health effects of wildfire. While much is known, many questions remain and require further population-based, clinical and occupational health research. Health effects measured over much wider geographical areas and for longer periods of time will better define the risk for adverse health outcomes, identify sensitive populations and assess the influence of social factors on the relationship between exposure and health outcomes. Improving exposure models and access to large clinical databases foreshadows improved risk analysis, facilitating more effective risk management. Fuel and smoke management remains an important component of protecting population health.
Improved smoke forecasting, together with translation of environmental health science into actionable information for public health officials, healthcare professionals and the public, is needed to motivate behaviors that lower exposure and protect public health, particularly among those at high risk. Published by Elsevier B.V.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watson, William A.; Litovitz, Toby L.; Belson, Martin G.
2005-09-01
The Toxic Exposure Surveillance System (TESS) is a uniform data set of US poison center cases. Categories of information include the patient, the caller, the exposure, the substance(s), clinical toxicity, treatment, and medical outcome. The TESS database was initiated in 1985 and provides a baseline of more than 36.2 million cases through 2003. The database has been utilized for a number of safety evaluations. The strengths and limitations of TESS data must be considered in data interpretation. Real-time toxicovigilance was initiated in 2003, with continuous uploading of new cases from all poison centers to a central database. Real-time toxicovigilance utilizing general and specific approaches is systematically run against TESS, further increasing the potential utility of poison center experiences as a means of early identification of potential public health threats.
Burnham, J F; Shearer, B S; Wall, J C
1992-01-01
Librarians have used bibliometrics for many years to assess collections and to provide data for making selection and deselection decisions. With the advent of new technology--specifically, CD-ROM databases and reprint file database management programs--new cost-effective procedures can be developed. This paper describes a recent multidisciplinary study conducted by two library faculty members and one allied health faculty member to test a bibliometric method that used the MEDLINE and CINAHL databases on CD-ROM and the Papyrus database management program to produce a new collection development methodology. PMID:1600424
Creating databases for biological information: an introduction.
Stein, Lincoln
2013-06-01
The essence of bioinformatics is dealing with large quantities of information. Whether it be sequencing data, microarray data files, mass spectrometric data (e.g., fingerprints), the catalog of strains arising from an insertional mutagenesis project, or even large numbers of PDF files, there inevitably comes a time when the information can simply no longer be managed with files and directories. This is where databases come into play. This unit briefly reviews the characteristics of several database management systems, including flat-file, indexed-file, relational, and NoSQL databases. It compares their strengths and weaknesses and offers some general guidelines for selecting an appropriate database management system. Copyright 2013 by John Wiley & Sons, Inc.
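As a minimal illustration of the unit's point about outgrowing files and directories, the sketch below loads a small, hypothetical insertional-mutagenesis strain catalog into SQLite, one of the simplest relational options; the schema and data are invented for illustration.

```python
import sqlite3

# Once flat files stop scaling, a single-file relational database such as
# SQLite is often the smallest next step. Hypothetical strain catalog.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE strain (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    insertion_site TEXT,
    phenotype TEXT)""")
con.executemany(
    "INSERT INTO strain (name, insertion_site, phenotype) VALUES (?, ?, ?)",
    [("mut-001", "chr2:1182", "slow growth"),
     ("mut-002", "chr5:977", "none"),
     ("mut-003", "chr2:4410", "temperature sensitive")])

# A declarative query that would otherwise require scanning flat files:
hits = con.execute(
    "SELECT name FROM strain WHERE insertion_site LIKE 'chr2:%' ORDER BY name"
).fetchall()
print(hits)  # [('mut-001',), ('mut-003',)]
```

An indexed, declarative query like this is where even a single-file relational database starts to beat grepping through directories of text files.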
Pouzou, Jane G.; Cullen, Alison C.; Yost, Michael G.; Kissel, John C.; Fenske, Richard A.
2018-01-01
Implementation of probabilistic analyses in exposure assessment can provide valuable insight into the risks of those at the extremes of population distributions, including more vulnerable or sensitive subgroups. Incorporation of these analyses into current regulatory methods for occupational pesticide exposure is enabled by the exposure data sets and associated data currently used in the risk assessment approach of the Environmental Protection Agency (EPA). Monte Carlo simulations were performed on exposure measurements from the Agricultural Handler Exposure Database and the Pesticide Handler Exposure Database along with data from the Exposure Factors Handbook and other sources to calculate exposure rates for three different neurotoxic compounds (azinphos methyl, acetamiprid, emamectin benzoate) across four pesticide-handling scenarios. Probabilistic estimates of doses were compared with the no observable effect levels used in the EPA occupational risk assessments. Some percentage of workers were predicted to exceed the level of concern for all three compounds: 54% for azinphos methyl, 5% for acetamiprid, and 20% for emamectin benzoate. This finding has implications for pesticide risk assessment and offers an alternative procedure that may be more protective of those at the extremes of exposure than the current approach. PMID:29105804
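The Monte Carlo workflow described above can be sketched in a few lines. All distributions and parameters below are purely illustrative stand-ins, not the EPA or AHED/PHED values used in the study, and the dose equation is a simplified generic form.

```python
import random

random.seed(42)

# Hedged sketch of a probabilistic dose estimate: dose = unit exposure x
# amount handled / body weight, each term drawn from a distribution.
N = 100_000
LEVEL_OF_CONCERN = 1.0  # in the same (arbitrary) dose units

def simulate_dose():
    unit_exposure = random.lognormvariate(mu=-1.0, sigma=0.8)  # mg per lb handled
    amount_handled = random.uniform(20, 200)                   # lb per day
    body_weight = random.gauss(80, 12)                         # kg
    return unit_exposure * amount_handled / body_weight

doses = [simulate_dose() for _ in range(N)]
exceed = sum(d > LEVEL_OF_CONCERN for d in doses) / N
print(f"fraction of simulated workers above the level of concern: {exceed:.3f}")
```

The study's percentages (54%, 5%, 20%) come from the same kind of tail computation, applied per compound and handling scenario with measured exposure distributions.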
Federated Web-accessible Clinical Data Management within an Extensible NeuroImaging Database
Keator, David B.; Wei, Dingying; Fennema-Notestine, Christine; Pease, Karen R.; Bockholt, Jeremy; Grethe, Jeffrey S.
2010-01-01
Managing vast datasets collected throughout multiple clinical imaging communities has become critical with the ever increasing and diverse nature of datasets. Development of data management infrastructure is further complicated by technical and experimental advances that drive modifications to existing protocols and acquisition of new types of research data to be incorporated into existing data management systems. In this paper, an extensible data management system for clinical neuroimaging studies is introduced: The Human Clinical Imaging Database (HID) and Toolkit. The database schema is constructed to support the storage of new data types without changes to the underlying schema. The complex infrastructure allows management of experiment data, such as image protocol and behavioral task parameters, as well as subject-specific data, including demographics, clinical assessments, and behavioral task performance metrics. Of significant interest, embedded clinical data entry and management tools enhance both consistency of data reporting and automatic entry of data into the database. The Clinical Assessment Layout Manager (CALM) allows users to create on-line data entry forms for use within and across sites, through which data is pulled into the underlying database via the generic clinical assessment management engine (GAME). Importantly, the system is designed to operate in a distributed environment, serving both human users and client applications in a service-oriented manner. Querying capabilities use a built-in multi-database parallel query builder/result combiner, allowing web-accessible queries within and across multiple federated databases. The system along with its documentation is open-source and available from the Neuroimaging Informatics Tools and Resource Clearinghouse (NITRC) site. PMID:20567938
Federated web-accessible clinical data management within an extensible neuroimaging database.
Ozyurt, I Burak; Keator, David B; Wei, Dingying; Fennema-Notestine, Christine; Pease, Karen R; Bockholt, Jeremy; Grethe, Jeffrey S
2010-12-01
Managing vast datasets collected throughout multiple clinical imaging communities has become critical with the ever increasing and diverse nature of datasets. Development of data management infrastructure is further complicated by technical and experimental advances that drive modifications to existing protocols and acquisition of new types of research data to be incorporated into existing data management systems. In this paper, an extensible data management system for clinical neuroimaging studies is introduced: The Human Clinical Imaging Database (HID) and Toolkit. The database schema is constructed to support the storage of new data types without changes to the underlying schema. The complex infrastructure allows management of experiment data, such as image protocol and behavioral task parameters, as well as subject-specific data, including demographics, clinical assessments, and behavioral task performance metrics. Of significant interest, embedded clinical data entry and management tools enhance both consistency of data reporting and automatic entry of data into the database. The Clinical Assessment Layout Manager (CALM) allows users to create on-line data entry forms for use within and across sites, through which data is pulled into the underlying database via the generic clinical assessment management engine (GAME). Importantly, the system is designed to operate in a distributed environment, serving both human users and client applications in a service-oriented manner. Querying capabilities use a built-in multi-database parallel query builder/result combiner, allowing web-accessible queries within and across multiple federated databases. The system along with its documentation is open-source and available from the Neuroimaging Informatics Tools and Resource Clearinghouse (NITRC) site.
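The parallel query builder/result combiner described above follows a fan-out/combine pattern: each federated site database is queried concurrently and the partial results are merged. A sketch with in-memory stand-ins for the HID's remote, service-oriented calls (site names and record fields are invented):

```python
from concurrent.futures import ThreadPoolExecutor

# In-memory stand-ins for per-site federated databases.
SITE_DATA = {
    "site_a": [{"subject": "A01", "diagnosis": "control"},
               {"subject": "A02", "diagnosis": "schizophrenia"}],
    "site_b": [{"subject": "B01", "diagnosis": "schizophrenia"}],
}

def query_site(site, predicate):
    # In the real system this would be a remote, service-oriented call.
    return [row for row in SITE_DATA[site] if predicate(row)]

def federated_query(predicate):
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(query_site, s, predicate) for s in SITE_DATA]
        combined = []
        for f in futures:
            combined.extend(f.result())  # result combiner
    return combined

rows = federated_query(lambda r: r["diagnosis"] == "schizophrenia")
print(sorted(r["subject"] for r in rows))  # ['A02', 'B01']
```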
Implementation of a data management software system for SSME test history data
NASA Technical Reports Server (NTRS)
Abernethy, Kenneth
1986-01-01
The implementation of a software system for managing Space Shuttle Main Engine (SSME) test/flight historical data is presented. The software system uses the database management system RIM7 for primary data storage and routine data management, but includes several FORTRAN programs, described here, which provide customized access to the RIM7 database. The consolidation, modification, and transfer of data from the database THIST to the RIM7 database THISRM is discussed. The RIM7 utility modules for generating some standard reports from THISRM and performing some routine updating and maintenance are briefly described. The FORTRAN accessing programs described include programs for initial loading of large data sets into the database, capturing data from files for database inclusion, and producing specialized statistical reports which cannot be provided by the RIM7 report generator utility. An expert system tutorial, constructed using the expert system shell product INSIGHT2, is described. Finally, a potential expert system, which would analyze data in the database, is outlined. This system could also use INSIGHT2 and would take advantage of RIM7's compatibility with the microcomputer database system RBase 5000.
Kimber, Melissa; Adham, Sami; Gill, Sana; McTavish, Jill; MacMillan, Harriet L
2018-02-01
Increasingly recognized as a distinct form of childhood maltreatment, children's exposure to intimate partner violence (IPV) has been shown to be associated with an array of negative psychosocial outcomes, including elevated risk for additional violence over the life course. Although studies have identified child exposure to IPV as a predictor of IPV perpetration in adulthood, no review has critically evaluated the methodology of this quantitative work. The present study examines the association between childhood exposure to IPV and the perpetration of IPV in adulthood based on a systematic review of the literature from inception to January 4, 2016. Databases searched included Medline, Embase, PsycINFO, CINAHL, Cochrane Database of Systematic Reviews, Sociological Abstracts and ERIC. Database searches were complemented with backward and forward citation chaining. Studies were critically appraised using the Quality Assessment Tool for Observational Cohort and Cross-Sectional Studies. Of 5601 articles identified by the search, 19 studies were included for data extraction. Sixteen of these studies found that child exposure to IPV was significantly and positively associated with adult IPV perpetration; three studies reported null findings. The methodological quality of the studies was low. Work thus far has tended to focus on child exposure to physical IPV and the perpetration of physical IPV within heterosexual contexts. In addition, measures of child exposure to IPV vary in their classification of what exposure entails. We critically discuss the strengths and limitations of the existing evidence and the theoretical frameworks informing this work. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Roberson, Sheri (Editor); Kelly, Bruce (Editor); Gettleman, Alan G. (Technical Monitor)
2001-01-01
This Conference convened approximately 86 registered participants of invited guest speakers, NASA presenters, and a broad spectrum of the Occupational Health disciplines representing NASA Headquarters and all NASA Field Centers. Two days' Professional Development Courses on Exposure Assessment Strategies and Statistics and on Advanced Cardiac Life Support training and recertification preceded the Conference. With the theme, 'Risk Assessment and Management in 2001,' conferees were first provided updates from the Program Principal Center Office and the Headquarters Office. Plenary sessions elaborated on several topics: biological terrorism, OSHA recordability, Workers' Compensation issues, Federal ergonomic standards, bridging aerospace medicine and occupational health-especially in management of risk in spaceflight, and EAP operations with mission failures. A keynote address dealt with resiliency skills for 21st century workers and two NASA astronaut speakers highlighted a tour of the Johnson Space Center. During discipline specific breakout sessions, current issues in occupational health management and policy, credentialing and privileging, health risk assessment, measurement and standardization, audits, database development, prevention and rehabilitation, international travel and infection control, employee assistance, nursing process, and environmental health were presented.
We use Bayesian uncertainty analysis to explore how to estimate pollutant exposures from biomarker concentrations. The growing number of national databases with exposure data makes such an analysis possible. They contain datasets of pharmacokinetic biomarkers for many polluta...
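A minimal sketch of this kind of Bayesian exposure reconstruction uses a forward pharmacokinetic model to map exposure to an expected biomarker level, then inverts it with Bayes' rule. Everything below (the one-compartment constant, observation, error, and prior) is invented for illustration; real analyses use full pharmacokinetic models and measured population data.

```python
import math

# Grid-approximation Bayesian update: a one-compartment model maps
# exposure E to expected biomarker level K * E; we observe one biomarker
# value with lognormal error and update a lognormal prior on E.
K = 0.5            # biomarker units per exposure unit (assumed kinetics)
OBS = 1.2          # observed biomarker concentration
SIGMA_OBS = 0.4    # measurement error, log scale
PRIOR_MU, PRIOR_SIGMA = 0.0, 1.0  # prior on log-exposure

grid = [0.01 * i for i in range(1, 1001)]  # candidate exposures

def log_normal_pdf(x, mu, sigma):
    return math.exp(-((math.log(x) - mu) ** 2) / (2 * sigma ** 2)) / (
        x * sigma * math.sqrt(2 * math.pi))

posterior = []
for e in grid:
    prior = log_normal_pdf(e, PRIOR_MU, PRIOR_SIGMA)
    like = log_normal_pdf(OBS, math.log(K * e), SIGMA_OBS)
    posterior.append(prior * like)

z = sum(posterior)
posterior = [p / z for p in posterior]
post_mean = sum(e * p for e, p in zip(grid, posterior))
print(f"posterior mean exposure: {post_mean:.2f}")
```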
ExpoCastDB: A Publicly Accessible Database for Observational Exposure Data
The application of environmental informatics tools for human health risk assessment will require the development of advanced exposure information technology resources. Exposure data for chemicals is often not readily accessible. There is a pressing need for easily accessible, che...
Formaldehyde exposure in U.S. industries from OSHA air sampling data.
Lavoue, Jerome; Vincent, Raymond; Gerin, Michel
2008-09-01
National occupational exposure databanks have been cited as sources of exposure data for exposure surveillance and for exposure assessment in occupational epidemiology. Formaldehyde exposure data recorded in the U.S. Integrated Management Information System (IMIS) between 1979 and 2001 were collected to construct a multi-industry retrospective picture of formaldehyde exposures and to identify exposure determinants. Because of the database design, only detected personal measurement results (n = 5228) were analyzed, using linear mixed-effect models, which explained 29% of the total variance. Short-term measurement results were higher than time-weighted average (TWA) data and decreased 18% per year until 1987 (TWA data 5% per year) and 5% per year thereafter (TWA data 4% per year). Exposure varied across industries, with the highest estimated TWA geometric means (GM) for 2001 in the reconstituted wood products, structural wood members, and wood dimension and flooring industries (GM = 0.20 mg/m³). The highest short-term GMs estimated for 2001 were in the funeral service and crematory and reconstituted wood products industries (GM = 0.35 mg/m³). Exposure levels in IMIS were marginally higher during nonprogrammed inspections than during programmed inspections. Exposure levels tended to decrease with increasing exterior temperature in the cold range (-5% per 5°C for T < 15°C) but to increase in the warm range (+15% per 5°C for T > 15°C). Concentrations measured during the same inspection were correlated and varied differently across industries and sample types (TWA, short term). Sensitivity analyses using Tobit regression suggested that the average bias caused by excluding non-detects is approximately 30%, and potentially higher for short-term data if many non-detects were actually short-term measurements.
Although limited by availability of relevant exposure determinants and potential selection biases in IMIS, these results provide useful insight on formaldehyde occupational exposure in the United States in the last two decades. The authors recommend that more information on exposure determinants be recorded in IMIS.
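The non-detect bias discussed above is easy to demonstrate on toy numbers: dropping censored results inflates a geometric mean, while the common LOD/√2 substitution keeps them in the computation. The values below are invented; Tobit-style censored regression, as used in the paper's sensitivity analyses, is the more rigorous treatment.

```python
import math

# Toy illustration of non-detect handling. None marks a non-detect.
LOD = 0.05  # mg/m3, hypothetical limit of detection
measurements = [0.31, 0.12, 0.55, None, 0.08, None, 0.21, None]

def geometric_mean(values):
    return math.exp(sum(math.log(v) for v in values) / len(values))

detected = [v for v in measurements if v is not None]
substituted = [v if v is not None else LOD / math.sqrt(2)
               for v in measurements]

gm_detected_only = geometric_mean(detected)     # excludes non-detects
gm_substituted = geometric_mean(substituted)    # LOD/sqrt(2) substitution
print(f"GM, detects only: {gm_detected_only:.3f}")
print(f"GM, with substitution: {gm_substituted:.3f}")
```

The detects-only GM is systematically higher, which is the direction of bias the authors estimate at roughly 30% for these data.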
Asbestos Exposure Assessment Database
NASA Technical Reports Server (NTRS)
Arcot, Divya K.
2010-01-01
Exposure to particular hazardous materials in a work environment is dangerous to the employees who work directly with or around the materials as well as those who come in contact with them indirectly. In order to maintain a national standard for safe working environments and protect worker health, the Occupational Safety and Health Administration (OSHA) has set forth numerous precautionary regulations. NASA has been proactive in adhering to these regulations by implementing standards which are often stricter than regulation limits and administering frequent health risk assessments. The primary objective of this project is to create the infrastructure for an Asbestos Exposure Assessment Database specific to NASA Johnson Space Center (JSC), which will compile all of the exposure assessment data into a well-organized, navigable format. The data include sample types, sample durations, crafts of those from whom samples were collected, Job Performance Requirement (JPR) numbers, Phase Contrast Microscopy (PCM) and Transmission Electron Microscopy (TEM) results and qualifiers, Personal Protective Equipment (PPE), and names of the industrial hygienists who performed the monitoring. This database will allow NASA to provide OSHA with specific information demonstrating that JSC's work procedures are protective enough to minimize the risk of future disease from the exposures. The data have been collected by the NASA contractors Computer Sciences Corporation (CSC) and Wyle Laboratories. The personal exposure samples were collected from devices worn by laborers working at JSC and by building occupants located in asbestos-containing buildings.
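The fields listed above suggest a straightforward relational layout. A hypothetical sketch in SQLite follows; the column names and sample row are illustrative only, not JSC's actual schema or data.

```python
import sqlite3

# Hypothetical single-table layout for the fields described above.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE asbestos_sample (
    sample_id INTEGER PRIMARY KEY,
    sample_type TEXT,       -- e.g. personal, area, clearance
    duration_min REAL,
    craft TEXT,             -- craft of the monitored worker
    jpr_number TEXT,        -- Job Performance Requirement
    pcm_result REAL,        -- fibers/cc by phase contrast microscopy
    tem_result REAL,        -- structures/cc by TEM
    ppe TEXT,
    hygienist TEXT)""")
con.execute(
    "INSERT INTO asbestos_sample (sample_type, duration_min, craft,"
    " jpr_number, pcm_result, tem_result, ppe, hygienist)"
    " VALUES (?,?,?,?,?,?,?,?)",
    ("personal", 480, "pipefitter", "JPR-0001", 0.004, 0.001,
     "half-face respirator", "J. Doe"))

row = con.execute(
    "SELECT craft, pcm_result FROM asbestos_sample"
    " WHERE sample_type = 'personal'"
).fetchone()
print(row)  # ('pipefitter', 0.004)
```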
Dietary Exposure Potential Model
Existing food consumption and contaminant residue databases, typically products of nutrition and regulatory monitoring, contain useful information to characterize dietary intake of environmental chemicals. A PC-based model with resident database system, termed the Die...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Yubin; Shankar, Mallikarjun; Park, Byung H.
Designing a database system for both efficient data management and data services has been one of the enduring challenges in the healthcare domain. In many healthcare systems, data services and data management are often viewed as two orthogonal tasks; data services refer to retrieval and analytic queries such as search, joins, statistical data extraction, and simple data mining algorithms, while data management refers to building error-tolerant and non-redundant database systems. The gap between service and management has resulted in rigid database systems and schemas that do not support effective analytics. We compose a rich graph structure from an abstracted healthcare RDBMS to illustrate how we can fill this gap in practice. We show how a healthcare graph can be automatically constructed from a normalized relational database using the proposed 3NF Equivalent Graph (3EG) transformation. We discuss a set of real-world graph queries, such as finding self-referrals, shared providers, and collaborative filtering, and evaluate their performance over a relational database and its 3EG-transformed graph. Experimental results show that the graph representation serves as multiple de-normalized tables, thus reducing complexity in a database and enhancing data accessibility for users. Based on this finding, we propose an ensemble framework of databases for healthcare applications.
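The row-to-node, foreign-key-to-edge idea behind a relational-to-graph transformation can be sketched in a few lines. The tables and the shared-providers query below are illustrative, not the paper's exact 3EG algorithm.

```python
from collections import defaultdict

# Tiny normalized "tables": visits carries foreign keys to patients
# and providers.
visits = [
    {"id": "v1", "patient_id": "p1", "provider_id": "d1"},
    {"id": "v2", "patient_id": "p2", "provider_id": "d1"},
    {"id": "v3", "patient_id": "p1", "provider_id": "d2"},
]

# Build an undirected patient-provider graph: rows become nodes,
# foreign keys become edges.
graph = defaultdict(set)
for visit in visits:
    p, d = visit["patient_id"], visit["provider_id"]
    graph[p].add(d)
    graph[d].add(p)

def shared_provider(pa, pb):
    """Providers shared by two patients: a one-hop graph query that
    would need a self-join through the visits table in the RDBMS."""
    return sorted(graph[pa] & graph[pb])

print(shared_provider("p1", "p2"))  # ['d1']
```

The payoff the paper measures is exactly this: queries that are multi-way joins in the relational form become neighborhood lookups in the graph form.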
23 CFR 971.204 - Management systems requirements.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS HIGHWAYS FOREST SERVICE... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases...
23 CFR 970.204 - Management systems requirements.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS HIGHWAYS NATIONAL PARK... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases...
NASA Astrophysics Data System (ADS)
Bhanumurthy, V.; Venugopala Rao, K.; Srinivasa Rao, S.; Ram Mohan Rao, K.; Chandra, P. Satya; Vidhyasagar, J.; Diwakar, P. G.; Dadhwal, V. K.
2014-11-01
Geographical Information Science (GIS) has now graduated from traditional desktop systems to Internet systems. Internet GIS is emerging as one of the most promising technologies for addressing Emergency Management. Web services with different privileges play an important role in disseminating emergency services to decision makers. A spatial database is one of the most important components in the successful implementation of Emergency Management. It contains spatial data in the form of raster and vector layers linked with non-spatial information. Comprehensive data are required to handle the different phases of an emergency. The database elements comprise core data, hazard-specific data, corresponding attribute data, and live data coming from remote locations. Core datasets are the minimum required data, including base, thematic, and infrastructure layers, needed to handle disasters. Disaster-specific information is required to handle particular disaster situations such as flood, cyclone, forest fire, earthquake, landslide, and drought. In addition, Emergency Management requires many types of data with spatial and temporal attributes that should be made available to the key players in the right format at the right time. The vector database needs to be complemented with satellite imagery of appropriate resolution for visualisation and analysis in disaster management. The database is therefore interconnected and comprehensive, to meet the requirements of Emergency Management. This kind of integrated, comprehensive, and structured database with appropriate information is required to deliver the right information at the right time to the right people. However, building a spatial database for Emergency Management is a challenging task because of key issues such as availability of data, sharing policies, compatible geospatial standards, and data interoperability.
Therefore, to facilitate using, sharing, and integrating spatial data, there is a need to define standards for building emergency database systems. These include i) data integration procedures, namely a standard coding scheme, schema, metadata format, and spatial format; ii) database organisation mechanisms covering data management, catalogues, and data models; and iii) database dissemination through a suitable environment, as a standard service, for effective service dissemination. The National Database for Emergency Management (NDEM) is such a comprehensive database for addressing disasters in India at the national level. This paper explains standards for integrating and organising the multi-scale and multi-source data of NDEM and for enabling effective emergency response through customized user interfaces. It presents a standard procedure for building comprehensive emergency information systems that enable emergency-specific functions through geospatial technologies.
NASA Astrophysics Data System (ADS)
Velazquez, Enrique Israel
Improvements in medical and genomic technologies have dramatically increased the production of electronic data over the last decade. As a result, data management is rapidly becoming a major determinant, and urgent challenge, for the development of Precision Medicine. Although successful data management is achievable using Relational Database Management Systems (RDBMS), exponential data growth is a significant contributor to failure scenarios. Growing amounts of data can also be observed in other sectors, such as economics and business, which, together with the previous facts, suggests that alternate database approaches (NoSQL) may soon be required for efficient storage and management of big databases. However, this hypothesis has been difficult to test in the Precision Medicine field, since alternate database architectures are complex to assess and means to integrate heterogeneous electronic health records (EHR) with dynamic genomic data are not easily available. In this dissertation, we present a novel set of experiments for identifying NoSQL database approaches that enable effective data storage and management in Precision Medicine, using patients' clinical and genomic information from The Cancer Genome Atlas (TCGA). The first experiment draws on performance and scalability from biologically meaningful queries with differing complexity and database sizes. The second experiment measures performance and scalability in database updates without schema changes. The third experiment assesses performance and scalability in database updates with schema modifications due to dynamic data. We have identified two NoSQL approaches, based on Cassandra and Redis, which appear to be the most suitable database management systems for our precision medicine queries in terms of performance and scalability. We present these NoSQL approaches and show how they can be used to manage clinical and genomic big data.
Our research is relevant to public health because we focus on one of the main challenges to the development of Precision Medicine and, consequently, investigate a potential solution to the progressively increasing demands on health care.
EV@LUTIL: An open access database on occupational exposures to asbestos and man-made mineral fibres.
Orlowski, Ewa; Audignon-Durand, Sabyne; Goldberg, Marcel; Imbernon, Ellen; Brochard, Patrick
2015-10-01
The aim of Evalutil is to document occupational exposures to asbestos and man-made mineral fibers. These databases provide grouped descriptive and metrological data from observed situations of occupational exposure, collected through the analysis of scientific articles and technical reports by industrial hygienists. Over 5,000 measurements have been collected. We describe the occupations, economic activities, fiber-containing products, and operations on them that have been documented most often. Graphical syntheses of these measurements show that the documented situations for asbestos and refractory ceramic fibres (RCF), though not those for mineral wools, mainly report fiber concentrations above historical occupational exposure limits. Free access to these data in French and in English on the Internet (https://ssl2.isped.u-bordeaux2.fr/eva_003/) helps public health and prevention professionals to identify and characterize occupational exposures to fibers. Recently extended to nanoscale particles, Evalutil continues to contribute to improving knowledge about exposure to inhaled particles and the associated health risks. © 2015 Wiley Periodicals, Inc.
Database Security: What Students Need to Know
ERIC Educational Resources Information Center
Murray, Meg Coffin
2010-01-01
Database security is a growing concern evidenced by an increase in the number of reported incidents of loss of or unauthorized exposure to sensitive data. As the amount of data collected, retained and shared electronically expands, so does the need to understand database security. The Defense Information Systems Agency of the US Department of…
Databases for multilevel biophysiology research available at Physiome.jp.
Asai, Yoshiyuki; Abe, Takeshi; Li, Li; Oka, Hideki; Nomura, Taishin; Kitano, Hiroaki
2015-01-01
Physiome.jp (http://physiome.jp) is a portal site inaugurated in 2007 to support model-based research in physiome and systems biology. At Physiome.jp, several tools and databases are available to support construction of physiological, multi-hierarchical, large-scale models. There are three databases in Physiome.jp, housing mathematical models, morphological data, and time-series data. In late 2013, the site was fully renovated, and in May 2015, new functions were implemented to provide information infrastructure to support collaborative activities for developing models and performing simulations within the database framework. This article describes updates to the databases implemented since 2013, including cooperation among the three databases, interactive model browsing, user management, version management of models, management of parameter sets, and interoperability with applications.
Creating databases for biological information: an introduction.
Stein, Lincoln
2002-08-01
The essence of bioinformatics is dealing with large quantities of information. Whether it be sequencing data, microarray data files, mass spectrometric data (e.g., fingerprints), the catalog of strains arising from an insertional mutagenesis project, or even large numbers of PDF files, there inevitably comes a time when the information can simply no longer be managed with files and directories. This is where databases come into play. This unit briefly reviews the characteristics of several database management systems, including flat file, indexed file, and relational databases, as well as ACeDB. It compares their strengths and weaknesses and offers some general guidelines for selecting an appropriate database management system.
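The tipping point this unit describes, where files and directories give way to a database, can be made concrete with a small sketch (all strain records and field names are invented for illustration): the same catalog answered by a flat-file linear scan and by an indexed relational query, here using Python's built-in sqlite3 module.

```python
import sqlite3

# Hypothetical strain records from an insertional mutagenesis project
# (all identifiers invented for illustration).
strains = [
    ("str-001", "chr2", "unc-22"),
    ("str-002", "chr3", "dpy-10"),
    ("str-003", "chr2", "lin-12"),
]

# Flat-file style: tab-delimited lines answered by a linear scan.
flat = "\n".join("\t".join(rec) for rec in strains)
hits_flat = [line for line in flat.splitlines() if line.split("\t")[1] == "chr2"]

# Relational style: the same question as a declarative, indexed query.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE strain (id TEXT PRIMARY KEY, chrom TEXT, gene TEXT)")
db.execute("CREATE INDEX idx_chrom ON strain (chrom)")
db.executemany("INSERT INTO strain VALUES (?, ?, ?)", strains)
hits_sql = db.execute(
    "SELECT id FROM strain WHERE chrom = ? ORDER BY id", ("chr2",)
).fetchall()

print(len(hits_flat), [r[0] for r in hits_sql])
```

At three records the two styles are interchangeable; the relational approach earns its keep once the data no longer fit comfortably in a single scanned file.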
Mass-storage management for distributed image/video archives
NASA Astrophysics Data System (ADS)
Franchi, Santina; Guarda, Roberto; Prampolini, Franco
1993-04-01
The realization of an image/video database requires a specific design for both the database structures and mass storage management. This issue was addressed in the digital image/video database system designed at the IBM SEMEA Scientific & Technical Solution Center. Proper database structures have been defined to catalog image/video coding techniques with their related parameters, and the description of image/video contents. User workstations and servers are distributed along a local area network. Image/video files are not managed directly by the DBMS server. Because of their large size, they are stored outside the database on network devices. The database contains the pointers to the image/video files and the description of the storage devices. The system can use different kinds of storage media, organized in a hierarchical structure. Three levels of functions are available to manage the storage resources. The functions of the lower level provide media management: they allow devices to be cataloged and device status and network location to be modified. The medium level manages image/video files on a physical basis: it handles file migration between high-capacity media and low-access-time media. The functions of the upper level work on image/video files on a logical basis, as they archive, move and copy image/video data selected by user-defined queries. These functions are used to support the implementation of a storage management strategy. The database information about the characteristics of both storage devices and coding techniques is used by the third-level functions to meet delivery/visualization requirements and to reduce archiving costs.
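The pointer-based arrangement described above (metadata in the DBMS, bulk image/video files outside it) can be sketched minimally as follows; the schema and device categories are illustrative assumptions, not the original IBM design.

```python
import os
import sqlite3
import tempfile

# The archive payload lives outside the database on a storage device;
# the DBMS keeps only a pointer plus device metadata.
root = tempfile.mkdtemp()
payload = os.path.join(root, "clip0001.bin")
with open(payload, "wb") as f:
    f.write(b"\x00" * 1024)  # stand-in for encoded video data

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE device (id INTEGER PRIMARY KEY, kind TEXT, status TEXT)")
db.execute("""CREATE TABLE asset (
    id INTEGER PRIMARY KEY, codec TEXT, device_id INTEGER, path TEXT)""")
db.execute("INSERT INTO device VALUES (1, 'magnetic-disk', 'online')")
db.execute("INSERT INTO asset VALUES (1, 'MJPEG', 1, ?)", (payload,))

# Retrieval: resolve the pointer through the catalog, then read the
# file from its device only if that device is currently online.
path, status = db.execute("""
    SELECT asset.path, device.status FROM asset
    JOIN device ON device.id = asset.device_id WHERE asset.id = 1""").fetchone()
data = open(path, "rb").read() if status == "online" else None
print(len(data))
```

Migration between media levels then amounts to moving the file and updating `device_id` and `path`, without touching the rest of the catalog.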
A spatial-temporal system for dynamic cadastral management.
Nan, Liu; Renyi, Liu; Guangliang, Zhu; Jiong, Xie
2006-03-01
A practical spatio-temporal database (STDB) technique for dynamic urban land management is presented. One of the STDB models, the expanded model of Base State with Amendments (BSA), is selected as the basis for developing the dynamic cadastral management technique. Two approaches, the Section Fast Indexing (SFI) and the Storage Factors of Variable Granularity (SFVG), are used to improve the efficiency of the BSA model. Both spatial graphic data and attribute data, through a succinct engine, are stored in standard relational database management systems (RDBMS) for the actual implementation of the BSA model. The spatio-temporal database is divided into three interdependent sub-databases: present DB, history DB and the procedures-tracing DB. The efficiency of database operation is improved by the database connection in the bottom layer of the Microsoft SQL Server. The spatio-temporal system can be provided at a low-cost while satisfying the basic needs of urban land management in China. The approaches presented in this paper may also be of significance to countries where land patterns change frequently or to agencies where financial resources are limited.
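The Base State with Amendments idea can be illustrated with a minimal sketch: a full snapshot at a base time plus a time-ordered list of deltas, replayed to reconstruct the cadastre at any requested year. Parcel IDs and attributes are invented, and the SFI/SFVG indexing optimizations from the paper are omitted.

```python
# Base State with Amendments (BSA), minimally sketched: a full snapshot
# ("base state") plus a time-ordered list of amendments.
base_state = {"parcel-7": {"owner": "A", "use": "farmland"}}  # state in 2000

# Each amendment: (effective_year, parcel_id, new_attributes or None = retired)
amendments = [
    (2002, "parcel-7", {"owner": "B", "use": "farmland"}),
    (2004, "parcel-7", {"owner": "B", "use": "residential"}),
    (2004, "parcel-8", {"owner": "C", "use": "commercial"}),
]

def state_at(year):
    """Reconstruct the cadastre by replaying amendments up to `year`."""
    state = {k: dict(v) for k, v in base_state.items()}
    for t, pid, attrs in amendments:
        if t <= year:
            if attrs is None:
                state.pop(pid, None)   # parcel retired
            else:
                state[pid] = dict(attrs)
    return state

print(state_at(2003)["parcel-7"]["owner"])  # owner after the 2002 transfer
```

The SFI and SFVG approaches in the paper address exactly the weakness visible here: replaying every amendment from the base state gets slower as history grows, so intermediate states and variable-granularity storage factors are introduced to shorten the replay.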
Expanding on Successful Concepts, Models, and Organization
If the goal of the AEP framework was to replace existing exposure models or databases for organizing exposure data with a concept, we would share Dr. von Göetz's concerns. Instead, the outcome we promote is broader use of an organizational framework for exposure science. The f...
Leveraging Publicly-Available Consumer Product and Chemical Data in Support of Exposure Modeling
Near-field contact with chemicals in consumer products has been identified as a significant source of human exposure. To predict such exposures, information about chemical occurrence in consumer products is required, but is often not available. The Chemicals and Products Database...
76 FR 16301 - Flubendiamide; Pesticide Tolerances
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-23
....O. Box 12014, Research Triangle Park, NC 27709-2014. The petition requested that 40 CFR 180.639 be... maximization test. In the mammalian toxicology database, the primary target organ of flubendiamide exposure is... exposures from flubendiamide in food for the proposed new uses as follows: i. Acute exposure. Quantitative...
Near-field exposure to chemicals in consumer products has been identified as a significant source of exposure for many chemicals. Quantitative data on product chemical composition and weight fraction is a key parameter for characterizing this exposure. While data on product compo...
War and remembrance: Combat exposure in young adulthood and memory function sixty years later.
Nevarez, Michael D; Malone, Johanna C; Rentz, Dorene M; Waldinger, Robert J
2017-01-01
Identifying adaptive ways to cope with extreme stress is essential to promoting long-term health. Memory systems are highly sensitive to stress, and combat exposure during war has been shown to have deleterious effects on cognitive processes, such as memory, decades later. No studies have examined coping styles used by combat veterans and associations with later-life cognitive functioning. Defenses are coping mechanisms that manage difficult memories and feelings, with some more closely related to memory processes (e.g., suppression, repression). Utilizing a longitudinal database, we assessed how reliance on certain defense mechanisms after World War II combat exposure could affect cognitive health 60 years later. Data spanning 75 years were available on 71 men who had post-war assessment of combat exposure, defense mechanism ratings (ages 19-50), and late-life neuropsychological testing. Interaction models of combat exposure with defenses predicting late-life memory were examined. In bivariate analyses, greater reliance on suppression correlated with worse memory performance (r = -0.30, p = .01), but greater reliance on repression did not. Greater reliance on suppression strengthened the link between combat exposure and worse memory in late life (R² = 0.24, p < .001). In contrast, greater reliance on repression attenuated the link between combat exposure and poorer late-life memory (R² = 0.19, p < .001). Results suggest that coping styles may affect the relationship between early-adult stress and late-life cognition. Findings highlight the importance of understanding how coping styles may impact cognitive functioning as people move through adult life. Copyright © 2016 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Deutsch, Donald R.
This report describes a research effort that was carried out over a period of several years to develop and demonstrate a methodology for evaluating proposed Database Management System designs. The major proposition addressed by this study is embodied in the thesis statement: Proposed database management system designs can be evaluated best through…
NASA Technical Reports Server (NTRS)
1990-01-01
In 1981 Wayne Erickson founded Microrim, Inc., a company originally focused on marketing a microcomputer version of RIM (Relational Information Manager). Dennis Comfort joined the firm and is now vice president, development. The team developed an advanced spinoff from the NASA system they had originally created, a microcomputer database management system known as R:BASE 4000. Microrim added many enhancements and developed a series of R:BASE products for various environments. R:BASE is now the second largest selling line of microcomputer database management software in the world.
NBIC: National Ballast Information Clearinghouse
AGRICULTURAL BEST MANAGEMENT PRACTICE EFFECTIVENESS DATABASE
Resource Purpose:The Agricultural Best Management Practice Effectiveness Database contains the results of research projects which have collected water quality data for the purpose of determining the effectiveness of agricultural management practices in reducing pollutants ...
Evaluation of relational and NoSQL database architectures to manage genomic annotations.
Schulz, Wade L; Nelson, Brent G; Felker, Donn K; Durant, Thomas J S; Torres, Richard
2016-12-01
While the adoption of next generation sequencing has rapidly expanded, the informatics infrastructure used to manage the data generated by this technology has not kept pace. Historically, relational databases have provided much of the framework for data storage and retrieval. Newer technologies based on NoSQL architectures may provide significant advantages in storage and query efficiency, thereby reducing the cost of data management. But their relative advantage when applied to biomedical data sets, such as genetic data, has not been characterized. To this end, we compared the storage, indexing, and query efficiency of a common relational database (MySQL), a document-oriented NoSQL database (MongoDB), and a relational database with NoSQL support (PostgreSQL). When used to store genomic annotations from the dbSNP database, we found the NoSQL architectures to outperform traditional, relational models for speed of data storage, indexing, and query retrieval in nearly every operation. These findings strongly support the use of novel database technologies to improve the efficiency of data management within the biological sciences. Copyright © 2016 Elsevier Inc. All rights reserved.
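The modeling difference behind these benchmarks can be sketched with one dbSNP-style annotation (values invented) stored both ways: normalized relational tables answered with a join, versus a single self-contained document retrieved by key, here a plain JSON string standing in for a MongoDB document.

```python
import json
import sqlite3

# One dbSNP-style annotation (values invented), stored two ways.
variant = {"rsid": "rs0000001", "chrom": "1", "pos": 10177,
           "alleles": ["A", "AC"], "gene": "EX-GENE"}

# Relational: normalized tables, with the question answered by a join.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE variant (rsid TEXT PRIMARY KEY, chrom TEXT, pos INT, gene TEXT)")
db.execute("CREATE TABLE allele (rsid TEXT, seq TEXT)")
db.execute("INSERT INTO variant VALUES (?, ?, ?, ?)",
           (variant["rsid"], variant["chrom"], variant["pos"], variant["gene"]))
db.executemany("INSERT INTO allele VALUES (?, ?)",
               [(variant["rsid"], a) for a in variant["alleles"]])
row = db.execute("""SELECT v.gene, group_concat(a.seq) FROM variant v
                    JOIN allele a ON a.rsid = v.rsid
                    WHERE v.rsid = ?""", (variant["rsid"],)).fetchone()

# Document-oriented: the whole annotation is one self-contained record,
# retrieved by key without a join.
doc_store = {variant["rsid"]: json.dumps(variant)}
doc = json.loads(doc_store["rs0000001"])

print(row[0], doc["alleles"])
```

The document model's per-record retrieval without joins is one plausible reason the NoSQL architectures led on query speed in the comparison above, at the cost of the integrity guarantees a normalized schema provides.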
La Gamba, Fabiola; Corrao, Giovanni; Romio, Silvana; Sturkenboom, Miriam; Trifirò, Gianluca; Schink, Tania; de Ridder, Maria
2017-10-01
Clustering of patients in databases is usually ignored in one-stage meta-analysis of multi-database studies using matched case-control data. The aim of this study was to compare bias and efficiency of such a one-stage meta-analysis with a two-stage meta-analysis. First, we compared the approaches by generating matched case-control data under 5 simulated scenarios, built by varying: (1) the exposure-outcome association; (2) its variability among databases; (3) the confounding strength of one covariate on this association; (4) its variability; and (5) the (heterogeneous) confounding strength of two covariates. Second, we made the same comparison using empirical data from the ARITMO project, a multiple database study investigating the risk of ventricular arrhythmia following the use of medications with arrhythmogenic potential. In our study, we specifically investigated the effect of current use of promethazine. Bias increased for one-stage meta-analysis with increasing (1) between-database variance of exposure effect and (2) heterogeneous confounding generated by two covariates. The efficiency of one-stage meta-analysis was slightly lower than that of two-stage meta-analysis for the majority of investigated scenarios. Based on ARITMO data, there were no evident differences between one-stage (OR = 1.50, CI = [1.08; 2.08]) and two-stage (OR = 1.55, CI = [1.12; 2.16]) approaches. When the effect of interest is heterogeneous, a one-stage meta-analysis ignoring clustering gives biased estimates. Two-stage meta-analysis generates estimates at least as accurate and precise as one-stage meta-analysis. However, in a study using small databases and rare exposures and/or outcomes, a correct one-stage meta-analysis becomes essential. Copyright © 2017 John Wiley & Sons, Ltd.
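The two-stage approach can be sketched in a few lines: each database is analysed separately to yield its own log odds ratio and standard error, and these are then pooled by fixed-effect inverse-variance weighting. The per-database estimates below are invented, not the ARITMO results.

```python
import math

# Stage 1 (assumed already done per database): log odds ratio and its
# standard error from each database's own matched case-control analysis.
per_db = [
    (math.log(1.4), 0.20),
    (math.log(1.7), 0.25),
    (math.log(1.5), 0.30),
]

# Stage 2: fixed-effect inverse-variance pooling of the log odds ratios.
weights = [1.0 / se ** 2 for _, se in per_db]
pooled_log_or = sum(w * b for (b, _), w in zip(per_db, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))
lo = math.exp(pooled_log_or - 1.96 * pooled_se)
hi = math.exp(pooled_log_or + 1.96 * pooled_se)

print(round(math.exp(pooled_log_or), 2), round(lo, 2), round(hi, 2))
```

Because each stratum is analysed within its own database, between-database clustering is respected by construction, which is the property the simulation study found one-stage analyses to lose when the exposure effect is heterogeneous.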
Hammond, Davyda; Conlon, Kathryn; Barzyk, Timothy; Chahine, Teresa; Zartarian, Valerie; Schultz, Brad
2011-03-01
Communities are concerned over pollution levels and seek methods to systematically identify and prioritize the environmental stressors in their communities. Geographic information system (GIS) maps of environmental information can be useful tools for communities in their assessment of environmental-pollution-related risks. Databases and mapping tools that supply community-level estimates of ambient concentrations of hazardous pollutants, risk, and potential health impacts can provide relevant information for communities to understand, identify, and prioritize potential exposures and risk from multiple sources. An assessment of existing databases and mapping tools was conducted as part of this study to explore the utility of publicly available databases, and three of these databases were selected for use in a community-level GIS mapping application. Queried data from the U.S. EPA's National-Scale Air Toxics Assessment, Air Quality System, and National Emissions Inventory were mapped at the appropriate spatial and temporal resolutions for identifying risks of exposure to air pollutants in two communities. The maps combine monitored and model-simulated pollutant and health risk estimates, along with local survey results, to assist communities with the identification of potential exposure sources and pollution hot spots. Findings from this case study analysis will provide information to advance the development of new tools to assist communities with environmental risk assessments and hazard prioritization. © 2010 Society for Risk Analysis.
Application of cloud database in the management of clinical data of patients with skin diseases.
Mao, Xiao-fei; Liu, Rui; DU, Wei; Fan, Xue; Chen, Dian; Zuo, Ya-gang; Sun, Qiu-ning
2015-04-01
To evaluate the needs and applications of using a cloud database in the daily practice of a dermatology department. The cloud database was established for systemic scleroderma and localized scleroderma. Paper forms were used to record the original data, including personal information, pictures, specimens, blood biochemical indicators, skin lesions, and scores on self-rating scales. The results were input into the cloud database. The applications of the cloud database in the dermatology department were summarized and analyzed. The personal and clinical information of 215 systemic scleroderma patients and 522 localized scleroderma patients was included and analyzed using the cloud database. Disease status, quality of life, and prognosis were obtained by statistical calculations. The cloud database can efficiently and rapidly store and manage the data of patients with skin diseases. As a simple, prompt, safe, and convenient tool, it can be used in patient information management, clinical decision-making, and scientific research.
UNITED STATES METEOROLOGICAL DATA - DAILY AND HOURLY FILES TO SUPPORT PREDICTIVE EXPOSURE MODELING
ORD numerical models for pesticide exposure include a model of spray drift (AgDisp), a cropland pesticide persistence model (PRZM), a surface water exposure model (EXAMS), and a model of fish bioaccumulation (BASS). A unified climatological database for these models has been asse...
23 CFR 972.204 - Management systems requirements.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS HIGHWAYS FISH AND... to operate and maintain the management systems and their associated databases; and (5) A process for... systems will use databases with a geographical reference system that can be used to geolocate all database...
ECOTOX knowledgebase: New tools for data visualization and database interoperability
The ECOTOXicology knowledgebase (ECOTOX) is a comprehensive, curated database that summarizes toxicology data from single chemical exposure studies to terrestrial and aquatic organisms. The ECOTOX Knowledgebase provides risk assessors and researchers consistent information on toxi...
A web based relational database management system for filariasis control
Murty, Upadhyayula Suryanarayana; Kumar, Duvvuri Venkata Rama Satya; Sriram, Kumaraswamy; Rao, Kadiri Madhusudhan; Bhattacharyulu, Chakravarthula Hayageeva Narasimha Venakata; Praveen, Bhoopathi; Krishna, Amirapu Radha
2005-01-01
The present study describes an RDBMS (relational database management system) for the effective management of filariasis, a vector-borne disease. Filariasis infects 120 million people in 83 countries. The possible re-emergence of the disease and the complexity of existing control programs warrant the development of new strategies. A database containing comprehensive data associated with filariasis finds utility in disease control. We have developed a database containing information on the socio-economic status of patients, mosquito collection procedures, mosquito dissection data, filariasis survey reports and mass blood data. The database can be searched using a user-friendly web interface. Availability: http://www.webfil.org (login and password can be obtained from the authors) PMID:17597846
Integrating RFID technique to design mobile handheld inventory management system
NASA Astrophysics Data System (ADS)
Huang, Yo-Ping; Yen, Wei; Chen, Shih-Chung
2008-04-01
An RFID-based mobile handheld inventory management system is proposed in this paper. Differing from the manual inventory management method, the proposed system works on the personal digital assistant (PDA) with an RFID reader. The system identifies electronic tags on the properties and checks the property information in the back-end database server through a ubiquitous wireless network. The system also provides a set of functions to manage the back-end inventory database and assigns different levels of access privilege according to various user categories. In the back-end database server, to prevent improper or illegal accesses, the server not only stores the inventory database and user privilege information, but also keeps track of the user activities in the server including the login and logout time and location, the records of database accessing, and every modification of the tables. Some experimental results are presented to verify the applicability of the integrated RFID-based mobile handheld inventory management system.
NASA Astrophysics Data System (ADS)
Bartolini, S.; Becerril, L.; Martí, J.
2014-11-01
One of the most important issues in modern volcanology is the assessment of volcanic risk, which will depend, among other factors, on both the quantity and quality of the available data and an optimum storage mechanism. This will require the design of purpose-built databases that take into account data format and availability and afford easy data storage and sharing, and will provide for a more complete risk assessment that combines different analyses but avoids any duplication of information. Data contained in any such database should facilitate spatial and temporal analysis that will (1) produce probabilistic hazard models for future vent opening, (2) simulate volcanic hazards and (3) assess their socio-economic impact. We describe the design of a new spatial database structure, VERDI (Volcanic managEment Risk Database desIgn), which allows different types of data, including geological, volcanological, meteorological, monitoring and socio-economic information, to be manipulated, organized and managed. The crux of the design is to ensure that VERDI serves as a tool for connecting different kinds of data sources, GIS platforms and modeling applications. We present an overview of the database design, its components and the attributes that play an important role in the database model. The potential of the VERDI structure and the possibilities it offers for data organization are shown through its application to El Hierro (Canary Islands). The VERDI database will provide scientists and decision makers with a useful tool that will assist in conducting volcanic risk assessment and management.
Lee, Ken Ka-Yin; Tang, Wai-Choi; Choi, Kup-Sze
2013-04-01
Clinical data are dynamic in nature, often arranged hierarchically and stored as free text and numbers. Effective management of clinical data and the transformation of the data into structured format for data analysis are therefore challenging issues in electronic health records development. Despite the popularity of relational databases, the scalability of the NoSQL database model and the document-centric data structure of XML databases appear to be promising features for effective clinical data management. In this paper, three database approaches--NoSQL, XML-enabled and native XML--are investigated to evaluate their suitability for structured clinical data. The database query performance is reported, together with our experience in the databases development. The results show that NoSQL database is the best choice for query speed, whereas XML databases are advantageous in terms of scalability, flexibility and extensibility, which are essential to cope with the characteristics of clinical data. While NoSQL and XML technologies are relatively new compared to the conventional relational database, both of them demonstrate potential to become a key database technology for clinical data management as the technology further advances. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
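The schema flexibility that favors the NoSQL and XML models in this comparison can be sketched with a document-centric store: a hierarchical clinical record kept as one JSON document, to which a new, deeper field is added without any ALTER TABLE. All field names are invented for illustration.

```python
import json

# A hierarchical clinical record kept as one self-contained JSON document.
record = {
    "patient_id": "P001",
    "encounters": [
        {"date": "2012-01-05", "labs": {"hba1c": 6.1},
         "notes": "free-text observation"},
    ],
}
store = {"P001": json.dumps(record)}  # stand-in for a document database

# Schema flexibility: a later encounter carries a lab value the earlier
# one lacks, added to this document only, with no schema migration and
# no change to any other patient's record.
doc = json.loads(store["P001"])
doc["encounters"].append({"date": "2013-03-02",
                          "labs": {"hba1c": 5.8, "ldl": 2.9}})
store["P001"] = json.dumps(doc)

latest = json.loads(store["P001"])["encounters"][-1]
print(latest["labs"]["ldl"])
```

In a normalized relational design, the same change would ripple through the lab-results table schema or require a sparse column; here the nesting absorbs it, which is the extensibility property the abstract attributes to the document-centric models.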
Assessment of Confounders in Comparative Effectiveness Studies From Secondary Databases.
Franklin, Jessica M; Schneeweiss, Sebastian; Solomon, Daniel H
2017-03-15
Secondary clinical databases are an important and growing source of data for comparative effectiveness research (CER) studies. However, measurement of confounders, such as biomarker values or patient-reported health status, in secondary clinical databases may not align with the initiation of a new treatment. In many published CER analyses of registry data, investigators assessed confounders based on the first questionnaire in which the new exposure was recorded. However, it is known that adjustment for confounders measured after the start of exposure can lead to biased treatment effect estimates. In the present study, we conducted simulations to compare assessment strategies for a dynamic clinical confounder in a registry-based comparative effectiveness study of 2 therapies. As expected, we found that adjustment for the confounder value at the time of the first questionnaire after the start of exposure creates a biased estimate of the total effect of exposure choice on outcome when the confounder mediates part of the effect. However, adjustment for the prior value can also be badly biased when measured long before exposure initiation. Thus, investigators should carefully consider the timing of confounder measurements relative to exposure initiation and the rate of change in the confounder in order to choose the most relevant measure for each patient. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
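The bias mechanism these simulations target can be reproduced in a few lines: when the confounder measured after exposure start is partly a mediator, adjusting for it removes the mediated part of the total effect. The coefficients below are invented and the closed-form two-regressor OLS is a simplification, not the paper's simulation design.

```python
import random

random.seed(0)
n = 20000

def mean(v):
    return sum(v) / len(v)

def cov(a, b):
    ma, mb = mean(a), mean(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) - 1)

# Data-generating model: the confounder value measured *after* exposure
# start is partly a mediator of the exposure effect (coefficients invented).
x  = [random.gauss(0, 1) for _ in range(n)]             # exposure
c0 = [random.gauss(0, 1) for _ in range(n)]             # pre-exposure confounder
cp = [0.5 * xi + ci for xi, ci in zip(x, c0)]           # post-exposure measurement
y  = [1.0 * xi + 0.8 * cpi + random.gauss(0, 1)         # outcome; total effect = 1.4
      for xi, cpi in zip(x, cp)]

def ols2(x, z, y):
    """Coefficient on x in the regression y ~ x + z (closed form)."""
    sxx, szz, sxz = cov(x, x), cov(z, z), cov(x, z)
    sxy, szy = cov(x, y), cov(z, y)
    return (sxy * szz - szy * sxz) / (sxx * szz - sxz ** 2)

unadjusted  = cov(x, y) / cov(x, x)  # recovers the total effect, about 1.4
post_adjust = ols2(x, cp, y)         # about 1.0: the mediated part is removed

print(round(unadjusted, 1), round(post_adjust, 1))
```

In this toy setup the post-exposure adjustment understates the total effect by the mediated component (0.5 × 0.8), mirroring the paper's conclusion that confounder timing relative to exposure initiation matters.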
Analysis and preliminary design of Kunming land use and planning management information system
NASA Astrophysics Data System (ADS)
Li, Li; Chen, Zhenjie
2007-06-01
This article analyzes the Kunming land use planning and management information system in terms of system building objectives and requirements, and clarifies the system's users, functional requirements and construction requirements. On this basis, a three-tier system architecture based on C/S and B/S is defined: the user interface layer, the business logic layer and the data services layer. According to the requirements for the construction of a land use planning and management information database, derived from standards of the Ministry of Land and Resources and the construction program of the Golden Land Project, this paper divides the system databases into a planning document database, planning implementation database, working map database and system maintenance database. In the design of the system interface, various methods and data formats are used for data transmission and sharing between upper and lower levels. Based on the system analysis results, the main modules of the system are designed as follows: planning data management, planning and annual plan preparation and control, day-to-day planning management, planning revision management, decision-making support, thematic query statistics, planning public participation and so on. In addition, system realization technologies are discussed in terms of operation mode, development platform and other aspects.
Zhang, Y J; Zhou, D H; Bai, Z P; Xue, F X
2018-02-10
Objective: To quantitatively analyze the current status and development trends of land use regression (LUR) models in ambient air pollution studies. Methods: Relevant literature from the PubMed database published before June 30, 2017 was analyzed using the Bibliographic Items Co-occurrence Matrix Builder (BICOMB 2.0). Keyword co-occurrence networks, cluster mapping and timeline mapping were generated using the CiteSpace 5.1.R5 software. Relevant literature identified in three Chinese databases was also reviewed. Results: A total of 464 relevant papers were retrieved from the PubMed database, with the number published showing a steady annual increase. Most papers were published in the journal Environmental Health Perspectives. The co-word cluster analysis identified five clusters: cluster #0 consisted of birth cohort studies on the health effects of prenatal exposure to air pollution; cluster #1 referred to land use regression modeling and exposure assessment; cluster #2 related to the epidemiology of traffic exposure; cluster #3 dealt with exposure to ultrafine particles and related health effects; and cluster #4 described exposure to black carbon and related health effects. Timeline mapping indicated that clusters #0 and #1 were the main research areas, while clusters #3 and #4 were up-coming hot areas of research. Ninety-four relevant papers were retrieved from the Chinese databases, most of them related to modeling studies. Conclusion: To better assess the health risks of ambient air pollution and to inform preventive public health policies, the application of LUR models in environmental epidemiology studies in China should be encouraged.
Serials Management by Microcomputer: The Potential of DBMS.
ERIC Educational Resources Information Center
Vogel, J. Thomas; Burns, Lynn W.
1984-01-01
Describes serials management at Philadelphia College of Textiles and Science library via a microcomputer, a file manager called PFS, and a relational database management system called dBase II. Check-in procedures, programming with dBase II, "static" and "active" databases, and claim procedures are discussed. Check-in forms are…
Serra-Varela, María Jesús; Alía, Ricardo; Pórtoles, Javier; Gonzalo, Julián; Soliño, Mario; Grivet, Delphine; Raposo, Rosa
2017-01-01
Climate change is gravely affecting forest ecosystems, resulting in large distribution shifts as well as in increasing infectious diseases and biological invasions. Accordingly, forest management requires an evaluation of exposure to climate change that integrates both its abiotic and biotic components. Here we address the implications of climate change for an emerging disease by analysing the environmental suitability of both the host species (Pinus pinaster, Maritime pine) and the pathogen (Fusarium circinatum, pitch canker), i.e. estimating the host's risk of habitat loss and the disease's future environmental range. We constrained our study area to the Spanish Iberian Peninsula, where accurate climate and pitch canker occurrence databases were available. While P. pinaster is widely distributed across the study area, the disease has only been detected along its north-central and north-western edges. We fitted species distribution models to the current distributions of the conifer and the disease. These models were then projected onto nine Global Climate Models under two different climatic scenarios, yielding 18 different future climate predictions representative of 2050. Based on the level of agreement among them, we created future suitability maps for the pine and the disease independently, which were then used to assess the exposure of current P. pinaster populations to the abiotic and biotic effects of climate change. Almost the entire distribution of P. pinaster in the Spanish Iberian Peninsula will be subject to abiotic exposure, likely driven by the predicted increase in future drought events. Furthermore, we detected a reduction in exposure to pitch canker, concentrated along the north-western edge of the study area. Setting up breeding programs is recommended in highly exposed and productive populations, while silvicultural methods and monitoring should be applied in less productive, but still exposed, populations.
PMID:28192454
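The consensus step described above — keeping only cells where most of the 18 projections agree — can be sketched as follows. The data are synthetic, and the two-thirds agreement threshold is an assumption for illustration, not the study's actual criterion:

```python
import numpy as np

# Hypothetical binary suitability predictions (1 = suitable) for a grid of
# 100 cells under 18 future climate projections (9 GCMs x 2 scenarios).
rng = np.random.default_rng(0)
projections = rng.integers(0, 2, size=(18, 100))

# Per-cell agreement is the fraction of projections predicting suitability;
# a cell counts as "future suitable" only above an agreement threshold.
agreement = projections.mean(axis=0)
future_suitable = agreement >= 2 / 3

print(future_suitable.sum())
```

The resulting boolean map, intersected with the current distribution, identifies populations exposed to habitat loss.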
Some Reliability Issues in Very Large Databases.
ERIC Educational Resources Information Center
Lynch, Clifford A.
1988-01-01
Describes the unique reliability problems of very large databases that necessitate specialized techniques for hardware problem management. The discussion covers the use of controlled partial redundancy to improve reliability, issues in operating systems and database management systems design, and the impact of disk technology on very large…
Tufts Health Sciences Database: Lessons, Issues, and Opportunities.
ERIC Educational Resources Information Center
Lee, Mary Y.; Albright, Susan A.; Alkasab, Tarik; Damassa, David A.; Wang, Paul J.; Eaton, Elizabeth K.
2003-01-01
Describes a seven-year experience with developing the Tufts Health Sciences Database, a database-driven information management system that combines the strengths of a digital library, content delivery tools, and curriculum management. Identifies major effects on teaching and learning. Also addresses issues of faculty development, copyright and…
Hansen, Alana; Pisaniello, Dino; Varghese, Blesson; Rowett, Shelley; Hanson-Easey, Scott; Bi, Peng; Nitschke, Monika
2018-01-01
Heat exposure can be a health hazard for many Australian workers in both outdoor and indoor situations. With many heat-related incidents left unreported, it is often difficult to determine the underlying causal factors. This study aims to provide insights into perceptions of potentially unsafe or uncomfortably hot working conditions that can affect occupational health and safety using information provided by the public and workers to the safety regulator in South Australia (SafeWork SA). Details of complaints regarding heat exposure to the regulator’s “Help Centre” were assembled in a dataset and the textual data analysed thematically. The findings showed that the majority of calls relate to indoor work environments such as kitchens, factories, and warehouses. The main themes identified were work environment, health effects, and organisational issues. Impacts of hot working conditions ranged from discomfort to serious heat-related illnesses. Poor management practices and inflexibility of supervisors featured strongly amongst callers’ concerns. With temperatures predicted to increase and energy prices escalating, this timely study, using naturalistic data, highlights accounts of hot working conditions that can compromise workers’ health and safety and the need for suitable measures to prevent heat stress. These could include risk assessments to assess the likelihood of heat stress in workplaces where excessively hot conditions prevail. PMID:29509710
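A minimal sketch of keyword-based theme tagging for free-text complaint records, in the spirit of the thematic analysis described above (the theme lexicon below is illustrative, not the study's codebook):

```python
# Map each theme to indicative keywords; real qualitative coding is far
# richer, but this shows the mechanical core of tagging records by theme.
THEMES = {
    "work environment": ["kitchen", "factory", "warehouse", "ventilation"],
    "health effects": ["dizzy", "heat stress", "dehydration", "illness"],
    "organisational issues": ["supervisor", "breaks", "management", "roster"],
}

def tag_themes(text):
    text = text.lower()
    return sorted(theme for theme, words in THEMES.items()
                  if any(word in text for word in words))

print(tag_themes("No breaks allowed in the kitchen and I felt dizzy"))
# ['health effects', 'organisational issues', 'work environment']
```

Counting tags across all calls then yields the theme frequencies the study reports qualitatively.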
ALARA in European nuclear installations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lefaure, C.; Croft, J.; Pfeffer, W.
1995-03-01
For over a decade the Commission of the European Community has sponsored research projects on the development and practical implementation of the optimization principle, often referred to as ALARA. These projects have given rise to a series of successful international optimization training courses and have provided a significant input to the periodic European Seminars on Optimization, the last of which took place in April 1993. This paper reviews the approaches to optimization that have been developed within Europe and describes the areas of work in the current project. The ongoing CEC research project addresses the problem of ALARA and internal exposures, and tries to define procedures for ALARA implementation, taking account of the perception of the hazard as well as the levels of probability of exposure. The relationships between ALARA and work management, and between ALARA and the decommissioning of installations, appear to be other fruitful research areas. Finally, this paper introduces software developed in Europe for ALARA decision-aiding techniques and databases containing operational feedback experience.
2013-01-01
Background: Inspectors from the US Occupational Safety and Health Administration (OSHA) have been collecting industrial hygiene samples since 1972 to verify compliance with Permissible Exposure Limits. Starting in 1979, these measurements were computerized into the Integrated Management Information System (IMIS). In 2010, a dataset of over 1 million personal sample results analysed at OSHA's central laboratory in Salt Lake City [Chemical Exposure Health Data (CEHD)], only partially overlapping the IMIS database, was placed into the public domain via the internet. We undertook this study to inform potential users about the relationship between this newly available OSHA data and IMIS and to offer insight about the opportunities and challenges associated with the use of OSHA measurement data for occupational exposure assessment. Methods: We conducted a literature review of previous uses of IMIS in occupational health research and performed a descriptive analysis of the data recently made available, comparing them to the IMIS database for lead, the most frequently sampled agent. Results: The literature review yielded 29 studies reporting use of IMIS data, but none using the CEHD data. Most studies focused on a single contaminant, with silica and lead being most frequently analysed. Sixteen studies addressed potential bias in IMIS, mostly by examining the association between exposure levels and ancillary information. Although no biases of appreciable magnitude were consistently reported across studies and agents, these assessments may have been obscured by selective under-reporting of non-detectable measurements. The CEHD data comprised 1 450 836 records from 1984 to 2009, not counting analytical blanks and erroneous records. Seventy-eight agents with >1000 personal samples yielded 1 037 367 records. Unlike IMIS, which contains administrative information (company size, job description), ancillary information in the CEHD data is mostly analytical.
When the IMIS and CEHD measurements of lead were merged, 23 033 (39.2%) records were common to both the IMIS and CEHD datasets, 10 681 (18.2%) records were only in IMIS, and 25 012 (42.6%) records were only in the CEHD database. While IMIS-only records represent data analysed in other laboratories, CEHD-only records suggest partial reporting of sampling results by OSHA inspectors into IMIS. For lead, the percentage of non-detects in the CEHD-only data was 71%, compared to 42% and 46% in the both-IMIS-CEHD and IMIS-only datasets, respectively, suggesting differential under-reporting of non-detects in IMIS. Conclusions: IMIS and the CEHD datasets represent the biggest source of multi-industry exposure data in the USA and should be considered a valuable source of information for occupational exposure assessment. The lack of empirical data on biases and the adequate interpretation of non-detects in OSHA data, complicated by suspected differential under-reporting, remain the principal challenges to the valid estimation of average exposure conditions. We advocate additional comparisons between IMIS and CEHD data and discuss analytical strategies that may play a key role in meeting these challenges. PMID:22952385
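The overlap tabulation described above is essentially an outer join with provenance tracking. A sketch using pandas, assuming both datasets carry a shared sample identifier (the column name and toy values here are hypothetical):

```python
import pandas as pd

# Toy stand-ins for the two lead datasets; only the IDs matter here.
imis = pd.DataFrame({"sample_id": [1, 2, 3, 4], "source": "imis"})
cehd = pd.DataFrame({"sample_id": [3, 4, 5, 6, 7], "source": "cehd"})

# indicator=True adds a _merge column marking each record's provenance:
# 'both', 'left_only' (IMIS-only), or 'right_only' (CEHD-only).
merged = imis.merge(cehd, on="sample_id", how="outer", indicator=True)
counts = merged["_merge"].value_counts()
print(counts["both"], counts["left_only"], counts["right_only"])  # 2 2 3
```

Non-detect percentages per provenance group then follow from a groupby on the `_merge` column.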
[Establishment of a regional pelvic trauma database in Hunan Province].
Cheng, Liang; Zhu, Yong; Long, Haitao; Yang, Junxiao; Sun, Buhua; Li, Kanghua
2017-04-28
To establish a database for pelvic trauma in Hunan Province and to start the work of a multicenter pelvic trauma registry. Methods: To establish the database, literature relevant to pelvic trauma was screened, experience from established trauma databases in China and abroad was drawn upon, and the actual conditions of pelvic trauma rescue in Hunan Province were considered. The database was built on PostgreSQL and the Java 1.6 programming language. Results: The complex procedure of pelvic trauma rescue was described structurally. The database contents include general patient information, injury condition, prehospital rescue, condition on admission, in-hospital treatment, status on discharge, diagnosis, classification, complications, trauma scoring, and therapeutic effect. The database can be accessed through the internet via a browser/server architecture. Its functions include patient information management, data export, history query, progress reporting, video and image management, and personal information management. Conclusion: A whole-life-cycle pelvic trauma database has been established for the first time in China. It is scientific, functional, practical, and user-friendly.
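A minimal sketch of what a registry schema along these lines could look like, using SQLite for self-containment (the actual system used PostgreSQL and Java; every table and column name below is illustrative, not the Hunan database's schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE patient (
    patient_id   INTEGER PRIMARY KEY,
    name         TEXT,
    admitted_at  TEXT
);
CREATE TABLE trauma_record (
    record_id     INTEGER PRIMARY KEY,
    patient_id    INTEGER REFERENCES patient(patient_id),
    injury_class  TEXT,   -- e.g. a pelvic fracture classification
    trauma_score  REAL,   -- e.g. an injury severity score
    outcome       TEXT
);
""")
conn.execute("INSERT INTO patient VALUES (1, 'anon', '2017-04-01')")
conn.execute("INSERT INTO trauma_record VALUES (1, 1, 'Tile B', 29.0, 'discharged')")
row = conn.execute("SELECT injury_class FROM trauma_record").fetchone()
print(row[0])  # Tile B
```

A browser/server deployment would put such tables behind an application layer rather than exposing SQL directly.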
Database on Demand: insight how to build your own DBaaS
NASA Astrophysics Data System (ADS)
Gaspar Aparicio, Ruben; Coterillo Coz, Ignacio
2015-12-01
At CERN, a number of key database applications run on user-managed MySQL, PostgreSQL and Oracle database services. The Database on Demand (DBoD) project was born out of an idea to provide the CERN user community with an environment to develop and run database services as a complement to the central Oracle-based database service. Database on Demand empowers users to perform certain actions that had traditionally been done by database administrators, providing an enterprise platform for database applications. It also allows the CERN user community to run different database engines; presently, three major RDBMS (relational database management system) offerings are supported. In this article we show the actual status of the service after almost three years of operations, give some insight into our redesigned software engineering, and outline the service's near-future evolution.
ECOTOX Knowledgebase: New tools for data visualization and database interoperability -Poster
The ECOTOXicology knowledgebase (ECOTOX) is a comprehensive, curated database that summarizes toxicology data from single chemical exposure studies to terrestrial and aquatic organisms. The ECOTOX Knowledgebase provides risk assessors and researchers consistent information on tox...
Insertion algorithms for network model database management systems
NASA Astrophysics Data System (ADS)
Mamadolimov, Abdurashid; Khikmat, Saburov
2017-12-01
The network model is a database model conceived as a flexible way of representing objects and their relationships. Its distinguishing feature is that the schema, viewed as a graph in which object types are nodes and relationship types are arcs, forms a partial order. When a database is large and query comparisons are expensive, the efficiency requirement for management algorithms is to minimize the number of query comparisons. We consider the update operation for network model database management systems and develop a new sequential algorithm for it. We also suggest a distributed version of the algorithm.
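The paper's algorithm is not reproduced here, but the invariant an insertion must maintain — the partial order of the schema graph — can be illustrated with Python's standard-library topological sorter (all node names are invented):

```python
from graphlib import TopologicalSorter

# Schema graph: each record type maps to the set of types it depends on.
schema = {"customer": set(), "order": {"customer"}, "item": {"order"}}

# Inserting a new record type must keep the graph a valid partial order,
# i.e. the extended graph must still admit a topological order.
schema["invoice"] = {"order"}

order = list(TopologicalSorter(schema).static_order())
print(order)
```

A cycle introduced by a bad insertion would make `static_order()` raise `CycleError`, which is exactly the condition an insertion algorithm must avoid.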
Kirkpatrick, Jeffrey S; Howard, Jacqueline M; Reed, David A
2002-04-08
As part of comprehensive joint medical surveillance measures outlined by the Department of Defense, the US Army Center for Health Promotion and Preventive Medicine (USACHPPM) is beginning to assess environmental health threats to continental US military installations. A common theme in comprehensive joint medical surveillance, in support of Force Health Protection, is the identification and assessment of potential environmental health hazards, and the evaluation and documentation of actual exposures in both continental US and outside-continental-US settings. For the continental US assessments, the USACHPPM has utilized the US Environmental Protection Agency (EPA) database of risk management plans compiled in accordance with Public Law 106-40, together with the toxic release inventory database, in a state-of-the-art geographic information system-based program, termed the Consequence Assessment and Management Tool Set (CATS), for assessing homeland industrial chemical hazards outside the military gates. As an example, the US EPA toxic release inventory and risk management plan databases are queried to determine the types and locations of industries surrounding a continental US military installation. Contaminants of concern are then ranked with respect to known toxicological and physical hazards, and subjected to applicable downwind hazard simulations using appropriate meteorological and climatological data sets. The composite downwind hazard areas are mapped in relation to emergency response planning guidelines (ERPGs), which were developed by the American Industrial Hygiene Association to assist emergency response personnel in planning for catastrophic chemical releases. In addition, other geographically referenced data such as transportation routes, satellite imagery and population data are included in the operational, equipment, and morale risk assessment and management process.
These techniques have been developed to assist military medical planners and operations personnel in determining the industrial hazards, vulnerability assessments and health risk assessments to continental United States military installations. These techniques and procedures support the Department of Defense Force Protection measures, which provides awareness of a terrorism threat, appropriate measures to prevent terrorist attacks and mitigate terrorism's effects in the event that preventive measures are ineffective.
The research study, Children's Total Exposure to Persistent Pesticides and Other Persistent Organic Pollutants, (CTEPP), examines the exposures of approximately 260 preschool children between the ages of 18 months and 5 years and their primary adult caregivers to pollutants com...
Using Statistics for Database Management in an Academic Library.
ERIC Educational Resources Information Center
Hyland, Peter; Wright, Lynne
1996-01-01
Collecting statistical data about database usage by library patrons aids in the management of CD-ROM and database offerings, collection development, and evaluation of training programs. Two approaches to data collection are presented which should be used together: an automated or nonintrusive method which monitors search sessions while the…
Database Software Selection for the Egyptian National STI Network.
ERIC Educational Resources Information Center
Slamecka, Vladimir
The evaluation and selection of information/data management system software for the Egyptian National Scientific and Technical Information (STI) Network are described. An overview of the state of the art of database technology elaborates on the differences between information retrieval and database management systems (DBMS). The desirable characteristics of…
Teaching Database Management System Use in a Library School Curriculum.
ERIC Educational Resources Information Center
Cooper, Michael D.
1985-01-01
Description of database management systems course being taught to students at School of Library and Information Studies, University of California, Berkeley, notes course structure, assignments, and course evaluation. Approaches to teaching concepts of three types of database systems are discussed and systems used by students in the course are…
Second-generation antipsychotics and risk of cerebrovascular accidents in the elderly.
Percudani, Mauro; Barbui, Corrado; Fortino, Ida; Tansella, Michele; Petrovich, Lorenzo
2005-10-01
Concern has recently been raised that risperidone and olanzapine may be associated with cerebrovascular events, based on placebo-controlled trials conducted in elderly subjects with dementia. We investigated the relationship between exposure to second-generation antipsychotics (SGAs) and the occurrence of cerebrovascular accidents in the elderly. From the regional database of hospital admissions of Lombardy, Italy, we extracted all patients aged 65 or older with cerebrovascular-related outcomes for the year 2002. From the regional database of prescriptions reimbursed by the National Health Service, we extracted all patients aged 65 or older who received antipsychotic prescriptions during 2001. The two databases were linked anonymously using the individual patient code. The proportions of cerebrovascular accidents were 3.31% (95% confidence interval, 2.95-3.69) in elderly subjects exclusively exposed to SGAs and 2.37% (95% confidence interval, 2.19-2.57) in elderly subjects exclusively exposed to first-generation antipsychotics. After background group differences were controlled for, exposure to SGAs significantly increased the risk of accidents. The analysis of cerebrovascular events in elderly subjects exposed to each individual SGA, in comparison with exposure to haloperidol, showed a significantly increased risk for risperidone only (adjusted odds ratio, 1.43; 95% confidence interval, 1.12-1.93). These data provide preliminary epidemiological evidence that exposure to SGAs, in comparison with exposure to first-generation antipsychotics, significantly increases the risk of cerebrovascular accidents in the elderly.
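A risk estimate of this kind can be sanity-checked with a crude odds ratio and Wald confidence interval. The 2×2 counts below are only approximate back-calculations from the published percentages (about 3.31% of ~10,000 SGA users versus 2.37% of ~10,000 comparison subjects), for illustration; they are not the study's adjusted analysis:

```python
import math

def odds_ratio_ci(a, b, c, d):
    """a/b: exposed events/non-events; c/d: unexposed events/non-events."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(331, 9669, 237, 9763)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

The crude value lands near the reported adjusted odds ratio of 1.43, as expected when confounding is modest.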
Modelling of occupational exposure to inhalable nickel compounds.
Kendzia, Benjamin; Pesch, Beate; Koppisch, Dorothea; Van Gelder, Rainer; Pitzke, Katrin; Zschiesche, Wolfgang; Behrens, Thomas; Weiss, Tobias; Siemiatycki, Jack; Lavoué, Jerome; Jöckel, Karl-Heinz; Stamm, Roger; Brüning, Thomas
2017-07-01
The aim of this study was to estimate average occupational exposure to inhalable nickel (Ni) using the German exposure database MEGA. This database contains 8052 personal measurements of Ni collected between 1990 and 2009, together with information on the measurement and workplace conditions. The median of all Ni concentrations was 9 μg/m³ and the 95th percentile was 460 μg/m³. We predicted geometric means (GMs) for welders and other occupations, centered to 1999. Exposure to Ni in welders is strongly influenced by the welding process applied and the Ni content of the welding materials used. Welding with consumable electrodes of high Ni content (>30%) was associated with 10-fold higher concentrations compared with electrodes of low content (<5%). The highest exposure levels (GMs ≥20 μg/m³) were observed in gas metal and shielded metal arc welders using welding materials with high Ni content, in metal sprayers, grinders and forging-press operators, and in the manufacture of batteries and accumulators. The exposure profiles are useful for exposure assessment in epidemiologic studies as well as in industrial hygiene. We therefore recommend collecting exposure-specific information in addition to the job title in community-based studies when estimating the health risks of Ni exposure.
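The two summary statistics reported for the MEGA data — the geometric mean and the 95th percentile — can be computed as follows (the values here are synthetic lognormal draws centered near the reported median, not the MEGA measurements):

```python
import numpy as np

# Synthetic stand-in for 8052 personal Ni measurements (ug/m3); occupational
# exposure data are typically approximately lognormal.
rng = np.random.default_rng(1)
ni_ugm3 = rng.lognormal(mean=np.log(9), sigma=1.5, size=8052)

# Geometric mean: exponential of the mean of the logs.
gm = np.exp(np.mean(np.log(ni_ugm3)))
p95 = np.percentile(ni_ugm3, 95)
print(round(gm, 1), round(p95, 1))
```

For lognormal data the GM tracks the median, which is why both the median and GM are natural summaries here.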
The land management and operations database (LMOD)
USDA-ARS?s Scientific Manuscript database
This paper presents the design, implementation, deployment, and application of the Land Management and Operations Database (LMOD). LMOD is the single authoritative source for land management and operations reference data within the USDA enterprise data warehouse. LMOD supports modeling appl...
DOT National Transportation Integrated Search
2006-01-01
An Internet-based, spatiotemporal Geotechnical Database Management System (GDBMS) Framework was designed, developed, and implemented at the Virginia Department of Transportation (VDOT) in 2002 to retrieve, manage, archive, and analyze geotechnical da...
Lurie, Peter; Wolfe, Sidney M
2002-11-01
Hexavalent chromium is widely recognized to be a lung carcinogen. However, the U.S. Occupational Safety and Health Administration (OSHA) has failed to reduce the permissible exposure limit (PEL), despite having acknowledged in 1994 that the current limit is too high. In 1993, Public Citizen and the Paper, Allied-Industrial, Chemical and Energy Workers International Union (PACE) petitioned to lower the PEL from the current 100 microg/m(3) to 0.5 microg/m(3) as an 8-hr time-weighted average (TWA). To assess industry compliance with the current PEL, and to determine the feasibility of achieving the proposed lower limit of 0.5 microg/m(3), we conducted a secondary data analysis of OSHA's Integrated Management Information System (IMIS) database. This database contains 813 measurements of hexavalent chromium exposure from inspections performed during the years 1990-2000. There was a statistically significant decline in the annual number of measurements over the study period from 127 in 1990 to 67 in 2000 (F = 0.0009; linear regression). The median TWA measurement was 10 microg/m(3) (range: 0.01-13,960 microg/m(3)) and the median ceiling measurement was 40.5 microg/m(3) (range: 0.25-25,000 microg/m(3)). Neither median TWA nor median ceiling exposures (if hexavalent chromium was detected) declined significantly during the study period (F = 0.065 and 0.57, respectively). Overall, 13.7% of TWA measurements were at or below the Public Citizen/PACE proposed standard; 65.0% were between the Public Citizen/PACE proposal and the current OSHA PEL; and 21.3% exceeded the OSHA PEL. Compared to OSHA measurements, state measurements were less likely to detect hexavalent chromium (40.2% vs. 52.1%; P = 0.0007; chi-square) and less likely to issue any citation (9.3% vs. 19.1%; P = 0.0003), including citations for overexposure if the exposure exceeded the PEL (54.8% vs. 78.8%; P = 0.012). U.S. workers continue to be exposed to dangerously high hexavalent chromium levels, but low exposure levels were found in some industries. Further investigations should examine whether state plans provide weaker enforcement than federal OSHA. Copyright 2002 Wiley-Liss, Inc.
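The three compliance bands used in the analysis (at or below the proposed 0.5 microg/m3 limit, between that and the 100 microg/m3 PEL, above the PEL) reduce to a simple classifier:

```python
# Limits from the abstract: the Public Citizen/PACE proposal and the
# then-current OSHA PEL for hexavalent chromium, as 8-hr TWAs in ug/m3.
PROPOSED, PEL = 0.5, 100.0

def band(twa_ugm3):
    if twa_ugm3 <= PROPOSED:
        return "at/below proposed"
    if twa_ugm3 <= PEL:
        return "between proposed and PEL"
    return "above PEL"

print(band(10.0))   # between proposed and PEL (the median TWA measurement)
print(band(250.0))  # above PEL
```

Applying `band` to all 813 TWA measurements and tabulating would reproduce the 13.7% / 65.0% / 21.3% split reported above.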
Exposure Assessment of Livestock Carcass Management ...
Report
This report describes relative exposures and hazards for different livestock carcass management options in the event of a natural disaster. It presents a quantitative exposure assessment in which livestock carcass management options are ranked relative to one another for a hypothetical site setting, a standardized set of environmental conditions (e.g., meteorology), and a single set of assumptions about how the carcass management options are designed and implemented. These settings, conditions, and assumptions are not necessarily representative of site-specific carcass management efforts. Therefore, the exposure assessment should not be interpreted as estimating the levels of chemical and microbial exposure that can be expected to result from the management options evaluated. The intent of the relative rankings is to support scientifically based livestock carcass management decisions that consider potential hazards to human health, livestock, and the environment. This exposure assessment also provides information to support choices about mitigation measures to minimize or eliminate specific exposure pathways.
Gilligan, Tony; Alamgir, Hasanat
2008-01-01
Healthcare workers are exposed to a variety of work-related hazards including biological, chemical, physical, ergonomic, psychological hazards; and workplace violence. The Occupational Health and Safety Agency for Healthcare in British Columbia (OHSAH), in conjunction with British Columbia (BC) health regions, developed and implemented a comprehensive surveillance system that tracks occupational exposures and stressors as well as injuries and illnesses among a defined population of healthcare workers. Workplace Health Indicator Tracking and Evaluation (WHITE) is a secure operational database, used for data entry and transaction reporting. It has five modules: Incident Investigation, Case Management, Employee Health, Health and Safety, and Early Intervention/Return to Work. Since the WHITE database was first introduced into BC in 2004, it has tracked the health of 84,318 healthcare workers (120,244 jobs), representing 35,927 recorded incidents, resulting in 18,322 workers' compensation claims. Currently, four of BC's six healthcare regions are tracking and analyzing incidents and the health of healthcare workers using WHITE, providing OHSAH and healthcare stakeholders with comparative performance indicators on workplace health and safety. A number of scientific manuscripts have also been published in peer-reviewed journals. The WHITE database has been very useful for descriptive epidemiological studies, monitoring health risk factors, benchmarking, and evaluating interventions.
An "EAR" on environmental surveillance and monitoring: A ...
Current environmental monitoring approaches focus primarily on chemical occurrence. However, based on chemical concentration alone, it can be difficult to identify which compounds may be of toxicological concern to prioritize for further monitoring or management. This can be problematic because toxicological characterization is lacking for many emerging contaminants. New sources of high-throughput screening (HTS) data, like the ToxCast™ database, which contains data for over 9,000 compounds screened through up to 1,100 assays, are now available. Integrated analysis of chemical occurrence data with HTS data offers new opportunities to prioritize chemicals, sites, or biological effects for further investigation based on concentrations detected in the environment linked to relative potencies in pathway-based bioassays. As a case study, chemical occurrence data from a 2012 study in the Great Lakes Basin, along with the ToxCast™ effects database, were used to calculate exposure-activity ratios (EARs) as a prioritization tool. Technical considerations of data processing and use of the ToxCast™ database are presented and discussed. EAR prioritization identified multiple sites, biological pathways, and chemicals that warrant further investigation. Biological pathways were then linked to adverse outcome pathways to identify potential adverse outcomes and biomarkers for use in subsequent monitoring efforts. Anthropogenic contaminants are frequently reported in environm
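At its core, an exposure-activity ratio is a single division: the measured environmental concentration over the bioactivity concentration from an HTS assay, expressed in common (typically molar) units. A sketch with invented values:

```python
# EAR = environmental concentration / bioactivity concentration (e.g. an
# assay's activity concentration at cutoff), both in the same molar units.
# The numbers below are invented for illustration.
def ear(env_conc_uM, activity_conc_uM):
    return env_conc_uM / activity_conc_uM

# A chemical detected at 0.02 uM against an assay active at 0.5 uM:
print(ear(0.02, 0.5))  # 0.04 -> low priority; EARs near or above 1 flag concern
```

In practice one computes the maximum EAR per chemical across all assays, then aggregates by site and biological pathway to rank them.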
Exploring Global Exposure Factors Resources URLs
The dataset is a compilation of hyperlinks (URLs) for resources (databases, compendia, published articles, etc.) useful for exposure assessment specific to consumer product use. This dataset is associated with the following publication: Zaleski, R., P. Egeghy, and P. Hakkinen. Exploring Global Exposure Factors Resources for Use in Consumer Exposure Assessments. International Journal of Environmental Research and Public Health. Molecular Diversity Preservation International, Basel, SWITZERLAND, 13(7): 744, (2016).
The Stochastic Human Exposure and Dose Simulation Model – High-Throughput (SHEDS-HT) is a U.S. Environmental Protection Agency research tool for predicting screening-level (low-tier) exposures to chemicals in consumer products. This course will present an overview of this m...
Information Management System, Materials Research Society Fall Meeting (2013). Photovoltaics informatics: scientific data management, database and data systems design, database clusters, storage systems integration, and distributed data analytics. She has used her experience in laboratory data management systems, lab
DOT National Transportation Integrated Search
2007-01-01
An Internet-based, spatiotemporal Geotechnical Database Management System (GDBMS) Framework was implemented at the Virginia Department of Transportation (VDOT) in 2002 to manage geotechnical data using a distributed Geographical Information System (G...
23 CFR 973.204 - Management systems requirements.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS HIGHWAYS MANAGEMENT... system; (2) A process to operate and maintain the management systems and their associated databases; (3... systems shall use databases with a common or coordinated reference system that can be used to geolocate...
RATE Exposure Assessment Modules - EXA 408, EXA 409
EXA 408 – Interpreting Biomonitoring Data and Using Pharmacokinetic Modeling in Exposure Assessment Widespread acceptance and use of the CDC's National Health and Nutritional Examination Survey (NHANES) database, which, among other things, reports measured concentrations of...
Juarez, Paul D; Hood, Darryl B; Rogers, Gary L; Baktash, Suzanne H; Saxton, Arnold M; Matthews-Juarez, Patricia; Im, Wansoo; Cifuentes, Myriam Patricia; Phillips, Charles A; Lichtveld, Maureen Y; Langston, Michael A
2017-01-01
Objectives: The aim is to identify exposures associated with lung cancer mortality and mortality disparities by race and gender using an exposome database coupled to a graph-theoretical toolchain. Methods: Graph-theoretical algorithms were employed to extract paracliques from correlation graphs using associations between 2162 environmental exposures and lung cancer mortality rates in 2067 counties, with clique doubling applied to compute an absolute threshold of significance. Factor analysis and multiple linear regressions were then used to analyze differences in exposures associated with lung cancer mortality and mortality disparities by race and gender. Results: While cigarette consumption was highly correlated with rates of lung cancer mortality for both white men and women, previously unidentified exposures were more closely associated with lung cancer mortality and mortality disparities for blacks, particularly black women. Conclusions: Exposures beyond smoking moderate lung cancer mortality and mortality disparities by race and gender. Policy Implications: An exposome approach and database coupled with scalable combinatorial analytics provide a powerful new approach for analyzing relationships between multiple environmental exposures, pathways, and health outcomes. An assessment of multiple exposures is needed to appropriately translate research findings into environmental public health practice and policy. PMID:29152601
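The paraclique and clique-doubling algorithms used in the study are specialized combinatorial tools; as a rough illustration of the first step only, one can threshold a correlation matrix into a graph and enumerate cliques by brute force (feasible only for tiny toy examples; the variable names and correlations below are invented):

```python
# Threshold a correlation matrix into a graph, then enumerate cliques
# by brute force. Illustrative only: real paraclique extraction uses
# specialized algorithms over thousands of exposure variables.
from itertools import combinations

def threshold_graph(corr, threshold):
    """Edges between variable pairs with |r| >= threshold."""
    names = list(corr)
    return {frozenset((a, b)) for a, b in combinations(names, 2)
            if abs(corr[a][b]) >= threshold}

def is_clique(nodes, edges):
    return all(frozenset(p) in edges for p in combinations(nodes, 2))

corr = {  # hypothetical pairwise correlations among exposure variables
    "smoking": {"smoking": 1.0, "radon": 0.2, "pm25": 0.7, "diesel": 0.6},
    "radon":   {"smoking": 0.2, "radon": 1.0, "pm25": 0.1, "diesel": 0.2},
    "pm25":    {"smoking": 0.7, "radon": 0.1, "pm25": 1.0, "diesel": 0.8},
    "diesel":  {"smoking": 0.6, "radon": 0.2, "pm25": 0.8, "diesel": 1.0},
}
edges = threshold_graph(corr, 0.5)
cliques = [set(c) for k in (3, 2) for c in combinations(corr, k)
           if is_clique(c, edges)]
```

Here the strongly inter-correlated trio forms the only 3-clique, the kind of tightly linked exposure group the study's pipeline hands on to factor analysis.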
Salary Management System for Small and Medium-sized Enterprises
NASA Astrophysics Data System (ADS)
Hao, Zhang; Guangli, Xu; Yuhuan, Zhang; Yilong, Lei
In small and medium-sized enterprises (SMEs), wage entry, calculation, and totaling were traditionally done by hand; the data volume is large, processing is slow, and errors are easy to make, resulting in low efficiency. The main purpose of this paper is to present the basis of a salary management system: establishing a scientific database and a computerized payroll system that replaces much of the former manual work, in order to reduce duplicated staff labor and improve working efficiency. The system combines the actual needs of SMEs with in-depth study and practice of the C/S (client/server) mode, the PowerBuilder 10.0 development tool, databases, and the SQL language, and completes the requirements analysis, database design, and application design and development of a payroll system. Database files for wages, departments, units, and personnel are included in this system, which provides data management, department management, personnel management, and other functions; through control and management of the database, query, add, delete, and modify operations are realized. The system design is reasonable, its functions are relatively complete, and testing shows stable operation that meets the basic needs of the work.
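The paper's system is built with PowerBuilder and SQL; as a language-neutral sketch of the same query/add/delete/modify pattern over a wage table, here is a hypothetical schema exercised through Python's built-in sqlite3 (table and column names are illustrative, not taken from the paper):

```python
# Minimal payroll-table sketch: a schema plus the add/query/modify/
# delete operations the paper's system exposes. Schema is hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE wages (
    emp_id     INTEGER PRIMARY KEY,
    name       TEXT NOT NULL,
    department TEXT NOT NULL,
    base_pay   REAL NOT NULL,
    bonus      REAL NOT NULL DEFAULT 0)""")

# add: new employee records
conn.executemany("INSERT INTO wages VALUES (?, ?, ?, ?, ?)",
                 [(1, "Li", "sales", 5000.0, 300.0),
                  (2, "Wang", "sales", 5200.0, 0.0),
                  (3, "Zhao", "admin", 4800.0, 100.0)])

# modify: adjust one employee's bonus
conn.execute("UPDATE wages SET bonus = 150.0 WHERE emp_id = 2")
# delete: remove a departed employee
conn.execute("DELETE FROM wages WHERE emp_id = 3")
# query: per-department totals replace the former manual tally
totals = dict(conn.execute(
    "SELECT department, SUM(base_pay + bonus) FROM wages GROUP BY department"))
```

The aggregate query is the part that replaces the error-prone manual totaling the abstract describes.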
Chang, Chunxin; Chen, Minjian; Gao, Jiawei; Luo, Jia; Wu, Keqin; Dong, Tianyu; Zhou, Kun; He, Xiaowei; Hu, Weiyue; Wu, Wei; Lu, Chuncheng; Hang, Bo; Meeker, John D; Wang, Xinru; Xia, Yankai
2017-05-01
Although various pesticides are used globally, pesticide profiles in human blood serum remain largely unknown. We determined pesticide exposure profiles using solid-phase extraction and gas chromatography coupled with triple quadrupole tandem mass spectrometry in 200 human blood serum samples from the adult population of Jiangsu Province, China. A systematic and comprehensive literature review was carried out to identify articles investigating pesticide exposure and to compare exposure data. Of the 88 pesticides, 76 were found in the blood serum of the population in Jiangsu Province. To the best of our knowledge, 58 pesticides were reported in human blood serum for the first time, and among these pesticides, parathion-methyl, pyrimethanil, fluacrypyrim, simazine, cloquintocet-mexyl and barban were detectable in more than half of the samples. By statistical comparison of the blood serum levels of pesticides between this study and other countries, we found the levels of several organochlorine pesticides were significantly higher in the female population of Jiangsu Province. Health risks related to the pesticide profiling were then revealed, which identified higher carcinogenic and teratogenic toxicity risk in the female adults of Jiangsu Province caused by organochlorine pesticide exposure. This study not only provides a high-throughput pesticide screening method for future studies of the exposome, but also presents the first human data on exposure to a number of pesticides. It may provide a knowledge database for the risk assessment and management of these pesticides. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Shi, Congming; Wang, Feng; Deng, Hui; Liu, Yingbo; Liu, Cuiyin; Wei, Shoulin
2017-08-01
As a dedicated synthetic aperture radio interferometer in China, the MingantU SpEctral Radioheliograph (MUSER), initially known as the Chinese Spectral RadioHeliograph (CSRH), has entered the stage of routine observation. More than 23 million data records per day need to be effectively managed to provide high-performance data query and retrieval for scientific data reduction. In light of the massive amounts of data generated by the MUSER, in this paper, a novel data management technique called the negative database (ND) is proposed and used to implement a data management system for the MUSER. Built on a key-value database, the ND technique makes full use of the complement set of observational data to derive the requisite information. Experimental results showed that the proposed ND can significantly reduce storage volume in comparison with a relational database management system (RDBMS). Even when considering the time needed to derive records that were absent, its overall performance, including querying and deriving data, is comparable with that of an RDBMS. The ND technique effectively solves the problem of massive data storage for the MUSER and is a valuable reference for the massive data management required by next-generation telescopes.
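The paper's ND implementation is specific to MUSER, but the core idea — store only the complement of the observed set over a known finite universe, and derive the observed records as universe minus stored — can be sketched in a few lines (the universe and key scheme below are invented for illustration, not MUSER's actual layout):

```python
# Toy "negative database": over a known finite universe of record keys,
# store only the keys that are ABSENT and derive present ones on demand.
# This pays off when nearly every slot in the universe is filled.
# The universe/key scheme is illustrative, not MUSER's actual design.
from itertools import product

universe = {f"frame{f}:ant{a}" for f, a in product(range(4), range(3))}

class NegativeDB:
    def __init__(self, universe):
        self.universe = set(universe)
        self.absent = set(universe)      # initially nothing observed

    def record(self, key):               # an observation arrives
        self.absent.discard(key)

    def present(self):                   # derive observed = U \ absent
        return self.universe - self.absent

nd = NegativeDB(universe)
for key in sorted(universe - {"frame2:ant1"}):
    nd.record(key)
# only one key is physically stored, yet all 11 observed
# records can be derived from it
```

The storage saving the paper reports comes from exactly this asymmetry: when observations cover almost the whole universe, the absent set is tiny.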
48 CFR 52.232-33 - Payment by Electronic Funds Transfer-System for Award Management.
Code of Federal Regulations, 2013 CFR
2013-10-01
... contained in the System for Award Management (SAM) database. In the event that the EFT information changes, the Contractor shall be responsible for providing the updated information to the SAM database. (c... 210. (d) Suspension of payment. If the Contractor's EFT information in the SAM database is incorrect...
Microcomputer-Based Access to Machine-Readable Numeric Databases.
ERIC Educational Resources Information Center
Wenzel, Patrick
1988-01-01
Describes the use of microcomputers and relational database management systems to improve access to numeric databases by the Data and Program Library Service at the University of Wisconsin. The internal records management system, in-house reference tools, and plans to extend these tools to the entire campus are discussed. (3 references) (CLB)
ERIC Educational Resources Information Center
Hoffman, Tony
Sophisticated database management systems (DBMS) for microcomputers are becoming increasingly easy to use, allowing small school districts to develop their own autonomous databases for tracking enrollment and student progress in special education. DBMS applications can be designed for maintenance by district personnel with little technical…
Code of Federal Regulations, 2014 CFR
2014-10-01
... Wireless Telecommunications Bureau announces by public notice the implementation of a third-party database...) Provide an electronic copy of an interference analysis to the third-party database manager which...-party database managers shall receive and retain the interference analyses electronically and make them...
A Database Design and Development Case: NanoTEK Networks
ERIC Educational Resources Information Center
Ballenger, Robert M.
2010-01-01
This case provides a real-world project-oriented case study for students enrolled in a management information systems, database management, or systems analysis and design course in which database design and development are taught. The case consists of a business scenario to provide background information and details of the unique operating…
ERIC Educational Resources Information Center
Dalrymple, Prudence W.; Roderer, Nancy K.
1994-01-01
Highlights the changes that have occurred from 1987-93 in database access systems. Topics addressed include types of databases, including CD-ROMs; enduser interface; database selection; database access management, including library instruction and use of primary literature; economic issues; database users; the search process; and improving…
NASA Astrophysics Data System (ADS)
Lee, Sangho; Suh, Jangwon; Park, Hyeong-Dong
2015-03-01
Boring logs are widely used in geological field studies since the data describe various attributes of underground and surface environments. However, it is difficult to manage multiple boring logs in the field, as conventional management and visualization methods are not suitable for integrating and combining large data sets. We developed an iPad application that enables its user to search boring logs rapidly and visualize them using augmented reality (AR). For the development of the application, a standard borehole database appropriate for a mobile-based borehole database management system was designed. The application consists of three modules: an AR module, a map module, and a database module. The AR module superimposes borehole data on camera imagery as viewed by the user and provides intuitive visualization of borehole locations. The map module shows the locations of the corresponding borehole data on a 2D map with additional map layers. The database module provides data management functions over large borehole databases for the other modules. A field survey was also carried out using more than 100,000 borehole data records.
Kuhn, Stefan; Schlörer, Nils E
2015-08-01
With its laboratory information management system, nmrshiftdb2 supports the integration of electronic lab administration and management into academic NMR facilities. It also offers the setup of a local database while granting full access to nmrshiftdb2's World Wide Web database. This freely available system allows, on the one hand, the submission of orders for measurement, transfers recorded data automatically or manually, and enables download of spectra via a web interface, as well as integrated access to the prediction, search, and assignment tools of the NMR database for lab users. On the other hand, for the staff and lab administration, the flow of all orders can be supervised; administrative tools also include user and hardware management, a statistics functionality for accounting purposes, and a 'QuickCheck' function for assignment control, to facilitate quality control of assignments submitted to the (local) database. The laboratory information management system and database are based on a web interface as front end and are therefore independent of the operating system in use. Copyright © 2015 John Wiley & Sons, Ltd.
Slaughter, Robin J; Beasley, D Michael G; Lambie, Bruce S; Wilkins, Gerard T; Schep, Leo J
2012-12-14
New Zealand has a number of plants, both native and introduced, contact with which can lead to poisoning. The New Zealand National Poisons Centre (NZNPC) frequently receives enquiries regarding exposures to poisonous plants. Poisonous plants can cause harm following inadvertent ingestion, via skin contact, eye exposures or inhalation of sawdust or smoked plant matter. The purpose of this article is to determine the 15 most common poisonous plant enquiries to the NZNPC and provide a review of current literature, discussing the symptoms that might arise upon exposure to these poisonous plants and the recommended medical management of such poisonings. Call data from the NZNPC telephone collection databases regarding human plant exposures between 2003 and 2010 were analysed retrospectively. The most common plants causing human poisoning were selected as the basis for this review. An extensive literature review was also performed by systematically searching OVID MEDLINE, ISI Web of Science, Scopus and Google Scholar. Further information was obtained from book chapters, relevant news reports and web material. For the years 2003-2010 inclusive, a total of 256,969 enquiries were received by the NZNPC. Of these enquiries, 11,049 involved exposures to plants and fungi. The most common poisonous plant enquiries, in decreasing order of frequency, were: black nightshade (Solanum nigrum), arum lily (Zantedeschia aethiopica), kowhai (Sophora spp.), euphorbia (Euphorbia spp.), peace lily (Spathiphyllum spp.), agapanthus (Agapanthus spp.), stinking iris (Iris foetidissima), rhubarb (Rheum rhabarbarum), taro (Colocasia esculentum), oleander (Nerium oleander), daffodil (Narcissus spp.), hemlock (Conium maculatum), karaka (Corynocarpus laevigatus), foxglove (Digitalis purpurea) and ongaonga/New Zealand tree nettle (Urtica ferox). The combined total of enquiries for these 15 species was 2754 calls (representing approximately 25% of all enquiries regarding plant exposures). 
The signs and symptoms resulting from poisoning from these plants are discussed. Medical treatment recommendations are made. Poisoning following ingestion or other forms of exposures to plants in New Zealand is relatively common, particularly among children. However, serious adverse reactions are comparatively rare. Accurate plant identification and details on the type of exposure can be important in assessing the likely risks. Effective medical management of these poisonings can be achieved by following the principles outlined in this review.
Network Configuration of Oracle and Database Programming Using SQL
NASA Technical Reports Server (NTRS)
Davis, Melton; Abdurrashid, Jibril; Diaz, Philip; Harris, W. C.
2000-01-01
A database can be defined as a collection of information organized in such a way that it can be retrieved and used. A database management system (DBMS) can further be defined as the tool that enables us to manage and interact with the database. The Oracle 8 Server is a state-of-the-art information management environment. It is a repository for very large amounts of data, and gives users rapid access to that data. The Oracle 8 Server allows for sharing of data between applications; the information is stored in one place and used by many systems. My research will focus primarily on SQL (Structured Query Language) programming. SQL is the way you define and manipulate data in Oracle's relational database. SQL is the industry standard adopted by all database vendors. When programming with SQL, you work on sets of data (i.e., information is not processed one record at a time).
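The abstract's closing point — SQL operates on sets of rows, not one record at a time — can be shown with standard SQL (here through Python's built-in sqlite3 rather than Oracle 8; the table is hypothetical): a single UPDATE acts on every matching row at once, with no per-record loop.

```python
# Set-at-a-time SQL: one declarative statement operates on a whole set
# of rows. Demonstrated with sqlite3; the same SQL runs on Oracle.
# The parts table is a hypothetical example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE parts (id INTEGER PRIMARY KEY, "
             "status TEXT, qty INTEGER)")
conn.executemany("INSERT INTO parts VALUES (?, ?, ?)",
                 [(1, "ok", 10), (2, "low", 2), (3, "low", 1), (4, "ok", 7)])

# Every qualifying row is updated by one statement -- no explicit loop.
conn.execute("UPDATE parts SET status = 'reorder' WHERE qty < 5")
reorder = [r[0] for r in conn.execute(
    "SELECT id FROM parts WHERE status = 'reorder' ORDER BY id")]
```

The WHERE clause defines the set; the database engine, not the programmer, iterates over the records.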
NASA Astrophysics Data System (ADS)
Ifimov, Gabriela; Pigeau, Grace; Arroyo-Mora, J. Pablo; Soffer, Raymond; Leblanc, George
2017-10-01
In this study the development and implementation of a geospatial database model for the management of multiscale datasets encompassing airborne imagery and associated metadata is presented. To develop the multi-source geospatial database we have used a Relational Database Management System (RDBMS) on a Structured Query Language (SQL) server which was then integrated into ArcGIS and implemented as a geodatabase. The acquired datasets were compiled, standardized, and integrated into the RDBMS, where logical associations between different types of information were linked (e.g. location, date, and instrument). Airborne data, at different processing levels (digital numbers through geocorrected reflectance), were implemented in the geospatial database where the datasets are linked spatially and temporally. An example dataset consisting of airborne hyperspectral imagery, collected for inter- and intra-annual vegetation characterization and detection of potential hydrocarbon seepage events over pipeline areas, is presented. Our work provides a model for the management of airborne imagery, which is a challenging aspect of data management in remote sensing, especially when large volumes of data are collected.
Lee, Howard; Chapiro, Julius; Schernthaner, Rüdiger; Duran, Rafael; Wang, Zhijun; Gorodetski, Boris; Geschwind, Jean-François; Lin, MingDe
2015-04-01
The objective of this study was to demonstrate that an intra-arterial liver therapy clinical research database system is a more workflow-efficient and robust tool for clinical research than a spreadsheet storage system. The database system could be used to generate clinical research study populations easily with custom search and retrieval criteria. A questionnaire was designed and distributed to 21 board-certified radiologists to assess current data storage problems and clinician reception to a database management system. Based on the questionnaire findings, a customized database and user interface system were created to perform automatic calculations of clinical scores, including staging systems such as Child-Pugh and Barcelona Clinic Liver Cancer, and to facilitate data input and output. Questionnaire participants were favorable to a database system. The interface retrieved study-relevant data accurately and effectively. The database effectively produced easy-to-read, study-specific patient populations with custom-defined inclusion/exclusion criteria. The database management system is workflow efficient and robust in retrieving, storing, and analyzing data. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
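Automatic calculation of a staging score such as Child-Pugh is straightforward to sketch. The cutoffs below follow the commonly published Child-Pugh criteria; the abstract does not describe the study's actual implementation, and this toy function is for illustration only, not clinical use.

```python
# Child-Pugh score sketch using commonly published cutoffs.
# Illustrative only; not validated for clinical use, and not the
# study's actual implementation.

def band(value, low, high, reverse=False):
    """Assign 1/2/3 points by band; reverse=True when lower is worse."""
    if reverse:
        return 1 if value > high else (2 if value >= low else 3)
    return 1 if value < low else (2 if value <= high else 3)

def child_pugh(bilirubin_mg_dl, albumin_g_dl, inr, ascites, enceph):
    grade3 = {"none": 1, "mild": 2, "severe": 3}
    score = (band(bilirubin_mg_dl, 2.0, 3.0)        # <2 / 2-3 / >3
             + band(albumin_g_dl, 2.8, 3.5, reverse=True)  # >3.5 / 2.8-3.5 / <2.8
             + band(inr, 1.7, 2.3)                  # <1.7 / 1.7-2.3 / >2.3
             + grade3[ascites] + grade3[enceph])
    cls = "A" if score <= 6 else ("B" if score <= 9 else "C")
    return score, cls
```

Encoding the score once in the database layer is exactly what removes the per-spreadsheet recalculation the questionnaire respondents complained about.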
NASA Technical Reports Server (NTRS)
Kelley, Steve; Roussopoulos, Nick; Sellis, Timos; Wallace, Sarah
1993-01-01
The Universal Index System (UIS) is an index management system that uses a uniform interface to solve the heterogeneity problem among database management systems. UIS not only provides an easy-to-use common interface to access all underlying data, but also allows for different underlying database management systems, storage representations, and access methods.
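The abstract does not detail UIS internals, but the general pattern — one common lookup interface over backends with different storage representations — can be sketched as follows (the backend classes are hypothetical stand-ins, not UIS code):

```python
# One common lookup interface over heterogeneous storage backends:
# the general pattern behind a uniform index interface. The backends
# here are hypothetical stand-ins for real DBMSs.
from bisect import bisect_left

class MemoryBackend:
    """Hash-table storage representation."""
    def __init__(self, rows):
        self._rows = dict(rows)
    def get(self, key):
        return self._rows.get(key)

class SortedListBackend:
    """Different representation: sorted key/value pairs, binary search."""
    def __init__(self, rows):
        self._rows = sorted(rows)
    def get(self, key):
        i = bisect_left(self._rows, (key,))
        if i < len(self._rows) and self._rows[i][0] == key:
            return self._rows[i][1]
        return None

class UniformIndex:
    """Clients use one interface regardless of the underlying store."""
    def __init__(self, backend):
        self._backend = backend
    def lookup(self, key):
        return self._backend.get(key)

rows = [("p42", "bracket"), ("p7", "valve")]
answers = {UniformIndex(b).lookup("p7")
           for b in (MemoryBackend(rows), SortedListBackend(rows))}
```

Both backends return the same answer through the same interface, which is the heterogeneity problem reduced to its simplest form.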
Maintaining Research Documents with Database Management Software.
ERIC Educational Resources Information Center
Harrington, Stuart A.
1999-01-01
Discusses taking notes for research projects and organizing them into card files; reviews the literature on personal filing systems; introduces the basic process of database management; and offers a plan for managing research notes. Describes field groups and field definitions, data entry, and creating reports. (LRW)
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-22
... CONSUMER PRODUCT SAFETY COMMISSION Agency Information Collection Activities; Announcement of Office of Management and Budget Approval; Publicly Available Consumer Product Safety Information Database... Product Safety Information Database has been approved by the Office of Management and Budget (OMB) under...
Use of point-of-sale data to track usage patterns of residential pesticides: methodology development
Bekarian, Nyree; Payne-Sturges, Devon; Edmondson, Stuart; Chism, Bill; Woodruff, Tracey J
2006-01-01
Background: Residential-use pesticides have been shown to be a major source of pesticide exposure to people in the United States. However, little is understood about the exposures to household pesticides and the resultant health effects. One reason that little is known about home-use pesticide exposure is the lack of comprehensive data on exposures to pesticides in the home. One method to help ascertain the amount of pesticides present in the home is use of point-of-sale data collected from marketing companies that track product sales to obtain the volume of pesticides sold for home-use. This provides a measure of volume of home-use pesticide. Methods: We have constructed a searchable database containing sales data for home-use permethrin-containing pesticides sold by retail stores in the United States from January 1997 through December 2002 in an attempt to develop a tracking method for pesticide. This pilot project was conducted to determine if point-of-sale data would be effective in helping track the purchase of home-use permethrin containing pesticides and if it would stand as a good model for tracking sales of other home-use pesticides. Results: There are several limitations associated with this tracking method, including the availability of sales data, market coverage, and geographic resolution. As a result, a fraction of sales data potentially available for reporting is represented in this database. However, the database is sensitive to the number and type of merchants reporting permethrin sales. Further, analysis of the sale of individual products included in the database indicates that year to year variability has a greater impact on reported permethrin sales than the amount sold by each type of merchant. 
Conclusion: We conclude that, while nothing could completely replace a detailed exposure assessment to estimate exposures to home-use pesticides, a point-of-sale database is a useful tool in tracking the purchase of these types of pesticides to 1) detect anomalous trends in regional and seasonal pesticide sales warranting further investigation into the potential causes of the trends; 2) determine the most commonly purchased application types; and 3) compare relative trends in sales between indoor and outdoor use products as well as compare trends in sales between different active ingredients. PMID:16725037
NASA Astrophysics Data System (ADS)
Zhou, Hui
Carrying out an office and departmental target responsibility system is an inevitable outcome of higher education reform, and statistical processing of student information is an important part of student performance review. On the basis of an analysis of student evaluation, this paper designs a student information management database application system using relational database management system software. To implement the student information management functions, the functional requirements, overall structure, data sheets and fields, data sheet associations, and software code are designed in detail.
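A minimal sketch of the statistical-processing side — per-student averages over a scores table in a relational schema — using Python's built-in sqlite3 (the schema is illustrative, not the paper's actual design):

```python
# Student-performance statistics over a relational schema: students
# linked to per-course scores, averaged per student with one query.
# Schema and data are hypothetical, not taken from the paper.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE students (sid INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE scores (sid INTEGER REFERENCES students(sid),
                     course TEXT, score REAL);
""")
conn.executemany("INSERT INTO students VALUES (?, ?)",
                 [(1, "Chen"), (2, "Liu")])
conn.executemany("INSERT INTO scores VALUES (?, ?, ?)",
                 [(1, "math", 90), (1, "english", 80),
                  (2, "math", 70), (2, "english", 95)])

# One aggregate query does the statistical processing per student.
averages = dict(conn.execute("""
    SELECT s.name, AVG(sc.score)
    FROM students s JOIN scores sc ON s.sid = sc.sid
    GROUP BY s.sid"""))
```

Splitting students and scores into linked tables is what lets the same query, add, delete, and modify functions serve every course without schema changes.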
The MAJORANA Parts Tracking Database
NASA Astrophysics Data System (ADS)
Abgrall, N.; Aguayo, E.; Avignone, F. T.; Barabash, A. S.; Bertrand, F. E.; Brudanin, V.; Busch, M.; Byram, D.; Caldwell, A. S.; Chan, Y.-D.; Christofferson, C. D.; Combs, D. C.; Cuesta, C.; Detwiler, J. A.; Doe, P. J.; Efremenko, Yu.; Egorov, V.; Ejiri, H.; Elliott, S. R.; Esterline, J.; Fast, J. E.; Finnerty, P.; Fraenkle, F. M.; Galindo-Uribarri, A.; Giovanetti, G. K.; Goett, J.; Green, M. P.; Gruszko, J.; Guiseppe, V. E.; Gusev, K.; Hallin, A. L.; Hazama, R.; Hegai, A.; Henning, R.; Hoppe, E. W.; Howard, S.; Howe, M. A.; Keeter, K. J.; Kidd, M. F.; Kochetov, O.; Konovalov, S. I.; Kouzes, R. T.; LaFerriere, B. D.; Leon, J. Diaz; Leviner, L. E.; Loach, J. C.; MacMullin, J.; Martin, R. D.; Meijer, S. J.; Mertens, S.; Miller, M. L.; Mizouni, L.; Nomachi, M.; Orrell, J. L.; O`Shaughnessy, C.; Overman, N. R.; Petersburg, R.; Phillips, D. G.; Poon, A. W. P.; Pushkin, K.; Radford, D. C.; Rager, J.; Rielage, K.; Robertson, R. G. H.; Romero-Romero, E.; Ronquest, M. C.; Shanks, B.; Shima, T.; Shirchenko, M.; Snavely, K. J.; Snyder, N.; Soin, A.; Suriano, A. M.; Tedeschi, D.; Thompson, J.; Timkin, V.; Tornow, W.; Trimble, J. E.; Varner, R. L.; Vasilyev, S.; Vetter, K.; Vorren, K.; White, B. R.; Wilkerson, J. F.; Wiseman, C.; Xu, W.; Yakushev, E.; Young, A. R.; Yu, C.-H.; Yumatov, V.; Zhitnikov, I.
2015-04-01
The MAJORANA DEMONSTRATOR is an ultra-low background physics experiment searching for the neutrinoless double beta decay of 76Ge. The MAJORANA Parts Tracking Database is used to record the history of components used in the construction of the DEMONSTRATOR. The tracking implementation takes a novel approach based on the schema-free database technology CouchDB. Transportation, storage, and processes undergone by parts such as machining or cleaning are linked to part records. Parts tracking provides a great logistics benefit and an important quality assurance reference during construction. In addition, the location history of parts provides an estimate of their exposure to cosmic radiation. A web application for data entry and a radiation exposure calculator have been developed as tools for achieving the extreme radio-purity required for this rare decay search.
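CouchDB stores schema-free JSON documents, so a part record can carry a free-form location history from which surface exposure is later derived. The sketch below shows what such a document and a crude exposure tally might look like; the field names and the flat days-above-ground model are invented for illustration, not the experiment's actual schema or dosimetry.

```python
# Schema-free part record in the CouchDB style: a JSON document with a
# free-form location history. Cosmic-ray exposure is approximated here
# as days spent above ground -- a deliberately crude, hypothetical model.
import json
from datetime import date

part = {
    "_id": "part-000123",
    "type": "copper_mount",
    "history": [
        {"location": "surface_lab", "from": "2013-01-01", "to": "2013-03-01"},
        {"location": "underground", "from": "2013-03-01", "to": "2013-09-01"},
        {"location": "surface_lab", "from": "2013-09-01", "to": "2013-09-11"},
    ],
}

def surface_days(doc):
    """Days logged at above-ground locations (toy exposure proxy)."""
    total = 0
    for leg in doc["history"]:
        if leg["location"] != "underground":
            total += (date.fromisoformat(leg["to"])
                      - date.fromisoformat(leg["from"])).days
    return total

doc = json.loads(json.dumps(part))  # round-trip as it would sit in CouchDB
```

Because the document is schema-free, a new process step (say, a cleaning record with different fields) can be appended to the history without any migration, which is the appeal of CouchDB for this use case.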
Hewett, Paul; Bullock, William H
2014-01-01
For more than 20 years CSX Transportation (CSXT) has collected exposure measurements from locomotive engineers and conductors who are potentially exposed to diesel emissions. The database included measurements for elemental and total carbon, polycyclic aromatic hydrocarbons, aromatics, aldehydes, carbon monoxide, and nitrogen dioxide. This database was statistically analyzed and summarized, and the resulting statistics and exposure profiles were compared to relevant occupational exposure limits (OELs) using both parametric and non-parametric descriptive and compliance statistics. Exposure ratings, using the American Industrial Hygiene Association (AIHA) exposure categorization scheme, were determined using both the compliance statistics and Bayesian Decision Analysis (BDA). The statistical analysis of the elemental carbon data (a marker for diesel particulate) strongly suggests that the majority of levels in the cabs of the lead locomotives (n = 156) were less than the California guideline of 0.020 mg/m(3). The sample 95th percentile was roughly half the guideline, resulting in an AIHA exposure rating of category 2/3 (determined using BDA). The elemental carbon (EC) levels in the trailing locomotives tended to be greater than those in the lead locomotive; however, locomotive crews rarely ride in the trailing locomotive. Lead locomotive EC levels were similar to those reported by other investigators studying locomotive crew exposures and to levels measured in urban areas. Lastly, both the EC sample mean and 95% UCL were less than the Environmental Protection Agency (EPA) reference concentration of 0.005 mg/m(3). With the exception of nitrogen dioxide, the overwhelming majority of the measurements for total carbon, polycyclic aromatic hydrocarbons, aromatics, aldehydes, and combustion gases in the cabs of CSXT locomotives were either non-detects or considerably less than the working OELs for the years represented in the database. 
When compared to the previous American Conference of Governmental Industrial Hygienists (ACGIH) threshold limit value (TLV) of 3 ppm, the nitrogen dioxide exposure profile merits an exposure rating of AIHA exposure category 1. However, using the newly adopted TLV of 0.2 ppm, the exposure profile receives an exposure rating of category 4. Further evaluation is recommended to determine the current status of nitrogen dioxide exposures. [Supplementary materials are available for this article. Go to the publisher's online edition of Journal of Occupational and Environmental Hygiene for the following free supplemental resource: additional text on OELs, methods, results, and additional figures and tables.].
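The AIHA categorization applied above is commonly implemented by banding the estimated 95th percentile of a lognormal exposure distribution against the OEL. A minimal sketch of that step, using illustrative data and commonly cited category bounds rather than the study's Bayesian analysis:

```python
import math
import statistics

def aiha_category(samples, oel):
    """Rate an exposure profile against an OEL by banding the estimated
    95th percentile (lognormal assumption). Bands on X95/OEL assumed here:
    <=0.10 -> category 1, <=0.50 -> 2, <=1.0 -> 3, >1.0 -> 4."""
    logs = [math.log(x) for x in samples]
    gm = math.exp(statistics.mean(logs))      # geometric mean
    gsd = math.exp(statistics.stdev(logs))    # geometric standard deviation
    x95 = gm * gsd ** 1.645                   # lognormal 95th percentile
    ratio = x95 / oel
    for cat, bound in ((1, 0.10), (2, 0.50), (3, 1.0)):
        if ratio <= bound:
            return cat, x95
    return 4, x95

# Illustrative concentrations only, not the CSXT measurements:
ec = [0.004, 0.007, 0.009, 0.012, 0.006, 0.008, 0.010, 0.005]
cat, x95 = aiha_category(ec, oel=0.020)
print(cat, round(x95, 4))
```

A 95th percentile near half the guideline lands in category 2/3, matching the kind of rating reported above; the study's actual ratings came from Bayesian Decision Analysis, not this point estimate.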
Brilleman, Samuel L.; Wolfe, Rory; Moreno-Betancur, Margarita; Sales, Anne E.; Langa, Kenneth M.; Li, Yun; Daugherty Biddison, Elizabeth L.; Rubinson, Lewis; Iwashyna, Theodore J.
2016-01-01
Disasters occur frequently in the United States (US) and their impact on acute morbidity, mortality and short-term increased health needs has been well described. However, aside from mental health, little is known about the medium- or longer-term health impacts of disasters. This study sought to determine if there is an association between community-level disaster exposure and individual-level changes in disability and/or the risk of death for older Americans. Using the US Federal Emergency Management Agency’s database of disaster declarations, 602 disasters occurred between August 1998 and December 2010 and were characterized by their presence, intensity, duration and type. Repeated measurements of a disability score (based on activities of daily living) and dates of death were observed between January 2000 and November 2010 for 18,102 American individuals aged 50 to 89 years who were participating in the national longitudinal Health and Retirement Study. Longitudinal (disability) and time-to-event (death) data were modelled simultaneously using a ‘joint modelling’ approach. There was no evidence of an association between community-level disaster exposure and individual-level changes in disability or the risk of death. Our results suggest that future research should focus on individual-level disaster exposures, moderate to severe disaster events, or higher-risk groups of individuals. PMID:27960126
Rattner, B.A.; Eisenreich, K.M.; Golden, N.H.; McKernan, M.A.; Hothem, R.L.; Custer, T.W.
2005-01-01
The Contaminant Exposure and Effects—Terrestrial Vertebrates (CEE-TV) database was developed to conduct simple searches for ecotoxicological information, examine exposure trends, and identify significant data gaps. The CEE-TV database contains 16,696 data records on free-ranging amphibians, reptiles, birds, and mammals residing in estuarine and coastal habitats of the Atlantic, Gulf, and Pacific coasts, Alaska, Hawaii, and the Great Lakes. Information in the database was derived from over 1800 source documents, representing 483 unique species (about 252,000 individuals), with sample collection dates spanning from 1884 to 2003. The majority of the records contain exposure data (generally contaminant concentrations) on a limited number (n = 209) of chlorinated and brominated compounds, cholinesterase-inhibiting pesticides, economic poisons, metals, and petroleum hydrocarbons, whereas only 9.3% of the records contain biomarker or bioindicator effects data. Temporal examination of exposure data provides evidence of declining concentrations of certain organochlorine pesticides in some avian species (e.g., ospreys, Pandion haliaetus), and an apparent increase in the detection and possibly the incidence of avian die-offs related to cholinesterase-inhibiting pesticides. To identify spatial data gaps, 11,360 database records with specific sampling locations were combined with the boundaries of coastal watersheds, and National Wildlife Refuge and National Park units. Terrestrial vertebrate ecotoxicological data were lacking in 41.9% of 464 coastal watersheds in the continental United States. Recent (1990–2003) terrestrial vertebrate contaminant exposure or effects data were available for only about half of the National Wildlife Refuge and National Park units in the geographic area encompassed by the database. 
When these data gaps were overlaid on watersheds exhibiting serious water quality problems and/or high vulnerability to pollution, 72 coastal watersheds, and 76 National Wildlife Refuge and 59 National Park units in the continental United States were found to lack recent terrestrial vertebrate ecotoxicology data. Delineation of data gaps in watersheds of concern can help prioritize monitoring in areas with impaired water quality and emphasize the need for comprehensive monitoring to gain a more complete understanding of coastal ecosystem health.
Appraisal of levels and patterns of occupational exposure to 1,3-butadiene.
Scarselli, Alberto; Corfiati, Marisa; Di Marzi, Davide; Iavicoli, Sergio
2017-09-01
Objectives 1,3-butadiene is classified as carcinogenic to humans by inhalation, and its association with leukemia has been observed in several epidemiological studies. The aim of this study was to evaluate data on occupational exposure levels to 1,3-butadiene in the Italian workforce. Methods Airborne concentrations of 1,3-butadiene were extracted from the Italian database on occupational exposure to carcinogens for the period 1996-2015. Descriptive statistics were calculated for exposure-related variables. An analysis using a linear mixed model was performed to determine factors influencing the exposure level. The probability of exceeding the exposure limit was predicted using a mixed-effects logistic model. Concurrent exposures to other occupational carcinogens were investigated using two-step cluster analysis. Results The total number of exposure measurements selected was 23,885, with an overall arithmetic mean of 0.12 mg/m3. The economic sector with the highest number of measurements was manufacturing of chemicals (18,744). The most predictive variables of the exposure level proved to be the occupational group and its interaction with the measurement year. The highest likelihood of exceeding the exposure limit was found in the manufacture of coke and refined petroleum products. Concurrent exposures were frequently detected, mainly with benzene, acrylonitrile and ethylene dichloride, and three main clusters were identified. Conclusions Exposure to 1,3-butadiene occurs in a wide variety of activity sectors and occupational groups. The use of several statistical analysis methods applied to occupational exposure databases can help to identify exposure situations at high risk for workers' health and better target preventive interventions and research projects.
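The exceedance-probability step can be illustrated with a plain logistic model: a fitted sector effect is added to the intercept on the log-odds scale and transformed back to a probability. All coefficient values below are invented for illustration; the abstract does not report the fitted model:

```python
import math

# Hypothetical fixed-effect coefficients on the log-odds scale (invented).
INTERCEPT = -3.0
SECTOR_EFFECT = {
    "chemicals": 0.8,
    "coke_refined_petroleum": 2.1,  # highest exceedance likelihood in the study
    "rubber_plastics": 0.3,
}

def p_exceed(sector):
    """Predicted probability that a measurement exceeds the exposure limit,
    from a (hypothetical) logistic model: p = 1 / (1 + exp(-logit))."""
    logit = INTERCEPT + SECTOR_EFFECT.get(sector, 0.0)
    return 1.0 / (1.0 + math.exp(-logit))

probs = {s: round(p_exceed(s), 3) for s in SECTOR_EFFECT}
print(probs)
```

The real model additionally carries random effects (e.g. by worksite), which shift individual predictions but leave this fixed-effect arithmetic unchanged.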
Development of the ageing management database of PUSPATI TRIGA reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramli, Nurhayati, E-mail: nurhayati@nm.gov.my; Tom, Phongsakorn Prak; Husain, Nurfazila
Since its first criticality in 1982, the PUSPATI TRIGA Reactor (RTP) has been operated for more than 30 years. As RTP ages, ageing problems have become prominent issues. To address them, an Ageing Management (AgeM) database for managing related ageing matters was systematically developed. This paper presents the development of the AgeM database, taking into account all major RTP Systems, Structures and Components (SSCs) and the ageing mechanisms of these SSCs through the system surveillance program.
1985-12-01
Relational to Network Query Translator for a Distributed Database Management System. Thesis by Kevin H. Mahoney, Captain, USAF (AFIT/GCS/ENG/85D-7), presented to the Faculty of the School of Engineering of the Air Force Institute of Technology, Air University, in partial fulfillment of the requirements for the degree of Master of Science in Computer Systems.
Tautomerism in chemical information management systems
NASA Astrophysics Data System (ADS)
Warr, Wendy A.
2010-06-01
Tautomerism has an impact on many of the processes in chemical information management systems including novelty checking during registration into chemical structure databases; storage of structures; exact and substructure searching in chemical structure databases; and depiction of structures retrieved by a search. The approaches taken by 27 different software vendors and database producers are compared. It is hoped that this comparison will act as a discussion document that could ultimately improve databases and software for researchers in the future.
Wang, L.; Infante, D.; Esselman, P.; Cooper, A.; Wu, D.; Taylor, W.; Beard, D.; Whelan, G.; Ostroff, A.
2011-01-01
Fisheries management programs, such as the National Fish Habitat Action Plan (NFHAP), urgently need a nationwide spatial framework and database for health assessment and policy development to protect and improve riverine systems. To meet this need, we developed a spatial framework and database using the National Hydrography Dataset Plus (1:100,000 scale; http://www.horizon-systems.com/nhdplus). This framework uses interconfluence river reaches and their local and network catchments as fundamental spatial river units, and a series of ecological and political spatial descriptors as hierarchy structures, to allow users to extract or analyze information at spatial scales that they define. The database consists of variables describing channel characteristics, network position/connectivity, climate, elevation, gradient, and size. It contains a series of catchment-scale natural and human-induced factors that are known to influence river characteristics. Our framework and database assemble all river reaches and their descriptors in one place for the first time for the conterminous United States. They provide users with the capability of adding data, conducting analyses, developing management scenarios and regulations, and tracking management progress at a variety of spatial scales. This database provides the essential data needed for achieving the objectives of NFHAP and other management programs. The downloadable beta version database is available at http://ec2-184-73-40-15.compute-1.amazonaws.com/nfhap/main/.
Common Database Interface for Heterogeneous Software Engineering Tools.
1987-12-01
Thesis presented to the Faculty of the Air Force Institute of Technology, Air University, in partial fulfillment of the requirements for the degree of Master of Science in Information Systems. Subject terms include database management systems, programming (computers), computer files, information transfer, and interfaces; topics covered include the System 690 configuration, database functions, software engineering environments, and the data manager.
NASA Astrophysics Data System (ADS)
Gebhardt, Steffen; Wehrmann, Thilo; Klinger, Verena; Schettler, Ingo; Huth, Juliane; Künzer, Claudia; Dech, Stefan
2010-10-01
The German-Vietnamese water-related information system for the Mekong Delta (WISDOM) project supports business processes in Integrated Water Resources Management in Vietnam. Multiple disciplines bring together earth and ground based observation themes, such as environmental monitoring, water management, demographics, economy, information technology, and infrastructural systems. This paper introduces the components of the web-based WISDOM system, including the data, logic and presentation tiers. It focuses on the data models upon which the database management system is built, including techniques for tagging or linking metadata with the stored information. The model also uses ordered groupings of spatial, thematic and temporal reference objects to semantically tag datasets to enable fast data retrieval, such as finding all data in a specific administrative unit belonging to a specific theme. A spatial database extension is employed by the PostgreSQL database. This object-oriented database was chosen over a relational database to tag spatial objects to tabular data, improving the retrieval of census and observational data at regional, provincial, and local levels. While the spatial database is less suited to processing raster data, a "work-around" was built into WISDOM to permit efficient management of both raster and vector data. The data model also incorporates styling aspects of the spatial datasets through styled layer descriptor (SLD) and web mapping service (WMS) layer specifications, allowing retrieval of rendered maps. Metadata elements of the spatial data are based on the ISO19115 standard. XML-structured SLD and metadata information is stored in an XML database. The data models and the data management system are robust for managing the large quantity of spatial objects, sensor observations, census and document data.
The operational WISDOM information system prototype contains modules for data management, automatic data integration, and web services for data retrieval, analysis, and distribution. The graphical user interfaces facilitate metadata cataloguing, data warehousing, web sensor data analysis and thematic mapping.
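The tagging scheme described above, in which datasets are linked to spatial, thematic, and temporal reference objects, can be sketched as a join over a tag table: retrieval becomes a lookup on tags rather than a scan of the payloads. Table names and data below are invented; WISDOM's actual schema is PostgreSQL-based:

```python
import sqlite3

# In-memory stand-in for the tagged-dataset model: each dataset row is
# linked to reference objects through (kind, value) tags.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dataset (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE tag (dataset_id INTEGER, kind TEXT, value TEXT);
""")
con.executemany("INSERT INTO dataset VALUES (?, ?)",
                [(1, "census_2009"), (2, "water_quality_2009"), (3, "landuse_map")])
con.executemany("INSERT INTO tag VALUES (?, ?, ?)", [
    (1, "spatial", "Can Tho"), (1, "theme", "demographics"),
    (2, "spatial", "Can Tho"), (2, "theme", "water"),
    (3, "spatial", "An Giang"), (3, "theme", "water"),
])

# "All datasets in administrative unit X belonging to theme Y":
rows = con.execute("""
    SELECT d.name FROM dataset d
    JOIN tag s ON s.dataset_id = d.id AND s.kind = 'spatial' AND s.value = ?
    JOIN tag t ON t.dataset_id = d.id AND t.kind = 'theme'   AND t.value = ?
""", ("Can Tho", "water")).fetchall()
print([r[0] for r in rows])
```

Because the query touches only the small tag table, this pattern stays fast as the stored payloads (rasters, sensor series, documents) grow.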
Design and deployment of a large brain-image database for clinical and nonclinical research
NASA Astrophysics Data System (ADS)
Yang, Guo Liang; Lim, Choie Cheio Tchoyoson; Banukumar, Narayanaswami; Aziz, Aamer; Hui, Francis; Nowinski, Wieslaw L.
2004-04-01
An efficient database is an essential component of organizing diverse information on image metadata and patient information for research in medical imaging. This paper describes the design, development and deployment of a large database system serving as a brain image repository that can be used across different platforms in various medical researches. It forms the infrastructure that links hospitals and institutions together and shares data among them. The database contains patient-, pathology-, image-, research- and management-specific data. The functionalities of the database system include image uploading, storage, indexing, downloading and sharing as well as database querying and management, with security and data anonymization concerns well taken care of. The structure of the database is a multi-tier client-server architecture with a Relational Database Management System, Security Layer, Application Layer and User Interface. An image source adapter has been developed to handle most of the popular image formats. The database has a user interface based on web browsers and is easy to handle. We used the Java programming language for its platform independence and vast function libraries. The brain image database can sort data according to clinically relevant information. This can be effectively used in research from the clinicians' points of view. The database is suitable for validation of algorithms on large populations of cases. Medical images for processing can be identified and organized based on information in image metadata. Clinical research in various pathologies can thus be performed with greater efficiency, and large image repositories can be managed more effectively. The prototype of the system has been installed in a few hospitals and is working to the satisfaction of the clinicians.
Schurr, K.M.; Cox, S.E.
1994-01-01
The Pesticide-Application Data-Base Management System was created as a demonstration project and was tested with data submitted to the Washington State Department of Agriculture by pesticide applicators from a small geographic area. These data were entered into the Department's relational data-base system and uploaded into the system's ARC/INFO files. Locations for pesticide applications are assigned within the Public Land Survey System grids, and ARC/INFO programs in the Pesticide-Application Data-Base Management System can subdivide each survey section into sixteen idealized quarter-quarter sections for display map grids. The system provides data retrieval and geographic information system plotting capabilities from a menu of seven basic retrieval options. Additionally, ARC/INFO coverages can be created from the retrieved data when required for particular applications. The Pesticide-Application Data-Base Management System, or the general principles used in the system, could be adapted to other applications or to other states.
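Subdividing an idealized square section into sixteen quarter-quarter sections, as described above, amounts to two rounds of quadrant selection: pick the quarter, then the quarter of that quarter. A sketch under an assumed naming convention (not taken from the report):

```python
def quarter_quarter(x, y):
    """Map a point inside an idealized square section to one of the 16
    quarter-quarter cells, labeled e.g. 'NW of NE'. Coordinates are
    normalized: x increases eastward in [0, 1), y northward in [0, 1)."""
    def quadrant(px, py):
        ns = "N" if py >= 0.5 else "S"
        ew = "E" if px >= 0.5 else "W"
        return ns + ew
    q = quadrant(x, y)          # which quarter section the point falls in
    qx = (x * 2.0) % 1.0        # rescale into that quarter...
    qy = (y * 2.0) % 1.0
    return quadrant(qx, qy) + " of " + q   # ...and pick the quadrant again

print(quarter_quarter(0.9, 0.9))  # a point in the far northeast corner
```

Real sections deviate from the ideal square, which is why the report calls these cells "idealized" quarter-quarter sections.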
PERFORMANCE AUDITING OF A HUMAN AIR POLLUTION EXPOSURE CHAMBER FOR PM2.5
Databases derived from human health effects research play a vital role in setting environmental standards. An underlying assumption in using these databases for standard setting purposes is that they are of adequate quality. The performance auditing program described provides n...
Mater, Gautier; Paris, Christophe; Lavoué, Jérôme
2016-05-01
Several countries have built databases of occupational hygiene measurements. In France, COLCHIC and SCOLA co-exist, started in 1987 and 2007, respectively. A descriptive comparison of the content of the two databases was carried out for the period 1987-2012, including variables, workplaces and agents, as well as exposure levels. COLCHIC and SCOLA contain, respectively, 841,682 records (670 chemicals) and 152,486 records (70 chemicals). They cover similar industries and occupations, and contain the same ancillary information. Across 17 common agents with >500 samples, the ratio of the median concentration in COLCHIC to the median concentration in SCOLA was 3.45 [1.03-14.3] during 2007-2012. This pattern remained when stratified by industry, task, and occupation, but was attenuated when restricted to similar sampling durations. COLCHIC and SCOLA represent a considerable source of information, but result from different purposes (prevention, regulatory). Potential differences due to sampling strategies should be evaluated when interpreting data from these databases. © 2016 Wiley Periodicals, Inc.
WebCN: A web-based computation tool for in situ-produced cosmogenic nuclides
NASA Astrophysics Data System (ADS)
Ma, Xiuzeng; Li, Yingkui; Bourgeois, Mike; Caffee, Marc; Elmore, David; Granger, Darryl; Muzikar, Paul; Smith, Preston
2007-06-01
Cosmogenic nuclide techniques are increasingly being utilized in geoscience research. For this it is critical to establish an effective, easily accessible and well defined tool for cosmogenic nuclide computations. We have been developing a web-based tool (WebCN) to calculate surface exposure ages and erosion rates based on nuclide concentrations measured by accelerator mass spectrometry. WebCN for 10Be and 26Al has been finished and published at http://www.physics.purdue.edu/primelab/for_users/rockage.html. WebCN for 36Cl is under construction. WebCN is designed as a three-tier client/server model and uses the open source PostgreSQL for database management and PHP for the interface design and calculations. On the client side, an internet browser and Microsoft Access are used as application interfaces to access the system. Open Database Connectivity is used to link PostgreSQL and Microsoft Access. WebCN accounts for both spatial and temporal distributions of the cosmic ray flux to calculate the production rates of in situ-produced cosmogenic nuclides at the Earth's surface.
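For the simplest case (zero erosion, no burial), the exposure-age calculation such tools perform reduces to inverting the radioactive-buildup equation N = (P/λ)(1 − e^(−λt)). The half-life and production rate below are illustrative assumptions, not WebCN's calibrated values:

```python
import math

# Simple-exposure age: N = (P/lam) * (1 - exp(-lam*t))
#   =>  t = -ln(1 - N*lam/P) / lam
HALF_LIFE_BE10 = 1.387e6            # years, 10Be (assumed value)
LAM = math.log(2) / HALF_LIFE_BE10  # decay constant, 1/yr
P = 4.0                             # atoms g^-1 yr^-1, site production rate (assumed)

def exposure_age(n_atoms_per_g):
    """Years of exposure implied by a measured 10Be concentration,
    assuming zero erosion and constant production."""
    return -math.log(1.0 - n_atoms_per_g * LAM / P) / LAM

age = exposure_age(5.0e4)           # e.g. 50,000 atoms/g of 10Be
print(round(age))
```

WebCN's actual calculation additionally scales the production rate P for latitude, altitude, and the temporal variation of the cosmic ray flux, and supports a nonzero erosion rate.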
Informatics in radiology: use of CouchDB for document-based storage of DICOM objects.
Rascovsky, Simón J; Delgado, Jorge A; Sanz, Alexander; Calvo, Víctor D; Castrillón, Gabriel
2012-01-01
Picture archiving and communication systems traditionally have depended on schema-based Structured Query Language (SQL) databases for imaging data management. To optimize database size and performance, many such systems store a reduced set of Digital Imaging and Communications in Medicine (DICOM) metadata, discarding informational content that might be needed in the future. As an alternative to traditional database systems, document-based key-value stores recently have gained popularity. These systems store documents containing key-value pairs that facilitate data searches without predefined schemas. Document-based key-value stores are especially suited to archive DICOM objects because DICOM metadata are highly heterogeneous collections of tag-value pairs conveying specific information about imaging modalities, acquisition protocols, and vendor-supported postprocessing options. The authors used an open-source document-based database management system (Apache CouchDB) to create and test two such databases; CouchDB was selected for its overall ease of use, capability for managing attachments, and reliance on HTTP and Representational State Transfer standards for accessing and retrieving data. A large database was created first in which the DICOM metadata from 5880 anonymized magnetic resonance imaging studies (1,949,753 images) were loaded by using a Ruby script. To provide the usual DICOM query functionality, several predefined "views" (standard queries) were created by using JavaScript. For performance comparison, the same queries were executed in both the CouchDB database and a SQL-based DICOM archive. The capabilities of CouchDB for attachment management and database replication were separately assessed in tests of a similar, smaller database. 
Results showed that CouchDB allowed efficient storage and interrogation of all DICOM objects; with the use of information retrieval algorithms such as map-reduce, all the DICOM metadata stored in the large database were searchable with only a minimal increase in retrieval time over that with the traditional database management system. Results also indicated possible uses for document-based databases in data mining applications such as dose monitoring, quality assurance, and protocol optimization. RSNA, 2012
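The map-reduce views discussed above can be emulated in a few lines: a map function emits (key, value) pairs from each document, and a reduce function collapses the values grouped under each key. A pure-Python sketch over invented DICOM-like metadata (field names follow DICOM keywords):

```python
from collections import defaultdict

# Invented DICOM-like metadata documents.
docs = [
    {"Modality": "MR", "StudyDescription": "Brain", "SeriesNumber": 1},
    {"Modality": "MR", "StudyDescription": "Brain", "SeriesNumber": 2},
    {"Modality": "CT", "StudyDescription": "Chest", "SeriesNumber": 1},
]

def map_fn(doc):
    # A CouchDB view's map function emits (key, value) pairs per document;
    # here: one count per modality.
    yield doc["Modality"], 1

def reduce_fn(values):
    return sum(values)

def run_view(documents, map_fn, reduce_fn):
    """Group emitted values by key, then reduce each group."""
    buckets = defaultdict(list)
    for doc in documents:
        for key, value in map_fn(doc):
            buckets[key].append(value)
    return {key: reduce_fn(vals) for key, vals in buckets.items()}

print(run_view(docs, map_fn, reduce_fn))
```

In CouchDB the map and reduce functions are stored as JavaScript in a design document and their results are incrementally indexed, which is what keeps retrieval time low as the database grows.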
Kingfisher: a system for remote sensing image database management
NASA Astrophysics Data System (ADS)
Bruzzo, Michele; Giordano, Ferdinando; Dellepiane, Silvana G.
2003-04-01
At present, retrieval methods in remote sensing image databases are mainly based on spatial-temporal information. The increasing number of images collected by the ground stations of earth observing systems emphasizes the need for database management with intelligent data retrieval capabilities. The purpose of the proposed method is to realize a new content-based retrieval system for remote sensing image databases with an innovative search tool based on image similarity. This methodology is quite innovative for this application: many systems exist for photographic images, such as QBIC and IKONA, but they cannot properly extract and describe remote sensing image content. The target database is an archive of images originated from an X-SAR sensor (spaceborne mission, 1994). The best content descriptors, mainly texture parameters, guarantee high retrieval performance and can be extracted without loss independently of image resolution. The latter property allows the DBMS (Database Management System) to process a small amount of information, as in the case of quick-look images, improving time performance and memory access without reducing retrieval accuracy. The matching technique has been designed to enable image management (database population and retrieval) independently of dimensions (width and height). Local and global content descriptors are compared with the query image during the retrieval phase, and results seem to be very encouraging.
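Content-based retrieval of this kind reduces to extracting a feature vector per image and ranking the archive by distance to the query's features. The descriptors below are crude stand-ins for the texture parameters such a system would use, on tiny invented images:

```python
import math

def texture_features(img):
    """Crude texture descriptors for a grayscale image given as a 2-D list:
    mean, variance, and mean absolute horizontal gradient."""
    pixels = [p for row in img for p in row]
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n
    grad = sum(abs(row[i + 1] - row[i]) for row in img for i in range(len(row) - 1))
    grad /= sum(len(row) - 1 for row in img)
    return (mean, var, grad)

def rank_by_similarity(query, archive):
    """Return archive image names ordered by feature-space distance to the query."""
    qf = texture_features(query)
    return [name for name, _ in
            sorted(archive, key=lambda item: math.dist(qf, texture_features(item[1])))]

smooth = [[10, 10, 11], [10, 11, 10], [11, 10, 10]]   # low-texture scene
rough  = [[0, 50, 0], [50, 0, 50], [0, 50, 0]]        # high-texture scene
query  = [[12, 11, 12], [11, 12, 11], [12, 12, 11]]
print(rank_by_similarity(query, [("rough_scene", rough), ("smooth_scene", smooth)]))
```

A production system would use richer, resolution-invariant descriptors (as the abstract notes) and normalize each feature dimension before computing distances.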
The ID Database: Managing the Instructional Development Process
ERIC Educational Resources Information Center
Piña, Anthony A.; Sanford, Barry K.
2017-01-01
Management is evolving as a foundational domain to the field of instructional design and technology. However, there are few tools dedicated to the management of instructional design and development projects and activities. In this article, we describe the development, features and implementation of an instructional design database--built from a…
Expansion of the MANAGE database with forest and drainage studies
USDA-ARS?s Scientific Manuscript database
The “Measured Annual Nutrient loads from AGricultural Environments” (MANAGE) database was published in 2006 to expand an early 1980’s compilation of nutrient export (load) data from agricultural land uses at the field or farm spatial scale. Then in 2008, MANAGE was updated with 15 additional studie...
76 FR 12617 - Airworthiness Directives; The Boeing Company Model 777-200 and -300 Series Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-08
... installing new operational software for the electrical load management system and configuration database... the electrical load management system operational software and configuration database software, in... Management, P.O. Box 3707, MC 2H-65, Seattle, Washington 98124-2207; telephone 206- 544-5000, extension 1...
Source-to-Outcome Microbial Exposure and Risk Modeling Framework
A Quantitative Microbial Risk Assessment (QMRA) is a computer-based data-delivery and modeling approach that integrates interdisciplinary fate/transport, exposure, and impact models and databases to characterize potential health impacts/risks due to pathogens. As such, a QMRA ex...
Medication exposure and spontaneous abortion: a case-control study using a French medical database.
Abadie, D; Hurault-Delarue, C; Damase-Michel, C; Montastruc, J L; Lacroix, I
2015-01-01
Few studies have been conducted to investigate drug effects on spontaneous abortion risk. The objective of the present study was to evaluate the potential association between first trimester drug exposure and spontaneous abortion occurrence. The authors performed a nested case-control study using data from TERAPPEL, a French medical database. Cases were the women who had a spontaneous abortion (before the 22nd week of amenorrhea) and controls were women who gave birth to a child. Analyzed variables were: maternal age, obstetric history, tobacco, and alcohol and drug consumption during the first trimester of pregnancy. For comparison of drug exposures between cases and controls, the authors calculated odds ratios (ORs) by means of multivariate logistic regressions adjusted on age and on other drug exposures. The study included 838 cases and 4,508 controls that were identified in the database. In adjusted analyses, cases were more exposed than controls to "non-selective monoamine reuptake inhibitors" [OR=2.2 (CI 95% 1.5-3.3)], "antiprotozoals" [OR = 1.6 (CI 95% 1.1 - 2.5)] and "centrally acting antiobesity products" [OR = 3.4 (CI 95% 1.9 - 6.2)]. Conversely, controls were more exposed than cases to H1 antihistamines [OR = 0.6 (CI 95% 0.4 - 0.9)]. This exploratory study highlights some potential associations between first trimester drug exposure and risk of spontaneous abortion. Further studies have to be carried out to investigate these findings.
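Behind each reported OR is a comparison of exposure odds between cases and controls. The study's ORs are adjusted via multivariate logistic regression, but the basic 2x2 calculation, with a Woolf-type 95% confidence interval, looks like this (counts invented for illustration):

```python
import math

def odds_ratio_ci(exposed_cases, unexposed_cases, exposed_controls, unexposed_controls):
    """Unadjusted odds ratio OR = (a*d)/(b*c) with a Woolf 95% CI:
    exp(ln OR +/- 1.96 * sqrt(1/a + 1/b + 1/c + 1/d))."""
    a, b = exposed_cases, unexposed_cases
    c, d = exposed_controls, unexposed_controls
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Invented counts, roughly sized to the study's 838 cases / 4,508 controls:
or_, lo, hi = odds_ratio_ci(40, 798, 95, 4413)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

An association is conventionally read as statistically significant when the CI excludes 1, as with the drug classes highlighted above.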
Solving Relational Database Problems with ORDBMS in an Advanced Database Course
ERIC Educational Resources Information Center
Wang, Ming
2011-01-01
This paper introduces how to use the object-relational database management system (ORDBMS) to solve relational database (RDB) problems in an advanced database course. The purpose of the paper is to provide a guideline for database instructors who desire to incorporate the ORDB technology in their traditional database courses. The paper presents…
Migration from relational to NoSQL database
NASA Astrophysics Data System (ADS)
Ghotiya, Sunita; Mandal, Juhi; Kandasamy, Saravanakumar
2017-11-01
Data generated by real-time applications, social networking sites, and sensor devices is very large in volume and unstructured, which makes it difficult for relational database management systems to handle. Data is a precious component of any application and needs to be analysed after being arranged in some structure. Relational databases can deal only with structured data, so there is a need for NoSQL database management systems, which can also deal with semi-structured data. Relational databases provide the easiest way to manage data, but as the use of NoSQL increases it is becoming necessary to migrate data from relational to NoSQL databases. Various frameworks have been proposed previously that provide mechanisms for migrating data stored in SQL warehouses, as well as middle-layer solutions that allow unstructured data to be stored in NoSQL databases. This paper provides a literature review of some recent approaches proposed by various researchers to migrate data from relational to NoSQL databases; some researchers have instead proposed mechanisms for the co-existence of NoSQL and relational databases. The paper summarises mechanisms for mapping data stored in relational databases to NoSQL databases, along with various techniques for data transformation and middle-layer solutions.
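The core of most relational-to-NoSQL migration mechanisms surveyed here is denormalisation: rows joined across a one-to-many relation become one nested document, the shape a document-oriented store expects. A minimal sketch with an invented schema and data:

```python
import json
import sqlite3

# Relational source: a one-to-many customer/orders schema (invented).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
INSERT INTO customer VALUES (1, 'Asha'), (2, 'Binh');
INSERT INTO orders VALUES (10, 1, 25.0), (11, 1, 40.0), (12, 2, 9.5);
""")

def to_documents(con):
    """Fold each customer row and its order rows into one nested document."""
    docs = []
    for cid, name in con.execute("SELECT id, name FROM customer"):
        orders = [{"id": oid, "total": total}
                  for oid, total in con.execute(
                      "SELECT id, total FROM orders WHERE customer_id = ?", (cid,))]
        docs.append({"_id": f"customer:{cid}", "name": name, "orders": orders})
    return docs

docs = to_documents(con)
print(json.dumps(docs[0]))
```

Embedding suits relations that are read together; migration frameworks typically also support referencing (storing only the related keys) when the child set is large or shared.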
Delaney-Black, V; Covington, C; Templin, T; Ager, J; Martier, S; Compton, S; Sokol, R
1998-06-21
Despite media reports and educators' concerns, little substantive data have been published to document or refute the emerging reports that children prenatally exposed to cocaine have serious behavioral problems in school. Recent pilot data from this institution have indeed demonstrated teacher-reported problem behaviors following prenatal cocaine exposure after controlling for the effects of prenatal alcohol use and cigarette exposure. Imperative in the study of prenatal exposure and child outcome is an acknowledgement of the influence of other control factors such as postnatal environment, secondary exposures, and parenting issues. We report preliminary evaluation from a large ongoing historical prospective study of prenatal cocaine exposure on school-age outcomes. The primary aim of this NIDA-funded study is to determine if a relationship exists between prenatal cocaine/alcohol exposures and school behavior and, if so, to determine if the relationship is characterized by a dose-response relationship. A secondary aim evaluates the relationship between prenatal cocaine/alcohol exposures and school achievement. Both relationships will be assessed in a black, urban sample of first grade students using multivariate statistical techniques for confounding as well as mediating and moderating prenatal and postnatal variables. A third aim is to evaluate the relationship between a general standardized classroom behavioral measure and a tool designed to tap the effects thought to be specific to prenatal cocaine exposure. This interdisciplinary research team can address these aims because of the existence of a unique, prospectively collected perinatal database, funded in part by NIAAA and NICHD. The database includes repeated measures of cocaine, alcohol, and other substances for over 3,500 births since 1986. Information from this database is combined with information from the database of one of the largest public school systems in the nation.
The final sample will be composed of over 600 first grade students for whom the independent variables, prenatal cocaine/alcohol exposures, were prospectively assessed and quantified at the university maternity center. After informed consent, the primary dependent variable, school behavior, is assessed using the PROBS-14 (a teacher consensus developed instrument), the Child Behavior Check List, and the Conners' Teacher Rating Scale. The secondary dependent measure, school achievement, is measured by the Metropolitan Achievement Test and the Test of Early Reading Ability. Control variables, such as the environment and parenting, are measured by several instruments aimed at capturing the child and family ecology since birth. All analyses will be adjusted as appropriate for prospectively gathered control variables such as perinatal risk, neonatal risk, and other prenatal drug and cigarette exposures. Further adjustment will be made for postnatal social risk factors which may influence outcome. Of particular concern are characteristics of the home (adaptation of HOME), parent (depression, stress), and neighborhood (violence exposure). Finally, postnatal exposure to lead and other drugs is being considered.
... Disasters and Public Health Emergencies The NLM Disaster Information Management Research Center has tools, guides, and databases to ...
Construction of In-house Databases in a Corporation
NASA Astrophysics Data System (ADS)
Dezaki, Kyoko; Saeki, Makoto
Rapid progress in advanced information technology has increased the need to strengthen documentation activities in industry. In response, Tokin Corporation has been engaged in constructing databases of patent information, technical reports, and other documents accumulated inside the company. Two systems have resulted: TOPICS, an in-house patent information management system, and TOMATIS, a management and technical information system built on personal computers and general-purpose relational database software. These systems aim at compiling databases of patent and technological management information generated internally and externally, at low labor effort and low cost, and at providing comprehensive information company-wide. This paper introduces the outline of these systems and how they are actually used.
Ahmadi, Farshid Farnood; Ebadi, Hamid
2009-01-01
3D spatial data acquired from aerial and remote sensing images by photogrammetric techniques is one of the most accurate and economic data sources for GIS, map production, and spatial data updating. However, there are still many problems concerning storage, structuring and appropriate management of spatial data obtained using these techniques. According to the capabilities of spatial database management systems (SDBMSs); direct integration of photogrammetric and spatial database management systems can save time and cost of producing and updating digital maps. This integration is accomplished by replacing digital maps with a single spatial database. Applying spatial databases overcomes the problem of managing spatial and attributes data in a coupled approach. This management approach is one of the main problems in GISs for using map products of photogrammetric workstations. Also by the means of these integrated systems, providing structured spatial data, based on OGC (Open GIS Consortium) standards and topological relations between different feature classes, is possible at the time of feature digitizing process. In this paper, the integration of photogrammetric systems and SDBMSs is evaluated. Then, different levels of integration are described. Finally design, implementation and test of a software package called Integrated Photogrammetric and Oracle Spatial Systems (IPOSS) is presented.
Yoo, Do Hyeon; Shin, Wook-Geun; Lee, Jaekook; Yeom, Yeon Soo; Kim, Chan Hyeong; Chang, Byung-Uck; Min, Chul Hee
2017-11-01
After the Fukushima accident in Japan, the Korean Government implemented the "Act on Protective Action Guidelines Against Radiation in the Natural Environment" to regulate unnecessary radiation exposure to the public. However, despite the law, which came into effect in July 2012, an appropriate method to evaluate the equivalent and effective doses from naturally occurring radioactive material (NORM) in consumer products is not available. The aim of the present study is to develop and validate an effective dose coefficient database enabling the simple and correct evaluation of the effective dose due to the usage of NORM-added consumer products. To construct the database, we used a skin source method with a computational human phantom and Monte Carlo (MC) simulation. For validation, the effective dose was compared between the database, using an interpolation method, and the original MC method. Our results showed similar equivalent doses across the 26 organs, with average doses from the database and the MC calculations differing by < 5%. The differences in the effective doses were even smaller, and the results generally show that equivalent and effective doses can be quickly calculated with the database with sufficient accuracy. Copyright © 2017 Elsevier Ltd. All rights reserved.
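The interpolation step described above can be sketched in a few lines; the energies and coefficient values below are invented placeholders rather than NORM data, and the function name is ours:

```python
# Sketch: effective-dose lookup by linear interpolation over a precomputed
# (energy, coefficient) table, standing in for re-running a full MC simulation.
# Energies and coefficients are illustrative placeholders only.

def interpolate_dose_coefficient(energy_mev, table):
    """Linearly interpolate a dose coefficient for a photon energy
    from a table of (energy, coefficient) pairs sorted by energy."""
    energies = [e for e, _ in table]
    if not table or energy_mev < energies[0] or energy_mev > energies[-1]:
        raise ValueError("energy outside tabulated range")
    for (e0, c0), (e1, c1) in zip(table, table[1:]):
        if e0 <= energy_mev <= e1:
            frac = (energy_mev - e0) / (e1 - e0)
            return c0 + frac * (c1 - c0)

TABLE = [(0.1, 1.0e-16), (0.5, 4.0e-16), (1.0, 7.0e-16)]
print(interpolate_dose_coefficient(0.75, TABLE))  # midway between 4.0e-16 and 7.0e-16
```

A real database would tabulate coefficients per organ and source geometry; the lookup logic stays the same.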
Personal Database Management System I TRIAS
NASA Astrophysics Data System (ADS)
Yamamoto, Yoneo; Kashihara, Akihiro; Kawagishi, Keisuke
The current paper presents TRIAS (TRIple Associative System), a database management system for personal use. To implement TRIAS, we have developed an associative database whose format is (e,a,v): e for entity, a for attribute, v for value. ML-TREE is used to construct (e,a,v). ML-TREE is a variant of the B+-tree, a multiway balanced tree. The paper focuses mainly on the usage of the associative database, demonstrating how to use basic commands, primary functions and applications.
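As a rough illustration of the (e,a,v) format, here is a minimal associative store; the real TRIAS indexes triples with ML-TREE, whereas this sketch simply scans a list, and all names are our own:

```python
# Minimal (entity, attribute, value) associative store in the spirit of TRIAS.
# TRIAS uses an ML-TREE index; this sketch just filters a list of triples.

class AssociativeDB:
    def __init__(self):
        self.triples = []

    def insert(self, e, a, v):
        self.triples.append((e, a, v))

    def query(self, e=None, a=None, v=None):
        """Return all triples matching the given components; None is a wildcard."""
        return [t for t in self.triples
                if (e is None or t[0] == e)
                and (a is None or t[1] == a)
                and (v is None or t[2] == v)]

db = AssociativeDB()
db.insert("doc1", "author", "Yamamoto")
db.insert("doc1", "year", 1990)
db.insert("doc2", "author", "Kashihara")
print(db.query(a="author"))  # every (entity, "author", value) triple
```

Leaving any component as a wildcard is what makes the triple format convenient for ad-hoc personal queries.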
The ADAMS interactive interpreter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rietscha, E.R.
1990-12-17
The ADAMS (Advanced DAta Management System) project is exploring next generation database technology. Database management does not follow the usual programming paradigm. Instead, the database dictionary provides an additional name space environment that should be interactively created and tested before writing application code. This document describes the implementation and operation of the ADAMS Interpreter, an interactive interface to the ADAMS data dictionary and runtime system. The Interpreter executes individual statements of the ADAMS Interface Language, providing a fast, interactive mechanism to define and access persistent databases. 5 refs.
NASA Technical Reports Server (NTRS)
Maluf, David A.; Tran, Peter B.
2003-01-01
Object-Relational database management system is an integrated hybrid cooperative approach to combine the best practices of both the relational model utilizing SQL queries and the object-oriented, semantic paradigm for supporting complex data creation. In this paper, a highly scalable, information on demand database framework, called NETMARK, is introduced. NETMARK takes advantage of the Oracle 8i object-relational database using physical addresses data types for very efficient keyword search of records spanning across both context and content. NETMARK was originally developed in early 2000 as a research and development prototype to solve the vast amounts of unstructured and semi-structured documents existing within NASA enterprises. Today, NETMARK is a flexible, high-throughput open database framework for managing, storing, and searching unstructured or semi-structured arbitrary hierarchical models, such as XML and HTML.
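The context-and-content keyword search idea can be illustrated with a small sketch over a nested document; this is not the NETMARK/Oracle 8i implementation, and the document and function names are invented:

```python
# Sketch of keyword search spanning "context" (the element path) and
# "content" (the text), as described for NETMARK. A nested dict stands in
# for a parsed XML/HTML document; names and data are illustrative only.

def search(node, keyword, path=""):
    """Walk a nested dict/str structure; return (path, text) for every leaf
    whose tag path or text contains the keyword."""
    hits = []
    if isinstance(node, dict):
        for tag, child in node.items():
            hits += search(child, keyword, f"{path}/{tag}")
    else:
        text = str(node)
        if keyword in path or keyword in text:
            hits.append((path, text))
    return hits

doc = {"report": {"title": "Shuttle thermal tiles",
                  "body": {"section": "tile bonding procedure"}}}
print(search(doc, "tile"))  # matches in both title text and section text
```

Matching on the path as well as the text is what lets a single keyword query cut across document structure and document body.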
An Extensible Schema-less Database Framework for Managing High-throughput Semi-Structured Documents
NASA Technical Reports Server (NTRS)
Maluf, David A.; Tran, Peter B.; La, Tracy; Clancy, Daniel (Technical Monitor)
2002-01-01
Object-Relational database management system is an integrated hybrid cooperative approach to combine the best practices of both the relational model utilizing SQL queries and the object oriented, semantic paradigm for supporting complex data creation. In this paper, a highly scalable, information on demand database framework, called NETMARK, is introduced. NETMARK takes advantage of the Oracle 8i object-relational database using physical addresses data types for very efficient keyword searches of records for both context and content. NETMARK was originally developed in early 2000 as a research and development prototype to solve the vast amounts of unstructured and semi-structured documents existing within NASA enterprises. Today, NETMARK is a flexible, high-throughput open database framework for managing, storing, and searching unstructured or semi-structured arbitrary hierarchical models, such as XML and HTML.
Rosset, Saharon; Aharoni, Ehud; Neuvirth, Hani
2014-07-01
Issues of publication bias, lack of replicability, and false discovery have long plagued the genetics community. Proper utilization of public and shared data resources presents an opportunity to ameliorate these problems. We present an approach to public database management that we term Quality Preserving Database (QPD). It enables perpetual use of the database for testing statistical hypotheses while controlling false discovery and avoiding publication bias on the one hand, and maintaining testing power on the other hand. We demonstrate it on a use case of a replication server for GWAS findings, underlining its practical utility. We argue that a shift to using QPD in managing current and future biological databases will significantly enhance the community's ability to make efficient and statistically sound use of the available data resources. © 2014 WILEY PERIODICALS, INC.
Marshall, Leisa L; Peasah, Samuel; Stevens, Gregg A
2017-01-01
Provide a systematic review of the primary literature on efforts to reduce Clostridium difficile infection (CDI) occurrence and improve outcomes in older adults. PubMed and CINAHL databases were searched for research studies using the search terms CDI, CDI prevention, reduction, control, management, geriatric, elderly, and adults 65 years of age and older. The MeSH categories Aged and Aged, 80 and over, were used. A second search of PubMed, CINAHL, National Guideline Clearinghouse, and TRIP databases was conducted for primary, secondary, and tertiary literature on CDI epidemiology, burden, and management in adults of all ages, and on prevention and management guidelines. Of the 2,263 articles located, 105 were selected for full review: 55 primary and 50 secondary/tertiary. Primary literature selected for full review included studies of interventions to prevent, reduce the occurrence of, control, manage, or improve outcomes in adults 65 years of age and older. Patient settings included the community, assisted living, nursing facilities, subacute care, or hospitals. The main outcome measure for research studies was whether the studied intervention prevented, reduced the occurrence of, controlled, managed, or improved outcomes. Studies were conducted in acute or long-term hospitals, with a few in nursing facilities. Interventions that prevented or reduced CDI included antibiotic policy changes, education, procedure changes, infection control, and multi-intervention approaches. There were few management studies for adults 65 years of age and older, or for all adults with results stratified by age. Treatments studied included the efficacy of fidaxomicin, metronidazole, vancomycin, and fecal microbiota transplant. Though clinical outcomes were slightly less robust in those 65 years of age and older, age was not an independent predictor of success or failure.
The current prevention and management guidelines for adults of all ages, as well as special considerations in skilled nursing facilities, extracted from the secondary/tertiary literature selected, are summarized. There are a limited number of studies designed for older adults. Our findings suggest that guideline recommendations for adults are adequate and appropriate for older adults. Exposure to antibiotics and Clostridium difficile remain the two major risk factors for CDI, reinforcing the importance of antibiotic stewardship and infection control.
VIEWCACHE: An incremental pointer-based access method for autonomous interoperable databases
NASA Technical Reports Server (NTRS)
Roussopoulos, N.; Sellis, Timos
1992-01-01
One of the biggest problems facing NASA today is providing scientists with efficient access to a large number of distributed databases. Our pointer-based incremental database access method, VIEWCACHE, provides such an interface for accessing distributed data sets and directories. VIEWCACHE allows database browsing and searches that perform inter-database cross-referencing with no actual data movement between database sites. This organization and processing is especially suitable for managing astrophysics databases which are physically distributed all over the world. Once the search is complete, the set of collected pointers to the desired data is cached. VIEWCACHE includes spatial access methods for accessing image data sets, which provide much easier query formulation by referring directly to the image and very efficient search for objects contained within a two-dimensional window. We will develop and optimize a VIEWCACHE External Gateway Access to database management systems to facilitate distributed database search.
Automation of PCXMC and ImPACT for NASA Astronaut Medical Imaging Dose and Risk Tracking
NASA Technical Reports Server (NTRS)
Bahadori, Amir; Picco, Charles; Flores-McLaughlin, John; Shavers, Mark; Semones, Edward
2011-01-01
To automate astronaut organ and effective dose calculations from occupational X-ray and computed tomography (CT) examinations incorporating PCXMC and ImPACT tools and to estimate the associated lifetime cancer risk per the National Council on Radiation Protection & Measurements (NCRP) using MATLAB(R). Methods: NASA follows guidance from the NCRP on its operational radiation safety program for astronauts. NCRP Report 142 recommends that astronauts be informed of the cancer risks from reported exposures to ionizing radiation from medical imaging. MATLAB(R) code was written to retrieve exam parameters for medical imaging procedures from a NASA database, calculate associated dose and risk, and return results to the database, using the Microsoft .NET Framework. This code interfaces with the PCXMC executable and emulates the ImPACT Excel spreadsheet to calculate organ doses from X-rays and CTs, respectively, eliminating the need to utilize the PCXMC graphical user interface (except for a few special cases) and the ImPACT spreadsheet. Results: Using MATLAB(R) code to interface with PCXMC and replicate ImPACT dose calculation allowed for rapid evaluation of multiple medical imaging exams. The user inputs the exam parameter data into the database and runs the code. Based on the imaging modality and input parameters, the organ doses are calculated. Output files are created for record, and organ doses, effective dose, and cancer risks associated with each exam are written to the database. Annual and post-flight exposure reports, which are used by the flight surgeon to brief the astronaut, are generated from the database. Conclusions: Automating PCXMC and ImPACT for evaluation of NASA astronaut medical imaging radiation procedures allowed for a traceable and rapid method for tracking projected cancer risks associated with over 12,000 exposures. 
This code will be used to evaluate future medical radiation exposures, and can easily be modified to accommodate changes to the risk calculation procedure.
PERFORMANCE AUDITING OF A HUMAN AIR POLLUTION EXPOSURE SYSTEM FOR PM2.5
Databases derived from human health effects research play a vital role in setting environmental standards. An underlying assumption in using these databases for standard setting purposes is that they are of adequate quality. The performance auditing program described in this ma...
EPA's Integrated Risk Information System (IRIS) database was developed and is maintained by EPA's Office of Research and Developement, National Center for Environmental Assessment. IRIS is a database of human health effects that may result from exposure to various substances fou...
A case study for a digital seabed database: Bohai Sea engineering geology database
NASA Astrophysics Data System (ADS)
Tianyun, Su; Shikui, Zhai; Baohua, Liu; Ruicai, Liang; Yanpeng, Zheng; Yong, Wang
2006-07-01
This paper discusses the design of an ORACLE-based Bohai Sea engineering geology database, covering requirements analysis, conceptual structure analysis, logical structure analysis, physical structure analysis and security design. In the study, we used the object-oriented Unified Modeling Language (UML) to model the conceptual structure of the database, and used the powerful data management functions that the object-relational database ORACLE provides to organize and manage the storage space and improve its security performance. By this means, the database can provide rapid and highly effective performance in data storage, maintenance and query to satisfy the application requirements of the Bohai Sea Oilfield Paradigm Area Information System.
Geoscience research databases for coastal Alabama ecosystem management
Hummell, Richard L.
1995-01-01
Effective management of complex coastal ecosystems necessitates access to scientific knowledge that can be acquired through a multidisciplinary approach involving Federal and State scientists that takes advantage of agency expertise and resources for the benefit of all participants working toward a set of common research and management goals. Cooperative geoscientific investigations have led toward building databases of fundamental scientific knowledge that can be utilized to manage coastal Alabama's natural resources and future development. These databases have been used to assess the occurrence and economic potential of hard mineral resources in the Alabama EEZ, and to support oil spill contingency planning and environmental analysis for coastal Alabama.
Using Virtual Servers to Teach the Implementation of Enterprise-Level DBMSs: A Teaching Note
ERIC Educational Resources Information Center
Wagner, William P.; Pant, Vik
2010-01-01
One of the areas where demand has remained strong for MIS students is in the area of database management. Since the early days, this topic has been a mainstay in the MIS curriculum. Students of database management today typically learn about relational databases, SQL, normalization, and how to design and implement various kinds of database…
Towards G2G: Systems of Technology Database Systems
NASA Technical Reports Server (NTRS)
Maluf, David A.; Bell, David
2005-01-01
We present an approach and methodology for developing Government-to-Government (G2G) Systems of Technology Database Systems. G2G will deliver technologies for distributed and remote integration of technology data for internal use in analysis and planning as well as for external communications. G2G enables NASA managers, engineers, operational teams and information systems to "compose" technology roadmaps and plans by selecting, combining, extending, specializing and modifying components of technology database systems. G2G will interoperate information and knowledge distributed across the organizational entities involved, which is ideal for NASA's future Exploration Enterprise. Key contributions of the G2G system will include the creation of an integrated approach to sustain effective management of technology investments that supports the ability of various technology database systems to be independently managed. The integration technology will comply with emerging open standards. Applications can thus be customized for local needs while enabling an integrated management-of-technology approach that serves the global needs of NASA. The G2G capabilities will use NASA's breakthrough in database "composition" and integration technology, will use and advance emerging open standards, and will use commercial information technologies to enable effective Systems of Technology Database systems.
Adopting a corporate perspective on databases. Improving support for research and decision making.
Meistrell, M; Schlehuber, C
1996-03-01
The Veterans Health Administration (VHA) is at the forefront of designing and managing health care information systems that accommodate the needs of clinicians, researchers, and administrators at all levels. Rather than using one single-site, centralized corporate database, VHA has constructed several large databases with different configurations to meet the needs of users with different perspectives. The largest VHA database is the Decentralized Hospital Computer Program (DHCP), a multisite, distributed data system that uses decoupled hospital databases. The centralization of DHCP policy has promoted data coherence, whereas the decentralization of DHCP management has permitted system development to be done with maximum relevance to the users' local practices. A more recently developed VHA data system, the Event Driven Reporting system (EDR), uses multiple, highly coupled databases to provide workload data at facility, regional, and national levels. The EDR automatically posts a subset of DHCP data to local and national VHA management. The development of the EDR illustrates how adoption of a corporate perspective can offer significant database improvements at reasonable cost and with modest impact on the legacy system.
Technical Aspects of Interfacing MUMPS to an External SQL Relational Database Management System
Kuzmak, Peter M.; Walters, Richard F.; Penrod, Gail
1988-01-01
This paper describes an interface connecting InterSystems MUMPS (M/VX) to an external relational DBMS, the SYBASE Database Management System. The interface enables MUMPS to operate in a relational environment and gives the MUMPS language full access to a complete set of SQL commands. MUMPS generates SQL statements as ASCII text and sends them to the RDBMS. The RDBMS executes the statements and returns ASCII results to MUMPS. The interface suggests that the language features of MUMPS make it an attractive tool for use in the relational database environment. The approach described in this paper separates MUMPS from the relational database. Positioning the relational database outside of MUMPS promotes data sharing and permits a number of different options to be used for working with the data. Other languages like C, FORTRAN, and COBOL can access the RDBMS database. Advanced tools provided by the relational database vendor can also be used. SYBASE is an advanced high-performance transaction-oriented relational database management system for the VAX/VMS and UNIX operating systems. SYBASE is designed using a distributed open-systems architecture, and is relatively easy to interface with MUMPS.
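The pass-SQL-text-in, get-ASCII-back pattern described above can be mimicked with Python's built-in sqlite3 standing in for SYBASE; the table, data, and helper name are illustrative assumptions:

```python
import sqlite3

# Sketch of the interface pattern above: the client side (MUMPS, in the paper)
# sends an SQL statement as plain text and receives results as ASCII lines.
# sqlite3 stands in for the external RDBMS; schema and data are invented.

def execute_sql_text(conn, sql_text):
    """Execute an SQL statement received as text and return the result set
    rendered as tab-delimited ASCII, one row per line."""
    cur = conn.execute(sql_text)
    return "\n".join("\t".join(str(v) for v in row) for row in cur.fetchall())

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patients (id INTEGER, name TEXT)")
conn.execute("INSERT INTO patients VALUES (1, 'Smith'), (2, 'Jones')")
reply = execute_sql_text(conn, "SELECT id, name FROM patients ORDER BY id")
print(reply)  # two tab-delimited rows
```

Because only text crosses the boundary, any language able to send and receive strings can use the same database, which is exactly the data-sharing argument the paper makes.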
Hot wet spots of Swiss buildings - detecting clusters of flood exposure
NASA Astrophysics Data System (ADS)
Röthlisberger, Veronika; Zischg, Andreas; Keiler, Margreth
2016-04-01
Where are the hotspots of flood exposure in Switzerland? There is no single answer but rather a wide range of findings depending on the databases and methods used. In principle, the analysis of flood exposure is the overlay of two spatial datasets, one on flood hazard and one on assets, e.g. buildings. The presented study aims to test a newly developed approach based on publicly available Swiss data. On the hazard side, these are two different types of flood hazard maps, each representing a similar return period beyond the dimensioning of structural protection systems. On the asset side, we use nationwide harmonized data on buildings, namely a complete dataset of building polygons to which we assign features such as volume, residents and monetary value. For the latter we apply findings of multivariate analyses of insurance data. By overlaying building polygons with the flood hazard map we identify the exposed buildings. We analyse the resulting spatial distribution of flood exposure at different levels of scale (local to regional) using administrative units (e.g. municipalities) but also artificial grids of corresponding size (e.g. 5,000 m). The presentation focuses on the identification of hotspots, highlighting the influence of the applied data and methods, e.g. local scan statistics testing intensities within and outside potential clusters, or log relative exposure surfaces based on kernel intensity estimates. We find a major difference between hotspots identified from absolute and from normalized values of exposure. Whereas the hotspots of flood exposure in absolute figures mirror the underlying distribution of buildings, the hotspots of flood exposure ratios show very different pictures. We conclude that findings on flood exposure vary depending on the data and, moreover, the methods used, and therefore need to be communicated carefully and appropriately to the different stakeholders who may use the information for decision making on flood risk management.
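The contrast between absolute and normalized hotspots can be shown with a toy calculation; the unit names and building counts below are invented for illustration:

```python
# Toy version of the absolute-versus-ratio hotspot contrast described above:
# the same exposure data ranked by exposed-building count and by exposure
# ratio can name different hotspots. All figures are invented.

def hotspots(units):
    """units: {name: (exposed_buildings, total_buildings)}.
    Return (unit with most exposed buildings, unit with highest ratio)."""
    by_count = max(units, key=lambda u: units[u][0])
    by_ratio = max(units, key=lambda u: units[u][0] / units[u][1])
    return by_count, by_ratio

units = {"Bern": (400, 40000), "Alpine village": (60, 300)}
print(hotspots(units))  # a large city wins on counts, a small village on ratio
```

This is the study's point in miniature: absolute counts follow the building stock, while ratios highlight places where a large share of the stock is exposed.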
Consumer exposure scenarios: development, challenges and possible solutions.
Van Engelen, J G M; Heinemeyer, G; Rodriguez, C
2007-12-01
Exposure scenarios (ES) under REACH (Registration, Evaluation, and Authorisation of Chemicals; new EU legislation) aim to describe safe conditions of product and substance use. Both operational conditions and risk management measures (RMMs) are part of the ES. For consumer use of chemicals, one of the challenges will be to identify all of the consumer uses of a given chemical and then quantify the exposure derived from each of them. Product use categories can be established to identify in a systematic fashion how products are used. These product categories comprise products that are used similarly (e.g. paints, adhesives). They deliver information about product use characteristics, and provide an easy-to-handle tool for exchanging standardised information. For practical reasons, broad ES will have to be developed, which cover a wide range of products and use. The challenge will be to define them broadly, but not in a way that they provide such an overestimation of exposure that a next iteration or a more complex model is always needed. Tiered and targeted approaches for estimation of exposure at the right level of detail may offer the best solution. RMMs relevant for consumers include those inherent to product design (controllable) and those that are communicated to consumers as directions for use (non-controllable). Quantification of the effect of non-controllable RMMs on consumer exposure can prove to be difficult. REACH requires aggregation of exposure from all relevant identified sources. Development of appropriate methodology for realistic aggregation of exposure will be no small challenge and will likely require probabilistic approaches and comprehensive databases on populations' habits, practices and behaviours. REACH regulation aims at controlling the use of chemicals so that exposure to every chemical can be demonstrated to be safe for consumers, workers, and the environment when considered separately, but also when considered in an integrated way. 
This integration will be another substantial challenge for the future.
DEVELOPING MEANINGFUL COHORTS FOR HUMAN EXPOSURE MODELS
This paper summarizes numerous statistical analyses focused on the U.S. Environmental Protection Agency's Consolidated Human Activity Database (CHAD), used by many exposure modelers as the basis for data on what people do and where they spend their time. In doing so, modelers ...
Data base management system for lymphatic filariasis--a neglected tropical disease.
Upadhyayula, Suryanaryana Murty; Mutheneni, Srinivasa Rao; Kadiri, Madhusudhan Rao; Kumaraswamy, Sriram; Nelaturu, Sarat Chandra Babu
2012-01-01
Researchers working in the area of public health are confronted with large volumes of data on various aspects of entomology and epidemiology. Obtaining the relevant information from these data requires a dedicated database management system. In this paper, we describe the usage of the database we developed on lymphatic filariasis. This database application is developed using the Model View Controller (MVC) architecture, with MySQL as the database and a web-based interface. We have collected and incorporated data on filariasis in the database from the Karimnagar, Chittoor, and East and West Godavari districts of Andhra Pradesh, India. The purpose of this database is to store the collected data, retrieve information, and produce various combinational reports on filarial aspects, which in turn will help public health officials understand the burden of disease in a particular locality. This information is likely to have an imperative role in decision making for effective control of filarial disease and integrated vector management operations.
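A combinational report of the kind described can be sketched with a small relational query; the schema and case counts are invented, and sqlite3 stands in for the MySQL database behind the MVC web interface:

```python
import sqlite3

# Illustrative sketch of a "combinational report": survey records aggregated
# by district. Schema, villages, and case counts are invented; the real
# application uses MySQL behind an MVC web interface.

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE surveys
                (district TEXT, village TEXT, positive_cases INTEGER)""")
conn.executemany("INSERT INTO surveys VALUES (?, ?, ?)",
                 [("Karimnagar", "V1", 12), ("Karimnagar", "V2", 7),
                  ("Chittoor", "V3", 4)])
report = conn.execute("""SELECT district, SUM(positive_cases)
                         FROM surveys GROUP BY district
                         ORDER BY district""").fetchall()
print(report)  # [('Chittoor', 4), ('Karimnagar', 19)]
```

Grouping the same table by village, season, or vector species would yield the other report combinations the abstract alludes to.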
Owens, John
2009-01-01
Technological advances in the acquisition of DNA and protein sequence information and the resulting onrush of data can quickly overwhelm the scientist unprepared for the volume of information that must be evaluated and carefully dissected to discover its significance. Few laboratories have the luxury of dedicated personnel to organize, analyze, or consistently record a mix of arriving sequence data. A methodology based on a modern relational-database manager is presented that is both a natural storage vessel for antibody sequence information and a conduit for organizing and exploring sequence data and accompanying annotation text. The expertise necessary to implement such a plan is equal to that required by electronic word processors or spreadsheet applications. Antibody sequence projects maintained as independent databases are selectively unified by the relational-database manager into larger database families that contribute to local analyses, reports, interactive HTML pages, or exported to facilities dedicated to sophisticated sequence analysis techniques. Database files are transposable among current versions of Microsoft, Macintosh, and UNIX operating systems.
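The idea of independent project databases unified into a queryable family can be sketched as follows; the project records, sequences, and function name are invented for illustration:

```python
# Sketch of "independent databases unified into a family": each antibody
# project keeps its own records, and a family-level query searches them all.
# Sequences, identifiers, and annotations are invented.

project_a = [{"id": "mAb-1", "seq": "EVQLVESGGG", "note": "anti-X"}]
project_b = [{"id": "mAb-7", "seq": "QVQLQESGPG", "note": "anti-Y"}]

def family_query(projects, motif):
    """Search every project database for sequences containing a motif."""
    return [rec["id"] for db in projects for rec in db if motif in rec["seq"]]

print(family_query([project_a, project_b], "VQL"))  # hits from both projects
```

Keeping projects as separate tables while querying across them is what lets one laboratory's scheme scale without forcing every project into a single schema.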
The Majorana Parts Tracking Database
Abgrall, N.; Aguayo, E.; Avignone, F. T.; ...
2015-01-16
The Majorana Demonstrator is an ultra-low background physics experiment searching for the neutrinoless double beta decay of 76Ge. The Majorana Parts Tracking Database is used to record the history of components used in the construction of the Demonstrator. The tracking implementation takes a novel approach based on the schema-free database technology CouchDB. Transportation, storage, and processes undergone by parts such as machining or cleaning are linked to part records. Tracking parts provides a great logistics benefit and an important quality assurance reference during construction. In addition, the location history of parts provides an estimate of their exposure to cosmic radiation. In summary, a web application for data entry and a radiation exposure calculator have been developed as tools for achieving the extreme radio-purity required for this rare decay search.
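The exposure estimate from location history can be sketched as a weighted sum; the locations, rates, and linear model below are our own simplifying assumptions, not the experiment's actual calculator:

```python
# Sketch of a cosmic-ray exposure estimate from a part's location history:
# days at each location weighted by a per-location relative exposure rate.
# Locations, rates, and the linear model are illustrative assumptions.

def surface_exposure(history, rates):
    """history: list of (location, days); rates: location -> relative
    surface-equivalent exposure rate. Returns surface-equivalent days."""
    return sum(days * rates[loc] for loc, days in history)

rates = {"surface lab": 1.0, "underground lab": 0.01}
history = [("surface lab", 30), ("underground lab", 300)]
print(surface_exposure(history, rates))  # about 33 surface-equivalent days
```

Because every transport and storage step is already a record in the parts database, such an estimate falls out of the location history with no extra bookkeeping.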
Deng, Chen-Hui; Zhang, Guan-Min; Bi, Shan-Shan; Zhou, Tian-Yan; Lu, Wei
2011-07-01
This study aims to develop a therapeutic drug monitoring (TDM) network server of tacrolimus for Chinese renal transplant patients, which can facilitate doctors' management of patients' information and provide three levels of prediction. The database management system MySQL was employed to build and manage the database of patients' and doctors' information, and hypertext markup language (HTML) and Java Server Pages (JSP) technology were employed to construct the network server for database management. Based on the population pharmacokinetic model of tacrolimus for Chinese renal transplant patients, the above programming languages were used to construct the population prediction and subpopulation prediction modules. Based on Bayesian principles and maximization of the posterior probability function, an objective function was established and minimized by an optimization algorithm to estimate a patient's individual pharmacokinetic parameters. The network server is shown to have the basic functions for database management and three levels of prediction to aid doctors in optimizing the regimen of tacrolimus for Chinese renal transplant patients.
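The Bayesian step can be illustrated for a single parameter; the one-compartment model, prior, observations, and grid search below are invented stand-ins for the paper's population model and optimizer:

```python
import math

# One-parameter sketch of the Bayesian step above: minimize the negative log
# posterior for an individual clearance CL given observed concentrations.
# The one-compartment model, lognormal prior, data, and grid search are all
# illustrative assumptions, not the study's actual model.

def neg_log_posterior(cl, obs, dose=5.0, v=50.0, cl_prior=(3.0, 0.5), sigma=0.2):
    # Lognormal prior on CL around the population value cl_prior[0].
    nlp = ((math.log(cl) - math.log(cl_prior[0])) ** 2) / (2 * cl_prior[1] ** 2)
    for t, c_obs in obs:
        # One-compartment IV bolus: C(t) = (dose/V) * exp(-(CL/V) * t).
        c_pred = (dose / v) * math.exp(-(cl / v) * t)
        nlp += ((c_obs - c_pred) ** 2) / (2 * sigma ** 2)
    return nlp

obs = [(2.0, 0.085), (8.0, 0.06)]
grid = [0.5 + 0.01 * i for i in range(600)]  # crude stand-in for an optimizer
cl_map = min(grid, key=lambda cl: neg_log_posterior(cl, obs))
print(round(cl_map, 2))
```

With sparse data the prior dominates and the estimate stays near the population value, which is exactly why the server's three prediction levels (population, subpopulation, individual) converge as more observations accumulate.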
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-22
... titled, ``Department of Homeland Security/Federal Emergency Management Agency--006 Citizen Corps Database...) authorities; (5) purpose; (6) routine uses of information; (7) system manager and address; (8) notification... Database'' and retitle it ``DHS/FEMA--006 Citizen Corps Program System of Records.'' FEMA administers the...
NREL: U.S. Life Cycle Inventory Database - Project Management Team
Information about the U.S. Life Cycle Inventory (LCI) Database project management team is listed on this page.
Development of a Relational Database for Learning Management Systems
ERIC Educational Resources Information Center
Deperlioglu, Omer; Sarpkaya, Yilmaz; Ergun, Ertugrul
2011-01-01
In today's world, Web-based Distance Education Systems have great importance. Web-based Distance Education Systems are usually known as Learning Management Systems (LMS). In this article, a database design developed to serve an educational institution as a Learning Management System is described. In this sense, developed Learning…
An Overview of the Object Protocol Model (OPM) and the OPM Data Management Tools.
ERIC Educational Resources Information Center
Chen, I-Min A.; Markowitz, Victor M.
1995-01-01
Discussion of database management tools for scientific information focuses on the Object Protocol Model (OPM) and data management tools based on OPM. Topics include the need for new constructs for modeling scientific experiments, modeling object structures and experiments in OPM, queries and updates, and developing scientific database applications…
Rajavi, Zhale; Safi, Sare; Javadi, Mohammad Ali; Jafarinasab, Mohammad Reza; Feizi, Sepehr; Moghadam, Mohammadreza Sedighi; Jadidi, Khosrow; Babaei, Mahmoud; Shirvani, Armin; Baradaran-Rafii, Alireza; Mohammad-Rabei, Hossein; Ziaei, Hossein; Ghassemi-Broumand, Mohammad; Baher, Siamak Delfaza; Naderi, Mostafa; Panahi-Bazaz, Mahmoodreza; Zarei-Ghanavati, Siamak; Hanjani, Shahriar; Ghasemi, Hassan; Salouti, Ramin; Pakbin, Mojgan; Kheiri, Bahareh
2017-01-01
Purpose: To develop clinical practice guidelines (CPGs) for prevention, diagnosis, treatment and follow-up of ocular injuries caused by exposure to mustard gas. Methods: The clinical questions were designed by the guideline team. Websites and databases including National Guidelines Clearinghouse, National Institute for Clinical Excellence, Cochrane, and PubMed were searched to find related CPGs and explore possible answers to the clinical questions. Since there were no relevant CPGs in the literature, related articles in Persian and English languages were extracted. Each article along with its level of evidence was summarized. Additionally, hand search was performed by looking the reference list of each article. Consequently, recommendations were developed considering the clinical benefits and side effects of each therapeutic modality. The recommendations were re-evaluated in terms of customization criteria. All recommendations along with the related evidence were scored from 1 to 9 by experts from all medical universities of Iran. The level of agreement among the experts was evaluated by analyzing the given scores. Results: The agreement was achieved for all recommendations. The experts suggested a number of minor modifications which were applied to the recommendations. Finally, CPGs were developed with 98 recommendations under three major domains including prevention of injury, diagnosis and management of the acute and delayed-onset mustard gas ocular injuries. Conclusion: Considering the lack of CPGs for the prevention, diagnosis, and management of mustard gas-induced keratitis, these recommendations would be useful to prevent the serious ocular complications of mustard gas and standardize eye care services to the affected individuals. PMID:28299009
Rajavi, Zhale; Safi, Sare; Javadi, Mohammad Ali; Jafarinasab, Mohammad Reza; Feizi, Sepehr; Moghadam, Mohammadreza Sedighi; Jadidi, Khosrow; Babaei, Mahmoud; Shirvani, Armin; Baradaran-Rafii, Alireza; Mohammad-Rabei, Hossein; Ziaei, Hossein; Ghassemi-Broumand, Mohammad; Baher, Siamak Delfaza; Naderi, Mostafa; Panahi-Bazaz, Mahmoodreza; Zarei-Ghanavati, Siamak; Hanjani, Shahriar; Ghasemi, Hassan; Salouti, Ramin; Pakbin, Mojgan; Kheiri, Bahareh
2017-01-01
To develop clinical practice guidelines (CPGs) for prevention, diagnosis, treatment and follow-up of ocular injuries caused by exposure to mustard gas. The clinical questions were designed by the guideline team. Websites and databases including National Guidelines Clearinghouse, National Institute for Clinical Excellence, Cochrane, and PubMed were searched to find related CPGs and explore possible answers to the clinical questions. Since there were no relevant CPGs in the literature, related articles in Persian and English languages were extracted. Each article along with its level of evidence was summarized. Additionally, hand search was performed by looking the reference list of each article. Consequently, recommendations were developed considering the clinical benefits and side effects of each therapeutic modality. The recommendations were re-evaluated in terms of customization criteria. All recommendations along with the related evidence were scored from 1 to 9 by experts from all medical universities of Iran. The level of agreement among the experts was evaluated by analyzing the given scores. The agreement was achieved for all recommendations. The experts suggested a number of minor modifications which were applied to the recommendations. Finally, CPGs were developed with 98 recommendations under three major domains including prevention of injury, diagnosis and management of the acute and delayed-onset mustard gas ocular injuries. Considering the lack of CPGs for the prevention, diagnosis, and management of mustard gas-induced keratitis, these recommendations would be useful to prevent the serious ocular complications of mustard gas and standardize eye care services to the affected individuals.
Graphical user interfaces for symbol-oriented database visualization and interaction
NASA Astrophysics Data System (ADS)
Brinkschulte, Uwe; Siormanolakis, Marios; Vogelsang, Holger
1997-04-01
In this approach, two basic services designed for the engineering of computer-based systems are combined: a symbol-oriented man-machine service and a high-speed database service. The man-machine service is used to build graphical user interfaces (GUIs) for the database service; these interfaces are stored using the database service. The idea is to create a GUI-builder and a GUI-manager for the database service based upon the man-machine service using the concept of symbols. With user-definable and predefined symbols, database contents can be visualized and manipulated in a very flexible and intuitive way. Using the GUI-builder and GUI-manager, a user can build and operate their own graphical user interface for a given database according to their needs without writing a single line of code.
Falcone, U; Gilardi, Luisella; Pasqualini, O; Santoro, S; Coffano, Elena
2010-01-01
Exposure to carcinogens is still widespread in working environments. For the purpose of defining priority of interventions, it is necessary to estimate the number and the geographic distribution of workers potentially exposed to carcinogens. It could therefore be useful to test the use of tools and information sources already available in order to map the distribution of exposure to carcinogens. Formaldehyde is suggested as an example of an occupational carcinogen in this study. The study aimed at verifying and investigating the potential of 3 integrated databases: MATline, CAREX, and company databases resulting from occupational accident and disease claims (INAIL), in order to estimate the number of workers exposed to formaldehyde and map their distribution in the Piedmont Region. The list of manufacturing processes involving exposure to formaldehyde was obtained from MATline; for each process the number of firms and employees were obtained from the INAIL archives. By applying the prevalence of exposed workers obtained with CAREX, an estimate of exposure for each process was determined. A map of the distribution of employees associated with a specific process was produced using ArcView GIS software. It was estimated that more than 13,000 employees are exposed to formaldehyde in the Piedmont Region. The manufacture of furniture was identified as the process with the highest number of workers exposed to formaldehyde (3,130), followed by metal workers (2,301 exposed) and synthetic resin processing (1,391 exposed). The results obtained from the integrated use of databases provide a basis for defining priority of preventive interventions required in the industrial processes involving exposure to carcinogens in the Piedmont Region.
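The linkage arithmetic described above, employee counts per process from claims archives multiplied by CAREX-style exposure prevalences, reduces to a simple per-process product. The employee counts and prevalence fractions below are invented for demonstration (chosen only so the products are plausible).

```python
# Hypothetical employee counts per manufacturing process (INAIL-style archive)
employees_by_process = {
    "furniture manufacture": 12520,
    "metal working": 9204,
    "synthetic resin processing": 4637,
}

# Hypothetical fraction of workers exposed per process (CAREX-style prevalence)
exposure_prevalence = {
    "furniture manufacture": 0.25,
    "metal working": 0.25,
    "synthetic resin processing": 0.30,
}

# Estimated exposed workers = employees x prevalence, per process
exposed = {
    proc: round(n * exposure_prevalence[proc])
    for proc, n in employees_by_process.items()
}
total_exposed = sum(exposed.values())
```

Each per-process estimate can then be joined to firm locations for GIS mapping.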
Databases derived from human health effects research play a vital role in setting environmental standards. An underlying assumption in using these databases for standard setting purposes is that they are of adequate quality. The performance auditing program described in this ma...
ERIC Educational Resources Information Center
Blair, John C., Jr.
1982-01-01
Outlines the important factors to be considered in selecting a database management system for use with a microcomputer and presents a series of guidelines for developing a database. General procedures, report generation, data manipulation, information storage, word processing, data entry, database indexes, and relational databases are among the…
A Summary of the Naval Postgraduate School Research Program
1989-08-30
Contents include: Fundamental Theory for Automatically Combining Changes to Software Systems; Database-System Approach to Software Engineering Environments (SEEs); Multilevel Database Security; Temporal Database Management and Real-Time Database Computers; The Multi-lingual, Multi-Model, Multi-Backend Database
Wachtel, Ruth E; Dexter, Franklin
2013-12-01
The purpose of this article is to teach operating room managers, financial analysts, and those with a limited knowledge of search engines, including PubMed, how to locate articles they need in the areas of operating room and anesthesia group management. Many physicians are unaware of current literature in their field and evidence-based practices. The most common source of information is colleagues. Many people making management decisions do not read published scientific articles. Databases such as PubMed are available to search for such articles. Other databases, such as citation indices and Google Scholar, can be used to uncover additional articles. Nevertheless, most people who do not know how to use these databases are reluctant to utilize help resources when they do not know how to accomplish a task. Most people are especially reluctant to use on-line help files. Help files and search databases are often difficult to use because they have been designed for users already familiar with the field. The help files and databases have specialized vocabularies unique to the application. MeSH terms in PubMed are not useful alternatives for operating room management, an important limitation, because MeSH is the default when search terms are entered in PubMed. Librarians or those trained in informatics can be valuable assets for searching unusual databases, but they must possess the domain knowledge relative to the subject they are searching. The search methods we review are especially important when the subject area (e.g., anesthesia group management) is so specific that only 1 or 2 articles address the topic of interest. The materials are presented broadly enough that the reader can extrapolate the findings to other areas of clinical and management issues in anesthesiology.
The Perfect Marriage: Integrated Word Processing and Data Base Management Programs.
ERIC Educational Resources Information Center
Pogrow, Stanley
1983-01-01
Discussion of database integration and how it operates includes recommendations on compatible brand name word processing and database management programs, and a checklist for evaluating essential and desirable features of the available programs. (MBR)
of Expertise: Customer service; Technically savvy; Event planning; Word processing/desktop publishing; Database management. Research Interests: Website design; Database design; Computational science. Technology Consulting, Westminster, CO (2007-2012); Administrative Assistant, Source One Management, Denver, CO (2005
Fan, Ming; Thongsri, Tepwitoon; Axe, Lisa; Tyson, Trevor A
2005-06-01
A probabilistic approach was applied in an ecological risk assessment (ERA) to characterize risk and address uncertainty, employing Monte Carlo simulations to assess parameter and risk probabilistic distributions. This simulation tool includes a Windows-based interface, an interactive and modifiable database management system (DBMS) that addresses a food web at trophic levels, and a comprehensive evaluation of exposure pathways. To illustrate this model, ecological risks from depleted uranium (DU) exposure at the US Army Yuma Proving Ground (YPG) and Aberdeen Proving Ground (APG) were assessed and characterized. Probabilistic distributions showed that at YPG, a reduction in plant root weight is considered likely to occur (98% likelihood) from exposure to DU; for most terrestrial animals, the likelihood of adverse reproductive effects ranges from 0.1% to 44%. However, for the lesser long-nosed bat, effects are expected to occur (>99% likelihood) through a reduction in the size and weight of offspring. Based on available DU data for the firing range at APG, DU uptake will not likely affect survival of aquatic plants and animals (<0.1% likelihood). Based on field and laboratory studies conducted at APG and YPG on pocket mice, kangaroo rat, white-throated woodrat, deer, and milfoil, observed body burden concentrations fall into the distributions simulated at both sites.
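The Monte Carlo step behind such likelihood figures can be sketched as repeated sampling of an exposure dose and an effect threshold, reporting the fraction of draws where dose exceeds threshold. The distribution families and parameters below are invented for illustration and are not those of the YPG/APG assessment.

```python
import random

random.seed(42)  # reproducible draws for this sketch

def simulate(n=100_000):
    """Fraction of Monte Carlo draws in which dose exceeds the effect level."""
    exceed = 0
    for _ in range(n):
        dose = random.lognormvariate(mu=0.0, sigma=1.0)       # assumed dose dist.
        threshold = random.lognormvariate(mu=1.0, sigma=0.5)  # assumed effect dist.
        if dose > threshold:
            exceed += 1
    return exceed / n

risk = simulate()  # estimated likelihood of an adverse effect
```

With these assumed distributions the analytic answer is about 0.19, since ln(dose) − ln(threshold) is normal with mean −1 and variance 1.25.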
Teaching Case: Adapting the Access Northwind Database to Support a Database Course
ERIC Educational Resources Information Center
Dyer, John N.; Rogers, Camille
2015-01-01
A common problem encountered when teaching database courses is that few large illustrative databases exist to support teaching and learning. Most database textbooks have small "toy" databases that are chapter objective specific, and thus do not support application over the complete domain of design, implementation and management concepts…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-15
... administrator from the private sector to create and operate TV band databases. The TV band database... database administrator will be responsible for operation of their database and coordination of the overall functioning of the database with other administrators, and will provide database access to TVBDs. The...
Occupational exposure in the removal and disposal of asbestos-containing materials in Italy.
Scarselli, Alberto; Corfiati, Marisa; Di Marzio, Davide
2016-07-01
A great variety of asbestos-containing materials are present in both residential and work settings because of their widespread use in the past. More than two decades after the total national ban, many occupational activities in Italy, mainly those involving the removal and disposal of asbestos, still entail a risk of asbestos exposure. The aim of the study was to evaluate the level and extent of asbestos exposure in the Italian asbestos-abatement sector between the years 1996-2013. Data were collected from firm registries of asbestos-exposed workers, and descriptive statistics were calculated for exposure-related variables. Overall, 15,860 measurements of asbestos exposure were selected from the national database of registries, mostly referring to the construction sector (N = 11,353). Although mean exposure levels are low, the air concentration of asbestos fibers measured during these activities may exceed the action level established by Italian legislation and, in a limited number of cases, even the occupational limit value. Occupations at higher risk also include garbage collectors and insulation workers. Starting from an analysis of the Italian database of occupational exposure registries, this study outlines current levels of asbestos exposure in abatement-related sectors and discusses their possible implications for public health policies and surveillance programs.
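Descriptive statistics of the kind mentioned above amount to summarizing measured concentrations and counting exceedances of the two regulatory thresholds. The measurement values and threshold numbers below are invented for demonstration, not the Italian registry data or legal limits.

```python
from statistics import mean, median

measurements_ff_cc = [0.004, 0.012, 0.009, 0.031, 0.150, 0.007, 0.020]  # fibers/cc
ACTION_LEVEL = 0.01   # assumed action level
LIMIT_VALUE = 0.10    # assumed occupational limit value

above_action = sum(1 for m in measurements_ff_cc if m > ACTION_LEVEL)
above_limit = sum(1 for m in measurements_ff_cc if m > LIMIT_VALUE)

summary = {
    "n": len(measurements_ff_cc),
    "mean": round(mean(measurements_ff_cc), 4),
    "median": median(measurements_ff_cc),
    "above_action_level": above_action,
    "above_limit_value": above_limit,
}
```

Note how a single high value pulls the mean well above the median, which is why exceedance counts are reported alongside averages.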
Small Business Innovations (Integrated Database)
NASA Technical Reports Server (NTRS)
1992-01-01
Because of the diversity of NASA's information systems, it was necessary to develop DAVID as a central database management system. Under a Small Business Innovation Research (SBIR) grant, Ken Wanderman and Associates, Inc. designed software tools enabling scientists to interface with DAVID and commercial database management systems, as well as artificial intelligence programs. The software has been installed at a number of data centers and is commercially available.
ERIC Educational Resources Information Center
Fitzgibbons, Megan; Meert, Deborah
2010-01-01
The use of bibliographic management software and its internal search interfaces is now pervasive among researchers. This study compares the results between searches conducted in academic databases' search interfaces versus the EndNote search interface. The results show mixed search reliability, depending on the database and type of search…
REGRESSION MODELS OF RESIDENTIAL EXPOSURE TO CHLORPYRIFOS AND DIAZINON
This study examines the ability of regression models to predict residential exposures to chlorpyrifos and diazinon, based on the information from the NHEXAS-AZ database. The robust method was used to generate "fill-in" values for samples that are below the detection l...
A Global Geospatial Database of 5000+ Historic Flood Event Extents
NASA Astrophysics Data System (ADS)
Tellman, B.; Sullivan, J.; Doyle, C.; Kettner, A.; Brakenridge, G. R.; Erickson, T.; Slayback, D. A.
2017-12-01
A key dataset that is missing for global flood model validation and understanding historic spatial flood vulnerability is a global historical geo-database of flood event extents. Decades of earth observing satellites and cloud computing now make it possible not only to detect floods in near real time, but to run these water detection algorithms back in time to capture the spatial extent of large numbers of specific events. This talk will show results from the largest global historical flood database developed to date. We use the Dartmouth Flood Observatory flood catalogue to map over 5000 floods (from 1985-2017) using the MODIS, Landsat, and Sentinel-1 satellites. All events are available for public download via the Earth Engine Catalogue and via a website that allows the user to query floods by area or date, assess population exposure trends over time, and download flood extents in geospatial format. In this talk, we will highlight major trends in global flood exposure per continent, land use type, and eco-region. We will also make suggestions on how to use this dataset in conjunction with other global datasets to i) validate global flood models, ii) assess the potential role of climatic change in flood exposure, iii) understand how urbanization and other land change processes may influence spatial flood exposure, iv) assess how innovative flood interventions (e.g., wetland restoration) influence flood patterns, v) control for event magnitude to assess the role of social vulnerability and damage assessment, and vi) aid in rapid probabilistic risk assessment to enable microinsurance markets. The authors are already using the database for the latter three applications and will show examples of wetland intervention analysis in Argentina, social vulnerability analysis in the USA, and microinsurance in India.
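The query-by-area-or-date interface described above can be illustrated with a small in-memory sketch. Real access goes through the Earth Engine catalogue and the project website; the event records, field names, and bounding boxes below are invented for demonstration.

```python
from datetime import date

# Hypothetical flood-event records: start date and (min_lon, min_lat,
# max_lon, max_lat) bounding box.
flood_events = [
    {"id": 1001, "start": date(1998, 8, 1), "bbox": (88.0, 21.5, 92.7, 26.6)},
    {"id": 2742, "start": date(2010, 7, 26), "bbox": (66.0, 23.7, 77.8, 37.1)},
    {"id": 4401, "start": date(2016, 8, 12), "bbox": (-93.7, 29.9, -89.7, 33.0)},
]

def query(events, since=None, bbox=None):
    """Filter events by start date and/or intersection with a bounding box."""
    def intersects(a, b):
        return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]
    out = []
    for e in events:
        if since and e["start"] < since:
            continue
        if bbox and not intersects(e["bbox"], bbox):
            continue
        out.append(e)
    return out

recent = query(flood_events, since=date(2005, 1, 1))
south_asia = query(flood_events, bbox=(60.0, 20.0, 95.0, 40.0))
```

Layering a population raster over the returned extents would then give the exposure trends mentioned in the abstract.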
VIEWCACHE: An incremental pointer-based access method for autonomous interoperable databases
NASA Technical Reports Server (NTRS)
Roussopoulos, N.; Sellis, Timos
1993-01-01
One of the biggest problems facing NASA today is to provide scientists efficient access to a large number of distributed databases. Our pointer-based incremental data base access method, VIEWCACHE, provides such an interface for accessing distributed datasets and directories. VIEWCACHE allows database browsing and search performing inter-database cross-referencing with no actual data movement between database sites. This organization and processing is especially suitable for managing Astrophysics databases which are physically distributed all over the world. Once the search is complete, the set of collected pointers pointing to the desired data are cached. VIEWCACHE includes spatial access methods for accessing image datasets, which provide much easier query formulation by referring directly to the image and very efficient search for objects contained within a two-dimensional window. We will develop and optimize a VIEWCACHE External Gateway Access to database management systems to facilitate database search.
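The pointer-caching idea described above, collecting (site, record-id) references during a cross-database search instead of moving the data, can be sketched in miniature. The site layout, record fields, and object names below are invented for illustration, not VIEWCACHE's actual structures.

```python
# Hypothetical distributed sites, each holding its own records locally.
SITES = {
    "archive_a": {"r1": {"object": "M31", "band": "X-ray"},
                  "r2": {"object": "M87", "band": "radio"}},
    "archive_b": {"r9": {"object": "M31", "band": "optical"}},
}

def search(predicate):
    """Collect (site, record_id) pointers to matching records; no data moves."""
    return [(site, rid)
            for site, records in SITES.items()
            for rid, rec in records.items()
            if predicate(rec)]

# The cached pointer set is dereferenced only when the data are finally needed.
pointer_cache = search(lambda rec: rec["object"] == "M31")
results = [SITES[site][rid] for site, rid in pointer_cache]
```

The design choice is that browsing and cross-referencing stay cheap, since only small pointer sets travel between sites until an actual fetch is requested.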
Stam, Rianne
2014-01-01
Some of the strongest electromagnetic fields (EMF) are found in the workplace. A European Directive sets limits to workers’ exposure to EMF. This review summarizes its origin and contents and compares magnetic field exposure levels in high-risk workplaces with the limits set in the revised Directive. Pubmed, Scopus, grey literature databases, and websites of organizations involved in occupational exposure measurements were searched. The focus was on EMF with frequencies up to 10 MHz, which can cause stimulation of the nervous system. Selected studies had to provide individual maximum exposure levels at the workplace, either in terms of the external magnetic field strength or flux density or as induced electric field strength or current density. Indicative action levels and the corresponding exposure limit values for magnetic fields in the revised European Directive will be higher than those in the previous version. Nevertheless, magnetic flux densities in excess of the action levels for peripheral nerve stimulation are reported for workers involved in welding, induction heating, transcranial magnetic stimulation, and magnetic resonance imaging (MRI). The corresponding health effects exposure limit values for the electric fields in the worker’s body can be exceeded for welding and MRI, but calculations for induction heating and transcranial magnetic stimulation are lacking. Since the revised European Directive conditionally exempts MRI-related activities from the exposure limits, measures to reduce exposure may be necessary for welding, induction heating, and transcranial nerve stimulation. Since such measures can be complicated, there is a clear need for exposure databases for different workplace scenarios with significant EMF exposure and guidance on good practices. PMID:24557933
Evidence generation from healthcare databases: recommendations for managing change.
Bourke, Alison; Bate, Andrew; Sauer, Brian C; Brown, Jeffrey S; Hall, Gillian C
2016-07-01
There is an increasing reliance on databases of healthcare records for pharmacoepidemiology and other medical research, and such resources are often accessed over a long period of time, so it is vital to consider the impact of changes in data, access methodology, and the environment. The authors discuss change in communication and management, and provide a checklist of issues to consider for both database providers and users. The scope of the paper is database research, and changes are considered in relation to the three main components of database research: the data content itself, how it is accessed, and the support and tools needed to use the database. Copyright © 2016 John Wiley & Sons, Ltd.
NASA Technical Reports Server (NTRS)
Lancaster, Jeff; Dillard, Michael; Alves, Erin; Olofinboba, Olu
2014-01-01
The User Guide details the Access Database provided with the Flight Deck Interval Management (FIM) Display Elements, Information, & Annunciations program. The goal of this User Guide is to support ease of use and the ability to quickly retrieve and select items of interest from the Database. The Database includes FIM Concepts identified in a literature review preceding the publication of this document. Only items that are directly related to FIM (e.g., spacing indicators), which change or enable FIM (e.g., menu with control buttons), or which are affected by FIM (e.g., altitude reading) are included in the database. The guide has been expanded from previous versions to cover database structure, content, and search features with voiced explanations.
NASA Technical Reports Server (NTRS)
Shull, Sarah A.; Gralla, Erica L.; deWeck, Olivier L.; Shishko, Robert
2006-01-01
One of the major logistical challenges in human space exploration is asset management. This paper presents observations on the practice of asset management in support of human space flight to date and discusses a functional-based supply classification and a framework for an integrated database that could be used to improve asset management and logistics for human missions to the Moon, Mars and beyond.
NASA Technical Reports Server (NTRS)
Dobinson, E.
1982-01-01
General requirements for an information management system for the deep space network (DSN) are examined. A concise review of available database management system technology is presented. It is recommended that a federation of logically decentralized databases be implemented for the Network Information Management System of the DSN. Overall characteristics of the federation are specified, as well as reasons for adopting this approach.
Hoover, Rebecca M; Hayes, V Autumn Gombert; Erramouspe, John
2015-12-01
To evaluate the effect of prenatal acetaminophen exposure on the future development of attention deficit/hyperactivity disorder (ADHD) in children. Literature searches of MEDLINE (1975 to June 2015), International Pharmaceutical Abstracts (1975 to June 2015), and the Cochrane Database (publications through June 2015) for prospective clinical trials assessing the relationship of prenatal acetaminophen exposure and the development of attention deficit disorders or hyperactivity. Studies comparing self-reported maternal acetaminophen use during pregnancy to development of ADHD or ADHD-like behaviors in offspring between the ages of 3 and 12 years. Four studies examining the effects of prenatal acetaminophen exposure on subsequent ADHD behaviors were identified. Of these, one early study found no link to ADHD behaviors, while the other studies found statistically significant correlations, the most prominent being a study finding a higher risk of using ADHD medications (hazard ratio = 1.29; 95% CI, 1.15-1.44) or having ADHD-like behaviors at age 7 years as determined by the Strengths and Difficulties Questionnaire (risk ratio = 1.13; 95% CI, 1.01-1.27) in children whose mothers used acetaminophen during pregnancy. While there does appear to be a mild correlation between prenatal acetaminophen use and the development of ADHD symptoms in children, current data do not provide sufficient evidence that prenatal acetaminophen exposure leads to development of ADHD symptoms later in life. Acetaminophen is a preferred option for fever or pain management during pregnancy when compared with other medications such as nonsteroidal anti-inflammatory drugs or opioids. © The Author(s) 2015.
Applications of Database Machines in Library Systems.
ERIC Educational Resources Information Center
Salmon, Stephen R.
1984-01-01
Characteristics and advantages of database machines are summarized and their applications to library functions are described. The ability to attach multiple hosts to the same database and flexibility in choosing operating and database management systems for different functions without loss of access to a common database are noted. (EJS)
The Thresholds of Toxicological Concern (TTC) are generic human exposure thresholds for structural groups of chemicals below which no risk to human health is assumed and therefore no further testing is needed. Different thresholds have been developed for oral exposure e.g. for gen...
ACToR: Aggregated Computational Toxicology Resource (T) ...
The EPA Aggregated Computational Toxicology Resource (ACToR) is a set of databases compiling information on chemicals in the environment from a large number of public and in-house EPA sources. ACToR has 3 main goals: (1) To serve as a repository of public toxicology information on chemicals of interest to the EPA, and in particular to be a central source for the testing data on all chemicals regulated by all EPA programs; (2) To be a source of in vivo training data sets for building in vitro to in vivo computational models; (3) To serve as a central source of chemical structure and identity information for the ToxCast and Tox21 programs. There are 4 main databases, all linked through a common set of chemical information and a common structure linking chemicals to assay data: the public ACToR system (available at http://actor.epa.gov); the ToxMiner database holding ToxCast and Tox21 data, along with results from statistical analyses on these data; the Tox21 chemical repository, which manages the ordering and sample tracking process for the larger Tox21 project; and the public version of ToxRefDB. The public ACToR system contains information on ~500K compounds with toxicology, exposure and chemical property information from >400 public sources. The web site is visited by ~1,000 unique users per month and generates ~1,000 page requests per day on average. The databases are built on open source technology, which has allowed us to export them to a number of col
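The common linking structure described above, one chemical identity table that assay results from multiple source databases all reference, can be sketched with an in-memory SQLite database. The table and column names below are illustrative only, not ACToR's actual schema.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE chemical (
    chem_id INTEGER PRIMARY KEY,
    casrn   TEXT UNIQUE,          -- CAS registry number as shared identifier
    name    TEXT
);
CREATE TABLE assay_result (
    result_id INTEGER PRIMARY KEY,
    chem_id   INTEGER REFERENCES chemical(chem_id),
    source_db TEXT,               -- which source database supplied the row
    assay     TEXT,
    value     REAL
);
""")

con.execute("INSERT INTO chemical VALUES (1, '50-00-0', 'Formaldehyde')")
con.executemany(
    "INSERT INTO assay_result (chem_id, source_db, assay, value) VALUES (?,?,?,?)",
    [(1, "sourceA", "oral_LOAEL", 15.0), (1, "sourceB", "AC50_assayX", 3.2)],
)

# One join gathers a chemical's data regardless of which source supplied it.
rows = con.execute("""
    SELECT c.name, r.source_db, r.assay, r.value
    FROM chemical c JOIN assay_result r USING (chem_id)
    ORDER BY r.result_id
""").fetchall()
```

Keying everything on a shared chemical identifier is what lets hundreds of heterogeneous sources be queried as one resource.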
Heterogeneous distributed query processing: The DAVID system
NASA Technical Reports Server (NTRS)
Jacobs, Barry E.
1985-01-01
The objective of the Distributed Access View Integrated Database (DAVID) project is the development of an easy to use computer system with which NASA scientists, engineers and administrators can uniformly access distributed heterogeneous databases. Basically, DAVID will be a database management system that sits alongside already existing database and file management systems. Its function is to enable users to access the data in other languages and file systems without having to learn the data manipulation languages. Given here is an outline of a talk on the DAVID project and several charts.
DOT National Transportation Integrated Search
2005-01-01
In 2003, an Internet-based Geotechnical Database Management System (GDBMS) was developed for the Virginia Department of Transportation (VDOT) using distributed Geographic Information System (GIS) methodology for data management, archival, retrieval, ...
Improving Recall Using Database Management Systems: A Learning Strategy.
ERIC Educational Resources Information Center
Jonassen, David H.
1986-01-01
Describes the use of microcomputer database management systems to facilitate the instructional uses of learning strategies relating to information processing skills, especially recall. Two learning strategies, cross-classification matrixing and node acquisition and integration, are highlighted. (Author/LRW)
Using Online Databases in Corporate Issues Management.
ERIC Educational Resources Information Center
Thomsen, Steven R.
1995-01-01
Finds that corporate public relations practitioners felt they were able, using online database and information services, to intercept issues earlier in the "issue cycle" and thus enable their organizations to develop more "proactionary" or "catalytic" issues management response strategies. (SR)
Risk assessment and management of radiofrequency radiation exposure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dabala, Dana; Surducan, Emanoil; Surducan, Vasile
2013-11-13
Radiofrequency radiation (RFR) industry managers, occupational physicians, security departments, and other practitioners must be advised on the basics of biophysics and the health effects of RF electromagnetic fields so as to guide the management of exposure. Information on the biophysics of RFR and its biological/health effects is derived from standard texts, the literature, and clinical experience. Emergency treatment and ongoing care are outlined, with a clinical approach that integrates the circumstances of exposure and the patient's symptoms. An experimental risk assessment model for chronic RFR exposure is proposed. Planning for exposure assessment and monitoring, ongoing care, safety measures, and work protection outlines the proper management.
Risk assessment and management of radiofrequency radiation exposure
NASA Astrophysics Data System (ADS)
Dabala, Dana; Surducan, Emanoil; Surducan, Vasile; Neamtu, Camelia
2013-11-01
Radiofrequency radiation (RFR) industry managers, occupational physicians, security departments, and other practitioners must be advised on the basics of biophysics and the health effects of RF electromagnetic fields so as to guide the management of exposure. Information on the biophysics of RFR and its biological/health effects is derived from standard texts, the literature, and clinical experience. Emergency treatment and ongoing care are outlined, with a clinical approach that integrates the circumstances of exposure and the patient's symptoms. An experimental risk assessment model for chronic RFR exposure is proposed. Planning for exposure assessment and monitoring, ongoing care, safety measures, and work protection outlines the proper management.
Defining the relationship between individuals’ aggregate and maximum source-specific exposures
The concepts of aggregate and source-specific exposures play an important role in chemical risk management. Aggregate exposure to a chemical refers to combined exposures fr...
Bos, Marian E H; Te Beest, Dennis E; van Boven, Michiel; van Beest Holle, Mirna Robert-Du Ry; Meijer, Adam; Bosman, Arnold; Mulder, Yonne M; Koopmans, Marion P G; Stegeman, Arjan
2010-05-01
An epizootic of avian influenza (H7N7) caused a large number of human infections in The Netherlands in 2003. We used data from this epizootic to estimate infection probabilities for persons involved in disease control on infected farms. Analyses were based on databases containing information on the infected farms, person-visits to these farms, and exposure variables (number of birds present, housing type, poultry type, depopulation method, period during epizootic). Case definition was based on self-reported conjunctivitis and positive response to hemagglutination inhibition assay. A high infection probability was associated with clinical inspection of poultry in the area surrounding infected flocks (7.6%; 95% confidence interval [CI], 1.4%-18.9%) and active culling during depopulation (6.2%; 95% CI, 3.7%-9.6%). Low probabilities were estimated for management of biosecurity (0.0%; 95% CI, 0.0%-1.0%) and cleaning assistance during depopulation (0.0%; 95% CI, 0.0%-9.2%). No significant association was observed between the probability of infection and the exposure variables.
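The probabilities above are reported with confidence intervals for binomial proportions. As a minimal illustration of how such an interval can be computed (the paper's exact method and denominators are not given here; the counts below are invented, and the Wilson score interval is used rather than whatever the authors applied):

```python
import math

def wilson_interval(successes, n, z=1.96):
    """Wilson score interval for a binomial proportion (95% by default)."""
    if n == 0:
        return (0.0, 0.0)
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return (max(0.0, centre - half), min(1.0, centre + half))

# Illustrative counts only -- the study's denominators are not reproduced here.
lo, hi = wilson_interval(7, 92)
print(f"estimated infection probability: {7/92:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```

A zero numerator still yields a nonzero upper bound, which is why the study can report intervals such as 0.0%-9.2%.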
An incremental database access method for autonomous interoperable databases
NASA Technical Reports Server (NTRS)
Roussopoulos, Nicholas; Sellis, Timos
1994-01-01
We investigated a number of design and performance issues of interoperable database management systems (DBMS's). The major results of our investigation were obtained in the areas of client-server database architectures for heterogeneous DBMS's, incremental computation models, buffer management techniques, and query optimization. We finished a prototype of an advanced client-server workstation-based DBMS which allows access to multiple heterogeneous commercial DBMS's. Experiments and simulations were then run to compare its performance with the standard client-server architectures. The focus of this research was on adaptive optimization methods for heterogeneous database systems. Adaptive buffer management accounts for the random and object-oriented access methods for which no known characterization of the access patterns exists. Adaptive query optimization means that value distributions and selectivities, which play the most significant role in query plan evaluation, are continuously refined to reflect the actual values as opposed to static ones that are computed off-line. Query feedback is a concept that was first introduced to the literature by our group. We employed query feedback both for adaptive buffer management and for computing value distributions and selectivities. For adaptive buffer management, we use the page faults of prior executions to achieve more 'informed' management decisions. For the estimation of the distributions of the selectivities, we use curve-fitting techniques, such as least squares and splines, for regressing on these values.
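The query-feedback idea above can be sketched in a few lines: observed selectivities from past query executions are regressed against the predicate value, and the fitted curve estimates the selectivity of a new predicate. This is a single-variable least-squares sketch with invented feedback pairs, not the paper's actual spline-based implementation:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a*x + b (one regressor)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

# Hypothetical feedback: (predicate bound, observed selectivity) from past queries.
feedback = [(10, 0.05), (20, 0.11), (30, 0.14), (40, 0.21), (50, 0.24)]
a, b = fit_line([x for x, _ in feedback], [y for _, y in feedback])

def estimate_selectivity(bound):
    # Clamp to [0, 1]: a selectivity is a fraction of rows.
    return min(1.0, max(0.0, a * bound + b))

print(estimate_selectivity(35))  # refined estimate for an unseen predicate
```

As more queries execute, new (bound, selectivity) pairs refine the fit, which is what makes the optimization "adaptive" rather than relying on statistics computed off-line.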
Croft, A; Archer, R
1997-01-01
BACKGROUND: Rabies is a zoonosis that remains endemic in most parts of the world. Primary care physicians are in the first line of defence against the disease. An increasing number of British practitioners and medical students are being exposed to the dangers of rabies through humanitarian work on overseas attachments. Rabies is enzootic throughout Bosnia-Herzegovina and presents a hazard to the multinational troops currently deployed there. AIM: To describe the British Army's experience of animal bites and rabies prevention in Bosnia during the first six months of its current peace enforcement mission, and to make general recommendations on the good management of any rabies hazard at primary care level and under field conditions. METHODS: Routine data from the Army's epidemiological database (ARRC 97) were reviewed, and theatre issues of rabies vaccine and immune globulin were used as a proxy measure for administered post-exposure prophylaxis. RESULTS: A total of 62 animal bites were reported in British troops between December 1995 and May 1996, of which 28 were unprovoked bites and resulted in the administration of a course of rabies vaccine. Ten of these were severe bites and rabies immune globulin (RIG) was administered in addition. The total cost of rabies post-exposure prophylaxis was US$6914.00. CONCLUSION: The prevention of rabies has major human and resource implications, and primary care staff involved in rabies post-exposure management need to be well supported in their clinical decision-making. Rabies protocols should be clear and unambiguous. The effective medical surveillance of military or humanitarian missions in rabies-enzootic areas must include the prompt reporting of animal bites. The predeployment training of medical teams should include an up-to-date assessment of the local rabies threat. PMID:9281871
PACSY, a relational database management system for protein structure and chemical shift analysis.
Lee, Woonghee; Yu, Wookyung; Kim, Suhkmann; Chang, Iksoo; Lee, Weontae; Markley, John L
2012-10-01
PACSY (Protein structure And Chemical Shift NMR spectroscopY) is a relational database management system that integrates information from the Protein Data Bank, the Biological Magnetic Resonance Data Bank, and the Structural Classification of Proteins database. PACSY provides three-dimensional coordinates and chemical shifts of atoms along with derived information such as torsion angles, solvent accessible surface areas, and hydrophobicity scales. PACSY consists of six relational table types linked to one another for coherence by key identification numbers. Database queries are enabled by advanced search functions supported by an RDBMS server such as MySQL or PostgreSQL. PACSY enables users to search for combinations of information from different database sources in support of their research. Two software packages, PACSY Maker for database creation and PACSY Analyzer for database analysis, are available from http://pacsy.nmrfam.wisc.edu.
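The key-linked table design described above can be illustrated with an in-memory SQLite sketch. The table and column names here are invented for illustration, not the actual PACSY schema; the point is combining structure and chemical-shift data through shared key identification numbers:

```python
import sqlite3

# Two illustrative tables linked by a shared key, in the spirit of PACSY's
# six key-linked relational table types (names/columns are assumptions).
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE coord (key_id INTEGER, atom TEXT, x REAL, y REAL, z REAL);
CREATE TABLE shift (key_id INTEGER, atom TEXT, ppm REAL);
INSERT INTO coord VALUES (1, 'CA', 12.1, 3.4, -7.8);
INSERT INTO shift VALUES (1, 'CA', 56.3);
""")

# Combine 3D coordinates and chemical shifts through the shared key.
row = db.execute("""
    SELECT c.atom, c.x, c.y, c.z, s.ppm
    FROM coord c JOIN shift s
      ON c.key_id = s.key_id AND c.atom = s.atom
""").fetchone()
print(row)  # ('CA', 12.1, 3.4, -7.8, 56.3)
```

In the real system the same style of join, served by MySQL or PostgreSQL, lets users search for combinations of information originating from the PDB, BMRB, and SCOP sources.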
The INFN-CNAF Tier-1 GEMSS Mass Storage System and database facility activity
NASA Astrophysics Data System (ADS)
Ricci, Pier Paolo; Cavalli, Alessandro; Dell'Agnello, Luca; Favaro, Matteo; Gregori, Daniele; Prosperini, Andrea; Pezzi, Michele; Sapunenko, Vladimir; Zizzi, Giovanni; Vagnoni, Vincenzo
2015-05-01
The consolidation of Mass Storage services at the INFN-CNAF Tier1 Storage department during the last 5 years has resulted in a reliable, high-performance and moderately easy-to-manage facility that provides data access, archive, backup and database services to several different use cases. At present, the GEMSS Mass Storage System, developed and installed at CNAF and based upon an integration between the IBM GPFS parallel filesystem and the Tivoli Storage Manager (TSM) tape management software, is one of the largest hierarchical storage sites in Europe. It provides storage resources for about 12% of LHC data, as well as for data of other non-LHC experiments. Files are accessed using standard SRM Grid services provided by the Storage Resource Manager (StoRM), also developed at CNAF. Data access is also provided by XRootD and HTTP/WebDAV endpoints. Besides these services, an Oracle database facility is in production, characterized by an effective level of parallelism, redundancy and availability. This facility is running databases for storing and accessing relational data objects and for providing database services to the currently active use cases. It takes advantage of several Oracle technologies, like Real Application Cluster (RAC), Automatic Storage Manager (ASM) and Enterprise Manager centralized management tools, together with other technologies for performance optimization, ease of management and downtime reduction. The aim of the present paper is to illustrate the state-of-the-art of the INFN-CNAF Tier1 Storage department infrastructures and software services, and to give a brief outlook on forthcoming projects. A description of the administrative, monitoring and problem-tracking tools that play a primary role in managing the whole storage framework is also given.
Prevalence and Effects of Child Exposure to Domestic Violence.
ERIC Educational Resources Information Center
Fantuzzo, John W.; Mohr, Wanda K.
1999-01-01
Discusses the limitations of current databases documenting the prevalence and effects of child exposure to domestic violence and describes a model for the collection of reliable and valid prevalence data, the Spousal Assault Replication Program, which uses data collected by the police and university researchers. (SLD)
Reflective Database Access Control
ERIC Educational Resources Information Center
Olson, Lars E.
2009-01-01
"Reflective Database Access Control" (RDBAC) is a model in which a database privilege is expressed as a database query itself, rather than as a static privilege contained in an access control list. RDBAC aids the management of database access controls by improving the expressiveness of policies. However, such policies introduce new interactions…
Pesch, Beate; Kendzia, Benjamin; Hauptmann, Kristin; Van Gelder, Rainer; Stamm, Roger; Hahn, Jens-Uwe; Zschiesche, Wolfgang; Behrens, Thomas; Weiss, Tobias; Siemiatycki, Jack; Lavoué, Jerome; Jöckel, Karl-Heinz; Brüning, Thomas
2015-07-01
This study aimed to estimate occupational exposure to inhalable hexavalent chromium (Cr(VI)) using the exposure database MEGA. The database has been compiling Cr(VI) concentrations and ancillary data about measurements at German workplaces. We analysed 3659 personal measurements of inhalable Cr(VI) collected between 1994 and 2009. Cr(VI) was determined spectrophotometrically at 540 nm after reaction with diphenylcarbazide. We assigned the measurements to pre-defined at-risk occupations using the information provided about the workplaces. Two-thirds of the measurements were below the limit of quantification (LOQ) and multiply imputed according to the distribution above LOQ. The 75th percentile value was 5.2 μg/m(3) and the 95th percentile was 57.2 μg/m(3). We predicted the geometric mean for 2h sampling in the year 2000, and the time trend of Cr(VI) exposure in these settings with and without adjustment for the duration of measurements. The largest dataset was available for welding (N = 1898), which could be further detailed according to technique. The geometric means were above 5 μg/m(3) in the following situations: spray painting, shielded metal arc welding, and flux-cored arc welding if applied to stainless steel. The geometric means were between 1 μg/m(3) and 5 μg/m(3) for gas metal arc welding of stainless steel, cutting, hard-chromium plating, metal spraying and in the chemical chromium industry. The exposure profiles described here are useful for epidemiologic and industrial health purposes. Exposure to Cr(VI) varies not only between occupations, but also within occupations as shown for welders. In epidemiologic studies, it would be desirable to collect exposure-specific information in addition to the job title. Copyright © 2015 Elsevier GmbH. All rights reserved.
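The handling of measurements below the limit of quantification can be sketched crudely: fit a lognormal to the quantified values, then draw censored values from the part of that distribution below the LOQ. This is a simplification of the paper's multiple-imputation procedure (which models the full censored distribution), and all numbers below are invented:

```python
import math, random, statistics

random.seed(0)
loq = 1.0  # limit of quantification, ug/m3 (illustrative)

# Hypothetical measured concentrations above the LOQ.
observed = [1.4, 2.2, 3.1, 5.2, 8.9, 12.0, 20.5]

# Fit a lognormal to the quantified values.
logs = [math.log(x) for x in observed]
mu, sigma = statistics.mean(logs), statistics.stdev(logs)

def impute_below_loq():
    """Draw from the fitted lognormal, conditioned on being below the LOQ."""
    while True:  # rejection sampling on the truncated tail
        x = random.lognormvariate(mu, sigma)
        if x < loq:
            return x

imputed = [impute_below_loq() for _ in range(5)]
print(all(0 < x < loq for x in imputed))  # True
```

Repeating the imputation several times and pooling the analyses is what makes it *multiple* imputation, propagating the uncertainty about the unobserved values into the reported percentiles and trends.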
Zartarian, V G; Streicker, J; Rivera, A; Cornejo, C S; Molina, S; Valadez, O F; Leckie, J O
1995-01-01
A pesticide exposure assessment pilot study was conducted in Salinas Valley, California during September, 1993. The pilot study had two main purposes: 1) to develop general methodologies for videotaping micro-activities of a population, and 2) to collect an initial database of activity patterns of two- to four-year-old farm labor children. Tools to accurately determine exposure and dose through all three pathways (dermal, ingestion, and inhalation) are needed to effectively assess and manage health risks posed by pesticides and other environmental pollutants. Eight to ten hours of videotape data were collected for each of four Mexican-American farm labor children. In addition, the researchers administered a day-after recall questionnaire to the caregivers of the children to test (for the study sample) the hypothesis that recall questionnaires are inadequate for collecting detailed information regarding dermal and hand-to-mouth exposures. The results of this study provide the first detailed set of videotape data on farm labor children, a population at high risk to pesticide exposures. In addition, this is the first project in the exposure assessment field to use direct observation videotaping for collecting micro-activity data in order to quantify dermal and ingestion exposure. The comparison of caregivers' recall of children's activities to actual videotapes from the pilot study supports the hypothesis that videotaping may greatly improve the accuracy of activity information used to compute dermal and ingestion exposures. However, as it was clear that the researchers' presence in some cases altered the activities of the subjects, further experiments need to be conducted to minimize interference of videotaping on exposure-related activities. This paper explains the selection of the study population, the methods used to implement the pilot study, and the lessons learned. 
While the discussion focuses on four case studies in the Mexican-American farm labor population, the data collection methods developed and the lessons learned can be applied to other populations.
Nishio, Shin-Ya; Usami, Shin-Ichi
2017-03-01
Recent advances in next-generation sequencing (NGS) have given rise to new challenges due to the difficulties in variant pathogenicity interpretation and large dataset management, including many kinds of public population databases as well as public or commercial disease-specific databases. Here, we report a new database development tool, named the "Clinical NGS Database," for improving clinical NGS workflow through the unified management of variant information and clinical information. This database software offers a two-feature approach to variant pathogenicity classification. The first of these approaches is a phenotype similarity-based approach. This database allows the easy comparison of the detailed phenotype of each patient with the average phenotype of the same gene mutation at the variant or gene level. It is also possible to browse patients with the same gene mutation quickly. The other approach is a statistical approach to variant pathogenicity classification based on the use of the odds ratio for comparisons between the case and the control for each inheritance mode (families with apparently autosomal dominant inheritance vs. control, and families with apparently autosomal recessive inheritance vs. control). A number of case studies are also presented to illustrate the utility of this database. © 2016 The Authors. Human Mutation published by Wiley Periodicals, Inc.
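The odds-ratio approach mentioned above reduces to a 2x2 comparison of variant carriers in cases versus controls. A minimal sketch, with invented counts and the standard log-odds-ratio normal approximation for the confidence interval (the paper's exact procedure is not reproduced here):

```python
import math

def odds_ratio(case_with, case_without, ctrl_with, ctrl_without):
    """Odds ratio of carrying a variant in cases vs. controls, with a
    95% CI from the log-OR normal approximation."""
    orr = (case_with * ctrl_without) / (case_without * ctrl_with)
    se = math.sqrt(1/case_with + 1/case_without + 1/ctrl_with + 1/ctrl_without)
    lo = math.exp(math.log(orr) - 1.96 * se)
    hi = math.exp(math.log(orr) + 1.96 * se)
    return orr, lo, hi

# Illustrative counts: variant carriers among recessive-inheritance families
# vs. controls (not data from the paper).
orr, lo, hi = odds_ratio(12, 88, 30, 970)
print(f"OR = {orr:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```

Computing the ratio separately per inheritance mode, as the database does, keeps dominant-acting and recessive-acting variants from diluting each other's signal.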
Grangeiro, Alexandre; Couto, Márcia Thereza; Peres, Maria Fernanda; Luiz, Olinda; Zucchi, Eliana Miura; de Castilho, Euclides Ayres; Estevam, Denize Lotufo; Alencar, Rosa; Wolffenbüttel, Karina; Escuder, Maria Mercedes; Calazans, Gabriela; Ferraz, Dulce; Arruda, Érico; Corrêa, Maria da Gloria; Amaral, Fabiana Rezende; Santos, Juliane Cardoso Villela; Alvarez, Vivian Salles; Kietzmann, Tiago
2015-08-25
Few results from programmes based on combination prevention methods are available. We propose to analyse the degree of protection provided by postexposure prophylaxis (PEP) for consensual sexual activity at healthcare clinics, its compensatory effects on sexual behaviour; and the effectiveness of combination prevention methods and pre-exposure prophylaxis (PrEP), compared with exclusively using traditional methods. A total of 3200 individuals aged 16 years or older presenting for PEP at 5 sexually transmitted disease (STD)/HIV clinics in 3 regions of Brazil will be allocated to one of two groups: the PEP group-individuals who come to the clinic within 72 h after a sexual exposure and start PEP; and the non-PEP group-individuals who come after 72 h but within 30 days of exposure and do not start PEP. Clinical follow-up will be conducted initially for 6 months and comprise educational interventions based on information and counselling for using prevention methods, including PrEP. In the second study phase, individuals who remain HIV negative will be regrouped according to the reported use of prevention methods and observed for 18 months: only traditional methods; combined methods; and PrEP. Effectiveness will be analysed according to the incidence of HIV, syphilis and hepatitis B and C and protected sexual behaviour. A structured questionnaire will be administered to participants at baseline and every 6 months thereafter. Qualitative methods will be employed to provide a comprehensive understanding of PEP-seeking behaviour, preventive choices and exposure to HIV. This study will be conducted in accordance with the resolution of the School of Medicine Research Ethics Commission of Universidade de São Paulo (protocol no. 251/14). The databases will be available for specific studies, after management committee approval. 
Findings will be presented to researchers, health managers and civil society members by means of newspapers, electronic media and scientific journals and meetings. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Rattner, B.A.; Pearson, J.L.; Golden, N.H.; Erwin, R.M.; Ottinger, M.A.
1998-01-01
The Biomonitoring of Environmental Status and Trends (BEST) program of the Department of the Interior is focused on identifying and understanding the effects of contaminant stressors on biological resources under its stewardship. One BEST program activity involves evaluation of retrospective data to assess and predict the condition of biota in Atlantic coast estuaries. A 'Contaminant Exposure and Effects--Terrestrial Vertebrates' database (CEE-TV) has been compiled through computerized literature searches of Fish and Wildlife Reviews, BIOSIS, AGRICOLA, and TOXLINE, review of existing databases (e.g., US EPA Ecological Incident Information System, USGS Diagnostic and Epizootic Databases), and solicitation of unpublished reports from conservation agencies, private groups, and universities. Summary information has been entered into the CEE-TV database, including species, collection date (1965-present), site coordinates, sample matrix, contaminant concentrations, biomarker and bioindicator responses, and reference source, utilizing a 96-field dBase format. Currently, the CEE-TV database contains 3500 georeferenced records representing >200 vertebrate species and >100,000 individuals residing in estuaries from Maine through Florida. This relational database can be directly queried, imported into the ARC/INFO geographic information system (GIS) to examine spatial tendencies, and used to identify 'hot-spots', generate hypotheses, and focus ecotoxicological assessments. An overview of temporal, phylogenetic, and geographic contaminant exposure and effects information, trends, and data gaps will be presented for terrestrial vertebrates residing in estuaries in the northeast United States.
Long Duration Exposure Facility (LDEF) optical systems SIG summary and database
NASA Astrophysics Data System (ADS)
Bohnhoff-Hlavacek, Gail
1992-09-01
The main objectives of the Long Duration Exposure Facility (LDEF) Optical Systems Special Investigative Group (SIG) Discipline are to develop a database of experimental findings on LDEF optical systems and elements hardware, and provide an optical system overview. Unlike the electrical and mechanical disciplines, the optics effort relies primarily on the testing of hardware at the various principal investigator's laboratories, since minimal testing of optical hardware was done at Boeing. This is because all space-exposed optics hardware are part of other individual experiments. At this time, all optical systems and elements testing by experiment investigator teams is not complete, and in some cases has hardly begun. Most experiment results to date, document observations and measurements that 'show what happened'. Still to come from many principal investigators is a critical analysis to explain 'why it happened' and future design implications. The original optical system related concerns and the lessons learned at a preliminary stage in the Optical Systems Investigations are summarized. The design of the Optical Experiments Database and how to acquire and use the database to review the LDEF results are described.
Long Duration Exposure Facility (LDEF) optical systems SIG summary and database
NASA Technical Reports Server (NTRS)
Bohnhoff-Hlavacek, Gail
1992-01-01
The main objectives of the Long Duration Exposure Facility (LDEF) Optical Systems Special Investigative Group (SIG) Discipline are to develop a database of experimental findings on LDEF optical systems and elements hardware, and provide an optical system overview. Unlike the electrical and mechanical disciplines, the optics effort relies primarily on the testing of hardware at the various principal investigator's laboratories, since minimal testing of optical hardware was done at Boeing. This is because all space-exposed optics hardware are part of other individual experiments. At this time, all optical systems and elements testing by experiment investigator teams is not complete, and in some cases has hardly begun. Most experiment results to date, document observations and measurements that 'show what happened'. Still to come from many principal investigators is a critical analysis to explain 'why it happened' and future design implications. The original optical system related concerns and the lessons learned at a preliminary stage in the Optical Systems Investigations are summarized. The design of the Optical Experiments Database and how to acquire and use the database to review the LDEF results are described.
NASA Technical Reports Server (NTRS)
Ebersole, M. M.
1983-01-01
JPL's management and administrative support systems have been developed piecemeal and without consistency in design approach over the past twenty years. These systems are now proving to be inadequate to support effective management of tasks and administration of the Laboratory. New approaches are needed. Modern database management technology has the potential for providing the foundation for more effective administrative tools for JPL managers and administrators. Plans for upgrading JPL's management and administrative systems over a six-year period, revolving around the development of an integrated management and administrative database, are discussed.
Nitrous oxide for the management of labor pain: a systematic review.
Likis, Frances E; Andrews, Jeffrey C; Collins, Michelle R; Lewis, Rashonda M; Seroogy, Jeffrey J; Starr, Sarah A; Walden, Rachel R; McPheeters, Melissa L
2014-01-01
We systematically reviewed evidence addressing the effectiveness of nitrous oxide for the management of labor pain, the influence of nitrous oxide on women's satisfaction with their birth experience and labor pain management, and adverse effects associated with nitrous oxide for labor pain management. We searched the MEDLINE, EMBASE, and Cumulative Index to Nursing and Allied Health Literature (CINAHL) databases for articles published in English. The study population included pregnant women in labor intending a vaginal birth, birth attendees or health care providers who may be exposed to nitrous oxide during labor, and the fetus/neonate. We identified a total of 58 publications, representing 59 distinct study populations: 2 studies were of good quality, 11 fair, and 46 poor. Inhalation of nitrous oxide provided less effective pain relief than epidural analgesia, but the quality of studies was predominately poor. The heterogeneous outcomes used to assess women's satisfaction with their birth experience and labor pain management made synthesis of studies difficult. Most maternal adverse effects reported in the literature were unpleasant side effects that affect tolerability, such as nausea, vomiting, dizziness, and drowsiness. Apgar scores in newborns whose mothers used nitrous oxide were not significantly different from those of newborns whose mothers used other labor pain management methods or no analgesia. Evidence about occupational harms and exposure was limited. The literature addressing nitrous oxide for the management of labor pain includes few studies of good or fair quality. Further research is needed across all of the areas examined: effectiveness, satisfaction, and adverse effects.
Plenge-Bönig, A; Schmolz, E
2014-05-01
The German Act on the Prevention and Control of Infectious Diseases in Man (Infektionsschutzgesetz, IfSG) provides a legal framework for activities and responsibilities concerning communal rodent control. However, actual governance of communal rodent control is relatively heterogeneous, as federal states (Bundesländer) have different or even no regulations for prevention and management of commensal rodent infestations (e.g. brown rats, roof rats and house mice). Control targets and control requirements are rarely precisely defined and often do not go beyond general measures and objectives. Although relevant regulations provide information about agreed preventive measures against rodents, the concept of sustainability is not expressed as such. A centrally managed database-supported municipal rodent control is a key factor for sustainability because it allows a systematic and analytical approach to identify and reduce rodent populations. The definition of control objectives and their establishment in legal decrees is mandatory for the implementation of a sustainable management strategy of rodent populations at a local level. Systematic recording of rodent infestations through municipal-operated monitoring provides the essential data foundation for a targeted rodent management which is already implemented in some German and European cities and nationwide in Denmark. A sustainable rodent management includes a more targeted rodenticide application which in the long-term will lead to an overall reduction of rodenticide use. Thus, the benefits of sustainable rodent management will be a reduction of rodenticide exposure to the environment, prevention of resistance and long-term economical savings.
Page, David S.; Chapman, Peter M.; Landrum, Peter F.; Neff, Jerry; Elston, Ralph
2012-01-01
This article presents a critical review of two groups of studies that reported adverse effects to salmon and herring eggs and fry from exposure to 1 μg/L or less of aqueous total polycyclic aromatic hydrocarbons (TPAH), as weathered oil, and a more toxic aqueous extract of “very weathered oil.” Exposure media were prepared by continuously flowing water up through vertical columns containing gravel oiled at different concentrations of Prudhoe Bay crude oil. Uncontrolled variables associated with the use of the oiled gravel columns included time- and treatment-dependent variations in the PAH concentration and composition in the exposure water, unexplored toxicity from other oil constituents/degradation products, potential toxicity from bacterial and fungal activity, oil droplets as a potential contaminant source, inherent differences between control and exposed embryo populations, and water flow rate differences. Based on a review of the evidence from published project reports, peer-reviewed publications, chemistry data in a public database, and unpublished reports and laboratory records, the reviewed studies did not establish consistent dose (concentration) response or causality and thus do not demonstrate that dissolved PAH alone from the weathered oil resulted in the claimed effects on fish embryos at low μg/L TPAH concentrations. Accordingly, these studies should not be relied on for management decision-making, when assessing the risk of very low–level PAH exposures to early life stages of fish. PMID:22754275
Gini, Rosa; Schuemie, Martijn; Brown, Jeffrey; Ryan, Patrick; Vacchi, Edoardo; Coppola, Massimo; Cazzola, Walter; Coloma, Preciosa; Berni, Roberto; Diallo, Gayo; Oliveira, José Luis; Avillach, Paul; Trifirò, Gianluca; Rijnbeek, Peter; Bellentani, Mariadonata; van Der Lei, Johan; Klazinga, Niek; Sturkenboom, Miriam
2016-01-01
Introduction: We see increased use of existing observational data in order to achieve fast and transparent production of empirical evidence in health care research. Multiple databases are often used to increase power, to assess rare exposures or outcomes, or to study diverse populations. For privacy and sociological reasons, original data on individual subjects can’t be shared, requiring a distributed network approach where data processing is performed prior to data sharing. Case Descriptions and Variation Among Sites: We created a conceptual framework distinguishing three steps in local data processing: (1) data reorganization into a data structure common across the network; (2) derivation of study variables not present in original data; and (3) application of study design to transform longitudinal data into aggregated data sets for statistical analysis. We applied this framework to four case studies to identify similarities and differences in the United States and Europe: Exploring and Understanding Adverse Drug Reactions by Integrative Mining of Clinical Records and Biomedical Knowledge (EU-ADR), Observational Medical Outcomes Partnership (OMOP), the Food and Drug Administration’s (FDA’s) Mini-Sentinel, and the Italian network—the Integration of Content Management Information on the Territory of Patients with Complex Diseases or with Chronic Conditions (MATRICE). Findings: National networks (OMOP, Mini-Sentinel, MATRICE) all adopted shared procedures for local data reorganization. The multinational EU-ADR network needed locally defined procedures to reorganize its heterogeneous data into a common structure. Derivation of new data elements was centrally defined in all networks but the procedure was not shared in EU-ADR. Application of study design was a common and shared procedure in all the case studies. Computer procedures were embodied in different programming languages, including SAS, R, SQL, Java, and C++. 
Conclusion: Using our conceptual framework, we found several areas that would benefit from research to identify optimal standards for the production of empirical knowledge from existing databases. PMID:27014709
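The three local-processing steps that this framework distinguishes can be sketched in a few lines. This is a hypothetical illustration, not code from any of the networks named above; the field names, the common data model, and the exposure rule are all invented for the example.

```python
# Hypothetical sketch of the three local-processing steps:
# (1) reorganize local data into a common structure, (2) derive study
# variables absent from the original data, (3) apply the study design to
# reduce longitudinal records to aggregate counts that can be shared.
from collections import Counter

def reorganize(local_records):
    """Step 1: map heterogeneous local fields onto a common data model."""
    return [{"person_id": r["pid"], "drug": r["atc_code"], "event": r["dx"]}
            for r in local_records]

def derive(records):
    """Step 2: derive a study variable not present in the original data."""
    for r in records:
        # illustrative rule: ATC codes starting with J01 count as exposed
        r["exposed"] = r["drug"].startswith("J01")
    return records

def aggregate(records):
    """Step 3: reduce individual-level data to shareable aggregate counts."""
    return Counter((r["exposed"], r["event"]) for r in records)

local = [{"pid": 1, "atc_code": "J01CR02", "dx": "liver"},
         {"pid": 2, "atc_code": "B01AC06", "dx": "none"}]
print(aggregate(derive(reorganize(local))))
```

Only the output of step 3 would leave the site, which is how such networks avoid sharing individual-level data.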
Implementation of Risk Management in NASA's CEV Project- Ensuring Mission Success
NASA Astrophysics Data System (ADS)
Perera, Jeevan; Holsomback, Jerry D.
2005-12-01
Most project managers know that Risk Management (RM) is essential to good project management. At NASA, standards and procedures to manage risk through a tiered approach have been developed, from the global agency-wide requirements down to a program or project's implementation. The basic methodology for NASA's risk management strategy includes processes to identify, analyze, plan, track, control, communicate and document risks. The identification, characterization, mitigation plan, and mitigation responsibilities associated with specific risks are documented to help communicate, manage, and effectuate appropriate closure. This approach helps to ensure more consistent documentation and assessment and provides a means of archiving lessons learned for future identification or mitigation activities. A new risk database and management tool was developed by NASA in 2002 and has since been used successfully to communicate, document and manage a number of diverse risks for the International Space Station, Space Shuttle, and several other NASA projects and programs, including those at the Johnson Space Center. Organizations use this database application to effectively manage and track each risk and to gain insight into impacts from other organizations' viewpoints in order to develop integrated solutions. Schedule, cost, technical and safety issues are tracked in detail through this system. Risks are tagged within the system to ensure proper review, coordination and management at the necessary management level. The database is intended as a day-to-day tool for organizations to manage their risks and elevate those issues that need coordination from above. Each risk is assigned to a managing organization and a specific risk owner, who generates mitigation plans as appropriate. In essence, the risk owner is responsible for shepherding the risk through closure. The individual who identifies a new risk is not necessarily assigned as the risk owner.
Whoever is in the best position to effectuate comprehensive closure is assigned as the risk owner. Each mitigation plan includes the specific tasks that will be conducted to decrease the likelihood of the risk occurring and/or lessen the severity of the consequences if they do occur. As each mitigation task is completed, the responsible managing organization records the completion of the task in the risk database and then re-scores the risk in light of the task's results. By keeping scores updated, a managing organization's current top risks and risk posture can be readily identified, along with the status of any risk in the system. A number of metrics measure risk-process trends from data contained in the database. This allows for trend analysis to further identify improvements to the process and assist in the management of all risks. The metrics also gauge both the effectiveness of, and compliance with, risk management requirements. The risk database is an evolving tool and will be continuously improved with capabilities requested by the NASA project community. This paper presents the basic foundations of risk management, the elements necessary for effective risk management, and the capabilities of this new risk database, along with how it is implemented to support NASA's risk management needs.
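The score-and-rescore cycle described in this record can be sketched as follows. This is an illustrative model only, not NASA's actual risk schema; the 5x5 likelihood-consequence scoring and all field names are assumptions.

```python
# Illustrative sketch (not NASA's actual database schema) of the rescoring
# cycle: each risk has an owner and mitigation tasks; completing a task
# triggers a re-score, so current top risks remain visible.
from dataclasses import dataclass, field

@dataclass
class Risk:
    title: str
    owner: str
    likelihood: int    # 1 (low) .. 5 (high), assumed scale
    consequence: int   # 1 (low) .. 5 (high), assumed scale
    tasks: list = field(default_factory=list)

    @property
    def score(self):
        # common likelihood x consequence scoring, assumed here
        return self.likelihood * self.consequence

def complete_task(risk, task, new_likelihood, new_consequence):
    """Record task completion, then re-score the risk with the task's results."""
    risk.tasks.remove(task)
    risk.likelihood, risk.consequence = new_likelihood, new_consequence

r = Risk("thruster valve leak", "propulsion org", 4, 5, tasks=["qualify seal"])
complete_task(r, "qualify seal", 2, 5)   # mitigation reduced the likelihood
print(r.score)
```

Sorting all risks by `score` after each update is what lets an organization read off its current top risks at any time.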
JPEG2000 and dissemination of cultural heritage over the Internet.
Politou, Eugenia A; Pavlidis, George P; Chamzas, Christodoulos
2004-03-01
By applying the latest technologies in image compression for managing the storage of massive image data within cultural heritage databases, and by exploiting the universality of the Internet, we are now able not only to effectively digitize, record and preserve, but also to promote the dissemination of cultural heritage. In this work we present an application of the latest image compression standard, JPEG2000, to managing and browsing image databases, focusing on the image transmission aspect rather than database management and indexing. We combine the technologies of JPEG2000 image compression with client-server socket connections and a client browser plug-in, so as to provide an all-in-one package for remote browsing of JPEG2000 compressed image databases, suitable for the effective dissemination of cultural heritage.
Configuration management program plan for Hanford site systems engineering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kellie, C.L.
This plan establishes the integrated configuration management program for the evolving technical baseline developed through the systems engineering process. This configuration management program aligns with the criteria identified in the DOE Standard, DOE-STD-1073-93. Included are specific requirements for control of the systems engineering RDD-100 database, and electronic data incorporated in the database that establishes the Hanford Site Technical Baseline.
TRENDS: The aeronautical post-test database management system
NASA Technical Reports Server (NTRS)
Bjorkman, W. S.; Bondi, M. J.
1990-01-01
TRENDS, an engineering-test database operating system developed by NASA to support rotorcraft flight tests, is described. Capabilities and characteristics of the system are presented, with examples of its use in recalling and analyzing rotorcraft flight-test data from a TRENDS database. The importance of system user-friendliness in gaining users' acceptance is stressed, as is the importance of integrating supporting narrative data with numerical data in engineering-test databases. Considerations relevant to the creation and maintenance of flight-test database are discussed and TRENDS' solutions to database management problems are described. Requirements, constraints, and other considerations which led to the system's configuration are discussed and some of the lessons learned during TRENDS' development are presented. Potential applications of TRENDS to a wide range of aeronautical and other engineering tests are identified.
DOE technology information management system database study report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Widing, M.A.; Blodgett, D.W.; Braun, M.D.
1994-11-01
To support the missions of the US Department of Energy (DOE) Special Technologies Program, Argonne National Laboratory is defining the requirements for an automated software system that will search electronic databases on technology. This report examines the work done and results to date. Argonne studied existing commercial and government sources of technology databases in five general areas: on-line services, patent database sources, government sources, aerospace technology sources, and general technology sources. First, it conducted a preliminary investigation of these sources to obtain information on the content, cost, frequency of updates, and other aspects of their databases. The Laboratory then performed detailed examinations of at least one source in each area. On this basis, Argonne recommended which databases should be incorporated in DOE's Technology Information Management System.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Browne, S.V.; Green, S.C.; Moore, K.
1994-04-01
The Netlib repository, maintained by the University of Tennessee and Oak Ridge National Laboratory, contains freely available software, documents, and databases of interest to the numerical, scientific computing, and other communities. This report includes both the Netlib User's Guide and the Netlib System Manager's Guide, and contains information about Netlib's databases, interfaces, and system implementation. The Netlib repository's databases include the Performance Database, the Conferences Database, and the NA-NET mail forwarding and Whitepages Databases. A variety of user interfaces enable users to access the Netlib repository in the manner most convenient and compatible with their networking capabilities. These interfaces include the Netlib email interface, the Xnetlib X Windows client, the netlibget command-line TCP/IP client, anonymous FTP, anonymous RCP, and gopher.
The purpose of this report is to develop a database of physiological parameters needed for understanding and evaluating performance of the APEX and SHEDS exposure/intake dose rate model used by the Environmental Protection Agency (EPA) as part of its regulatory activities. The A...
HESI EXPOSURE FACTORS DATABASE FOR AGGREGATE AND CUMULATIVE RISK ASSESSMENT
In recent years, the risk analysis community has broadened its use of complex aggregate and cumulative residential exposure models (e.g., to meet the requirements of the 1996 Food Quality Protection Act). The value of these models is their ability to incorporate a range of inp...
The chemical form specific toxicity of arsenic dictates the need for species specific quantification in order to accurately assess the risk from an exposure. The literature has begun to produce preliminary species specific databases for certain dietary sources, but a quantitativ...
Concierge: Personal Database Software for Managing Digital Research Resources
Sakai, Hiroyuki; Aoyama, Toshihiro; Yamaji, Kazutsuna; Usui, Shiro
2007-01-01
This article introduces a desktop application, named Concierge, for managing personal digital research resources. Using simple operations, it enables storage of various types of files and indexes them based on content descriptions. A key feature of the software is a high level of extensibility. By installing optional plug-ins, users can customize and extend the usability of the software based on their needs. In this paper, we also introduce a few optional plug-ins: literature management, electronic laboratory notebook, and XooNIps client plug-ins. XooNIps is a content management system developed to share digital research resources among neuroscience communities. It has been adopted as the standard database system in Japanese neuroinformatics projects. Concierge, therefore, offers comprehensive support from management of personal digital research resources to their sharing in open-access neuroinformatics databases such as XooNIps. This interaction between personal and open-access neuroinformatics databases is expected to enhance the dissemination of digital research resources. Concierge is developed as an open source project; Mac OS X and Windows XP versions have been released at the official site (http://concierge.sourceforge.jp). PMID:18974800
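The plug-in extensibility described in this record follows a common pattern: a host application keeps a registry of optional plug-ins and delegates to whichever are installed. The sketch below illustrates that pattern only; the class names and `process` interface are invented, not Concierge's actual API.

```python
# A minimal plug-in registry sketch (hypothetical names, not Concierge's API):
# the host delegates resource handling to a plug-in if one is registered,
# and falls back to the unmodified resource otherwise.
class PluginRegistry:
    def __init__(self):
        self._plugins = {}

    def register(self, name):
        """Decorator that instantiates and registers a plug-in class."""
        def wrap(cls):
            self._plugins[name] = cls()
            return cls
        return wrap

    def handle(self, name, resource):
        plugin = self._plugins.get(name)
        return plugin.process(resource) if plugin else resource

registry = PluginRegistry()

@registry.register("literature")
class LiteraturePlugin:
    def process(self, resource):
        # e.g. a literature-management plug-in marks the file as indexed
        return {**resource, "indexed": True}

print(registry.handle("literature", {"file": "paper.pdf"}))
```

The point of the design is that the host needs no changes when new plug-ins (literature management, lab notebook, database client) are installed.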
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-03
... DEPARTMENT OF COMMERCE Minority Business Development Agency Proposed Information Collection; Comment Request; Online Customer Relationship Management (CRM)/Performance Databases, the Online Phoenix... of program goals via the Online CRM/Performance Databases. The data collected through the Online CRM...
Implementing a Microcomputer Database Management System.
ERIC Educational Resources Information Center
Manock, John J.; Crater, K. Lynne
1985-01-01
Current issues in selecting, structuring, and implementing microcomputer database management systems in research administration offices are discussed, and their capabilities are illustrated with the system used by the University of North Carolina at Wilmington. Trends in microcomputer technology and their likely impact on research administration…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clancey, P.; Logg, C.
DEPOT has been developed to provide tracking for the Stanford Linear Collider (SLC) control system equipment. For each piece of equipment entered into the database, complete location, service, maintenance, modification, certification, and radiation exposure histories can be maintained. To facilitate data entry accuracy, efficiency, and consistency, barcoding technology has been used extensively. DEPOT has been an important tool in improving the reliability of the microsystems controlling SLC. This document describes the components of the DEPOT database, the elements in the database records, and the use of the supporting programs for entering data, searching the database, and producing reports from the information.
NASA Astrophysics Data System (ADS)
Viegas, F.; Malon, D.; Cranshaw, J.; Dimitrov, G.; Nowak, M.; Nairz, A.; Goossens, L.; Gallas, E.; Gamboa, C.; Wong, A.; Vinek, E.
2010-04-01
The TAG files store summary event quantities that allow a quick selection of interesting events. This data will be produced at a nominal rate of 200 Hz, and is uploaded into a relational database for access from websites and other tools. The estimated database volume is 6TB per year, making it the largest application running on the ATLAS relational databases, at CERN and at other voluntary sites. The sheer volume and high rate of production makes this application a challenge to data and resource management, in many aspects. This paper will focus on the operational challenges of this system. These include: uploading the data from files to the CERN's and remote sites' databases; distributing the TAG metadata that is essential to guide the user through event selection; controlling resource usage of the database, from the user query load to the strategy of cleaning and archiving of old TAG data.
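The quoted figures (a nominal 200 Hz event rate and roughly 6 TB of database volume per year) can be sanity-checked with simple arithmetic. The per-event record size below is an assumption chosen to illustrate the consistency of the two numbers, not a figure from the paper.

```python
# Back-of-envelope check of the quoted database volume: events at a nominal
# 200 Hz for a year, with an ASSUMED per-event TAG record size in the database.
SECONDS_PER_YEAR = 365 * 24 * 3600
rate_hz = 200
bytes_per_event = 950        # assumed average stored size of one TAG record

events = rate_hz * SECONDS_PER_YEAR
terabytes = events * bytes_per_event / 1e12
print(f"{terabytes:.1f} TB/year")
```

Under these assumptions the arithmetic lands close to the stated 6 TB per year, which suggests each TAG record occupies on the order of a kilobyte in the database.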
PACSY, a relational database management system for protein structure and chemical shift analysis
Lee, Woonghee; Yu, Wookyung; Kim, Suhkmann; Chang, Iksoo
2012-01-01
PACSY (Protein structure And Chemical Shift NMR spectroscopY) is a relational database management system that integrates information from the Protein Data Bank, the Biological Magnetic Resonance Data Bank, and the Structural Classification of Proteins database. PACSY provides three-dimensional coordinates and chemical shifts of atoms along with derived information such as torsion angles, solvent accessible surface areas, and hydrophobicity scales. PACSY consists of six relational table types linked to one another for coherence by key identification numbers. Database queries are enabled by advanced search functions supported by an RDBMS server such as MySQL or PostgreSQL. PACSY enables users to search for combinations of information from different database sources in support of their research. Two software packages, PACSY Maker for database creation and PACSY Analyzer for database analysis, are available from http://pacsy.nmrfam.wisc.edu. PMID:22903636
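The cross-source queries that PACSY enables rest on the relational join of tables linked by key identification numbers. The sketch below shows the idea with hypothetical table and column names, using SQLite in place of the MySQL or PostgreSQL server for self-containment.

```python
# A minimal sketch of a PACSY-style query joining structure and chemical-shift
# tables on a shared key. Table and column names are illustrative stand-ins.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE coordinates (atom_id INTEGER PRIMARY KEY, x REAL, y REAL, z REAL);
CREATE TABLE chemical_shifts (atom_id INTEGER, shift_ppm REAL);
""")
db.execute("INSERT INTO coordinates VALUES (1, 0.0, 1.2, -0.5)")
db.execute("INSERT INTO chemical_shifts VALUES (1, 8.31)")

# Combine information from different source tables via the key id number.
row = db.execute("""
    SELECT c.x, c.y, c.z, s.shift_ppm
    FROM coordinates c JOIN chemical_shifts s ON c.atom_id = s.atom_id
""").fetchone()
print(row)  # (0.0, 1.2, -0.5, 8.31)
```

Linking six table types by shared keys in this way is what lets a user combine, say, coordinates from one source with shifts from another in a single query.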
A UML Profile for Developing Databases that Conform to the Third Manifesto
NASA Astrophysics Data System (ADS)
Eessaar, Erki
The Third Manifesto (TTM) presents the principles of a relational database language that is free of the deficiencies and ambiguities of SQL. There are database management systems that are created according to TTM. Developers need tools that support the development of databases by using these database management systems. UML is a widely used visual modeling language. It provides a built-in extension mechanism that makes it possible to extend UML by creating profiles. In this paper, we introduce a UML profile for designing databases that correspond to the rules of TTM. We created the first version of the profile by translating existing profiles of SQL database design. After that, we extended and improved the profile. We implemented the profile by using the UML CASE system StarUML™. We present an example of using the new profile. In addition, we describe problems that occurred during the profile development.
Heterogeneous distributed databases: A case study
NASA Technical Reports Server (NTRS)
Stewart, Tracy R.; Mukkamala, Ravi
1991-01-01
Alternatives are reviewed for accessing distributed heterogeneous databases and a recommended solution is proposed. The current study is limited to the Automated Information Systems Center at the Naval Sea Combat Systems Engineering Station at Norfolk, VA. This center maintains two databases located on Digital Equipment Corporation's VAX computers running under the VMS operating system. The first database, ICMS, resides on a VAX-11/780 and has been implemented using VAX DBMS, a CODASYL-based system. The second database, CSA, resides on a VAX 6460 and has been implemented using the ORACLE relational database management system (RDBMS). Both databases are used for configuration management within the U.S. Navy. Different customer bases are supported by each database. ICMS tracks U.S. Navy ships and major systems (anti-sub, sonar, etc.). Even though the major systems on ships and submarines have totally different functions, some of the equipment within the major systems is common to both ships and submarines.
NASA Astrophysics Data System (ADS)
Maffei, A. R.; Chandler, C. L.; Work, T.; Allen, J.; Groman, R. C.; Fox, P. A.
2009-12-01
Content Management Systems (CMSs) provide powerful features that can be of use to oceanographic (and other geo-science) data managers. However, in many instances, geo-science data management offices have previously designed customized schemas for their metadata. The WHOI Ocean Informatics initiative and the NSF-funded Biological and Chemical Oceanography Data Management Office (BCO-DMO) have jointly sponsored a project to port an existing relational database containing oceanographic metadata, along with an existing interface coded in Cold Fusion middleware, to a Drupal6 Content Management System. The goal was to translate all the existing database tables, input forms, website reports, and other features present in the existing system to employ Drupal CMS features. The replacement features include Drupal content types, CCK node-reference fields, themes, RDB, SPARQL, workflow, and a number of other supporting modules. Strategic use of some Drupal6 CMS features enables three separate but complementary interfaces that provide access to oceanographic research metadata via the MySQL database: 1) a Drupal6-powered front-end; 2) a standard SQL port (used to provide a Mapserver interface to the metadata and data); and 3) a SPARQL port (feeding a new faceted search capability being developed). Future plans include the creation of science ontologies, by scientist/technologist teams, that will drive semantically enabled faceted search capabilities planned for the site. Incorporation of semantic technologies included in the future Drupal 7 core release is also anticipated. Using a public domain CMS as opposed to proprietary middleware, and taking advantage of the many features of Drupal 6 that are designed to support semantically enabled interfaces, will help prepare the BCO-DMO database for interoperability with other ecosystem databases.
Effects of exposure to malathion on blood glucose concentration: a meta-analysis.
Ramirez-Vargas, Marco Antonio; Flores-Alfaro, Eugenia; Uriostegui-Acosta, Mayrut; Alvarez-Fitz, Patricia; Parra-Rojas, Isela; Moreno-Godinez, Ma Elena
2018-02-01
Exposure to malathion (an organophosphate pesticide widely used around the world) has been associated with alterations in blood glucose concentration in animal models. However, the results are inconsistent. The aim of this meta-analysis was to evaluate whether malathion exposure can disturb the concentrations of blood glucose in exposed rats. We performed a literature search of online databases including PubMed, EBSCO, and Google Scholar and reviewed original articles that analyzed the relation between malathion exposure and glucose levels in animal models. The selection of articles was based on inclusion and exclusion criteria. The database search identified thirty-five possible articles, but only eight fulfilled our inclusion criteria, and these studies were included in the meta-analysis. The effect of malathion on blood glucose concentration showed a non-monotonic dose-response curve. In addition, pooled analysis showed that blood glucose concentrations were 3.3-fold higher in exposed rats than in the control group (95% CI, 2-5; Z = 3.9; p < 0.0001) in a random-effect model. This result suggested that alteration of glucose homeostasis is a possible mechanism of toxicity associated with exposure to malathion.
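The pooled 3.3-fold estimate above comes from a random-effect model; the standard computation is inverse-variance pooling on the log-ratio scale with a DerSimonian-Laird estimate of between-study variance. The sketch below shows that method with invented per-study values, not the data from the eight studies in this meta-analysis.

```python
# Hedged sketch of DerSimonian-Laird random-effects pooling on the log-ratio
# scale, as commonly used for a pooled fold change. Study values are invented.
import math

log_ratios = [math.log(2.5), math.log(4.0), math.log(3.0)]  # per-study ln(fold change)
variances  = [0.05, 0.08, 0.06]                             # per-study variance of ln(ratio)

w = [1 / v for v in variances]                              # fixed-effect weights
fixed = sum(wi * y for wi, y in zip(w, log_ratios)) / sum(w)
q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_ratios))  # heterogeneity Q
df = len(log_ratios) - 1
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)                               # between-study variance

w_re = [1 / (v + tau2) for v in variances]                  # random-effects weights
pooled = sum(wi * y for wi, y in zip(w_re, log_ratios)) / sum(w_re)
se = math.sqrt(1 / sum(w_re))
lo, hi = math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se)
print(f"pooled fold change {math.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

When the between-study variance estimate is zero, as with these invented inputs, the random-effects result collapses to the fixed-effect one; with heterogeneous studies the interval widens accordingly.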
Job-Related Stressors of Classical Instrumental Musicians: A Systematic Qualitative Review.
Vervainioti, A; Alexopoulos, E C
2015-12-01
Epidemiological studies among performing artists have found elevated stress levels and health effects, but the full range of stressors has scarcely been reported. We review here the existing literature on job-related stressors of classical instrumental musicians (orchestra musicians). The PubMed, Google Scholar and JSTOR databases were screened for relevant papers indexed up to August 2012. A total of 122 papers were initially identified; after exclusion of duplicates and papers not meeting the eligibility criteria, 67 articles remained for the final analysis. We identified seven categories of stressors affecting musicians in their everyday working lives: public exposure, personal hazards, repertoire, competition, job context, injury/illness, and criticism, albeit with interrelated assigned factors. The proposed categories provide a framework for future comprehensive research on the impact and management of musician stressors.
Region 7 Laboratory Information Management System
This is metadata documentation for the Region 7 Laboratory Information Management System (R7LIMS), which maintains records for the Regional Laboratory. Any laboratory analytical work performed is stored in this system, which replaces LIMS-Lite and, before that, LAST. The EPA and its contractors may use this database. The Office of Policy & Management (PLMG) Division at EPA Region 7 is the primary managing entity; contractors can access this database, but it is not accessible to the public.
Johansson, Saga; Wallander, Mari-Ann; de Abajo, Francisco J; García Rodríguez, Luis Alberto
2010-03-01
Post-launch drug safety monitoring is essential for the detection of adverse drug signals that may be missed during preclinical trials. Traditional methods of postmarketing surveillance such as spontaneous reporting have intrinsic limitations, many of which can be overcome by the additional application of structured pharmacoepidemiological approaches. However, further improvement in drug safety monitoring requires a shift towards more proactive pharmacoepidemiological methods that can detect adverse drug signals as they occur in the population. To assess the feasibility of using proactive monitoring of an electronic medical record system, in combination with an independent endpoint adjudication committee, to detect adverse events among users of selected drugs. UK General Practice Research Database (GPRD) information was used to detect acute liver disorder associated with the use of amoxicillin/clavulanic acid (hepatotoxic) or low-dose aspirin (acetylsalicylic acid [non-hepatotoxic]). Individuals newly prescribed these drugs between 1 October 2005 and 31 March 2006 were identified. Acute liver disorder cases were assessed using GPRD computer records in combination with case validation by an independent endpoint adjudication committee. Signal generation thresholds were based on the background rate of acute liver disorder in the general population. Over a 6-month period, 8148 patients newly prescribed amoxicillin/clavulanic acid and 5577 patients newly prescribed low-dose aspirin were identified. Within this cohort, searches identified 11 potential liver disorder cases from computerized records: six for amoxicillin/clavulanic acid and five for low-dose aspirin. The independent endpoint adjudication committee refined this to four potential acute liver disorder cases for whom paper-based information was requested for final case assessment. Final case assessments confirmed no cases of acute liver disorder. 
The time taken for this study was 18 months (6 months for recruitment and 12 months for data management and case validation). To reach the estimated target exposure necessary to raise or rule out a signal of concern to public health, we determined that a recruitment period 2-3 times longer than that used in this study would be required. Based on the real market uptake of six commonly used medicinal products launched between 2001 and 2006 in the UK (budesonide/eformoterol [fixed-dose combination], duloxetine, ezetimibe, metformin/rosiglitazone [fixed-dose combination], tiotropium bromide and tadalafil) the target exposure would not have been reached until the fifth year of marketing using a single database. It is feasible to set up a system that actively monitors drug safety using a healthcare database and an independent endpoint adjudication committee. However, future successful implementation will require multiple databases to be queried so that larger study populations are included. This requires further development and harmonization of international healthcare databases.
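The study's signal-generation logic (thresholds based on the background rate of acute liver disorder) can be sketched numerically. The background incidence, follow-up duration, and the Poisson criterion below are assumptions for illustration, not parameters reported by the study.

```python
# Illustrative sketch (all numbers assumed) of a signal threshold derived from
# a background rate: expected cases = background incidence x person-time, and
# a signal is raised when observed cases exceed the Poisson 95% upper bound.
import math

def poisson_upper_95(mu):
    """Smallest k whose cumulative Poisson probability under mean mu is >= 0.95."""
    k, term = 0, math.exp(-mu)
    cum = term
    while cum < 0.95:
        k += 1
        term *= mu / k
        cum += term
    return k

background_rate = 1 / 10_000   # assumed acute liver disorder rate per person-year
person_years = 8148 * 0.5      # cohort followed roughly 6 months (illustrative)
expected = background_rate * person_years
observed = 0                   # no confirmed cases after adjudication

threshold = poisson_upper_95(expected)
print(f"expected {expected:.2f} cases; signal if observed > {threshold}: "
      f"{observed > threshold}")
```

With so few expected cases, even one or two observed events stay below the threshold, which is why the authors conclude a much larger exposed population is needed to raise or rule out a signal.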
Scarselli, Alberto
2011-01-01
The recording of occupational exposure to carcinogens is a fundamental step in assessing exposure risk factors in workplaces. The aim of this paper is to describe the characteristics of the Italian register of occupational exposures to carcinogenic agents (SIREP). The core data collected in the system are: firm characteristics, worker demographics, and exposure information. Descriptive statistical analyses were performed by economic activity sector, carcinogenic agent and geographic location. Currently, the system holds records on 12,300 firms, 130,000 workers, and 250,000 exposures. The SIREP database has been set up in order to assess, control and reduce carcinogenic risk in the workplace.
NASA Astrophysics Data System (ADS)
Paprotny, Dominik; Morales-Nápoles, Oswaldo; Jonkman, Sebastiaan N.
2018-03-01
The influence of social and economic change on the consequences of natural hazards has been a matter of much interest recently. However, there is a lack of comprehensive, high-resolution data on historical changes in land use, population, or assets available to study this topic. Here, we present the Historical Analysis of Natural Hazards in Europe (HANZE) database, which contains two parts: (1) HANZE-Exposure with maps for 37 countries and territories from 1870 to 2020 in 100 m resolution and (2) HANZE-Events, a compilation of past disasters with information on dates, locations, and losses, currently limited to floods only. The database was constructed using high-resolution maps of present land use and population, a large compilation of historical statistics, and relatively simple disaggregation techniques and rule-based land use reallocation schemes. Data encompassed in HANZE allow one to "normalize" information on losses due to natural hazards by taking into account inflation as well as changes in population, production, and wealth. This database of past events currently contains 1564 records (1870-2016) of flash, river, coastal, and compound floods. The HANZE database is freely available at https://data.4tu.nl/repository/collection:HANZE.
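The "normalization" of historical losses that HANZE supports amounts to scaling a past loss by ratios of price level, population, and per-capita wealth between then and now. The function below sketches that calculation; every number in the example is invented, not drawn from the HANZE data.

```python
# A minimal sketch of loss normalization: re-express a historical flood loss
# in present-day terms using inflation, population, and wealth ratios.
# All input values below are invented for illustration.
def normalize_loss(loss_then, cpi_then, cpi_now,
                   pop_then, pop_now, wealth_pc_then, wealth_pc_now):
    """Adjust a historical loss for inflation, exposure (population) growth,
    and per-capita wealth growth in the affected area."""
    inflation = cpi_now / cpi_then
    population = pop_now / pop_then
    wealth = wealth_pc_now / wealth_pc_then
    return loss_then * inflation * population * wealth

# A hypothetical 1970 flood loss of 10 million, under assumed 2020 conditions:
print(normalize_loss(10e6, cpi_then=20, cpi_now=100,
                     pop_then=50_000, pop_now=80_000,
                     wealth_pc_then=1.0, wealth_pc_now=1.5))
```

High-resolution exposure maps like HANZE's supply the population and asset ratios per affected area, which is what makes losses from different decades comparable.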
Stam, Rianne
2014-06-01
Some of the strongest electromagnetic fields (EMF) are found in the workplace. A European Directive sets limits to workers' exposure to EMF. This review summarizes its origin and contents and compares magnetic field exposure levels in high-risk workplaces with the limits set in the revised Directive. PubMed, Scopus, grey literature databases, and websites of organizations involved in occupational exposure measurements were searched. The focus was on EMF with frequencies up to 10 MHz, which can cause stimulation of the nervous system. Selected studies had to provide individual maximum exposure levels at the workplace, either in terms of the external magnetic field strength or flux density, or as induced electric field strength or current density. Indicative action levels and the corresponding exposure limit values for magnetic fields in the revised European Directive will be higher than those in the previous version. Nevertheless, magnetic flux densities in excess of the action levels for peripheral nerve stimulation are reported for workers involved in welding, induction heating, transcranial magnetic stimulation, and magnetic resonance imaging (MRI). The corresponding health-effects exposure limit values for the electric fields in the worker's body can be exceeded for welding and MRI, but calculations for induction heating and transcranial magnetic stimulation are lacking. Since the revised European Directive conditionally exempts MRI-related activities from the exposure limits, measures to reduce exposure may be necessary for welding, induction heating, and transcranial magnetic stimulation. Since such measures can be complicated, there is a clear need for exposure databases for different workplace scenarios with significant EMF exposure, and for guidance on good practices. © The Author 2014. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
Creative Classroom Assignment Through Database Management.
ERIC Educational Resources Information Center
Shah, Vivek; Bryant, Milton
1987-01-01
The Faculty Scheduling System (FSS), a database management system designed to give administrators the ability to schedule faculty in a fast and efficient manner is described. The FSS, developed using dBASE III, requires an IBM compatible microcomputer with a minimum of 256K memory. (MLW)
A Database Management System for Interlibrary Loan.
ERIC Educational Resources Information Center
Chang, Amy
1990-01-01
Discusses the increasing complexity of dealing with interlibrary loan requests and describes a database management system for interlibrary loans used at Texas Tech University. System functions are described, including file control, records maintenance, and report generation, and the impact on staff productivity is discussed. (CLB)
DOT National Transportation Integrated Search
2006-05-01
Specific objectives of the Peer Exchange were: Discuss and exchange information about databases and other software used to support the program cycles managed by state transportation research offices. Elements of the program cycle include: ...
78 FR 23685 - Airworthiness Directives; The Boeing Company
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-22
... installing new operational software for the electrical load management system and configuration database. The..., installing a new electrical power control panel, and installing new operational software for the electrical load management system and configuration database. Since the proposed AD was issued, we have received...
Adding Hierarchical Objects to Relational Database General-Purpose XML-Based Information Managements
NASA Technical Reports Server (NTRS)
Lin, Shu-Chun; Knight, Chris; La, Tracy; Maluf, David; Bell, David; Tran, Khai Peter; Gawdiak, Yuri
2006-01-01
NETMARK is a flexible, high-throughput software system for managing, storing, and rapidly searching unstructured and semi-structured documents. NETMARK transforms such documents from their original highly complex, constantly changing, heterogeneous data formats into well-structured, common data formats using Hypertext Markup Language (HTML) and/or Extensible Markup Language (XML). The software implements an object-relational database system that combines the best practices of the relational model, utilizing Structured Query Language (SQL), with those of the object-oriented, semantic database model for creating complex data. In particular, NETMARK takes advantage of the Oracle 8i object-relational database model, using physical-address data types for very efficient keyword searches of records across both context and content. NETMARK also supports multiple international standards, such as WebDAV for drag-and-drop file management and SOAP for integrated information management using Web services. The document-organization and -searching capabilities afforded by NETMARK are likely to make this software attractive for use in disciplines as diverse as science, auditing, and law enforcement.
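The core transformation described here, semi-structured text into well-formed XML, can be sketched in a few lines. The input format and tag names below are hypothetical; NETMARK's actual parsing and Oracle-backed storage are far richer.

```python
# A small sketch of turning a loosely structured document into well-formed
# XML keyed by its field labels. Input format and tag names are hypothetical.
import xml.etree.ElementTree as ET

raw = """Title: Valve Inspection Report
Summary: No anomalies found.
Action: Close work order."""

doc = ET.Element("document")
for line in raw.splitlines():
    key, _, value = line.partition(":")
    field = ET.SubElement(doc, key.strip().lower())  # label becomes the tag
    field.text = value.strip()

xml_text = ET.tostring(doc, encoding="unicode")
print(xml_text)
```

Once documents share a common XML structure like this, keyword search across both context (which tag) and content (the text) becomes a uniform query problem.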
Implementation of an open adoption research data management system for clinical studies.
Müller, Jan; Heiss, Kirsten Ingmar; Oberhoffer, Renate
2017-07-06
Research institutions need to manage multiple studies with individual data sets, processing rules and different permissions. So far, there is no standard technology that provides an easy-to-use environment for creating databases and user interfaces for clinical trials or research studies. Therefore various software solutions are in use, ranging from custom software explicitly designed for a specific study, to cost-intensive commercial Clinical Trial Management Systems (CTMS), to very basic approaches with self-designed Microsoft® databases. The technology applied to conduct those studies varies tremendously from study to study, making it difficult to evaluate data across studies (meta-analysis) and to maintain a defined level of quality in database design, data processing, display and export. Furthermore, the systems used to collect study data are often operated redundantly alongside systems used in patient care. As a consequence, data collection in studies is inefficient and data quality may suffer from unsynchronized datasets, non-normalized database scenarios and manually executed data transfers. With OpenCampus Research we implemented an open adoption software (OAS) solution on an open source basis, which provides a standard environment for state-of-the-art research database management at low cost.
Configuration management program plan for Hanford site systems engineering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoffman, A.G.
This plan establishes the integrated configuration management program for the evolving technical baseline developed through the systems engineering process. This configuration management program aligns with the criteria identified in the DOE Standard, DOE-STD-1073-93. Included are specific requirements for control of the systems engineering RDD-100 database, and electronic data incorporated in the database that establishes the Hanford site technical baseline.
The present report describes a strategy to refine the current Cramer classification of the TTC concept using a broad database (DB) termed TTC RepDose. Cramer classes 1-3 overlap to some extent, indicating a need for a better separation of structural classes likely to be toxic, mo...
PylotDB - A Database Management, Graphing, and Analysis Tool Written in Python
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barnette, Daniel W.
2012-01-04
PylotDB, written completely in Python, provides a user interface (UI) with which to interact with, analyze, graph data from, and manage open source databases such as MySQL. The UI spares the user from needing in-depth knowledge of the database application programming interface (API). PylotDB allows the user to generate various kinds of plots from user-selected data; generate statistical information on text as well as numerical fields; back up and restore databases; compare database tables across different databases as well as across different servers; extract information from any field to create new fields; generate, edit, and delete databases, tables, and fields; generate or read into a table CSV data; and perform similar operations. Since much of the database information is brought under control of the Python computer language, PylotDB is not intended for huge databases, for which MySQL and Oracle, for example, are better suited. PylotDB is better suited for the smaller databases typically needed by a small research group. PylotDB can also be used as a learning tool for database applications in general.
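The kind of database interaction PylotDB automates, such as summarizing a user-selected numeric field, can be sketched in a few lines of Python. The snippet below uses the standard-library sqlite3 module as a self-contained stand-in for MySQL; the `runs` table and its columns are hypothetical:

```python
import sqlite3
import statistics

def column_stats(conn, table, column):
    """Return basic statistics for a numeric column, the kind of
    summary PylotDB generates on user-selected fields."""
    rows = conn.execute(f"SELECT {column} FROM {table}").fetchall()
    values = [r[0] for r in rows if r[0] is not None]
    return {
        "count": len(values),
        "mean": statistics.mean(values),
        "min": min(values),
        "max": max(values),
    }

# Hypothetical table standing in for a small research group's data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE runs (id INTEGER, runtime REAL)")
conn.executemany("INSERT INTO runs VALUES (?, ?)",
                 [(1, 10.0), (2, 12.0), (3, 14.0)])

stats = column_stats(conn, "runs", "runtime")
print(stats)  # {'count': 3, 'mean': 12.0, 'min': 10.0, 'max': 14.0}
```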
Droit, Arnaud; Hunter, Joanna M; Rouleau, Michèle; Ethier, Chantal; Picard-Cloutier, Aude; Bourgais, David; Poirier, Guy G
2007-01-01
Background In the "post-genome" era, mass spectrometry (MS) has become an important method for the analysis of proteins and the rapid advancement of this technique, in combination with other proteomics methods, results in an increasing amount of proteome data. This data must be archived and analysed using specialized bioinformatics tools. Description We herein describe "PARPs database," a data analysis and management pipeline for liquid chromatography tandem mass spectrometry (LC-MS/MS) proteomics. PARPs database is a web-based tool whose features include experiment annotation, protein database searching, protein sequence management, as well as data-mining of the peptides and proteins identified. Conclusion Using this pipeline, we have successfully identified several interactions of biological significance between PARP-1 and other proteins, namely RFC-1, 2, 3, 4 and 5. PMID:18093328
Application of China's National Forest Continuous Inventory database.
Xie, Xiaokui; Wang, Qingli; Dai, Limin; Su, Dongkai; Wang, Xinchuang; Qi, Guang; Ye, Yujing
2011-12-01
The maintenance of a timely, reliable and accurate spatial database on current forest ecosystem conditions and changes is essential to characterize and assess forest resources and support sustainable forest management. Information for such a database can be obtained only through a continuous forest inventory. The National Forest Continuous Inventory (NFCI) is the first level of China's three-tiered inventory system. The NFCI is administered by the State Forestry Administration; data are acquired by five inventory institutions around the country. Several important components of the database include land type, forest classification and age-class/age-group. The NFCI database in China is constructed based on 5-year inventory periods, resulting in some of the data not being timely when reports are issued. To address this problem, a forest growth simulation model has been developed to update the database for years between the periodic inventories. In order to aid in forest plan design and management, a three-dimensional virtual reality system of forest landscapes for selected units in the database (compartment or sub-compartment) has also been developed based on Virtual Reality Modeling Language. In addition, a transparent internet publishing system for a spatial database based on open source WebGIS (UMN Map Server) has been designed and utilized to enhance public understanding and encourage free participation of interested parties in the development, implementation, and planning of sustainable forest management.
Architecture for biomedical multimedia information delivery on the World Wide Web
NASA Astrophysics Data System (ADS)
Long, L. Rodney; Goh, Gin-Hua; Neve, Leif; Thoma, George R.
1997-10-01
Research engineers at the National Library of Medicine are building a prototype system for the delivery of multimedia biomedical information on the World Wide Web. This paper discusses the architecture and design considerations for the system, which will be used initially to make images and text from the third National Health and Nutrition Examination Survey (NHANES) publicly available. We categorized our analysis as follows: (1) fundamental software tools: we analyzed trade-offs among use of conventional HTML/CGI, X Window Broadway, and Java; (2) image delivery: we examined the use of unconventional TCP transmission methods; (3) database manager and database design: we discuss the capabilities and planned use of the Informix object-relational database manager and the planned schema for the NHANES database; (4) storage requirements for our Sun server; (5) user interface considerations; (6) the compatibility of the system with other standard research and analysis tools; (7) image display: we discuss considerations for consistent image display for end users. Finally, we discuss the scalability of the system in terms of incorporating larger or more databases of similar data, and the extendibility of the system for supporting content-based retrieval of biomedical images. The system prototype is called the Web-based Medical Information Retrieval System. An early version was built as a Java applet and tested on Unix, PC, and Macintosh platforms. This prototype used the MiniSQL database manager to do text queries on a small database of records of participants in the second NHANES survey. The full records and associated x-ray images were retrievable and displayable on a standard Web browser. A second version has now been built, also a Java applet, using the MySQL database manager.
Choosing the Right Database Management Program.
ERIC Educational Resources Information Center
Vockell, Edward L.; Kopenec, Donald
1989-01-01
Provides a comparison of four database management programs commonly used in schools: AppleWorks, the DOS 3.3 and ProDOS versions of PFS, and MECC's Data Handler. Topics discussed include information storage, spelling checkers, editing functions, search strategies, graphs, printout formats, library applications, and HyperCard. (LRW)
ERIC Educational Resources Information Center
Mills, Myron L.
1988-01-01
A system developed for more efficient evaluation of graduate medical students' progress uses numerical scoring and a microcomputer database management system as an alternative to manual methods to produce accurate, objective, and meaningful summaries of resident evaluations. (Author/MSE)
Integration of Information Retrieval and Database Management Systems.
ERIC Educational Resources Information Center
Deogun, Jitender S.; Raghavan, Vijay V.
1988-01-01
Discusses the motivation for integrating information retrieval and database management systems, and proposes a probabilistic retrieval model in which records in a file may be composed of attributes (formatted data items) and descriptors (content indicators). The details and resolutions of difficulties involved in integrating such systems are…
Stephens, Susie M; Chen, Jake Y; Davidson, Marcel G; Thomas, Shiby; Trute, Barry M
2005-01-01
As database management systems expand their array of analytical functionality, they become powerful research engines for biomedical data analysis and drug discovery. Databases can hold most of the data types commonly required in life sciences and consequently can be used as flexible platforms for the implementation of knowledgebases. Performing data analysis in the database simplifies data management by minimizing the movement of data from disks to memory, allowing pre-filtering and post-processing of datasets, and enabling data to remain in a secure, highly available environment. This article describes the Oracle Database 10g implementation of BLAST and Regular Expression Searches and provides case studies of their usage in bioinformatics. http://www.oracle.com/technology/software/index.html.
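Oracle 10g exposes regular-expression matching directly in SQL (e.g. via REGEXP_LIKE). As a self-contained illustration of in-database pattern searching, the sketch below registers a REGEXP function with Python's standard-library sqlite3 as a stand-in; the `proteins` table and the motif pattern are hypothetical:

```python
import re
import sqlite3

conn = sqlite3.connect(":memory:")
# sqlite3 has no built-in REGEXP; registering a Python function makes
# the SQL syntax `column REGEXP pattern` work (SQLite rewrites
# `X REGEXP Y` as regexp(Y, X), i.e. pattern first, value second),
# mimicking in-database regular-expression search.
conn.create_function("REGEXP", 2,
                     lambda pat, s: re.search(pat, s) is not None)

conn.execute("CREATE TABLE proteins (name TEXT, sequence TEXT)")
conn.executemany("INSERT INTO proteins VALUES (?, ?)", [
    ("p1", "MKTAYIAKQR"),
    ("p2", "GGGSLINKER"),
])

# Find sequences containing a glycine-rich motif (hypothetical pattern).
hits = conn.execute(
    "SELECT name FROM proteins WHERE sequence REGEXP ?", ("G{3}S",)
).fetchall()
print(hits)  # [('p2',)]
```

Keeping the match inside the query, as in the Oracle implementation, avoids shipping whole tables to the client just to filter them.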
EasyKSORD: A Platform of Keyword Search Over Relational Databases
NASA Astrophysics Data System (ADS)
Peng, Zhaohui; Li, Jing; Wang, Shan
Keyword Search Over Relational Databases (KSORD) enables casual users to use keyword queries (a set of keywords) to search relational databases just like searching the Web, without any knowledge of the database schema or any need of writing SQL queries. Based on our previous work, we design and implement a novel KSORD platform named EasyKSORD for users and system administrators to use and manage different KSORD systems in a novel and simple manner. EasyKSORD supports advanced queries, efficient data-graph-based search engines, multiform result presentations, and system logging and analysis. Through EasyKSORD, users can search relational databases easily and read search results conveniently, and system administrators can easily monitor and analyze the operations of KSORD and manage KSORD systems much better.
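A minimal sketch of the idea behind KSORD, matching a set of keywords against relational data without SQL or schema knowledge, is shown below over an in-memory SQLite database. Real KSORD engines such as EasyKSORD use far more efficient data-graph-based search and result ranking; the table and rows here are hypothetical:

```python
import sqlite3

def keyword_search(conn, keywords):
    """Naive keyword search: return every row, from every table, whose
    concatenated text contains all of the given keywords."""
    results = []
    tables = [r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")]
    for table in tables:
        for row in conn.execute(f"SELECT * FROM {table}"):
            text = " ".join(str(v) for v in row).lower()
            if all(k.lower() in text for k in keywords):
                results.append((table, row))
    return results

# Hypothetical data: a casual user searches without knowing the schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE papers (title TEXT, author TEXT)")
conn.executemany("INSERT INTO papers VALUES (?, ?)", [
    ("Keyword search over databases", "Peng"),
    ("Streamflow analysis", "Lee"),
])
matches = keyword_search(conn, ["keyword", "peng"])
print(matches)  # [('papers', ('Keyword search over databases', 'Peng'))]
```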
Advanced Query Formulation in Deductive Databases.
ERIC Educational Resources Information Center
Niemi, Timo; Jarvelin, Kalervo
1992-01-01
Discusses deductive databases and database management systems (DBMS) and introduces a framework for advanced query formulation for end users. Recursive processing is described, a sample extensional database is presented, query types are explained, and criteria for advanced query formulation from the end user's viewpoint are examined. (31…
Analysis of flash flood parameters and human impacts in the US from 2006 to 2012
NASA Astrophysics Data System (ADS)
Špitalar, Maruša; Gourley, Jonathan J.; Lutoff, Celine; Kirstetter, Pierre-Emmanuel; Brilly, Mitja; Carr, Nicholas
2014-11-01
Several different factors external to the natural hazard of flash flooding can contribute to the type and magnitude of their resulting damages. Human exposure, vulnerability, fatality and injury rates can be minimized by identifying and then mitigating the causative factors for human impacts. A database of flash flooding was used for statistical analysis of human impacts across the U.S. A total of 21,549 flash flood events were analyzed during a 6-year period from October 2006 to 2012. Based on the information available in the database, physical parameters were introduced and then correlated to the reported human impacts. Probability density functions (PDFs) of the frequency of flash flood events, and PDFs of occurrences weighted by the number of injuries and fatalities, were used to describe the influence of each parameter. The factors that emerged as the most influential on human impacts are short flood durations, small catchment sizes in rural areas, vehicles, and nocturnal events with low visibility. Analyzing and correlating a diverse range of parameters to human impacts gives us important insights into what contributes to fatalities and injuries, and raises further questions on how to manage them.
Pan, Shiyang; Mu, Yuan; Wang, Hong; Wang, Tong; Huang, Peijun; Ma, Jianfeng; Jiang, Li; Zhang, Jie; Gu, Bing; Yi, Lujiang
2010-04-01
To meet the needs of managing medical case information and biospecimens simultaneously, we developed a novel medical case information system integrated with biospecimen management. The database, established with MS SQL Server 2000, covered basic information, clinical diagnosis, imaging diagnosis, pathological diagnosis and clinical treatment of patients; physicochemical properties, inventory management and laboratory analysis of biospecimens; and user logs and data maintenance. The client application, developed in Visual C++ 6.0 and based on a Client/Server model, was used to implement medical case and biospecimen management. The system can perform input, browsing, query and summary of cases and related biospecimen information, and can automatically synthesize case records from the database. It supports management not only of long-term follow-up of individuals, but also of grouped cases organized according to the aim of a research project. The system can improve the efficiency and quality of clinical research when biospecimens are used in a coordinated way. It realizes synthesized and dynamic management of medical cases and biospecimens, and may be considered a new management platform.
In recent years, the risk analysis community has broadened its use of complex aggregate and cumulative residential exposure models (e.g., to meet the requirements of the 1996 Food Quality Protection Act). The value of these models is their ability to incorporate a range of input...
The purpose of this report is to assess the application of tools to community-level assessments of exposure, health and the environment. Various tools and datasets provided different types of information, such as on health effects, chemical types and volumes, facility locations a...
NASA Astrophysics Data System (ADS)
Severson, R. L.; Peng, R. D.; Anderson, G. B.
2017-12-01
There is substantial evidence that extreme precipitation and flooding are serious threats to public health and safety. These threats are predicted to increase with climate change. Epidemiological studies investigating the health effects of these events vary in the methods used to characterize exposure. Here, we compare two sources of precipitation data (National Oceanic and Atmospheric Administration (NOAA) station-based and North American Land Data Assimilation Systems (NLDAS-2) Reanalysis data-based) for estimating exposure to extreme precipitation and two sources of flooding data, based on United States Geological Survey (USGS) streamflow gages and the NOAA Storm Events database. We investigate associations between each of the four exposure metrics and short-term risk of four causes of mortality (accidental, respiratory-related, cardiovascular-related, and all-cause) in the United States from 1987 through 2005. Average daily precipitation values from the two precipitation data sources were moderately correlated (Spearman's rho = 0.74); however, values from the two data sources were less correlated when comparing binary metrics of exposure to extreme precipitation days (Jaccard index (J) = 0.35). Binary metrics of daily flood exposure were poorly correlated between the two flood data sources (Spearman's rho = 0.07; J = 0.05). There was little correlation between extreme precipitation exposure and flood exposure in study communities. We did not observe evidence of a positive association between any of the four exposure metrics and risk of any of the four mortality outcomes considered. Our results suggest, due to the observed lack of agreement between different extreme precipitation and flood metrics, that exposure to extreme precipitation may not serve as an effective surrogate for exposures related to flooding. 
Furthermore, it is possible that extreme precipitation and flood exposures may often be too localized to allow accurate exposure assessment at the community level for epidemiological studies.
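The Jaccard index used in the study to compare binary exposure metrics is simply the ratio of days flagged by both data sources to days flagged by either. A small Python sketch with hypothetical daily flags:

```python
def jaccard(a, b):
    """Jaccard index between two binary exposure series:
    |intersection| / |union| of days flagged as exposed."""
    both = sum(1 for x, y in zip(a, b) if x and y)
    either = sum(1 for x, y in zip(a, b) if x or y)
    return both / either if either else 0.0

# Hypothetical daily flags from two data sources (1 = extreme/flood day).
station = [1, 0, 1, 0, 0, 1]
reanalysis = [1, 0, 0, 1, 0, 1]
print(jaccard(station, reanalysis))  # 0.5
```

The low J values reported above (0.35 for precipitation, 0.05 for floods) mean the two sources rarely flag the same days, which is the basis of the paper's caution about surrogate exposures.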
Role of the lower esophageal sphincter on esophageal acid exposure - a review of over 2000 patients.
Tsuboi, Kazuto; Hoshino, Masato; Sundaram, Abhishek; Yano, Fumiaki; Mittal, Sumeet K
2012-01-01
Three lower esophageal sphincter (LES) characteristics associated with gastro-esophageal reflux disease (GERD) are LES pressure ≤ 6 mmHg, abdominal length (AL) < 1 cm and overall length (OL) < 2 cm. The objective of this study was to validate this relationship and evaluate the extent of impact various LES characteristics have on the degree of distal esophageal acid exposure. A retrospective review of a prospectively maintained database identified patients who underwent esophageal manometry and pH studies at Creighton University Medical Center between 1984 and 2008. Patients with esophageal body dysmotility, prior foregut surgery, missing data, no documented symptoms or no pH study were excluded. Study subjects were categorized as follows: (1) normal LES (N-LES): patients with LES pressure of 6-26 mmHg, AL ≥ 1.0 cm and OL ≥ 2 cm; (2) incompetent LES (Inc-LES): patients with LES pressure < 6.0 mmHg or AL < 1 cm or OL < 2 cm; and (3) hypertensive LES (HTN-LES): patients with LES pressure > 26.0 mmHg with AL ≥ 1 cm and OL ≥ 2 cm. The DeMeester score was used to compare differences in acid exposure between the groups. Two thousand and twenty patients satisfied the study criteria. Distal esophageal acid exposure as reflected by the DeMeester score in patients with Inc-LES (median = 20.05) was significantly higher than in patients with an N-LES (median = 9.5), which in turn was significantly higher than in patients with an HTN-LES. Increasing LES pressure and AL provided protection against acid exposure in a graded fashion. An increasing number of inadequate LES characteristics was associated with an increase both in the percentage of patients with an abnormal DeMeester score and in the degree of acid exposure. LES pressure (≤ 6 mmHg) and AL (< 1 cm) are associated with increased lower esophageal acid exposure, and need to be addressed for definitive management of GERD.
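The three-way categorization described in the abstract can be expressed as a small decision function. The cutoffs below are reconstructed from the abstract (the ≤/≥ signs are garbled in some renderings of it), so treat this as an illustrative sketch rather than the study's actual code:

```python
def classify_les(pressure_mmhg, abdominal_len_cm, overall_len_cm):
    """Categorize a lower esophageal sphincter as incompetent,
    hypertensive, or normal, using cutoffs reconstructed from the
    abstract of Tsuboi et al."""
    if pressure_mmhg < 6.0 or abdominal_len_cm < 1.0 or overall_len_cm < 2.0:
        return "Inc-LES"  # incompetent: any characteristic inadequate
    if pressure_mmhg > 26.0:
        return "HTN-LES"  # hypertensive: high pressure, adequate lengths
    return "N-LES"        # normal: pressure 6-26 mmHg, adequate lengths

print(classify_les(15.0, 1.5, 2.5))  # N-LES
print(classify_les(4.0, 1.5, 2.5))   # Inc-LES
print(classify_les(30.0, 1.5, 2.5))  # HTN-LES
```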
A survey of commercial object-oriented database management systems
NASA Technical Reports Server (NTRS)
Atkins, John
1992-01-01
The object-oriented data model is the culmination of over thirty years of database research. Initially, database research focused on the need to provide information in a consistent and efficient manner to the business community. Early data models such as the hierarchical model and the network model met the goal of consistent and efficient access to data and were substantial improvements over simple file mechanisms for storing and accessing data. However, these models required highly skilled programmers to provide access to the data. Consequently, in the early 70's E.F. Codd, an IBM research computer scientist, proposed a new data model based on the simple mathematical notion of the relation. This model is known as the Relational Model. In the relational model, data is represented in flat tables (or relations) which have no physical or internal links between them. The simplicity of this model fostered the development of powerful but relatively simple query languages that now made data directly accessible to the general database user. Except for large, multi-user database systems, a database professional was in general no longer necessary. Database professionals found that traditional data in the form of character data, dates, and numeric data were easily represented and managed via the relational model. Commercial relational database management systems proliferated and performance of relational databases improved dramatically. However, there was a growing community of potential database users whose needs were not met by the relational model. These users needed to store data with data types not available in the relational model and required a far richer modelling environment than that provided by the relational model. Indeed, the complexity of the objects to be represented in the model mandated a new approach to database technology. The Object-Oriented Model was the result.
Building an Ontology-driven Database for Clinical Immune Research
Ma, Jingming
2006-01-01
Clinical research on immune responses usually generates a huge amount of biomedical testing data over a certain period of time. User-friendly data management systems based on relational databases help immunologists and clinicians fully manage these data. On the other hand, the same biological assays, such as ELISPOT and flow cytometric assays, are involved in immunological experiments regardless of study purpose. The reuse of biological knowledge is one of the driving forces behind ontology-driven data management. An ontology-driven database will therefore help to handle different clinical immune research studies and help immunologists and clinicians easily understand each other's immunological data. We discuss some outlines for building an ontology-driven data management system for clinical immune research (ODMim). PMID:17238637
Comprehensive European dietary exposure model (CEDEM) for food additives.
Tennant, David R
2016-05-01
European methods for assessing dietary exposures to nutrients, additives and other substances in food are limited by the availability of detailed food consumption data for all member states. A proposed comprehensive European dietary exposure model (CEDEM) applies summary data published by the European Food Safety Authority (EFSA) in a deterministic model based on an algorithm from the EFSA intake method for food additives. The proposed approach can reproduce the estimates of food additive exposure provided in previous EFSA scientific opinions that were based on the full European food consumption database.
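The general shape of such a deterministic additive-intake algorithm, summing consumption times use level across food categories and scaling by body weight, can be sketched as follows. The food categories, use levels and body weight are hypothetical, and this is a simplification, not the actual CEDEM equations:

```python
def dietary_exposure(consumption_g_per_day, use_level_mg_per_kg_food,
                     body_weight_kg):
    """Deterministic additive exposure in mg/kg bw/day: sum consumption
    times use level over food categories, then scale by body weight."""
    total_mg_per_day = sum(
        consumption_g_per_day[food] * use_level_mg_per_kg_food[food] / 1000.0
        for food in consumption_g_per_day
    )
    return total_mg_per_day / body_weight_kg

# Hypothetical consumption amounts and maximum use levels for one additive.
consumption = {"soft drinks": 400.0, "desserts": 150.0}  # g food per day
use_levels = {"soft drinks": 300.0, "desserts": 500.0}   # mg additive per kg food
exposure = dietary_exposure(consumption, use_levels, 70.0)
print(round(exposure, 3))  # 2.786 mg/kg bw/day
```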
The Majorana Parts Tracking Database
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abgrall, N.; Aguayo, E.; Avignone, F. T.
2015-04-01
The MAJORANA DEMONSTRATOR is an ultra-low background physics experiment searching for the neutrinoless double beta decay of 76Ge. The MAJORANA Parts Tracking Database is used to record the history of components used in the construction of the DEMONSTRATOR. Transportation, storage, and processes undergone by parts such as machining or cleaning are linked to part records. Tracking parts provides a great logistics benefit and an important quality assurance reference during construction. In addition, the location history of parts provides an estimate of their exposure to cosmic radiation. A web application for data entry and a radiation exposure calculator have been developed as tools for achieving the extreme radiopurity required for this rare decay search.
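Estimating cosmic-ray exposure from a location history reduces to integrating an exposure rate over the time a part spends at each location. The rates below are invented for illustration; the real calculator draws on measured cosmogenic activation rates and the parts database's recorded transport and storage history:

```python
# Hypothetical exposure rates (arbitrary units per day) by location class.
RATE_PER_DAY = {"surface": 1.0, "shallow": 0.1, "underground": 0.0}

def cosmic_exposure(history):
    """Integrate exposure over a part's location history,
    given as (location, days) pairs."""
    return sum(RATE_PER_DAY[location] * days for location, days in history)

# A part machined on the surface, stored shallow, then deployed underground:
dose = cosmic_exposure([("surface", 30), ("shallow", 10), ("underground", 365)])
print(dose)  # 31.0
```

Minimizing the surface term is what drives the logistics decisions the tracking database supports.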
Gale, Sara; Wilson, Jessica C; Chia, Jenny; Trinh, Huong; Tuckwell, Katie; Collinson, Neil; Dimonaco, Sophie; Jick, Susan; Meier, Christoph; Mohan, Shalini V; Sarsour, Khaled
2018-05-11
Treatment of giant cell arteritis (GCA) involves immediate initiation of high-dose glucocorticoid therapy with slow tapering of the dose over many months. Chronic exposure to glucocorticoids is associated with serious comorbidities. The objective of this analysis was to determine the glucocorticoid exposure and risk of glucocorticoid-related adverse events (AEs) in real-world patients with GCA. Data from the Truven Healthcare MarketScan® database (from January 1, 2000, to June 30, 2015) and the Clinical Practice Research Datalink (CPRD; from January 1, 1995, to August 31, 2013) were used to retrospectively analyze patients aged ≥ 50 years with GCA in the USA and UK, respectively. Outcomes included oral glucocorticoid use (cumulative prednisone-equivalent exposure), glucocorticoid-related AEs and the association of AE risk with glucocorticoid exposure over 52 weeks. Of the 4804 patients in the US MarketScan database and 3973 patients in the UK CPRD database included, 71.3 and 74.6% were women and mean age was 73.4 and 73.0 years, respectively. Median starting glucocorticoid dose and cumulative glucocorticoid dose at 52 weeks were 20-50 mg/day and 4000-4800 mg, respectively. The most frequent glucocorticoid-related AEs were hypertension and eye, bone health, and glucose tolerance conditions. In the first year after diagnosis, the likelihood of any glucocorticoid-related AE was significantly increased for each 1 g increase in cumulative glucocorticoid dose in the US and UK cohorts (odds ratio [95% CI], 1.170 [1.063, 1.287] and 1.06 [1.03, 1.09], respectively; P < 0.05 for both). Similar trends were observed for the risk of glucocorticoid-related AEs over full follow-up (mean, USA: 3.9 years; UK: 6.3 years). In real-world patients with GCA, increased cumulative glucocorticoid exposure was associated with an increased risk of glucocorticoid-related AEs. Funding: F. Hoffmann-La Roche Ltd. A plain language summary is available for this article.
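To see what an odds ratio of 1.170 per 1 g of cumulative dose implies, one can extrapolate multiplicatively on the odds scale. This is a simplification that assumes the per-gram effect is constant, which the study does not claim:

```python
def relative_odds(cumulative_dose_g, or_per_gram=1.170):
    """Relative odds of a glucocorticoid-related AE versus zero
    cumulative dose, extrapolating the US-cohort odds ratio of
    1.170 per 1 g multiplicatively (an illustrative simplification)."""
    return or_per_gram ** cumulative_dose_g

# A patient reaching roughly the median first-year cumulative dose (~4 g):
print(round(relative_odds(4.0), 2))  # 1.87, i.e. ~87% higher odds
```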
Hardware Removal in Craniomaxillofacial Trauma
Cahill, Thomas J.; Gandhi, Rikesh; Allori, Alexander C.; Marcus, Jeffrey R.; Powers, David; Erdmann, Detlev; Hollenbeck, Scott T.; Levinson, Howard
2015-01-01
Background Craniomaxillofacial (CMF) fractures are typically treated with open reduction and internal fixation. Open reduction and internal fixation can be complicated by hardware exposure or infection. The literature often does not differentiate between these 2 entities; so for this study, we have considered all hardware exposures as hardware infections. Approximately 5% of adults with CMF trauma are thought to develop hardware infections. Management consists of either removing the hardware versus leaving it in situ. The optimal approach has not been investigated. Thus, a systematic review of the literature was undertaken and a resultant evidence-based approach to the treatment and management of CMF hardware infections was devised. Materials and Methods A comprehensive search of journal articles was performed in parallel using MEDLINE, Web of Science, and ScienceDirect electronic databases. Keywords and phrases used were maxillofacial injuries; facial bones; wounds and injuries; fracture fixation, internal; wound infection; and infection. Our search yielded 529 articles. To focus on CMF fractures with hardware infections, the full text of English-language articles was reviewed to identify articles focusing on the evaluation and management of infected hardware in CMF trauma. Each article’s reference list was manually reviewed and citation analysis performed to identify articles missed by the search strategy. There were 259 articles that met the full inclusion criteria and form the basis of this systematic review. The articles were rated based on the level of evidence. There were 81 grade II articles included in the meta-analysis. Results Our meta-analysis revealed that 7503 patients were treated with hardware for CMF fractures in the 81 grade II articles. Hardware infection occurred in 510 (6.8%) of these patients.
Of those infections, hardware removal occurred in 264 (51.8%) patients; hardware was left in place in 166 (32.6%) patients; and in 80 (15.6%) cases, there was no report as to hardware management. Finally, our review revealed that there were no reported differences in outcomes between groups. Conclusions Management of CMF hardware infections should be performed in a sequential and consistent manner to optimize outcome. An evidence-based algorithm for management of CMF hardware infections based on this critical review of the literature is presented and discussed. PMID:25393499
Outdoor work and solar radiation exposure: Evaluation method for epidemiological studies.
Modenese, Alberto; Bisegna, Fabio; Borra, Massimo; Grandi, Carlo; Gugliermetti, Franco; Militello, Andrea; Gobba, Fabriziomaria
The health risk related to an excessive exposure to solar radiation (SR) is well known. The Sun represents the main exposure source for all the frequency bands of optical radiation, that is, the part of the electromagnetic spectrum ranging between 100 nm and 1 mm, including infrared (IR), ultraviolet (UV) and visible radiation. According to recent studies, outdoor workers have a relevant exposure to SR, but few studies available in the scientific literature have attempted to retrace a detailed history of individual exposure. We propose a new method for the evaluation of SR cumulative exposure both during work and leisure time, integrating subjective and objective data. The former is collected by means of an interviewer-administered questionnaire. The latter is available through Internet databases for many geographical regions and through individual exposure measurements. The data is integrated into a mathematical algorithm in order to obtain an estimate of the total amount of SR each subject has been exposed to during his or her life. The questionnaire has been tested on 58 volunteer subjects. Environmental exposure data from online databases has been collected for 3 different places in Italy in 2012. Individual exposure has been measured by electronic UV dosimeter in 6 fishermen. A mathematical algorithm integrating subjective and objective data has been elaborated. The method proposed may be used in epidemiological studies to evaluate specific correlations with biological effects of SR and to weigh the role of the personal and environmental factors that may increase or reduce SR exposure. Med Pr 2016;67(5):577-587. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.
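The paper's algorithm combines questionnaire-derived time outdoors with ambient dose rates from online databases and a personal exposure factor. A simplified sketch, with invented field names and numbers:

```python
def cumulative_uv_dose(periods):
    """Cumulative solar UV dose over life periods, each combining
    questionnaire data (hours outdoors), an ambient dose rate from an
    online database, and a personal exposure factor (posture, shade,
    protection).  All field names and values here are illustrative."""
    return sum(p["hours_outdoors"] * p["ambient_rate"] * p["exposure_factor"]
               for p in periods)

periods = [
    {"hours_outdoors": 2000, "ambient_rate": 50.0, "exposure_factor": 0.1},  # work
    {"hours_outdoors": 500, "ambient_rate": 60.0, "exposure_factor": 0.3},   # leisure
]
total = cumulative_uv_dose(periods)
print(total)
```

Summing such terms over all work and leisure periods of a subject's life yields the individual cumulative estimate the method is built around.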
Do Librarians Really Do That? Or Providing Custom, Fee-Based Services.
ERIC Educational Resources Information Center
Whitmore, Susan; Heekin, Janet
This paper describes some of the fee-based, custom services provided by National Institutes of Health (NIH) Library to NIH staff, including knowledge management, clinical liaisons, specialized database searching, bibliographic database development, Web resource guide development, and journal management. The first section discusses selecting the…
Effects of Long-term Soil and Crop Management on Soil Hydraulic Properties for Claypan Soils
USDA-ARS?s Scientific Manuscript database
Regional and national soil maps and associated databases of soil properties have been developed to help land managers make decisions based on soil characteristics. Hydrologic modelers also utilize soil hydraulic properties provided in these databases, in which soil characterization is based on avera...
A Database Management Assessment Instrument
ERIC Educational Resources Information Center
Landry, Jeffrey P.; Pardue, J. Harold; Daigle, Roy; Longenecker, Herbert E., Jr.
2013-01-01
This paper describes an instrument designed for assessing learning outcomes in data management. In addition to assessment of student learning and ABET outcomes, we have also found the instrument to be effective for determining database placement of incoming information systems (IS) graduate students. Each of these three uses is discussed in this…
Managing the world's largest and most complex freshwater ecosystem, the Laurentian Great Lakes, requires a spatially hierarchical, basin-wide database of ecological and socioeconomic information that is comparable across the region. To meet this need, we developed a hierarchi...
Two Student Self-Management Techniques Applied to Data-Based Program Modification.
ERIC Educational Resources Information Center
Wesson, Caren
Two student self-management techniques, student charting and student selection of instructional activities, were applied to ongoing data-based program modification. Forty-two elementary school resource room students were assigned randomly (within teacher) to one of three treatment conditions: Teacher Chart-Teacher Select Instructional Activities…
2002-01-01
to the OODBMS approach. The ORDBMS approach produced such research prototypes as Postgres [155], and Starburst [67] and commercial products such as...Kemnitz. The POSTGRES Next-Generation Database Management System. Communications of the ACM, 34(10):78–92, 1991. [156] Michael Stonebraker and Dorothy
2017-01-01
each change and its implementation status as well as supporting the audit of products to verify conformance to requirements. Through these change...management process for modifying DSAID aligns with information technology and project management industry standards. GAO reviewed DOD documents, and...Acknowledgments 32 Related GAO Products 33 Tables Table 1: Roles and Access Rights for Users of the Defense Sexual Assault Incident Database (DSAID
Integration and management of massive remote-sensing data based on GeoSOT subdivision model
NASA Astrophysics Data System (ADS)
Li, Shuang; Cheng, Chengqi; Chen, Bo; Meng, Li
2016-07-01
Owing to the rapid development of earth observation technology, the volume of spatial information is growing rapidly; improving query and retrieval speed over large, rich data sources is therefore an urgent problem for remote-sensing data management systems. A global subdivision model, the geographic coordinate subdivision grid with one-dimensional integer coding on a 2^n-tree, which we propose as a solution, has been used by data management organizations. However, because a spatial object may cover several grids, considerable data redundancy occurs when data are stored in relational databases. To solve this redundancy problem, we combined the subdivision model with a spatial array database containing an inverted index and propose an improved approach for integrating and managing massive remote-sensing data. By adding a spatial code column in array format to a database, spatial information in remote-sensing metadata can be stored and logically subdivided. We implemented our method in a Kingbase Enterprise Server database system and compared the results with the Oracle platform by simulating worldwide image data. Experimental results showed that our approach outperformed Oracle in data integration and in time and space efficiency. Our approach also offers an efficient storage management system for existing storage centers and management systems.
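The redundancy-avoiding layout described in this abstract, image metadata stored once with an inverted index from subdivision grid codes to images, can be sketched in a few lines. This is an illustrative toy, not the paper's Kingbase implementation: SQLite stands in for the spatial array database, and all table and column names are invented.

```python
import sqlite3

# Toy inverted index (invented schema): each image record is stored once,
# and grid_index maps every GeoSOT-style grid code to the images covering it,
# so an image spanning several grids is not duplicated per grid row.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE image (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE grid_index (grid_code INTEGER, image_id INTEGER REFERENCES image(id));
CREATE INDEX idx_grid ON grid_index (grid_code);
""")

def register_image(name, grid_codes):
    """Store the image metadata once and index every grid cell it covers."""
    cur = conn.execute("INSERT INTO image (name) VALUES (?)", (name,))
    conn.executemany("INSERT INTO grid_index VALUES (?, ?)",
                     [(g, cur.lastrowid) for g in grid_codes])

def images_covering(grid_code):
    """Look up images through the inverted index; no duplicated metadata rows."""
    rows = conn.execute(
        "SELECT image.name FROM image JOIN grid_index ON image.id = grid_index.image_id"
        " WHERE grid_index.grid_code = ? ORDER BY image.id", (grid_code,))
    return [r[0] for r in rows]

register_image("scene_a.tif", [101, 102, 103])  # spans three grid cells
register_image("scene_b.tif", [103, 104])
```

A lookup such as `images_covering(103)` then returns both scenes while each scene's metadata exists in exactly one row.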
Development and Operation of a Database Machine for Online Access and Update of a Large Database.
ERIC Educational Resources Information Center
Rush, James E.
1980-01-01
Reviews the development of a fault tolerant database processor system which replaced OCLC's conventional file system. A general introduction to database management systems and the operating environment is followed by a description of the hardware selection, software processes, and system characteristics. (SW)
ERIC Educational Resources Information Center
Moore, Pam
2010-01-01
The Internet and electronic commerce (e-commerce) generate lots of data. Data must be stored, organized, and managed. Database administrators, or DBAs, work with database software to find ways to do this. They identify user needs, set up computer databases, and test systems. They ensure that systems perform as they should and add people to the…
Tourism through Travel Club: A Database Project
ERIC Educational Resources Information Center
Pratt, Renée M. E.; Smatt, Cindi T.; Wynn, Donald E.
2017-01-01
This applied database exercise utilizes a scenario-based case study to teach the basics of Microsoft Access and database management in introduction to information systems and introduction to database course. The case includes background information on a start-up business (i.e., Carol's Travel Club), description of functional business requirements,…
An Improved Database System for Program Assessment
ERIC Educational Resources Information Center
Haga, Wayne; Morris, Gerard; Morrell, Joseph S.
2011-01-01
This research paper presents a database management system for tracking course assessment data and reporting related outcomes for program assessment. It improves on a database system previously presented by the authors and in use for two years. The database system presented is specific to assessment for ABET (Accreditation Board for Engineering and…
John F. Caratti
2006-01-01
The FIREMON database software allows users to enter data, store, analyze, and summarize plot data, photos, and related documents. The FIREMON database software consists of a Java application and a Microsoft® Access database. The Java application provides the user interface with FIREMON data through data entry forms, data summary reports, and other data management tools...
An image database management system for conducting CAD research
NASA Astrophysics Data System (ADS)
Gruszauskas, Nicholas; Drukker, Karen; Giger, Maryellen L.
2007-03-01
The development of image databases for CAD research is not a trivial task. The collection and management of images and their related metadata from multiple sources is a time-consuming but necessary process. By standardizing and centralizing the methods by which these data are maintained, one can generate subsets of a larger database that match the specific criteria needed for a particular research project in a quick and efficient manner. A research-oriented management system of this type is highly desirable in a multi-modality CAD research environment. An online, web-based database system for the storage and management of research-specific medical image metadata was designed for use with four modalities of breast imaging: screen-film mammography, full-field digital mammography, breast ultrasound and breast MRI. The system was designed to consolidate data from multiple clinical sources and provide the user with the ability to anonymize the data. Input concerning the type of data to be stored as well as desired searchable parameters was solicited from researchers in each modality. The backbone of the database was created using MySQL. A robust and easy-to-use interface for entering, removing, modifying and searching information in the database was created using HTML and PHP. This standardized system can be accessed using any modern web-browsing software and is fundamental for our various research projects on computer-aided detection, diagnosis, cancer risk assessment, multimodality lesion assessment, and prognosis. Our CAD database system stores large amounts of research-related metadata and successfully generates subsets of cases that match the user's desired search criteria.
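The subset-generation idea, building a parameterized query from whatever search criteria a researcher supplies, can be sketched roughly as follows. The paper's system uses MySQL with an HTML/PHP front end; this standalone sketch uses Python's sqlite3 instead, and the metadata columns are hypothetical.

```python
import sqlite3

# Invented metadata table; SQLite stands in for the paper's MySQL backend.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE image_meta ("
             "id INTEGER PRIMARY KEY, modality TEXT, finding TEXT, anonymized INTEGER)")
conn.executemany("INSERT INTO image_meta VALUES (?, ?, ?, ?)", [
    (1, "ultrasound", "benign", 1),
    (2, "ultrasound", "malignant", 1),
    (3, "FFDM", "malignant", 0),
])

def select_subset(**criteria):
    """Build a parameterized WHERE clause from researcher-supplied criteria.

    Column names come from trusted code here, so only the values are
    passed as query parameters.
    """
    sql = "SELECT id FROM image_meta"
    if criteria:
        sql += " WHERE " + " AND ".join(f"{col} = ?" for col in criteria)
    rows = conn.execute(sql + " ORDER BY id", tuple(criteria.values()))
    return [r[0] for r in rows]
```

For example, `select_subset(modality="ultrasound")` yields the case IDs matching that single criterion, and adding keyword arguments narrows the subset further.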
NASA Astrophysics Data System (ADS)
Fletcher, Alex; Yoo, Terry S.
2004-04-01
Public databases today can be constructed with a wide variety of authoring and management structures. The widespread appeal of Internet search engines suggests that public information be made open and available to common search strategies, making accessible information that would otherwise be hidden by the infrastructure and software interfaces of a traditional database management system. We present the construction and organizational details for managing NOVA, the National Online Volumetric Archive. As an archival effort of the Visible Human Project for supporting medical visualization research, archiving 3D multimodal radiological teaching files, and enhancing medical education with volumetric data, our overall database structure is simplified; archives grow by accruing information, but seldom have to modify, delete, or overwrite stored records. NOVA is being constructed and populated so that it is transparent to the Internet; that is, much of its internal structure is mirrored in HTML, allowing Internet search engines to investigate, catalog, and link directly to the deep relational structure of the collection index. The key organizational concept for NOVA is the Image Content Group (ICG), an indexing strategy for cataloging incoming data as a set structure rather than by keyword management. These groups are managed through a series of XML files and authoring scripts. We cover the motivation for Image Content Groups, their overall construction, authorship, and management in XML, and the pilot results for creating public data repositories using this strategy.
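The set-oriented ICG indexing idea can be illustrated with a small script. The actual NOVA schema is not given in this abstract, so the element and attribute names below are invented; the point is that an ICG catalogs its members as a set in XML and can be mirrored to plain HTML so search engine crawlers can reach it.

```python
import xml.etree.ElementTree as ET

# Invented ICG schema: an Image Content Group catalogs datasets as a set,
# rather than attaching keywords to each record individually.
def build_icg(name, members):
    icg = ET.Element("icg", attrib={"name": name})
    for m in members:
        ET.SubElement(icg, "member", attrib={"dataset": m})
    return icg

def mirror_to_html(icg):
    """Mirror the ICG as plain HTML links so web crawlers can index it."""
    items = []
    for m in icg:
        d = m.get("dataset")
        items.append('<li><a href="{0}.html">{0}</a></li>'.format(d))
    return '<ul id="{}">{}</ul>'.format(icg.get("name"), "".join(items))

# Hypothetical group: two volumetric datasets indexed as one set.
icg = build_icg("head-ct", ["visible-male-head", "visible-female-head"])
```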
FJET Database Project: Extract, Transform, and Load
NASA Technical Reports Server (NTRS)
Samms, Kevin O.
2015-01-01
The Data Mining & Knowledge Management team at Kennedy Space Center is providing data management services to the Frangible Joint Empirical Test (FJET) project at Langley Research Center (LARC). FJET is a project under the NASA Engineering and Safety Center (NESC). The purpose of FJET is to conduct an assessment of mild detonating fuse (MDF) frangible joints (FJs) for human spacecraft separation tasks in support of the NASA Commercial Crew Program. The Data Mining & Knowledge Management team has been tasked with creating and managing a database for the efficient storage and retrieval of FJET test data. This paper details the Extract, Transform, and Load (ETL) process as it is related to gathering FJET test data into a Microsoft SQL relational database, and making that data available to the data users. Lessons learned, procedures implemented, and programming code samples are discussed to help detail the learning experienced as the Data Mining & Knowledge Management team adapted to changing requirements and new technology while maintaining flexibility of design in various aspects of the data management project.
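A generic ETL pipeline of the kind described, extract raw records, normalize their types, then load them into a relational table, might look like the sketch below. SQLite stands in for the team's Microsoft SQL Server target, and the FJET-style column names are invented for illustration.

```python
import csv
import io
import sqlite3

# Invented FJET-style test data; SQLite stands in for the SQL Server target.
raw = io.StringIO("test_id,peak_pressure_psi,joint_fired\n"
                  "1,412.5,yes\n"
                  "2,398.0,no\n")

def extract(fh):
    """Extract: read raw delimited records as dictionaries."""
    return list(csv.DictReader(fh))

def transform(rows):
    """Transform: normalize strings into typed columns (ints, floats, 0/1 flags)."""
    return [(int(r["test_id"]),
             float(r["peak_pressure_psi"]),
             1 if r["joint_fired"] == "yes" else 0) for r in rows]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fjet_test ("
             "test_id INTEGER PRIMARY KEY, peak_pressure REAL, fired INTEGER)")

def load(records):
    """Load: bulk-insert the typed records into the relational table."""
    conn.executemany("INSERT INTO fjet_test VALUES (?, ?, ?)", records)

load(transform(extract(raw)))
```

Keeping the three stages as separate functions is what lets a pipeline like this adapt to changing requirements: a new source format only touches `extract`, and a new target only touches `load`.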
Research on computer virus database management system
NASA Astrophysics Data System (ADS)
Qi, Guoquan
2011-12-01
The growing proliferation of computer viruses has become a serious threat and a research focus in network information security. New viruses emerge continually, the total number of viruses keeps growing, and virus classification becomes increasingly complex. Because different agencies capture samples at different times, virus naming cannot be unified. Although each agency maintains its own virus database, communication between agencies is lacking, virus information is incomplete, or only a small amount of sample information is held. This paper reviews the current state of virus database construction at home and abroad, analyzes how to standardize and completely describe virus characteristics, and then presents a computer virus database design scheme that provides information integrity, storage security and manageability.
NASA Technical Reports Server (NTRS)
Davis, Robert P.; Underwood, Ian M.
1987-01-01
The use of database management systems (DBMS) and AI to minimize human involvement in the planning of optical navigation pictures for interplanetary space probes is discussed, with application to the Galileo mission. Parameters characterizing the desirability of candidate pictures, and the program generating them, are described. How these parameters automatically build picture records in a database, and the definition of the database structure, are then discussed. The various rules, priorities, and constraints used in selecting pictures are also described. An example is provided of an expert system, written in Prolog, for automatically performing the selection process.
The methodology of database design in organization management systems
NASA Astrophysics Data System (ADS)
Chudinov, I. L.; Osipova, V. V.; Bobrova, Y. V.
2017-01-01
The paper describes a unified methodology of database design for management information systems. Designing the conceptual information model of the domain area is the most important and labor-intensive stage of database design. Based on the proposed integrated approach to designing the conceptual information model, the main principles of developing relational databases are provided and users' information needs are considered. According to the methodology, the process of designing the conceptual information model includes three basic stages, which are defined in detail. Finally, the article describes how the results of analyzing users' information needs are processed, and the rationale for the use of classifiers.
Construction of a Linux based chemical and biological information system.
Molnár, László; Vágó, István; Fehér, András
2003-01-01
A chemical and biological information system with an easy-to-use Web-based interface and corresponding databases has been developed. The constructed system incorporates all chemical, numerical and textual data related to the chemical compounds, including numerical biological screen results. Users can search the database by traditional textual/numerical queries and/or by substructure or similarity queries through the web interface. To build our chemical database management system, we utilized existing IT components such as Oracle and Tripos SYBYL for database management and the Zope application server for the web interface. We chose Linux as the main platform; however, almost every component can be used under various operating systems.
NASA Astrophysics Data System (ADS)
Shao, Weber; Kupelian, Patrick A.; Wang, Jason; Low, Daniel A.; Ruan, Dan
2014-03-01
We devise a paradigm for representing DICOM-RT structure sets in a database management system in such a way that secondary calculations of geometric information can be performed quickly from the existing contour definitions. The implementation of this paradigm is achieved using the PostgreSQL database system and the PostGIS extension, a geographic information system commonly used for encoding geographical map data. The proposed paradigm eliminates the overhead of retrieving large data records from the database, as well as the need to implement various numerical and data parsing routines, when additional information related to the geometry of the anatomy is desired.
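The kind of secondary geometric calculation this paradigm enables, deriving quantities from stored contour points rather than retrieving and parsing bulky records, can be illustrated outside the database. The paper performs such computations inside PostgreSQL/PostGIS; this pure-Python shoelace-formula sketch only shows the sort of quantity (planar contour area) recoverable directly from contour definitions.

```python
# Pure-Python illustration (not the paper's PostGIS implementation): the
# shoelace formula derives a planar contour's area directly from its stored
# vertex list, an example of a secondary geometric quantity.
def contour_area(points):
    """Magnitude of the signed (shoelace) area of a closed planar contour."""
    n = len(points)
    s = sum(points[i][0] * points[(i + 1) % n][1]
            - points[(i + 1) % n][0] * points[i][1]
            for i in range(n))
    return abs(s) / 2.0

square = [(0, 0), (2, 0), (2, 2), (0, 2)]    # 2 x 2 square
triangle = [(0, 0), (4, 0), (0, 3)]          # right triangle with legs 4 and 3
```

In the database-resident version, a spatial extension evaluates the equivalent of this formula server-side, so only the scalar result crosses the wire.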
MouseNet database: digital management of a large-scale mutagenesis project.
Pargent, W; Heffner, S; Schäble, K F; Soewarto, D; Fuchs, H; Hrabé de Angelis, M
2000-07-01
The Munich ENU Mouse Mutagenesis Screen is a large-scale mutant production, phenotyping, and mapping project. It encompasses two animal breeding facilities and a number of screening groups located in the general area of Munich. A central database is required to manage and process the immense amount of data generated by the mutagenesis project. This database, which we named MouseNet(c), runs on a Sybase platform and will ultimately store and process all data from the entire project. In addition, the system comprises a portfolio of functions needed to support the workflow management of the core facility and the screening groups. MouseNet(c) will make all of the data available to the participating screening groups, and later to the international scientific community. MouseNet(c) will consist of three major software components:
* Animal Management System (AMS)
* Sample Tracking System (STS)
* Result Documentation System (RDS)
MouseNet(c) provides the following major advantages:
* being accessible from different client platforms via the Internet
* being a full-featured multi-user system (including access restriction and data locking mechanisms)
* relying on a professional RDBMS (relational database management system) which runs on a UNIX server platform
* supplying workflow functions and a variety of plausibility checks
NASA Technical Reports Server (NTRS)
Johnson, Paul W.
2008-01-01
ePORT (electronic Project Online Risk Tool) provides a systematic approach to using an electronic database program to manage program/project risk management processes. This presentation briefly covers standard risk management procedures and then covers NASA's risk management tool, ePORT, in detail. ePORT is a web-based risk management program that provides a common framework to capture and manage risks, independent of a program/project's size and budget. By providing standardized evaluation criteria for common management reporting, ePORT improves Product Line, Center and Corporate Management insight, simplifies program/project manager reporting, and maintains an archive of data for historical reference.
Database structure for the Laser Accident and Incident Registry (LAIR)
NASA Astrophysics Data System (ADS)
Ness, James W.; Hoxie, Stephen W.; Zwick, Harry; Stuck, Bruce E.; Lund, David J.; Schmeisser, Elmar T.
1997-05-01
The ubiquity of laser radiation in military, medical, entertainment, telecommunications and research industries and the significant risk of eye injury from this radiation are firmly established. While important advances have been made in understanding laser bioeffects using animal analogues and clinical data, the relationships among patient characteristics, exposure conditions, severity of the resulting injury, and visual function are fragmented, complex and varied. Although accident cases are minimized through laser safety regulations and control procedures, the accident case information accumulated by the laser eye injury evaluation center warranted the development of a laser accident and incident registry. The registry includes clinical data for validating and refining hypotheses on injury and recovery mechanisms; a means for analyzing mechanisms unique to human injury; and a means for identifying future areas of investigation. The relational database supports three major sections: (1) the physics section defines exposure circumstances, (2) the clinical/ophthalmologic section includes fundus and scanning laser ophthalmoscope images, and (3) the visual functions section contains specialized visual function exam results. Tools are available for subject-matter experts to estimate parameters like total intraocular energy, ophthalmic lesion grade, and exposure probability. The database is research-oriented, providing a means for generating empirical relationships to identify symptoms for definitive diagnosis and treatment of laser-induced eye injuries.
Reef Ecosystem Services and Decision Support Database
This scientific and management information database utilizes systems thinking to describe the linkages between decisions, human activities, and provisioning of reef ecosystem goods and services. This database provides: (1) Hierarchy of related topics - Click on topics to navigat...
Cryptanalysis of Password Protection of Oracle Database Management System (DBMS)
NASA Astrophysics Data System (ADS)
Koishibayev, Timur; Umarova, Zhanat
2016-04-01
This article discusses the encryption algorithms currently available in the Oracle database, as well as a proposed upgraded encryption algorithm consisting of 4 steps. In conclusion, we analyze the password encryption of Oracle Database.
Manganese Research Health Project (MHRP)
2006-01-01
ultrafine particles (or nanoparticles) on health (e.g. Royal Society 2004) and the apparent potential for translocation of these particles along the...evaluate the usefulness of particle counting methods (CPC) in assessing exposure to ultrafine particles in manganese production scenarios. Task 4. Database...R, Kreyling W, Cox C (2004). Translocation of Inhaled Ultrafine Particles to the Brain. Inhalation toxicology; 16:437 - 445 Ritchie P, Cherrie J
The purpose of this SOP is to define the procedures involved in appending cleaned individual data batches to the master databases. This procedure applies to the Arizona NHEXAS project and the "Border" study. Keywords: data; appending.
The National Human Exposure Assessment Sur...
Transparent Global Seismic Hazard and Risk Assessment
NASA Astrophysics Data System (ADS)
Smolka, Anselm; Schneider, John; Pinho, Rui; Crowley, Helen
2013-04-01
Vulnerability to earthquakes is increasing, yet advanced reliable risk assessment tools and data are inaccessible to most, despite being a critical basis for managing risk. Also, there are few, if any, global standards that allow us to compare risk between various locations. The Global Earthquake Model (GEM) is a unique collaborative effort that aims to provide organizations and individuals with tools and resources for transparent assessment of earthquake risk anywhere in the world. By pooling data, knowledge and people, GEM acts as an international forum for collaboration and exchange, and leverages the knowledge of leading experts for the benefit of society. Sharing of data and risk information, best practices, and approaches across the globe is key to assessing risk more effectively. Through global projects, open-source IT development and collaborations with more than 10 regions, leading experts are collaboratively developing unique global datasets, best practice, open tools and models for seismic hazard and risk assessment. Guided by the needs and experiences of governments, companies and citizens at large, they work in continuous interaction with the wider community. A continuously expanding public-private partnership constitutes the GEM Foundation, which drives the collaborative GEM effort. An integrated and holistic approach to risk is key to GEM's risk assessment platform, OpenQuake, that integrates all above-mentioned contributions and will become available towards the end of 2014. Stakeholders worldwide will be able to calculate, visualise and investigate earthquake risk, capture new data and to share their findings for joint learning. Homogenized information on hazard can be combined with data on exposure (buildings, population) and data on their vulnerability, for loss assessment around the globe. 
Furthermore, for a truly integrated view of seismic risk, users can add social vulnerability and resilience indices to maps and estimate the costs and benefits of different risk management measures. The following global data, models and methodologies will be available in the platform. Some of these will be released to the public earlier, such as the ISC-GEM global instrumental catalogue (released January 2013).
Datasets:
• Global Earthquake History Catalogue [1000-1903]
• Global Instrumental Catalogue [1900-2009]
• Global Geodetic Strain Rate Model
• Global Active Fault Database
• Tectonic Regionalisation
• Buildings and Population Database
• Earthquake Consequences Database
• Physical Vulnerability Database
• Socio-Economic Vulnerability and Resilience Indicators
Models:
• Seismic Source Models
• Ground Motion (Attenuation) Models
• Physical Exposure Models
• Physical Vulnerability Models
• Composite Index Models (social vulnerability, resilience, indirect loss)
The aforementioned models developed under the GEM framework will be combined to produce estimates of hazard and risk at a global scale. Furthermore, building on many ongoing efforts and the knowledge of scientists worldwide, GEM will integrate state-of-the-art data, models, results and open-source tools into a single platform that is to serve as a "clearinghouse" on seismic risk. The platform will continue to increase in value, in particular for use in local contexts, through contributions and collaborations with scientists and organisations worldwide.
Holtby, Caitlin E; Guernsey, Judith R; Allen, Alexander C; Vanleeuwen, John A; Allen, Victoria M; Gordon, Robert J
2014-02-05
Animal studies and epidemiological evidence suggest an association between prenatal exposure to drinking water with elevated nitrate (NO3-N) concentrations and incidence of congenital anomalies. This study used Geographic Information Systems (GIS) to derive individual-level prenatal drinking-water nitrate exposure estimates from measured nitrate concentrations from 140 temporally monitored private wells and 6 municipal water supplies. Cases of major congenital anomalies in Kings County, Nova Scotia, Canada, between 1988 and 2006 were selected from province-wide population-based perinatal surveillance databases and matched to controls from the same databases. Unconditional multivariable logistic regression was performed to test for an association between drinking-water nitrate exposure and congenital anomalies after adjusting for clinically relevant risk factors. Employing all nitrate data there was a trend toward increased risk of congenital anomalies for increased nitrate exposure levels though this was not statistically significant. After stratification of the data by conception before or after folic acid supplementation, an increased risk of congenital anomalies for nitrate exposure of 1.5-5.56 mg/L (2.44; 1.05-5.66) and a trend toward increased risk for >5.56 mg/L (2.25; 0.92-5.52) was found. Though the study is likely underpowered, these results suggest that drinking-water nitrate exposure may contribute to increased risk of congenital anomalies at levels below the current Canadian maximum allowable concentration.
Development of a Task-Exposure Matrix (TEM) for Pesticide Use (TEMPEST).
Dick, F D; Semple, S E; van Tongeren, M; Miller, B G; Ritchie, P; Sherriff, D; Cherrie, J W
2010-06-01
Pesticides have been associated with increased risks for a range of conditions including Parkinson's disease, but identifying the agents responsible has proven challenging. Improved pesticide exposure estimates would increase the power of epidemiological studies to detect such an association if one exists. Categories of pesticide use were identified from the tasks reported in a previous community-based case-control study in Scotland. Typical pesticides used in each task in each decade were identified from published scientific and grey literature and from expert interviews, with the number of potential agents collapsed into 10 groups of pesticides. A pesticide usage database was then created, using the task list and the typical pesticide groups employed in those tasks across seven decades spanning the period 1945-2005. Information about the method of application and concentration of pesticides used in these tasks was then incorporated into the database. A list was generated of 81 tasks involving pesticide exposure in Scotland covering seven decades, producing a total of 846 task-by-pesticide-by-decade combinations. A Task-Exposure Matrix for PESTicides (TEMPEST) was produced by two occupational hygienists who quantified the likely probability and intensity of inhalation and dermal exposures for each pesticide group for a given use during each decade. TEMPEST provides a basis for assessing exposures to specific pesticide groups in Scotland covering the period 1945-2005. The methods used to develop TEMPEST could be used in a retrospective assessment of occupational exposure to pesticides for Scottish epidemiological studies or adapted for use in other countries.
Stephens, Susie M.; Chen, Jake Y.; Davidson, Marcel G.; Thomas, Shiby; Trute, Barry M.
2005-01-01
As database management systems expand their array of analytical functionality, they become powerful research engines for biomedical data analysis and drug discovery. Databases can hold most of the data types commonly required in life sciences and consequently can be used as flexible platforms for the implementation of knowledgebases. Performing data analysis in the database simplifies data management by minimizing the movement of data from disks to memory, allowing pre-filtering and post-processing of datasets, and enabling data to remain in a secure, highly available environment. This article describes the Oracle Database 10g implementation of BLAST and Regular Expression Searches and provides case studies of their usage in bioinformatics. http://www.oracle.com/technology/software/index.html PMID:15608287
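The benefit of keeping pattern matching inside the database query can be sketched without Oracle itself. Oracle 10g provides `REGEXP_LIKE` natively; as a stand-in, SQLite lets an application register a function to back its `REGEXP` operator, so the filter still runs inside the SQL engine rather than in application code after a bulk fetch. The table and sequence values below are made up.

```python
import re
import sqlite3

# SQLite's REGEXP operator delegates to a user-registered regexp(pattern, value)
# function; registering one keeps the pattern match inside the SQL query,
# analogous to Oracle's built-in REGEXP_LIKE.
conn = sqlite3.connect(":memory:")
conn.create_function(
    "REGEXP", 2, lambda pattern, value: re.search(pattern, value) is not None)

conn.execute("CREATE TABLE protein (id INTEGER PRIMARY KEY, seq TEXT)")
conn.executemany("INSERT INTO protein VALUES (?, ?)",
                 [(1, "MKVLAA"), (2, "GGHHEE"), (3, "MKPPQR")])  # toy sequences

def match_ids(pattern):
    """Filter rows in-database with a regular expression."""
    rows = conn.execute(
        "SELECT id FROM protein WHERE seq REGEXP ? ORDER BY id", (pattern,))
    return [r[0] for r in rows]
```

As the abstract argues, evaluating the match server-side avoids moving whole datasets from disk into application memory only to discard most rows.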
A RESEARCH DATABASE FOR IMPROVED DATA MANAGEMENT AND ANALYSIS IN LONGITUDINAL STUDIES
BIELEFELD, ROGER A.; YAMASHITA, TOYOKO S.; KEREKES, EDWARD F.; ERCANLI, EHAT; SINGER, LYNN T.
2014-01-01
We developed a research database for a five-year prospective investigation of the medical, social, and developmental correlates of chronic lung disease during the first three years of life. We used the Ingres database management system and the Statit statistical software package. The database includes records containing 1300 variables each, the results of 35 psychological tests, each repeated five times (providing longitudinal data on the child, the parents, and behavioral interactions), both raw and calculated variables, and both missing and deferred values. The four-layer menu-driven user interface incorporates automatic activation of complex functions to handle data verification, missing and deferred values, static and dynamic backup, determination of calculated values, display of database status, reports, bulk data extraction, and statistical analysis. PMID:7596250
Mellentin, Angelina Isabella; Stenager, Elsebeth; Nielsen, Bent; Nielsen, Anette Søgaard; Yu, Fei
2017-01-30
Although the number of alcohol-related treatments in app stores is proliferating, none of them are based on a psychological framework and supported by empirical evidence. Cue exposure treatment (CET) with urge-specific coping skills (USCS) is often used in Danish treatment settings. It is an evidence-based psychological approach that focuses on promoting "confrontation with alcohol cues" as a means of reducing urges and the likelihood of relapse. The objective of this study was to describe the design and development of a CET-based smartphone app, an innovative delivery pathway for treating alcohol use disorder (AUD). The treatment is based on Monti and coworkers' manual for CET with USCS (2002). It was created by a multidisciplinary team of psychiatrists, psychologists, programmers, and graphic designers as well as patients with AUD. A database was developed for the purpose of registering and monitoring training activities. A final version of the CET app and database was developed after several user tests. The final version of the CET app includes an introduction, 4 sessions featuring USCS, 8 alcohol exposure videos promoting the use of one of the USCS, and a results component providing an overview of training activities and potential progress. Real-time urges are measured before, during, and after exposure to alcohol cues and are registered in the app together with other training activity variables. Data packages are continuously sent in encrypted form to an external database and will be merged with other data (in an internal database) in the future. The CET smartphone app is currently being tested in a large-scale randomized controlled trial with the aim of clarifying whether it can be classified as an evidence-based treatment solution. The app has the potential to augment the reach of psychological treatment for AUD. ©Angelina Isabella Mellentin, Elsebeth Stenager, Bent Nielsen, Anette Søgaard Nielsen, Fei Yu.
Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 30.01.2017.
Sugiyama, Daisuke; Hattori, Takatoshi
2013-01-01
In environmental remediation after nuclear accidents, radioactive wastes have to be appropriately managed in existing exposure situations with contamination resulting from radionuclides emitted by such accidents. In this paper, we propose a framework of radiation protection for radioactive waste management in existing exposure situations, intended for practical and reasonable waste management in contaminated areas and referring to related ICRP recommendations. In the proposed concept, intermediate reference levels for waste management are adopted gradually as the existing ambient dose in the environment is reduced, on the basis of the principles of justification and optimisation and taking into account the practicability of managing radioactive waste and environmental remediation. It is essential to include relevant stakeholders living in existing exposure situations in the selection of reference levels for the existing ambient dose and waste management.
1986 Year End Report for Road Following at Carnegie-Mellon
1987-05-01
how to make them work efficiently. We designed a hierarchical structure and a monitor module which manages all parts of the hierarchy (see figure 1...database, called the Local Map, is managed by a program known as the Local Map Builder (LMB). Each module stores and retrieves information in the...knowledge-intensive modules, and a database manager that synchronizes the modules-is characteristic of a traditional blackboard system. Such a system is
1983-12-16
management system (DBMS) is to record and maintain information used by an organization in the organization's decision-making process. Some advantages of a...independence. Database Management Systems are classified into three major models: relational, network, and hierarchical. Each model uses a software...feeling impedes the overall effectiveness of the Acquisition Management Information System (AMIS), which currently uses S2k. The size of the AMIS
A Database of Woody Vegetation Responses to Elevated Atmospheric CO2 (NDP-072)
Curtis, Peter S [The Ohio State Univ., Columbus, OH (United States); Cushman, Robert M [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Brenkert, Antoinette L [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
1999-01-01
To perform a statistically rigorous meta-analysis of research results on the response by woody vegetation to increased atmospheric CO2 levels, a multiparameter database of responses was compiled. Eighty-four independent CO2-enrichment studies, covering 65 species and 35 response parameters, met the necessary criteria for inclusion in the database: reporting mean response, sample size, and variance of the response (either as standard deviation or standard error). Data were retrieved from the published literature and unpublished reports. This numeric data package contains a 29-field data set of CO2-exposure experiment responses by woody plants (as both a flat ASCII file and a spreadsheet file), files listing the references to the CO2-exposure experiments and specific comments relevant to the data in the data set, and this documentation file (which includes SAS and Fortran codes to read the ASCII data file; SAS is a registered trademark of the SAS Institute, Inc., Cary, North Carolina 27511).
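The inclusion criteria above (mean response, sample size, and variance per study) are exactly what inverse-variance pooling, the standard fixed-effect meta-analysis estimator, requires. A minimal sketch with entirely hypothetical effect sizes and variances (not values from the NDP-072 data set):

```python
import math

def inverse_variance_pooled(effects, variances):
    """Pool per-study effect sizes with inverse-variance weights
    (the fixed-effect meta-analysis estimator)."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))              # SE of the pooled effect
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)   # approximate 95% CI
    return pooled, ci

# Hypothetical per-study effects (e.g., log response ratios) and variances
pooled, (lo, hi) = inverse_variance_pooled([0.20, 0.35, 0.10], [0.01, 0.04, 0.02])
```

A random-effects estimator would add a between-study variance component, but the weighting idea is the same.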
USDA-ARS's Scientific Manuscript database
The ARS Microbial Genome Sequence Database (http://199.133.98.43), a web-based database server, was established utilizing the BIGSdb (Bacterial Isolate Genome Sequence Database) software package, developed at Oxford University, as a tool to manage multi-locus sequence data for the family Streptomy...
DIMA.Tools: An R package for working with the database for inventory, monitoring, and assessment
USDA-ARS's Scientific Manuscript database
The Database for Inventory, Monitoring, and Assessment (DIMA) is a Microsoft Access database used to collect, store and summarize monitoring data. This database is used by both local and national monitoring efforts within the National Park Service, the Forest Service, the Bureau of Land Management, ...
NREL: U.S. Life Cycle Inventory Database - Advisory Committee
The U.S. Life Cycle Inventory (LCI) Database established an advisory committee to provide technical and financial guidance to the NREL database management team. The committee will assess and respond to user feedback to ensure that the database meets the needs of data providers.
A User's Applications of Imaging Techniques: The University of Maryland Historic Textile Database.
ERIC Educational Resources Information Center
Anderson, Clarita S.
1991-01-01
Describes the incorporation of textile images into the University of Maryland Historic Textile Database by a computer user rather than a computer expert. Selection of a database management system is discussed, and PICTUREPOWER, a system that integrates photographic quality images with text and numeric information in databases, is described. (three…
Translating Frontier Knowledge into Contemporary Practice by Harnessing the Synergy of the Cluster
ERIC Educational Resources Information Center
Sprang, Ginny
2009-01-01
As practitioners have expanded their understanding of the types of events that constitute trauma exposure, the epidemiological database on prevalence suggests that childhood exposure to violence may best be framed as a public health crisis. There seems to be little recognition from policymakers and legislators, however, that violence may be the…
ERMes: Open Source Simplicity for Your E-Resource Management
ERIC Educational Resources Information Center
Doering, William; Chilton, Galadriel
2009-01-01
ERMes, the latest version of electronic resource management system (ERM), is a relational database; content in different tables connects to, and works with, content in other tables. ERMes requires Access 2007 (Windows) or Access 2008 (Mac) to operate as the database utilizes functionality not available in previous versions of Microsoft Access. The…
Selecting a Relational Database Management System for Library Automation Systems.
ERIC Educational Resources Information Center
Shekhel, Alex; O'Brien, Mike
1989-01-01
Describes the evaluation of four relational database management systems (RDBMSs) (Informix Turbo, Oracle 6.0 TPS, Unify 2000 and Relational Technology's Ingres 5.0) to determine which is best suited for library automation. The evaluation criteria used to develop a benchmark specifically designed to test RDBMSs for libraries are discussed. (CLB)
The Cocoa Shop: A Database Management Case
ERIC Educational Resources Information Center
Pratt, Renée M. E.; Smatt, Cindi T.
2015-01-01
This is an example of a real-world applicable case study, which includes background information on a small local business (i.e., TCS), description of functional business requirements, and sample data. Students are asked to design and develop a database to improve the management of the company's customers, products, and purchases by emphasizing…
Microcomputer Database Management Systems that Interface with Online Public Access Catalogs.
ERIC Educational Resources Information Center
Rice, James
1988-01-01
Describes a study that assessed the availability and use of microcomputer database management interfaces to online public access catalogs. The software capabilities needed to effect such an interface are identified, and available software packages are evaluated by these criteria. A directory of software vendors is provided. (4 notes with…
Fidget Spinner Ingestions in Children-A Problem that Spun Out of Nowhere.
Reeves, Patrick T; Nylund, Cade M; Noel, James M; Jones, David S; Chumpitazi, Bruno P; Milczuk, Henry A; Noel, R Adam
2018-06-01
The Consumer Product Safety Risk Management System's injury and potential injury database records 13 cases of fidget spinner ingestion since 2016. In addition to a database query, we report 3 additional cases of fidget spinner ingestion to describe patient presentations and subsequent management strategies. Published by Elsevier Inc.
NASA Astrophysics Data System (ADS)
Ştefan, Bilaşco; Sanda, Roşca; Ioan, Fodorean; Iuliu, Vescan; Sorin, Filip; Dănuţ, Petrea
2017-12-01
Maramureş Land is mostly characterized by agricultural and forestry land use due to its specific configuration of topography and its specific pedoclimatic conditions. Taking into consideration the trend of the last century from the perspective of land management, a decrease in the surface of agricultural lands to the advantage of built-up and grass lands, as well as an accelerated decrease in the forest cover due to uncontrolled and irrational forest exploitation, has become obvious. The field analysis performed on the territory of Maramureş Land has highlighted a high frequency of two geomorphologic processes — landslides and soil erosion — which have a major negative impact on land use due to their rate of occurrence. The main aim of the present study is the GIS modeling of the two geomorphologic processes, determining a state of vulnerability (the USLE model for soil erosion and a quantitative model based on the morphometric characteristics of the territory, derived from the HG. 447/2003) and their integration in a complex model of cumulated vulnerability identification. The modeling of the risk exposure was performed using a quantitative approach based on models and equations of spatial analysis, which were developed with modeled raster data structures and primary vector data, through a matrix highlighting the correspondence between vulnerability and land use classes. The quantitative analysis of the risk was performed by taking into consideration the exposure classes as modeled databases and the land price as a primary alphanumeric database using spatial analysis techniques for each class by means of the attribute table. The spatial results highlight the territories with a high risk to present geomorphologic processes that have a high degree of occurrence and represent a useful tool in the process of spatial planning.
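The USLE component of the model above multiplies five empirical factors into an annual soil-loss estimate, which is then binned into vulnerability classes. A minimal sketch with entirely hypothetical factor values and class thresholds (the study's actual thresholds, derived from HG 447/2003, are not reproduced here):

```python
def usle_soil_loss(R, K, LS, C, P):
    """Universal Soil Loss Equation: A = R * K * LS * C * P (t/ha/yr).
    R: rainfall erosivity, K: soil erodibility, LS: slope length/steepness,
    C: cover management, P: support practice."""
    return R * K * LS * C * P

def vulnerability_class(a, thresholds=(2.0, 8.0)):
    """Bin soil loss into classes; thresholds (t/ha/yr) are illustrative only."""
    low, high = thresholds
    if a < low:
        return "low"
    return "moderate" if a < high else "high"

# Hypothetical per-cell factor values
a = usle_soil_loss(R=80, K=0.3, LS=1.2, C=0.25, P=1.0)
```

In a GIS setting the same multiplication is applied cell-by-cell over raster layers; the scalar form shows the arithmetic.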
Hg concentrations in fish from coastal waters of California and Western North America
Davis, Jay; Ross, John; Bezalel, Shira; Sim, Lawrence; Bonnema, Autumn; Ichikawa, Gary; Heim, Wes; Schiff, Kenneth C; Eagles-Smith, Collin A.; Ackerman, Joshua T.
2016-01-01
The State of California conducted an extensive and systematic survey of mercury (Hg) in fish from the California coast in 2009 and 2010. The California survey sampled 3483 fish representing 46 species at 68 locations, and demonstrated that methylHg in fish presents a widespread exposure risk to fish consumers. Most of the locations sampled (37 of 68) had a species with an average concentration above 0.3 μg/g wet weight (ww), and 10 locations an average above 1.0 μg/g ww. The recent and robust dataset from California provided a basis for a broader examination of spatial and temporal patterns in fish Hg in coastal waters of Western North America. There is a striking lack of data in publicly accessible databases on Hg and other contaminants in coastal fish. An assessment of the raw data from these databases suggested the presence of relatively high concentrations along the California coast and in Puget Sound, and relatively low concentrations along the coasts of Alaska and Oregon, and the outer coast of Washington. The dataset suggests that Hg concentrations of public health concern can be observed at any location on the coast of Western North America where long-lived predator species are sampled. Output from a linear mixed-effects model resembled the spatial pattern observed for the raw data and suggested, based on the limited dataset, a lack of trend in fish Hg over the nearly 30-year period covered by the dataset. Expanded and continued monitoring, accompanied by rigorous data management procedures, would be of great value in characterizing methylHg exposure, and tracking changes in contamination of coastal fish in response to possible increases in atmospheric Hg emissions in Asia, climate change, and terrestrial Hg control efforts in coastal watersheds.
Photokeratitis induced by ultraviolet radiation in travelers: A major health problem
Izadi, M; Jonaidi-Jafari, N; Pourazizi, M; Alemzadeh-Ansari, MH; Hoseinpourfard, MJ
2018-01-01
Ultraviolet (UV) irradiation is one of the several environmental hazards that may cause inflammatory reactions in ocular tissues, especially the cornea. One of the important factors that affect how much ultraviolet radiation (UVR) humans are exposed to is travel. Hence, traveling is considered to include a more acute UVR effect, and ophthalmologists frequently evaluate and manage the ocular manifestations of UV irradiation, including UV-induced keratitis. The purpose of this paper is to provide an evidence-based analysis of the clinical effect of UVR in ocular tissues. An extensive review of English literature was performed to gather all available articles from the National Library of Medicine PubMed database of the National Institute of Health, the Ovid MEDLINE database, Scopus, and ScienceDirect that had studied the effect of UVR on the eye and its complications, between January 1970 and June 2014. The results show that UVR at 300 nm causes apoptosis in all three layers of the cornea and induces keratitis. Apoptosis in all layers of the cornea occurs 5 h after exposure. The effect of UVR intensity on the eye can be linked to numerous factors, including solar elevation, time of day, season, hemisphere, clouds and haze, atmospheric scattering, atmospheric ozone, latitude, altitude, longitudinal changes, climate, ground reflection, and geographic directions. The most important factor affecting UVR reaching the earth's surface is solar elevation. Currently, people do not have great concern over eye protection. The methods of protection against UVR include avoiding direct sunlight exposure, using UVR-blocking eyewear (sunglasses or contact lenses), and wearing hats. Hence, by identifying UVR intensity factors, eye protection factors, and public education, especially in travelers, methods for safe traveling can be identified. PMID:29067921
An optical scan/statistical package for clinical data management in C-L psychiatry.
Hammer, J S; Strain, J J; Lyerly, M
1993-03-01
This paper explores aspects of the need for clinical database management systems that permit ongoing service management, measurement of the quality and appropriateness of care, databased administration of consultation liaison (C-L) services, teaching/educational observations, and research. It describes an OPTICAL SCAN databased management system that permits flexible form generation, desktop publishing, and linking of observations in multiple files. This enhanced MICRO-CARES software system--Medical Application Platform (MAP)--permits direct transfer of the data to ASCII and SAS format for mainframe manipulation of the clinical information. The director of a C-L service may now develop his or her own forms, incorporate structured instruments, or develop "branch chains" of essential data to add to the core data set without the effort and expense to reprint forms or consult with commercial vendors.
Service Management Database for DSN Equipment
NASA Technical Reports Server (NTRS)
Zendejas, Silvino; Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Wolgast, Paul; Allen, Christopher; Luong, Ivy; Chang, George; Sadaqathulla, Syed
2009-01-01
This data- and event-driven persistent storage system leverages the use of commercial software provided by Oracle for portability, ease of maintenance, scalability, and ease of integration with embedded, client-server, and multi-tiered applications. In this role, the Service Management Database (SMDB) is a key component of the overall end-to-end process involved in the scheduling, preparation, and configuration of the Deep Space Network (DSN) equipment needed to perform the various telecommunication services the DSN provides to its customers worldwide. SMDB makes efficient use of triggers, stored procedures, queuing functions, e-mail capabilities, data management, and Java integration features provided by the Oracle relational database management system. SMDB uses a third normal form schema design that allows for simple data maintenance procedures and thin layers of integration with client applications. The software provides an integrated event logging system with ability to publish events to a JMS messaging system for synchronous and asynchronous delivery to subscribed applications. It provides a structured classification of events and application-level messages stored in database tables that are accessible by monitoring applications for real-time monitoring or for troubleshooting and analysis over historical archives.
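The structured event-logging pattern described above can be roughly illustrated as follows, using Python's sqlite3 in place of Oracle and hypothetical column names; the actual SMDB schema, triggers, stored procedures, and JMS integration are not reproduced here:

```python
import sqlite3

# In-memory stand-in for the relational event-log store
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE event_log (
    id       INTEGER PRIMARY KEY,
    ts       TEXT DEFAULT CURRENT_TIMESTAMP,
    severity TEXT,   -- structured classification of the event
    source   TEXT,   -- originating application or module
    message  TEXT)""")

def log_event(severity, source, message):
    """Insert one application-level event; monitoring tools can then
    query the table for real-time display or historical analysis."""
    conn.execute(
        "INSERT INTO event_log (severity, source, message) VALUES (?, ?, ?)",
        (severity, source, message))
    conn.commit()

log_event("INFO", "scheduler", "track configured")
rows = conn.execute("SELECT severity, source FROM event_log").fetchall()
```

Storing events in ordinary tables, rather than flat log files, is what makes them queryable by separate monitoring applications.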
An optimal user-interface for EPIMS database conversions and SSQ 25002 EEE parts screening
NASA Technical Reports Server (NTRS)
Watson, John C.
1996-01-01
The Electrical, Electronic, and Electromechanical (EEE) Parts Information Management System (EPIMS) database was selected by the International Space Station Parts Control Board for providing parts information to NASA managers and contractors. Parts data is transferred to the EPIMS database by converting parts list data to the EPIMS Data Exchange File Format. In general, parts list information received from contractors and suppliers does not convert directly into the EPIMS Data Exchange File Format. Often parts lists use different variable and record field assignments. Many of the EPIMS variables are not defined in the parts lists received. The objective of this work was to develop an automated system for translating parts lists into the EPIMS Data Exchange File Format for upload into the EPIMS database. Once EEE parts information has been transferred to the EPIMS database it is necessary to screen parts data in accordance with the provisions of the SSQ 25002 Supplemental List of Qualified Electrical, Electronic, and Electromechanical Parts, Manufacturers, and Laboratories (QEPM&L). The SSQ 25002 standards are used to identify parts which satisfy the requirements for spacecraft applications. An additional objective for this work was to develop an automated system which would screen EEE parts information against the SSQ 25002 to inform managers of the qualification status of parts used in spacecraft applications. The EPIMS Database Conversion and SSQ 25002 User Interfaces are designed to interface through the World-Wide-Web (WWW)/Internet to provide accessibility by NASA managers and contractors.
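The field-renaming step such a translator performs might look like the following sketch. The field names on both sides are hypothetical, since the actual EPIMS Data Exchange File Format field list is not given here; the point is renaming supplier fields and flagging variables the incoming parts list never defined:

```python
# Hypothetical supplier-to-exchange field mapping and required field list
FIELD_MAP = {"PartNo": "part_number", "Mfr": "manufacturer", "Desc": "description"}
REQUIRED = ["part_number", "manufacturer", "description", "qual_status"]

def to_exchange_record(supplier_row):
    """Rename supplier fields to exchange-format names and fill in
    None for required fields absent from the incoming parts list."""
    record = {FIELD_MAP.get(k, k): v for k, v in supplier_row.items()}
    for field in REQUIRED:
        record.setdefault(field, None)  # undefined in this parts list
    return record

rec = to_exchange_record({"PartNo": "RNC55H", "Mfr": "Vishay"})
```

The None placeholders make missing variables explicit, so a later screening pass can report which records lack qualification data.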
Non-native (exotic) snake envenomations in the U.S., 2005-2011.
Warrick, Brandon J; Boyer, Leslie V; Seifert, Steven A
2014-09-29
Non-native (exotic) snakes are a problematic source of envenomation worldwide. This manuscript describes the current demographics, outcomes and challenges of non-native snakebites in the United States (U.S.). We performed a retrospective case series of the National Poison Data System (NPDS) database between 2005 and 2011. There were 258 human exposures involving at least 61 unique exotic venomous species (average = 37 per year; range = 33-40). Males comprised 79% and females 21%. The average age was 33 years with 16% less than 20 years old. 70% of bites occurred in a private residence and 86% were treated at a healthcare facility. 35% of cases received antivenom and 10% were given antibiotics. This study is compared to our previous study (1994-2004) in which there was a substantial coding error rate. Software modifications significantly reduced coding errors. Identification and acquisition of appropriate antivenoms pose a number of logistical difficulties in the management of these envenomations. In the U.S., poison centers have valuable systems and clinical roles in the provision of expert consultation and in the management of these cases.
Techniques for Efficiently Managing Large Geosciences Data Sets
NASA Astrophysics Data System (ADS)
Kruger, A.; Krajewski, W. F.; Bradley, A. A.; Smith, J. A.; Baeck, M. L.; Steiner, M.; Lawrence, R. E.; Ramamurthy, M. K.; Weber, J.; Delgreco, S. A.; Domaszczynski, P.; Seo, B.; Gunyon, C. A.
2007-12-01
We have developed techniques and software tools for efficiently managing large geosciences data sets. While the techniques were developed as part of an NSF-Funded ITR project that focuses on making NEXRAD weather data and rainfall products available to hydrologists and other scientists, they are relevant to other geosciences disciplines that deal with large data sets. Metadata, relational databases, data compression, and networking are central to our methodology. Data and derived products are stored on file servers in a compressed format. URLs to, and metadata about the data and derived products are managed in a PostgreSQL database. Virtually all access to the data and products is through this database. Geosciences data normally require a number of processing steps to transform the raw data into useful products: data quality assurance, coordinate transformations and georeferencing, applying calibration information, and many more. We have developed the concept of crawlers that manage this scientific workflow. Crawlers are unattended processes that run indefinitely, and at set intervals query the database for their next assignment. A database table functions as a roster for the crawlers. Crawlers perform well-defined tasks that are, except for perhaps sequencing, largely independent from other crawlers. Once a crawler is done with its current assignment, it updates the database roster table, and gets its next assignment by querying the database. We have developed a library that enables one to quickly add crawlers. The library provides hooks to external (i.e., C-language) compiled codes, so that developers can work and contribute independently. Processes called ingesters inject data into the system. The bulk of the data are from a real-time feed using UCAR/Unidata's IDD/LDM software. An exciting recent development is the establishment of a Unidata HYDRO feed that feeds value-added metadata over the IDD/LDM. 
Ingesters grab the metadata and populate the PostgreSQL tables. These and other concepts we have developed have enabled us to efficiently manage a 70 TB (and growing) weather radar data set.
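The roster-table pattern the crawlers use can be sketched as follows, with sqlite3 standing in for PostgreSQL and hypothetical task names; each crawler polls the roster for its next assignment, marks it in progress, and repeats:

```python
import sqlite3

# In-memory stand-in for the PostgreSQL roster table
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE roster (id INTEGER PRIMARY KEY, task TEXT, status TEXT)")
db.executemany("INSERT INTO roster (task, status) VALUES (?, 'pending')",
               [("georeference",), ("quality_check",)])

def next_assignment(conn):
    """Fetch the oldest pending task and mark it running, so that
    independent crawlers each pick up distinct work items."""
    row = conn.execute(
        "SELECT id, task FROM roster WHERE status = 'pending' ORDER BY id LIMIT 1"
    ).fetchone()
    if row:
        conn.execute("UPDATE roster SET status = 'running' WHERE id = ?", (row[0],))
    return row  # None when the roster has no pending work

first = next_assignment(db)
```

A production version against PostgreSQL would wrap the select-and-update in a transaction (e.g., SELECT ... FOR UPDATE SKIP LOCKED) so concurrent crawlers cannot claim the same row.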
A Summary of Pavement and Material-Related Databases within the Texas Department of Transportation
DOT National Transportation Integrated Search
1999-09-01
This report summarizes important content and operational details about five different materials and pavement databases currently used by the Texas Department of Transportation (TxDOT). These databases include the Pavement Management Information Syste...
Computer Science Research in Europe.
1984-08-29
most attention, multi-database and its structure, and (3) the dependencies between databases. Distributed Systems and multi-databases. Having...completed a multi-database system for distributed data management at Newcastle University, UK, INRIA is now working on a real...communications requirements of distributed database systems, protocols for checking the...A project called SIRIUS was established in 1977 at the
Lee, Pei-Chen; Liu, Li-Ling; Sun, Yu; Chen, Yu-An; Liu, Chih-Ching; Li, Chung-Yi; Yu, Hwa-Lung; Ritz, Beate
2016-11-01
Ambient air pollution has been associated with many health conditions, but little is known about its effects on neurodegenerative diseases, such as Parkinson's disease (PD). In this study, we investigated the influence of ambient air pollution on PD in a nationwide population-based case-control study in Taiwan. We identified 11,117 incident PD patients between 2007 and 2009 from the Taiwanese National Health Insurance Research Database and selected 44,468 age- and gender-matched population controls from the longitudinal health insurance database. The average ambient pollutant exposure concentrations from 1998 through the onset of PD were estimated using quantile-based Bayesian Maximum Entropy models. Based on logistic regression models, we estimated the odds ratios (ORs) and 95% confidence intervals (CIs) for ambient pollutant exposures and PD risk. We observed positive associations between NOx and CO exposures and PD. In multi-pollutant models, for NOx and CO exposures above the 75th percentile compared with the lowest percentile, the ORs of PD were 1.37 (95% CI=1.23-1.52) and 1.17 (95% CI=1.07-1.27), respectively. This study suggests that ambient air pollution exposure, especially from traffic-related pollutants such as NOx and CO, increases PD risk in the Taiwanese population. Copyright © 2016 Elsevier Ltd. All rights reserved.
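Odds ratios like those reported above come from exponentiating logistic-regression coefficients, with Wald confidence limits exponentiated the same way. A sketch of that final step, using an illustrative coefficient and standard error (not the study's actual estimates):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient (log-odds scale) and its
    standard error into an odds ratio with a Wald 95% confidence interval."""
    return math.exp(beta), (math.exp(beta - z * se), math.exp(beta + z * se))

# Illustrative coefficient for a high-exposure group vs. the lowest
or_, (lo, hi) = odds_ratio_ci(beta=0.3148, se=0.054)
```

Because exp() is monotonic, the CI endpoints on the log-odds scale map directly to endpoints on the odds-ratio scale.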
Metal working fluid exposure and diseases in Switzerland.
Koller, Michael F; Pletscher, Claudia; Scholz, Stefan M; Schneuwly, Philippe
2016-07-01
Exposure to metal working fluids (MWF) is common in machining processes worldwide and may lead to diseases of the skin and the respiratory tract. The aim of the study was to investigate exposure to and diseases caused by MWF in Switzerland between 2004 and 2013. We performed descriptive statistics, including determination of median and 90th percentile values of MWF concentrations listed in a database of Suva. Moreover, we clustered MWF-induced occupational diseases listed in a database from the Swiss Central Office for Statistics in Accident Insurance and performed linear regression over time to investigate the temporal course of the illnesses. The 90th percentile for MWF air concentration was 8.1 mg (aerosol + vapor)/m³ and 0.9 mg aerosol/m³ (inhalable fraction). One thousand two hundred and eighty skin diseases and 96 respiratory diseases were observed. This is the first investigation describing exposure to and diseases caused by MWF in Switzerland over a timeframe of 10 years. In general, working conditions in the companies of this investigation were acceptable. Most measured MWF concentrations were below both the Swiss and most international occupational exposure limits of 2014. The percentage of workers declared unfit for work was 17%, compared with an average of 12% for other occupational diseases.
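The two analyses named above, percentile summaries of measured concentrations and a least-squares trend over time, can be sketched with the standard library alone. This is an illustrative outline under the assumption of simple OLS on annual counts, not the study's actual code:

```python
from statistics import median, quantiles

def p90(values):
    # 90th percentile via the inclusive quantile method
    return quantiles(values, n=10, method="inclusive")[-1]

def trend_slope(years, counts):
    # Ordinary least-squares slope of counts regressed on year
    n = len(years)
    mx, my = sum(years) / n, sum(counts) / n
    num = sum((x - mx) * (y - my) for x, y in zip(years, counts))
    den = sum((x - mx) ** 2 for x in years)
    return num / den
```

`median` and `p90` summarize a list of concentration measurements; `trend_slope` gives the per-year change in disease counts whose sign and magnitude the regression in the study would test.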
KREAM: Korean Radiation Exposure Assessment Model for Aviation Route Dose
NASA Astrophysics Data System (ADS)
Hwang, J.; Dokgo, K.; Choi, E. J.; Kim, K. C.; Kim, H. P.; Cho, K. S. F.
2014-12-01
Since Korean Air began to use the polar route from Seoul/ICN airport to New York/JFK airport in August 2006, public demand in South Korea has grown rapidly for the estimation and prediction of cosmic radiation exposure for Korean aircrew and passengers. To meet this demand, the Korean government enacted a law in 2013 on safety standards and management of cosmic radiation for flight attendants and pilots. Since last year, we have been developing our own Korean Radiation Exposure Assessment Model (KREAM) for aviation route dose, funded by the Korea Meteorological Administration (KMA). The GEANT4 model is used to calculate energetic particle transport in the atmosphere, and the NRLMSIS-00 model to obtain the background atmospheric neutral densities as a function of altitude. To predict the radiation exposure on many routes under various space weather conditions, we constructed a database from pre-arranged simulations using all possible combinations of R, S, and G, the space weather scales provided by the National Oceanic and Atmospheric Administration (NOAA). To obtain the solar energetic particle spectrum at 100 km altitude, which we set as the top of the atmospheric layers in KREAM, we use proton flux observations from the ACE and GOES satellites. We compare the results of KREAM with those of other cosmic radiation estimation programs, such as CARI-6M, provided by the Federal Aviation Administration (FAA). We also validate KREAM's results by comparison with measurements from a Liulin-6K LET spectrometer onboard Korean commercial flights and Korean Air Force reconnaissance flights.
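The database of pre-arranged simulations keyed by the NOAA R, S, and G scales amounts to a lookup table indexed by the space-weather state. A minimal sketch of that idea (the table contents and dose values here are invented placeholders, not KREAM output):

```python
# Hypothetical cache of pre-run simulation results keyed by NOAA
# space-weather scales (R = radio blackout, S = solar radiation storm,
# G = geomagnetic storm). Dose values are illustrative only.
dose_table = {
    # (R, S, G) -> route dose in microsieverts
    (0, 0, 0): 60.0,
    (1, 2, 0): 85.0,
    (3, 3, 2): 140.0,
}

def route_dose(r, s, g, default=None):
    """Look up a precomputed route dose for a given space-weather state."""
    return dose_table.get((r, s, g), default)
```

Precomputing every (R, S, G) combination trades storage for speed: at prediction time the model only needs the current NOAA scales, not a fresh GEANT4 run.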
Tao, Jing; Barry, Terrell; Segawa, Randy; Neal, Rosemary; Tuli, Atac
2013-01-01
Kettleman City, California, reported a higher than expected number of birth defect cases between 2007 and 2010, raising concern among the community and government agencies. A pesticide exposure evaluation was conducted as part of a complete assessment of community chemical exposure. Nineteen pesticides that potentially cause birth defects were investigated. The Industrial Source Complex Short-Term Model Version 3 (ISCST3) was used to estimate off-site air concentrations associated with pesticide applications within 8 km of the community from late 2006 to 2009. Health screening levels, designed to indicate potential health effects, were used for preliminary health evaluations of the estimated air concentrations. A tiered approach was used. The first tier modeled simple, hypothetical worst-case situations for each of the 19 pesticides. The second tier modeled specific applications of the pesticides whose estimated concentrations exceeded health screening levels in the first tier. The pesticide use report database of the California Department of Pesticide Regulation provided application information. Weather input data were summarized from the measurements of a local weather station in the California Irrigation Management Information System. The ISCST3 modeling results showed that during the target period, only two application days of one pesticide (methyl isothiocyanate) produced air concentration estimates above the health screening level for developmental effects at the boundary of Kettleman City. These results suggest that the likelihood of birth defects caused by pesticide exposure was low. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
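The first-tier screening step described above reduces to comparing each pesticide's modeled worst-case concentration against its health screening level and passing only exceedances to the second tier. A minimal sketch (pesticide names and numbers below are illustrative, not the study's values):

```python
def tier_two_candidates(estimates, screening_levels):
    """Return pesticides whose modeled air concentration exceeds
    the corresponding health screening level (first-tier screen).

    Pesticides with no defined screening level are never flagged.
    """
    return [name for name, conc in estimates.items()
            if conc > screening_levels.get(name, float("inf"))]

# Illustrative inputs: MITC (methyl isothiocyanate) exceeds its level,
# the second pesticide does not, so only MITC proceeds to tier two.
estimates = {"MITC": 5.0, "chloropicrin": 0.1}
levels = {"MITC": 2.0, "chloropicrin": 1.0}
```

With these inputs `tier_two_candidates(estimates, levels)` returns only `"MITC"`, mirroring how the second tier re-models just the pesticides flagged in tier one.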
Study protocol for the Fukushima Health Management Survey.
Yasumura, Seiji; Hosoya, Mitsuaki; Yamashita, Shunichi; Kamiya, Kenji; Abe, Masafumi; Akashi, Makoto; Kodama, Kazunori; Ozasa, Kotaro
2012-01-01
The accidents that occurred at the Fukushima Daiichi Nuclear Power Plant after the Great East Japan Earthquake on 11 March 2011 have resulted in long-term, ongoing anxiety among the residents of Fukushima, Japan. Soon after the disaster, Fukushima Prefecture launched the Fukushima Health Management Survey to investigate long-term low-dose radiation exposure caused by the accident. Fukushima Medical University took the lead in planning and implementing this survey. The primary purposes of this survey are to monitor the long-term health of residents, promote their future well-being, and confirm whether long-term low-dose radiation exposure has health effects. This report describes the rationale and implementation of the Fukushima Health Management Survey. This cohort study enrolled all people living in Fukushima Prefecture after the earthquake and comprises a basic survey and 4 detailed surveys. The basic survey is to estimate levels of external radiation exposure among all 2.05 million residents. It should be noted that internal radiation levels were estimated by Fukushima Prefecture using whole-body counters. The detailed surveys comprise a thyroid ultrasound examination for all Fukushima children aged 18 years or younger, a comprehensive health check for all residents from the evacuation zones, an assessment of mental health and lifestyles of all residents from the evacuation zones, and recording of all pregnancies and births among all women in the prefecture who were pregnant on 11 March. All data have been entered into a database and will be used to support the residents and analyze the health effects of radiation. The low response rate (<30%) to the basic survey complicates the estimation of health effects. There have been no cases of malignancy to date among 38 114 children who received thyroid ultrasound examinations. The importance of mental health care was revealed by the mental health and lifestyle survey and the pregnancy and birth survey. 
This long-term large-scale epidemiologic study is expected to provide valuable data in the investigation of the health effects of low-dose radiation and disaster-related stress.
DataSpread: Unifying Databases and Spreadsheets.
Bendre, Mangesh; Sun, Bofan; Zhang, Ding; Zhou, Xinyan; Chang, Kevin ChenChuan; Parameswaran, Aditya
2015-08-01
Spreadsheet software is often the tool of choice for ad-hoc tabular data management, processing, and visualization, especially on tiny data sets. On the other hand, relational database systems offer significant power, expressivity, and efficiency over spreadsheet software for data management, while lacking in ease of use and ad-hoc analysis capabilities. We demonstrate DataSpread, a data exploration tool that holistically unifies databases and spreadsheets. It continues to offer a Microsoft Excel-based spreadsheet front-end, while in parallel managing all the data in a back-end database, specifically, PostgreSQL. DataSpread retains all the advantages of spreadsheets, including ease of use, ad-hoc analysis and visualization capabilities, and a schema-free nature, while also adding the advantages of traditional relational databases, such as scalability and the ability to use arbitrary SQL to import, filter, or join external or internal tables and have the results appear in the spreadsheet. DataSpread needs to reason about and reconcile differences in the notions of schema, addressing of cells and tuples, and the current "pane" (which exists in spreadsheets but not in traditional databases), and support data modifications at both the front-end and the back-end. Our demonstration will center on our first, early prototype of DataSpread, and will give attendees a sense of the enormous data exploration capabilities offered by unifying spreadsheets and databases.
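The core architectural idea, a spreadsheet front-end whose cells are rows in a relational back-end, can be sketched in a few lines. This is a toy illustration of the concept, not DataSpread's actual design; SQLite stands in for the PostgreSQL back-end the paper describes:

```python
import sqlite3

# Cells live in a table keyed by (row, col); the "front-end" reads and
# writes by spreadsheet-style address while the database handles storage.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE cells ("
    "  row INTEGER, col TEXT, value TEXT,"
    "  PRIMARY KEY (row, col))"
)

def set_cell(row, col, value):
    """Write a cell; upsert keeps one value per (row, col) address."""
    conn.execute("INSERT OR REPLACE INTO cells VALUES (?, ?, ?)",
                 (row, col, value))

def get_cell(row, col):
    """Read a cell, or None if it has never been set (schema-free feel)."""
    r = conn.execute("SELECT value FROM cells WHERE row=? AND col=?",
                     (row, col)).fetchone()
    return r[0] if r else None
```

Because cells are ordinary rows, arbitrary SQL (filters, joins against external tables) can run against `cells` and the results can be surfaced back into the sheet, which is the hybrid the abstract describes.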