Sample records for high-performance secure database

  1. Building a highly available and intrusion tolerant Database Security and Protection System (DSPS).

    PubMed

    Cai, Liang; Yang, Xiao-Hu; Dong, Jin-Xiang

    2003-01-01

Database Security and Protection System (DSPS) is a security platform for defending the DBMS against malicious attacks. Both security and performance are critical to DSPS. The authors proposed a key management scheme that combines the server-group structure, which improves availability, with the key distribution structure required by proactive security. This paper details the implementation of proactive security in DSPS. After a thorough performance analysis, the authors conclude that the performance gap between the replicated mechanism and the proactive mechanism shrinks as the number of concurrent connections grows, and that proactive security is practical and useful for large, critical applications.
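The paper's concrete key-distribution structure is not given above; as a minimal sketch of the proactive idea it builds on (periodically re-randomizing the shares of a key held by a server group, so an attacker must compromise enough servers within a single refresh window), consider additive secret sharing in Python. The field size and function names are illustrative, not the authors' scheme:

```python
import secrets

P = 2**127 - 1  # prime modulus for the additive sharing (illustrative)

def split_key(key: int, n: int) -> list[int]:
    """Additively share `key` among n servers: shares sum to key mod P."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((key - sum(shares)) % P)
    return shares

def proactive_refresh(shares: list[int]) -> list[int]:
    """Re-randomize shares without changing the key: add deltas summing to 0."""
    n = len(shares)
    deltas = [secrets.randbelow(P) for _ in range(n - 1)]
    deltas.append((-sum(deltas)) % P)
    return [(s + d) % P for s, d in zip(shares, deltas)]

def recover_key(shares: list[int]) -> int:
    return sum(shares) % P

key = 0xC0FFEE
shares = split_key(key, 5)
refreshed = proactive_refresh(shares)
assert recover_key(refreshed) == key   # the key survives the refresh
```

Old shares become useless after a refresh, which is why the replication-versus-proactive performance comparison in the abstract matters: the refresh traffic is the price paid for that property.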

  2. High-Performance Secure Database Access Technologies for HEP Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthew Vranicar; John Weicher

    2006-04-17

The Large Hadron Collider (LHC) at the CERN Laboratory will become the largest scientific instrument in the world when it starts operations in 2007. Large Scale Analysis Computer Systems (computational grids) are required to extract rare signals of new physics from petabytes of LHC detector data. In addition to file-based event data, LHC data processing applications require access to large amounts of data in relational databases: detector conditions, calibrations, etc. U.S. high energy physicists demand efficient performance of grid computing applications in LHC physics research, where world-wide remote participation is vital to their success. To empower physicists with data-intensive analysis capabilities, a whole hyperinfrastructure of distributed databases cross-cuts a multi-tier hierarchy of computational grids. The crosscutting allows separation of concerns across both the global environment of a federation of computational grids and the local environment of a physicist's computer used for analysis. Very few ongoing efforts address database and grid integration research. Most of these are outside of the U.S. and rely on traditional approaches to secure database access via an extraneous security layer separate from the database system core, preventing efficient data transfers. Our findings are shared by the Database Access and Integration Services Working Group of the Global Grid Forum, which states that "Research and development activities relating to the Grid have generally focused on applications where data is stored in files. However, in many scientific and commercial domains, database management systems have a central role in data storage, access, organization, authorization, etc., for numerous applications." There is a clear opportunity for a technological breakthrough, requiring innovative steps to provide high-performance secure database access technologies for grid computing.
We believe that an innovative database architecture in which secure authorization is pushed into the database engine will eliminate inefficient data transfer bottlenecks. Furthermore, traditionally separated database and security layers present an extra vulnerability, leaving weak clear-text password authorization as the only protection on the database core systems. Due to the legacy limitations of the systems' security models, the allowed passwords often cannot even comply with the DOE password guideline requirements. We see an opportunity in the tight integration of the secure authorization layer with the database server engine, resulting in both improved performance and improved security. Phase I focused on the development of a proof-of-concept prototype using Argonne National Laboratory's (ANL) Argonne Tandem-Linac Accelerator System (ATLAS) project as a test scenario. By developing a grid-security enabled version of the ATLAS project's current relational database solution, MySQL, PIOCON Technologies aims to offer a more efficient solution to secure database access.
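The prototype itself targeted MySQL; purely as an illustration of pushing authorization into the database engine rather than bolting it on as an external layer, SQLite's engine-level authorizer hook (exposed through Python's standard sqlite3 module) can block or blank out column reads before any rows leave the engine. Table and column names here are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE runs (id INTEGER, detector TEXT, secret TEXT)")
conn.execute("INSERT INTO runs VALUES (1, 'ATLAS', 'calib-key')")

def authorizer(action, arg1, arg2, dbname, source):
    # Enforcement happens inside the engine, at statement-compile time,
    # before any data is materialized or transferred to the client.
    if action == sqlite3.SQLITE_READ and arg1 == "runs" and arg2 == "secret":
        return sqlite3.SQLITE_IGNORE   # the protected column reads as NULL
    return sqlite3.SQLITE_OK

conn.set_authorizer(authorizer)
row = conn.execute("SELECT id, detector, secret FROM runs").fetchone()
assert row == (1, "ATLAS", None)   # secret never crossed the engine boundary
```

A grid-security version would derive the decision from a grid credential rather than a hard-coded rule, but the placement of the check is the point the abstract argues for.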

  3. Integrated Controlling System and Unified Database for High Throughput Protein Crystallography Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaponov, Yu.A.; Igarashi, N.; Hiraki, M.

    2004-05-12

An integrated controlling system and a unified database for high throughput protein crystallography experiments have been developed. The main stages of protein crystallography experiments (purification, crystallization, crystal harvesting, data collection, data processing) were integrated into the software under development. All information necessary to perform protein crystallography experiments is stored in a MySQL relational database (except raw X-ray data, which are stored in a central data server). The database contains four mutually linked hierarchical trees describing protein crystals, data collection of protein crystals and experimental data processing. A database editor was designed and developed. The editor supports basic database functions to view, create, modify and delete user records in the database. Two search engines were realized: direct search of necessary information in the database and object-oriented search. The system is based on TCP/IP secure UNIX sockets with four predefined sending and receiving behaviors, which support communications between all connected servers and clients with remote control functions (creating and modifying data for experimental conditions, data acquisition, viewing experimental data, and performing data processing). Two secure login schemes were designed and developed: a direct method (using the developed Linux clients with secure connection) and an indirect method (using an SSL connection with secure X11 support from any operating system with X-terminal and SSH support). Part of the system has been implemented on a new MAD beam line, NW12, at the Photon Factory Advanced Ring for general user experiments.
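The exact MySQL schema is not reproduced above; a toy sketch of the "mutually linked hierarchical trees" idea, with invented table names and SQLite standing in for MySQL, where a direct search joins its way from a processing result back to the protein:

```python
import sqlite3

db = sqlite3.connect(":memory:")
# Each level links to its parent, forming linked hierarchical trees
# (table and column names are illustrative, not the NW12 schema).
db.executescript("""
CREATE TABLE protein    (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE crystal    (id INTEGER PRIMARY KEY,
                         protein_id INTEGER REFERENCES protein(id), form TEXT);
CREATE TABLE collection (id INTEGER PRIMARY KEY,
                         crystal_id INTEGER REFERENCES crystal(id), beamline TEXT);
CREATE TABLE processing (id INTEGER PRIMARY KEY,
                         collection_id INTEGER REFERENCES collection(id), status TEXT);
""")
db.execute("INSERT INTO protein VALUES (1, 'lysozyme')")
db.execute("INSERT INTO crystal VALUES (10, 1, 'tetragonal')")
db.execute("INSERT INTO collection VALUES (100, 10, 'NW12')")
db.execute("INSERT INTO processing VALUES (1000, 100, 'scaled')")

# "Direct search": walk the linked trees from a processing run to its protein.
row = db.execute("""
    SELECT p.name, c.form, col.beamline, pr.status
      FROM processing pr
      JOIN collection col ON pr.collection_id = col.id
      JOIN crystal    c   ON col.crystal_id   = c.id
      JOIN protein    p   ON c.protein_id     = p.id
""").fetchone()
assert row == ('lysozyme', 'tetragonal', 'NW12', 'scaled')
```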

  4. A case study for a digital seabed database: Bohai Sea engineering geology database

    NASA Astrophysics Data System (ADS)

    Tianyun, Su; Shikui, Zhai; Baohua, Liu; Ruicai, Liang; Yanpeng, Zheng; Yong, Wang

    2006-07-01

This paper discusses the design of an ORACLE-based Bohai Sea engineering geology database, covering requirements analysis, conceptual structure design, logical structure design, physical structure design and security design. In the study, we used the object-oriented Unified Modeling Language (UML) to model the conceptual structure of the database, and used the data management facilities provided by the object-relational database ORACLE to organize and manage the storage space and improve its security. By this means, the database provides fast and effective data storage, maintenance and query to satisfy the application requirements of the Bohai Sea Oilfield Paradigm Area Information System.

  5. High-performance integrated virtual environment (HIVE): a robust infrastructure for next-generation sequence data analysis

    PubMed Central

    Simonyan, Vahan; Chumakov, Konstantin; Dingerdissen, Hayley; Faison, William; Goldweber, Scott; Golikov, Anton; Gulzar, Naila; Karagiannis, Konstantinos; Vinh Nguyen Lam, Phuc; Maudru, Thomas; Muravitskaja, Olesja; Osipova, Ekaterina; Pan, Yang; Pschenichnov, Alexey; Rostovtsev, Alexandre; Santana-Quintero, Luis; Smith, Krista; Thompson, Elaine E.; Tkachenko, Valery; Torcivia-Rodriguez, John; Wan, Quan; Wang, Jing; Wu, Tsung-Jung; Wilson, Carolyn; Mazumder, Raja

    2016-01-01

    The High-performance Integrated Virtual Environment (HIVE) is a distributed storage and compute environment designed primarily to handle next-generation sequencing (NGS) data. This multicomponent cloud infrastructure provides secure web access for authorized users to deposit, retrieve, annotate and compute on NGS data, and to analyse the outcomes using web interface visual environments appropriately built in collaboration with research and regulatory scientists and other end users. Unlike many massively parallel computing environments, HIVE uses a cloud control server which virtualizes services, not processes. It is both very robust and flexible due to the abstraction layer introduced between computational requests and operating system processes. The novel paradigm of moving computations to the data, instead of moving data to computational nodes, has proven to be significantly less taxing for both hardware and network infrastructure. The honeycomb data model developed for HIVE integrates metadata into an object-oriented model. Its distinction from other object-oriented databases is in the additional implementation of a unified application program interface to search, view and manipulate data of all types. This model simplifies the introduction of new data types, thereby minimizing the need for database restructuring and streamlining the development of new integrated information systems. The honeycomb model employs a highly secure hierarchical access control and permission system, allowing determination of data access privileges in a finely granular manner without flooding the security subsystem with a multiplicity of rules. HIVE infrastructure will allow engineers and scientists to perform NGS analysis in a manner that is both efficient and secure. HIVE is actively supported in public and private domains, and project collaborations are welcomed. Database URL: https://hive.biochemistry.gwu.edu PMID:26989153
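The honeycomb model's actual permission system is not specified in the abstract; one common way to get finely granular access control "without flooding the security subsystem with a multiplicity of rules" is to inherit grants down an object hierarchy, sketched here in a few lines. Paths stand in for HIVE's object tree, and all names are invented:

```python
# A grant on an ancestor covers everything beneath it, so a single rule
# per subtree replaces one rule per object.
acl = {
    "/projects": {"alice": {"read"}},                 # subtree-wide grant
    "/projects/ngs-run-7": {"bob": {"read", "write"}},
}

def allowed(user: str, path: str, perm: str) -> bool:
    node = path
    while node:                                       # climb toward the root
        if perm in acl.get(node, {}).get(user, set()):
            return True
        node = node.rsplit("/", 1)[0]
    return False

assert allowed("alice", "/projects/ngs-run-7/reads.fastq", "read")  # inherited
assert not allowed("alice", "/projects/ngs-run-7", "write")
assert allowed("bob", "/projects/ngs-run-7/reads.fastq", "write")
```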

  6. High-performance integrated virtual environment (HIVE): a robust infrastructure for next-generation sequence data analysis.

    PubMed

    Simonyan, Vahan; Chumakov, Konstantin; Dingerdissen, Hayley; Faison, William; Goldweber, Scott; Golikov, Anton; Gulzar, Naila; Karagiannis, Konstantinos; Vinh Nguyen Lam, Phuc; Maudru, Thomas; Muravitskaja, Olesja; Osipova, Ekaterina; Pan, Yang; Pschenichnov, Alexey; Rostovtsev, Alexandre; Santana-Quintero, Luis; Smith, Krista; Thompson, Elaine E; Tkachenko, Valery; Torcivia-Rodriguez, John; Voskanian, Alin; Wan, Quan; Wang, Jing; Wu, Tsung-Jung; Wilson, Carolyn; Mazumder, Raja

    2016-01-01

The High-performance Integrated Virtual Environment (HIVE) is a distributed storage and compute environment designed primarily to handle next-generation sequencing (NGS) data. This multicomponent cloud infrastructure provides secure web access for authorized users to deposit, retrieve, annotate and compute on NGS data, and to analyse the outcomes using web interface visual environments appropriately built in collaboration with research and regulatory scientists and other end users. Unlike many massively parallel computing environments, HIVE uses a cloud control server which virtualizes services, not processes. It is both very robust and flexible due to the abstraction layer introduced between computational requests and operating system processes. The novel paradigm of moving computations to the data, instead of moving data to computational nodes, has proven to be significantly less taxing for both hardware and network infrastructure. The honeycomb data model developed for HIVE integrates metadata into an object-oriented model. Its distinction from other object-oriented databases is in the additional implementation of a unified application program interface to search, view and manipulate data of all types. This model simplifies the introduction of new data types, thereby minimizing the need for database restructuring and streamlining the development of new integrated information systems. The honeycomb model employs a highly secure hierarchical access control and permission system, allowing determination of data access privileges in a finely granular manner without flooding the security subsystem with a multiplicity of rules. HIVE infrastructure will allow engineers and scientists to perform NGS analysis in a manner that is both efficient and secure. HIVE is actively supported in public and private domains, and project collaborations are welcomed. Database URL: https://hive.biochemistry.gwu.edu. © The Author(s) 2016. Published by Oxford University Press.

  7. Enabling search over encrypted multimedia databases

    NASA Astrophysics Data System (ADS)

    Lu, Wenjun; Swaminathan, Ashwin; Varna, Avinash L.; Wu, Min

    2009-02-01

    Performing information retrieval tasks while preserving data confidentiality is a desirable capability when a database is stored on a server maintained by a third-party service provider. This paper addresses the problem of enabling content-based retrieval over encrypted multimedia databases. Search indexes, along with multimedia documents, are first encrypted by the content owner and then stored onto the server. Through jointly applying cryptographic techniques, such as order preserving encryption and randomized hash functions, with image processing and information retrieval techniques, secure indexing schemes are designed to provide both privacy protection and rank-ordered search capability. Retrieval results on an encrypted color image database and security analysis of the secure indexing schemes under different attack models show that data confidentiality can be preserved while retaining very good retrieval performance. This work has promising applications in secure multimedia management.
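As a toy illustration of the order-preserving ingredient only (not the authors' construction): a secret, strictly increasing mapping from feature values into a larger range preserves comparisons, so the server can rank-order encrypted index values without ever seeing plaintext features. The seed plays the role of the key; sizes are arbitrary:

```python
import random

def ope_table(domain_size: int, range_size: int, seed: int = 42) -> list[int]:
    """Toy order-preserving 'encryption': a secret strictly increasing map
    from feature values [0, domain_size) into a sparse larger range."""
    rng = random.Random(seed)              # the seed acts as the secret key
    return sorted(rng.sample(range(range_size), domain_size))

enc = ope_table(256, 2**16)                # e.g. 8-bit histogram-bin values

a, b = 10, 200
assert (a < b) == (enc[a] < enc[b])        # order survives encryption
```

Real order-preserving encryption schemes are considerably more careful about what the ciphertext distribution leaks; the abstract's security analysis under different attack models addresses exactly that trade-off between retrieval capability and confidentiality.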

  8. A High Speed Mobile Courier Data Access System That Processes Database Queries in Real-Time

    NASA Astrophysics Data System (ADS)

    Gatsheni, Barnabas Ndlovu; Mabizela, Zwelakhe

A secure, high-speed query-processing mobile courier data access (MCDA) system for a courier company has been developed. The system uses wireless networks in combination with wired networks to let an offsite worker (the courier) update a live database at the courier centre in real time, and it is protected by an IPsec-based VPN. To our knowledge, no existing system performs the courier task proposed in this paper.

  9. Draft secure medical database standard.

    PubMed

    Pangalos, George

    2002-01-01

Medical database security is a particularly important issue for all healthcare establishments. Medical information systems are intended to support a wide range of pertinent health issues today, for example: assuring the quality of care, supporting effective management of health services institutions, monitoring and containing the cost of care, implementing technology into care without violating social values, ensuring the equity and availability of care, and preserving humanity despite the proliferation of technology. In this context, medical database security aims primarily to support: high availability, accuracy and consistency of the stored data, medical professional secrecy and confidentiality, and the protection of the privacy of the patient. These properties, though technical in nature, basically require that the system be actually helpful for medical care and not harmful to patients. The latter properties require in turn not only that fundamental ethical principles are not violated by employing database systems, but that they are effectively enforced by technical means. This document reviews the existing and emerging work on the security of medical database systems. It presents in detail the problems and requirements related to medical database security. It addresses the problems of medical database security policies, secure design methodologies and implementation techniques. It also describes the current legal framework and regulatory requirements for medical database security. The issue of medical database security guidelines is also examined in detail. The current national and international efforts in the area are studied, and an overview of the research work in the area is given. The document also presents in detail the most complete set of security guidelines known to us for the development and operation of medical database systems.

  10. Information Security Considerations for Applications Using Apache Accumulo

    DTIC Science & Technology

    2014-09-01

Distributed File System INSCOM United States Army Intelligence and Security Command JPA Java Persistence API JSON JavaScript Object Notation MAC Mandatory... MySQL [13]. BigTable can process 20 petabytes per day [14]. High degree of scalability on commodity hardware. NoSQL databases do not rely on highly...manipulation in relational databases. NoSQL databases each have a unique programming interface that uses a lower-level procedural language (e.g., Java

  11. Oracle Database 10g: a platform for BLAST search and Regular Expression pattern matching in life sciences.

    PubMed

    Stephens, Susie M; Chen, Jake Y; Davidson, Marcel G; Thomas, Shiby; Trute, Barry M

    2005-01-01

    As database management systems expand their array of analytical functionality, they become powerful research engines for biomedical data analysis and drug discovery. Databases can hold most of the data types commonly required in life sciences and consequently can be used as flexible platforms for the implementation of knowledgebases. Performing data analysis in the database simplifies data management by minimizing the movement of data from disks to memory, allowing pre-filtering and post-processing of datasets, and enabling data to remain in a secure, highly available environment. This article describes the Oracle Database 10g implementation of BLAST and Regular Expression Searches and provides case studies of their usage in bioinformatics. http://www.oracle.com/technology/software/index.html.
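Oracle exposes this capability through functions such as REGEXP_LIKE; the same keep-the-computation-near-the-data idea can be sketched portably with SQLite, which lets the application supply the REGEXP implementation that then runs inside the query engine rather than after the data has been pulled out. Sequences and the motif below are illustrative:

```python
import re
import sqlite3

db = sqlite3.connect(":memory:")
# SQLite parses the REGEXP operator but ships no implementation;
# registering one makes pattern matching run inside the database.
db.create_function("REGEXP", 2,
                   lambda pat, s: re.search(pat, s) is not None)

db.execute("CREATE TABLE proteins (id TEXT, seq TEXT)")
db.executemany("INSERT INTO proteins VALUES (?, ?)",
               [("P1", "MKNVSQR"), ("P2", "GGSGGSGGS")])

# Filter in-database for an N-glycosylation-like motif: N, then not-P,
# then S or T. Only matching rows ever leave the engine.
hits = db.execute("SELECT id FROM proteins WHERE seq REGEXP ?",
                  ("N[^P][ST]",)).fetchall()
assert hits == [("P1",)]
```

The benefit named in the abstract is exactly this: pre-filtering happens where the data lives, so only the matching subset is moved and post-processed.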

  12. Successful linking of the Society of Thoracic Surgeons Database to Social Security data to examine the accuracy of Society of Thoracic Surgeons mortality data.

    PubMed

    Jacobs, Jeffrey P; O'Brien, Sean M; Shahian, David M; Edwards, Fred H; Badhwar, Vinay; Dokholyan, Rachel S; Sanchez, Juan A; Morales, David L; Prager, Richard L; Wright, Cameron D; Puskas, John D; Gammie, James S; Haan, Constance K; George, Kristopher M; Sheng, Shubin; Peterson, Eric D; Shewan, Cynthia M; Han, Jane M; Bongiorno, Phillip A; Yohe, Courtney; Williams, William G; Mayer, John E; Grover, Frederick L

    2013-04-01

The Society of Thoracic Surgeons Adult Cardiac Surgery Database has been linked to the Social Security Death Master File to verify "life status" and evaluate long-term surgical outcomes. The objective of this study is to explore practical applications of the linkage of the Society of Thoracic Surgeons Adult Cardiac Surgery Database to the Social Security Death Master File, including the use of the Social Security Death Master File to examine the accuracy of the Society of Thoracic Surgeons 30-day mortality data. On January 1, 2008, the Society of Thoracic Surgeons Adult Cardiac Surgery Database began collecting Social Security numbers in its new version 2.61. This study includes all Society of Thoracic Surgeons Adult Cardiac Surgery Database records for operations with nonmissing Social Security numbers between January 1, 2008, and December 31, 2010, inclusive. To match records between the Society of Thoracic Surgeons Adult Cardiac Surgery Database and the Social Security Death Master File, we used a combined probabilistic and deterministic matching rule with reported high sensitivity and nearly perfect specificity. Between January 1, 2008, and December 31, 2010, the Society of Thoracic Surgeons Adult Cardiac Surgery Database collected data for 870,406 operations. Social Security numbers were available for 541,953 operations and unavailable for 328,453 operations. According to the Society of Thoracic Surgeons Adult Cardiac Surgery Database, the 30-day mortality rate was 17,757/541,953 = 3.3%. Linkage to the Social Security Death Master File identified 16,565 cases of suspected 30-day deaths (3.1%). Of these, 14,983 were recorded as 30-day deaths in the Society of Thoracic Surgeons database (relative sensitivity = 90.4%). Relative sensitivity was 98.8% (12,863/13,014) for suspected 30-day deaths occurring before discharge and 59.7% (2120/3551) for suspected 30-day deaths occurring after discharge.
Linkage to the Social Security Death Master File confirms the accuracy of data describing "mortality within 30 days of surgery" in the Society of Thoracic Surgeons Adult Cardiac Surgery Database. The Society of Thoracic Surgeons and Social Security Death Master File link reveals that capture of 30-day deaths occurring before discharge is highly accurate, and that these in-hospital deaths represent the majority (79% [13,014/16,565]) of all 30-day deaths. Capture of the remaining 30-day deaths occurring after discharge is less complete and needs improvement. Efforts continue to encourage Society of Thoracic Surgeons Database participants to submit Social Security numbers to the Database, thereby enhancing accurate determination of 30-day life status. The Society of Thoracic Surgeons and Social Security Death Master File linkage can facilitate ongoing refinement of mortality reporting. Copyright © 2013 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.
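The study's actual matching rule and field weights are not reproduced in the abstract; a toy combined deterministic-plus-probabilistic rule in the same spirit might look like the following, where an exact identifier match links immediately and fuzzier fields otherwise contribute to a weighted score. All field names, weights and the threshold are invented:

```python
def match(rec: dict, death_rec: dict, threshold: float = 2.0) -> bool:
    # Deterministic pass: an exact, non-empty identifier match links outright.
    if rec["ssn"] and rec["ssn"] == death_rec["ssn"]:
        return True
    # Probabilistic pass: agreement on each field adds weight,
    # disagreement subtracts; the total must clear the threshold.
    score = 0.0
    score += 1.0 if rec["last"].lower() == death_rec["last"].lower() else -0.5
    score += 1.0 if rec["dob"] == death_rec["dob"] else -1.0
    score += 0.5 if rec["first"][:1].lower() == death_rec["first"][:1].lower() else 0.0
    return score >= threshold

a = {"ssn": "",    "last": "Smith", "first": "Jane", "dob": "1950-02-01"}
b = {"ssn": "123", "last": "SMITH", "first": "J.",   "dob": "1950-02-01"}
assert match(a, b)   # linked via the probabilistic rule (score 2.5)
```

Tuning the threshold trades sensitivity against specificity, which is why the study reports both for its rule.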

  13. Exploring the Lack of Interoperability of Databases within Department of Homeland Security Interagency Environment Concerning Maritime Port Security

    DTIC Science & Technology

    2009-03-01

37 Figure 8 New Information Sharing Model from United States Intelligence Community Information Sharing...PRIDE while the Coast Guard has MISSLE and the newly constructed WATCHKEEPER. All these databases contain intelligence on incoming vessels...decision making. Experts rely heavily on future projections as hallmarks of skilled performance." (Endsley et al. 2006) The SA model above

  14. Oracle Database 10g: a platform for BLAST search and Regular Expression pattern matching in life sciences

    PubMed Central

    Stephens, Susie M.; Chen, Jake Y.; Davidson, Marcel G.; Thomas, Shiby; Trute, Barry M.

    2005-01-01

    As database management systems expand their array of analytical functionality, they become powerful research engines for biomedical data analysis and drug discovery. Databases can hold most of the data types commonly required in life sciences and consequently can be used as flexible platforms for the implementation of knowledgebases. Performing data analysis in the database simplifies data management by minimizing the movement of data from disks to memory, allowing pre-filtering and post-processing of datasets, and enabling data to remain in a secure, highly available environment. This article describes the Oracle Database 10g implementation of BLAST and Regular Expression Searches and provides case studies of their usage in bioinformatics. http://www.oracle.com/technology/software/index.html PMID:15608287

  15. The Design of a High Performance Earth Imagery and Raster Data Management and Processing Platform

    NASA Astrophysics Data System (ADS)

    Xie, Qingyun

    2016-06-01

This paper summarizes the general requirements and specific characteristics of both a geospatial raster database management system and a raster data processing platform, from a domain-specific perspective as well as from a computing point of view. It also discusses the need for tight integration between the database system and the processing system. These requirements resulted in Oracle Spatial GeoRaster, a global-scale, high-performance earth imagery and raster data management and processing platform. The rationale, design, implementation, and benefits of Oracle Spatial GeoRaster are described. As a database management system, GeoRaster defines an integrated raster data model and supports image compression, data manipulation, general and spatial indexes, content- and context-based queries and updates, versioning, concurrency, security, replication, standby, backup and recovery, multitenancy, and ETL. It provides high scalability using computer and storage clustering. As a raster data processing platform, GeoRaster provides basic operations, image processing, raster analytics, and data distribution featuring high-performance computing (HPC). Specifically, HPC features include locality computing, concurrent processing, parallel processing, and in-memory computing. In addition, the APIs and the plug-in architecture are discussed.

  16. Hierarchical data security in a Query-By-Example interface for a shared database.

    PubMed

    Taylor, Merwyn

    2002-06-01

Whenever a shared database resource containing critical patient data is created, protecting the contents of the database is a high-priority goal. This goal can be achieved by developing a Query-By-Example (QBE) interface, designed to access a shared database, and embedding within the QBE a hierarchical security module that limits access to the data. The security module ensures that researchers working in one clinic do not get access to data from another clinic. The security can be based on a flexible taxonomy structure that allows ordinary users to access data from individual clinics and super users to access data from all clinics. All researchers submit queries through the same interface, and the security module processes the taxonomy and user identifiers to limit access. Using this system, two different users with different access rights can submit the same query and get different results, thus reducing the need to create different interfaces for different clinics and access rights.
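A minimal sketch of such a security module (the taxonomy, table and column names are invented): every query submitted through the shared interface is rewritten with a clinic filter derived from the user's position in the taxonomy, so the same query yields different results for different users:

```python
# Taxonomy: a scope either names one clinic or expands to several
# (super users). This stands in for the flexible taxonomy structure.
TAXONOMY = {"all": ["cardiology", "oncology"]}

def visible_clinics(user_scope: str) -> list[str]:
    return TAXONOMY.get(user_scope, [user_scope])

def secure_query(base_sql: str, user_scope: str):
    """Rewrite a QBE-generated query with the user's clinic filter."""
    clinics = visible_clinics(user_scope)
    marks = ",".join("?" * len(clinics))
    return f"{base_sql} WHERE clinic IN ({marks})", clinics

sql, params = secure_query("SELECT * FROM visits", "cardiology")
assert params == ["cardiology"]                  # ordinary user: one clinic
sql, params = secure_query("SELECT * FROM visits", "all")
assert params == ["cardiology", "oncology"]      # super user: every clinic
```

Because the filter is injected by the module rather than typed by the researcher, one interface serves every access level.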

  17. Active in-database processing to support ambient assisted living systems.

    PubMed

    de Morais, Wagner O; Lundström, Jens; Wickström, Nicholas

    2014-08-12

    As an alternative to the existing software architectures that underpin the development of smart homes and ambient assisted living (AAL) systems, this work presents a database-centric architecture that takes advantage of active databases and in-database processing. Current platforms supporting AAL systems use database management systems (DBMSs) exclusively for data storage. Active databases employ database triggers to detect and react to events taking place inside or outside of the database. DBMSs can be extended with stored procedures and functions that enable in-database processing. This means that the data processing is integrated and performed within the DBMS. The feasibility and flexibility of the proposed approach were demonstrated with the implementation of three distinct AAL services. The active database was used to detect bed-exits and to discover common room transitions and deviations during the night. In-database machine learning methods were used to model early night behaviors. Consequently, active in-database processing avoids transferring sensitive data outside the database, and this improves performance, security and privacy. Furthermore, centralizing the computation into the DBMS facilitates code reuse, adaptation and maintenance. These are important system properties that take into account the evolving heterogeneity of users, their needs and the devices that are characteristic of smart homes and AAL systems. Therefore, DBMSs can provide capabilities to address requirements for scalability, security, privacy, dependability and personalization in applications of smart environments in healthcare.
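The bed-exit detector described above can be sketched with a database trigger, the core "active database" mechanism: the rule fires inside the DBMS, so raw sensor readings never leave it. The sensor schema is invented, and SQLite stands in for the DBMS:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE bed_sensor (ts INTEGER, occupied INTEGER);
CREATE TABLE events     (ts INTEGER, kind TEXT);

-- Active-database rule: react inside the DBMS when the bed sensor
-- flips to unoccupied; sensitive raw data stays in the database.
CREATE TRIGGER bed_exit AFTER INSERT ON bed_sensor
WHEN NEW.occupied = 0
BEGIN
    INSERT INTO events VALUES (NEW.ts, 'bed-exit');
END;
""")
db.execute("INSERT INTO bed_sensor VALUES (1, 1)")   # in bed: no event
db.execute("INSERT INTO bed_sensor VALUES (2, 0)")   # bed exit: trigger fires
assert db.execute("SELECT * FROM events").fetchall() == [(2, "bed-exit")]
```

Stored procedures extend the same idea from event detection to full in-database processing, such as the machine-learning models the abstract mentions.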

  18. Active In-Database Processing to Support Ambient Assisted Living Systems

    PubMed Central

    de Morais, Wagner O.; Lundström, Jens; Wickström, Nicholas

    2014-01-01

    As an alternative to the existing software architectures that underpin the development of smart homes and ambient assisted living (AAL) systems, this work presents a database-centric architecture that takes advantage of active databases and in-database processing. Current platforms supporting AAL systems use database management systems (DBMSs) exclusively for data storage. Active databases employ database triggers to detect and react to events taking place inside or outside of the database. DBMSs can be extended with stored procedures and functions that enable in-database processing. This means that the data processing is integrated and performed within the DBMS. The feasibility and flexibility of the proposed approach were demonstrated with the implementation of three distinct AAL services. The active database was used to detect bed-exits and to discover common room transitions and deviations during the night. In-database machine learning methods were used to model early night behaviors. Consequently, active in-database processing avoids transferring sensitive data outside the database, and this improves performance, security and privacy. Furthermore, centralizing the computation into the DBMS facilitates code reuse, adaptation and maintenance. These are important system properties that take into account the evolving heterogeneity of users, their needs and the devices that are characteristic of smart homes and AAL systems. Therefore, DBMSs can provide capabilities to address requirements for scalability, security, privacy, dependability and personalization in applications of smart environments in healthcare. PMID:25120164

  19. A Secure Multicast Framework in Large and High-Mobility Network Groups

    NASA Astrophysics Data System (ADS)

    Lee, Jung-San; Chang, Chin-Chen

With the widespread use of Internet applications such as teleconferencing, Pay-TV, collaborative tasks, and message services, how to construct and distribute the group session key to all group members securely is becoming more and more important. Instead of adopting point-to-point packet delivery, these emerging applications are based on multicast communication, which allows a group member to communicate with multiple parties efficiently. There are two main issues in multicast communication: key distribution and scalability. The first is how to distribute the group session key to all group members securely; the second is how to maintain high performance in large network groups. Group members in conventional multicast systems have to keep numerous secret keys in databases, which is very inconvenient for them. Furthermore, in case a member joins or leaves the communication group, many involved participants have to change their own secret keys to preserve forward secrecy and backward secrecy. We consequently propose a novel scheme for providing secure multicast communication in large network groups. Our proposed framework not only preserves forward secrecy and backward secrecy but also achieves better performance than existing alternatives. Specifically, simulation results demonstrate that our scheme is suitable for high-mobility environments.
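The paper's own framework is not reproduced here, but the scalability concern it addresses is commonly quantified with a logical key hierarchy (LKH), a standard construction in which members sit at the leaves of a binary key tree: when one member leaves, only the keys on its path to the root must be replaced, so a rekey touches O(log n) keys rather than O(n). A back-of-the-envelope sketch:

```python
import math

def keys_to_change_on_leave(n_members: int) -> int:
    """Keys to replace when one member leaves an LKH binary key tree:
    the path from the member's leaf to the root, including the group key."""
    return math.ceil(math.log2(n_members)) + 1

assert keys_to_change_on_leave(1024) == 11   # vs. 1024 with flat pairwise keys
assert keys_to_change_on_leave(2) == 2
```

Preserving forward and backward secrecy is exactly why these keys must change on every join or leave; the design question is how few of them change, and how little state each member must store.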

  20. PACE: Proactively Secure Accumulo with Cryptographic Enforcement

    DTIC Science & Technology

    2017-05-27

Abstract—Cloud-hosted databases have many compelling benefits, including high availability, flexible resource allocation, and resiliency to attack...infrastructure to the cloud. This move is motivated by the cloud’s increased availability, flexibility, and resilience [1]. Most importantly, the cloud enables...a level of availability and performance that would be impossible for many companies to achieve using their own infrastructure. For example, using a

  1. Construction and validation of a web-based epidemiological database for inflammatory bowel diseases in Europe An EpiCom study.

    PubMed

    Burisch, Johan; Cukovic-Cavka, Silvija; Kaimakliotis, Ioannis; Shonová, Olga; Andersen, Vibeke; Dahlerup, Jens F; Elkjaer, Margarita; Langholz, Ebbe; Pedersen, Natalia; Salupere, Riina; Kolho, Kaija-Leena; Manninen, Pia; Lakatos, Peter Laszlo; Shuhaibar, Mary; Odes, Selwyn; Martinato, Matteo; Mihu, Ion; Magro, Fernando; Belousova, Elena; Fernandez, Alberto; Almer, Sven; Halfvarson, Jonas; Hart, Ailsa; Munkholm, Pia

    2011-08-01

    The EpiCom study investigates a possible East-West gradient in the incidence of IBD in Europe and the association with environmental factors. A secured web-based database is used to facilitate and centralize data registration. To construct and validate a web-based inception cohort database available in both English and Russian. The EpiCom database has been constructed in collaboration with all 34 participating centers. The database was translated into Russian using forward translation; patient questionnaires were translated by simplified forward-backward translation. Data insertion implies fulfillment of international diagnostic criteria, disease activity, medical therapy, quality of life, work productivity and activity impairment, outcome of pregnancy, surgery, cancer and death. Data is secured by the WinLog3 System, developed in cooperation with the Danish Data Protection Agency. Validation of the database was performed in two consecutive rounds, each followed by corrections in accordance with comments. The EpiCom database fulfills the requirements of the participating countries' local data security agencies by being stored at a single location. After the second validation round, the database overall was rated "good" or "very good" by 81% of the participants, and its general applicability was rated "good" or "very good" by 77%. In the inclusion period, January 1st to December 31st 2010, 1336 IBD patients were included in the database. A user-friendly, tailor-made and secure web-based inception cohort database has been successfully constructed, facilitating remote data input. The incidence of IBD in 23 European countries can be found at www.epicom-ecco.eu. Copyright © 2011 European Crohn's and Colitis Organisation. All rights reserved.

  2. The research of network database security technology based on web service

    NASA Astrophysics Data System (ADS)

    Meng, Fanxing; Wen, Xiumei; Gao, Liting; Pang, Hui; Wang, Qinglin

    2013-03-01

    Database technology is one of the most widely applied computer technologies, and its security is becoming more and more important. This paper introduces database security and network database security levels, studies the security technology of the network database, analyzes the sub-key encryption algorithm in detail, and applies this algorithm successfully to the campus one-card system. The realization process of the encryption algorithm is discussed; this method can serve as a reference in many fields, particularly in management information system security and e-commerce.

  3. Template protection and its implementation in 3D face recognition systems

    NASA Astrophysics Data System (ADS)

    Zhou, Xuebing

    2007-04-01

    As biometric recognition systems are widely applied in various application areas, security and privacy risks have recently attracted the attention of the biometric community. Template protection techniques prevent stored reference data from revealing private biometric information and enhance the security of biometric systems against attacks such as identity theft and cross matching. This paper concentrates on a template protection algorithm that merges methods from cryptography, error correction coding and biometrics. The key component of the algorithm is to convert biometric templates into binary vectors. It is shown that the binary vectors should be robust, uniformly distributed, statistically independent and collision-free so that authentication performance can be optimized and information leakage can be avoided. Depending on the statistical character of the biometric template, different approaches for transforming biometric templates into compact binary vectors are presented. The proposed methods are integrated into a 3D face recognition system and tested on the 3D facial images of the FRGC database. It is shown that the resulting binary vectors provide an authentication performance similar to that of the original 3D face templates. A high security level is achieved with reasonable false acceptance and false rejection rates, based on an efficient statistical analysis. The algorithm estimates the statistical characteristics of biometric templates from a number of biometric samples in the enrollment database. For the FRGC 3D face database, our tests show only a small difference in robustness and discriminative power between classification results obtained under the assumption of uniformly distributed templates and those obtained under the assumption of Gaussian-distributed templates.
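The conversion from real-valued templates to robust binary vectors is often done by thresholding each feature against population statistics and keeping only the most reliable bits. The sketch below is a generic illustration of that idea under those assumptions, not the paper's exact method; the function name, parameters, and reliability measure are invented.

```python
import numpy as np

def binarize(template, pop_mean, pop_std, n_bits):
    """Turn a real-valued biometric template into a compact binary
    vector: a feature becomes 1 if it lies above the population mean,
    and only the n_bits most reliable features (furthest from the mean
    in units of standard deviation) are kept."""
    reliability = np.abs(template - pop_mean) / pop_std
    idx = np.argsort(reliability)[::-1][:n_bits]   # most reliable first
    bits = (template[idx] > pop_mean[idx]).astype(np.uint8)
    return idx, bits

rng = np.random.default_rng(0)
pop = rng.normal(size=(500, 64))            # stand-in enrollment set
mean, std = pop.mean(axis=0), pop.std(axis=0)
probe = pop[0]
idx, bits = binarize(probe, mean, std, 16)
print(bits.shape)   # (16,)
```

Selecting features far from the population mean makes the resulting bits less likely to flip between noisy captures of the same subject, which is the "robust" property the abstract calls for.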

  4. Trends in Primary and Revision Hip Arthroplasty Among Orthopedic Surgeons Who Take the American Board of Orthopedics Part II Examination.

    PubMed

    Eslam Pour, Aidin; Bradbury, Thomas L; Horst, Patrick K; Harrast, John J; Erens, Greg A; Roberson, James R

    2016-07-01

    A certified list of all operative cases performed within a 6-month period is a required prerequisite for surgeons taking the American Board of Orthopaedic Surgery Part II oral examination. Using the American Board of Orthopaedic Surgery secure Internet database containing these cases, this study (1) assessed changing trends for primary and revision total hip arthroplasty (THA) and (2) compared practices and early postoperative complications between 2 groups of examinees: those with and without adult reconstruction fellowship training. The secure Internet database was searched for all 2003-2013 procedures with a Current Procedural Terminology code for THA, hip resurfacing, hemiarthroplasty, revision hip arthroplasty, conversion to THA, or removal of hip implant (Girdlestone, static, or dynamic spacer). Adult reconstruction fellowship-trained surgeons performed 60% of the more than 33,000 surgeries identified (average 28.1), and nonfellowship-trained surgeons performed 40% (average 5.2) (P < .001). Fellowship-trained surgeons performed significantly more revision surgeries for infection (71% vs 29%) (P < .001). High-volume surgeons had significantly fewer complications in both primary (11.1% vs 19.6%) and revision surgeries (29% vs 35.5%) (P < .001). Those who passed the Part II examination reported higher rates of complications (21.5% vs 19.9%). In early practice, primary and revision hip arthroplasties are often performed by surgeons without adult reconstruction fellowship training. Complications are less frequently reported by surgeons with larger volumes of joint replacement surgery who perform either primary or more complex cases. Primary hip arthroplasty is increasingly performed by surgeons early in practice who have completed an adult reconstruction fellowship after residency training. This trend is even more pronounced for more complex cases such as revision or management of infection. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. An end to end secure CBIR over encrypted medical database.

    PubMed

    Bellafqira, Reda; Coatrieux, Gouenou; Bouslimi, Dalel; Quellec, Gwenole

    2016-08-01

    In this paper, we propose a new secure content based image retrieval (SCBIR) system adapted to the cloud framework. This solution allows a physician to retrieve images of similar content within an outsourced and encrypted image database, without decrypting them. Contrary to existing CBIR approaches in the encrypted domain, the originality of the proposed scheme lies in the fact that the features extracted from the encrypted images are themselves encrypted. This is achieved by means of homomorphic encryption and two non-colluding servers, both of which we consider honest but curious. In that way, an end to end secure CBIR process is ensured. Experimental results carried out on a diabetic retinopathy database encrypted with the Paillier cryptosystem indicate that our SCBIR achieves retrieval performance as good as if images were processed in their non-encrypted form.
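The Paillier cryptosystem used in these experiments is additively homomorphic: multiplying two ciphertexts yields an encryption of the sum of the plaintexts, which is what lets a server combine encrypted feature values without ever decrypting them. A toy illustration of that property follows; the tiny primes are for exposition only, not a usable key size.

```python
import math
import secrets

# Toy Paillier keypair (real deployments use >= 2048-bit moduli).
p, q = 293, 433
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = math.lcm(p - 1, q - 1)
# mu = (L(g^lam mod n^2))^-1 mod n, with L(x) = (x - 1) / n
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)

def enc(m):
    """Encrypt m < n with fresh randomness r coprime to n."""
    while True:
        r = secrets.randbelow(n - 1) + 1
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def dec(c):
    return ((pow(c, lam, n2) - 1) // n * mu) % n

a, b = enc(17), enc(25)
print(dec(a * b % n2))   # 42: the product of ciphertexts decrypts to the sum
```

This homomorphism is exactly what a "honest but curious" server needs: it can aggregate encrypted distances or feature sums while learning nothing about the underlying images.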

  6. Re-designing the PhEDEx Security Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, C.-H.; Wildish, T.; Zhang, X.

    2014-01-01

    PhEDEx, the data-placement tool used by the CMS experiment at the LHC, was conceived in a more trusting time. The security model provided a safe environment for site agents and operators, but offered little more protection than that. Data was not sufficiently protected against loss caused by operator error or software bugs or by deliberate manipulation of the database. Operators were given high levels of access to the database, beyond what was actually needed to accomplish their tasks. This exposed them to the risk of suspicion should an incident occur. Multiple implementations of the security model led to difficulties maintaining code, which can lead to degradation of security over time. In order to meet the simultaneous goals of protecting CMS data, protecting the operators from undue exposure to risk, increasing monitoring capabilities and improving maintainability of the security model, the PhEDEx security model was redesigned and re-implemented. Security was moved from the application layer into the database itself, fine-grained access roles were established, and tools and procedures were created to control the evolution of the security model over time. In this paper we describe this work, we describe the deployment of the new security model, and we show how these enhancements improve security on several fronts simultaneously.

  7. Re-designing the PhEDEx Security Model

    NASA Astrophysics Data System (ADS)

    Huang, C.-H.; Wildish, T.; Zhang, X.

    2014-06-01

    PhEDEx, the data-placement tool used by the CMS experiment at the LHC, was conceived in a more trusting time. The security model provided a safe environment for site agents and operators, but offered little more protection than that. Data was not sufficiently protected against loss caused by operator error or software bugs or by deliberate manipulation of the database. Operators were given high levels of access to the database, beyond what was actually needed to accomplish their tasks. This exposed them to the risk of suspicion should an incident occur. Multiple implementations of the security model led to difficulties maintaining code, which can lead to degradation of security over time. In order to meet the simultaneous goals of protecting CMS data, protecting the operators from undue exposure to risk, increasing monitoring capabilities and improving maintainability of the security model, the PhEDEx security model was redesigned and re-implemented. Security was moved from the application layer into the database itself, fine-grained access roles were established, and tools and procedures were created to control the evolution of the security model over time. In this paper we describe this work, we describe the deployment of the new security model, and we show how these enhancements improve security on several fronts simultaneously.

  8. How ISO/IEC 17799 can be used for base lining information assurance among entities using data mining for defense, homeland security, commercial, and other civilian/commercial domains

    NASA Astrophysics Data System (ADS)

    Perry, William G.

    2006-04-01

    One goal of database mining is to draw unique and valid perspectives from multiple data sources. Insights that are fashioned from closely-held data stores are likely to possess a high degree of reliability. The degree of information assurance comes into question, however, when external databases are accessed, combined and analyzed to form new perspectives. ISO/IEC 17799, Information technology-Security techniques-Code of practice for information security management, can be used to establish a higher level of information assurance among disparate entities using data mining in the defense, homeland security, commercial and other civilian/commercial domains. Organizations that meet ISO/IEC information security standards have identified and assessed risks, threats and vulnerabilities and have taken significant proactive steps to meet their unique security requirements. The ISO standards address twelve domains: risk assessment and treatment; security policy; organization of information security; asset management; human resources security; physical and environmental security; communications and operations management; access control; information systems acquisition, development and maintenance; information security incident management; business continuity management; and compliance. Analysts can be relatively confident that if organizations are ISO 17799 compliant, a high degree of information assurance is likely to be a characteristic of the data sets being used. The reverse may also be true: extracting, fusing and drawing conclusions based upon databases with a low degree of information assurance may be fraught with all of the hazards that come from knowingly using bad data to make decisions. Using ISO/IEC 17799 as a baseline for information assurance can help mitigate these risks.

  9. A privacy preserving protocol for tracking participants in phase I clinical trials.

    PubMed

    El Emam, Khaled; Farah, Hanna; Samet, Saeed; Essex, Aleksander; Jonker, Elizabeth; Kantarcioglu, Murat; Earle, Craig C

    2015-10-01

    Some phase I clinical trials offer strong financial incentives for healthy individuals to participate in their studies. There is evidence that some individuals enroll in multiple trials concurrently. This creates safety risks and introduces data quality problems into the trials. Our objective was to construct a privacy preserving protocol to track phase I participants to detect concurrent enrollment. A protocol using secure probabilistic querying against a database of trial participants, allowing screening during telephone interviews and on-site enrollment, was developed. The match variables consisted of demographic information. The accuracy (sensitivity, precision, and negative predictive value) of the matching and its computational performance in seconds were measured under simulated environments. Accuracy was also compared to non-secure matching methods. The protocol's performance scales linearly with the database size. At the largest database size of 20,000 participants, a query takes under 20 s on a 64-core machine. Sensitivity, precision, and negative predictive value of the queries were consistently at or above 0.9, and were very similar to non-secure versions of the protocol. The protocol provides a reasonable solution to the concurrent enrollment problem in phase I clinical trials, and is able to ensure that personal information about participants is kept secure. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
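A standard building block for this kind of privacy-preserving record matching is to hash demographic fields into a Bloom filter, so sites can test membership without exchanging raw identifiers. The abstract does not disclose the authors' exact construction, so the following is a generic sketch of that building block; the class, parameters, and sample fields are invented.

```python
import hashlib

class BloomFilter:
    """Fixed-size Bloom filter over hashed demographic fields; parties
    can probe it for matches without seeing the raw identifiers."""
    def __init__(self, m=256, k=4):
        self.m, self.k, self.bits = m, k, 0

    def _positions(self, item):
        for i in range(self.k):                     # k independent hashes
            d = hashlib.sha256(f"{i}|{item}".encode()).digest()
            yield int.from_bytes(d[:4], "big") % self.m

    def add(self, item):
        for pos in self._positions(item):
            self.bits |= 1 << pos

    def __contains__(self, item):
        return all(self.bits >> p & 1 for p in self._positions(item))

def record_filter(fields):
    """Normalize and insert one participant's demographic fields."""
    bf = BloomFilter()
    for f in fields:
        bf.add(f.strip().lower())
    return bf

site_a = record_filter(["1980-05-02", "male", "K2P 1L4"])
print("male" in site_a)      # True
print("female" in site_a)    # almost certainly False (false positives possible)
```

Bloom-filter matching is probabilistic, which mirrors the paper's trade-off: membership tests can yield false positives but never false negatives, so sensitivity stays high while raw identifiers stay private.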

  10. Detection and Prevention of Insider Threats in Database Driven Web Services

    NASA Astrophysics Data System (ADS)

    Chumash, Tzvi; Yao, Danfeng

    In this paper, we take the first step to address the gap between the security needs in outsourced hosting services and the protection provided in the current practice. We consider both insider and outsider attacks in the third-party web hosting scenarios. We present SafeWS, a modular solution that is inserted between server side scripts and databases in order to prevent and detect website hijacking and unauthorized access to stored data. To achieve the required security, SafeWS utilizes a combination of lightweight cryptographic integrity and encryption tools, software engineering techniques, and security data management principles. We also describe our implementation of SafeWS and its evaluation. The performance analysis of our prototype shows the overhead introduced by security verification is small. SafeWS will allow business owners to significantly reduce the security risks and vulnerabilities of outsourcing their sensitive customer data to third-party providers.

  11. Using High-Dimensional Image Models to Perform Highly Undetectable Steganography

    NASA Astrophysics Data System (ADS)

    Pevný, Tomáš; Filler, Tomáš; Bas, Patrick

    This paper presents a complete methodology for designing practical and highly undetectable stegosystems for real digital media. The main design principle is to minimize a suitably defined distortion by means of an efficient coding algorithm. The distortion is defined as a weighted difference of extended state-of-the-art feature vectors already used in steganalysis. This allows us to "preserve" the model used by the steganalyst and thus remain undetectable even for large payloads. The framework can be efficiently implemented even when the dimensionality of the feature set used by the embedder is larger than 10^7. The high-dimensional model is necessary to avoid known security weaknesses. Although high-dimensional models might be a problem in steganalysis, we explain why they are acceptable in steganography. As an example, we introduce HUGO, a new embedding algorithm for spatial-domain digital images, and we contrast its performance with LSB matching. On the BOWS2 image database, and in contrast with LSB matching, HUGO allows the embedder to hide a 7× longer message at the same level of security.
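LSB matching, the baseline HUGO is compared against, fits in a few lines: when a pixel's least significant bit disagrees with the message bit, randomly add or subtract 1 instead of overwriting the bit, which avoids the pairs-of-values artifact of plain LSB replacement. A minimal sketch (toy data; real embedders also permute the embedding path):

```python
import random

def lsb_match(pixels, message_bits, seed=42):
    """±1 embedding: adjust a pixel by +1 or -1 (chosen at random)
    whenever its least significant bit differs from the message bit."""
    rng = random.Random(seed)
    stego = list(pixels)
    for i, bit in enumerate(message_bits):
        if stego[i] & 1 != bit:
            if stego[i] == 0:          # clamp at the 8-bit range edges
                stego[i] += 1
            elif stego[i] == 255:
                stego[i] -= 1
            else:
                stego[i] += rng.choice((-1, 1))
    return stego

cover = [100, 101, 102, 103, 0, 255]
msg = [1, 1, 0, 1, 1, 0]
stego = lsb_match(cover, msg)
print([p & 1 for p in stego])   # [1, 1, 0, 1, 1, 0]
```

Extraction is just reading the LSBs back, and every pixel changes by at most 1; HUGO improves on this by choosing *which* pixels to change so that high-dimensional steganalysis features are disturbed as little as possible.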

  12. A DICOM based radiotherapy plan database for research collaboration and reporting

    NASA Astrophysics Data System (ADS)

    Westberg, J.; Krogh, S.; Brink, C.; Vogelius, I. R.

    2014-03-01

    Purpose: To create a central radiotherapy (RT) plan database for dose analysis and reporting, capable of calculating and presenting statistics on user-defined patient groups. The goal is to facilitate multi-center research studies with easy and secure access to RT plans and statistics on protocol compliance. Methods: RT institutions are able to send data to the central database using DICOM communications on a secure computer network. The central system is composed of a number of DICOM servers, an SQL database and in-house developed software services to process the incoming data. A web site within the secure network allows users to manage their submitted data. Results: The RT plan database has been developed in Microsoft .NET and users are able to send DICOM data between RT centers in Denmark. Dose-volume histogram (DVH) calculations performed by the system are comparable to those of conventional RT software. A permission system was implemented to ensure access control and easy, yet secure, data sharing across centers. The reports contain DVH statistics for structures in user-defined patient groups. The system currently contains over 2200 patients in 14 collaborations. Conclusions: A central RT plan repository for use in multi-center trials and quality assurance was created. The system provides an attractive alternative to dummy runs by enabling continuous monitoring of protocol conformity and plan metrics in a trial.
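A cumulative DVH of the kind the system computes reports, for each dose level, the fraction of a structure's volume receiving at least that dose. The sketch below shows that calculation on toy voxel data; it is a generic illustration, not the system's .NET implementation, and the function name and bin width are invented.

```python
import numpy as np

def cumulative_dvh(dose_voxels, bin_width=0.5):
    """Cumulative dose-volume histogram: for each dose bin edge, the
    percentage of the structure's volume receiving at least that dose."""
    dose_voxels = np.asarray(dose_voxels, dtype=float)
    edges = np.arange(0, dose_voxels.max() + bin_width, bin_width)
    volume_pct = [(dose_voxels >= d).mean() * 100 for d in edges]
    return edges, np.array(volume_pct)

# toy structure of 4 equal-volume voxels (doses in Gy)
doses = [10.0, 20.0, 30.0, 40.0]
edges, vol = cumulative_dvh(doses, bin_width=10.0)
print(vol.tolist())   # [100.0, 100.0, 75.0, 50.0, 25.0]
```

The curve is monotonically non-increasing by construction, which is what makes DVH statistics (e.g. V20, the volume receiving at least 20 Gy) directly comparable across plans and centers.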

  13. Comment on "Secure quantum private information retrieval using phase-encoded queries"

    NASA Astrophysics Data System (ADS)

    Shi, Run-hua; Mu, Yi; Zhong, Hong; Zhang, Shun

    2016-12-01

    In this Comment, we reexamine the security of phase-encoded quantum private query (QPQ). We find that the current phase-encoded QPQ protocols, including their applications, are vulnerable to a probabilistic entangle-and-measure attack performed by the owner of the database. Furthermore, we discuss how to overcome this security loophole and present an improved cheat-sensitive QPQ protocol without losing the good features of the original protocol.

  14. POLARIS: A 30-meter probabilistic soil series map of the contiguous United States

    USGS Publications Warehouse

    Chaney, Nathaniel W; Wood, Eric F; McBratney, Alexander B; Hempel, Jonathan W; Nauman, Travis; Brungard, Colby W.; Odgers, Nathan P

    2016-01-01

    A new complete map of soil series probabilities has been produced for the contiguous United States at a 30 m spatial resolution. This innovative database, named POLARIS, is constructed using available high-resolution geospatial environmental data and a state-of-the-art machine learning algorithm (DSMART-HPC) to remap the Soil Survey Geographic (SSURGO) database. This 9 billion grid cell database is possible using available high performance computing resources. POLARIS provides a spatially continuous, internally consistent, quantitative prediction of soil series. It offers potential solutions to the primary weaknesses in SSURGO: 1) unmapped areas are gap-filled using survey data from the surrounding regions, 2) the artificial discontinuities at political boundaries are removed, and 3) the use of high resolution environmental covariate data leads to a spatial disaggregation of the coarse polygons. The geospatial environmental covariates that have the largest role in assembling POLARIS over the contiguous United States (CONUS) are fine-scale (30 m) elevation data and coarse-scale (~ 2 km) estimates of the geographic distribution of uranium, thorium, and potassium. A preliminary validation of POLARIS using the NRCS National Soil Information System (NASIS) database shows variable performance over CONUS. In general, the best performance is obtained at grid cells where DSMART-HPC is most able to reduce the chance of misclassification. The important role of environmental covariates in limiting prediction uncertainty suggests including additional covariates is pivotal to improving POLARIS' accuracy. This database has the potential to improve the modeling of biogeochemical, water, and energy cycles in environmental models; enhance availability of data for precision agriculture; and assist hydrologic monitoring and forecasting to ensure food and water security.

  15. Analyzing GAIAN Database (GaianDB) on a Tactical Network

    DTIC Science & Technology

    2015-11-30

    we connected 3 Raspberry Pis running GaianDB and our augmented version of splatform to a network of 3 CSRs. The Raspberry Pi is a low power, low...based on Debian from a connected secure digital high capacity (SDHC) card or a universal serial bus (USB) device. The Raspberry Pi comes equipped with...requirements, capabilities, and cost make the Raspberry Pi a useful device for sensor experimentation. From there, we performed 3 types of benchmarks

  16. Four barriers to the global understanding of biodiversity conservation: wealth, language, geographical location and security.

    PubMed

    Amano, Tatsuya; Sutherland, William J

    2013-04-07

    Global biodiversity conservation is seriously challenged by gaps and heterogeneity in the geographical coverage of existing information. Nevertheless, the key barriers to the collection and compilation of biodiversity information at a global scale have yet to be identified. We show that wealth, language, geographical location and security each play an important role in explaining spatial variations in data availability in four different types of biodiversity databases. The number of records per square kilometre is high in countries with high per capita gross domestic product (GDP), high proportion of English speakers and high security levels, and those located close to the country hosting the database; but these are not necessarily countries with high biodiversity. These factors are considered to affect data availability by impeding either the activities of scientific research or active international communications. Our results demonstrate that efforts to solve environmental problems at a global scale will gain significantly by focusing scientific education, communication, research and collaboration in low-GDP countries with fewer English speakers and located far from Western countries that host the global databases; countries that have experienced conflict may also benefit. Findings of this study may be broadly applicable to other fields that require the compilation of scientific knowledge at a global level.

  17. Survey of Machine Learning Methods for Database Security

    NASA Astrophysics Data System (ADS)

    Kamra, Ashish; Bertino, Elisa

    Application of machine learning techniques to database security is an emerging area of research. In this chapter, we present a survey of various approaches that use machine learning/data mining techniques to enhance the traditional security mechanisms of databases. There are two key database security areas in which these techniques have found applications, namely, detection of SQL Injection attacks and anomaly detection for defending against insider threats. Apart from the research prototypes and tools, various third-party commercial products are also available that provide database activity monitoring solutions by profiling database users and applications. We present a survey of such products. We end the chapter with a primer on mechanisms for responding to database anomalies.
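The anomaly-detection approaches surveyed typically profile coarse features of normal database activity (command type, tables touched, often per role) and flag queries that deviate from the learned profile. The following is a deliberately simple sketch of that idea, not any surveyed system's algorithm; the class and feature choice are invented.

```python
from collections import Counter

class QueryProfiler:
    """Toy insider-threat detector: learn the frequency of coarse query
    features (command, set of tables touched) from normal traffic, then
    flag feature combinations never seen in training."""
    def __init__(self):
        self.seen = Counter()

    def train(self, queries):
        for cmd, tables in queries:
            # frozenset makes the profile insensitive to table order
            self.seen[(cmd, frozenset(tables))] += 1

    def is_anomalous(self, cmd, tables):
        return self.seen[(cmd, frozenset(tables))] == 0

profiler = QueryProfiler()
profiler.train([
    ("SELECT", ["orders"]),
    ("SELECT", ["orders", "customers"]),
    ("UPDATE", ["orders"]),
])
print(profiler.is_anomalous("SELECT", ["orders"]))    # False
print(profiler.is_anomalous("SELECT", ["payroll"]))   # True: never profiled
```

Real products extend this with richer features (columns accessed, result sizes, time of day) and statistical models rather than exact-match lookups, but the profile-then-flag structure is the same.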

  18. Security and health research databases: the stakeholders and questions to be addressed.

    PubMed

    Stewart, Sara

    2006-01-01

    Health research database security issues abound. Issues include subject confidentiality, data ownership, data integrity and data accessibility. There are also various stakeholders in database security. Each of these stakeholders has a different set of concerns and responsibilities when dealing with security issues. There is an obvious need for training in security issues, so that these issues may be addressed and health research will move on without added obstacles based on misunderstanding security methods and technologies.

  19. Secure searching of biomarkers through hybrid homomorphic encryption scheme.

    PubMed

    Kim, Miran; Song, Yongsoo; Cheon, Jung Hee

    2017-07-26

    As genome sequencing technology develops rapidly, there has lately been an increasing need to keep genomic data secure even when stored in the cloud and still used for research. We are interested in designing a protocol for the secure outsourced matching problem on encrypted data. We propose an efficient method to securely search for a matching position with the query data and extract some information at that position. After decryption, only a small number of comparisons with the query information need to be performed in the plaintext state. We apply this method to find a set of biomarkers in encrypted genomes. The important feature of our method is to encode a genomic database as a single element of a polynomial ring. Since our method requires a single homomorphic multiplication of the hybrid scheme for query computation, it has an advantage over previous methods in parameter size, computation complexity, and communication cost. In particular, the extraction procedure not only prevents leakage of database information that has not been queried by the user but also reduces the communication cost by half. We evaluate the performance of our method and verify that computation on large-scale personal data can be securely and practically outsourced to a cloud environment during data analysis. It takes about 3.9 s to search-and-extract the reference and alternate sequences at the queried position in a database of size 4M. Our solution for finding a set of biomarkers in DNA sequences shows that cryptographic techniques have progressed to the point where they can support real-world genome data analysis in a cloud environment.
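The encoding trick the abstract highlights, packing the whole database into one element of a polynomial ring so that a single ring multiplication answers a positional query, can be illustrated in plaintext. The homomorphic layer is omitted here, and the negacyclic ring Z[X]/(X^N + 1) is an assumption (the abstract does not name the ring); the sketch only shows why one multiplication suffices.

```python
def ring_mul(a, b, N):
    """Multiply two polynomials in Z[X]/(X^N + 1), where X^N = -1
    (negacyclic convolution), given as coefficient lists of length N."""
    out = [0] * N
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            k = i + j
            if k < N:
                out[k] += ai * bj
            else:                       # wrap-around picks up a sign flip
                out[k - N] -= ai * bj
    return out

N = 8
db = [3, 1, 4, 1, 5, 9, 2, 6]           # toy database, entry j in coeff of X^j
i = 5                                    # queried position
query = [0] * N
query[(N - i) % N] = 1                   # the monomial X^(N - i)
result = ring_mul(db, query, N)
print(result[0])   # -9: entry 5 rotated into the constant term, up to sign
```

Multiplying by the monomial X^(N-i) rotates entry i into the constant coefficient (with a sign flip from X^N = -1), so if both factors were encrypted under a homomorphic scheme, one ciphertext multiplication would position the queried entry for extraction.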

  20. WebBee: A Platform for Secure Coordination and Communication in Crisis Scenarios

    DTIC Science & Technology

    2008-04-16

    implemented through database triggers. The WebBee Database Server contains an Information Server, which is a Postgres database with the PostGIS [5] extension...sends it to the target user. The heavy lifting for this mechanism is done through an extension of Postgres triggers (Figures 6.1 and 6.2), resulting...in fewer queries and better performance. Trigger support in Postgres is table-based and comparatively primitive: with n table triggers, an update

  1. Wireless LAN security management with location detection capability in hospitals.

    PubMed

    Tanaka, K; Atarashi, H; Yamaguchi, I; Watanabe, H; Yamamoto, R; Ohe, K

    2012-01-01

    In medical institutions, unauthorized access points and terminals obstruct the stable operation of a large-scale wireless local area network (LAN) system. By establishing a real-time monitoring method to detect such unauthorized wireless devices, we can improve the efficiency of security management. We detected unauthorized wireless devices by using a centralized wireless LAN system and a location detection system at 370 access points at the University of Tokyo Hospital. By storing the detected radio signal strength and location information in a database, we evaluated the risk level from the detection history. We also evaluated the location detection performance in our hospital ward using Wi-Fi tags. The detection results confirmed the presence of radio waves originating outside the hospital and of waves emitted by portable game machines with wireless communication capability. The location detection performance showed an error margin of approximately 4 m in detection accuracy and a false detection rate of approximately 5%. Therefore, it was effective to consider the radio signal strength as both an index of likelihood at the detection location and an index of the level of risk. We determined the location of wireless devices with high accuracy by filtering the detection results on the basis of radio signal strength and detection history. Results of this study showed that it would be effective to use the developed location database containing radio signal strength and detection history for security management of wireless LAN systems and for more general-purpose location detection applications.
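Filtering detections by signal strength and detection history, as the authors describe, might look like the sketch below. The thresholds, data layout, and function names are all invented for illustration; the paper does not publish its filtering parameters.

```python
def classify_detections(history, rssi_threshold=-70, min_sightings=3):
    """Filter raw rogue-device detections: keep only devices seen
    repeatedly above a signal-strength threshold, which suppresses
    transient signals leaking in from outside the building."""
    risky = []
    for mac, sightings in history.items():
        strong = [(ap, rssi) for ap, rssi in sightings
                  if rssi >= rssi_threshold]
        if len(strong) >= min_sightings:
            # the strongest access point approximates the device location
            nearest_ap = max(strong, key=lambda s: s[1])[0]
            risky.append((mac, nearest_ap))
    return risky

history = {
    "aa:bb:cc:00:00:01": [("ap-3F-12", -55), ("ap-3F-13", -60),
                          ("ap-3F-12", -52)],
    "aa:bb:cc:00:00:02": [("ap-1F-01", -85)],   # weak: likely outside
}
print(classify_detections(history))
# [('aa:bb:cc:00:00:01', 'ap-3F-12')]
```

Using both a strength threshold and a repeat-sighting count mirrors the paper's dual use of RSSI: as a location-likelihood index (nearest strong AP) and as a risk index (persistently strong signals are likely inside the building).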

  2. Performing private database queries in a real-world environment using a quantum protocol.

    PubMed

    Chan, Philip; Lucio-Martinez, Itzel; Mo, Xiaofan; Simon, Christoph; Tittel, Wolfgang

    2014-06-10

    In the well-studied cryptographic primitive 1-out-of-N oblivious transfer, a user retrieves a single element from a database of size N without the database learning which element was retrieved. While it has previously been shown that a secure implementation of 1-out-of-N oblivious transfer is impossible against arbitrarily powerful adversaries, recent research has revealed an interesting class of private query protocols based on quantum mechanics in a cheat sensitive model. Specifically, a practical protocol does not need to guarantee that the database provider cannot learn what element was retrieved if doing so carries the risk of detection. The latter is sufficient motivation to keep a database provider honest. However, none of the previously proposed protocols could cope with noisy channels. Here we present a fault-tolerant private query protocol, in which the novel error correction procedure is integral to the security of the protocol. Furthermore, we present a proof-of-concept demonstration of the protocol over a deployed fibre.

  3. Performing private database queries in a real-world environment using a quantum protocol

    PubMed Central

    Chan, Philip; Lucio-Martinez, Itzel; Mo, Xiaofan; Simon, Christoph; Tittel, Wolfgang

    2014-01-01

    In the well-studied cryptographic primitive 1-out-of-N oblivious transfer, a user retrieves a single element from a database of size N without the database learning which element was retrieved. While it has previously been shown that a secure implementation of 1-out-of-N oblivious transfer is impossible against arbitrarily powerful adversaries, recent research has revealed an interesting class of private query protocols based on quantum mechanics in a cheat sensitive model. Specifically, a practical protocol does not need to guarantee that the database provider cannot learn what element was retrieved if doing so carries the risk of detection. The latter is sufficient motivation to keep a database provider honest. However, none of the previously proposed protocols could cope with noisy channels. Here we present a fault-tolerant private query protocol, in which the novel error correction procedure is integral to the security of the protocol. Furthermore, we present a proof-of-concept demonstration of the protocol over a deployed fibre. PMID:24913129

  4. Economic Analysis of Cyber Security

    DTIC Science & Technology

    2006-07-01

    vulnerability databases and track the number of incidents reported by U.S. organizations. Many of these are private organizations, such as the security...VULNERABILITY AND ATTACK ESTIMATES Numerous organizations compile vulnerability databases and patch information, and track the number of reported incidents... database / security focus Databases of vulnerabilities identifying the software versions that are susceptible, including information on the method of

  5. Fermilab Security Site Access Request Database

    Science.gov Websites

Fermilab Security Site Access Request Database Use of the online version of the Fermilab Security Site Access Request Database requires that you log in to the ESH&Q Web Site. Note: Only Fermilab generated from the ESH&Q Section's Oracle database on May 27, 2018 05:48 AM. If you have a question

  6. 77 FR 52372 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-29

    ... stolen securities. Reporting to the central database also allows reporting institutions to gain access to... proper performance of the functions of the agency, including whether the information shall have practical...

  7. Design of an autonomous exterior security robot

    NASA Technical Reports Server (NTRS)

    Myers, Scott D.

    1994-01-01

This paper discusses the requirements and preliminary design of a robotic vehicle for performing autonomous exterior perimeter security patrols around warehouse areas, ammunition supply depots, and industrial parks for the U.S. Department of Defense. The preliminary design allows for the operation of up to eight vehicles in a six kilometer by six kilometer zone with autonomous navigation and obstacle avoidance. In addition to detecting crawling intruders at 100 meters, the system must perform real-time inventory checking and database comparisons using a microwave tag system.

  8. The Protein Disease Database of human body fluids: II. Computer methods and data issues.

    PubMed

    Lemkin, P F; Orr, G A; Goldstein, M P; Creed, G J; Myrick, J E; Merril, C R

    1995-01-01

    The Protein Disease Database (PDD) is a relational database of proteins and diseases. With this database it is possible to screen for quantitative protein abnormalities associated with disease states. These quantitative relationships use data drawn from the peer-reviewed biomedical literature. Assays may also include those observed in high-resolution electrophoretic gels that offer the potential to quantitate many proteins in a single test as well as data gathered by enzymatic or immunologic assays. We are using the Internet World Wide Web (WWW) and the Web browser paradigm as an access method for wide distribution and querying of the Protein Disease Database. The WWW hypertext transfer protocol and its Common Gateway Interface make it possible to build powerful graphical user interfaces that can support easy-to-use data retrieval using query specification forms or images. The details of these interactions are totally transparent to the users of these forms. Using a client-server SQL relational database, user query access, initial data entry and database maintenance are all performed over the Internet with a Web browser. We discuss the underlying design issues, mapping mechanisms and assumptions that we used in constructing the system, data entry, access to the database server, security, and synthesis of derived two-dimensional gel image maps and hypertext documents resulting from SQL database searches.
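The query path the abstract describes, a web form driving SQL against a relational schema of proteins and diseases, can be sketched as follows. The table names, columns, and sample data are illustrative assumptions, not the actual PDD schema:

```python
import sqlite3

def build_demo_db() -> sqlite3.Connection:
    """Build a tiny in-memory stand-in for a protein-disease database."""
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE protein (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE disease (id INTEGER PRIMARY KEY, name TEXT);
        -- One quantitative literature finding per (protein, disease) pair;
        -- fold_change is the disease/normal abundance ratio.
        CREATE TABLE finding (
            protein_id INTEGER REFERENCES protein(id),
            disease_id INTEGER REFERENCES disease(id),
            fold_change REAL
        );
        INSERT INTO protein VALUES (1, 'transferrin'), (2, 'albumin');
        INSERT INTO disease VALUES (1, 'nephrotic syndrome');
        INSERT INTO finding VALUES (1, 1, 0.4), (2, 1, 0.3);
    """)
    return conn

def decreased_proteins(conn, disease: str, threshold: float = 0.5):
    """Screen for proteins reported below `threshold` of normal levels."""
    rows = conn.execute("""
        SELECT p.name FROM finding f
        JOIN protein p ON p.id = f.protein_id
        JOIN disease d ON d.id = f.disease_id
        WHERE d.name = ? AND f.fold_change < ?
        ORDER BY p.name
    """, (disease, threshold)).fetchall()
    return [name for (name,) in rows]

conn = build_demo_db()
print(decreased_proteins(conn, "nephrotic syndrome"))  # ['albumin', 'transferrin']
```

In the real system the query parameters would arrive through a CGI form rather than a function call, but the server-side join is of this shape.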

  9. Relativistic quantum private database queries

    NASA Astrophysics Data System (ADS)

    Sun, Si-Jia; Yang, Yu-Guang; Zhang, Ming-Ou

    2015-04-01

    Recently, Jakobi et al. (Phys Rev A 83, 022301, 2011) suggested the first practical private database query protocol (J-protocol) based on the Scarani et al. (Phys Rev Lett 92, 057901, 2004) quantum key distribution protocol. Unfortunately, the J-protocol is just a cheat-sensitive private database query protocol. In this paper, we present an idealized relativistic quantum private database query protocol based on Minkowski causality and the properties of quantum information. Also, we prove that the protocol is secure in terms of the user security and the database security.

  10. Privacy Preserving Facial and Fingerprint Multi-biometric Authentication

    NASA Astrophysics Data System (ADS)

    Anzaku, Esla Timothy; Sohn, Hosik; Ro, Yong Man

The cases of identity theft can be mitigated by the adoption of secure authentication methods. Biohashing and its variants, which utilize secret keys and biometrics, are promising methods for secure authentication; their shortcoming, however, is degraded performance under the assumption that the secret keys are compromised. In this paper, we extend the concept of Biohashing to multi-biometrics: facial and fingerprint traits. We chose these traits because they are widely used; however, little research attention has been given to designing privacy-preserving multi-biometric systems using them. Instead of just using a single modality (facial or fingerprint), we present a framework for using both modalities. The improved performance of the proposed method, using face and fingerprint together rather than either trait in isolation, is evaluated using two chimerical bimodal databases formed from publicly available facial and fingerprint databases.

  11. 76 FR 28795 - Privacy Act of 1974; Department of Homeland Security United States Coast Guard-024 Auxiliary...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-18

    ... 1974; Department of Homeland Security United States Coast Guard-024 Auxiliary Database System of... Security/United States Coast Guard-024 Auxiliary Database (AUXDATA) System of Records.'' This system of... titled, ``DHS/USCG-024 Auxiliary Database (AUXDATA) System of Records.'' The AUXDATA system is the USCG's...

  12. Database Security: What Students Need to Know

    ERIC Educational Resources Information Center

    Murray, Meg Coffin

    2010-01-01

    Database security is a growing concern evidenced by an increase in the number of reported incidents of loss of or unauthorized exposure to sensitive data. As the amount of data collected, retained and shared electronically expands, so does the need to understand database security. The Defense Information Systems Agency of the US Department of…

  13. An Autonomic Framework for Integrating Security and Quality of Service Support in Databases

    ERIC Educational Resources Information Center

    Alomari, Firas

    2013-01-01

    The back-end databases of multi-tiered applications are a major data security concern for enterprises. The abundance of these systems and the emergence of new and different threats require multiple and overlapping security mechanisms. Therefore, providing multiple and diverse database intrusion detection and prevention systems (IDPS) is a critical…

  14. The OAuth 2.0 Web Authorization Protocol for the Internet Addiction Bioinformatics (IABio) Database.

    PubMed

    Choi, Jeongseok; Kim, Jaekwon; Lee, Dong Kyun; Jang, Kwang Soo; Kim, Dai-Jin; Choi, In Young

    2016-03-01

    Internet addiction (IA) has become a widespread and problematic phenomenon as smart devices pervade society. Moreover, internet gaming disorder leads to increases in social expenditures for both individuals and nations alike. Although the prevention and treatment of IA are getting more important, the diagnosis of IA remains problematic. Understanding the neurobiological mechanism of behavioral addictions is essential for the development of specific and effective treatments. Although there are many databases related to other addictions, a database for IA has not been developed yet. In addition, bioinformatics databases, especially genetic databases, require a high level of security and should be designed based on medical information standards. In this respect, our study proposes the OAuth standard protocol for database access authorization. The proposed IA Bioinformatics (IABio) database system is based on internet user authentication, which is a guideline for medical information standards, and uses OAuth 2.0 for access control technology. This study designed and developed the system requirements and configuration. The OAuth 2.0 protocol is expected to establish the security of personal medical information and be applied to genomic research on IA.

  15. Secure and robust cloud computing for high-throughput forensic microsatellite sequence analysis and databasing.

    PubMed

    Bailey, Sarah F; Scheible, Melissa K; Williams, Christopher; Silva, Deborah S B S; Hoggan, Marina; Eichman, Christopher; Faith, Seth A

    2017-11-01

    Next-generation Sequencing (NGS) is a rapidly evolving technology with demonstrated benefits for forensic genetic applications, and the strategies to analyze and manage the massive NGS datasets are currently in development. Here, the computing, data storage, connectivity, and security resources of the Cloud were evaluated as a model for forensic laboratory systems that produce NGS data. A complete front-to-end Cloud system was developed to upload, process, and interpret raw NGS data using a web browser dashboard. The system was extensible, demonstrating analysis capabilities of autosomal and Y-STRs from a variety of NGS instrumentation (Illumina MiniSeq and MiSeq, and Oxford Nanopore MinION). NGS data for STRs were concordant with standard reference materials previously characterized with capillary electrophoresis and Sanger sequencing. The computing power of the Cloud was implemented with on-demand auto-scaling to allow multiple file analysis in tandem. The system was designed to store resulting data in a relational database, amenable to downstream sample interpretations and databasing applications following the most recent guidelines in nomenclature for sequenced alleles. Lastly, a multi-layered Cloud security architecture was tested and showed that industry standards for securing data and computing resources were readily applied to the NGS system without disadvantageous effects for bioinformatic analysis, connectivity or data storage/retrieval. The results of this study demonstrate the feasibility of using Cloud-based systems for secured NGS data analysis, storage, databasing, and multi-user distributed connectivity. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. LIS–lnterlink—connecting laboratory information systems to remote primary health–care centres via the Internet

    PubMed Central

    Clark, Barry; Wachowiak, Bartosz; Crawford, Ewan W.; Jakubowski, Zenon; Kabata, Janusz

    1998-01-01

A pilot study was performed to evaluate the feasibility of using the Internet to securely deliver patient laboratory results, and the system has subsequently gone into routine use in Poland. The system went from design to pilot and then to live implementation within a four-month period, resulting in the LIS-Interlink software product. Test results are retrieved at regular intervals from the BioLinkTM LIS (Laboratory Information System), encrypted and transferred to a secure area on the Web server. The primary health-care centres dial into the Internet using a local-cell service provided by Polish Telecom (TP), obtain a TCP/IP address using the TP DHCP server, and perform HTTP ‘get’ and ‘post’ operations to obtain the files by secure handshaking. The data are then automatically inserted into a local SQL database (with optional printing of incoming reports) for cumulative reporting and searching functions. The local database is fully multi-user and can be accessed from different clinics within the centres by a variety of networking protocols. PMID:18924820
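The retrieve-encrypt-transfer-decrypt-insert pipeline described above can be sketched as follows. The abstract does not name the cipher, so the XOR keystream below is a deliberately toy stand-in, and the result-file format and table layout are assumptions:

```python
import hashlib, sqlite3

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR against a SHA-256 counter keystream.
    Illustration only; a real deployment would use a vetted algorithm."""
    stream = bytearray()
    block = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + block.to_bytes(8, "big")).digest()
        block += 1
    return bytes(b ^ s for b, s in zip(data, stream))

def import_results(db: sqlite3.Connection, key: bytes, ciphertext: bytes):
    """Clinic side: decrypt a transferred result file and insert each
    CSV-style line into the local SQL database."""
    for line in keystream_xor(key, ciphertext).decode().splitlines():
        patient_id, test_name, value = line.split(",")
        db.execute("INSERT INTO result VALUES (?, ?, ?)",
                   (patient_id, test_name, float(value)))

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE result (patient_id TEXT, test TEXT, value REAL)")
key = b"shared-session-key"                                   # hypothetical
wire = keystream_xor(key, b"P001,glucose,5.4\nP001,urea,4.2")  # "LIS side"
import_results(db, key, wire)                                 # "clinic side"
print(db.execute("SELECT COUNT(*) FROM result").fetchone()[0])  # 2
```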

  17. LIS-lnterlink-connecting laboratory information systems to remote primary health-care centres via the Internet.

    PubMed

    Clark, B; Wachowiak, B; Crawford, E W; Jakubowski, Z; Kabata, J

    1998-01-01

A pilot study was performed to evaluate the feasibility of using the Internet to securely deliver patient laboratory results, and the system has subsequently gone into routine use in Poland. The system went from design to pilot and then to live implementation within a four-month period, resulting in the LIS-Interlink software product. Test results are retrieved at regular intervals from the BioLink(TM) LIS (Laboratory Information System), encrypted and transferred to a secure area on the Web server. The primary health-care centres dial into the Internet using a local-cell service provided by Polish Telecom (TP), obtain a TCP/IP address using the TP DHCP server, and perform HTTP 'get' and 'post' operations to obtain the files by secure handshaking. The data are then automatically inserted into a local SQL database (with optional printing of incoming reports) for cumulative reporting and searching functions. The local database is fully multi-user and can be accessed from different clinics within the centres by a variety of networking protocols.

  18. 49 CFR 1570.13 - False statements regarding security background checks by public transportation agency or railroad...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., national security, or of terrorism: (i) Relevant criminal history databases; (ii) In the case of an alien... databases to determine the status of the alien under the immigration laws of the United States; and (iii) Other relevant information or databases, as determined by the Secretary of Homeland Security. (c...

  19. 49 CFR 1570.13 - False statements regarding security background checks by public transportation agency or railroad...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., national security, or of terrorism: (i) Relevant criminal history databases; (ii) In the case of an alien... databases to determine the status of the alien under the immigration laws of the United States; and (iii) Other relevant information or databases, as determined by the Secretary of Homeland Security. (c...

  20. 49 CFR 1570.13 - False statements regarding security background checks by public transportation agency or railroad...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., national security, or of terrorism: (i) Relevant criminal history databases; (ii) In the case of an alien... databases to determine the status of the alien under the immigration laws of the United States; and (iii) Other relevant information or databases, as determined by the Secretary of Homeland Security. (c...

  1. 49 CFR 1570.13 - False statements regarding security background checks by public transportation agency or railroad...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ..., national security, or of terrorism: (i) Relevant criminal history databases; (ii) In the case of an alien... databases to determine the status of the alien under the immigration laws of the United States; and (iii) Other relevant information or databases, as determined by the Secretary of Homeland Security. (c...

  2. Research Directions in Database Security IV

    DTIC Science & Technology

    1993-07-01

second algorithm, which is based on multiversion timestamp ordering, is that high level transactions can be forced to read arbitrarily old data values...system. The first, the single version model, stores only the latest version of each data item, while the second, the multiversion model, stores... Multiversion Database Model In the standard database model, where there is only one version of each data item, all transactions compete for the most recent

  3. Security Management in a Multimedia System

    ERIC Educational Resources Information Center

    Rednic, Emanuil; Toma, Andrei

    2009-01-01

    In database security, the issue of providing a level of security for multimedia information is getting more and more known. For the moment the security of multimedia information is done through the security of the database itself, in the same way, for all classic and multimedia records. So what is the reason for the creation of a security…

  4. THRIVE: threshold homomorphic encryption based secure and privacy preserving biometric verification system

    NASA Astrophysics Data System (ADS)

    Karabat, Cagatay; Kiraz, Mehmet Sabir; Erdogan, Hakan; Savas, Erkay

    2015-12-01

In this paper, we introduce a new biometric verification and template protection system which we call THRIVE. The system includes novel enrollment and authentication protocols based on threshold homomorphic encryption where a private key is shared between a user and a verifier. In the THRIVE system, only encrypted binary biometric templates are stored in a database and verification is performed via homomorphically randomized templates; thus, original templates are never revealed during authentication. Due to the underlying threshold homomorphic encryption scheme, a malicious database owner cannot perform full decryption on encrypted templates of the users in the database. In addition, security of the THRIVE system is enhanced using a two-factor authentication scheme involving the user's private key and biometric data. Using simulation-based techniques, the proposed system is proven secure in the malicious model. The proposed system is suitable for applications where the user does not want to reveal her biometrics to the verifier in plain form, but needs to prove her identity by using biometrics. The system can be used with any biometric modality where a feature extraction method yields a fixed-size binary template and a query template is verified when its Hamming distance to the database template is less than a threshold. The overall connection time for the proposed THRIVE system is estimated to be 336 ms on average for 256-bit biometric templates on a desktop PC running with quad-core 3.2 GHz CPUs at 10 Mbit/s up/down link connection speed. Consequently, the proposed system can be efficiently used in real-life applications.
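The plaintext form of the decision rule above, accepting a query when its Hamming distance to the enrolled template is below a threshold, can be sketched as follows. In THRIVE itself this comparison happens under threshold homomorphic encryption so neither party sees the other's template; that machinery is not reproduced here, and the sample templates and threshold are illustrative:

```python
def hamming_distance(a: bytes, b: bytes) -> int:
    """Count differing bits between two equal-length binary templates."""
    if len(a) != len(b):
        raise ValueError("templates must be the same length")
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

def verify(query: bytes, enrolled: bytes, threshold: int) -> bool:
    """Accept the query template if it is within `threshold` bits."""
    return hamming_distance(query, enrolled) <= threshold

enrolled = bytes(32)                      # 256-bit template, all zero bits
near = bytes([0b00000111]) + bytes(31)    # same template with 3 bits flipped
far = bytes([0xFF] * 32)                  # all 256 bits flipped
print(verify(near, enrolled, 10), verify(far, enrolled, 10))  # True False
```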

  5. When in doubt, seize the day? Security values, prosocial values, and proactivity under ambiguity.

    PubMed

    Grant, Adam M; Rothbard, Nancy P

    2013-09-01

    Researchers have suggested that both ambiguity and values play important roles in shaping employees' proactive behaviors, but have not theoretically or empirically integrated these factors. Drawing on theories of situational strength and values, we propose that ambiguity constitutes a weak situation that strengthens the relationship between the content of employees' values and their proactivity. A field study of 204 employees and their direct supervisors in a water treatment plant provided support for this contingency perspective. Ambiguity moderated the relationship between employees' security and prosocial values and supervisor ratings of proactivity. Under high ambiguity, security values predicted lower proactivity, whereas prosocial values predicted higher proactivity. Under low ambiguity, values were not associated with proactivity. We replicated these findings in a laboratory experiment with 232 participants in which we measured proactivity objectively as initiative taken to correct errors: Participants with strong security values were less proactive, and participants with strong prosocial values were more proactive, but only when performance expectations were ambiguous. We discuss theoretical implications for research on proactivity, values, and ambiguity and uncertainty. PsycINFO Database Record (c) 2013 APA, all rights reserved

  6. Automatic recognition of emotions from facial expressions

    NASA Astrophysics Data System (ADS)

    Xue, Henry; Gertner, Izidor

    2014-06-01

    In the human-computer interaction (HCI) process it is desirable to have an artificial intelligent (AI) system that can identify and categorize human emotions from facial expressions. Such systems can be used in security, in entertainment industries, and also to study visual perception, social interactions and disorders (e.g. schizophrenia and autism). In this work we survey and compare the performance of different feature extraction algorithms and classification schemes. We introduce a faster feature extraction method that resizes and applies a set of filters to the data images without sacrificing the accuracy. In addition, we have enhanced SVM to multiple dimensions while retaining the high accuracy rate of SVM. The algorithms were tested using the Japanese Female Facial Expression (JAFFE) Database and the Database of Faces (AT&T Faces).

  7. Network Security Visualization

    DTIC Science & Technology

    1999-09-27

    performing SQL generation and result-set binding, inserting acquired security events into the database and gathering the requested data for Console scene...objects is also auto-generated by a VBA script. Built into the auto-generated table access objects are the preferred join paths between tables. This...much of the Server itself) never have to deal with SQL directly. This is one aspect of laying the groundwork for supporting RDBMSs from multiple vendors

  8. Geothopica and the interactive analysis and visualization of the updated Italian National Geothermal Database

    NASA Astrophysics Data System (ADS)

    Trumpy, Eugenio; Manzella, Adele

    2017-02-01

The Italian National Geothermal Database (BDNG) is the largest collection of Italian Geothermal data and was set up in the 1980s. It has since been updated both in terms of content and management tools: information on deep wells and thermal springs (with temperature > 30 °C) are currently organized and stored in a PostgreSQL relational database management system, which guarantees high performance, data security and easy access through different client applications. The BDNG is the core of the Geothopica web site, whose webGIS tool allows different types of user to access geothermal data, to visualize multiple types of datasets, and to perform integrated analyses. The webGIS tool has been recently improved by two specially designed, programmed and implemented visualization tools to display data on well lithology and underground temperatures. This paper describes the contents of the database and its software and data update, as well as the webGIS tool including the new tools for data lithology and temperature visualization. The geoinformation organized in the database and accessible through Geothopica is of use not only for geothermal purposes, but also for any kind of georesource and CO2 storage project requiring the organization of, and access to, deep underground data. Geothopica also supports project developers, researchers, and decision makers in the assessment, management and sustainable deployment of georesources.

  9. An end-to-end secure patient information access card system.

    PubMed

    Alkhateeb, A; Singer, H; Yakami, M; Takahashi, T

    2000-03-01

The rapid development of the Internet and the increasing interest in Internet-based solutions has promoted the idea of creating Internet-based health information applications. This will force a change in the role of IC cards in healthcare card systems from a data carrier to an access-key medium. At the Medical Informatics Department of Kyoto University Hospital we are developing a smart card patient information project where patient databases are accessed via the Internet. Strong end-to-end data encryption is performed transparently via Secure Socket Layers to transmit patient information. The smart card plays the crucial role of access key to the database: user authentication is performed internally without ever revealing the actual key. For easy acceptance by healthcare professionals, the user interface is integrated as a plug-in for two familiar Web browsers, Netscape Navigator and MS Internet Explorer.
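The idea that authentication is performed internally without ever revealing the key can be illustrated with a generic challenge-response sketch: the card signs a fresh random challenge with its embedded key, and the server checks the response against its own copy. This is an assumed illustration of the principle, not the Kyoto system's actual protocol, and the key material is hypothetical:

```python
import hashlib, hmac, os

class SmartCard:
    """Toy model of a smart card: the key is embedded at issuance and
    never exported; the card only answers signing requests."""
    def __init__(self, key: bytes):
        self._key = key                       # never leaves the card

    def respond(self, challenge: bytes) -> bytes:
        return hmac.new(self._key, challenge, hashlib.sha256).digest()

def authenticate(card: SmartCard, server_copy_of_key: bytes) -> bool:
    """Server side: issue a fresh challenge (prevents replay) and compare
    the card's response with the locally computed MAC."""
    challenge = os.urandom(16)
    expected = hmac.new(server_copy_of_key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(card.respond(challenge), expected)

print(authenticate(SmartCard(b"patient-key"), b"patient-key"))  # True
print(authenticate(SmartCard(b"wrong-key"), b"patient-key"))    # False
```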

  10. Lessons Learned Implementing DOORS in a Citrix Environment

    NASA Technical Reports Server (NTRS)

    Bussman, Marie

    2005-01-01

NASA's James Webb Space Telescope (JWST) Project is a large multi-national project with geographically dispersed contractors that all need access to the Project's requirements database. Initially, the project utilized multiple DOORS databases with the built-in partitions feature to exchange modules among the various contractor sites. As the requirements databases matured, the use of partitions became extremely difficult. There have been many issues such as incompatible versions of DOORS, inefficient mechanisms for sharing modules, security concerns, performance issues, and inconsistent document import and export formats. Deployment of the client software with limited IT resources available was also an issue. The solution chosen by JWST was to integrate the use of a Citrix environment with the DOORS database to address most of the project's concerns. The use of the Citrix solution allowed a single requirements database in a secure environment via a web interface. The Citrix environment allows JWST to upgrade to the most current version of DOORS without having to coordinate multiple sites and user upgrades. The single requirements database eliminates a multitude of Configuration Management concerns and facilitated the standardization of documentation formats. This paper discusses the obstacles and the lessons learned throughout the installation, implementation, usage and deployment process of a centralized DOORS database solution.

  11. Turning Access into a web-enabled secure information system for clinical trials.

    PubMed

    Dongquan Chen; Chen, Wei-Bang; Soong, Mayhue; Soong, Seng-Jaw; Orthner, Helmuth F

    2009-08-01

Organizations that have limited resources need to conduct clinical studies in a cost-effective but secure way. Clinical data residing in various individual databases need to be easily accessed and secured. Although widely available, digital certification, encryption, and secure web servers have not been implemented as widely, partly due to a lack of understanding of needs and concerns over issues such as cost and difficulty in implementation. The objective of this study was to test the possibility of centralizing various databases and to demonstrate ways of offering an alternative to a large-scale comprehensive and costly commercial product, especially for simple phase I and II trials, with reasonable convenience and security. We report a working procedure to transform and develop a standalone Access database into a secure Web-based information system. For data collection and reporting purposes, we centralized several individual databases, and developed and tested a web-based secure server using self-issued digital certificates. The system lacks audit trails. The cost of development and maintenance may hinder its wide application. The clinical trial databases scattered in various departments of an institution could be centralized into a web-enabled secure information system. Limitations such as the lack of a calendar and audit trail can be partially addressed with additional programming. The centralized Web system may provide an alternative to a comprehensive clinical trial management system.

  12. Report: EPA Needs to Strengthen Financial Database Security Oversight and Monitor Compliance

    EPA Pesticide Factsheets

    Report #2007-P-00017, March 29, 2007. Weaknesses in how EPA offices monitor databases for known security vulnerabilities, communicate the status of critical system patches, and monitor the access to database administrator accounts and privileges.

  13. Database Systems and Oracle: Experiences and Lessons Learned

    ERIC Educational Resources Information Center

    Dunn, Deborah

    2005-01-01

    In a tight job market, IT professionals with database experience are likely to be in great demand. Companies need database personnel who can help improve access to and security of data. The events of September 11 have increased business' awareness of the need for database security, backup, and recovery procedures. It is our responsibility to…

  14. Homeland Security 2002: Evolving the Homeland Defense Infrastructure. Executive Summary Report (Conference Proceedings June 25 - 26, 2002) Volume 1, No. 2)

    DTIC Science & Technology

    2002-09-01

initiatives. The federal government has 55 databases that deal with security threats, but inter-agency access depends on establishing agreements through...which that information can be shared. True cooperation also will require government-wide commitment to enterprise architecture, integrated

  15. A sensor monitoring system for telemedicine, safety and security applications

    NASA Astrophysics Data System (ADS)

    Vlissidis, Nikolaos; Leonidas, Filippos; Giovanis, Christos; Marinos, Dimitrios; Aidinis, Konstantinos; Vassilopoulos, Christos; Pagiatakis, Gerasimos; Schmitt, Nikolaus; Pistner, Thomas; Klaue, Jirka

    2017-02-01

A sensor system capable of medical, safety and security monitoring in avionic and other environments (e.g. homes) is examined. For application inside an aircraft cabin, the system relies on an optical cellular network that connects each seat to a server and uses a set of database applications to process data related to passengers' health, safety and security status. Health monitoring typically encompasses electrocardiogram, pulse oximetry and blood pressure, body temperature and respiration rate while safety and security monitoring is related to the standard flight attendance duties, such as cabin preparation for take-off, landing, flight in regions of turbulence, etc. In contrast to previous related works, this article focuses on the system's modules (medical and safety sensors and associated hardware), the database applications used for the overall control of the monitoring function and the potential use of the system for security applications. Further tests involving medical, safety and security sensing performed in a real A340 mock-up set-up are also described and reference is made to the possible use of the sensing system in alternative environments and applications, such as health monitoring within other means of transport (e.g. trains or small passenger sea vessels) as well as for remotely located home users, over a wired Ethernet network or the Internet.

  16. Security Controls in the Stockpoint Logistics Integrated Communications Environment (SPLICE).

    DTIC Science & Technology

    1985-03-01

call programs as authorized after checks by the Terminal Management Subsystem on SAS databases. SAS overlays the TANDEM GUARDIAN operating system to...Security Access Profile database (SAP) and a query capability generating various security reports. SAS operates with the System Monitor (SMON) subsystem...system to DDN and other components. The first SAS component to be reviewed is the SAP database. SAP is organized into two types of files. Relational

  17. A Database as a Service for the Healthcare System to Store Physiological Signal Data.

    PubMed

    Chang, Hsien-Tsung; Lin, Tsai-Huei

    2016-01-01

Wearable devices that measure physiological signals to help develop self-health management habits have become increasingly popular in recent years. These records are conducive for follow-up health and medical care. In this study, based on the characteristics of the observed physiological signal records: 1) a large number of users, 2) a large amount of data, 3) low information variability, 4) data privacy authorization, and 5) data access by designated users, we wish to resolve physiological signal record-relevant issues utilizing the advantages of the Database as a Service (DaaS) model. Storing a large amount of data using file patterns can reduce database load, allowing users to access data efficiently; the privacy control settings allow users to store data securely. The results of the experiment show that the proposed system has better database access performance than a traditional relational database, with a small difference in database volume, thus proving that the proposed system can improve data storage performance.
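The file-pattern storage idea above can be sketched as follows: bulk signal samples go to flat files to keep database load low, while the database stores only metadata, the file path, and a per-record access list for the privacy controls. The schema and names here are assumptions for illustration, not the authors' design:

```python
import json, os, sqlite3, tempfile

class SignalStore:
    """Sketch of DaaS-style storage: heavy payloads in files, metadata
    and access control in a small relational table."""
    def __init__(self, root: str):
        self.root = root
        self.db = sqlite3.connect(":memory:")
        self.db.execute("""CREATE TABLE record (
            id INTEGER PRIMARY KEY, owner TEXT, kind TEXT,
            path TEXT, readers TEXT)""")

    def put(self, owner: str, kind: str, samples, readers=()) -> int:
        cur = self.db.execute(
            "INSERT INTO record (owner, kind, readers) VALUES (?, ?, ?)",
            (owner, kind, json.dumps(list(readers))))
        path = os.path.join(self.root, f"{cur.lastrowid}.json")
        with open(path, "w") as f:
            json.dump(samples, f)       # heavy payload stays out of the DB
        self.db.execute("UPDATE record SET path = ? WHERE id = ?",
                        (path, cur.lastrowid))
        return cur.lastrowid

    def get(self, record_id: int, requester: str):
        owner, path, readers = self.db.execute(
            "SELECT owner, path, readers FROM record WHERE id = ?",
            (record_id,)).fetchone()
        if requester != owner and requester not in json.loads(readers):
            raise PermissionError("requester not on the record's access list")
        with open(path) as f:
            return json.load(f)

store = SignalStore(tempfile.mkdtemp())
rid = store.put("alice", "heart_rate", [72, 74, 71], readers=["dr_kim"])
print(store.get(rid, "dr_kim"))   # [72, 74, 71]
```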

  18. A Database as a Service for the Healthcare System to Store Physiological Signal Data

    PubMed Central

    Lin, Tsai-Huei

    2016-01-01

    Wearable devices that measure physiological signals to help develop self-health management habits have become increasingly popular in recent years. These records are conducive to follow-up health and medical care. In this study, based on the characteristics of the observed physiological signal records, namely (1) a large number of users, (2) a large amount of data, (3) low information variability, (4) data privacy authorization, and (5) data access by designated users, we wish to resolve physiological signal record-relevant issues utilizing the advantages of the Database as a Service (DaaS) model. Storing a large amount of data using file patterns can reduce database load, allowing users to access data efficiently; the privacy control settings allow users to store data securely. The results of the experiment show that the proposed system has better database access performance than a traditional relational database, with a small difference in database volume, thus proving that the proposed system can improve data storage performance. PMID:28033415

  19. Formulating a strategy for securing high-speed rail in the United States.

    DOT National Transportation Integrated Search

    2013-03-01

    This report presents an analysis of information relating to attacks, attempted attacks, and plots against high-speed rail (HSR) systems. It draws upon empirical data from MTI's Database of Terrorist and Serious Criminal Attacks Against Public Sur...

  20. 77 FR 66880 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-07

    ... the database that stores information for the Lost and Stolen Securities Program. We estimate that 26... Lost and Stolen Securities Program database will be kept confidential. The Commission may not conduct... SECURITIES AND EXCHANGE COMMISSION Submission for OMB Review; Comment Request Upon Written Request...

  1. Practical Quantum Private Database Queries Based on Passive Round-Robin Differential Phase-shift Quantum Key Distribution.

    PubMed

    Li, Jian; Yang, Yu-Guang; Chen, Xiu-Bo; Zhou, Yi-Hua; Shi, Wei-Min

    2016-08-19

    A novel quantum private database query protocol is proposed, based on passive round-robin differential phase-shift quantum key distribution. Compared with previous quantum private database query protocols, the present protocol has the following unique merits: (i) the user Alice can obtain one and only one key bit, so that both the efficiency and the security of the present protocol can be ensured, and (ii) it does not require changing the length difference of the two arms in a Mach-Zehnder interferometer and simply chooses two pulses passively for interference, making it much simpler and more practical. The present protocol is also proved to be secure in terms of both user security and database security.

  2. New Resources for Computer-Aided Legal Research: An Assessment of the Usefulness of the DIALOG System in Securities Regulation Studies.

    ERIC Educational Resources Information Center

    Gruner, Richard; Heron, Carol E.

    1984-01-01

    Examines usefulness of DIALOG as legal research tool through use of DIALOG's DIALINDEX database to identify those databases among almost 200 available that contain large numbers of records related to federal securities regulation. Eight databases selected for further study are detailed. Twenty-six footnotes, database statistics, and samples are…

  3. Database security and encryption technology research and application

    NASA Astrophysics Data System (ADS)

    Zhu, Li-juan

    2013-03-01

    The main purpose of this paper is to discuss the current problem of database information leakage and the important role that message encryption techniques play in database security, as well as the principle of the MD5 encryption algorithm and its use in websites and applications. The article is divided into an introduction, an overview of MD5 encryption technology, the use of MD5 encryption technology, and a final summary. With respect to requirements and applications, the paper gives readers a more detailed and clearer understanding of the principle of MD5 encryption technology, its importance in database security, and its use.
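
The role MD5 plays as a one-way digest can be illustrated with a short sketch. This is a generic illustration using Python's standard hashlib, not code from the paper; note that MD5 is now considered cryptographically broken (collisions are practical), so modern systems should prefer SHA-256 or a dedicated password hash.

```python
import hashlib

def md5_digest(value: str) -> str:
    """Return the 32-character hexadecimal MD5 digest of a string.

    MD5 appears here only to illustrate the one-way digest idea the
    paper discusses; it is NOT recommended for new security designs.
    """
    return hashlib.md5(value.encode("utf-8")).hexdigest()

# Store a digest instead of the plaintext value; verify by re-hashing.
stored = md5_digest("s3cret-field-value")
assert md5_digest("s3cret-field-value") == stored   # deterministic
assert md5_digest("other-value") != stored          # differs for other input
print(len(stored))  # 32 hex characters
```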

  4. 75 FR 43208 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-23

    ... securities to the database, (b) to confirm inquiry of the database, and (c) to demonstrate compliance with... SECURITIES AND EXCHANGE COMMISSION [Rule 17f-1(g); SEC File No. 270-30; OMB Control No. 3235-0290] Proposed Collection; Comment Request Upon Written Request, Copies Available From: Securities and Exchange...

  5. A Novel ECG Data Compression Method Using Adaptive Fourier Decomposition With Security Guarantee in e-Health Applications.

    PubMed

    Ma, JiaLi; Zhang, TanTan; Dong, MingChui

    2015-05-01

    This paper presents a novel electrocardiogram (ECG) compression method for e-health applications by adapting an adaptive Fourier decomposition (AFD) algorithm hybridized with a symbol substitution (SS) technique. The compression consists of two stages: first stage AFD executes efficient lossy compression with high fidelity; second stage SS performs lossless compression enhancement and built-in data encryption, which is pivotal for e-health. Validated with 48 ECG records from MIT-BIH arrhythmia benchmark database, the proposed method achieves averaged compression ratio (CR) of 17.6-44.5 and percentage root mean square difference (PRD) of 0.8-2.0% with a highly linear and robust PRD-CR relationship, pushing forward the compression performance to an unexploited region. As such, this paper provides an attractive candidate of ECG compression method for pervasive e-health applications.
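
The two figures of merit quoted above, compression ratio (CR) and percentage root mean square difference (PRD), can be made concrete. The sketch below uses one common definition of PRD (no mean subtraction) on a toy signal; it illustrates the metrics only, not the AFD/SS compression method itself.

```python
import math

def compression_ratio(original_bits: int, compressed_bits: int) -> float:
    """CR = size of the original record / size of the compressed record."""
    return original_bits / compressed_bits

def prd_percent(original, reconstructed) -> float:
    """Percentage root mean square difference between a signal and its
    lossy reconstruction: PRD = 100 * sqrt(sum((x - y)^2) / sum(x^2))."""
    num = sum((x - y) ** 2 for x, y in zip(original, reconstructed))
    den = sum(x ** 2 for x in original)
    return 100.0 * math.sqrt(num / den)

x  = [1.0, 2.0, 2.0, 1.0]   # toy "ECG" samples
xr = [1.0, 2.0, 2.0, 0.0]   # lossy reconstruction
print(round(prd_percent(x, xr), 2))   # 31.62
print(compression_ratio(4400, 200))   # 22.0
```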

  6. Computer Security Products Technology Overview

    DTIC Science & Technology

    1988-10-01

    ...this paper addresses fall into the areas of multi-user hosts, database management systems (DBMS), workstations, networks, guards and gateways, and...provide a portion of that protection, for example, a password scheme, a file protection mechanism, a secure database management system, or even a

  7. Molecule database framework: a framework for creating database applications with chemical structure search capability

    PubMed Central

    2013-01-01

    Background Research in organic chemistry generates samples of novel chemicals together with their properties and other related data. The involved scientists must be able to store this data and search it by chemical structure. There are commercial solutions for common needs like chemical registration systems or electronic lab notebooks. However for specific requirements of in-house databases and processes no such solutions exist. Another issue is that commercial solutions have the risk of vendor lock-in and may require an expensive license of a proprietary relational database management system. To speed up and simplify the development for applications that require chemical structure search capabilities, I have developed Molecule Database Framework. The framework abstracts the storing and searching of chemical structures into method calls. Therefore software developers do not require extensive knowledge about chemistry and the underlying database cartridge. This decreases application development time. Results Molecule Database Framework is written in Java and I created it by integrating existing free and open-source tools and frameworks. The core functionality includes: • Support for multi-component compounds (mixtures) • Import and export of SD-files • Optional security (authorization) For chemical structure searching Molecule Database Framework leverages the capabilities of the Bingo Cartridge for PostgreSQL and provides type-safe searching, caching, transactions and optional method level security. Molecule Database Framework supports multi-component chemical compounds (mixtures). Furthermore the design of entity classes and the reasoning behind it are explained. By means of a simple web application I describe how the framework could be used. I then benchmarked this example application to create some basic performance expectations for chemical structure searches and import and export of SD-files. 
Conclusions By using a simple web application it was shown that Molecule Database Framework successfully abstracts chemical structure searches and SD-file import and export to simple method calls. The framework offers good search performance on a standard laptop without any database tuning. This is also due to the fact that chemical structure searches are paged and cached. Molecule Database Framework is available for download on the project's web page on Bitbucket: https://bitbucket.org/kienerj/moleculedatabaseframework. PMID:24325762
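
The framework's central idea, hiding the database cartridge behind plain method calls, can be sketched as follows. This is a hypothetical in-memory Python stand-in: the real framework is Java on PostgreSQL with the Bingo cartridge and supports true substructure search, whereas this stand-in only supports exact-structure lookup.

```python
class MoleculeRepository:
    """Hypothetical in-memory stand-in for the framework's repository idea:
    callers use plain method calls and never touch the database cartridge.
    (Real substructure search requires a chemistry cartridge; this sketch
    only does exact SMILES lookup.)"""

    def __init__(self):
        self._by_id = {}       # compound id -> SMILES
        self._by_smiles = {}   # SMILES -> set of compound ids

    def save(self, compound_id: str, smiles: str) -> None:
        self._by_id[compound_id] = smiles
        self._by_smiles.setdefault(smiles, set()).add(compound_id)

    def find_by_exact_structure(self, smiles: str) -> set:
        return set(self._by_smiles.get(smiles, set()))

repo = MoleculeRepository()
repo.save("CHEM-1", "CCO")        # ethanol
repo.save("CHEM-2", "c1ccccc1")   # benzene
print(repo.find_by_exact_structure("CCO"))  # {'CHEM-1'}
```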

  8. Molecule database framework: a framework for creating database applications with chemical structure search capability.

    PubMed

    Kiener, Joos

    2013-12-11

    Research in organic chemistry generates samples of novel chemicals together with their properties and other related data. The involved scientists must be able to store this data and search it by chemical structure. There are commercial solutions for common needs like chemical registration systems or electronic lab notebooks. However, for the specific requirements of in-house databases and processes, no such solutions exist. Another issue is that commercial solutions have the risk of vendor lock-in and may require an expensive license of a proprietary relational database management system. To speed up and simplify the development for applications that require chemical structure search capabilities, I have developed Molecule Database Framework. The framework abstracts the storing and searching of chemical structures into method calls. Therefore software developers do not require extensive knowledge about chemistry and the underlying database cartridge. This decreases application development time. Molecule Database Framework is written in Java and I created it by integrating existing free and open-source tools and frameworks. The core functionality includes: • Support for multi-component compounds (mixtures) • Import and export of SD-files • Optional security (authorization). For chemical structure searching, Molecule Database Framework leverages the capabilities of the Bingo Cartridge for PostgreSQL and provides type-safe searching, caching, transactions and optional method-level security. Molecule Database Framework supports multi-component chemical compounds (mixtures). Furthermore, the design of entity classes and the reasoning behind it are explained. By means of a simple web application I describe how the framework could be used. I then benchmarked this example application to create some basic performance expectations for chemical structure searches and import and export of SD-files. 
By using a simple web application it was shown that Molecule Database Framework successfully abstracts chemical structure searches and SD-file import and export to simple method calls. The framework offers good search performance on a standard laptop without any database tuning. This is also due to the fact that chemical structure searches are paged and cached. Molecule Database Framework is available for download on the project's web page on Bitbucket: https://bitbucket.org/kienerj/moleculedatabaseframework.

  9. Practical Quantum Private Database Queries Based on Passive Round-Robin Differential Phase-shift Quantum Key Distribution

    PubMed Central

    Li, Jian; Yang, Yu-Guang; Chen, Xiu-Bo; Zhou, Yi-Hua; Shi, Wei-Min

    2016-01-01

    A novel quantum private database query protocol is proposed, based on passive round-robin differential phase-shift quantum key distribution. Compared with previous quantum private database query protocols, the present protocol has the following unique merits: (i) the user Alice can obtain one and only one key bit, so that both the efficiency and the security of the present protocol can be ensured, and (ii) it does not require changing the length difference of the two arms in a Mach-Zehnder interferometer and simply chooses two pulses passively for interference, making it much simpler and more practical. The present protocol is also proved to be secure in terms of both user security and database security. PMID:27539654

  10. Nonintrusive multibiometrics on a mobile device: a comparison of fusion techniques

    NASA Astrophysics Data System (ADS)

    Allano, Lorene; Morris, Andrew C.; Sellahewa, Harin; Garcia-Salicetti, Sonia; Koreman, Jacques; Jassim, Sabah; Ly-Van, Bao; Wu, Dalei; Dorizzi, Bernadette

    2006-04-01

    In this article we test a number of score fusion methods for the purpose of multimodal biometric authentication. These tests were made for the SecurePhone project, whose aim is to develop a prototype mobile communication system enabling biometrically authenticated users to conclude legally binding m-contracts during a mobile phone call on a PDA. The three biometrics of voice, face and signature were selected because they are all traditional, non-intrusive and easy-to-use means of authentication which can readily be captured on a PDA. By combining multiple biometrics of relatively low security it may be possible to obtain a combined level of security which is at least as high as that provided by a PIN or handwritten signature, traditionally used for user authentication. As the relative success of different fusion methods depends on the database used and tests made, the database we used was recorded on a suitable PDA (the Qtek2020) and the test protocol was designed to reflect the intended application scenario, which is expected to use short text prompts. Not all of the fusion methods tested are original. They were selected for their suitability for implementation within the constraints imposed by the application. All of the methods tested are based on fusion of the match scores output by each modality. Though computationally simple, the methods tested have shown very promising results. All four fusion methods tested obtain a significant performance increase.
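
A weighted sum of min-max-normalized match scores is one of the computationally simple score-fusion schemes such studies typically include. The sketch below illustrates that generic scheme; the scores, ranges and weights are made up, and this is not claimed to be the paper's exact method.

```python
def min_max_normalize(score, lo, hi):
    """Map a raw matcher score into [0, 1] given that matcher's score range."""
    return (score - lo) / (hi - lo)

def weighted_sum_fusion(scores, ranges, weights):
    """Fuse per-modality match scores by a weighted sum of normalized scores."""
    assert abs(sum(weights) - 1.0) < 1e-9
    normed = [min_max_normalize(s, lo, hi) for s, (lo, hi) in zip(scores, ranges)]
    return sum(w * n for w, n in zip(weights, normed))

# Hypothetical voice, face and signature raw scores with their matchers' ranges.
fused = weighted_sum_fusion(
    scores=[0.8, 55.0, -0.2],
    ranges=[(0.0, 1.0), (0.0, 100.0), (-1.0, 1.0)],
    weights=[0.4, 0.3, 0.3],
)
print(round(fused, 3))  # accept the user if this exceeds a tuned threshold
```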

  11. ADAPTmap: International coordinated data resource for improving goat production efficiency

    USDA-ARS?s Scientific Manuscript database

    Goats provide vital food and economic security, particularly in developing countries. We created a database that is a nexus for all performance, type, geographic information system (GIS), production environment, and genome information on goats. This resource provides a platform for meta-analysis tha...

  12. Multimodal person authentication on a smartphone under realistic conditions

    NASA Astrophysics Data System (ADS)

    Morris, Andrew C.; Jassim, Sabah; Sellahewa, Harin; Allano, Lorene; Ehlers, Johan; Wu, Dalei; Koreman, Jacques; Garcia-Salicetti, Sonia; Ly-Van, Bao; Dorizzi, Bernadette

    2006-05-01

    Verification of a person's identity by the combination of more than one biometric trait strongly increases the robustness of person authentication in real applications. This is particularly the case in applications involving signals of degraded quality, as for person authentication on mobile platforms. The context of mobility generates degradations of input signals due to the variety of environments encountered (ambient noise, lighting variations, etc.), while the sensors' lower quality further contributes to decreased system performance. Our aim in this work is to combine traits from the three biometric modalities of speech, face and handwritten signature in a concrete application, performing non-intrusive biometric verification on a personal mobile device (smartphone/PDA). Most available biometric databases have been acquired in more or less controlled environments, which makes it difficult to predict performance in a real application. Our experiments are performed on a database acquired on a PDA as part of the SecurePhone project (IST-2002-506883 project "Secure Contracts Signed by Mobile Phone"). This database contains 60 virtual subjects balanced in gender and age. Virtual subjects are obtained by coupling audio-visual signals from real English-speaking subjects with signatures from other subjects captured on the touch screen of the PDA. Video data for the PDA database was recorded in two recording sessions separated by at least one week. Each session comprises 4 acquisition conditions: 2 indoor and 2 outdoor recordings (with, in each case, a good and a degraded quality recording). Handwritten signatures were captured in one session in realistic conditions. Different scenarios of matching between training and test conditions are tested to measure the resistance of various fusion systems to different types of variability and different amounts of enrolment data.

  13. Successful linking of the Society of Thoracic Surgeons database to social security data to examine survival after cardiac operations.

    PubMed

    Jacobs, Jeffrey Phillip; Edwards, Fred H; Shahian, David M; Prager, Richard L; Wright, Cameron D; Puskas, John D; Morales, David L S; Gammie, James S; Sanchez, Juan A; Haan, Constance K; Badhwar, Vinay; George, Kristopher M; O'Brien, Sean M; Dokholyan, Rachel S; Sheng, Shubin; Peterson, Eric D; Shewan, Cynthia M; Feehan, Kelly M; Han, Jane M; Jacobs, Marshall Lewis; Williams, William G; Mayer, John E; Chitwood, W Randolph; Murray, Gordon F; Grover, Frederick L

    2011-07-01

    Long-term evaluation of cardiothoracic surgical outcomes is a major goal of The Society of Thoracic Surgeons (STS). Linking the STS Database to the Social Security Death Master File (SSDMF) allows for the verification of "life status." This study demonstrates the feasibility of linking the STS Database to the SSDMF and examines longitudinal survival after cardiac operations. For all operations in the STS Adult Cardiac Surgery Database performed in 2008 in patients with an available Social Security Number, the SSDMF was searched for a matching Social Security Number. Survival probabilities at 30 days and 1 year were estimated for nine common operations. A Social Security Number was available for 101,188 patients undergoing isolated coronary artery bypass grafting, 12,336 patients undergoing isolated aortic valve replacement, and 6,085 patients undergoing isolated mitral valve operations. One-year survival for isolated coronary artery bypass grafting was 88.9% (6,529 of 7,344) with all vein grafts, 95.2% (84,696 of 88,966) with a single mammary artery graft, 97.4% (4,422 of 4,540) with bilateral mammary artery grafts, and 95.6% (7,543 of 7,890) with all arterial grafts. One-year survival was 92.4% (11,398 of 12,336) for isolated aortic valve replacement (95.6% [2,109 of 2,206] with mechanical prosthesis and 91.7% [9,289 of 10,130] with biologic prosthesis), 86.5% (2,312 of 2,674) for isolated mitral valve replacement (91.7% [923 of 1,006] with mechanical prosthesis and 83.3% [1,389 of 1,668] with biologic prosthesis), and 96.0% (3,275 of 3,411) for isolated mitral valve repair. Successful linkage to the SSDMF has substantially increased the power of the STS Database. These longitudinal survival data from this large multi-institutional study provide reassurance about the durability and long-term benefits of cardiac operations and constitute a contemporary benchmark for survival after cardiac operations. Copyright © 2011 The Society of Thoracic Surgeons. 
Published by Elsevier Inc. All rights reserved.
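
The core of the linkage-and-survival computation, joining surgical records to a death index by Social Security Number and counting survivors at a follow-up horizon, can be sketched as below. This is a deliberately simplified illustration of the join-then-count idea behind ratios such as 6,529/7,344 ≈ 88.9%; the actual study uses formal matching rules and proper survival estimators, and all field names here are hypothetical.

```python
def link_and_survival(surgery_records, death_index, follow_up_days=365):
    """Match each surgical record's SSN against a death index and
    estimate simple survival at follow_up_days after the operation."""
    alive = total = 0
    for rec in surgery_records:
        total += 1
        days_to_death = death_index.get(rec["ssn"])  # None if no match found
        if days_to_death is None or days_to_death > follow_up_days:
            alive += 1
    return alive, total, 100.0 * alive / total

records = [{"ssn": s} for s in ("111", "222", "333", "444")]
deaths = {"222": 120, "444": 400}   # days from operation to death
alive, total, pct = link_and_survival(records, deaths)
print(alive, total, round(pct, 1))  # 3 4 75.0  (the day-400 death is past 1 year)
```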

  14. Design and Analysis of A Multi-Backend Database System for Performance Improvement, Functionality Expansion and Capacity Growth. Part II.

    DTIC Science & Technology

    1981-08-01

    ...5.5.2 Attached Execution of Transactions...5.5.3 The Choice of Transaction Execution for Access Control...basic access control mechanism for statistical security and value-dependent security. In Section 5.5, we describe the process of execution of...the process of request execution with access control for insert and non-insert requests in MDBS. We recall again (see Chapter 4) that the process

  15. Cost Considerations in Cloud Computing

    DTIC Science & Technology

    2014-01-01

    investments. 2. Database Options The potential promise that “big data” analytics holds for many enterprise mission areas makes relevant the question of the...development of a range of new distributed file systems and databases that have better scalability properties than traditional SQL databases. Hadoop...data. Many systems exist that extend or supplement Hadoop, such as Apache Accumulo, which provides a highly granular mechanism for managing security

  16. A novel chaotic stream cipher and its application to palmprint template protection

    NASA Astrophysics Data System (ADS)

    Li, Heng-Jian; Zhang, Jia-Shu

    2010-04-01

    Based on a coupled nonlinear dynamic filter (NDF), a novel chaotic stream cipher is presented in this paper and employed to protect palmprint templates. The chaotic pseudorandom bit generator (PRBG) based on a coupled NDF, which is constructed in an inverse flow, can generate multiple bits at one iteration and satisfy the security requirement of cipher design. Then, the stream cipher is employed to generate cancelable competitive code palmprint biometrics for template protection. The proposed cancelable palmprint authentication system depends on two factors: the palmprint biometric and the password/token. Therefore, the system provides high-confidence and also protects the user's privacy. The experimental results of verification on the Hong Kong PolyU Palmprint Database show that the proposed approach has a large template re-issuance ability and the equal error rate can achieve 0.02%. The performance of the palmprint template protection scheme proves the good practicability and security of the proposed stream cipher.
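
A chaotic stream cipher of the general kind described, a chaos-driven pseudorandom bit generator whose keystream is XORed with the template, can be sketched with a logistic map. This is a generic illustration only: it is not the paper's coupled nonlinear dynamic filter, and a bare logistic map is not secure enough for real use.

```python
def logistic_keystream(key: float, n_bytes: int, r: float = 3.99) -> bytes:
    """Generate n_bytes of keystream from the logistic map x -> r*x*(1-x),
    taking one bit per iteration (1 if x > 0.5). Illustration only; NOT the
    paper's coupled NDF and not cryptographically strong."""
    assert 0.0 < key < 1.0
    x = key
    out = bytearray()
    for _ in range(n_bytes):
        byte = 0
        for _ in range(8):
            x = r * x * (1.0 - x)
            byte = (byte << 1) | (1 if x > 0.5 else 0)
        out.append(byte)
    return bytes(out)

def xor_cipher(data: bytes, key: float) -> bytes:
    """Stream-cipher encryption/decryption: XOR data with the keystream.
    The same call decrypts, since XOR is its own inverse."""
    ks = logistic_keystream(key, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))

template = b"competitive-code-palmprint-bits"
secret = 0.3141592653589793               # the password/token factor
ciphertext = xor_cipher(template, secret)
assert xor_cipher(ciphertext, secret) == template   # round-trips exactly
```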

  17. 49 CFR 1570.13 - False statements regarding security background checks by public transportation agency or railroad...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Regulations Relating to Transportation (Continued) TRANSPORTATION SECURITY ADMINISTRATION, DEPARTMENT OF..., national security, or of terrorism: (i) Relevant criminal history databases; (ii) In the case of an alien... databases to determine the status of the alien under the immigration laws of the United States; and (iii...

  18. The European general thoracic surgery database project.

    PubMed

    Falcoz, Pierre Emmanuel; Brunelli, Alessandro

    2014-05-01

    The European Society of Thoracic Surgeons (ESTS) Database is a free registry created by ESTS in 2001. The current online version was launched in 2007. It runs currently on a Dendrite platform with extensive data security and frequent backups. The main features are a specialty-specific, procedure-specific, prospectively maintained, periodically audited and web-based electronic database, designed for quality control and performance monitoring, which allows for the collection of all general thoracic procedures. Data collection is the "backbone" of the ESTS database. It includes many risk factors, processes of care and outcomes, which are specially designed for quality control and performance audit. The user can download and export their own data and use them for internal analyses and quality control audits. The ESTS database represents the gold standard of clinical data collection for European General Thoracic Surgery. Over the past years, the ESTS database has achieved many accomplishments. In particular, the database hit two major milestones: it now includes more than 235 participating centers and 70,000 surgical procedures. The ESTS database is a snapshot of surgical practice that aims at improving patient care. In other words, data capture should become integral to routine patient care, with the final objective of improving quality of care within Europe.

  19. Design and deployment of a large brain-image database for clinical and nonclinical research

    NASA Astrophysics Data System (ADS)

    Yang, Guo Liang; Lim, Choie Cheio Tchoyoson; Banukumar, Narayanaswami; Aziz, Aamer; Hui, Francis; Nowinski, Wieslaw L.

    2004-04-01

    An efficient database is an essential component of organizing diverse information on image metadata and patient information for research in medical imaging. This paper describes the design, development and deployment of a large database system serving as a brain image repository that can be used across different platforms in various medical research projects. It forms the infrastructure that links hospitals and institutions together and shares data among them. The database contains patient-, pathology-, image-, research- and management-specific data. The functionalities of the database system include image uploading, storage, indexing, downloading and sharing as well as database querying and management, with security and data anonymization concerns well taken care of. The database structure is a multi-tier client-server architecture comprising a Relational Database Management System, a Security Layer, an Application Layer and a User Interface. An image source adapter has been developed to handle most of the popular image formats. The database has a user interface based on web browsers and is easy to handle. We have used the Java programming language for its platform independence and vast function libraries. The brain image database can sort data according to clinically relevant information, which can be used effectively in research from the clinicians' point of view. The database is suitable for validation of algorithms on large populations of cases. Medical images for processing can be identified and organized based on information in image metadata. Clinical research in various pathologies can thus be performed with greater efficiency, and large image repositories can be managed more effectively. A prototype of the system has been installed in a few hospitals and is working to the satisfaction of the clinicians.

  20. Use of a secure Internet Web site for collaborative medical research.

    PubMed

    Marshall, W W; Haley, R W

    2000-10-11

    Researchers who collaborate on clinical research studies from diffuse locations need a convenient, inexpensive, secure way to record and manage data. The Internet, with its World Wide Web, provides a vast network that enables researchers with diverse types of computers and operating systems anywhere in the world to log data through a common interface. Development of a Web site for scientific data collection can be organized into 10 steps, including planning the scientific database, choosing a database management software system, setting up database tables for each collaborator's variables, developing the Web site's screen layout, choosing a middleware software system to tie the database software to the Web site interface, embedding data editing and calculation routines, setting up the database on the central server computer, obtaining a unique Internet address and name for the Web site, applying security measures to the site, and training staff who enter data. Ensuring the security of an Internet database requires limiting the number of people who have access to the server, setting up the server on a stand-alone computer, requiring user-name and password authentication for server and Web site access, installing a firewall computer to prevent break-ins and block bogus information from reaching the server, verifying the identity of the server and client computers with certification from a certificate authority, encrypting information sent between server and client computers to avoid eavesdropping, establishing audit trails to record all accesses into the Web site, and educating Web site users about security techniques. When these measures are carefully undertaken, in our experience, information for scientific studies can be collected and maintained on Internet databases more efficiently and securely than through conventional systems of paper records protected by filing cabinets and locked doors. JAMA. 2000;284:1843-1849.
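
Two of the measures listed, password authentication and audit trails, can be sketched with Python's standard library. The helper names are illustrative, and the salted, stretched hashing (PBKDF2 here) with constant-time comparison reflects current practice rather than the exact 2000-era setup the article describes.

```python
import hashlib
import hmac
import os
import time

def hash_password(password: str, salt=None):
    """Return (salt, digest) using PBKDF2-HMAC-SHA256, so only a salted,
    stretched hash -- never the plaintext password -- is stored server-side."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

audit_trail = []  # append-only record of accesses, as the article advises

def record_access(user: str, action: str, success: bool) -> None:
    audit_trail.append((time.time(), user, action, success))

salt, digest = hash_password("correct horse")
ok = verify_password("correct horse", salt, digest)
record_access("collaborator1", "login", ok)
assert ok
assert not verify_password("wrong guess", salt, digest)
```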

  1. Secure Database Management Study.

    DTIC Science & Technology

    1978-12-01

    covers cases involving industrial economics (e.g., industrial spies) and commercial finances (e.g., fraud). Privacy--protection of data about people...California, Berkeley [STON76a]. The approach to protection taken in INGRES [STON74] has attracted a lot of interest. Queries, in a high-level query...Material Command Support Activity (NMCSA), and another DoD agency, Cullinane Corporation developed a prototype version of the IDS database system on a

  2. 20 CFR 411.250 - How will SSA evaluate a PM?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 411.250 Employees' Benefits SOCIAL SECURITY ADMINISTRATION THE TICKET TO WORK AND SELF-SUFFICIENCY PROGRAM Use of One or More Program Managers To Assist in Administration of the Ticket to Work Program... determine the PM's final rating. (c) These performance evaluations will be made part of our database on...

  3. 20 CFR 411.250 - How will SSA evaluate a PM?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 411.250 Employees' Benefits SOCIAL SECURITY ADMINISTRATION THE TICKET TO WORK AND SELF-SUFFICIENCY PROGRAM Use of One or More Program Managers To Assist in Administration of the Ticket to Work Program... determine the PM's final rating. (c) These performance evaluations will be made part of our database on...

  4. 20 CFR 411.250 - How will SSA evaluate a PM?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 411.250 Employees' Benefits SOCIAL SECURITY ADMINISTRATION THE TICKET TO WORK AND SELF-SUFFICIENCY PROGRAM Use of One or More Program Managers To Assist in Administration of the Ticket to Work Program... determine the PM's final rating. (c) These performance evaluations will be made part of our database on...

  5. 20 CFR 411.250 - How will SSA evaluate a PM?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 411.250 Employees' Benefits SOCIAL SECURITY ADMINISTRATION THE TICKET TO WORK AND SELF-SUFFICIENCY PROGRAM Use of One or More Program Managers To Assist in Administration of the Ticket to Work Program... determine the PM's final rating. (c) These performance evaluations will be made part of our database on...

  6. Database Design Methodology and Database Management System for Computer-Aided Structural Design Optimization.

    DTIC Science & Technology

    1984-12-01

    52242 Prepared for the AIR FORCE OFFICE OF SCIENTIFIC RESEARCH Under Grant No. AFOSR 82-0322, December 1984. Unclassified.

  7. Biometric identification based on feature fusion with PCA and SVM

    NASA Astrophysics Data System (ADS)

    Lefkovits, László; Lefkovits, Szidónia; Emerich, Simina

    2018-04-01

    Biometric identification is gaining ground compared to traditional identification methods. Many biometric measurements may be used for secure human identification. The most reliable among them is the iris pattern because of its uniqueness, stability, unforgeability and inalterability over time. The approach presented in this paper is a fusion of different feature descriptor methods, such as HOG, LIOP and LBP, used for extracting iris texture information. The classifiers obtained through the SVM and PCA methods demonstrate the effectiveness of our system applied to one iris and to both irises. The measured performance is highly accurate and foreshadows a fusion system with an identification rate approaching 100% on the UPOL database.

  8. Practical security and privacy attacks against biometric hashing using sparse recovery

    NASA Astrophysics Data System (ADS)

    Topcu, Berkay; Karabat, Cagatay; Azadmanesh, Matin; Erdogan, Hakan

    2016-12-01

    Biometric hashing is a cancelable biometric verification method that has recently received research interest. It can be considered a two-factor authentication method which combines a personal password (or secret key) with a biometric to obtain a secure binary template used for authentication. We present novel practical security and privacy attacks against biometric hashing when the attacker is assumed to know the user's password, in order to quantify the additional protection due to biometrics when the password is compromised. We present four methods that can reconstruct a biometric feature and/or the image from a hash, and one method which can find the closest biometric data (i.e., face image) in a database. Two of the reconstruction methods are based on 1-bit compressed sensing signal reconstruction, for which the data acquisition scenario is very similar to biometric hashing. Previous literature introduced simple attack methods, but we show that a higher level of security threat can be achieved using compressed sensing recovery techniques. In addition, we present privacy attacks which reconstruct a biometric image that resembles the original image. We quantify the performance of the attacks using detection error tradeoff curves and equal error rates under advanced attack scenarios. We show that conventional biometric hashing methods suffer from high security and privacy leaks under practical attacks, and we believe more advanced hash generation methods are necessary to avoid these attacks.
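A common biometric-hashing construction of the kind attacked here is a key-seeded random projection followed by sign quantisation. The sketch below is that generic construction with illustrative feature vectors (not the paper's scheme or data); it shows the two-factor property: the same biometric and key reproduce the hash, while a very different biometric does not.

```python
import random

def biohash(features, key, n_bits=32):
    """Generic biometric hashing: key-seeded random projections followed by
    sign quantisation, yielding a cancelable binary template."""
    prng = random.Random(key)  # the user's secret key seeds the projections
    bits = []
    for _ in range(n_bits):
        direction = [prng.gauss(0.0, 1.0) for _ in features]
        dot = sum(d * f for d, f in zip(direction, features))
        bits.append(1 if dot >= 0 else 0)
    return bits

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

# Illustrative feature vectors (a real system extracts these from a face image).
enrolled = [0.8, -1.2, 0.3, 2.1, -0.5, 1.4, -0.9, 0.2]
fresh    = [v + 0.05 for v in enrolled]   # same user, slight sensor noise
impostor = [-v for v in enrolled]         # very different biometric

h_enrolled = biohash(enrolled, "user-secret")
h_fresh    = biohash(fresh, "user-secret")
h_impostor = biohash(impostor, "user-secret")
```

The attacks in the paper exploit exactly this structure: once the key (and hence the projection directions) is known, each hash bit is a 1-bit linear measurement of the feature vector, which is the setting of 1-bit compressed sensing recovery.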

  9. A sharable cloud-based pancreaticoduodenectomy collaborative database for physicians: emphasis on security and clinical rule supporting.

    PubMed

    Yu, Hwan-Jeu; Lai, Hong-Shiee; Chen, Kuo-Hsin; Chou, Hsien-Cheng; Wu, Jin-Ming; Dorjgochoo, Sarangerel; Mendjargal, Adilsaikhan; Altangerel, Erdenebaatar; Tien, Yu-Wen; Hsueh, Chih-Wen; Lai, Feipei

    2013-08-01

    Pancreaticoduodenectomy (PD) is a major operation with a high complication rate. Patients may subsequently develop morbidity because of the complex reconstruction and loss of pancreatic parenchyma. A well-designed database is very important to address both the short-term and long-term outcomes after PD. The objective of this research was to build an international PD database with security and clinical rule supporting functions, making data-sharing easier and improving the accuracy of the data. The proposed system is a cloud-based application. To fulfill its requirements, the system comprises four subsystems: a data management subsystem, a clinical rule supporting subsystem, a short message notification subsystem, and an information security subsystem. After completing the surgery, the physicians input the data retrospectively, which are analyzed to study factors associated with common post-PD complications (delayed gastric emptying and pancreatic fistula) to validate the clinical value of this system. Currently, this database contains data from nearly 500 subjects. Five medical centers in Taiwan and two cancer centers in Mongolia are participating in this study. A decision tree data mining model showed that elderly patients (>76 years) with pylorus-preserving PD (PPPD) have a higher proportion of delayed gastric emptying. Regarding pancreatic fistula, the decision tree analysis revealed that cases with non-pancreaticogastrostomy (PG) reconstruction and body mass index (BMI)>29.65, or with PG reconstruction, BMI>23.7 and non-classic PD, have a higher proportion of pancreatic fistula after PD. The proposed system allows medical staff to collect and store clinical data in a cloud, sharing the data with other physicians in a secure manner to achieve collaboration in research. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  10. Income distribution patterns from a complete social security database

    NASA Astrophysics Data System (ADS)

    Derzsy, N.; Néda, Z.; Santos, M. A.

    2012-11-01

    We analyze the income distribution of employees for 9 consecutive years (2001-2009) using a complete social security database for an economically important district of Romania. The database contains detailed information on more than half a million taxpayers, including their monthly salaries from all employers where they worked. Besides studying the characteristic distribution functions in the high and low/medium income limits, the database allows a detailed dynamical study by following the time-evolution of the taxpayers' income. To our knowledge, this is the first extensive study of this kind (a previous Japanese taxpayer survey was limited to two years). In the high income limit we prove once again the validity of Pareto's law, obtaining a perfect scaling over four orders of magnitude in the rank for all the studied years. The obtained Pareto exponents are quite stable, with values around α≈2.5, in spite of the fact that during this period the economy developed rapidly and a financial-economic crisis hit Romania in 2007-2008. For the low and medium income category we confirmed the exponential-type income distribution. Following the income of employees in time, we found that the top of the income distribution is a highly dynamical region with strong fluctuations in the rank. In this region, the observed dynamics is consistent with a multiplicative random growth hypothesis. Contrary to previous results obtained for Japanese employees, we find that the logarithmic growth-rate is not independent of the income.
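The Pareto-tail claim can be illustrated with a standard Hill estimate of the exponent on synthetic data; the α = 2.5 and the sample sizes below merely mirror the values reported in the abstract and are not the Romanian data.

```python
import math
import random

random.seed(42)

# Synthetic incomes with a Pareto(alpha = 2.5) tail, mirroring the exponent
# reported for the Romanian data; all numbers here are illustrative.
x_min = 1000.0
incomes = [x_min * random.paretovariate(2.5) for _ in range(50_000)]

def hill_alpha(data, k):
    """Hill estimator of the Pareto exponent from the top k order statistics."""
    tail = sorted(data, reverse=True)[: k + 1]
    x_k = tail[-1]  # (k+1)-th largest value anchors the tail
    return k / sum(math.log(x / x_k) for x in tail[:-1])

alpha_hat = hill_alpha(incomes, k=2000)
```

With k = 2000 tail observations, the estimate lands close to the true exponent of 2.5, the kind of stability the study reports across years.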

  11. Security and privacy qualities of medical devices: an analysis of FDA postmarket surveillance.

    PubMed

    Kramer, Daniel B; Baker, Matthew; Ransford, Benjamin; Molina-Markham, Andres; Stewart, Quinn; Fu, Kevin; Reynolds, Matthew R

    2012-01-01

    Medical devices increasingly depend on computing functions such as wireless communication and Internet connectivity for software-based control of therapies and network-based transmission of patients' stored medical information. These computing capabilities introduce security and privacy risks, yet little is known about the prevalence of such risks within the clinical setting. We used three comprehensive, publicly available databases maintained by the Food and Drug Administration (FDA) to evaluate recalls and adverse events related to security and privacy risks of medical devices. Review of weekly enforcement reports identified 1,845 recalls; 605 (32.8%) of these included computers, 35 (1.9%) stored patient data, and 31 (1.7%) were capable of wireless communication. Searches of databases specific to recalls and adverse events identified only one event with a specific connection to security or privacy. Software-related recalls were relatively common, and most (81.8%) mentioned the possibility of upgrades, though only half of these provided specific instructions for the update mechanism. Our review of recalls and adverse events from federal government databases reveals sharp inconsistencies with databases at individual providers with respect to security and privacy risks. Recalls related to software may increase security risks because of unprotected update and correction mechanisms. To detect signals of security and privacy problems that adversely affect public health, federal postmarket surveillance strategies should rethink how to effectively and efficiently collect data on security and privacy problems in devices that increasingly depend on computing systems susceptible to malware.

  12. Security and Privacy Qualities of Medical Devices: An Analysis of FDA Postmarket Surveillance

    PubMed Central

    Kramer, Daniel B.; Baker, Matthew; Ransford, Benjamin; Molina-Markham, Andres; Stewart, Quinn; Fu, Kevin; Reynolds, Matthew R.

    2012-01-01

    Background Medical devices increasingly depend on computing functions such as wireless communication and Internet connectivity for software-based control of therapies and network-based transmission of patients’ stored medical information. These computing capabilities introduce security and privacy risks, yet little is known about the prevalence of such risks within the clinical setting. Methods We used three comprehensive, publicly available databases maintained by the Food and Drug Administration (FDA) to evaluate recalls and adverse events related to security and privacy risks of medical devices. Results Review of weekly enforcement reports identified 1,845 recalls; 605 (32.8%) of these included computers, 35 (1.9%) stored patient data, and 31 (1.7%) were capable of wireless communication. Searches of databases specific to recalls and adverse events identified only one event with a specific connection to security or privacy. Software-related recalls were relatively common, and most (81.8%) mentioned the possibility of upgrades, though only half of these provided specific instructions for the update mechanism. Conclusions Our review of recalls and adverse events from federal government databases reveals sharp inconsistencies with databases at individual providers with respect to security and privacy risks. Recalls related to software may increase security risks because of unprotected update and correction mechanisms. To detect signals of security and privacy problems that adversely affect public health, federal postmarket surveillance strategies should rethink how to effectively and efficiently collect data on security and privacy problems in devices that increasingly depend on computing systems susceptible to malware. PMID:22829874

  13. Rhinoplasty perioperative database using a personal digital assistant.

    PubMed

    Kotler, Howard S

    2004-01-01

    To construct a reliable, accurate, and easy-to-use handheld computer database that facilitates the point-of-care acquisition of perioperative text and image data specific to rhinoplasty. A user-modified database (Pendragon Forms [v.3.2]; Pendragon Software Corporation, Libertyville, Ill) and graphic image program (Tealpaint [v.4.87]; Tealpaint Software, San Rafael, Calif) were used to capture text and image data, respectively, on a Palm OS (v.4.11) handheld operating with 8 megabytes of memory. The handheld and desktop databases were maintained secure using PDASecure (v.2.0) and GoldSecure (v.3.0) (Trust Digital LLC, Fairfax, Va). The handheld data were then uploaded to a desktop database of either FileMaker Pro 5.0 (v.1) (FileMaker Inc, Santa Clara, Calif) or Microsoft Access 2000 (Microsoft Corp, Redmond, Wash). Patient data were collected from 15 patients undergoing rhinoplasty in a private practice outpatient ambulatory setting. Data integrity was assessed after 6 months' disk and hard drive storage. The handheld database was able to facilitate data collection and accurately record, transfer, and reliably maintain perioperative rhinoplasty data. Query capability allowed rapid search using a multitude of keyword search terms specific to the operative maneuvers performed in rhinoplasty. Handheld computer technology provides a method of reliably recording and storing perioperative rhinoplasty information. The handheld computer facilitates the reliable and accurate storage and query of perioperative data, assisting the retrospective review of one's own results and enhancement of surgical skills.

  14. MedBlock: Efficient and Secure Medical Data Sharing Via Blockchain.

    PubMed

    Fan, Kai; Wang, Shangyang; Ren, Yanhui; Li, Hui; Yang, Yintang

    2018-06-21

    With the development of electronic information technology, electronic medical records (EMRs) have become a common way to store patients' data in hospitals. They are stored in different hospitals' databases, even for the same patient. It is therefore difficult to construct a summarized EMR for one patient from multiple hospital databases, due to security and privacy concerns. Meanwhile, current EMR systems lack a standard data management and sharing policy, making it difficult for pharmaceutical scientists to develop precise medicines based on data obtained under different policies. To solve the above problems, we propose a blockchain-based information management system, MedBlock, to handle patients' information. In this scheme, the distributed ledger of MedBlock allows efficient EMR access and retrieval. The improved consensus mechanism achieves consensus on EMRs without large energy consumption and network congestion. In addition, MedBlock exhibits high information security by combining customized access control protocols and symmetric cryptography. MedBlock can play an important role in sensitive medical information sharing.
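The tamper-evidence a blockchain ledger lends to EMRs can be illustrated with a minimal hash chain. This is a generic sketch of the data structure only; MedBlock's actual consensus mechanism and access-control protocols are not modeled here.

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 over the block's canonical JSON form."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

class Ledger:
    """Toy append-only hash chain: each block commits to its predecessor."""

    def __init__(self):
        genesis = {"index": 0, "prev": "0" * 64, "record": "genesis"}
        self.chain = [{**genesis, "hash": block_hash(genesis)}]

    def append(self, record):
        prev = self.chain[-1]
        block = {"index": prev["index"] + 1, "prev": prev["hash"], "record": record}
        block["hash"] = block_hash(block)  # hash computed over body without "hash"
        self.chain.append(block)

    def verify(self):
        """Recompute every hash and link; any edit to a past record fails."""
        for i in range(1, len(self.chain)):
            blk = self.chain[i]
            body = {k: v for k, v in blk.items() if k != "hash"}
            if blk["prev"] != self.chain[i - 1]["hash"] or blk["hash"] != block_hash(body):
                return False
        return True
```

Appending EMR entries and then altering one in place makes `verify()` return False, which is the property that lets hospitals share a ledger without trusting each other's edits.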

  15. Safeguarding Databases Basic Concepts Revisited.

    ERIC Educational Resources Information Center

    Cardinali, Richard

    1995-01-01

    Discusses issues of database security and integrity, including computer crime and vandalism, human error, computer viruses, employee and user access, and personnel policies. Suggests some precautions to minimize system vulnerability such as careful personnel screening, audit systems, passwords, and building and software security systems. (JKP)

  16. Intrusion Detection in Database Systems

    NASA Astrophysics Data System (ADS)

    Javidi, Mohammad M.; Sohrabi, Mina; Rafsanjani, Marjan Kuchaki

    Data represent today a valuable asset for organizations and companies and must be protected. Ensuring the security and privacy of data assets is a crucial and very difficult problem in our modern networked world. Despite the necessity of protecting information stored in database systems (DBS), existing security models are insufficient to prevent misuse, especially insider abuse by legitimate users. One mechanism to safeguard the information in these databases is to use an intrusion detection system (IDS). The purpose of intrusion detection in database systems is to detect transactions that access data without permission. In this paper several database intrusion detection approaches are evaluated.

  17. Securely and Flexibly Sharing a Biomedical Data Management System

    PubMed Central

    Wang, Fusheng; Hussels, Phillip; Liu, Peiya

    2011-01-01

    Biomedical database systems need not only to address the issues of managing complex data, but also to provide data security and access control. This includes not only system-level security, but also instance-level access control, such as access to documents, schemas, or aggregations of information. The latter is becoming more important as multiple users can share a single scientific data management system to conduct their research, while data have to be protected before they are published or IP-protected. This problem is challenging because users' needs for data security vary dramatically from one application to another, in terms of whom to share with, what resources are to be shared, and at what access level. We develop a comprehensive data access framework for the biomedical data management system SciPort. SciPort provides fine-grained, multi-level, space-based access control of resources at both the object level (documents and schemas) and the space level (resource sets aggregated in a hierarchical way). Furthermore, to simplify the management of users and privileges, a customizable role-based user model is developed. The access control is implemented efficiently by integrating access privileges into the backend XML database, so efficient queries are supported. The secure access approach we take makes it possible for multiple users to share the same biomedical data management system with flexible access management and high data security. PMID:21625285
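A hierarchical, space-based access check of the kind described can be sketched as follows. The space names, privilege levels, and grant table are hypothetical illustrations, not SciPort's actual API: the point is that a grant on a space applies to every resource beneath it, and the check walks the ancestor chain.

```python
# A privilege granted on a space applies to everything beneath it.
GRANTS = {
    ("alice", "/lab1"): "write",          # alice may edit anything under /lab1
    ("bob", "/lab1/projectX"): "read",    # bob may only read one project
}
LEVELS = {"read": 1, "write": 2}

def ancestors(path):
    """All spaces on the way down to `path`, e.g. /a/b -> [/a, /a/b]."""
    parts = path.strip("/").split("/")
    return ["/" + "/".join(parts[:i]) for i in range(1, len(parts) + 1)]

def allowed(user, path, action):
    """Grant holds if the user has a sufficient privilege on the path
    itself or on any ancestor space."""
    needed = LEVELS[action]
    return any(
        LEVELS.get(GRANTS.get((user, space), ""), 0) >= needed
        for space in ancestors(path)
    )
```

So alice can write a document nested anywhere under `/lab1`, while bob can read but not write inside `/lab1/projectX`, and an unknown user gets nothing.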

  18. Online database for documenting clinical pathology resident education.

    PubMed

    Hoofnagle, Andrew N; Chou, David; Astion, Michael L

    2007-01-01

    Training of clinical pathologists is evolving and must now address the 6 core competencies described by the Accreditation Council for Graduate Medical Education (ACGME), which include patient care. A substantial portion of the patient care performed by the clinical pathology resident takes place while the resident is on call for the laboratory, a practice that provides the resident with clinical experience and assists the laboratory in providing quality service to clinicians in the hospital and surrounding community. Documenting the educational value of these on-call experiences and providing evidence of competence is difficult for residency directors. An online database of these calls, entered by residents and reviewed by faculty, would provide a mechanism for documenting and improving the education of clinical pathology residents. With Microsoft Access we developed an online database that uses active server pages and secure sockets layer encryption to document calls to the clinical pathology resident. Using the data collected, we evaluated the efficacy of 3 interventions aimed at improving resident education. The database facilitated the documentation of more than 4,700 calls in the first 21 months it was online, provided archived resident-generated data to assist in serving clients, and demonstrated that 2 interventions aimed at improving resident education were successful. We have developed a secure online database, accessible from any computer with Internet access, that can be used to easily document clinical pathology resident education and competency.

  19. Practical quantum private query with better performance in resisting joint-measurement attack

    NASA Astrophysics Data System (ADS)

    Wei, Chun-Yan; Wang, Tian-Yin; Gao, Fei

    2016-04-01

    As a kind of practical protocol, quantum-key-distribution (QKD)-based quantum private queries (QPQs) have drawn considerable attention. However, the joint-measurement (JM) attack poses a noticeable threat to database security in such protocols. That is, by a JM attack a malicious user can illegally elicit many more items from the database than the average amount an honest one can obtain. Taking Jacobi et al.'s protocol as an example, by a JM attack a malicious user can obtain as many as 500 bits, instead of the expected 2.44 bits, from a 10⁴-bit database in one query. It is a noticeable security flaw in theory, and would also arise in application with the development of quantum memories. To solve this problem, we propose a QPQ protocol based on a two-way QKD scheme, which behaves much better in resisting the JM attack. Concretely, the user Alice cannot get more database items by conducting a JM attack on the qubits because she has to send them back to Bob (the database holder) before knowing which of them should be jointly measured. Furthermore, a JM attack by both Alice and Bob would be detected with certain probability, which is quite different from previous protocols. Moreover, our protocol retains the good characteristics of QKD-based QPQs, e.g., it is loss tolerant and robust against the quantum memory attack.

  20. Planar ultra thin glass seals with optical fiber interface for monitoring tamper attacks on security eminent components

    NASA Astrophysics Data System (ADS)

    Thiel, M.; Flachenecker, G.; Schade, W.; Gorecki, C.; Thoma, A.; Rathje, R.

    2017-11-01

    Optical seals consisting of waveguide Bragg grating sensor structures in ultra-thin glass transparencies have been developed to cover security-relevant objects for detection of unauthorized access. Femtosecond laser pulses were used to generate the optical signature in the seals. The optical seals were connected with an optical fiber to enable external read-out of the seal. Different attack scenarios for gaining undetected access to the object covered by the seal were tested and evaluated. The results presented here verify a very high level of security. Undetected detachment and subsequent replacement of the seals by an original or a copy would require a very high technological effort, posing a substantial barrier to an attacker. Additionally, environmental influences such as temperature have a strong but reproducible influence on the signature, which, in the context of a temperature reference database, increases the level of security significantly.

  1. Performance analysis of AES-Blowfish hybrid algorithm for security of patient medical record data

    NASA Astrophysics Data System (ADS)

    Mahmud H, Amir; Angga W, Bayu; Tommy; Marwan E, Andi; Siregar, Rosyidah

    2018-04-01

    File security is one method to protect data confidentiality, integrity and information security. Cryptography is one technique used to secure and guarantee data confidentiality by converting plaintext (the original message) into ciphertext (the hidden message) through two key processes: encryption and decryption. Some researchers have proposed hybrid methods to improve data security. In this research we propose a hybrid AES-Blowfish (BF) method to secure patients' medical report data, in the form of PDF files sourced from a database. Private and public keys are generated using two approaches, RSA and ECC. We analyze the impact of these two approaches on the AES-Blowfish hybrid method in terms of time and throughput. Based on the test results, the BF method is faster than AES and the AES-BF hybrid, while the AES-BF hybrid achieves better throughput than AES.
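The comparison in the record rests on two ideas: layering two ciphers into a hybrid, and measuring time/throughput. The sketch below illustrates both with a standard-library stand-in keystream cipher (SHA-256 in counter mode) in place of real AES and Blowfish, which would require third-party crypto libraries; the layering structure and the throughput metric are the point, not the cipher strength.

```python
import hashlib
import os
import time

def keystream_cipher(data, key, label):
    """Placeholder stream cipher (SHA-256 in counter mode). NOT AES or
    Blowfish; XORing the same keystream twice restores the input."""
    out = bytearray()
    for i in range(0, len(data), 32):
        block = hashlib.sha256(key + label + i.to_bytes(8, "big")).digest()
        out += bytes(a ^ b for a, b in zip(data[i:i + 32], block))
    return bytes(out)

def hybrid_encrypt(data, key):
    # Layered ("hybrid") scheme: inner cipher first, outer cipher second.
    inner = keystream_cipher(data, key, b"inner")  # stands in for AES
    return keystream_cipher(inner, key, b"outer")  # stands in for Blowfish

def hybrid_decrypt(data, key):
    inner = keystream_cipher(data, key, b"outer")  # peel the outer layer
    return keystream_cipher(inner, key, b"inner")  # then the inner layer

def throughput_mb_s(fn, data, key):
    """Throughput metric used for the comparison: bytes processed per second."""
    t0 = time.perf_counter()
    fn(data, key)
    return len(data) / (time.perf_counter() - t0) / 1e6

plaintext = os.urandom(1 << 18)  # 256 KiB stand-in for a PDF medical report
key = b"shared-secret"
ct = hybrid_encrypt(plaintext, key)
```

Swapping the placeholder for real AES and Blowfish primitives (e.g. from PyCryptodome) and timing each variant with `throughput_mb_s` reproduces the kind of time/throughput comparison the paper reports.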

  2. Human health risk assessment database, "the NHSRC toxicity value database": supporting the risk assessment process at US EPA's National Homeland Security Research Center.

    PubMed

    Moudgal, Chandrika J; Garrahan, Kevin; Brady-Roberts, Eletha; Gavrelis, Naida; Arbogast, Michelle; Dun, Sarah

    2008-11-15

    The toxicity value database of the United States Environmental Protection Agency's (EPA) National Homeland Security Research Center has been in development since 2004. The toxicity value database includes a compilation of agent property, toxicity, dose-response, and health effects data for 96 agents: 84 chemical and radiological agents and 12 biotoxins. The database is populated with multiple toxicity benchmark values and agent property information from secondary sources, with web links to the secondary sources, where available. A selected set of primary literature citations and associated dose-response data are also included. The toxicity value database offers a powerful means to quickly and efficiently gather pertinent toxicity and dose-response data for a number of agents that are of concern to the nation's security. This database, in conjunction with other tools, will play an important role in understanding human health risks, and will provide a means for risk assessors and managers to make quick and informed decisions on the potential health risks and determine appropriate responses (e.g., cleanup) to agent release. A final, stand-alone MS Access working version of the toxicity value database was completed in November 2007.

  3. Autonomous facial recognition system inspired by human visual system based logarithmical image visualization technique

    NASA Astrophysics Data System (ADS)

    Wan, Qianwen; Panetta, Karen; Agaian, Sos

    2017-05-01

    Autonomous facial recognition systems are widely used in real-life applications, such as homeland border security, law enforcement identification and authentication, and video-based surveillance analysis. Issues like low image quality, non-uniform illumination, and variations in pose and facial expression can impair the performance of recognition systems. To address the non-uniform illumination challenge, we present a novel, robust autonomous facial recognition system inspired by the human visual system and based on the so-called logarithmical image visualization technique. In this paper, the proposed method, for the first time, couples the logarithmical image visualization technique with the local binary pattern to perform discriminative feature extraction for facial recognition. The Yale database, the Yale-B database and the ATT database are used for computer simulation accuracy and efficiency testing. The extensive computer simulation demonstrates the method's efficiency, accuracy, and robustness to illumination variation.

  4. An Introduction to Database Structure and Database Machines.

    ERIC Educational Resources Information Center

    Detweiler, Karen

    1984-01-01

    Enumerates principal management objectives of database management systems (data independence, quality, security, multiuser access, central control) and criteria for comparison (response time, size, flexibility, other features). Conventional database management systems, relational databases, and database machines used for backend processing are…

  5. 49 CFR 1572.107 - Other analyses.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... applicant poses a security threat based on a search of the following databases: (1) Interpol and other international databases, as appropriate. (2) Terrorist watchlists and related databases. (3) Any other databases...

  6. 49 CFR 1572.107 - Other analyses.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... applicant poses a security threat based on a search of the following databases: (1) Interpol and other international databases, as appropriate. (2) Terrorist watchlists and related databases. (3) Any other databases...

  7. 49 CFR 1572.107 - Other analyses.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... applicant poses a security threat based on a search of the following databases: (1) Interpol and other international databases, as appropriate. (2) Terrorist watchlists and related databases. (3) Any other databases...

  8. 49 CFR 1572.107 - Other analyses.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... applicant poses a security threat based on a search of the following databases: (1) Interpol and other international databases, as appropriate. (2) Terrorist watchlists and related databases. (3) Any other databases...

  9. 49 CFR 1572.107 - Other analyses.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... applicant poses a security threat based on a search of the following databases: (1) Interpol and other international databases, as appropriate. (2) Terrorist watchlists and related databases. (3) Any other databases...

  10. Automatic Identification of Critical Data Items in a Database to Mitigate the Effects of Malicious Insiders

    NASA Astrophysics Data System (ADS)

    White, Jonathan; Panda, Brajendra

    A major concern for computer system security is the threat from malicious insiders who target and abuse critical data items in the system. In this paper, we propose a solution to enable automatic identification of critical data items in a database by way of data dependency relationships. This identification of critical data items is necessary because insider threats often target mission-critical data in order to accomplish malicious tasks. Unfortunately, currently available systems fail to address this problem in a comprehensive manner. It is more difficult for non-experts to identify these critical data items because of their lack of familiarity with the system and because data systems are constantly changing. By identifying the critical data items automatically, security engineers will be better prepared to protect what is critical to the mission of the organization and will also be able to focus their security efforts on these critical data items. We have developed an algorithm that scans the database logs and forms a directed graph showing which items influence a large number of other items and at what frequency this influence occurs. This graph is traversed to reveal the data items which have a large influence throughout the database system, using a novel metric-based formula. These items are critical to the system because if they are maliciously altered or stolen, the malicious alterations will spread throughout the system, delaying recovery and causing a much more malignant effect. As these items have significant influence, they are deemed to be critical and worthy of extra security measures. Our proposal is not intended to replace existing intrusion detection systems, but rather to complement current and future technologies. This approach has not been attempted before, and our experimental results show that it is very effective in revealing critical data items automatically.
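The log-scan / dependency-graph / traversal steps can be sketched as follows. The log format (sets of items read, one item written) and the influence metric (frequency-weighted reach through the graph) are simplified illustrations of the idea, not necessarily the authors' exact formula.

```python
from collections import defaultdict

# Toy transaction log: each entry records which items a transaction read
# (the influencing items) and which item it then wrote.
log = [
    ({"a"}, "b"),
    ({"a"}, "c"),
    ({"b", "c"}, "d"),
    ({"a"}, "e"),
    ({"d"}, "f"),
]

# Directed dependency graph: edge x -> y means "y was derived from x",
# weighted by how often that dependency appears in the log.
graph = defaultdict(lambda: defaultdict(int))
for reads, write in log:
    for r in reads:
        graph[r][write] += 1

def influence(item, seen=None):
    """One simple criticality metric: frequency-weighted reach, i.e. how much
    of the database a malicious change to `item` could contaminate."""
    seen = seen or {item}
    score = 0
    for child, freq in graph[item].items():
        if child not in seen:
            seen.add(child)
            score += freq + influence(child, seen)
    return score

scores = {it: influence(it) for it in ["a", "b", "c", "d", "e", "f"]}
critical = max(scores, key=scores.get)
```

In this toy log, item `a` feeds directly or indirectly into everything else, so it scores highest and would be flagged for extra protection.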

  11. Performance evaluation of wavelet-based face verification on a PDA recorded database

    NASA Astrophysics Data System (ADS)

    Sellahewa, Harin; Jassim, Sabah A.

    2006-05-01

    The rise of international terrorism and the rapid increase in fraud and identity theft have added urgency to the task of developing biometric-based person identification as a reliable alternative to conventional authentication methods. Human identification based on face images is a tough challenge in comparison to identification based on fingerprints or iris recognition. Yet, due to its unobtrusive nature, face recognition is the preferred method of identification for security-related applications. The success of such systems will depend on the support of massive infrastructures. Current mobile communication devices (3G smart phones) and PDAs are equipped with a camera which can capture both still images and streaming video clips, and a touch-sensitive display panel. Besides convenience, such devices provide an adequate secure infrastructure for sensitive and financial transactions, by protecting against fraud and repudiation while ensuring accountability. Biometric authentication systems for mobile devices would have obvious advantages in conflict scenarios when communication from beyond enemy lines is essential to save soldier and civilian lives. In areas of conflict or disaster the luxury of fixed infrastructure is not available or is destroyed. In this paper, we present a wavelet-based face verification scheme that has been specifically designed and implemented on a currently available PDA. We report on its performance on the benchmark audio-visual BANCA database and on a newly developed PDA-recorded audio-visual database that includes indoor and outdoor recordings.
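The wavelet front-end of such a verification scheme can be sketched with one level of the 2D Haar transform, the simplest wavelet; the low-pass (LL) subband gives a compact feature cheap enough to compute on a PDA. The paper's actual wavelet choice and matcher may differ — this is a generic illustration.

```python
def haar2d(img):
    """One level of the 2D Haar wavelet transform, returning the low-pass
    (LL) subband: the coarse approximation often used as a compact face
    feature in wavelet-based verification."""
    # Low-pass filter along rows: average adjacent pixel pairs.
    rows = [[(a + b) / 2 for a, b in zip(r[0::2], r[1::2])] for r in img]
    # Then low-pass along columns of the row-filtered image.
    cols = list(zip(*rows))
    ll = [[(a + b) / 2 for a, b in zip(c[0::2], c[1::2])] for c in cols]
    return [list(r) for r in zip(*ll)]  # transpose back to row-major

def distance(f1, f2):
    """Verification score: Euclidean distance between two LL feature maps;
    a claim is accepted when the distance falls below a tuned threshold."""
    return sum((a - b) ** 2 for r1, r2 in zip(f1, f2) for a, b in zip(r1, r2)) ** 0.5
```

Each level halves both image dimensions, so a few levels shrink a camera frame to a feature vector small enough for on-device matching.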

  12. Design and implementation of a nuclear weapons management system submodule: Shipboard security force system. Master's thesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Settlemyer, S.R.

    1991-09-01

    The Nuclear Weapons Management System combines the strengths of an expert system with the flexibility of a database management system to assist the Weapons Officer, Security Officer, and the Personnel Reliability Program Officer in the performance of administrative duties associated with the nuclear weapons programs in the United States Navy. This thesis examines the need for, and ultimately the design of, a system that will assist the Security Officer in administrative duties associated with the Shipboard Self Defense Force. This system, designed and coded utilizing dBASE IV, can be implemented as a stand-alone system. Furthermore, it interfaces with the expert system submodule that handles the PRP screening process.

  13. Proceedings of the IFIP WG 11.3 Working Conference on Database Security (6th) Held in Vancouver, British Columbia on 19-22 August 1992.

    DTIC Science & Technology

    1992-01-01

    multiversioning scheme for this purpose was presented in [9]. The scheme guarantees that high level methods would read down object states at lower levels that...order given by fork-stamp, and terminated writing versions with timestamp WStamp. Such a history is needed to implement the multiversioning scheme...recovery protocol for multiversion schedulers and show that this protocol is both correct and secure. The behavior of the recovery protocol depends

  14. Security Behavior Observatory: Infrastructure for Long-term Monitoring of Client Machines

    DTIC Science & Technology

    2014-07-14

    desired data. In Windows, this is most often a .NET language (e.g., C#, PowerShell), a command-line batch script, or Java. 3) Least privilege: To ensure...modules are written in Java, and thus should be easily portable to any OS. B. Deployment There are several high-level requirements the SBO must meet...practically feasible with such solutions. Instead, one researcher with access to all the clients' keys (stored in an isolated and secured MySQL database

  15. Efficient Aho-Corasick String Matching on Emerging Multicore Architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tumeo, Antonino; Villa, Oreste; Secchi, Simone

    String matching algorithms are critical to several scientific fields. Besides text processing and databases, emerging applications such as DNA and protein sequence analysis, data mining, information security software, antivirus, and machine learning all exploit string matching algorithms [3]. All these applications usually process large quantities of textual data and require high performance and/or predictable execution times. Among all the string matching algorithms, one of the most studied, especially for text processing and security applications, is the Aho-Corasick algorithm. Aho-Corasick is an exact, multi-pattern string matching algorithm which performs the search in time linearly proportional to the length of the input text, independently of the pattern set size. However, depending on the implementation, when the number of patterns increases, the memory occupation may rise drastically. In turn, this can lead to significant variability in performance, due to memory access times and caching effects. This is a significant concern for many mission-critical applications and modern high-performance architectures. For example, security applications such as Network Intrusion Detection Systems (NIDS) must be able to scan network traffic against very large dictionaries in real time. Modern Ethernet links reach up to 10 Gbps, and malicious threats are already well over 1 million and exponentially growing [28]. When performing the search, a NIDS should not slow down the network or let network packets pass unchecked. Nevertheless, on current state-of-the-art cache-based processors, there may be large performance variability when dealing with big dictionaries and inputs that have different frequencies of matching patterns. In particular, when few patterns are matched and they are all in the cache, the procedure is fast.
    Instead, when they are not in the cache, often because many patterns are matched and the caches are continuously thrashed, they must be retrieved from the system memory and the procedure is slowed down by the increased latency. Efficient implementations of string matching algorithms have been the focus of several works, targeting Field Programmable Gate Arrays [4, 25, 15, 5], highly multi-threaded solutions like the Cray XMT [34], multicore processors [19], or heterogeneous processors like the Cell Broadband Engine [35, 22]. Recently, several researchers have also started to investigate the use of Graphics Processing Units (GPUs) for string matching algorithms in security applications [20, 10, 32, 33]. Most of these approaches mainly focus on reaching high peak performance, or try to optimize the memory occupation, rather than looking at performance stability. However, hardware solutions support only small dictionary sizes due to lack of memory and are difficult to customize, while platforms such as the Cell/B.E. are very complex to program.
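
    The linear-time guarantee mentioned above comes from the Aho-Corasick automaton: a trie of the patterns augmented with failure links, so the text is scanned exactly once. A minimal sketch of the classic construction (illustrative, not the optimized implementations surveyed in the chapter):

```python
from collections import deque

def build_automaton(patterns):
    # One dict of edges, one failure link, and one output set per node.
    goto, fail, out = [{}], [0], [set()]
    for pat in patterns:
        node = 0
        for ch in pat:
            if ch not in goto[node]:
                goto.append({}); fail.append(0); out.append(set())
                goto[node][ch] = len(goto) - 1
            node = goto[node][ch]
        out[node].add(pat)
    # Breadth-first pass sets failure links and merges output sets,
    # so matching never has to re-scan the text.
    q = deque(goto[0].values())
    while q:
        node = q.popleft()
        for ch, nxt in goto[node].items():
            q.append(nxt)
            f = fail[node]
            while f and ch not in goto[f]:
                f = fail[f]
            fail[nxt] = goto[f].get(ch, 0)
            out[nxt] |= out[fail[nxt]]
    return goto, fail, out

def search(text, automaton):
    # Single left-to-right scan: time is linear in len(text) plus the
    # number of matches, independent of how many patterns were loaded.
    goto, fail, out = automaton
    node, matches = 0, []
    for i, ch in enumerate(text):
        while node and ch not in goto[node]:
            node = fail[node]
        node = goto[node].get(ch, 0)
        for pat in out[node]:
            matches.append((i - len(pat) + 1, pat))
    return matches
```

    The memory-variability concern in the abstract is visible here: every node carries an edge table, so large dictionaries inflate the automaton and stress the caches even though the scan itself stays linear.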

  16. The ESID Online Database network.

    PubMed

    Guzman, D; Veit, D; Knerr, V; Kindle, G; Gathmann, B; Eades-Perner, A M; Grimbacher, B

    2007-03-01

    Primary immunodeficiencies (PIDs) belong to the group of rare diseases. The European Society for Immunodeficiencies (ESID) is establishing an innovative European patient and research database network for continuous long-term documentation of patients, in order to improve the diagnosis, classification, prognosis and therapy of PIDs. The ESID Online Database is a web-based system aimed at data storage, data entry, reporting and the import of pre-existing data sources in an enterprise business-to-business integration (B2B). The online database is based on the Java 2 Enterprise Edition (J2EE) platform with high-standard security features, which comply with data protection laws and the demands of a modern research platform. The ESID Online Database is accessible via the official website (http://www.esid.org/). Supplementary data are available at Bioinformatics online.

  17. Design of Cancelable Palmprint Templates Based on Look Up Table

    NASA Astrophysics Data System (ADS)

    Qiu, Jian; Li, Hengjian; Dong, Jiwen

    2018-03-01

    A novel cancelable palmprint template generation scheme is proposed in this paper. Firstly, a Gabor filter and a chaotic matrix are used to extract palmprint features. The feature map is then arranged into a row vector and divided into equal-size blocks. These blocks are converted to corresponding decimals and mapped through look-up tables, forming the final cancelable palmprint features based on the selected check bits. Finally, collaborative representation based classification with regularized least squares is used for classification. Experimental results on the Hong Kong PolyU Palmprint Database verify that the proposed cancelable templates can achieve very high performance and security levels. Meanwhile, the scheme can also satisfy the needs of real-time applications.

  18. Visible School Security Measures and Student Academic Performance, Attendance, and Postsecondary Aspirations

    ERIC Educational Resources Information Center

    Tanner-Smith, Emily E.; Fisher, Benjamin W.

    2015-01-01

    Many U.S. schools use visible security measures (security cameras, metal detectors, security personnel) in an effort to keep schools safe and promote adolescents' academic success. This study examined how different patterns of visible security utilization were associated with U.S. middle and high school students' academic performance, attendance,…

  19. Comparative study of minutiae selection algorithms for ISO fingerprint templates

    NASA Astrophysics Data System (ADS)

    Vibert, B.; Charrier, C.; Le Bars, J.-M.; Rosenberger, C.

    2015-03-01

    We address the selection of fingerprint minutiae given a fingerprint ISO template. Minutiae selection plays a very important role when a secure element (e.g., a smart card) is used. Because of its limited computation and memory capabilities, the number of minutiae in a reference stored on the secure element is limited. We propose in this paper a comparative study of 6 minutiae selection methods, including 2 methods from the literature and 1 used as a reference (no selection). Experimental results on 3 fingerprint databases from the Fingerprint Verification Competition show their relative efficiency in terms of performance and computation time.

  20. A national cross-sectional analysis of dermatology away rotations using the Visiting Student Application Service database.

    PubMed

    Cao, Severine Z; Nambudiri, Vinod E

    2017-12-15

    The highly competitive nature of the dermatology match requires applicants to undertake a variety of measures in the hopes of securing a residency position. Among the opportunities available to applicants is the chance to participate in away or "audition" rotations during their final year of undergraduate medical education. Away rotations are now performed by a majority of medical students applying into dermatology, but little research has been done to describe the nature of this opportunity for interested applicants. An analysis of all dermatology electives offered in the Visiting Student Application Service (VSAS) database was performed. Results indicate that students have the option to pursue electives in a variety of subjects offered by 100 sponsoring institutions spread across a wide geographic distribution. Although many opportunities exist, this analysis sheds light on several areas for improving the quality of this experience for interested applicants, including providing more electives in advanced subject matter, permitting more flexibility in scheduling, and promoting wider participation in VSAS.

  1. Quantifying the Correctness, Computational Complexity, and Security of Privacy-Preserving String Comparators for Record Linkage

    PubMed Central

    Durham, Elizabeth; Xue, Yuan; Kantarcioglu, Murat; Malin, Bradley

    2011-01-01

    Record linkage is the task of identifying records from disparate data sources that refer to the same entity. It is an integral component of data processing in distributed settings, where the integration of information from multiple sources can prevent duplication and enrich overall data quality, thus enabling more detailed and correct analysis. Privacy-preserving record linkage (PPRL) is a variant of the task in which data owners wish to perform linkage without revealing identifiers associated with the records. This task is desirable in various domains, including healthcare, where it may not be possible to reveal patient identity due to confidentiality requirements, and in business, where it could be disadvantageous to divulge customers' identities. To perform PPRL, it is necessary to apply string comparators that function in the privacy-preserving space. A number of privacy-preserving string comparators (PPSCs) have been proposed, but little research has compared them in the context of a real record linkage application. This paper performs a principled and comprehensive evaluation of six PPSCs in terms of three key properties: 1) correctness of record linkage predictions, 2) computational complexity, and 3) security. We utilize a real publicly-available dataset, derived from the North Carolina voter registration database, to evaluate the tradeoffs between the aforementioned properties. Among our results, we find that PPSCs that partition, encode, and compare strings yield highly accurate record linkage results. However, as a tradeoff, we observe that such PPSCs are less secure than those that map and compare strings in a reduced dimensional space. PMID:22904698
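
    The "partition, encode, and compare" family of PPSCs evaluated here is commonly illustrated by Bloom-filter encodings of character bigrams compared with the Dice coefficient. The sketch below shows that generic idea, not the exact comparators from the paper; the filter size `m` and hash count `k` are illustrative:

```python
import hashlib

def bigrams(name):
    padded = f"_{name.lower()}_"          # pad so edge letters form bigrams
    return {padded[i:i + 2] for i in range(len(padded) - 1)}

def bloom_encode(name, m=100, k=5):
    # Hash every bigram into k positions of an m-bit Bloom filter;
    # only the filter (the set of set bit positions) leaves the data owner.
    bits = set()
    for gram in bigrams(name):
        for i in range(k):
            digest = hashlib.sha256(f"{i}:{gram}".encode()).digest()
            bits.add(int.from_bytes(digest, "big") % m)
    return bits

def dice_similarity(a, b):
    # Dice coefficient of the two encodings: 1.0 means identical filters.
    return 2 * len(a & b) / (len(a) + len(b))
```

    The tradeoff the paper measures shows up directly: similar strings share bigrams and hence bits (good linkage accuracy), but that same structure is what frequency-based attacks exploit, making these encodings less secure than dimension-reducing comparators.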

  2. Quantifying the Correctness, Computational Complexity, and Security of Privacy-Preserving String Comparators for Record Linkage.

    PubMed

    Durham, Elizabeth; Xue, Yuan; Kantarcioglu, Murat; Malin, Bradley

    2012-10-01

    Record linkage is the task of identifying records from disparate data sources that refer to the same entity. It is an integral component of data processing in distributed settings, where the integration of information from multiple sources can prevent duplication and enrich overall data quality, thus enabling more detailed and correct analysis. Privacy-preserving record linkage (PPRL) is a variant of the task in which data owners wish to perform linkage without revealing identifiers associated with the records. This task is desirable in various domains, including healthcare, where it may not be possible to reveal patient identity due to confidentiality requirements, and in business, where it could be disadvantageous to divulge customers' identities. To perform PPRL, it is necessary to apply string comparators that function in the privacy-preserving space. A number of privacy-preserving string comparators (PPSCs) have been proposed, but little research has compared them in the context of a real record linkage application. This paper performs a principled and comprehensive evaluation of six PPSCs in terms of three key properties: 1) correctness of record linkage predictions, 2) computational complexity, and 3) security. We utilize a real publicly-available dataset, derived from the North Carolina voter registration database, to evaluate the tradeoffs between the aforementioned properties. Among our results, we find that PPSCs that partition, encode, and compare strings yield highly accurate record linkage results. However, as a tradeoff, we observe that such PPSCs are less secure than those that map and compare strings in a reduced dimensional space.

  3. CICS Region Virtualization for Cost Effective Application Development

    ERIC Educational Resources Information Center

    Khan, Kamal Waris

    2012-01-01

    Mainframe is used for hosting large commercial databases, transaction servers and applications that require a greater degree of reliability, scalability and security. Customer Information Control System (CICS) is a mainframe software framework for implementing transaction services. It is designed for rapid, high-volume online processing. In order…

  4. Cryptographically secure biometrics

    NASA Astrophysics Data System (ADS)

    Stoianov, A.

    2010-04-01

    Biometric systems usually do not possess a cryptographic level of security: it has been deemed impossible to perform biometric authentication in the encrypted domain because of the natural variability of biometric samples and the intolerance of cryptography to even a single bit error. Encrypted biometric data need to be decrypted on authentication, which creates privacy and security risks. On the other hand, the known solutions called "Biometric Encryption (BE)" or "Fuzzy Extractors" can be cracked by various attacks, for example, by running a database of images offline against the stored helper data in order to obtain a false match. In this paper, we present a novel approach which combines Biometric Encryption with the classical Blum-Goldwasser cryptosystem. In the "Client - Service Provider (SP)" or "Client - Database - SP" architecture it is possible to keep the biometric data encrypted at all stages of storage and authentication, so that the SP never has access to unencrypted biometric data. It is shown that this approach is suitable for two of the most popular BE schemes, Fuzzy Commitment and Quantized Index Modulation (QIM). The approach has clear practical advantages over biometric systems using "homomorphic encryption". Future work will deal with the application of the proposed solution to one-to-many biometric systems.
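
    Fuzzy Commitment, one of the two BE schemes named above, binds a secret key to a biometric through an error-correcting code: the stored helper data is the codeword XORed with the enrollment sample, and verification succeeds when a fresh sample lies within the code's correction radius. A toy sketch using a repetition code (real deployments use stronger codes such as BCH; all parameters here are illustrative):

```python
import hashlib

REP = 5  # repetition factor: each key bit is stored 5x and majority-decoded

def encode(key_bits):
    return [b for b in key_bits for _ in range(REP)]

def decode(code_bits):
    # Majority vote inside each group of REP bits corrects small errors.
    return [int(sum(code_bits[i:i + REP]) > REP // 2)
            for i in range(0, len(code_bits), REP)]

def commit(key_bits, biometric_bits):
    # Helper data hides the codeword under the biometric; only the
    # helper and a hash of the key are stored, never the biometric.
    codeword = encode(key_bits)
    helper = [c ^ b for c, b in zip(codeword, biometric_bits)]
    tag = hashlib.sha256(bytes(key_bits)).hexdigest()
    return helper, tag

def verify(helper, tag, biometric_bits):
    # A fresh sample close to the enrolled one yields a noisy codeword
    # inside the correction radius, so the same key is recovered.
    noisy_codeword = [h ^ b for h, b in zip(helper, biometric_bits)]
    recovered = decode(noisy_codeword)
    return hashlib.sha256(bytes(recovered)).hexdigest() == tag
```

    The offline attack the abstract warns about consists of replaying candidate `biometric_bits` from an image database against stored `(helper, tag)` pairs until `verify` succeeds, which is what motivates keeping even the helper data encrypted.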

  5. 78 FR 43890 - Privacy Act of 1974; Department of Homeland Security, Federal Emergency Management Agency-006...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-22

    ... titled, ``Department of Homeland Security/Federal Emergency Management Agency--006 Citizen Corps Database...) authorities; (5) purpose; (6) routine uses of information; (7) system manager and address; (8) notification... Database'' and retitle it ``DHS/FEMA--006 Citizen Corps Program System of Records.'' FEMA administers the...

  6. [Comparing different treatments for displaced femoral neck fractures in the elderly: a meta-analysis].

    PubMed

    Zhao, Wenbo; Tu, Chongqi; Zhang, Hui; Fang, Yue; Wang, Guanglin; Liu, Lei

    2014-04-01

    To compare the effectiveness and safety of internal fixation versus total hip arthroplasty for elderly patients with displaced femoral neck fractures through a meta-analysis. Studies comparing internal fixation and total hip arthroplasty for elderly patients with displaced femoral neck fractures were identified from the PubMed, EMBase, Cochrane Library, CBM, CNKI and MEDLINE databases. Data analyses were performed using RevMan 5.2.6 (the Cochrane Collaboration). Six published randomized controlled trials including 627 patients were suitable for the review: 286 cases in the internal fixation group and 341 cases in the total hip arthroplasty group. The results of the meta-analysis indicated that statistically significant differences were observed between the two groups in quality of life as reflected by the Harris scale (RR = 0.82, 95%CI: 0.72-0.93, P < 0.05), the reoperation rate (RR = 5.81, 95%CI: 3.09-10.95, P < 0.05) and the major complication rate (RR = 3.60, 95%CI: 2.29-5.67, P < 0.05) postoperatively. There was no difference in mortality at 1 year and 5 years postoperatively (P > 0.05). For elderly patients with displaced femoral neck fractures, there is no statistical difference between the two groups in postoperative mortality, but quality of life and operative safety in the internal fixation group are worse than in the total hip arthroplasty group.

  7. Adaptive weighted local textural features for illumination, expression, and occlusion invariant face recognition

    NASA Astrophysics Data System (ADS)

    Cui, Chen; Asari, Vijayan K.

    2014-03-01

    Biometric features such as fingerprints, iris patterns, and face features help to identify people and restrict access to secure areas by performing advanced pattern analysis and matching. Face recognition is one of the most promising biometric methodologies for human identification in a non-cooperative security environment. However, the recognition results obtained by face recognition systems are affected by several variations that may happen to the patterns in an unrestricted environment. As a result, several algorithms have been developed for extracting different facial features for face recognition. Due to the various possible challenges of data captured at different lighting conditions, viewing angles, facial expressions, and partial occlusions in natural environmental conditions, automatic facial recognition still remains a difficult issue that needs to be resolved. In this paper, we propose a novel approach to tackling some of these issues by analyzing the local textural descriptions for facial feature representation. The textural information is extracted by an enhanced local binary pattern (ELBP) description of all the local regions of the face. The relationship of each pixel with respect to its neighborhood is extracted and employed to calculate the new representation. ELBP reconstructs a much better textural feature extraction vector from an original gray level image in different lighting conditions. The dimensionality of the texture image is reduced by principal component analysis performed on each local face region. Each low dimensional vector representing a local region is then weighted based on the significance of the sub-region. The weight of each sub-region is determined by employing the local variance estimate of the respective region, which represents the significance of the region. The final facial textural feature vector is obtained by concatenating the reduced dimensional weight sets of all the modules (sub-regions) of the face image.
    Experiments conducted on various popular face databases show promising performance of the proposed algorithm under varying lighting, expression, and partial occlusion conditions. Four databases were used for testing the performance of the proposed system: the Yale Face database, the Extended Yale Face database B, the Japanese Female Facial Expression database, and the CMU AMP Facial Expression database. The experimental results on all four databases show the effectiveness of the proposed system. Also, the computation cost is lower because of the simplified calculation steps. Research work is progressing to investigate the effectiveness of the proposed face recognition method under pose-varying conditions as well. It is envisaged that a multi-lane approach of trained frameworks at different pose bins and an appropriate voting strategy would lead to a good recognition rate in such situations.
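
    ELBP builds on the basic local binary pattern operator: each pixel's 3x3 neighbourhood is thresholded against the centre value and the results are packed into an 8-bit code. A minimal sketch of that basic operator (the paper's enhanced variant adds refinements not shown here):

```python
def lbp_code(img, y, x):
    # Basic 3x3 LBP: compare the 8 neighbours to the centre pixel,
    # clockwise from the top-left, and pack the comparisons into a byte.
    centre = img[y][x]
    neighbours = [img[y - 1][x - 1], img[y - 1][x], img[y - 1][x + 1],
                  img[y][x + 1], img[y + 1][x + 1], img[y + 1][x],
                  img[y + 1][x - 1], img[y][x - 1]]
    return sum(1 << i for i, v in enumerate(neighbours) if v >= centre)
```

    Because the code depends only on orderings relative to the centre, a uniform brightness shift leaves it unchanged, which is the lighting robustness the abstract relies on.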

  8. Neurology diagnostics security and terminal adaptation for PocketNeuro project.

    PubMed

    Chemak, C; Bouhlel, M-S; Lapayre, J-C

    2008-09-01

    This paper presents new approaches to medical information security and mobile phone terminal adaptation for the PocketNeuro project, a project created for the management of neurological diseases. It consists of transmitting information about patients ("desk of patients") to a doctor's mobile phone during a visit and examination of a patient. These new approaches for the PocketNeuro project were analyzed in terms of medical information security and adaptation of the diagnostic images to the doctor's mobile phone. Images were extracted from a DICOM library. Matlab and its library were used as software to test our approaches and to validate our results. Experiments performed on a database of 30 neuronal medical images of 256 x 256 pixels indicated that our new approaches for the PocketNeuro project are valid and support plans for large-scale studies between French and Swiss hospitals using secured connections.

  9. EMRlog method for computer security for electronic medical records with logic and data mining.

    PubMed

    Martínez Monterrubio, Sergio Mauricio; Frausto Solis, Juan; Monroy Borja, Raúl

    2015-01-01

    The proper functioning of a hospital computer system is arduous work for managers and staff. However, inconsistent policies are frequent and can produce enormous problems, such as stolen information, frequent failures, and loss of all or part of the hospital data. This paper presents a new method named EMRlog for computer security systems in hospitals. EMRlog is focused on two kinds of security policies: directive and implemented policies. Security policies are applied to computer systems that handle huge amounts of information such as databases, applications, and medical records. Firstly, a syntactic verification step is applied using predicate logic. Then data mining techniques are used to detect which security policies have really been implemented by the computer systems staff. Subsequently, consistency is verified in both kinds of policies; in addition, these subsets are contrasted and validated. This is performed by an automatic theorem prover. Thus, many kinds of vulnerabilities can be removed, achieving a safer computer system.

  10. EMRlog Method for Computer Security for Electronic Medical Records with Logic and Data Mining

    PubMed Central

    Martínez Monterrubio, Sergio Mauricio; Frausto Solis, Juan; Monroy Borja, Raúl

    2015-01-01

    The proper functioning of a hospital computer system is arduous work for managers and staff. However, inconsistent policies are frequent and can produce enormous problems, such as stolen information, frequent failures, and loss of all or part of the hospital data. This paper presents a new method named EMRlog for computer security systems in hospitals. EMRlog is focused on two kinds of security policies: directive and implemented policies. Security policies are applied to computer systems that handle huge amounts of information such as databases, applications, and medical records. Firstly, a syntactic verification step is applied using predicate logic. Then data mining techniques are used to detect which security policies have really been implemented by the computer systems staff. Subsequently, consistency is verified in both kinds of policies; in addition, these subsets are contrasted and validated. This is performed by an automatic theorem prover. Thus, many kinds of vulnerabilities can be removed, achieving a safer computer system. PMID:26495300

  11. A review of materials for spectral design coatings in signature management applications

    NASA Astrophysics Data System (ADS)

    Andersson, Kent E.; Škerlind, Christina

    2014-10-01

    The current focus in Swedish policy on national security and high-end technical systems, together with the rapid development of multispectral sensor technology, adds to the utility of developing advanced materials for spectral design in signature management applications. A literature study was performed probing research databases for advancements. Qualitative text analysis was performed using a six-indicator instrument: spectrally selective reflectance, low gloss, low degree of polarization, low infrared emissivity, non-destructive properties in radar, and general controllability of optical properties. Trends are identified and the most interesting materials and coating designs are presented with relevant performance metrics. They are sorted into categories in order of increasing complexity: pigments and paints, one-dimensional structures, multidimensional structures (including photonic crystals), and lastly biomimetic materials and metamaterials. The military utility of the coatings is assessed qualitatively. The need to develop a framework for assessing the military utility of incrementally increasing the performance of spectrally selective coatings is identified.

  12. Visible School Security Measures and Student Academic Performance, Attendance, and Postsecondary Aspirations.

    PubMed

    Tanner-Smith, Emily E; Fisher, Benjamin W

    2016-01-01

    Many U.S. schools use visible security measures (security cameras, metal detectors, security personnel) in an effort to keep schools safe and promote adolescents' academic success. This study examined how different patterns of visible security utilization were associated with U.S. middle and high school students' academic performance, attendance, and postsecondary educational aspirations. The data for this study came from two large national surveys--the School Crime Supplement to the National Crime Victimization Survey (N = 38,707 students; 51% male, 77% White, MAge = 14.72) and the School Survey on Crime and Safety (N = 10,340 schools; average student composition of 50% male, 57% White). The results provided no evidence that visible security measures had consistent beneficial effects on adolescents' academic outcomes; some security utilization patterns had modest detrimental effects on adolescents' academic outcomes, particularly the heavy surveillance patterns observed in a small subset of high schools serving predominantly low socioeconomic students. The findings of this study provide no evidence that visible security measures have any sizeable effects on academic performance, attendance, or postsecondary aspirations among U.S. middle and high school students.

  13. Breach Risk Magnitude: A Quantitative Measure of Database Security.

    PubMed

    Yasnoff, William A

    2016-01-01

    A quantitative methodology is described that provides objective evaluation of the potential for health record system breaches. It assumes that breach risk increases with the number of potential records that could be exposed, while it decreases when more authentication steps are required for access. The breach risk magnitude (BRM) is the maximum value for any system user of the common logarithm of the number of accessible database records divided by the number of authentication steps needed to achieve such access. For a one million record relational database, the BRM varies from 5.52 to 6 depending on authentication protocols. For an alternative data architecture designed specifically to increase security by separately storing and encrypting each patient record, the BRM ranges from 1.3 to 2.6. While the BRM only provides a limited quantitative assessment of breach risk, it may be useful to objectively evaluate the security implications of alternative database organization approaches.
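
    The BRM definition above divides the common logarithm of the accessible record count by the number of authentication steps and takes the maximum over users. It can be stated directly in code; the per-user `(records, auth_steps)` profile format is an illustrative assumption:

```python
from math import log10

def breach_risk_magnitude(access_profiles):
    """Breach risk magnitude as described above.

    access_profiles: iterable of (records_accessible, auth_steps) pairs,
    one per system user; auth_steps is the number of authentication
    steps that user must pass to reach those records.
    """
    return max(log10(records) / steps for records, steps in access_profiles)
```

    With a single-step user able to read all of a one-million-record database, this yields the paper's upper figure of 6; adding authentication steps or partitioning the data drives the maximum down.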

  14. A Summary of the Naval Postgraduate School Research Program

    DTIC Science & Technology

    1989-08-30

    Fundamental Theory for Automatically Combining Changes to Software Systems...Database-System Approach to Software Engineering Environments (SEEs)...Multilevel Database Security...Temporal Database Management and Real-Time Database Computers...The Multi-lingual, Multi-Model, Multi-Backend Database

  15. School Security: For Whom and with What Results?

    ERIC Educational Resources Information Center

    Servoss, Timothy J.; Finn, Jeremy D.

    2014-01-01

    This study utilized school-level data from several combined national databases to address two questions regarding school security policy: (1) What are the school characteristics related to levels of security? (2) How does security relate to school suspension, dropout, and college attendance rates? Among the predictors of school security, having a…

  16. Line-scan system for continuous hand authentication

    NASA Astrophysics Data System (ADS)

    Liu, Xiaofeng; Kong, Lingsheng; Diao, Zhihui; Jia, Ping

    2017-03-01

    An increasing amount of heavy machinery and vehicles has come into service, giving rise to significant concern over protecting these high-security systems from misuse. Conventionally, authentication performed merely at the initial login may not be sufficient for detecting intruders throughout the operating session. To address this critical security flaw, a line-scan continuous hand authentication system with the appearance of an operating rod is proposed. Given that the operating rod is held throughout the operating period, it offers a possible solution for unobtrusively recording personal characteristics for continuous monitoring. Ergonomics, in both its physiological and psychological aspects, is fully considered. Under the shape constraints, a highly integrated line-scan sensor, a controller unit, and a gear motor with encoder are utilized. The system is suitable for both desktop and embedded platforms with a universal serial bus interface. The volume of the proposed system is smaller than 15% of that of current multispectral area-based camera systems. Based on experiments on a database of 4000 images from 200 volunteers, a competitive equal error rate of 0.1179% is achieved, which is far more accurate than state-of-the-art continuous authentication systems using other modalities.

  17. Organizing a breast cancer database: data management.

    PubMed

    Yi, Min; Hunt, Kelly K

    2016-06-01

    Developing and organizing a breast cancer database can provide data and serve as a valuable research tool for those interested in the etiology, diagnosis, and treatment of cancer. Depending on the research setting, the quality of the data can be a major issue. Ensuring that the data collection process does not introduce inaccuracies helps to assure the overall quality of subsequent analyses. Data management involves the planning, development, implementation, and administration of systems for the acquisition, storage, and retrieval of data, while protecting the data with high security levels. A properly designed database provides access to up-to-date, accurate information. Database design is an important component of application design: if you take the time to design your databases properly, you will be rewarded with a solid application foundation on which you can build the rest of your application.

  18. Review of Aircraft Crash Databases and Evaluation of the Probability of Aircraft Crashes on to a MAGLEV Guide-way: Technical Report

    DOT National Transportation Integrated Search

    1991-12-09

    The System Safety & Security Division at The Volpe National Transportation System Center (VNTSC), Cambridge, MA is participating in an overall risk assessment study on the safety of High Speed Magnetic Levitation Transportation Systems ("MagLev"). Tr...

  19. Secure UNIX socket-based controlling system for high-throughput protein crystallography experiments.

    PubMed

    Gaponov, Yurii; Igarashi, Noriyuki; Hiraki, Masahiko; Sasajima, Kumiko; Matsugaki, Naohiro; Suzuki, Mamoru; Kosuge, Takashi; Wakatsuki, Soichi

    2004-01-01

    A control system for high-throughput protein crystallography experiments has been developed based on multilevel secure (SSL v2/v3) UNIX sockets under the Linux operating system. The main features of protein crystallography experiments (purification, crystallization, loop preparation, data collection, data processing) are dealt with by the software. All information necessary to perform protein crystallography experiments is stored in a relational database (MySQL), except the raw X-ray data, which are stored on a network file server. The system consists of several servers and clients. TCP/IP secure UNIX sockets with four predefined behaviors [(a) listening to a request followed by a reply, (b) sending a request and waiting for a reply, (c) listening to a broadcast message, and (d) sending a broadcast message] support communications between all servers and clients, allowing one to control experiments, view data, edit experimental conditions and perform data processing remotely. The interface software is well suited for developing well-organized control software with a hierarchical structure of different software units (Gaponov et al., 1998) that pass and receive different types of information. All communication is divided into two levels: low and top. Large and complicated control tasks are split into several smaller ones, which can be processed by control clients independently. For communicating with experimental equipment (beamline optical elements, robots, specialized experimental equipment, etc.), the STARS server, developed at the Photon Factory, is used (Kosuge et al., 2002). The STARS server allows any application with an open socket to be connected with any other clients that control experimental equipment. The majority of the source code is written in C/C++. The GUI modules of the system were built mainly using the Glade user interface builder for GTK+ and GNOME under the Red Hat Linux 7.1 operating system.
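
    Behaviors (a) and (b) above form a plain request/reply pattern over TCP sockets. A minimal sketch follows (all names are illustrative; a production system would wrap both ends in an ssl.SSLContext to obtain the SSL-secured transport the abstract describes, and behaviors (c) and (d) would add broadcast delivery to multiple listeners):

```python
import queue
import socket
import threading

def serve_once(handler, port_q):
    # Behavior (a): listen for one request, then send one reply.
    with socket.create_server(("127.0.0.1", 0)) as srv:
        port_q.put(srv.getsockname()[1])      # report the OS-chosen port
        conn, _ = srv.accept()
        with conn:
            request_text = conn.recv(4096).decode()
            conn.sendall(handler(request_text).encode())

def request(port, message):
    # Behavior (b): send a request and wait for the reply.
    with socket.create_connection(("127.0.0.1", port)) as client:
        client.sendall(message.encode())
        client.shutdown(socket.SHUT_WR)       # signal end of request
        return client.recv(4096).decode()
```

    Splitting a large control task then amounts to running several such servers, one per software unit, with clients issuing independent requests against each.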

  20. A Hybrid Approach to Protect Palmprint Templates

    PubMed Central

    Sun, Dongmei; Xiong, Ke; Qiu, Zhengding

    2014-01-01

Biometric template protection is indispensable for protecting personal privacy in large-scale deployments of biometric systems. Accuracy, changeability, and security are three critical requirements for template protection algorithms, yet existing algorithms cannot satisfy all three well. In this paper, we propose a hybrid approach that combines random projection and fuzzy vault to improve performance on all three criteria. A heterogeneous space is designed to combine random projection and fuzzy vault properly in the hybrid scheme, and a new chaff point generation method is proposed to enhance the security of the heterogeneous vault. Theoretical analyses of the proposed hybrid approach in terms of accuracy, changeability, and security are given in this paper. Experimental results on a palmprint database support the theoretical analyses and demonstrate the effectiveness of the proposed hybrid approach. PMID:24982977

  1. A hybrid approach to protect palmprint templates.

    PubMed

    Liu, Hailun; Sun, Dongmei; Xiong, Ke; Qiu, Zhengding

    2014-01-01

Biometric template protection is indispensable for protecting personal privacy in large-scale deployments of biometric systems. Accuracy, changeability, and security are three critical requirements for template protection algorithms, yet existing algorithms cannot satisfy all three well. In this paper, we propose a hybrid approach that combines random projection and fuzzy vault to improve performance on all three criteria. A heterogeneous space is designed to combine random projection and fuzzy vault properly in the hybrid scheme, and a new chaff point generation method is proposed to enhance the security of the heterogeneous vault. Theoretical analyses of the proposed hybrid approach in terms of accuracy, changeability, and security are given in this paper. Experimental results on a palmprint database support the theoretical analyses and demonstrate the effectiveness of the proposed hybrid approach.

  2. A Molecular Framework for Understanding DCIS

    DTIC Science & Technology

    2016-10-01

    well. Pathologic and Clinical Annotation Database A clinical annotation database titled the Breast Oncology Database has been established to...complement the procured SPORE sample characteristics and annotated pathology data. This Breast Oncology Database is an offsite clinical annotation...database adheres to CSMC Enterprise Information Services (EIS) research database security standards. The Breast Oncology Database consists of: 9 Baseline

  3. Clinical records anonymisation and text extraction (CRATE): an open-source software system.

    PubMed

    Cardinal, Rudolf N

    2017-04-26

    Electronic medical records contain information of value for research, but contain identifiable and often highly sensitive confidential information. Patient-identifiable information cannot in general be shared outside clinical care teams without explicit consent, but anonymisation/de-identification allows research uses of clinical data without explicit consent. This article presents CRATE (Clinical Records Anonymisation and Text Extraction), an open-source software system with separable functions: (1) it anonymises or de-identifies arbitrary relational databases, with sensitivity and precision similar to previous comparable systems; (2) it uses public secure cryptographic methods to map patient identifiers to research identifiers (pseudonyms); (3) it connects relational databases to external tools for natural language processing; (4) it provides a web front end for research and administrative functions; and (5) it supports a specific model through which patients may consent to be contacted about research. Creation and management of a research database from sensitive clinical records with secure pseudonym generation, full-text indexing, and a consent-to-contact process is possible and practical using entirely free and open-source software.
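The pseudonym mapping described in point (2) is typically realized with a keyed one-way hash such as HMAC. A minimal sketch of the idea in Python (illustrative only; the key, identifier format, and truncation below are assumptions, not CRATE's actual configuration):

```python
import hmac
import hashlib

# Illustrative site secret: kept outside the research database, since anyone
# holding it could recompute the identifier-to-pseudonym mapping.
SECRET_KEY = b"site-specific secret key"

def research_id(patient_id: str) -> str:
    """Map a patient identifier to a stable research pseudonym.

    The same patient_id always yields the same pseudonym, so records link
    correctly across tables, but without the secret key the mapping cannot
    be reversed or recomputed from the pseudonym alone.
    """
    digest = hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]  # truncated here only for readability

rid1 = research_id("NHS-1234567")  # hypothetical identifier
rid2 = research_id("NHS-1234567")
assert rid1 == rid2                         # deterministic
assert rid1 != research_id("NHS-7654321")   # distinct patients, distinct pseudonyms
```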

  4. Bigdata Driven Cloud Security: A Survey

    NASA Astrophysics Data System (ADS)

    Raja, K.; Hanifa, Sabibullah Mohamed

    2017-08-01

Cloud Computing (CC) is a fast-growing technology for performing massive-scale and complex computing. It eliminates the need to maintain expensive computing hardware, dedicated space, and software. Recently, massive growth has been observed in the scale of data, or big data, generated through cloud computing. CC consists of a front end, which includes the users' computers and the software required to access the cloud network, and a back end, which consists of the computers, servers and database systems that create the cloud. The traditional cloud ecosystem delivers its services through SaaS (Software as a Service, where end users utilize outsourced software), PaaS (Platform as a Service, where a platform is provided), IaaS (Infrastructure as a Service, where the physical environment is outsourced), and DaaS (Database as a Service, where data can be housed within a cloud), and this architecture has become powerful and popular. Security threats remain the most vital barrier for the cloud computing environment, and the main barrier to the adoption of CC in health care relates to data security: when placing and transmitting data over public networks, cyber attacks in any form are anticipated. Hence, cloud service users need to understand the risk of data breaches when adopting a service delivery model. This survey covers CC security issues in depth (including data security in health care) so that researchers can develop robust security application models using Big Data (BD) on CC. BD evaluation is driven by fast-growing cloud-based applications developed using virtualized technologies; in this purview, MapReduce [12] is a good example of big data processing in a cloud environment and a model for cloud providers.
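The MapReduce model cited above can be sketched in miniature as a word count, its canonical example. This in-process Python version is illustrative only: it shows the map, shuffle, and reduce phases, and ignores the distribution, fault tolerance, and storage layers a real cloud deployment provides.

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    # map: emit a (word, 1) pair for every word in the input split
    return [(word, 1) for word in document.split()]

def shuffle(pairs):
    # shuffle: group intermediate values by key
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # reduce: aggregate the grouped values for each key
    return {key: sum(values) for key, values in groups.items()}

# Two toy input "splits" standing in for distributed file chunks.
splits = ["cloud security big data", "big data security"]
counts = reduce_phase(shuffle(chain.from_iterable(map(map_phase, splits))))
print(counts["security"])  # 2
```

In a real cluster the map and reduce phases run on many worker nodes, and the shuffle moves intermediate pairs between them over the network.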

  5. Research on computer virus database management system

    NASA Astrophysics Data System (ADS)

    Qi, Guoquan

    2011-12-01

The growing proliferation of computer viruses has become a lethal threat and a research focus of network information security. New viruses keep emerging, the number of viruses is growing, and virus classification is increasingly complex. Virus naming cannot be unified because agencies capture samples at different times, and although each agency has its own virus database, communication between agencies is lacking, virus information is often incomplete, or only a small number of samples are described. This paper introduces the current state of virus database construction at home and abroad, analyzes how to standardize and complete the description of virus characteristics, and then gives a design scheme for a computer virus database with information integrity, storage security, and manageability.
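The naming problem described above suggests a canonical-record-plus-aliases design. A minimal sketch in Python with SQLite (the table and column names are illustrative assumptions, not the paper's actual scheme):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE virus (
    virus_id     INTEGER PRIMARY KEY,
    canonical    TEXT NOT NULL UNIQUE,   -- standardized name
    family       TEXT,
    description  TEXT
);
CREATE TABLE alias (                     -- per-agency names for the same virus
    virus_id     INTEGER REFERENCES virus(virus_id),
    agency       TEXT NOT NULL,
    agency_name  TEXT NOT NULL,
    captured_on  TEXT,                   -- capture date explains naming drift
    UNIQUE (agency, agency_name)
);
""")
conn.execute("INSERT INTO virus (canonical, family) VALUES ('Conficker', 'worm')")
vid = conn.execute("SELECT virus_id FROM virus WHERE canonical='Conficker'").fetchone()[0]
conn.executemany(
    "INSERT INTO alias (virus_id, agency, agency_name, captured_on) VALUES (?, ?, ?, ?)",
    [(vid, "AgencyA", "Win32/Conficker", "2008-11-21"),
     (vid, "AgencyB", "W32.Downadup", "2008-11-22")])

# Different agency names now resolve to one canonical record.
row = conn.execute("""
    SELECT v.canonical, COUNT(a.agency_name)
    FROM virus v JOIN alias a ON a.virus_id = v.virus_id
    GROUP BY v.canonical""").fetchone()
print(row)  # ('Conficker', 2)
```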

  6. Applying World Wide Web technology to the study of patients with rare diseases.

    PubMed

    de Groen, P C; Barry, J A; Schaller, W J

    1998-07-15

Randomized, controlled trials of sporadic diseases are rarely conducted. Recent developments in communication technology, particularly the World Wide Web, allow efficient dissemination and exchange of information. However, software for the identification of patients with a rare disease and subsequent data entry and analysis in a secure Web database is currently not available. To study cholangiocarcinoma, a rare cancer of the bile ducts, we developed a computerized disease tracing system coupled with a database accessible on the Web. The tracing system scans computerized information systems on a daily basis and forwards demographic information on patients with bile duct abnormalities to an electronic mailbox. If informed consent is given, the patient's demographic and preexisting medical information available in medical database servers are electronically forwarded to a UNIX research database. Information from further patient-physician interactions and procedures is also entered into this database. The database is equipped with a Web user interface that allows data entry from various platforms (PC-compatible, Macintosh, and UNIX workstations) anywhere inside or outside our institution. To ensure patient confidentiality and data security, the database includes all security measures required for electronic medical records. The combination of a Web-based disease tracing system and a database has broad applications, particularly for the integration of clinical research within clinical practice and for the coordination of multicenter trials.

  7. Mandatory and Location-Aware Access Control for Relational Databases

    NASA Astrophysics Data System (ADS)

    Decker, Michael

Access control is concerned with determining which operations a particular user is allowed to perform on a particular electronic resource. For example, an access control decision could say that user Alice is allowed to perform the operation read (but not write) on the resource research report. With conventional access control this decision is based on the user's identity, whereas the basic idea of Location-Aware Access Control (LAAC) is to also evaluate a user's current location when deciding whether a particular request should be granted or denied. LAAC is an interesting approach for mobile information systems because these systems are exposed to specific security threats like the loss of a device. Some data models for LAAC can be found in the literature, but almost all of them are based on RBAC and none of them is designed especially for Database Management Systems (DBMS). In this paper we therefore propose a LAAC approach for DBMS and describe a prototypical implementation of that approach based on database triggers.
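The trigger-based idea can be sketched as follows (an illustrative SQLite reconstruction, not the paper's prototype; the table names and the simple zone model are assumptions): a trigger consults the user's current location before allowing a modification.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE session (user_name TEXT, zone TEXT);          -- current location per user
CREATE TABLE report  (title TEXT, owner TEXT, allowed_zone TEXT);

-- Deny UPDATEs on a report unless the owner's current zone matches the
-- zone from which the report may be modified.
CREATE TRIGGER laac_update BEFORE UPDATE ON report
BEGIN
    SELECT RAISE(ABORT, 'location denies access')
    WHERE NOT EXISTS (
        SELECT 1 FROM session
        WHERE user_name = OLD.owner AND zone = OLD.allowed_zone);
END;
""")
conn.execute("INSERT INTO report VALUES ('research report', 'alice', 'office')")
conn.execute("INSERT INTO session VALUES ('alice', 'cafe')")

try:
    conn.execute("UPDATE report SET title = 'edited' WHERE owner = 'alice'")
except sqlite3.IntegrityError as exc:
    print(exc)  # location denies access

# Once Alice is back in the permitted zone, the same update succeeds.
conn.execute("UPDATE session SET zone = 'office' WHERE user_name = 'alice'")
conn.execute("UPDATE report SET title = 'edited' WHERE owner = 'alice'")
```

A production DBMS would obtain the location from a trusted positioning service rather than a self-reported session table.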

  8. Method for secure electronic voting system: face recognition based approach

    NASA Astrophysics Data System (ADS)

    Alim, M. Affan; Baig, Misbah M.; Mehboob, Shahzain; Naseem, Imran

    2017-06-01

In this paper, we propose a framework for a low-cost secure electronic voting system based on face recognition. Essentially, Local Binary Patterns (LBP) are used to characterize face features as texture histograms, and the chi-square distance is then used for image classification. Two parallel systems, a smartphone application and a web application, are developed for the face learning and verification modules. The proposed system has two-tier security: a person ID check followed by face verification, with a class-specific threshold controlling the security level of the face verification step. Our system is evaluated on three standard databases and one real home-based database, and achieves satisfactory recognition accuracies. Consequently, our proposed system provides a secure, hassle-free voting system that is less intrusive compared with other biometrics.
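The LBP-plus-chi-square pipeline can be sketched as follows (toy 4×4 images and an unstated threshold; the paper's actual parameters and databases are not reproduced): an LBP histogram describes face texture, and the chi-square distance between histograms scores a verification attempt.

```python
def lbp_histogram(image):
    """8-neighbour LBP over the interior pixels, binned into 256 normalized bins."""
    hist = [0] * 256
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]  # clockwise neighbours
    for r in range(1, len(image) - 1):
        for c in range(1, len(image[0]) - 1):
            centre = image[r][c]
            code = 0
            for bit, (dr, dc) in enumerate(offsets):
                if image[r + dr][c + dc] >= centre:
                    code |= 1 << bit
            hist[code] += 1
    total = sum(hist)
    return [h / total for h in hist]

def chi_square(h1, h2, eps=1e-10):
    """Chi-square distance between two normalized histograms (0 = identical)."""
    return sum((a - b) ** 2 / (a + b + eps) for a, b in zip(h1, h2))

# Toy "faces": identical texture vs. a very different one.
enrolled = [[1, 2, 3, 4], [5, 6, 7, 8], [9, 8, 7, 6], [5, 4, 3, 2]]
impostor = [[9, 1, 9, 1], [1, 9, 1, 9], [9, 1, 9, 1], [1, 9, 1, 9]]
h_ref = lbp_histogram(enrolled)
d_genuine = chi_square(h_ref, lbp_histogram(enrolled))   # same texture
d_impostor = chi_square(h_ref, lbp_histogram(impostor))  # different texture
print(d_genuine < d_impostor)  # True
```

Verification then reduces to comparing the distance against the class-specific threshold mentioned in the abstract.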

  9. Geodata Modeling and Query in Geographic Information Systems

    NASA Technical Reports Server (NTRS)

    Adam, Nabil

    1996-01-01

Geographic information systems (GIS) deal with collecting, modeling, managing, analyzing, and integrating spatial (locational) and non-spatial (attribute) data required for geographic applications. Examples of spatial data are digital maps, administrative boundaries, and road networks; examples of non-spatial data are census counts, land elevations, and soil characteristics. GIS shares common areas with a number of other disciplines such as computer-aided design, computer cartography, database management, and remote sensing. None of these disciplines, however, can by themselves fully meet the requirements of a GIS application. Examples of such requirements include: the ability to use locational data to produce high quality plots; performing complex operations such as network analysis; enabling spatial searching and overlay operations; supporting spatial analysis and modeling; providing data management functions such as efficient storage, retrieval, and modification of large datasets; independence, integrity, and security of data; and concurrent access by multiple users. It is to the data management issues that we devote our discussion in this monograph. Traditionally, database management technology has been developed for business applications. Such applications require, among other things, capturing the data requirements of high-level business functions and developing machine-level implementations; supporting multiple views of data while providing integration that minimizes redundancy and maintains data integrity and security; providing a high-level language for data definition and manipulation; allowing concurrent access by multiple users; and processing user transactions efficiently. The demands on database management systems have been for speed, reliability, efficiency, cost effectiveness, and user-friendliness.
Significant progress has been made in all of these areas over the last two decades, to the point that many generalized database platforms are now available for developing data-intensive applications that run in real time. While improvement continues at a fast and competitive pace, new application areas such as computer-aided design, image processing, VLSI design, and GIS have been identified by many as the next generation of database applications. These new application areas pose serious challenges to currently available database technology. At the core of these challenges is the nature of the data being manipulated. In traditional database applications, the database objects do not have any spatial dimension and can therefore be thought of as point data in a multi-dimensional space. For example, each instance of an entity EMPLOYEE will have a unique value for every attribute such as employee id, employee name, employee address, and so on. Thus, every EMPLOYEE instance can be thought of as a point in a multi-dimensional space where each dimension is represented by an attribute. Furthermore, all operations on such data are one-dimensional: users may retrieve all entities satisfying one or more constraints, for example employees with addresses in a certain area code, or salaries within a certain range. Even though constraints can be specified on multiple attributes (dimensions), the search for such data is essentially orthogonal across these dimensions.

  10. Using National Databases To Study the College Choice of Low-SES Students. AIR 2000 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Cabrera, Alberto F.; La Nasa, Steven M.

    This study investigated how economically and sociologically underprivileged students readied themselves for college, highlighting factors affecting the lowest socioeconomic status (SES) students' chances to: secure college qualifications, graduate from high school, and apply to four-year institutions. Data from the 1998 National Educational…

  11. From Fault-Diagnosis and Performance Recovery of a Controlled System to Chaotic Secure Communication

    NASA Astrophysics Data System (ADS)

    Hsu, Wen-Teng; Tsai, Jason Sheng-Hong; Guo, Fang-Cheng; Guo, Shu-Mei; Shieh, Leang-San

Chaotic systems are often applied to encryption in secure communication, but they may not provide high-degree security. In order to improve the security of communication, chaotic systems may need additional secure signals, but this may cause the system to diverge. In this paper, we redesign a communication scheme that can create secure communication with additional secure signals while keeping the system convergent. First, we introduce the universal state-space adaptive observer-based fault diagnosis/estimator and the high-performance tracker for the sampled-data linear time-varying system with unanticipated decay factors in actuators/system states. Robustness, convergence in the mean, and tracking ability are also established in this paper. A residual generation scheme and a mechanism for auto-tuning switched gain are also presented, so that the introduced methodology is applicable to fault detection and diagnosis (FDD) for actuator and state faults, yielding high tracking-performance recovery. The evolutionary-programming-based adaptive observer is then applied to the problem of secure communication. Whenever the tracker induces a large control input that might not conform to the input constraint of some physical systems, the proposed modified linear quadratic optimal tracker (LQT) can effectively restrict the control input within the specified constraint interval, under acceptable tracking performance. The effectiveness of the proposed design methodology is illustrated through tracking control simulation examples.

  12. Evaluation of Accelerometer-Based Fall Detection Algorithms on Real-World Falls

    PubMed Central

    Bagalà, Fabio; Becker, Clemens; Cappello, Angelo; Chiari, Lorenzo; Aminian, Kamiar; Hausdorff, Jeffrey M.; Zijlstra, Wiebren; Klenk, Jochen

    2012-01-01

Despite extensive preventive efforts, falls continue to be a major source of morbidity and mortality among the elderly. Real-time detection of falls and their urgent communication to a telecare center may enable rapid medical assistance, thus increasing the sense of security of the elderly and reducing some of the negative consequences of falls. Many different approaches have been explored to automatically detect a fall using inertial sensors. Although previously published algorithms report high sensitivity (SE) and high specificity (SP), they have usually been tested on simulated falls performed by healthy volunteers. We recently collected acceleration data during a number of real-world falls among a patient population at high fall risk as part of the SensAction-AAL European project. The aim of the present study is to benchmark the performance of thirteen published fall-detection algorithms when they are applied to the database of 29 real-world falls. To the best of our knowledge, this is the first systematic comparison of fall-detection algorithms tested on real-world falls. We found that the average SP of the thirteen algorithms was (mean±std) 83.0%±30.3% (maximum value = 98%). The average SE was considerably lower (SE = 57.0%±27.3%, maximum value = 82.8%), much lower than the values obtained on simulated falls. The number of false alarms generated by the algorithms during 1-day monitoring of three representative fallers ranged from 3 to 85. The factors that affect the performance of the published algorithms when applied to real-world falls are also discussed. These findings indicate the importance of testing fall-detection algorithms in real-life conditions in order to produce more effective automated alarm systems with higher acceptance.
Further, the present results support the idea that a large, shared real-world fall database could, potentially, provide an enhanced understanding of the fall process and the information needed to design and evaluate a high-performance fall detector. PMID:22615890
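Most threshold-based detectors of the kind benchmarked here share a common core: flag a high-magnitude impact, then confirm it with a period of near-1 g inactivity. A simplified sketch of that core (the thresholds and window below are illustrative, not taken from any of the thirteen algorithms):

```python
import math

G = 9.81  # standard gravity, m/s^2

def detect_fall(samples, impact_g=3.0, rest_g=1.1, rest_window=5):
    """Flag a fall when a high-magnitude impact is followed by near-1 g stillness.

    samples: list of (ax, ay, az) accelerations in m/s^2.
    Thresholds are illustrative; real algorithms tune them per sensor and cohort.
    """
    mags = [math.sqrt(ax**2 + ay**2 + az**2) / G for ax, ay, az in samples]
    for i, m in enumerate(mags):
        if m >= impact_g:  # candidate impact
            after = mags[i + 1:i + 1 + rest_window]
            if len(after) == rest_window and all(a <= rest_g for a in after):
                return True  # impact followed by inactivity: likely fall
    return False

# Toy traces: normal walking vs. an impact followed by lying still.
walking = [(0, 0, 9.8), (1, 1, 9.5), (0, 2, 10.0)] * 4
fall = [(0, 0, 9.8), (25.0, 10.0, 15.0)] + [(0, 0, 9.8)] * 5
print(detect_fall(walking), detect_fall(fall))  # False True
```

The study's low real-world sensitivity suggests that fixed thresholds of this kind transfer poorly from simulated to real falls.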

  13. [Review of food policy approaches: from food security to food sovereignty (2000-2013)].

    PubMed

    López-Giraldo, Luis Alirio; Franco-Giraldo, Álvaro

    2015-07-01

Food policies have attracted special interest due to the global food crisis in 2008 and the promotion of the Millennium Development Goals, leading to approaches from different fields. This thematic review aims to describe the main theoretical and methodological approaches to food security and food sovereignty policies. A search was performed in databases of scientific journals from 2000 to 2013; 320 complete articles were selected from a total of 2,699. After reading the articles to apply the inclusion criteria, 55 articles were retained for analysis. In conclusion, while food security predominates as a guiding policy, food sovereignty has emerged as a critical response that should be included in designing and researching food policies. Food policies are essential for achieving public health goals, and public health should thus take a leading role in linking and orienting such policies.

  14. A Partnership for Public Health: USDA Branded Food Products Database

    USDA-ARS?s Scientific Manuscript database

    The importance of comprehensive food composition databases is more critical than ever in helping to address global food security. The USDA National Nutrient Database for Standard Reference is the “gold standard” for food composition databases. The presentation will include new developments in stren...

  15. Achieving the four dimensions of food security for resettled refugees in Australia: A systematic review.

    PubMed

    Lawlis, Tanya; Islam, Wasima; Upton, Penney

    2018-04-01

    Food security is defined by four dimensions: food availability, access, utilisation and stability. Resettled refugees face unique struggles securing these dimensions and, thus, food security when moving to a new country. This systematic review aimed to identify the challenges Australian refugees experience in achieving the four dimensions of food security. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were followed; the SPIDER tool was used to determine eligibility criteria. Three databases were searched using terms relating to food in/security and refugees from 2000 to 20 May 2017. Seven articles were retained for analysis. Studies were categorised against the four dimensions, with four studies identifying challenges against all dimensions. Challenges contributing to high levels of food insecurity in each dimension included: availability and cost of traditional foods, difficulty in accessing preferred food outlets, limited food knowledge and preparation skills and food stability due to low income and social support. Food insecurity adversely impacts refugee health and integration. Methodical research framed by the four dimensions of food security is imperative to address challenges to securing food security in refugee groups and assisting in the development of sustainable interventions. © 2017 Dietitians Association of Australia.

  16. A scalable database model for multiparametric time series: a volcano observatory case study

    NASA Astrophysics Data System (ADS)

    Montalto, Placido; Aliotta, Marco; Cassisi, Carmelo; Prestifilippo, Michele; Cannata, Andrea

    2014-05-01

The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. With the term time series we refer to a set of observations of a given phenomenon acquired sequentially in time; when the time intervals are equally spaced, one speaks of a period or sampling frequency. Our work describes in detail a possible methodology for the storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), to acquire time series from different data sources and standardize them within a relational database. This standardization provides the ability to perform operations, such as queries and visualization, over many measures, synchronizing them on a common time scale. The proposed architecture follows a multiple-layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in particular operations for the reorganization and archiving of data from different sources, such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible from the Internet (web pages, XML). In particular, the loader layer performs a security check on the working status of each running software component through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although the system has to manage huge amounts of data, performance is guaranteed by a smart table partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, which provide the possibility to query different time series over a specified time range, or to follow real-time signal acquisition, according to a per-user data access policy.
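The loader-layer heartbeat can be sketched as a last-seen-timestamp monitor (an illustrative reconstruction; TSDSystem's actual implementation is not shown, and the loader names below are invented): each acquisition process records a beat, and a monitor flags processes whose last beat is older than a timeout.

```python
import time

class HeartbeatMonitor:
    """Track last-seen beats from acquisition processes and flag stale ones."""

    def __init__(self, timeout_s=10.0):
        self.timeout_s = timeout_s
        self.last_beat = {}

    def beat(self, loader_name, now=None):
        # Each loader calls this periodically; `now` is injectable for testing.
        self.last_beat[loader_name] = time.monotonic() if now is None else now

    def stale(self, now=None):
        # Loaders whose last beat is older than the timeout need attention.
        now = time.monotonic() if now is None else now
        return sorted(name for name, t in self.last_beat.items()
                      if now - t > self.timeout_s)

monitor = HeartbeatMonitor(timeout_s=10.0)
monitor.beat("seismic_loader", now=0.0)  # hypothetical loader names
monitor.beat("gps_loader", now=8.0)
print(monitor.stale(now=12.0))  # ['seismic_loader']
```

A stale loader would then raise the "warning condition" the abstract mentions, prompting an automated restart or an operator alert.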

  17. A multidisciplinary database for geophysical time series management

    NASA Astrophysics Data System (ADS)

    Montalto, P.; Aliotta, M.; Cassisi, C.; Prestifilippo, M.; Cannata, A.

    2013-12-01

The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. With the term time series we refer to a set of observations of a given phenomenon acquired sequentially in time; when the time intervals are equally spaced, one speaks of a period or sampling frequency. Our work describes in detail a possible methodology for the storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), to acquire time series from different data sources and standardize them within a relational database. This standardization provides the ability to perform operations, such as queries and visualization, over many measures, synchronizing them on a common time scale. The proposed architecture follows a multiple-layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in particular operations for the reorganization and archiving of data from different sources, such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible from the Internet (web pages, XML). In particular, the loader layer performs a security check on the working status of each running software component through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although the system has to manage huge amounts of data, performance is guaranteed by a smart table partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, which provide the possibility to query different time series over a specified time range, or to follow real-time signal acquisition, according to a per-user data access policy.

  18. Application of advanced data collection and quality assurance methods in open prospective study - a case study of PONS project.

    PubMed

    Wawrzyniak, Zbigniew M; Paczesny, Daniel; Mańczuk, Marta; Zatoński, Witold A

    2011-01-01

Large-scale epidemiologic studies can assess health indicators differentiating social groups and important health outcomes such as the incidence and mortality of cancer, cardiovascular disease, and others, establishing a solid knowledge base for the prevention of premature morbidity and mortality. This study presents new advanced methods of data collection and data management, with ongoing data quality control and security, to ensure high-quality assessment of health indicators in the large epidemiologic PONS study (The Polish-Norwegian Study). The material for the experiment is the data management design of the large-scale population study in Poland (PONS), and the managed processes are applied to establishing a high-quality and solid knowledge base. The functional requirements of PONS data collection, supported by advanced web-based IT methods, result in medical data of high quality and security, with data quality assessment, control, and evolution monitoring fulfilled and shared by the IT system. Data from disparate, distributed sources of information are integrated into databases via software interfaces and archived by a multi-task secure server. The implemented solution, combining modern database technologies with a remote software/hardware structure, successfully supports the research of the large PONS study project. Follow-up control of the consistency and quality of data analysis across the PONS sub-databases shows excellent measurement properties, with data consistency of more than 99%. The project itself, through a tailored hardware/software application, shows the positive impact of Quality Assurance (QA) on the quality of analysis results and on effective data management within a shorter time. This efficiency ensures the quality of the epidemiological data and health indicators by eliminating common errors in research questionnaires and medical measurements.

  19. Utilization of a Clinical Trial Management System for the Whole Clinical Trial Process as an Integrated Database: System Development

    PubMed Central

    Park, Yu Rang; Yoon, Young Jo; Koo, HaYeong; Yoo, Soyoung; Choi, Chang-Min; Beck, Sung-Ho

    2018-01-01

Background Clinical trials pose potential risks in both communications and management due to the various stakeholders involved when performing clinical trials. The academic medical center has a responsibility and obligation to conduct and manage clinical trials while maintaining a sufficiently high level of quality; it is therefore necessary to build an information technology system to support standardized clinical trial processes and comply with relevant regulations. Objective The objective of the study was to address the challenges identified while performing clinical trials at an academic medical center, Asan Medical Center (AMC) in Korea, by developing and utilizing a clinical trial management system (CTMS) that complies with standardized processes from multiple departments or units, controlled vocabularies, security, and privacy regulations. Methods This study describes the methods, considerations, and recommendations for the development and utilization of the CTMS as a consolidated research database in an academic medical center. A task force was formed to define and standardize the clinical trial performance process at the site level. On the basis of the agreed standardized process, the CTMS was designed and developed as an all-in-one system complying with privacy and security regulations. Results In this study, the processes and standard mapped vocabularies of a clinical trial were established at the academic medical center. On the basis of these processes and vocabularies, a CTMS was built which interfaces with the existing trial systems such as the electronic institutional review board, the health information system, enterprise resource planning, and the barcode system. To protect patient data, the CTMS implements data governance and access rules, and excludes 21 personal health identifiers according to the Health Insurance Portability and Accountability Act (HIPAA) privacy rule and Korean privacy laws.
Since December 2014, the CTMS has been successfully implemented and used by 881 internal and external users for managing 11,645 studies and 146,943 subjects. Conclusions The CTMS was introduced at the Asan Medical Center to manage the large amounts of data involved in clinical trial operations. Inter- and intra-unit control of data and resources can easily be conducted through the CTMS. To our knowledge, this is the first CTMS developed in-house at an academic medical center, and it can enhance the efficiency of clinical trial management in compliance with privacy and security laws. PMID:29691212
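The exclusion of personal identifiers at export time can be sketched as an allowlist filter (the field names below are illustrative assumptions; the CTMS's actual rules and the full HIPAA and Korean identifier lists are not reproduced here):

```python
# Illustrative allowlist export filter: only fields known to be safe for
# research leave the system; everything else (names, contact details,
# record numbers, ...) is dropped rather than enumerated.

RESEARCH_ALLOWLIST = {"study_id", "arm", "visit", "lab_value", "age_group"}

def deidentify(record):
    """Return only the allowlisted fields of a subject record."""
    return {k: v for k, v in record.items() if k in RESEARCH_ALLOWLIST}

raw = {
    "study_id": "AMC-2015-001",       # hypothetical study identifier
    "arm": "treatment",
    "patient_name": "Hong Gildong",   # direct identifier: removed
    "phone": "010-0000-0000",         # direct identifier: removed
    "lab_value": 7.2,
}
print(deidentify(raw))  # {'study_id': 'AMC-2015-001', 'arm': 'treatment', 'lab_value': 7.2}
```

An allowlist fails safe: a newly added identifying field is excluded by default, whereas a denylist would leak it until someone remembers to list it.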

  20. Semantic-JSON: a lightweight web service interface for Semantic Web contents integrating multiple life science databases.

    PubMed

    Kobayashi, Norio; Ishii, Manabu; Takahashi, Satoshi; Mochizuki, Yoshiki; Matsushima, Akihiro; Toyoda, Tetsuro

    2011-07-01

    As global cloud frameworks for bioinformatics research databases grow huge and heterogeneous, solutions face competing challenges of cross-integration, retrieval, security, and openness. To address this, as of March 2011 organizations including RIKEN published 192 mammalian, plant and protein life sciences databases holding 8.2 million data records, integrated as Linked Open or Private Data (LOD/LPD) using SciNetS.org, the Scientists' Networking System. The huge quantity of linked data this database integration framework covers is based on the Semantic Web, where researchers collaborate by managing metadata across public and private databases in a secured data space. This outstripped the data query capacity of existing interface tools such as SPARQL. Actual research also requires specialized tools for data analysis using raw original data. To solve these challenges, in December 2009 we developed the lightweight Semantic-JSON interface to access each fragment of linked and raw life sciences data securely under the control of programming languages popular among bioinformaticians, such as Perl and Ruby. Researchers successfully used the interface across 28 million semantic relationships for biological applications including genome design, sequence processing, inference over phenotype databases, full-text search indexing and human-readable contents such as ontology and LOD tree viewers. Semantic-JSON services of SciNetS.org are provided at http://semanticjson.org.
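    The appeal of a JSON interface over SPARQL is that a scripting language can consume each fragment of linked data directly. The response shape below is purely hypothetical, invented for illustration; it is not the actual SciNetS.org schema.

```python
import json

# Hypothetical sketch of consuming a Semantic-JSON-style fragment: the
# schema (id/label/links) is invented, not the real service format. The
# point is that plain JSON needs no SPARQL engine on the client side.
response = json.loads("""
{"id": "item:gene123",
 "label": "example gene",
 "links": [{"predicate": "expressed_in", "target": "item:tissue45"}]}
""")

# Follow outgoing links from the record, as a client script might.
targets = [l["target"] for l in response["links"]
           if l["predicate"] == "expressed_in"]
```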

  1. Interactive analysis of geographically distributed population imaging data collections over light-path data networks

    NASA Astrophysics Data System (ADS)

    van Lew, Baldur; Botha, Charl P.; Milles, Julien R.; Vrooman, Henri A.; van de Giessen, Martijn; Lelieveldt, Boudewijn P. F.

    2015-03-01

    The cohort size required in epidemiological imaging genetics studies often mandates the pooling of data from multiple hospitals. Patient data, however, is subject to strict privacy protection regimes, and physical data storage may be legally restricted to a hospital network. To enable biomarker discovery, fast data access and interactive data exploration must be combined with high-performance computing resources, while respecting privacy regulations. We present a system using fast and inherently secure light-paths to access distributed data, thereby obviating the need for a central data repository. A secure private cloud computing framework facilitates interactive, computationally intensive exploration of this geographically distributed, privacy-sensitive data. As a proof of concept, MRI brain imaging data hosted at two remote sites were processed in response to a user command at a third site. The system was able to automatically start virtual machines, run a selected processing pipeline and write results to a user-accessible database, while keeping data locally stored in the hospitals. Individual tasks took approximately 50% longer than on a locally hosted blade server, but the cloud infrastructure reduced the total elapsed time by a factor of 40 using 70 virtual machines. We demonstrated that the combination of light-paths and a private cloud is a viable means of building an analysis infrastructure for secure data analysis. The system requires further work in the areas of error handling, load balancing and secure support of multiple users.

  2. 6 CFR 37.33 - DMV databases.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 6 Domestic Security 1 2012-01-01 2012-01-01 false DMV databases. 37.33 Section 37.33 Domestic... IDENTIFICATION CARDS Other Requirements § 37.33 DMV databases. (a) States must maintain a State motor vehicle database that contains, at a minimum— (1) All data fields printed on driver's licenses and identification...

  3. 6 CFR 37.33 - DMV databases.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 6 Domestic Security 1 2010-01-01 2010-01-01 false DMV databases. 37.33 Section 37.33 Domestic... IDENTIFICATION CARDS Other Requirements § 37.33 DMV databases. (a) States must maintain a State motor vehicle database that contains, at a minimum— (1) All data fields printed on driver's licenses and identification...

  4. 42 CFR 455.436 - Federal database checks.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 4 2012-10-01 2012-10-01 false Federal database checks. 455.436 Section 455.436....436 Federal database checks. The State Medicaid agency must do all of the following: (a) Confirm the... databases. (b) Check the Social Security Administration's Death Master File, the National Plan and Provider...

  5. 42 CFR 455.436 - Federal database checks.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 4 2011-10-01 2011-10-01 false Federal database checks. 455.436 Section 455.436....436 Federal database checks. The State Medicaid agency must do all of the following: (a) Confirm the... databases. (b) Check the Social Security Administration's Death Master File, the National Plan and Provider...

  6. 6 CFR 37.33 - DMV databases.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 6 Domestic Security 1 2014-01-01 2014-01-01 false DMV databases. 37.33 Section 37.33 Domestic... IDENTIFICATION CARDS Other Requirements § 37.33 DMV databases. (a) States must maintain a State motor vehicle database that contains, at a minimum— (1) All data fields printed on driver's licenses and identification...

  7. 42 CFR 455.436 - Federal database checks.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 4 2014-10-01 2014-10-01 false Federal database checks. 455.436 Section 455.436....436 Federal database checks. The State Medicaid agency must do all of the following: (a) Confirm the... databases. (b) Check the Social Security Administration's Death Master File, the National Plan and Provider...

  8. 42 CFR 455.436 - Federal database checks.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 4 2013-10-01 2013-10-01 false Federal database checks. 455.436 Section 455.436....436 Federal database checks. The State Medicaid agency must do all of the following: (a) Confirm the... databases. (b) Check the Social Security Administration's Death Master File, the National Plan and Provider...

  9. 6 CFR 37.33 - DMV databases.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 6 Domestic Security 1 2013-01-01 2013-01-01 false DMV databases. 37.33 Section 37.33 Domestic... IDENTIFICATION CARDS Other Requirements § 37.33 DMV databases. (a) States must maintain a State motor vehicle database that contains, at a minimum— (1) All data fields printed on driver's licenses and identification...

  10. 75 FR 61553 - National Transit Database: Amendments to the Urbanized Area Annual Reporting Manual and to the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-05

    ... Transit Database: Amendments to the Urbanized Area Annual Reporting Manual and to the Safety and Security... the 2011 National Transit Database Urbanized Area Annual Reporting Manual and Announcement of... Transit Administration's (FTA) National Transit Database (NTD) reporting requirements, including...

  11. 6 CFR 37.33 - DMV databases.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 6 Domestic Security 1 2011-01-01 2011-01-01 false DMV databases. 37.33 Section 37.33 Domestic... IDENTIFICATION CARDS Other Requirements § 37.33 DMV databases. (a) States must maintain a State motor vehicle database that contains, at a minimum— (1) All data fields printed on driver's licenses and identification...

  12. Alignment and bit extraction for secure fingerprint biometrics

    NASA Astrophysics Data System (ADS)

    Nagar, A.; Rane, S.; Vetro, A.

    2010-01-01

    Security of biometric templates stored in a system is important because a stolen template can compromise system security as well as user privacy. Therefore, a number of secure biometrics schemes have been proposed that facilitate matching of feature templates without the need for a stored biometric sample. However, most of these schemes suffer from poor matching performance owing to the difficulty of designing biometric features that remain robust over repeated biometric measurements. This paper describes a scheme to extract binary features from fingerprints using minutia points and fingerprint ridges. The features are amenable to direct matching based on binary Hamming distance, but are especially suitable for use in secure biometric cryptosystems that use standard error correcting codes. Given all binary features, a method for retaining only the most discriminable features is presented which improves the Genuine Accept Rate (GAR) from 82% to 90% at a False Accept Rate (FAR) of 0.1% on a well-known public database. Additionally, incorporating singular points such as a core or delta feature is shown to improve the matching tradeoff.
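    The matching primitive the scheme builds on, binary Hamming distance between feature vectors, can be sketched in a few lines. The vector length, bit patterns, and acceptance threshold below are toy values for illustration, not the paper's parameters.

```python
# Sketch of binary-feature matching by Hamming distance. Enrolled
# template and probe are illustrative 16-bit features; real systems use
# much longer vectors and tune the threshold to the desired FAR/GAR.
def hamming(a: int, b: int) -> int:
    """Number of differing bits between two equal-length binary features."""
    return bin(a ^ b).count("1")

enrolled = 0b1011001110001101   # stored binary template
probe    = 0b1011001010001111   # fresh measurement of the same finger

# Accept when the fraction of differing bits is below a tolerance.
n_bits = 16
match = hamming(enrolled, probe) / n_bits <= 0.25
```

    In a secure biometric cryptosystem the same distance computation happens implicitly: an error correcting code tolerates up to a fixed number of bit errors between enrollment and probe.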

  13. Improving the security of international ISO container traffic by centralizing the archival of inspection results

    NASA Astrophysics Data System (ADS)

    Chalmers, Alex

    2004-09-01

    To increase the security and throughput of ISO traffic through international terminals more technology must be applied to the problem. A transnational central archive of inspection records is discussed that can be accessed by national agencies as ISO containers approach their borders. The intent is to improve the throughput and security of the cargo inspection process. A review of currently available digital media archiving technologies is presented and their possible application to the tracking of international ISO container shipments. Specific image formats employed by current x-ray inspection systems are discussed. Sample x-ray data from systems in use today are shown that could be entered into such a system. Data from other inspection technologies are shown to be easily integrated, as well as the creation of database records suitable for interfacing with other computer systems. Overall system performance requirements are discussed in terms of security, response time and capacity. Suggestions for pilot projects based on existing border inspection processes are made also.

  14. KernPaeP - a web-based pediatric palliative documentation system for home care.

    PubMed

    Hartz, Tobias; Verst, Hendrik; Ueckert, Frank

    2009-01-01

    KernPaeP is a new web-based on- and offline documentation system developed for pediatric palliative care teams, supporting patient documentation and communication among health care professionals. It provides a reliable system making fast and secure home care documentation possible. KernPaeP is accessible online by registered users using any web browser. Home care teams use an offline version of KernPaeP running on a netbook for patient documentation on site. Identifying and medical patient data are strictly separated and stored on two database servers. The system offers a stable, enhanced two-way algorithm for synchronization between the offline component and the central database servers. KernPaeP is implemented meeting the highest security standards while still maintaining high usability. The web-based documentation system allows ubiquitous and immediate access to patient data. Cumbersome paperwork is replaced by secure and comprehensive electronic documentation. KernPaeP helps save time and improve the quality of documentation. Due to development in close cooperation with pediatric palliative professionals, KernPaeP fulfils the broad needs of home-care documentation. The technique of web-based online and offline documentation is in general applicable to arbitrary home care scenarios.
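    The strict separation of identifying and medical data on two servers can be sketched with two stores linked only by a pseudonym, so a breach of either store alone reveals little. This is an illustrative model, not KernPaeP's actual implementation.

```python
import uuid

# Sketch (not KernPaeP's code) of the described two-server separation:
# identifying data and medical data are linked only by a pseudonym.
identity_store = {}   # stands in for the identifying-data server
medical_store = {}    # stands in for the medical-data server

def admit(name: str, diagnosis: str) -> str:
    pid = uuid.uuid4().hex          # pseudonym linking the two stores
    identity_store[pid] = {"name": name}
    medical_store[pid] = {"diagnosis": diagnosis}
    return pid

pid = admit("Jane Doe", "J45.9")
# Research or synchronization paths touch only the medical store.
assert "name" not in medical_store[pid]
```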

  15. Shape-based human detection for threat assessment

    NASA Astrophysics Data System (ADS)

    Lee, Dah-Jye; Zhan, Pengcheng; Thomas, Aaron; Schoenberger, Robert B.

    2004-07-01

    Detection of intrusions for early threat assessment requires the capability of distinguishing whether the intrusion is a human, an animal, or another object. Most low-cost security systems use simple electronic motion detection sensors to monitor motion or the location of objects within the perimeter. Although cost effective, these systems suffer from high rates of false alarm, especially when monitoring open environments. Any moving object, including animals, can falsely trigger the security system. Other security systems that utilize video equipment require human interpretation of the scene in order to make real-time threat assessments. A shape-based human detection technique has been developed for accurate early threat assessment in open and remote environments. Potential threats are isolated from the static background scene using differential motion analysis, and contours of the intruding objects are extracted for shape analysis. Contour points are simplified by removing redundant points connecting short and straight line segments and preserving only those with shape significance. Contours are represented in tangent space for comparison with shapes stored in a database. A power cepstrum technique has been developed to search for the best-matched contour in the database and to distinguish a human from other objects across different viewing angles and distances.
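    The differential motion analysis step, subtracting a static background frame and thresholding the difference to localize the intruder, can be sketched on toy data. The 4x4 frames and threshold below are invented for illustration.

```python
# Sketch of differential motion analysis: subtract a static background
# frame from the current frame and threshold to find moving pixels.
# Frames and threshold are toy values, not the paper's configuration.
background = [
    [10, 10, 10, 10],
    [10, 10, 10, 10],
    [10, 10, 10, 10],
    [10, 10, 10, 10],
]
current = [
    [10, 10, 10, 10],
    [10, 90, 95, 10],
    [10, 92, 94, 10],
    [10, 10, 10, 10],
]

THRESHOLD = 30
mask = [[abs(c - b) > THRESHOLD for c, b in zip(crow, brow)]
        for crow, brow in zip(current, background)]
moving_pixels = sum(v for row in mask for v in row)
```

    The resulting mask is what the contour extraction and tangent-space shape comparison would then operate on.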

  16. Fine-grained Database Field Search Using Attribute-Based Encryption for E-Healthcare Clouds.

    PubMed

    Guo, Cheng; Zhuang, Ruhan; Jie, Yingmo; Ren, Yizhi; Wu, Ting; Choo, Kim-Kwang Raymond

    2016-11-01

    An effectively designed e-healthcare system can significantly enhance the quality of access and experience of healthcare users, including facilitating medical and healthcare providers in ensuring a smooth delivery of services. Ensuring the security of patients' electronic health records (EHRs) in the e-healthcare system is an active research area. EHRs may be outsourced to a third party, such as a community healthcare cloud service provider, for storage due to cost-saving measures. Generally, encrypting the EHRs when they are stored in the system (i.e. data-at-rest) or prior to outsourcing the data is used to ensure data confidentiality. Searchable encryption (SE) is a promising technique that can ensure the protection of private information without compromising on performance. In this paper, we propose a novel framework for controlling access to EHRs stored in semi-trusted cloud servers (e.g. a private cloud or a community cloud). To achieve fine-grained access control for EHRs, we leverage the ciphertext-policy attribute-based encryption (CP-ABE) technique to encrypt tables published by hospitals, including patients' EHRs, and the table is stored in the database with the primary key being the patient's unique identity. Our framework enables different users with different privileges to search on different database fields. Differing from previous attempts to secure outsourced data, we emphasize the control of searches of the fields within the database. We demonstrate the utility of the scheme by evaluating it using datasets from the University of California, Irvine.
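    The underlying idea of searching encrypted fields can be illustrated with a much simpler keyed-token scheme: replace each field value by a keyed token and match an identically tokenized query term. This is NOT the paper's CP-ABE construction, which additionally enforces per-attribute access policies; it is only a sketch of equality search without plaintext.

```python
import hashlib
import hmac

# Greatly simplified searchable-field sketch: keyed tokens allow exact
# matching without revealing plaintext. This illustrates the search
# side only; CP-ABE as used in the paper also binds access policies to
# ciphertexts. The key is illustrative.
KEY = b"demo-key"

def token(field: str, value: str) -> str:
    return hmac.new(KEY, f"{field}:{value}".encode(), hashlib.sha256).hexdigest()

row = {"diagnosis": token("diagnosis", "asthma")}   # stored, encrypted
query = token("diagnosis", "asthma")                # tokenized query
hit = row["diagnosis"] == query
```

    Deterministic tokens leak equality patterns, which is one reason real schemes layer on access control and randomized constructions.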

  17. 75 FR 39290 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-08

    ... intermediary. The number of custodians is from Lipper Inc.'s Lana Database. Securities depositories include the... SECURITIES AND EXCHANGE COMMISSION Submission for OMB Review; Comment Request Upon Written Request, Copies Available From: Securities and Exchange Commission, Office of Investor Education and Advocacy...

  18. Secure Genomic Computation through Site-Wise Encryption

    PubMed Central

    Zhao, Yongan; Wang, XiaoFeng; Tang, Haixu

    2015-01-01

    Commercial clouds provide on-demand IT services for big-data analysis, which have become an attractive option for users who have no access to comparable infrastructure. However, utilizing these services for human genome analysis is highly risky, as human genomic data contains identifiable information of human individuals and their disease susceptibility. Therefore, currently, no computation on personal human genomic data is conducted on public clouds. To address this issue, here we present a site-wise encryption approach to encrypt whole human genome sequences, which can be subject to secure searching of genomic signatures on public clouds. We implemented this method within the Hadoop framework, and tested it on the case of searching disease markers retrieved from the ClinVar database against patients’ genomic sequences. The secure search runs only one order of magnitude slower than the simple search without encryption, indicating our method is ready to be used for secure genomic computation on public clouds. PMID:26306278

  19. Secure Genomic Computation through Site-Wise Encryption.

    PubMed

    Zhao, Yongan; Wang, XiaoFeng; Tang, Haixu

    2015-01-01

    Commercial clouds provide on-demand IT services for big-data analysis, which have become an attractive option for users who have no access to comparable infrastructure. However, utilizing these services for human genome analysis is highly risky, as human genomic data contains identifiable information of human individuals and their disease susceptibility. Therefore, currently, no computation on personal human genomic data is conducted on public clouds. To address this issue, here we present a site-wise encryption approach to encrypt whole human genome sequences, which can be subject to secure searching of genomic signatures on public clouds. We implemented this method within the Hadoop framework, and tested it on the case of searching disease markers retrieved from the ClinVar database against patients' genomic sequences. The secure search runs only one order of magnitude slower than the simple search without encryption, indicating our method is ready to be used for secure genomic computation on public clouds.
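    The site-wise idea, encrypting each genomic site independently so an encrypted marker query can be matched exactly in the encrypted domain, can be sketched as follows. The keyed function, key, and toy genome are illustrative, not the paper's actual scheme.

```python
import hashlib
import hmac

# Sketch of site-wise encryption: each (position, allele) pair is
# encrypted independently, so a disease-marker query encrypted the same
# way matches by exact comparison. Keys and PRF choice are illustrative.
MASTER_KEY = b"demo"

def enc_site(pos: int, allele: str) -> str:
    return hmac.new(MASTER_KEY, f"{pos}:{allele}".encode(),
                    hashlib.sha256).hexdigest()

genome = {pos: enc_site(pos, a) for pos, a in [(101, "A"), (102, "T"), (103, "G")]}
marker = {102: enc_site(102, "T"), 103: enc_site(103, "G")}   # encrypted query

found = all(genome.get(p) == c for p, c in marker.items())
```

    Because matching is plain equality over ciphertexts, the search parallelizes naturally in a MapReduce framework such as Hadoop, consistent with the implementation described above.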

  20. National Vulnerability Database (NVD)

    National Institute of Standards and Technology Data Gateway

    National Vulnerability Database (NVD) (Web, free access)   NVD is a comprehensive cyber security vulnerability database that integrates all publicly available U.S. Government vulnerability resources and provides references to industry resources. It is based on and synchronized with the CVE vulnerability naming standard.

  1. Disability pension from back pain among social security beneficiaries, Brazil.

    PubMed

    Meziat Filho, Ney; Silva, Gulnar Azevedo E

    2011-06-01

    To describe disability pension from back pain. Descriptive study based on data from the Brazilian Social Security Beneficiary Database and the Social Security Statistics Annual Report in 2007. The incidence rate of disability pension from back pain was estimated according to gender and age by Brazilian state. Working days lost to back pain disability were also estimated by occupation. Idiopathic back pain was the most common cause of disability among social security pension and accidental retirement. Most pensioners were living in urban areas and were commercial workers. The rate of disability pension from back pain in Brazil was 29.96 per 100,000 beneficiaries. A higher rate was seen among males and older individuals. Rondônia showed the highest rate, four times as high as expected (RR = 4.05), followed by Bahia with a rate about twice as high as expected (RR = 2.07). Commercial workers accounted for 96.9% of working days lost due to disability. Back pain was a major cause of disability in 2007, mostly among commercial workers, with great differences between the Brazilian states.

  2. Using Distributed Data over HBase in Big Data Analytics Platform for Clinical Services

    PubMed Central

    Zamani, Hamid

    2017-01-01

    Big data analytics (BDA) is important to reduce healthcare costs. However, there are many challenges of data aggregation, maintenance, integration, translation, analysis, and security/privacy. The study objective, to establish an interactive BDA platform with simulated patient data using open-source software technologies, was achieved by constructing a platform framework with the Hadoop Distributed File System (HDFS) using HBase (a key-value NoSQL database). Distributed data structures were generated from benchmarked hospital-specific metadata of nine billion patient records. At optimized iteration, HDFS ingestion of HFiles to HBase store files revealed sustained availability over hundreds of iterations; however, completing MapReduce to HBase required a week (for 10 TB) and a month for three billion (30 TB) indexed patient records, respectively. Inconsistencies found in MapReduce limited the capacity to generate and replicate data efficiently. Apache Spark and Drill showed high performance with high usability for technical support but poor usability for clinical services. Modeling the hospital system's patient-centric data in HBase was challenging, as not all data profiles were fully integrated with the complex patient-to-hospital relationships. However, we recommend using HBase to secure patient data while querying entire hospital volumes in a simplified clinical event model across clinical services. PMID:29375652

  3. Using Distributed Data over HBase in Big Data Analytics Platform for Clinical Services.

    PubMed

    Chrimes, Dillon; Zamani, Hamid

    2017-01-01

    Big data analytics (BDA) is important to reduce healthcare costs. However, there are many challenges of data aggregation, maintenance, integration, translation, analysis, and security/privacy. The study objective, to establish an interactive BDA platform with simulated patient data using open-source software technologies, was achieved by constructing a platform framework with the Hadoop Distributed File System (HDFS) using HBase (a key-value NoSQL database). Distributed data structures were generated from benchmarked hospital-specific metadata of nine billion patient records. At optimized iteration, HDFS ingestion of HFiles to HBase store files revealed sustained availability over hundreds of iterations; however, completing MapReduce to HBase required a week (for 10 TB) and a month for three billion (30 TB) indexed patient records, respectively. Inconsistencies found in MapReduce limited the capacity to generate and replicate data efficiently. Apache Spark and Drill showed high performance with high usability for technical support but poor usability for clinical services. Modeling the hospital system's patient-centric data in HBase was challenging, as not all data profiles were fully integrated with the complex patient-to-hospital relationships. However, we recommend using HBase to secure patient data while querying entire hospital volumes in a simplified clinical event model across clinical services.
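    The patient-centric, simplified clinical event model recommended above maps naturally onto HBase's sorted row keys: a key of patient id plus event timestamp lets one prefix scan return a patient's whole history. The sketch below emulates that with an in-memory sorted store; the key layout and data are illustrative, not the platform's actual schema.

```python
from bisect import bisect_left, bisect_right

# In-memory stand-in for an HBase table: rows keyed "patientId#timestamp"
# support a prefix scan over one patient's clinical events. Key layout
# and events are illustrative.
rows = {}  # row_key -> column map
for pid, ts, event in [("p001", "20170101", "admit"),
                       ("p001", "20170105", "lab"),
                       ("p002", "20170102", "admit")]:
    rows[f"{pid}#{ts}"] = {"cf:event": event}

def scan_prefix(prefix: str):
    """Return all rows whose key starts with prefix, in key order."""
    keys = sorted(rows)
    lo = bisect_left(keys, prefix)
    hi = bisect_right(keys, prefix + "\xff")
    return [(k, rows[k]) for k in keys[lo:hi]]

history = scan_prefix("p001#")   # one patient's entire event history
```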

  4. Report on Legal Protection for Databases. A Report of the Register of Copyrights. August, 1997.

    ERIC Educational Resources Information Center

    Library of Congress, Washington, DC. Copyright Office.

    This report gives an overview of the past and present domestic and international legal framework for database protection. It describes database industry practices in securing protection against unauthorized use and Copyright Office registration practices relating to databases. Finally, it discusses issues raised and concerns expressed in a series…

  5. 78 FR 2363 - Notification of Deletion of a System of Records; Automated Trust Funds Database

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-11

    ... Database AGENCY: Animal and Plant Health Inspection Service, USDA. ACTION: Notice of deletion of a system... establishing the Automated Trust Funds (ATF) database system of records. The Federal Information Security... Integrity Act of 1982, Public Law 97-255, provided authority for the system. The ATF database has been...

  6. Strategies to Minimize the Effects of Information Security Threats on Business Performance

    ERIC Educational Resources Information Center

    Okoye, Stella Ifeyinwa

    2017-01-01

    Business leaders in Nigeria are concerned about the high rates of business failure and economic loss from security incidents and may not understand strategies for reducing the effects of information security threats on business performance. Guided by general systems theory and transformational leadership theory, the focus of this exploratory…

  7. Development and evaluation of a de-identification procedure for a case register sourced from mental health electronic records.

    PubMed

    Fernandes, Andrea C; Cloete, Danielle; Broadbent, Matthew T M; Hayes, Richard D; Chang, Chin-Kuo; Jackson, Richard G; Roberts, Angus; Tsang, Jason; Soncul, Murat; Liebscher, Jennifer; Stewart, Robert; Callard, Felicity

    2013-07-11

    Electronic health records (EHRs) provide enormous potential for health research but also present data governance challenges. Ensuring de-identification is a pre-requisite for use of EHR data without prior consent. The South London and Maudsley NHS Trust (SLaM), one of the largest secondary mental healthcare providers in Europe, has developed, from its EHRs, a de-identified psychiatric case register, the Clinical Record Interactive Search (CRIS), for secondary research. We describe development, implementation and evaluation of a bespoke de-identification algorithm used to create the register. It is designed to create dictionaries using patient identifiers (PIs) entered into dedicated source fields and then identify, match and mask them (with ZZZZZ) when they appear in medical texts. We deemed this approach would be effective, given high coverage of PI in the dedicated fields and the effectiveness of the masking combined with elements of a security model. We conducted two separate performance tests i) to test performance of the algorithm in masking individual true PIs entered in dedicated fields and then found in text (using 500 patient notes) and ii) to compare the performance of the CRIS pattern matching algorithm with a machine learning algorithm, called the MITRE Identification Scrubber Toolkit - MIST (using 70 patient notes - 50 notes to train, 20 notes to test on). We also report any incidences of potential breaches, defined by occurrences of 3 or more true or apparent PIs in the same patient's notes (and in an additional set of longitudinal notes for 50 patients); and we consider the possibility of inferring information despite de-identification. True PIs were masked with 98.8% precision and 97.6% recall. As anticipated, potential PIs did appear, owing to misspellings entered within the EHRs. We found one potential breach. 
In a separate performance test, with a different set of notes, CRIS yielded 100% precision and 88.5% recall, while MIST yielded 95.1% and 78.1%, respectively. We discuss how we overcome the realistic possibility - albeit of low probability - of potential breaches through implementation of the security model. CRIS is a de-identified psychiatric database sourced from EHRs, which protects patient anonymity and maximises data available for research. CRIS demonstrates the advantage of combining an effective de-identification algorithm with a carefully designed security model. The paper advances much-needed discussion of EHR de-identification - particularly in relation to criteria for assessing de-identification, and considering the contexts of de-identified research databases when assessing the risk of breaches of confidential patient information.
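    The core masking step, taking identifiers from dedicated source fields and replacing their occurrences in free text with ZZZZZ, can be sketched as below. The patient note is invented, and the real algorithm handles many more cases (dates, misspellings, fuzzy variants) than this minimal version.

```python
import re

# Sketch of dictionary-based masking: identifiers from dedicated fields
# are found in free text and replaced with ZZZZZ. Longest-first ordering
# prevents "John" from clobbering "John Smith" mid-match. The note and
# identifiers are invented for illustration.
def mask(text: str, identifiers: list) -> str:
    for pi in sorted(identifiers, key=len, reverse=True):  # longest first
        text = re.sub(re.escape(pi), "ZZZZZ", text, flags=re.IGNORECASE)
    return text

note = "Seen John Smith (NHS 123456) at clinic; John reports improvement."
masked = mask(note, ["John Smith", "John", "123456"])
```

    As the abstract notes, misspelled identifiers entered into the EHR escape this kind of exact dictionary matching, which is why recall falls short of 100% and why a security model is layered on top.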

  8. Protecting Database Centric Web Services against SQL/XPath Injection Attacks

    NASA Astrophysics Data System (ADS)

    Laranjeiro, Nuno; Vieira, Marco; Madeira, Henrique

    Web services represent a powerful interface for back-end database systems and are increasingly being used in business-critical applications. However, field studies show that a large number of web services are deployed with security flaws (e.g., having SQL Injection vulnerabilities). Although several techniques for the identification of security vulnerabilities have been proposed, developing non-vulnerable web services is still a difficult task. In fact, security-related concerns are hard to apply as they involve adding complexity to already complex code. This paper proposes an approach to secure web services against SQL and XPath Injection attacks by transparently detecting and aborting service invocations that try to take advantage of potential vulnerabilities. Our mechanism was applied to secure several web services specified by the TPC-App benchmark, proving to be 100% effective in stopping attacks, non-intrusive, and very easy to use.
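    One common way to detect such attacks transparently, in the spirit of the approach above, is to reduce each SQL statement to a structural skeleton (literals stripped) and abort any invocation whose skeleton was never observed during a learning phase. The learning set and queries below are invented for illustration, not taken from the paper's TPC-App evaluation.

```python
import re

# Sketch of skeleton-based injection detection: strip literals, compare
# the remaining query structure against a learned whitelist. An injected
# "OR 1=1" changes the skeleton and is rejected.
def skeleton(sql: str) -> str:
    sql = re.sub(r"'[^']*'", "?", sql)       # string literals -> ?
    sql = re.sub(r"\b\d+\b", "?", sql)       # numeric literals -> ?
    return re.sub(r"\s+", " ", sql).strip().lower()

# Learning phase: skeletons of known-good invocations.
learned = {skeleton("SELECT * FROM users WHERE id = 42")}

def allowed(sql: str) -> bool:
    return skeleton(sql) in learned

ok = allowed("SELECT * FROM users WHERE id = 7")
attack = allowed("SELECT * FROM users WHERE id = 7 OR 1 = 1")
```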

  9. 77 FR 49475 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-16

    .... Registration also allows entities in the securities industry to gain access to a confidential database that... SECURITIES AND EXCHANGE COMMISSION Submission for OMB Review; Comment Request Upon Written Request, Copies Available From: U.S. Securities and Exchange Commission, Office of Investor Education and Advocacy...

  10. Addition of a breeding database in the Genome Database for Rosaceae

    PubMed Central

    Evans, Kate; Jung, Sook; Lee, Taein; Brutcher, Lisa; Cho, Ilhyung; Peace, Cameron; Main, Dorrie

    2013-01-01

    Breeding programs produce large datasets that require efficient management systems to keep track of performance, pedigree, geographical and image-based data. With the development of DNA-based screening technologies, more breeding programs perform genotyping in addition to phenotyping for performance evaluation. The integration of breeding data with other genomic and genetic data is instrumental for the refinement of marker-assisted breeding tools, enhances genetic understanding of important crop traits and maximizes access and utility by crop breeders and allied scientists. Development of new infrastructure in the Genome Database for Rosaceae (GDR) was designed and implemented to enable secure and efficient storage, management and analysis of large datasets from the Washington State University apple breeding program and subsequently expanded to fit datasets from other Rosaceae breeders. The infrastructure was built using the software Chado and Drupal, making use of the Natural Diversity module to accommodate large-scale phenotypic and genotypic data. Breeders can search accessions within the GDR to identify individuals with specific trait combinations. Results from Search by Parentage lists individuals with parents in common and results from Individual Variety pages link to all data available on each chosen individual including pedigree, phenotypic and genotypic information. Genotypic data are searchable by markers and alleles; results are linked to other pages in the GDR to enable the user to access tools such as GBrowse and CMap. This breeding database provides users with the opportunity to search datasets in a fully targeted manner and retrieve and compare performance data from multiple selections, years and sites, and to output the data needed for variety release publications and patent applications. The breeding database facilitates efficient program management. 
Storing publicly available breeding data in a database together with genomic and genetic data will further accelerate the cross-utilization of diverse data types by researchers from various disciplines. Database URL: http://www.rosaceae.org/breeders_toolbox PMID:24247530

  11. Addition of a breeding database in the Genome Database for Rosaceae.

    PubMed

    Evans, Kate; Jung, Sook; Lee, Taein; Brutcher, Lisa; Cho, Ilhyung; Peace, Cameron; Main, Dorrie

    2013-01-01

    Breeding programs produce large datasets that require efficient management systems to keep track of performance, pedigree, geographical and image-based data. With the development of DNA-based screening technologies, more breeding programs perform genotyping in addition to phenotyping for performance evaluation. The integration of breeding data with other genomic and genetic data is instrumental for the refinement of marker-assisted breeding tools, enhances genetic understanding of important crop traits and maximizes access and utility by crop breeders and allied scientists. Development of new infrastructure in the Genome Database for Rosaceae (GDR) was designed and implemented to enable secure and efficient storage, management and analysis of large datasets from the Washington State University apple breeding program and subsequently expanded to fit datasets from other Rosaceae breeders. The infrastructure was built using the software Chado and Drupal, making use of the Natural Diversity module to accommodate large-scale phenotypic and genotypic data. Breeders can search accessions within the GDR to identify individuals with specific trait combinations. Results from Search by Parentage lists individuals with parents in common and results from Individual Variety pages link to all data available on each chosen individual including pedigree, phenotypic and genotypic information. Genotypic data are searchable by markers and alleles; results are linked to other pages in the GDR to enable the user to access tools such as GBrowse and CMap. This breeding database provides users with the opportunity to search datasets in a fully targeted manner and retrieve and compare performance data from multiple selections, years and sites, and to output the data needed for variety release publications and patent applications. The breeding database facilitates efficient program management. 
Storing publicly available breeding data in a database together with genomic and genetic data will further accelerate the cross-utilization of diverse data types by researchers from various disciplines. Database URL: http://www.rosaceae.org/breeders_toolbox.

  12. Monitoring SLAC High Performance UNIX Computing Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lettsome, Annette K.; /Bethune-Cookman Coll. /SLAC

    2005-12-15

Knowledge of the effectiveness and efficiency of computers is important when working with high performance systems. The monitoring of such systems is advantageous in order to foresee possible misfortunes or system failures. Ganglia is a software system designed for high performance computing systems to retrieve specific monitoring information. An alternative storage facility for Ganglia's collected data is needed since its default storage system, the round-robin database (RRD), struggles with data integrity. The creation of a script-driven MySQL database solves this dilemma. This paper describes the process followed in the creation and implementation of the MySQL database for use by Ganglia. Comparisons between data storage by both databases are made using gnuplot and Ganglia's real-time graphical user interface.
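The storage approach the abstract describes can be sketched minimally as follows. This is an illustrative stand-in, not the authors' actual script: sqlite3 substitutes for MySQL, and the table and metric names are hypothetical. Unlike a round-robin database, which ages out and averages older samples, every sample is retained.

```python
import sqlite3

# sqlite3 stands in for the script-driven MySQL database; the schema,
# host names, and metric names below are illustrative assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE metrics (
        host   TEXT    NOT NULL,
        metric TEXT    NOT NULL,
        value  REAL    NOT NULL,
        ts     INTEGER NOT NULL
    )
""")

def record_sample(host, metric, value, ts):
    # Parameterized insert: every sample is kept with full precision,
    # avoiding the data-integrity issues of RRD's fixed-size ring buffer.
    conn.execute("INSERT INTO metrics VALUES (?, ?, ?, ?)",
                 (host, metric, value, ts))
    conn.commit()

record_sample("node01", "load_one", 0.75, 1134000000)
record_sample("node01", "load_one", 0.90, 1134000300)

rows = conn.execute(
    "SELECT value FROM metrics WHERE host=? AND metric=? ORDER BY ts",
    ("node01", "load_one"),
).fetchall()
```

A plotting tool such as gnuplot can then read the full time series straight from the table, which is the comparison the paper draws against RRD's lossy storage.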

  13. High Performance Semantic Factoring of Giga-Scale Semantic Graph Databases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joslyn, Cliff A.; Adolf, Robert D.; Al-Saffar, Sinan

    2010-10-04

As semantic graph database technology grows to address components ranging from extant large triple stores to SPARQL endpoints over SQL-structured relational databases, it will become increasingly important to be able to bring high performance computational resources to bear on their analysis, interpretation, and visualization, especially with respect to their innate semantic structure. Our research group built a novel high performance hybrid system comprising computational capability for semantic graph database processing utilizing the large multithreaded architecture of the Cray XMT platform, conventional clusters, and large data stores. In this paper we describe that architecture, and present the results of deploying it for the analysis of the Billion Triple dataset with respect to its semantic factors.

  14. Changes in Exercise Data Management

    NASA Technical Reports Server (NTRS)

    Buxton, R. E.; Kalogera, K. L.; Hanson, A. M.

    2018-01-01

The suite of exercise hardware aboard the International Space Station (ISS) generates an immense amount of data. The data collected from the treadmill, cycle ergometer, and resistance strength training hardware are basic exercise parameters (time, heart rate, speed, load, etc.). The raw data are post-processed in the laboratory and more detailed parameters are calculated from each exercise data file. Updates have recently been made to how this valuable data are stored, adding an additional level of data security, increasing data accessibility, and resulting in overall increased efficiency of medical report delivery. Questions regarding exercise performance, or how exercise may influence other variables of crew health, frequently arise within the crew health care community. Inquiries over the health of the exercise hardware often need quick analysis and response to ensure the exercise system is operable on a continuous basis. Consolidating all of the exercise system data in a single repository enables a quick response to both the medical and engineering communities. A SQL Server database is currently in use and provides a secure location for all of the exercise data from ISS Expedition 1 to the current day. The database has been structured to update derived metrics automatically, making analysis and reporting available within minutes of loading the inflight data into the database. Commercial tools were evaluated to help aggregate and visualize data from the SQL database. The Tableau software provides a manageable interface, which has improved the laboratory's output time of crew reports by 67%. Expansion of the SQL database to be inclusive of additional medical requirement metrics, addition of 'app-like' tools for mobile visualization, and collaborative use (e.g. operational support teams, research groups, and International Partners) of the data system is currently being explored.
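The "derived metrics update automatically" design described above can be sketched with a database trigger. This is a hedged illustration only: sqlite3 stands in for SQL Server, and the session/derived schema and the average-speed metric are hypothetical examples, not the laboratory's actual tables.

```python
import sqlite3

# sqlite3 stands in for the SQL Server instance; schema and metric
# names are illustrative assumptions, not the flight data model.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE session (id INTEGER PRIMARY KEY,
                          duration_min REAL,
                          distance_km  REAL);
    CREATE TABLE derived (session_id INTEGER, avg_speed_kmh REAL);

    -- The trigger computes the derived metric the moment raw
    -- inflight data lands, so reports are available within minutes.
    CREATE TRIGGER derive AFTER INSERT ON session
    BEGIN
        INSERT INTO derived
        VALUES (NEW.id, NEW.distance_km / (NEW.duration_min / 60.0));
    END;
""")

db.execute("INSERT INTO session VALUES (1, 30.0, 5.0)")  # 5 km in 30 min
speed = db.execute(
    "SELECT avg_speed_kmh FROM derived WHERE session_id = 1"
).fetchone()[0]
```

The same effect can be achieved with scheduled jobs or computed columns; a trigger is simply the most direct way to show derivation happening inside the database rather than in post-processing scripts.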

  15. Practical private database queries based on a quantum-key-distribution protocol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jakobi, Markus; Humboldt-Universitaet zu Berlin, D-10117 Berlin; Simon, Christoph

    2011-02-15

Private queries allow a user, Alice, to learn an element of a database held by a provider, Bob, without revealing which element she is interested in, while limiting her information about the other elements. We propose to implement private queries based on a quantum-key-distribution protocol, with changes only in the classical postprocessing of the key. This approach makes our scheme both easy to implement and loss tolerant. While unconditionally secure private queries are known to be impossible, we argue that an interesting degree of security can be achieved by relying on fundamental physical principles instead of unverifiable security assumptions in order to protect both the user and the database. We think that the scope exists for such practical private queries to become another remarkable application of quantum information in the footsteps of quantum key distribution.
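The classical postprocessing the abstract refers to can be illustrated with a toy simulation. In the Jakobi et al. scheme, Bob holds a full raw key while Alice knows only a fraction of its bits; XORing k substrings of the raw key dilutes her knowledge so she ends up knowing only a few bits of the final key, each of which lets her read exactly one database item. The 1/4 knowledge probability and the parameters below are illustrative assumptions for a sketch, not a security analysis.

```python
import random

random.seed(7)  # fixed seed so the sketch is reproducible

# Toy model: Bob knows the whole raw key; Alice knows each raw bit
# with probability 1/4 (an assumed SARG04-style sifting rate).
N, k = 32, 2
raw = [random.randint(0, 1) for _ in range(N * k)]
alice_known = {i for i in range(N * k) if random.random() < 0.25}

# Classical postprocessing: bitwise XOR of the k substrings of length N.
final = [0] * N
for i in range(N):
    for j in range(k):
        final[i] ^= raw[i + j * N]

# Alice knows a final bit only if she knows all k raw bits feeding it,
# i.e. with probability roughly (1/4)**k per position.
alice_final = {i for i in range(N)
               if all(i + j * N in alice_known for j in range(k))}

# Bob one-time-pads each database item with his copy of the final key;
# Alice can decrypt only at the few positions she knows.
database = [random.randint(0, 1) for _ in range(N)]
encrypted = [d ^ key_bit for d, key_bit in zip(database, final)]
```

In the real protocol Alice additionally announces a shift so that a bit she knows lines up with the item she wants, which is what hides her query index from Bob.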

  16. Loss-tolerant measurement-device-independent quantum private queries.

    PubMed

    Zhao, Liang-Yuan; Yin, Zhen-Qiang; Chen, Wei; Qian, Yong-Jun; Zhang, Chun-Mei; Guo, Guang-Can; Han, Zheng-Fu

    2017-01-04

Quantum private queries (QPQ) is an important cryptographic protocol aiming to protect both the user's and the database's privacy when the database is queried privately. Recently, a variety of practical QPQ protocols based on quantum key distribution (QKD) have been proposed. However, for QKD-based QPQ the user's imperfect detectors can be subjected to detector-side-channel attacks launched by the dishonest owner of the database. Here, we present a simple example that shows how the detector-blinding attack can damage the security of QKD-based QPQ completely. To remove all the known and unknown detector side channels, we propose a solution of measurement-device-independent QPQ (MDI-QPQ) with single-photon sources. The security of the proposed protocol has been analyzed under some typical attacks. Moreover, we prove that its security is completely loss independent. The results show that practical QPQ will retain the same degree of privacy as before even with seriously uncharacterized detectors.

  17. Steganography in arrhythmic electrocardiogram signal.

    PubMed

    Edward Jero, S; Ramu, Palaniappan; Ramakrishnan, S

    2015-08-01

Security and privacy of patient data is a vital requirement during exchange/storage of medical information over a communication network. A steganography method hides patient data in a cover signal to prevent unauthenticated access during data transfer. This study evaluates the performance of ECG steganography to ensure secured transmission of patient data, where an abnormal ECG signal is used as the cover signal. The novelty of this work is hiding patient data in the two-dimensional matrix of an abnormal ECG signal using a Discrete Wavelet Transform and Singular Value Decomposition based steganography method. A 2D ECG is constructed according to the Tompkins QRS detection algorithm. The missed R peaks are computed using the RR interval during 2D conversion. The abnormal ECG signals are obtained from the MIT-BIH arrhythmia database. Metrics such as Peak Signal to Noise Ratio, Percentage Residual Difference, Kullback-Leibler distance and Bit Error Rate are used to evaluate the performance of the proposed approach.
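Two of the distortion metrics named above have simple closed forms and can be sketched directly. This is a generic illustration on toy 1-D signals (a real evaluation would use the 2-D ECG matrix); the peak convention for PSNR varies between papers and is an assumption here.

```python
import math

def psnr(cover, stego, peak=None):
    # Peak Signal to Noise Ratio; peak defaults to the cover's maximum
    # amplitude (a common but not universal convention).
    peak = peak if peak is not None else max(abs(x) for x in cover)
    mse = sum((c - s) ** 2 for c, s in zip(cover, stego)) / len(cover)
    return float("inf") if mse == 0 else 10 * math.log10(peak ** 2 / mse)

def prd(cover, stego):
    # Percentage Residual Difference: residual energy relative to the
    # cover signal's energy, expressed as a percentage.
    num = sum((c - s) ** 2 for c, s in zip(cover, stego))
    den = sum(c ** 2 for c in cover)
    return 100 * math.sqrt(num / den)

cover = [1.0, 2.0, 3.0, 4.0]   # toy cover signal
stego = [1.0, 2.1, 2.9, 4.0]   # slightly distorted by embedding
```

Higher PSNR and lower PRD both indicate that the embedded patient data perturbed the cover ECG less, which is the imperceptibility criterion the study measures.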

  18. Secure Enclaves: An Isolation-centric Approach for Creating Secure High Performance Computing Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aderholdt, Ferrol; Caldwell, Blake A.; Hicks, Susan Elaine

High performance computing environments are often used for a wide variety of workloads ranging from simulation, data transformation and analysis, and complex workflows to name just a few. These systems may process data at various security levels but in so doing are often enclaved at the highest security posture. This approach places significant restrictions on the users of the system even when processing data at a lower security level and exposes data at higher levels of confidentiality to a much broader population than otherwise necessary. The traditional approach of isolation, while effective in establishing security enclaves, poses significant challenges for the use of shared infrastructure in HPC environments. This report details current state-of-the-art in virtualization, reconfigurable network enclaving via Software Defined Networking (SDN), and storage architectures and bridging techniques for creating secure enclaves in HPC environments.

  19. Security in the CernVM File System and the Frontier Distributed Database Caching System

    NASA Astrophysics Data System (ADS)

    Dykstra, D.; Blomer, J.

    2014-06-01

    Both the CernVM File System (CVMFS) and the Frontier Distributed Database Caching System (Frontier) distribute centrally updated data worldwide for LHC experiments using http proxy caches. Neither system provides privacy or access control on reading the data, but both control access to updates of the data and can guarantee the authenticity and integrity of the data transferred to clients over the internet. CVMFS has since its early days required digital signatures and secure hashes on all distributed data, and recently Frontier has added X.509-based authenticity and integrity checking. In this paper we detail and compare the security models of CVMFS and Frontier.

  20. Research on high availability architecture of SQL and NoSQL

    NASA Astrophysics Data System (ADS)

    Wang, Zhiguo; Wei, Zhiqiang; Liu, Hao

    2017-03-01

With the advent of the era of big data, the amount and importance of data have increased dramatically. SQL databases continue to improve in performance and scalability, but more and more companies tend to adopt NoSQL databases because they offer a simpler data model and stronger scaling capacity. Almost all database designers, for SQL and NoSQL systems alike, aim to improve performance and ensure availability through a sound architecture that reduces the effects of software and hardware failures, so that they can provide a better experience for their customers. In this paper, I mainly discuss the highly available architectures of MySQL, MongoDB, and Redis that have been deployed in practical application environments, and design a hybrid architecture.
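A hybrid SQL/NoSQL deployment of the kind discussed above commonly pairs the stores in a cache-aside pattern: Redis absorbs reads while MySQL remains the durable system of record. The sketch below uses plain dicts as stand-ins for both stores; client libraries, replication, and failover logic are deliberately omitted.

```python
# Cache-aside sketch: dicts stand in for MySQL (durable) and Redis (fast).
sql_store = {"user:1": "alice"}   # system of record (MySQL role)
cache = {}                        # volatile fast path (Redis role)
cache_hits = 0

def read(key):
    global cache_hits
    if key in cache:              # fast path: served from cache
        cache_hits += 1
        return cache[key]
    value = sql_store.get(key)    # miss: fall back to the durable store
    if value is not None:
        cache[key] = value        # populate cache for subsequent reads
    return value

def write(key, value):
    sql_store[key] = value        # write to the system of record first
    cache.pop(key, None)          # invalidate rather than update the cache

first = read("user:1")    # miss, served from SQL and cached
second = read("user:1")   # hit, served from cache
write("user:1", "bob")    # update invalidates the cached entry
third = read("user:1")    # miss again, re-read from SQL
```

Invalidate-on-write (rather than update-on-write) is the usual choice because it avoids a race where a stale value overwrites a newer one in the cache.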

  1. A Web-Based GIS for Reporting Water Usage in the High Plains Underground Water Conservation District

    NASA Astrophysics Data System (ADS)

    Jia, M.; Deeds, N.; Winckler, M.

    2012-12-01

    The High Plains Underground Water Conservation District (HPWD) is the largest and oldest of the Texas water conservation districts, and oversees approximately 1.7 million irrigated acres. Recent rule changes have motivated HPWD to develop a more automated system to allow owners and operators to report well locations, meter locations, meter readings, the association between meters and wells, and contiguous acres. INTERA, Inc. has developed a web-based interactive system for HPWD water users to report water usage and for the district to better manage its water resources. The HPWD web management system utilizes state-of-the-art GIS techniques, including cloud-based Amazon EC2 virtual machine, ArcGIS Server, ArcSDE and ArcGIS Viewer for Flex, to support web-based water use management. The system enables users to navigate to their area of interest using a well-established base-map and perform a variety of operations and inquiries against their spatial features. The application currently has six components: user privilege management, property management, water meter registration, area registration, meter-well association and water use report. The system is composed of two main databases: spatial database and non-spatial database. With the help of Adobe Flex application at the front end and ArcGIS Server as the middle-ware, the spatial feature geometry and attributes update will be reflected immediately in the back end. As a result, property owners, along with the HPWD staff, collaborate together to weave the fabric of the spatial database. Interactions between the spatial and non-spatial databases are established by Windows Communication Foundation (WCF) services to record water-use report, user-property associations, owner-area associations, as well as meter-well associations. Mobile capabilities will be enabled in the near future for field workers to collect data and synchronize them to the spatial database. 
The entire solution is built on a highly scalable cloud server to dynamically allocate the computational resources so as to reduce the cost of security and hardware maintenance. In addition to the default capabilities provided by ESRI, customizations include 1) enabling interactions between spatial and non-spatial databases, 2) providing role-based feature editing, 3) dynamically filtering spatial features on the map based on user accounts and 4) comprehensive data validation.

  2. Semantic-JSON: a lightweight web service interface for Semantic Web contents integrating multiple life science databases

    PubMed Central

    Kobayashi, Norio; Ishii, Manabu; Takahashi, Satoshi; Mochizuki, Yoshiki; Matsushima, Akihiro; Toyoda, Tetsuro

    2011-01-01

Global cloud frameworks for bioinformatics research databases have become huge and heterogeneous, and solutions face competing challenges of cross-integration, retrieval, security and openness. To address this, as of March 2011 organizations including RIKEN published 192 mammalian, plant and protein life sciences databases having 8.2 million data records, integrated as Linked Open or Private Data (LOD/LPD) using SciNetS.org, the Scientists' Networking System. The huge quantity of linked data this database integration framework covers is based on the Semantic Web, where researchers collaborate by managing metadata across public and private databases in a secured data space. This outstripped the data query capacity of existing interface tools like SPARQL. Actual research also requires specialized tools for data analysis using raw original data. To solve these challenges, in December 2009 we developed the lightweight Semantic-JSON interface to access each fragment of linked and raw life sciences data securely under the control of programming languages popularly used by bioinformaticians such as Perl and Ruby. Researchers successfully used the interface across 28 million semantic relationships for biological applications including genome design, sequence processing, inference over phenotype databases, full-text search indexing and human-readable contents like ontology and LOD tree viewers. Semantic-JSON services of SciNetS.org are provided at http://semanticjson.org. PMID:21632604

  3. MV-OPES: Multivalued-Order Preserving Encryption Scheme: A Novel Scheme for Encrypting Integer Value to Many Different Values

    NASA Astrophysics Data System (ADS)

    Kadhem, Hasan; Amagasa, Toshiyuki; Kitagawa, Hiroyuki

Encryption can provide strong security for sensitive data against inside and outside attacks. This is especially true in the “Database as Service” model, where confidentiality and privacy are important issues for the client. In fact, existing encryption approaches are vulnerable to a statistical attack because each value is encrypted to another fixed value. This paper presents a novel database encryption scheme called MV-OPES (Multivalued — Order Preserving Encryption Scheme), which allows privacy-preserving queries over encrypted databases with an improved security level. Our idea is to encrypt a value to multiple different values to prevent statistical attacks. At the same time, MV-OPES preserves the order of the integer values to allow comparison operations to be directly applied on encrypted data. Using calculated distance (range), we propose a novel method that allows a join query between relations based on inequality over encrypted values. We also present techniques to offload query execution load to a database server as much as possible, thereby making better use of server resources in a database outsourcing environment. Our scheme can easily be integrated with current database systems as it is designed to work with existing indexing structures. It is robust against statistical attack and the estimation of true values. MV-OPES experiments show that security for sensitive data can be achieved with reasonable overhead, establishing the practicability of the scheme.
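The core idea, one plaintext mapping to many ciphertexts while order survives, can be illustrated with a toy construction. This is not the actual MV-OPES scheme (which derives its intervals from secret keys and supports inequality joins): it simply assigns each integer a disjoint ciphertext interval and picks a random point inside it.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# Toy multivalued order-preserving mapping (NOT the real MV-OPES):
# plaintext v owns the ciphertext interval [v*B, (v+1)*B), so the same
# value encrypts to many ciphertexts yet order is preserved.
B = 1000  # interval width; a real scheme derives this from secret keys

def encrypt(v):
    return v * B + random.randrange(B)  # random offset inside v's interval

def decrypt(c):
    return c // B

c1, c2 = encrypt(42), encrypt(42)  # same plaintext, independent ciphertexts
```

Because ciphertexts of the same plaintext vary, frequency-based statistical attacks on equality patterns are blunted, while range predicates (`<`, `>`) can still be evaluated directly on ciphertexts, which is the trade-off the paper exploits.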

  4. 75 FR 23311 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-03

    ... instead of using an intermediary. The number of custodians is from Lipper Inc.'s Lana Database. Securities... SECURITIES AND EXCHANGE COMMISSION [Rule 17f-4; SEC File No. 270-232; OMB Control No. 3235-0225] Proposed Collection; Comment Request Upon Written Request, Copies Available From: Securities and Exchange...

  5. 78 FR 55274 - Privacy Act of 1974; Department of Homeland Security/Transportation Security Administration-DHS...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-10

    ... enforcement, immigration, and intelligence databases, including a fingerprint-based criminal history records... boarding pass printing instruction. If the passenger's identifying information matches the entry on the TSA... enforcement, immigration, intelligence, or other homeland security functions. In addition, TSA may share...

  6. 39 CFR 501.11 - Reporting Postage Evidencing System security weaknesses.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... postal administration; or has been submitted for approval by the provider to the Postal Service or other foreign postal administration(s). (2) All potential security weaknesses or methods of tampering with the... security breaches of the Computerized Meter Resetting System (CMRS) or databases housing confidential...

  7. 39 CFR 501.11 - Reporting Postage Evidencing System security weaknesses.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... postal administration; or has been submitted for approval by the provider to the Postal Service or other foreign postal administration(s). (2) All potential security weaknesses or methods of tampering with the... security breaches of the Computerized Meter Resetting System (CMRS) or databases housing confidential...

  8. A dedicated database system for handling multi-level data in systems biology.

    PubMed

    Pornputtapong, Natapol; Wanichthanarak, Kwanjeera; Nilsson, Avlant; Nookaew, Intawat; Nielsen, Jens

    2014-01-01

Advances in high-throughput technologies have enabled extensive generation of multi-level omics data. These data are crucial for systems biology research, though they are complex, heterogeneous, highly dynamic, incomplete and distributed among public databases. This leads to difficulties in data accessibility and often results in errors when data are merged and integrated from varied resources. Therefore, integration and management of systems biological data remain very challenging. To overcome this, we designed and developed a dedicated database system that can serve and solve the vital issues in data management and thereby facilitate data integration, modeling and analysis in systems biology within a single database. In addition, a yeast data repository was implemented as an integrated database environment which is operated by the database system. Two applications were implemented to demonstrate extensibility and utilization of the system. Both illustrate how the user can access the database via the web query function and implemented scripts. These scripts are specific for two sample cases: 1) Detecting the pheromone pathway in protein interaction networks; and 2) Finding metabolic reactions regulated by Snf1 kinase. In this study we present the design of a database system which offers an extensible environment to efficiently capture the majority of biological entities and relations encountered in systems biology. Critical functions and control processes were designed and implemented to ensure consistent, efficient, secure and reliable transactions. The two sample cases on the yeast integrated data clearly demonstrate the value of a single database environment for systems biology research.

  9. Outsourcing Security Services for Low Performance Portable Devices

    NASA Astrophysics Data System (ADS)

    Szentgyörgyi, Attila; Korn, András

    The number of portable devices using wireless network technologies is on the rise. Some of these devices are incapable of, or at a disadvantage at using secure Internet services, because secure communication often requires comparatively high computing capacity. In this paper, we propose a solution which can be used to offer secure network services for low performance portable devices without severely degrading data transmission rates. We also show that using our approach these devices can utilize some secure network services which were so far unavailable to them due to a lack of software support. In order to back up our claims, we present performance measurement results obtained in a test network.

  10. A New Approach To Secure Federated Information Bases Using Agent Technology.

    ERIC Educational Resources Information Center

    Weippi, Edgar; Klug, Ludwig; Essmayr, Wolfgang

    2003-01-01

    Discusses database agents which can be used to establish federated information bases by integrating heterogeneous databases. Highlights include characteristics of federated information bases, including incompatible database management systems, schemata, and frequently changing context; software agent technology; Java agents; system architecture;…

  11. Image Hashes as Templates for Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janik, Tadeusz; Jarman, Kenneth D.; Robinson, Sean M.

    2012-07-17

Imaging systems can provide measurements that confidently assess characteristics of nuclear weapons and dismantled weapon components, and such assessment will be needed in future verification for arms control. Yet imaging is often viewed as too intrusive, raising concern about the ability to protect sensitive information. In particular, the prospect of using image-based templates for verifying the presence or absence of a warhead, or of the declared configuration of fissile material in storage, may be rejected out-of-hand as being too vulnerable to violation of information barrier (IB) principles. Development of a rigorous approach for generating and comparing reduced-information templates from images, and assessing the security, sensitivity, and robustness of verification using such templates, are needed to address these concerns. We discuss our efforts to develop such a rigorous approach based on a combination of image-feature extraction and encryption, utilizing hash functions to confirm proffered declarations, providing strong classified data security while maintaining high confidence for verification. The proposed work is focused on developing secure, robust, tamper-sensitive and automatic techniques that may enable the comparison of non-sensitive hashed image data outside an IB. It is rooted in research on so-called perceptual hash functions for image comparison, at the interface of signal/image processing, pattern recognition, cryptography, and information theory. Such perceptual or robust image hashing (which, strictly speaking, is not truly cryptographic hashing) has extensive application in content authentication and information retrieval, database search, and security assurance. Applying and extending the principles of perceptual hashing to imaging for arms control, we propose techniques that are sensitive to altering, forging and tampering of the imaged object yet robust and tolerant to content-preserving image distortions and noise.
Ensuring that the information contained in the hashed image data (available out-of-IB) cannot be used to extract sensitive information about the imaged object is of primary concern. Thus the techniques are characterized by high unpredictability to guarantee security. We will present an assessment of the performance of our techniques with respect to security, sensitivity and robustness on the basis of a methodical and mathematically precise framework.
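The perceptual-hashing principle referenced above can be sketched with the simplest member of that family, an average hash: reduce an image to a coarse grid, threshold each cell against the mean, and compare hashes by Hamming distance. This is a generic textbook technique, not the authors' classified method; the 4x4 lists below are stand-ins for real image data.

```python
# Average-hash sketch: one bit per cell, thresholded at the mean.
# Robust to mild noise, sensitive to structural change, and the hash
# reveals only coarse brightness layout, not the full image.
def average_hash(pixels):
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(h1, h2):
    # Number of differing bits; small distance = perceptually similar.
    return sum(a != b for a, b in zip(h1, h2))

bright_corner = [[9, 9, 0, 0], [9, 9, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]]
noisy_copy    = [[8, 9, 1, 0], [9, 8, 0, 1], [0, 1, 0, 0], [1, 0, 0, 0]]
tampered      = [[0, 0, 0, 0], [0, 0, 0, 0], [0, 0, 9, 9], [0, 0, 9, 9]]
```

The noisy copy hashes identically to the original (content-preserving distortion), while moving the bright region to another corner flips many bits (tampering), which is exactly the robust-yet-sensitive behavior the verification application needs.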

  12. jSPyDB, an open source database-independent tool for data management

    NASA Astrophysics Data System (ADS)

    Pierro, Giuseppe Antonio; Cavallari, Francesca; Di Guida, Salvatore; Innocente, Vincenzo

    2011-12-01

Nowadays, the number of commercial tools available for accessing databases, built on Java or .Net, is increasing. However, many of these applications have several drawbacks: usually they are not open-source, they provide interfaces only to a specific kind of database, and they are platform-dependent and very CPU- and memory-intensive. jSPyDB is a free web-based tool written using Python and Javascript. It relies on jQuery and Python libraries, and is intended to provide a simple handler to different database technologies inside a local web browser. Such a tool, exploiting fast access libraries such as SQLAlchemy, is easy to install and configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a backend server. Only the server is able to connect to the different databases for the purposes of performing data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. Moreover, thanks to jQuery libraries, this tool supports export of data in different formats, such as XML and JSON. Finally, by using a set of pre-defined functions, users are allowed to create their customized views for a better data visualization. In this way, we optimize the performance of database servers by avoiding short connections and concurrent sessions. In addition, security is enforced since we do not provide users the possibility to directly execute any SQL statement.
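The server-side path such a tool provides, query the database, then hand the client a JSON payload, can be sketched as follows. This is an assumed illustration, not jSPyDB's code: sqlite3 stands in for the SQLAlchemy-backed access layer, and the table and column names are hypothetical.

```python
import json
import sqlite3

# sqlite3 and this schema stand in for the tool's database back end;
# jSPyDB itself uses SQLAlchemy against multiple database technologies.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE runs (id INTEGER, detector TEXT)")
db.executemany("INSERT INTO runs VALUES (?, ?)", [(1, "ECAL"), (2, "HCAL")])

def export_json(query, params=()):
    # Only parameterized, server-vetted queries run: users never execute
    # raw SQL directly, matching the security posture described above.
    cur = db.execute(query, params)
    cols = [d[0] for d in cur.description]
    return json.dumps([dict(zip(cols, row)) for row in cur.fetchall()])

payload = export_json("SELECT id, detector FROM runs WHERE id = ?", (2,))
```

Keeping one long-lived server-side connection and serializing results for the browser is also what lets such a tool avoid the short connections and concurrent sessions the abstract mentions.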

  13. 2017 Joint Annual NDIA/AIA Industrial Security Committee Fall Conference

    DTIC Science & Technology

    2017-11-15

    beyond credit data to offer the insights that government professionals need to make informed decisions and ensure citizen safety, manage compliance...business that provides information technology and professional services. We specialize in managing business processes and systems integration for both... Information Security System ISFD Industrial Security Facilities Database OBMS ODAA Business Management System STEPP Security, Training, Education and

  14. Cyber Incidents Involving Control Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robert J. Turk

    2005-10-01

The Analysis Function of the US-CERT Control Systems Security Center (CSSC) at the Idaho National Laboratory (INL) has prepared this report to document cyber security incidents for use by the CSSC. The description and analysis of incidents reported herein support three CSSC tasks: establishing a business case; increasing security awareness and private and corporate participation related to enhanced cyber security of control systems; and providing informational material to support model development and prioritize activities for CSSC. The stated mission of CSSC is to reduce vulnerability of critical infrastructure to cyber attack on control systems. As stated in the Incident Management Tool Requirements (August 2005) ''Vulnerability reduction is promoted by risk analysis that tracks actual risk, emphasizes high risk, determines risk reduction as a function of countermeasures, tracks increase of risk due to external influence, and measures success of the vulnerability reduction program''. Process control and Supervisory Control and Data Acquisition (SCADA) systems, with their reliance on proprietary networks and hardware, have long been considered immune to the network attacks that have wreaked so much havoc on corporate information systems. New research indicates this confidence is misplaced--the move to open standards such as Ethernet, Transmission Control Protocol/Internet Protocol, and Web technologies is allowing hackers to take advantage of the control industry's unawareness. Much of the available information about cyber incidents represents a characterization as opposed to an analysis of events. The lack of good analyses reflects an overall weakness in reporting requirements as well as the fact that to date there have been very few serious cyber attacks on control systems. Most companies prefer not to share cyber attack incident data because of potential financial repercussions.
Uniform reporting requirements will do much to make this information available to Department of Homeland Security (DHS) and others who require it. This report summarizes the rise in frequency of cyber attacks, describes the perpetrators, and identifies the means of attack. This type of analysis, when used in conjunction with vulnerability analyses, can be used to support a proactive approach to prevent cyber attacks. CSSC will use this document to evolve a standardized approach to incident reporting and analysis. This document will be updated as needed to record additional event analyses and insights regarding incident reporting. This report represents 120 cyber security incidents documented in a number of sources, including: the British Columbia Institute of Technology (BCIT) Industrial Security Incident Database, the 2003 CSI/FBI Computer Crime and Security Survey, the KEMA, Inc., Database, Lawrence Livermore National Laboratory, the Energy Incident Database, the INL Cyber Incident Database, and other open-source data. The National Memorial Institute for the Prevention of Terrorism (MIPT) database was also interrogated but, interestingly, failed to yield any cyber attack incidents. The results of this evaluation indicate that historical evidence provides insight into control system related incidents or failures; however, the limited available information provides little support to future risk estimates. The documented case history shows that activity has increased significantly since 1988. The majority of incidents come from the Internet by way of opportunistic viruses, Trojans, and worms, but a surprisingly large number are directed acts of sabotage. A substantial number of confirmed, unconfirmed, and potential events that directly or potentially impact control systems worldwide are also identified. Twelve selected cyber incidents are presented at the end of this report as examples of the documented case studies (see Appendix B).

  15. 77 FR 72335 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-05

    ... computer networks, systems, or databases. The records contain the individual's name; social security number... control and track access to DLA-controlled networks, computer systems, and databases. The records may also...

  16. Standards for Clinical Grade Genomic Databases.

    PubMed

    Yohe, Sophia L; Carter, Alexis B; Pfeifer, John D; Crawford, James M; Cushman-Vokoun, Allison; Caughron, Samuel; Leonard, Debra G B

    2015-11-01

    Next-generation sequencing performed in a clinical environment must meet clinical standards, which requires reproducibility of all aspects of the testing. Clinical-grade genomic databases (CGGDs) are required to classify a variant and to assist in the professional interpretation of clinical next-generation sequencing. Applying quality laboratory standards to the reference databases used for sequence-variant interpretation presents a new challenge for validation and curation. To define CGGD and the categories of information contained in CGGDs and to frame recommendations for the structure and use of these databases in clinical patient care. Members of the College of American Pathologists Personalized Health Care Committee reviewed the literature and existing state of genomic databases and developed a framework for guiding CGGD development in the future. Clinical-grade genomic databases may provide different types of information. This work group defined 3 layers of information in CGGDs: clinical genomic variant repositories, genomic medical data repositories, and genomic medicine evidence databases. The layers are differentiated by the types of genomic and medical information contained and the utility in assisting with clinical interpretation of genomic variants. Clinical-grade genomic databases must meet specific standards regarding submission, curation, and retrieval of data, as well as the maintenance of privacy and security. These organizing principles for CGGDs should serve as a foundation for future development of specific standards that support the use of such databases for patient care.

  17. High performance semantic factoring of giga-scale semantic graph databases.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    al-Saffar, Sinan; Adolf, Bob; Haglin, David

    2010-10-01

As semantic graph database technology grows to address components ranging from extant large triple stores to SPARQL endpoints over SQL-structured relational databases, it will become increasingly important to be able to bring high performance computational resources to bear on their analysis, interpretation, and visualization, especially with respect to their innate semantic structure. Our research group built a novel high performance hybrid system comprising computational capability for semantic graph database processing utilizing the large multithreaded architecture of the Cray XMT platform, conventional clusters, and large data stores. In this paper we describe that architecture and present the results of deploying it for the analysis of the Billion Triple dataset with respect to its semantic factors, including basic properties, connected components, namespace interaction, and typed paths.

  18. Novel Method for Recruiting Representative At-Risk Individuals into Cancer Prevention Trials: Online Health Risk Assessment in Employee Wellness Programs.

    PubMed

    Hui, Siu-Kuen Azor; Miller, Suzanne M; Hazuda, Leah; Engelman, Kimberly; Ellerbeck, Edward F

    2016-09-01

Participation in cancer prevention trials (CPT) is lower than 3 % among high-risk healthy individuals, and racial/ethnic minorities are the most under-represented. Novel recruitment strategies are therefore needed. Online health risk assessment (HRA) serves as a gateway component of nearly all employee wellness programs (EWPs) and may be a missed opportunity. This study aimed to explore employees' interest, willingness, motivators, and barriers to releasing their HRA responses to an external secure research database for recruitment purposes. We used qualitative research methods (focus group and individual interviews) to examine employees' interest and willingness in releasing their online HRA responses to an external, secure database to register as potential CPT participants. Fifteen structured interviews (40 % of study participants were of racial/ethnic minority) were conducted, and responses reached saturation after four interviews. All employees showed interest and willingness to release their online HRA responses to register as a potential CPT participant. Content analyses revealed that 91 % of participants were motivated to do so, and the major motivators were to (1) obtain help in finding personally relevant prevention trials, (2) help people they know who are affected by cancer, and/or (3) increase knowledge about CPT. A subset of participants (45 %) expressed barriers to releasing their HRA responses due to concerns about the credibility and security of the external database. Online HRA may be a feasible but underutilized recruitment method for cancer prevention trials. EWP-sponsored HRA shows promise for the development of a large, centralized registry of racially/ethnically representative potential CPT participants.

  19. Novel method for recruiting representative at-risk individuals into cancer prevention trials: on-line health risk assessment in employee wellness programs

    PubMed Central

    Hui, Siu-kuen Azor; Miller, Suzanne M.; Hazuda, Leah; Engelman, Kimberly; Ellerbeck, Edward F.

    2015-01-01

Participation in cancer prevention trials (CPT) is lower than 3% among high-risk healthy individuals, and racial/ethnic minorities are the most under-represented. Novel recruitment strategies are therefore needed. On-line health risk assessment (HRA) serves as a gateway component of nearly all employee wellness programs (EWP) and may be a missed opportunity. This study aimed to explore employees’ interest, willingness, motivators, and barriers to releasing their HRA responses to an external secure research database for recruitment purposes. We used qualitative research methods (focus group and individual interviews) to examine employees’ interest and willingness in releasing their on-line HRA responses to an external, secure database to register as potential CPT participants. Fifteen structured interviews (40% of study participants were of racial/ethnic minority) were conducted, and responses reached saturation after four interviews. All employees showed interest and willingness to release their on-line HRA responses to register as a potential CPT participant. Content analyses revealed that 91% of participants were motivated to do so, and the major motivators were to: 1) obtain help in finding personally relevant prevention trials, 2) help people they know who are affected by cancer, and/or 3) increase knowledge about CPT. A subset of participants (45%) expressed barriers to releasing their HRA responses due to concerns about the credibility and security of the external database. On-line HRA may be a feasible but underutilized recruitment method for cancer prevention trials. EWP-sponsored HRA shows promise for the development of a large, centralized registry of racially/ethnically representative potential CPT participants. PMID:26507744

  20. [Quality management and participation into clinical database].

    PubMed

    Okubo, Suguru; Miyata, Hiroaki; Tomotaki, Ai; Motomura, Noboru; Murakami, Arata; Ono, Minoru; Iwanaka, Tadashi

    2013-07-01

Quality management is necessary for establishing a useful clinical database in cooperation with healthcare professionals and facilities. The main management activities are 1) progress management of data entry, 2) liaison with database participants (healthcare professionals), and 3) modification of the data collection form. In addition, healthcare facilities are expected to consider ethical issues and information security when joining clinical databases. Database participants should confirm ethical review board approval and the availability of a consultation service for patients.

  1. 14 CFR 221.180 - Requirements for electronic filing of tariffs.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... of Transportation, for the maintenance and security of the on-line tariff database. (b) No carrier or... to its on-line tariff database. The filer shall be responsible for the transportation, installation... installation or maintenance. (3) The filer shall provide public access to its on-line tariff database, at...

  2. 14 CFR 221.180 - Requirements for electronic filing of tariffs.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... of Transportation, for the maintenance and security of the on-line tariff database. (b) No carrier or... to its on-line tariff database. The filer shall be responsible for the transportation, installation... installation or maintenance. (3) The filer shall provide public access to its on-line tariff database, at...

  3. 14 CFR 221.180 - Requirements for electronic filing of tariffs.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... of Transportation, for the maintenance and security of the on-line tariff database. (b) No carrier or... to its on-line tariff database. The filer shall be responsible for the transportation, installation... installation or maintenance. (3) The filer shall provide public access to its on-line tariff database, at...

  4. 14 CFR 221.180 - Requirements for electronic filing of tariffs.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... of Transportation, for the maintenance and security of the on-line tariff database. (b) No carrier or... to its on-line tariff database. The filer shall be responsible for the transportation, installation... installation or maintenance. (3) The filer shall provide public access to its on-line tariff database, at...

  5. 14 CFR 221.180 - Requirements for electronic filing of tariffs.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... of Transportation, for the maintenance and security of the on-line tariff database. (b) No carrier or... to its on-line tariff database. The filer shall be responsible for the transportation, installation... installation or maintenance. (3) The filer shall provide public access to its on-line tariff database, at...

  6. Getting acquainted: Actor and partner effects of attachment and temperament on young children's peer behavior.

    PubMed

    McElwain, Nancy L; Holland, Ashley S; Engle, Jennifer M; Ogolsky, Brian G

    2014-06-01

    Guided by a dyadic view of children's peer behavior, this study assessed actor and partner effects of attachment security and temperament on young children's behavior with an unfamiliar peer. At 33 months of age, child-mother attachment security was assessed via a modified Strange Situation procedure, and parents reported on child temperament (anger proneness and social fearfulness). At 39 months, same-sex children (N = 114, 58 girls) were randomly paired, and child dyads were observed during 3 laboratory visits occurring over 1 month. Actor-partner interdependence models, tested via multilevel modeling, revealed that actor security, partner anger proneness, and acquaintanceship (e.g., initial vs. later visits) combined to predict child behavior. Actor security predicted more responsiveness to the new peer partner at the initial visit, regardless of partner anger proneness. Actor security continued to predict responsiveness at the 2nd and 3rd visits when partner anger was low, but these associations were nonsignificant when partner anger was high. Actor security also predicted a less controlling assertiveness style at the initial visit when partner anger proneness was high, yet this association was nonsignificant by the final visit. The findings shed light on the dynamic nature of young children's peer behavior and indicate that attachment security is related to behavior in expected ways during initial interactions with a new peer, but may change as children become acquainted. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  7. Multifeature-based high-resolution palmprint recognition.

    PubMed

    Dai, Jifeng; Zhou, Jie

    2011-05-01

Palmprint is a promising biometric feature for use in access control and forensic applications. Previous research on palmprint recognition mainly concentrates on low-resolution (about 100 ppi) palmprints. But for high-security applications (e.g., forensic usage), high-resolution palmprints (500 ppi or higher) are required, from which more useful information can be extracted. In this paper, we propose a novel recognition algorithm for high-resolution palmprint. The main contributions of the proposed algorithm include the following: 1) Use of multiple features, namely, minutiae, density, orientation, and principal lines, for palmprint recognition to significantly improve the matching performance of the conventional algorithm. 2) Design of a quality-based and adaptive orientation field estimation algorithm which performs better than the existing algorithm in regions with a large number of creases. 3) Use of a novel fusion scheme for an identification application which performs better than conventional fusion methods, e.g., the weighted sum rule, SVMs, or the Neyman-Pearson rule. In addition, we analyze the discriminative power of different feature combinations and find that density is very useful for palmprint recognition. Experimental results on the database containing 14,576 full palmprints show that the proposed algorithm achieves good performance. In the case of verification, the recognition system's False Rejection Rate (FRR) is 16 percent, which is 17 percent lower than the best existing algorithm at a False Acceptance Rate (FAR) of 10(-5), while in the identification experiment, the rank-1 live-scan partial palmprint recognition rate is improved from 82.0 to 91.7 percent.
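
    For context, the conventional weighted-sum rule that the paper's fusion scheme is compared against can be sketched as below. This is a generic baseline, not the authors' algorithm; the score ranges, weights, and function names are illustrative assumptions.

    ```python
    def minmax(score, lo, hi):
        # Map a raw matcher score into [0, 1] given its observed range
        return (score - lo) / (hi - lo)

    def weighted_sum_fusion(scores, ranges, weights):
        """Conventional weighted-sum fusion of per-feature match scores
        (e.g., one score each for minutiae, density, and orientation)."""
        assert abs(sum(weights) - 1.0) < 1e-9  # weights form a convex combination
        return sum(w * minmax(s, lo, hi)
                   for s, (lo, hi), w in zip(scores, ranges, weights))
    ```

    The fused score is then thresholded for verification or ranked for identification; the paper's contribution is a scheme that outperforms this simple rule.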

  8. 6 CFR 11.6 - Reporting debts.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 6 Domestic Security 1 2010-01-01 2010-01-01 false Reporting debts. 11.6 Section 11.6 Domestic Security DEPARTMENT OF HOMELAND SECURITY, OFFICE OF THE SECRETARY CLAIMS § 11.6 Reporting debts. DHS will report delinquent debts to credit bureaus and other automated databases in accordance with 31 U.S.C. 3711...

  9. 14 CFR 158.20 - Submission of required documents.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... due to the security process. (b) Once the database development is completed with air carrier capability, public agencies and air carriers may use the FAA's national PFC database to post their required...

  10. 14 CFR 158.20 - Submission of required documents.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... due to the security process. (b) Once the database development is completed with air carrier capability, public agencies and air carriers may use the FAA's national PFC database to post their required...

  11. 14 CFR 158.20 - Submission of required documents.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... due to the security process. (b) Once the database development is completed with air carrier capability, public agencies and air carriers may use the FAA's national PFC database to post their required...

  12. 14 CFR 158.20 - Submission of required documents.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... due to the security process. (b) Once the database development is completed with air carrier capability, public agencies and air carriers may use the FAA's national PFC database to post their required...

  13. 14 CFR 158.20 - Submission of required documents.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... due to the security process. (b) Once the database development is completed with air carrier capability, public agencies and air carriers may use the FAA's national PFC database to post their required...

  14. Securing palmprint authentication systems using spoof detection approach

    NASA Astrophysics Data System (ADS)

    Kanhangad, Vivek; Kumar, Abhishek

    2013-12-01

    Automated human authentication using features extracted from palmprint images has been studied extensively in the literature. Primary focus of the studies thus far has been the improvement of matching performance. As more biometric systems get deployed for wide range of applications, the threat of impostor attacks on these systems is on the rise. The most common among various types of attacks is the sensor level spoof attack using fake hands created using different materials. This paper investigates an approach for securing palmprint based biometric systems against spoof attacks that use photographs of the human hand for circumventing the system. The approach is based on the analysis of local texture patterns of acquired palmprint images for extracting discriminatory features. A trained binary classifier utilizes the discriminating information to determine if the input image is of real hand or a fake one. Experimental results, using 611 palmprint images corresponding to 100 subjects in the publicly available IITD palmprint image database, show that 1) palmprint authentication systems are highly vulnerable to spoof attacks and 2) the proposed spoof detection approach is effective for discriminating between real and fake image samples. In particular, the proposed approach achieves the best classification accuracy of 97.35%.
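
    The local-texture analysis described above can be illustrated with a minimal local binary pattern (LBP) feature extractor, a standard member of the family of texture descriptors used for spoof detection. This is a generic sketch, not the authors' exact implementation; the function name and test image are illustrative.

    ```python
    import numpy as np

    def lbp_histogram(img):
        """Normalized 256-bin histogram of 8-neighbour local binary patterns.

        Each interior pixel is compared against its 8 neighbours; the
        comparison bits are packed into a 0-255 code, and the histogram of
        codes serves as a texture feature vector for a downstream binary
        (real-vs-fake) classifier.
        """
        img = np.asarray(img, dtype=np.int32)
        h, w = img.shape
        center = img[1:-1, 1:-1]
        code = np.zeros_like(center, dtype=np.uint8)
        neighbours = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                      (1, 1), (1, 0), (1, -1), (0, -1)]
        for bit, (dy, dx) in enumerate(neighbours):
            shifted = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
            code |= ((shifted >= center).astype(np.uint8) << bit)
        hist, _ = np.histogram(code, bins=256, range=(0, 256))
        return hist / hist.sum()
    ```

    A histogram like this would then be fed to a trained binary classifier (e.g., an SVM) that separates real-hand texture from the flatter texture of a photographed hand.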

  15. Signal and image processing algorithm performance in a virtual and elastic computing environment

    NASA Astrophysics Data System (ADS)

    Bennett, Kelly W.; Robertson, James

    2013-05-01

    The U.S. Army Research Laboratory (ARL) supports the development of classification, detection, tracking, and localization algorithms using multiple sensing modalities including acoustic, seismic, E-field, magnetic field, PIR, and visual and IR imaging. Multimodal sensors collect large amounts of data in support of algorithm development. The resulting large amount of data, and their associated high-performance computing needs, increases and challenges existing computing infrastructures. Purchasing computer power as a commodity using a Cloud service offers low-cost, pay-as-you-go pricing models, scalability, and elasticity that may provide solutions to develop and optimize algorithms without having to procure additional hardware and resources. This paper provides a detailed look at using a commercial cloud service provider, such as Amazon Web Services (AWS), to develop and deploy simple signal and image processing algorithms in a cloud and run the algorithms on a large set of data archived in the ARL Multimodal Signatures Database (MMSDB). Analytical results will provide performance comparisons with existing infrastructure. A discussion on using cloud computing with government data will discuss best security practices that exist within cloud services, such as AWS.

  16. Biometric template transformation: a security analysis

    NASA Astrophysics Data System (ADS)

    Nagar, Abhishek; Nandakumar, Karthik; Jain, Anil K.

    2010-01-01

One of the critical steps in designing a secure biometric system is protecting the templates of the users that are stored either in a central database or on smart cards. If a biometric template is compromised, it leads to serious security and privacy threats because, unlike passwords, it is not possible for a legitimate user to revoke his biometric identifiers and switch to another set of uncompromised identifiers. One methodology for biometric template protection is the template transformation approach, where the template, consisting of the features extracted from the biometric trait, is transformed using parameters derived from a user-specific password or key. Only the transformed template is stored, and matching is performed directly in the transformed domain. In this paper, we formally investigate the security strength of template transformation techniques and define six metrics that facilitate a holistic security evaluation. Furthermore, we analyze the security of two well-known template transformation techniques, namely, Biohashing and cancelable fingerprint templates, based on the proposed metrics. Our analysis indicates that both these schemes are vulnerable to intrusion and linkage attacks because it is relatively easy to obtain either a close approximation of the original template (Biohashing) or a pre-image of the transformed template (cancelable fingerprints). We argue that the security strength of template transformation techniques must also consider the computational complexity of obtaining a complete pre-image of the transformed template in addition to the complexity of recovering the original biometric template.
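
    As a rough illustration of the Biohashing transform analyzed above: a real-valued feature vector is projected onto a key-seeded random orthonormal basis and binarized. The dimensions, seeding mechanism, and function name below are illustrative assumptions, not the scheme's reference implementation.

    ```python
    import numpy as np

    def biohash(features, user_key, n_bits=32):
        # Pseudorandom projection basis derived from the user-specific key/token
        rng = np.random.default_rng(user_key)
        basis = rng.standard_normal((features.size, n_bits))
        # Orthonormalize (QR) so the projections are decorrelated,
        # as in standard BioHashing formulations
        q, _ = np.linalg.qr(basis)
        # Project and threshold at zero to obtain the stored bit string
        return (features @ q > 0).astype(np.uint8)
    ```

    Only the bits are stored; a compromised template can be revoked by issuing a new key. The paper's point is that when the key is also compromised, inverting such a transform (approximating the original features or finding a pre-image) can be easier than the bit length suggests.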

  17. Strategies for Improving Polio Surveillance Performance in the Security-Challenged Nigerian States of Adamawa, Borno, and Yobe During 2009-2014.

    PubMed

    Hamisu, Abdullahi Walla; Johnson, Ticha Muluh; Craig, Kehinde; Mkanda, Pascal; Banda, Richard; Tegegne, Sisay G; Oyetunji, Ajiboye; Ningi, Nuhu; Mohammed, Said M; Adamu, Mohammed Isa; Abdulrahim, Khalid; Nsubuga, Peter; Vaz, Rui G; Muhammed, Ado J G

    2016-05-01

    The security-challenged states of Adamawa, Borno, and Yobe bear most of the brunt of the Boko Haram insurgency in Nigeria. The security challenge has led to the killing of health workers, destruction of health facilities, and displacement of huge populations. To identify areas of polio transmission and promptly detect possible cases of importation in these states, polio surveillance must be very sensitive. We conducted a retrospective review of acute flaccid paralysis surveillance in the security-compromised states between 2009 and 2014, using the acute flaccid paralysis database at the World Health Organization Nigeria Country Office. We also reviewed the reports of surveillance activities conducted in these security-challenged states, to identify strategies that were implemented to improve polio surveillance. Environmental surveillance was implemented in Borno in 2013 and in Yobe in 2014. All disease surveillance and notification officers in the 3 security-challenged states now receive annual training, and the number of community informants in these states has dramatically increased. Media-based messaging (via radio and television) is now used to sensitize the public to the importance of surveillance, and contact samples have been regularly collected in both states since 2014. The strategies implemented in the security-challenged states improved the quality of polio surveillance during the review period. © 2016 World Health Organization; licensee Oxford Journals.

  18. Strategies for Improving Polio Surveillance Performance in the Security-Challenged Nigerian States of Adamawa, Borno, and Yobe During 2009–2014

    PubMed Central

    Hamisu, Abdullahi Walla; Johnson, Ticha Muluh; Craig, Kehinde; Mkanda, Pascal; Banda, Richard; Tegegne, Sisay G.; Oyetunji, Ajiboye; Ningi, Nuhu; Mohammed, Said M.; Adamu, Mohammed Isa; Abdulrahim, Khalid; Nsubuga, Peter; Vaz, Rui G.; Muhammed, Ado J. G.

    2016-01-01

    Background. The security-challenged states of Adamawa, Borno, and Yobe bear most of the brunt of the Boko Haram insurgency in Nigeria. The security challenge has led to the killing of health workers, destruction of health facilities, and displacement of huge populations. To identify areas of polio transmission and promptly detect possible cases of importation in these states, polio surveillance must be very sensitive. Methods. We conducted a retrospective review of acute flaccid paralysis surveillance in the security-compromised states between 2009 and 2014, using the acute flaccid paralysis database at the World Health Organization Nigeria Country Office. We also reviewed the reports of surveillance activities conducted in these security-challenged states, to identify strategies that were implemented to improve polio surveillance. Results. Environmental surveillance was implemented in Borno in 2013 and in Yobe in 2014. All disease surveillance and notification officers in the 3 security-challenged states now receive annual training, and the number of community informants in these states has dramatically increased. Media-based messaging (via radio and television) is now used to sensitize the public to the importance of surveillance, and contact samples have been regularly collected in both states since 2014. Conclusions. The strategies implemented in the security-challenged states improved the quality of polio surveillance during the review period. PMID:26655842

  19. Timely response to secure messages from primary care patients.

    PubMed

    Rohrer, James E; North, Frederick; Angstman, Kurt B; Oberhelman, Sara S; Meunier, Matthew R

    2013-01-01

    To assess delays in response to patient secure e-mail messages in primary care. Secure electronic messages are initiated by primary care patients. Timely response is necessary for patient safety and quality. A database of secure messages. A random sample of 353 secure electronic messages initiated by primary care patients treated in 4 clinics. Message not opened after 12 hours or messages not responded to after 36 hours. A total of 8.5% of electronic messages were not opened within 12 hours, and 17.6% did not receive a response in 36 hours. Clinic location, being a clinic employee, and patient sex were not related to delays. Patients older than 50 years were more likely to receive a delayed response (25.7% delayed, P = .013). The risk of both kinds of delays was higher on weekends (P < .001 for both). The e-mail message system resulted in high rates of delayed response. Delays were concentrated on weekends (Friday-Sunday). Reducing delayed responses may require automatic rerouting of messages to message centers staffed 24-7 or other mechanisms to manage this after-hours work flow.

  20. [Food availability according to food security-insecurity among Mexican households].

    PubMed

    Valencia-Valero, Reyna Guadalupe; Ortiz-Hernández, Luis

    2014-04-01

To determine the differences in food availability according to food insecurity level among Mexican households. We analyzed the database of the National Survey of Household Incomes and Expenditures (n=27 445 households). Households were classified according to the Latin American and Caribbean Inventory of Food Security. The availability of each food group was estimated as grams per day per adult equivalent. Half (50.0%) of Mexican households experienced some degree of food insecurity. Among households with food insecurity there was high availability of corn, wheat, eggs, and sugars, but low availability of fresh fruits and vegetables, lean meat, poultry, seafood, milk, cheeses, and sweetened beverages. Although households with food insecurity have lower availability of most food groups (both nutrient-dense and energy-dense), they have higher availability of cheap foods, which in some cases are only a source of energy and provide no nutrients.

1. The perfect heist:

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lafleur, Jarret Marshall; Purvis, Liston Keith; Roesler, Alexander William

    2014-04-01

Of the many facets of the criminal world, few have captured society's fascination as has that of high-stakes robbery. The combination of meticulousness, cunning, and audacity required to execute a real-life Ocean's Eleven may be uncommon among criminals, but fortunately it is common enough to extract a wealth of lessons for the protection of high-value assets. To assist in informing the analyses and decisions of security professionals, this paper surveys 23 sophisticated and high-value heists that have occurred or been attempted around the world, particularly over the past three decades. The results, compiled in a Heist Methods and Characteristics Database, have been analyzed qualitatively and quantitatively, with the goals of both identifying common characteristics and characterizing the range and diversity of criminal methods used. The analysis is focused in six areas: (1) Defeated Security Measures and Devices, (2) Deception Methods, (3) Timing, (4) Weapons, (5) Resources, and (6) Insiders.

  2. Architecture and Assessment: Privacy Preserving Biometrically Secured Electronic Documents

    DTIC Science & Technology

    2015-01-01

Highlights: very large public and private fingerprint databases; comprehensive risk analysis and system security; contribution to developing international ... This work falls under the Safety and Security Program, which is led by Defence Research and Development Canada's Centre for Security Science, in partnership with Public Safety... © Her Majesty the Queen (in Right of Canada), as represented by the Minister of National Defence, 201...

  3. A secure data outsourcing scheme based on Asmuth-Bloom secret sharing

    NASA Astrophysics Data System (ADS)

    Idris Muhammad, Yusuf; Kaiiali, Mustafa; Habbal, Adib; Wazan, A. S.; Sani Ilyasu, Auwal

    2016-11-01

    Data outsourcing is an emerging paradigm for data management in which a database is provided as a service by third-party service providers. One of the major benefits of offering database as a service is to provide organisations, which are unable to purchase expensive hardware and software to host their databases, with efficient data storage accessible online at a cheap rate. Despite that, several issues of data confidentiality, integrity, availability and efficient indexing of users' queries at the server side have to be addressed in the data outsourcing paradigm. Service providers have to guarantee that their clients' data are secured against internal (insider) and external attacks. This paper briefly analyses the existing indexing schemes in data outsourcing and highlights their advantages and disadvantages. Then, this paper proposes a secure data outsourcing scheme based on Asmuth-Bloom secret sharing which tries to address the issues in data outsourcing such as data confidentiality, availability and order preservation for efficient indexing.
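
    Asmuth-Bloom secret sharing, the primitive the proposed scheme builds on, splits a secret across parties using the Chinese Remainder Theorem (CRT). The sketch below shows a (k, n) = (3, 5) instance; the toy moduli and function names are illustrative assumptions, and a real deployment would use large moduli.

    ```python
    import random
    from math import prod

    def crt(residues, moduli):
        # Chinese Remainder Theorem: recover x modulo prod(moduli)
        total = prod(moduli)
        x = 0
        for r, m in zip(residues, moduli):
            part = total // m
            x += r * part * pow(part, -1, m)  # modular inverse (Python 3.8+)
        return x % total

    def split(secret, k, moduli, m0):
        # Asmuth-Bloom (k, n) threshold condition: the product of the k
        # smallest moduli must exceed m0 times the product of the k-1
        # largest, and the secret must be smaller than m0.
        assert prod(moduli[:k]) > m0 * prod(moduli[-(k - 1):])
        assert 0 <= secret < m0
        blind = random.randrange((prod(moduli[:k]) - secret) // m0)
        y = secret + blind * m0            # y stays below prod(moduli[:k])
        return [(y % m, m) for m in moduli]

    def reconstruct(shares, m0):
        residues, moduli = zip(*shares)
        return crt(residues, moduli) % m0

    random.seed(7)
    m0 = 101
    moduli = [263, 269, 271, 277, 281]     # pairwise-coprime primes, ascending
    shares = split(42, 3, moduli, m0)      # any 3 of the 5 shares suffice
    ```

    Any k shares determine y exactly via the CRT (and hence the secret y mod m0), while k-1 shares leave the secret nearly uniform modulo m0; this is what provides confidentiality when shares are placed with independent service providers.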

  4. High security chaotic multiple access scheme for visible light communication systems with advanced encryption standard interleaving

    NASA Astrophysics Data System (ADS)

    Qiu, Junchao; Zhang, Lin; Li, Diyang; Liu, Xingcheng

    2016-06-01

    Chaotic sequences can be applied to realize multiple user access and improve the system security for a visible light communication (VLC) system. However, since the map patterns of chaotic sequences are usually well known, eavesdroppers can possibly derive the key parameters of chaotic sequences and subsequently retrieve the information. We design an advanced encryption standard (AES) interleaving aided multiple user access scheme to enhance the security of a chaotic code division multiple access-based visible light communication (C-CDMA-VLC) system. We propose to spread the information with chaotic sequences, and then the spread information is interleaved by an AES algorithm and transmitted over VLC channels. Since the computation complexity of performing inverse operations to deinterleave the information is high, the eavesdroppers in a high speed VLC system cannot retrieve the information in real time; thus, the system security will be enhanced. Moreover, we build a mathematical model for the AES-aided VLC system and derive the theoretical information leakage to analyze the system security. The simulations are performed over VLC channels, and the results demonstrate the effectiveness and high security of our presented AES interleaving aided chaotic CDMA-VLC system.
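
    To make the chaotic-spreading idea concrete, the sketch below derives a per-user chip sequence from the logistic map and uses it to spread and despread a bit stream. This is a generic illustration of chaotic CDMA spreading, not the paper's AES-interleaved system; all parameter values are assumptions.

    ```python
    def logistic_chips(x0, n, r=3.99):
        # Iterate the logistic map x <- r*x*(1-x) in its chaotic regime
        # and threshold the orbit at 0.5 to obtain a binary chip sequence.
        chips, x = [], x0
        for _ in range(n):
            x = r * x * (1.0 - x)
            chips.append(1 if x >= 0.5 else 0)
        return chips

    def spread(bits, x0, chips_per_bit=8):
        code = logistic_chips(x0, chips_per_bit)
        # XOR every data bit against the user's chip sequence
        return [b ^ c for b in bits for c in code]

    def despread(chips, x0, chips_per_bit=8):
        code = logistic_chips(x0, chips_per_bit)
        bits = []
        for i in range(0, len(chips), chips_per_bit):
            votes = sum(chips[i + j] ^ code[j] for j in range(chips_per_bit))
            bits.append(1 if votes > chips_per_bit // 2 else 0)  # majority vote
        return bits
    ```

    The initial condition x0 acts as the user's key. In the paper's scheme an AES interleaving stage is additionally applied to the spread chips, so that recovering the map parameters alone is not sufficient to despread the signal in real time.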

5. DOE's nation-wide system for access control can solve problems for the federal government

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Callahan, S.; Tomes, D.; Davis, G.

    1996-07-01

The U.S. Department of Energy's (DOE's) ongoing efforts to improve its physical and personnel security systems while reducing its costs provide a model for federal government visitor processing. Through the careful use of standardized badges, computer databases, and networks of automated access control systems, the DOE is increasing the security associated with travel throughout the DOE complex, and at the same time, eliminating paperwork, special badging, and visitor delays. The DOE is also improving badge accountability, personnel identification assurance, and access authorization timeliness and accuracy. Like the federal government, the DOE has dozens of geographically dispersed locations run by many different contractors operating a wide range of security systems. The DOE has overcome these obstacles by providing data format standards, a complex-wide virtual network for security, the adoption of a standard high security system, and an open-systems-compatible link for any automated access control system. If the location's level of security requires it, positive visitor identification is accomplished by personal identification number (PIN) and/or by biometrics. At sites with automated access control systems, this positive identification is integrated into the portals.

  6. 76 FR 39315 - Privacy Act of 1974: Implementation of Exemptions; Department of Homeland Security/ALL-030 Use of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-06

    ... Terrorist Screening Database System of Records AGENCY: Privacy Office, DHS. ACTION: Notice of proposed... Use of the Terrorist Screening Database System of Records'' and this proposed rulemaking. In this... Use of the Terrorist Screening Database (TSDB) System of Records.'' DHS is maintaining a mirror copy...

  7. Tao of Gateway: Providing Internet Access to Licensed Databases.

    ERIC Educational Resources Information Center

    McClellan, Gregory A.; Garrison, William V.

    1997-01-01

    Illustrates an approach for providing networked access to licensed databases over the Internet by positioning the library between patron and vendor. Describes how the gateway systems and database connection servers work and discusses how treatment of security has evolved with the introduction of the World Wide Web. Outlines plans to reimplement…

  8. mantisGRID: a grid platform for DICOM medical images management in Colombia and Latin America.

    PubMed

    Garcia Ruiz, Manuel; Garcia Chaves, Alvin; Ruiz Ibañez, Carlos; Gutierrez Mazo, Jorge Mario; Ramirez Giraldo, Juan Carlos; Pelaez Echavarria, Alejandro; Valencia Diaz, Edison; Pelaez Restrepo, Gustavo; Montoya Munera, Edwin Nelson; Garcia Loaiza, Bernardo; Gomez Gonzalez, Sebastian

    2011-04-01

    This paper presents the mantisGRID project, an interinstitutional initiative from Colombian medical and academic centers aiming to provide medical grid services for Colombia and Latin America. The mantisGRID is a grid platform, based on open-source grid infrastructure, that provides the necessary services to access and exchange medical images and associated information following the Digital Imaging and Communications in Medicine (DICOM) and Health Level 7 standards. The paper focuses first on the data abstraction architecture, which is achieved via Open Grid Services Architecture Data Access and Integration (OGSA-DAI) services and supported by the Globus Toolkit. The grid currently uses a 30-Mb bandwidth on the Colombian High Technology Academic Network, RENATA, connected to Internet 2. The paper also includes a discussion of the relational database created to handle the DICOM objects, which were represented using Extensible Markup Language Schema documents, as well as other features implemented such as data security, user authentication, and patient confidentiality. Grid performance was tested using the three currently operative nodes, and the results demonstrated comparable query times between the mantisGRID (OGSA-DAI) and distributed MySQL databases, especially for a large number of records.

  9. Utilization of a Clinical Trial Management System for the Whole Clinical Trial Process as an Integrated Database: System Development.

    PubMed

    Park, Yu Rang; Yoon, Young Jo; Koo, HaYeong; Yoo, Soyoung; Choi, Chang-Min; Beck, Sung-Ho; Kim, Tae Won

    2018-04-24

    Clinical trials pose potential risks in both communications and management due to the various stakeholders involved when performing clinical trials. The academic medical center has a responsibility and obligation to conduct and manage clinical trials while maintaining a sufficiently high level of quality; therefore, it is necessary to build an information technology system to support standardized clinical trial processes and comply with relevant regulations. The objective of the study was to address the challenges identified while performing clinical trials at an academic medical center, Asan Medical Center (AMC) in Korea, by developing and utilizing a clinical trial management system (CTMS) that complies with standardized processes from multiple departments or units, controlled vocabularies, security, and privacy regulations. This study describes the methods, considerations, and recommendations for the development and utilization of the CTMS as a consolidated research database in an academic medical center. A task force was formed to define and standardize the clinical trial performance process at the site level. On the basis of the agreed standardized process, the CTMS was designed and developed as an all-in-one system complying with privacy and security regulations. In this study, the processes and standard mapped vocabularies of a clinical trial were established at the academic medical center. On the basis of these processes and vocabularies, a CTMS was built which interfaces with the existing trial systems such as the electronic institutional review board, health information system, enterprise resource planning, and the barcode system. To protect patient data, the CTMS implements data governance and access rules, and excludes 21 personal health identifiers according to the Health Insurance Portability and Accountability Act (HIPAA) privacy rule and Korean privacy laws. Since December 2014, the CTMS has been successfully implemented and used by 881 internal and external users for managing 11,645 studies and 146,943 subjects. The CTMS was introduced in the Asan Medical Center to manage the large amounts of data involved with clinical trial operations. Inter- and intraunit control of data and resources can be easily conducted through the CTMS. To our knowledge, this is the first CTMS developed in-house at an academic medical center that can enhance the efficiency of clinical trial management in compliance with privacy and security laws. ©Yu Rang Park, Young Jo Yoon, HaYeong Koo, Soyoung Yoo, Chang-Min Choi, Sung-Ho Beck, Tae Won Kim. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 24.04.2018.
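    The de-identification step described above amounts to filtering identifier fields out of each record before it reaches the research database. The sketch below illustrates the idea; the field names are hypothetical HIPAA-style identifiers, not the actual CTMS schema.

```python
# Hypothetical sketch of excluding personal health identifiers before a
# record enters a research database. Field names are illustrative, not
# the real CTMS schema or the full HIPAA identifier list.

HIPAA_IDENTIFIER_FIELDS = {
    "name", "address", "phone", "email", "ssn",
    "medical_record_number", "birth_date", "ip_address",
}

def deidentify(record: dict) -> dict:
    """Return a copy of the record with identifier fields removed."""
    return {k: v for k, v in record.items() if k not in HIPAA_IDENTIFIER_FIELDS}

raw = {"name": "J. Doe", "ssn": "000-00-0000",
       "study_id": "TR-0142", "arm": "placebo"}
print(deidentify(raw))   # only the non-identifying study fields remain
```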

  10. 32 CFR 2001.42 - Standards for security equipment.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... OVERSIGHT OFFICE, NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED NATIONAL SECURITY INFORMATION... Administration (GSA) shall, in coordination with agency heads originating classified information, establish and publish uniform standards, specifications, qualified product lists or databases, and supply schedules for...

  11. 32 CFR 2001.42 - Standards for security equipment.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... OVERSIGHT OFFICE, NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED NATIONAL SECURITY INFORMATION... Administration (GSA) shall, in coordination with agency heads originating classified information, establish and publish uniform standards, specifications, qualified product lists or databases, and supply schedules for...

  12. 32 CFR 2001.42 - Standards for security equipment.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... OVERSIGHT OFFICE, NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED NATIONAL SECURITY INFORMATION... Administration (GSA) shall, in coordination with agency heads originating classified information, establish and publish uniform standards, specifications, qualified product lists or databases, and supply schedules for...

  13. 32 CFR 2001.42 - Standards for security equipment.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... OVERSIGHT OFFICE, NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED NATIONAL SECURITY INFORMATION... Administration (GSA) shall, in coordination with agency heads originating classified information, establish and publish uniform standards, specifications, qualified product lists or databases, and supply schedules for...

  14. Father involvement, paternal sensitivity, and father-child attachment security in the first 3 years.

    PubMed

    Brown, Geoffrey L; Mangelsdorf, Sarah C; Neff, Cynthia

    2012-06-01

    To reach a greater understanding of the early father-child attachment relationship, this study examined concurrent and longitudinal associations among father involvement, paternal sensitivity, and father-child attachment security at 13 months and 3 years of age. Analyses revealed few associations among these variables at 13 months of age, but involvement and sensitivity independently predicted father-child attachment security at age 3. Moreover, sensitivity moderated the association between involvement and attachment security at 3 years. Specifically, involvement was unrelated to attachment security when fathers were highly sensitive, but positively related to attachment security when fathers were relatively less sensitive. Father involvement was also moderately stable across the two time points, but paternal sensitivity was not. Furthermore, there was significant stability in father-child attachment security from 13 months to 3 years. Secure attachment at 13 months also predicted greater levels of paternal sensitivity at 3 years, with sensitivity at age 3 mediating the association between 13 month and 3 year attachment security. In sum, a secure father-child attachment relationship (a) was related to both quantity and quality of fathering behavior, (b) remained relatively stable across early childhood, and (c) predicted increased paternal sensitivity over time. These findings further our understanding of the correlates of early father-child attachment, and underscore the need to consider multiple domains of fathers' parenting and reciprocal relations between fathering behavior and father-child attachment security. PsycINFO Database Record (c) 2012 APA, all rights reserved.

  15. Secure environment for real-time tele-collaboration on virtual simulation of radiation treatment planning.

    PubMed

    Ntasis, Efthymios; Maniatis, Theofanis A; Nikita, Konstantina S

    2003-01-01

    A secure framework is described for real-time tele-collaboration on the Virtual Simulation procedure of Radiation Treatment Planning. An integrated approach is followed, clustering the security issues faced by the system into organizational issues, security issues over the LAN, and security issues over the LAN-to-LAN connection. The design and implementation of the security services are performed according to the identified security requirements, along with the need for real-time communication between the collaborating health care professionals. A detailed description of the implementation is given, presenting a solution which can be directly tailored to other tele-collaboration services in the field of health care. The pilot study of the proposed security components proves the feasibility of the secure environment and its consistency with the high performance demands of the application.

  16. Toward a mtDNA locus-specific mutation database using the LOVD platform.

    PubMed

    Elson, Joanna L; Sweeney, Mary G; Procaccio, Vincent; Yarham, John W; Salas, Antonio; Kong, Qing-Peng; van der Westhuizen, Francois H; Pitceathly, Robert D S; Thorburn, David R; Lott, Marie T; Wallace, Douglas C; Taylor, Robert W; McFarland, Robert

    2012-09-01

    The Human Variome Project (HVP) is a global effort to collect and curate all human genetic variation affecting health. Mutations of mitochondrial DNA (mtDNA) are an important cause of neurogenetic disease in humans; however, identification of the pathogenic mutations responsible can be problematic. In this article, we provide explanations as to why and suggest how such difficulties might be overcome. We put forward a case in support of a new Locus Specific Mutation Database (LSDB) implemented using the Leiden Open-source Variation Database (LOVD) system that will not only list primary mutations, but also present the evidence supporting their role in disease. Critically, we feel that this new database should have the capacity to store information on the observed phenotypes alongside the genetic variation, thereby facilitating our understanding of the complex and variable presentation of mtDNA disease. LOVD supports fast queries of both seen and hidden data and allows storage of sequence variants from high-throughput sequence analysis. The LOVD platform will allow construction of a secure mtDNA database; one that can fully utilize currently available data, as well as that being generated by high-throughput sequencing, to link genotype with phenotype enhancing our understanding of mitochondrial disease, with a view to providing better prognostic information. © 2012 Wiley Periodicals, Inc.

  17. Toward a mtDNA Locus-Specific Mutation Database Using the LOVD Platform

    PubMed Central

    Elson, Joanna L.; Sweeney, Mary G.; Procaccio, Vincent; Yarham, John W.; Salas, Antonio; Kong, Qing-Peng; van der Westhuizen, Francois H.; Pitceathly, Robert D.S.; Thorburn, David R.; Lott, Marie T.; Wallace, Douglas C.; Taylor, Robert W.; McFarland, Robert

    2015-01-01

    The Human Variome Project (HVP) is a global effort to collect and curate all human genetic variation affecting health. Mutations of mitochondrial DNA (mtDNA) are an important cause of neurogenetic disease in humans; however, identification of the pathogenic mutations responsible can be problematic. In this article, we provide explanations as to why and suggest how such difficulties might be overcome. We put forward a case in support of a new Locus Specific Mutation Database (LSDB) implemented using the Leiden Open-source Variation Database (LOVD) system that will not only list primary mutations, but also present the evidence supporting their role in disease. Critically, we feel that this new database should have the capacity to store information on the observed phenotypes alongside the genetic variation, thereby facilitating our understanding of the complex and variable presentation of mtDNA disease. LOVD supports fast queries of both seen and hidden data and allows storage of sequence variants from high-throughput sequence analysis. The LOVD platform will allow construction of a secure mtDNA database; one that can fully utilize currently available data, as well as that being generated by high-throughput sequencing, to link genotype with phenotype enhancing our understanding of mitochondrial disease, with a view to providing better prognostic information. PMID:22581690

  18. Verifying the secure setup of UNIX client/servers and detection of network intrusion

    NASA Astrophysics Data System (ADS)

    Feingold, Richard; Bruestle, Harry R.; Bartoletti, Tony; Saroyan, R. A.; Fisher, John M.

    1996-03-01

    This paper describes our technical approach to developing and delivering Unix host- and network-based security products to meet the increasing challenges in information security. Today's global 'Infosphere' presents us with a networked environment that knows no geographical, national, or temporal boundaries, and no ownership, laws, or identity cards. This seamless aggregation of computers, networks, databases, applications, and the like stores, transmits, and processes information. This information is now recognized as an asset to governments, corporations, and individuals alike, and it must be protected from misuse. The Security Profile Inspector (SPI) performs static analyses of Unix-based clients and servers to check their security configuration. SPI's broad range of security tests and flexible usage options support the needs of novice and expert system administrators alike. SPI's use within the Department of Energy and Department of Defense has resulted in more secure systems, less vulnerable to hostile intentions. Host-based information protection techniques and tools must also be supported by network-based capabilities. Our experience shows that a weak link in a network of clients and servers presents itself sooner or later, and can be more readily identified by dynamic intrusion detection techniques and tools. The Network Intrusion Detector (NID) is one such tool. NID is designed to monitor and analyze activity on an Ethernet broadcast Local Area Network segment and produce transcripts of suspicious user connections. NID's retrospective and real-time modes have proven invaluable to security officers faced with ongoing attacks on their systems and networks.

  19. Surviving the Glut: The Management of Event Streams in Cyberphysical Systems

    NASA Astrophysics Data System (ADS)

    Buchmann, Alejandro

    Alejandro Buchmann is Professor in the Department of Computer Science, Technische Universität Darmstadt, where he heads the Databases and Distributed Systems Group. He received his MS (1977) and PhD (1980) from the University of Texas at Austin. He was an Assistant/Associate Professor at the Institute for Applied Mathematics and Systems IIMAS/UNAM in Mexico, doing research on databases for CAD, geographic information systems, and object-oriented databases. At Computer Corporation of America (later Xerox Advanced Information Systems) in Cambridge, Mass., he worked in the areas of active databases and real-time databases, and at GTE Laboratories, Waltham, in the areas of distributed object systems and the integration of heterogeneous legacy systems. In 1991 he returned to academia and joined T.U. Darmstadt. His current research interests are at the intersection of middleware, databases, event-based distributed systems, ubiquitous computing, and very large distributed systems (P2P, WSN). Much of his current research is concerned with guaranteeing quality of service and reliability properties in these systems, for example, scalability, performance, transactional behaviour, consistency, and end-to-end security. Many research projects imply collaboration with industry and cover a broad spectrum of application domains. Further information can be found at http://www.dvs.tu-darmstadt.de

  20. Project Manager’s Guide to the Scientific and Technical Information (STINFO) Program and Technical Publications Process

    DTIC Science & Technology

    1993-12-01

    Reports may be definitive for the subject presented, exploratory in nature, or an evaluation of critical subsystems or of technical problems, 4...International Security 9 Social and Natural Science Studies Field 41 Edit: (Type 3) - Entry of an invalid code when Performance Type is "C" or "M" will...analysis SF Foreign area social science research SP Foreign area policy planning research BF Identifies databases with data on foreign forces or

  1. Genomic analysis and geographic visualization of H5N1 and SARS-CoV.

    PubMed

    Hill, Andrew W; Alexandrov, Boyan; Guralnick, Robert P; Janies, Daniel

    2007-10-11

    Emerging infectious diseases and organisms present critical issues of national security, public health, and economic welfare. We still understand little about the zoonotic potential of many viruses. To this end, we are developing novel database tools to manage comparative genomic datasets. These tools add value because they allow us to summarize the direction, frequency, and order of genomic changes. We will perform numerous real-world tests of our tools with both avian influenza and coronaviruses.

  2. Applying an MVC Framework for The System Development Life Cycle with Waterfall Model Extended

    NASA Astrophysics Data System (ADS)

    Hardyanto, W.; Purwinarko, A.; Sujito, F.; Masturi; Alighiri, D.

    2017-04-01

    This paper describes an extension of the waterfall model using the MVC architectural pattern for software development. The waterfall model is the base model most widely used in software development, yet it still has many problems. A common issue arises when data changes cause delays in the process itself; software security is likewise one of the major problems. This study uses the PHP programming language for implementation, although the model can be implemented in several programming languages using the same concept. The study is based on the MVC architecture so that it can improve the performance of both software development and maintenance, especially concerning security, validation, database access, and routing.
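    The MVC separation the paper builds on can be sketched minimally (in Python rather than the PHP used in the study); the model, view, controller, and route names below are invented for illustration.

```python
# Minimal MVC sketch (Python stand-in for the paper's PHP implementation).
# Model, View, Controller, and the route table are invented for illustration.

class UserModel:                       # Model: owns data access
    _db = {1: "alice", 2: "bob"}

    @classmethod
    def find(cls, uid):
        return cls._db.get(uid)

def user_view(name):                   # View: owns presentation only
    return f"<h1>{name}</h1>" if name else "<h1>not found</h1>"

def user_controller(uid):              # Controller: mediates model and view
    return user_view(UserModel.find(uid))

ROUTES = {"/user": user_controller}    # Router maps URLs to controllers

def dispatch(path, **params):
    return ROUTES[path](**params)

print(dispatch("/user", uid=1))        # -> <h1>alice</h1>
```

    Keeping validation and database access behind the model and routing in one table is what localizes the impact of the data changes the paper identifies as the waterfall model's weak point.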

  3. Food security in indigenous and peasant populations: a systematic review.

    PubMed

    Restrepo-Arango, Marcos; Gutiérrez-Builes, Lina Andrea; Ríos-Osorio, Leonardo Alberto

    2018-04-01

    Food security and vulnerability among indigenous and peasant populations has become a topic of interest to public health all around the world, leading to investigation of its measurement, classification, and determining factors. This systematic review aims to describe the situation of food security in indigenous and peasant communities and the methods used for its evaluation. The literature search was performed on the PubMed (5), ScienceDirect (221), and Scopus (377) databases for publications between 2004 and 2015; a total of 603 items were located with the search engines. At the end of the screening process, and after adding the items found in the gray literature, 25 papers were obtained for the review. Over the 11 years evaluated between 2004 and 2015, scientific activity around the theme was scarce, with just 4.54% of the publications on this subject, but by 2011 the number had increased to 13 publications (63%). Factors that influence the development of food insecurity include climate change, the diversity of agriculture, globalization, and market westernization.

  4. Automatic public access to documents and maps stored on an internal secure system.

    NASA Astrophysics Data System (ADS)

    Trench, James; Carter, Mary

    2013-04-01

    The Geological Survey of Ireland operates a document management system that provides documents and maps, stored internally in high resolution in a highly secure environment, to an external service where the documents are automatically presented in lower resolution to members of the public. Security is devised through roles and individual users, where both role level and folder level can be set. The application is an electronic document/data management (EDM) system with an integrated Geographical Information System (GIS) component that allows users to query an interactive map of Ireland for data relating to a particular area of interest. The data stored in the database consist of Bedrock Field Sheets, Bedrock Notebooks, Bedrock Maps, Geophysical Surveys, Geotechnical Maps & Reports, Groundwater, GSI Publications, Marine, Mine Records, Mineral Localities, Open File, Quaternary and Unpublished Reports. The Konfig application tool is both an internal and a public-facing application. It acts as a tool for high resolution data entry, with the data stored in a high resolution vault. The public-facing application mirrors the internal application and differs only in that it converts high resolution data into a low resolution format stored in a low resolution vault, making the data web friendly for the end user to download.

  5. Brain-CODE: A Secure Neuroinformatics Platform for Management, Federation, Sharing and Analysis of Multi-Dimensional Neuroscience Data.

    PubMed

    Vaccarino, Anthony L; Dharsee, Moyez; Strother, Stephen; Aldridge, Don; Arnott, Stephen R; Behan, Brendan; Dafnas, Costas; Dong, Fan; Edgecombe, Kenneth; El-Badrawi, Rachad; El-Emam, Khaled; Gee, Tom; Evans, Susan G; Javadi, Mojib; Jeanson, Francis; Lefaivre, Shannon; Lutz, Kristen; MacPhee, F Chris; Mikkelsen, Jordan; Mikkelsen, Tom; Mirotchnick, Nicholas; Schmah, Tanya; Studzinski, Christa M; Stuss, Donald T; Theriault, Elizabeth; Evans, Kenneth R

    2018-01-01

    Historically, research databases have existed in isolation with no practical avenue for sharing or pooling medical data into high dimensional datasets that can be efficiently compared across databases. To address this challenge, the Ontario Brain Institute's "Brain-CODE" is a large-scale neuroinformatics platform designed to support the collection, storage, federation, sharing and analysis of different data types across several brain disorders, as a means to understand common underlying causes of brain dysfunction and develop novel approaches to treatment. By providing researchers access to aggregated datasets that they otherwise could not obtain independently, Brain-CODE incentivizes data sharing and collaboration and facilitates analyses both within and across disorders and across a wide array of data types, including clinical, neuroimaging and molecular. The Brain-CODE system architecture provides the technical capabilities to support (1) consolidated data management to securely capture, monitor and curate data, (2) privacy and security best-practices, and (3) interoperable and extensible systems that support harmonization, integration, and query across diverse data modalities and linkages to external data sources. Brain-CODE currently supports collaborative research networks focused on various brain conditions, including neurodevelopmental disorders, cerebral palsy, neurodegenerative diseases, epilepsy and mood disorders. These programs are generating large volumes of data that are integrated within Brain-CODE to support scientific inquiry and analytics across multiple brain disorders and modalities. By providing access to very large datasets on patients with different brain disorders and enabling linkages to provincial, national and international databases, Brain-CODE will help to generate new hypotheses about the biological bases of brain disorders, and ultimately promote new discoveries to improve patient care.

  6. Brain-CODE: A Secure Neuroinformatics Platform for Management, Federation, Sharing and Analysis of Multi-Dimensional Neuroscience Data

    PubMed Central

    Vaccarino, Anthony L.; Dharsee, Moyez; Strother, Stephen; Aldridge, Don; Arnott, Stephen R.; Behan, Brendan; Dafnas, Costas; Dong, Fan; Edgecombe, Kenneth; El-Badrawi, Rachad; El-Emam, Khaled; Gee, Tom; Evans, Susan G.; Javadi, Mojib; Jeanson, Francis; Lefaivre, Shannon; Lutz, Kristen; MacPhee, F. Chris; Mikkelsen, Jordan; Mikkelsen, Tom; Mirotchnick, Nicholas; Schmah, Tanya; Studzinski, Christa M.; Stuss, Donald T.; Theriault, Elizabeth; Evans, Kenneth R.

    2018-01-01

    Historically, research databases have existed in isolation with no practical avenue for sharing or pooling medical data into high dimensional datasets that can be efficiently compared across databases. To address this challenge, the Ontario Brain Institute’s “Brain-CODE” is a large-scale neuroinformatics platform designed to support the collection, storage, federation, sharing and analysis of different data types across several brain disorders, as a means to understand common underlying causes of brain dysfunction and develop novel approaches to treatment. By providing researchers access to aggregated datasets that they otherwise could not obtain independently, Brain-CODE incentivizes data sharing and collaboration and facilitates analyses both within and across disorders and across a wide array of data types, including clinical, neuroimaging and molecular. The Brain-CODE system architecture provides the technical capabilities to support (1) consolidated data management to securely capture, monitor and curate data, (2) privacy and security best-practices, and (3) interoperable and extensible systems that support harmonization, integration, and query across diverse data modalities and linkages to external data sources. Brain-CODE currently supports collaborative research networks focused on various brain conditions, including neurodevelopmental disorders, cerebral palsy, neurodegenerative diseases, epilepsy and mood disorders. These programs are generating large volumes of data that are integrated within Brain-CODE to support scientific inquiry and analytics across multiple brain disorders and modalities. By providing access to very large datasets on patients with different brain disorders and enabling linkages to provincial, national and international databases, Brain-CODE will help to generate new hypotheses about the biological bases of brain disorders, and ultimately promote new discoveries to improve patient care. PMID:29875648

  7. Performance analysis of different database in new internet mapping system

    NASA Astrophysics Data System (ADS)

    Yao, Xing; Su, Wei; Gao, Shuai

    2017-03-01

    In the Mapping System of the New Internet, massive numbers of mapping entries between AID and RID need to be stored, added, updated, and deleted. To deal better with large volumes of mapping entry update and query requests, the Mapping System of the New Internet must use a high-performance database. In this paper, we focus on the performance of Redis, SQLite, and MySQL, three typical databases, and the results show that a Mapping System based on different databases can be adapted to different needs according to the actual situation.
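    A comparison like the one in this paper can be reproduced in miniature with Python's standard library: the sketch below times bulk insert and point lookup of hypothetical AID-to-RID mapping entries in SQLite. The entry count and schema are illustrative; Redis and MySQL require running servers and are omitted here.

```python
# Miniature version of the mapping-table benchmark: time inserts and
# point queries of AID->RID entries in SQLite. Entry counts and schema
# are illustrative; Redis/MySQL need live servers and are omitted.
import sqlite3
import time

N = 10_000
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE mapping (aid TEXT PRIMARY KEY, rid TEXT)")

t0 = time.perf_counter()
con.executemany("INSERT INTO mapping VALUES (?, ?)",
                ((f"aid-{i}", f"rid-{i}") for i in range(N)))
con.commit()
insert_s = time.perf_counter() - t0

t0 = time.perf_counter()
for i in range(0, N, 100):            # sample point lookups via the index
    row = con.execute("SELECT rid FROM mapping WHERE aid=?",
                      (f"aid-{i}",)).fetchone()
    assert row == (f"rid-{i}",)
query_s = time.perf_counter() - t0

print(f"insert: {insert_s:.3f}s, 100 lookups: {query_s:.3f}s")
```

    Swapping in a client for another database behind the same insert/lookup loop is how the three systems can be compared on identical workloads.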

  8. Measuring Information Security Performance with 10 by 10 Model for Holistic State Evaluation.

    PubMed

    Bernik, Igor; Prislan, Kaja

    Organizations should measure their information security performance if they wish to take the right decisions and develop it in line with their security needs. Since the measurement of information security is generally underdeveloped in practice and many organizations find the existing recommendations too complex, the paper presents a solution in the form of a 10 by 10 information security performance measurement model. The model, ISP 10×10M, is composed of ten critical success factors, 100 key performance indicators, and 6 performance levels. Its content was devised on the basis of findings presented in current research studies and standards, while its structure results from an empirical study conducted among information security professionals from Slovenia. Results of the study show that a high level of information security performance is mostly dependent on measures aimed at managing information risks, employees, and information sources, while formal and environmental factors have a lesser impact. Experts believe that information security should evolve systematically, where it is recommended that beginning steps include technical, logical, and physical security controls, while advanced activities should relate predominantly to strategic management activities. By applying the proposed model, organizations are able to determine the actual level of information security performance based on the weighted indexing technique. In this manner, they identify the measures they ought to develop in order to improve the current situation. The ISP 10×10M is a useful tool for conducting internal system evaluations and decision-making. It may also be applied to a larger sample of organizations in order to determine the general state-of-play for research purposes.
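    The weighted indexing technique mentioned above can be sketched as a weighted average of indicator scores rolled up per critical success factor. The factor names, weights, and scores below are invented for illustration and are not the actual ISP 10×10M content.

```python
# Hypothetical sketch of the weighted indexing idea: roll indicator
# scores (0-100) up into a performance index per critical success
# factor. Names, weights, and scores are invented, not the real model.

def weighted_index(scores, weights):
    """Weighted average of indicator scores; weights need not sum to 1."""
    total = sum(weights.values())
    return sum(scores[k] * weights[k] for k in scores) / total

factors = {
    "risk_management":   ({"i1": 80, "i2": 60}, {"i1": 2, "i2": 1}),
    "employee_training": ({"i1": 50, "i2": 70}, {"i1": 1, "i2": 1}),
}
for name, (scores, weights) in factors.items():
    print(f"{name}: {weighted_index(scores, weights):.1f}")
```

    Comparing each factor's index against a target threshold is what lets an organization see which measures to develop first.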

  9. Adhesives: Test Method, Group Assignment, and Categorization Guide for High-Loading-Rate Applications Preparation and Testing of Single Lap Joints (Ver. 2.2, Unlimited)

    DTIC Science & Technology

    2016-04-01

    Gerard Chaney, and Charles Pergantis, Weapons and Materials Research Directorate, ARL, Coatings, Corrosion, and Engineered Polymers Branch (CCEPB)...SUBJECT TERMS: single lap joint, adhesive, sample preparation, testing, database, metadata, material pedigree, ISO...temperature/water immersion conditioning test for lap-joint test specimens using the test tubes and convection oven method

  10. An object-oriented approach to deploying highly configurable Web interfaces for the ATLAS experiment

    NASA Astrophysics Data System (ADS)

    Lange, Bruno; Maidantchik, Carmen; Pommes, Kathy; Pavani, Varlen; Arosa, Breno; Abreu, Igor

    2015-12-01

    The ATLAS Technical Coordination maintains 17 Web systems to support its operation. These applications, whilst ranging from managing the process of publishing scientific papers to monitoring radiation levels in the equipment in the experimental cavern, are constantly prone to changes in requirements due to the collaborative nature of the experiment and its management. In this context, a Web framework is proposed to unify the generation of the supporting interfaces. FENCE assembles classes to build applications by making extensive use of JSON configuration files. It relies heavily on Glance, a technology that was set forth in 2003 to create an abstraction layer on top of the heterogeneous sources that store the technical coordination data. Once Glance maps out the database modeling, records can be referenced in the configuration files by wrapping unique identifiers around double enclosing brackets. The deployed content can be individually secured by attaching clearance attributes to its description, thus ensuring that view/edit privileges are granted to eligible users only. The framework also provides tools for securely writing into a database. Fully HTML5-compliant multi-step forms can be generated from their JSON description to ensure that the submitted data comply with a series of constraints. Input validation is carried out primarily on the server side but, following progressive enhancement guidelines, verification might also be performed on the client side by enabling specific markup data attributes which are then handed over to the jQuery validation plug-in. User monitoring is accomplished by thoroughly logging user requests along with any POST data. Documentation is built from the source code using the phpDocumentor tool and made readily available for developers online. FENCE, therefore, speeds up the implementation of Web interfaces and reduces the response time to requirement changes by minimizing maintenance overhead.
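    The double-enclosing-brackets referencing described in this abstract might look like the following sketch; the config shape, record store, and lookup function are hypothetical stand-ins, not FENCE's or Glance's actual format.

```python
# Hypothetical sketch of resolving {{id}} references in a JSON-style
# config against a metadata lookup (standing in for Glance). The config
# shape and record store are invented, not FENCE's actual format.
import re

RECORDS = {"1042": "Radiation Monitor DB", "2077": "Paper Workflow DB"}

def resolve(text, lookup=RECORDS.get):
    """Replace every {{id}} with the record the id refers to;
    unknown ids are left untouched."""
    return re.sub(r"\{\{(\w+)\}\}",
                  lambda m: lookup(m.group(1)) or m.group(0), text)

config = '{"panel": "Sources for {{1042}} and {{2077}}"}'
print(resolve(config))
```

    Keeping the identifiers opaque in the config and resolving them through one lookup layer is what lets the interface definitions survive changes in the underlying heterogeneous databases.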

  11. Methods and implementation of a central biosample and data management in a three-centre clinical study.

    PubMed

    Angelow, Aniela; Schmidt, Matthias; Weitmann, Kerstin; Schwedler, Susanne; Vogt, Hannes; Havemann, Christoph; Hoffmann, Wolfgang

    2008-07-01

In our report we describe the concept, strategies and implementation of a central biosample and data management (CSDM) system in the three-centre clinical study of the Transregional Collaborative Research Centre "Inflammatory Cardiomyopathy - Molecular Pathogenesis and Therapy" SFB/TR 19, Germany. Following the requirements of high system resource availability, data security, privacy protection and quality assurance, a web-based CSDM was developed based on Java 2 Enterprise Edition using an Oracle database. An efficient and reliable sample documentation system using bar code labelling, a partitioning storage algorithm and online documentation software was implemented. An online electronic case report form is used to acquire patient-related data. Strict rules for access to the online applications and secure connections are used to account for privacy protection and data security. Challenges for the implementation of the CSDM arose at the project, technical, organisational, and staff levels.

  12. Planning for CD-ROM in the Reference Department.

    ERIC Educational Resources Information Center

    Graves, Gail T.; And Others

    1987-01-01

    Outlines the evaluation criteria used by the reference department at the Williams Library at the University of Mississippi in selecting databases and hardware used in CD-ROM workstations. The factors discussed include database coverage, costs, and security. (CLB)

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

The system is developed to collect, process, store and present the information provided by radio frequency identification (RFID) devices. The system contains three parts: the application software, the database and the web page. The application software manages multiple RFID devices, such as readers and portals, simultaneously. It communicates with the devices through the application programming interface (API) provided by the device vendor. The application software converts data collected by the RFID readers and portals to readable information. It is capable of encrypting data using the 256-bit Advanced Encryption Standard (AES). The application software has a graphical user interface (GUI). The GUI mimics the configurations of the nuclear material storage sites or transport vehicles. The GUI gives the user and system administrator an intuitive way to read the information and/or configure the devices. The application software is capable of sending the information to a remote, dedicated and secured web and database server. Two captured screen samples, one for storage and one for transport, are attached. The database is constructed to handle a large number of RFID tag readers and portals. A SQL server is employed for this purpose. An XML script is used to update the database once the information is sent from the application software. The design of the web page imitates the design of the application software. The web page retrieves data from the database and presents it in different panels. The user needs a user name combined with a password to access the web page. The web page is capable of sending e-mail and text messages based on preset criteria, such as when alarm thresholds are exceeded. A captured screen sample is attached. The application software is designed to be installed on a local computer. The local computer is directly connected to the RFID devices and can be controlled locally or remotely. There are multiple local computers managing different sites or transport vehicles. Control from remote sites and information transmitted to the central database server pass over a secured Internet connection. The information stored in the central database server is shown on the web page, which users can view over the Internet. A dedicated and secured web and database server (HTTPS) is used to provide information security.
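The XML-to-database update path described above might look roughly like this sketch, with SQLite standing in for the SQL Server instance and an invented message schema (the real schema, and the AES-256 encryption layer, are not published):

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical message format sent from the application software.
MESSAGE = """
<readings site="storage-A">
    <tag id="TAG-042" reader="portal-1" timestamp="2010-06-01T12:00:00"/>
    <tag id="TAG-043" reader="portal-1" timestamp="2010-06-01T12:00:05"/>
</readings>
"""

def update_database(conn, xml_text):
    """Insert each RFID tag reading from an XML message into the central table."""
    site = ET.fromstring(xml_text)
    rows = [(site.get("site"), t.get("id"), t.get("reader"), t.get("timestamp"))
            for t in site.findall("tag")]
    conn.executemany(
        "INSERT INTO readings (site, tag_id, reader, ts) VALUES (?, ?, ?, ?)", rows)
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")  # stands in for the central SQL server
conn.execute("CREATE TABLE readings (site TEXT, tag_id TEXT, reader TEXT, ts TEXT)")
n = update_database(conn, MESSAGE)
```

The web page layer would then query this table to populate its panels.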

  14. Palmprint Based Verification System Using SURF Features

    NASA Astrophysics Data System (ADS)

    Srinivas, Badrinath G.; Gupta, Phalguni

This paper describes the design and development of a prototype of a robust biometric system for verification. The system uses features extracted from human hand images using the Speeded Up Robust Features (SURF) operator. The hand images are acquired using a low-cost scanner. The extracted palmprint region is robust to hand translation and rotation on the scanner. The system is tested on the IITK database of 200 images and the PolyU database of 7,751 images. The system is found to be robust with respect to translation and rotation. It has an FAR of 0.02%, an FRR of 0.01% and an accuracy of 99.98%, and can be a suitable system for civilian applications and high-security environments.
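Error rates of the kind reported above can be computed from raw match scores with the usual verification bookkeeping; the accuracy convention below (one common biometric definition) is an assumption, as the abstract does not state its formula:

```python
def verification_rates(genuine_scores, impostor_scores, threshold):
    """FAR, FRR and accuracy at a score threshold (higher score = better match)."""
    false_accepts = sum(s >= threshold for s in impostor_scores)
    false_rejects = sum(s < threshold for s in genuine_scores)
    far = false_accepts / len(impostor_scores)
    frr = false_rejects / len(genuine_scores)
    # One common biometric convention: accuracy = 1 - (FAR + FRR) / 2.
    accuracy = 1 - (far + frr) / 2
    return far, frr, accuracy

# Toy scores for illustration only (not the paper's data):
far, frr, acc = verification_rates(
    genuine_scores=[0.90, 0.80, 0.95, 0.70],
    impostor_scores=[0.20, 0.30, 0.10, 0.75],
    threshold=0.60,
)
```

Sweeping the threshold trades FAR against FRR, which is how operating points such as the reported 0.02%/0.01% pair are chosen.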

  15. Building An Integrated Neurodegenerative Disease Database At An Academic Health Center

    PubMed Central

    Xie, Sharon X.; Baek, Young; Grossman, Murray; Arnold, Steven E.; Karlawish, Jason; Siderowf, Andrew; Hurtig, Howard; Elman, Lauren; McCluskey, Leo; Van Deerlin, Vivianna; Lee, Virginia M.-Y.; Trojanowski, John Q.

    2010-01-01

Background It is becoming increasingly important to study common and distinct etiologies, clinical and pathological features, and mechanisms related to neurodegenerative diseases such as Alzheimer’s disease (AD), Parkinson’s disease (PD), amyotrophic lateral sclerosis (ALS), and frontotemporal lobar degeneration (FTLD). These comparative studies rely on powerful database tools to quickly generate data sets which match diverse and complementary criteria set by the studies. Methods In this paper, we present a novel Integrated NeuroDegenerative Disease (INDD) database developed at the University of Pennsylvania (Penn) through a consortium of Penn investigators. Since these investigators work on AD, PD, ALS and FTLD, this allowed us to achieve the goal of developing an INDD database for these major neurodegenerative disorders. We used Microsoft SQL Server as the platform with built-in “backwards” functionality to provide Access as a front-end client to interface with the database. We used PHP Hypertext Preprocessor to create the “front end” web interface and then integrated individual neurodegenerative disease databases using a master lookup table. We also present methods of data entry, database security, database backups, and database audit trails for this INDD database. Results We compare the results of a biomarker study using the INDD database to those using an alternative approach of querying individual databases separately. Conclusions We have demonstrated that the Penn INDD database has the ability to query multiple database tables from a single console with high accuracy and reliability. The INDD database provides a powerful tool for generating data sets in comparative studies across several neurodegenerative diseases. PMID:21784346
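The master-lookup integration can be illustrated with a toy schema; the table and column names are invented, and SQLite stands in for Microsoft SQL Server:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Separate per-disease tables stand in for the individual databases.
conn.executescript("""
CREATE TABLE ad_patients  (local_id INTEGER, csf_tau REAL);
CREATE TABLE pd_patients  (local_id INTEGER, csf_tau REAL);
-- Master lookup table mapping a global subject ID to each disease database.
CREATE TABLE master_lookup (global_id INTEGER, disease TEXT, local_id INTEGER);

INSERT INTO ad_patients  VALUES (1, 620.0);
INSERT INTO pd_patients  VALUES (7, 210.0);
INSERT INTO master_lookup VALUES (1001, 'AD', 1), (1002, 'PD', 7);
""")

# One console query spanning both databases via the master lookup table.
rows = conn.execute("""
    SELECT m.global_id, m.disease, p.csf_tau
    FROM master_lookup m
    JOIN ad_patients p ON m.disease = 'AD' AND m.local_id = p.local_id
    UNION ALL
    SELECT m.global_id, m.disease, p.csf_tau
    FROM master_lookup m
    JOIN pd_patients p ON m.disease = 'PD' AND m.local_id = p.local_id
    ORDER BY global_id
""").fetchall()
```

A comparative biomarker query then needs a single statement rather than one query per disease database.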

  16. Reactome graph database: Efficient access to complex pathway data

    PubMed Central

    Korninger, Florian; Viteri, Guilherme; Marin-Garcia, Pablo; Ping, Peipei; Wu, Guanming; Stein, Lincoln; D’Eustachio, Peter

    2018-01-01

    Reactome is a free, open-source, open-data, curated and peer-reviewed knowledgebase of biomolecular pathways. One of its main priorities is to provide easy and efficient access to its high quality curated data. At present, biological pathway databases typically store their contents in relational databases. This limits access efficiency because there are performance issues associated with queries traversing highly interconnected data. The same data in a graph database can be queried more efficiently. Here we present the rationale behind the adoption of a graph database (Neo4j) as well as the new ContentService (REST API) that provides access to these data. The Neo4j graph database and its query language, Cypher, provide efficient access to the complex Reactome data model, facilitating easy traversal and knowledge discovery. The adoption of this technology greatly improved query efficiency, reducing the average query time by 93%. The web service built on top of the graph database provides programmatic access to Reactome data by object oriented queries, but also supports more complex queries that take advantage of the new underlying graph-based data storage. By adopting graph database technology we are providing a high performance pathway data resource to the community. The Reactome graph database use case shows the power of NoSQL database engines for complex biological data types. PMID:29377902

  17. Reactome graph database: Efficient access to complex pathway data.

    PubMed

    Fabregat, Antonio; Korninger, Florian; Viteri, Guilherme; Sidiropoulos, Konstantinos; Marin-Garcia, Pablo; Ping, Peipei; Wu, Guanming; Stein, Lincoln; D'Eustachio, Peter; Hermjakob, Henning

    2018-01-01

    Reactome is a free, open-source, open-data, curated and peer-reviewed knowledgebase of biomolecular pathways. One of its main priorities is to provide easy and efficient access to its high quality curated data. At present, biological pathway databases typically store their contents in relational databases. This limits access efficiency because there are performance issues associated with queries traversing highly interconnected data. The same data in a graph database can be queried more efficiently. Here we present the rationale behind the adoption of a graph database (Neo4j) as well as the new ContentService (REST API) that provides access to these data. The Neo4j graph database and its query language, Cypher, provide efficient access to the complex Reactome data model, facilitating easy traversal and knowledge discovery. The adoption of this technology greatly improved query efficiency, reducing the average query time by 93%. The web service built on top of the graph database provides programmatic access to Reactome data by object oriented queries, but also supports more complex queries that take advantage of the new underlying graph-based data storage. By adopting graph database technology we are providing a high performance pathway data resource to the community. The Reactome graph database use case shows the power of NoSQL database engines for complex biological data types.
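Programmatic access of the kind the ContentService offers can be sketched as follows; the endpoint path and the Cypher schema labels are assumptions drawn from Reactome's public documentation and may differ between releases:

```python
from urllib.parse import quote

# Public REST API root. The endpoint path below is an assumption and is not
# verified against any particular ContentService release.
BASE = "https://reactome.org/ContentService"

def query_url(stable_id):
    """Build the URL for fetching a database object by its stable identifier."""
    return f"{BASE}/data/query/{quote(stable_id)}"

# An illustrative Cypher query of the kind the underlying Neo4j database
# supports (Pathway/Event labels and the hasEvent relationship follow the
# published Reactome graph data model):
CYPHER = """
MATCH (p:Pathway {stId: $stId})-[:hasEvent*]->(e:Event)
RETURN DISTINCT e.displayName
"""
```

The variable-length `hasEvent*` traversal is exactly the kind of query that is awkward as a recursive relational join but natural in Cypher.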

  18. Survey of Cyber Crime in Big Data

    NASA Astrophysics Data System (ADS)

    Rajeswari, C.; Soni, Krishna; Tandon, Rajat

    2017-11-01

Big data involves performing computation and database operations over very large volumes of data, often drawn automatically from the data owner's business. Since a key promise of big data is access to information from numerous and varied domains, security and privacy will play an important role in big data research and technology. The limits of standard IT security practices are well known: malicious software embedded in applications and operating systems is a real and growing threat that is difficult to counter, and its impact spreads even faster in big data settings. A central question, therefore, is whether current security and privacy technology is sufficient to provide controlled assurance for very large numbers of direct accesses; for effective use of large-scale data, access from one domain to the data of that or any other domain must be properly authorized. For many years, trustworthy-systems development has produced a rich set of proven security concepts designed to withstand determined adversaries, but this work has largely been dismissed by vendors as "needless excess". In this survey, we discuss how big data can take advantage of this mature security and privacy technology, and explore the research challenges that remain.

  19. Secure quantum private information retrieval using phase-encoded queries

    NASA Astrophysics Data System (ADS)

    Olejnik, Lukasz

    2011-08-01

We propose a quantum solution to the classical private information retrieval (PIR) problem, which allows one to query a database in a private manner. The protocol offers privacy thresholds and allows the user to obtain information from a database in a way that offers the potential adversary, in this model the database owner, no possibility of deterministically establishing the query contents. This protocol may also be viewed as a solution to the symmetrically private information retrieval problem in that it can offer database security (inability for a querying user to steal its contents). Compared to classical solutions, the protocol offers substantial improvement in terms of communication complexity. In comparison with the recent quantum private queries [Phys. Rev. Lett. 100, 230502 (2008)] protocol, it is more efficient in terms of communication complexity and the number of rounds, while offering a clear privacy parameter. We discuss the security of the protocol and analyze its strengths and conclude that using this technique makes it challenging to obtain the unconditional (in the information-theoretic sense) privacy degree; nevertheless, in addition to being simple, the protocol still offers a privacy level. The oracle used in the protocol is inspired both by the classical computational PIR solutions as well as the Deutsch-Jozsa oracle.

  20. Secure quantum private information retrieval using phase-encoded queries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olejnik, Lukasz

We propose a quantum solution to the classical private information retrieval (PIR) problem, which allows one to query a database in a private manner. The protocol offers privacy thresholds and allows the user to obtain information from a database in a way that offers the potential adversary, in this model the database owner, no possibility of deterministically establishing the query contents. This protocol may also be viewed as a solution to the symmetrically private information retrieval problem in that it can offer database security (inability for a querying user to steal its contents). Compared to classical solutions, the protocol offers substantial improvement in terms of communication complexity. In comparison with the recent quantum private queries [Phys. Rev. Lett. 100, 230502 (2008)] protocol, it is more efficient in terms of communication complexity and the number of rounds, while offering a clear privacy parameter. We discuss the security of the protocol and analyze its strengths and conclude that using this technique makes it challenging to obtain the unconditional (in the information-theoretic sense) privacy degree; nevertheless, in addition to being simple, the protocol still offers a privacy level. The oracle used in the protocol is inspired both by the classical computational PIR solutions as well as the Deutsch-Jozsa oracle.

  1. Security and Cloud Outsourcing Framework for Economic Dispatch

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarker, Mushfiqur R.; Wang, Jianhui; Li, Zuyi

The computational complexity and problem sizes of power grid applications have increased significantly with the advent of renewable resources and smart grid technologies. The current paradigm for solving these problems consists of in-house high performance computing infrastructures, which have the drawbacks of high capital expenditure, maintenance costs, and limited scalability. Cloud computing is an ideal alternative due to its powerful computational capacity, rapid scalability, and high cost-effectiveness. A major challenge, however, remains in that the highly confidential grid data is susceptible to potential cyberattacks when outsourced to the cloud. In this work, a security and cloud outsourcing framework is developed for the Economic Dispatch (ED) linear programming application. The security framework transforms the ED linear program into a confidentiality-preserving linear program that masks both the data and problem structure, thus enabling secure outsourcing to the cloud. Results show that for large grid test cases the performance and cost of the cloud approach outperform the in-house infrastructure.

  2. Security and Cloud Outsourcing Framework for Economic Dispatch

    DOE PAGES

    Sarker, Mushfiqur R.; Wang, Jianhui; Li, Zuyi; ...

    2017-04-24

The computational complexity and problem sizes of power grid applications have increased significantly with the advent of renewable resources and smart grid technologies. The current paradigm for solving these problems consists of in-house high performance computing infrastructures, which have the drawbacks of high capital expenditure, maintenance costs, and limited scalability. Cloud computing is an ideal alternative due to its powerful computational capacity, rapid scalability, and high cost-effectiveness. A major challenge, however, remains in that the highly confidential grid data is susceptible to potential cyberattacks when outsourced to the cloud. In this work, a security and cloud outsourcing framework is developed for the Economic Dispatch (ED) linear programming application. The security framework transforms the ED linear program into a confidentiality-preserving linear program that masks both the data and problem structure, thus enabling secure outsourcing to the cloud. Results show that for large grid test cases the performance and cost of the cloud approach outperform the in-house infrastructure.
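The masking idea behind a confidentiality-preserving linear program can be illustrated with a toy variable substitution x = Qy: minimizing c·x subject to Ax ≤ b becomes minimizing (Qᵀc)·y subject to (AQ)y ≤ b, and the client recovers x = Qy from the cloud's answer. This is a generic sketch of the technique, not the paper's exact transformation:

```python
# Pure-Python 2x2 illustration; the secret mask Q never leaves the client.

def matmul(M, v):
    """Multiply a square matrix (list of rows) by a vector."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def inverse2x2(M):
    """Closed-form inverse of a 2x2 matrix."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

Q = [[2.0, 1.0], [1.0, 1.0]]   # secret invertible mask held by the client
x = [3.0, 4.0]                 # pretend optimal solution of the original LP
y = matmul(inverse2x2(Q), x)   # what the cloud's solver would return
x_recovered = matmul(Q, y)     # client-side unmasking
```

The cloud only ever sees the masked objective, constraints, and solution y, none of which reveal the original grid data directly.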

  3. Remote online monitoring and measuring system for civil engineering structures

    NASA Astrophysics Data System (ADS)

    Kujawińska, Malgorzata; Sitnik, Robert; Dymny, Grzegorz; Karaszewski, Maciej; Michoński, Kuba; Krzesłowski, Jakub; Mularczyk, Krzysztof; Bolewicki, Paweł

    2009-06-01

In this paper a distributed intelligent system for on-line measurement, remote monitoring, and data archiving of civil engineering structures is presented. The system consists of a set of optical, full-field displacement sensors connected to a controlling server. The server conducts measurements according to a list of scheduled tasks and stores the primary data or initial results in a remote centralized database. Simultaneously the server performs checks, ordered by the operator, which may in turn result in an alert or a specific action. The structure of the whole system is analyzed, along with a discussion of possible fields of application and the means of providing adequate security during data transport. Finally, a working implementation consisting of fringe projection, geometrical moiré, digital image correlation and grating interferometry sensors and an Oracle XE database is presented. Results from the database used for on-line monitoring of a threshold value of strain for an exemplary area of interest on the engineering structure are presented and discussed.
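The threshold check the server performs on archived readings can be sketched minimally; the area names, units, and threshold value are illustrative, not taken from the paper:

```python
# Hypothetical strain threshold for an area of interest, in microstrain.
THRESHOLD_MICROSTRAIN = 500.0

def check_measurements(readings, threshold=THRESHOLD_MICROSTRAIN):
    """Return alert messages for any (area, value) reading above the threshold."""
    return [f"ALERT: {area} strain {value} exceeds {threshold}"
            for area, value in readings if value > threshold]

# Latest readings pulled from the centralized database (invented values):
alerts = check_measurements([("girder-3", 480.0), ("girder-7", 512.5)])
```

In the described system such a check would run on the server's schedule and could trigger an alert or a follow-up measurement task.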

  4. EMR Database Upgrade from MUMPS to CACHE: Lessons Learned.

    PubMed

    Alotaibi, Abduallah; Emshary, Mshary; Househ, Mowafa

    2014-01-01

Over the past few years, Saudi hospitals have been implementing and upgrading Electronic Medical Record Systems (EMRs) to ensure secure data transfer and exchange between EMRs. This paper focuses on the process of, and lessons learned in, upgrading the MUMPS database to the newer Caché database to ensure the integrity of electronic data transfer within a local Saudi hospital. This paper examines the steps taken by the departments concerned, their action plans and how the change process was managed. Results show that user satisfaction was achieved after the upgrade was completed. The system was stable and offered better healthcare quality to patients as a result of the data exchange. Hardware infrastructure upgrades improved scalability and software upgrades to Caché improved stability. The overall performance was enhanced and new functions (CPOE) were added during the upgrades. The lessons learned were: 1) involve higher management; 2) research the multiple solutions available in the market; 3) plan for a variety of implementation scenarios.

  5. Efficacy and Safety of Tension-Free Vaginal Tape-Secur Mini-Sling Versus Standard Midurethral Slings for Female Stress Urinary Incontinence: A Systematic Review and Meta-Analysis

    PubMed Central

    Wang, Tao; Zhang, Yong

    2015-01-01

Purpose: To assess the efficacy and safety of tension-free vaginal tape (TVT)-Secur for stress urinary incontinence (SUI). Methods: A literature review was performed to identify all published trials of TVT-Secur. The search included the following databases: MEDLINE, Embase, and the Cochrane Controlled Trial Register. Results: Seventeen publications involving a total of 1,879 patients were used to compare TVT-Secur with tension-free obturator tape (TVT-O) and TVT. We found that TVT-Secur had significant reductions in operative time, visual analog score for pain, and postoperative complications compared with TVT-O. Even though TVT-Secur had a significantly lower subjective cure rate (P<0.00001), lower objective cure rate (P<0.00001), and higher intraoperative complication rate compared with TVT-O at 1 to 3 years, there was no significant difference between TVT-Secur and TVT-O in the subjective cure rate (odds ratio [OR], 0.49; 95% confidence interval [CI], 0.22–1.08; P=0.08), objective cure rate (OR, 0.49; 95% CI, 0.22–1.09; P=0.08), or complications at 3 to 5 years. Moreover, TVT-Secur had significantly lower subjective and objective cure rates compared with TVT. Conclusions: This meta-analysis indicates that TVT-Secur did not show inferior efficacy or safety compared with TVT-O for SUI at 3 to 5 years, despite displaying a clear trend toward lower efficacy at 1 to 3 years. Considering that the safety profiles are similar, there are no advantages to using TVT-Secur. PMID:26739179
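Odds ratios and confidence intervals like those quoted above follow the standard log-odds-ratio formula and can be computed from a 2×2 table; the counts in the usage line are invented for illustration, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table:
    a = events in group 1, b = non-events in group 1,
    c = events in group 2, d = non-events in group 2."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of the log odds ratio
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Invented counts for illustration only:
or_, lower, upper = odds_ratio_ci(a=10, b=90, c=20, d=80)
```

A CI that crosses 1, as in the 0.22–1.08 interval reported above, corresponds to a non-significant difference between the arms.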

  6. Ubiquitous-Severance Hospital Project: Implementation and Results

    PubMed Central

    Chang, Bung-Chul; Kim, Young-A; Kim, Jee Hea; Jung, Hae Kyung; Kang, Eun Hae; Kang, Hee Suk; Lee, Hyung Il; Kim, Yong Ook; Yoo, Sun Kook; Sunwoo, Ilnam; An, Seo Yong; Jeong, Hye Jeong

    2010-01-01

    Objectives The purpose of this study was to review an implementation of u-Severance information system with focus on electronic hospital records (EHR) and to suggest future improvements. Methods Clinical Data Repository (CDR) of u-Severance involved implementing electronic medical records (EMR) as the basis of EHR and the management of individual health records. EHR were implemented with service enhancements extending to the clinical decision support system (CDSS) and expanding the knowledge base for research with a repository for clinical data and medical care information. Results The EMR system of Yonsei University Health Systems (YUHS) consists of HP integrity superdome servers using MS SQL as a database management system and MS Windows as its operating system. Conclusions YUHS is a high-performing medical institution with regards to efficient management and customer satisfaction; however, after 5 years of implementation of u-Severance system, several limitations with regards to expandability and security have been identified. PMID:21818425

  7. Ubiquitous-severance hospital project: implementation and results.

    PubMed

    Chang, Bung-Chul; Kim, Nam-Hyun; Kim, Young-A; Kim, Jee Hea; Jung, Hae Kyung; Kang, Eun Hae; Kang, Hee Suk; Lee, Hyung Il; Kim, Yong Ook; Yoo, Sun Kook; Sunwoo, Ilnam; An, Seo Yong; Jeong, Hye Jeong

    2010-03-01

    The purpose of this study was to review an implementation of u-Severance information system with focus on electronic hospital records (EHR) and to suggest future improvements. Clinical Data Repository (CDR) of u-Severance involved implementing electronic medical records (EMR) as the basis of EHR and the management of individual health records. EHR were implemented with service enhancements extending to the clinical decision support system (CDSS) and expanding the knowledge base for research with a repository for clinical data and medical care information. The EMR system of Yonsei University Health Systems (YUHS) consists of HP integrity superdome servers using MS SQL as a database management system and MS Windows as its operating system. YUHS is a high-performing medical institution with regards to efficient management and customer satisfaction; however, after 5 years of implementation of u-Severance system, several limitations with regards to expandability and security have been identified.

  8. Performance assessment of EMR systems based on post-relational database.

    PubMed

    Yu, Hai-Yan; Li, Jing-Song; Zhang, Xiao-Guang; Tian, Yu; Suzuki, Muneou; Araki, Kenji

    2012-08-01

    Post-relational databases provide high performance and are currently widely used in American hospitals. As few hospital information systems (HIS) in either China or Japan are based on post-relational databases, here we introduce a new-generation electronic medical records (EMR) system called Hygeia, which was developed with the post-relational database Caché and the latest platform Ensemble. Utilizing the benefits of a post-relational database, Hygeia is equipped with an "integration" feature that allows all the system users to access data-with a fast response time-anywhere and at anytime. Performance tests of databases in EMR systems were implemented in both China and Japan. First, a comparison test was conducted between a post-relational database, Caché, and a relational database, Oracle, embedded in the EMR systems of a medium-sized first-class hospital in China. Second, a user terminal test was done on the EMR system Izanami, which is based on the identical database Caché and operates efficiently at the Miyazaki University Hospital in Japan. The results proved that the post-relational database Caché works faster than the relational database Oracle and showed perfect performance in the real-time EMR system.
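A response-time comparison of the kind described can be sketched with a simple timing harness; SQLite stands in here, since neither Caché nor Oracle is available from the Python standard library:

```python
import sqlite3
import time

def mean_query_time(conn, sql, runs=100):
    """Average wall-clock time per execution of a query, in seconds."""
    start = time.perf_counter()
    for _ in range(runs):
        conn.execute(sql).fetchall()
    return (time.perf_counter() - start) / runs

# A small stand-in table of patient-record rows (invented data).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO records VALUES (?, ?)",
                 [(i, f"row-{i}") for i in range(1000)])

t = mean_query_time(conn, "SELECT * FROM records WHERE id = 500")
```

Running the same harness against two database back-ends with identical schemas and workloads gives the kind of head-to-head response-time figures the comparison tests rely on.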

  9. Evolving the US Army Research Laboratory (ARL) Technical Communication Strategy

    DTIC Science & Technology

    2016-10-01

of added value and enhanced tech transfer, and strengthened relationships with academic and industry collaborators. In support of increasing ARL’s...communication skills; and Prong 3: Promote a Stakeholder Database to implement a stakeholder database (including names and preferences) and use a...Group, strategic planning, communications strategy, stakeholder database, workforce improvement, science and technology, S&T

  10. Measuring Information Security Performance with 10 by 10 Model for Holistic State Evaluation

    PubMed Central

    2016-01-01

Organizations should measure their information security performance if they wish to take the right decisions and develop it in line with their security needs. Since the measurement of information security is generally underdeveloped in practice and many organizations find the existing recommendations too complex, the paper presents a solution in the form of a 10 by 10 information security performance measurement model. The model, ISP 10×10M, is composed of ten critical success factors, 100 key performance indicators and 6 performance levels. Its content was devised on the basis of findings presented in current research studies and standards, while its structure results from an empirical study conducted among information security professionals from Slovenia. Results of the study show that a high level of information security performance is mostly dependent on measures aimed at managing information risks, employees and information sources, while formal and environmental factors have a lesser impact. Experts believe that information security should evolve systematically, where it is recommended that initial steps include technical, logical and physical security controls, while advanced activities should relate predominantly to strategic management. By applying the proposed model, organizations are able to determine the actual level of information security performance based on the weighted indexing technique. In this manner they identify the measures they ought to develop in order to improve the current situation. The ISP 10×10M is a useful tool for conducting internal system evaluations and decision-making. It may also be applied to a larger sample of organizations in order to determine the general state-of-play for research purposes. PMID:27655001
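The weighted indexing technique can be sketched as a weighted mean of per-factor indicator scores; the factor names, weights, and scores below are illustrative, not the model's actual ten factors or 100 indicators:

```python
# Each factor carries a weight and a list of indicator scores in [0, 1].
FACTORS = {
    "risk management":     (0.40, [1.0, 0.5]),
    "employee management": (0.35, [0.8, 0.6]),
    "formal controls":     (0.25, [0.2, 0.4]),
}

def performance_index(factors):
    """Weighted mean of per-factor indicator averages."""
    total_weight = sum(w for w, _ in factors.values())
    score = sum(w * (sum(s) / len(s)) for w, s in factors.values())
    return score / total_weight

index = performance_index(FACTORS)
```

The resulting index could then be mapped onto one of the model's six performance levels to drive the internal evaluation.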

  11. Policies | High-Performance Computing | NREL

    Science.gov Websites

Acceptable Use: policy governing user accountability, resource use, and use by foreign nationals. Data Security: the data security policy, including data protection. Data Retention: the data retention policy, including project-centric and user-centric data. Shared Storage Usage: policy on shared storage use.

  12. Forced Shortsightedness: Security Force Assistance Missions

    DTIC Science & Technology

    2014-06-01

legislation, it is therefore the intention of the Congress to promote the peace of the world and the foreign policy, security, and general welfare of the... legislation, Congressional Research Service (CRS) reports, the Defense Institute of Security Assistance Management’s (DISAM) Green Book, and interviews with...developed database, there are “184 separate legislative authorities that power the 165 Building Partnership Capacity (BPC) programs managed across

  13. Composite Bloom Filters for Secure Record Linkage.

    PubMed

    Durham, Elizabeth Ashley; Kantarcioglu, Murat; Xue, Yuan; Toth, Csaba; Kuzu, Mehmet; Malin, Bradley

    2014-12-01

The process of record linkage seeks to integrate instances that correspond to the same entity. Record linkage has traditionally been performed through the comparison of identifying field values (e.g., Surname); however, when databases are maintained by disparate organizations, the disclosure of such information can breach the privacy of the corresponding individuals. Various private record linkage (PRL) methods have been developed to obscure such identifiers, but they vary widely in their ability to balance competing goals of accuracy, efficiency and security. The tokenization and hashing of field values into Bloom filters (BF) enables greater linkage accuracy and efficiency than other PRL methods, but the encodings may be compromised through frequency-based cryptanalysis. Our objective is to adapt a BF encoding technique to mitigate such attacks with minimal sacrifices in accuracy and efficiency. To accomplish these goals, we introduce a statistically-informed method to generate BF encodings that integrate bits from multiple fields, the frequencies of which are provably associated with a minimum number of fields. Our method enables a user-specified tradeoff between security and accuracy. We compare our encoding method with other techniques using a public dataset of voter registration records and demonstrate that the increases in security come with only minor losses to accuracy.
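The Bloom-filter field encoding at the heart of this approach can be sketched as follows; the filter length, hash count, and Dice comparison are typical choices from the BF linkage literature, not the paper's tuned composite scheme:

```python
import hashlib

# Illustrative parameters, not the paper's tuned values.
M_BITS = 256   # filter length
K_HASH = 3     # hash functions per bigram

def bigrams(value):
    """Split a field value into padded character bigrams."""
    padded = f"_{value.lower()}_"
    return [padded[i:i + 2] for i in range(len(padded) - 1)]

def bloom_encode(value):
    """Set K_HASH hash-derived bit positions for each bigram of the value."""
    bits = [0] * M_BITS
    for gram in bigrams(value):
        for k in range(K_HASH):
            digest = hashlib.sha256(f"{k}:{gram}".encode()).hexdigest()
            bits[int(digest, 16) % M_BITS] = 1
    return bits

def dice_similarity(x, y):
    """Dice coefficient between two encodings, the usual BF comparator."""
    inter = sum(a & b for a, b in zip(x, y))
    return 2 * inter / (sum(x) + sum(y))
```

Similar names share bigrams and therefore bit positions, so their Dice score stays high, while unrelated names overlap only through hash collisions; the composite scheme in the paper additionally mixes bits across fields to blunt frequency-based cryptanalysis.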

  14. [Airport security check of medical substances used during patient repatriation].

    PubMed

    Felkai, Péter

    2012-09-16

    During airport security check of passenger luggage, hazardous items and substances are prohibited to be taken into the restricted safety zone of the airport and the aircraft. Among equipment of the medical staff escorting the patient, there are several devices and materials which are considered hazardous for security reasons. However, medical equipment and substances are indispensable for treating patients during the flight. The aim of the author was to present his experience obtained with the use of an instrument developed for testing liquids, aerosols and gels for security reasons. An instrument based on Raman spectroscopy was used for the identification of medical substances. The results confirmed that the instrument was able to recognize the tested medical substances. The non-destructive testing maintained sample integrity and asepsis. The data indicate that the instrument has a promising utility for the identification of medical substances. It seems important that during repatriation medical substances should be selected not only on the ground of their medical necessity, but their packaging should be also taken into consideration. It is necessary to perform more tests on different medical substances used in emergency care in order to make the database of medical substances stored in the library of instrument more complete.

  15. Composite Bloom Filters for Secure Record Linkage

    PubMed Central

    Durham, Elizabeth Ashley; Kantarcioglu, Murat; Xue, Yuan; Toth, Csaba; Kuzu, Mehmet; Malin, Bradley

    2014-01-01

    The process of record linkage seeks to integrate instances that correspond to the same entity. Record linkage has traditionally been performed through the comparison of identifying field values (e.g., Surname); however, when databases are maintained by disparate organizations, the disclosure of such information can breach the privacy of the corresponding individuals. Various private record linkage (PRL) methods have been developed to obscure such identifiers, but they vary widely in their ability to balance competing goals of accuracy, efficiency and security. The tokenization and hashing of field values into Bloom filters (BF) enables greater linkage accuracy and efficiency than other PRL methods, but the encodings may be compromised through frequency-based cryptanalysis. Our objective is to adapt a BF encoding technique to mitigate such attacks with minimal sacrifices in accuracy and efficiency. To accomplish these goals, we introduce a statistically-informed method to generate BF encodings that integrate bits from multiple fields, the frequencies of which are provably associated with a minimum number of fields. Our method enables a user-specified tradeoff between security and accuracy. We compare our encoding method with other techniques using a public dataset of voter registration records and demonstrate that the increases in security come with only minor losses to accuracy. PMID:25530689
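
    The field tokenization and hashing described above can be sketched as follows. This is a minimal illustration of a basic field-level Bloom filter encoding with Dice-coefficient matching, not the composite multi-field encoding the paper introduces; the bigram padding, filter size `m`, and hash count `k` are illustrative assumptions.

```python
import hashlib

def bigrams(value):
    # Pad the field value and split it into character bigrams.
    padded = f"_{value.lower()}_"
    return [padded[i:i + 2] for i in range(len(padded) - 1)]

def bloom_encode(value, m=64, k=4):
    # Hash each bigram into k positions of an m-bit filter.
    bits = [0] * m
    for gram in bigrams(value):
        for seed in range(k):
            h = hashlib.sha256(f"{seed}:{gram}".encode()).digest()
            bits[int.from_bytes(h, "big") % m] = 1
    return bits

def dice(a, b):
    # Dice coefficient between two bit vectors: this is what makes
    # fuzzy matching of misspelled identifiers possible.
    inter = sum(x & y for x, y in zip(a, b))
    return 2 * inter / (sum(a) + sum(b))
```

    Because similar strings share most of their bigrams, `dice(bloom_encode("smith"), bloom_encode("smyth"))` stays high while unrelated names score low, yet only bit vectors need to be exchanged.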

  16. An overview of the roles and structure of international high-security veterinary laboratories for infectious animal diseases.

    PubMed

    Murray, P K

    1998-08-01

    The unique structure, role and operations of government high-security (HS) laboratories which work on animal diseases are described, with particular reference to the laboratories of nine countries. High-security laboratories provide cost-effective insurance against catastrophic losses which could occur following exotic disease outbreaks. The importance of these laboratories is reflected in the fact that several new laboratories have recently been constructed at considerable expense and older facilities have undergone major renovations. Biosecurity is fundamental to the operation of high-security laboratories, so good facility design and microbiological security practices are very important. High-security laboratories conduct exotic disease diagnosis, certification and surveillance, and also perform research into virology, disease pathogenesis and improvements to diagnostic tests and vaccines. The mandate of these laboratories includes the training of veterinarians in the recognition of exotic diseases. One extremely important role is the provision of expert advice on exotic diseases and participation (both nationally and internationally) in policy decisions regarding animal disease issues.

  17. Verifying the secure setup of Unix client/servers and detection of network intrusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feingold, R.; Bruestle, H.R.; Bartoletti, T.

    1995-07-01

    This paper describes our technical approach to developing and delivering Unix host- and network-based security products to meet the increasing challenges in information security. Today's global "Infosphere" presents us with a networked environment that knows no geographical, national, or temporal boundaries, and no ownership, laws, or identity cards. This seamless aggregation of computers, networks, databases, applications, and the like store, transmit, and process information. This information is now recognized as an asset to governments, corporations, and individuals alike. This information must be protected from misuse. The Security Profile Inspector (SPI) performs static analyses of Unix-based clients and servers to check on their security configuration. SPI's broad range of security tests and flexible usage options support the needs of novice and expert system administrators alike. SPI's use within the Department of Energy and Department of Defense has resulted in more secure systems, less vulnerable to hostile intentions. Host-based information protection techniques and tools must also be supported by network-based capabilities. Our experience shows that a weak link in a network of clients and servers presents itself sooner or later, and can be more readily identified by dynamic intrusion detection techniques and tools. The Network Intrusion Detector (NID) is one such tool. NID is designed to monitor and analyze activity on an Ethernet broadcast Local Area Network segment and produce transcripts of suspicious user connections. NID's retrospective and real-time modes have proven invaluable to security officers faced with ongoing attacks to their systems and networks.
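
    The paper does not enumerate SPI's individual tests, but as a hypothetical illustration, one classic static configuration check of the kind such tools perform, flagging world-writable files, can be sketched as:

```python
import os
import stat

def world_writable(paths):
    # Flag files whose mode grants write permission to "other" users,
    # a common static misconfiguration a security profile scanner
    # would report. Unreadable or missing paths are skipped.
    findings = []
    for path in paths:
        try:
            mode = os.stat(path).st_mode
        except OSError:
            continue
        if mode & stat.S_IWOTH:
            findings.append(path)
    return findings
```

    A real scanner would combine many such checks (setuid binaries, weak password hashes, risky service configurations) into one report; this shows only the shape of a single static test.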

  18. Biocontainment, biosecurity, and security practices in beef feedyards.

    PubMed

    Brandt, Aric W; Sanderson, Michael W; DeGroot, Brad D; Thomson, Dan U; Hollis, Larry C

    2008-01-15

    To determine the biocontainment, biosecurity, and security practices at beef feedyards in the Central Plains of the United States. Survey. Managers of feedyards in Colorado, Kansas, Nebraska, Oklahoma, and Texas that feed beef cattle for finish before slaughter; feedyards had to have an active concentrated animal feeding operation permit with a 1-time capacity of ≥ 1,000 cattle. A voluntary survey of feedyard personnel was conducted. Identified feedyard personnel were interviewed and responses regarding facility design, security, employees, disease preparedness, feedstuffs, hospital or treatment systems, sanitation, cattle sources, handling of sick cattle, and disposal of carcasses were collected in a database questionnaire. The survey was conducted for 106 feedyards with a 1-time capacity that ranged from 1,300 to 125,000 cattle. Feedyards in general did not have high implementation of biocontainment, biosecurity, or security practices. Smaller feedyards were, in general, less likely to use good practices than were larger feedyards. Results of the survey provided standard practices for biocontainment, biosecurity, and security in feedyards located in Central Plains states. Information gained from the survey results can be used by consulting veterinarians and feedyard managers as a basis for discussion and to target training efforts.

  19. Ensuring Data Storage Security in Tree cast Routing Architecture for Sensor Networks

    NASA Astrophysics Data System (ADS)

    Kumar, K. E. Naresh; Sagar, U. Vidya; Waheed, Mohd. Abdul

    2010-10-01

    Recent advances in technology have made possible low-cost, low-power wireless sensors with efficient energy consumption. A network of such nodes can coordinate among themselves for distributed sensing and processing of certain data. We propose an architecture, known as Tree Cast, that provides a stateless solution for efficient routing in wireless sensor networks. We propose a unique method of address allocation, building up multiple disjoint trees which are geographically intertwined and rooted at the data sink. Using these trees, messages can be routed to and from the sink node without maintaining any routing state in the sensor nodes. In contrast to traditional solutions, where the IT services are under proper physical, logical and personnel controls, this routing architecture moves the application software and databases to large data centers, where the management of the data and services may not be fully trustworthy. This unique attribute, however, poses many new security challenges which have not been well understood. In this paper, we focus on data storage security, which has always been an important aspect of quality of service. To ensure the correctness of users' data in this architecture, we propose an effective and flexible distributed scheme with two salient features, in contrast to its predecessors. By utilizing the homomorphic token with distributed verification of erasure-coded data, our scheme achieves the integration of storage correctness insurance and data error localization, i.e., the identification of misbehaving server(s). Unlike most prior works, the new scheme further supports secure and efficient dynamic operations on data blocks, including data update, delete and append. Extensive security and performance analysis shows that the proposed scheme is highly efficient and resilient against Byzantine failure, malicious data modification attack, and even server colluding attacks.
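
    The core of the token idea can be sketched in a greatly simplified classical form. The actual scheme operates over erasure-coded vectors with many precomputed tokens; here a single precomputed random linear combination over a prime field stands in for it, and the modulus, integer block representation, and single-token check are illustrative assumptions.

```python
import random

P = 2_147_483_647  # the Mersenne prime 2**31 - 1, used as the field modulus

def precompute_token(blocks, coeffs):
    # The verifier keeps this token: a random linear combination
    # of the data blocks over GF(P), computed before outsourcing.
    return sum(c * b for c, b in zip(coeffs, blocks)) % P

def server_response(stored_blocks, coeffs):
    # On challenge, an honest server recomputes the same combination
    # over the blocks it actually stores.
    return sum(c * b for c, b in zip(coeffs, stored_blocks)) % P

blocks = [17, 42, 99, 7]
coeffs = [random.randrange(1, P) for _ in blocks]
token = precompute_token(blocks, coeffs)

assert server_response(blocks, coeffs) == token       # intact data passes
assert server_response([17, 42, 100, 7], coeffs) != token  # tampering detected
```

    Any single-block modification shifts the response by a nonzero multiple of its coefficient, so the check always catches it; with erasure coding across servers, a failed check also localizes which server misbehaved.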

  20. Online Patron Records and Privacy: Service vs. Security.

    ERIC Educational Resources Information Center

    Fouty, Kathleen G.

    1993-01-01

    Examines issues regarding the privacy of information contained in patron databases that have resulted from online circulation systems. Topics discussed include library policies to protect information in patron records; ensuring compliance with policies; limiting the data collected; security authorizations; and creating and modifying patron…

  1. Intelligent community management system based on the devicenet fieldbus

    NASA Astrophysics Data System (ADS)

    Wang, Yulan; Wang, Jianxiong; Liu, Jiwen

    2013-03-01

    With the rapid development of the national economy and the improvement of people's living standards, people are making higher demands on their living environment, and higher requirements are being placed on the content, efficiency and service quality of estate management. This paper presents an in-depth analysis of the structure and composition of the intelligent community. According to users' requirements and related specifications, it implements the district management system, which includes basic information management (housing, household information, administrator-level and password management), service management (standard property costs, collection of property charges, arrears history and other property expenses), security management (household gas, water, electricity and other utilities, and the security of public places in the district), and systems management (database backup, database restore, log management). The article also analyzes the intelligent community system and proposes an architecture based on B/S (browser/server) technology, achieving global network device management with a friendly, easy-to-use, unified human-machine interface.

  2. High Performance Descriptive Semantic Analysis of Semantic Graph Databases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joslyn, Cliff A.; Adolf, Robert D.; al-Saffar, Sinan

    As semantic graph database technology grows to address components ranging from extant large triple stores to SPARQL endpoints over SQL-structured relational databases, it will become increasingly important to be able to understand their inherent semantic structure, whether codified in explicit ontologies or not. Our group is researching novel methods for what we call descriptive semantic analysis of RDF triplestores, to serve purposes of analysis, interpretation, visualization, and optimization. But data size and computational complexity make it increasingly necessary to bring high performance computational resources to bear on this task. Our research group built a novel high performance hybrid system comprising computational capability for semantic graph database processing utilizing the large multi-threaded architecture of the Cray XMT platform, conventional servers, and large data stores. In this paper we describe that architecture and our methods, and present the results of our analyses of basic properties, connected components, namespace interaction, and typed paths for the Billion Triple Challenge 2010 dataset.
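
    The paper's analyses ran on a Cray XMT at billion-triple scale; as a toy single-machine sketch, one of the named analyses, connected components over an RDF edge list, might look like the following (union-find over subjects and objects, ignoring predicates and edge direction):

```python
def connected_components(triples):
    # Union-find over the subject/object nodes of (s, p, o) triples.
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    for s, _, o in triples:
        union(s, o)

    comps = {}
    for node in parent:
        comps.setdefault(find(node), set()).add(node)
    return list(comps.values())
```

    On a triplestore this statistic reveals how fragmented the graph is, e.g. whether most resources hang off one giant component or form many disconnected islands.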

  3. The International Collaboration for Autism Registry Epidemiology (iCARE): multinational registry-based investigations of autism risk factors and trends.

    PubMed

    Schendel, Diana E; Bresnahan, Michaeline; Carter, Kim W; Francis, Richard W; Gissler, Mika; Grønborg, Therese K; Gross, Raz; Gunnes, Nina; Hornig, Mady; Hultman, Christina M; Langridge, Amanda; Lauritsen, Marlene B; Leonard, Helen; Parner, Erik T; Reichenberg, Abraham; Sandin, Sven; Sourander, Andre; Stoltenberg, Camilla; Suominen, Auli; Surén, Pål; Susser, Ezra

    2013-11-01

    The International Collaboration for Autism Registry Epidemiology (iCARE) is the first multinational research consortium (Australia, Denmark, Finland, Israel, Norway, Sweden, USA) to promote research on the geographical and temporal heterogeneity, phenotype, family and life course patterns, and etiology of autism. iCARE devised solutions to challenges in multinational collaboration concerning data access security, confidentiality and management. Data are obtained by integrating existing national or state-wide, population-based, individual-level data systems and undergo rigorous harmonization and quality control processes. Analyses are performed using database federation via a computational infrastructure with a secure, web-based interface. iCARE provides a unique, unprecedented resource in autism research that will significantly enhance the ability to detect environmental and genetic contributions to the causes and life course of autism.

  4. Multi-Bit Quantum Private Query

    NASA Astrophysics Data System (ADS)

    Shi, Wei-Xu; Liu, Xing-Tong; Wang, Jian; Tang, Chao-Jing

    2015-09-01

    Most of the existing Quantum Private Query (QPQ) protocols provide only a single-bit query service, and thus have to be repeated several times when more bits are retrieved. Wei et al.'s scheme for block queries requires a high-dimension quantum key distribution system, which is still restricted to the laboratory. Here, based on Markus Jakobi et al.'s single-bit QPQ protocol, we propose a multi-bit quantum private query protocol in which the user can get access to several bits within one single query. We also extend the proposed protocol to block queries, using a binary matrix to guard database security. Analysis in this paper shows that our protocol has better communication complexity and implementability, and can achieve a considerable level of security.

  5. Internet Portal For A Distributed Management of Groundwater

    NASA Astrophysics Data System (ADS)

    Meissner, U. F.; Rueppel, U.; Gutzke, T.; Seewald, G.; Petersen, M.

    The management of groundwater resources for the supply of German cities and suburban areas has become a matter of public interest during the last years. Negative headlines in the Rhein-Main-Area dealt with cracks in buildings as well as damaged woodlands and inundated agricultural areas as an effect of varying groundwater levels. Usually a holistic management of groundwater resources is not existent because of the complexity of the geological system, the large number of involved groups and their divergent interests and a lack of essential information. The development of a network-based information system for an efficient groundwater management was the target of the project "Grundwasser-Online" [1]. The management of groundwater resources has to take into account various hydrogeological, climatic, water-economical, chemical and biological interrelations [2]. Thus, the traditional approaches in information retrieval, which are characterised by a high personnel and time expenditure, are not sufficient. Furthermore, the efficient control of the groundwater cultivation requires a direct communication between the different water supply companies, the consultant engineers, the scientists, the governmental agencies and the public, by using computer networks. The presented groundwater information system consists of different components, especially for the collection, storage, evaluation and visualisation of groundwater-relevant information. Network-based technologies are used [3]. For the collection of time-dependent groundwater-relevant information, modern technologies of Mobile Computing have been analysed in order to provide an integrated approach in the management of large groundwater systems. The aggregated information is stored within a distributed geo-scientific database system which enables a direct integration of simulation programs for the evaluation of interactions in groundwater systems. Thus, even a prognosis for the evolution of groundwater states can be given. In order to generate reports automatically, technologies are utilised. The visualisation of geo-scientific databases in the internet considering their geographic reference is performed with internet map servers. According to the communication of the map server with the underlying geo-scientific database, it is necessary that the demanded data can be filtered interactively in the internet browser using chronological and logical criteria. With regard to public use, the security aspects within the described distributed system are of major importance. Therefore, security methods for the modelling of access rights in combination with digital signatures have been analysed and implemented in order to provide a secure data exchange and communication between the different partners in the network.
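
    The secure exchange of measurements between partners that the record describes could be sketched as follows. This is a hypothetical simplification: it authenticates messages with an HMAC over a pre-shared key, whereas the described system uses true digital signatures with per-partner keys and modelled access rights.

```python
import hmac
import hashlib

# Hypothetical pre-shared key; a real deployment would use per-partner
# public-key signatures (PKI) rather than one shared secret.
SHARED_KEY = b"hypothetical pre-shared key"

def sign(message: bytes) -> str:
    # Attach an authentication tag so partners can verify that a
    # groundwater measurement was not altered in transit.
    return hmac.new(SHARED_KEY, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str) -> bool:
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(sign(message), tag)
```

    A partner receiving `(message, tag)` recomputes the tag and rejects any record whose data was modified after signing.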

  6. Bayesian performance metrics and small system integration in recent homeland security and defense applications

    NASA Astrophysics Data System (ADS)

    Jannson, Tomasz; Kostrzewski, Andrew; Patton, Edward; Pradhan, Ranjit; Shih, Min-Yi; Walter, Kevin; Savant, Gajendra; Shie, Rick; Forrester, Thomas

    2010-04-01

    In this paper, Bayesian inference is applied to performance metrics definition of the important class of recent Homeland Security and defense systems called binary sensors, including both (internal) system performance and (external) CONOPS. The medical analogy is used to define the PPV (Positive Predictive Value), the basic Bayesian metrics parameter of the binary sensors. Also, Small System Integration (SSI) is discussed in the context of recent Homeland Security and defense applications, emphasizing a highly multi-technological approach, within the broad range of clusters ("nexus") of electronics, optics, X-ray physics, γ-ray physics, and other disciplines.

  7. Detection of people in military and security context imagery

    NASA Astrophysics Data System (ADS)

    Shannon, Thomas M. L.; Spier, Emmet H.; Wiltshire, Ben

    2014-10-01

    A high level of manual visual surveillance of complex scenes is dependent solely on the awareness of human operators whereas an autonomous person detection solution could assist by drawing their attention to potential issues, in order to reduce cognitive burden and achieve more with less manpower. Our research addressed the challenge of the reliable identification of persons in a scene who may be partially obscured by structures or by handling weapons or tools. We tested the efficacy of a recently published computer vision approach based on the construction of cascaded, non-linear classifiers from part-based deformable models by assessing performance using imagery containing infantrymen in the open or when obscured, undertaking low level tactics or acting as civilians using tools. Results were compared with those obtained from published upright pedestrian imagery. The person detector yielded a precision of approximately 65% for a recall rate of 85% for military context imagery as opposed to a precision of 85% for the upright pedestrian image cases. These results compared favorably with those reported by the authors when applied to a range of other on-line imagery databases. Our conclusion is that the deformable part-based model method may be a potentially useful people detection tool in the challenging environment of military and security context imagery.
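
    The precision and recall figures quoted above follow the standard definitions from detection counts. The counts below are illustrative only, chosen to roughly reproduce the reported ~65% precision at 85% recall operating point:

```python
def precision_recall(tp, fp, fn):
    # Precision: fraction of reported detections that are real people.
    # Recall: fraction of real people that were detected.
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return precision, recall

# Illustrative counts: 85 true detections, 46 false alarms,
# 15 missed people.
p, r = precision_recall(tp=85, fp=46, fn=15)
```

    At this operating point roughly one reported detection in three is a false alarm, which is the cost the paper reports for extending a pedestrian detector to obscured, weapon-handling subjects.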

  8. 7 CFR 274.3 - Retailer management.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... retailer, and it must include acceptable privacy and security features. Such systems shall only be... terminals that are capable of relaying electronic transactions to a central database computer for... specifications prior to implementation of the EBT system to enable third party processors to access the database...

  9. 76 FR 19376 - Statement of Organizations, Functions, and Delegations of Authority

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-07

    ... safety mission. These outside groups include academic organizations, private organizations, and other Federal Agencies. 3. Coordinates the access to large databases for pharmacoepidemiologic and..., procedures, training, and security or databases available to OSE. 3. Acts as focal point for all hardware...

  10. Encryption Characteristics of Two USB-based Personal Health Record Devices

    PubMed Central

    Wright, Adam; Sittig, Dean F.

    2007-01-01

    Personal health records (PHRs) hold great promise for empowering patients and increasing the accuracy and completeness of health information. We reviewed two small USB-based PHR devices that allow a patient to easily store and transport their personal health information. Both devices offer password protection and encryption features. Analysis of the devices shows that they store their data in a Microsoft Access database. Due to a flaw in the encryption of this database, recovering the user’s password can be accomplished with minimal effort. Our analysis also showed that, rather than encrypting health information with the password chosen by the user, the devices stored the user’s password as a string in the database and then encrypted that database with a common password set by the manufacturer. This is another serious vulnerability. This article describes the weaknesses we discovered, outlines three critical flaws with the security model used by the devices, and recommends four guidelines for improving the security of similar devices. PMID:17460132
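
    The flaw described above is that the devices stored the user's password in the database and encrypted it under a manufacturer-wide password. The standard remedy, deriving the encryption key from the user's password so the password itself never needs to be stored, can be sketched with PBKDF2 (this is a generic illustration consistent with the article's recommendations, not the vendors' actual fix):

```python
import hashlib
import os

def derive_key(password: str, salt: bytes, iterations: int = 200_000) -> bytes:
    # Derive a 32-byte encryption key from the user's password.
    # The password is never stored; without it, the key (and hence
    # the encrypted health record database) cannot be recovered.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)

# A fresh random salt per device defeats precomputed-dictionary attacks.
salt = os.urandom(16)
key = derive_key("correct horse battery staple", salt)
```

    The derived key would then encrypt the database with a standard cipher; a wrong password yields a different key and a failed decryption rather than a recoverable stored secret.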

  11. Building an integrated neurodegenerative disease database at an academic health center.

    PubMed

    Xie, Sharon X; Baek, Young; Grossman, Murray; Arnold, Steven E; Karlawish, Jason; Siderowf, Andrew; Hurtig, Howard; Elman, Lauren; McCluskey, Leo; Van Deerlin, Vivianna; Lee, Virginia M-Y; Trojanowski, John Q

    2011-07-01

    It is becoming increasingly important to study common and distinct etiologies, clinical and pathological features, and mechanisms related to neurodegenerative diseases such as Alzheimer's disease, Parkinson's disease, amyotrophic lateral sclerosis, and frontotemporal lobar degeneration. These comparative studies rely on powerful database tools to quickly generate data sets that match the diverse and complementary criteria set by investigators. In this article, we present a novel integrated neurodegenerative disease (INDD) database, which was developed at the University of Pennsylvania (Penn) with the help of a consortium of Penn investigators. Because the work of these investigators is based on Alzheimer's disease, Parkinson's disease, amyotrophic lateral sclerosis, and frontotemporal lobar degeneration, we were able to achieve the goal of developing an INDD database for these major neurodegenerative disorders. We used Microsoft SQL Server as a platform, with built-in "backwards" functionality to provide Access as a frontend client to interface with the database. We used the PHP Hypertext Preprocessor to create the web frontend and a master lookup table to integrate the individual neurodegenerative disease databases. We also present methods of data entry, database security, database backups, and database audit trails for this INDD database. Using the INDD database, we compared the results of a biomarker study with those obtained by querying individual databases separately. We have demonstrated that the Penn INDD database has the ability to query multiple database tables from a single console with high accuracy and reliability. The INDD database provides a powerful tool for generating data sets in comparative studies on several neurodegenerative diseases. Copyright © 2011 The Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
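
    The master-lookup-table integration can be illustrated in miniature. The table and column names below are hypothetical simplifications (the actual INDD schema is not given in the abstract); the point is that one query spanning several disease databases is resolved through a shared patient identifier:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Hypothetical simplified schema: one table per disease database plus
# a master lookup table keyed on a shared patient identifier.
cur.executescript("""
CREATE TABLE master (patient_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE ad (patient_id INTEGER, csf_tau REAL);   -- Alzheimer's data
CREATE TABLE pd (patient_id INTEGER, updrs INTEGER);  -- Parkinson's data
INSERT INTO master VALUES (1, 'P001'), (2, 'P002');
INSERT INTO ad VALUES (1, 310.5);
INSERT INTO pd VALUES (1, 22), (2, 35);
""")
# A single cross-disease query resolved via the master table:
rows = cur.execute("""
SELECT m.name, ad.csf_tau, pd.updrs
FROM master m
JOIN ad ON ad.patient_id = m.patient_id
JOIN pd ON pd.patient_id = m.patient_id
""").fetchall()
```

    Only patients present in both disease databases are returned, which is exactly the comparative-cohort query that would otherwise require separate queries and manual matching.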

  12. 49 CFR 228.203 - Program components.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Program components. (a) System security. The integrity of the program and database must be protected by a security system that utilizes an employee identification number and password, or a comparable method, to... system to pre-populate fields of the hours of service record provided that— (A) The recordkeeping system...

  13. Access control based on attribute certificates for medical intranet applications.

    PubMed

    Mavridis, I; Georgiadis, C; Pangalos, G; Khair, M

    2001-01-01

    Clinical information systems frequently use intranet and Internet technologies. However these technologies have emphasized sharing and not security, despite the sensitive and private nature of much health information. Digital certificates (electronic documents which recognize an entity or its attributes) can be used to control access in clinical intranet applications. To outline the need for access control in distributed clinical database systems, to describe the use of digital certificates and security policies, and to propose the architecture for a system using digital certificates, cryptography and security policy to control access to clinical intranet applications. We have previously developed a security policy, DIMEDAC (Distributed Medical Database Access Control), which is compatible with emerging public key and privilege management infrastructure. In our implementation approach we propose the use of digital certificates, to be used in conjunction with DIMEDAC. Our proposed access control system consists of two phases: the ways users gain their security credentials; and how these credentials are used to access medical data. Three types of digital certificates are used: identity certificates for authentication; attribute certificates for authorization; and access-rule certificates for propagation of access control policy. Once a user is identified and authenticated, subsequent access decisions are based on a combination of identity and attribute certificates, with access-rule certificates providing the policy framework. Access control in clinical intranet applications can be successfully and securely managed through the use of digital certificates and the DIMEDAC security policy.
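
    The two-phase decision described above, authenticate with the identity certificate, then authorize via the attribute certificate against access-rule policy, can be sketched as follows. The classes and role/action names are hypothetical stand-ins; real DIMEDAC deployments would use signed X.509 identity and attribute certificates rather than plain objects.

```python
from dataclasses import dataclass

# Greatly simplified stand-ins for two of the three certificate types.
@dataclass
class IdentityCert:
    subject: str

@dataclass
class AttributeCert:
    subject: str
    role: str

# The access-rule "certificate": which roles may perform which actions.
ACCESS_RULES = {
    ("physician", "read_record"),
    ("physician", "write_record"),
    ("nurse", "read_record"),
}

def authorize(identity, attribute, action):
    # Phase 1: the identity and attribute certificates must agree on
    # the subject (authentication of the credential pair).
    if identity.subject != attribute.subject:
        return False
    # Phase 2: the subject's role must be permitted the action by the
    # access-rule policy (authorization).
    return (attribute.role, action) in ACCESS_RULES
```

    Separating the role (attribute certificate) from the person (identity certificate) lets the policy change, e.g. revoking write access for nurses, without reissuing anyone's identity credentials.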

  14. Department of Defense High Performance Computing Modernization Program. 2008 Annual Report

    DTIC Science & Technology

    2009-04-01

    place to another on the network. Without it, a computer could only talk to itself - no email, no web browsing, and no iTunes. Most of the Internet...Your SecurID Card), Ken Renard Secure Wireless, Rob Scott and Stephen Bowman Securing Today's Networks, Rich Whittney, Juniper Networks, Federal

  15. Independent Review of Aviation Technology and Research Information Analysis System (ATRIAS) Database

    DTIC Science & Technology

    1994-02-01

    capability to support the Federal Aviation Administration (FAA)/ Aviation Security Research and Development Service’s (ACA) Explosive Detection...Systems (EDS) programs and Aviation Security Human Factors Program (ASHFP). This review was conducted by an independent consultant selected by the FAA...sections 2 and 3 of the report. Overall, ATRIAS was found to address many technology application areas relevant to the FAA’s aviation security programs

  16. Virtualization of open-source secure web services to support data exchange in a pediatric critical care research network

    PubMed Central

    Sward, Katherine A; Newth, Christopher JL; Khemani, Robinder G; Cryer, Martin E; Thelen, Julie L; Enriquez, Rene; Shaoyu, Su; Pollack, Murray M; Harrison, Rick E; Meert, Kathleen L; Berg, Robert A; Wessel, David L; Shanley, Thomas P; Dalton, Heidi; Carcillo, Joseph; Jenkins, Tammara L; Dean, J Michael

    2015-01-01

    Objectives To examine the feasibility of deploying a virtual web service for sharing data within a research network, and to evaluate the impact on data consistency and quality. Material and Methods Virtual machines (VMs) encapsulated an open-source, semantically and syntactically interoperable secure web service infrastructure along with a shadow database. The VMs were deployed to 8 Collaborative Pediatric Critical Care Research Network Clinical Centers. Results Virtual web services could be deployed in hours. The interoperability of the web services reduced format misalignment from 56% to 1%; 99% of the data transferred consistently using the data dictionary, with the remaining 1% needing human curation. Conclusions Use of virtualized open-source secure web service technology could enable direct electronic abstraction of data from hospital databases for research purposes. PMID:25796596
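
    The role of the data dictionary in reducing format misalignment can be sketched as a validation step run before transfer. The fields, types, and ranges below are hypothetical examples, not the network's actual dictionary:

```python
# Hypothetical miniature "data dictionary": the expected type and the
# allowed range or code set for each field.
DATA_DICTIONARY = {
    "age_months": {"type": int, "min": 0, "max": 216},
    "sex": {"type": str, "codes": {"M", "F", "U"}},
}

def validate(record):
    # Return a list of field-level problems; an empty list means the
    # record conforms to the dictionary and can transfer automatically.
    problems = []
    for field, spec in DATA_DICTIONARY.items():
        value = record.get(field)
        if not isinstance(value, spec["type"]):
            problems.append(f"{field}: wrong type")
            continue
        if "codes" in spec and value not in spec["codes"]:
            problems.append(f"{field}: unknown code")
        if "min" in spec and not (spec["min"] <= value <= spec["max"]):
            problems.append(f"{field}: out of range")
    return problems
```

    Records that pass flow straight through the web service; the small remainder with problems is the portion that, in the study, required human curation.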

  17. Intellectual property (IP) analysis of embossed hologram business

    NASA Astrophysics Data System (ADS)

    Hunt, David; Reingand, Nadya; Cantrell, Robert

    2006-02-01

    This paper presents an overview of patents and patent applications on security embossed holograms, and highlights the possibilities offered by patent searching and analysis. Thousands of patent documents relevant to embossed holograms were uncovered by the study. The search was performed in the following databases: U.S. Patent Office, European Patent Office, Japanese Patent Office and Korean Patent Office for the time frame from 1971 through November 2005. The patent analysis unveils trends in patent temporal distribution, patent families formation, significant technological coverage within the embossed holography market and other interesting insights.

  18. Comprehensive Routing Security Development and Deployment for the Internet

    DTIC Science & Technology

    2015-02-01

    feature enhancement and bug fixes. • MySQL: MySQL is a widely used and popular open source database package. It was chosen for database support in the...RPSTIR depends on several other open source packages. • MySQL: MySQL is used for the local RPKI database cache. • OpenSSL: OpenSSL is used for...cryptographic libraries for X.509 certificates. • ODBC mySql Connector: ODBC (Open Database Connectivity) is a standard programming interface (API) for

  19. Design of Integrated Database on Mobile Information System: A Study of Yogyakarta Smart City App

    NASA Astrophysics Data System (ADS)

    Nurnawati, E. K.; Ermawati, E.

    2018-02-01

    An integration database is a database which acts as the data store for multiple applications and thus integrates data across these applications (in contrast to an application database). An integration database needs a schema that takes all its client applications into account. The benefit of such a schema is that sharing data among applications does not require an extra layer of integration services on the applications. Any changes to data made in a single application are made available to all applications at the time of database commit, thus keeping the applications' data use better synchronized. This study aims to design and build an integrated database that can be used by various applications on a mobile device based platform built around the smart city concept. The database can be used by the various applications either together or separately. The design and development of the database emphasize flexibility, security, and completeness of the attributes shared by the various applications to be built. The method used in this study is to choose an appropriate logical database structure (patterns of data), build the relational database model (database design), test the resulting design with some prototype apps, and analyze system performance with test data. The integrated database can be utilized by both the admin and the user in an integral and comprehensive platform. This system can help admins, managers, and operators manage the application easily and efficiently. The Android-based app is built on a dynamic client-server model where data is extracted from an external MySQL database, so if data changes in the database, the data in the Android application changes as well. The app assists users in searching for information related to Yogyakarta (as a smart city), especially on culture, government, hotels, and transportation.

  20. Lidar and Dial application for detection and identification: a proposal to improve safety and security

    NASA Astrophysics Data System (ADS)

    Gaudio, P.; Malizia, A.; Gelfusa, M.; Murari, A.; Parracino, S.; Poggi, L. A.; Lungaroni, M.; Ciparisse, J. F.; Di Giovanni, D.; Cenciarelli, O.; Carestia, M.; Peluso, E.; Gabbarini, V.; Talebzadeh, S.; Bellecci, C.

    2017-01-01

Nowadays the intentional release of chemical contaminants into the air (in both open and confined environments) is a dramatic source of risk for public health worldwide. The need for high-tech networks composed of software, diagnostics, decision-support systems, and cyber-security tools is urging all stakeholders (military, public, research, and academic entities) to create innovative solutions to face this problem and improve both safety and security. The Quantum Electronics and Plasma Physics (QEP) Research Group of the University of Rome Tor Vergata has been working since the 1960s on the development of laser-based technologies for the stand-off detection of contaminants in air. To date, four demonstrators have been developed (two LIDAR-based and two DIAL-based) and were used in experimental campaigns throughout 2015. These systems and technologies can be used together as an innovative solution to the problem of public safety and security: a network of detection systems. A low-cost LIDAR-based system has been tested in an urban area to detect pollutants from urban traffic; in this paper the authors show the results obtained in the city of Crotone (southern Italy). This system can serve as a first alarm and can be coupled with an identification system to investigate the nature of the threat. A laboratory DIAL-based system has been used to create a database of absorption spectra of chemical substances that could be released into the atmosphere; these spectra can be considered the fingerprints of the substances to be identified. To build the database, in-cell absorption measurements at different conditions are in progress, and the first results are presented in this paper.

  1. A new security solution to JPEG using hyper-chaotic system and modified zigzag scan coding

    NASA Astrophysics Data System (ADS)

    Ji, Xiao-yong; Bai, Sen; Guo, Yu; Guo, Hui

    2015-05-01

Though JPEG is an excellent image compression standard, it does not provide any security by itself. A security solution for JPEG was therefore proposed in Zhang et al. (2014), but that scheme has some flaws, and in this paper we propose a new scheme based on a discrete hyper-chaotic system and modified zigzag scan coding. By shuffling the identifiers of the zigzag-scan-encoded sequence with a hyper-chaotic sequence, and selectively encrypting those coefficients in the zigzag-encoded domain that bear little relation to the correlation of the plain image, we achieve high compression performance and robust security simultaneously. We also present and analyse the flaws in Zhang's scheme through theoretical analysis and experimental verification, and compare our scheme with Zhang's. Simulation results verify that our method performs better in both security and efficiency.
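The shuffling step can be illustrated with a key-driven permutation. The sketch below substitutes a plain logistic map for the paper's discrete hyper-chaotic system (an assumption made for brevity), and the identifiers are simple integers rather than real zigzag-encoded JPEG data:

```python
def chaotic_permutation(n, x0, r=3.99):
    # The paper uses a discrete hyper-chaotic system; a logistic map is
    # used here as a simplified, key-driven stand-in.
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)          # logistic-map iteration
        xs.append(x)
    # Ranking the chaotic samples yields a key-dependent permutation.
    return sorted(range(n), key=lambda i: xs[i])

def shuffle(identifiers, key_x0):
    perm = chaotic_permutation(len(identifiers), key_x0)
    return [identifiers[p] for p in perm]

def unshuffle(shuffled, key_x0):
    perm = chaotic_permutation(len(shuffled), key_x0)
    out = [None] * len(shuffled)
    for i, p in enumerate(perm):
        out[p] = shuffled[i]
    return out

identifiers = list(range(16))        # stand-ins for zigzag identifiers
cipher = shuffle(identifiers, key_x0=0.7)
assert unshuffle(cipher, key_x0=0.7) == identifiers
# A slightly different initial condition (the key) produces a different
# permutation with overwhelming probability, the usual chaotic key
# sensitivity, though we do not assert that here.
```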

  2. 75 FR 70047 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-16

    ... to the Office of Management and Budget for approval. The Securities and Exchange Commission has begun the design of a new Electronic Data Collection System database (the Database) and invites comment on... Investor Education and Advocacy, Washington, DC 20549-0213. Electronic Data Collection System Notice is...

  3. Prevalence of Food Insecurity in Iran: A Systematic Review and Meta-analysis.

    PubMed

    Behzadifar, Meysam; Behzadifar, Masoud; Abdi, Shadi; Malekzadeh, Reza; Arab Salmani, Masoumeh; Ghoreishinia, Gholamreza; Falahi, Ebrahim; Mirzaei, Masoud; Shams Biranvand, Nabi; Sayehmiri, Kourosh

    2016-04-01

Food security is one of the main factors of individual and social health; it is of such importance that the World Bank and the Food and Agriculture Organization (FAO) included it among the Millennium Development Goals. This study aimed to report the prevalence of food insecurity in Iran. We searched English-language databases (Scopus, Ovid, Web of Science, PubMed, and Google Scholar) and Iranian databases (SID, Magiran, and IranMedex) for the terms Iran, food insecurity, and prevalence up to August 2015. The pooled food-insecurity prevalence was calculated with the DerSimonian-Laird method. All analyses used a random-effects model with 95% CIs, and heterogeneity across studies was assessed with sub-group and meta-regression analyses. A total of 31 studies were included. The prevalence of food insecurity was 49% among households (95% CI: 40%-59%), 67% in children (95% CI: 63%-70%), 61% in mothers (95% CI: 35%-88%), 49% in adolescents (95% CI: 33%-66%), and 65% in the elderly (95% CI: 44%-86%). The prevalence of food insecurity is high in Iran. Fiscal policies should promote the nutritional knowledge of household members and support households in meeting their nutritional needs, giving priority to middle and low socioeconomic groups.
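The pooling method named in the abstract, DerSimonian-Laird random-effects estimation, can be sketched as follows; the study effects and variances below are hypothetical stand-ins, not the 31 included studies:

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooling (DerSimonian and Laird, 1986)."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    # Cochran's Q and the method-of-moments between-study variance tau^2
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    # Re-weight each study with tau^2 added to its within-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical study prevalences and variances (not the paper's data)
effects = [0.42, 0.55, 0.49, 0.61, 0.38]
variances = [0.004, 0.006, 0.003, 0.008, 0.005]
pooled, (lo, hi) = dersimonian_laird(effects, variances)
assert min(effects) < pooled < max(effects)
```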

  4. Design and implementation of a high performance network security processor

    NASA Astrophysics Data System (ADS)

    Wang, Haixin; Bai, Guoqiang; Chen, Hongyi

    2010-03-01

The last few years have seen significant progress in the field of application-specific processors. One example is the network security processor (NSP), which performs the various cryptographic operations specified by network security protocols and helps offload computation-intensive burdens from network processors (NPs). This article presents a high-performance NSP system architecture and implementation intended to accelerate both the internet protocol security (IPSec) and secure socket layer (SSL) protocols, which are widely employed in virtual private network (VPN) and e-commerce applications. An efficient dual one-way pipelined data-transfer skeleton and an optimised integration scheme for the heterogeneous parallel crypto engine arrays lead to a Gbps-rate NSP that is programmable with domain-specific descriptor-based instructions. The descriptor-based control flow fragments large data packets and distributes them to the crypto engine arrays, which fully utilises the parallel computation resources and improves overall system data throughput. A prototyping platform for this NSP design is implemented with a Xilinx XC3S5000-based FPGA chip set. Results show a peak throughput of 2.85 Gbps for the IPSec ESP tunnel mode, with over 2100 full SSL handshakes per second, at a clock rate of 95 MHz.
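The descriptor-based fragment-and-distribute control flow can be mimicked in software. In this sketch, a thread pool stands in for the parallel crypto engine arrays and XOR stands in for the actual IPSec/SSL ciphers; the fragment size and engine count are assumptions, not values from the paper:

```python
from concurrent.futures import ThreadPoolExecutor

FRAGMENT_BYTES = 64        # descriptor payload size; an assumption here

def engine_encrypt(fragment, key=0x5A):
    # Stand-in "crypto engine": XOR in place of the NSP's real ciphers.
    return bytes(b ^ key for b in fragment)

def nsp_process(packet, n_engines=4):
    # One descriptor per fragment: (sequence number, payload slice).
    descriptors = [(i, packet[off:off + FRAGMENT_BYTES])
                   for i, off in enumerate(range(0, len(packet), FRAGMENT_BYTES))]
    # The engine pool consumes descriptors in parallel; the sequence
    # number restores packet order after out-of-order completion.
    with ThreadPoolExecutor(max_workers=n_engines) as pool:
        done = pool.map(lambda d: (d[0], engine_encrypt(d[1])), descriptors)
        return b"".join(frag for _, frag in sorted(done))

packet = bytes(range(256)) * 4        # a 1 KiB "large packet"
assert nsp_process(nsp_process(packet)) == packet   # XOR is an involution
```

Fragmenting keeps all engines busy on a single large packet instead of serializing it through one engine, which is the throughput argument the abstract makes.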

  5. Parallel Processable Cryptographic Methods with Unbounded Practical Security.

    ERIC Educational Resources Information Center

    Rothstein, Jerome

    Addressing the problem of protecting confidential information and data stored in computer databases from access by unauthorized parties, this paper details coding schemes which present such astronomical work factors to potential code breakers that security breaches are hopeless in any practical sense. Two procedures which can be used to encode for…

  6. Granular Security in a Graph Database

    DTIC Science & Technology

    2016-03-01

    have a presence in more than one layer. For example, a single social media user may have an account in Twitter, Facebook, and Instagram with... Instagram layers. This restriction re- flects the reality that user A’s Facebook account cannot connect directly to user B’s Twitter account. A security

  7. 76 FR 34616 - Privacy Act of 1974: Implementation of Exemptions; Department of Homeland Security/National...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-14

    ... questions please contact: Emily Andrew (703-235-2182), Privacy Officer, National Protection and Programs... U.S.C. 552a, the Department of Homeland Security (DHS)/National Protection and Programs Directorate... Screening Database (TSDB). The TSDB is the Federal government's consolidated and integrated terrorist...

  8. 15 CFR 743.2 - High performance computers: Post shipment verification reporting.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 15 Commerce and Foreign Trade 2 2012-01-01 2012-01-01 false High performance computers: Post... Commerce and Foreign Trade (Continued) BUREAU OF INDUSTRY AND SECURITY, DEPARTMENT OF COMMERCE EXPORT ADMINISTRATION REGULATIONS SPECIAL REPORTING § 743.2 High performance computers: Post shipment verification...

  9. 15 CFR 743.2 - High performance computers: Post shipment verification reporting.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 15 Commerce and Foreign Trade 2 2011-01-01 2011-01-01 false High performance computers: Post... Commerce and Foreign Trade (Continued) BUREAU OF INDUSTRY AND SECURITY, DEPARTMENT OF COMMERCE EXPORT ADMINISTRATION REGULATIONS SPECIAL REPORTING § 743.2 High performance computers: Post shipment verification...

  10. 15 CFR 743.2 - High performance computers: Post shipment verification reporting.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 2 2010-01-01 2010-01-01 false High performance computers: Post... Commerce and Foreign Trade (Continued) BUREAU OF INDUSTRY AND SECURITY, DEPARTMENT OF COMMERCE EXPORT ADMINISTRATION REGULATIONS SPECIAL REPORTING § 743.2 High performance computers: Post shipment verification...

  11. 15 CFR 743.2 - High performance computers: Post shipment verification reporting.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 15 Commerce and Foreign Trade 2 2013-01-01 2013-01-01 false High performance computers: Post... Commerce and Foreign Trade (Continued) BUREAU OF INDUSTRY AND SECURITY, DEPARTMENT OF COMMERCE EXPORT ADMINISTRATION REGULATIONS SPECIAL REPORTING § 743.2 High performance computers: Post shipment verification...

  12. Correlates of quality educational programs.

    PubMed

    Chester, Deborah R; Tracy, Jessamyn A; Earp, Emily; Chauhan, Reetu

    2002-06-01

Preliminary evaluation findings are presented that explore relationships between educational program quality and program characteristics such as program type, security level, aftercare, teacher certification, facility size, and private versus public provider. Several program characteristics are found to be related to measures of educational program quality. Among the major quality characteristics are the proportion of program teachers who are professionally certified, smaller facility size, the level of aftercare services, and provider source, with private for-profit providers performing lowest and public providers performing highest. The article closes with a description of the Juvenile Justice Educational Enhancement Program's continuing evaluation of correlates of educational program quality through the continued development of a comprehensive database.

  13. Developing High-resolution Soil Database for Regional Crop Modeling in East Africa

    NASA Astrophysics Data System (ADS)

    Han, E.; Ines, A. V. M.

    2014-12-01

The most readily available soil data for regional crop modeling in Africa is the World Inventory of Soil Emission potentials (WISE) dataset, which has 1125 soil profiles for the world but does not extensively cover Ethiopia, Kenya, Uganda, and Tanzania in East Africa. Another available dataset is HC27 (Harvest Choice, IFPRI), which is gridded (10 km) but composed of generic soil profiles based on only three criteria (texture, rooting depth, and organic carbon content). In this paper, we present the development and application of a high-resolution (1 km) gridded soil database for regional crop modeling in East Africa. Basic soil information is extracted from the Africa Soil Information Service (AfSIS), which provides essential soil properties (bulk density, soil organic carbon, soil pH, and percentages of sand, silt, and clay) for six standardized soil layers (5, 15, 30, 60, 100, and 200 cm) at 1 km resolution. Soil hydraulic properties (e.g., field capacity and wilting point) are derived from the AfSIS soil dataset using well-proven pedo-transfer functions and customized to the soil data requirements of DSSAT-CSM. The crop model is used to evaluate yield forecasts with the new high-resolution soil database, compared against WISE and HC27. We also present results of DSSAT loosely coupled with a hydrologic model (VIC) to assimilate root-zone soil moisture. Creating a grid-based soil database that provides consistent soil input for the two different models (DSSAT and VIC) is a critical part of this work. The soil database is expected to contribute to future applications of DSSAT crop simulation in East Africa, where food security is highly vulnerable.
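Deriving hydraulic properties from basic AfSIS soil properties via pedo-transfer functions can be sketched as below. The coefficients are hypothetical placeholders, not the published functions used in the study; only the layer depths come from the abstract:

```python
# AfSIS standard layer bottom depths in cm (from the abstract)
LAYERS_CM = [5, 15, 30, 60, 100, 200]

def field_capacity(sand, clay, oc):
    # Illustrative linear pedo-transfer function; the coefficients are
    # hypothetical, not those actually applied to the AfSIS data.
    return 0.30 - 0.0019 * sand + 0.0021 * clay + 0.010 * oc

def wilting_point(sand, clay, oc):
    # Same caveat: placeholder coefficients for illustration only.
    return 0.08 - 0.0005 * sand + 0.0028 * clay + 0.004 * oc

def plant_available_water(profile):
    """Layer-weighted available water (cm) over the 0-200 cm profile.
    profile: one (sand %, clay %, organic carbon %) tuple per layer."""
    paw, top = 0.0, 0
    for bottom, (sand, clay, oc) in zip(LAYERS_CM, profile):
        awc = field_capacity(sand, clay, oc) - wilting_point(sand, clay, oc)
        paw += awc * (bottom - top)
        top = bottom
    return paw

loam = [(40.0, 20.0, 2.0)] * 6        # hypothetical uniform loam profile
assert field_capacity(40, 20, 2.0) > wilting_point(40, 20, 2.0)
```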

  14. Long-acting injectable antipsychotics for prevention and management of violent behaviour in psychotic patients.

    PubMed

    Mohr, Pavel; Knytl, Pavel; Voráčková, Veronika; Bravermanová, Anna; Melicher, Tomáš

    2017-09-01

It has been well established that long-term antipsychotic treatment prevents relapse, lowers the number of rehospitalisations, and effectively reduces violent behaviour. Although violent behaviour is not a typical manifestation of schizophrenia or other psychotic disorders, the diagnosis of psychosis increases the overall risk of violence. One of the few modifiable factors of violence risk is adherence to medication; conversely, non-adherence to drug treatment and subsequent relapse increase the risk of violent acts. Non-adherence can be addressed in part by long-acting injectable antipsychotics (LAI). The aim of our review was to examine the role of antipsychotic drugs, especially LAI, in the prevention and management of violent behaviour in psychosis. This is a non-systematic, narrative review of data from open, naturalistic, retrospective, and population studies, case series, and post hoc analyses of randomised controlled trials. Electronic databases (PubMed, Embase) were searched to identify relevant papers. Nine published papers (3 cross-sectional chart reviews, 4 retrospective studies, and 2 prospective randomised trials) were found. The results indicated positive clinical and anti-aggressive effects of LAI in psychotic patients at high risk of violent behaviour. The reviewed evidence suggests that secured drug treatment with LAI may have clinical benefit in schizophrenia patients at high risk of violent behaviour: LAI significantly reduced the severity of hostility and aggressivity and the number of violent incidents and criminal offences. These findings are further supported by empirical evidence from clinical practice, including the high rates of LAI prescription to schizophrenia patients in high-security and forensic psychiatric facilities. The available data encourage the use of LAI in forensic psychiatry, especially during court-ordered commitment treatment. © 2017 John Wiley & Sons Ltd.

  15. Methods to Secure Databases Against Vulnerabilities

    DTIC Science & Technology

    2015-12-01

    for several languages such as C, C++, PHP, Java and Python [16]. MySQL will work well with very large databases. The documentation references...using Eclipse and connected to each database management system using Python and Java drivers provided by MySQL , MongoDB, and Datastax (for Cassandra...tiers in Python and Java . Problem MySQL MongoDB Cassandra 1. Injection a. Tautologies Vulnerable Vulnerable Not Vulnerable b. Illegal query
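The injection-tautology probe listed in this excerpt can be reproduced in miniature. The sketch uses Python's built-in sqlite3 rather than the MySQL, MongoDB, and Cassandra drivers the report tested, and contrasts a string-built query (vulnerable to the classic `' OR '1'='1` tautology) with a parameterized one:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (name TEXT, secret TEXT)")
db.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_vulnerable(name):
    # String concatenation: attacker-controlled input becomes SQL text.
    sql = "SELECT * FROM users WHERE name = '" + name + "'"
    return db.execute(sql).fetchall()

def login_parameterized(name):
    # Placeholder binding: input is always treated as data, never SQL.
    return db.execute("SELECT * FROM users WHERE name = ?",
                      (name,)).fetchall()

tautology = "nobody' OR '1'='1"
assert login_vulnerable(tautology)         # tautology returns every row
assert not login_parameterized(tautology)  # bound input matches nothing
```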

  16. ProXL (Protein Cross-Linking Database): A Platform for Analysis, Visualization, and Sharing of Protein Cross-Linking Mass Spectrometry Data

    PubMed Central

    2016-01-01

ProXL is a Web application and accompanying database designed for sharing, visualizing, and analyzing bottom-up protein cross-linking mass spectrometry data with an emphasis on structural analysis and quality control. ProXL is designed to be independent of any particular software pipeline. The import process is simplified by the use of the ProXL XML data format, which shields developers of data importers from the relative complexity of the relational database schema. The database and Web interfaces function equally well for any software pipeline and allow data from disparate pipelines to be merged and contrasted. ProXL includes robust public and private data sharing capabilities, including a project-based interface designed to ensure security and facilitate collaboration among multiple researchers. ProXL provides multiple interactive and highly dynamic data visualizations that facilitate structure-based analysis of the observed cross-links as well as quality control. ProXL is open-source, well-documented, and freely available at https://github.com/yeastrc/proxl-web-app. PMID:27302480

  17. ProXL (Protein Cross-Linking Database): A Platform for Analysis, Visualization, and Sharing of Protein Cross-Linking Mass Spectrometry Data.

    PubMed

    Riffle, Michael; Jaschob, Daniel; Zelter, Alex; Davis, Trisha N

    2016-08-05

ProXL is a Web application and accompanying database designed for sharing, visualizing, and analyzing bottom-up protein cross-linking mass spectrometry data with an emphasis on structural analysis and quality control. ProXL is designed to be independent of any particular software pipeline. The import process is simplified by the use of the ProXL XML data format, which shields developers of data importers from the relative complexity of the relational database schema. The database and Web interfaces function equally well for any software pipeline and allow data from disparate pipelines to be merged and contrasted. ProXL includes robust public and private data sharing capabilities, including a project-based interface designed to ensure security and facilitate collaboration among multiple researchers. ProXL provides multiple interactive and highly dynamic data visualizations that facilitate structure-based analysis of the observed cross-links as well as quality control. ProXL is open-source, well-documented, and freely available at https://github.com/yeastrc/proxl-web-app.

  18. Mind matters: A meta-analysis on parental mentalization and sensitivity as predictors of infant-parent attachment.

    PubMed

    Zeegers, Moniek A J; Colonnesi, Cristina; Stams, Geert-Jan J M; Meins, Elizabeth

    2017-12-01

    Major developments in attachment research over the past 2 decades have introduced parental mentalization as a predictor of infant-parent attachment security. Parental mentalization is the degree to which parents show frequent, coherent, or appropriate appreciation of their infants' internal states. The present study examined the triangular relations between parental mentalization, parental sensitivity, and attachment security. A total of 20 effect sizes (N = 974) on the relation between parental mentalization and attachment, 82 effect sizes (N = 6,664) on the relation between sensitivity and attachment, and 24 effect sizes (N = 2,029) on the relation between mentalization and sensitivity were subjected to multilevel meta-analyses. The results showed a pooled correlation of r = .30 between parental mentalization and infant attachment security, and rs of .25 for the correlations between sensitivity and attachment security, and between parental mentalization and sensitivity. A meta-analytic structural equation model was performed to examine the combined effects of mentalization and sensitivity as predictors of infant attachment. Together, the predictors explained 12% of the variance in attachment security. After controlling for the effect of sensitivity, the relation between parental mentalization and attachment remained, r = .24; the relation between sensitivity and attachment remained after controlling for parental mentalization, r = .19. Sensitivity also mediated the relation between parental mentalization and attachment security, r = .07, suggesting that mentalization exerts both direct and indirect influences on attachment security. The results imply that parental mentalization should be incorporated into existing models that map the predictors of infant-parent attachment. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
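The controlled correlations in this abstract can be approximately recovered from the three pooled rs with the standard partial-correlation formula. The sketch below is a back-of-the-envelope check, not the paper's multilevel MASEM, which presumably explains the small gap from the reported .24:

```python
import math

def partial_r(r_xy, r_xz, r_yz):
    # Correlation between x and y with z partialled out
    return (r_xy - r_xz * r_yz) / math.sqrt((1 - r_xz ** 2) * (1 - r_yz ** 2))

r_ma = 0.30   # pooled mentalization-attachment correlation
r_sa = 0.25   # pooled sensitivity-attachment correlation
r_ms = 0.25   # pooled mentalization-sensitivity correlation

p_mentalization = partial_r(r_ma, r_ms, r_sa)   # about .25 (reported .24)
p_sensitivity = partial_r(r_sa, r_ms, r_ma)     # about .19 (reported .19)
```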

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aldridge, Chris D.

Mobile biometric devices (MBDs) capable of both enrolling individuals in databases and performing identification checks of subjects in the field are seen as an important capability for military, law enforcement, and homeland security operations, and the technology is advancing rapidly. The Department of Homeland Security Science and Technology Directorate, through an Interagency Agreement with Sandia, sponsored a series of pilot projects to obtain information for the first-responder law enforcement community on further identification of requirements for mobile biometric device technology. Working with 62 different jurisdictions, including components of the Department of Homeland Security, Sandia delivered a series of reports on user operation of state-of-the-art mobile biometric devices. These reports included feedback on MBD usage in both operational and exercise scenarios. The findings and conclusions of the project address both the limitations and the possibilities of MBD technology to improve operations. Evidence of these possibilities can be found in the adoption of this technology by many agencies today and in the cooperation of several law enforcement agencies in both participating in the pilot efforts and sharing information about their own experiences in efforts undertaken separately.

  20. A community effort to protect genomic data sharing, collaboration and outsourcing.

    PubMed

    Wang, Shuang; Jiang, Xiaoqian; Tang, Haixu; Wang, Xiaofeng; Bu, Diyue; Carey, Knox; Dyke, Stephanie Om; Fox, Dov; Jiang, Chao; Lauter, Kristin; Malin, Bradley; Sofia, Heidi; Telenti, Amalio; Wang, Lei; Wang, Wenhao; Ohno-Machado, Lucila

    2017-01-01

The human genome can reveal sensitive information and is potentially re-identifiable, which raises privacy and security concerns about sharing such data on a wide scale. In 2016, we organized the third Critical Assessment of Data Privacy and Protection competition as a community effort to bring together biomedical informaticists, computer privacy and security researchers, and scholars in ethical, legal, and social implications (ELSI) to assess the latest advances in privacy-preserving techniques for protecting human genomic data. Teams were asked to develop novel protection methods for emerging genome privacy challenges in three scenarios: Track 1, data sharing through the Beacon service of the Global Alliance for Genomics and Health; Track 2, collaborative discovery of similar genomes between two institutions; and Track 3, data outsourcing to public cloud services. The latter two tracks continued themes from our 2015 competition, while the first was new, a response to a recently established vulnerability. The winning strategy for Track 1 mitigated the privacy risk by hiding approximately 11% of the variation in the database while permitting around 160,000 queries, a significant improvement over the baseline. The winning strategies in Tracks 2 and 3 showed significant progress over the previous competition, achieving multiple orders of magnitude improvement in computational runtime and memory requirements. The outcomes suggest that applying highly optimized privacy-preserving and secure computation techniques to safeguard genomic data sharing and analysis is useful. However, the results also indicate that further efforts are needed to refine these techniques into practical solutions.
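The Track 1 mitigation, withholding a fraction of the variation from Beacon-style presence queries, can be sketched with toy data; the variant encoding and the uniform-random choice of hidden variants are assumptions made for illustration, not the winning strategy's actual selection rule:

```python
import random

random.seed(1)   # fixed seed so the toy example is reproducible

# Toy variant store: (chromosome, position, allele) triples
variants = {("1", pos, "A") for pos in range(1000)}

# The winning Track 1 strategy withheld roughly 11% of the variation;
# here we hide a uniform-random 11% of the toy database.
hidden = set(random.sample(sorted(variants), k=int(0.11 * len(variants))))

def beacon_query(variant):
    # Presence query that never answers "yes" for a hidden variant
    return variant in variants and variant not in hidden

answered_yes = sum(beacon_query(("1", p, "A")) for p in range(1000))
assert answered_yes == 890   # 110 of the 1000 variants are withheld
```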

  1. Secondary Use of Claims Data from the Austrian Health Insurance System with i2b2: A Pilot Study.

    PubMed

    Endel, Florian; Duftschmid, Georg

    2016-01-01

    In conformity with increasing international efforts to reuse routine health data for scientific purposes, the Main Association of Austrian Social Security Organisations provides pseudonymized claims data of the Austrian health care system for clinical research. We aimed to examine, whether an integration of the corresponding database into i2b2 would be possible and provide benefits. We applied docker-based software containers and data transformations to set up the system. To assess the benefits of i2b2 we plan to reenact the task of cohort formation of an earlier research project. The claims database was successfully integrated into i2b2. The docker-based installation approach will be published as git repository. The assessment of i2b2's benefits is currently work in progress and will be presented at the conference. Docker enables a flexible, reproducible, and resource-efficient installation of i2b2 within the restricted environment implied by our highly secured target system. First preliminary tests indicated several potential benefits of i2b2 compared to the methods applied during the earlier research project.

  2. Data, Data Everywhere but Not a Byte to Read: Managing Monitoring Information.

    ERIC Educational Resources Information Center

    Stafford, Susan G.

    1993-01-01

    Describes the Forest Science Data Bank that contains 2,400 data sets from over 350 existing ecological studies. Database features described include involvement of the scientific community; database documentation; data quality assurance; security; data access and retrieval; and data import/export flexibility. Appendices present the Quantitative…

  3. 76 FR 26776 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-09

    ... current collection of information to the Office of Management and Budget for approval. The Securities and Exchange Commission has begun the design of a new Electronic Data Collection System database (the Database..., Washington, DC 20549-0213. Extension: Electronic Data Collection System; OMB Control No. 3235-0672; SEC File...

  4. 15 CFR 743.2 - High performance computers: Post shipment verification reporting.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 15 Commerce and Foreign Trade 2 2014-01-01 2014-01-01 false High performance computers: Post... Commerce and Foreign Trade (Continued) BUREAU OF INDUSTRY AND SECURITY, DEPARTMENT OF COMMERCE EXPORT ADMINISTRATION REGULATIONS SPECIAL REPORTING AND NOTIFICATION § 743.2 High performance computers: Post shipment...

  5. Environmental security: a geographic information system analysis approach--the case of Kenya.

    PubMed

    Bocchi, Stefano; Disperati, Stefano Peppino; Rossi, Simone

    2006-02-01

Studies of the relationships between environmental factors and violence or conflict constitute a much-debated research field called environmental security. Several authors argue that environmental scarcity, that is, scarcity of renewable resources, can contribute to generating violence or social unrest, particularly within states scarcely endowed with technical know-how and social structures, such as developing countries. In this work, we referred to the theoretical model developed by the Environmental Change and Acute Conflict Project. Our goal was to use easily available spatial databases to map the various sources of environmental scarcity through geographic information systems, in order to locate the areas apparently most at risk of suffering negative social effects and their consequences for internal security. The analysis was carried out at a subnational level and applied to the case of Kenya. A first phase of the work included a careful selection of databases on renewable resources. Spatial operations among these data yielded new information on the availability of renewable resources (cropland, forests, water), on present and projected demographic pressure, and on social and technical ingenuity. The results made it possible to identify areas suffering from scarcity of one or more renewable resources, at different levels of gravity. Accounts from Kenya seem to confirm our results, reporting clashes between tribal groups over access to scarce resources in areas that our work showed to be at high risk.
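The GIS overlay underlying such an analysis can be sketched as per-cell map algebra over scarcity rasters; the toy grids, equal weights, and the 0.6 hotspot threshold below are illustrative assumptions, not values from the study:

```python
# Toy 3x3 scarcity rasters (0 = abundant, 1 = scarce) for three factors;
# the real study derives such layers from national spatial databases.
cropland = [[0.2, 0.8, 0.9], [0.1, 0.5, 0.7], [0.0, 0.3, 0.6]]
water = [[0.3, 0.7, 0.8], [0.2, 0.4, 0.9], [0.1, 0.2, 0.5]]
pressure = [[0.5, 0.9, 0.6], [0.3, 0.6, 0.8], [0.2, 0.4, 0.5]]

def overlay(*layers, weights=None):
    """Weighted per-cell combination of equally sized raster layers."""
    weights = weights or [1.0 / len(layers)] * len(layers)
    rows, cols = len(layers[0]), len(layers[0][0])
    return [[sum(w * layer[r][c] for w, layer in zip(weights, layers))
             for c in range(cols)] for r in range(rows)]

risk = overlay(cropland, water, pressure)
# Cells above a hypothetical composite-scarcity threshold
hotspots = [(r, c) for r in range(3) for c in range(3) if risk[r][c] > 0.6]
assert hotspots == [(0, 1), (0, 2), (1, 2)]
```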

  6. Conference Proceedings for the Thirteenth Annual IFIP Working Group 11.3 Conference on Database Security Held in Seattle, Washington, July 25 - 28, 1999.

    DTIC Science & Technology

    1999-07-28

Inf Med, 35 (1996). 8. J. P. O'Connor, J. W. Gray, C. McCollum, L. Notargiacomo, in Research Directions in Database Security, T. F. Lunt, Ed. ... Therefore, in general we favour refusal over lying. There are several directions for further interesting research; we only mention a few of them ... of the rules of P. Given two nodes p1 and p2, there is a direct edge from p1 to p2 if and only if predicate p2 occurs positively or negatively in the

  7. National Security Personnel System (NSPS): An Analysis of Key Stakeholders’ Perceptions during DoD’s Implementation of NSPS

    DTIC Science & Technology

    2010-06-01

1. KPP 1: High Performing Workplace and Environment; a. Attribute 1. System...source for employee values and actions. The stereotypical value of the federal government employee, especially under the GS system, was job security...most directly met by this model is job security. This job security is often stereotyped by the saying: you cannot fire a government employee

  8. A high performance, ad-hoc, fuzzy query processing system for relational databases

    NASA Technical Reports Server (NTRS)

    Mansfield, William H., Jr.; Fleischman, Robert M.

    1992-01-01

    Database queries involving imprecise or fuzzy predicates are currently an evolving area of academic and industrial research. Such queries place severe stress on the indexing and I/O subsystems of conventional database environments since they involve the search of large numbers of records. The Datacycle architecture and research prototype is a database environment that uses filtering technology to perform an efficient, exhaustive search of an entire database. It has recently been modified to include fuzzy predicates in its query processing. The approach obviates the need for complex index structures, provides unlimited query throughput, permits the use of ad-hoc fuzzy membership functions, and provides a deterministic response time largely independent of query complexity and load. This paper describes the Datacycle prototype implementation of fuzzy queries and some recent performance results.
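An ad-hoc fuzzy membership function applied in an exhaustive scan, the core of the Datacycle approach described above, can be sketched as follows; the trapezoidal predicate and the records are illustrative, not taken from the prototype:

```python
def trapezoid(a, b, c, d):
    """Fuzzy membership: 0 outside (a, d), 1 on [b, c], linear ramps between."""
    def mu(x):
        if x <= a or x >= d:
            return 0.0
        if b <= x <= c:
            return 1.0
        return (x - a) / (b - a) if x < b else (d - x) / (d - c)
    return mu

# Ad-hoc fuzzy predicate defined at query time, as the architecture permits
moderately_priced = trapezoid(20, 40, 80, 120)

# Exhaustive scan: every record is evaluated, so no index structure is
# needed and response time is largely independent of query complexity.
records = [("widget", 15), ("gadget", 55), ("gizmo", 95), ("doodad", 130)]
answers = [(name, moderately_priced(price)) for name, price in records]
answers = sorted((a for a in answers if a[1] > 0), key=lambda t: -t[1])
assert answers == [("gadget", 1.0), ("gizmo", 0.625)]
```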

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    B. Gardiner; L.Graton; J.Longo

Classified removable electronic media (CREM) are tracked in several different ways at the Laboratory. To ensure greater security for CREM, we are creating a single, Laboratory-wide system to track CREM. We are researching technology that can be used to electronically tag and detect CREM, designing a database to track the movement of CREM, and planning to test the system at several locations around the Laboratory. We focus on affixing "smart tags" to items we want to track and installing gates at pedestrian portals to detect the entry or exit of tagged items. By means of an enterprise database, the system will track the entry and exit of tagged items into and from CREM storage vaults, vault-type rooms, access corridors, or boundaries of secure areas, as well as the identity of the person carrying an item. We are considering several options for tracking items that can give greater security, but at greater expense.

  10. Loss-tolerant measurement-device-independent quantum private queries

    NASA Astrophysics Data System (ADS)

    Zhao, Liang-Yuan; Yin, Zhen-Qiang; Chen, Wei; Qian, Yong-Jun; Zhang, Chun-Mei; Guo, Guang-Can; Han, Zheng-Fu

    2017-01-01

    Quantum private queries (QPQ) is an important cryptographic protocol aiming to protect both the user's and the database's privacy when the database is queried privately. Recently, a variety of practical QPQ protocols based on quantum key distribution (QKD) have been proposed. However, for QKD-based QPQ the user's imperfect detectors can be subjected to detector-side-channel attacks launched by a dishonest owner of the database. Here, we present a simple example that shows how the detector-blinding attack can completely compromise the security of QKD-based QPQ. To remove all known and unknown detector side channels, we propose a measurement-device-independent QPQ (MDI-QPQ) protocol with single-photon sources. The security of the proposed protocol has been analyzed under some typical attacks. Moreover, we prove that its security is completely loss independent. The results show that practical QPQ will retain the same degree of privacy as before even with seriously uncharacterized detectors.

  11. Error Rates in Users of Automatic Face Recognition Software

    PubMed Central

    White, David; Dunn, James D.; Schmid, Alexandra C.; Kemp, Richard I.

    2015-01-01

    In recent years, wide deployment of automatic face recognition systems has been accompanied by substantial gains in algorithm performance. However, benchmarking tests designed to evaluate these systems do not account for the errors of human operators, who are often an integral part of face recognition solutions in forensic and security settings. This causes a mismatch between evaluation tests and operational accuracy. We address this by measuring user performance in a face recognition system used to screen passport applications for identity fraud. Experiment 1 measured target detection accuracy in algorithm-generated 'candidate lists' selected from a large database of passport images. Accuracy was notably poorer than in previous studies of unfamiliar face matching: participants made over 50% errors for adult target faces, and over 60% when matching images of children. Experiment 2 then compared performance of student participants to trained passport officers, who use the system in their daily work, and found equivalent performance in these groups. Encouragingly, a group of highly trained and experienced "facial examiners" outperformed these groups by 20 percentage points. We conclude that human performance curtails accuracy of face recognition systems, potentially reducing benchmark estimates by 50% in operational settings. Mere practise does not attenuate these limits, but superior performance of trained examiners suggests that recruitment and selection of human operators, in combination with effective training and mentorship, can improve the operational accuracy of face recognition systems. PMID:26465631

  12. The impact of humanitarian context conditions and individual characteristics on aid worker retention.

    PubMed

    Korff, Valeska P; Balbo, Nicoletta; Mills, Melinda; Heyse, Liesbet; Wittek, Rafael

    2015-07-01

    High employee turnover rates constitute a major challenge to effective aid provision. This study examines how features of humanitarian work and aid workers' individual characteristics affect retention within one humanitarian organisation, Médecins Sans Frontières (MSF) Holland. The study extends existing research by providing new theoretical explanations of employment opportunities and constraints and by engaging in the first large-scale quantitative analysis of aid worker retention. Using a database of field staff (N=1,955), a logistic regression is performed of the likelihood of reenlistment after a first mission. The findings demonstrate that only 40 per cent of employees reenlist for a second mission with MSF Holland, and that workplace location and security situation, age, and gender have no significant effect. Individuals are less likely to reenlist if they returned early from the first mission for a personal reason, are in a relationship, are medical doctors, or if they come from highly developed countries. The paper reflects on the findings in the light of policy. © 2015 The Author(s). Disasters © Overseas Development Institute, 2015.

  13. WebCIS: large scale deployment of a Web-based clinical information system.

    PubMed

    Hripcsak, G; Cimino, J J; Sengupta, S

    1999-01-01

    WebCIS is a Web-based clinical information system. It sits atop the existing Columbia University clinical information system architecture, which includes a clinical repository, the Medical Entities Dictionary, an HL7 interface engine, and an Arden Syntax based clinical event monitor. WebCIS security features include authentication with secure tokens, authorization maintained in an LDAP server, SSL encryption, permanent audit logs, and application timeouts. WebCIS is currently used by 810 physicians at the Columbia-Presbyterian center of New York Presbyterian Healthcare to review and enter data into the electronic medical record. Current deployment challenges include maintaining adequate database performance despite complex queries, replacing large numbers of computers that cannot run modern Web browsers, and training users who have never logged onto the Web. Although the raised expectations and higher goals have increased deployment costs, the end result is a far more functional, far more available system.
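    Two of the listed safeguards, application timeouts and permanent audit logs, can be sketched as follows; the session structure, timeout value, and user name are invented for illustration and say nothing about WebCIS's actual implementation.

```python
# Toy model of an idle-session timeout plus an append-only audit trail.
# All names and the 15-minute limit are hypothetical.
import time

SESSION_TIMEOUT_S = 15 * 60
AUDIT_LOG = []  # append-only record of every access attempt

def touch(session, now=None):
    """Return True if the session is still live, refreshing its clock;
    every outcome is written to the audit log."""
    now = time.time() if now is None else now
    if now - session["last_seen"] > SESSION_TIMEOUT_S:
        AUDIT_LOG.append((now, session["user"], "session expired"))
        return False
    session["last_seen"] = now
    AUDIT_LOG.append((now, session["user"], "access"))
    return True

s = {"user": "dr_smith", "last_seen": 0.0}
alive = touch(s, now=10 * 60)    # 10 minutes idle: still valid
expired = touch(s, now=40 * 60)  # 30 further minutes idle: timed out
```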

  14. An Analysis of China’s Information Technology Strategies and their Implication for US National Security

    DTIC Science & Technology

    2006-06-01

    environment of Web-enabled database searches, online shopping, e-business, and daily credit-card use, which are very common in the United States. Cyberspace...establishing credibility for data exchange such as online shopping. Present regulations stipulate that security chips used by the Chinese government and

  15. 76 FR 12397 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Bureau of the Public Debt (BPD...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-07

    ...; Computer Matching Program (SSA/ Bureau of the Public Debt (BPD))--Match Number 1038 AGENCY: Social Security... as shown above. SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection... containing SSNs extracted from the Supplemental Security Record database. Exchanges for this computer...

  16. Access Control based on Attribute Certificates for Medical Intranet Applications

    PubMed Central

    Georgiadis, Christos; Pangalos, George; Khair, Marie

    2001-01-01

    Background Clinical information systems frequently use intranet and Internet technologies. However these technologies have emphasized sharing and not security, despite the sensitive and private nature of much health information. Digital certificates (electronic documents that identify an entity or its attributes) can be used to control access in clinical intranet applications. Objectives To outline the need for access control in distributed clinical database systems, to describe the use of digital certificates and security policies, and to propose the architecture for a system using digital certificates, cryptography and security policy to control access to clinical intranet applications. Methods We have previously developed a security policy, DIMEDAC (Distributed Medical Database Access Control), which is compatible with emerging public key and privilege management infrastructure. In our implementation approach we propose the use of digital certificates, to be used in conjunction with DIMEDAC. Results Our proposed access control system consists of two phases: how users gain their security credentials; and how these credentials are used to access medical data. Three types of digital certificates are used: identity certificates for authentication; attribute certificates for authorization; and access-rule certificates for propagation of access control policy. Once a user is identified and authenticated, subsequent access decisions are based on a combination of identity and attribute certificates, with access-rule certificates providing the policy framework. Conclusions Access control in clinical intranet applications can be successfully and securely managed through the use of digital certificates and the DIMEDAC security policy. PMID:11720951
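    The two-phase decision can be sketched roughly as below; the dictionaries stand in for real identity and attribute certificates, and the rule table is a hypothetical proxy for DIMEDAC access-rule certificates.

```python
# Phase 1: authenticate via an identity certificate.
# Phase 2: authorize via attribute certificates checked against
# policy rules. All certificate contents here are simplified dicts,
# not real X.509 or attribute certificates.

IDENTITY_CERTS = {"alice": {"subject": "alice", "valid": True}}
ATTRIBUTE_CERTS = {"alice": {"role": "physician", "department": "cardiology"}}
ACCESS_RULES = [  # (role, resource, action) triples granted by policy
    ("physician", "clinical_record", "read"),
    ("physician", "clinical_record", "write"),
    ("nurse", "clinical_record", "read"),
]

def authenticate(user):
    cert = IDENTITY_CERTS.get(user)
    return cert is not None and cert["valid"]

def authorize(user, resource, action):
    if not authenticate(user):             # phase 1: identity certificate
        return False
    attrs = ATTRIBUTE_CERTS.get(user, {})  # phase 2: attributes + rules
    return (attrs.get("role"), resource, action) in ACCESS_RULES
```

    Separating the identity check from the attribute lookup mirrors the paper's design: credentials can be revoked or re-issued in one phase without touching the other.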

  17. Design and implementation of website information disclosure assessment system.

    PubMed

    Cho, Ying-Chiang; Pan, Jen-Yi

    2015-01-01

    Internet application technologies, such as cloud computing and cloud storage, have increasingly changed people's lives. Websites contain vast amounts of personal privacy information. In order to protect this information, network security technologies, such as database protection and data encryption, attract many researchers. The most serious problems concerning web vulnerability are e-mail address and network database leakages. These leakages have many causes. For example, malicious users can steal database contents, taking advantage of mistakes made by programmers and administrators. In order to mitigate this type of abuse, a website information disclosure assessment system is proposed in this study. This system utilizes a series of technologies, such as web crawler algorithms, SQL injection attack detection, and web vulnerability mining, to assess a website's information disclosure. Thirty websites, randomly sampled from the top 50 world colleges, were used to collect leakage information. This testing showed the importance of increasing the security and privacy of website information for academic websites.
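    One assessment step, scanning fetched page text for e-mail address leakage and for database error strings that often betray SQL injection flaws, might look like the following sketch; the patterns are illustrative and far simpler than the paper's crawler and vulnerability-mining pipeline.

```python
# Minimal page-level disclosure check: harvest e-mail addresses and
# look for tell-tale database error messages. Both pattern lists are
# illustrative, not the system's actual signatures.
import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+(?:\.[\w-]+)+")
SQL_ERROR_SIGNS = ["You have an error in your SQL syntax",
                   "ORA-01756", "unterminated quoted string"]

def assess_page(text):
    return {
        "emails": sorted(set(EMAIL_RE.findall(text))),
        "sql_errors": [s for s in SQL_ERROR_SIGNS if s in text],
    }

page = ("Contact admin@example.edu or webmaster@example.edu for help. "
        "Warning: You have an error in your SQL syntax near 'OR 1=1'")
report = assess_page(page)
```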

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, K.; Tsai, H.; Liu, Y. Y.

    Radio frequency identification (RFID) is one of today's most rapidly growing technologies in the automatic data collection industry. Although commercial applications are already widespread, the use of this technology for managing nuclear materials is only in its infancy. Employing an RFID system has the potential to offer an immense payback: enhanced safety and security, reduced need for manned surveillance, real-time access to status and event history data, and overall cost-effectiveness. The Packaging Certification Program (PCP) in the U.S. Department of Energy's (DOE's) Office of Environmental Management (EM), Office of Packaging and Transportation (EM-63), is developing an RFID system for nuclear materials management. The system consists of battery-powered RFID tags with onboard sensors and memories, a reader network, application software, a database server and web pages. The tags monitor and record critical parameters, including the status of seals, movement of objects, and environmental conditions of the nuclear material packages in real time. They also provide instant warnings or alarms when preset thresholds for the sensors are exceeded. The information collected by the readers is transmitted to a dedicated central database server that can be accessed by authorized users across the DOE complex via a secured network. The onboard memory of the tags allows the materials manifest and event history data to reside with the packages throughout their life cycles in storage, transportation, and disposal. Data security is currently based on Advanced Encryption Standard-256. The software provides easy-to-use graphical interfaces that allow access to all vital information once the security and privilege requirements are met. An innovative scheme has been developed for managing batteries in service for more than 10 years without needing to be changed. A miniature onboard dosimeter is being developed for applications that require radiation surveillance. A field demonstration of the RFID system was recently conducted to assess its performance. The preliminary results of the demonstration are reported in this paper.
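    The tag-side threshold check described above can be sketched as follows; the sensor names and limits are invented, since the abstract does not disclose the system's actual data model.

```python
# Hypothetical threshold-alarm check for tag sensor readings:
# compare each value against preset (low, high) limits and emit
# an alarm string for anything out of range.

THRESHOLDS = {"temperature_c": (-20.0, 60.0),
              "humidity_pct": (0.0, 80.0)}

def check_reading(reading):
    """Return a list of alarm strings for any sensor out of range."""
    alarms = []
    for sensor, value in reading.items():
        low, high = THRESHOLDS[sensor]
        if not (low <= value <= high):
            alarms.append(f"{sensor}={value} outside [{low}, {high}]")
    return alarms

alarms = check_reading({"temperature_c": 72.5, "humidity_pct": 40.0})
```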

  19. DoD Identity Matching Engine for Security and Analysis (IMESA) Access to Criminal Justice Information (CJI) and Terrorist Screening Databases (TSDB)

    DTIC Science & Technology

    2016-05-04

    IMESA) Access to Criminal Justice Information (CJI) and Terrorist Screening Databases (TSDB) References: See Enclosure 1 1. PURPOSE. In...CJI database mirror image files. (3) Memorandums of understanding with the FBI CJIS as the data broker for DoD organizations that need access ...not for access determinations. (3) Legal restrictions established by the Sex Offender Registration and Notification Act (SORNA) jurisdictions on

  20. Application Analysis and Decision with Dynamic Analysis

    DTIC Science & Technology

    2014-12-01

    pushes the application file and the JSON file containing the metadata from the database. When the 2 files are in place, the consumer thread starts...human analysts and stores it in a database. It would then use some of these data to generate a risk score for the application. However, static analysis...and store them in the primary A2D database for future analysis. 15. SUBJECT TERMS Android, dynamic analysis 16. SECURITY CLASSIFICATION OF: 17

  1. The Free Trade Area of the Americas: Can Regional Economic Integration Lead to Greater Cooperation on Security?

    DTIC Science & Technology

    2002-12-01

    Brazilian Air Force has been testing a new surveillance system called Sistema de Vigilancia da Amazonia (SIVAM), designed to...2000 Online Database, 23 April 1998 and “Plan de seguridad para la triple frontera,” Ser en el 2000 Online Database, 01 June...Plan de seguridad para la triple frontera,” Ser en el 2000 Online Database, 01 June 1998. 64 Robert Devlin, Antoni Estevadeordal

  2. Mere exposure revisited: the influence of growth versus security cues on evaluations of novel and familiar stimuli.

    PubMed

    Gillebaart, Marleen; Förster, Jens; Rotteveel, Mark

    2012-11-01

    Combining regulatory focus theory (Higgins, 1997) and novelty categorization theory (Förster, Marguc, & Gillebaart, 2010), we predicted that novel stimuli would be more positively evaluated when focused on growth as compared with security and that familiar stimuli would be more negatively evaluated when focused on growth as compared with security. This would occur, at least in part, because of changes in category breadth. We tested effects of several variables linked to growth and security on evaluations of novel and familiar stimuli. Using a subliminal mere exposure paradigm, results showed novel stimuli were evaluated more positively in a promotion focus compared to a prevention focus (Experiments 1A-1C), with high power compared to low power (Experiment 2A), and with the color blue compared to red (Experiment 2B). For familiar stimuli, all effects were reversed. Additionally, as predicted by novelty categorization theory, novel stimuli were liked better after broad compared to narrow category priming, and familiar stimuli were liked better after narrow compared with broad category priming (Experiment 3). We suggest, therefore, that although familiarity glows warmly in security-related contexts, people prefer novelty when they are primarily focused on growth. (PsycINFO Database Record (c) 2012 APA, all rights reserved).

  3. Joint image encryption and compression scheme based on IWT and SPIHT

    NASA Astrophysics Data System (ADS)

    Zhang, Miao; Tong, Xiaojun

    2017-03-01

    A joint lossless image encryption and compression scheme based on integer wavelet transform (IWT) and set partitioning in hierarchical trees (SPIHT) is proposed to achieve lossless image encryption and compression simultaneously. Making use of the properties of IWT and SPIHT, encryption and compression are combined. Moreover, the proposed secure set partitioning in hierarchical trees (SSPIHT), via the addition of encryption in the SPIHT coding process, has no effect on compression performance. A hyper-chaotic system, nonlinear inverse operation, Secure Hash Algorithm-256 (SHA-256), and plaintext-based keystream are all used to enhance the security. The test results indicate that the proposed methods have high security and good lossless compression performance.
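    One ingredient named above, a plaintext-dependent keystream derived via SHA-256 and applied by XOR, can be sketched as follows. This stands in for the paper's hyper-chaotic keystream and does not reproduce the SSPIHT coding; it only illustrates why a plaintext-based keystream makes the cipher sensitive to every input bit.

```python
# Counter-mode-style keystream seeded by SHA-256(key || SHA-256(plaintext)),
# applied by XOR. Illustrative only; not the paper's construction.
import hashlib

def keystream(key: bytes, plaintext: bytes, n: int) -> bytes:
    """Expand a plaintext-dependent seed into n keystream bytes."""
    seed = hashlib.sha256(key + hashlib.sha256(plaintext).digest()).digest()
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(seed + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def xor_bytes(data: bytes, ks: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, ks))

data = b"wavelet coefficients"
key = b"secret"
ks = keystream(key, data, len(data))
cipher = xor_bytes(data, ks)
```

    Note that a real receiver would need the plaintext digest delivered securely in order to regenerate the keystream; that key-management detail is deliberately omitted here.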

  4. Alignment of high-throughput sequencing data inside in-memory databases.

    PubMed

    Firnkorn, Daniel; Knaup-Gregori, Petra; Lorenzo Bermejo, Justo; Ganzinger, Matthias

    2014-01-01

    In times of high-throughput DNA sequencing techniques, high-performance analysis of DNA sequences is of great importance. Computer-supported DNA analysis remains a time-intensive task. In this paper we explore the potential of a new in-memory database technology by using SAP's High Performance Analytic Appliance (HANA). We focus on read alignment as one of the first steps in DNA sequence analysis. In particular, we examined the widely used Burrows-Wheeler Aligner (BWA) and implemented stored procedures in both HANA and the free database system MySQL, to compare execution time and memory management. To ensure that the results are comparable, MySQL was run in memory as well, utilizing its integrated memory engine for database table creation. We implemented stored procedures containing exact and inexact searching of DNA reads within the reference genome GRCh37. Due to technical restrictions in SAP HANA concerning recursion, the inexact matching problem could not be implemented on this platform. Hence, performance analysis between HANA and MySQL was made by comparing the execution time of the exact search procedures. Here, HANA was approximately 27 times faster than MySQL, suggesting high potential in the new in-memory concepts and motivating further development of DNA analysis procedures in the future.
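    The exact-matching step benchmarked above can be written in plain Python rather than as a HANA or MySQL stored procedure; this sketch simply maps each read to every offset at which it occurs verbatim in a reference string.

```python
# Exact read alignment by brute-force substring search.
# The reference and reads are tiny invented examples, not GRCh37 data.

def exact_align(reference: str, reads):
    """Map each read to the list of 0-based offsets of exact matches."""
    hits = {}
    for read in reads:
        positions, start = [], reference.find(read)
        while start != -1:
            positions.append(start)
            start = reference.find(read, start + 1)
        hits[read] = positions
    return hits

ref = "ACGTACGTGGA"
alignments = exact_align(ref, ["ACGT", "GGA", "TTT"])
```

    The point of the benchmark, of course, is that an in-memory column store can parallelize exactly this kind of scan over a genome-scale reference, which a naive loop cannot.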

  5. A Test-Bed of Secure Mobile Cloud Computing for Military Applications

    DTIC Science & Technology

    2016-09-13

    searching databases. This kind of applications is a typical example of mobile cloud computing (MCC). MCC has lots of applications in the military...Release; Distribution Unlimited UU UU UU UU 13-09-2016 1-Aug-2014 31-Jul-2016 Final Report: A Test-bed of Secure Mobile Cloud Computing for Military...Army Research Office P.O. Box 12211 Research Triangle Park, NC 27709-2211 Test-bed, Mobile Cloud Computing , Security, Military Applications REPORT

  6. POLICY VARIATION, LABOR SUPPLY ELASTICITIES, AND A STRUCTURAL MODEL OF RETIREMENT

    PubMed Central

    MANOLI, DAY; MULLEN, KATHLEEN J.; WAGNER, MATHIS

    2015-01-01

    This paper exploits a combination of policy variation from multiple pension reforms in Austria and administrative data from the Austrian Social Security Database. Using the policy changes for identification, we estimate social security wealth and accrual elasticities in individuals’ retirement decisions. Next, we use these elasticities to estimate a dynamic programming model of retirement decisions. Finally, we use the estimated model to examine the labor supply and welfare consequences of potential social security reforms. PMID:26472916

  7. National Computer Security Conference (15th) held in Baltimore, Maryland on October 13-16, 1992. Volume 2: Proceedings

    DTIC Science & Technology

    1992-10-16

    the DNA Fingerprint Laboratory. The Los Angeles Police Department and its former Chief, Daryl Gates for permitting a secret unit, the ...authorized to change information in. Conclusions Where angels fear .... Of all the reasons for compartmentation for which the level of evaluation...database, and a security label attribute is associated with data in each tuple in a relation. The range and distribution of security levels may

  8. Agile Datacube Analytics (not just) for the Earth Sciences

    NASA Astrophysics Data System (ADS)

    Misev, Dimitar; Merticariu, Vlad; Baumann, Peter

    2017-04-01

    Metadata are considered small, smart, and queryable; data, on the other hand, are known as big, clumsy, hard to analyze. Consequently, gridded data - such as images, image timeseries, and climate datacubes - are managed separately from the metadata, and with different, restricted retrieval capabilities. One reason for this silo approach is that databases, while good at tables, XML hierarchies, RDF graphs, etc., traditionally do not support multi-dimensional arrays well. This gap is being closed by Array Databases, which extend the SQL paradigm of "any query, anytime" to NoSQL arrays. They introduce semantically rich modelling combined with declarative, high-level query languages on n-D arrays. On the server side, such queries can be optimized, parallelized, and distributed based on partitioned array storage. This way, they offer new vistas in flexibility, scalability, performance, and data integration. In this respect, the forthcoming ISO SQL extension MDA ("Multi-dimensional Arrays") will be a game changer in Big Data Analytics. We introduce concepts and opportunities through the example of rasdaman ("raster data manager"), which in fact has pioneered the field of Array Databases and forms the blueprint for ISO SQL/MDA and further Big Data standards, such as OGC WCPS for querying spatio-temporal Earth datacubes. With operational installations exceeding 140 TB, queries have been split across more than one thousand cloud nodes, using CPUs as well as GPUs. Installations can easily be mashed up securely, enabling large-scale location-transparent query processing in federations. Federation queries have been demonstrated live at EGU 2016, spanning Europe and Australia in the context of the intercontinental EarthServer initiative, visualized through NASA WorldWind.
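    The core idea, a declarative reduction evaluated server-side over an n-D datacube, can be illustrated with a pure-Python sketch; the cube layout and function name are inventions, not rasdaman's API or WCPS syntax.

```python
# Evaluate a "condense" (reduction) over the time axis of a small
# 3-D datacube, the kind of operation a WCPS-style query would push
# to the server. Cube values are invented.

def average_over_time(cube):
    """cube[t][y][x] -> 2-D grid of per-cell means across the time axis."""
    t = len(cube)
    ny, nx = len(cube[0]), len(cube[0][0])
    return [[sum(cube[k][j][i] for k in range(t)) / t
             for i in range(nx)]
            for j in range(ny)]

cube = [  # 2 time steps of a 2x3 grid
    [[1, 2, 3], [4, 5, 6]],
    [[3, 4, 5], [6, 7, 8]],
]
mean_grid = average_over_time(cube)
```

    In an Array Database the same reduction would be expressed declaratively and executed tile by tile over partitioned storage, which is what makes the optimization, parallelization, and federation described above possible.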

  9. Agile Datacube Analytics (not just) for the Earth Sciences

    NASA Astrophysics Data System (ADS)

    Baumann, P.

    2016-12-01

    Metadata are considered small, smart, and queryable; data, on the other hand, are known as big, clumsy, hard to analyze. Consequently, gridded data - such as images, image timeseries, and climate datacubes - are managed separately from the metadata, and with different, restricted retrieval capabilities. One reason for this silo approach is that databases, while good at tables, XML hierarchies, RDF graphs, etc., traditionally do not support multi-dimensional arrays well. This gap is being closed by Array Databases, which extend the SQL paradigm of "any query, anytime" to NoSQL arrays. They introduce semantically rich modelling combined with declarative, high-level query languages on n-D arrays. On the server side, such queries can be optimized, parallelized, and distributed based on partitioned array storage. This way, they offer new vistas in flexibility, scalability, performance, and data integration. In this respect, the forthcoming ISO SQL extension MDA ("Multi-dimensional Arrays") will be a game changer in Big Data Analytics. We introduce concepts and opportunities through the example of rasdaman ("raster data manager"), which in fact has pioneered the field of Array Databases and forms the blueprint for ISO SQL/MDA and further Big Data standards, such as OGC WCPS for querying spatio-temporal Earth datacubes. With operational installations exceeding 140 TB, queries have been split across more than one thousand cloud nodes, using CPUs as well as GPUs. Installations can easily be mashed up securely, enabling large-scale location-transparent query processing in federations. Federation queries have been demonstrated live at EGU 2016, spanning Europe and Australia in the context of the intercontinental EarthServer initiative, visualized through NASA WorldWind.

  10. Use of medical information by computer networks raises major concerns about privacy.

    PubMed Central

    OReilly, M

    1995-01-01

    The development of computer databases and long-distance computer networks is leading to improvements in Canada's health care system. However, these developments come at a cost and require a balancing act between access and confidentiality. Columnist Michael OReilly, who in this article explores the security of computer networks, notes that respect for patients' privacy must be given as high a priority as the ability to see their records in the first place. PMID:7600474

  11. An investigation of fake fingerprint detection approaches

    NASA Astrophysics Data System (ADS)

    Ahmad, Asraful Syifaa'; Hassan, Rohayanti; Othman, Razib M.

    2017-10-01

    Fingerprint recognition, the most reliable biometrics technology, is widely used for security due to its permanence and uniqueness. However, it is also vulnerable to certain types of attacks, including the presentation of fake fingerprints to the sensor, which requires the development of new and efficient protection measures. The aim here is to identify the most recent literature related to fake fingerprint recognition, focusing only on software-based approaches. A systematic review is performed by analyzing 146 primary studies from the gross collection of 34 research papers to determine the taxonomy, approaches, online public databases, and limitations of fake fingerprints. Fourteen software-based approaches are briefly described, four limitations of fake fingerprint images are identified, and two known fake fingerprint databases are addressed. This work therefore provides an overview of the current understanding of fake fingerprint recognition and identifies future research possibilities.

  12. 8 CFR 338.12 - Endorsement by clerk of court in case name is changed.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 8 Aliens and Nationality 1 2010-01-01 2010-01-01 false Endorsement by clerk of court in case name is changed. 338.12 Section 338.12 Aliens and Nationality DEPARTMENT OF HOMELAND SECURITY NATIONALITY... database for naturalization recordkeeping, the name change information will be maintained in that database...

  13. 45 CFR 30.13 - Debt reporting and use of credit reporting agencies.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... agencies. 30.13 Section 30.13 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION... over $100 to credit bureaus or other automated databases. Debts arising under the Social Security Act..., any subsequent reporting to or updating of a credit bureau or other automated database may be handled...

  14. 45 CFR 30.13 - Debt reporting and use of credit reporting agencies.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... agencies. 30.13 Section 30.13 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION... over $100 to credit bureaus or other automated databases. Debts arising under the Social Security Act..., any subsequent reporting to or updating of a credit bureau or other automated database may be handled...

  15. 45 CFR 30.13 - Debt reporting and use of credit reporting agencies.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... agencies. 30.13 Section 30.13 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION... over $100 to credit bureaus or other automated databases. Debts arising under the Social Security Act..., any subsequent reporting to or updating of a credit bureau or other automated database may be handled...

  16. 45 CFR 30.13 - Debt reporting and use of credit reporting agencies.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... agencies. 30.13 Section 30.13 Public Welfare Department of Health and Human Services GENERAL ADMINISTRATION... over $100 to credit bureaus or other automated databases. Debts arising under the Social Security Act..., any subsequent reporting to or updating of a credit bureau or other automated database may be handled...

  17. A Complex Systems Approach to More Resilient Multi-Layered Security Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Nathanael J. K.; Jones, Katherine A.; Bandlow, Alisa

    In July 2012, protestors cut through security fences and gained access to the Y-12 National Security Complex. This was believed to be a highly reliable, multi-layered security system. This report documents the results of a Laboratory Directed Research and Development (LDRD) project that created a consistent, robust mathematical framework using complex systems analysis algorithms and techniques to better understand the emergent behavior, vulnerabilities and resiliency of multi-layered security systems subject to budget constraints and competing security priorities. Because there are several dimensions to security system performance and a range of attacks that might occur, the framework is multi-objective, allowing a performance frontier to be estimated. This research explicitly uses the probability of intruder interruption given detection (P_I) as the primary resilience metric. We demonstrate the utility of this framework with both notional and real-world examples of Physical Protection Systems (PPSs) and validate it using a well-established force-on-force simulation tool, Umbra.
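    The resilience metric can be illustrated with a Monte Carlo sketch estimating P_I, the probability that an intruder is interrupted given detection, for a notional two-layer system; the layer probabilities are invented, and the report's actual model and Umbra validation are far richer.

```python
# Monte Carlo estimate of P_I for a notional multi-layer system.
# Each layer is (p_detect, p_interrupt_given_detect); an intrusion is
# interrupted if some layer both detects and responds in time.
import random

def estimate_p_i(layers, trials=100_000, seed=42):
    rng = random.Random(seed)
    detected = interrupted = 0
    for _ in range(trials):
        run_detected = run_interrupted = False
        for p_d, p_i in layers:
            if rng.random() < p_d:
                run_detected = True
                if rng.random() < p_i:
                    run_interrupted = True
                    break  # intruder stopped; no further layers reached
        detected += run_detected
        interrupted += run_interrupted
    return interrupted / detected if detected else 0.0

layers = [(0.6, 0.5), (0.8, 0.7)]  # notional fence line, inner portal
p_i = estimate_p_i(layers)
```

    Sweeping the layer parameters under a budget constraint would trace out the kind of performance frontier the report describes.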

  18. Securing services in the cloud: an investigation of the threats and the mitigations

    NASA Astrophysics Data System (ADS)

    Farroha, Bassam S.; Farroha, Deborah L.

    2012-05-01

    The stakeholders' security concerns over data in the clouds (voice, video, and text) are real for DoD, the IC, and the private sector. This is primarily due to the lack of physical isolation of data when migrating to shared infrastructure platforms. The security concerns relate to the privacy and regulatory compliance required in many industries (healthcare, financial, law enforcement, DoD, etc.) and to corporate knowledge databases. The new paradigm depends on the service provider to ensure that the customer's information is continuously monitored and is kept available, secure, access controlled, and isolated from potential adversaries.

  19. OS friendly microprocessor architecture: Hardware level computer security

    NASA Astrophysics Data System (ADS)

    Jungwirth, Patrick; La Fratta, Patrick

    2016-05-01

    We present an introduction to the patented OS Friendly Microprocessor Architecture (OSFA) and hardware level computer security. Conventional microprocessors have not tried to balance hardware performance and OS performance at the same time. Conventional microprocessors have depended on the Operating System for computer security and information assurance. The goal of the OS Friendly Architecture is to provide a high performance and secure microprocessor and OS system. We are interested in cyber security, information technology (IT), and SCADA control professionals reviewing the hardware level security features. The OS Friendly Architecture is a switched set of cache memory banks in a pipeline configuration. For light-weight threads, the memory pipeline configuration provides near instantaneous context switching times. The pipelining and parallelism provided by the cache memory pipeline provides for background cache read and write operations while the microprocessor's execution pipeline is running instructions. The cache bank selection controllers provide arbitration to prevent the memory pipeline and microprocessor's execution pipeline from accessing the same cache bank at the same time. This separation allows the cache memory pages to transfer to and from level 1 (L1) caching while the microprocessor pipeline is executing instructions. Computer security operations are implemented in hardware. By extending Unix file permissions bits to each cache memory bank and memory address, the OSFA provides hardware level computer security.
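    The permission-bit idea can be modeled in a few lines of Python; the bank IDs and permission table are invented for illustration, and the OSFA enforces the check in hardware rather than software.

```python
# Toy model of Unix-style read/write/execute bits attached to cache
# memory banks, consulted before an access is allowed. Bank layout
# and permissions are hypothetical.

R, W, X = 0b100, 0b010, 0b001

BANK_PERMS = {0: R | X,   # code bank: read + execute, no write
              1: R | W,   # data bank: read + write, no execute
              2: R}       # constants bank: read-only

def access_allowed(bank, requested):
    """Grant access only if every requested bit is set for the bank."""
    return (BANK_PERMS[bank] & requested) == requested
```

    A write attempt against the code bank fails this check, which is the software analogue of the hardware-level protection the abstract describes (e.g., blocking code injection into an executable region).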

  20. Passive and Active Monitoring on a High Performance Research Network.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthews, Warren

    2001-05-01

    The bold network challenges described in "Internet End-to-end Performance Monitoring for the High Energy and Nuclear Physics Community", presented at PAM 2000, have been tackled by the intrepid administrators and engineers providing the network services. After less than a year, the BaBar collaboration has collected almost 100 million particle collision events in a database approaching 165 TB (tera = 10^12). Around 20 TB has been exported via the Internet to the BaBar regional center at IN2P3 in Lyon, France, for processing, and around 40 TB of simulated events have been imported to SLAC from Lawrence Livermore National Laboratory (LLNL). An unforeseen challenge has arisen due to recent events and heightened security concerns at DoE-funded labs. New rules and regulations suggest it is only a matter of time before many active performance measurements may no longer be possible between many sites. Yet, at the same time, the importance of understanding every aspect of the network and eradicating packet loss for high-throughput data transfers has become apparent. Work at SLAC to employ passive monitoring using netflow and OC3MON is underway, and techniques to supplement and possibly replace the active measurements are being considered. This paper will detail the special needs and traffic characterization of a remarkable research project, and how the networking hurdles have been resolved (or not!) to achieve the required high data throughput. Results from active and passive measurements will be compared, and methods for achieving high throughput and their effect on the network will be assessed, along with tools that directly measure throughput and applications used to actually transfer data.

  1. How Homeland Security Affects Spatial Information

    ERIC Educational Resources Information Center

    Zellmer, Linda

    2004-01-01

    A recent article in Security-Focus reported that several U.S. government buildings in Washington, DC could no longer be clearly seen by people using MapQuest's aerial photo database. In addition, the photos of these buildings were altered, at the request of the U.S. Secret Service, at the Web sites where they are posted. This is an…

  2. [A security protocol for the exchange of personal medical data via Internet: monitoring treatment and drug effects].

    PubMed

    Viviani, R; Fischer, J; Spitzer, M; Freudenmann, R W

    2004-04-01

    We present a security protocol for the exchange of medical data via the Internet, based on the type/domain model. We discuss two applications of the protocol: in a system for the exchange of data for quality assurance, and in an on-line database of adverse reactions to drug use. We state that a type/domain security protocol can successfully comply with the complex requirements for data privacy and accessibility typical of such applications.

  3. Vacuum-assisted closure device as a split-thickness skin graft bolster in the burn population.

    PubMed

    Waltzman, Joshua T; Bell, Derek E

    2014-01-01

    The vacuum-assisted closure device (VAC) is associated with improved wound healing outcomes. Its use as a bolster device to secure a split-thickness skin graft has been previously demonstrated; however, there is little published evidence demonstrating its benefits specifically in the burn population. With use of the VAC becoming more commonplace, its effect on skin graft take and overall time to healing in burn patients deserves further investigation. A retrospective review of the burn registry database at a high-volume level I trauma center and regional burn center during a 16-month period was performed. Patients who had a third-degree burn injury requiring a split-thickness skin graft and who received a VAC bolster were included. Data points included age, sex, burn mechanism, burn location, grafted area in square centimeters, need for repeat grafting, percent graft take, and time to complete reepithelialization. Sixty-seven patients were included in the study, with a total of 88 skin graft sites secured with a VAC. Age ranged from <1 year to 84 years (average 41 years). The average grafted area was 367 ± 545 cm². The three most common graft sites were the leg, thigh, and arm (28%, 15%, and 12%, respectively). Average percent graft take was 99.5 ± 1.5%. Notably, no patients returned to the operating room for repeat grafting. The average time to complete reepithelialization was 16 ± 7 days. The VAC is a highly reliable and reproducible method to bolster a split-thickness skin graft in the burn population. The observed rate of zero returns to the operating room for repeat grafting was especially encouraging. Its ability to conform to contours of the body and cover large surface areas makes it especially useful in securing a graft. This method of bolstering results in less repeat grafting and minimal graft loss, thus decreasing morbidity compared with conventional bolster dressings.

  4. Privacy preserving protocol for detecting genetic relatives using rare variants.

    PubMed

    Hormozdiari, Farhad; Joo, Jong Wha J; Wadia, Akshay; Guan, Feng; Ostrosky, Rafail; Sahai, Amit; Eskin, Eleazar

    2014-06-15

    High-throughput sequencing technologies have impacted many areas of genetic research. One such area is the identification of relatives from genetic data. The standard approach to identifying genetic relatives collects the genomic data of all individuals and stores it in a database. Each pair of individuals is then compared to detect the set of genetic relatives, and the matched individuals are informed. The main drawback of this approach is that performing the relatedness test requires sharing one's genetic data with a trusted third party. In this work, we propose a secure protocol to detect genetic relatives from sequencing data without exposing any information about the participants' genomes. We assume that individuals have access to their genome sequences but do not want to share their genomes with anyone else. Unlike previous approaches, our approach uses both common and rare variants, which provides the ability to detect much more distant relationships securely. Using simulated data generated from the 1000 Genomes data, we illustrate that we can easily detect up to fifth-degree cousins, which was not possible using existing methods. We also show that our method detects the individuals with cryptic relationships in the 1000 Genomes data. The software is freely available for download at http://genetics.cs.ucla.edu/crypto/. © The Author 2014. Published by Oxford University Press.
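    The intuition behind the paper is that shared rare variants are a strong relatedness signal. The toy sketch below illustrates only that intuition, with hashing of variant identifiers standing in, very loosely, for the paper's cryptographic protections; it is not the authors' secure protocol, and all variant IDs are invented.

```python
# Toy illustration: counting shared (salted-hash) rare variant IDs as a
# relatedness signal. NOT a secure protocol; raw IDs are merely hashed.
import hashlib

def hashed_variants(variants, salt=b"shared-session-salt"):
    # Both parties hash their variant identifiers with an agreed salt,
    # so raw genotypes are never exchanged in this toy model.
    return {hashlib.sha256(salt + v.encode()).hexdigest() for v in variants}

def shared_rare_count(a, b):
    return len(hashed_variants(a) & hashed_variants(b))

alice = {"chr1:12345:A", "chr2:99:T", "chr7:555:G"}
bob   = {"chr1:12345:A", "chr2:99:T", "chr9:1:C"}
carol = {"chr3:42:G"}

assert shared_rare_count(alice, bob) == 2    # several shared rare alleles
assert shared_rare_count(alice, carol) == 0  # likely unrelated
```

A real deployment needs the paper's cryptographic machinery: a dishonest party could otherwise brute-force the small space of variant identifiers.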

  5. Video capture of clinical care to enhance patient safety

    PubMed Central

    Weinger, M; Gonzales, D; Slagle, J; Syeed, M

    2004-01-01

    

 Experience from other domains suggests that videotaping and analyzing actual clinical care can provide valuable insights for enhancing patient safety through improvements in the process of care. Methods are described for the videotaping and analysis of clinical care using a high quality portable multi-angle digital video system that enables simultaneous capture of vital signs and time code synchronization of all data streams. An observer can conduct clinician performance assessment (such as workload measurements or behavioral task analysis) either in real time (during videotaping) or while viewing previously recorded videotapes. Supplemental data are synchronized with the video record and stored electronically in a hierarchical database. The video records are transferred to DVD, resulting in a small, cheap, and accessible archive. A number of technical and logistical issues are discussed, including consent of patients and clinicians, maintaining subject privacy and confidentiality, and data security. Using anesthesiology as a test environment, over 270 clinical cases (872 hours) have been successfully videotaped and processed using the system. PMID:15069222

  6. Semantically enabled and statistically supported biological hypothesis testing with tissue microarray databases

    PubMed Central

    2011-01-01

    Background Although many biological databases are applying semantic web technologies, meaningful biological hypothesis testing cannot be easily achieved. Database-driven high-throughput genomic hypothesis testing requires both the capability to obtain semantically relevant experimental data and the capability to perform relevant statistical tests on the retrieved data. Tissue Microarray (TMA) data are semantically rich and contain many biologically important hypotheses waiting for high-throughput conclusions. Methods An application-specific ontology was developed for managing TMA and DNA microarray databases with semantic web technologies. Data were represented as Resource Description Framework (RDF) according to the framework of the ontology. Applications for hypothesis testing (Xperanto-RDF) for TMA data were designed and implemented by (1) formulating the syntactic and semantic structures of the hypotheses derived from TMA experiments, (2) formulating SPARQL queries to reflect the semantic structures of the hypotheses, and (3) performing statistical tests on the result sets returned by the SPARQL queries. Results When a user designs a hypothesis in Xperanto-RDF and submits it, the hypothesis can be tested against TMA experimental data stored in Xperanto-RDF. When we evaluated four previously validated hypotheses as an illustration, all were supported by Xperanto-RDF. Conclusions We demonstrated the utility of high-throughput biological hypothesis testing. We believe that such preliminary investigation can be beneficial before performing highly controlled experiments. PMID:21342584
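    The retrieval step the abstract describes is pattern matching over RDF-style (subject, predicate, object) triples. The toy matcher below stands in for the SPARQL queries used by Xperanto-RDF; the sample predicate and marker names are invented for illustration.

```python
# Toy sketch: TMA observations as (subject, predicate, object) triples,
# queried with a wildcard pattern matcher in the spirit of SPARQL.

triples = [
    ("sample1", "hasMarker", "HER2"),
    ("sample1", "stainingScore", 3),
    ("sample2", "hasMarker", "HER2"),
    ("sample2", "stainingScore", 0),
]

def match(pattern, data=triples):
    """Return triples matching a pattern; None acts as a wildcard."""
    return [t for t in data
            if all(p is None or p == v for p, v in zip(pattern, t))]

# All staining scores, analogous to a SPARQL SELECT over one predicate:
scores = [o for (_, _, o) in match((None, "stainingScore", None))]
assert scores == [3, 0]
```

The statistical-testing step of the pipeline would then run on such result sets.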

  7. Improvements to the Ionizing Radiation Risk Assessment Program for NASA Astronauts

    NASA Technical Reports Server (NTRS)

    Semones, E. J.; Bahadori, A. A.; Picco, C. E.; Shavers, M. R.; Flores-McLaughlin, J.

    2011-01-01

    To perform dosimetry and risk assessment, NASA collects astronaut ionizing radiation exposure data from space flight, medical imaging and therapy, aviation training activities, and prior occupational exposure histories. Career risk of exposure-induced death (REID) from radiation is limited to 3 percent at a 95 percent confidence level. The Radiation Health Office at Johnson Space Center (JSC) is implementing a program to integrate the gathering, storage, analysis, and reporting of astronaut ionizing radiation dose and risk data and records. This work has several motivations, including more efficient analyses and greater flexibility in testing and adopting new methods for evaluating risks. The foundation for these improvements is a set of software tools called the Astronaut Radiation Exposure Analysis System (AREAS). AREAS is a series of MATLAB(Registered TradeMark)-based dose and risk analysis modules that interface with an enterprise-level SQL Server database by means of a secure web service. It communicates with other JSC medical and space weather databases to maintain data integrity and consistency across systems. AREAS is part of a larger NASA Space Medicine effort, the Mission Medical Integration Strategy, with the goal of collecting accurate, high-quality, and detailed astronaut health data and presenting it securely, promptly, and reliably to medical support personnel. The modular approach to the AREAS design accommodates past, current, and future sources of data from active and passive detectors, space radiation transport algorithms, computational phantoms, and cancer risk models. Revisions of the cancer risk model, new radiation detection equipment, and improved anthropomorphic computational phantoms can be incorporated.
Notable hardware updates include the Radiation Environment Monitor (which uses Medipix technology to report real-time, on-board dosimetry measurements), an updated Tissue-Equivalent Proportional Counter, and the Southwest Research Institute Radiation Assessment Detector. Also, the University of Florida hybrid phantoms, which are flexible in morphometry and positioning, are being explored as alternatives to the current NASA computational phantoms.

  8. The exploration of the exhibition informatization

    NASA Astrophysics Data System (ADS)

    Zhang, Jiankang

    2017-06-01

    The construction and management of exhibition informatization is the main task, and the main bottleneck, in the transformation and upgrading of the Chinese exhibition industry. Three key points are expected to yield a breakthrough: adopting service outsourcing to build and maintain the database, adopting advanced chest card technology to collect various kinds of information, and applying statistical analysis to maintain good customer relations. Success mainly calls for mature suppliers who can provide database construction and maintenance, proven technology, an awareness of data security, advanced chest card technology, data mining and analysis capabilities, and the ability to improve exhibition services based on the commercial information obtained from data analysis. Several data security measures should be applied during system development, covering terminal, network, media, storage, and application data security. The informatization process is built around the chest card design; at present there are several types of chest card technology: bar code, two-dimensional code, magnetic stripe, and smart chip. The information obtained from exhibition data helps organizers formulate service strategies, quantify accumulated customer indexes, and improve customer satisfaction and loyalty; it can also support additional services such as business trips and VIP ceremonial receptions.

  9. An Investigation of Multidimensional Voice Program Parameters in Three Different Databases for Voice Pathology Detection and Classification.

    PubMed

    Al-Nasheri, Ahmed; Muhammad, Ghulam; Alsulaiman, Mansour; Ali, Zulfiqar; Mesallam, Tamer A; Farahat, Mohamed; Malki, Khalid H; Bencherif, Mohamed A

    2017-01-01

    Automatic voice-pathology detection and classification systems may help clinicians to detect the existence of any voice pathologies and the type of pathology from which patients suffer in the early stages. The main aim of this paper is to investigate Multidimensional Voice Program (MDVP) parameters to automatically detect and classify the voice pathologies in multiple databases, and then to find out which parameters performed well in these two processes. Samples of the sustained vowel /a/ of normal and pathological voices were extracted from three different databases, which have three voice pathologies in common. The selected databases in this study represent three distinct languages: (1) the Arabic voice pathology database; (2) the Massachusetts Eye and Ear Infirmary database (English database); and (3) the Saarbruecken Voice Database (German database). A computerized speech lab program was used to extract MDVP parameters as features, and an acoustical analysis was performed. The Fisher discrimination ratio was applied to rank the parameters. A t test was performed to highlight any significant differences in the means of the normal and pathological samples. The experimental results demonstrate a clear difference in the performance of the MDVP parameters using these databases. The highly ranked parameters also differed from one database to another. The best accuracies were obtained by using the three highest ranked MDVP parameters arranged according to the Fisher discrimination ratio: these accuracies were 99.68%, 88.21%, and 72.53% for the Saarbruecken Voice Database, the Massachusetts Eye and Ear Infirmary database, and the Arabic voice pathology database, respectively. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
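    The abstract ranks MDVP parameters by the Fisher discrimination ratio. A common two-class form of that ratio is (m1 - m2)^2 / (s1^2 + s2^2); the exact variant used by the authors may differ, so treat this as a standard-textbook sketch rather than their implementation, and the sample values below as invented.

```python
# Fisher discrimination ratio (two-class form): a parameter whose class
# means are far apart relative to the within-class variances scores high.
from statistics import mean, pvariance

def fisher_ratio(normal, pathological):
    m1, m2 = mean(normal), mean(pathological)
    v1, v2 = pvariance(normal), pvariance(pathological)
    return (m1 - m2) ** 2 / (v1 + v2)

# A parameter that separates the groups well ranks higher:
jitter_normal  = [0.2, 0.3, 0.25, 0.22]   # hypothetical values
jitter_path    = [1.1, 1.3, 1.2, 1.15]
shimmer_normal = [1.0, 1.4, 0.9, 1.2]
shimmer_path   = [1.1, 1.5, 1.0, 1.3]

assert fisher_ratio(jitter_normal, jitter_path) > \
       fisher_ratio(shimmer_normal, shimmer_path)
```

Ranking all MDVP parameters by this score and keeping the top few mirrors the feature-selection step described in the abstract.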

  10. On the Privacy Protection of Biometric Traits: Palmprint, Face, and Signature

    NASA Astrophysics Data System (ADS)

    Panigrahy, Saroj Kumar; Jena, Debasish; Korra, Sathya Babu; Jena, Sanjay Kumar

    Biometrics are expected to add a new level of security to applications, as a person attempting access must prove who he or she really is by presenting a biometric to the system. Recent developments in the biometrics area have led to smaller, faster, and cheaper systems, which in turn has increased the number of possible application areas for biometric identity verification. Biometric data, being derived from human bodies (and especially when used to identify or verify those bodies), are considered personally identifiable information (PII). The collection, use, and disclosure of biometric data (image or template) invoke rights on the part of an individual and obligations on the part of an organization. As biometric uses and databases grow, so do concerns that the personal data collected will not be used in reasonable and accountable ways. Privacy concerns arise when biometric data are used for secondary purposes, invoking function creep, data matching, aggregation, surveillance, and profiling. Biometric data transmitted across networks and stored in various databases can also be stolen, copied, or otherwise misused in ways that materially affect the individual involved. As biometric systems are vulnerable to replay, database, and brute-force attacks, such potential attacks must be analysed before the systems are massively deployed in security applications. Along with security, user privacy is an important factor: the line structures of palmprints contain personal characteristics, a person can be recognised from face images, and fake signatures can be practised by carefully studying the signature images available in the database. We propose a cryptographic approach that encrypts the images of palmprints, faces, and signatures with an advanced Hill cipher technique, hiding the information in the images and protecting them from the attacks mentioned above. During feature extraction, the encrypted images are first decrypted; the features are then extracted and used for identification or verification.
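    For readers unfamiliar with the underlying primitive, here is a minimal classical 2x2 Hill cipher over byte values (mod 256), applied to pixel pairs. This is a toy illustration of matrix-based image encryption; the paper's "advanced Hill cipher technique" adds refinements not reproduced here, and the key matrix is an arbitrary invertible example.

```python
# Minimal classical Hill cipher on pixel pairs, working mod 256.
# The key matrix must be invertible mod 256 (odd determinant).

KEY     = [[1, 2], [3, 5]]       # det = -1 ≡ 255 (odd), so invertible mod 256
KEY_INV = [[251, 2], [3, 255]]   # KEY @ KEY_INV ≡ identity (mod 256)

def _mul(m, vec):
    return [(m[0][0] * vec[0] + m[0][1] * vec[1]) % 256,
            (m[1][0] * vec[0] + m[1][1] * vec[1]) % 256]

def hill(data, matrix):
    """Apply the matrix to successive pixel pairs (data length must be even)."""
    out = []
    for i in range(0, len(data), 2):
        out.extend(_mul(matrix, data[i:i + 2]))
    return out

pixels = [12, 200, 34, 0, 255, 7]          # a tiny "image" row
cipher = hill(pixels, KEY)
assert cipher != pixels                    # pixels are scrambled
assert hill(cipher, KEY_INV) == pixels     # decryption before feature extraction
```

The round trip above mirrors the workflow in the abstract: images are stored encrypted and decrypted only transiently for feature extraction.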

  11. Development and characterization of a 3D high-resolution terrain database

    NASA Astrophysics Data System (ADS)

    Wilkosz, Aaron; Williams, Bryan L.; Motz, Steve

    2000-07-01

    A top-level description of methods used to generate elements of a high resolution 3D characterization database is presented. The database elements are defined as ground plane elevation map, vegetation height elevation map, material classification map, discrete man-made object map, and temperature radiance map. The paper will cover data collection by means of aerial photography, techniques of soft photogrammetry used to derive the elevation data, and the methodology followed to generate the material classification map. The discussion will feature the development of the database elements covering Fort Greely, Alaska. The developed databases are used by the US Army Aviation and Missile Command to evaluate the performance of various missile systems.

  12. The Application of Security Concepts to the Personnel Database for the Indonesian Navy.

    DTIC Science & Technology

    1983-09-01

    Postgraduate School, Monterey, California, June 1982. Since 1977, the Indonesian Navy Data Center (DISPULAHTAL) has collected and processed personnel data… [The remainder of the OCR excerpt is garbled; it concerns personnel data processing in the Indonesian Navy, the present personnel database system, and concurrent multi-user, multi-level processing with security levels of secret, classified, and unclassified.]

  13. NNDC Stand: Activities and Services of the National Nuclear Data Center

    NASA Astrophysics Data System (ADS)

    Pritychenko, B.; Arcilla, R.; Burrows, T. W.; Dunford, C. L.; Herman, M. W.; McLane, V.; Obložinský, P.; Sonzogni, A. A.; Tuli, J. K.; Winchell, D. F.

    2005-05-01

    The National Nuclear Data Center (NNDC) collects, evaluates, and disseminates nuclear physics data for basic nuclear research and applied nuclear technologies, including energy, shielding, medicine, and homeland security. In 2004, to answer the needs of the nuclear data user community, the NNDC completed a project to modernize the data storage and management of its databases and began offering new nuclear data Web services. The principles of database and Web application development, as well as the related nuclear reaction and structure database services, are briefly described.

  14. Reliability Information Analysis Center 1st Quarter 2007, Technical Area Task (TAT) Report

    DTIC Science & Technology

    2007-02-05

    Created a new SQL Server database for the "PC Configuration" web application; added roles for security and posted the application to production. Wrote and ran SQL Server scripts to migrate production databases to the new server, and created backup jobs for the new SQL Server databases. Continued the second phase of the TENA demo: extensive tasking was established and assigned, and a TENA interface to EW Server was reaffirmed after some uncertainty…

  15. Radioactivity and Environmental Security in the Oceans: New Research and Policy Priorities in the Arctic and North Atlantic

    DTIC Science & Technology

    1993-06-09

    …within the framework of an update to the computer database "DiaNIK", which has been developed at the Vernadsky Institute of Geochemistry and Analytical Chemistry and contains chemical thermodynamic data for minerals and mineral-forming substances. The structure of the thermodynamic database "DiaNIK" is based on the principles… A substantial portion of the thermodynamic values recommended by "DiaNIK" experts for the substances in User Version 3.1 resulted from…

  16. Development of an Intelligent Monitoring System for Geological Carbon Sequestration (GCS) Systems

    NASA Astrophysics Data System (ADS)

    Sun, A. Y.; Jeong, H.; Xu, W.; Hovorka, S. D.; Zhu, T.; Templeton, T.; Arctur, D. K.

    2016-12-01

    To provide stakeholders timely evidence that GCS repositories are operating safely and efficiently requires integrated monitoring to assess the performance of the storage reservoir as the CO2 plume moves within it. GCS projects can be data intensive as a result of the proliferation of digital instrumentation and smart-sensing technologies. They are also resource intensive, often requiring multidisciplinary teams to perform different monitoring, verification, and accounting (MVA) tasks throughout the lifecycle of a project to ensure secure containment of injected CO2. How can an anomaly detected by one sensor be correlated with events observed by other devices to verify leakage incidents? How can resources be optimally allocated for task-oriented monitoring if reservoir integrity is in question? These issues warrant further investigation before real integration can take place. In this work, we are building a web-based data integration, assimilation, and learning framework for geologic carbon sequestration projects (DIAL-GCS). DIAL-GCS will be an intelligent monitoring system (IMS) for automating GCS closed-loop management by leveraging recent developments in high-throughput database, complex event processing, data assimilation, and machine learning technologies. Results will be demonstrated using realistic data and models derived from a GCS site.
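    The cross-sensor correlation question posed in the abstract can be sketched as a windowed-join over anomaly streams: flag a candidate leakage event only when anomalies from different sensors fall within a short time window. The sensor names and window length below are invented for illustration; DIAL-GCS itself uses complex event processing technology.

```python
# Toy sketch: corroborate an anomaly only when a *different* sensor also
# reports an anomaly within `window` seconds.

def correlate(anomalies, window=60.0):
    """anomalies: list of (timestamp_seconds, sensor_id).
    Returns pairs of anomalies from different sensors within `window`."""
    events = sorted(anomalies)
    hits = []
    for i, (t1, s1) in enumerate(events):
        for t2, s2 in events[i + 1:]:
            if t2 - t1 > window:
                break            # sorted input: nothing later can be closer
            if s1 != s2:
                hits.append(((t1, s1), (t2, s2)))
    return hits

readings = [(100.0, "pressure_pz1"), (130.0, "co2_soil_s3"), (900.0, "pressure_pz1")]
assert len(correlate(readings)) == 1   # only the first two corroborate each other
```

A production system would add per-sensor thresholds, deduplication, and persistence, but the windowed pairing above is the core of the correlation step.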

  17. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.

    2009-10-22

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and, potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) develop a preliminary design for a prototype NDA system, (2) refine PNNL's MCNP models of the NDA system, and (3) procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  18. Examining database persistence of ISO/EN 13606 standardized electronic health record extracts: relational vs. NoSQL approaches.

    PubMed

    Sánchez-de-Madariaga, Ricardo; Muñoz, Adolfo; Lozano-Rubí, Raimundo; Serrano-Balazote, Pablo; Castro, Antonio L; Moreno, Oscar; Pascual, Mario

    2017-08-18

    The objective of this research is to compare relational and non-relational (NoSQL) database systems for storing, recovering, querying, and persisting standardized medical information in the form of ISO/EN 13606 normalized Electronic Health Record XML extracts, both in isolation and concurrently. NoSQL database systems have recently attracted much attention, but few studies in the literature address their direct comparison with relational databases when applied to build the persistence layer of a standardized medical information system. One relational and two NoSQL databases (one document-based and one native XML database) of three different sizes were created in order to evaluate and compare the response times (algorithmic complexity) of six queries of growing complexity, which were performed on them. Similar appropriate results available in the literature have also been considered. Both relational and non-relational NoSQL database systems show almost linear query execution times, but with very different linear slopes, the former being much steeper than the latter two. Document-based NoSQL databases perform better in concurrency than in isolation, and also better than relational databases in concurrency. Non-relational NoSQL databases seem more appropriate than standard relational SQL databases when database size is extremely high (secondary use, research applications). Document-based NoSQL databases generally perform better than native XML NoSQL databases. EHR extract visualization and editing are also document-based tasks better suited to NoSQL database systems. However, the appropriate database solution depends greatly on each particular situation and specific problem.

  19. Virtualization of open-source secure web services to support data exchange in a pediatric critical care research network.

    PubMed

    Frey, Lewis J; Sward, Katherine A; Newth, Christopher J L; Khemani, Robinder G; Cryer, Martin E; Thelen, Julie L; Enriquez, Rene; Shaoyu, Su; Pollack, Murray M; Harrison, Rick E; Meert, Kathleen L; Berg, Robert A; Wessel, David L; Shanley, Thomas P; Dalton, Heidi; Carcillo, Joseph; Jenkins, Tammara L; Dean, J Michael

    2015-11-01

    To examine the feasibility of deploying a virtual web service for sharing data within a research network, and to evaluate the impact on data consistency and quality. Virtual machines (VMs) encapsulated an open-source, semantically and syntactically interoperable secure web service infrastructure along with a shadow database. The VMs were deployed to 8 Collaborative Pediatric Critical Care Research Network Clinical Centers. Virtual web services could be deployed in hours. The interoperability of the web services reduced format misalignment from 56% to 1% and demonstrated that 99% of the data consistently transferred using the data dictionary and 1% needed human curation. Use of virtualized open-source secure web service technology could enable direct electronic abstraction of data from hospital databases for research purposes. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  20. Network information security in a phase III Integrated Academic Information Management System (IAIMS).

    PubMed

    Shea, S; Sengupta, S; Crosswell, A; Clayton, P D

    1992-01-01

    The developing Integrated Academic Information Management System (IAIMS) at Columbia-Presbyterian Medical Center provides data sharing links between two separate corporate entities, namely Columbia University Medical School and The Presbyterian Hospital, using a network-based architecture. Multiple database servers with heterogeneous user authentication protocols are linked to this network. "One-stop information shopping" implies one log-on procedure per session, not separate log-on and log-off procedures for each server or application used during a session. These circumstances pose policy-level and technical challenges to data security at the network level and to ensuring smooth information access for end users of these network-based services. Five activities being conducted as part of our security project are described: (1) policy development; (2) an authentication server for the network; (3) Kerberos as a tool for providing mutual authentication, encryption, and time stamping of authentication messages; (4) a prototype interface using Kerberos services to authenticate users accessing a network database server; and (5) a Kerberized electronic signature.
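    Two ingredients the project takes from Kerberos are a keyed authentication message and a timestamp to defeat replay. The sketch below illustrates only those ingredients with an HMAC-protected authenticator; the key, names, and skew window are invented, and this is an illustrative stand-in, not the IAIMS implementation.

```python
# Toy Kerberos-style authenticator: a keyed MAC over (user, timestamp),
# verified for both integrity and freshness.
import hashlib
import hmac
import time

SHARED_KEY = b"session-key-from-authentication-server"  # assumption for the demo

def make_authenticator(user, now=None):
    ts = str(int(now if now is not None else time.time()))
    tag = hmac.new(SHARED_KEY, f"{user}|{ts}".encode(), hashlib.sha256).hexdigest()
    return {"user": user, "timestamp": ts, "tag": tag}

def verify(auth, now=None, max_skew=300):
    msg = f"{auth['user']}|{auth['timestamp']}".encode()
    expected = hmac.new(SHARED_KEY, msg, hashlib.sha256).hexdigest()
    current = now if now is not None else time.time()
    fresh = abs(current - int(auth["timestamp"])) <= max_skew
    return fresh and hmac.compare_digest(expected, auth["tag"])

auth = make_authenticator("clinician1", now=1000)
assert verify(auth, now=1010)             # fresh, untampered: accepted
assert not verify(auth, now=1000 + 3600)  # stale timestamp: replay rejected
```

Real Kerberos uses ticket-granting exchanges and symmetric encryption rather than bare HMAC tags, but the freshness-plus-integrity check is the same idea.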

  1. Loss-tolerant measurement-device-independent quantum private queries

    PubMed Central

    Zhao, Liang-Yuan; Yin, Zhen-Qiang; Chen, Wei; Qian, Yong-Jun; Zhang, Chun-Mei; Guo, Guang-Can; Han, Zheng-Fu

    2017-01-01

    Quantum private queries (QPQ) is an important cryptographic protocol aiming to protect both the user's and the database's privacy when the database is queried privately. Recently, a variety of practical QPQ protocols based on quantum key distribution (QKD) have been proposed. However, for QKD-based QPQ the user's imperfect detectors can be subjected to detector-side-channel attacks launched by a dishonest owner of the database. Here, we present a simple example that shows how the detector-blinding attack can completely compromise the security of QKD-based QPQ. To remove all known and unknown detector side channels, we propose a solution of measurement-device-independent QPQ (MDI-QPQ) with single-photon sources. The security of the proposed protocol has been analyzed under some typical attacks. Moreover, we prove that its security is completely loss independent. The results show that practical QPQ will retain the same degree of privacy as before even with seriously uncharacterized detectors. PMID:28051101

  2. Performance evaluation of secured DICOM image communication with next generation internet protocol IPv6

    NASA Astrophysics Data System (ADS)

    Yu, Fenghai; Zhang, Jianguo; Chen, Xiaomeng; Huang, H. K.

    2005-04-01

    Next Generation Internet (NGI) technology with the new communication protocol IPv6 emerges as a potential solution for low-cost, high-speed networks for image data transmission. IPv6 is designed to solve many of the problems of the current version of IP (known as IPv4) with regard to address depletion, security, autoconfiguration, extensibility, and more. We chose the CTN (Central Test Node) DICOM software developed by the Mallinckrodt Institute of Radiology to implement IPv6/IPv4-enabled DICOM communication software on different operating systems (Windows/Linux), and used this software to evaluate the performance of IPv6/IPv4-enabled DICOM image communication with different security settings and environments. We compared IPsec with SSL/TLS for securing communication over the two TCP/IP protocols (IPv6/IPv4), and found that there are trade-offs between IPsec and SSL/TLS when implementing security in IPv6/IPv4 communication networks.
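    On the address-depletion point the abstract mentions: IPv6's fix is a vastly larger address space, which is easy to see with the Python standard library. This quick illustration is independent of the DICOM software above, and the addresses used are documentation examples.

```python
# IPv4 vs IPv6 address spaces, via the stdlib ipaddress module.
import ipaddress

v4 = ipaddress.ip_address("192.0.2.10")     # RFC 5737 documentation range
v6 = ipaddress.ip_address("2001:db8::10")   # RFC 3849 documentation range
assert (v4.version, v6.version) == (4, 6)

# 2^32 IPv4 addresses vs 2^128 IPv6 addresses:
assert ipaddress.ip_network("0.0.0.0/0").num_addresses == 2 ** 32
assert ipaddress.ip_network("::/0").num_addresses == 2 ** 128
```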

  3. Trustworthy data collection from implantable medical devices via high-speed security implementation based on IEEE 1363.

    PubMed

    Hu, Fei; Hao, Qi; Lukowiak, Marcin; Sun, Qingquan; Wilhelm, Kyle; Radziszowski, Stanisław; Wu, Yao

    2010-11-01

    Implantable medical devices (IMDs) play an important role in many medical fields. Any failure in IMD operation could have serious consequences, so it is important to protect IMDs from unauthenticated access. This study investigates secure IMD data collection within a telehealthcare [mobile health (m-health)] network. We use medical sensors carried by patients to securely access IMD data and perform secure sensor-to-sensor communications between patients to relay the IMD data to a remote doctor's server. To meet the requirement of low computational complexity, we choose N-th degree truncated polynomial ring (NTRU)-based encryption/decryption to secure IMD-sensor and sensor-sensor communications. An extended matryoshka model is developed to estimate direct/indirect trust relationships among sensors. An NTRU hardware implementation in VHSIC hardware description language (VHDL), based on the industry standard IEEE 1363, is studied to increase the speed of key generation. The performance analysis results demonstrate the security robustness of the proposed IMD data access trust model.

  4. Correlation between safety climate and contractor safety assessment programs in construction

    PubMed Central

    Sparer, EH; Murphy, LA; Taylor, KM; Dennerlein, JT

    2015-01-01

    Background Contractor safety assessment programs (CSAPs) measure safety performance by integrating multiple data sources together; however, the relationship between these measures of safety performance and safety climate within the construction industry is unknown. Methods 401 construction workers employed by 68 companies on 26 sites and 11 safety managers employed by 11 companies completed brief surveys containing a nine-item safety climate scale developed for the construction industry. CSAP scores from ConstructSecure, Inc., an online CSAP database, classified these 68 companies as high or low scorers, with the median score of the sample population as the threshold. Spearman rank correlations evaluated the association between the CSAP score and the safety climate score at the individual level, as well as with various grouping methodologies. In addition, Spearman correlations evaluated the comparison between manager-assessed safety climate and worker-assessed safety climate. Results There were no statistically significant differences between safety climate scores reported by workers in the high and low CSAP groups. There were, at best, weak correlations between workers’ safety climate scores and the company CSAP scores, with marginal statistical significance with two groupings of the data. There were also no significant differences between the manager-assessed safety climate and the worker-assessed safety climate scores. Conclusions A CSAP safety performance score does not appear to capture safety climate, as measured in this study. The nature of safety climate in construction is complex, which may be reflective of the challenges in measuring safety climate within this industry. PMID:24038403
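    The core statistic here is the Spearman rank correlation between company CSAP scores and safety-climate scores. A minimal pure-Python sketch (toy data, not the study's) computing it as the Pearson correlation of midranks:

```python
def rank(xs):
    """Midranks: tied values receive the average of their 1-based ranks."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i+1..j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rho = Pearson correlation of the two rank vectors."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical toy data: company CSAP scores vs. mean safety-climate scores.
csap = [82, 91, 77, 88, 95, 69]
climate = [3.8, 4.1, 3.5, 3.9, 4.4, 3.2]
rho = spearman(csap, climate)
```

    In this invented example the two orderings agree perfectly, so rho is 1.0; the study's point is precisely that real CSAP and climate scores showed at best weak rank agreement.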

  5. [Assessment on ecological security spatial differences of west areas of Liaohe River based on GIS].

    PubMed

    Wang, Geng; Wu, Wei

    2005-09-01

    Ecological security assessment and early-warning research involve spatial, non-linear, and random characteristics and require handling large amounts of spatial information. Spatial analysis and data management are strengths of GIS: it can identify the distribution trends and spatial relations of environmental factors and display ecological security patterns graphically. This paper discusses a GIS-based method, grounded in the concept of ecosystem non-health, for assessing spatial differences in ecological security in the western areas of the Liaohe River. First, a pressure-state-response (P-S-R) assessment indicator system was studied, with field investigation and information gathering; second, the river was digitized, fuzzy AHP was applied to assign weights, and values were quantified and calculated by fuzzy comparison; finally, a grid database was established, and the spatial differences in ecological security were presented through GIS interpolation and assembly.

  6. Additive Manufacturing and High-Performance Computing: a Disruptive Latent Technology

    NASA Astrophysics Data System (ADS)

    Goodwin, Bruce

    2015-03-01

    This presentation will discuss the relationship between recent advances in Additive Manufacturing (AM) technology, High-Performance Computing (HPC) simulation and design capabilities, and related advances in Uncertainty Quantification (UQ), and then examine their impacts upon national and international security. The presentation surveys how AM accelerates the fabrication process, while HPC combined with UQ provides a fast track for the engineering design cycle. The combination of AM and HPC/UQ almost eliminates the iterative engineering design and prototype cycle, thereby dramatically reducing cost of production and time-to-market. These methods thereby present significant benefits for US national interests, both civilian and military, in an age of austerity. Finally, considering cyber security issues and the advent of the ``cloud,'' these disruptive, currently latent technologies may well enable proliferation and so challenge both nuclear and non-nuclear aspects of international security.

  7. An E-Hospital Security Architecture

    NASA Astrophysics Data System (ADS)

    Tian, Fang; Adams, Carlisle

    In this paper, we describe how to use cryptography for network security and access control in an e-hospital. We first define the security goals of the e-hospital system and then analyze the current application system. Our design is based on this system analysis and on the regulations protecting patients' privacy. The security of the whole application system is strengthened through layered protection. Three security domains in the e-hospital system are defined according to their sensitivity levels, and for each domain we propose different security protections. We use identity-based cryptography to establish a secure communication channel in the backbone network and policy-based cryptography to establish secure communication channels between end users and the backbone network. We also use policy-based cryptography for access control in the application system, and symmetric-key cryptography to protect the actual data in the database. The identity-based and policy-based schemes are both built on elliptic curve cryptography, a form of public-key cryptography.
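    The layered protection described above assigns each resource to one of three sensitivity domains. A minimal sketch of domain-based access control (the domain names and levels here are invented for illustration, not taken from the paper):

```python
# Three illustrative security domains, ordered by sensitivity level.
DOMAIN_LEVEL = {"public": 0, "clinical": 1, "patient-record": 2}

def can_access(user_clearance: str, resource_domain: str) -> bool:
    """Grant access only when the user's clearance level meets or
    exceeds the sensitivity level of the resource's domain."""
    return DOMAIN_LEVEL[user_clearance] >= DOMAIN_LEVEL[resource_domain]
```

    In the paper's architecture this policy check would sit behind the policy-based cryptographic layer, so that a decryption key is only derivable when the check passes.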

  8. Test Data Monitor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosas, Joseph

    The National Security Campus (NSC) collects a large amount of test data used to accept high-value, high-rigor product. Historically, the data has been used to support root-cause analysis when anomalies are detected in downstream processes; the opportunity to use the data for predictive failure analysis, however, had never been exploited. The primary goal of the Test Data Monitor (TDM) software is to provide automated capabilities to analyze data in near-real-time and report trends that foreshadow actual product failures. To date, the aerospace industry as a whole is challenged at utilizing collected data to the degree that modern technology allows. As a result of the innovation behind TDM, Honeywell is able to monitor millions of data points through a multitude of SPC algorithms continuously and autonomously, so that personnel can more efficiently and accurately direct their attention to suspect processes or features. TDM's capabilities have been recognized by our U.S. Department of Energy National Nuclear Security Administration (NNSA) sponsor for potential use at other sites within the NNSA. This activity supports multiple initiatives, including expectations of the NNSA and broader corporate goals that center on data-based quality controls in production.
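    TDM's core idea, running SPC rules continuously over streaming test measurements, can be sketched with a simple 3-sigma control-chart check (pure Python with illustrative data; Honeywell's actual algorithms are not public):

```python
from statistics import mean, stdev

def spc_alarms(baseline, stream, k=3.0):
    """Flag points in `stream` outside mean +/- k*sigma of `baseline`.

    Returns indices of out-of-control measurements, the kind of drift
    a monitor would surface before an actual product failure.
    """
    mu, sigma = mean(baseline), stdev(baseline)
    lo, hi = mu - k * sigma, mu + k * sigma
    return [i for i, x in enumerate(stream) if not (lo <= x <= hi)]

# Hypothetical in-control history and a new stream with one drifting point.
baseline = [10.0, 10.2, 9.9, 10.1, 10.0, 9.8, 10.1, 10.0]
stream = [10.1, 9.9, 12.5, 10.0]  # 12.5 is the out-of-control measurement
alarms = spc_alarms(baseline, stream)
```

    A production monitor would run many such rules (runs, trends, zone tests) per measured feature and route the flagged indices to engineers.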

  9. Cyber Risk Management for Critical Infrastructure: A Risk Analysis Model and Three Case Studies.

    PubMed

    Paté-Cornell, M-Elisabeth; Kuypers, Marshall; Smith, Matthew; Keller, Philip

    2018-02-01

    Managing cyber security in an organization involves allocating the protection budget across a spectrum of possible options. This requires assessing the benefits and the costs of these options. The risk analyses presented here are statistical when relevant data are available, and system-based for high-consequence events that have not happened yet. This article presents, first, a general probabilistic risk analysis framework for cyber security in an organization to be specified. It then describes three examples of forward-looking analyses motivated by recent cyber attacks. The first one is the statistical analysis of an actual database, extended at the upper end of the loss distribution by a Bayesian analysis of possible, high-consequence attack scenarios that may happen in the future. The second is a systems analysis of cyber risks for a smart, connected electric grid, showing that there is an optimal level of connectivity. The third is an analysis of sequential decisions to upgrade the software of an existing cyber security system or to adopt a new one to stay ahead of adversaries trying to find their way in. The results are distributions of losses to cyber attacks, with and without some considered countermeasures in support of risk management decisions based both on past data and anticipated incidents. © 2017 Society for Risk Analysis.
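    The first analysis extends an empirical loss distribution with a tail of rare, high-consequence scenarios. A hedged Monte Carlo sketch of that structure (every rate and loss magnitude below is invented for illustration, not drawn from the article's data):

```python
import random

def simulate_annual_losses(rng, n_years=10000):
    """Simulate total annual cyber loss: frequent routine incidents
    with lognormal severities, plus a rare high-consequence scenario.
    All parameters are illustrative assumptions."""
    losses = []
    for _ in range(n_years):
        total = 0.0
        # Routine incidents: 8-16 per year, median loss ~ $10k each.
        for _ in range(rng.randrange(8, 17)):
            total += rng.lognormvariate(9.2, 1.0)  # exp(9.2) ~ $9.9k
        # Rare scenario: assumed 1% chance per year of a ~$5M event.
        if rng.random() < 0.01:
            total += 5e6
        losses.append(total)
    return losses

rng = random.Random(42)
losses = simulate_annual_losses(rng)
median_loss = sorted(losses)[len(losses) // 2]
```

    Rerunning the simulation with and without a candidate countermeasure (e.g., lowering the rare-event probability) yields the comparative loss distributions the article uses to support risk management decisions.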

  10. Design and Implementation of Website Information Disclosure Assessment System

    PubMed Central

    Cho, Ying-Chiang; Pan, Jen-Yi

    2015-01-01

    Internet application technologies, such as cloud computing and cloud storage, have increasingly changed people’s lives. Websites contain vast amounts of personal privacy information. In order to protect this information, network security technologies, such as database protection and data encryption, attract many researchers. The most serious problems concerning web vulnerability are e-mail address and network database leakages. These leakages have many causes. For example, malicious users can steal database contents, taking advantage of mistakes made by programmers and administrators. In order to mitigate this type of abuse, a website information disclosure assessment system is proposed in this study. This system utilizes a series of technologies, such as web crawler algorithms, SQL injection attack detection, and web vulnerability mining, to assess a website’s information disclosure. Thirty websites, randomly sampled from the top 50 world colleges, were used to collect leakage information. This testing showed the importance of increasing the security and privacy of website information for academic websites. PMID:25768434
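    Part of the assessment pipeline above detects SQL injection attempts in request parameters. A minimal, signature-based sketch (illustrative regexes only; real scanners are far more thorough and also test server responses):

```python
import re

# Illustrative signatures of common SQL injection payload shapes.
SQLI_PATTERNS = [
    re.compile(r"(?i)\bunion\b.+\bselect\b"),   # UNION-based extraction
    re.compile(r"(?i)\bor\b\s+\d+\s*=\s*\d+"),  # tautology such as OR 1=1
    re.compile(r"--|#|/\*"),                    # comment to cut off the query
    re.compile(r"(?i);\s*drop\b"),              # stacked DROP statement
]

def looks_like_sqli(value: str) -> bool:
    """Return True if a parameter value matches any known payload shape."""
    return any(p.search(value) for p in SQLI_PATTERNS)
```

    A crawler would feed every discovered form field and URL parameter through checks like this, then follow up suspicious endpoints with actual probe requests.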

  11. High-performance information search filters for acute kidney injury content in PubMed, Ovid Medline and Embase.

    PubMed

    Hildebrand, Ainslie M; Iansavichus, Arthur V; Haynes, R Brian; Wilczynski, Nancy L; Mehta, Ravindra L; Parikh, Chirag R; Garg, Amit X

    2014-04-01

    We frequently fail to identify articles relevant to the subject of acute kidney injury (AKI) when searching the large bibliographic databases such as PubMed, Ovid Medline or Embase. To address this issue, we used computer automation to create information search filters to better identify articles relevant to AKI in these databases. We first manually reviewed a sample of 22 992 full-text articles and used prespecified criteria to determine whether each article contained AKI content or not. In the development phase (two-thirds of the sample), we developed and tested the performance of >1.3-million unique filters. Filters with high sensitivity and high specificity for the identification of AKI articles were then retested in the validation phase (remaining third of the sample). We succeeded in developing and validating high-performance AKI search filters for each bibliographic database with sensitivities and specificities in excess of 90%. Filters optimized for sensitivity reached at least 97.2% sensitivity, and filters optimized for specificity reached at least 99.5% specificity. The filters were complex; for example one PubMed filter included >140 terms used in combination, including 'acute kidney injury', 'tubular necrosis', 'azotemia' and 'ischemic injury'. In proof-of-concept searches, physicians found more articles relevant to topics in AKI with the use of the filters. PubMed, Ovid Medline and Embase can be filtered for articles relevant to AKI in a reliable manner. These high-performance information filters are now available online and can be used to better identify AKI content in large bibliographic databases.
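    The filters were judged on sensitivity and specificity against the hand-labelled reference set. A small sketch of evaluating a keyword filter the same way (toy corpus and terms, not the actual >140-term validated filters):

```python
def evaluate_filter(terms, labelled_articles):
    """Return (sensitivity, specificity) of a keyword filter against
    articles labelled (text, is_relevant) by manual review."""
    tp = fp = tn = fn = 0
    for text, relevant in labelled_articles:
        hit = any(t in text.lower() for t in terms)
        if hit and relevant:
            tp += 1
        elif hit and not relevant:
            fp += 1
        elif not hit and relevant:
            fn += 1
        else:
            tn += 1
    return tp / (tp + fn), tn / (tn + fp)

terms = ["acute kidney injury", "tubular necrosis", "azotemia"]
corpus = [
    ("Outcomes of acute kidney injury after surgery", True),
    ("Azotemia in critically ill patients", True),
    ("Chronic hypertension management", False),
    ("Dialysis timing in acute tubular necrosis", True),
    ("Renal ischemic injury biomarkers", True),   # missed by this filter
    ("Asthma exacerbation triggers", False),
]
sens, spec = evaluate_filter(terms, corpus)
```

    The study's automation searched over a million such term combinations on the development set and kept those exceeding the sensitivity or specificity targets on the validation set.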

  12. Elliptic Curve Cryptography with Security System in Wireless Sensor Networks

    NASA Astrophysics Data System (ADS)

    Huang, Xu; Sharma, Dharmendra

    2010-10-01

    The rapid progress of wireless communications and embedded micro-electro-mechanical-system technologies has made wireless sensor networks (WSNs) very popular and even part of our daily life. WSN designs are generally application driven: a particular application's requirements determine how the network behaves. WSNs have attracted increasing attention in recent years due to their linear scalability, small software footprint, low hardware implementation cost, low bandwidth requirements, and high device performance. Today's software applications, WSNs included, are mainly characterized by component-based structures that are usually heterogeneous and distributed, yet WSNs typically need to configure themselves automatically and support ad hoc routing. Agent technology provides a method for handling increasing software complexity and supporting rapid and accurate decision making. Building on our previous works [1, 2], this paper makes three contributions: (a) a fuzzy controller for a dynamic sliding-window size to improve the performance of running ECC; (b) a hidden generator point, presented for the first time, for protection against man-in-the-middle attacks; and (c) a first investigation of applying multi-agent techniques to key exchange. Security systems have drawn great attention as cryptographic algorithms have gained popularity for use in constrained environments such as mobile sensor applications, where computing resources and power availability are limited. Elliptic curve cryptography (ECC) is a high-potential candidate for WSNs, requiring less computational power, communication bandwidth, and memory than other cryptosystems. To save pre-computation storage, there is a recent trend in sensor networks for group leaders, rather than individual sensors, to communicate with the end database, which highlights the need to prevent man-in-the-middle attacks. A hidden generator point that offers good protection against the man-in-the-middle (MitM) attack, one of the major worries for sensor networks with multi-agent systems, is also discussed.
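    ECC's suitability for constrained sensors comes from short keys and simple scalar arithmetic. A textbook sketch of double-and-add scalar multiplication, the operation the paper's sliding-window controller accelerates, on a tiny toy curve (y^2 = x^3 + 2x + 2 over GF(17), far too small for real security):

```python
# Toy curve y^2 = x^3 + a*x + b over GF(p); None is the point at
# infinity. Real deployments use primes of ~256 bits.
p, a, b = 17, 2, 2
G = (5, 1)  # generator of a subgroup of order 19

def point_add(P, Q):
    """Group law: add two points on the curve."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None  # P + (-P) = infinity
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def scalar_mult(k, P):
    """Plain double-and-add: the loop a sliding-window method speeds up
    by processing several scalar bits per group operation."""
    R = None
    while k:
        if k & 1:
            R = point_add(R, P)
        P = point_add(P, P)
        k >>= 1
    return R
```

    The paper's "hidden generator point" idea amounts to the parties agreeing on a secret point playing the role of G, so an eavesdropper in the middle cannot substitute exchanged public points consistently.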

  13. 76 FR 81787 - Privacy Act of 1974: Implementation of Exemptions; Department of Homeland Security/ALL-030 Use of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-29

    ... requirements for the agency (DHS) to respect individuals' rights to control their information in possession of... Database System of Records is a repository of information held by DHS in connection with its several and.... The DHS/ALL-030 Use of Terrorist Screening Database System of Records contains information that is...

  14. An adaptive cryptographic accelerator for network storage security on dynamically reconfigurable platform

    NASA Astrophysics Data System (ADS)

    Tang, Li; Liu, Jing-Ning; Feng, Dan; Tong, Wei

    2008-12-01

    Existing security solutions in network storage environments perform poorly because cryptographic operations (encryption and decryption) implemented in software can dramatically reduce system performance. In this paper we propose a cryptographic hardware accelerator on a dynamically reconfigurable platform for the security of high-performance network storage systems. We employ a dynamically reconfigurable platform based on an FPGA to implement a PowerPC-based embedded system that executes cryptographic algorithms. To reduce the reconfiguration latency, we apply prefetch scheduling. Moreover, the processing elements can be dynamically configured to support different cryptographic algorithms according to the requests received by the accelerator. In the experiment, we implemented the AES (Rijndael) and 3DES cryptographic algorithms in the reconfigurable accelerator. The proposed reconfigurable cryptographic accelerator can dramatically increase performance compared with traditional software-based network storage systems.

  15. From Serpent to CEO: Improving First-Term Security Forces Airman Performance Through Neuroscience Education

    DTIC Science & Technology

    2017-06-09

    full ability to inhibit ANS and limbic response are prone to be impulsive, unintentional, or hesitant when faced with high-threat decisions...graduate degrees in Criminal Justice, a Graduate Certificate in Organizational Leadership, and a current American Society for Industrial Security...

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, Andrew J.; Miller, Brian W.; Robinson, Sean M.

    Imaging technology is generally considered too invasive for arms control inspections due to the concern that it cannot properly secure sensitive features of the inspected item. However, this same sensitive information, which could include direct information on the form and function of the items under inspection, could be used for robust arms control inspections. The single-pixel X-ray imager (SPXI) is introduced as a method to make such inspections, capturing the salient spatial information of an object in a secure manner while never forming an actual image. The method is built on the theory of compressive sensing and the single-pixel optical camera. The performance of the system is quantified here using simulated inspections of simple objects. Measures of the robustness and security of the method are introduced and used to determine how such an inspection can be made while maintaining high robustness and security. In particular, it is found that an inspection with low noise (<1%) and high undersampling (>256×) exhibits high robustness and security.

  17. Database searching and accounting of multiplexed precursor and product ion spectra from the data independent analysis of simple and complex peptide mixtures.

    PubMed

    Li, Guo-Zhong; Vissers, Johannes P C; Silva, Jeffrey C; Golick, Dan; Gorenstein, Marc V; Geromanos, Scott J

    2009-03-01

    A novel database search algorithm is presented for the qualitative identification of proteins over a wide dynamic range, both in simple and complex biological samples. The algorithm has been designed for the analysis of data originating from data independent acquisitions, whereby multiple precursor ions are fragmented simultaneously. Measurements used by the algorithm include retention time, ion intensities, charge state, and accurate masses on both precursor and product ions from LC-MS data. The search algorithm uses an iterative process whereby each iteration incrementally increases the selectivity, specificity, and sensitivity of the overall strategy. Increased specificity is obtained by utilizing a subset database search approach, whereby for each subsequent stage of the search, only those peptides from securely identified proteins are queried. Tentative peptide and protein identifications are ranked and scored by their relative correlation to a number of models of known and empirically derived physicochemical attributes of proteins and peptides. In addition, the algorithm utilizes decoy database techniques for automatically determining the false positive identification rates. The search algorithm has been tested by comparing the search results from a four-protein mixture, the same four-protein mixture spiked into a complex biological background, and a variety of other "system" type protein digest mixtures. The method was validated independently by data dependent methods, while concurrently relying on replication and selectivity. Comparisons were also performed with other commercially and publicly available peptide fragmentation search algorithms. The presented results demonstrate the ability to correctly identify peptides and proteins from data independent acquisition strategies with high sensitivity and specificity. 
They also illustrate a more comprehensive analysis of the samples studied, providing approximately 20% more protein identifications compared to a more conventional data-directed approach using the same identification criteria, with a concurrent increase in both sequence coverage and the number of modified peptides.
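    The decoy-database technique mentioned above estimates false-positive rates by searching a reversed ("decoy") copy of the protein database alongside the real one. A minimal sketch of the standard target-decoy FDR estimate (the score list is synthetic, not from the study):

```python
def fdr_at_threshold(matches, threshold):
    """Estimate FDR as (#decoy hits) / (#target hits) at or above a
    score threshold; decoy hits proxy for random false matches.

    `matches` is a list of (score, is_decoy) pairs.
    """
    targets = sum(1 for s, d in matches if s >= threshold and not d)
    decoys = sum(1 for s, d in matches if s >= threshold and d)
    return decoys / targets if targets else 0.0

# Synthetic peptide-spectrum matches: (score, came_from_decoy_db).
matches = [(9.1, False), (8.7, False), (8.2, False), (7.9, True),
           (7.5, False), (7.1, False), (6.8, True), (6.4, False)]
```

    Sweeping the threshold downward until the estimated FDR reaches a target (commonly 1%) sets the reporting cutoff automatically, which is how such algorithms self-calibrate across datasets.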

  18. Partnerships - Working Together to Build The National Map

    USGS Publications Warehouse

    ,

    2004-01-01

    Through The National Map, the U.S. Geological Survey (USGS) is working with partners to ensure that current, accurate, and complete base geographic information is available for the Nation. Designed as a network of online digital databases, it provides a consistent geographic data framework for the country and serves as a foundation for integrating, sharing, and using data easily and reliably. It provides public access to high quality geospatial data and information from multiple partners to help inform decisionmaking by resource managers and the public, and to support intergovernmental homeland security and emergency management requirements.

  19. Towards a privacy preserving cohort discovery framework for clinical research networks.

    PubMed

    Yuan, Jiawei; Malin, Bradley; Modave, François; Guo, Yi; Hogan, William R; Shenkman, Elizabeth; Bian, Jiang

    2017-02-01

    The last few years have witnessed an increasing number of clinical research networks (CRNs) focused on building large collections of data from electronic health records (EHRs), claims, and patient-reported outcomes (PROs). Many of these CRNs provide a service for the discovery of research cohorts with various health conditions, which is especially useful for rare diseases. Supporting patient privacy can enhance the scalability and efficiency of such processes; however, current practice mainly relies on policy, such as the guidelines defined in the Health Insurance Portability and Accountability Act (HIPAA), which is insufficient for CRNs (e.g., HIPAA does not require encryption of data, which can mitigate insider threats). By combining policy with privacy-enhancing technologies we can enhance the trustworthiness of CRNs. The goal of this research is to determine whether searchable encryption can instill privacy in CRNs without sacrificing their usability. We developed a technique, implemented in working software, to enable privacy-preserving cohort discovery (PPCD) services in large distributed CRNs based on elliptic curve cryptography (ECC). This technique also incorporates a block indexing strategy to improve the performance (in terms of computational running time) of PPCD. We evaluated the PPCD service with three real cohort definitions of varied query complexity: (1) elderly cervical cancer patients who underwent radical hysterectomy, (2) oropharyngeal and tongue cancer patients who underwent robotic transoral surgery, and (3) female breast cancer patients who underwent mastectomy. These definitions were tested in an encrypted database of 7.1 million records derived from the publicly available Healthcare Cost and Utilization Project (HCUP) Nationwide Inpatient Sample (NIS). We assessed the performance of the PPCD service in terms of (1) accuracy in cohort discovery, (2) computational running time, and (3) privacy afforded to the underlying records during PPCD.
The empirical results indicate that the proposed PPCD can execute cohort discovery queries in a reasonable amount of time, with query runtime in the range of 165-262s for the 3 use cases, with zero compromise in accuracy. We further show that the search performance is practical because it supports a highly parallelized design for secure evaluation over encrypted records. Additionally, our security analysis shows that the proposed construction is resilient to standard adversaries. PPCD services can be designed for clinical research networks. The security construction presented in this work specifically achieves high privacy guarantees by preventing both threats originating from within and beyond the network. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. Internal Review of the Washington Navy Yard Shooting. A Report to the Secretary of Defense

    DTIC Science & Technology

    2013-11-20

    the following: • Biometrically enabled background security screening • Identification card security features • Identity-proofing and vetting...claimed identities vetted through mandatory databases such as NCIC and TSDB. This occurred in attempts to reduce access costs. OMB memorandum 05-24...

  1. Program for Critical Technologies in Breast Oncology

    DTIC Science & Technology

    1999-07-01

    the tissues, and in an ethical manner that respects the patients’ rights. The Program for Critical Technologies in Breast Oncology helps address all of...closer to clinical utility. References: Adida C, Crotty PL, McGrath J, Berrebi D, Diebold J, Altieri DC. Developmentally regulated

  2. Organizing the Army for Information Warfare

    DTIC Science & Technology

    2013-03-01

    US’s reputation in the global community, but by pilfering intellectual property, foiling industrial controls, and ‘hacking’ into secured networks...human factors can be exploited to discern passwords and circumvent other physical safeguards that secure cyber infrastructure.48 The increasing...Ranking of America’s Largest Corporations.” 7 David F. Carr, Information Week, January 25, 2012, “Facebook: The Database Of Wealth And Power,” http

  3. High-energy physics software parallelization using database techniques

    NASA Astrophysics Data System (ADS)

    Argante, E.; van der Stok, P. D. V.; Willers, I.

    1997-02-01

    A programming model for software parallelization, called CoCa, is introduced that copes with problems caused by typical features of high-energy physics software. By basing CoCa on the database transaction paradigm, the complexity induced by the parallelization is largely transparent to the programmer, resulting in a higher level of abstraction than the native message-passing software. CoCa is implemented on a Meiko CS-2 and on a SUN SPARCcenter 2000 parallel computer. On the CS-2, the performance is comparable with that of native PVM and MPI.

  4. Design Considerations for a Web-based Database System of ELISpot Assay in Immunological Research

    PubMed Central

    Ma, Jingming; Mosmann, Tim; Wu, Hulin

    2005-01-01

    The enzyme-linked immunospot (ELISpot) assay has been a primary tool in immunological research (such as studies of HIV-specific T cell responses). Due to the huge amount of data involved in ELISpot assay testing, a database system is needed for efficient data entry, easy retrieval, secure storage, and convenient data processing. In addition, the NIH has recently issued a policy to promote the sharing of research data (see http://grants.nih.gov/grants/policy/data_sharing), and a Web-based database system will clearly benefit data sharing among broad research communities. Here are some considerations for a database system for the ELISpot assay (DBSEA). PMID:16779326
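    A database of ELISpot results mostly needs tables for specimens, plates, and per-well spot counts. A hedged sketch using Python's built-in sqlite3 (the table and column names are invented for illustration, not taken from the paper's design):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # a server-backed DB in a real deployment
conn.executescript("""
CREATE TABLE specimen (
    specimen_id  INTEGER PRIMARY KEY,
    subject_code TEXT NOT NULL,   -- de-identified, to support data sharing
    collected_on TEXT NOT NULL
);
CREATE TABLE plate (
    plate_id   INTEGER PRIMARY KEY,
    assay_date TEXT NOT NULL,
    operator   TEXT
);
CREATE TABLE well (
    plate_id    INTEGER REFERENCES plate(plate_id),
    position    TEXT NOT NULL,    -- e.g. 'A1'
    specimen_id INTEGER REFERENCES specimen(specimen_id),
    stimulus    TEXT NOT NULL,    -- antigen or control condition
    spot_count  INTEGER NOT NULL,
    PRIMARY KEY (plate_id, position)
);
""")
conn.execute("INSERT INTO specimen VALUES (1, 'SUBJ-001', '2005-01-15')")
conn.execute("INSERT INTO plate VALUES (1, '2005-01-16', 'tech1')")
conn.execute("INSERT INTO well VALUES (1, 'A1', 1, 'HIV-gag', 42)")
(count,) = conn.execute(
    "SELECT spot_count FROM well WHERE plate_id=1 AND position='A1'"
).fetchone()
```

    Keeping subject identifiers out of the assay tables (only a de-identified code) is one simple design choice that eases the secure-storage and data-sharing goals the paper raises.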

  5. 7 CFR 1494.401 - Performance security.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 10 2010-01-01 2010-01-01 false Performance security. 1494.401 Section 1494.401... Program Operations § 1494.401 Performance security. (a) Requirement to establish performance security... establish performance security, in a form which is acceptable to CCC, in order to guarantee the eligible...

  6. 7 CFR 1494.401 - Performance security.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 10 2011-01-01 2011-01-01 false Performance security. 1494.401 Section 1494.401... Program Operations § 1494.401 Performance security. (a) Requirement to establish performance security... establish performance security, in a form which is acceptable to CCC, in order to guarantee the eligible...

  7. Face Recognition for Access Control Systems Combining Image-Difference Features Based on a Probabilistic Model

    NASA Astrophysics Data System (ADS)

    Miwa, Shotaro; Kage, Hiroshi; Hirai, Takashi; Sumi, Kazuhiko

    We propose a probabilistic face recognition algorithm for Access Control Systems (ACSs). Compared with existing ACSs using low-cost IC cards, face recognition has advantages in usability and security: it does not require people to hold cards over scanners, and it does not accept impostors carrying authorized cards. Face recognition therefore attracts more interest in security markets than IC cards. But in the security markets where low-cost ACSs exist, price competition is important, and there are limits on the quality of available cameras and image control. ACSs using face recognition are therefore required to handle much lower-quality images, such as defocused and poorly gain-controlled images, than high-security systems such as immigration control. To tackle such image-quality problems we developed a face recognition algorithm based on a probabilistic model that combines a variety of image-difference features, trained by Real AdaBoost, with their prior probability distributions. This makes it possible to evaluate and utilize only the reliable features among the trained ones during each authentication, achieving high recognition rates. A field evaluation using a pseudo Access Control System installed in our office shows that the proposed system achieves a constantly high recognition rate independent of face image quality, with an EER (Equal Error Rate) about four times lower under a variety of image conditions than a system without prior probability distributions, whereas using image-difference features without priors is sensitive to image quality. We also evaluated PCA, which performs worse but consistently, because of its general optimization over all data. Compared with PCA, Real AdaBoost without prior distributions performs twice as well under good image conditions, but degrades to performance as poor as PCA's under poor image conditions.

  8. Missing data reconstruction using Gaussian mixture models for fingerprint images

    NASA Astrophysics Data System (ADS)

    Agaian, Sos S.; Yeole, Rushikesh D.; Rao, Shishir P.; Mulawka, Marzena; Troy, Mike; Reinecke, Gary

    2016-05-01

    Publisher's Note: This paper, originally published on 25 May 2016, was replaced with a revised version on 16 June 2016. If you downloaded the original PDF, but are unable to access the revision, please contact SPIE Digital Library Customer Service for assistance. One of the most important areas in biometrics is matching partial fingerprints against fingerprint databases. Recently, significant progress has been made in designing fingerprint identification systems for missing fingerprint information. However, dependable reconstruction of fingerprint images remains challenging due to the complexity and ill-posed nature of the problem. In this article, both binary and gray-level images are reconstructed. The paper also presents a new similarity score to evaluate the performance of the reconstructed binary image. The proposed fingerprint image identification system can be automated and extended to numerous other security applications such as postmortem fingerprints, forensic science, investigations, artificial intelligence, robotics, access control, and financial security, as well as the verification of firearm purchasers, driver license applicants, etc.
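    The reconstruction idea, filling a missing value with its conditional expectation under a Gaussian mixture fitted to intact data, can be sketched in a tiny 2-D form (pure Python; the mixture parameters are assumed already fitted, and real fingerprint models are far higher-dimensional):

```python
import math

def gauss_pdf(x, mu, sigma):
    """1-D Gaussian density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def impute_x2(x1, components):
    """E[x2 | x1] under a 2-D Gaussian mixture.

    Each component is (weight, mu1, mu2, s1, s2, rho). Responsibilities
    come from the observed coordinate; the missing coordinate is filled
    with the responsibility-weighted conditional means.
    """
    resp = [w * gauss_pdf(x1, mu1, s1) for w, mu1, mu2, s1, s2, rho in components]
    z = sum(resp)
    est = 0.0
    for r, (w, mu1, mu2, s1, s2, rho) in zip(resp, components):
        cond_mean = mu2 + rho * (s2 / s1) * (x1 - mu1)  # bivariate-normal E[x2|x1]
        est += (r / z) * cond_mean
    return est

# Two illustrative, well-separated components with positive correlation.
mix = [(0.5, 0.0, 0.0, 1.0, 1.0, 0.8),
       (0.5, 10.0, 10.0, 1.0, 1.0, 0.8)]
```

    For a fingerprint patch, the "observed coordinate" generalizes to the intact pixels of a block and the "missing coordinate" to the occluded ones, with the same responsibility-weighted conditional-mean structure.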

  9. A Method of Retrospective Computerized System Validation for Drug Manufacturing Software Considering Modifications

    NASA Astrophysics Data System (ADS)

    Takahashi, Masakazu; Fukue, Yoshinori

    This paper proposes a Retrospective Computerized System Validation (RCSV) method for Drug Manufacturing Software (DMSW) that takes software modification into account. Because DMSW used for quality management and facility control has a major impact on drug quality, regulatory agencies require proof of the adequacy of DMSW functions and performance, based on development documents and test results. In particular, the work of demonstrating the adequacy of previously developed DMSW from existing documents and operational records is called RCSV. When a DMSW that had already undergone RCSV was modified, it was difficult to secure consistency between the development documents and test results for the modified parts and the existing documents and operational records for the unmodified parts; this made conducting RCSV difficult. In this paper, we propose (a) a defined document architecture, (b) defined descriptive items and levels within the documents, (c) management of design information in a database, (d) exhaustive testing, and (e) an integrated RCSV procedure. As a result, we could conduct adequate RCSV while securing consistency.

  10. Speaker emotion recognition: from classical classifiers to deep neural networks

    NASA Astrophysics Data System (ADS)

    Mezghani, Eya; Charfeddine, Maha; Nicolas, Henri; Ben Amar, Chokri

    2018-04-01

    Speaker emotion recognition has been considered among the most challenging tasks in recent years. In fact, automatic systems for security, medicine, or education can be improved by taking the affective state of speech into account. In this paper, a twofold approach to speech emotion classification is proposed: first, a relevant set of features is adopted; second, numerous supervised training techniques, from classic methods to deep learning, are evaluated experimentally. Experimental results indicate that deep architectures can improve classification performance on two affective databases, the Berlin Dataset of Emotional Speech and the SAVEE (Surrey Audio-Visual Expressed Emotion) dataset.

  11. High-performance Negative Database for Massive Data Management System of The Mingantu Spectral Radioheliograph

    NASA Astrophysics Data System (ADS)

    Shi, Congming; Wang, Feng; Deng, Hui; Liu, Yingbo; Liu, Cuiyin; Wei, Shoulin

    2017-08-01

    As a dedicated synthetic-aperture radio interferometer in China, the MingantU SpEctral Radioheliograph (MUSER), initially known as the Chinese Spectral RadioHeliograph (CSRH), has entered the stage of routine observation. More than 23 million data records per day need to be managed effectively to provide high-performance data query and retrieval for scientific data reduction. To cope with the massive amounts of data generated by MUSER, this paper proposes a novel data management technique called the negative database (ND) and uses it to implement a data management system for MUSER. Built on a key-value database, the ND technique makes full use of the complement set of observational data to derive the requisite information. Experimental results showed that the proposed ND can significantly reduce storage volume in comparison with a relational database management system (RDBMS). Even when the time needed to derive absent records is taken into account, its overall performance, including querying and deriving data, is comparable with that of an RDBMS. The ND technique effectively solves the problem of massive data storage for MUSER and is a valuable reference for the massive data management required by next-generation telescopes.
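
    The complement-set idea behind the ND can be sketched in a few lines. This toy class is illustrative only and assumes a small, enumerable key space; MUSER's actual implementation sits on a key-value database and a much larger record space.

```python
class NegativeDatabase:
    """Toy sketch of the negative-database idea: over a known, enumerable
    key space, store only the *absent* keys; present records are derived
    as the complement. Class and method names are illustrative."""

    def __init__(self, key_space, observed):
        self.key_space = set(key_space)
        # Store the complement: keys for which no observation exists.
        self.absent = self.key_space - set(observed)

    def contains(self, key):
        # A record exists iff its key is NOT in the negative store.
        return key in self.key_space and key not in self.absent

    def derive_records(self):
        # Reconstruct the full set of present records from the complement.
        return self.key_space - self.absent

# E.g. per-day frame indices 0..9, of which frames 3 and 7 were never written:
nd = NegativeDatabase(range(10), observed=[0, 1, 2, 4, 5, 6, 8, 9])
```

    When most keys in the space are present, the negative store holds only the few absent ones, which is where the storage saving over a row-per-record RDBMS layout comes from.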

  12. 7 CFR 1494.401 - Performance security.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 10 2013-01-01 2013-01-01 false Performance security. 1494.401 Section 1494.401... Performance security. (a) Requirement to establish performance security. Prior to the submission of an offer to CCC in response to an Invitation, an eligible exporter must establish performance security, in a...

  13. 7 CFR 1494.401 - Performance security.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 10 2012-01-01 2012-01-01 false Performance security. 1494.401 Section 1494.401... Performance security. (a) Requirement to establish performance security. Prior to the submission of an offer to CCC in response to an Invitation, an eligible exporter must establish performance security, in a...

  14. A single-pixel X-ray imager concept and its application to secure radiographic inspections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, Andrew J.; Miller, Brian W.; Robinson, Sean M.

    Imaging technology is generally considered too invasive for arms control inspections due to the concern that it cannot properly secure sensitive features of the inspected item. But, this same sensitive information, which could include direct information on the form and function of the items under inspection, could be used for robust arms control inspections. The single-pixel X-ray imager (SPXI) is introduced as a method to make such inspections, capturing the salient spatial information of an object in a secure manner while never forming an actual image. We built this method on the theory of compressive sensing and the single pixel optical camera. The performance of the system is quantified using simulated inspections of simple objects. Measures of the robustness and security of the method are introduced and used to determine how robust and secure such an inspection would be. Particularly, it is found that an inspection with low noise (<1%) and high undersampling (>256×) exhibits high robustness and security.

  15. A single-pixel X-ray imager concept and its application to secure radiographic inspections

    DOE PAGES

    Gilbert, Andrew J.; Miller, Brian W.; Robinson, Sean M.; ...

    2017-07-01

    Imaging technology is generally considered too invasive for arms control inspections due to the concern that it cannot properly secure sensitive features of the inspected item. But, this same sensitive information, which could include direct information on the form and function of the items under inspection, could be used for robust arms control inspections. The single-pixel X-ray imager (SPXI) is introduced as a method to make such inspections, capturing the salient spatial information of an object in a secure manner while never forming an actual image. We built this method on the theory of compressive sensing and the single pixel optical camera. The performance of the system is quantified using simulated inspections of simple objects. Measures of the robustness and security of the method are introduced and used to determine how robust and secure such an inspection would be. Particularly, it is found that an inspection with low noise (<1%) and high undersampling (>256×) exhibits high robustness and security.

  16. A single-pixel X-ray imager concept and its application to secure radiographic inspections

    NASA Astrophysics Data System (ADS)

    Gilbert, Andrew J.; Miller, Brian W.; Robinson, Sean M.; White, Timothy A.; Pitts, William Karl; Jarman, Kenneth D.; Seifert, Allen

    2017-07-01

    Imaging technology is generally considered too invasive for arms control inspections due to the concern that it cannot properly secure sensitive features of the inspected item. However, this same sensitive information, which could include direct information on the form and function of the items under inspection, could be used for robust arms control inspections. The single-pixel X-ray imager (SPXI) is introduced as a method to make such inspections, capturing the salient spatial information of an object in a secure manner while never forming an actual image. The method is built on the theory of compressive sensing and the single pixel optical camera. The performance of the system is quantified using simulated inspections of simple objects. Measures of the robustness and security of the method are introduced and used to determine how robust and secure such an inspection would be. In particular, it is found that an inspection with low noise (<1%) and high undersampling (>256×) exhibits high robustness and security.
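
    A minimal sketch of the single-pixel measurement model described above: each mask yields one detector reading y_i = phi_i · x, and the scene is recovered from the masks and readings without an image ever forming on a sensor. For simplicity this sketch takes enough random masks to solve by plain least squares; the actual SPXI uses compressive (undersampled) acquisition with sparsity-based recovery, and all variable names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Scene": a tiny 4x4 object, flattened to a vector of 16 pixel values.
x = np.zeros(16)
x[[5, 6, 9, 10]] = 1.0   # a 2x2 bright block in the centre

# Each measurement applies one random binary mask and records ONE detector
# value. (A real SPXI would use far fewer masks than pixels plus an
# l1-minimising solver; here we take a full set for simplicity.)
Phi = rng.integers(0, 2, size=(32, 16)).astype(float)
y = Phi @ x              # 32 single-pixel readings, never an actual image

# Recover the scene from the masks and readings alone.
x_hat, *_ = np.linalg.lstsq(Phi, y, rcond=None)
```

    The security argument in the paper rests on the fact that only `y` (and the mask pattern) need be handled during an inspection, never a formed image of the sensitive object.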

  17. Optimization of the Controlled Evaluation of Closed Relational Queries

    NASA Astrophysics Data System (ADS)

    Biskup, Joachim; Lochner, Jan-Hendrik; Sonntag, Sebastian

    For relational databases, controlled query evaluation is an effective inference control mechanism that preserves confidentiality with respect to a previously declared confidentiality policy. Implementations of controlled query evaluation usually lack efficiency due to costly theorem-prover calls. Suitably constrained controlled query evaluation can be implemented efficiently, but is not flexible enough from the perspective of database users and security administrators. In this paper, we propose an optimized framework for controlled query evaluation in relational databases that is efficiently implementable while relaxing the constraints of previous approaches.
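
    A deliberately reduced sketch of refusal-based controlled query evaluation (names invented for illustration; real systems reason, via theorem proving, over everything the user already knows rather than matching ground facts):

```python
def controlled_eval(query, db_facts, policy_secrets):
    """Refusal-based controlled query evaluation, reduced to ground facts.
    Illustrative only: a real implementation must track the user's
    accumulated knowledge and use logical inference."""
    # Refuse whenever the query matches a declared secret, REGARDLESS of the
    # true answer: refusing only when the answer is "yes" would itself leak
    # the secret.
    if query in policy_secrets:
        return "refused"
    return query in db_facts   # otherwise answer truthfully

facts = {("alice", "diagnosis", "X")}
policy = {("alice", "diagnosis", "X")}
a1 = controlled_eval(("alice", "diagnosis", "X"), facts, policy)
a2 = controlled_eval(("bob", "diagnosis", "X"), facts, policy)
```

    The unconditional refusal on policy-matching queries is the classic trick for keeping the refusal channel itself from being informative.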

  18. eCOMPAGT – efficient Combination and Management of Phenotypes and Genotypes for Genetic Epidemiology

    PubMed Central

    Schönherr, Sebastian; Weißensteiner, Hansi; Coassin, Stefan; Specht, Günther; Kronenberg, Florian; Brandstätter, Anita

    2009-01-01

    Background: High-throughput genotyping and phenotyping projects in large epidemiological study populations require sophisticated laboratory information management systems. Most epidemiological studies include subject-related personal information, which needs to be handled with care, following data-privacy protection guidelines. In addition, genotyping core facilities handling cooperative projects require a straightforward solution for monitoring the status and financial resources of the different projects. Description: We developed a database system for the efficient combination and management of phenotypes and genotypes (eCOMPAGT) derived from genetic epidemiological studies. eCOMPAGT securely stores and manages genotype and phenotype data and enables different user modes with different rights. Special attention was paid to the import of data from TaqMan and SNPlex genotyping assays; however, the database solution can be adapted to other genotyping systems by programming additional interfaces. Further important features are the scalability of the database and an export interface to statistical software. Conclusion: eCOMPAGT can store, administer, and connect phenotype data with all kinds of genotype data and is available as a downloadable version at . PMID:19432954

  19. Information System through ANIS at CeSAM

    NASA Astrophysics Data System (ADS)

    Moreau, C.; Agneray, F.; Gimenez, S.

    2015-09-01

    ANIS (AstroNomical Information System) is a generic web tool developed at CeSAM to facilitate and standardize the implementation of astronomical data of various kinds through private and/or public dedicated information systems. The architecture of ANIS comprises a database server that contains the project data; a web user-interface template that provides high-level services (search, extraction, and display of imaging and spectroscopic data via a combination of criteria, an object list, an SQL query module, or a cone-search interface); a framework composed of several packages; and a metadata database managed by a web administration entity. The process of implementing a new ANIS instance at CeSAM is easy and fast: the scientific project submits data or secure access to data, the CeSAM team installs the new instance (web interface template and metadata database), and the project administrator configures the instance with the ANIS web administration entity. Currently, CeSAM offers through ANIS web access to VO-compliant information systems for different projects (HeDaM, HST-COSMOS, CFHTLS-ZPhots, ExoDAT, ...).

  20. Building a recruitment database for asthma trials: a conceptual framework for the creation of the UK Database of Asthma Research Volunteers.

    PubMed

    Nwaru, Bright I; Soyiri, Ireneous N; Simpson, Colin R; Griffiths, Chris; Sheikh, Aziz

    2016-05-26

    Randomised clinical trials are the 'gold standard' for evaluating the effectiveness of healthcare interventions. However, successful recruitment of participants remains a key challenge for many trialists. In this paper, we present a conceptual framework for creating a digital, population-based database for the recruitment of asthma patients into future asthma trials in the UK. Having set up the database, the goal is to then make it available to support investigators planning asthma clinical trials. The UK Database of Asthma Research Volunteers will comprise a web-based front-end that interactively allows participant registration, and a back-end that houses the database containing participants' key relevant data. The database will be hosted and maintained at a secure server at the Asthma UK Centre for Applied Research based at The University of Edinburgh. Using a range of invitation strategies, key demographic and clinical data will be collected from those pre-consenting to consider participation in clinical trials. These data will, with consent, in due course, be linkable to other healthcare, social, economic, and genetic datasets. To use the database, asthma investigators will send their eligibility criteria for participant recruitment; eligible participants will then be informed about the new trial and asked if they wish to participate. A steering committee will oversee the running of the database, including approval of usage access. Novel communication strategies will be utilised to engage participants who are recruited into the database in order to avoid attrition as a result of waiting time to participation in a suitable trial, and to minimise the risk of their being approached when already enrolled in a trial. The value of this database will be whether it proves useful and usable to researchers in facilitating recruitment into clinical trials on asthma and whether patient privacy and data security are protected in meeting this aim. 
Successful recruitment is fundamental to the success of a clinical trial. The UK Database of Asthma Research Volunteers, the first of its kind in the context of asthma, presents a novel approach to overcoming recruitment barriers and will facilitate the catalysing of important clinical trials on asthma in the UK.

  1. 47 CFR Appendix B to Part 64 - Priority Access Service (PAS) for National Security and Emergency Preparedness (NSEP)

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... providers as necessary to maintain the viability of the PAS system. 5. Maintain a database for PAS related... NSEP PAS database only to those having a need-to-know or who will not use the information for economic... selected for this priority should be responsible for ensuring the viability or reconstruction of the basic...

  2. 47 CFR Appendix B to Part 64 - Priority Access Service (PAS) for National Security and Emergency Preparedness (NSEP)

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... providers as necessary to maintain the viability of the PAS system. 5. Maintain a database for PAS related... NSEP PAS database only to those having a need-to-know or who will not use the information for economic... selected for this priority should be responsible for ensuring the viability or reconstruction of the basic...

  3. 47 CFR Appendix B to Part 64 - Priority Access Service (PAS) for National Security and Emergency Preparedness (NSEP)

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... providers as necessary to maintain the viability of the PAS system. 5. Maintain a database for PAS related... NSEP PAS database only to those having a need-to-know or who will not use the information for economic... selected for this priority should be responsible for ensuring the viability or reconstruction of the basic...

  4. 47 CFR Appendix B to Part 64 - Priority Access Service (PAS) for National Security and Emergency Preparedness (NSEP)

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... providers as necessary to maintain the viability of the PAS system. 5. Maintain a database for PAS related... NSEP PAS database only to those having a need-to-know or who will not use the information for economic... selected for this priority should be responsible for ensuring the viability or reconstruction of the basic...

  5. U.S. Security-Related Agreements in Force Since 1955: Introducing a New Database

    DTIC Science & Technology

    2014-01-01

    Project AIR FORCE (PAF), a division of the RAND Corporation, is the U.S. Air Force's federally funded research and development center for studies and analyses. The report's contents include additional applications of the treaty and agreement database.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This report contains papers on the following topics: NREN Security Issues: Policies and Technologies; Layer Wars: Protect the Internet with Network Layer Security; Electronic Commission Management; Workflow 2000 - Electronic Document Authorization in Practice; Security Issues of a UNIX PEM Implementation; Implementing Privacy Enhanced Mail on VMS; Distributed Public Key Certificate Management; Protecting the Integrity of Privacy-enhanced Electronic Mail; Practical Authorization in Large Heterogeneous Distributed Systems; Security Issues in the Truffles File System; Issues surrounding the use of Cryptographic Algorithms and Smart Card Applications; Smart Card Augmentation of Kerberos; and An Overview of the Advanced Smart Card Access Control System. Selected papers were processed separately for inclusion in the Energy Science and Technology Database.

  7. Critical Needs for Robust and Reliable Database for Design and Manufacturing of Ceramic Matrix Composites

    NASA Technical Reports Server (NTRS)

    Singh, M.

    1999-01-01

    Ceramic matrix composite (CMC) components are being designed, fabricated, and tested for a number of high-temperature, high-performance applications in aerospace and ground-based systems. The critical need for and the role of reliable and robust databases for the design and manufacturing of ceramic matrix composites are presented. A number of issues related to engineering design, manufacturing technologies, joining, and attachment technologies are also discussed. Examples of various ongoing activities in the areas of composite databases, designing to codes and standards, and design for manufacturing are given.

  8. Marketing Strategy and Implementation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    This report documents the marketing campaign that has been designed for middle and high school students in New Mexico to increase interest in participation in national security careers at the National Nuclear Security Administration. This marketing campaign builds on the research that was previously conducted, as well as the focus groups that were conducted. This work is a part of the National Security Preparedness Project (NSPP) being performed under a Department of Energy (DOE) / National Nuclear Security Administration (NNSA) grant. Outcome analysis was performed to determine appropriate marketing strategies. The analysis was based upon focus groups with middle school and high school students, student interactions, and surveys completed by students to understand and gauge student interest in Science, Technology, Engineering, and Math (STEM) subjects, interest in careers at NNSA, future job considerations, and student desire to pursue post-secondary education. Further, through the focus groups, students were asked to attend a presentation on NNSA job opportunities and employee requirements. The feedback received from the students was utilized to develop the focus and components of the marketing campaign.

  9. Performance of an optical identification and interrogation system

    NASA Astrophysics Data System (ADS)

    Venugopalan, A.; Ghosh, A. K.; Verma, P.; Cheng, S.

    2008-04-01

    A free-space-optics-based identification and interrogation system has been designed. The applications of the proposed system lie primarily in areas that require a secure means of mutual identification and information exchange between optical readers and tags. Conventional RFIDs raise issues regarding security threats, electromagnetic interference, and health safety. The security of RFID chips is low due to the wide spatial spread of radio waves: malicious nodes can read data being transmitted on the network if they are within receiving range. The proposed system provides an alternative that exploits the narrow paraxial beams of lasers together with an RSA-based authentication scheme, enhancing the security of communication between a tag and the base station or reader. The optical reader can also perform remote identification, and the tag can be read from a long distance, given line of sight. The free-space optical identification and interrogation system can be used for inventory management, airport security systems, port security, and communication with high-security systems, among other applications. The proposed system was implemented with low-cost, off-the-shelf components, and its performance in terms of throughput and bit error rate has been measured and analyzed. The range of operation with a bit error rate below 10^-9 was measured to be about 4.5 m. The security of the system rests on the strength of the RSA encryption scheme, implemented with more than 1024 bits.
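
    The RSA-based authentication between reader and tag can be sketched as a textbook challenge-response. The parameters below are toy-sized for readability (the abstract states the real system uses more than 1024 bits), and the function names are invented for this sketch.

```python
import secrets

# Toy RSA parameters -- illustrative only; never use key sizes like this.
p, q = 61, 53
n = p * q                # modulus: 3233
phi = (p - 1) * (q - 1)  # 3120
e = 17                   # public exponent
d = pow(e, -1, phi)      # private exponent (modular inverse of e mod phi)

def tag_respond(challenge):
    """Tag proves possession of the private key by 'signing' the challenge."""
    return pow(challenge, d, n)

def reader_verify(challenge, response):
    """Reader checks the response against the tag's public key (e, n)."""
    return pow(response, e, n) == challenge

c = secrets.randbelow(n - 2) + 2   # random challenge in [2, n-1]
ok = reader_verify(c, tag_respond(c))
```

    A fresh random challenge per interrogation prevents replay of an earlier response; the narrow laser beam limits who can even observe the exchange.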

  10. Countermeasure Evaluation and Validation Project (CEVP) Database Requirement Documentation

    NASA Technical Reports Server (NTRS)

    Shin, Sung Y.

    2003-01-01

    The initial focus of the project by the JSC laboratories will be to develop, test, and implement a standardized complement of integrated physiological tests (the Integrated Testing Regimen, ITR) that will examine both system and intersystem function and will be used to validate and certify candidate countermeasures. The ITR will consist of medical requirements (MRs), non-MR core ITR tests, and countermeasure-specific testing. Non-MR and countermeasure-specific test data will be archived in a database specific to the CEVP. Development of a CEVP database will be critical to documenting the progress of candidate countermeasures. The goal of this work is a fully functional software system that integrates computer-based data collection and storage with secure, efficient, and practical distribution of that data over the Internet. This system will provide the foundation for a new level of interagency and international cooperation in scientific experimentation and research, supporting intramural, international, and extramural collaboration through management and distribution of the CEVP data. The research performed this summer covers the first phase of the project, a requirements analysis. This analysis identifies the expected behavior of the system under normal and abnormal conditions, the conditions that could affect the system's ability to produce this behavior, and the internal features needed to reduce the risk of unexpected or unwanted behaviors. The second phase of the project, the design of data-entry and data-retrieval screens for a working model of the Ground Data database, was also performed this summer. The final report presents the requirements for the CEVP system in a variety of ways, so that both the development team and JSC technical management have a thorough understanding of how the system is expected to behave.

  11. Cooperative high-performance storage in the accelerated strategic computing initiative

    NASA Technical Reports Server (NTRS)

    Gary, Mark; Howard, Barry; Louis, Steve; Minuzzo, Kim; Seager, Mark

    1996-01-01

    The use and acceptance of new high-performance, parallel computing platforms will be impeded by the absence of an infrastructure capable of supporting orders-of-magnitude improvement in hierarchical storage and high-speed I/O (Input/Output). The distribution of these high-performance platforms and supporting infrastructures across a wide-area network further compounds this problem. We describe an architectural design and phased implementation plan for a distributed, Cooperative Storage Environment (CSE) to achieve the necessary performance, user transparency, site autonomy, communication, and security features needed to support the Accelerated Strategic Computing Initiative (ASCI). ASCI is a Department of Energy (DOE) program attempting to apply terascale platforms and Problem-Solving Environments (PSEs) toward real-world computational modeling and simulation problems. The ASCI mission must be carried out through a unified, multilaboratory effort, and will require highly secure, efficient access to vast amounts of data. The CSE provides a logically simple, geographically distributed storage infrastructure of semi-autonomous cooperating sites to meet the strategic ASCI PSE goal of high-performance data storage and access at the user desktop.

  12. Attachment-security prime effect on skin-conductance synchronization in psychotherapists: An empirical study.

    PubMed

    Palmieri, Arianna; Kleinbub, Johann R; Calvo, Vincenzo; Benelli, Enrico; Messina, Irene; Sambin, Marco; Voci, Alberto

    2018-03-01

    Physiological synchronization (PS) is a phenomenon of simultaneous activity between two persons' physiological signals. It has been associated with empathy, shared affectivity, and efficacious therapeutic relationships. The aim of the present study was to explore the possible connections between PS and the attachment system, seeking preliminary evidence of this link by means of an experimental manipulation of the sense of attachment security in psychotherapists according to a protocol by Mikulincer and Shaver (2001), which has been proven to elicit empathetic behavior. We compared the synchronization of skin-conductance signals in brief psychological interviews between 18 psychodynamic therapists and 18 healthy volunteers. A sense of attachment-security priming was administered to half of the therapists, whereas the other half received a positive-affect control prime. Lag analysis was performed to investigate the "leading" or "following" attitudes of the participants in the two conditions. Mixed-model regressions and evidence-ratio model comparisons were used to investigate the effects of the manipulation on PS. Therapist attachment anxiety and avoidance traits were considered covariates. The attachment-security prime showed a significant effect on PS lag dynamics, but not on overall PS amount. Lag analysis showed that the therapists in the attachment-security condition were significantly more prone to assume a leading attitude in the physiological coupling than the therapists in the control condition. Therapist attachment anxiety and avoidance had no apparent effect. Our result paves the way for further exploration of the clinical relationship from a physiological standpoint. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  13. Clinical Databases for Chest Physicians.

    PubMed

    Courtwright, Andrew M; Gabriel, Peter E

    2018-04-01

    A clinical database is a repository of patient medical and sociodemographic information focused on one or more specific health conditions or exposures. Although clinical databases may be used for research purposes, their primary goal is to collect and track patient data for quality improvement, quality assurance, and/or actual clinical management. This article aims to provide an introduction and practical advice on the development of small-scale clinical databases for chest physicians and practice groups. Through example projects, we discuss the pros and cons of available technical platforms, including Microsoft Excel and Access, relational database management systems such as Oracle and PostgreSQL, and Research Electronic Data Capture. We consider approaches to deciding the base unit of data collection, creating consensus around variable definitions, and structuring routine clinical care to complement database aims. We conclude with an overview of regulatory and security considerations for clinical databases. Copyright © 2018 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.
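
    A minimal example of the kind of small-scale clinical database the article describes, using SQLite (one of several possible platforms; the schema, table, and column names here are illustrative, not from the article):

```python
import sqlite3

# Minimal single-condition clinical registry sketch: one patient table,
# one encounter table, with the encounter as the base unit of collection.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE patient (
        patient_id  INTEGER PRIMARY KEY,
        mrn         TEXT UNIQUE NOT NULL,   -- medical record number
        birth_year  INTEGER
    )""")
conn.execute("""
    CREATE TABLE encounter (
        encounter_id INTEGER PRIMARY KEY,
        patient_id   INTEGER NOT NULL REFERENCES patient(patient_id),
        visit_date   TEXT NOT NULL,         -- ISO 8601 date string
        fev1_pct     REAL                   -- one condition-specific measure
    )""")
conn.execute("INSERT INTO patient (mrn, birth_year) VALUES ('A-001', 1975)")
conn.execute("""INSERT INTO encounter (patient_id, visit_date, fev1_pct)
                VALUES (1, '2018-01-15', 72.5)""")
row = conn.execute("""
    SELECT p.mrn, e.visit_date, e.fev1_pct
    FROM encounter e JOIN patient p USING (patient_id)""").fetchone()
```

    Agreeing on the base unit (here, the encounter) and on variable definitions up front, as the article recommends, is what keeps such a registry usable as it grows.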

  14. Corporate Crime Database Act

    THOMAS, 113th Congress

    Rep. Conyers, John, Jr. [D-MI-13

    2014-04-10

    House - 06/09/2014: Referred to the Subcommittee on Crime, Terrorism, Homeland Security, and Investigations. Status: Introduced.

  15. 36 CFR 1260.34 - What are the responsibilities of the NDC?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... ADMINISTRATION DECLASSIFICATION DECLASSIFICATION OF NATIONAL SECURITY INFORMATION The National Declassification... databases; and (f) Storage, and related services, on a reimbursable basis, for Federal records containing...

  16. 49 CFR 224.109 - Inspection, repair, and replacement.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION REFLECTORIZATION OF RAIL FREIGHT ROLLING STOCK Application... of the defect is maintained in the locomotive cab or in a secure and accessible electronic database...

  17. 36 CFR 1260.34 - What are the responsibilities of the NDC?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... ADMINISTRATION DECLASSIFICATION DECLASSIFICATION OF NATIONAL SECURITY INFORMATION The National Declassification... databases; and (f) Storage, and related services, on a reimbursable basis, for Federal records containing...

  18. 49 CFR 224.109 - Inspection, repair, and replacement.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION REFLECTORIZATION OF RAIL FREIGHT ROLLING STOCK Application... of the defect is maintained in the locomotive cab or in a secure and accessible electronic database...

  19. 49 CFR 224.109 - Inspection, repair, and replacement.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION REFLECTORIZATION OF RAIL FREIGHT ROLLING STOCK Application... of the defect is maintained in the locomotive cab or in a secure and accessible electronic database...

  20. Noise of High-Performance Aircraft at Afterburner

    DTIC Science & Technology

    2015-03-30

    Quarterly progress report, dates covered 12-15-2014 to 04-03-2015. Title: Noise of High-Performance Aircraft at Afterburner. The report concerns noise generation of a high-performance aircraft operating at afterburner condition; the new noise components are indirect combustion noise produced by the ... spectrum is reported. Subject terms: jet noise at afterburner.

  1. Countries at Risk: Heightened Human Security Risk to States With Transboundary Water Resources and Instability

    NASA Astrophysics Data System (ADS)

    Veilleux, J. C.; Sullivan, G. S.; Paola, C.; Starget, A.; Watson, J. E.; Hwang, Y. J.; Picucci, J. A.; Choi, C. S.

    2014-12-01

    The Countries at Risk project is a global assessment of countries with transboundary water resources that are at risk for conflict because of high human security instability. Building upon Basins at Risk (BAR) research, our team used georeferenced social and environmental data from the updated Transboundary Freshwater Dispute Database, quantitative data from global indices, and qualitative data from news media sources. Our assessment analyzed 15 global indices related to water or human security to identify which countries scored as highest risk in each index. From this information, we were able to assess the highest-risk countries' human security risk by using a new human security measurement tool, as well as comparing this analysis to the World Bank's Fragile States Index and the experimental Human Security Index. In addition, we identified which countries have the highest number of shared basins, the highest percentage of territory covered by a transboundary basin, and the greatest dependency on withdrawals from transboundary waters originating outside their country boundaries. By synthesizing these social and environmental data assessments, we identified five countries to analyze as case studies: Afghanistan, China, Iraq, Moldova, and Sudan. We created a series of 30 maps to spatially analyze the relationship between the transboundary basins and social and environmental parameters, including population, institutional capacity, and physical geography by country. Finally, we synthesized our spatial analysis, Human Security Key scores, and current events scored using the BAR scale to determine which aspects and which basins are most at risk within each country in our case studies, and what this implies for future global water resources.

  2. Securing BGP Using External Security Monitors

    DTIC Science & Technology

    2006-01-01

    forms. In Proc. SOSP, Brighton, UK, Oct. 2005. [19] A. Seshadri, A. Perrig, L. van Doorn, and P. Khosla. SWATT: Software-based Attestation for...Williams, E. G. Sirer, and F. B. Schneider. Nexus: A New Operating System for Trustworthy Computing (extended abstract). In Proc. SOSP, Brighton, UK...as a distributed database of untrustworthy hosts or messages. An ESM that detects invalid behavior issues a certificate describing the behavior or

  3. Negative emotionality moderates associations among attachment, toddler sleep, and later problem behaviors.

    PubMed

    Troxel, Wendy M; Trentacosta, Christopher J; Forbes, Erika E; Campbell, Susan B

    2013-02-01

    Secure parent-child relationships are implicated in children's self-regulation, including the ability to self-soothe at bedtime. Sleep, in turn, may serve as a pathway linking attachment security with subsequent emotional and behavioral problems in children. We used path analysis to examine the direct relationship between attachment security and maternal reports of sleep problems during toddlerhood and the degree to which sleep serves as a pathway linking attachment with subsequent teacher-reported emotional and behavioral problems. We also examined infant negative emotionality as a vulnerability factor that may potentiate attachment-sleep-adjustment linkages. Data were drawn from 776 mother-infant dyads participating in the National Institute of Child Health and Human Development Study of Early Child Care. After statistically adjusting for mother and child characteristics, including child sleep and emotional and behavioral problems at 24 months, we found no evidence for a statistically significant direct path between attachment security and sleep problems at 36 months; however, there was a direct relationship between sleep problems at 36 months and internalizing problems at 54 months. Path models that examined the moderating influence of infant negative emotionality demonstrated significant direct relationships between attachment security and toddler sleep problems and between sleep problems and subsequent emotional and behavioral problems, but only among children characterized by high negative emotionality at 6 months. In addition, among this subset, there was a significant indirect path between attachment and internalizing problems through sleep problems. These longitudinal findings implicate sleep as one critical pathway linking attachment security with adjustment difficulties, particularly among temperamentally vulnerable children. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  4. Attitudes towards information system security among physicians in Croatia.

    PubMed

    Markota, M; Kern, J; Svab, I

    2001-07-01

    To examine attitudes about information system security among Croatian physicians, a cross-sectional study was performed on a representative sample of 800 Croatian physicians. An anonymous questionnaire comprising 21 questions was distributed, and statistical analysis was performed using a chi-square test. A 76.2% response rate was obtained. The majority of respondents (85.8%) believe that information system security is a new area in their work. In general, physicians are not informed about European directives, conventions, recommendations, etc. Only a small number of physicians use personal computers at work (29%). Those physicians who have a personal computer use it mainly for administrative reasons. Most healthcare institutions (89%) do not have a security manual, and the area of information system security is left to individual interest and initiative. Only 25% of physicians who have a personal computer use any type of password. A high percentage of physicians (22%) has never thought about the problem of personal data being used by organizations (e.g. police, banks) without a legal basis; a small, but still significant, percentage of physicians (5.6%) has even agreed with such use. Results indicate that for the vast majority of physicians, information system security is a new area in their daily work, one which is left to individual interest and initiative. They are not familiar with the ethical, technical and legal frameworks defined for that area within the Council of Europe and the European Union. New aspects: this is the first study on information system security performed in Central and Eastern Europe on a representative nationwide sample of physicians.

  5. 36 CFR 1260.34 - What are the responsibilities of the NDC?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... RECORDS ADMINISTRATION DECLASSIFICATION DECLASSIFICATION OF NATIONAL SECURITY INFORMATION The National... databases; and (f) Storage, and related services, on a reimbursable basis, for Federal records containing...

  6. Applications of GIS and database technologies to manage a Karst Feature Database

    USGS Publications Warehouse

    Gao, Y.; Tipping, R.G.; Alexander, E.C.

    2006-01-01

    This paper describes the management of a Karst Feature Database (KFD) in Minnesota. Two sets of applications in both GIS and Database Management System (DBMS) have been developed for the KFD of Minnesota. These applications were used to manage and to enhance the usability of the KFD. Structured Query Language (SQL) was used to manipulate transactions of the database and to facilitate the functionality of the user interfaces. The Database Administrator (DBA) authorized users with different access permissions to enhance the security of the database. Database consistency and recovery are accomplished by creating data logs and maintaining backups on a regular basis. The working database provides guidelines and management tools for future studies of karst features in Minnesota. The methodology of designing this DBMS is applicable to develop GIS-based databases to analyze and manage geomorphic and hydrologic datasets at both regional and local scales. The short-term goal of this research is to develop a regional KFD for the Upper Mississippi Valley Karst and the long-term goal is to expand this database to manage and study karst features at national and global scales.
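The consistency-and-recovery measures the abstract mentions (transactional updates, data logs, and regular backups) can be sketched in a few lines with Python's built-in `sqlite3` module. This is a minimal illustration only: the `karst_features` table and its columns are hypothetical stand-ins, not the actual Minnesota KFD schema or DBMS.

```python
import sqlite3

# Hypothetical table standing in for a karst feature inventory.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE karst_features (id INTEGER PRIMARY KEY, kind TEXT)")

try:
    with src:  # transaction: commits on success, rolls back on any exception
        src.execute("INSERT INTO karst_features (kind) VALUES ('sinkhole')")
        src.execute("INSERT INTO karst_features (kind) VALUES ('spring')")
except sqlite3.Error:
    pass  # a failed transaction leaves the database unchanged

# Recovery point: copy the live database into a backup connection.
dst = sqlite3.connect(":memory:")
src.backup(dst)
rows = dst.execute("SELECT COUNT(*) FROM karst_features").fetchone()[0]
print(rows)  # 2
```

The same pattern scales to a server-grade DBMS, where the transaction log and scheduled backups play the roles shown here explicitly.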

  7. The old age health security in rural China: where to go?

    PubMed

    Dai, Baozhen

    2015-11-04

    The huge number of rural elders and deepening health problems (e.g. growing threats of infectious diseases and chronic diseases) place enormous pressure on old age health security in rural China. This study aims to provide information for policy-makers to develop effective measures for promoting rural elders' health care service access by examining the current developments and challenges confronting old age health security in rural China. Search resources were electronic databases, web pages of the National Bureau of Statistics of China and the National Health and Family Planning Commission of China, the China Population and Employment Statistics Yearbook, the China Civil Affairs' Statistical Yearbook, and the China Health Statistics Yearbooks. Articles were identified from Elsevier, Wiley, EBSCO, EMBASE, PubMed, SCI Expanded, ProQuest, and the National Knowledge Infrastructure of China (CNKI), which is the most informative database in Chinese. Search terms were "rural", "China", "health security", "cooperative medical scheme", "social medical assistance", "medical insurance" or "community based medical insurance", "old", "elder", "elderly", "aged", or "aging". Google Scholar was searched with the same combination of keywords. The results showed that old age health security in rural China had expanded to all rural elders and substantially improved health care service utilization among rural elders. Increasing chronic disease prevalence rates, pressing public health issues, an inefficient rural health care service provision system, and a lack of sufficient financing challenge old age health security in rural China. Increasing funds from the central and regional governments for old age health security in rural China will contribute to reducing urban-rural disparities in the provision of old age health security and increasing health equity among rural elders between regions. Meanwhile, initiating provider payment reform may contribute to improving the efficiency of the rural health care service provision system and promoting health care service access among rural elders.

  8. Algorithms and methodology used in constructing high-resolution terrain databases

    NASA Astrophysics Data System (ADS)

    Williams, Bryan L.; Wilkosz, Aaron

    1998-07-01

    This paper presents a top-level description of methods used to generate high-resolution 3D IR digital terrain databases using soft photogrammetry. The 3D IR database is derived from aerial photography and is made up of a digital ground plane elevation map, a vegetation height elevation map, a material classification map, object data (tanks, buildings, etc.), and a temperature radiance map. Steps required to generate some of these elements are outlined. The use of metric photogrammetry is discussed in the context of elevation map development, and methods employed to generate the material classification maps are given. The developed databases are used by the US Army Aviation and Missile Command to evaluate the performance of various missile systems. A discussion is also presented on database certification, which consists of validation, verification, and accreditation procedures followed to certify that the developed databases give a true representation of the area of interest and are fully compatible with the targeted digital simulators.

  9. A secure cluster-based multipath routing protocol for WMSNs.

    PubMed

    Almalkawi, Islam T; Zapata, Manel Guerrero; Al-Karaki, Jamal N

    2011-01-01

    The new characteristics of Wireless Multimedia Sensor Networks (WMSNs) and the design issues brought by handling different traffic classes of multimedia content (video streams, audio, and still images) as well as scalar data over the network make routing protocols proposed for typical WSNs not directly applicable to WMSNs. Handling real-time multimedia data requires both energy efficiency and QoS assurance to ensure efficient use of sensor resources and correct delivery of collected information. In this paper, we propose a Secure Cluster-based Multipath Routing protocol for WMSNs, SCMR, to satisfy the requirements of delivering different data types and supporting high data rate multimedia traffic. SCMR exploits the hierarchical structure of powerful cluster heads and optimized multiple paths to support timely and reliable high data rate multimedia communication with minimum energy dissipation. Also, we present a lightweight distributed key management mechanism to secure the communication between sensor nodes and protect the network against different types of attacks. Performance evaluation from simulation results demonstrates a significant performance improvement compared with existing protocols (which do not even provide any kind of security feature) in terms of average end-to-end delay, network throughput, packet delivery ratio, and energy consumption.

  10. A Secure Cluster-Based Multipath Routing Protocol for WMSNs

    PubMed Central

    Almalkawi, Islam T.; Zapata, Manel Guerrero; Al-Karaki, Jamal N.

    2011-01-01

    The new characteristics of Wireless Multimedia Sensor Networks (WMSNs) and the design issues brought by handling different traffic classes of multimedia content (video streams, audio, and still images) as well as scalar data over the network make routing protocols proposed for typical WSNs not directly applicable to WMSNs. Handling real-time multimedia data requires both energy efficiency and QoS assurance to ensure efficient use of sensor resources and correct delivery of collected information. In this paper, we propose a Secure Cluster-based Multipath Routing protocol for WMSNs, SCMR, to satisfy the requirements of delivering different data types and supporting high data rate multimedia traffic. SCMR exploits the hierarchical structure of powerful cluster heads and optimized multiple paths to support timely and reliable high data rate multimedia communication with minimum energy dissipation. Also, we present a lightweight distributed key management mechanism to secure the communication between sensor nodes and protect the network against different types of attacks. Performance evaluation from simulation results demonstrates a significant performance improvement compared with existing protocols (which do not even provide any kind of security feature) in terms of average end-to-end delay, network throughput, packet delivery ratio, and energy consumption. PMID:22163854

  11. Strengthening National, Homeland, and Economic Security. Networking and Information Technology Research and Development Supplement to the President’s FY 2003 Budget

    DTIC Science & Technology

    2002-07-01

    Knowledge From Data... HIGH-CONFIDENCE SOFTWARE AND SYSTEMS: Reliability, Security, and Safety for...NOAA's Cessna Citation flew over the 16-acre World Trade Center site, scanning with an Optech ALSM unit. The system recorded data points from 33,000...provide the data storage and compute power for intelligence analysis, high-performance national defense systems, and critical scientific research • Large

  12. Key Future Engineering Capabilities for Human Capital Retention

    NASA Astrophysics Data System (ADS)

    Sivich, Lorrie

    Record retirements of Baby Boomer generation engineers are predicted to result in significant losses of mission-critical knowledge in space, national security, and future scientific ventures vital to high-technology corporations. No comprehensive review or analysis of engineering capabilities has been performed to identify threats related to the specific loss of mission-critical knowledge posed by the increasing retirement of tenured engineers. Archival data from a single diversified Fortune 500 aerospace manufacturing engineering company's engineering career database were analyzed to ascertain whether relationships linking future engineering capabilities, engineering disciplines, and years of engineering experience could be identified to define critical knowledge transfer models. Chi-square, logistic regression, and linear regression analyses were used to map patterns of discipline-specific, mission-critical knowledge using archival data of engineers' perceptions of engineering capabilities, key developmental experiences, and knowledge learned from their engineering careers. The results from the study were used to document key future engineering capabilities. The results were then used to develop a proposed human capital retention plan to address specific key knowledge gaps of younger engineers as veteran engineers retire. The potential for social change from this study involves informing leaders of aerospace engineering corporations on how to build better quality mentoring or succession plans to fill the void of lost knowledge from retiring engineers. This plan can secure mission-critical knowledge for younger engineers for current and future product development and increased global competitiveness in the technology market.

  13. Grid-enabled mammographic auditing and training system

    NASA Astrophysics Data System (ADS)

    Yap, M. H.; Gale, A. G.

    2008-03-01

    Effective use of new technologies to support healthcare initiatives is important, and current research is moving towards implementing secure grid-enabled healthcare provision. In the UK, a large-scale collaborative research project (GIMI: Generic Infrastructures for Medical Informatics), which is concerned with the development of a secure IT infrastructure to support very widespread medical research across the country, is underway. In the UK, there are some 109 breast screening centers and a growing number of individuals (circa 650) nationally performing approximately 1.5 million screening examinations per year. At the same time, there is a serious, and ongoing, national workforce issue in screening which has seen a loss of consultant mammographers and a growth in specially trained technologists and other non-radiologists. Thus there is a need to offer effective and efficient mammographic training so as to maintain high levels of screening skills. Consequently, a grid-based system has been proposed which has the benefit of offering very large volumes of training cases that mammographers can access anytime and anywhere. A database of screening cases, spread geographically across three university systems, is used as a test set of known cases. The GIMI mammography training system first audits these cases to ensure that they are appropriately described and annotated. Subsequently, the cases are utilized for training in the grid-based system which has been developed. This paper briefly reviews the background to the project and then details the ongoing research. In conclusion, we discuss the contributions, limitations, and future plans of such a grid-based approach.

  14. Multiple-Feature Extracting Modules Based Leak Mining System Design

    PubMed Central

    Cho, Ying-Chiang; Pan, Jen-Yi

    2013-01-01

    Over the years, human dependence on the Internet has increased dramatically. A large amount of information is placed on the Internet and retrieved from it daily, which makes web security in terms of online information a major concern. In recent years, the most problematic issues in web security have been e-mail address leakage and SQL injection attacks. There are many possible causes of information leakage, such as inadequate precautions during the programming process, which lead to the leakage of e-mail addresses entered online or insufficient protection of database information, a loophole that enables malicious users to steal online content. In this paper, we implement a crawler mining system that is equipped with SQL injection vulnerability detection, by means of an algorithm developed for the web crawler. In addition, we analyze portal sites of the governments of various countries or regions in order to investigate the information leaking status of each site. Subsequently, we analyze the database structure and content of each site, using the data collected. Thus, we make use of practical verification in order to focus on information security and privacy through black-box testing. PMID:24453892

  15. Multiple-feature extracting modules based leak mining system design.

    PubMed

    Cho, Ying-Chiang; Pan, Jen-Yi

    2013-01-01

    Over the years, human dependence on the Internet has increased dramatically. A large amount of information is placed on the Internet and retrieved from it daily, which makes web security in terms of online information a major concern. In recent years, the most problematic issues in web security have been e-mail address leakage and SQL injection attacks. There are many possible causes of information leakage, such as inadequate precautions during the programming process, which lead to the leakage of e-mail addresses entered online or insufficient protection of database information, a loophole that enables malicious users to steal online content. In this paper, we implement a crawler mining system that is equipped with SQL injection vulnerability detection, by means of an algorithm developed for the web crawler. In addition, we analyze portal sites of the governments of various countries or regions in order to investigate the information leaking status of each site. Subsequently, we analyze the database structure and content of each site, using the data collected. Thus, we make use of practical verification in order to focus on information security and privacy through black-box testing.
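The error-based side of such black-box SQL injection probing can be sketched in a few lines of Python. The vulnerable lookup function and the probe below are illustrative assumptions, not the paper's crawler implementation: the idea is simply that if a stray quote in the input produces a malformed-SQL error, user input is reaching the query engine unescaped.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

def vulnerable_lookup(name):
    # Unsafe string interpolation -- exactly the flaw a crawler probes for.
    return conn.execute(f"SELECT * FROM users WHERE name = '{name}'").fetchall()

def probe(lookup):
    """Error-based check: a lone quote breaking the query suggests injectability."""
    try:
        lookup("alice'")
    except sqlite3.OperationalError:
        return True   # malformed SQL reached the database engine
    return False

print(probe(vulnerable_lookup))  # True
```

A real scanner would drive this over HTTP against crawled form fields and also use boolean- and time-based payloads, but the decision logic is the same.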

  16. Hash function based on chaotic map lattices.

    PubMed

    Wang, Shihong; Hu, Gang

    2007-06-01

    A new hash function system, based on coupled chaotic map dynamics, is proposed. By combining floating-point computation of chaos with some simple algebraic operations, the system reaches very high bit confusion and diffusion rates, giving it the desired statistical properties and strong collision resistance. The chaos-based hash function offers the advantages of high security and fast performance, making it one of the most competitive candidates for practical hash function applications in software realization and secure information communications in computer networks.
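As a rough illustration of the idea (not the authors' exact construction), a toy ring of coupled logistic maps can absorb message bytes and quantize its final states into a digest. All parameters here, including the lattice size, coupling strength, and logistic parameter, are arbitrary choices for the sketch, and this toy has had no cryptanalysis.

```python
def chaotic_hash(msg: bytes, n: int = 8, extra_rounds: int = 32) -> str:
    """Toy hash from a ring of coupled logistic maps (illustrative only)."""
    r, eps = 3.99, 0.3                 # logistic parameter (chaotic regime), coupling
    f = lambda v: r * v * (1.0 - v)    # logistic map, keeps states in [0, 1]

    def step(x):
        # Diffuse: each lattice site couples to its two ring neighbours.
        return [(1 - eps) * f(x[i])
                + eps / 2 * (f(x[(i - 1) % n]) + f(x[(i + 1) % n]))
                for i in range(n)]

    x = [(i + 1) / (n + 1) for i in range(n)]   # fixed initial states
    for j, b in enumerate(msg):                 # absorb one byte per iteration
        x[j % n] = (x[j % n] + b / 255.0) / 2.0
        x = step(x)
    for _ in range(extra_rounds):               # final diffusion rounds
        x = step(x)
    # Quantize each site's state into 32 bits -> 256-bit hex digest.
    return "".join(f"{int(v * 2**32) & 0xFFFFFFFF:08x}" for v in x)

print(len(chaotic_hash(b"secure database")))  # 64 hex characters
```

Because the maps operate in the chaotic regime, a one-byte change in the message propagates through the coupling and yields an unrelated digest, which is the confusion/diffusion property the abstract describes.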

  17. Hash function based on chaotic map lattices

    NASA Astrophysics Data System (ADS)

    Wang, Shihong; Hu, Gang

    2007-06-01

    A new hash function system, based on coupled chaotic map dynamics, is proposed. By combining floating-point computation of chaos with some simple algebraic operations, the system reaches very high bit confusion and diffusion rates, giving it the desired statistical properties and strong collision resistance. The chaos-based hash function offers the advantages of high security and fast performance, making it one of the most competitive candidates for practical hash function applications in software realization and secure information communications in computer networks.

  18. Design and Analysis of a Model Reconfigurable Cyber-Exercise Laboratory (RCEL) for Information Assurance Education

    DTIC Science & Technology

    2004-03-01

    with MySQL. This choice was made because MySQL is open source. Any significant database engine such as Oracle or MS-SQL or even MS Access can be used... Figure 6. The DoD vs. Commercial Life Cycle...necessarily be interested in SCADA network security 13. MySQL (Database server) – This station represents a typical data server for a web page

  19. CALS Database Usage and Analysis Tool Study

    DTIC Science & Technology

    1991-09-01

    inference aggregation and cardinality aggregation as two distinct aspects of the aggregation problem. The paper develops the concept of a semantic...aggregation, cardinality aggregation... NIDX - An Expert System for Real-Time...1989 IEEE Symposium on Research in Security and Privacy, Oakland, CA, May 1989. [2] Baur, D.S.; Eichelman, F.R. II; Herrera, R.M.; Irgon, A.E

  20. Comment on "flexible protocol for quantum private query based on B92 protocol"

    NASA Astrophysics Data System (ADS)

    Chang, Yan; Zhang, Shi-Bin; Zhu, Jing-Min

    2017-03-01

    In a recent paper (Quantum Inf Process 13:805-813, 2014), a flexible quantum private query (QPQ) protocol based on the B92 protocol is presented. Here we point out that the B92-based QPQ protocol is insecure with respect to database security when the channel is lossy; that is, the user (Alice) can learn more records in Bob's database than she has paid for.
