The methodology of database design in organization management systems
NASA Astrophysics Data System (ADS)
Chudinov, I. L.; Osipova, V. V.; Bobrova, Y. V.
2017-01-01
The paper describes a unified methodology of database design for management information systems. Designing the conceptual information model for the domain area is the most important and labor-intensive stage in database design. Based on the proposed integrated approach to designing the conceptual information model, the main principles of developing relational databases are provided and users' information needs are considered. According to the methodology, the process of designing the conceptual information model includes three basic stages, which are defined in detail. Finally, the article describes how the results of analyzing users' information needs are applied, and gives the rationale for the use of classifiers.
Developing Visualization Support System for Teaching/Learning Database Normalization
ERIC Educational Resources Information Center
Folorunso, Olusegun; Akinwale, AdioTaofeek
2010-01-01
Purpose: In tertiary institutions, some students find it hard to learn database design theory, in particular database normalization. The purpose of this paper is to develop a visualization tool that gives students an interactive, hands-on experience of the database normalization process. Design/methodology/approach: The model-view-controller architecture…
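The normalization process this record refers to can be illustrated with a minimal sketch: detect a functional dependency that violates full dependency on a composite key, then decompose the relation. The relation, attributes, and dependencies below are invented examples, not taken from the paper.

```python
# A minimal sketch of one step in database normalization: detecting a
# functional dependency (FD) that violates 2NF and decomposing the relation.
# The relation and FDs here are hypothetical illustrations.

def project(rows, attrs):
    """Project rows onto attrs, removing duplicate tuples."""
    return {tuple(r[a] for a in attrs) for r in rows}

def fd_holds(rows, lhs, rhs):
    """Check whether the functional dependency lhs -> rhs holds in rows."""
    seen = {}
    for r in rows:
        key = tuple(r[a] for a in lhs)
        val = tuple(r[a] for a in rhs)
        if seen.setdefault(key, val) != val:
            return False
    return True

# Unnormalized order lines with key (order_id, product):
# order_id -> customer is a partial dependency, so the table is not in 2NF.
rows = [
    {"order_id": 1, "product": "pen", "customer": "Ada"},
    {"order_id": 1, "product": "ink", "customer": "Ada"},
    {"order_id": 2, "product": "pen", "customer": "Bob"},
]

assert fd_holds(rows, ["order_id"], ["customer"])   # partial dependency found

# Decompose into Orders(order_id, customer) and Lines(order_id, product):
orders = project(rows, ["order_id", "customer"])
lines = project(rows, ["order_id", "product"])
print(orders)  # {(1, 'Ada'), (2, 'Bob')}
```

A visualization tool like the one described would animate exactly this kind of check and decomposition step by step.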
A Dynamic Approach to Make CDS/ISIS Databases Interoperable over the Internet Using the OAI Protocol
ERIC Educational Resources Information Center
Jayakanth, F.; Maly, K.; Zubair, M.; Aswath, L.
2006-01-01
Purpose: A dynamic approach to making legacy databases, like CDS/ISIS, interoperable with OAI-compliant digital libraries (DLs). Design/methodology/approach: There are many bibliographic databases that are being maintained using legacy database systems. CDS/ISIS is one such legacy database system. It was designed and developed specifically for…
Medication safety research by observational study design.
Lao, Kim S J; Chui, Celine S L; Man, Kenneth K C; Lau, Wallis C Y; Chan, Esther W; Wong, Ian C K
2016-06-01
Observational studies have been recognised as essential for investigating the safety profile of medications. Numerous observational studies have been conducted on the platform of large population databases, which provide adequate sample size and follow-up length to detect infrequent and/or delayed clinical outcomes. Cohort and case-control designs are well-accepted traditional methodologies for hypothesis testing, while within-individual study designs are developing and evolving, addressing previously known methodological limitations to reduce confounding and bias. Respective examples of observational studies of different study designs using medical databases are shown. The methodological characteristics, study assumptions, strengths and weaknesses of each method are discussed in this review.
C3I system modification and EMC (electromagnetic compatibility) methodology, volume 1
NASA Astrophysics Data System (ADS)
Wilson, J. L.; Jolly, M. B.
1984-01-01
A methodology (i.e., consistent set of procedures) for assessing the electromagnetic compatibility (EMC) of RF subsystem modifications on C3I aircraft was generated during this study (Volume 1). An IEMCAP (Intrasystem Electromagnetic Compatibility Analysis Program) database for the E-3A (AWACS) C3I aircraft RF subsystem was extracted to support the design of the EMC assessment methodology (Volume 2). Mock modifications were performed on the E-3A database to assess, using a preliminary form of the methodology, the resulting EMC impact. Application of the preliminary assessment methodology to modifications in the E-3A database served to fine tune the form of a final assessment methodology. The resulting final assessment methodology is documented in this report in conjunction with the overall study goals, procedures, and database. It is recommended that a similar EMC assessment methodology be developed for the power subsystem within C3I aircraft. It is further recommended that future EMC assessment methodologies be developed around expert systems (i.e., computer intelligent agents) to control both the excruciating detail and user requirement for transparency.
ERIC Educational Resources Information Center
Deutsch, Donald R.
This report describes a research effort that was carried out over a period of several years to develop and demonstrate a methodology for evaluating proposed Database Management System designs. The major proposition addressed by this study is embodied in the thesis statement: Proposed database management system designs can be evaluated best through…
ERIC Educational Resources Information Center
Lucas, Paul M.
2009-01-01
This study utilized a mixed-method design in order to investigate the alignment of secondary science teachers' instructional methodologies and their homework designs. Surveys were distributed to educators from a Center for Ocean Sciences Excellence Education (COSEE) database. Coding rubrics were developed to categorize the participants' responses…
Emission Database for Global Atmospheric Research (EDGAR).
ERIC Educational Resources Information Center
Olivier, J. G. J.; And Others
1994-01-01
Presents the objective and methodology chosen for the construction of a global emissions source database called EDGAR and the structural design of the database system. The database estimates, on a regional and grid basis, 1990 annual emissions of greenhouse gases and of ozone-depleting compounds from all known sources. (LZ)
Centralized database for interconnection system design. [for spacecraft
NASA Technical Reports Server (NTRS)
Billitti, Joseph W.
1989-01-01
A database application called DFACS (Database, Forms and Applications for Cabling and Systems) is described. The objective of DFACS is to improve the speed and accuracy of interconnection system information flow during the design and fabrication stages of a project, while simultaneously supporting both the horizontal (end-to-end wiring) and the vertical (wiring by connector) design stratagems used by the Jet Propulsion Laboratory (JPL) project engineering community. The DFACS architecture is centered around a centralized database and program methodology which emulates the manual design process hitherto used at JPL. DFACS has been tested and successfully applied to existing JPL hardware tasks with a resulting reduction in schedule time and costs.
Linking Multiple Databases: Term Project Using "Sentences" DBMS.
ERIC Educational Resources Information Center
King, Ronald S.; Rainwater, Stephen B.
This paper describes a methodology for use in teaching an introductory Database Management System (DBMS) course. Students master basic database concepts through the use of a multiple component project implemented in both relational and associative data models. The associative data model is a new approach for designing multi-user, Web-enabled…
Vieira, Vanessa Pedrosa; De Biase, Noemi; Peccin, Maria Stella; Atallah, Alvaro Nagib
2009-06-01
To evaluate the methodological adequacy of voice and laryngeal study designs published in speech-language pathology and otorhinolaryngology journals indexed for the ISI Web of Knowledge (ISI Web) and the MEDLINE database. A cross-sectional study conducted at the Universidade Federal de São Paulo (Federal University of São Paulo). Two Brazilian speech-language pathology and otorhinolaryngology journals (Pró-Fono and Revista Brasileira de Otorrinolaringologia) and two international speech-language pathology and otorhinolaryngology journals (Journal of Voice, Laryngoscope), all dated between 2000 and 2004, were hand-searched by specialists. Subsequently, voice and larynx publications were separated, and a speech-language pathologist and otorhinolaryngologist classified 374 articles from the four journals according to objective and study design. The predominant objective contained in the articles was that of primary diagnostic evaluation (27%), and the most frequent study design was case series (33.7%). A mere 7.8% of the studies were designed adequately with respect to the stated objectives. There was no statistical difference in the methodological quality of studies indexed for the ISI Web and the MEDLINE database. The studies published in both national journals, indexed for the MEDLINE database, and international journals, indexed for the ISI Web, demonstrate weak methodology, with research poorly designed to meet the proposed objectives. There is much scientific work to be done in order to decrease uncertainty in the field analysed.
ERIC Educational Resources Information Center
Cavaleri, Piero
2008-01-01
Purpose: The purpose of this paper is to describe the use of AJAX for searching the Biblioteche Oggi database of bibliographic records. Design/methodology/approach: The paper is a demonstration of how bibliographic database single page interfaces allow the implementation of more user-friendly features for social and collaborative tasks. Findings:…
DOT National Transportation Integrated Search
2005-01-01
In 2003, an Internet-based Geotechnical Database Management System (GDBMS) was developed for the Virginia Department of Transportation (VDOT) using distributed Geographic Information System (GIS) methodology for data management, archival, retrieval, ...
The Philip Morris Information Network: A Library Database on an In-House Timesharing System.
ERIC Educational Resources Information Center
DeBardeleben, Marian Z.; And Others
1983-01-01
Outlines a database constructed at Philip Morris Research Center Library which encompasses holdings and circulation and acquisitions records for all items in the library. Host computer (DECSYSTEM-2060), software (BASIC), database design, search methodology, cataloging, and accessibility are noted; sample search, circ-in profile, end user profiles,…
The Recovery Care and Treatment Center: A Database Design and Development Case
ERIC Educational Resources Information Center
Harris, Ranida B.; Vaught, Kara L.
2008-01-01
The advantages of active learning methodologies have been suggested and empirically shown by a number of IS educators. Case studies are one such teaching technique that offers students the ability to think analytically, apply material learned, and solve a real-world problem. This paper presents a case study designed to be used in a database design…
NASA Technical Reports Server (NTRS)
Radovcich, N. A.; Dreim, D.; Okeefe, D. A.; Linner, L.; Pathak, S. K.; Reaser, J. S.; Richardson, D.; Sweers, J.; Conner, F.
1985-01-01
Work performed in the design of a transport aircraft wing for maximum fuel efficiency is documented with emphasis on design criteria, design methodology, and three design configurations. The design database includes complete finite element model description, sizing data, geometry data, loads data, and inertial data. A design process which satisfies the economics and practical aspects of a real design is illustrated. The cooperative study relationship between the contractor and NASA during the course of the contract is also discussed.
1984-12-01
Prepared for the Air Force Office of Scientific Research under Grant No. AFOSR 82-0322, December 1984.
NASA Astrophysics Data System (ADS)
Ehlmann, Bryon K.
Current scientific experiments are often characterized by massive amounts of very complex data and the need for complex data analysis software. Object-oriented database (OODB) systems have the potential of improving the description of the structure and semantics of this data and of integrating the analysis software with the data. This dissertation results from research to enhance OODB functionality and methodology to support scientific databases (SDBs) and, more specifically, to support a nuclear physics experiments database for the Continuous Electron Beam Accelerator Facility (CEBAF). This research to date has identified a number of problems related to the practical application of OODB technology to the conceptual design of the CEBAF experiments database and other SDBs: the lack of a generally accepted OODB design methodology, the lack of a standard OODB model, the lack of a clear conceptual level in existing OODB models, and the limited support in existing OODB systems for many common object relationships inherent in SDBs. To address these problems, the dissertation describes an Object-Relationship Diagram (ORD) and an Object-oriented Database Definition Language (ODDL) that provide tools that allow SDB design and development to proceed systematically and independently of existing OODB systems. These tools define multi-level, conceptual data models for SDB design, which incorporate a simple notation for describing common types of relationships that occur in SDBs. ODDL allows these relationships and other desirable SDB capabilities to be supported by an extended OODB system. A conceptual model of the CEBAF experiments database is presented in terms of ORDs and the ODDL to demonstrate their functionality and use and provide a foundation for future development of experimental nuclear physics software using an OODB approach.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-07
... statistical and other methodological consultation to this collaborative project. Discussion: Grantees under... and technical assistance must be designed to contribute to the following outcomes: (a) Maintenance of... methodological consultation available for research projects that use the BMS Database, as well as site- specific...
Options for Putting CDS/ISIS Databases on the Internet
ERIC Educational Resources Information Center
Buxton, Andrew
2006-01-01
Purpose: To review the variety of software solutions available for putting CDS/ISIS databases on the internet. To help anyone considering which route to take. Design/methodology/approach: Briefly describes the characteristics, history, origin and availability of each package. Identifies the type of skills required to implement the package and the…
Improved orthologous databases to ease protozoan targets inference.
Kotowski, Nelson; Jardim, Rodrigo; Dávila, Alberto M R
2015-09-29
Homology inference helps identify similarities, as well as differences, among organisms, which provides better insight into how closely related one might be to another. In addition, comparative genomics pipelines are widely adopted tools designed using different bioinformatics applications and algorithms. In this article, we propose a methodology to build improved orthologous databases with the potential to aid protozoan target identification, one of the many tasks that benefit from comparative genomics tools. Our analyses are based on OrthoSearch, a comparative genomics pipeline originally designed to infer orthologs through protein-profile comparison, supported by an HMM, reciprocal-best-hits based approach. Our methodology allows OrthoSearch to confront two orthologous databases and to generate an improved new one, which can later be used to infer potential protozoan targets through a similarity analysis against the human genome. The protein sequences of the Cryptosporidium hominis, Entamoeba histolytica and Leishmania infantum genomes were comparatively analyzed against three orthologous databases: (i) EggNOG KOG, (ii) ProtozoaDB and (iii) KEGG Orthology (KO). That allowed us to create two new orthologous databases, "KO + EggNOG KOG" and "KO + EggNOG KOG + ProtozoaDB", with 16,938 and 27,701 orthologous groups, respectively. These new orthologous databases were used for a regular OrthoSearch run. By confronting the "KO + EggNOG KOG" and "KO + EggNOG KOG + ProtozoaDB" databases with the protozoan species, we detected the following totals of orthologous groups and coverage (the ratio between the inferred orthologous groups and the species' total number of proteins): Cryptosporidium hominis: 1,821 (11 %) and 3,254 (12 %); Entamoeba histolytica: 2,245 (13 %) and 5,305 (19 %); Leishmania infantum: 2,702 (16 %) and 4,760 (17 %).
Using our HMM-based methodology and the largest created orthologous database, it was possible to infer 13 orthologous groups which represent potential protozoan targets; these were found because of our distant homology approach. We also provide the number of species-specific, pair-to-pair and core groups from such analyses, depicted in Venn diagrams. The orthologous databases generated by our HMM-based methodology provide a broader dataset, with larger amounts of orthologous groups when compared to the original databases used as input. Those may be used for several homology inference analyses, annotation tasks and protozoan targets identification.
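The database-merging idea described in this record can be sketched at a high level: union the orthologous groups of two source databases into one larger database, then report per-species coverage. The group IDs, protein IDs, and counts below are invented for illustration; they are not OrthoSearch's actual data structures.

```python
# A hedged sketch of merging two orthologous databases and computing
# per-species coverage, in the spirit of the methodology described above.
# All identifiers here are hypothetical.

def merge_orthologous_dbs(db_a, db_b):
    """Union two databases mapping group_id -> set of member protein IDs.

    Groups sharing an ID are merged; distinct IDs are kept side by side.
    """
    merged = {gid: set(members) for gid, members in db_a.items()}
    for gid, members in db_b.items():
        merged.setdefault(gid, set()).update(members)
    return merged

def coverage(merged, species_proteins):
    """Fraction of a species' proteins that fall in some orthologous group."""
    grouped = set().union(*merged.values()) if merged else set()
    return len(species_proteins & grouped) / len(species_proteins)

ko = {"OG1": {"chom_p1", "ehis_p4"}, "OG2": {"linf_p2"}}
kog = {"OG2": {"linf_p3"}, "OG9": {"chom_p7"}}

merged = merge_orthologous_dbs(ko, kog)
print(len(merged))  # 3 groups: OG1, OG2, OG9

chom = {"chom_p1", "chom_p7", "chom_p9"}
print(round(coverage(merged, chom), 2))  # 0.67
```

The merged database is strictly larger than either input, which mirrors the paper's observation that the combined databases yield more orthologous groups and broader coverage.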
Daily Migraine Prevention and Its Influence on Resource Utilization in the Military Health System
2006-08-01
Database and employed a one-group pretest-posttest design of patients exposed to prevention. Each patient was followed over 18 months (6 months prior to…
Toward a Bio-Medical Thesaurus: Building the Foundation of the UMLS
Tuttle, Mark S.; Blois, Marsden S.; Erlbaum, Mark S.; Nelson, Stuart J.; Sherertz, David D.
1988-01-01
The Unified Medical Language System (UMLS) is being designed to provide a uniform user interface to heterogeneous machine-readable bio-medical information resources, such as bibliographic databases, genetic databases, expert systems and patient records.1 Such an interface will have to recognize different ways of saying the same thing, and provide links to ways of saying related things. One way to represent the necessary associations is via a domain thesaurus. As no such thesaurus exists, and because, once built, it will be both sizable and in need of continuous maintenance, its design should include a methodology for building and maintaining it. We propose a methodology, utilizing lexically expanded schema inversion, and a design, called T. Lex, which together form one approach to the problem of defining and building a bio-medical thesaurus. We argue that the semantic locality implicit in such a thesaurus will support model-based reasoning in bio-medicine.2
Lessons Learned From Developing Reactor Pressure Vessel Steel Embrittlement Database
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Jy-An John
Materials behavior caused by neutron irradiation under fission and/or fusion environments can hardly be understood without practical examination. An easily accessible material information system with a large material database, using effective computers, is necessary for the design of nuclear materials and for analyses or simulations of these phenomena. The Embrittlement Data Base (EDB) developed at ORNL is such a comprehensive collection of data. The EDB contains power reactor pressure vessel surveillance data, material test reactor data, foreign reactor data (through bilateral agreements authorized by the NRC), and fracture toughness data. The lessons learned from building the EDB program and the associated database management activity regarding Material Database Design Methodology, Architecture and the Embedded QA Protocol are described in this report. The development of the IAEA International Database on Reactor Pressure Vessel Materials (IDRPVM) and a comparison of the EDB and IAEA IDRPVM databases are provided in the report. The recommended database QA protocol and database infrastructure are also stated in the report.
Online Bibliographic Databases in South Central Pennsylvania: Current Status and Training Needs.
ERIC Educational Resources Information Center
Townley, Charles
A survey of libraries in south central Pennsylvania was designed to identify those that are using or planning to use databases and to assess their perceived training needs. This report describes the methodology and analyzes the responses received from the 57 libraries that completed the questionnaire. Data presented in eight tables are concerned with…
Collective Bargaining in Higher Education and the Professions. Bibliography No. 20.
ERIC Educational Resources Information Center
Lowe, Ida B.; Johnson, Beth Hillman
This bibliography of 834 citations is an annual accounting of the literature on collective bargaining in higher education and the professions for 1991. The research design and methodology used in the preparation of this volume relied on computer searches of various databases and on manual retrieval of citations not available in any database.…
A Conceptual Framework for Systematic Reviews of Research in Educational Leadership and Management
ERIC Educational Resources Information Center
Hallinger, Philip
2013-01-01
Purpose: The purpose of this paper is to present a framework for scholars carrying out reviews of research that meet international standards for publication. Design/methodology/approach: This is primarily a conceptual paper focusing on the methodology of conducting systematic reviews of research. However, the paper draws on a database of reviews…
The comparison of various approach to evaluation erosion risks and design control erosion measures
NASA Astrophysics Data System (ADS)
Kapicka, Jiri
2015-04-01
At present, there is one methodology in the Czech Republic for computing and comparing erosion risks; it also contains a method for designing erosion control measures. The basis of this methodology is the Universal Soil Loss Equation (USLE) and its result, the long-term average annual rate of erosion (G). The methodology is used by landscape planners. Data and statistics from the database of erosion events in the Czech Republic show that many troubles and damages arise from local episodic erosion events. The extent of these events and their impact depend on the local precipitation event, the current plant phase and the soil conditions. Such erosion events can cause damage to agricultural land, municipal property and hydrological components even at locations that are in good condition from the point of view of the long-term average annual rate of erosion. Another way to compute and compare erosion risks is an episode-based approach. This paper compares various approaches to computing erosion risks. The comparison was computed for a locality from the database of erosion events on agricultural land in the Czech Republic where two erosion events had been recorded. The study area is a simple agricultural plot without any barriers that could strongly influence water flow and soil sediment transport. The computation of erosion risks (for all methodologies) was based on laboratory analysis of soil samples taken from the study area. The results of the USLE and MUSLE methodologies and the results from the mathematical model Erosion 3D were compared. Variances in the spatial distribution of the places with the highest soil erosion were compared and discussed. Another part presents variances in the designed erosion control measures, where the designs were based on the different methodologies. The results show the variance in computed erosion risks produced by the different methodologies. These variances can open a discussion about the different approaches to computing and evaluating erosion risks in areas of different importance.
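The USLE on which the Czech methodology is based is a simple product of factors; the long-term average annual soil loss (denoted G above, commonly A in the literature) can be sketched as follows. The factor values in the example are illustrative only, not taken from the paper.

```python
# A minimal sketch of the Universal Soil Loss Equation (USLE):
#   A = R * K * LS * C * P
# where (in metric units, result in t/ha/yr):
#   R  - rainfall erosivity factor
#   K  - soil erodibility factor
#   LS - combined slope length and steepness factor
#   C  - cover-management factor
#   P  - support practice factor
# The input values below are illustrative, not from the study area.

def usle(R, K, LS, C, P):
    """Long-term average annual soil loss per the USLE."""
    return R * K * LS * C * P

G = usle(R=40.0, K=0.35, LS=1.2, C=0.25, P=1.0)
print(round(G, 2))  # 4.2 t/ha/yr
```

Because every factor is a long-term average, a single product like this cannot capture the episodic events the paper highlights, which is exactly why event-based models such as MUSLE or Erosion 3D can disagree with it.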
ERIC Educational Resources Information Center
Oduwole, A. A.; Sowole, A. O.
2006-01-01
Purpose: This study examined the utilisation of the Essential Electronic Agricultural Library database (TEEAL) at the University of Agriculture Library, Abeokuta, Nigeria. Design/methodology/approach: Data collection was by questionnaire following a purposive sampling technique. A total of 104 out of 150 (69.3 per cent) responses were received and…
Sensory re-education after nerve injury of the upper limb: a systematic review.
Oud, Tanja; Beelen, Anita; Eijffinger, Elianne; Nollet, Frans
2007-06-01
To systematically review the available evidence for the effectiveness of sensory re-education to improve the sensibility of the hand in patients with a peripheral nerve injury of the upper limb. Studies were identified by an electronic search in the databases MEDLINE, Cumulative Index to Nursing & Allied Health Literature (CINAHL), EMBASE, the Cochrane Library, the Physiotherapy Evidence Database (PEDro), and the database of the Dutch National Institute of Allied Health Professions (Doconline) and by screening the reference lists of relevant articles. Two reviewers selected studies that met the following inclusion criteria: all designs except case reports, adults with impaired sensibility of the hand due to a peripheral nerve injury of the upper limb, and sensibility and functional sensibility as outcome measures. The methodological quality of the included studies was independently assessed by two reviewers. A best-evidence synthesis was performed, based on design, methodological quality and significant findings on outcome measures. Seven studies, with sample sizes ranging from 11 to 49, were included in the systematic review and appraised for content. Five of these studies were of poor methodological quality. One uncontrolled study (N = 13) was considered to be of sufficient methodological quality, and one randomized controlled trial (N = 49) was of high methodological quality. Best-evidence synthesis showed that there is limited evidence for the effectiveness of sensory re-education, provided by a statistically significant improvement in sensibility found in one high-quality randomized controlled trial. There is a need for further well-defined clinical trials to assess the effectiveness of sensory re-education of patients with impaired sensibility of the hand due to a peripheral nerve injury.
Ultra-Structure database design methodology for managing systems biology data and analyses
Maier, Christopher W; Long, Jeffrey G; Hemminger, Bradley M; Giddings, Morgan C
2009-01-01
Background Modern, high-throughput biological experiments generate copious, heterogeneous, interconnected data sets. Research is dynamic, with frequently changing protocols, techniques, instruments, and file formats. Because of these factors, systems designed to manage and integrate modern biological data sets often end up as large, unwieldy databases that become difficult to maintain or evolve. The novel rule-based approach of the Ultra-Structure design methodology presents a potential solution to this problem. By representing both data and processes as formal rules within a database, an Ultra-Structure system constitutes a flexible framework that enables users to explicitly store domain knowledge in both a machine- and human-readable form. End users themselves can change the system's capabilities without programmer intervention, simply by altering database contents; no computer code or schemas need be modified. This provides flexibility in adapting to change, and allows integration of disparate, heterogeneous data sets within a small core set of database tables, facilitating joint analysis and visualization without becoming unwieldy. Here, we examine the application of Ultra-Structure to our ongoing research program for the integration of large proteomic and genomic data sets (proteogenomic mapping). Results We transitioned our proteogenomic mapping information system from a traditional entity-relationship design to one based on Ultra-Structure. Our system integrates tandem mass spectrum data, genomic annotation sets, and spectrum/peptide mappings, all within a small, general framework implemented within a standard relational database system. General software procedures driven by user-modifiable rules can perform tasks such as logical deduction and location-based computations. The system is not tied specifically to proteogenomic research, but is rather designed to accommodate virtually any kind of biological research.
Conclusion We find Ultra-Structure offers substantial benefits for biological information systems, the largest being the integration of diverse information sources into a common framework. This facilitates systems biology research by integrating data from disparate high-throughput techniques. It also enables us to readily incorporate new data types, sources, and domain knowledge with no change to the database structure or associated computer code. Ultra-Structure may be a significant step towards solving the hard problem of data management and integration in the systems biology era. PMID:19691849
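The core Ultra-Structure idea described in this record, storing both data and process logic as rows that a small generic engine interprets, can be sketched as follows. The table layout, ruleset, and values are invented illustrations, not the paper's actual schema.

```python
# A hedged sketch of the rule-based Ultra-Structure approach: domain
# knowledge lives as rows ("rules") in an ordinary relational table, and a
# generic procedure interprets them. All names here are hypothetical.

import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE rules (ruleset TEXT, lhs TEXT, rhs TEXT)")

# Classification rules entered as data, not code:
con.executemany(
    "INSERT INTO rules VALUES (?, ?, ?)",
    [
        ("feature_kind", "CDS", "coding"),
        ("feature_kind", "tRNA", "non-coding"),
        ("feature_kind", "rRNA", "non-coding"),
    ],
)

def apply_rules(ruleset, value):
    """Generic procedure: look the answer up in the rules table."""
    row = con.execute(
        "SELECT rhs FROM rules WHERE ruleset = ? AND lhs = ?",
        (ruleset, value),
    ).fetchone()
    return row[0] if row else None

print(apply_rules("feature_kind", "tRNA"))  # non-coding

# Changing system behavior means inserting a row, with no schema or code
# change, which is the flexibility the abstract claims:
con.execute("INSERT INTO rules VALUES ('feature_kind', 'ncRNA', 'non-coding')")
print(apply_rules("feature_kind", "ncRNA"))  # non-coding
```

The design trade-off is that behavior moves out of reviewable code and into table contents, so the embedded rules themselves need curation and QA.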
Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design
NASA Technical Reports Server (NTRS)
Wuerer, J. E.; Gran, M.; Held, T. W.
1994-01-01
The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.
Griffith, B C; White, H D; Drott, M C; Saye, J D
1986-07-01
This article reports on five separate studies designed for the National Library of Medicine (NLM) to develop and test methodologies for evaluating the products of large databases. The methodologies were tested on literatures of the medical behavioral sciences (MBS). One of these studies examined how well NLM covered MBS monographic literature using CATLINE and OCLC. Another examined MBS journal and serial literature coverage in MEDLINE and other MBS-related databases available through DIALOG. These two studies used 1010 items derived from the reference lists of sixty-one journals, and tested for gaps and overlaps in coverage in the various databases. A third study examined the quality of the indexing NLM provides to MBS literatures and developed a measure of indexing as a system component. The final two studies explored how well MEDLINE retrieved documents on topics submitted by MBS professionals and how online searchers viewed MEDLINE (and other systems and databases) in handling MBS topics. The five studies yielded both broad research outcomes and specific recommendations to NLM.
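The gap-and-overlap coverage test described in this record reduces to set operations: take a pool of cited items and ask which databases cover each one. The item IDs and database contents below are invented for illustration; the study's actual pool was 1010 references.

```python
# A hedged sketch of a database coverage test: which cited items are in no
# database (gaps), which are in every database (overlaps), and what share
# of the pool each database covers. All contents here are hypothetical.

cited_items = {"a1", "a2", "a3", "a4", "a5"}
coverage_map = {
    "CATLINE": {"a1", "a2"},
    "MEDLINE": {"a2", "a3", "a4"},
}

covered = set().union(*coverage_map.values())
gaps = cited_items - covered                        # items in no database
overlap = set.intersection(*coverage_map.values())  # items in every database

print(sorted(gaps))     # ['a5']
print(sorted(overlap))  # ['a2']
for db, items in coverage_map.items():
    print(db, f"{len(items & cited_items) / len(cited_items):.0%}")
```

Scaled to a real citation pool, the same three quantities (gaps, overlaps, per-database share) are exactly what coverage studies of this kind report.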
SwePep, a database designed for endogenous peptides and mass spectrometry.
Fälth, Maria; Sköld, Karl; Norrman, Mathias; Svensson, Marcus; Fenyö, David; Andren, Per E
2006-06-01
A new database, SwePep, specifically designed for endogenous peptides, has been constructed to significantly speed up the identification process from complex tissue samples utilizing mass spectrometry. In the identification process the experimental peptide masses are compared with the peptide masses stored in the database, both with and without possible post-translational modifications. This intermediate identification step is fast and singles out candidate endogenous peptides that can later be confirmed with tandem mass spectrometry data. Successful applications of this methodology are presented. The SwePep database is a relational database developed using MySQL and Java. The database contains 4180 annotated endogenous peptides from different tissues originating from 394 different species, as well as 50 novel peptides from brain tissue identified in our laboratory. Information about the peptides, including mass, isoelectric point, sequence, and precursor protein, is also stored in the database. This new approach holds great potential for removing the bottleneck that occurs during the identification process in the field of peptidomics. The SwePep database is available to the public.
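As an illustration of the intermediate mass-matching step described above, the following sketch (with invented peptide entries and standard PTM mass shifts; it is not SwePep code) compares an observed mass against stored masses with and without modifications:

```python
# Illustrative sketch, not the actual SwePep implementation: match an
# observed peptide mass against stored masses, allowing for a
# post-translational modification (PTM) mass shift, within a tolerance.

# Hypothetical mini-database: (name, monoisotopic mass in Da)
PEPTIDES = [
    ("substance P", 1346.73),
    ("neurotensin", 1671.91),
    ("dynorphin A(1-8)", 980.59),
]
# Common PTM mass shifts in Da; the shift values are standard chemistry,
# but this short list is only for illustration.
PTMS = {"none": 0.0, "phosphorylation": 79.966, "amidation": -0.984}

def match_mass(observed, tol=0.02):
    """Return (peptide, ptm) pairs whose mass + PTM shift is within tol."""
    hits = []
    for name, mass in PEPTIDES:
        for ptm, shift in PTMS.items():
            if abs(observed - (mass + shift)) <= tol:
                hits.append((name, ptm))
    return hits
```

Candidates returned this way would then be confirmed or rejected against tandem MS data, as the abstract describes.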
Fashion sketch design by interactive genetic algorithms
NASA Astrophysics Data System (ADS)
Mok, P. Y.; Wang, X. X.; Xu, J.; Kwok, Y. L.
2012-11-01
Computer aided design is vitally important for modern industry, particularly for the creative industries. The fashion industry faces intense pressure to shorten its product development process. In this paper, a methodology is proposed for sketch design based on interactive genetic algorithms. The sketch design system consists of a sketch design model, a database and a multi-stage sketch design engine. First, a sketch design model is developed, based on the knowledge of fashion design, to describe fashion product characteristics by means of parameters. Second, a database is built on the proposed sketch design model to define general style elements. Third, a multi-stage sketch design engine is used to construct the design. Moreover, an interactive genetic algorithm (IGA) is used to accelerate the sketch design process. The experimental results demonstrate that the proposed method is effective in helping laypersons achieve satisfactory fashion design sketches.
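The IGA loop can be sketched as follows; the encoding, parameters, and the scripted rating function that stands in for the interactive user are all assumptions for illustration, not taken from the paper:

```python
# Interactive GA sketch: each design is a vector of style-element indices
# (collar, sleeve, skirt, ...); the user's rating is the fitness. Here a
# scripted rating function stands in for the human in the loop.
import random

N_GENES, POOL = 4, 8          # 4 style parameters, 8 options for each
random.seed(0)                # deterministic for the demonstration

def rate(design):
    """Stand-in for the interactive user's rating of a sketch."""
    target = (3, 1, 4, 1)     # hypothetical preferred style
    return sum(g == t for g, t in zip(design, target))

def evolve(pop_size=20, generations=50):
    pop = [[random.randrange(POOL) for _ in range(N_GENES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=rate, reverse=True)
        parents = pop[:pop_size // 2]            # preferred half survives
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_GENES)   # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.3:            # occasional mutation
                child[random.randrange(N_GENES)] = random.randrange(POOL)
            children.append(child)
        pop = parents + children
    return max(pop, key=rate)
```

In the interactive setting, `rate` would be replaced by the user's score for each candidate sketch, which is what keeps the search aligned with subjective taste.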
ERIC Educational Resources Information Center
Cheung, Waiman; Li, Eldon Y.; Yee, Lester W.
2003-01-01
Metadatabase modeling and design integrate process modeling and data modeling methodologies. Both are core topics in the information technology (IT) curriculum. Learning these topics has been an important pedagogical issue to the core studies for management information systems (MIS) and computer science (CSc) students. Unfortunately, the learning…
Kamal, Noreen; Fels, Sidney
2013-01-01
Positive health behaviour is critical to preventing illness and managing chronic conditions. A user-centred methodology was employed to design an online social network to motivate health behaviour change. The methodology was augmented by utilizing the Appeal, Belonging, Commitment (ABC) Framework, which is based on theoretical models for health behaviour change and use of online social networks. The user-centred methodology included four phases: 1) an initial user inquiry on health behaviour and use of online social networks; 2) interview feedback on paper prototypes; 3) a laboratory study on a medium-fidelity prototype; and 4) a field study on the high-fidelity prototype. The points of inquiry through these phases were based on the ABC Framework. This yielded an online social network system that linked to external third-party databases and was deployed to users via an interactive website.
Fusion of Visible and Thermal Descriptors Using Genetic Algorithms for Face Recognition Systems.
Hermosilla, Gabriel; Gallardo, Francisco; Farias, Gonzalo; San Martin, Cesar
2015-07-23
The aim of this article is to present a new face recognition system based on the fusion of visible and thermal features obtained from the most current local matching descriptors by maximizing face recognition rates through the use of genetic algorithms. The article considers a comparison of the performance of the proposed fusion methodology against five current face recognition methods and classic fusion techniques used commonly in the literature. These were selected by considering their performance in face recognition. The five local matching methods and the proposed fusion methodology are evaluated using the standard visible/thermal database, the Equinox database, along with a new database, the PUCV-VTF, designed for visible-thermal studies in face recognition and described for the first time in this work. The latter is created considering visible and thermal image sensors with different real-world conditions, such as variations in illumination, facial expression, pose, occlusion, etc. The main conclusions of this article are that two variants of the proposed fusion methodology surpass current face recognition methods and the classic fusion techniques reported in the literature, attaining recognition rates of over 97% and 99% for the Equinox and PUCV-VTF databases, respectively. The fusion methodology is very robust to illumination and expression changes, as it combines thermal and visible information efficiently by using genetic algorithms, thus allowing it to choose optimal face areas where one spectrum is more representative than the other.
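A minimal sketch of the idea, reduced to score-level fusion on toy similarity matrices (the paper fuses local descriptors; the data and all parameters here are invented), shows a GA searching for the visible/thermal mixing weight:

```python
# GA searching for the fusion weight w in w*visible + (1-w)*thermal that
# maximizes rank-1 recognition on toy data.
import random
random.seed(1)

# Toy similarity scores: probe i vs. gallery j (higher = more similar);
# the correct match is the diagonal. Visible fails on probe 1, thermal on probe 0.
VIS = [[0.9, 0.2, 0.1], [0.8, 0.3, 0.2], [0.1, 0.2, 0.7]]
THM = [[0.3, 0.6, 0.2], [0.2, 0.9, 0.1], [0.2, 0.1, 0.8]]

def rank1(w):
    """Rank-1 recognition rate of the fused scores."""
    hits = 0
    for i in range(3):
        fused = [w * VIS[i][j] + (1 - w) * THM[i][j] for j in range(3)]
        hits += fused.index(max(fused)) == i
    return hits / 3

def ga_best_weight(pop_size=10, generations=20):
    pop = [random.random() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=rank1, reverse=True)
        elite = pop[:pop_size // 2]
        mutants = [min(1.0, max(0.0, random.choice(elite) + random.gauss(0, 0.1)))
                   for _ in elite]
        pop = elite + mutants
    return max(pop, key=rank1)
```

On this toy data a mid-range weight recovers matches that neither spectrum resolves alone, which is the intuition behind letting a GA pick where each spectrum should dominate.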
Spatial Designation of Critical Habitats for Endangered and Threatened Species in the United States
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tuttle, Mark A; Singh, Nagendra; Sabesan, Aarthy
Establishing biological reserves or "hot spots" for endangered and threatened species is critical to support real-world species regulatory and management problems. Geographic data on the distribution of endangered and threatened species can be used to improve ongoing efforts for species conservation in the United States. At present no spatial database exists that maps out the locations of endangered species for the US. Spatial descriptions do exist for the habitat associated with all endangered species, but in a form not readily usable in a geographic information system (GIS). In our study, the principal challenge was extracting spatial data describing these critical habitats for 472 species from over 1000 pages of the Federal Register. In addition, an appropriate database schema was designed to accommodate the different tiers of information associated with the species, along with the confidence of designation; the interpreted location data were geo-referenced to the county enumeration unit, producing a spatial database of endangered species for the whole of the US. The significance of these critical habitat designations, the database schema and the methodologies will be discussed.
Basic Concepts, Current Practices and Available Resources for Forensic Investigations on Pavements
DOT National Transportation Integrated Search
1997-09-01
The purpose of the project, entitled "Development of a Methodology for Identifying Pavement Design and Construction Data Needed to Support a Forensic Investigation," is to develop a database containing information useful in identifying the premature f...
Cardiological database management system as a mediator to clinical decision support.
Pappas, C; Mavromatis, A; Maglaveras, N; Tsikotis, A; Pangalos, G; Ambrosiadou, V
1996-03-01
An object-oriented medical database management system is presented for a typical cardiologic center, facilitating epidemiological trials. Object-oriented analysis and design were used for the system design, offering advantages for the integrity and extendibility of medical information systems. The system was developed using object-oriented design and programming methodology, the C++ language and the Borland Paradox Relational Data Base Management System on an MS-Windows NT environment. Particular attention was paid to system compatibility, portability, the ease of use, and the suitable design of the patient record so as to support the decisions of medical personnel in cardiovascular centers. The system was designed to accept complex, heterogeneous, distributed data in various formats and from different kinds of examinations such as Holter, Doppler and electrocardiography.
Modeling biology using relational databases.
Peitzsch, Robert M
2003-02-01
There are several different methodologies that can be used for designing a database schema; no one is the best for all occasions. This unit demonstrates two different techniques for designing relational tables and discusses when each should be used. These two techniques presented are (1) traditional Entity-Relationship (E-R) modeling and (2) a hybrid method that combines aspects of data warehousing and E-R modeling. The method of choice depends on (1) how well the information and all its inherent relationships are understood, (2) what types of questions will be asked, (3) how many different types of data will be included, and (4) how much data exists.
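A minimal sketch of technique (1), traditional E-R modeling, using hypothetical biology tables (entities Gene and Protein linked by a foreign key; the schema is illustrative, not from the unit):

```python
# E-R-style relational design: one table per entity, with the one-to-many
# "gene encodes protein" relationship expressed as a foreign key.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE gene (
    gene_id    INTEGER PRIMARY KEY,
    symbol     TEXT NOT NULL UNIQUE,
    chromosome TEXT
);
CREATE TABLE protein (
    protein_id INTEGER PRIMARY KEY,
    name       TEXT NOT NULL,
    gene_id    INTEGER NOT NULL REFERENCES gene(gene_id)
);
""")
con.execute("INSERT INTO gene VALUES (1, 'TP53', '17')")
con.execute("INSERT INTO protein VALUES (1, 'Cellular tumor antigen p53', 1)")
row = con.execute("""
    SELECT g.symbol, p.name FROM gene g JOIN protein p USING (gene_id)
""").fetchone()
```

The hybrid warehousing approach would instead denormalize such entities into wide fact tables when the query patterns, rather than the entity relationships, are what is best understood.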
Warehouses information system design and development
NASA Astrophysics Data System (ADS)
Darajatun, R. A.; Sukanta
2017-12-01
The materials/goods handling function is fundamental for companies to ensure the smooth running of their warehouses; efficiency and organization within every aspect of the business are essential to gain a competitive advantage. The purpose of this research is the design and development of a Kanban-based inventory storage and delivery system. The application aims to make inventory stock checks more efficient and effective: users can easily record finished goods from the production department, the warehouse, customers, and suppliers. The master data are designed to be as complete as possible so that the application can support a variety of warehouse logistics processes. The application was developed in the Java programming language as a Java web application, with MySQL as the database. Development followed the Waterfall methodology, which comprises the stages of Analysis, System Design, Implementation, Integration, and Operation and Maintenance. Data were collected through observation, interviews, and a literature review.
NASA Astrophysics Data System (ADS)
Hanan, Lu; Qiushi, Li; Shaobin, Li
2016-12-01
This paper presents an integrated optimization design method in which uniform design, response surface methodology and genetic algorithms are used in combination. In detail, uniform design is used to select the experimental sampling points in the experimental domain, and the system performance is evaluated by means of computational fluid dynamics to construct a database. After that, response surface methodology is employed to generate a surrogate mathematical model relating the optimization objective to the design variables. Subsequently, a genetic algorithm is applied to the surrogate model to acquire the optimal solution subject to the constraints. The method has been applied to the optimization design of an axisymmetric diverging duct with three design variables: one qualitative variable and two quantitative variables. The method performs well in improving the duct's aerodynamic performance and can also be applied to wider fields of mechanical design, serving as a useful tool for engineering designers by reducing design time and computational cost.
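The three-step loop can be sketched on an assumed one-dimensional toy problem (the stand-in function replaces the CFD evaluation; all parameters are illustrative):

```python
# 1) uniform sampling -> database, 2) least-squares response surface,
# 3) GA search on the cheap surrogate instead of the expensive simulation.
import random
import numpy as np
random.seed(2)

def simulate(x):
    """Stand-in for the expensive CFD evaluation; true optimum at x = 0.6."""
    return (x - 0.6) ** 2 + 1.0

# 1) Uniform sampling of the design domain [0, 1] builds the database.
xs = [i / 10 for i in range(11)]
ys = [simulate(x) for x in xs]

# 2) Least-squares fit of a quadratic response surface y ~ a*x^2 + b*x + c.
A = np.vstack([np.square(xs), xs, np.ones(len(xs))]).T
a, b, c = np.linalg.lstsq(A, ys, rcond=None)[0]
surrogate = lambda x: a * x * x + b * x + c

# 3) A tiny GA minimises the surrogate.
pop = [random.random() for _ in range(20)]
for _ in range(40):
    pop.sort(key=surrogate)
    elite = pop[:10]
    pop = elite + [min(1.0, max(0.0, random.choice(elite) + random.gauss(0, 0.05)))
                   for _ in range(10)]
best = min(pop, key=surrogate)
```

Because every GA evaluation hits the surrogate rather than the simulator, the search cost is decoupled from the cost of the underlying analysis, which is the point of the methodology.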
Guidelines for the Design and Conduct of Clinical Studies in Knee Articular Cartilage Repair
Mithoefer, Kai; Saris, Daniel B.F.; Farr, Jack; Kon, Elizaveta; Zaslav, Kenneth; Cole, Brian J.; Ranstam, Jonas; Yao, Jian; Shive, Matthew; Levine, David; Dalemans, Wilfried; Brittberg, Mats
2011-01-01
Objective: To summarize current clinical research practice and develop methodological standards for objective scientific evaluation of knee cartilage repair procedures and products. Design: A comprehensive literature review was performed of high-level original studies providing information relevant for the design of clinical studies on articular cartilage repair in the knee. Analysis of cartilage repair publications and synopses of ongoing trials were used to identify important criteria for the design, reporting, and interpretation of studies in this field. Results: Current literature reflects the methodological limitations of the scientific evidence available for articular cartilage repair. However, clinical trial databases of ongoing trials document a trend suggesting improved study designs and clinical evaluation methodology. Based on the current scientific information and standards of clinical care, detailed methodological recommendations were developed for the statistical study design, patient recruitment, control group considerations, study endpoint definition, documentation of results, use of validated patient-reported outcome instruments, and inclusion and exclusion criteria for the design and conduct of scientifically sound cartilage repair study protocols. A consensus statement among the International Cartilage Repair Society (ICRS) and contributing authors experienced in clinical trial design and implementation was achieved. Conclusions: High-quality clinical research methodology is critical for the optimal evaluation of current and new cartilage repair technologies. In addition to generally applicable principles for orthopedic study design, specific criteria and considerations apply to cartilage repair studies. Systematic application of these criteria and considerations can facilitate study designs that are scientifically rigorous, ethical, practical, and appropriate for the question(s) being addressed in any given cartilage repair research project. 
PMID:26069574
Higher Education for Sustainable Development: A Systematic Review
ERIC Educational Resources Information Center
Wu, Yen-Chun Jim; Shen, Ju-Peng
2016-01-01
Purpose: This study aims to provide a complete understanding of academic research into higher education for sustainable development (HESD). Design/methodology/approach: This study utilizes a systematic review of four scientific literature databases to outline topics of research during the UN's Decade of Education for Sustainable Development…
Emotional Intelligence Research within Human Resource Development Scholarship
ERIC Educational Resources Information Center
Farnia, Forouzan; Nafukho, Fredrick Muyia
2016-01-01
Purpose: The purpose of this study is to review and synthesize pertinent emotional intelligence (EI) research within the human resource development (HRD) scholarship. Design/methodology/approach: An integrative review of literature was conducted and multiple electronic databases were searched to find the relevant resources. Using the content…
Use of Comparative Case Study Methodology for US Public Health Policy Analysis: A Review.
Dinour, Lauren M; Kwan, Amy; Freudenberg, Nicholas
There is growing recognition that policies influence population health, highlighting the need for evidence to inform future policy development and reform. This review describes how comparative case study methodology has been applied to public health policy research and discusses the methodology's potential to contribute to this evidence. English-language, peer-reviewed articles published between 1995 and 2012 were sought from 4 databases. Articles were included if they described comparative case studies addressing US public health policy. Two researchers independently assessed the 20 articles meeting review criteria. Case-related characteristics and research design tactics utilized to minimize threats to reliability and validity, such as the use of multiple sources of evidence and a case study protocol, were extracted from each article. Although comparative case study methodology has been used to analyze a range of public health policies at all stages and levels, articles reported an average use of only 3.65 (out of 10) research design tactics. By expanding the use of accepted research design tactics, public health policy researchers can contribute to expanding the evidence needed to advance health-promoting policies.
From Databases to Modelling of Functional Pathways
2004-01-01
This short review comments on current informatics resources and methodologies in the study of functional pathways in cell biology. It highlights recent achievements in unveiling the structural design of protein and gene networks and discusses current approaches to model and simulate the dynamics of regulatory pathways in the cell. PMID:18629070
From databases to modelling of functional pathways.
Nasi, Sergio
2004-01-01
This short review comments on current informatics resources and methodologies in the study of functional pathways in cell biology. It highlights recent achievements in unveiling the structural design of protein and gene networks and discusses current approaches to model and simulate the dynamics of regulatory pathways in the cell.
Digital Initiatives and Metadata Use in Thailand
ERIC Educational Resources Information Center
SuKantarat, Wichada
2008-01-01
Purpose: This paper aims to provide information about various digital initiatives in libraries in Thailand and especially use of Dublin Core metadata in cataloguing digitized objects in academic and government digital databases. Design/methodology/approach: The author began researching metadata use in Thailand in 2003 and 2004 while on sabbatical…
Bloch-Mouillet, E
1999-01-01
This paper aims to provide technical and practical advice about finding references using Current Contents on disk (Macintosh or PC) or via the Internet (FTP). Seven editions are published each week. They are all organized in the same way and have the same search engine. The Life Sciences edition, extensively used in medical research, is presented here in detail, as an example. This methodological note explains, in French, how to use this reference database. It is designed to be a practical guide for browsing and searching the database, and particularly for creating search profiles adapted to the needs of researchers.
A Framework for Preliminary Design of Aircraft Structures Based on Process Information. Part 1
NASA Technical Reports Server (NTRS)
Rais-Rohani, Masoud
1998-01-01
This report discusses the general framework and development of a computational tool for preliminary design of aircraft structures based on process information. The described methodology is suitable for multidisciplinary design optimization (MDO) activities associated with integrated product and process development (IPPD). The framework consists of three parts: (1) product and process definitions; (2) engineering synthesis; and (3) optimization. The product and process definitions are part of the input information provided by the design team. The backbone of the system is its ability to analyze a given structural design for performance as well as manufacturability and cost assessment. The system uses a database on material systems and manufacturing processes. Based on the identified set of design variables and an objective function, the system is capable of performing optimization subject to manufacturability, cost, and performance constraints. The accuracy of the manufacturability measures and cost models discussed here depends largely on the available data on specific methods of manufacture and assembly and the associated labor requirements. As such, our focus in this research has been on the methodology itself and not so much on its accurate implementation in an industrial setting. A three-tier approach is presented for an IPPD-MDO based design of aircraft structures. The variable-complexity cost estimation methodology and an approach for integrating manufacturing cost assessment into the design process are also discussed. This report is presented in two parts. In the first part, the design methodology is presented and the computational design tool is described. In the second part, a prototype model of the preliminary design Tool for Aircraft Structures based on Process Information (TASPI) is described. Part two also contains an example problem that applies the methodology described here for evaluation of six different design concepts for a wing spar.
Geometric database maintenance using CCTV cameras and overlay graphics
NASA Astrophysics Data System (ADS)
Oxenberg, Sheldon C.; Landell, B. Patrick; Kan, Edwin
1988-01-01
An interactive graphics system using closed circuit television (CCTV) cameras for remote verification and maintenance of a geometric world model database has been demonstrated in GE's telerobotics testbed. The database provides geometric models and locations of objects viewed by CCTV cameras and manipulated by telerobots. To update the database, an operator uses the interactive graphics system to superimpose a wireframe line drawing of an object with known dimensions on a live video scene containing that object. The methodology used is multipoint positioning to easily superimpose a wireframe graphic on the CCTV image of an object in the work scene. An enhanced version of GE's interactive graphics system will provide the object designation function for the operator control station of the Jet Propulsion Laboratory's telerobot demonstration system.
One approach to design of speech emotion database
NASA Astrophysics Data System (ADS)
Uhrin, Dominik; Chmelikova, Zdenka; Tovarek, Jaromir; Partila, Pavol; Voznak, Miroslav
2016-05-01
This article describes a system for evaluating the credibility of recordings with emotional character. The sound recordings form a Czech-language database for training and testing speech emotion recognition systems, which are designed to detect human emotions in the voice. Information about a speaker's emotional state is useful to security forces and emergency call services. Personnel in action (soldiers, police officers, firefighters) are often exposed to stress, and information about their emotional state, carried in the voice, can help dispatchers adapt their commands during an intervention. Call agents of an emergency call service must recognize the mental state of the caller to adjust the tone of the conversation; in this case, evaluation of the psychological state is the key factor for a successful intervention. A quality database of sound recordings is essential for creating such systems. Quality databases exist, such as the Berlin Database of Emotional Speech or HUMAINE, but they were recorded by actors in an audio studio, meaning the recordings contain simulated emotions, not real ones. Our research aims at creating a database of Czech emotional recordings of real human speech. Collecting sound samples for the database is only one of the tasks; another, no less important, is to evaluate the significance of the recordings from the perspective of emotional states. The design of a methodology for evaluating the credibility of emotional recordings is described in this article, and the results describe the advantages and applicability of the developed method.
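One plausible credibility measure, offered here as an assumption rather than the paper's exact method, is the share of listeners who agree with the majority emotion label for a recording:

```python
# Majority-agreement credibility: several listeners label a recording's
# emotion; credibility is the fraction agreeing with the majority label.
from collections import Counter

def credibility(labels):
    """labels: the emotion labels one recording received from listeners."""
    top, count = Counter(labels).most_common(1)[0]
    return top, count / len(labels)

# Hypothetical ratings of one recording by five listeners.
ratings = ["anger", "anger", "anger", "neutral", "anger"]
```

A recording whose intended emotion reaches high agreement would be admitted to the training set; low-agreement recordings would be discarded as non-credible.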
Interactive bibliographical database on color
NASA Astrophysics Data System (ADS)
Caivano, Jose L.
2002-06-01
The paper describes the methodology and results of a project under development, aimed at the elaboration of an interactive bibliographical database on color in all fields of application: philosophy, psychology, semiotics, education, anthropology, physical and natural sciences, biology, medicine, technology, industry, architecture and design, arts, linguistics, geography, history. The project is initially based upon an already developed bibliography, published in different journals, updated on various occasions, and now available on the Internet, with more than 2,000 entries. The interactive database will amplify that bibliography, incorporating hyperlinks and contents (indexes, abstracts, keywords, introductions, or eventually the complete document), and devising mechanisms for information retrieval. The sources to be included are: books, doctoral dissertations, multimedia publications, reference works. The main arrangement will be chronological, but the design of the database will allow rearrangements or selections by different fields: subject, Decimal Classification System, author, language, country, publisher, etc. A further project is to develop another database, including color-specialized journals or newsletters, and articles on color published in international journals, arranged in this case by journal name and date of publication, but allowing also rearrangements or selections by author, subject and keywords.
Archetype relational mapping - a practical openEHR persistence solution.
Wang, Li; Min, Lingtong; Wang, Rui; Lu, Xudong; Duan, Huilong
2015-11-05
One of the primary obstacles to the widespread adoption of openEHR methodology is the lack of practical persistence solutions for future-proof electronic health record (EHR) systems as described by the openEHR specifications. This paper presents an archetype relational mapping (ARM) persistence solution for archetype-based EHR systems to support healthcare delivery in the clinical environment. First, the data requirements of the EHR systems are analysed and organized into archetype-friendly concepts. The Clinical Knowledge Manager (CKM) is queried for matching archetypes; when necessary, new archetypes are developed to reflect concepts that are not encompassed by existing archetypes. Next, a template is designed for each archetype to apply constraints related to the local EHR context. Finally, a set of rules is designed to map the archetypes to data tables and provide data persistence based on the relational database. A comparison study was conducted to investigate the differences among the conventional database of an EHR system from a tertiary Class A hospital in China, the generated ARM database, and the Node + Path database. Five data-retrieving tests were designed based on clinical workflow to retrieve exams and laboratory tests. Additionally, two patient-searching tests were designed to identify patients who satisfy certain criteria. The ARM database achieved better performance than the conventional database in three of the five data-retrieving tests, but was less efficient in the remaining two tests. The difference in query-execution time between the ARM database and the conventional database was less than 130%. The ARM database was approximately 6-50 times more efficient than the conventional database in the patient-searching tests, while the Node + Path database required far more time than the other two databases to execute both the data-retrieving and the patient-searching tests.
The ARM approach is capable of generating relational databases using archetypes and templates for archetype-based EHR systems, thus successfully adapting to changes in data requirements. ARM performance is similar to that of conventionally-designed EHR systems, and can be applied in a practical clinical environment. System components such as ARM can greatly facilitate the adoption of openEHR architecture within EHR systems.
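A much-simplified sketch of the ARM idea (the archetype structure and all names are invented for illustration, not taken from the openEHR specifications) flattens an archetype's leaf nodes into a relational table:

```python
# ARM-style mapping sketch: generate a relational table from a (toy)
# archetype definition, one column per leaf node.
import sqlite3

# Hypothetical archetype: leaf-node name -> SQL column type.
ARCHETYPE = {"systolic": "INTEGER", "diastolic": "INTEGER",
             "position": "TEXT", "measured_at": "TEXT"}

def archetype_to_ddl(name, nodes):
    """Map an archetype's leaf nodes to one relational table."""
    cols = ", ".join(f"{col} {sqltype}" for col, sqltype in nodes.items())
    return f"CREATE TABLE {name} (id INTEGER PRIMARY KEY, {cols})"

ddl = archetype_to_ddl("blood_pressure", ARCHETYPE)
con = sqlite3.connect(":memory:")
con.execute(ddl)
con.execute("INSERT INTO blood_pressure VALUES (1, 120, 80, 'sitting', '2015-11-05')")
```

Because the tables are generated from the archetypes, a change in data requirements becomes a change in archetypes and regenerated DDL rather than a hand-redesigned schema, which is the adaptability the paper claims.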
Carracedo-Martínez, Eduardo; Taracido, Margarita; Tobias, Aurelio; Saez, Marc; Figueiras, Adolfo
2010-01-01
Background Case-crossover is one of the most used designs for analyzing the health-related effects of air pollution. Nevertheless, no one has reviewed its application and methodology in this context. Objective We conducted a systematic review of case-crossover (CCO) designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of methodology and application. Data sources and extraction A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level. Data synthesis The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects, and the remainder involved the design’s application. In the methodological reports, the designs that yielded the best results in simulation were symmetric bidirectional CCO and time-stratified CCO. Furthermore, we observed an increase across time in the use of certain CCO designs, mainly symmetric bidirectional and time-stratified CCO. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level. Conclusions The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method of analysis of variables at the individual level are put to little use. PMID:20356818
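For illustration, the time-stratified referent scheme highlighted above selects as controls all days in the case day's calendar month that fall on the same weekday (this is the standard scheme; the example date is invented):

```python
# Time-stratified case-crossover referent selection: each case day is
# compared with same-weekday days in the same calendar month.
from datetime import date, timedelta

def time_stratified_referents(case_day):
    """All same-weekday days in the case day's month, excluding the case day."""
    d = date(case_day.year, case_day.month, 1)
    days = []
    while d.month == case_day.month:
        if d.weekday() == case_day.weekday() and d != case_day:
            days.append(d)
        d += timedelta(days=1)
    return days
```

Stratifying on month and weekday is what controls for long-term trends, seasonality, and day-of-week effects while letting each subject serve as their own control.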
Measures of outdoor play and independent mobility in children and youth: A methodological review.
Bates, Bree; Stone, Michelle R
2015-09-01
Declines in children's outdoor play have been documented globally, which are partly due to heightened restrictions around children's independent mobility. Literature on outdoor play and children's independent mobility is increasing, yet no paper has summarized the various methodological approaches used. A methodological review could highlight most commonly used measures and comprehensive research designs that could result in more standardized methodological approaches. Methodological review. A standardized protocol guided a methodological review of published research on measures of outdoor play and children's independent mobility in children and youth (0-18 years). Online searches of 8 electronic databases were conducted and studies included if they contained a subjective/objective measure of outdoor play or children's independent mobility. References of included articles were scanned to identify additional articles. Twenty-four studies were included on outdoor play, and twenty-three on children's independent mobility. Study designs were diverse. Common objective measures included accelerometry, global positioning systems and direct observation; questionnaires, surveys and interviews were common subjective measures. Focus groups, activity logs, monitoring sheets, travel/activity diaries, behavioral maps and guided tours were also utilized. Questionnaires were used most frequently, yet few studies used the same questionnaire. Five studies employed comprehensive, mixed-methods designs. Outdoor play and children's independent mobility have been measured using a wide variety of techniques, with only a few studies using similar methodologies. A standardized methodological approach does not exist. Future researchers should consider including both objective measures (accelerometry and global positioning systems) and subjective measures (questionnaires, activity logs, interviews), as more comprehensive designs will enhance understanding of each multidimensional construct. 
Creating a standardized methodological approach would improve study comparisons. Copyright © 2014 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
The Design of a Standard Software Development Methodology for the Brazilian Aeronautical Ministry.
1985-12-01
controlling the overall aeronautical activities. In wartime its main objective will be to achieve and maintain air superiority over the Brazilian territory...and the detailed design phase. Like flowcharts, pseudocode can be used at any level of abstraction [17]. (4) Advantages: (a) Is superior to flow...incorporate database concepts [17]. Its level of proliferation in Brazil is low, and it is not known by the SIMAER's professionals. 3. Structured Design
Applications of GIS and database technologies to manage a Karst Feature Database
Gao, Y.; Tipping, R.G.; Alexander, E.C.
2006-01-01
This paper describes the management of a Karst Feature Database (KFD) in Minnesota. Two sets of applications in both GIS and Database Management System (DBMS) have been developed for the KFD of Minnesota. These applications were used to manage and to enhance the usability of the KFD. Structured Query Language (SQL) was used to manipulate transactions of the database and to facilitate the functionality of the user interfaces. The Database Administrator (DBA) authorized users with different access permissions to enhance the security of the database. Database consistency and recovery are accomplished by creating data logs and maintaining backups on a regular basis. The working database provides guidelines and management tools for future studies of karst features in Minnesota. The methodology of designing this DBMS is applicable to develop GIS-based databases to analyze and manage geomorphic and hydrologic datasets at both regional and local scales. The short-term goal of this research is to develop a regional KFD for the Upper Mississippi Valley Karst and the long-term goal is to expand this database to manage and study karst features at national and global scales.
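The transactional pattern the KFD abstract describes (SQL used to manipulate transactions, with consistency guarantees) can be sketched as follows. This is a minimal illustration using SQLite; the table schema, column names, and sample values are invented, since the abstract does not describe the actual KFD tables.

```python
import sqlite3

# Hypothetical schema; the real KFD tables are not given in the abstract.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE karst_feature (
    id INTEGER PRIMARY KEY,
    feature_type TEXT NOT NULL,
    county TEXT,
    lat REAL, lon REAL)""")

# A transaction keeps multi-statement updates consistent:
# either every statement commits or none does.
try:
    with conn:  # opens a transaction; commits on success, rolls back on error
        conn.execute(
            "INSERT INTO karst_feature (feature_type, county, lat, lon) "
            "VALUES (?, ?, ?, ?)",
            ("sinkhole", "Fillmore", 43.67, -92.09))
        conn.execute(
            "UPDATE karst_feature SET county = ? WHERE feature_type = ?",
            ("Fillmore", "sinkhole"))
except sqlite3.Error:
    pass  # on error the transaction is rolled back; the table is unchanged

rows = conn.execute("SELECT feature_type, county FROM karst_feature").fetchall()
print(rows)  # → [('sinkhole', 'Fillmore')]
```

In a server DBMS such as the one the paper uses, the DBA would additionally restrict users with GRANT/REVOKE statements; SQLite has no user accounts, so that part is omitted here.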
[Methodology for clinical research in Orthodontics, the assets of the beOrtho website].
Ruiz, Martial; Thibult, François
2014-06-01
The rules of "evidence-based" methodology strongly influence clinical research in orthodontics. However, the implementation of clinical studies requires rigour, substantial statistical and methodological knowledge, and a reliable environment in which to compile and store the data obtained from research. We developed the project "beOrtho.com" (based on orthodontic evidence) to fill the gap between our desire to conduct clinical research and the necessity of methodological rigour in the exploitation of its results. The beOrtho website was created to address sample recruitment, data compilation and storage, while providing help with the methodological design of clinical studies. It allows the development and monitoring of clinical studies, as well as the creation of databases. In addition, we designed an evaluation grid for clinical studies that helps in developing systematic reviews. To illustrate our point, we tested a research protocol evaluating the interest of mandibular advancement in the framework of Class II treatment. © EDP Sciences, SFODF, 2014.
Design risk assessment for burst-prone mines: Application in a Canadian mine
NASA Astrophysics Data System (ADS)
Cheung, David J.
A proactive stance towards improving the effectiveness and consistency of risk assessments has recently been adopted by mining companies and industry. Forecasts for the next 10-20 years indicate that ore deposits accessible using shallow mining techniques will diminish. The industry continues to strive for success in "deeper" mining projects in order to keep up with the continuing demand for raw materials. Although the returns are quite profitable, many projects have been sidelined due to high uncertainty and technical risk in the mining of the mineral deposit. Several hardrock mines have faced rockbursting and seismicity problems. Among those reported, mines in countries such as South Africa, Australia and Canada have documented cases of severe rockburst conditions attributed to the mining depth. Severe rockburst conditions, known as "burst-prone", can be effectively managed with design. Adopting a more robust design can reduce the exposure of workers and equipment to adverse conditions and minimize the economic consequences that can hinder the bottom line of an operation. This thesis presents a methodology created for assessing the design risk in burst-prone mines. The methodology includes an evaluation of relative risk ratings for scenarios with options of risk reduction through several design principles. Because rockbursts are a hazard arising from seismic events, the methodology is based on research in the area of mining seismicity, factoring in rockmass failure mechanisms that result from a combination of mining-induced stress, geological structures, rockmass properties and mining influences. The methodology was applied to case studies at Craig Mine of Xstrata Nickel in Sudbury, Ontario, which is known to contain seismically active fault zones. A customized risk assessment was created and applied to rockburst case studies, evaluating the seismic vulnerability and consequence for each case.
Application of the methodology to Craig Mine demonstrates that changes in the design can reduce both exposure risk (personnel and equipment) and economic risk (revenue and costs). Fatal and catastrophic consequences can be averted through robust planning and design. Two customized approaches were developed to conduct risk assessment of case studies at Craig Mine. Firstly, the Brownfield Approach utilizes the seismic database to determine the seismic hazard from a rating system that evaluates frequency-magnitude, event size, and event-blast relation. Secondly, the Greenfield Approach utilizes the seismic database, focusing on larger magnitude events, rocktype, and geological structure. The customized Greenfield Approach can also be applied in the evaluation of design risk in deep mines with the same setting and condition as Craig Mine. Other mines with different settings and conditions can apply the principles in the methodology to evaluate design alternatives and risk reduction strategies for burst-prone mines.
State Practices in the Assessment of Outcomes for Students with Disabilities. Technical Report.
ERIC Educational Resources Information Center
Shriner, James G.; And Others
This technical report describes the methodology, results, and conclusions of a 1991 survey, which was conducted to determine state efforts to develop systems to assess educational outcomes, states' needs for solutions to technical/implementation problems, existing databases, and efforts of states to design a comprehensive system of indicators in…
Do Apprentices' Communities of Practice Block Unwelcome Knowledge?
ERIC Educational Resources Information Center
Sligo, Frank; Tilley, Elspeth; Murray, Niki
2011-01-01
Purpose: This study aims to examine how well print-literacy support being provided to New Zealand Modern Apprentices (MAs) is supporting their study and practical work. Design/methodology/approach: The authors undertook a qualitative analysis of a database of 191 MAs in the literacy programme, then in 14 case studies completed 46 interviews with…
E-Training Adoption in the Nigerian Civil Service
ERIC Educational Resources Information Center
Zainab, Bello; Bhatti, Muhammad Awais; Pangil, Faizuniah Bt; Battour, Mohamed Mohamed
2015-01-01
Purpose: The purpose of this paper is to highlight the factors that aid e-training adoption in the Nigerian civil service. Design/methodology/approach: This paper is based on a review of past literature from databases, reports, newspapers, magazines, etc. The literature recognised the role of perceived cost, computer self-efficacy, availability of…
Epistemological Trends in Educational Leadership Studies in Israel: 2000-2012
ERIC Educational Resources Information Center
Eyal, Ori; Rom, Noa
2015-01-01
Purpose: The purpose of this paper is to identify the epistemological trends in the Israeli Educational Leadership (EL) scholarship between the years 2000 and 2012. Design/methodology/approach: The 51 studies included in this review were detected through a systematic search in online academic databases. Abstracts of studies identified as being…
The Service Learning Projects: Stakeholder Benefits and Potential Class Topics
ERIC Educational Resources Information Center
Rutti, Raina M.; LaBonte, Joanne; Helms, Marilyn Michelle; Hervani, Aref Agahei; Sarkarat, Sy
2016-01-01
Purpose: The purpose of this paper is to summarize the benefits of including a service learning project in college classes and focusses on benefits to all stakeholders, including students, community, and faculty. Design/methodology/approach: Using a snowball approach in academic databases as well as a nominal group technique to poll faculty, key…
Effectiveness of Occupational Health and Safety Training: A Systematic Review with Meta-Analysis
ERIC Educational Resources Information Center
Ricci, Federico; Chiesi, Andrea; Bisio, Carlo; Panari, Chiara; Pelosi, Annalisa
2016-01-01
Purpose: This meta-analysis aims to verify the efficacy of occupational health and safety (OHS) training in terms of knowledge, attitude and beliefs, behavior and health. Design/methodology/approach: The authors included studies published in English (2007-2014) selected from ten databases. Eligibility criteria were studies concerned with the…
Dialynas, Emmanuel; Topalis, Pantelis; Vontas, John; Louis, Christos
2009-01-01
Background Monitoring of insect vector populations with respect to their susceptibility to one or more insecticides is a crucial element of the strategies used for the control of arthropod-borne diseases. This management task can nowadays be achieved more efficiently when assisted by IT (Information Technology) tools, ranging from modern integrated databases to GIS (Geographic Information System). Here we describe an application ontology that we developed de novo, and a specially designed database that, based on this ontology, can be used for the purpose of controlling mosquitoes and, thus, the diseases that they transmit. Methodology/Principal Findings The ontology, named MIRO for Mosquito Insecticide Resistance Ontology, developed using the OBO-Edit software, describes all pertinent aspects of insecticide resistance, including specific methodology and mode of action. MIRO, then, forms the basis for the design and development of a dedicated database, IRbase, constructed using open source software, which can be used to retrieve data on mosquito populations in a temporally and spatially separate way, as well as to map the output using a Google Earth interface. The dependency of the database on the MIRO allows for a rational and efficient hierarchical search possibility. Conclusions/Significance The fact that the MIRO complies with the rules set forward by the OBO (Open Biomedical Ontologies) Foundry introduces cross-referencing with other biomedical ontologies and, thus, both MIRO and IRbase are suitable as parts of future comprehensive surveillance tools and decision support systems that will be used for the control of vector-borne diseases. MIRO is downloadable from and IRbase is accessible at VectorBase, the NIAID-sponsored open access database for arthropod vectors of disease. PMID:19547750
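The abstract notes that IRbase's dependency on MIRO "allows for a rational and efficient hierarchical search possibility". The core of such a search is a transitive-closure walk over the ontology's is-a links, sketched below. The term names and parent links are invented for illustration; the real MIRO hierarchy is far richer.

```python
# Hypothetical is-a links (child -> parents); not the actual MIRO terms.
parents = {
    "pyrethroid_resistance": ["insecticide_resistance"],
    "insecticide_resistance": ["resistance_phenotype"],
    "resistance_phenotype": [],
}

def ancestors(term):
    """All terms reachable via is-a links (transitive closure)."""
    seen = set()
    stack = list(parents.get(term, []))
    while stack:
        t = stack.pop()
        if t not in seen:
            seen.add(t)
            stack.extend(parents.get(t, []))
    return seen

# A query for 'insecticide_resistance' records can then match any mosquito
# population annotated with a more specific descendant term.
print(sorted(ancestors("pyrethroid_resistance")))
# → ['insecticide_resistance', 'resistance_phenotype']
```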
Cheer, Karen; MacLaren, David; Tsey, Komla
2015-01-01
Researchers are increasingly using grounded theory methodologies to study the professional experience of nurses and midwives. To review common grounded theory characteristics and research design quality as described in grounded theory studies of coping strategies used by nurses and midwives. A systematic database search for 2005-2015 identified and assessed grounded theory characteristics from 16 studies. Study quality was assessed using a modified Critical Appraisal Skills Programme tool. Grounded theory was considered a methodology or a set of methods, able to be used within different nursing and midwifery contexts. Specific research requirements determined the common grounded theory characteristics used in different studies. Most researchers did not clarify their epistemological and theoretical perspectives. To improve research design and trustworthiness of grounded theory studies in nursing and midwifery, researchers need to state their theoretical stance and clearly articulate their use of grounded theory methodology and characteristics in research reporting.
NASA Technical Reports Server (NTRS)
Brenton, J. C.; Barbre, R. E.; Decker, R. K.; Orcutt, J. M.
2018-01-01
The National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center (MSFC) Natural Environments Branch (EV44) provides atmospheric databases and analysis in support of space vehicle design and day-of-launch operations for NASA and commercial launch vehicle programs launching from the NASA Kennedy Space Center (KSC), co-located on the United States Air Force's Eastern Range (ER) at the Cape Canaveral Air Force Station. The ER complex is one of the most heavily instrumented sites in the United States, with over 31 towers measuring various atmospheric parameters on a continuous basis. An inherent challenge with large datasets is ensuring erroneous data are removed from databases, and thus excluded from launch vehicle design analyses. EV44 has put forth great effort in developing quality control (QC) procedures for individual meteorological instruments; however, no standard QC procedures for all databases currently exist, resulting in QC databases that have inconsistencies in variables, development methodologies, and periods of record. The goal of this activity is to use the previous efforts to develop a standardized set of QC procedures from which to build meteorological databases from KSC and the ER, while maintaining open communication with end users from the launch community to develop ways to improve, adapt and grow the QC database. Details of the QC procedures will be described. As the rate of launches increases with additional launch vehicle programs, it is becoming more important that weather databases are continually updated and checked for data quality before use in launch vehicle design and certification analyses.
Lazem, Shaimaa; Webster, Mary; Holmes, Wayne; Wolf, Motje
2015-09-02
Here we review 18 articles that describe the design and evaluation of 1 or more games for diabetes from technical, methodological, and theoretical perspectives. We undertook searches covering the period 2010 to May 2015 in the ACM, IEEE, Journal of Medical Internet Research, Studies in Health Technology and Informatics, and Google Scholar online databases using the keywords "children," "computer games," "diabetes," "games," "type 1," and "type 2" in various Boolean combinations. The review sets out to establish, for future research, an understanding of the current landscape of digital games designed for children with diabetes. We briefly explored the use and impact of well-established learning theories in such games. The most frequently mentioned theoretical frameworks were social cognitive theory and social constructivism. Due to the limitations of the reported evaluation methodologies, little evidence was found to support the strong promise of games for diabetes. Furthermore, we could not establish a relation between design features and the game outcomes. We argue that an in-depth discussion about the extent to which learning theories could and should be manifested in the design decisions is required. © 2015 Diabetes Technology Society.
Multidisciplinary analysis and design of printed wiring boards
NASA Astrophysics Data System (ADS)
Fulton, Robert E.; Hughes, Joseph L.; Scott, Waymond R., Jr.; Umeagukwu, Charles; Yeh, Chao-Pin
1991-04-01
Modern printed wiring board design depends on electronic prototyping using computer-based simulation and design tools. Existing electrical computer-aided design (ECAD) tools emphasize circuit connectivity with only rudimentary analysis capabilities. This paper describes a prototype integrated PWB design environment, denoted Thermal Structural Electromagnetic Testability (TSET), being developed at Georgia Tech in collaboration with companies in the electronics industry. TSET provides design guidance based on enhanced electrical and mechanical CAD capabilities, including electromagnetic modeling, testability analysis, thermal management, and solid mechanics analysis. TSET development is based on a strong analytical and theoretical science base and incorporates an integrated information framework and a common database design based on a systematic structured methodology.
Wang, S F; Zhan, S Y
2016-07-01
Electronic healthcare databases have become an important source for active surveillance of drug safety in the era of big data. The traditional epidemiology research designs are needed to confirm the association between drug use and adverse events based on these datasets, and the selection of the comparative control is essential to each design. This article aims to explain the principle and application of each type of control selection, introduce the methods and parameters for method comparison, and describe the latest achievements in the batch processing of control selection, which would provide important methodological reference for the use of electronic healthcare databases to conduct post-marketing drug safety surveillance in China.
Application of real-time database to LAMOST control system
NASA Astrophysics Data System (ADS)
Xu, Lingzhe; Xu, Xinqi
2004-09-01
The QNX-based real-time database is one of the main features of the control system of the Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST). It serves as a storage and platform for data flow, recording and updating in a timely manner the various statuses of moving components in the telescope structure as well as environmental parameters around it. The database integrates smoothly into the administration of the Telescope Control System (TCS). The paper presents the methodology and technical tips used in designing the EMPRESS database GUI software package, such as the dynamic creation of control widgets, dynamic query and shared memory. The seamless connection between EMPRESS and QNX's graphical development tool, Photon Application Builder (PhAB), has been realized, achieving a Windows look and feel under a Unix-like operating system. In particular, the real-time performance of the database is analyzed, showing that it satisfies the needs of the control system.
An Adaptive Database Intrusion Detection System
ERIC Educational Resources Information Center
Barrios, Rita M.
2011-01-01
Intrusion detection is difficult to accomplish when attempting to employ current methodologies when considering the database and the authorized entity. It is a common understanding that current methodologies focus on the network architecture rather than the database, which is not an adequate solution when considering the insider threat. Recent…
ERIC Educational Resources Information Center
Dekker, Karien; Kamerling, Margje
2017-01-01
Purpose: The paper aims to examine to what extent and why parental involvement as well as characteristics of ethnic school population influence social skills scores (social position, behavioural skills) of students. Design/methodology/approach: The study used the COOL5-18 database (2010) that included 553 Dutch primary schools and nearly 38,000…
Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard
2013-01-01
Purpose: With emergence of clinical outcomes databases as tools utilized routinely within institutions, comes need for software tools to support automated statistical analysis of these large data sets and intrainstitutional exchange from independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that addresses both issues. Methods: A software application was constructed to automate analysis of patient outcomes data using a wide range of statistical metrics, by combining use of C#.Net and R code. The accuracy and speed of the code was evaluated using benchmark data sets. Results: The approach provides data needed to evaluate combinations of statistical measurements for ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operator characteristic curves to identify a threshold value and combines use of contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set to identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. Conclusions: The work demonstrates the viability of the design approach and the software tool for analysis of large data sets. PMID:24320426
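The filtering step described above (a ROC-derived threshold followed by contingency-table tests) can be sketched roughly as below. The synthetic dose/event data, the use of the Youden index to pick the threshold, and all numeric choices are my assumptions; the paper's actual algorithm is more elaborate and also applies Welch t-tests, Kolmogorov-Smirnov tests, and Kullback-Leibler divergences.

```python
import numpy as np
from scipy.stats import fisher_exact

rng = np.random.default_rng(0)
# Synthetic data: dose received and whether a complication occurred.
dose = np.concatenate([rng.normal(20, 5, 80), rng.normal(35, 5, 20)])
event = np.concatenate([np.zeros(80, dtype=bool), np.ones(20, dtype=bool)])

# Pick the cut point that best separates events from non-events
# (maximum Youden J = sensitivity + specificity - 1 over all candidates).
cuts = np.unique(dose)
j = [((dose[event] >= c).mean() + (dose[~event] < c).mean() - 1, c)
     for c in cuts]
best_j, threshold = max(j)

# Contingency table at that threshold, then a Fisher exact test as a filter
# for variables demonstrating a dose-response relationship.
a = int(((dose >= threshold) & event).sum())   # high dose, event
b = int(((dose >= threshold) & ~event).sum())  # high dose, no event
c = int(((dose < threshold) & event).sum())    # low dose, event
d = int(((dose < threshold) & ~event).sum())   # low dose, no event
odds, p = fisher_exact([[a, b], [c, d]])
print(f"threshold={threshold:.1f}, significant={p < 0.05}")
```

A variable would be retained by the filter only when the test at its ROC-selected threshold reaches significance.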
Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard
2013-11-01
With emergence of clinical outcomes databases as tools utilized routinely within institutions, comes need for software tools to support automated statistical analysis of these large data sets and intrainstitutional exchange from independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that addresses both issues. A software application was constructed to automate analysis of patient outcomes data using a wide range of statistical metrics, by combining use of C#.Net and R code. The accuracy and speed of the code was evaluated using benchmark data sets. The approach provides data needed to evaluate combinations of statistical measurements for ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operator characteristic curves to identify a threshold value and combines use of contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set to identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. The work demonstrates the viability of the design approach and the software tool for analysis of large data sets.
Tavazzi, Luigi
2015-10-01
The development of technological, biological, and clinical knowledge leads to remarkable changes in scientific research methodology, including clinical research. Major changes include the pragmatic approach of trial designs, an explosive diffusion of observational research, which is becoming a usual component of clinical practice, and the active modelling of new research designs. Moreover, a new healthcare landscape could be generated from the information technology routinely used to collect clinical data in huge databases, from the management and analytic methodology of big data, and from the development of biological sensors compatible with daily life, delivering signals that can be remotely forwarded to central databases. Precision medicine and individualized medicine seem to be the big novelties of the coming years, guiding toward a shared pattern of patient/physician relationship. In healthcare, a huge business related mainly, but not exclusively, to the implementation of information technology is growing. This development will favor radical changes in health systems, also reshaping clinical activity. A new governance of research strategies is needed, and the application of the results should be based on shared ethical foundations. This new evolving profile of medical research and practice is discussed in this paper.
NASA Technical Reports Server (NTRS)
Smith, Andrew; LaVerde, Bruce; Hunt, Ron; Fulcher, Clay; Towner, Robert; McDonald, Emmett
2012-01-01
The design and theoretical basis of a new database tool that quickly generates vibroacoustic response estimates using a library of transfer functions (TFs) is discussed. During the early stages of a launch vehicle development program, these response estimates can be used to provide vibration environment specification to hardware vendors. The tool accesses TFs from a database, combines the TFs, and multiplies these by input excitations to estimate vibration responses. The database is populated with two sets of uncoupled TFs; the first set representing vibration response of a bare panel, designated as H^s, and the second set representing the response of the free-free component equipment by itself, designated as H^c. For a particular configuration undergoing analysis, the appropriate H^s and H^c are selected and coupled to generate an integrated TF, designated as H^(s+c). This integrated TF is then used with the appropriate input excitations to estimate vibration responses. This simple yet powerful tool enables a user to estimate vibration responses without directly using finite element models, so long as suitable H^s and H^c sets are defined in the database libraries. The paper discusses the preparation of the database tool and provides the assumptions and methodologies necessary to combine H^s and H^c sets into an integrated H^(s+c). An experimental validation of the approach is also presented.
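The final lookup-and-multiply step of such a tool can be sketched as follows. The configuration keys, TF magnitudes, and input spectrum are illustrative assumptions (not values from the paper), and the coupling of the bare-panel and component TFs into the integrated TF is taken as already done, since the abstract defers that methodology to the paper itself.

```python
import numpy as np

freqs = np.array([50.0, 100.0, 200.0, 400.0])         # Hz
# Hypothetical TF library keyed by (panel, component) configuration;
# each entry is an already-coupled, integrated TF magnitude (g/g).
tf_library = {
    ("panel_A", "box_1"): np.array([2.0, 3.5, 1.8, 0.9]),
}
input_psd = np.array([0.01, 0.02, 0.02, 0.01])        # g^2/Hz excitation

H = tf_library[("panel_A", "box_1")]
response_psd = np.abs(H) ** 2 * input_psd             # g^2/Hz at the component

# Overall RMS estimate via trapezoidal integration over frequency.
grms = float(np.sqrt(np.sum(
    (response_psd[1:] + response_psd[:-1]) / 2 * np.diff(freqs))))
print(response_psd.round(3), round(grms, 2))
```

The resulting response PSD (or its RMS summary) is the kind of vibration environment estimate that would be compared against a specification.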
1993-11-29
Certification: Initial Continuing Fund Experimental Research: Same Design, Implement in Ada, C, C++; Same Problem, Develop With Multiple Methodologies ...allowing analysts (non-programmers) to 'paint' specifications for screens, reports, databases, etc. 2) generating from design specifications 75% of...before the non-defense sector did and designed a tool to tackle the problem. DOD tested the tool and it worked. But DOD hasn't put Ada to work in a
NASA Technical Reports Server (NTRS)
Thomas, J. M.; Hanagud, S.
1974-01-01
The design criteria and test options for aerospace structural reliability were investigated. A decision methodology was developed for selecting a combination of structural tests and structural design factors. The decision method involves the use of Bayesian statistics and statistical decision theory. Procedures are discussed for obtaining and updating data-based probabilistic strength distributions for aerospace structures when test information is available and for obtaining subjective distributions when data are not available. The techniques used in developing the distributions are explained.
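A minimal sketch of the kind of Bayesian updating described, using a conjugate normal update of a strength distribution as test data arrive. The prior, the assumed test scatter, and the measured values are all invented for illustration; the report's actual decision-theoretic machinery is richer.

```python
# Conjugate normal update of a structural strength mean (known variance).
prior_mean, prior_var = 100.0, 25.0   # subjective prior before testing
test_var = 16.0                       # assumed scatter of a single test
tests = [104.0, 98.0, 103.0]          # measured strengths (illustrative)

mean, var = prior_mean, prior_var
for x in tests:                       # update sequentially as data arrive
    k = var / (var + test_var)        # weight given to the new observation
    mean = mean + k * (x - mean)      # posterior mean
    var = var * test_var / (var + test_var)  # posterior variance shrinks

print(round(mean, 2), round(var, 2))  # → 101.37 4.4
```

Each test pulls the distribution toward the data and tightens it, which is exactly how a data-based distribution would supersede a purely subjective one as test information accumulates.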
Mistry, Pankaj; Dunn, Janet A; Marshall, Andrea
2017-07-18
The application of adaptive design methodology within the clinical trial setting is becoming increasingly popular. However, the application of these methods is often not reported as an adaptive design, making it more difficult to capture their emerging use. Within this review, we aim to understand how adaptive design methodology is being reported, whether these methods are explicitly stated as an 'adaptive design' or have to be inferred, and whether these methods are applied prospectively or concurrently. Three databases, Embase, Ovid and PubMed, were chosen to conduct the literature search. The inclusion criteria for the review were phase II, phase III and phase II/III randomised controlled trials within the field of Oncology that published trial results in 2015. A variety of search terms related to adaptive designs were used. A total of 734 results were identified; after screening, 54 were eligible. Adaptive designs were more commonly applied in phase III confirmatory trials. The majority of the papers performed an interim analysis, which included some form of stopping criterion. Additionally, only two papers explicitly stated the term 'adaptive design', so for most of the papers it had to be inferred that adaptive methods were applied. Sixty-five applications of adaptive design methods were identified, of which the most common was the use of group sequential methods. This review indicates that the reporting of adaptive design methodology within clinical trials needs improving. The proposed extension to the current CONSORT 2010 guidelines could help capture adaptive design methods and provide an essential aid to those involved with clinical trials.
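Group sequential methods, the most common adaptation found above, allow a trial to stop early at an interim look if the test statistic crosses a pre-specified boundary. The toy simulation below sketches a two-look design with Pocock-style equal critical values; the effect size, sample sizes, and known-variance z-test are all illustrative assumptions, not from the review.

```python
import math
import random

random.seed(1)

# Pocock critical value for two equally spaced looks at overall
# two-sided alpha = 0.05 (used one-sided here for simplicity).
CRIT = 2.178

def z_stat(control, treated):
    """Two-sample z statistic assuming known unit variance per arm."""
    n = len(control)
    diff = sum(treated) / n - sum(control) / n
    return diff / math.sqrt(2.0 / n)

def run_trial(effect, per_look=50):
    """Accrue per_look patients per arm at each of two looks;
    reject (and stop) as soon as the boundary is crossed."""
    control, treated = [], []
    for look in (1, 2):
        control += [random.gauss(0.0, 1.0) for _ in range(per_look)]
        treated += [random.gauss(effect, 1.0) for _ in range(per_look)]
        if z_stat(control, treated) > CRIT:
            return look          # rejected at this look
    return None                  # failed to reject at the final look

print(run_trial(effect=0.8))
```

With a genuine effect the trial usually stops at the first look, which is the ethical and economic appeal of these designs; the cost is the inflated per-look critical value needed to preserve the overall error rate.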
NASA Technical Reports Server (NTRS)
Rowell, Lawrence F.; Davis, John S.
1989-01-01
The Environment for Application Software Integration and Execution (EASIE) provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers that face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational database management system. Volume 1, Executive Overview, gives an overview of the functions provided by EASIE and describes their use. Three operational design systems based upon the EASIE software are briefly described.
Sexual health education interventions for young people: a methodological review.
Oakley, A.; Fullerton, D.; Holland, J.; Arnold, S.; France-Dawson, M.; Kelley, P.; McGrellis, S.
1995-01-01
OBJECTIVES--To locate reports of sexual health education interventions for young people, assess the methodological quality of evaluations, identify the subgroup with a methodologically sound design, and assess the evidence with respect to the effectiveness of different approaches to promoting young people's sexual health. DESIGN--Survey of reports in English by means of electronic databases and hand searches for relevant studies conducted in the developed world since 1982. Papers were reviewed for eight methodological qualities. The evidence on effectiveness generated by studies meeting four core criteria was assessed. Judgments on effectiveness by reviewers and authors were compared. PAPERS--270 papers reporting sexual health interventions. MAIN OUTCOME MEASURE--The methodological quality of evaluations. RESULTS--73 reports of evaluations of sexual health interventions examining the effectiveness of these interventions in changing knowledge, attitudes, or behavioural outcomes were identified, of which 65 were separate outcome evaluations. Of these studies, 45 (69%) lacked random control groups, 44 (68%) failed to present preintervention and 38 (59%) postintervention data, and 26 (40%) omitted to discuss the relevance of loss of data caused by drop outs. Only 12 (18%) of the 65 outcome evaluations were judged to be methodologically sound. Academic reviewers were more likely than authors to judge studies as unclear because of design faults. Only two of the sound evaluations recorded interventions which were effective in showing an impact on young people's sexual behaviour. CONCLUSIONS--The design of evaluations in sexual health intervention needs to be improved so that reliable evidence of the effectiveness of different approaches to promoting young people's sexual health may be generated. PMID:7833754
Embedding learning from adverse incidents: a UK case study.
Eshareturi, Cyril; Serrant, Laura
2017-04-18
Purpose: This paper reports on a regionally based UK study uncovering what has worked well in learning from adverse incidents in hospitals. The purpose of this paper is to review the incident investigation methodology used in identifying strengths or weaknesses and explore the use of a database as a tool to embed learning. Design/methodology/approach: Documentary examination was conducted of all adverse incidents reported between 1 June 2011 and 30 June 2012 by three UK National Health Service hospitals. One root cause analysis report per adverse incident for each individual hospital was sent to an advisory group for a review. Using terms of reference supplied, the advisory group feedback was analysed using an inductive thematic approach. The emergent themes led to the generation of questions which informed seven in-depth semi-structured interviews. Findings: "Time" and "work pressures" were identified as barriers to using adverse incident investigations as tools for quality enhancement. Methodologically, a weakness in approach was that no criteria influenced the techniques which were used in investigating adverse incidents. Regarding the sharing of learning, the use of a database as a tool to embed learning across the region was not supported. Practical implications: Softer intelligence from adverse incident investigations could be usefully shared between hospitals through a regional forum. Originality/value: The use of a database as a tool to facilitate the sharing of learning from adverse incidents across the health economy is not supported.
Generation of the Ares I-X Flight Test Vehicle Aerodynamic Data Book and Comparison To Flight
NASA Technical Reports Server (NTRS)
Bauer, Steven X.; Krist, Steven E.; Compton, William B.
2011-01-01
A 3.5-year effort to characterize the aerodynamic behavior of the Ares I-X Flight Test Vehicle (AIX FTV) is described in this paper. The AIX FTV was designed to be representative of the Ares I Crew Launch Vehicle (CLV). While there are several differences in the outer mold line from the current revision of the CLV, the overall length, mass distribution, and flight systems of the two vehicles are very similar. This paper briefly touches on each of the aerodynamic databases developed in the program, describing the methodology employed, experimental and computational contributions to the generation of the databases, and how well the databases and underlying computations compare to actual flight test results.
NASA Technical Reports Server (NTRS)
Moroh, Marsha
1988-01-01
A methodology was developed for building interfaces from resident database management systems to DAVID, a heterogeneous distributed database management system under development at NASA. The feasibility of the methodology was demonstrated by constructing the software necessary to perform the interface task. The interface terminology developed in the course of this research is presented, and the work performed and the results are summarized.
SBROME: a scalable optimization and module matching framework for automated biosystems design.
Huynh, Linh; Tsoukalas, Athanasios; Köppe, Matthias; Tagkopoulos, Ilias
2013-05-17
The development of a scalable framework for biodesign automation is a formidable challenge given the expected increase in part availability and the ever-growing complexity of synthetic circuits. To allow for (a) the use of previously constructed and characterized circuits or modules and (b) the implementation of designs that can scale up to hundreds of nodes, we here propose a divide-and-conquer Synthetic Biology Reusable Optimization Methodology (SBROME). An abstract user-defined circuit is first transformed and matched against a module database that incorporates circuits that have previously been experimentally characterized. Then the resulting circuit is decomposed to subcircuits that are populated with the set of parts that best approximate the desired function. Finally, all subcircuits are subsequently characterized and deposited back to the module database for future reuse. We successfully applied SBROME toward two alternative designs of a modular 3-input multiplexer that utilize pre-existing logic gates and characterized biological parts.
NASA Technical Reports Server (NTRS)
Brenton, James C.; Barbre, Robert E., Jr.; Decker, Ryan K.; Orcutt, John M.
2018-01-01
The National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center (MSFC) Natural Environments Branch (EV44) has provided atmospheric databases and analysis in support of space vehicle design and day-of-launch operations for NASA and commercial launch vehicle programs launching from the NASA Kennedy Space Center (KSC), co-located on the United States Air Force's Eastern Range (ER) at the Cape Canaveral Air Force Station. The ER complex is one of the most heavily instrumented sites in the United States, with over 31 towers measuring various atmospheric parameters on a continuous basis. An inherent challenge with large sets of data is ensuring that erroneous data are removed from databases, and thus excluded from launch vehicle design analyses. EV44 has put forth great effort in developing quality control (QC) procedures for individual meteorological instruments; however, no standard QC procedures for all databases currently exist, resulting in QC databases that have inconsistencies in variables, methodologies, and periods of record. The goal of this activity is to use the previous efforts by EV44 to develop a standardized set of QC procedures from which to build meteorological databases from KSC and the ER, while maintaining open communication with end users from the launch community to develop ways to improve, adapt and grow the QC database. Details of the QC procedures will be described. As the rate of launches increases with additional launch vehicle programs, it is becoming more important that weather databases are continually updated and checked for data quality before use in launch vehicle design and certification analyses.
Tate, Robyn L; McDonald, Skye; Perdices, Michael; Togher, Leanne; Schultz, Regina; Savage, Sharon
2008-08-01
Rating scales that assess methodological quality of clinical trials provide a means to critically appraise the literature. Scales are currently available to rate randomised and non-randomised controlled trials, but there are none that assess single-subject designs. The Single-Case Experimental Design (SCED) Scale was developed for this purpose and evaluated for reliability. Six clinical researchers who were trained and experienced in rating methodological quality of clinical trials developed the scale and participated in reliability studies. The SCED Scale is an 11-item rating scale for single-subject designs, of which 10 items are used to assess methodological quality and use of statistical analysis. The scale was developed and refined over a 3-year period. Content validity was addressed by identifying items to reduce the main sources of bias in single-case methodology as stipulated by authorities in the field, which were empirically tested against 85 published reports. Inter-rater reliability was assessed using a random sample of 20/312 single-subject reports archived in the Psychological Database of Brain Impairment Treatment Efficacy (PsycBITE). Inter-rater reliability for the total score was excellent, both for individual raters (overall ICC = 0.84; 95% confidence interval 0.73-0.92) and for consensus ratings between pairs of raters (overall ICC = 0.88; 95% confidence interval 0.78-0.95). Item reliability was fair to excellent for consensus ratings between pairs of raters (range k = 0.48 to 1.00). The results were replicated with two independent novice raters who were trained in the use of the scale (ICC = 0.88, 95% confidence interval 0.73-0.95). The SCED Scale thus provides a brief and valid evaluation of methodological quality of single-subject designs, with the total score demonstrating excellent inter-rater reliability using both individual and consensus ratings. 
Items from the scale can also be used as a checklist in the design, reporting and critical appraisal of single-subject designs, thereby assisting to improve standards of single-case methodology.
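The per-item agreement statistic reported above (k, Cohen's kappa) can be sketched in a few lines; the two rater vectors below are illustrative data, not ratings from the PsycBITE study.

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa: chance-corrected agreement between two raters
    assigning categorical item ratings."""
    n = len(rater1)
    # Observed proportion of agreement
    po = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Expected agreement by chance, from each rater's marginal counts
    c1, c2 = Counter(rater1), Counter(rater2)
    pe = sum(c1[k] * c2[k] for k in c1) / n**2
    return (po - pe) / (1 - pe)

r1 = ["yes", "yes", "no", "yes", "no", "no", "yes", "no"]
r2 = ["yes", "no",  "no", "yes", "no", "yes", "yes", "no"]
print(round(cohens_kappa(r1, r2), 2))  # -> 0.5
```

On these illustrative ratings, observed agreement is 0.75 against a chance expectation of 0.5, giving kappa = 0.5, which would fall in the "fair" band of the range reported above (k = 0.48 to 1.00).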
Clinical study of the Erlanger silver catheter--data management and biometry.
Martus, P; Geis, C; Lugauer, S; Böswald, M; Guggenbichler, J P
1999-01-01
The clinical evaluation of venous catheters for catheter-induced infections must conform to a strict biometric methodology. The statistical planning of the study (target population, design, degree of blinding), data management (database design, definition of variables, coding), quality assurance (data inspection at several levels) and the biometric evaluation of the Erlanger silver catheter project are described. The three-step data flow included: 1) primary data from the hospital, 2) relational database, 3) files accessible for statistical evaluation. Two different statistical models were compared: analyzing the first catheter only of a patient in the analysis (independent data) and analyzing several catheters from the same patient (dependent data) by means of the generalized estimating equations (GEE) method. The main result of the study was based on the comparison of both statistical models.
ERIC Educational Resources Information Center
Fleming, Steven T.
1992-01-01
The concept of risk-adjusted measures of quality is discussed, and a methodology is proposed for risk-adjusting and integrating multiple adverse outcomes of anesthesia services into measures for quality assurance and quality improvement programs. Although designed for a new anesthesiology database, the methods should apply to other health…
ERIC Educational Resources Information Center
Oduwole, Adebambo Adewale; Oyewumi, Olatundun
2010-01-01
Purpose: This study aims to examine the accessibility and use of web-based electronic databases on the Health InterNetwork Access to Research Initiative (HINARI) portal by physicians in the Neuropsychiatric Hospital, Aro--a psychiatry health institution in Nigeria. Design/methodology/approach: Collection of data was through the use of a three-part…
Composite Structures Damage Tolerance Analysis Methodologies
NASA Technical Reports Server (NTRS)
Chang, James B.; Goyal, Vinay K.; Klug, John C.; Rome, Jacob I.
2012-01-01
This report presents the results of a literature review conducted as part of the development of composite hardware fracture control guidelines funded by the NASA Engineering and Safety Center (NESC) under contract NNL04AA09B. The objective of the overall development tasks is to provide broad information and a database to the designers, analysts, and testing personnel who are engaged in space flight hardware production.
NASA Technical Reports Server (NTRS)
Tan, Choon-Sooi; Suder, Kenneth (Technical Monitor)
2003-01-01
A framework for an effective computational methodology for characterizing the stability and the impact of distortion in high-speed multi-stage compressors is being developed. The methodology consists of using a few isolated-blade-row Navier-Stokes solutions for each blade row to construct a body force database. The purpose of the body force database is to replace each blade row in a multi-stage compressor by a body force distribution that produces the same pressure rise and flow turning. To do this, each body force database is generated in such a way that it can respond to changes in local flow conditions. Once the database is generated, no further Navier-Stokes computations are necessary. The process is repeated for every blade row in the multi-stage compressor. The body forces are then embedded as source terms in an Euler solver. The method is developed to have the capability to compute the performance in a flow that has radial as well as circumferential non-uniformity with a length scale larger than a blade pitch; thus it can potentially be used to characterize the stability of a compressor under design. It is these two latter features, as well as the accompanying procedure to obtain the body force representation, that distinguish the present methodology from the streamline curvature method. The overall computational procedures have been developed. A dimensional analysis was carried out to determine the local flow conditions for parameterizing the magnitudes of the local body force representation of blade rows. An Euler solver was modified to embed the body forces as source terms. The results from the dimensional analysis show that the body forces can be parameterized in terms of the two relative flow angles, the relative Mach number, and the Reynolds number. For flow in a high-speed transonic blade row, they can be parameterized in terms of the local relative Mach number alone.
Baxter, Siyan; Sanderson, Kristy; Venn, Alison J; Blizzard, C Leigh; Palmer, Andrew J
2014-01-01
To determine the relationship between return on investment (ROI) and quality of study methodology in workplace health promotion programs. Data were obtained through a systematic literature search of the National Health Service Economic Evaluation Database (NHS EED), Database of Abstracts of Reviews of Effects (DARE), Health Technology Assessment (HTA) Database, Cost Effectiveness Analysis (CEA) Registry, EconLit, PubMed, Embase, Wiley, and Scopus. Included were articles written in English or German reporting the cost(s) and benefit(s) of single- or multicomponent health promotion programs for working adults. Return-to-work and workplace injury prevention studies were excluded. Methodological quality was graded using the British Medical Journal Economic Evaluation Working Party checklist. Economic outcomes were presented as ROI. ROI was calculated as ROI = (benefits - costs of program)/costs of program. Results were weighted by study size and combined using meta-analysis techniques. Sensitivity analysis was performed using two additional methodological quality checklists. The influences of quality score and important study characteristics on ROI were explored. Fifty-one studies (61 intervention arms) published between 1984 and 2012 included 261,901 participants and 122,242 controls from nine industry types across 12 countries. Methodological quality scores were highly correlated between checklists (r = .84-.93). Methodological quality improved over time. Overall weighted ROI [mean ± standard deviation (confidence interval)] was 1.38 ± 1.97 (1.38-1.39), which indicated a 138% return on investment. When accounting for methodological quality, an inverse relationship to ROI was found. High-quality studies (n = 18) had a smaller mean ROI, 0.26 ± 1.74 (.23-.30), compared to moderate (n = 16) 0.90 ± 1.25 (.90-.91) and low-quality (n = 27) 2.32 ± 2.14 (2.30-2.33) studies. Randomized control trials (RCTs) (n = 12) exhibited negative ROI, -0.22 ± 2.41(-.27 to -.16). 
Financial returns become increasingly positive across quasi-experimental, nonexperimental, and modeled studies: 1.12 ± 2.16 (1.11-1.14), 1.61 ± 0.91 (1.56-1.65), and 2.05 ± 0.88 (2.04-2.06), respectively. Overall, mean weighted ROI in workplace health promotion demonstrated a positive ROI. Higher methodological quality studies provided evidence of smaller financial returns. Methodological quality and study design are important determinants.
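The ROI definition used in the review above is simple enough to state directly; the dollar figures below are illustrative values, not data from any of the included studies.

```python
def roi(benefits, costs):
    """Return on investment as defined in the review:
    ROI = (benefits - costs of program) / costs of program."""
    if costs <= 0:
        raise ValueError("program costs must be positive")
    return (benefits - costs) / costs

# A hypothetical program costing $100,000 that yields $238,000
# in benefits gives ROI = 1.38, i.e. a 138% return.
print(roi(238_000, 100_000))  # -> 1.38
```

Note that ROI = 0 means the program exactly paid for itself; the negative mean ROI reported for RCTs means measured benefits fell short of program costs.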
Data Model and Relational Database Design for Highway Runoff Water-Quality Metadata
Granato, Gregory E.; Tessler, Steven
2001-01-01
A National highway and urban runoff water-quality metadatabase was developed by the U.S. Geological Survey in cooperation with the Federal Highway Administration as part of the National Highway Runoff Water-Quality Data and Methodology Synthesis (NDAMS). The database was designed to catalog available literature and to document results of the synthesis in a format that would facilitate current and future research on highway and urban runoff. This report documents the design and implementation of the NDAMS relational database, which was designed to provide a catalog of available information and the results of an assessment of the available data. All the citations and the metadata collected during the review process are presented in a stratified metadatabase that contains citations for relevant publications, abstracts (or previa), and report-review metadata for a sample of selected reports that document results of runoff quality investigations. The database is referred to as a metadatabase because it contains information about available data sets rather than a record of the original data. The database contains the metadata needed to evaluate and characterize how valid, current, complete, comparable, and technically defensible published and available information may be when evaluated for application to the different data-quality objectives as defined by decision makers. This database is a relational database, in that all information is ultimately linked to a given citation in the catalog of available reports. The main database file contains 86 tables consisting of 29 data tables, 11 association tables, and 46 domain tables. The data tables all link to a particular citation, and each data table is focused on one aspect of the information collected in the literature search and the evaluation of available information. 
This database is implemented in the Microsoft (MS) Access database software because it is widely used within and outside of government and is familiar to many existing and potential customers. The stratified metadatabase design for the NDAMS program is presented in the MS Access file DBDESIGN.mdb and documented with a data dictionary in the NDAMS_DD.mdb file recorded on the CD-ROM. The data dictionary file includes complete documentation of the table names, table descriptions, and information about each of the 419 fields in the database.
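The citation-centred design described above, in which every data table links back to an entry in the catalog of reports, can be sketched with SQLite; the table and column names here are illustrative stand-ins, not the actual NDAMS schema.

```python
import sqlite3

# Minimal sketch of a citation-centred relational design:
# every data table carries a foreign key back to the citation
# catalog, so all metadata stays traceable to a source report.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE citation (
    citation_id INTEGER PRIMARY KEY,
    title       TEXT NOT NULL,
    year        INTEGER
);
CREATE TABLE report_review (
    review_id   INTEGER PRIMARY KEY,
    citation_id INTEGER NOT NULL REFERENCES citation(citation_id),
    data_quality_note TEXT
);
""")
con.execute("INSERT INTO citation VALUES (1, 'Example runoff study', 1998)")
con.execute("INSERT INTO report_review VALUES (1, 1, 'complete QA/QC metadata')")
row = con.execute("""
    SELECT c.title, r.data_quality_note
    FROM report_review r JOIN citation c USING (citation_id)
""").fetchone()
print(row)  # -> ('Example runoff study', 'complete QA/QC metadata')
```

The design choice is the one the report emphasizes: because every row ultimately joins back to `citation`, any metadata record can be traced to the report it came from.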
Authentic leadership in healthcare: a scoping review.
Malila, Niina; Lunkka, Nina; Suhonen, Marjo
2018-02-05
Purpose The purpose of this paper is to review peer-reviewed original research articles on authentic leadership (AL) in health care to identify potential research gaps and present recommendations for future research. The objectives are to examine and map evidence of the main characteristics, research themes and methodologies in the studies. AL is a leader's non-authoritarian, ethical and transparent behaviour pattern. Design/methodology/approach A scoping review with thematic analysis was conducted. A three-step search strategy was used with database and manual searches. The included studies were composed of English language peer-reviewed original research articles referring to both AL and health care. Findings In total, 29 studies were included. The studies favoured Canadian nurses in acute care hospitals. AL was understood as its original definition. The review identified four research themes: well-being at work, patient care quality, work environment and AL promotion. Quantitative research methodology with the authentic leadership questionnaire and cross-sectional design were prevalent. Research limitations/implications Future research needs more variation in research themes, study populations, settings, organisations, work sectors, geographical origins and theory perspectives. Different research methodologies, such as qualitative and mixed methods research and longitudinal designs, should be used more. Originality/value This is presumably the first literature review to map the research on AL in health care.
microRNAs Databases: Developmental Methodologies, Structural and Functional Annotations.
Singh, Nagendra Kumar
2017-09-01
microRNA (miRNA) is an endogenous and evolutionarily conserved non-coding RNA involved in post-transcriptional processes as a gene repressor and in mRNA cleavage through RNA-induced silencing complex (RISC) formation. In the RISC, miRNA binds in complementary base pairs with the targeted mRNA along with an Argonaute protein complex, causing gene repression or endonucleolytic cleavage of mRNAs, and is implicated in many diseases and syndromes. After the discovery of the miRNAs lin-4 and let-7, large numbers of miRNAs involved in various biological and metabolic processes were discovered by low-throughput and high-throughput experimental techniques along with computational methods. miRNAs are important non-coding RNAs for understanding the complex biological phenomena of organisms because they control gene regulation. This paper reviews miRNA databases, with structural and functional annotations, developed by various researchers. These databases contain structural and functional information on animal, plant and virus miRNAs, including miRNA-associated diseases, stress resistance in plants, miRNAs taking part in various biological processes, effects of miRNA interactions on drugs and the environment, effects of variants on miRNAs, miRNA gene expression analysis, miRNA sequences and miRNA structures. This review focuses on the developmental methodology of miRNA databases, such as the computational tools and methods used for extracting miRNA annotations from different resources or through experiment. This study also discusses the user interface design of each database along with its current entries and annotations (pathways, gene ontology, disease ontology, etc.). An integrated schematic diagram of the construction process for the databases is also drawn, along with tabular and graphical comparisons of the various types of entries in different databases. The aim of this paper is to present the importance of miRNA-related resources in a single place.
Benefits of an Object-oriented Database Representation for Controlled Medical Terminologies
Gu, Huanying; Halper, Michael; Geller, James; Perl, Yehoshua
1999-01-01
Objective: Controlled medical terminologies (CMTs) have been recognized as important tools in a variety of medical informatics applications, ranging from patient-record systems to decision-support systems. Controlled medical terminologies are typically organized in semantic network structures consisting of tens to hundreds of thousands of concepts. This overwhelming size and complexity can be a serious barrier to their maintenance and widespread utilization. The authors propose the use of object-oriented databases to address the problems posed by the extensive scope and high complexity of most CMTs for maintenance personnel and general users alike. Design: The authors present a methodology that allows an existing CMT, modeled as a semantic network, to be represented as an equivalent object-oriented database. Such a representation is called an object-oriented health care terminology repository (OOHTR). Results: The major benefit of an OOHTR is its schema, which provides an important layer of structural abstraction. Using the high-level view of a CMT afforded by the schema, one can gain insight into the CMT's overarching organization and begin to better comprehend it. The authors' methodology is applied to the Medical Entities Dictionary (MED), a large CMT developed at Columbia-Presbyterian Medical Center. Examples of how the OOHTR schema facilitated updating, correcting, and improving the design of the MED are presented. Conclusion: The OOHTR schema can serve as an important abstraction mechanism for enhancing comprehension of a large CMT, and thus promotes its usability. PMID:10428002
Breach Risk Magnitude: A Quantitative Measure of Database Security.
Yasnoff, William A
2016-01-01
A quantitative methodology is described that provides objective evaluation of the potential for health record system breaches. It assumes that breach risk increases with the number of potential records that could be exposed, while it decreases when more authentication steps are required for access. The breach risk magnitude (BRM) is the maximum value for any system user of the common logarithm of the number of accessible database records divided by the number of authentication steps needed to achieve such access. For a one million record relational database, the BRM varies from 5.52 to 6 depending on authentication protocols. For an alternative data architecture designed specifically to increase security by separately storing and encrypting each patient record, the BRM ranges from 1.3 to 2.6. While the BRM only provides a limited quantitative assessment of breach risk, it may be useful to objectively evaluate the security implications of alternative database organization approaches.
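The BRM definition given above (common logarithm of accessible records divided by authentication steps, maximized over users) translates directly into code; the per-record-encryption example values below are illustrative, not figures from the article.

```python
from math import log10

def breach_risk_magnitude(accessible_records, auth_steps):
    """Breach risk magnitude for one user: log10 of the number of
    database records accessible to that user, divided by the number
    of authentication steps needed to achieve such access. Per the
    paper, the system BRM is the maximum of this over all users."""
    return log10(accessible_records) / auth_steps

# One-million-record relational database, single authentication step:
print(breach_risk_magnitude(1_000_000, 1))  # -> 6.0
# An architecture that encrypts and stores each record separately
# exposes far fewer records per access (illustrative numbers):
print(breach_risk_magnitude(20, 2))
```

The formula makes the trade-off explicit: risk grows only logarithmically with record count but falls linearly with each added authentication step, which is why the per-record-encryption architecture scores so much lower.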
Concepts and data model for a co-operative neurovascular database.
Mansmann, U; Taylor, W; Porter, P; Bernarding, J; Jäger, H R; Lasjaunias, P; Terbrugge, K; Meisel, J
2001-08-01
The clinical management of neurovascular diseases poses very complex problems, owing to the chronic character of the diseases, a long history of symptoms and diverse treatments. If patients are to benefit from treatment, then treatment decisions have to rely on reliable and accurate knowledge of the natural history of the disease and the various treatments. Recent developments in statistical methodology and experience from electronic patient records are used to establish an information infrastructure based on a centralized register. A protocol for collecting data on neurovascular diseases is described, together with the technical and logistical aspects of implementing the database. The database is designed as a co-operative tool for audit and research available to co-operating centres. When a database is linked to systematic patient follow-up, it can be used to study prognosis. Careful analysis of patient outcome is valuable for decision-making.
Comparison of two drug safety signals in a pharmacovigilance data mining framework.
Tubert-Bitter, Pascale; Bégaud, Bernard; Ahmed, Ismaïl
2016-04-01
Since adverse drug reactions are a major public health concern, early detection of drug safety signals has become a top priority for regulatory agencies and the pharmaceutical industry. Quantitative methods for analyzing spontaneous reporting material recorded in pharmacovigilance databases through data mining have been proposed in the last decades and are increasingly used to flag potential safety problems. While automated data mining is motivated by the usually huge size of pharmacovigilance databases, it does not systematically produce relevant alerts. Moreover, each detected signal requires appropriate assessment that may involve investigation of the whole therapeutic class. The goal of this article is to provide a methodology for comparing two detected signals. It is nested within the automated surveillance framework as (1) no extra information is required and (2) no simple inference on the actual risks can be extrapolated from spontaneous reporting data. We designed our methodology on the basis of two classical methods used for automated signal detection: the Bayesian Gamma Poisson Shrinker and the frequentist Proportional Reporting Ratio. A simulation study was conducted to assess the performances of both proposed methods. The latter were used to compare cardiovascular signals for two HIV treatments from the French pharmacovigilance database. © The Author(s) 2012.
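One of the two classical detection methods named above, the frequentist Proportional Reporting Ratio, has a standard definition from a 2x2 table of spontaneous reports; the counts below are illustrative only, not data from the French pharmacovigilance database.

```python
def prr(a, b, c, d):
    """Proportional Reporting Ratio from a 2x2 contingency table
    of spontaneous reports:
        a = reports with the drug and the event of interest
        b = reports with the drug and other events
        c = reports with other drugs and the event
        d = reports with other drugs and other events
    PRR = (a / (a + b)) / (c / (c + d))."""
    return (a / (a + b)) / (c / (c + d))

# Illustrative counts: the event appears in 10% of the drug's
# reports versus about 1% of all other drugs' reports.
print(prr(a=20, b=180, c=100, d=9700))  # -> 9.8
```

A PRR well above 1 (here 9.8) means the event is reported disproportionately often with the drug of interest, which is what flags a potential signal for further assessment.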
System perspectives for mobile platform design in m-Health
NASA Astrophysics Data System (ADS)
Roveda, Janet M.; Fink, Wolfgang
2016-05-01
Advances in integrated circuit technologies have led to the integration of medical sensor front ends with data processing circuits, i.e., mobile platform design for wearable sensors. We discuss design methodologies for wearable sensor nodes and their applications in m-Health. From the user perspective, flexibility, comfort, appearance, fashion, ease-of-use, and visibility are key form factors. From the technology development point of view, high accuracy, low power consumption, and high signal to noise ratio are desirable features. From the embedded software design standpoint, real time data analysis algorithms, application and database interfaces are the critical components to create successful wearable sensor-based products.
Data mining of text as a tool in authorship attribution
NASA Astrophysics Data System (ADS)
Visa, Ari J. E.; Toivonen, Jarmo; Autio, Sami; Maekinen, Jarno; Back, Barbro; Vanharanta, Hannu
2001-03-01
It is common for text documents to be characterized and classified by keywords that their authors assign to them. Visa et al. have developed a new methodology based on prototype matching. The prototype is an interesting document or a part of an extracted, interesting text. This prototype is matched against the document database of the monitored document flow. The new methodology is capable of extracting the meaning of the document to a certain degree. Our claim is that the new methodology is also capable of authenticating authorship. To verify this claim, two tests were designed. The test hypothesis was that the words and the word order in the sentences could authenticate the author. In the first test three authors were selected: William Shakespeare, Edgar Allan Poe, and George Bernard Shaw. Three texts from each author were examined. Each text was in turn used as a prototype, and the two nearest matches with the prototype were noted. The second test uses the Reuters-21578 financial news database. A group of 25 short financial news reports from five different authors is examined. Our new methodology and the interesting results from the two tests are reported in this paper. In the first test, all cases were successful for Shakespeare and for Poe; for Shaw, one text was confused with Poe. In the second test, the authors of the Reuters-21578 financial news reports were identified relatively well. The conclusion is that our text mining methodology seems to be capable of authorship attribution.
NASA Astrophysics Data System (ADS)
Barriendos, Mariano; Carles Balasch Solanes, Josep; Tuset, Jordi; Lluís Ruiz-Bellet, Josep
2014-05-01
Available information on historical floods can improve the management of hydroclimatic hazards. This approach is useful in ungauged basins or where instrumental data series are short. Moreover, flood risk is increasing due to both the expansion of human land occupation and the modification of rainfall patterns in the present global climate change scenario. Within the Prediflood Project, we have designed an integrated database of historical floods in Catalonia with the aim of feeding data to: 1) meteorological reconstruction and modelling, 2) hydrological and hydraulic reconstruction, and 3) evaluation of the human impacts of these floods. The first steps of the database design focus on spatial location and on the quality of the data sources at three levels: 1) historical documentary sources and newspapers contemporary with the floods, 2) local historiography, and 3) technical reports. After the application of historiographical methodologies, more than 2300 flood records have been added to the database so far. Although completion of the database is still a work in progress, the first analyses are already underway and focus on the largest floods, those with catastrophic effects simultaneously on more than 15 catchments: November 1617, October 1787, September 1842, May 1853, September 1874, January 1898, October 1907, October 1940, September 1962, November 1982, October 1994 and others.
Automatic pattern localization across layout database and photolithography mask
NASA Astrophysics Data System (ADS)
Morey, Philippe; Brault, Frederic; Beisser, Eric; Ache, Oliver; Röth, Klaus-Dieter
2016-03-01
Advanced process photolithography masks require more and more controls for registration versus design and critical dimension uniformity (CDU). The measurement points should be distributed over the whole mask and may be denser in areas critical to wafer overlay requirements. This means that some, if not many, of these controls should be made inside the customer die and may use non-dedicated patterns. It is then mandatory to access the original layout database to select patterns for the metrology process. Finding hundreds of relevant patterns in a database containing billions of polygons may be possible, but in addition, the complete metrology job must be created quickly and reliably. Combining, on one hand, software expertise in mask database processing and, on the other hand, advanced skills in control and registration equipment, we have developed a Mask Dataprep Station able to select an appropriate number of measurement targets and their positions in a huge database and automatically create measurement jobs on the corresponding areas on the mask for the registration metrology system. In addition, the required design clips are generated from the database in order to perform the rendering procedure on the metrology system. This new methodology has been validated on a real production line for the most advanced processes. This paper presents the main challenges that we have faced, as well as some results on the global performance.
ERIC Educational Resources Information Center
O'Dea, Jennifer A.
2005-01-01
Purpose: The purpose of this paper is to review current programmes and major issues surrounding preventive interventions for body image and obesity in schools. Design/methodology/approach: A literature review was carried out by analysing papers cited in major literature databases from the last 50 years. This review describes and summarises…
Burstyn, I; Kromhout, H; Cruise, P J; Brennan, P
2000-01-01
The objective of this project was to construct a database of exposure measurements which would be used to retrospectively assess the intensity of various exposures in an epidemiological study of cancer risk among asphalt workers. The database was developed as a stand-alone Microsoft Access 2.0 application, which could work in each of the national centres. Exposure data included in the database comprised measurements of exposure levels, plus supplementary information on production characteristics that was analogous to that used to describe companies enrolled in the study. The database has been successfully implemented in eight countries, demonstrating flexibility and data-security features adequate to the task. The database allowed retrieval and consistent coding of 38 data sets, of which 34 had never been described in peer-reviewed scientific literature. We were able to collect most of the data intended. As of February 1999 the database consisted of 2007 sets of measurements from persons or locations. The measurements appeared to be free from any obvious bias. The methodology embodied in the creation of the database can be usefully employed to develop exposure assessment tools in epidemiological studies.
NASA Technical Reports Server (NTRS)
Brenton, James C.; Barbre, Robert E.; Orcutt, John M.; Decker, Ryan K.
2018-01-01
The National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center (MSFC) Natural Environments Branch (EV44) has provided atmospheric databases and analysis in support of space vehicle design and day-of-launch operations for NASA and commercial launch vehicle programs launching from the NASA Kennedy Space Center (KSC), co-located on the United States Air Force's Eastern Range (ER) at the Cape Canaveral Air Force Station. The ER is one of the most heavily instrumented sites in the United States, measuring various atmospheric parameters on a continuous basis. An inherent challenge with the large databases that EV44 receives from the ER is ensuring that erroneous data are removed from the databases, and thus excluded from launch vehicle design analyses. EV44 has put forth great effort in developing quality control (QC) procedures for individual meteorological instruments; however, no standard QC procedures currently exist for all databases, resulting in QC'd databases with inconsistencies in variables, methodologies, and periods of record. The goal of this activity is to build on EV44's previous efforts to develop a standardized set of QC procedures from which to build flags within the meteorological databases from KSC and the ER, while maintaining open communication with end users from the launch community to develop ways to improve, adapt, and grow the QC database. Details of the QC checks are described. The flagged data points will be plotted in a graphical user interface (GUI) as part of a manual confirmation that the flagged data do indeed need to be removed from the archive. As the rate of launches increases with additional launch vehicle programs, more emphasis is being placed on continually updating and checking weather databases for data quality before use in launch vehicle design and certification analyses.
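The standardized flagging workflow described in this abstract can be sketched in miniature. The field names, thresholds, and checks below are invented for illustration and are not EV44's actual QC criteria; a real pass would cover many more variables and instrument-specific rules:

```python
# Hypothetical QC flag pass over weather records: a range check per field
# and a step (rate-of-change) check on wind speed. All limits are invented.
RANGE_LIMITS = {"wind_speed_ms": (0.0, 120.0), "temperature_c": (-50.0, 55.0)}
MAX_SPEED_STEP = 25.0  # illustrative max plausible jump between samples (m/s)

def qc_flags(records):
    """Return one flag set per record; an empty set means all checks passed."""
    flags, prev_speed = [], None
    for rec in records:
        f = set()
        for field, (lo, hi) in RANGE_LIMITS.items():
            v = rec.get(field)
            if v is None:
                f.add("missing:" + field)       # value absent entirely
            elif not lo <= v <= hi:
                f.add("range:" + field)         # physically implausible value
        speed = rec.get("wind_speed_ms")
        if speed is not None and prev_speed is not None \
                and abs(speed - prev_speed) > MAX_SPEED_STEP:
            f.add("step:wind_speed_ms")         # suspicious jump vs. prior sample
        if speed is not None:
            prev_speed = speed
        flags.append(f)
    return flags

obs = [
    {"wind_speed_ms": 8.2,   "temperature_c": 21.0},
    {"wind_speed_ms": 180.0, "temperature_c": 21.3},  # fails range and step checks
    {"wind_speed_ms": 9.0,   "temperature_c": 21.1},  # fails step check vs. 180.0
]
print(qc_flags(obs))
```

Flagged records would then be surfaced in a GUI for the manual confirmation step the abstract describes, rather than deleted automatically.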
Methodological quality of systematic reviews on treatments for depression: a cross-sectional study.
Chung, V C H; Wu, X Y; Feng, Y; Ho, R S T; Wong, S Y S; Threapleton, D
2017-05-02
Depression is one of the most common mental disorders and identifying effective treatment strategies is crucial for the control of depression. Well-conducted systematic reviews (SRs) and meta-analyses can provide the best evidence for supporting treatment decision-making. Nevertheless, the trustworthiness of conclusions can be limited by lack of methodological rigour. This study aims to assess the methodological quality of a representative sample of SRs on depression treatments. A cross-sectional study on the bibliographical and methodological characteristics of SRs published on depression treatments trials was conducted. Two electronic databases (the Cochrane Database of Systematic Reviews and the Database of Abstracts of Reviews of Effects) were searched for potential SRs. SRs with at least one meta-analysis on the effects of depression treatments were considered eligible. The methodological quality of included SRs was assessed using the validated AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool. The associations between bibliographical characteristics and scoring on AMSTAR items were analysed using logistic regression analysis. A total of 358 SRs were included and appraised. Over half of included SRs (n = 195) focused on non-pharmacological treatments and harms were reported in 45.5% (n = 163) of all studies. Studies varied in methods and reporting practices: only 112 (31.3%) took the risk of bias among primary studies into account when formulating conclusions; 245 (68.4%) did not fully declare conflict of interests; 93 (26.0%) reported an 'a priori' design and 104 (29.1%) provided lists of both included and excluded studies. 
Results from regression analyses showed: more recent publications were more likely to report 'a priori' designs [adjusted odds ratio (AOR) 1.31, 95% confidence interval (CI) 1.09-1.57], to describe study characteristics fully (AOR 1.16, 95% CI 1.06-1.28), and to assess presence of publication bias (AOR 1.13, 95% CI 1.06-1.19), but were less likely to list both included and excluded studies (AOR 0.86, 95% CI 0.81-0.92). SRs published in journals with higher impact factor (AOR 1.14, 95% CI 1.04-1.25), completed by more review authors (AOR 1.12, 95% CI 1.01-1.24) and SRs on non-pharmacological treatments (AOR 1.62, 95% CI 1.01-2.59) were associated with better performance in publication bias assessment. The methodological quality of included SRs is disappointing. Future SRs should strive to improve rigour by considering of risk of bias when formulating conclusions, reporting conflict of interests and authors should explicitly describe harms. SR authors should also use appropriate methods to combine the results, prevent language and publication biases, and ensure timely updates.
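The adjusted odds ratios above come from logistic regression models. As a simpler illustration of the underlying quantity, here is a minimal sketch of an unadjusted odds ratio with a Wald 95% confidence interval computed from a 2x2 table; the counts are invented, not taken from the study:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Wald CI from a 2x2 table:
    a, b = exposed with/without outcome; c, d = unexposed with/without outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 40/60 recent SRs vs 25/75 older SRs report an 'a priori' design
print(odds_ratio_ci(40, 60, 25, 75))
```

Adjusted ORs additionally control for the other covariates in the regression, which this 2x2 sketch does not attempt.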
Grant, A. M.; Richard, Y.; Deland, E.; Després, N.; de Lorenzi, F.; Dagenais, A.; Buteau, M.
1997-01-01
The Autocontrol methodology has been developed to support the optimisation of decision-making and the use of resources in the context of a clinical unit. The theoretical basis relates to quality assurance and information systems and is influenced by management and cognitive research in the health domain. The methodology uses population rather than individual decision making and, because of its dynamic feedback design, promises to have a rapid and profound effect on practice. Most importantly, the health care professional is the principal user of the Autocontrol system. In this methodology we distinguish three types of evidence necessary for practice change: practice-based or internal evidence; best evidence derived from the literature, or external evidence, concerning the practice in question; and process-based evidence on how to optimise the process of practice change. The software used by the system is of the executive decision support type, which facilitates interrogation of large databases. The Autocontrol system is designed to interrogate the data of the patient medical record; however, the latter often lacks data on concomitant resource use, and this must be supplemented. This paper reviews the Autocontrol methodology and gives examples from current studies. PMID:9357733
Comparative Effectiveness Research in Lung Diseases and Sleep Disorders
Lieu, Tracy A.; Au, David; Krishnan, Jerry A.; Moss, Marc; Selker, Harry; Harabin, Andrea; Connors, Alfred
2011-01-01
The Division of Lung Diseases of the National Heart, Lung, and Blood Institute (NHLBI) held a workshop to develop recommendations on topics, methodologies, and resources for comparative effectiveness research (CER) that will guide clinical decision making about available treatment options for lung diseases and sleep disorders. A multidisciplinary group of experts with experience in efficacy, effectiveness, implementation, and economic research identified (a) what types of studies the domain of CER in lung diseases and sleep disorders should include, (b) the criteria and process for setting priorities, and (c) current resources for and barriers to CER in lung diseases. Key recommendations were to (1) increase efforts to engage stakeholders in developing CER questions and study designs; (2) invest in further development of databases and other infrastructure, including efficient methods for data sharing; (3) make full use of a broad range of study designs; (4) increase the appropriate use of observational designs and the support of methodologic research; (5) ensure that committees that review CER grant applications include persons with appropriate perspective and expertise; and (6) further develop the workforce for CER by supporting training opportunities that focus on the methodologic and practical skills needed. PMID:21965016
Patorno, Elisabetta; Patrick, Amanda R; Garry, Elizabeth M; Schneeweiss, Sebastian; Gillet, Victoria G; Bartels, Dorothee B; Masso-Gonzalez, Elvira; Seeger, John D
2014-11-01
Recent years have witnessed a growing body of observational literature on the association between glucose-lowering treatments and cardiovascular disease. However, many of the studies are based on designs or analyses that inadequately address the methodological challenges involved. We reviewed recent observational literature on the association between glucose-lowering medications and cardiovascular outcomes and assessed the design and analysis methods used, with a focus on their ability to address specific methodological challenges. We describe and illustrate these methodological issues and their impact on observed associations, providing examples from the reviewed literature. We suggest approaches that may be employed to manage these methodological challenges. From the evaluation of 81 publications of observational investigations assessing the association between glucose-lowering treatments and cardiovascular outcomes, we identified the following methodological challenges: 1) handling of temporality in administrative databases; 2) handling of risks that vary with time and treatment duration; 3) definitions of the exposure risk window; 4) handling of exposures that change over time; and 5) handling of confounding by indication. Most of these methodological challenges may be suitably addressed through application of appropriate methods. Observational research plays an increasingly important role in the evaluation of the clinical effects of diabetes treatment. Implementation of appropriate research methods holds the promise of reducing the potential for spurious findings and the risk that the spurious findings will mislead the medical community about risks and benefits of diabetes medications.
A requirements specification for a software design support system
NASA Technical Reports Server (NTRS)
Noonan, Robert E.
1988-01-01
Most existing software design systems (SDSS) support the use of only a single design methodology. A good SDSS should support a wide variety of design methods and languages, including structured design, object-oriented design, and finite state machines. It might seem that a multiparadigm SDSS would be expensive in both time and money to construct. However, it is proposed that instead an extensible SDSS be constructed that directly implements only minimal database and graphical facilities. In particular, it should not directly implement tools to facilitate language definition and analysis. It is believed that such a system could be rapidly developed and put into limited production use, with the experience gained used to refine and evolve the system over time.
NASA Technical Reports Server (NTRS)
Lin, Risheng; Afjeh, Abdollah A.
2003-01-01
Crucial to an efficient aircraft simulation-based design is a robust data modeling methodology for both recording the information and providing data transfer readily and reliably. To meet this goal, data modeling issues involved in multidisciplinary aircraft design are first analyzed in this study. Next, an XML-based, extensible data object model for multidisciplinary aircraft design is constructed and implemented. The implementation of the model through aircraft data binding allows design applications to access and manipulate any disciplinary data with a lightweight and easy-to-use API. In addition, the language-independent representation of aircraft disciplinary data in the model fosters interoperability amongst heterogeneous systems, thereby facilitating data sharing and exchange between various design tools and systems.
Fast 3D shape screening of large chemical databases through alignment-recycling
Fontaine, Fabien; Bolton, Evan; Borodina, Yulia; Bryant, Stephen H
2007-01-01
Background Large chemical databases require fast, efficient, and simple ways of looking for similar structures. Although such tasks are now fairly well resolved for graph-based similarity queries, they remain an issue for 3D approaches, particularly for those based on 3D shape overlays. Inspired by a recent technique developed to compare molecular shapes, we designed a hybrid methodology, alignment-recycling, that enables efficient retrieval and alignment of structures with similar 3D shapes. Results Using a dataset of more than one million PubChem compounds of limited size (< 28 heavy atoms) and flexibility (< 6 rotatable bonds), we obtained a set of a few thousand diverse structures covering entirely the 3D shape space of the conformers of the dataset. Transformation matrices gathered from the overlays between these diverse structures and the 3D conformer dataset allowed us to drastically (100-fold) reduce the CPU time required for shape overlay. The alignment-recycling heuristic produces results consistent with de novo alignment calculation, with better than 80% hit list overlap on average. Conclusion Overlay-based 3D methods are computationally demanding when searching large databases. Alignment-recycling reduces the CPU time to perform shape similarity searches by breaking the alignment problem into three steps: selection of diverse shapes to describe the database shape-space; overlay of the database conformers to the diverse shapes; and non-optimized overlay of query and database conformers using common reference shapes. The precomputation, required by the first two steps, is a significant cost of the method; however, once performed, querying is two orders of magnitude faster. Extensions and variations of this methodology, for example, to handle more flexible and larger small-molecules are discussed. PMID:17880744
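The core of alignment-recycling, reusing precomputed transformation matrices instead of running a de novo overlay per query, can be illustrated with a toy rigid-body example. The 2D shapes and transforms below are invented; the actual method operates on 3D molecular shape overlays, but the matrix composition is the same idea:

```python
import numpy as np

def rigid(theta, tx, ty):
    """Homogeneous 3x3 matrix for a 2D rotation by theta plus translation (tx, ty)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, tx], [s, c, ty], [0.0, 0.0, 1.0]])

def apply(T, pts):
    """Apply a homogeneous transform to an (N, 2) array of points."""
    homog = np.column_stack([pts, np.ones(len(pts))])
    return (homog @ T.T)[:, :2]

ref = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0], [0.2, 0.4]])  # reference shape
T_q = rigid(0.7, 2.0, -1.0)   # stored: poses the reference shape as the query conformer
T_d = rigid(-1.2, 0.5, 3.0)   # stored: poses the reference shape as a database conformer
query, db = apply(T_q, ref), apply(T_d, ref)

# Recycled alignment: query -> reference frame -> database conformer,
# composed from the two precomputed matrices, with no fresh shape overlay.
T_recycled = T_d @ np.linalg.inv(T_q)
print(np.allclose(apply(T_recycled, query), db))
```

The expensive overlay work is done once against the diverse reference shapes; every later query pays only a matrix multiply per database conformer, which is the source of the roughly 100-fold CPU saving the abstract reports.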
Application of Optical Disc Databases and Related Technology to Public Access Settings
1992-03-01
users to download and retain data. A Video Graphics Adapter (VGA) monitor was included. No printer was provided. 2. CD-ROM Product Computer Select, a...download facilities, without printer support, satisfy user needs? A secondary, but significant, objective was avoidance of unnecessary Reader...design of User Log sheets and mitigated against attachment of a printer to the workstation. F. DATA COLLECTION This section describes the methodology
Assessing the impact of healthcare research: A systematic review of methodological frameworks.
Cruz Rivera, Samantha; Kyte, Derek G; Aiyegbusi, Olalekan Lee; Keeley, Thomas J; Calvert, Melanie J
2017-08-01
Increasingly, researchers need to demonstrate the impact of their research to their sponsors, funders, and fellow academics. However, the most appropriate way of measuring the impact of healthcare research is subject to debate. We aimed to identify the existing methodological frameworks used to measure healthcare research impact and to summarise the common themes and metrics in an impact matrix. Two independent investigators systematically searched the Medical Literature Analysis and Retrieval System Online (MEDLINE), the Excerpta Medica Database (EMBASE), the Cumulative Index to Nursing and Allied Health Literature (CINAHL+), the Health Management Information Consortium, and the Journal of Research Evaluation from inception until May 2017 for publications that presented a methodological framework for research impact. We then summarised the common concepts and themes across methodological frameworks and identified the metrics used to evaluate differing forms of impact. Twenty-four unique methodological frameworks were identified, addressing 5 broad categories of impact: (1) 'primary research-related impact', (2) 'influence on policy making', (3) 'health and health systems impact', (4) 'health-related and societal impact', and (5) 'broader economic impact'. These categories were subdivided into 16 common impact subgroups. Authors of the included publications proposed 80 different metrics aimed at measuring impact in these areas. The main limitation of the study was the potential exclusion of relevant articles, as a consequence of the poor indexing of the databases searched. The measurement of research impact is an essential exercise to help direct the allocation of limited research resources, to maximise research benefit, and to help minimise research waste. 
This review provides a collective summary of existing methodological frameworks for research impact, which funders may use to inform the measurement of research impact and researchers may use to inform study design decisions aimed at maximising the short-, medium-, and long-term impact of their research.
2013-01-01
Background Systematic reviews and meta-analyses of home telemonitoring interventions for patients with chronic diseases have increased over the past decade and become increasingly important to a wide range of clinicians, policy makers, and other health care stakeholders. While a few criticisms about their methodological rigor and synthesis approaches have recently appeared, no formal appraisal of their quality has been conducted yet. Objective The primary aim of this critical review was to evaluate the methodology, quality, and reporting characteristics of prior reviews that have investigated the effects of home telemonitoring interventions in the context of chronic diseases. Methods Ovid MEDLINE, the Database of Abstracts of Reviews of Effects (DARE), and the Health Technology Assessment Database (HTA) of the Cochrane Library were electronically searched to find relevant systematic reviews, published between January 1966 and December 2012. Potential reviews were screened and assessed for inclusion independently by three reviewers. Data pertaining to the methods used were extracted from each included review and examined for accuracy by two reviewers. A validated quality assessment instrument, R-AMSTAR, was used as a framework to guide the assessment process. Results Twenty-four reviews, nine of which were meta-analyses, were identified from more than 200 citations. The bibliographic search revealed that the number of published reviews has increased substantially over the years in this area and although most reviews focus on studying the effects of home telemonitoring on patients with congestive heart failure, researcher interest has extended to other chronic diseases as well, such as diabetes, hypertension, chronic obstructive pulmonary disease, and asthma. Nevertheless, an important number of these reviews appear to lack optimal scientific rigor due to intrinsic methodological issues. Also, the overall quality of reviews does not appear to have improved over time.
While several criteria were met satisfactorily by either all or nearly all reviews, such as the establishment of an a priori design with inclusion and exclusion criteria, use of electronic searches on multiple databases, and reporting of studies characteristics, there were other important areas that needed improvement. Duplicate data extraction, manual searches of highly relevant journals, inclusion of gray and non-English literature, assessment of the methodological quality of included studies and quality of evidence were key methodological procedures that were performed infrequently. Furthermore, certain methodological limitations identified in the synthesis of study results have affected the results and conclusions of some reviews. Conclusions Despite the availability of methodological guidelines that can be utilized to guide the proper conduct of systematic reviews and meta-analyses and eliminate potential risks of bias, this knowledge has not yet been fully integrated in the area of home telemonitoring. Further efforts should be made to improve the design, conduct, reporting, and publication of systematic reviews and meta-analyses in this area. PMID:23880072
A New Methodology for Systematic Exploitation of Technology Databases.
ERIC Educational Resources Information Center
Bedecarrax, Chantal; Huot, Charles
1994-01-01
Presents the theoretical aspects of a data analysis methodology that can help transform sequential raw data from a database into useful information, using the statistical analysis of patents as an example. Topics discussed include relational analysis and a technology watch approach. (Contains 17 references.) (LRW)
In silico methodologies for predictive evaluation of toxicity based on integration of databases
Chihae Yang1 and Ann M. Richard2, 1LeadScope, Inc. 1245 Kinnear Rd. Columbus, OH. 43212 2National Health & Environmental Effects Research Lab, U.S. EPA, Research Triangle Park, ...
Lunar base Controlled Ecological Life Support System (LCELSS): Preliminary conceptual design study
NASA Technical Reports Server (NTRS)
Schwartzkopf, Steven H.
1991-01-01
The objective of this study was to develop a conceptual design for a self-sufficient LCELSS. The mission need is for a CELSS with a capacity to supply the life support needs for a nominal crew of 30, and a capability for accommodating a range of crew sizes from 4 to 100 people. The work performed in this study was nominally divided into two parts. In the first part, relevant literature was assembled and reviewed. This review identified LCELSS performance requirements and the constraints and advantages confronting the design. It also collected information on the environment of the lunar surface and identified candidate technologies for the life support subsystems and the systems with which the LCELSS interfaced. Information on the operation and performance of these technologies was collected, along with concepts of how they might be incorporated into the LCELSS conceptual design. The data collected on these technologies was stored for incorporation into the study database. Also during part one, the study database structure was formulated and implemented, and an overall systems engineering methodology was developed for carrying out the study.
NASA Technical Reports Server (NTRS)
Schreiner, Samuel S.; Dominguez, Jesus A.; Sibille, Laurent; Hoffman, Jeffrey A.
2015-01-01
We present a parametric sizing model for a Molten Regolith Electrolysis (MRE) reactor that produces oxygen and molten metals from lunar regolith. The model has a foundation of regolith material properties validated using data from Apollo samples and simulants. A multiphysics simulation of an MRE reactor is developed and leveraged to generate a vast database of reactor performance and design trends. A novel design methodology is created which utilizes this database to parametrically design an MRE reactor that 1) can sustain the required mass of molten regolith, current, and operating temperature to meet the desired oxygen production level, 2) can operate for long durations via joule-heated, cold-wall operation in which molten regolith does not touch the reactor side walls, and 3) can support a range of electrode separations to enable operational flexibility. Mass, power, and performance estimates for an MRE reactor are presented for a range of oxygen production levels. The effects of several design variables are explored, including operating temperature, regolith type/composition, batch time, and the degree of operational flexibility.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faidy, C.; Gilles, P.
The objective of the seminar was to present the current state of the art in Leak-Before-Break (LBB) methodology development, validation, and application in an international forum. With particular emphasis on industrial applications and regulatory policies, the seminar provided an opportunity to compare approaches, experiences, and codifications developed by different countries. The seminar was organized into four topic areas: status of LBB applications; technical issues in LBB methodology; complementary requirements (leak detection and inspection); and LBB assessment and margins. As a result of this seminar, an improved understanding of LBB, gained through the sharing of different viewpoints from different countries, permits consideration of: simplified pipe support design and possible elimination of loss-of-coolant-accident (LOCA) mechanical consequences for specific cases; defense-in-depth applications without support modifications; and support of safety cases for plants designed without the LOCA hypothesis. In support of these activities, better estimates of the limits of the LBB approach should follow, as well as improvements in codifying methodologies. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.
A Systematic Review of Serious Games in Training Health Care Professionals.
Wang, Ryan; DeMaria, Samuel; Goldberg, Andrew; Katz, Daniel
2016-02-01
Serious games are computer-based games designed for training purposes. They are poised to expand their role in medical education. This systematic review, conducted in accordance with PRISMA guidelines, aimed to synthesize current serious gaming trends in health care training, especially those pertaining to developmental methodologies and game evaluation. PubMed, EMBASE, and Cochrane databases were queried for relevant documents published through December 2014. Of the 3737 publications identified, 48 of them, covering 42 serious games, were included. From 2007 to 2014, they demonstrate a growth from 2 games and 2 genres to 42 games and 8 genres. Overall, study design was heterogeneous, and methodological quality as measured by the MERSQI score averaged a modest 10.5/18. Seventy-nine percent of serious games were evaluated for training outcomes. As the number of serious games for health care training continues to grow, having schemas that organize how educators approach their development and evaluation is essential for their success.
NASA Astrophysics Data System (ADS)
Hendikawati, P.; Arifudin, R.; Zahid, M. Z.
2018-03-01
This study aims to design an Android statistics data analysis application that can be accessed through mobile devices, making it easier for users to access. The application covers various basic statistics topics along with a parametric statistics data analysis module. Its output is parametric statistical analysis that can be used by students, lecturers, and other users who need statistical results quickly and in an easily understood form. The Android application is developed in the Java programming language; the server side uses PHP with the CodeIgniter framework, and the database is MySQL. The system development methodology used is the Waterfall methodology, with stages of analysis, design, coding, testing, implementation, and system maintenance. This statistical data analysis application is expected to support statistics lecturing activities and to make statistical analysis on mobile devices easier for students to understand.
Search Filter Precision Can Be Improved By NOTing Out Irrelevant Content
Wilczynski, Nancy L.; McKibbon, K. Ann; Haynes, R. Brian
2011-01-01
Background: Most methodologic search filters developed for use in large electronic databases such as MEDLINE have low precision. One method that has been proposed but not tested for improving precision is NOTing out irrelevant content. Objective: To determine if search filter precision can be improved by NOTing out the text words and index terms assigned to those articles that are retrieved but are off-target. Design: Analytic survey. Methods: NOTing out unique terms in off-target articles and testing search filter performance in the Clinical Hedges Database. Main Outcome Measures: Sensitivity, specificity, precision and number needed to read (NNR). Results: For all purpose categories (diagnosis, prognosis and etiology) except treatment and for all databases (MEDLINE, EMBASE, CINAHL and PsycINFO), constructing search filters that NOTed out irrelevant content resulted in substantive improvements in NNR (over four-fold for some purpose categories and databases). Conclusion: Search filter precision can be improved by NOTing out irrelevant content. PMID:22195215
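The NOTing-out strategy can be demonstrated on an invented five-record mini-corpus (none of these titles come from the Clinical Hedges Database): precision rises, and the number needed to read (NNR = 1/precision) falls, once terms unique to off-target records are excluded:

```python
# Invented mini-corpus: (title, is_relevant) pairs for a "diagnosis" question.
corpus = [
    ("sensitivity of troponin assay for diagnosis", True),
    ("diagnostic accuracy of ultrasound", True),
    ("editorial: the future of diagnosis", False),
    ("letter on diagnosis guidelines", False),
    ("cohort study of statin safety", False),
]

def search(include, exclude=()):
    """Retrieve records matching any include term and no exclude (NOTed) term."""
    return [(text, rel) for text, rel in corpus
            if any(t in text for t in include)
            and not any(t in text for t in exclude)]

def precision(hits):
    return sum(rel for _, rel in hits) / len(hits)  # NNR is 1 / precision

broad = search(["diagnos"])
filtered = search(["diagnos"], exclude=["editorial", "letter"])  # NOT out noise terms
print(precision(broad), precision(filtered))
```

Here the excluded terms were chosen because they appear only in retrieved-but-off-target records, mirroring the paper's approach of NOTing out text words and index terms unique to off-target articles; the risk, as the sensitivity/specificity outcome measures acknowledge, is that an over-broad NOT term can also discard relevant records.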
Dai, Yun-kai; Zhang, Yun-zhan; Li, Dan-yan; Ye, Jin-tong; Zeng, Ling-feng; Wang, Qi; Hu, Ling
2017-01-01
Jianpi Yiqi therapy (JYT) is a classical therapy in treating chronic atrophic gastritis (CAG), but its clinical effects are still contentious. The purpose of this article is to evaluate the efficacy and safety of JYT for CAG. Seven electronic databases including PubMed, Embase, Springer Link, CNKI (China National Knowledge Infrastructure), VIP (Chinese Scientific Journals Database), the Wan-fang database, and CBM (Chinese Biomedicine Database) were searched from their inception to November 1, 2016. Thirteen randomized controlled trials (RCTs) with a total of 1119 participants were identified for analysis. Meta-analyses demonstrated that both JYT (RR 1.41; 95% CI 1.27, 1.57; P < 0.00001) and JYT + western medicine (RR 1.27; 95% CI 1.17, 1.38; P < 0.00001) were more efficacious than western medicine alone. Furthermore, JYT showed potential improvement in traditional Chinese medicine (TCM) symptom scores such as stomachache, stomach distention, belching, and fatigue. In addition, no serious adverse events were reported in the selected trials. Assessment with the Cochrane Collaboration's risk-of-bias tool revealed weaknesses in methodological quality, and the Grades of Recommendations Assessment, Development and Evaluation (GRADE) evidence classification indicated a "Very low" quality level. This meta-analysis indicates that JYT may have potential effects in the treatment of patients with CAG. However, due to the limited methodological quality and small sample sizes of the included studies, further standardized research of rigorous design is needed. PMID:28738092
Aiken, Leona S; West, Stephen G; Millsap, Roger E
2008-01-01
In a survey of all PhD programs in psychology in the United States and Canada, the authors documented the quantitative methodology curriculum (statistics, measurement, and research design) to examine the extent to which innovations in quantitative methodology have diffused into the training of PhDs in psychology. In all, 201 psychology PhD programs (86%) participated. This survey replicated and extended a previous survey (L. S. Aiken, S. G. West, L. B. Sechrest, & R. R. Reno, 1990), permitting examination of curriculum development. Most training supported laboratory and not field research. The median of 1.6 years of training in statistics and measurement was mainly devoted to the modally 1-year introductory statistics course, leaving little room for advanced study. Curricular enhancements were noted in statistics and to a minor degree in measurement. Additional coverage of both fundamental and innovative quantitative methodology is needed. The research design curriculum has largely stagnated, a cause for great concern. Elite programs showed no overall advantage in quantitative training. Forces that support curricular innovation are characterized. Human capital challenges to quantitative training, including recruiting and supporting young quantitative faculty, are discussed. Steps must be taken to bring innovations in quantitative methodology into the curriculum of PhD programs in psychology. PsycINFO Database Record (c) 2008 APA, all rights reserved.
Nyland, John; Causey, Brandon; Wera, Jeff; Krupp, Ryan; Tate, David; Gupta, Amit
2017-07-01
This systematic literature review evaluated the methodological research design quality of studies that evaluated patient outcomes following distal biceps brachii tendon repair and developed evidence-based recommendations for future patient clinical outcomes research. Following the preferred reporting items for systematic reviews and meta-analyses (PRISMA) criteria, and using "biceps brachii", "tendon", "repair" and "outcome assessment" search terms, the CINAHL, Academic Search Premier and MEDLINE databases were searched from January 1960 to October 2015. The modified Coleman methodology score (MCMS) served as the primary outcome measure. Descriptive statistical analysis was performed for composite and component MCMS and for patient outcome assessment methodology use frequency. A total of 93 studies were evaluated. Overall MCMS was low (57.1 ± 14). Only 12 (12.9 %) had prospective cohort or randomized controlled trial designs. There was a moderate relationship between publication year and MCMS (r = 0.53, P < 0.0001). Although 61 studies (65.6 %) had adequate surgical descriptions, only 3 (3.2 %) had well-described rehabilitation. Of 2253 subjects, only 39 (1.7 %) were women. Studies published after 2008 had higher MCMS scores than studies published earlier (61.3 ± 10 versus 52.9 ± 16, P = 0.003). Although overall research study methodological scores have improved on average since 2008, generally low MCMS scores, retrospective designs, lack of eccentric elbow flexor or supinator strength testing, and poorly described surgical and rehabilitation protocols remain commonplace. These findings decrease clinical study validity and generalizability. Level of evidence: III.
Kim, Jung-Hee; Shin, Sujin; Park, Jin-Hwa
2015-04-01
The purpose of this study was to evaluate the methodological quality of nursing studies using structural equation modeling in Korea. Databases of KISS, DBPIA, and National Assembly Library up to March 2014 were searched using the MeSH terms 'nursing', 'structure', 'model'. A total of 152 studies were screened. After removal of duplicates and non-relevant titles, 61 papers were read in full. Of the sixty-one articles retrieved, 14 were published between 1992 and 2000, 27 between 2001 and 2010, and 20 between 2011 and March 2014. The methodological quality of the reviewed studies varied considerably. The findings of this study suggest that more rigorous research is necessary to address theoretical identification, the two-indicator rule, sample distribution, treatment of missing values, mediator effects, discriminant validity, convergent validity, post hoc model modification, equivalent models, and alternative models. Further research with robust, consistent methodological study designs, from model identification to model respecification, is needed to improve the validity of the research.
Menditto, Enrica; Bolufer De Gea, Angela; Cahir, Caitriona; Marengoni, Alessandra; Riegler, Salvatore; Fico, Giuseppe; Costa, Elisio; Monaco, Alessandro; Pecorelli, Sergio; Pani, Luca; Prados-Torres, Alexandra
2016-01-01
Computerized health care databases have been widely described as an excellent opportunity for research. The availability of "big data" has brought about a wave of innovation in projects when conducting health services research. Most of the available secondary data sources are restricted to the geographical scope of a given country and present heterogeneous structure and content. Under the umbrella of the European Innovation Partnership on Active and Healthy Ageing, collaborative work conducted by the partners of the group on "adherence to prescription and medical plans" identified the use of observational and large-population databases to monitor medication-taking behavior in the elderly. This article describes the methodology used to gather the information from available databases among the Adherence Action Group partners with the aim of improving data sharing on a European level. A total of six databases belonging to three different European countries (Spain, Republic of Ireland, and Italy) were included in the analysis. Preliminary results suggest that there are some similarities. However, these results need to be validated across different contexts and European countries, supporting the idea that large European studies should be designed in order to get the most out of already available databases.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew
GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of a reliability database (RDB) methodology to determine applicable reliability data for inclusion in the quantification of the PRA. The RDB method developed during this project seeks to satisfy the requirements of the Data Analysis element of the ASME/ANS Non-LWR PRA standard. The RDB methodology utilizes a relevancy test to examine reliability data and determine whether it is appropriate to include as part of the reliability database for the PRA. The relevancy test compares three component properties to establish the level of similarity to components examined as part of the PRA. These properties include the component function, the component failure modes, and the environment/boundary conditions of the component. The relevancy test is used to gauge the quality of data found in a variety of sources, such as advanced reactor-specific databases, non-advanced reactor nuclear databases, and non-nuclear databases. The RDB also establishes the integration of expert judgment or separate reliability analysis with past reliability data. This paper provides details on the RDB methodology, and includes an example application of the RDB methodology for determining the reliability of the intermediate heat exchanger of a sodium fast reactor. The example explores a variety of reliability data sources, and assesses their applicability for the PRA of interest through the use of the relevancy test.
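The relevancy test's three-way comparison can be pictured as a simple screening function. The attribute encodings, equal weighting, and example records below are illustrative assumptions, not the project's actual criteria:

```python
def relevancy_score(candidate, target):
    """Screen a reliability-data record against a PRA component.

    Compares the three properties named by the RDB relevancy test:
    component function, failure modes, and environment/boundary
    conditions. Returns a score in [0, 1]; the equal weighting
    is an illustrative assumption.
    """
    function_match = 1.0 if candidate["function"] == target["function"] else 0.0
    shared = set(candidate["failure_modes"]) & set(target["failure_modes"])
    mode_match = len(shared) / max(len(target["failure_modes"]), 1)
    env_match = 1.0 if candidate["environment"] == target["environment"] else 0.0
    return (function_match + mode_match + env_match) / 3

# Hypothetical target component and candidate data record
ihx = {"function": "heat transfer", "failure_modes": ["tube rupture", "leak"],
       "environment": "sodium"}
record = {"function": "heat transfer", "failure_modes": ["leak"],
          "environment": "water"}
print(relevancy_score(record, ihx))  # 0.5: same function, partial mode overlap
```

In practice such a score would only gate which sources enter the database; the abstract notes that expert judgment is then integrated with the surviving data.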
Dupree, Jean A.; Crowfoot, Richard M.
2012-01-01
The drainage basin is a fundamental hydrologic entity used for studies of surface-water resources and during planning of water-related projects. Numeric drainage areas published by the U.S. Geological Survey water science centers in Annual Water Data Reports and on the National Water Information Systems (NWIS) Web site are still primarily derived from hard-copy sources and by manual delineation of polygonal basin areas on paper topographic map sheets. To expedite numeric drainage area determinations, the Colorado Water Science Center developed a digital database structure and a delineation methodology based on the hydrologic unit boundaries in the National Watershed Boundary Dataset. This report describes the digital database architecture and delineation methodology and also presents the results of a comparison of the numeric drainage areas derived using this digital methodology with those derived using traditional, non-digital methods. (Please see report for full Abstract)
Dictionary learning-based CT detection of pulmonary nodules
NASA Astrophysics Data System (ADS)
Wu, Panpan; Xia, Kewen; Zhang, Yanbo; Qian, Xiaohua; Wang, Ge; Yu, Hengyong
2016-10-01
Segmentation of lung features is one of the most important steps for computer-aided detection (CAD) of pulmonary nodules with computed tomography (CT). However, irregular shapes, complicated anatomical background and poor pulmonary nodule contrast make CAD a very challenging problem. Here, we propose a novel scheme for feature extraction and classification of pulmonary nodules through dictionary learning from training CT images, which does not require accurately segmented pulmonary nodules. Specifically, two classification-oriented dictionaries and one background dictionary are learnt to solve a two-category problem. In terms of the classification-oriented dictionaries, we calculate sparse coefficient matrices to extract intrinsic features for pulmonary nodule classification. The support vector machine (SVM) classifier is then designed to optimize the performance. Our proposed methodology is evaluated with the lung image database consortium and image database resource initiative (LIDC-IDRI) database, and the results demonstrate that the proposed strategy is promising.
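The pipeline described, learning dictionaries, using sparse coefficients as features, and classifying with an SVM, can be sketched with off-the-shelf tools. A minimal illustration on synthetic patch vectors (data, sizes, and parameters are invented; real inputs would be LIDC-IDRI patches, and the paper's scheme learns separate class-oriented and background dictionaries rather than the single dictionary used here):

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-ins for CT patch vectors: two classes with different structure
nodule = rng.normal(1.0, 0.3, size=(80, 64))
background = rng.normal(-1.0, 0.3, size=(80, 64))
X = np.vstack([nodule, background])
y = np.array([1] * 80 + [0] * 80)

# Learn a dictionary; the sparse coefficient matrix becomes the feature set
dico = MiniBatchDictionaryLearning(n_components=16, transform_algorithm="omp",
                                   transform_n_nonzero_coefs=4, random_state=0)
codes = dico.fit(X).transform(X)   # sparse codes, one row per patch

clf = SVC(kernel="linear").fit(codes[::2], y[::2])   # train on half the patches
accuracy = clf.score(codes[1::2], y[1::2])           # evaluate on the rest
print(accuracy)
```

The easy synthetic separation here only demonstrates the plumbing; the paper's contribution is that such codes remain discriminative without accurate nodule segmentation.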
Gilderthorp, Rosanna C
2015-03-01
This study aimed to critically review all studies that have set out to evaluate the use of eye movement desensitization and reprocessing (EMDR) for people diagnosed with both intellectual disability (ID) and post-traumatic stress disorder (PTSD). Searches of the online databases PsycINFO, The Cochrane Database of Systematic Reviews, The Cochrane Database of Randomized Control Trials, CINAHL, ASSIA and Medline were conducted. Five studies are described and evaluated. Key positive points include the high clinical salience of the studies and their high external validity. Several common methodological criticisms are highlighted, however, including difficulty in the definition of the terms ID and PTSD, lack of control in design and a lack of consideration of ethical implications. Overall, the articles reviewed indicate cause for cautious optimism about the utility of EMDR with this population. The clinical and research implications of this review are discussed. © The Author(s) 2014.
Does Metformin Reduce Cancer Risks? Methodologic Considerations.
Golozar, Asieh; Liu, Shuiqing; Lin, Joeseph A; Peairs, Kimberly; Yeh, Hsin-Chieh
2016-01-01
The substantial burden of cancer and diabetes and the association between the two conditions has been a motivation for researchers to look for targeted strategies that can simultaneously affect both diseases and reduce their overlapping burden. In the absence of randomized clinical trials, researchers have taken advantage of the availability and richness of administrative databases and electronic medical records to investigate the effects of drugs on cancer risk among diabetic individuals. The majority of these studies suggest that metformin could potentially reduce cancer risk. However, the validity of this purported reduction in cancer risk is limited by several methodological flaws either in the study design or in the analysis. Whether metformin use decreases cancer risk relies heavily on the availability of valid data sources with complete information on confounders, accurate assessment of drug use, appropriate study design, and robust analytical techniques. The majority of the observational studies assessing the association between metformin and cancer risk suffer from methodological shortcomings, and efforts to address these issues have been incomplete. Future investigations on the association between metformin and cancer risk should clearly address the methodological issues due to confounding by indication, prevalent user bias, and time-related biases. Although the proposed strategies do not guarantee a bias-free estimate for the association between metformin and cancer, they will reduce the synthesis and reporting of erroneous results.
El-Naggar, Noura El-Ahmady; Moawad, Hassan; El-Shweihy, Nancy M; El-Ewasy, Sara M
2015-01-01
Among the antitumor drugs, bacterial enzyme L-asparaginase has been employed as the most effective chemotherapeutic agent in pediatric oncotherapy especially for acute lymphoblastic leukemia. Glutaminase free L-asparaginase producing actinomycetes were isolated from soil samples collected from Egypt. Among them, a potential culture, strain NEAE-119, was selected and identified on the basis of morphological, cultural, physiological, and biochemical properties together with 16S rRNA sequence as Streptomyces olivaceus NEAE-119 and sequencing product (1509 bp) was deposited in the GenBank database under accession number KJ200342. The optimization of different process parameters for L-asparaginase production by Streptomyces olivaceus NEAE-119 using Plackett-Burman experimental design and response surface methodology was carried out. Fifteen variables (temperature, pH, incubation time, inoculum size, inoculum age, agitation speed, dextrose, starch, L-asparagine, KNO3, yeast extract, K2HPO4, MgSO4·7H2O, NaCl, and FeSO4·7H2O) were screened using Plackett-Burman experimental design. The most positive significant independent variables affecting enzyme production (temperature, inoculum age, and agitation speed) were further optimized by the face-centered central composite design-response surface methodology.
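Screening a factor in a two-level design of this kind comes down to contrasting mean responses at the factor's high and low settings. A minimal sketch, using a small full-factorial stand-in for the Plackett-Burman matrix and invented yields (the factor names come from the abstract; the numbers do not):

```python
import numpy as np

# Two-level screening design: each row is a run, each column a factor
# coded -1/+1. An 8-run stand-in covering 3 of the 15 screened factors.
design = np.array([
    [-1, -1, -1], [ 1, -1, -1], [-1,  1, -1], [ 1,  1, -1],
    [-1, -1,  1], [ 1, -1,  1], [-1,  1,  1], [ 1,  1,  1],
])
factors = ["temperature", "inoculum age", "agitation speed"]

# Hypothetical L-asparaginase yields for the eight runs (invented numbers)
yield_ = np.array([12.0, 18.5, 13.2, 19.8, 14.1, 21.0, 15.0, 22.3])

# Main effect of a factor = mean response at +1 minus mean response at -1
for name, col in zip(factors, design.T):
    effect = yield_[col == 1].mean() - yield_[col == -1].mean()
    print(f"{name}: {effect:+.2f}")
```

Factors with the largest estimated effects are the ones carried forward, as in the abstract, into a face-centered central composite design for response surface optimization.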
Standardized Radiation Shield Design Methods: 2005 HZETRN
NASA Technical Reports Server (NTRS)
Wilson, John W.; Tripathi, Ram K.; Badavi, Francis F.; Cucinotta, Francis A.
2006-01-01
Research conducted by the Langley Research Center through 1995, resulting in the HZETRN code, provides the current basis for shield design methods according to NASA STD-3000 (2005). With this new prominence, the database, basic numerical procedures, and algorithms are being re-examined, with new methods of verification and validation being implemented, to capture a well-defined algorithm for engineering design processes to be used in this early development phase of the Bush initiative. This process provides the methodology to transform the 1995 HZETRN research code into the 2005 HZETRN engineering code to be available for these early design processes. In this paper, we review the basic derivations, including new corrections to the codes to ensure improved numerical stability, and provide benchmarks for code verification.
Slimani, N; Deharveng, G; Unwin, I; Southgate, D A T; Vignat, J; Skeie, G; Salvini, S; Parpinel, M; Møller, A; Ireland, J; Becker, W; Farran, A; Westenbrink, S; Vasilopoulou, E; Unwin, J; Borgejordet, A; Rohrmann, S; Church, S; Gnagnarella, P; Casagrande, C; van Bakel, M; Niravong, M; Boutron-Ruault, M C; Stripp, C; Tjønneland, A; Trichopoulou, A; Georga, K; Nilsson, S; Mattisson, I; Ray, J; Boeing, H; Ocké, M; Peeters, P H M; Jakszyn, P; Amiano, P; Engeset, D; Lund, E; de Magistris, M Santucci; Sacerdote, C; Welch, A; Bingham, S; Subar, A F; Riboli, E
2007-09-01
This paper describes the ad hoc methodological concepts and procedures developed to improve the comparability of Nutrient databases (NDBs) across the 10 European countries participating in the European Prospective Investigation into Cancer and Nutrition (EPIC). This was required because there is currently no European reference NDB available. A large network involving national compilers, nutritionists and experts on food chemistry and computer science was set up for the 'EPIC Nutrient DataBase' (ENDB) project. A total of 550-1500 foods derived from about 37,000 standardized EPIC 24-h dietary recalls (24-HDRS) were matched as closely as possible to foods available in the 10 national NDBs. The resulting national data sets (NDS) were then successively documented, standardized and evaluated according to common guidelines and using a DataBase Management System specifically designed for this project. The nutrient values of foods unavailable or not readily available in NDSs were approximated by recipe calculation, weighted averaging or adjustment for weight changes and vitamin/mineral losses, using common algorithms. The final ENDB contains about 550-1500 foods depending on the country and 26 common components. Each component value was documented and standardized for unit, mode of expression, definition and chemical method of analysis, as far as possible. Furthermore, the overall completeness of NDSs was improved (>or=99%), particularly for beta-carotene and vitamin E. The ENDB constitutes a first real attempt to improve the comparability of NDBs across European countries. This methodological work will provide a useful tool for nutritional research as well as end-user recommendations to improve NDBs in the future.
Evaluation of STD/AIDS prevention programs: a review of approaches and methodologies.
da Cruz, Marly Marques; dos Santos, Elizabeth Moreira; Monteiro, Simone
2007-05-01
The article presents a review of approaches and methodologies in the evaluation of STD/AIDS prevention programs, searching for theoretical and methodological support for the institutionalization of evaluation and decision-making. The review included the MEDLINE, SciELO, and ISI Web of Science databases and other sources like textbooks and congress abstracts from 1990 to 2005, with the key words: "evaluation", "programs", "prevention", "STD/AIDS", and similar terms. The papers showed a predominance of quantitative outcome or impact evaluative studies with an experimental or quasi-experimental design. The main use of evaluation is accountability, although knowledge output and program improvement were also identified in the studies. Only a few evaluative studies contemplate process evaluation and its relationship to the contexts. The review aimed to contribute to the debate on STD/AIDS, which requires more effective, consistent, and sustainable decisions in the field of prevention.
Landolt, Alison S; Milling, Leonard S
2011-08-01
This paper presents a comprehensive methodological review of research on the efficacy of hypnosis for reducing labor and delivery pain. To be included, studies were required to use a between-subjects or mixed model design in which hypnosis was compared with a control condition or alternative intervention in reducing labor pain. An exhaustive search of the PsycINFO and PubMed databases produced 13 studies satisfying these criteria. Hetero-hypnosis and self-hypnosis were consistently shown to be more effective than standard medical care, supportive counseling, and childbirth education classes in reducing pain. Other benefits included better infant Apgar scores and shorter Stage 1 labor. Common methodological limitations of the literature include a failure to use random assignment, to specify the demographic characteristics of samples, and to use a treatment manual. Copyright © 2011 Elsevier Ltd. All rights reserved.
The Reliability of Methodological Ratings for speechBITE Using the PEDro-P Scale
ERIC Educational Resources Information Center
Murray, Elizabeth; Power, Emma; Togher, Leanne; McCabe, Patricia; Munro, Natalie; Smith, Katherine
2013-01-01
Background: speechBITE (http://www.speechbite.com) is an online database established in order to help speech and language therapists gain faster access to relevant research that can be used in clinical decision-making. In addition to containing more than 3000 journal references, the database also provides methodological ratings on the PEDro-P (an…
Kingfisher: a system for remote sensing image database management
NASA Astrophysics Data System (ADS)
Bruzzo, Michele; Giordano, Ferdinando; Dellepiane, Silvana G.
2003-04-01
At present, retrieval methods in remote sensing image databases are mainly based on spatial-temporal information. The increasing number of images collected by the ground stations of earth observing systems emphasizes the need for database management with intelligent data retrieval capabilities. The purpose of the proposed method is to realize a new content-based retrieval system for remote sensing image databases with an innovative search tool based on image similarity. This methodology is quite innovative for this application: many such systems exist for photographic images, for example QBIC and IKONA, but they are not able to properly extract and describe remote sensing image content. The target database is an archive of images originated from an X-SAR sensor (spaceborne mission, 1994). The best content descriptors, mainly texture parameters, guarantee high retrieval performance and can be extracted without loss independently of image resolution. The latter property allows the DBMS (Database Management System) to process a small amount of information, as in the case of quick-look images, improving time performance and memory access without reducing retrieval accuracy. The matching technique has been designed to enable image management (database population and retrieval) independently of image dimensions (width and height). During the retrieval phase, local and global content descriptors are compared with those of the query image, and the results seem very encouraging.
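Similarity-based retrieval of this kind reduces, at its core, to ranking archive images by the closeness of their descriptor vectors to the query's. A minimal sketch with invented texture descriptors (cosine similarity and the feature values are illustrative assumptions, not the system's actual matching technique):

```python
import numpy as np

def retrieve(query, archive, k=2):
    """Rank archive images by cosine similarity of their texture descriptors."""
    q = query / np.linalg.norm(query)
    sims = {}
    for name, feat in archive.items():
        sims[name] = float(q @ (feat / np.linalg.norm(feat)))
    return sorted(sims, key=sims.get, reverse=True)[:k]

# Invented per-image texture descriptors (e.g. contrast, entropy, correlation)
archive = {
    "scene_a": np.array([0.9, 0.1, 0.4]),
    "scene_b": np.array([0.2, 0.8, 0.5]),
    "scene_c": np.array([0.88, 0.15, 0.42]),
}
query = np.array([0.85, 0.12, 0.41])
print(retrieve(query, archive))  # ['scene_c', 'scene_a']
```

Because the descriptors are resolution-independent, the same ranking can be computed from quick-look images, which is the memory/time advantage the abstract highlights.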
Pölkki, Tarja; Kanste, Outi; Kääriäinen, Maria; Elo, Satu; Kyngäs, Helvi
2014-02-01
To analyse systematic review articles published in the top 10 nursing journals to determine the quality of the methods employed within them. Systematic review is defined as a scientific research method that synthesises high-quality scientific knowledge on a given topic. The number of such reviews in nursing science has increased dramatically during recent years, but their methodological quality has not previously been assessed. A review of the literature using a narrative approach. Ranked impact factor scores for nursing journals were obtained from the Journal Citation Reports database of the Institute of Scientific Information (ISI Web of Knowledge). All issues from the years 2009 and 2010 of the top 10 ranked journals were included. CINAHL and MEDLINE databases were searched to locate studies using the search terms 'systematic review' and 'systematic literature review'. A total of 39 eligible studies were identified. Their methodological quality was evaluated in terms of the quality-assessment criteria used, the description of synthesis, and the strengths and weaknesses reported in the included studies. Most of the eligible systematic reviews included several different designs or types of quantitative study. The majority included a quality assessment, and a total of 17 different criteria were identified. The method of synthesis was mentioned in about half of the reviews, the most common being narrative synthesis. The weaknesses of reviews were discussed, while strengths were rarely highlighted. The methodological quality of the systematic reviews examined varied considerably, although they were all published in nursing journals with a high impact factor. Despite the fact that systematic reviews are considered the most robust source of research evidence, they vary in methodological quality. This point is important to consider in clinical practice when applying the results to patient care. © 2013 Blackwell Publishing Ltd.
Snijder, Mieke; Shakeshaft, Anthony; Wagemakers, Annemarie; Stephens, Anne; Calabria, Bianca
2015-11-21
Community development is a health promotion approach identified as having great potential to improve Indigenous health, because of its potential for extensive community participation. There has been no systematic examination of the extent of community participation in community development projects and little analysis of their effectiveness. This systematic review aims to identify the extent of community participation in community development projects implemented in Australian Indigenous communities, critically appraise the qualitative and quantitative methods used in their evaluation, and summarise their outcomes. Ten electronic peer-reviewed databases and two electronic grey literature databases were searched for relevant studies published between 1990 and 2015. The level of community participation and the methodological quality of the qualitative and quantitative components of the studies were assessed against standardised criteria. Thirty one evaluation studies of community development projects were identified. Community participation varied between different phases of project development, generally high during project implementation, but low during the evaluation phase. For the majority of studies, methodological quality was low and the methods were poorly described. Although positive qualitative or quantitative outcomes were reported in all studies, only two studies reported statistically significant outcomes. Partnerships between researchers, community members and service providers have great potential to improve methodological quality and community participation when research skills and community knowledge are integrated to design, implement and evaluate community development projects. The methodological quality of studies evaluating Australian Indigenous community development projects is currently too weak to confidently determine the cost-effectiveness of community development projects in improving the health and wellbeing of Indigenous Australians. 
Higher quality studies evaluating community development projects would strengthen the evidence base.
Choosing phenomenology as a guiding philosophy for nursing research.
Matua, Gerald Amandu
2015-03-01
To provide an overview of important methodological considerations that nurse researchers need to adhere to when choosing phenomenology as a guiding philosophy and research method. Phenomenology is a major philosophy and research method in the humanities, human sciences and arts disciplines with a central goal of describing people's experiences. However, many nurse researchers continue to grapple with methodological issues related to their choice of phenomenological method. The author conducted online and manual searches of relevant research books and electronic databases. Using an integrative method, peer-reviewed research and discussion papers published between January 1990 and December 2011 and listed in the CINAHL, Science Direct, PubMed and Google Scholar databases were reviewed. In addition, textbooks that addressed research methodologies such as phenomenology were used. Although phenomenology is widely used today to broaden understanding of human phenomena relevant to nursing practice, nurse researchers often fail to adhere to acceptable scientific and phenomenological standards. Cognisant of these challenges, researchers are expected to indicate in their work the focus of their investigations, designs, and approaches to collecting and analysing data. They are also expected to present their findings in an evocative and expressive manner. Choosing phenomenology requires researchers to understand it as a philosophy, including basic assumptions and tenets of phenomenology as a research method. This awareness enables researchers, especially novices, to make important methodological decisions, particularly those necessary to indicate the study's scientific rigour and phenomenological validity. This paper adds to the discussion of phenomenology as a guiding philosophy for nursing research. It aims to guide new researchers on important methodological decisions they need to make to safeguard their study's scientific rigour and phenomenological validity.
Building information models for astronomy projects
NASA Astrophysics Data System (ADS)
Ariño, Javier; Murga, Gaizka; Campo, Ramón; Eletxigerra, Iñigo; Ampuero, Pedro
2012-09-01
A Building Information Model is a digital representation of the physical and functional characteristics of a building. BIMs represent the geometrical characteristics of the building, but also properties like bills of quantities, definitions of COTS components, status of material in the different stages of the project, project economic data, etc. The BIM methodology, which is well established in the Architecture, Engineering and Construction (AEC) domain for conventional buildings, has been brought one step forward in its application to astronomical/scientific facilities. In these facilities, steel/concrete structures have high dynamic and seismic requirements, M&E installations are complex, and a large amount of special equipment and mechanisms are involved as a fundamental part of the facility. The detailed design definition is typically implemented by different design teams in specialized design software packages. In order to allow the coordinated work of different engineering teams, the overall model, and its associated engineering database, is progressively integrated using coordination and roaming software, which can be used before the construction phase starts for checking interferences, planning the construction sequence, studying maintenance operations, reporting to the project office, etc. This integrated design and construction approach allows the construction sequence to be planned efficiently (4D). This is a powerful tool to study and analyze in detail alternative construction sequences and ideally coordinate the work of different construction teams. In addition, the engineering, construction and operational databases can be linked to the virtual model (6D), which gives end users an invaluable tool for lifecycle management, as all the facility information can be easily accessed, added or replaced. This paper presents the BIM methodology as implemented by IDOM, with the E-ELT and ATST Enclosures as application examples.
Geodemographic segmentation systems for screening health data.
Openshaw, S; Blake, M
1995-01-01
AIM--To describe how geodemographic segmentation systems might be useful as a quick and easy way of exploring postcoded health databases for potentially interesting patterns related to deprivation and other socioeconomic characteristics. DESIGN AND SETTING--This is demonstrated using GB Profiles, a freely available geodemographic classification system developed at Leeds University. It is used here to screen a database of colorectal cancer registrations as a first step in the analysis of that data. RESULTS AND CONCLUSION--Conventional geodemographics is a fairly simple technology, and a number of outstanding methodological problems are identified. A solution to some problems is illustrated by using neural net based classifiers and then by reference to a more sophisticated geodemographic approach via a data optimal segmentation technique. PMID:8594132
Economic Studies in Motor Neurone Disease: A Systematic Methodological Review.
Moore, Alan; Young, Carolyn A; Hughes, Dyfrig A
2017-04-01
Motor neurone disease (MND) is a devastating condition which greatly diminishes patients' quality of life and limits life expectancy. Health technology appraisals of future interventions in MND need robust data on costs and utilities. Existing economic evaluations have been noted to be limited and fraught with challenges. The aim of this study was to identify and critique methodological aspects of all published economic evaluations, cost studies, and utility studies in MND. We systematically reviewed all relevant published studies in English from 1946 until January 2016, searching the databases of Medline, EMBASE, Econlit, NHS Economic Evaluation Database (NHS EED) and the Health Economics Evaluation Database (HEED). Key data were extracted and synthesised narratively. A total of 1830 articles were identified, of which 15 economic evaluations, 23 cost and 3 utility studies were included. Most economic studies focused on riluzole (n = 9). Six studies modelled the progressive decline in motor function using a Markov design but did not include mutually exclusive health states. Cost estimates for a number of evaluations were based on expert opinion and were hampered by high variability and location-specific characteristics. Few cost studies reported disease-stage-specific costs (n = 3) or fully captured indirect costs. Utilities in three studies of MND patients used the EuroQol EQ-5D questionnaire or standard gamble, but included potentially unrepresentative cohorts and did not consider any health impacts on caregivers. Economic evaluations in MND suffer from significant methodological issues such as a lack of data, uncertainty with the disease course and use of inappropriate modelling framework. Limitations may be addressed through the collection of detailed and representative data from large cohorts of patients.
NASA Astrophysics Data System (ADS)
Grujicic, M.; Snipes, J. S.; Ramaswami, S.
2016-01-01
An alternative to the traditional trial-and-error empirical approach for the development of new materials is the so-called materials-by-design approach. Within the latter approach, a material is treated as a complex system, and its design and optimization are carried out by employing computer-aided engineering analyses, predictive tools, and available material databases. In the present work, the materials-by-design approach is utilized to redesign a grade of the high-strength low-alloy (HSLA) class of steels for improved mechanical properties (primarily strength and fracture toughness), processability (e.g., castability, hot formability, and weldability), and corrosion resistance. Toward that end, a number of computational models and databases covering material thermodynamics, kinetics of phase transformations, and the physics of deformation and fracture have been developed/assembled and utilized within a multi-disciplinary, two-level materials-by-design optimization scheme. To validate the models, their predictions are compared against the experimental results for the related steel HSLA100. The optimization procedure is then employed to determine the optimal chemical composition and tempering schedule for a newly designed grade of the HSLA class of steels with enhanced mechanical properties, processability, and corrosion resistance.
A systematic review of the use of an expertise-based randomised controlled trial design.
Cook, Jonathan A; Elders, Andrew; Boachie, Charles; Bassinga, Ted; Fraser, Cynthia; Altman, Doug G; Boutron, Isabelle; Ramsay, Craig R; MacLennan, Graeme S
2015-05-30
Under a conventional two-arm randomised trial design, participants are allocated to an intervention and participating health professionals are expected to deliver both interventions. However, health professionals often have differing levels of expertise in skill-based interventions such as surgery or psychotherapy. An expertise-based approach to trial design, where health professionals deliver only the intervention in which they have expertise, has been proposed as an alternative. The aim of this project was to systematically review the use of an expertise-based trial design in the medical literature. We carried out a comprehensive search of nine databases--AMED, BIOSIS, CENTRAL, CINAHL, Cochrane Methodology Register, EMBASE, MEDLINE, Science Citation Index, and PsycINFO--from 1966 to 2012 and performed citation searches using the ISI Citation Indexes and Scopus. Studies that used an expertise-based trial design were included. Two review authors independently screened the titles and abstracts and assessed full-text reports. Data were extracted and summarised on the study characteristics, general and expertise-specific study methodology, and conduct. In total, 7476 titles and abstracts were identified, leading to 43 included studies (54 articles). The vast majority (88%) used a pure expertise-based design; three (7%) adopted a hybrid design, and two (5%) used a design that was unclear. Most studies compared substantially different interventions (79%). In many cases, key information relating to the expertise-based design was absent; only 12 (28%) reported criteria for delivering both interventions. Most studies recruited the target sample size or came very close to it (median of 101% of target, interquartile range 94 to 118%), although the target was reported for only 40% of studies. The proportion of participants who received the allocated intervention was high (92%, interquartile range 82 to 99%). While use of an expertise-based trial design is growing, it remains uncommon.
Reporting of study methodology and, particularly, expertise-related methodology was poor. Empirical evidence provided some support for purported benefits such as high levels of recruitment and compliance with allocation. An expertise-based trial design should be considered but its value seems context-specific, particularly when interventions differ substantially or interventions are typically delivered by different health professionals.
A comprehensive risk assessment framework for offsite transportation of inflammable hazardous waste.
Das, Arup; Gupta, A K; Mazumder, T N
2012-08-15
A framework for risk assessment due to offsite transportation of hazardous wastes is designed based on the type of event that can be triggered by an accident of a hazardous waste carrier. The objective of this study is to design a framework for computing the risk to population associated with offsite transportation of inflammable and volatile wastes. The framework is based on the traditional definition of risk and is designed for conditions where accident databases are not available. The probability-based variable in the risk assessment framework is substituted by a composite accident index proposed in this study. The framework computes the impacts of a vapour cloud explosion based on the TNO Multi-energy model. The methodology also estimates the vulnerable population in terms of disability-adjusted life years (DALY), which takes into consideration the demographic profile of the population and the degree of mortality and morbidity sustained. The methodology is illustrated using a case study of a pharmaceutical industry in the Kolkata metropolitan area. Copyright © 2012 Elsevier B.V. All rights reserved.
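The framework's overall structure, risk as the product of a probability surrogate and a population-consequence term, can be sketched as follows. This is an illustrative reading of the abstract, not the authors' actual equations: the factor names, weights, and DALY parameters are all hypothetical.

```python
def composite_accident_index(road_factor, traffic_factor, weather_factor):
    """Surrogate for accident probability on a route segment (0..1 scale),
    standing in for frequencies that an accident database would supply."""
    return road_factor * traffic_factor * weather_factor

def daly(deaths, life_years_lost_per_death, injured, disability_weight, duration_years):
    """Disability-adjusted life years: years of life lost to mortality (YLL)
    plus years lived with disability (YLD)."""
    yll = deaths * life_years_lost_per_death
    yld = injured * disability_weight * duration_years
    return yll + yld

def segment_risk(accident_index, consequence_daly):
    """Traditional risk definition: likelihood surrogate x consequence."""
    return accident_index * consequence_daly

# Invented numbers for one route segment
consequence = daly(deaths=2, life_years_lost_per_death=35,
                   injured=10, disability_weight=0.3, duration_years=5)
risk = segment_risk(composite_accident_index(0.4, 0.5, 0.9), consequence)
```

Summing `segment_risk` over all segments of a candidate route would give a route-level figure for comparing alternative transport paths.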
The IMI PROTECT project: purpose, organizational structure, and procedures.
Reynolds, Robert F; Kurz, Xavier; de Groot, Mark C H; Schlienger, Raymond G; Grimaldi-Bensouda, Lamiae; Tcherny-Lessenot, Stephanie; Klungel, Olaf H
2016-03-01
The Pharmacoepidemiological Research on Outcomes of Therapeutics by a European ConsorTium (PROTECT) initiative was a collaborative European project that sought to address limitations of current methods in the field of pharmacoepidemiology and pharmacovigilance. Initiated in 2009 and ending in 2015, PROTECT was part of the Innovative Medicines Initiative, a joint undertaking by the European Union and the pharmaceutical industry. Thirty-five partners, including academics, regulators, small and medium enterprises, and European Federation of Pharmaceutical Industries and Associations companies, contributed to PROTECT. Two work packages within PROTECT implemented research examining the extent to which differences in study design, methodology, and choice of data source can contribute to producing discrepant results from observational studies on drug safety. To evaluate the effect of these differences, the project applied different designs and analytic methodologies for six drug-adverse event pairs across several electronic healthcare databases and registries. This paper introduces the organizational structure and procedures of PROTECT, including how drug-adverse event pairs and data sources were selected, how study design and analysis documents were developed, and how results were managed centrally. Copyright © 2016 John Wiley & Sons, Ltd.
Baek, Hyunjung; Kim, Jae-Hyo; Lee, Beom-Joon
2018-01-01
Background Radiation pneumonitis is a common and serious complication of radiotherapy. Many published randomized controlled trials (RCTs) reveal a growing trend of using herbal medicines as adjuvant therapy to prevent radiation pneumonitis; however, their efficacy and safety remain unexplored. Objective The aim of this systematic review is to evaluate the efficacy and safety of herbal medicines as adjunctive therapy for the prevention of radiation pneumonitis in patients with lung cancer who undergo radiotherapy. Methods We searched the following 11 databases: three English medical databases [MEDLINE (PubMed), EMBASE, The Cochrane Central Register of Controlled Trials (CENTRAL)], five Korean medical databases (Korean Studies Information, Research Information Service System, KoreaMed, DBPIA, National Digital Science Library), and three Chinese medical databases [the China National Knowledge Database (CNKI), Journal Integration Platform (VIP), and WanFang Database]. The primary outcome was the incidence of radiation pneumonitis. The risk of bias was assessed using the Cochrane risk-of-bias tool. Results Twenty-two RCTs involving 1819 participants were included. The methodological quality was poor for most of the studies. Meta-analysis showed that herbal medicines combined with radiotherapy significantly reduced the incidence of radiation pneumonitis (n = 1819; RR 0.53, 95% CI 0.45–0.63, I2 = 8%) and the incidence of severe radiation pneumonitis (n = 903; RR 0.22, 95% CI 0.11–0.41, I2 = 0%). Combined therapy also improved the Karnofsky performance score (n = 420; WMD 4.62, 95% CI 1.05–8.18, I2 = 82%). Conclusion There is some encouraging evidence that oral administration of herbal medicines combined with radiotherapy may benefit patients with lung cancer by preventing or minimizing radiation pneumonitis. However, due to the poor methodological quality of the identified studies, a definitive conclusion could not be drawn.
To confirm the merits of this approach, further rigorously designed, large-scale trials are warranted. PMID:29847598
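For readers unfamiliar with how summary estimates such as RR 0.53 (95% CI 0.45–0.63) arise, a minimal fixed-effect meta-analysis sketch is shown below. The study-level risk ratios are invented, and the review itself would have used dedicated meta-analysis software and, where heterogeneity warranted, random-effects models; this only illustrates the standard inverse-variance pooling on the log scale.

```python
import math

def pool_risk_ratios(studies):
    """Fixed-effect inverse-variance pooling of risk ratios.
    studies: list of (rr, ci_low, ci_high) tuples with 95% CIs."""
    z = 1.96
    weights, weighted_logs = [], []
    for rr, lo, hi in studies:
        # Back out the standard error of log(RR) from the CI width
        se = (math.log(hi) - math.log(lo)) / (2 * z)
        w = 1.0 / se ** 2            # inverse-variance weight
        weights.append(w)
        weighted_logs.append(w * math.log(rr))
    log_pooled = sum(weighted_logs) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return (math.exp(log_pooled),
            math.exp(log_pooled - z * se_pooled),
            math.exp(log_pooled + z * se_pooled))

# Two made-up trials, each reporting RR (95% CI) for radiation pneumonitis
rr, lo, hi = pool_risk_ratios([(0.50, 0.35, 0.71), (0.58, 0.44, 0.76)])
```

The pooled estimate lands between the study estimates, weighted toward the more precise (narrower-CI) study.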
[Shoulder disability questionnaires: a systematic review].
Fayad, F; Mace, Y; Lefevre-Colau, M M
2005-07-01
To identify all available shoulder disability questionnaires designed to measure physical functioning and to examine those with satisfactory clinimetric quality. We used the Medline database and the "Guide des outils de mesure de l'évaluation en médecine physique et de réadaptation" textbook to search for questionnaires. Analysis took into account the development methodology, clinimetric quality of the instruments and frequency of their utilization. We classified the instruments according to the International Classification of Functioning, Disability and Health. Thirty-eight instruments have been developed to measure disease-, shoulder- or upper extremity-specific outcome. Four scales assess upper-extremity disability and 3 others shoulder disability. We found 6 scales evaluating disability and shoulder pain, 7 scales measuring the quality of life in patients with various conditions of the shoulder, 14 scales combining objective and subjective measures, 2 pain scales and 2 unclassified scales. Older instruments developed before the advent of modern measurement development methodology usually combine objective and subjective measures. Recent instruments were designed with appropriate methodology. Most are self-administered questionnaires. Numerous shoulder outcome measure instruments are available. There is no "gold standard" for assessing shoulder function outcome in the general population.
Methodology and reporting quality of reporting guidelines: systematic review.
Wang, Xiaoqin; Chen, Yaolong; Yang, Nan; Deng, Wei; Wang, Qi; Li, Nan; Yao, Liang; Wei, Dang; Chen, Gen; Yang, Kehu
2015-09-22
With increasing attention paid to the methodology of reporting guidelines, Moher et al. conducted a review of reporting guidelines up to December 2009. Information gaps appeared on many aspects. Therefore, in 2010, the Guidance for Developers of Health Research Reporting Guidelines was developed. More than four years have passed since then, and with considerable investment in reporting guideline development, a large number of new, updated, and expanded reporting guidelines have become available since January 2010. We aimed to systematically review the reporting guidelines published since January 2010, and to investigate the application of the Guidance. We systematically searched databases including the Cochrane Methodology Register, MEDLINE, and EMBASE, and searched the EQUATOR network and guideline websites (if available) to find reporting guidelines as well as their accompanying documents. We screened the titles and abstracts resulting from the searches and extracted data. We focused on the methodology and reporting of the included guidelines, and described the information with a series of tables and narrative summaries. Data were summarized descriptively using frequencies, proportions, and medians as appropriate. Twenty-eight and 32 reporting guidelines were retrieved from databases and the EQUATOR network, respectively. Reporting guidelines were designed for a broad spectrum of types of research. A considerable number of reporting guidelines were published and updated in recent years. Methods for generating initial items were given in 45 (75%) guidelines. Thirty-eight (63%) guidelines reported that they had reached consensus, and 35 (58%) described their consensus methods. Only 9 (15%) guidelines followed the Guidance. Only a few guidelines were developed in compliance with the Guidance. More attention should be paid to the quality of reporting guidelines.
Borotikar, Bhushan S.; Sheehan, Frances T.
2017-01-01
Objectives To establish an in vivo, normative patellofemoral cartilage contact mechanics database acquired during voluntary muscle control using a novel dynamic magnetic resonance (MR) imaging-based computational methodology, and to validate the contact mechanics sensitivity to the known sub-millimeter methodological inaccuracies. Design Dynamic cine phase-contrast and multi-plane cine images were acquired while female subjects (n = 20, a sample of convenience) performed an open kinetic chain (knee flexion-extension) exercise inside a 3-Tesla MR scanner. Static cartilage models were created from high-resolution three-dimensional static MR data and accurately placed in their dynamic pose at each time frame based on the cine-PC data. Cartilage contact parameters were calculated based on the surface overlap. Statistical analysis was performed using a paired t-test and a one-sample repeated measures ANOVA. The sensitivity of the contact parameters to the known errors in the patellofemoral kinematics was determined. Results Peak mean patellofemoral contact area was 228.7 ± 173.6 mm2 at a 40° knee angle. During extension, contact centroid and peak strain locations tracked medially on the femoral and patellar cartilage and were not significantly different from each other. At 30°, 35°, and 40° of knee extension, contact area was significantly different. Contact area and centroid locations were insensitive to rotational and translational perturbations. Conclusion This study is a first step towards unfolding the biomechanical pathways to anterior patellofemoral pain and osteoarthritis using dynamic, in vivo, and accurate methodologies. The database provides crucial data for future studies and for validation of, or as an input to, computational models. PMID:24012620
Toward unification of taxonomy databases in a distributed computer environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kitakami, Hajime; Tateno, Yoshio; Gojobori, Takashi
1994-12-31
All the taxonomy databases constructed with the DNA databases of the international DNA data banks are powerful electronic dictionaries which aid in biological research by computer. The taxonomy databases are, however, not consistently unified in a relational format. If we can achieve consistent unification of the taxonomy databases, it will be useful in comparing many research results and in investigating future research directions from existing results. In particular, it will be useful in comparing relationships between phylogenetic trees inferred from molecular data and those constructed from morphological data. The goal of the present study is to unify the existing taxonomy databases and eliminate the inconsistencies (errors) that are present in them. Inconsistencies arise particularly in the restructuring of the existing taxonomy databases, since the classification rules for constructing the taxonomy have changed rapidly with biological advancements. A repair system is needed to remove inconsistencies within each data bank and mismatches among data banks. This paper describes a new methodology for removing both inconsistencies and mismatches from the databases in a distributed computer environment. The methodology is implemented in a relational database management system, SYBASE.
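The mismatch-detection step can be illustrated with a small relational example. The original work used SYBASE; the table layout, column names, and sample taxa below are hypothetical stand-ins, with SQLite used purely for demonstration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Two data banks, each storing a taxon and its assigned parent taxon
conn.executescript("""
    CREATE TABLE bank_a (name TEXT PRIMARY KEY, parent TEXT);
    CREATE TABLE bank_b (name TEXT PRIMARY KEY, parent TEXT);
    INSERT INTO bank_a VALUES ('Homo sapiens', 'Homo'), ('Homo', 'Hominidae');
    INSERT INTO bank_b VALUES ('Homo sapiens', 'Homo'), ('Homo', 'Homininae');
""")

# Taxa whose parent assignment disagrees between the banks; these are the
# mismatches a repair system would flag before the banks can be unified
mismatches = conn.execute("""
    SELECT a.name, a.parent, b.parent
    FROM bank_a a JOIN bank_b b ON a.name = b.name
    WHERE a.parent <> b.parent
""").fetchall()
```

In the full methodology such queries would run across data banks on different hosts, with the flagged rows fed to the repair step rather than simply listed.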
Gazan, Rozenn; Barré, Tangui; Perignon, Marlène; Maillot, Matthieu; Darmon, Nicole; Vieux, Florent
2018-01-01
The holistic approach required to assess diet sustainability is hindered by the lack of comprehensive databases compiling relevant food metrics. Those metrics are generally scattered across different data sources with various levels of aggregation, hampering their matching. The objective was to develop a general methodology for compiling food metrics describing diet sustainability dimensions into a single database, and to apply it to the French context. Each step of the methodology is detailed: identification and selection of indicators and food metrics, definition of the food list, food matching, and value assignment. For the French case, nutrient and contaminant content, bioavailability factors, distribution of dietary intakes, portion sizes, food prices, and greenhouse gas emission, acidification and marine eutrophication estimates were allocated to 212 commonly consumed generic foods. This generic database compiling 279 metrics will allow the simultaneous evaluation of the four dimensions of diet sustainability, namely the health, economic, social and environmental dimensions. Copyright © 2016 Elsevier Ltd. All rights reserved.
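The core compilation step, matching metrics from heterogeneous sources onto a single generic-food list while keeping gaps explicit, might be sketched as follows. Food names, metric names, and values are invented for illustration.

```python
# Hypothetical generic-food list (the French case used 212 generic foods)
generic_foods = ["apple", "bread", "beef"]

# Metrics scattered across sources with different coverage
nutrient_source = {"apple": {"energy_kcal": 52}, "bread": {"energy_kcal": 265},
                   "beef": {"energy_kcal": 250}}
environment_source = {"apple": {"ghge_kgco2e": 0.4}, "beef": {"ghge_kgco2e": 27.0}}
price_source = {"apple": {"price_eur_kg": 2.5}, "bread": {"price_eur_kg": 1.8}}

def compile_database(food_list, *sources):
    """One row per generic food; metrics absent from every matched source
    stay None rather than being silently dropped."""
    all_metrics = sorted({m for src in sources for vals in src.values() for m in vals})
    db = {}
    for food in food_list:
        row = {m: None for m in all_metrics}
        for src in sources:
            row.update(src.get(food, {}))
        db[food] = row
    return db

food_db = compile_database(generic_foods, nutrient_source, environment_source, price_source)
```

Keeping the missing values explicit makes the later sustainability evaluation aware of coverage gaps instead of quietly mixing complete and incomplete foods.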
Jeong, Sohyun; Han, Nayoung; Choi, Boyoon; Sohn, Minji; Song, Yun-Kyoung; Chung, Myeon-Woo; Na, Han-Sung; Ji, Eunhee; Kim, Hyunah; Rhew, Ki Yon; Kim, Therasa; Kim, In-Wha; Oh, Jung Mi
2016-06-01
To construct a database of published clinical drug trials suitable for use 1) as a research tool in accessing clinical trial information and 2) in evidence-based decision-making by regulatory professionals, clinical research investigators, and medical practitioners. Comprehensive information was obtained from a search of the design elements and results of clinical trials in peer-reviewed journals using PubMed (http://www.ncbi.nlm.nih.gov/pubmed). The methodology to develop a structured database was devised by a panel composed of experts in medicine, pharmacy, and information technology, together with members of the Ministry of Food and Drug Safety (MFDS), using a step-by-step approach. A double-sided system consisting of a user mode and a manager mode served as the framework for the database; elements of interest from each trial were entered via the secure manager mode, enabling the input information to be accessed in a user-friendly manner (user mode). Information regarding the methodology used and the results of drug treatment was extracted as detailed elements of each data set and then entered into the web-based database system. Comprehensive information comprising 2,326 clinical trial records, 90 disease states, and 939 drug entities, and concerning study objectives, background, methods used, results, and conclusions, could be extracted from published information on phase II/III drug intervention clinical trials appearing in SCI journals within the last 10 years. The extracted data were successfully assembled into a clinical drug trial database with easy access, suitable for use as a research tool. The clinically most important therapeutic categories, i.e., cancer, cardiovascular, respiratory, neurological, metabolic, urogenital, gastrointestinal, psychological, and infectious diseases, were covered by the database. Names of test and control drugs, details of primary and secondary outcomes, and indexed keywords could also be retrieved and built into the database.
The construction of the database enables the user to sort and download targeted information as a Microsoft Excel spreadsheet. Because of the comprehensive and standardized nature of the clinical drug trial database and its ease of access, it should serve as a valuable information repository and research tool for accessing clinical trial information and for making evidence-based decisions by regulatory professionals, clinical research investigators, and medical practitioners.
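A miniature of the described user-mode workflow, querying structured trial records by therapeutic category and downloading the result in spreadsheet form, could look like this. The table columns and records are hypothetical, and CSV stands in for the system's Microsoft Excel export.

```python
import csv
import io
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE trial (
    id INTEGER PRIMARY KEY, disease TEXT, test_drug TEXT,
    control_drug TEXT, primary_outcome TEXT)""")
# Invented trial records standing in for extracted journal data
conn.executemany("INSERT INTO trial VALUES (?, ?, ?, ?, ?)", [
    (1, "cardiovascular", "drug_x", "placebo", "MACE at 12 months"),
    (2, "cancer", "drug_y", "standard care", "overall survival"),
])

# User-mode query: sort and filter by therapeutic category
rows = conn.execute(
    "SELECT test_drug, control_drug, primary_outcome FROM trial "
    "WHERE disease = ? ORDER BY test_drug", ("cancer",)).fetchall()

# "Download" step: serialize the result set for spreadsheet use
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["test_drug", "control_drug", "primary_outcome"])
writer.writerows(rows)
export = buf.getvalue()
```

The manager mode would be the write path into the same table; separating the two is what keeps data entry controlled while queries stay open-ended.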
Lazem, Shaimaa; Webster, Mary; Holmes, Wayne; Wolf, Motje
2015-01-01
Here we review 18 articles that describe the design and evaluation of 1 or more games for diabetes from technical, methodological, and theoretical perspectives. We undertook searches covering the period 2010 to May 2015 in the ACM, IEEE, Journal of Medical Internet Research, Studies in Health Technology and Informatics, and Google Scholar online databases using the keywords “children,” “computer games,” “diabetes,” “games,” “type 1,” and “type 2” in various Boolean combinations. The review sets out to establish, for future research, an understanding of the current landscape of digital games designed for children with diabetes. We briefly explored the use and impact of well-established learning theories in such games. The most frequently mentioned theoretical frameworks were social cognitive theory and social constructivism. Due to the limitations of the reported evaluation methodologies, little evidence was found to support the strong promise of games for diabetes. Furthermore, we could not establish a relation between design features and the game outcomes. We argue that an in-depth discussion about the extent to which learning theories could and should be manifested in the design decisions is required. PMID:26337753
Object-oriented analysis and design of an ECG storage and retrieval system integrated with an HIS.
Wang, C; Ohe, K; Sakurai, T; Nagase, T; Kaihara, S
1996-03-01
For a hospital information system, object-oriented methodology plays an increasingly important role, especially for the management of digitized data, e.g., the electrocardiogram, electroencephalogram, electromyogram, spirogram, X-ray, CT and histopathological images, which are not yet computerized in most hospitals. As a first step in an object-oriented approach to hospital information management and to storing medical data in an object-oriented database, we connected electrocardiographs to a hospital network and integrated an ECG storage and retrieval system with the hospital information system. In this paper, the object-oriented analysis and design of the ECG storage and retrieval system are reported.
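In this object-oriented spirit, an ECG recording can be modeled as an object keyed to the HIS patient identifier, with a repository object supplying the store/retrieve interface. This is a schematic sketch with hypothetical class and attribute names, not the authors' actual design.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ECGRecord:
    patient_id: str        # links the waveform to the HIS patient record
    recorded_at: datetime
    samples: list          # digitized waveform, one value per sample
    sampling_rate_hz: int = 500

class ECGRepository:
    """Store and retrieve ECG objects; stands in for the object-oriented
    database layer behind the storage and retrieval system."""
    def __init__(self):
        self._by_patient = {}

    def store(self, record):
        self._by_patient.setdefault(record.patient_id, []).append(record)

    def retrieve(self, patient_id):
        """All ECGs for a patient, oldest first, as a clinician would browse them."""
        return sorted(self._by_patient.get(patient_id, []),
                      key=lambda r: r.recorded_at)

repo = ECGRepository()
repo.store(ECGRecord("P001", datetime(1996, 3, 1, 9, 0), [0.1, 0.2]))
repo.store(ECGRecord("P001", datetime(1996, 2, 1, 9, 0), [0.0, 0.1]))
ecgs = repo.retrieve("P001")
```

The same repository pattern extends naturally to the other digitized modalities the paper lists (EEG, EMG, imaging), which is the point of the object-oriented approach.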
Clifford, Anton; McCalman, Janya; Bainbridge, Roxanne; Tsey, Komla
2015-04-01
This article describes the characteristics and reviews the methodological quality of interventions designed to improve cultural competency in health care for Indigenous peoples of Australia, New Zealand, Canada and the USA. A total of 17 electronic databases and 13 websites were searched for the period 2002-13. Studies were included if they evaluated an intervention strategy designed to improve cultural competency in health care for Indigenous peoples of Australia, New Zealand, the USA or Canada. Information on the characteristics and methodological quality of included studies was extracted using standardized assessment tools. Sixteen published evaluations of interventions to improve cultural competency in health care for Indigenous peoples were identified: 11 for Indigenous peoples of the USA and 5 for Indigenous Australians. The main types of intervention strategies were education and training of the health workforce, culturally specific health programs and recruitment of an Indigenous health workforce. The main positive outcomes reported were improvements in health professionals' confidence and in patients' satisfaction with and access to health care. The methodological quality of evaluations and the reporting of key methodological criteria were variable. Particular problems included weak study designs, low or no reporting of consent rates, confounding, and non-validated measurement instruments. There is a lack of evidence from rigorous evaluations on the effectiveness of interventions for improving cultural competency in health care for Indigenous peoples. Future evaluations should employ more rigorous study designs and extend their measurement of outcomes beyond those relating to health professionals to those relating to the health of Indigenous peoples. © The Author 2015. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.
Assessing the impact of healthcare research: A systematic review of methodological frameworks
Keeley, Thomas J.; Calvert, Melanie J.
2017-01-01
Background Increasingly, researchers need to demonstrate the impact of their research to their sponsors, funders, and fellow academics. However, the most appropriate way of measuring the impact of healthcare research is subject to debate. We aimed to identify the existing methodological frameworks used to measure healthcare research impact and to summarise the common themes and metrics in an impact matrix. Methods and findings Two independent investigators systematically searched the Medical Literature Analysis and Retrieval System Online (MEDLINE), the Excerpta Medica Database (EMBASE), the Cumulative Index to Nursing and Allied Health Literature (CINAHL+), the Health Management Information Consortium, and the Journal of Research Evaluation from inception until May 2017 for publications that presented a methodological framework for research impact. We then summarised the common concepts and themes across methodological frameworks and identified the metrics used to evaluate differing forms of impact. Twenty-four unique methodological frameworks were identified, addressing 5 broad categories of impact: (1) ‘primary research-related impact’, (2) ‘influence on policy making’, (3) ‘health and health systems impact’, (4) ‘health-related and societal impact’, and (5) ‘broader economic impact’. These categories were subdivided into 16 common impact subgroups. Authors of the included publications proposed 80 different metrics aimed at measuring impact in these areas. The main limitation of the study was the potential exclusion of relevant articles, as a consequence of the poor indexing of the databases searched. Conclusions The measurement of research impact is an essential exercise to help direct the allocation of limited research resources, to maximise research benefit, and to help minimise research waste. 
This review provides a collective summary of existing methodological frameworks for research impact, which funders may use to inform the measurement of research impact and researchers may use to inform study design decisions aimed at maximising the short-, medium-, and long-term impact of their research. PMID:28792957
Measurements and Predictions for a Distributed Exhaust Nozzle
NASA Technical Reports Server (NTRS)
Kinzie, Kevin W.; Brown, Martha C.; Schein, David B.; Solomon, W. David, Jr.
2001-01-01
The acoustic and aerodynamic performance characteristics of a distributed exhaust nozzle (DEN) design concept were evaluated experimentally and analytically with the purpose of establishing a design methodology for future DEN technology. Aerodynamic and acoustic measurements were made to evaluate the DEN performance and the CFD design tool. While the CFD approach did provide an excellent prediction of the flowfield and aerodynamic performance characteristics of the DEN and the 2D reference nozzle, the measured acoustic suppression potential of this particular DEN was low. The measurements and predictions indicated that the mini-exhaust jets comprising the distributed exhaust coalesced back into a single-stream jet very shortly after leaving the nozzles. Even so, the database provided here will be useful for future distributed exhaust designs with greater noise reduction and aerodynamic performance potential.
Warwick, Peter D.; Verma, Mahendra K.; Attanasi, Emil; Olea, Ricardo A.; Blondes, Madalyn S.; Freeman, Philip; Brennan, Sean T.; Merrill, Matthew; Jahediesfanjani, Hossein; Roueche, Jacqueline; Lohr, Celeste D.
2017-01-01
The U.S. Geological Survey (USGS) has developed an assessment methodology for estimating the potential incremental technically recoverable oil resources resulting from carbon dioxide-enhanced oil recovery (CO2-EOR) in reservoirs with appropriate depth, pressure, and oil composition. The methodology also includes a procedure for estimating the CO2 that remains in the reservoir after the CO2-EOR process is complete. The methodology relies on a reservoir-level database that incorporates commercially available geologic and engineering data. The mathematical calculations of this assessment methodology were tested and produced realistic results for the Permian Basin Horseshoe Atoll, Upper Pennsylvanian-Wolfcampian Play (Texas, USA). The USGS plans to use the new methodology to conduct an assessment of technically recoverable hydrocarbons and associated CO2 sequestration resulting from CO2-EOR in the United States.
Mungall, Christopher J; Emmert, David B
2007-07-01
A few years ago, FlyBase undertook to design a new database schema to store Drosophila data. It would fully integrate genomic sequence and annotation data with bibliographic, genetic, phenotypic and molecular data from the literature representing a distillation of the first 100 years of research on this major animal model system. In developing this new integrated schema, FlyBase also made a commitment to ensure that its design was generic, extensible and available as open source, so that it could be employed as the core schema of any model organism data repository, thereby avoiding redundant software development and potentially increasing interoperability. Our question was whether we could create a relational database schema that would be successfully reused. Chado is a relational database schema now being used to manage biological knowledge for a wide variety of organisms, from human to pathogens, especially the classes of information that directly or indirectly can be associated with genome sequences or the primary RNA and protein products encoded by a genome. Biological databases that conform to this schema can interoperate with one another, and with application software from the Generic Model Organism Database (GMOD) toolkit. Chado is distinctive because its design is driven by ontologies. The use of ontologies (or controlled vocabularies) is ubiquitous across the schema, as they are used as a means of typing entities. The Chado schema is partitioned into integrated subschemas (modules), each encapsulating a different biological domain, and each described using representations in appropriate ontologies. To illustrate this methodology, we describe here the Chado modules used for describing genomic sequences. GMOD is a collaboration of several model organism database groups, including FlyBase, to develop a set of open-source software for managing model organism data. 
The Chado schema is freely distributed under the terms of the Artistic License (http://www.opensource.org/licenses/artistic-license.php) from GMOD (www.gmod.org).
Validation Database Based Thermal Analysis of an Advanced RPS Concept
NASA Technical Reports Server (NTRS)
Balint, Tibor S.; Emis, Nickolas D.
2006-01-01
Advanced RPS concepts can be conceived, designed and assessed using high-end computational analysis tools. These predictions may provide an initial insight into the potential performance of these models, but verification and validation are necessary and required steps to gain confidence in the numerical analysis results. This paper discusses the findings from a numerical validation exercise for a small advanced RPS concept, based on a thermal analysis methodology developed at JPL and on a validation database obtained from experiments performed at Oregon State University. Both the numerical and experimental configurations utilized a single GPHS module enabled design, resembling a Mod-RTG concept. The analysis focused on operating and environmental conditions during the storage phase only. This validation exercise helped to refine key thermal analysis and modeling parameters, such as heat transfer coefficients, and conductivity and radiation heat transfer values. Improved understanding of the Mod-RTG concept through validation of the thermal model allows for future improvements to this power system concept.
Menditto, Enrica; Bolufer De Gea, Angela; Cahir, Caitriona; Marengoni, Alessandra; Riegler, Salvatore; Fico, Giuseppe; Costa, Elisio; Monaco, Alessandro; Pecorelli, Sergio; Pani, Luca; Prados-Torres, Alexandra
2016-01-01
Computerized health care databases have been widely described as an excellent opportunity for research. The availability of “big data” has brought about a wave of innovation in health services research projects. Most of the available secondary data sources are restricted to the geographical scope of a given country and are heterogeneous in structure and content. Under the umbrella of the European Innovation Partnership on Active and Healthy Ageing, collaborative work conducted by the partners of the group on “adherence to prescription and medical plans” identified the use of observational, large-population databases to monitor medication-taking behavior in the elderly. This article describes the methodology used to gather the information from available databases among the Adherence Action Group partners with the aim of improving data sharing at a European level. A total of six databases belonging to three European countries (Spain, the Republic of Ireland, and Italy) were included in the analysis. Preliminary results suggest that there are some similarities. However, these results should be applied in different contexts and European countries, supporting the idea that large European studies should be designed to get the most out of already available databases. PMID:27358570
NASA Astrophysics Data System (ADS)
Chen, J.; Wang, D.; Zhao, R. L.; Zhang, H.; Liao, A.; Jiu, J.
2014-04-01
Geospatial databases are an irreplaceable national asset of immense importance. Their up-to-dateness, that is, their consistency with respect to the real world, plays a critical role in their value and applications. Continuously updating map databases at 1:50,000 scale is a massive and difficult task for larger countries covering more than several million square kilometers. This paper presents the research and technological development undertaken to support national map updating at 1:50,000 scale in China, including the development of updating models and methods, production tools and systems for large-scale and rapid updating, and the design and implementation of a continuous updating workflow. Many data sources had to be used and integrated to form a high-accuracy, quality-checked product, which in turn required up-to-date techniques for image matching, semantic integration, generalization, database management and conflict resolution. Specific software tools and packages were designed and developed to support large-scale updating production with high-resolution imagery and large-scale data generalization, covering map generalization, GIS-supported change interpretation from imagery, DEM interpolation, image-matching-based orthophoto generation, and data control at different levels. A national 1:50,000 database updating strategy and its production workflow were designed, including a full-coverage updating pattern characterized by all-element topographic data modeling, change detection in all related areas and whole-process data quality control, a series of technical production specifications, and a network of updating production units in different geographic locations across the country.
Deep Borehole Emplacement Mode Hazard Analysis Revision 0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sevougian, S. David
This letter report outlines a methodology and provides resource information for the Deep Borehole Emplacement Mode Hazard Analysis (DBEMHA). The main purpose is to identify the accident hazards and accident event sequences associated with the two emplacement mode options (wireline or drill string), to outline a methodology for computing accident probabilities and frequencies, and to point to available databases on the nature and frequency of accidents typically associated with standard borehole drilling and nuclear handling operations. Risk mitigation and prevention measures, which have been incorporated into the two emplacement designs (see Cochran and Hardin 2015), are also discussed. A key intent of this report is to provide background information to brief subject matter experts involved in the Emplacement Mode Design Study. [Note: Revision 0 of this report concentrates more on the wireline emplacement mode. It is expected that Revision 1 will contain further development of the preliminary fault and event trees for the drill string emplacement mode.]
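The accident-frequency arithmetic such a methodology rests on is simple in outline: an event sequence's frequency is the initiating-event frequency multiplied by the conditional probabilities of each subsequent branch failure along the event tree. A minimal sketch, with entirely made-up numbers and a hypothetical wireline sequence:

```python
# Illustrative event-tree arithmetic for an emplacement accident sequence:
# sequence frequency = initiating-event frequency x product of the conditional
# branch-failure probabilities along the path. All numbers are placeholders,
# not values from the DBEMHA report.
def sequence_frequency(initiator_per_year, branch_probabilities):
    freq = initiator_per_year
    for p in branch_probabilities:
        freq *= p
    return freq

# Hypothetical wireline sequence: cable-break initiator (1e-2/yr), followed by
# failure of an emergency brake (1e-3) and failure of an impact limiter (1e-1).
f = sequence_frequency(1e-2, [1e-3, 1e-1])
print(f)  # ~1e-06 per year
```

Databases of drilling and nuclear-handling incident rates supply the initiator frequencies; the branch probabilities come from the reliability of the mitigation features built into each emplacement design.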
An Overview of Meta-Analyses of Danhong Injection for Unstable Angina.
Zhang, Xiaoxia; Wang, Hui; Chang, Yanxu; Wang, Yuefei; Lei, Xiang; Fu, Shufei; Zhang, Junhua
2015-01-01
Objective. To systematically collect evidence and evaluate the effects of Danhong injection (DHI) for unstable angina (UA). Methods. A comprehensive search was conducted in seven electronic databases up to January 2015. The methodological and reporting quality of the included studies was assessed using AMSTAR and PRISMA. Results. Five articles were included. Their conclusions suggest that DHI plus conventional medicine was effective for treating UA pectoris, could alleviate symptoms of angina and improve electrocardiograms. Flaws of the original studies and systematic reviews weaken the strength of the evidence. Limitations in methodological quality include incomprehensive literature searches, lack of detailed study characteristics, neglect of clinical heterogeneity, and failure to assess publication bias and other forms of bias. Reporting flaws of the systematic reviews included the absence of a structured summary and of a standardized search strategy. For the pooled findings, researchers took statistical heterogeneity into consideration, but clinical and methodological heterogeneity were ignored. Conclusion. DHI plus conventional medicine generally appears to be effective for UA treatment. However, the evidence is not strong enough owing to methodological flaws in the original clinical trials and systematic reviews. Rigorously designed randomized controlled trials are needed, and the methodological and reporting quality of systematic reviews should be improved. PMID:26539221
Report on FY17 testing in support of integrated EPP-SMT design methods development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Yanli .; Jetter, Robert I.; Sham, T. -L.
The goal of the proposed integrated Elastic Perfectly-Plastic (EPP) and Simplified Model Test (SMT) methodology is to incorporate an SMT data-based approach for creep-fatigue damage evaluation into the EPP methodology, avoiding the separate evaluation of creep and fatigue damage and eliminating the requirement for stress classification in current methods, thus greatly simplifying the evaluation of elevated-temperature cyclic service. The purpose of this methodology is to minimize over-conservatism while properly accounting for localized defects and stress risers. To support the implementation of the proposed methodology and to verify the applicability of the code rules, thermomechanical testing continued in FY17. This report presents the recent test results for Type 1 SMT specimens on Alloy 617 with long hold times, pressurization SMT on Alloy 617, and two-bar thermal ratcheting test results on SS316H over the temperature range of 405 °C to 705 °C. Preliminary EPP strain range analyses of the two-bar tests are critically evaluated and compared with the experimental results.
Automating the Generation of the Cassini Tour Atlas Database
NASA Technical Reports Server (NTRS)
Grazier, Kevin R.; Roumeliotis, Chris; Lange, Robert D.
2010-01-01
The Tour Atlas is a large database of geometrical tables, plots, and graphics used by Cassini science planning engineers and scientists primarily for science observation planning. Over time, as the contents of the Tour Atlas grew, the amount of time it took to recreate the Tour Atlas similarly grew--to the point that it took one person a week of effort. When Cassini tour designers estimated that they were going to create approximately 30 candidate Extended Mission trajectories--which needed to be analyzed for science return in a short amount of time--it became a necessity to automate. We report on the automation methodology that reduced the amount of time it took one person to (re)generate a Tour Atlas from a week to, literally, one UNIX command.
Rezaei-Hachesu, Peyman; Samad-Soltani, Taha; Yaghoubi, Sajad; GhaziSaeedi, Marjan; Mirnia, Kayvan; Masoumi-Asl, Hossein; Safdari, Reza
2018-07-01
Neonatal intensive care units (NICUs) have complex patients in terms of their diagnoses and required treatments. Antimicrobial treatment is a common therapy for patients in NICUs. To solve problems pertaining to empirical therapy, antimicrobial stewardship programs have recently been introduced. Despite the success of these programs in terms of data collection, there is still inefficiency in analyzing and reporting the data. Thus, to successfully implement these stewardship programs, the design of antimicrobial resistance (AMR) surveillance systems is recommended as a first step. Accordingly, this study aimed to design an AMR surveillance system for use in the NICUs of northwestern Iranian hospitals to cover these information gaps. The recommended system is compatible with World Health Organization (WHO) guidelines. The business intelligence (BI) requirements were extracted in an interview with a product owner (PO) using a valid and reliable checklist. Following this, an AMR surveillance system was designed and evaluated in relation to user experiences via a user experience questionnaire (UEQ). Finally, an association analysis was performed on the database, and the results were reported by identifying the important multidrug resistances in the database. A customized software development methodology was proposed. The three major modules of the AMR surveillance system are the data registry, dashboard, and decision support modules. The data registry module was implemented based on a three-tier architecture, and the Clinical Decision Support System (CDSS) and dashboard modules were designed based on the BI requirements of the Scrum PO. The mean values of the UEQ measures were in a good range, indicating the suitable usability of the AMR surveillance system. Applying efficient software development methodologies allows systems to be compatible with users' opinions and requirements.
In addition, the construction of interdisciplinary communication models for research and software engineering allows for research and development concepts to be used in operational environments. Copyright © 2018 Elsevier B.V. All rights reserved.
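The association-analysis step can be illustrated with a toy support/confidence computation over antibiogram-style records. The isolates and antibiotics below are invented, and the study's actual mining method is not specified beyond "association analysis"; this is only a minimal sketch of the idea.

```python
# Toy association analysis over antibiogram records, in the spirit of mining
# multidrug-resistance patterns. Each record is the set of antibiotics an
# isolate resisted; all data here are invented for illustration.
isolates = [
    {"ampicillin", "gentamicin"},
    {"ampicillin", "gentamicin", "cefotaxime"},
    {"ampicillin"},
    {"gentamicin", "cefotaxime"},
]

def support(itemset):
    """Fraction of isolates whose resistance profile contains the itemset."""
    return sum(itemset <= rec for rec in isolates) / len(isolates)

def confidence(antecedent, consequent):
    """P(consequent resistance | antecedent resistance)."""
    return support(antecedent | consequent) / support(antecedent)

# Rule: resistance to ampicillin -> resistance to gentamicin
print(support({"ampicillin", "gentamicin"}))       # 0.5
print(confidence({"ampicillin"}, {"gentamicin"}))  # ~0.67
```

High-support, high-confidence pairs are the candidate multidrug-resistance patterns a surveillance dashboard would surface to clinicians.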
Comparison of flavonoid intake assessment methods.
Ivey, Kerry L; Croft, Kevin; Prince, Richard L; Hodgson, Jonathan M
2016-09-14
Flavonoids are a diverse group of polyphenolic compounds found in high concentrations in many plant foods and beverages. High flavonoid intake has been associated with reduced risk of chronic disease. To date, population-based studies have used the United States Department of Agriculture (USDA) food content database to determine habitual flavonoid intake. More recently, a new flavonoid food content database, Phenol-Explorer (PE), has been developed; however, the level of agreement between the two databases is yet to be explored. Our aims were to compare the methods used to create each database and to explore the level of agreement between the flavonoid intake estimates derived from USDA and PE data. The study population included 1063 randomly selected women aged over 75 years. Two separate intake estimates were determined using food composition data from the USDA and PE databases. There were many similarities in the methods used to create each database; however, several methodological differences manifest themselves in differences in flavonoid intake estimates between the two databases. Despite differences in net estimates, there was a strong level of agreement between the total-flavonoid, flavanol, flavanone and anthocyanidin intake estimates derived from each database. Intake estimates for flavanol monomers showed greater agreement than those for flavanol polymers. The level of agreement between the two databases was weakest for the flavonol and flavone intake estimates. In this population, the application of USDA and PE source data yielded highly correlated intake estimates for total-flavonoids, flavanols, flavanones and anthocyanidins. For these sub-classes, the USDA and PE databases may be used interchangeably in epidemiological investigations. There was poorer correlation between intake estimates for flavonols and flavones due to differences in USDA and PE methodologies.
Individual flavonoid compound groups that comprise flavonoid sub-classes had varying levels of agreement. As such, when determining the appropriate database to calculate flavonoid intake variables, it is important to consider methodologies underpinning database creation and which foods are important contributors to dietary intake in the population of interest.
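At its simplest, the agreement assessment between two databases' intake estimates reduces to a correlation over paired per-subject values. A minimal sketch with invented data (the paper does not publish per-subject values, and its exact agreement statistics may differ):

```python
import math

# Sketch of checking agreement between flavonoid intake estimates derived
# from two food-composition databases (e.g. USDA vs Phenol-Explorer).
# The paired per-subject estimates below are invented for illustration.
usda = [120.0, 85.0, 200.0, 60.0, 150.0]   # mg/day per subject, database A
pe   = [110.0, 90.0, 210.0, 55.0, 140.0]   # mg/day per subject, database B

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson(usda, pe)
print(round(r, 3))  # close to 1 when the two databases agree
```

A high correlation, as found here for total flavonoids, flavanols, flavanones and anthocyanidins, is what justifies using the databases interchangeably in epidemiological analyses even when their absolute (net) estimates differ.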
Long-Term Durability Analysis of a 100,000+ Hr Stirling Power Convertor Heater Head
NASA Technical Reports Server (NTRS)
Bartolotta, Paul A.; Bowman, Randy R.; Krause, David L.; Halford, Gary R.
2000-01-01
DOE and NASA have identified Stirling Radioisotope Power Systems (SRPS) as the power supply for deep space exploration missions such as the Europa Orbiter and Solar Probe. As part of this effort, NASA has initiated a long-term durability project for critical hot-section components of the Stirling power convertor to qualify flight hardware. This project will develop a life prediction methodology that utilizes short-term (t < 20,000 hr) test data to verify long-term (t > 100,000 hr) design life. The project consists of generating a materials database for the specific heater head alloy, evaluation of critical hermetically sealed joints, life model characterization, and model verification. This paper describes the qualification methodology being developed and provides a status of this effort.
Subjective comparison and evaluation of speech enhancement algorithms
Hu, Yi; Loizou, Philipos C.
2007-01-01
Making meaningful comparisons between the performance of the various speech enhancement algorithms proposed over the years has been elusive due to the lack of a common speech database, differences in the types of noise used, and differences in testing methodology. To facilitate such comparisons, we report on the development of a noisy speech corpus suitable for the evaluation of speech enhancement algorithms. This corpus is subsequently used for the subjective evaluation of 13 speech enhancement methods encompassing four classes of algorithms: spectral subtractive, subspace, statistical-model-based and Wiener-type algorithms. The subjective evaluation was performed by Dynastat, Inc. using the ITU-T P.835 methodology designed to evaluate speech quality along three dimensions: signal distortion, noise distortion and overall quality. This paper reports the results of the subjective tests. PMID:18046463
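P.835-style scoring reduces each algorithm/noise condition to three mean opinion scores, one per 5-point rating scale. A minimal sketch with invented listener ratings (the actual P.835 protocol also prescribes trial structure and anchoring, which this omits):

```python
# Sketch of ITU-T P.835-style scoring: listeners rate each processed sample
# on three 5-point scales: signal distortion (SIG), background noise
# intrusiveness (BAK), and overall quality (OVRL). Per-condition scores are
# the means of listener ratings. All ratings below are invented.
ratings = {  # one enhancement algorithm under one noise condition
    "SIG":  [4, 4, 3, 5, 4],
    "BAK":  [3, 3, 4, 3, 3],
    "OVRL": [4, 3, 4, 4, 3],
}
mos = {scale: sum(r) / len(r) for scale, r in ratings.items()}
print(mos)  # {'SIG': 4.0, 'BAK': 3.2, 'OVRL': 3.6}
```

Reporting the three scales separately is the point of P.835: an algorithm can suppress noise aggressively (high BAK) while distorting the speech itself (low SIG), and a single overall score would hide that trade-off.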
FY16 Progress Report on Test Results In Support Of Integrated EPP and SMT Design Methods Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Yanli; Jetter, Robert I.; Sham, T. -L.
2016-08-08
The proposed integrated Elastic Perfectly-Plastic (EPP) and Simplified Model Test (SMT) methodology consists of incorporating an SMT data-based approach for creep-fatigue damage evaluation into the EPP methodology to avoid using the creep-fatigue interaction diagram (the D diagram) and to minimize over-conservatism while properly accounting for localized defects and stress risers. To support the implementation of the proposed code rules and to verify their applicability, a series of thermomechanical tests has been initiated. This report presents the recent test results for Type 2 SMT specimens on Alloy 617, pressurization SMT on Alloy 617, Type 1 SMT on Gr. 91, and two-bar thermal ratcheting test results on Alloy 617 with a new thermal loading profile.
Chen, Bi-Cang; Wu, Qiu-Ying; Xiang, Cheng-Bin; Zhou, Yi; Guo, Ling-Xiang; Zhao, Neng-Jiang; Yang, Shu-Yu
2006-01-01
To evaluate the quality of reports published in the past 10 years in China on quantitative analysis of syndrome differentiation for diabetes mellitus (DM), in order to explore the methodological problems in these reports and find possible solutions. The main medical literature databases in China were searched. Thirty-one articles were included and evaluated by the principles of clinical epidemiology. There were many mistakes and deficiencies in these articles, concerning clinical trial design, diagnostic criteria for DM, standards of syndrome differentiation of DM, case inclusion and exclusion criteria, sample size estimation, data comparability and statistical methods. It is necessary and important to improve the quality of reports on quantitative analysis of syndrome differentiation of DM in light of the principles of clinical epidemiology.
Online Database Coverage of Pharmaceutical Journals.
ERIC Educational Resources Information Center
Snow, Bonnie
1984-01-01
Describes compilation of data concerning pharmaceutical journal coverage in online databases which aid information providers in collection development and database selection. Methodology, results (a core collection, overlap, timeliness, geographic scope), and implications are discussed. Eight references and a list of 337 journals indexed online in…
Gavrielides, Marios A.; Kinnard, Lisa M.; Myers, Kyle J.; Peregoy, Jennifer; Pritchard, William F.; Zeng, Rongping; Esparza, Juan; Karanian, John; Petrick, Nicholas
2010-01-01
A number of interrelated factors can affect the precision and accuracy of lung nodule size estimation. To quantify the effect of these factors, we have been conducting phantom CT studies using an anthropomorphic thoracic phantom containing a vasculature insert to which synthetic nodules were inserted or attached. Ten repeat scans were acquired on different multi-detector scanners, using several sets of acquisition and reconstruction protocols and various nodule characteristics (size, shape, density, location). This study design enables both bias and variance analysis for the nodule size estimation task. The resulting database is in the process of becoming publicly available as a resource to facilitate the assessment of lung nodule size estimation methodologies and to enable comparisons between different methods regarding measurement error. This resource complements public databases of clinical data and will contribute towards the development of procedures that will maximize the utility of CT imaging for lung cancer screening and tumor therapy evaluation. PMID:20640011
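The bias-and-variance analysis such a phantom database enables is straightforward once a ground-truth nodule size is known: bias is the mean measurement error across repeat scans, and the repeat-scan standard deviation captures precision. A sketch with invented measurements (the database's actual protocols and values differ):

```python
import statistics

# Sketch of the bias/variance analysis a phantom CT database enables:
# repeat measurements of a synthetic nodule of known size. The ten
# "repeat scan" values below are invented placeholders.
true_diameter_mm = 10.0
measured_mm = [10.4, 9.8, 10.1, 10.6, 9.9, 10.2, 10.3, 9.7, 10.0, 10.5]

bias = statistics.mean(measured_mm) - true_diameter_mm   # systematic error
repeatability_sd = statistics.stdev(measured_mm)         # precision across repeats
print(round(bias, 3), "mm bias;", round(repeatability_sd, 3), "mm repeatability SD")
```

Clinical data cannot separate these two error components because the true nodule size is unknown; the phantom's ground truth is what makes bias estimable at all.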
Draft secure medical database standard.
Pangalos, George
2002-01-01
Medical database security is a particularly important issue for all healthcare establishments. Medical information systems are intended to support a wide range of pertinent health issues today, for example: assuring the quality of care, supporting effective management of health services institutions, monitoring and containing the cost of care, implementing technology into care without violating social values, ensuring the equity and availability of care, and preserving humanity despite the proliferation of technology. In this context, medical database security aims primarily to support high availability, accuracy and consistency of the stored data, medical professional secrecy and confidentiality, and protection of the privacy of the patient. These properties, though of a technical nature, basically require that the system is actually helpful for medical care and not harmful to patients. The latter properties require in turn not only that fundamental ethical principles are not violated by employing database systems, but that they are effectively enforced by technical means. This document reviews existing and emerging work on the security of medical database systems. It presents in detail the problems and requirements related to medical database security, and addresses medical database security policies, secure design methodologies and implementation techniques. It also describes the current legal framework and regulatory requirements for medical database security, and examines the issue of medical database security guidelines in detail. Current national and international efforts in the area are studied, and an overview of research work in the area is given. The document also presents in detail the most complete set of security guidelines known to us for the development and operation of medical database systems.
Meulepas, Johanna M; Ronckers, Cécile M; Smets, Anne M J B; Nievelstein, Rutger A J; Jahnen, Andreas; Lee, Choonsik; Kieft, Mariëtte; Laméris, Johan S; van Herk, Marcel; Greuter, Marcel J W; Jeukens, Cécile R L P N; van Straten, Marcel; Visser, Otto; van Leeuwen, Flora E; Hauptmann, Michael
2014-04-01
Computed tomography (CT) scans are indispensable in modern medicine; however, the spectacular rise in global use coupled with relatively high doses of ionizing radiation per examination have raised radiation protection concerns. Children are of particular concern because they are more sensitive to radiation-induced cancer compared with adults and have a long lifespan to express harmful effects which may offset clinical benefits of performing a scan. This paper describes the design and methodology of a nationwide study, the Dutch Pediatric CT Study, regarding risk of leukemia and brain tumors in children after radiation exposure from CT scans. It is a retrospective record-linkage cohort study with an expected number of 100,000 children who received at least one electronically archived CT scan covering the calendar period since the introduction of digital archiving until 2012. Information on all archived CT scans of these children will be obtained, including date of examination, scanned body part and radiologist's report, as well as the machine settings required for organ dose estimation. We will obtain cancer incidence by record linkage with external databases. In this article, we describe several approaches to the collection of data on archived CT scans, the estimation of radiation doses and the assessment of confounding. The proposed approaches provide useful strategies for data collection and confounder assessment for general retrospective record-linkage studies, particular those using hospital databases on radiological procedures for the assessment of exposure to ionizing or non-ionizing radiation.
Ho, Robin S T; Wu, Xinyin; Yuan, Jinqiu; Liu, Siya; Lai, Xin; Wong, Samuel Y S; Chung, Vincent C H
2015-01-08
Meta-analysis (MA) of randomised trials is considered to be one of the best approaches for summarising high-quality evidence on the efficacy and safety of treatments. However, methodological flaws in MAs can reduce the validity of conclusions, subsequently impairing the quality of decision making. To assess the methodological quality of MAs on COPD treatments, we conducted a cross-sectional study on MAs of COPD trials. MAs published during 2000-2013 were sampled from the Cochrane Database of Systematic Reviews and the Database of Abstracts of Reviews of Effects. Methodological quality was assessed using the validated AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool. Seventy-nine MAs were sampled. Only 18% considered the scientific quality of primary studies when formulating conclusions, and 49% used appropriate meta-analytic methods to combine findings. The problems were particularly acute among MAs on pharmacological treatments. In 48% of MAs the authors did not report conflicts of interest. Fifty-eight percent reported harmful effects of treatment. Publication bias was not assessed in 65% of MAs, and only 10% had searched non-English databases. The methodological quality of the included MAs was disappointing. Consideration of scientific quality when formulating conclusions should be made explicit. Future MAs should improve on reporting of conflicts of interest and harms, assessment of publication bias, prevention of language bias, and use of appropriate meta-analytic methods. PMID:25569783
EQUIP: A European Survey of Quality Criteria for the Evaluation of Databases.
ERIC Educational Resources Information Center
Wilson, T. D.
1998-01-01
Reports on two stages of an investigation into the perceived quality of online databases. Presents data from 989 questionnaires from 600 database users in 12 European and Scandinavian countries and results of a test of the SERVQUAL methodology for identifying user expectations about database services. Lists statements used in the SERVQUAL survey.…
Jet aircraft engine emissions database development: 1992 military, charter, and nonscheduled traffic
NASA Technical Reports Server (NTRS)
Metwally, Munir
1995-01-01
Studies relating to an environmental emissions database for military, charter, and non-scheduled traffic for the year 1992 were conducted by McDonnell Douglas Aerospace Transport Aircraft. The report also includes a comparison with a previous emissions database for the year 1990. Discussions of the methodology used in formulating these databases are provided.
Development of a Water Infrastructure Knowledge Database
This paper presents a methodology for developing a national database, as applied to water infrastructure systems, which includes both drinking water and wastewater. The database is branded as "WATERiD" and can be accessed at www.waterid.org. Water infrastructure in the U.S. is ag...
Theory-based interventions in physical activity: a systematic review of literature in Iran.
Abdi, Jalal; Eftekhar, Hassan; Estebsari, Fatemeh; Sadeghi, Roya
2014-11-30
Lack of physical activity is ranked fourth among the causes of human death and chronic diseases. Using models and theories to design, implement, and evaluate health education and health promotion interventions has many advantages. Using models and theories of physical activity, we decided to systematically study the educational and promotional interventions carried out in Iran from 2003 to 2013. Three information databases were used to systematically select papers using key words: the Iranian Magazine Database (MAGIRAN), the Iran Medical Library (MEDLIB), and the Scientific Information Database (SID). Twenty papers were selected and studied. Having been applied in 9 studies, the Trans-Theoretical Model (TTM) was the most widespread model in Iran (PENDER in 3 studies, BASNEF in 2, and the Theory of Planned Behavior in 2 studies). With regard to educational methods, almost all studies used a combination of methods; the most widely used integrative educational method was group discussion. Only one integrated study was done. Behavior maintenance was not addressed in 75% of the studies. Almost all studies used self-reporting instruments. The effectiveness of educational methods was assessed in none of the studies. Most of the included studies had several methodological weaknesses, which hinder the validity and applicability of their results. According to the findings, we suggest conducting needs assessment when using models, seeking epidemiological and methodological consultation, addressing maintenance of physical activity, using other theories and models such as social marketing and social-cognitive theory, and employing other educational methods such as empirical and complementary ones.
MicroRNAs for osteosarcoma in the mouse: a meta-analysis
Chang, Junli; Yao, Min; Li, Yimian; Zhao, Dongfeng; Hu, Shaopu; Cui, Xuejun; Liu, Gang; Shi, Qi; Wang, Yongjun; Yang, Yanping
2016-01-01
Osteosarcoma (OS) is the most common primary malignant bone tumor, with high morbidity, occurring mainly in children and young adults. As key components of gene-regulatory networks, microRNAs (miRNAs) control many critical pathophysiological processes, including the initiation and progression of cancers. The objective of this study is to summarize and evaluate the potential of miRNAs as targets for the prevention and treatment of OS in mouse models, and to explore the methodological quality of current studies. We searched PubMed, Web of Science, Embase, Wan Fang Database, VIP Database, China Knowledge Resource Integrated Database, and Chinese BioMedical from their start dates to 10 May 2016. Two reviewers separately screened the controlled studies that estimate the effects of miRNAs on osteosarcoma in mice. A pair-wise analysis was performed. Thirty-six studies with adequate randomization were selected and included in the meta-analysis. We found that blocking oncogenic miRNAs or restoring decreased miRNAs in cancer cells could significantly suppress the progression of OS in vivo, as assessed by tumor volume and tumor weight. This meta-analysis suggests that miRNAs are potential therapeutic targets for OS and that correcting the altered expression of miRNAs significantly suppresses the progression of OS in mouse models. However, the overall methodological quality of the studies included here was low, and more animal studies with rigorous design must be carried out before a miRNA-based treatment can be translated from animal studies to clinical trials. PMID:27852052
Systematic review of the methodological and reporting quality of case series in surgery.
Agha, R A; Fowler, A J; Lee, S-Y; Gundogan, B; Whitehurst, K; Sagoo, H K; Jeong, K J L; Altman, D G; Orgill, D P
2016-09-01
Case series are an important and common study type. No guideline exists for reporting case series and there is evidence of key data being missed from such reports. The first step in the process of developing a methodologically sound reporting guideline is a systematic review of literature relevant to the reporting deficiencies of case series. A systematic review of methodological and reporting quality in surgical case series was performed. The electronic search strategy was developed by an information specialist and included MEDLINE, Embase, Cochrane Methods Register, Science Citation Index and Conference Proceedings Citation Index, from the start of indexing to 5 November 2014. Independent screening, eligibility assessments and data extraction were performed. Included articles were then analysed for five areas of deficiency: failure to use standardized definitions, missing or selective data (including the omission of whole cases or important variables), transparency or incomplete reporting, whether alternative study designs were considered, and other issues. Database searching identified 2205 records. Through the process of screening and eligibility assessments, 92 articles met inclusion criteria. Frequencies of methodological and reporting issues identified were: failure to use standardized definitions (57 per cent), missing or selective data (66 per cent), transparency or incomplete reporting (70 per cent), whether alternative study designs were considered (11 per cent) and other issues (52 per cent). The methodological and reporting quality of surgical case series needs improvement. The data indicate that evidence-based guidelines for the conduct and reporting of case series may be useful. © 2016 BJS Society Ltd Published by John Wiley & Sons Ltd.
Effects of traditional Chinese patent medicine on essential hypertension: a systematic review.
Xiong, Xingjiang; Wang, Pengqian; Zhang, Yuqing; Li, Xiaoke
2015-02-01
Traditional Chinese patent medicine (TCPM) is widely used for essential hypertension (EH) in China. However, there is no critically appraised evidence, such as systematic reviews or meta-analyses, regarding the potential benefits and disadvantages of TCPM to justify their clinical use and recommendation. The aim of this review was to systematically evaluate and meta-analyze the effects of TCPM for EH. Seven databases, the Cochrane Library, PubMed, EMBASE, the China National Knowledge Infrastructure, the Chinese Scientific Journal Database, the Chinese Biomedical Literature Database, and the Wanfang Database, were searched from their inception to August 2014 for relevant studies that compared one TCPM plus antihypertensive drugs versus antihypertensive drugs alone. The methodological quality of the included trials was assessed using the Cochrane risk-of-bias tool. The primary outcome measures were mortality or progression to severe complications and adverse events. The secondary outcome measures were blood pressure (BP) and quality of life (QOL). Seventy-three trials, which included 8138 patients, on 17 TCPMs were included. In general, the methodological quality was low. Two trials evaluated the effects of TCPMs on mortality and the progression to severe complications after treatment, and no significant difference was identified compared with antihypertensive drugs alone. No severe adverse events were reported. Thirteen TCPMs used in complementary therapy significantly decreased systolic BP by 3.94 to 13.50 mmHg and diastolic BP by 2.28 to 11.25 mmHg. QOL was significantly improved by TCPM plus antihypertensive drugs compared with antihypertensive drugs alone. This systematic review provided the first classification of clinical evidence for the effectiveness of TCPM for EH. The usage of TCPMs for EH was supported by evidence of class level III. 
As a result of the methodological drawbacks of the included studies, more rigorously designed randomized controlled trials that focus on mortality and cardiovascular events during long-term follow-up are warranted before TCPM can be recommended for hypertensive patients. Two TCPMs, Song ling xue mai kang capsules and Yang xue qing nao granules, should be prioritized for further research.
NASA Technical Reports Server (NTRS)
Campbell, William J.; Short, Nicholas M., Jr.; Roelofs, Larry H.; Dorfman, Erik
1991-01-01
A methodology for optimizing organization of data obtained by NASA earth and space missions is discussed. The methodology uses a concept based on semantic data modeling techniques implemented in a hierarchical storage model. The modeling is used to organize objects in mass storage devices, relational database systems, and object-oriented databases. The semantic data modeling at the metadata record level is examined, including the simulation of a knowledge base and semantic metadata storage issues. The semantic data model hierarchy and its application for efficient data storage is addressed, as is the mapping of the application structure to the mass storage.
Hoskin, Jordan D; Miyatani, Masae; Craven, B Catharine
2017-03-30
Carotid intima-media thickness (cIMT) may be used increasingly as a cardiovascular disease (CVD) screening tool in individuals with spinal cord injury (SCI), as other routine invasive diagnostic tests are often unfeasible. However, variation in cIMT acquisition and analysis methods is an issue in the current published literature. The growth of the field depends on quality cIMT acquisition and analysis to ensure accurate reporting of CVD risk. The purpose of this study is to evaluate the quality of the reported methodology used to collect cIMT values in SCI. Data from 12 studies, which measured cIMT in individuals with SCI, were identified from the Medline, Embase and CINAHL databases. The quality of the reported methodologies was scored based on adherence to cIMT methodological guidelines abstracted from two consensus papers. Five studies were scored as 'moderate quality' in methodological reporting, having satisfied 9 to 11 of 15 quality reporting criteria. The remaining seven studies were scored as 'low quality', having reported fewer than 9 of 15 quality reporting criteria. No study had methodological reporting that was scored as 'high quality'. The overall reporting of quality methodology was poor in the published SCI literature. Greater adherence to current methodological guidelines is needed to advance the field of cIMT in SCI. Further research is necessary to refine cIMT acquisition and analysis guidelines to aid authors designing research and journals in screening manuscripts for publication.
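The reported scoring scheme (9 to 11 of 15 criteria met = 'moderate', fewer than 9 = 'low') maps directly onto a small classifier. The 'high' cut-off of 12 or more is an assumption, since the abstract does not state it:

```python
def quality_rating(criteria_met, high_cutoff=12, moderate_cutoff=9):
    """Classify methodological reporting quality from criteria met (of 15).

    Thresholds follow the abstract: 9-11 of 15 -> moderate, <9 -> low.
    The 'high' cut-off (>= 12) is an assumption, not stated in the abstract."""
    if criteria_met >= high_cutoff:
        return "high"
    if criteria_met >= moderate_cutoff:
        return "moderate"
    return "low"

ratings = [quality_rating(n) for n in (11, 9, 8, 14)]
```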
Full value documentation in the Czech Food Composition Database.
Machackova, M; Holasova, M; Maskova, E
2010-11-01
The aim of this project was to launch a new Food Composition Database (FCDB) Programme in the Czech Republic; to implement a methodology for food description and value documentation according to the standards designed by the European Food Information Resource (EuroFIR) Network of Excellence; and to start the compilation of a pilot FCDB. Foods for the initial data set were selected from the list of foods included in the Czech Food Consumption Basket. Selection of 24 priority components was based on the range of components used in former Czech tables. The priority list was extended with components for which original Czech analytical data or calculated data were available. Values that were input into the compiled database were documented according to the EuroFIR standards within the entities FOOD, COMPONENT, VALUE and REFERENCE using Excel sheets. Foods were described using the LanguaL Thesaurus. A template for documentation of data according to the EuroFIR standards was designed. The initial data set comprised documented data for 162 foods. Values were based on original Czech analytical data (available for traditional and fast foods, milk and milk products, wheat flour types), data derived from literature (for example, fruits, vegetables, nuts, legumes, eggs) and calculated data. The Czech FCDB programme has been successfully relaunched. Inclusion of the Czech data set into the EuroFIR eSearch facility confirmed compliance of the database format with the EuroFIR standards. Excel spreadsheets are applicable for full value documentation in the FCDB.
Coupling Computer-Aided Process Simulation and ...
A methodology is described for developing a gate-to-gate life cycle inventory (LCI) of a chemical manufacturing process to support the application of life cycle assessment in the design and regulation of sustainable chemicals. The inventories were derived by first applying process design and simulation to develop a process flow diagram describing the energy and basic material flows of the system. Additional techniques developed by the U.S. Environmental Protection Agency for estimating uncontrolled emissions from chemical processing equipment were then applied to obtain a detailed emission profile for the process. Finally, land use for the process was estimated using a simple sizing model. The methodology was applied to a case study of acetic acid production based on the Cativa(TM) process. The results reveal improvements in the qualitative LCI for acetic acid production compared to commonly used databases and top-down methodologies. The modeling techniques improve the quantitative LCI results for inputs and uncontrolled emissions. With provisions for applying appropriate emission controls, the proposed method can provide an estimate of the LCI that can be used for subsequent life cycle assessments. As part of its mission, the Agency is tasked with overseeing the use of chemicals in commerce. This can include consideration of a chemical's potential impact on health and safety, resource conservation, clean air and climate change, clean water, and sustainable
Vaidya, Anil; Joore, Manuela A; ten Cate-Hoek, Arina J; Kleinegris, Marie-Claire; ten Cate, Hugo; Severens, Johan L
2014-01-01
Lower extremity artery disease (LEAD) is a sign of widespread atherosclerosis, also affecting the coronary, cerebral and renal arteries, and is associated with an increased risk of cardiovascular events. Many economic evaluations have been published for LEAD owing to its clinical, social and economic importance. The aim of this systematic review was to assess the modelling methods used in published economic evaluations in the field of LEAD. Our review appraised and compared the general characteristics, model structure and methodological quality of published models. The electronic databases MEDLINE and EMBASE were searched until February 2013 via the OVID interface. The Cochrane Database of Systematic Reviews, the Health Technology Assessment database hosted by the National Institute for Health Research and the National Health Service Economic Evaluation Database (NHSEED) were also searched. The methodological quality of the included studies was assessed using the Philips checklist. Sixteen model-based economic evaluations were identified and included. Eleven models compared therapeutic health technologies; three models compared diagnostic tests and two models compared a combination of diagnostic and therapeutic options for LEAD. The results of this systematic review revealed an acceptable to low methodological quality of the included studies. Methodological diversity and insufficient information posed a challenge for valid comparison of the included studies. In conclusion, there is a need for transparent, methodologically comparable and scientifically credible model-based economic evaluations in the field of LEAD. Future modelling studies should include clinically and economically important cardiovascular outcomes to reflect the wider impact of LEAD on individual patients and on society.
Evaluation of Database Coverage: A Comparison of Two Methodologies.
ERIC Educational Resources Information Center
Tenopir, Carol
1982-01-01
Describes experiment which compared two techniques used for evaluating and comparing database coverage of a subject area, e.g., "bibliography" and "subject profile." Differences in time, cost, and results achieved are compared by applying techniques to field of volcanology using two databases, Geological Reference File and GeoArchive. Twenty…
Measuring use patterns of online journals and databases
De Groote, Sandra L.; Dorsch, Josephine L.
2003-01-01
Purpose: This research sought to determine use of online biomedical journals and databases and to assess current user characteristics associated with the use of online resources in an academic health sciences center. Setting: The Library of the Health Sciences–Peoria is a regional site of the University of Illinois at Chicago (UIC) Library with 350 print journals, more than 4,000 online journals, and multiple online databases. Methodology: A survey was designed to assess online journal use, print journal use, database use, computer literacy levels, and other library user characteristics. A survey was sent through campus mail to all (471) UIC Peoria faculty, residents, and students. Results: Forty-one percent (188) of the surveys were returned. Ninety-eight percent of the students, faculty, and residents reported having convenient access to a computer connected to the Internet. While 53% of the users indicated they searched MEDLINE at least once a week, other databases showed much lower usage. Overall, 71% of respondents indicated a preference for online over print journals when possible. Conclusions: Users prefer online resources to print, and many choose to access these online resources remotely. Convenience and full-text availability appear to play roles in selecting online resources. The findings of this study suggest that databases without links to full text and online journal collections without links from bibliographic databases will have lower use. These findings have implications for collection development, promotion of library resources, and end-user training. PMID:12883574
Statistical Design Model (SDM) of satellite thermal control subsystem
NASA Astrophysics Data System (ADS)
Mirshams, Mehran; Zabihian, Ehsan; Aarabi Chamalishahi, Mahdi
2016-07-01
Satellite thermal control is the subsystem whose main task is keeping satellite components within their survival and operating temperature ranges. The capability of thermal control plays a key role in satisfying a satellite's operational requirements, and designing this subsystem is part of satellite design. On the other hand, owing to the lack of information provided by companies and designers, this subsystem still does not have a specific design process, even though it is one of the fundamental subsystems. The aim of this paper is to identify and extract statistical design models of the spacecraft thermal control subsystem using the SDM design method, which analyses statistical data with a particular procedure. To implement the SDM method, a complete database is required. Therefore, we first collect spacecraft data and create a database, then extract statistical graphs using Microsoft Excel, from which we derive mathematical models. The input parameters of the method are the mass, mission, and lifetime of the satellite. To this end, the thermal control subsystem is first introduced, and the hardware used in this subsystem and its variants is surveyed. Next, different statistical models are presented and briefly compared. Finally, the paper's particular statistical model is extracted from the collected statistical data. The accuracy of the method is tested and verified through a case study: comparison between the specifications of the thermal control subsystem of a fabricated satellite and the analysis results showed the methodology to be effective. Key words: thermal control subsystem design, statistical design model (SDM), satellite conceptual design, thermal hardware.
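The SDM step of turning statistical graphs into mathematical models amounts to regression over the collected database. A minimal sketch fitting a power-law subsystem-mass model by least squares in log-log space (both the data points and the power-law model form are hypothetical illustrations, not the paper's actual results):

```python
import math

def fit_power_law(xs, ys):
    """Fit y = a * x**b by linear least squares on (log x, log y)."""
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    b = sum((u - mx) * (v - my) for u, v in zip(lx, ly)) / \
        sum((u - mx) ** 2 for u in lx)
    a = math.exp(my - b * mx)
    return a, b

# Hypothetical satellite mass (kg) vs. thermal-subsystem mass (kg)
sat_mass = [100, 250, 500, 1000, 2000]
tcs_mass = [4.1, 8.9, 16.0, 29.5, 55.0]
a, b = fit_power_law(sat_mass, tcs_mass)
```

The fitted curve can then be read back as a design model: given a new satellite's mass, `a * mass**b` gives a first-cut thermal-subsystem mass estimate at the conceptual design stage.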
Discipline and Methodology in Higher Education Research
ERIC Educational Resources Information Center
Tight, Malcolm
2013-01-01
Higher education research is a multidisciplinary field, engaging researchers from across the academy who make use of a wide range of methodological approaches. This article examines the relation between discipline and methodology in higher education research, analysing a database of 567 articles published in 15 leading higher education journals…
Yang, Min; Jiang, Li; Wang, Aihong; Xu, Guihua
2017-02-01
To evaluate the epidemiological characteristics, reporting characteristics, and methodological quality of systematic reviews in the traditional Chinese medicine nursing field published in Chinese journals. The number of systematic reviews in the traditional Chinese medicine nursing field has increased, but their epidemiology, quality, and reporting characteristics have not been assessed completely. We generated an overview of reviews using a narrative approach. Four Chinese databases were searched for systematic reviews from inception to December 2015. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses and the Assessment of Multiple Systematic Reviews checklists were adopted to evaluate reporting and methodological quality, respectively. A total of 73 eligible systematic reviews, published from 2005 to 2015, were included. The deficiencies in reporting characteristics mainly lay in the lack of a structured abstract or protocol and incomplete reporting of search strategies, study selection, and risk of bias. The deficiencies in methodological quality were reflected in the lack of an a priori design and conflict-of-interest statement, incomplete literature searches, and lack of assessment of publication bias. The quality of the evaluated reviews was unsatisfactory; attention should be paid to improving reporting and methodological quality in the conduct of systematic reviews. © 2016 John Wiley & Sons Australia, Ltd.
Shyam, Sangeetha; Wai, Tony Ng Kock; Arshad, Fatimah
2012-01-01
This paper outlines the methodology used to add glycaemic index (GI) and glycaemic load (GL) functionality to DietPLUS, a Microsoft Excel-based Malaysian food composition database and diet intake calculator. Locally determined GI values and published international GI databases were used as the sources of GI values. Previously published methodology for GI value assignment was modified to add GI and GL calculators to the database. Two popular local low-GI foods were added to the DietPLUS database, bringing the total number of foods in the database to 838. Overall, in relation to the 539 major carbohydrate foods in the Malaysian Food Composition Database, 243 (45%) food items had local Malaysian values or were directly matched to the international GI database, and another 180 (33%) of the foods were linked to closely related foods in the GI databases used. The mean ± SD dietary GI and GL of the dietary intake of 63 women with previous gestational diabetes mellitus, calculated using DietPLUS version 3, were 62 ± 6 and 142 ± 45, respectively. These values were comparable to those reported in other local studies. DietPLUS version 3, a simple Microsoft Excel-based programme, aids calculation of dietary GI and GL for Malaysian diets based on food records.
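Dietary GI and GL follow standard definitions: each food's GL is its GI times its available carbohydrate divided by 100, and the diet-level GI is the carbohydrate-weighted mean of the food GIs. A minimal sketch of the arithmetic a calculator like this performs (the foods and values are hypothetical, not DietPLUS's actual tables):

```python
def diet_gi_gl(foods):
    """Compute (dietary GI, dietary GL) from a day's intake.

    foods: list of (gi, available_carbohydrate_g) per food eaten.
    Standard definitions: GL_food = GI * carb / 100;
    dietary GI = sum(GI * carb) / sum(carb)."""
    total_carb = sum(carb for _, carb in foods)
    gl = sum(gi * carb / 100.0 for gi, carb in foods)
    gi = 100.0 * gl / total_carb
    return gi, gl

# Hypothetical day's intake: (GI, available carbohydrate in g)
intake = [(55, 60), (70, 30), (40, 50)]
dietary_gi, dietary_gl = diet_gi_gl(intake)
```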
Minshull, Claire; Gleeson, Nigel
2017-09-01
To evaluate the methodologic quality of resistance training interventions for the management of knee osteoarthritis. A search of the literature for studies published up to August 10, 2015, was performed in MEDLINE (OVID platform), PubMed, Embase, and the Physiotherapy Evidence Database. Search terms associated with osteoarthritis, knee, and muscle resistance exercise were used. Studies were included in the review if they were published in the English language and met the following criteria: (1) muscle resistance training was the primary intervention; (2) randomized controlled trial design; (3) treatment arms included at least a muscle conditioning intervention and a nonexercise group; and (4) participants had osteoarthritis of the knee. Studies using preoperative (joint replacement) interventions with only postoperative outcomes were excluded. The search yielded 1574 results. The inclusion criteria were met by 34 studies. Two reviewers independently screened the articles for eligibility. Critical appraisal of the methodology was assessed according to the principles of resistance training and separately for the reporting of adherence using a specially designed scoring system. A rating for each article was assigned. All 34 studies described a strength training focus of the intervention; however, the principles of resistance training were inconsistently applied and inadequately reported across all. Methods for adherence monitoring were incorporated into the design of 28 of the studies, but only 13 reported sufficient detail to estimate the average dose of exercise. These findings affect the interpretation of the efficacy of muscle resistance exercise in the management of knee osteoarthritis. Clinicians and health care professionals cannot be confident whether nonsignificant findings are because of the lack of efficacy of muscle resistance interventions, or occur through limitations in treatment prescription and patient adherence.
Future research that seeks to evaluate the effects of muscle strength training interventions on symptoms of osteoarthritis should be properly designed and adherence diligently reported. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
Carter, Alexander W; Mandavia, Rishi; Mayer, Erik; Marti, Joachim; Mossialos, Elias; Darzi, Ara
2017-01-01
Introduction: Recent avoidable failures in patient care highlight the ongoing need for evidence to support improvements in patient safety. According to the most recent reviews, there is a dearth of economic evidence related to patient safety. These reviews characterise an evidence gap in terms of the scope and quality of evidence available to support resource allocation decisions. This protocol is designed to update and improve on the reviews previously conducted to determine the extent of methodological progress in economic analyses in patient safety. Methods and analysis: A broad search strategy with two core themes for original research (excluding opinion pieces and systematic reviews) in ‘patient safety’ and ‘economic analyses’ has been developed. Medline, Econlit and National Health Service Economic Evaluation Database bibliographic databases will be searched from January 2007 using a combination of medical subject headings terms and research-derived search terms (see table 1). The method is informed by previous reviews on this topic, published in 2012. Screening, risk of bias assessment (using the Cochrane collaboration tool) and economic evaluation quality assessment (using the Drummond checklist) will be conducted by two independent reviewers, with arbitration by a third reviewer as needed. Studies with a low risk of bias will be assessed using the Drummond checklist. High-quality economic evaluations are those that score >20/35. A qualitative synthesis of evidence will be performed using a data collection tool to capture the study design(s) employed, population(s), setting(s), disease area(s), intervention(s) and outcome(s) studied. Methodological quality scores will be compared with previous reviews where possible. Effect size(s) and estimate uncertainty will be captured and used in a quantitative synthesis of high-quality evidence, where possible. Ethics and dissemination: Formal ethical approval is not required as primary data will not be collected.
The results will be disseminated through a peer-reviewed publication, presentations and social media. Trial registration number CRD42017057853. PMID:28821527
Zhan, Jie; Pan, Ruihuan; Zhou, Mingchao; Tan, Feng; Huang, Zhen; Dong, Jing; Wen, Zehuai
2018-01-01
Objectives: To assess the effectiveness and safety of electroacupuncture (EA) combined with rehabilitation therapy (RT) and/or conventional drugs (CD) for improving poststroke motor dysfunction (PSMD). Design: Systematic review and meta-analysis. Methods: The China National Knowledge Infrastructure, Chinese Biological Medicine Database, Chinese Scientific Journal Database, Cochrane Library, Medline and Embase were electronically searched from inception to December 2016. The methodological quality of the included trials was assessed using the Cochrane risk of bias assessment tool. Statistical analyses were performed by RevMan V.5.3 and Stata SE V.11.0. Results: Nineteen trials with 1434 participants were included for qualitative synthesis and meta-analysis. The methodological quality of the included trials was generally poor. The meta-analysis indicated that the EA group might benefit more than the non-EA group in terms of the changes in the Fugl-Meyer Assessment Scale (FMA) (weighted mean difference (WMD): 10.79, 95% CI 6.39 to 15.20, P<0.001), FMA for lower extremity (WMD: 5.16, 95% CI 3.78 to 6.54, P<0.001) and activities of daily living (standardised mean difference: 1.37, 95% CI 0.79 to 1.96, P<0.001). However, there was no difference between the EA and non-EA groups in terms of the effective rate (relative risk: 1.13, 95% CI 1.00 to 1.27, P=0.050). Moreover, no side effects of EA combined with RT and/or CD were reported in the included trials. Conclusions: This review provides new evidence for the effectiveness and safety of EA combined with RT and/or CD for PSMD. However, the results should be interpreted cautiously because of methodological weakness and publication bias. Further clinical trials with a rigorous design and large sample sizes are warranted. PROSPERO registration number CRD42016037597. PMID:29371267
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fraile-Garcia, Esteban, E-mail: esteban.fraile@unirioja.es; Ferreiro-Cabello, Javier, E-mail: javier.ferreiro@unirioja.es; Qualiberica S.L.
The European Committee for Standardization (CEN), through its Technical Committee CEN/TC-350, is developing a series of standards for assessing building sustainability at both the product and building levels. The practical application of the selection (decision making) of structural alternatives made with one-way slabs leads to an intermediate level between the product and the building. The present study therefore addresses this decision-making problem, following the CEN guidelines and incorporating relevant aspects of architectural design into residential construction. A life cycle assessment (LCA) is developed in order to obtain valid information for the decision-making process (the LCA applied the CML methodology, although Ecoindicator99 was used to facilitate comparison of the values); this information (the carbon footprint values) is contrasted with other databases and with the information from the Environmental Product Declaration (EPD) of one of the lightening materials (expanded polystyrene) in order to validate the results. Solutions with different column dispositions and geometries are evaluated against the three pillars of sustainable residential construction: social, economic and environmental. The quantitative analysis of the variables used in this study enables an objective comparison at the design stage by the responsible technician; applying the proposed methodology reduces the possible solutions to be evaluated by the expert to 12.22% of the options for low values of the column index and to 26.67% for the highest values. - Highlights: • Methodology for selection of structural alternatives in buildings with one-way slabs • Adapted to CEN guidelines (CEN/TC-350) for assessing building sustainability • LCA developed to obtain valid information for the decision-making process • Results validated by comparing carbon footprints, databases and Environmental Product Declarations • The proposal reduces the solutions to be evaluated to between 12.22% and 26.67%
Scheduled Civil Aircraft Emission Inventories for 1999: Database Development and Analysis
NASA Technical Reports Server (NTRS)
Sutkus, Donald J., Jr.; Baughcum, Steven L.; DuBois, Douglas P.
2001-01-01
This report describes the development of a three-dimensional database of aircraft fuel burn and emissions (NO(x), CO, and hydrocarbons) for the scheduled commercial aircraft fleet for each month of 1999. Global totals of emissions and fuel burn for 1999 are compared to global totals from 1992 and 2015 databases. 1999 fuel burn, departure and distance totals for selected airlines are compared to data reported on DOT Form 41 to evaluate the accuracy of the calculations. DOT Form T-100 data were used to determine typical payloads for freighter aircraft and this information was used to model freighter aircraft more accurately by using more realistic payloads. Differences in the calculation methodology used to create the 1999 fuel burn and emissions database from the methodology used in previous work are described and evaluated.
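Gridded emissions in inventories of this kind are typically computed as fuel burn multiplied by a species emission index (grams of pollutant per kilogram of fuel). A minimal sketch under that assumption, using hypothetical index values rather than the report's:

```python
# Hypothetical emission indices, in g of pollutant per kg of fuel burned
EI = {"NOx": 13.2, "CO": 3.1, "HC": 0.4}

def grid_emissions(fuel_burn_kg):
    """Map a {cell: fuel burn in kg} grid to {cell: {species: kg emitted}}.

    Each species' emission is fuel burn * emission index / 1000."""
    return {
        cell: {species: burn * ei / 1000.0 for species, ei in EI.items()}
        for cell, burn in fuel_burn_kg.items()
    }

# Cells keyed by (latitude band, longitude band, altitude band) - hypothetical
grid = {(40, -75, 9): 1.0e6, (51, 0, 10): 2.5e5}
emissions = grid_emissions(grid)
```

Summing the per-cell results over the grid then yields the global totals that the report compares across the 1992, 1999 and 2015 databases.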
Qiao, Yuanhua; Keren, Nir; Mannan, M Sam
2009-08-15
Risk assessment and management of transportation of hazardous materials (HazMat) require the estimation of accident frequency. This paper presents a methodology to estimate hazardous materials transportation accident frequency by utilizing publicly available databases and expert knowledge. The estimation process addresses route-dependent and route-independent variables. Negative binomial regression is applied to an analysis of the Department of Public Safety (DPS) accident database to derive basic accident frequency as a function of route-dependent variables, while the effects of route-independent variables are modeled by fuzzy logic. The integrated methodology provides the basis for an overall transportation risk analysis, which can be used later to develop a decision support system.
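The two-stage structure described (a regression-derived base frequency, then a fuzzy-logic adjustment for route-independent variables) can be sketched with triangular membership functions. The membership parameters, rule factors, and the choice of visibility as the route-independent variable below are hypothetical illustrations, not the paper's calibrated values:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def adjusted_frequency(base_freq, visibility_km):
    """Scale a regression-derived base accident frequency by a fuzzy
    factor for a route-independent variable (here: visibility)."""
    # Hypothetical rule base: poor visibility raises frequency,
    # good visibility leaves it unchanged.
    mu_poor = tri(visibility_km, -1.0, 0.0, 5.0)
    mu_good = tri(visibility_km, 2.0, 10.0, 21.0)
    factors = {"poor": 1.8, "good": 1.0}
    total = mu_poor + mu_good
    if total == 0.0:
        return base_freq  # no rule fires; keep the base frequency
    # Weighted-average defuzzification of the rule outputs
    factor = (mu_poor * factors["poor"] + mu_good * factors["good"]) / total
    return base_freq * factor

f = adjusted_frequency(base_freq=0.02, visibility_km=10.0)
```

In this sketch the negative-binomial regression supplies `base_freq` from the route-dependent variables, and the fuzzy layer nudges it up or down for conditions the accident database does not record route by route.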
Post-OPC verification using a full-chip pattern-based simulation verification method
NASA Astrophysics Data System (ADS)
Hung, Chi-Yuan; Wang, Ching-Heng; Ma, Cliff; Zhang, Gary
2005-11-01
In this paper, we evaluated and investigated techniques for performing fast full-chip post-OPC verification using a commercial product platform. A number of databases from several technology nodes (0.13um, 0.11um and 90nm) were used in the investigation. Although it has been proven that our OPC technology is robust in general for most cases, given the variety of tape-outs with complicated design styles and technologies it is difficult to develop a "complete" or "bullet-proof" OPC algorithm that covers every possible layout pattern. In the evaluation, among dozens of databases, errors were found in some OPC databases by model-based post-OPC checking; such errors can be costly in manufacturing - reticle, wafer process and, more importantly, production delay. From this full-chip OPC database verification we have learned that optimizing OPC models and recipes on a limited set of test-chip designs may not provide sufficient coverage across the range of designs to be produced in the process, and fatal errors (such as pinching or bridging), poor CD distribution and process-sensitive patterns may still occur. As a result, more than one reticle tape-out cycle is not uncommon to prove models and recipes that approach the center of process for a range of designs. We therefore describe a full-chip pattern-based simulation verification flow that serves both OPC model and recipe development and post-OPC verification after production release of the OPC. Lastly, we discuss the differences between the new pattern-based and conventional edge-based verification tools and summarize the advantages of our new tool and methodology: (1) accuracy: superior inspection algorithms, down to 1nm accuracy, with the new pattern-based approach; (2) high-speed performance: pattern-centric algorithms giving the best full-chip inspection efficiency; (3) powerful analysis capability: flexible error distribution, grouping, interactive viewing and hierarchical pattern extraction to narrow down to unique patterns/cells.
Mapping the literature of nurse practitioners.
Shams, Marie-Lise Antoun
2006-04-01
This study was designed to identify core journals for the nurse practitioner specialty and to determine the extent of their indexing in bibliographic databases. As part of a larger project for mapping the literature of nursing, this study followed a common methodology based on citation analysis. Four journals designated by nurse practitioners as sources for their practice information were selected. All cited references were analyzed to determine format types and publication years. Bradford's Law of Scattering was applied to identify core journals. Nine bibliographic databases were searched to estimate the index coverage of the core titles. The findings indicate that nurse practitioners rely primarily on journals (72.0%) followed by books (20.4%) for their professional knowledge. The majority of the identified core journals belong to non-nursing disciplines. This is reflected in the indexing coverage results: PubMed/MEDLINE more comprehensively indexes the core titles than CINAHL does. Nurse practitioners, as primary care providers, consult medical as well as nursing sources for their information. The implications of the citation analysis findings are significant for collection development librarians and indexing services.
DOE Office of Scientific and Technical Information (OSTI.GOV)
MANDELL, JOHN F.; SAMBORSKY, DANIEL D.; CAIRNS, DOUGLAS
This report presents the major findings of the Montana State University Composite Materials Fatigue Program from 1997 to 2001, and is intended to be used in conjunction with the DOE/MSU Composite Materials Fatigue Database. Additions of greatest interest to the database in this time period include environmental and time-under-load effects for various resin systems; large-tow carbon fiber laminates and glass/carbon hybrids; new reinforcement architectures varying from large strands to prepreg with well-dispersed fibers; spectrum loading and cumulative damage laws; giga-cycle testing of strands; tough resins for improved structural integrity; static and fatigue data for interply delamination; and design knockdown factors due to flaws and structural details as well as time under load and environmental conditions. The origins of a transition to increased tensile fatigue sensitivity with increasing fiber content are explored in detail for typical stranded reinforcing fabrics. The second focus of the report is on structural details which are prone to delamination failure, including ply terminations, skin-stiffener intersections, and sandwich panel terminations. Finite element based methodologies for predicting delamination initiation and growth in structural details are developed and validated, and simplified design recommendations are presented.
European Healthy Cities evaluation: conceptual framework and methodology.
de Leeuw, Evelyne; Green, Geoff; Dyakova, Mariana; Spanswick, Lucy; Palmer, Nicola
2015-06-01
This paper presents the methodology, programme logic and conceptual framework that drove the evaluation of the Fifth Phase of the WHO European Healthy Cities Network, in which 99 cities were designated progressively through the life of the phase (2009-14). The paper establishes the values, systems and aspirations that these cities sign up to, as foundations for the selection of methodology. We assert that a realist synthesis methodology, driven by a wide range of qualitative and quantitative methods, is the most appropriate perspective from which to address the wide geopolitical, demographic, population and health diversities of these cities. The paper outlines the rationale for a structured multiple case study approach, the deployment of a comprehensive questionnaire, data mining of existing databases including Eurostat, and analysis of the management information generation tools used throughout the period. Response rates were extremely high for this type of research. Non-response analyses are described, which show that the data are representative of cities across the spectrum of diversity. This paper provides a foundation for the further analyses of specific areas of interest presented in this supplement.
Food Patterns Equivalents Database 2011-12: Methodology and User Guide
USDA-ARS?s Scientific Manuscript database
The purpose of developing the Food Patterns Equivalents Database (FPED) 2011-12 is to convert the 8,251 foods in the Food and Nutrients Database for Dietary Studies (FNDDS) 2011-12 used for the What We Eat in America, National Health and Nutrition Examination Survey (WWEIA, NHANES) 2011-12 to 37 USD...
Food Patterns Equivalents Database 2005-2006: Methodology and User Guide
USDA-ARS?s Scientific Manuscript database
The purpose of developing the Food Patterns Equivalents Database (FPED) 2005-2006 is to convert the 7,000+ foods in the Food and Nutrients Database for Dietary Studies (FNDDS) 3.0 used for the What We Eat in America, National Health and Nutrition Examination Survey (WWEIA, NHANES) 2005-2006, to USDA...
Food Patterns Equivalents Database 2013-14: Methodology and User Guide
USDA-ARS?s Scientific Manuscript database
The purpose of developing the Food Patterns Equivalents Database (FPED) 2013-14 is to convert the 8,536 foods in the Food and Nutrients Database for Dietary Studies (FNDDS) 2013-14 used for the What We Eat in America, National Health and Nutrition Examination Survey (WWEIA, NHANES) 2013-14 to the 37...
NASA Technical Reports Server (NTRS)
Rasmussen, Robert; Bennett, Matthew
2006-01-01
The State Analysis Database Tool software establishes a productive environment for collaboration among software and system engineers engaged in the development of complex interacting systems. The tool embodies State Analysis, a model-based system engineering methodology founded on a state-based control architecture (see figure). A state represents a momentary condition of an evolving system, and a model may describe how a state evolves and is affected by other states. The State Analysis methodology is a process for capturing system and software requirements in the form of explicit models and states, and defining goal-based operational plans consistent with the models. Requirements, models, and operational concerns have traditionally been documented in a variety of system engineering artifacts that address different aspects of a mission's lifecycle. In State Analysis, requirements, models, and operations information are captured as consistent State Analysis artifacts stored in a State Analysis Database. The tool includes a back-end database, a multi-platform front-end client, and Web-based administrative functions. The tool is structured to prompt an engineer to follow the State Analysis methodology, to encourage state discovery and model description, and to make software requirements and operations plans consistent with model descriptions.
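The state/model bookkeeping described above lends itself to a compact sketch. The classes and state names below are invented for illustration and are not the tool's actual schema; the point is only that models reference states and the database keeps the two consistent.

```python
from dataclasses import dataclass, field

@dataclass
class State:
    """A momentary condition of an evolving system (e.g. battery charge)."""
    name: str

@dataclass
class Model:
    """Describes how one state evolves and which other states affect it."""
    target: State
    affected_by: list = field(default_factory=list)
    description: str = ""

@dataclass
class StateAnalysisDB:
    """Toy stand-in for the back-end database: states and models kept consistent."""
    states: dict = field(default_factory=dict)
    models: list = field(default_factory=list)

    def add_state(self, name):
        self.states[name] = State(name)
        return self.states[name]

    def add_model(self, target, affected_by, description=""):
        # Every state a model references must already be captured, mirroring
        # the tool's emphasis on consistency between artifacts.
        assert target in self.states and all(s in self.states for s in affected_by)
        m = Model(self.states[target], [self.states[s] for s in affected_by], description)
        self.models.append(m)
        return m

db = StateAnalysisDB()
for s in ("battery_charge", "solar_input", "load_current"):
    db.add_state(s)
db.add_model("battery_charge", ["solar_input", "load_current"],
             "charge evolves with solar input minus load draw")
print(len(db.models))  # 1
```

In the real tool this consistency check is what lets requirements and operations plans be traced back to explicit model descriptions.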
A Safety Index and Method for Flightdeck Evaluation
NASA Technical Reports Server (NTRS)
Latorella, Kara A.
2000-01-01
If our goal is to improve safety through machine, interface, and training design, then we must define a metric of flightdeck safety that is usable in the design process. Current measures associated with our notions of "good" pilot performance and ultimate safety of flightdeck performance fail to provide an adequate index of safe flightdeck performance for design evaluation purposes. The goal of this research effort is to devise a safety index and method that allows us to evaluate flightdeck performance holistically and in a naturalistic experiment. This paper uses Reason's model of accident causation (1990) as a basis for measuring safety, and proposes a relational database system and method for 1) defining a safety index of flightdeck performance, and 2) evaluating the "safety" afforded by flightdeck performance for the purpose of design iteration. Methodological considerations, limitations, and benefits are discussed as well as extensions to this work.
NASA Aeronautics and Space Database for bibliometric analysis
NASA Technical Reports Server (NTRS)
Powers, R.; Rudman, R.
2004-01-01
The authors use the NASA Aeronautics and Space Database to perform bibliometric analysis of citations. This paper explains their research methodology and gives some sample results showing collaboration trends between NASA Centers and other institutions.
A Data Preparation Methodology in Data Mining Applied to Mortality Population Databases.
Pérez, Joaquín; Iturbide, Emmanuel; Olivares, Víctor; Hidalgo, Miguel; Martínez, Alicia; Almanza, Nelva
2015-11-01
The data preparation phase is known to be the most time-consuming part of the data mining process, taking up 50-70% of total project time. Current data mining methodologies are general-purpose, and one of their limitations is that they provide no guidance about which particular tasks to carry out in a specific domain. This paper presents a new data preparation methodology oriented to the epidemiological domain, in which we have identified two sets of tasks: General Data Preparation and Specific Data Preparation. For both sets, the Cross-Industry Standard Process for Data Mining (CRISP-DM) is adopted as a guideline. The main contribution of our methodology is fourteen specialized tasks for this domain. To validate the proposed methodology, we developed a data mining system and applied the entire process to real mortality databases. The results were encouraging: the methodology reduced some of the time-consuming tasks, and the data mining system revealed previously unknown and potentially useful patterns for the public health services in Mexico.
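The general data-preparation tasks the methodology organizes can be illustrated on a toy mortality table. A minimal sketch, assuming pandas; the column names and the sentinel age code are invented for illustration, not the schema of the Mexican mortality databases used in the study:

```python
import numpy as np
import pandas as pd

# Toy mortality records; columns are illustrative only.
raw = pd.DataFrame({
    "cause_icd10": ["J18", "J18", None, "E11", "I21"],
    "age":         [67,    67,   45,    999,   71],   # 999 = unknown-age code
    "sex":         ["M",   "M",  "F",   "F",   "M"],
})

def prepare(df):
    """General data-preparation tasks in the CRISP-DM spirit: drop exact
    duplicates, drop rows with no cause of death, and recode the sentinel
    age value as missing."""
    df = df.drop_duplicates()
    df = df.dropna(subset=["cause_icd10"])
    df.loc[df["age"] == 999, "age"] = np.nan
    return df.reset_index(drop=True)

clean = prepare(raw)
print(len(clean))  # 3 rows survive
```

Domain-specific tasks (ICD-code grouping, age-band construction, and the like) would follow the same pattern as further functions in the pipeline.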
Chicago Area Transportation Study (CATS): Methodological Overview
DOT National Transportation Integrated Search
1994-04-01
This report contains a methodological discussion of the Chicago Area Transportation Study (CATS) 1990 Household Travel Survey. It was prepared to assist those who are working with the Household Travel Survey database. This report concentrates o...
NASA Technical Reports Server (NTRS)
Freeman, W.; Ilcewicz, L.; Swanson, G.; Gutowski, T.
1992-01-01
The Structures Technology Program Office (STPO) at NASA LaRC has initiated development of a conceptual and preliminary designers' cost prediction model. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. This paper presents the team members, approach, goals, plans, and progress to date for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).
Lovell, Karina; Bee, Penny
2011-12-01
Obsessive-compulsive disorder (OCD) is a chronic and disabling mental health problem. Only a minority of people receive evidence-based psychological treatments, and this deficit has prompted an increasing focus on delivering cognitive behaviour therapy (CBT) in new and innovative ways. The aim was to conduct a scoping review of the published evidence base for CBT-based interventions incorporating a health technology in the treatment of OCD. The questions posed by the review were: (a) are technology-assisted treatments clinically effective, (b) are patient outcomes durable, and (c) are more innovative services deemed acceptable by the individuals who engage in them? The scoping review covered published studies of any design examining CBT interventions incorporating a health technology for OCD. Electronic databases searched included MEDLINE (1966-2010), PsycInfo (1967-2010), EMBASE (1980-2010) and CINAHL (1982-2010). Thirteen studies were identified; of these, five used bibliotherapy, five examined computerised CBT (cCBT), two investigated telephone-delivered CBT, and one evaluated video conferencing. Overall, the studies were small and methodologically flawed, which precludes definitive conclusions about clinical effectiveness, durability or stakeholder satisfaction. To date, the evidence base for technology-enhanced OCD treatments has undergone limited development. Future research should seek to overcome the methodological shortcomings of published work by conducting large-scale trials that incorporate clinical, cost and acceptability outcomes.
Retting, Richard A.; Ferguson, Susan A.; McCartt, Anne T.
2003-01-01
We provide a brief critical review and assessment of engineering modifications to the built environment that can reduce the risk of pedestrian injuries. In our review, we used the Transportation Research Information Services database to conduct a search for studies on engineering countermeasures documented in the scientific literature. We classified countermeasures into 3 categories—speed control, separation of pedestrians from vehicles, and measures that increase the visibility and conspicuity of pedestrians. We determined the measures and settings with the greatest potential for crash prevention. Our review, which emphasized inclusion of studies with adequate methodological designs, showed that modification of the built environment can substantially reduce the risk of pedestrian–vehicle crashes. PMID:12948963
Development of the Orion Crew Module Static Aerodynamic Database. Part 1; Hypersonic
NASA Technical Reports Server (NTRS)
Bibb, Karen L.; Walker, Eric L.; Robinson, Philip E.
2011-01-01
The Orion aerodynamic database provides force and moment coefficients given the velocity, attitude, configuration, etc. of the Crew Exploration Vehicle (CEV). The database is developed and maintained by the NASA CEV Aerosciences Project team from computational and experimental aerodynamic simulations. It is used primarily by the Guidance, Navigation, and Control (GNC) team to design vehicle trajectories and assess flight performance. The initial hypersonic re-entry portion of the Crew Module (CM) database was developed in 2006; updates incorporating additional data and improvements to the database formulation and uncertainty methodologies have been made since then. This paper details the process used to develop the CM database, including nominal values and uncertainties, for Mach numbers greater than 8 and angles of attack between 140° and 180°. The primary available data are more than 1000 viscous, reacting-gas-chemistry computational simulations using both the LAURA and DPLR codes, over a range of Mach numbers from 2 to 37 and angles of attack from 147° to 172°. Uncertainties were based on grid convergence, laminar-turbulent solution variations, combined altitude and code-to-code variations, and expected heatshield asymmetry. A radial basis function response surface tool, NEAR-RS, was used to fit the coefficient data smoothly in velocity-angle-of-attack space. The resulting database is presented, along with some data comparisons and a discussion of the predicted variation of trim angle of attack and lift-to-drag ratio. The database exhibits a variation in trim angle of attack on the order of ±2° and a range in lift-to-drag ratio of ±0.035 for typical vehicle flight conditions.
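A radial-basis-function response surface of the kind described here can be sketched with SciPy's generic `RBFInterpolator`, as a stand-in for NEAR-RS; the coefficient data below are synthetic and the functional form is invented for illustration, not taken from the CM database:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)

# Synthetic stand-in for CFD solutions: an axial-force-like coefficient
# sampled at scattered (Mach, angle-of-attack) points in the paper's range.
pts = np.column_stack([rng.uniform(8, 37, 200),      # Mach number
                       rng.uniform(147, 172, 200)])  # angle of attack, deg
ca = 1.2 + 0.01 * (pts[:, 1] - 160) - 0.002 * pts[:, 0]

# Radial-basis-function response surface over velocity/angle-of-attack
# space, analogous in spirit to the paper's NEAR-RS fit.
surface = RBFInterpolator(pts, ca, kernel="thin_plate_spline")

query = np.array([[20.0, 160.0]])
print(float(surface(query)[0]))  # close to 1.2 - 0.04 = 1.16
```

Because the thin-plate-spline kernel carries a degree-one polynomial tail, it reproduces this linear test function essentially exactly; real coefficient data would of course be nonlinear, which is precisely what the RBF terms capture.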
Methodological quality of systematic reviews addressing femoroacetabular impingement.
Kowalczuk, Marcin; Adamich, John; Simunovic, Nicole; Farrokhyar, Forough; Ayeni, Olufemi R
2015-09-01
As the body of literature on femoroacetabular impingement (FAI) continues to grow, clinicians turn to systematic reviews to remain current with the best available evidence. The quality of systematic reviews in the FAI literature is currently unknown. The goal of this study was to assess the quality of the reporting of systematic reviews addressing FAI over the last 11 years (2003-2014) and to identify the specific methodological shortcomings and strengths. A search of the electronic databases, MEDLINE, EMBASE and PubMed, was performed to identify relevant systematic reviews. Methodological quality was assessed by two reviewers using the revised assessment of multiple systematic reviews (R-AMSTAR) scoring tool. An intraclass correlation coefficient (ICC) with 95 % confidence intervals (CI) was used to determine agreement between reviewers on R-AMSTAR quality scores. A total of 22 systematic reviews were assessed for methodological quality. The mean consensus R-AMSTAR score across all studies was 26.7 out of 40.0, indicating fair methodological quality. An ICC of 0.931, 95 % CI 0.843-0.971 indicated excellent agreement between reviewers during the scoring process. The systematic reviews addressing FAI are generally of fair methodological quality. Use of tools such as the R-AMSTAR score or PRISMA guidelines while designing future systematic reviews can assist in eliminating methodological shortcomings identified in this review. These shortcomings need to be kept in mind by clinicians when applying the current literature to their patient populations and making treatment decisions. Systematic reviews of highest methodological quality should be used by clinicians when possible to answer clinical questions.
Integration of air traffic databases : a case study
DOT National Transportation Integrated Search
1995-03-01
This report describes a case study to show the benefits from maximum utilization of existing air traffic databases. The study demonstrates the utility of integrating available data through developing and demonstrating a methodology addressing the iss...
Global Statistics of Bolides in the Terrestrial Atmosphere
NASA Astrophysics Data System (ADS)
Chernogor, L. F.; Shevelyov, M. B.
2017-06-01
Purpose: Evaluation and analysis of the distribution of the number of meteoroid (mini-asteroid) falls as a function of glow energy, velocity, altitude of maximum glow, and geographic coordinates. Design/methodology/approach: The satellite database on the glow of 693 mini-asteroids decelerated in the terrestrial atmosphere has been used to evaluate basic meteoroid statistics. Findings: A rapid decrease in the number of asteroids with increasing glow energy is confirmed. The average speed of the celestial bodies is about 17.9 km/s. The altitude of maximum glow most often lies in the 30-40 km range. The distribution of the number of meteoroids entering the terrestrial atmosphere in longitude and latitude (after excluding the geometric component of the latitudinal dependence) is approximately uniform. Conclusions: Using a sufficiently large database of measurements, the meteoroid (mini-asteroid) statistics have been evaluated.
SNPs selection using support vector regression and genetic algorithms in GWAS
2014-01-01
Introduction: This paper proposes a new methodology for simultaneously selecting the SNP markers most relevant to the characterization of any measurable phenotype described by a continuous variable, using Support Vector Regression with the Pearson Universal kernel (PUK) as the fitness function of a binary genetic algorithm. The proposed methodology is multi-attribute, considering several markers simultaneously to explain the phenotype, and rests jointly on statistical tools, machine learning and computational intelligence. Results: The suggested method showed potential on simulated database 1, with additive effects only, and on a real database. In simulated database 1, with a total of 1,000 markers, 7 with a major effect on the phenotype and the other 993 SNPs representing noise, the method identified 21 markers; of these, 5 are among the 7 relevant SNPs, while 16 are false positives. In the real database, initially of 50,752 SNPs, we reduced the set to 3,073 markers while increasing the accuracy of the model. In simulated database 2, with additive effects and interactions (epistasis), the proposed method matched the methodology most commonly used in GWAS. Conclusions: The method suggested in this paper demonstrates its effectiveness in explaining the real phenotype (PTA for milk): applying the wrapper based on a genetic algorithm and Support Vector Regression with the Pearson Universal kernel eliminated many redundant markers, increasing the prediction accuracy of the model on the real database without quality-control filters. The PUK was shown to replicate the performance of linear and RBF kernels. PMID:25573332
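The wrapper described above, a binary genetic algorithm whose fitness is a cross-validated SVR score, can be sketched as follows. scikit-learn has no Pearson Universal kernel, so an RBF kernel stands in, and the genotype data are synthetic and toy-sized relative to the paper's databases:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Synthetic genotype matrix: 120 animals x 30 SNPs coded 0/1/2; only the
# first 3 SNPs carry signal, the rest are noise.
X = rng.integers(0, 3, size=(120, 30)).astype(float)
y = X[:, 0] + 0.8 * X[:, 1] - 0.5 * X[:, 2] + rng.normal(0, 0.3, 120)

def fitness(mask):
    """Fitness of a SNP subset: cross-validated R^2 of an SVR trained on it.
    (RBF kernel substituted for the paper's PUK, which sklearn lacks.)"""
    if not mask.any():
        return -np.inf
    return cross_val_score(SVR(kernel="rbf"), X[:, mask], y, cv=3).mean()

# Minimal binary GA: truncation selection, uniform crossover, bit-flip mutation.
pop = rng.random((20, X.shape[1])) < 0.3
for _ in range(15):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-10:]]           # keep the best half
    cross = rng.random((10, X.shape[1])) < 0.5        # uniform crossover
    children = np.where(cross, parents[rng.integers(0, 10, 10)],
                        parents[rng.integers(0, 10, 10)])
    children ^= rng.random(children.shape) < 0.02     # bit-flip mutation
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(m) for m in pop])]
print(np.flatnonzero(best))  # indices of the selected SNP markers
```

The wrapper character is the key point: markers are scored jointly, through the regression model, rather than one at a time as in single-marker GWAS tests.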
Assessment of CFD-based Response Surface Model for Ares I Supersonic Ascent Aerodynamics
NASA Technical Reports Server (NTRS)
Hanke, Jeremy L.
2011-01-01
The Ascent Force and Moment Aerodynamic (AFMA) Databases (DBs) for the Ares I Crew Launch Vehicle (CLV) were typically based on wind tunnel (WT) data, with increments provided by computational fluid dynamics (CFD) simulations for aspects of the vehicle that could not be tested in the WT tests. During the Design Analysis Cycle 3 analysis for the outer mold line (OML) geometry designated A106, a major tunnel mishap delayed the WT test for supersonic Mach numbers (M) greater than 1.6 in the Unitary Plan Wind Tunnel at NASA Langley Research Center, and the test delay pushed the final delivery of the A106 AFMA DB back by several months. The aero team developed an interim database based entirely on the already completed CFD simulations to mitigate the impact of the delay. This CFD-based database used a response surface methodology based on radial basis functions to predict the aerodynamic coefficients for M > 1.6 based on only the CFD data from both WT and flight Reynolds number conditions. The aero team used extensive knowledge of the previous AFMA DB for the A103 OML to guide the development of the CFD-based A106 AFMA DB. This report details the development of the CFD-based A106 Supersonic AFMA DB, constructs a prediction of the database uncertainty using data available at the time of development, and assesses the overall quality of the CFD-based DB both qualitatively and quantitatively. This assessment confirms that a reasonable aerodynamic database can be constructed for launch vehicles at supersonic conditions using only CFD data if sufficient knowledge of the physics and expected behavior is available. This report also demonstrates the applicability of non-parametric response surface modeling using radial basis functions for development of aerodynamic databases that exhibit both linear and non-linear behavior throughout a large data space.
Quality and methodological challenges in Internet-based mental health trials.
Ye, Xibiao; Bapuji, Sunita Bayyavarapu; Winters, Shannon; Metge, Colleen; Raynard, Mellissa
2014-08-01
To review the quality of Internet-based mental health intervention studies and their methodological challenges. We searched multiple literature databases to identify relevant studies according to the Population, Interventions, Comparators, Outcomes, and Study Design framework. Two reviewers independently assessed selection bias, allocation bias, confounding bias, blinding, data collection methods, and withdrawals/dropouts, using the Quality Assessment Tool for Quantitative Studies. We rated each component as strong, moderate, or weak and assigned a global rating (strong, moderate, or weak) to each study. We discussed methodological issues related to the study quality. Of 122 studies included, 31 (25%), 44 (36%), and 47 (39%) were rated strong, moderate, and weak, respectively. Only five studies were rated strong for all of the six quality components (three of them were published by the same group). Lack of blinding, selection bias, and low adherence were the top three challenges in Internet-based mental health intervention studies. The overall quality of Internet-based mental health intervention needs to improve. In particular, studies need to improve sample selection, intervention allocation, and blinding.
Challenges in evaluating cancer as a clinical outcome in postapproval studies of drug safety
Pinheiro, Simone P.; Rivera, Donna R.; Graham, David J.; Freedman, Andrew N.; Major, Jacqueline M.; Penberthy, Lynne; Levenson, Mark; Bradley, Marie C.; Wong, Hui-Lee; Ouellet-Hellstrom, Rita
2017-01-01
Pharmaceuticals approved in the United States are largely not known human carcinogens. However, cancer signals associated with pharmaceuticals may be hypothesized or arise after product approval. There are many study designs that can be used to evaluate cancer as an outcome in the postapproval setting. Because prospective systematic collection of cancer outcomes from a large number of individuals may be lengthy, expensive, and challenging, leveraging data from large existing databases are an integral approach. Such studies have the capability to evaluate the clinical experience of a large number of individuals, yet there are unique methodological challenges involved in their use to evaluate cancer outcomes. To discuss methodological challenges and potential solutions, the Food and Drug Administration and the National Cancer Institute convened a two-day public meeting in 2014. This commentary summarizes the most salient issues discussed at the meeting. PMID:27663208
Integration, warehousing, and analysis strategies of Omics data.
Gedela, Srinubabu
2011-01-01
"-Omics" is a current suffix for numerous types of large-scale biological data generation procedures, which naturally demand the development of novel algorithms for data storage and analysis. With next-generation genome sequencing burgeoning, it is pivotal to identify coding sites on the genome and to decipher a gene's function and transcript information, beyond the mere availability of sequence information. To explore a genome and its downstream molecular processes, we need results at the various levels of cellular organization, obtained through different experimental designs, data analysis strategies and methodologies. Hence the need for controlled vocabularies and data integration to annotate, store, and update the flow of experimental data. This chapter explores key methodologies for merging Omics data via semantic data carriers, discusses controlled vocabularies expressed as eXtensible Markup Language (XML), and provides practical guidance, databases, and software links supporting the integration of Omics data.
Oropharyngeal dysphagia in myotonic dystrophy type 1: a systematic review.
Pilz, Walmari; Baijens, Laura W J; Kremer, Bernd
2014-06-01
A systematic review was conducted to investigate the pathophysiology of and diagnostic procedures for oropharyngeal dysphagia in myotonic dystrophy (MD). The electronic databases Embase, PubMed, and The Cochrane Library were used. The search was limited to English, Dutch, French, German, Spanish, and Portuguese publications. Sixteen studies met the inclusion criteria. Two independent reviewers assessed the methodological quality of the included articles. Swallowing assessment tools, the corresponding protocols, the studies' outcome measurements, and main findings are summarized and presented. The body of literature on pathophysiology of swallowing in dysphagic patients with MD type 1 remains scant. The included studies are heterogeneous with respect to design and outcome measures and hence are not directly comparable. More importantly, most studies had methodological problems. These are discussed in detail and recommendations for further research on diagnostic examinations for swallowing disorders in patients with MD type 1 are provided.
Analysis of Landslide Hazard Impact Using the Landslide Database for Germany
NASA Astrophysics Data System (ADS)
Klose, M.; Damm, B.
2014-12-01
The Federal Republic of Germany has long been among the few European countries lacking a national landslide database. Systematic collection and inventorying of landslide data has a considerable research history in Germany, but one focused on the development of databases with only local or regional coverage. This has changed in recent years with the launch of a database initiative aimed at closing the data gap at the national level. The present contribution reports on this project, which builds on a landslide database that has evolved over the last 15 years into one covering large parts of Germany. A strategy of systematic retrieval, extraction, and fusion of landslide data is at the heart of the methodology, providing the basis for a database with broad potential for application. The database offers a data pool of more than 4,200 landslide data sets with over 13,000 individual data files, and dates back to the 12th century. All types of landslides are covered, and the database stores not only core attributes but also various complementary data, including data on landslide causes, impacts, and mitigation. The ongoing database migration to PostgreSQL/PostGIS is focused on unlocking the full scientific potential of the database while enabling data sharing and knowledge transfer via a web GIS platform. In this contribution, the goals and research strategy of the database project are highlighted first, with a summary of best practices in database development providing perspective. Next, the focus is on key aspects of the methodology, followed by the results of several case studies in the German Central Uplands. The case study results exemplify database application in the analysis of vulnerability to landslides, impact statistics, and hazard and cost modeling.
On the detection of pornographic digital images
NASA Astrophysics Data System (ADS)
Schettini, Raimondo; Brambilla, Carla; Cusano, Claudio; Ciocca, Gianluigi
2003-06-01
The paper addresses the problem of distinguishing between pornographic and non-pornographic photographs for the design of semantic filters for the web. Both decision forests of trees built according to the CART (Classification And Regression Trees) methodology and Support Vector Machines (SVM) have been used to perform the classification. The photographs are described by a set of low-level features that can be computed automatically from the gray-level and color representations of the image. The database used in our experiments contained 1,500 photographs, 750 of which were labeled as pornographic on the basis of the independent judgement of several viewers.
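The classification setup, low-level color features fed to an SVM, can be sketched on synthetic data. The three features below are invented for illustration and are not the authors' actual descriptor set:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for low-level image descriptors: each "image" is
# reduced to a few color/gray-level statistics.  Only the skin-tone
# fraction is made informative; the other two features are pure noise.
n = 600
skin_fraction = np.concatenate([rng.beta(5, 2, n // 2),   # class 1 images
                                rng.beta(2, 5, n // 2)])  # class 0 images
edge_density = rng.random(n)
mean_gray = rng.random(n)
X = np.column_stack([skin_fraction, edge_density, mean_gray])
y = np.r_[np.ones(n // 2), np.zeros(n // 2)]

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0, stratify=y)
clf = SVC(kernel="rbf").fit(Xtr, ytr)
print(clf.score(Xte, yte))  # well above the 0.5 chance level
```

The same feature matrix could equally be fed to a CART decision forest, the other classifier family the paper evaluates; the point of low-level features is that no object detection or segmentation is needed.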
Aerodynamic and acoustic test of a United Technologies model scale rotor at DNW
NASA Technical Reports Server (NTRS)
Yu, Yung H.; Liu, Sandy R.; Jordan, Dave E.; Landgrebe, Anton J.; Lorber, Peter F.; Pollack, Michael J.; Martin, Ruth M.
1990-01-01
The UTC model scale rotors, the DNW wind tunnel, the AFDD rotary wing test stand, the UTRC and AFDD aerodynamic and acoustic data acquisition systems, and the scope of test matrices are discussed and an introduction to the test results is provided. It is pointed out that a comprehensive aero/acoustic database of several configurations of the UTC scaled model rotor has been created. The data is expected to improve understanding of rotor aerodynamics, acoustics, and dynamics, and lead to enhanced analytical methodology and design capabilities for the next generation of rotorcraft.
Statistical EMC: A new dimension in electromagnetic compatibility of digital electronic systems
NASA Astrophysics Data System (ADS)
Tsaliovich, Anatoly
Electromagnetic compatibility compliance test results are used as a database for addressing three classes of electromagnetic compatibility (EMC) related problems: statistical EMC profiles of digital electronic systems, the effect of equipment-under-test (EUT) parameters on electromagnetic emission characteristics, and EMC measurement specifics. Open area test site (OATS) and absorber-lined shielded room (AR) results are compared for the highest radiated emissions of the equipment under test. The suggested statistical evaluation methodology can be utilized to correlate the results of different EMC test techniques, characterize the EMC performance of electronic systems and components, and develop recommendations for optimal EMC design of electronic products.
Júnez-Ferreira, H E; Herrera, G S
2013-04-01
This paper presents a new methodology for the optimal design of space-time hydraulic head monitoring networks and its application to the Valle de Querétaro aquifer in Mexico. The selection of the space-time monitoring points is done using a static Kalman filter combined with a sequential optimization method. The Kalman filter requires as input a space-time covariance matrix, which is derived from a geostatistical analysis. A sequential optimization method is used that, in each step, selects the space-time point that minimizes a function of the variance. We demonstrate the methodology by applying it to the redesign of the hydraulic head monitoring network of the Valle de Querétaro aquifer, with the objective of selecting, from a set of monitoring positions and times, those that minimize spatiotemporal redundancy. The database for the geostatistical space-time analysis corresponds to information from 273 wells located within the aquifer for the period 1970-2007. A total of 1,435 hydraulic head data were used to construct the experimental space-time variogram. The results show that of the existing monitoring program, which consists of 418 space-time monitoring points, only 178 are not redundant. The implied reduction of monitoring costs was possible because the proposed method is successful in propagating information in space and time.
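As a rough sketch of the selection idea (not the authors' implementation), the following greedily picks the candidate point whose static Kalman update most reduces total variance, assuming a simple exponential covariance in place of the geostatistically estimated space-time covariance matrix:

```python
import numpy as np

def greedy_select(P, n_pick, noise_var=1e-4):
    """Greedy monitoring-network design: at each step, measure the candidate
    whose static Kalman update most shrinks the total (trace) variance.
    P is the prior covariance over all candidate space-time points."""
    P = P.copy()
    picked = []
    for _ in range(n_pick):
        best_j, best_trace = None, np.inf
        for j in range(P.shape[0]):
            if j in picked:
                continue
            gain = P[:, j] / (P[j, j] + noise_var)      # Kalman gain column
            trace = np.trace(P - np.outer(gain, P[j, :]))
            if trace < best_trace:
                best_j, best_trace = j, trace
        picked.append(best_j)
        gain = P[:, best_j] / (P[best_j, best_j] + noise_var)
        P = P - np.outer(gain, P[best_j, :])            # posterior covariance
    return picked, P

# Exponential covariance over 5 hypothetical candidate points on a line.
x = np.arange(5.0)
P0 = np.exp(-np.abs(x[:, None] - x[None, :]) / 2.0)
sel, Ppost = greedy_select(P0, n_pick=2)
```

With this toy covariance the first pick is the central point, since it is the most informative about its neighbors; redundant points are never selected because they barely reduce the trace.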
NASA Technical Reports Server (NTRS)
Salikuddin, M.; Martens, S.; Shin, H.; Majjigi, R. K.; Krejsa, Gene (Technical Monitor)
2002-01-01
The objective of this task was to develop a design methodology and noise reduction concepts for high bypass exhaust systems which could be applied to both existing production and new advanced engine designs. Special emphasis was given to engine cycles with bypass ratios in the range of 4:1 to 7:1, where jet mixing noise was a primary noise source at full power takeoff conditions. The goal of this effort was to develop the design methodology for mixed-flow exhaust systems and other novel noise reduction concepts that would yield 3 EPNdB noise reduction relative to 1992 baseline technology. Two multi-lobed mixers, a 22-lobed axisymmetric design and a 21-lobed design with a unique lobe, were designed. These mixers, along with a confluent mixer, were tested with several fan nozzles of different lengths, with and without acoustic treatment, in GEAE's Cell 41 under the current subtask (Subtask C). In addition to the acoustic and LDA tests for the model mixer exhaust systems, a semi-empirical noise prediction method for mixer exhaust systems was developed. Effort was also made to use flowfield data for noise prediction by utilizing the MGB code. In general, this study established an aero and acoustic diagnostic database to calibrate and refine current aero and acoustic prediction tools.
Horton, Emily L; Renganathan, Ramkesh; Toth, Bryan N; Cohen, Alexa J; Bajcsy, Andrea V; Bateman, Amelia; Jennings, Mathew C; Khattar, Anish; Kuo, Ryan S; Lee, Felix A; Lim, Meilin K; Migasiuk, Laura W; Zhang, Amy; Zhao, Oliver K; Oliveira, Marcio A
2017-01-01
To lay the groundwork for devising, improving, and implementing new technologies to meet the needs of individuals with visual impairments, a systematic literature review was conducted to: a) describe hardware platforms used in assistive devices, b) identify their various applications, and c) summarize practices in user testing conducted with these devices. A search in relevant EBSCO databases for articles published between 1980 and 2014 with terminology related to visual impairment, technology, and tactile sensory adaptation yielded 62 articles that met the inclusion criteria for final review. It was found that while earlier hardware development focused on pin matrices, the emphasis then shifted toward force feedback haptics and accessible touch screens. The inclusion of interactive and multimodal features has become increasingly prevalent. The quantity and consistency of research on navigation, education, and computer accessibility suggest that these are pertinent areas of need for the visually impaired community. Methodologies for usability testing ranged from case studies to larger cross-sectional studies. Many studies used blindfolded sighted users to draw conclusions about design principles and usability. Altogether, the findings presented in this review provide insight on effective design strategies and user testing methodologies for future research on assistive technology for individuals with visual impairments.
Empirical cost models for estimating power and energy consumption in database servers
NASA Astrophysics Data System (ADS)
Valdivia Garcia, Harold Dwight
The explosive growth in the size of data centers, coupled with the widespread use of virtualization technology, has made power and energy consumption major concerns for data center administrators. Provisioning decisions must take into consideration not only target application performance but also the power demands and total energy consumption incurred by the hardware and software to be deployed at the data center. Failure to do so will result in damaged equipment, power outages, and inefficient operation. Since database servers comprise one of the most popular and important server applications deployed in such facilities, it becomes necessary to have accurate cost models that can predict the power and energy demands that each database workload will impose on the system. In this work we present an empirical methodology to estimate the power and energy cost of database operations. Our methodology uses multiple linear regression to derive accurate cost models that depend only on readily available statistics such as selectivity factors, tuple size, number of columns, and relational cardinality. Moreover, our method does not need measurements of individual hardware components, but rather total power and energy consumption measured at the server. We have implemented our methodology and run experiments with several server configurations. Our experiments indicate that we can predict power and energy more accurately than alternative methods found in the literature.
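As an illustration of the modeling approach, the sketch below fits a multiple linear regression of measured server power on the four statistics named in the abstract. The coefficients, idle baseline, and data ranges are invented for the example, not taken from the study:

```python
import numpy as np

# Hypothetical training data: per-query statistics -> measured server power (W).
# Columns: selectivity factor, tuple size (bytes), number of columns, cardinality.
rng = np.random.default_rng(0)
X = rng.uniform([0.0, 32, 1, 1e3], [1.0, 512, 40, 1e6], size=(200, 4))
true_w = np.array([18.0, 0.05, 0.6, 2.5e-5])         # assumed coefficients
power = 95.0 + X @ true_w + rng.normal(0, 0.5, 200)  # 95 W idle baseline + noise

# Fit the empirical cost model by ordinary least squares (intercept + 4 terms).
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, power, rcond=None)

def predict_power(selectivity, tuple_size, n_cols, cardinality):
    """Predict total server power from readily available query statistics."""
    return coef @ np.array([1.0, selectivity, tuple_size, n_cols, cardinality])

p = predict_power(0.5, 256, 10, 5e5)
```

The appeal of the approach is exactly what the abstract notes: only whole-server power is measured during training, and only optimizer-visible statistics are needed at prediction time.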
Bare, Jane; Gloria, Thomas; Norris, Gregory
2006-08-15
Normalization is an optional step within Life Cycle Impact Assessment (LCIA) that may be used to assist in the interpretation of life cycle inventory data as well as life cycle impact assessment results. Normalization transforms the magnitude of LCI and LCIA results into relative contributions by substance and life cycle impact category. Normalization thus can significantly influence LCA-based decisions when tradeoffs exist. The U.S. Environmental Protection Agency (EPA) has developed a normalization database based on the spatial scale of the 48 continental U.S. states, Hawaii, Alaska, the District of Columbia, and Puerto Rico, with a one-year reference time frame. Data within the normalization database were compiled based on the impact methodologies and lists of stressors used in TRACI, the EPA's Tool for the Reduction and Assessment of Chemical and other environmental Impacts. The new normalization database published within this article may be used for LCIA case studies within the United States, and can be used to assist in the further development of a global normalization database. The underlying data analyzed for the development of this database are included to allow the development of normalization data consistent with other impact assessment methodologies as well.
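The normalization step itself is a simple division of characterized results by reference-region totals, yielding dimensionless relative contributions. In the sketch below the totals are placeholders for illustration, not values from the EPA database:

```python
# Normalization in LCIA: divide each characterized impact score by a
# reference-region annual total, giving relative (dimensionless) contributions
# that can be compared across impact categories.
reference_totals = {           # hypothetical annual totals per category
    "global_warming": 7.0e12,  # kg CO2-eq / yr (placeholder value)
    "acidification": 1.8e12,   # mol H+-eq / yr (placeholder value)
}
lcia_results = {"global_warming": 3.5e4, "acidification": 4.5e3}

normalized = {cat: lcia_results[cat] / reference_totals[cat]
              for cat in lcia_results}
```

Here the product's global warming score turns out to be a larger share of the reference total than its acidification score, which is the kind of tradeoff information normalization is meant to surface.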
Alayli-Goebbels, Adrienne F G; Evers, Silvia M A A; Alexeeva, Daria; Ament, André J H A; de Vries, Nanne K; Tilly, Jan C; Severens, Johan L
2014-06-01
The objective of this study was to review the methodological quality of economic evaluations of lifestyle behavior change interventions (LBCIs) and to examine how they address methodological challenges for public health economic evaluation identified in the literature. PubMed and the NHS Economic Evaluation Database were searched for published studies in six key areas for behavior change: smoking, physical activity, dietary behavior, (illegal) drug use, alcohol use and sexual behavior. From included studies (n = 142), we extracted data on general study characteristics, characteristics of the LBCIs, methodological quality and handling of methodological challenges. Economic evaluation evidence for LBCIs showed a number of weaknesses: methods, study design and characteristics of evaluated interventions were not well reported; methodological quality showed several shortcomings; and progress with addressing methodological challenges remained limited. Based on the findings of this review, we propose an agenda for improving future evidence to support decision-making. Recommendations for practice include improving reporting of essential study details and increasing adherence to good practice standards. Recommendations for research methods focus on mapping out complex causal pathways for modeling, developing measures to capture broader domains of wellbeing and community outcomes, testing methods for considering equity, identifying relevant non-health sector costs and advancing methods for evidence synthesis. © The Author 2013. Published by Oxford University Press on behalf of Faculty of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Identifying Items to Assess Methodological Quality in Physical Therapy Trials: A Factor Analysis
Cummings, Greta G.; Fuentes, Jorge; Saltaji, Humam; Ha, Christine; Chisholm, Annabritt; Pasichnyk, Dion; Rogers, Todd
2014-01-01
Background Numerous tools and individual items have been proposed to assess the methodological quality of randomized controlled trials (RCTs). The frequency of use of these items varies according to health area, which suggests a lack of agreement regarding their relevance to trial quality or risk of bias. Objective The objectives of this study were: (1) to identify the underlying component structure of items and (2) to determine relevant items to evaluate the quality and risk of bias of trials in physical therapy by using an exploratory factor analysis (EFA). Design A methodological research design was used, and an EFA was performed. Methods Randomized controlled trials used for this study were randomly selected from searches of the Cochrane Database of Systematic Reviews. Two reviewers used 45 items gathered from 7 different quality tools to assess the methodological quality of the RCTs. An exploratory factor analysis was conducted using the principal axis factoring (PAF) method followed by varimax rotation. Results Principal axis factoring identified 34 items loaded on 9 common factors: (1) selection bias; (2) performance and detection bias; (3) eligibility, intervention details, and description of outcome measures; (4) psychometric properties of the main outcome; (5) contamination and adherence to treatment; (6) attrition bias; (7) data analysis; (8) sample size; and (9) control and placebo adequacy. Limitation Because of the exploratory nature of the results, a confirmatory factor analysis is needed to validate this model. Conclusions To the authors' knowledge, this is the first factor analysis to explore the underlying component items used to evaluate the methodological quality or risk of bias of RCTs in physical therapy. The items and factors represent a starting point for evaluating the methodological quality and risk of bias in physical therapy trials. 
Empirical evidence of the association between these items and treatment effects, together with a confirmatory factor analysis of these results, is needed to validate these items. PMID:24786942
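The analysis pipeline the study describes (principal axis factoring followed by varimax rotation) can be sketched in a few lines of numpy. The two-factor toy data, loadings, and sample size below are invented for illustration; this is not the authors' code:

```python
import numpy as np

def paf(R, k, iters=50):
    """Principal axis factoring of correlation matrix R for k factors:
    iterate communalities on the diagonal of the reduced matrix."""
    h2 = 1.0 - 1.0 / np.diag(np.linalg.inv(R))   # SMC starting communalities
    for _ in range(iters):
        Rr = R.copy()
        np.fill_diagonal(Rr, h2)
        vals, vecs = np.linalg.eigh(Rr)          # ascending eigenvalues
        idx = np.argsort(vals)[::-1][:k]
        L = vecs[:, idx] * np.sqrt(np.clip(vals[idx], 0, None))
        h2 = (L ** 2).sum(axis=1)                # updated communalities
    return L

def varimax(L, iters=100):
    """Kaiser varimax rotation of a loading matrix."""
    p, k = L.shape
    T = np.eye(k)
    for _ in range(iters):
        B = L @ T
        U, _, Vt = np.linalg.svd(
            L.T @ (B ** 3 - B @ np.diag((B ** 2).sum(axis=0)) / p))
        T = U @ Vt
    return L @ T

# Two-factor toy structure: items 0-2 load on factor 1, items 3-5 on factor 2.
rng = np.random.default_rng(1)
F = rng.normal(size=(500, 2))
load = np.zeros((6, 2)); load[:3, 0] = 0.8; load[3:, 1] = 0.8
X = F @ load.T + 0.6 * rng.normal(size=(500, 6))
R = np.corrcoef(X, rowvar=False)
L = varimax(paf(R, 2))
```

After rotation, each item loads dominantly on one factor, which is the "simple structure" the study's 9-factor solution was rotated toward.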
Lee, Andy H; Zhou, Xu; Kang, Deying; Luo, Yanan; Liu, Jiali; Sun, Xin
2018-01-01
Objective To assess risk of bias and to investigate methodological issues concerning the design, conduct and analysis of randomised controlled trials (RCTs) testing acupuncture for knee osteoarthritis (KOA). Methods PubMed, EMBASE, Cochrane Central Register of Controlled Trials and four major Chinese databases were searched for RCTs that investigated the effect of acupuncture for KOA. The Cochrane tool was used to examine the risk of bias of eligible RCTs. Their methodological details were examined using a standardised and pilot-tested questionnaire of 48 items, together with the association between four predefined factors and important methodological quality indicators. Results A total of 248 RCTs were eligible, of which 39 (15.7%) used computer-generated randomisation sequence. Of the 31 (12.5%) trials that stated the allocation concealment, only one used central randomisation. Twenty-five (10.1%) trials mentioned that their acupuncture procedures were standardised, but only 18 (7.3%) specified how the standardisation was achieved. The great majority of trials (n=233, 94%) stated that blinding was in place, but 204 (87.6%) did not clarify who was blinded. Only 27 (10.9%) trials specified the primary outcome, for which 7 used intention-to-treat analysis. Only 17 (6.9%) trials included details on sample size calculation; none preplanned an interim analysis and associated stopping rule. In total, 46 (18.5%) trials explicitly stated that loss to follow-up occurred, but only 6 (2.4%) provided some information to deal with the issue. No trials prespecified, conducted or reported any subgroup or adjusted analysis for the primary outcome. Conclusion The overall risk of bias was high among published RCTs testing acupuncture for KOA. Methodological limitations were present in many important aspects of design, conduct and analyses. These findings inform the development of evidence-based methodological guidance for future trials assessing the effect of acupuncture for KOA. 
PMID:29511016
Evaluation of Tsunami Run-Up on Coastal Areas at Regional Scale
NASA Astrophysics Data System (ADS)
González, M.; Aniel-Quiroga, Í.; Gutiérrez, O.
2017-12-01
Tsunami hazard assessment is tackled by means of numerical simulations, which give as a result the areas flooded by the tsunami wave inland. To achieve this, some input data are required, e.g., high-resolution topobathymetry of the study area and the earthquake focal mechanism parameters. The computational cost of these kinds of simulations is still excessive. An important restriction for the elaboration of large-scale maps at national or regional scale is the reconstruction of high-resolution topobathymetry in the coastal zone. An alternative, traditional method consists of applying empirical-analytical formulations to calculate run-up at several coastal profiles (e.g., Synolakis, 1987), combined with numerical simulations offshore that do not include coastal inundation. In this case the numerical simulations are faster, but limitations are added, as the coastal bathymetric profiles are very simply idealized. In this work, we present a complementary methodology based on a hybrid numerical model formed by two models that were coupled ad hoc for this work: a non-linear shallow water equations (NLSWE) model for the offshore part of the propagation and a Volume of Fluid (VOF) model for the areas near the coast and inland, applying each numerical scheme where it better reproduces the tsunami wave. The run-up of a tsunami scenario is obtained by applying the coupled model to an ad hoc numerical flume. To design this methodology, hundreds of worldwide topobathymetric profiles have been parameterized using 5 parameters (2 depths and 3 slopes). In addition, tsunami waves have also been parameterized, by their height and period. As an application of the numerical flume methodology, the parameterized coastal profiles and tsunami waves have been combined to build a populated database of run-up calculations.
The combinations were computed by means of numerical simulations in the numerical flume. The result is a tsunami run-up database that considers real profile shapes, realistic tsunami waves, and optimized numerical simulations. This database allows the run-up of any new tsunami wave to be calculated by interpolation on the database, in a short period of time, based on the tsunami wave characteristics provided as an output of the NLSWE model along the coast in a large-scale domain (regional or national scale).
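A sketch of how such a database lookup might work, using inverse-distance interpolation over the nearest precomputed entries. For brevity only two of the seven parameters (wave height and period) are used, and all values are hypothetical:

```python
import numpy as np

# Toy stand-in for the precomputed run-up database: each row holds the
# parameterized inputs (here just offshore wave height in m and period in s)
# and the run-up obtained from a coupled-model flume simulation.
params = np.array([[1.0, 600.0], [2.0, 600.0], [1.0, 900.0], [2.0, 900.0]])
runups = np.array([1.8, 3.5, 2.2, 4.1])   # hypothetical simulated run-up (m)

def runup_lookup(height, period, k=2):
    """Inverse-distance interpolation over the k nearest database entries,
    after scaling each parameter to a comparable range."""
    scale = params.max(axis=0) - params.min(axis=0)
    d = np.linalg.norm((params - [height, period]) / scale, axis=1)
    near = np.argsort(d)[:k]
    w = 1.0 / (d[near] + 1e-12)           # avoid division by zero at exact hits
    return float(np.sum(w * runups[near]) / np.sum(w))

r = runup_lookup(1.5, 600.0)
```

In the real methodology the lookup would be over the full 7-parameter profile description plus the wave parameters, but the principle is the same: a new query never triggers a simulation, only an interpolation.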
Shi, Chunhu; Zhu, Lin; Wang, Xue; Qin, Chunxia; Xu, Qi; Tian, Jinhui
2014-12-01
The importance of systematic reviews (SRs) of nursing interventions' impact on practice makes their methodological quality and reporting characteristics especially important, as these directly influence their utility for clinicians, patients and policy makers. The study aims to assess the methodological quality and reporting characteristics of SRs of nursing interventions in Chinese nursing journals. Three Chinese databases were searched for SRs of nursing interventions from inception to October 2011. The Assessment of Multiple Systematic Reviews (AMSTAR) and Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statements were used to assess methodological quality and reporting characteristics. Seventy-four SRs were included. The proportion of SRs complying with AMSTAR checklist items ranged from 0% to 82.4%. No SRs reported an 'a priori' design or conflict of interest. Only four items were found to be reported in more than 50% of the SRs: a list of included and excluded studies, the scientific quality of included studies, the appropriate use of methods to combine findings, and formulating conclusions appropriately. The majority of SRs of nursing interventions in China had major methodological and reporting flaws that limited their value to guide decisions. Chinese authors and journals should adopt and keep up with the AMSTAR and PRISMA statements to improve the quality of SRs in this field. © 2014 Wiley Publishing Asia Pty Ltd.
Interventions to Improve Parental Communication About Sex: A Systematic Review
Akers, Aletha Y.; Holland, Cynthia L.; Bost, James
2011-01-01
CONTEXT: The relative effectiveness of interventions to improve parental communication with adolescents about sex is not known. OBJECTIVE: To compare the effectiveness and methodologic quality of interventions for improving parental communication with adolescents about sex. METHODS: We searched 6 databases: OVID/Medline, PsychInfo, ERIC, Cochrane Review, Communication and Mass Media, and the Cumulative Index to Nursing and Allied Health Literature. We included studies published between 1980 and July 2010 in peer-reviewed English-language journals that targeted US parents of adolescents aged 11 to 18 years, used an experimental or quasi-experimental design, included a control group, and had a pretest/posttest design. We abstracted data on multiple communication outcomes defined by the integrative conceptual model (communication frequency, content, skills, intentions, self-efficacy, perceived environmental barriers/facilitators, perceived social norms, attitudes, outcome expectations, knowledge, and beliefs). Methodologic quality was assessed using the 11-item methodologic quality score. RESULTS: Twelve studies met inclusion criteria. Compared with controls, parents who participated in these interventions experienced improvements in multiple communication domains including the frequency, quality, intentions, comfort, and self-efficacy for communicating. We noted no effects on parental attitudes toward communicating or the outcomes they expected to occur as a result of communicating. Four studies were of high quality, 7 were of medium quality, and 1 was of lower quality. CONCLUSIONS: Our review was limited by the lack of standardized measures for assessing parental communication. Still, interventions for improving parent-adolescent sex communication are well designed and have some targeted effects. Wider dissemination could augment efforts by schools, clinicians, and health educators. PMID:21321027
Wiley, Emily A.; Stover, Nicholas A.
2014-01-01
Use of inquiry-based research modules in the classroom has soared over recent years, largely in response to national calls for teaching that provides experience with scientific processes and methodologies. To increase the visibility of in-class studies among interested researchers and to strengthen their impact on student learning, we have extended the typical model of inquiry-based labs to include a means for targeted dissemination of student-generated discoveries. This initiative required: 1) creating a set of research-based lab activities with the potential to yield results that a particular scientific community would find useful and 2) developing a means for immediate sharing of student-generated results. Working toward these goals, we designed guides for course-based research aimed to fulfill the need for functional annotation of the Tetrahymena thermophila genome, and developed an interactive Web database that links directly to the official Tetrahymena Genome Database for immediate, targeted dissemination of student discoveries. This combination of research via the course modules and the opportunity for students to immediately “publish” their novel results on a Web database actively used by outside scientists culminated in a motivational tool that enhanced students’ efforts to engage the scientific process and pursue additional research opportunities beyond the course. PMID:24591511
Xu, Yan-Wen; Cheng, Andy S K; Li-Tsang, Cecilia W P
2013-01-01
This paper aims to systematically explore the prevalence and risk factors of work-related musculoskeletal disorders (WMSDs) in the catering industry by reviewing the relevant published literature, with the goal of developing future prevention strategies. The systematic review was carried out in nine English medical databases, two Chinese full-text databases and seven web sites using the designated search strategies. Studies were included if they hierarchically met the defined inclusion criteria for investigating the prevalence and/or risk factors associated with WMSDs in the catering industry with appropriate epidemiological methodology. The nine English databases yielded 634 citations and the two Chinese databases yielded 401 citations, although only five English and three Chinese studies passed the inclusion criteria. Three-fourths of the studies were cross-sectional. The prevalence of WMSDs varied from 3% to 86% depending on the type of establishment and position. The most important risk factors were physical job demands, such as work posture, applied force, and repeated movement. The lack of epidemiological information about WMSDs in the catering industry is apparent. Further studies are needed to investigate the relations among prevalence, risk factors and forms of WMSDs, in particular the interaction of risk factors in psychosocial aspects of the catering industry.
Bryce, Shayden; Sloan, Elise; Lee, Stuart; Ponsford, Jennie; Rossell, Susan
2016-04-01
Systematic reviews and meta-analyses are a primary source of evidence when evaluating the benefit(s) of cognitive remediation (CR) in schizophrenia. These studies are designed to rigorously synthesize the scientific literature; however, they cannot be assumed to be of high methodological quality. The aims of this report were to: 1) review the use of systematic reviews and meta-analyses regarding CR in schizophrenia; 2) conduct a systematic methodological appraisal of published reports examining the benefits of this intervention on core outcome domains; and 3) compare the correspondence between methodological and reporting quality. Electronic databases were searched for relevant articles. Twenty-one reviews met the inclusion criteria and were scored according to the AMSTAR checklist, a validated scale of methodological quality. Five meta-analyses were also scored according to the PRISMA statement to compare 'quality of conduct' with 'quality of reporting'. Most systematic reviews and meta-analyses shared strengths and fell within a 'medium' level of methodological quality. Nevertheless, there were consistent areas of potential weakness that were not addressed by most reviews. These included the lack of protocol registration, uncertainty regarding independent data extraction and consensus procedures, and the minimal assessment of publication bias. Moreover, quality of conduct may not necessarily parallel quality of reporting, suggesting that considering these methods independently may be important. Reviews concerning CR for schizophrenia are a valuable source of evidence. However, the methodological quality of these reports may require additional consideration. Enhancing quality of conduct is essential for enabling the research literature to be interpreted with confidence. Copyright © 2016 Elsevier Ltd. All rights reserved.
Six methodological steps to build medical data warehouses for research.
Szirbik, N B; Pelletier, C; Chaussalet, T
2006-09-01
We propose a simple methodology for heterogeneous data collection and central repository-style database design in healthcare. Our method can be used with or without other software development frameworks, and we argue that its application can save a considerable amount of implementation effort. We also believe that the method can be used in other fields of research, especially those with a strong interdisciplinary nature. The idea emerged during a healthcare research project, which involved, among other things, grouping information from heterogeneous and distributed information sources. We developed this methodology from the lessons learned while building a data repository containing information about flows of elderly patients in the UK's long-term care (LTC) system. We explain thoroughly the aspects that influenced the building of the methodology. The methodology is defined by six steps, which can be aligned with various iterative development frameworks; we describe here its alignment with the RUP (Rational Unified Process) framework. The methodology emphasizes current trends, such as early identification of critical requirements, data modelling, close and timely interaction with users and stakeholders, ontology building, quality management, and exception handling. Of special interest is the ontological engineering aspect, which had the highest-impact effects after the project: it helped stakeholders perform better collaborative negotiations, which brought better solutions for the overall system investigated. Insight into the problems faced by others helps lead negotiators to win-win situations. We consider that this should be the social result of any project that collects data for better decision making and, ultimately, enhanced global outcomes.
Cooney, Lewis; Loke, Yoon K; Golder, Su; Kirkham, Jamie; Jorgensen, Andrea; Sinha, Ian; Hawcutt, Daniel
2017-06-02
Many medicines are dosed to achieve a particular therapeutic range, and monitored using therapeutic drug monitoring (TDM). The evidence base for a therapeutic range can be evaluated using systematic reviews, to ensure it continues to reflect current indications, doses, routes and formulations, as well as updated adverse effect data. There is no consensus on the optimal methodology for systematic reviews of therapeutic ranges. An overview of systematic reviews of therapeutic ranges was undertaken. The following databases were used: the Cochrane Database of Systematic Reviews (CDSR), the Database of Abstracts of Reviews of Effects (DARE) and MEDLINE. The published methodologies used when systematically reviewing the therapeutic range of a drug were analyzed. Step-by-step recommendations to optimize such systematic reviews are proposed. Ten systematic reviews that investigated the correlation between serum concentrations and clinical outcomes, encompassing a variety of medicines and indications, were assessed. There were significant variations in the methodologies used (including the search terms used, data extraction methods, assessment of bias, and statistical analyses undertaken). Therapeutic ranges should be population- and indication-specific and based on clinically relevant outcomes. Recommendations for future systematic reviews based on these findings have been developed. Evidence-based therapeutic ranges have the potential to improve TDM practice. Current systematic reviews investigating therapeutic ranges have highly variable methodologies and there is no consensus on best practice when undertaking systematic reviews in this field. These recommendations meet a need not addressed by standard protocols.
Aaron, Grant J; Varadhan, Ravi
2017-01-01
Background: The Biomarkers Reflecting Inflammation and Nutritional Determinants of Anemia (BRINDA) project is a multiagency and multicountry collaboration that was formed to improve micronutrient assessment and to better characterize anemia. Objectives: The aims of the project were to 1) identify factors associated with inflammation, 2) assess the relations between inflammation, malaria infection, and biomarkers of iron and vitamin A status and compare adjustment approaches, and 3) assess risk factors for anemia in preschool children (PSC) and women of reproductive age (WRA). Design: The BRINDA database inclusion criteria included surveys that 1) were conducted after 2004, 2) had target groups of PSC, WRA, or both, and 3) used a similar laboratory methodology for the measurement of ≥1 biomarker of iron (ferritin or soluble transferrin receptor) or vitamin A status (retinol-binding protein or retinol) and ≥1 biomarker of inflammation (α-1-acid glycoprotein or C-reactive protein). Individual data sets were standardized and merged into a BRINDA database comprising 16 nationally and regionally representative surveys from 14 countries. Collectively, the database covered all 6 WHO geographic regions and contained ∼30,000 PSC and 27,000 WRA. Data were analyzed individually and combined with the use of a meta-analysis. Results: The methods that were used to standardize the BRINDA database and the analytic approaches used to address the project’s research questions are presented in this article. Three approaches to adjust micronutrient biomarker concentrations in the presence of inflammation and malaria infection are presented, along with an anemia conceptual framework that guided the BRINDA project’s anemia analyses.
Conclusions: The BRINDA project refines approaches to interpret iron and vitamin A biomarker values in settings of inflammation and malaria infection and suggests the use of a new regression approach as well as proposes an anemia framework to which real-world data can be applied. Findings can inform guidelines and strategies to prevent and control micronutrient deficiencies and anemia globally. PMID:28615254
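A regression adjustment of the kind proposed works by subtracting, on the log scale, the estimated inflammation effect above a reference level. The sketch below illustrates the idea only; the coefficients, reference values, and function name are invented for illustration and are not the published BRINDA estimates.

```python
import math

def adjust_ferritin(ferritin, crp, agp,
                    beta_crp=0.3, beta_agp=0.2,
                    crp_ref=1.0, agp_ref=0.6):
    """Regression-style correction: subtract, on the natural-log scale,
    the inflammation effect estimated by regressing ln(ferritin) on
    ln(CRP) and ln(AGP); no adjustment below the reference levels."""
    ln_adj = (math.log(ferritin)
              - beta_crp * max(math.log(crp) - math.log(crp_ref), 0.0)
              - beta_agp * max(math.log(agp) - math.log(agp_ref), 0.0))
    return math.exp(ln_adj)

# A child with elevated CRP has the observed ferritin revised downward:
print(round(adjust_ferritin(30.0, crp=5.0, agp=0.6), 1))  # prints 18.5
```

With CRP and AGP at or below their reference values the observed concentration is returned unchanged, which matches the intent of adjusting only in the presence of inflammation.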
Zhou, Quan; Tao, Jing; Song, Huamei; Chen, Aihua; Yang, Huaijie; Zuo, Manzhen; Li, Hairong
2016-12-01
Kuntai capsule has been widely used for the treatment of menopausal syndrome in China for a long time. We conducted this review to assess the efficacy and safety of Kuntai capsule for the treatment of menopausal syndrome. We searched for studies in PubMed, ClinicalTrials, the Cochrane Library, the China National Knowledge Infrastructure Database (CNKI), the China Science and Technology Journal Database (VIP), the Wanfang Database and the Chinese Biomedical Literature Database (CBM) until November 20, 2014. Randomized trials of Kuntai capsule for menopausal syndrome, compared with placebo or hormone replacement therapy (HRT), were included. Two reviewers independently retrieved the randomized controlled trials (RCTs) and extracted the information. The Cochrane risk of bias method was used to assess the quality of the included studies, and a meta-analysis was conducted with Review Manager 5.2 software. A total of 17 RCTs (1455 participants) were included. The studies were of low methodological quality. Meta-analysis indicated no statistical difference between the Kuntai and HRT groups after treatment in the Kupperman index (KI) [WMD=0.51, 95% CI (-0.04, 1.06)], the effective rate of KI [OR=1.21, 95% CI (0.72, 2.04)], E2 level [WMD=-15.18, 95% CI (-33.93, 3.56)], or FSH level [WMD=-3.46, 95% CI (-7.2, 0.28)] (P>0.05). However, compared with HRT, Kuntai capsule significantly reduced the total incidence of adverse events [OR=0.28, 95% CI (0.17, 0.45)]. Kuntai capsule may be effective for treating menopausal syndrome, with a lower risk of side effects. Because the studies analyzed were of low methodological quality, more strictly designed large-scale randomized clinical trials are needed to evaluate the efficacy of Kuntai capsule in menopausal syndrome. Copyright © 2016 Elsevier Ltd. All rights reserved.
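The pooled estimates quoted here come from inverse-variance weighting of log-scale effect sizes, the calculation Review Manager performs under a fixed-effect model. A minimal sketch, with the log standard error recovered from a reported 95% CI; the second trial's numbers are invented for illustration.

```python
import math

def se_from_ci(lower, upper, z=1.96):
    """Log-scale standard error recovered from a ratio measure's 95% CI."""
    return (math.log(upper) - math.log(lower)) / (2 * z)

def pool_fixed_effect(log_effects, ses):
    """Inverse-variance weighted mean of log effect sizes and its SE."""
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * y for w, y in zip(weights, log_effects)) / sum(weights)
    return pooled, math.sqrt(1.0 / sum(weights))

# Two hypothetical trials: OR 0.28 (0.17, 0.45) and OR 0.35 (0.20, 0.61)
ses = [se_from_ci(0.17, 0.45), se_from_ci(0.20, 0.61)]
logs = [math.log(0.28), math.log(0.35)]
pooled, se = pool_fixed_effect(logs, ses)
print(round(math.exp(pooled), 2))  # prints 0.31
```

The pooled standard error is smaller than that of either contributing trial, which is what makes the combined CI narrower than any single study's.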
Fernandez, Ritin; Everett, Bronwyn; Miranda, Charmaine; Rolley, John X; Rajaratnam, Rohan; Davidson, Patricia M
2015-01-01
Objectives: The objectives of this descriptive comparative study were to (1) review data obtained from the World Health Organisation Statistical Information System (WHOSIS) database relating to the prevalence of risk factors for coronary heart disease (CHD) among Indians and Australians and (2) compare these data with published epidemiological studies of CHD risk factors in adult migrant Asian Indians, to provide a comprehensive and comparable assessment of risk factors relating to CHD and the mortality attributable to these risk factors. Design: The study was undertaken using a database search and integrative review methodology. Data were obtained for comparison of CHD risk factors between Indians and Australians using the WHOSIS database. For the integrative review, the MEDLINE, CINAHL, EMBASE, and Cochrane databases were searched using the keywords 'Migrants', 'Asian Indian', 'India', 'Migration', 'Immigration', 'Risk factors', and 'coronary heart disease'. Two reviewers independently assessed the eligibility of the studies for inclusion in the review and the methodological quality, and extracted details of eligible studies. Results from the integrative review on CHD risk factors in Asian Indians are presented in a narrative format, along with results from the WHOSIS database. Results: The adjusted mortality for CHD was four times higher in migrant Asian Indians when compared to both the native population of the host country and migrants from other countries. Similarly, when compared to migrants from other countries, migrant Asian Indians had the highest prevalence of overweight individuals. Prevalence rates for hypercholesterolemia were up to 18.5% among migrant Asian Indians, and migrant Asian Indian women had a higher prevalence of hypertriglyceridaemia compared to Caucasian females. Migrant Asian Indians also had a higher incidence of hypertension, and up to 71% of migrant Asian Indian men did not meet current guidelines for participation in physical activity.
Ethnic-specific prevalence of diabetes ranged from 6-7% among normal-weight to 19-33% among obese migrant Asian Indians, compared with non-Hispanic whites. Conclusion: Migrant Asian Indians have an increased risk of CHD. Culturally sensitive strategies that recognise the effects of migration and extend beyond the health sector should be developed to target lifestyle changes in this high-risk population.
Reinventing The Design Process: Teams and Models
NASA Technical Reports Server (NTRS)
Wall, Stephen D.
1999-01-01
The future of space mission designing will be dramatically different from the past. Formerly, performance-driven paradigms emphasized data return with cost and schedule being secondary issues. Now and in the future, costs are capped and schedules fixed; these two variables must be treated as independent in the design process. Accordingly, JPL has redesigned its design process. At the conceptual level, design times have been reduced by properly defining the required design depth, improving the linkages between tools, and managing team dynamics. In implementation-phase design, system requirements will be held in crosscutting models, linked to subsystem design tools through a central database that captures the design and supplies needed configuration management and control. Mission goals will then be captured in timelining software that drives the models, testing their capability to execute the goals. Metrics are used to measure and control both processes and to ensure that design parameters converge through the design process within schedule constraints. This methodology manages margins controlled by acceptable risk levels. Thus, teams can evolve risk tolerance (and cost) as they would any engineering parameter. This new approach allows more design freedom for a longer time, which tends to encourage revolutionary and unexpected improvements in design.
NASA Astrophysics Data System (ADS)
Kaskhedikar, Apoorva Prakash
According to the U.S. Energy Information Administration, commercial buildings represent about 40% of the United States' energy consumption, of which office buildings consume a major portion. Gauging the extent to which an individual building consumes energy in excess of its peers is the first step in initiating energy efficiency improvement. Energy benchmarking offers an initial building energy performance assessment without rigorous evaluation. Energy benchmarking tools based on the Commercial Buildings Energy Consumption Survey (CBECS) database are investigated in this thesis. This study proposes a new benchmarking methodology based on decision trees, where a relationship between the energy use intensities (EUI) and building parameters (continuous and categorical) is developed for different building types. This methodology was applied to medium office and school building types contained in the CBECS database. The Random Forest technique was used to find the most influential parameters that impact building energy use intensities. Subsequently, significant correlations were identified between EUIs and CBECS variables. Other than floor area, some of the important variables were number of workers, location, number of PCs, and main cooling equipment. The coefficient of variation was used to evaluate the effectiveness of the new model. The customization technique proposed in this thesis was compared with another benchmarking model that is widely used by building owners and designers, namely ENERGY STAR's Portfolio Manager. This tool relies on standard linear regression methods, which are only able to handle continuous variables. The proposed model uses a data mining technique and was found to perform slightly better than Portfolio Manager.
The broader impact of the proposed benchmarking methodology is that it allows for identifying important categorical variables and then incorporating them in a local, as opposed to a global, model framework for EUI pertinent to the building type. The ability to identify and rank the important variables is of great importance in the practical implementation of benchmarking tools that rely on query-based building and HVAC variable filters specified by the user.
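The variable-ranking step described above can be sketched with scikit-learn's RandomForestRegressor (assumed available); the synthetic records below stand in for CBECS survey data, and the variable names and coefficients are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 400
floor_area = rng.uniform(1_000, 100_000, n)        # square feet
workers = rng.integers(5, 500, n).astype(float)    # occupant count
noise_var = rng.normal(size=n)                     # irrelevant predictor

# Synthetic EUI driven mainly by floor area, weakly by worker count
eui = 0.002 * floor_area + 0.05 * workers + rng.normal(scale=5.0, size=n)

X = np.column_stack([floor_area, workers, noise_var])
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, eui)
ranking = dict(zip(["floor_area", "workers", "noise"],
                   model.feature_importances_))
```

On data like this, the importance scores recover the constructed ordering (floor area above worker count, both above the irrelevant variable), which is the sense in which the technique "finds the most influential parameters".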
Harvey, Catherine; Brewster, Jill; Bakerly, Nawar Diar; Elkhenini, Hanaa F.; Stanciu, Roxana; Williams, Claire; Brereton, Jacqui; New, John P.; McCrae, John; McCorkindale, Sheila; Leather, David
2016-01-01
Background: The Salford Lung Study (SLS) programme, encompassing two phase III pragmatic randomised controlled trials, was designed to generate evidence on the effectiveness of a once‐daily treatment for asthma and chronic obstructive pulmonary disease in routine primary care using electronic health records. Objective: The objective of this study was to describe and discuss the safety monitoring methodology and the challenges associated with ensuring patient safety in the SLS. Refinements to safety monitoring processes and infrastructure are also discussed. The study results are outside the remit of this paper. The results of the COPD study were published recently, and a more in‐depth exploration of the safety results will be the subject of future publications. Achievements: The SLS used a linked database system to capture relevant data from primary care practices in Salford and South Manchester, two university hospitals and other national databases. Patient data were collated and analysed to create daily summaries that were used to alert a specialist safety team to potential safety events. Clinical research teams at participating general practitioner sites and pharmacies also captured safety events during routine consultations. Confidence in the safety monitoring processes over time allowed the methodology to be refined and streamlined without compromising patient safety or the timely collection of data. The information technology infrastructure also allowed additional details of safety information to be collected. Conclusion: Integration of multiple data sources in the SLS may provide more comprehensive safety information than usually collected in standard randomised controlled trials. Application of the principles of safety monitoring methodology from the SLS could facilitate safety monitoring processes for future pragmatic randomised controlled trials and yield important complementary safety and effectiveness data.
© 2016 The Authors Pharmacoepidemiology and Drug Safety Published by John Wiley & Sons Ltd. PMID:27804174
Collier, Sue; Harvey, Catherine; Brewster, Jill; Bakerly, Nawar Diar; Elkhenini, Hanaa F; Stanciu, Roxana; Williams, Claire; Brereton, Jacqui; New, John P; McCrae, John; McCorkindale, Sheila; Leather, David
2017-03-01
The Salford Lung Study (SLS) programme, encompassing two phase III pragmatic randomised controlled trials, was designed to generate evidence on the effectiveness of a once-daily treatment for asthma and chronic obstructive pulmonary disease in routine primary care using electronic health records. The objective of this study was to describe and discuss the safety monitoring methodology and the challenges associated with ensuring patient safety in the SLS. Refinements to safety monitoring processes and infrastructure are also discussed. The study results are outside the remit of this paper. The results of the COPD study were published recently and a more in-depth exploration of the safety results will be the subject of future publications. The SLS used a linked database system to capture relevant data from primary care practices in Salford and South Manchester, two university hospitals and other national databases. Patient data were collated and analysed to create daily summaries that were used to alert a specialist safety team to potential safety events. Clinical research teams at participating general practitioner sites and pharmacies also captured safety events during routine consultations. Confidence in the safety monitoring processes over time allowed the methodology to be refined and streamlined without compromising patient safety or the timely collection of data. The information technology infrastructure also allowed additional details of safety information to be collected. Integration of multiple data sources in the SLS may provide more comprehensive safety information than usually collected in standard randomised controlled trials. Application of the principles of safety monitoring methodology from the SLS could facilitate safety monitoring processes for future pragmatic randomised controlled trials and yield important complementary safety and effectiveness data. © 2016 The Authors Pharmacoepidemiology and Drug Safety Published by John Wiley & Sons Ltd. 
Middleton, Anna; Niemiec, Emilia; Prainsack, Barbara; Bobe, Jason; Farley, Lauren; Steed, Claire; Smith, James; Bevan, Paul; Bonhomme, Natasha; Kleiderman, Erika; Thorogood, Adrian; Schickhardt, Christoph; Garattini, Chiara; Vears, Danya; Littler, Katherine; Banner, Natalie; Scott, Erick; Kovalevskaya, Nadezda V; Levin, Elissa; Morley, Katherine I; Howard, Heidi C
2018-06-01
Our international study, 'Your DNA, Your Say', uses film and an online cross-sectional survey to gather public attitudes toward the donation, access and sharing of DNA information. We describe the methodological approach used to create an engaging and bespoke survey, suitable for translation into many different languages. We address some of the particular challenges in designing a survey on the subject of genomics. In order to understand the significance of a genomic result, researchers and clinicians alike use external databases containing DNA and medical information from thousands of people. We ask how publics would like their 'anonymous' data to be used (or not to be used) and whether they are concerned by the potential risks of reidentification; the results will be used to inform policy.
Methodologies and Methods for User Behavioral Research.
ERIC Educational Resources Information Center
Wang, Peiling
1999-01-01
Discusses methodological issues in empirical studies of information-related behavior in six specific research areas: information needs and uses; information seeking; relevance judgment; online searching (including online public access catalog, online database, and the Web); human-system interactions; and reference transactions. (Contains 191…
Chen, Wei; Liu, Bo; Wang, Li-qiong; Ren, Jun; Liu, Jian-ping
2014-07-30
Many Chinese patent medicines (CPMs) have been authorized by the Chinese State Food and Drug Administration for the treatment of the common cold. A number of clinical trials have been conducted and published. However, there is no systematic review or meta-analysis on their efficacy and safety for the common cold to justify their clinical use. We searched CENTRAL, MEDLINE, EMBASE, SinoMed, CNKI, VIP, the China Important Conference Papers Database, the China Dissertation Database, and online clinical trial registry websites for published and unpublished randomized clinical trials (RCTs) of CPMs for the common cold up to 31 March 2013. RevMan 5.2 software was used for data analysis, with effect estimates presented as relative risk (RR) and mean difference (MD) with 95% confidence intervals (CI). A total of five RCTs were identified. All of the RCTs were at high risk of bias, with flawed study designs and poor methodological quality. All RCTs included children aged between 6 months and 14 years. Results of individual trials showed that Shuanghuanglian oral liquid (RR 4.00; 95% CI: 2.26 to 7.08) and Xiaoer Resuqing oral liquid (RR 1.43; 95% CI: 1.15 to 1.77) had higher cure rates compared with antiviral drugs. Most of the trials did not report adverse events, so the safety of CPMs remains uncertain. Some CPMs showed a potential positive effect on cure rate for the common cold. However, owing to the poor methodological quality and design flaws of the included RCTs, such as the lack of placebo-controlled trials and inappropriate comparison interventions and outcome measurements, confirmative conclusions on the beneficial effect of CPMs for the common cold could not be drawn.
Liew, H B; Rosli, M A; Wan Azman, W A; Robaayah, Z; Sim, K H
2008-09-01
The National Cardiovascular Database for Percutaneous Coronary Intervention (NCVD PCI) Registry is the first multicentre interventional cardiology project, involving the main cardiac centres in the country. The ultimate goal of NCVD PCI is to provide a contemporary appraisal of PCI in Malaysia. This article introduces the foundation, the aims, methodology, database collection and preliminary results of the first six-month database.
François, Clément; Tanasescu, Adrian; Lamy, François-Xavier; Despiegel, Nicolas; Falissard, Bruno; Chalem, Ylana; Lançon, Christophe; Llorca, Pierre-Michel; Saragoussi, Delphine; Verpillat, Patrice; Wade, Alan G; Zighed, Djamel A
2017-01-01
Background and objective: Automated healthcare databases (AHDBs) are an important data source for real-life drug and healthcare use. In the field of depression, the lack of detailed clinical data requires the use of binary proxies with important limitations. The study objective was to create a Depressive Health State Index (DHSI) as a continuous health state measure for depressed patients using available data in an AHDB. Methods: The study was based on a historical cohort design using the UK Clinical Practice Research Datalink (CPRD). Depressive episodes (a depression diagnosis with an antidepressant prescription) were used to create the DHSI through six successive steps: (1) defining the study design; (2) identifying constituent parameters; (3) assigning relative weights to the parameters; (4) ranking based on the presence of parameters; (5) standardizing the rank of the DHSI; (6) developing a regression model to derive the DHSI in any other sample. Results: The DHSI ranged from 0 (worst) to 100 (best health state) and comprised 29 parameters. The proportion of depressive episodes with a remission proxy increased with DHSI quartiles. Conclusion: A continuous outcome for depressed patients treated with antidepressants was created in an AHDB using several different variables and allowed more granularity than currently used proxies.
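Steps 3 to 5 of the construction above (weighting, ranking, rank standardization) can be sketched as follows; the parameter names and weights are invented for illustration, whereas the published index uses 29 CPRD-derived parameters.

```python
def dhsi(episodes, weights):
    """Weighted severity per episode, then ranks standardized so the
    most severe episode scores 0 and the least severe scores 100.
    Assumes at least two episodes; ties keep input order."""
    severity = [sum(weights.get(p, 0.0) * v for p, v in ep.items())
                for ep in episodes]
    n = len(severity)
    order = sorted(range(n), key=lambda i: -severity[i])  # worst first
    rank = {i: pos for pos, i in enumerate(order)}
    return [100.0 * rank[i] / (n - 1) for i in range(n)]

# Hypothetical binary parameters recorded for three depressive episodes:
weights = {"psych_referral": 3.0, "dose_increase": 2.0, "switch": 1.0}
episodes = [{"psych_referral": 1, "dose_increase": 1},  # severity 5
            {"dose_increase": 1},                       # severity 2
            {}]                                         # severity 0
print(dhsi(episodes, weights))  # prints [0.0, 50.0, 100.0]
```

Standardizing the rank rather than the raw score is what makes the index comparable across samples with different parameter distributions; the regression model of step 6 then reproduces the index in new data.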
Onyura, Betty; Baker, Lindsay; Cameron, Blair; Friesen, Farah; Leslie, Karen
2016-01-01
An umbrella review compiles evidence from multiple reviews into a single accessible document. This umbrella review synthesizes evidence from systematic reviews on curricular and instructional design approaches in undergraduate medical education, focusing on learning outcomes. We conducted bibliographic database searches in Medline, EMBASE and ERIC from database inception to May 2013 inclusive, and digital keyword searches of leading medical education journals. We identified 18,470 abstracts; 467 underwent duplicate full-text scrutiny. Thirty-six articles met all eligibility criteria. Articles were abstracted independently by three authors, using a modified Kirkpatrick model for evaluating learning outcomes. Evidence for the effectiveness of diverse educational approaches is reported. This review maps out empirical knowledge on the efficacy of a broad range of educational approaches in medical education. Critical knowledge gaps, and lapses in methodological rigour, are discussed, providing valuable insight for future research. The findings call attention to the need for adopting evaluative strategies that explore how contextual variabilities and individual (teacher/learner) differences influence efficacy of educational interventions. Additionally, the results underscore that extant empirical evidence does not always provide unequivocal answers about what approaches are most effective. Educators should incorporate best available empirical knowledge with experiential and contextual knowledge.
Decoys Selection in Benchmarking Datasets: Overview and Perspectives
Réau, Manon; Langenfeld, Florent; Zagury, Jean-François; Lagarde, Nathalie; Montes, Matthieu
2018-01-01
Virtual Screening (VS) is designed to prospectively help identifying potential hits, i.e., compounds capable of interacting with a given target and potentially modulate its activity, out of large compound collections. Among the variety of methodologies, it is crucial to select the protocol that is the most adapted to the query/target system under study and that yields the most reliable output. To this aim, the performance of VS methods is commonly evaluated and compared by computing their ability to retrieve active compounds in benchmarking datasets. The benchmarking datasets contain a subset of known active compounds together with a subset of decoys, i.e., assumed non-active molecules. The composition of both the active and the decoy compounds subsets is critical to limit the biases in the evaluation of the VS methods. In this review, we focus on the selection of decoy compounds that has considerably changed over the years, from randomly selected compounds to highly customized or experimentally validated negative compounds. We first outline the evolution of decoys selection in benchmarking databases as well as current benchmarking databases that tend to minimize the introduction of biases, and secondly, we propose recommendations for the selection and the design of benchmarking datasets. PMID:29416509
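One standard way to score a VS protocol against such a benchmarking dataset is the ROC AUC: the probability that a randomly chosen active is ranked above a randomly chosen decoy. A stdlib sketch with made-up docking scores:

```python
def roc_auc(active_scores, decoy_scores):
    """Mann-Whitney form of the ROC AUC: fraction of (active, decoy)
    pairs in which the active is ranked higher; ties count one half."""
    wins = 0.0
    for a in active_scores:
        for d in decoy_scores:
            wins += 1.0 if a > d else 0.5 if a == d else 0.0
    return wins / (len(active_scores) * len(decoy_scores))

# Hypothetical scores: 8 of the 9 active/decoy pairs are correctly ordered
print(roc_auc([0.91, 0.84, 0.40], [0.62, 0.30, 0.25]))
```

An AUC of 0.5 is random ranking and 1.0 is perfect separation, which is why biased decoy selection (decoys trivially distinguishable from actives) inflates apparent performance.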
NASA Technical Reports Server (NTRS)
Parsons, David S.; Ordway, David; Johnson, Kenneth
2013-01-01
This experimental study seeks to quantify the impact various composite parameters have on the structural response of a composite structure in a pyroshock environment. The prediction of an aerospace structure's response to pyroshock induced loading is largely dependent on empirical databases created from collections of development and flight test data. While there is significant structural response data due to pyroshock induced loading for metallic structures, there is much less data available for composite structures. One challenge of developing a composite pyroshock response database as well as empirical prediction methods for composite structures is the large number of parameters associated with composite materials. This experimental study uses data from a test series planned using design of experiments (DOE) methods. Statistical analysis methods are then used to identify which composite material parameters most greatly influence a flat composite panel's structural response to pyroshock induced loading. The parameters considered are panel thickness, type of ply, ply orientation, and pyroshock level induced into the panel. The results of this test will aid in future large scale testing by eliminating insignificant parameters as well as aid in the development of empirical scaling methods for composite structures' response to pyroshock induced loading.
Search strategies to identify information on adverse effects: a systematic review
Golder, Su; Loke, Yoon
2009-01-01
Objectives: The review evaluated studies of electronic database search strategies designed to retrieve adverse effects data for systematic reviews. Methods: Studies of adverse effects were located in ten databases as well as by checking references, hand-searching, searching citations, and contacting experts. Two reviewers screened the retrieved records for potentially relevant papers. Results: Five thousand three hundred thirteen citations were retrieved, yielding 19 studies designed to develop or evaluate adverse effect filters, of which 3 met the inclusion criteria. All 3 studies identified highly sensitive search strategies capable of retrieving over 95% of relevant records. However, 1 study did not evaluate precision, while the level of precision in the other 2 studies ranged from 0.8% to 2.8%. Methodological issues in these papers included the relatively small number of records, absence of a validation set of records for testing, and limited evaluation of precision. Conclusions: The results indicate the difficulty of achieving highly sensitive searches for information on adverse effects with a reasonable level of precision. Researchers who intend to locate studies on adverse effects should allow for the amount of resources and time required to conduct a highly sensitive search. PMID:19404498
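The two metrics the review turns on are simple ratios, and with counts of the magnitude it reports (the figures below are hypothetical) a highly sensitive filter can still be very imprecise:

```python
def sensitivity(relevant_retrieved, relevant_total):
    """Share of all relevant records the search strategy finds."""
    return relevant_retrieved / relevant_total

def precision(relevant_retrieved, records_retrieved):
    """Share of retrieved records that are actually relevant."""
    return relevant_retrieved / records_retrieved

# 96 of 100 relevant records found, but buried among 4,000 retrieved:
print(sensitivity(96, 100), precision(96, 4000))  # prints 0.96 0.024
```

A precision of 2.4% sits inside the 0.8-2.8% range the review observed, illustrating why a >95%-sensitive adverse-effects search still demands substantial screening time.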
The landslide database for Germany: Closing the gap at national level
NASA Astrophysics Data System (ADS)
Damm, Bodo; Klose, Martin
2015-11-01
The Federal Republic of Germany has long been among the few European countries that lack a national landslide database. Systematic collection and inventory of landslide data still has a long research history in Germany, but one focussed on the development of databases with local or regional coverage. This has changed in recent years with the launch of a database initiative aimed at closing the data gap existing at national level. The present paper reports on this project that is based on a landslide database which evolved over the last 15 years to a database covering large parts of Germany. A strategy of systematic retrieval, extraction, and fusion of landslide data is at the heart of the methodology, providing the basis for a database with a broad potential of application. The database offers a data pool of more than 4,200 landslide data sets with over 13,000 single data files and dates back to the 12th century. All types of landslides are covered by the database, which stores not only core attributes, but also various complementary data, including data on landslide causes, impacts, and mitigation. The current database migration to PostgreSQL/PostGIS is focused on unlocking the full scientific potential of the database, while enabling data sharing and knowledge transfer via a web GIS platform. In this paper, the goals and the research strategy of the database project are highlighted at first, with a summary of best practices in database development providing perspective. Next, the focus is on key aspects of the methodology, which is followed by the results of three case studies in the German Central Uplands. The case study results exemplify database application in the analysis of landslide frequency and causes, impact statistics, and landslide susceptibility modeling. Using the example of these case studies, strengths and weaknesses of the database are discussed in detail. 
The paper concludes with a summary of the database project with regard to previous achievements and the strategic roadmap.
PseudoBase: a database with RNA pseudoknots.
van Batenburg, F H; Gultyaev, A P; Pleij, C W; Ng, J; Oliehoek, J
2000-01-01
PseudoBase is a database containing structural, functional and sequence data related to RNA pseudoknots. It can be reached at http://wwwbio.LeidenUniv.nl/~Batenburg/PKB.html. This page will direct the user to a retrieval page from which a particular pseudoknot can be chosen, to a submission page that enables the user to add pseudoknot information to the database, or to an informative page that elaborates on the various aspects of the database. For each pseudoknot, 12 items are stored, e.g. the nucleotides of the region that contains the pseudoknot, the stem positions of the pseudoknot, the EMBL accession number of the sequence that contains the pseudoknot, and the support that can be given regarding the reliability of the pseudoknot. Access is via a small number of steps, using 16 different categories. The development process was done by applying the evolutionary methodology for software development, rather than the methodology of the classical waterfall model or the more modern spiral model.
Interactive Multi-Instrument Database of Solar Flares
NASA Technical Reports Server (NTRS)
Ranjan, Shubha S.; Spaulding, Ryan; Deardorff, Donald G.
2018-01-01
The fundamental motivation of the project is that the scientific output of solar research can be greatly enhanced by better exploitation of the existing solar/heliosphere space-data products jointly with ground-based observations. Our primary focus is on developing a specific innovative methodology based on recent advances in "big data" intelligent databases applied to the growing amount of high-spatial and multi-wavelength resolution, high-cadence data from NASA's missions and supporting ground-based observatories. Our flare database is not simply a manually searchable time-based catalog of events or list of web links pointing to data. It is a preprocessed metadata repository enabling fast search and automatic identification of all recorded flares sharing a specifiable set of characteristics, features, and parameters. The result is a new and unique database of solar flares and data search and classification tools for the Heliophysics community, enabling multi-instrument/multi-wavelength investigations of flare physics and supporting further development of flare-prediction methodologies.
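The difference between a manually searchable catalog and a metadata repository of this kind is that the latter answers parameter queries directly. A toy sketch of such a query (the schema, field names, and flare entries are invented for illustration):

```python
def goes_key(cls):
    """Orderable key for a GOES class string such as 'M5.2' or 'X1.0'."""
    return ("ABCMX".index(cls[0]), float(cls[1:]))

flares = [
    {"class": "C3.4", "instruments": {"AIA", "RHESSI"}},
    {"class": "M5.2", "instruments": {"AIA"}},
    {"class": "X1.0", "instruments": {"AIA", "IRIS", "RHESSI"}},
]

def find(catalog, min_class, required):
    """All flares at or above min_class observed by every required instrument."""
    return [f for f in catalog
            if goes_key(f["class"]) >= goes_key(min_class)
            and required <= f["instruments"]]

# Flares of at least class M1.0 with joint AIA and RHESSI coverage:
print([f["class"] for f in find(flares, "M1.0", {"AIA", "RHESSI"})])
```

Precomputing such metadata for every recorded flare is what enables the automatic identification of all events sharing a specifiable set of characteristics, rather than a link list to raw data.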
The effectiveness of Pilates exercise in people with chronic low back pain: a systematic review.
Wells, Cherie; Kolt, Gregory S; Marshall, Paul; Hill, Bridget; Bialocerkowski, Andrea
2014-01-01
To evaluate the effectiveness of Pilates exercise in people with chronic low back pain (CLBP) through a systematic review of randomised controlled trials (RCTs). A search for RCTs was undertaken using Medical Subject Headings and synonyms for "Pilates" and "low back pain" within the maximal date range of 10 databases. Databases included the Cumulative Index to Nursing and Allied Health Literature; Cochrane Library; Medline; Physiotherapy Evidence Database; ProQuest: Health and Medical Complete, Nursing and Allied Health Source, Dissertation and Theses; Scopus; Sport Discus; and Web of Science. Two independent reviewers were involved in the selection of evidence. To be included, relevant RCTs needed to be published in the English language. From 152 studies, 14 RCTs were included. Two independent reviewers appraised the methodological quality of RCTs using the McMaster Critical Review Form for Quantitative Studies. The author(s), year of publication, and details regarding participants, Pilates exercise, comparison treatments, outcome measures and findings were then extracted. The methodological quality of RCTs ranged from "poor" to "excellent". A meta-analysis of RCTs was not undertaken due to the heterogeneity of RCTs. Pilates exercise provided statistically significant improvements in pain and functional ability compared to usual care and physical activity between 4 and 15 weeks, but not at 24 weeks. There were no consistent statistically significant differences in improvements in pain and functional ability with Pilates exercise, massage therapy, or other forms of exercise at any time period. Pilates exercise offers greater improvements in pain and functional ability compared to usual care and physical activity in the short term. Pilates exercise offers equivalent improvements to massage therapy and other forms of exercise.
Future research should explore optimal Pilates exercise designs, and whether some people with CLBP may benefit from Pilates exercise more than others.
Physiology-based face recognition in the thermal infrared spectrum.
Buddharaju, Pradeep; Pavlidis, Ioannis T; Tsiamyrtzis, Panagiotis; Bazakos, Mike
2007-04-01
The current dominant approaches to face recognition rely on facial characteristics that are on or over the skin. Some of these characteristics have low permanency and can be altered, and their phenomenology varies significantly with environmental factors (e.g., lighting). Many methodologies have been developed to address these problems to various degrees. However, the current framework of face recognition research has a potential weakness due to its very nature. We present a novel framework for face recognition based on physiological information. The motivation behind this effort is to capitalize on the permanency of innate characteristics that are under the skin. To establish feasibility, we propose a specific methodology to capture facial physiological patterns using the bioheat information contained in thermal imagery. First, the algorithm delineates the human face from the background using the Bayesian framework. Then, it localizes the superficial blood vessel network using image morphology. The extracted vascular network produces contour shapes that are characteristic of each individual. The branching points of the skeletonized vascular network are referred to as Thermal Minutia Points (TMPs) and constitute the feature database. To render the method robust to facial pose variations, we collect five different pose images for each subject stored in the database (center, midleft profile, left profile, midright profile, and right profile). During the classification stage, the algorithm first estimates the pose of the test image. Then, it matches the local and global TMP structures extracted from the test image with those of the corresponding pose images in the database. We have conducted experiments on a multipose database of thermal facial images collected in our laboratory, as well as on the time-gap database of the University of Notre Dame.
The good experimental results show that the proposed methodology has merit, especially with respect to the problem of low permanence over time. More importantly, the results demonstrate the feasibility of the physiological framework in face recognition and open the way for further methodological and experimental research in the area.
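As a rough illustration of the TMP idea (our own sketch, not the authors' code), branching points of a one-pixel-wide binary skeleton can be located as skeleton pixels with three or more set pixels in their 8-neighbourhood:

```python
import numpy as np

def branch_points(skel):
    """Return (row, col) coordinates of skeleton pixels with >= 3 neighbours."""
    s = np.pad(skel.astype(int), 1)          # zero-pad so edges need no special case
    # Count 8-connected neighbours for every pixel via shifted copies.
    nbrs = sum(np.roll(np.roll(s, dr, 0), dc, 1)
               for dr in (-1, 0, 1) for dc in (-1, 0, 1)
               if (dr, dc) != (0, 0))
    mask = (s == 1) & (nbrs >= 3)
    return [(r - 1, c - 1) for r, c in zip(*np.nonzero(mask))]  # unpad coordinates

# A tiny synthetic "Y"-shaped skeleton: the junction pixel is the branch point.
skel = np.zeros((5, 5), int)
skel[0, 1] = skel[0, 3] = 1                  # two arm tips
skel[1, 2] = 1                               # junction where the arms meet
skel[2, 2] = skel[3, 2] = skel[4, 2] = 1     # vertical stem
```

On the toy skeleton above, only the junction at (1, 2) qualifies; a real pipeline would apply this to the morphologically skeletonized vascular network.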
de Groot, Mark C H; Schuerch, Markus; de Vries, Frank; Hesse, Ulrik; Oliva, Belén; Gil, Miguel; Huerta, Consuelo; Requena, Gema; de Abajo, Francisco; Afonso, Ana S; Souverein, Patrick C; Alvarez, Yolanda; Slattery, Jim; Rottenkolber, Marietta; Schmiedl, Sven; Van Dijk, Liset; Schlienger, Raymond G; Reynolds, Robert; Klungel, Olaf H
2014-05-01
The annual prevalence of antiepileptic drug (AED) prescribing reported in the literature differs considerably among European countries due to the use of different types of data sources, time periods, population distributions, and methodologic differences. This study aimed to measure the prevalence of AED prescribing across seven European routine health care databases in Spain, Denmark, The Netherlands, the United Kingdom, and Germany using a standardized methodology, and to investigate sources of variation. Analyses of the annual prevalence of AEDs were stratified by sex, age, and AED. Overall prevalences were standardized to the European 2008 reference population. Prevalence of any AED varied from 88 per 10,000 persons (The Netherlands) to 144 per 10,000 in Spain and Denmark in 2001. In all databases, prevalence increased linearly each year since 2001: from 6% in Denmark to 15% in Spain. This increase could be attributed entirely to an increase in "new," recently marketed AEDs, while the prevalence of AEDs that have been available since the mid-1990s hardly changed. AED use increased with age for both female and male patients up to the ages of 80 to 89 years and tended to be somewhat higher in female than in male patients between the ages of 40 and 70. No differences between databases in the number of AEDs used simultaneously by a patient were found. We showed that during the study period of 2001-2009, AED prescribing increased in five European Union (EU) countries and that this increase was due entirely to the newer AEDs marketed since the 1990s. Using a standardized methodology, we showed consistent trends across databases and countries over time. Differences in age and sex distribution explained only part of the variation between countries. Therefore, the remaining variation in AED use must originate from other differences in national health care systems. Wiley Periodicals, Inc. © 2014 International League Against Epilepsy.
System Dynamics Aviation Readiness Modeling Demonstration
2005-08-31
requirements. It is recommended that the Naval Aviation Enterprise take a close look at the requirements i.e., performance measures, methodology ...unit’s capability to perform specific Joint Mission Essential Task List (JMETL) requirements now and in the future. This assessment methodology must...the time-associated costs. The new methodology must base decisions on currently available data and databases. A “useful” readiness model should be
Biomechanics of fencing sport: A scoping review
Chen, Tony Lin-Wei; Wong, Duo Wai-Chi; Wang, Yan; Ren, Sicong; Yan, Fei
2017-01-01
Objectives: The aim of our scoping review was to identify and summarize current evidence on the biomechanics of fencing to inform athlete development and injury prevention. Design: Scoping review. Method: Peer-reviewed research was identified from electronic databases using a structured keyword search. Details regarding experimental design, study group characteristics and measured outcomes were extracted from retrieved studies, summarized, and the information regrouped under themes for analysis. The methodological quality of the evidence was evaluated. Results: Thirty-seven peer-reviewed studies were retrieved, the majority being observational studies conducted with experienced and elite athletes. The methodological quality of the evidence was “fair” due to the limited scope of research. Male fencers were the prevalent group studied, with the lunge and use of a foil weapon being the principal movement evaluated. Motion capture and pedobarography were the most frequently used data collection techniques. Conclusions: Elite fencers exhibited sequential coordination of upper and lower limb movements with coherent patterns of muscle activation, compared to novice fencers. These elite features of neuromuscular coordination resulted in higher magnitudes of forward linear velocity of the body center of mass and weapon. Training should focus on explosive power. Sex- and equipment-specific effects could not be evaluated based on available research. PMID:28187164
NASA Astrophysics Data System (ADS)
Vorndran, Shelby; Russo, Juan; Zhang, Deming; Gordon, Michael; Kostuk, Raymond
2012-10-01
In this work, a concentrating photovoltaic (CPV) design methodology is proposed that aims to maximize system efficiency for a given irradiance condition. In this technique, the acceptance angle of the system is radiometrically matched to the angular spread of the site's average irradiance conditions using a simple geometric ratio. The optical efficiency of CPV systems from flat-plate to high-concentration is plotted at all irradiance conditions. Concentrator systems are measured outdoors in various irradiance conditions to test the methodology. This modeling technique is valuable at the design stage for determining the ideal level of concentration for a CPV module. It requires only two inputs: the acceptance angle profile of the system and the site's average direct and diffuse irradiance fractions. The acceptance angle can be determined by ray tracing or by testing a fabricated prototype in the lab with a solar simulator. The average irradiance conditions can be found in the Typical Meteorological Year (TMY3) database. Additionally, the information gained from this technique can be used to determine tracking tolerance, quantify power loss during an isolated weather event, and perform more sophisticated analyses such as I-V curve simulation.
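The radiometric matching idea can be illustrated with a back-of-the-envelope calculation. This is a hedged sketch of our own: the sin²θ diffuse-collection approximation for an ideal concentrator (the étendue limit, 1/C for concentration C) and the example irradiance split are assumptions, not values from the paper.

```python
import math

def collection_fraction(half_acceptance_deg, f_direct, f_diffuse):
    """Estimate the fraction of global irradiance a concentrator collects.

    Direct light is assumed fully collected (sun within the acceptance cone);
    isotropic diffuse light is collected in proportion to sin^2(theta), the
    etendue-limited fraction for an ideal concentrator of half-acceptance theta.
    """
    theta = math.radians(half_acceptance_deg)
    return f_direct + f_diffuse * math.sin(theta) ** 2

# TMY3-style average irradiance split for a sunny site (assumed values):
f_direct, f_diffuse = 0.75, 0.25

flat_plate = collection_fraction(90.0, f_direct, f_diffuse)  # collects everything
low_cpv    = collection_fraction(20.0, f_direct, f_diffuse)
high_cpv   = collection_fraction(1.0,  f_direct, f_diffuse)
```

Comparing such collection fractions against the concentration-dependent cell efficiency gain is the kind of trade-off the proposed geometric-ratio methodology resolves.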
Developing methods for systematic reviewing in health services delivery and organisation
Alborz, Alison; McNally, Rosalind
2007-01-01
Objectives: To develop methods to facilitate the ‘systematic’ review of evidence from a range of methodologies on diffuse or ‘soft’ topics, as exemplified by ‘access to healthcare’. Data sources: 28 bibliographic databases, research registers, organisational websites or library catalogues. Reference lists from identified studies. Contact with experts and service users. Current awareness and contents-alerting services in the area of learning disabilities. Review methods: Inclusion criteria were English-language literature from 1980 onwards, relating to people with learning disabilities of any age, and all study designs. The main criterion for assessment was relevance to Gulliford’s model of access to health care, which was adapted to the circumstances of people with learning disabilities. Selected studies were evaluated for scientific rigour, then data were extracted and the results synthesised. Quality assessment was by an initial set of ‘generic’ quality indicators. This enabled further evidence selection before evaluation of findings according to specific criteria for qualitative, quantitative or mixed-method studies. Results: 82 studies were fully evaluated. Five studies were rated ‘highly rigorous’, 22 ‘rigorous’ and 46 ‘less rigorous’; 9 ‘poor’ papers were retained as the sole evidence covering aspects of the guiding model. The majority of studies were quantitative but used only descriptive statistics. Most evidence lacked methodological detail, which often lowered final quality ratings. Conclusions: The application of a consistent structure to quality evaluation can facilitate data appraisal, extraction and synthesis across a range of methodologies in diffuse or ‘soft’ topics. Synthesis can be facilitated further by using software, such as the Microsoft ‘Access’ database, for managing information. PMID:15606880
Recruitment for exercise or physical activity interventions: a protocol for systematic review.
Hoover, Jeffrey C; Alenazi, Aqeel M; Alothman, Shaima; Alshehri, Mohammed M; Rucker, Jason; Kluding, Patricia
2018-03-27
Recruiting participants into research trials is essential for the advancement of scientific knowledge that depends on clinical research studies. In the field of exercise and physical activity, there is an added difficulty in recruiting participants, because participants must be willing to take part in an intervention that requires a significant commitment of both time and physical effort. Therefore, we have planned a systematic review to analyse how methodological factors, intervention characteristics and participant demographics impact recruitment rates in specific populations. This information will help researchers improve the design and recruitment approach in future studies. A mixed-methods systematic review will be performed on studies that implement physical activity interventions and present data on participant recruitment. We plan to search the PubMed, Cumulative Index to Nursing and Allied Health Literature and Online Resource for Recruitment research in Clinical Trials databases for potentially eligible articles from database inception through 10 February 2017. A standardised approach will be used to identify studies through a review of titles, abstracts and reference lists. For each potentially eligible study, the process is to determine its eligibility, extract data and rate its methodological quality. Exploratory multivariate regression models will be used to determine the effects of methodological factors, intervention characteristics and participant demographics on the recruitment variables of interest. Because all of the data used in this systematic review have been published, this review does not require ethical approval. The results of this review will be disseminated through peer-reviewed publication as well as through conference presentations. CRD42017057284. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved.
No commercial use is permitted unless otherwise expressly granted.
Alborz, Alison; McNally, Rosalind
2004-12-01
To develop methods to facilitate the 'systematic' review of evidence from a range of methodologies on diffuse or 'soft' topics, as exemplified by 'access to health care'. Twenty-eight bibliographic databases, research registers, organizational websites or library catalogues. Reference lists from identified studies. Contact with experts and service users. Current awareness and contents-alerting services in the area of learning disabilities. Inclusion criteria were English-language literature from 1980 onwards, relating to people with learning disabilities of any age, and all study designs. The main criterion for assessment was relevance to Gulliford's model of access to health care, which was adapted to the circumstances of people with learning disabilities. Selected studies were evaluated for scientific rigour, then data were extracted and the results synthesized. Quality assessment was by an initial set of 'generic' quality indicators. This enabled further evidence selection before evaluation of findings according to specific criteria for qualitative, quantitative or mixed-method studies. Eighty-two studies were fully evaluated. Five studies were rated 'highly rigorous', 22 'rigorous' and 46 'less rigorous'; nine 'poor' papers were retained as the sole evidence covering aspects of the guiding model. The majority of studies were quantitative but used only descriptive statistics. Most evidence lacked methodological detail, which often lowered final quality ratings. The application of a consistent structure to quality evaluation can facilitate data appraisal, extraction and synthesis across a range of methodologies in diffuse or 'soft' topics. Synthesis can be facilitated further by using software, such as the Microsoft 'Access' database, for managing information.
NASA Astrophysics Data System (ADS)
Naldesi, Luciano; Buttol, Patrizia; Masoni, Paolo; Misceo, Monica; Sára, Balázs
2004-12-01
"eLCA" is a European Commission financed project aimed at realising "On line green tools and services for Small and Medium-sized Enterprises (SMEs)". Knowledge and use of Life Cycle Assessment (LCA) by SMEs are strategic to introduce the Integrated Product Policy (IPP) in Europe, but methodology simplification is needed. LCA requires a large amount of validated general and sector specific data. Since their availability and cost can be insuperable barriers for SMEs, pre-elaborated data/meta-data, use of standards and low cost solutions are required. Within the framework of the eLCA project an LCA software - eVerdEE - based on a simplified methodology and specialised for SMEs has been developed. eVerdEE is a web-based tool with some innovative features. Its main feature is the adaptation of ISO 14040 requirements to offer easy-to-handle functions with solid scientific bases. Complex methodological problems, such as the system boundaries definition, the data quality estimation and documentation, the choice of impact categories, are simplified according to the SMEs" needs. Predefined "Goal and Scope definition" and "Inventory" forms, a user-friendly and well structured procedure are time and cost-effective. The tool is supported by a database containing pre-elaborated environmental indicators of substances and processes for different impact categories. The impact assessment is calculated automatically by using the user"s input and the database values. The results have different levels of interpretation in order to identify the life cycle critical points and the improvement options. The use of a target plot allows the direct comparison of different design alternatives.
Vachon, Hugo; Rintala, Aki; Viechtbauer, Wolfgang; Myin-Germeys, Inez
2018-01-18
Due to a number of methodological advantages and theoretical considerations, more and more studies in clinical psychology research employ the Experience Sampling Method (ESM) as a data collection technique. Despite this growing interest, the absence of methodological guidelines related to the use of ESM has resulted in a large heterogeneity of designs while the potential effects of the design itself on the response behavior of the participants remain unknown. The objectives of this systematic review are to investigate the associations between the design characteristics and the data quality and feasibility of studies relying on ESM in severe psychiatric disorders. We will search for all published studies using ambulatory assessment with patients suffering from major depressive disorder, bipolar disorder, and psychotic disorder or individuals at high risk for these disorders. Electronic database searches will be performed in PubMed and Web of Science with no restriction on the publication date. Two reviewers will independently screen original studies in a title/abstract phase and a full-text phase based on the inclusion criteria. The information related to the design and sample characteristics, data quality, and feasibility will be extracted. We will provide results in terms of a descriptive synthesis, and when applicable, a meta-analysis of the findings will be conducted. Our results will attempt to highlight how the feasibility and data quality of ambulatory assessment might be related to the methodological characteristics of the study designs in severe psychiatric disorders. We will discuss these associations in different subsamples if sufficient data are available and will examine limitations in the reporting of the methods of ambulatory studies in the current literature. 
The protocol for this systematic review was registered on PROSPERO (PROSPERO 2017: CRD42017060322) and is available in full on the University of York website (http://www.crd.york.ac.uk/PROSPERO/display_record.asp?ID=CRD42017060322).
Costa, Caroline A D; Tonial, Cristian T; Garcia, Pedro Celiny R
2016-01-01
To systematically review the evidence about the impact of nutritional status in critically ill pediatric patients on the following outcomes during hospitalization in pediatric intensive care units: length of hospital stay, need for mechanical ventilation, and mortality. The search was carried out in the following databases: Lilacs (Latin American and Caribbean Health Sciences), MEDLINE (National Library of Medicine, United States) and Embase (Elsevier database). No filters were selected. A total of seven relevant articles on the subject were included. The publication period was between 1982 and 2012. All articles assessed the nutritional status of patients on admission to pediatric intensive care units and correlated it with at least one assessed outcome. A methodological quality questionnaire created by the authors was applied, based on published references and the researchers' experience. All included studies met the quality criteria, but only four met all the items. The studies included in this review suggest that nutritional depletion is associated with worse outcomes in pediatric intensive care units. However, studies are scarce and the existing ones show no methodological homogeneity, especially regarding nutritional status assessment and classification methods. Contemporary and well-designed studies are needed in order to properly assess the association between children's nutritional status and its impact on the outcomes of these patients. Copyright © 2016 Sociedade Brasileira de Pediatria. Published by Elsevier Editora Ltda. All rights reserved.
Nichols, A W
2008-11-01
To identify sports medicine-related clinical trial research articles in the PubMed MEDLINE database published between 1996 and 2005 and conduct a review and analysis of topics of research, experimental designs, journals of publication and the internationality of authorships. Sports medicine research is international in scope with improving study methodology and an evolution of topics. Structured review of articles identified in a search of a large electronic medical database. PubMed MEDLINE database. Sports medicine-related clinical research trials published between 1996 and 2005. Review and analysis of articles that meet inclusion criteria. Articles were examined for study topics, research methods, experimental subject characteristics, journal of publication, lead authors and journal countries of origin and language of publication. The search retrieved 414 articles, of which 379 (345 English language and 34 non-English language) met the inclusion criteria. The number of publications increased steadily during the study period. Randomised clinical trials were the most common study type and the "diagnosis, management and treatment of sports-related injuries and conditions" was the most popular study topic. The knee, ankle/foot and shoulder were the most frequent anatomical sites of study. Soccer players and runners were the favourite study subjects. The American Journal of Sports Medicine had the highest number of publications and shared the greatest international diversity of authorships with the British Journal of Sports Medicine. The USA, Australia, Germany and the UK produced a good number of the lead authorships. In all, 91% of articles and 88% of journals were published in English. Sports medicine-related research is internationally diverse, clinical trial publications are increasing and the sophistication of research design may be improving.
REFOLDdb: a new and sustainable gateway to experimental protocols for protein refolding.
Mizutani, Hisashi; Sugawara, Hideaki; Buckle, Ashley M; Sangawa, Takeshi; Miyazono, Ken-Ichi; Ohtsuka, Jun; Nagata, Koji; Shojima, Tomoki; Nosaki, Shohei; Xu, Yuqun; Wang, Delong; Hu, Xiao; Tanokura, Masaru; Yura, Kei
2017-04-24
More than 7000 papers related to "protein refolding" have been published to date, with approximately 300 reports each year during the last decade. Whilst some of these papers provide experimental protocols for protein refolding, a survey in the structural life science communities showed the necessity for a comprehensive database of refolding techniques. We have therefore developed a new resource - "REFOLDdb" - that collects refolding techniques into a single, searchable repository to help researchers develop refolding protocols for proteins of interest. We based our resource on the existing REFOLD database, which has not been updated since 2009. We redesigned the data format to be more concise, allowing more consistent representations among data entries compared with the original REFOLD database. The remodeled data architecture enhances search efficiency and improves the sustainability of the database. After an exhaustive literature search we added experimental refolding protocols from reports published from 2009 to early 2017. In addition to this new data, we fully converted and integrated the existing REFOLD data into our new resource. REFOLDdb contains 1877 entries as of March 17, 2017, and is freely available at http://p4d-info.nig.ac.jp/refolddb/. REFOLDdb is a unique database for the life sciences research community, providing annotated information for designing new refolding protocols and customizing existing methodologies. We envisage that this resource will find wide utility across the broad disciplines that rely on the production of pure, active, recombinant proteins. Furthermore, the database also provides a useful overview of recent trends and statistics in refolding technology development.
A spatial database for landslides in northern Bavaria: A methodological approach
NASA Astrophysics Data System (ADS)
Jäger, Daniel; Kreuzer, Thomas; Wilde, Martina; Bemm, Stefan; Terhorst, Birgit
2018-04-01
Landslide databases provide essential information for hazard modeling, damage to buildings and infrastructure, mitigation, and research needs. This study presents the development of a landslide database system named WISL (Würzburg Information System on Landslides), currently storing detailed landslide data for northern Bavaria, Germany, in order to enable scientific queries as well as comparisons with other regional landslide inventories. WISL is based on free open-source software (PostgreSQL, PostGIS), ensuring good interoperability of the various software components and enabling further extensions with specific adaptations of self-developed software. WISL was also designed to communicate easily with other databases. As a central prerequisite for standardized, homogeneous data acquisition in the field, a customized data sheet for landslide description was compiled. This sheet also serves as an input mask for all data registration procedures in WISL. A variety of "in-database" solutions for landslide analysis provides the necessary scalability for the database, enabling operations at the local server. In its current state, WISL already enables extensive analysis and queries. This paper presents an example analysis of landslides in Oxfordian limestones in the northeastern Franconian Alb, northern Bavaria. The results reveal widely differing landslides in terms of geometry and size. Further queries related to landslide activity classify the majority of the landslides as currently inactive; however, they clearly possess a certain potential for remobilization. Along with some active mass movements, a significant percentage of landslides potentially endangers residential areas or infrastructure. The main aspect of future enhancements of the WISL database is the extension of its data holdings in order to increase research possibilities, as well as the transfer of the system to other regions and countries.
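To illustrate the kind of "in-database" PostGIS query such a system supports, a parameterized statement selecting landslides of one lithological unit within a map extent might look like the following. The table and column names are our own assumptions, not the actual WISL schema.

```python
# Hypothetical PostGIS query: landslide polygons of one lithological unit
# inside a bounding box (EPSG:25832, UTM zone 32N, commonly used in Bavaria).
LANDSLIDES_IN_UNIT = """
SELECT id, activity_state, ST_Area(geom) AS area_m2
FROM landslides
WHERE lithology = %s
  AND ST_Intersects(geom, ST_MakeEnvelope(%s, %s, %s, %s, 25832))
ORDER BY area_m2 DESC;
"""

# Illustrative parameters: unit name plus bounding-box corner coordinates.
params = ("Oxfordian limestone", 560000, 5480000, 620000, 5540000)
```

With a driver such as psycopg2, `cursor.execute(LANDSLIDES_IN_UNIT, params)` would run the geometry computation server-side, keeping the analysis in the database as described above.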
NASA Technical Reports Server (NTRS)
Wang, Yi; Pant, Kapil; Brenner, Martin J.; Ouellette, Jeffrey A.
2018-01-01
This paper presents a data analysis and modeling framework to tailor and develop a linear parameter-varying (LPV) aeroservoelastic (ASE) model database for flexible aircraft in a broad 2D flight parameter space. The Kriging surrogate model is constructed using ASE models at a fraction of the grid points within the original model database, and the ASE model at any flight condition can then be obtained simply through surrogate model interpolation. A greedy sampling algorithm is developed to select as the next sample point the one that carries the worst relative error between the surrogate model prediction and the benchmark model in the frequency domain among all input-output channels. The process is iterated to incrementally improve surrogate model accuracy until a pre-determined tolerance or iteration budget is met. The methodology is applied to the ASE model database of a flexible aircraft currently being tested at NASA/AFRC for flutter suppression and gust load alleviation. Our studies indicate that the proposed method can reduce the number of models in the original database by 67%. Even so, the ASE models obtained through Kriging interpolation match those in the original database constructed directly from the physics-based tool, with the worst relative error far below 1%. The interpolated ASE model exhibits continuously varying gains along a set of prescribed flight conditions. More importantly, the selected grid points are distributed non-uniformly in the parameter space, a) capturing the distinctly different dynamic behavior and its dependence on flight parameters, and b) reiterating the need and utility of adaptive space sampling techniques for ASE model database compaction. The present framework is directly extendible to high-dimensional flight parameter spaces, and can be used to guide ASE model development, model order reduction, robust control synthesis and novel vehicle design for flexible aircraft.
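The greedy sampling loop described above can be sketched on a toy problem. Everything here is our own construction: a scalar test function stands in for the frequency-domain ASE responses, and a simple inverse-distance-weighted interpolant stands in for the Kriging surrogate.

```python
import numpy as np

def benchmark(p):
    """Stand-in for evaluating the full model database at a flight condition."""
    return np.sin(3 * p[0]) + p[1] ** 2

def surrogate(samples, values, p, eps=1e-12):
    """Cheap interpolant standing in for the Kriging surrogate."""
    d = np.linalg.norm(samples - p, axis=1)
    if d.min() < eps:                       # exactly at a sampled point
        return float(values[np.argmin(d)])
    w = 1.0 / d ** 2                        # inverse-distance weights
    return float(w @ values / w.sum())

# Candidate grid of flight conditions in a normalized 2-D parameter space.
grid = np.array([[x, y] for x in np.linspace(0, 1, 9)
                        for y in np.linspace(0, 1, 9)])
truth = np.array([benchmark(p) for p in grid])

picked = [0, 8, 72, 80]                     # start from the four corner points
for _ in range(40):                         # iteration budget
    S, V = grid[picked], truth[picked]
    rel_err = np.array([abs(surrogate(S, V, p) - t) / (abs(t) + 1e-6)
                        for p, t in zip(grid, truth)])
    worst = int(rel_err.argmax())
    if rel_err[worst] < 0.01:               # pre-determined tolerance
        break
    picked.append(worst)                    # greedily sample the worst point
```

Because sampled points reproduce the benchmark exactly, each iteration targets the currently worst-approximated flight condition, mirroring the adaptive, non-uniform sampling the paper reports.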
A systematic review and metaanalysis of energy intake and weight gain in pregnancy.
Jebeile, Hiba; Mijatovic, Jovana; Louie, Jimmy Chun Yu; Prvan, Tania; Brand-Miller, Jennie C
2016-04-01
Gestational weight gain within the recommended range produces optimal pregnancy outcomes, yet many women exceed the guidelines. Official recommendations to increase energy intake by ∼1000 kJ/day in pregnancy may be excessive. To determine, by metaanalysis of relevant studies, whether greater increments in energy intake from early to late pregnancy corresponded to greater or excessive gestational weight gain. We systematically searched electronic databases for observational and intervention studies published from 1990 to the present. The databases included Ovid Medline, Cochrane Library, Excerpta Medica DataBASE (EMBASE), Cumulative Index to Nursing and Allied Health Literature (CINAHL), and Science Direct. In addition, we hand-searched the reference lists of all identified articles. Studies were included if they reported gestational weight gain and energy intake in early and late gestation in women of any age with a singleton pregnancy. The search also encompassed journals from both developed and developing countries. Studies were individually assessed for quality based on the Quality Criteria Checklist from the Evidence Analysis Manual: Steps in the Academy Evidence Analysis Process. Publication bias was assessed by the use of a funnel plot of standardized mean difference against standard error. Identified studies were meta-analyzed and stratified by body mass index, study design, dietary methodology, and country status (developed/developing) by the use of a random-effects model. Of 2487 articles screened, 18 studies met the inclusion criteria. On average, women gained 12.0 (2.8) kg (standardized mean difference = 1.306, P < .0005) yet reported only a small increment in energy intake that did not reach statistical significance (∼475 kJ/day, standardized mean difference = 0.266, P = .016).
Irrespective of baseline body mass index, study design, dietary methodology, or country status, changes in energy intake were not significantly correlated with the amount of gestational weight gain (r = 0.321, P = .11). Despite rapid physiologic weight gain, women report little or no change in energy intake during pregnancy. Current recommendations to increase energy intake by ∼1000 kJ/day may therefore encourage excessive weight gain and adverse pregnancy outcomes. Copyright © 2016 Elsevier Inc. All rights reserved.
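The stratified random-effects pooling described in this abstract can be sketched with the standard DerSimonian-Laird estimator. This is a minimal illustration, not the review's actual code; the three effect sizes and standard errors passed in at the end are invented:

```python
import math

def random_effects_pool(effects, ses):
    """Pool standardized mean differences under a DerSimonian-Laird
    random-effects model: estimate the between-study variance (tau^2)
    from Cochran's Q, then reweight each study by 1 / (se^2 + tau^2)."""
    w = [1.0 / se ** 2 for se in ses]                    # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)        # between-study variance
    w_re = [1.0 / (se ** 2 + tau2) for se in ses]        # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se_pooled = math.sqrt(1.0 / sum(w_re))
    return pooled, se_pooled

# Three hypothetical studies: (SMD, standard error) pairs
pooled, se_pooled = random_effects_pool([0.2, 0.5, 0.8], [0.1, 0.15, 0.2])
```

The pooled estimate lands between the individual study effects, pulled toward the more precise studies; a funnel plot like the one described would simply scatter each study's SMD against its standard error.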
Saltaji, Humam; Armijo-Olivo, Susan; Cummings, Greta G; Amin, Maryam; Flores-Mir, Carlos
2014-02-25
It is fundamental that randomised controlled trials (RCTs) are properly conducted in order to reach well-supported conclusions. However, there is emerging evidence that RCTs are subject to biases which can overestimate or underestimate the true treatment effect, due to flaws in the study design characteristics of such trials. The extent to which this holds true in oral health RCTs, which have some unique design characteristics compared to RCTs in other health fields, is unclear. As such, we aim to examine the empirical evidence quantifying the extent of bias associated with methodological and non-methodological characteristics in oral health RCTs. We plan to perform a meta-epidemiological study, where a sample size of 60 meta-analyses (MAs) including approximately 600 RCTs will be selected. The MAs will be randomly obtained from the Oral Health Database of Systematic Reviews using a random number table; and will be considered for inclusion if they include a minimum of five RCTs, and examine a therapeutic intervention related to one of the recognised dental specialties. RCTs identified in selected MAs will be subsequently included if their study design includes a comparison between an intervention group and a placebo group or another intervention group. Data will be extracted from selected trials included in MAs based on a number of methodological and non-methodological characteristics. Moreover, the risk of bias will be assessed using the Cochrane Risk of Bias tool. Effect size estimates and measures of variability for the main outcome will be extracted from each RCT included in selected MAs, and a two-level analysis will be conducted using a meta-meta-analytic approach with a random effects model to allow for intra-MA and inter-MA heterogeneity. The intended audiences of the findings will include dental clinicians, oral health researchers, policymakers and graduate students. 
These audiences will be introduced to the findings through workshops, seminars, round-table discussions and targeted individual meetings. Other opportunities for knowledge transfer will be pursued, such as key dental conferences. Finally, the results will be published as a scientific report in a dental peer-reviewed journal.
Carter, Alexander W; Mandavia, Rishi; Mayer, Erik; Marti, Joachim; Mossialos, Elias; Darzi, Ara
2017-08-18
Recent avoidable failures in patient care highlight the ongoing need for evidence to support improvements in patient safety. According to the most recent reviews, there is a dearth of economic evidence related to patient safety. These reviews characterise an evidence gap in terms of the scope and quality of evidence available to support resource allocation decisions. This protocol is designed to update and improve on the reviews previously conducted to determine the extent of methodological progress in economic analyses in patient safety. A broad search strategy with two core themes for original research (excluding opinion pieces and systematic reviews) in 'patient safety' and 'economic analyses' has been developed. Medline, Econlit and National Health Service Economic Evaluation Database bibliographic databases will be searched from January 2007 using a combination of medical subject headings terms and research-derived search terms (see table 1). The method is informed by previous reviews on this topic, published in 2012. Screening, risk of bias assessment (using the Cochrane collaboration tool) and economic evaluation quality assessment (using the Drummond checklist) will be conducted by two independent reviewers, with arbitration by a third reviewer as needed. Studies with a low risk of bias will be assessed using the Drummond checklist. High-quality economic evaluations are those that score >20/35. A qualitative synthesis of evidence will be performed using a data collection tool to capture the study design(s) employed, population(s), setting(s), disease area(s), intervention(s) and outcome(s) studied. Methodological quality scores will be compared with previous reviews where possible. Effect size(s) and estimate uncertainty will be captured and used in a quantitative synthesis of high-quality evidence, where possible. Formal ethical approval is not required as primary data will not be collected. 
The results will be disseminated through a peer-reviewed publication, presentations and social media. CRD42017057853. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Sermon, Jan; Geerts, Paul; Denee, Tom R.; De Vos, Cedric; Malfait, Bart; Lamotte, Mark; Mulder, Cornelis L.
2017-01-01
Achieving greater continuation of treatment is a key element in improving treatment outcomes in schizophrenia patients. However, reported treatment continuation can differ markedly depending on the study design. In a retrospective setting, treatment continuation remains poor overall among patients using antipsychotics. This study aimed to document the difference in treatment continuation between four long-acting injectable antipsychotics based on the QuintilesIMS LRx databases (national, longitudinal, panel-based prescription databases of retail pharmacies) in the Netherlands and Belgium. Paliperidone palmitate once monthly, risperidone microspheres, haloperidol decanoate, and olanzapine pamoate were studied. This study demonstrated significantly higher treatment continuation for paliperidone palmitate once monthly compared to risperidone microspheres (p < 0.01) and haloperidol decanoate (p < 0.01) in both countries, significantly higher treatment continuation for paliperidone palmitate once monthly compared to olanzapine pamoate in the Netherlands (p < 0.01), and a general trend towards better treatment continuation versus olanzapine pamoate in Belgium. Analysing the subgroup of patients without previous exposure to long-acting antipsychotic treatment revealed the positive impact of previous exposure on treatment continuation with a subsequent long-acting treatment. Additionally, the probability of restarting the index therapy was higher among patients treated with paliperidone palmitate once monthly than among patients treated with risperidone microspheres or haloperidol decanoate. The data source used and the methodology defined enabled, for the first time, a comparison of treatment continuation in a non-interventional study design for the four long-acting injectable antipsychotics studied. PMID:28614404
Burnham, J F; Shearer, B S; Wall, J C
1992-01-01
Librarians have used bibliometrics for many years to assess collections and to provide data for making selection and deselection decisions. With the advent of new technology--specifically, CD-ROM databases and reprint file database management programs--new cost-effective procedures can be developed. This paper describes a recent multidisciplinary study conducted by two library faculty members and one allied health faculty member to test a bibliometric method that used the MEDLINE and CINAHL databases on CD-ROM and the Papyrus database management program to produce a new collection development methodology. PMID:1600424
NASA Technical Reports Server (NTRS)
Orcutt, John M.; Brenton, James C.
2016-01-01
The methodology and the results of the quality control (QC) process for the meteorological data from the Lightning Protection System (LPS) towers located at Kennedy Space Center (KSC) Launch Complex 39B (LC-39B) are documented in this paper. Meteorological data are used to design a launch vehicle, determine operational constraints, and apply defined constraints on day-of-launch (DOL). In order to properly accomplish these tasks, a representative climatological database of meteorological records is needed, one that represents the climate the vehicle will encounter. Numerous meteorological measurement towers exist at KSC; however, the engineering tasks need measurements at specific heights, some of which can only be provided by a few towers. Other than the LPS towers, Tower 313 is the only tower that provides observations up to 150 m, and it is located approximately 3.5 km from LC-39B. In addition, the data need to be QC'ed to remove erroneous reports that could pollute the results of an engineering analysis, mislead the development of operational constraints, or provide a false image of the atmosphere at the tower's location.
Vaccines are different: A systematic review of budget impact analyses of vaccines.
Loze, Priscilla Magalhaes; Nasciben, Luciana Bertholim; Sartori, Ana Marli Christovam; Itria, Alexander; Novaes, Hillegonda Maria Dutilh; de Soárez, Patrícia Coelho
2017-05-15
Several countries require manufacturers to present a budget impact analysis (BIA), together with a cost-effectiveness analysis, to support national funding requests. However, guidelines for conducting BIAs of vaccines are scarce. Our objective was to analyze the methodological approaches used in published BIAs of vaccines, discussing specific methodological issues related to vaccines. This systematic review of the literature on BIAs of vaccines was carried out in accordance with the Centre for Reviews and Dissemination (CRD) guidelines. We searched multiple databases: MedLine, Embase, Biblioteca Virtual de Saúde (BVS), Cochrane Library, DARE Database, NHS Economic Evaluation Database (NHS EED), HTA Database (via CRD), and grey literature. Two researchers, working independently, selected the studies and extracted the data. The methodological quality of individual studies was assessed using the ISPOR 2012 Budget Impact Analysis Good Practice II Task Force checklist. A qualitative narrative synthesis was conducted. Twenty-two studies were reviewed. The most frequently evaluated vaccines were pneumococcal (41%), influenza (23%) and rotavirus (18%). The target population was stated in 21 studies (95%) and the perspective was clear in 20 (91%). Only 36% reported the calculations used to complete the BIA, 27% reported the total and disaggregated costs for each time period, and 9% showed the change in resource use for each time period. More than half of the studies (55%, n=12) reported less than 50% of the items recommended in the checklist. The production of BIAs of vaccines has increased since 2009. The reporting of the methodological steps was unsatisfactory, making it difficult to assess the validity of the results presented. Vaccine-specific issues should be discussed in international guidelines for BIAs of vaccines, to improve the quality of the studies. Copyright © 2017 Elsevier Ltd. All rights reserved.
Isaacs, Eric B.; Wolverton, Chris
2018-02-26
Electronic band structure contains a wealth of information on the electronic properties of a solid and is routinely computed. However, the more difficult problem of designing a solid with a desired band structure is an outstanding challenge. In order to address this inverse band structure design problem, we devise an approach using materials database screening with materials attributes based on the constituent elements, nominal electron count, crystal structure, and thermodynamics. Our strategy is tested in the context of thermoelectric materials, for which a targeted band structure containing both flat and dispersive components with respect to crystal momentum is highly desirable. We screen for thermodynamically stable or metastable compounds containing d8 transition metals coordinated by anions in a square planar geometry in order to mimic the properties of recently identified oxide thermoelectrics with such a band structure. In doing so, we identify 157 compounds out of a total of over half a million candidates. After further screening based on electronic band gap and structural anisotropy, we explicitly compute the band structures for several of the candidates in order to validate the approach. We successfully find two new oxide systems that achieve the targeted band structure. Electronic transport calculations on these two compounds, Ba2PdO3 and La4PdO7, confirm promising thermoelectric power factor behavior. This methodology is easily adapted to other targeted band structures and should be widely applicable to a variety of design problems.
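The attribute-based screening funnel described here can be illustrated with a small filter over hypothetical database entries. The field names, the element set, and the thresholds below are assumptions made for this sketch, not the criteria actually used in the study:

```python
# Metals that commonly adopt a d8 configuration in oxides (illustrative set)
D8_METALS = {"Ni", "Pd", "Pt", "Rh", "Ir", "Au"}

def passes_screen(entry, max_e_above_hull=0.05, min_gap=0.1):
    """Keep a candidate only if it contains a d8 metal, is thermodynamically
    stable or nearly so (low energy above the convex hull, eV/atom), and
    has a nonzero electronic band gap (eV)."""
    has_d8 = any(el in D8_METALS for el in entry["elements"])
    stable = entry["e_above_hull"] <= max_e_above_hull
    gapped = entry["band_gap"] >= min_gap
    return has_d8 and stable and gapped

# Two hypothetical database entries
candidates = [
    {"formula": "Ba2PdO3", "elements": ["Ba", "Pd", "O"], "e_above_hull": 0.00, "band_gap": 1.2},
    {"formula": "Fe2O3",   "elements": ["Fe", "O"],       "e_above_hull": 0.00, "band_gap": 2.0},
]
survivors = [c["formula"] for c in candidates if passes_screen(c)]  # only the Pd oxide survives
```

In a real pipeline each predicate would be one pass over the database, with the expensive explicit band-structure calculations reserved for the handful of entries that survive every filter.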
Chen, Wei; Lewith, George; Wang, Li-qiong; Ren, Jun; Xiong, Wen-jing; Lu, Fang; Liu, Jian-ping
2014-01-01
Chinese proprietary herbal medicines (CPHMs) have a long history in China for the treatment of the common cold, and many of them are listed in the 'China national essential drug list' by the Chinese Ministry of Health. The aim of this review is to provide a well-rounded assessment of the clinical evidence on the potential benefits and harms of CPHMs for the common cold, based on a systematic literature search, to justify their clinical use and recommendation. We searched CENTRAL, MEDLINE, EMBASE, SinoMed, CNKI, VIP, China Important Conference Papers Database, China Dissertation Database, and online clinical trial registry websites from their inception to 31 March 2013 for clinical studies of CPHMs listed in the 'China national essential drug list' for the common cold. There was no restriction on study design. A total of 33 CPHMs were listed in the 'China national essential drug list 2012' for the treatment of the common cold, but only 7 had supporting clinical evidence. A total of 6 randomised controlled trials (RCTs) and 7 case series (CSs) were included; no other study design was identified. All studies were conducted in China and published in Chinese between 1995 and 2012. All included studies had poor study design and methodological quality, and were graded as very low quality. The use of CPHMs for the common cold is not supported by robust evidence. Further rigorous, well-designed, placebo-controlled randomised trials are needed to substantiate the clinical claims made for CPHMs.
Development of a User-Oriented Data Classification for Information System Design Methodology.
1982-06-30
[COD79] Codd, E. F., "Extending the Database Relational Model to Capture More Meaning." ACM TODS 4:4, December 1979.
Becker, Christoph; Lauterbach, Gabriele; Spengler, Sarah; Dettweiler, Ulrich; Mess, Filip
2017-01-01
Background: Participants in Outdoor Education Programmes (OEPs) presumably benefit from these programmes in terms of their social and personal development, academic achievement and physical activity (PA). The aim of this systematic review was to identify studies about regular compulsory school- and curriculum-based OEPs, to categorise and evaluate reported outcomes, to assess the methodological quality, and to discuss possible benefits for students. Methods: We searched online databases to identify English- and German-language peer-reviewed journal articles that reported any outcomes on a student level. Two independent reviewers screened the studies identified for eligibility and assessed the methodological quality. Results: Thirteen studies were included for analysis. Most studies used a case-study design, the average number of participants was moderate (mean value (M) = 62.17; standard deviation (SD) = 64.12), and the methodological quality was moderate on average for qualitative studies (M = 0.52; SD = 0.11), and low on average for quantitative studies (M = 0.18; SD = 0.42). Eight studies described outcomes in terms of social dimensions, seven studies in learning dimensions and four studies were subsumed under additional outcomes, i.e., PA and health. Eleven studies reported positive effects, one study reported both positive and negative effects, and one study reported negative effects. PA and mental health as outcomes were underrepresented. Conclusion: Tendencies were detected that regular compulsory school- and curriculum-based OEPs can promote students' development in social, academic, physical and psychological dimensions. Very little is known concerning students' PA or mental health. We recommend conducting more quasi-experimental and longitudinal studies with a greater number of participants and high methodological quality to further investigate these tendencies. PMID:28475167
Capkun, Gorana; Lahoz, Raquel; Verdun, Elisabetta; Song, Xue; Chen, Weston; Korn, Jonathan R; Dahlke, Frank; Freitas, Rita; Fraeman, Kathy; Simeone, Jason; Johnson, Barbara H; Nordstrom, Beth
2015-05-01
Administrative claims databases provide a wealth of data for assessing the effect of treatments in clinical practice. Our aim was to propose methodology for real-world studies in multiple sclerosis (MS) using these databases. In three large US administrative claims databases: MarketScan, PharMetrics Plus and Department of Defense (DoD), patients with MS were selected using an algorithm identified in the published literature and refined for accuracy. Algorithms for detecting newly diagnosed ('incident') MS cases were also refined and tested. Methodology based on resource and treatment use was developed to differentiate between relapses with and without hospitalization. When various patient selection criteria were applied to the MarketScan database, an algorithm requiring two MS diagnoses at least 30 days apart was identified as the preferred method of selecting patient cohorts. Attempts to detect incident MS cases were confounded by the limited continuous enrollment of patients in these databases. Relapse detection algorithms identified similar proportions of patients in the MarketScan and PharMetrics Plus databases experiencing relapses with (2% in both databases) and without (15-20%) hospitalization in the 1 year follow-up period, providing findings in the range of those in the published literature. Additional validation of the algorithms proposed here would increase their credibility. The methods suggested in this study offer a good foundation for performing real-world research in MS using administrative claims databases, potentially allowing evidence from different studies to be compared and combined more systematically than in current research practice.
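The preferred patient-selection algorithm (two MS diagnoses at least 30 days apart) reduces to a simple rule over each patient's claim dates. A minimal sketch with invented claim histories:

```python
from datetime import date

def meets_ms_algorithm(claim_dates, min_days=30):
    """A patient qualifies if some pair of MS diagnosis claims falls at
    least `min_days` apart -- equivalently, if the span between the
    earliest and latest claim reaches `min_days`."""
    if len(claim_dates) < 2:
        return False
    ds = sorted(claim_dates)
    return (ds[-1] - ds[0]).days >= min_days

# Hypothetical claim histories
patients = {
    "A": [date(2020, 1, 5), date(2020, 3, 1)],   # 56 days apart -> selected
    "B": [date(2020, 1, 5), date(2020, 1, 20)],  # only 15 days apart -> excluded
}
cohort = [pid for pid, dates in patients.items() if meets_ms_algorithm(dates)]
```

A production version would first filter claims to MS diagnosis codes and restrict to each patient's continuous-enrollment window, which is exactly the step the authors note confounds incident-case detection.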
Literature Review of Research on Chronic Pain and Yoga in Military Populations
Miller, Shari; Gaylord, Susan; Buben, Alex; Brintz, Carrie; Rae Olmsted, Kristine; Asefnia, Nakisa; Bartoszek, Michael
2017-01-01
Background: Although yoga is increasingly being provided to active duty soldiers and veterans, studies with military populations are limited and effects on chronic pain are largely unknown. We reviewed the existing body of literature and provide recommendations for future research. Methods: We conducted a literature review of electronic databases (PubMed, PsychINFO, Web of Science, Science Citation Index Expanded, Social Sciences Citation Index, Conference Proceedings Citation Index—Science, and Conference Proceedings Citation Index—Social Science & Humanities). The studies were reviewed for characteristics such as mean age of participants, sample size, yoga type, and study design. Only peer-reviewed studies were included in the review. Results: The search yielded only six studies that examined pain as an outcome of yoga for military populations. With one exception, the studies were of veteran populations. Only one study was conducted with Operation Enduring Freedom (OEF) or Operation Iraqi Freedom (OIF) veterans. One study was a randomized controlled trial (RCT). Four of the five remaining studies used a pre/post design, while the last used a post-only design. Conclusions: Studies on the use of yoga to treat chronic pain in military populations are in their infancy. Methodological weaknesses include small sample sizes, a lack of studies with key groups (active duty and OEF/OIF veterans), and the use of single-group uncontrolled designs (pre/post; post-only) for all but one study. Future research is needed to address these methodological limitations and build on this small body of literature. PMID:28930278
Text Mining of Journal Articles for Sleep Disorder Terminologies.
Lam, Calvin; Lai, Fu-Chih; Wang, Chia-Hui; Lai, Mei-Hsin; Hsu, Nanly; Chung, Min-Huey
2016-01-01
Research on publication trends in journal articles on sleep disorders (SDs) and the associated methodologies using text mining has been limited. The present study used text mining to determine publication trends in sleep-related journal articles published during 2000-2013, to identify associations between SD and methodology terms, and to statistically analyze the text mining findings. SD and methodology terms were extracted from 3,720 sleep-related journal articles in the PubMed database by using MetaMap. The extracted data set was analyzed using hierarchical cluster analyses and adjusted logistic regression models to investigate publication trends and associations between SD and methodology terms. MetaMap had a text mining precision, recall, and false positive rate of 0.70, 0.77, and 11.51%, respectively. The most common SD term was breathing-related sleep disorder, whereas narcolepsy was the least common. Cluster analyses showed similar methodology clusters for each SD term, except narcolepsy. The logistic regression models showed an increasing prevalence of insomnia, parasomnia, and other sleep disorders but a decreasing prevalence of breathing-related sleep disorder during 2000-2013. Different SD terms were positively associated with different methodology terms regarding research design, measures, and analyses. Insomnia-, parasomnia-, and other sleep disorder-related articles showed an increasing publication trend, whereas those related to breathing-related sleep disorder showed a decreasing trend. Furthermore, experimental studies more commonly focused on hypersomnia and other SDs and less commonly on insomnia, breathing-related sleep disorder, narcolepsy, and parasomnia. Thus, text mining may facilitate the exploration of publication trends in SDs and the associated methodologies.
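The precision, recall, and false-positive-rate figures reported for MetaMap follow from the usual confusion-matrix definitions. The counts below are illustrative, chosen only so that precision comes out at the reported 0.70; they are not the study's actual data:

```python
def extraction_metrics(tp, fp, fn, tn):
    """Confusion-matrix metrics for a term-extraction run."""
    precision = tp / (tp + fp)   # share of extracted terms that were correct
    recall = tp / (tp + fn)      # share of true terms that were extracted
    fpr = fp / (fp + tn)         # share of non-terms wrongly extracted
    return precision, recall, fpr

# Hypothetical counts: 70 correct extractions, 30 spurious, 21 missed, 500 true negatives
precision, recall, fpr = extraction_metrics(tp=70, fp=30, fn=21, tn=500)
```

With these counts, recall also lands near the reported 0.77, while the false-positive rate depends heavily on how many candidate non-terms are counted as true negatives.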
Brunoni, André R; Tadini, Laura; Fregni, Felipe
2010-03-03
There have been many changes in clinical trials methodology since the introduction of lithium and the beginning of the modern era of psychopharmacology in 1949. The nature and importance of these changes have not been fully addressed to date. As methodological flaws in trials can lead to false-negative or false-positive results, the objective of our study was to evaluate the impact of methodological changes in psychopharmacology clinical research over the past 60 years. We performed a systematic review from 1949 to 2009 on MEDLINE and Web of Science electronic databases, and a hand search of high impact journals on studies of seven major drugs (chlorpromazine, clozapine, risperidone, lithium, fluoxetine and lamotrigine). All controlled studies published 100 months after the first trial were included. Ninety-one studies met our inclusion criteria. We analyzed the major changes in abstract reporting, study design, participants' assessment and enrollment, methodology and statistical analysis. Our results showed that the methodology of psychiatric clinical trials changed substantially, with quality gains in abstract reporting, results reporting, and statistical methodology. Recent trials use more informed consent, periods of washout, intention-to-treat approach and parametric tests. Placebo use remains high and unchanged over time. Clinical trial quality of psychopharmacological studies has changed significantly in most of the aspects we analyzed. There was significant improvement in quality reporting and internal validity. These changes have increased study efficiency; however, there is room for improvement in some aspects such as rating scales, diagnostic criteria and better trial reporting. Therefore, despite the advancements observed, there are still several areas that can be improved in psychopharmacology clinical trials.
Belgian guidelines for economic evaluations: second edition.
Thiry, Nancy; Neyt, Mattias; Van De Sande, Stefaan; Cleemput, Irina
2014-12-01
The aim of this study was to present the updated methodological guidelines for economic evaluations of healthcare interventions (drugs, medical devices, and other interventions) in Belgium. The update of the guidelines was performed by three Belgian health economists following feedback from users of the former guidelines and personal experience. The updated guidelines were discussed with a multidisciplinary team consisting of other health economists, assessors of reimbursement request files, representatives of Belgian databases and representatives of the drugs and medical devices industry. The final document was validated by three external validators that were not involved in the previous discussions. The guidelines give methodological guidance for the following components of an economic evaluation: literature review, perspective of the evaluation, definition of the target population, choice of the comparator, analytic technique and study design, calculation of costs, valuation of outcomes, definition of the time horizon, modeling, handling uncertainty and discounting. We present a reference case that can be considered as the minimal requirement for Belgian economic evaluations of health interventions. These guidelines will improve the methodological quality, transparency and uniformity of the economic evaluations performed in Belgium. The guidelines will also provide support to the researchers and assessors performing or evaluating economic evaluations.
Horliana, Anna Carolina Ratto Tempestini; Chambrone, Leandro; Foz, Adriana Moura; Artese, Hilana Paula Carillo; Rabelo, Mariana de Sousa; Pannuti, Cláudio Mendes; Romito, Giuseppe Alexandre
2014-01-01
Background: To date, there is no compilation of evidence-based information associating bacteremia and periodontal procedures. This systematic review aims to assess the magnitude, duration, prevalence and nature of bacteremia caused by periodontal procedures. Study Design: Systematic review. Types of Studies Reviewed: MEDLINE, EMBASE and LILACS databases were searched in duplicate through August 2013 without language restriction. Observational studies were included if blood samples were collected before, during or after periodontal procedures in patients with periodontitis. Methodological quality was assessed in duplicate using the modified Newcastle-Ottawa scale (NOS). Results: The search strategy identified 509 potentially eligible articles and nine were included. Only four studies demonstrated high methodological quality, whereas five were of medium or low methodological quality. The study characteristics were considered too heterogeneous to conduct a meta-analysis. Among 219 analyzed patients, 106 (49.4%) had positive bacteremia. The most frequent bacteria were S. viridans, A. actinomycetemcomitans, P. gingivalis and M. micros, along with Streptococcus and Actinomyces species, although the identification methods of the microbiologic assays differed among studies. Clinical Implications: Although half of the patients presented positive bacteremia after periodontal procedures, accurate results regarding the magnitude, duration and nature of bacteremia could not be confidently assessed. PMID:24870125
Allometric scaling theory applied to FIA biomass estimation
David C. Chojnacky
2002-01-01
Tree biomass estimates in the Forest Inventory and Analysis (FIA) database are derived from numerous methodologies whose abundance and complexity raise questions about consistent results throughout the U.S. A new model based on allometric scaling theory ("WBE") offers simplified methodology and a theoretically sound basis for improving the reliability and...
Individual determinants of research utilization: a systematic review.
Estabrooks, Carole A; Floyd, Judith A; Scott-Findlay, Shannon; O'Leary, Katherine A; Gushta, Matthew
2003-09-01
In order to design interventions that increase research use in nursing, it is necessary to have an understanding of what influences research use. To report findings on a systematic review of studies that examine individual characteristics of nurses and how they influence the utilization of research. A survey of published articles in English that examine the influence of individual factors on the research utilization behaviour of nurses, without restriction of the study design, from selected computerized databases and hand searches. Articles had to measure one or more individual determinants of research utilization, measure the dependent variable (research utilization), and evaluate the relationship between the dependent and independent variables. The studies also had to indicate the direction of the relationship between the independent and dependent variables, report a P-value and the statistic used, and indicate the magnitude of the relationship. Six categories of potential individual determinants were identified: beliefs and attitudes, involvement in research activities, information seeking, professional characteristics, education and other socio-economic factors. Research design, sampling, measurement, and statistical analysis were examined to evaluate methodological quality. Methodological problems surfaced in all of the studies and, apart from attitude to research, there was little to suggest that any potential individual determinant influences research use. Important conceptual and measurement issues with regard to research utilization could be better addressed if research in the area were undertaken longitudinally by multi-disciplinary teams of researchers.
ERIC Educational Resources Information Center
American Society for Information Science, Washington, DC.
This document contains abstracts of papers on database design and management which were presented at the 1986 mid-year meeting of the American Society for Information Science (ASIS). Topics considered include: knowledge representation in a bilingual art history database; proprietary database design; relational database design; in-house databases;…
Visualizing the semantic content of large text databases using text maps
NASA Technical Reports Server (NTRS)
Combs, Nathan
1993-01-01
A methodology for generating text map representations of the semantic content of text databases is presented. Text maps provide a graphical metaphor for conceptualizing and visualizing the contents and data interrelationships of large text databases. Described are a set of experiments conducted against the TIPSTER corpora of Wall Street Journal articles. These experiments provide an introduction to current work in the representation and visualization of documents by way of their semantic content.
NoSQL data model for semi-automatic integration of ethnomedicinal plant data from multiple sources.
Ningthoujam, Sanjoy Singh; Choudhury, Manabendra Dutta; Potsangbam, Kumar Singh; Chetia, Pankaj; Nahar, Lutfun; Sarker, Satyajit D; Basar, Norazah; Das Talukdar, Anupam
2014-01-01
Sharing traditional knowledge with the scientific community could refine scientific approaches to phytochemical investigation and conservation of ethnomedicinal plants. As such, integration of traditional knowledge with scientific data using a single platform for sharing is greatly needed. However, ethnomedicinal data are available in heterogeneous formats, which depend on cultural aspects, survey methodology and focus of the study. Phytochemical and bioassay data are also available from many open sources in various standard and customised formats. To design a flexible data model that could integrate both primary and curated ethnomedicinal plant data from multiple sources. The current model is based on MongoDB, one of the Not only Structured Query Language (NoSQL) databases. Although MongoDB does not enforce a schema, the model was adapted so that it could incorporate both standard and customised ethnomedicinal plant data formats from different sources. The model presented can integrate both primary and secondary data related to ethnomedicinal plants. Accommodation of disparate data was accomplished by a feature of this database that supports a different set of fields for each document. It also allowed storage of similar data having different properties. The model presented is scalable to a highly complex level with continuing maturation of the database, and is applicable for storing, retrieving and sharing ethnomedicinal plant data. It can also serve as a flexible alternative to a relational and normalised database. Copyright © 2014 John Wiley & Sons, Ltd.
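The schema-free flexibility described above can be sketched with plain Python dictionaries standing in for MongoDB documents (all field names and values here are hypothetical illustrations, not fields from the authors' model):

```python
# Two records for the same plant from different sources land in one
# "collection" with different field sets -- no shared schema required.
doc_a = {                               # primary field-survey record
    "plant": "Ocimum sanctum",
    "local_name": "tulsi",
    "uses": ["fever", "cough"],
}
doc_b = {                               # curated phytochemical record
    "plant": "Ocimum sanctum",
    "phytochemicals": [{"compound": "eugenol", "assay": "DPPH"}],
}
collection = [doc_a, doc_b]

def find(collection, **criteria):
    """Minimal stand-in for MongoDB's find({field: value}): return the
    documents whose fields match every criterion, ignoring fields a
    document simply does not have."""
    return [d for d in collection
            if all(d.get(k) == v for k, v in criteria.items())]

matches = find(collection, plant="Ocimum sanctum")   # both documents match
```

Because each document carries its own field set, a survey record and a curated assay record can sit side by side and still be queried on their shared fields.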
Morales-Bayuelo, Alejandro
2017-06-21
Mycobacterium tuberculosis remains one of the world's most devastating pathogens. For this reason, we developed a study involving 3D pharmacophore searching, selectivity analysis and database screening for a series of anti-tuberculosis compounds, associated with the protein kinases A, B, and G. This theoretical study is expected to shed some light on molecular aspects that could contribute to the knowledge of the molecular mechanics behind interactions of these compounds with anti-tuberculosis activity. Using the Molecular Quantum Similarity field and reactivity descriptors supported in Density Functional Theory, it was possible to quantify the steric and electrostatic effects through the Overlap and Coulomb quantitative convergence (alpha and beta) scales. In addition, an analysis of reactivity indices using global and local descriptors was developed, identifying the binding sites and selectivity of these anti-tuberculosis compounds in the active sites. Finally, the reported pharmacophores for PKn A, B and G were used to carry out database screening, using a database of anti-tuberculosis drugs from the Kelly Chibale research group (http://www.kellychibaleresearch.uct.ac.za/), to find the compounds with affinity for the specific protein targets associated with PKn A, B and G. In this regard, this hybrid methodology (Molecular Mechanics/Quantum Chemistry) offers new insights into drug design that may be useful in tuberculosis treatment today.
On use of the multistage dose-response model for assessing laboratory animal carcinogenicity
Nitcheva, Daniella; Piegorsch, Walter W.; West, R. Webster
2007-01-01
We explore how well a statistical multistage model describes dose-response patterns in laboratory animal carcinogenicity experiments from a large database of quantal response data. The data are collected from the U.S. EPA’s publicly available IRIS data warehouse and examined statistically to determine how often higher-order values in the multistage predictor yield significant improvements in explanatory power over lower-order values. Our results suggest that the addition of a second-order parameter to the model only improves the fit about 20% of the time, while adding even higher-order terms apparently does not contribute to the fit at all, at least with the study designs we captured in the IRIS database. Also included is an examination of statistical tests for assessing significance of higher-order terms in a multistage dose-response model. It is noted that bootstrap testing methodology appears to offer greater stability for performing the hypothesis tests than a more-common, but possibly unstable, “Wald” test. PMID:17490794
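For reference, the multistage quantal model discussed above has the standard form P(d) = 1 − exp(−(q0 + q1·d + q2·d² + …)) with non-negative coefficients; a minimal sketch:

```python
import math

def multistage_response(dose, q):
    """Multistage quantal dose-response model:
    P(d) = 1 - exp(-(q0 + q1*d + q2*d**2 + ...)), with all q_k >= 0
    so the response probability is non-decreasing in dose."""
    s = sum(qk * dose ** k for k, qk in enumerate(q))
    return 1.0 - math.exp(-s)

# a second-order fit adds a q2 term to the first-order model [q0, q1];
# whether that extra term significantly improves the fit is what the
# bootstrap and Wald tests discussed above assess
```

At dose zero the model reduces to the background response P(0) = 1 − exp(−q0), which is why q0 is often interpreted as a background-rate parameter.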
2013-01-01
Background Indigenous peoples of Australia, Canada, United States and New Zealand experience disproportionately high rates of suicide. As such, the methodological quality of evaluations of suicide prevention interventions targeting these Indigenous populations should be rigorously examined, in order to determine the extent to which they are effective for reducing rates of Indigenous suicide and suicidal behaviours. This systematic review aims to: 1) identify published evaluations of suicide prevention interventions targeting Indigenous peoples in Australia, Canada, United States and New Zealand; 2) critique their methodological quality; and 3) describe their main characteristics. Methods A systematic search of 17 electronic databases and 13 websites for the period 1981–2012 (inclusive) was undertaken. The reference lists of reviews of suicide prevention interventions were hand-searched for additional relevant studies not identified by the electronic and web search. The methodological quality of evaluations of suicide prevention interventions was assessed using a standardised assessment tool. Results Nine evaluations of suicide prevention interventions were identified: five targeting Native Americans; three targeting Aboriginal Australians; and one targeting First Nations Canadians. The main intervention strategies employed included: Community Prevention, Gatekeeper Training, and Education. Only three of the nine evaluations measured changes in rates of suicide or suicidal behaviour, all of which reported significant improvements. The methodological quality of evaluations was variable. Particular problems included weak study designs, reliance on self-report measures, highly variable consent and follow-up rates, and the absence of economic or cost analyses.
Conclusions There is an urgent need for an increase in the number of evaluations of preventive interventions targeting reductions in Indigenous suicide using methodologically rigorous study designs across geographically and culturally diverse Indigenous populations. Combining and tailoring best evidence and culturally-specific individual strategies into one coherent suicide prevention program for delivery to whole Indigenous communities and/or population groups at high risk of suicide offers considerable promise. PMID:23663493
Clifford, Anton C; Doran, Christopher M; Tsey, Komla
2013-05-13
A manufacturing database of advanced materials used in spacecraft structures
NASA Technical Reports Server (NTRS)
Bao, Han P.
1994-01-01
Cost savings opportunities over the life cycle of a product are highest in the early exploratory phase, when different design alternatives are evaluated not only for their performance characteristics but also for their methods of fabrication, which ultimately control the manufacturing costs of the product. In the past, Design-To-Cost methodologies for spacecraft design concentrated on sizing and weight issues more than anything else at the early, so-called 'Vehicle Level' (Ref: DOD/NASA Advanced Composites Design Guide). Given the impact of manufacturing cost, the objective of this study is to identify the principal cost drivers for each materials technology and propose a quantitative approach to incorporating these cost drivers into the family of optimization tools used by the Vehicle Analysis Branch of NASA LaRC to assess various conceptual vehicle designs. The advanced materials being considered include aluminum-lithium alloys, thermoplastic graphite-polyether etherketone composites, graphite-bismaleimide composites, graphite-polyimide composites, and carbon-carbon composites. Two conventional materials are added to the study to serve as baselines against which the other materials are compared: aircraft aluminum alloys series 2000 and series 7000, and graphite-epoxy composites T-300/934. The following information is available in the database. For each material type, the mechanical, physical, thermal, and environmental properties are listed first. Next, the principal manufacturing processes are described. Whenever possible, guidelines for optimum processing conditions for specific applications are provided. Finally, six categories of cost drivers are discussed: design features affecting processing, tooling, materials, fabrication, joining/assembly, and quality assurance issues. It should be emphasized that this database is not exhaustive.
Its primary use is to make the vehicle designer aware of some of the most important aspects of manufacturing associated with his/her choice of the structural materials. The other objective of this study is to propose a quantitative method to determine a Manufacturing Complexity Factor (MCF) for each material being contemplated. This MCF is derived on the basis of the six cost drivers mentioned above plus a Technology Readiness Factor which is very closely related to the Technology Readiness Level (TRL) as defined in the Access To Space final report. Short of any manufacturing information, our MCF is equivalent to the inverse of TRL. As more manufacturing information is available, our MCF is a better representation (than TRL) of the fabrication processes involved. The most likely application for MCF is in cost modeling for trade studies. On-going work is being pursued to expand the potential applications of MCF.
A manufacturing database of advanced materials used in spacecraft structures
NASA Astrophysics Data System (ADS)
Bao, Han P.
1994-12-01
1978-09-01
This report describes an effort to specify a software design methodology applicable to the Air Force software environment. Available methodologies...of techniques for proof of correctness, design specification, and performance assessment of static designs. The rational methodology selected is a
Massive parallelization of serial inference algorithms for a complex generalized linear model
Suchard, Marc A.; Simpson, Shawn E.; Zorych, Ivan; Ryan, Patrick; Madigan, David
2014-01-01
Following a series of high-profile drug safety disasters in recent years, many countries are redoubling their efforts to ensure the safety of licensed medical products. Large-scale observational databases such as claims databases or electronic health record systems are attracting particular attention in this regard, but present significant methodological and computational concerns. In this paper we show how high-performance statistical computation, including graphics processing units, relatively inexpensive highly parallel computing devices, can enable complex methods in large databases. We focus on optimization and massive parallelization of cyclic coordinate descent approaches to fit a conditioned generalized linear model involving tens of millions of observations and thousands of predictors in a Bayesian context. We find orders-of-magnitude improvement in overall run-time. Coordinate descent approaches are ubiquitous in high-dimensional statistics and the algorithms we propose open up exciting new methodological possibilities with the potential to significantly improve drug safety. PMID:25328363
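Cyclic coordinate descent itself can be illustrated on a much simpler problem than the authors' conditioned GLM; the sketch below applies it to an ordinary lasso least-squares objective (an illustration of the general technique only, not the paper's algorithm or its GPU implementation):

```python
import numpy as np

def soft_threshold(z, g):
    """Soft-thresholding operator used in the lasso coordinate update."""
    return np.sign(z) * max(abs(z) - g, 0.0)

def cyclic_coordinate_descent(X, y, lam, n_iter=200):
    """Minimize 0.5*||y - X b||^2 + lam*||b||_1 by cycling through the
    coordinates of b, solving each one-dimensional subproblem in closed
    form. Assumes X has no all-zero columns."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    r = y - X @ b                      # current residual
    for _ in range(n_iter):
        for j in range(p):
            r = r + X[:, j] * b[j]     # add coordinate j's contribution back
            z = X[:, j] @ r            # correlation of column j with residual
            b[j] = soft_threshold(z, lam) / col_sq[j]
            r = r - X[:, j] * b[j]     # update residual with the new b[j]
    return b
```

The inner update touches one column of X at a time, which is exactly the structure that makes the approach amenable to the massive parallelization the abstract describes.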
NASA Technical Reports Server (NTRS)
Onwubiko, Chinyere; Onyebueke, Landon
1996-01-01
This is the final report covering all the work done on this project. The goal of the project is the transfer of methodologies that improve the design process. The specific objectives are: 1. To learn and understand probabilistic design analysis using NESSUS. 2. To assign design projects on the application of NESSUS to undergraduate or graduate students. 3. To integrate the application of NESSUS into selected senior-level courses in the Civil and Mechanical Engineering curricula. 4. To develop courseware in probabilistic design methodology for inclusion in a graduate-level design methodology course. 5. To study the relationship between probabilistic design methodology and axiomatic design methodology.
Beccaria, Lisa; Kek, Megan Y C A; Huijser, Henk
2018-06-01
In this paper, a review of nursing education literature is employed to ascertain the extent to which nursing educators apply theory to their research, as well as the types of theory they employ. In addition, the use of research methodologies in the nursing education literature is explored. An integrative review. A systematic search was conducted for English-language, peer-reviewed publications of any research design via the Academic Search Complete, Science Direct, CINAHL, and Health Source: Nursing/Academic Edition databases from 2001 to 2016; 140 publications were reviewed. The findings suggest that within current nursing education literature the scholarship of discovery, and the exploration of epistemologies other than nursing, in particular as they relate to teaching and learning, shows significant potential for expansion and diversification. The analysis highlights opportunities for nursing educators to incorporate broader theoretical, pedagogical, methodological and philosophical perspectives within teaching and the scholarship of teaching. Copyright © 2018 Elsevier Ltd. All rights reserved.
Methodological Quality Assessment of Meta-Analyses of Hyperthyroidism Treatment.
Qin, Yahong; Yao, Liang; Shao, Feifei; Yang, Kehu; Tian, Limin
2018-01-01
Hyperthyroidism is a common condition that is associated with increased morbidity and mortality. A number of meta-analyses (MAs) have assessed therapeutic measures for hyperthyroidism, including antithyroid drugs, surgery, and radioiodine; however, their methodological quality has not been evaluated. This study evaluated the methodological quality and summarized the evidence obtained from MAs of hyperthyroidism treatments involving radioiodine, antithyroid drugs, and surgery. We searched the PubMed, EMBASE, Cochrane Library, Web of Science, and Chinese Biomedical Literature Database databases. Two investigators independently assessed the meta-analysis titles and abstracts for inclusion. Methodological quality was assessed using the validated AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool. A total of 26 MAs fulfilled the inclusion criteria. Based on the AMSTAR scores, the average methodological quality was 8.31, with large variability ranging from 4 to 11. The methodological quality of English meta-analyses was better than that of Chinese meta-analyses. Cochrane reviews had better methodological quality than non-Cochrane reviews due to better study selection and data extraction, the inclusion of unpublished studies, and better reporting of study characteristics. The authors did not report conflicts of interest in 53.8% of meta-analyses, and 19.2% did not report the harmful effects of treatment. Publication bias was not assessed in 38.5% of meta-analyses, and 19.2% did not report the follow-up time. This large-scale assessment of the methodological quality of meta-analyses of hyperthyroidism treatment highlighted methodological strengths and weaknesses. Consideration of scientific quality when formulating conclusions should be made explicit. Future meta-analyses should improve the reporting of conflicts of interest. © Georg Thieme Verlag KG Stuttgart · New York.
PathwayAccess: CellDesigner plugins for pathway databases.
Van Hemert, John L; Dickerson, Julie A
2010-09-15
CellDesigner provides a user-friendly interface for graphical biochemical pathway description. Many pathway databases are not directly exportable to CellDesigner models. PathwayAccess is an extensible suite of CellDesigner plugins, which connect CellDesigner directly to pathway databases using respective Java application programming interfaces. The process is streamlined for creating new PathwayAccess plugins for specific pathway databases. Three PathwayAccess plugins, MetNetAccess, BioCycAccess and ReactomeAccess, directly connect CellDesigner to the pathway databases MetNetDB, BioCyc and Reactome. PathwayAccess plugins enable CellDesigner users to expose pathway data to analytical CellDesigner functions, curate their pathway databases and visually integrate pathway data from different databases using standard Systems Biology Markup Language and Systems Biology Graphical Notation. Implemented in Java, PathwayAccess plugins run with CellDesigner version 4.0.1 and were tested on Ubuntu Linux, Windows XP and 7, and MacOSX. Source code, binaries, documentation and video walkthroughs are freely available at http://vrac.iastate.edu/~jlv.
Gratia, Audrey; Merlet, Denis; Ducruet, Violette; Lyathaud, Cédric
2015-01-01
A nuclear magnetic resonance (NMR) methodology was assessed for the identification and quantification of additives in three types of polylactide (PLA) intended as food contact materials. Additives were identified using the LNE/NMR database, which clusters NMR datasets on more than 130 substances authorized by European Regulation No. 10/2011. Of the 12 additives spiked into the three types of PLA pellets, 10 were rapidly identified by the database and confirmed by spectral comparison. The levels of the 12 additives were estimated using quantitative NMR combined with graphical computation. A comparison with chromatographic methods supported the sensitivity of NMR, showing an analytical difference of less than 15%. Our results therefore demonstrate the efficiency of the proposed NMR methodology for rapid assessment of the composition of PLA. Copyright © 2014 Elsevier B.V. All rights reserved.
Food Intakes Converted to Retail Commodities Databases 2003-08: Methodology and User Guide
USDA-ARS?s Scientific Manuscript database
The purpose for developing the Food Intakes Converted to Retail Commodities Databases (FICRCD) 2003-08 is to convert foods consumed in What We Eat In America, National Health and Nutrition Examination Survey (WWEIA, NHANES) 2003-2004, 2005-2006, and 2007-2008 to respective amounts of retail-level fo...
ERIC Educational Resources Information Center
Caison, Amy L.
2007-01-01
This study empirically explores the comparability of traditional survey-based retention research methodology with an alternative approach that relies on data commonly available in institutional student databases. Drawing on Tinto's [Tinto, V. (1993). "Leaving College: Rethinking the Causes and Cures of Student Attrition" (2nd Ed.), The University…
Ownsworth, Tamara; Haslam, Catherine
2016-01-01
To date, reviews of rehabilitation efficacy after traumatic brain injury (TBI) have overlooked the impact on sense of self, focusing instead on functional impairment and psychological distress. The present review sought to address this gap by critically appraising the methodology and efficacy of intervention studies that assess changes in self-concept. A systematic search of PsycINFO, Medline, CINAHL and PubMed was conducted from inception to September 2013 to identify studies reporting pre- and post-intervention changes on validated measures of self-esteem or self-concept in adults with TBI. Methodological quality of randomised controlled trials (RCTs) was examined using the Physiotherapy Evidence Database (PEDro) scale. A total of 17 studies (10 RCTs, 4 non-RCT group studies, 3 case studies) was identified, which examined the impact of psychotherapy, family-based support, cognitive rehabilitation or activity-based interventions on self-concept. The findings on the efficacy of these interventions were mixed, with only 10 studies showing some evidence of improvement in self-concept based on within-group or pre-post comparisons. Such findings highlight the need for greater focus on the impact of rehabilitation on self-understanding with improved assessment and intervention methodology. We draw upon theories of identity reconstruction and highlight implications for the design and evaluation of identity-oriented interventions that can supplement existing rehabilitation programmes for people with TBI.
Methodological adequacy of articles published in two open-access Brazilian cardiology periodicals.
Macedo, Cristiane Rufino; Silva, Davi Leite da; Puga, Maria Eduarda
2010-01-01
The use of rigorous scientific methods has contributed to the development of scientific articles of excellent methodological quality, which in turn promotes their citation and increases the impact factor. Brazilian periodicals have had to adapt to certain quality standards demanded by indexing organizations, such as the content and the number of original articles published in each issue. This study aimed to evaluate the methodological adequacy of two Brazilian periodicals within the field of cardiology that are indexed in several databases and freely accessible through the Scientific Electronic Library Online (SciELO), and which are now indexed by the Web of Science (Institute for Scientific Information, ISI). Descriptive study at the Brazilian Cochrane Center. All the published articles were evaluated according to merit assessment (content) and form assessment (performance). Ninety-six percent of the articles analyzed presented study designs that were adequate for answering the objectives. These two Brazilian periodicals within the field of cardiology published methodologically adequate articles, since they followed the quality standards. Thus, these periodicals can be considered both for consultation and as vehicles for publishing future articles. For further analyses, it is essential to apply other indicators of scientific activity, such as bibliometrics, which evaluates quantitative aspects of the production, dissemination and use of information, and scientometrics, which is also concerned with the development of science policies and often overlaps with bibliometrics.
Surgical interventions for gastric cancer: a review of systematic reviews.
He, Weiling; Tu, Jian; Huo, Zijun; Li, Yuhuang; Peng, Jintao; Qiu, Zhenwen; Luo, Dandong; Ke, Zunfu; Chen, Xinlin
2015-01-01
To evaluate the methodological quality and the extent of concordance among meta-analyses and/or systematic reviews of surgical interventions for gastric cancer (GC). A comprehensive search of PubMed, Medline, EMBASE, the Cochrane Library and the DARE database was conducted to identify reviews comparing different surgical interventions for GC prior to April 2014. After applying the inclusion criteria, available data were summarized and appraised using the Oxman and Guyatt scale. Fifty-six reviews were included. Forty-five reviews (80.4%) were well conducted, with scores on the adapted Oxman and Guyatt scale ≥ 14. The reviews differed in criteria for avoiding bias and assessing the validity of the primary studies. Many primary studies displayed major methodological flaws, such as in randomization, allocation concealment, and dropouts and withdrawals. According to the concordance assessment, laparoscopy-assisted gastrectomy (LAG) was superior to open gastrectomy, and laparoscopy-assisted distal gastrectomy was superior to open distal gastrectomy in short-term outcomes. However, concordance regarding other surgical interventions, such as D1 vs. D2 lymphadenectomy and robotic gastrectomy vs. LAG, was absent. Systematic reviews of surgical interventions for GC displayed relatively high methodological quality. Improvement of methodological quality and reporting is necessary for primary studies. The superiority of laparoscopic over open surgery was demonstrated, but concordance on other surgical interventions was rare, requiring more well-designed RCTs and systematic reviews.
The Methodology of Clinical Studies Used by the FDA for Approval of High-Risk Orthopaedic Devices.
Barker, Jordan P; Simon, Stephen D; Dubin, Jonathan
2017-05-03
The purpose of this investigation was to examine the methodology of clinical trials used by the U.S. Food and Drug Administration (FDA) to determine the safety and effectiveness of high-risk orthopaedic devices approved between 2001 and 2015. Utilizing the FDA's online public database, this systematic review audited study design and methodological variables intended to minimize bias and confounding. An additional analysis of blinding as well as the Checklist to Evaluate a Report of a Nonpharmacological Trial (CLEAR NPT) was applied to the randomized controlled trials (RCTs). Of the 49 studies, 46 (94%) were prospective and 37 (76%) were randomized. Forty-seven (96%) of the studies were controlled in some form. Of 35 studies that reported it, blinding was utilized in 21 (60%), of which 8 (38%) were reported as single-blinded and 13 (62%) were reported as double-blinded. Of the 37 RCTs, outcome assessors were clearly blinded in 6 (16%), whereas 15 (41%) were deemed impossible to blind as implants could be readily discerned on imaging. When the CLEAR NPT was applied to the 37 RCTs, >70% of studies were deemed "unclear" in describing generation of allocation sequences, treatment allocation concealment, and adequate blinding of participants and outcome assessors. This study demonstrates the highly variable reporting and strength of clinical research methodology accepted by the FDA to approve high-risk orthopaedic devices.
Reliability based design optimization: Formulations and methodologies
NASA Astrophysics Data System (ADS)
Agarwal, Harish
Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. 
A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
ERIC Educational Resources Information Center
Sandieson, Robert W.; Kirkpatrick, Lori C.; Sandieson, Rachel M.; Zimmerman, Walter
2010-01-01
Digital technologies enable the storage of vast amounts of information, accessible with remarkable ease. However, along with this facility comes the challenge to find pertinent information from the volumes of nonrelevant information. The present article describes the pearl-harvesting methodological framework for information retrieval. Pearl…
Addressing the English Language Arts Technology Standard in a Secondary Reading Methodology Course.
ERIC Educational Resources Information Center
Merkley, Donna J.; Schmidt, Denise A.; Allen, Gayle
2001-01-01
Describes efforts to integrate technology into a reading methodology course for secondary English majors. Discusses the use of e-mail, multimedia, distance education for videoconferences, online discussion technology, subject-specific software, desktop publishing, a database management system, a concept mapping program, and the use of the World…
Search and selection methodology of systematic reviews in orthodontics (2000-2004).
Flores-Mir, Carlos; Major, Michael P; Major, Paul W
2006-08-01
More systematic reviews related to orthodontic topics are published each year, although little has been done to evaluate their search and selection methodologies. Systematic reviews related to orthodontics published between January 1, 2000, and December 31, 2004, were searched for their use of multiple electronic databases and secondary searches. The search and selection methods of identified systematic reviews were evaluated against the Cochrane Handbook's guidelines. Sixteen orthodontic systematic reviews were identified in this period. The percentage of reviews documenting and using each criterion of article searching has changed over the last 5 years, with no recognizable directional trend. On average, most systematic reviews documented their electronic search terms (88%) and inclusion-exclusion criteria (100%), and used secondary searching (75%). Many still failed to search more than MEDLINE (56%), failed to document the database names and search dates (37%), failed to document the search strategy (62%), did not use several reviewers for selecting studies (75%), and did not include all languages (81%). The methodology of systematic reviews in orthodontics is still limited, with key methodological components frequently absent or not appropriately described.
Tao, Huan; Zhang, Yueyuan; Li, Qian; Chen, Jin
2017-11-01
To assess the methodological quality of systematic reviews (SRs) or meta-analyses concerning the predictive value of ERCC1 in platinum chemotherapy for non-small cell lung cancer. We searched the PubMed, EMbase, Cochrane Library, international prospective register of systematic reviews, Chinese BioMedical Literature Database, China National Knowledge Infrastructure, Wan Fang and VIP databases for SRs or meta-analyses. The methodological quality of the included literature was evaluated with the Risk of Bias in Systematic Reviews (ROBIS) tool. Nineteen eligible SRs/meta-analyses were included. The most frequently searched databases were EMbase (74%), PubMed, Medline and CNKI. Fifteen SRs performed additional manual retrieval, but none of them searched a registration platform. 47% described a two-reviewer model for screening eligible original articles, and seven SRs described two reviewers extracting data. In the methodological quality assessment, the inter-rater reliability (kappa) between the two reviewers was 0.87. In phase 1, the research question was well formulated in all SRs and the eligibility criteria were suitable for each SR, rated as 'low' risk of bias. However, 'high' risk of bias existed in all the SRs regarding the methods used to identify and/or select studies, and regarding data collection and study appraisal. More than two-thirds of the SRs or meta-analyses showed high risk of bias in the synthesis, the findings and the final phase. The study demonstrated poor methodological quality of SRs/meta-analyses assessing the predictive value of ERCC1 in chemotherapy among NSCLC patients, especially a high risk of performance bias. Registering or publishing the protocol is recommended for future research.
A scalable database model for multiparametric time series: a volcano observatory case study
NASA Astrophysics Data System (ADS)
Montalto, Placido; Aliotta, Marco; Cassisi, Carmelo; Prestifilippo, Michele; Cannata, Andrea
2014-05-01
The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. The term time series refers to a set of observations of a given phenomenon acquired sequentially in time; when the time intervals are equally spaced, one speaks of a sampling period or frequency. Our work describes in detail a possible methodology for the storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), to acquire time series from different data sources and standardize them within a relational database. Standardization makes it possible to perform operations such as querying and visualization across many measures, synchronizing them on a common time scale. The proposed architecture follows a multiple-layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in performing particular operations for the reorganization and archiving of data from different sources, such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible over the Internet (web pages, XML). In particular, the Loaders layer checks the working status of each running software component through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although the system has to manage huge amounts of data, performance is guaranteed by a smart table partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for visualizing the acquired data, which make it possible to query different time series over a specified time range, or to follow real-time signal acquisition, subject to a per-user data access policy.
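The common-time-scale standardization described here can be illustrated with a small, hypothetical sketch (not the actual TSDSystem code): two series sampled at different rates are aligned on a shared grid by nearest-timestamp lookup within a tolerance.

```python
from bisect import bisect_left

def align_to_grid(samples, grid, tolerance):
    """Map (timestamp, value) samples onto a common time grid: for each
    grid instant take the nearest sample within `tolerance` seconds,
    or None when no sample is close enough."""
    times = [t for t, _ in samples]
    aligned = []
    for g in grid:
        i = bisect_left(times, g)
        best = None
        for j in (i - 1, i):            # candidates just before/after g
            if 0 <= j < len(times) and abs(times[j] - g) <= tolerance:
                if best is None or abs(times[j] - g) < abs(times[best] - g):
                    best = j
        aligned.append(samples[best][1] if best is not None else None)
    return aligned

# two signals with different sampling rates, aligned on a 1 s grid
seismic = [(0.0, 1.0), (1.0, 1.2), (2.0, 0.9), (3.0, 1.1)]
gas = [(0.0, 40.0), (2.4, 42.0)]
grid = [0.0, 1.0, 2.0, 3.0]

print(align_to_grid(seismic, grid, 0.5))  # [1.0, 1.2, 0.9, 1.1]
print(align_to_grid(gas, grid, 0.5))      # [40.0, None, 42.0, None]
```

Once both series share the grid, cross-series queries and joint visualization become simple positional operations, which is the point of the standardization step.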
A multidisciplinary database for geophysical time series management
NASA Astrophysics Data System (ADS)
Montalto, P.; Aliotta, M.; Cassisi, C.; Prestifilippo, M.; Cannata, A.
2013-12-01
The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. The term time series refers to a set of observations of a given phenomenon acquired sequentially in time; when the time intervals are equally spaced, one speaks of a sampling period or frequency. Our work describes in detail a possible methodology for the storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), to acquire time series from different data sources and standardize them within a relational database. Standardization makes it possible to perform operations such as querying and visualization across many measures, synchronizing them on a common time scale. The proposed architecture follows a multiple-layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in performing particular operations for the reorganization and archiving of data from different sources, such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible over the Internet (web pages, XML). In particular, the Loaders layer checks the working status of each running software component through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although the system has to manage huge amounts of data, performance is guaranteed by a smart table partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for visualizing the acquired data, which make it possible to query different time series over a specified time range, or to follow real-time signal acquisition, subject to a per-user data access policy.
Takada, Mitsutaka; Fujimoto, Mai; Motomura, Haruka; Hosomi, Kouichi
2016-01-01
Voltage-gated sodium channels (VGSCs) are drug targets for the treatment of epilepsy. Recently, a decreased risk of cancer associated with sodium channel-blocking antiepileptic drugs (AEDs) has become a focus of research interest. The purpose of this study was to test the hypothesis that the use of sodium channel-blocking AEDs is inversely associated with cancer, using different methodologies, algorithms, and databases. A total of 65,146,507 drug-reaction pairs from the first quarter of 2004 through the end of 2013 were downloaded from the US Food and Drug Administration Adverse Event Reporting System (FAERS). The reporting odds ratio (ROR) and information component (IC) were used to detect an inverse association between AEDs and cancer; upper limits of the 95% confidence interval (CI) of < 1 and < 0 for the ROR and IC, respectively, signified inverse associations. Furthermore, using a claims database covering 3 million insured persons, an event sequence symmetry analysis (ESSA) was performed to identify an inverse association between AEDs and cancer over the period of January 2005 to May 2014. An upper limit of the 95% CI of the adjusted sequence ratio (ASR) of < 1 signified an inverse association. In the FAERS database analyses, significant inverse associations were found between sodium channel-blocking AEDs and individual cancers. In the claims database analyses, sodium channel-blocking AED use was inversely associated with diagnoses of colorectal cancer, lung cancer, gastric cancer, and hematological malignancies, with ASRs of 0.72 (95% CI: 0.60 - 0.86), 0.65 (0.51 - 0.81), 0.80 (0.65 - 0.98), and 0.50 (0.37 - 0.66), respectively. No positive associations between sodium channel-blocking AEDs and cancer were found in the study. These multi-methodological approaches, using different methodologies, algorithms, and databases, suggest that sodium channel-blocking AED use is inversely associated with colorectal cancer, lung cancer, gastric cancer, and hematological malignancies.
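The ROR threshold used here is a standard disproportionality computation. A minimal sketch with hypothetical report counts (not FAERS data), using the usual log-scale (Woolf) confidence interval:

```python
import math

def ror_ci(a, b, c, d):
    """Reporting odds ratio with a 95% CI from a 2x2 report-count table:
    a = target drug & target event, b = target drug & other events,
    c = other drugs & target event, d = other drugs & other events."""
    ror = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # Woolf standard error on the log scale
    lo = math.exp(math.log(ror) - 1.96 * se)
    hi = math.exp(math.log(ror) + 1.96 * se)
    return ror, lo, hi

# hypothetical counts: the drug-event pair is reported less often than expected
ror, lo, hi = ror_ci(20, 5000, 200, 10000)
print(round(ror, 2), round(hi, 2))   # 0.2 0.32 -- upper CI limit < 1 flags an inverse association
```

The IC used alongside the ROR is a Bayesian shrinkage measure; its computation differs and is not shown here.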
Jia, Pengli; Tang, Li; Yu, Jiajie; Lee, Andy H; Zhou, Xu; Kang, Deying; Luo, Yanan; Liu, Jiali; Sun, Xin
2018-03-06
To assess risk of bias and to investigate methodological issues concerning the design, conduct and analysis of randomised controlled trials (RCTs) testing acupuncture for knee osteoarthritis (KOA). PubMed, EMBASE, Cochrane Central Register of Controlled Trials and four major Chinese databases were searched for RCTs that investigated the effect of acupuncture for KOA. The Cochrane tool was used to examine the risk of bias of eligible RCTs. Their methodological details were examined using a standardised and pilot-tested questionnaire of 48 items, together with the association between four predefined factors and important methodological quality indicators. A total of 248 RCTs were eligible, of which 39 (15.7%) used computer-generated randomisation sequence. Of the 31 (12.5%) trials that stated the allocation concealment, only one used central randomisation. Twenty-five (10.1%) trials mentioned that their acupuncture procedures were standardised, but only 18 (7.3%) specified how the standardisation was achieved. The great majority of trials (n=233, 94%) stated that blinding was in place, but 204 (87.6%) did not clarify who was blinded. Only 27 (10.9%) trials specified the primary outcome, for which 7 used intention-to-treat analysis. Only 17 (6.9%) trials included details on sample size calculation; none preplanned an interim analysis and associated stopping rule. In total, 46 (18.5%) trials explicitly stated that loss to follow-up occurred, but only 6 (2.4%) provided some information to deal with the issue. No trials prespecified, conducted or reported any subgroup or adjusted analysis for the primary outcome. The overall risk of bias was high among published RCTs testing acupuncture for KOA. Methodological limitations were present in many important aspects of design, conduct and analyses. These findings inform the development of evidence-based methodological guidance for future trials assessing the effect of acupuncture for KOA. 
© Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Practice databases and their uses in clinical research.
Tierney, W M; McDonald, C J
1991-04-01
A few large clinical information databases have been established within larger medical information systems. Although they are smaller than claims databases, these clinical databases offer several advantages: accurate and timely data, rich clinical detail, and continuous parameters (for example, vital signs and laboratory results). However, the nature of the data varies considerably, which affects the kinds of secondary analyses that can be performed. These databases have been used to investigate clinical epidemiology, risk assessment, post-marketing surveillance of drugs, practice variation, resource use, quality assurance, and decision analysis. In addition, practice databases can be used to identify subjects for prospective studies. Further methodologic developments are necessary to deal with the prevalent problems of missing data and various forms of bias if such databases are to grow and contribute valuable clinical information.
Faggion, Clovis M; Huda, Fahd; Wasiak, Jason
2014-06-01
To evaluate the methodological approaches used to assess the quality of studies included in systematic reviews (SRs) in periodontology and implant dentistry. Two electronic databases (PubMed and Cochrane Database of Systematic Reviews) were searched independently to identify SRs examining interventions published through 2 September 2013. The reference lists of included SRs and records of 10 specialty dental journals were searched manually. Methodological approaches were assessed using seven criteria based on the Cochrane Handbook for Systematic Reviews of Interventions. Temporal trends in methodological quality were also explored. Of the 159 SRs with meta-analyses included in the analysis, 44 (28%) reported the use of domain-based tools, 15 (9%) reported the use of checklists and 7 (4%) reported the use of scales. Forty-two (26%) SRs reported use of more than one tool. Criteria were met heterogeneously; authors of 15 (9%) publications incorporated the quality of evidence of primary studies into SRs, whereas 69% of SRs reported methodological approaches in the Materials/Methods section. Reporting of four criteria was significantly better in recent (2010-2013) than in previous publications. The analysis identified several methodological limitations of approaches used to assess evidence in studies included in SRs in periodontology and implant dentistry. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
François, Clément; Tanasescu, Adrian; Lamy, François-Xavier; Despiegel, Nicolas; Falissard, Bruno; Chalem, Ylana; Lançon, Christophe; Llorca, Pierre-Michel; Saragoussi, Delphine; Verpillat, Patrice; Wade, Alan G.; Zighed, Djamel A.
2017-01-01
Background and objective: Automated healthcare databases (AHDBs) are an important data source for real-life drug and healthcare use. In the field of depression, the lack of detailed clinical data requires the use of binary proxies with important limitations. The study objective was to create a Depressive Health State Index (DHSI) as a continuous health state measure for depressed patients using available data in an AHDB. Methods: The study was based on a historical cohort design using the UK Clinical Practice Research Datalink (CPRD). Depressive episodes (a depression diagnosis with an antidepressant prescription) were used to create the DHSI through 6 successive steps: (1) defining the study design; (2) identifying constituent parameters; (3) assigning relative weights to the parameters; (4) ranking based on the presence of parameters; (5) standardizing the rank to obtain the DHSI; (6) developing a regression model to derive the DHSI in any other sample. Results: The DHSI ranged from 0 (worst) to 100 (best health state) and comprised 29 parameters. The proportion of depressive episodes with a remission proxy increased across DHSI quartiles. Conclusion: A continuous outcome for depressed patients treated with antidepressants was created in an AHDB using several different variables, allowing more granularity than currently used proxies. PMID:29081921
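Steps (3)-(5) can be illustrated with a toy sketch; the parameter names and weights below are invented for illustration and are not the study's 29 CPRD-derived parameters.

```python
def dhsi(episodes, weights):
    """Toy sketch of steps (3)-(5): weight adverse parameters, rank
    episodes by weighted burden, and standardise the rank to a 0-100
    index (0 = worst, 100 = best health state)."""
    burden = [sum(weights[p] for p in ep) for ep in episodes]
    n = len(episodes)
    ranks = sorted(range(n), key=lambda i: -burden[i])   # heaviest burden first
    index = [0.0] * n
    for rank, i in enumerate(ranks):
        index[i] = 100.0 * rank / (n - 1)
    return index

# hypothetical adverse-event parameters and weights
weights = {"hospitalisation": 3.0, "switch": 2.0, "suicide_attempt": 5.0}
episodes = [{"hospitalisation", "suicide_attempt"},   # burden 8.0 -> worst
            {"switch"},                               # burden 2.0
            set()]                                    # burden 0.0 -> best
print(dhsi(episodes, weights))   # [0.0, 50.0, 100.0]
```

Step (6), the regression model that reproduces the index in a new sample, would be fitted on these standardized scores.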
Evidence generation from healthcare databases: recommendations for managing change.
Bourke, Alison; Bate, Andrew; Sauer, Brian C; Brown, Jeffrey S; Hall, Gillian C
2016-07-01
There is an increasing reliance on databases of healthcare records for pharmacoepidemiology and other medical research, and such resources are often accessed over a long period of time, so it is vital to consider the impact of changes in the data, the access methodology and the environment. The authors discuss the communication and management of change, and provide a checklist of issues to consider for both database providers and users. The scope of the paper is database research, and changes are considered in relation to the three main components of database research: the data content itself, how it is accessed, and the support and tools needed to use the database. Copyright © 2016 John Wiley & Sons, Ltd.
An ex post facto evaluation framework for place-based police interventions.
Braga, Anthony A; Hureau, David M; Papachristos, Andrew V
2011-12-01
A small but growing body of research evidence suggests that place-based police interventions generate significant crime control gains. While place-based policing strategies have been adopted by a majority of U.S. police departments, very few agencies make a priori commitments to rigorous evaluations. Recent methodological developments were applied to conduct a rigorous ex post facto evaluation of the Boston Police Department's Safe Street Team (SST) hot spots policing program. A nonrandomized quasi-experimental design was used to evaluate the violent crime control benefits of the SST program at treated street segments and intersections relative to untreated street segments and intersections. Propensity score matching techniques were used to identify comparison places in Boston, and growth curve regression models were used to analyze violent crime trends at treatment places relative to control places. Using computerized mapping and database software, a micro-level place database of violent index crimes at all street segments and intersections in Boston was created. Yearly counts of violent index crimes between 2000 and 2009 at the treatment and comparison street segments and intersections served as the key outcome measure. The SST program was associated with a statistically significant reduction in violent index crimes at the treatment places relative to the comparison places, without displacing crime into proximate areas. To overcome the challenges of evaluation in real-world settings, evaluators need to continuously develop innovative approaches that take advantage of new theoretical and methodological developments.
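The matching step can be sketched as greedy 1:1 nearest-neighbour selection on precomputed propensity scores (hypothetical segment IDs and scores; the study's exact matching procedure may differ):

```python
def nn_match(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on propensity scores.

    treated, controls: dicts of {unit_id: propensity_score}.
    Returns {treated_id: control_id}; controls are used without
    replacement, and pairs farther apart than `caliper` are dropped."""
    available = dict(controls)
    matches = {}
    # match treated units in score order to keep the sketch deterministic
    for t_id, t_score in sorted(treated.items(), key=lambda kv: kv[1]):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        if abs(available[c_id] - t_score) <= caliper:
            matches[t_id] = c_id
            del available[c_id]
    return matches

treated = {"seg_A": 0.81, "seg_B": 0.40}
controls = {"seg_X": 0.79, "seg_Y": 0.42, "seg_Z": 0.10}
print(nn_match(treated, controls))   # seg_B pairs with seg_Y, seg_A with seg_X
```

Outcome trends at the matched pairs would then be compared, here via growth curve regression models.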
Reusable Rocket Engine Operability Modeling and Analysis
NASA Technical Reports Server (NTRS)
Christenson, R. L.; Komar, D. R.
1998-01-01
This paper describes the methodology, model, input data, and analysis results of a reusable launch vehicle engine operability study conducted with the goal of supporting design from an operations perspective. Paralleling performance analyses in schedule and method, this requires the use of metrics in a validated operations model useful for design, sensitivity, and trade studies. Operations analysis in this view is one of several design functions. An operations concept was developed for a given engine concept, and the predicted operations and maintenance processes were incorporated into simulation models. Historical operations data at a level of detail suitable to the model objectives were collected, analyzed, and formatted for use with the models; the simulations were run; and the results were collected and presented. The input data included scheduled and unscheduled timeline and resource information collected into a Space Transportation System (STS) Space Shuttle Main Engine (SSME) historical launch operations database. The results underscore the importance not only of reliable hardware but also of improvements to operations and corrective maintenance processes.
Fazio, Simone; Garraín, Daniel; Mathieux, Fabrice; De la Rúa, Cristina; Recchioni, Marco; Lechón, Yolanda
2015-01-01
Under the framework of the European Platform on Life Cycle Assessment, the European Reference Life-Cycle Database (ELCD - developed by the Joint Research Centre of the European Commission) provides core Life Cycle Inventory (LCI) data from front-running EU-level business associations and other sources. The ELCD contains energy-related data on power and fuels. This study describes the methods to be used for the quality analysis of energy data for European markets (available in third-party LC databases and from authoritative sources) that are, or could be, used in the context of the ELCD. The methodology was developed and tested on the energy datasets most relevant for the EU context, derived from GaBi (the reference database used to derive datasets for the ELCD), Ecoinvent, E3 and Gemis. The criteria for the database selection were based on the availability of EU-related data, the inclusion of comprehensive datasets on energy products and services, and the general approval of the LCA community. The proposed approach was based on the quality indicators developed within the International Reference Life Cycle Data System (ILCD) Handbook, further refined to facilitate their use in the analysis of energy systems. The overall Data Quality Rating (DQR) of an energy dataset can be calculated by summing the quality ratings (ranging from 1 to 5, where 1 represents very good and 5 very poor quality) of each of the quality criteria indicators and dividing by the total number of indicators considered. The quality of each dataset can be estimated for each indicator and then compared across the different databases/sources. The results can be used to highlight the weaknesses of each dataset and to guide further improvements that enhance data quality with regard to the established criteria. This paper describes the application of the methodology to two exemplary datasets, in order to show the potential of the methodological approach.
The analysis helps LCA practitioners to evaluate the usefulness of the ELCD datasets for their purposes, and dataset developers and reviewers to derive information that will help improve the overall DQR of databases.
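The DQR described above reduces to a simple mean over indicator ratings; a sketch with hypothetical indicator names and ratings:

```python
def data_quality_rating(ratings):
    """Overall DQR: the mean of per-indicator quality ratings
    (1 = very good ... 5 = very poor), summed and divided by
    the number of indicators considered."""
    return sum(ratings.values()) / len(ratings)

# hypothetical ratings for one energy dataset
ratings = {"technological": 2, "geographical": 1, "time": 3,
           "completeness": 2, "precision": 4, "methodology": 2}
print(round(data_quality_rating(ratings), 2))   # 2.33
```

Comparing the per-indicator ratings across databases, rather than only the overall mean, is what exposes where a given dataset is weak.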
RoadPlex: A Mobile VGI Game to Collect and Validate Data for POIs
NASA Astrophysics Data System (ADS)
Kashian, A.; Rajabifard, A.; Richter, K. F.
2014-11-01
With the increasing popularity of smartphones equipped with GPS sensors, more volunteers are expected to join VGI (Volunteered Geographic Information) activities, and therefore more positional data will be collected in less time. Current statistics from open databases such as OpenStreetMap reveal that although there has been exponential growth in the number of contributed POIs (Points of Interest), the lack of detailed attribute information is immediately visible. The process of adding attribute information to VGI databases is usually considered a boring task, and it is believed that contributors do not experience a similar level of satisfaction when they add such detailed information compared to tasks like adding new roads or copying building boundaries from satellite imagery. In other crowdsourcing projects, different approaches are taken to engage contributors in problem solving by embedding the tasks inside a game. In the literature, this concept is known as "gamification" or "games with a purpose", which encapsulates the idea of entertaining contributors while they complete a particular defined task. The same concept was used to design a mobile application called "RoadPlex", which aims to collect general or specific attribute information for POIs. The increased number of contributions in the past few months confirms that the design characteristics and the methodology of the game are appealing to players. Such growth enables us to evaluate the quality of the generated data by mining the database of answered questions. This paper presents some of the contribution results and emphasises the importance of using the gamification concept in the domain of VGI.
Spinal Manipulative Therapy and Sports Performance Enhancement: A Systematic Review.
Botelho, Marcelo B; Alvarenga, Bruno A P; Molina, Nícolly; Ribas, Marcos; Baptista, Abrahão F
2017-09-01
The purpose of this study was to review the literature regarding the relationship between spinal manipulative therapy (SMT) and sports performance. The PubMed and Embase databases were searched for original studies published up to July 2016. Inclusion criteria were that SMT had been applied to athletes and that any sports performance-related outcome was measured. Of the 581 potential studies, 7 clinical trials were selected. Most studies had adequate quality (≥6/11) when assessed by the PEDro scale. None of the studies assessed performance at an event or competition. Four studies revealed improvement in a sports performance test after SMT. Meta-analysis could not be performed because of the wide differences in methodologies, design, and outcomes measured. Spinal manipulative therapy influences a wide range of neurophysiological parameters that could be associated with sports performance. Of the 3 studies in which SMT did not improve test performance, 2 used SMT not for therapeutic correction of a dysfunctional vertebral joint but applied it to an arbitrarily preselected joint. Although 4 of 7 studies showed that SMT improved sports performance tests, the evidence is still weak to support its use. Spinal manipulative therapy may be a promising approach for performance enhancement that should be investigated with more consistent methodologic designs. Copyright © 2017. Published by Elsevier Inc.
South Korean anthropometric data and survey methodology: 'Size Korea' project.
Kim, Jung Yong; You, Jae Woo; Kim, Mi Sook
2017-11-01
Considering the many emerging markets in East Asia, access to contemporary anthropometric data for this region is important for designers and manufacturers seeking to produce the best-fitting products and living environments for consumers. The purpose of this paper is to describe Korean anthropometric data collection and survey techniques for those who are interested in ethnic characteristics, conducting surveys, and formulating ergonomic product designs for South Korean and, more broadly, East Asian populations. The Size Korea survey was conducted in 2003-2004 and 2010. A total of 14,200 civilians aged 0-90 years participated in the survey, with 119 body and weight dimensions measured in 2004. Twenty new dimensions from InBody measurement were added in 2010, and the data are continuously updated. We referred to ISO 7250, 8559 and 15535 to ensure validity and reliability. Fifty major body dimensions, including weight, are summarised in this paper, and 34 of these dimensions can be compared with 11 multinational datasets already reported in other publications. Practitioner Summary: This paper presents an up-to-date anthropometric database of East Asian physical characteristics and the survey methodology. These data satisfy the ISO standards and comprise 50 physical dimensions including weight. Thirty-four of these dimensions can be directly compared with available multinational data.
Roe, Justin W G; Carding, Paul N; Dwivedi, Raghav C; Kazi, Rehan A; Rhys-Evans, Peter H; Harrington, Kevin J; Nutting, Christopher M
2010-10-01
A systematic review to establish what evidence is available for swallowing outcomes following IMRT for head and neck cancer. Online electronic databases were searched to identify papers published in English from January 1998 to December 2009. Papers were independently appraised by two reviewers for methodological quality, method of swallowing evaluation and categorized according to the World Health Organisation's International Classification of Health Functions. The impact of radiation dose to dysphagia aspiration risk structures (DARS) was also evaluated. Sixteen papers met the inclusion criteria. The literature suggests that limiting the radiation dose to certain structures may result in favourable swallowing outcomes. Methodological limitations included variable assessment methods and outcome measures and heterogeneity of patients. There are only limited prospective data, especially where pre-treatment measures have been taken and compared to serial post-treatment assessment. Few studies have investigated the impact of IMRT on swallow function and the impact on everyday life. Initial studies have reported potential benefits but are limited in terms of study design and outcome data. Further well designed, prospective, longitudinal swallowing studies including multidimensional evaluation methods are required to enable a more comprehensive understanding of dysphagia complications and inform pre-treatment counselling and rehabilitation planning. Copyright © 2010 Elsevier Ltd. All rights reserved.
Use of geographic information systems in rabies vaccination campaigns.
Grisi-Filho, José Henrique de Hildebrand e; Amaku, Marcos; Dias, Ricardo Augusto; Montenegro Netto, Hildebrando; Paranhos, Noemia Tucunduva; Mendes, Maria Cristina Novo Campos; Ferreira Neto, José Soares; Ferreira, Fernando
2008-12-01
To develop a method to assist in the design and assessment of animal rabies control campaigns. A methodology was developed, based on geographic information systems, to estimate the animal (canine and feline) population and density per census tract and per subregion (known as "Subprefeituras") in the city of São Paulo (Southeastern Brazil) in 2002. The number of vaccination units required in a given region to achieve a certain vaccination coverage was also estimated. A census database was used for the human population, together with estimated dog:inhabitant and cat:inhabitant ratios. The estimated figures were 1,490,500 dogs and 226,954 cats in the city, i.e. an animal population density of 1,138.14 owned animals per km(2). In the 2002 campaign, 926,462 animals were vaccinated, corresponding to a vaccination coverage of 54%. The estimated number of vaccination units required to reach 70% vaccination coverage, with each unit vaccinating 700 animals on average, was 1,729. These estimates are presented as maps of animal density according to census tracts and "Subprefeituras". The methodology used in the study may be applied in a systematic way to the design and evaluation of rabies vaccination campaigns, enabling the identification of areas of critical vaccination coverage.
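The coverage arithmetic in this abstract can be sketched as follows. The figures are taken from the abstract; the rounding conventions are assumptions, and the aggregate citywide estimate comes out slightly below the reported 1,729 units, which likely reflects per-region rounding.

```python
# Coverage arithmetic for the 2002 São Paulo campaign, using figures
# reported in the abstract. Rounding conventions are assumptions.
dogs = 1_490_500
cats = 226_954
total_animals = dogs + cats                 # estimated owned-animal population

vaccinated = 926_462
coverage = vaccinated / total_animals
print(f"2002 coverage: {coverage:.0%}")     # ~54%, as reported

target = 0.70                               # desired vaccination coverage
animals_per_unit = 700                      # average animals vaccinated per unit
units = target * total_animals / animals_per_unit
print(f"Units for 70% coverage: about {units:.0f}")  # ~1,717 citywide (abstract: 1,729)
```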
The relationship between ground conditions and injury: what level of evidence do we have?
Petrass, Lauren A; Twomey, Dara M
2013-03-01
To identify studies addressing the relationship between ground conditions and injury in a sporting context, to evaluate current practice, and to provide recommendations for future studies that measure ground conditions and injury risk. Systematic review. A comprehensive search of electronic databases from the earliest records available until the end of 2011, supplemented by hand searching, was conducted to identify relevant studies. A classification scale was used to rate the methodological quality of studies. 79 potentially relevant articles were identified, and 27 met all inclusion criteria. They varied in methodological quality, with analytical observational studies the most common design, although four descriptive observational studies, considered to be of lower quality, were also identified. Only five studies objectively measured ground conditions, and of the studies that used subjective assessment, only one provided descriptors to explain its classifications. It appears that harder/drier grounds are associated with an increased injury risk, but the presence of major limitations necessitates cautious interpretation of many key findings. There is limited high quality evidence on the relationship between injury risk and ground conditions. Further research with high quality designs and measurement of ground conditions is required to draw more definitive conclusions regarding this relationship. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.
Valentine, Jeffrey C; Cooper, Harris
2008-06-01
Assessments of studies meant to evaluate the effectiveness of interventions, programs, and policies can serve an important role in the interpretation of research results. However, evidence suggests that available quality assessment tools have poor measurement characteristics and can lead to opposing conclusions when applied to the same body of studies. These tools tend to (a) be insufficiently operational, (b) rely on arbitrary post-hoc decision rules, and (c) result in a single number to represent a multidimensional construct. In response to these limitations, a multilevel and hierarchical instrument was developed in consultation with a wide range of methodological and statistical experts. The instrument focuses on the operational details of studies and results in a profile of scores instead of a single score to represent study quality. A pilot test suggested that satisfactory between-judge agreement can be obtained using well-trained raters working in naturalistic conditions. Limitations of the instrument are discussed, but these are inherent in making decisions about study quality given incomplete reporting and in the absence of strong, contextually based information about the effects of design flaws on study outcomes. (PsycINFO Database Record (c) 2008 APA, all rights reserved).
ERIC Educational Resources Information Center
Taft, Laritza M.
2010-01-01
In its report "To Err is Human", The Institute of Medicine recommended the implementation of internal and external voluntary and mandatory automatic reporting systems to increase detection of adverse events. Knowledge Discovery in Databases (KDD) allows the detection of patterns and trends that would be hidden or less detectable if analyzed by…
USDA-ARS?s Scientific Manuscript database
The Dietary Supplement Ingredient Database (DSID) is a federal initiative to provide analytical validation of ingredients in dietary supplements. The first release on vitamins and minerals in adult MVMs is now available. Multiple lots of >100 representative adult MVMs were chemically analyzed for ...
ERIC Educational Resources Information Center
Uzunboylu, Huseyin; Genc, Zeynep
2017-01-01
The purpose of this study is to determine the recent trends in foreign language learning through mobile learning. The study was conducted employing document analysis and related content analysis among the qualitative research methodology. Through the search conducted on Scopus database with the key words "mobile learning and foreign language…
Towards a Methodology for the Design of Multimedia Public Access Interfaces.
ERIC Educational Resources Information Center
Rowley, Jennifer
1998-01-01
Discussion of information systems methodologies that can contribute to interface design for public access systems covers: the systems life cycle; advantages of adopting information systems methodologies; soft systems methodologies; task-oriented approaches to user interface design; holistic design, the Star model, and prototyping; the…
DPP-4 inhibitors for the treatment of type 2 diabetes: a methodology overview of systematic reviews.
Ling, Juan; Ge, Long; Zhang, Ding-Hua; Wang, Yong-Feng; Xie, Zhuo-Lin; Tian, Jin-Hui; Xiao, Xiao-Hui; Yang, Ke-Hu
2018-06-01
To evaluate the methodological quality of systematic reviews (SRs), and summarize evidence of important outcomes from dipeptidyl peptidase-4 inhibitors (DPP4-I) in treating type 2 diabetes mellitus (T2DM). We included SRs of DPP4-I for the treatment of T2DM published until January 2018 by searching the Cochrane Library, PubMed, EMBASE and three Chinese databases. We evaluated methodological quality with the AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool and the GRADE (Grading of Recommendations Assessment, Development and Evaluation) approach. Sixty-three SRs (a total of 2,603,140 participants) of DPP4-I for the treatment of T2DM were included. The AMSTAR results showed that the lowest-scoring item was "providing a list of studies (included and excluded)", satisfied by only one (1.6%) review, followed by "providing an a priori design", satisfied by only four (6.3%) reviews, and "using the status of publication (grey literature) as an inclusion criterion", satisfied by only 18 (28.9%). Only seven (11.1%) reviews scored more than nine points on AMSTAR, indicating high methodological quality. For GRADE, of the 128 outcomes, high quality evidence was provided in only 28 (21.9%), moderate in 70 (54.7%), low in 27 (21.1%), and very low in three (2.3%). The methodological quality of SRs of DPP4-I for type 2 diabetes mellitus is not high and there are common areas for improvement. Furthermore, the quality of the evidence is mostly moderate and more high quality evidence is needed.
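As a quick check on the GRADE distribution reported in this abstract, the per-level percentages can be recomputed from the raw counts (counts taken from the abstract):

```python
# GRADE evidence-quality distribution over the 128 outcomes reported above.
grade_counts = {"high": 28, "moderate": 70, "low": 27, "very low": 3}
total = sum(grade_counts.values())
assert total == 128
for level, n in grade_counts.items():
    print(f"{level}: {n} ({n / total:.1%})")
# high: 28 (21.9%), moderate: 70 (54.7%), low: 27 (21.1%), very low: 3 (2.3%)
```

The recomputed percentages match the abstract exactly.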
Comparison of tiered formularies and reference pricing policies: a systematic review
Morgan, Steve; Hanley, Gillian; Greyson, Devon
2009-01-01
Objectives To synthesize methodologically comparable evidence from the published literature regarding the outcomes of tiered formularies and therapeutic reference pricing of prescription drugs. Methods We searched the following electronic databases: ABI/Inform, CINAHL, Clinical Evidence, Digital Dissertations & Theses, Evidence-Based Medicine Reviews (which incorporates ACP Journal Club, Cochrane Central Register of Controlled Trials, Cochrane Database of Systematic Reviews, Cochrane Methodology Register, Database of Abstracts of Reviews of Effectiveness, Health Technology Assessments and NHS Economic Evaluation Database), EconLit, EMBASE, International Pharmaceutical Abstracts, MEDLINE, PAIS International and PAIS Archive, and the Web of Science. We also searched the reference lists of relevant articles and several grey literature sources. We sought English-language studies published from 1986 to 2007 that examined the effects of either therapeutic reference pricing or tiered formularies, reported on outcomes relevant to patient care and cost-effectiveness, and employed quantitative study designs that included concurrent or historical comparison groups. We abstracted and assessed potentially appropriate articles using a modified version of the data abstraction form developed by the Cochrane Effective Practice and Organisation of Care Group. Results From an initial list of 2964 citations, 12 citations (representing 11 studies) were deemed eligible for inclusion in our review: 3 studies (reported in 4 articles) of reference pricing and 8 studies of tiered formularies. The introduction of reference pricing was associated with reduced plan spending, switching to preferred medicines, reduced overall drug utilization and short-term increases in the use of physician services. Reference pricing was not associated with adverse health impacts. 
The introduction of tiered formularies was associated with reduced plan expenditures, greater patient costs and increased rates of non-compliance with prescribed drug therapy. From the data available, we were unable to examine the hypothesis that tiered formulary policies result in greater use of physician services and potentially worse health outcomes. Conclusion The available evidence does not clearly differentiate between reference pricing and tiered formularies in terms of policy outcomes. Reference pricing appears to have a slight evidentiary advantage, given that patients’ health outcomes under tiered formularies have not been well studied and that tiered formularies are associated with increased rates of medicine discontinuation. PMID:21603047
Chen, Wei; Lewith, George; Wang, Li-qiong; Ren, Jun; Xiong, Wen-jing; Lu, Fang; Liu, Jian-ping
2014-01-01
Objective Chinese proprietary herbal medicines (CPHMs) have a long history in China for the treatment of the common cold, and many of them have been listed in the 'China national essential drug list' by the Chinese Ministry of Health. The aim of this review is to provide a well-rounded clinical evidence assessment of the potential benefits and harms of CPHMs for the common cold, based on a systematic literature search, to justify their clinical use and recommendation. Methods We searched CENTRAL, MEDLINE, EMBASE, SinoMed, CNKI, VIP, China Important Conference Papers Database, China Dissertation Database, and online clinical trial registry websites from their inception to 31 March 2013 for clinical studies of CPHMs listed in the 'China national essential drug list' for common cold. There was no restriction on study design. Results A total of 33 CPHMs were listed in the 'China national essential drug list 2012' for the treatment of common cold but only 7 had supportive clinical evidence. A total of 6 randomised controlled trials (RCTs) and 7 case series (CSs) were included; no other study design was identified. All studies were conducted in China and published in Chinese between 1995 and 2012. All included studies had poor study design and methodological quality, and were graded as very low quality. Conclusions The use of CPHMs for the common cold is not supported by robust evidence. Further rigorous, well designed, placebo-controlled, randomized trials are needed to substantiate the clinical claims made for CPHMs. PMID:25329481
Development and characterization of a 3D high-resolution terrain database
NASA Astrophysics Data System (ADS)
Wilkosz, Aaron; Williams, Bryan L.; Motz, Steve
2000-07-01
A top-level description of methods used to generate elements of a high resolution 3D characterization database is presented. The database elements are defined as ground plane elevation map, vegetation height elevation map, material classification map, discrete man-made object map, and temperature radiance map. The paper will cover data collection by means of aerial photography, techniques of soft photogrammetry used to derive the elevation data, and the methodology followed to generate the material classification map. The discussion will feature the development of the database elements covering Fort Greely, Alaska. The developed databases are used by the US Army Aviation and Missile Command to evaluate the performance of various missile systems.
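As an illustration only (field names and types are assumptions, not taken from the paper), the five database elements described above could be modelled as co-registered 2-D grids covering the same terrain footprint:

```python
# Illustrative sketch (names/types assumed): the five database elements as
# co-registered 2-D grids over the same terrain footprint.
from dataclasses import dataclass

def grid(rows, cols, fill=0.0):
    return [[fill] * cols for _ in range(rows)]

@dataclass
class TerrainDatabase:
    rows: int
    cols: int
    def __post_init__(self):
        self.ground_elevation_m = grid(self.rows, self.cols)    # ground plane elevation map
        self.vegetation_height_m = grid(self.rows, self.cols)   # vegetation height elevation map
        self.material_class = grid(self.rows, self.cols, 0)     # material classification map (class IDs)
        self.manmade_object_id = grid(self.rows, self.cols, 0)  # discrete man-made object map
        self.temperature_radiance = grid(self.rows, self.cols)  # temperature radiance map

db = TerrainDatabase(rows=2, cols=3)
db.ground_elevation_m[0][0] = 412.5   # metres above datum (illustrative value)
db.vegetation_height_m[0][0] = 14.0   # canopy height at the same cell
# sensor-visible surface height at a cell = ground elevation + vegetation height
print(db.ground_elevation_m[0][0] + db.vegetation_height_m[0][0])  # 426.5
```

Keeping every layer on one grid, as in this sketch, is what lets a simulation sample all five attributes for any terrain cell in a single lookup.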
Automating Relational Database Design for Microcomputer Users.
ERIC Educational Resources Information Center
Pu, Hao-Che
1991-01-01
Discusses issues involved in automating the relational database design process for microcomputer users and presents a prototype of a microcomputer-based system (RA, Relation Assistant) that is based on expert systems technology and helps avoid database maintenance problems. Relational database design is explained and the importance of easy input…
ESARR: enhanced situational awareness via road sign recognition
NASA Astrophysics Data System (ADS)
Perlin, V. E.; Johnson, D. B.; Rohde, M. M.; Lupa, R. M.; Fiorani, G.; Mohammad, S.
2010-04-01
The enhanced situational awareness via road sign recognition (ESARR) system provides vehicle position estimates in the absence of a GPS signal via automated processing of roadway fiducials (primarily directional road signs). Sign images are detected and extracted from a vehicle-mounted camera system, then preprocessed and read via a custom optical character recognition (OCR) system specifically designed to cope with low quality input imagery. Vehicle motion and 3D scene geometry estimation enable efficient and robust sign detection with low false alarm rates. Multi-level text processing coupled with GIS database validation enables effective interpretation of even extremely low resolution, low contrast sign images. In this paper, progress on ESARR development is reported, including the design and architecture, image processing framework, localization methodologies, and results to date. Highlights of the real-time vehicle-based directional road-sign detection and interpretation system are described along with the challenges and progress in overcoming them.
Hospital Nurses' Work Environment Characteristics and Patient Safety Outcomes: A Literature Review.
Lee, Seung Eun; Scott, Linda D
2018-01-01
This integrative literature review assesses the relationship between hospital nurses' work environment characteristics and patient safety outcomes and recommends directions for future research based on examination of the literature. Using an electronic search of five databases, 18 studies published in English between 1999 and 2016 were identified for review. All but one study used a cross-sectional design, and only four used a conceptual/theoretical framework to guide the research. No definition of work environment was provided in most studies. Differing variables and instruments were used to measure patient outcomes, and findings regarding the effects of work environment on patient outcomes were inconsistent. To clarify the relationship between nurses' work environment characteristics and patient safety outcomes, researchers should consider using a longitudinal study design, using a theoretical foundation, and providing clear operational definitions of concepts. Moreover, given the inconsistent findings of previous studies, they should choose their measurement methodologies with care.
Ju, Feng; Zhang, Tong
2015-11-03
Recent advances in DNA sequencing technologies have prompted the widespread application of metagenomics for the investigation of novel bioresources (e.g., industrial enzymes and bioactive molecules) and unknown biohazards (e.g., pathogens and antibiotic resistance genes) in natural and engineered microbial systems across multiple disciplines. This review discusses the rigorous experimental design and sample preparation in the context of applying metagenomics in environmental sciences and biotechnology. Moreover, this review summarizes the principles, methodologies, and state-of-the-art bioinformatics procedures, tools and database resources for metagenomics applications and discusses two popular strategies (analysis of unassembled reads versus assembled contigs/draft genomes) for quantitative or qualitative insights of microbial community structure and functions. Overall, this review aims to facilitate more extensive application of metagenomics in the investigation of uncultured microorganisms, novel enzymes, microbe-environment interactions, and biohazards in biotechnological applications where microbial communities are engineered for bioenergy production, wastewater treatment, and bioremediation.
Stang, Paul E; Ryan, Patrick B; Racoosin, Judith A; Overhage, J Marc; Hartzema, Abraham G; Reich, Christian; Welebob, Emily; Scarnecchia, Thomas; Woodcock, Janet
2010-11-02
The U.S. Food and Drug Administration (FDA) Amendments Act of 2007 mandated that the FDA develop a system for using automated health care data to identify risks of marketed drugs and other medical products. The Observational Medical Outcomes Partnership is a public-private partnership among the FDA, academia, data owners, and the pharmaceutical industry that is responding to the need to advance the science of active medical product safety surveillance by using existing observational databases. The Observational Medical Outcomes Partnership's transparent, open innovation approach is designed to systematically and empirically study critical governance, data resource, and methodological issues and their interrelationships in establishing a viable national program of active drug safety surveillance by using observational data. This article describes the governance structure, data-access model, methods-testing approach, and technology development of this effort, as well as the work that has been initiated.
Vedsted, P; Christensen, M B
2005-02-01
To describe the basis on which our knowledge of frequent attendance in general practice rests and to propose recommendations for further research on frequent attenders (FAs). The literature review (finished February 2004) encompassed peer-reviewed articles in English describing contacts with general practice in terms of frequency. Searches were performed in the Medline, CINAHL, EMBASE, PsycINFO, Social Sciences Expanded Index and ISI Citation databases with additional searches in reference lists and the 'related articles' function in the ISI Citation database and Medline. General practice. Sixty-one articles (54 studies). The articles were assessed according to the following design variables: setting; definition of FAs; sampling; sample size; control groups; study aim; study design; data sources; effect measure; and main results. There was no generally accepted definition of frequent attendance. Research designs differed substantially. Eight articles gave sufficient information on all design variables. The top 10% of attenders accounted for 30-50% of all contacts, and up to 40% of FAs were still FAs the following year. More than 50% of FAs had a physical disease, more than 50% of FAs suffered from psychological distress, social factors (low social support, unemployment, divorce) were associated with frequent attendance in more than 50% of FAs, multiproblems (physical, psychological and social) were found in one-third of FAs, and frequent attendance was associated with increasing age and female gender. The diversity of designs, definitions and methods in the current literature on FAs in general practice hampers comparison of their precision, validity and generalizability, and calls for cautious interpretation and adoption of a common, generally acceptable definition in future studies.
Huang, Xiaoyan; O'Connor, Margaret; Ke, Li-Shan; Lee, Susan
2016-05-01
The right of children to have their voices heard has been accepted by researchers, and there are increasing numbers of qualitative health studies involving children. The ethical and methodological issues of including children in research have caused worldwide concern, and many researchers have published articles sharing their own experiences. To systematically review and synthesise experts' opinions and experiences regarding the ethical and methodological issues of including children in research, as well as related solution strategies. The research design was a systematic review of opinion-based evidence, based on the guidelines of the Joanna Briggs Institute. A search of five computerised databases was conducted in April 2014 and 2271 articles were found. After screening the titles, abstracts and full texts and appraising the quality, 30 articles were finally included in the review. A meta-aggregative approach was applied in the data analysis and synthesis process. Ethical approval was not needed as this is a systematic review of published literature. Six themes were identified: evaluating potential risks and benefits, gaining access, obtaining informed consent/assent, protecting confidentiality and privacy, building rapport, and collecting rich data. The similarities and differences between research involving children and that involving adults were indicated. All potential incentives should be justified when designing a study. Further studies need to research how to evaluate the individual capacity of children and how to balance protecting children's right to participate with their interests in the research. Cultural differences related to researching children in different regions should also be studied. © The Author(s) 2014.
Paternal Influences on Adolescent Sexual Risk Behaviors: A Structured Literature Review
Bouris, Alida; Lee, Jane; McCarthy, Katharine; Michael, Shannon L.; Pitt-Barnes, Seraphine; Dittus, Patricia
2012-01-01
BACKGROUND AND OBJECTIVE: To date, most parent-based research has neglected the role of fathers in shaping adolescent sexual behavior and has focused on mothers. The objective of this study was to conduct a structured review to assess the role of paternal influence on adolescent sexual behavior and to assess the methodological quality of the paternal influence literature related to adolescent sexual behavior. METHODS: We searched electronic databases: PubMed, PsycINFO, Social Services Abstracts, Family Studies Abstracts, Sociological Abstracts, and the Cumulative Index to Nursing and Allied Health Literature. Studies published between 1980 and 2011 that targeted adolescents 11 to 18 years and focused on paternal parenting processes were included. Methodological quality was assessed by using an 11-item scoring system. RESULTS: Thirteen articles were identified and reviewed. Findings suggest paternal factors are independently associated with adolescent sexual behavior relative to maternal factors. The most commonly studied paternal influence was emotional qualities of the father-adolescent relationship. Paternal communication about sex was most consistently associated with adolescent sexual behavior, whereas paternal attitudes about sex were least associated. Methodological limitations include a tendency to rely on cross-sectional design, nonprobability sampling methods, and focus on sexual debut versus broader sexual behavior. CONCLUSIONS: Existing research preliminarily suggests fathers influence the sexual behavior of their adolescent children; however, more rigorous research examining diverse facets of paternal influence on adolescent sexual behavior is needed. We provide recommendations for primary care providers and public health practitioners to better incorporate fathers into interventions designed to reduce adolescent sexual risk behavior. PMID:23071205
Mindfulness and perinatal mental health: A systematic review.
Hall, Helen G; Beattie, Jill; Lau, Rosalind; East, Christine; Anne Biro, Mary
2016-02-01
Perinatal stress is associated with adverse maternal and infant outcomes. Mindfulness training may offer a safe and acceptable strategy to support perinatal mental health. To critically appraise and synthesise the best available evidence regarding the effectiveness of mindfulness training during pregnancy to support perinatal mental health. The search for relevant studies was conducted in six electronic databases and in the grey literature. Eligible studies were assessed for methodological quality according to standardised critical appraisal instruments. Data were extracted and recorded on a pre-designed form and then entered into Review Manager. Nine studies were included in the data synthesis. It was not appropriate to combine the study results because of the variation in methodologies and the interventions tested. Statistically significant improvements were found in small studies of women undertaking mindfulness awareness training: in one study for stress (mean difference (MD) -5.28, 95% confidence interval (CI) -10.4 to -0.42, n=22), in two for depression (for example, MD -5.48, 95% CI -8.96 to -2.0, n=46) and in four for anxiety (for example, MD -6.50, 95% CI -10.95 to -2.05, n=32). However, the findings of this review are limited by significant methodological issues within the current research studies. There is insufficient evidence from high quality research on which to base recommendations about the effectiveness of mindfulness to promote perinatal mental health. The limited positive findings support the design and conduct of adequately powered, longitudinal randomised controlled trials, with active controls. Copyright © 2015 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.
Implementing the 4D cycle of appreciative inquiry in health care: a methodological review.
Trajkovski, Suza; Schmied, Virginia; Vickers, Margaret; Jackson, Debra
2013-06-01
To examine and critique how the phases of the 4D cycle (Discovery, Dream, Design, and Destiny) of appreciative inquiry are implemented in a healthcare context. Appreciative inquiry is a theoretical research perspective, an emerging research methodology and a world view that builds on action research, organizational learning, and organizational change. Increasing numbers of articles published provide insights and learning into its theoretical and philosophical underpinnings. Many articles describe appreciative inquiry and the outcomes of their studies; however, there is a gap in the literature examining the approaches commonly used to implement the 4D cycle in a healthcare context. A methodological review following systematic principles. A methodological review was conducted including articles from the inception of appreciative inquiry in 1986 to the time of writing this review in November, 2011. Key database searches included CINAHL, Emerald, MEDLINE, PubMed, PsycINFO, and Scopus. A methodological review following systematic principles was undertaken. Studies were included if they described in detail the methods used to implement the 4D cycle of appreciative inquiry in a healthcare context. Nine qualitative studies met the inclusion criteria. Results highlighted that appreciative inquiry application is unique and varied between studies. The 4D phases were not rigid steps and were adapted to the setting and participants. Overall, participant enthusiasm and commitment were highlighted suggesting appreciative inquiry was mostly positively perceived by participants. Appreciative inquiry provides a positive way forward shifting from problems to solutions offering a new way of practicing in health care and health research. © 2012 Blackwell Publishing Ltd.
Reynolds, Christopher R; Muggleton, Stephen H; Sternberg, Michael J E
2015-01-01
The use of virtual screening has become increasingly central to the drug development pipeline, with ligand-based virtual screening used to screen databases of compounds to predict their bioactivity against a target. These databases can only represent a small fraction of chemical space, and this paper describes a method of exploring synthetic space by applying virtual reactions to promising compounds within a database, and generating focussed libraries of predicted derivatives. A ligand-based virtual screening tool, Investigational Novel Drug Discovery by Example (INDDEx), is used as the basis for a system of virtual reactions. The use of virtual reactions is estimated to open up a potential space of 1.21×10^12 potential molecules. A de novo design algorithm known as Partial Logical-Rule Reactant Selection (PLoRRS) is introduced and incorporated into the INDDEx methodology. PLoRRS uses logical rules from the INDDEx model to select reactants for the de novo generation of potentially active products. The PLoRRS method is found to significantly increase the likelihood of retrieving molecules similar to known actives, with a p-value of 0.016. Case studies demonstrate that the virtual reactions produce molecules highly similar to known actives, including known blockbuster drugs. PMID:26583052
Welch, Janet L; Thomas-Hawkins, Charlotte
2005-07-01
We reviewed psycho-educational intervention studies that were designed to reduce interdialytic weight gain (IDWG) in adult hemodialysis patients. Our goals were to critique research methods, describe the effectiveness of tested interventions, and make recommendations for future research. Medline, PsychInfo, and the Cumulative Index to Nursing and Allied Health Literature (CINAHL) databases were searched to identify empirical work. Each study was evaluated in terms of sample, design, theoretical framework, intervention delivery, and outcome. Nine studies were reviewed. Self-monitoring appears to be a promising strategy for reducing IDWG. Theory was not usually used to guide interventions, designs generally had control groups, interventions were delivered individually, more than one intervention was delivered at a time, the duration of the intervention varied greatly, there was no long-term follow-up, IDWG was the only outcome, and IDWG was operationalized in different ways. Theoretical models and methodological rigor are needed to guide future research. Specific recommendations on design, measurement, and conceptual issues are offered to enhance the effectiveness of future research.
Livingston, Kara A.; Chung, Mei; Sawicki, Caleigh M.; Lyle, Barbara J.; Wang, Ding Ding; Roberts, Susan B.; McKeown, Nicola M.
2016-01-01
Background Dietary fiber is a broad category of compounds historically defined as partially or completely indigestible plant-based carbohydrates and lignin with, more recently, the additional criteria that fibers incorporated into foods as additives should demonstrate functional human health outcomes to receive a fiber classification. Thousands of research studies have been published examining fibers and health outcomes. Objectives (1) Develop a database listing studies testing fiber and physiological health outcomes identified by experts at the Ninth Vahouny Conference; (2) Use evidence mapping methodology to summarize this body of literature. This paper summarizes the rationale, methodology, and resulting database. The database will help both scientists and policy-makers to evaluate evidence linking specific fibers with physiological health outcomes, and identify missing information. Methods To build this database, we conducted a systematic literature search for human intervention studies published in English from 1946 to May 2015. Our search strategy included a broad definition of fiber search terms, as well as search terms for nine physiological health outcomes identified at the Ninth Vahouny Fiber Symposium. Abstracts were screened using a priori defined eligibility criteria and a low threshold for inclusion to minimize the likelihood of rejecting articles of interest. Publications were then reviewed in full text, applying additional a priori defined exclusion criteria. The database was built and published on the Systematic Review Data Repository (SRDR™), a web-based, publicly available application. Conclusions A fiber database was created. This resource will reduce the unnecessary replication of effort in conducting systematic reviews by serving as both a central database archiving PICO (population, intervention, comparator, outcome) data on published studies and as a searchable tool through which these data can be extracted and updated. PMID:27348733
Hukkerikar, Amol Shivajirao; Kalakul, Sawitree; Sarup, Bent; Young, Douglas M; Sin, Gürkan; Gani, Rafiqul
2012-11-26
The aim of this work is to develop a group-contribution(+) (GC(+)) method (combined group-contribution (GC) method and atom connectivity index (CI) method) based property models to provide reliable estimations of environment-related properties of organic chemicals together with uncertainties of estimated property values. For this purpose, a systematic methodology for property modeling and uncertainty analysis is used. The methodology includes a parameter estimation step to determine parameters of property models and an uncertainty analysis step to establish statistical information about the quality of parameter estimation, such as the parameter covariance, the standard errors in predicted properties, and the confidence intervals. For parameter estimation, large data sets of experimentally measured property values of a wide range of chemicals (hydrocarbons, oxygenated chemicals, nitrogenated chemicals, polyfunctional chemicals, etc.) taken from the database of the US Environmental Protection Agency (EPA) and from the database of USEtox are used. For property modeling and uncertainty analysis, the Marrero and Gani GC method and atom connectivity index method have been considered. In total, 22 environment-related properties, which include the fathead minnow 96-h LC(50), Daphnia magna 48-h LC(50), oral rat LD(50), aqueous solubility, bioconcentration factor, permissible exposure limit (OSHA-TWA), photochemical oxidation potential, global warming potential, ozone depletion potential, acidification potential, emission to urban air (carcinogenic and noncarcinogenic), emission to continental rural air (carcinogenic and noncarcinogenic), emission to continental fresh water (carcinogenic and noncarcinogenic), emission to continental seawater (carcinogenic and noncarcinogenic), emission to continental natural soil (carcinogenic and noncarcinogenic), and emission to continental agricultural soil (carcinogenic and noncarcinogenic) have been modeled and analyzed.
The application of the developed property models for the estimation of environment-related properties and uncertainties of the estimated property values is highlighted through an illustrative example. The developed property models provide reliable estimates of environment-related properties needed to perform process synthesis, design, and analysis of sustainable chemical processes and allow one to evaluate the effect of uncertainties of estimated property values on the calculated performance of processes giving useful insights into quality and reliability of the design of sustainable processes.
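The core of a linear group-contribution estimate, and the propagation of parameter covariance into a prediction confidence interval, can be sketched as follows. The group names, contribution values, and covariances below are invented for illustration and are not the fitted Marrero-Gani parameters.

```python
import math

# Illustrative (not fitted) group contributions and parameter covariance.
contributions = {"CH3": 0.55, "CH2": 0.30, "OH": 1.10}
covariance = {  # symmetric parameter covariance matrix, stored sparsely
    ("CH3", "CH3"): 0.0004, ("CH2", "CH2"): 0.0002, ("OH", "OH"): 0.0010,
    ("CH3", "CH2"): 0.0001, ("CH3", "OH"): 0.0, ("CH2", "OH"): 0.0,
}

def estimate(groups):
    """Return (estimated property, standard error) for a group-count dict."""
    value = sum(n * contributions[g] for g, n in groups.items())
    # Variance of a linear combination: var = n^T * Cov * n
    var = 0.0
    for g1, n1 in groups.items():
        for g2, n2 in groups.items():
            key = (g1, g2) if (g1, g2) in covariance else (g2, g1)
            var += n1 * n2 * covariance[key]
    return value, math.sqrt(var)

# 1-propanol fragmented (illustratively) as CH3 + 2 CH2 + OH
val, se = estimate({"CH3": 1, "CH2": 2, "OH": 1})
ci95 = (val - 1.96 * se, val + 1.96 * se)  # approximate 95% interval
```

The same shape, an estimate plus a covariance-derived interval, is what lets downstream process-design calculations carry property uncertainty through to their results.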
NASA Astrophysics Data System (ADS)
Huang, Xiao
2006-04-01
Today's and especially tomorrow's competitive launch vehicle design environment requires the development of a dedicated generic Space Access Vehicle (SAV) design methodology. A total of 115 industrial, research, and academic aircraft, helicopter, missile, and launch vehicle design synthesis methodologies have been evaluated. As the survey indicates, each synthesis methodology tends to focus on a specific flight vehicle configuration, thus precluding the key capability to systematically compare flight vehicle design alternatives. The aim of the research investigation is to provide decision-making bodies and the practicing engineer with a design process and toolbox for robust modeling and simulation of flight vehicles where the ultimate performance characteristics may hinge on numerical subtleties. This will enable the designer of a SAV for the first time to consistently compare different classes of SAV configurations on an impartial basis. This dissertation presents the development steps required towards a generic (configuration independent) hands-on flight vehicle conceptual design synthesis methodology. This process is developed such that it can be applied to any flight vehicle class if desired. In the present context, the methodology has been put into operation for the conceptual design of a tourist Space Access Vehicle. The case study illustrates elements of the design methodology and algorithm for the class of Horizontal Takeoff and Horizontal Landing (HTHL) SAVs. The HTHL SAV design application clearly outlines how the conceptual design process can be centrally organized, executed and documented with focus on design transparency, physical understanding and the capability to reproduce results. This approach offers the project lead and creative design team a management process and tool which iteratively refines the individual design logic chosen, leading to mature design methods and algorithms.
As illustrated, the HTHL SAV hands-on design methodology offers growth potential in that the same methodology can be continually updated and extended to other SAV configuration concepts, such as the Vertical Takeoff and Vertical Landing (VTVL) SAV class. Having developed, validated and calibrated the methodology for HTHL designs in the 'hands-on' mode, the report provides an outlook how the methodology will be integrated into a prototype computerized design synthesis software AVDS-PrADOSAV in a follow-on step.
Charpentier, Ronald R.; Moore, Thomas E.; Gautier, D.L.
2017-11-15
The methodological procedures used in the geologic assessments of the 2008 Circum-Arctic Resource Appraisal (CARA) were based largely on the methodology developed for the 2000 U.S. Geological Survey World Petroleum Assessment. The main variables were probability distributions for numbers and sizes of undiscovered accumulations with an associated risk of occurrence. The CARA methodology expanded on the previous methodology in providing additional tools and procedures more applicable to the many Arctic basins that have little or no exploration history. Most importantly, geologic analogs from a database constructed for this study were used in many of the assessments to constrain numbers and sizes of undiscovered oil and gas accumulations.
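The probabilistic assessment style described, a geologic risk of occurrence combined with distributions for numbers and sizes of undiscovered accumulations, lends itself to a simple Monte Carlo sketch. All distributions and parameter values here are invented for illustration and are not CARA inputs.

```python
import random

def simulate_totals(trials, p_occurrence, n_range, size_mu, size_sigma, rng):
    """Total undiscovered resource per trial (0 when the play is absent)."""
    totals = []
    for _ in range(trials):
        if rng.random() > p_occurrence:  # geologic risk: play absent this trial
            totals.append(0.0)
            continue
        n = rng.randint(*n_range)        # number of undiscovered accumulations
        totals.append(sum(rng.lognormvariate(size_mu, size_sigma)
                          for _ in range(n)))
    return sorted(totals)

rng = random.Random(42)
totals = simulate_totals(10_000, p_occurrence=0.7, n_range=(1, 12),
                         size_mu=3.0, size_sigma=1.0, rng=rng)
# Conventional resource fractiles: F95 (conservative), F50, F05 (optimistic)
f95, f50, f05 = totals[500], totals[5000], totals[9500]
```

With a 0.7 chance of occurrence, roughly 30% of trials contribute zero, so the conservative F95 fractile is zero while F50 and F05 reflect the size distribution.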
Comparing Top-Down with Bottom-Up Approaches: Teaching Data Modeling
ERIC Educational Resources Information Center
Kung, Hsiang-Jui; Kung, LeeAnn; Gardiner, Adrian
2013-01-01
Conceptual database design is a difficult task for novice database designers, such as students, and is also therefore particularly challenging for database educators to teach. In the teaching of database design, two general approaches are frequently emphasized: top-down and bottom-up. In this paper, we present an empirical comparison of students'…
Saltaji, Humam; Armijo-Olivo, Susan; Cummings, Greta G; Amin, Maryam; Flores-Mir, Carlos
2014-01-01
Introduction It is fundamental that randomised controlled trials (RCTs) are properly conducted in order to reach well-supported conclusions. However, there is emerging evidence that RCTs are subject to biases which can overestimate or underestimate the true treatment effect, due to flaws in the study design characteristics of such trials. The extent to which this holds true in oral health RCTs, which have some unique design characteristics compared to RCTs in other health fields, is unclear. As such, we aim to examine the empirical evidence quantifying the extent of bias associated with methodological and non-methodological characteristics in oral health RCTs. Methods and analysis We plan to perform a meta-epidemiological study, where a sample size of 60 meta-analyses (MAs) including approximately 600 RCTs will be selected. The MAs will be randomly obtained from the Oral Health Database of Systematic Reviews using a random number table; and will be considered for inclusion if they include a minimum of five RCTs, and examine a therapeutic intervention related to one of the recognised dental specialties. RCTs identified in selected MAs will be subsequently included if their study design includes a comparison between an intervention group and a placebo group or another intervention group. Data will be extracted from selected trials included in MAs based on a number of methodological and non-methodological characteristics. Moreover, the risk of bias will be assessed using the Cochrane Risk of Bias tool. Effect size estimates and measures of variability for the main outcome will be extracted from each RCT included in selected MAs, and a two-level analysis will be conducted using a meta-meta-analytic approach with a random effects model to allow for intra-MA and inter-MA heterogeneity. Ethics and dissemination The intended audiences of the findings will include dental clinicians, oral health researchers, policymakers and graduate students. 
These audiences will be introduced to the findings through workshops, seminars, round-table discussions and targeted individual meetings. Other opportunities for knowledge transfer, such as key dental conferences, will also be pursued. Finally, the results will be published as a scientific report in a dental peer-reviewed journal. PMID:24568962
Garrido-Martín, Diego; Pazos, Florencio
2018-02-27
The exponential accumulation of new sequences in public databases is expected to improve the performance of all the approaches for predicting protein structural and functional features. Nevertheless, this was never assessed or quantified for some widely used methodologies, such as those aimed at detecting functional sites and functional subfamilies in protein multiple sequence alignments. Using raw protein sequences as the only input, these approaches can detect fully conserved positions, as well as those with a family-dependent conservation pattern. Both types of residues are routinely used as predictors of functional sites and, consequently, understanding how the sequence content of the databases affects them is relevant and timely. In this work we evaluate how the growth and change with time in the content of sequence databases affect five sequence-based approaches for detecting functional sites and subfamilies. We do that by recreating historical versions of the multiple sequence alignments that would have been obtained in the past based on the database contents at different time points, covering a period of 20 years. Applying the methods to these historical alignments allows quantifying the temporal variation in their performance. Our results show that the number of families to which these methods can be applied sharply increases with time, while their ability to detect potentially functional residues remains almost constant. These results are informative for the methods' developers and final users, and may have implications in the design of new sequencing initiatives.
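A minimal example of the kind of sequence-based signal such methods exploit is per-column Shannon entropy of a multiple sequence alignment: fully conserved columns score zero. This is a generic illustration of conservation scoring, not one of the five benchmarked approaches.

```python
import math
from collections import Counter

def column_entropies(alignment):
    """Shannon entropy (bits) of each alignment column; 0 = fully conserved."""
    entropies = []
    for i in range(len(alignment[0])):
        counts = Counter(seq[i] for seq in alignment)
        total = sum(counts.values())
        h = -sum((c / total) * math.log2(c / total) for c in counts.values())
        entropies.append(h)
    return entropies

# Toy alignment of four equal-length sequences
msa = ["ACDG",
       "ACEG",
       "ACDG",
       "ACFG"]
h = column_entropies(msa)  # columns 0, 1, 3 conserved; column 2 variable
```

Because such scores are computed directly from alignment columns, the set of sequences retrieved from the database at a given point in time fully determines the result, which is what makes the paper's historical-alignment reconstruction possible.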
LAILAPS: the plant science search engine.
Esch, Maria; Chen, Jinbo; Colmsee, Christian; Klapperstück, Matthias; Grafahrend-Belau, Eva; Scholz, Uwe; Lange, Matthias
2015-01-01
With the number of sequenced plant genomes growing, the number of predicted genes and functional annotations is also increasing. The association between genes and phenotypic traits is currently of great interest. Unfortunately, the information available today is widely scattered over a number of different databases. Information retrieval (IR) has become an all-encompassing bioinformatics methodology for extracting knowledge from complex, heterogeneous and distributed databases, and therefore can be a useful tool for obtaining a comprehensive view of plant genomics, from genes to traits. Here we describe LAILAPS (http://lailaps.ipk-gatersleben.de), an IR system designed to link plant genomic data in the context of phenotypic attributes for detailed forward genetic research. LAILAPS comprises around 65 million indexed documents, encompassing >13 major life science databases with around 80 million links to plant genomic resources. The LAILAPS search engine allows fuzzy querying for candidate genes linked to specific traits over a loosely integrated system of indexed and interlinked genome databases. Query assistance and an evidence-based annotation system enable time-efficient and comprehensive information retrieval. An artificial neural network incorporating user feedback and behavior tracking allows relevance sorting of results. We fully describe LAILAPS's functionality and capabilities by comparing this system's performance with other widely used systems and by reporting both a validation in maize and a knowledge discovery use-case focusing on candidate genes in barley. © The Author 2014. Published by Oxford University Press on behalf of Japanese Society of Plant Physiologists.
Preliminary test results in support of integrated EPP and SMT design methods development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Yanli; Jetter, Robert I.; Sham, T. -L.
2016-02-09
The proposed integrated Elastic Perfectly-Plastic (EPP) and Simplified Model Test (SMT) methodology consists of incorporating a SMT data-based approach for creep-fatigue damage evaluation into the EPP methodology to avoid using the creep-fatigue interaction diagram (the D diagram) and to minimize over-conservatism while properly accounting for localized defects and stress risers. To support the implementation of the proposed code rules and to verify their applicability, a series of thermomechanical tests have been initiated. One test concept, the Simplified Model Test (SMT), takes into account the stress and strain redistribution in real structures by including representative follow-up characteristics in the test specimen. The second test concept is the two-bar thermal ratcheting test with cyclic loading at high temperatures using specimens representing key features of potential component designs. This report summarizes the previous SMT results on Alloy 617, SS316H and SS304H and presents the recent development of the SMT approach on Alloy 617. These SMT specimen data are also representative of component loading conditions and have been used as part of the verification of the proposed integrated EPP and SMT design methods development. The previous two-bar thermal ratcheting test results on Alloy 617 and SS316H are also summarized and the new results from two-bar thermal ratcheting tests on SS316H at a lower temperature range are reported.
Using SysML for MBSE analysis of the LSST system
NASA Astrophysics Data System (ADS)
Claver, Charles F.; Dubois-Felsmann, Gregory; Delgado, Francisco; Hascall, Pat; Marshall, Stuart; Nordby, Martin; Schalk, Terry; Schumacher, German; Sebag, Jacques
2010-07-01
The Large Synoptic Survey Telescope is a complex hardware-software system of systems, making up a highly automated observatory in the form of an 8.4m wide-field telescope, a 3.2 billion pixel camera, and a peta-scale data processing and archiving system. As a project, the LSST is using model based systems engineering (MBSE) methodology for developing the overall system architecture coded with the Systems Modeling Language (SysML). With SysML we use a recursive process to establish three-fold relationships between requirements, logical & physical structural component definitions, and overall behavior (activities and sequences) at successively deeper levels of abstraction and detail. Using this process we have analyzed and refined the LSST system design, ensuring the consistency and completeness of the full set of requirements and their match to associated system structure and behavior. As the recursion process proceeds to deeper levels we derive more detailed requirements and specifications, and ensure their traceability. We also expose, define, and specify critical system interfaces, physical and information flows, and clarify the logic and control flows governing system behavior. The resulting integrated model database is used to generate documentation and specifications and will evolve to support activities from construction through final integration, test, and commissioning, serving as a living representation of the LSST as designed and built. We discuss the methodology and present several examples of its application to specific systems engineering challenges in the LSST design.
Brunner, Emanuel; De Herdt, Amber; Minguet, Philippe; Baldew, Se-Sergio; Probst, Michel
2013-01-01
The primary purpose was to identify randomized controlled trials investigating cognitive behaviour therapy (CBT)-based treatments applied in acute/sub-acute low back pain (LBP). The secondary purpose was to analyse the methodological properties of the included studies, and to identify theory-based treatment strategies that are applicable for physiotherapists. A systematic literature search was conducted using four databases. Risk of bias of included studies was assessed and the methodological properties summarized. In addition, content and treatment theory of identified CBT-based strategies were systematically analysed and classified into three distinctive concepts of CBT: operant, cognitive and respondent treatment. Finally, applicability of treatment strategies in physiotherapy practice was discussed. Eight studies were included in the present systematic review. Half of the studies suffered from high risk of bias, and study characteristics varied in all domains of methodology, particularly in terms of treatment design and outcome measures. Graded activity, an operant treatment approach based on principles of operant conditioning, was identified as a CBT-based strategy with traceable theoretical justification that can be applied by physiotherapists. Operant conditioning can be integrated in ambulant physiotherapy practice and is a promising CBT-based strategy for the prevention of chronic LBP.
Acceptance and commitment therapy in the treatment of anxiety: a systematic review.
Swain, Jessica; Hancock, Karen; Hainsworth, Cassandra; Bowman, Jenny
2013-12-01
With a lifetime prevalence of approximately 17% among community-dwelling adults, anxiety disorders are among the most pervasive of contemporary psychiatric afflictions. Traditional Cognitive Behaviour Therapy (CBT) is currently the first-line evidence-based psychosocial intervention for the treatment of anxiety. Previous research, however, has found that a significant proportion of patients do not respond to traditional CBT or exhibit residual symptomatology at treatment cessation. Additionally, there is a paucity of evidence among child populations and for the comparative effectiveness of alternative interventions. Acceptance and Commitment Therapy (ACT) has a growing empirical base demonstrating its efficacy for an array of problems. A systematic review was conducted to examine the evidence for ACT in the treatment of anxiety. PsycINFO, PsycARTICLES, PsycEXTRA, Medline and Proquest databases were searched, reference lists examined and citation searches conducted. Two independent reviewers analysed results, determined study eligibility and assessed methodological quality. Thirty-eight studies met inclusion criteria (total n=323). The spectrum of DSM-IV anxiety disorders as well as test and public speaking anxiety were examined. Studies were predominantly between-group design and case studies, with few employing control comparisons. Several methodological issues limit conclusions; however results provide preliminary support for ACT. Larger scale, methodologically rigorous trials are needed to consolidate these findings. © 2013.
Community-based early warning systems for flood risk mitigation in Nepal
NASA Astrophysics Data System (ADS)
Smith, Paul J.; Brown, Sarah; Dugar, Sumit
2017-03-01
This paper focuses on the use of community-based early warning systems for flood resilience in Nepal. The first part of the work outlines the evolution and current status of these community-based systems, highlighting the limited lead times currently available for early warning. The second part of the paper focuses on the development of a robust operational flood forecasting methodology for use by the Nepal Department of Hydrology and Meteorology (DHM) to enhance early warning lead times. The methodology uses data-based physically interpretable time series models and data assimilation to generate probabilistic forecasts, which are presented in a simple visual tool. The approach is designed to work in situations of limited data availability with an emphasis on sustainability and appropriate technology. The successful application of the forecast methodology to the flood-prone Karnali River basin in western Nepal is outlined, increasing lead times from 2-3 to 7-8 h. The challenges faced in communicating probabilistic forecasts to the last mile of the existing community-based early warning systems across Nepal are discussed. The paper concludes with an assessment of the applicability of this approach in basins and countries beyond Karnali and Nepal and an overview of key lessons learnt from this initiative.
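The forecasting idea, a data-based time series model propagated as an ensemble to give probabilistic lead-time forecasts, can be sketched with a first-order autoregressive model. The coefficients, observations, and ensemble settings below are illustrative, not the operational Karnali configuration.

```python
import random

def fit_ar1(series):
    """Least-squares AR(1) coefficient for a (roughly zero-mean) series."""
    num = sum(a * b for a, b in zip(series[1:], series[:-1]))
    den = sum(a * a for a in series[:-1])
    return num / den

def ensemble_forecast(last, phi, sigma, steps, members, rng):
    """Propagate an ensemble `steps` ahead; return sorted final values."""
    finals = []
    for _ in range(members):
        x = last
        for _ in range(steps):
            x = phi * x + rng.gauss(0.0, sigma)  # model step plus noise
        finals.append(x)
    finals.sort()
    return finals

rng = random.Random(1)
obs = [1.0, 0.9, 0.85, 0.8, 0.78, 0.75]  # level anomalies above a datum (m)
phi = fit_ar1(obs)
ens = ensemble_forecast(obs[-1], phi, sigma=0.05, steps=8, members=1000, rng=rng)
p10, p50, p90 = ens[100], ens[500], ens[900]  # forecast quantile band
```

Presenting the band (p10-p90) rather than a single number is the essence of the probabilistic forecasts the paper discusses communicating to the last mile.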
Methods for heat transfer and temperature field analysis of the insulated diesel, phase 3
NASA Technical Reports Server (NTRS)
Morel, Thomas; Wahiduzzaman, Syed; Fort, Edward F.; Keribar, Rifat; Blumberg, Paul N.
1988-01-01
Work during Phase 3 of a program aimed at developing a comprehensive heat transfer and thermal analysis methodology for design analysis of insulated diesel engines is described. The overall program addresses all the key heat transfer issues: (1) spatially and time-resolved convective and radiative in-cylinder heat transfer, (2) steady-state conduction in the overall structure, and (3) cyclical and load/speed temperature transients in the engine structure. These are all accounted for in a coupled way together with cycle thermodynamics. This methodology was developed during Phases 1 and 2. During Phase 3, an experimental program was carried out to obtain data on heat transfer under cooled and insulated engine conditions and also to generate a database to validate the developed methodology. A single cylinder Cummins diesel engine was instrumented for instantaneous total heat flux and heat radiation measurements. Data were acquired over a wide range of operating conditions in two engine configurations. One was a cooled baseline. The other included ceramic-coated components (0.050-inch plasma-sprayed zirconia): piston, head, and valves. The experiments showed that the insulated engine has a smaller heat flux than the cooled one. The model predictions were found to be in very good agreement with the data.
Dodd, Lori E; Wagner, Robert F; Armato, Samuel G; McNitt-Gray, Michael F; Beiden, Sergey; Chan, Heang-Ping; Gur, David; McLennan, Geoffrey; Metz, Charles E; Petrick, Nicholas; Sahiner, Berkman; Sayre, Jim
2004-04-01
Cancer of the lung and bronchus is the leading fatal malignancy in the United States. Five-year survival is low, but treatment of early stage disease considerably improves chances of survival. Advances in multidetector-row computed tomography technology provide detection of smaller lung nodules and offer a potentially effective screening tool. The large number of images per exam, however, requires considerable radiologist time for interpretation and is an impediment to clinical throughput. Thus, computer-aided diagnosis (CAD) methods are needed to assist radiologists with their decision making. To promote the development of CAD methods, the National Cancer Institute formed the Lung Image Database Consortium (LIDC). The LIDC is charged with developing the consensus and standards necessary to create an image database of multidetector-row computed tomography lung images as a resource for CAD researchers. To develop such a prospective database, its potential uses must be anticipated. The ultimate applications will influence the information that must be included along with the images, the relevant measures of algorithm performance, and the number of required images. In this article we outline assessment methodologies and statistical issues as they relate to several potential uses of the LIDC database. We review methods for performance assessment and discuss issues of defining "truth" as well as the complications that arise when truth information is not available. We also discuss issues about sizing and populating a database.
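One of the standard performance-assessment statistics reviewed for CAD evaluation is the area under the ROC curve, which has a simple empirical (Mann-Whitney) form: the probability that a randomly chosen positive case scores above a randomly chosen negative case, with ties counting one half. The scores below are illustrative.

```python
def empirical_auc(pos_scores, neg_scores):
    """Empirical AUC: P(positive score > negative score), ties count 0.5."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

pos = [0.9, 0.8, 0.7, 0.6]   # illustrative CAD scores on true nodules
neg = [0.5, 0.6, 0.3, 0.2]   # illustrative CAD scores on non-nodules
auc = empirical_auc(pos, neg)
```

An AUC of 1.0 means perfect separation and 0.5 means chance; note the statistic presumes "truth" labels, which is exactly the complication the article raises when truth information is unavailable.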
Multi-parameter vital sign database to assist in alarm optimization for general care units.
Welch, James; Kanter, Benjamin; Skora, Brooke; McCombie, Scott; Henry, Isaac; McCombie, Devin; Kennedy, Rosemary; Soller, Babs
2016-12-01
Continual vital sign assessment on the general care, medical-surgical floor is expected to provide early indication of patient deterioration and increase the effectiveness of rapid response teams. However, there is concern that continual, multi-parameter vital sign monitoring will produce alarm fatigue. The objective of this study was the development of a methodology to help care teams optimize alarm settings. An on-body wireless monitoring system was used to continually assess heart rate, respiratory rate, SpO2 and noninvasive blood pressure in the general ward of ten hospitals between April 1, 2014 and January 19, 2015. These data, 94,575 h for 3430 patients, are contained in a large database, accessible with cloud computing tools. Simulation scenarios assessed the total alarm rate as a function of threshold and annunciation delay (s). The total alarm rate of ten alarms/patient/day predicted from the cloud-hosted database was the same as the total alarm rate for a 10 day evaluation (1550 h for 36 patients) in an independent hospital. Plots of vital sign distributions in the cloud-hosted database were similar to other large databases published by different authors. The cloud-hosted database can be used to run simulations for various alarm thresholds and annunciation delays to predict the total alarm burden experienced by nursing staff. This methodology might, in the future, be used to help reduce alarm fatigue without sacrificing the ability to continually monitor all vital signs.
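The threshold-plus-annunciation-delay simulation can be sketched directly: an alarm annunciates only when the signal stays beyond the threshold for the full delay, so short excursions are filtered out. The data and settings below are illustrative, not the study's alarm limits.

```python
def count_alarms(samples, threshold, delay_s):
    """Count annunciated high alarms in a 1 Hz sample stream.

    One alarm per excursion, and only if the value stays above `threshold`
    for at least `delay_s` consecutive samples.
    """
    alarms = 0
    run = 0
    for v in samples:
        if v > threshold:
            run += 1
            if run == delay_s:  # sustained long enough -> annunciate once
                alarms += 1
        else:
            run = 0
    return alarms

# Synthetic heart-rate trace: a 10 s excursion and a 3 s spike above 120 bpm
hr = [80] * 30 + [125] * 10 + [80] * 30 + [125] * 3 + [80] * 10
alarms_no_delay = count_alarms(hr, threshold=120, delay_s=1)    # both excursions
alarms_with_delay = count_alarms(hr, threshold=120, delay_s=5)  # spike filtered
```

Sweeping `threshold` and `delay_s` over a retrospective database of traces like this is what yields the predicted alarms/patient/day surface used to tune settings.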
Methodology for adding glycemic index and glycemic load values to 24-hour dietary recall database.
Olendzki, Barbara C; Ma, Yunsheng; Culver, Annie L; Ockene, Ira S; Griffith, Jennifer A; Hafner, Andrea R; Hebert, James R
2006-01-01
We describe a method of adding the glycemic index (GI) and glycemic load (GL) values to the nutrient database of the 24-hour dietary recall interview (24HR), a widely used dietary assessment. We also calculated daily GI and GL values from the 24HR. Subjects were 641 healthy adults from central Massachusetts who completed 9067 24HRs. The 24HR-derived food data were matched to the International Table of Glycemic Index and Glycemic Load Values. The GI values for specific foods not in the table were estimated against similar foods according to physical and chemical factors that determine GI. Mixed foods were disaggregated into individual ingredients. Of 1261 carbohydrate-containing foods in the database, GI values of 602 foods were obtained from a direct match (47.7%), accounting for 22.36% of dietary carbohydrate. GI values from 656 foods (52.1%) were estimated, contributing to 77.64% of dietary carbohydrate. The GI values from three unknown foods (0.2%) could not be assigned. The average daily GI was 84 (SD 5.1, white bread as referent) and the average GL was 196 (SD 63). Using this methodology for adding GI and GL values to nutrient databases, it is possible to assess associations between GI and/or GL and body weight and chronic disease outcomes (diabetes, cancer, heart disease). This method can be used in clinical and survey research settings where 24HRs are a practical means for assessing diet. This methodology also supports a broader evaluation of diet in relation to disease outcomes.
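The daily GL and carbohydrate-weighted daily GI arithmetic is straightforward to sketch: each food contributes GL = GI × available carbohydrate (g) / 100, and daily GI is the carbohydrate-weighted mean of the food GIs. The food names, GI values, and carbohydrate amounts below are illustrative, not entries from the International Table.

```python
# Illustrative foods from one 24HR: (name, GI with white bread = 100
# as referent, available carbohydrate in grams)
foods = [
    ("white rice", 83, 45.0),
    ("lentils",    41, 20.0),
    ("apple",      52, 21.0),
]

# Daily GL: sum of per-food glycemic loads
daily_gl = sum(gi * carbs / 100 for _, gi, carbs in foods)

# Daily GI: carbohydrate-weighted mean of the food GIs
total_carbs = sum(carbs for _, _, carbs in foods)
daily_gi = sum(gi * carbs for _, gi, carbs in foods) / total_carbs
```

Run over all foods reported in a recall, these two sums give exactly the per-day GI and GL summaries analyzed in the study.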
USDA-ARS?s Scientific Manuscript database
The purpose for developing the Food Intakes Converted to Retail Commodities Database (FICRCD) is to convert foods consumed in the national dietary surveys, 1994-2002, to respective amounts of retail-level food commodities. Food commodities are defined as those available for purchase in retail store...
Oliva, Jesús; Serrano, J Ignacio; del Castillo, M Dolores; Iglesias, Angel
2014-06-01
The diagnosis of mental disorders is in most cases very difficult because of the high heterogeneity and overlap between associated cognitive impairments. Furthermore, early and individualized diagnosis is crucial. In this paper, we propose a methodology to support the individualized characterization and diagnosis of cognitive impairments. The methodology can also be used as a test platform for existing theories on the causes of the impairments. We use computational cognitive modeling to gather information on the cognitive mechanisms underlying normal and impaired behavior. We then use this information to feed machine-learning algorithms to individually characterize the impairment and to differentiate between normal and impaired behavior. We apply the methodology to the particular case of specific language impairment (SLI) in Spanish-speaking children. The proposed methodology begins by defining a task in which unimpaired individuals and individuals with impairment present behavioral differences. Next, we build a computational cognitive model of that task and individualize it: we build a cognitive model for each participant and optimize its parameter values to fit the behavior of each participant. Finally, we use the optimized parameter values to feed different machine learning algorithms. The methodology was applied to an existing database of 48 Spanish-speaking children (24 normal and 24 SLI children) using clustering techniques for the characterization, and different classifier techniques for the diagnosis. The characterization results show three well-differentiated groups that can be associated with the three main theories on SLI. Using a leave-one-subject-out testing methodology, all the classifiers except the decision tree (DT) produced sensitivity, specificity and area under curve values above 90%, reaching 100% in some cases.
The results show that our methodology is able to find relevant information on the underlying cognitive mechanisms and to use it appropriately to provide better diagnosis than existing techniques. It is also worth noting that the individualized characterization obtained using our methodology could be extremely helpful in designing individualized therapies. Moreover, the proposed methodology could be easily extended to other languages and even to other cognitive impairments not necessarily related to language. Copyright © 2014 Elsevier B.V. All rights reserved.
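The leave-one-subject-out evaluation described above can be sketched with a simple nearest-centroid classifier over per-subject parameter vectors: each subject is held out in turn, the classifier is trained on the rest, and the held-out subject is predicted. The vectors and labels below are synthetic stand-ins, not the fitted cognitive-model parameters.

```python
def nearest_centroid_predict(train, labels, x):
    """Predict the label whose class centroid is closest to x."""
    centroids = {}
    for lab in set(labels):
        rows = [t for t, l in zip(train, labels) if l == lab]
        centroids[lab] = [sum(col) / len(rows) for col in zip(*rows)]
    dist = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b))
    return min(centroids, key=lambda lab: dist(centroids[lab], x))

def loso_accuracy(data, labels):
    """Leave-one-subject-out: train on all but one, test on the held-out one."""
    correct = 0
    for i in range(len(data)):
        pred = nearest_centroid_predict(data[:i] + data[i + 1:],
                                        labels[:i] + labels[i + 1:],
                                        data[i])
        correct += pred == labels[i]
    return correct / len(data)

# Two well-separated synthetic groups standing in for "normal" vs "SLI"
data = [[0.1, 0.2], [0.2, 0.1], [0.15, 0.15],
        [0.9, 0.8], [0.8, 0.9], [0.85, 0.85]]
labels = ["normal"] * 3 + ["SLI"] * 3
acc = loso_accuracy(data, labels)
```

Because each test subject is never seen during training, the scheme gives an honest per-subject estimate, which is why the paper reports sensitivity and specificity under it.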
Barnes, Brian B.; Wilson, Michael B.; Carr, Peter W.; Vitha, Mark F.; Broeckling, Corey D.; Heuberger, Adam L.; Prenni, Jessica; Janis, Gregory C.; Corcoran, Henry; Snow, Nicholas H.; Chopra, Shilpi; Dhandapani, Ramkumar; Tawfall, Amanda; Sumner, Lloyd W.; Boswell, Paul G.
2014-01-01
Gas chromatography-mass spectrometry (GC-MS) is a primary tool used to identify compounds in complex samples. Both mass spectra and GC retention times are matched to those of standards, but it is often impractical to have standards on hand for every compound of interest, so we must rely on shared databases of MS data and GC retention information. Unfortunately, retention databases (e.g. linear retention index libraries) are experimentally restrictive, notoriously unreliable, and strongly instrument dependent, relegating GC retention information to a minor, often negligible role in compound identification despite its potential power. A new methodology called “retention projection” has great potential to overcome the limitations of shared chromatographic databases. In this work, we tested the reliability of the methodology in five independent laboratories. We found that even when each lab ran nominally the same method, the methodology was 3-fold more accurate than retention indexing because it properly accounted for unintentional differences between the GC-MS systems. When the labs used different methods of their own choosing, retention projections were 4- to 165-fold more accurate. More importantly, the distribution of error in the retention projections was predictable across different methods and labs, thus enabling automatic calculation of retention time tolerance windows. Tolerance windows at 99% confidence were generally narrower than those widely used even when physical standards are on hand to measure their retention. With its high accuracy and reliability, the new retention projection methodology makes GC retention a reliable, precise tool for compound identification, even when standards are not available to the user. PMID:24205931
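The linear retention index the abstract criticizes as instrument-dependent is computed by bracketing an analyte between n-alkane standards (the van den Dool and Kratz convention for temperature-programmed GC). A minimal sketch, with invented retention times:

```python
# Sketch of the conventional linear (temperature-programmed) retention index
# of van den Dool & Kratz -- the kind of library value the abstract says is
# strongly instrument-dependent. All retention times below are invented.

def linear_retention_index(t_x, alkanes):
    """t_x: retention time of the analyte (min).
    alkanes: sorted list of (carbon_number, retention_time) pairs for the
    n-alkane standards; t_x must fall between two of them."""
    for (n, t_n), (n2, t_n2) in zip(alkanes, alkanes[1:]):
        if t_n <= t_x <= t_n2:
            # Linear interpolation between the bracketing alkanes.
            return 100 * (n + (n2 - n) * (t_x - t_n) / (t_n2 - t_n))
    raise ValueError("analyte not bracketed by alkane standards")

# Hypothetical alkane ladder: C10 at 5.0 min, C11 at 6.0 min, C12 at 7.2 min.
ladder = [(10, 5.0), (11, 6.0), (12, 7.2)]
print(linear_retention_index(5.5, ladder))  # 1050.0: halfway C10 -> C11
```

The index rescales retention onto the alkane ladder, so it is more portable than a raw retention time, but, as the abstract notes, it still shifts with unintentional differences between GC-MS systems, which is the gap retention projection is designed to close.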
Stoop, Rahel; Clijsen, Ron; Leoni, Diego; Soldini, Emiliano; Castellini, Greta; Redaelli, Valentina; Barbero, Marco
2017-08-01
The methodological quality of controlled clinical trials (CCTs) of physiotherapeutic treatment modalities for myofascial trigger points (MTrP) has not been investigated yet. To assess the methodological quality of CCTs of physiotherapy treatments for MTrPs and to demonstrate any increase in quality over time. Systematic review. A systematic search was conducted in two databases, the Physiotherapy Evidence Database (PEDro) and the Medical Literature Analysis and Retrieval System Online (MEDLINE), using the same keywords and selection procedure corresponding to pre-defined inclusion criteria. The methodological quality, assessed by the 11-item PEDro scale, served as the outcome measure. The CCTs had to compare at least two interventions, where one intervention had to lie within the scope of physiotherapy. Participants had to be diagnosed with myofascial pain syndrome or trigger points (active or latent). A total of n = 230 studies were analysed. The cervico-thoracic region was the most frequently treated body part (n = 143). Electrophysical agent application was the most frequent intervention. The average methodological quality reached 5.5 on the PEDro scale. A total of n = 6 studies scored 9. The average PEDro score increased by 0.7 points per decade between 1978 and 2015. The average PEDro score of CCTs for MTrP treatments does not reach the cut-off of 6 proposed for moderate to high methodological quality. Nevertheless, a promising trend towards an increase in the average methodological quality of CCTs for MTrPs was recorded. More high-quality CCT studies with thorough research procedures are recommended to enhance methodological quality. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
Nordic Walking for the Management of People With Parkinson Disease: A Systematic Review.
Cugusi, Lucia; Manca, Andrea; Dragone, Daniele; Deriu, Franca; Solla, Paolo; Secci, Claudio; Monticone, Marco; Mercuro, Giuseppe
2017-11-01
It is well known that physical exercise is the main therapeutic element of rehabilitation programs for people with Parkinson disease (PD). While traditional forms of exercise can guarantee significant health benefits, emerging nonconventional physical activities, such as Nordic walking (NW), may add positive effects. To appraise the available evidence on the main effects of NW in rehabilitation programs for people with PD and to propose a design for upcoming research that might improve the uniformity of future trials. Systematic review. A literature search of 5 established databases (PubMed, MEDLINE, Scopus, Web of Science, and Cochrane) was conducted. Any relevant randomized controlled trials pertinent to NW in PD published in English from inception to February 2017 were included. Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines were followed, and the methodologic quality of each study was assessed by the Physiotherapy Evidence Database scale. Sixty-six studies were retrieved, and 6 randomized controlled trials (221 subjects) were entered into the qualitative synthesis. Overall, these studies portrayed NW as feasible and likely to be effective in improving the functional and clinical outcomes of people with PD. When we compared NW with other exercise-based interventions, such as treadmill training, free walking, a program of standardized whole-body movements with maximal amplitude (Lee Silverman Voice Treatment BIG training), or a home-based exercise program, the findings proved controversial. High heterogeneity and methodologic discrepancies among the studies prevent us from drawing firm conclusions on the effectiveness of NW in comparison with other exercise-based interventions currently used by people with PD. Further investigations with a common design are necessary to verify whether NW may be included within conventional rehabilitation programs commonly recommended to people with PD. Level of evidence: II. 
Copyright © 2017 American Academy of Physical Medicine and Rehabilitation. Published by Elsevier Inc. All rights reserved.
de Sousa Costa, Robherson Wector; da Silva, Giovanni Lucca França; de Carvalho Filho, Antonio Oseas; Silva, Aristófanes Corrêa; de Paiva, Anselmo Cardoso; Gattass, Marcelo
2018-05-23
Lung cancer is the leading cause of cancer death among patients worldwide and has one of the lowest survival rates after diagnosis. Therefore, this study proposes a methodology for the diagnosis of lung nodules as benign or malignant tumors based on image processing and pattern recognition techniques. Mean phylogenetic distance (MPD) and the taxonomic diversity index (Δ) were used as texture descriptors. Finally, a genetic algorithm in conjunction with a support vector machine was applied to select the best training model. The proposed methodology was tested on computed tomography (CT) images from the Lung Image Database Consortium and Image Database Resource Initiative (LIDC-IDRI), achieving a best sensitivity of 93.42%, specificity of 91.21%, accuracy of 91.81%, and area under the ROC curve of 0.94. The results demonstrate the promising performance of texture extraction techniques using mean phylogenetic distance and the taxonomic diversity index combined with phylogenetic trees. Graphical Abstract: Stages of the proposed methodology.
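The genetic-algorithm step described above can be sketched in miniature. This is not the authors' implementation: bit-strings encode which texture features enter the classifier, and the fitness here is a toy class-separation score rather than cross-validated SVM accuracy; all feature values are invented.

```python
# Minimal sketch (not the authors' implementation) of genetic-algorithm
# feature selection: bit-strings encode which texture features are used;
# fitness is a toy separability score standing in for SVM accuracy.

import random

def fitness(mask, benign, malignant):
    """Toy fitness: mean per-feature class separation over selected features."""
    idx = [i for i, bit in enumerate(mask) if bit]
    if not idx:
        return 0.0
    sep = 0.0
    for i in idx:
        mb = sum(s[i] for s in benign) / len(benign)
        mm = sum(s[i] for s in malignant) / len(malignant)
        sep += abs(mb - mm)
    return sep / len(idx)

def evolve(benign, malignant, n_features, generations=30, pop_size=12):
    rng = random.Random(0)  # fixed seed for reproducibility
    pop = [[rng.randint(0, 1) for _ in range(n_features)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Elitism: keep the fitter half, refill with crossover + mutation.
        pop.sort(key=lambda m: fitness(m, benign, malignant), reverse=True)
        parents = pop[:pop_size // 2]
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_features)   # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.1:               # bit-flip mutation
                child[rng.randrange(n_features)] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda m: fitness(m, benign, malignant))

# Invented 3-feature samples: feature 0 separates the classes, 1 and 2 are noise.
benign = [[0.1, 0.5, 0.5], [0.2, 0.4, 0.6]]
malignant = [[0.9, 0.5, 0.5], [0.8, 0.6, 0.4]]
best = evolve(benign, malignant, n_features=3)
```

In the study, each evaluated bit-string would train an SVM on the selected texture descriptors; the GA's role is only to search the exponentially large space of feature subsets.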
Catalá-López, Ferrán; Ridao, Manuel; Alonso-Arroyo, Adolfo; García-Altés, Anna; Cameron, Chris; González-Bermejo, Diana; Aleixandre-Benavent, Rafael; Bernal-Delgado, Enrique; Peiró, Salvador; Tabarés-Seisdedos, Rafael; Hutton, Brian
2016-01-07
Cost-effectiveness analysis has been recognized as an important tool to determine the efficiency of healthcare interventions and services. There is a need for evaluating the reporting of methods and results of cost-effectiveness analyses and establishing their validity. We describe and examine reporting characteristics of the methods and results of cost-effectiveness analyses conducted in Spain over more than two decades. A methodological systematic review was conducted with the information obtained through an updated literature review in PubMed and complementary databases (e.g. Scopus, ISI Web of Science, National Health Service Economic Evaluation Database (NHS EED) and Health Technology Assessment (HTA) databases from the Centre for Reviews and Dissemination (CRD), Índice Médico Español (IME) and Índice Bibliográfico Español en Ciencias de la Salud (IBECS)). We identified cost-effectiveness analyses conducted in Spain that used quality-adjusted life years (QALYs) as outcome measures (period 1989-December 2014). Two reviewers independently extracted the data from each paper. The data were analysed descriptively. In total, 223 studies were included. Very few studies (10; 4.5 %) reported working from a protocol. Most studies (200; 89.7 %) were simulation models and included a median of 1000 patients. Only 105 (47.1 %) studies presented an adequate description of the characteristics of the target population. Most study interventions were categorized as therapeutic (189; 84.8 %) and nearly half (111; 49.8 %) considered an active alternative as the comparator. Effectiveness data were derived from a single study in 87 (39.0 %) reports, and only a few (40; 17.9 %) used evidence synthesis-based estimates. Few studies (42; 18.8 %) reported a full description of methods for QALY calculation. The majority of the studies (147; 65.9 %) reported that the study intervention produced "more costs and more QALYs" than the comparator. 
Most studies (200; 89.7 %) reported favourable conclusions. Main funding source was the private for-profit sector (135; 60.5 %). Conflicts of interest were not disclosed in 88 (39.5 %) studies. This methodological review reflects that reporting of several important aspects of methods and results are frequently missing in published cost-effectiveness analyses. Without full and transparent reporting of how studies were designed and conducted, it is difficult to assess the validity of study findings and conclusions.
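For the "more costs and more QALYs" quadrant the review tallies, the standard summary statistic is the incremental cost-effectiveness ratio (ICER): extra cost per extra QALY versus the comparator. A minimal sketch, with invented numbers:

```python
# Sketch of the standard cost-effectiveness summary used in such studies:
# the incremental cost-effectiveness ratio (ICER), i.e. incremental cost
# per QALY gained versus the comparator. All numbers below are invented.

def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost per QALY gained; undefined when QALYs are equal."""
    d_qaly = qaly_new - qaly_old
    if d_qaly == 0:
        raise ValueError("equal effectiveness: ICER undefined, compare costs")
    return (cost_new - cost_old) / d_qaly

# "More costs and more QALYs": 12,000 extra for 0.5 extra QALYs.
print(icer(30000, 5.5, 18000, 5.0))  # 24000.0 per QALY gained
```

A decision-maker then compares the ICER against a willingness-to-pay threshold; whether published Spanish studies report enough detail (protocol, population, QALY methods) to trust that ratio is exactly what the review assesses.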
Critical care medicine beds, use, occupancy and costs in the United States: a methodological review
Halpern, Neil A; Pastores, Stephen M.
2017-01-01
This article is a methodological review to help the intensivist gain insights into the classic and sometimes arcane maze of national databases and methodologies used to determine and analyze the intensive care unit (ICU) bed supply, occupancy rates, and costs in the United States (US). Data for total ICU beds, use and occupancy can be derived from two large national healthcare databases: the Healthcare Cost Report Information System (HCRIS) maintained by the federal Centers for Medicare and Medicaid Services (CMS) and the proprietary Hospital Statistics of the American Hospital Association (AHA). Two costing methodologies can be used to calculate ICU costs: the Russell equation and national projections. Both methods are based on cost and use data from the national hospital datasets or from defined groups of hospitals or patients. At the national level, an understanding of US ICU beds, use and cost helps provide clarity to the width and scope of the critical care medicine (CCM) enterprise within the US healthcare system. This review will also help the intensivist better understand published studies on administrative topics related to CCM and be better prepared to participate in their own local hospital organizations or regional CCM programs. PMID:26308432
Chambers, Duncan; Paton, Fiona; Wilson, Paul; Eastwood, Alison; Craig, Dawn; Fox, Dave; Jayne, David; McGinnes, Erika
2014-01-01
Objectives To identify and critically assess the extent to which systematic reviews of enhanced recovery programmes for patients undergoing colorectal surgery differ in their methodology and reported estimates of effect. Design Review of published systematic reviews. We searched the Cochrane Database of Systematic Reviews, the Database of Abstracts of Reviews of Effects (DARE) and Health Technology Assessment (HTA) Database from 1990 to March 2013. Systematic reviews of enhanced recovery programmes for patients undergoing colorectal surgery were eligible for inclusion. Primary and secondary outcome measures The primary outcome was length of hospital stay. We assessed changes in pooled estimates of treatment effect over time and how these might have been influenced by decisions taken by researchers as well as by the availability of new trials. The quality of systematic reviews was assessed using the Centre for Reviews and Dissemination (CRD) DARE critical appraisal process. Results 10 systematic reviews were included. Systematic reviews of randomised controlled trials have consistently shown a reduction in length of hospital stay with enhanced recovery compared with traditional care. The estimated effect tended to increase from 2006 to 2010 as more trials were published but has not altered significantly in the most recent review, despite the inclusion of several unique trials. The best estimate appears to be an average reduction of around 2.5 days in primary postoperative length of stay. Differences between reviews reflected differences in interpretation of inclusion criteria, searching and analytical methods or software. Conclusions Systematic reviews of enhanced recovery programmes show a high level of research waste, with multiple reviews covering identical or very similar groups of trials. Where multiple reviews exist on a topic, interpretation may require careful attention to apparently minor differences between reviews. 
Researchers can help readers by acknowledging existing reviews and through clear reporting of key decisions, especially on inclusion/exclusion and on statistical pooling. PMID:24879828
Tasneem, Asba; Aberle, Laura; Ananth, Hari; Chakraborty, Swati; Chiswell, Karen; McCourt, Brian J.; Pietrobon, Ricardo
2012-01-01
Background The ClinicalTrials.gov registry provides information regarding characteristics of past, current, and planned clinical studies to patients, clinicians, and researchers; in addition, registry data are available for bulk download. However, issues related to data structure, nomenclature, and changes in data collection over time present challenges to the aggregate analysis and interpretation of these data in general and to the analysis of trials according to clinical specialty in particular. Improving usability of these data could enhance the utility of ClinicalTrials.gov as a research resource. Methods/Principal Results The purpose of our project was twofold. First, we sought to extend the usability of ClinicalTrials.gov for research purposes by developing a database for aggregate analysis of ClinicalTrials.gov (AACT) that contains data from the 96,346 clinical trials registered as of September 27, 2010. Second, we developed and validated a methodology for annotating studies by clinical specialty, using a custom taxonomy employing Medical Subject Heading (MeSH) terms applied by an NLM algorithm, as well as MeSH terms and other disease condition terms provided by study sponsors. Clinical specialists reviewed and annotated MeSH and non-MeSH disease condition terms, and an algorithm was created to classify studies into clinical specialties based on both MeSH and non-MeSH annotations. False positives and false negatives were evaluated by comparing algorithmic classification with manual classification for three specialties. Conclusions/Significance The resulting AACT database features study design attributes parsed into discrete fields, integrated metadata, and an integrated MeSH thesaurus, and is available for download as Oracle extracts (.dmp file and text format). This publicly-accessible dataset will facilitate analysis of studies and permit detailed characterization and analysis of the U.S. clinical trials enterprise as a whole. 
In addition, the methodology we present for creating specialty datasets may facilitate other efforts to analyze studies by specialty groups. PMID:22438982
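The specialty-annotation step can be illustrated with a small sketch. This is not the AACT pipeline: the term-to-specialty map, specialty names, and study record below are invented stand-ins for the reviewer-curated MeSH and non-MeSH annotations the abstract describes.

```python
# Illustrative sketch (not the AACT pipeline) of classifying studies into
# clinical specialties by matching annotated condition terms against a
# reviewer-curated term-to-specialty map. Terms and specialties are invented.

SPECIALTY_TERMS = {
    "cardiology": {"myocardial infarction", "heart failure",
                   "atrial fibrillation"},
    "oncology": {"breast neoplasms", "lung neoplasms", "leukemia"},
    "neurology": {"stroke", "epilepsy", "parkinson disease"},
}

def classify(condition_terms):
    """Return the set of specialties whose curated vocabulary overlaps the
    study's annotated condition terms (a study may match several)."""
    terms = {t.lower() for t in condition_terms}
    return {spec for spec, vocab in SPECIALTY_TERMS.items() if terms & vocab}

study = {"nct_id": "NCT00000000",  # placeholder identifier
         "conditions": ["Stroke", "Atrial Fibrillation"]}
print(classify(study["conditions"]))  # matches neurology and cardiology
```

Allowing multiple matches per study mirrors the abstract's design, and comparing such algorithmic labels against manual classification is what yields the false-positive and false-negative rates the authors evaluated.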
Economic evaluation of nurse staffing and nurse substitution in health care: a scoping review.
Goryakin, Yevgeniy; Griffiths, Peter; Maben, Jill
2011-04-01
Several systematic reviews have suggested that greater nurse staffing, as well as a greater proportion of registered nurses in the health workforce, is associated with better patient outcomes. Others have found that nurses can substitute for doctors safely and effectively in a variety of settings. However, these reviews do not generally consider the effect of nurse staffing on both patient outcomes and costs of care, and therefore say little about the cost-effectiveness of nurse-provided care. Therefore, we conducted a scoping literature review of economic evaluation studies which consider the link between nurse staffing, skill mix within the nursing team and between nurses and other medical staff, to determine the nature of the available economic evidence. Scoping literature review. English-language manuscripts, published between 1989 and 2009, focussing on the relationship between costs and effects of care and the level of registered nurse staffing or nurse-physician substitution/nursing skill mix in the clinical team, using cost-effectiveness, cost-utility, or cost-benefit analysis. Articles selected for the review were identified through Medline, CINAHL, Cochrane Database of Systematic Reviews, Database of Abstracts of Reviews of Effects and Google Scholar database searches. After selecting 17 articles representing 16 unique studies for review, we summarized their main findings and assessed their methodological quality using criteria derived from the guidelines proposed by the Panel on Cost-Effectiveness in Health Care. In general, it was found that nurses can provide cost-effective care compared to other health professionals. On the other hand, more intensive nurse staffing was associated with both better outcomes and more expensive care, and therefore cost-effectiveness was not easy to assess. Although considerable progress in economic evaluation studies has been reached in recent years, a number of methodological issues remain. 
In the future, nurse researchers should be more actively engaged in the design and implementation of economic evaluation studies of the services they provide. Copyright © 2010 Elsevier Ltd. All rights reserved.
The Effectiveness of Pilates Exercise in People with Chronic Low Back Pain: A Systematic Review
Wells, Cherie; Kolt, Gregory S.; Marshall, Paul; Hill, Bridget; Bialocerkowski, Andrea
2014-01-01
Objective To evaluate the effectiveness of Pilates exercise in people with chronic low back pain (CLBP) through a systematic review of randomised controlled trials (RCTs). Data Sources A search for RCTs was undertaken using Medical Subject Headings (MeSH) and synonyms for "Pilates" and "low back pain" within the maximal date range of 10 databases. Databases included the Cumulative Index to Nursing and Allied Health Literature; Cochrane Library; Medline; Physiotherapy Evidence Database; ProQuest: Health and Medical Complete, Nursing and Allied Health Source, Dissertation and Theses; Scopus; Sport Discus; Web of Science. Study Selection Two independent reviewers were involved in the selection of evidence. To be included, relevant RCTs needed to be published in the English language. From 152 studies, 14 RCTs were included. Data Extraction Two independent reviewers appraised the methodological quality of RCTs using the McMaster Critical Review Form for Quantitative Studies. The author(s), year of publication, and details regarding participants, Pilates exercise, comparison treatments, outcome measures, and findings were then extracted. Data Synthesis The methodological quality of RCTs ranged from "poor" to "excellent". A meta-analysis of RCTs was not undertaken due to the heterogeneity of RCTs. Pilates exercise provided statistically significant improvements in pain and functional ability compared to usual care and physical activity between 4 and 15 weeks, but not at 24 weeks. There were no consistent statistically significant differences in improvements in pain and functional ability with Pilates exercise, massage therapy, or other forms of exercise at any time period. Conclusions Pilates exercise offers greater improvements in pain and functional ability compared to usual care and physical activity in the short term. Pilates exercise offers equivalent improvements to massage therapy and other forms of exercise. 
Future research should explore optimal Pilates exercise designs, and whether some people with CLBP may benefit from Pilates exercise more than others. PMID:24984069
Chen, Xin-Lin; Mo, Chuan-Wei; Lu, Li-Ya; Gao, Ri-Yang; Xu, Qian; Wu, Min-Feng; Zhou, Qian-Yi; Hu, Yue; Zhou, Xuan; Li, Xian-Tao
2017-11-01
To assess the methodological quality of systematic reviews and meta-analyses regarding acupuncture intervention for stroke and of the primary studies within them. Two researchers searched PubMed, Cumulative Index to Nursing and Allied Health Literature, Embase, ISI Web of Knowledge, Cochrane, Allied and Complementary Medicine, Ovid Medline, Chinese Biomedical Literature Database, China National Knowledge Infrastructure, Wanfang and Traditional Chinese Medical Database to identify systematic reviews and meta-analyses about acupuncture for stroke published from inception to December 2016. Review characteristics and the criteria for assessing the primary studies within reviews were extracted. The methodological quality of the reviews was assessed using the adapted Oxman and Guyatt scale. The methodological quality of primary studies was also assessed. Thirty-two eligible reviews were identified, 15 in English and 17 in Chinese. The English reviews scored higher than the Chinese reviews (P=0.025), especially on criteria for avoiding bias and the scope of the search. All reviews used quality criteria to evaluate the methodological quality of primary studies, but some criteria were not comprehensive. The primary studies, in particular those in the Chinese reviews, had problems with randomization, allocation concealment, blinding, dropouts and withdrawals, intent-to-treat analysis and adverse events. Important methodological flaws were found in the Chinese systematic reviews and primary studies. It is necessary to improve the methodological and reporting quality of both the systematic reviews published in China and the primary studies on acupuncture for stroke.
DESIGNING ENVIRONMENTAL MONITORING DATABASES FOR STATISTICAL ASSESSMENT
Databases designed for statistical analyses have characteristics that distinguish them from databases intended for general use. EMAP uses a probabilistic sampling design to collect data to produce statistical assessments of environmental conditions. In addition to supporting the ...
A dynamic clinical dental relational database.
Taylor, D; Naguib, R N G; Boulton, S
2004-09-01
The traditional approach to relational database design is based on the logical organization of data into a number of related normalized tables. One assumption is that the nature and structure of the data is known at the design stage. In the case of designing a relational database to store historical dental epidemiological data from individual clinical surveys, the structure of the data is not known until the data is presented for inclusion into the database. This paper addresses the issues concerned with the theoretical design of a clinical dynamic database capable of adapting the internal table structure to accommodate clinical survey data, and presents a prototype database application capable of processing, displaying, and querying the dental data.
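The paper's central idea, deriving the table structure from the data at load time rather than at design time, can be sketched in miniature with SQLite. This assumes nothing about the paper's actual implementation; the survey fields and values below are invented.

```python
# Sketch of the paper's idea in miniature (not its actual implementation):
# derive a table schema at load time from whatever fields a survey file
# presents, then create and populate the table dynamically.

import sqlite3

def load_survey(conn, table, records):
    """records: list of dicts whose keys (unknown until load time) become
    the table's columns. All values are stored as TEXT for simplicity."""
    columns = sorted({key for rec in records for key in rec})
    col_defs = ", ".join(f'"{c}" TEXT' for c in columns)
    conn.execute(f'CREATE TABLE "{table}" ({col_defs})')
    placeholders = ", ".join("?" for _ in columns)
    for rec in records:
        conn.execute(
            f'INSERT INTO "{table}" VALUES ({placeholders})',
            [rec.get(c) for c in columns])  # missing fields become NULL

# Two surveys with different, previously unknown field sets.
conn = sqlite3.connect(":memory:")
load_survey(conn, "survey_1999", [{"subject": "a01", "dmft": "3"}])
load_survey(conn, "survey_2004",
            [{"subject": "b17", "dmft": "1", "fluorosis": "mild"}])
rows = conn.execute('SELECT subject, fluorosis FROM "survey_2004"').fetchall()
```

Generating one table per survey sidesteps the normalization assumption the abstract questions; querying across surveys then requires metadata about which columns each survey table actually has, which is the harder part of such a design.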
Doménech-Carbó, Antonio; Doménech-Carbó, María Teresa; Valle-Algarra, Francisco Manuel; Gimeno-Adelantado, José Vicente; Osete-Cortina, Laura; Bosch-Reig, Francisco
2016-07-13
A web-based database of voltammograms is presented for characterizing artists' pigments and corrosion products of ceramic, stone and metal objects by means of the voltammetry of immobilized particles methodology. A description of the website and the database is provided. Voltammograms are, in most cases, accompanied by scanning electron microphotographs, X-ray spectra, infrared spectra acquired in attenuated total reflectance Fourier transform infrared spectroscopy (ATR-FTIR) mode and diffuse reflectance spectra in the UV-Vis region. To illustrate the usefulness of the database, two case studies involving identification of pigments and one case study describing deterioration of an archaeological metallic object are presented. Copyright © 2016 Elsevier B.V. All rights reserved.
A systematic review of economic evaluations of treatments for patients with epilepsy.
Wijnen, Ben F M; van Mastrigt, Ghislaine A P G; Evers, Silvia M A A; Gershuni, Olga; Lambrechts, Danielle A J E; Majoie, Marian H J M; Postulart, Debby; Aldenkamp, Bert A P; de Kinderen, Reina J A
2017-05-01
The increasing number of treatment options and the high costs associated with epilepsy have fostered the development of economic evaluations in epilepsy. It is important to examine the availability and quality of these economic evaluations and to identify potential research gaps. As well as looking at both pharmacologic (antiepileptic drugs [AEDs]) and nonpharmacologic (e.g., epilepsy surgery, ketogenic diet, vagus nerve stimulation) therapies, this review examines the methodologic quality of the full economic evaluations included. A literature search was performed in MEDLINE, EMBASE, NHS Economic Evaluation Database (NHS EED), Econlit, Web of Science, and the CEA Registry. In addition, the Cochrane Reviews, Cochrane DARE and Cochrane Health Technology Assessment databases were used. To identify relevant studies, predefined clinical search strategies were combined with a search filter designed to identify health economic studies. Specific search strategies were devised for the following topics: (1) AEDs, (2) patients with cognitive deficits, (3) elderly patients, (4) epilepsy surgery, (5) ketogenic diet, (6) vagus nerve stimulation, and (7) treatment of (non)convulsive status epilepticus. A total of 40 publications were included in this review, 29 (73%) of which were articles about pharmacologic interventions. The mean quality score of all articles on the Consensus Health Economic Criteria (CHEC)-extended was 81.8%, with the lowest quality score being 21.05%, whereas five studies had a score of 100%. On the Consolidated Health Economic Evaluation Reporting Standards (CHEERS), the average quality score was 77.0%, the lowest being 22.7%, and four studies were rated 100%. There was a substantial difference in methodology across all included articles, which hampered the attempt to combine information meaningfully. Overall, the methodologic quality was acceptable; however, some studies performed significantly worse than others. 
The heterogeneity between the studies stresses the need to define a reference case (e.g., how should an economic evaluation within epilepsy be performed) and to derive consensus on what constitutes "standard optimal care." Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.
FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES
2017-06-01
A thesis by Amanda Donnelly. This work develops a comparative model for strategic design methodologies, focusing on the primary elements of vision, time, process, communication and collaboration, and risk assessment. The analysis dissects and compares three potential design methodologies, including net assessment, scenarios and
Golder, Su; Loke, Yoon K; Zorzela, Liliane
2014-06-01
Research indicates that the methods used to identify data for systematic reviews of adverse effects may need to differ from those of other systematic reviews. To compare search methods in systematic reviews of adverse effects with those in other reviews. The search methodologies in 849 systematic reviews of adverse effects were compared with those in other reviews. Poor reporting of search strategies is apparent in both systematic reviews of adverse effects and other types of systematic reviews. Systematic reviews of adverse effects are less likely to restrict their searches to MEDLINE or to include only randomised controlled trials (RCTs). The use of other databases is largely dependent on the topic area and the year the review was conducted, with more databases searched in more recent reviews. Adverse effects search terms are used by 72% of reviews, and despite recommendations only two reviews reported using floating subheadings. The poor reporting of search strategies in systematic reviews is universal, as is the dominance of searching MEDLINE. However, reviews of adverse effects are more likely to include a range of study designs (not just RCTs) and to search beyond MEDLINE. © 2014 Crown Copyright.
The Identity Mapping Project: Demographic differences in patterns of distributed identity.
Gilbert, Richard L; Dionisio, John David N; Forney, Andrew; Dorin, Philip
2015-01-01
The advent of cloud computing and a multi-platform digital environment is giving rise to a new phase of human identity called "The Distributed Self." In this conception, aspects of the self are distributed into a variety of 2D and 3D digital personas with the capacity to reflect any number of combinations of now malleable personality traits. In this way, the source of human identity remains internal and embodied, but the expression or enactment of the self becomes increasingly external, disembodied, and distributed on demand. The Identity Mapping Project (IMP) is an interdisciplinary collaboration between psychology and computer science designed to empirically investigate the development of distributed forms of identity. Methodologically, it collects a large database of "identity maps" - computerized graphical representations of how active someone is online and how their identity is expressed and distributed across 7 core digital domains: email, blogs/personal websites, social networks, online forums, online dating sites, character-based digital games, and virtual worlds. The current paper reports on gender and age differences in online identity based on an initial database of distributed identity profiles.
The U.S. Geological Survey coal quality (COALQUAL) database version 3.0
Palmer, Curtis A.; Oman, Charles L.; Park, Andy J.; Luppens, James A.
2015-12-21
Because of database size limits during the development of COALQUAL Version 1.3, many analyses of individual bench samples were merged into whole coal bed averages. The methodology for making these composite intervals was not consistent. Size limits also restricted the amount of georeferencing information and forced removal of qualifier notations such as "less than detection limit" (<) information, which can cause problems when using the data. A review of the original data sheets revealed that COALQUAL Version 2.0 was missing information that was needed for a complete understanding of a coal section. Another important database issue to resolve was the USGS "remnant moisture" problem. Prior to 1998, tests for remnant moisture (as-determined moisture in the sample at the time of analysis) were not performed on any USGS major, minor, or trace element coal analyses. Without the remnant moisture, it is impossible to convert the analyses to a usable basis (as-received, dry, etc.). Based on remnant moisture analyses of hundreds of samples of different ranks (and known residual moisture) reported after 1998, it was possible to develop a method to provide reasonable estimates of remnant moisture for older data to make it more useful in COALQUAL Version 3.0. In addition, COALQUAL Version 3.0 is improved by (1) adding qualifiers, including statistical programming to deal with the qualifiers; (2) clarifying the sample compositing problems; and (3) adding associated samples. Version 3.0 of COALQUAL also represents the first attempt to incorporate data verification by mathematically crosschecking certain analytical parameters. Finally, a new database system was designed and implemented to replace the outdated DOS program used in earlier versions of the database.
Peinemann, Frank; Tushabe, Doreen Allen; Kleijnen, Jos
2013-01-01
Background A systematic review may evaluate different aspects of a health care intervention. To accommodate the evaluation of various research questions, the inclusion of more than one study design may be necessary. One aim of this study is to find and describe articles on methodological issues concerning the incorporation of multiple types of study designs in systematic reviews on health care interventions. Another aim is to evaluate methods studies that have assessed whether reported effects differ by study types. Methods and Findings We searched PubMed, the Cochrane Database of Systematic Reviews, and the Cochrane Methodology Register on 31 March 2012 and identified 42 articles that reported on the integration of single or multiple study designs in systematic reviews. We summarized the contents of the articles qualitatively and assessed theoretical and empirical evidence. We found that many examples of reviews incorporating multiple types of studies exist and that every study design can serve a specific purpose. The clinical questions of a systematic review determine the types of design that are necessary or sufficient to provide the best possible answers. In a second independent search, we identified 49 studies, 31 systematic reviews and 18 trials that compared the effect sizes between randomized and nonrandomized controlled trials, which were statistically different in 35%, and not different in 53%. Twelve percent of studies reported both different and non-different effect sizes. Conclusions Different study designs addressing the same question yielded varying results, with differences in about half of all examples. The risk of presenting uncertain results without knowing for sure the direction and magnitude of the effect holds true for both nonrandomized and randomized controlled trials.
The integration of multiple study designs in systematic reviews is required if patients are to be informed about the many facets of patient-relevant issues of health care interventions. PMID:24416098
Folks, Russell D; Savir-Baruch, Bital; Garcia, Ernest V; Verdes, Liudmila; Taylor, Andrew T
2012-12-01
Our objective was to design and implement a clinical history database capable of linking to our database of quantitative results from (99m)Tc-mercaptoacetyltriglycine (MAG3) renal scans and export a data summary for physicians or our software decision support system. For database development, we used a commercial program. Additional software was developed in Interactive Data Language. MAG3 studies were processed using an in-house enhancement of a commercial program. The relational database has 3 parts: a list of all renal scans (the RENAL database), a set of patients with quantitative processing results (the Q2 database), and a subset of patients from Q2 containing clinical data manually transcribed from the hospital information system (the CLINICAL database). To test interobserver variability, a second physician transcriber reviewed 50 randomly selected patients in the hospital information system and tabulated 2 clinical data items: hydronephrosis and presence of a current stent. The CLINICAL database was developed in stages and contains 342 fields comprising demographic information, clinical history, and findings from up to 11 radiologic procedures. A scripted algorithm is used to reliably match records present in both Q2 and CLINICAL. An Interactive Data Language program then combines data from the 2 databases into an XML (extensible markup language) file for use by the decision support system. A text file is constructed and saved for review by physicians. RENAL contains 2,222 records, Q2 contains 456 records, and CLINICAL contains 152 records. The interobserver variability testing found a 95% match between the 2 observers for presence or absence of ureteral stent (κ = 0.52), a 75% match for hydronephrosis based on narrative summaries of hospitalizations and clinical visits (κ = 0.41), and a 92% match for hydronephrosis based on the imaging report (κ = 0.84). 
We have developed a relational database system to integrate the quantitative results of MAG3 image processing with clinical records obtained from the hospital information system. We also have developed a methodology for formatting clinical history for review by physicians and export to a decision support system. We identified several pitfalls, including the fact that important textual information extracted from the hospital information system by knowledgeable transcribers can show substantial interobserver variation, particularly when record retrieval is based on the narrative clinical records.
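The record-matching and XML-export step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the field names, accession key, and XML element names are all hypothetical stand-ins for the Q2 and CLINICAL schemas.

```python
import xml.etree.ElementTree as ET

# Hypothetical miniature stand-ins for the Q2 (quantitative results) and
# CLINICAL (transcribed history) tables, keyed by a shared accession number.
q2 = {"A001": {"relative_uptake_left": "48.0", "t_half_min": "7.2"}}
clinical = {"A001": {"hydronephrosis": "yes", "ureteral_stent": "no"}}

def export_matched_xml(accession):
    """Join the two record sets on the shared key and emit one XML summary,
    mirroring the scripted match-then-export workflow described in the abstract."""
    if accession not in q2 or accession not in clinical:
        return None  # only records present in BOTH databases are exported
    root = ET.Element("renalStudy", attrib={"accession": accession})
    quant = ET.SubElement(root, "quantitative")
    for field, value in q2[accession].items():
        ET.SubElement(quant, field).text = value
    hist = ET.SubElement(root, "clinicalHistory")
    for field, value in clinical[accession].items():
        ET.SubElement(hist, field).text = value
    return ET.tostring(root, encoding="unicode")

xml_out = export_matched_xml("A001")
```

A downstream decision-support system would parse this XML, while the same joined fields could be formatted as plain text for physician review.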
Gilheaney, Ó; Kerr, P; Béchet, S; Walshe, M
2016-12-01
To determine the effectiveness of endoscopic cricopharyngeal myotomy on upper oesophageal sphincter dysfunction in adults with upper oesophageal sphincter dysfunction and neurological disease. Published and unpublished studies with a quasi-experimental design investigating endoscopic cricopharyngeal myotomy effects on upper oesophageal sphincter dysfunction in humans were considered eligible. Electronic databases, grey literature and reference lists of included studies were systematically searched. Data were extracted by two independent reviewers. Methodological quality was assessed independently using the PEDro scale and MINORS tool. Of 2938 records identified, 2 studies were eligible. Risk of bias assessment indicated areas of methodological concern in the literature. Statistical analysis was not possible because of the limited number of eligible studies. No determinations could be made regarding endoscopic cricopharyngeal myotomy effectiveness in the cohort of interest. Reliable and valid evidence on the following is required to support increasing clinical usage of endoscopic cricopharyngeal myotomy: optimal candidacy selection; standardised post-operative management protocol; complications; and endoscopic cricopharyngeal myotomy effects on aspiration of food and laryngeal penetration, mean upper oesophageal sphincter resting pressure and quality of life.
Is reflexology an effective intervention? A systematic review of randomised controlled trials.
Ernst, Edzard
2009-09-07
To evaluate the evidence for and against the effectiveness of reflexology for treating any medical condition. Six electronic databases were searched from their inception to February 2009 to identify all relevant randomised controlled trials (RCTs). No language restrictions were applied. RCTs of reflexology delivered by trained reflexologists to patients with specific medical conditions. Condition studied, study design and controls, primary outcome measures, follow-up, and main results were extracted. 18 RCTs met all the inclusion criteria. The studies examined a range of conditions: anovulation, asthma, back pain, dementia, diabetes, cancer, foot oedema in pregnancy, headache, irritable bowel syndrome, menopause, multiple sclerosis, the postoperative state and premenstrual syndrome. There was more than one study each for asthma, the postoperative state, cancer palliation and multiple sclerosis. Five RCTs yielded positive results. Methodological quality was evaluated using the Jadad scale. The methodological quality was often poor, and sample sizes were generally low. Most higher-quality trials did not generate positive findings. The best evidence available to date does not demonstrate convincingly that reflexology is an effective treatment for any medical condition.
Li, Chunxiao; Khoo, Selina; Adnan, Athirah
2017-03-01
The aim of this review is to synthesize the evidence on the effects of aquatic exercise interventions on physical function and fitness among people with spinal cord injury. Six major databases were searched from inception till June 2015: MEDLINE, CINAHL, EMBASE, PsychInfo, SPORTDiscus, and Cochrane Center Register of Controlled Trials. Two reviewers independently rated methodological quality using the modified Downs and Black Scale and extracted and synthesized key findings (i.e., participant characteristics, study design, physical function and fitness outcomes, and adverse events). Eight of 276 studies met the inclusion criteria, of which none showed high research quality. Four studies assessed physical function outcomes and 4 studies evaluated aerobic fitness as outcome measures. Significant improvements on these 2 outcomes were generally found. Other physical or fitness outcomes including body composition, muscular strength, and balance were rarely reported. There is weak evidence supporting aquatic exercise training to improve physical function and aerobic fitness among adults with spinal cord injury. Suggestions for future research include reporting details of exercise interventions, evaluating other physical or fitness outcomes, and improving methodological quality.
Tai Chi for Essential Hypertension
Wang, Jie; Feng, Bo; Yang, Xiaochen; Liu, Wei; Teng, Fei; Li, Shengjie; Xiong, Xingjiang
2013-01-01
Objectives. To assess the current clinical evidence of Tai Chi for essential hypertension (EH). Search Strategy. 7 electronic databases were searched up to 20 April 2013. Inclusion Criteria. We included randomized trials testing Tai Chi versus routine care or antihypertensive drugs. Trials testing Tai Chi combined with antihypertensive drugs versus antihypertensive drugs were also included. Data Extraction and Analyses. Study selection, data extraction, quality assessment, and data analyses were conducted according to the Cochrane standards. Results. 18 trials were included. Methodological quality of the trials was low. 14 trials compared Tai Chi with routine care. 1 trial compared Tai Chi with antihypertensive drugs. Meta-analyses all showed a significant effect of Tai Chi in lowering blood pressure (BP). 3 trials compared Tai Chi plus antihypertensive drugs with antihypertensive drugs alone; positive results in BP were found in 2 of these combination trials. Most of the trials did not report adverse events, and the safety of Tai Chi is still uncertain. Conclusions. There is some encouraging evidence of Tai Chi for EH. However, due to the poor methodological quality of the included studies, the evidence remains weak. Rigorously designed trials are needed to confirm the evidence. PMID:23986780
Symmetrical compression distance for arrhythmia discrimination in cloud-based big-data services.
Lillo-Castellano, J M; Mora-Jiménez, I; Santiago-Mozos, R; Chavarría-Asso, F; Cano-González, A; García-Alberola, A; Rojo-Álvarez, J L
2015-07-01
The current development of cloud computing is completely changing the paradigm of data knowledge extraction in huge databases. An example of this technology in the cardiac arrhythmia field is the SCOOP platform, a national-level scientific cloud-based big data service for implantable cardioverter defibrillators. In this scenario, we here propose a new methodology for automatic classification of intracardiac electrograms (EGMs) in a cloud computing system, designed for minimal signal preprocessing. A new compression-based similarity measure (CSM), termed the weighted fast compression distance, is created for low computational burden and provides better performance when compared with other CSMs in the literature. Using simple machine learning techniques, a set of 6848 EGMs extracted from the SCOOP platform were classified into seven cardiac arrhythmia classes and one noise class, reaching nearly 90% accuracy when previous patient arrhythmia information was available and 63% otherwise, in all cases exceeding the accuracy of majority-class classification. Results show that this methodology can be used as a high-quality service of cloud computing, providing support to physicians for improving the knowledge on patient diagnosis.
Best practices in ranking communicable disease threats: a literature review, 2015.
O'Brien, Eleanor Charlotte; Taft, Rachel; Geary, Katie; Ciotti, Massimo; Suk, Jonathan E
2016-04-28
The threat of serious, cross-border communicable disease outbreaks in Europe poses a significant challenge to public health and emergency preparedness because the relative likelihood of these threats and the pathogens involved are constantly shifting in response to a range of changing disease drivers. To inform strategic planning by enabling effective resource allocation to manage the consequences of communicable disease outbreaks, it is useful to be able to rank and prioritise pathogens. This paper reports on a literature review which identifies and evaluates the range of methods used for risk ranking. Searches were performed across biomedical and grey literature databases, supplemented by reference harvesting and citation tracking. Studies were selected using transparent inclusion criteria and underwent quality appraisal using a bespoke checklist based on the AGREE II criteria. Seventeen studies were included in the review, covering five methodologies. A narrative analysis of the selected studies suggests that no single methodology was superior. However, many of the methods shared common components, around which a 'best-practice' framework was formulated. This approach is intended to help inform decision makers' choice of an appropriate risk-ranking study design.
Meta-Analysis of the Effects of Xingnaojing Injection on Consciousness Disturbance
Wu, Lijun; Zhang, Hua; Xing, Yanwei; Gao, Yonghong; Li, Yanda; Ren, Xiaomeng; Li, Jie; Nie, Bo; Zhu, Lingqun; Shang, Hongcai; Gao, Ying
2016-01-01
Xingnaojing (XNJ) is commonly extracted from Angongniuhuang, a classic Chinese emergency prescription, and widely used in the treatment of nervous system disorders including consciousness disturbance in China. To evaluate the beneficial and adverse effects of XNJ injection on consciousness disturbance. Seven major electronic databases were searched to retrieve randomized controlled trials designed to evaluate the clinical efficacy of XNJ alone or combined with Western medicine in treating consciousness disturbance caused by conditions such as high fever, poisoning, and stroke. The methodological quality of the included studies was assessed using criteria from the Cochrane Handbook for Systematic Review of Interventions, and analyzed using the RevMan 5.3.0 software. Seventeen randomized controlled trials on XNJ were included in this study and the trials generally showed low methodological quality. The results revealed that XNJ alone or in combination with other medicines and adjuvant methods had a positive effect on patients with fever-, poisoning-, and stroke-induced coma. XNJ effectively treated consciousness disturbances that were caused by high fever, poisoning, or stroke. PMID:26886655
Experimental Design of a UCAV-Based High-Energy Laser Weapon
2016-12-01
propagation. The Design of Experiments (DOE) methodology is then applied to determine the significance of the UCAV-HEL design parameters and their effect on the …
Matsuda, Fumio; Shinbo, Yoko; Oikawa, Akira; Hirai, Masami Yokota; Fiehn, Oliver; Kanaya, Shigehiko; Saito, Kazuki
2009-01-01
Background In metabolomics research using mass spectrometry (MS), systematic searching of high-resolution mass data against compound databases is often the first step of metabolite annotation to determine elemental compositions possessing similar theoretical mass numbers. However, incorrect hits derived from errors in mass analyses will be included in the results of elemental composition searches. To assess the quality of peak annotation information, a novel methodology for false discovery rates (FDR) evaluation is presented in this study. Based on the FDR analyses, several aspects of an elemental composition search, including setting a threshold, estimating FDR, and the types of elemental composition databases most reliable for searching are discussed. Methodology/Principal Findings The FDR can be determined from one measured value (i.e., the hit rate for search queries) and four parameters determined by Monte Carlo simulation. The results indicate that relatively high FDR values (30–50%) were obtained when searching time-of-flight (TOF)/MS data using the KNApSAcK and KEGG databases. In addition, searches against large all-in-one databases (e.g., PubChem) always produced unacceptable results (FDR >70%). The estimated FDRs suggest that the quality of search results can be improved not only by performing more accurate mass analysis but also by modifying the properties of the compound database. A theoretical analysis indicates that FDR could be improved by using a smaller compound database with higher completeness. Conclusions/Significance High accuracy mass analysis, such as Fourier transform (FT)-MS, is needed for reliable annotation (FDR <10%). In addition, a small, customized compound database is preferable for high-quality annotation of metabolome data. PMID:19847304
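The core elemental-composition search, matching a measured mass against theoretical monoisotopic masses within a ppm tolerance, can be sketched as follows. The three-entry database is a toy stand-in for KNApSAcK/KEGG-scale resources; widening the tolerance raises the hit rate but, as the paper's FDR analysis shows, also admits more incorrect hits.

```python
# Toy monoisotopic masses (Da) for a few formulas; real searches
# run against databases with thousands to millions of entries.
DB = {
    "C6H12O6": 180.06339,   # glucose
    "C9H11NO2": 165.07898,  # phenylalanine
    "C5H5N5": 135.05450,    # adenine
}

def search(mass: float, tol_ppm: float = 5.0):
    """Return formulas whose theoretical mass lies within tol_ppm of the query.
    The absolute window scales with the query mass: tol = mass * tol_ppm * 1e-6."""
    tol = mass * tol_ppm * 1e-6
    return [f for f, m in DB.items() if abs(m - mass) <= tol]

hits = search(180.0634)    # within ~0.3 ppm of glucose -> one hit
misses = search(180.0700)  # same nominal mass, but off in exact mass -> no hit
```

In the paper's terms, the FDR of such a search is estimated from the observed hit rate together with Monte Carlo parameters characterizing how often random (decoy) masses would hit the database by chance.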
Garg, Rakesh
2016-09-01
The conduct of research requires a systematic approach involving diligent planning and its execution as planned. It comprises various essential predefined components such as aims, population, conduct/technique, outcome and statistical considerations. These need to be objective, reliable and in a repeatable format. Hence, the understanding of the basic aspects of methodology is essential for any researcher. This is a narrative review and focuses on various aspects of the methodology for the conduct of clinical research. The relevant keywords were used for the literature search across various databases and from bibliographies of the articles.
Practice-Based Knowledge Discovery for Comparative Effectiveness Research: An Organizing Framework
Lucero, Robert J.; Bakken, Suzanne
2014-01-01
Electronic health information systems can increase the ability of health-care organizations to investigate the effects of clinical interventions. The authors present an organizing framework that integrates outcomes and informatics research paradigms to guide knowledge discovery in electronic clinical databases. They illustrate its application using the example of hospital acquired pressure ulcers (HAPU). The Knowledge Discovery through Informatics for Comparative Effectiveness Research (KDI-CER) framework was conceived as a heuristic to conceptualize study designs and address potential methodological limitations imposed by using a single research perspective. Advances in informatics research can play a complementary role in advancing the field of outcomes research including CER. The KDI-CER framework can be used to facilitate knowledge discovery from routinely collected electronic clinical data. PMID:25278645
Buu, Anne; Johnson, Norman J.; Li, Runze; Tan, Xianming
2011-01-01
Zero-inflated count data are very common in health surveys. This study develops new variable selection methods for the zero-inflated Poisson regression model. Our simulations demonstrate the negative consequences of ignoring zero-inflation. Among the competing methods, the one-step SCAD method is recommended because it has the highest specificity, sensitivity, exact fit, and lowest estimation error. The design of the simulations is based on the special features of two large national databases commonly used in the alcoholism and substance abuse field so that our findings can be easily generalized to the real settings. Applications of the methodology are demonstrated by empirical analyses on the data from a well-known alcohol study. PMID:21563207
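The zero-inflated Poisson model underlying this work mixes a point mass of "structural" zeros with an ordinary Poisson count. A minimal sketch of its probability mass function (the distribution only, not the authors' SCAD-based selection procedure):

```python
from math import exp, factorial

def zip_pmf(k: int, lam: float, pi: float) -> float:
    """P(Y = k) under a zero-inflated Poisson: with probability pi the outcome
    is a structural zero; with probability 1 - pi it is drawn from Poisson(lam)."""
    poisson = exp(-lam) * lam**k / factorial(k)
    if k == 0:
        return pi + (1 - pi) * poisson  # structural zeros inflate P(0)
    return (1 - pi) * poisson

# With 30% structural zeros, P(0) far exceeds the plain Poisson(2.0) value,
# which is exactly the excess-zeros pattern seen in health survey counts.
p0_zip = zip_pmf(0, lam=2.0, pi=0.3)
p0_poisson = exp(-2.0)
```

Ignoring the inflation term and fitting a plain Poisson model biases both the rate estimate and any subsequent variable selection, which is the failure mode the simulations quantify.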
The semantics of Chemical Markup Language (CML) for computational chemistry : CompChem.
Phadungsukanan, Weerapong; Kraft, Markus; Townsend, Joe A; Murray-Rust, Peter
2012-08-07
This paper introduces a subdomain chemistry format for storing computational chemistry data called CompChem. It has been developed based on the design, concepts and methodologies of Chemical Markup Language (CML) by adding computational chemistry semantics on top of the CML Schema. The format allows a wide range of ab initio quantum chemistry calculations of individual molecules to be stored. These calculations include, for example, single point energy calculation, molecular geometry optimization, and vibrational frequency analysis. The paper also describes the supporting infrastructure, such as processing software, dictionaries, validation tools and database repositories. In addition, some of the challenges and difficulties in developing common computational chemistry dictionaries are discussed. The uses of CompChem are illustrated by two practical applications.
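A minimal sketch of the kind of XML a CompChem-style document contains, generated with Python's standard library. The CML namespace is real, but the `dictRef` terms and element layout below are illustrative placeholders, not the official CompChem dictionary entries.

```python
import xml.etree.ElementTree as ET

CML_NS = "http://www.xml-cml.org/schema"

def single_point_energy_doc(formula: str, energy_hartree: float) -> str:
    """Wrap one computed result in a minimal CML-style module hierarchy.
    The dictRef values are hypothetical examples of dictionary-controlled
    semantics layered on top of the CML Schema, as the paper describes."""
    ET.register_namespace("", CML_NS)  # serialize with a default namespace
    module = ET.Element(f"{{{CML_NS}}}module", attrib={"dictRef": "compchem:job"})
    mol = ET.SubElement(module, f"{{{CML_NS}}}molecule", attrib={"id": "m1"})
    ET.SubElement(mol, f"{{{CML_NS}}}formula", attrib={"concise": formula})
    prop = ET.SubElement(module, f"{{{CML_NS}}}property",
                         attrib={"dictRef": "compchem:scfEnergy"})
    scalar = ET.SubElement(prop, f"{{{CML_NS}}}scalar",
                           attrib={"units": "nonSi:hartree"})
    scalar.text = f"{energy_hartree:.6f}"
    return ET.tostring(module, encoding="unicode")

doc = single_point_energy_doc("H 2 O 1", -76.026760)
```

The point of the dictionary-reference design is that validators and repositories can interpret `dictRef` terms without hard-coding every quantum chemistry package's output format.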
Amaratunga, Thelina; Dobranowski, Julian
2016-09-01
Preventable yet clinically significant rates of medical error remain systemic, while health care spending is at a historic high. Industry-based quality improvement (QI) methodologies show potential for utility in health care and radiology because they use an empirical approach to reduce variability and improve workflow. The aim of this review was to systematically assess the literature with regard to the use and efficacy of Lean and Six Sigma (the most popular of the industrial QI methodologies) within radiology. MEDLINE, the Allied & Complementary Medicine Database, Embase Classic + Embase, Health and Psychosocial Instruments, and the Ovid HealthStar database, alongside the Cochrane Library databases, were searched in June 2015. Empirical studies in peer-reviewed journals were included if they assessed the use of Lean, Six Sigma, or Lean Six Sigma with regard to their ability to improve a variety of quality metrics in a radiology-centered clinical setting. Of the 278 articles returned, 23 studies were suitable for inclusion. Of these, 10 assessed Six Sigma, 7 assessed Lean, and 6 assessed Lean Six Sigma. The diverse range of measured outcomes can be organized into 7 common aims: cost savings, reducing appointment wait time, reducing in-department wait time, increasing patient volume, reducing cycle time, reducing defects, and increasing staff and patient safety and satisfaction. All of the included studies demonstrated improvements across a variety of outcomes. However, there were high rates of systematic bias and imprecision as per the Grading of Recommendations Assessment, Development and Evaluation guidelines. Lean and Six Sigma QI methodologies have the potential to reduce error and costs and improve quality within radiology. However, there is a pressing need to conduct high-quality studies in order to realize the true potential of these QI methodologies in health care and radiology. Recommendations on how to improve the quality of the literature are proposed.
Root resorption during orthodontic treatment.
Walker, Sally
2010-01-01
Medline, Embase, LILACS, The Cochrane Library (Cochrane Database of Systematic Reviews, CENTRAL, and Cochrane Oral Health Group Trials Register) Web of Science, EBM Reviews, Computer Retrieval of Information on Scientific Project (CRISP, www.crisp.cit.nih.gov), On-Line Computer Library Center (www.oclc.org), Google Index to Scientific and Technical Proceedings, PAHO (www.paho.org), WHOLis (www.who.int/library/databases/en), BBO (Brazilian Bibliography of Dentistry), CEPS (Chinese Electronic Periodical Services), Conference materials (www.bl.uk/services/bsds/dsc/conference.html), ProQuest Dissertation Abstracts and Thesis database, TrialCentral (www.trialscentral.org), National Research Register (www.controlled-trials.com), www.Clinicaltrials.gov and SIGLE (System for Information on Grey Literature in Europe). Randomised controlled trials including split mouth design, recording the presence or absence of external apical root resorption (EARR) by treatment group at the end of the treatment period. Data were extracted independently by two reviewers using specially designed and piloted forms. Quality was also assessed independently by the same reviewers. After evaluating titles and abstracts, 144 full articles were obtained of which 13 articles, describing 11 trials, fulfilled the criteria for inclusion. Differences in the methodological approaches and reporting results made quantitative statistical comparisons impossible. Evidence suggests that comprehensive orthodontic treatment causes increased incidence and severity of root resorption, and heavy forces might be particularly harmful. Orthodontically induced inflammatory root resorption is unaffected by archwire sequencing, bracket prescription, and self-ligation. Previous trauma and tooth morphology are unlikely causative factors. There is some evidence that a two- to three-month pause in treatment decreases total root resorption. 
The results were inconclusive regarding the clinical management of root resorption, but there is evidence to support the use of light forces, especially with incisor intrusion.
Quality of reporting randomized controlled trials (RCTs) in diabetes in Iran; a systematic review.
Gohari, Faeze; Baradaran, Hamid Reza; Tabatabaee, Morteza; Anijidani, Shabnam; Mohammadpour Touserkani, Fatemeh; Atlasi, Rasha; Razmgir, Maryam
2015-01-01
To determine the quality of randomized controlled clinical trial (RCT) reports in diabetes research in Iran. Systematized review. We included RCTs conducted on diabetes mellitus in Iran. Animal studies, educational interventions, and non-randomized trials were excluded. We excluded duplicated publications reporting the same groups of participants and intervention. Two independent reviewers identified all eligible articles using a specifically designed data extraction form. We searched international databases (Scopus, ProQuest, EBSCO, Science Direct, Web of Science, Cochrane Library, PubMed) and national databases (in Persian) such as Magiran, Scientific Information Database (SID) and IranMedex from January 1995 to January 2013. Two investigators assessed the quality of reporting using the CONSORT 2010 (Consolidated Standards of Reporting Trials) checklist statement. Discrepancies were resolved by consulting a third reviewer. One hundred and eighty-five (185) studies were included and appraised. Just over half (55.7%) were published in Iranian journals. Most (89.7%) were parallel RCTs, and most were performed on type 2 diabetic patients (77.8%). Overall, less than half of the CONSORT items (43.2%) were reported in the studies. The reporting of randomization and blinding was poor: only 15.1% of studies mentioned the method of random sequence generation and the strategy of allocation concealment, and only 34.8% of trials reported how blinding was applied. The findings of this study show that the quality of RCTs conducted in Iran in diabetes research seems suboptimal and the reporting is incomplete; however, an increasing trend of improvement can be seen over time. Iranian researchers should therefore pay much more attention to design and methodological quality when conducting and reporting diabetes RCTs.
Pang, Marco YC; Eng, Janice J; Dawson, Andrew S; Gylfadóttir, Sif
2011-01-01
Objective To determine whether aerobic exercise improves aerobic capacity in individuals with stroke. Design A systematic review of randomized controlled trials. Databases searched MEDLINE, CINAHL, EMBASE, Cochrane Database of Systematic Reviews, Physiotherapy Evidence Database were searched. Inclusion criteria Design: randomized controlled trials; Participants: individuals with stroke; Interventions: aerobic exercise training aimed at improving aerobic capacity; Outcomes Primary outcomes: aerobic capacity [peak oxygen consumption (VO2), peak workload); Secondary outcomes: walking velocity, walking endurance. Data Analysis The methodological quality was assessed by the PEDro scale. Meta-analyses were performed for all primary and secondary outcomes. Results Nine articles (seven RCTs) were identified. The exercise intensity ranged from 50% to 80% heart rate reserve. Exercise duration was 20–40 minutes for 3–5 days a week. The total number of subjects included in the studies was 480. All studies reported positive effects on aerobic capacity, regardless of the stage of stroke recovery. Meta-analysis revealed a significant homogeneous standardized effect size (SES) in favour of aerobic exercise to improve peak VO2 (SES, 0.42; 95%CI, 0.15 to 0.69; p=0.001) and peak workload (SES, 0.50; 95%CI, 0.26 to 0.73; p<0.001). There was also a significant homogeneous SES in favour of aerobic training to improve walking velocity (SES, 0.26; 95%CI, 0.05 to 0.48; p=0.008) and walking endurance (SES, 0.30; 95%CI, 0.06 to 0.55; p=0.008). Conclusions There is good evidence that aerobic exercise is beneficial for improving aerobic capacity in people with mild and moderate stroke. Aerobic exercise should be an important component of stroke rehabilitation. PMID:16541930
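Standardized effect sizes of the kind reported above are typically pooled by inverse-variance weighting. A minimal fixed-effect sketch, using made-up numbers rather than the review's data:

```python
from math import sqrt

def pooled_effect(effects, variances):
    """Fixed-effect (inverse-variance) pooling of per-study standardized effect
    sizes, returning the pooled estimate and its 95% confidence interval.
    Each study is weighted by the reciprocal of its variance, so more precise
    studies pull the pooled value harder."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = sqrt(1.0 / sum(weights))  # standard error of the pooled estimate
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Illustrative numbers only: three hypothetical small trials.
est, (lo, hi) = pooled_effect([0.5, 0.3, 0.6], [0.04, 0.05, 0.06])
```

When the studies are homogeneous, as the review reports for its outcomes, this fixed-effect estimate and a random-effects estimate largely coincide.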
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1988-12-01
This document contains twelve papers on various aspects of low-level radioactive waste management. Topics of this volume include: performance assessment methodology; remedial action alternatives; site selection and site characterization procedures; intruder scenarios; sensitivity analysis procedures; mathematical models for mixed waste environmental transport; and risk assessment methodology. Individual papers were processed separately for the database. (TEM)
NASA Astrophysics Data System (ADS)
Vázquez-Suñé, Enric; Ángel Marazuela, Miguel; Velasco, Violeta; Diviu, Marc; Pérez-Estaún, Andrés; Álvarez-Marrón, Joaquina
2016-09-01
The overdevelopment of cities since the industrial revolution has shown the need to incorporate sound geological knowledge in the management of required subsurface infrastructures and in the assessment of increasingly needed groundwater resources. Additionally, the scarcity of outcrops and the technical difficulty of conducting underground exploration in urban areas highlight the importance of implementing efficient management plans that deal with the legacy of heterogeneous subsurface information. To deal with these difficulties, a methodology has been proposed to integrate all the available spatio-temporal data into a comprehensive spatial database, together with a set of tools that facilitates the analysis and processing of the existing and newly added data, for the city of Barcelona (NE Spain). Here we present the resulting 3-D geological model of the subsurface, which incorporates and articulates all the information stored in the database. The methodology applied to Barcelona benefited from good collaboration between administrative bodies and researchers, which enabled the realization of a comprehensive geological database despite logistic difficulties. Currently, both the public administration and the private sector benefit from the geological understanding acquired in the city of Barcelona, for example when preparing the hydrogeological models used in groundwater assessment plans. The methodology further facilitates the continuous incorporation of new data in the implementation and sustainable management of urban groundwater, and also contributes to significantly reducing the costs of new infrastructures.
MPD3: a useful medicinal plants database for drug designing.
Mumtaz, Arooj; Ashfaq, Usman Ali; Ul Qamar, Muhammad Tahir; Anwar, Farooq; Gulzar, Faisal; Ali, Muhammad Amjad; Saari, Nazamid; Pervez, Muhammad Tariq
2017-06-01
Medicinal plants are the main natural pools for the discovery and development of new drugs. In the modern era of computer-aided drug designing (CADD), prompt efforts are needed to design and construct a useful database management system that allows proper data storage, retrieval and management with a user-friendly interface. An inclusive database holding information on the classification, activity and a ready-to-dock library of medicinal plants' phytochemicals is therefore required to assist researchers in the field of CADD. The present work was designed to merge the activities of phytochemicals from medicinal plants, their targets and literature references into a single comprehensive database, named the Medicinal Plants Database for Drug Designing (MPD3). The newly designed online and downloadable MPD3 contains information about more than 5000 phytochemicals from around 1000 medicinal plants with 80 different activities, more than 900 literature references and over 200 targets. The database is deemed to be very useful for researchers engaged in medicinal plants research, CADD and drug discovery/development, with ease of operation and increased efficiency. MPD3 is a comprehensive database which provides most of the information related to medicinal plants on a single platform. MPD3 is freely available at: http://bioinform.info.
Systematic review of the diagnostic category muscle dysmorphia.
Santos Filho, Celso Alves dos; Tirico, Patrícia Passarelli; Stefano, Sergio Carlos; Touyz, Stephen W; Claudino, Angélica Medeiros
2016-04-01
(1) To collect, analyze and synthesize the evidence on the muscle dysmorphia diagnosis as defined by Pope et al., and (2) to discuss its appropriate nosology and inclusion as a specific category in psychiatric classificatory systems. A systematic search of the MEDLINE, PsycNET, LILACS and SciELO databases and the International Journal of Eating Disorders was conducted for articles published between January 1997 and October 2014, and of the EMBASE database for articles published between January 1997 and August 2013. Only epidemiological and analytical studies were considered for selection. The methodological quality of included studies was assessed according to the Evidence-Based Mental Health and the National Health and Medical Research Council's guidelines. The support for inclusion of muscle dysmorphia in psychiatric classificatory systems was examined against Blashfield et al.'s criteria. Thirty-four articles were considered eligible out of 5136. Most of the studies were cross-sectional and enrolled small, non-clinical samples. The methodological quality of all selected papers was graded at the lowest hierarchical level due to the studies' designs. Forty-one percent of the publications considered the available evidence insufficient to support the inclusion of muscle dysmorphia in any existing category of psychiatric disorders. The current literature does not fulfill Blashfield et al.'s criteria for the inclusion of muscle dysmorphia as a specific entity in psychiatric diagnostic manuals. The current evidence does not ensure the validity, clinical utility, nosological classification and inclusion of muscle dysmorphia as a new disorder in classificatory systems of mental disorders. © The Royal Australian and New Zealand College of Psychiatrists 2015.
NASA Astrophysics Data System (ADS)
Gruyters, Willem; Verboven, Pieter; Rogge, Seppe; Vanmaercke, Simon; Ramon, Herman; Nicolai, Bart
2017-10-01
Freshly harvested horticultural produce requires proper temperature management to maintain its high economic value. To this end, low-temperature storage is of crucial importance for maintaining high product quality. Optimizing both the package design of packed produce and the different steps in the postharvest cold chain can be achieved by numerical modelling of the relevant transport phenomena. This work presents a novel methodology to accurately model both the random filling of produce in a package and the subsequent cooling process. First, a cultivar-specific database of more than 100 realistic CAD models of apple and pear fruit is built with a validated geometrical 3D shape model generator. To give an accurate representation of a realistic picking season, the model generator also takes into account the biological variability of the produce shape. Next, a discrete element model (DEM) randomly chooses surface-meshed bodies from the database to simulate the gravitational filling process of produce in a box or bin, using actual mechanical properties of the fruit. A computational fluid dynamics (CFD) model is then developed with the final stacking arrangement of the produce to study the cooling efficiency of packages under several conditions and configurations. Here, a typical precooling operation is simulated to demonstrate the large differences between using actual 3D shapes of the fruit and an equivalent-spheres approach that simplifies the problem drastically. From this study, it is concluded that using a simplified representation of the actual fruit shape may lead to a severe overestimation of the cooling behaviour.
MicroRNAs associated with exercise and diet: a systematic review.
Flowers, Elena; Won, Gloria Y; Fukuoka, Yoshimi
2015-01-01
MicroRNAs are posttranscriptional regulators of gene expression. MicroRNAs reflect individual biologic adaptation to exposures in the environment. As such, measurement of circulating microRNAs presents an opportunity to evaluate biologic changes associated with behavioral interventions (i.e., exercise, diet) for weight loss. The aim of this study was to perform a systematic review of the literature to summarize what is known about circulating microRNAs associated with exercise, diet, and weight loss. We performed a systematic review of three scientific databases. We included studies reporting on circulating microRNAs associated with exercise, diet, and weight loss in humans. Of 1,219 studies identified in our comprehensive database search, 14 were selected for inclusion. Twelve reported on microRNAs associated with exercise, and two reported on microRNAs associated with diet and weight loss. The majority of studies used a quasi-experimental, cross-sectional design. There were numerous differences in the type and intensity of the exercise and dietary interventions, the biologic source of microRNAs, and the methodological approaches used to quantitate microRNAs. Data from several studies support an association between circulating microRNAs and exercise. The evidence for an association between circulating microRNAs and diet is weaker because of the small number of studies. Additional research is needed to validate previous observations using methodologically rigorous approaches to microRNA quantitation, to determine the specific circulating microRNA signatures associated with behavioral approaches to weight loss. Future directions include longitudinal studies to determine whether circulating microRNAs are predictive of response to behavioral interventions. Copyright © 2015 the American Physiological Society.
ERIC Educational Resources Information Center
Khalil, Deena; Kier, Meredith
2017-01-01
This article is about introducing Critical Race Design (CRD), a research methodology that centers race and equity at the nucleus of educational opportunities by design. First, the authors define design-based implementation research (DBIR; Penuel, Fishman, Cheng, & Sabelli, 2011) as an equity-oriented education research methodology where…
47 CFR 0.241 - Authority delegated.
Code of Federal Regulations, 2012 CFR
2012-10-01
... database functions for unlicensed devices operating in the television broadcast bands (TV bands) as set... methods that will be used to designate TV bands database managers, to designate these database managers; to develop procedures that these database managers will use to ensure compliance with the...
47 CFR 0.241 - Authority delegated.
Code of Federal Regulations, 2013 CFR
2013-10-01
... database functions for unlicensed devices operating in the television broadcast bands (TV bands) as set... methods that will be used to designate TV bands database managers, to designate these database managers; to develop procedures that these database managers will use to ensure compliance with the...
47 CFR 0.241 - Authority delegated.
Code of Federal Regulations, 2011 CFR
2011-10-01
... database functions for unlicensed devices operating in the television broadcast bands (TV bands) as set... methods that will be used to designate TV bands database managers, to designate these database managers; to develop procedures that these database managers will use to ensure compliance with the...
NASA Technical Reports Server (NTRS)
Hark, Frank; Britton, Paul; Ring, Rob; Novack, Steven D.
2016-01-01
Common Cause Failures (CCFs) are a known and documented phenomenon that defeats system redundancy. CCFs are a set of dependent failures that can be caused by, for example: system environments; manufacturing; transportation; storage; maintenance; and assembly. Since there are many factors that contribute to CCFs, the effects can be reduced, but they are difficult to eliminate entirely. Furthermore, failure databases sometimes fail to differentiate between independent and CCF (dependent) failures, and data are limited, especially for launch vehicles. The Probabilistic Risk Assessment (PRA) team of NASA's Safety and Mission Assurance Directorate at Marshall Space Flight Center (MSFC) is using generic data from the Nuclear Regulatory Commission's database of common cause failures at nuclear power plants to estimate CCF, due to the lack of a more appropriate data source. There remains uncertainty in the actual magnitude of the common cause risk estimates for different systems at this stage of the design. Given the limited data about launch vehicle CCF, and that launch vehicles are highly redundant systems by design, it is important to make design decisions that account for a range of values for independent failures and CCFs. When investigating the design of the one-out-of-two component redundant system for launch vehicles, a response surface was constructed to represent the impact of the independent failure rate versus a common cause beta factor on a system's failure probability. This presentation will define a CCF and review estimation calculations. It gives a summary of reduction methodologies and a review of examples of historical CCFs. Finally, it presents the response surface and discusses the results of the different CCFs on the reliability of a one-out-of-two system.
NASA Technical Reports Server (NTRS)
Hark, Frank; Britton, Paul; Ring, Rob; Novack, Steven D.
2015-01-01
Common Cause Failures (CCFs) are a known and documented phenomenon that defeats system redundancy. CCFs are a set of dependent failures that can be caused by, for example: system environments; manufacturing; transportation; storage; maintenance; and assembly. Since there are many factors that contribute to CCFs, the effects can be reduced, but they are difficult to eliminate entirely. Furthermore, failure databases sometimes fail to differentiate between independent and CCF (dependent) failures, and data are limited, especially for launch vehicles. The Probabilistic Risk Assessment (PRA) team of NASA's Safety and Mission Assurance Directorate at Marshall Space Flight Center (MSFC) is using generic data from the Nuclear Regulatory Commission's database of common cause failures at nuclear power plants to estimate CCF, due to the lack of a more appropriate data source. There remains uncertainty in the actual magnitude of the common cause risk estimates for different systems at this stage of the design. Given the limited data about launch vehicle CCF, and that launch vehicles are highly redundant systems by design, it is important to make design decisions that account for a range of values for independent failures and CCFs. When investigating the design of the one-out-of-two component redundant system for launch vehicles, a response surface was constructed to represent the impact of the independent failure rate versus a common cause beta factor on a system's failure probability. This presentation will define a CCF and review estimation calculations. It gives a summary of reduction methodologies and a review of examples of historical CCFs. Finally, it presents the response surface and discusses the results of the different CCFs on the reliability of a one-out-of-two system.
Utility of Army Design Methodology in U.S. Coast Guard Counter Narcotic Interdiction Strategy
2017-06-09
Thesis. Dates covered: AUG 2016 – JUN 2017. Distribution is Unlimited. This study investigates the utility of using Army Design Methodology (ADM) to
2017-11-01
ARL-TR-8225, NOV 2017, US Army Research Laboratory. Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques.
Common Bolted Joint Analysis Tool
NASA Technical Reports Server (NTRS)
Imtiaz, Kauser
2011-01-01
Common Bolted Joint Analysis Tool (comBAT) is an Excel/VB-based bolted joint analysis/optimization program that lays out a systematic foundation for an inexperienced or seasoned analyst to determine fastener size, material, and assembly torque for a given design. Analysts are able to perform numerous what-if scenarios within minutes to arrive at an optimal solution. The program evaluates input design parameters, performs joint assembly checks, and steps through numerous calculations to arrive at several key margins of safety for each member in a joint. It also checks for joint gapping, provides fatigue calculations, and generates joint diagrams for a visual reference. Optimum fastener size and material, as well as correct torque, can then be provided. Analysis methodology, equations, and guidelines are provided throughout the solution sequence so that this program does not become a "black box" for the analyst. There are built-in databases that reduce the legwork required by the analyst. Each step is clearly identified and results are provided in number format, as well as color-coded spelled-out words, to draw user attention. The three key features of the software are robust technical content, innovative and user-friendly I/O, and a large database. The program addresses every aspect of bolted joint analysis and proves to be an instructional tool at the same time. It saves analysis time, has intelligent messaging features, and catches operator errors in real time.
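As a rough sketch of the kind of calculation such a tool automates, the following uses the standard torque-tension relation and a conventional margin-of-safety definition; the function names, numbers, and factors are illustrative assumptions, not comBAT's internals:

```python
def bolt_preload(torque_Nm, nut_factor, diameter_m):
    """Torque-tension relation T = K * F * d, solved for the preload F."""
    return torque_Nm / (nut_factor * diameter_m)

def margin_of_safety(allowable_N, applied_N, factor_of_safety):
    """MS = allowable / (FS * applied) - 1; a positive margin passes."""
    return allowable_N / (factor_of_safety * applied_N) - 1.0

# Illustrative numbers: M8 bolt, 20 N*m assembly torque, typical nut factor 0.2.
preload = bolt_preload(20.0, 0.2, 0.008)     # preload in newtons
ms = margin_of_safety(28000.0, preload, 1.4)  # margin against a tensile allowable
```

A real bolted-joint analysis adds external load sharing through the joint stiffness ratio, preload uncertainty, gapping and fatigue checks, which is where a systematic tool pays off over hand calculation.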
A unified approach to the design of clinical reporting systems.
Gouveia-Oliveira, A; Salgado, N C; Azevedo, A P; Lopes, L; Raposo, V D; Almeida, I; de Melo, F G
1994-12-01
Computer-based Clinical Reporting Systems (CRS) for diagnostic departments that use structured data entry have a number of functional and structural affinities suggesting that a common software architecture for CRS may be defined. Such an architecture should allow easy expandability and reusability of a CRS. We report the development methodology and the architecture of SISCOPE, a CRS originally designed for gastrointestinal endoscopy that is expandable and reusable. Its main components are a patient database, a knowledge base, a reports base, and screen and reporting engines. The knowledge base contains the description of the controlled vocabulary and all the information necessary to control the menu system, and is easily accessed and modified with a conventional text editor. The structure of the controlled vocabulary is formally presented as an entity-relationship diagram. The screen engine drives a dynamic user interface and the reporting engine automatically creates a medical report; both engines operate by following a set of rules and the information contained in the knowledge base. Clinical experience has shown this architecture to be highly flexible and to allow frequent modifications of both the vocabulary and the menu system. This structure provided increased collaboration among development teams, insulating the domain expert from the details of the database, and enabling him to modify the system as necessary and to test the changes immediately. The system has also been reused in several different domains.
Relational Database Design in Information Science Education.
ERIC Educational Resources Information Center
Brooks, Terrence A.
1985-01-01
Reports on database management system (dbms) applications designed by library school students for university community at University of Iowa. Three dbms design issues are examined: synthesis of relations, analysis of relations (normalization procedure), and data dictionary usage. Database planning prior to automation using data dictionary approach…
Exploring inattention and distraction in the SafetyNet Accident Causation Database.
Talbot, Rachel; Fagerlind, Helen; Morris, Andrew
2013-11-01
Distraction and inattention are considered to be very important and prevalent factors in the causation of road accidents. There have been many recent research studies which have attempted to understand the circumstances under which a driver becomes distracted or inattentive and how distraction/inattention can be prevented. Both factors are thought to have become more important in recent times, partly due to the evolution of in-vehicle information and communication technology. This study describes a methodology that was developed to understand when factors such as distraction and inattention may have contributed to crashes, and also describes some of the consequences of distraction and inattention in terms of subsequent driver actions. The study uses data relating to distraction and inattention from the SafetyNet Accident Causation Database. This database was formulated as part of the SafetyNet project to address the lack of representative in-depth accident causation data within the European Union. Data were collected in 6 European countries using 'on-scene' and 'nearly on-scene' crash investigation methodologies. 32% of crashes recorded in the database involved at least one driver, rider or pedestrian who was determined to be 'Inattentive' or 'Distracted'. 212 of the drivers were assigned the code 'Distraction' and 140 drivers were given the code 'Inattention'. It was found that both distraction and inattention often lead to missed observations within the driving task, and consequently 'Timing' or 'Direction' become critical events in the aetiology of crashes. In addition, the crash types and outcomes may differ according to the type and nature of the distraction or inattention, as determined by the in-depth investigations. The development of the accident coding methodology is described in this study, as is its evolution into the Driver Reliability and Error Analysis Model (DREAM) version 3.0. Copyright © 2012 Elsevier Ltd. All rights reserved.
Cainzos-Achirica, Miguel; Varas-Lorenzo, Cristina; Pottegård, Anton; Asmar, Joelle; Plana, Estel; Rasmussen, Lotte; Bizouard, Geoffray; Forns, Joan; Hellfritzsch, Maja; Zint, Kristina; Perez-Gutthann, Susana; Pladevall-Vila, Manel
2018-03-23
To report and discuss estimated prevalence of potential off-label use and associated methodological challenges using a case study of dabigatran. Observational, cross-sectional study using 3 databases with different types of clinical information available: Cegedim Strategic Data Longitudinal Patient Database (CSD-LPD), France (cardiologist panel, n = 1706; general practitioner panel, n = 2813; primary care data); National Health Databases, Denmark (n = 28 619; hospital episodes and dispensed ambulatory medications); and Clinical Practice Research Datalink (CPRD), UK (linkable to Hospital Episode Statistics [HES], n = 2150; not linkable, n = 1285; primary care data plus hospital data for HES-linkable patients). August 2011 to August 2015. Two definitions were used to estimate potential off-label use: a broad definition of on-label prescribing using codes for disease indication (eg, atrial fibrillation [AF]), and a restrictive definition excluding patients with conditions for which dabigatran is not indicated (eg, valvular AF). Prevalence estimates under the broad definition ranged from 5.7% (CPRD-HES) to 34.0% (CSD-LPD) and, under the restrictive definition, from 17.4% (CPRD-HES) to 44.1% (CSD-LPD). For the majority of potential off-label users, no diagnosis potentially related to anticoagulant use was identified. Key methodological challenges were the limited availability of detailed clinical information, likely leading to overestimation of off-label use, and differences in the information available, which may explain the disparate prevalence estimates across data sources. Estimates of potential off-label use should be interpreted cautiously due to limitations in available information. In this context, CPRD HES-linkable estimates are likely to be the most accurate. Copyright © 2018 John Wiley & Sons, Ltd.
Cochrane Systematic Reviews of Chinese Herbal Medicines: An Overview
Hu, Jing; Zhang, Junhua; Zhao, Wei; Zhang, Yongling; Zhang, Li; Shang, Hongcai
2011-01-01
Objectives Our study had two objectives: a) to systematically identify all existing systematic reviews of Chinese herbal medicines (CHM) published in the Cochrane Library; b) to assess the methodological quality of the included reviews. Methodology/Principal Findings We performed a systematic search of the Cochrane Database of Systematic Reviews (CDSR, Issue 5, 2010) to identify all reviews of CHM. A total of fifty-eight reviews were eligible for our study. Twenty-one of the included reviews had at least one Traditional Chinese Medicine (TCM) practitioner as a co-author. Seven reviews did not include any primary study; the remaining reviews (n = 51) included a median of 9 studies and 936 participants. 50% of the reviews were last assessed as up-to-date prior to 2008. The questions addressed by 39 reviews were broad in scope, and 9 of these reviews combined studies of different herbal medicines. For the OQAQ, the mean overall quality score (item 10) was 5.05 (95% CI: 4.58-5.52). All reviews assessed the methodological quality of the primary studies: 16% of the included primary studies used adequate sequence generation and 7% used adequate allocation concealment. Of the 51 non-empty reviews, 23 were reported as being inconclusive, while 27 concluded that there might be benefit of CHM, a conclusion limited by the poor quality or inadequate quantity of the included studies. The 58 reviews reported searching a median of seven electronic databases, while 10 reviews did not search any Chinese database. Conclusions The CDSR now includes a large number of CHM reviews. Our study identified some areas which could be improved: almost half of the included reviews did not have the participation of TCM practitioners and were not up-to-date according to Cochrane criteria, and some reviews pooled the results of different herbal medicines or omitted searches of Chinese databases. PMID:22174870
Conceptual and logical level of database modeling
NASA Astrophysics Data System (ADS)
Hunka, Frantisek; Matula, Jiri
2016-06-01
The conceptual and logical levels form the topmost levels of database modeling. Usually, ORM (Object Role Modeling) and ER diagrams are utilized to capture the corresponding schema. The final aim of business process modeling is to store its results in the form of a database solution. For this reason, value-oriented business process modeling, which utilizes ER diagrams to express the modeled entities and the relationships between them, is used. However, ER diagrams form the logical level of a database schema. To extend the possibilities of different business process modeling methodologies, the conceptual level of database modeling is needed. The paper deals with the REA value modeling approach to business process modeling using ER diagrams, and derives a conceptual model utilizing the ORM modeling approach. The conceptual model extends the possibilities of value modeling to other business modeling approaches.
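The gap between the two levels can be made concrete with a small sketch. At the conceptual level, the REA pattern distinguishes Resources, Events and Agents and the relationships among them; at the logical (ER/relational) level each becomes a keyed record, with foreign keys standing in for those relationships. The entity names, keys, and the validity check below are illustrative assumptions, not taken from the paper:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Resource:   # economic resource, e.g. cash or goods
    id: int
    name: str

@dataclass(frozen=True)
class Agent:      # economic agent, e.g. enterprise or customer
    id: int
    name: str

@dataclass(frozen=True)
class Event:      # economic event linking one resource and two agents
    id: int
    resource_id: int   # FK -> Resource (stock-flow relationship)
    provider_id: int   # FK -> Agent (participation)
    receiver_id: int   # FK -> Agent (participation)

def referentially_valid(events, resources, agents):
    """Logical-level constraint: every foreign key in an Event resolves."""
    rids = {r.id for r in resources}
    aids = {a.id for a in agents}
    return all(e.resource_id in rids and e.provider_id in aids
               and e.receiver_id in aids for e in events)
```

The conceptual (ORM-style) statement "an Event consumes a Resource provided by one Agent to another" survives only implicitly in the foreign-key columns, which is exactly the information loss the paper's conceptual level is meant to recover.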
Hypnotherapy for insomnia: a systematic review and meta-analysis of randomized controlled trials.
Lam, Tak-Ho; Chung, Ka-Fai; Yeung, Wing-Fai; Yu, Branda Yee-Man; Yung, Kam-Ping; Ng, Tommy Ho-Yee
2015-10-01
To examine the efficacy and safety of hypnotherapy for insomnia as compared to placebo, pharmacological or non-pharmacological intervention, or no treatment. A systematic search on major electronic databases was conducted up until March 2014. Inclusion criteria are: (1) randomized controlled trials (RCTs) or quasi-RCTs; (2) intervention targeted at improving sleep; (3) hypnosis as an intervention; and (4) English language articles. Sleep diary variable is the primary outcome measure. Six RCTs of hypnotherapy and seven on autogenic training or guided imagery, comprising 502 subjects, were included. Eleven of the 13 studies had low methodological quality, as indicated by a modified Jadad score below 3, and high risks of bias in blinding and design of the control interventions. No adverse events related to hypnosis were reported, though seldom investigated. Meta-analyses found hypnotherapy significantly shortened sleep latency compared to waitlist (standardized mean difference, SMD=-0.88, 95% confidence interval (CI): -1.56, -0.19, P=0.01, I(2)=15%), but no difference compared to sham intervention (SMD: -1.08, 95% CI: -3.15, 0.09, P=0.31, I(2)=90%). Similar results were found for autogenic training or guided imagery (SMD with waitlist=-1.16, 95% CI: -1.92, -0.40, P=0.003, I(2)=0%; SMD with sham intervention=-0.50, 95% CI: -1.19, 0.19, P=0.15, I(2)=0%). Generalizability of the positive results is doubtful due to the relatively small sample size and methodological limitations. Future studies with larger sample size and better study design and methodology are called for. Copyright © 2015 Elsevier Ltd. All rights reserved.
Olah, Emoke; Poto, Laszlo; Hegyi, Peter; Szabo, Imre; Hartmann, Petra; Solymar, Margit; Petervari, Erika; Balasko, Marta; Habon, Tamas; Rumbus, Zoltan; Tenk, Judit; Rostas, Ildiko; Weinberg, Jordan; Romanovsky, Andrej A; Garami, Andras
2018-04-21
Therapeutic hypothermia has been investigated repeatedly as a tool to improve the outcome of severe traumatic brain injury (TBI), but previous clinical trials and meta-analyses have found contradictory results. We aimed to determine the effectiveness of therapeutic whole-body hypothermia on the mortality of adult patients with severe TBI by using a novel approach to meta-analysis. We searched the PubMed, EMBASE, and Cochrane Library databases from inception to February 2017. The identified human studies were evaluated with regard to their statistical, clinical, and methodological designs to ensure inter-study homogeneity. We extracted data on TBI severity, body temperature, mortality, and cooling parameters; we then calculated the cooling index, an integrated measure of therapeutic hypothermia. A forest plot of all identified studies showed no difference in the outcome of TBI between cooled and non-cooled patients, but inter-study heterogeneity was high. On the contrary, by meta-analysis of RCTs that were homogeneous with regard to statistical and clinical designs and that precisely reported the cooling protocol, we showed a decreased odds ratio for mortality with therapeutic hypothermia compared to no cooling. As independent factors, milder and longer cooling and rewarming at < 0.25°C/h were associated with better outcomes. Therapeutic hypothermia was beneficial only if the cooling index (a measure of the combination of cooling parameters) was sufficiently high. We conclude that high methodological and statistical inter-study heterogeneity could underlie the contradictory results obtained in previous studies. By analyzing methodologically homogeneous studies, we show that cooling improves the outcome of severe TBI and that this beneficial effect depends on certain cooling parameters and on their integrated measure, the cooling index.
Sáez, M
2003-01-01
In Spain, the degree and characteristics of primary care services utilization have been the subject of analysis since at least the 1980s. One of the main reasons for this interest is to assess the extent to which utilization matches primary care needs. In fact, the provision of an adequate health service for those who most need it is a generally accepted priority. The evidence shows that individual characteristics, mainly health status, are the factors most closely related to primary care utilization. Other personal characteristics, such as gender and age, could act as modulators of health care need. Some family and/or cultural variables, as well as factors related to health care professionals and institutions, could explain some of the observed variability in primary care services utilization. Socioeconomic variables, such as income, reveal a paradox. From an aggregate perspective, income is the main determinant of utilization as well as of health care expenditure. When data are analyzed for individuals, however, income is not related to primary care utilization. The situation is controversial, with methodological implications and, above all, consequences for the assessment of efficiency in primary care utilization. Review of the literature reveals certain methodological inconsistencies that could at least partly explain the disparity of the empirical results. Among others, the following flaws can be highlighted: design problems, measurement errors, misspecification, and misleading statistical methods. Possible solutions include quasi-experiments and the use of large administrative databases and primary data sources (for design problems); differentiation between types of utilization and between units of analysis other than consultations, and correction of measurement errors in the explanatory variables (for measurement errors); consideration of relevant explanatory variables (for misspecification); and the use of multilevel models (for statistical methods).
Mueller, Monika; D'Addario, Maddalena; Egger, Matthias; Cevallos, Myriam; Dekkers, Olaf; Mugglin, Catrina; Scott, Pippa
2018-05-21
Systematic reviews and meta-analyses of observational studies are frequently performed, but no widely accepted guidance is available at present. We performed a systematic scoping review of published methodological recommendations on how to systematically review and meta-analyse observational studies. We searched online databases and websites and contacted experts in the field to locate potentially eligible articles. We included articles that provided any type of recommendation on how to conduct systematic reviews and meta-analyses of observational studies. We extracted and summarised recommendations on pre-defined key items: protocol development, research question, search strategy, study eligibility, data extraction, dealing with different study designs, risk of bias assessment, publication bias, heterogeneity, and statistical analysis. We summarised recommendations by key item, identifying areas of agreement and disagreement as well as areas where recommendations were missing or scarce. The searches identified 2461 articles, of which 93 were eligible. Many recommendations for reviews and meta-analyses of observational studies were transferred from guidance developed for reviews and meta-analyses of RCTs. Although there was substantial agreement in some methodological areas, there was also considerable disagreement on how evidence synthesis of observational studies should be conducted. Conflicting recommendations were seen on topics such as the inclusion of different study designs in systematic reviews and meta-analyses, the use of quality scales to assess the risk of bias, and the choice of model (e.g. fixed vs. random effects) for meta-analysis. There is a need for sound methodological guidance on how to conduct systematic reviews and meta-analyses of observational studies, guidance which critically considers areas in which there are conflicting recommendations.
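One of the conflicting points noted above, the choice between fixed- and random-effects models, can be made concrete with a small sketch. The following is a minimal, illustrative inverse-variance pooling routine with a DerSimonian-Laird estimate of between-study variance; the effect sizes and variances fed to it are invented numbers, not data from any study in the review.

```python
import math

def pool(effects, variances, random_effects=True):
    """Inverse-variance pooling of study effects; optional DerSimonian-Laird."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    if not random_effects:
        return fixed, math.sqrt(1.0 / sum(w))
    # DerSimonian-Laird estimate of between-study variance tau^2.
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # Re-weight each study by total (within + between) variance.
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    return pooled, math.sqrt(1.0 / sum(w_star))

# Hypothetical log odds ratios and their variances from three studies.
effects, variances = [0.10, 0.30, 0.25], [0.01, 0.02, 0.015]
print(pool(effects, variances, random_effects=False))  # fixed-effect estimate, SE
print(pool(effects, variances))                        # random-effects estimate, SE
```

Under heterogeneity the random-effects standard error is larger than the fixed-effect one, which is one practical consequence of the model choice the conflicting recommendations concern.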
Sazlina, Shariff-Ghazali; Browning, Colette; Yasin, Shajahan
2013-01-01
Introduction: Type 2 diabetes mellitus (T2DM) among people aged 60 years and above is a growing public health problem. Regular physical activity is one of the key elements in the management of T2DM. Recommendations suggest that older people with T2DM will benefit from regular physical activity for better disease control and delaying complications. Despite the known benefits, many remain sedentary. Hence, this review assessed interventions for promoting physical activity in persons aged 65 years and older with T2DM. Methods: A literature search was conducted using the Ovid MEDLINE, PubMed, EMBASE, SPORTDiscus, and CINAHL databases to retrieve articles published between January 2000 and December 2012. Randomized controlled trials and quasi-experimental designs comparing different strategies to increase the physical activity level in persons aged 65 years and older with T2DM were included. The methodological quality of studies was assessed. Results: Twenty-one eligible studies were reviewed; only six were rated as good quality, and only one specifically targeted persons aged 65 years and older. Personalized coaching, goal setting, peer support groups, use of technology, and physical activity monitors were shown to increase the level of physical activity. Incorporation of health behavior theories and follow-up support were also successful strategies. However, the methodological quality and type of interventions promoting physical activity varied widely across the eligible studies. Conclusion: Strategies that increase the level of physical activity in persons with T2DM are evident, but most studies focused on middle-aged persons and well-designed trials were lacking. Hence, more studies of satisfactory methodological quality with interventions promoting physical activity in older people are required. PMID:24392445
Lawal, Adegboyega K; Rotter, Thomas; Kinsman, Leigh; Sari, Nazmi; Harrison, Liz; Jeffery, Cathy; Kutz, Mareike; Khan, Mohammad F; Flynn, Rachel
2014-09-19
Lean is a set of operating philosophies and methods that help create maximum value for patients by reducing waste and waits. It emphasizes the consideration of the customer's needs, employee involvement and continuous improvement. Research on the application and implementation of lean principles in health care has been limited. This is a protocol for a systematic review, following the Cochrane Effective Practice and Organisation of Care (EPOC) methodology. The review aims to document, catalogue and synthesize the existing literature on the effects of lean implementation in health care settings, especially the potential effects on professional practice and health care outcomes. We have developed a Medline keyword search strategy, and this focused strategy will be translated into other databases. All search strategies will be provided in the review. The method proposed by the Cochrane EPOC group regarding randomized study designs, non-randomised controlled trials, controlled before-and-after studies and interrupted time series will be followed. In addition, we will also include cohort and case-control studies, and relevant non-comparative publications such as case reports. We will categorize and analyse the review findings according to the study design employed, the study quality (low- versus high-quality studies) and the reported types of implementation in the primary studies. We will present the results of studies in tabular form. Overall, the systematic review aims to identify, assess and synthesize the evidence to underpin the implementation of lean activities in health care settings as defined in this protocol. As a result, the review will provide an evidence base for the effectiveness of lean and implementation methodologies reported in health care. PROSPERO CRD42014008853.
Matthews, M; Rathleff, M S; Claus, A; McPoil, T; Nee, R; Crossley, K; Vicenzino, B
2017-12-01
Patellofemoral pain (PFP) is a multifactorial and often persistent knee condition. One strategy to enhance patient outcomes is to use clinically assessable patient characteristics to predict outcome and match a specific treatment to an individual. A systematic review was conducted to determine which baseline patient characteristics were (1) associated with patient outcome (prognosis); or (2) modified patient outcome from a specific treatment (treatment effect modifiers). Six electronic databases were searched (July 2016) for studies evaluating the association between those with PFP, their characteristics and outcome. All studies were appraised using the Epidemiological Appraisal Instrument. Studies that aimed to identify treatment effect modifiers underwent a checklist for methodological quality. The 24 included studies evaluated 180 participant characteristics. Twelve studies investigated prognosis, and 12 investigated potential treatment effect modifiers. Important methodological limitations were identified. Some prognostic studies used a retrospective design. Studies aiming to identify treatment effect modifiers often analysed too many variables for the limited sample size and typically failed to use a control or comparator treatment group. Sixteen factors were reported to be associated with a poor outcome, with longer duration of symptoms (>4 months) the most reported. Preliminary evidence suggests increased midfoot mobility may predict those who have a successful outcome with foot orthoses. Current evidence can identify those with increased risk of a poor outcome, but methodological limitations make it difficult to predict the outcome after one specific treatment compared with another. Adequately designed randomised trials are needed to identify treatment effect modifiers. Published by the BMJ Publishing Group Limited.
A statistical view of protein chemical synthesis using NCL and extended methodologies.
Agouridas, Vangelis; El Mahdi, Ouafâa; Cargoët, Marine; Melnyk, Oleg
2017-09-15
Native chemical ligation and extended methodologies are the most popular chemoselective reactions for protein chemical synthesis. Their combination with desulfurization techniques can give access to small or challenging proteins that are exploited in a large variety of research areas. In this report, we have conducted a statistical review of their use for protein chemical synthesis in order to provide a flavor of the recent trends and identify the most popular chemical tools used by protein chemists. To this end, a protein chemical synthesis (PCS) database (http://pcs-db.fr) was created by collecting a set of relevant data from more than 450 publications covering the period 1994-2017. A preliminary account of what this database tells us is presented in this report. Copyright © 2017 Elsevier Ltd. All rights reserved.
Gorrell, Lindsay M; Engel, Roger M; Lystad, Reidar P; Brown, Benjamin T
2017-03-14
Reporting of adverse events in randomized clinical trials (RCTs) is encouraged by the authors of the Consolidated Standards of Reporting Trials (CONSORT) statement. With robust methodological design and adequate reporting, RCTs have the potential to provide useful evidence on the incidence of adverse events associated with spinal manipulative therapy (SMT). During a previous investigation, it became apparent that comprehensive search strategies combining text words with indexing terms were not sufficiently sensitive for retrieving records that were known to contain reports on adverse events. The aim of this analysis was to compare the proportion of articles containing data on adverse events associated with SMT that were indexed in MEDLINE and/or EMBASE with the proportion of those that included adverse event-related words in their title or abstract. A sample of 140 RCT articles previously identified as containing data on adverse events associated with SMT was used. Articles were checked to determine whether: (1) they had been indexed with relevant terms describing adverse events in the MEDLINE and EMBASE databases; and (2) they mentioned adverse events (or any related terms) in the title or abstract. Of the 140 papers, 91% were MEDLINE records, 85% were EMBASE records, 81% were found in both MEDLINE and EMBASE records, and 4% were in neither database. Only 19% mentioned adverse event-related text words in the title or abstract. There was no significant difference between MEDLINE and EMBASE records in the proportion of available papers (p = 0.078). Of the 113 papers that were found in both MEDLINE and EMBASE records, only 3% had adverse event-related indexing terms assigned to them in both databases, while 81% were not assigned an adverse event-related indexing term in either database. While RCTs involving SMT were effectively indexed in the MEDLINE and EMBASE databases, adverse event indexing terms were frequently not assigned in either database.
We recommend the development of standardized definitions and reporting tools for adverse events associated with SMT. Adequate reporting of adverse events associated with SMT will facilitate accurate indexing of these types of manuscripts in the databases.
Resource-use measurement based on patient recall: issues and challenges for economic evaluation.
Thorn, Joanna C; Coast, Joanna; Cohen, David; Hollingworth, William; Knapp, Martin; Noble, Sian M; Ridyard, Colin; Wordsworth, Sarah; Hughes, Dyfrig
2013-06-01
Accurate resource-use measurement is challenging within an economic evaluation, but is a fundamental requirement for estimating efficiency. Considerable research effort has been concentrated on the appropriate measurement of outcomes and the policy implications of economic evaluation, while methods for resource-use measurement have been relatively neglected. Recently, the Database of Instruments for Resource Use Measurement (DIRUM) was set up at http://www.dirum.org to provide a repository where researchers can share resource-use measures and methods. A workshop to discuss the issues was held at the University of Birmingham in October 2011. Based on material presented at the workshop, this article highlights the state of the art of UK instruments for resource-use data collection based on patient recall. We consider methodological issues in the design and analysis of resource-use instruments, and the challenges associated with designing new questionnaires. We suggest a method of developing a good practice guideline, and identify some areas for future research. Consensus amongst health economists has yet to be reached on many aspects of resource-use measurement. We argue that researchers should now afford costing methodologies the same attention as outcome measurement, and we hope that this Current Opinion article will stimulate a debate on methods of resource-use data collection and establish a research agenda to improve the precision and accuracy of resource-use estimates.
Exploring resilience in nursing and midwifery students: a literature review.
McGowan, Jennifer E; Murray, Karen
2016-10-01
The aim of this study was to explore the concepts of 'resilience' and 'hardiness' in nursing and midwifery students in educational settings and to identify educational interventions to promote resilience. Resilience in healthcare professionals has gained increasing attention globally, yet to date resilience and resilience education in nursing and midwifery students remain largely under-researched. An integrative literature review was planned; however, only quantitative evidence was identified, so a review of quantitative studies was undertaken using a systematic approach. A comprehensive search was undertaken using the Medline, CINAHL, Embase, PsycINFO and Maternity and Infant Care databases (January 1980-February 2015). Data were extracted using a specifically designed form and quality was assessed using an appropriate checklist. A narrative summary of findings and statistical outcomes was undertaken. Eight quantitative studies were included. Research relating to resilience and resilience education in nursing and midwifery students is sparse. There is weak evidence that resilience and hardiness are associated with slightly improved academic performance and decreased burnout. However, studies were heterogeneous in design and limited by poor methodological quality. No study specifically considered student midwives. A greater understanding of the theoretical underpinnings of resilience in nursing and midwifery students is essential for the development of educational resources. It is imperative that future research considers both nursing and midwifery training cohorts and is of strong methodological quality. © 2016 John Wiley & Sons Ltd.
How to run an effective journal club: a systematic review.
Deenadayalan, Y; Grimmer-Somers, K; Prior, M; Kumar, S
2008-10-01
Health-based journal clubs have been in place for over 100 years. Participants meet regularly to critique research articles, to improve their understanding of research design, statistics and critical appraisal. However, there is no standard process for conducting an effective journal club. We conducted a systematic literature review to identify core processes of a successful health journal club. We searched a range of library databases using established keywords. All research designs were initially considered to establish the body of evidence. Experimental or comparative papers were then critically appraised for methodological quality and information was extracted on effective journal club processes. We identified 101 articles, of which 21 comprised the body of evidence. Of these, 12 described journal club effectiveness. Methodological quality was moderate. The papers described many processes of effective journal clubs. Over 80% of papers reported that the journal club intervention was effective in improving knowledge and critical appraisal skills. Few papers reported on the psychometric properties of their outcome instruments. No paper reported on the translation of evidence from journal club into clinical practice. Characteristics of successful journal clubs included regular and anticipated meetings, mandatory attendance, clear long- and short-term purpose, appropriate meeting timing and incentives, a trained journal club leader to choose papers and lead discussion, circulating papers prior to the meeting, using the internet for wider dissemination and data storage, using established critical appraisal processes and summarizing journal club findings.
Initiation of a Database of CEUS Ground Motions for NGA East
NASA Astrophysics Data System (ADS)
Cramer, C. H.
2007-12-01
The Nuclear Regulatory Commission has funded the first stage of development of a database of central and eastern US (CEUS) broadband and accelerograph records, along the lines of the existing Next Generation Attenuation (NGA) database for active tectonic areas. This database will form the foundation of an NGA East project for the development of CEUS ground-motion prediction equations that include the effects of soils. This initial effort covers the development of a database design and the beginning of data collection to populate the database. It also includes some processing for important source parameters (Brune corner frequency and stress drop) and site parameters (kappa, Vs30). Besides collecting appropriate earthquake recordings and information, existing information about site conditions at recording sites will also be gathered, including geology and geotechnical information. The long-range goal of the database development is to complete the database and make it available in 2010. The database design is centered on CEUS ground motion information needs but is built on the Pacific Earthquake Engineering Research Center's (PEER) NGA experience. Documentation from the PEER NGA website was reviewed and relevant fields incorporated into the CEUS database design. CEUS database tables include ones for earthquake, station, component, record, and references. As was done for NGA, a CEUS ground-motion flat file of key information will be extracted from the CEUS database for use in attenuation relation development. A short report on the CEUS database and several initial design-definition files are available at https://umdrive.memphis.edu:443/xythoswfs/webui/_xy-7843974_docstore1. Comments and suggestions on the database design can be sent to the author. More details will be presented in a poster at the meeting.
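The table layout described above (earthquake, station, record, and so on) and the extraction of a denormalized flat file can be sketched in a few lines. This is an illustrative sketch only: the table and column names below are assumptions for demonstration, not the actual CEUS/NGA schema.

```python
import sqlite3

# Hypothetical miniature of the relational layout described in the abstract;
# real CEUS tables would carry many more fields (component, references, etc.).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE earthquake (
    eq_id       INTEGER PRIMARY KEY,
    origin_time TEXT,
    magnitude   REAL,
    stress_drop REAL   -- Brune stress drop (source parameter)
);
CREATE TABLE station (
    sta_id INTEGER PRIMARY KEY,
    name   TEXT,
    vs30   REAL,       -- site parameter
    kappa  REAL        -- site parameter
);
CREATE TABLE record (
    rec_id INTEGER PRIMARY KEY,
    eq_id  INTEGER REFERENCES earthquake(eq_id),
    sta_id INTEGER REFERENCES station(sta_id),
    pga    REAL        -- peak ground acceleration of the recording
);
""")
conn.execute("INSERT INTO earthquake VALUES (1, '2007-04-01T00:00:00', 4.5, 150.0)")
conn.execute("INSERT INTO station VALUES (1, 'MEM', 760.0, 0.006)")
conn.execute("INSERT INTO record VALUES (1, 1, 1, 0.012)")

# The "flat file" for attenuation-relation work is then just a join that
# denormalizes key fields from all tables into one row per recording.
flat = conn.execute("""
    SELECT e.magnitude, s.vs30, r.pga
    FROM record r
    JOIN earthquake e ON r.eq_id = e.eq_id
    JOIN station s    ON r.sta_id = s.sta_id
""").fetchall()
print(flat)  # → [(4.5, 760.0, 0.012)]
```

The design choice mirrored here is the one the abstract describes: normalized tables for curation, with a one-row-per-record flat extract for model development.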
ERIC Educational Resources Information Center
Jakovljevic, Maria; Ankiewicz, Piet; De swardt, Estelle; Gross, Elna
2004-01-01
Traditional instructional methodology in the Information System Design (ISD) environment lacks explicit strategies for promoting the cognitive skills of prospective system designers. This contributes to the fragmented knowledge and low motivational and creative involvement of learners in system design tasks. In addition, present ISD methodologies,…
Methods for the guideline-based development of quality indicators--a systematic review
2012-01-01
Background Quality indicators (QIs) are used in many healthcare settings to measure, compare, and improve quality of care. For the efficient development of high-quality QIs, rigorous, approved, and evidence-based development methods are needed. Clinical practice guidelines are a suitable source to derive QIs from, but no gold standard for guideline-based QI development exists. This review aims to identify, describe, and compare methodological approaches to guideline-based QI development. Methods We systematically searched medical literature databases (Medline, EMBASE, and CINAHL) and grey literature. Two researchers selected publications reporting methodological approaches to guideline-based QI development. In order to describe and compare methodological approaches used in these publications, we extracted detailed information on common steps of guideline-based QI development (topic selection, guideline selection, extraction of recommendations, QI selection, practice test, and implementation) to predesigned extraction tables. Results From 8,697 hits in the database search and several grey literature documents, we selected 48 relevant references. The studies were of heterogeneous type and quality. We found no randomized controlled trial or other studies comparing the ability of different methodological approaches to guideline-based development to generate high-quality QIs. The relevant publications featured a wide variety of methodological approaches to guideline-based QI development, especially regarding guideline selection and extraction of recommendations. Only a few studies reported patient involvement. Conclusions Further research is needed to determine which elements of the methodological approaches identified, described, and compared in this review are best suited to constitute a gold standard for guideline-based QI development. For this research, we provide a comprehensive groundwork. PMID:22436067
Design and Optimization of a Telemetric system for appliance in earthquake prediction
NASA Astrophysics Data System (ADS)
Bogdos, G.; Tassoulas, E.; Vereses, A.; Papapanagiotou, A.; Filippi, K.; Koulouras, G.; Nomicos, C.
2009-04-01
This project's aim is to design a telemetric system that collects data from a digitizer, transforms it into an appropriate form, and transfers it to a central system where on-line data elaboration takes place. On-line mathematical elaboration (fractal analysis) of pre-seismic electromagnetic signals with instant display may lead to reliable earthquake-prediction methodologies. Ad-hoc connections and heterogeneous topologies form the core network, while wired and wireless links cooperate for accurate and on-time transmission. Because the data are considered very sensitive, transmission must be instantaneous. All stations are situated in rural locations to prevent electromagnetic interference; this requires continuous monitoring and the provision of backup data links. The central stations collect the data from every station and allocate them to a predefined database. Special software is designed to process the incoming data mathematically and export the results graphically. The development work included digitizer design, workstation software design, transmission-protocol study and simulation in OPNET, database programming, mathematical data elaboration, and software development for graphical representation. The entire package was tested under laboratory conditions and then under real conditions. The main significance of this project is the strong interest it holds for the scientific community should the platform eventually be implemented and installed across the Greek countryside on a large scale. The platform is designed so that data-mining techniques and mathematical elaboration are possible and any extension can be accommodated. Its main distinguishing feature is that these mechanisms and mathematical transformations can be applied to live data, which helps in rapidly extracting the real meaning of the measured and stored data. The primary intention of this study is to support and streamline the analysis process while encouraging the scientific community to monitor seismic activity in Greece on-line.
47 CFR 0.241 - Authority delegated.
Code of Federal Regulations, 2014 CFR
2014-10-01
... individual database managers; and to perform other functions as needed for the administration of the TV bands... database functions for unlicensed devices operating in the television broadcast bands (TV bands) as set... methods that will be used to designate TV bands database managers, to designate these database managers...
Bonnie Ruefenacht; Robert Benton; Vicky Johnson; Tanushree Biswas; Craig Baker; Mark Finco; Kevin Megown; John Coulston; Ken Winterberger; Mark Riley
2015-01-01
A tree canopy cover (TCC) layer is one of three elements in the National Land Cover Database (NLCD) 2011 suite of nationwide geospatial data layers. In 2010, the USDA Forest Service (USFS) committed to creating the TCC layer as a member of the Multi-Resolution Land Cover (MRLC) consortium. A general methodology for creating the TCC layer was reported at the 2012 FIA...
Analysis of the Database of Theses and Dissertations from DME/UFSCAR about Astronomy Education
NASA Astrophysics Data System (ADS)
Rodrigues Ferreira, Orlando; Voelzke, Marcos Rincon
2013-11-01
The paper presents a brief analysis of the "Database of Theses and Dissertations about Astronomy Education" from the Department of Teaching Methodology (DME) of the Federal University of São Carlos (UFSCar). This kind of study made it possible to develop new analyses and statistical data, as well as to rank the Brazilian institutions that produce academic work in the area.
A Methodology for Benchmarking Relational Database Machines,
1984-01-01
…user benchmarks is to compare the multiple users to the best-case performance… The data for each query classification coll… and the performance… called a benchmark. The term benchmark originates from the markers used by surveyors in establishing common reference points for their measure… formatted databases. In order to further simplify the problem, we restrict our study to those DBMs which support the relational model. A survey…
Aerodynamic Optimization of Rocket Control Surface Geometry Using Cartesian Methods and CAD Geometry
NASA Technical Reports Server (NTRS)
Nelson, Andrea; Aftosmis, Michael J.; Nemec, Marian; Pulliam, Thomas H.
2004-01-01
Aerodynamic design is an iterative process involving geometry manipulation and complex computational analysis subject to physical constraints and aerodynamic objectives. A design cycle consists of first establishing the performance of a baseline design, which is usually created with low-fidelity engineering tools, and then progressively optimizing the design to maximize its performance. Optimization techniques have evolved from relying exclusively on designer intuition and insight in traditional trial and error methods, to sophisticated local and global search methods. Recent attempts at automating the search through a large design space with formal optimization methods include both database-driven and direct evaluation schemes. Databases are being used in conjunction with surrogate and neural network models as a basis on which to run optimization algorithms. Optimization algorithms are also being driven by the direct evaluation of objectives and constraints using high-fidelity simulations. Surrogate methods use data points obtained from simulations, and possibly gradients evaluated at the data points, to create mathematical approximations of a database. Neural network models work in a similar fashion, using a number of high-fidelity database calculations as training iterations to create a database model. Optimal designs are obtained by coupling an optimization algorithm to the database model. Evaluation of the current best design then either gives a new local optimum and/or increases the fidelity of the approximation model for the next iteration. Surrogate methods have also been developed that iterate on the selection of data points to decrease the uncertainty of the approximation model prior to searching for an optimal design. The database approximation models for each of these cases, however, become computationally expensive with increasing dimensionality.
Thus the method of using optimization algorithms to search a database model becomes problematic as the number of design variables is increased.
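The database-driven surrogate loop described above can be illustrated with a toy one-variable example: fit a cheap quadratic approximation through the best points evaluated so far, search the surrogate for a candidate optimum, evaluate it with the "high-fidelity" model, and grow the database. The objective function, bounds, and seed points here are invented for illustration and stand in for an expensive simulation.

```python
def high_fidelity(x):
    # Stand-in for an expensive simulation (e.g. a CFD run): the true objective.
    return (x - 0.7) ** 2 + 0.1

def quad_through(p0, p1, p2):
    # Exact quadratic y = a*x^2 + b*x + c through three (x, y) points.
    (x0, y0), (x1, y1), (x2, y2) = p0, p1, p2
    denom = (x0 - x1) * (x0 - x2) * (x1 - x2)
    a = (x2 * (y1 - y0) + x1 * (y0 - y2) + x0 * (y2 - y1)) / denom
    b = (x2 * x2 * (y0 - y1) + x1 * x1 * (y2 - y0) + x0 * x0 * (y1 - y2)) / denom
    c = (x1 * x2 * (x1 - x2) * y0 + x2 * x0 * (x2 - x0) * y1
         + x0 * x1 * (x0 - x1) * y2) / denom
    return a, b, c

# Seed the "database" with a few baseline evaluations.
database = [(x, high_fidelity(x)) for x in (0.0, 0.4, 1.0)]
for _ in range(5):
    pts = sorted(database, key=lambda p: p[1])[:3]   # three best designs so far
    a, b, c = quad_through(*pts)
    x_new = max(0.0, min(1.0, -b / (2 * a)))         # surrogate optimum, clamped
    if any(abs(x_new - x) < 1e-9 for x, _ in database):
        break                                        # nothing new to evaluate
    database.append((x_new, high_fidelity(x_new)))   # grow the database

best_x, best_y = min(database, key=lambda p: p[1])
print(round(best_x, 3), round(best_y, 3))
```

Each evaluation of the surrogate's candidate either improves the best known design or refines the approximation, which is exactly the iteration the abstract describes; in many dimensions the surrogate fit itself becomes the expensive step.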
Database System Design and Implementation for Marine Air-Traffic-Controller Training
2017-06-01
Naval Postgraduate School, Monterey, California. Thesis; approved for public release, distribution is unlimited. Title: Database System Design and Implementation for Marine Air-Traffic-Controller Training. Abstract (truncated): This project focused on the design, development, and implementation of a centralized
Propulsion integration of hypersonic air-breathing vehicles utilizing a top-down design methodology
NASA Astrophysics Data System (ADS)
Kirkpatrick, Brad Kenneth
In recent years, a focus of aerospace engineering design has been the development of advanced design methodologies and frameworks to account for increasingly complex and integrated vehicles. Techniques such as parametric modeling, global vehicle analyses, and interdisciplinary data sharing have been employed in an attempt to improve the design process. The purpose of this study is to introduce a new approach to integrated vehicle design known as the top-down design methodology. In the top-down design methodology, the main idea is to relate design changes on the vehicle system and sub-system level to a set of over-arching performance and customer requirements. Rather than focusing on the performance of an individual system, the system is analyzed in terms of the net effect it has on the overall vehicle and other vehicle systems. This detailed level of analysis can only be accomplished through the use of high fidelity computational tools such as Computational Fluid Dynamics (CFD) or Finite Element Analysis (FEA). The utility of the top-down design methodology is investigated through its application to the conceptual and preliminary design of a long-range hypersonic air-breathing vehicle for a hypothetical next generation hypersonic vehicle (NHRV) program. System-level design is demonstrated through the development of the nozzle section of the propulsion system. From this demonstration of the methodology, conclusions are made about the benefits, drawbacks, and cost of using the methodology.
Stratified sampling design based on data mining.
Kim, Yeonkook J; Oh, Yoonhwan; Park, Sunghoon; Cho, Sungzoon; Park, Hayoung
2013-09-01
To explore classification rules, derived with data mining methodologies, for defining strata in stratified sampling of healthcare providers with improved sampling efficiency. We performed k-means clustering to group providers with similar characteristics and then constructed decision trees on the cluster labels to generate stratification rules. We assessed the variance explained by the stratification proposed in this study and by conventional stratification to evaluate the performance of the sampling design. We constructed a study database from health insurance claims data and providers' profile data made available to this study by the Health Insurance Review and Assessment Service of South Korea, and population data from Statistics Korea. From our database, we used the data for single specialty clinics or hospitals in two specialties, general surgery and ophthalmology, for the year 2011. Data mining resulted in five strata in general surgery with two stratification variables, the number of inpatients per specialist and the population density of the provider location, and five strata in ophthalmology with two stratification variables, the number of inpatients per specialist and the number of beds. The percentages of variance in annual changes in the productivity of specialists explained by the stratification in general surgery and ophthalmology were 22% and 8%, respectively, whereas conventional stratification by the type of provider location and number of beds explained 2% and 0.2% of variance, respectively. This study demonstrated that data mining methods can be used in designing efficient stratified sampling with variables readily available to the insurer and government; it offers an alternative to the existing stratification method that is widely used in healthcare provider surveys in South Korea.
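The two-stage procedure in the abstract — cluster providers with k-means, then fit a tree on the cluster labels to obtain human-readable stratification rules — can be sketched in miniature as follows. The provider features and values are hypothetical, and the "tree" is reduced to a single threshold split, not the study's actual models:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means on 2-D points; returns (labels, centroids)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # assignment step: nearest centroid by squared Euclidean distance
        for i, p in enumerate(points):
            labels[i] = min(range(k), key=lambda c: (p[0] - centroids[c][0]) ** 2
                                                  + (p[1] - centroids[c][1]) ** 2)
        # update step: move each centroid to the mean of its members
        for c in range(k):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                centroids[c] = (sum(p[0] for p in members) / len(members),
                                sum(p[1] for p in members) / len(members))
    return labels, centroids

def best_split(points, labels, feature):
    """Depth-1 'decision tree': the threshold on one feature that best
    reproduces the cluster labels, i.e. a readable stratification rule."""
    best_t, best_acc = None, -1
    for t in sorted({p[feature] for p in points}):
        left = [l for p, l in zip(points, labels) if p[feature] <= t]
        right = [l for p, l in zip(points, labels) if p[feature] > t]
        # majority vote on each side of the candidate threshold
        acc = sum(max(side.count(v) for v in set(side)) for side in (left, right) if side)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

# hypothetical providers: (inpatients per specialist, population density)
providers = [(2, 100), (3, 120), (2.5, 90), (40, 5000), (45, 5200), (38, 4800)]
labels, _ = kmeans(providers, k=2)
threshold = best_split(providers, labels, feature=0)
print("stratify on inpatients/specialist at threshold", threshold)
```

In the study this role is played by full decision trees over the cluster labels, which yield multi-variable rules; the stump above shows only the core idea of turning opaque cluster assignments into a threshold a survey designer can apply.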
Caron, Alexandre; Chazard, Emmanuel; Muller, Joris; Perichon, Renaud; Ferret, Laurie; Koutkias, Vassilis; Beuscart, Régis; Beuscart, Jean-Baptiste; Ficheur, Grégoire
2017-03-01
The significant risk of adverse events following medical procedures supports a clinical epidemiological approach based on the analyses of collections of electronic medical records. Data analytical tools might help clinical epidemiologists develop more appropriate case-crossover designs for monitoring patient safety. To develop and assess the methodological quality of an interactive tool for use by clinical epidemiologists to systematically design case-crossover analyses of large electronic medical records databases. We developed IT-CARES, an analytical tool implementing case-crossover design, to explore the association between exposures and outcomes. The exposures and outcomes are defined by clinical epidemiologists via lists of codes entered via a user interface screen. We tested IT-CARES on data from the French national inpatient stay database, which documents diagnoses and medical procedures for 170 million inpatient stays between 2007 and 2013. We compared the results of our analysis with reference data from the literature on thromboembolic risk after delivery and bleeding risk after total hip replacement. IT-CARES provides a user interface with 3 columns: (i) the outcome criteria in the left-hand column, (ii) the exposure criteria in the right-hand column, and (iii) the estimated risk (odds ratios, presented in both graphical and tabular formats) in the middle column. The estimated odds ratios were consistent with the reference literature data. IT-CARES may enhance patient safety by facilitating clinical epidemiological studies of adverse events following medical procedures. The tool's usability must be evaluated and improved in further research. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association.
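In a case-crossover design of the kind IT-CARES implements, each patient serves as his or her own control: exposure in a window just before the outcome (the case window) is compared with exposure in an earlier referent window, and the matched-pair odds ratio comes from the discordant pairs. A minimal sketch of that estimator — the patient records below are hypothetical, not drawn from the French inpatient database:

```python
def case_crossover_or(records):
    """Matched-pair odds ratio for a case-crossover analysis.

    Each record is (exposed_in_case_window, exposed_in_control_window).
    Only discordant records contribute: OR = b / c, where
    b = exposed in the case window only, c = exposed in the control window only.
    """
    b = sum(1 for case, ctrl in records if case and not ctrl)
    c = sum(1 for case, ctrl in records if ctrl and not case)
    if c == 0:
        raise ValueError("no control-window-only exposures; OR is undefined")
    return b / c

# hypothetical exposure histories for 10 patients
records = [(True, False)] * 6 + [(False, True)] * 2 + [(True, True), (False, False)]
print(case_crossover_or(records))  # 6 discordant one way, 2 the other -> 3.0
```

The tool itself estimates these odds ratios (with confidence intervals) from diagnosis and procedure codes over millions of stays; the sketch only shows why concordant records drop out of the estimate.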
Invocations and intoxication: does prayer decrease alcohol consumption?
Lambert, Nathaniel M; Fincham, Frank D; Marks, Loren D; Stillman, Tyler F
2010-06-01
Four methodologically diverse studies (N = 1,758) show that prayer frequency and alcohol consumption are negatively related. In Study 1 (n = 824), we used a cross-sectional design and found that higher prayer frequency was related to lower alcohol consumption and problematic drinking behavior. Study 2 (n = 702) used a longitudinal design and found that more frequent prayer at Time 1 predicted less alcohol consumption and problematic drinking behavior at Time 2, and this relationship held when controlling for baseline levels of drinking and prayer. In Study 3 (n = 117), we used an experimental design to test for a causal relationship between prayer frequency and alcohol consumption. Participants assigned to pray every day (either an undirected prayer or a prayer for a relationship partner) for 4 weeks drank about half as much alcohol at the conclusion of the study as control participants. Study 4 (n = 115) replicated the findings of Study 3, as prayer again reduced drinking by about half. These findings are discussed in terms of prayer as reducing drinking motives. (PsycINFO Database Record (c) 2010 APA, all rights reserved).
Team X Spacecraft Instrument Database Consolidation
NASA Technical Reports Server (NTRS)
Wallenstein, Kelly A.
2005-01-01
In the past decade, many changes have been made to Team X's process of designing each spacecraft, with the purpose of making the overall procedure more efficient over time. One such improvement is the use of information databases from previous missions, designs, and research. By referring to these databases, members of the design team can locate relevant instrument data and significantly reduce the total time they spend on each design. The files in these databases were stored in several different formats with various levels of accuracy. During the past two months, efforts have been made to combine and organize these files. The main focus was the Instruments department, where spacecraft subsystems are designed based on mission measurement requirements. A common database was developed for all instrument parameters using Microsoft Excel, minimizing the time and confusion involved in searching through files stored in several different formats and locations. Organizing this collection has made its files more easily searchable. Additionally, the new Excel database offers the option of importing its contents into a more efficient database management system in the future. This potential for expansion enables the database to grow and acquire more search features as needed.
Algorithms and methodology used in constructing high-resolution terrain databases
NASA Astrophysics Data System (ADS)
Williams, Bryan L.; Wilkosz, Aaron
1998-07-01
This paper presents a top-level description of the methods used to generate high-resolution 3D IR digital terrain databases using soft photogrammetry. The 3D IR database is derived from aerial photography and is made up of a digital ground-plane elevation map, a vegetation-height elevation map, a material classification map, object data (tanks, buildings, etc.), and a temperature radiance map. The steps required to generate some of these elements are outlined. The use of metric photogrammetry is discussed in the context of elevation map development, and the methods employed to generate the material classification maps are given. The developed databases are used by the US Army Aviation and Missile Command to evaluate the performance of various missile systems. A discussion is also presented on database certification, which consists of the validation, verification, and accreditation procedures followed to certify that the developed databases give a true representation of the area of interest and are fully compatible with the targeted digital simulators.
Adly, Amr A.; Abd-El-Hafiz, Salwa K.
2014-01-01
Transformers are regarded as crucial components in power systems. Due to market globalization, power transformer manufacturers are facing an increasingly competitive environment that mandates the adoption of design strategies yielding better performance at lower costs. In this paper, a power transformer design methodology using multi-objective evolutionary optimization is proposed. Using this methodology, which is tailored to be target performance design-oriented, quick rough estimation of transformer design specifics may be inferred. Testing of the suggested approach revealed significant qualitative and quantitative match with measured design and performance values. Details of the proposed methodology as well as sample design results are reported in the paper. PMID:26257939
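The transformer abstract's multi-objective evolutionary optimization rests on Pareto dominance: among candidate designs scored on several objectives at once, only the non-dominated ones survive selection. A minimal sketch of that selection step — the objective pairs (cost, load loss) and their values are hypothetical, and this is the generic mechanism, not the authors' specific algorithm:

```python
def dominates(a, b):
    """True if design a is at least as good as b on every (minimized)
    objective and strictly better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(designs):
    """Non-dominated subset of a population of objective vectors."""
    return [d for d in designs
            if not any(dominates(other, d) for other in designs if other != d)]

# hypothetical candidate transformer designs: (cost in $k, load loss in kW)
population = [(100, 12), (120, 9), (90, 15), (110, 10), (130, 9), (95, 14)]
front = pareto_front(population)
print(front)
```

Here (130, 9) is dominated by (120, 9) — same loss, higher cost — and drops out, while the remaining designs form the trade-off curve from which a designer picks a cost/loss compromise. Full evolutionary schemes such as NSGA-II wrap this test in crossover, mutation, and crowding-based ranking.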