Sample records for files improve quality

  1. 77 FR 20317 - Acquisition, Protection, and Disclosure of Quality Improvement Organization Information

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-04

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Medicare & Medicaid Services 42 CFR Part 480 Acquisition, Protection, and Disclosure of Quality Improvement Organization Information CFR Correction In... DISCLOSURE OF QUALITY IMPROVEMENT ORGANIZATION INFORMATION [FR Doc. 2012-8184 Filed 4-3-12; 8:45 am] BILLING...

  2. Implementation of a Point-of-Care Radiologist-Technologist Communication Tool in a Quality Assurance Program.

    PubMed

    Ong, Leonard; Elnajjar, Pierre; Nyman, C Gregory; Mair, Thomas; Juluru, Krishna

    2017-07-01

    We implemented an Image Quality Reporting and Tracking Solution (IQuaRTS), directly linked from the PACS, to improve communication between radiologists and technologists. IQuaRTS launched in May 2015. We compared MRI issues filed in the period before IQuaRTS implementation (May-September 2014) using a manual system with MRI issues filed in the IQuaRTS period (May-September 2015). The unpaired t test was used for analysis. For assessment of overall results in the IQuaRTS period alone, all issues filed across all modalities were included. Summary statistics and charts were generated using Excel and Tableau. The number of MRI issues filed during the IQuaRTS period was 498 (2.5% of overall MRI examination volume), compared with 78 issues filed during the period before IQuaRTS implementation (0.4% of total examination volume) (p = 0.0001), a 625% relative increase. Eight percent of tickets documented excellent work. Other issues included images not pushed to PACS (20%), film library issues (19%), and documentation or labeling (8%). Of the issues filed, 55% were MRI-related and 25% were CT-related. The issues were stratified across six sites within our institution. Staff requiring additional training could be readily identified, and 80% of the issues were resolved within 72 hours. IQuaRTS is a cost-effective online issue reporting tool that enables robust data collection and analytics to be incorporated into quality improvement programs. One limitation of the system is that it must be implemented in an environment where staff are receptive to quality improvement.

  3. 42 CFR 478.22 - Good cause for late filing of a request for a reconsideration or hearing.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS RECONSIDERATIONS AND APPEALS Utilization and Quality Control Quality Improvement Organization (QIO) Reconsiderations and... party from making the request on time. (2) Whether an action by the QIO misled the party. (3) Whether...

  4. [A method for auditing medical records quality: audit of 467 medical records within the framework of the medical information systems project quality control].

    PubMed

    Boulay, F; Chevallier, T; Gendreike, Y; Mailland, V; Joliot, Y; Sambuc, R

    1998-03-01

    Future hospital accreditation could take into account the quality of medical files. The objective of this study was to test a method for auditing and evaluating the quality of the handling of medical files. We conducted a retrospective regional audit, based on the frame of reference of the National Agency for Medical Development and Evaluation, using a sample of cases stratified by establishment. In our region, the global budgets of 47 public and private hospitals participating in the public hospital service are adjusted with the medicalised activity data (PMSI) in mind. This audit was proposed to the doctors of the Department of Medical Information on the occasion of the regulatory PMSI quality control. A total of 467 questionnaires were returned by 39 of the 47 solicited hospitals (83%). The methodological aspects (questionnaire, cooperative approach, etc.) are discussed. The make-up of medical files could also be improved by raising the percentage of presence of important data or documents such as the reason for admission (74.1%), the surgery report (83.2%), and the hospitalisation report (66.6%). A system for classifying the paraclinical results is shared and systematic throughout the department or hospital in only 73.2% of cases. The quality of the handling of medical files seems problematic in our hospitals, and actions for improving it should be undertaken as a priority.

  5. ISO9000 and the quality management system in the digital hospital.

    PubMed

    Liu, Yalan; Yao, Bin; Zhang, Zigang

    2002-01-01

    The ISO9000 quality management system (ISO9000QMS) emphasizes customer orientation, managers' leadership, and the involvement of all staff; it adopts the process method and system management, promotes fact-based decision-making and continual improvement, and establishes win-win relationships with suppliers. The digital hospital can therefore adopt the ISO9000QMS. To establish the ISO9000QMS, the digital hospital should: (1) design integrally, including analyzing the operation procedure, clarifying job duties, setting up the implementation team, and setting the quality policy and objectives; (2) learn the ISO9000 quality standards; (3) draw up the documents, including the quality manual, program files, and operation guiding files; (4) train according to the documents; (5) execute the quality standard, including service quality auditing, quality record auditing, and quality system auditing; and (6) improve continually. With the ISO9000QMS established, the digital hospital can appraise more accurately, analyze quality matters statistically, and avoid interference from artificial factors.

  6. The Thames Science Plan: Suggested Hydrologic Investigations to Support Nutrient-Related Water-Quality Improvements in the Thames River Basin, Connecticut

    DTIC Science & Technology

    2005-01-01

    Nutrient-Related Water-Quality Improvements in the Thames River Basin, Connecticut. Open-File Report 2005-1208, U.S. Department of the Interior, U.S... Suggested Hydrologic Investigations to Support Nutrient-Related Water-Quality Improvements in the Thames River Basin, Connecticut. By Elaine C. Todd

  7. Comparison of quality of obturation and instrumentation time using hand files and two rotary file systems in primary molars: A single-blinded randomized controlled trial.

    PubMed

    Govindaraju, Lavanya; Jeevanandan, Ganesh; Subramanian, E M G

    2017-01-01

    In the permanent dentition, different rotary systems are used for canal cleaning and shaping. Rotary instrumentation in pediatric dentistry is an emerging concept, and very few studies have compared the efficiency of rotary instrumentation for canal preparation in primary teeth. Hence, this study was performed to compare the obturation quality and instrumentation time of two rotary file systems (ProTaper and Mtwo) with hand files in primary molars. Forty-five primary mandibular molars were randomly allotted to one of three groups. Instrumentation was done using K-files in Group 1, ProTaper in Group 2, and Mtwo in Group 3. Instrumentation time was recorded. The canal filling quality was assessed as underfill, optimal fill, or overfill. Statistical analysis was done using Chi-square, ANOVA, and post hoc Tukey tests. No significant difference was observed in the quality of obturation among the three groups. Intergroup comparison of the instrumentation time showed a statistically significant difference between the three groups. The use of rotary instrumentation in primary teeth results in a marked reduction in instrumentation time and improves the quality of obturation.

  8. 42 CFR 478.36 - Record of reconsideration.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS RECONSIDERATIONS AND APPEALS Utilization and Quality Control...) Completion of litigation and the passage of the time period for filing all appeals. (b) Contents of the...

  9. 42 CFR 478.34 - Notice of a reconsidered determination.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... SERVICES (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS RECONSIDERATIONS AND APPEALS Utilization and Quality... submitting a request for an administrative hearing and the time period for filing a request. (b) Notice to...

  10. Assessing Quality of Data Standards: Framework and Illustration Using XBRL GAAP Taxonomy

    NASA Astrophysics Data System (ADS)

    Zhu, Hongwei; Wu, Harris

    The primary purpose of data standards or metadata schemas is to improve the interoperability of data created by multiple standard users. Given the high cost of developing data standards, it is desirable to assess the quality of data standards. We develop a set of metrics and a framework for assessing data standard quality. The metrics include completeness and relevancy. Standard quality can also be indirectly measured by assessing interoperability of data instances. We evaluate the framework using data from the financial sector: the XBRL (eXtensible Business Reporting Language) GAAP (Generally Accepted Accounting Principles) taxonomy and US Securities and Exchange Commission (SEC) filings produced using the taxonomy by approximately 500 companies. The results show that the framework is useful and effective. Our analysis also reveals quality issues of the GAAP taxonomy and provides useful feedback to taxonomy users. The SEC has mandated that all publicly listed companies must submit their filings using XBRL. Our findings are timely and have practical implications that will ultimately help improve the quality of financial data.
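
    The completeness idea sketched above can be approximated from filing data alone. Below is a minimal illustration, assuming (hypothetically) that the taxonomy and each filing have already been reduced to sets of element names: completeness is the fraction of taxonomy elements used in at least one filing, and per-filing coverage is the share of a filing's elements found in the taxonomy. This is a sketch of the general idea, not the authors' actual metrics.

    ```python
    # Minimal sketch of completeness-style metrics for a data standard.
    # Inputs are hypothetical: sets of element names, not real XBRL processing.
    def completeness(taxonomy: set, filings: list) -> float:
        used = set().union(*filings)                 # elements used in any filing
        return len(taxonomy & used) / len(taxonomy)

    def coverage(taxonomy: set, filing: set) -> float:
        return len(filing & taxonomy) / len(filing)  # share of filing covered

    taxonomy = {"Assets", "Liabilities", "Revenues", "NetIncomeLoss"}
    filings = [{"Assets", "Revenues"}, {"Assets", "CustomExtensionFoo"}]
    print(completeness(taxonomy, filings))  # 0.5
    print(coverage(taxonomy, filings[1]))   # 0.5
    ```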

  11. 78 FR 12750 - Information Collection(s) Being Reviewed by the Federal Communications Commission, Comments...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-25

    ... retain benefits. Statutory authority for this information collection is contained in 47 U.S.C. sections... Plan for Our Future; Establish Just and Reasonable Rates for Local Exchange Carriers; High-Cost... to file a five-year service quality improvement plan by July 1, 2013, and to file annually thereafter...

  12. The patient perspective on the effects of medical record accessibility: a systematic review.

    PubMed

    Vermeir, Peter; Degroote, Sophie; Vandijck, Dominique; Van Tiggelen, Hanne; Peleman, Renaat; Verhaeghe, Rik; Mariman, An; Vogelaers, Dirk

    2017-06-01

    Health care is shifting from a paternalistic to a participatory model, with increasing patient involvement. Medical record accessibility to patients may contribute significantly to patient comanagement. To systematically review the literature on the patient perspective of effects of personal medical record accessibility on the individual patient, patient-physician relationship and quality of medical care. Screening of PubMed, Web of Science, Cinahl, and Cochrane Library on the keywords 'medical record', 'patient record', 'communication', 'patient participation', 'doctor-patient relationship', 'physician-patient relationship' between 1 January 2002 and 31 January 2016; systematic review after assessment for methodological quality. Out of 557 papers screened, only 12 studies qualified for the systematic review. Only a minority of patients spontaneously request access to their medical file, in contrast to frequent awareness of this patient right and the fact that patients in general have a positive view on open visit notes. The majority of those who have actually consulted their file are positive about this experience. Access to personal files improves adequacy and efficiency of communication between physician and patient, in turn facilitating decision-making and self-management. Increased documentation through patient involvement and feedback on the medical file reduces medical errors, in turn increasing satisfaction and quality of care. Information improvement through personal medical file accessibility increased reassurance and a sense of involvement and responsibility. From the patient perspective medical record accessibility contributes to co-management of personal health care.

  13. Poster - Thur Eve - 54: A software solution for ongoing DVH quality assurance in radiation therapy.

    PubMed

    Annis, S-L; Zeng, G; Wu, X; Macpherson, M

    2012-07-01

    A program has been developed in MATLAB for use in quality assurance of radiation therapy treatment planning. It analyzes patient DVH files and compiles dose-volume data for review, trending, comparison, and analysis. Patient DVH files are exported from the Eclipse treatment planning system and saved according to treatment site and date. Currently, analysis is available for four treatment sites (prostate, prostate bed, lung, and upper GI) with two functions for data reporting and analysis: patient-specific and organ-specific. The patient-specific function loads one patient DVH file and reports the user-specified dose-volume data of organs and targets. These data can be compiled to an external file for third-party analysis. The organ-specific function extracts a requested dose volume of an organ from the DVH files of a patient group and reports the statistics over this population. A graphical user interface is used to select clinical sites, functions, and structures, and to input user requests. We have implemented this program in planning quality assurance at our center. The program has tracked the dosimetric improvement in GU sites after VMAT was implemented clinically. It has generated dose-volume statistics for different groups of patients associated with technique or time range. This program allows reporting and statistical analysis of DVH files and is an efficient tool for planning quality control in radiation therapy. © 2012 American Association of Physicists in Medicine.
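
    The organ-specific function described here is easy to picture with a small example. The sketch below is not the authors' MATLAB code; it assumes (hypothetically) that each exported DVH has been reduced to a two-column text file of dose (Gy) and volume (%), then pulls one dose-volume point, e.g. lung V20, from a patient group and summarizes it.

    ```python
    # Minimal sketch of an "organ-specific" DVH statistic over a patient group.
    # Assumes each DVH export was reduced to two columns: dose (Gy), volume (%).
    import glob
    import numpy as np

    def volume_at_dose(dvh_file, dose_gy):
        dose, volume = np.loadtxt(dvh_file, unpack=True)
        return np.interp(dose_gy, dose, volume)  # % volume receiving >= dose_gy

    v20 = [volume_at_dose(f, 20.0) for f in glob.glob("lung_dvh/*.txt")]
    print(f"Lung V20: mean {np.mean(v20):.1f}%, sd {np.std(v20):.1f}% (n={len(v20)})")
    ```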

  14. Combination of advanced encryption standard 256 bits with md5 to secure documents on android smartphone

    NASA Astrophysics Data System (ADS)

    Pasaribu, Hendra; Sitanggang, Delima; Rizki Damanik, Rudolfo; Rudianto Sitompul, Alex Chandra

    2018-04-01

    File transfer using a smartphone has security issues, such as data theft by irresponsible parties. To improve the quality of data security systems on smartphones, this research proposes integrating the 256-bit AES algorithm with MD5 hashing. The use of MD5 aims to increase the key strength of the encryption and decryption of document files. The test results show that the proposed method can increase the key strength of the encryption and decryption process for document files. Encryption and decryption using the AES and MD5 combination is faster than using AES alone for the *.txt file type, with the reverse result for *.docx, *.xlsx, *.pptx, and *.pdf files.
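
    The key-derivation step can be illustrated in a few lines. Below is a minimal sketch, assuming (as the abstract suggests but does not spell out) that the 32-character MD5 hex digest of a passphrase is used as a 32-byte (256-bit) AES key. Note that MD5 is cryptographically weak, so this illustrates the described combination rather than a recommended design; it uses the Python `cryptography` package.

    ```python
    # Minimal sketch of combining AES-256 with MD5 for file encryption.
    # Assumption: the 32-char MD5 hex digest serves as the 32-byte AES-256 key
    # (an illustration of the described scheme, not a recommended KDF).
    import hashlib
    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    def encrypt_file(path: str, passphrase: str) -> bytes:
        key = hashlib.md5(passphrase.encode()).hexdigest().encode()  # 32 bytes
        iv = os.urandom(16)                                          # random IV per file
        encryptor = Cipher(algorithms.AES(key), modes.CFB(iv)).encryptor()
        with open(path, "rb") as f:
            ciphertext = encryptor.update(f.read()) + encryptor.finalize()
        return iv + ciphertext  # prepend IV so decryption can recover it
    ```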

  15. DEVELOPMENT AND IMPROVEMENT OF TEMPORAL ALLOCATION FACTOR FILES

    EPA Science Inventory

    The report gives results of a project to: (1) evaluate the quality and completeness of data and methods being used for temporal allocation of emissions data, (2) identify and prioritize needed improvements to current methods for developing temporal allocation factors (TAFs), and ...

  16. 30 CFR 28.30 - Quality control plans; filing requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Quality control plans; filing requirements. 28... PROTECTION FOR TRAILING CABLES IN COAL MINES Quality Control § 28.30 Quality control plans; filing... part, each applicant shall file with MSHA a proposed quality control plan which shall be designed to...

  17. 30 CFR 28.30 - Quality control plans; filing requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Quality control plans; filing requirements. 28... PROTECTION FOR TRAILING CABLES IN COAL MINES Quality Control § 28.30 Quality control plans; filing... part, each applicant shall file with MSHA a proposed quality control plan which shall be designed to...

  18. 30 CFR 28.30 - Quality control plans; filing requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... PROTECTION FOR TRAILING CABLES IN COAL MINES Quality Control § 28.30 Quality control plans; filing... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Quality control plans; filing requirements. 28... part, each applicant shall file with MSHA a proposed quality control plan which shall be designed to...

  19. 30 CFR 28.30 - Quality control plans; filing requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... PROTECTION FOR TRAILING CABLES IN COAL MINES Quality Control § 28.30 Quality control plans; filing... 30 Mineral Resources 1 2012-07-01 2012-07-01 false Quality control plans; filing requirements. 28... part, each applicant shall file with MSHA a proposed quality control plan which shall be designed to...

  20. 30 CFR 28.30 - Quality control plans; filing requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PROTECTION FOR TRAILING CABLES IN COAL MINES Quality Control § 28.30 Quality control plans; filing... 30 Mineral Resources 1 2014-07-01 2014-07-01 false Quality control plans; filing requirements. 28... part, each applicant shall file with MSHA a proposed quality control plan which shall be designed to...

  1. WFC3/UVIS Dark Calibration: Monitoring Results and Improvements to Dark Reference Files

    NASA Astrophysics Data System (ADS)

    Bourque, M.; Baggett, S.

    2016-04-01

    The Wide Field Camera 3 (WFC3) UVIS detector possesses an intrinsic signal during exposures, even in the absence of light, known as dark current. A daily monitor program is employed every HST cycle to characterize and measure this current as well as to create calibration files which serve to subtract the dark current from science data. We summarize the results of the daily monitor program for all on-orbit data. We also introduce a new algorithm for generating the dark reference files that provides several improvements to their overall quality. Key features to the new algorithm include correcting the dark frames for Charge Transfer Efficiency (CTE) losses, using an anneal-cycle average value to measure the dark current, and generating reference files on a daily basis. This new algorithm is part of the release of the CALWF3 v3.3 calibration pipeline on February 23, 2016 (also known as "UVIS 2.0"). Improved dark reference files have been regenerated and re-delivered to the Calibration Reference Data System (CRDS) for all on-orbit data. Observers with science data taken prior to the release of CALWF3 v3.3 may request their data through the Mikulski Archive for Space Telescopes (MAST) to obtain the improved products.
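
    The role of a dark reference file can be illustrated with a toy subtraction step. This is a minimal sketch under stated assumptions (a per-pixel dark-rate reference in e-/s and a science frame in electrons), not the CALWF3 pipeline, which also handles CTE correction and anneal-cycle averaging.

    ```python
    # Toy illustration of dark-reference subtraction (not the CALWF3 pipeline):
    # scale a per-pixel dark-rate reference (e-/s) by exposure time and subtract.
    import numpy as np

    def subtract_dark(science_e, dark_rate_e_per_s, exptime_s):
        return science_e - dark_rate_e_per_s * exptime_s

    frame = np.random.poisson(100, (1024, 1024)).astype(float)  # toy frame (e-)
    dark = np.full((1024, 1024), 0.005)                         # toy rate (e-/s)
    calibrated = subtract_dark(frame, dark, exptime_s=900.0)
    ```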

  2. Information Reporting at Traditionally Black Colleges and Universities: A Call to Arms.

    ERIC Educational Resources Information Center

    Budig, Jeanne E.

    1982-01-01

    Information reporting at traditionally black colleges and universities was reviewed, based on the Higher Education General Information Survey (HEGIS) files for 36 institutions. The objective was to determine the quality of the data submitted by seven of these institutions, and to suggest actions for improving the quality of the HEGIS information…

  3. Simple interventions can greatly improve clinical documentation: a quality improvement project of record keeping on the surgical wards at a district general hospital.

    PubMed

    Glen, Peter; Earl, Naomi; Gooding, Felix; Lucas, Emily; Sangha, Nicole; Ramcharitar, Steve

    2015-01-01

    Clinical documentation is an integral part of the healthcare professional's job. Good record keeping is essential for patient care, accurate recording of consultations, and effective communication within the multidisciplinary team. Within the surgical department at the Great Western Hospital, Swindon, the case notes were deemed bulky and cumbersome, inhibiting effective record keeping and potentially putting patients at risk. The aim of this quality improvement project was therefore to improve the standard of documentation, the labelling of notes, and the overall filing. A baseline audit was first undertaken assessing the notes within the busiest surgical ward. A number of variables were assessed; notably, only 12% (4/33) of the case notes were found to be without loose pages. Furthermore, less than half of the pages with entries written within the last 72 hours contained adequate patient identifiers. When assessing these entries further, the designation of the writer was recorded in only one third (11/33) of the cases, whilst the printed name of the writer was recorded in only 65% (21/33) of the entries. This project ran over a 10 month period, using a plan, do, study, act (PDSA) methodology. Initial focus was on simple education. Afterwards, single admission folders were introduced, containing only information required for that admission, in an attempt to streamline the notes and ease the filing. This saw a global improvement across all data subsets, with sustained compliance of over 80%. An educational poster was also created and displayed in clinical areas to remind users to label their notes with patient identifying stickers. This saw a 4-fold increase (16%-68%) in the labelling of notes. In conclusion, simple, cost effective measures to streamline medical notes improve the quality of documentation, facilitate the filing, and ultimately improve patient care.

  4. Development of Software to Model AXAF-I Image Quality

    NASA Technical Reports Server (NTRS)

    Geary, Joseph; Hawkins, Lamar; Ahmad, Anees; Gong, Qian

    1997-01-01

    This report describes work conducted on Delivery Order 181 between October 1996 and June 1997. During this period, software was written to: compute axial PSDs from RDOS AXAF-I mirror surface maps; plot axial surface errors and compute PSDs from HDOS "Big 8" axial scans; plot PSDs from FITS-format PSD files; plot band-limited RMS vs. axial and azimuthal position for multiple PSD files; combine and organize PSDs from multiple mirror surface measurements formatted as input to GRAZTRACE; modify GRAZTRACE to read FITS-formatted PSD files; evaluate AXAF-I test results; and improve and expand the capabilities of the GT x-ray mirror analysis package. During this period, work also began on a more user-friendly manual for the GT program, and improvements were made to the on-line help manual.
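
    As a flavor of the PSD and band-limited-RMS computations listed above, here is a minimal sketch assuming a uniformly sampled one-dimensional surface-error trace; the sample spacing, units, and frequency band are hypothetical, and this is not the delivered software.

    ```python
    # Minimal sketch: axial PSD of a surface-error profile and band-limited RMS.
    # Assumes a uniformly sampled 1-D height trace (hypothetical spacing/units).
    import numpy as np
    from scipy.signal import welch

    dx_mm = 0.5                                   # hypothetical sample spacing
    profile_nm = np.random.normal(0, 2.0, 4096)   # toy axial surface-error trace

    freq, psd = welch(profile_nm, fs=1.0 / dx_mm, nperseg=1024)
    # freq: spatial frequency (cycles/mm); psd: nm^2 per (cycles/mm)

    # Band-limited RMS follows from integrating the PSD over a frequency band.
    band = (freq >= 0.01) & (freq <= 0.1)
    rms_nm = np.sqrt(np.trapz(psd[band], freq[band]))
    print(f"band-limited RMS: {rms_nm:.2f} nm")
    ```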

  5. Acquisition Quality Improvement Within Naval Facilities Engineering Command Southwest

    DTIC Science & Technology

    2015-06-01

    ... Services (MOPAS) missing in two service contract files. (2) Blanket Purchase Agreement (BPA) procedures were not followed. (3) Business ...

  6. The development of method for continuous improvement of master file of the nursing practice terminology.

    PubMed

    Tsuru, Satoko; Okamine, Eiko; Takada, Aya; Watanabe, Chitose; Uchiyama, Makiko; Dannoue, Hideo; Aoyagi, Hisae; Endo, Akira

    2009-01-01

    The Nursing Action Master and Nursing Observation Master were released from 2002 to 2008. Two file formats, Excel and CSV, are prepared for maintaining them. The following were decided as basic rules of maintenance: new additions, revisions, deletions, numbering for management, and a coding rule. The master was developed based on these rules, and we assure the quality of the masters using them.

  7. Patient-centeredness and quality management in Dutch diabetes care organizations after a 1-year intervention.

    PubMed

    Campmans-Kuijpers, Marjo Je; Lemmens, Lidwien C; Baan, Caroline A; Rutten, Guy Ehm

    2016-01-01

    More focus on patient-centeredness in care for patients with type 2 diabetes requires increasing attention to diabetes quality management processes on patient-centeredness by managers in primary care groups and outpatient clinics. Although patient-centered care is ultimately determined by the quality of interactions between patients and clinicians at the practice level, it should be facilitated at the organizational level too. This nationwide study aimed to assess the state of diabetes quality management on patient-centeredness at the organizational level and the potential for improvement after a tailored intervention. This before-after study compares quality management on patient-centeredness within Dutch diabetes care groups and outpatient clinics before and after a 1-year stepwise intervention. At baseline, managers of 51 diabetes primary care groups and 28 outpatient diabetes clinics completed a questionnaire about the organization's quality management program. Patient-centeredness (0%-100%) was operationalized in six subdomains: facilitating self-management support, individualized care plan support, patients' access to medical files, patient education policy, safeguarding patients' interests, and formal patient involvement. The intervention consisted of feedback and benchmarking and, if requested, a telephone call and/or a consultancy visit. After 1 year, the managers completed the questionnaire again. The 1-year changes were examined by dependent (non)parametric tests. Care groups improved significantly on patient-centeredness (from 47.1% to 53.3%; P=0.002) and on its subdomains "access to medical files" (from 42.0% to 49.4%) and "safeguarding patients' interests" (from 58.1% to 66.2%). Outpatient clinics, which scored higher at baseline (66.7%) than care groups, did not improve on patient-centeredness (65.6%; P=0.54) or its subdomains. "Formal patient involvement" remained low in both care groups (23.2%) and outpatient clinics (33.9%). After a simple intervention, care groups significantly improved their quality management on patient-centeredness, but outpatient clinics did not. Interventions to improve quality management on patient-centeredness in diabetes care organizations should differ between primary and secondary care.

  8. Compression of next-generation sequencing quality scores using memetic algorithm

    PubMed Central

    2014-01-01

    Background The exponential growth of next-generation sequencing (NGS) derived DNA data poses great challenges to data storage and transmission. Although many compression algorithms have been proposed for DNA reads in NGS data, few methods are designed specifically to handle the quality scores. Results In this paper we present a memetic algorithm (MA) based NGS quality score data compressor, namely MMQSC. The algorithm extracts raw quality score sequences from FASTQ formatted files, and designs compression codebook using MA based multimodal optimization. The input data is then compressed in a substitutional manner. Experimental results on five representative NGS data sets show that MMQSC obtains higher compression ratio than the other state-of-the-art methods. Particularly, MMQSC is a lossless reference-free compression algorithm, yet obtains an average compression ratio of 22.82% on the experimental data sets. Conclusions The proposed MMQSC compresses NGS quality score data effectively. It can be utilized to improve the overall compression ratio on FASTQ formatted files. PMID:25474747
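
    The extraction step MMQSC starts from is straightforward given the FASTQ layout (four lines per record, with quality scores on the fourth). Below is a minimal sketch of that step plus a toy substitution against a fixed codebook; the codebook design itself (the memetic-algorithm part) is not shown, and the codewords here are hypothetical.

    ```python
    # Minimal sketch: pull quality-score strings out of a FASTQ file and
    # substitute fixed-length blocks via a toy codebook. The actual MMQSC
    # codebook comes from memetic-algorithm optimization (not shown here).
    def quality_strings(fastq_path):
        with open(fastq_path) as f:
            for i, line in enumerate(f):
                if i % 4 == 3:          # FASTQ record: header, bases, '+', qualities
                    yield line.rstrip("\n")

    def substitute(qual, codebook, k=4):
        # Replace each k-length block by its codebook index when available.
        blocks = [qual[i:i + k] for i in range(0, len(qual), k)]
        return [codebook.get(b, b) for b in blocks]

    codebook = {"IIII": 0, "IIIH": 1}   # hypothetical codewords
    for q in quality_strings("reads.fastq"):
        print(substitute(q, codebook))
    ```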

  9. Web-based X-ray quality control documentation.

    PubMed

    David, George; Burnett, Lou Ann; Schenkel, Robert

    2003-01-01

    The department of radiology at the Medical College of Georgia Hospital and Clinics has developed an equipment quality control web site. Our goal is to provide immediate access to virtually all medical physics survey data. The web site is designed to assist equipment engineers, department management, and technologists. By improving communications and access to equipment documentation, we believe productivity is enhanced. The creation of the quality control web site was accomplished in three distinct steps. First, survey data had to be placed in a computer format. Second, these various computer files had to be converted to a format supported by commercial web browsers. Third, a comprehensive home page had to be designed to provide convenient access to the multitude of surveys done in the various x-ray rooms. Because we had previously spent years fine-tuning the computerization of the medical physics quality control program, most survey documentation was already in spreadsheet or database format. A major technical decision was the method of converting survey spreadsheet and database files into documentation appropriate for the web. After an unsatisfactory experience with a HyperText Markup Language (HTML) converter (packaged with spreadsheet and database software), we tried creating Portable Document Format (PDF) files using Adobe Acrobat software. This process preserves the original formatting of the document and takes no longer than conventional printing, so it has been very successful. Although the PDF file generated by Adobe Acrobat is a proprietary format, it can be displayed through a conventional web browser using the freely distributed Adobe Acrobat Reader program, which is available for virtually all platforms. Once a user installs the software, it is automatically invoked by the web browser whenever the user follows a link to a file with a PDF extension. Although no confidential patient information is available on the web site, our legal department recommended that we secure the site in order to keep out those wishing to make mischief. Our interim solution has been not to password-protect the page, which we feared would hinder access for occasional legitimate users, but also not to provide links to it from other hospital and department pages. Utility and productivity were improved, and time and money were saved, by making radiological equipment quality control documentation instantly available on-line.

  10. Root canal shaping with manual stainless steel files and rotary Ni-Ti files performed by students.

    PubMed

    Sonntag, D; Guntermann, A; Kim, S K; Stachniss, V

    2003-04-01

    To investigate root canal shaping with manual stainless steel files and rotary Ni-Ti files by students. Two hundred and ten simulated root canals with the same geometrical shape and size in acrylic resin blocks were prepared by 21 undergraduate dental students with manual stainless steel files using a stepback technique or with rotary Ni-Ti files in crown-down technique. Preparation length, canal shape, incidence of fracture and preparation time were investigated. Zips and elbows occurred significantly (P < 0.001) less frequently with rotary than with manual preparation. The correct preparation length was achieved significantly (P < 0.05) more often with rotary Ni-Ti files than with manual stainless steel files. Fractures occurred significantly (P < 0.05) less frequently with hand instrumentation. The mean time required for manual preparation was significantly (P < 0.001) longer than that required for rotary preparation. Prior experience with a hand preparation technique was not reflected in an improved quality of the subsequent engine-driven preparation. Inexperienced operators achieved better canal preparations with rotary Ni-Ti instruments than with manual stainless steel files. However, rotary preparation was associated with significantly more fractures.

  11. Does competition improve health care quality?

    PubMed

    Scanlon, Dennis P; Swaminathan, Shailender; Lee, Woolton; Chernew, Michael

    2008-12-01

    To identify the effect of competition on health maintenance organizations' (HMOs) quality measures. Longitudinal analysis of a 5-year panel of the Healthcare Effectiveness Data and Information Set (HEDIS) and Consumer Assessment of Health Plans Survey® (CAHPS) data (calendar years 1998-2002). All plans submitting data to the National Committee for Quality Assurance (NCQA) were included regardless of their decision to allow NCQA to disclose their results publicly. NCQA, Interstudy, the Area Resource File, and the Bureau of Labor Statistics. Fixed-effects models were estimated that relate HMO competition to HMO quality, controlling for unmeasured, time-invariant plan and market traits. Results are compared with estimates from models reliant on cross-sectional variation. Estimates suggest that plan quality does not improve with increased levels of HMO competition (as measured by either the Herfindahl index or the number of HMOs). Similarly, increased HMO penetration is generally not associated with improved quality. Cross-sectional models tend to suggest an inverse relationship between competition and quality. The strategies that promote competition among HMOs in the current market setting may not lead to improved HMO quality. It is possible that price competition dominates, with purchasers and consumers preferring lower premiums at the expense of improved quality, as measured by HEDIS and CAHPS. It is also possible that the fragmentation associated with competition hinders quality improvement.

  12. 42 CFR 480.115 - Requirements for maintaining confidentiality.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... HUMAN SERVICES (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS ACQUISITION, PROTECTION, AND DISCLOSURE OF...) Responsibilities of QIO officers and employees. The QIO must provide reasonable physical security measures to... those measures needed to secure computer files. Each QIO must instruct its officers and employees and...

  13. 42 CFR 480.115 - Requirements for maintaining confidentiality.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... HUMAN SERVICES (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS ACQUISITION, PROTECTION, AND DISCLOSURE OF...) Responsibilities of QIO officers and employees. The QIO must provide reasonable physical security measures to... those measures needed to secure computer files. Each QIO must instruct its officers and employees and...

  14. 42 CFR 480.115 - Requirements for maintaining confidentiality.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... HUMAN SERVICES (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS ACQUISITION, PROTECTION, AND DISCLOSURE OF...) Responsibilities of QIO officers and employees. The QIO must provide reasonable physical security measures to... those measures needed to secure computer files. Each QIO must instruct its officers and employees and...

  15. 42 CFR 480.115 - Requirements for maintaining confidentiality.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... HUMAN SERVICES (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS ACQUISITION, PROTECTION, AND DISCLOSURE OF...) Responsibilities of QIO officers and employees. The QIO must provide reasonable physical security measures to... those measures needed to secure computer files. Each QIO must instruct its officers and employees and...

  16. 42 CFR 480.115 - Requirements for maintaining confidentiality.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... HUMAN SERVICES (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS ACQUISITION, PROTECTION, AND DISCLOSURE OF...) Responsibilities of QIO officers and employees. The QIO must provide reasonable physical security measures to... those measures needed to secure computer files. Each QIO must instruct its officers and employees and...

  17. Plan-provider integration, premiums, and quality in the Medicare Advantage market.

    PubMed

    Frakt, Austin B; Pizer, Steven D; Feldman, Roger

    2013-12-01

    To investigate how integration between Medicare Advantage plans and health care providers is related to plan premiums and quality ratings. We used public data from the Centers for Medicare and Medicaid Services (CMS) and the Area Resource File and private data from one large insurer. Premiums and quality ratings are from 2009 CMS administrative files and some control variables are historical. We estimated ordinary least-squares models for premiums and plan quality ratings, with state fixed effects and firm random effects. The key independent variable was an indicator of plan-provider integration. With the exception of Medigap premium data, all data were publicly available. We ascertained plan-provider integration through examination of plans' websites and governance documents. We found that integrated plan-providers charge higher premiums, controlling for quality. Such plans also have higher quality ratings. We found no evidence that integration is associated with more generous benefits. Current policy encourages plan-provider integration, although potential effects on health insurance products and markets are uncertain. Policy makers and regulators may want to closely monitor changes in premiums and quality after integration and consider whether quality improvement (if any) justifies premium increases (if they occur). © Health Research and Educational Trust.

  18. Reducing Bits in Electrodeposition Process of Commercial Vehicle - A Case Study

    NASA Astrophysics Data System (ADS)

    Rahim, Nabiilah Ab; Hamedon, Zamzuri; Mohd Turan, Faiz; Iskandar, Ismed

    2016-02-01

    The painting process is critical in commercial vehicle manufacturing, serving both protection and decoration. Good quality of the painted body is important to reduce repair cost and achieve customer satisfaction. To achieve good quality, it is important to reduce defects at the first stage of the painting process, the electrodeposition process. The Pareto chart and the cause-and-effect diagram from the seven QC tools are utilized to reduce electrodeposition defects. The main defects in the electrodeposition process in this case study are bits, 55% of which are iron filings. The iron filings, which come from the metal assembly process at the body shop, are minimised by controlling the spot-welding parameters, defect control, and a standard body cleaning process. However, some iron filings remain on the body and are carried over to the paint shop. The remaining iron filings settle inside the dipping tank and are removed by a filtration system and magnetic separation. The implementation of the filtration system and magnetic separation improved bits by 27% and reduced sanding man-hours by 42%, with a total saving of RM38.00 per unit.

  19. Care zoning in a psychiatric intensive care unit: strengthening ongoing clinical risk assessment.

    PubMed

    Mullen, Antony; Drinkwater, Vincent; Lewin, Terry J

    2014-03-01

    To implement and evaluate the care zoning model in an eight-bed psychiatric intensive care unit and, specifically, to examine the model's ability to improve the documentation and communication of clinical risk assessment and management. Care zoning guides nurses in assessing clinical risk and planning care within a mental health context. Concerns about the varying quality of clinical risk assessment prompted a trial of the care zoning model in a psychiatric intensive care unit within a regional mental health facility. The care zoning model assigns patients to one of 3 'zones' according to their clinical risk, encouraging nurses to document and implement targeted interventions required to manage those risks. An implementation trial framework was used for this research to refine, implement and evaluate the impact of the model on nurses' clinical practice within the psychiatric intensive care unit, predominantly as a quality improvement initiative. The model was trialled for three months using a pre- and postimplementation staff survey, a pretrial file audit and a weekly file audit. Informal staff feedback was also sought via surveys and regular staff meetings. This trial demonstrated improvement in the quality of mental state documentation, and clinical risk information was identified more accurately. There was limited improvement in the quality of care planning and the documentation of clinical interventions. Nurses' initial concerns over the introduction of the model shifted into overall acceptance and recognition of the benefits. The results of this trial demonstrate that the care zoning model was able to improve the consistency and quality of risk assessment information documented. Care planning and evaluation of associated outcomes showed less improvement. Care zoning remains a highly applicable model for the psychiatric intensive care unit environment and is a useful tool in guiding nurses to carry out routine patient risk assessments. © 2013 John Wiley & Sons Ltd.

  20. [Shoulder dystocia: Quality of retranscription in medical files].

    PubMed

    Martin, E; Bouet, P-E; Sentilhes, L; Legendre, G

    2016-03-01

    Shoulder dystocia is a rare and potentially serious obstetrical event. Apart from possible psychological implications, it may be responsible for maternal complications (haemorrhage and perineal tear) and neonatal complications (brachial plexus injury), leading to complaints and even lawsuits. The transcription of this event in medical files is essential, as it reflects work performed in an emergency and allows the obstetrician to defend himself in case of trial. Our objective was to assess the quality of the transcription of shoulder dystocia situations in medical files. Retrospective chart study conducted in a university hospital with a type III maternity unit. The primary outcome was the rate of comprehensive records (names and order of the maneuvers, side of the anterior shoulder, time between the expulsion of the head and of the body, Apgar score). Between 2007 and 2015, 54 cases of shoulder dystocia requiring a second-line maneuver after vaginal delivery (Wood and/or Jacquemier) were included. In all, 98.2% of the files were incomplete. The maneuvers and their order were noted in 100% of cases. However, the operation was not correctly described in 16.7% of cases. The side of the anterior shoulder was noted in 16.7% of cases. The time between the expulsion of the head and of the body was noted in a single file. Neither broken collarbone nor brachial plexus injury was observed. To improve the management of shoulder dystocia and the transcription of data in files, simulation sessions and the creation of a standardized form would be needed. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  1. DSSTOX MASTER STRUCTURE-INDEX FILE: SDF FILE AND ...

    EPA Pesticide Factsheets

    The DSSTox Master Structure-Index File serves to consolidate, manage, and ensure quality and uniformity of the chemical and substance information spanning all DSSTox Structure Data Files, including those in development but not yet published separately on this website.

  2. Wildlife Scenario Builder and User's Guide (Version 1.0, Beta Test)

    EPA Science Inventory

    The Wildlife Scenario Builder (WSB) was developed to improve the quality of wildlif...

  3. Restructuring Big Data to Improve Data Access and Performance in Analytic Services Making Research More Efficient for the Study of Extreme Weather Events and Application User Communities

    NASA Astrophysics Data System (ADS)

    Ostrenga, D.; Shen, S.; Vollmer, B.; Meyer, D. L.

    2017-12-01

    The NASA climate reanalysis MERRA-2 contains numerous datasets for the atmosphere, land, and ocean, grouped into 95 products with an archived volume of over 300 TB. The data files are saved as hourly files, day-files (at hourly time intervals), and month-files containing up to 125 parameters. Due to the large number of data files and the sheer data volumes, it is challenging for users, especially those in the application research community, to handle the original data files. Most of these researchers prefer to focus on a small region or single location, using the hourly data over long time periods to analyze extreme weather events or, say, winds for renewable-energy applications. At the GES DISC, we have been working closely with the science teams and the application user community to create several new value-added data products and high-quality services to facilitate the use of the model data for various types of research. We have tested converting hourly data from one day per file into different data cubes, such as one month, one year, or whole mission, and then analyzed the efficiency of access to this newly structured data through various services. Initial results show that, compared to the original file structure, the new structure significantly improves performance when accessing long time series. Performance is tied to the cube size and structure, the compression method, and how the data are accessed. The optimized data-cube structure will not only improve data access but also enable better online analytic services for statistical analysis and extreme-event mining. Two case studies will be presented using the newly structured data and value-added services: the California drought and the extreme drought in the northeastern states of Brazil. Furthermore, data access and analysis through cloud storage capabilities will be investigated.
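
    The restructuring described above can be prototyped in a few lines with xarray. The sketch below assumes daily NetCDF granules and a variable name (T2M) purely for illustration; the point is that re-chunking so the time axis is contiguous turns a whole-record point extraction into a single read.

    ```python
    # Minimal sketch (assumed file pattern and variable name) of restructuring
    # hourly day-files into one time-contiguous cube for fast point time series.
    import xarray as xr

    ds = xr.open_mfdataset("merra2_hourly_*.nc4", combine="by_coords")

    # Re-chunk: full time axis per chunk, small spatial tiles, so a one-point,
    # long-time-series read touches very few chunks.
    cube = ds.chunk({"time": -1, "lat": 10, "lon": 10})
    cube.to_zarr("merra2_cube.zarr")  # consolidated store for point access

    # Whole-record extraction at one location is now a single contiguous read.
    series = xr.open_zarr("merra2_cube.zarr")["T2M"].sel(
        lat=36.5, lon=-119.5, method="nearest")
    ```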

  4. Improvements in 2016 to Natural Reservoir Analysis in Low-Temperature Geothermal Play Fairway Analysis for the Appalachian Basin

    DOE Data Explorer

    Teresa E. Jordan

    2016-08-18

    *These files add to and replace same-named files found within Submission 559 (https://gdr.openei.org/submissions/559)* The files included in this submission contain all data pertinent to the methods and results of a cohesive multi-state analysis of all known potential geothermal reservoirs in sedimentary rocks in the Appalachian Basin region, ranked by their potential favorability. Favorability is quantified using three metrics: Reservoir Productivity Index for water; Reservoir Productivity Index; Reservoir Flow Capacity. The metrics are explained in the Reservoirs Methodology Memo (included in zip file). The product represents a minimum spatial extent of potential sedimentary rock geothermal reservoirs. Only natural porosity and permeability were analyzed. Shapefile and images of the spatial distributions of these reservoir quality metrics and of the uncertainty on these metrics are included as well. UPDATE: Accompanying geologic reservoirs data may be found at: https://gdr.openei.org/submissions/881 (linked below).

  5. [Intranet-based integrated information system of radiotherapy-related images and diagnostic reports].

    PubMed

    Nakamura, R; Sasaki, M; Oikawa, H; Harada, S; Tamakawa, Y

    2000-03-01

    To use an intranet technique to develop an information system that simultaneously supports both diagnostic reports and radiotherapy planning images. Using a file server as the gateway, a radiation oncology LAN was connected to an already operative RIS LAN. Dose-distribution images were saved in tagged image file format by way of a screen dump to the file server. X-ray simulator images and portal images were saved in Encapsulated PostScript format on the file server and automatically converted to Portable Document Format. The files on the file server were automatically registered to the Web server by the search engine and were available for searching and browsing using the Web browser. It took less than a minute to register planning images. For clients, searching and browsing a file took less than 3 seconds. Over 150,000 reports and 4,000 images from a six-month period were accessible. Because the intranet technique was used, construction and maintenance were completed without specialist expertise. Prompt access to essential information about radiotherapy has been made possible by this system. It promotes public access to radiotherapy planning, which may improve the quality of treatment.
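
    The EPS-to-PDF conversion step can be automated with Ghostscript; the paper does not name its converter, so the following is a minimal sketch of one common way to do it, assuming the gs binary is installed and on the PATH.

    ```python
    # Minimal sketch: batch-convert EPS planning images to PDF via Ghostscript.
    # Assumes the `gs` executable is on the PATH; the paper does not state
    # which converter its system used.
    import pathlib
    import subprocess

    for eps in pathlib.Path("planning_images").glob("*.eps"):
        pdf = eps.with_suffix(".pdf")
        subprocess.run(
            ["gs", "-dBATCH", "-dNOPAUSE", "-dEPSCrop",
             "-sDEVICE=pdfwrite", f"-sOutputFile={pdf}", str(eps)],
            check=True,
        )
    ```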

  6. Plan–Provider Integration, Premiums, and Quality in the Medicare Advantage Market

    PubMed Central

    Frakt, Austin B; Pizer, Steven D; Feldman, Roger

    2013-01-01

    Objective. To investigate how integration between Medicare Advantage plans and health care providers is related to plan premiums and quality ratings. Data Source. We used public data from the Centers for Medicare and Medicaid Services (CMS) and the Area Resource File and private data from one large insurer. Premiums and quality ratings are from 2009 CMS administrative files and some control variables are historical. Study Design. We estimated ordinary least-squares models for premiums and plan quality ratings, with state fixed effects and firm random effects. The key independent variable was an indicator of plan–provider integration. Data Collection. With the exception of Medigap premium data, all data were publicly available. We ascertained plan–provider integration through examination of plans’ websites and governance documents. Principal Findings. We found that integrated plan–providers charge higher premiums, controlling for quality. Such plans also have higher quality ratings. We found no evidence that integration is associated with more generous benefits. Conclusions. Current policy encourages plan–provider integration, although potential effects on health insurance products and markets are uncertain. Policy makers and regulators may want to closely monitor changes in premiums and quality after integration and consider whether quality improvement (if any) justifies premium increases (if they occur). PMID:23800017

  7. GenePRIMP: A Gene Prediction Improvement Pipeline For Prokaryotic Genomes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kyrpides, Nikos C.; Ivanova, Natalia N.; Pati, Amrita

    2010-07-08

    GenePRIMP (Gene Prediction Improvement Pipeline, http://geneprimp.jgi-psf.org) is a computational process that performs evidence-based evaluation of gene models in prokaryotic genomes and reports anomalies including inconsistent start sites, missing genes, and split genes. We show that manual curation of gene models using the anomaly reports generated by GenePRIMP improves their quality, and we demonstrate the applicability of GenePRIMP in improving finishing quality and comparing different genome sequencing and annotation technologies. Keywords in context: gene model, quality control, translation start sites, automatic correction. Hardware requirements: PC, Mac; Operating system: UNIX/Linux; Compiler/version: Perl 5.8.5 or higher; Special requirements: NCBI BLAST and nr installation; File types: source code, executable module(s), sample problem input data, installation instructions, programmer documentation. Location/transmission: http://geneprimp.jgi-psf.org/gp.tar.gz

  8. The design, construction and implementation of a computerised trauma registry in a developing South African metropolitan trauma service.

    PubMed

    Laing, G L; Bruce, J L; Aldous, C; Clarke, D L

    2014-01-01

    The Pietermaritzburg Metropolitan Trauma Service formerly lacked a robust computerised trauma registry. This made surgical audit difficult for the purpose of quality of care improvement and development. We aimed to design, construct and implement a computerised trauma registry within our service. Twelve months following its implementation, we sought to examine and report on the quality of the registry. Formal ethical approval to maintain a computerised trauma registry was obtained prior to undertaking any design and development. Appropriate commercial software was sourced to develop this project. The registry was designed as a flat file. A flat file is a plain text or mixed text and binary file which usually contains one record per line or physical record. Thereafter the registry file was launched onto a secure server. This provided the benefits of access security and automated backups. Registry training was provided to clients by the developer. The exercise of data capture was then integrated into the process of service delivery, taking place at the endpoint of patient care (discharge, transfer or death). Twelve months following its implementation, the compliance rates of data entry were measured. The developer of this project managed to design, construct and implement an electronic trauma registry into the service. Twelve months following its implementation the data were extracted and audited to assess the quality. A total of 2640 patient entries were captured onto the registry. Compliance rates were in the order of eighty percent and client satisfaction rates were high. A number of deficits were identified. These included the omission of weekend discharges and underreporting of deaths. The construction and implementation of the computerised trauma registry was the beginning of an endeavour to continue improvements in the quality of care within our service. The registry provided a reliable audit at twelve months post implementation. Deficits and limitations were identified and new strategies have been planned to overcome these problems and integrate the trauma registry into the process of clinical care. Copyright © 2013 Elsevier Ltd. All rights reserved.
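
    A flat-file registry of this kind can be as simple as one delimited record per patient episode, appended at the endpoint of care. The sketch below is illustrative only; the field names are hypothetical and not taken from the Pietermaritzburg registry.

    ```python
    # Illustrative flat-file trauma registry: one record per line, appended
    # at the endpoint of patient care. Field names here are hypothetical.
    import csv
    from datetime import date

    def log_episode(path, mechanism, outcome):
        with open(path, "a", newline="") as f:
            csv.writer(f).writerow([date.today().isoformat(), mechanism, outcome])

    log_episode("trauma_registry.csv", "blunt assault", "discharged")
    ```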

  9. Visibility into the Work: TQM Work Process Analysis with HPT and ISD.

    ERIC Educational Resources Information Center

    Beagles, Charles A.; Griffin, Steven L.

    2003-01-01

    Discusses the use of total quality management (TQM), work process flow diagrams, and ISD (instructional systems development) tools with HPT (human performance technology) to address performance gaps in the Veterans Benefits Administration (VBA). Describes performance goals, which were to improve accuracy and reduce backlog of claim files. (LRW)

  10. 77 FR 68856 - Self-Regulatory Organizations; NASDAQ OMX PHLX LLC; Notice of Filing of Amendments No. 1, and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-16

    ... increased order flow provides better execution quality on the Exchange because customers enjoy greater price... flow, and on the extent of price improvement provided to directed customer Complex Orders. The data... 707 is intended to prohibit coordinated actions between Directed Participants and order flow providers...

  11. An experimental study of factors affecting the selective inhibition of sintering process

    NASA Astrophysics Data System (ADS)

    Asiabanpour, Bahram

    Selective Inhibition of Sintering (SIS) is a new rapid prototyping method that builds parts on a layer-by-layer basis. SIS works by joining powder particles through sintering in the part's body and by inhibiting sintering in selected powder areas. The objective of this research has been to improve the new SIS process, which was invented at USC. The process improvement is based on statistical design of experiments. To conduct the needed experiments, a working machine and related path-generator software were needed. The machine and its control software were made available prior to this research; the path-generator algorithms and software had to be created. This program obtains model geometry data from a CAD file and generates an appropriate path file for the printer nozzle, along with a simulation file for path-file inspection using virtual prototyping. The activities related to the path generator constitute the first part of this research, which has resulted in an efficient path generator. In addition, to reach an acceptable level of accuracy, strength, and surface quality in the fabricated parts, all factors affecting the SIS process should be identified and controlled. Simultaneous analytical and experimental studies were conducted to identify effective factors and to control the SIS process. Polystyrene was found to be the most appropriate polymer powder, and saturated potassium iodide the most effective inhibitor, among the available candidate materials. Statistical tools were then applied to improve the desirable properties of parts fabricated by the SIS process. An investigation of part strength was conducted using Response Surface Methodology (RSM), and a region of acceptable operating conditions for part strength was found. Through analysis of the experimental results, the impact of the factors on final part surface quality and dimensional accuracy was modeled. After developing a desirability-function model, process operating conditions for maximum desirability were identified. Finally, the desirability model was validated.

  12. ReQON: a Bioconductor package for recalibrating quality scores from next-generation sequencing data

    PubMed Central

    2012-01-01

    Background Next-generation sequencing technologies have become important tools for genome-wide studies. However, the quality scores that are assigned to each base have been shown to be inaccurate. If the quality scores are used in downstream analyses, these inaccuracies can have a significant impact on the results. Results Here we present ReQON, a tool that recalibrates the base quality scores from an input BAM file of aligned sequencing data using logistic regression. ReQON also generates diagnostic plots showing the effectiveness of the recalibration. We show that ReQON produces quality scores that are both more accurate, in the sense that they more closely correspond to the probability of a sequencing error, and do a better job of discriminating between sequencing errors and non-errors than the original quality scores. We also compare ReQON to other available recalibration tools and show that ReQON is less biased and performs favorably in terms of quality score accuracy. Conclusion ReQON is an open source software package, written in R and available through Bioconductor, for recalibrating base quality scores for next-generation sequencing data. ReQON produces a new BAM file with more accurate quality scores, which can improve the results of downstream analysis, and produces several diagnostic plots showing the effectiveness of the recalibration. PMID:22946927
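
    The recalibration idea is straightforward to sketch: regress observed errors on the reported quality score, then map predicted error probabilities back to the Phred scale (Q = -10 log10 p). The Python sketch below uses simulated data and scikit-learn; it illustrates the principle only and is not ReQON's actual R/Bioconductor implementation.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        reported_q = rng.integers(2, 41, size=10_000)        # reported Phred scores
        true_p = 10 ** (-(0.7 * reported_q + 3) / 10)        # miscalibrated true error rate
        is_error = rng.random(10_000) < true_p               # simulated per-base errors

        model = LogisticRegression().fit(reported_q.reshape(-1, 1), is_error)
        grid = np.arange(2, 41).reshape(-1, 1)
        p_err = model.predict_proba(grid)[:, 1]              # P(error | reported score)
        recalibrated_q = -10 * np.log10(p_err)               # back to the Phred scale
        print(np.round(recalibrated_q[:5], 1))               # recalibrated scores for Q2..Q6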

  13. 77 FR 38279 - Combined Notice of Filings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-27

    .... Description: CO2 Gas Quality Settlement Filing of Wyoming Interstate Company, LLC. Filed Date: 6/11/12.... Description: Fuel Filing to be effective 7/1/2012. Filed Date: 6/20/12. Accession Number: 20120620-5118...

  14. Multiple-file vs. single-file endodontics in dental practice: a study in routine care.

    PubMed

    Bartols, Andreas; Laux, Gunter; Walther, Winfried

    2016-01-01

    Little is known about the differences between rotary multiple-file endodontic therapy and single-file reciprocating endodontic treatment under routine care conditions in dental practice. This multicenter study was performed to compare the outcome of multiple-file (MF) and single-file (SF) systems for primary root canal treatment under conditions of general dental practice with regard to reduction of pain on a visual analogue scale (VAS 100), improvement of oral-health-related quality of life (OHRQoL) with the German short version of the Oral Health Impact Profile (OHIP-G-14), and the speed of root canal preparation. Ten general dental practitioners (GDPs) participated in the study as practitioner-investigators (PIs). In the first five-month period of the study, the GDPs treated patients with MF systems; in the second five-month period, they treated patients with an SF system (WaveOne). The GDPs documented the clinical findings at the beginning and on completion of treatment. The patients documented their pain and OHRQoL before the beginning and before completion of treatment. A total of 599 patients were included in the evaluation: 280 patients in the MF group and 319 in the SF WaveOne group. In terms of pain reduction and improvement in OHIP-G-14, the two study groups (MF and SF) were very similar on univariate analysis. Pain reduction was 34.4 (SD 33.7) VAS (MF) vs. 35.0 (SD 35.4) VAS (SF) (p = 0.840), and the improvement in OHIP-G-14 score was 9.4 (SD 10.3) (MF) vs. 8.5 (SD 10.2) (SF) (p = 0.365). The treatment time per root canal was 238.9 s (SD 206.2 s) (MF) vs. 146.8 s (SD 452.8 s) (SF) (p = 0.003). Regarding improvement of endodontic pain and OHRQoL measured with OHIP-G-14, there were no statistically significant differences between the SF and MF systems. WaveOne prepared root canals significantly faster than the MF systems.

  15. Clinical Evaluation of Quality of Obturation and Instrumentation Time using Two Modified Rotary File Systems with Manual Instrumentation in Primary Teeth.

    PubMed

    Govindaraju, Lavanya; Jeevanandan, Ganesh; Subramanian, Emg

    2017-09-01

    Pulp therapy in primary teeth has been performed using various instrumentation techniques. However, the conventional instrumentation technique used for root canal preparation in primary teeth is hand instrumentation. Various Nickel-Titanium (Ni-Ti) instruments are available to perform efficient root canal preparation in primary teeth. These Ni-Ti instruments have been designed to aid in better root canal preparation in permanent teeth but are rarely used in primary teeth, so it is necessary to assess the feasibility of using these adult rotary files with a modified sequence in primary teeth. To compare the quality of obturation and instrumentation time during root canal preparation using hand files and modified rotary file systems in primary molars. Forty-five primary mandibular molars were randomly assigned to three experimental groups (n=15). Group I was instrumented using K-hand files, Group II with the S2 ProTaper Universal file and Group III with the 0.25 tip, 4% taper K3 rotary file. Standardized digital radiographs were taken before and after root canal instrumentation. Root canal preparation time was also recorded. Statistical analysis of the obtained data was done using SPSS software version 17.0. An intergroup comparison of the instrumentation time and the quality of obturation was done using ANOVA and the Chi-square test with the level of significance set at 0.05. No significant differences were noted with regard to the quality of obturation (p=0.791). However, a statistically significant difference was noted in the instrumentation time between the three groups (p<0.05). The ProTaper rotary system had significantly shorter instrumentation time than the K3 rotary system and the hand-file system. The hand files, S2 ProTaper Universal and 0.25 tip, 4% taper K3 file systems performed similarly with respect to the quality of obturation. There was a significant difference in instrumentation time between manual instrumentation and the modified rotary file systems in primary teeth.

  16. 78 FR 44997 - Self-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Notice of Filing of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-25

    ... historic Rule 144A transaction data, to amend the definition of Historic TRACE Data to reference the three... dissemination of Rule 144A transactions. Nine commenters supported such dissemination and three commenters were... addition, transparency in this sector may improve the quality of the valuation of securities and derivative...

  17. Program Review: A Tool for Continuous Improvement of Academic Programs. AIR Professional File. Number 105, Fall 2007

    ERIC Educational Resources Information Center

    Pitter, Gita Wijesinghe

    2007-01-01

    Program reviews became widely used as quality assurance activities in the United States beginning in the 1970s. Since then, they have evolved as an essential component in demonstrating institutional effectiveness to accrediting bodies. The paper discusses various approaches to reviews with a focus on a recently reengineered institutional program…

  18. Improving the Quality of Backup Process for Publishing Houses and Printing Houses

    NASA Astrophysics Data System (ADS)

    Proskuriakov, N. E.; Yakovlev, B. S.; Pries, V. V.

    2018-04-01

    We analyse the main types of data threats faced by print media and their influence on the vitality and security of information. We also analyse how the settings of the programs used to prepare archive files, and the choice of file manager, affect the backup process. We propose a simple and economical practical implementation of the backup process consisting of four components: the command-line interpreter, the 7z archiver, the Robocopy utility, and network storage. As the best option, we recommend creating backup copies consisting of three local copies of the data and two network copies.
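
    A minimal Python rendering of the four-component pipeline the authors assemble from the command-line interpreter, 7z, Robocopy, and network storage might look as follows; all paths and retention choices are illustrative assumptions, and 7z and Robocopy must be on the PATH.

        import subprocess
        from datetime import date

        SRC = r"C:\prepress\jobs"                  # data to protect (assumed path)
        LOCAL = r"D:\backup"                       # local archive store
        NET = r"\\nas\backup"                      # network storage share

        archive = rf"{LOCAL}\jobs-{date.today():%Y%m%d}.7z"

        # 1) pack the working directory into a compressed 7z archive
        subprocess.run(["7z", "a", "-t7z", archive, SRC], check=True)

        # 2) mirror the local archive directory to network storage
        #    (Robocopy exit codes below 8 indicate success, so check=True is omitted)
        subprocess.run(["robocopy", LOCAL, NET, "/MIR", "/R:2", "/W:5"])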

  19. Malpractice litigation and nursing home quality of care.

    PubMed

    Konetzka, R Tamara; Park, Jeongyoung; Ellis, Robert; Abbo, Elmer

    2013-12-01

    To assess the potential deterrent effect of nursing home litigation threat on nursing home quality. We use a panel dataset of litigation claims and Nursing Home Online Survey Certification and Reporting (OSCAR) data from 1995 to 2005 in six states: Florida, Illinois, Wisconsin, New Jersey, Missouri, and Delaware, for a total of 2,245 facilities. Claims data are from Westlaw's Adverse Filings database, a proprietary legal database, on all malpractice, negligence, and personal injury/wrongful death claims filed against nursing facilities. A lagged 2-year moving average of the county-level number of malpractice claims is used to represent the threat of litigation. We use facility fixed-effects models to examine the relationship between the threat of litigation and nursing home quality. We find significant increases in registered nurse-to-total staffing ratios in response to rising malpractice threat, and a reduction in pressure sores among highly staffed facilities. However, the magnitude of the deterrence effect is small. Deterrence in response to the threat of malpractice litigation is unlikely to lead to widespread improvements in nursing home quality. This should be weighed against other benefits and costs of litigation to assess the net benefit of tort reform. © Health Research and Educational Trust.

  20. A criteria-based audit of the management of severe pre-eclampsia in Kampala, Uganda.

    PubMed

    Weeks, A D; Alia, G; Ononge, S; Otolorin, E O; Mirembe, F M

    2005-12-01

    To improve the quality of clinical care for women with severe pre-eclampsia. A criteria-based audit was conducted in a large government hospital in Uganda. Management practices were evaluated against standards developed by an expert panel by retrospectively evaluating 43 case files. Results of the audit were presented, and recommendations developed and implemented. A re-audit was conducted 6 months later. The initial audit showed that most standards were rarely achieved. Reasons were discussed. Guidelines were produced, additional supplies were purchased following a fundraising effort, labor ward procedures were streamlined, and staffing was increased. In the re-audit there were significant improvements in diagnosis, monitoring, and treatment. Criteria-based audit can improve the quality of maternity care in countries with limited resources.

  1. Clinical Evaluation of Quality of Obturation and Instrumentation Time using Two Modified Rotary File Systems with Manual Instrumentation in Primary Teeth

    PubMed Central

    Govindaraju, Lavanya; Subramanian, EMG

    2017-01-01

    Introduction Pulp therapy in primary teeth has been performed using various instrumentation techniques. However, the conventional instrumentation technique used for root canal preparation in primary teeth is hand instrumentation. Various Nickel-Titanium (Ni-Ti) instruments are available to perform efficient root canal preparation in primary teeth. These Ni-Ti instruments have been designed to aid in better root canal preparation in permanent teeth but are rarely used in primary teeth. It is necessary to assess the feasibility of using these adult rotary files with a modified sequence in primary teeth. Aim To compare the quality of obturation and instrumentation time during root canal preparation using hand files and modified rotary file systems in primary molars. Materials and Methods Forty-five primary mandibular molars were randomly assigned to three experimental groups (n=15). Group I was instrumented using K-hand files, Group II with the S2 ProTaper Universal file and Group III with the 0.25 tip, 4% taper K3 rotary file. Standardized digital radiographs were taken before and after root canal instrumentation. Root canal preparation time was also recorded. Statistical analysis of the obtained data was done using SPSS software version 17.0. An intergroup comparison of the instrumentation time and the quality of obturation was done using ANOVA and the Chi-square test with the level of significance set at 0.05. Results No significant differences were noted with regard to the quality of obturation (p=0.791). However, a statistically significant difference was noted in the instrumentation time between the three groups (p<0.05). The ProTaper rotary system had significantly shorter instrumentation time than the K3 rotary system and the hand-file system. Conclusion The hand files, S2 ProTaper Universal and 0.25 tip, 4% taper K3 file systems performed similarly with respect to the quality of obturation. There was a significant difference in instrumentation time between manual instrumentation and the modified rotary file systems in primary teeth. PMID:29207834

  2. Root-canal shaping with manual and rotary Ni-Ti files performed by students.

    PubMed

    Sonntag, D; Delschen, S; Stachniss, V

    2003-11-01

    To investigate root-canal shaping with manual and rotary Ni-Ti files performed by students. Thirty undergraduate dental students prepared 150 simulated curved root canals in resin blocks with manual Ni-Ti files with a stepback technique and 450 simulated curved canals with rotary Ni-Ti files with a crowndown technique. Incidence of fracture, preparation length, canal shape and preparation time were investigated. Questionnaires were then issued to the students for them to note their experience of the two preparation methods. Zips and elbows occurred significantly (P < 0.001) less frequently with rotary than with manual preparation. The correct preparation length was achieved significantly (P < 0.05) more often with rotary files than with manual files. Instrument fractures were recorded in only 1.3% of cases with both rotary and manual preparation. The mean time required for manual preparation was significantly (P < 0.001) longer than that required for rotary preparation. Prior experience with a hand preparation technique was not reflected in an improved quality of the subsequent rotary preparation. Approximately 83% of the students claimed to have a greater sense of security in rotary than in manual preparation. Overall 50% felt that manual and engine-driven preparation should be given equal status in undergraduate dental education. Inexperienced operators achieved better canal preparations with rotary instruments than with manual files. No difference in fracture rate was recorded between the two systems.

  3. Using GDAL to Convert NetCDF 4 CF 1.6 to GeoTIFF: Interoperability Problems and Solutions for Data Providers and Distributors

    NASA Astrophysics Data System (ADS)

    Haran, T. M.; Brodzik, M. J.; Nordgren, B.; Estilow, T.; Scott, D. J.

    2015-12-01

    An increasing number of new Earth science datasets are being produced by data providers in self-describing, machine-independent file formats, including Hierarchical Data Format version 5 (HDF5) and Network Common Data Form version 4 (netCDF-4). Furthermore, data providers may be producing netCDF-4 files that follow the Conventions for Climate and Forecast metadata version 1.6 (CF 1.6) which, for datasets mapped to a projected raster grid covering all or a portion of the earth, include the Coordinate Reference System (CRS) used to define how latitude and longitude are mapped to grid coordinates, i.e. columns and rows, and vice versa. One problem that users may encounter is that their preferred visualization and analysis tool may not yet include support for one of these newer formats. Moreover, data distributors such as NASA's NSIDC DAAC may not yet include support for on-the-fly conversion of data files for all data sets produced in a new format to a preferred older distributed format. There do exist open source solutions to this dilemma in the form of software packages that can translate files in one of the new formats to one of the preferred formats. However, these software packages require that the file to be translated conform to the specifications of its respective format. Although an online CF-Convention compliance checker is available from cfconventions.org, a recent NSIDC user services incident described here in detail involved an NSIDC-supported data set that passed the (then current) CF Checker Version 2.0.6 but was in fact lacking two variables necessary for conformance. This problem was not detected until GDAL, a software package which relied on the missing variables, was employed by a user in an attempt to translate the data into a different file format, namely GeoTIFF. Testing a candidate data product with one or more software products written to accept the advertised conventions is therefore proposed as a practice which improves interoperability: differences between data file contents and software package expectations are exposed, affording an opportunity to improve conformance of software, data, or both. The incident can also serve as a demonstration that data providers, distributors, and users can work together to improve data product quality and interoperability.
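
    For reference, the translation step at the centre of the incident can be sketched with GDAL's Python bindings (version 2.1 or later); the variable name my_var and the file names are placeholders. A file lacking the CF grid-mapping variables would surface here as a missing CRS.

        from osgeo import gdal

        gdal.UseExceptions()  # fail loudly instead of returning None

        src = 'NETCDF:"input.nc":my_var'     # select one variable as a subdataset
        ds = gdal.Open(src)
        if ds.GetProjectionRef() == "":
            # This is the failure mode in the incident above: the file passed the
            # CF checker but lacked the variables GDAL needs to derive the CRS.
            raise RuntimeError("No CRS found; check the CF grid_mapping variables")

        gdal.Translate("output.tif", ds, format="GTiff")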

  4. Caregiver staffing in nursing homes and their influence on quality of care: using dynamic panel estimation methods.

    PubMed

    Castle, Nicholas G; Anderson, Ruth A

    2011-06-01

    There is inconclusive evidence that nursing home caregiver staffing characteristics influence quality of care. In this research, the relationship of caregiver staffing levels, turnover, agency use, and professional staff mix with quality is further examined using a longitudinal analysis to overcome weaknesses of earlier research. The data used came from a survey of nursing home administrators, Nursing Home Compare, the Online Survey Certification and Reporting data, and the Area Resource File. The staffing variables of Registered Nurses, Licensed Practical Nurses, and Nurse Aides were measured quarterly from 2003 through 2007, and came from 2839 facilities. Generalized method of moments estimation was used to examine the effects of changes in staffing characteristics on changes in 4 quality measures (physical restraint use, catheter use, pain management, and pressure sores). Regression analyses show a robust association between the staffing characteristic variables and quality indicators. A change to more favorable staffing is generally associated with a change to better quality. With longitudinal information and quarterly staffing information, we are able to show that for many nursing homes improving staffing characteristics will improve quality of care.

  5. Integrating risk management data in quality improvement initiatives within an academic neurosurgery department.

    PubMed

    McLaughlin, Nancy; Garrett, Matthew C; Emami, Leila; Foss, Sarah K; Klohn, Johanna L; Martin, Neil A

    2016-01-01

    OBJECT While malpractice litigation has had many negative impacts on health care delivery systems, information extracted from lawsuits could potentially point toward avenues for improving care. The authors present a comprehensive review of lawsuits within a tertiary academic neurosurgical department and report institutional and departmental strategies to mitigate liability by integrating risk management data with quality improvement initiatives. METHODS The Comprehensive Risk Intelligence Tool database was interrogated to extract claims/suits abstracts concerning neurosurgical cases that were closed from January 2008 to December 2012. Variables included demographics of the claimant, type of procedure performed (if any), claim description, insured information, case outcome, clinical summary, contributing factors and subfactors, amount incurred for indemnity and expenses, and independent expert opinion on whether the standard of care was met. RESULTS During the study period, the Department of Neurosurgery received the most lawsuits of all surgical specialties (30 of 172), leading to a total incurred payment of $4,949,867. Of these lawsuits, 21 involved spinal pathologies and 9 cranial pathologies. The largest group of suits was from patients with challenging medical conditions who underwent uneventful surgeries and postoperative courses but filed lawsuits when they did not see the benefits for which they were hoping; 85% of these claims were withdrawn by the plaintiffs. The most commonly cited contributing factors included clinical judgment (20 of 30), technical skill (19 of 30), and communication (6 of 30). CONCLUSIONS While all medical and surgical subspecialties must deal with the issue of malpractice and liability, neurosurgery is most affected both in terms of the number of suits filed and the monetary amounts awarded. To use the suits as learning tools for the faculty and residents and minimize the associated costs, quality initiatives addressing the most frequent contributing factors should be instituted in care redesign strategies, enabling strategic alignment of quality improvement and risk management efforts.

  6. DSSTOX MASTER STRUCTURE-INDEX FILE: SDF FILE AND DOCUMENTATION

    EPA Science Inventory

    The DSSTox Master Structure-Index File serves to consolidate, manage, and ensure quality and uniformity of the chemical and substance information spanning all DSSTox Structure Data Files, including those in development but not yet published separately on this website.

  7. Veterans Justice Outreach Program: VA Could Improve Management by Establishing Performance Measures and Fully Assessing Risks

    DTIC Science & Technology

    2016-04-01

    Report to Congressional Requesters, April 2016, GAO-16-393, United States Government Accountability Office. Highlights of GAO-16-393, a report to congressional requesters, April 2016. VETERANS JUSTICE OUTREACH PROGRAM...quality, timeliness, efficiency, cost of service, and outcome. GAO, Tax Administration: IRS Needs to Further Refine Its Tax Filing Season

  8. Data Quality Verification at STScI - Automated Assessment and Your Data

    NASA Astrophysics Data System (ADS)

    Dempsey, R.; Swade, D.; Scott, J.; Hamilton, F.; Holm, A.

    1996-12-01

    As satellite-based observatories improve their ability to deliver wider varieties and more complex types of scientific data, so too does the process of analyzing and reducing these data grow more complex. It becomes correspondingly imperative that Guest Observers or Archival Researchers have access to an accurate, consistent, and easily understandable summary of the quality of their data. Previously, at the STScI, an astronomer would display and examine the quality and scientific usefulness of every single observation obtained with HST. Recently, this process has undergone a major reorganization at the Institute. A major part of the new process is that the majority of data are assessed automatically with little or no human intervention. As part of routine processing in the OSS--PODPS Unified System (OPUS), the Observatory Monitoring System (OMS) observation logs, the science processing trailer file (also known as the TRL file), and the science data headers are inspected by an automated tool, AUTO_DQ. AUTO_DQ then determines whether any anomalous events occurred during the observation, or during processing and calibration of the data, that affect the procedural quality of the data. The results are placed directly into the Procedural Data Quality (PDQ) file as a string of predefined data-quality keywords and comments. These in turn are used by the Contact Scientist (CS) to check the scientific usefulness of the observations. In this manner, the telemetry stream is checked for known problems such as losses of lock, re-centerings, or degraded guiding, while missing data or calibration errors are also easily flagged. If a problem is serious, the data are queued for manual inspection by an astronomer. The success of every target acquisition is verified manually. If serious failures are confirmed, the PI and the scheduling staff are notified so that options for rescheduling the observations can be explored.
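
    A hypothetical sketch of an AUTO_DQ-style check is shown below: scan observation-log values for known anomalies and emit predefined data-quality keywords. The keyword names and thresholds are invented for illustration and are not STScI's actual ones.

        # Map known anomalous events to illustrative data-quality keywords.
        KEYWORDS = {
            "loss_of_lock": "GSFAIL",
            "recentering": "RECENTER",
            "degraded_guiding": "GUIDE_DEGRADED",
        }

        def assess(obs_log: dict) -> list:
            """Return the procedural data-quality keywords for one observation."""
            flags = [kw for event, kw in KEYWORDS.items() if obs_log.get(event)]
            if obs_log.get("missing_packets", 0) > 0:
                flags.append("DATALOSS")
            return flags or ["OK"]

        print(assess({"recentering": True, "missing_packets": 3}))
        # -> ['RECENTER', 'DATALOSS']  (serious cases would be queued for manual review)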

  9. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    FQC is software that facilitates large-scale quality control of FASTQ files by carrying out a QC protocol, parsing results, and aggregating quality metrics within and across experiments into an interactive dashboard. The dashboard utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.
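
    A hedged sketch of the aggregation step such a tool performs: parse FastQC's tab-separated summary.txt files from several runs and pivot the pass/warn/fail flags into one CSV matrix. The directory layout is an assumption, and the dashboard wiring itself is omitted.

        import csv
        import glob
        from pathlib import Path

        rows = {}
        for summary in glob.glob("fastqc_out/*_fastqc/summary.txt"):
            sample = Path(summary).parent.name.replace("_fastqc", "")
            with open(summary) as fh:
                # FastQC summary lines are: STATUS <tab> metric name <tab> filename
                for status, metric, _ in csv.reader(fh, delimiter="\t"):
                    rows.setdefault(metric, {})[sample] = status

        samples = sorted({s for by_sample in rows.values() for s in by_sample})
        with open("qc_matrix.csv", "w", newline="") as out:
            writer = csv.writer(out)
            writer.writerow(["metric", *samples])
            for metric, by_sample in rows.items():
                writer.writerow([metric, *(by_sample.get(s, "") for s in samples)])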

  10. Malpractice Litigation and Nursing Home Quality of Care

    PubMed Central

    Konetzka, R Tamara; Park, Jeongyoung; Ellis, Robert; Abbo, Elmer

    2013-01-01

    Objective. To assess the potential deterrent effect of nursing home litigation threat on nursing home quality. Data Sources/Study Setting. We use a panel dataset of litigation claims and Nursing Home Online Survey Certification and Reporting (OSCAR) data from 1995 to 2005 in six states: Florida, Illinois, Wisconsin, New Jersey, Missouri, and Delaware, for a total of 2,245 facilities. Claims data are from Westlaw's Adverse Filings database, a proprietary legal database, on all malpractice, negligence, and personal injury/wrongful death claims filed against nursing facilities. Study Design. A lagged 2-year moving average of the county-level number of malpractice claims is used to represent the threat of litigation. We use facility fixed-effects models to examine the relationship between the threat of litigation and nursing home quality. Principal Findings. We find significant increases in registered nurse-to-total staffing ratios in response to rising malpractice threat, and a reduction in pressure sores among highly staffed facilities. However, the magnitude of the deterrence effect is small. Conclusions. Deterrence in response to the threat of malpractice litigation is unlikely to lead to widespread improvements in nursing home quality. This should be weighed against other benefits and costs of litigation to assess the net benefit of tort reform. PMID:23741985

  11. Further Development, Support and Enhancement of CONDUIT

    NASA Technical Reports Server (NTRS)

    Veronica, Moldoveanu; Levine, William S.

    1999-01-01

    From the first airplanes steered by handles, wheels, and pedals to today's advanced aircraft, a century of revolutionary inventions has contributed to flight quality. The stability and controllability of aircraft as they appear to a pilot are called flying or handling qualities. Many years after the first airplanes flew, flying qualities were identified and ranked from desirable to unsatisfactory, and engineers later developed design methods to satisfy these practical criteria. CONDUIT, which stands for Control Designer's Unified Interface, is a modern software package that provides a methodology for optimizing flight control systems in order to improve flying qualities. CONDUIT depends on an optimization engine called CONSOL-OPTCAD (C-O), which performs multicriterion parametric optimization and has been successfully tested on a variety of control problems. C-O requires the control system description as a MATLAB file and modifies the vector of design parameters in an attempt to satisfy performance objectives and constraints specified by the designer in a C-type file. After the first optimization attempts on the UH-60A control system, an early interface system named GIFCORCODE (Graphical Interface for CONSOL-OPTCAD for Rotorcraft Controller Design) was created.

  12. A Study to Determine the Best Way for Letterman Army Medical Center to Comply with the 1981 JCAH (Joint Commission on Accreditation of Hospitals) Quality Assurance Standard

    DTIC Science & Technology

    1980-08-01

    follow-up action taken would be prima facie evidence of the program's positive impact on patient care/clinical performance. Direct and indirect...demonstration of improved patient care/clinical performance must be shown. Prima facie evidence of such improvement would include a summary of identified...even in relation to the seven other HSC medical centers, because of the following reasons: - Active duty patients cannot claim or file a lawsuit

  13. High-grade video compression of echocardiographic studies: a multicenter validation study of selected motion pictures expert groups (MPEG)-4 algorithms.

    PubMed

    Barbier, Paolo; Alimento, Marina; Berna, Giovanni; Celeste, Fabrizio; Gentile, Francesco; Mantero, Antonio; Montericcio, Vincenzo; Muratori, Manuela

    2007-05-01

    Large files produced by standard compression algorithms slow down the spread of digital and tele-echocardiography. We validated high-grade compression of echocardiographic video with the new Motion Pictures Expert Groups (MPEG)-4 algorithms in a multicenter study. Seven expert cardiologists blindly scored (5-point scale) 165 uncompressed and compressed 2-dimensional and color Doppler video clips, based on combined diagnostic content and image quality (uncompressed files as references). One digital video and 3 MPEG-4 algorithms (WM9, MV2, and DivX) were used, the latter at 3 compression levels (0%, 35%, and 60%). Compressed file sizes decreased from 12-83 MB to 0.03-2.3 MB (reduction ratios of 1:1051 to 1:26). The mean SD of differences was 0.81 for intraobserver variability (uncompressed and digital video files). Compared with uncompressed files, only the DivX mean score at 35% (P = .04) and 60% (P = .001) compression was significantly reduced. On subcategory analysis, these differences were still significant for gray-scale and fundamental imaging but not for color or second-harmonic tissue imaging. Original image quality, session sequence, compression grade, and bitrate were all independent determinants of mean score. Our study supports the use of MPEG-4 algorithms to greatly reduce echocardiographic file sizes, facilitating archiving and transmission. Quality evaluation studies should account for the many independent variables that affect image-quality grading.

  14. Mobile Care (Moca) for Remote Diagnosis and Screening

    PubMed Central

    Celi, Leo Anthony; Sarmenta, Luis; Rotberg, Jhonathan; Marcelo, Alvin; Clifford, Gari

    2010-01-01

    Moca is a cell phone-facilitated clinical information system to improve diagnostic, screening and therapeutic capabilities in remote resource-poor settings. The software allows transmission of any medical file, whether a photo, x-ray, audio or video file, through a cell phone to (1) a central server for archiving and incorporation into an electronic medical record (to facilitate longitudinal care, quality control, and data mining), and (2) a remote specialist for real-time decision support (to leverage expertise). The open source software is designed as an end-to-end clinical information system that seamlessly connects health care workers to medical professionals. It is integrated with OpenMRS, an existing open source medical records system commonly used in developing countries. PMID:21822397

  15. Particle Pollution

    MedlinePlus

    ... of running) so you don't breathe as hard. Avoid busy roads and highways where PM is usually worse because of emissions from cars and trucks. For more tools to help you learn about air quality, visit Tracking Air Quality.

  16. 42 CFR 37.44 - Approval of radiographic facilities that use digital radiography systems.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... image acquisition, digitization, processing, compression, transmission, display, archiving, and... quality digital chest radiographs by submitting to NIOSH digital radiographic image files of a test object... digital radiographic image files from six or more sample chest radiographs that are of acceptable quality...

  17. 42 CFR 37.44 - Approval of radiographic facilities that use digital radiography systems.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... image acquisition, digitization, processing, compression, transmission, display, archiving, and... quality digital chest radiographs by submitting to NIOSH digital radiographic image files of a test object... digital radiographic image files from six or more sample chest radiographs that are of acceptable quality...

  18. Multidate Landsat lake quality monitoring program

    NASA Technical Reports Server (NTRS)

    Fisher, L. T.; Scarpace, F. L.; Thomsen, R. G.

    1979-01-01

    A unified package of files and programs has been developed to automate the multidate Landsat-derived analyses of water quality for about 3000 inland lakes throughout Wisconsin. A master lakes file which stores geographic information on the lakes, a file giving the latitudes and longitudes of control points for scene navigation, and a program to estimate control point locations and produce microfiche character maps for scene navigation are among the files and programs of the system. The use of ground coordinate systems to isolate irregular shaped areas which can be accessed at will appears to provide an economical means of restricting the size of the data set.

  19. Print quality challenges for the next decade

    NASA Astrophysics Data System (ADS)

    Meyer, John D.

    1990-07-01

    The decade of the eighties has seen a remarkable transformation in the performance and capabilities of shared and personal printers. Dramatic gains have been made in four key areas: cost, throughput, reliability and most significantly, print quality. The improvements in text print quality due to algorithmic fonts and increased resolution have been pivotal in the creation of the desktop publishing market. Electronic pre-press systems now include hardware to receive Postscript files accompanied by color originals for scanning and separation. These systems have application in the commercial printing of a wide variety of material e.g. books, magazines, brochures, newspapers. The vision of the future of hardcopy now embraces the full spectrum from typeset text to full color reproduction of natural images due to the advent of grayscale and color capability in printer technology. This will place increased demands for improvements in print quality, particularly in the use of grayscale and color. This paper gives an overview of the challenges which must be met and discusses data communication standards and print quality measurement techniques as a means of meeting these challenges for both color and black and white output.

  20. The relationship between advertising, price, and nursing home quality.

    PubMed

    Kash, Bita A; Miller, Thomas R

    2009-01-01

    Theoretically, nursing homes should engage in advertising for the following two reasons: (a) to improve awareness of the services offered in a particular market and (b) to signal high-quality services. In this study, we build upon results from prior studies of nursing home advertising activity, market competition, and quality. The purpose of this study was to examine the association between advertising expenses, price, and quality, focusing on the question: do nursing homes use advertising and price to signal superior quality? The Texas Nursing Facilities Medicaid Cost Report, the Texas Quality Reporting System, and the Area Resource File were merged for the year 2003. We used three alternative measures of quality to improve the robustness of this exploratory analysis. Quality measures were examined using Bonferroni correlation coefficient analysis. Associations between advertising expenses and quality were evaluated using three regression models predicting quality, and we also examined the association between the price of a private bed per day and quality. Advertising expenses were not associated with better nursing home quality as measured by the three quality scales. The average price customers pay for one private bed per day was associated with better quality in only one of the three quality regression models. The price of nursing home care might be a better indicator of quality, and it may need to increase as quality of care improves in the nursing home sector. Because higher advertising expenditures are not necessarily associated with better quality, consumers could be misled by advertisements and choose poor-quality nursing homes. Nursing home administrators should focus on customer relationship management tools instead of expensive advertising: relationship management tools are proven marketing techniques for the health services sector, are usually less expensive than advertising, and help with staff retention and quality outcomes.

  1. Geographic Information for Analysis of Highway Runoff-Quality Data on a National or Regional Scale in the Conterminous United States

    USGS Publications Warehouse

    Smieszek, Tomas W.; Granato, Gregory E.

    2000-01-01

    Spatial data are important for interpretation of water-quality information on a regional or national scale. Geographic information systems (GIS) facilitate interpretation and integration of spatial data. The geographic information and data compiled for the conterminous United States during the National Highway Runoff Water-Quality Data and Methodology Synthesis project is described in this document, which also includes information on the structure, file types, and the geographic information in the data files. This 'geodata' directory contains two subdirectories, labeled 'gisdata' and 'gisimage.' The 'gisdata' directory contains ArcInfo coverages, ArcInfo export files, shapefiles (used in ArcView), Spatial Data Transfer Standard Topological Vector Profile format files, and meta files in subdirectories organized by file type. The 'gisimage' directory contains the GIS data in common image-file formats. The spatial geodata includes two rain-zone region maps and a map of national ecosystems originally published by the U.S. Environmental Protection Agency; regional estimates of mean annual streamflow, and water hardness published by the Federal Highway Administration; and mean monthly temperature, mean annual precipitation, and mean monthly snowfall modified from data published by the National Climatic Data Center and made available to the public by the Oregon Climate Service at Oregon State University. These GIS files were compiled for qualitative spatial analysis of available data on a national and(or) regional scale and therefore should be considered as qualitative representations, not precise geographic location information.

  2. Adequacy of Nasqan data to describe areal and temporal variability of water quality of the San Juan River Drainage basin upstream from Shiprock New Mexico

    USGS Publications Warehouse

    Goetz, C.L.; Abeyta, Cynthia G.

    1987-01-01

    Analyses indicate that water quality in the San Juan River drainage basin upstream from Shiprock, New Mexico, is quite variable from station to station. Analyses are based on water-quality data from the U.S. Geological Survey WATSTORE files and the New Mexico Environmental Improvement Division's files. In the northeastern part of the basin, most streams are calcium-bicarbonate waters. In the northwestern and southern parts of the basin, the streams are calcium-sulfate and sodium-sulfate waters. Geology, climate, land use, and water use affect the water quality. Statistical analysis shows that streamflow, suspended-sediment, dissolved-iron, dissolved-orthophosphate-phosphorus, dissolved-sodium, dissolved-sulfate, and dissolved-manganese concentrations, specific conductance, and pH are highly variable among most stations. Dissolved-radium-226 concentration is the least variable among stations. A trend in one or more water-quality constituents for the period October 1, 1973, through September 30, 1981, was detected at 15 of the 36 stations tested. The NASQAN stations Animas River at Farmington and San Juan River at Shiprock, New Mexico, record large volumes of flow that represent an integration of the flow from many upstream tributaries. The data collected do not represent what is occurring at specific points upstream in the basin, but they do provide accurate information on how water quality is changing over time at the station location. A water-quality streamflow model would be necessary to predict accurately what is occurring simultaneously in the entire basin. (USGS)

  3. Understanding the barriers to setting up a healthcare quality improvement process in resource-limited settings: a situational analysis at the Medical Department of Kamuzu Central Hospital in Lilongwe, Malawi.

    PubMed

    Agyeman-Duah, Josephine Nana Afrakoma; Theurer, Antje; Munthali, Charles; Alide, Noor; Neuhann, Florian

    2014-01-02

    Knowledge regarding the best approaches to improving the quality of healthcare and their implementation is lacking in many resource-limited settings. The Medical Department of Kamuzu Central Hospital in Malawi set out to improve the quality of care provided to its patients and establish itself as a recognized centre in teaching, operations research and supervision of district hospitals. Efforts in the past to achieve these objectives were short-lived, and largely unsuccessful. Against this background, a situational analysis was performed to aid the Medical Department to define and prioritize its quality improvement activities. A mix of quantitative and qualitative methods was applied using checklists for observed practice, review of registers, key informant interviews and structured patient interviews. The mixed methods comprised triangulation by including the perspectives of the clients, healthcare providers from within and outside the department, and the field researcher's perspectives by means of document review and participatory observation. Human resource shortages, staff attitudes and shortage of equipment were identified as major constraints to patient care, and the running of the Medical Department. Processes, including documentation in registers and files and communication within and across cadres of staff were also found to be insufficient and thus undermining the effort of staff and management in establishing a sustained high quality culture. Depending on their past experience and knowledge, the stakeholder interviewees revealed different perspectives and expectations of quality healthcare and the intended quality improvement process. Establishing a quality improvement process in resource-limited settings is an enormous task, considering the host of challenges that these facilities face. The steps towards changing the status quo for improved quality care require critical self-assessment, the willingness to change as well as determined commitment and contributions from clients, staff and management.

  4. An assessment of scientific and technical aspects of closed investigations of canine forensics DNA – case series from the University of California, Davis, USA

    PubMed Central

    Scharnhorst, Günther; Kanthaswamy, Sree

    2011-01-01

    Aim To describe and assess the scientific and technical aspects of animal forensic testing at the University of California, Davis. The findings and recommendations contained in this report are designed to assess the past, evaluate the present, and recommend reforms that will assist the animal forensic science community in providing the best possible services that comply with court standards and bear judicial scrutiny. Methods A batch of 32 closed files of domestic dog DNA cases processed at the University of California, Davis, between August 2003 and July 2005 were reviewed in this study. The case files comprised copies of all original paperwork, copies of the cover letter or final report, laboratory notes, notes on analyses, submission forms, internal chains of custody, printed images and photocopies of evidence, as well as the administrative and technical reviews of those cases. Results While the fundamental aspects of animal DNA testing may be reliable and acceptable, the scientific basis for forensic testing animal DNA needs to be improved substantially. In addition to a lack of standardized and validated genetic testing protocols, improvements are needed in a wide range of topics including quality assurance and quality control measures, sample handling, evidence testing, statistical analysis, and reporting. Conclusion This review implies that although a standardized panel of short tandem repeat and mitochondrial DNA markers and publicly accessible genetic databases for canine forensic DNA analysis are already available, the persistent lack of supporting resources, including standardized quality assurance and quality control programs, still plagues the animal forensic community. This report focuses on closed cases from the period 2003-2005, but extends its scope more widely to include other animal DNA forensic testing services. PMID:21674824

  5. An assessment of scientific and technical aspects of closed investigations of canine forensics DNA--case series from the University of California, Davis, USA.

    PubMed

    Scharnhorst, Günther; Kanthaswamy, Sree

    2011-06-01

    To describe and assess the scientific and technical aspects of animal forensic testing at the University of California, Davis. The findings and recommendations contained in this report are designed to assess the past, evaluate the present, and recommend reforms that will assist the animal forensic science community in providing the best possible services that comply with court standards and bear judicial scrutiny. A batch of 32 closed files of domestic dog DNA cases processed at the University of California, Davis, between August 2003 and July 2005 were reviewed in this study. The case files comprised copies of all original paperwork, copies of the cover letter or final report, laboratory notes, notes on analyses, submission forms, internal chains of custody, printed images and photocopies of evidence, as well as the administrative and technical reviews of those cases. While the fundamental aspects of animal DNA testing may be reliable and acceptable, the scientific basis for forensic testing animal DNA needs to be improved substantially. In addition to a lack of standardized and validated genetic testing protocols, improvements are needed in a wide range of topics including quality assurance and quality control measures, sample handling, evidence testing, statistical analysis, and reporting. This review implies that although a standardized panel of short tandem repeat and mitochondrial DNA markers and publicly accessible genetic databases for canine forensic DNA analysis are already available, the persistent lack of supporting resources, including standardized quality assurance and quality control programs, still plagues the animal forensic community. This report focuses on closed cases from the period 2003-2005, but extends its scope more widely to include other animal DNA forensic testing services.

  6. Artifacts and noise removal in electrocardiograms using independent component analysis.

    PubMed

    Chawla, M P S; Verma, H K; Kumar, Vinod

    2008-09-26

    Independent component analysis (ICA) is a novel technique capable of separating independent components from complex electrocardiogram (ECG) signals. The purpose of this analysis is to evaluate the effectiveness of ICA in removing artifacts and noise from ECG recordings. ICA is applied to remove artifacts and noise in ECG segments of either an individual CSE database ECG file or all files. The reconstructed ECGs are compared with the original ECG signal. For the four special cases discussed, the R-peak magnitudes of the CSE database ECG waveforms before and after applying ICA are also found. The results show that in most cases the percentage error in reconstruction is very small, and that there is a significant improvement in signal quality, i.e. SNR. All the ECG recordings dealt with showed an improved appearance after the use of ICA. This establishes the efficacy of ICA in eliminating noise and artifacts from electrocardiograms.
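
    A minimal sketch of the approach, assuming scikit-learn's FastICA on a toy two-lead signal: decompose the leads into sources, suppress the source that matches the artifact, and reconstruct. In practice the artifact component would be chosen by inspection or an automated criterion rather than by correlation with a known reference.

        import numpy as np
        from sklearn.decomposition import FastICA

        fs = 500
        t = np.arange(0, 10, 1 / fs)
        ecg = np.sin(2 * np.pi * 1.2 * t) ** 63          # crude spiky "heartbeat"
        drift = 0.5 * np.sin(2 * np.pi * 0.3 * t)        # baseline-wander artifact
        X = np.c_[ecg + drift, 0.8 * ecg + 1.2 * drift]  # two leads, mixed sources

        ica = FastICA(n_components=2, random_state=0)
        S = ica.fit_transform(X)                         # estimated sources

        # Identify and zero out the source most correlated with the artifact.
        artifact = np.argmax([abs(np.corrcoef(s, drift)[0, 1]) for s in S.T])
        S[:, artifact] = 0
        X_clean = ica.inverse_transform(S)               # reconstructed clean leads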

  7. Lessons Learned in Deploying the World s Largest Scale Lustre File System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dillow, David A; Fuller, Douglas; Wang, Feiyi

    2010-01-01

    The Spider system at the Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) is the world's largest-scale Lustre parallel file system. Envisioned as a shared parallel file system capable of delivering both the bandwidth and capacity requirements of the OLCF's diverse computational environment, the project had a number of ambitious goals. To support the workloads of the OLCF's diverse computational platforms, the aggregate performance and storage capacity of Spider exceed those of our previously deployed systems by factors of 6x (240 GB/sec) and 17x (10 Petabytes), respectively. Furthermore, Spider supports over 26,000 clients concurrently accessing the file system, which exceeds our previously deployed systems by nearly 4x. In addition to these scalability challenges, moving to a center-wide shared file system required dramatically improved resiliency and fault-tolerance mechanisms. This paper details our efforts in designing, deploying, and operating Spider. Through a phased approach of research and development, prototyping, deployment, and transition to operations, this work has resulted in a number of insights into large-scale parallel file system architectures, from both the design and the operational perspectives. We present in this paper our solutions to issues such as network congestion, performance baselining and evaluation, file system journaling overheads, and high availability in a system with tens of thousands of components. We also discuss areas of continued challenges, such as stressed metadata performance and the need for file system quality of service, alongside our efforts to address them. Finally, operational aspects of managing a system of this scale are discussed along with real-world data and observations.

  8. Automated quality control in a file-based broadcasting workflow

    NASA Astrophysics Data System (ADS)

    Zhang, Lina

    2014-04-01

    Benefiting from the development of information and internet technologies, television broadcasting is transforming from inefficient tape-based production and distribution to integrated file-based workflows. However, no matter how many changes have taken place, successful broadcasting still depends on the ability to deliver a consistent, high-quality signal to the audience. After the transition from tape to file, traditional methods of manual quality control (QC) become inadequate, subjective, and inefficient. Based on China Central Television's fully file-based workflow at its new site, this paper introduces an automated quality-control test system for accurate detection of hidden defects in media content. It discusses the system framework and workflow control when automated QC is added, puts forward a QC criterion, and presents QC software that follows this criterion. It also reports experiments on QC speed using parallel processing and distributed computing. The performance of the test system shows that the adoption of automated QC can make production effective and efficient, and help the station achieve a competitive advantage in the media market.
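
    The parallel-processing experiment can be sketched as fanning one QC probe out over many media files with a process pool; check_file below is a stand-in for the real detectors (black frames, mute audio, and so on), so the sketch illustrates only the distribution pattern.

        import glob
        from multiprocessing import Pool

        def check_file(path):
            # Placeholder detector; a real QC probe would decode and inspect
            # frames and audio rather than looking at the file name.
            ok = not path.endswith(".bad")
            return path, ok

        if __name__ == "__main__":
            files = glob.glob("media/*.mxf")
            with Pool(processes=8) as pool:
                for path, ok in pool.imap_unordered(check_file, files):
                    print(("PASS" if ok else "FAIL"), path)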

  9. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.

  10. Academic podcasting: quality media delivery.

    PubMed

    Tripp, Jacob S; Duvall, Scott L; Cowan, Derek L; Kamauu, Aaron W C

    2006-01-01

    A video podcast of the CME-approved University of Utah Department of Biomedical Informatics seminar was created in order to address issues with streaming video quality, take advantage of popular web-based syndication methods, and make the files available for convenient, subscription-based download. An RSS feed, which is automatically generated, contains links to the media files and allows viewers to easily subscribe to the weekly seminars in a format that guarantees consistent video quality.
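
    A podcast feed of this kind is ordinary RSS 2.0 whose enclosure elements point at the media files. The sketch below generates such a feed in Python; the URLs, titles, and file sizes are placeholders, not the University of Utah's actual feed.

        from xml.etree.ElementTree import Element, SubElement, ElementTree

        rss = Element("rss", version="2.0")
        channel = SubElement(rss, "channel")
        SubElement(channel, "title").text = "Department Seminar (video podcast)"
        SubElement(channel, "link").text = "https://example.edu/seminars"

        item = SubElement(channel, "item")
        SubElement(item, "title").text = "Seminar 2006-10-05"
        # The enclosure element is what podcast clients download on subscription.
        SubElement(item, "enclosure",
                   url="https://example.edu/media/seminar-2006-10-05.m4v",
                   length="123456789", type="video/x-m4v")

        ElementTree(rss).write("feed.xml", xml_declaration=True, encoding="utf-8")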

  11. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool

    DOE PAGES

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    2017-06-09

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.

  12. THE NEW YORK CITY URBAN DISPERSION PROGRAM MARCH 2005 FIELD STUDY: TRACER METHODS AND RESULTS.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    WATSON, T.B.; HEISER, J.; KALB, P.

    The Urban Dispersion Program March 2005 Field Study tracer releases, sampling, and analytical methods are described in detail. There were two days on which tracer releases and sampling were conducted. A total of 16.0 g of six tracers was released during the first test day, or Intensive Observation Period (IOP) 1, and 15.7 g during IOP 2. Three types of sampling instruments were used in this study. Sequential air samplers (SAS) collected six-minute samples, while Brookhaven atmospheric tracer samplers (BATS) and personal air samplers (PAS) collected thirty-minute samples. There were a total of 1300 samples resulting from the two IOPs. Confidence limits in the sampling and analysis method were 20%, as determined from 100 duplicate samples. The sample recovery rate was 84%. The integrally averaged 6-minute samples were compared to the 30-minute samples, and the agreement was found to be good in most cases. The validity of using a background tracer to calculate sample volumes was examined and also found to have a confidence level of 20%. Methods for improving sampling and analysis are discussed. The data described in this report are available as Excel files. An additional Excel file of quality-assured tracer data for use in model validation efforts is also available; it consists of extensively quality-assured BATS tracer data with background concentrations subtracted.

  13. Using compressed images in multimedia education

    NASA Astrophysics Data System (ADS)

    Guy, William L.; Hefner, Lance V.

    1996-04-01

    The classic radiologic teaching file consists of hundreds, if not thousands, of films of various ages, housed in paper jackets with brief descriptions written on the jackets. The development of a good teaching file has been both time-consuming and voluminous. Also, any radiograph to be copied was unavailable during the reproduction interval, inconveniencing other medical professionals needing to view the images at that time. These factors hinder motivation to copy films of interest: if a busy radiologist already has an adequate example of a radiological manifestation, it is unlikely that he or she will exert the effort to make a copy of another similar image even if a better example comes along. Digitized radiographs stored on CD-ROM offer a marked improvement over copied-film teaching files. Our institution has several laser digitizers which are used to rapidly scan radiographs and produce high-quality digital images which can then be converted into standard microcomputer (IBM, Mac, etc.) image formats. These images can be stored on floppy disks, hard drives, rewritable optical disks, recordable CD-ROM disks, or removable cartridge media. Most hospital computer information systems include radiology reports in their database. We demonstrate that the reports for the images included in the user's teaching file can be copied and stored on the same storage media as the images. The radiographic or sonographic image and the corresponding dictated report can then be 'linked' together, electronically tethering the description of the finding or findings of interest to the digitized image. This obviates the need to write much additional detail concerning the radiograph, saving time. In addition, the text on the disk can be indexed such that all files with user-specified features can be instantly retrieved and combined in a single report, if desired. With the use of newer image-compression techniques, hundreds of cases may be stored on a single CD-ROM, depending on the image quality required for the finding in question. This reduces the weight of a teaching file from that of a baby elephant to that of a single CD-ROM disc. Thus, with this method of teaching-file preparation and storage the following advantages are realized: (1) Technically easier and less time-consuming image reproduction. (2) Considerably less unwieldy and substantially more portable teaching files. (3) The novel ability to index files and then retrieve specific cases of choice based on descriptive text.
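
    The image-report 'linking' and text indexing described here can be sketched as a simple keyword search over records that pair an image path with its report; the paths and report text below are invented for illustration.

        import json

        cases = [
            {"image": "cases/cxr_0001.png",
             "report": "Right upper lobe consolidation consistent with pneumonia."},
            {"image": "cases/ct_0002.png",
             "report": "No acute intracranial abnormality."},
        ]
        with open("teaching_file.json", "w") as fh:
            json.dump(cases, fh, indent=2)   # each image 'tethered' to its report

        def search(keyword):
            """Return every case whose report text mentions the keyword."""
            with open("teaching_file.json") as fh:
                return [c for c in json.load(fh)
                        if keyword.lower() in c["report"].lower()]

        print(search("pneumonia"))   # -> the chest radiograph case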

  14. Proposal for a Standard Format for Neurophysiology Data Recording and Exchange.

    PubMed

    Stead, Matt; Halford, Jonathan J

    2016-10-01

    The lack of interoperability between information networks is a significant source of cost in health care. Standardized data formats decrease health care cost, improve quality of care, and facilitate biomedical research. There is no common standard digital format for storing clinical neurophysiologic data. This review proposes a new standard file format for neurophysiology data (the bulk of which is video-electroencephalographic data), entitled the Multiscale Electrophysiology Format, version 3 (MEF3), designed to address many of the shortcomings of existing formats. The proposed improvements include (1) a hierarchical file structure with improved organization; (2) greater extensibility for big data applications requiring a large number of channels, signal types, and parallel processing; (3) efficient and flexible lossy or lossless data compression; (4) industry-standard multilayered data encryption and time obfuscation that permit sharing of human data without the need for deidentification procedures; (5) resistance to file corruption; (6) facilitation of online and offline review and analysis; and (7) full open-source documentation. At this time, no other neurophysiology format supports all of these features. MEF3 is currently gaining industry and academic community support. The authors propose the use of MEF3 as a standard format for neurophysiology recording and data exchange. Collaboration between industry, professional organizations, research communities, and independent standards organizations is needed to move the project forward.
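
    The corruption-resistance and lossless-compression ideas can be illustrated generically: if each data block carries its own length and checksum, a damaged block can be detected and skipped rather than invalidating the rest of the file. The sketch below shows only that general idea; it is not the actual MEF3 block layout or API.

    ```python
    import struct
    import zlib

    def write_blocks(path, blocks):
        """Write blocks, each prefixed with its compressed length and
        CRC-32, so a damaged block can be detected and skipped on read
        instead of corrupting everything after it."""
        with open(path, "wb") as f:
            for payload in blocks:
                comp = zlib.compress(payload)              # lossless compression
                f.write(struct.pack("<II", len(comp), zlib.crc32(comp)))
                f.write(comp)

    def read_blocks(path):
        """Return (readable_blocks, corrupted_count)."""
        good, bad = [], 0
        with open(path, "rb") as f:
            while True:
                header = f.read(8)
                if len(header) < 8:
                    break                                  # end of file
                length, crc = struct.unpack("<II", header)
                comp = f.read(length)
                if zlib.crc32(comp) == crc:
                    good.append(zlib.decompress(comp))
                else:
                    bad += 1                               # skip corrupted block
        return good, bad

    write_blocks("demo.blocks", [b"channel 1 samples", b"channel 2 samples"])
    print(read_blocks("demo.blocks"))
    ```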

  15. Quality of Diabetes Care: The Challenges of an Increasing Epidemic in Mexico. Results from Two National Health Surveys (2006 and 2012)

    PubMed Central

    Flores-Hernández, Sergio; Saturno-Hernández, Pedro J.; Reyes-Morales, Hortensia; Barrientos-Gutiérrez, Tonatiuh; Villalpando, Salvador; Hernández-Ávila, Mauricio

    2015-01-01

    Background The quality of diabetes care remains suboptimal according to numerous studies assessing the achievement of quality indicators for diabetes care in various healthcare settings. We report global and specific quality indicators for diabetes care and their association with glycemic control at the population level in two national health surveys in Mexico. Methods We conducted a cross-sectional analysis of the 2006 and 2012 National Health Surveys in Mexico. We examined quality of care for 2,965 and 4,483 adults (≥ 20 years) with diagnosed type 2 diabetes using fourteen simple and two composite indicators derived from self-reported information. In a subsample of both surveys, glycated hemoglobin (HbA1c) was measured at the time of the interview. We obtained survey weight-adjusted estimates using multiple regression models (logistic and linear) with combined data files, including survey year as a covariate to assess change. Results Global quality of care in 2012 was 40.8%, a relative improvement of 11.7% over 2006. Detection of cardiovascular disease risk factors (dyslipidemia and hypertension) showed the greatest improvement, while non-pharmaceutical treatment and diabetic foot exams showed minor changes. We found a significant association between the quality of the diabetes care process and glycemic control (OR 2.53, 95% CI 1.63-3.94). Age over 65 years, type of health subsystem, gender (male), and high socioeconomic status were also significantly associated with glycemic control. Conclusions Quality of diabetes care and glycemic control improved and are significantly associated. However, according to international standards, the current situation remains suboptimal. A more holistic approach is needed, with an emphasis on improving quality in outpatient care. PMID:26230991

  16. Quality of Diabetes Care: The Challenges of an Increasing Epidemic in Mexico. Results from Two National Health Surveys (2006 and 2012).

    PubMed

    Flores-Hernández, Sergio; Saturno-Hernández, Pedro J; Reyes-Morales, Hortensia; Barrientos-Gutiérrez, Tonatiuh; Villalpando, Salvador; Hernández-Ávila, Mauricio

    2015-01-01

    The quality of diabetes care remains suboptimal according to numerous studies assessing the achievement of quality indicators for diabetes care in various healthcare settings. We report global and specific quality indicators for diabetes care and their association with glycemic control at the population level in two national health surveys in Mexico. We conducted a cross-sectional analysis of the 2006 and 2012 National Health Surveys in Mexico. We examined quality of care for 2,965 and 4,483 adults (≥ 20 years) with diagnosed type 2 diabetes using fourteen simple and two composite indicators derived from self-reported information. In a subsample of both surveys, glycated hemoglobin (HbA1c) was measured at the time of the interview. We obtained survey weight-adjusted estimates using multiple regression models (logistic and linear) with combined data files, including survey year as a covariate to assess change. Global quality of care in 2012 was 40.8%, a relative improvement of 11.7% over 2006. Detection of cardiovascular disease risk factors (dyslipidemia and hypertension) showed the greatest improvement, while non-pharmaceutical treatment and diabetic foot exams showed minor changes. We found a significant association between the quality of the diabetes care process and glycemic control (OR 2.53, 95% CI 1.63-3.94). Age over 65 years, type of health subsystem, gender (male), and high socioeconomic status were also significantly associated with glycemic control. Quality of diabetes care and glycemic control improved and are significantly associated. However, according to international standards, the current situation remains suboptimal. A more holistic approach is needed, with an emphasis on improving quality in outpatient care.
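
    A minimal sketch of the kind of survey-weighted logistic model described above (glycemic control regressed on quality of care, with survey year as a covariate), using invented data. Variable names are hypothetical, and freq_weights is only a rough stand-in for a full design-based analysis, which would also account for strata and clusters.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 500
    quality = rng.integers(0, 2, n)        # 1 = process-of-care indicator met
    year2012 = rng.integers(0, 2, n)       # survey year covariate
    weight = rng.uniform(0.5, 3.0, n)      # survey expansion weight
    # Simulate glycemic control with a positive effect of quality of care.
    logit = -0.8 + 0.9 * quality + 0.2 * year2012
    control = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

    X = sm.add_constant(pd.DataFrame({"quality": quality, "year2012": year2012}))
    fit = sm.GLM(control, X, family=sm.families.Binomial(),
                 freq_weights=weight).fit()
    print(np.exp(fit.params["quality"]))   # weighted odds ratio for quality
    ```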

  17. Machine-Readable Data Files in the Social Sciences: An Anthropologist and a Librarian Look at the Issues.

    ERIC Educational Resources Information Center

    Bernard, H. Russell; Jones, Ray

    1984-01-01

    Focuses on problems in making machine-readable data files (MRDFs) accessible and in using them: quality of data in MRDFs themselves (social scientists' concern) and accessibility--availability of bibliographic control, quality of documentation, level of user skills (librarians' concern). Skills needed by social scientists and librarians are…

  18. Ensuring long-term reliability of the data storage on optical disc

    NASA Astrophysics Data System (ADS)

    Chen, Ken; Pan, Longfa; Xu, Bin; Liu, Wei

    2008-12-01

    "Quality requirements and handling regulation of archival optical disc for electronic records filing" is released by The State Archives Administration of the People's Republic of China (SAAC) on its network in March 2007. This document established a complete operative managing process for optical disc data storage in archives departments. The quality requirements of the optical disc used in archives departments are stipulated. Quality check of the recorded disc before filing is considered to be necessary and the threshold of the parameter of the qualified filing disc is set down. The handling regulations for the staffs in the archives departments are described. Recommended environment conditions of the disc preservation, recording, accessing and testing are presented. The block error rate of the disc is selected as main monitoring parameter of the lifetime of the filing disc and three classes pre-alarm lines are created for marking of different quality check intervals. The strategy of monitoring the variation of the error rate curve of the filing discs and moving the data to a new disc or a new media when the error rate of the disc reaches the third class pre-alarm line will effectively guarantee the data migration before permanent loss. Only when every step of the procedure is strictly implemented, it is believed that long-term reliability of the data storage on optical disc for archives departments can be effectively ensured.

  19. Understanding Emissions in East Asia - The KORUS 2015 Emissions Inventory

    NASA Astrophysics Data System (ADS)

    Woo, J. H.; Kim, Y.; Park, R.; Choi, Y.; Simpson, I. J.; Emmons, L. K.; Streets, D. G.

    2017-12-01

    Air quality over Northeast Asia has deteriorated for decades because of the region's high population and energy use. Despite increasingly stringent air pollution control policies, air quality over the region does not appear to have improved much and has at times worsened. A deeper scientific understanding of the interrelationships among emissions, transport, and chemistry over the region is needed to effectively protect public health and ecosystems. Two aircraft field campaigns targeting year 2016, MAPS-Seoul and KORUS-AQ, were organized to study air quality over Korea and East Asia with respect to chemical evolution, emission inventories, trans-boundary contributions, and satellite applications. We developed a new East Asia emissions inventory, named KORUS2015, based on NIER/KU-CREATE (Comprehensive Regional Emissions inventory for Atmospheric Transport Experiment), in support of the field campaigns. For anthropogenic emissions, it has 54 fuel classes, 201 sub-sectors, and 13 pollutants, including CO2, SO2, NOx, CO, NMVOC, NH3, PM10, and PM2.5. Because the KORUS2015 emissions framework was developed using an integrated climate and air quality assessment modeling framework (i.e., GAINS) and is fully connected with comprehensive emission processing/modeling systems (i.e., SMOKE, KU-EPS, and MEGAN), it can effectively support atmospheric field campaigns for both science and policy. During the field campaigns, we provided model-ready emissions to participating air quality models, such as CMAQ, WRF-Chem, CAMx, GEOS-Chem, and MOZART, in forecasting and post-analysis modes. Based on initial assessments of those results, we are improving the emissions in areas such as VOC speciation and biogenic VOC modeling. Further analysis from the second iteration between emissions and modeling/measurement will be presented at the conference. Acknowledgements: This subject is supported by the Korea Ministry of Environment as the "Climate Change Correspondence Program." This work was also supported under the framework of the national strategy project on fine particulate matter by the Ministry of Science, ICT and Future Planning.

  20. Atmospheric Boundary Layer Wind Data During the Period January 1, 1998 Through January 21, 1999 at the Dallas-Fort Worth Airport. Volume 2; Data and Processing

    NASA Technical Reports Server (NTRS)

    Zak, J. Allen; Rodgers, William G., Jr.

    2000-01-01

    The NASA Langley Research Center's Aircraft Vortex Spacing System (AVOSS) requires accurate winds and turbulence to determine aircraft wake vortex behavior near the ground. Volume 1 described the wind input and quality analysis process. This volume documents the data available during the period January 1998 through January 1999 and the partitioning and concatenation of files by time of day, turbulence, non-duplication, crosswind-profile quality, and ceiling and visibility. It provides the resultant filtered files for the first three partitions, as well as identification of ceiling/visibility conditions when they were below 5000 feet and 5 miles, respectively. It also includes the wind profile quality flags to permit automatic selection of files for AVOSS application using selected ceiling/visibility and wind profile quality values and flags (or no flags).
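
    A sketch of the kind of automatic file selection the report enables, filtering by a wind-profile quality flag and ceiling/visibility limits. The record fields, flag convention, and file names are assumed for illustration, not taken from the report.

    ```python
    # Hypothetical catalog of wind-profile files with quality metadata.
    records = [
        {"file": "dfw_0105.txt", "profile_flag": 0, "ceiling_ft": 12000, "vis_mi": 10},
        {"file": "dfw_0212.txt", "profile_flag": 2, "ceiling_ft": 3000,  "vis_mi": 4},
        {"file": "dfw_0330.txt", "profile_flag": 0, "ceiling_ft": 4500,  "vis_mi": 6},
    ]

    def select_files(recs, max_flag=0, min_ceiling_ft=5000, min_vis_mi=5):
        """Keep files whose quality flag and ceiling/visibility meet the
        chosen criteria (0 assumed to mean an unflagged, good profile)."""
        return [r["file"] for r in recs
                if r["profile_flag"] <= max_flag
                and r["ceiling_ft"] >= min_ceiling_ft
                and r["vis_mi"] >= min_vis_mi]

    print(select_files(records))   # -> ['dfw_0105.txt']
    ```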

  1. Patient-reported denials, appeals, and complaints: associations with overall plan ratings.

    PubMed

    Quigley, Denise D; Haviland, Amelia M; Dembosky, Jacob W; Klein, David J; Elliott, Marc N

    2018-03-01

    To assess whether Medicare patients' reports of denied care, appeals/complaints, and satisfactory resolution were associated with ratings of their health plan or care. Retrospective analysis of 2010 Medicare Advantage Consumer Assessment of Healthcare Providers and Systems (CAHPS) survey data. Multivariate linear regression of data from 154,766 respondents (61.1% response rate) tested the association of beneficiary ratings of plan and care with beneficiary reports of denied care, appeals, complaints, and complaint resolution, adjusting for beneficiary demographics. Beneficiaries who reported being denied needed care rated their plans and care significantly less positively, by 17.2 points (on a 100-point scale) and 9.1 points, respectively. Filing an appeal was not statistically significantly associated with further lower ratings. Beneficiaries who filed a complaint that was satisfactorily resolved gave slightly lower ratings of plans (-3.4 points) and care (-2.5 points) than those not filing a complaint (P <.001 for all results). Lower ratings from patients reporting complaints and denied care may notably affect the overall 0-10 CAHPS ratings of Medicare Advantage plans. Our results suggest that beneficiaries may attribute the actions that lead to complaints or denials to plans more than to the care they received. Successful complaint resolution and utilization management review might eliminate most deficits associated with complaints and denied care, consistent with the service recovery paradox. High rates of complaints and denied care might identify areas that need improved utilization management review, customer service, and quality improvement. Among those reporting being denied care, filing an appeal was not associated with lower patient ratings of plan or care.

  2. Analysis of ETMS Data Quality for Traffic Flow Management Decisions

    NASA Technical Reports Server (NTRS)

    Chatterji, Gano B.; Sridhar, Banavar; Kim, Douglas

    2003-01-01

    The data needed for air traffic flow management decision support tools is provided by the Enhanced Traffic Management System (ETMS). This includes both the tools that are in current use and the ones being developed for future deployment. Since the quality of decision support provided by all these tools will be influenced by the quality of the input ETMS data, an assessment of ETMS data quality is needed. Motivated by this desire, ETMS data quality is examined in this paper in terms of the unavailability of flight plans, deviation from the filed flight plans, departure delays, altitude errors and track data drops. Although many of these data quality issues are not new, little is known about their extent. A goal of this paper is to document the magnitude of data quality issues supported by numerical analysis of ETMS data. Guided by this goal, ETMS data for a 24-hour period were processed to determine the number of aircraft with missing flight plan messages at any given instant of time. Results are presented for aircraft above 18,000 feet altitude and also at all altitudes. Since deviation from filed flight plan is also a major cause of trajectory-modeling errors, statistics of deviations are presented. Errors in proposed departure times and ETMS-generated vertical profiles are also shown. A method for conditioning the vertical profiles for improving demand prediction accuracy is described. Graphs of actual sector counts obtained using these vertical profiles are compared with those obtained using the Host data for sectors in the Fort Worth Center to demonstrate the benefit of preprocessing. Finally, results are presented to quantify the extent of data drops. A method for propagating track positions during ETMS data drops is also described.
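
    One way to propagate track positions through a data drop, as the paper describes, is simple dead reckoning from the last known ground speed and track angle. The sketch below uses a flat-earth approximation, reasonable only for short gaps, and is an illustration rather than the paper's algorithm.

    ```python
    import math

    def propagate_track(lat_deg, lon_deg, ground_speed_kt, track_deg, dt_s):
        """Dead-reckon an aircraft position forward through a data drop
        from the last known ground speed and track angle.  Flat-earth
        approximation: 1 nm = 1 minute of latitude."""
        dist_nm = ground_speed_kt * dt_s / 3600.0
        dlat = dist_nm * math.cos(math.radians(track_deg)) / 60.0
        dlon = (dist_nm * math.sin(math.radians(track_deg))
                / (60.0 * math.cos(math.radians(lat_deg))))
        return lat_deg + dlat, lon_deg + dlon

    # 450 kt on a 090 track through a 60-second gap:
    print(propagate_track(32.9, -97.0, 450.0, 90.0, 60.0))
    ```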

  3. Assessing the Value of High-Quality Care for Work-Associated Carpal Tunnel Syndrome in a Large Integrated Health Care System: Study Design.

    PubMed

    Conlon, Craig; Asch, Steven; Hanson, Mark; Avins, Andrew; Levitan, Barbara; Roth, Carol; Robbins, Michael; Dworsky, Michael; Seabury, Seth; Nuckols, Teryl

    2016-01-01

    Little is known about quality of care for occupational health disorders, although it may affect worker health and workers' compensation costs. Carpal tunnel syndrome (CTS) is a common work-associated condition that causes substantial disability. To describe the design of a study that is assessing quality of care for work-associated CTS and associations with clinical outcomes and costs. Prospective observational study of 477 individuals with new workers' compensation claims for CTS without acute trauma who were treated at 30 occupational health clinics from 2011 to 2013 and followed for 18 months. Timing of key clinical events, adherence to 45 quality measures, changes in scores on the Boston Carpal Tunnel Questionnaire and 12-item Short Form Health Survey Version 2 (SF-12v2), and costs associated with medical care and disability. Two hundred sixty-seven subjects (56%) received a diagnosis of CTS and had claims filed around the first visit to occupational health, 104 (22%) received a diagnosis before that visit and claim, and 98 (21%) received a diagnosis or had claims filed after that visit. One hundred seventy-eight (37%) subjects had time off work, which started around the time of surgery in 147 (83%) cases and lasted a median of 41 days (interquartile range = 42 days). The timing of diagnosis varied, but time off work was generally short and related to surgery. If associations of quality of care with key medical, economic, and quality-of-life outcomes are identified for work-associated CTS, systematic efforts to evaluate and improve quality of medical care for this condition are warranted.

  4. Toward information management in corporations (2)

    NASA Astrophysics Data System (ADS)

    Shibata, Mitsuru

    If the construction of in-house information management systems in an advanced information society is to be positioned alongside information management in society at large, the groundwork begins with a review of current paper filing systems. Because the problems inherent in in-house information management systems that use office automation (OA) equipment also inhere in paper filing systems, the first step toward full-scale in-house information management should be to identify and solve the fundamental problems in current filing systems. This paper analyzes those fundamental problems, discusses the creation of new types of offices, analyzes needs for improvement in filing systems, and notes some key points in improving them.

  5. Raising Awareness of Medicare Member Rights Among Seniors and Caregivers in California

    PubMed Central

    Grossman, Ruth M.; Fu, Patricia L.; Sabogal, Fabio

    2010-01-01

    Many Medicare recipients do not understand their health care rights. Lumetra, formerly California's Medicare quality improvement organization, developed a multifaceted outreach program to increase beneficiary awareness of its services and of the right to file quality-of-care complaints and discharge appeals. Layered outreach activities to Medicare members and their caregivers in 2 targeted counties consisted of paid media, direct mailings, community outreach, and online marketing. Calls to Lumetra's helpline and visits to its Web site—measures of beneficiary awareness of case review services—increased by 106% and 1214%, respectively, in the targeted counties during the 4-month outreach period. Only small increases occurred in nontargeted counties. Increases in quality-of-care complaints and discharge appeal rates were detected during a longer follow-up period. PMID:19965568

  6. Raising awareness of Medicare member rights among seniors and caregivers in California.

    PubMed

    Olson, Rebecca; Grossman, Ruth M; Fu, Patricia L; Sabogal, Fabio

    2010-01-01

    Many Medicare recipients do not understand their health care rights. Lumetra, formerly California's Medicare quality improvement organization, developed a multifaceted outreach program to increase beneficiary awareness of its services and of the right to file quality-of-care complaints and discharge appeals. Layered outreach activities to Medicare members and their caregivers in 2 targeted counties consisted of paid media, direct mailings, community outreach, and online marketing. Calls to Lumetra's helpline and visits to its Web site--measures of beneficiary awareness of case review services--increased by 106% and 1214%, respectively, in the targeted counties during the 4-month outreach period. Only small increases occurred in nontargeted counties. Increases in quality-of-care complaints and discharge appeal rates were detected during a longer follow-up period.

  7. The IEO Data Center Management System: Tools for quality control, analysis and access marine data

    NASA Astrophysics Data System (ADS)

    Casas, Antonia; Garcia, Maria Jesus; Nikouline, Andrei

    2010-05-01

    Since 1994, the Data Centre of the Spanish Oceanographic Institute (IEO) has been developing a system for archiving and quality control of oceanographic data. The work started within the framework of the European Marine Science & Technology Programme (MAST), when a consortium of several Mediterranean data centres began to work on the MEDATLAS project. Over the years, old software modules for MS-DOS were rewritten, improved, and migrated to the Windows environment. Oceanographic data quality control now covers not only vertical profiles (mainly CTD and bottle observations) but also time series of currents and sea level observations. Powerful new routines for analysis and graphic visualization were added, and data originally held in ASCII format were recently organized in an open-source MySQL database. Nowadays the IEO, as part of the SeaDataNet infrastructure, has designed and developed a new information system, consistent with the ISO 19115 and SeaDataNet standards, to manage the large and diverse marine data and information originated in Spain by different sources and to interoperate with SeaDataNet. The system works with data stored in ASCII files (MEDATLAS, ODV) as well as data stored within the relational database. The components of the system are:
    1. MEDATLAS format and quality control. QCDAMAR (Quality Control of Marine Data) is the main set of tools for working with data presented as text files; it includes extended quality control (searching for duplicated cruises and profiles; checking dates, positions, ship velocity, constant profiles, spikes, density inversions, soundings, acceptable data, and impossible regional values) and input/output filters. QCMareas is a set of procedures for the quality control of tide gauge data according to the standard international sea level observing system; these procedures include checking for unexpected anomalies in the time series, interpolation, filtering, and computation of basic statistics and residuals.
    2. DAMAR, a relational database (MySQL) designed to manage a wide variety of marine information, including common vocabularies, catalogues (CSR and EDIOS), data, and metadata.
    3. Other tools for analysis and data management: Import_DB, a script to import data and metadata from MEDATLAS ASCII files into the database; SelDamar/Selavi, an interface to the database for local and web access that allows selective retrievals by user-specified criteria (geographical bounds, data responsible, cruises, platform, time periods, etc.) and includes calculation of statistical reference values and plotting of original and mean profiles with vertical interpolation; and ExtractDAMAR, a script that extracts archived ASCII data meeting the criteria of a user request through the SelDamar interface and exports them in ODV format, also performing unit conversion.
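
    Two of the profile checks listed above, spike detection and density inversion, can be sketched compactly. The GTSPP-style spike formula and the thresholds below are common choices, not necessarily those used in QCDAMAR.

    ```python
    import numpy as np

    def spike_flags(values, threshold):
        """GTSPP-style spike test for a vertical profile: flag a point
        that departs from the mean of its neighbours by more than the
        threshold, beyond what the neighbour gradient alone explains."""
        v = np.asarray(values, dtype=float)
        test = np.abs(v[1:-1] - (v[:-2] + v[2:]) / 2) - np.abs((v[:-2] - v[2:]) / 2)
        flags = np.zeros(len(v), dtype=bool)
        flags[1:-1] = test > threshold
        return flags

    def density_inversion_flags(density, tolerance=0.03):
        """Flag levels where density decreases with depth by more than
        the tolerance (kg/m^3), which is physically suspect."""
        d = np.asarray(density, dtype=float)
        flags = np.zeros(len(d), dtype=bool)
        flags[1:] = (d[:-1] - d[1:]) > tolerance
        return flags

    temp = [15.0, 14.8, 19.9, 14.5, 14.4]        # 19.9 is an obvious spike
    print(spike_flags(temp, threshold=2.0))       # -> flags index 2
    ```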

  8. Release of the ENDF/B-VII.1 Evaluated Nuclear Data File

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, David

    2012-06-30

    The Cross Section Evaluation Working Group (CSEWG) released the ENDF/B-VII.1 library on December 22, 2011. The ENDF/B-VII.1 library is CSEWG's latest recommended evaluated nuclear data file for use in nuclear science and technology applications, and it incorporates advances made in the five years since the release of ENDF/B-VII.0, including many new evaluations in the neutron sublibrary (423 in all, over 190 of which contain covariances), new fission product yields, and a greatly improved decay data sublibrary. This summary barely touches on the five years' worth of advances present in the ENDF/B-VII.1 library. We expect that these changes will lead to improved integral performance in reactors and other applications. Furthermore, the expansion of covariance data in this release will allow for better uncertainty quantification, reducing design margins and costs. The ENDF library is an ongoing and evolving effort. Currently, the ENDF data community is embarking on several parallel efforts to improve library management: (1) the adoption of a continuous integration system to provide evaluators 'instant' feedback on the quality of their evaluations and to provide data users with working 'beta' quality libraries between major releases; (2) the transition to a new hierarchical data format, the Generalized Nuclear Data (GND) format, which we expect to enable new kinds of evaluated data that cannot be accommodated in the legacy ENDF format; and (3) the development of data assimilation and uncertainty propagation techniques to enable the consistent use of integral experimental data in the evaluation process.

  9. 42 CFR 84.40 - Quality control plans; filing requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES APPROVAL OF RESPIRATORY PROTECTIVE DEVICES Quality... proposed quality control plan which shall be designed to assure the quality of respiratory protection...

  10. 42 CFR 84.40 - Quality control plans; filing requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES APPROVAL OF RESPIRATORY PROTECTIVE DEVICES Quality... proposed quality control plan which shall be designed to assure the quality of respiratory protection...

  11. 42 CFR 84.40 - Quality control plans; filing requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES APPROVAL OF RESPIRATORY PROTECTIVE DEVICES Quality... proposed quality control plan which shall be designed to assure the quality of respiratory protection...

  12. 42 CFR 84.40 - Quality control plans; filing requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES APPROVAL OF RESPIRATORY PROTECTIVE DEVICES Quality... proposed quality control plan which shall be designed to assure the quality of respiratory protection...

  13. 42 CFR 84.40 - Quality control plans; filing requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES APPROVAL OF RESPIRATORY PROTECTIVE DEVICES Quality... proposed quality control plan which shall be designed to assure the quality of respiratory protection...

  14. PDB data curation.

    PubMed

    Wang, Yanchao; Sunderraman, Rajshekhar

    2006-01-01

    In this paper, we propose two architectures for curating PDB data to improve its quality. The first, the PDB Data Curation System, is developed by adding two parts, a Checking Filter and a Curation Engine, between the User Interface and the Database; this architecture supports basic PDB data curation. The second, the PDB Data Curation System with XCML, is designed for further curation and adds four more parts (PDB-XML, PDB, OODB, Protin-OODB) to the first. This architecture uses the XCML language to automatically check PDB data for errors, making the data more consistent and accurate. The two tools can be used for cleaning existing PDB files and creating new ones. We also show how constraints and assertions can be added with XCML to obtain better data, and we discuss data provenance issues that may affect data accuracy and consistency.
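
    A checking filter of the kind described can be illustrated with a few fixed-column consistency tests on a single PDB ATOM record. This sketch is illustrative only; it is not the paper's XCML-based machinery.

    ```python
    def check_atom_record(line):
        """Minimal consistency checks for one PDB ATOM/HETATM line,
        using the fixed-column layout of the PDB format (coordinates in
        columns 31-54, residue name in columns 18-20)."""
        if not line.startswith(("ATOM  ", "HETATM")):
            return ["not an ATOM/HETATM record"]
        if len(line.rstrip("\n")) < 54:
            return ["record too short to hold coordinates"]
        errors = []
        try:
            x, y, z = float(line[30:38]), float(line[38:46]), float(line[46:54])
        except ValueError:
            errors.append("non-numeric coordinates")
        else:
            if not all(-9999.0 <= c <= 9999.0 for c in (x, y, z)):
                errors.append("coordinates out of plausible range")
        if not line[17:20].strip():
            errors.append("missing residue name")
        return errors

    rec = "ATOM      1  N   MET A   1      38.198  19.582  28.998  1.00 49.32           N"
    print(check_atom_record(rec))   # -> []
    ```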

  15. A series of PDB related databases for everyday needs.

    PubMed

    Joosten, Robbie P; te Beek, Tim A H; Krieger, Elmar; Hekkelman, Maarten L; Hooft, Rob W W; Schneider, Reinhard; Sander, Chris; Vriend, Gert

    2011-01-01

    The Protein Data Bank (PDB) is the world-wide repository of macromolecular structure information. We present a series of databases that run parallel to the PDB. Each database holds one entry, if possible, for each PDB entry. DSSP holds the secondary structure of the proteins. PDBREPORT holds reports on structure quality and lists errors. HSSP holds a multiple sequence alignment for all proteins. The PDBFINDER holds easy-to-parse summaries of the PDB file content, augmented with essentials from the other systems. PDB_REDO holds re-refined, and often improved, copies of all structures solved by X-ray. WHY_NOT summarizes why certain files could not be produced. All these systems are updated weekly. The data sets can be used for the analysis of properties of protein structures in areas ranging from structural genomics to cancer biology and protein design.

  16. Use of geographic information system to display water-quality data from San Juan basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thorn, C.R.; Dam, W.L.

    1989-09-01

    The ARC/INFO geographic information system is being used to create thematic maps of the San Juan basin as part of the USGS Regional Aquifer-System Analysis program. (Use of trade names is for descriptive purposes only and does not constitute endorsement by the US Geological Survey.) Maps created by a Prime version of ARC/INFO, to be published in a series of Hydrologic Investigations Atlas reports for selected geologic units, will include outcrop patterns, water-well locations, and water-quality data. The San Juan basin study area, encompassing about 19,400 mi², can be displayed with ARC/INFO at various scales; at the same scale, generated water-quality maps can be compared and overlain with other maps, such as potentiometric surface and depth to the top of a geologic or hydrologic unit. Selected water-quality and well data (including latitude and longitude) are retrieved from the USGS National Water Information System database for a specified geologic unit. Data are formatted by Fortran programs and read into an INFO database. Two parallel files - an INFO file containing water-quality and well data and an ARC file containing the site coordinates - are joined to form the ARC/INFO database. A file containing a series of commands in Prime's Command Procedure Language is used to select coverage, display, and position data on the map. Data interpretation is enhanced by displaying water-quality data throughout the basin in combination with other hydrologic and geologic data.
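
    The join of the two parallel files can be mimicked with a simple attribute merge on a shared site identifier; the sketch below uses pandas and invented column names rather than ARC/INFO.

    ```python
    import pandas as pd

    # Hypothetical stand-ins for the two parallel files described above:
    # an attribute table of water-quality and well data, and a table of
    # site coordinates, joined on a shared site identifier.
    quality = pd.DataFrame({
        "site_id": ["SJ001", "SJ002", "SJ003"],
        "tds_mg_l": [620, 1450, 980],          # dissolved solids
        "well_depth_ft": [210, 480, 330],
    })
    coords = pd.DataFrame({
        "site_id": ["SJ001", "SJ002", "SJ003"],
        "lat": [36.81, 36.55, 36.20],
        "lon": [-108.22, -107.98, -108.50],
    })

    merged = quality.merge(coords, on="site_id")   # analogous to the ARC/INFO join
    print(merged)
    ```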

  17. Quality aspects of ex vivo root canal treatments done by undergraduate dental students using four different endodontic treatment systems.

    PubMed

    Jungnickel, Luise; Kruse, Casper; Vaeth, Michael; Kirkevang, Lise-Lotte

    2018-04-01

    To evaluate factors associated with the treatment quality of ex vivo root canal treatments performed by undergraduate dental students using different endodontic treatment systems. Four students performed root canal treatment on 80 extracted human teeth using four endodontic treatment systems in a designated treatment order following a Latin square design. The lateral seal and length of the root canal fillings were radiographically assessed; for lateral seal, a graded visual scale was used. Treatment time was measured separately for access preparation, biomechanical root canal preparation, and obturation, and for the total procedure. Mishaps were registered. An ANOVA mirroring the Latin square design was performed. Use of machine-driven nickel-titanium systems resulted in overall better quality scores for lateral seal than use of the manual stainless-steel system. Among the machine-driven systems, scores did not differ significantly. Use of machine-driven instruments resulted in shorter treatment times than manual instrumentation, and machine-driven systems with few files achieved shorter treatment times. With an increasing number of treatments, root canal filling quality increased and treatment time decreased; a learning curve was plotted. No root canal shaping file separated. The use of endodontic treatment systems with machine-driven files led to a higher quality lateral seal compared with the manual system. The three contemporary machine-driven systems delivered comparable results regarding the quality of root canal fillings; they were safe to use and provided a more efficient workflow than the manual technique. Increasing experience had a positive impact on the quality of root canal fillings while treatment time decreased.

  18. Toolsets for Airborne Data (TAD): Improving Machine Readability for ICARTT Data Files

    NASA Astrophysics Data System (ADS)

    Northup, E. A.; Early, A. B.; Beach, A. L., III; Kusterer, J.; Quam, B.; Wang, D.; Chen, G.

    2015-12-01

    NASA has conducted airborne tropospheric chemistry studies for about three decades. These field campaigns have generated a great wealth of observations, including a wide range of trace gases and aerosol properties. The ASDC Toolsets for Airborne Data (TAD) is designed to meet the user community's needs for manipulating aircraft data for scientific research on climate change and air quality. TAD makes use of aircraft data stored in the International Consortium for Atmospheric Research on Transport and Transformation (ICARTT) file format. ICARTT has been the NASA standard since 2010 and is widely used by NOAA, NSF, and international partners (DLR, FAAM). Its level of acceptance is due in part to it being generally self-describing for researchers, i.e., it provides the data descriptions necessary for proper research use. Despite this, there are a number of issues with the current ICARTT format, especially concerning machine readability. To overcome these issues, the TAD team has developed an "idealized" file format. This format is ASCII and is sufficiently machine readable to sustain the TAD system; however, it is not fully compatible with the current ICARTT format. The process of mapping ICARTT metadata to the idealized format, the format specifics, and the actual conversion process will be discussed. The goal of this presentation is to demonstrate an example of how to improve the machine readability of ASCII data format protocols.

  19. Demographics and the correlation between irrational parenthood cognitions and marital relationship quality in infertile women in Zanjan province in 2016.

    PubMed

    Safaei Nezhad, Arezoo; Vakili, Mohammad Masoud; Ebrahimi, Loghman; Kharaghani, Roghieh

    2018-06-11

    This study determined the correlation between irrational parenthood cognitions (IPC) and marital quality, by demographic variables, in infertile women. A correlational study using a census method was conducted on all women with primary infertility who had a file in Zanjan, Iran. A significant positive correlation of 47% was found between IPC and marital quality (p < 0.001). The highest correlations were observed in the subsets of women aged 31-40 years, those with a high level of education, those in the third socioeconomic class, those with less than 10 years of married life, and women whose husbands had no children from a previous marriage (all ps < 0.05). Providing counseling services to women with primary infertility, especially high-risk women, may help to reduce IPC and improve marital quality. © 2018 Wiley Periodicals, Inc.

  20. Nursing Home Price and Quality Responses to Publicly Reported Quality Information

    PubMed Central

    Clement, Jan P; Bazzoli, Gloria J; Zhao, Mei

    2012-01-01

    Objective To assess whether the release of Nursing Home Compare (NHC) data affected self-pay per diem prices and quality of care. Data Sources Primary data sources are the Annual Survey of Wisconsin Nursing Homes for 2001–2003, Online Survey and Certification Reporting System, NHC, and Area Resource File. Study Design We estimated fixed effects models with robust standard errors of per diem self-pay charge and quality before and after NHC. Principal Findings After NHC, low-quality nursing homes raised their prices by a small but significant amount and decreased their use of restraints but did not reduce pressure sores. Mid-level and high-quality nursing homes did not significantly increase self-pay prices after NHC nor consistently change quality. Conclusions Our findings suggest that the release of quality information affected nursing home behavior, especially pricing and quality decisions among low-quality facilities. Policy makers should continue to monitor quality and prices for self-pay residents and scrutinize low-quality homes over time to see whether they are on a pathway to improve quality. In addition, policy makers should not expect public reporting to result in quick fixes to nursing home quality problems. PMID:22092366

  1. URBAN STORMWATER INVESTIGATIONS BY THE U. S. GEOLOGICAL SURVEY.

    USGS Publications Warehouse

    Jennings, Marshall E.

    1985-01-01

    Urban stormwater hydrology studies in the U. S. Geological Survey are currently focused on compilation of national data bases containing flood-peak and short time-interval rainfall, discharge and water-quality information for urban watersheds. Current data bases, updated annually, are nationwide in scope. Supplementing the national data files are published reports of interpretative analyses, a map report and research products including improved instrumentation and deterministic modeling capabilities. New directions of Survey investigations include gaging programs for very small catchments and for stormwater detention facilities.

  2. Data Processing Aspects of MEDLARS

    PubMed Central

    Austin, Charles J.

    1964-01-01

    The speed and volume requirements of MEDLARS necessitate the use of high-speed data processing equipment, including paper-tape typewriters, a digital computer, and a special device for producing photo-composed output. Input to the system is of three types: variable source data, including citations from the literature and search requests; changes to such master files as the medical subject headings list and the journal record file; and operating instructions such as computer programs and procedures for machine operators. MEDLARS builds two major stores of data on magnetic tape. The Processed Citation File includes bibliographic citations in expanded form for high-quality printing at periodic intervals. The Compressed Citation File is a coded, time-sequential citation store which is used for high-speed searching against demand request input. Major design considerations include converting variable-length, alphanumeric data to mechanical form quickly and accurately; serial searching by the computer within a reasonable period of time; high-speed printing that must be of graphic quality; and efficient maintenance of various complex computer files. PMID:14119287

  3. DATA PROCESSING ASPECTS OF MEDLARS.

    PubMed

    AUSTIN, C J

    1964-01-01

    The speed and volume requirements of MEDLARS necessitate the use of high-speed data processing equipment, including paper-tape typewriters, a digital computer, and a special device for producing photo-composed output. Input to the system is of three types: variable source data, including citations from the literature and search requests; changes to such master files as the medical subject headings list and the journal record file; and operating instructions such as computer programs and procedures for machine operators. MEDLARS builds two major stores of data on magnetic tape. The Processed Citation File includes bibliographic citations in expanded form for high-quality printing at periodic intervals. The Compressed Citation File is a coded, time-sequential citation store which is used for high-speed searching against demand request input. Major design considerations include converting variable-length, alphanumeric data to mechanical form quickly and accurately; serial searching by the computer within a reasonable period of time; high-speed printing that must be of graphic quality; and efficient maintenance of various complex computer files.

  4. The SCALE Verified, Archived Library of Inputs and Data - VALID

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marshall, William BJ J; Rearden, Bradley T

    The Verified, Archived Library of Inputs and Data (VALID) at ORNL contains high-quality, independently reviewed models and results that improve confidence in analysis. VALID is developed and maintained according to a procedure of the SCALE quality assurance (QA) plan. This paper reviews the origins of the procedure and its intended purpose, the philosophy of the procedure, some highlights of its implementation, and the future of the procedure and the associated VALID library. The original focus of the procedure was the generation of high-quality models that could be archived at ORNL and applied to many studies; the review process associated with model generation minimized the chances of errors in these archived models. Subsequently, the scope of the library and procedure was expanded to provide high-quality, reviewed sensitivity data files for deployment through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE). Sensitivity data files for approximately 400 such models are currently available. The VALID procedure and library continue to fulfill these multiple roles. The VALID procedure is based on the quality assurance principles of ISO 9001 and nuclear safety analysis, including independent generation and review of information, generation and review by qualified individuals, use of appropriate references for design data and documentation, and retrievability of the models, results, and documentation associated with entries in the library. Some highlights of the detailed procedure are discussed to provide background on its implementation and to indicate limitations of data extracted from VALID for use by the broader community. Specifically, external users of data generated within VALID must take responsibility for ensuring that the files are used within the QA framework of their organization and that the use is appropriate. Future plans for the VALID library include expansion to additional experiments from the IHECSBE, to experiments from areas beyond criticality safety, such as reactor physics and shielding, and to application models. In the future, external SCALE users may also obtain qualification under the VALID procedure and be involved in expanding the library. The VALID library provides a pathway for the criticality safety community to leverage modeling and analysis expertise at ORNL.

  5. Documentation of a multiple-technique computer program for plotting major-ion composition of natural waters

    USGS Publications Warehouse

    Briel, L.I.

    1993-01-01

    A computer program was written to produce 6 different types of water-quality diagrams--Piper, Stiff, pie, X-Y, boxplot, and Piper 3-D--from the same file of input data. The Piper 3-D diagram is a new method that projects values from the surface of a Piper plot into a triangular prism to show how variations in chemical composition can be related to variations in other water-quality variables. The program is an analytical tool to aid in the interpretation of data. It is interactive, and the user can select from a menu the type of diagram to be produced and a large number of individual features; alternatively, these choices can be specified in the data file, which provides a batch mode for running the program. The program does not display water-quality diagrams directly; plots are written to a file. Four different plot-file formats are available: device-independent metafiles, Adobe PostScript graphics files, and two Hewlett-Packard graphics language formats (7475 and 7586). An ASCII data-table file is also produced to document the computed values. The program is written in Fortran 77 and uses graphics subroutines from either the PRIOR AGTK or the DISSPLA graphics library. It has been implemented on Prime series 50 and Data General Aviion computers within the USGS; portability to other computing systems depends on the availability of the graphics library.
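
    Underlying any Piper plot is a normalization from mg/L to milliequivalents per liter, with each ion expressed as a percentage of total cations or anions. The sketch below shows that computation; the equivalent weights are standard values, and the sample concentrations are invented.

    ```python
    # Equivalent weights (g/eq) of the major ions: atomic or formula
    # weight divided by ionic charge.
    EQ_WT = {"Ca": 20.04, "Mg": 12.15, "Na": 22.99, "K": 39.10,
             "Cl": 35.45, "SO4": 48.03, "HCO3": 61.02}
    CATIONS, ANIONS = ("Ca", "Mg", "Na", "K"), ("Cl", "SO4", "HCO3")

    def piper_percentages(mg_per_l):
        """Convert mg/L to meq/L and return each ion as a percentage of
        total cations or total anions, the coordinates a Piper plot uses."""
        meq = {ion: mg_per_l[ion] / EQ_WT[ion] for ion in mg_per_l}
        cat_total = sum(meq[i] for i in CATIONS)
        an_total = sum(meq[i] for i in ANIONS)
        return ({i: 100 * meq[i] / cat_total for i in CATIONS},
                {i: 100 * meq[i] / an_total for i in ANIONS})

    sample = {"Ca": 52, "Mg": 12, "Na": 30, "K": 3,
              "Cl": 20, "SO4": 60, "HCO3": 180}
    print(piper_percentages(sample))
    ```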

  6. Keemei: cloud-based validation of tabular bioinformatics file formats in Google Sheets.

    PubMed

    Rideout, Jai Ram; Chase, John H; Bolyen, Evan; Ackermann, Gail; González, Antonio; Knight, Rob; Caporaso, J Gregory

    2016-06-13

    Bioinformatics software often requires human-generated tabular text files as input and has specific requirements for how those data are formatted. Users frequently manage these data in spreadsheet programs, which is convenient for researchers who are compiling the requisite information because the spreadsheet programs can easily be used on different platforms including laptops and tablets, and because they provide a familiar interface. It is increasingly common for many different researchers to be involved in compiling these data, including study coordinators, clinicians, lab technicians and bioinformaticians. As a result, many research groups are shifting toward using cloud-based spreadsheet programs, such as Google Sheets, which support the concurrent editing of a single spreadsheet by different users working on different platforms. Most of the researchers who enter data are not familiar with the formatting requirements of the bioinformatics programs that will be used, so validating and correcting file formats is often a bottleneck prior to beginning bioinformatics analysis. We present Keemei, a Google Sheets Add-on, for validating tabular files used in bioinformatics analyses. Keemei is available free of charge from Google's Chrome Web Store. Keemei can be installed and run on any web browser supported by Google Sheets. Keemei currently supports the validation of two widely used tabular bioinformatics formats, the Quantitative Insights into Microbial Ecology (QIIME) sample metadata mapping file format and the Spatially Referenced Genetic Data (SRGD) format, but is designed to easily support the addition of others. Keemei will save researchers time and frustration by providing a convenient interface for tabular bioinformatics file format validation. By allowing everyone involved with data entry for a project to easily validate their data, it will reduce the validation and formatting bottlenecks that are commonly encountered when human-generated data files are first used with a bioinformatics system. Simplifying the validation of essential tabular data files, such as sample metadata, will reduce common errors and thereby improve the quality and reliability of research outcomes.
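
    The kind of validation Keemei performs can be sketched with two basic checks, required columns and unique sample IDs, on a tab-separated mapping file. The required column names below follow the QIIME 1 mapping-file convention mentioned in the abstract; the function is an illustration, not Keemei's implementation.

    ```python
    import csv

    def validate_mapping(path, required=("#SampleID", "BarcodeSequence", "Description")):
        """Minimal tabular validation: required columns present and
        sample IDs unique.  Column names follow the QIIME 1 mapping-file
        convention; adapt the 'required' tuple for other formats."""
        problems = []
        with open(path, newline="") as f:
            rows = list(csv.reader(f, delimiter="\t"))
        header = rows[0] if rows else []
        for col in required:
            if col not in header:
                problems.append(f"missing required column: {col}")
        if "#SampleID" in header:
            idx = header.index("#SampleID")
            ids = [r[idx] for r in rows[1:] if len(r) > idx]
            dupes = {i for i in ids if ids.count(i) > 1}
            problems.extend(f"duplicate sample ID: {d}" for d in sorted(dupes))
        return problems
    ```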

  7. The Airline Quality Rating 2001 (PDF file)

    DOT National Transportation Integrated Search

    2001-04-01

    The Airline Quality Rating (AQR) was developed and first announced in early : 1991 as an objective method of comparing airline quality on combined multiple : performance criteria. This current report, Airline Quality Rating 2001, reflects monthly Air...

  8. Improving Internet Archive Service through Proxy Cache.

    ERIC Educational Resources Information Center

    Yu, Hsiang-Fu; Chen, Yi-Ming; Wang, Shih-Yong; Tseng, Li-Ming

    2003-01-01

    Discusses file transfer protocol (FTP) servers for downloading archives (files with particular file extensions), and the change to HTTP (Hypertext transfer protocol) with increased Web use. Topics include the Archie server; proxy cache servers; and how to improve the hit rate of archives by a combination of caching and better searching mechanisms.…

  9. Health maintenance organization (HMO) performance and consumer complaints: an empirical study of frustrating HMO activities.

    PubMed

    Born, Patricia H; Query, J Tim

    2004-01-01

    Growing public interest in the operations of managed care plans has fueled a variety of activities to collect and analyze their performance. These activities include studies of financial performance, analysis of enrollment decisions, and, more recently, the development of systems for measuring healthcare quality to improve accountability to consumers. In this study, the authors focus on the activities of managed care plans that may frustrate patients and providers and, subsequently, motivate patients to file complaints. Using data from three sources, they evaluate the relationships between complaints against managed care plans and two metrics of performance: (a) the financial performance of the plan, and (b) the quality of care provided. Their findings indicate that complaints against health maintenance organizations are significantly related to the plans' levels of quality and to actions that may impede access to care.

  10. 17 CFR 12.11 - Formalities of filing of documents with the Proceedings Clerk.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... must be on good quality white paper, 81/2 by 111/2 inches and bound at the top only. Documents e-mailed... copies. Unless otherwise specifically provided, or unless filed by fax or e-mail in accordance with the... Proceedings Clerk. (b) Title page. All documents filed with the Proceedings Clerk must include at the head...

  11. 17 CFR 12.11 - Formalities of filing of documents with the Proceedings Clerk.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... must be on good quality white paper, 81/2 by 111/2 inches and bound at the top only. Documents e-mailed... copies. Unless otherwise specifically provided, or unless filed by fax or e-mail in accordance with the... Proceedings Clerk. (b) Title page. All documents filed with the Proceedings Clerk must include at the head...

  12. Database Deposit Service through JOIS : JAFIC File on Food Industry and Osaka Urban Engineering File

    NASA Astrophysics Data System (ADS)

    Kataoka, Akihiro

    JICST has launched the database deposit service for the excellent quality in small-and medium size, both of which have no dissemination network. JAFIC File on Food Industry produced by the Japan Food Industry Center and Osaka Urban Engineering File by Osaka City have been in service by JOIS since March 2, 1987. In this paper the outline of the above databases is introduced in focussing on the items covered and retrieved by JOIS.

  13. Access to orphan drugs despite poor quality of clinical evidence

    PubMed Central

    Dupont, Alain G; Van Wilder, Philippe B

    2011-01-01

    AIM We analysed the Belgian reimbursement decisions for orphan drugs as compared with those for innovative drugs for more common but equally severe diseases, with special emphasis on the quality of clinical evidence. METHODS Using the National Health Insurance Agency administrative database, we evaluated all submitted orphan drug files between 2002 and 2007. A quality analysis of the clinical evidence in the orphan reimbursement files was performed. The evaluation reports of the French 'Haute Autorité de Santé', including the five-point scale parameter 'Service Médical Rendu' (SMR), were examined to compare disease severity. Chi-squared tests (at the P < 0.05 significance level) were used to compare the outcome of the reimbursement decisions between orphan and non-orphan innovative medicines. RESULTS Twenty-five files of orphan drugs and 117 files of non-orphan drugs were evaluated. Twenty-two of 25 (88%) submissions of orphan drugs were granted reimbursement as opposed to 74 of the 117 (63%) non-orphan innovative medicines (P = 0.02). Only 52% of the 25 orphan drug files included a randomized controlled trial as opposed to 84% in a random control sample of 25 non-orphan innovative submissions (P < 0.01). The duration of drug exposure was in most cases far too short in relation to the natural history of the disease. CONCLUSIONS Orphan drug designation predicts reimbursement despite poor quality of clinical evidence. The evidence gap at market authorization should be reduced by post-marketing programmes in which the centralized regulatory and local reimbursement authorities collaborate efficiently across European Union member states. PMID:21395641
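
    The comparison of reimbursement outcomes (22 of 25 orphan files granted versus 74 of 117 non-orphan files) can be checked with a standard chi-squared test on the 2x2 table, as sketched below; with the default continuity correction the p-value lands close to the reported 0.02.

    ```python
    from scipy.stats import chi2_contingency

    # Reimbursement outcomes reported in the abstract:
    # orphan drugs: 22 granted, 3 refused; non-orphan: 74 granted, 43 refused.
    table = [[22, 3],
             [74, 43]]
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, p = {p:.3f}")   # roughly the reported P = 0.02
    ```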

  14. Web-based healthcare hand drawing management system.

    PubMed

    Hsieh, Sheau-Ling; Weng, Yung-Ching; Chen, Chi-Huang; Hsu, Kai-Ping; Lin, Jeng-Wei; Lai, Feipei

    2010-01-01

    The paper addresses the architecture and implementation of a medical hand drawing management system. We developed four modules: a hand drawing management module; a patient medical records query module; a hand drawing editing and upload module; and a hand drawing query module. The system adopts Windows-based applications and encompasses web pages through the ASP.NET hosting mechanism on a web services platform. The hand drawings are implemented as files stored on an FTP server; the file names and associated data (e.g., patient identification, drawing physician, access rights) are stored in a database. The modules can be conveniently embedded in and integrated into any system, so the system's hand drawing features can support daily medical operations and effectively improve healthcare quality. Moreover, the system includes printing capability, achieving a complete, computerized medical document process. In summary, the system allows web-based applications to facilitate graphic processes for healthcare operations.

  15. Translator for Optimizing Fluid-Handling Components

    NASA Technical Reports Server (NTRS)

    Landon, Mark; Perry, Ernest

    2007-01-01

    A software interface has been devised to facilitate optimization of the shapes of valves, elbows, fittings, and other components used to handle fluids under extreme conditions. This software interface translates data files generated by PLOT3D (a NASA grid-based plotting-and- data-display program) and by computational fluid dynamics (CFD) software into a format in which the files can be read by Sculptor, which is a shape-deformation- and-optimization program. Sculptor enables the user to interactively, smoothly, and arbitrarily deform the surfaces and volumes in two- and three-dimensional CFD models. Sculptor also includes design-optimization algorithms that can be used in conjunction with the arbitrary-shape-deformation components to perform automatic shape optimization. In the optimization process, the output of the CFD software is used as feedback while the optimizer strives to satisfy design criteria that could include, for example, improved values of pressure loss, velocity, flow quality, mass flow, etc.
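
    For context, a single-block ASCII PLOT3D grid file begins with the i/j/k dimensions and is followed by all x, then all y, then all z coordinates. The reader below handles only that simplest variant (PLOT3D also has binary, multiblock, and IBLANK forms) and is an illustration, not the translator described here.

    ```python
    import numpy as np

    def read_plot3d_ascii(path):
        """Read a single-block ASCII PLOT3D grid: the i/j/k dimensions,
        then all x, all y, and all z coordinates (i varying fastest).
        Only the simplest PLOT3D variant is handled."""
        with open(path) as f:
            tokens = f.read().split()
        ni, nj, nk = (int(t) for t in tokens[:3])
        npts = ni * nj * nk
        vals = np.array(tokens[3:3 + 3 * npts], dtype=float)
        x, y, z = (vals[i * npts:(i + 1) * npts].reshape(nk, nj, ni)
                   for i in range(3))
        return x, y, z
    ```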

  16. The influence of software filtering in digital mammography image quality

    NASA Astrophysics Data System (ADS)

    Michail, C.; Spyropoulou, V.; Kalyvas, N.; Valais, I.; Dimitropoulos, N.; Fountos, G.; Kandarakis, I.; Panayiotakis, G.

    2009-05-01

    Breast cancer is one of the most frequently diagnosed cancers among women. Several techniques have been developed to help in its early detection, such as conventional and digital x-ray mammography, positron emission mammography, and single-photon emission mammography. A key advantage of digital mammography is that images can be manipulated as simple computer image files, so non-dedicated, commercially available image manipulation software can be employed to process and store them. The image processing tools of Photoshop (CS 2) incorporate digital filters which may be used to reduce image noise, enhance contrast, and increase spatial resolution; however, improving one image quality parameter may degrade another. The aim of this work was to investigate the influence of three sharpening filters, named hereafter sharpen, sharpen more, and sharpen edges, on image resolution and noise. Image resolution was assessed by means of the Modulation Transfer Function (MTF). In conclusion, it was found that the correct use of commercial non-dedicated software on digital mammograms may improve some aspects of image quality.
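
    The noise-versus-sharpening trade-off noted above is easy to demonstrate with a generic sharpening filter: measure the standard deviation of a nominally flat, noisy region before and after filtering. This sketch uses Pillow's built-in SHARPEN kernel on synthetic data, not the Photoshop filters studied.

    ```python
    import numpy as np
    from PIL import Image, ImageFilter

    # Synthetic flat region with Gaussian noise (mean 128, sigma 2).
    rng = np.random.default_rng(1)
    flat = np.full((128, 128), 128.0) + rng.normal(0, 2.0, (128, 128))
    img = Image.fromarray(np.clip(flat, 0, 255).astype(np.uint8))

    sharpened = img.filter(ImageFilter.SHARPEN)

    noise_before = np.asarray(img, dtype=float).std()
    noise_after = np.asarray(sharpened, dtype=float).std()
    print(f"noise before: {noise_before:.2f}, after: {noise_after:.2f}")
    # Sharpening typically amplifies the noise standard deviation.
    ```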

  17. Improving detection and quality of assessment of child abuse and partner abuse is achievable with a formal organisational change approach.

    PubMed

    Wills, Russell; Ritchie, Miranda; Wilson, Mollie

    2008-03-01

    To improve detection and quality of assessment of child and partner abuse within a health service. A formal organisational change approach was used to implement the New Zealand Family Violence Intervention Guidelines in a mid-sized regional health service. The approach includes obtaining senior management support, community collaboration, developing resources to support practice, research, evaluation and training. Formal pre-post evaluations were conducted of the training. Barriers and enablers of practice change were assessed through 85 interviews with 60 staff. More than 6000 clinical records were audited to assess rates of questioning for partner abuse. Identifications of partner abuse and referrals made were counted through the Family Violence Accessory File. Referrals to the Department of Child, Youth and Family Services (CYFS) were recorded routinely by the CYFS. Audits assessed quality of assessment of child and partner abuse, when identified. More than 700 staff were trained in dual assessment for child and partner abuse. Evaluations demonstrate improved confidence following training, though staff still need support. Barriers and enablers to asking about partner abuse were identified. Referrals from the health service to the CYFS increased from 10 per quarter to 70 per quarter. Identification of partner abuse increased from 30 to 80 per 6-month period. Routine questioning rates for partner abuse vary between services. Achieving and sustaining improved rates of identification and quality of assessment of child and partner abuse is possible with a formal organisational change approach.

  18. 48 CFR 246.401 - General.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... DEFENSE CONTRACT MANAGEMENT QUALITY ASSURANCE Government Contract Quality Assurance 246.401 General. The requirement for a quality assurance surveillance plan shall be addressed and documented in the contract file... services, the contracting officer should prepare a quality assurance surveillance plan to facilitate...

  19. Hospital implementation of health information technology and quality of care: are they related?

    PubMed

    Restuccia, Joseph D; Cohen, Alan B; Horwitt, Jedediah N; Shwartz, Michael

    2012-09-27

    Recently, there has been considerable effort to promote the use of health information technology (HIT) in order to improve health care quality. However, relatively little is known about the extent to which HIT implementation is associated with hospital patient care quality. We undertook this study to determine the association of various HITs with: hospital quality improvement (QI) practices and strategies; adherence to process of care measures; risk-adjusted inpatient mortality; patient satisfaction; and assessment of patient care quality by hospital quality managers and front-line clinicians. We conducted surveys of quality managers and front-line clinicians (physicians and nurses) in 470 short-term, general hospitals to obtain data on hospitals' extent of HIT implementation, QI practices and strategies, assessments of quality performance, commitment to quality, and sufficiency of resources for QI. Of the 470 hospitals, 401 submitted complete data necessary for analysis. We also developed measures of hospital performance from several publicly available data sources: Hospital Compare adherence to process of care measures; the Medicare Provider Analysis and Review (MEDPAR) file; and the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS®) survey. We used Poisson regression analysis to examine the association between HIT implementation and QI practices and strategies, and general linear models to examine the relationship between HIT implementation and hospital performance measures. Controlling for potential confounders, we found that hospitals with high levels of HIT implementation engaged in a significantly greater number of QI practices and strategies, and had significantly better performance on mortality rates, patient satisfaction measures, and assessments of patient care quality by hospital quality managers; there was weaker evidence of higher assessments of patient care quality by front-line clinicians. Hospital implementation of HIT was positively associated with activities intended to improve patient care quality and with higher performance on four of six performance measures.
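
    The Poisson regression step described above might look like the following statsmodels sketch; the file name, predictor columns, and outcome column are hypothetical stand-ins for the study's survey variables.

        import pandas as pd
        import statsmodels.api as sm

        df = pd.read_csv("hospital_survey.csv")  # hypothetical survey extract
        X = sm.add_constant(df[["hit_level", "beds", "teaching"]])  # hypothetical predictors
        y = df["n_qi_practices"]  # count of QI practices per hospital

        # Poisson GLM: counts of QI practices as a function of HIT implementation.
        model = sm.GLM(y, X, family=sm.families.Poisson()).fit()
        print(model.summary())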

  20. A site of communication among enterprises for supporting occupational health and safety management system.

    PubMed

    Velonakis, E; Mantas, J; Mavrikakis, I

    2006-01-01

    Occupational health and safety management constitutes a field of increasing interest. Institutions, in cooperation with enterprises, are making synchronized efforts to introduce quality management systems in this field. Computer networks can offer such services via TCP/IP, a reliable protocol for workflow management between enterprises and institutions. The design of such a network is based on several factors in order to achieve defined criteria and connectivity with other networks. The network consists of nodes responsible for informing executives on occupational health and safety. A web database has been planned for inserting and searching documents and for answering and processing questionnaires. The submission of files to a server and the answering of questionnaires through the web help the experts make corrections and improvements to their activities. Based on the requirements of enterprises, we have constructed a web file server to which files are submitted so that users can retrieve the files they need. Access is limited to authorized users, and digital watermarks authenticate and protect the digital objects. The health and safety management system follows ISO 18001, and its implementation through the web site is the aim. The whole application has been developed and implemented on a pilot basis for the health services sector; it is already installed within a hospital, supporting health and safety management among the hospital's different departments and allowing communication through the web with other hospitals.

  1. The development of an information system and installation of an Internet web database for the purposes of the occupational health and safety management system.

    PubMed

    Mavrikakis, I; Mantas, J; Diomidous, M

    2007-01-01

    This paper is based on research into a possible structure for an information system serving occupational health and safety management. We initiated a questionnaire in order to gauge potential users' interest in the subject of occupational health and safety. Depicting this potential interest is vital both for the software analysis cycle and for development according to previous models. Evaluation of the results tends to create pilot applications among different enterprises. Documentation and process improvements to ascertain quality of services, operational support, and occupational health and safety advice are the basics of these applications. Communication and codified information on health issues among interested parties is the other target of the survey. Computer networks can offer such services. The network will consist of nodes responsible for informing executives on occupational health and safety. A web database has been installed for inserting and searching documents. The submission of files to a server and the answering of questionnaires through the web help the experts perform their activities. Based on the requirements of enterprises, we have constructed a web file server. We submit files so that users can retrieve the files they need. Access is limited to authorized users, and digital watermarks authenticate and protect digital objects.

  2. Users' information-seeking behavior on a medical library Website

    PubMed Central

    Rozic-Hristovski, Anamarija; Hristovski, Dimitar; Todorovski, Ljupco

    2002-01-01

    The Central Medical Library (CMK) at the Faculty of Medicine, University of Ljubljana, Slovenia, started to build a library Website that included a guide to library services and resources in 1997. The evaluation of Website usage plays an important role in its maintenance and development. Analyzing and exploring regularities in the visitors' behavior can be used to enhance the quality and facilitate delivery of information services, identify visitors' interests, and improve the server's performance. The analysis of the CMK Website users' navigational behavior was carried out by analyzing the Web server log files. These files contained information on all user accesses to the Website and provided a great opportunity to learn more about the behavior of visitors to the Website. The majority of the available tools for Web log file analysis provide a predefined set of reports showing the access count and the transferred bytes grouped along several dimensions. In addition to the reports mentioned above, the authors wanted to be able to perform interactive exploration and ad hoc analysis and discover trends in a user-friendly way. Because of that, we developed our own solution for exploring and analyzing the Web logs based on data warehousing and online analytical processing technologies. The analytical solution we developed proved successful, so it may find further application in the field of Web log file analysis. We will apply the findings of the analysis to restructuring the CMK Website. PMID:11999179
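
    The raw material for such an analysis is the server access log; below is a minimal sketch of the parsing-and-grouping step in Python (the authors' actual solution used data warehousing and OLAP tools, and the log path and format details here are assumptions).

        import re
        from collections import Counter

        # Common log format: host ident user [time] "METHOD path HTTP/x.y" ...
        LINE = re.compile(r'(\S+) \S+ \S+ \[([^\]]+)\] "(?:GET|POST) (\S+)')

        hits = Counter()
        with open("access.log") as log:  # hypothetical log file
            for line in log:
                m = LINE.match(line)
                if m:
                    host, timestamp, page = m.groups()
                    day = timestamp.split(":")[0]  # e.g. 10/Oct/2000
                    hits[(page, day)] += 1

        for (page, day), n in hits.most_common(10):
            print(day, page, n)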

  3. Recent enhancements to the GRIDGEN structured grid generation system

    NASA Technical Reports Server (NTRS)

    Steinbrenner, John P.; Chawner, John R.

    1992-01-01

    Significant enhancements are being implemented in the GRIDGEN3D multiple-block structured grid generation software. Automatic, point-to-point, interblock connectivity will be possible through the addition of the domain entity to GRIDBLOCK's block construction process. Also, the unification of GRIDGEN2D and GRIDBLOCK has begun with the addition of an edge grid point distribution capability to GRIDBLOCK. The geometric accuracy of surface grids and the ease with which databases may be obtained are being improved by adding support for standard computer-aided design formats (e.g., PATRAN Neutral and IGES files). Finally, volume grid quality has been improved through the addition of new SOR algorithm features and the new hybrid control function type to GRIDGEN3D.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brewer, M; Gordon, C; Tien, C

    Purpose: To follow the Integrating Healthcare Enterprise - Radiation Oncology (IHE-RO) initiative of proper cross-vendor technology integration, an automated chart checker (ACC) was developed. ACC compares extracted data from an approved patient plan in the Eclipse treatment planning system (TPS) against data existing in the Mosaiq treatment management system (TMS). ACC automatically analyzes these parameters using built-in quality checklists to provide further aid in chart review. Methods: Eclipse TPS data are obtained using the Eclipse scripting API (ESAPI), while Mosaiq TMS data are obtained from a radiotherapy-treatment-planning (RTP) file. Using this information, ACC identifies TPS-TMS discrepancies in 18 primary beam parameters, including MU, energy, jaw positions, gantry angle, table angle, accessories, and bolus, for up to 31 beams. Next, approximately 40 items from traditional quality checklists are evaluated, such as prescription consistency, DRR graticule placement, plan approval status, global max dose, and dose tracking coefficients. Parameters were artificially modified to determine whether ACC would detect an error in data transfer and to test each component of the quality checklists. Results: Using ESAPI scripting and RTP file processing, ACC was able to properly aggregate data from the TPS and TMS for up to 31 beams. Errors were artificially introduced into each plan parameter, and ACC successfully detected all of them within seconds. Next, ACC successfully detected mistakes in the chart by identifying deviations with its quality checklists, also within seconds. Conclusion: ACC effectively addresses the potential issue of faulty cross-vendor data transfer, as described by IHE-RO. In addition, ACC was also able to detect deviations from its built-in quality checklists. ACC is already an invaluable tool for efficient and standardized chart review and will continue to improve as its incorporated checklists become more comprehensive.

  5. UAEMIAAE

    Atmospheric Science Data Center

    2013-12-19

    UAEMIAAE Aerosol product. (File version details) File version F07_0015 has better ... properties. File version F08_0016 has an improved cloud screening procedure resulting in better aerosol optical depth. ... Coverage: August - October 2004. File Format: HDF-EOS. Tools: FTP Access: Data Pool ...

  6. Neck pain in children: a retrospective case series.

    PubMed

    Cox, Jocelyn; Davidian, Christine; Mior, Silvano

    2016-09-01

    Spinal pain in the paediatric population is a significant health issue, with an increasing prevalence as children age. Paediatric patients attend for chiropractic care for spinal pain, yet there is a paucity of quality evidence to guide the practitioner with respect to appropriate care planning. A retrospective chart review was used to describe the chiropractic management of paediatric neck pain. Two researchers abstracted data from 50 clinical files that met the inclusion criteria from a general practice chiropractic office in the Greater Toronto Area, Canada. Data were entered into SPSS 15 and descriptively analyzed. Fifty paediatric neck pain patient files were analysed. Patients' ages ranged between 6 and 18 years (mean 13 years). Most (98%) were diagnosed with Grade I-II mechanical neck pain. Treatment frequency averaged 5 visits over 19 days, with spinal manipulative therapy used in 96% of patients. Significant improvement was recorded in 96% of the files. No adverse events were documented. Paediatric mechanical neck pain appears to be successfully managed by chiropractic care. Spinal manipulative therapy appears to benefit paediatric mechanical neck pain resulting from day-to-day activities, with no reported serious adverse events. Results can be used to inform clinical trials assessing the effectiveness of manual therapy in managing paediatric mechanical neck pain.

  7. Experiences and lessons learned from creating a generalized workflow for data publication of field campaign datasets

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S. K.; Ramachandran, R.; Deb, D.; Beaty, T.; Wright, D.

    2017-12-01

    This paper summarizes the workflow challenges of curating and publishing data produced from disparate data sources and provides a generalized workflow solution to efficiently archive data generated by researchers. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) for biogeochemical dynamics and the Global Hydrology Resource Center (GHRC) DAAC have been collaborating on the development of a generalized workflow solution to efficiently manage the data publication process. The generalized workflow presented here is built on lessons learned from implementations of the workflow system. Data publication consists of the following steps: accepting the data package from the data providers and ensuring the full integrity of the data files; identifying and addressing data quality issues; assembling standardized, detailed metadata and documentation, including file-level details, processing methodology, and characteristics of the data files; setting up data access mechanisms; setting up the data in data tools and services for improved data dissemination and user experience; registering the dataset in online search and discovery catalogues; and preserving the data location through Digital Object Identifiers (DOIs). We will describe the steps taken to automate and realize efficiencies in the above process. The goals of the workflow system are to reduce the time taken to publish a dataset, to increase the quality of documentation and metadata, and to track individual datasets through the data curation process. Utilities developed to achieve these goals will be described. We will also share the metrics-driven value of the workflow system and discuss future steps towards the creation of a common software framework.
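
    The first step, ensuring the full integrity of the data files, is commonly implemented as a checksum manifest; the sketch below is a generic stand-in for that step, not the DAACs' internal tooling.

        import hashlib
        import pathlib

        def manifest(directory):
            """Return {relative file path: sha256 digest} for a data package."""
            root = pathlib.Path(directory)
            return {str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
                    for p in sorted(root.rglob("*")) if p.is_file()}

        # Comparing the provider's manifest with one computed on arrival
        # pinpoints corrupted or missing files before publication proceeds.
        received = manifest("incoming_package")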

  8. MAAMD: a workflow to standardize meta-analyses and comparison of affymetrix microarray data

    PubMed Central

    2014-01-01

    Background Mandatory deposit of raw microarray data files for public access, prior to study publication, provides significant opportunities to conduct new bioinformatics analyses within and across multiple datasets. Analysis of raw microarray data files (e.g. Affymetrix CEL files) can be time consuming and complex, and requires fundamental computational and bioinformatics skills. The development of analytical workflows to automate these tasks simplifies the processing of, improves the efficiency of, and serves to standardize multiple and sequential analyses. Once installed, workflows facilitate the tedious steps required to run rapid intra- and inter-dataset comparisons. Results We developed a workflow to facilitate and standardize Meta-Analysis of Affymetrix Microarray Data analysis (MAAMD) in Kepler. Two freely available stand-alone software tools, R and AltAnalyze, were embedded in MAAMD. The inputs of MAAMD are user-editable csv files, which contain sample information and parameters describing the locations of input files and required tools. MAAMD was tested by analyzing 4 different GEO datasets from mice and drosophila. MAAMD automates data downloading, data organization, data quality control assessment, differential gene expression analysis, clustering analysis, pathway visualization, gene-set enrichment analysis, and cross-species orthologous-gene comparisons. MAAMD was utilized to identify gene orthologues responding to hypoxia or hyperoxia in both mice and drosophila. The entire set of analyses for the 4 datasets (34 total microarrays) finished in approximately one hour. Conclusions MAAMD saves time, minimizes the required computer skills, and offers a standardized procedure for users to analyze microarray datasets and make new intra- and inter-dataset comparisons. PMID:24621103

  9. IGS Data Flow

    NASA Technical Reports Server (NTRS)

    Noll, Carey

    2006-01-01

    The IGS analysis centers and user community in general need to be assured that the data centers archive a consistent set of files. Changes to the archives can occur because of the re-publishing of data, the transmission of historic data, and the resulting re-distribution (or lack thereof) of these data from data center to data center. To ensure the quality of the archives, a defined data flow and method of archive population needs to be established. This poster will diagram and review the current IGS data flow, discuss problems that have occurred, and provide recommendations for improvement.

  10. Interactive visualization tools for the structural biologist.

    PubMed

    Porebski, Benjamin T; Ho, Bosco K; Buckle, Ashley M

    2013-10-01

    In structural biology, management of a large number of Protein Data Bank (PDB) files and raw X-ray diffraction images often presents a major organizational problem. Existing software packages that manipulate these file types were not designed for these kinds of file-management tasks. This is typically encountered when browsing through a folder of hundreds of X-ray images, with the aim of rapidly inspecting the diffraction quality of a data set. To solve this problem, a useful functionality of the Macintosh operating system (OSX) has been exploited that allows custom visualization plugins to be attached to certain file types. Software plugins have been developed for diffraction images and PDB files, which in many scenarios can save considerable time and effort. The direct visualization of diffraction images and PDB structures in the file browser can be used to identify key files of interest simply by scrolling through a list of files.

  11. The Albuquerque Seismological Laboratory Data Quality Analyzer

    NASA Astrophysics Data System (ADS)

    Ringler, A. T.; Hagerty, M.; Holland, J.; Gee, L. S.; Wilson, D.

    2013-12-01

    The U.S. Geological Survey's Albuquerque Seismological Laboratory (ASL) has several efforts underway to improve data quality at its stations. The Data Quality Analyzer (DQA) is one such development. The DQA is designed to characterize station data quality in a quantitative and automated manner. Station quality is based on the evaluation of various metrics, such as timing quality, noise levels, sensor coherence, and so on. These metrics are aggregated into a measurable grade for each station. The DQA consists of a website, a metric calculator (Seedscan), and a PostgreSQL database. The website allows the user to make requests for various time periods, review specific networks and stations, adjust weighting of the station's grade, and plot metrics as a function of time. The website dynamically loads all station data from a PostgreSQL database. The database is central to the application; it acts as a hub where metric values and limited station descriptions are stored. Data is stored at the level of one sensor's channel per day. The database is populated by Seedscan. Seedscan reads and processes miniSEED data, to generate metric values. Seedscan, written in Java, compares hashes of metadata and data to detect changes and perform subsequent recalculations. This ensures that the metric values are up to date and accurate. Seedscan can be run in a scheduled task or on demand by way of a config file. It will compute metrics specified in its configuration file. While many metrics are currently in development, some are completed and being actively used. These include: availability, timing quality, gap count, deviation from the New Low Noise Model, deviation from a station's noise baseline, inter-sensor coherence, and data-synthetic fits. In all, 20 metrics are planned, but any number could be added. ASL is actively using the DQA on a daily basis for station diagnostics and evaluation. As Seedscan is scheduled to run every night, data quality analysts are able to then use the website to diagnose changes in noise levels or other anomalous data. This allows for errors to be corrected quickly and efficiently. The code is designed to be flexible for adding metrics and portable for use in other networks. We anticipate further development of the DQA by improving the existing web-interface, adding more metrics, adding an interface to facilitate the verification of historic station metadata and performance, and an interface to allow better monitoring of data quality goals.
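
    A gap-count metric of the kind listed above can be approximated in a few lines with ObsPy, a Python library for seismological data (Seedscan itself is a Java application, and the file name here is hypothetical).

        from obspy import read

        st = read("IU.ANMO.00.BHZ.mseed")  # hypothetical day file of miniSEED data
        gaps = st.get_gaps()               # one entry per gap or overlap in the stream
        print("gap count metric:", len(gaps))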

  12. Results of a Telephone Survey of Television Station Managers Concerning the NASA SCI Files(TM) and NASA CONNECT(TM)

    NASA Technical Reports Server (NTRS)

    Pinelli, Thomas E.; Perry, Jeannine

    2004-01-01

    A telephone survey of television station managers concerning 2 instructional television programs, the NASA SCI Files(TM) and NASA CONNECT(TM), offered by the NASA Langley Center for Distance Learning (CDL) was conducted. Using a 4-point scale, with 4 being very satisfied, survey participants reported that they were either very satisfied (77.1 percent) or satisfied (19.9 percent) with the overall (educational and technical) quality of the NASA SCI Files(TM). Using a 4-point scale, with 4 being very satisfied, survey participants reported that they were either very satisfied (77.9 percent) or satisfied (19.1 percent) with the overall (educational and technical) quality of NASA CONNECT(TM) .

  13. High-Performance, Multi-Node File Copies and Checksums for Clustered File Systems

    NASA Technical Reports Server (NTRS)

    Kolano, Paul Z.; Ciotti, Robert B.

    2012-01-01

    Modern parallel file systems achieve high performance using a variety of techniques, such as striping files across multiple disks to increase aggregate I/O bandwidth and spreading disks across multiple servers to increase aggregate interconnect bandwidth. To achieve peak performance from such systems, it is typically necessary to utilize multiple concurrent readers/writers from multiple systems to overcome various single-system limitations, such as number of processors and network bandwidth. The standard cp and md5sum tools of GNU coreutils found on every modern Unix/Linux system, however, utilize a single execution thread on a single CPU core of a single system, and hence cannot take full advantage of the increased performance of clustered file systems. Mcp and msum are drop-in replacements for the standard cp and md5sum programs that utilize multiple types of parallelism and other optimizations to achieve maximum copy and checksum performance on clustered file systems. Multi-threading is used to ensure that nodes are kept as busy as possible. Read/write parallelism allows individual operations of a single copy to be overlapped using asynchronous I/O. Multi-node cooperation allows different nodes to take part in the same copy/checksum. Split-file processing allows multiple threads to operate concurrently on the same file. Finally, hash trees allow inherently serial checksums to be performed in parallel. The total speed-ups from all improvements are significant: mcp improves cp performance over 27x, msum improves md5sum performance almost 19x, and the combination of mcp and msum improves verified copies via cp and md5sum by almost 22x. These improvements come in the form of drop-in replacements for cp and md5sum, so they are easily used and are available for download as open source software at http://mutil.sourceforge.net.
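
    Split-file parallelism, one of the techniques named above, can be sketched as several threads copying disjoint byte ranges of the same file (mcp itself is a far more optimized C program; this POSIX-only Python sketch is only illustrative).

        import os
        from concurrent.futures import ThreadPoolExecutor

        def copy_range(src_fd, dst_fd, offset, length, chunk=4 << 20):
            # Copy one byte range using positioned reads/writes (POSIX only).
            done = 0
            while done < length:
                buf = os.pread(src_fd, min(chunk, length - done), offset + done)
                os.pwrite(dst_fd, buf, offset + done)
                done += len(buf)

        def parallel_copy(src, dst, threads=4):
            size = os.path.getsize(src)
            sfd = os.open(src, os.O_RDONLY)
            dfd = os.open(dst, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o644)
            os.ftruncate(dfd, size)
            part = (size + threads - 1) // threads  # bytes per thread
            with ThreadPoolExecutor(threads) as pool:
                for i in range(threads):
                    off = i * part
                    pool.submit(copy_range, sfd, dfd, off, min(part, size - off))
            os.close(sfd)
            os.close(dfd)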

  14. QRev—Software for computation and quality assurance of acoustic doppler current profiler moving-boat streamflow measurements—Technical manual for version 2.8

    USGS Publications Warehouse

    Mueller, David S.

    2016-06-21

    The software program QRev applies common and consistent computational algorithms, combined with automated filtering and quality assessment of the data, to improve the quality and efficiency of streamflow measurements, and it helps ensure that U.S. Geological Survey streamflow measurements are consistent, accurate, and independent of the manufacturer of the instrument used to make the measurement. Software from different manufacturers uses different algorithms for various aspects of the data processing and discharge computation. The algorithms used by QRev to filter data, interpolate data, and compute discharge are documented and compared to the algorithms used in the manufacturers’ software. QRev applies consistent algorithms and creates a data structure that is independent of the data source. QRev saves an extensible markup language (XML) file that can be imported into databases or electronic field notes software. This report is the technical manual for version 2.8 of QRev.

  15. 76 FR 72452 - International Mail Contract

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-23

    ... Excel file. In Order No. 549, the Commission approved the Inbound Market Dominant Multi-Service... 2012 Agreement is provided in the Excel file included with its filing. It contends that improvements...

  16. 76 FR 76201 - International Mail Contract

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-06

    ... Agreement and supporting financial documentation as a separate Excel file. In Order No. 549, the Commission... the Excel file included with its filing. It contends that improvements should enhance mail efficiency...

  17. 75 FR 50021 - Self-Regulatory Organizations; Notice of Filing of Proposed Rule Change by NASDAQ OMX PHLX, Inc...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-16

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-62678; File No. SR-Phlx-2010-108] Self-Regulatory Organizations; Notice of Filing of Proposed Rule Change by NASDAQ OMX PHLX, Inc. Relating to a Proposed Price Improvement System, Price Improvement XL (PIXL\\SM\\) August 10, 2010. Pursuant to Section 19(b)(1) of the Securities Exchange Act of 1934...

  18. Improving File System Performance by Striping

    NASA Technical Reports Server (NTRS)

    Lam, Terance L.; Kutler, Paul (Technical Monitor)

    1998-01-01

    This document discusses the performance and advantages of striped file systems on the SGI AD workstations. The performance of several striped file system configurations is compared, and guidelines for optimal striping are recommended.

  19. MISR Level 3 Radiance Versioning

    Atmospheric Science Data Center

    2016-11-04

    ... ESDT Product File Name Prefix Current Quality Designations MIL3DRD, MIL3MRD, MIL3QRD, and MIL3YRD ... Data Product Specification Rev K  (PDF). Update to work with new format of the input PGE 1 files.   F02_0007 ...

  20. 48 CFR 1536.201 - Evaluation of contracting performance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... performance. 1536.201 Section 1536.201 Federal Acquisition Regulations System ENVIRONMENTAL PROTECTION AGENCY... Contracting for Construction 1536.201 Evaluation of contracting performance. (a) The Contracting Officer will... will file the form in the contractor performance evaluation files which it maintains. (e) The Quality...

  1. High-quality and small-capacity e-learning video featuring lecturer-superimposing PC screen images

    NASA Astrophysics Data System (ADS)

    Nomura, Yoshihiko; Murakami, Michinobu; Sakamoto, Ryota; Sugiura, Tokuhiro; Matsui, Hirokazu; Kato, Norihiko

    2006-10-01

    Information processing and communication technologies are progressing quickly and are prevailing throughout various technological fields. The development of such technologies should therefore respond to the need to improve quality in e-learning education systems. The authors propose a new video-image compression processing system that ingeniously exploits the features of the lecturing scene. While the dynamic lecturing scene is shot by a digital video camera, screen images are stored electronically by PC screen-capture software at relatively long intervals during an actual class. Then, the lecturer and a lecture stick are extracted from the digital video images by pattern recognition techniques, and the extracted images are superimposed on the appropriate PC screen images by off-line processing. Thus, we have succeeded in creating high-quality, small-capacity (HQ/SC) video-on-demand educational content featuring high image sharpness, small electronic file size, and realistic lecturer motion.

  2. Utilization Analysis in Low-Temperature Geothermal Play Fairway Analysis for the Appalachian Basin (GPFA-AB)

    DOE Data Explorer

    Jordan, Teresa E.

    2015-09-30

    This submission of Utilization Analysis data to the Geothermal Data Repository (GDR) node of the National Geothermal Data System (NGDS) is in support of Phase 1 Low Temperature Geothermal Play Fairway Analysis for the Appalachian Basin (project DE-EE0006726). The submission includes data pertinent to the methods and results of an analysis of the Surface Levelized Cost of Heat (SLCOH) for US Census Bureau Places within the study area. This was calculated using a modification of a program called GEOPHIRES, available at http://koenraadbeckers.net/geophires/index.php. The MATLAB modules used in conjunction with GEOPHIRES, the MATLAB data input file, the GEOPHIRES output data file, and an explanation of the software components have been provided. Results of the SLCOH analysis appear on 4 .png image files as mapped risk of heat utilization. For each of the 4 image (.png) files, there is an accompanying georeferenced TIF (.tif) file by the same name. In addition to calculating SLCOH, this Task 4 also identified many sites that may be prospects for use of a geothermal district heating system, based on their size and industry, rather than on the SLCOH. An industry sorted listing of the sites (.xlsx) and a map of these sites plotted as a layer onto different iterations of maps combining the three geological risk factors (Thermal Quality, Natural Reservoir Quality, and Risk of Seismicity) has been provided. In addition to the 6 image (.png) files of the maps in this series, a shape (.shp) file and 7 associated files are included as well. Finally, supporting files (.pdf) describing the utilization analysis methodology and summarizing the anticipated permitting for a deep district heating system are supplied. UPDATE: Newer version of the Utilization Analysis has been added here: https://gdr.openei.org/submissions/878
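
    For orientation, a surface levelized cost of heat is a ratio of discounted lifetime cost to discounted lifetime heat delivered; the sketch below shows that generic formula, not GEOPHIRES's exact formulation, and all input values are illustrative.

        def levelized_cost_of_heat(capital, om_per_year, heat_per_year, rate, years):
            """Generic LCOH: discounted lifetime cost / discounted lifetime heat."""
            disc_cost = capital + sum(om_per_year / (1 + rate) ** t
                                      for t in range(1, years + 1))
            disc_heat = sum(heat_per_year / (1 + rate) ** t
                            for t in range(1, years + 1))
            return disc_cost / disc_heat

        # e.g. $3M capital, $120k/yr O&M, 40,000 MMBTU/yr, 7% discount, 25 years
        print(levelized_cost_of_heat(3.0e6, 1.2e5, 4.0e4, 0.07, 25))  # $/MMBTU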

  3. GPFA-AB_Phase1UtilizationTask4DataUpload

    DOE Data Explorer

    Teresa E. Jordan

    2015-09-30

    This submission of Utilization Analysis data to the Geothermal Data Repository (GDR) node of the National Geothermal Data System (NGDS) is in support of Phase 1 Low Temperature Geothermal Play Fairway Analysis for the Appalachian Basin (project DE-EE0006726). The submission includes data pertinent to the methods and results of an analysis of the Surface Levelized Cost of Heat (SLCOH) for US Census Bureau ‘Places’ within the study area. This was calculated using a modification of a program called GEOPHIRES, available at http://koenraadbeckers.net/geophires/index.php. The MATLAB modules used in conjunction with GEOPHIRES, the MATLAB data input file, the GEOPHIRES output data file, and an explanation of the software components have been provided. Results of the SLCOH analysis appear on 4 .png image files as mapped ‘risk’ of heat utilization. For each of the 4 image (.png) files, there is an accompanying georeferenced TIF (.tif) file by the same name. In addition to calculating SLCOH, this Task 4 also identified many sites that may be prospects for use of a geothermal district heating system, based on their size and industry, rather than on the SLCOH. An industry sorted listing of the sites (.xlsx) and a map of these sites plotted as a layer onto different iterations of maps combining the three geological risk factors (Thermal Quality, Natural Reservoir Quality, and Risk of Seismicity) has been provided. In addition to the 6 image (.png) files of the maps in this series, a shape (.shp) file and 7 associated files are included as well. Finally, supporting files (.pdf) describing the utilization analysis methodology and summarizing the anticipated permitting for a deep district heating system are supplied.

  4. Xpatch prediction improvements to support multiple ATR applications

    NASA Astrophysics Data System (ADS)

    Andersh, Dennis J.; Lee, Shung W.; Moore, John T.; Sullivan, Douglas P.; Hughes, Jeff A.; Ling, Hao

    1998-08-01

    This paper describes an electromagnetic computer prediction code for generating radar cross section (RCS), time-domain signatures, and synthetic aperture radar (SAR) images of realistic 3D vehicles. The vehicle, typically an airplane or a ground vehicle, is represented by a computer-aided design (CAD) file with triangular facets, IGES curved surfaces, or solid geometries. The computer code, Xpatch, based on the shooting-and-bouncing-ray technique, is used to calculate the polarimetric radar return from the vehicles represented by these different CAD files. Xpatch computes the first-bounce physical optics (PO) plus physical theory of diffraction (PTD) contributions, and calculates the multi-bounce ray contributions by using geometric optics and PO for complex vehicles with materials. It has been found that without the multi-bounce calculations, the radar return is typically 10 to 15 dB too low. Examples of predicted range profiles, SAR imagery, and RCS for several different geometries are compared with measured data to demonstrate the quality of the predictions. Recent enhancements to Xpatch include improvements for millimeter-wave applications, hybridization with the finite element method for small geometric features, and the augmentation of additional IGES entities to support trimmed and untrimmed surfaces.

  5. 47 CFR 73.871 - Amendment of LPFM broadcast station applications.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... limitation during the pertinent filing window. (b) Amendments that would improve the comparative position of new and major change applications will not be accepted after the close of the pertinent filing window... the pertinent filing window. Subject to the provisions of this section, such amendments may be filed...

  6. MISR Level 1A CCD, 1B1, 1B2, and Browse Products

    Atmospheric Science Data Center

    2013-04-01

    ... ESDT Product File Name Prefix Current Quality Designations MI1B2E MISR_AM1_GRP_ELLIPSOID_GM, ... should be used. All calibration files for the life of the mission have been reprocessed using the best available calibration. ...

  7. Performance Data Gathering and Representation from Fixed-Size Statistical Data

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.; Jin, Haoqiang H.; Schmidt, Melisa A.; Kutler, Paul (Technical Monitor)

    1997-01-01

    The two commonly used performance data types in the supercomputing community, statistics and event traces, are discussed and compared. Statistical data are much more compact but lack the probative power event traces offer. Event traces, on the other hand, are unbounded and can easily fill up the entire file system during program execution. In this paper, we propose an innovative methodology for performance data gathering and representation that offers a middle ground. Two basic ideas are employed: the use of averages to replace recording data for each instance, and 'formulae' to represent sequences associated with communication and control flow. The user can trade off tracing overhead and trace data size against data quality incrementally. In other words, the user will be able to limit the amount of trace data collected and, at the same time, carry out some of the analysis event traces offer using space-time views. With the help of a few simple examples, we illustrate the use of these techniques in performance tuning and compare the quality of the traces we collected with event traces. We found that the trace files thus obtained are, indeed, small, bounded, and predictable before program execution, and that the quality of the space-time views generated from these statistical data is excellent. Furthermore, experimental results showed that the formulae proposed were able to capture all the sequences associated with 11 of the 15 applications tested. The performance of the formulae can be incrementally improved by allocating more memory at runtime to learn longer sequences.
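
    The two ideas, averages in place of per-instance records and 'formulae' for repeated sequences, can be caricatured in a few lines of Python; the run-length encoding below is a deliberately simplified stand-in for the paper's richer sequence formulae.

        from itertools import groupby

        class AvgStat:
            """Replace per-instance timing records with a count and a running mean."""
            def __init__(self):
                self.n, self.mean = 0, 0.0
            def add(self, x):
                self.n += 1
                self.mean += (x - self.mean) / self.n

        def formula(events):
            """Compress a repetitive event sequence into (event, repeat count) pairs."""
            return [(e, sum(1 for _ in g)) for e, g in groupby(events)]

        # formula(['send', 'send', 'recv', 'recv', 'recv'])
        # -> [('send', 2), ('recv', 3)]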

  8. Patient-centeredness and quality management in Dutch diabetes care organizations after a 1-year intervention

    PubMed Central

    Campmans-Kuijpers, Marjo JE; Lemmens, Lidwien C; Baan, Caroline A; Rutten, Guy EHM

    2016-01-01

    Background More focus on patient-centeredness in care for patients with type 2 diabetes requires increasing attention to diabetes quality management processes on patient-centeredness by managers in primary care groups and outpatient clinics. Although patient-centered care is ultimately determined by the quality of interactions between patients and clinicians at the practice level, it should be facilitated at the organizational level too. This nationwide study aimed to assess the state of diabetes quality management on patient-centeredness at the organizational level and its possibilities to improve after a tailored intervention. Methods This before–after study compares the quality management on patient-centeredness within Dutch diabetes care groups and outpatient clinics before and after a 1-year stepwise intervention. At baseline, managers of 51 diabetes primary care groups and 28 outpatient diabetes clinics completed a questionnaire about the organization’s quality management program. Patient-centeredness (0%–100%) was operationalized in six subdomains: facilitating self-management support, individualized care plan support, patients’ access to medical files, patient education policy, safeguarding patients’ interests, and formal patient involvement. The intervention consisted of feedback and benchmarking and, if requested, a telephone call and/or a consultancy visit. After 1 year, the managers completed the questionnaire again. The 1-year changes were examined by dependent (non)parametric tests. Results Care groups improved significantly on patient-centeredness (from 47.1% to 53.3%; P=0.002) and on its subdomains “access to medical files” (from 42.0% to 49.4%) and “safeguarding patients’ interests” (from 58.1% to 66.2%). Outpatient clinics, which scored higher at baseline (66.7%) than care groups, did not improve on patient-centeredness (65.6%; P=0.54) or its subdomains. “Formal patient involvement” remained low in both care groups (23.2%) and outpatient clinics (33.9%). Conclusion After a simple intervention, care groups significantly improved their quality management on patient-centeredness, but outpatient clinics did not. Interventions to improve quality management on patient-centeredness in diabetes care organizations should differ between primary and secondary care. PMID:27784994

  9. Risk Factor Analysis in Low-Temperature Geothermal Play Fairway Analysis for the Appalachian Basin (GPFA-AB)

    DOE Data Explorer

    Teresa E. Jordan

    2015-09-30

    This submission contains information used to compute the risk factors for the GPFA-AB project (DE-EE0006726). The risk factors are natural reservoir quality, thermal resource quality, potential for induced seismicity, and utilization. The methods used to combine the risk factors included taking the product, sum, and minimum of the four risk factors. The files are divided into images, rasters, shapefiles, and supporting information. The image files show what the raster and shapefiles should look like. The raster files contain the input risk factors, calculation of the scaled risk factors, and calculation of the combined risk factors. The shapefiles include definition of the fairways, definition of the US Census Places, the center of the raster cells, and locations of industries. Supporting information contains details of the calculations or processing used in generating the files. An image of a raster has the same name as the raster file, except with *.png as the file ending instead of *.tif. Images with “fairways” or “industries” added to the name are composed of a raster with the relevant shapefile added. The file About_GPFA-AB_Phase1RiskAnalysisTask5DataUpload.pdf contains information on the citation, special use considerations, authorship, etc. More details on each file are given in the spreadsheet “list_of_contents.csv” in the folder “SupportingInfo”. Code used to calculate values is available at https://github.com/calvinwhealton/geothermal_pfa under the folder “combining_metrics”.
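
    A minimal numpy sketch of the three combination rules named above, applied to already-scaled risk-factor rasters; the project's actual code lives in the linked geothermal_pfa repository, so this is only an illustration.

        import numpy as np

        def combine(reservoir, thermal, seismicity, utilization, how="product"):
            """Combine four scaled risk-factor rasters into one raster."""
            stack = np.stack([reservoir, thermal, seismicity, utilization])
            if how == "product":
                return stack.prod(axis=0)
            if how == "sum":
                return stack.sum(axis=0)
            if how == "minimum":
                return stack.min(axis=0)
            raise ValueError(f"unknown combination rule: {how}")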

  10. GPFA-AB_Phase1RiskAnalysisTask5DataUpload

    DOE Data Explorer

    Teresa E. Jordan

    2015-09-30

    This submission contains information used to compute the risk factors for the GPFA-AB project (DE-EE0006726). The risk factors are natural reservoir quality, thermal resource quality, potential for induced seismicity, and utilization. The methods used to combine the risk factors included taking the product, sum, and minimum of the four risk factors. The files are divided into images, rasters, shapefiles, and supporting information. The image files show what the raster and shapefiles should look like. The raster files contain the input risk factors, calculation of the scaled risk factors, and calculation of the combined risk factors. The shapefiles include definition of the fairways, definition of the US Census Places, the center of the raster cells, and locations of industries. Supporting information contains details of the calculations or processing used in generating the files. An image of a raster has the same name as the raster file, except with *.png as the file ending instead of *.tif. Images with “fairways” or “industries” added to the name are composed of a raster with the relevant shapefile added. The file About_GPFA-AB_Phase1RiskAnalysisTask5DataUpload.pdf contains information on the citation, special use considerations, authorship, etc. More details on each file are given in the spreadsheet “list_of_contents.csv” in the folder “SupportingInfo”. Code used to calculate values is available at https://github.com/calvinwhealton/geothermal_pfa under the folder “combining_metrics”.

  11. Assessment of the reliability of data collected for the Department of Veterans Affairs national surgical quality improvement program.

    PubMed

    Davis, Chester L; Pierce, John R; Henderson, William; Spencer, C David; Tyler, Christine; Langberg, Robert; Swafford, Jennan; Felan, Gladys S; Kearns, Martha A; Booker, Brigitte

    2007-04-01

    The Office of the Medical Inspector of the Department of Veterans Affairs (VA) studied the reliability of data collected by the VA's National Surgical Quality Improvement Program (NSQIP). The study focused on case selection bias, accuracy of reports on patients who died, and interrater reliability measurements of patient risk variables and outcomes. Surgical data from a sample of 15 VA medical centers were analyzed. For case selection bias, reviewers applied NSQIP criteria to include or exclude 2,460 patients from the database, comparing their results with those of NSQIP staff. For accurate reporting of patients who died, reviewers compared Social Security numbers of 10,444 NSQIP records with those found in the VA Beneficiary Identification and Records Locator Subsystem, VA Patient Treatment Files, and Social Security Administration death files. For measurement of interrater reliability, reviewers reabstracted 59 variables in each of 550 patient medical records that also were recorded in the NSQIP database. On case selection bias, the reviewers agreed with NSQIP decisions on 2,418 (98%) of the 2,460 cases. Computer record matching identified 4 more deaths than the NSQIP total of 198, a difference of about 2%. For 52 of the categorical variables, agreement, uncorrected for chance, was 96%. For 48 of 52 categorical variables, kappas ranged from 0.61 to 1.0 (substantial to almost perfect agreement); none of the variables had kappas of less than 0.20 (slight to poor agreement). This sample of medical centers shows adherence to criteria in selecting cases for the NSQIP database, for reporting deaths, and for collecting patient risk variables.
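
    For reference, the kappa statistic used above to quantify interrater agreement on a categorical variable can be computed as follows (a generic sketch, not the study's own statistical code).

        from collections import Counter

        def cohen_kappa(rater_a, rater_b):
            """Chance-corrected agreement between two raters: (po - pe) / (1 - pe)."""
            n = len(rater_a)
            observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
            freq_a, freq_b = Counter(rater_a), Counter(rater_b)
            expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
            return (observed - expected) / (1 - expected)

        print(cohen_kappa(["yes", "no", "yes", "yes"],
                          ["yes", "no", "no", "yes"]))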

  12. Storing files in a parallel computing system using list-based index to identify replica files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy

    Improved techniques are provided for storing files in a parallel computing system using a list-based index to identify file replicas. A file and at least one replica of the file are stored in one or more storage nodes of the parallel computing system. An index for the file comprises at least one list comprising a pointer to a storage location of the file and a storage location of the at least one replica of the file. The file comprises one or more of a complete file and one or more sub-files. The index may also comprise a checksum value for one or more of the file and the replica(s) of the file. The checksum value can be evaluated to validate the file and/or the file replica(s). A query can be processed using the list.
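
    The idea reduces to a simple data structure: a list of storage locations for a file and its replicas, plus a checksum used to validate whichever copy is read. The names in this hedged Python sketch are illustrative, not the patented implementation.

        import hashlib
        from dataclasses import dataclass

        @dataclass
        class FileIndex:
            locations: list   # [primary path, replica path, ...] on storage nodes
            checksum: str = ""

            def record(self, data: bytes):
                # Store a checksum when the file and its replicas are written.
                self.checksum = hashlib.sha256(data).hexdigest()

            def validate(self, data: bytes) -> bool:
                # Evaluate the checksum to validate the file or a replica.
                return hashlib.sha256(data).hexdigest() == self.checksum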

  13. Design of housing file box of fire academy based on RFID

    NASA Astrophysics Data System (ADS)

    Li, Huaiyi

    2018-04-01

    This paper presents a design scheme of intelligent file box based on RFID. The advantages of RFID file box and traditional file box are compared and analyzed, and the feasibility of RFID file box design is analyzed based on the actual situation of our university. After introducing the shape and structure design of the intelligent file box, the paper discusses the working process of the file box, and explains in detail the internal communication principle of the RFID file box and the realization of the control system. The application of the RFID based file box will greatly improve the efficiency of our school's archives management.

  14. A design for a new catalog manager and associated file management for the Land Analysis System (LAS)

    NASA Technical Reports Server (NTRS)

    Greenhagen, Cheryl

    1986-01-01

    Due to the large number of different types of files used in an image processing system, a mechanism for file management beyond the bounds of typical operating systems is necessary. The Transportable Applications Executive (TAE) Catalog Manager was written to meet this need. Land Analysis System (LAS) users at the EROS Data Center (EDC) encountered some problems in using the TAE catalog manager, including catalog corruption, networking difficulties, and the lack of a reliable tape storage and retrieval capability. These problems, coupled with the complexity of the TAE catalog manager, led to the decision to design a new file management system for LAS, tailored to the needs of the EDC user community. This design effort, which addressed catalog management, label services, associated data management, and enhancements to LAS applications, is described. The new file management design will provide many benefits, including improved system integration, increased flexibility, enhanced reliability, enhanced portability, improved performance, and improved maintainability.

  15. [Primary quality control in Israel Air Force clinics].

    PubMed

    Gilutz, H; Shamis, A; Ben-Amitay, D; Burger, A; Caine, Y G

    1994-05-15

    The practice of primary medicine within a military framework differs from that in the civilian environment in its accessibility, its consumers, the obligations of the providers, the involvement of the funder (the commanders), and the ability to define and enforce professional guidelines. These differences influence the scope of medical service and affect the methods and results of quality control. A system of quality control evaluation and feedback for military primary care in 16 Israel Air Force clinics was carried out by a team of experienced physicians using peer group review and a specially prepared protocol. Emphasis was placed on medical record assessment using obligatory markers of adequate medical evaluation and treatment. Identification of the population at risk, further medical training, and medical administration with a direct effect on the quality of medical treatment were also evaluated. Two quality control surveys with feedback were carried out 6 months apart. The overall mean score was 81.66 +/- 7.16% at the first evaluation, increasing to 88.60 +/- 7.46% at the second (p < 0.01). The greatest improvements were in follow-up of the population at risk (increasing from 68.4% to 86.4%, p < 0.025), training of medical teams (from 75.7% to 87.5%, p < 0.05), and patient case management (from 79.4% to 85.1%, N.S.). Categories in which there was no improvement were medical records, recovery of old medical files, and patient education. The categories in which there was improvement had a common denominator: "recognition of importance" and "provision of patterns" by headquarters. The quality control system was designed for routine use, and not as a research project. (ABSTRACT TRUNCATED AT 250 WORDS)

  16. The Added Value of Log File Analyses of the Use of a Personal Health Record for Patients With Type 2 Diabetes Mellitus

    PubMed Central

    Kelders, Saskia M.; Braakman-Jansen, Louise M. A.; van Gemert-Pijnen, Julia E. W. C.

    2014-01-01

    The electronic personal health record (PHR) is a promising technology for improving the quality of chronic disease management. Until now, evaluations of such systems have provided little insight into why a particular outcome occurred. The aim of this study is to gain insight into the navigation process (what functionalities are used, and in what sequence) of e-Vita, a PHR for patients with type 2 diabetes mellitus (T2DM), to increase the efficiency of the system and improve long-term adherence. Log data of the first visits in the first 6 weeks after the release of a renewed version of e-Vita were analyzed to identify the usage patterns that emerge when users explore a new application. After receiving the invitation, 28% of all registered users visited e-Vita. In total, 70 unique usage patterns could be identified. When users visited the education service first, 93% of all users ended their session. Most users visited either 1 or 5 or more services during their first session, but the distribution of the routes was diffuse. In conclusion, log file analyses can provide valuable prompts for improving the system design of a PHR. In this way, the match between the system and its users and the long-term adherence have the potential to increase. PMID:24876574

  17. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool.

    PubMed

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    2017-06-09

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data. FQC is implemented in Python 3 and JavaScript, and is maintained under an MIT license. Documentation and source code are available at: https://github.com/pnnl/fqc . joseph.brown@pnnl.gov. © The Author(s) 2017. Published by Oxford University Press.

  18. Using Clustering Strategies for Creating Authority Files.

    ERIC Educational Resources Information Center

    French, James C.; Powell, Allison L.; Schulman, Eric

    2000-01-01

    Discussion of quality control of data in online bibliographic databases focuses on authority files. Describes approximate string matching, introduces the concept of approximate word matching and clustering, and presents a case study using the Astrophysics Data System (ADS) that shows how to reduce human effort involved in authority work. (LRW)
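
    Approximate word matching of the kind described can be sketched with the standard library's difflib; the 0.9 similarity threshold is an assumption, not the paper's value.

        import difflib

        def cluster(names, threshold=0.9):
            """Greedily group name variants whose similarity exceeds the threshold."""
            clusters = []
            for name in names:
                for c in clusters:
                    ratio = difflib.SequenceMatcher(None, name.lower(),
                                                    c[0].lower()).ratio()
                    if ratio >= threshold:
                        c.append(name)
                        break
                else:
                    clusters.append([name])
            return clusters

        print(cluster(["Astrophys. J.", "Astrophys J", "Nature"]))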

  19. An efficient approach for video information retrieval

    NASA Astrophysics Data System (ADS)

    Dong, Daoguo; Xue, Xiangyang

    2005-01-01

    Today, more and more video information can be accessed through the internet, satellite, etc. Retrieving specific video information from a large-scale video database has become an important and challenging research topic in the area of multimedia information retrieval. In this paper, we introduce a new and efficient index structure, OVA-File, which is a variant of the VA-File. In OVA-File, approximations close to each other in data space are stored in close positions in the approximation file. The benefit is that only a part of the approximations close to the query vector needs to be visited to obtain the query result. Both a shot query algorithm and a video clip query algorithm are proposed to support video information retrieval efficiently. The experimental results showed that queries based on OVA-File were much faster than those based on VA-File, with a small loss of result quality.

  20. 77 FR 37751 - Representation Proceedings, Unfair Labor Practice Proceedings, and Miscellaneous and General...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-25

    ... initiative to make electronic filing, or ``eFiling,'' available to parties in all cases before the FLRA. Making eFiling available to its parties is another way in which the FLRA is using technology to improve... email: [email protected] . SUPPLEMENTARY INFORMATION: In the first stage of its eFiling initiative...

  1. Fast large-scale object retrieval with binary quantization

    NASA Astrophysics Data System (ADS)

    Zhou, Shifu; Zeng, Dan; Shen, Wei; Zhang, Zhijiang; Tian, Qi

    2015-11-01

    The objective of large-scale object retrieval systems is to search for images that contain the target object in an image database. Whereas state-of-the-art approaches rely on global image representations to conduct searches, we consider many boxes per image as candidates to search locally within a picture. In this paper, a feature quantization algorithm called binary quantization is proposed. In binary quantization, a scale-invariant feature transform (SIFT) feature is quantized into a descriptive and discriminative bit-vector, which adapts it to the classic inverted file structure for box indexing. The inverted file, which stores the bit-vector and the ID of the box in which the SIFT feature is located, is compact and can be loaded into main memory for efficient box indexing. We evaluate our approach on available object retrieval datasets. Experimental results demonstrate that the proposed approach is fast and achieves excellent search quality. Therefore, the proposed approach is an improvement over state-of-the-art approaches for object retrieval.
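
    The indexing idea can be sketched as follows: threshold each descriptor into a bit-vector and use it as an inverted-file key (real binary quantization of SIFT features is learned from data; the per-dimension median split here is an assumption).

        import numpy as np
        from collections import defaultdict

        def bitvector(desc, medians):
            """Quantize a descriptor into a bit-vector usable as a dictionary key."""
            return (desc > medians).astype(np.uint8).tobytes()

        inverted = defaultdict(list)  # bit-vector key -> [(image_id, box_id), ...]

        def index_box(desc, medians, image_id, box_id):
            inverted[bitvector(desc, medians)].append((image_id, box_id))

        def query(desc, medians):
            # Exact-key lookup; a full system would also rank nearby keys
            # by Hamming distance.
            return inverted.get(bitvector(desc, medians), [])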

  2. Spatial Allocator for air quality modeling

    EPA Pesticide Factsheets

    The Spatial Allocator is a set of tools that helps users manipulate and generate data files related to emissions and air quality modeling without requiring the use of a commercial Geographic Information System.

  3. Digital processing of radiographic images from PACS to publishing.

    PubMed

    Christian, M E; Davidson, H C; Wiggins, R H; Berges, G; Cannon, G; Jackson, G; Chapman, B; Harnsberger, H R

    2001-03-01

    Several studies have addressed the implications of filmless radiologic imaging for telemedicine, diagnostic ability, and electronic teaching files. However, many publishers still require authors to submit hard-copy images for publication of articles and textbooks. This study compares the quality of digital images directly exported from picture archiving and communication systems (PACS) with that of images digitized from radiographic film. The authors evaluated the quality of publication-grade glossy photographs produced from digital radiographic images using 3 different methods: (1) film images digitized using a desktop scanner and then printed, (2) digital images obtained directly from PACS and then printed, and (3) digital images obtained from PACS and processed to improve sharpness prior to printing. Twenty images were printed using each of the 3 methods and rated for quality by 7 radiologists. The results were analyzed for statistically significant differences among the image sets. Subjective evaluations found the filmless images to be of equal or better quality than the digitized images. Direct electronic transfer of PACS images reduces the number of steps involved in creating publication-quality images and provides the means to produce high-quality radiographic images in a digital environment.

  4. Does the American College of Surgeons National Surgical Quality Improvement Program pediatric provide actionable quality improvement data for surgical neonates?

    PubMed

    Bucher, Brian T; Duggan, Eileen M; Grubb, Peter H; France, Daniel J; Lally, Kevin P; Blakely, Martin L

    2016-09-01

    The purpose of this project was to examine the American College of Surgeons National Surgical Quality Improvement Program Pediatric (ACS NSQIP-P) Participant Use File (PUF) to compare risk-adjusted outcomes of neonates versus other pediatric surgical patients. In the ACS NSQIP-P 2012-2013 PUF, patients were classified as preterm neonates, term neonates, or nonneonates at the time of surgery. The primary outcomes were 30-day mortality and composite morbidity. Patient characteristics significantly associated with the primary outcomes were used to build a multivariate logistic regression model. The overall 30-day mortality rate for preterm neonates, term neonates, and nonneonates was 4.9%, 2.0%, and 0.1%, respectively (p<0.0001). The overall 30-day morbidity rate for preterm neonates, term neonates, and nonneonates was 27.0%, 17.4%, and 6.4%, respectively (p<0.0001). After adjustment for preoperative and operative risk factors, both preterm neonates (adjusted odds ratio [aOR] 2.0, 95% CI 1.4-3.0) and term neonates (aOR 1.9, 95% CI 1.2-3.1) had significantly increased odds of 30-day mortality compared with nonneonates. Surgical neonates are a cohort particularly susceptible to postoperative morbidity and mortality, even after adjusting for preoperative and operative risk factors. Collaborative efforts focusing on surgical neonates are needed to understand the unique characteristics of this cohort and to identify areas where morbidity and mortality can be improved. Copyright © 2016 Elsevier Inc. All rights reserved.
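
    For readers unfamiliar with how such adjusted odds ratios are produced, the following hedged Python sketch fits a multivariate logistic regression on synthetic stand-in data and reports aORs with 95% confidence intervals. The variables and coefficients are invented for illustration; this is not the PUF data or the authors' model.

      import numpy as np
      import statsmodels.api as sm

      # Synthetic stand-in records: neonatal status indicators plus one
      # continuous preoperative risk factor.
      rng = np.random.default_rng(0)
      n = 5000
      preterm = rng.binomial(1, 0.05, n)
      term = rng.binomial(1, 0.05, n) * (1 - preterm)   # mutually exclusive
      risk = rng.normal(size=n)
      logit = -5 + 0.7 * preterm + 0.6 * term + 0.8 * risk
      died = rng.binomial(1, 1 / (1 + np.exp(-logit)))

      X = sm.add_constant(np.column_stack([preterm, term, risk]))
      fit = sm.Logit(died, X).fit(disp=False)

      # Adjusted odds ratios with 95% confidence intervals.
      for name, coef, (lo, hi) in zip(
          ["intercept", "preterm", "term", "risk"], fit.params, fit.conf_int()
      ):
          print(f"{name}: aOR={np.exp(coef):.2f} "
                f"(95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f})")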

  5. Documentation to the NCES Common Core of Data Local Education Agency Universe Survey: School Year 2009-10, Version Provisional 2a. NCES 2011-349rev

    ERIC Educational Resources Information Center

    Keaton, Patrick; Sable, Jennifer; Liu, Fei

    2012-01-01

    This revised data file includes corrections that were provided to NCES as a result of a special collection effort designed to address data quality issues found in the 1a release of this file. In May 2012, NCES became aware of data errors for key data items for several schools on the published version of the SY 2009-10 school file; in some cases…

  6. 24 CFR 985.2 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... of the PHA's quality control sample is as follows: Universe Minimum number of files or records to be... universe is: the number of admissions in the last year for each of the two quality control samples under...

  7. 24 CFR 985.2 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... of the PHA's quality control sample is as follows: Universe Minimum number of files or records to be... universe is: the number of admissions in the last year for each of the two quality control samples under...

  8. 24 CFR 985.2 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... of the PHA's quality control sample is as follows: Universe Minimum number of files or records to be... universe is: the number of admissions in the last year for each of the two quality control samples under...

  9. 77 FR 61572 - Malcolm Baldrige National Quality Award Panel of Judges

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-10

    ... composed of twelve members prominent in the fields of quality, innovation, and performance management and.... Phillip Singerman, Associate Director for Innovation & Industry Services. [FR Doc. 2012-24915 Filed 10-9...

  10. Defining Geodetic Reference Frame using Matlab®: PlatEMotion 2.0

    NASA Astrophysics Data System (ADS)

    Cannavò, Flavio; Palano, Mimmo

    2016-03-01

    We describe the main features of the software tool PlatE-Motion 2.0 (PEM2), which infers the Euler pole parameters by inverting the velocities observed at a set of sites located on a rigid block (the inverse problem). PEM2 can also calculate the expected velocity at any point on Earth, given an Euler pole (the direct problem). PEM2 is the updated version of a previous software tool initially developed for easy file exchange with the GAMIT/GLOBK software package. The tool is developed in the Matlab® framework and, like the previous version, includes a set of MATLAB functions (m-files), GUIs (fig-files), map data files (mat-files), and a user's manual, as well as some example input files. Changes in PEM2 include (1) bug fixes, (2) improvements in the code, (3) improvements in the statistical analysis, and (4) new input/output file formats. In addition, PEM2 can now be run under most operating systems. The tool is open source and freely available to the scientific community.
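
    The direct problem has a compact closed form: the linear velocity of a site on a rigid plate is v = Ω × r, where Ω is the Euler (rotation) vector and r the site's position vector. A minimal sketch under a spherical-Earth assumption follows (in Python rather than Matlab, and with invented example values):

      import numpy as np

      def plate_velocity(lat_deg, lon_deg, pole_lat, pole_lon, omega_deg_myr):
          # Direct problem: v = Omega x r for an Euler pole (lat, lon) and a
          # rotation rate in degrees per million years; returns mm/yr in
          # Earth-centered Cartesian coordinates.
          R = 6371e3  # mean Earth radius, m (spherical approximation)

          def unit(lat, lon):
              lat, lon = np.radians([lat, lon])
              return np.array([np.cos(lat) * np.cos(lon),
                               np.cos(lat) * np.sin(lon),
                               np.sin(lat)])

          omega = np.radians(omega_deg_myr) / 1e6 * unit(pole_lat, pole_lon)
          r = R * unit(lat_deg, lon_deg)
          return np.cross(omega, r) * 1e3  # m/yr -> mm/yr

      # Hypothetical pole and site, for illustration only.
      print(plate_velocity(37.5, 15.0, 48.0, -102.0, 0.25))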

  11. 42 CFR 37.44 - Approval of radiographic facilities that use digital radiography systems.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... effective management, safety, and proper performance of chest image acquisition, digitization, processing... digital chest radiographs by submitting to NIOSH digital radiographic image files of a test object (e.g... radiographic image files from six or more sample chest radiographs that are of acceptable quality to one or...

  12. NCCA 2010 Water

    EPA Pesticide Factsheets

    Data from the National Aquatic Resource Surveys: The following data are available for download as comma separated values (.csv) files. Sort the table using the pull down menus or headers to more easily locate the data. Right click on the file name and select Save Link As to save the file to your computer. Make sure to also download the companion metadata file (.txt) for the list of field labels. See the survey technical document for more information on the data analyses. This dataset is associated with the following publications: Yurista, P., J. Kelly, and J. Scharold. Great Lakes nearshore-offshore: Distinct water quality regions. JOURNAL OF GREAT LAKES RESEARCH. International Association for Great Lakes Research, Ann Arbor, MI, USA, 42: 375-385, (2016). Kelly, J., P. Yurista, M. Starry, J. Scharold, W. Bartsch, and A. Cotter. The first US National Coastal Condition Assessment survey in the Great Lakes: Development of the GIS frame and exploration of spatial variation in nearshore water quality results. JOURNAL OF GREAT LAKES RESEARCH. International Association for Great Lakes Research, Ann Arbor, MI, USA, 41: 1060-1074, (2015).

  13. Navajo coal and air quality in Shiprock, New Mexico

    USGS Publications Warehouse

    Bunnell, Joseph E.; Garcia, Linda V.

    2006-01-01

    Among the Navajo people, high levels of respiratory disease, such as asthma, exist in a population with low rates of cigarette smoking. Air quality outdoors and indoors affects respiratory health. Many Navajo Nation residents burn locally mined coal in their homes for heat, as coal is the most economical energy source. The U.S. Geological Survey and Dine College, in cooperation with the Navajo Division of Health, are conducting a study in the Shiprock, New Mexico, area to determine if indoor use of this coal might be contributing to some of the respiratory health problems experienced by the residents. Researchers in this study will (1) examine respiratory health data, (2) identify stove type and use, (3) analyze samples of coal that are used locally, and (4) measure and characterize air quality inside selected homes. This Fact Sheet summarizes the interim results of the study in both English and Navajo. The Fact Sheet is available in three versions: English (800-KB PDF file), Navajo (computer must have Navajo language fonts installed; 304-KB PDF file), and an image of the Navajo language version (19.8-MB PDF file).

  14. Digitizing an Analog Radiography Teaching File Under Time Constraint: Trade-Offs in Efficiency and Image Quality.

    PubMed

    Loehfelm, Thomas W; Prater, Adam B; Debebe, Tequam; Sekhar, Aarti K

    2017-02-01

    We digitized the radiography teaching file at Black Lion Hospital (Addis Ababa, Ethiopia) during a recent trip, using a standard digital camera and a fluorescent light box. Our goal was to photograph every radiograph in the existing library while optimizing the final image size to the maximum resolution of a high quality tablet computer, preserving the contrast resolution of the radiographs, and minimizing total library file size. A secondary important goal was to minimize the cost and time required to take and process the images. Three workers were able to efficiently remove the radiographs from their storage folders, hang them on the light box, operate the camera, catalog the image, and repack the radiographs back to the storage folder. Zoom, focal length, and film speed were fixed, while aperture and shutter speed were manually adjusted for each image, allowing for efficiency and flexibility in image acquisition. Keeping zoom and focal length fixed, which kept the view box at the same relative position in all of the images acquired during a single photography session, allowed unused space to be batch-cropped, saving considerable time in post-processing, at the expense of final image resolution. We present an analysis of the trade-offs in workflow efficiency and final image quality, and demonstrate that a few people with minimal equipment can efficiently digitize a teaching file library.

  15. Usage analysis of user files in UNIX

    NASA Technical Reports Server (NTRS)

    Devarakonda, Murthy V.; Iyer, Ravishankar K.

    1987-01-01

    Presented is a user-oriented analysis of short-term file usage in a 4.2 BSD UNIX environment. The key aspect of this analysis is a characterization of users and files, a departure from the traditional approach of analyzing file references. Two characterization measures are employed: accesses-per-byte (combining the fraction of a file referenced with the number of references) and file size. This new approach is shown to distinguish differences in files as well as users, which can be used in efficient file system design and in creating realistic test workloads for simulations. A multi-stage gamma distribution is shown to closely model the file usage measures. Even though overall file sharing is small, some files belonging to a bulletin board system are accessed by many users, simultaneously and otherwise. Over 50% of users referenced files owned by other users, and over 80% of all files were involved in such references. Based on the differences in files and users, suggestions to improve system performance are also made.
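
    As a rough illustration of fitting such a distribution, the following Python sketch fits a single-stage gamma distribution to synthetic file sizes with SciPy and checks the fit with a Kolmogorov-Smirnov test. The paper's model is a multi-stage (mixture) gamma, and the data here are invented.

      import numpy as np
      from scipy import stats

      # Synthetic stand-in for a file usage measure such as file size.
      rng = np.random.default_rng(0)
      file_sizes = rng.gamma(shape=0.8, scale=12_000, size=2_000)  # bytes

      # Fit a gamma distribution (location pinned at zero) and test the fit.
      shape, loc, scale = stats.gamma.fit(file_sizes, floc=0)
      ks = stats.kstest(file_sizes, "gamma", args=(shape, loc, scale))
      print(f"shape={shape:.2f} scale={scale:.0f} KS p-value={ks.pvalue:.3f}")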

  16. Dynamic Simulation on the Installation Process of HGIS in Transformer Substation

    NASA Astrophysics Data System (ADS)

    Lin, Tao; Li, Shaohua; Wang, Hu; Che, Deyong; Qi, Guangcai; Yao, Jianfeng; Zhang, Qingzhe

    The technological requirements for Hybrid Gas Insulated Switchgear (HGIS) installation in a transformer substation are high, and the number of quality control points is large. Most engineers and technicians in construction enterprises are not familiar with HGIS equipment. To address these problems, the HGIS equipment was modeled on the computer with SolidWorks software. The installation process for the civil foundation and the closed-type equipment was dynamically optimized using virtual assembly technology. Instructions and application notes were composited into an animation file, and the modeling and simulation techniques were classified and organized. The resulting visual dynamic simulation can guide the actual HGIS construction process to a certain degree and can promote reasonable construction planning and management. It can also improve the methods and quality of staff training in electric power construction enterprises.

  17. An anaesthesia information management system as a tool for a quality assurance program: 10 years of experience.

    PubMed

    Motamed, Cyrus; Bourgain, Jean Louis

    2016-06-01

    Anaesthesia Information Management Systems (AIMS) generate large amounts of data, which might be useful for quality assurance programs. This study was designed to highlight the multiple contributions of our AIMS to extracting quality indicators over a period of 10 years. The study was conducted from 2002 to 2011. Two methods were used to extract anaesthesia indicators: manual extraction of individual files for monitoring neuromuscular relaxation, and structured query language (SQL) extraction for the other indicators, which were postoperative nausea and vomiting (PONV), pain and sedation scores, pain-related medications, and postoperative hypothermia. For each indicator, a program of information/meetings and adaptation/suggestions for operating room and PACU personnel was initiated to improve quality assurance, while data were extracted each year. The study included 77,573 patients. The mean overall completeness of data for the initial years ranged from 55 to 85%, depending on the indicator, and then improved to 95% completeness for the last 5 years. The rate of neuromuscular monitoring was initially 67% and then increased to 95% (P<0.05). The rate of pharmacological reversal remained around 53% throughout the study. Regarding the SQL data, an improvement in severe postoperative pain and PONV scores was observed throughout the study, while mild postoperative hypothermia remained a challenge despite efforts at improvement. The AIMS permitted the follow-up of certain indicators through manual sampling, and of many more via SQL extraction, in a sustained and non-time-consuming way across the years. However, it requires competent and dedicated resources to handle the database. Copyright © 2016 Société française d'anesthésie et de réanimation (Sfar). Published by Elsevier Masson SAS. All rights reserved.
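
    The SQL-extraction approach can be sketched as follows; the schema, column names, and numbers are hypothetical stand-ins, not the authors' AIMS database. The query computes, per year, the completeness of one indicator and the rate of mild postoperative hypothermia (temperature below 36 °C).

      import sqlite3

      # Hypothetical AIMS schema: one row per anaesthesia record, with the
      # indicators stored as nullable columns.
      con = sqlite3.connect(":memory:")
      con.executescript("""
      CREATE TABLE records (id INTEGER PRIMARY KEY, year INTEGER,
                            ponv_score INTEGER, pain_score INTEGER,
                            temp_pacu REAL);
      INSERT INTO records VALUES (1, 2010, 0, 2, 36.4), (2, 2010, NULL, 5, 35.7),
                                 (3, 2011, 1, 1, 36.1);
      """)

      # SQL extraction of indicator completeness and a quality indicator
      # (mild hypothermia) per year; COUNT(col) skips NULLs.
      for row in con.execute("""
          SELECT year,
                 100.0 * COUNT(ponv_score) / COUNT(*)     AS ponv_complete_pct,
                 100.0 * SUM(temp_pacu < 36.0) / COUNT(*) AS hypothermia_pct
          FROM records GROUP BY year"""):
          print(row)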

  18. Linking log files with dosimetric accuracy--A multi-institutional study on quality assurance of volumetric modulated arc therapy.

    PubMed

    Pasler, Marlies; Kaas, Jochem; Perik, Thijs; Geuze, Job; Dreindl, Ralf; Künzler, Thomas; Wittkamper, Frits; Georg, Dietmar

    2015-12-01

    To systematically evaluate machine-specific quality assurance (QA) for volumetric modulated arc therapy (VMAT) based on log files, by applying a dynamic benchmark plan. A VMAT benchmark plan was created and tested on 18 Elekta linacs (13 MLCi or MLCi2, 5 Agility) at 4 different institutions. Linac log files were analyzed and a delivery robustness index was introduced. For dosimetric measurements an ionization chamber array was used. Relative dose deviations were assessed by mean gamma for each control point and compared to the log file evaluation. Fourteen linacs delivered the VMAT benchmark plan, while 4 linacs failed by consistently terminating the delivery. The mean leaf error (±1 SD) was 0.3±0.2 mm across all linacs. Large maximum MLC errors, up to 6.5 mm, were observed at leaf reversal positions. The delivery robustness index accounting for MLC position correction (0.8-1.0) correlated with delivery time (80-128 s) and depended on dose rate performance. The dosimetric evaluation indicated generally accurate plan reproducibility, with γ(mean) (±1 SD) = 0.4±0.2 for 1 mm/1%. However, single control point analysis revealed larger deviations that corresponded well with the log file analysis. The designed benchmark plan helped identify linac-related malfunctions in dynamic mode for VMAT. Log files serve as an important additional QA measure to understand and visualize dynamic linac parameters. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
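
    A minimal sketch of the kind of log-file statistic reported above: given planned and delivered MLC leaf positions per control point (array shapes and numbers invented for illustration), compute the mean, SD, and maximum leaf error and flag the worst control point, where reversal-position errors would stand out.

      import numpy as np

      def leaf_error_stats(planned, delivered):
          # Per-control-point MLC leaf errors from log-file data; `planned`
          # and `delivered` have shape (control_points, leaves), in mm.
          err = np.abs(delivered - planned)
          return {
              "mean_leaf_error_mm": err.mean(),
              "sd_leaf_error_mm": err.std(),
              "max_leaf_error_mm": err.max(),
              # Control points with large errors (e.g., reversals) stand out:
              "worst_control_point": int(err.max(axis=1).argmax()),
          }

      rng = np.random.default_rng(0)
      planned = rng.uniform(-50, 50, size=(180, 80))
      delivered = planned + rng.normal(0, 0.3, size=planned.shape)
      print(leaf_error_stats(planned, delivered))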

  19. 24 CFR 985.2 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... of the PHA's quality control sample is as follows: Universe Minimum number of files or records to be... universe is: the number of admissions in the last year for each of the two quality control samples under...

  20. 24 CFR 985.2 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... of the PHA's quality control sample is as follows: Universe Minimum number of files or records to be... universe is: the number of admissions in the last year for each of the two quality control samples under...

  1. Formation of qualified BaHfO3 doped Y0.5Gd0.5Ba2Cu3O7-δ film on CeO2 buffered IBAD-MgO tape by self-seeding pulsed laser deposition

    NASA Astrophysics Data System (ADS)

    Liu, Linfei; Wang, Wei; Yao, Yanjie; Wu, Xiang; Lu, Saidan; Li, Yijie

    2018-05-01

    Improvement in the in-field transport properties of REBa2Cu3O7-δ (RE = rare earth element, REBCO) coated conductors is needed to meet the performance requirements of various practical applications; this can be accomplished by introducing artificial pinning centers (APCs), such as second-phase dopants. However, with increasing dopant level, the critical current density Jc at 77 K in zero applied magnetic field decreases. In this paper, in order to improve Jc, we propose a seed layer technique. A 5 mol% BaHfO3 (BHO) doped Y0.5Gd0.5Ba2Cu3O7-δ (YGBCO) epilayer with an inserted seed layer was grown on CeO2 buffered ion beam assisted deposition MgO (IBAD-MgO) tape by pulsed laser deposition. The effect of the conditions employed to prepare the seed layer, including tape moving speed and chemical composition, on the quality of the 5 mol% BHO doped YGBCO epilayer was systematically investigated by X-ray diffraction (XRD) measurements and scanning electron microscopy (SEM) observations. It was found that all the samples with a seed layer had higher Jc (77 K, self-field) than the 5 mol% BHO doped YGBCO film without a seed layer; the seed layer inhibited deterioration of the Jc at 77 K and self-field. In particular, the self-seed layer (a 5 mol% BHO doped YGBCO seed layer) was most effective in improving the crystal quality, surface morphology, and superconducting performance. At 4.2 K, the 5 mol% BHO doped YGBCO film with a 4 nm thick self-seed layer had a very high flux pinning force density Fp of 860 GN/m3 for B//c under a 9 T field, and, more importantly, no peak in the Fp curve was observed.

  2. Plate-based diversity subset screening generation 2: an improved paradigm for high-throughput screening of large compound files.

    PubMed

    Bell, Andrew S; Bradley, Joseph; Everett, Jeremy R; Loesel, Jens; McLoughlin, David; Mills, James; Peakman, Marie-Claire; Sharp, Robert E; Williams, Christine; Zhu, Hongyao

    2016-11-01

    High-throughput screening (HTS) is an effective method for lead and probe discovery that is widely used in industry and academia to identify novel chemical matter and to initiate the drug discovery process. However, HTS can be time consuming and costly and the use of subsets as an efficient alternative to screening entire compound collections has been investigated. Subsets may be selected on the basis of chemical diversity, molecular properties, biological activity diversity or biological target focus. Previously, we described a novel form of subset screening: plate-based diversity subset (PBDS) screening, in which the screening subset is constructed by plate selection (rather than individual compound cherry-picking), using algorithms that select for compound quality and chemical diversity on a plate basis. In this paper, we describe a second-generation approach to the construction of an updated subset: PBDS2, using both plate and individual compound selection, that has an improved coverage of the chemical space of the screening file, whilst only selecting the same number of plates for screening. We describe the validation of PBDS2 and its successful use in hit and lead discovery. PBDS2 screening became the default mode of singleton (one compound per well) HTS for lead discovery in Pfizer.
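
    The plate-based selection idea can be sketched as a greedy maximum-coverage loop; this is an illustration of the concept, not Pfizer's algorithm. Each plate is scored by how many not-yet-covered chemotypes its quality-passing compounds would add, and the best plate is taken repeatedly.

      def select_plates(plates, n_plates):
          # Greedy plate-based diversity selection (illustrative).
          # `plates` maps plate_id -> list of (chemotype_id, passes_quality).
          covered, chosen = set(), []
          for _ in range(n_plates):
              def gain(item):
                  _, compounds = item
                  return len({c for c, ok in compounds if ok} - covered)
              plate_id, compounds = max(plates.items(), key=gain)
              covered |= {c for c, ok in compounds if ok}
              chosen.append(plate_id)
              del plates[plate_id]
          return chosen

      plates = {
          "P1": [(1, True), (2, True), (3, False)],
          "P2": [(2, True), (4, True)],
          "P3": [(1, True), (2, True)],
      }
      print(select_plates(plates, 2))  # -> ['P1', 'P2']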

  3. Water-quality, streamflow, and meteorological data for the Tualatin River Basin, Oregon, 1991-93

    USGS Publications Warehouse

    Doyle, M.C.; Caldwell, J.M.

    1996-01-01

    Surface-water-quality data, ground-water-quality data, streamflow data, field measurements, aquatic-biology data, meteorological data, and quality-assurance data were collected in the Tualatin River Basin from 1991 to 1993 by the U.S. Geological Survey (USGS) and the Unified Sewerage Agency of Washington County, Oregon (USA). The data from that study, which are part of this report, are presented in American Standard Code for Information Interchange (ASCII) format in subject-specific data files on a Compact Disk-Read Only Memory (CD-ROM). The text of this report describes the objectives of the study, the location of sampling sites, sample-collection and processing techniques, equipment used, laboratory analytical methods, and quality-assurance procedures. The data files on CD-ROM contain the analytical results of water samples collected in the Tualatin River Basin, streamflow measurements of the main-stem Tualatin River and its major tributaries, flow data from the USA wastewater-treatment plants, flow data from stations that divert water from the main-stem Tualatin River, aquatic-biology data, and meteorological data from the Tualatin Valley Irrigation District (TVID) Agrimet Weather Station located in Verboort, Oregon. Specific information regarding the contents of each data file is given in the text. The data files use a series of letter codes that distinguish each line of data. These codes are defined in data tables accompanying the text. Presenting data on CD-ROM offers several advantages: (1) the data can be accessed easily and manipulated by computers, (2) the data can be distributed readily over computer networks, and (3) the data may be more easily transported and stored than a large printed report. These data have been used by the USGS to (1) identify the sources, transport, and fate of nutrients in the Tualatin River Basin, (2) quantify relations among nutrient loads, algal growth, low dissolved-oxygen concentrations, and high pH, and (3) develop and calibrate a water-quality model that allows managers to test options for alleviating water-quality problems.

  4. State-Level Cancer Quality Assessment and Research

    PubMed Central

    Lipscomb, Joseph; Gillespie, Theresa W.

    2016-01-01

    Over a decade ago, the Institute of Medicine called for a national cancer data system in the United States to support quality-of-care assessment and improvement, including research on effective interventions. Although considerable progress has been achieved in cancer quality measurement and effectiveness research, the nation still lacks a population-based data infrastructure for accurately identifying cancer patients and tracking services and outcomes over time. For compelling reasons, the most effective pathway forward may be the development of state-level cancer data systems, in which central registry data are linked to multiple public and private secondary sources. These would include administrative/claims files from Medicare, Medicaid, and private insurers. Moreover, such a state-level system would promote rapid learning by encouraging adoption of near-real-time reporting and feedback systems, such as the Commission on Cancer’s new Rapid Quality Reporting System. The groundwork for such a system is being laid in the state of Georgia, and similar work is advancing in other states. The pace of progress depends on the successful resolution of issues related to the application of information technology, financing, and governance. PMID:21799333

  5. Criteria to Extract High-Quality Protein Data Bank Subsets for Structure Users.

    PubMed

    Carugo, Oliviero; Djinović-Carugo, Kristina

    2016-01-01

    It is often necessary to build subsets of the Protein Data Bank in order to extract structural trends and average values. For this purpose it is mandatory that the subsets be non-redundant and of high quality. The first requirement can be satisfied relatively easily at the sequence level or at the structural level. The second, on the contrary, needs special attention. It is not sufficient to consider the crystallographic resolution alone; other features must be taken into account: the absence of strings of residues from the electron density maps and from the files deposited in the Protein Data Bank; the B-factor values; the appropriate validation of the structural models; the quality of the electron density maps, which is not uniform; and the temperature of the diffraction experiments. More stringent criteria produce smaller subsets, which can be enlarged with more tolerant selection criteria. The incessant growth of the Protein Data Bank, and especially of the number of high-resolution structures, is allowing the use of more stringent selection criteria, with a consequent improvement in the quality of the subsets of the Protein Data Bank.
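
    Such stringency-tunable criteria are straightforward to encode. In the following Python sketch the record fields and thresholds are illustrative, with a stringent default and a more tolerant call that enlarges the subset, mirroring the trade-off described above.

      # Hypothetical per-entry quality records; field names are invented.
      entries = [
          {"id": "1ABC", "resolution": 1.4, "mean_b": 18.0,
           "missing_residues": 0, "temp_k": 100},
          {"id": "2XYZ", "resolution": 2.9, "mean_b": 55.0,
           "missing_residues": 12, "temp_k": 293},
      ]

      def high_quality(e, max_res=2.0, max_b=30.0, max_missing=0,
                       cryo_only=True):
          # Stringent defaults; loosen the thresholds to enlarge the subset.
          return (e["resolution"] <= max_res
                  and e["mean_b"] <= max_b
                  and e["missing_residues"] <= max_missing
                  and (not cryo_only or e["temp_k"] <= 110))

      print([e["id"] for e in entries if high_quality(e)])   # stringent
      print([e["id"] for e in entries
             if high_quality(e, max_res=3.0, max_b=60.0,
                             max_missing=15, cryo_only=False)])  # tolerant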

  6. Presentation of a quality management program in off-pump coronary bypass surgery.

    PubMed

    Bougioukakis, Petros; Kluegl, Stefan J; Babin-Ebell, Joerg; Tagarakis, Giorgios I; Mandewirth, Martin; Zacher, Michael; Diegeler, Anno

    2014-01-01

    To increase the number of off-pump coronary procedures at our institution, a new surgical team was formed. The first 3 years of the "learning period" were accompanied by a quality management program aimed at controlling and adjusting the surgical process and ensuring the safety and quality of the procedure. All patients were operated on by the same surgeon between January 2004 and December 2006; all procedures were performed under the following quality management protocol. First, a flow chart regulated surgical and anesthetic details. Second, an online file, named the "disturbance file," was used to report work flow interruptions, disturbances, and intraoperative events, that is, myocardial ischemia, hypotension, conversion to cardiopulmonary bypass, and any violation of the protocol. Each event was coded with 1 point and added to a score (the higher the score, the greater the disturbance). Outcome parameters, known as major cardiac and cerebral events (mortality within 30 days, myocardial infarction confirmed by electrocardiogram or significantly elevated total creatine kinase/creatine kinase-myocardial band levels, reintervention within 30 days, and stroke), as well as new-onset dialysis, were also measured. Success was defined as freedom from any of those events and was depicted in a cumulative sum control (CUSUM) chart. Outcome data and the CUSUM were correlated with the intraoperative Disturbance Index. In total, 490 off-pump coronary bypass operations were performed by the named surgeon during the study period. The 30-day mortality was reduced from 4.0% to 1.9%. The proportion of cases with a Disturbance Index score greater than 1 declined from 41.6% to 23.3%. All major cardiac and cerebral events declined. The CUSUM chart showed two critical periods during the learning period, which made an adjustment of the protocol necessary. Quality management control is efficient in improving the postoperative results of a surgical procedure. A learning period is of cardinal importance for any new team wishing to engage in a novel surgical technique.
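
    For illustration, a one-sided CUSUM for a failure rate can be computed as below (one common formulation among several; the target rate and outcome series are invented). Each failure moves the curve up by 1 - p_target and each success down by p_target, so sustained excess mortality drifts the curve upward.

      def cusum(outcomes, p_target=0.02):
          # One-sided cumulative sum chart for a binary failure series.
          s, curve = 0.0, []
          for failed in outcomes:
              s += (1 - p_target) if failed else -p_target
              s = max(s, 0.0)          # reset at zero (one-sided chart)
              curve.append(s)
          return curve

      # 0 = event-free case, 1 = major event; illustrative series only.
      series = [0, 0, 1, 0, 0, 0, 1, 1, 0, 0]
      print([round(v, 2) for v in cusum(series)])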

  7. Associative Adjustments to Reduce Errors in Document Searching.

    ERIC Educational Resources Information Center

    Bryant, Edward C.; And Others

    Associative adjustments to a document file are considered as a means for improving retrieval. A theoretical investigation of the statistical properties of a generalized mismatch measure was carried out, and improvements in retrieval resulting from performing associative regression adjustments on the data file were examined both from the theoretical and…

  8. 76 FR 82016 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-29

    ... with additional opportunities for price improvement. B. Self-Regulatory Organization's Statement on... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-66038; File No. SR-CBOE-2011-117] Self... Change Relating to Its Automated Improvement Mechanism December 22, 2011. Pursuant to Section 19(b)(1) of...

  9. Testing the hospital value proposition: an empirical analysis of efficiency and quality.

    PubMed

    Huerta, Timothy R; Ford, Eric W; Peterson, Lori T; Brigham, Keith H

    2008-01-01

    To assess the relationship between hospitals' X-inefficiency levels and overall care quality, based on the National Quality Forum's 27 safe practices score, and to improve the analytic strategy for assessing X-inefficiency. The 2005 versions of the American Hospital Association and Leapfrog Group annual surveys were the basis of the study. Additional case-mix indices and market variables were drawn from the Centers for Medicare and Medicaid Services data sources and the Area Resource File. Data envelopment analysis was used to determine hospitals' X-inefficiency scores relative to their market-level competitors. Regression was used to assess the relationship between X-inefficiency and quality, controlling for organizational and market characteristics. Expenses (total and labor expenditures), case-mix-adjusted admissions, length of stay, and licensed beds defined the X-inefficiency function. The overall National Quality Forum safe practice score, health maintenance organization penetration, market share, and teaching status served as independent control variables in the regression. The National Quality Forum's safe practice scores are significantly and positively correlated with hospital X-inefficiency levels (beta = .105, p < or = .05). The analysis of the value proposition had very good explanatory power (adjusted R(2) = .414; p < or = .001; df = 7, 265). Contrary to earlier findings, health maintenance organization penetration and being a teaching hospital were positively related to X-inefficiency. Consistent with others' findings, greater market share and for-profit ownership were negatively associated with X-inefficiency. Measurement of overall hospital quality is improving but can still be made better. Nevertheless, the National Quality Forum's measure is significantly related to efficiency and could be used to create differential pay-for-performance programs. A market-segmented analytic strategy for studying hospitals' efficiency yields results with a high degree of explanatory power.

  10. Can Real-Time Data Also Be Climate Quality?

    NASA Astrophysics Data System (ADS)

    Brewer, M.; Wentz, F. J.

    2015-12-01

    GMI, AMSR-2 and WindSat herald a new era of highly accurate and timely microwave data products. Traditionally, there has been a large divide between real-time and re-analysis data products. What if these completely separate processing systems could be merged? Through advanced modeling and physically based algorithms, Remote Sensing Systems (RSS) has narrowed the gap between real-time and research-quality data. Satellite microwave ocean products have proven useful for a wide array of timely Earth science applications. Through-cloud SST capabilities have enormously benefited tropical cyclone forecasting and day-to-day fisheries management, to name a few. Oceanic wind vectors enhance the operational safety of shipping and recreational boating. Atmospheric rivers are of importance to many human endeavors, as are cloud cover and knowledge of precipitation events. Some activities benefit from climate and real-time operational data used in conjunction. RSS has been consistently improving microwave Earth Science Data Records (ESDRs) for several decades, while making near-real-time data publicly available for semi-operational use. These data streams have often been produced in 2 stages: near real-time, followed by research-quality final files. Over the years, we have seen this time delay shrink from months or weeks to mere hours. As well, we have seen the quality of near-real-time data improve to the point where the distinction starts to blur. We continue to work towards better and faster RFI filtering, adaptive algorithms and improved real-time validation statistics for earlier detection of problems. Can it be possible to produce climate-quality data in real time, and what would the advantages be? We will try to answer these questions…

  11. Challenges in data quality: the influence of data quality assessments on data availability and completeness in a voluntary medical male circumcision programme in Zimbabwe

    PubMed Central

    Xiao, Y; Bochner, A F; Makunike, B; Holec, M; Xaba, S; Tshimanga, M; Chitimbire, V; Barnhart, S; Feldacker, C

    2017-01-01

    Objectives: To assess availability and completeness of data collected before and after a data quality audit (DQA) in voluntary medical male circumcision (VMMC) sites in Zimbabwe, to determine the effect of this process on data quality. Setting: 4 of 10 VMMC sites in Zimbabwe that received a DQA in February 2015, selected by convenience sampling. Participants: Retrospective reviews of all client intake forms (CIFs) from November 2014 and May 2015. A total of 1400 CIFs were included from those 2 months across the four sites. Primary and secondary outcomes: Data availability was measured as the percentage of VMMC clients whose CIF was on file at each site. A data evaluation tool measured the completeness of 34 key CIF variables. A comparison of pre-DQA and post-DQA results was conducted using χ2 and t-tests. Results: After the DQA, high record availability of over 98% was maintained by sites 3 and 4. For sites 1 and 2, record availability increased by 8.0% (p=0.001) and 9.7% (p=0.02), respectively. After the DQA, sites 1, 2 and 3 improved significantly in data completeness across the 34 key indicators, increasing by 8.6% (p<0.001), 2.7% (p=0.003) and 3.8% (p<0.001), respectively. For site 4, CIF data completeness decreased by 1.7% (p<0.01) after the DQA. Conclusions: Our findings suggest that CIF data availability and completeness generally improved after the DQA. However, gaps in documentation of vital signs and adverse events signal areas for improvement. Additional emphasis on data completeness would help support high-quality programme implementation and availability of reliable data for decision-making. PMID:28132009
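
    The before/after comparison used here is a standard two-by-two test. The following Python sketch (with invented counts, not the study's data) computes completeness percentages for one CIF variable and a chi-squared test of the change.

      from scipy.stats import chi2_contingency

      # Completed vs missing counts for one variable, before and after the
      # data quality audit; numbers are illustrative.
      before = {"complete": 610, "missing": 90}    # pre-DQA month
      after = {"complete": 680, "missing": 20}     # post-DQA month

      table = [[before["complete"], before["missing"]],
               [after["complete"], after["missing"]]]
      chi2, p, dof, _ = chi2_contingency(table)

      pct = lambda d: 100 * d["complete"] / (d["complete"] + d["missing"])
      print(f"completeness {pct(before):.1f}% -> {pct(after):.1f}% (p={p:.4f})")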

  12. File concepts for parallel I/O

    NASA Technical Reports Server (NTRS)

    Crockett, Thomas W.

    1989-01-01

    The subject of input/output (I/O) has often been neglected in the design of parallel computer systems, although for many problems I/O rates will limit the attainable speedup. The I/O problem is addressed by considering the role of files in parallel systems. The notion of parallel files is introduced. Parallel files provide for concurrent access by multiple processes and utilize parallelism in the I/O system to improve performance. Parallel files can also be used conventionally by sequential programs. A set of standard parallel file organizations is proposed, and implementations using multiple storage devices are suggested. Problem areas are also identified and discussed.

  13. Improving the quality of palliative care for ambulatory patients with lung cancer

    PubMed Central

    von Plessen, Christian; Aslaksen, Aslak

    2005-01-01

    Problem: Most patients with advanced lung cancer currently receive much of their health care, including chemotherapy, as outpatients. Patients have to deal with the complex and time consuming logistics of ambulatory cancer care. At the same time, members of staff often waste considerable time and energy in organisational aspects of care that could be better used in direct interaction with patients. Design: Quality improvement study using direct observation and run and flow charts, and focus group meetings with patients and families regarding perceptions of the clinic and with staff regarding satisfaction with working conditions. Setting: Thoracic oncology outpatient clinic at a Norwegian university hospital where patients receive chemotherapy and complementary palliative care. Key measures for improvement: Waiting time and time wasted during consultations; calmer working situation at the clinic; satisfaction among patients. Strategies for change: Rescheduled patients' appointments, automated retrieval of blood test results, systematic reporting in patients' files, design of an information leaflet, and refurnishing of the waiting area at the clinic. Effects of change: Interventions resulted in increased satisfaction for patients and staff, reduced waiting time, and reduced variability of waiting time. Lessons learnt: Direct observation, focus groups, questionnaires on patients' satisfaction, and measurement of process time were useful in systematically improving care in this outpatient clinic. The description of this experience can serve as an example for the improvement of a microsystem, particularly in other settings with similar problems. PMID:15933354

  14. Digital surveying and mapping of forest road network for development of a GIS tool for the effective protection and management of natural ecosystems

    NASA Astrophysics Data System (ADS)

    Drosos, Vasileios C.; Liampas, Sarantis-Aggelos G.; Doukas, Aristotelis-Kosmas G.

    2014-08-01

    Geographic Information Systems (GIS) have become important tools, not only in the geosciences and environmental sciences but in virtually all research that requires monitoring, planning, or land management. The purpose of this paper was to develop a planning and decision-making tool using AutoCAD Map, ArcGIS and Google Earth, with emphasis on investigating the suitability of forest road mapping and the range of its implementation in Greece at the prefecture level. Integrating spatial information into a database makes data available throughout the organization, improving quality, productivity, and data management. Working in such an environment, users can access and edit information, integrate and analyze data, communicate effectively, and select desired information, such as the forest road network, at a very early stage in the planning of silvicultural operations, for example before harvest planning is carried out. AutoCAD Map was used to export the GPS data to shapefiles, ArcGIS (ArcGlobe) to handle the shapefiles, and Google Earth with KML (Keyhole Markup Language) files to better visualize and evaluate existing conditions, design in a real-world context, and exchange information with government agencies, utilities, and contractors in both CAD and GIS data formats. Automating the update procedure and the transfer of files between agencies and departments is one of the main tasks that the integrated GIS tool should address.

  15. Annual Quality Assurance Conference Files by Nicola Watson and Rui Li

    EPA Pesticide Factsheets

    26th Annual Quality Assurance Conference. Abstract: An Innovative Water Management Device for Online and Canister-based Thermal Desorption of Trace-level VVOCs in High Humidity Ambient Air by Nicola Watson and Rui Li

  16. 76 FR 60806 - Malcolm Baldrige National Quality Award Panel of Judges

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-30

    ... composed of twelve members prominent in the fields of quality, innovation, and performance management and... Innovation & Industry Services. [FR Doc. 2011-25261 Filed 9-29-11; 8:45 am] BILLING CODE 3510-13-P ...

  17. Improving transmission efficiency of large sequence alignment/map (SAM) files.

    PubMed

    Sakib, Muhammad Nazmus; Tang, Jijun; Zheng, W Jim; Huang, Chin-Tser

    2011-01-01

    Research in bioinformatics primarily involves collection and analysis of a large volume of genomic data. Naturally, it demands efficient storage and transfer of this huge amount of data. In recent years, some research has been done to find efficient compression algorithms to reduce the size of various sequencing data. One way to improve the transmission time of large files is to apply a maximum lossless compression on them. In this paper, we present SAMZIP, a specialized encoding scheme, for sequence alignment data in SAM (Sequence Alignment/Map) format, which improves the compression ratio of existing compression tools available. In order to achieve this, we exploit the prior knowledge of the file format and specifications. Our experimental results show that our encoding scheme improves compression ratio, thereby reducing overall transmission time significantly.
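
    The core idea, exploiting prior knowledge of the file format, can be sketched by compressing each SAM column as its own stream, so that similar data (positions, CIGAR strings, quality strings) compress together. This toy Python example is not SAMZIP itself, and on an input this small the per-stream overhead dominates; the benefit appears on realistically large files.

      import zlib

      sam = b"""r001\t99\tchr1\t7\t30\t8M\t=\t37\t39\tTTAGATAA\tFFFFFFFF
      r002\t0\tchr1\t9\t30\t3S6M\t*\t0\t0\tAAAAGATAA\tFFFFFFFFF
      """

      # Baseline: whole-file compression, as a generic tool would do.
      baseline = len(zlib.compress(sam, 9))

      # Format-aware compression: group each SAM column into its own
      # stream before compressing.
      columns = list(zip(*(line.split(b"\t") for line in sam.splitlines())))
      fieldwise = sum(len(zlib.compress(b"\n".join(col), 9)) for col in columns)

      print(baseline, fieldwise)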

  18. SILVA tree viewer: interactive web browsing of the SILVA phylogenetic guide trees.

    PubMed

    Beccati, Alan; Gerken, Jan; Quast, Christian; Yilmaz, Pelin; Glöckner, Frank Oliver

    2017-09-30

    Phylogenetic trees are an important tool for studying the evolutionary relationships among organisms. The huge number of available taxa poses difficulties for their interactive visualization, which hampers interaction with users and the collection of feedback for further improvement of the taxonomic framework. The SILVA Tree Viewer is a web application designed for visualizing large phylogenetic trees without requiring the download of any software tool or data files. It is based on Web Geographic Information Systems (Web-GIS) technology with a PostgreSQL backend and enables zoom and pan functionalities similar to Google Maps. The SILVA Tree Viewer provides access to two phylogenetic (guide) trees provided by the SILVA database: the SSU Ref NR99, inferred from high-quality, full-length small subunit sequences clustered at 99% sequence identity, and the LSU Ref, inferred from high-quality, full-length large subunit sequences. The Tree Viewer provides tree navigation, search and browse tools, as well as an interactive feedback system to collect all kinds of requests, ranging from taxonomy to data curation and improving the tool itself.

  19. EuCliD--a medical registry.

    PubMed

    Steil, H; Amato, C; Carioni, C; Kirchgessner, J; Marcelli, D; Mitteregger, A; Moscardo, V; Orlandini, G; Gatti, E

    2004-01-01

    The European Clinical Database EuCliD has been developed as a tool for supervising selected quality indicators of about 200 European dialysis centers. Major efforts had to be made to comply with local and European laws regarding data security. EuCliD is a Lotus Notes based flat-file database currently containing medical data on more than 14,000 dialysis patients from 10 European countries. Another 15,000 patients from 150 centers in 4 South American countries will be added soon. Data are entered either manually or by means of interfaces to existing local data managing systems. This information is transferred to a central Lotus Notes server. Data evaluation was performed with statistical tools such as SPSS. EuCliD is used as part of the CQI (Continuous Quality Improvement) management system of Fresenius Medical Care (FMC) dialysis units. Each participating dialysis center receives benchmarking reports at regular intervals (currently every half year). The benchmark for all quality parameters is the weighted mean of the corresponding data from all centers. An obvious impact of data sampling and data evaluation on the quality of the treatments could be observed within the first one and a half years of working with EuCliD. This concerns important outcome predictors such as Kt/V and hemoglobin concentration, as well as the outcome itself, expressed in hospitalization days and survival rates. With the help of EuCliD, the user is able to sample clinical data, identify problems, and search for solutions, with the aim of improving dialysis treatment quality and guaranteeing high-class treatment quality for all patients.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sublet, J.-Ch.; Koning, A.J.; Forrest, R.A.

    The reasons for the conversion of the European Activation File, EAF, into ENDF-6 format are threefold. First, it significantly enhances the JEFF-3.0 release by the addition of an activation file. Second, it considerably increases its usage by employing a recognized, official file format, allowing existing plug-in processes to be effective. Third, it moves towards a universal nuclear data file, in contrast to the current separate general- and special-purpose files. The format chosen for the JEFF-3.0/A file uses reaction cross sections (MF-3), cross sections (MF-10), and multiplicities (MF-9). Having the data in ENDF-6 format allows the ENDF suite of utilities and checker codes to be used alongside many other utility, visualization, and processing codes. It is based on the EAF activation file used for many applications from fission to fusion, including dosimetry, inventories, depletion-transmutation, and geophysics. JEFF-3.0/A takes advantage of four generations of EAF files. Extensive benchmarking activities on these files provide feedback and validation with integral measurements. These, in parallel with a detailed graphical analysis based on EXFOR, have been applied, stimulating new measurements and significantly increasing the quality of this activation file. The next step is to include the EAF uncertainty data for all channels in JEFF-3.0/A.

  1. Nursing home quality and financial performance: does the racial composition of residents matter?

    PubMed

    Chisholm, Latarsha; Weech-Maldonado, Robert; Laberge, Alex; Lin, Feng-Chang; Hyer, Kathryn

    2013-12-01

    To examine the effects of the racial composition of residents on nursing homes' financial and quality performance. The study examined Medicare and Medicaid-certified nursing homes across the United States that submitted Medicare cost reports between the years 1999 and 2004 (11,472 average per year). Data were obtained from the Minimum Data Set, the On-Line Survey Certification and Reporting, Medicare Cost Reports, and the Area Resource File. Panel data regression with random intercepts and negative binomial regression were conducted with state and year fixed effects. Financial and quality performance differed between nursing homes with high proportions of black residents and nursing homes with no or medium proportions of black residents. Nursing homes with no black residents had higher revenues and higher operating margins and total profit margins and they exhibited better processes and outcomes than nursing homes with high proportions of black residents. Nursing homes' financial viability and quality of care are influenced by the racial composition of residents. Policy makers should consider initiatives to improve both the financial and quality performance of nursing homes serving predominantly black residents. © Health Research and Educational Trust.

  2. Conducting remote bioanalytical data monitoring and review based on scientific quality objectives.

    PubMed

    He, Ling

    2011-07-01

    For bioanalytical laboratories that follow GLP regulations and generate data for new drug filings, ensuring the quality standards set by regulatory guidance is a fundamental expectation. Numerous guidelines and White Papers have been published by regulatory agencies, professional working groups and field experts in the past two decades, and have significantly improved the standards of good practice for bioanalysis. From a sponsor's perspective, continuous quality monitoring of the data generated by CRO laboratories, identifying adverse trends, and taking corrective and preventative actions against issues encountered are critical aspects of effective bioanalytical outsourcing management. This is especially important for clinical bioanalysis, where one validated assay is applied to analyzing a large number of samples of diverse demographics and disease states. This perspective article presents thoughts on remote data monitoring and its merits for scientific quality oversight, and introduces a novel Bioanalytical Data Review software package, custom-developed and platform-neutral, that conducts remote data monitoring on raw or processed LC-MS/MS data from CROs. Flexible, adaptive and user-customizable queries are applied for conducting project-, batch- and sample-level data review based on scientific quality performance factors commonly assessed for good bioanalytical practice.

  3. Smartfiles: An OO approach to data file interoperability

    NASA Technical Reports Server (NTRS)

    Haines, Matthew; Mehrotra, Piyush; Vanrosendale, John

    1995-01-01

    Data files for scientific and engineering codes typically consist of a series of raw data values whose descriptions are buried in the programs that interact with these files. In this situation, making even minor changes to the file structure, or sharing files between programs (interoperability), can only be done after careful examination of the data file and of the I/O statements of the programs interacting with it. In short, scientific data files lack self-description, and other self-describing data techniques are not always appropriate or useful for scientific data files. By applying an object-oriented methodology to data files, we can add the intelligence required to improve data interoperability and provide an elegant mechanism for supporting complex, evolving, or multidisciplinary applications, while still supporting legacy codes. As a result, scientists and engineers should be able to share datasets with far greater ease, simplifying multidisciplinary applications and greatly facilitating remote collaboration between scientists.
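
    A minimal sketch of a self-describing data file in the spirit described above (the format and field names are invented): a JSON header records the field names and binary layout, so any program can recover the structure without consulting the code that produced it.

      import json
      import struct

      # Write: a length-prefixed JSON header, then fixed-layout records.
      fields = [("time", "d"), ("pressure", "d"), ("temperature", "d")]
      records = [(0.0, 101.3, 288.2), (1.0, 101.1, 288.4)]

      header = json.dumps({"fields": [n for n, _ in fields],
                           "format": "".join(f for _, f in fields)}).encode()
      with open("run.dat", "wb") as fh:
          fh.write(struct.pack("<I", len(header)))
          fh.write(header)
          for rec in records:
              fh.write(struct.pack("<3d", *rec))

      # Read: recover the structure from the header alone.
      with open("run.dat", "rb") as fh:
          (hlen,) = struct.unpack("<I", fh.read(4))
          meta = json.loads(fh.read(hlen))
          fmt = "<" + meta["format"]
          while chunk := fh.read(struct.calcsize(fmt)):
              print(dict(zip(meta["fields"], struct.unpack(fmt, chunk))))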

  4. Appalachian Basin Play Fairway Analysis: Thermal Quality Analysis in Low-Temperature Geothermal Play Fairway Analysis (GPFA-AB)

    DOE Data Explorer

    Teresa E. Jordan

    2015-11-15

    This collection of files is part of a larger dataset uploaded in support of Low Temperature Geothermal Play Fairway Analysis for the Appalachian Basin (GPFA-AB, DOE Project DE-EE0006726). Phase 1 of the GPFA-AB project identified potential Geothermal Play Fairways within the Appalachian basin of Pennsylvania, West Virginia and New York. This was accomplished through analysis of 4 key criteria or ‘risks’: thermal quality, natural reservoir productivity, risk of seismicity, and heat utilization. Each of these analyses represents a distinct project task, with the fifth task encompassing combination of the 4 risk factors. Supporting data for all five tasks have been uploaded into the Geothermal Data Repository node of the National Geothermal Data System (NGDS). This submission comprises the data for Thermal Quality Analysis (project task 1) and includes all of the necessary shapefiles, rasters, datasets, code, and references to code repositories that were used to create the thermal resource and risk factor maps as part of the GPFA-AB project. The identified Geothermal Play Fairways are also provided with the larger dataset. Figures (.png) are provided as examples of the shapefiles and rasters. The regional standardized 1 square km grid used in the project is also provided as points (cell centers), polygons, and as a raster. Two ArcGIS toolboxes are available: 1) RegionalGridModels.tbx for creating resource and risk factor maps on the standardized grid, and 2) ThermalRiskFactorModels.tbx for use in making the thermal resource maps and cross sections. These toolboxes contain “item description” documentation for each model within the toolbox, and for the toolbox itself. This submission also contains three R scripts: 1) AddNewSeisFields.R to add seismic risk data to attribute tables of seismic risk, 2) StratifiedKrigingInterpolation.R for the interpolations used in the thermal resource analysis, and 3) LeaveOneOutCrossValidation.R for the cross validations used in the thermal interpolations. Some file descriptions make reference to various 'memos'. These are contained within the final report submitted October 16, 2015. Each zipped file in the submission contains an 'about' document describing the full Thermal Quality Analysis content available, along with key sources, authors, citation, use guidelines, and assumptions, with the specific file(s) contained within the .zip file highlighted.
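
    For readers unfamiliar with the leave-one-out cross-validation used to check such interpolations, the following Python sketch shows the procedure with inverse-distance weighting standing in for the kriging used in the analysis (the repository's actual scripts are in R); the well locations and values are synthetic.

      import numpy as np

      def idw(points, values, q, power=2.0):
          # Inverse-distance-weighted interpolation (a simple stand-in
          # for kriging).
          d = np.linalg.norm(points - q, axis=1)
          if d.min() == 0:
              return float(values[d.argmin()])
          w = 1.0 / d ** power
          return float(w @ values / w.sum())

      def leave_one_out_rmse(points, values):
          # Predict each site from all the others; the RMSE of these
          # held-out predictions measures interpolation quality.
          errs = [idw(np.delete(points, i, 0), np.delete(values, i),
                      points[i]) - values[i]
                  for i in range(len(values))]
          return float(np.sqrt(np.mean(np.square(errs))))

      rng = np.random.default_rng(0)
      pts = rng.uniform(0, 100, size=(50, 2))               # locations, km
      temps = 20 + 0.5 * pts[:, 0] + rng.normal(0, 1, 50)   # proxy values
      print(leave_one_out_rmse(pts, temps))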

  5. Meaningful Peer Review in Radiology: A Review of Current Practices and Potential Future Directions.

    PubMed

    Moriarity, Andrew K; Hawkins, C Matthew; Geis, J Raymond; Dreyer, Keith J; Kamer, Aaron P; Khandheria, Paras; Morey, Jose; Whitfill, James; Wiggins, Richard H; Itri, Jason N

    2016-12-01

    The current practice of peer review within radiology is well developed and widely implemented compared with other medical specialties. However, there are many factors that limit current peer review practices from reducing diagnostic errors and improving patient care. The development of "meaningful peer review" requires a transition away from compliance toward quality improvement, whereby the information and insights gained facilitate education and drive systematic improvements that reduce the frequency and impact of diagnostic error. The next generation of peer review requires significant improvements in IT functionality and integration, enabling features such as anonymization, adjudication by multiple specialists, categorization and analysis of errors, tracking, feedback, and easy export into teaching files and other media that require strong partnerships with vendors. In this article, the authors assess various peer review practices, with focused discussion on current limitations and future needs for meaningful peer review in radiology. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  6. Use of audit, feedback and education increased guideline implementation in a multidisciplinary stroke unit.

    PubMed

    Vratsistas-Curto, Angela; McCluskey, Annie; Schurr, Karl

    2017-01-01

    The audit-feedback cycle is a behaviour change intervention used to reduce evidence-practice gaps. In this study, repeat audits, feedback, education and training were used to change practice and increase compliance with Australian guideline recommendations for stroke rehabilitation. The aim was to increase the proportion of patients with stroke receiving best practice screening, assessment and treatment. A before-and-after study design was used. Data were collected from medical records (n=15 files per audit). Four audits were conducted between 2009 and 2013. Consecutive files of patients with stroke admitted to the stroke unit were selected and audited retrospectively. Staff behaviour change interventions included four cycles of audit feedback, and education to assist staff with change. The primary outcome measure was the proportion of eligible patients receiving best practice against target behaviours, based on audit data. Between the first and fourth audits (2009 and 2013), 20 of the 27 areas targeted (74%) met or exceeded the minimum target of 10% change. Practice areas that showed the most change included sensation screening (+75%) and rehabilitation (+100%), and neglect screening (+92%) and assessment (+100%). Some target behaviours showed a drop in compliance, such as anxiety and depression screening (-27%), or little or no overall improvement, such as patient education about stroke (+6%). Audit feedback and education increased the proportion of inpatients with stroke receiving best practice rehabilitation in some, but not all, practice areas. An ongoing process of quality improvement is needed to help sustain these improvements.

  7. 76 FR 49824 - Self-Regulatory Organizations; NASDAQ OMX PHLX LLC; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-11

    ... competition and affords the opportunity for price improvement to more options contracts. B. Self-Regulatory... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-65043; File No. SR-Phlx-2011-104] Self... Change Relating to the Extension of a Pilot Program Regarding Price Improvement XL August 5, 2011...

  8. 77 FR 1098 - Self-Regulatory Organizations; C2 Options Exchange, Incorporated; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-09

    ... price improvement auctions to occur on C2. B. Self-Regulatory Organization's Statement on Burden on... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-66075; File No. SR-C2-2011-042] Self-Regulatory... Rule Change Related to the Exchange's Automated Improvement Mechanisms December 30, 2011. Pursuant to...

  9. A pediatric death audit in a large referral hospital in Malawi.

    PubMed

    Fitzgerald, Elizabeth; Mlotha-Mitole, Rachel; Ciccone, Emily J; Tilly, Alyssa E; Montijo, Jennie M; Lang, Hans-Joerg; Eckerle, Michelle

    2018-02-21

    Death audits have been used to describe pediatric mortality in under-resourced settings, where record keeping is often a challenge. This information provides the foundation for quality improvement initiatives. Malawi, located in sub-Saharan Africa, currently has an under-5 mortality rate of 64/1000. Kamuzu Central Hospital, in the capital city Lilongwe, is a busy government referral hospital that admits up to 3000 children per month. A study published in 2013 reported mortality rates as high as 9%. This is the first known audit of pediatric death files conducted at this hospital. A retrospective chart review of all pediatric deaths that occurred at Kamuzu Central Hospital (excluding deaths in the neonatal nursery) during a 13-month period was done using a standardized death audit form. A descriptive analysis was completed, including patient demographics, HIV and nutritional status, and cause of death. Modifiable factors were identified that may have contributed to mortality, including a lack of vital sign collection, poor documentation, and delays in the procurement or results of tests, studies, and specialist review. Seven hundred forty-three total pediatric deaths were recorded and 700 deceased patient files were reviewed. The mortality rate by month ranged from a low of 2.2% to a high of 4.4%. Forty-four percent of deaths occurred within the first 24 h of admission, and 59% occurred within the first 48 h. The most common causes of death were malaria, malnutrition, HIV-related illnesses, and sepsis. The mortality rate for this pediatric referral center has decreased dramatically in the 6 years since the last published mortality data, but remains high. Areas identified for continued development include improved record keeping, improved patient assessment and monitoring, and more timely and reliable provision of testing and treatment. This study demonstrates that in low-resource settings, where reliable record keeping is often difficult, death audits are useful tools to describe the sickest patient population and determine factors possibly contributing to mortality that may be amenable to quality improvement interventions.

  10. 77 FR 43237 - Malcolm Baldrige National Quality Award Panel of Judges

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-24

    ... prominent in the fields of quality, innovation, and performance management and appointed by the Secretary of... Director for Innovation & Industry Services. [FR Doc. 2012-18068 Filed 7-23-12; 8:45 am] BILLING CODE 3510...

  11. Rural hospital information technology implementation for safety and quality improvement: lessons learned.

    PubMed

    Tietze, Mari F; Williams, Josie; Galimbertti, Marisa

    2009-01-01

    This grant involved a hospital collaborative for excellence using information technology over a 3-year period. The project activities focused on the improvement of patient care safety and quality in Southern rural and small community hospitals through the use of technology and education. The technology component of the design involved the implementation of a Web-based business analytic tool that allows hospitals to view data, create reports, and analyze their safety and quality data. Through a preimplementation and postimplementation comparative design, the focus of the implementation team was twofold: to recruit participant hospitals and to implement the technology at each of the 66 hospital sites. Rural hospitals were defined as acute care hospitals located in a county with a population of less than 100 000 or as state-administered Critical Access Hospitals, making the total study population target 188 hospitals. Lessons learned during the information technology implementation at these hospitals reflect the unique culture, financial characteristics, organizational structure, and technology architecture of rural hospitals. Specific steps such as recruitment, information technology assessment, conference calls for project planning, data file extraction and transfer, technology training, use of e-mail, use of telephones, personnel management, and engaging information technology vendors were found to vary greatly among hospitals.

  12. SEGY to ASCII: Conversion and Plotting Program

    USGS Publications Warehouse

    Goldman, Mark R.

    1999-01-01

    This report documents a computer program to convert standard 4-byte, IBM floating point SEGY files to ASCII xyz format. The program then optionally plots the seismic data using the GMT plotting package. The material for this publication is contained in a standard tar file (of99-126.tar) that is uncompressed and 726 K in size. It can be downloaded by any Unix machine. Move the tar file to the directory you wish to use it in, then type 'tar xvf of99-126.tar'. The archive files (and diskette) contain a NOTE file, a README file, a version-history file, source code, a makefile for easy compilation, and an ASCII version of the documentation. The archive files (and diskette) also contain example test files, including a typical SEGY file along with the resulting ASCII xyz and postscript files. Compiling the source code into an executable requires a C++ compiler. The program has been successfully compiled using Gnu's g++ version 2.8.1; use of other compilers may require modifications to the existing source code. The g++ compiler is a free, high-quality C++ compiler and may be downloaded from the ftp site: ftp://ftp.gnu.org/gnu. Plotting the seismic data requires the GMT plotting package, which may be downloaded from the web site: http://www.soest.hawaii.edu/gmt/
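
    The program's C++ source is distributed in the tar file and is not reproduced here; the core conversion it performs, IBM/370 single-precision floats to native floats, can be sketched in Python from the standard definition of the format (sign bit, 7-bit base-16 exponent biased by 64, 24-bit fraction):

        import struct

        def ibm_to_ieee(b: bytes) -> float:
            """Convert one 4-byte, big-endian IBM/370 single-precision float."""
            (word,) = struct.unpack(">I", b)
            sign = -1.0 if word >> 31 else 1.0
            exponent = (word >> 24) & 0x7F          # base-16 exponent, bias 64
            fraction = (word & 0x00FFFFFF) / float(1 << 24)
            if fraction == 0.0:
                return 0.0
            return sign * fraction * 16.0 ** (exponent - 64)

        # 0xC2760000 encodes -118.0: -0.4609375 * 16**2
        print(ibm_to_ieee(bytes.fromhex("C2760000")))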

  13. Comparison between rotary and manual instrumentation in primary teeth.

    PubMed

    Crespo, S; Cortes, O; Garcia, C; Perez, L

    2008-01-01

    The aim of this study was to compare efficiency, in both preparation time and root canal shape, when using Nickel Titanium (Ni-Ti) rotary and K-File hand instrumentation for root canal preparation of single-rooted primary teeth. Sixty single-rooted primary teeth were selected and divided into two equal groups: group (I), 30 teeth instrumented with manual K-files, and group (II), 30 teeth instrumented with Ni-Ti rotary files (ProFile 0.04). Instrumentation times were recorded and root canal impressions were taken with light-bodied silicone in order to evaluate the shape. The data were analyzed with the SPSS program using the t-test and the chi-square test to compare means. The preparation time in group (I) (K-files) was significantly higher than in group (II) (ProFile 0.04 rotary files) (p=.005). The ProFile system showed a significantly more favorable canal taper when compared with the K-file system (p=.002). The use of rotary files in primary teeth has several advantages over manual K-files in both preparation time and root canal shape: 1) a decreased working time, which helps maintain patient cooperation by diminishing the potential for tiredness; and 2) a more conical root canal shape, favoring a higher quality root canal filling and increasing clinical success.
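
    As a hedged illustration of the statistical comparison described (the timing data below are hypothetical, not the study's measurements), the two-sample t-test could be run as:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        manual_k_files = rng.normal(10.0, 2.0, 30)   # group I times, minutes (made up)
        rotary_profile = rng.normal(6.0, 1.5, 30)    # group II times, minutes (made up)
        t, p = stats.ttest_ind(manual_k_files, rotary_profile)
        print(f"t = {t:.2f}, p = {p:.4f}")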

  14. Plan for DoD Wide Demonstrations of a DoD Improved Interactive Electronic Technical Manual (IETM) Architecture

    DTIC Science & Technology

    1998-07-01

    all the MS Word files into FrameMaker + SGML format and use the FrameMaker application to SGML tag all of the data in accordance with the Army TM...Document Type Definitions (DTDs) in MIL-STD-2361. The edited SGML tagged files are saved as PDF files for delivery to the field. The FrameMaker ...as TIFF files and being imported into FrameMaker prior to saving the TMs as PDF files. Since the hardware to be used by the AN/PPS-5 technician is

  15. Functional evaluation of telemedicine with super high definition images and B-ISDN.

    PubMed

    Takeda, H; Matsumura, Y; Okada, T; Kuwata, S; Komori, M; Takahashi, T; Minatom, K; Hashimoto, T; Wada, M; Fujio, Y

    1998-01-01

    In order to determine whether a super high definition (SHD) image running at 2048 x 2048 resolution at 60 frames/sec was suitable for telemedicine, we established a filing system for medical images, and two experiments on transmission of high quality images were performed. All images of various types produced from one case of ischemic heart disease were digitized and registered into the filing system. Images consisted of plain chest x-ray, electrocardiogram, ultrasound cardiogram, cardiac scintigram, coronary angiogram, left ventriculogram and so on. All images were animated and totaled 243. We prepared a graphical user interface (GUI) for image retrieval based on medical events and modalities. Twenty-one cardiac specialists evaluated the quality of the SHD images as somewhat poor compared with the original pictures but sufficient for making diagnoses, and effective as a tool for teaching and case study purposes. The system's ability to display several animated images simultaneously was deemed especially effective for comprehension of the diagnosis. Efficient input methods and the capacity to file all produced images remain future issues. Using a B-ISDN network, the SHD file was prefetched to servers at Kyoto University Hospital and the BBCC (Broadband ISDN Business chance & Culture Creation) laboratory as a telemedicine experiment. A simultaneous video conference system, control of image retrieval and a pointing function made the teleconference successful in terms of high quality of medical images, quick response time and interactive data exchange.

  16. [Quality of data on early neonatal deaths].

    PubMed

    Pedrosa, Linda Délia Carvalho de Oliveira; Sarinho, Silvia Wanick; Ximenes, Ricardo Arraes de Alencar; Ordonha, Manoelina R

    2007-01-01

    To investigate the quality of official neonatal death data in Maceió, Alagoas, a descriptive study was conducted on early neonatal deaths in hospitals between January 1, 2001, and December 31, 2002, comparing data entered in the Death Certificate (DC) and Mortality Information System (MIS) with a standardized form filled out with data from the medical files of the mothers and newborns. The frequency with which the following variables failed to be recorded in the DC and MIS was studied: type of death, address, age of mother, gender, birth weight, delivery type, age at death and gestational age. MIS reliability was verified using simple concordance, sensitivity and the Kappa coefficient. MIS recorded 451 deaths, of which 50 were excluded. Mother's age was omitted from MIS in 44.1% of cases. 85.7% to 100% of the variables not filled in on the DC were recovered from the medical files. There was good concordance between DC and medical files for type of delivery, weight and age. Birth weight and age of mother showed the least concordance between medical files and MIS. MIS presented 69.2% sensitivity for weight and 36.3% for age of mother, demonstrating little capability to correctly supply information for generating perinatal health indicators. Because of incomplete completion, the quality of the DC is precarious, making the MIS inadequate even though it covers 100% of neonatal deaths in Maceió. The system's inefficiency is increased by the failure of MIS technicians to correct errors found and to input all the information available.
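
    The chance-corrected agreement statistic used here can be illustrated with a small Python sketch of Cohen's kappa on hypothetical record-agreement data (not the study's data):

        import numpy as np

        def cohens_kappa(a, b):
            """Chance-corrected agreement between two categorical records."""
            a, b = np.asarray(a), np.asarray(b)
            po = np.mean(a == b)                        # observed agreement
            pe = sum(np.mean(a == c) * np.mean(b == c)  # agreement expected by chance
                     for c in np.union1d(a, b))
            return (po - pe) / (1.0 - pe)

        # Hypothetical: birth weight correctly recorded (1) or not (0), DC vs. MIS
        dc  = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
        mis = [1, 1, 0, 0, 0, 1, 1, 1, 1, 1]
        print("kappa =", round(cohens_kappa(dc, mis), 2))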

  17. Tools to Ease Your Internet Adventures: Part I.

    ERIC Educational Resources Information Center

    Descy, Don E.

    1993-01-01

    This first of a two-part series highlights three tools that improve accessibility to Internet resources: (1) Alex, a database that accesses files in FTP (file transfer protocol) sites; (2) Archie, software that searches for file names with a user's search term; and (3) Gopher, a menu-driven program to access Internet sites. (LRW)

  18. The Use of Internet FAQs (Frequently Asked Questions) and Files as Cost-Effective Supplements to Textbooks and Substitutions for Photocopies.

    ERIC Educational Resources Information Center

    MacFarland, Thomas W.; Yates, Jan M.

    Gaining access to current and high-quality curriculum resource materials has become more difficult due to escalation in the prices of textbooks and in rigid interpretations of copyright laws which limit photocopying. Internet Frequently Asked Questions (FAQs) and files may offer a partial solution. Originally developed for the benefit of Usenet…

  19. Clinical validation of different echocardiographic Motion Pictures Expert Group-4 algorithms and compression levels for telemedicine.

    PubMed

    Barbier, Paolo; Alimento, Marina; Berna, Giovanni; Cavoretto, Dario; Celeste, Fabrizio; Muratori, Manuela; Guazzi, Maurizio D

    2004-01-01

    Tele-echocardiography is not widely used because of lengthy transmission times when using standard Motion Pictures Expert Group (MPEG)-2 lossy compression algorithms, unless expensive high bandwidth lines are used. We sought to validate the newer MPEG-4 algorithms to allow further reduction in echocardiographic motion video file size. Four cardiologists expert in echocardiography blindly read 165 randomized uncompressed and compressed 2D and color Doppler normal and pathologic motion images. One Digital Video and 3 MPEG-4 compression algorithms were tested, the latter at 3 decreasing compression quality levels (100%, 65% and 40%). Mean diagnostic and image quality scores were computed for each file and compared across the 3 compression levels using uncompressed files as controls. File sizes decreased from an uncompressed range of 12-83 MB to an MPEG-4 range of 0.03-2.3 MB. All algorithms showed mean scores that were not significantly different from the uncompressed source, except the MPEG-4 DivX algorithm at the highest selected compression (40%, p=.002). These data support the use of MPEG-4 compression to reduce echocardiographic motion image size for transmission purposes, allowing cost reduction through use of low bandwidth lines.

  20. KungFQ: a simple and powerful approach to compress fastq files.

    PubMed

    Grassi, Elena; Di Gregorio, Federico; Molineris, Ivan

    2012-01-01

    Nowadays, storing data derived from deep sequencing experiments has become pivotal, and standard compression algorithms do not exploit their structure in a satisfying manner. A number of reference-based compression algorithms have been developed, but they are less adequate when approaching new species without fully sequenced genomes or nongenomic data. We developed a tool that takes advantage of fastq characteristics and encodes them in a binary format optimized to be further compressed with standard tools (such as gzip or lzma). The algorithm is straightforward and does not need any external reference file; it scans the fastq only once and has a constant memory requirement. Moreover, we added the possibility to perform lossy compression, losing some of the original information (IDs and/or qualities) but resulting in smaller files; it is also possible to define a quality cutoff under which corresponding base calls are converted to N. We achieve compression ratios of 2.82 to 7.77 on various fastq files without losing information, and 5.37 to 8.77 when losing IDs, which are often not used in common analysis pipelines. In this paper, we compare the algorithm's performance with known tools, usually obtaining higher compression levels.
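
    The quality-cutoff option described can be sketched in a few lines of Python; this is an illustration of the idea, not KungFQ's actual encoder:

        def mask_low_quality(record, cutoff=20, offset=33):
            """Convert base calls whose Phred quality is below `cutoff` to N.
            `record` is a (header, sequence, plus, quality) FASTQ tuple."""
            header, seq, plus, qual = record
            masked = "".join(
                base if ord(q) - offset >= cutoff else "N"
                for base, q in zip(seq, qual)
            )
            return header, masked, plus, qual

        rec = ("@read1", "ACGTACGT", "+", "IIII##II")  # '#' is Phred 2 at offset 33
        print(mask_low_quality(rec)[1])                # -> ACGTNNGT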

  1. Molray--a web interface between O and the POV-Ray ray tracer.

    PubMed

    Harris, M; Jones, T A

    2001-08-01

    A publicly available web-based interface is presented for producing high-quality ray-traced images and movies from the molecular-modelling program O [Jones et al. (1991), Acta Cryst. A47, 110-119]. The interface allows the user to select O-plot files and set parameters to create standard input files for the popular ray-tracing renderer POV-Ray, which can then produce publication-quality still images or simple movies. To ensure ease of use, we have made this service available to the O user community via the World Wide Web. The public Molray server is available at http://xray.bmc.uu.se/molray.

  2. Hydrologic data from urban watersheds in the Tampa Bay area, Florida

    USGS Publications Warehouse

    Lopez, Miguel A.; Michaelis, D.M.

    1979-01-01

    Hydrologic data are being collected in 10 urbanized watersheds located in the Tampa Bay area, Florida. The gaged watersheds have impervious areas that range from 19 percent for a residential watershed in north Tampa to nearly 100 percent for a downtown Tampa watershed. Land-use types, including roads, residential, commercial, industrial, institutional, recreational, and open space, have been determined for each watershed. Rainfall and storm runoff data collected since 1971 for one site and since 1975 for six other sites through September 1976, have been processed. These data are recorded at 5-minute intervals and are stored in the U.S. Geological Survey WATSTORE unit values file. Daily rainfall at 12 sites and daily pan evaporation at one site have been stored in the WATSTORE daily values file. Chemical and biological analyses of storm runoff for six sites, base flow for seven sites, and analyses of bottom material for seven sites are also stored in the WATSTORE water-quality files. Rainfall and storm runoff for selected storms, daily rainfall, and daily pan-evaporation data are summarized in this report. Water-quality analyses of all water-quality samples also are listed. (Woodard-USGS).

  3. Annotation of phenotypic diversity: decoupling data curation and ontology curation using Phenex.

    PubMed

    Balhoff, James P; Dahdul, Wasila M; Dececchi, T Alexander; Lapp, Hilmar; Mabee, Paula M; Vision, Todd J

    2014-01-01

    Phenex (http://phenex.phenoscape.org/) is a desktop application for semantically annotating the phenotypic character matrix datasets common in evolutionary biology. Since its initial publication, we have added new features that address several major bottlenecks in the efficiency of the phenotype curation process: allowing curators during the data curation phase to provisionally request terms that are not yet available from a relevant ontology; supporting quality control against annotation guidelines to reduce later manual review and revision; and enabling the sharing of files for collaboration among curators. We decoupled data annotation from ontology development by creating an Ontology Request Broker (ORB) within Phenex. Curators can use the ORB to request a provisional term for use in data annotation; the provisional term can be automatically replaced with a permanent identifier once the term is added to an ontology. We added a set of annotation consistency checks to prevent common curation errors, reducing the need for later correction. We facilitated collaborative editing by improving the reliability of Phenex when used with online folder sharing services, via file change monitoring and continual autosave. With the addition of these new features, and in particular the Ontology Request Broker, Phenex users have been able to focus more effectively on data annotation. Phenoscape curators using Phenex have reported a smoother annotation workflow, with much reduced interruptions from ontology maintenance and file management issues.

  4. Annual Quality Assurance Conference Files by Tom Mancuso

    EPA Pesticide Factsheets

    25th Annual Quality Assurance Conference. Abstract: Learn about the NEW EPA Method 325b for Refinery Fence Line Monitoring and TO-17 Extended for Soil Gas by Tom Mancuso and Abstract: Success Using Alternate Carrier Gases for Volatile Methods

  5. 48 CFR 237.172 - Service Contracts Surveillance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Surveillance. 237.172 Section 237.172 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS...-General 237.172 Service Contracts Surveillance. Ensure that quality assurance surveillance plans are....) Retain quality assurance surveillance plans in the official contract file. See https://sam.dau.mil, Step...

  6. 48 CFR 237.172 - Service Contracts Surveillance.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Surveillance. 237.172 Section 237.172 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS...-General 237.172 Service Contracts Surveillance. Ensure that quality assurance surveillance plans are....) Retain quality assurance surveillance plans in the official contract file. See https://sam.dau.mil, Step...

  7. 48 CFR 237.172 - Service Contracts Surveillance.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Surveillance. 237.172 Section 237.172 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS...-General 237.172 Service Contracts Surveillance. Ensure that quality assurance surveillance plans are....) Retain quality assurance surveillance plans in the official contract file. See https://sam.dau.mil, Step...

  8. 48 CFR 237.172 - Service Contracts Surveillance.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Surveillance. 237.172 Section 237.172 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS...-General 237.172 Service Contracts Surveillance. Ensure that quality assurance surveillance plans are....) Retain quality assurance surveillance plans in the official contract file. See https://sam.dau.mil, Step...

  9. 48 CFR 237.172 - Service Contracts Surveillance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Surveillance. 237.172 Section 237.172 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS...-General 237.172 Service Contracts Surveillance. Ensure that quality assurance surveillance plans are....) Retain quality assurance surveillance plans in the official contract file. See https://sam.dau.mil, Step...

  10. Users manual for a one-dimensional Lagrangian transport model

    USGS Publications Warehouse

    Schoellhamer, D.H.; Jobson, H.E.

    1986-01-01

    A Users Manual for the Lagrangian Transport Model (LTM) is presented. The LTM uses Lagrangian calculations that are based on a reference frame moving with the river flow. The Lagrangian reference frame eliminates the need to numerically solve the convective term of the convection-diffusion equation and provides significant numerical advantages over the more commonly used Eulerian reference frame. When properly applied, the LTM can simulate riverine transport and decay processes within the accuracy required by most water quality studies. The LTM is applicable to steady or unsteady one-dimensional unidirectional flows in fixed channels with tributary and lateral inflows. Application of the LTM is relatively simple, and optional capabilities improve the model's convenience. Appendices give file formats and three example LTM applications that include the incorporation of the QUAL II water quality model's reaction kinetics into the LTM. (Author's abstract)
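
    The numerical advantage described follows from a standard change of variables, sketched here for illustration (the first-order decay rate k below is a generic stand-in, not the manual's exact QUAL II kinetics). In the Eulerian frame the one-dimensional convection-diffusion-decay equation is

        \frac{\partial c}{\partial t} + u\,\frac{\partial c}{\partial x}
            = D\,\frac{\partial^2 c}{\partial x^2} - kc ,

    whereas in a coordinate moving with the flow, \xi = x - \int_0^t u\,dt', the convective term vanishes:

        \frac{dc}{dt} = D\,\frac{\partial^2 c}{\partial \xi^2} - kc .

    Only diffusion and decay remain to be solved numerically along each moving parcel, which is the advantage claimed for the Lagrangian reference frame.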

  11. 77 FR 9717 - Self-Regulatory Organizations; C2 Options Exchange, Incorporated; Notice of Filing of a Proposed...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-17

    ... opportunities for price improvement. B. Self-Regulatory Organization's Statement on Burden on Competition C2... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-66384; File No. SR-C2-2012-006] Self-Regulatory... Automated Improvement Mechanism February 13, 2012. Pursuant to Section 19(b)(1) of the Securities Exchange...

  12. 78 FR 17249 - Self-Regulatory Organizations; BOX Options Exchange LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-20

    ... PIP Order which may result in greater opportunity for price improvement for customers. B. Self... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-69135; File No. SR-BOX-2013-11] Self-Regulatory... Amend the BOX Price Improvement Period (``PIP'') Rule 7150 March 14, 2013. Pursuant to Section 19(b)(1...

  13. Parallel file system with metadata distributed across partitioned key-value store

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary; Torres, Aaron

    2017-09-19

    Improved techniques are provided for storing metadata associated with a plurality of sub-files associated with a single shared file in a parallel file system. The shared file is generated by a plurality of applications executing on a plurality of compute nodes. A compute node implements a Parallel Log Structured File System (PLFS) library to store at least one portion of the shared file generated by an application executing on the compute node and metadata for the at least one portion of the shared file on one or more object storage servers. The compute node is also configured to implement a partitioned data store for storing a partition of the metadata for the shared file, wherein the partitioned data store communicates with partitioned data stores on other compute nodes using a message passing interface. The partitioned data store can be implemented, for example, using Multidimensional Data Hashing Indexing Middleware (MDHIM).
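
    A toy Python sketch of the hash-partitioned metadata idea follows; plain dictionaries stand in for the MDHIM partitions, and the MPI message passing between compute nodes is omitted (all names and extents are hypothetical):

        import hashlib

        class PartitionedMetadataStore:
            """Route each sub-file's metadata record to one of N partitions
            by hashing its key; each partition models one node's data store."""
            def __init__(self, n_partitions=4):
                self.partitions = [dict() for _ in range(n_partitions)]

            def _route(self, key: str) -> int:
                digest = hashlib.md5(key.encode()).hexdigest()
                return int(digest, 16) % len(self.partitions)

            def put(self, key, value):
                self.partitions[self._route(key)][key] = value

            def get(self, key):
                return self.partitions[self._route(key)].get(key)

        store = PartitionedMetadataStore()
        # Hypothetical sub-file extents: (logical offset, length, object server)
        store.put("shared.dat/rank0", (0, 1048576, "oss1"))
        store.put("shared.dat/rank1", (1048576, 1048576, "oss2"))
        print(store.get("shared.dat/rank1"))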

  14. Data File Standard for Flow Cytometry, version FCS 3.1.

    PubMed

    Spidlen, Josef; Moore, Wayne; Parks, David; Goldberg, Michael; Bray, Chris; Bierre, Pierre; Gorombey, Peter; Hyun, Bill; Hubbard, Mark; Lange, Simon; Lefebvre, Ray; Leif, Robert; Novo, David; Ostruszka, Leo; Treister, Adam; Wood, James; Murphy, Robert F; Roederer, Mario; Sudar, Damir; Zigon, Robert; Brinkman, Ryan R

    2010-01-01

    The flow cytometry data file standard provides the specifications needed to completely describe flow cytometry data sets within the confines of the file containing the experimental data. In 1984, the first Flow Cytometry Standard format for data files was adopted as FCS 1.0. This standard was modified in 1990 as FCS 2.0 and again in 1997 as FCS 3.0. We report here on the next generation flow cytometry standard data file format. FCS 3.1 is a minor revision based on suggested improvements from the community. The unchanged goal of the standard is to provide a uniform file format that allows files created by one type of acquisition hardware and software to be analyzed by any other type. The FCS 3.1 standard retains the basic FCS file structure and most features of previous versions of the standard. Changes included in FCS 3.1 address potential ambiguities in the previous versions and provide a more robust standard. The major changes include simplified support for international characters and improved support for storing compensation. The major additions are support for preferred display scale, a standardized way of capturing the sample volume, information about originality of the data file, and support for plate and well identification in high throughput, plate based experiments. Please see the normative version of the FCS 3.1 specification in Supporting Information for this manuscript (or at http://www.isac-net.org/ in the Current standards section) for a complete list of changes.

  15. Data File Standard for Flow Cytometry, Version FCS 3.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spidlen, Josef; Moore, Wayne; Parks, David

    2009-11-10

    The flow cytometry data file standard provides the specifications needed to completely describe flow cytometry data sets within the confines of the file containing the experimental data. In 1984, the first Flow Cytometry Standard format for data files was adopted as FCS 1.0. This standard was modified in 1990 as FCS 2.0 and again in 1997 as FCS 3.0. We report here on the next generation flow cytometry standard data file format. FCS 3.1 is a minor revision based on suggested improvements from the community. The unchanged goal of the standard is to provide a uniform file format that allows files created by one type of acquisition hardware and software to be analyzed by any other type. The FCS 3.1 standard retains the basic FCS file structure and most features of previous versions of the standard. Changes included in FCS 3.1 address potential ambiguities in the previous versions and provide a more robust standard. The major changes include simplified support for international characters and improved support for storing compensation. The major additions are support for preferred display scale, a standardized way of capturing the sample volume, information about originality of the data file, and support for plate and well identification in high throughput, plate based experiments. Please see the normative version of the FCS 3.1 specification in Supporting Information for this manuscript (or at http://www.isac-net.org/ in the Current standards section) for a complete list of changes.

  16. The Surface Ocean CO2 Atlas: Stewarding Underway Carbon Data from Collection to Archival

    NASA Astrophysics Data System (ADS)

    O'Brien, K.; Smith, K. M.; Pfeil, B.; Landa, C.; Bakker, D. C. E.; Olsen, A.; Jones, S.; Shrestha, B.; Kozyr, A.; Manke, A. B.; Schweitzer, R.; Burger, E. F.

    2016-02-01

    The Surface Ocean CO2 Atlas (SOCAT, www.socat.info) is a quality controlled, global surface ocean carbon dioxide (CO2) data set gathered on research vessels, SOOP and buoys. To the degree feasible SOCAT is comprehensive; it draws together and applies uniform QC procedures to all such observations made across the international community. The first version of SOCAT (version 1.5) was publicly released in September 2011 (Bakker et al., 2011) with 6.3 million observations. This was followed by the release of SOCAT version 2, expanded to over 10 million observations, in June 2013 (Bakker et al., 2013). Most recently, in September 2015, SOCAT version 3 was released, containing over 14 million observations spanning almost 60 years. The process of assembling, QC'ing and publishing V1.5 and V2 of SOCAT required an unsustainable level of manual effort. To ease the burden on data managers and data providers, the SOCAT community agreed to embark on an automated data ingestion process to create a streamlined workflow that improves data stewardship from ingestion to quality control and from publishing to archival. To that end, for version 3 and beyond, the SOCAT automation team created a framework based upon standards and conventions that at the same time allows scientists to work in the data formats they feel most comfortable with (i.e., CSV files). This automated workflow provides several advantages: 1) data ingestion into uniform and standards-based file formats; 2) ease of data integration into a standard quality control system; 3) data ingestion and quality control that can be performed in parallel; and 4) a uniform method of archiving carbon data and generating digital object identifiers (DOIs). In this presentation, we will discuss and demonstrate the SOCAT data ingestion dashboard and the quality control system. We will also discuss the standards, conventions, and tools that were leveraged to create a workflow that allows scientists to work in their own formats, yet provides a framework for creating high quality data products on an annual basis, while meeting or exceeding data requirements for access, documentation and archival.

  17. NALNET book system: Cost benefit study

    NASA Technical Reports Server (NTRS)

    Dewath, N. V.; Palmour, V. E.; Foley, J. R.; Henderson, M. M.; Shockley, C. W.

    1981-01-01

    The goals of NASA's library network system, NALNET, the functions of the current book system, the products and services of a book system required by NASA Center libraries, and the characteristics of a system that would best supply those products and services were assessed. Emphasis was placed on determining the most cost-effective means of meeting NASA's requirements for an automated book system. Various operating modes were examined, including the current STIMS file, the PUBFILE, developing software improvements for products as appropriate to the Center needs, and obtaining cataloging and products from the bibliographic utilities, including at least OCLC, RLIN, BNA, and STIF. It is recommended that NALNET operate under the STIMS file mode and obtain cataloging and products from the bibliographic utilities. The recommendations are based on the premise that, given the current state of the art in library automation, it is not cost-effective for NASA to maintain a full range of cataloging services on its own system. The bibliographic utilities can support higher quality systems with a greater range of services at a lower total cost.

  18. Next Generation Global Navigation Satellite Systems (GNSS) Processing at NASA CDDIS

    NASA Astrophysics Data System (ADS)

    Michael, B. P.; Noll, C. E.

    2016-12-01

    The Crustal Dynamics Data Information System (CDDIS) has been providing access to space geodesy and related data sets since 1982, and in particular, Global Navigation Satellite Systems (GNSS) data and derived products since 1992. The CDDIS became one of the Earth Observing System Data and Information System (EOSDIS) archive centers in 2007. As such, CDDIS has evolved to offer a broad range of data ingest services, from data upload, quality control, documentation, and metadata extraction to ancillary information. With a growing understanding of the needs and goals of its science users, CDDIS continues to improve these services. Due to the importance of GNSS data and derived products in scientific studies over the last decade, CDDIS has seen its ingest volume explode to over 30 million files per year, or more than one file per second, from hundreds of simultaneous data providers. In order to accommodate this increase, streamline operations and fully automate the workflow, CDDIS has recently updated the data submission process and GNSS processing. This poster will cover this new ingest infrastructure, workflow, and the agile techniques applied in its development and current operations.

  19. A Patient Record-Filing System for Family Practice

    PubMed Central

    Levitt, Cheryl

    1988-01-01

    The efficient storage and easy retrieval of quality records are a central concern of good family practice. Many physicians starting out in practice have difficulty choosing a practical and lasting system for storing their records. Some who have established practices are installing computers in their offices and finding that their filing systems are worn, outdated, and incompatible with computerized systems. This article describes a new filing system installed simultaneously with a new computer system in a family-practice teaching centre. The approach adopted solved all identifiable problems and is applicable in family practices of all sizes.

  20. I/O Performance Characterization of Lustre and NASA Applications on Pleiades

    NASA Technical Reports Server (NTRS)

    Saini, Subhash; Rappleye, Jason; Chang, Johnny; Barker, David Peter; Biswas, Rupak; Mehrotra, Piyush

    2012-01-01

    In this paper we study the performance of the Lustre file system using five scientific and engineering applications representative of the NASA workload on large-scale supercomputing systems such as NASA's Pleiades. In order to facilitate the collection of Lustre performance metrics, we have developed a software tool that exports a wide variety of client- and server-side metrics using SGI's Performance Co-Pilot (PCP), and generates a human readable report on key metrics at the end of a batch job. These performance metrics are (a) amount of data read and written, (b) number of files opened and closed, and (c) remote procedure call (RPC) size distribution (4 KB to 1024 KB, in powers of 2) for I/O operations. The RPC size distribution measures the efficiency of the Lustre client and can pinpoint problems such as small write sizes, disk fragmentation, etc. These extracted statistics are useful in determining the I/O pattern of an application and can assist in identifying possible improvements to users' applications. Information on the number of file operations enables a scientist to optimize the I/O performance of their applications. The amount of I/O data helps users choose the optimal stripe size and stripe count to enhance I/O performance. In this paper, we demonstrate the usefulness of this tool on Pleiades for five production-quality NASA scientific and engineering applications. We compare the latency of read and write operations under Lustre to that with NFS by tracing system calls and signals. We also investigate the read and write policies and study the effect of page cache size on I/O operations. We examine the performance impact of Lustre stripe size and stripe count, along with a performance evaluation of file-per-process and single shared file access by all processes, for the NASA workload using the parameterized IOR benchmark.
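
    The RPC size distribution described reduces to bucketing request sizes into power-of-two bins; a minimal Python sketch (hypothetical request sizes, not output of the PCP-based tool):

        from collections import Counter

        def rpc_size_histogram(sizes_bytes):
            """Bucket I/O request sizes into power-of-two bins, 4 KB to 1024 KB."""
            bins = [4 * 1024 << i for i in range(9)]   # 4 KB ... 1024 KB
            counts = Counter()
            for s in sizes_bytes:
                for b in bins:                          # smallest bin that fits
                    if s <= b:
                        counts[b] += 1
                        break
                else:                                   # larger than 1024 KB
                    counts[bins[-1]] += 1
            return {f"{b // 1024} KB": counts.get(b, 0) for b in bins}

        sizes = [4096, 8000, 65536, 1048576, 300000, 4096]   # hypothetical trace
        print(rpc_size_histogram(sizes))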

  1. Data Science Bowl Launched to Improve Lung Cancer Screening | Division of Cancer Prevention

    Cancer.gov

    [[{"fid":"2078","view_mode":"default","fields":{"format":"default","field_file_image_alt_text[und][0][value]":"Data Science Bowl Logo","field_file_image_title_text[und][0][value]":"Data Science Bowl Logo","field_folder[und]":"76"},"type":"media","field_deltas":{"1":{"format":"default","field_file_image_alt_text[und][0][value]":"Data Science Bowl

  2. 75 FR 76503 - Self-Regulatory Organizations; NASDAQ OMX BX, Inc.; Notice of Filing of Proposed Rule Change...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-08

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-63416; File No. SR-BX-2010-083] Self-Regulatory Organizations; NASDAQ OMX BX, Inc.; Notice of Filing of Proposed Rule Change Relating to The Price Improvement... Items I and II below, which Items have been prepared by the self-regulatory organization. The Commission...

  3. 75 FR 36458 - Self-Regulatory Organizations; International Securities Exchange, LLC; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-25

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-62316; File No. SR-ISE-2010-15] Self-Regulatory... Rule Change, as Modified by Amendment Nos. 1 and 2, Related to the Price Improvement Mechanism June 17.... 2 replaces and supersedes the original filing and Amendment No. 1 thereto in their entirety. I. Self...

  4. Optimizing Input/Output Using Adaptive File System Policies

    NASA Technical Reports Server (NTRS)

    Madhyastha, Tara M.; Elford, Christopher L.; Reed, Daniel A.

    1996-01-01

    Parallel input/output characterization studies and experiments with flexible resource management algorithms indicate that adaptivity is crucial to file system performance. In this paper we propose an automatic technique for selecting and refining file system policies based on application access patterns and execution environment. An automatic classification framework allows the file system to select appropriate caching and pre-fetching policies, while performance sensors provide feedback used to tune policy parameters for specific system environments. To illustrate the potential performance improvements possible using adaptive file system policies, we present results from experiments involving classification-based and performance-based steering.
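
    A minimal Python sketch of classification-based policy selection, with a toy stride classifier and a made-up policy table rather than the paper's actual framework:

        def classify_access_pattern(offsets, request_size):
            """Label a window of recent file offsets by the strides between them."""
            strides = {b - a for a, b in zip(offsets, offsets[1:])}
            if strides == {request_size}:
                return "sequential"
            if len(strides) == 1:
                return "strided"
            return "irregular"

        POLICIES = {  # hypothetical policy table
            "sequential": {"prefetch": "read-ahead",      "cache": "drop-behind"},
            "strided":    {"prefetch": "stride-prefetch", "cache": "LRU"},
            "irregular":  {"prefetch": "none",            "cache": "LRU"},
        }

        recent_offsets = [0, 65536, 131072, 196608]   # hypothetical trace window
        pattern = classify_access_pattern(recent_offsets, 65536)
        print(pattern, "->", POLICIES[pattern])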

  5. Constructing Space-Time Views from Fixed Size Statistical Data: Getting the Best of both Worlds

    NASA Technical Reports Server (NTRS)

    Schmidt, Melisa; Yan, Jerry C.

    1997-01-01

    Many performance monitoring tools are currently available to the super-computing community. The performance data gathered and analyzed by these tools fall under two categories: statistics and event traces. Statistical data is much more compact but lacks the probative power event traces offer. Event traces, on the other hand, can easily fill up the entire file system during execution, such that the instrumented execution may have to be terminated half way through. In this paper, we propose an innovative methodology for performance data gathering and representation that offers a middle ground. The user can trade off tracing overhead and trace data size against data quality incrementally. In other words, the user will be able to limit the amount of trace collected and, at the same time, carry out some of the analysis event traces offer using space-time views for the entire execution. Two basic ideas are employed: the use of averages to replace recording data for each instance and formulae to represent sequences associated with communication and control flow. With the help of a few simple examples, we illustrate the use of these techniques in performance tuning and compare the quality of the traces we collected vs. event traces. We found that the trace files thus obtained are indeed small, bounded and predictable before program execution, and that the quality of the space-time views generated from these statistical data is excellent. Furthermore, experimental results showed that the formulae proposed were able to capture 100% of all the sequences associated with 11 of the 15 applications tested. The performance of the formulae can be incrementally improved by allocating more memory at run-time to learn longer sequences.
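
    The first idea, replacing a record per instance with an average, is essentially an online accumulator; a sketch using Welford's mean/variance algorithm with hypothetical timings:

        class RunningStat:
            """Constant-memory replacement for per-instance trace records
            (Welford's online mean and variance)."""
            def __init__(self):
                self.n, self.mean, self.m2 = 0, 0.0, 0.0

            def add(self, x):
                self.n += 1
                delta = x - self.mean
                self.mean += delta / self.n
                self.m2 += delta * (x - self.mean)

            @property
            def variance(self):
                return self.m2 / (self.n - 1) if self.n > 1 else 0.0

        # One small object summarizes what would otherwise be one trace
        # record per message (times in microseconds, made up).
        stat = RunningStat()
        for t in (12.0, 15.5, 11.2, 13.8):
            stat.add(t)
        print(f"n={stat.n} mean={stat.mean:.2f} var={stat.variance:.2f}")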

  6. Constructing Space-Time Views from Fixed Size Statistical Data: Getting the Best of Both Worlds

    NASA Technical Reports Server (NTRS)

    Schmidt, Melisa; Yan, Jerry C.; Bailey, David (Technical Monitor)

    1996-01-01

    Many performance monitoring tools are currently available to the super-computing community. The performance data gathered and analyzed by these tools fall under two categories: statistics and event traces. Statistical data is much more compact but lacks the probative power event traces offer. Event traces, on the other hand, can easily fill up the entire file system during execution, such that the instrumented execution may have to be terminated half way through. In this paper, we propose an innovative methodology for performance data gathering and representation that offers a middle ground. The user can trade off tracing overhead and trace data size against data quality incrementally. In other words, the user will be able to limit the amount of trace collected and, at the same time, carry out some of the analysis event traces offer using space-time views for the entire execution. Two basic ideas are employed: the use of averages to replace recording data for each instance and "formulae" to represent sequences associated with communication and control flow. With the help of a few simple examples, we illustrate the use of these techniques in performance tuning and compare the quality of the traces we collected vs. event traces. We found that the trace files thus obtained are indeed small, bounded and predictable before program execution, and that the quality of the space-time views generated from these statistical data is excellent. Furthermore, experimental results showed that the formulae proposed were able to capture 100% of all the sequences associated with 11 of the 15 applications tested. The performance of the formulae can be incrementally improved by allocating more memory at run-time to learn longer sequences.

  7. The expected results method for data verification

    NASA Astrophysics Data System (ADS)

    Monday, Paul

    2016-05-01

    The credibility of United States Army analytical experiments using distributed simulation depends on the quality of the simulation, the pedigree of the input data, and the appropriateness of the simulation system to the problem. The second of these factors is best met by using classified performance data from the Army Materiel Systems Analysis Activity (AMSAA) for essential battlefield behaviors, like sensors, weapon fire, and damage assessment. Until recently, using classified data has been a time-consuming and expensive endeavor: it requires significant technical expertise to load, and it is difficult to verify that it works correctly. Fortunately, new capabilities, tools, and processes are available that greatly reduce these costs. This paper will discuss these developments, a new method to verify that all of the components are configured and operate properly, and the application to recent Army Capabilities Integration Center (ARCIC) experiments. Recent developments have focused on improving the process to load the data. OneSAF has redesigned their input data file formats and structures so that they correspond exactly with the Standard File Format (SFF) defined by AMSAA, ARCIC developed a library of supporting configurations that correlate directly to the AMSAA nomenclature, and the Entity Validation Tool was designed to quickly execute the essential models with a test-jig approach to identify problems with the loaded data. The missing part of the process is provided by the new Expected Results Method. Instead of the usual subjective assessment of quality, e.g., "It looks about right to me", this new approach compares the performance of a combat model with authoritative expectations to quickly verify that the model, data, and simulation are all working correctly. Integrated together, these developments now make it possible to use AMSAA classified performance data with minimal time and maximum assurance that the experiment's analytical results will be of the highest quality possible.

  8. Guidelines and standard procedures for studies of ground-water quality; selection and installation of wells, and supporting documentation

    USGS Publications Warehouse

    Lapham, W.W.; Wilde, F.D.; Koterba, M.T.

    1997-01-01

    This is the first of a two-part report to document guidelines and standard procedures of the U.S. Geological Survey for the acquisition of data in ground-water-quality studies. This report provides guidelines and procedures for the selection and installation of wells for water-quality studies, and the required or recommended supporting documentation of these activities. Topics include (1) documentation needed for well files, field folders, and electronic files; (2) criteria and information needed for the selection of water-supply and observation wells, including site inventory and data collection during field reconnaissance; and (3) criteria and preparation for installation of monitoring wells, including the effects of equipment and materials on the chemistry of ground-water samples, a summary of drilling and coring methods, and information concerning well completion, development, and disposition.

  9. Continuous-Energy Data Checks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haeck, Wim; Conlin, Jeremy Lloyd; McCartney, Austin Paul

    The purpose of this report is to provide an overview of all Quality Assurance tests that have to be performed on a nuclear data set to be transformed into an ACE formatted nuclear data file. The ACE file is capable of containing different types of data such as continuous energy neutron data, thermal scattering data, etc. Within this report, we will limit ourselves to continuous energy neutron data.

  10. Data Quality Control of the French Permanent Broadband Network in the RESIF Framework

    NASA Astrophysics Data System (ADS)

    Grunberg, Marc; Lambotte, Sophie; Engels, Fabien; Dretzen, Remi; Hernandez, Alain

    2014-05-01

    In the framework of the RESIF (Réseau Sismologique et géodésique Français) project, a new information system is being set up, allowing improved management and distribution of high quality data from the different elements of RESIF and the associated networks. Within this information system, EOST (in Strasbourg) is in charge of collecting real-time permanent broadband seismic waveforms and performing quality control on these data. The real-time and validated data sets are pushed to the French National Distribution Center (Isterre/Grenoble) in order to make them publicly available. Furthermore, EOST hosts the BCSF-ReNaSS, in charge of the French metropolitan seismic bulletin. This allows it to benefit from high-end quality control based on the national and world-wide seismicity. Here we present first the real-time seismic data flow from the stations of the French National Broad Band Network to EOST, and then the data quality control procedures that were recently installed, including some new developments. The data quality control consists of a variety of subprocesses that check the consistency of the whole system and process, from the stations to the data center. This allows us to verify that instruments and data transmission are operating correctly. Moreover, analysis of the ambient noise helps to characterize the intrinsic seismic quality of the stations and to identify other kinds of disturbances. The deployed quality control is a pipeline that starts with low-level procedures: checking the real-time miniseed data files (file naming convention, data integrity), checking for inconsistencies between waveforms and metadata (channel name, sample rate, etc.), and computing waveform statistics (data availability, gaps/overlaps, mean, rms, time quality, spikes). It is followed by high-level procedures such as power spectral density (PSD) computation, STA/LTA computation to be correlated with the seismicity, phase picking, and station magnitude discrepancies. The results of quality control are visualized through a web interface. The latter gathers data from different information systems to provide a global view of recent events that could impact the data (such as interventions on site or seismic events). This work is still an ongoing project. We intend to add more sophisticated procedures to enhance our data quality control; among them, we will deploy a seismic moment tensor inversion tool for amplitude, time and polarity control, and a noise correlation procedure for time drift detection.
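
    The STA/LTA computation mentioned is a classic event detector; a minimal numpy sketch on synthetic data (not RESIF's implementation):

        import numpy as np

        def sta_lta(trace, sta_len, lta_len):
            """Short-term over long-term average energy ratio; window lengths
            are in samples, and windows are aligned at their common end."""
            energy = np.asarray(trace, dtype=float) ** 2
            csum = np.concatenate(([0.0], np.cumsum(energy)))
            sta = (csum[sta_len:] - csum[:-sta_len]) / sta_len
            lta = (csum[lta_len:] - csum[:-lta_len]) / lta_len
            n = min(len(sta), len(lta))
            return sta[-n:] / np.maximum(lta[-n:], 1e-12)

        rng = np.random.default_rng(42)
        trace = rng.normal(0, 1, 2000)
        trace[1200:1260] += 8 * rng.normal(0, 1, 60)   # synthetic "event"
        ratio = sta_lta(trace, sta_len=20, lta_len=400)
        print("max STA/LTA:", ratio.max())             # spikes near the event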

  11. The Medicare Health Outcomes Survey program: overview, context, and near-term prospects.

    PubMed

    Jones, Nathaniel; Jones, Stephanie L; Miller, Nancy A

    2004-07-12

    In 1996, the Centers for Medicare & Medicaid Services (CMS) initiated the Medicare Health Outcomes Survey (HOS). It is the first national survey to measure the quality of life and functional health status of Medicare beneficiaries enrolled in managed care. The program seeks to gather valid and reliable health status data in Medicare managed care for use in quality improvement activities, public reporting, plan accountability and improving health outcomes based on competition. The context that led to the development of the HOS was formed by the convergence of the following factors: 1) a recognized need to monitor the performance of managed care plans, 2) technical expertise and advancement in the areas of quality measurement and health outcomes assessment, 3) the existence of a tested functional health status assessment tool (SF-36), which was valid for an elderly population, 4) CMS leadership, and 5) political interest in quality improvement. Since 1998, there have been six baseline surveys and four follow up surveys. CMS, working with its partners, performs the following tasks as part of the HOS program: 1) Supports the technical/scientific development of the HOS measure, 2) Certifies survey vendors, 3) Collects Health Plan Employer Data and Information Set (HEDIS) HOS data, 4) Cleans, scores, and disseminates annual rounds of HOS data, public use files and reports to CMS, Quality Improvement Organizations (QIOs), Medicare+Choice Organizations (M+COs), and other stakeholders, 5) Trains M+COs and QIOs in the use of functional status measures and best practices for improving care, 6) Provides technical assistance to CMS, QIOs, M+COs and other data users, and 7) Conducts analyses using HOS data to support CMS and HHS priorities. CMS has recently sponsored an evaluation of the HOS program, which will provide the information necessary to enhance the future administration of the program. Information collected to date reveals that the HOS program is a valuable tool that provides a rich set of data that is useful for quality monitoring and improvement efforts. To enhance the future of the HOS program, many stakeholders recommend the implementation of incentives to encourage the use of the data, while others identify the need to monitor the health status of plan disenrollees. Overall, the HOS program represents an important vehicle for collecting outcomes data from Medicare beneficiaries. The new Medicare Prescription Drug, Improvement, and Modernization Act (2003) mandates the collection and use of data for outcomes measurement. Consequently, it is important to improve HOS to most effectively meet the mandate.

  12. Oscillation characteristics of endodontic files: numerical model and its validation.

    PubMed

    Verhaagen, Bram; Lea, Simon C; de Bruin, Gerrit J; van der Sluis, Luc W M; Walmsley, A Damien; Versluis, Michel

    2012-11-01

    During a root canal treatment, an antimicrobial fluid is injected into the root canal to eradicate all bacteria from the root canal system. Agitation of the fluid using an ultrasonically vibrating miniature file results in a significant improvement in the cleaning efficacy over conventional syringe irrigation. Numerical analysis of the oscillation characteristics of the file, modeled as a tapered, driven rod, shows a sinusoidal wave pattern with an increase in amplitude and decrease in wavelength toward the free end of the file. Measurements of the file oscillation with a scanning laser vibrometer show good agreement with the numerical simulation. The numerical model of endodontic file oscillation has the potential for predicting the oscillation pattern and fracture likelihood of various file types and the acoustic streaming they induce during passive ultrasonic irrigation.

  13. Streamlined, Inexpensive 3D Printing of the Brain and Skull.

    PubMed

    Naftulin, Jason S; Kimchi, Eyal Y; Cash, Sydney S

    2015-01-01

    Neuroimaging technologies such as Magnetic Resonance Imaging (MRI) and Computed Tomography (CT) collect three-dimensional (3D) data that is typically viewed on two-dimensional (2D) screens. Actual 3D models, however, allow interaction with real objects such as implantable electrode grids, potentially improving patient-specific neurosurgical planning and personalized clinical education. Desktop 3D printers can now produce relatively inexpensive, good quality prints. We describe our process for reliably generating life-sized 3D brain prints from MRIs and 3D skull prints from CTs. We have integrated a standardized, primarily open-source process for 3D printing brains and skulls. We describe how to convert clinical neuroimaging Digital Imaging and Communications in Medicine (DICOM) images to stereolithography (STL) files, a common 3D object file format that can be sent to 3D printing services. We additionally share how to convert these STL files to machine-instruction gcode files, for reliable in-house printing on desktop, open-source 3D printers. We have successfully printed over 19 patient brain hemispheres from 7 patients on two different open-source desktop 3D printers. Each brain hemisphere costs approximately $3-4 in consumable plastic filament as described, and the total process takes 14-17 hours, almost all of which is unsupervised (preprocessing = 4-6 hr; printing = 9-11 hr; post-processing = <30 min). Printing a matching portion of a skull costs $1-5 in consumable plastic filament and takes less than 14 hr in total. We have developed a streamlined, cost-effective process for 3D printing brain and skull models. We surveyed healthcare providers and patients, who confirmed that rapid-prototype patient-specific 3D models may help interdisciplinary surgical planning and patient education. The methods we describe can be applied for other clinical, research, and educational purposes.
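
    The volume-to-STL step can be sketched in Python, with scikit-image's marching cubes and a hand-rolled ASCII STL writer standing in for the authors' full DICOM pipeline (a synthetic sphere replaces real, segmented imaging data):

        import numpy as np
        from skimage import measure

        def volume_to_stl(volume, threshold, path):
            """Extract an isosurface from a 3-D volume (e.g., a thresholded
            brain mask) and write it as an ASCII STL file."""
            verts, faces, normals, _ = measure.marching_cubes(volume, level=threshold)
            with open(path, "w") as f:
                f.write("solid brain\n")
                for tri in faces:
                    n = normals[tri].mean(axis=0)   # approximate facet normal
                    f.write(f" facet normal {n[0]:e} {n[1]:e} {n[2]:e}\n  outer loop\n")
                    for v in verts[tri]:
                        f.write(f"   vertex {v[0]:e} {v[1]:e} {v[2]:e}\n")
                    f.write("  endloop\n endfacet\n")
                f.write("endsolid brain\n")

        # Synthetic stand-in for a segmented hemisphere: a solid sphere
        x, y, z = np.mgrid[-20:20, -20:20, -20:20]
        volume = (x**2 + y**2 + z**2 < 15**2).astype(float)
        volume_to_stl(volume, threshold=0.5, path="demo.stl")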

  14. Improving the Taiwan Military’s Disaster Relief Response to Typhoons

    DTIC Science & Technology

    2015-06-01

    circulation, are mostly westbound. When they reach the vicinity of Taiwan or the Philippines, which are always at the edge of the Pacific subtropical high... files from the POM base case model, one set for each design point. To automate the process of running all the GAMS files, a Windows batch file (BAT)... is used to call on GAMS to solve each version of the model. The BAT file creates a new directory for each run to hold output, and one of the outputs
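
    The per-design-point automation described here (one fresh directory per run, GAMS invoked on each model file) can be sketched in Python instead of the report's Windows BAT file; it assumes only that the gams executable is on the PATH:

        import shutil
        import subprocess
        from pathlib import Path

        def run_design_points(gms_files, out_root="runs"):
            # One working directory per design point, so outputs never collide.
            for gms in map(Path, gms_files):
                run_dir = Path(out_root) / gms.stem
                run_dir.mkdir(parents=True, exist_ok=True)
                shutil.copy(gms, run_dir)
                # GAMS writes its listing and output files into the run directory.
                subprocess.run(["gams", gms.name], cwd=run_dir, check=True)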

  15. The ProteoRed MIAPE web toolkit: A User-friendly Framework to Connect and Share Proteomics Standards*

    PubMed Central

    Medina-Aunon, J. Alberto; Martínez-Bartolomé, Salvador; López-García, Miguel A.; Salazar, Emilio; Navajas, Rosana; Jones, Andrew R.; Paradela, Alberto; Albar, Juan P.

    2011-01-01

    The development of the HUPO-PSI's (Proteomics Standards Initiative) standard data formats and MIAPE (Minimum Information About a Proteomics Experiment) guidelines should improve proteomics data sharing within the scientific community. Proteomics journals have encouraged the use of these standards and guidelines to improve the quality of experimental reporting and ease the evaluation and publication of manuscripts. However, there is an evident lack of bioinformatics tools specifically designed to create and edit standard file formats and reports, or embed them within proteomics workflows. In this article, we describe a new web-based software suite (The ProteoRed MIAPE web toolkit) that performs several complementary roles related to proteomic data standards. First, it can verify that the reports fulfill the minimum information requirements of the corresponding MIAPE modules, highlighting inconsistencies or missing information. Second, the toolkit can convert several XML-based data standards directly into human readable MIAPE reports stored within the ProteoRed MIAPE repository. Finally, it can also perform the reverse operation, allowing users to export from MIAPE reports into XML files for computational processing, data sharing, or public database submission. The toolkit is thus the first application capable of automatically linking the PSI's MIAPE modules with the corresponding XML data exchange standards, enabling bidirectional conversions. This toolkit is freely available at http://www.proteored.org/MIAPE/. PMID:21983993
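
    The first role, verifying that a report fulfills the minimum-information requirements of a MIAPE module, amounts to checking a document for required elements. A minimal sketch of that idea follows; the element names are hypothetical placeholders, not the actual MIAPE item identifiers:

        import xml.etree.ElementTree as ET

        # Hypothetical required items; each real MIAPE module defines its own.
        REQUIRED = ["contact", "instrument", "sampleProcessing", "dataAnalysis"]

        def check_minimum_information(xml_path):
            root = ET.parse(xml_path).getroot()
            # Drop namespaces so tag matching stays simple in this sketch.
            present = {el.tag.split("}")[-1] for el in root.iter()}
            return [tag for tag in REQUIRED if tag not in present]

        # An empty list means the report passes this (simplified) check:
        # print(check_minimum_information("report.xml"))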

  16. An Audit of Emergency Department Accreditation Based on Joint Commission International Standards (JCI).

    PubMed

    Hashemi, Behrooz; Motamedi, Maryam; Etemad, Mania; Rahmati, Farhad; Forouzanfar, Mohammad Mehdi; Kaghazchi, Fatemeh

    2014-01-01

    Although medical knowledge has been accumulating for thousands of years, organized health care systems are a much more recent development. Accreditation is an effective mechanism for performance evaluation, quality enhancement, and the safety of health care systems. This study was conducted to assess the results of emergency department (ED) accreditation in Shohadaye Tajrish Hospital, Tehran, Iran, in 2013, in terms of Joint Commission International (JCI) standards adapted for Iranian hospitals. This cohort study with a four-month follow-up was conducted in the ED of Shohadaye Tajrish Hospital in 2013. The standard evaluation checklist of Iranian hospitals (based on JCI standards), comprising 24 headings and 337 subheadings, was used for this purpose. Possible causes of weak spots were identified and their solutions considered. After corrective action, the accreditation assessment was repeated. Finally, the results of the two periods were analyzed using SPSS version 20. Quality improvement, admission and patient assessment, staff competency and capability testing, collection and analysis of data, patient education, and facilities scored below 50%. The mean total accreditation score of the ED was 60.4±30.15 percent in the first period and 68.9±22.9 percent in the second (p=0.005). Strategic plans, head of department, head nurse, resident physician, responsible nurse for the shift, and personnel files achieved scores of 100%. Of the headings below 50% in the first period, only two reached more than 50%: collection and analysis of data (growth of 40%) and staff competency and capability testing (growth of 17%). Based on the findings of the present study, the ED of Shohadaye Tajrish Hospital scored below 50% in six headings: quality improvement, admission and patient assessment, staff competency and capability testing, collection and analysis of data, patient education, and facilities. The score for strategic plans, head of department, head nurse, resident physician, responsible nurse for the shift, and personnel files was 100%.

  17. Support the Design of Improved IUE NEWSIPS High Dispersion Extraction Algorithms: Improved IUE High Dispersion Extraction Algorithms

    NASA Technical Reports Server (NTRS)

    Lawton, Pat

    2004-01-01

    The objective of this work was to support the design of improved IUE NEWSIPS high dispersion extraction algorithms. The purpose of this work was to evaluate the use of the Linearized Image (LIHI) file versus the Re-Sampled Image (SIHI) file, to evaluate various extraction methods, and to design algorithms for the evaluation of IUE high dispersion spectra. It was concluded that the use of the Re-Sampled Image (SIHI) file was acceptable. Since the Gaussian profile worked well for the core and the Lorentzian profile worked well for the wings, the Voigt profile was chosen for use in the extraction algorithm. It was found that the gamma and sigma parameters varied significantly across the detector, so gamma and sigma masks for the SWP detector were developed. Extraction code was written.
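
    The reasoning behind the profile choice (Gaussian-like core, Lorentzian-like wings) can be checked numerically with SciPy's Voigt implementation; the sigma and gamma values below are illustrative only, whereas the study derived position-dependent masks for them:

        import numpy as np
        from scipy.special import voigt_profile   # available in SciPy >= 1.4

        x = np.linspace(-5.0, 5.0, 201)
        sigma, gamma = 0.8, 0.4                    # illustrative parameters
        v = voigt_profile(x, sigma, gamma)
        # Near x = 0 the Gaussian core dominates; far from the center the
        # Lorentzian 1/x**2 wings dominate, which is why a single Voigt
        # shape can fit both regions of the extraction profile.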

  18. Measuring Data Quality Through a Source Data Verification Audit in a Clinical Research Setting.

    PubMed

    Houston, Lauren; Probst, Yasmine; Humphries, Allison

    2015-01-01

    Health data has long been scrutinised in relation to data quality and integrity problems. Currently, no internationally accepted or "gold standard" method exists measuring data quality and error rates within datasets. We conducted a source data verification (SDV) audit on a prospective clinical trial dataset. An audit plan was applied to conduct 100% manual verification checks on a 10% random sample of participant files. A quality assurance rule was developed, whereby if >5% of data variables were incorrect a second 10% random sample would be extracted from the trial data set. Error was coded: correct, incorrect (valid or invalid), not recorded or not entered. Audit-1 had a total error of 33% and audit-2 36%. The physiological section was the only audit section to have <5% error. Data not recorded to case report forms had the greatest impact on error calculations. A significant association (p=0.00) was found between audit-1 and audit-2 and whether or not data was deemed correct or incorrect. Our study developed a straightforward method to perform a SDV audit. An audit rule was identified and error coding was implemented. Findings demonstrate that monitoring data quality by a SDV audit can identify data quality and integrity issues within clinical research settings allowing quality improvement to be made. The authors suggest this approach be implemented for future research.
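
    The audit plan reduces to two simple rules: draw a 10% random sample for 100% manual verification, and trigger a second 10% sample when more than 5% of the checked variables are incorrect. A sketch under those rules, with our own function and variable names:

        import random

        def sdv_sample(file_ids, frac=0.10, seed=1):
            # 10% random sample of participant files for full manual checking.
            k = max(1, round(frac * len(file_ids)))
            return random.Random(seed).sample(file_ids, k)

        def needs_second_sample(checked, threshold=0.05):
            # 'checked' maps variable name -> True if verified correct.
            # Rule: >5% incorrect variables triggers a second 10% sample.
            error_rate = sum(not ok for ok in checked.values()) / len(checked)
            return error_rate > threshold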

  19. A novel platform for in vitro analysis of torque, forces, and three-dimensional file displacements during root canal preparations: application to ProTaper rotary files.

    PubMed

    Diop, Amadou; Maurel, Nathalie; Oiknine, Michel; Patoor, Etienne; Machtou, Pierre

    2009-04-01

    We proposed a new testing setup and in vitro experimental procedure allowing the analysis of the forces, torque, and file displacements during the preparation of root canals using nickel-titanium rotary endodontic files. We applied it to the preparation of 20 fresh-frozen cadaveric teeth using ProTaper files (Dentsply Maillefer, Ballaigues, Switzerland), according to a clinically used sequence. During the preparations, a clinical hand motion was performed by an endodontist, and we measured the applied torque about the file axis as well as the three-dimensional forces and file displacements involved. Such a biomechanical procedure is useful for better understanding the working conditions of the files in terms of loads and displacements. It could be used to analyze the effects of various mechanical and geometric parameters on the files' behavior and to obtain data for modelling purposes. Finally, it could contribute to studies aiming to improve file design in order to reduce the risk of file fracture.

  20. Improvement in rheumatic fever and rheumatic heart disease management and prevention using a health centre-based continuous quality improvement approach.

    PubMed

    Ralph, Anna P; Fittock, Marea; Schultz, Rosalie; Thompson, Dale; Dowden, Michelle; Clemens, Tom; Parnaby, Matthew G; Clark, Michele; McDonald, Malcolm I; Edwards, Keith N; Carapetis, Jonathan R; Bailie, Ross S

    2013-12-18

    Rheumatic heart disease (RHD) remains a major health concern for Aboriginal Australians. A key component of RHD control is prevention of recurrent acute rheumatic fever (ARF) using long-term secondary prophylaxis with intramuscular benzathine penicillin (BPG). This is the most important and cost-effective step in RHD control. However, there are significant challenges to effective implementation of secondary prophylaxis programs. This project aimed to increase understanding and improve quality of RHD care through development and implementation of a continuous quality improvement (CQI) strategy. We used a CQI strategy to promote implementation of national best-practice ARF/RHD management guidelines at primary health care level in Indigenous communities of the Northern Territory (NT), Australia, 2008-2010. Participatory action research methods were employed to identify system barriers to delivery of high quality care. This entailed facilitated discussion with primary care staff aided by a system assessment tool (SAT). Participants were encouraged to develop and implement strategies to overcome identified barriers, including better record-keeping, triage systems and strategies for patient follow-up. To assess performance, clinical records were audited at baseline, then annually for two years. Key performance indicators included proportion of people receiving adequate secondary prophylaxis (≥80% of scheduled 4-weekly penicillin injections) and quality of documentation. Six health centres participated, servicing approximately 154 people with ARF/RHD. Improvements occurred in indicators of service delivery including proportion of people receiving ≥40% of their scheduled BPG (increasing from 81/116 [70%] at baseline to 84/103 [82%] in year three, p = 0.04), proportion of people reviewed by a doctor within the past two years (112/154 [73%] and 134/156 [86%], p = 0.003), and proportion of people who received influenza vaccination (57/154 [37%] to 86/156 [55%], p = 0.001). However, the proportion receiving ≥80% of scheduled BPG did not change. Documentation in medical files improved: ARF episode documentation increased from 31/55 (56%) to 50/62 (81%) (p = 0.004), and RHD risk category documentation from 87/154 (56%) to 103/145 (76%) (p < 0.001). Large differences in performance were noted between health centres, reflected to some extent in SAT scores. A CQI process using a systems approach and participatory action research methodology can significantly improve delivery of ARF/RHD care.

  1. Cardio-PACs: a new opportunity

    NASA Astrophysics Data System (ADS)

    Heupler, Frederick A., Jr.; Thomas, James D.; Blume, Hartwig R.; Cecil, Robert A.; Heisler, Mary

    2000-05-01

    It is now possible to replace film-based image management in the cardiac catheterization laboratory with a Cardiology Picture Archiving and Communication System (Cardio-PACS) based on digital imaging technology. The first step in the conversion process is installation of a digital image acquisition system that is capable of generating high-quality DICOM-compatible images. The next three steps, which are the subject of this presentation, involve image display, distribution, and storage. Clinical requirements and associated cost considerations for these three steps are listed below: Image display: (1) Image quality equal to film, with DICOM format, lossless compression, image processing, desktop PC-based with color monitor, and physician-friendly imaging software; (2) Performance specifications include: acquire 30 frames/sec; replay 15 frames/sec; access to file server 5 seconds, and to archive 5 minutes; (3) Compatibility of image file, transmission, and processing formats; (4) Image manipulation: brightness, contrast, gray scale, zoom, biplane display, and quantification; (5) User-friendly control of image review. Image distribution: (1) Standard IP-based network between cardiac catheterization laboratories, file server, long-term archive, review stations, and remote sites; (2) Non-proprietary formats; (3) Bidirectional distribution. Image storage: (1) CD-ROM vs disk vs tape; (2) Verification of data integrity; (3) User-designated storage capacity for catheterization laboratory, file server, long-term archive. Costs: (1) Image acquisition equipment, file server, long-term archive; (2) Network infrastructure; (3) Review stations and software; (4) Maintenance and administration; (5) Future upgrades and expansion; (6) Personnel.

  2. Clinical impact of dosimetric changes for volumetric modulated arc therapy in log file-based patient dose calculations.

    PubMed

    Katsuta, Yoshiyuki; Kadoya, Noriyuki; Fujita, Yukio; Shimizu, Eiji; Matsunaga, Kenichi; Matsushita, Haruo; Majima, Kazuhiro; Jingu, Keiichi

    2017-10-01

    A log file-based method cannot detect dosimetric changes due to linac component miscalibration because log files are insensitive to miscalibration. Herein, the clinical impacts of dosimetric changes on a log file-based method were determined. Five head-and-neck and five prostate plans were used. Miscalibration-simulated log files were generated by inducing a linac component miscalibration into the log file. Miscalibration magnitudes for the leaf, gantry, and collimator at the general tolerance level were ±0.5 mm, ±1°, and ±1°, respectively, and at a tighter tolerance level achievable on current linacs were ±0.3 mm, ±0.5°, and ±0.5°, respectively. Re-calculations were performed on patient anatomy using the log file data. The changes in tumor control probability/normal tissue complication probability from the treatment planning system dose to the re-calculated dose at the general tolerance level were 1.8% on the planning target volume (PTV) and 2.4% on organs at risk (OARs) across both plan types. These changes at the tighter tolerance level improved to 1.0% on the PTV and to 1.5% on OARs, a statistically significant difference. We determined the clinical impacts of dosimetric changes on a log file-based method using a general tolerance level and a tighter tolerance level for linac miscalibration, and found that a tighter tolerance level significantly improved the accuracy of the log file-based method. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
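
    Reading the method as "add a fixed offset to each logged machine parameter, then recompute dose", the simulation step can be sketched as follows; the array names and the idea of a single global offset are our own simplifications:

        import numpy as np

        def simulate_miscalibration(leaf_mm, gantry_deg, coll_deg,
                                    d_leaf=0.5, d_gantry=1.0, d_coll=1.0):
            # Offsets default to the general tolerance level quoted above
            # (0.5 mm leaf, 1 degree gantry and collimator). The perturbed
            # values would then feed the log file-based dose recalculation.
            return (np.asarray(leaf_mm) + d_leaf,
                    np.asarray(gantry_deg) + d_gantry,
                    np.asarray(coll_deg) + d_coll)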

  3. Dimensions of quality of antenatal care service at Suez, Egypt.

    PubMed

    Rahman El Gammal, Hanan Abbas Abdo Abdel

    2014-07-01

    The 5th millennium development goal aims at reducing maternal mortality by 75% by the year 2015. According to the World Health Organization, there were an estimated 358,000 maternal deaths globally in 2008. Developing countries accounted for 99% of these deaths, three-fifths of which occurred in Sub-Saharan Africa. In primary health care (PHC), the quality of antenatal care is fundamental and critically affects service continuity. Nevertheless, medical research has largely ignored the issue, and scientific inquiry is lacking, particularly in Egypt. The aim of the present study is to assess the quality of antenatal care in the urban Suez Governorate, Egypt. A cross-sectional study was conducted at five primary health care centers (PHCCs) in urban Suez, Egypt. The total sample was collected from clients, physicians, and medical records. The parameters assessed were the auditing of medical records and provider and pregnant-women satisfaction. Nearly 97% of respondents were satisfied with the quality of antenatal care, while provider satisfaction was 61% and the mean file-audit score was 76.5 ± 5.6. The present study shows that client satisfaction, physician satisfaction, and the auditing of medical records point to opportunities for improvement.

  4. Dimensions of Quality of Antenatal Care Service at Suez, Egypt

    PubMed Central

    Rahman El Gammal, Hanan Abbas Abdo Abdel

    2014-01-01

    Introduction: The 5th millennium development goal aims at reducing maternal mortality by 75% by the year 2015. According to the World Health Organization, there were an estimated 358,000 maternal deaths globally in 2008. Developing countries accounted for 99% of these deaths, three-fifths of which occurred in Sub-Saharan Africa. In primary health care (PHC), the quality of antenatal care is fundamental and critically affects service continuity. Nevertheless, medical research has largely ignored the issue, and scientific inquiry is lacking, particularly in Egypt. Aim of the Study: The aim of the present study is to assess the quality of antenatal care in the urban Suez Governorate, Egypt. Materials and Methods: A cross-sectional study was conducted at five primary health care centers (PHCCs) in urban Suez, Egypt. The total sample was collected from clients, physicians, and medical records. The parameters assessed were the auditing of medical records and provider and pregnant-women satisfaction. Results: Nearly 97% of respondents were satisfied with the quality of antenatal care, while provider satisfaction was 61% and the mean file-audit score was 76.5 ± 5.6. Conclusion: The present study shows that client satisfaction, physician satisfaction, and the auditing of medical records point to opportunities for improvement. PMID:25374861

  5. How Effective are Mindfulness-Based Interventions for Reducing Stress Among Healthcare Professionals? A Systematic Review and Meta-Analysis.

    PubMed

    Burton, Amy; Burgess, Catherine; Dean, Sarah; Koutsopoulou, Gina Z; Hugh-Jones, Siobhan

    2017-02-01

    Workplace stress is high among healthcare professionals (HCPs) and is associated with reduced psychological health, quality of care and patient satisfaction. This systematic review and meta-analysis reviews evidence on the effectiveness of mindfulness-based interventions (MBIs) for reducing stress in HCPs. A systematic literature search was conducted. Papers were screened for suitability using inclusion criteria and nine papers were subjected to review and quality assessment. Seven papers, for which full statistical findings could be obtained, were also subjected to meta-analysis. Results of the meta-analysis suggest that MBIs have the potential to significantly improve stress among HCPs; however, there was evidence of a file drawer problem. The quality of the studies was high in relation to the clarity of aims, data collection and analysis, but weaker in terms of sample size and the use of theoretical frameworks. MBIs have the potential to reduce stress among HCPs; however, more high-quality research is needed before this finding can be confirmed. Future studies would benefit from long-term follow-up measures to determine any continuing effects of mindfulness training on stress outcomes. Copyright © 2016 John Wiley & Sons, Ltd.

  6. Recent developments in user-job management with Ganga

    NASA Astrophysics Data System (ADS)

    Currie, R.; Elmsheuser, J.; Fay, R.; Owen, P. H.; Richards, A.; Slater, M.; Sutcliffe, W.; Williams, M.

    2015-12-01

    The Ganga project was originally developed for use by LHC experiments and has been used extensively throughout Run 1 in both LHCb and ATLAS. This document describes some of the most recent developments within the Ganga project. There have been improvements in the handling of large-scale computational tasks in the form of a new GangaTasks infrastructure. Improvements in file handling through a new IGangaFile interface make handling files largely transparent to the end user. In addition, the performance and usability of Ganga have both been addressed through the development of a new queues system that allows for parallel processing of job-related tasks.
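
    For orientation, a minimal Ganga job looks roughly like the sketch below. Inside an interactive Ganga session the names Job, Executable, and Local are predefined; how to import them in a plain Python script varies by release, so treat that detail as an assumption to check against your installation:

        # Run inside a Ganga session.
        j = Job()
        j.application = Executable()
        j.application.exe = "/bin/echo"
        j.application.args = ["hello"]
        j.backend = Local()      # swap for a batch or grid backend in production
        j.submit()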

  7. An Improved B+ Tree for Flash File Systems

    NASA Astrophysics Data System (ADS)

    Havasi, Ferenc

    Nowadays mobile devices such as mobile phones, MP3 players and PDAs are becoming ever more common. Most of them use flash chips as storage. To store data efficiently on flash, it is necessary to adapt ordinary file systems because they are designed for use on hard disks. Most file systems use some kind of search tree to store index information, which is very important from a performance standpoint. Here we improved the B+ search tree algorithm so as to make flash devices more efficient. Our implementation of this solution saves 98%-99% of the flash operations, and is now part of the Linux kernel.
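
    The savings come from not paying a flash page write for every logical index update. A toy write-back cache illustrates that general batching idea; it is an illustration under our own assumptions, not the paper's actual tree algorithm:

        class BatchedIndex:
            """Toy write-back cache in front of a persistent index.

            Dirty updates accumulate in RAM and are flushed as one batch, so
            k logical updates cost far fewer (simulated) flash operations
            than k individual page writes would.
            """

            def __init__(self, flush_threshold=64):
                self.dirty = {}                  # key -> value, pending writes
                self.flush_threshold = flush_threshold
                self.flash_ops = 0               # counts simulated page writes

            def insert(self, key, value):
                self.dirty[key] = value
                if len(self.dirty) >= self.flush_threshold:
                    self.flush()

            def flush(self):
                if self.dirty:
                    self.flash_ops += 1          # one batched write, not many
                    self.dirty.clear()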

  8. preAssemble: a tool for automatic sequencer trace data processing.

    PubMed

    Adzhubei, Alexei A; Laerdahl, Jon K; Vlasova, Anna V

    2006-01-17

    Trace or chromatogram files (raw data) are produced by automatic nucleic acid sequencing equipment, or sequencers. Each file contains information which can be interpreted by specialised software to reveal the sequence (base calling). This is done by the sequencer's proprietary software or by publicly available programs. Depending on the size of a sequencing project, the number of trace files can vary from just a few to thousands of files. Sequencing quality assessment on various criteria is important at the stage preceding clustering and contig assembly. Two major publicly available packages, Phred and Staden, are used by preAssemble to perform sequence quality processing. The preAssemble pre-assembly sequence processing pipeline has been developed for small- to large-scale automatic processing of DNA sequencer chromatogram (trace) data. The Staden Package Pregap4 module and the base-calling program Phred are utilized in the pipeline, which produces detailed and self-explanatory output that can be displayed with a web browser. preAssemble can be used successfully with very little previous experience; however, options for parameter tuning are provided for advanced users. preAssemble runs under UNIX and Linux operating systems. It is available for download and will run as stand-alone software. It can also be accessed on the Norwegian Salmon Genome Project web site, where preAssemble jobs can be run on the project server. preAssemble is a tool for performing quality assessment of sequences generated by automatic sequencing equipment. preAssemble is flexible, since both interactive jobs on the preAssemble server and the stand-alone downloadable version are available. Virtually no previous experience is necessary to run a default preAssemble job; on the other hand, options for parameter tuning are provided. Consequently, preAssemble can be used as efficiently for just a few trace files as for large-scale sequence processing.
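
    A sketch of the base-calling step such a pipeline drives, invoking Phred over a directory of chromatograms from Python; -id, -sa, and -qa (input directory, sequence output, quality output) are the commonly documented Phred options, but verify them against your local installation:

        import subprocess
        from pathlib import Path

        def base_call(chromat_dir, out_prefix):
            # Phred reads every trace in chromat_dir and appends the called
            # sequences and quality scores to the two output files.
            out = Path(out_prefix)
            subprocess.run(["phred", "-id", str(chromat_dir),
                            "-sa", f"{out}.fasta", "-qa", f"{out}.fasta.qual"],
                           check=True)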

  9. Digital files for northeast Asia geodynamics, mineral deposit location, and metallogenic belt maps, stratigraphic columns, descriptions of map units, and descriptions of metallogenic belts

    USGS Publications Warehouse

    Nokleberg, Warren J.; Badarch, Gombosuren; Berzin, Nikolai A.; Diggles, Michael F.; Hwang, Duk-Hwan; Khanchuk, Alexander I.; Miller, Robert J.; Naumova, Vera V.; Obolensky, Alexander A.; Ogasawara, Masatsugu; Parfenov, Leonid M.; Prokopiev, Andrei V.; Rodionov, Sergey M.; Yan, Hongquan

    2004-01-01

    This is the online version of a CD-ROM publication. It contains all of the data that are on the disc, but extra files have been removed: index files, software installers, and Windows autolaunch files. This publication contains a series of files for Northeast Asia geodynamics, mineral deposit location, and metallogenic belt maps, descriptions of map units and metallogenic belts, and stratigraphic columns. This region includes Eastern Siberia, the Russian Far East, Mongolia, Northeast China, South Korea, and Japan. The files include: (1) a geodynamics map at a scale of 1:5,000,000; (2) page-size stratigraphic columns for major terranes; (3) a generalized geodynamics map at a scale of 1:15,000,000; (4) a mineral deposit location map at a scale of 1:7,500,000; (5) metallogenic belt maps at a scale of 1:15,000,000; (6) detailed descriptions of geologic units with references; (7) detailed descriptions of metallogenic belts with references; and (8) summary mineral deposit and metallogenic belt tables. The purpose of this publication is to provide high-quality, digital graphic files for maps and figures, and Word files for explanations, descriptions, and references to customers and users.

  10. Self-proxy agreement and correlates of health-related quality of life in young adults with epilepsy and mild intellectual disabilities.

    PubMed

    Zimmermann, Friederike; Endermann, Michael

    2008-07-01

    This study investigated health-related quality of life (HRQOL) in young adults with epilepsy and intellectual disabilities. First, agreement between self-reports and proxy reports of HRQOL was examined. Second, medical and psychological contributions to HRQOL were explored. Thirty-six patients were interviewed using the Quality of Life in Epilepsy inventory (QOLIE-31), the Hospital Anxiety and Depression Scale, and the Neuroticism and Extraversion scales of the NEO Five-Factor Inventory. Medical data were taken from files. Professional caregivers completed rephrased QOLIE-31-questionnaires. The perspectives on HRQOL differed systematically: Caregivers underrated their clients' HRQOL on average. Few correlations with medical characteristics emerged, whereas all psychological variables were strongly related to HRQOL. Neuroticism, Age at Disability Onset, and their interaction explained 71% of the HRQOL variance. Results indicate that proxy reports do not provide valid substitutes for most of the self-reported HRQOL subscales. Psychological treatment of negative affectivity and after critical life events in adolescence may improve HRQOL in young adults with epilepsy and mild intellectual disabilities.

  11. Self-directed study using MP3 players to improve auscultation proficiency of physicians: a randomized, controlled trial.

    PubMed

    Donato, Anthony A; Kaliyadan, Antony G; Wasser, Thomas

    2014-01-01

    Studies of physicians at all levels of training demonstrate significant deficiencies in cardiac auscultation skills. The best instructional methods to augment these skills are not known. This study was a randomized, controlled trial of 83 noncardiologist volunteers exposed to a 12-week lower cognitive load self-study group using MP3 players containing heart sound audio files compared to a group receiving a 1-time 1-hour higher cognitive load multimedia lecture using the same audio files. The primary outcome measure was change in 15-question posttest score at 4 and 12 weeks as compared to pretest on recognition of identical audio files introduced during training. In the self-study group, the association of total exposure and deliberate practice effort (estimated by standard deviation of files played/mean) to improvement in test score was measured as a secondary end point. Self-study group participants improved as compared to pretest by 4.42 ± 3.41 answers correct at 12 weeks (5.09-9.51 correct, p < .001), while those exposed to the multimedia lecture improved by an average of 1.13 ± 3.2 answers correct (4.48-5.61 correct, p = .03). In the self-study arm, improvement in the posttest was positively associated with both total exposure (β = 0.55, p < .001) and deliberate practice score (β = 0.31, p = .02). A lower cognitive load self-study of audio files improved recognition of cardiac sounds, as compared to multimedia lecture, and deliberate practice strategies improved study efficiency. More investigation is needed to assess transfer of learning to a wider range of cardiac sounds in both simulated and clinical environments. © 2014 The Alliance for Continuing Education in the Health Professions, the Society for Academic Continuing Medical Education, and the Council on Continuing Medical Education, Association for Hospital Medical Education.

  12. Use of audit, feedback and education increased guideline implementation in a multidisciplinary stroke unit

    PubMed Central

    2017-01-01

    Background The audit-feedback cycle is a behaviour change intervention used to reduce evidence-practice gaps. In this study, repeat audits, feedback, education and training were used to change practice and increase compliance with Australian guideline recommendations for stroke rehabilitation. Objective To increase the proportion of patients with stroke receiving best practice screening, assessment and treatment. Methods A before-and-after study design was used. Data were collected from medical records (n=15 files per audit). Four audits were conducted between 2009 and 2013. Consecutive files of patients with stroke admitted to the stroke unit were selected and audited retrospectively. Staff behaviour change interventions included four cycles of audit feedback, and education to assist staff with change. The primary outcome measure was the proportion of eligible patients receiving best practice against target behaviours, based on audit data. Results Between the first and fourth audit (2009 and 2013), 20 of the 27 areas targeted (74%) met or exceeded the minimum target of 10% change. Practice areas that showed the most change included sensation screening (+75%) and rehabilitation (+100%); neglect screening (+92%) and assessment (100%). Some target behaviours showed a drop in compliance such as anxiety and depression screening (−27%) or little or no overall improvement such as patient education about stroke (6% change). Conclusions Audit feedback and education increased the proportion of inpatients with stroke receiving best practice rehabilitation in some, but not all practice areas. An ongoing process of quality improvement is needed to help sustain these improvements. PMID:29450304

  13. Educational Video Recording and Editing for The Hand Surgeon

    PubMed Central

    Rehim, Shady A.; Chung, Kevin C.

    2016-01-01

    Digital video recordings are increasingly used across various medical and surgical disciplines, including hand surgery, for documentation of patient care, resident education, scientific presentations and publications. In recent years, the introduction of sophisticated computer hardware and software technology has simplified the process of digital video production and improved the means of disseminating large digital data files. However, the creation of high-quality surgical video footage requires a basic understanding of key technical considerations, together with creativity and the sound aesthetic judgment of the videographer. In this article we outline the practical steps involved in equipment preparation, video recording, editing and archiving, and offer guidance on the choice of suitable hardware and software. PMID:25911212

  14. Prefetching in file systems for MIMD multiprocessors

    NASA Technical Reports Server (NTRS)

    Kotz, David F.; Ellis, Carla Schlatter

    1990-01-01

    The question of whether prefetching blocks on the file into the block cache can effectively reduce overall execution time of a parallel computation, even under favorable assumptions, is considered. Experiments have been conducted with an interleaved file system testbed on the Butterfly Plus multiprocessor. Results of these experiments suggest that (1) the hit ratio, the accepted measure in traditional caching studies, may not be an adequate measure of performance when the workload consists of parallel computations and parallel file access patterns, (2) caching with prefetching can significantly improve the hit ratio and the average time to perform an I/O (input/output) operation, and (3) an improvement in overall execution time has been observed in most cases. In spite of these gains, prefetching sometimes results in increased execution times (a negative result, given the optimistic nature of the study). The authors explore why it is not trivial to translate savings on individual I/O requests into consistently better overall performance and identify the key problems that need to be addressed in order to improve the potential of prefetching techniques in the environment.
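
    The gap the authors observed between better hit ratios and better execution times shows up even in a toy simulation: the sketch below charges nothing for the prefetch I/O itself, which is precisely the cost that sometimes erased the gains in the real experiments:

        def hit_ratio(accesses, cache_size, prefetch=False):
            """Simulate an LRU block cache, optionally prefetching block b+1."""
            cache, hits = [], 0
            for b in accesses:
                if b in cache:
                    hits += 1
                    cache.remove(b)
                cache.append(b)                  # most-recently-used at the tail
                if prefetch and b + 1 not in cache:
                    cache.append(b + 1)          # prefetch; its I/O cost is ignored
                del cache[:-cache_size]          # evict least-recently-used blocks
            return hits / len(accesses)

        seq = list(range(1000))                  # purely sequential access pattern
        print(hit_ratio(seq, 8))                 # 0.0: plain LRU never hits
        print(hit_ratio(seq, 8, prefetch=True))  # ~1.0: each block arrives early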

  15. The National Anesthesia Clinical Outcomes Registry.

    PubMed

    Liau, Adrian; Havidich, Jeana E; Onega, Tracy; Dutton, Richard P

    2015-12-01

    The Anesthesia Quality Institute (AQI) was chartered in 2008 by the American Society of Anesthesiologists to develop the National Anesthesia Clinical Outcomes Registry (NACOR). In this Technical Communication, we will describe how data enter NACOR, how they are authenticated, and how they are analyzed and reported. NACOR accepts case-level administrative, clinical, and quality capture data from voluntarily participating anesthesia practices and health care facilities in the United States. All data are transmitted to the AQI in summary electronic files generated by billing, quality capture, and electronic health care record software, typically on a monthly basis. All data elements are mapped to fields in the NACOR schema in accordance with a publicly available data dictionary. Incoming data are loaded into NACOR by AQI technologists and are subject to both manual and automated review to identify systematically missing elements, miscoding, and inadvertent corruption. Data are deidentified in compliance with Health Insurance Portability and Accountability Act regulations. The database server of AQI, which houses the NACOR database, is protected by 2 firewalls within the American Society of Anesthesiologists' network infrastructure; this system has not been breached. The NACOR Participant User File, a deidentified case-level dataset of information from NACOR, is available to researchers at participating institutions. NACOR architecture and the nature of the Participant User File include both strengths and weaknesses.

  16. Architecture and evolution of Goddard Space Flight Center Distributed Active Archive Center

    NASA Technical Reports Server (NTRS)

    Bedet, Jean-Jacques; Bodden, Lee; Rosen, Wayne; Sherman, Mark; Pease, Phil

    1994-01-01

    The Goddard Space Flight Center (GSFC) Distributed Active Archive Center (DAAC) has been developed to enhance Earth Science research by improving access to remote-sensor earth science data. Building and operating an archive, even one of a moderate size (a few terabytes), is a challenging task. One of the critical components of this system is Unitree, the Hierarchical File Storage Management System. Unitree, selected two years ago as the best available solution, requires constant system administrative support. It is not always suitable as an archive and distribution data center, and has moderate performance. The Data Archive and Distribution System (DADS) software developed to monitor, manage, and automate the ingestion, archive, and distribution functions turned out to be more challenging than anticipated. Having the software and tools is not sufficient to succeed. Human interaction within the system must be fully understood to improve efficiency and to ensure that the right tools are developed. One of the lessons learned is that the operability, reliability, and performance aspects should be thoroughly addressed in the initial design. However, the GSFC DAAC has demonstrated that it is capable of distributing over 40 GB per day. A backup system to archive a second copy of all ingested data is under development. This backup system will be used not only for disaster recovery but will also replace the main archive when it is unavailable during maintenance or hardware replacement. The GSFC DAAC has put a strong emphasis on quality at all levels of its organization. A quality team has been formed to identify quality issues and to propose improvements. The DAAC has conducted numerous tests to benchmark the performance of the system. These tests proved to be extremely useful in identifying bottlenecks and deficiencies in operational procedures.

  17. [Interest of evaluation of professional practice for the improvement of the management of postoperative pain with patient controlled analgesia (PCA)].

    PubMed

    Baumann, A; Cuignet-Royer, E; Cornet, C; Trueck, S; Heck, M; Taron, F; Peignier, C; Chastel, A; Gervais, P; Bouaziz, H; Audibert, G; Mertes, P-M

    2010-10-01

    To evaluate the daily practice of postoperative PCA in Nancy University Hospital, in continuity with a quality program of postoperative pain (POP) care conducted in 2003. A retrospective audit of patient medical records: a review of all the medical records of consecutive surgical patients managed by PCA over a 5-week period in six surgical services. Criteria studied: evaluation of hospital means (eight criteria) and of medical and nursing staff practice (16 criteria). A second audit was conducted 6 months after the implementation of quality improvement measures. Assessment of the hospital means: temperature chart including pain scores and PCA drug consumption, patient information leaflet, PCA protocol, postoperative pre-filled prescription form (PFPF) for post-anaesthesia care including PCA, and optional training of nurses in postoperative pain management. Evaluation of practices: 159 files of a total of 176 patients were analyzed (88%). Improvements noted after 6 months: trace of POP evaluation progressed from 73 to 87%, advance prescription of PCA adjustment increased from 56 to 68% and of the treatment of adverse effects from 54 to 68%, trace of PCA adaptation by the attending nurse from 15 to 43%, and trace of the administration of the treatment of adverse effects by the attending nurse from 24 to 64%, as did the use of the PFPF from 59 to 70%. The usefulness of a pre-filled prescription form for post-anaesthesia care including PCA prescription is demonstrated. Quality improvement measures include: poster information and pocket guides on PCA for nurses, and training of three nurses per service to act as "PCA advisers" who will in turn train their ward colleagues in PCA management and the use of equipment until an acute pain team is established. Copyright © 2010 Elsevier Masson SAS. All rights reserved.

  18. Dynamic Non-Hierarchical File Systems for Exascale Storage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, Darrell E.; Miller, Ethan L

    This constitutes the final report for “Dynamic Non-Hierarchical File Systems for Exascale Storage”. The ultimate goal of this project was to improve data management in scientific computing and high-end computing (HEC) applications, and to achieve this goal we proposed: to develop the first, HEC-targeted, file system featuring rich metadata and provenance collection, extreme scalability, and future storage hardware integration as core design goals, and to evaluate and develop a flexible non-hierarchical file system interface suitable for providing more powerful and intuitive data management interfaces to HEC and scientific computing users. Data management is swiftly becoming a serious problem in the scientific community: while copious amounts of data are good for obtaining results, finding the right data is often daunting and sometimes impossible. Scientists participating in a Department of Energy workshop noted that most of their time was spent “...finding, processing, organizing, and moving data and it’s going to get much worse”. Scientists should not be forced to become data mining experts in order to retrieve the data they want, nor should they be expected to remember the naming convention they used several years ago for a set of experiments they now wish to revisit. Ideally, locating the data you need would be as easy as browsing the web. Unfortunately, existing data management approaches are usually based on hierarchical naming, a 40 year-old technology designed to manage thousands of files, not exabytes of data. Today’s systems do not take advantage of the rich array of metadata that current high-end computing (HEC) file systems can gather, including content-based metadata and provenance information. As a result, current metadata search approaches are typically ad hoc and often work by providing a parallel management system to the “main” file system, as is done in Linux (the locate utility), personal computers, and enterprise search appliances. These search applications are often optimized for a single file system, making it difficult to move files and their metadata between file systems. Users have tried to solve this problem in several ways, including the use of separate databases to index file properties, the encoding of file properties into file names, and separately gathering and managing provenance data, but none of these approaches has worked well, either due to limited usefulness or scalability, or both. Our research addressed several key issues: high-performance, real-time metadata harvesting: extracting important attributes from files dynamically and immediately updating indexes used to improve search; transparent, automatic, and secure provenance capture: recording the data inputs and processing steps used in the production of each file in the system; scalable indexing: indexes that are optimized for integration with the file system; and dynamic file system structure: our approach provides dynamic directories similar to those in semantic file systems, but these are the native organization rather than a feature grafted onto a conventional system. In addition to these goals, our research effort included evaluating the impact of new storage technologies on the file system design and performance. In particular, the indexing and metadata harvesting functions can potentially benefit from the performance improvements promised by new storage class memories.

  19. 76 FR 28397 - Acceleration of Broadband Deployment by Improving Policies Regarding Public Rights of Way and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-17

    ... directly relevant to intended use? For example, in some cases in the past, localities owning rights of way...: Parties who choose to file by paper must file an original and four copies of each filing. If more than one... negative, related to broadband deployment. In the case of comments that name any state or local government...

  1. Quality expectations and tolerance limits of trial master files (TMF) - Developing a risk-based approach for quality assessments of TMFs.

    PubMed

    Hecht, Arthur; Busch-Heidger, Barbara; Gertzen, Heiner; Pfister, Heike; Ruhfus, Birgit; Sanden, Per-Holger; Schmidt, Gabriele B

    2015-01-01

    This article addresses the question of when a trial master file (TMF) can be considered sufficiently accurate and complete: What attributes does the TMF need to have so that a clinical trial can be adequately reconstructed from documented data and procedures? Clinical trial sponsors face significant challenges in assembling the TMF, especially when dealing with large, international, multicenter studies; despite all newly introduced archiving techniques it is becoming more and more difficult to ensure that the TMF is complete. This is directly reflected in the number of inspection findings reported and published by the EMA in 2014. Based on quality risk management principles in clinical trials the authors defined the quality expectations for the different document types in a TMF and furthermore defined tolerance limits for missing documents. This publication provides guidance on what type of documents and processes are most important, and in consequence, indicates on which documents and processes trial team staff should focus in order to achieve a high-quality TMF. The members of this working group belong to the CQAG Group (Clinical Quality Assurance Germany) and are QA (quality assurance) experts (auditors or compliance functions) with long-term experience in the practical handling of TMFs.
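
    The tolerance-limit idea lends itself to a simple completeness check per document type; the document types and limits below are hypothetical placeholders for illustration, not the working group's actual values:

        # Hypothetical tolerance limits: fraction of documents of each type
        # that may be missing before the TMF fails the check.
        TOLERANCE = {"signed_protocol": 0.00, "ethics_approval": 0.00,
                     "monitoring_visit_report": 0.05, "training_record": 0.10}

        def assess_tmf(expected, present):
            """expected/present map doc_type -> count; returns failing types."""
            failures = []
            for doc, limit in TOLERANCE.items():
                exp = expected.get(doc, 0)
                missing = exp - present.get(doc, 0)
                if exp and missing / exp > limit:
                    failures.append(doc)
            return failures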

  2. Continuous Manufacturing in Pharmaceutical Process Development and Manufacturing.

    PubMed

    Burcham, Christopher L; Florence, Alastair J; Johnson, Martin D

    2018-06-07

    The pharmaceutical industry has found new applications for the use of continuous processing for the manufacture of new therapies currently in development. The transformation has been encouraged by regulatory bodies as well as driven by cost reduction, decreased development cycles, access to new chemistries not practical in batch, improved safety, flexible manufacturing platforms, and improved product quality assurance. The transformation from batch to continuous manufacturing processing is the focus of this review. The review is limited to small, chemically synthesized organic molecules and encompasses the manufacture of both active pharmaceutical ingredients (APIs) and the subsequent drug product. Continuous drug product is currently used in approved processes. A few examples of production of APIs under current good manufacturing practice conditions using continuous processing steps have been published in the past five years, but they are lagging behind continuous drug product with respect to regulatory filings.

  3. Medicare+Choice: what lies ahead?

    PubMed

    Layne, R Jeffrey

    2002-03-01

    Health plans have continued to exit the Medicare+Choice program in recent years, despite efforts of Congress and the Centers for Medicare and Medicaid Services (CMS) to reform the program. Congress and CMS therefore stand poised to make additional, substantial reforms to the program. CMS has proposed to consolidate its oversight of the program, extend the due date for Medicare+Choice plans to file their adjusted community rate proposals, revise risk-adjustment processes, streamline the marketing review process, enhance quality-improvement requirements, institute results-based performance assessment audits, coordinate policy changes to coincide with contracting cycles, expand its fall advertising campaign for the program, provide better employer-based Medicare options for beneficiaries, and take steps to minimize beneficiary costs. Congressional leaders have proposed various legislative remedies to improve the program, including creation of an entirely new pricing structure for the program based on a competitive bidding process.

  4. 47 CFR 43.21 - Transactions with affiliates.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... file, by April 1 of each year, a report designed to capture trends in service quality under price cap... report designed to capture trends in service quality under price cap regulation. The report shall contain...) REPORTS OF COMMUNICATION COMMON CARRIERS AND CERTAIN AFFILIATES § 43.21 Transactions with affiliates. (a...

  5. 47 CFR 43.21 - Transactions with affiliates.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... file, by April 1 of each year, a report designed to capture trends in service quality under price cap... report designed to capture trends in service quality under price cap regulation. The report shall contain...) REPORTS OF COMMUNICATION COMMON CARRIERS AND CERTAIN AFFILIATES § 43.21 Transactions with affiliates. (a...

  6. File-access characteristics of parallel scientific workloads

    NASA Technical Reports Server (NTRS)

    Nieuwejaar, Nils; Kotz, David; Purakayastha, Apratim; Best, Michael; Ellis, Carla Schlatter

    1995-01-01

    Phenomenal improvements in the computational performance of multiprocessors have not been matched by comparable gains in I/O system performance. This imbalance has resulted in I/O becoming a significant bottleneck for many scientific applications. One key to overcoming this bottleneck is improving the performance of parallel file systems. The design of a high-performance parallel file system requires a comprehensive understanding of the expected workload. Unfortunately, until recently, no general workload studies of parallel file systems have been conducted. The goal of the CHARISMA project was to remedy this problem by characterizing the behavior of several production workloads, on different machines, at the level of individual reads and writes. The first set of results from the CHARISMA project describe the workloads observed on an Intel iPSC/860 and a Thinking Machines CM-5. This paper is intended to compare and contrast these two workloads for an understanding of their essential similarities and differences, isolating common trends and platform-dependent variances. Using this comparison, we are able to gain more insight into the general principles that should guide parallel file-system design.

  7. A cloud-based multimodality case file for mobile devices.

    PubMed

    Balkman, Jason D; Loehfelm, Thomas W

    2014-01-01

    Recent improvements in Web and mobile technology, along with the widespread use of handheld devices in radiology education, provide unique opportunities for creating scalable, universally accessible, portable image-rich radiology case files. A cloud database and a Web-based application for radiologic images were developed to create a mobile case file with reasonable usability, download performance, and image quality for teaching purposes. A total of 75 radiology cases related to breast, thoracic, gastrointestinal, musculoskeletal, and neuroimaging subspecialties were included in the database. Breast imaging cases are the focus of this article, as they best demonstrate handheld display capabilities across a wide variety of modalities. This case subset also illustrates methods for adapting radiologic content to cloud platforms and mobile devices. Readers will gain practical knowledge about storage and retrieval of cloud-based imaging data, an awareness of techniques used to adapt scrollable and high-resolution imaging content for the Web, and an appreciation for optimizing images for handheld devices. The evaluation of this software demonstrates the feasibility of adapting images from most imaging modalities to mobile devices, even in cases of full-field digital mammograms, where high resolution is required to represent subtle pathologic features. The cloud platform allows cases to be added and modified in real time by using only a standard Web browser with no application-specific software. Challenges remain in developing efficient ways to generate, modify, and upload radiologic and supplementary teaching content to this cloud-based platform. Online supplemental material is available for this article. ©RSNA, 2014.

  8. Field spectrometer (S191H) preprocessor tape quality test program design document

    NASA Technical Reports Server (NTRS)

    Campbell, H. M.

    1976-01-01

    Program QA191H performs quality assurance tests on field spectrometer data recorded on 9-track magnetic tape. The quality testing involves the comparison of key housekeeping and data parameters with historic and predetermined tolerance limits. Samples of key parameters are processed during the calibration period and wavelength cal period, and the results are printed out and recorded on an historical file tape.

  9. AfterQC: automatic filtering, trimming, error removing and quality control for fastq data.

    PubMed

    Chen, Shifu; Huang, Tanxiao; Zhou, Yanqing; Han, Yue; Xu, Mingyan; Gu, Jia

    2017-03-14

    Some applications, especially clinical applications requiring highly accurate sequencing data, have to contend with the problems caused by unavoidable sequencing errors. Several tools have been proposed to profile sequencing quality, but few of them can quantify or correct sequencing errors. This unmet requirement motivated us to develop AfterQC, a tool with functions to profile sequencing errors and correct most of them, plus highly automated quality control and data filtering features. Unlike most tools, AfterQC analyses the overlapping of paired sequences for pair-end sequencing data. Based on overlapping analysis, AfterQC can detect and cut adapters, and it furthermore provides a novel function to correct wrong bases in the overlapping regions. Another new feature is the detection and visualisation of sequencing bubbles, which are commonly found on flowcell lanes and may cause sequencing errors. Besides normal per-cycle quality and base-content plotting, AfterQC also provides features like polyX (a long sub-sequence of the same base X) filtering, automatic trimming and K-MER-based strand bias profiling. For each single FastQ file or pair of FastQ files, AfterQC filters out bad reads, detects and eliminates the sequencer's bubble effects, trims reads at front and tail, detects sequencing errors and corrects some of them, and finally outputs clean data and generates HTML reports with interactive figures. AfterQC can run in batch mode with multiprocess support; it can run with a single FastQ file, a single pair of FastQ files (for pair-end sequencing), or a folder of FastQ files to be processed automatically. Based on overlapping analysis, AfterQC can estimate the sequencing error rate and profile the error transform distribution. The results of our error-profiling tests show that the error distribution is highly platform dependent. Much more than just another quality control (QC) tool, AfterQC is able to perform quality control, data filtering, error profiling and base correction automatically. Experimental results show that AfterQC can help to eliminate sequencing errors in pair-end sequencing data to provide much cleaner outputs, and consequently help to reduce false-positive variants, especially for low-frequency somatic mutations. While providing rich configurable options, AfterQC can detect and set all options automatically and requires no arguments in most cases.
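
    The overlap analysis can be illustrated with a simplified sketch: reverse-complement read 2, scan for the best-matching offset against read 1, and in the overlap keep the base with the higher quality score. This is our own toy version (naive scoring, reads assumed at least min_overlap long), not AfterQC's implementation:

        COMPLEMENT = str.maketrans("ACGTN", "TGCAN")

        def revcomp(seq):
            return seq.translate(COMPLEMENT)[::-1]

        def overlap_correct(r1, q1, r2, q2, min_overlap=30):
            """Return a copy of read 1 corrected by quality-weighted voting."""
            r2rc, q2rc = revcomp(r2), q2[::-1]
            best_off, best_score = 0, -1
            for off in range(len(r1) - min_overlap + 1):
                score = sum(a == b for a, b in zip(r1[off:], r2rc))
                if score > best_score:
                    best_off, best_score = off, score
            r1, q1 = list(r1), list(q1)
            for i, (b2, s2) in enumerate(zip(r2rc, q2rc)):
                j = best_off + i
                if j >= len(r1):
                    break
                if r1[j] != b2 and s2 > q1[j]:   # trust the higher-quality call
                    r1[j], q1[j] = b2, s2
            return "".join(r1), "".join(q1)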

  10. Construction of the radiation oncology teaching files system for charged particle radiotherapy.

    PubMed

    Masami, Mukai; Yutaka, Ando; Yasuo, Okuda; Naoto, Takahashi; Yoshihisa, Yoda; Hiroshi, Tsuji; Tadashi, Kamada

    2013-01-01

    Our hospital has offered charged particle therapy since 1996. New institutions for charged particle therapy are being planned around the world. Our hospital accepts many visitors from these newly planned medical institutions and has many opportunities to provide training to them. Based upon our experience, we have developed a radiation oncology teaching file system for charged particle therapy. We adopted Microsoft PowerPoint as the basic framework of our teaching file system. Using the export function of the viewer, any physician can create teaching files easily and effectively. Our teaching file system now contains 33 cases covering clinical and physics content. We expect that the system can substantially improve the safety and accuracy of charged particle therapy.

  11. [Master files: less paper, more substance. Special rules for special medicines: Plasma Master File and Vaccine Antigen Master File].

    PubMed

    Seitz, Rainer; Haase, M

    2008-07-01

    The review of European pharmaceutical legislation resulted in a code that contains two new instruments related to the marketing authorisation of biological medicines: the Plasma Master File (PMF) and the Vaccine Antigen Master File (VAMF). In the manufacture of plasma derivatives (e.g. coagulation factors, albumin, immunoglobulins), the same starting material, i.e. a plasma pool, is usually used for several products. In the case of vaccines, the same active substance, i.e. a vaccine antigen, may be included in several combination vaccine products. The intention behind the introduction of the PMF and VAMF was to avoid unnecessary and redundant documentation, and to improve and harmonise assessment by means of procedures for certification of master files at the Community level.

  12. Improved quality of care for patients infected or colonised with ESBL-producing Enterobacteriaceae in a French teaching hospital: impact of an interventional prospective study and development of specific tools.

    PubMed

    Mondain, Véronique; Lieutier, Florence; Pulcini, Céline; Degand, Nicolas; Landraud, Luce; Ruimy, Raymond; Fosse, Thierry; Roger, Pierre Marie

    2018-05-01

    The increasing incidence of ESBL-producing Enterobacteriaceae (ESBL-E) in France prompted the publication of national recommendations in 2010. Based on these, we developed a toolkit and a warning system to optimise the management of ESBL-E infected or colonised patients in both community and hospital settings. The impact of this initiative on quality of care was assessed in a teaching hospital. The ESBL toolkit was developed in 2011 during multidisciplinary meetings involving a regional network of hospital, private clinic and laboratory staff in Southeastern France. It includes antibiotic treatment protocols, a checklist, mail templates and a patient information sheet focusing on infection control. Upon identification of ESBL-E, the warning system involves alerting the attending physician and the infectious disease (ID) advisor, with immediate, advice-based implementation of the toolkit. The procedure and toolkit were tested in our teaching hospital. Patient management was compared before and after implementation of the toolkit over two 3-month periods (July-October 2010 and 2012). The ESBL-E warning system and toolkit were tested on 87 patients in 2010 and 92 patients in 2012, resulting in improved patient management: expert advice sought and followed (16 vs 97%), information provided to the patient's general practitioner (18 vs 63%) and coding of the condition in the patient's medical file (17 vs 59%), respectively. Our multidisciplinary strategy improved quality of care for in-patients infected or colonised with ESBL-E, increasing compliance with national recommendations.

  13. Improvement of Michigan climatic files in pavement ME design.

    DOT National Transportation Integrated Search

    2015-10-01

    Climatic inputs have a great influence on Mechanistic-Empirical design results of flexible and rigid pavements. Currently the state of Michigan has 24 climatic files embedded in Pavement ME Design (PMED), but several limitations have been identif...

  14. FISA Improvements Act of 2013

    THOMAS, 113th Congress

    Sen. Feinstein, Dianne [D-CA

    2013-10-31

    Senate - 11/12/2013 By Senator Feinstein from Select Committee on Intelligence filed written report. Report No. 113-119. Additional and Minority views filed.

  15. Performance Analysis of the Unitree Central File

    NASA Technical Reports Server (NTRS)

    Pentakalos, Odysseas I.; Flater, David

    1994-01-01

    This report consists of two parts. The first part briefly comments on the documentation status of two major systems at NASA's Center for Computational Sciences, specifically the Cray C98 and the Convex C3830. The second part describes the work done on improving the performance of file transfers between the Unitree Mass Storage System running on the Convex file server and the users' workstations distributed over a large geographic area.

  16. ARCUS Internet Media Archive (IMA): A Window Into the Arctic - An Online Resource for Education and Outreach

    NASA Astrophysics Data System (ADS)

    Buxbaum, T. M.; Warnick, W. K.; Polly, B.; Hueffer, L. J.; Behr, S. A.

    2006-12-01

    The ARCUS Internet Media Archive (IMA) is a collection of photos, graphics, videos, and presentations about the Arctic that are shared through the Internet. It provides the arctic research community and the public at large with a centralized location where images and video pertaining to polar research can be browsed and retrieved for a variety of uses. The IMA currently contains almost 5,000 publicly accessible photos, including 3,000 photos from the National Science Foundation funded Teachers and Researchers Exploring and Collaborating (TREC) program, an educational research experience in which K-12 teachers participate in arctic research as a pathway to improving science education. The IMA also includes 360 video files, 260 audio files, and approximately 8,000 additional resources that are being prepared for public access. The contents of this archive are organized by file type, contributor's name, event, or by organization, with each photo or file accompanied by information on content, contributor source, and usage requirements. All the files are keyworded and all information, including file name and description, is completely searchable. ARCUS plans to continue to improve and expand the IMA with a particular focus on providing graphics depicting key arctic research results and findings as well as edited video archives of relevant scientific community meetings.

  17. Design and Implementation of a Metadata-rich File System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ames, S; Gokhale, M B; Maltzahn, C

    2010-01-19

    Despite continual improvements in the performance and reliability of large scale file systems, the management of user-defined file system metadata has changed little in the past decade. The mismatch between the size and complexity of large scale data stores and their ability to organize and query their metadata has led to a de facto standard in which raw data is stored in traditional file systems, while related, application-specific metadata is stored in relational databases. This separation of data and semantic metadata requires considerable effort to maintain consistency and can result in complex, slow, and inflexible system operation. To address these problems, we have developed the Quasar File System (QFS), a metadata-rich file system in which files, user-defined attributes, and file relationships are all first class objects. In contrast to hierarchical file systems and relational databases, QFS defines a graph data model composed of files and their relationships. QFS incorporates Quasar, an XPATH-extended query language for searching the file system. Results from our QFS prototype show the effectiveness of this approach. Compared to the de facto standard, the QFS prototype shows superior ingest performance and comparable query performance on user metadata-intensive operations and superior performance on normal file metadata operations.
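
    As a rough illustration of what "files, attributes and relationships as first-class objects" buys, the sketch below models files as graph nodes carrying user-defined attributes, with typed edges between them, and runs a small traversal query. This is a conceptual toy under assumed names, not QFS code, and the traversal helper is only a stand-in for the XPath-like Quasar query language.

    ```python
    # Conceptual toy of a metadata-rich file graph in the spirit of QFS.
    from dataclasses import dataclass, field

    @dataclass
    class FileNode:
        path: str
        attrs: dict = field(default_factory=dict)   # user-defined metadata
        links: list = field(default_factory=list)   # (relation, FileNode) edges

    raw = FileNode("/data/run42.dat", {"instrument": "specX", "run": 42})
    plot = FileNode("/plots/run42.png", {"format": "png"})
    plot.links.append(("derivedFrom", raw))

    def follow(nodes, relation, **attr_filters):
        """From nodes matching the attribute filter, yield `relation` targets."""
        for n in nodes:
            if all(n.attrs.get(k) == v for k, v in attr_filters.items()):
                for rel, target in n.links:
                    if rel == relation:
                        yield target

    # Which files were the png plots derived from?
    print([n.path for n in follow([plot], "derivedFrom", format="png")])
    ```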

  18. Scale-up of networked HIV treatment in Nigeria: creation of an integrated electronic medical records system.

    PubMed

    Chaplin, Beth; Meloni, Seema; Eisen, Geoffrey; Jolayemi, Toyin; Banigbe, Bolanle; Adeola, Juliette; Wen, Craig; Reyes Nieva, Harry; Chang, Charlotte; Okonkwo, Prosper; Kanki, Phyllis

    2015-01-01

    The implementation of PEPFAR programs in resource-limited settings was accompanied by the need to document patient care on a scale unprecedented in environments where paper-based records were the norm. We describe the development of an electronic medical records system (EMRS) put in place at the beginning of a large HIV/AIDS care and treatment program in Nigeria. Databases were created to record laboratory results, medications prescribed and dispensed, and clinical assessments, using a relational database program. A collection of stand-alone files recorded different elements of patient care, linked together by utilities that aggregated data on national standard indicators and assessed patient care for quality improvement, tracked patients requiring follow-up, generated counts of ART regimens dispensed, and provided 'snapshots' of a patient's response to treatment. A secure server was used to store patient files for backup and transfer. By February 2012, when the program transitioned to local in-country management by APIN, the EMRS was used in 33 hospitals across the country, with 4,947,433 adult, pediatric and PMTCT records that had been created and continued to be available for use in patient care. Ongoing trainings for data managers, along with an iterative process of implementing changes to the databases and forms based on user feedback, were needed. As the program scaled up and the volume of laboratory tests increased, results were produced in a digital format, wherever possible, that could be automatically transferred to the EMRS. Many larger clinics began to link some or all of the databases to local area networks, making them available to a larger group of staff members, or providing the ability to enter information simultaneously where needed. The EMRS improved patient care, enabled efficient reporting to the Government of Nigeria and to U.S. funding agencies, and allowed program managers and staff to conduct quality control audits. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
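
    The architecture described, separate databases for labs, pharmacy and clinical assessments linked by patient identifiers and aggregated for indicator reporting, can be miniaturized as below. The schema, identifiers and indicator are hypothetical stand-ins for illustration, not the program's actual EMRS design.

    ```python
    # Hypothetical miniature of the linked-records idea: care elements live
    # in separate tables, joined by patient ID for indicator reporting.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE patients (pid TEXT PRIMARY KEY, enrolled DATE);
    CREATE TABLE labs     (pid TEXT, test TEXT, value REAL, taken DATE);
    CREATE TABLE pharmacy (pid TEXT, regimen TEXT, dispensed DATE);
    """)
    con.execute("INSERT INTO patients VALUES ('P001', '2011-03-01')")
    con.execute("INSERT INTO labs VALUES ('P001', 'CD4', 410, '2011-06-01')")
    con.execute("INSERT INTO pharmacy VALUES ('P001', 'AZT/3TC/NVP', '2011-06-02')")

    # Aggregate indicator: patients with both a lab result and a dispensed regimen.
    row = con.execute("""
    SELECT COUNT(DISTINCT p.pid)
    FROM patients p
    JOIN labs l     ON l.pid = p.pid
    JOIN pharmacy d ON d.pid = p.pid
    """).fetchone()
    print("patients with lab + pharmacy records:", row[0])
    ```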

  19. Development of an Improved Time Varying Loudness Model with the Inclusion of Binaural Loudness Summation

    NASA Astrophysics Data System (ADS)

    Charbonneau, Jeremy

    As the perceived quality of a product becomes more important in the manufacturing industry, more emphasis is being placed on accurately predicting the sound quality of everyday objects. This study was undertaken to improve upon current prediction techniques with regard to the psychoacoustic descriptor of loudness, together with an improved binaural summation technique. The feasibility of the project was first investigated through a loudness-matching experiment involving thirty-one subjects and pure tones of constant sound pressure level. A dependence of binaural summation on frequency was observed that had not previously been a subject of investigation in the reviewed literature. A follow-up investigation was carried out with forty-eight volunteers and pure tones of constant sensation level. Contrary to existing theories in the literature, the resulting loudness matches revealed an amplitude-versus-frequency relationship that confirmed the perceived increase in loudness when a signal is presented to both ears simultaneously as opposed to one ear alone. The resulting trend strongly indicated that the higher the frequency of the presented signal, the greater the observed binaural summation. The results from each investigation were summarized in a single binaural summation algorithm and inserted into an improved time-varying loudness model. Using experimental techniques, it was demonstrated that the updated binaural summation algorithm was a considerable improvement over the state-of-the-art approach for predicting perceived binaural loudness. The improved function retained the ease of use of the original model while additionally providing accurate estimates of diotic listening conditions from monaural WAV files. A validation jury test clearly demonstrated that the revised time-varying loudness model was a significant improvement over the previously standardized approach.
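
    To make the reported trend concrete, the sketch below applies a binaural summation gain that grows with frequency, in line with the thesis's finding that higher-frequency signals show greater summation. The functional form and coefficients are invented for illustration and are not the fitted algorithm from the study.

    ```python
    # Illustrative only: a frequency-dependent binaural gain of the kind the
    # study argues for. The coefficients are made up, not the thesis's fit.
    import math

    def binaural_gain_db(freq_hz, g_low=3.0, g_high=6.0,
                         f_low=100.0, f_high=10000.0):
        """Interpolate binaural summation gain (dB) log-linearly in frequency."""
        t = (math.log10(freq_hz) - math.log10(f_low)) / (
             math.log10(f_high) - math.log10(f_low))
        t = min(max(t, 0.0), 1.0)
        return g_low + t * (g_high - g_low)

    # Diotic presentation of a 1 kHz tone: add the gain to the monaural level.
    print(round(binaural_gain_db(1000.0), 2), "dB added at 1 kHz")
    ```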

  20. 77 FR 44218 - Marine Mammals; File No. 16111

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-27

    ... (B. physalus), humpback (Megaptera novaeangliae), eastern gray (Eschrichtius robustus), sperm... significantly impact the quality of the human environment and that preparation of an environmental impact...

  1. 29 CFR 1983.103 - Filing of retaliation complaints.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Labor Regulations Relating to Labor (Continued) OCCUPATIONAL SAFETY AND HEALTH ADMINISTRATION... THE CONSUMER PRODUCT SAFETY IMPROVEMENT ACT OF 2008. Complaints, Investigations, Findings and... may be filed orally or in writing. Oral complaints will be reduced to writing by OSHA. If the...

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    SmartImport.py is a Python source-code file that implements a replacement for the standard Python module importer. The code is derived from knee.py, a file in the standard Python distribution, and adds functionality to improve the performance of Python module imports in massively parallel contexts.
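
    For readers unfamiliar with how an importer can be replaced at all, the sketch below registers a custom meta-path finder (the modern equivalent of the knee.py-style import hook) that memoizes spec lookups so repeated imports skip redundant path searches, the kind of filesystem cost that hurts at massive scale. This shows only the mechanism; SmartImport.py's actual parallel-I/O strategy is not reproduced here.

    ```python
    # Minimal sketch of hooking the import machinery. Not SmartImport.py's
    # code; the caching policy here is illustrative.
    import sys
    import importlib.abc
    import importlib.machinery

    class CachingFinder(importlib.abc.MetaPathFinder):
        def __init__(self):
            self._cache = {}

        def find_spec(self, fullname, path, target=None):
            # Serve repeated lookups from memory instead of walking sys.path.
            if fullname in self._cache:
                return self._cache[fullname]
            spec = importlib.machinery.PathFinder.find_spec(fullname, path, target)
            self._cache[fullname] = spec
            return spec

    sys.meta_path.insert(0, CachingFinder())  # consulted before default finders
    ```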

  3. I/O performance evaluation of a Linux-based network-attached storage device

    NASA Astrophysics Data System (ADS)

    Sun, Zhaoyan; Dong, Yonggui; Wu, Jinglian; Jia, Huibo; Feng, Guanping

    2002-09-01

    In a Local Area Network (LAN), clients are permitted to access files on high-density optical disks via a network server. However, the read service offered by a conventional server is unsatisfactory, because the server performs multiple functions and must serve too many callers. This paper develops a Linux-based Network-Attached Storage (NAS) server. The operating system (OS), composed of an optimized kernel and a miniaturized file system, is stored in flash memory. After initialization, the NAS device is connected to the LAN, and the administrator and users can respectively configure and access the server through web pages. To improve access quality, the management of the buffer cache in the file system is optimized. Several benchmark programs were performed to evaluate the I/O performance of the NAS device. Since data recorded on optical disks are usually accessed read-only, our attention is focused on the reading throughput of the device. The experimental results indicate that the I/O performance of our NAS device is excellent.

  4. Radiographic technical quality of root canal treatment performed by a new rotary single-file system.

    PubMed

    Colombo, Marco; Bassi, Cristina; Beltrami, Riccardo; Vigorelli, Paolo; Spinelli, Antonio; Cavada, Andrea; Dagna, Alberto; Chiesa, Marco; Poggio, Claudio

    2017-01-01

    The aim of the present study was to evaluate radiographically the technical quality of root canal fillings performed by postgraduate students with a new single-file Nickel-Titanium system (F6 SkyTaper, Komet) in clinical practice. Records of 74 patients who had received endodontic treatment by postgraduate students at the School of Dentistry, Faculty of Medicine, University of Pavia between September 2015 and April 2016 were collected and examined; the final sample consisted of 114 teeth and 204 root canals. The quality of endodontic treatment was evaluated by examining the length of the filling in relation to the radiographic apex, the density of the obturation according to the presence of voids, and the taper of the root canal filling. Chi-squared analysis was used to determine statistically significant differences in the technical quality of root fillings according to tooth type, position and curvature. The results showed that 75.49%, 82.84% and 90.69% of root-filled canals had adequate length, density and taper, respectively. Overall, the technical quality of root canal fillings performed by postgraduate students was acceptable in 60.78% of the cases.

  5. Criteria-based audit to improve quality of care of foetal distress: standardising obstetric care at a national referral hospital in a low resource setting, Tanzania.

    PubMed

    Mgaya, Andrew H; Litorp, Helena; Kidanto, Hussein L; Nyström, Lennarth; Essén, Birgitta

    2016-11-08

    In Tanzania, substandard intrapartum management of foetal distress contributes to a third of perinatal deaths, and the majority are term deliveries. We conducted a criteria-based audit with feedback to determine whether standards of diagnosis and management of foetal distress would be improved in a low-resource setting. During 2013-2015, a criteria-based audit was performed at the national referral hospital in Dar es Salaam. Case files of deliveries with a diagnosis of foetal distress were identified and audited. Two registered nurses under supervision of a nurse midwife, a specialist obstetrician and a consultant obstetrician, reviewed the case files. Criteria for standard diagnosis and management of foetal distress were developed based on international and national guidelines, and literature reviews, and then, stepwise applied, in an audit cycle. During the baseline audit, substandard care was identified, and recommendations for improvement of care were proposed and implemented. The effect of the implementations was assessed by the differences in percentage of standard diagnosis and management between the baseline and re-audit, using Chi-square test or Fisher's exact test, when appropriate. In the baseline audit and re-audit, 248 and 251 deliveries with a diagnosis of foetal distress were identified and audited, respectively. The standard of diagnosis increased significantly from 52 to 68 % (p < 0.001). Standards of management improved tenfold from 0.8 to 8.8 % (p < 0.001). Improved foetal heartbeat monitoring using a Fetal Doppler was the major improvement in diagnoses, while change of position of the mother and reduced time interval from decision to perform caesarean section to delivery were the major improvements in management (all p < 0.001). Percentage of cases with substandard diagnosis and management was significantly reduced in both referred public and non-referred private patients (all p ≤ 0.01) but not in non-referred public and referred private patients. The criteria-based audit was able to detect substandard diagnosis and management of foetal distress and improved care using feedback and available resources.

  6. Improved NASTRAN plotting

    NASA Technical Reports Server (NTRS)

    Chan, Gordon C.

    1991-01-01

    The new 1991 COSMIC/NASTRAN version, compatible with the older versions, tries to remove some old constraints and make it easier to extract information from the plot file. It also includes some useful improvements and new enhancements. New features available in the 1991 version are described. They include a new PLT1 tape with simplified ASCII plot commands and short records, a combined hidden and shrunk plot, an x-y-z coordinate system on all structural plots, element offset plots, improved character size control, improved FIND and NOFIND logic, a new NASPLOT post-processor to perform screen plotting or generate PostScript files, and a BASIC/NASTPLOT program for the PC.

  7. 41 CFR 101-26.100-3 - Warranties.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Form (SF) 368, Product Quality Deficiency Report, in duplicate, shall be sent to the GSA Discrepancy... should be clearly stated in the text of the SF 368. This information will be maintained as a quality history file for use in future procurements. (b) If the contractor refuses to correct, or fails to replace...

  8. ARCUS Internet Media Archive (IMA): A Resource for Outreach and Education

    NASA Astrophysics Data System (ADS)

    Polly, Z.; Warnick, W. K.; Polly, J.

    2008-12-01

    The ARCUS Internet Media Archive (IMA) is a collection of photos, graphics, videos, and presentations about the Arctic that are shared through the Internet. It provides the arctic research community and the public at large with a centralized location where images and video pertaining to polar research can be browsed and retrieved for a variety of uses. The IMA currently contains almost 6,500 publicly accessible photos, including 4,000 photos from the National Science Foundation funded Teachers and Researchers Exploring and Collaborating (TREC, now PolarTREC) program, an educational research experience in which K-12 teachers participate in arctic research as a pathway to improving science education. The IMA also includes 450 video files, 270 audio files, nearly 100 graphics and logos, 28 presentations, and approximately 10,000 additional resources that are being prepared for public access. The contents of this archive are organized by file type, contributor's name, event, or by organization, with each photo or file accompanied by information on content, contributor source, and usage requirements. All the files are key-worded and all information, including file name and description, is completely searchable. ARCUS plans to continue to improve and expand the IMA with a particular focus on providing graphics depicting key arctic research results and findings as well as edited video archives of relevant scientific community meetings. To submit files or for more information and to view the ARCUS Internet Media Archive, please go to: http://media.arcus.org or email photo@arcus.org.

  9. Application Of Decision Tree Approach To Student Selection Model- A Case Study

    NASA Astrophysics Data System (ADS)

    Harwati; Sudiya, Amby

    2016-01-01

    The main purpose of an institution is to provide quality education to its students and to improve the quality of managerial decisions. One way to improve the quality of students is to make the selection of new students more selective. This research takes as its case the selection of new students at the Islamic University of Indonesia, Yogyakarta, Indonesia. One of the university's admission routes is an administrative selection based on the records of prospective students at high school, without a written test. Currently, this kind of selection has no standard model or criteria: selection is done only by comparing candidates' application files, so subjective assessment is very likely because there are no standard criteria to differentiate the quality of one student from another. By applying data mining classification techniques, a selection model for new students can be built that includes criteria with defined standards, such as region of origin, school status, average grade, and so on. These criteria are determined from rules induced by classifying the academic achievement (GPA) of students in previous years who entered the university through the same route. The decision tree method with the C4.5 algorithm is used here. The results show that students given priority for admission are those who meet the following criteria: they come from the island of Java, attended a public school, majored in science, have an average grade above 75, and earned at least one achievement during their studies in high school.
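
    A hedged sketch of this kind of classifier is shown below using scikit-learn. Note the stand-in: scikit-learn provides an entropy-based CART rather than C4.5 proper, and the features, encoding and toy data are invented for illustration, not the study's dataset.

    ```python
    # Entropy-criterion decision tree as a C4.5-like stand-in; the features
    # and toy data below are invented for illustration.
    from sklearn.tree import DecisionTreeClassifier, export_text

    # Features: [from_java, public_school, science_major, avg_grade, achievements]
    X = [
        [1, 1, 1, 82, 2],
        [1, 0, 1, 78, 1],
        [0, 1, 0, 71, 0],
        [0, 0, 1, 74, 0],
        [1, 1, 0, 77, 1],
        [0, 1, 1, 69, 0],
    ]
    y = [1, 1, 0, 0, 1, 0]   # 1 = later achieved a good GPA

    clf = DecisionTreeClassifier(criterion="entropy", max_depth=3, random_state=0)
    clf.fit(X, y)
    print(export_text(clf, feature_names=[
        "from_java", "public_school", "science_major", "avg_grade", "achievements"]))
    ```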

  10. Storage, retrieval, and edit of digital video using Motion JPEG

    NASA Astrophysics Data System (ADS)

    Sudharsanan, Subramania I.; Lee, D. H.

    1994-04-01

    In a companion paper we describe a Micro Channel adapter card that can perform real-time JPEG (Joint Photographic Experts Group) compression of a 640 by 480, 24-bit image within 1/30th of a second. Since this corresponds to NTSC video rates at considerably good perceptual quality, the system can be used for real-time capture and manipulation of continuously fed video. To facilitate capturing the compressed video on a storage medium, an IBM bus-master SCSI adapter with cache is utilized. The efficacy of the data transfer mechanism is considerably improved using the System Control Block architecture, an extension for Micro Channel bus masters. Our experimental results show that the overall system sustains compressed data rates of about 1.5 MBytes/second, with sporadic peaks of about 1.8 MBytes/second depending on the image sequence content. We also describe mechanisms to access the compressed data very efficiently through special file formats, which in turn permit the creation of simpler sequence editors. Another advantage of the special file format is easy control of forward, backward and slow-motion playback. The proposed method can be extended to the design of a video compression subsystem for a variety of personal computing systems.
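
    One way such a "special file format" can enable efficient random access, and hence simple forward, backward and slow-motion playback, is to prepend an index of byte offsets to the compressed frames, as in the hypothetical layout below; the paper's actual format is not reproduced here.

    ```python
    # Hypothetical indexed-frame layout: a count, a table of 8-byte offsets,
    # then length-prefixed compressed frames. Any frame can be seeked directly.
    import struct

    def write_indexed(frames, path):
        with open(path, "wb") as f:
            f.write(struct.pack("<I", len(frames)))      # frame count
            index_pos = f.tell()
            f.write(b"\x00" * 8 * len(frames))           # placeholder offsets
            offsets = []
            for data in frames:                          # compressed frames
                offsets.append(f.tell())
                f.write(struct.pack("<I", len(data)))
                f.write(data)
            f.seek(index_pos)
            for off in offsets:
                f.write(struct.pack("<Q", off))

    def read_frame(path, i):
        with open(path, "rb") as f:
            (count,) = struct.unpack("<I", f.read(4))
            assert 0 <= i < count
            f.seek(4 + 8 * i)                            # i-th index entry
            (off,) = struct.unpack("<Q", f.read(8))
            f.seek(off)
            (size,) = struct.unpack("<I", f.read(4))
            return f.read(size)
    ```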

  11. Better Management of Private Pension Plan Data Can Reduce Costs and Improve ERISA Administration.

    DTIC Science & Technology

    1981-10-19

    effectiveness of (1) the agencies' efforts to make sure pension plans file ERISA annual reports, annual premium filings, and summary plan descriptions... Information required to be reported annually by private pension plans is not being effectively, efficiently, or economically managed. Although complex... summaries if Labor is to effectively provide requestors with summaries from its own files as anticipated by ERISA. Such action would add significantly to...

  12. Chemical annotation of small and peptide-like molecules at the Protein Data Bank

    PubMed Central

    Young, Jasmine Y.; Feng, Zukang; Dimitropoulos, Dimitris; Sala, Raul; Westbrook, John; Zhuravleva, Marina; Shao, Chenghua; Quesada, Martha; Peisach, Ezra; Berman, Helen M.

    2013-01-01

    Over the past decade, the number of polymers and their complexes with small molecules in the Protein Data Bank archive (PDB) has continued to increase significantly. To support scientific advancements and ensure the best quality and completeness of the data files over the next 10 years and beyond, the Worldwide PDB partnership that manages the PDB archive is developing a new deposition and annotation system. This system focuses on efficient data capture across all supported experimental methods. The new deposition and annotation system is composed of four major modules that together support all of the processing requirements for a PDB entry. In this article, we describe one such module called the Chemical Component Annotation Tool. This tool uses information from both the Chemical Component Dictionary and Biologically Interesting molecule Reference Dictionary to aid in annotation. Benchmark studies have shown that the Chemical Component Annotation Tool provides significant improvements in processing efficiency and data quality. Database URL: http://wwpdb.org PMID:24291661

  13. Peer Review and Surgical Innovation: Robotic Surgery and Its Hurdles.

    PubMed

    Vyas, Dinesh; Cronin, Sean

    2015-12-01

    The peer review process outlined in the Health Care Quality Improvement Act (HCQIA) is meant to ensure a quality standard of care through a self-policing mechanism within the medical community. The process grants immunity to people filing a peer review, which is meant to protect whistleblowers; however, it also creates a loophole that can be used maliciously to hinder competition. This is accentuated when surgeons are integrating new technologies, such as robotic surgery, into their practice. With more than 2000 da Vinci robots in use and more than 300 new units shipped each year, robotic surgery has become a mainstay of the surgical field, and its applications continue to expand as surgeons discover its growing capabilities. We need a better peer review process, one that ensures reviews are free of competitive bias, and peer reviewers need to be familiar with both the procedure and the technology. The current process could stymie innovation in the name of competition.

  14. MAGI: a Node.js web service for fast microRNA-Seq analysis in a GPU infrastructure.

    PubMed

    Kim, Jihoon; Levy, Eric; Ferbrache, Alex; Stepanowsky, Petra; Farcas, Claudiu; Wang, Shuang; Brunner, Stefan; Bath, Tyler; Wu, Yuan; Ohno-Machado, Lucila

    2014-10-01

    MAGI is a web service for fast MicroRNA-Seq data analysis in a graphics processing unit (GPU) infrastructure. Using just a browser, users have access to results as web reports in just a few hours, a more than 600% end-to-end performance improvement over the state of the art. MAGI's salient features are (i) transfer of large input files in native FASTA with Qualities (FASTQ) format through drag-and-drop operations, (ii) rapid prediction of microRNA target genes leveraging parallel computing with GPU devices, (iii) all-in-one analytics with novel feature extraction, statistical tests for differential expression and diagnostic plot generation for quality control and (iv) interactive visualization and exploration of results in web reports that are readily available for publication. MAGI relies on the Node.js JavaScript framework, along with NVIDIA CUDA C, PHP: Hypertext Preprocessor (PHP), Perl and R. It is freely available at http://magi.ucsd.edu. © The Author 2014. Published by Oxford University Press.

  15. Chemical annotation of small and peptide-like molecules at the Protein Data Bank.

    PubMed

    Young, Jasmine Y; Feng, Zukang; Dimitropoulos, Dimitris; Sala, Raul; Westbrook, John; Zhuravleva, Marina; Shao, Chenghua; Quesada, Martha; Peisach, Ezra; Berman, Helen M

    2013-01-01

    Over the past decade, the number of polymers and their complexes with small molecules in the Protein Data Bank archive (PDB) has continued to increase significantly. To support scientific advancements and ensure the best quality and completeness of the data files over the next 10 years and beyond, the Worldwide PDB partnership that manages the PDB archive is developing a new deposition and annotation system. This system focuses on efficient data capture across all supported experimental methods. The new deposition and annotation system is composed of four major modules that together support all of the processing requirements for a PDB entry. In this article, we describe one such module called the Chemical Component Annotation Tool. This tool uses information from both the Chemical Component Dictionary and Biologically Interesting molecule Reference Dictionary to aid in annotation. Benchmark studies have shown that the Chemical Component Annotation Tool provides significant improvements in processing efficiency and data quality. Database URL: http://wwpdb.org.

  16. Electrochemical impedance spectroscopy investigation on the clinical lifetime of ProTaper rotary file system.

    PubMed

    Penta, Virgil; Pirvu, Cristian; Demetrescu, Ioana

    2014-01-01

    The main objective of the current paper is to show that electrochemical impedance spectroscopy (EIS) can be used to evaluate and predict the clinical lifespan of the ProTaper rotary file system. This aspect of the everyday use of endodontic files is of great importance in every dental practice and has profound clinical implications. The quantification method is grounded in electrochemical impedance spectroscopy theory and focuses on the characteristics of the surface titanium oxide layer. The technique was successfully adapted to assess the quality of the oxide layer on Ni-Ti files: modification of this protective layer changes the corrosion behavior of the alloy and thereby the measured impedance of the file. To assess the method, 14 ProTaper sets used on different patients in a dental clinic were submitted for EIS testing. The information obtained about the surface oxide layer offered an indication of wear and proves that the layer evolves with each clinical application. The novelty of this research lies in an electrochemical technique successfully adapted for Ni-Ti file investigation and correlated with surface and clinical aspects.

  17. Improving Assistance to Domestic and Sexual Violence Victims Act of 2009

    THOMAS, 111th Congress

    Sen. Leahy, Patrick J. [D-VT

    2009-01-26

    Senate - 10/01/2009 By Senator Leahy from Committee on the Judiciary filed written report. Report No. 111-85. Minority views filed.

  18. Profex: a graphical user interface for the Rietveld refinement program BGMN.

    PubMed

    Doebelin, Nicola; Kleeberg, Reinhard

    2015-10-01

    Profex is a graphical user interface for the Rietveld refinement program BGMN . Its interface focuses on preserving BGMN 's powerful and flexible scripting features by giving direct access to BGMN input files. Very efficient workflows for single or batch refinements are achieved by managing refinement control files and structure files, by providing dialogues and shortcuts for many operations, by performing operations in the background, and by providing import filters for CIF and XML crystal structure files. Refinement results can be easily exported for further processing. State-of-the-art graphical export of diffraction patterns to pixel and vector graphics formats allows the creation of publication-quality graphs with minimum effort. Profex reads and converts a variety of proprietary raw data formats and is thus largely instrument independent. Profex and BGMN are available under an open-source license for Windows, Linux and OS X operating systems.

  19. Profex: a graphical user interface for the Rietveld refinement program BGMN

    PubMed Central

    Doebelin, Nicola; Kleeberg, Reinhard

    2015-01-01

    Profex is a graphical user interface for the Rietveld refinement program BGMN. Its interface focuses on preserving BGMN’s powerful and flexible scripting features by giving direct access to BGMN input files. Very efficient workflows for single or batch refinements are achieved by managing refinement control files and structure files, by providing dialogues and shortcuts for many operations, by performing operations in the background, and by providing import filters for CIF and XML crystal structure files. Refinement results can be easily exported for further processing. State-of-the-art graphical export of diffraction patterns to pixel and vector graphics formats allows the creation of publication-quality graphs with minimum effort. Profex reads and converts a variety of proprietary raw data formats and is thus largely instrument independent. Profex and BGMN are available under an open-source license for Windows, Linux and OS X operating systems. PMID:26500466

  20. Aquatic Toxicity Information Retrieval Data Base (ACQUIRE). Data file

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The purpose of ACQUIRE is to provide scientists and managers quick access to a comprehensive, systematic, computerized compilation of aquatic toxicity data. Scientific papers published both nationally and internationally on the toxicity of chemicals to aquatic organisms and plants are collected and reviewed for ACQUIRE. Independently compiled data files that meet ACQUIRE parameter and quality assurance criteria are also included. Selected toxicity test results and related testing information for any individual chemical from laboratory and field aquatic toxicity effects are included for tests with freshwater and marine organisms. The total number of data records in ACQUIRE is now over 105,300. This includes data from 6000 references, for 5200 chemicals and 2400 test species. A major data file, Acute Toxicity of Organic Chemicals (ATOC), has been incorporated into ACQUIRE. The ATOC file contains laboratory acute test data on 525 organic chemicals using juvenile fathead minnows.

  1. Harmonize Pipeline and Archiving System: PESSTO@IA2 Use Case

    NASA Astrophysics Data System (ADS)

    Smareglia, R.; Knapic, C.; Molinaro, M.; Young, D.; Valenti, S.

    2013-10-01

    The Italian Astronomical Archives Center (IA2) is a research infrastructure project that aims at coordinating different national and international initiatives to improve the quality of astrophysical data services. IA2 is now also involved in the PESSTO (Public ESO Spectroscopic Survey of Transient Objects) collaboration, developing a complete archiving system to store calibrated post-processed data (including sensitive intermediate products), a user interface to access private data, and Virtual Observatory (VO) compliant web services to access public fast-reduction data via VO tools. The archive system relies on the PESSTO Marshall to provide file data and the associated metadata output by the PESSTO data-reduction pipeline. To harmonize the object repository, data handling and archiving system, new tools are under development. These systems must interact closely, without increasing the complexity of any single task, in order to improve the performance of the whole system, and must have robust logic so that all operations are performed in coordination with the other PESSTO tools. MySQL replication technology and triggers are used for the synchronization of new data in an efficient, fault-tolerant manner. A general-purpose library is under development to manage data from raw observations to final calibrated ones, open to overriding by different sources, formats, management fields, storage and publication policies. Configurations for all the systems are stored in a dedicated schema (no configuration files) and can be easily updated via a planned Archiving System Configuration Interface (ASCI).

  2. Conducting research on the Medicare market: the need for better data and methods.

    PubMed

    Wong, H S; Hellinger, F J

    2001-04-01

    To highlight data limitations, the need to improve data collection, the need to develop better analytic methods, and the need to use alternative data sources to conduct research related to the Medicare program. Objectives were achieved by reviewing existing studies on risk selection in Medicare HMOs, examining their data limitations, and introducing a new approach that circumvents many of these shortcomings. Data for years 1995-97 for five states (Arizona, Florida, Massachusetts, New York, and Pennsylvania) from the Healthcare Cost and Utilization Project (HCUP) State Inpatient Databases (SIDs), maintained by the Agency for Healthcare Research and Quality; and the Health Care Financing Administration's Medicare Managed Care Market Penetration Data Files and Medicare Provider Analysis and Review Files. Analysis of hospital utilization rates for Medicare beneficiaries in the traditional fee-for-service (FFS) Medicare and Medicare HMO sectors and examination of the relationship between these rates and the Medicare HMO penetration rates. Medicare HMOs have lower hospital utilization rates than their FFS counterparts, differences in utilization rates vary across states, and HMO penetration rates are inversely related to our rough measure of favorable selection. Substantial growth in Medicare HMO enrollment and the implementation of a new risk-adjusted payment system have led to an increasing need for research on the Medicare program. Improved data collection, better methods, new creative approaches, and alternative data sources are needed to address these issues in a timely and suitable manner.

  3. Representation of thermal infrared imaging data in the DICOM using XML configuration files.

    PubMed

    Ruminski, Jacek

    2007-01-01

    The DICOM standard has become a widely accepted and implemented format for the exchange and storage of medical imaging data. Different imaging modalities are supported, but there is no dedicated solution for thermal infrared imaging in medicine. In this article we propose new ideas and improvements for the final proposal of the new DICOM Thermal Infrared Imaging structures and services. Additionally, we designed, implemented and tested software packages for universal conversion of existing thermal imaging files to the DICOM format using XML configuration files. The proposed solution works fast and requires a minimal number of user interactions. The XML configuration file makes it possible to compose a set of attributes for any source file format of a thermal imaging camera.
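
    A minimal sketch of the XML-driven mapping might look like this, assuming a made-up configuration schema and using pydicom (2.x-style API) as a stand-in for whatever toolkit the authors used:

    ```python
    # Hedged sketch: an XML config names the DICOM attributes to compose.
    # The config schema and values are hypothetical.
    import xml.etree.ElementTree as ET
    from pydicom.dataset import Dataset, FileMetaDataset
    from pydicom.uid import ExplicitVRLittleEndian, generate_uid

    CONFIG = """<config camera="thermoCamA">
      <attribute keyword="PatientName">Anon^Thermal</attribute>
      <attribute keyword="Modality">OT</attribute>
      <attribute keyword="SeriesDescription">Thermal infrared map</attribute>
    </config>"""

    SC_STORAGE = "1.2.840.10008.5.1.4.1.1.7"    # Secondary Capture SOP class

    ds = Dataset()
    for node in ET.fromstring(CONFIG).iter("attribute"):
        setattr(ds, node.get("keyword"), node.text)   # keyword-driven mapping
    ds.SOPClassUID = SC_STORAGE
    ds.SOPInstanceUID = generate_uid()

    meta = FileMetaDataset()
    meta.TransferSyntaxUID = ExplicitVRLittleEndian
    meta.MediaStorageSOPClassUID = ds.SOPClassUID
    meta.MediaStorageSOPInstanceUID = ds.SOPInstanceUID
    ds.file_meta = meta
    ds.is_little_endian, ds.is_implicit_VR = True, False

    ds.save_as("thermal.dcm", write_like_original=False)
    ```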

  4. Investigation for improving Global Positioning System (GPS) orbits using a discrete sequential estimator and stochastic models of selected physical processes

    NASA Technical Reports Server (NTRS)

    Goad, Clyde C.; Chadwell, C. David

    1993-01-01

    GEODYNII is a conventional batch least-squares differential corrector computer program with deterministic models of the physical environment. Conventional algorithms were used to process differenced phase and pseudorange data to determine eight-day Global Positioning System (GPS) orbits with several-meter accuracy. However, random physical processes drive the errors whose magnitudes prevent improving the GPS orbit accuracy. To improve the orbit accuracy, these random processes should be modeled stochastically. The conventional batch least-squares algorithm cannot accommodate stochastic models; only a stochastic estimation algorithm is suitable, such as a sequential filter/smoother. Also, GEODYNII cannot currently model the correlation among data values. Differenced pseudorange, and especially differenced phase, are precise data types that can be used to improve the GPS orbit precision. To overcome these limitations and improve the accuracy of GPS orbits computed using GEODYNII, we proposed to develop a sequential stochastic filter/smoother processor by using GEODYNII as a type of trajectory preprocessor. Our proposed processor is now completed. It contains a correlated double-difference range processing capability, first-order Gauss-Markov models for the solar radiation pressure scale coefficient and y-bias acceleration, and a random walk model for the tropospheric refraction correction. The development approach was to interface the standard GEODYNII output files (measurement partials and variationals) with software modules containing the stochastic estimator, the stochastic models, and a double-differenced phase range processing routine. Thus, no modifications to the original GEODYNII software were required. A schematic of the development is shown. The observational data are edited in the preprocessor and passed to GEODYNII as one of its standard data types. A reference orbit is determined using GEODYNII as a batch least-squares processor, and the GEODYNII measurement partial (FTN90) and variational (FTN80, V-matrix) files are generated. These two files, along with a control statement file and a satellite identification and mass file, are passed to the filter/smoother to estimate time-varying parameter states at each epoch, improved satellite initial elements, and improved estimates of constant parameters.
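
    For intuition about the stochastic models named above, the sketch below propagates a scalar first-order Gauss-Markov parameter, x_{k+1} = exp(-dt/tau) x_k + w_k, through a simple scalar filter update; a random walk is the tau-to-infinity limit with process noise variance sigma^2 dt. The numbers are illustrative, and none of this is GEODYNII code.

    ```python
    # Compact sketch of a first-order Gauss-Markov parameter inside a
    # scalar propagate/update filter cycle. Values are illustrative.
    import math

    def propagate_gauss_markov(x, P, dt, tau, sigma):
        """x' = phi*x with phi = exp(-dt/tau); steady-state variance sigma^2."""
        phi = math.exp(-dt / tau)
        q = sigma**2 * (1.0 - phi**2)       # discrete process-noise variance
        return phi * x, phi * P * phi + q

    def measurement_update(x, P, z, r):
        """Scalar Kalman update with measurement z, variance r, H = 1."""
        k = P / (P + r)                     # Kalman gain
        return x + k * (z - x), (1.0 - k) * P

    x, P = 0.0, 1.0
    for z in [0.9, 1.1, 1.0]:               # fake residual observations
        x, P = propagate_gauss_markov(x, P, dt=60.0, tau=3600.0, sigma=1.0)
        x, P = measurement_update(x, P, z, r=0.25)
    print(round(x, 3), round(P, 4))
    ```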

  5. Connecting the providers in your healthcare community: one step at a time.

    PubMed

    Nelson, Rosemarie

    2005-01-01

    The practice of medicine is a business of communications. Communications can be facilitated by technology. Healthcare providers organized in medical practices, hospitals, and nursing homes have tremendous needs to effectively communicate within their organizations and between their organizations. The focus on electronic medical records comes not only from the need to communicate but also from a desire to reduce administrative costs and to improve services and quality of care to patients. Frustration with the inadequacies of a paper chart-filing system drives providers in all delivery venues toward technology at an increasing rate. Implementation barriers to technology adoption in medical practices can be overcome by incremental approaches and knowledge-transfer assistance from affiliated community healthcare partners such as hospitals.

  6. Electroconvulsive Therapy Practice in the Province of Quebec: Linked Health Administrative Data Study from 1996 to 2013.

    PubMed

    Lemasson, Morgane; Haesebaert, Julie; Rochette, Louis; Pelletier, Eric; Lesage, Alain; Patry, Simon

    2017-01-01

    As part of a quality improvement process, we propose a model of routinely monitoring electroconvulsive therapy (ECT) in Canadian provinces using linked health administrative databases to generate provincial periodic reports, influence policy, and standardise ECT practices. ECT practice in Quebec was studied from 1996 to 2013, using longitudinal data from the Quebec Integrated Chronic Disease Surveillance System of the Institut National de Santé Publique du Québec, which links 5 health administrative databases. The population included all persons, aged 18 y and over, eligible for the health insurance registry, who received an ECT treatment at least once during the year. Among recorded cases, 75% were identified by physician claims and hospitalisation files, 19% exclusively by physician claims, and 6% by hospitalisation files. From 1996 to 2013, 8,149 persons in Quebec received ECT with an annual prevalence rate of 13 per 100,000. A decline was observed, which was more pronounced in women and in older persons. On average, each patient received 9.7 treatments of ECT annually. The proportion of acute ECT decreased whereas maintenance treatment proportions increased. A wide variation in the use of ECT was observed among regions and psychiatrists. This study demonstrates the profitable use of administrative data to monitor ECT use in Quebec, and provides a reliable method that could be replicated in other Canadian provinces. Although Quebec has one of the lowest utilisation rates reported in industrialized countries, regional disparities highlighted the need for a deeper examination of the quality and monitoring of ECT care and services.

  7. NASA CDDIS: Next Generation System

    NASA Astrophysics Data System (ADS)

    Michael, B. P.; Noll, C. E.; Woo, J. Y.; Limbacher, R. I.

    2017-12-01

    The Crustal Dynamics Data Information System (CDDIS) supports data archiving and distribution activities for the space geodesy and geodynamics community. The main objectives of the system are to make space geodesy and geodynamics related data and derived products available in a central archive, to maintain information about the archival of these data, to disseminate these data and information in a timely manner to a global scientific research community, and to provide user-based tools for the exploration and use of the archive. As the techniques and data volume have increased, the CDDIS has evolved to offer a broad range of data ingest services, from data upload, quality control and documentation to metadata extraction and ancillary information. As a major step toward improving services, the CDDIS has transitioned to a new hardware system and implemented incremental upgrades to a new software system to meet these goals while increasing automation. The new system increases the ability of the CDDIS to consistently track errors and issues associated with data and derived product files uploaded to the system and to perform post-ingest checks on all files received for the archive. In addition, software to process new data sets and to handle changes to existing data sets has been implemented to support new formats and any issues identified during the ingest process. In this poster, we discuss the CDDIS archive in general, and review and contrast the system structures and quality control measures employed before and after the system upgrade. We also present information about new data sets and changes to existing data and derived products archived at the CDDIS.

  8. flowAI: automatic and interactive anomaly discerning tools for flow cytometry data.

    PubMed

    Monaco, Gianni; Chen, Hao; Poidinger, Michael; Chen, Jinmiao; de Magalhães, João Pedro; Larbi, Anis

    2016-08-15

    Flow cytometry (FCM) is widely used in both clinical and basic research to characterize cell phenotypes and functions. The latest FCM instruments analyze up to 20 markers of individual cells, producing high-dimensional data. This requires the use of the latest clustering and dimensionality-reduction techniques to segregate cell sub-populations automatically and in an unbiased manner. However, automated analyses may lead to false discoveries due to inter-sample differences in quality and properties. We present an R package, flowAI, containing two methods to clean FCM files of unwanted events: (i) an automatic method that adopts algorithms for the detection of anomalies and (ii) an interactive method with a graphical user interface implemented as an R shiny application. The general approach behind the two methods consists of three key steps to check for and remove suspected anomalies that derive from (i) abrupt changes in the flow rate, (ii) instability of signal acquisition and (iii) outliers in the lower limit and margin events in the upper limit of the dynamic range. For each file analyzed, our software generates a summary of the quality assessment from the aforementioned steps. The software presented is an intuitive solution that seeks to improve the results not only of manual but, in particular, of automatic analysis of FCM data. R source code is available through Bioconductor: http://bioconductor.org/packages/flowAI/. Contacts: mongianni1@gmail.com or Anis_Larbi@immunol.a-star.edu.sg. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
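
    As an illustration of the first of the three checks, anomalies in the flow rate, the sketch below bins event times and flags bins whose event rate departs from the median by several robust (MAD) units. This is a toy detector under assumed parameters, not flowAI's actual algorithm (and it is in Python rather than R).

    ```python
    # Toy flow-rate anomaly detector: flag time bins whose event rate is far
    # from the median in median-absolute-deviation units.
    import numpy as np

    def flag_flow_rate_anomalies(event_times, bin_width=0.1, nmads=5.0):
        """Boolean mask over bins: True where the acquisition rate is anomalous."""
        t = np.asarray(event_times, dtype=float)
        edges = np.arange(t.min(), t.max() + bin_width, bin_width)
        rate, _ = np.histogram(t, bins=edges)
        med = np.median(rate)
        mad = np.median(np.abs(rate - med)) or 1.0
        return np.abs(rate - med) > nmads * mad

    rng = np.random.default_rng(0)
    times = np.sort(rng.uniform(0, 10, 20000))
    times = np.concatenate([times, np.full(3000, 4.05)])  # simulated clog burst
    print(flag_flow_rate_anomalies(times).sum(), "anomalous bins")
    ```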

  9. [Root canal treatment of mandibular second premolar tooth with taurodontism].

    PubMed

    Vujasković, Mirjana; Karadzić, Branislav; Miletić, Vesna

    2008-01-01

    Taurodontism is a morpho-anatomical change in the shape of a tooth, characterized by an enlarged body with smaller-than-usual roots. The internal anatomy matches this appearance: a taurodontal tooth has a large pulp chamber and apically positioned furcations. This dental anomaly may be associated with various syndromes and congenital disorders. We present a rare case of taurodontism in a mandibular second premolar with chronic periodontitis. Endodontic treatment was performed after taking the dental history and performing a clinical examination. Special care is required in every phase of endodontic treatment of a taurodontal tooth, from identifying the orifice, exploring the canal and determining working length to cleaning, shaping and obturating the root canal. A precurved K-file was used for canal exploration and for locating the furcation. One mesial and one distal canal, in the buccal position, were identified in the apical third of the root canal. The working lengths of the two canals were determined by radiographic interpretation with a K-file in each canal and verified with an apex locator. During canal instrumentation, a third canal was located in the disto-lingual position; its working length was established using the apex locator. Thorough knowledge of tooth anatomy and its variations can lower the percentage of endodontic failures. Each clinical case involving such teeth should be investigated carefully, clinically and radiographically, to detect additional root canals. High-quality radiographs taken from different angles and a proper set of instruments improve the quality of the endodontic procedure.

  10. Finite Element Analysis of Copper Single Crystal Shape Memory Alloy-Based Endodontic Instruments

    NASA Astrophysics Data System (ADS)

    Vincent, Marin; Thiebaud, Frédéric; Bel Haj Khalifa, Saifeddine; Engels-Deutsch, Marc; Ben Zineb, Tarak

    2015-10-01

    The aim of the present paper is the development of endodontic Cu-based single crystal Shape Memory Alloy (SMA) instruments in order to eliminate the antimicrobial and mechanical deficiencies observed with conventional Nickel-Titanium (NiTi) SMA files. A thermomechanical constitutive law, previously developed and implemented in a finite element code by our research group, is adopted for the simulation of the single crystal SMA behavior. The corresponding material parameters were identified from experimental results for a tensile test at room temperature. A computer-aided design geometry was produced and used for a finite element structural analysis of the endodontic Cu-based single crystal SMA files. The files are meshed with tetrahedral continuum elements to improve the computation time and the accuracy of results. The geometric parameters tested in this study are the length of the active blade, the rod length, the pitch, the taper, the tip diameter, and the rod diameter. For each set of adopted parameters, a finite element model is built and tested in combined bending-torsion loading in accordance with the ISO 3630-1 norm. The numerical analysis based on the finite element procedure made it possible to propose an optimal geometry for Cu-based single crystal SMA endodontic files. The same analysis was carried out for classical NiTi SMA files and a comparison was made between the two kinds of files: it showed that Cu-based single crystal SMA files are less stiff than NiTi files. Cu-based endodontic files could therefore be used to improve root canal treatments. However, the finite element analysis brought out the need for further investigation based on experiments.

  11. Measurements of file transfer rates over dedicated long-haul connections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Nageswara S; Settlemyer, Bradley W; Imam, Neena

    2016-01-01

    Wide-area file transfers are an integral part of several High-Performance Computing (HPC) scenarios. Dedicated network connections with high capacity, low loss rate and low competing traffic, are increasingly being provisioned over current HPC infrastructures to support such transfers. To gain insights into these file transfers, we collected transfer rate measurements for Lustre and xfs file systems between dedicated multi-core servers over emulated 10 Gbps connections with round trip times (rtt) in 0-366 ms range. Memory transfer throughput over these connections is measured using iperf, and file IO throughput on host systems is measured using xddprof. We consider two file system configurations: Lustre over IB network and xfs over SSD connected to PCI bus. Files are transferred using xdd across these connections, and the transfer rates are measured, which indicate the need to jointly optimize the connection and host file IO parameters to achieve peak transfer rates. In particular, these measurements indicate that (i) peak file transfer rate is lower than peak connection and host IO throughput, in some cases by as much as 50% or lower, (ii) xdd request sizes that achieve peak throughput for host file IO do not necessarily lead to peak file transfer rates, and (iii) parallelism in host IO and TCP transport does not always improve the file transfer rates.

  12. Analyzing endosonic root canal file oscillations: an in vitro evaluation.

    PubMed

    Lea, Simon C; Walmsley, A Damien; Lumley, Philip J

    2010-05-01

    Passive ultrasonic irrigation may be used to improve bacterial reduction within the root canal. The technique relies on a small file being driven to oscillate freely within the canal, activating an irrigant solution through biophysical forces such as microstreaming. There is limited information regarding a file's oscillation patterns when it is operated surrounded by fluid, as is the case within the root canal. Files of different sizes (#10 and #30; 27 mm and 31 mm) were connected to an ultrasound generator via a 120-degree file holder. The files were immersed in a water bath, and a laser vibrometer was set up with measurement lines superimposed over the files and scanned over the oscillating files. Measurements were repeated 10 times for each file/power setting used. File mode shapes comprise a series of nodes/antinodes, with thinner, longer files producing more antinodes. The maximum vibration occurred at the free end of the file. Increasing generator power had no significant effect on this maximum amplitude (p > 0.20). Maximum displacement amplitudes were 17 to 22 μm (#10 file, 27 mm), 15 to 21 μm (#10 file, 31 mm), 6 to 9 μm (#30 file, 27 mm), and 5 to 7 μm (#30 file, 31 mm) for all power settings. Antinodes occurring along the remaining file length were significantly larger at generator power 1 than at powers 2 through 5 (p < 0.03). At higher generator powers, energy delivered to the file is dissipated in unwanted vibration, resulting in reduced vibration displacement amplitudes. This may reduce the occurrence of the biophysical forces necessary to maximize the technique's effectiveness. Copyright (c) 2010 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  13. SU-E-T-784: Using MLC Log Files for Daily IMRT Delivery Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stathakis, S; Defoor, D; Linden, P

    2015-06-15

    Purpose: To verify daily intensity modulated radiation therapy (IMRT) treatments using multi-leaf collimator (MLC) log files. Methods: The MLC log files from a Novalis TX Varian linear accelerator were used in this study. The MLC files were recorded daily for all patients undergoing IMRT or volumetric modulated arc therapy (VMAT). The first record of each patient was used as reference, and all records for subsequent days were compared against the reference. In-house MATLAB software was used for the comparisons. Each MLC log file was converted to a fluence map (FM), and a gamma index (γ) analysis was used for the evaluation of each daily delivery for every patient. The tolerance for the gamma index was set to 2% dose difference and 2 mm distance to agreement, while points with signal of 10% or lower of the maximum value were excluded from the comparisons. Results: The γ between each of the reference FMs and the consecutive daily fraction FMs had an average value of 99.1% (range, 98.2 to 100.0%). The FM images were reconstructed at various resolutions in order to study the effect of the resolution on the γ and at the same time reduce the time for processing the images. We found that comparison of images with the highest resolution (768×1024) yielded on average a lower γ (99.1%) than the ones with low resolution (192×256) (γ = 99.5%). Conclusion: We developed in-house software that allows us to monitor the quality of daily IMRT and VMAT treatment deliveries using information from the MLC log files of the linear accelerator. The information can be analyzed and evaluated as early as after the completion of each daily treatment. Such a tool can be valuable for assessing the effect of MLC positioning on plan quality, especially in the context of adaptive radiotherapy.
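
    The in-house MATLAB tool is not public, but the gamma criterion it applies (2% dose difference, 2 mm distance to agreement, 10% low-signal cutoff) can be sketched in Python with numpy; the uniform pixel spacing and array inputs are assumptions:

        # Simplified global 2D gamma analysis: 2% dose difference, 2 mm DTA,
        # 10% low-signal cutoff; assumes equal pixel spacing on both axes.
        import numpy as np

        def gamma_pass_rate(ref, test, spacing_mm=1.0, dd=0.02, dta_mm=2.0, cutoff=0.1):
            norm = dd * ref.max()                       # global dose criterion
            search = int(np.ceil(dta_mm / spacing_mm))  # search radius in pixels
            ys, xs = np.nonzero(ref >= cutoff * ref.max())
            passed = 0
            for y, x in zip(ys, xs):
                y0, y1 = max(0, y - search), min(ref.shape[0], y + search + 1)
                x0, x1 = max(0, x - search), min(ref.shape[1], x + search + 1)
                yy, xx = np.mgrid[y0:y1, x0:x1]
                dist2 = ((yy - y) ** 2 + (xx - x) ** 2) * spacing_mm ** 2
                dose2 = (test[y0:y1, x0:x1] - ref[y, x]) ** 2
                if np.min(dose2 / norm ** 2 + dist2 / dta_mm ** 2) <= 1.0:
                    passed += 1
            return passed / len(ys)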

  14. Field Evaluation in Four NEEMO Divers of a Prototype In-suit Doppler Ultrasound Bubble Detector

    NASA Technical Reports Server (NTRS)

    Acock, K. E.; Gernhardt, M. L.; Conkin, J.; Powell, M. R.

    2004-01-01

    It is desirable to know if astronauts produce venous gas emboli (VGE) as a result of their exposure to 4.3 psia during space walks. The current prototype in-suit Doppler (ISD) ultrasound bubble detector provides an objective assessment of decompression stress by monitoring for VGE. The NOAA Aquarius habitat and NASA Extreme Environment Mission Operations (NEEMO) series of dives provided an opportunity to assess the ability of the prototype ISDs to record venous blood flow and possibly detect VGE in the pulmonary artery. From July 16 to 29, 2003, four aquanauts (two males and two females) donned the ISD for a 4 hr automated recording session, following excursion dives (up to 6 hrs and 29 MSW below storage depth) from air saturation at 17 MSW. Doppler recordings for 32 excursion dives were collected. The recordings consisted of approximately 150 digital wave files. Each wave file contained 24 sec of recording for each minute. A 1-4 Doppler Quality Score (DQS) was assigned to each wave file in 17 of the 32 records evaluated to date. A DQS of 1 indicates a poor flow signal and a score of 4 indicates an optimum signal. Only 23% of all wave files had DQSs considered adequate to detect low grade VGE (Spencer I-II). The distribution of DQS in 2,356 wave files is as follows: DQS 1: 56%, DQS 2: 21%, DQS 3: 18%, and DQS 4: 5%. Six of the 17 records had false positive VGE (Spencer I-IV) detected in one or more wave files per dive record. The false positive VGE recordings are attributable to air entrainment associated with drinking (verified by control tests), and this observation is important as astronauts drink water during space walks. The current ISD design provides quality recordings only over a narrow range of chest anatomy.

  15. Usability and Interoperability Improvements for an EASE-Grid 2.0 Passive Microwave Data Product Using CF Conventions

    NASA Astrophysics Data System (ADS)

    Hardman, M.; Brodzik, M. J.; Long, D. G.

    2017-12-01

    Beginning in 1978, the satellite passive microwave data record has been a mainstay of remote sensing of the cryosphere, providing twice-daily, near-global spatial coverage for monitoring changes in hydrologic and cryospheric parameters that include precipitation, soil moisture, surface water, vegetation, snow water equivalent, sea ice concentration and sea ice motion. Historical versions of the gridded passive microwave data sets were produced as flat binary files described in human-readable documentation. This format is error-prone and makes it difficult to reliably include all processing and provenance information. Funded by NASA MEaSUREs, we have completely reprocessed the gridded data record that includes SMMR, SSM/I-SSMIS and AMSR-E. The new Calibrated Enhanced-Resolution Brightness Temperature (CETB) Earth System Data Record (ESDR) files are self-describing. Our approach to the new data set was to create netCDF4 files that use standard metadata conventions and best practices to incorporate file-level, machine- and human-readable contents, geolocation, processing and provenance metadata. We followed the flexible and adaptable Climate and Forecast (CF-1.6) Conventions with respect to their coordinate conventions and map projection parameters. Additionally, we made use of the Attribute Conventions for Dataset Discovery (ACDD-1.3), which provide file-level conventions with spatio-temporal bounds that enable indexing software to search for coverage. Our CETB files also include temporal coverage and spatial resolution in the file-level metadata for human readability. We made use of the JPL CF/ACDD Compliance Checker to guide this work. We tested our file format with real software, for example the netCDF Command-line Operators (NCO), for flexible spatio-temporal subsetting and concatenation of files. The GDAL tools understand the CF metadata and produce fully compliant GeoTIFF files from our data. ArcMap can then reproject the GeoTIFF files on-the-fly and work with other geolocated data such as coastlines, with no special work required. We expect this combination of standards and well-tested interoperability to significantly improve the usability of this important ESDR for the Earth Science community.
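
    As a minimal sketch of the file-writing approach described, the following Python snippet creates a netCDF4 file carrying CF-style variable metadata and ACDD-style file-level discovery metadata; the variable name, grid size, and attribute values are illustrative, not the actual CETB product layout:

        # Write a small self-describing netCDF4 file with CF-1.6 style variable
        # metadata and ACDD-1.3 style file-level discovery metadata.
        import numpy as np
        from netCDF4 import Dataset

        with Dataset("tb_example.nc", "w", format="NETCDF4") as nc:
            nc.Conventions = "CF-1.6, ACDD-1.3"
            nc.title = "Example gridded brightness temperature"
            nc.geospatial_lat_min = -90.0
            nc.geospatial_lat_max = 90.0
            nc.time_coverage_start = "2003-07-16T00:00:00Z"
            nc.createDimension("y", 180)
            nc.createDimension("x", 360)
            tb = nc.createVariable("brightness_temperature", "f4", ("y", "x"),
                                   zlib=True, fill_value=-9999.0)
            tb.standard_name = "brightness_temperature"  # CF standard name
            tb.units = "K"
            tb[:] = np.full((180, 360), 250.0, dtype="f4")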

  16. Generation of Long-time Complex Signals for Testing the Instruments for Detection of Voltage Quality Disturbances

    NASA Astrophysics Data System (ADS)

    Živanović, Dragan; Simić, Milan; Kokolanski, Zivko; Denić, Dragan; Dimcev, Vladimir

    2018-04-01

    A software-supported procedure for generation of long-time complex test sequences, suitable for testing instruments that detect standard voltage quality (VQ) disturbances, is presented in this paper. This solution for test signal generation includes significant improvements over the computer-based signal generator described in a previously published paper [1]. The generator is based on virtual instrumentation software for defining the basic signal parameters, a data acquisition card NI 6343, and a power amplifier for amplification of the output voltage level to the nominal RMS voltage value of 230 V. Definition of basic signal parameters in the LabVIEW application software is supported using script files, which allows simple repetition of specific test signals and combination of several different test sequences into a complex composite test waveform. The basic advantage of this generator compared with similar solutions for signal generation is the possibility of long-time test sequence generation according to predefined complex test scenarios, including various combinations of VQ disturbances defined in accordance with the European standard EN50160. Experimental verification of the presented signal generator's capability is performed by testing the commercial power quality analyzer Fluke 435 Series II. Some characteristic complex test signals with various disturbances are shown, together with logged data obtained from the tested power quality analyzer.
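
    A minimal numpy sketch of one such disturbance, independent of the authors' LabVIEW implementation: a 230 V RMS, 50 Hz waveform with a voltage sag to 60% amplitude, with illustrative timing values:

        # 230 V RMS, 50 Hz waveform with a sag to 60% amplitude from 0.2 s to 0.3 s.
        import numpy as np

        fs, f0, vrms = 10_000, 50.0, 230.0
        t = np.arange(0.0, 0.5, 1.0 / fs)
        envelope = np.where((t >= 0.2) & (t < 0.3), 0.6, 1.0)
        signal = envelope * vrms * np.sqrt(2.0) * np.sin(2.0 * np.pi * f0 * t)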

  17. Managing an Uncontrolled Vocabulary Ex Post Facto

    ERIC Educational Resources Information Center

    Lefever, Maureen; And Others

    1972-01-01

    The operational retrospective retrieval service offered by BIOSIS exploits a file created essentially without vocabulary control. A pragmatic program of file building criteria has been pursued, which has provided improved retrieval and an annual summary of the vocabulary of the literature. (11 references) (Author/KE)

  18. Depression

    MedlinePlus

    ... Behavioral Health Statistics and Quality. (2017). 2016 National Survey on Drug Use and Health: Table 8.56A ( ... the United States: Results from the 2015 National Survey on Drug Use and Health (PDF file, 2. ...

  19. 77 FR 31836 - Marine Mammals; File No. 15240

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-30

    .... borealis), humpback (Megaptera novaeangliae), sperm (Physeter macrocephalus), and North Pacific right... the EA, NMFS determined that issuance of the permit would not significantly impact the quality of the...

  20. Joint-probability Analysis of the Natural Variability of Tropical Oceanic Precipitation

    NASA Technical Reports Server (NTRS)

    Yuter, Sandra E.

    2004-01-01

    Data projects pertaining to KWAJEX are described. Data sets delivered to the Goddard Distributed Active Archive Center (DAAC): 1) Kwajalein Experiment (KWAJEX) S-band calibrated, quality-controlled radar data, 12,211 files of 3D volume data and 6,832 files of 2D low-level reflectivity. 2) Raw and quality-control-processed versions of University of Washington Joss-Waldvogel disdrometer measurements obtained during KWAJEX. 3) A time series of synoptic-scale GIF images of the Geostationary Meteorological Satellite (GMS) IR data for the KWAJEX period. The GMS satellite data set for the KWAJEX period was obtained from the University of Wisconsin and reprocessed into a format amenable to comparison with radar data. Aircraft microphysics flight-leg definitions for all aircraft and all missions during KWAJEX were completed to facilitate microphysics data processing.

  1. Geohydrologic and water-quality characterization of a fractured-bedrock test hole in an area of Marcellus shale gas development, Bradford County, Pennsylvania

    USGS Publications Warehouse

    Risser, Dennis W.; Williams, John H.; Hand, Kristen L.; Behr, Rose-Anna; Markowski, Antonette K.

    2013-01-01

    Open-File Miscellaneous Investigation 13–01.1 presents the results of geohydrologic investigations on a 1,664-foot-deep core hole drilled in the Bradford County part of the Gleason 7.5-minute quadrangle in north-central Pennsylvania. In the text, the authors discuss their methods of investigation, summarize physical and analytical results, and place those results in context. Four appendices include (1) a full description of the core in an Excel worksheet; (2) water-quality and core-isotope analytical results in Excel workbooks; (3) geophysical logs in LAS and PDF files, and an Excel workbook containing attitudes of bedding and fractures calculated from televiewer logs; and (4) MP4 clips from the downhole video at selected horizons.

  2. Improvement in rheumatic fever and rheumatic heart disease management and prevention using a health centre-based continuous quality improvement approach

    PubMed Central

    2013-01-01

    Background Rheumatic heart disease (RHD) remains a major health concern for Aboriginal Australians. A key component of RHD control is prevention of recurrent acute rheumatic fever (ARF) using long-term secondary prophylaxis with intramuscular benzathine penicillin (BPG). This is the most important and cost-effective step in RHD control. However, there are significant challenges to effective implementation of secondary prophylaxis programs. This project aimed to increase understanding and improve quality of RHD care through development and implementation of a continuous quality improvement (CQI) strategy. Methods We used a CQI strategy to promote implementation of national best-practice ARF/RHD management guidelines at primary health care level in Indigenous communities of the Northern Territory (NT), Australia, 2008–2010. Participatory action research methods were employed to identify system barriers to delivery of high quality care. This entailed facilitated discussion with primary care staff aided by a system assessment tool (SAT). Participants were encouraged to develop and implement strategies to overcome identified barriers, including better record-keeping, triage systems and strategies for patient follow-up. To assess performance, clinical records were audited at baseline, then annually for two years. Key performance indicators included proportion of people receiving adequate secondary prophylaxis (≥80% of scheduled 4-weekly penicillin injections) and quality of documentation. Results Six health centres participated, servicing approximately 154 people with ARF/RHD. Improvements occurred in indicators of service delivery including proportion of people receiving ≥40% of their scheduled BPG (increasing from 81/116 [70%] at baseline to 84/103 [82%] in year three, p = 0.04), proportion of people reviewed by a doctor within the past two years (112/154 [73%] and 134/156 [86%], p = 0.003), and proportion of people who received influenza vaccination (57/154 [37%] to 86/156 [55%], p = 0.001). However, the proportion receiving ≥80% of scheduled BPG did not change. Documentation in medical files improved: ARF episode documentation increased from 31/55 (56%) to 50/62 (81%) (p = 0.004), and RHD risk category documentation from 87/154 (56%) to 103/145 (76%) (p < 0.001). Large differences in performance were noted between health centres, reflected to some extent in SAT scores. Conclusions A CQI process using a systems approach and participatory action research methodology can significantly improve delivery of ARF/RHD care. PMID:24350582

  3. Water Quality Analysis Tool (WQAT) | Science Inventory | US ...

    EPA Pesticide Factsheets

    The purpose of the Water Quality Analysis Tool (WQAT) software is to provide a means for analyzing and producing useful remotely sensed data products for an entire estuary, a particular point or area of interest (AOI or POI) in estuaries, or water bodies of interest where pre-processed and geographically gridded remotely sensed images are available. A graphical user interface (GUI) was created to enable the user to select and display imagery from a variety of remote sensing data sources. The user can select a date (or date range) and location to extract pixels from the remotely sensed imagery. The GUI is used to obtain all available pixel values (i.e., pixel values from all available bands of all available satellites) for a given location on a given date and time. The resultant data set can be analyzed or saved to a file for future use. The WQAT software provides users with a way to establish algorithms between remote sensing reflectance (Rrs) and any available in situ parameters, as well as statistical and regression analysis. The combined data sets can be used to improve water quality research and studies. Satellites provide spatially synoptic data at high frequency (daily to weekly). These characteristics are desirable for supplementing existing water quality observations and for providing information for large aquatic ecosystems that are historically under-sampled by field programs. Thus, the Water Quality Analysis Tool (WQAT) software tool was developed to support these needs.
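
    A minimal sketch of the kind of algorithm development the tool supports, fitting a linear relationship between remote sensing reflectance (Rrs) and an in situ parameter; the data values are invented for illustration:

        # Fit chlorophyll (in situ) against remote sensing reflectance (Rrs).
        import numpy as np

        rrs = np.array([0.012, 0.018, 0.025, 0.031, 0.040])   # sr^-1
        chl = np.array([1.1, 2.0, 3.2, 4.1, 5.6])             # mg m^-3
        slope, intercept = np.polyfit(rrs, chl, 1)
        pred = slope * rrs + intercept
        r2 = 1.0 - np.sum((chl - pred) ** 2) / np.sum((chl - chl.mean()) ** 2)
        print(f"chl = {slope:.1f} * Rrs + {intercept:.2f}  (R^2 = {r2:.2f})")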

  4. The Influence of Hospital Market Competition on Patient Mortality and Total Performance Score.

    PubMed

    Haley, Donald Robert; Zhao, Mei; Spaulding, Aaron; Hamadi, Hanadi; Xu, Jing; Yeomans, Katelyn

    2016-01-01

    The launch of Medicare Value-Based Purchasing under the Affordable Care Act of 2010 has become the platform for payment reform. It is a mechanism by which buyers of health care services hold providers accountable for high-quality and cost-effective care. The objective of the study was to examine the relationship between quality of hospital care and hospital competition using the quality-quantity behavioral model of hospital behavior. The quality-quantity behavioral model of hospital behavior was used as the conceptual framework for this study. Data from the American Hospital Association database, the Hospital Compare database, and the Area Health Resources Files database were used. Multivariate regression analysis was used to examine the effect of hospital competition on patient mortality. Hospital market competition was significantly and negatively related to the 3 mortality rates. Consistent with the literature, hospitals located in more competitive markets had lower mortality rates for patients with acute myocardial infarction, heart failure, and pneumonia. The results suggest that hospitals may more readily compete on quality of care and patient outcomes. The findings are important because policies that seek to control and negatively influence a competitive hospital environment, such as Certificate of Need legislation, may negatively affect patient mortality rates. Therefore, policymakers should encourage the development of policies that facilitate a more competitive and transparent health care marketplace to potentially improve patient mortality rates.

  5. 77 FR 64973 - Don W. Gilbert Hydro Power, LLC; Notice of Application Accepted for Filing With the Commission...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... water quality certification; (2) a copy of the request for certification, including proof of the date on which the certifying agency received the request; or (3) evidence of waiver of water quality...-foot-diameter, 700-foot-long partially buried steel or plastic penstock; (3) a powerhouse containing...

  6. 75 FR 16092 - Lockhart Power Company; Notice of Application Accepted for Filing, Soliciting Motions To...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-31

    ... 24-foot-high concrete and rubble masonry dam, with 4-foot-high flashboards; (2) three sand gates; (3... days following the date of issuance of this notice: (1) A copy of the water quality certification; (2... received the request; or (3) evidence of waiver of water quality certification. Kimberly D. Bose, Secretary...

  7. The Influence of Consistent Assignment on Nursing Home Deficiency Citations

    ERIC Educational Resources Information Center

    Castle, Nicholas G.

    2011-01-01

    Objective: The association of consistent assignment of nurse aides (NAs) with quality of care and quality of life of nursing home residents is examined (using 5 groups of deficiency citations). Methods: Data used came from a survey of nursing home administrators, the Online Survey Certification and Reporting data, and the Area Resource File. The…

  8. Segy-change: The swiss army knife for the SEG-Y files

    NASA Astrophysics Data System (ADS)

    Stanghellini, Giuseppe; Carrara, Gabriela

    Data collected during active and passive seismic surveys can be stored in many different, more or less standard, formats. One of the most popular is the SEG-Y format, developed in 1975 to store single-line seismic digital data on tapes, and since evolved to store data on hard disks and other media as well. Unfortunately, files that are claimed to be recorded in the SEG-Y format sometimes cannot be processed using available free or commercial packages. Aiming to solve this impasse, we present segy-change, a pre-processing software program to view, analyze, change, and fix errors present in SEG-Y data files. It is written in C, can also be used as a software library, and is compatible with most operating systems. Segy-change allows the user to display and optionally change the values inside all parts of a SEG-Y file: the file header, the trace headers, and the data blocks. In addition, it allows the user to perform a quality check on the data by plotting the traces. We provide instructions and examples on how to use the software.
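
    A minimal Python sketch of the kind of header inspection segy-change performs, reading two standard fields from the 400-byte binary header that follows the 3200-byte textual header (big-endian offsets per the SEG-Y rev 1 layout; the file path is illustrative):

        # Read the sample interval and samples-per-trace fields from the
        # 400-byte binary header that starts at byte offset 3200.
        import struct

        with open("line1.segy", "rb") as f:
            f.seek(3200)              # skip the 3200-byte EBCDIC textual header
            binhdr = f.read(400)
        interval_us = struct.unpack(">H", binhdr[16:18])[0]  # bytes 3217-3218
        n_samples = struct.unpack(">H", binhdr[20:22])[0]    # bytes 3221-3222
        print(f"sample interval: {interval_us} us, samples per trace: {n_samples}")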

  9. AQUIRE: Aquatic Toxicity Information Retrieval data base. Data file

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, E.; Pilli, A.

    The purpose of the Aquatic Toxicity Information Retrieval (AQUIRE) data base is to provide scientists and managers quick access to a comprehensive, systematic, computerized compilation of aquatic toxicity data. Scientific papers published both nationally and internationally on the toxicity of chemicals to aquatic organisms and plants are collected and reviewed for AQUIRE. Independently compiled data files that meet AQUIRE parameter and quality assurance criteria are also included. Selected toxicity-test results and related testing information for any individual chemical from laboratory and field aquatic toxicity tests are extracted and added to AQUIRE. Acute, sublethal, and bioconcentration effects are included for tests with freshwater and marine organisms. The total number of data records in AQUIRE now equals 104,500. This includes data from 6,000 references, for 5,200 chemicals and 2,400 test species. A major data file, Acute Toxicity of Organic Chemicals (ATOC), has been incorporated into AQUIRE. The ATOC file contains laboratory acute test data on 525 organic chemicals using juvenile fathead minnows. The complete data file can be accessed by requesting review code 5 as a search parameter.

  10. The Information Quality Act: OMB’s Guidance and Initial Implementation

    DTIC Science & Technology

    2004-08-19

    Fiscal Year 2001. The Chamber of Commerce describes itself on its website as the world's largest not-for-profit business federation. See [http...resuscitation and the use of automated external defibrillators. OSHA agreed to do so. In another case, the Chamber of Commerce requested that EPA revise the...decision — the Department of Justice (DOJ) filed a brief recommending the dismissal of a lawsuit filed under the IQA by the Chamber of Commerce and the Salt

  11. Development of climate data input files for the Mechanistic-Empirical Pavement Design Guide (MEPDG).

    DOT National Transportation Integrated Search

    2011-06-30

    Prior to this effort, Mississippi's MEPDG climate files were limited to 12 weather stations in only 10 counties, and only seven weather stations had over 8 years (100 months) of data. Hence, building MEPDG climate input datasets improves modeling accuracy...

  12. Securing the AliEn File Catalogue - Enforcing authorization with accountable file operations

    NASA Astrophysics Data System (ADS)

    Schreiner, Steffen; Bagnasco, Stefano; Sankar Banerjee, Subho; Betev, Latchezar; Carminati, Federico; Vladimirovna Datskova, Olga; Furano, Fabrizio; Grigoras, Alina; Grigoras, Costin; Mendez Lorenzo, Patricia; Peters, Andreas Joachim; Saiz, Pablo; Zhu, Jianlin

    2011-12-01

    The AliEn Grid Services, as operated by the ALICE Collaboration in its global physics analysis grid framework, is based on a central File Catalogue together with a distributed set of storage systems and the possibility to register links to external data resources. This paper describes several identified vulnerabilities in the AliEn File Catalogue access protocol regarding fraud and unauthorized file alteration and presents a more secure and revised design: a new mechanism, called LFN Booking Table, is introduced in order to keep track of access authorization in the transient state of files entering or leaving the File Catalogue. Due to a simplification of the original Access Envelope mechanism for xrootd-protocol-based storage systems, fundamental computational improvements of the mechanism were achieved as well as an up to 50% reduction of the credential's size. By extending the access protocol with signed status messages from the underlying storage system, the File Catalogue receives trusted information about a file's size and checksum and the protocol is no longer dependent on client trust. Altogether, the revised design complies with atomic and consistent transactions and allows for accountable, authentic, and traceable file operations. This paper describes these changes as part and beyond the development of AliEn version 2.19.

  13. Figure 3

    EPA Pesticide Factsheets

    The Figure.tar.gz contains a directory for each WRF ensemble run. In these directories are *.csv files for each meteorology variable examined. These are comma-delimited text files that contain statistics for each observation site. Also provided is an R script that reads these files (the user would need to change directory pointers), computes the variability of error and bias of the ensemble at each site, and plots these for reproduction of figure 3. This dataset is associated with the following publication: Gilliam, R., C. Hogrefe, J. Godowitch, S. Napelenok, R. Mathur, and S.T. Rao. Impact of inherent meteorology uncertainty on air quality model predictions. JOURNAL OF GEOPHYSICAL RESEARCH-ATMOSPHERES. American Geophysical Union, Washington, DC, USA, 120(23): 12,259-12,280, (2015).
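
    A minimal pandas sketch of the computation the record describes, aggregating per-site bias across ensemble members; the directory pattern and column names are assumptions, since the *.csv layout is not spelled out here:

        # Collect per-site statistics across ensemble members and summarize
        # the spread of bias at each observation site.
        import glob
        import pandas as pd

        frames = [pd.read_csv(p).assign(member=p) for p in glob.glob("run_*/t2m.csv")]
        df = pd.concat(frames, ignore_index=True)
        spread = df.groupby("site")["bias"].agg(["mean", "std"])
        print(spread.head())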

  14. Astronomical Instrumentation System Markup Language

    NASA Astrophysics Data System (ADS)

    Goldbaum, Jesse M.

    2016-05-01

    The Astronomical Instrumentation System Markup Language (AISML) is an Extensible Markup Language (XML) based file format for maintaining and exchanging information about astronomical instrumentation. The factors behind the need for an AISML are first discussed, followed by the reasons why XML was chosen as the format. Next, it is shown how XML also provides the framework for a more precise definition of an astronomical instrument and how these instruments can be combined to form an Astronomical Instrumentation System (AIS). AISML files for several instruments, as well as one for a sample AIS, are provided. The files demonstrate how AISML can be utilized for various tasks, from web page generation and programming interfaces to instrument maintenance and quality management. The advantages of widespread adoption of AISML are discussed.
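
    The AISML schema itself is not reproduced in this record, so the following Python sketch only illustrates the general approach of describing an instrument in XML with the standard library; the element names are invented, not taken from the AISML specification:

        # Describe a (hypothetical) instrument in XML with the standard library.
        import xml.etree.ElementTree as ET

        instrument = ET.Element("instrument", name="ExampleCCD")
        optics = ET.SubElement(instrument, "optics")
        ET.SubElement(optics, "aperture", units="m").text = "0.35"
        ET.SubElement(instrument, "detector", type="CCD").text = "2048x2048"
        ET.ElementTree(instrument).write("example_ais.xml", encoding="utf-8",
                                         xml_declaration=True)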

  15. 40 CFR 51.363 - Quality assurance.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... using either electronic or written forms to be retained in the inspector and station history files, with...) Covert vehicles covering the range of vehicle technology groups (e.g., carbureted and fuel-injected...

  16. Reduced Root Cortical Cell File Number Improves Drought Tolerance in Maize

    PubMed Central

    Chimungu, Joseph G.; Brown, Kathleen M.

    2014-01-01

    We tested the hypothesis that reduced root cortical cell file number (CCFN) would improve drought tolerance in maize (Zea mays) by reducing the metabolic costs of soil exploration. Maize genotypes with contrasting CCFN were grown under well-watered and water-stressed conditions in greenhouse mesocosms and in the field in the United States and Malawi. CCFN ranged from six to 19 among maize genotypes. In mesocosms, reduced CCFN was correlated with a 57% reduction in root respiration per unit of root length. Under water stress in the mesocosms, genotypes with reduced CCFN had between 15% and 60% deeper rooting, 78% greater stomatal conductance, 36% greater leaf CO2 assimilation, and between 52% and 139% greater shoot biomass than genotypes with many cell files. Under water stress in the field, genotypes with reduced CCFN had between 33% and 40% deeper rooting, 28% lighter stem water oxygen isotope enrichment (δ18O) signature signifying deeper water capture, between 10% and 35% greater leaf relative water content, between 35% and 70% greater shoot biomass at flowering, and between 33% and 114% greater yield than genotypes with many cell files. These results support the hypothesis that reduced CCFN improves drought tolerance by reducing the metabolic costs of soil exploration, enabling deeper soil exploration, greater water acquisition, and improved growth and yield under water stress. The large genetic variation for CCFN in maize germplasm suggests that CCFN merits attention as a breeding target to improve the drought tolerance of maize and possibly other cereal crops. PMID:25355868

  17. DPOI: Distributed software system development platform for ocean information service

    NASA Astrophysics Data System (ADS)

    Guo, Zhongwen; Hu, Keyong; Jiang, Yongguo; Sun, Zhaosui

    2015-02-01

    Ocean information management is of great importance as it has been employed in many areas of ocean science and technology. However, the development of Ocean Information Systems (OISs) often suffers from low efficiency because of repetitive work and continuous modifications caused by changing requirements. In this paper, the basic requirements of OISs are analyzed first, and then a novel platform, DPOI, is proposed to improve development efficiency and enhance the software quality of OISs by providing off-the-shelf resources. In the platform, the OIS is decomposed hierarchically into a set of modules, which can be reused in different system developments. These modules include the acquisition middleware and data loader that collect data from instruments and files respectively, the database that stores data consistently, the components that support fast application generation, the web services that make data from distributed sources syntactically consistent through predefined schemas, and the configuration toolkit that enables software customization. With the assistance of the development platform, software development requires no programming and the development procedure is thus greatly accelerated. We have applied the development platform in practical developments and evaluated its efficiency in several development practices and different development approaches. The results show that DPOI significantly improves development efficiency and software quality.

  18. Statewide Hospital Discharge Data: Collection, Use, Limitations, and Improvements.

    PubMed

    Andrews, Roxanne M

    2015-08-01

    To provide an overview of statewide hospital discharge databases (HDD), including their uses in health services research and limitations, and to describe Agency for Healthcare Research and Quality (AHRQ) Enhanced State Data grants to address clinical and race-ethnicity data limitations. Almost all states have statewide HDD collected by public or private data organizations. Statewide HDD, based on the hospital claim with state variations, contain useful core variables and require minimal collection burden. AHRQ's Healthcare Cost and Utilization Project builds uniform state and national research files using statewide HDD. States, hospitals, and researchers use statewide HDD for many purposes. Illustrating researchers' use, during 2012-2014, HSR published 26 HDD-based articles on health policy, access, quality, clinical aspects of care, race-ethnicity and insurance impacts, economics, financing, and research methods. HDD have limitations affecting their use. Five AHRQ grants focused on enhancing clinical data and three grants aimed at improving race-ethnicity data. ICD-10 implementation will significantly affect the HDD. The AHRQ grants, information technology advances, payment policy changes, and the need for outpatient information may stimulate other statewide HDD changes. To remain a mainstay of health services research, statewide HDD need to keep pace with changing user needs while minimizing collection burdens. © Health Research and Educational Trust.

  19. Statewide Hospital Discharge Data: Collection, Use, Limitations, and Improvements

    PubMed Central

    Andrews, Roxanne M

    2015-01-01

    Objectives To provide an overview of statewide hospital discharge databases (HDD), including their uses in health services research and limitations, and to describe Agency for Healthcare Research and Quality (AHRQ) Enhanced State Data grants to address clinical and race–ethnicity data limitations. Principal Findings Almost all states have statewide HDD collected by public or private data organizations. Statewide HDD, based on the hospital claim with state variations, contain useful core variables and require minimal collection burden. AHRQ’s Healthcare Cost and Utilization Project builds uniform state and national research files using statewide HDD. States, hospitals, and researchers use statewide HDD for many purposes. Illustrating researchers’ use, during 2012–2014, HSR published 26 HDD-based articles on health policy, access, quality, clinical aspects of care, race–ethnicity and insurance impacts, economics, financing, and research methods. HDD have limitations affecting their use. Five AHRQ grants focused on enhancing clinical data and three grants aimed at improving race–ethnicity data. Conclusion ICD-10 implementation will significantly affect the HDD. The AHRQ grants, information technology advances, payment policy changes, and the need for outpatient information may stimulate other statewide HDD changes. To remain a mainstay of health services research, statewide HDD need to keep pace with changing user needs while minimizing collection burdens. PMID:26150118

  20. Effective clinical education: strategies for teaching medical students and residents in the office.

    PubMed

    Cayley, William E

    2011-08-01

    Educating medical students and residents in the office presents the challenges of providing quality medical care, maintaining efficiency, and incorporating meaningful education for learners. Numerous teaching strategies to address these challenges have been described in the medical education literature, but only a few teaching strategies have been evaluated for their impact on education and office practice. Literature on the impact of office-based teaching strategies on educational outcomes and on office efficiency was selected from a PubMed search, from review of references in retrieved articles, and from the author's personal files. Two teaching strategies, the "one-minute preceptor" (OMP) and "SNAPPS," have been shown to improve educational processes and outcomes. Two additional strategies, "Aunt Minnie" pattern recognition and "activated demonstration," show promise but have not been fully evaluated. None of these strategies has been shown to improve office efficiency. OMP and SNAPPS are strategies that can be used in office precepting to improve educational processes and outcomes, while pattern recognition and activated demonstration show promise but need further assessment. Additional areas of research also are suggested.

  1. Identification and description of potential ground-water quality monitoring wells in Florida

    USGS Publications Warehouse

    Seaber, P.R.; Thagard, M.E.

    1986-01-01

    The results of a survey of existing wells in Florida that meet the following criteria are presented: (1) well location is known, (2) principal aquifer is known, (3) depth of well is known, (4) well casing depth is known, (5) well water had been analyzed between 1970 and 1982, and (6) well data are stored in the U.S. Geological Survey's (USGS) computer files. Information for more than 20,000 wells in Florida was stored in the USGS Master Water Data Index of the National Water Data Exchange and in the National Water Data Storage and Retrieval System's Groundwater Site Inventory computerized files in 1982. Wells in these computer files that had been sampled for groundwater quality before November 1982 in Florida number 13,739; 1,846 of these wells met the above criteria and are the potential (or candidate) groundwater quality monitoring wells included in this report. The distribution by principal aquifer of the 1,846 wells identified as potential groundwater quality monitoring wells is as follows: 1,022 tap the Floridan aquifer system, 114 tap the intermediate aquifers, 232 tap the surficial aquifers, 246 tap the Biscayne aquifer, and 232 tap the sand-and-gravel aquifer. These wells are located in 59 of Florida's 67 counties. This report presents the station descriptions, which include location, site characteristics, period of record, and the type and frequency of chemical water quality data collected for each well. The 1,846 well locations are plotted on 14 USGS 1:250,000-scale, 1 degree by 2 degree, quadrangle maps. This relatively large number of potential (or candidate) monitoring wells, geographically and geohydrologically dispersed, provides a basis for a future groundwater quality monitoring network and computerized data base for Florida. There is a large variety of water quality determinations available from these wells, both areally and temporally. Future sampling of these wells would permit analyses of temporal and areal trends for selected water quality characteristics throughout the State. The identification and description of the potential monitoring wells and the listing of the type and frequency of the groundwater quality data form a foundation for both the network and the data base. (Author's abstract)

  2. Streamlined, Inexpensive 3D Printing of the Brain and Skull

    PubMed Central

    Cash, Sydney S.

    2015-01-01

    Neuroimaging technologies such as Magnetic Resonance Imaging (MRI) and Computed Tomography (CT) collect three-dimensional (3D) data that are typically viewed on two-dimensional (2D) screens. Actual 3D models, however, allow interaction with real objects such as implantable electrode grids, potentially improving patient-specific neurosurgical planning and personalized clinical education. Desktop 3D printers can now produce relatively inexpensive, good-quality prints. We describe our process for reliably generating life-sized 3D brain prints from MRIs and 3D skull prints from CTs. We have integrated a standardized, primarily open-source process for 3D printing brains and skulls. We describe how to convert clinical neuroimaging Digital Imaging and Communications in Medicine (DICOM) images to stereolithography (STL) files, a common 3D object file format that can be sent to 3D printing services. We additionally share how to convert these STL files to machine-instruction gcode files for reliable in-house printing on desktop, open-source 3D printers. We have successfully printed over 19 patient brain hemispheres from 7 patients on two different open-source desktop 3D printers. Each brain hemisphere costs approximately $3-4 in consumable plastic filament as described, and the total process takes 14-17 hours, almost all of which is unsupervised (preprocessing = 4-6 hr; printing = 9-11 hr; post-processing = <30 min). Printing a matching portion of a skull costs $1-5 in consumable plastic filament and takes less than 14 hr in total. We have developed a streamlined, cost-effective process for 3D printing brain and skull models. We surveyed healthcare providers and patients who confirmed that rapid-prototype patient-specific 3D models may help interdisciplinary surgical planning and patient education. The methods we describe can be applied for other clinical, research, and educational purposes. PMID:26295459
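
    A minimal sketch of the DICOM-to-STL step, using pydicom, scikit-image marching cubes, and numpy-stl; the isosurface threshold and file paths are illustrative, and a real pipeline adds segmentation and mesh cleanup:

        # Stack a DICOM series into a volume, extract an isosurface, save STL.
        import glob
        import numpy as np
        import pydicom
        from skimage import measure
        from stl import mesh

        slices = sorted((pydicom.dcmread(p) for p in glob.glob("series/*.dcm")),
                        key=lambda s: float(s.ImagePositionPatient[2]))
        volume = np.stack([s.pixel_array for s in slices]).astype(np.float32)

        verts, faces, _, _ = measure.marching_cubes(volume, level=300.0)
        surface = mesh.Mesh(np.zeros(faces.shape[0], dtype=mesh.Mesh.dtype))
        surface.vectors = verts[faces]   # (n_faces, 3 vertices, 3 coordinates)
        surface.save("brain.stl")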

  3. Discovery of Marine Datasets and Geospatial Metadata Visualization

    NASA Astrophysics Data System (ADS)

    Schwehr, K. D.; Brennan, R. T.; Sellars, J.; Smith, S.

    2009-12-01

    NOAA's National Geophysical Data Center (NGDC) provides the deep archive of US multibeam sonar hydrographic surveys. NOAA stores the data as Bathymetric Attributed Grids (BAG; http://www.opennavsurf.org/), which are HDF5 formatted files containing gridded bathymetry, gridded uncertainty, and XML metadata. While NGDC provides the deep store and a basic ESRI ArcIMS interface to the data, additional tools need to be created to increase the frequency with which researchers discover hydrographic surveys that might be beneficial for their research. Using open-source tools, we have created a draft of a Google Earth visualization of NOAA's complete collection of BAG files as of March 2009. Each survey is represented as a bounding box, an optional preview image of the survey data, and a pop-up placemark. The placemark contains a brief summary of the metadata and links to directly download the BAG survey files and the complete metadata file. Each survey is time tagged so that users can search both in space and time for surveys that meet their needs. By creating this visualization, we aim to make the entire process of data discovery, validation of relevance, and download much more efficient for research scientists who may not be familiar with NOAA's hydrographic survey efforts or the BAG format. In the process of creating this demonstration, we have identified a number of improvements that can be made to the hydrographic survey process in order to make the results easier to use, especially with respect to metadata generation. With the combination of the NGDC deep archiving infrastructure, a Google Earth virtual globe visualization, and GeoRSS feeds of updates, we hope to increase the utilization of these high-quality gridded bathymetry data. This workflow applies equally well to LIDAR topography and bathymetry. Additionally, with proper referencing and geotagging in journal publications, we hope to close the loop and help the community create a true “Geospatial Scholar” infrastructure.
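
    A minimal Python sketch of generating one such placemark, a survey bounding box written as KML with the standard library; the survey name and coordinates are invented:

        # Emit one survey bounding box as a KML Placemark.
        import xml.etree.ElementTree as ET

        kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
        doc = ET.SubElement(kml, "Document")
        pm = ET.SubElement(doc, "Placemark")
        ET.SubElement(pm, "name").text = "Survey H12345"
        polygon = ET.SubElement(pm, "Polygon")
        ring = ET.SubElement(ET.SubElement(polygon, "outerBoundaryIs"), "LinearRing")
        coords = ET.SubElement(ring, "coordinates")
        coords.text = "-70.2,41.1 -70.0,41.1 -70.0,41.3 -70.2,41.3 -70.2,41.1"
        ET.ElementTree(kml).write("surveys.kml", encoding="utf-8",
                                  xml_declaration=True)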

  4. Techtalk: Telecommunications for Improving Developmental Education.

    ERIC Educational Resources Information Center

    Caverly, David C.; Broderick, Bill

    1993-01-01

    Explains how to access the Internet, discussing hardware and software considerations, connectivity, and types of access available to users. Describes the uses of electronic mail; TELNET, a method for remotely logging onto another computer; and anonymous File Transfer Protocol (FTP), a method for downloading files from a remote computer. (MAB)

  5. Volunteer Income Tax Assistance Programs and Taxpayer Actions to Improve Personal Finances

    ERIC Educational Resources Information Center

    Bobbitt, Erica; Bowen, Cathy F.; Kuleck, Robin L.; Taverno, Ronald

    2012-01-01

    The income tax-filing process creates teachable moments for learning about taxes and other financial matters. Educators and volunteers from Penn State Cooperative Extension helped taxpayers file 2008 returns under Volunteer Income Tax Assistance Program (VITA). Nearly 600 filers (588) completed and simultaneously received educational information…

  6. A New Data Access Mechanism for HDFS

    NASA Astrophysics Data System (ADS)

    Li, Qiang; Sun, Zhenyu; Wei, Zhanchen; Sun, Gongxing

    2017-10-01

    With the era of big data emerging, Hadoop has become the de facto standard big data processing platform. However, it is still difficult to get legacy applications, such as High Energy Physics (HEP) applications, to run efficiently on the Hadoop platform. There are two reasons for these difficulties: firstly, the Hadoop File System (HDFS) does not support random access; secondly, it is difficult to make legacy applications adopt the HDFS streaming data processing mode. In order to address these two issues, a new read and write mechanism for HDFS is proposed. With this mechanism, data access is done on the local file system instead of through HDFS streaming interfaces. To allow users to modify files, three attributes (permissions, owner, and group) are imposed on Block objects. Blocks stored on Datanodes have the same attributes as the file they belong to. Users can modify blocks while the Map task runs locally, and HDFS is responsible for updating the remaining replicas after the block modification finishes. To further improve the performance of the Hadoop system, a complete localization task execution mechanism is implemented for I/O-intensive jobs. Test results show that average CPU utilization is improved by 10% with the new task selection strategy, and data read and write performances are improved by about 10% and 30%, respectively.

  7. Data Quality Monitoring and Noise Analysis at the EUREF Permanent Network

    NASA Astrophysics Data System (ADS)

    Kenyeres, A.; Bruyninx, C.

    2004-12-01

    The EUREF Permanent Network (EPN) now includes more than 150 GNSS stations of different quality and different observation history. Most of the sites are located on the tectonically stable parts of Eurasia, where only mm-level yearly displacements are expected. In order to extract the relevant geophysical information, sophisticated analysis tools and stable, long-term observations are necessary. As the EPN has been operational since 1996, it offers the potential to estimate high-quality velocities associated with reliable uncertainties. In order to support this work, a set of efficient and demonstrative tools has been developed to monitor data and station quality. The periodically upgraded results are displayed on the website of the EPN Central Bureau (CB) (www.epncb.oma.be) in terms of sky plots and graphs of observation percentage, cycle slips, and multipath. The different quality plots are indirectly used for the interpretation of the time series. Sudden changes or unusual variation in the time series (beyond the obvious equipment change) often correlate with changes in the environment mirrored by the quality plots. These graphs are vital for the proper interpretation and understanding of the real processes. Knowing the nuisance factors, we can generate cleaner time series. We present relevant examples of this work. Two kinds of time series plots are displayed at the EPN CB website: raw and improved time series. They are cumulative solutions of the weekly EPN SINEX files using the minimum constraint approach. Within the improved time series, the outliers and offsets are already taken into account. We will also present preliminary results of a detailed noise analysis of the EPN time series. The target of this work is twofold: on the one hand, we aim at computing more realistic velocity estimates of the EPN stations; on the other hand, the information about station noise characteristics will support the removal and proper interpretation of site-specific phenomena.

  8. Reconnaissance of Water Quality at Four Swine Farms in Jackson County, Florida, 1993

    DTIC Science & Technology

    1996-01-01

    Reconnaissance of Water Quality at Four Swine Farms in Jackson County, Florida, 1993, by Jerilyn J. Collins, U.S. Geological Survey Open-File Report. The report examines manure applications on agricultural land; the estimated annual wet manure production is 3,407 pounds per breeding swine (Krider, 1987).

  9. Evaluation of Contact Friction in Fracture of Rotationally Bent Nitinol Endodontic Files

    NASA Astrophysics Data System (ADS)

    Haimed, Tariq Abu

    2011-12-01

    The high flexibility of rotary Nitinol (Ni-Ti) files has helped clinicians perform root canal treatments with fewer technical errors than seen with stainless steel files. However, intracanal file fracture can occur, compromising the outcome of the treatment. Ni-Ti file fracture incidence is roughly 4% among specialists and higher among general practitioners. Therefore, eliminating or reducing this problem should improve patient care. The aim of this project was to isolate and examine the role of friction between files and the canal walls of the glass tube model, and of bending-related maximum strain amplitudes, on Ni-Ti file lifetimes to fracture in the presence of different irrigant solutions and file coatings. A specifically designed device was used to test over 300 electropolished EndoSequence® Ni-Ti files for number of cycles to failure (NCF) in smooth, bent glass tube models at 45 and 60 degrees during dry, coated, and liquid-lubricated rotation at 600 rpm. Fractured files were examined afterwards under scanning electron microscopy (SEM). Four different file sizes, 25.04, 25.06, 35.04, and 35.06 (diameter in mm/taper %), and six surface modification conditions were used independently. These conditions included three solutions: (1) a surfactant-based Surface-Active Displacement Solution (SADS); (2) a mouthwash proven to remove biofilms, 1% delmopinol (DEL); and (3) 6% bleach (vol.%), the most common antibacterial endodontic irrigant solution. The conditions also included two low-friction silane-based coating groups, 3-heptafluoroisopropyl-propoxymethyl-dichlorosilane (3-HEPT) and octadecyltrichlorosilane (ODS), in addition to an as-received file control group (Dry). The coefficient of friction (CF) between the file and the canal walls for each condition was measured, as well as the surface tension of the irrigant solutions and the critical surface tension of the coated and uncoated files by contact angle measurements. The radius of curvature and maximum strain amplitude (MSA) for each file size were determined based on images of the files inside the glass tubes. The force of insertion for each file type under each condition was also measured inside 45 and 60 degree glass tube paths, both statically and dynamically. The results showed that the NCF of Ni-Ti files is strongly inversely related to the CF, which ranged from 0.15 for ODS- and 3-HEPT-coated files to 0.43 for the bleach irrigant. High CF (in the presence of bleach) significantly reduced the NCF. Conversely, lower CF (in the presence of the other solutions and file coatings) resulted in significantly higher NCF. CF was found to be directly related to the surface tension of the media used. Similarly, high MSA, typical of a small radius of curvature and a large bending angle, significantly diminished the fatigue life of Ni-Ti files. The integral of the force of insertion versus time curve was highest for bleach irrigation, which also showed the highest CF. Scanning electron microscope inspection of file fracture surfaces illustrated a 2-step progressive failure mode characterized by creation of a smooth initial fatigue area (striation marks) followed by catastrophic ductile fracture (dimple area) when the intact file shaft area was sufficiently reduced. The bleach-lubricated files failed earlier and with a smaller fatigue area (23%) than all other groups (31-35%), indicating premature fracture in the presence of higher frictional forces.
The acquired data demonstrate that the combination of low MSA and low CF (achieved by using coatings or solutions with low surface tension), both related to the magnitude of the superficial drag force, can lead to statistically longer rotational bending lifetimes for Ni-Ti files. Based on the data of this study, lubricant solutions with low surface tension could significantly improve the fracture life of Ni-Ti files in the glass root canal model. Laboratory testing using natural teeth should be performed to evaluate the effect of such solutions on the fatigue life of Ni-Ti files.

  10. Automating linear accelerator quality assurance.

    PubMed

    Eckhause, Tobias; Al-Hallaq, Hania; Ritter, Timothy; DeMarco, John; Farrey, Karl; Pawlicki, Todd; Kim, Gwe-Ya; Popple, Richard; Sharma, Vijeshwar; Perez, Mario; Park, SungYong; Booth, Jeremy T; Thorwarth, Ryan; Moran, Jean M

    2015-10-01

    The purpose of this study was twofold: first, to develop an automated, streamlined quality assurance (QA) program for use by multiple centers; second, to evaluate machine performance over time for multiple centers using linear accelerator (Linac) log files and electronic portal images. The authors sought to evaluate variations in Linac performance to establish a reference for other centers. The authors developed analytical software tools for a QA program using both log files and electronic portal imaging device (EPID) measurements. The first tool is a general analysis tool which can read and visually represent data in the log file. This tool, which can be used to automatically analyze patient treatment or QA log files, examines the files for Linac deviations which exceed thresholds. The second set of tools consists of a test suite of QA fields, a standard phantom, and software to collect information from the log files on deviations from the expected values. The test suite was designed to focus on the mechanical tests of the Linac, including jaw, MLC, and collimator positions during static, IMRT, and volumetric modulated arc therapy delivery. A consortium of eight institutions delivered the test suite at monthly or weekly intervals on each Linac using a standard phantom. The behavior of various components was analyzed for eight TrueBeam Linacs. For the EPID and trajectory log file analysis, all observed deviations which exceeded established thresholds for Linac behavior resulted in a beam hold-off. In the absence of an interlock-triggering event, the maximum observed log file deviations between the expected and actual component positions (such as MLC leaves) varied from less than 1% to 26% of published tolerance thresholds. The maximum and standard deviations of the variations due to gantry sag, collimator angle, jaw position, and MLC positions are presented. Gantry sag among Linacs was 0.336 ± 0.072 mm. The standard deviation in MLC position, as determined by EPID measurements, across the consortium was 0.33 mm for IMRT fields. With respect to the log files, the deviations between expected and actual positions were small (<0.12 mm) for all Linacs. Considering both log files and EPID measurements, all parameters were well within published tolerance values. Variations in collimator angle, MLC position, and gantry sag were also evaluated for all Linacs. The performance of the TrueBeam Linac model was shown to be consistent based on automated analysis of trajectory log files and EPID images acquired during delivery of a standardized test suite. The results can be compared directly to tolerance thresholds. In addition, sharing of results from standard tests across institutions can facilitate the identification of QA process and Linac changes. These reference values are presented along with the standard deviation for common tests so that the test suite can be used by other centers to evaluate their Linac performance against those in this consortium.
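
    A minimal sketch of the screening idea, comparing expected and actual MLC leaf positions against a tolerance; the CSV layout and threshold are assumptions, since actual trajectory logs are binary and require a dedicated parser (for example, pylinac's log analyzer):

        # Flag expected-vs-actual position deviations above a tolerance.
        import csv

        TOLERANCE_MM = 0.5   # illustrative threshold, not a published value

        def flag_deviations(path):
            flagged = []
            with open(path, newline="") as f:
                # assumed columns: snapshot, leaf, expected_mm, actual_mm
                for row in csv.DictReader(f):
                    dev = abs(float(row["expected_mm"]) - float(row["actual_mm"]))
                    if dev > TOLERANCE_MM:
                        flagged.append((row["snapshot"], row["leaf"], dev))
            return flagged

        for snap, leaf, dev in flag_deviations("traj_log.csv"):
            print(f"snapshot {snap}: leaf {leaf} off by {dev:.2f} mm")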

  11. Evaluation of surface characteristics of rotary nickel-titanium instruments produced by different manufacturing methods.

    PubMed

    Inan, U; Gurel, M

    2017-02-01

    Instrument fracture is a serious concern in endodontic practice. The aim of this study was to investigate the surface quality of new and used rotary nickel-titanium (NiTi) instruments manufactured by the traditional grinding process and by twisting. A total of 16 instruments from two rotary NiTi systems were used in this study: eight Twisted Files (TF) (SybronEndo, Orange, CA, USA) and eight Mtwo (VDW, Munich, Germany) instruments. Four experimental groups (new and used instruments of each system) were evaluated using atomic force microscopy (AFM). New and used instruments were analyzed at 3 points along a 3-mm section at the tip of the instrument. Quantitative measurements of topographical deviations were recorded. The data were statistically analyzed with the paired-samples t-test and the independent-samples t-test. Mean root mean square (RMS) values for new and used TF 25.06 files were 10.70 ± 2.80 nm and 21.58 ± 6.42 nm, respectively, and the difference between them was statistically significant (P < 0.05). Mean RMS values for new and used Mtwo 25.06 files were 24.16 ± 9.30 nm and 39.15 ± 16.20 nm, respectively; this difference was also statistically significant (P < 0.05). According to the AFM analysis, instruments produced by the twisting method (TF 25.06) had better surface quality than instruments produced by the traditional grinding process (Mtwo 25.06).
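
    A minimal sketch of the comparison performed here, computing root mean square (RMS) roughness from height maps and testing group differences; the height data are synthetic stand-ins for real AFM scans:

        # RMS roughness of height maps, compared between groups with a t-test.
        import numpy as np
        from scipy import stats

        def rms_roughness(heights):
            return np.sqrt(np.mean((heights - heights.mean()) ** 2))

        rng = np.random.default_rng(0)
        new_files = [rms_roughness(rng.normal(0, 11, (256, 256))) for _ in range(8)]
        used_files = [rms_roughness(rng.normal(0, 22, (256, 256))) for _ in range(8)]
        t, p = stats.ttest_ind(new_files, used_files)
        print(f"t = {t:.2f}, p = {p:.4f}")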

  12. 75 FR 27548 - Quality GearBox, LLC; Notice of Competing Preliminary Permit Application Accepted for Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-17

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Project No. 13718-000] Quality GearBox... project would consist of: (1) An existing 8.5-foot- high by 325-foot-long gravity dam; (2) an existing 19-acre impoundment with a storage capacity of 133 acre feet; (3) an existing filtration plant to be...

  13. An assessment technique for computer-socket manufacturing

    PubMed Central

    Sanders, Joan; Severance, Michael

    2015-01-01

    An assessment strategy is presented for testing the quality of carving and forming of individual computer aided manufacturing facilities. The strategy is potentially useful to facilities making sockets and companies marketing manufacturing equipment. To execute the strategy, an evaluator fabricates a collection of test models and sockets using the manufacturing suite under evaluation, and then measures their shapes using scanning equipment. Overall socket quality is assessed by comparing socket shapes with electronic file shapes. Then model shapes are compared with electronic file shapes to characterize carving performance. Socket shapes are compared with model shapes to characterize forming performance. The mean radial error (MRE), which is the average difference in radii between the two shapes being compared, provides insight into sizing quality. Inter-quartile range (IQR), the range of radial error for the best matched half of the points on the surfaces being compared, provides insight into shape quality. By determining MRE and IQR for carving and forming separately, the source(s) of socket shape error may be pinpointed. The developed strategy may provide a useful tool to the prosthetics community and industry to help identify problems and limitations in computer aided manufacturing and insight into appropriate modifications to overcome them. PMID:21938663
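
    A minimal numpy sketch of the two metrics, assuming the shapes have already been registered and sampled at corresponding surface points:

        # Mean radial error (sizing) and interquartile range of radial
        # error (shape), for shapes sampled at corresponding points.
        import numpy as np

        def mre(radii_a, radii_b):
            return np.mean(radii_a - radii_b)

        def iqr(radii_a, radii_b):
            q75, q25 = np.percentile(radii_a - radii_b, [75, 25])
            return q75 - q25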

  14. jqcML: an open-source java API for mass spectrometry quality control data in the qcML format.

    PubMed

    Bittremieux, Wout; Kelchtermans, Pieter; Valkenborg, Dirk; Martens, Lennart; Laukens, Kris

    2014-07-03

    The awareness that systematic quality control is an essential factor to enable the growth of proteomics into a mature analytical discipline has increased over the past few years. To this end, a controlled vocabulary and document structure, called qcML, have recently been proposed by Walzer et al. to store and disseminate quality-control metrics for mass-spectrometry-based proteomics experiments. To facilitate the adoption of this standardized quality control routine, we introduce jqcML, a Java application programming interface (API) for the qcML data format. First, jqcML provides a complete object model to represent qcML data. Second, jqcML provides the ability to read, write, and work in a uniform manner with qcML data from different sources, including the XML-based qcML file format and the relational database qcDB. Interaction with the XML-based file format is obtained through the Java Architecture for XML Binding (JAXB), while generic database functionality is obtained by the Java Persistence API (JPA). jqcML is released as open-source software under the permissive Apache 2.0 license and can be downloaded from https://bitbucket.org/proteinspector/jqcml.

  15. 20 CFR 404.1519p - Reviewing reports of consultative examinations.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... file (e.g., your blindness in one eye, amputations, pain, alcoholism, depression); (4) Whether this is... management studies on the quality of consultative examinations purchased from major medical sources and the...

  16. 20 CFR 404.1519p - Reviewing reports of consultative examinations.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... file (e.g., your blindness in one eye, amputations, pain, alcoholism, depression); (4) Whether this is... management studies on the quality of consultative examinations purchased from major medical sources and the...

  17. 20 CFR 404.1519p - Reviewing reports of consultative examinations.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... file (e.g., your blindness in one eye, amputations, pain, alcoholism, depression); (4) Whether this is... management studies on the quality of consultative examinations purchased from major medical sources and the...

  18. 20 CFR 404.1519p - Reviewing reports of consultative examinations.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... file (e.g., your blindness in one eye, amputations, pain, alcoholism, depression); (4) Whether this is... management studies on the quality of consultative examinations purchased from major medical sources and the...

  19. 20 CFR 404.1519p - Reviewing reports of consultative examinations.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... file (e.g., your blindness in one eye, amputations, pain, alcoholism, depression); (4) Whether this is... management studies on the quality of consultative examinations purchased from major medical sources and the...

  20. Image Steganography In Securing Sound File Using Arithmetic Coding Algorithm, Triple Data Encryption Standard (3DES) and Modified Least Significant Bit (MLSB)

    NASA Astrophysics Data System (ADS)

    Nasution, A. B.; Efendi, S.; Suwilo, S.

    2018-04-01

    The amount of data embedded when audio samples are inserted at 8 bits per sample with the LSB algorithm affects the PSNR value, which reflects changes in image quality after insertion (fidelity). In this research, audio samples are therefore embedded at 5 bits per sample using the MLSB algorithm to reduce the amount of inserted data; beforehand, the audio samples are compressed with the Arithmetic Coding algorithm to reduce file size. The audio samples are also encrypted with the Triple DES algorithm for better security. The resulting PSNR values exceed 50 dB, so it can be concluded that image quality remains good, since the PSNR values exceed the 40 dB threshold.
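
    For reference, the fidelity metric invoked above is straightforward to compute. The sketch below shows PSNR for 8-bit images under the common convention MAX = 255; the images are synthetic placeholders, not the paper's data.

    ```python
    import numpy as np

    def psnr(original, stego, max_value=255.0):
        """Peak signal-to-noise ratio in dB; higher means less visible change."""
        mse = np.mean((original.astype(np.float64) - stego.astype(np.float64)) ** 2)
        if mse == 0:
            return float("inf")  # identical images
        return 10.0 * np.log10(max_value ** 2 / mse)

    # A stego image perturbed by at most one gray level stays well above the
    # 40 dB "good fidelity" threshold cited above.
    rng = np.random.default_rng(1)
    cover = rng.integers(0, 256, (512, 512), dtype=np.uint8)
    stego = np.clip(cover.astype(int) + rng.integers(-1, 2, cover.shape), 0, 255)
    print(f"PSNR = {psnr(cover, stego):.1f} dB")
    ```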

  1. A Summary of Proposed Changes to the Current ICARTT Format Standards and their Implications to Future Airborne Studies

    NASA Astrophysics Data System (ADS)

    Northup, E. A.; Kusterer, J.; Quam, B.; Chen, G.; Early, A. B.; Beach, A. L., III

    2015-12-01

    The current ICARTT file format standards were developed to fulfill the data management needs of the International Consortium for Atmospheric Research on Transport and Transformation (ICARTT) campaign in 2004. The goal of the ICARTT file format was to establish a common, simple-to-use data file format to promote data exchange and collaboration among science teams with similar science objectives. ICARTT has been the NASA standard since 2010, and is widely used by NOAA, NSF, and international partners (DLR, FAAM). Despite its level of acceptance, there are a number of issues with the current ICARTT format, especially concerning machine readability. To enhance usability, the ICARTT Refresh Earth Science Data Systems Working Group (ESDSWG) was established to provide a platform for atmospheric science data producers, users (e.g. modelers) and data managers to collaborate on developing criteria for this file format. Ultimately, this is a cross-agency effort to improve and aggregate the metadata records being produced. After conducting a survey to identify deficiencies in the current format, we determined which deficiencies are considered most important by the various communities. Numerous recommendations were made to improve the file format while maintaining backward compatibility. The recommendations made to date, together with their advantages and limitations, will be discussed.
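
    The machine-readability discussion above hinges on the format's self-describing header. As a rough illustration, the sketch below reads an ICARTT-style (FFI 1001) ASCII file using only the convention that the first header line begins with the total header-line count; the file name and the assumption that all data columns are numeric are placeholders, not the full standard.

    ```python
    import csv

    def read_icartt(path):
        """Read an FFI 1001 ICARTT-style ASCII file: header lines, then CSV data."""
        with open(path) as fh:
            first = fh.readline()                    # e.g. "46, 1001"
            n_header = int(first.split(",")[0])      # total header-line count
            header = [fh.readline() for _ in range(n_header - 1)]
            data = [[float(v) for v in row] for row in csv.reader(fh) if row]
        return header, data

    # header, data = read_icartt("mission_instrument_20040715_R1.ict")  # placeholder
    ```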

  2. Through the Looking Glass: Real-Time Video Using 'Smart' Technology Provides Enhanced Intraoperative Logistics.

    PubMed

    Baldwin, Andrew C W; Mallidi, Hari R; Baldwin, John C; Sandoval, Elena; Cohn, William E; Frazier, O H; Singh, Steve K

    2016-01-01

    In the setting of increasingly complex medical therapies and limited physician resources, the recent emergence of 'smart' technology offers tremendous potential for improved logistics, efficiency, and communication between medical team members. In an effort to harness these capabilities, we sought to evaluate the utility of this technology in surgical practice through the employment of a wearable camera device during cardiothoracic organ recovery. A single procurement surgeon was trained for use of an Explorer Edition Google Glass (Google Inc., Mountain View, CA) during the recovery process. Live video feed of each procedure was securely broadcast to allow for members of the home transplant team to remotely participate in organ assessment. Primary outcomes involved demonstration of technological feasibility and validation of quality assurance through group assessment. The device was employed for the recovery of four organs: a right single lung, a left single lung, and two bilateral lung harvests. Live video of the visualization process was remotely accessed by the home transplant team, and supplemented final verification of organ quality. In each case, the organs were accepted for transplant without disruption of standard procurement protocols. Media files generated during the procedures were stored in a secure drive for future documentation, evaluation, and education purposes without preservation of patient identifiers. Live video streaming can improve quality assurance measures by allowing off-site members of the transplant team to participate in the final assessment of donor organ quality. While further studies are needed, this project suggests that the application of mobile 'smart' technology offers not just immediate value, but the potential to transform our approach to the practice of medicine.

  3. Reliable file sharing in distributed operating system using web RTC

    NASA Astrophysics Data System (ADS)

    Dukiya, Rajesh

    2017-12-01

    Since the evolution of distributed operating systems, the distributed file system has become an important component of the operating system. P2P is a reliable approach to file sharing in distributed operating systems. Introduced in 1999, it later became a topic of high research interest. A peer-to-peer network is a type of network in which peers share the network workload and other load-related tasks. A P2P network can also be a temporary connection, where a group of computers connected by a USB (Universal Serial Bus) port transfer files or enable disk sharing, i.e. file sharing. Currently, P2P requires a special network designed in a P2P fashion. Browsers, meanwhile, now play a large role in everyday computing. This project studies the file-sharing mechanism of distributed operating systems in web browsers, seeking out performance bottlenecks so that file sharing in distributed file systems can be improved in performance and scalability. Additionally, it discusses the scope of WebTorrent file sharing and free-riding in peer-to-peer networks.

  4. Modification and Validation of Conceptual Design Aerodynamic Prediction Method HASC95 With VTXCHN

    NASA Technical Reports Server (NTRS)

    Albright, Alan E.; Dixon, Charles J.; Hegedus, Martin C.

    1996-01-01

    A conceptual/preliminary design level subsonic aerodynamic prediction code HASC (High Angle of Attack Stability and Control) has been improved in several areas, validated, and documented. The improved code includes new methodologies for increased accuracy and robustness, and simplified input/output files. An engineering method called VTXCHN (Vortex Chine) for predicting nose vortex shedding from circular and non-circular forebodies with sharp chine edges has been improved and integrated into the HASC code. This report contains a summary of modifications, a description of the code, a user's guide, and a validation of HASC. Appendices include a discussion of a new HASC utility code, listings of sample input and output files, and a discussion of the application of HASC to buffet analysis.

  5. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data.

    PubMed

    Gabard-Durnam, Laurel J; Mendez Leal, Adriana S; Wilkinson, Carol L; Levin, April R

    2018-01-01

    Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867-file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact contamination and recording lengths. HAPPE software is freely available under the terms of the GNU General Public License at https://github.com/lcnhappe/happe.
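
    HAPPE itself is distributed as MATLAB software, so the following is only a rough sketch of the pipeline's stages (filtering, ICA-based artifact rejection, re-referencing) rendered in MNE-Python for illustration; the file name, parameter values, and excluded components are placeholders, not HAPPE's actual settings.

    ```python
    import mne

    # Load raw developmental EEG (placeholder file name).
    raw = mne.io.read_raw_edf("infant_eeg.edf", preload=True)
    raw.filter(l_freq=1.0, h_freq=40.0)           # band-pass filter

    # ICA-based artifact rejection (HAPPE uses wavelet-enhanced ICA).
    ica = mne.preprocessing.ICA(n_components=20, random_state=0)
    ica.fit(raw)
    ica.exclude = [0]                             # components flagged as artifact (placeholder)
    ica.apply(raw)

    raw.set_eeg_reference("average")              # re-reference to the average
    ```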

  6. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data

    PubMed Central

    Gabard-Durnam, Laurel J.; Mendez Leal, Adriana S.; Wilkinson, Carol L.; Levin, April R.

    2018-01-01

    Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867-file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact contamination and recording lengths. HAPPE software is freely available under the terms of the GNU General Public License at https://github.com/lcnhappe/happe. PMID:29535597

  7. FD_BH: a program for simulating electromagnetic waves from a borehole antenna

    USGS Publications Warehouse

    Ellefsen, Karl J.

    2002-01-01

    Program FD_BH is used to simulate the electromagnetic waves generated by an antenna in a borehole. The model representing the antenna may include metallic parts, a coaxial cable as a feed to the driving point, and resistive loading. The program is written in the C programming language, and the program has been tested on both the Windows and the UNIX operating systems. This Open-File Report describes:
    • The contents and organization of the Zip file (section 2).
    • The program files, the installation of the program, the input files, and the execution of the program (section 3).
    • The address to which suggestions for improving the program may be sent (section 4).

  8. Implementation of study results in guidelines and adherence to guidelines in clinical practice.

    PubMed

    Waldfahrer, Frank

    2016-01-01

    Guidelines were introduced in hospital- and practice-based otorhinolaryngology in the 1990s, and have been undergoing further development ever since. There are currently 20 guidelines on file at the German Society of Oto-Rhino-Laryngology, Head & Neck Surgery. The society has cooperated on a further 34 guidelines. The quality of the guidelines has been continuously improved by concrete specifications put forward by the Association of the Scientific Medical Societies in Germany (Arbeitsgemeinschaft der Wissenschaftlichen Medizinischen Fachgesellschaften e.V., AWMF). Since increasing digitalization has made access to scientific publications quicker and simpler, relevant study results can be incorporated into guidelines more easily today than in the analog world. S2e and S3 guidelines must be based on a formal literature search with subsequent evaluation of the evidence. The consensus procedure for S2k guidelines is also regulated. However, the implementation of guidelines in routine medical practice must still be considered inadequate, and there is still a considerable need for improvement in adherence to these guidelines.

  9. Implementation of study results in guidelines and adherence to guidelines in clinical practice

    PubMed Central

    Waldfahrer, Frank

    2016-01-01

    Guidelines were introduced in hospital- and practice-based otorhinolaryngology in the 1990s, and have been undergoing further development ever since. There are currently 20 guidelines on file at the German Society of Oto-Rhino-Laryngology, Head & Neck Surgery. The society has cooperated on a further 34 guidelines. The quality of the guidelines has been continuously improved by concrete specifications put forward by the Association of the Scientific Medical Societies in Germany (Arbeitsgemeinschaft der Wissenschaftlichen Medizinischen Fachgesellschaften e.V., AWMF). Since increasing digitalization has made access to scientific publications quicker and simpler, relevant study results can be incorporated into guidelines more easily today than in the analog world. S2e and S3 guidelines must be based on a formal literature search with subsequent evaluation of the evidence. The consensus procedure for S2k guidelines is also regulated. However, the implementation of guidelines in routine medical practice must still be considered inadequate, and there is still a considerable need for improvement in adherence to these guidelines. PMID:28025601

  10. Leveraging iPads to introduce meditation and reduce distress among cancer patients undergoing chemotherapy: a promising approach.

    PubMed

    Millegan, Jeffrey; Manschot, Bernard; Dispenzieri, Monica; Marks, Benjamin; Edwards, Ayesha; Raulston, Vanessa; Khatiwoda, Yojana; Narro, Marlo

    2015-12-01

    Distress is common among cancer patients. Regular meditation practice has the potential to mitigate this distress and improve quality of life for this population. Introducing meditation to cancer patients can be particularly challenging given the demands on patients' time from treatment and normal life events. This internal process improvement study examined the potential benefit of utilizing iPads during chemotherapy sessions to introduce meditation and reduce distress. Patients undergoing chemotherapy infusion were offered iPads with various meditation videos and audio files during the session. Levels of distress were measured using the distress thermometer at the beginning of chemotherapy and at the conclusion of chemotherapy. Seventy-three patients accepted the meditation iPads during the chemotherapy session. Among those who accepted the iPads, average distress dropped 46% by the end of the session (p < 0.0001). The use of iPads during chemotherapy is a potentially effective way to introduce meditation as a stress management tool for people with cancer.

  11. HMO penetration and quality of care: the case of breast cancer.

    PubMed

    Decker, S L; Hempstead, K

    1999-01-01

    In theory, health maintenance organizations (HMOs) receiving a fixed payment rate per enrolled member have an incentive to coordinate services and emphasize prevention and early detection of disease in order to minimize costs of care. This article tests whether higher HMO penetration rates across counties in the United States and across time improve the use of mammography services, the chance of early rather than late detection of breast cancer, and ultimately improve breast cancer survival. We use two data sets to test the effect of HMO penetration on use of breast cancer services and on breast cancer health outcomes for women aged 55 to 64 years. These data sources are matched with county-level data on HMO penetration and other market variables from the Bureau of Health Profession's Area Resource File. Results of logit regression show evidence that HMO penetration positively affects the probability of recent mammography receipt. However, we do not find a statistically significant relationship between HMO penetration and either stage of diagnosis or breast cancer survival.

  12. Reprocessing of multi-channel seismic-reflection data collected in the Beaufort Sea

    USGS Publications Warehouse

    Agena, W.F.; Lee, Myung W.; Hart, P.E.

    2000-01-01

    Contained on this set of two CD-ROMs are stacked and migrated multi-channel seismic-reflection data for 65 lines recorded in the Beaufort Sea by the United States Geological Survey in 1977. All data were reprocessed by the USGS using updated processing methods resulting in improved interpretability. Each of the two CD-ROMs contains the following files: 1) 65 files containing the digital seismic data in standard, SEG-Y format; 2) 1 file containing navigation data for the 65 lines in standard SEG-P1 format; 3) an ASCII text file with cross-reference information for relating the sequential trace numbers on each line to cdp numbers and shotpoint numbers; 4) 2 small scale graphic images (stacked and migrated) of a segment of line 722 in Adobe Acrobat (R) PDF format; 5) a graphic image of the location map, generated from the navigation file; 6) PlotSeis, an MS-DOS Application that allows PC users to interactively view the SEG-Y files; 7) a PlotSeis documentation file; and 8) an explanation of the processing used to create the final seismic sections (this document).
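
    Since the seismic data are distributed in standard SEG-Y format, their headers can be inspected with a few lines of standard-library code. The sketch below assumes the classic SEG-Y layout (a 3200-byte EBCDIC textual header followed by a 400-byte big-endian binary header); the file name is a placeholder, not one of the actual file names on the discs.

    ```python
    import struct

    with open("line722.sgy", "rb") as fh:          # placeholder file name
        fh.seek(3200)                              # skip the EBCDIC textual header
        binary_header = fh.read(400)

    # Big-endian 16-bit fields at their standard offsets within the
    # binary header (offsets relative to its start).
    sample_interval_us, = struct.unpack(">h", binary_header[16:18])
    samples_per_trace,  = struct.unpack(">h", binary_header[20:22])
    format_code,        = struct.unpack(">h", binary_header[24:26])
    print(sample_interval_us, samples_per_trace, format_code)
    ```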

  13. Plate-based diversity subset screening: an efficient paradigm for high throughput screening of a large screening file.

    PubMed

    Bell, Andrew S; Bradley, Joseph; Everett, Jeremy R; Knight, Michelle; Loesel, Jens; Mathias, John; McLoughlin, David; Mills, James; Sharp, Robert E; Williams, Christine; Wood, Terence P

    2013-05-01

    The screening files of many large companies, including Pfizer, have grown considerably due to internal chemistry efforts, company mergers and acquisitions, external contracted synthesis, or compound purchase schemes. In order to screen the targets of interest in a cost-effective fashion, we devised an easy-to-assemble, plate-based diversity subset (PBDS) that represents almost the entire computed chemical space of the screening file whilst comprising only a fraction of the plates in the collection. In order to create this file, we developed new design principles for the quality assessment of screening plates: the Rule of 40 (Ro40) and a plate selection process that ensured excellent coverage of both library chemistry and legacy chemistry space. This paper describes the rationale, design, construction, and performance of the PBDS, which has evolved into the standard paradigm for singleton (one compound per well) high-throughput screening at Pfizer since its introduction in 2006.

  14. TADPLOT program, version 2.0: User's guide

    NASA Technical Reports Server (NTRS)

    Hammond, Dana P.

    1991-01-01

    The TADPLOT Program, Version 2.0 is described. The TADPLOT program is a software package coordinated by a single, easy-to-use interface, enabling the researcher to access several standard file formats, selectively collect specific subsets of data, and create full-featured publication- and viewgraph-quality plots. The user interface was designed to be independent of any file format, yet provide capabilities to accommodate highly specialized data queries. Integrated with an applications software network, data can be accessed, collected, and viewed quickly and easily. Since the commands are data independent, subsequent modifications to the file format will be transparent, while additional file formats can be integrated with minimal impact on the user interface. The graphical capabilities are independent of the method of data collection; thus, the data specification and subsequent plotting can be modified and upgraded as separate functional components. The graphics kernel selected adheres to the full functional specifications of the CORE standard. Both interface and postprocessing capabilities are fully integrated into TADPLOT.

  15. ARCUS Internet Media Archive (IMA): A Window into the Arctic - An Online Resource For Education and Outreach

    NASA Astrophysics Data System (ADS)

    Buxbaum, T. M.; Warnick, W. K.; Polly, B.; Breen, K. J.

    2007-12-01

    The ARCUS Internet Media Archive (IMA) is a collection of photos, graphics, videos, and presentations about the Arctic and Antarctic that are shared through the Internet. It provides the polar research community and the public at large with a centralized location where images and video pertaining to polar research can be browsed and retrieved for a variety of uses. The IMA currently contains almost 6,500 publicly accessible photos, including 4,000 photos from the National Science Foundation (NSF) funded Teachers and Researchers Exploring and Collaborating (TREC) program, an educational research experience in which K-12 teachers participate in arctic research as a pathway to improving science education. The IMA is also the future home of all electronic media from the NSF funded PolarTREC program, a continuation of TREC that now takes place in both the Arctic and Antarctic. The IMA includes 450 video files, 270 audio files, nearly 100 graphics and logos, 28 presentations, and approximately 10,000 additional resources that are being prepared for public access. The contents of this archive are organized by file type, photographer's name, event, or by organization, with each photo or file accompanied by information on content, contributor source, and usage requirements. All the files are keyworded and all information, including file name and description, is completely searchable. ARCUS plans to continue to improve and expand the IMA with a particular focus on providing graphics depicting key arctic research results and findings as well as edited video archives of relevant scientific community meetings. To submit files or for more information and to view the ARCUS Internet Media Archive, please go to: http://media.arcus.org or email photo@arcus.org.

  16. ES-doc-errata: an issue tracker platform for CMIP6

    NASA Astrophysics Data System (ADS)

    Ben Nasser, Atef; Levavasseur, Guillaume; Greenslade, Mark; Denvil, Sébastien

    2017-04-01

    In the context of overseeing data quality, and given the inherent complexity of projects such as CMIP5/6, it is a mandatory task to keep track of the status of datasets and the version evolution they undergo during their life-cycle. The ES-doc-errata project aims to keep track of the issues affecting specific versions of datasets/files. It enables users to resolve the history tree of each dataset/file, enabling a better-informed choice of the data used in their work based on its status. The ES-doc-errata platform has been designed and built on top of the persistent identifier (PID) handle service that will be deployed in the next iteration of the CMIP project, ensuring maximum usability of the ESGF ecosystem, and is encapsulated in the ES-doc structure. Consuming PIDs from the handle service is guided by a purpose-built algorithm that extracts metadata regarding the issues that may or may not affect the quality of datasets/files and cause a newer version to be published, replacing older deprecated versions. This algorithm can trace the nature of the flaws down to file granularity, which is of high value to the end-user. The platform has been designed with usability in mind, both for end-users specialized in the data publishing process and for other scientists requiring feedback on the reliability of the data needed for their work. To this end, a specific set of rules and a code of conduct have been defined. A validation process ensures the quality of the newly introduced errata metadata, an authentication safeguard prevents tampering with the archived data, and a wide variety of tools, including a command-line client and a dedicated front-end, allow users to interact safely with the platform.

  17. SU-E-T-261: Plan Quality Assurance of VMAT Using Fluence Images Reconstituted From Log-Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katsuta, Y; Shimizu, E; Matsunaga, K

    2014-06-01

    Purpose: A successful VMAT plan delivery requires precise modulation of dose rate, gantry rotation, and multi-leaf collimator (MLC) shapes. One of the main problems in plan quality assurance is that dosimetric errors associated with leaf-positional errors are difficult to analyze, because they vary with the MU delivered and with leaf number. In this study, we calculated an integrated fluence error image (IFEI) from log-files and evaluated plan quality over the areas scanned by all MLC leaves and by individual leaves. Methods: The log-file reported the expected and actual positions of the inner 20 MLC leaves and the dose fraction every 0.25 seconds during prostate VMAT on an Elekta Synergy. These data were imported into in-house software developed to calculate expected and actual fluence images from the difference of opposing leaf trajectories and the dose fraction at each time point. The IFEI was obtained by summing the absolute values of the differences between corresponding expected and actual fluence images. Results: In the area scanned by all MLC leaves, the average and root mean square (rms) of the IFEI were 2.5 and 3.6 MU, the areas with errors below 10, 5 and 3 MU were 98.5, 86.7 and 68.1%, and 95% of the area had an error of less than 7.1 MU. In the areas scanned by individual MLC leaves, the average and rms values were 2.1 - 3.0 and 3.1 - 4.0 MU, the areas with errors below 10, 5 and 3 MU were 97.6 - 99.5, 81.7 - 89.5 and 51.2 - 72.8%, and 95% of the area had an error of less than 6.6 - 8.2 MU. Conclusion: Analysis of the IFEI reconstituted from log-files provided detailed information about the delivery over the areas scanned by all and by individual MLC leaves.
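
    A minimal sketch of the IFEI computation as described above: per 0.25-s log-file snapshot, fluence is accumulated over the aperture between opposing leaves, weighted by the dose fraction, and the absolute expected-minus-actual difference is summed. The snapshot dictionary keys, grid, and aperture rasterisation are simplified assumptions, not the authors' software.

    ```python
    import numpy as np

    def fluence_increment(left, right, dose_fraction, x_grid):
        """Dose added by one snapshot wherever each leaf pair is open."""
        open_mask = (x_grid >= left[:, None]) & (x_grid <= right[:, None])
        return dose_fraction * open_mask.astype(float)

    def integrated_fluence_error(snapshots, x_grid, n_leaves=20):
        """Sum of |expected - actual| fluence over all 0.25-s log records (MU)."""
        ifei = np.zeros((n_leaves, x_grid.size))
        for snap in snapshots:
            expected = fluence_increment(snap["exp_left"], snap["exp_right"],
                                         snap["dose"], x_grid)
            actual = fluence_increment(snap["act_left"], snap["act_right"],
                                       snap["dose"], x_grid)
            ifei += np.abs(expected - actual)
        return ifei   # one row per leaf pair, one column per position
    ```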

  18. Effects of Health Level 7 Messaging on Data Quality in New York City's Immunization Information System, 2014.

    PubMed

    Metroka, Amy E; Papadouka, Vikki; Ternier, Alexandra; Zucker, Jane R

    2016-01-01

    We compared the quality of data reported to New York City's immunization information system, the Citywide Immunization Registry (CIR), through its real-time Health Level 7 (HL7) Web service from electronic health records (EHRs), with data submitted through other methods. We stratified immunizations administered and reported to the CIR in 2014 for patients aged 0-18 years by reporting method: (1) sending HL7 messages from EHRs through the Web service, (2) manual data entry, and (3) upload of a non-standard flat file from EHRs. We assessed completeness of reporting by measuring the percentage of immunizations reported with lot number, manufacturer, and Vaccines for Children (VFC) program eligibility. We assessed timeliness of reporting by determining the number of days from date of administration to date entered into the CIR. HL7 reporting accounted for the largest percentage (46.3%) of the 3.8 million immunizations reported in 2014. Of immunizations reported using HL7, 97.9% included the lot number and 92.6% included the manufacturer, compared with 50.4% and 48.0% for manual entry, and 65.9% and 48.8% for non-standard flat file, respectively. VFC eligibility was 96.9% complete when reported by manual data entry, 95.3% complete for HL7 reporting, and 87.2% complete for non-standard flat file reporting. Of the three reporting methods, HL7 was the most timely: 77.6% of immunizations were reported by HL7 in <1 day, compared with 53.6% of immunizations reported through manual data entry and 18.1% of immunizations reported through non-standard flat file. HL7 reporting from EHRs resulted in more complete and timely data in the CIR compared with other reporting methods. Providing resources to facilitate HL7 reporting from EHRs to immunization information systems to increase data quality should be a priority for public health.
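
    One reason HL7 reporting captures lot number and manufacturer so completely is that, in a VXU immunization message, they travel as discrete fields a registry can extract mechanically. The sketch below parses a simplified, fabricated message; the field positions follow common HL7 v2 usage (RXA-15 lot number, RXA-17 manufacturer), and no real product data are implied.

    ```python
    # Fabricated two-segment VXU-style message; segments end with carriage returns.
    message = (
        "MSH|^~\\&|EHR|CLINIC|CIR|NYC|20140501||VXU^V04|123|P|2.5.1\r"
        "RXA|0|1|20140501||08^HepB^CVX|0.5|mL||||||||ABC123||MSD^Merck^MVX\r"
    )

    for segment in message.strip().split("\r"):
        fields = segment.split("|")
        if fields[0] == "RXA":
            lot_number = fields[15]       # RXA-15: substance lot number
            manufacturer = fields[17]     # RXA-17: substance manufacturer
            print(lot_number, manufacturer)
    ```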

  19. Effects of Health Level 7 Messaging on Data Quality in New York City's Immunization Information System, 2014

    PubMed Central

    Papadouka, Vikki; Ternier, Alexandra; Zucker, Jane R.

    2016-01-01

    Objective We compared the quality of data reported to New York City's immunization information system, the Citywide Immunization Registry (CIR), through its real-time Health Level 7 (HL7) Web service from electronic health records (EHRs), with data submitted through other methods. Methods We stratified immunizations administered and reported to the CIR in 2014 for patients aged 0–18 years by reporting method: (1) sending HL7 messages from EHRs through the Web service, (2) manual data entry, and (3) upload of a non-standard flat file from EHRs. We assessed completeness of reporting by measuring the percentage of immunizations reported with lot number, manufacturer, and Vaccines for Children (VFC) program eligibility. We assessed timeliness of reporting by determining the number of days from date of administration to date entered into the CIR. Results HL7 reporting accounted for the largest percentage (46.3%) of the 3.8 million immunizations reported in 2014. Of immunizations reported using HL7, 97.9% included the lot number and 92.6% included the manufacturer, compared with 50.4% and 48.0% for manual entry, and 65.9% and 48.8% for non-standard flat file, respectively. VFC eligibility was 96.9% complete when reported by manual data entry, 95.3% complete for HL7 reporting, and 87.2% complete for non-standard flat file reporting. Of the three reporting methods, HL7 was the most timely: 77.6% of immunizations were reported by HL7 in <1 day, compared with 53.6% of immunizations reported through manual data entry and 18.1% of immunizations reported through non-standard flat file. Conclusion HL7 reporting from EHRs resulted in more complete and timely data in the CIR compared with other reporting methods. Providing resources to facilitate HL7 reporting from EHRs to immunization information systems to increase data quality should be a priority for public health. PMID:27453603

  20. 75 FR 57313 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-20

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-62902; File No. SR-CBOE-2010-081] Self... Options Executed in Open Outcry or in the Automated Improvement Mechanism September 14, 2010. Pursuant to... notice to solicit comments on the proposed rule change from interested persons. I. Self-Regulatory...

  1. 76 FR 41842 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-15

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-64851; File No. SR-CBOE-2011-062] Self... Effectiveness of a Proposed Rule Change to Amend Its Fees Schedule Regarding Automated Improvement Mechanism.... Self-Regulatory Organization's Statement of the Terms of the Substance of the Proposed Rule Change The...

  2. 76 FR 40948 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-12

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-64817; File No. SR-CBOE-2011-059] Self... Customer Orders in SPY Options Executed in Open Outcry or in the Automated Improvement Mechanism July 6...)(2). I. Self-Regulatory Organization's Statement of the Terms of Substance of the Proposed Rule...

  3. 78 FR 6152 - Self-Regulatory Organizations; NYSE MKT LLC; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-29

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-68710; File No. SR-NYSEMKT-2013-02] Self... Amending Rule 107C-- Equities To Allow Retail Liquidity Providers To Enter Retail Price Improvement Orders... I and II below, which Items have been prepared by the self-regulatory organization. The Commission...

  4. 75 FR 76770 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated: Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-09

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-63422; File No. SR-CBOE-2010-105] Self... Orders in SPDR Options Executed in Open Outcry or in the Automated Improvement Mechanism December 3, 2010... notice to solicit comments on the proposed rule change from interested persons. I. Self-Regulatory...

  5. 75 FR 74757 - Self-Regulatory Organizations; Notice of Filing and Immediate Effectiveness of Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-01

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-63372; File No. SR-Phlx-2010-162] Self... PHLX LLC Relating to Price Improvement (PIXL) Fees November 24, 2010. Pursuant to Section 19(b)(1) of... below, which Items have been prepared by the self-regulatory organization. The Commission is publishing...

  6. RELEASE NOTES FOR MODELS-3 VERSION 4.1 PATCH: SMOKE TOOL AND FILE CONVERTER

    EPA Science Inventory

    This software patch to the Models-3 system corrects minor errors in the Models-3 framework, provides substantial improvements in the ASCII to I/O API format conversion of the File Converter utility, and new functionalities for the SMOKE Tool. Version 4.1 of the Models-3 system...

  7. Log-less metadata management on metadata server for parallel file systems.

    PubMed

    Liao, Jianwei; Xiao, Guoqiang; Peng, Xiaoning

    2014-01-01

    This paper presents a novel metadata management mechanism on the metadata server (MDS) for parallel and distributed file systems. In this technique, the client file system backs up the metadata requests it has sent and the metadata server has already handled, so that the MDS does not need to log metadata changes to nonvolatile storage to achieve highly available metadata service, and metadata-processing performance improves as well. Because the client file system backs up certain sent metadata requests in its memory, the overhead of handling these backup requests is much smaller than the overhead the metadata server incurs when it adopts logging or journaling to provide highly available metadata service. The experimental results show that this newly proposed mechanism can significantly improve the speed of metadata processing and deliver better I/O throughput than conventional metadata management schemes, that is, logging or journaling on the MDS. Moreover, a complete metadata recovery can be achieved by replaying the backup logs cached by all involved clients when the metadata server crashes or otherwise becomes non-operational.
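
    A minimal sketch of the mechanism as described above: the client retains an in-memory backup of sent metadata requests, the MDS keeps only volatile state with no journal, and recovery replays the client backups. All class and method names are illustrative, not the paper's implementation.

    ```python
    class MDS:
        """Metadata server holding volatile state only -- no journal, no log."""
        def __init__(self):
            self.namespace = {}

        def handle(self, request):
            op, path, value = request
            if op == "create":
                self.namespace[path] = value
            elif op == "remove":
                self.namespace.pop(path, None)

    class ClientFS:
        """Client that backs up sent requests in memory for crash recovery."""
        def __init__(self, mds):
            self.mds = mds
            self.backup = []              # cheap memory append, no disk I/O

        def send(self, request):
            self.backup.append(request)   # keep a copy before handing off
            self.mds.handle(request)

        def replay(self, fresh_mds):
            for request in self.backup:   # recovery path after an MDS crash
                fresh_mds.handle(request)

    client = ClientFS(MDS())
    client.send(("create", "/data/a.txt", "inode-1"))
    recovered = MDS()
    client.replay(recovered)              # rebuilds state without a journal
    ```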

  8. Log-Less Metadata Management on Metadata Server for Parallel File Systems

    PubMed Central

    Xiao, Guoqiang; Peng, Xiaoning

    2014-01-01

    This paper presents a novel metadata management mechanism on the metadata server (MDS) for parallel and distributed file systems. In this technique, the client file system backs up the metadata requests it has sent and the metadata server has already handled, so that the MDS does not need to log metadata changes to nonvolatile storage to achieve highly available metadata service, and metadata-processing performance improves as well. Because the client file system backs up certain sent metadata requests in its memory, the overhead of handling these backup requests is much smaller than the overhead the metadata server incurs when it adopts logging or journaling to provide highly available metadata service. The experimental results show that this newly proposed mechanism can significantly improve the speed of metadata processing and deliver better I/O throughput than conventional metadata management schemes, that is, logging or journaling on the MDS. Moreover, a complete metadata recovery can be achieved by replaying the backup logs cached by all involved clients when the metadata server crashes or otherwise becomes non-operational. PMID:24892093

  9. Cryptonite: A Secure and Performant Data Repository on Public Clouds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumbhare, Alok; Simmhan, Yogesh; Prasanna, Viktor

    2012-06-29

    Cloud storage has become immensely popular for maintaining synchronized copies of files and for sharing documents with collaborators. However, there is heightened concern about the security and privacy of Cloud-hosted data due to the shared infrastructure model and an implicit trust in the service providers. Emerging needs of secure data storage and sharing for domains like Smart Power Grids, which deal with sensitive consumer data, require the persistence and availability of Cloud storage but with client-controlled security and encryption, low key management overhead, and minimal performance costs. Cryptonite is a secure Cloud storage repository that addresses these requirements using a StrongBox model for shared key management. We describe the Cryptonite service and desktop client, discuss performance optimizations, and provide an empirical analysis of the improvements. Our experiments show that Cryptonite clients achieve a 40% improvement in file upload bandwidth over plaintext storage using the Azure Storage Client API despite the added security benefits, while our file download performance is 5 times faster than the baseline for files greater than 100MB.
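
    The core idea, client-controlled encryption before any data reaches the provider, can be illustrated compactly. The sketch below uses the Python cryptography package's Fernet primitive and a placeholder upload function; Cryptonite's actual StrongBox key management and Azure client are considerably richer.

    ```python
    from cryptography.fernet import Fernet

    def upload_blob(name, ciphertext):
        """Placeholder for a cloud-storage PUT (e.g. an Azure blob upload)."""
        print(f"uploading {name}: {len(ciphertext)} bytes of ciphertext")

    key = Fernet.generate_key()        # held by the client, never by the provider
    box = Fernet(key)

    plaintext = b"2012-06-01,meter-17,41.3kWh"   # e.g. sensitive smart-grid data
    upload_blob("meter_readings.enc", box.encrypt(plaintext))

    # Only a holder of `key` can recover the data later:
    assert box.decrypt(box.encrypt(plaintext)) == plaintext
    ```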

  10. X-MATE: a flexible system for mapping short read data

    PubMed Central

    Pearson, John V.; Cloonan, Nicole; Grimmond, Sean M.

    2011-01-01

    Summary: Accurate and complete mapping of short-read sequencing to a reference genome greatly enhances the discovery of biological results and improves statistical predictions. We recently presented RNA-MATE, a pipeline for the recursive mapping of RNA-Seq datasets. With the rapid increase in genome re-sequencing projects, progression of available mapping software and the evolution of file formats, we now present X-MATE, an updated version of RNA-MATE, capable of mapping both RNA-Seq and DNA datasets and with improved performance, output file formats, configuration files, and flexibility in core mapping software. Availability: Executables, source code, junction libraries, test data and results and the user manual are available from http://grimmond.imb.uq.edu.au/X-MATE/. Contact: n.cloonan@uq.edu.au; s.grimmond@uq.edu.au Supplementary information: Supplementary data are available at Bioinformatics Online. PMID:21216778

  11. Active Management of Integrated Geothermal-CO2 Storage Reservoirs in Sedimentary Formations

    DOE Data Explorer

    Buscheck, Thomas A.

    2012-01-01

    Active Management of Integrated Geothermal–CO2 Storage Reservoirs in Sedimentary Formations: An Approach to Improve Energy Recovery and Mitigate Risk: FY1 Final Report. The purpose of phase 1 is to determine the feasibility of integrating geologic CO2 storage (GCS) with geothermal energy production. Phase 1 includes reservoir analyses to determine injector/producer well schemes that balance the generation of economically useful flow rates at the producers with the need to manage reservoir overpressure, reducing the risks associated with overpressure, such as induced seismicity and CO2 leakage to overlying aquifers. This submittal contains the input and output files of the reservoir model analyses. A reservoir-model "index-html" file was sent in a previous submittal to organize the reservoir-model input and output files according to the sections of the FY1 Final Report to which they pertain. The recipient should save the file Reservoir-models-inputs-outputs-index.html in the same directory as the Section2.1.*.tar.gz files.

  13. Diurnal Ensemble Surface Meteorology Statistics

    EPA Pesticide Factsheets

    Excel file containing diurnal ensemble statistics of 2-m temperature, 2-m mixing ratio and 10-m wind speed. This Excel file contains figures for Figure 2 in the paper and worksheets containing all statistics for the 14 members of the ensemble and a base simulation. This dataset is associated with the following publication: Gilliam, R., C. Hogrefe, J. Godowitch, S. Napelenok, R. Mathur, and S.T. Rao. Impact of inherent meteorology uncertainty on air quality model predictions. JOURNAL OF GEOPHYSICAL RESEARCH-ATMOSPHERES. American Geophysical Union, Washington, DC, USA, 120(23): 12,259–12,280, (2015).

  14. DECOMP: a PDB decomposition tool on the web.

    PubMed

    Ordog, Rafael; Szabadka, Zoltán; Grolmusz, Vince

    2009-07-27

    The Protein Data Bank (PDB) contains high-quality structural data for computational structural biology investigations. We have earlier described a fast tool (the decomp_pdb tool) for identifying and marking missing atoms and residues in PDB files. The tool also automatically decomposes PDB entries into separate files describing ligands and polypeptide chains. Here, we describe a web interface named DECOMP for the tool. Our program correctly identifies multi-monomer ligands, and the server also offers the preprocessed ligand-protein decomposition of the complete PDB for downloading (up to 5 GB in size). Availability: http://decomp.pitgroup.org.
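
    Because PDB coordinate data are line-oriented, fixed-column records, the chain/ligand decomposition the tool performs can be approximated in a few lines. The sketch below separates ATOM and HETATM records by chain identifier; the entry file name is a placeholder and the handling (e.g. of waters) is deliberately simplified.

    ```python
    chains, ligands = {}, {}

    with open("1abc.pdb") as fh:                 # placeholder PDB entry
        for line in fh:
            record = line[:6].strip()            # columns 1-6: record name
            if record in ("ATOM", "HETATM"):
                chain_id = line[21]              # column 22: chain identifier
                target = chains if record == "ATOM" else ligands
                target.setdefault(chain_id, []).append(line)

    # Each dictionary entry could now be written out as a separate file,
    # mirroring the per-chain / per-ligand decomposition described above.
    ```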

  15. Shaping a screening file for maximal lead discovery efficiency and effectiveness: elimination of molecular redundancy.

    PubMed

    Bakken, Gregory A; Bell, Andrew S; Boehm, Markus; Everett, Jeremy R; Gonzales, Rosalia; Hepworth, David; Klug-McLeod, Jacquelyn L; Lanfear, Jeremy; Loesel, Jens; Mathias, John; Wood, Terence P

    2012-11-26

    High Throughput Screening (HTS) is a successful strategy for finding hits and leads that have the opportunity to be converted into drugs. In this paper we highlight novel computational methods used to select compounds to build a new screening file at Pfizer and the analytical methods we used to assess their quality. We also introduce the novel concept of molecular redundancy to help decide on the density of compounds required in any region of chemical space in order to be confident of running successful HTS campaigns.

  16. 37 CFR 1.81 - Drawings required in patent application.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... to be patented; this drawing, or a high quality copy thereof, must be filed with the application... the original disclosure thereof for the purpose of interpretation of the scope of any claim. [43 FR...

  17. VizieR Online Data Catalog: Trumpler 5 photometric BV catalog (Donati+, 2015)

    NASA Astrophysics Data System (ADS)

    Donati, P.; Cocozza, G.; Bragaglia, A.; Pancino, E.; Cantat-Gaudin, T.; Carrera, R.; Tosi, M.

    2014-11-01

    We combined high-quality photometric observations obtained with WFI and high-resolution spectra obtained with FLAMES to determine accurate cluster parameters, namely age, distance, reddening, and metallicity. (2 data files).

  18. 75 FR 8303 - Marine Mammals; File No. 13430

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-24

    ..., Silver Spring, MD 20910; phone (301) 713-2289; fax (301) 713-0376; and Northwest Region, NMFS, 7600 Sand... impact the quality of the human environment and that preparation of an environmental impact statement was...

  19. 77 FR 26517 - Marine Mammals; File No. 14118

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-04

    ... significantly impact the quality of the human environment and that preparation of an environmental impact...) 713-0376; Northwest Region, NMFS, 7600 Sand Point Way NE., BIN C15700, Bldg. 1, Seattle, WA 98115-0700...

  20. 78 FR 25425 - Marine Mammals; File No. 16388

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-01

    ... the quality of the human environment and that preparation of an environmental impact statement was not...) 713-0376; Northwest Region, NMFS, 7600 Sand Point Way NE., BIN C15700, Bldg. 1, Seattle, WA 98115-0700...

  1. 77 FR 58358 - Marine Mammals; File No. 14097

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-20

    ... significantly impact the quality of the human environment and that preparation of an environmental impact...) 713-0376; Northwest Region, NMFS, 7600 Sand Point Way NE., BIN C15700, Bldg. 1, Seattle, WA 98115-0700...

  2. 77 FR 40859 - Marine Mammals; File No. 14097

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-11

    ... impact the quality of the human environment and that preparation of an environmental impact statement was..., Silver Spring, MD 20910; phone (301)427-8401; fax (301)713-0376; Northwest Region, NMFS, 7600 Sand Point...

  3. 75 FR 40776 - Marine Mammals; File No. 14097

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-14

    ... quality of the human environment and that preparation of an environmental impact statement was not...; Northwest Region, NMFS, 7600 Sand Point Way NE, BIN C15700, Bldg. 1, Seattle, WA 98115-0700; phone (206) 526...

  4. The availability of community health center services and access to medical care.

    PubMed

    Kirby, James B; Sharma, Ravi

    2017-12-01

    Community Health Centers (CHCs) funded by Section 330 of the Public Health Service Act are an essential part of the health care safety net in the US. The Patient Protection and Affordable Care Act expanded the program significantly, but the extent to which the availability of CHCs improve access to care in general is not clear. In this paper, we examine the associations between the availability of CHC services in communities and two key measures of ambulatory care access - having a usual source of care and having any office-based medical visits over a one year period. We pooled six years of data from the Medical Expenditure Panel Survey (2008-2013) and linked it to geographic data on CHCs from Health Resources and Services Administration's Health Center Program Uniform Data System. We also link other community characteristics from the Area Health Resource File and the Dartmouth Institute's data files. The associations between CHC availability and our access measures are estimated with logistic regression models stratified by insurance status. The availability of CHC services was positively associated with both measures of access among those with no insurance coverage. Additionally, it was positively associated with having a usual source of care among those with Medicaid and private insurance. These findings persist after controlling for key individual- and community-level characteristics. Our findings suggest that an enhanced CHC program could be an important resource for supporting the efficacy of expanded Medicaid coverage under the Affordable Care Act and, ultimately, improving access to quality primary care for underserved Americans. Published by Elsevier Inc.

  5. The development of the July 1989 1 deg x 1 deg and 30' x 30' terrestrial mean free-air anomaly data bases

    NASA Technical Reports Server (NTRS)

    Kim, Jeong-Hee; Rapp, Richard H.

    1990-01-01

    In June 1986 a 1 x 1 deg mean free-air anomaly data file containing 48955 anomalies was completed. In August 1986 a 30 x 30 min mean free-air anomaly file was defined containing 31787 values. For the past three years data has been collected to upgrade these mean anomaly files. The primary emphasis was the collection of data to be used for the estimation of 30 min mean anomalies in land areas. The emphasis on land areas was due to the anticipated use of 30 min anomalies derived from satellite altimeter data in the ocean areas. There were 10 data sources in the August 1986 file. Twenty-eight sources were added based on the collection of both point and mean anomalies from a number of individuals and organizations. A preliminary 30 min file was constructed from the 38 data sources. This file was used to calculate 1 x 1 deg mean anomalies. This 1 x 1 deg file was merged with a 1 x 1 deg file which was a merger of the June 1986 file plus a 1 x 1 deg file made available by DMA Aerospace Center. Certain bad 30 min anomalies were identified and deleted from the preliminary 30 min file leading to the final 30 min file (the July 1989 30 min file) with 66990 anomalies and their accuracy. These anomalies were used to again compute 1 x 1 deg anomalies which were merged with the previous June 86 DMAAC data file. The final 1 x 1 deg mean anomaly file (the July 89 1 x 1 deg data base) contained 50793 anomalies and their accuracy. The anomaly data files were significantly improved over the prior data sets in the following geographic regions: Africa, Scandinavia, Canada, United States, Mexico, Central and South America. Substantial land areas remain where there is little or no available data.
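
    The aggregation step, in which four 30' x 30' means combine into one 1 deg x 1 deg mean, is illustrated below with an unweighted 2x2 block average; a real computation would weight by cell area and propagate the accuracy estimates, and the grid here is a placeholder, not the 1989 data.

    ```python
    import numpy as np

    # Placeholder global 30' grid: 360 latitude rows x 720 longitude columns (mGal).
    anom_30min = np.arange(360 * 720, dtype=float).reshape(360, 720)

    # Group the grid into 2x2 blocks of 30' cells and average each block.
    anom_1deg = anom_30min.reshape(180, 2, 360, 2).mean(axis=(1, 3))
    print(anom_1deg.shape)   # (180, 360): one value per 1 deg x 1 deg cell
    ```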

  6. Improving quality of care and long-term health outcomes through continuity of care with the use of an electronic or paper patient-held portable health file (COMMUNICATE): study protocol for a randomized controlled trial.

    PubMed

    Lassere, Marissa Nichole; Baker, Sue; Parle, Andrew; Sara, Anthony; Johnson, Kent Robert

    2015-06-04

    The advantages of patient-held portable health files (PHF) and personal health records (PHR), paper or electronic, are said to include improved health-care provider continuity-of-care and patient empowerment in maintaining health. Top-down approaches are favored by public sector government and health managers. Bottom-up approaches include systems developed directly by health-care providers, consumers and industry, implemented locally on devices carried by patient-consumers or shared via web-based portals. These allow individuals to access, manage and share their health information, and that of others for whom they are authorized, in a private, secure and confidential environment. Few medical record technologies have been evaluated in randomized trials to determine whether there are important clinical benefits of these interventions. The COMMUNICATE trial will assess the acceptability and long-term clinical outcomes of an electronic and paper patient-held PHF. This is a 48-month, open-label pragmatic, superiority, parallel-group design randomized controlled trial. Subjects (n = 792) will be randomized in a 1:1:1 ratio to each of the trial arms: the electronic PHF added to usual care, the paper PHF added to usual care and usual care alone (no PHF). Inclusion criteria include those 60 years or older living independently in the community, but who have two or more chronic medical conditions that require prescription medication and regular care by at least three medical practitioners (general and specialist care). The primary objective is whether use of a PHF compared to usual care reduces a combined endpoint of deaths, overnight hospitalizations and blindly adjudicated serious out-of-hospital events. All primary analyses will be undertaken masked to randomized arm allocation using intention-to-treat principles. Secondary outcomes include quality of life and health literacy improvements. Lack of blinding creates potential for bias in trial conduct and ascertainment of clinical outcomes. Mechanisms are provided to reduce bias, including balanced study contact with all participants, a blinded adjudication committee determining which out-of-hospital events are serious and endpoints that are objective (overnight hospitalizations and mortality). The PRECIS tool provides a summary of the trial's design on the Pragmatic-Explanatory Continuum. Registered with Clinicaltrials.gov (identifier: NCT01082978) on 8 March 2010.

  7. A Quality Screening Service for Remote Sensing Data

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Olsen, Edward; Fox, Peter; Vollmer, Bruce; Wolfe, Robert; Samadi, Shahin

    2010-01-01

    NASA provides a wide variety of Earth-observing satellite data products to a diverse community. These data are annotated with quality information in a variety of ways, with the result that many users struggle to understand how to properly account for quality when dealing with satellite data. To address this issue, a Data Quality Screening Service (DQSS) is being implemented for a number of datasets. The DQSS will enable users to obtain data files in which low-quality pixels have been filtered out, based either on quality criteria recommended by the science team or on the user's particular quality criteria. The objective is to increase proper utilization of this critical quality data in science data analysis of satellite data products.
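
    As an editorial aside, the screening the DQSS describes reduces to masking data against a per-pixel quality flag. A minimal sketch, assuming a hypothetical value array and flag array (real products define their own flag conventions):

        import numpy as np

        # Hypothetical science values and per-pixel quality flags
        # (0 = best, higher = worse); keep only flags <= 1, mimicking
        # a science-team screening recommendation.
        data = np.array([[270.1, 268.4, 262.0], [265.7, 271.9, 269.3]])
        quality = np.array([[0, 2, 3], [1, 0, 1]])

        MAX_ACCEPTED_FLAG = 1  # user-adjustable criterion
        screened = np.where(quality <= MAX_ACCEPTED_FLAG, data, np.nan)
        print(screened)  # low-quality pixels replaced by NaN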

  8. An update to the Surface Ocean CO2 Atlas (SOCAT version 2)

    NASA Astrophysics Data System (ADS)

    Bakker, D. C. E.; Pfeil, B.; Smith, K.; Hankin, S.; Olsen, A.; Alin, S. R.; Cosca, C.; Harasawa, S.; Kozyr, A.; Nojiri, Y.; O'Brien, K. M.; Schuster, U.; Telszewski, M.; Tilbrook, B.; Wada, C.; Akl, J.; Barbero, L.; Bates, N.; Boutin, J.; Cai, W.-J.; Castle, R. D.; Chavez, F. P.; Chen, L.; Chierici, M.; Currie, K.; de Baar, H. J. W.; Evans, W.; Feely, R. A.; Fransson, A.; Gao, Z.; Hales, B.; Hardman-Mountford, N.; Hoppema, M.; Huang, W.-J.; Hunt, C. W.; Huss, B.; Ichikawa, T.; Johannessen, T.; Jones, E. M.; Jones, S. D.; Jutterström, S.; Kitidis, V.; Körtzinger, A.; Landschützer, P.; Lauvset, S. K.; Lefèvre, N.; Manke, A. B.; Mathis, J. T.; Merlivat, L.; Metzl, N.; Murata, A.; Newberger, T.; Ono, T.; Park, G.-H.; Paterson, K.; Pierrot, D.; Ríos, A. F.; Sabine, C. L.; Saito, S.; Salisbury, J.; Sarma, V. V. S. S.; Schlitzer, R.; Sieger, R.; Skjelvan, I.; Steinhoff, T.; Sullivan, K.; Sun, H.; Sutton, A. J.; Suzuki, T.; Sweeney, C.; Takahashi, T.; Tjiputra, J.; Tsurushima, N.; van Heuven, S. M. A. C.; Vandemark, D.; Vlahos, P.; Wallace, D. W. R.; Wanninkhof, R.; Watson, A. J.

    2013-08-01

    The Surface Ocean CO2 Atlas (SOCAT) is an effort by the international marine carbon research community. It aims to improve access to carbon dioxide measurements in the surface oceans through regular releases of quality-controlled and fully documented synthesis and gridded fCO2 (fugacity of carbon dioxide) products. SOCAT version 2, presented here, extends the data set for the global oceans and coastal seas by four years and contains 10.1 million surface water fCO2 values from 2660 cruises between 1968 and 2011. The procedures for creating version 2 have been comparable to those for version 1. The SOCAT website (http://www.socat.info/) provides access to the individual cruise data files, as well as to the synthesis and gridded data products. Interactive online tools allow visitors to explore the richness of the data. Scientific users can also retrieve the data as downloadable files or via Ocean Data View. Version 2 enables carbon specialists to extend their studies through 2011. Applications of SOCAT include process studies, quantification of the ocean carbon sink and its spatial, seasonal, year-to-year and longer-term variation, as well as initialisation or validation of ocean carbon models and coupled climate-carbon models.
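
    As an editorial aside, the gridded-product concept can be illustrated by binning observations onto a regular grid. A minimal sketch with hypothetical observations (the actual SOCAT gridding uses more careful weighting and quality flags):

        import numpy as np

        # Hypothetical surface-water fCO2 observations: (lat, lon, uatm).
        obs = [(10.2, -25.7, 372.1), (10.8, -25.1, 369.4), (-3.4, 140.2, 381.0)]

        grid_sum = np.zeros((180, 360))
        grid_n = np.zeros((180, 360), dtype=int)
        for lat, lon, fco2 in obs:
            i, j = int(lat + 90), int(lon + 180)  # 1-deg bin indices
            grid_sum[i, j] += fco2
            grid_n[i, j] += 1

        grid_mean = np.where(grid_n > 0, grid_sum / np.maximum(grid_n, 1), np.nan)
        print(f"{(grid_n > 0).sum()} occupied 1-deg cells")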

  9. Patent prosecution at the European Patent Office: what is new for life sciences applicants?

    PubMed

    Lidén, Camilla; Setréus, Ellen

    2011-06-01

    During the last couple of years, the European Patent Office (EPO) has gradually implemented the so-called 'Raising the Bar' initiative, which is aimed at improving the quality of European patents and, in combination with speeding up the procedure for grant, reducing uncertainty for third parties. As a part of this initiative, a series of changes to the implementing regulations of the European Patent Convention (EPC) were introduced last year. Applicants now need to be sure of their position with regard to supportive experimental data and thorough argumentation at an earlier stage in the prosecution of a patent application. When evaluating the criteria of inventive step (Art 56 EPC) and sufficiency of disclosure (Art 83 EPC), the EPO frequently refers to two decisions issued by the Boards of Appeal. These decisions are also used to determine whether it is appropriate to further support a disclosed technical effect or a scope of protection with experimental data produced after the date of filing of the patent application. In view of recent case law, it is evident that applicants cannot rely on post-filed experimental data as the only evidence for a technical effect or a certain scope of protection claimed in the application.

  10. NoGOA: predicting noisy GO annotations using evidences and sparse representation.

    PubMed

    Yu, Guoxian; Lu, Chang; Wang, Jun

    2017-07-21

    Gene Ontology (GO) is a community effort to represent functional features of gene products. GO annotations (GOA) provide functional associations between GO terms and gene products. Due to resource limitations, only a small portion of annotations are manually checked by curators, and the others are electronically inferred. Although quality control techniques have been applied to ensure the quality of annotations, the community consistently reports that there are still considerable noisy (or incorrect) annotations. Given the wide application of annotations, however, how to identify noisy annotations is an important but seldom studied open problem. We introduce a novel approach called NoGOA to predict noisy annotations. NoGOA applies sparse representation on the gene-term association matrix to reduce the impact of noisy annotations, and takes advantage of sparse representation coefficients to measure the semantic similarity between genes. Next, it preliminarily predicts noisy annotations of a gene based on aggregated votes from semantic neighborhood genes of that gene. NoGOA then estimates the ratio of noisy annotations for each evidence code based on direct annotations in GOA files archived over different periods, weights entries of the association matrix via the estimated ratios, and propagates weights to ancestors of direct annotations using the GO hierarchy. Finally, it integrates the evidence-weighted association matrix and aggregated votes to predict noisy annotations. Experiments on archived GOA files of six model species (H. sapiens, A. thaliana, S. cerevisiae, G. gallus, B. taurus and M. musculus) demonstrate that NoGOA achieves significantly better results than other related methods and that removing noisy annotations improves the performance of gene function prediction. The comparative study justifies the effectiveness of integrating evidence codes with sparse representation for predicting noisy GO annotations. Codes and datasets are available at http://mlda.swu.edu.cn/codes.php?name=NoGOA .
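
    As an editorial aside, the neighborhood-voting step can be sketched compactly. The following is a toy illustration, with made-up coefficient vectors and annotations (not the NoGOA code), of scoring how weakly a gene's annotation is supported by its most similar genes:

        import numpy as np

        def cosine(u, v):
            return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

        # Hypothetical sparse-representation coefficients per gene and a
        # binary annotation set per gene.
        coeffs = {"g1": np.array([0.9, 0.0, 0.1]),
                  "g2": np.array([0.8, 0.1, 0.1]),
                  "g3": np.array([0.0, 1.0, 0.0])}
        annot = {"g1": {"GO:1", "GO:2"}, "g2": {"GO:1"}, "g3": {"GO:3"}}

        def noise_score(gene, term, k=2):
            """1 - (similarity-weighted neighbor support); higher = more suspect."""
            sims = sorted(((cosine(coeffs[gene], coeffs[g]), g)
                           for g in coeffs if g != gene), reverse=True)[:k]
            support = sum(s for s, g in sims if term in annot[g])
            total = sum(s for s, _ in sims)
            return 1.0 - support / total if total else 1.0

        print(noise_score("g1", "GO:2"))  # unsupported by g1's neighbors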

  11. 76 FR 53853 - Approval and Promulgation of Implementation Plans and Designation of Areas for Air Quality...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-30

    ... nitrogen oxides (NO X ) and VOCs in the presence of sunlight and high ambient temperatures. NO X and VOCs... Quality, the City of Baton Rouge, and the Chamber of Greater Baton Rouge all formally requested a waiver... area. EPA denied these requests. The City and the Chamber filed a Petition for Review in the U.S. Court...

  12. Technical quality of root canal treatment of posterior teeth after rotary or hand preparation by fifth year undergraduate students, The University of Jordan.

    PubMed

    Abu-Tahun, Ibrahim; Al-Rabab'ah, Mohammad A; Hammad, Mohammad; Khraisat, Ameen

    2014-12-01

    The aim of this study was to investigate the technical quality of root canal treatment provided by undergraduate students in their first experience of molar endodontics, using nickel-titanium (NiTi) files in a crown-down approach compared with a standard stainless steel technique. This study was carried out by the fifth year undergraduate students attending peer review sessions as a part of their training programme, using two different questionnaires to assess the overall technical quality and potential problems regarding endodontic complications after root canal preparation with these two techniques. The overall results indicated a statistically significant difference in the performance of the two instrumentation techniques in difficult cases, with better performance and shorter mean preparation time for the NiTi rotary system (P < 0.001). Under the conditions of this study, novice dental students using NiTi ProTaper rotary files were able to prepare root canals faster and with greater accuracy than canals of the same teeth prepared with hand instruments. © 2014 Australian Society of Endodontology.

  13. TH-AB-201-12: Using Machine Log-Files for Treatment Planning and Delivery QA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stanhope, C; Liang, J; Drake, D

    2016-06-15

    Purpose: To determine the segment reduction and dose resolution necessary for machine log-files to effectively replace current phantom-based patient-specific quality assurance, while minimizing computational cost. Methods: Elekta’s Log File Convertor R3.2 records linac delivery parameters (dose rate, gantry angle, leaf position) every 40ms. Five VMAT plans [4 H&N, 1 Pulsed Brain] comprising 2 arcs each were delivered on the ArcCHECK phantom. Log-files were reconstructed in Pinnacle on the phantom geometry using 1/2/3/4° control point spacing and 2/3/4mm dose grid resolution. Reconstruction effectiveness was quantified by comparing 2%/2mm gamma passing rates of the original and log-file plans. Modulation complexity scores (MCS) were calculated for each beam to correlate reconstruction accuracy and beam modulation. Percent error in absolute dose for each plan-pair combination (log-file vs. ArcCHECK, original vs. ArcCHECK, log-file vs. original) was calculated for each arc and every diode greater than 10% of the maximum measured dose (per beam). Comparing standard deviations of the three plan-pair distributions, relative noise of the ArcCHECK and log-file systems was elucidated. Results: The original plans exhibit a mean passing rate of 95.1±1.3%. The eight more modulated H&N arcs [MCS=0.088±0.014] and two less modulated brain arcs [MCS=0.291±0.004] yielded log-file pass rates most similar to the original plan when using 1°/2mm [0.05%±1.3% lower] and 2°/3mm [0.35±0.64% higher] log-file reconstructions, respectively. Log-file and original plans displayed percent diode dose errors 4.29±6.27% and 3.61±6.57% higher than measurement. Excluding the phantom eliminates diode miscalibration and setup errors; log-file dose errors were 0.72±3.06% higher than the original plans, a significantly less noisy result. Conclusion: For log-file reconstructed VMAT arcs, 1° control point spacing and 2mm dose resolution are recommended; however, less modulated arcs may allow less stringent reconstructions. Following the aforementioned reconstruction recommendations, the log-file technique is capable of detecting delivery errors with equivalent accuracy and less noise than ArcCHECK QA. I am funded by an Elekta Research Grant.
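
    As an editorial aside, the per-diode dose comparison described above is straightforward to express in code. A minimal sketch with hypothetical diode doses, applying the stated threshold of 10% of the maximum measured dose:

        import numpy as np

        measured = np.array([10.2, 55.0, 80.1, 120.4, 3.0])  # cGy, hypothetical
        logfile = np.array([10.5, 54.1, 81.0, 118.9, 2.7])

        mask = measured > 0.10 * measured.max()  # diodes above 10% of max
        pct_err = 100.0 * (logfile[mask] - measured[mask]) / measured[mask]
        print(f"mean {pct_err.mean():+.2f}%, sd {pct_err.std(ddof=1):.2f}%")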

  14. A model for a PC-based, universal-format, multimedia digitization system: moving beyond the scanner.

    PubMed

    McEachen, James C; Cusack, Thomas J; McEachen, John C

    2003-08-01

    Digitizing images for use in case presentations based on hardcopy films, slides, photographs, negatives, books, and videos can present a challenging task. Scanners and digital cameras have become standard tools of the trade. Unfortunately, use of these devices to digitize multiple images in many different media formats can be a time-consuming and in some cases unachievable process. The authors' goal was to create a PC-based solution for digitizing multiple media formats in a timely fashion while maintaining adequate image presentation quality. The authors' PC-based solution makes use of off-the-shelf hardware, including a digital document camera (DDC), a VHS video player, and a video-editing kit. With the assistance of five staff radiologists, the authors examined the quality of multiple image types digitized with this equipment. The authors also quantified the speed of digitization of various types of media using the DDC and video-editing kit. With regard to image quality, the five staff radiologists rated the digitized angiography, CT, and MR images as adequate to excellent for use in teaching files and case presentations. With regard to digitized plain films, the average rating was adequate. As for performance, the authors realized a 68% improvement in the time required to digitize hardcopy films using the DDC instead of a professional quality scanner. The PC-based solution provides a means for digitizing multiple images from many different types of media in a timely fashion while maintaining adequate image presentation quality.

  15. Cooperative storage of shared files in a parallel computing system with dynamic block size

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-11-10

    Improved techniques are provided for parallel writing of data to a shared object in a parallel computing system. A method is provided for storing data generated by a plurality of parallel processes to a shared object in a parallel computing system. The method is performed by at least one of the processes and comprises: dynamically determining a block size for storing the data; exchanging a determined amount of the data with at least one additional process to achieve a block of the data having the dynamically determined block size; and writing the block of the data having the dynamically determined block size to a file system. The determined block size comprises, e.g., a total amount of the data to be stored divided by the number of parallel processes. The file system comprises, for example, a log structured virtual parallel file system, such as a Parallel Log-Structured File System (PLFS).
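
    As an editorial aside, the block-size rule quoted in the abstract (total data divided by the number of processes) is simple to state in code. A minimal sketch, not the patented implementation:

        def block_size(total_bytes: int, n_procs: int) -> int:
            """Dynamically determined block size: total data to be stored
            divided by the number of parallel processes, rounded up."""
            return -(-total_bytes // n_procs)  # ceiling division

        def bytes_to_exchange(my_bytes: int, target: int) -> int:
            """Positive: receive from peers; negative: ship to peers."""
            return target - my_bytes

        # Hypothetical run: 10 MiB produced collectively by 6 processes.
        target = block_size(10 * 1024 * 1024, 6)
        print(target, bytes_to_exchange(1_500_000, target))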

  16. Saving-enhanced memory: the benefits of saving on the learning and remembering of new information.

    PubMed

    Storm, Benjamin C; Stone, Sean M

    2015-02-01

    With the continued integration of technology into people's lives, saving digital information has become an everyday facet of human behavior. In the present research, we examined the consequences of saving certain information on the ability to learn and remember other information. Results from three experiments showed that saving one file before studying a new file significantly improved memory for the contents of the new file. Notably, this effect was not observed when the saving process was deemed unreliable or when the contents of the to-be-saved file were not substantial enough to interfere with memory for the new file. These results suggest that saving provides a means to strategically off-load memory onto the environment in order to reduce the extent to which currently unneeded to-be-remembered information interferes with the learning and remembering of other information. © The Author(s) 2014.

  17. 78 FR 26777 - Technological Advisory Council Recommendation for Improving Receiver Performance

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-08

    ... paper makes note that an interference limits policy approach may not be appropriate in all cases. Are.... SUMMARY: The FCC's Technological Advisory Council (TAC) has been tasked to study the role of receivers in.../ecfs2/. Paper Filers: Parties who choose to file by paper must file an original and one copy of...

  18. 78 FR 15392 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing of Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-11

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-69039; File No. SR-NASDAQ-2013-031] Self...'' Orders Submitted to the Retail Price Improvement Program Will Qualify as ``Retail Orders'' March 5, 2013... described in Items I, II, and III below, which Items have been prepared by the self-regulatory organization...

  19. 77 FR 42048 - Self-Regulatory Organizations; NASDAQ OMX PHLX LLC; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-17

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-67399; File No. SR-Phlx-2012-94] Self... Change Relating to the Extension of a Pilot Program Regarding Price Improvement XL July 11, 2012...). \\2\\ 17 CFR 240.19b-4. I. Self-Regulatory Organization's Statement of the Terms of Substance of the...

  20. 76 FR 44642 - Self-Regulatory Organizations; International Securities Exchange, LLC; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-26

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-64931; File No. SR-ISE-2011-41] Self-Regulatory... Rule Change Relating to the Extension of the Price Improvement Mechanism Pilot Program July 20, 2011...). \\4\\ 17 CFR 240.19b-4(f)(6). I. Self-Regulatory Organization's Statement of the Terms of Substance of...

  1. 75 FR 43221 - Self-Regulatory Organizations; International Securities Exchange, LLC; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-23

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-62513; File No. SR-ISE-2010-75] Self-Regulatory... Rule Change Relating to the Extension of the Price Improvement Mechanism Pilot Program July 16, 2010...)(3)(A). \\4\\ 17 CFR 240.19b-4(f)(6). I. Self-Regulatory Organization's Statement of the Terms of...

  2. 77 FR 36589 - Self-Regulatory Organizations; International Securities Exchange, LLC; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-19

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-67202; File No. SR-ISE-2012-54] Self-Regulatory... Rule Change Relating to the Extension of the Price Improvement Mechanism Pilot Program June 14, 2012... described in Items I and II below, which items have been prepared by the self-regulatory organization. The...

  3. 78 FR 39390 - Self-Regulatory Organizations; International Securities Exchange, LLC; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-01

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-69853; File No. SR-ISE-2013-41] Self-Regulatory... Rule Change To Extend the Price Improvement Mechanism Pilot Program June 25, 2013. Pursuant to Section... described in Items I and II below, which items have been prepared by the self-regulatory organization. The...

  4. 78 FR 43950 - Self-Regulatory Organizations; NASDAQ OMX PHLX LLC; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-22

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-69989; File No. SR-Phlx-2013-74] Self... Change Relating to the Extension of a Pilot Program Regarding Price Improvement XL July 16, 2013...(b)(1). \\2\\ 17 CFR 240.19b-4. I. Self-Regulatory Organization's Statement of the Terms of Substance...

  5. 78 FR 6160 - Self-Regulatory Organizations; New York Stock Exchange LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-29

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-68709; File No. SR-NYSE-2013-04] Self... Improvement Orders in a Non-RLP Capacity for Securities to Which the RLP Is Not Assigned January 23, 2013... in Items I and II below, which Items have been prepared by the self-regulatory organization. The...

  6. 76 FR 77782 - Marine Mammals; File No. 781-1824

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-14

    ...; fax (301) 713-0376; and Northwest Region, NMFS, 7600 Sand Point Way NE., BIN C15700, Bldg. 1, Seattle... issuance of the permit would not significantly impact the quality of the human environment and that...

  7. Pedestrian injury causation parameters. Phase 2

    DOT National Transportation Integrated Search

    1981-10-01

    This report describes data collection, quality control and data analysis procedures for a five-team program to study pedestrian injury causation factors. The data file contains 1,997 pedestrian accidents collected during a two and one-half year perio...

  8. 15 CFR 995.4 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE QUALITY ASSURANCE AND CERTIFICATION... of the 1974 SOLAS Convention. Electronic Navigational Chart (ENC) means a database, standardized as... National Oceanic and Atmospheric Administration. NOAA ENC files comply with the IHO S-57 standard, Edition...

  9. 15 CFR 995.4 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE QUALITY ASSURANCE AND CERTIFICATION... of the 1974 SOLAS Convention. Electronic Navigational Chart (ENC) means a database, standardized as... National Oceanic and Atmospheric Administration. NOAA ENC files comply with the IHO S-57 standard, Edition...

  10. 15 CFR 995.4 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE QUALITY ASSURANCE AND CERTIFICATION... of the 1974 SOLAS Convention. Electronic Navigational Chart (ENC) means a database, standardized as... National Oceanic and Atmospheric Administration. NOAA ENC files comply with the IHO S-57 standard, Edition...

  11. 15 CFR 995.4 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE QUALITY ASSURANCE AND CERTIFICATION... of the 1974 SOLAS Convention. Electronic Navigational Chart (ENC) means a database, standardized as... National Oceanic and Atmospheric Administration. NOAA ENC files comply with the IHO S-57 standard, Edition...

  12. Statistical evaluation of PACSTAT random number generation capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piepel, G.F.; Toland, M.R.; Harty, H.

    1988-05-01

    This report summarizes the work performed in verifying the general purpose Monte Carlo driver-program PACSTAT. The main objective of the work was to verify the performance of PACSTAT's random number generation capabilities. Secondary objectives were to document (using controlled configuration management procedures) changes made in PACSTAT at Pacific Northwest Laboratory, and to assure that PACSTAT input and output files satisfy quality assurance traceability constraints. Upon receipt of the PRIME version of the PACSTAT code from the Basalt Waste Isolation Project, Pacific Northwest Laboratory staff converted the code to run on Digital Equipment Corporation (DEC) VAXs. The modifications to PACSTAT were implemented using the WITNESS configuration management system, with the modifications themselves intended to make the code as portable as possible. Certain modifications were made to make the PACSTAT input and output files conform to quality assurance traceability constraints. 10 refs., 17 figs., 6 tabs.
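
    As an editorial aside, one standard check in verifying random number generation is a chi-square test for uniformity. A minimal sketch (not the PACSTAT verification suite):

        import random

        def chi_square_uniform(samples, bins=10):
            """Chi-square statistic for uniformity of samples in [0, 1)."""
            counts = [0] * bins
            for x in samples:
                counts[min(int(x * bins), bins - 1)] += 1
            expected = len(samples) / bins
            return sum((c - expected) ** 2 / expected for c in counts)

        random.seed(42)
        stat = chi_square_uniform([random.random() for _ in range(10000)])
        # Compare with the 5% critical value for 9 degrees of freedom
        # (16.92); larger values suggest non-uniformity.
        print(f"chi-square = {stat:.2f}")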

  13. PF-WFS Shell Inspection Update December 2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vigil, Anthony Eugene; Ledoux, Reina Rebecca; Gonzales, Antonio R.

    Since the last project update in FY16:Q2, PF-WFS personnel have advanced their understanding of shell inspection on Coordinate Measuring Machines (CMMs) and refined the PF-WFS process to the point that it was decided to convert shell inspection from the Sheffield #1 gage to Lietz CMMs. As part of introspection on the quality of this process, many sets of data have been reviewed and analyzed. This analysis included Sheffield-to-CMM comparisons, CMM inspection repeatability, fixturing differences, quality check development, and probing approach changes. This update report touches on the improvements that have built the confidence in this process to mainstream it for inspecting shells. In addition to the CMM programming advancements, the continued refinement of inputs and outputs for the CMM program has created an archiving scheme, input spline files, an output metafile, and an inspection report package. This project will continue to mature. Part designs may require program modifications to accommodate "new to this process" part designs. Technology limitations tied to security and performance are requiring possible changes to computer configurations to support an automated process.

  14. VizieR Online Data Catalog: WISE W1/W2 Tully-Fisher relation calibrator data (Neill+, 2014)

    NASA Astrophysics Data System (ADS)

    Neill, J. D.; Seibert, M.; Tully, R. B.; Courtois, H.; Sorce, J. G.; Jarrett, T. H.; Scowcroft, V.; Masci, F. J.

    2017-04-01

    We have instigated a separate project to provide high-quality surface photometry of all WISE galaxies larger than 0.8' on the sky. The WISE Nearby Galaxy Atlas (WNGA; M. Seibert et al., in preparation) will provide photometry that is quality controlled for over 20000 galaxies. This photometry, optimized for extended sources, significantly reduces the resulting scatter in the Tully-Fisher relation (hereafter TFR) calibration and thus improves the resulting distances. Having an accurate calibration of the TFR for these two WISE passbands will allow the use of this large sample to explore the structure and dynamics of local galaxy bulk flows. With the current tally, there are 310 cluster calibrators with WISE W1 and W2 photometry, compared with 213 available to Sorce et al. (2013, J/ApJ/765/94) for the Spitzer calibration, and 291 of the 310 WISE calibrators have I-band photometry, compared with the 267 available to Tully & Courtois (2012ApJ...749...78T) for the previous I-band calibration. (1 data file).

  15. DOVIS 2.0: an efficient and easy to use parallel virtual screening tool based on AutoDock 4.0.

    PubMed

    Jiang, Xiaohui; Kumar, Kamal; Hu, Xin; Wallqvist, Anders; Reifman, Jaques

    2008-09-08

    Small-molecule docking is an important tool in studying receptor-ligand interactions and in identifying potential drug candidates. Previously, we developed a software tool (DOVIS) to perform large-scale virtual screening of small molecules in parallel on Linux clusters, using AutoDock 3.05 as the docking engine. DOVIS enables the seamless screening of millions of compounds on high-performance computing platforms. In this paper, we report significant advances in the software implementation of DOVIS 2.0, including enhanced screening capability, improved file system efficiency, and extended usability. To keep DOVIS up-to-date, we upgraded the software's docking engine to the more accurate AutoDock 4.0 code. We developed a new parallelization scheme to improve runtime efficiency and modified the AutoDock code to reduce excessive file operations during large-scale virtual screening jobs. We also implemented an algorithm to output docked ligands in an industry standard format, sd-file format, which can be easily interfaced with other modeling programs. Finally, we constructed a wrapper-script interface to enable automatic rescoring of docked ligands by arbitrarily selected third-party scoring programs. The significance of the new DOVIS 2.0 software compared with the previous version lies in its improved performance and usability. The new version makes the computation highly efficient by automating load balancing, significantly reducing excessive file operations by more than 95%, providing outputs that conform to industry standard sd-file format, and providing a general wrapper-script interface for rescoring of docked ligands. The new DOVIS 2.0 package is freely available to the public under the GNU General Public License.

  16. Apically extruded dentin debris by reciprocating single-file and multi-file rotary system.

    PubMed

    De-Deus, Gustavo; Neves, Aline; Silva, Emmanuel João; Mendonça, Thais Accorsi; Lourenço, Caroline; Calixto, Camila; Lima, Edson Jorge Moreira

    2015-03-01

    This study aims to evaluate the apical extrusion of debris by two reciprocating single-file systems: WaveOne and Reciproc. A conventional multi-file rotary system was used as a reference for comparison. The hypotheses tested were (i) the reciprocating single-file systems extrude more debris than the conventional multi-file rotary system and (ii) the reciprocating single-file systems extrude similar amounts of dentin debris. After solid selection criteria, 80 mesial roots of lower molars were included in the present study. The use of four different instrumentation techniques resulted in four groups (n = 20): G1 (hand-file technique), G2 (ProTaper), G3 (WaveOne), and G4 (Reciproc). The apparatus used to evaluate the collection of apically extruded debris was a typical double-chamber collector. Statistical analysis was performed for multiple comparisons. No significant difference was found in the amount of debris extruded between the two reciprocating systems. In contrast, the conventional multi-file rotary system group extruded significantly more debris than both reciprocating groups. The hand instrumentation group extruded significantly more debris than all other groups. The present results yielded favorable input for both reciprocating single-file systems, inasmuch as they showed improved control of apically extruded debris. Apical extrusion of debris has been studied extensively because of its clinical relevance, particularly since it may cause flare-ups caused by the introduction of bacteria, pulpal tissue, and irrigating solutions into the periapical tissues.

  17. Algal Attributes: An Autecological Classification of Algal Taxa Collected by the National Water-Quality Assessment Program

    USGS Publications Warehouse

    Porter, Stephen D.

    2008-01-01

    Algae are excellent indicators of water-quality conditions, notably nutrient and organic enrichment, and also are indicators of major ion, dissolved oxygen, and pH concentrations and stream microhabitat conditions. The autecology, or physiological optima and tolerance, of algal species for various water-quality contaminants and conditions is relatively well understood for certain groups of freshwater algae, notably diatoms. However, applications of autecological information for water-quality assessments have been limited because of challenges associated with compiling autecological literature from disparate sources, tracking name changes for a large number of algal species, and creating an autecological data base from which algal-indicator metrics can be calculated. A comprehensive summary of algal autecological attributes for North American streams and rivers does not exist. This report describes a large, digital data file containing 28,182 records for 5,939 algal taxa, generally species or variety, collected by the U.S. Geological Survey's National Water-Quality Assessment (NAWQA) Program. The data file includes 37 algal attributes classified by over 100 algal-indicator codes or metrics that can be calculated easily with readily available software. Algal attributes include qualitative classifications based on European and North American autecological literature, and semi-quantitative, weighted-average regression approaches for estimating optima using regional and national NAWQA data. Applications of algal metrics in water-quality assessments are discussed and national quartile distributions of metric scores are shown for selected indicator metrics.
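
    As an editorial aside, the weighted-average approach mentioned for estimating optima is compact enough to sketch. A toy illustration with made-up abundances (not the NAWQA computation):

        def weighted_average_optimum(abundances, env_values):
            """Abundance-weighted mean of an environmental variable across
            sites where a taxon occurs: the simple weighted-averaging
            estimate of the species optimum."""
            total = sum(abundances)
            return sum(a * v for a, v in zip(abundances, env_values)) / total

        # Hypothetical diatom relative abundances at five sites versus
        # total phosphorus (mg/L) at those sites.
        print(weighted_average_optimum([5, 20, 40, 25, 10],
                                       [0.01, 0.03, 0.05, 0.08, 0.20]))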

  18. SEQ-REVIEW: A tool for reviewing and checking spacecraft sequences

    NASA Astrophysics Data System (ADS)

    Maldague, Pierre F.; El-Boushi, Mekki; Starbird, Thomas J.; Zawacki, Steven J.

    1994-11-01

    A key component of JPL's strategy to make space missions faster, better and cheaper is the Advanced Multi-Mission Operations System (AMMOS), a ground software intensive system currently in use and in further development. AMMOS intends to eliminate the cost of re-engineering a ground system for each new JPL mission. This paper discusses SEQ-REVIEW, a component of AMMOS that was designed to facilitate and automate the task of reviewing and checking spacecraft sequences. SEQ-REVIEW is a smart browser for inspecting files created by other sequence generation tools in the AMMOS system. It can parse sequence-related files according to a computer-readable version of a 'Software Interface Specification' (SIS), which is a standard document for defining file formats. It lets users display one or several linked files and check simple constraints using a Basic-like 'Little Language'. SEQ-REVIEW represents the first application of the Quality Function Deployment (QFD) method to sequence software development at JPL. The paper will show how the requirements for SEQ-REVIEW were defined and converted into a design based on object-oriented principles. The process starts with interviews of potential users, a small but diverse group that spans multiple disciplines and 'cultures'. It continues with the development of QFD matrices that relate product functions and characteristics to user-demanded qualities. These matrices are then turned into a formal Software Requirements Document (SRD). The process concludes with the design phase, in which the CRC (Class, Responsibility, Collaboration) approach was used to convert requirements into a blueprint for the final product.

  19. SEQ-REVIEW: A tool for reviewing and checking spacecraft sequences

    NASA Technical Reports Server (NTRS)

    Maldague, Pierre F.; El-Boushi, Mekki; Starbird, Thomas J.; Zawacki, Steven J.

    1994-01-01

    A key component of JPL's strategy to make space missions faster, better and cheaper is the Advanced Multi-Mission Operations System (AMMOS), a ground software intensive system currently in use and in further development. AMMOS intends to eliminate the cost of re-engineering a ground system for each new JPL mission. This paper discusses SEQ-REVIEW, a component of AMMOS that was designed to facilitate and automate the task of reviewing and checking spacecraft sequences. SEQ-REVIEW is a smart browser for inspecting files created by other sequence generation tools in the AMMOS system. It can parse sequence-related files according to a computer-readable version of a 'Software Interface Specification' (SIS), which is a standard document for defining file formats. It lets users display one or several linked files and check simple constraints using a Basic-like 'Little Language'. SEQ-REVIEW represents the first application of the Quality Function Deployment (QFD) method to sequence software development at JPL. The paper will show how the requirements for SEQ-REVIEW were defined and converted into a design based on object-oriented principles. The process starts with interviews of potential users, a small but diverse group that spans multiple disciplines and 'cultures'. It continues with the development of QFD matrices that relate product functions and characteristics to user-demanded qualities. These matrices are then turned into a formal Software Requirements Document (SRD). The process concludes with the design phase, in which the CRC (Class, Responsibility, Collaboration) approach was used to convert requirements into a blueprint for the final product.

  20. A mass spectrometry proteomics data management platform.

    PubMed

    Sharma, Vagisha; Eng, Jimmy K; Maccoss, Michael J; Riffle, Michael

    2012-09-01

    Mass spectrometry-based proteomics is increasingly being used in biomedical research. These experiments typically generate a large volume of highly complex data, and the volume and complexity are only increasing with time. There exist many software pipelines for analyzing these data (each typically with its own file formats), and as technology improves, these file formats change and new formats are developed. Files produced from these myriad software programs may accumulate on hard disks or tape drives over time, with older files being rendered progressively more obsolete and unusable with each successive technical advancement and data format change. Although initiatives exist to standardize the file formats used in proteomics, they do not address the core failings of a file-based data management system: (1) files are typically poorly annotated experimentally, (2) files are "organically" distributed across laboratory file systems in an ad hoc manner, (3) files formats become obsolete, and (4) searching the data and comparing and contrasting results across separate experiments is very inefficient (if possible at all). Here we present a relational database architecture and accompanying web application dubbed Mass Spectrometry Data Platform that is designed to address the failings of the file-based mass spectrometry data management approach. The database is designed such that the output of disparate software pipelines may be imported into a core set of unified tables, with these core tables being extended to support data generated by specific pipelines. Because the data are unified, they may be queried, viewed, and compared across multiple experiments using a common web interface. Mass Spectrometry Data Platform is open source and freely available at http://code.google.com/p/msdapl/.
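
    As an editorial aside, the "core set of unified tables" idea can be sketched with an in-memory database. A minimal illustration using SQLite, with a hypothetical column set (the actual platform schema is richer):

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
        CREATE TABLE experiment (
            id INTEGER PRIMARY KEY,
            description TEXT,   -- experimental annotation
            performed_on TEXT
        );
        -- Core table shared by all pipelines; pipeline-specific scores
        -- would live in extension tables keyed to this id.
        CREATE TABLE search_result (
            id INTEGER PRIMARY KEY,
            experiment_id INTEGER REFERENCES experiment(id),
            pipeline TEXT,      -- e.g. 'SEQUEST', 'Mascot'
            peptide TEXT,
            score REAL
        );
        """)
        # One query now spans results imported from disparate pipelines.
        rows = conn.execute("SELECT pipeline, COUNT(*) FROM search_result "
                            "GROUP BY pipeline").fetchall()
        print(rows)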

  1. Accreditation status and geographic location of outpatient vascular testing facilities among Medicare beneficiaries: the VALUE (Vascular Accreditation, Location & Utilization Evaluation) study.

    PubMed

    Rundek, Tatjana; Brown, Scott C; Wang, Kefeng; Dong, Chuanhui; Farrell, Mary Beth; Heller, Gary V; Gornik, Heather L; Hutchisson, Marge; Needleman, Laurence; Benenati, James F; Jaff, Michael R; Meier, George H; Perese, Susana; Bendick, Phillip; Hamburg, Naomi M; Lohr, Joann M; LaPerna, Lucy; Leers, Steven A; Lilly, Michael P; Tegeler, Charles; Alexandrov, Andrei V; Katanick, Sandra L

    2014-10-01

    There is limited information on the accreditation status and geographic distribution of vascular testing facilities in the US. The Centers for Medicare & Medicaid Services (CMS) provide reimbursement to facilities regardless of accreditation status. The aims were to: (1) identify the proportion of Intersocietal Accreditation Commission (IAC) accredited vascular testing facilities in a 5% random national sample of Medicare beneficiaries receiving outpatient vascular testing services; (2) describe the geographic distribution of these facilities. The VALUE (Vascular Accreditation, Location & Utilization Evaluation) Study examines the proportion of IAC accredited facilities providing vascular testing procedures nationally, and the geographic distribution and utilization of these facilities. The data set containing all facilities that billed Medicare for outpatient vascular testing services in 2011 (5% CMS Outpatient Limited Data Set (LDS) file) was examined, and locations of outpatient vascular testing facilities were obtained from the 2011 CMS/Medicare Provider of Services (POS) file. Of 13,462 total vascular testing facilities billing Medicare for vascular testing procedures in a 5% random Outpatient LDS for the US in 2011, 13% (n=1730) of facilities were IAC accredited. The percentage of IAC accredited vascular testing facilities in the LDS file varied significantly by US region, p<0.0001: 26%, 12%, 11%, and 7% for the Northeast, South, Midwest, and Western regions, respectively. Findings suggest that the proportion of outpatient vascular testing facilities that are IAC accredited is low and varies by region. Increasing the number of accredited vascular testing facilities to improve test quality is a hypothesis that should be tested in future research. © The Author(s) 2014.

  2. Patient Satisfaction, Treatment Experience, and Disability Outcomes in a Population-Based Cohort of Injured Workers in Washington State: Implications for Quality Improvement

    PubMed Central

    Wickizer, Thomas M; Franklin, Gary; Fulton-Kehoe, Deborah; Turner, Judith A; Mootz, Robert; Smith-Weller, Terri

    2004-01-01

    Objective To determine what aspects of patient satisfaction are most important in explaining the variance in patients' overall treatment experience and to evaluate the relationship between treatment experience and subsequent outcomes. Data Sources and Setting Data from a population-based survey of 804 randomly selected injured workers in Washington State filing a workers' compensation claim between November 1999 and February 2000 were combined with insurance claims data indicating whether survey respondents were receiving disability compensation payments for being out of work at 6 or 12 months after claim filing. Study Design We conducted a two-step analysis. In the first step, we tested a multiple linear regression model to assess the relationship of satisfaction measures to patients' overall treatment experience. In the second step, we used logistic regression to assess the relationship of treatment experience to subsequent outcomes. Principal Findings Among injured workers who had ongoing follow-up care after their initial treatment (n=681), satisfaction with interpersonal and technical aspects of care and with care coordination was strongly and positively associated with overall treatment experience (p<0.001). As a group, the satisfaction measures explained 38 percent of the variance in treatment experience after controlling for demographics, satisfaction with medical care prior to injury, job satisfaction, type of injury, and provider type. Injured workers who reported less-favorable treatment experience were 3.54 times as likely (95 percent confidence interval, 1.20–10.95, p=.021) to be receiving time-loss compensation for inability to work due to injury 6 or 12 months after filing a claim, compared to patients whose treatment experience was more positive. PMID:15230925

  3. Value-based medicine, comparative effectiveness, and cost-effectiveness analysis of topical cyclosporine for the treatment of dry eye syndrome.

    PubMed

    Brown, Melissa M; Brown, Gary C; Brown, Heidi C; Peet, Jonathan; Roth, Zachary

    2009-02-01

    To assess the comparative effectiveness and cost-effectiveness (cost-utility) of a 0.05% emulsion of topical cyclosporine (Restasis; Allergan Inc, Irvine, California) for the treatment of moderate to severe dry eye syndrome that is unresponsive to conventional therapy. Data from 2 multicenter, randomized, clinical trials and Food and Drug Administration files for topical cyclosporine, 0.05%, emulsion were used in Center for Value-Based Medicine analyses. Analyses included value-based medicine as a comparative effectiveness analysis and average cost-utility analysis using societal and third-party insurer cost perspectives. Outcome measures of comparative effectiveness were quality-adjusted life-year (QALY) gain and percentage of improvement in quality of life, and for cost-effectiveness were cost-utility ratio (CUR) using dollars per QALY. Topical cyclosporine, 0.05%, confers a value gain (comparative effectiveness) of 0.0319 QALY per year compared with topical lubricant therapy, a 4.3% improvement in quality of life for the average patient with moderate to severe dry eye syndrome that is unresponsive to conventional lubricant therapy. The societal perspective incremental CUR for cyclosporine over vehicle therapy is $34,953 per QALY and the societal perspective average CUR is $11,199 per QALY. The third-party-insurer incremental CUR is $37,179 per QALY, while the third-party-insurer perspective average CUR is $34,343 per QALY. Topical cyclosporine emulsion, 0.05%, confers considerable patient value and is a cost-effective therapy for moderate to severe dry eye syndrome that is unresponsive to conventional therapy.

  4. Catching errors with patient-specific pretreatment machine log file analysis.

    PubMed

    Rangaraj, Dharanipathy; Zhu, Mingyao; Yang, Deshan; Palaniswaamy, Geethpriya; Yaddanapudi, Sridhar; Wooten, Omar H; Brame, Scott; Mutic, Sasa

    2013-01-01

    A robust, efficient, and reliable quality assurance (QA) process is highly desired for modern external beam radiation therapy treatments. Here, we report the results of a semiautomatic, pretreatment, patient-specific QA process based on dynamic machine log file analysis clinically implemented for intensity modulated radiation therapy (IMRT) treatments delivered by high energy linear accelerators (Varian 2100/2300 EX, Trilogy, iX-D, Varian Medical Systems Inc, Palo Alto, CA). The multileaf collimator (MLC) machine log files are called Dynalog by Varian. Using an in-house developed computer program called "Dynalog QA," we automatically compare the beam delivery parameters in the log files that are generated during pretreatment point dose verification measurements, with the treatment plan to determine any discrepancies in IMRT deliveries. Fluence maps are constructed and compared between the delivered and planned beams. Since clinical introduction in June 2009, 912 machine log file QA analyses were performed by the end of 2010. Among these, 14 errors causing dosimetric deviation were detected and required further investigation and intervention. These errors were the result of human operating mistakes, flawed treatment planning, and data modification during plan file transfer. Minor errors were also reported in 174 other log file analyses, some of which stemmed from false positives and unreliable results; the origins of these are discussed herein. It has been demonstrated that machine log file analysis is a robust, efficient, and reliable QA process capable of detecting errors originating from human mistakes, flawed planning, and data transfer problems. The possibility of detecting these errors is low using point and planar dosimetric measurements. Copyright © 2013 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
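
    As an editorial aside, the core of a log-file check is comparing planned and delivered parameters sample by sample. A toy sketch with hypothetical simplified records (the real Dynalog format carries many more fields):

        # Hypothetical records: (time_ms, leaf_id, planned_mm, actual_mm).
        records = [(0, 1, 12.0, 12.0), (40, 1, 12.5, 12.4), (80, 1, 13.0, 13.6)]

        TOLERANCE_MM = 0.5  # assumed action level, not Varian's
        for t, leaf, planned, actual in records:
            if abs(planned - actual) > TOLERANCE_MM:
                print(f"t={t} ms leaf {leaf}: deviation {planned - actual:+.1f} mm")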

  5. GNAQPMS v1.1: accelerating the Global Nested Air Quality Prediction Modeling System (GNAQPMS) on Intel Xeon Phi processors

    NASA Astrophysics Data System (ADS)

    Wang, Hui; Chen, Huansheng; Wu, Qizhong; Lin, Junmin; Chen, Xueshun; Xie, Xinwei; Wang, Rongrong; Tang, Xiao; Wang, Zifa

    2017-08-01

    The Global Nested Air Quality Prediction Modeling System (GNAQPMS) is the global version of the Nested Air Quality Prediction Modeling System (NAQPMS), which is a multi-scale chemical transport model used for air quality forecast and atmospheric environmental research. In this study, we present the porting and optimisation of GNAQPMS on a second-generation Intel Xeon Phi processor, codenamed Knights Landing (KNL). Compared with the first-generation Xeon Phi coprocessor (codenamed Knights Corner, KNC), KNL has many new hardware features such as a bootable processor, high-performance in-package memory and ISA compatibility with Intel Xeon processors. In particular, we describe the five optimisations we applied to the key modules of GNAQPMS, including the CBM-Z gas-phase chemistry, advection, convection and wet deposition modules. These optimisations work well on both the KNL 7250 processor and the Intel Xeon E5-2697 V4 processor. They include (1) updating the pure Message Passing Interface (MPI) parallel mode to the hybrid parallel mode with MPI and OpenMP in the emission, advection, convection and gas-phase chemistry modules; (2) fully employing the 512 bit wide vector processing units (VPUs) on the KNL platform; (3) reducing unnecessary memory access to improve cache efficiency; (4) reducing the thread local storage (TLS) in the CBM-Z gas-phase chemistry module to improve its OpenMP performance; and (5) changing the global communication from writing/reading interface files to MPI functions to improve the performance and the parallel scalability. These optimisations greatly improved the GNAQPMS performance. The same optimisations also work well for the Intel Xeon Broadwell processor, specifically E5-2697 v4. Compared with the baseline version of GNAQPMS, the optimised version was 3.51 × faster on KNL and 2.77 × faster on the CPU. Moreover, the optimised version ran at 26 % lower average power on KNL than on the CPU. With the combined performance and energy improvement, the KNL platform was 37.5 % more efficient on power consumption compared with the CPU platform. The optimisations also enabled much further parallel scalability on both the CPU cluster and the KNL cluster scaled to 40 CPU nodes and 30 KNL nodes, with a parallel efficiency of 70.4 and 42.2 %, respectively.

  6. Understanding Customer Dissatisfaction with Underutilized Distributed File Servers

    NASA Technical Reports Server (NTRS)

    Riedel, Erik; Gibson, Garth

    1996-01-01

    An important trend in the design of storage subsystems is a move toward direct network attachment. Network-attached storage offers the opportunity to off-load distributed file system functionality from dedicated file server machines and execute many requests directly at the storage devices. For this strategy to lead to better performance, as perceived by users, the response time of distributed operations must improve. In this paper we analyze measurements of an Andrew file system (AFS) server that we recently upgraded in an effort to improve client performance in our laboratory. While the original server's overall utilization was only about 3%, we show how burst loads were sufficiently intense to lead to periods of poor response time significant enough to trigger customer dissatisfaction. In particular, we show how, after adjusting for network load and traffic to non-project servers, 50% of the variation in client response time was explained by variation in server central processing unit (CPU) use. That is, clients saw long response times in large part because the server was often over-utilized when it was used at all. Using these measures, we see that off-loading file server work in a network-attached storage architecture has the potential to benefit user response time. Computational power in such a system scales directly with storage capacity, so the slowdown during burst periods should be reduced.
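
    As an editorial aside, the variance-explained analysis amounts to regressing response time on CPU use and reporting R-squared. A toy sketch with made-up measurements (statistics.correlation requires Python 3.10 or later):

        import statistics

        # Hypothetical paired samples: server CPU use (%) versus client
        # response time (ms) over matched intervals.
        cpu = [2, 5, 20, 45, 70, 90, 10, 30]
        resp = [12, 15, 40, 95, 160, 230, 25, 60]

        r = statistics.correlation(cpu, resp)  # Pearson r
        print(f"R^2 = {r * r:.2f}")  # share of response-time variance explained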

  7. An effective XML based name mapping mechanism within StoRM

    NASA Astrophysics Data System (ADS)

    Corso, E.; Forti, A.; Ghiselli, A.; Magnoni, L.; Zappi, R.

    2008-07-01

    In a Grid environment the naming capability allows users to refer to specific data resources in a physical storage system using a high level logical identifier. This logical identifier is typically organized in a file system like structure, a hierarchical tree of names. Storage Resource Manager (SRM) services map the logical identifier to the physical location of data by evaluating a set of parameters such as the desired quality of service and the VOMS attributes specified in the requests. StoRM is a SRM service developed by INFN and ICTP-EGRID to manage files and space on standard POSIX and high performing parallel and cluster file systems. An upcoming requirement in the Grid data scenario is the orthogonality of the logical name and the physical location of data, in order to refer, with the same identifier, to different copies of data archived in various storage areas with different qualities of service. The mapping mechanism proposed in StoRM is based on an XML document that represents the different storage components managed by the service, the storage areas defined by the site administrator, the quality of service they provide and the Virtual Organizations that want to use the storage areas. An appropriate directory tree is realized in each storage component reflecting the XML schema. In this scenario StoRM is able to identify the physical location of requested data by evaluating the logical identifier and the specified attributes following the XML schema, without querying any database service. This paper presents the namespace schema defined, the different entities represented and the technical details of the StoRM implementation.
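
    As an editorial aside, resolving a logical name through an XML namespace document can be sketched briefly. A toy illustration with a hypothetical schema (not the actual StoRM namespace format):

        import xml.etree.ElementTree as ET

        DOC = """
        <namespace>
          <storage-area vo="atlas" quality="replicated" root="/gpfs/atlas"/>
          <storage-area vo="cms" quality="tape" root="/gpfs/cms"/>
        </namespace>
        """

        def resolve(logical_name, vo, quality):
            """Map a logical identifier to a physical path, no database needed."""
            for sa in ET.fromstring(DOC).iter("storage-area"):
                if sa.get("vo") == vo and sa.get("quality") == quality:
                    return sa.get("root") + logical_name
            raise LookupError("no storage area matches the requested QoS")

        print(resolve("/mc/run42/file001", vo="atlas", quality="replicated"))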

  8. Data management system for USGS/USEPA urban hydrology studies program

    USGS Publications Warehouse

    Doyle, W.H.; Lorens, J.A.

    1982-01-01

    A data management system was developed to store, update, and retrieve data collected in urban stormwater studies jointly conducted by the U.S. Geological Survey and U.S. Environmental Protection Agency in 11 cities in the United States. The data management system is used to retrieve and combine data from USGS data files for use in rainfall, runoff, and water-quality models and for data computations such as storm loads. The system is based on the data management aspect of the Statistical Analysis System (SAS) and was used to create all the data files in the data base. SAS is used for storage and retrieval of basin physiography, land-use, and environmental practices inventory data. Also, storm-event water-quality characteristics are stored in the data base. The advantages of using SAS to create and manage a data base are many: it is simple and easy to use, contains a comprehensive statistical package, and can be used to modify files very easily. Data base system development has progressed rapidly during the last two decades, and the data management system concepts used in this study reflect the advancements made in computer technology during this era. Urban stormwater data management is, however, just one application for which the system can be used. (USGS)

  9. Aquatic toxicity information retrieval data base: A technical support document. (Revised July 1992)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The AQUIRE (AQUatic toxicity Information REtrieval) database was established in 1981 by the United States Environmental Protection Agency (US EPA), Environmental Research Laboratory-Duluth (ERL-D). The purpose of AQUIRE is to provide quick access to a comprehensive, systematic, computerized compilation of aquatic toxic effects data. As of July 1992, AQUIRE consists of over 98,300 individual test results on computer file. These tests contain information for 5,500 chemicals and 2,300 organisms, extracted from over 6,300 publications. In addition, the ERL-D data file, prepared by the University of Wisconsin-Superior, is now included in AQUIRE. The data file consists of acute toxicity test results for the effects of 525 organic chemicals on fathead minnows. All AQUIRE data entries have been subjected to established quality assurance procedures.

  10. Enabling Incremental Iterative Development at Scale: Quality Attribute Refinement and Allocation in Practice

    DTIC Science & Technology

    2015-06-01

    abstract constraints along six dimensions for expansion: user, actions, data, business rules, interfaces, and quality attributes [Gottesdiener 2010...relevant open source systems. For example, the CONNECT and HADOOP Distributed File System (HDFS) projects have many user stories that deal with...Iteration Zero involves architecture planning before writing any code. An overly long Iteration Zero is equivalent to the dysfunctional “Big Up-Front

  11. Effects of State Minimum Staffing Standards on Nursing Home Staffing and Quality of Care

    PubMed Central

    Park, Jeongyoung; Stearns, Sally C

    2009-01-01

    Objective To investigate the impact of state minimum staffing standards on the level of staffing and quality of nursing home care. Data Sources Online Survey and Certification Reporting System (OSCAR) merged with the Area Resource File from 1998 through 2001. Study Design Between 1998 and 2001, 16 states implemented or expanded staffing standards in excess of federal requirements, creating a natural experiment in comparison with facilities in states without new standards. Difference-in-differences models using facility fixed effects were estimated to determine the effect of state standards. Data Collection/Extraction Methods OSCAR data were linked to the data on market conditions and state policies. A total of 55,248 facility-year observations from 15,217 freestanding facilities were analyzed. Principal Findings Increased standards resulted in small staffing increases for facilities with staffing initially below or close to new standards. Yet the standards were associated with reductions in restraint use and the number of total deficiencies at all types of facilities. Conclusions Mandated staffing standards affect only low-staff facilities facing potential for penalties, and effects are small. Selected facility-level outcomes may show improvement at all facilities due to a general response to increased standards or to other quality initiatives implemented at the same time as staffing standards. PMID:18823448

  12. Standard operating procedures (SOPs): reason for, types of, adequacy, approval, and deviations from and revisions to.

    PubMed

    Isaman, V; Thelin, R

    1995-09-01

    Standard Operating Procedures (SOPs) are required in order to comply with the Good Laboratory Practice Standards (GLPS) as promulgated under the Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA), 40 CFR Part 160. Paragraph 160.81(a) states: "A testing facility shall have standard operating procedures in writing setting forth study methods that management is satisfied are adequate to insure the quality and integrity of the data generated in the course of a study." Types of SOPs include administrative and personnel, analyses, substances, quality assurance and records, test system, equipment, and field related. All SOPs must be adequate in scope to describe the function in sufficient detail that the study data are reproducible. All SOPs must be approved at a management level, as described in a corporate organization chart. Signatures for SOP responsibility, authorship, and Quality Assurance review add strength and accountability to the SOP. In the event a procedure or method is performed differently from what is stated in the SOP, an SOP deviation is necessary. As methods and procedures are improved, SOP revisions are necessary to maintain SOP adequacy and applicability. The replaced SOP is put into a historical SOP file and all copies of the replaced SOP are destroyed.
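
    The revision workflow described above (approve a new revision, retire the replaced SOP to a historical file) can be sketched in a few lines. The following Python sketch is illustrative only; the class and field names are invented.

    ```python
    # Illustrative sketch of the SOP lifecycle the abstract describes: an active
    # SOP is revised, and the superseded revision moves to a historical file.
    # Class and field names are invented for illustration.

    from dataclasses import dataclass, field

    @dataclass
    class SOP:
        sop_id: str
        revision: int
        text: str
        approved_by: str

    @dataclass
    class SOPRegistry:
        active: dict = field(default_factory=dict)      # sop_id -> current SOP
        historical: list = field(default_factory=list)  # superseded revisions

        def revise(self, sop_id: str, new_text: str, approved_by: str) -> SOP:
            old = self.active[sop_id]
            self.historical.append(old)  # replaced SOP goes to the historical file
            new = SOP(sop_id, old.revision + 1, new_text, approved_by)
            self.active[sop_id] = new
            return new

    registry = SOPRegistry()
    registry.active["QA-001"] = SOP("QA-001", 1, "Original procedure.", "QA Manager")
    registry.revise("QA-001", "Improved procedure.", "QA Manager")
    print(len(registry.historical))  # 1 superseded revision retained
    ```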

  13. An audit of risk assessments for suicide and attempted suicide in ED: a retrospective review of quality.

    PubMed

    de Beer, Wayne; DeWitt, Bernard; Schofield, Jules; Clark, Helen; Gibbons, Veronique

    2018-02-23

    The primary aim of this audit was to determine the quality of psychiatric risk assessments conducted by Mental Health & Addiction Services clinicians for patients presenting to the emergency department of Waikato Hospital, Hamilton, New Zealand, following an attempted suicide. A retrospective, randomised audit of 376 files of patients who had presented to the ED over the 12-month period from 1 July 2015 to 30 June 2016 was conducted against the standards in the current New Zealand Ministry of Health Clinical Practice Guideline for Deliberate Self Harm (DSH). Clinicians routinely focused on the historical features of the suicide-attempt presentation while failing to record judgements about future suicidal behaviour. Interactions with family members were recorded in fewer than half of the cases. The most poorly adhered-to guideline item was checking whether Māori patients wanted culturally appropriate services during assessment and treatment planning, which was recorded in less than 10% of the clinical records. To improve the quality of suicide risk assessments and to better align practice with the guideline, the authors propose redeveloping clinician training, with a focus on cultural competence and on confidentiality and privacy as they relate to an attempted-suicide episode.
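
    Audits of this kind reduce to tallying, for each guideline criterion, how many files record the required item. A minimal Python sketch of that tally follows; the criterion names and records are invented for illustration and are not the study's data.

    ```python
    # Hypothetical sketch of per-criterion adherence rates across audited files.
    # Criterion names and records are invented, not the study's data.

    from collections import Counter

    criteria = ["future_risk_judgement", "family_contact", "cultural_needs_checked"]
    audited_files = [
        {"future_risk_judgement": False, "family_contact": True,  "cultural_needs_checked": False},
        {"future_risk_judgement": False, "family_contact": False, "cultural_needs_checked": False},
        {"future_risk_judgement": True,  "family_contact": True,  "cultural_needs_checked": False},
    ]

    met = Counter()
    for record in audited_files:
        for criterion in criteria:
            met[criterion] += record[criterion]  # True counts as 1

    for criterion in criteria:
        rate = 100 * met[criterion] / len(audited_files)
        print(f"{criterion}: {rate:.0f}% adherence")
    ```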

  14. 75 FR 45480 - Approval and Promulgation of Air Quality Implementation Plans; Minnesota

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-03

    ... consider your comment. Electronic files should avoid the use of special characters, any form of encryption... units at the facility are three fossil fuel-fired boilers (Nos. 1, 2, and 3), and four emergency...

  15. 43 CFR 3809.401 - Where do I file my plan of operations and what information must I include with it?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... equipment, devices, or practices you propose to use during operations including, where applicable— (i) Maps..., paleontological resources, cave resources, hydrology, soils, vegetation, wildlife, air quality, cultural resources...

  16. 43 CFR 3809.401 - Where do I file my plan of operations and what information must I include with it?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... equipment, devices, or practices you propose to use during operations including, where applicable— (i) Maps..., paleontological resources, cave resources, hydrology, soils, vegetation, wildlife, air quality, cultural resources...

  17. 43 CFR 3809.401 - Where do I file my plan of operations and what information must I include with it?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... equipment, devices, or practices you propose to use during operations including, where applicable— (i) Maps..., paleontological resources, cave resources, hydrology, soils, vegetation, wildlife, air quality, cultural resources...

  18. 78 FR 45927 - Information Collection Being Reviewed by the Federal Communications Commission Under Delegated...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-30

    ... quality, utility, and clarity of the information collected; ways to minimize the burden of the collection... noncommercial FM and TV broadcast station is required to file an Ownership Report for Noncommercial Educational...

  19. SEGY to ASCII Conversion and Plotting Program 2.0

    USGS Publications Warehouse

    Goldman, Mark R.

    2005-01-01

    SEGY has long been a standard format for storing seismic data and header information, and almost every seismic processing package can read and write it. In the data-processing world, however, ASCII is the 'universal' format: very few general-purpose plotting or computation programs accept data in SEGY format. The software presented in this report, referred to as SEGY to ASCII (SAC), converts seismic data written in SEGY format (Barry et al., 1975) to an ASCII data file and then creates a PostScript file of the seismic data using a general plotting package (GMT; Wessel and Smith, 1995). The resulting PostScript file may be plotted by any standard PostScript plotting program. There are two versions of SAC: one for plotting a SEGY file that contains a single gather, such as a stacked CDP or migrated section, and a second for plotting multiple gathers from a SEGY file containing more than one gather, such as a collection of shot gathers. Note that if a SEGY file has multiple gathers, each gather must have the same number of traces, and each trace must have the same sample interval and number of samples. SAC will read several common variants of SEGY data, including SEGY files with sample values written in either IBM or IEEE floating-point format. In addition, utility programs are provided to convert non-standard Seismic Unix (.sux) SEGY files and PASSCAL (.rsy) SEGY files to standard SEGY files. SAC allows complete user control over all plotting parameters, including label size and font, tick-mark intervals, trace scaling, and the inclusion of a title and descriptive text. SAC shell scripts create a PostScript image of the seismic data in vector rather than bitmap format, using GMT's pswiggle command. Although this can produce a very large PostScript file, the image quality is generally superior to that of a bitmap image, and commercial programs such as Adobe Illustrator can manipulate the image more efficiently.
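
    One technically interesting step the abstract mentions is reading sample values in either IBM or IEEE floating-point format. Below is a minimal Python sketch of the IBM-to-IEEE conversion for a big-endian 32-bit IBM float (sign bit, excess-64 base-16 exponent, 24-bit fraction); it illustrates the format and is not the SAC source code.

    ```python
    # Minimal sketch of IBM single-precision to IEEE float conversion, as used
    # when reading SEGY sample values. Assumes big-endian 4-byte IBM floats.

    import struct

    def ibm_to_float(b: bytes) -> float:
        """Convert one 4-byte IBM single-precision float to a Python float."""
        (word,) = struct.unpack(">I", b)
        sign = -1.0 if word >> 31 else 1.0
        exponent = (word >> 24) & 0x7F          # excess-64, base 16
        fraction = (word & 0x00FFFFFF) / float(1 << 24)
        if fraction == 0.0:
            return 0.0
        return sign * fraction * 16.0 ** (exponent - 64)

    # 0x42640000: sign 0, exponent 0x42 (66), fraction 0x640000 / 2^24 = 0.390625
    # value = 0.390625 * 16^(66-64) = 100.0
    print(ibm_to_float(bytes.fromhex("42640000")))  # 100.0
    ```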

  20. Web usage data mining agent

    NASA Astrophysics Data System (ADS)

    Madiraju, Praveen; Zhang, Yanqing

    2002-03-01

    When a user visits a website, he or she leaves behind impressions, usage patterns, and access patterns in the web server's log file. A web usage mining agent can analyze these logs to help web developers improve the organization and presentation of their websites, and to help system administrators improve system performance. Web logs are also invaluable for building adaptive websites and for analyzing network traffic. This paper presents the design and implementation of a web usage mining agent for mining web log files.
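
    As an illustration of the kind of analysis such an agent performs, here is a minimal Python sketch that parses Common Log Format entries and tallies page requests; the regular expression assumes standard CLF fields, and the sample log lines are invented.

    ```python
    # Minimal sketch of web log analysis: parse Common Log Format lines and
    # tally page accesses. Sample lines are invented for illustration.

    import re
    from collections import Counter

    CLF = re.compile(r'(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3}) (\S+)')

    log_lines = [
        '192.0.2.1 - - [10/Mar/2002:10:15:32 +0000] "GET /index.html HTTP/1.0" 200 1043',
        '192.0.2.7 - - [10/Mar/2002:10:16:01 +0000] "GET /products.html HTTP/1.0" 200 2301',
        '192.0.2.1 - - [10/Mar/2002:10:16:40 +0000] "GET /products.html HTTP/1.0" 200 2301',
    ]

    page_hits = Counter()
    for line in log_lines:
        match = CLF.match(line)
        if match:
            host, timestamp, method, path, status, size = match.groups()
            page_hits[path] += 1

    # Most-requested pages hint at how visitors actually navigate the site.
    for path, hits in page_hits.most_common():
        print(path, hits)
    ```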
