Sample records for software usage agreement

  1. Predicting Forearm Physical Exposures During Computer Work Using Self-Reports, Software-Recorded Computer Usage Patterns, and Anthropometric and Workstation Measurements.

    PubMed

    Huysmans, Maaike A; Eijckelhof, Belinda H W; Garza, Jennifer L Bruno; Coenen, Pieter; Blatter, Birgitte M; Johnson, Peter W; van Dieën, Jaap H; van der Beek, Allard J; Dennerlein, Jack T

    2017-12-15

    Alternative techniques to assess physical exposures, such as prediction models, could facilitate more efficient epidemiological assessments in future large cohort studies examining physical exposures in relation to work-related musculoskeletal symptoms. The aim of this study was to evaluate two types of models that predict arm-wrist-hand physical exposures (i.e. muscle activity, wrist postures and kinematics, and keyboard and mouse forces) during computer use, which only differed with respect to the candidate predicting variables: (i) a full set of predicting variables, including self-reported factors, software-recorded computer usage patterns, and worksite measurements of anthropometrics and workstation set-up (full models); and (ii) a practical set of predicting variables, only including the self-reported factors and software-recorded computer usage patterns, that are relatively easy to assess (practical models). Prediction models were built using data from a field study among 117 office workers who were symptom-free at the time of measurement. Arm-wrist-hand physical exposures were measured for approximately two hours while workers performed their own computer work. Each worker's anthropometry and workstation set-up were measured by an experimenter, computer usage patterns were recorded using software, and self-reported factors (including individual factors, job characteristics, computer work behaviours, psychosocial factors, workstation set-up characteristics, and leisure-time activities) were collected by an online questionnaire. We determined the predictive quality of the models in terms of R2 and root mean squared (RMS) values and exposure classification agreement to low-, medium-, and high-exposure categories (in the practical model only). The full models had R2 values that ranged from 0.16 to 0.80, whereas for the practical models values ranged from 0.05 to 0.43.
Interquartile ranges were similar for the two models, indicating that the full models performed better only for some physical exposures. Relative RMS errors ranged between 5% and 19% for the full models, and between 10% and 19% for the practical models. When the predicted physical exposures were classified into low, medium, and high, classification agreement ranged from 26% to 71%. The full prediction models, based on self-reported factors, software-recorded computer usage patterns, and additional measurements of anthropometrics and workstation set-up, showed better predictive quality than the practical models based on self-reported factors and recorded computer usage patterns only. However, predictive quality varied considerably across the different arm-wrist-hand exposure parameters. Future exploration of the relation between predicted physical exposure and symptoms is therefore only recommended for physical exposures that can be reasonably well predicted. © The Author 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
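    The evaluation metrics named in this abstract (R2, relative RMS error, and low/medium/high classification agreement) are standard quantities. A minimal sketch of how they might be computed for one predicted exposure parameter follows; the function names are ours, and we assume relative RMS error is expressed as a percentage of the mean observed exposure, which may differ from the study's exact convention.

```python
import numpy as np

def r_squared(y_true, y_pred):
    # Coefficient of determination: 1 - residual SS / total SS.
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def relative_rms_error(y_true, y_pred):
    # RMS prediction error as a percentage of the mean observed exposure
    # (our assumed normalization).
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    rms = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return 100.0 * rms / y_true.mean()

def tertile_agreement(y_true, y_pred):
    # Fraction of workers assigned the same low/medium/high tertile
    # by the measured and the predicted exposure.
    def tertiles(y):
        cuts = np.percentile(y, [100 / 3, 200 / 3])
        return np.digitize(y, cuts)
    return np.mean(tertiles(np.asarray(y_true)) == tertiles(np.asarray(y_pred)))
```

    A perfect prediction gives R2 = 1, relative RMS error 0%, and tertile agreement 1.0, which makes the functions easy to sanity-check.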

  2. Mining Software Usage with the Automatic Library Tracking Database (ALTD)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hadri, Bilel; Fahey, Mark R

    2013-01-01

    Tracking software usage is important for HPC centers, computer vendors, code developers and funding agencies to provide more efficient and targeted software support, and to forecast needs and guide HPC software effort towards the Exascale era. However, accurately tracking software usage on HPC systems has been a challenging task. In this paper, we present a tool called the Automatic Library Tracking Database (ALTD) that has been developed and put into production on several Cray systems. The ALTD infrastructure prototype automatically and transparently stores information about the libraries linked into an application at compilation time and also the executables launched in a batch job. We will illustrate the usage of libraries, compilers and third-party software applications on a system managed by the National Institute for Computational Sciences.
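    ALTD's core idea, intercepting the link step and logging which libraries an application pulls in, can be sketched in a few lines. The record format and function name below are illustrative assumptions, not ALTD's actual implementation (which wraps the linker and job launcher on Cray systems and stores records in a database).

```python
import re
import json
import datetime

def record_link(cmd_args, logfile="altd_log.jsonl"):
    """Record libraries referenced in a link command (an ALTD-like sketch).

    Collects -lfoo flags and explicit .a/.so paths from the linker
    argument list, then appends one JSON record per link step."""
    libs = []
    for arg in cmd_args:
        if arg.startswith("-l"):
            libs.append("lib" + arg[2:])          # -lm -> libm
        elif re.search(r"\.(a|so)(\.\d+)*$", arg):
            libs.append(arg)                       # explicit archive/shared object
    record = {"when": datetime.datetime.now().isoformat(),
              "libraries": sorted(set(libs))}
    with open(logfile, "a") as f:
        f.write(json.dumps(record) + "\n")
    return record
```

    A transparent wrapper would invoke this before handing the unchanged argument list to the real linker, so users see no difference in behavior.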

  3. Pre Service Teachers' Usage of Dynamic Mathematics Software

    ERIC Educational Resources Information Center

    Bulut, Mehmet; Bulut, Neslihan

    2011-01-01

    The aim of this study concerns mathematics education and dynamic mathematics software. Dynamic mathematics software provides new opportunities for using both a computer algebra system and dynamic geometry software. GeoGebra was selected as the dynamic mathematics software in this research. In this study, it is investigated what the usage of pre-service…

  4. Reference management software for systematic reviews and meta-analyses: an exploration of usage and usability.

    PubMed

    Lorenzetti, Diane L; Ghali, William A

    2013-11-15

    Reference management software programs enable researchers to more easily organize and manage large volumes of references typically identified during the production of systematic reviews. The purpose of this study was to determine the extent to which authors are using reference management software to produce systematic reviews; identify which programs are used most frequently and rate their ease of use; and assess the degree to which software usage is documented in published studies. We reviewed the full text of systematic reviews published in core clinical journals indexed in ACP Journal Club from 2008 to November 2011 to determine the extent to which reference management software usage is reported in published reviews. We surveyed corresponding authors to verify and supplement information in published reports, and gather frequency and ease-of-use data on individual reference management programs. Of the 78 researchers who responded to our survey, 79.5% reported that they had used a reference management software package to prepare their review. Of these, 4.8% reported this usage in their published studies. EndNote, Reference Manager, and RefWorks were the programs of choice for more than 98% of authors who used this software. Comments with respect to ease-of-use issues focused on the integration of this software with other programs and computer interfaces, and the sharing of reference databases among researchers. Despite underreporting of use, reference management software is frequently adopted by authors of systematic reviews. The transparency, reproducibility and quality of systematic reviews may be enhanced through increased reporting of reference management software usage.

  5. Usage of "Powergraph" software at laboratory lessons of "general physics" department of MEPhI

    NASA Astrophysics Data System (ADS)

    Klyachin, N. A.; Matronchik, A. Yu.; Khangulyan, E. V.

    2017-01-01

    We consider the usage of "PowerGraph" software in the laboratory exercise "Study of sodium spectrum" of the physical experiment lessons. Together with the design of the experimental setup, we discuss the sodium spectra digitized with a computer audio chip. Usage of "PowerGraph" software in the laboratory experiment "Study of sodium spectrum" allows efficient visualization of the sodium spectrum and analysis of its fine structure. In particular, it allows quantitative measurements of the wavelengths and relative line intensities.

  6. ITERATIVE SCATTER CORRECTION FOR GRID-LESS BEDSIDE CHEST RADIOGRAPHY: PERFORMANCE FOR A CHEST PHANTOM.

    PubMed

    Mentrup, Detlef; Jockel, Sascha; Menser, Bernd; Neitzel, Ulrich

    2016-06-01

    The aim of this work was to experimentally compare the contrast improvement factors (CIFs) of a newly developed software-based scatter correction to the CIFs achieved by an antiscatter grid. To this end, three aluminium discs were placed in the lung, the retrocardial and the abdominal areas of a thorax phantom, and digital radiographs of the phantom were acquired both with and without a stationary grid. The contrast generated by the discs was measured in both images, and the CIFs achieved by grid usage were determined for each disc. Additionally, the non-grid images were processed with a scatter correction software. The contrasts generated by the discs were determined in the scatter-corrected images, and the corresponding CIFs were calculated. The CIFs obtained with the grid and with the software were in good agreement. In conclusion, the experiment demonstrates quantitatively that software-based scatter correction allows restoring the image contrast of a non-grid image in a manner comparable with an antiscatter grid. © The Author 2015. Published by Oxford University Press. All rights reserved.
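    The CIF comparison reduces to simple arithmetic. As a sketch (function names are ours, and we assume the usual definition of relative disc contrast against the local background, which may differ in detail from the study's measurement protocol):

```python
def local_contrast(signal_bg, signal_disc):
    # Relative contrast of an aluminium disc against its local background.
    return (signal_bg - signal_disc) / signal_bg

def contrast_improvement_factor(c_processed, c_raw):
    # CIF > 1 means the grid, or the scatter-correction software,
    # restored contrast relative to the uncorrected non-grid image.
    return c_processed / c_raw
```

    The study's conclusion amounts to the statement that the CIF computed from software-corrected images approximately equals the CIF computed from grid images, disc by disc.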

  7. A Survey of Bioinformatics Database and Software Usage through Mining the Literature.

    PubMed

    Duck, Geraint; Nenadic, Goran; Filannino, Michele; Brass, Andy; Robertson, David L; Stevens, Robert

    2016-01-01

    Computer-based resources are central to much, if not most, biological and medical research. However, while there is an ever expanding choice of bioinformatics resources to use, described within the biomedical literature, little work to date has provided an evaluation of the full range of availability or levels of usage of database and software resources. Here we use text mining to process the PubMed Central full-text corpus, identifying mentions of databases or software within the scientific literature. We provide an audit of the resources contained within the biomedical literature, and a comparison of their relative usage, both over time and between the sub-disciplines of bioinformatics, biology and medicine. We find that trends in resource usage differ between these domains. The bioinformatics literature emphasises novel resource development, while database and software usage within biology and medicine is more stable and conservative. Many resources are only mentioned in the bioinformatics literature, with a relatively small number making it out into general biology, and fewer still into the medical literature. In addition, many resources are seeing a steady decline in their usage (e.g., BLAST, SWISS-PROT), though some are instead seeing rapid growth (e.g., the GO, R). We find a striking imbalance in resource usage with the top 5% of resource names (133 names) accounting for 47% of total usage, and over 70% of resources extracted being only mentioned once each. While these results highlight the dynamic and creative nature of bioinformatics research, they raise questions about software reuse, choice and the sharing of bioinformatics practice. Is it acceptable that so many resources are apparently never reused? Finally, our work is a step towards automated extraction of scientific method from text. We make the dataset generated by our study available under the CC0 license here: http://dx.doi.org/10.6084/m9.figshare.1281371.
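    The headline concentration figures (the top 5% of names covering 47% of usage; over 70% of resources mentioned only once) correspond to two simple statistics over a name-to-mention-count table. A sketch, with hypothetical function and resource names:

```python
def top_share(mention_counts, top_fraction=0.05):
    # Share of all mentions captured by the most-mentioned fraction of resources.
    counts = sorted(mention_counts.values(), reverse=True)
    k = max(1, int(len(counts) * top_fraction))
    return sum(counts[:k]) / sum(counts)

def singleton_fraction(mention_counts):
    # Fraction of resources mentioned exactly once in the corpus.
    counts = list(mention_counts.values())
    return sum(1 for c in counts if c == 1) / len(counts)
```

    Applied to the study's extracted counts, top_share would reproduce the 47% figure and singleton_fraction the over-70% figure.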

  8. Style and Usage Software: Mentor, not Judge.

    ERIC Educational Resources Information Center

    Smye, Randy

    Computer software style and usage checkers can encourage students' recursive revision strategies. For example, HOMER is based on the revision pedagogy presented in Richard Lanham's "Revising Prose," while Grammatik II focuses on readability, passive voice, and possibly misused words or phrases. Writer's Workbench "Style" (a UNIX program) provides…

  9. Extracting patterns of database and software usage from the bioinformatics literature

    PubMed Central

    Duck, Geraint; Nenadic, Goran; Brass, Andy; Robertson, David L.; Stevens, Robert

    2014-01-01

    Motivation: As a natural consequence of being a computer-based discipline, bioinformatics has a strong focus on database and software development, but the volume and variety of resources are growing at unprecedented rates. An audit of database and software usage patterns could help provide an overview of developments in bioinformatics and community common practice, and comparing the links between resources through time could demonstrate both the persistence of existing software and the emergence of new tools. Results: We study the connections between bioinformatics resources and construct networks of database and software usage patterns, based on resource co-occurrence, that correspond to snapshots of common practice in the bioinformatics community. We apply our approach to pairings of phylogenetics software reported in the literature and argue that these could provide a stepping stone into the identification of scientific best practice. Availability and implementation: The extracted resource data, the scripts used for network generation and the resulting networks are available at http://bionerds.sourceforge.net/networks/ Contact: robert.stevens@manchester.ac.uk PMID:25161253
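    The co-occurrence networks described here can be sketched as weighted edges counted over per-article mention sets. The function name and data layout below are our illustrative assumptions, not the authors' released scripts:

```python
from collections import Counter
from itertools import combinations

def cooccurrence_network(papers):
    """Build weighted co-occurrence edges.

    `papers` is an iterable of sets, each set holding the resource names
    mentioned together in one article; the edge weight counts how many
    articles mention both endpoints."""
    edges = Counter()
    for mentions in papers:
        for a, b in combinations(sorted(mentions), 2):
            edges[(a, b)] += 1   # sorted() keeps the edge key canonical
    return edges
```

    A snapshot of common practice then falls out by filtering edges above a weight threshold for a given time window of articles.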

  10. Generic, Type-Safe and Object Oriented Computer Algebra Software

    NASA Astrophysics Data System (ADS)

    Kredel, Heinz; Jolly, Raphael

    Advances in computer science, in particular object oriented programming, and software engineering have had little practical impact on computer algebra systems in the last 30 years. The software design of existing systems is still dominated by ad-hoc memory management, weakly typed algorithm libraries and proprietary domain specific interactive expression interpreters. We discuss a modular approach to computer algebra software: usage of state-of-the-art memory management and run-time systems (e.g. JVM); usage of strongly typed, generic, object oriented programming languages (e.g. Java); and usage of general purpose, dynamic interactive expression interpreters (e.g. Python). To illustrate the workability of this approach, we have implemented and studied computer algebra systems in Java and Scala. In this paper we report on the current state of this work by presenting new examples.
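    The generic, strongly typed approach the authors pursue in Java and Scala can be loosely illustrated even in Python's gradual typing: a polynomial type parameterized over its coefficient ring. This is our own sketch, not code from the authors' systems, and it assumes only that coefficients support + and *.

```python
from typing import Generic, TypeVar

C = TypeVar("C")  # coefficient type, assumed to support + and *

class Poly(Generic[C]):
    """Dense univariate polynomial, generic over its coefficient type."""
    def __init__(self, coeffs):
        self.coeffs = list(coeffs)  # coeffs[i] multiplies x**i

    def __add__(self, other):
        n = max(len(self.coeffs), len(other.coeffs))
        a = self.coeffs + [0] * (n - len(self.coeffs))
        b = other.coeffs + [0] * (n - len(other.coeffs))
        return Poly(x + y for x, y in zip(a, b))

    def __mul__(self, other):
        # Schoolbook multiplication; works for any ring whose elements
        # absorb the integer 0 under addition (ints, Fractions, etc.).
        out = [0] * (len(self.coeffs) + len(other.coeffs) - 1)
        for i, a in enumerate(self.coeffs):
            for j, b in enumerate(other.coeffs):
                out[i + j] = out[i + j] + a * b
        return Poly(out)
```

    The same Poly works over integers, rationals, or any user-defined ring type, which is the modularity argument the paper makes with Java generics.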

  11. 40 CFR 35.6320 - Usage rate.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 1 2013-07-01 2013-07-01 false Usage rate. 35.6320 Section 35.6320... Property Requirements Under A Cooperative Agreement § 35.6320 Usage rate. (a) Usage rate approval. To... by site, activity, and operable unit, as applicable, the recipient must apply a usage rate. The...

  12. 40 CFR 35.6320 - Usage rate.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 1 2014-07-01 2014-07-01 false Usage rate. 35.6320 Section 35.6320... Property Requirements Under A Cooperative Agreement § 35.6320 Usage rate. (a) Usage rate approval. To... by site, activity, and operable unit, as applicable, the recipient must apply a usage rate. The...

  13. 40 CFR 35.6320 - Usage rate.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 1 2011-07-01 2011-07-01 false Usage rate. 35.6320 Section 35.6320... Property Requirements Under A Cooperative Agreement § 35.6320 Usage rate. (a) Usage rate approval. To... by site, activity, and operable unit, as applicable, the recipient must apply a usage rate. The...

  14. 40 CFR 35.6320 - Usage rate.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 1 2010-07-01 2010-07-01 false Usage rate. 35.6320 Section 35.6320... Property Requirements Under A Cooperative Agreement § 35.6320 Usage rate. (a) Usage rate approval. To... by site, activity, and operable unit, as applicable, the recipient must apply a usage rate. The...

  15. 40 CFR 35.6320 - Usage rate.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 1 2012-07-01 2012-07-01 false Usage rate. 35.6320 Section 35.6320... Property Requirements Under A Cooperative Agreement § 35.6320 Usage rate. (a) Usage rate approval. To... by site, activity, and operable unit, as applicable, the recipient must apply a usage rate. The...

  16. An NAFP Project: Use of Object Oriented Methodologies and Design Patterns to Refactor Software Design

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali; Baggs, Rhoda

    2007-01-01

    In the early problem-solution era of software programming, functional decompositions were mainly used to design and implement software solutions. In functional decompositions, functions and data are introduced as two separate entities during the design phase, and are followed as such in the implementation phase. Functional decompositions make use of refactoring through optimizing the algorithms, grouping similar functionalities into common reusable functions, and using abstract representations of data where possible; all these are done during the implementation phase. This paper advocates the usage of object-oriented methodologies and design patterns as the centerpieces of refactoring software solutions. Refactoring software is a method of changing software design while explicitly preserving its external functionalities. The combined usage of object-oriented methodologies and design patterns to refactor should also benefit the overall software life cycle cost with improved software.
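    As a toy illustration of the refactoring described above (our own example, not drawn from the paper): a functional-decomposition area computation, where data and dispatch are separate, is refactored into polymorphic shape classes while explicitly preserving external behavior.

```python
import math

# Before: functional decomposition; data and functions are separate,
# and adding a shape means editing the dispatch function.
def area_procedural(kind, dims):
    if kind == "circle":
        return math.pi * dims[0] ** 2
    if kind == "rect":
        return dims[0] * dims[1]
    raise ValueError(kind)

# After: the same external behavior, refactored into objects so new
# shapes are added by extension rather than by editing a dispatch table.
class Circle:
    def __init__(self, r):
        self.r = r
    def area(self):
        return math.pi * self.r ** 2

class Rect:
    def __init__(self, w, h):
        self.w = w
        self.h = h
    def area(self):
        return self.w * self.h
```

    Both versions return identical results for identical inputs, which is the defining property of a refactoring.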

  17. A Student View of Technology in the Classroom: Does It Enhance the Seven Principles of Good Practice in Undergraduate Education?

    ERIC Educational Resources Information Center

    McCabe, Deborah Brown; Meuter, Matthew L.

    2011-01-01

    There has been an explosion of classroom technologies, yet there is a lack of research investigating the connection between classroom technology and student learning. This research project explores faculty usage of classroom-based course management software, student usage and opinions of these software tools, and an exploration of whether or not…

  18. Analysis of 3D Modeling Software Usage Patterns for K-12 Students

    ERIC Educational Resources Information Center

    Wu, Yi-Chieh; Liao, Wen-Hung; Chi, Ming-Te; Li, Tsai-Yen

    2016-01-01

    In response to the recent trend in maker movement, teachers are learning 3D techniques actively and bringing 3D printing into the classroom to enhance variety and creativity in designing lectures. This study investigates the usage pattern of a 3D modeling software, Qmodel Creator, which is targeted at K-12 students. User logs containing…

  19. Space Shuttle Usage of z/OS

    NASA Technical Reports Server (NTRS)

    Green, Jan

    2009-01-01

    This viewgraph presentation gives a detailed description of the avionics associated with the Space Shuttle's data processing system and its usage of z/OS. The contents include: 1) Mission, Products, and Customers; 2) Facility Overview; 3) Shuttle Data Processing System; 4) Languages and Compilers; 5) Application Tools; 6) Shuttle Flight Software Simulator; 7) Software Development and Build Tools; and 8) Fun Facts and Acronyms.

  20. Sptrace

    NASA Technical Reports Server (NTRS)

    Burleigh, Scott C.

    2011-01-01

    Sptrace is a general-purpose space utilization tracing system that is conceptually similar to the commercial Purify product used to detect leaks and other memory usage errors. It is designed to monitor space utilization in any sort of heap, i.e., a region of data storage on some device (nominally memory; possibly shared and possibly persistent) with a flat address space. This software can trace usage of shared and/or non-volatile storage in addition to private RAM (random access memory). Sptrace is implemented as a set of C function calls that are invoked from within the software that is being examined. The function calls fall into two broad classes: (1) functions that are embedded within the heap management software [e.g., JPL's SDR (Simple Data Recorder) and PSM (Personal Space Management) systems] to enable heap usage analysis by populating a virtual time-sequenced log of usage activity, and (2) reporting functions that are embedded within the application program whose behavior is suspect. For ease of use, these functions may be wrapped privately inside public functions offered by the heap management software. Sptrace can be used for VxWorks or RTEMS realtime systems as easily as for Linux or OS/X systems.
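    The two classes of Sptrace calls, heap-manager hooks that populate a time-sequenced usage log and application-side reporting functions, can be sketched in miniature. The class and method names below are our own; the real Sptrace is a set of C functions embedded in heap managers such as SDR and PSM.

```python
class SpaceTrace:
    """Minimal sketch of an sptrace-style usage log for a flat address space.

    Heap managers call note_alloc/note_free as they work; the application
    whose behavior is suspect calls report() to list blocks that were
    allocated but never released (suspected leaks)."""
    def __init__(self):
        self.events = []   # time-sequenced log of usage activity
        self.live = {}     # address -> size of outstanding blocks

    def note_alloc(self, addr, size):
        self.events.append(("alloc", addr, size))
        self.live[addr] = size

    def note_free(self, addr):
        self.events.append(("free", addr, self.live.get(addr)))
        if addr not in self.live:
            # usage error, analogous to a double free or wild free
            self.events.append(("error", addr, "free of untracked block"))
        self.live.pop(addr, None)

    def report(self):
        return dict(self.live)   # blocks still outstanding
```

    Because the log records only addresses and sizes, the same scheme works whether the "heap" is private RAM, shared memory, or persistent storage, matching Sptrace's device-agnostic design.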

  1. 76 FR 78290 - Cooperative Research and Development Agreement: Usage of Biodiesel Fuel Blends Within Marine...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-16

    ... Development Agreement: Usage of Biodiesel Fuel Blends Within Marine Inboard Engines AGENCY: Coast Guard, DHS... issues associated with using biodiesel fuel blends in marine inboard engines, with the overarching goal... participant in a CRADA similar to the one described in this notice (investigating the use of biodiesel fuel...

  2. Survey: Computer Usage in Design Courses.

    ERIC Educational Resources Information Center

    Henley, Ernest J.

    1983-01-01

    Presents results of a survey of chemical engineering departments regarding computer usage in senior design courses. Results are categorized according to: computer usage (use of process simulators, student-written programs, faculty-written or "canned" programs); costs (hard and soft money); and available software. Programs offered are…

  3. Avoidable Software Procurements

    DTIC Science & Technology

    2012-09-01

    software license, software usage, ELA, Software as a Service, SaaS, Software Asset... PaaS Platform as a Service; SaaS Software as a Service; SAM Software Asset Management; SMS System Management Server; SEWP Solutions for Enterprise Wide... delivery of full Cloud Services, we will see the transition of the Cloud Computing service model from IaaS to SaaS, or Software as a Service…

  4. System IDentification Programs for AirCraft (SIDPAC)

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    2002-01-01

    A collection of computer programs for aircraft system identification is described and demonstrated. The programs, collectively called System IDentification Programs for AirCraft, or SIDPAC, were developed in MATLAB as m-file functions. SIDPAC has been used successfully at NASA Langley Research Center with data from many different flight test programs and wind tunnel experiments. SIDPAC includes routines for experiment design, data conditioning, data compatibility analysis, model structure determination, equation-error and output-error parameter estimation in both the time and frequency domains, real-time and recursive parameter estimation, low order equivalent system identification, estimated parameter error calculation, linear and nonlinear simulation, plotting, and 3-D visualization. An overview of SIDPAC capabilities is provided, along with a demonstration of the use of SIDPAC with real flight test data from the NASA Glenn Twin Otter aircraft. The SIDPAC software is available without charge to U.S. citizens by request to the author, contingent on the requestor completing a NASA software usage agreement.
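    Among the SIDPAC capabilities listed, equation-error parameter estimation is the most compact to illustrate: it reduces to ordinary least squares on measured regressors. The sketch below is our own simplification, not SIDPAC's MATLAB code, and the linear aerodynamic model in the usage note is a hypothetical example.

```python
import numpy as np

def equation_error_fit(regressors, measured):
    """Equation-error parameter estimation as ordinary least squares.

    Stacks a bias column with the measured regressors into X and solves
    for theta minimizing ||X theta - z|| over the measured outputs z."""
    X = np.column_stack([np.ones(len(measured))] + list(regressors))
    theta, *_ = np.linalg.lstsq(X, np.asarray(measured, float), rcond=None)
    return theta
```

    For instance, fitting a linear lift model Cz = Cz0 + Cz_alpha * alpha to flight data would pass the angle-of-attack history as the single regressor and recover the two coefficients.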

  5. Intraobserver and Interobserver Agreement of Structural and Functional Software Programs for Measuring Glaucoma Progression.

    PubMed

    Moreno-Montañés, Javier; Antón, Vanesa; Antón, Alfonso; Larrosa, José M; Martinez-de-la-Casa, José María; Rebolleda, Gema; Ussa, Fernando; García-Granero, Marta

    2017-04-01

    It is important to evaluate intraobserver and interobserver agreement using visual field (VF) testing and optical coherence tomography (OCT) software in order to understand whether the use of this software is sufficient to detect glaucoma progression and to make decisions regarding its treatment. To evaluate agreement in VF and OCT software among 5 glaucoma specialists. The printout pages from VF progression software and OCT progression software from 100 patients were randomized, and the 5 glaucoma specialists subjectively and independently evaluated them for glaucoma. Each image was classified as having no progression, questionable progression, or progression. The principal investigator classified the patients previously as without variability (normal) or with high variability among tests (difficult). Using both software, the specialists also evaluated whether the glaucoma damage had progressed and if treatment change was needed. One month later, the same observers reevaluated the patients in a different order to determine intraobserver reproducibility. Intraobserver and interobserver agreement was estimated using κ statistics and Gwet second-order agreement coefficient. The agreement was compared with other factors. Of the 100 observed patients, half were male and all were white; the mean (SD) age was 69.7 (14.1) years. Intraobserver agreement was substantial to almost perfect for VF software (overall κ [95% CI], 0.59 [0.46-0.72] to 0.87 [0.79-0.96]) and similar for OCT software (overall κ [95% CI], 0.59 [0.46-0.71] to 0.85 [0.76-0.94]). Interobserver agreement among the 5 glaucoma specialists with the VF progression software was moderate (κ, 0.48; 95% CI, 0.41-0.55) and similar to OCT progression software (κ, 0.52; 95% CI, 0.44-0.59). Interobserver agreement was substantial in images classified as having no progression but only fair in those classified as having questionable glaucoma progression or glaucoma progression. 
Interobserver agreement was fair regarding questions about glaucoma progression (κ, 0.39; 95% CI, 0.32-0.48) and consideration about treatment changes (κ, 0.39; 95% CI, 0.32-0.48). The factors associated with agreement were the glaucoma stage and case difficulty. There was substantial intraobserver agreement but moderate interobserver agreement among glaucoma specialists using 2 glaucoma progression software packages. These data suggest that these glaucoma progression software packages are insufficient to obtain high interobserver agreement in both devices except in patients with no progression. The low agreement regarding progression or treatment changes suggests that both software programs used in isolation are insufficient for decision making.
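    The κ statistics reported throughout this abstract follow the standard chance-corrected agreement formula. A minimal two-rater (Cohen's κ) version is sketched below with our own function name; the study additionally uses Gwet's second-order coefficient and multi-rater variants, which are not shown.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    # Chance-corrected agreement between two sequences of categorical
    # labels (e.g. "no progression" / "questionable" / "progression").
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    labels = set(rater_a) | set(rater_b)
    p_exp = sum(freq_a[l] * freq_b[l] for l in labels) / n ** 2
    if p_exp == 1.0:   # both raters used a single identical label
        return 1.0
    return (p_obs - p_exp) / (1.0 - p_exp)
```

    κ is 1 for perfect agreement and 0 when observed agreement equals what label frequencies alone would predict, which is why the moderate interobserver values (about 0.5) are read as substantially below the intraobserver ones.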

  6. High School Students' Social Media Usage Habits

    ERIC Educational Resources Information Center

    Tezci, Erdogan; Içen, Mustafa

    2017-01-01

    Social media, an important product of computer and Internet technologies, has a growing usage level day by day. The increasing social media usage level creates opportunities for new software development and investment in this area. From this aspect, therefore, social media not only has an economic function but also makes persons participate in…

  7. Use and Perceived Benefits of Handheld Computer-based Clinical References

    PubMed Central

    Rothschild, Jeffrey M.; Fang, Edward; Liu, Vincent; Litvak, Irina; Yoon, Cathy; Bates, David W.

    2006-01-01

    Objective Clinicians are increasingly using handheld computers (HC) during patient care. We sought to assess the role of HC-based clinical reference software in medical practice by conducting a survey and assessing actual usage behavior. Design During a 2-week period in February 2005, 3600 users of a HC-based clinical reference application were asked by e-mail to complete a survey and permit analysis of their usage patterns. The software includes a pharmacopeia, an infectious disease reference, a medical diagnostic and therapeutic reference and transmits medical alerts and other notifications during HC synchronizations. Software usage data were captured during HC synchronization for the 4 weeks prior to survey completion. Measurements Survey responses and software usage data. Results The survey response rate was 42% (n = 1501). Physicians reported using the clinical reference software for a mean of 4 years and 39% reported using the software during more than half of patient encounters. Physicians who synchronized their HC during the data collection period (n = 1249; 83%) used the pharmacopeia for unique drug lookups a mean of 6.3 times per day (SD 12.4). The majority of users (61%) believed that in the prior 4 weeks, use of the clinical reference prevented adverse drug events or medication errors 3 or more times. Physicians also believed that alerts and other notifications improved patient care if they were public health warnings (e.g. about influenza), new immunization guidelines or drug alert warnings (e.g. rofecoxib withdrawal). Conclusion Current adopters of HC-based medical references use these tools frequently, and found them to improve patient care and be valuable in learning of recent alerts and warnings. PMID:16929041

  8. 48 CFR 252.251-7000 - Ordering from Government supply sources.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Enterprise Software Agreements, the Contractor shall follow the terms of the applicable schedule or agreement... Enterprise Software Agreement contractor). (2) The following statement: Any price reductions negotiated as part of an Enterprise Software Agreement issued under a Federal Supply Schedule contract shall control...

  9. An ontology based trust verification of software license agreement

    NASA Astrophysics Data System (ADS)

    Lu, Wenhuan; Li, Xiaoqing; Gan, Zengqin; Wei, Jianguo

    2017-08-01

    When we install or download software, a very large document stating the rights and obligations is displayed, which many people lack the patience to read or understand. This may make users distrust the software. In this paper, we propose an ontology-based verification for Software License Agreements. First, this work proposes an ontology model for the domain of Software License Agreements. The domain ontology is constructed by the proposed methodology according to copyright laws and 30 software license agreements. The License Ontology can act as part of a generalized copyright-law knowledge model, and can also work as a visualization of software licenses. Based on this proposed ontology, a software-license-oriented text summarization approach is proposed, whose performance shows that it can improve the accuracy of summarizing software licenses. Based on the summarization, the underlying purpose of the software license can be explicitly explored for trust verification.

  10. Would Boys and Girls Benefit from Gender-Specific Educational Software?

    ERIC Educational Resources Information Center

    Luik, Piret

    2011-01-01

    Most boys and girls interact differently with educational software and have different preferences for the design of educational software. The question is whether the usage of educational software has the same consequences for both genders. This paper investigates the characteristics of drill-and-practice programmes or drills that are efficient for…

  11. 48 CFR 208.7401 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... OF DEFENSE ACQUISITION PLANNING REQUIRED SOURCES OF SUPPLIES AND SERVICES Enterprise Software Agreements 208.7401 Definitions. As used in this subpart— Enterprise software agreement means an agreement or a contract that is used to acquire designated commercial software or related services such as...

  12. 48 CFR 208.7401 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... OF DEFENSE ACQUISITION PLANNING REQUIRED SOURCES OF SUPPLIES AND SERVICES Enterprise Software Agreements 208.7401 Definitions. As used in this subpart— Enterprise software agreement means an agreement or a contract that is used to acquire designated commercial software or related services such as...

  13. 48 CFR 208.7401 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... OF DEFENSE ACQUISITION PLANNING REQUIRED SOURCES OF SUPPLIES AND SERVICES Enterprise Software Agreements 208.7401 Definitions. As used in this subpart— Enterprise software agreement means an agreement or a contract that is used to acquire designated commercial software or related services such as...

  14. 48 CFR 208.7401 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... OF DEFENSE ACQUISITION PLANNING REQUIRED SOURCES OF SUPPLIES AND SERVICES Enterprise Software Agreements 208.7401 Definitions. As used in this subpart— Enterprise software agreement means an agreement or a contract that is used to acquire designated commercial software or related services such as...

  15. (abstract) Formal Inspection Technology Transfer Program

    NASA Technical Reports Server (NTRS)

    Welz, Linda A.; Kelly, John C.

    1993-01-01

    A Formal Inspection Technology Transfer Program, based on the inspection process developed by Michael Fagan at IBM, has been developed at JPL. The goal of this program is to support organizations wishing to use Formal Inspections to improve the quality of software and system level engineering products. The Technology Transfer Program provides start-up materials and assistance to help organizations establish their own Formal Inspection program. The course materials and certified instructors associated with the Technology Transfer Program have proven to be effective in classes taught at other NASA centers as well as at JPL. Formal Inspections (NASA tailored Fagan Inspections) are a set of technical reviews whose objective is to increase quality and reduce the cost of software development by detecting and correcting errors early. A primary feature of inspections is the removal of engineering errors before they amplify into larger and more costly problems downstream in the development process. Note that the word 'inspection' is used differently in software than in a manufacturing context. A Formal Inspection is a front-end quality enhancement technique, rather than a task conducted just prior to product shipment for the purpose of sorting defective systems (manufacturing usage). Formal Inspections are supporting and in agreement with the 'total quality' approach being adopted by many NASA centers.

  16. Nipype: a flexible, lightweight and extensible neuroimaging data processing framework in python.

    PubMed

    Gorgolewski, Krzysztof; Burns, Christopher D; Madison, Cindee; Clark, Dav; Halchenko, Yaroslav O; Waskom, Michael L; Ghosh, Satrajit S

    2011-01-01

    Current neuroimaging software offer users an incredible opportunity to analyze their data in different ways, with different underlying assumptions. Several sophisticated software packages (e.g., AFNI, BrainVoyager, FSL, FreeSurfer, Nipy, R, SPM) are used to process and analyze large and often diverse (highly multi-dimensional) data. However, this heterogeneous collection of specialized applications creates several issues that hinder replicable, efficient, and optimal use of neuroimaging analysis approaches: (1) No uniform access to neuroimaging analysis software and usage information; (2) No framework for comparative algorithm development and dissemination; (3) Personnel turnover in laboratories often limits methodological continuity and training new personnel takes time; (4) Neuroimaging software packages do not address computational efficiency; and (5) Methods sections in journal articles are inadequate for reproducing results. To address these issues, we present Nipype (Neuroimaging in Python: Pipelines and Interfaces; http://nipy.org/nipype), an open-source, community-developed, software package, and scriptable library. Nipype solves the issues by providing Interfaces to existing neuroimaging software with uniform usage semantics and by facilitating interaction between these packages using Workflows. Nipype provides an environment that encourages interactive exploration of algorithms, eases the design of Workflows within and between packages, allows rapid comparative development of algorithms and reduces the learning curve necessary to use different packages. Nipype supports both local and remote execution on multi-core machines and clusters, without additional scripting. Nipype is Berkeley Software Distribution licensed, allowing anyone unrestricted usage. 
An open, community-driven development philosophy allows the software to quickly adapt and address the varied needs of the evolving neuroimaging community, especially in the context of increasing demand for reproducible research.

  17. Nipype: A Flexible, Lightweight and Extensible Neuroimaging Data Processing Framework in Python

    PubMed Central

    Gorgolewski, Krzysztof; Burns, Christopher D.; Madison, Cindee; Clark, Dav; Halchenko, Yaroslav O.; Waskom, Michael L.; Ghosh, Satrajit S.

    2011-01-01

    Current neuroimaging software offer users an incredible opportunity to analyze their data in different ways, with different underlying assumptions. Several sophisticated software packages (e.g., AFNI, BrainVoyager, FSL, FreeSurfer, Nipy, R, SPM) are used to process and analyze large and often diverse (highly multi-dimensional) data. However, this heterogeneous collection of specialized applications creates several issues that hinder replicable, efficient, and optimal use of neuroimaging analysis approaches: (1) No uniform access to neuroimaging analysis software and usage information; (2) No framework for comparative algorithm development and dissemination; (3) Personnel turnover in laboratories often limits methodological continuity and training new personnel takes time; (4) Neuroimaging software packages do not address computational efficiency; and (5) Methods sections in journal articles are inadequate for reproducing results. To address these issues, we present Nipype (Neuroimaging in Python: Pipelines and Interfaces; http://nipy.org/nipype), an open-source, community-developed, software package, and scriptable library. Nipype solves the issues by providing Interfaces to existing neuroimaging software with uniform usage semantics and by facilitating interaction between these packages using Workflows. Nipype provides an environment that encourages interactive exploration of algorithms, eases the design of Workflows within and between packages, allows rapid comparative development of algorithms and reduces the learning curve necessary to use different packages. Nipype supports both local and remote execution on multi-core machines and clusters, without additional scripting. Nipype is Berkeley Software Distribution licensed, allowing anyone unrestricted usage. 
An open, community-driven development philosophy allows the software to quickly adapt and address the varied needs of the evolving neuroimaging community, especially in the context of increasing demand for reproducible research. PMID:21897815

  18. Achieving Better Buying Power through Acquisition of Open Architecture Software Systems: Volume 1

    DTIC Science & Technology

    2016-01-06

    supporting “Bring Your Own Devices” (BYOD)? 22 New business models for OA software components ● Franchising ● Enterprise licensing ● Metered usage...paths IP and cybersecurity requirements will need continuous attention! 35 New business models for OA software components ● Franchising ● Enterprise

  19. A Survey on the Usage of Biomass Wastes from Palm Oil Mills on Sustainable Development of Oil Palm Plantations in Sarawak

    NASA Astrophysics Data System (ADS)

    Phang, K. Y.; Lau, S. W.

    2017-06-01

    As one of the world’s largest palm oil producers and exporters, Malaysia is committed to sustainable management of this industry to address the emerging environmental challenges. This descriptive study aims to evaluate oil palm planters’ opinions regarding the usage of biomass wastes from palm oil mills and its impact on sustainable development of oil palm plantations in Sarawak. 253 planters across Sarawak were approached for their opinions about the usage of empty fruit bunch (EFB), palm oil mill effluent (POME), mesocarp fibre (MF), and palm kernel shell (PKS). This study revealed that the planters generally expressed higher agreement on the beneficial application of EFB and POME in oil palm plantations, as seen from the higher mean agreement ratings of 3.64 - 4.22 for EFB and POME, compared with 3.19 - 3.41 for MF and PKS on the 5-point Likert scale (with 5 being the strongest agreement). In addition, 94.7 percent of the planters’ companies were found to comply with the Environmental Impact Assessment (EIA) requirements, and nearly 38 percent carried out the EIA practice twice a year. High mean agreement was also correlated with compliance with environmental regulations, recording Likert ratings of 3.89 to 4.31. Lastly, the usage of EFB and POME also gained higher Likert ratings of 3.76 to 4.17, against 3.34 to 3.49 for MF and PKS, in the evaluation of the impact of sustainability in oil palm plantations. The planters agreed that the usage of EFB and POME has reduced the environmental impact and improved sustainable development, and that its application has been improved and expanded by research and development. However, the planters were uncertain of the impact of the usage of biomass wastes with respect to the contribution to social responsibility and company image in terms of transparency in waste management.

  20. Usage and effectiveness of seat and shoulder belts in rural Pennsylvania accidents

    DOT National Transportation Integrated Search

    1974-12-01

    This report presents an analysis of lap-belt and shoulder- belt usage and effectiveness in rural Pennsylvania accidents. The data were collected by the Pennsylvania State Police under an agreement with the National Highway Traffic Safety Administrati...

  1. MISR Data Product Specifications

    Atmospheric Science Data Center

    2016-11-25

    ... and usage of metadata. Improvements to MISR algorithmic software occasionally result in changes to file formats. While these changes ...  (DPS).   DPS Revision:   Rev. S Software Version:  5.0.9 Date:  September 20, 2010, updated April ...

  2. Usage of the www.2aida.org AIDA diabetes software Website: a pilot study.

    PubMed

    Lehmann, Eldon D

    2003-01-01

    AIDA is a diabetes-computing program freely available from www.2aida.org on the Web. The software is intended to serve as an educational support tool, and can be used by anyone who has an interest in diabetes, whether they be patients, relatives, health-care professionals, or students. In previous "Diabetes Information Technology & WebWatch" columns various indicators of usage of the AIDA program have been reviewed, and various comments from users of the software have been documented. One aspect of AIDA, though, that has been of considerable interest has been to investigate its Web-based distribution as a wider paradigm for more general medically related usage of the Internet. In this respect we have been keen to understand in general terms: (1) why people are turning to the Web for health-care/diabetes information; (2) more specifically, what sort of people are making use of the AIDA software; and (3) what benefits they feel might accrue from using the program. To answer these types of questions we have been conducting a series of audits/surveys via the AIDA Website, and via the software program itself, to learn as much as possible about who the AIDA end users really are. The rationale for this work is that, in this way, it should be possible to improve the program as well as tailor future versions of the software to the interests and needs of its users. However, a recurring observation is that data collection is easiest if it is as unobtrusive and innocuous as possible. One aspect of learning as much as possible about diabetes Website visitors and users may be to apply techniques that do not necessitate any visitor or user interaction. There are various programs that can monitor what pages visitors are viewing at a site. As these programs do not require visitors to do anything special, over time some interesting insights into Website usage may be obtained. 
For the current study we have reviewed anonymous logstats data, which are automatically collected at many Websites, to try and establish a baseline level of usage for the AIDA site. For the initial pilot study the analysis was performed from October 1, 2000 to November 1, 2001. The study has yielded an interesting insight into how the AIDA Website is being used. The results also confirm those of previous audits based on different self-reported methodologies, confirming, amongst other things, what countries people are visiting from and what operating systems/computers they are using. These analyses have been informative and useful. Given this, it is proposed to repeat the current pilot survey approach on a routine basis, in the future, as a way of monitoring on-going usage of the AIDA Website.

  3. Is liver perfusion CT reproducible? A study on intra- and interobserver agreement of normal hepatic haemodynamic parameters obtained with two different software packages.

    PubMed

    Bretas, Elisa Almeida Sathler; Torres, Ulysses S; Torres, Lucas Rios; Bekhor, Daniel; Saito Filho, Celso Fernando; Racy, Douglas Jorge; Faggioni, Lorenzo; D'Ippolito, Giuseppe

    2017-10-01

    To evaluate the agreement between the measurements of perfusion CT parameters in normal livers by using two different software packages. This retrospective study was based on 78 liver perfusion CT examinations acquired for detecting suspected liver metastasis. Patients with any morphological or functional hepatic abnormalities were excluded. The final analysis included 37 patients (59.7 ± 14.9 y). Two readers (1 and 2) independently measured perfusion parameters using different software packages from two major manufacturers (A and B). Arterial perfusion (AP) and portal perfusion (PP) were determined using the dual-input vascular one-compartmental model. Inter-reader agreement for each package and intrareader agreement between both packages were assessed with intraclass correlation coefficients (ICC) and Bland-Altman statistics. Inter-reader agreement was substantial for AP using software A (ICC = 0.82) and B (ICC = 0.85-0.86), fair for PP using software A (ICC = 0.44) and fair to moderate for PP using software B (ICC = 0.56-0.77). Intrareader agreement between software A and B ranged from slight to moderate (ICC = 0.32-0.62) for readers 1 and 2 considering the AP parameters, and from fair to moderate (ICC = 0.40-0.69) for readers 1 and 2 considering the PP parameters. At best there was only moderate agreement between both software packages, resulting in some uncertainty and suboptimal reproducibility. Advances in knowledge: Software-dependent factors may contribute to variance in perfusion measurements, demanding further technical improvements. AP measurements seem to be the most reproducible parameter to be adopted when evaluating liver perfusion CT.
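
    The Bland-Altman statistics used in this record can be sketched in a few lines: the bias is the mean of the paired differences, and the 95% limits of agreement are the bias ± 1.96 standard deviations. The numeric values below are invented for illustration, not the study's perfusion data.

    ```python
    # Bland-Altman limits of agreement between two sets of paired
    # measurements (e.g. one reader's AP values on software A vs. B).
    from statistics import mean, stdev

    def bland_altman(a, b):
        """Return (mean difference, lower LoA, upper LoA) for paired readings."""
        diffs = [x - y for x, y in zip(a, b)]
        d = mean(diffs)
        s = stdev(diffs)
        return d, d - 1.96 * s, d + 1.96 * s

    software_a = [24.1, 30.5, 27.8, 22.0, 29.3]   # hypothetical values
    software_b = [23.0, 31.2, 26.5, 21.4, 30.1]
    bias, lo, hi = bland_altman(software_a, software_b)
    print(f"bias={bias:.2f}, 95% LoA=({lo:.2f}, {hi:.2f})")
    ```

    A narrow interval between the limits of agreement indicates the two software packages could be used interchangeably; the wide intervals found in the study are what drives its reproducibility concerns.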

  4. Improving ICT Governance by Reorganizing Operation of ICT and Software Applications: The First Step to Outsource

    NASA Astrophysics Data System (ADS)

    Johansson, Björn

    During recent years great attention has been paid to outsourcing as well as to the reverse, insourcing (Dibbern et al., 2004). There has been a strong focus on how the management of software applications and information and communication technology (ICT), expressed as ICT management versus ICT governance, should be carried out (Grembergen, 2004). The maintenance and operation of software applications and ICT use a lot of the resources spent on ICT in organizations today (Bearingpoint, 2004), and managers are asked to increase the business benefits of these investments (Weill & Ross, 2004). That is, they are asked to improve the usage of ICT and to develop new business critical solutions supported by ICT. It also means that investments in ICT and software applications need to be shown to be worthwhile. Basically there are two considerations to take into account with ICT usage: cost reduction and improving business value. How the governance and management of ICT and software applications are organized is important. This means that the improvement of the control of maintenance and operation may be of interest to executives of organizations. It can be stated that usage is dependent on how it is organized. So, if an increase of ICT governance is the same as having well-organized ICT resources, could this be seen as the first step in organizations striving for external provision of ICT? This question is dealt with to some degree in this paper.

  5. Application of a personal computer for the uncoupled vibration analysis of wind turbine blade and counterweight assemblies

    NASA Technical Reports Server (NTRS)

    White, P. R.; Little, R. R.

    1985-01-01

    A research effort was undertaken to develop personal computer based software for vibrational analysis. The software was developed to analytically determine the natural frequencies and mode shapes for the uncoupled lateral vibrations of the blade and counterweight assemblies used in a single bladed wind turbine. The uncoupled vibration analysis was performed in both the flapwise and chordwise directions for static rotor conditions. The effects of rotation on the uncoupled flapwise vibration of the blade and counterweight assemblies were evaluated for various rotor speeds up to 90 rpm. The theory, used in the vibration analysis codes, is based on a lumped mass formulation for the blade and counterweight assemblies. The codes are general so that other designs can be readily analyzed. The input for the codes is generally interactive to facilitate usage. The output of the codes is both tabular and graphical. Listings of the codes are provided. Predicted natural frequencies of the first several modes show reasonable agreement with experimental results. The analysis codes were originally developed on a DEC PDP 11/34 minicomputer and then downloaded and modified to run on an ITT XTRA personal computer. Studies conducted to evaluate the efficiency of running the programs on a personal computer as compared with the minicomputer indicated that, with the proper combination of hardware and software options, the efficiency of using a personal computer exceeds that of a minicomputer.

  6. bioNerDS: exploring bioinformatics’ database and software use through literature mining

    PubMed Central

    2013-01-01

    Background Biology-focused databases and software define bioinformatics and their use is central to computational biology. In such a complex and dynamic field, it is of interest to understand what resources are available, which are used, how much they are used, and for what they are used. While scholarly literature surveys can provide some insights, large-scale computer-based approaches to identify mentions of bioinformatics databases and software from primary literature would automate systematic cataloguing, facilitate the monitoring of usage, and provide the foundations for the recovery of computational methods for analysing biological data, with the long-term aim of identifying best/common practice in different areas of biology. Results We have developed bioNerDS, a named entity recogniser for the recovery of bioinformatics databases and software from primary literature. We identify such entities with an F-measure ranging from 63% to 91% at the mention level and 63-78% at the document level, depending on corpus. Not attaining a higher F-measure is mostly due to high ambiguity in resource naming, which is compounded by the on-going introduction of new resources. To demonstrate the software, we applied bioNerDS to full-text articles from BMC Bioinformatics and Genome Biology. General mention patterns reflect the remit of these journals, highlighting BMC Bioinformatics’s emphasis on new tools and Genome Biology’s greater emphasis on data analysis. The data also illustrates some shifts in resource usage: for example, the past decade has seen R and the Gene Ontology join BLAST and GenBank as the main components in bioinformatics processing. Conclusions We demonstrate the feasibility of automatically identifying resource names on a large-scale from the scientific literature and show that the generated data can be used for exploration of bioinformatics database and software usage. 
For example, our results help to investigate the rate of change in resource usage and corroborate the suspicion that a vast majority of resources are created, but rarely (if ever) used thereafter. bioNerDS is available at http://bionerds.sourceforge.net/. PMID:23768135
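
    The F-measure quoted for bioNerDS is the standard harmonic mean of precision and recall over recognised mentions. A minimal sketch, using toy counts rather than the paper's actual evaluation figures:

    ```python
    # Mention-level F-measure: harmonic mean of precision and recall,
    # computed from true positives, false positives and false negatives.
    def f_measure(tp, fp, fn):
        precision = tp / (tp + fp)   # correct mentions / all predicted
        recall = tp / (tp + fn)      # correct mentions / all gold mentions
        return 2 * precision * recall / (precision + recall)

    # e.g. 70 correctly recognised mentions, 20 spurious, 15 missed
    print(f"F = {f_measure(70, 20, 15):.2%}")
    ```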

  7. Personal Computers in Iowa Vocational Agriculture Programs: Competency Assessment and Usage.

    ERIC Educational Resources Information Center

    Miller, W. Wade; And Others

    The competencies needed by Iowa vocational agriculture instructors at the secondary school level to integrate computer technology into the classroom were assessed, as well as the status of computer usage, types of computer use and software utilities and hardware used, and the sources of computer training obtained by instructors. Surveys were…

  8. 37 CFR 380.23 - Terms for making payment of royalty fees and statements of account.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... waiver, including development of proxy usage data. The Proxy Fee shall be paid by the date specified in... Educational Webcasters based on proxy usage data in accordance with a methodology adopted by the Collective's... third-party Web hosting or service provider maintains equipment or software for a Noncommercial...

  9. Computer Software: Copyright and Licensing Considerations for Schools and Libraries. ERIC Digest.

    ERIC Educational Resources Information Center

    Reed, Mary Hutchings

    This digest notes that the terms and conditions of computer software package license agreements control the use of software in schools and libraries, and examines the implications of computer software license agreements for classroom use and for library lending policies. Guidelines are provided for interpreting the Copyright Act, and insuring the…

  10. Of Publishers and Pirates: License Agreements Promote Unethical Behavior, But That's Only the Beginning.

    ERIC Educational Resources Information Center

    Pournelle, Jerry

    1984-01-01

    Discussion of software license agreements implies that they actually contribute to software piracy because of their stringency and indicates that competition in the software publishing field will eventually eliminate the piracy problem. (MBR)

  11. Learning Outcomes in a Laboratory Environment vs. Classroom for Statistics Instruction: An Alternative Approach Using Statistical Software

    ERIC Educational Resources Information Center

    McCulloch, Ryan Sterling

    2017-01-01

    The role of any statistics course is to increase the understanding and comprehension of statistical concepts and those goals can be achieved via both theoretical instruction and statistical software training. However, many introductory courses either forego advanced software usage, or leave its use to the student as a peripheral activity. The…

  12. The environmental control and life support system advanced automation project

    NASA Technical Reports Server (NTRS)

    Dewberry, Brandon S.

    1991-01-01

    The objective of the ECLSS Advanced Automation project includes reduction of the risk associated with the integration of new, beneficial software techniques. Demonstrations of this software to baseline engineering and test personnel will show the benefits of these techniques. The advanced software will be integrated into ground testing and ground support facilities, familiarizing key personnel with its usage.

  13. Collecting conditions usage metadata to optimize current and future ATLAS software and processing

    NASA Astrophysics Data System (ADS)

    Rinaldi, L.; Barberis, D.; Formica, A.; Gallas, E. J.; Oda, S.; Rybkin, G.; Verducci, M.; ATLAS Collaboration

    2017-10-01

    Conditions data (for example: alignment, calibration, data quality) are used extensively in the processing of real and simulated data in ATLAS. The volume and variety of the conditions data needed by different types of processing are quite diverse, so optimizing its access requires a careful understanding of conditions usage patterns. These patterns can be quantified by mining representative log files from each type of processing and gathering detailed information about conditions usage for that type of processing into a central repository.
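
    The log-mining step described here amounts to tallying which conditions-data folders each processing job reads. A minimal sketch of the idea; the log format and folder names below are hypothetical, not the actual ATLAS log schema:

    ```python
    # Count accesses per conditions folder in one job's log,
    # as a stand-in for mining representative log files.
    from collections import Counter
    import re

    ACCESS_RE = re.compile(r"reading folder (/\S+)")

    def conditions_usage(log_lines):
        """Return a Counter of conditions-folder accesses."""
        counts = Counter()
        for line in log_lines:
            m = ACCESS_RE.search(line)
            if m:
                counts[m.group(1)] += 1
        return counts

    log = [
        "12:00:01 INFO reading folder /Indet/Align",
        "12:00:02 INFO reading folder /LAR/Calib",
        "12:00:05 INFO reading folder /Indet/Align",
    ]
    print(conditions_usage(log).most_common())
    ```

    Aggregating such per-job counters into a central repository is what allows access patterns to be quantified across processing types.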

  14. Get Out of MySpace!

    ERIC Educational Resources Information Center

    Jones, Norah; Blackey, Haydn; Fitzgibbon, Karen; Chew, Esyin

    2010-01-01

    To understand the student experience on social software, the research aims to explore the disruptive nature and opportunity of social networking for higher education. Taking four universities, the research: (1) identifies the distinction between the students' current usage of social software; (2) reports on the students' experience on…

  15. Kaiser Permanente-Sandia National Health Care Model: Phase 1 prototype final report. Part 2 -- Domain analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, D.; Yoshimura, A.; Butler, D.

    This report describes the results of a Cooperative Research and Development Agreement between Sandia National Laboratories and Kaiser Permanente Southern California to develop a prototype computer model of Kaiser Permanente's health care delivery system. As a discrete event simulation, SimHCO models for each of 100,000 patients the progression of disease, individual resource usage, and patient choices in a competitive environment. SimHCO is implemented in the object-oriented programming language C², stressing reusable knowledge and reusable software components. The versioned implementation of SimHCO showed that the object-oriented framework allows the program to grow in complexity in an incremental way. Furthermore, timing calculations showed that SimHCO runs in a reasonable time on typical workstations, and that a second phase model will scale proportionally and run within the system constraints of contemporary computer technology.

  16. Assessing Usage and Maximizing Finance Lab Impact: A Case Exploration

    ERIC Educational Resources Information Center

    Noguera, Magdy; Budden, Michael Craig; Silva, Alberto

    2011-01-01

    This paper reports the results of a survey conducted to assess students' usage and perceptions of a finance lab. Finance labs differ from simple computer labs as they typically contain data boards, streaming market quotes, terminals and software that allow for real-time financial analyses. Despite the fact that such labs represent significant and…

  17. 78 FR 20120 - Cooperative Research and Development Agreement: Joint Technical Demonstration of Tactical Data...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-03

    ... Development Agreement: Joint Technical Demonstration of Tactical Data Link Range Enhancement Software AGENCY... (TDL) range enhancement software technologies to improve operational effectiveness and communications... Range Enhancement Software Technologies, U.S. Coast Guard Research and Development Center, 1 Chelsea...

  18. Computer and Software Use in Teaching the Beginning Statistics Course.

    ERIC Educational Resources Information Center

    Bartz, Albert E.; Sabolik, Marisa A.

    2001-01-01

    Surveys the extent of computer usage in the beginning statistics course and the variety of statistics software used. Finds that 69% of the psychology departments used computers in beginning statistics courses and 90% used computer-assisted data analysis in statistics or other courses. (CMK)

  19. Daily computer usage correlated with undergraduate students' musculoskeletal symptoms.

    PubMed

    Chang, Che-Hsu Joe; Amick, Benjamin C; Menendez, Cammie Chaumont; Katz, Jeffrey N; Johnson, Peter W; Robertson, Michelle; Dennerlein, Jack Tigh

    2007-06-01

    A pilot prospective study was performed to examine the relationships between daily computer usage time and musculoskeletal symptoms among undergraduate students. For three separate 1-week study periods distributed over a semester, 27 students reported body part-specific musculoskeletal symptoms three to five times daily. Daily computer usage time for the 24-hr period preceding each symptom report was calculated from computer input device activities measured directly by software loaded on each participant's primary computer. General Estimating Equation models tested the relationships between daily computer usage and symptom reporting. Daily computer usage longer than 3 hr was significantly associated with an odds ratio of 1.50 (1.01-2.25) for reporting symptoms. Odds of reporting symptoms also increased with quartiles of daily exposure. These data suggest a potential dose-response relationship between daily computer usage time and musculoskeletal symptoms.
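
    The effect measure reported here, an odds ratio, can be illustrated from a simple 2x2 exposure/outcome table. Note the study's 1.50 came from General Estimating Equation models, which this cross-tabulation does not reproduce; the counts below are invented:

    ```python
    # Odds ratio for symptoms given exposure (>3 h/day computer use):
    # OR = (odds of symptoms among exposed) / (odds among unexposed).
    def odds_ratio(exposed_sym, exposed_no, unexposed_sym, unexposed_no):
        return (exposed_sym / exposed_no) / (unexposed_sym / unexposed_no)

    # >3 h/day: 30 with symptoms, 20 without; <=3 h/day: 25 with, 25 without
    print(round(odds_ratio(30, 20, 25, 25), 2))
    ```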

  20. Glioblastoma Segmentation: Comparison of Three Different Software Packages.

    PubMed

    Fyllingen, Even Hovig; Stensjøen, Anne Line; Berntsen, Erik Magnus; Solheim, Ole; Reinertsen, Ingerid

    2016-01-01

    To facilitate a more widespread use of volumetric tumor segmentation in clinical studies, there is an urgent need for reliable, user-friendly segmentation software. The aim of this study was therefore to compare three different software packages for semi-automatic brain tumor segmentation of glioblastoma; namely BrainVoyager™ QX, ITK-Snap and 3D Slicer, and to make data available for future reference. Pre-operative, contrast enhanced T1-weighted 1.5 or 3 Tesla Magnetic Resonance Imaging (MRI) scans were obtained in 20 consecutive patients who underwent surgery for glioblastoma. MRI scans were segmented twice in each software package by two investigators. Intra-rater, inter-rater and between-software agreement was compared by using differences of means with 95% limits of agreement (LoA), Dice's similarity coefficients (DSC) and Hausdorff distance (HD). Time expenditure of segmentations was measured using a stopwatch. Eighteen tumors were included in the analyses. Inter-rater agreement was highest for BrainVoyager with difference of means of 0.19 mL and 95% LoA from -2.42 mL to 2.81 mL. Between-software agreement and 95% LoA were very similar for the different software packages. Intra-rater, inter-rater and between-software DSC were ≥ 0.93 in all analyses. Time expenditure was approximately 41 min per segmentation in BrainVoyager, and 18 min per segmentation in both 3D Slicer and ITK-Snap. Our main findings were that there is a high agreement within and between the software packages in terms of small intra-rater, inter-rater and between-software differences of means and high Dice's similarity coefficients. Time expenditure was highest for BrainVoyager, but all software packages were relatively time-consuming, which may limit usability in an everyday clinical setting.
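
    Dice's similarity coefficient, the overlap measure used in this record, treats each segmentation as a set of voxels: DSC = 2|A ∩ B| / (|A| + |B|). A minimal sketch with toy 2D "tumours" rather than real MRI voxel data:

    ```python
    # Dice's similarity coefficient between two segmentations,
    # represented as sets of voxel coordinates.
    def dice(a, b):
        a, b = set(a), set(b)
        if not a and not b:
            return 1.0  # two empty segmentations agree perfectly
        return 2 * len(a & b) / (len(a) + len(b))

    seg_rater1 = {(0, 0), (0, 1), (1, 0), (1, 1)}
    seg_rater2 = {(0, 1), (1, 0), (1, 1), (2, 1)}
    print(f"DSC = {dice(seg_rater1, seg_rater2):.2f}")
    ```

    A DSC of 1.0 means identical segmentations; the study's values ≥ 0.93 indicate near-complete voxel overlap between raters and packages.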

  1. The Levels of Speech Usage rating scale: comparison of client self-ratings with speech pathologist ratings.

    PubMed

    Gray, Christina; Baylor, Carolyn; Eadie, Tanya; Kendall, Diane; Yorkston, Kathryn

    2012-01-01

    The term 'speech usage' refers to what people want or need to do with their speech to fulfil the communication demands in their life roles. Speech-language pathologists (SLPs) need to know about clients' speech usage to plan appropriate interventions to meet their life participation goals. The Levels of Speech Usage is a categorical scale intended for client self-report of speech usage, but SLPs may want the option to use it as a proxy-report tool. The relationship between self-report and clinician ratings should be examined before the instrument is used in a proxy format. The primary purpose of this study was to compare client self-ratings with SLP ratings on the Levels of Speech Usage scale. The secondary purpose was to determine if the SLP ratings differed depending on whether or not the SLPs knew about the clients' medical condition. Self-ratings of adults with communication disorders on the Levels of Speech Usage scale were available from prior research. Vignettes about these individuals were created from existing data. Two sets of vignettes were created. One set contained information about demographic information, living situation, occupational status and hobbies or social activities. The second set was identical to the first with the addition of information about the clients' medical conditions and communication disorders. Various communication disorders were represented including dysarthria, voice disorders, laryngectomy, and mild cognitive and language disorders. Sixty SLPs were randomly divided into two groups with each group rating one set of vignettes. The task was completed online. While this does not replicate typical in-person clinical interactions, it was a feasible method for this study. For data analysis, the client self-ratings were considered fixed points and the percentage of SLP ratings in agreement with the self-ratings was calculated. The percentage of SLP ratings in exact agreement with client self-ratings was 44.9%. 
Agreement was lowest for the less-demanding speech usage categories and highest for the most demanding usage category. There was no significant difference between the two groups of SLPs based on knowledge of medical condition. SLPs often need to document the speech usage levels of clients. This study suggests the potential for SLPs to misjudge how clients see their own speech demands. Further research is needed to determine if similar results would be found in actual clinical interactions. Until then, SLPs should seek the input of their clients when using this instrument. © 2012 Royal College of Speech and Language Therapists.

  2. Iraq: Country Status Report.

    ERIC Educational Resources Information Center

    McFerren, Margaret

    A survey of the status of language usage in Iraq begins with an overview of the usage patterns of Arabic and Kurdish, especially in the context of recent political events and the agreement to make Kurdish a second official language in the Kurdish autonomous region, and to allow limited use of Kurdish in instruction and public communication. A…

  3. Academic Software Downloads from Google Code: Useful Usage Indicators?

    ERIC Educational Resources Information Center

    Thelwall, Mike; Kousha, Kayvan

    2016-01-01

    Introduction: Computer scientists and other researchers often make their programs freely available online. If this software makes a valuable contribution inside or outside of academia then its creators may want to demonstrate this with a suitable indicator, such as download counts. Methods: Download counts, citation counts, labels and licenses…

  4. GridKit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peles, Slaven

    2016-11-06

GridKit is a software development kit for interfacing power systems and power grid application software with high performance computing (HPC) libraries developed at National Labs and academia. It is also intended as an interoperability layer between different numerical libraries. GridKit is not a standalone application, but comes with a suite of test examples illustrating possible usage.

  5. Genomic adaptation of the ISA virus to Salmo salar codon usage

    PubMed Central

    2013-01-01

Background The ISA virus (ISAV) is an Orthomyxovirus whose genome encodes at least 10 proteins. Low protein identity and lack of genetic tools have hampered the study of the molecular mechanism behind its virulence. It has been shown that viral codon usage controls several processes such as translational efficiency, folding, tuning of protein expression, antigenicity and virulence. Despite this, the possible role that adaptation to host codon usage plays in virulence and viral evolution has not been studied in ISAV. Methods Intergenomic adaptation between viral and host genomes was calculated using the codon adaptation index score with EMBOSS software and the Kazusa database. Classification of host genes according to GeneOntology was performed using Blast2GO. A non-parametric test was applied to determine the presence of significant correlations among CAI, mortality and time. Results Using the codon adaptation index (CAI) score, we found that the genes encoding the nucleoprotein, matrix protein M1 and the antagonist of interferon I signaling (NS1) are the ISAV genes most adapted to host codon usage, in agreement with their requirement for production of viral particles and inactivation of antiviral responses. Comparison to host genes showed that ISAV shares CAI values with less than 0.45% of Salmo salar genes. GeneOntology classification of host genes showed that ISAV genes share CAI values with genes from less than 3% of the host biological processes, far from the 14% shown by Influenza A viruses and closer to the 5% shown by Influenza B and C. We also identified a positive correlation (p<0.05) between the CAI values of a virus and the duration of disease outbreaks in given salmon farms, as well as a weak relationship between the codon adaptation values of PB1 and the mortality rates of a set of ISA viruses. 
Conclusions Our analysis shows that ISAV is the viral Salmo salar pathogen and Orthomyxovirus family member least adapted to host codon usage, deviating from the general behavior of host genes. This is probably due to its recent emergence among farmed salmon populations. PMID:23829271
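The CAI computation described above can be sketched concisely: each codon gets a relative adaptiveness w, its frequency divided by that of the most frequent synonymous codon in a host reference table, and the CAI of a gene is the geometric mean of w over its codons. A minimal illustration, using hypothetical frequencies for two synonymous families rather than a full Salmo salar table from the Kazusa database:

```python
from math import exp, log

# Hypothetical per-thousand codon frequencies for two synonymous families
# (a real analysis would load the full host table from the Kazusa database).
ref_freq = {
    "TTT": 17.6, "TTC": 20.3,  # Phe family
    "AAA": 24.4, "AAG": 31.9,  # Lys family
}
synonym_families = [("TTT", "TTC"), ("AAA", "AAG")]

def relative_adaptiveness(freqs, families):
    """w(codon) = codon frequency / max frequency within its synonymous family."""
    w = {}
    for fam in families:
        max_f = max(freqs[c] for c in fam)
        for c in fam:
            w[c] = freqs[c] / max_f
    return w

def cai(sequence_codons, w):
    """CAI of a gene = geometric mean of w over its codons."""
    logs = [log(w[c]) for c in sequence_codons]
    return exp(sum(logs) / len(logs))

w = relative_adaptiveness(ref_freq, synonym_families)
score = cai(["TTC", "AAG", "TTT", "AAG"], w)  # toy four-codon "gene"
```

A gene built entirely from each family's preferred codon would score 1.0; heavier use of rare codons pulls the score toward 0, which is the sense in which the study ranks ISAV genes by host adaptation.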

  6. Genomic adaptation of the ISA virus to Salmo salar codon usage.

    PubMed

    Tello, Mario; Vergara, Francisco; Spencer, Eugenio

    2013-07-05

The ISA virus (ISAV) is an Orthomyxovirus whose genome encodes at least 10 proteins. Low protein identity and lack of genetic tools have hampered the study of the molecular mechanism behind its virulence. It has been shown that viral codon usage controls several processes such as translational efficiency, folding, tuning of protein expression, antigenicity and virulence. Despite this, the possible role that adaptation to host codon usage plays in virulence and viral evolution has not been studied in ISAV. Intergenomic adaptation between viral and host genomes was calculated using the codon adaptation index score with EMBOSS software and the Kazusa database. Classification of host genes according to GeneOntology was performed using Blast2GO. A non-parametric test was applied to determine the presence of significant correlations among CAI, mortality and time. Using the codon adaptation index (CAI) score, we found that the genes encoding the nucleoprotein, matrix protein M1 and the antagonist of interferon I signaling (NS1) are the ISAV genes most adapted to host codon usage, in agreement with their requirement for production of viral particles and inactivation of antiviral responses. Comparison to host genes showed that ISAV shares CAI values with less than 0.45% of Salmo salar genes. GeneOntology classification of host genes showed that ISAV genes share CAI values with genes from less than 3% of the host biological processes, far from the 14% shown by Influenza A viruses and closer to the 5% shown by Influenza B and C. We also identified a positive correlation (p<0.05) between the CAI values of a virus and the duration of disease outbreaks in given salmon farms, as well as a weak relationship between the codon adaptation values of PB1 and the mortality rates of a set of ISA viruses. 
Our analysis shows that ISAV is the viral Salmo salar pathogen and Orthomyxovirus family member least adapted to host codon usage, deviating from the general behavior of host genes. This is probably due to its recent emergence among farmed salmon populations.

  7. Telecytology: Is it possible with smartphone images?

    PubMed

    Sahin, Davut; Hacisalihoglu, Uguray Payam; Kirimlioglu, Saime Hale

    2018-01-01

This study aimed to discuss smartphone usage in telecytology and determine intraobserver concordance between microscopic cytopathological diagnoses and diagnoses derived via static smartphone images. The study was conducted with 172 cytologic specimens. A pathologist captured static images of the cytology slides from the ocular lens of a microscope using a smartphone. The images were transferred via WhatsApp® to a cytopathologist working in another center who had made all the microscopic cytopathological diagnoses 5-27 months earlier. The cytopathologist diagnosed images on a smartphone without knowledge of their previous microscopic diagnoses. The Kappa agreement between microscopic cytopathological diagnoses and smartphone image diagnoses was determined. The average image capturing, transfer, and remote cytopathological diagnostic time for one case was 6.20 minutes. The percentage of cases whose microscopic and smartphone image diagnoses were concordant was 84.30%, and the percentage of those whose diagnoses were discordant was 15.69%. The highest Kappa agreement was observed in endoscopic ultrasound-guided fine needle aspiration (1.000), and the lowest agreement was observed in urine cytology (0.665). Patient management changed with smartphone image diagnoses in 11.04% of cases. This study showed that easy, fast, and high-quality image capturing and transfer is possible from cytology slides using smartphones. The intraobserver Kappa agreement between the microscopic cytopathological diagnoses and remote smartphone image diagnoses was high. It was found that remote diagnosis via telecytology might change patient management. The developments in smartphone camera technology and transfer software make them efficient telepathology and telecytology tools. © 2017 Wiley Periodicals, Inc.
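The Kappa agreement used in this and several other records below is Cohen's kappa: observed agreement corrected for the agreement expected by chance from each rater's label frequencies. A minimal sketch with hypothetical paired diagnoses (illustrative labels, not the study's data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa = (p_o - p_e) / (1 - p_e) for two paired label sequences."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items both raters labeled the same.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of marginal label frequencies, summed over labels.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(counts_a[lab] * counts_b[lab]
              for lab in set(rater_a) | set(rater_b)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Toy paired diagnoses: microscope vs. smartphone image (hypothetical).
micro = ["benign", "benign", "malignant", "benign", "malignant", "benign"]
phone = ["benign", "benign", "malignant", "malignant", "malignant", "benign"]
k = cohens_kappa(micro, phone)  # 5/6 raw agreement, kappa = 2/3
```

Raw percent agreement (84.30% in the study) and kappa can differ substantially when one category dominates, which is why the record reports both.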

  8. Freeware Versus Commercial Office Productivity Software

    DTIC Science & Technology

    2016-12-01

announced the launch of Google Apps for Government, adapting Google’s widely popular freeware for government agency usage. This study analyzes the proposed benefits of using freeware, specifically…

  9. ``Carbon Credits'' for Resource-Bounded Computations Using Amortised Analysis

    NASA Astrophysics Data System (ADS)

    Jost, Steffen; Loidl, Hans-Wolfgang; Hammond, Kevin; Scaife, Norman; Hofmann, Martin

    Bounding resource usage is important for a number of areas, notably real-time embedded systems and safety-critical systems. In this paper, we present a fully automatic static type-based analysis for inferring upper bounds on resource usage for programs involving general algebraic datatypes and full recursion. Our method can easily be used to bound any countable resource, without needing to revisit proofs. We apply the analysis to the important metrics of worst-case execution time, stack- and heap-space usage. Our results from several realistic embedded control applications demonstrate good matches between our inferred bounds and measured worst-case costs for heap and stack usage. For time usage we infer good bounds for one application. Where we obtain less tight bounds, this is due to the use of software floating-point libraries.

  10. Promoting Science Software Best Practices: A Scientist's Perspective (Invited)

    NASA Astrophysics Data System (ADS)

    Blanton, B. O.

    2013-12-01

Software is at the core of most modern scientific activities, and as societal awareness of, and impacts from, extreme weather, disasters, and climate and global change continue to increase, the roles that scientific software plays in analyses and decision-making are brought more to the forefront. Reproducibility of research results (particularly those that enter into the decision-making arena) and open access to the software are essential for scientific and scientists' credibility. This has been highlighted in a recent article by Joppa et al (Troubling Trends in Scientific Software Use, Science Magazine, May 2013) that describes reasons for particular software being chosen by scientists, including that the "developer is well-respected" and on "recommendation from a close colleague". This reliance on recommendation, Joppa et al conclude, is fraught with risks to both science and scientists. Scientists must frequently take software for granted, assuming that it performs as expected and advertised and that the software itself has been validated and its results verified. This is largely a consequence of how much scientific software is written and developed: in an ad hoc manner, with an inconsistent funding stream, and with little application of core software engineering best practices. Insufficient documentation, limited test cases, and code unavailability are significant barriers to informed and intelligent science software usage. This situation is exacerbated when the scientist becomes the software developer out of necessity due to resource constraints. Adoption of, and adherence to, best practices in scientific software development will substantially increase intelligent software usage and promote a sustainable evolution of the science as encoded in the software. 
We describe a typical scientist's perspective on using and developing scientific software in the context of storm surge research and forecasting applications that have real-time objectives and regulatory constraints. These include perspectives on what scientists and users of software can contribute back to the software development process, examples of successful scientist/developer interactions, and the competition between "getting it done" and "getting it done right".

  11. Internet Usage In The Fresh Produce Supply Chainin China

    NASA Astrophysics Data System (ADS)

    Xu, Xiaoxiao; Duan, Yanqing; Fu, Zetian; Liu, Xue

Although effective implementation of Internet technologies has great potential for improving efficiency and reducing wastage within the fresh produce supply chain, the situation of Internet usage by SMEs (small and medium sized enterprises) in the fresh produce supply chain is still unclear in China. As the main players, SMEs haven't been given enough attention from either academics or governments. Therefore, this research attempts to address this issue by, first, investigating the current usage of the Internet and related software by Chinese SMEs in the fresh produce supply chain, and then by identifying enablers and barriers faced by SMEs in order to draw the government's attention. As part of an EU-Asia IT&C funded project, a survey was carried out with SMEs in this industry from five major cities in China. The results reveal that in the relatively developed areas of China, SMEs in the fresh produce supply chain are rapidly adopting the Internet and software packages, but the level of adoption varies greatly and there is a significant lack of integration among the supply chain partners. Chinese SMEs are keen to embrace emerging technologies and have acted to adopt new software and tools. Given that cost of implementation is not a barrier, their concerns over legal protection and online security must be addressed for further development.

  12. Free and open-source software application for the evaluation of coronary computed tomography angiography images.

    PubMed

    Hadlich, Marcelo Souza; Oliveira, Gláucia Maria Moraes; Feijóo, Raúl A; Azevedo, Clerio F; Tura, Bernardo Rangel; Ziemer, Paulo Gustavo Portela; Blanco, Pablo Javier; Pina, Gustavo; Meira, Márcio; Souza e Silva, Nelson Albuquerque de

    2012-10-01

The standardization of images used in Medicine in 1993 was performed using the DICOM (Digital Imaging and Communications in Medicine) standard. Several tests use this standard and it is increasingly necessary to design software applications capable of handling this type of image; however, these software applications are not usually free and open-source, and this fact hinders their adjustment to the most diverse interests. To develop and validate a free and open-source software application capable of handling DICOM coronary computed tomography angiography images. We developed and tested the ImageLab software in the evaluation of 100 tests randomly selected from a database. We carried out 600 tests divided between two observers using ImageLab and another software application sold with Philips Brilliance computed tomography scanners in the evaluation of coronary lesions and plaques around the left main coronary artery (LMCA) and the anterior descending artery (ADA). To evaluate intraobserver, interobserver and intersoftware agreement, we used simple agreement and kappa statistics. The agreement observed between software applications was generally classified as substantial or almost perfect in most comparisons. The ImageLab software agreed with the Philips software in the evaluation of coronary computed tomography angiography tests, especially in patients without lesions, with lesions < 50% in the LMCA and < 70% in the ADA. The agreement for lesions > 70% in the ADA was lower, but this is also observed when the anatomical reference standard is used.

  13. Microsoft Excel Software Usage for Teaching Science and Engineering Curriculum

    ERIC Educational Resources Information Center

    Singh, Gurmukh; Siddiqui, Khalid

    2009-01-01

    In this article, our main objective is to present the use of Microsoft Software Excel 2007/2003 for teaching college and university level curriculum in science and engineering. In particular, we discuss two interesting and fascinating examples of interactive applications of Microsoft Excel targeted for undergraduate students in: 1) computational…

  14. A Comparison of the Usage of Tablet PC, Lecture Capture, and Online Homework in an Introductory Chemistry Course

    ERIC Educational Resources Information Center

    Revell, Kevin D.

    2014-01-01

    Three emerging technologies were used in a large introductory chemistry class: a tablet PC, a lecture capture and replay software program, and an online homework program. At the end of the semester, student usage of the lecture replay and online homework systems was compared to course performance as measured by course grade and by a standardized…

  15. Feasibility Study of a Rotorcraft Health and Usage Monitoring System ( HUMS): Usage and Structural Life Monitoring Evaluation

    NASA Technical Reports Server (NTRS)

    Dickson, B.; Cronkhite, J.; Bielefeld, S.; Killian, L.; Hayden, R.

    1996-01-01

The objective of this study was to evaluate two techniques, Flight Condition Recognition (FCR) and Flight Load Synthesis (FLS), for usage monitoring and to assess the potential benefits of extending the retirement intervals of life-limited components, thus reducing the operator's maintenance and replacement costs. Both techniques involve indirect determination of loads using measured flight parameters and subsequent fatigue analysis to calculate the life expended on the life-limited components. To assess the potential benefit of usage monitoring, the two usage techniques were compared to current methods of component retirement. In addition, comparisons were made with direct load measurements to assess the accuracy of the two techniques. The data used for the evaluation of the usage monitoring techniques were collected under an independent HUMS flight trial program, using a commercially available HUMS and data recording system. The usage data collected from the HUMS trial aircraft were analyzed off-line using PC-based software that included the FCR and FLS techniques. In the future, if the techniques prove feasible, usage monitoring would be incorporated into the onboard HUMS.

  16. Perceived Need and Actual Usage of the Family Support Agreement in Rural China: Results from a Nationally Representative Survey

    ERIC Educational Resources Information Center

    Chou, Rita Jing-Ann

    2011-01-01

    Purpose: The Family Support Agreement (FSA) is a voluntary but legal contract between older parents and adult children on parental support in China. As the first comprehensive empirical study on the FSA, this study aims to understand the prevalence and covariates of older parents' perceived need and actual use of this agreement. Design and…

  17. hdfscan

    Atmospheric Science Data Center

    2013-04-01

... free of charge from JPL, upon completion of a license agreement. hdfscan software consists of two components - a core hdf file ... at the Jet Propulsion Laboratory. To obtain the license agreement, go to the MISR Science Software web page, read the introductory ...

  18. VARK Learning Preferences and Mobile Anatomy Software Application Use in Pre-Clinical Chiropractic Students

    ERIC Educational Resources Information Center

    Meyer, Amanda J.; Stomski, Norman J.; Innes, Stanley I.; Armson, Anthony J.

    2016-01-01

    Ubiquitous smartphone ownership and reduced face-to-face teaching time may lead to students making greater use of mobile technologies in their learning. This is the first study to report on the prevalence of mobile gross anatomy software applications (apps) usage in pre-clinical chiropractic students and to ascertain if a relationship exists…

  19. How Does ERPsim Influence Students' Perceived Learning Outcomes in an Information Systems Course? An Empirical Study

    ERIC Educational Resources Information Center

    Chen, Liqiang; Keys, Anthony; Gaber, Donald

    2015-01-01

    It is a challenge for business students or even employees to understand business processes and enterprise software usage without involvement in real-world practices. Many business schools are using ERP software in their curriculum, aiming to expose students to real-world business practices. ERPsim is an Enterprise Resource Planning (ERP)…

  20. Why Don't All Maths Teachers Use Dynamic Geometry Software in Their Classrooms?

    ERIC Educational Resources Information Center

    Stols, Gerrit; Kriek, Jeanne

    2011-01-01

    In this exploratory study, we sought to examine the influence of mathematics teachers' beliefs on their intended and actual usage of dynamic mathematics software in their classrooms. The theory of planned behaviour (TPB), the technology acceptance model (TAM) and the innovation diffusion theory (IDT) were used to examine the influence of teachers'…

  1. Proceedings of the Workshop on Software Engineering Foundations for End-User Programming (SEEUP 2009)

    DTIC Science & Technology

    2009-11-01

interest of scientific and technical information exchange. This work is sponsored by the U.S. Department of Defense. The Software Engineering Institute is a... an interesting continuum between how many different requirements a program must satisfy: the more complex and diverse the requirements, the more... Gender differences in approaches to end-user software development have also been reported in debugging feature usage [1] and in end-user web programming

  2. NASA JPL Distributed Systems Technology (DST) Object-Oriented Component Approach for Software Inter-Operability and Reuse

    NASA Technical Reports Server (NTRS)

    Hall, Laverne; Hung, Chaw-Kwei; Lin, Imin

    2000-01-01

The purpose of this paper is to provide a description of the NASA JPL Distributed Systems Technology (DST) Section's object-oriented component approach to open inter-operable systems software development and software reuse. It will address what is meant by the term 'object component software', give an overview of the component-based development approach and how it relates to infrastructure support of software architectures and promotes reuse, enumerate the benefits of this approach, and give examples of application prototypes demonstrating its usage and advantages. Utilization of the object-oriented component technology approach for system development and software reuse will apply to several areas within JPL, and possibly across other NASA Centers.

  3. Workstation-Based Avionics Simulator to Support Mars Science Laboratory Flight Software Development

    NASA Technical Reports Server (NTRS)

    Henriquez, David; Canham, Timothy; Chang, Johnny T.; McMahon, Elihu

    2008-01-01

    The Mars Science Laboratory developed the WorkStation TestSet (WSTS) to support flight software development. The WSTS is the non-real-time flight avionics simulator that is designed to be completely software-based and run on a workstation class Linux PC. This provides flight software developers with their own virtual avionics testbed and allows device-level and functional software testing when hardware testbeds are either not yet available or have limited availability. The WSTS has successfully off-loaded many flight software development activities from the project testbeds. At the writing of this paper, the WSTS has averaged an order of magnitude more usage than the project's hardware testbeds.

  4. Software for real-time localization of baleen whale calls using directional sonobuoys: A case study on Antarctic blue whales.

    PubMed

    Miller, Brian S; Calderan, Susannah; Gillespie, Douglas; Weatherup, Graham; Leaper, Russell; Collins, Kym; Double, Michael C

    2016-03-01

    Directional frequency analysis and recording (DIFAR) sonobuoys can allow real-time acoustic localization of baleen whales for underwater tracking and remote sensing, but limited availability of hardware and software has prevented wider usage. These software limitations were addressed by developing a module in the open-source software PAMGuard. A case study is presented demonstrating that this software provides greater efficiency and accessibility than previous methods for detecting, localizing, and tracking Antarctic blue whales in real time. Additionally, this software can easily be extended to track other low and mid frequency sounds including those from other cetaceans, pinnipeds, icebergs, shipping, and seismic airguns.
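The record above concerns bearing-only localization from directional sonobuoys. The PAMGuard module itself is not reproduced here, but the underlying cross-bearing fix can be sketched as intersecting two bearing lines on a local flat east/north grid (a simplifying assumption; an operational system would also handle geodesy, bearing error, and ambiguity):

```python
import math

def cross_fix(buoy1, bearing1_deg, buoy2, bearing2_deg):
    """Intersect two bearing lines (degrees clockwise from north) drawn
    from two sonobuoy positions given as (east, north) coordinates."""
    def direction(bearing_deg):
        r = math.radians(bearing_deg)
        return (math.sin(r), math.cos(r))  # (east, north) unit vector
    (x1, y1), (dx1, dy1) = buoy1, direction(bearing1_deg)
    (x2, y2), (dx2, dy2) = buoy2, direction(bearing2_deg)
    # Solve buoy1 + t1*d1 == buoy2 + t2*d2 for t1 via 2D cross products.
    denom = dx1 * dy2 - dy1 * dx2
    if abs(denom) < 1e-12:
        return None  # bearings are parallel: no unique fix
    t1 = ((x2 - x1) * dy2 - (y2 - y1) * dx2) / denom
    return (x1 + t1 * dx1, y1 + t1 * dy1)

# A buoy at the origin hears a call bearing 045°; a second buoy 10 km
# east hears it bearing 315°: the bearings cross at (5, 5) km.
fix = cross_fix((0.0, 0.0), 45.0, (10.0, 0.0), 315.0)
```

With a single DIFAR buoy only a bearing is available; a second buoy (or a second position of a moving vessel) is what turns bearings into the kind of position fix used for tracking.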

  5. Development problem analysis of correlation leak detector’s software

    NASA Astrophysics Data System (ADS)

    Faerman, V. A.; Avramchuk, V. S.; Marukyan, V. M.

    2018-05-01

In the article, the practical application and the structure of the software for correlation leak detectors are studied and the task of its design is analyzed. The first part of the paper shows the expediency of developing correlation leak detectors for improving the operating efficiency of public utilities. The functional structure of correlation leak detectors is analyzed and the tasks of their software are defined. The second part examines several steps in the development of the software package (requirements formulation, definition of the program structure, and creation of the software concept) in the context of experience with a hardware-software prototype of a correlation leak detector.

  6. [Computer usage among primary health care physicians in the Vukovar-Srijem County].

    PubMed

    Iveković, Hrvoje

    2002-01-01

A survey was carried out aiming to identify current computer usage among primary health care physicians of the Vukovar-Srijem County. The results indicated poor knowledge and practice concerning computer usage among examinees: 58% of the responders are not aware of the possibilities of computer usage in a GP office and 82% have not had an opportunity to see software specialised for usage in GP offices. The results obtained from this survey indicate that none of the examinees use a computer during daily routine work at the GP office. Only 26% of the examinees have a computer and use it at home, mostly for text processing. The Internet is used actively by 8% of examinees. Lack of education and equipment were identified as the main obstacles in the process of introducing computers to GP offices. A positive attitude towards computer usage was identified, representing an important stimulus towards a more active role of the health centres' management in solving this problem.

  7. Analyses of requirements for computer control and data processing experiment subsystems: Image data processing system (IDAPS) software description (7094 version), volume 2

    NASA Technical Reports Server (NTRS)

    1973-01-01

A description of each of the software modules of the Image Data Processing System (IDAPS) is presented. The changes in the software modules are the result of additions to the application software of the system and an upgrade of the IBM 7094 Mod(1) computer to a 1301 disk storage configuration. Necessary information about IDAPS software is supplied to the computer programmer who desires to make changes in the software system or who desires to use portions of the software outside of the IDAPS system. Each software module is documented with: module name, purpose, usage, common block(s) description, method (algorithm of subroutine), flow diagram (if needed), subroutines called, and storage requirements.

  8. Statistical modeling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1992-01-01

    This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.

  9. 76 FR 25362 - Cooperative Research and Development Agreement: Butanol Fuel Blend Usage With Marine Outboard...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-04

... participants would identify and investigate the advantages, disadvantages, required technology enhancements... Cooperative Research and Development Agreements (CRADAs) are authorized by the Federal Technology Transfer Act of 1986 (Pub. L. 99-502, codified at 15 U.S.C. 3710(a)). A CRADA promotes the transfer of technology to the private sector...

  10. Investigating Students' Attitude and Intention to Use Social Software in Higher Institution of Learning in Malaysia

    ERIC Educational Resources Information Center

    Shittu, Ahmed Tajudeen; Basha, Kamal Madarsha; AbdulRahman, Nik Suryani Nik; Ahmad, Tunku Badariah Tunku

    2011-01-01

    Purpose: Social software usage is growing at an exponential rate among the present generation of students. Yet, there is paucity of empirical study to understand the determinant of its use in the present setting of this study. This study, therefore, seeks to investigate factors that predict students' attitudes and intentions to use this…

  11. Social network analyzer on the example of Twitter

    NASA Astrophysics Data System (ADS)

    Gorodetskaia, Mariia; Khruslova, Diana

    2017-09-01

Social networks are powerful sources of data due to their popularity. Twitter is one of the networks providing a lot of data. There is a need to collect this data for future usage in fields ranging from linguistics to SMM and marketing. The report examines existing software solutions and provides new ones. The study includes information about the software developed. Some future features are listed.

  12. Setting up a Low-Cost Lab Management System for a Multi-Purpose Computing Laboratory Using Virtualisation Technology

    ERIC Educational Resources Information Center

    Mok, Heng Ngee; Lee, Yeow Leong; Tan, Wee Kiat

    2012-01-01

    This paper describes how a generic computer laboratory equipped with 52 workstations is set up for teaching IT-related courses and other general purpose usage. The authors have successfully constructed a lab management system based on decentralised, client-side software virtualisation technology using Linux and free software tools from VMware that…

  13. Potential of the Cogex Software Platform to Replace Logbooks in Capstone Design Projects

    ERIC Educational Resources Information Center

    Foley, David; Charron, François; Plante, Jean-Sébastien

    2018-01-01

    Recent technologies are offering the power to share and grow knowledge and ideas in unprecedented ways. The CogEx software platform was developed to take advantage of the digital world with innovative ideas to support designers work in both industrial and academic contexts. This paper presents a qualitative study on the usage of CogEx during…

  14. The Ethics and Politics of Policing Plagiarism: A Qualitative Study of Faculty Views on Student Plagiarism and Turnitin®

    ERIC Educational Resources Information Center

    Bruton, Samuel; Childers, Dan

    2016-01-01

    Recently, the usage of plagiarism detection software such as Turnitin® has increased dramatically among university instructors. At the same time, academic criticism of this software's employment has also increased. We interviewed 23 faculty members from various departments at a medium-sized, public university in the southeastern US to determine…

  15. Using Mobile Platforms for Sensitive Government Business

    DTIC Science & Technology

    2013-01-01

    install call, text and data monitoring software within the corporate enclave(s), in order to meter professional usage separately. Personal usage could be...Available from: http://www.3gpp.org/ftp/Specs/archive/33_series/33.908/33908-400.zip. 27. Geiger, H. NFC Phones Raise Opportunities, Privacy And...Available from: http://www.pcworld.com/article/212192/protect_your_android_phone_with_security_apps.html. 74. German, K. Sprint offers McAfee

  16. Agreement between clinical estimation and a new quantitative analysis by Photoshop software in fundus and angiographic image variables.

    PubMed

    Ramezani, Alireza; Ahmadieh, Hamid; Azarmina, Mohsen; Soheilian, Masoud; Dehghan, Mohammad H; Mohebbi, Mohammad R

    2009-12-01

    To evaluate the validity of a new method for the quantitative analysis of fundus or angiographic images using Photoshop 7.0 (Adobe, USA) software by comparing it with clinical evaluation. Four hundred and eighteen fundus and angiographic images of diabetic patients were evaluated by three retina specialists and then analyzed computationally using Photoshop 7.0 software. Four variables were selected for comparison: amount of hard exudates (HE) on color pictures, amount of HE on red-free pictures, severity of leakage, and the size of the foveal avascular zone (FAZ). The coefficients of agreement (kappa) between the two methods in the amount of HE on color and red-free photographs were 85% (0.69) and 79% (0.59), respectively. The agreement for severity of leakage was 72% (0.46). For the evaluation of the FAZ size using the magic wand and magnetic lasso software tools, the agreement between the two methods was 54% (0.09) and 89% (0.77), respectively. Agreement in the estimation of the FAZ size by the magnetic lasso tool was excellent and was almost as good in the quantification of HE on color and on red-free images. Considering the agreement of this new technique for the measurement of variables in fundus images using Photoshop software with the clinical evaluation, this method seems to have sufficient validity to be used for the quantitative analysis of HE, leakage, and FAZ size on the angiograms of diabetic patients.
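The percent-agreement and kappa figures reported above can be reproduced with a short script. The sketch below is illustrative only: the severity grades are hypothetical, not the study's data. It computes observed (percent) agreement and Cohen's kappa for two raters assigning categorical grades:

```python
from collections import Counter

def cohen_kappa(ratings_a, ratings_b):
    """Return (percent agreement, Cohen's kappa) for two raters' categorical grades."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement: sum over categories of the product of marginal frequencies.
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    kappa = (observed - expected) / (1 - expected)
    return observed, kappa

# Hypothetical severity grades (0 = none, 1 = mild, 2 = severe) for 10 images
clinical  = [0, 1, 2, 1, 0, 2, 1, 1, 0, 2]
photoshop = [0, 1, 2, 2, 0, 2, 1, 0, 0, 2]
agreement, kappa = cohen_kappa(clinical, photoshop)
# agreement = 0.8 (80%); kappa ≈ 0.71
```

Kappa corrects raw agreement for the agreement expected by chance, which is why the 54% agreement above corresponds to a near-zero kappa of 0.09.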

  17. Developing sustainable software solutions for bioinformatics by the “Butterfly” paradigm

    PubMed Central

    Ahmed, Zeeshan; Zeeshan, Saman; Dandekar, Thomas

    2014-01-01

    Software design and sustainable software engineering are essential for the long-term development of bioinformatics software. Typical challenges in an academic environment are short-term contracts, island solutions, pragmatic approaches and loose documentation. Upcoming new challenges are big data, complex data sets, software compatibility and rapid changes in data representation. Our approach to cope with these challenges consists of iterative intertwined cycles of development (“Butterfly” paradigm) for key steps in scientific software engineering. User feedback is valued as well as software planning in a sustainable and interoperable way. Tool usage should be easy and intuitive. A middleware supports a user-friendly Graphical User Interface (GUI) as well as database/tool development independently. We validated the approach in our own software development and compared the different design paradigms in various software solutions. PMID:25383181

  18. Software for analysis of chemical mixtures--composition, occurrence, distribution, and possible toxicity

    USGS Publications Warehouse

    Scott, Jonathon C.; Skach, Kenneth A.; Toccalino, Patricia L.

    2013-01-01

    The composition, occurrence, distribution, and possible toxicity of chemical mixtures in the environment are research concerns of the U.S. Geological Survey and others. The presence of specific chemical mixtures may serve as indicators of natural phenomena or human-caused events. Chemical mixtures may also have ecological, industrial, geochemical, or toxicological effects. Chemical-mixture occurrences vary by analyte composition and concentration. Four related computer programs have been developed by the National Water-Quality Assessment Program of the U.S. Geological Survey for research of chemical-mixture compositions, occurrences, distributions, and possible toxicities. The compositions and occurrences are identified for the user-supplied data, and therefore the resultant counts are constrained by the user’s choices for the selection of chemicals, reporting limits for the analytical methods, spatial coverage, and time span for the data supplied. The distribution of chemical mixtures may be spatial, temporal, and (or) related to some other variable, such as chemical usage. Possible toxicities optionally are estimated from user-supplied benchmark data. The software for the analysis of chemical mixtures described in this report is designed to work with chemical-analysis data files retrieved from the U.S. Geological Survey National Water Information System but can also be used with appropriately formatted data from other sources. Installation and usage of the mixture software are documented. This mixture software was designed to function with minimal changes on a variety of computer-operating systems. To obtain the software described herein and other U.S. Geological Survey software, visit http://water.usgs.gov/software/.
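As a rough illustration of the mixture-composition counting such software performs, the sketch below tallies how often each combination of co-detected analytes occurs across samples. The analyte names and detection sets are hypothetical, and this is not the USGS implementation, which also handles reporting limits, benchmarks, and NWIS-formatted input:

```python
from collections import Counter
from itertools import combinations

def mixture_occurrences(samples, min_size=2):
    """Count occurrences of each analyte combination (mixture) across samples.

    samples: list of sets of analytes detected above their reporting limits.
    Returns a Counter mapping each mixture (a sorted analyte tuple) to the
    number of samples in which all of its analytes were detected together.
    """
    counts = Counter()
    for detected in samples:
        for k in range(min_size, len(detected) + 1):
            for combo in combinations(sorted(detected), k):
                counts[combo] += 1
    return counts

# Hypothetical detections in three water samples
samples = [
    {"atrazine", "nitrate"},
    {"atrazine", "nitrate", "chloroform"},
    {"nitrate", "chloroform"},
]
counts = mixture_occurrences(samples)
# ("atrazine", "nitrate") co-occurs in 2 of the 3 samples
```

As the abstract notes, the resulting counts are entirely constrained by the user's choices of chemicals and reporting limits, since only analytes present in the input sets can appear in any mixture.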

  19. Towards Archetypes-Based Software Development

    NASA Astrophysics Data System (ADS)

    Piho, Gunnar; Roost, Mart; Perkins, David; Tepandi, Jaak

    We present a framework for the archetypes-based engineering of domains, requirements and software (Archetypes-Based Software Development, ABD). An archetype is defined as a primordial object that occurs consistently and universally in business domains and in business software systems. An archetype pattern is a collaboration of archetypes. Archetypes and archetype patterns are used to capture conceptual information in domain-specific models that are utilized by ABD. The focus of ABD is on software factories - family-based development artefacts (domain-specific languages, patterns, frameworks, tools, micro processes, and others) that can be used to build the family members. We demonstrate the usage of ABD for developing laboratory information management system (LIMS) software for the Clinical and Biomedical Proteomics Group at the Leeds Institute of Molecular Medicine, University of Leeds.

  20. A survey of Canadian medical physicists: software quality assurance of in-house software.

    PubMed

    Salomons, Greg J; Kelly, Diane

    2015-01-05

    This paper reports on a survey of medical physicists who write and use in-house written software as part of their professional work. The goal of the survey was to assess the extent of in-house software usage and the desire or need for related software quality guidelines. The survey contained eight multiple-choice questions, a ranking question, and seven free text questions. The survey was sent to medical physicists associated with cancer centers across Canada. The respondents to the survey expressed interest in having guidelines to help them in their software-related work, but also demonstrated extensive skills in the area of testing, safety, and communication. These existing skills form a basis for medical physicists to establish a set of software quality guidelines.

  1. Use of Internet audience measurement data to gauge market share for online health information services.

    PubMed

    Wood, Fred B; Benson, Dennis; LaCroix, Eve-Marie; Siegel, Elliot R; Fariss, Susan

    2005-07-01

    The transition to a largely Internet and Web-based environment for dissemination of health information has changed the health information landscape and the framework for evaluation of such activities. A multidimensional evaluative approach is needed. This paper discusses one important dimension of Web evaluation-usage data. In particular, we discuss the collection and analysis of external data on website usage in order to develop a better understanding of the health information (and related US government information) market space, and to estimate the market share or relative levels of usage for National Library of Medicine (NLM) and National Institutes of Health (NIH) websites compared to other health information providers. The primary method presented is Internet audience measurement based on Web usage by external panels of users and assembled by private vendors-in this case, comScore. A secondary method discussed is Web usage based on Web log software data. The principal metrics for both methods are unique visitors and total pages downloaded per month. NLM websites (primarily MedlinePlus and PubMed) account for 55% to 80% of total NIH website usage depending on the metric used. In turn, NIH.gov top-level domain usage (inclusive of NLM) ranks second only behind WebMD in the US domestic home health information market and ranks first on a global basis. NIH.gov consistently ranks among the top three or four US government top-level domains based on global Web usage. On a site-specific basis, the top health information websites in terms of global usage appear to be WebMD, MSN Health, PubMed, Yahoo! Health, AOL Health, and MedlinePlus. Based on MedlinePlus Web log data and external Internet audience measurement data, the three most heavily used cancer-centric websites appear to be www.cancer.gov (National Cancer Institute), www.cancer.org (American Cancer Society), and www.breastcancer.org (non-profit organization).
Internet audience measurement has proven useful to NLM, with significant advantages compared to sole reliance on usage data from Web log software. Internet audience data has helped NLM better understand the relative usage of NLM and NIH websites in the intersection of the health information and US government information market sectors, which is the primary market intersector for NLM and NIH. However important, Web usage is only one dimension of a complete Web evaluation framework, and other primary research methods, such as online user surveys, usability tests, and focus groups, are also important for comprehensive evaluation that includes qualitative elements, such as user satisfaction and user friendliness, as well as quantitative indicators of website usage.

  2. Use of Internet Audience Measurement Data to Gauge Market Share for Online Health Information Services

    PubMed Central

    Benson, Dennis; LaCroix, Eve-Marie; Siegel, Elliot R; Fariss, Susan

    2005-01-01

    Background The transition to a largely Internet and Web-based environment for dissemination of health information has changed the health information landscape and the framework for evaluation of such activities. A multidimensional evaluative approach is needed. Objective This paper discusses one important dimension of Web evaluation—usage data. In particular, we discuss the collection and analysis of external data on website usage in order to develop a better understanding of the health information (and related US government information) market space, and to estimate the market share or relative levels of usage for National Library of Medicine (NLM) and National Institutes of Health (NIH) websites compared to other health information providers. Methods The primary method presented is Internet audience measurement based on Web usage by external panels of users and assembled by private vendors—in this case, comScore. A secondary method discussed is Web usage based on Web log software data. The principal metrics for both methods are unique visitors and total pages downloaded per month. Results NLM websites (primarily MedlinePlus and PubMed) account for 55% to 80% of total NIH website usage depending on the metric used. In turn, NIH.gov top-level domain usage (inclusive of NLM) ranks second only behind WebMD in the US domestic home health information market and ranks first on a global basis. NIH.gov consistently ranks among the top three or four US government top-level domains based on global Web usage. On a site-specific basis, the top health information websites in terms of global usage appear to be WebMD, MSN Health, PubMed, Yahoo! Health, AOL Health, and MedlinePlus. Based on MedlinePlus Web log data and external Internet audience measurement data, the three most heavily used cancer-centric websites appear to be www.cancer.gov (National Cancer Institute), www.cancer.org (American Cancer Society), and www.breastcancer.org (non-profit organization). 
Conclusions Internet audience measurement has proven useful to NLM, with significant advantages compared to sole reliance on usage data from Web log software. Internet audience data has helped NLM better understand the relative usage of NLM and NIH websites in the intersection of the health information and US government information market sectors, which is the primary market intersector for NLM and NIH. However important, Web usage is only one dimension of a complete Web evaluation framework, and other primary research methods, such as online user surveys, usability tests, and focus groups, are also important for comprehensive evaluation that includes qualitative elements, such as user satisfaction and user friendliness, as well as quantitative indicators of website usage. PMID:15998622

  3. The Role of Standards in Cloud-Computing Interoperability

    DTIC Science & Technology

    2012-10-01

    services are not shared outside the organization. CloudStack, Eucalyptus, HP, Microsoft, OpenStack, Ubuntu, and VMWare provide tools for building...center requirements • Developing usage models for cloud vendors • Independent IT consortium OpenStack http://www.openstack.org • Open-source...software for running private clouds • Currently consists of three core software projects: OpenStack Compute (Nova), OpenStack Object Storage (Swift

  4. 7 CFR 1710.205 - Minimum approval requirements for all load forecasts.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... electronically to RUS computer software applications. RUS will evaluate borrower load forecasts for readability...'s engineering planning documents, such as the construction work plan, incorporate consumer and usage...

  5. 7 CFR 1710.205 - Minimum approval requirements for all load forecasts.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... electronically to RUS computer software applications. RUS will evaluate borrower load forecasts for readability...'s engineering planning documents, such as the construction work plan, incorporate consumer and usage...

  6. 7 CFR 1710.205 - Minimum approval requirements for all load forecasts.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... electronically to RUS computer software applications. RUS will evaluate borrower load forecasts for readability...'s engineering planning documents, such as the construction work plan, incorporate consumer and usage...

  7. Energy optimization system

    DOEpatents

    Zhou, Zhi; de Bedout, Juan Manuel; Kern, John Michael; Biyik, Emrah; Chandra, Ramu Sharat

    2013-01-22

    A system for optimizing customer utility usage in a utility network of customer sites, each having one or more utility devices, in which customer site information is communicated between each of the customer sites and an optimization server having software for optimizing customer utility usage over one or more networks, including private and public networks. A customer site model for each of the customer sites is generated based upon the customer site information, and the customer utility usage is optimized based upon the customer site information and the customer site model. The optimization server can be hosted by an external source or within the customer site. In addition, the optimization processing can be partitioned between the customer site and an external source.

  8. A quality-refinement process for medical imaging applications.

    PubMed

    Neuhaus, J; Maleike, D; Nolden, M; Kenngott, H-G; Meinzer, H-P; Wolf, I

    2009-01-01

    To introduce and evaluate a process for refinement of software quality that is suitable for research groups. In order to avoid constraining researchers too much, the quality improvement process has to be designed carefully. The scope of this paper is to present and evaluate a process to advance quality aspects of existing research prototypes in order to make them ready for initial clinical studies. The proposed process is tailored for research environments and is therefore more lightweight than traditional quality management processes. It focuses on quality criteria that are important at the given stage of the software life cycle and emphasizes tools that automate aspects of the process. To evaluate the additional effort that comes along with the process, it was exemplarily applied to eight prototypical software modules for medical image processing. The introduced process has been applied to improve the quality of all prototypes so that they could be successfully used in clinical studies. The quality refinement required an average of 13 person-days of additional effort per project. Overall, 107 bugs were found and resolved by applying the process. Careful selection of quality criteria and the usage of automated process tools lead to a lightweight quality-refinement process suitable for scientific research groups that can be applied to ensure a successful transfer of technical software prototypes into clinical research workflows.

  9. 48 CFR 208.7402 - General.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... DEFENSE ACQUISITION PLANNING REQUIRED SOURCES OF SUPPLIES AND SERVICES Enterprise Software Agreements 208.7402 General. Departments and agencies shall fulfill requirements for commercial software and related services, such as software maintenance, in accordance with the DoD Enterprise Software Initiative (ESI...

  10. 48 CFR 208.7402 - General.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... DEFENSE ACQUISITION PLANNING REQUIRED SOURCES OF SUPPLIES AND SERVICES Enterprise Software Agreements 208.7402 General. (1) Departments and agencies shall fulfill requirements for commercial software and related services, such as software maintenance, in accordance with the DoD Enterprise Software Initiative...

  11. 48 CFR 208.7402 - General.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... DEFENSE ACQUISITION PLANNING REQUIRED SOURCES OF SUPPLIES AND SERVICES Enterprise Software Agreements 208.7402 General. Departments and agencies shall fulfill requirements for commercial software and related services, such as software maintenance, in accordance with the DoD Enterprise Software Initiative (ESI...

  12. 48 CFR 208.7402 - General.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... DEFENSE ACQUISITION PLANNING REQUIRED SOURCES OF SUPPLIES AND SERVICES Enterprise Software Agreements 208.7402 General. Departments and agencies shall fulfill requirements for commercial software and related services, such as software maintenance, in accordance with the DoD Enterprise Software Initiative (ESI...

  13. Negotiating Software Agreements: Avoid Contractual Mishaps and Get the Biggest Bang for Your Buck

    ERIC Educational Resources Information Center

    Riley, Sheila

    2006-01-01

    Purchasing software license and service agreements can be daunting for any district. Greg Lindner, director of information and technology services for the Elk Grove Unified School District in California, and Steve Midgley, program manager at the Stupski Foundation, provided several tips on contract negotiation. This article presents the tips…

  14. A comparison of two methods to assess the usage of mobile hand-held communication devices.

    PubMed

    Berolo, Sophia; Steenstra, Ivan; Amick, Benjamin C; Wells, Richard P

    2015-01-01

    The purposes of this study were to: 1) examine agreement between self-reported measures of mobile device use and direct measures of use, and 2) understand how respondents thought about their device use when they provided self-reports. Self-reports of six categories of device use were obtained using a previously developed questionnaire, and direct measures of use were collected using a custom logging application (n = 47). Bland-Altman analyses were used to examine agreement between the two measurement approaches. Interviews targeted participants' experiences completing the device use section of the questionnaire. Self-reports of use on a typical day last week overestimated logged use. Overestimates tended to be low at low average usage times, and became more variable as usage time increased. Self-reports of use yesterday also exceeded logged use; however, the degree of overestimation was less than for a typical day last week. Six themes were identified from interviews, including the thought process used by participants to arrive at usage estimates and the ease of reporting usage. It is challenging for respondents of this questionnaire to provide accurate self-reports of use. The source of this challenge may be attributed to the intrinsic difficulty of estimating use, partly due to the multiple functions of the devices as well as the variability of use both within a day and a week. Research investigating the relationship between device use and health outcomes should include a logging application to examine exposure simultaneously with self-reports to better understand the sources of hazardous exposures.
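The Bland-Altman analysis used here to compare self-reported and logged use can be sketched in a few lines: compute the mean difference (bias) between the paired measures and the 95% limits of agreement (bias ± 1.96 SD of the differences). The usage values below are hypothetical, not the study's data:

```python
import statistics

def bland_altman(measure_a, measure_b):
    """Return bias and 95% limits of agreement between two paired measures."""
    diffs = [a - b for a, b in zip(measure_a, measure_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)  # sample standard deviation of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical daily usage minutes: self-reported vs. logged
self_report = [60, 45, 120, 30, 90]
logged      = [50, 40,  95, 28, 70]
bias, (lower, upper) = bland_altman(self_report, logged)
# A positive bias indicates self-reports overestimate logged use.
```

Plotting each pair's difference against its mean (the Bland-Altman plot) would additionally reveal the pattern reported above, where overestimation grows with average usage time.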

  15. Statistical Validation for Clinical Measures: Repeatability and Agreement of Kinect™-Based Software.

    PubMed

    Lopez, Natalia; Perez, Elisa; Tello, Emanuel; Rodrigo, Alejandro; Valentinuzzi, Max E

    2018-01-01

    The rehabilitation process is a fundamental stage in the recovery of people's capabilities. However, evaluation of the process is performed by physiatrists and medical doctors mostly based on their observations, that is, a subjective appreciation of the patient's evolution. This paper proposes a platform for tracking the movement of an individual's upper limb using Kinect sensor(s), to be used with the patient during the rehabilitation process. The main contribution is the development of quantifying software and the statistical validation of its performance, repeatability, and clinical use in the rehabilitation process. The software determines joint angles and upper limb trajectories for the construction of a specific rehabilitation protocol and quantifies the treatment evolution. In turn, the information is presented via a graphical interface that allows the recording, storage, and reporting of the patient's data. For clinical purposes, the software information is statistically validated with three different methodologies, comparing the measures with a goniometer in terms of agreement and repeatability. The agreement of joint angles measured with the proposed software and a goniometer is evaluated with Bland-Altman plots; all measurements fell well within the limits of agreement, meaning interchangeability of the two techniques. Additionally, the results of Bland-Altman analysis of repeatability show 95% confidence. Finally, the physiotherapists' qualitative assessment shows encouraging results for clinical use. The main conclusion is that the software is capable of offering a clinical history of the patient and is useful for quantification of the rehabilitation success. The simplicity, low cost, and visualization possibilities enhance the use of Kinect-based software for rehabilitation and other applications, and the experts' opinion endorses the choice of our approach for clinical practice.
Comparison of the new measurement technique with established goniometric methods shows that the proposed software agrees sufficiently to be used interchangeably.
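A joint angle of the kind such software derives from Kinect skeleton data can be computed as the angle between the two segment vectors meeting at the joint. The sketch below is a minimal illustration, not the authors' implementation, and the skeleton coordinates are hypothetical:

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by 3-D points a-b-c, e.g. shoulder-elbow-wrist."""
    u = [a[i] - b[i] for i in range(3)]  # segment b -> a
    v = [c[i] - b[i] for i in range(3)]  # segment b -> c
    dot = sum(ui * vi for ui, vi in zip(u, v))
    norm = math.dist(a, b) * math.dist(c, b)
    return math.degrees(math.acos(dot / norm))

# Hypothetical Kinect skeleton coordinates (metres, camera frame)
shoulder = (0.0, 1.4, 2.0)
elbow    = (0.0, 1.1, 2.0)
wrist    = (0.3, 1.1, 2.0)
elbow_angle = joint_angle(shoulder, elbow, wrist)  # 90 degrees for these points
```

Angles computed this way per frame, across a recorded session, are what a Bland-Altman comparison against goniometer readings would validate.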

  16. Using geographic information systems to identify prospective marketing areas for a special library.

    PubMed

    McConnaughy, Rozalynd P; Wilson, Steven P

    2006-05-04

    The Center for Disability Resources (CDR) Library is the largest collection of its kind in the Southeastern United States, consisting of over 5,200 books, videos/DVDs, brochures, and audiotapes covering a variety of disability-related topics, from autism to transition resources. The purpose of the library is to support the information needs of families, faculty, students, staff, and other professionals in South Carolina working with individuals with disabilities. The CDR Library is funded on a yearly basis; therefore, maintaining high usage is crucial. A variety of promotional efforts have been used to attract new patrons to the library. Anyone in South Carolina can check out materials from the library, and most patrons use the library remotely by requesting materials, which are then mailed to them. The goal of this project was to identify areas of low geographic usage as a means of identifying locations for future library marketing efforts. Nearly four years' worth of library statistics were compiled in a spreadsheet that provided information per county on the number of checkouts, the number of renewals, and the population. Five maps were created using ArcView GIS software to provide visual representations of patron checkout and renewal behavior per county. Of the 46 counties in South Carolina, eight never checked out materials from the library. As expected, urban areas and counties near the library's physical location have high usage totals. The visual representation of the data made identification of low-usage regions easier than using a standalone database with no visual-spatial component. The low-usage counties will be the focus of future Center for Disability Resources Library marketing efforts.
Because the visual-spatial representations created with geographic information systems communicate information more efficiently than stand-alone database information can, librarians may benefit from using the software as a supplemental tool for tracking library usage and planning promotional efforts.

  17. 48 CFR 208.7402 - General.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... DEFENSE ACQUISITION PLANNING REQUIRED SOURCES OF SUPPLIES AND SERVICES Enterprise Software Agreements 208... services, such as software maintenance, in accordance with the DoD Enterprise Software Initiative (ESI... software and related services. ESI does not dictate the products or services to be acquired. ...

  18. A University Dilemma: The Use of Third Party-Owned Software.

    ERIC Educational Resources Information Center

    Hersey, Karen

    1985-01-01

    Reviews specific problems associated with software protection (copyright license, trade secret license), clauses in software license agreements that most frequently cause difficulties, and solutions university administrators and software industry should consider. Topics include confidentiality, restrictive use of modifications, use on single…

  19. Energy Tracking Software Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryan Davis; Nathan Bird; Rebecca Birx

    2011-04-04

    Acceleration has created an interactive energy tracking and visualization platform that supports decreasing electric, water, and gas usage. Homeowners have access to tools that allow them to gauge their use and track progress toward a smaller energy footprint. Real estate agents have access to consumption data, allowing for sharing a comparison with potential home buyers. Home builders have the opportunity to compare their neighborhood's energy efficiency with competitors. Home energy raters have a tool for gauging the progress of their clients after efficiency changes. And, social groups are able to help encourage members to reduce their energy bills and help their environment. EnergyIT.com is the business umbrella for all energy tracking solutions and is designed to provide information about our energy tracking software and promote sales. CompareAndConserve.com (Gainesville-Green.com) helps homeowners conserve energy through education and competition. ToolsForTenants.com helps renters factor energy usage into their housing decisions.

  20. Event visualization in ATLAS

    NASA Astrophysics Data System (ADS)

    Bianchi, R. M.; Boudreau, J.; Konstantinidis, N.; Martyniuk, A. C.; Moyse, E.; Thomas, J.; Waugh, B. M.; Yallup, D. P.; ATLAS Collaboration

    2017-10-01

    At the beginning, HEP experiments made use of photographic images both to record and store experimental data and to illustrate their findings. As the experiments evolved, they needed new ways to visualize their data. With the availability of computer graphics, software packages to display event data and the detector geometry started to be developed. Here, an overview of the usage of event display tools in HEP is presented. The case of the ATLAS experiment is then considered in more detail, and two widely used event display packages are presented, Atlantis and VP1, focusing on the software technologies they employ, as well as their strengths, differences and their usage in the experiment: from physics analysis to detector development, and from online monitoring to outreach and communication. Towards the end, the other ATLAS visualization tools are briefly presented as well. Future development plans and improvements in the ATLAS event display packages are also discussed.

  1. Impacts of object-oriented technologies: Seven years of SEL studies

    NASA Technical Reports Server (NTRS)

    Stark, Mike

    1993-01-01

    This paper examines the premise that object-oriented technology (OOT) is the most significant technology ever examined by the Software Engineering Laboratory. The evolution of the use of OOT in the Software Engineering Laboratory (SEL) 'Experience Factory' is described in terms of the SEL's original expectations, focusing on how successive generations of projects have used OOT. General conclusions are drawn on how the usage of the technology has evolved in this environment.

  2. A toolbox for developing bioinformatics software

    PubMed Central

    Potrzebowski, Wojciech; Puton, Tomasz; Rother, Magdalena; Wywial, Ewa; Bujnicki, Janusz M.

    2012-01-01

    Creating useful software is a major activity of many scientists, including bioinformaticians. Nevertheless, software development in an academic setting is often unsystematic, which can lead to problems associated with maintenance and long-term availability. Unfortunately, well-documented software development methodology is difficult to adopt, and technical measures that directly improve bioinformatic programming have not been described comprehensively. We have examined 22 software projects and have identified a set of practices for software development in an academic environment. We found them useful for planning a project, supporting the involvement of experts (e.g. experimentalists), and promoting higher quality and maintainability of the resulting programs. This article describes 12 techniques that facilitate a quick start into software engineering. We describe 3 of the 22 projects in detail and give many examples to illustrate the usage of particular techniques. We expect this toolbox to be useful for many bioinformatics programming projects and for the training of scientific programmers. PMID:21803787

  3. A survey of Canadian medical physicists: software quality assurance of in‐house software

    PubMed Central

    Kelly, Diane

    2015-01-01

    This paper reports on a survey of medical physicists who write and use in‐house written software as part of their professional work. The goal of the survey was to assess the extent of in‐house software usage and the desire or need for related software quality guidelines. The survey contained eight multiple‐choice questions, a ranking question, and seven free text questions. The survey was sent to medical physicists associated with cancer centers across Canada. The respondents to the survey expressed interest in having guidelines to help them in their software‐related work, but also demonstrated extensive skills in the area of testing, safety, and communication. These existing skills form a basis for medical physicists to establish a set of software quality guidelines. PACS number: 87.55.Qr PMID:25679168

  4. DNASynth: a software application to optimization of artificial gene synthesis

    NASA Astrophysics Data System (ADS)

    Muczyński, Jan; Nowak, Robert M.

    2017-08-01

    DNASynth is a client-server software application whose client runs in a web browser. The aim of the program is to support and optimize the synthesis of artificial genes using the Ligase Chain Reaction (LCR). With LCR it is possible to obtain a DNA strand coding for a user-defined peptide. The DNA sequence is calculated by an optimization algorithm that considers optimal codon usage, the minimal energy of secondary structures, and the minimal number of required LCRs. Additionally, the absence of sequences characteristic of a user-defined set of restriction enzymes is guaranteed. The presented software was tested on synthetic and real data.

  5. [Utility of Smartphone in Home Care Medicine - First Trial].

    PubMed

    Takeshige, Toshiyuki; Hirano, Chiho; Nakagawa, Midori; Yoshioka, Rentaro

    2015-12-01

    The use of video calls for home care can reduce anxiety and offer patients peace of mind. The most suitable terminals at facilities supporting home care have been the iPad Air and iPhone with FaceTime software. However, usage has been limited to these specific terminals. In order to eliminate the need for special terminals and software, we have developed a program customized to meet the needs of facilities, using Web Real Time Communication (WebRTC), in cooperation with the University of Aizu. With this software, video calls can accommodate a large number of home care patients.

  6. Proprietary software

    NASA Technical Reports Server (NTRS)

    Marnock, M. J.

    1971-01-01

    The protection of intellectual property by a patent, a copyright, or trade secrets is reviewed. The present and future use of computers and software are discussed, along with the governmental uses of software. The popularity of contractual agreements for sale or lease of computer programs and software services is also summarized.

  7. Multibiodose radiation emergency triage categorization software.

    PubMed

    Ainsbury, Elizabeth A; Barnard, Stephen; Barrios, Lleonard; Fattibene, Paola; de Gelder, Virginie; Gregoire, Eric; Lindholm, Carita; Lloyd, David; Nergaard, Inger; Rothkamm, Kai; Romm, Horst; Scherthan, Harry; Thierens, Hubert; Vandevoorde, Charlot; Woda, Clemens; Wojcik, Andrzej

    2014-07-01

    In this note, the authors describe the MULTIBIODOSE software, which has been created as part of the MULTIBIODOSE project. The software enables doses estimated by networks of laboratories, using up to five retrospective (biological and physical) assays, to be combined to give a single estimate of triage category for each individual potentially exposed to ionizing radiation in a large scale radiation accident or incident. The MULTIBIODOSE software has been created in Java. The usage of the software is based on the MULTIBIODOSE Guidance: the program creates a link to a single SQLite database for each incident, and the database is administered by the lead laboratory. The software has been tested with Java runtime environment 6 and 7 on a number of different Windows, Mac, and Linux systems, using data from a recent intercomparison exercise. The Java program MULTIBIODOSE_1.0.jar is freely available to download from http://www.multibiodose.eu/software or by contacting the software administrator: MULTIBIODOSE-software@gmx.com.

  8. [Product of the month: a bibliographic database with optional formatting capability].

    PubMed

    Vahlensieck, M

    1992-05-01

    The function and usage of the software package "Endnote Plus" for the Apple Macintosh are described. Its advantages in fulfilling different requirements for the citation style and the sort order of reference lists are emphasized.

  9. Nine Easy Steps to Avoiding Software Copyright Infringement.

    ERIC Educational Resources Information Center

    Gamble, Lanny R.; Anderson, Larry S.

    1989-01-01

    To avoid microcomputer software copyright infringement, administrators must be aware of the law, read the software agreements, maintain good records, submit all software registration cards, provide secure storage, post warnings, be consistent when establishing and enforcing policies, consider a site license, and ensure the legality of currently…

  10. ICCE Policy Statement on Network and Multiple Machine Software.

    ERIC Educational Resources Information Center

    International Council for Computers in Education, Eugene, OR.

    Designed to provide educators with guidance for the lawful reproduction of computer software, this document contains suggested guidelines, sample forms, and several short articles concerning software copyright and license agreements. The initial policy statement calls for educators to provide software developers (or their agents) with a…

  11. Kaiser Permanente/Sandia National health care model. Phase I prototype final report. Part 1 - model overview

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, D.; Yoshimura, A.; Butler, D.

    1996-11-01

    This report describes the results of a Cooperative Research and Development Agreement between Sandia National Laboratories and Kaiser Permanente Southern California to develop a prototype computer model of Kaiser Permanente's health care delivery system. As a discrete event simulation, SimHCO models, for each of 100,000 patients, the progression of disease, individual resource usage, and patient choices in a competitive environment. SimHCO is implemented in the object-oriented programming language C++, stressing reusable knowledge and reusable software components. The versioned implementation of SimHCO showed that the object-oriented framework allows the program to grow in complexity in an incremental way. Furthermore, timing calculations showed that SimHCO runs in a reasonable time on typical workstations, and that a second phase model will scale proportionally and run within the system constraints of contemporary computer technology. This report is published as two documents: Model Overview and Domain Analysis. A separate Kaiser-proprietary report contains the Disease and Health Care Organization Selection Models.

  12. Acquisition of Gender Agreement in Lithuanian: Exploring the Effect of Diminutive Usage in an Elicited Production Task

    ERIC Educational Resources Information Center

    Savickiene, Ineta; Kempe, Vera; Brooks, Patricia J.

    2009-01-01

    This study examines Lithuanian children's acquisition of gender agreement using an elicited production task. Lithuanian is a richly inflected Baltic language, with two genders and seven cases. Younger (N = 24, mean 3 ; 1, 2 ; 5-3 ; 8) and older (N = 24, mean 6 ; 3, 5 ; 6-6 ; 9) children were shown pictures of animals and asked to describe them…

  13. Trunk muscle activation during golf swing: Baseline and threshold.

    PubMed

    Silva, Luís; Marta, Sérgio; Vaz, João; Fernandes, Orlando; Castro, Maria António; Pezarat-Correia, Pedro

    2013-10-01

    There is a lack of studies regarding EMG temporal analysis during dynamic and complex motor tasks, such as the golf swing. The aim of this study is to analyze EMG onset during the golf swing by comparing two different threshold methods. Method A's threshold was determined using the baseline activity recorded between two maximum voluntary contractions (MVCs). Method B's threshold was calculated using the mean EMG activity over the 1000 ms window ending 500 ms before the start of the backswing. Two different clubs were also studied. Three-way repeated measures ANOVA was used to compare methods, muscles and clubs. A two-way mixed intraclass correlation coefficient (ICC) with absolute agreement was used to determine the methods' reliability. Club type showed no influence on onset detection. Rectus abdominis (RA) showed the highest agreement between methods. Erector spinae (ES), on the other hand, showed very low agreement, which might be related to postural activity before the swing. External oblique (EO) is the first to be activated, at 1295 ms prior to impact. Activation times are similar between right and left muscle sides, although the right EO showed better agreement between methods than the left. Therefore, threshold algorithm usage is task- and muscle-dependent. Copyright © 2013 Elsevier Ltd. All rights reserved.
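    Baseline-referenced onset detection of the kind described for Method A can be sketched as follows. This is an illustrative heuristic on synthetic data, not the study's exact algorithm: the mean + k·SD threshold, the moving-average envelope, and the sustained-crossing criterion are all assumptions.

    ```python
    import numpy as np

    def emg_onset(envelope, baseline, k=3.0, min_samples=25):
        """Index of the first sustained crossing of a baseline-derived
        threshold (baseline mean + k standard deviations, a common
        heuristic; the study's exact criteria may differ)."""
        thr = baseline.mean() + k * baseline.std()
        run = 0
        for i, above in enumerate(envelope >= thr):
            run = run + 1 if above else 0
            if run >= min_samples:
                return i - min_samples + 1  # start of the sustained burst
        return None

    rng = np.random.default_rng(0)
    quiet = np.abs(rng.normal(0.0, 0.05, 1000))  # rectified baseline EMG
    signal = np.abs(np.concatenate([rng.normal(0.0, 0.05, 500),
                                    rng.normal(0.0, 1.0, 500)]))  # burst from sample 500
    envelope = np.convolve(signal, np.ones(50) / 50, mode="same")  # moving-average smoothing
    onset = emg_onset(envelope, quiet)
    print(onset)
    ```

    The detected onset lands near sample 500, where the synthetic burst begins; requiring a run of supra-threshold samples rejects spurious single-sample spikes.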

  14. LBflow: An extensible lattice Boltzmann framework for the simulation of geophysical flows. Part II: usage and validation

    NASA Astrophysics Data System (ADS)

    Llewellin, E. W.

    2010-02-01

    LBflow is a flexible, extensible implementation of the lattice Boltzmann method, developed with geophysical applications in mind. The theoretical basis for LBflow, and its implementation, are presented in the companion paper, 'Part I'. This article covers the practical usage of LBflow and presents guidelines for obtaining optimal results from available computing power. The relationships among simulation resolution, accuracy, runtime and memory requirements are investigated in detail. Particular attention is paid to the origin, quantification and minimization of errors. LBflow is validated against analytical, numerical and experimental results for a range of three-dimensional flow geometries. The fluid conductance of prismatic pipes with various cross sections is calculated with LBflow and found to be in excellent agreement with published results. Simulated flow along sinusoidally constricted pipes gives good agreement with experimental data for a wide range of Reynolds number. The permeability of packs of spheres is determined and shown to be in excellent agreement with analytical results. The accuracy of internal flow patterns within the investigated geometries is also in excellent quantitative agreement with published data. The development of vortices within a sinusoidally constricted pipe with increasing Reynolds number is shown, demonstrating the insight that LBflow can offer as a 'virtual laboratory' for fluid flow.

  15. A formal MIM specification and tools for the common exchange of MIM diagrams: an XML-Based format, an API, and a validation method

    PubMed Central

    2011-01-01

    Background The Molecular Interaction Map (MIM) notation offers a standard set of symbols and rules on their usage for the depiction of cellular signaling network diagrams. Such diagrams are essential for disseminating biological information in a concise manner. A lack of software tools for the notation restricts wider usage of the notation. Development of software is facilitated by a more detailed specification regarding software requirements than has previously existed for the MIM notation. Results A formal implementation of the MIM notation was developed based on a core set of previously defined glyphs. This implementation provides a detailed specification of the properties of the elements of the MIM notation. Building upon this specification, a machine-readable format is provided as a standardized mechanism for the storage and exchange of MIM diagrams. This new format is accompanied by a Java-based application programming interface to help software developers to integrate MIM support into software projects. A validation mechanism is also provided to determine whether MIM datasets are in accordance with syntax rules provided by the new specification. Conclusions The work presented here provides key foundational components to promote software development for the MIM notation. These components will speed up the development of interoperable tools supporting the MIM notation and will aid in the translation of data stored in MIM diagrams to other standardized formats. Several projects utilizing this implementation of the notation are outlined herein. The MIM specification is available as an additional file to this publication. Source code, libraries, documentation, and examples are available at http://discover.nci.nih.gov/mim. PMID:21586134

  16. A formal MIM specification and tools for the common exchange of MIM diagrams: an XML-Based format, an API, and a validation method.

    PubMed

    Luna, Augustin; Karac, Evrim I; Sunshine, Margot; Chang, Lucas; Nussinov, Ruth; Aladjem, Mirit I; Kohn, Kurt W

    2011-05-17

    The Molecular Interaction Map (MIM) notation offers a standard set of symbols and rules on their usage for the depiction of cellular signaling network diagrams. Such diagrams are essential for disseminating biological information in a concise manner. A lack of software tools for the notation restricts wider usage of the notation. Development of software is facilitated by a more detailed specification regarding software requirements than has previously existed for the MIM notation. A formal implementation of the MIM notation was developed based on a core set of previously defined glyphs. This implementation provides a detailed specification of the properties of the elements of the MIM notation. Building upon this specification, a machine-readable format is provided as a standardized mechanism for the storage and exchange of MIM diagrams. This new format is accompanied by a Java-based application programming interface to help software developers to integrate MIM support into software projects. A validation mechanism is also provided to determine whether MIM datasets are in accordance with syntax rules provided by the new specification. The work presented here provides key foundational components to promote software development for the MIM notation. These components will speed up the development of interoperable tools supporting the MIM notation and will aid in the translation of data stored in MIM diagrams to other standardized formats. Several projects utilizing this implementation of the notation are outlined herein. The MIM specification is available as an additional file to this publication. Source code, libraries, documentation, and examples are available at http://discover.nci.nih.gov/mim.

  17. Laserdisk Directory--Part 1.

    ERIC Educational Resources Information Center

    Connolly, Bruce, Comp.

    1986-01-01

    This first installment of the four-part "Online/Database Laserdisk Directory" reports on aspects of laserdisks including: product name; product description; company name; compatibility information; type of laserdisk (compact disc read-only-memory, videodisk); software used; interface with magnetic media capability; conditions of usage;…

  18. Statistical Validation for Clinical Measures: Repeatability and Agreement of Kinect™-Based Software

    PubMed Central

    Tello, Emanuel; Rodrigo, Alejandro; Valentinuzzi, Max E.

    2018-01-01

    Background The rehabilitation process is a fundamental stage for recovery of people's capabilities. However, the evaluation of the process is performed by physiatrists and medical doctors, mostly based on their observations, that is, a subjective appreciation of the patient's evolution. This paper proposes a platform for tracking the movement of an individual's upper limb using Kinect sensor(s), to be used with patients during the rehabilitation process. The main contribution is the development of quantifying software and the statistical validation of its performance, repeatability, and clinical use in the rehabilitation process. Methods The software determines joint angles and upper limb trajectories for the construction of a specific rehabilitation protocol and quantifies the treatment evolution. In turn, the information is presented via a graphical interface that allows the recording, storage, and report of the patient's data. For clinical purposes, the software information is statistically validated with three different methodologies, comparing the measures with a goniometer in terms of agreement and repeatability. Results The agreement of joint angles measured with the proposed software and goniometer is evaluated with Bland-Altman plots; all measurements fell well within the limits of agreement, meaning interchangeability of both techniques. Additionally, the results of Bland-Altman analysis of repeatability show 95% confidence. Finally, the physiotherapists' qualitative assessment shows encouraging results for the clinical use. Conclusion The main conclusion is that the software is capable of offering a clinical history of the patient and is useful for quantification of the rehabilitation success. The simplicity, low cost, and visualization possibilities enhance the use of the Kinect-based software for rehabilitation and other applications, and the expert's opinion endorses the choice of our approach for clinical practice. 
Comparison of the new measurement technique with established goniometric methods determines that the proposed software agrees sufficiently to be used interchangeably. PMID:29750166
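    The Bland-Altman analysis used above for agreement between the Kinect-based software and the goniometer follows a standard recipe: compute the paired differences, their mean (bias), and the 95% limits of agreement at bias ± 1.96 SD. A minimal sketch with hypothetical paired joint-angle readings:

    ```python
    import numpy as np

    def bland_altman(a, b):
        """Bland-Altman statistics for two paired measurement series.

        Returns the mean difference (bias) and the 95% limits of
        agreement (bias +/- 1.96 * SD of the differences).
        """
        a, b = np.asarray(a, float), np.asarray(b, float)
        diff = a - b
        bias = diff.mean()
        sd = diff.std(ddof=1)
        return bias, bias - 1.96 * sd, bias + 1.96 * sd

    # Hypothetical paired joint-angle readings (degrees), not study data:
    kinect =     [90.2, 45.1, 120.4, 60.0, 30.3, 75.8]
    goniometer = [89.5, 44.8, 121.0, 59.2, 30.9, 75.0]
    bias, lo, hi = bland_altman(kinect, goniometer)
    print(round(bias, 2), round(lo, 2), round(hi, 2))
    ```

    Interchangeability is typically claimed when nearly all differences fall inside [lo, hi] and that interval is clinically acceptable.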

  19. Development of software for geodynamic processes monitoring system

    NASA Astrophysics Data System (ADS)

    Kabanov, M. M.; Kapustin, S. N.; Gordeev, V. F.; Botygin, I. A.; Tartakovsky, V. A.

    2017-11-01

    This article justifies the use of logging the Earth's natural pulsed electromagnetic noise for mapping anomalies in the stress-strain state of the Earth's crust. The methods and technologies for gathering, processing and systematizing data from ground-based multi-channel geophysical loggers for monitoring the geomagnetic situation have been experimentally tested, and software has been developed. The data were consolidated in network storage and can be accessed without any specialized client software. The article proposes ways to distinguish global and regional small-scale spatio-temporal variations of the Earth's natural electromagnetic field. For research purposes, the software provides a way to export data for any given period of time for any loggers and displays measurement data charts for a selected set of stations.

  20. SmartWay Mark Signature Page: Tractors & Trailers

    EPA Pesticide Factsheets

    This SmartWay agreement is for companies and organizations who wish to comply with the SmartWay Graphic Standards and Usage Guide guidelines and requirements for using the SmartWay logos on SmartWay designated Tractors and Trailers.

  1. Towards a whole-cell modeling approach for synthetic biology

    NASA Astrophysics Data System (ADS)

    Purcell, Oliver; Jain, Bonny; Karr, Jonathan R.; Covert, Markus W.; Lu, Timothy K.

    2013-06-01

    Despite rapid advances over the last decade, synthetic biology lacks the predictive tools needed to enable rational design. Unlike established engineering disciplines, the engineering of synthetic gene circuits still relies heavily on experimental trial-and-error, a time-consuming and inefficient process that slows down the biological design cycle. This reliance on experimental tuning exists because current modeling approaches are unable to make reliable predictions about the in vivo behavior of synthetic circuits. A major reason for this lack of predictability is that current models view circuits in isolation, ignoring the vast number of complex cellular processes that impinge on the dynamics of the synthetic circuit and vice versa. To address this problem, we present a modeling approach for the design of synthetic circuits in the context of cellular networks. Using the recently published whole-cell model of Mycoplasma genitalium, we examined the effect of adding genes into the host genome. We also investigated how codon usage correlates with gene expression and found agreement with existing experimental results. Finally, we successfully implemented a synthetic Goodwin oscillator in the whole-cell model. We provide an updated software framework for the whole-cell model that lays the foundation for the integration of whole-cell models with synthetic gene circuit models. This software framework is made freely available to the community to enable future extensions. We envision that this approach will be critical to transforming the field of synthetic biology into a rational and predictive engineering discipline.

  2. Application of majority voting and consensus voting algorithms in N-version software

    NASA Astrophysics Data System (ADS)

    Tsarev, R. Yu; Durmuş, M. S.; Üstoglu, I.; Morozov, V. A.

    2018-05-01

    N-version programming is one of the most common techniques used to improve the reliability of software by building in fault tolerance and redundancy and by decreasing common-cause failures. N different equivalent software versions are developed by N different and isolated workgroups from the same software specification. The versions solve the same task and return results that have to be compared to determine the correct result. The decisions of the N versions are evaluated by a voting algorithm, the so-called voter. In this paper, two of the most commonly used software voting algorithms, the majority voting algorithm and the consensus voting algorithm, are studied. The distinctive features of N-version programming with majority voting and N-version programming with consensus voting are described. These two algorithms make a decision about the correct result on the basis of the agreement matrix. However, if the equivalence relation on the agreement matrix is not satisfied, it is impossible to make a decision. It is shown that the agreement matrix can be transformed by Boolean compositions into an appropriate form in which the equivalence relation is satisfied.
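    The two voters differ in when they can decide: majority voting requires an absolute majority of the N versions to agree, while consensus voting accepts the largest agreement group even without one. A minimal sketch of these general definitions (not the paper's agreement-matrix implementation):

    ```python
    from collections import Counter

    def majority_vote(outputs):
        """Majority voting: accept a result only if more than half of
        the N versions returned it; otherwise no decision (None)."""
        winner, count = Counter(outputs).most_common(1)[0]
        return winner if count > len(outputs) / 2 else None

    def consensus_vote(outputs):
        """Consensus voting: accept the result of the largest agreement
        group, even without an absolute majority (tie -> no decision)."""
        ranked = Counter(outputs).most_common()
        if len(ranked) > 1 and ranked[0][1] == ranked[1][1]:
            return None  # tie between the largest groups
        return ranked[0][0]

    # Five versions solve the same task; two disagree:
    print(majority_vote([42, 42, 42, 7, 13]))   # 42 (3 of 5 is an absolute majority)
    print(consensus_vote([42, 42, 42, 7, 13]))  # 42

    # No absolute majority: majority voting cannot decide,
    # consensus voting still picks the largest group.
    print(majority_vote([42, 42, 7, 13, 99]))   # None
    print(consensus_vote([42, 42, 7, 13, 99]))  # 42
    ```

    The second case shows why consensus voting tolerates more disagreement: it trades a stronger decision guarantee for availability of a result.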

  3. The Satirical Value of Virtual Worlds

    ERIC Educational Resources Information Center

    Baggaley, Jon

    2010-01-01

    Imaginary worlds have been devised by artists and commentators for centuries to focus satirical attention on society's problems. The increasing sophistication of three-dimensional graphics software is generating comparable "virtual worlds" for educational usage. Can such worlds play a satirical role suggesting developments in distance…

  4. Using 3D Geometric Models to Teach Spatial Geometry Concepts.

    ERIC Educational Resources Information Center

    Bertoline, Gary R.

    1991-01-01

    An explanation of 3-D Computer Aided Design (CAD) usage to teach spatial geometry concepts using nontraditional techniques is presented. The software packages CADKEY and AutoCAD are described as well as their usefulness in solving space geometry problems. (KR)

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valassi, A.; Clemencic, M.; Dykstra, D.

    The Persistency Framework consists of three software packages (CORAL, COOL and POOL) addressing the data access requirements of the LHC experiments in different areas. It is the result of the collaboration between the CERN IT Department and the three experiments (ATLAS, CMS and LHCb) that use this software to access their data. POOL is a hybrid technology store for C++ objects, metadata catalogs and collections. CORAL is a relational database abstraction layer with an SQL-free API. COOL provides specific software tools and components for the handling of conditions data. This paper reports on the status and outlook of the project and reviews in detail the usage of each package in the three experiments.

  6. An overview of the Hadoop/MapReduce/HBase framework and its current applications in bioinformatics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, Ronald C.

    Bioinformatics researchers are increasingly confronted with analysis of ultra large-scale data sets, a problem that will only increase at an alarming rate in coming years. Recent developments in open source software, that is, the Hadoop project and associated software, provide a foundation for scaling to petabyte scale data warehouses on Linux clusters, providing fault-tolerant parallelized analysis on such data using a programming style named MapReduce. An overview is given of the current usage within the bioinformatics community of Hadoop, a top-level Apache Software Foundation project, and of associated open source software projects. The concepts behind Hadoop and the associated HBase project are defined, and current bioinformatics software that employs Hadoop is described. The focus is on next-generation sequencing, as the leading application area to date.
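    The MapReduce programming style that Hadoop parallelizes can be illustrated in-memory with the classic word count. This is a toy sketch of the three phases only; Hadoop itself distributes map, shuffle, and reduce across a fault-tolerant cluster.

    ```python
    from itertools import groupby
    from operator import itemgetter

    def map_phase(record):
        # Mapper: emit (word, 1) for each word in one input record.
        for word in record.split():
            yield (word.lower(), 1)

    def reduce_phase(word, counts):
        # Reducer: sum the counts shuffled to this key.
        return (word, sum(counts))

    records = ["GATTACA read one", "read two GATTACA", "GATTACA"]
    # Map
    pairs = [kv for r in records for kv in map_phase(r)]
    # Shuffle: group intermediate pairs by key
    pairs.sort(key=itemgetter(0))
    # Reduce
    result = dict(reduce_phase(k, (v for _, v in g))
                  for k, g in groupby(pairs, key=itemgetter(0)))
    print(result["gattaca"])  # 3
    ```

    In sequencing applications, the mapper would typically emit per-read features (e.g. k-mers) and the reducer would aggregate them, but the phase structure is the same.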

  7. An Investigation of Expert Systems Usage for Software Requirements Development in the Strategic Defense Initiative Environment.

    DTIC Science & Technology

    1986-06-10

    …the solution of the base could be the solution of the target. If expert systems are to mimic humans, then they should inherently utilize analogy. In the expert systems environment, the theory of frames for representing knowledge developed partly because humans usually solve problems by first seeing if…

  8. Enabling active and healthy ageing decision support systems with the smart collection of TV usage patterns

    PubMed Central

    Billis, Antonis S.; Batziakas, Asterios; Bratsas, Charalampos; Tsatali, Marianna S.; Karagianni, Maria

    2016-01-01

    Smart monitoring of seniors' behavioural patterns, and more specifically of activities of daily living, has attracted immense research interest in recent years. The development of smart decision support systems to support the promotion of health smart homes has also emerged, taking advantage of the plethora of smart, inexpensive and unobtrusive monitoring sensors, devices and software tools. To this end, a smart monitoring system has been used to extract meaningful information about television (TV) usage patterns and subsequently associate it with the clinical findings of experts. The smart TV operating-state remote monitoring system was installed in the homes of four elderly women and gathered data for more than 11 months. Results suggest that daily TV usage (the time the TV is turned on) can predict mental health change. In conclusion, the authors suggest that the collection of smart device usage patterns could strengthen the inference capabilities of existing health DSSs applied in uncontrolled settings such as real senior homes. PMID:27284457
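    Deriving a daily TV-on-time feature from operating-state events can be sketched as follows. The event format, and the assumption that each "on" is closed by an "off" on the same day, are illustrative; the abstract does not describe the deployed system's data model.

    ```python
    from datetime import datetime

    def daily_on_minutes(events):
        """Sum TV-on time per day from ordered (timestamp, state)
        events, where state is 'on' or 'off'. Assumes each 'on' is
        closed by an 'off' on the same day (a real implementation
        would also split intervals at midnight)."""
        totals, on_since = {}, None
        for ts, state in events:
            if state == "on":
                on_since = ts
            elif state == "off" and on_since is not None:
                day = on_since.date().isoformat()
                totals[day] = totals.get(day, 0.0) + (ts - on_since).total_seconds() / 60
                on_since = None
        return totals

    # Hypothetical operating-state log for one day:
    events = [
        (datetime(2016, 3, 1, 9, 0), "on"),
        (datetime(2016, 3, 1, 10, 30), "off"),
        (datetime(2016, 3, 1, 20, 0), "on"),
        (datetime(2016, 3, 1, 22, 0), "off"),
    ]
    print(daily_on_minutes(events))  # {'2016-03-01': 210.0}
    ```

    A time series of such daily totals is the kind of feature a decision support system could then correlate with clinical findings.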

  9. Enabling active and healthy ageing decision support systems with the smart collection of TV usage patterns.

    PubMed

    Billis, Antonis S; Batziakas, Asterios; Bratsas, Charalampos; Tsatali, Marianna S; Karagianni, Maria; Bamidis, Panagiotis D

    2016-03-01

    Smart monitoring of seniors' behavioural patterns, and more specifically of activities of daily living, has attracted immense research interest in recent years. The development of smart decision support systems to support the promotion of health smart homes has also emerged, taking advantage of the plethora of smart, inexpensive and unobtrusive monitoring sensors, devices and software tools. To this end, a smart monitoring system has been used to extract meaningful information about television (TV) usage patterns and subsequently associate it with the clinical findings of experts. The smart TV operating-state remote monitoring system was installed in the homes of four elderly women and gathered data for more than 11 months. Results suggest that daily TV usage (the time the TV is turned on) can predict mental health change. In conclusion, the authors suggest that the collection of smart device usage patterns could strengthen the inference capabilities of existing health DSSs applied in uncontrolled settings such as real senior homes.

  10. Generate Optimized Genetic Rhythm for Enzyme Expression in Non-native systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-11-03

    Most amino acids are represented by more than one codon, resulting in redundancy in the genetic code. Silent codon substitutions that do not alter the amino acid sequence still have an effect on protein expression. We have developed an algorithm, GoGREEN, to enhance the expression of foreign proteins in a host organism. GoGREEN selects codons according to frequency patterns seen in the gene of interest, using the codon usage table from the host organism. GoGREEN is also designed to accommodate gaps in the sequence. This software takes as input (1) the aligned protein sequences for the genes the user wishes to express, (2) the codon usage table for the host organism, and (3) the DNA sequence for the target protein found in the host organism. The program will select codons based on codon usage patterns for the target DNA sequence. The program will also select codons for "gaps" found in the aligned protein sequences using the codon usage table from the host organism.
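    The core usage-table lookup behind such codon selection can be sketched as follows. The three-amino-acid table is made up for illustration, and choosing the single most frequent codon is a deliberate simplification; GoGREEN's actual frequency-pattern matching and gap handling are richer.

    ```python
    # Hypothetical host codon-usage table: amino acid -> {codon: relative
    # frequency}. A real table covers all 20 amino acids and stop codons.
    USAGE = {
        "M": {"ATG": 1.00},
        "K": {"AAA": 0.74, "AAG": 0.26},
        "F": {"TTT": 0.57, "TTC": 0.43},
    }

    def back_translate(protein, usage=USAGE):
        """Back-translate a protein by picking, for each residue, the
        host's most frequent codon (simplest possible usage-table rule)."""
        codons = []
        for aa in protein:
            table = usage[aa]
            codons.append(max(table, key=table.get))
        return "".join(codons)

    print(back_translate("MKF"))  # ATGAAATTT
    ```

    Sampling codons in proportion to their frequencies, rather than always taking the maximum, is a common variant that better preserves the host's overall usage pattern.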

  11. Prospective comparison of speckle tracking longitudinal bidimensional strain between two vendors.

    PubMed

    Castel, Anne-Laure; Szymanski, Catherine; Delelis, François; Levy, Franck; Menet, Aymeric; Mailliet, Amandine; Marotte, Nathalie; Graux, Pierre; Tribouilloy, Christophe; Maréchaux, Sylvestre

    2014-02-01

    Speckle tracking is a relatively new, largely angle-independent technique used for the evaluation of myocardial longitudinal strain (LS). However, significant differences have been reported between LS values obtained by speckle tracking with the first generation of software products. To compare LS values obtained with the most recently released equipment from two manufacturers. Systematic scanning with head-to-head acquisition with no modification of the patient's position was performed in 64 patients with equipment from two different manufacturers, with subsequent off-line post-processing for speckle tracking LS assessment (Philips QLAB 9.0 and General Electric [GE] EchoPAC BT12). The interobserver variability of each software product was tested on a randomly selected set of 20 echocardiograms from the study population. GE and Philips interobserver coefficients of variation (CVs) for global LS (GLS) were 6.63% and 5.87%, respectively, indicating good reproducibility. Reproducibility was very variable for regional and segmental LS values, with CVs ranging from 7.58% to 49.21% with both software products. The concordance correlation coefficient (CCC) between GLS values was high at 0.95, indicating substantial agreement between the two methods. While good agreement was observed between midwall and apical regional strains with the two software products, basal regional strains were poorly correlated. The agreement between the two software products at a segmental level was very variable; the highest correlation was obtained for the apical cap (CCC 0.90) and the poorest for basal segments (CCC range 0.31-0.56). A high level of agreement and reproducibility for global but not for basal regional or segmental LS was found with two vendor-dependent software products. This finding may help to reinforce clinical acceptance of GLS in everyday clinical practice. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
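    The concordance correlation coefficient used above to compare GLS between the two vendors is Lin's CCC, which penalizes both poor correlation and systematic bias. A minimal sketch with hypothetical paired strain values (note that the ddof convention for the variance and covariance terms varies across implementations):

    ```python
    import numpy as np

    def lins_ccc(x, y):
        """Lin's concordance correlation coefficient:
        CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2)
        """
        x, y = np.asarray(x, float), np.asarray(y, float)
        sxy = np.cov(x, y, ddof=1)[0, 1]
        return 2 * sxy / (x.var(ddof=1) + y.var(ddof=1) + (x.mean() - y.mean()) ** 2)

    # Hypothetical paired GLS values (%) from two vendors, not study data:
    ge      = [-18.2, -20.1, -15.4, -22.0, -17.3]
    philips = [-18.0, -19.8, -15.9, -21.5, -17.6]
    ccc = lins_ccc(ge, philips)
    print(round(ccc, 3))  # 0.984
    ```

    A CCC near 1, as in this toy example, corresponds to the "substantial agreement" reported for global longitudinal strain.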

  12. 7 CFR 1485.17 - Reimbursement rules.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... shall document by a salary survey or other means, except for approved supergrades; (4) A retroactive..., machinery, removable fixtures, draperies, blinds, floor coverings, computer hardware and software, and..., including communication costs, except as noted in § 1485.17(c)(22) and except that usage costs for...

  13. The Mad Dash To Compute.

    ERIC Educational Resources Information Center

    Healy, Jane M.

    1999-01-01

    Discusses trade-offs and ramifications of technology use in schools. Cutbacks in proven staples of mental development (arts, music, drama, and physical education) are used to finance technology programs. Youngsters often use educational software for mindless fun. Few advocates consider how extended computer usage affects children's developing…

  14. The Ins and Outs of Access Control.

    ERIC Educational Resources Information Center

    Longworth, David

    1999-01-01

    Presents basic considerations when school districts plan to acquire an access-control system for their education facilities. Topics cover cards and readers, controllers, software, automation, card technology, expandability, price, specification of needs beyond the canned specifications already supplied, and proper usage training to cardholders.…

  15. Scheduling algorithms for automatic control systems for technological processes

    NASA Astrophysics Data System (ADS)

    Chernigovskiy, A. S.; Tsarev, R. Yu; Kapulin, D. V.

    2017-01-01

    The wide use of automatic process control systems, and of high-performance systems containing a number of computers (processors), creates opportunities for high-quality, fast production that increases the competitiveness of an enterprise. Exact and fast calculations, control computation, and the processing of big data arrays all require high productivity together with minimal time for data handling and delivery of results. To achieve the best times, it is necessary not only to use computing resources optimally, but also to design and develop the software so that the time gain is maximal. For this purpose, task (job or operation) scheduling techniques for multi-machine/multiprocessor systems are applied. This paper considers some basic task scheduling methods for multi-machine process control systems, highlights their advantages and disadvantages, and offers some considerations on their use when developing software for automatic process control systems.
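One family of task scheduling methods surveyed in such work is list scheduling; a minimal sketch of the Longest-Processing-Time-first heuristic for identical machines (an illustrative baseline, not a specific algorithm from the paper):

```python
import heapq

def lpt_schedule(durations, machines):
    """Longest-Processing-Time-first list scheduling: assign each task,
    longest first, to the machine that becomes free earliest.
    Returns (makespan, assignment) with assignment[task] -> machine id."""
    loads = [(0, m) for m in range(machines)]  # (current load, machine id)
    heapq.heapify(loads)
    assignment = {}
    for task in sorted(range(len(durations)), key=lambda i: -durations[i]):
        load, m = heapq.heappop(loads)   # least-loaded machine so far
        assignment[task] = m
        heapq.heappush(loads, (load + durations[task], m))
    return max(load for load, _ in loads), assignment
```

LPT is a classic approximation for minimizing makespan on identical machines (within 4/3 of optimal), which makes it a common baseline when comparing schedulers for multiprocessor control systems.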

  16. SLS-PLAN-IT: A knowledge-based blackboard scheduling system for Spacelab life sciences missions

    NASA Technical Reports Server (NTRS)

    Kao, Cheng-Yan; Lee, Seok-Hua

    1992-01-01

    The primary scheduling tool in use during the Spacelab Life Science (SLS-1) planning phase was the operations research (OR) based, tabular form Experiment Scheduling System (ESS) developed by NASA Marshall. PLAN-IT is an artificial intelligence based interactive graphic timeline editor for ESS developed by JPL. The PLAN-IT software was enhanced for use in the scheduling of Spacelab experiments to support the SLS missions. The enhanced software, the SLS-PLAN-IT System, was used to support the real-time reactive scheduling task during the SLS-1 mission. SLS-PLAN-IT is a frame-based blackboard scheduling shell which, from scheduling input, creates resource-requiring event duration objects and resource-usage duration objects. The blackboard structure keeps track of the effects of event duration objects on the resource usage objects. Various scheduling heuristics are coded in procedural form and can be invoked at any time at the user's request. The system architecture is described along with what was learned from the SLS-PLAN-IT project.
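The bookkeeping a blackboard of this kind performs, tracking how event duration objects draw down resource profiles, can be illustrated with a simple interval sweep; a hypothetical sketch, not SLS-PLAN-IT's actual data structures:

```python
def resource_conflicts(events, capacity):
    """events: list of (start, end, usage) resource-requiring intervals.
    Return the endpoint times at which total concurrent usage exceeds
    capacity, found by sweeping over all interval endpoints."""
    points = sorted({t for start, end, _ in events for t in (start, end)})
    over = []
    for t in points:
        # sum usage of every event active at time t (half-open intervals)
        load = sum(u for start, end, u in events if start <= t < end)
        if load > capacity:
            over.append((t, load))
    return over
```

A reactive scheduler can run a check like this after every timeline edit and hand the offending intervals to its repair heuristics.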

  17. 48 CFR 227.7103-7 - Use and non-disclosure agreement.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... recipient shall not, for example, enhance, decompile, disassemble, or reverse engineer the software; time... (b) of this subsection, technical data or computer software delivered to the Government with..., display, or disclose technical data subject to limited rights or computer software subject to restricted...

  18. 48 CFR 227.7103-7 - Use and non-disclosure agreement.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... recipient shall not, for example, enhance, decompile, disassemble, or reverse engineer the software; time... (b) of this subsection, technical data or computer software delivered to the Government with..., display, or disclose technical data subject to limited rights or computer software subject to restricted...

  19. 48 CFR 227.7103-7 - Use and non-disclosure agreement.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... recipient shall not, for example, enhance, decompile, disassemble, or reverse engineer the software; time... (b) of this subsection, technical data or computer software delivered to the Government with..., display, or disclose technical data subject to limited rights or computer software subject to restricted...

  20. 48 CFR 227.7103-7 - Use and non-disclosure agreement.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... recipient shall not, for example, enhance, decompile, disassemble, or reverse engineer the software; time... (b) of this subsection, technical data or computer software delivered to the Government with..., display, or disclose technical data subject to limited rights or computer software subject to restricted...

  1. Copyright: Know Your Electronic Rights!

    ERIC Educational Resources Information Center

    Valauskas, Edward J.

    1992-01-01

    Defines copyright and examines the interests of computer software publishers. Issues related to the rights of libraries in the circulation of software are discussed, including the fair use principle, software vendors' licensing agreements, and cooperation between libraries and vendors. An inset describes procedures for internal auditing of…

  2. A survey of personal digital assistant use in a sample of New Zealand doctors.

    PubMed

    Menzies, Oliver H; Thwaites, John

    2012-03-30

    To gather information about handheld computing hardware and software usage by hospital-based doctors in New Zealand (NZ). An online tool (SurveyMonkey) was used to conduct the survey from 27 June to 10 September 2010. The survey was distributed via email to all NZ District Health Boards (DHBs). There were 850 responses. About half of respondents (52%) used a personal digital assistant (PDA), 90% using it at least once daily. Usage varied greatly between DHBs (27-100%), perhaps related to institutional support. Among PDA users, the most common applications were non-clinical: Scheduler (95%), Contacts (97%), and Tasks (83%). Users felt PDAs helped considerably with organisation and time saving. Non-users reported a range of barriers to usage, cost being a large factor. Another major barrier identified by both users and non-users was lack of organisational integration and support. Half of survey respondents used a PDA. PDA usage of responders from different DHBs varied considerably. Perceived barriers to PDA use included cost and lack of institutional support. A collaborative approach between clinical leadership and Information Technology teams to address barriers may result in increased utility and usage of PDAs in the NZ health system.

  3. The effects of exercise reminder software program on office workers' perceived pain level, work performance and quality of life.

    PubMed

    Irmak, A; Bumin, G; Irmak, R

    2012-01-01

    As technology develops, computer usage in workplaces increases while an office worker's need to leave the desk to photocopy a document or to send or receive an e-mail decreases. As a result, office workers hold the same postures through long periods of keyboard usage. In recent years, several exercise reminder software programs have been developed with the intent of reducing the incidence of work-related musculoskeletal disorders. The purpose of this study is to evaluate the effectiveness of an exercise reminder software program on office workers' perceived pain level, work performance and quality of life. Thirty-nine healthy office workers agreed to take part in the study. Participants were randomly split into two groups, a control group (n = 19) and an intervention group (n = 20). A Visual Analogue Scale (VAS) evaluating perceived pain was administered to all participants at the beginning and at the end of the study. The intervention group used the program for 10 weeks. Findings showed that the control group's VAS scores remained the same, while the intervention group's VAS scores decreased significantly (p < 0.01). The results support that such exercise reminder software programs may help to reduce perceived pain among office workers. Further long-term studies with more subjects are needed to describe the effects of these programs and the mechanisms underlying them.

  4. Validation of a Custom-made Software for DQE Assessment in Mammography Digital Detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ayala-Dominguez, L.; Perez-Ponce, H.; Brandan, M. E.

    2010-12-07

    This work presents the validation of custom-made software, designed and developed in Matlab, intended for routine evaluation of detective quantum efficiency (DQE) according to the algorithms described in the IEC 62220-1-2 standard. DQE, normalized noise power spectrum (NNPS) and pre-sampling modulation transfer function (MTF) were calculated from RAW images from a GE Senographe DS (FineView disabled) and a Siemens Novation system. The calculated MTF is in close agreement with results obtained with alternative codes: MTF_tool (Maidment), an ImageJ plug-in (Perez-Ponce) and MIQuaELa (Ayala). Overall agreement better than ≈90% was found in the MTF; the largest differences were observed at frequencies close to the Nyquist limit. For the measurement of NNPS and DQE, agreement is similar to that obtained in the MTF. These results suggest that the developed software can be used with confidence for image quality assessment.
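Once the MTF and NNPS have been measured, the DQE itself is a per-frequency ratio; a hedged sketch of the IEC 62220-1 style formula (q is the incident photon fluence per unit area; the arrays below are illustrative, not measured data):

```python
def dqe(mtf, nnps, q):
    """DQE at each spatial frequency: MTF(f)^2 / (q * NNPS(f)),
    where q is the incident photon fluence (photons per mm^2) and
    NNPS is the noise power spectrum normalized by the squared
    large-area signal."""
    return [m * m / (q * w) for m, w in zip(mtf, nnps)]

# Illustrative values: an ideal detector (DQE = 1 at zero frequency)
# and its falloff as the MTF drops toward the Nyquist limit
spectrum = dqe(mtf=[1.0, 0.5], nnps=[1e-5, 1e-5], q=1e5)
```

The hard part that the validated software handles is measuring MTF (slanted edge) and NNPS (flat-field images) to standard; the final combination step is this simple.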

  5. The impact of library services in primary care trusts in NHS North West England: a large-scale retrospective quantitative study of online resource usage in relation to types of service.

    PubMed

    Bell, Katherine; Glover, Steven William; Brodie, Colin; Roberts, Anne; Gleghorn, Colette

    2009-06-01

    Within NHS North West England there are 24 primary care trusts (PCTs), all with access to different types of library services. This study aims to evaluate the impact the type of library service has on online resource usage. We conducted a large-scale retrospective quantitative study across all PCT staff in NHS NW England using Athens sessions log data. We studied the Athens log usage of 30,381 staff, with 8,273 active Athens accounts and 100,599 sessions from 1 January 2007 to 31 December 2007. In 2007, PCTs with outreach librarians achieved 43% penetration of staff with active Athens accounts compared with PCTs with their own library service (28.23%); PCTs with service level agreements (SLAs) with acute hospital library services (22.5%) and with no library service (19.68%). This pattern was also observed when we looked at the average number of Athens user sessions per person, and usage of Dialog Datastar databases and Proquest full text journal collections. Our findings have shown a correlation of e-resource usage and type of library service. Outreach librarians have proved to be an efficient model for promoting and driving up resources usage. PCTs with no library service have shown the lowest level of resource usage.

  6. Software Authority Transition through Multiple Distributors

    PubMed Central

    Han, Kyusunk; Shon, Taeshik

    2014-01-01

    The rapid growth in the use of smartphones and tablets has changed the software distribution ecosystem. The trend today is to purchase software through application stores rather than from traditional offline markets. Smartphone and tablet users can install applications easily by purchasing from the online store deployed in their device. Several systems, such as Android or PC-based OS units, allow users to install software from multiple sources. Such openness, however, can promote serious threats, including malware and illegal usage. In order to prevent such threats, several stores use online authentication techniques. These methods can, however, also present a problem whereby even licensed users cannot use their purchased application. In this paper, we discuss these issues and provide an authentication method that will make purchased applications available to the registered user at all times. PMID:25143971

  7. Software authority transition through multiple distributors.

    PubMed

    Han, Kyusunk; Shon, Taeshik

    2014-01-01

    The rapid growth in the use of smartphones and tablets has changed the software distribution ecosystem. The trend today is to purchase software through application stores rather than from traditional offline markets. Smartphone and tablet users can install applications easily by purchasing from the online store deployed in their device. Several systems, such as Android or PC-based OS units, allow users to install software from multiple sources. Such openness, however, can promote serious threats, including malware and illegal usage. In order to prevent such threats, several stores use online authentication techniques. These methods can, however, also present a problem whereby even licensed users cannot use their purchased application. In this paper, we discuss these issues and provide an authentication method that will make purchased applications available to the registered user at all times.

  8. Semiautomatic estimation of breast density with DM-Scan software.

    PubMed

    Martínez Gómez, I; Casals El Busto, M; Antón Guirao, J; Ruiz Perales, F; Llobet Azpitarte, R

    2014-01-01

    To evaluate the reproducibility of the calculation of breast density with DM-Scan software, which is based on the semiautomatic segmentation of fibroglandular tissue, and to compare it with the reproducibility of estimation by visual inspection. The study included 655 direct digital mammograms acquired using craniocaudal projections. Three experienced radiologists analyzed the density of the mammograms using DM-Scan, and the inter- and intra-observer agreement between pairs of radiologists for the Boyd and BI-RADS® scales were calculated using the intraclass correlation coefficient. The Kappa index was used to compare the inter- and intra-observer agreements with those obtained previously for visual inspection in the same set of images. For visual inspection, the mean interobserver agreement was 0.876 (95% CI: 0.873-0.879) on the Boyd scale and 0.823 (95% CI: 0.818-0.829) on the BI-RADS® scale. The mean intraobserver agreement was 0.813 (95% CI: 0.796-0.829) on the Boyd scale and 0.770 (95% CI: 0.742-0.797) on the BI-RADS® scale. For DM-Scan, the mean inter- and intra-observer agreement was 0.92, considerably higher than the agreement for visual inspection. The semiautomatic calculation of breast density using DM-Scan software is more reliable and reproducible than visual estimation and reduces the subjectivity and variability in determining breast density. Copyright © 2012 SERAM. Published by Elsevier Espana. All rights reserved.
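The Kappa index used in studies like this to compare categorical agreement can be sketched for two raters' labels; the labels below are illustrative, not the study's BI-RADS® data:

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa: observed agreement between two raters, corrected
    for the agreement expected by chance from each rater's marginal
    label frequencies. Assumes the raters do not agree by chance alone
    on every item (expected < 1)."""
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    expected = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (observed - expected) / (1 - expected)
```

Kappa of 1 indicates perfect agreement and 0 indicates agreement no better than chance, which is why it complements the intraclass correlation for categorical density scales.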

  9. The development of an artificial organic networks toolkit for LabVIEW.

    PubMed

    Ponce, Hiram; Ponce, Pedro; Molina, Arturo

    2015-03-15

    Two of the most challenging problems that scientists and researchers face when they want to experiment with new cutting-edge algorithms are the time consumed in encoding them and the difficulty of linking them with other technologies and devices. In that sense, this article introduces the artificial organic networks toolkit for LabVIEW™ (AON-TL) from the implementation point of view. The toolkit is based on the framework provided by the artificial organic networks technique, giving it the potential to add new algorithms in the future based on this technique. Moreover, the toolkit inherits both the rapid prototyping and the easy-to-use characteristics of the LabVIEW™ software (e.g., graphical programming, transparent usage of other software and devices, built-in event-driven programming for user interfaces), to make it simple for the end-user. In fact, the article describes the global architecture of the toolkit, with particular emphasis on the software implementation of the so-called artificial hydrocarbon networks algorithm. Lastly, the article includes two case studies for engineering purposes (i.e., sensor characterization) and chemistry applications (i.e., blood-brain barrier partitioning data model) to show the usage of the toolkit and the potential scalability of the artificial organic networks technique. © 2015 Wiley Periodicals, Inc.

  10. 14 CFR 1274.942 - Export licenses.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... appropriate licenses or other approvals, if required, for exports of hardware, technical data, and software... installation], where the foreign person will have access to export-controlled technical data or software. (c... software) pursuant to the exemption at 22 CFR 125.4(b)(3). The Agreement Officer or designated...

  11. Software Auditing: A New Task for U.K. Universities.

    ERIC Educational Resources Information Center

    Fletcher, Mark

    1997-01-01

    Based on a pilot project at Exeter University (Devon, England) a software audit, comparing number of copies of software installed with number of license agreements, is described. Discussion includes auditing budgets, workstation questionnaires, the scanner program which detects the hardware configuration and staff training, analysis and…
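The core comparison in such an audit, installed copies versus license counts, is a simple tally; a hypothetical sketch (package names and counts are invented):

```python
from collections import Counter

def license_shortfall(installed, licensed):
    """Compare counts of installed copies (one list entry per detected
    installation) against licensed counts; return the packages that are
    over-deployed and by how many copies."""
    found = Counter(installed)
    return {pkg: found[pkg] - licensed.get(pkg, 0)
            for pkg in found if found[pkg] > licensed.get(pkg, 0)}
```

In practice the `installed` list would come from a scanner program run on each workstation, and the `licensed` mapping from the institution's license agreements.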

  12. 48 CFR 227.7103-7 - Use and non-disclosure agreement.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... release, disclosure, or authorized use of technical data or computer software subject to special license... (b) of this subsection, technical data or computer software delivered to the Government with..., display, or disclose technical data subject to limited rights or computer software subject to restricted...

  13. Performance testing of 3D point cloud software

    NASA Astrophysics Data System (ADS)

    Varela-González, M.; González-Jorge, H.; Riveiro, B.; Arias, P.

    2013-10-01

    LiDAR systems have been used widely in recent years for many applications in the engineering field: civil engineering, cultural heritage, mining, industry and environmental engineering. One of the most important limitations of this technology is the large computational requirement involved in data processing, especially for large mobile LiDAR datasets. Several software solutions for data management are available in the market, including open source suites; however, users often lack methodologies to verify their performance properly. In this work a methodology for LiDAR software performance testing is presented and four different suites are studied: QT Modeler, VR Mesh, AutoCAD 3D Civil and the Point Cloud Library running in software developed at the University of Vigo (SITEGI). The software based on the Point Cloud Library shows better results in the loading time of the point clouds and in CPU usage. However, it is not as strong as the commercial suites in the working set and commit size tests.
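Loading-time and memory measurements of the kind used in such benchmarks can be scripted with the Python standard library; a minimal sketch using time and tracemalloc (the loader below is a stand-in, not an actual point-cloud reader, and tracemalloc sees only Python-heap allocations, not native memory):

```python
import time
import tracemalloc

def profile_load(load_fn, *args):
    """Run a data-loading callable, returning its result together with
    the wall-clock time taken and the peak Python-heap allocation."""
    tracemalloc.start()
    t0 = time.perf_counter()
    result = load_fn(*args)
    elapsed = time.perf_counter() - t0
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return result, elapsed, peak

# Stand-in "loader": build a list of fake points instead of reading LiDAR data
points, seconds, peak_bytes = profile_load(lambda n: [(i, i, i) for i in range(n)], 100_000)
```

For native-code suites, figures like working set and commit size come from OS process counters rather than the interpreter's allocator.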

  14. Software design for analysis of multichannel intracardial and body surface electrocardiograms.

    PubMed

    Potse, Mark; Linnenbank, André C; Grimbergen, Cornelis A

    2002-11-01

    Analysis of multichannel ECG recordings (body surface maps (BSMs) and intracardial maps) requires special software. We created a software package and a user interface on top of a commercial data analysis package (MATLAB) by a combination of high-level and low-level programming. Our software was created to satisfy the needs of a diverse group of researchers. It can handle a large variety of recording configurations. It allows for interactive usage through a fast and robust user interface, and batch processing for the analysis of large amounts of data. The package is user-extensible, includes routines for both common and experimental data processing tasks, and works on several computer platforms. The source code is made intelligible using software for structured documentation and is available to the users. The package is currently used by more than ten research groups analysing ECG data worldwide.

  15. Digital Learning Characteristics and Principles of Information Resources Knowledge Structuring

    ERIC Educational Resources Information Center

    Belichenko, Margarita; Davidovitch, Nitza; Kravchenko, Yuri

    2017-01-01

    Analysis of principles knowledge representation in information systems led to the necessity of improving the structuring knowledge. It is caused by the development of software component and new possibilities of information technologies. The article combines methodological aspects of structuring knowledge and effective usage of information…

  16. Facilities Management via Computer: Information at Your Fingertips.

    ERIC Educational Resources Information Center

    Hensey, Susan

    1996-01-01

    Computer-aided facilities management is a software program consisting of a relational database of facility information--such as occupancy, usage, student counts, etc.--attached to or merged with computerized floor plans. This program can integrate data with drawings, thereby allowing the development of "what if" scenarios. (MLF)

  17. Development of Computer-Based Resources for Textile Education.

    ERIC Educational Resources Information Center

    Hopkins, Teresa; Thomas, Andrew; Bailey, Mike

    1998-01-01

    Describes the production of computer-based resources for students of textiles and engineering in the United Kingdom. Highlights include funding by the Teaching and Learning Technology Programme (TLTP), courseware author/subject expert interaction, usage test and evaluation, authoring software, graphics, computer-aided design simulation, self-test…

  18. Who Goes There? Measuring Library Web Site Usage.

    ERIC Educational Resources Information Center

    Bauer, Kathleen

    2000-01-01

    Discusses how libraries can gather data on the use of their Web sites. Highlights include Web server log files, including the common log file, referrer log file, and agent log file; log file limitations; privacy concerns; and choosing log analysis software, both free and commercial. (LRW)
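Most log analysis starts with parsing entries in the common log format mentioned above; a minimal sketch with a regular expression (the sample line is illustrative):

```python
import re

# Common Log Format: host ident authuser [date] "request" status bytes
CLF = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<date>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\d+|-)'
)

def parse_clf(line):
    """Parse one common-log-format line into a dict, or None if malformed."""
    m = CLF.match(line)
    if m is None:
        return None
    rec = m.groupdict()
    # a '-' size means no body was sent
    rec['size'] = 0 if rec['size'] == '-' else int(rec['size'])
    return rec

sample = '127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326'
```

The referrer and agent logs add quoted fields at the end of each line, so the same approach extends by appending further groups to the pattern.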

  19. Closing gaps between open software and public data in a hackathon setting: User-centered software prototyping.

    PubMed

    Busby, Ben; Lesko, Matthew; Federer, Lisa

    2016-01-01

    In genomics, bioinformatics and other areas of data science, gaps exist between extant public datasets and the open-source software tools built by the community to analyze similar data types.  The purpose of biological data science hackathons is to assemble groups of genomics or bioinformatics professionals and software developers to rapidly prototype software to address these gaps.  The only two rules for the NCBI-assisted hackathons run so far are that 1) data either must be housed in public data repositories or be deposited to such repositories shortly after the hackathon's conclusion, and 2) all software comprising the final pipeline must be open-source or open-use.  Proposed topics, as well as suggested tools and approaches, are distributed to participants at the beginning of each hackathon and refined during the event.  Software, scripts, and pipelines are developed and published on GitHub, a web service providing publicly available, free-usage tiers for collaborative software development. The code resulting from each hackathon is published at https://github.com/NCBI-Hackathons/ with separate directories or repositories for each team.

  20. Beta-Testing Agreement | FNLCR Staging

    Cancer.gov

    Beta-Testing Agreements are appropriate for limited-term evaluation and applications development of new software, technology, or equipment platforms by the Frederick National Lab in collaboration with an external commercial partner. It may

  1. Extracting data from figures with software was faster, with higher interrater reliability than manual extraction.

    PubMed

    Jelicic Kadic, Antonia; Vucic, Katarina; Dosenovic, Svjetlana; Sapunar, Damir; Puljak, Livia

    2016-06-01

    To compare the speed and accuracy of graphical data extraction using manual estimation and open source software. Data points from eligible graphs/figures published in randomized controlled trials (RCTs) from 2009 to 2014 were extracted by two authors independently, both by manual estimation and with Plot Digitizer, an open source software tool. Corresponding authors of each RCT were contacted up to four times via e-mail to obtain the exact numbers that were used to create the graphs. The accuracy of each method was compared against the source data from which the original graphs were produced. Software data extraction was significantly faster, reducing extraction time by 47%. Percent agreement between the two raters was 51% for manual and 53.5% for software data extraction. Percent agreement between the raters and the original data was 66% vs. 75% for the first rater and 69% vs. 73% for the second rater, for manual and software extraction, respectively. Data extraction from figures should be conducted using software, whereas manual estimation should be avoided. Using software to extract data presented only in figures is faster and enables higher interrater reliability. Copyright © 2016 Elsevier Inc. All rights reserved.
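The percent-agreement statistic reported between raters can be computed with an optional tolerance for near-matches; a hedged sketch on made-up extracted values:

```python
def percent_agreement(a, b, tol=0.0):
    """Percentage of paired extracted values that agree within tolerance."""
    pairs = list(zip(a, b))
    return 100.0 * sum(abs(x - y) <= tol for x, y in pairs) / len(pairs)

# Hypothetical data points read off the same figure by two raters
rater1 = [12.0, 7.5, 30.2, 18.0]
rater2 = [12.0, 7.5, 30.2, 19.5]
```

The tolerance matters for digitized figures: exact matches are rare when values are read off pixels, so studies typically define agreement within a stated margin.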

  2. Use of symbolic computation in robotics education

    NASA Technical Reports Server (NTRS)

    Vira, Naren; Tunstel, Edward

    1992-01-01

    An application of symbolic computation in robotics education is described. A software package is presented which combines generality, user interaction, and user-friendliness with the systematic usage of symbolic computation and artificial intelligence techniques. The software utilizes MACSYMA, a LISP-based symbolic algebra language, to automatically generate closed-form expressions representing forward and inverse kinematics solutions, the Jacobian transformation matrices, robot pose error-compensation model equations, and the Lagrange dynamics formulation for N degree-of-freedom, open-chain robotic manipulators. The goal of such a package is to aid faculty and students in the robotics course by removing the burdensome task of mathematical manipulation. The software package has been successfully tested for its accuracy using commercially available robots.
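For a simple manipulator, the closed-form forward-kinematics expressions such a package generates reduce to a couple of trigonometric lines; a numeric sketch for a hypothetical 2-DOF planar arm (hand-derived here, not MACSYMA output):

```python
from math import cos, sin, pi

def fk_2link(theta1, theta2, l1=1.0, l2=1.0):
    """Forward kinematics of a planar 2-link arm: end-effector (x, y)
    from joint angles in radians and link lengths l1, l2."""
    x = l1 * cos(theta1) + l2 * cos(theta1 + theta2)
    y = l1 * sin(theta1) + l2 * sin(theta1 + theta2)
    return x, y
```

A symbolic system derives exactly these expressions (and their Jacobian) automatically from the manipulator's kinematic parameters, which is the manipulation burden the package removes for higher-DOF arms.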

  3. Collecting household water usage data: telephone questionnaire or diary?

    PubMed Central

    2009-01-01

    Background Quantitative Microbial Risk Assessment (QMRA), a modelling approach, is used to assess health risks. Inputs into the QMRA process include data that characterise the intensity, frequency and duration of exposure to risk(s). Data gaps for water exposure assessment include the duration and frequency of urban non-potable (non-drinking) water use. The primary objective of this study was to compare household water usage results obtained using two data collection tools, a computer assisted telephone interview (CATI) and a 7-day water activity diary, in order to assess the effect of different methodological survey approaches on derived exposure estimates. Costs and logistical aspects of each data collection tool were also examined. Methods A total of 232 households in an Australian dual reticulation scheme (where households are supplied with two grades of water through separate pipe networks) were surveyed about their water usage using both a CATI and a 7-day diary. Householders were questioned about their use of recycled water for toilet flushing, garden watering and other outdoor activities. Householders were also questioned about their water use in the laundry. Agreement between reported CATI and diary water usage responses was assessed. Results Results of this study showed that the level of agreement between CATI and diary responses was greater for more frequent water-related activities except toilet flushing and for those activities where standard durations or settings were employed. In addition, this study showed that the unit cost of diary administration was greater than for the CATI, excluding consideration of the initial selection and recruitment steps. Conclusion This study showed that it is possible to successfully 'remotely' coordinate diary completion providing that adequate instructions are given and that diary recording forms are well designed. 
In addition, good diary return rates can be achieved using a monetary incentive and the diary format allows for collective recording, rather than an individual's estimation, of household water usage. Accordingly, there is merit in further exploring the use of diaries for collection of water usage information either in combination with a mail out for recruitment, or potentially in the future with Internet-based recruitment (as household Internet uptake increases). PMID:19900290

  4. Collecting household water usage data: telephone questionnaire or diary?

    PubMed

    O'Toole, Joanne E; Sinclair, Martha I; Leder, Karin

    2009-11-09

    Quantitative Microbial Risk Assessment (QMRA), a modelling approach, is used to assess health risks. Inputs into the QMRA process include data that characterise the intensity, frequency and duration of exposure to risk(s). Data gaps for water exposure assessment include the duration and frequency of urban non-potable (non-drinking) water use. The primary objective of this study was to compare household water usage results obtained using two data collection tools, a computer assisted telephone interview (CATI) and a 7-day water activity diary, in order to assess the effect of different methodological survey approaches on derived exposure estimates. Costs and logistical aspects of each data collection tool were also examined. A total of 232 households in an Australian dual reticulation scheme (where households are supplied with two grades of water through separate pipe networks) were surveyed about their water usage using both a CATI and a 7-day diary. Householders were questioned about their use of recycled water for toilet flushing, garden watering and other outdoor activities. Householders were also questioned about their water use in the laundry. Agreement between reported CATI and diary water usage responses was assessed. Results of this study showed that the level of agreement between CATI and diary responses was greater for more frequent water-related activities except toilet flushing and for those activities where standard durations or settings were employed. In addition, this study showed that the unit cost of diary administration was greater than for the CATI, excluding consideration of the initial selection and recruitment steps. This study showed that it is possible to successfully 'remotely' coordinate diary completion providing that adequate instructions are given and that diary recording forms are well designed. 
In addition, good diary return rates can be achieved using a monetary incentive and the diary format allows for collective recording, rather than an individual's estimation, of household water usage. Accordingly, there is merit in further exploring the use of diaries for collection of water usage information either in combination with a mail out for recruitment, or potentially in the future with Internet-based recruitment (as household Internet uptake increases).

  5. Software and package applicating for network meta-analysis: A usage-based comparative study.

    PubMed

    Xu, Chang; Niu, Yuming; Wu, Junyi; Gu, Huiyun; Zhang, Chao

    2017-12-21

    To compare and analyze the characteristics and functions of software applications for network meta-analysis (NMA). PubMed, EMbase, The Cochrane Library, the official websites of Bayesian inference Using Gibbs Sampling (BUGS), Stata and R, and Google were searched to collect the software and packages for performing NMA; software and packages published up to March 2016 were included. After collecting the software, packages, and their user guides, we used the software and packages to calculate a typical example. All characteristics, functions, and computed results were compared and analyzed. Ten software tools were included, comprising both programming and non-programming software. They were developed mainly on the basis of Bayesian or frequentist theory. Most of the tools are easy to operate and master, calculate accurately, or produce excellent graphs. However, no single tool combined accurate calculation with superior graphing; this could only be achieved through the combination of two or more tools. This study suggests that users should choose the appropriate software according to their programming background, operational habits, and financial resources. A combination of BUGS and R (or Stata) can then be chosen to perform the NMA. © 2017 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.

  6. Consolidation and development roadmap of the EMI middleware

    NASA Astrophysics Data System (ADS)

    Kónya, B.; Aiftimiei, C.; Cecchi, M.; Field, L.; Fuhrmann, P.; Nilsen, J. K.; White, J.

    2012-12-01

    Scientific research communities have benefited recently from the increasing availability of computing and data infrastructures with unprecedented capabilities for large scale distributed initiatives. These infrastructures are largely defined and enabled by the middleware they deploy. One of the major issues in the current usage of research infrastructures is the need to use similar but often incompatible middleware solutions. The European Middleware Initiative (EMI) is a collaboration of the major European middleware providers ARC, dCache, gLite and UNICORE. EMI aims to: deliver a consolidated set of middleware components for deployment in EGI, PRACE and other Distributed Computing Infrastructures; extend the interoperability between grids and other computing infrastructures; strengthen the reliability of the services; establish a sustainable model to maintain and evolve the middleware; fulfil the requirements of the user communities. This paper presents the consolidation and development objectives of the EMI software stack covering the last two years. The EMI development roadmap is introduced along the four technical areas of compute, data, security and infrastructure. The compute area plan focuses on consolidation of standards and agreements through a unified interface for job submission and management, a common format for accounting, the wide adoption of GLUE schema version 2.0 and the provision of a common framework for the execution of parallel jobs. The security area is working towards a unified security model and lowering the barriers to Grid usage by allowing users to gain access with their own credentials. The data area is focusing on implementing standards to ensure interoperability with other grids and industry components and to reuse already existing clients in operating systems and open source distributions. 
One of the highlights of the infrastructure area is the consolidation of the information system services via the creation of a common information backbone.

  7. DAQ: Software Architecture for Data Acquisition in Sounding Rockets

    NASA Technical Reports Server (NTRS)

    Ahmad, Mohammad; Tran, Thanh; Nichols, Heidi; Bowles-Martinez, Jessica N.

    2011-01-01

    A multithreaded software application was developed by the Jet Propulsion Laboratory (JPL) to collect a set of correlated imagery, Inertial Measurement Unit (IMU), and GPS data for a Wallops Flight Facility (WFF) sounding rocket flight. The data set will be used to advance Terrain Relative Navigation (TRN) technology algorithms being researched at JPL. This paper describes the software architecture and the tests used to meet the timing and data rate requirements for the software used to collect the dataset. Also discussed are the challenges of using commercial off-the-shelf (COTS) flight hardware and open source software, including multiple Camera Link (C-link) based cameras, a Pentium-M based computer, and the Linux Fedora 11 operating system. Additionally, the paper describes the history of the software architecture's usage in other JPL projects and its applicability to future missions, such as cubesats, UAVs, and research planes/balloons. The human aspects of the project, especially JPL's Phaeton program, and the results of the launch are also discussed.

  8. Influencing Factors in OER Usage of Adult Learners in Korea

    ERIC Educational Resources Information Center

    Kim, Byoung Wook; Lee, Won Gyu; Lee, Byeong Rae; Shon, Jin Gon

    2015-01-01

    Open Educational Resources (OER) is terminology that refers to educational resources (content and software) distributed through the Internet, free of charge and freely accessible, expanding learning opportunities for adult learners. This terminology first appeared around 2002, although its roots can be traced to the open architecture of the…

  9. NREL Transportation Project to Reduce Fuel Usage

    Science.gov Websites

    Vehicle location and communication software was developed by NREL researchers to display a vehicle's location automatically and transmit a map of its location over the Internet, providing fleet operators with technology to track and direct vehicle fleet movements.

  10. A Flexible and Configurable Architecture for Automatic Control Remote Laboratories

    ERIC Educational Resources Information Center

    Kalúz, Martin; García-Zubía, Javier; Fikar, Miroslav; Cirka, Luboš

    2015-01-01

    In this paper, we propose a novel approach in hardware and software architecture design for implementation of remote laboratories for automatic control. In our contribution, we show the solution with flexible connectivity at back-end, providing features of multipurpose usage with different types of experimental devices, and fully configurable…

  11. WINDS: A Web-Based Intelligent Interactive Course on Data-Structures

    ERIC Educational Resources Information Center

    Sirohi, Vijayalaxmi

    2007-01-01

    The Internet has opened new ways of learning and has brought several advantages to computer-aided education. Global access, self-paced learning, asynchronous teaching, interactivity, and multimedia usage are some of these. Along with the advantages comes the challenge of designing the software using the available facilities. Integrating online…

  12. Using SimCPU in Cooperative Learning Laboratories.

    ERIC Educational Resources Information Center

    Lin, Janet Mei-Chuen; Wu, Cheng-Chih; Liu, Hsi-Jen

    1999-01-01

    Reports research findings of an experimental design in which cooperative-learning strategies were applied to closed-lab instruction of computing concepts. SimCPU, a software package specially designed for closed-lab usage was used by 171 high school students of four classes. Results showed that collaboration enhanced learning and that blending…

  13. What Chemists (or Chemistry Students) Need to Know about Computing.

    ERIC Educational Resources Information Center

    Swift, Mary L.; Zielinski, Theresa Julia

    1995-01-01

    Presents key points of an on-line conference discussion and integrates them with information from the literature. Key points included: computer as a tool for learning, study, research, and communication; hardware, software, computing concepts, and other teaching concerns; and the appropriate place for chemistry computer-usage instruction. (45…

  14. Types for Correct Concurrent API Usage

    DTIC Science & Technology

    2010-12-01

    ... unique, full ... Here g is the state guarantee and A is the current abstract state of the object referenced by r. The ⊗ symbol is called the "tensor" ... to discover resources on a heterogeneous network. Votebox is an open-source implementation of software for voting machines. The Blocking queue method ...

  15. ePortfolios Meet Social Software

    ERIC Educational Resources Information Center

    Waters, John K.

    2007-01-01

    Although a seemingly good idea, electronic portfolios have to date failed to gain significant traction in higher education. Institutions with ePortfolio implementations routinely report high numbers of accounts on their campuses, but few believe that those numbers are a meaningful reflection of actual usage. Change is in the air for the…

  16. Online Socialization through Social Software and Networks from an Educational Perspective

    ERIC Educational Resources Information Center

    Gülbahar, Yasemin

    2015-01-01

    The potential represented by the usage of Internet-based communication technologies in parallel with e-instruction is enabling learners to cooperate and collaborate throughout the world. However, an important dimension, namely the socialization of learners through online dialogues via e-mail, discussion forums, chats, blogs, wikis and virtual…

  17. Efficacy of a Virtual Teaching Assistant in an Open Laboratory Environment for Electric Circuits

    ERIC Educational Resources Information Center

    Saleheen, Firdous; Wang, Zicong; Picone, Joseph; Butz, Brian P.; Won, Chang-Hee

    2018-01-01

    In order to provide an on-demand, open electrical engineering laboratory, we developed an innovative software-based Virtual Open Laboratory Teaching Assistant (VOLTA). This web-based virtual assistant provides laboratory instructions, equipment usage videos, circuit simulation assistance, and hardware implementation diagnostics. VOLTA allows…

  18. Practical uses of SPFIT

    NASA Astrophysics Data System (ADS)

    Drouin, Brian J.

    2017-10-01

    Over twenty-five years ago, Herb Pickett introduced his quantum-mechanical fitting programs to the spectroscopic community. The utility and flexibility of the software has enabled a whole generation of spectroscopists to analyze both simple and complex spectra without having to write and compile their own code. Last year Stewart Novick provided a primer for the coming generation of users. This follow-on work will serve as a guide to intermediate and advanced usage of the software. It is meant to be used in concert with the online documentation as well as the spectral line catalog archive.

  19. Frequency Estimator Performance for a Software-Based Beacon Receiver

    NASA Technical Reports Server (NTRS)

    Zemba, Michael J.; Morse, Jacquelynne Rose; Nessel, James A.; Miranda, Felix

    2014-01-01

    As propagation terminals have evolved, their design has trended more toward a software-based approach that facilitates convenient adjustment and customization of the receiver algorithms. One potential improvement is the implementation of a frequency estimation algorithm, through which the primary frequency component of the received signal can be estimated with a much greater resolution than with a simple peak search of the FFT spectrum. To select an estimator for usage in a QV-band beacon receiver, analysis of six frequency estimators was conducted to characterize their effectiveness as they relate to beacon receiver design.
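
    The estimators compared above refine the coarse resolution of a plain FFT peak search. As an illustrative sketch (not the receiver's actual algorithm; the function names and test signal are invented), one common refinement fits a parabola through the three FFT magnitude bins around the peak:

```python
import cmath
import math

def dft_mag(x):
    # Naive O(N^2) DFT magnitude spectrum (stdlib-only; use numpy.fft in practice)
    N = len(x)
    return [abs(sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N)))
            for k in range(N)]

def estimate_freq(x, fs):
    """Estimate the dominant frequency of a real signal x (sampled at fs Hz)
    by parabolic interpolation around the FFT magnitude peak."""
    mag = dft_mag(x)
    half = mag[:len(x) // 2]                  # real signal: keep positive frequencies
    k = max(range(1, len(half) - 1), key=half.__getitem__)
    a, b, c = half[k - 1], half[k], half[k + 1]
    delta = 0.5 * (a - c) / (a - 2 * b + c)   # vertex of parabola through 3 bins
    return (k + delta) * fs / len(x)

fs, f_true = 1000.0, 123.4
sig = [math.sin(2 * math.pi * f_true * n / fs) for n in range(256)]
print(round(estimate_freq(sig, fs), 1))
```

    With a 256-point window at 1 kHz sampling, the raw bin spacing is about 3.9 Hz, while the interpolated estimate typically lands within a fraction of a bin of the true tone.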

  20. TACAN operational description for the space shuttle orbital flight test program

    NASA Technical Reports Server (NTRS)

    Hughes, C. L.; Hudock, P. J.

    1979-01-01

    The TACAN subsystems (three TACAN transponders, six antennas, a subsystem operating program, and redundancy management software in a tutorial form) are discussed, and the interaction between these subsystems and the shuttle navigation system is identified. The use of TACAN during the first space transportation system flight (STS-1) is described first, followed by a brief functional description of the TACAN hardware; the paper then covers the software units with a view to STS-1, and ends with a discussion of the shuttle usage of the TACAN data and anticipated performance.

  1. A comparative study of software programmes for cross-sectional skeletal muscle and adipose tissue measurements on abdominal computed tomography scans of rectal cancer patients.

    PubMed

    van Vugt, Jeroen L A; Levolger, Stef; Gharbharan, Arvind; Koek, Marcel; Niessen, Wiro J; Burger, Jacobus W A; Willemsen, Sten P; de Bruin, Ron W F; IJzermans, Jan N M

    2017-04-01

    The association between body composition (e.g. sarcopenia or visceral obesity) and treatment outcomes, such as survival, using single-slice computed tomography (CT)-based measurements has recently been studied in various patient groups. These studies have been conducted with different software programmes, each with their specific characteristics, of which the inter-observer, intra-observer, and inter-software correlation are unknown. Therefore, a comparative study was performed. Fifty abdominal CT scans were randomly selected from 50 different patients and independently assessed by two observers. Cross-sectional muscle area (CSMA, i.e. rectus abdominis, oblique and transverse abdominal muscles, paraspinal muscles, and the psoas muscle), visceral adipose tissue area (VAT), and subcutaneous adipose tissue area (SAT) were segmented by using standard Hounsfield unit ranges and computed for regions of interest. The inter-software, intra-observer, and inter-observer agreement for CSMA, VAT, and SAT measurements using FatSeg, OsiriX, ImageJ, and sliceOmatic were calculated using intra-class correlation coefficients (ICCs) and Bland-Altman analyses. Cohen's κ was calculated for the agreement of sarcopenia and visceral obesity assessment. The Jaccard similarity coefficient was used to compare the similarity and diversity of measurements. Bland-Altman analyses and ICC indicated that the CSMA, VAT, and SAT measurements between the different software programmes were highly comparable (ICC 0.979-1.000, P < 0.001). All programmes adequately distinguished between the presence or absence of sarcopenia (κ = 0.88-0.96 for one observer and all κ = 1.00 for all comparisons of the other observer) and visceral obesity (all κ = 1.00). Furthermore, excellent intra-observer (ICC 0.999-1.000, P < 0.001) and inter-observer (ICC 0.998-0.999, P < 0.001) agreement for all software programmes were found. 
    Accordingly, excellent Jaccard similarity coefficients were found for all comparisons (mean ≥ 0.964). FatSeg, OsiriX, ImageJ, and sliceOmatic showed excellent agreement for CSMA, VAT, and SAT measurements on abdominal CT scans. Furthermore, excellent inter-observer and intra-observer agreement was achieved. Therefore, results of studies using these different software programmes can reliably be compared. © 2016 The Authors. Journal of Cachexia, Sarcopenia and Muscle published by John Wiley & Sons Ltd on behalf of the Society on Sarcopenia, Cachexia and Wasting Disorders.
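
    Cohen's κ, used above to quantify agreement on sarcopenia and visceral obesity status, is straightforward to compute from two programmes' binary classifications. A minimal stdlib-only sketch with invented toy labels (not the study's data):

```python
def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two raters' categorical labels (e.g. sarcopenia
    yes/no as assessed via two software programmes)."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    cats = sorted(set(labels_a) | set(labels_b))
    p_obs = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected chance agreement from each rater's marginal frequencies
    p_exp = sum((labels_a.count(c) / n) * (labels_b.count(c) / n) for c in cats)
    return (p_obs - p_exp) / (1 - p_exp)

# Toy data: 10 scans classified sarcopenic (1) / not (0) by two programmes
a = [1, 0, 0, 1, 1, 0, 0, 1, 0, 0]
b = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]
print(round(cohens_kappa(a, b), 2))  # → 0.78
```

    A κ of 1.00, as reported for most comparisons in the study, means the two programmes classified every scan identically.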

  2. A comparative study of software programmes for cross‐sectional skeletal muscle and adipose tissue measurements on abdominal computed tomography scans of rectal cancer patients

    PubMed Central

    Levolger, Stef; Gharbharan, Arvind; Koek, Marcel; Niessen, Wiro J.; Burger, Jacobus W.A.; Willemsen, Sten P.; de Bruin, Ron W.F.

    2016-01-01

    Abstract Background The association between body composition (e.g. sarcopenia or visceral obesity) and treatment outcomes, such as survival, using single‐slice computed tomography (CT)‐based measurements has recently been studied in various patient groups. These studies have been conducted with different software programmes, each with their specific characteristics, of which the inter‐observer, intra‐observer, and inter‐software correlation are unknown. Therefore, a comparative study was performed. Methods Fifty abdominal CT scans were randomly selected from 50 different patients and independently assessed by two observers. Cross‐sectional muscle area (CSMA, i.e. rectus abdominis, oblique and transverse abdominal muscles, paraspinal muscles, and the psoas muscle), visceral adipose tissue area (VAT), and subcutaneous adipose tissue area (SAT) were segmented by using standard Hounsfield unit ranges and computed for regions of interest. The inter‐software, intra‐observer, and inter‐observer agreement for CSMA, VAT, and SAT measurements using FatSeg, OsiriX, ImageJ, and sliceOmatic were calculated using intra‐class correlation coefficients (ICCs) and Bland–Altman analyses. Cohen's κ was calculated for the agreement of sarcopenia and visceral obesity assessment. The Jaccard similarity coefficient was used to compare the similarity and diversity of measurements. Results Bland–Altman analyses and ICC indicated that the CSMA, VAT, and SAT measurements between the different software programmes were highly comparable (ICC 0.979–1.000, P < 0.001). All programmes adequately distinguished between the presence or absence of sarcopenia (κ = 0.88–0.96 for one observer and all κ = 1.00 for all comparisons of the other observer) and visceral obesity (all κ = 1.00). Furthermore, excellent intra‐observer (ICC 0.999–1.000, P < 0.001) and inter‐observer (ICC 0.998–0.999, P < 0.001) agreement for all software programmes were found. 
    Accordingly, excellent Jaccard similarity coefficients were found for all comparisons (mean ≥ 0.964). Conclusions FatSeg, OsiriX, ImageJ, and sliceOmatic showed excellent agreement for CSMA, VAT, and SAT measurements on abdominal CT scans. Furthermore, excellent inter‐observer and intra‐observer agreement was achieved. Therefore, results of studies using these different software programmes can reliably be compared. PMID:27897414

  3. CWA 15793 2011 Planning and Implementation Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gross, Alan; Nail, George

    This software, built on an open source platform called Electron (runs on Chromium and Node.js), is designed to assist organizations in the implementation of a biorisk management system consistent with the requirements of the international, publicly available guidance document CEN Workshop Agreement 15793:2011 (CWA 15793). The software includes tools for conducting organizational gap analysis against CWA 15793 requirements, planning tools to support the implementation of CWA 15793 requirements, and performance monitoring support. The gap analysis questions are based on the text of CWA 15793 and its associated guidance document, CEN Workshop Agreement 16393:2012. The authors have secured permission from the publisher of CWA 15793, the European Committee for Standardization (CEN), to use language from the document in the software, with the understanding that the software will be made available freely, without charge.

  4. Science Gateways, Scientific Workflows and Open Community Software

    NASA Astrophysics Data System (ADS)

    Pierce, M. E.; Marru, S.

    2014-12-01

    Science gateways and scientific workflows occupy different ends of the spectrum of user-focused cyberinfrastructure. Gateways, sometimes called science portals, provide a way for enabling large numbers of users to take advantage of advanced computing resources (supercomputers, advanced storage systems, science clouds) by providing Web and desktop interfaces and supporting services. Scientific workflows, at the other end of the spectrum, support advanced usage of cyberinfrastructure that enables "power users" to undertake computational experiments that are not easily done through the usual mechanisms (managing simulations across multiple sites, for example). Despite these different target communities, gateways and workflows share many similarities and can potentially be accommodated by the same software system. For example, pipelines to process InSAR imagery sets or to datamine GPS time series data are workflows. The results and the ability to make downstream products may be made available through a gateway, and power users may want to provide their own custom pipelines. In this abstract, we discuss our efforts to build an open source software system, Apache Airavata, that can accommodate both gateway and workflow use cases. Our approach is general, and we have applied the software to problems in a number of scientific domains. In this talk, we discuss our applications to usage scenarios specific to earth science, focusing on earthquake physics examples drawn from the QuakSim.org and GeoGateway.org efforts. We also examine the role of the Apache Software Foundation's open community model as a way to build up common community codes that do not depend upon a single "owner" to sustain. Pushing beyond open source software, we also see the need to provide gateways and workflow systems as cloud services. These services centralize operations, provide well-defined programming interfaces, scale elastically, and have global-scale fault tolerance. 
We discuss our work providing Apache Airavata as a hosted service to provide these features.

  5. Comprehensive efficiency analysis of supercomputer resource usage based on system monitoring data

    NASA Astrophysics Data System (ADS)

    Mamaeva, A. A.; Shaykhislamov, D. I.; Voevodin, Vad V.; Zhumatiy, S. A.

    2018-03-01

    One of the main problems of modern supercomputers is the low efficiency of their usage, which leads to significant idle time of computational resources and, in turn, to a decrease in the speed of scientific research. This paper presents three approaches to studying the efficiency of supercomputer resource usage based on monitoring data analysis. The first approach analyzes computing resource utilization statistics, which makes it possible to identify typical classes of programs, explore the structure of the supercomputer job flow, and track overall trends in supercomputer behavior. The second approach is aimed specifically at analyzing off-the-shelf software packages and libraries installed on the supercomputer, since the efficiency of their usage is becoming an increasingly important factor for the efficient functioning of the entire supercomputer. Within the third approach, abnormal jobs (jobs with abnormally inefficient behavior that differs significantly from the standard behavior of the overall supercomputer job flow) are detected. For each approach, the results obtained in practice at the Supercomputer Center of Moscow State University are demonstrated.
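
    The third approach (detecting abnormal jobs) can be illustrated with a deliberately simplified rule: flag any job whose efficiency metric lies far from the mean of the job flow. The z-score threshold and the utilisation figures below are invented; the paper's detector is more elaborate:

```python
import math

def abnormal_jobs(metrics, z_thresh=2.5):
    """Return indices of jobs whose metric deviates strongly from the job flow,
    using a simple population z-score rule (an illustrative stand-in only)."""
    n = len(metrics)
    mean = sum(metrics) / n
    std = math.sqrt(sum((m - mean) ** 2 for m in metrics) / n)
    return [i for i, m in enumerate(metrics)
            if std and abs(m - mean) / std > z_thresh]

# Toy CPU-utilisation fractions for a batch of jobs; job 5 is nearly idle
util = [0.82, 0.79, 0.85, 0.80, 0.78, 0.02, 0.81, 0.83, 0.79, 0.84]
print(abnormal_jobs(util))  # → [5]
```

    In practice such detectors work on many monitored metrics at once (CPU load, memory, network, I/O) rather than a single utilisation number.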

  6. Opportunistic Resource Usage in CMS

    NASA Astrophysics Data System (ADS)

    Kreuzer, Peter; Hufnagel, Dirk; Dykstra, D.; Gutsche, O.; Tadel, M.; Sfiligoi, I.; Letts, J.; Wuerthwein, F.; McCrea, A.; Bockelman, B.; Fajardo, E.; Linares, L.; Wagner, R.; Konstantinov, P.; Blumenfeld, B.; Bradley, D.; Cms Collaboration

    2014-06-01

    CMS is using a tiered setup of dedicated computing resources provided by sites distributed over the world and organized in WLCG. These sites pledge resources to CMS and are preparing them especially for CMS to run the experiment's applications. But there are more resources available opportunistically both on the GRID and in local university and research clusters which can be used for CMS applications. We will present CMS' strategy to use opportunistic resources and prepare them dynamically to run CMS applications. CMS is able to run its applications on resources that can be reached through the GRID, through EC2 compliant cloud interfaces. Even resources that can be used through ssh login nodes can be harnessed. All of these usage modes are integrated transparently into the GlideIn WMS submission infrastructure, which is the basis of CMS' opportunistic resource usage strategy. Technologies like Parrot to mount the software distribution via CVMFS and xrootd for access to data and simulation samples via the WAN are used and will be described. We will summarize the experience with opportunistic resource usage and give an outlook for the restart of LHC data taking in 2015.

  7. Research on an expert system for database operation of simulation-emulation math models. Volume 1, Phase 1: Results

    NASA Technical Reports Server (NTRS)

    Kawamura, K.; Beale, G. O.; Schaffer, J. D.; Hsieh, B. J.; Padalkar, S.; Rodriguez-Moscoso, J. J.

    1985-01-01

    The results of the first phase of Research on an Expert System for Database Operation of Simulation/Emulation Math Models are described. Techniques from artificial intelligence (AI) were brought to bear on task domains of interest to NASA Marshall Space Flight Center. One such domain is simulation of spacecraft attitude control systems. Two related software systems were developed and delivered to NASA. One was a generic simulation model for spacecraft attitude control, written in FORTRAN. The second was an expert system which understands the usage of a class of spacecraft attitude control simulation software and can assist the user in running the software. This NASA Expert Simulation System (NESS), written in LISP, contains general knowledge about digital simulation, specific knowledge about the simulation software, and self knowledge.

  8. On the use and the performance of software reliability growth models

    NASA Technical Reports Server (NTRS)

    Keiller, Peter A.; Miller, Douglas R.

    1991-01-01

    We address the problem of predicting future failures for a piece of software. The number of failures occurring during a finite future time interval is predicted from the number of failures observed during an initial period of usage by using software reliability growth models. Two different methods for using the models are considered: straightforward use of individual models, and dynamic selection among models based on goodness-of-fit and quality-of-prediction criteria. Performance is judged by the error of the predicted number of failures over future finite time intervals relative to the number of failures eventually observed during those intervals. Six of the former models and eight of the latter are evaluated, based on their performance on twenty data sets. Many open questions remain regarding the use and the performance of software reliability growth models.
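
    As a concrete, hypothetical illustration of the "straightforward use of an individual model", the sketch below fits the classic Goel-Okumoto growth model m(t) = a(1 - e^(-bt)) to invented cumulative failure counts by a crude least-squares grid search, then predicts the failures expected in a future interval. Real analyses use maximum-likelihood fits of the specific models the paper evaluates:

```python
import math

def go_mean(t, a, b):
    # Goel-Okumoto mean value function: expected cumulative failures by time t
    return a * (1.0 - math.exp(-b * t))

def fit_go(times, counts):
    """Crude least-squares grid search for the Goel-Okumoto parameters
    (a toy stand-in for maximum-likelihood fitting)."""
    a_grid = range(int(max(counts)), 3 * int(max(counts)))
    b_grid = [i / 1000.0 for i in range(1, 400)]
    best = min((sum((go_mean(t, a, b) - c) ** 2 for t, c in zip(times, counts)), a, b)
               for a in a_grid for b in b_grid)
    return best[1], best[2]

# Cumulative failures observed at the end of each week of usage (invented data)
weeks = [1, 2, 3, 4, 5, 6, 7, 8]
counts = [12, 21, 28, 34, 38, 41, 44, 46]
a, b = fit_go(weeks, counts)
# Predicted number of additional failures over the next four weeks
pred = go_mean(12, a, b) - go_mean(8, a, b)
print(round(pred, 1))
```

    The fitted parameter a estimates the total number of failures that would eventually be observed, so a - counts[-1] is the predicted residual failure content.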

  9. Beta-Testing Agreement | Frederick National Laboratory for Cancer Research

    Cancer.gov

    Beta-Testing Agreements are appropriate for limited-term evaluation and applications development of new software, technology, or equipment platforms by the Frederick National Laboratory in collaboration with an external commercial partner.

  10. Do We Really Know What Makes Educational Software Effective? A Call for Empirical Research on Effectiveness.

    ERIC Educational Resources Information Center

    Jolicoeur, Karen; Berger, Dale E.

    1986-01-01

    Examination of methods used by two software review services in evaluating microcomputer courseware--EPIE (Educational Products Information Exchange) and MicroSIFT (Microcomputer Software and Information for Teachers)--found low correlations between their recommendations for 82 programs. This lack of agreement casts doubts on the usefulness of…

  11. TADSim: Discrete Event-based Performance Prediction for Temperature Accelerated Dynamics

    DOE PAGES

    Mniszewski, Susan M.; Junghans, Christoph; Voter, Arthur F.; ...

    2015-04-16

    Next-generation high-performance computing will require more scalable and flexible performance prediction tools to evaluate software-hardware co-design choices relevant to scientific applications and hardware architectures. Here, we present a new class of tools called application simulators: parameterized fast-running proxies of large-scale scientific applications using parallel discrete event simulation. Parameterized choices for the algorithmic method and hardware options provide a rich space for design exploration and allow us to quickly find well-performing software-hardware combinations. We demonstrate our approach with a TADSim simulator that models the temperature-accelerated dynamics (TAD) method, an algorithmically complex and parameter-rich member of the accelerated molecular dynamics (AMD) family of molecular dynamics methods. The essence of the TAD application is captured without the computational expense and resource usage of the full code. We accomplish this by identifying the time-intensive elements, quantifying algorithm steps in terms of those elements, abstracting them out, and replacing them by the passage of time. We use TADSim to quickly characterize the runtime performance and algorithmic behavior for the otherwise long-running simulation code. We extend TADSim to model algorithm extensions, such as speculative spawning of the compute-bound stages, and predict performance improvements without having to implement such a method. Validation against the actual TAD code shows close agreement for the evolution of an example physical system, a silver surface. Finally, focused parameter scans have allowed us to study algorithm parameter choices over far more scenarios than would be possible with the actual simulation. This has led to interesting performance-related insights and suggested extensions.

  12. A Bayesian phylogenetic approach to estimating the stability of linguistic features and the genetic biasing of tone.

    PubMed

    Dediu, Dan

    2011-02-07

    Language is a hallmark of our species and understanding linguistic diversity is an area of major interest. Genetic factors influencing the cultural transmission of language provide a powerful and elegant explanation for aspects of the present day linguistic diversity and a window into the emergence and evolution of language. In particular, it has recently been proposed that linguistic tone-the usage of voice pitch to convey lexical and grammatical meaning-is biased by two genes involved in brain growth and development, ASPM and Microcephalin. This hypothesis predicts that tone is a stable characteristic of language because of its 'genetic anchoring'. The present paper tests this prediction using a Bayesian phylogenetic framework applied to a large set of linguistic features and language families, using multiple software implementations, data codings, stability estimations, linguistic classifications and outgroup choices. The results of these different methods and datasets show a large agreement, suggesting that this approach produces reliable estimates of the stability of linguistic data. Moreover, linguistic tone is found to be stable across methods and datasets, providing suggestive support for the hypothesis of genetic influences on its distribution.

  13. LC-IMS-MS Feature Finder: detecting multidimensional liquid chromatography, ion mobility and mass spectrometry features in complex datasets.

    PubMed

    Crowell, Kevin L; Slysz, Gordon W; Baker, Erin S; LaMarche, Brian L; Monroe, Matthew E; Ibrahim, Yehia M; Payne, Samuel H; Anderson, Gordon A; Smith, Richard D

    2013-11-01

    The addition of ion mobility spectrometry to liquid chromatography-mass spectrometry experiments requires new, or updated, software tools to facilitate data processing. We introduce a command line software application, LC-IMS-MS Feature Finder, that searches for molecular ion signatures in multidimensional liquid chromatography-ion mobility spectrometry-mass spectrometry (LC-IMS-MS) data by clustering deisotoped peaks with similar monoisotopic mass, charge state, LC elution time, and ion mobility drift time values. The software application includes an algorithm for detecting and quantifying co-eluting chemical species, including species that exist in multiple conformations that may have been separated in the IMS dimension. LC-IMS-MS Feature Finder is available as a command-line tool for download at http://omics.pnl.gov/software/LC-IMS-MS_Feature_Finder.php. The Microsoft .NET Framework 4.0 is required to run the software. All other dependencies are included with the software package. Usage of this software is limited to non-profit research use (see README). rds@pnnl.gov. Supplementary data are available at Bioinformatics online.
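
    The clustering step described above can be sketched as a greedy grouping of deisotoped peaks that share a charge state and fall within mass, elution-time, and drift-time tolerances. The tolerances and peak records below are invented, and the real Feature Finder's algorithm is more sophisticated:

```python
def cluster_peaks(peaks, mass_ppm=10.0, lc_tol=0.5, drift_tol=0.3):
    """Greedy single-linkage grouping of deisotoped peaks into LC-IMS-MS
    features: same charge state, monoisotopic masses within a ppm tolerance,
    and similar LC elution and IMS drift times (illustrative sketch only)."""
    features = []  # each feature is a list of peaks
    for p in sorted(peaks, key=lambda p: p["mass"]):
        for f in features:
            q = f[-1]  # compare against the most recently added member
            if (p["z"] == q["z"]
                    and abs(p["mass"] - q["mass"]) / q["mass"] * 1e6 <= mass_ppm
                    and abs(p["lc"] - q["lc"]) <= lc_tol
                    and abs(p["drift"] - q["drift"]) <= drift_tol):
                f.append(p)
                break
        else:
            features.append([p])
    return features

peaks = [
    {"mass": 1000.000, "z": 2, "lc": 10.1, "drift": 5.0},
    {"mass": 1000.004, "z": 2, "lc": 10.3, "drift": 5.1},  # same feature
    {"mass": 1000.005, "z": 2, "lc": 10.2, "drift": 7.9},  # separate IMS conformer
    {"mass": 1200.020, "z": 3, "lc": 22.0, "drift": 6.4},
]
print(len(cluster_peaks(peaks)))  # → 3
```

    Note how the third peak matches the first two in mass and elution time but is kept as a separate feature because its drift time differs, mirroring the conformer detection described in the abstract.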

  14. Computer usage among nurses in rural health-care facilities in South Africa: obstacles and challenges.

    PubMed

    Asah, Flora

    2013-04-01

    This study discusses factors inhibiting computer usage for work-related tasks among computer-literate professional nurses within rural health-care facilities in South Africa. In the past two decades computer literacy courses have not been part of the nursing curricula. Computer courses are offered by the State Information Technology Agency. Despite this, there seems to be limited use of computers by professional nurses in the rural context. Focus group interviews were held with 40 professional nurses from three government hospitals in northern KwaZulu-Natal. Contributing factors were found to be a lack of information technology infrastructure, restricted access to computers and deficits in technical and nursing management support. The physical location of computers within the health-care facilities and a lack of relevant software emerged as specific obstacles to usage. Provision of continuous and active support from nursing management could positively influence computer usage among professional nurses. A closer integration of information technology and computer literacy skills into existing nursing curricula would foster a positive attitude towards computer usage through early exposure. Responses indicated that a change of mindset may be needed on the part of nursing management so that they begin to actively promote ready access to computers as a means of creating greater professionalism and collegiality. © 2011 Blackwell Publishing Ltd.

  15. Automated interpretation of home blood pressure assessment (Hy-Result software) versus physician's assessment: a validation study.

    PubMed

    Postel-Vinay, Nicolas; Bobrie, Guillaume; Ruelland, Alan; Oufkir, Majida; Savard, Sebastien; Persu, Alexandre; Katsahian, Sandrine; Plouin, Pierre F

    2016-04-01

    Hy-Result is the first software for self-interpretation of home blood pressure measurement results, taking into account both the recommended thresholds for normal values and patient characteristics. We compare the software-generated classification with the physician's evaluation. The primary assessment criterion was whether algorithm classification of the blood pressure (BP) status concurred with the physician's advice (blinded to the software's results) following a consultation (n=195 patients). Secondary assessment was the reliability of text messages. In the 58 untreated patients, the agreement between classification of the BP status generated by the software and the physician's classification was 87.9%. In the 137 treated patients, the agreement was 91.9%. The κ-test applied for all the patients was 0.81 (95% confidence interval: 0.73-0.89). After correction of errors identified in the algorithm during the study, agreement increased to 95.4% [κ=0.9 (95% confidence interval: 0.84-0.97)]. For 100% of the patients with comorbidities (n=46), specific text messages were generated, indicating that a physician might recommend a target BP lower than 135/85 mmHg. Specific text messages were also generated for 100% of the patients for whom global cardiovascular risks markedly exceeded norms. Classification by Hy-Result is at least as accurate as that of a specialist in current practice (http://www.hy-result.com).
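
    The κ statistic reported above measures agreement beyond chance between the software's BP classification and the physician's. A minimal sketch of Cohen's kappa for two raters (the paired labels below are hypothetical, not the study's data):

```python
def cohen_kappa(pairs):
    """Cohen's kappa for two raters labeling the same items.
    pairs: list of (rater1_label, rater2_label) tuples."""
    n = len(pairs)
    labels = sorted({label for pair in pairs for label in pair})
    po = sum(a == b for a, b in pairs) / n              # observed agreement
    pe = sum((sum(a == l for a, _ in pairs) / n) *      # chance agreement:
             (sum(b == l for _, b in pairs) / n)        # product of marginal
             for l in labels)                           # frequencies per label
    return (po - pe) / (1 - pe)
```

    Values near 0.8-0.9, as in the study, indicate agreement well beyond what the raters' marginal label frequencies would produce by chance.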

  16. LCG Persistency Framework (CORAL, COOL, POOL): Status and Outlook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valassi, A.; /CERN; Clemencic, M.

    2012-04-19

    The Persistency Framework consists of three software packages (CORAL, COOL and POOL) addressing the data access requirements of the LHC experiments in different areas. It is the result of the collaboration between the CERN IT Department and the three experiments (ATLAS, CMS and LHCb) that use this software to access their data. POOL is a hybrid technology store for C++ objects, metadata catalogs and collections. CORAL is a relational database abstraction layer with an SQL-free API. COOL provides specific software tools and components for the handling of conditions data. This paper reports on the status and outlook of the project and reviews in detail the usage of each package in the three experiments.

  17. Evaluation of a breast software model for 2D and 3D X-ray imaging studies of the breast.

    PubMed

    Baneva, Yanka; Bliznakova, Kristina; Cockmartin, Lesley; Marinov, Stoyko; Buliev, Ivan; Mettivier, Giovanni; Bosmans, Hilde; Russo, Paolo; Marshall, Nicholas; Bliznakov, Zhivko

    2017-09-01

    In X-ray imaging, test objects reproducing breast anatomy characteristics are realized to optimize issues such as image processing or reconstruction, lesion detection performance, image quality and radiation-induced detriment. Recently, a physical phantom with a structured background has been introduced for both 2D mammography and breast tomosynthesis. A software version of this phantom and a few related versions are now available, and a comparison between these 3D software phantoms and the physical phantom will be presented. The software breast phantom simulates a semi-cylindrical container filled with spherical beads of different diameters. Four computational breast phantoms were generated with a dedicated software application; for two of these, physical phantoms are also available, and they are used for the side-by-side comparison. Planar projections in mammography and tomosynthesis were simulated under identical incident air kerma conditions. Tomosynthesis slices were reconstructed with in-house reconstruction software. In addition to a visual comparison, parameters like fractal dimension, power law exponent β and second-order statistics (skewness, kurtosis) of planar projections and tomosynthesis reconstructed images were compared. Visually, an excellent agreement between simulated and real planar and tomosynthesis images is observed. The comparison also shows overall very good agreement between parameters evaluated from simulated and experimental images. The computational breast phantoms showed a close match with their physical versions. The detailed mathematical analysis of the images confirms the agreement between real and simulated 2D mammography and tomosynthesis images. The software phantom is ready for optimization purposes and for extrapolation to other breast imaging techniques. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
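
    The image comparison metrics mentioned above (power law exponent β of the spectrum and second-order statistics such as skewness and kurtosis) can be illustrated with a short sketch; the radial-averaging details are assumptions and will differ from the authors' analysis code:

```python
import numpy as np

def second_order_stats(img):
    """Skewness and excess kurtosis of pixel intensities (standardized moments)."""
    x = np.asarray(img, dtype=float).ravel()
    z = (x - x.mean()) / x.std()
    return (z**3).mean(), (z**4).mean() - 3.0

def power_law_beta(img):
    """Exponent beta of the radially averaged power spectrum P(f) ~ 1/f**beta,
    from a log-log least-squares fit.  Assumes a square image; illustrative only."""
    f2 = np.abs(np.fft.fftshift(np.fft.fft2(img)))**2      # 2D power spectrum
    n = img.shape[0]
    yy, xx = np.indices(img.shape)
    r = np.hypot(yy - n // 2, xx - n // 2).astype(int)     # integer radius per pixel
    radial = np.bincount(r.ravel(), f2.ravel()) / np.bincount(r.ravel())
    freqs = np.arange(1, min(len(radial), n // 2))         # skip DC, stay below Nyquist
    slope, _ = np.polyfit(np.log(freqs), np.log(radial[freqs]), 1)
    return -slope
```

    For mammographic backgrounds, β around 3 is commonly reported; comparing β, skewness and kurtosis between simulated and experimental images is one way to quantify the agreement the abstract describes.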

  18. CANFAR+Skytree: A Cloud Computing and Data Mining System for Astronomy

    NASA Astrophysics Data System (ADS)

    Ball, N. M.

    2013-10-01

    This is a companion Focus Demonstration article to the CANFAR+Skytree poster (Ball 2013, this volume), demonstrating the usage of the Skytree machine learning software on the Canadian Advanced Network for Astronomical Research (CANFAR) cloud computing system. CANFAR+Skytree is the world's first cloud computing system for data mining in astronomy.

  19. A Survey of Computer Use in Associate Degree Programs in Engineering Technology.

    ERIC Educational Resources Information Center

    Cunningham, Pearley

    As part of its annual program review process, the Department of Engineering Technology at the Community College of Allegheny County, in Pennsylvania, conducted a study of computer usage in community college engineering technology programs across the nation. Specifically, the study sought to determine the types of software, Internet access, average…

  20. Intention and Usage of Computer Based Information Systems in Primary Health Centers

    ERIC Educational Resources Information Center

    Hosizah; Kuntoro; Basuki N., Hari

    2016-01-01

    The computer-based information system (CBIS) has been adopted in almost all health-care settings, including primary health centers in East Java Province, Indonesia. Some of the software packages available were SIMPUS, SIMPUSTRONIK, SIKDA Generik, and e-puskesmas. Unfortunately, most of the primary health centers did not implement them successfully. This…

  1. Tracking the PhD Students' Daily Computer Use

    ERIC Educational Resources Information Center

    Sim, Kwong Nui; van der Meer, Jacques

    2015-01-01

    This study investigated PhD students' computer activities in their daily research practice. Software that tracks computer usage (Manic Time) was installed on the computers of nine PhD students, who were at their early, mid and final stage in doing their doctoral research in four different discipline areas (Commerce, Humanities, Health Sciences and…

  2. Health Information System Simulation. Curriculum Improvement Project. Region II.

    ERIC Educational Resources Information Center

    Anderson, Beth H.; Lacobie, Kevin

    This volume is one of three in a self-paced computer literacy course that gives allied health students a firm base of knowledge concerning computer usage in the hospital environment. It also develops skill in several applications software packages. This volume contains five self-paced modules that allow students to interact with a health…

  3. Microcomputer Usage in Secondary Marketing Education. A National Study.

    ERIC Educational Resources Information Center

    Searle, A. Gary

    A study was conducted to determine microcomputer hardware, software, and inservice components of secondary marketing education programs. A questionnaire was developed and sent to 420 teacher-coordinators in 42 states. A total of 225 (54 percent) usable returns were tabulated at the University of Wisconsin-Stout Computer Center. Results of the…

  4. A Corpus-Based Comparative Study of "Learn" and "Acquire"

    ERIC Educational Resources Information Center

    Yang, Bei

    2016-01-01

    As an important yet intricate linguistic feature in English language, synonymy poses a great challenge for second language learners. Using the 100 million-word British National Corpus (BNC) as data and the software Sketch Engine (SkE) as an analyzing tool, this article compares the usage of "learn" and "acquire" used in natural…

  5. Student Reactions to Classroom Management Technology: Learning Styles and Attitudes toward Moodle

    ERIC Educational Resources Information Center

    Chung, Christina; Ackerman, David

    2015-01-01

    The authors look at student perceptions regarding the adoption and usage of Moodle. Self-efficacy theory and the Technology Acceptance Model were applied to understand student reactions to instructor implementation of classroom management software Moodle. They also looked at how the learning styles of students impacted their reactions to Moodle.…

  6. Microcomputer Usage in Schools, 1983-84.

    ERIC Educational Resources Information Center

    Quality Education Data, Inc., Denver, CO.

    Results are presented for the third annual survey of all United States school districts by Quality Education Data, Inc. Findings are displayed in tabular form, and include information on the following: size of the microcomputer marketplace and the kinds of people involved; availability of software by brand for home, business, and education use;…

  7. Internet Audio Products (3/3)

    ERIC Educational Resources Information Center

    Schwartz, Linda; de Schutter, Adrienne; Fahrni, Patricia; Rudolph, Jim

    2004-01-01

    Two contrasting additions to the online audio market are reviewed: "iVocalize", a browser-based audio-conferencing software, and "Skype", a PC-to-PC Internet telephone tool. These products are selected for review on the basis of their success in gaining rapid popular attention and usage during 2003-04. The "iVocalize" review emphasizes the…

  8. How to Teach Programming Indirectly--Using Spreadsheet Application

    ERIC Educational Resources Information Center

    Tahy, Zsuzsanna Szalayné

    2016-01-01

    It is a question in many countries whether ICT and application usage should be taught. There are some problems with IT literacy: users do not understand the concepts behind software, they cannot solve problems, and moreover, using applications gives them more problems. Consequently, using ICT seems to slow work down. Experts suggest learning…

  9. Provenance-Based Approaches to Semantic Web Service Discovery and Usage

    ERIC Educational Resources Information Center

    Narock, Thomas William

    2012-01-01

    The World Wide Web Consortium defines a Web Service as "a software system designed to support interoperable machine-to-machine interaction over a network." Web Services have become increasingly important both within and across organizational boundaries. With the recent advent of the Semantic Web, web services have evolved into semantic…

  10. Integration of Computer Related Instruction in Texas Vocational Agriculture Programs. Final Report.

    ERIC Educational Resources Information Center

    Cepica, M. J.; And Others

    A study examined current usage of microcomputers, projected software needs, and teacher inservice training needs in Texas vocational agriculture programs. Questionnaires were mailed to each of the 922 vocational agriculture departments in Texas. Data from the 446 usable instruments returned were tabulated by geographical area and school size.…

  11. 48 CFR 208.7403 - Acquisition procedures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... SYSTEM, DEPARTMENT OF DEFENSE ACQUISITION PLANNING REQUIRED SOURCES OF SUPPLIES AND SERVICES Enterprise Software Agreements 208.7403 Acquisition procedures. Follow the procedures at PGI 208.7403 when acquiring commercial software and related services. [71 FR 39005, July 11, 2006] ...

  12. Closing gaps between open software and public data in a hackathon setting: User-centered software prototyping

    PubMed Central

    Busby, Ben; Lesko, Matthew; Federer, Lisa

    2016-01-01

    In genomics, bioinformatics and other areas of data science, gaps exist between extant public datasets and the open-source software tools built by the community to analyze similar data types.  The purpose of biological data science hackathons is to assemble groups of genomics or bioinformatics professionals and software developers to rapidly prototype software to address these gaps.  The only two rules for the NCBI-assisted hackathons run so far are that 1) data either must be housed in public data repositories or be deposited to such repositories shortly after the hackathon’s conclusion, and 2) all software comprising the final pipeline must be open-source or open-use.  Proposed topics, as well as suggested tools and approaches, are distributed to participants at the beginning of each hackathon and refined during the event.  Software, scripts, and pipelines are developed and published on GitHub, a web service providing publicly available, free-usage tiers for collaborative software development. The code resulting from each hackathon is published at https://github.com/NCBI-Hackathons/ with separate directories or repositories for each team. PMID:27134733

  13. Evaluation of the benefits of assistive reading software: perceptions of high school students with learning disabilities.

    PubMed

    Chiang, Hsin-Yu; Liu, Chien-Hsiou

    2011-01-01

    Using assistive reading software may be a cost-effective way to increase the opportunity for independent learning in students with learning disabilities. However, the effectiveness and perception of assistive reading software has seldom been explored in English-as-a-second language students with learning disabilities. This research was designed to explore the perception and effect of using assistive reading software in high school students with dyslexia (one subtype of learning disability) to improve their English reading and other school performance. The Kurzweil 3000 software was used as the intervention tool in this study. Fifteen students with learning disabilities were recruited, and instruction in the usage of the Kurzweil 3000 was given. Then after 2 weeks, when they were familiarized with the use of Kurzweil 3000, interviews were used to determine the perception and potential benefit of using the software. The results suggested that the Kurzweil 3000 had an immediate impact on students' English word recognition. The students reported that the software made reading, writing, spelling, and pronouncing easier. They also comprehended more during their English class. Further study is needed to determine under which conditions certain hardware/software might be helpful for individuals with special learning needs.

  14. Musings about Beauty

    ERIC Educational Resources Information Center

    Kintsch, Walter

    2012-01-01

    In this essay, I explore how cognitive science could illuminate the concept of beauty. Two results from the extensive literature on aesthetics guide my discussion. As the term "beauty" is overextended in general usage, I choose as my starting point the notion of "perfect form." Aesthetic theorists are in reasonable agreement about the criteria for…

  15. Dagik Earth: A Digital Globe Project for Classrooms, Science Museums, and Research Institutes

    NASA Astrophysics Data System (ADS)

    Saito, A.; Tsugawa, T.

    2017-12-01

    A digital globe system is a powerful tool for helping audiences understand phenomena on the Earth and planets in an intuitive way. Geo-Cosmos at Miraikan, Japan, uses a 6-m spherical LED display, and is one of the largest digital globe systems. Science on a Sphere (SOS) by NOAA is the digital globe system most widely used in science museums around the world. These systems are so expensive that the usage of digital globes is mainly limited to large-scale science museums. Dagik Earth is a digital globe project that promotes educational programs using digital globes at low cost. It aims to be used especially in classrooms. The cost of a Dagik Earth digital globe starts from several US dollars if a PC and projector are already available. It uses white spheres, such as balloons and balance balls, as the screen. The software is provided by the project free of charge for educational usage. The software runs on Windows, Mac and iOS devices. There are English and Chinese language versions of the PC software besides the Japanese version. The number of registered users of Dagik Earth is about 1,400 in Japan. About 60% of them belong to schools, 30% to universities and research institutes, and 8% to science museums. In schools, it is used in classes by teachers and in science activities by students. Several teachers have used the system for five years or more. In one students' activity, Dagik Earth content on typhoons, a solar eclipse, and a satellite launch was created and presented at a school festival. This is a good example of the usage of Dagik Earth for STEM education. In the presentation, the system and activity of Dagik Earth will be presented, and the future expansion of the project will be discussed.

  16. Clinical Data Systems to Support Public Health Practice: A National Survey of Software and Storage Systems Among Local Health Departments.

    PubMed

    McCullough, J Mac; Goodin, Kate

    2016-01-01

    Numerous software and data storage systems are employed by local health departments (LHDs) to manage clinical and nonclinical data needs. Leveraging electronic systems may yield improvements in public health practice. However, information is lacking regarding current usage patterns among LHDs. To analyze clinical and nonclinical data storage and software types used by LHDs. Data came from the 2015 Informatics Capacity and Needs Assessment Survey, conducted by Georgia Southern University in collaboration with the National Association of County and City Health Officials. A total of 324 LHDs from all 50 states completed the survey (response rate: 50%). Outcome measures included the LHD's primary clinical service data system, nonclinical data system(s) used, and plans to adopt an electronic clinical data system (if not already in use). Predictors of interest included jurisdiction size and governance type, and other informatics capacities within the LHD. Bivariate analyses were performed using χ² and t tests. Up to 38.4% of LHDs reported using an electronic health record (EHR). Usage was especially common among LHDs that provide primary care and/or dental services. LHDs serving smaller populations and those with state-level governance were both less likely to use an EHR. Paper records were a common data storage approach for both clinical data (28.9%) and nonclinical data (59.4%). Among LHDs without an EHR, 84.7% reported implementation plans. Our findings suggest that LHDs are increasingly using EHRs as a clinical data storage solution and that more LHDs are likely to adopt EHRs in the foreseeable future. Yet use of paper records remains common. Correlates of electronic system usage emerged across a range of factors. Program- or system-specific needs may be barriers or facilitators to EHR adoption. Policy makers can tailor resources to address barriers specific to LHD size, governance, service portfolio, existing informatics capabilities, and other pertinent characteristics.
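
    The bivariate analyses mentioned above (χ² tests of, e.g., EHR usage against governance type) reduce to a contingency-table statistic. A minimal sketch for a 2×2 table, with hypothetical counts rather than the survey's data:

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 contingency table
    [[a, b], [c, d]] (no continuity correction).  For a 2x2 table the
    usual sum over cells of (observed - expected)^2 / expected collapses
    to this closed form."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: rows = EHR yes/no, columns = state/local governance.
stat = chi2_2x2(30, 50, 45, 35)
```

    The statistic is compared against the χ² distribution with 1 degree of freedom; larger values indicate a stronger association between the two factors.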

  17. Chapter 24: Programmatic Interfaces - IDL VOlib

    NASA Astrophysics Data System (ADS)

    Miller, C. J.

    In this chapter, we describe a library for working with the VO using IDL (the Interactive Data Language). IDL is a software environment for data analysis, visualization, and cross-platform application development. It has wide usage in astronomy, including NASA (e.g. http://seadas.gsfc.nasa.gov/), the Sloan Digital Sky Survey (http://www.sdss.org), and the Spitzer Infrared Spectrograph Instrument (http://ssc.spitzer.caltech.edu/archanaly/contributed/smart/). David Stern, the founder of Research Systems, Inc. (RSI), began the development of IDL while working with NASA's Mars Mariner 7 and 9 data at the Laboratory for Atmospheric and Space Physics at the University of Colorado. In 1981, IDL was rewritten in assembly language and FORTRAN for VAX/VMS. IDL's usage has expanded over the last decade into the fields of medical imaging and engineering, among many others. IDL's programming style carries over much of this FORTRAN legacy, and has a familiar feel to many astronomers who learned their trade using FORTRAN. The spread of IDL usage among astronomers can in part be attributed to the wealth of publicly available astronomical libraries. The Goddard Space Flight Center (GSFC) maintains a list of astronomy-related IDL libraries, including the well known Astronomy User's Library (hereafter ASTROLIB2). We will use some of these GSFC IDL libraries. We note that while IDL is a licensed software product, the source code of user-written procedures is typically freely available to the community. To make the most out of this section as a reader, it is important that many of the data discovery, access, and analysis protocols are understood before reading this chapter. In the next section, we provide an overview of some of the NVO terminology with which the reader should be familiar. The IDL library discussed here is specifically for use with the Virtual Observatory and is named VOlib. IDL's VOlib is available at http://nvo.noao.edu and is included with the software distribution for this book.

  18. Federal Emergency Management Information System (FEMIS) system administration guide. Version 1.3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burford, M.J.; Burnett, R.A.; Downing, T.R.

    The Federal Emergency Management Information System (FEMIS) is an emergency management planning and analysis tool that was developed by the Pacific Northwest National Laboratory (PNNL) under the direction of the U.S. Army Chemical Biological Defense Command. The FEMIS System Administration Guide defines FEMIS hardware and software requirements and gives instructions for installing the FEMIS software package. This document also contains information on the following: software installation for the FEMIS data servers, communication server, mail server, and the emergency management workstations; distribution media loading and FEMIS installation validation and troubleshooting; and system management of FEMIS users, login, privileges, and usage. The system administration utilities (tools), available in the FEMIS client software, are described for user accounts and site profile. This document also describes the installation and use of system and database administration utilities that will assist in keeping the FEMIS system running in an operational environment.

  19. Model-Based Development of Automotive Electronic Climate Control Software

    NASA Astrophysics Data System (ADS)

    Kakade, Rupesh; Murugesan, Mohan; Perugu, Bhupal; Nair, Mohanan

    With the increasing complexity of software in today's products, writing and maintaining thousands of lines of code is a tedious task. Instead, an alternative methodology must be employed. Model-based development is one candidate that offers several benefits and allows engineers to focus on the domain of their expertise rather than on writing large amounts of code. In this paper, we discuss the application of model-based development to the electronic climate control software of vehicles. The back-to-back testing approach is presented that ensures a flawless and smooth transition from legacy designs to model-based development. The Simulink Report Generator, used to create design documents from the models, is presented along with its usage to run the simulation model and capture the results into the test report. Test automation using a model-based development tool that supports the use of a unique set of test cases for several testing levels, and a test procedure that is independent of the software and hardware platform, is also presented.

  20. Software Reviews.

    ERIC Educational Resources Information Center

    Kinnaman, Daniel E.; And Others

    1988-01-01

    Reviews four educational software packages for Apple, IBM, and Tandy computers. Includes "How the West was One + Three x Four,""Mavis Beacon Teaches Typing,""Math and Me," and "Write On." Reviews list hardware requirements, emphasis, levels, publisher, purchase agreements, and price. Discusses the strengths…

  1. IDSE Version 1 User's Manual

    NASA Technical Reports Server (NTRS)

    Mayer, Richard

    1988-01-01

    The integrated development support environment (IDSE) is a suite of integrated software tools that provide intelligent support for information modeling. These tools assist in function, information, and process modeling. Additional tools exist to assist in gathering and analyzing information to be modeled. This is a user's guide to the application of the IDSE. Sections covering the requirements and design of each of the tools are presented. There are currently three integrated computer-aided manufacturing definition (IDEF) modeling methodologies: IDEF0, IDEF1, and IDEF2. Four appendices describe hardware and software requirements, installation procedures, and basic hardware usage.

  2. Universal computer test stand (recommended computer test requirements). [for space shuttle computer evaluation

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Techniques are considered which would be used to characterize aerospace computers with the space shuttle application as end usage. The system-level digital problems which have been encountered and documented are surveyed. From the large cross section of tests, an optimum set is recommended that has a high probability of discovering documented system-level digital problems within laboratory environments. A baseline hardware/software system is defined that is required as a laboratory tool to test aerospace computers. Hardware and software baselines, and the additions necessary to interface the UTE to aerospace computers for test purposes, are outlined.

  3. CASE tools and UML: state of the ART.

    PubMed

    Agarwal, S

    2001-05-01

    With the increasing need for automated tools to assist complex systems development, software design methods are becoming popular. This article analyzes the state of the art in computer-aided software engineering (CASE) tools and the unified modeling language (UML), focusing on their evolution, merits, and industry usage. It identifies managerial issues for the tools' adoption and recommends an action plan to select and implement them. While CASE and UML offer inherent advantages like cheaper, shorter, and more efficient development cycles, they suffer from poor user satisfaction. The critical success factors for their implementation include, among others, management and staff commitment, proper corporate infrastructure, and user training.

  4. Frequency Estimator Performance for a Software-Based Beacon Receiver

    NASA Technical Reports Server (NTRS)

    Zemba, Michael J.; Morse, Jacquelynne R.; Nessel, James A.

    2014-01-01

    As propagation terminals have evolved, their design has trended more toward a software-based approach that facilitates convenient adjustment and customization of the receiver algorithms. One potential improvement is the implementation of a frequency estimation algorithm, through which the primary frequency component of the received signal can be estimated with a much greater resolution than with a simple peak search of the FFT spectrum. To select an estimator for usage in a Q/V-band beacon receiver, analysis of six frequency estimators was conducted to characterize their effectiveness as they relate to beacon receiver design.
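
    One common way to refine a simple FFT peak search, as the abstract describes, is parabolic interpolation of the log-magnitude spectrum around the peak bin. This is only one of many candidate estimators and not necessarily the one selected for the Q/V-band receiver:

```python
import numpy as np

def estimate_freq(x, fs):
    """Estimate a tone's frequency beyond the FFT bin resolution (fs/N) by
    fitting a parabola through the log-magnitude of the peak bin and its two
    neighbours, then taking the parabola's vertex as the refined bin index."""
    window = np.hanning(len(x))           # taper to reduce spectral leakage
    spec = np.abs(np.fft.rfft(x * window))
    k = int(np.argmax(spec))              # coarse estimate: peak bin
    if 0 < k < len(spec) - 1:
        a, b, c = np.log(spec[k - 1:k + 2])
        k = k + 0.5 * (a - c) / (a - 2 * b + c)   # vertex offset in bins
    return k * fs / len(x)                # convert bin index to Hz
```

    With a 1024-point FFT at 1 kHz sampling, the raw bin spacing is about 0.98 Hz, while the interpolated estimate typically lands within a small fraction of a bin of the true tone frequency.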

  5. CFEL-ASG Software Suite (CASS): usage for free-electron laser experiments with biological focus.

    PubMed

    Foucar, Lutz

    2016-08-01

    CASS [Foucar et al. (2012). Comput. Phys. Commun. 183, 2207-2213] is a well established software suite for experiments performed at any sort of light source. It is based on a modular design and can easily be adapted for use at free-electron laser (FEL) experiments that have a biological focus. This article will list all the additional functionality and enhancements of CASS for use with FEL experiments that have been introduced since the first publication. The article will also highlight some advanced experiments with biological aspects that have been performed.

  6. Your EHR license agreement: critical issues.

    PubMed

    Shay, Daniel F

    2014-01-01

    This article discusses several key provisions and concepts in software license agreements for electronic health records. It offers insight into what physician practices can expect to find in their license agreements, as well as practical advice on beneficial provisions. The article examines contractual language relating to term and termination, technical specifications and support, and compliance with governmental programs.

  7. The Barriers and Causes of Building Information Modelling Usage for Interior Design Industry

    NASA Astrophysics Data System (ADS)

    Hamid, A. B. Abd; Taib, M. Z. Mohd; Razak, A. H. N. Abdul; Embi, M. R.

    2017-12-01

    Building Information Modeling (BIM) has developed alongside improvements in the construction industry, with the purpose of simulating design, management, construction and documentation. It facilitates and monitors construction through visualization and emphasizes various inputs to virtually design and construct a building using specific software. This study aims to identify and elaborate the barriers to BIM usage in the interior design industry in Malaysia. The study is initiated with a pilot survey of sixteen randomly chosen respondents. Respondents are attached to interior design firms registered with Lembaga Arkitek Malaysia (LAM). The research findings are expected to provide significant information to encourage BIM adoption among interior design firms.

  8. Schroedinger’s Code: A Preliminary Study on Research Source Code Availability and Link Persistence in Astrophysics

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Teuben, Peter J.; Ryan, P. Wesley

    2018-05-01

    We examined software usage in a sample set of astrophysics research articles published in 2015 and searched for the source codes for the software mentioned in these research papers. We categorized the software to indicate whether the source code is available for download and whether there are restrictions to accessing it, and if the source code is not available, whether some other form of the software, such as a binary, is. We also extracted hyperlinks from one journal’s 2015 research articles, as links in articles can serve as an acknowledgment of software use and lead to the data used in the research, and tested them to determine which of these URLs are still accessible. For our sample of 715 software instances in the 166 articles we examined, we were able to categorize 418 records according to whether source code was available and found that 285 unique codes were used, 58% of which offered the source code for download. Of the 2558 hyperlinks extracted from 1669 research articles, at best, 90% of them were available over our testing period.
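The link-persistence test described in this record can be sketched as follows. This is a hypothetical minimal version in Python's standard library (the authors' actual tooling is not described in the abstract), counting a link as available only on HTTP 200:

```python
from urllib.request import Request, urlopen
from urllib.error import URLError, HTTPError

def check_url(url, timeout=10):
    """Return (url, status): an HTTP status code or an error string.
    A HEAD request is tried first; some servers reject HEAD, so GET is the fallback."""
    for method in ("HEAD", "GET"):
        try:
            with urlopen(Request(url, method=method), timeout=timeout) as resp:
                return url, resp.status
        except HTTPError as e:
            if method == "GET":
                return url, e.code
        except URLError as e:
            return url, str(e.reason)
    return url, "unreachable"

def availability(results):
    """Fraction of tested links that returned HTTP 200,
    i.e. the kind of figure behind the '90% available' statement."""
    ok = sum(1 for _, status in results if status == 200)
    return ok / len(results) if results else 0.0
```

Repeating the check over a testing period, as the authors did, would simply mean running `check_url` on the extracted URL list at intervals and keeping the best outcome per link.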

  9. Cooperative GN&C development in a rapid prototyping environment. [flight software design for space vehicles]

    NASA Technical Reports Server (NTRS)

    Bordano, Aldo; Uhde-Lacovara, JO; Devall, Ray; Partin, Charles; Sugano, Jeff; Doane, Kent; Compton, Jim

    1993-01-01

    The Navigation, Control and Aeronautics Division (NCAD) at NASA-JSC is exploring ways of producing Guidance, Navigation and Control (GN&C) flight software faster, better, and cheaper. To achieve these goals, NCAD established two hardware/software facilities that take an avionics design project from initial inception through high-fidelity real-time hardware-in-the-loop testing. Commercially available software products are used to develop the GN&C algorithms in block diagram form and then automatically generate source code from these diagrams. A high-fidelity real-time hardware-in-the-loop laboratory provides users with the capability to analyze mass memory usage within the targeted flight computer, verify hardware interfaces, and conduct system-level verification, performance, and acceptance testing, as well as mission verification using reconfigurable and mission-unique data. To evaluate these concepts and tools, NCAD embarked on a project to build a real-time 6-DOF simulation of the Soyuz Assured Crew Return Vehicle flight software. To date, a productivity increase of 185 percent has been seen over traditional NASA methods for developing flight software.

  10. Investigating the Structural Relationship for the Determinants of Cloud Computing Adoption in Education

    ERIC Educational Resources Information Center

    Bhatiasevi, Veera; Naglis, Michael

    2016-01-01

    This research is one of the first few to investigate the adoption and usage of cloud computing in higher education in the context of developing countries, in this case Thailand. It proposes extending the technology acceptance model to integrate subjective norm, perceived convenience, trust, computer self-efficacy, and software functionality in…

  11. An Instructional Feedback Technique for Teaching Project Management Tools Aligned with PMBOK

    ERIC Educational Resources Information Center

    Gonçalves, Rafael Queiroz; von Wangenheim, Christiane Gresse; Hauck, Jean Carlo Rossa; Petri, Giani

    2017-01-01

    The management of contemporary software projects is unfeasible without the support of a Project Management (PM) tool. In order to enable the adoption of PM tools in practice, teaching its usage is important as part of computer education. Aiming at teaching PM tools, several approaches have been proposed, such as the development of educational PM…

  12. The Impact of the Computer on the English Language.

    ERIC Educational Resources Information Center

    Perry, Devern

    1990-01-01

    Study analyzed 224 product announcements from 69 hardware and software companies to detail computer-related words that are in common usage and compare the words and definitions with those in the Merriam-Webster dictionary. It was found that 67.3 percent of the words were not included in the dictionary, pointing out the need for teachers to help…

  13. Distracted driving: a neglected epidemic.

    PubMed

    Dildy, Dale W

    2012-10-01

    In 2009, the National Highway Traffic Safety Administration (NHTSA) estimated nearly 6,000 distracted driver fatalities and 515,000 injuries in the United States alone. Distracted driving is a worldwide problem that needs to be addressed. Software is available to disable cell phone usage while driving, but using the advanced technology may require legislation along with a renewed sense of driver responsibility.

  14. Assessing Teachers' Perception on Integrating ICT in Teaching-Learning Process: The Case of Adwa College

    ERIC Educational Resources Information Center

    Gebremedhin, Mewcha Amha; Fenta, Ayele Almaw

    2015-01-01

    Rapid growth and improvement in ICT have led to the diffusion of technology in education. The purpose of this study is to investigate teachers' perception on integrating ICT in teaching-learning process. The research questions sought to measure teachers' software usage as well as other instructional tools and materials, preferences for…

  15. Microcomputer Applications for Health Care Professionals. Volume I. Curriculum Improvement Project. Region II.

    ERIC Educational Resources Information Center

    Bruce, Lucy

    This volume is one of three in a self-paced computer literacy course that gives allied health students a firm base of knowledge concerning computer usage in the hospital environment. It also develops skill in several applications software packages. Volume I contains materials for a three-hour course. A student course syllabus provides this…

  16. Simulation Methods for Optics and Electromagnetics in Complex Geometries and Extreme Nonlinear Regimes with Disparate Scales

    DTIC Science & Technology

    2014-09-30

    software developed with this project support. S1 Cork School 2013: I. UPPEcore Simulator design and usage, Simulation examples II. Nonlinear pulse...pulse propagation 08/28/13 — 08/02/13, University College Cork, Ireland S2 ACMS MURI School 2012: Computational Methods for Nonlinear PDEs describing

  17. Microcomputer Applications for Health Care Professionals. Volume II. Curriculum Improvement Project. Region II.

    ERIC Educational Resources Information Center

    Bruce, Lucy

    This volume is one of three in a self-paced computer literacy course that gives allied health students a firm base of knowledge concerning computer usage in the hospital environment. It also develops skill in several applications software packages. Volume II contains materials for three one-hour courses on word processing applications, spreadsheet…

  18. Assessing Homegrown Library Collections: Using Google Analytics to Track Use of Screencasts and Flash-Based Learning Objects

    ERIC Educational Resources Information Center

    Betty, Paul

    2009-01-01

    Increasing use of screencast and Flash authoring software within libraries is resulting in "homegrown" library collections of digital learning objects and multimedia presentations. The author explores the use of Google Analytics to track usage statistics for interactive Shockwave Flash (.swf) files, the common file output for screencast and Flash…

  19. PyMidas: Interface from Python to Midas

    NASA Astrophysics Data System (ADS)

    Maisala, Sami; Oittinen, Tero

    2014-01-01

    PyMidas is an interface between Python and MIDAS, the major ESO legacy general purpose data processing system. PyMidas allows a user to exploit both the rich legacy of MIDAS software and the power of Python scripting in a unified interactive environment. PyMidas also allows the usage of other Python-based astronomical analysis systems such as PyRAF.

  20. A Survey of Computer Usage in Adult Education Programs in Florida Report.

    ERIC Educational Resources Information Center

    Florida State Dept. of Education, Tallahassee. Div. of Vocational, Adult, and Community Education.

    A study was conducted to identify the types and uses of computer hardware and software in adult and community education programs in Florida. Information was gathered through a survey instrument developed for the study and mailed to 100 adult and community education directors and adult literacy center coordinators (92 surveys were returned). The…

  1. Opportunistic Resource Usage in CMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kreuzer, Peter; Hufnagel, Dirk; Dykstra, D.

    2014-01-01

    CMS is using a tiered setup of dedicated computing resources provided by sites distributed over the world and organized in WLCG. These sites pledge resources to CMS and prepare them especially for CMS to run the experiment's applications. But more resources are available opportunistically, both on the GRID and in local university and research clusters, which can be used for CMS applications. We will present CMS' strategy to use opportunistic resources and prepare them dynamically to run CMS applications. CMS is able to run its applications on resources that can be reached through the GRID and through EC2-compliant cloud interfaces. Even resources that can be used through ssh login nodes can be harnessed. All of these usage modes are integrated transparently into the GlideIn WMS submission infrastructure, which is the basis of CMS' opportunistic resource usage strategy. Technologies like Parrot to mount the software distribution via CVMFS and xrootd for access to data and simulation samples via the WAN are used and will be described. We will summarize the experience with opportunistic resource usage and give an outlook for the restart of LHC data taking in 2015.

  2. Systems aspects of COBE science data compression

    NASA Technical Reports Server (NTRS)

    Freedman, I.; Boggess, E.; Seiler, E.

    1993-01-01

    A general approach to compression of diverse data from large scientific projects has been developed and this paper addresses the appropriate system and scientific constraints together with the algorithm development and test strategy. This framework has been implemented for the COsmic Background Explorer spacecraft (COBE) by retrofitting the existing VAX-based data management system with high-performance compression software permitting random access to the data. Algorithms which incorporate scientific knowledge and consume relatively few system resources are preferred over ad hoc methods. COBE exceeded its planned storage by a large and growing factor and the retrieval of data significantly affects the processing, delaying the availability of data for scientific usage and software test. Embedded compression software is planned to make the project tractable by reducing the data storage volume to an acceptable level during normal processing.

  3. Development and use of mathematical models and software frameworks for integrated analysis of agricultural systems and associated water use impacts

    USGS Publications Warehouse

    Fowler, K. R.; Jenkins, E.W.; Parno, M.; Chrispell, J.C.; Colón, A. I.; Hanson, Randall T.

    2016-01-01

    The development of appropriate water management strategies requires, in part, a methodology for quantifying and evaluating the impact of water policy decisions on regional stakeholders. In this work, we describe the framework we are developing to enhance the body of resources available to policy makers, farmers, and other community members in their efforts to understand, quantify, and assess the often competing objectives water consumers have with respect to usage. The foundation for the framework is the construction of a simulation-based optimization software tool using two existing software packages. In particular, we couple a robust optimization software suite (DAKOTA) with the USGS MF-OWHM water management simulation tool to provide a flexible software environment that will enable the evaluation of one or multiple (possibly competing) user-defined (or stakeholder) objectives. We introduce the individual software components and outline the communication strategy we defined for the coupled development. We present numerical results for case studies related to crop portfolio management with several defined objectives. The objectives are not optimally satisfied for any single user class, demonstrating the capability of the software tool to aid in the evaluation of a variety of competing interests.

  4. ACTS: from ATLAS software towards a common track reconstruction software

    NASA Astrophysics Data System (ADS)

    Gumpert, C.; Salzburger, A.; Kiehn, M.; Hrdinka, J.; Calace, N.; ATLAS Collaboration

    2017-10-01

    Reconstruction of charged particles’ trajectories is a crucial task for most particle physics experiments. The high instantaneous luminosity achieved at the LHC leads to a high number of proton-proton collisions per bunch crossing, which has put the track reconstruction software of the LHC experiments through a thorough test. Preserving track reconstruction performance under increasingly difficult experimental conditions, while keeping the usage of computational resources at a reasonable level, is an inherent problem for many HEP experiments. Exploiting concurrent algorithms and using multivariate techniques for track identification are the primary strategies to achieve that goal. Starting from current ATLAS software, the ACTS project aims to encapsulate track reconstruction software into a generic, framework- and experiment-independent software package. It provides a set of high-level algorithms and data structures for performing track reconstruction tasks as well as fast track simulation. The software is developed with special emphasis on thread-safety to support parallel execution of the code and data structures are optimised for vectorisation to speed up linear algebra operations. The implementation is agnostic to the details of the detection technologies and magnetic field configuration which makes it applicable to many different experiments.

  5. Agent-based computational model of the prevalence of gonococcal infections after the implementation of HIV pre-exposure prophylaxis guidelines.

    PubMed

    Escobar, Erik; Durgham, Ryan; Dammann, Olaf; Stopka, Thomas J

    2015-01-01

    Recently, the first comprehensive guidelines were published for pre-exposure prophylaxis (PrEP) for the prevention of HIV infection in populations with substantial risk of infection. Guidelines include a daily regimen of emtricitabine/tenofovir disoproxil fumarate (TDF/FTC) as well as condom usage during sexual activity. The relationship between the TDF/FTC intake regimen and condom usage is not yet fully understood. If men who have sex with men (MSM) engage in high-risk sexual activities without using condoms when prescribed TDF/FTC, they might be at an increased risk for other sexually transmitted diseases (STD). Our study focuses on the possible occurrence of behavioral changes among MSM in the United States over time with regard to condom usage. In particular, we were interested in creating a model of how increased uptake of TDF/FTC might cause a decline in condom usage, causing significant increases in non-HIV STD incidence, using gonococcal infection incidence as a biological endpoint. We used the agent-based modeling software NetLogo, building upon an existing model of HIV infection. We found no significant evidence for increased gonorrhea prevalence due to increased PrEP usage at any level of sample-wide usage, with a range of 0-90% PrEP usage. However, we did find significant evidence for decreased prevalence of HIV, with a maximal effect being reached when 5% to 10% of the MSM population used PrEP. Our findings appear to indicate that attitudes of aversion, within the medical community, toward the promotion of PrEP due to the potential risk of increased STD transmission are unfounded.

  6. The use of online information resources by nurses.

    PubMed

    Wozar, Jody A; Worona, Paul C

    2003-04-01

    Based on the results of an informal needs assessment, the Usage of Online Information Resources by Nurses Project was designed to provide clinical nurses with accurate medical information at the point of care by introducing them to existing online library resources through instructional classes. Actual usage of the resources was then monitored for a set period of time. A two-hour hands-on class was developed for interested nurses. Participants were instructed in the content and use of several different online resources. A special Web page was designed for this project serving as an access point to the resources. Using a password system and WebTrends™ software, individual participants' usage of the resources was monitored for a thirty-day period following the class. At the end of the thirty days, usage results were tabulated, and participants were sent general evaluation forms. Eight participants accessed the project page thirty-nine times in a thirty-day period. The most accessed resource was Primary Care Online (PCO), accessed thirty-three times. PCO was followed by MD Consult (17), Ovid (8), NLM resources (5), and electronic journals (1). The individual with the highest usage accessed the project page thirteen times. Practicing clinical nurses will use online medical information resources if they are first introduced to them and taught how to access and use them. Health sciences librarians can play an important role in providing instruction to this often overlooked population.

  7. Sustaining an Online, Shared Community Resource for Models, Robust Open source Software Tools and Data for Volcanology - the Vhub Experience

    NASA Astrophysics Data System (ADS)

    Patra, A. K.; Valentine, G. A.; Bursik, M. I.; Connor, C.; Connor, L.; Jones, M.; Simakov, N.; Aghakhani, H.; Jones-Ivey, R.; Kosar, T.; Zhang, B.

    2015-12-01

    Over the last 5 years we have created a community collaboratory, Vhub.org [Palma et al, J. App. Volc. 3:2 doi:10.1186/2191-5040-3-2], as a place to find volcanology-related resources, a venue for users to disseminate tools, teaching resources, and data, and an online platform to support collaborative efforts. As the community (current active users > 6000, from an estimated community of comparable size) embeds the collaboratory's tools into educational and research workflows, it became imperative to: a) redesign tools into robust, open-source, reusable software for online and offline usage/enhancement; b) share large datasets with remote collaborators and other users seamlessly and securely; c) support complex workflows for uncertainty analysis, validation and verification, and data assimilation with large data. The focus on tool development/redevelopment has been twofold: first, to use best practices in software engineering and new hardware like multi-core and graphic processing units; second, to enhance capabilities to support inverse modeling, uncertainty quantification using large ensembles and design of experiments, calibration, and validation. The software engineering practices we follow include open-source licensing to facilitate community contributions, modularity, and reusability. Our initial targets are four popular tools on Vhub: TITAN2D, TEPHRA2, PUFF and LAVA. Use of tools like these requires many observation-driven data sets, e.g. digital elevation models of topography, satellite imagery, and field observations on deposits. These data are often maintained in private repositories and shared by "sneaker-net". As a partial solution, we tested mechanisms using iRODS software for online sharing of private data with public metadata and access limits. Finally, we adapted workflow engines (e.g. Pegasus) to support the complex data and computing workflows needed for usage like uncertainty quantification for hazard analysis using physical models.

  8. 78 FR 32431 - Notice of Submission of Proposed Information Collection to OMB; Enterprise Income Verification...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-30

    ... Proposed Information Collection to OMB; Enterprise Income Verification (EIV) System User Access Authorization Form and Rules of Behavior and User Agreement AGENCY: Office of the Chief Information Officer, HUD... user with information related to the Rules of Behavior for system usage and the user's responsibilities...

  9. The Usage of CAUSE in Three Branches of Science

    ERIC Educational Resources Information Center

    Yang, Bei; Chen, Bin

    2016-01-01

    Semantic prosody is a concept that has been subject to considerable criticism and debate. One big concern is to what extent semantic prosody is domain or register-related. Previous studies reach the agreement that CAUSE has an overwhelmingly negative meaning in general English. Its semantic prosody remains controversial in academic writing,…

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This report was prepared to provide information concerning past solid and hazardous waste management practices for all leased land at the US DOE Hanford Reservation. This report contains sections including land description; land usage; ground water, air and soil monitoring data; and land uses after 1963. Numerous appendices are included which provide documentation of lease agreements and amendments, environmental assessments, and site surveys.

  11. Usage of Electric Vehicle Supply Equipment Along the Corridors between the EV Project Major Cities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mindy Kirkpatrick

    The report explains how the EVSE are being used along the corridors between the EV Project cities. The EV Project consists of a nationwide collaboration between Idaho National Laboratory (INL), ECOtality North America, Nissan, General Motors, and more than 40 other city, regional and state governments, and electric utilities. The purpose of the EV Project is to demonstrate the deployment and use of approximately 14,000 Level II (208-240V) electric vehicle supply equipment (EVSE) and 300 fast chargers in 16 major cities. This research investigates the usage of all currently installed EV Project commercial EVSE along major interstate corridors. ESRI ArcMap software products are utilized to create geographic EVSE data layers for analysis and visualization of commercial EVSE usage. This research locates the crucial interstate corridors lacking sufficient commercial EVSE and targets locations for future commercial EVSE placement. The results and methods introduced in this research will be used by INL for the duration of the EV Project.

  12. A Dynamic Recommender System for Improved Web Usage Mining and CRM Using Swarm Intelligence.

    PubMed

    Alphy, Anna; Prabakaran, S

    2015-01-01

    Nowadays, to enrich e-business, websites are personalized for each user by understanding their interests and behavior. The main challenges of online usage data are information overload and their dynamic nature. In this paper, to address these issues, we propose WebBluegillRecom-annealing, a dynamic recommender system that uses web usage mining techniques in tandem with software agents to provide users with dynamic recommendations that can be used to customize a website. The proposed WebBluegillRecom-annealing dynamic recommender uses swarm intelligence derived from the foraging behavior of the bluegill fish. It overcomes information overload by handling the dynamic behaviors of users. Our dynamic recommender system was compared against traditional collaborative filtering systems. The results show that the proposed system has higher precision, coverage, F1 measure, and scalability than traditional collaborative filtering systems. Moreover, the recommendations given by our system overcome the overspecialization problem by including variety in the recommendations.
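The evaluation metrics this record reports (precision, coverage, F1) are standard for recommender systems. A minimal sketch of how they are typically computed for top-N recommendation lists — an illustrative generic implementation, not the authors' code:

```python
def precision_recall_f1(recommended, relevant):
    """Top-N recommendation metrics: precision = hits / |recommended|,
    recall = hits / |relevant|, F1 = harmonic mean of the two."""
    rec, rel = set(recommended), set(relevant)
    hits = len(rec & rel)
    precision = hits / len(rec) if rec else 0.0
    recall = hits / len(rel) if rel else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return precision, recall, f1

def catalog_coverage(all_recommendations, catalog):
    """Coverage: the share of the item catalog that is ever recommended
    to any user; low coverage signals overspecialization."""
    seen = set().union(*all_recommendations) if all_recommendations else set()
    return len(seen & set(catalog)) / len(catalog)
```

The overspecialization problem mentioned in the abstract shows up directly in `catalog_coverage`: a system that keeps recommending the same popular items scores high precision but low coverage.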

  13. A Dynamic Recommender System for Improved Web Usage Mining and CRM Using Swarm Intelligence

    PubMed Central

    Alphy, Anna; Prabakaran, S.

    2015-01-01

    Nowadays, to enrich e-business, websites are personalized for each user by understanding their interests and behavior. The main challenges of online usage data are information overload and their dynamic nature. In this paper, to address these issues, we propose WebBluegillRecom-annealing, a dynamic recommender system that uses web usage mining techniques in tandem with software agents to provide users with dynamic recommendations that can be used to customize a website. The proposed WebBluegillRecom-annealing dynamic recommender uses swarm intelligence derived from the foraging behavior of the bluegill fish. It overcomes information overload by handling the dynamic behaviors of users. Our dynamic recommender system was compared against traditional collaborative filtering systems. The results show that the proposed system has higher precision, coverage, F1 measure, and scalability than traditional collaborative filtering systems. Moreover, the recommendations given by our system overcome the overspecialization problem by including variety in the recommendations. PMID:26229978

  14. Latent Structure Agreement Analysis

    DTIC Science & Technology

    1989-11-01

    correct for bias in estimation of disease prevalence due to misclassification error [39]. Software Varying panel latent class agreement models can be...D., and L. M. Irwig, "Estimation of Test Error Rates, Disease Prevalence and Relative Risk from Misclassified Data: A Review," Journal of Clinical

  15. 7 CFR 1753.7 - Plans and specifications (P&S).

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... specifications prepared by the borrower's engineer. The specifications prepared by the borrower's engineer and... its contractor complies with the insurance and bond requirements. (4) Telecommunications software license provision. If the borrower is required to enter into a software license agreement in order to use...

  16. 7 CFR 1753.7 - Plans and specifications (P&S).

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... specifications prepared by the borrower's engineer. The specifications prepared by the borrower's engineer and... its contractor complies with the insurance and bond requirements. (4) Telecommunications software license provision. If the borrower is required to enter into a software license agreement in order to use...

  17. 7 CFR 1753.7 - Plans and specifications (P&S).

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... specifications prepared by the borrower's engineer. The specifications prepared by the borrower's engineer and... its contractor complies with the insurance and bond requirements. (4) Telecommunications software license provision. If the borrower is required to enter into a software license agreement in order to use...

  18. 7 CFR 1753.7 - Plans and specifications (P&S).

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... specifications prepared by the borrower's engineer. The specifications prepared by the borrower's engineer and... its contractor complies with the insurance and bond requirements. (4) Telecommunications software license provision. If the borrower is required to enter into a software license agreement in order to use...

  19. Software Product Liability

    DTIC Science & Technology

    1993-08-01

    disclaimers should be a top priority. Contract law involves the Uniform Commercial Code (UCC). This is an agreement between all the states (except...to contract law than this, the basic issue with software is that the supplier is generally an expert on an arcane and sophisticated technology and

  20. 7 CFR 1753.7 - Plans and specifications (P&S).

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Agriculture Regulations of the Department of Agriculture (Continued) RURAL UTILITIES SERVICE, DEPARTMENT OF... its contractor complies with the insurance and bond requirements. (4) Telecommunications software license provision. If the borrower is required to enter into a software license agreement in order to use...

  1. Influence of local meshing size on stress intensity factor of orthopedic lag screw

    NASA Astrophysics Data System (ADS)

    Husain, M. N.; Daud, R.; Basaruddin, K. S.; Mat, F.; Bajuri, M. Y.; Arifin, A. K.

    2017-09-01

    Linear elastic fracture mechanics (LEFM) concepts are generally used to study the influence of a crack on the performance of a structure. To apply the LEFM concept to a damaged structure, finite element analysis software is used to simulate the structure. Mesh generation is one of the most crucial procedures in the finite element method. For a cracked or damaged structure, it is very important to determine the correct local meshing size at the crack tip in order to obtain an accurate value of the stress intensity factor, KI. A pre-crack is introduced into the lag screw based on the von Mises stress results obtained in previous research. This paper shows the influence of the local mesh arrangement on the numerical value of the stress intensity factor, KI, obtained by the displacement method. The study aims to simulate the effect of local meshing, i.e. the singularity region, on the stress intensity factor, KI, at the critical point of failure in the screw. Five different wedge meshing sizes are used in the finite element simulations, with 8, 10, 14, 16, and 20 wedges. Three numerical equations are used to validate the results: the Brown and Srawley, Gross and Brown, and Tada equations. The results obtained from the finite element software (ANSYS APDL) show good agreement with the Brown and Srawley equation compared to the other numerical formulas. A first-row radius of 0.014 and a singularity element with 14 wedges proved to be the best local meshing for this study.
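For reference, the Brown and Srawley correlation named in this record has, for a single-edge-notch tension specimen (an assumption — the abstract does not state which specimen geometry was used), the closed form below, valid for crack-length-to-width ratios a/W up to about 0.6:

```latex
% Brown and Srawley geometry factor for single-edge-notch tension
% (assumed geometry; sigma = remote stress, a = crack length, W = width)
K_I = \sigma \sqrt{\pi a}\left[\,1.12
      - 0.231\left(\tfrac{a}{W}\right)
      + 10.55\left(\tfrac{a}{W}\right)^{2}
      - 21.72\left(\tfrac{a}{W}\right)^{3}
      + 30.39\left(\tfrac{a}{W}\right)^{4}\right]
```

In a convergence study like the one described, the finite element KI from the displacement method is compared against this closed-form value as the crack-tip wedge count and first-row element radius are varied.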

  2. Evaluation of Automated Fracture Risk Assessment Based on the Canadian Association of Radiologists and Osteoporosis Canada Assessment Tool.

    PubMed

    Allin, Sonya; Bleakney, Robert; Zhang, Julie; Munce, Sarah; Cheung, Angela M; Jaglal, Susan

    2016-01-01

    Fracture risk assessments are not always clearly communicated on bone mineral density (BMD) reports; evidence suggests that structured reporting (SR) tools may improve report clarity. The aim of this study is to compare fracture risk assessments automatically assigned by SR software in accordance with Canadian Association of Radiologists and Osteoporosis Canada (CAROC) recommendations to assessments from experts on narrative BMD reports. Charts for 500 adult patients who recently received a BMD exam were sampled from across University of Toronto's Joint Department of Medical Imaging. BMD measures and clinical details were manually abstracted from charts and were used to create structured reports with assessments generated by a software implementation of CAROC recommendations. CAROC calculations were statistically compared to experts' original assessments using percentage agreement (PA) and Krippendorff's alpha. Canadian FRAX calculations were also compared to experts', where possible. A total of 25 (5.0%) reported assessments did not conform to categorizations recommended by Canadian guidelines. Across the remainder, the Krippendorff's alpha relating software assigned assessments to physicians was high at 0.918; PA was 94.3%. Lower agreement was associated with reports for patients with documented modifying factors (alpha = 0.860, PA = 90.2%). Similar patterns of agreement related expert assessments to FRAX calculations, although statistics of agreement were lower. Categories of disagreement were defined by (1) gray areas in current guidelines, (2) margins of assessment categorizations, (3) dictation/transcription errors, (4) patients on low doses of steroids, and (5) ambiguous documentation of modifying factors. Results suggest that SR software can produce fracture risk assessments that agree with experts on most routine, adult BMD exams. 
Results also highlight situations where experts tend to diverge from guidelines and illustrate the potential for SR software to (1) reduce variability in, (2) ameliorate errors in, and (3) improve clarity of routine adult BMD exam reports. Copyright © 2016 International Society for Clinical Densitometry. Published by Elsevier Inc. All rights reserved.
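The two agreement statistics used in this record, percentage agreement and Krippendorff's alpha, can be sketched for the simple case of two raters and nominal categories. The ratings below are hypothetical illustrations, not the study's data:

```python
from collections import Counter
from itertools import product

def percentage_agreement(a, b):
    """Share of units on which the two raters assign the same category."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def krippendorff_alpha_nominal(a, b):
    """Krippendorff's alpha for two raters, nominal data, no missing values.

    alpha = 1 - D_observed / D_expected, built from the coincidence matrix.
    """
    # Coincidence matrix: each unit contributes both ordered pairs of its values.
    coincidence = Counter()
    for x, y in zip(a, b):
        coincidence[(x, y)] += 1
        coincidence[(y, x)] += 1
    n = 2 * len(a)  # total number of pairable values
    marginals = Counter()
    for (x, _), count in coincidence.items():
        marginals[x] += count
    observed = sum(c for (x, y), c in coincidence.items() if x != y)
    expected = sum(marginals[x] * marginals[y]
                   for x, y in product(marginals, repeat=2) if x != y)
    return 1 - (n - 1) * observed / expected

r1 = ["low", "low", "medium", "high"]   # e.g. software-assigned risk categories
r2 = ["low", "medium", "medium", "high"]  # e.g. expert-assigned risk categories
print(percentage_agreement(r1, r2))                    # 0.75
print(round(krippendorff_alpha_nominal(r1, r2), 4))
```

Unlike raw percentage agreement, alpha corrects for the agreement expected by chance given the category marginals, which is why the two numbers reported above (94.3% PA vs. alpha = 0.918) differ.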

  3. Reproducibility of Lobar Perfusion and Ventilation Quantification Using SPECT/CT Segmentation Software in Lung Cancer Patients.

    PubMed

    Provost, Karine; Leblond, Antoine; Gauthier-Lemire, Annie; Filion, Édith; Bahig, Houda; Lord, Martin

    2017-09-01

Planar perfusion scintigraphy with 99mTc-labeled macroaggregated albumin is often used for pretherapy quantification of regional lung perfusion in lung cancer patients, particularly those with poor respiratory function. However, subdividing lung parenchyma into rectangular regions of interest, as done on planar images, is a poor reflection of true lobar anatomy. New tridimensional methods using SPECT and SPECT/CT have been introduced, including semiautomatic lung segmentation software. The present study evaluated inter- and intraobserver agreement on quantification using SPECT/CT software and compared the results for regional lung contribution obtained with SPECT/CT and planar scintigraphy. Methods: Thirty lung cancer patients underwent ventilation-perfusion scintigraphy with 99mTc-macroaggregated albumin and 99mTc-Technegas. The regional lung contribution to perfusion and ventilation was measured on both planar scintigraphy and SPECT/CT using semiautomatic lung segmentation software by 2 observers. Interobserver and intraobserver agreement for the SPECT/CT software was assessed using the intraclass correlation coefficient, Bland-Altman plots, and absolute differences in measurements. Measurements from planar and tridimensional methods were compared using the paired-sample t test and mean absolute differences. Results: Intraclass correlation coefficients were in the excellent range (above 0.9) for both interobserver and intraobserver agreement using the SPECT/CT software. Bland-Altman analyses showed very narrow limits of agreement. Absolute differences were below 2.0% in 96% of both interobserver and intraobserver measurements. There was a statistically significant difference between planar and SPECT/CT methods (P < 0.001) for quantification of perfusion and ventilation for all right lung lobes, with a maximal mean absolute difference of 20.7% for the right middle lobe.
There was no statistically significant difference in quantification of perfusion and ventilation for the left lung lobes using either method; however, absolute differences reached 12.0%. The total right and left lung contributions were similar for the two methods, with a mean difference of 1.2% for perfusion and 2.0% for ventilation. Conclusion: Quantification of regional lung perfusion and ventilation using SPECT/CT-based lung segmentation software is highly reproducible. This tridimensional method yields statistically significant differences in measurements for right lung lobes when compared with planar scintigraphy. We recommend that SPECT/CT-based quantification be used for all lung cancer patients undergoing pretherapy evaluation of regional lung function. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
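The Bland-Altman analysis used for the reproducibility assessment above reduces to a mean difference (bias) with 95% limits of agreement at mean ± 1.96 SD. A minimal sketch with made-up paired lobar percentages, not the study's data:

```python
import statistics

def bland_altman_limits(m1, m2):
    """Bias (mean difference) and 95% limits of agreement between two paired
    measurement series, e.g. two observers' lobar perfusion percentages."""
    diffs = [a - b for a, b in zip(m1, m2)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)  # sample standard deviation of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical perfusion contributions (%) of one lobe, measured by two observers:
obs1 = [18.2, 20.1, 19.5, 21.0, 17.8]
obs2 = [18.0, 20.5, 19.2, 21.3, 17.9]
bias, lower, upper = bland_altman_limits(obs1, obs2)
print(f"bias={bias:.2f}%, limits of agreement=({lower:.2f}%, {upper:.2f}%)")
```

"Very narrow limits of agreement," as reported above, means this interval is small relative to the measured quantities, i.e. the two observers rarely differ by a clinically meaningful amount.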

  4. Evaluation of the Turkish translation of the Minimal Standard Terminology for Digestive Endoscopy by development of an endoscopic information system.

    PubMed

    Atalağ, Koray; Bilgen, Semih; Gür, Gürden; Boyacioğlu, Sedat

    2007-09-01

There are very few evaluation studies of the Minimal Standard Terminology for Digestive Endoscopy. This study aims to evaluate the usage of the Turkish translation of the Minimal Standard Terminology by developing an endoscopic information system. After elicitation of requirements, database modeling and software development were performed. Minimal Standard Terminology-driven forms were designed for rapid data entry. The endoscopic report was created rapidly by applying basic Turkish syntax and grammar rules. Entering free text and editing the final report were also possible. After three years of live usage, data analysis was performed and the results were evaluated. The system has been used for reporting of all endoscopic examinations. In total, 15,638 valid records were analyzed, including 11,381 esophagogastroduodenoscopies, 2,616 colonoscopies, 1,079 rectoscopies and 562 endoscopic retrograde cholangiopancreatographies. In accordance with previous validation studies, the overall usage of Minimal Standard Terminology terms was very high: 85% for examination characteristics, 94% for endoscopic findings and 94% for endoscopic diagnoses. Some new terms, attributes and allowed values were also added for better clinical coverage. Minimal Standard Terminology has been shown to cover a high proportion of routine endoscopy reports. Good user acceptance proves that both the terms and the structure of Minimal Standard Terminology are consistent with usual clinical thinking. However, future work on Minimal Standard Terminology is mandatory for better coverage of endoscopic retrograde cholangiopancreatography examinations. Technically, new software development methodologies have to be sought to lower the cost of development and of the maintenance phase. They should also address integration and interoperability of disparate information systems.

  5. CT colonography: automated measurement of colonic polyps compared with manual techniques--human in vitro study.

    PubMed

    Taylor, Stuart A; Slater, Andrew; Halligan, Steve; Honeyfield, Lesley; Roddie, Mary E; Demeshski, Jamshid; Amin, Hamdam; Burling, David

    2007-01-01

    To prospectively investigate the relative accuracy and reproducibility of manual and automated computer software measurements by using polyps of known size in a human colectomy specimen. Institutional review board approval was obtained for the study; written consent for use of the surgical specimen was obtained. A colectomy specimen containing 27 polyps from a 16-year-old male patient with familial adenomatous polyposis was insufflated, submerged in a container with solution, and scanned at four-section multi-detector row computed tomography (CT). A histopathologist measured the maximum dimension of all polyps in the opened specimen. Digital photographs and line drawings were produced to aid CT-histologic measurement correlation. A novice (radiographic technician) and an experienced (radiologist) observer independently estimated polyp diameter with three methods: manual two-dimensional (2D) and manual three-dimensional (3D) measurement with software calipers and automated measurement with software (automatic). Data were analyzed with paired t tests and Bland-Altman limits of agreement. Seven polyps (

  6. Recent trends related to the use of formal methods in software engineering

    NASA Technical Reports Server (NTRS)

    Prehn, Soren

    1986-01-01

    An account is given of some recent developments and trends related to the development and use of formal methods in software engineering. Ongoing activities in Europe are focussed on, since there seems to be a notable difference in attitude towards industrial usage of formal methods in Europe and in the U.S. A more detailed account is given of the currently most widespread formal method in Europe: the Vienna Development Method. Finally, the use of Ada is discussed in relation to the application of formal methods, and the potential for constructing Ada-specific tools based on that method is considered.

  7. From data to knowledge: The future of multi-omics data analysis for the rhizosphere

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allen White, Richard; Borkum, Mark I.; Rivas-Ubach, Albert

    The rhizosphere is the interface between a plant's roots and its surrounding soil. The rhizosphere microbiome, a complex microbial ecosystem, nourishes the terrestrial biosphere. Integrated multi-omics is a modern approach to systems biology that analyzes and interprets the datasets of multiple -omes of both individual organisms and multi-organism communities and consortia. The successful usage and application of integrated multi-omics to rhizospheric science is predicated upon the availability of rhizosphere-specific data, metadata and software. This review analyzes the availability of multi-omics data, metadata and software for rhizospheric science, identifying potential issues, challenges and opportunities.

  8. DBCreate: A SUPCRT92-based program for producing EQ3/6, TOUGHREACT, and GWB thermodynamic databases at user-defined T and P

    NASA Astrophysics Data System (ADS)

    Kong, Xiang-Zhao; Tutolo, Benjamin M.; Saar, Martin O.

    2013-02-01

    SUPCRT92 is a widely used software package for calculating the standard thermodynamic properties of minerals, gases, aqueous species, and reactions. However, it is labor-intensive and error-prone to use it directly to produce databases for geochemical modeling programs such as EQ3/6, the Geochemist's Workbench, and TOUGHREACT. DBCreate is a SUPCRT92-based software program written in FORTRAN90/95 and was developed in order to produce the required databases for these programs in a rapid and convenient way. This paper describes the overall structure of the program and provides detailed usage instructions.

  9. Ensuring critical event sequences in high consequence computer based systems as inspired by path expressions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kidd, M.E.C.

    1997-02-01

    The goal of our work is to provide a high level of confidence that critical software driven event sequences are maintained in the face of hardware failures, malevolent attacks and harsh or unstable operating environments. This will be accomplished by providing dynamic fault management measures directly to the software developer and to their varied development environments. The methodology employed here is inspired by previous work in path expressions. This paper discusses the perceived problems, a brief overview of path expressions, the proposed methods, and a discussion of the differences between the proposed methods and traditional path expression usage and implementation.

  10. EduCloud: PaaS versus IaaS Cloud Usage for an Advanced Computer Science Course

    ERIC Educational Resources Information Center

    Vaquero, L. M.

    2011-01-01

    The cloud has become a widely used term in academia and the industry. Education has not remained unaware of this trend, and several educational solutions based on cloud technologies are already in place, especially for software as a service cloud. However, an evaluation of the educational potential of infrastructure and platform clouds has not…

  11. My Virtual Reading Coach: "An Analysis of Usage and Impact," 2013-14. Technical Note. Volume 3, Number 2

    ERIC Educational Resources Information Center

    Urdegar, Steven M.

    2014-01-01

    My Virtual Reading Coach (MVRC) is an online program for students who have been identified as struggling readers. It is used as an intervention within the Response to Intervention (RtI) framework, as well as for students with disabilities. The software addresses reading sub-skills (i.e., comprehension, fluency, phonemic awareness, phonics, and…

  12. Exploring Educational Use of Blogs in U.S. Education

    ERIC Educational Resources Information Center

    Hong, Wang

    2008-01-01

    Abstract: As one of the Web 2.0 tools, blogs are widely used in US education. This paper gives a brief overview of blogs such as advantages, disadvantages, and major software for creating blogs, and then it reviews some EduBlogs, its usage, and examples in US education. The purpose is to motivate more educators to use blogs in teaching and…

  13. Investigating Information Technologies in Disasters: Three Essays on Micro-Blogging and Free and Open Source Software (FOSS) Environment

    ERIC Educational Resources Information Center

    Li, Pu

    2012-01-01

    This dissertation aims to investigate how advanced information technologies cope with the various demands of disaster response. It consists of three essays on the exploration of micro-blogging and FOSS environments. The first essay looks at the usage of micro-blogging in the aftermath of the massive 2008 China earthquake and explores the…

  14. A Study of Learners' Usage of a Mobile Learning Application for Learning Idioms and Collocations

    ERIC Educational Resources Information Center

    Amer, Mahmoud

    2014-01-01

    This study explored how four groups of language learners used a mobile software application developed by the researcher for learning idiomatic expressions and collocations. A total of 45 participants in the study used the application for a period of one week. Data for this study was collected from the application, a questionnaire, and follow-up…

  15. Adaptation of Diaphyseal Structure with Aging and Increased Mechanical Usage in the Adult Rat: A Histomorphometrical and Biomechanical Study

    NASA Technical Reports Server (NTRS)

    Jee, Webster S. S.; Li, Xiao Jian; Schaffler, Mitchell B.

    1991-01-01

The experimental increase in mechanical usage or overloading of the left hindlimb was produced by immobilization of the contralateral hindlimb. The right hindlimb was placed in a flexed position against the body and was immobilized using an elastic bandage. Some control animals were sacrificed at time zero; increased-mechanical-usage and age-matched control animals were sacrificed after 2, 10, 18, and 26 weeks of treatment. All animals received double bone fluorochrome labeling prior to sacrifice. Cortical bone histomorphometry and cross-sectional moments of inertia were determined. Marrow cavity enlargement and total cross-sectional area expansion represented the age-related cortical bone changes. Increased mechanical usage enhanced periosteal bone modeling in the formation mode and dampened endocortical bone remodeling and bone modeling in the resorption mode (resorption drift) to create a slight positive bone balance. These observations are in general agreement with Frost's postulate for mechanical effects on bone modeling and remodeling. The maximum moment of inertia did not change significantly in either control or overloaded tibial shafts. The minimum and polar moments of inertia of overloaded bones increased over those of controls at 18 and 26 weeks of the experiment.

  17. Improving Airline Safety

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Under a NASA-Ames Space Act Agreement, Coryphaeus Software and Simauthor, Inc., developed an Aviation Performance Measuring System (APMS). This software, developed for the aerospace and airline industry, enables the replay of Digital Flight Data Recorder (DFDR) data in a flexible, user-configurable, real-time, high fidelity 3D (three dimensional) environment.

  18. The Navy’s Management of Software Licenses Needs Improvement

    DTIC Science & Technology

    2013-08-07

    Enterprise Software Licensing (ESL) as a primary DON efficiency target. Through policy and Integrated Product Team actions, this efficiency...review, as well as with DoD Enterprise Software Initiative (ESI) Blanket Purchase Agreements and any related Federal Acquisition Regulation and General...organizational and multi-functional DON ESL team. The DON is also participating in DoD-level enterprise software licensing projects through the DoD

  19. Improving Software Quality and Management Through Use of Service Level Agreements

    DTIC Science & Technology

    2005-03-01

    many who believe that the quality of the development process is the best predictor of software product quality. (Fenton) Repeatable software processes...reduced errors per KLOC for small projects (Fenton), and the quality management metric (QMM) (Machniak, Osmundson). There are also numerous IEEE 14...attention to cosmetic user interface issues and any problems that may arise with the prototype. (Sawyer) The validation process is also another check

  20. Sequence and System in the Acquisition of Tense and Agreement

    ERIC Educational Resources Information Center

    Rispoli, Matthew; Hadley, Pamela A.; Holt, Janet K.

    2012-01-01

    Purpose: The relatedness of tense morphemes in the language of children younger than 3 years of age is a matter of controversy. Generativist accounts predict that the morphemes will be related, whereas usage-based accounts predict the absence of relationships. This study focused on the increasing productivity of the 5 morphemes in the tense…

  1. Effect of Subject Types on the Production of Auxiliary "Is" in Young English-Speaking Children

    ERIC Educational Resources Information Center

    Guo, Ling-Yu; Owen, Amanda J.; Tomblin, J. Bruce

    2010-01-01

    Purpose: In this study, the authors tested the unique checking constraint (UCC) hypothesis and the usage-based approach concerning why young children variably use tense and agreement morphemes in obligatory contexts by examining the effect of subject types on the production of auxiliary "is". Method: Twenty typically developing 3-year-olds were…

  2. Automated lobar quantification of emphysema in patients with severe COPD.

    PubMed

    Revel, Marie-Pierre; Faivre, Jean-Baptiste; Remy-Jardin, Martine; Deken, Valérie; Duhamel, Alain; Marquette, Charles-Hugo; Tacelli, Nunzia; Bakai, Anne-Marie; Remy, Jacques

    2008-12-01

Automated lobar quantification of emphysema has not yet been evaluated. Unenhanced 64-slice MDCT was performed in 47 patients evaluated before bronchoscopic lung-volume reduction. CT images reconstructed with a standard (B20) and a high-frequency (B50) kernel were analyzed using dedicated prototype software (MevisPULMO) allowing lobar quantification of emphysema extent. Lobar quantification was obtained following (a) a fully automatic delineation of the lobar limits by the software and (b) a semiautomatic delineation with manual correction of the lobar limits when necessary, and was compared with the visual scoring of emphysema severity per lobe. No statistically significant difference existed between automated and semiautomated lobar quantification (p > 0.05 in the five lobes), with differences ranging from 0.4 to 3.9%. The agreement between the two methods (intraclass correlation coefficient, ICC) was excellent for the left upper lobe (ICC = 0.94), left lower lobe (ICC = 0.98), and right lower lobe (ICC = 0.80). The agreement was good for the right upper lobe (ICC = 0.68) and moderate for the middle lobe (ICC = 0.53). The Bland-Altman plots confirmed these results. Good agreement was observed between the software-assessed and visually assessed lobar predominance of emphysema (kappa = 0.78; 95% CI 0.64-0.92). Automated and semiautomated lobar quantifications of emphysema are concordant and show good agreement with visual scoring.
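The kappa statistic reported for software-versus-visual lobar predominance is Cohen's chance-corrected agreement for two raters over categorical labels. A minimal sketch with hypothetical labels, not the study's data:

```python
from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa: agreement between two raters corrected for chance."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n  # observed agreement
    ca, cb = Counter(a), Counter(b)
    # Expected agreement if the raters labeled independently with these marginals.
    pe = sum(ca[c] / n * cb[c] / n for c in set(a) | set(b))
    return (po - pe) / (1 - pe)

# Hypothetical lobar-predominance calls for four patients:
software = ["upper", "upper", "lower", "lower"]
visual   = ["upper", "lower", "lower", "lower"]
print(cohens_kappa(software, visual))  # 0.5
```

Kappa of 1.0 means perfect agreement and 0 means only chance-level agreement, so the reported kappa = 0.78 indicates substantial agreement between the software and visual assessments.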

  3. The cleanroom case study in the Software Engineering Laboratory: Project description and early analysis

    NASA Technical Reports Server (NTRS)

    Green, Scott; Kouchakdjian, Ara; Basili, Victor; Weidow, David

    1990-01-01

    This case study analyzes the application of the cleanroom software development methodology to the development of production software at the NASA/Goddard Space Flight Center. The cleanroom methodology emphasizes human discipline in program verification to produce reliable software products that are right the first time. Preliminary analysis of the cleanroom case study shows that the method can be applied successfully in the FDD environment and may increase staff productivity and product quality. Compared to typical Software Engineering Laboratory (SEL) activities, there is evidence of lower failure rates, a more complete and consistent set of inline code documentation, a different distribution of phase effort activity, and a different growth profile in terms of lines of code developed. The major goals of the study were to: (1) assess the process used in the SEL cleanroom model with respect to team structure, team activities, and effort distribution; (2) analyze the products of the SEL cleanroom model and determine the impact on measures of interest, including reliability, productivity, overall life-cycle cost, and software quality; and (3) analyze the residual products in the application of the SEL cleanroom model, such as fault distribution, error characteristics, system growth, and computer usage.

  4. Optimization of an interactive distributive computer network

    NASA Technical Reports Server (NTRS)

    Frederick, V.

    1985-01-01

    The activities under a cooperative agreement for the development of a computer network are briefly summarized. Research activities covered are: computer operating systems optimization and integration; software development and implementation of the IRIS (Infrared Imaging of Shuttle) Experiment; and software design, development, and implementation of the APS (Aerosol Particle System) Experiment.

  5. Software, Copyright, and Site-License Agreements: Publishers' Perspective of Library Practice.

    ERIC Educational Resources Information Center

    Happer, Stephanie K.

    Thirty-one academic publishers of stand-alone software and book/disk packages were surveyed to determine whether publishers have addressed the copyright issues inherent in circulating these packages within the library environment. Twenty-two questionnaires were returned, providing a 71% return rate. There were 18 usable questionnaires. Publishers…

  6. Computer work duration and its dependence on the used pause definition.

    PubMed

    Richter, Janneke M; Slijper, Harm P; Over, Eelco A B; Frens, Maarten A

    2008-11-01

    Several ergonomic studies have estimated computer work duration using registration software. In these studies, an arbitrary pause definition (Pd; the minimal time between two computer events to constitute a pause) is chosen and the resulting duration of computer work is estimated. In order to uncover the relationship between the used pause definition and the computer work duration (PWT), we used registration software to record usage patterns of 571 computer users across almost 60,000 working days. For a large range of Pds (1-120 s), we found a shallow, log-linear relationship between PWT and Pds. For keyboard and mouse use, a second-order function fitted the data best. We found that these relationships were dependent on the amount of computer work and subject characteristics. Comparison of exposure duration from studies using different pause definitions should take this into account, since it could lead to misclassification. Software manufacturers and ergonomists assessing computer work duration could use the found relationships for software design and study comparison.
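The dependence of estimated work duration on the pause definition can be made concrete. Under one common convention (an assumption here, not a detail given in the abstract), total computer work time is the sum of inter-event gaps shorter than the chosen Pd; gaps of Pd or longer count as pauses:

```python
def work_duration(event_times, pause_definition):
    """Total computer work time (s): sum of gaps between consecutive input
    events that are shorter than the pause definition Pd (s)."""
    gaps = (b - a for a, b in zip(event_times, event_times[1:]))
    return sum(g for g in gaps if g < pause_definition)

events = [0, 1, 2, 10, 11, 12]  # hypothetical input-event timestamps (s)
for pd in (5, 30):
    # A longer Pd absorbs more gaps into "work", so the estimate grows with Pd.
    print(pd, work_duration(events, pd))
```

This is the mechanism behind the finding above: the same event log yields systematically longer work-duration estimates as Pd increases, which is why comparing studies that used different pause definitions can misclassify exposure.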

  7. Acoustic Emission Analysis Applet (AEAA) Software

    NASA Technical Reports Server (NTRS)

    Nichols, Charles T.; Roth, Don J.

    2013-01-01

    NASA Glenn Research and NASA White Sands Test Facility have developed software supporting an automated pressure vessel structural health monitoring (SHM) system based on acoustic emissions (AE). The software, referred to as the Acoustic Emission Analysis Applet (AEAA), provides analysts with a tool that can interrogate data collected on Digital Wave Corp. and Physical Acoustics Corp. software using a wide spectrum of powerful filters and charts. This software can be made to work with any data once the data format is known. The applet will compute basic AE statistics, and statistics as a function of time and pressure (see figure). AEAA provides value added beyond the analysis provided by the respective vendors' analysis software. The software can handle data sets of unlimited size. A wide variety of government and commercial applications could benefit from this technology, notably requalification and usage tests for compressed gas and hydrogen-fueled vehicles. Future enhancements will add features similar to a "check engine" light on a vehicle. Once installed, the system will ultimately be used to alert International Space Station crewmembers to critical structural instabilities, but will have little impact to missions otherwise. Diagnostic information could then be transmitted to experienced technicians on the ground in a timely manner to determine whether pressure vessels have been impacted, are structurally unsound, or can be safely used to complete the mission.

  8. 28 CFR 20.21 - Preparation and submission of a Criminal History Record Information Plan.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., evaluative, or statistical activities pursuant to an agreement with a criminal justice agency. The agreement shall specifically authorize access to data, limit the use of data to research, evaluative, or... technologically advanced software and hardware designs are instituted to prevent unauthorized access to such...

  9. 28 CFR 20.21 - Preparation and submission of a Criminal History Record Information Plan.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., evaluative, or statistical activities pursuant to an agreement with a criminal justice agency. The agreement shall specifically authorize access to data, limit the use of data to research, evaluative, or... technologically advanced software and hardware designs are instituted to prevent unauthorized access to such...

  10. 28 CFR 20.21 - Preparation and submission of a Criminal History Record Information Plan.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., evaluative, or statistical activities pursuant to an agreement with a criminal justice agency. The agreement shall specifically authorize access to data, limit the use of data to research, evaluative, or... technologically advanced software and hardware designs are instituted to prevent unauthorized access to such...

  11. 28 CFR 20.21 - Preparation and submission of a Criminal History Record Information Plan.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., evaluative, or statistical activities pursuant to an agreement with a criminal justice agency. The agreement shall specifically authorize access to data, limit the use of data to research, evaluative, or... technologically advanced software and hardware designs are instituted to prevent unauthorized access to such...

  12. 28 CFR 20.21 - Preparation and submission of a Criminal History Record Information Plan.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., evaluative, or statistical activities pursuant to an agreement with a criminal justice agency. The agreement shall specifically authorize access to data, limit the use of data to research, evaluative, or... technologically advanced software and hardware designs are instituted to prevent unauthorized access to such...

  13. 26 CFR 1.197-2 - Amortization of goodwill and certain other intangibles.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ..., etc. Section 197 intangibles include any patent, copyright, formula, process, design, pattern, know-how, format, package design, computer software (as defined in paragraph (c)(4)(iv) of this section... agreement that provides one of the parties to the agreement with the right to distribute, sell, or provide...

  14. Validity of questionnaire self‐reports on computer, mouse and keyboard usage during a four‐week period

    PubMed Central

    Mikkelsen, Sigurd; Vilstrup, Imogen; Lassen, Christina Funch; Kryger, Ann Isabel; Thomsen, Jane Frølund; Andersen, Johan Hviid

    2007-01-01

    Objective To examine the validity and potential biases in self‐reports of computer, mouse and keyboard usage times, compared with objective recordings. Methods A study population of 1211 people was asked in a questionnaire to estimate the average time they had worked with computer, mouse and keyboard during the past four working weeks. During the same period, a software program recorded these activities objectively. The study was part of a one‐year follow‐up study from 2000–1 of musculoskeletal outcomes among Danish computer workers. Results Self‐reports on computer, mouse and keyboard usage times were positively associated with objectively measured activity, but the validity was low. Self‐reports explained only between a quarter and a third of the variance of objectively measured activity, and were even lower for one measure (keyboard time). Self‐reports overestimated usage times. Overestimation was large at low levels and declined with increasing levels of objectively measured activity. Mouse usage time proportion was an exception with a near 1:1 relation. Variability in objectively measured activity, arm pain, gender and age influenced self‐reports in a systematic way, but the effects were modest and sometimes in different directions. Conclusion Self‐reported durations of computer activities are positively associated with objective measures but they are quite inaccurate. Studies using self‐reports to establish relations between computer work times and musculoskeletal pain could be biased and lead to falsely increased or decreased risk estimates. PMID:17387136

  15. Validity of questionnaire self-reports on computer, mouse and keyboard usage during a four-week period.

    PubMed

    Mikkelsen, Sigurd; Vilstrup, Imogen; Lassen, Christina Funch; Kryger, Ann Isabel; Thomsen, Jane Frølund; Andersen, Johan Hviid

    2007-08-01

    To examine the validity and potential biases in self-reports of computer, mouse and keyboard usage times, compared with objective recordings. A study population of 1211 people was asked in a questionnaire to estimate the average time they had worked with computer, mouse and keyboard during the past four working weeks. During the same period, a software program recorded these activities objectively. The study was part of a one-year follow-up study from 2000-1 of musculoskeletal outcomes among Danish computer workers. Self-reports on computer, mouse and keyboard usage times were positively associated with objectively measured activity, but the validity was low. Self-reports explained only between a quarter and a third of the variance of objectively measured activity, and were even lower for one measure (keyboard time). Self-reports overestimated usage times. Overestimation was large at low levels and declined with increasing levels of objectively measured activity. Mouse usage time proportion was an exception with a near 1:1 relation. Variability in objectively measured activity, arm pain, gender and age influenced self-reports in a systematic way, but the effects were modest and sometimes in different directions. Self-reported durations of computer activities are positively associated with objective measures but they are quite inaccurate. Studies using self-reports to establish relations between computer work times and musculoskeletal pain could be biased and lead to falsely increased or decreased risk estimates.

  16. The Intelligent Flight Control Program (IFCS)

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This is the closeout report for Research Cooperative Agreement NCC4-00130, summarizing accomplishments of the Intelligent Flight Control System (IFCS) Project. It has been a pleasure working with NASA and NASA partners as we strive to meet the goals of this research initiative. ISR was engaged in this Research Cooperative Agreement beginning 01 January 2003 and ending 31 January 2004. During this time, ISR worked toward development of the ARTS II Computer Software Configuration Item (CSCI) version 4.0 by performing the following: 1) Requirements Definition; 2) Software Design and Development; 3) Hardware-in-the-Loop Simulation; 4) Unit-Level Testing; 5) Documentation.

  17. Comparative XAFS studies of some cobalt complexes of (3-N-phenyl-thiourea-pentanone-2)

    NASA Astrophysics Data System (ADS)

    Soni, Namrata; Parsai, Neetu; Mishra, Ashutosh

    2016-10-01

    XAFS spectroscopy is a useful method for determining the local structure around a specific atom in disordered systems. An XAFS study of some cobalt complexes of (3-N-phenyl-thiourea-pentanone-2) is carried out using the XAFS analysis software Demeter with Strawberry Perl. The same study is also carried out theoretically using Mathcad software. It is found that the thiourea has a significant influence on the spectra, and the results obtained experimentally and theoretically are in agreement. Fourier transforms of the experimental and theoretically generated XAFS have been taken to obtain the first-shell radial distance. The values so obtained are in agreement with each other.
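
    The Fourier-transform step mentioned above can be sketched numerically: for a synthetic single-shell oscillation chi(k) = sin(2kR), the transform magnitude peaks near the shell radius R, which is how a first-shell radial distance is read off. This is a toy illustration with an assumed radius, not the paper's Demeter/Mathcad analysis.

```python
import cmath
import math

# Toy single-shell EXAFS-like signal: chi(k) = sin(2 k R).
R_true = 2.5                                  # assumed shell radius, angstroms
dk = 0.02
ks = [2.0 + dk * i for i in range(501)]       # k grid, 2..12 inverse angstroms
chi = [math.sin(2 * k * R_true) for k in ks]

def ft_magnitude(r):
    # discrete approximation of |integral chi(k) * exp(2i k r) dk|
    return abs(sum(c * cmath.exp(2j * k * r) for c, k in zip(chi, ks)) * dk)

rs = [0.01 * i for i in range(1, 601)]        # r grid, 0.01..6.0 angstroms
magnitudes = [ft_magnitude(r) for r in rs]
r_peak = rs[magnitudes.index(max(magnitudes))]
print(f"first-shell radial distance ~ {r_peak:.2f} angstroms")
```

    In practice this transform is windowed and k-weighted by the analysis software, but the principle of locating the first-shell peak is the same.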

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rudkevich, Aleksandr; Goldis, Evgeniy

    This research conducted by the Newton Energy Group, LLC (NEG) is dedicated to the development of pCloud: a cloud-based power market simulation environment. pCloud offers power industry stakeholders the capability to model electricity markets and is organized around the Software as a Service (SaaS) concept -- a software application delivery model in which software is centrally hosted and provided to many users via the internet. During Phase I of this project, NEG developed a prototype design for pCloud as a SaaS-based commercial service offering and a system architecture supporting that design, ensured the feasibility of the architecture's key elements, formed technological partnerships and negotiated commercial agreements with partners, conducted market research and other related activities, and secured funding for continued development of pCloud between the end of Phase I and the beginning of Phase II, if awarded. Based on the results of Phase I activities, NEG has established that the development of a cloud-based power market simulation environment within the Windows Azure platform is technologically feasible and can be accomplished within the budget and timeframe available through the Phase II SBIR award with additional external funding. NEG believes that pCloud has the potential to become a game-changing technology for the modeling and analysis of electricity markets. This potential is due to the following critical advantages of pCloud over its competition:
    - Standardized access to advanced and proven power market simulators offered by third parties.
    - Automated parallelization of simulations and dynamic provisioning of computing resources on the cloud. This combination of automation and scalability dramatically reduces turn-around time while offering the capability to increase the number of analyzed scenarios by a factor of 10, 100 or even 1000.
    - Access to ready-to-use data and to cloud-based resources, leading to a reduction in software, hardware, and IT costs.
    - Competitive pricing structure, which will make high-volume usage of simulation services affordable.
    - Availability and affordability of high-quality power simulators, which presently only large corporate clients can afford, will level the playing field in developing regional energy policies, determining prudent cost recovery mechanisms and assuring just and reasonable rates to consumers.
    - Users that presently do not have the resources to maintain modeling capabilities internally will now be able to run simulations. This will invite more players into the industry, ultimately leading to more transparent and liquid power markets.

  19. What Type of Learning Style Leads to Online Participation in the Mixed-Mode E-Learning Environment? A Study of Software Usage Instruction

    ERIC Educational Resources Information Center

    Huang, Eugenia Y.; Lin, Sheng Wei; Huang, Travis K.

    2012-01-01

    Learning style is traditionally assumed to be a predictor of learning performance, yet few studies have identified the mediating and moderating effects between the two. This study extends previous research by proposing and testing a model that examines the mediating processes in the relationship between learning style and e-learning performance…

  20. MATLAB Software Versions and Licenses for the Peregrine System

    Science.gov Websites

    : Feature usage info: Users of MATLAB: (Total of 6 licenses issued; Total of ... licenses in use) Users of Compiler: (Total of 1 license issued; Total of ... licenses in use) Users of Distrib_Computing_Toolbox : (Total of 4 licenses issued; Total of ... licenses in use) Users of MATLAB_Distrib_Comp_Engine: (Total of

  1. Computer Attitude, Use, Experience, Software Familiarity and Perceived Pedagogical Usefulness: The Case of Mathematics Professors

    ERIC Educational Resources Information Center

    Yushau, Balarabe

    2006-01-01

    As the pedagogical-effectiveness of information technology (IT) in mathematics education is carefully established the topic of discourse among mathematicians and mathematics educators is no longer a dispute about whether or not to use IT in the teaching and learning of mathematics but a shift to some debate about the when and how of its usage.…

  2. Lessons Learned From Developing A Streaming Data Framework for Scientific Analysis

    NASA Technical Reports Server (NTRS)

    Wheeler, Kevin R.; Allan, Mark; Curry, Charles

    2003-01-01

    We describe the development and usage of a streaming data analysis software framework. The framework is used for three different applications: Earth science hyper-spectral imaging analysis, Electromyograph pattern detection, and Electroencephalogram state determination. In each application the framework was used to answer a series of science questions which evolved with each subsequent answer. This evolution is summarized in the form of lessons learned.

  3. Integrating ICT Skills and Tax Software in Tax Education: A Survey of Malaysian Tax Practitioners' Perspectives

    ERIC Educational Resources Information Center

    Ling, Lai Ming; Nawawi, Nurul Hidayah Ahamad

    2010-01-01

    Purpose: This study aims to examine the ICT skills needed by a fresh accounting graduate when first joining a tax firm; to find out usage of electronic tax (e-tax) applications in tax practice; to assess the rating of senior tax practitioners on fresh graduates' ICT and e-tax applications skills; and to solicit tax practitioners' opinion regarding…

  4. The Effectiveness of a Conceptually Focused Out-of-Class Intervention on Promoting Learning of Electricity by Township Learners

    ERIC Educational Resources Information Center

    Stott, Angela Elisabeth

    2017-01-01

    In this article I report a study into the effectiveness of a 6 week intervention aimed at promoting learning of electricity concepts by 91 Grade 8 and 9 township learners. Each week I taught these learners for an hour and they engaged with educational software for another hour. Analogy usage and predict-explain-observe-explain pedagogy were…

  5. Removing a barrier to computer-based outbreak and disease surveillance--the RODS Open Source Project.

    PubMed

    Espino, Jeremy U; Wagner, M; Szczepaniak, C; Tsui, F C; Su, H; Olszewski, R; Liu, Z; Chapman, W; Zeng, X; Ma, L; Lu, Z; Dara, J

    2004-09-24

    Computer-based outbreak and disease surveillance requires high-quality software that is well-supported and affordable. Developing software in an open-source framework, which entails free distribution and use of software and continuous, community-based software development, can produce software with such characteristics, and can do so rapidly. The objective of the Real-Time Outbreak and Disease Surveillance (RODS) Open Source Project is to accelerate the deployment of computer-based outbreak and disease surveillance systems by writing software and catalyzing the formation of a community of users, developers, consultants, and scientists who support its use. The University of Pittsburgh seeded the Open Source Project by releasing the RODS software under the GNU General Public License. An infrastructure was created, consisting of a website, mailing lists for developers and users, designated software developers, and shared code-development tools. These resources are intended to encourage growth of the Open Source Project community. Progress is measured by assessing website usage, number of software downloads, number of inquiries, number of system deployments, and number of new features or modules added to the code base. During September--November 2003, users generated 5,370 page views of the project website, 59 software downloads, 20 inquiries, one new deployment, and addition of four features. Thus far, health departments and companies have been more interested in using the software as is than in customizing or developing new features. The RODS laboratory anticipates that after initial installation has been completed, health departments and companies will begin to customize the software and contribute their enhancements to the public code base.

  6. Comparison of 3D reconstruction of mandible for pre-operative planning using commercial and open-source software

    NASA Astrophysics Data System (ADS)

    Abdullah, Johari Yap; Omar, Marzuki; Pritam, Helmi Mohd Hadi; Husein, Adam; Rajion, Zainul Ahmad

    2016-12-01

    3D printing of mandible is important for pre-operative planning, diagnostic purposes, as well as for education and training. Currently, the processing of CT data is routinely performed with commercial software which increases the cost of operation and patient management for a small clinical setting. Usage of open-source software as an alternative to commercial software for 3D reconstruction of the mandible from CT data is scarce. The aim of this study is to compare two methods of 3D reconstruction of the mandible using commercial Materialise Mimics software and open-source Medical Imaging Interaction Toolkit (MITK) software. Head CT images with a slice thickness of 1 mm and a matrix of 512x512 pixels each were retrieved from the server located at the Radiology Department of Hospital Universiti Sains Malaysia. The CT data were analysed and the 3D models of mandible were reconstructed using both commercial Materialise Mimics and open-source MITK software. Both virtual 3D models were saved in STL format and exported to 3matic and MeshLab software for morphometric and image analyses. Both models were compared using Wilcoxon Signed Rank Test and Hausdorff Distance. No significant differences were obtained between the 3D models of the mandible produced using Mimics and MITK software. The 3D model of the mandible produced using MITK open-source software is comparable to the commercial MIMICS software. Therefore, open-source software could be used in clinical setting for pre-operative planning to minimise the operational cost.
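
    The mesh comparison above uses the Hausdorff distance between the two reconstructed surfaces. A minimal sketch on toy 3D point samples follows; the variable names and coordinates are assumptions for illustration, not the study's mesh data.

```python
# Symmetric Hausdorff distance between two surface samples, here as
# plain point clouds. Toy coordinates, not real mandible meshes.

def euclid(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def directed_hausdorff(A, B):
    # farthest that any point of A lies from its nearest neighbour in B
    return max(min(euclid(a, b) for b in B) for a in A)

def hausdorff(A, B):
    return max(directed_hausdorff(A, B), directed_hausdorff(B, A))

mimics_pts = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]       # assumed sample points
mitk_pts   = [(0, 0, 0.1), (1, 0, 0), (0, 1.2, 0)]

print(f"Hausdorff distance = {hausdorff(mimics_pts, mitk_pts):.3f}")
```

    Real mesh-comparison tools (e.g. MeshLab, as used in the study) sample many thousands of surface points and use spatial indexing, but the metric is the same.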

  7. It's Time to Consider Open Source Software

    ERIC Educational Resources Information Center

    Pfaffman, Jay

    2007-01-01

    In 1985 Richard Stallman, a computer programmer, released "The GNU Manifesto" in which he proclaimed a golden rule: One must share computer programs. Software vendors required him to agree to license agreements that forbade sharing programs with others, but he refused to "break solidarity" with other computer users whom he assumed also wanted to…

  8. A socialization intervention in remote health coaching for older adults in the home.

    PubMed

    Jimison, Holly B; Klein, Krystal A; Marcoe, Jennifer L

    2013-01-01

    Previous studies have shown that social ties enhance both physical and mental health, and that social isolation has been linked to increased cognitive decline. As part of our cognitive training platform, we created a socialization intervention to address these issues. The intervention is designed to improve social contact time of older adults with remote family members and friends using a variety of technologies, including Web cameras, Skype software, email and phone. We used usability testing, surveys, interviews and system usage monitoring to develop design guidance for socialization protocols that were appropriate for older adults living independently in their homes. Our early results with this intervention show increased number of social contacts, total communication time (we measure email, phone, and Skype usage) and significant participant satisfaction with the intervention.

  9. A Socialization Intervention in Remote Health Coaching for Older Adults in the Home*

    PubMed Central

    Jimison, Holly B.; Klein, Krystal A.; Marcoe, Jennifer L.

    2014-01-01

    Previous studies have shown that social ties enhance both physical and mental health, and that social isolation has been linked to increased cognitive decline. As part of our cognitive training platform, we created a socialization intervention to address these issues. The intervention is designed to improve social contact time of older adults with remote family members and friends using a variety of technologies, including Web cameras, Skype software, email and phone. We used usability testing, surveys, interviews and system usage monitoring to develop design guidance for socialization protocols that were appropriate for older adults living independently in their homes. Our early results with this intervention show increased number of social contacts, total communication time (we measure email, phone, and Skype usage) and significant participant satisfaction with the intervention. PMID:24111362

  10. Accounting utility for determining individual usage of production level software systems

    NASA Technical Reports Server (NTRS)

    Garber, S. C.

    1984-01-01

    An accounting package was developed which determines the computer resources utilized by a user during the execution of a particular program and updates a file containing accumulated resource totals. The accounting package is divided into two separate programs. The first program determines the total amount of computer resources utilized by a user during the execution of a particular program. The second program uses these totals to update a file containing accumulated totals of computer resources utilized by a user for a particular program. This package is useful to those persons who have several other users continually accessing and running programs from their accounts. The package provides the ability to determine which users are accessing and running specified programs along with their total level of usage.
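
    The two-program split described above can be sketched as follows: one step records a single run's measured resources, the other folds it into accumulated per-user, per-program totals. User names, program names, fields and units below are invented for illustration.

```python
# Sketch of the accounting package's two-step design. The accumulated-
# totals "file" is an in-memory dict here; a real package would persist it.

from collections import defaultdict

totals = defaultdict(lambda: {"cpu_sec": 0.0, "runs": 0})

def record_run(user, program, cpu_sec):
    """Program 1: capture the resources one execution used."""
    return {"user": user, "program": program, "cpu_sec": cpu_sec}

def update_totals(run):
    """Program 2: fold the run into the accumulated totals."""
    key = (run["user"], run["program"])
    totals[key]["cpu_sec"] += run["cpu_sec"]
    totals[key]["runs"] += 1

for run in [record_run("alice", "orbit_sim", 12.5),
            record_run("bob",   "orbit_sim",  3.0),
            record_run("alice", "orbit_sim",  7.5)]:
    update_totals(run)

# per-(user, program) totals answer "who is running my programs, and how much"
print(totals[("alice", "orbit_sim")])
```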

  11. Attending to the Grammatical Errors of Students Using Constructive Teaching and Learning Activities

    ERIC Educational Resources Information Center

    Wornyo, Albert Agbesi

    2016-01-01

    This study was a classroom-based action research. In this study, constructive teaching and learning activities were used to help learners improve on their grammar and usage with a focus on how to help them internalize subject verb agreement rules. The purpose of the research was to assist learners to improve upon their performance in grammar and…

  12. Interface Generation and Compositional Verification in JavaPathfinder

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Pasareanu, Corina

    2009-01-01

    We present a novel algorithm for interface generation of software components. Given a component, our algorithm uses learning techniques to compute a permissive interface representing legal usage of the component. Unlike our previous work, this algorithm does not require knowledge about the component's environment. Furthermore, in contrast to other related approaches, our algorithm computes permissive interfaces even in the presence of non-determinism in the component. Our algorithm is implemented in the JavaPathfinder model checking framework for UML statechart components. We have also added support for automated assume-guarantee style compositional verification in JavaPathfinder, using component interfaces. We report on the application of the presented approach to the generation of interfaces for flight software components.

  13. A Study on Signal Group Processing of AUTOSAR COM Module

    NASA Astrophysics Data System (ADS)

    Lee, Jeong-Hwan; Hwang, Hyun Yong; Han, Tae Man; Ahn, Yong Hak

    2013-06-01

    In a vehicle there are many ECUs (Electronic Control Units), and the ECUs are connected to networks such as CAN, LIN, FlexRay, and so on. AUTOSAR COM (Communication), a software module of AUTOSAR (AUTomotive Open System ARchitecture), the international industry standard for automotive electronic software, processes signals and signal groups for data communication between ECUs. Real-time behaviour and reliability are very important for data communication in the vehicle. Therefore, in this paper, we analyze the functions of signals and signal groups used in COM, and show that signal-group functions are more efficient than individual signals in terms of real-time data synchronization and network resource usage between the sender and receiver.

  14. Comparison of in-hospital versus 30-day mortality assessments for selected medical conditions.

    PubMed

    Borzecki, Ann M; Christiansen, Cindy L; Chew, Priscilla; Loveland, Susan; Rosen, Amy K

    2010-12-01

    In-hospital mortality measures such as the Agency for Healthcare Research and Quality (AHRQ) Inpatient Quality Indicators (IQIs) are easily derived using hospital discharge abstracts and publicly available software. However, hospital assessments based on a 30-day postadmission interval might be more accurate given potential differences in facility discharge practices. To compare in-hospital and 30-day mortality rates for 6 medical conditions using the AHRQ IQI software. We used IQI software (v3.1) and 2004-2007 Veterans Health Administration (VA) discharge and Vital Status files to derive 4-year facility-level in-hospital and 30-day observed mortality rates and observed/expected ratios (O/Es) for admissions with a principal diagnosis of acute myocardial infarction, congestive heart failure, stroke, gastrointestinal hemorrhage, hip fracture, and pneumonia. We standardized software-calculated O/Es to the VA population and compared O/Es and outlier status across sites using correlation, observed agreement, and kappas. Of 119 facilities, in-hospital versus 30-day mortality O/E correlations were generally high (median: r = 0.78; range: 0.31-0.86). Examining outlier status, observed agreement was high (median: 84.7%, 80.7%-89.1%). Kappas showed at least moderate agreement (k > 0.40) for all indicators except stroke and hip fracture (k ≤ 0.22). Across indicators, few sites changed from a high to nonoutlier or low outlier, or vice versa (median: 10, range: 7-13). The AHRQ IQI software can be easily adapted to generate 30-day mortality rates. Although 30-day mortality has better face validity as a hospital performance measure than in-hospital mortality, site assessments were similar despite the definition used. Thus, the measure selected for internal benchmarking should primarily depend on the healthcare system's data linkage capabilities.
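
    The agreement statistics used above, observed agreement and Cohen's kappa between the outlier-status classifications from the two mortality definitions, can be sketched on toy labels. The facility classifications below are made up, not the study's data.

```python
# Observed agreement and Cohen's kappa for two categorical ratings of
# the same facilities (toy outlier-status labels).

from collections import Counter

def agreement_and_kappa(a, b):
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    # chance-expected agreement from the marginal label frequencies
    expected = sum(ca[l] * cb[l] for l in set(a) | set(b)) / n ** 2
    kappa = (observed - expected) / (1 - expected)
    return observed, kappa

in_hosp = ["high", "non", "non", "low", "non", "high", "non", "low"]
day30   = ["high", "non", "non", "non", "non", "high", "non", "low"]

obs, k = agreement_and_kappa(in_hosp, day30)
print(f"observed agreement = {obs:.3f}, kappa = {k:.3f}")
```

    Kappa discounts the agreement expected by chance from the marginal frequencies, which is why the study reports it alongside raw observed agreement.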

  15. Selvester scoring in patients with strict LBBB using the QuAReSS software.

    PubMed

    Xia, Xiaojuan; Chaudhry, Uzma; Wieslander, Björn; Borgquist, Rasmus; Wagner, Galen S; Strauss, David G; Platonov, Pyotr; Ugander, Martin; Couderc, Jean-Philippe

    2015-01-01

    Estimation of the infarct size from body-surface ECGs in post-myocardial-infarction patients has become possible using the Selvester scoring method. Automation of this scoring has been proposed in order to speed up the measurement of the score and to improve the inter-observer variability in computing a score that requires strong expertise in electrocardiography. In this work, we evaluated the quality of the QuAReSS software for delivering correct Selvester scoring in a set of standard 12-lead ECGs. Standard 12-lead ECGs were recorded in 105 post-MI patients prescribed implantation of an implantable cardioverter-defibrillator (ICD). Amongst the 105 patients with standard clinical left bundle branch block (LBBB) patterns, 67 had an LBBB pattern meeting the strict criteria. The QuAReSS software was applied to these 67 tracings by two independent groups of cardiologists (from a clinical group and an ECG core laboratory) to measure the Selvester score semi-automatically. Using various levels of agreement metrics, we compared the scores between the groups and against the automatic measurements of the software. The average absolute difference in Selvester scores measured by the two independent groups was 1.4±1.5 score points, whereas the differences between the automatic method and the two manual adjudications were 1.2±1.2 and 1.3±1.2 points. Eighty-two percent score agreement was observed between the two independent measurements when the difference in score was within two points, while 90% and 84% score agreement were reached comparing the automatic method to the two manual adjudications. The study confirms that the QuAReSS software provides valid measurements of the Selvester score in patients with strict LBBB with minimal correction from cardiologists. Copyright © 2015 Elsevier Inc. All rights reserved.

  16. Dynamic provisioning of local and remote compute resources with OpenStack

    NASA Astrophysics Data System (ADS)

    Giffels, M.; Hauth, T.; Polgart, F.; Quast, G.

    2015-12-01

    Modern high-energy physics experiments rely on the extensive usage of computing resources, both for the reconstruction of measured events as well as for Monte-Carlo simulation. The Institut für Experimentelle Kernphysik (EKP) at KIT is participating in both the CMS and Belle experiments with computing and storage resources. In the upcoming years, these requirements are expected to increase due to the growing amount of recorded data and the rise in complexity of the simulated events. It is therefore essential to increase the available computing capabilities by tapping into all resource pools. At the EKP institute, powerful desktop machines are available to users. Due to the multi-core nature of modern CPUs, vast amounts of CPU time are not utilized by common desktop usage patterns. Other important providers of compute capabilities are classical HPC data centers at universities or national research centers. Due to the shared nature of these installations, the standardized software stack required by HEP applications cannot be installed. A viable way to overcome this constraint and offer a standardized software environment in a transparent manner is the usage of virtualization technologies. The OpenStack project has become a widely adopted solution to virtualize hardware and offer additional services like storage and virtual machine management. This contribution will report on the incorporation of the institute's desktop machines into a private OpenStack Cloud. The additional compute resources provisioned via the virtual machines have been used for Monte-Carlo simulation and data analysis. Furthermore, a concept to integrate shared, remote HPC centers into regular HEP job workflows will be presented. In this approach, local and remote resources are merged to form a uniform, virtual compute cluster with a single point of entry for the user. Evaluations of the performance and stability of this setup and operational experiences will be discussed.

  17. Characteristics and evolution of the ecosystem of software tools supporting research in molecular biology.

    PubMed

    Pazos, Florencio; Chagoyen, Monica

    2018-01-16

    Daily work in molecular biology presently depends on a large number of computational tools. An in-depth, large-scale study of that 'ecosystem' of Web tools, its characteristics, interconnectivity, patterns of usage/citation, temporal evolution and rate of decay is crucial for understanding the forces that shape it and for informing initiatives aimed at its funding, long-term maintenance and improvement. In particular, the long-term maintenance of these tools is compromised because of their specific development model. Hundreds of published studies become irreproducible de facto, as the software tools used to conduct them become unavailable. In this study, we present a large-scale survey of >5400 publications describing Web servers within the two main bibliographic resources for disseminating new software developments in molecular biology. For all these servers, we studied their citation patterns, the subjects they address, their citation networks and the temporal evolution of these factors. We also analysed how these factors affect the availability of these servers (whether they are alive). Our results show that this ecosystem of tools is highly interconnected and adapts to the 'trendy' subjects at every moment. The servers present characteristic temporal patterns of citation/usage, and there is a worrying rate of server 'death', which is influenced by factors such as the server's popularity and the institution that hosts it. These results can inform initiatives aimed at the long-term maintenance of these resources. © The Author(s) 2018. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  18. Monitoring of computing resource use of active software releases at ATLAS

    NASA Astrophysics Data System (ADS)

    Limosani, Antonio; ATLAS Collaboration

    2017-10-01

    The LHC is the world's most powerful particle accelerator, colliding protons at a centre-of-mass energy of 13 TeV. As the energy and frequency of collisions have grown in the search for new physics, so too has the demand for computing resources needed for event reconstruction. We will report on the evolution of resource usage, in terms of CPU and RAM, in key ATLAS offline reconstruction workflows at the Tier-0 at CERN and on the WLCG. Monitoring of workflows is achieved using the ATLAS PerfMon package, which is the standard ATLAS performance monitoring system running inside Athena jobs. Systematic daily monitoring has recently been expanded to include all workflows, beginning at Monte Carlo generation through to end-user physics analysis, beyond that of event reconstruction. Moreover, the move to a multiprocessor mode in production jobs has facilitated the use of tools, such as "MemoryMonitor", to measure the memory shared across processors in jobs. Resource consumption is broken down into software domains and displayed in plots generated using Python visualization libraries and collected into pre-formatted auto-generated Web pages, which allow the ATLAS developer community to track the performance of their algorithms. This information is, however, preferentially filtered to domain leaders and developers through the use of JIRA and via reports given at ATLAS software meetings. Finally, we take a glimpse of the future by reporting on the expected CPU and RAM usage in benchmark workflows associated with the High-Luminosity LHC and anticipate the ways performance monitoring will evolve to understand and benchmark future workflows.

  19. Development of display design and command usage guidelines for Spacelab experiment computer applications

    NASA Technical Reports Server (NTRS)

    Dodson, D. W.; Shields, N. L., Jr.

    1979-01-01

    Individual Spacelab experiments are responsible for developing their CRT display formats and interactive command scenarios for payload crew monitoring and control of experiment operations via the Spacelab Data Display System (DDS). In order to enhance crew training and flight operations, it was important to establish some standardization of the crew/experiment interface among different experiments by providing standard methods and techniques for data presentation and experiment commanding via the DDS. In order to establish optimum usage guidelines for the Spacelab DDS, the capabilities and limitations of the hardware and Experiment Computer Operating System design had to be considered. Since the operating system software and hardware design had already been established, the Display and Command Usage Guidelines were constrained to the capabilities of the existing system design. Empirical evaluations were conducted on a DDS simulator to determine optimum operator/system interface utilization of the system capabilities. Display parameters such as information location, display density, data organization, status presentation and dynamic update effects were evaluated in terms of response times and error rates.

  20. A streamlined Python framework for AT-TPC data analysis

    NASA Astrophysics Data System (ADS)

    Taylor, J. Z.; Bradt, J.; Bazin, D.; Kuchera, M. P.

    2017-09-01

    User-friendly data analysis software has been developed for the Active-Target Time Projection Chamber (AT-TPC) experiment at the National Superconducting Cyclotron Laboratory at Michigan State University. The AT-TPC, commissioned in 2014, is a gas-filled detector that acts as both the detector and target for high-efficiency detection of low-intensity, exotic nuclear reactions. The pytpc framework is a Python package for analyzing AT-TPC data. The package was developed for the analysis of 46Ar(p, p) data. The existing software was used to analyze data produced by the 40Ar(p, p) experiment that ran in August, 2015. Usage of the package was documented in an analysis manual both to improve analysis steps and aid in the work of future AT-TPC users. Software features and analysis methods in the pytpc framework will be presented along with the 40Ar results.

  1. Project Management Software for Distributed Industrial Companies

    NASA Astrophysics Data System (ADS)

    Dobrojević, M.; Medjo, B.; Rakin, M.; Sedmak, A.

    This paper gives an overview of the development of a new software solution for project management, intended mainly for use in an industrial environment. The main concern of the proposed solution is application in everyday engineering practice in various, mainly distributed, industrial companies. Having this in mind, special care has been devoted to developing appropriate tools for tracking, storing and analyzing information about the project, and delivering it on time to the right team members or other responsible persons. The proposed solution is Internet-based and uses the LAMP/WAMP (Linux or Windows - Apache - MySQL - PHP) platform, because of its stability, versatility, open-source technology and simple maintenance. The modular structure of the software makes it easy to customize according to client-specific needs, with a very short implementation period. Its main advantages are simple usage, quick implementation, easy system maintenance, short training and only basic computer skills needed for operators.

  2. The AAO fiber instrument data simulator

    NASA Astrophysics Data System (ADS)

    Goodwin, Michael; Farrell, Tony; Smedley, Scott; Heald, Ron; Heijmans, Jeroen; De Silva, Gayandhi; Carollo, Daniela

    2012-09-01

    The fiber instrument data simulator is an in-house software tool that simulates detector images of fiber-fed spectrographs developed by the Australian Astronomical Observatory (AAO). In addition to helping validate the instrument designs, the resulting simulated images are used to develop the required data reduction software. Example applications that have benefited from the tool usage are the HERMES and SAMI instrumental projects for the Anglo-Australian Telescope (AAT). Given the sophistication of these projects an end-to-end data simulator that accurately models the predicted detector images is required. The data simulator encompasses all aspects of the transmission and optical aberrations of the light path: from the science object, through the atmosphere, telescope, fibers, spectrograph and finally the camera detectors. The simulator runs under a Linux environment that uses pre-calculated information derived from ZEMAX models and processed data from MATLAB. In this paper, we discuss the aspects of the model, software, example simulations and verification.

  3. IUTA: a tool for effectively detecting differential isoform usage from RNA-Seq data.

    PubMed

    Niu, Liang; Huang, Weichun; Umbach, David M; Li, Leping

    2014-10-06

    Most genes in mammals generate several transcript isoforms that differ in stability and translational efficiency through alternative splicing. Such alternative splicing can be tissue- and developmental-stage-specific, and such specificity is sometimes associated with disease. Thus, detecting differential isoform usage for a gene between tissues or cell lines/types (differences in the fraction of a gene's total expression represented by each of its isoforms) is potentially important for cell and developmental biology. We present a new method, IUTA, that is designed to test each gene in the genome for differential isoform usage between two groups of samples. IUTA also estimates isoform usage for each gene in each sample as well as averaged across samples within each group. IUTA is the first method to formulate the testing problem as testing for equal means of two probability distributions under the Aitchison geometry, which is widely recognized as the most appropriate geometry for compositional data (vectors that contain the relative amount of each component comprising the whole). Evaluation using simulated data showed that IUTA was able to provide test results for many more genes than Cuffdiff2 (version 2.2.0, released in March 2014), and IUTA performed better than Cuffdiff2 for the limited number of genes that Cuffdiff2 did analyze. When applied to actual mouse RNA-Seq datasets from six tissues, IUTA identified 2,073 significant genes with clear patterns of differential isoform usage between a pair of tissues. IUTA is implemented as an R package and is available at http://www.niehs.nih.gov/research/resources/software/biostatistics/iuta/index.cfm. Both simulation and real-data results suggest that IUTA accurately detects differential isoform usage. We believe that our analysis of RNA-Seq data from six mouse tissues represents the first comprehensive characterization of isoform usage in these tissues. 
IUTA will be a valuable resource for those who study the roles of alternative transcripts in cell development and disease.
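    IUTA itself is an R package; as a minimal illustration of the Aitchison geometry it relies on (not IUTA's implementation), the centered log-ratio (clr) transform and the Aitchison distance between two isoform-usage vectors can be sketched in Python:

```python
import math

def clr(composition):
    """Centered log-ratio transform of a composition (positive parts)."""
    # Geometric mean of the parts.
    g = math.exp(sum(math.log(x) for x in composition) / len(composition))
    return [math.log(x / g) for x in composition]

def aitchison_distance(p, q):
    """Euclidean distance between clr-transformed compositions."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(clr(p), clr(q))))
```

    A useful property for isoform-usage data: the distance is scale-invariant, so raw isoform read counts and the corresponding usage fractions give the same result.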

  4. STAMPS: development and verification of swallowing kinematic analysis software.

    PubMed

    Lee, Woo Hyung; Chun, Changmook; Seo, Han Gil; Lee, Seung Hak; Oh, Byung-Mo

    2017-10-17

    Swallowing impairment is a common complication in various geriatric and neurodegenerative diseases. Swallowing kinematic analysis is essential to quantitatively evaluate the swallowing motion of the oropharyngeal structures. This study aims to develop novel swallowing kinematic analysis software, called the spatio-temporal analyzer for motion and physiologic study (STAMPS), and to verify its validity and reliability. STAMPS was developed in MATLAB, which is one of the most popular platforms for biomedical analysis. The software was constructed to acquire, process, and analyze data on swallowing motion. The target swallowing structures include bony structures (hyoid bone, mandible, maxilla, and cervical vertebral bodies), cartilages (epiglottis and arytenoid), soft tissues (larynx and upper esophageal sphincter), and the food bolus. Numerous functions are available for the spatiotemporal parameters of the swallowing structures. Testing for validity and reliability was performed in 10 dysphagia patients with diverse etiologies and with an instrumental swallowing model designed to mimic the motion of the hyoid bone and the epiglottis. The intra- and inter-rater reliability tests showed excellent agreement for displacement and moderate to excellent agreement for velocity. The Pearson correlation coefficients between the measured and instrumental reference values were nearly 1.00 (P < 0.001) for displacement and velocity. The Bland-Altman plots showed good agreement between the measurements and the reference values. STAMPS provides precise and reliable kinematic measurements and multiple practical functionalities for spatiotemporal analysis. The software is expected to be useful for researchers who are interested in swallowing motion analysis.
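    The Bland-Altman analysis used in the verification can be sketched generically (a minimal Python illustration of the standard bias and 95% limits-of-agreement computation, not STAMPS code, which is MATLAB-based):

```python
import statistics

def bland_altman_limits(measured, reference):
    """Mean difference (bias) and 95% limits of agreement
    between paired measurements and reference values."""
    diffs = [m - r for m, r in zip(measured, reference)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)          # sample standard deviation
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

    Good agreement corresponds to a bias near zero and narrow limits relative to the measurement scale.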

  5. Parallelization of Rocket Engine Simulator Software (PRESS)

    NASA Technical Reports Server (NTRS)

    Cezzar, Ruknet

    1997-01-01

    The Parallelization of Rocket Engine Simulator Software (PRESS) project is part of a collaborative effort with Southern University at Baton Rouge (SUBR), University of West Florida (UWF), and Jackson State University (JSU). The second-year funding, which supports two graduate students enrolled in our new Master's program in Computer Science at Hampton University and the principal investigator, has been obtained for the period from October 19, 1996 through October 18, 1997. The key part of the interim report was new directions for the second-year funding. These came about from discussions during the Rocket Engine Numeric Simulator (RENS) project meeting in Pensacola on January 17-18, 1997. At that time, a software agreement between Hampton University and NASA Lewis Research Center had already been concluded. That agreement concerns off-NASA-site experimentation with the PUMPDES/TURBDES software. Before this agreement, during the first year of the project, another large-scale FORTRAN-based software package, Two-Dimensional Kinetics (TDK), was being used for translation to an object-oriented language and for parallelization experiments. However, that package proved too complex and insufficiently documented for an effective translation effort to object-oriented C++ source code. The focus, this time with the better documented and more manageable PUMPDES/TURBDES package, was still on translation to C++ with design improvements. At the RENS meeting, however, the impetus for the RENS projects in general, and PRESS in particular, shifted in two important ways. One was closer alignment with the work on the Numerical Propulsion System Simulator (NPSS) through cooperation and collaboration with the LERC ACLU organization. The other was to determine whether and how NASA's various rocket design software packages can be run over local networks and intranets without any radical redesign or translation into object-oriented source code. There were also suggestions that the FORTRAN-based code be encapsulated in C++ code, thereby facilitating reuse without undue development effort. The details are covered in the aforementioned section of the interim report filed on April 28, 1997.

  6. Precision analysis of a quantitative CT liver surface nodularity score.

    PubMed

    Smith, Andrew; Varney, Elliot; Zand, Kevin; Lewis, Tara; Sirous, Reza; York, James; Florez, Edward; Abou Elkassem, Asser; Howard-Claudio, Candace M; Roda, Manohar; Parker, Ellen; Scortegagna, Eduardo; Joyner, David; Sandlin, David; Newsome, Ashley; Brewster, Parker; Lirette, Seth T; Griswold, Michael

    2018-04-26

    To evaluate precision of a software-based liver surface nodularity (LSN) score derived from CT images. An anthropomorphic CT phantom was constructed with simulated liver containing smooth and nodular segments at the surface and simulated visceral and subcutaneous fat components. The phantom was scanned multiple times on a single CT scanner with adjustment of image acquisition and reconstruction parameters (N = 34) and on 22 different CT scanners from 4 manufacturers at 12 imaging centers. LSN scores were obtained using a software-based method. Repeatability and reproducibility were evaluated by intraclass correlation (ICC) and coefficient of variation. Using abdominal CT images from 68 patients with various stages of chronic liver disease, inter-observer agreement and test-retest repeatability among 12 readers assessing LSN by software- vs. visual-based scoring methods were evaluated by ICC. There was excellent repeatability of LSN scores (ICC:0.79-0.99) using the CT phantom and routine image acquisition and reconstruction parameters (kVp 100-140, mA 200-400, and auto-mA, section thickness 1.25-5.0 mm, field of view 35-50 cm, and smooth or standard kernels). There was excellent reproducibility (smooth ICC: 0.97; 95% CI 0.95, 0.99; CV: 7%; nodular ICC: 0.94; 95% CI 0.89, 0.97; CV: 8%) for LSN scores derived from CT images from 22 different scanners. Inter-observer agreement for the software-based LSN scoring method was excellent (ICC: 0.84; 95% CI 0.79, 0.88; CV: 28%) vs. good for the visual-based method (ICC: 0.61; 95% CI 0.51, 0.69; CV: 43%). Test-retest repeatability for the software-based LSN scoring method was excellent (ICC: 0.82; 95% CI 0.79, 0.84; CV: 12%). The software-based LSN score is a quantitative CT imaging biomarker with excellent repeatability, reproducibility, inter-observer agreement, and test-retest repeatability.
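    The summary statistics reported here can be sketched generically; the following Python sketch implements the one-way random-effects ICC(1,1) and the coefficient of variation (standard textbook formulas for illustration; the study may have used other ICC forms):

```python
def icc_oneway(scores):
    """One-way random-effects ICC(1,1).
    scores[i][j] = j-th repeated measurement of target i."""
    n = len(scores)        # number of targets (e.g., phantom segments)
    k = len(scores[0])     # measurements per target
    grand = sum(sum(row) for row in scores) / (n * k)
    row_means = [sum(row) / k for row in scores]
    # Between-target and within-target mean squares.
    msb = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msw = sum((x - row_means[i]) ** 2
              for i, row in enumerate(scores) for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

def coefficient_of_variation(values):
    """Sample standard deviation divided by the mean."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / (len(values) - 1)
    return (var ** 0.5) / mean
```

    An ICC near 1 (as reported for the LSN scores) means almost all variance is between targets rather than between repeated measurements.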

  7. Statistical software applications used in health services research: analysis of published studies in the U.S

    PubMed Central

    2011-01-01

    Background This study aims to identify the statistical software applications most commonly employed for data analysis in health services research (HSR) studies in the U.S. The study also examines the extent to which information describing the specific analytical software utilized is provided in published articles reporting on HSR studies. Methods Data were extracted from a sample of 1,139 articles (including 877 original research articles) published between 2007 and 2009 in three U.S. HSR journals that were considered to be representative of the field based upon a set of selection criteria. Descriptive analyses were conducted to categorize patterns in statistical software usage in those articles. The data were stratified by calendar year to detect trends in software use over time. Results Only 61.0% of original research articles in prominent U.S. HSR journals identified the particular statistical software application used for data analysis. Stata and SAS were overwhelmingly the most commonly employed applications (used in 46.0% and 42.6% of articles, respectively). However, SAS use grew considerably during the study period compared to other applications. Stratification of the data revealed that the type of statistical software used varied considerably by whether authors were from the U.S. or from other countries. Conclusions The findings highlight a need for HSR investigators to identify more consistently the specific analytical software used in their studies. Knowing that information can be important because different software packages might produce varying results, owing to differences in their underlying estimation methods. PMID:21977990

  8. 26 CFR 1.197-2 - Amortization of goodwill and certain other intangibles.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ..., process, design, pattern, know-how, format, package design, computer software (as defined in paragraph (c... agreement that provides one of the parties to the agreement with the right to distribute, sell, or provide... any program or routine (that is, any sequence of machine-readable code) that is designed to cause a...

  9. Continuous recording and interobserver agreement algorithms reported in the Journal of Applied Behavior Analysis (1995-2005).

    PubMed

    Mudford, Oliver C; Taylor, Sarah Ann; Martin, Neil T

    2009-01-01

    We reviewed all research articles in 10 recent volumes of the Journal of Applied Behavior Analysis (JABA): Vol. 28(3), 1995, through Vol. 38(2), 2005. Continuous recording was used in the majority (55%) of the 168 articles reporting data on free-operant human behaviors. Three methods for reporting interobserver agreement (exact agreement, block-by-block agreement, and time-window analysis) were employed in more than 10 of the articles that reported continuous recording. Having identified these currently popular agreement computation algorithms, we explain them to assist researchers, software writers, and other consumers of JABA articles.
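    The first two of the agreement algorithms named above can be sketched generically in Python (an illustrative sketch of the standard formulas, not code from the article); `obs1` and `obs2` are hypothetical per-interval response counts from two observers:

```python
def exact_agreement(obs1, obs2):
    """Percentage of intervals in which both observers recorded
    exactly the same response count."""
    matches = sum(1 for a, b in zip(obs1, obs2) if a == b)
    return 100.0 * matches / len(obs1)

def block_by_block_agreement(obs1, obs2):
    """Per-interval ratio of smaller to larger count (1 when both
    are zero), averaged over intervals and expressed as a percentage."""
    ratios = []
    for a, b in zip(obs1, obs2):
        ratios.append(1.0 if a == b == 0 else min(a, b) / max(a, b))
    return 100.0 * sum(ratios) / len(ratios)
```

    Block-by-block agreement gives partial credit within an interval, so it is always at least as high as exact agreement on the same data.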

  10. Agreement between Computerized and Human Assessment of Performance on the Ruff Figural Fluency Test

    PubMed Central

    Elderson, Martin F.; Pham, Sander; van Eersel, Marlise E. A.; Wolffenbuttel, Bruce H. R.; Kok, Johan; Gansevoort, Ron T.; Tucha, Oliver; van der Klauw, Melanie M.; Slaets, Joris P. J.

    2016-01-01

    The Ruff Figural Fluency Test (RFFT) is a sensitive test for nonverbal fluency suitable for all age groups. However, assessment of performance on the RFFT is time-consuming and may be affected by interrater differences. Therefore, we developed computer software specifically designed to analyze performance on the RFFT by automated pattern recognition. The aim of this study was to compare assessment by the new software with conventional assessment by human raters. The software was developed using data from the Lifelines Cohort Study and validated in an independent cohort of the Prevention of Renal and Vascular End Stage Disease (PREVEND) study. The total study population included 1,761 persons: 54% men; mean age (SD), 58 (10) years. All RFFT protocols were assessed by the new software and two independent human raters (criterion standard). The mean number of unique designs (SD) was 81 (29) and the median number of perseverative errors (interquartile range) was 9 (4 to 16). The intraclass correlation coefficient (ICC) between the computerized and human assessment was 0.994 (95%CI, 0.988 to 0.996; p<0.001) and 0.991 (95%CI, 0.990 to 0.991; p<0.001) for the number of unique designs and perseverative errors, respectively. The mean difference (SD) between the computerized and human assessment was -1.42 (2.78) and +0.02 (1.94) points for the number of unique designs and perseverative errors, respectively. This was comparable to the agreement between two independent human assessments: ICC, 0.995 (0.994 to 0.995; p<0.001) and 0.985 (0.982 to 0.988; p<0.001), and mean difference (SD), -0.44 (2.98) and +0.56 (2.36) points for the number of unique designs and perseverative errors, respectively. We conclude that the agreement between the computerized and human assessment was very high and comparable to the agreement between two independent human assessments. Therefore, the software is an accurate tool for the assessment of performance on the RFFT. PMID:27661083

  11. Technical note on the validation of a semi-automated image analysis software application for estrogen and progesterone receptor detection in breast cancer.

    PubMed

    Krecsák, László; Micsik, Tamás; Kiszler, Gábor; Krenács, Tibor; Szabó, Dániel; Jónás, Viktor; Császár, Gergely; Czuni, László; Gurzó, Péter; Ficsor, Levente; Molnár, Béla

    2011-01-18

    The immunohistochemical detection of estrogen (ER) and progesterone (PR) receptors in breast cancer is routinely used for prognostic and predictive testing. Whole slide digitalization supported by dedicated software tools allows quantization of image objects (e.g. cell membrane, nuclei) and an unbiased analysis of immunostaining results. Validation studies of image analysis applications for the detection of ER and PR in breast cancer specimens have shown strong concordance between the pathologist's manual assessment of slides and scoring performed using different software applications. We studied the effectiveness of a connected pair of semi-automated image analysis software tools (the NuclearQuant v. 1.13 application for Pannoramic™ Viewer v. 1.14) for determining ER and PR status in formalin-fixed, paraffin-embedded breast cancer specimens immunostained with the automated Leica Bond Max system. First, the detection algorithm was calibrated to the scores provided by an independent assessor (a pathologist), using selected areas from 38 small digital slides (created from 16 cases) containing a mean number of 195 cells. Each cell was manually marked and scored according to the Allred system, which combines frequency and intensity scores. The performance of the calibrated algorithm was tested on 16 cases (14 invasive ductal carcinoma, 2 invasive lobular carcinoma) against the pathologist's manual scoring of digital slides. The detection was calibrated to 87 percent object detection agreement and almost perfect Total Score agreement (Cohen's kappa 0.859, quadratic weighted kappa 0.986), up from slight or moderate agreement at the start of the study with the un-calibrated algorithm. The performance of the application was tested against the pathologist's manual scoring of digital slides on 53 regions of interest of 16 ER and PR slides covering all positivity ranges, and the quadratic weighted kappa showed almost perfect agreement (κ = 0.981) between the two scoring schemes. 
The NuclearQuant v. 1.13 application for Pannoramic™ Viewer v. 1.14 proved to be a reliable image analysis tool for pathologists testing ER and PR status in breast cancer.
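    The quadratic weighted kappa statistic reported in this validation can be sketched generically (the standard formula, as an illustrative Python sketch; this is not the NuclearQuant software):

```python
def quadratic_weighted_kappa(r1, r2, categories):
    """Cohen's kappa with quadratic disagreement weights,
    suitable for ordinal scores such as Allred scores."""
    n, k = len(r1), len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    # Observed joint proportion matrix.
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(r1, r2):
        obs[idx[a]][idx[b]] += 1 / n
    # Marginal proportions for each rater.
    p1 = [sum(obs[i][j] for j in range(k)) for i in range(k)]
    p2 = [sum(obs[i][j] for i in range(k)) for j in range(k)]
    # Quadratic weights: penalty grows with squared category distance.
    w = [[(i - j) ** 2 / (k - 1) ** 2 for j in range(k)] for i in range(k)]
    d_obs = sum(w[i][j] * obs[i][j] for i in range(k) for j in range(k))
    d_exp = sum(w[i][j] * p1[i] * p2[j] for i in range(k) for j in range(k))
    return 1 - d_obs / d_exp
```

    A value near 1, as in the study, means observed disagreement is far smaller than expected by chance.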

  12. [Classification of results of studying blood plasma with laser correlation spectroscopy based on semiotics of preclinical and clinical states].

    PubMed

    Ternovoĭ, K S; Kryzhanovskiĭ, G N; Musiĭchuk, Iu I; Noskin, L A; Klopov, N V; Noskin, V A; Starodub, N F

    1998-01-01

    The usage of laser correlation spectroscopy for the verification of preclinical and clinical states is substantiated. A "semiotic" classifier developed for solving problems of preclinical and clinical states is presented, together with the substantiation of its biological algorithms and the mathematical support and software for applying the classifier to laser correlation spectroscopy data from blood plasma.

  13. Examining the Impact of an Integrative Method of Using Technology on Students' Achievement and Efficiency of Computer Usage and on Pedagogical Procedure in Geometry

    ERIC Educational Resources Information Center

    Gurevich, Irina; Gurev, Dvora

    2012-01-01

    In the current study we follow the development of the pedagogical procedure for the course "Constructions in Geometry" that resulted from using dynamic geometry software (DGS), where the computer became an integral part of the educational process. Furthermore, we examine the influence of integrating DGS into the course on students' achievement and…

  14. The Impact of Computer and Mathematics Software Usage on Performance of School Leavers in the Western Cape Province of South Africa: A Comparative Analysis

    ERIC Educational Resources Information Center

    Smith, Garth Spencer; Hardman, Joanne

    2014-01-01

    In this study the impact of computer immersion on performance of school leavers Senior Certificate mathematics scores was investigated across 31 schools in the EMDC East education district of Cape Town, South Africa by comparing performance between two groups: a control and an experimental group. The experimental group (14 high schools) had access…

  15. Comparative assessment of methods for the fusion transcripts detection from RNA-Seq data

    PubMed Central

    Kumar, Shailesh; Vo, Angie Duy; Qin, Fujun; Li, Hui

    2016-01-01

    RNA-Seq has made possible the global identification of fusion transcripts, i.e. “chimeric RNAs”. Even though various software packages have been developed to serve this purpose, they behave differently on different datasets. It is important for both users and developers to have an unbiased assessment of the performance of existing fusion detection tools. Toward this goal, we compared the performance of 12 well-known fusion detection software packages. We evaluated the sensitivity, false discovery rate, computing time, and memory usage of these tools on four different datasets (positive, negative, mixed, and test). We conclude that some tools are better than others in terms of sensitivity, positive prediction value, time consumption, and memory usage. We also observed small overlaps among the fusions detected by different tools in the real dataset (test dataset). This could be due to false discoveries by the various tools, but could also reflect the fact that none of the tools is inclusive. We have found that the performance of the tools depends on the quality, read length, and number of reads of the RNA-Seq data. We recommend that users choose the proper tools for their purpose based on the properties of their RNA-Seq data. PMID:26862001
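    The sensitivity and false discovery rate used to compare the tools can be computed from each caller's reported fusions and a known truth set; a minimal Python sketch (illustrative only, not the benchmark's actual pipeline; the fusion names are hypothetical):

```python
def evaluate_caller(reported, truth):
    """Sensitivity and false discovery rate of a fusion caller
    against a known truth set of fusion identifiers."""
    reported, truth = set(reported), set(truth)
    tp = len(reported & truth)                     # true positives
    sensitivity = tp / len(truth)
    fdr = (len(reported) - tp) / len(reported) if reported else 0.0
    return sensitivity, fdr
```

    The small overlap between tools noted above would show up here as different `reported` sets yielding similar sensitivity but sharing few true positives.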

  16. BRepertoire: a user-friendly web server for analysing antibody repertoire data.

    PubMed

    Margreitter, Christian; Lu, Hui-Chun; Townsend, Catherine; Stewart, Alexander; Dunn-Walters, Deborah K; Fraternali, Franca

    2018-04-14

    Antibody repertoire analysis by high throughput sequencing is now widely used, but a persisting challenge is enabling immunologists to explore their data to discover discriminating repertoire features for their own particular investigations. Computational methods are necessary for large-scale evaluation of antibody properties. We have developed BRepertoire, a suite of user-friendly web-based software tools for large-scale statistical analyses of repertoire data. The software is able to use data preprocessed by IMGT, and performs statistical and comparative analyses with versatile plotting options. BRepertoire has been designed to operate in various modes, for example analysing sequence-specific V(D)J gene usage, discerning physico-chemical properties of the CDR regions, and clustering of clonotypes. Those analyses are performed on the fly by a number of R packages and are deployed via a Shiny web platform. The user can download the analysed data in different table formats and save the generated plots as image files ready for publication. We believe BRepertoire to be a versatile analytical tool that complements experimental studies of immune repertoires. To illustrate the server's functionality, we show use cases including differential gene usage in a vaccination dataset and analysis of CDR3H properties in old and young individuals. The server is accessible under http://mabra.biomed.kcl.ac.uk/BRepertoire.

  17. Sequence similarity is more relevant than species specificity in probabilistic backtranslation.

    PubMed

    Ferro, Alfredo; Giugno, Rosalba; Pigola, Giuseppe; Pulvirenti, Alfredo; Di Pietro, Cinzia; Purrello, Michele; Ragusa, Marco

    2007-02-21

    Backtranslation is the process of decoding a sequence of amino acids into the corresponding codons. All synthetic gene design systems include a backtranslation module. The degeneracy of the genetic code makes backtranslation potentially ambiguous, since most amino acids are encoded by multiple codons. The common approach to overcoming this difficulty is based on imitation of codon usage within the target species. This paper describes EasyBack, new parameter-free, fully automated software for backtranslation using Hidden Markov Models. EasyBack is not based on imitation of codon usage within the target species, but instead uses a sequence-similarity criterion. The model is trained with a set of proteins with known cDNA coding sequences, constructed from the input protein by querying the NCBI databases with BLAST. Unlike existing software, the proposed method allows the quality of prediction to be estimated. When tested on a group of proteins that show different degrees of sequence conservation, EasyBack outperforms other published methods in terms of precision. The prediction quality of protein backtranslation is markedly increased by replacing the most-used-codon-in-the-same-species criterion with a Hidden Markov Model trained on a set of the most similar sequences from all species. Moreover, the proposed method allows the quality of prediction to be estimated probabilistically.
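    For contrast with EasyBack's HMM-based approach, the conventional most-used-codon baseline it improves upon can be sketched in a few lines of Python (the codon table below is illustrative only, not taken from any specific organism):

```python
# Hypothetical most-frequent codon per amino acid (one-letter codes);
# a real table would cover all 20 amino acids for a chosen organism.
MOST_USED = {"M": "ATG", "K": "AAA", "F": "TTC", "L": "CTG", "*": "TAA"}

def naive_backtranslate(protein, table=MOST_USED):
    """Baseline backtranslation: pick each amino acid's most used codon."""
    return "".join(table[aa] for aa in protein)
```

    This baseline is deterministic and ignores sequence context, which is exactly what the HMM trained on BLAST-retrieved similar sequences is designed to improve on.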

  18. 76 FR 29609 - Wassenaar Arrangement 2010 Plenary Agreements Implementation: Commerce Control List, Definitions...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-20

    ... apply to computers specially designed for ``civil aircraft'' applications to prevent this control from... section. Adding a new paragraph 5D002.d to control ``software'' designed or modified to enable an item to... 6D003.f.3 and f.4 to control ``software'' and ``source code,'' specially designed for ``real time...

  19. Easy Handling of Sensors and Actuators over TCP/IP Networks by Open Source Hardware/Software

    PubMed Central

    Mejías, Andrés; Herrera, Reyes S.; Márquez, Marco A.; Calderón, Antonio José; González, Isaías; Andújar, José Manuel

    2017-01-01

    There are several specific solutions for accessing sensors and actuators present in any process or system through a TCP/IP network, either local or a wide area type like the Internet. The usage of sensors and actuators of different nature and diverse interfaces (SPI, I2C, analogue, etc.) makes access to them from a network in a homogeneous and secure way more complex. A framework, including both software and hardware resources, is necessary to simplify and unify networked access to these devices. In this paper, a set of open-source software tools, specifically designed to cover the different issues concerning the access to sensors and actuators, and two proposed low-cost hardware architectures to operate with the abovementioned software tools are presented. They allow integrated and easy access to local or remote sensors and actuators. The software tools, integrated in the free authoring tool Easy Java and Javascript Simulations (EJS) solve the interaction issues between the subsystem that integrates sensors and actuators into the network, called convergence subsystem in this paper, and the Human Machine Interface (HMI)—this one designed using the intuitive graphical system of EJS—located on the user’s computer. The proposed hardware architectures and software tools are described and experimental implementations with the proposed tools are presented. PMID:28067801

  20. Easy Handling of Sensors and Actuators over TCP/IP Networks by Open Source Hardware/Software.

    PubMed

    Mejías, Andrés; Herrera, Reyes S; Márquez, Marco A; Calderón, Antonio José; González, Isaías; Andújar, José Manuel

    2017-01-05

    There are several specific solutions for accessing sensors and actuators present in any process or system through a TCP/IP network, either local or a wide area type like the Internet. The usage of sensors and actuators of different nature and diverse interfaces (SPI, I2C, analogue, etc.) makes access to them from a network in a homogeneous and secure way more complex. A framework, including both software and hardware resources, is necessary to simplify and unify networked access to these devices. In this paper, a set of open-source software tools, specifically designed to cover the different issues concerning the access to sensors and actuators, and two proposed low-cost hardware architectures to operate with the abovementioned software tools are presented. They allow integrated and easy access to local or remote sensors and actuators. The software tools, integrated in the free authoring tool Easy Java and Javascript Simulations (EJS) solve the interaction issues between the subsystem that integrates sensors and actuators into the network, called convergence subsystem in this paper, and the Human Machine Interface (HMI)-this one designed using the intuitive graphical system of EJS-located on the user's computer. The proposed hardware architectures and software tools are described and experimental implementations with the proposed tools are presented.

  1. Efficient radiologic reading environment by using an open-source macro program as connection software.

    PubMed

    Lee, Young Han

    2012-01-01

    The objectives are (1) to introduce an easy open-source macro program as connection software and (2) to illustrate its practical usage in the radiologic reading environment by simulating the radiologic reading process. The simulation is a set of radiologic reading processes for performing practical tasks in the radiologic reading room. The principal processes are: (1) viewing radiologic images on the Picture Archiving and Communicating System (PACS), (2) connecting to the HIS/EMR (Hospital Information System/Electronic Medical Record) system, (3) building an automatic radiologic reporting system, and (4) recording and recalling information on interesting cases. This simulation environment was designed using an open-source macro program as connection software. The simulation performed well on a Windows-based PACS workstation. Radiologists practiced the steps of the simulation comfortably by utilizing the macro-powered radiologic environment. This macro program could automate several cumbersome manual steps in the radiologic reading process. The program successfully acts as connection software for the PACS software, EMR/HIS, spreadsheets, and various other input devices in the radiologic reading environment. A user-friendly, efficient radiologic reading environment can be established by utilizing an open-source macro program as connection software. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  2. A soft computing based approach using modified selection strategy for feature reduction of medical systems.

    PubMed

    Zuhtuogullari, Kursat; Allahverdi, Novruz; Arikan, Nihat

    2013-01-01

    Systems with high-dimensional input spaces require long processing times and large memory usage. Most attribute selection algorithms suffer from input dimension limits and information storage problems. These problems are eliminated by the developed feature reduction software, which uses a new modified selection mechanism that adds solution candidates from the middle region. The hybrid system software is constructed for reducing the input attributes of systems with a large number of input variables. The designed software also supports the roulette wheel selection mechanism. Linear order crossover is used as the recombination operator. In genetic algorithm based soft computing methods, locking into local solutions is also a problem, which is eliminated by the developed software. Faster and more effective results are obtained in the test procedures. Twelve input variables of the urological system have been reduced to reducts (reduced input attribute sets) with seven, six, and five elements. The obtained results show that the developed software with modified selection has advantages in memory allocation, execution time, classification accuracy, sensitivity, and specificity when compared with other reduction algorithms on the urological test data.
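    The roulette wheel (fitness-proportionate) selection mechanism mentioned above can be sketched generically in Python (an illustrative sketch of the standard operator, not the authors' implementation):

```python
import random

def roulette_wheel_select(population, fitness, rng=random):
    """Fitness-proportionate selection: each individual is chosen
    with probability proportional to its fitness."""
    total = sum(fitness)
    pick = rng.uniform(0, total)      # spin the wheel
    cumulative = 0.0
    for individual, f in zip(population, fitness):
        cumulative += f
        if pick <= cumulative:
            return individual
    return population[-1]             # guard against float round-off
```

    In a feature-reduction GA, each individual would encode a candidate attribute subset and its fitness would reflect classification accuracy and subset size.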

  3. A Soft Computing Based Approach Using Modified Selection Strategy for Feature Reduction of Medical Systems

    PubMed Central

    Zuhtuogullari, Kursat; Allahverdi, Novruz; Arikan, Nihat

    2013-01-01

    Systems with high-dimensional input spaces require long processing times and large memory usage. Most attribute selection algorithms suffer from input dimension limits and information storage problems. These problems are eliminated by the developed feature reduction software, which uses a new modified selection mechanism that adds solution candidates from the middle region. The hybrid system software is constructed for reducing the input attributes of systems with a large number of input variables. The designed software also supports the roulette wheel selection mechanism. Linear order crossover is used as the recombination operator. In genetic algorithm based soft computing methods, locking into local solutions is also a problem, which is eliminated by the developed software. Faster and more effective results are obtained in the test procedures. Twelve input variables of the urological system have been reduced to reducts (reduced input attribute sets) with seven, six, and five elements. The obtained results show that the developed software with modified selection has advantages in memory allocation, execution time, classification accuracy, sensitivity, and specificity when compared with other reduction algorithms on the urological test data. PMID:23573172

  4. An Extensible, User-Modifiable Framework for Planning Activities

    NASA Technical Reports Server (NTRS)

    Joswig, Joseph C.; Abramyan, Lucy; Mickelson, Megan C.; Wallick, Michael N.; Kurien, James A.; Crockett, Thomas M.; Powell, Mark W.; Pyrzak, Guy; Aghevli, Arash

    2013-01-01

    This software provides a development framework that allows planning activities for the Mars Science Laboratory rover to be altered at any time, based on changes of the Activity Dictionary. The Activity Dictionary contains the definition of all activities that can be carried out by a particular asset (robotic or human). These definitions (and combinations of these definitions) are used by mission planners to give a daily plan of what a mission should do. During the development and course of the mission, the Activity Dictionary and actions that are going to be carried out will often be changed. Previously, such changes would require a change to the software and redeployment. Now, the Activity Dictionary authors are able to customize activity definitions, parameters, and resource usage without requiring redeployment. This software provides developers and end users the ability to modify the behavior of automatically generated activities using a script. This allows changes to the software behavior without incurring the burden of redeployment. This software is currently being used for the Mars Science Laboratory, and is in the process of being integrated into the LADEE (Lunar Atmosphere and Dust Environment Explorer) mission, as well as the International Space Station.

  5. Gender agreement and multiple referents.

    PubMed

    Finocchiaro, Chiara; Mahon, Bradford Z; Caramazza, Alfonso

    2008-01-01

    We report a new pattern of usage in current, spoken Italian that has implications for both psycholinguistic models of language production and linguistic theories of language change. In Italian, gender agreement is mandatory for both singular and plural nouns. However, when two or more nouns of different grammatical gender appear in a conjoined noun phrase (NP), masculine plural agreement is required. In this study, we combined on-line and off-line methodologies in order to assess the mechanisms involved in gender marking in the context of multiple referents. The results of two pronoun production tasks showed that plural feminine agreement was significantly more difficult than plural masculine agreement. In a separate study using off-line judgements of acceptability, we found that agreement violations in Italian are tolerated more readily in the case of feminine conjoined noun phrases (e.g., la mela e la banana 'the:fem apple:fem and the:fem banana:fem') than masculine conjoined noun phrases (e.g., il fiore e il libro 'the:mas flower:mas and the:mas book:mas'). Implications of these results are discussed both at the level of functional architecture within the language production system and at the level of changes in language use.
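The agreement rule described above, under which any masculine conjunct forces masculine plural agreement, can be stated as a toy function (an illustrative sketch, not part of the study):

```python
def plural_agreement(genders):
    """Resolve Italian plural agreement for a conjoined noun phrase.

    Feminine plural only when every conjunct is feminine;
    any masculine conjunct forces masculine plural.
    """
    return "fem.pl" if all(g == "fem" for g in genders) else "masc.pl"

# la mela e la banana -> feminine plural
assert plural_agreement(["fem", "fem"]) == "fem.pl"
# il fiore e la mela would still be masculine plural (mixed gender)
assert plural_agreement(["masc", "fem"]) == "masc.pl"
```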

  6. 26 CFR 1.197-2 - Amortization of goodwill and certain other intangibles.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ..., process, design, pattern, know-how, format, package design, computer software (as defined in paragraph (c... section 1253(b)(1) and includes any agreement that provides one of the parties to the agreement with the... any program or routine (that is, any sequence of machine-readable code) that is designed to cause a...

  7. What Are We Looking for in Computer-Based Learning Interventions in Medical Education? A Systematic Review.

    PubMed

    Taveira-Gomes, Tiago; Ferreira, Patrícia; Taveira-Gomes, Isabel; Severo, Milton; Ferreira, Maria Amélia

    2016-08-01

    Computer-based learning (CBL) has been widely used in medical education, and reports regarding its usage and effectiveness have ranged broadly. Most work has been done on the effectiveness of CBL approaches versus traditional methods, and little has been done on the comparative effects of CBL versus CBL methodologies. These findings urged other authors to recommend such studies in hopes of improving knowledge about which CBL methods work best in which settings. In this systematic review, we aimed to characterize recent studies of the development of software platforms and interventions in medical education, search for common points among studies, and assess whether recommendations for CBL research are being taken into consideration. We conducted a systematic review of the literature published from 2003 through 2013. We included studies written in English, specifically in medical education, regarding either the development of instructional software or interventions using instructional software, during training or practice, that reported learner attitudes, satisfaction, knowledge, skills, or software usage. We conducted 2 latent class analyses to group articles according to platform features and intervention characteristics. In addition, we analyzed references and citations for abstracted articles. We analyzed 251 articles. The number of publications rose over time, and they encompassed most medical disciplines, learning settings, and training levels, totaling 25 different platforms specifically for medical education. We uncovered 4 latent classes for educational software, characteristically making use of multimedia (115/251, 45.8%); text (64/251, 25.5%); Web conferencing (54/251, 21.5%); and instructional design principles (18/251, 7.2%). We found 3 classes for intervention outcomes: knowledge and attitudes (175/212, 82.6%); knowledge, attitudes, and skills (11.8%); and online activity (12/212, 5.7%). 
About a quarter of the articles (58/227, 25.6%) shared no references or citations with other articles. The number of common references and citations increased in articles reporting instructional design principles (P=.03), articles measuring online activities (P=.01), and articles citing a review by Cook and colleagues on CBL (P=.04). There was an association between the number of citations and studies comparing CBL versus CBL, independent of publication date (P=.02). Studies in this field vary widely, and a large number of software systems are being developed. It seems that past recommendations regarding CBL interventions are being taken into consideration. A move to a more student-centered model, a focus on implementing reusable software platforms for specific learning contexts, and the analysis of online activity to track and predict outcomes are relevant areas for future research in this field.

  8. A semi-automated volumetric software for segmentation and perfusion parameter quantification of brain tumors using 320-row multidetector computed tomography: a validation study.

    PubMed

    Chae, Soo Young; Suh, Sangil; Ryoo, Inseon; Park, Arim; Noh, Kyoung Jin; Shim, Hackjoon; Seol, Hae Young

    2017-05-01

    We developed semi-automated volumetric software, NPerfusion, to segment brain tumors and quantify perfusion parameters on whole-brain CT perfusion (WBCTP) images. The purpose of this study was to assess the feasibility of the software and to validate its performance compared with manual segmentation. Twenty-nine patients with pathologically proven brain tumors who underwent preoperative WBCTP between August 2012 and February 2015 were included. Three perfusion parameters, arterial flow (AF), equivalent blood volume (EBV), and Patlak flow (PF, a measure of capillary permeability), of brain tumors were generated by commercial software and then quantified volumetrically by NPerfusion, which also semi-automatically segmented tumor boundaries. The quantification was validated by comparison with that of manual segmentation in terms of the concordance correlation coefficient and Bland-Altman analysis. With NPerfusion, we successfully performed segmentation and quantified whole volumetric perfusion parameters of all 29 brain tumors, which showed perfusion trends consistent with previous studies. The validation of the perfusion parameter quantification exhibited almost perfect agreement with manual segmentation, with Lin concordance correlation coefficients (ρc) for AF, EBV, and PF of 0.9988, 0.9994, and 0.9976, respectively. On Bland-Altman analysis, most differences between this software and manual segmentation on the commercial software were within the limit of agreement. NPerfusion successfully performs segmentation of brain tumors and calculates perfusion parameters of brain tumors. We validated this semi-automated segmentation software by comparing it with manual segmentation. NPerfusion can be used to calculate volumetric perfusion parameters of brain tumors from WBCTP.
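Lin's concordance correlation coefficient used in this validation has a standard closed form; a minimal sketch (with made-up values, not the study's data) is:

```python
import statistics

def lin_ccc(x, y):
    """Lin's concordance correlation coefficient between two measurements."""
    n = len(x)
    mx, my = statistics.fmean(x), statistics.fmean(y)
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    # Penalizes both poor correlation and mean/scale shifts between raters
    return 2 * cov / (vx + vy + (mx - my) ** 2)

# Identical measurements give perfect concordance (1.0);
# perfectly reversed ones give -1.0
assert abs(lin_ccc([1, 2, 3], [1, 2, 3]) - 1.0) < 1e-12
assert abs(lin_ccc([1, 2, 3], [3, 2, 1]) + 1.0) < 1e-12
```

Unlike Pearson's r, the CCC drops below 1 whenever one method is systematically biased relative to the other, which is why it is paired with Bland-Altman analysis here.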

  9. G-Guidance Interface Design for Small Body Mission Simulation

    NASA Technical Reports Server (NTRS)

    Acikmese, Behcet; Carson, John; Phan, Linh

    2008-01-01

    The G-Guidance software implements a guidance and control (G and C) algorithm for small-body, autonomous proximity operations, developed under the Small Body GN and C task at JPL. The software is written in Matlab and interfaces with G-OPT, a JPL-developed optimization package written in C that provides G-Guidance with guaranteed convergence to a solution in a finite computation time with a prescribed accuracy. The resulting program is computationally efficient and is a prototype of an onboard, real-time algorithm for autonomous guidance and control. Two thruster firing schemes are available in G-Guidance, allowing tailoring of the software for specific mission maneuvers. For example, descent, landing, or rendezvous benefit from a thruster firing at the maneuver termination to mitigate velocity errors. Conversely, ascent or separation maneuvers benefit from an immediate firing to avoid potential drift toward a second body. The guidance portion of this software explicitly enforces user-defined control constraints and thruster silence times while minimizing total fuel usage. This program is currently specialized to small-body proximity operations, but the underlying method can be generalized to other applications.

  10. Section 4. The GIS Weasel User's Manual

    USGS Publications Warehouse

    Viger, Roland J.; Leavesley, George H.

    2007-01-01

    INTRODUCTION The GIS Weasel was designed to aid in the preparation of spatial information for input to lumped and distributed parameter hydrologic or other environmental models. The GIS Weasel provides geographic information system (GIS) tools to help create maps of geographic features relevant to a user's model and to generate parameters from those maps. The operation of the GIS Weasel does not require the user to be a GIS expert, only that the user have an understanding of the spatial information requirements of the environmental simulation model being used. The GIS Weasel software system uses a GIS-based graphical user interface (GUI), the C programming language, and external scripting languages. The software will run on any computing platform where ArcInfo Workstation (version 8.0.2 or later) and the GRID extension are accessible. The user controls the processing of the GIS Weasel by interacting with menus, maps, and tables. The purpose of this document is to describe the operation of the software. This document is not intended to describe the usage of this software in support of any particular environmental simulation model. Such guides are published separately.

  11. Usability Considerations in Developing a Graphic Interface for Intra Office Communications

    NASA Astrophysics Data System (ADS)

    Yammiyavar, Pradeep; Jain, Piyush

    This paper outlines the basis for incorporating functional features in new GUI-based software under development for addressing comprehensive communication and interaction needs within an office environment. Benchmarking of features in existing communication software products such as Microsoft Outlook, IBM Lotus Notes, Office Communicator, and Mozilla Thunderbird was done by asking a set of questions related to the usage of these products. Usability issues were identified through a user survey involving 30 subjects of varied profiles (domain, designation, age, etc.) in a corporate office. It is posited that existing software products developed for a universal market may be highly underutilized, or may have redundant features, especially when used as an intra-office (within the same office) communication medium. At the same time, they may not cater to some very contextual requirements of intra-office communication. Based on the findings of the survey of feature preferences and usability of existing products, a simple person-to-person communication medium for intra-office situations was visualized with a new interactive GUI. Usability issues that need to be considered for a new intra-office product are brought out.

  12. Transportable educational programs for scientific and technical professionals: More effective utilization of automated scientific and technical data base systems

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D.

    1987-01-01

    This grant final report executive summary documents a major, long-term program addressing innovative educational issues associated with the development, administration, evaluation, and widespread distribution of transportable educational programs for scientists and engineers to increase their knowledge of, and facilitate their utilization of, automated scientific and technical information storage and retrieval systems. This educational program is of very broad scope, being targeted at Colleges of Engineering and Colleges of Physical Sciences at a large number of colleges and universities throughout the United States. The educational program is designed to incorporate extensive hands-on, interactive usage of the NASA RECON system and is supported by a number of microcomputer-based software systems to facilitate the delivery and usage of the educational course materials developed as part of the program.

  13. Occurrence and fate of pharmaceutically active compounds in the largest municipal wastewater treatment plant in Southwest China: mass balance analysis and consumption back-calculated model.

    PubMed

    Yan, Qing; Gao, Xu; Huang, Lei; Gan, Xiu-Mei; Zhang, Yi-Xin; Chen, You-Peng; Peng, Xu-Ya; Guo, Jin-Song

    2014-03-01

    The occurrence and fate of twenty-one pharmaceutically active compounds (PhACs) were investigated in different steps of the largest wastewater treatment plant (WWTP) in Southwest China. Concentrations of these PhACs were determined in both the wastewater and sludge phases by high-performance liquid chromatography coupled with electrospray ionization tandem mass spectrometry. Results showed that 21 target PhACs were present in wastewater and 18 in sludge. The calculated total mass loads of PhACs per capita to the influent, the receiving water, and the sludge were 4.95 mg d(-1) person(-1), 889.94 μg d(-1) person(-1), and 78.57 μg d(-1) person(-1), respectively. The overall removal efficiency of the individual PhACs ranged from "negative removal" to almost complete removal. Mass balance analysis revealed that biodegradation is believed to be the predominant removal mechanism, while sorption onto sludge was a relevant removal pathway for quinolone antibiotics, azithromycin, and simvastatin, accounting for 9.35-26.96% of the initial loadings; sorption of the other selected PhACs was negligible. The overall pharmaceutical consumption in Chongqing, China, was back-calculated from influent concentrations by considering the pharmacokinetics of the PhACs in humans. The back-estimated usage was in good agreement with the known usage of ofloxacin (agreement ratio: 72.5%), although the back-estimated usage of PhACs requires further verification. Generally, the average influent mass loads and back-calculated annual per capita consumption of the selected antibiotics were comparable to or higher than those reported in developed countries, while the opposite was the case for the other target PhACs. Copyright © 2013 Elsevier Ltd. All rights reserved.
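The influent-load back-calculation referenced above follows the usual wastewater-epidemiology approach; the sketch below is a simplified illustration in which the excretion fraction, load, and population are hypothetical, not the paper's values:

```python
def back_calculate_consumption(influent_load_mg_per_day, population,
                               excretion_fraction):
    """Estimate drug consumption (mg/person/day) from the influent mass load,
    correcting for the fraction of the dose excreted unchanged
    (the pharmacokinetic term)."""
    if not 0 < excretion_fraction <= 1:
        raise ValueError("excretion fraction must be in (0, 1]")
    return influent_load_mg_per_day / (population * excretion_fraction)

# Hypothetical: 500 g/day of a drug reaching a plant serving 1,000,000
# people, with 25% of the dose excreted unchanged
per_capita = back_calculate_consumption(500_000, 1_000_000, 0.25)
# -> 2.0 mg per person per day
```

In practice further correction factors (metabolite conversion, in-sewer losses) are applied, which is one reason the abstract notes the back-estimates need verification.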

  14. Comprehensive evaluation of untargeted metabolomics data processing software in feature detection, quantification and discriminating marker selection.

    PubMed

    Li, Zhucui; Lu, Yan; Guo, Yufeng; Cao, Haijie; Wang, Qinhong; Shui, Wenqing

    2018-10-31

    Data analysis represents a key challenge for untargeted metabolomics studies, as it commonly requires extensive processing of thousands of metabolite peaks in raw high-resolution MS data. Although a number of software packages have been developed to facilitate untargeted data processing, they have not been comprehensively scrutinized for their capabilities in feature detection, quantification, and marker selection using a well-defined benchmark sample set. In this study, we acquired a benchmark dataset from standard mixtures consisting of 1100 compounds with specified concentration ratios, including 130 compounds with significant variation in concentration. The five software packages evaluated here (MS-Dial, MZmine 2, XCMS, MarkerView, and Compound Discoverer) showed similar performance in detecting true features derived from compounds in the mixtures. However, significant differences among the packages were observed in the relative quantification of true features in the benchmark dataset. MZmine 2 outperformed the other software in quantification accuracy, and it reported the most true discriminating markers together with the fewest false markers. Furthermore, we assessed the selection of discriminating markers by the different packages using both the benchmark dataset and a real-case metabolomics dataset, and propose the combined usage of two packages to increase confidence in biomarker identification. Our findings from this comprehensive evaluation of untargeted metabolomics software should help guide future improvements of these widely used bioinformatics tools and enable users to properly interpret their metabolomics results. Copyright © 2018 Elsevier B.V. All rights reserved.

  15. BioSmalltalk: a pure object system and library for bioinformatics.

    PubMed

    Morales, Hernán F; Giovambattista, Guillermo

    2013-09-15

    We have developed BioSmalltalk, a new environment system for pure object-oriented bioinformatics programming. Adaptive end-user programming systems tend to become more important for discovering biological knowledge, as demonstrated by the emergence of open-source programming toolkits for bioinformatics in the past years. Our software is intended to bridge the gap between bioscientists and rapid software prototyping while preserving the possibility of scaling to whole-system biology applications. BioSmalltalk performs better in terms of execution time and memory usage than Biopython and BioPerl in some classical situations. BioSmalltalk is cross-platform and freely available (MIT license) through Google Project Hosting at http://code.google.com/p/biosmalltalk. Contact: hernan.morales@gmail.com. Supplementary data are available at Bioinformatics online.

  16. Comparison of Cortical and Subcortical Measurements in Normal Older Adults across Databases and Software Packages

    PubMed Central

    Rane, Swati; Plassard, Andrew; Landman, Bennett A.; Claassen, Daniel O.; Donahue, Manus J.

    2017-01-01

    This work explores the feasibility of combining anatomical MRI data across two public repositories namely, the Alzheimer’s Disease Neuroimaging Initiative (ADNI) and the Progressive Parkinson’s Markers Initiative (PPMI). We compared cortical thickness and subcortical volumes in cognitively normal older adults between datasets with distinct imaging parameters to assess if they would provide equivalent information. Three distinct datasets were identified. Major differences in data were scanner manufacturer and the use of magnetization inversion to enhance tissue contrast. Equivalent datasets, i.e., those providing similar volumetric measurements in cognitively normal controls, were identified in ADNI and PPMI. These were datasets obtained on the Siemens scanner with TI = 900 ms. Our secondary goal was to assess the agreement between subcortical volumes that are obtained with different software packages. Three subcortical measurement applications (FSL, FreeSurfer, and a recent multi-atlas approach) were compared. Our results show significant agreement in the measurements of caudate, putamen, pallidum, and hippocampus across the packages and poor agreement between measurements of accumbens and amygdala. This is likely due to their smaller size and lack of gray matter-white matter tissue contrast for accurate segmentation. This work provides a segue to combine imaging data from ADNI and PPMI to increase statistical power as well as to interrogate common mechanisms in disparate pathologies such as Alzheimer’s and Parkinson’s diseases. It lays the foundation for comparison of anatomical data acquired with disparate imaging parameters and analyzed with disparate software tools. Furthermore, our work partly explains the variability in the results of studies using different software packages. PMID:29756095

  17. Comparison of Cortical and Subcortical Measurements in Normal Older Adults across Databases and Software Packages.

    PubMed

    Rane, Swati; Plassard, Andrew; Landman, Bennett A; Claassen, Daniel O; Donahue, Manus J

    2017-01-01

    This work explores the feasibility of combining anatomical MRI data across two public repositories namely, the Alzheimer's Disease Neuroimaging Initiative (ADNI) and the Progressive Parkinson's Markers Initiative (PPMI). We compared cortical thickness and subcortical volumes in cognitively normal older adults between datasets with distinct imaging parameters to assess if they would provide equivalent information. Three distinct datasets were identified. Major differences in data were scanner manufacturer and the use of magnetization inversion to enhance tissue contrast. Equivalent datasets, i.e., those providing similar volumetric measurements in cognitively normal controls, were identified in ADNI and PPMI. These were datasets obtained on the Siemens scanner with TI = 900 ms. Our secondary goal was to assess the agreement between subcortical volumes that are obtained with different software packages. Three subcortical measurement applications (FSL, FreeSurfer, and a recent multi-atlas approach) were compared. Our results show significant agreement in the measurements of caudate, putamen, pallidum, and hippocampus across the packages and poor agreement between measurements of accumbens and amygdala. This is likely due to their smaller size and lack of gray matter-white matter tissue contrast for accurate segmentation. This work provides a segue to combine imaging data from ADNI and PPMI to increase statistical power as well as to interrogate common mechanisms in disparate pathologies such as Alzheimer's and Parkinson's diseases. It lays the foundation for comparison of anatomical data acquired with disparate imaging parameters and analyzed with disparate software tools. Furthermore, our work partly explains the variability in the results of studies using different software packages.

  18. The Software Element of the NASA Portable Electronic Device Radiated Emissions Investigation

    NASA Technical Reports Server (NTRS)

    Koppen, Sandra V.; Williams, Reuben A. (Technical Monitor)

    2002-01-01

    NASA Langley Research Center's (LaRC) High Intensity Radiated Fields Laboratory (HIRF Lab) recently conducted a series of electromagnetic radiated emissions tests under a cooperative agreement with Delta Airlines and an interagency agreement with the FAA. The frequency spectrum environment at a commercial airport was measured on location. The environment survey provides a comprehensive picture of the complex nature of the electromagnetic environment present in those areas outside the aircraft. In addition, radiated emissions tests were conducted on portable electronic devices (PEDs) that may be brought onboard aircraft. These tests were performed in both semi-anechoic and reverberation chambers located in the HIRF Lab. The PEDs included cell phones, laptop computers, electronic toys, and family radio systems. The data generated during the tests are intended to support research on the effect of radiated emissions from wireless devices on aircraft systems. Both test systems relied on customized control and data reduction software to provide test and instrument control, data acquisition, a user interface, real-time data reduction, and data analysis. The software executed on PCs running MS Windows 98 and 2000, and used Agilent Pro Visual Engineering Environment (VEE) development software, Common Object Model (COM) technology, and MS Excel.

  19. Performance of automated and manual coding systems for occupational data: a case study of historical records.

    PubMed

    Patel, Mehul D; Rose, Kathryn M; Owens, Cindy R; Bang, Heejung; Kaufman, Jay S

    2012-03-01

    Occupational data are a common source of workplace exposure and socioeconomic information in epidemiologic research. We compared the performance of two occupation coding methods, an automated software and a manual coder, using occupation and industry titles from U.S. historical records. We collected parental occupational data from 1920-40s birth certificates, Census records, and city directories on 3,135 deceased individuals in the Atherosclerosis Risk in Communities (ARIC) study. Unique occupation-industry narratives were assigned codes by a manual coder and the Standardized Occupation and Industry Coding software program. We calculated agreement between coding methods of classification into major Census occupational groups. Automated coding software assigned codes to 71% of occupations and 76% of industries. Of this subset coded by software, 73% of occupation codes and 69% of industry codes matched between automated and manual coding. For major occupational groups, agreement improved to 89% (kappa = 0.86). Automated occupational coding is a cost-efficient alternative to manual coding. However, some manual coding is required to code incomplete information. We found substantial variability between coders in the assignment of occupations although not as large for major groups.
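The chance-corrected agreement statistic reported above (kappa) can be computed from paired code assignments; a minimal sketch with illustrative labels (not the study's data) follows:

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa: agreement between two coders, corrected for chance."""
    n = len(labels_a)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    # Chance agreement: product of each coder's marginal class proportions
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Two coders assigning occupations to major groups (illustrative)
coder = ["mgr", "mgr", "svc", "svc", "prod", "prod"]
software = ["mgr", "mgr", "svc", "prod", "prod", "prod"]
kappa = cohens_kappa(coder, software)
# -> 0.75 (raw agreement 5/6, chance agreement 1/3)
```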

  20. Time Delay Measurements of Key Generation Process on Smart Cards

    DTIC Science & Technology

    2015-03-01

    random number generator is available (Chatterjee & Gupta, 2009). The ECC algorithm will grow in usage as information becomes more and more secure. Figure...Worldwide Mobile Enterprise Security Software 2012–2016 Forecast and Analysis), mobile identity and access management is expected to grow by 27.6 percent...iPad, tablets) as well as 80000 BlackBerry phones. The mobility plan itself will be deployed in three phases over 2014, with the first phase

  1. Systems Biology of the Immune Response to Live and Inactivated Dengue Virus Vaccines

    DTIC Science & Technology

    2017-09-01

    Financial support;  In-kind support (e.g., partner makes software, computers , equipment, etc., available to project staff);  Facilities (e.g...reprints of manuscripts and abstracts, a curriculum vitae, patent applications, study questionnaires, and surveys , etc. Organization name: Walter...memory B-cells and the isotype usage of the antibody response. 9. A project-specific SQL database has been set up on a server based at URI. Major

  2. Software Capability Evaluation (SCE) Version 2.0 Implementation Guide

    DTIC Science & Technology

    1994-02-01

    Affected By SCE B-40 Figure 3-1 SCE Usage Decision Making Criteria 3-44 Figure 3-2 Estimated SCE Labor For One Source Selection 3-53 Figure 3-3 SCE...incorporated into the source selection sponsoring organization’s technical/management team for incorporation into acquisition decisions . The SCE team...expertise, past performance, and organizational capacity in acquisition decisions . The Capability Maturity Model Basic Concepts The CMM is based on the

  3. Evaluation of Service Level Agreement Approaches for Portfolio Management in the Financial Industry

    NASA Astrophysics Data System (ADS)

    Pontz, Tobias; Grauer, Manfred; Kuebert, Roland; Tenschert, Axel; Koller, Bastian

    The idea of service-oriented Grid computing seems to have the potential for a fundamental paradigm change and a new architectural alignment in the design of IT infrastructures. A wide range of technical approaches from scientific communities describe basic infrastructures and middleware for integrating Grid resources, so that Grid applications are by now technically realizable. Hence, Grid computing needs viable business models and enhanced infrastructures to move from academic to commercial application. For commercial usage of these developments, service level agreements are needed. The approaches developed so far are primarily of academic interest and have mostly not been put into practice. Based on a business use case from the financial industry, five service level agreement approaches are evaluated in this paper. Based on this evaluation, a management architecture has been designed and implemented as a prototype.

  4. Case Studies in Environment Integration

    DTIC Science & Technology

    1991-12-01

    such as CADRE Teamwork and Frame Technology FrameMaker , are integrated. Future plans include integrating additional software development tools into...Pictures, Sabre C, and Interleaf or FrameMaker . Cad- re Technologies has announced integration agreements with Saber C and Pansophic, as well as offering...access to the Interleaf and FrameMaker documentation tools. While some of the current agreements between vendors to create tool coalitions are

  5. The Self-Perception and Usage of Medical Apps amongst Medical Students in the United States: A Cross-Sectional Survey.

    PubMed

    Quant, Cara; Altieri, Lisa; Torres, Juan; Craft, Noah

    2016-01-01

    Background. Mobile medical software applications (apps) are used for clinical decision-making at the point of care. Objectives. To determine (1) the usage, reliability, and popularity of mobile medical apps and (2) medical students' perceptions of app usage effect on the quality of patient-provider interaction in healthcare settings. Methods. An anonymous web-based survey was distributed to medical students. Frequency of use, type of app used, and perceptions of reliability were assessed via univariate analysis. Results. Seven hundred thirty-one medical students responded, equating to a response rate of 29%. The majority (90%) of participants thought that medical apps enhance clinical knowledge, and 61% said that medical apps are as reliable as textbooks. While students thought that medical apps save time, improve the care of their patients, and improve diagnostic accuracy, 53% of participants believed that mobile device use in front of colleagues and patients makes one appear less competent. Conclusion. While medical students believe in the utility and reliability of medical apps, they were hesitant to use them out of fear of appearing less engaged. Higher levels of training correlated with a greater degree of comfort when using medical apps in front of patients.

  6. MEASURE: An integrated data-analysis and model identification facility

    NASA Technical Reports Server (NTRS)

    Singh, Jaidip; Iyer, Ravi K.

    1990-01-01

    The first phase of the development of MEASURE, an integrated data-analysis and model identification facility, is described. The facility takes system activity data as input and produces as output representative behavioral models of the system in near real time. In addition, a wide range of statistical characteristics of the measured system are available. The usage of the system is illustrated on data collected via software instrumentation of a network of SUN workstations at the University of Illinois. Initially, statistical clustering is used to identify high-density regions of resource usage in a given environment. The identified regions form the states for building a state-transition model to evaluate system and program performance in real time. The model is then solved to obtain useful parameters such as the response-time distribution and the mean waiting time in each state. A graphical interface which displays the identified models and their characteristics (with real-time updates) was also developed. The results provide an understanding of resource usage in the system under various workload conditions. This work is targeted at a testbed of UNIX workstations, with the initial phase ported to SUN workstations on the NASA Ames Research Center Advanced Automation Testbed.
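The state-transition model described above can be sketched by counting observed transitions in a sequence of clustered resource-usage states (the state labels below are hypothetical, not the measured data):

```python
from collections import defaultdict

def transition_probabilities(states):
    """Estimate state-transition probabilities from an observed sequence."""
    counts = defaultdict(lambda: defaultdict(int))
    for current, nxt in zip(states, states[1:]):
        counts[current][nxt] += 1
    probs = {}
    for state, row in counts.items():
        total = sum(row.values())
        probs[state] = {nxt: c / total for nxt, c in row.items()}
    return probs

# Hypothetical sequence of clustered resource-usage states
sequence = ["idle", "cpu", "cpu", "io", "idle", "cpu"]
probs = transition_probabilities(sequence)
# e.g. from "cpu" the system moves to "cpu" or "io" with probability 0.5 each
```

Solving such a model (e.g. as a Markov chain) then yields quantities like the mean waiting time per state mentioned in the abstract.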

  7. Sequence Polishing Library (SPL) v10.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oberortner, Ernst

    The Sequence Polishing Library (SPL) is a suite of software tools that automates "Design for Synthesis and Assembly" workflows. Specifically: the SPL "Converter" tool converts files among the following sequence data exchange formats: CSV, FASTA, GenBank, and Synthetic Biology Open Language (SBOL). The SPL "Juggler" tool optimizes the codon usage of DNA coding sequences according to an optimization strategy, a user-specific codon usage table, and a genetic code; in addition, the "Juggler" can translate amino acid sequences into DNA sequences. The SPL "Polisher" verifies DNA sequences against DNA synthesis constraints, such as GC content, repeating k-mers, and restriction sites; in case of violations, the "Polisher" reports them in a comprehensive manner, and it can also modify the violating regions according to an optimization strategy, a user-specific codon usage table, and a genetic code. The SPL "Partitioner" decomposes large DNA sequences into smaller building blocks with partial overlaps that enable efficient assembly; the "Partitioner" lets the user configure the characteristics of the overlaps, such as length, GC content, or melting temperature, which are mostly determined by the utilized assembly protocol.
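
    A minimal sketch of the kind of synthesis-constraint check the "Polisher" performs is shown below. The function names and the GC/k-mer thresholds are illustrative assumptions, not the SPL API.

```python
def gc_content(seq):
    """Fraction of G/C bases in a DNA sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def find_violations(seq, gc_min=0.25, gc_max=0.65, k=10):
    """Report GC-content and repeated-k-mer violations, Polisher-style."""
    violations = []
    gc = gc_content(seq)
    if not gc_min <= gc <= gc_max:
        violations.append(f"GC content {gc:.2f} outside [{gc_min}, {gc_max}]")
    positions = {}
    for i in range(len(seq) - k + 1):
        positions.setdefault(seq[i:i + k], []).append(i)
    for kmer, pos in positions.items():
        if len(pos) > 1:
            violations.append(f"repeated {k}-mer {kmer} at positions {pos}")
    return violations
```

    For example, `find_violations("ATGC" * 10)` flags the repeated 10-mers of a periodic sequence, while a short non-repetitive sequence with balanced GC passes cleanly.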

  8. Final Report: Wireless Instrument for Automated Measurement of Clean Cookstove Usage and Black Carbon Emissions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lukac, Martin; Ramanathan, Nithya; Graham, Eric

    2013-09-10

    Black carbon (BC) emissions from traditional cooking fires and other sources are significant anthropogenic drivers of radiative forcing. Clean cookstoves present a more energy-efficient and cleaner-burning vehicle for cooking than traditional wood-burning stoves, yet many existing cookstoves reduce emissions by only modest amounts. Further research into cookstove use, fuel types, and verification of emissions is needed, as adoption rates for such stoves remain low. Accelerated innovation requires techniques for measuring and verifying cookstove performance. The overarching goal of the proposed program was to develop a low-cost, wireless instrument to provide a high-resolution profile of cookstove BC emissions and usage in the field. We proposed transferring the complexity of analysis away from the sampling hardware at the measurement site and to software at a centrally located server, to easily analyze data from thousands of sampling instruments. We were able to build a low-cost field-based instrument that produces repeatable estimates of cookstove usage, fuel use, and emission values with low variability. Emission values from our instrument were consistent with published ranges of emissions for similar stove and fuel types.

  9. Demographic and health related data of users of a mobile application to support drug adherence is associated with usage duration and intensity.

    PubMed

    Becker, Stefan; Brandl, Christopher; Meister, Sven; Nagel, Eckhard; Miron-Shatz, Talya; Mitchell, Anna; Kribben, Andreas; Albrecht, Urs-Vito; Mertens, Alexander

    2015-01-01

    A wealth of mobile applications is designed to support users in their drug intake. When developing software for patients, it is important to understand the differences between individuals who have, who will, or who might never adopt mobile interventions. This study analyzes demographic and health-related factors associated with real-life "longer usage" and "usage intensity per day" of the mobile application "Medication Plan". Between 2010 and 2012, "Medication Plan" could be downloaded free of charge from the Apple App Store. It was aimed at supporting the regular and correct intake of medication. Demographic and health-related data were collected via an online questionnaire, and this study analyzed the captured data. App-related activities of 1799 users (1708 complete data sets) were recorded. 69% (1183/1708) used "Medication Plan" for more than a day. 74% (872/1183) were male, and the median age was 45 years. Variance analysis showed a significant effect of users' age on duration of usage (p = 0.025). While the mean duration of use was only 23.3 days for users younger than 21 years, for older users there was a substantial increase over all age cohorts, up to users of 60 years and above (103.9 days). Sex and educational status had no effect. "Daily usage intensity" was directly associated with an increasing number of prescribed medications, and increased from an average of 1.87 uses per day for users taking 1 drug per day to an average of 3.71 uses per day for users who reported taking more than 7 different drugs a day (p<0.001). Demographic predictors (sex, age and educational attainment) did not affect usage intensity. Users aged 60+ as well as those with complicated therapeutic drug regimens relied on the service we provided for more than three months on average. Mobile applications may be a promising approach to support the treatment of patients with chronic conditions.

  10. Monitoring the Earth System Grid Federation through the ESGF Dashboard

    NASA Astrophysics Data System (ADS)

    Fiore, S.; Bell, G. M.; Drach, B.; Williams, D.; Aloisio, G.

    2012-12-01

    The Climate Model Intercomparison Project, phase 5 (CMIP5) is a global effort coordinated by the World Climate Research Programme (WCRP), involving tens of modeling groups spanning 19 countries. It is expected that the CMIP5 distributed data archive will total upwards of 3.5 petabytes, stored across several ESGF Nodes on four continents (North America, Europe, Asia, and Australia). The Earth System Grid Federation (ESGF) provides the IT infrastructure to support CMIP5. In this regard, the monitoring of the distributed ESGF infrastructure is a crucial task, carried out by the ESGF Dashboard. The ESGF Dashboard is a software component of the ESGF stack responsible for collecting key information about the status of the federation in terms of: 1) network topology (peer-group composition), 2) node type (host/services mapping), 3) registered users (including their Identity Providers), 4) system metrics (e.g., round-trip time, service availability, CPU, memory, disk, processes, etc.), and 5) download metrics (at both the node and federation level). The last class of information is very important, since it provides strong insight into the CMIP5 experiment: the data usage statistics. In this regard, CMCC and LLNL have developed a data analytics management system for the analysis of both node-level and federation-level data usage statistics. It provides data usage statistics aggregated by project, model, experiment, variable, realm, peer node, time, ensemble, dataset name (including version), etc. The back-end of the system infers the data usage information of the entire federation by carrying out: at the node level, an 18-step reconciliation process on the peer node databases (i.e., the node manager and publisher databases), which yields a 15-dimension data warehouse with local statistics; and at the global level, an aggregation process that federates the data usage statistics into a 16-dimension data warehouse with federation-level statistics.
    The front-end of the Dashboard system exploits a web desktop approach, which joins the pervasiveness of a web application with the flexibility of a desktop one.
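
    The node-to-federation roll-up amounts to grouping per-node usage records by a chosen set of dimensions. The sketch below is a toy illustration: the record fields and byte counts are invented and only mirror a few of the dimensions listed above.

```python
from collections import defaultdict

# Hypothetical per-node download records (a small subset of the
# project/model/experiment/... dimensions named in the abstract).
downloads = [
    {"project": "CMIP5", "model": "ModelA", "node": "nodeX", "bytes": 10},
    {"project": "CMIP5", "model": "ModelA", "node": "nodeY", "bytes": 5},
    {"project": "CMIP5", "model": "ModelB", "node": "nodeX", "bytes": 7},
]

def aggregate(records, dims):
    """Roll per-node usage records up into totals keyed by dims."""
    cube = defaultdict(int)
    for r in records:
        cube[tuple(r[d] for d in dims)] += r["bytes"]
    return dict(cube)

by_model = aggregate(downloads, ["project", "model"])  # federation level
by_node = aggregate(downloads, ["node"])               # node level
```

    The same records can be re-aggregated along any subset of dimensions, which is the essence of the multi-dimension data warehouse described above.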

  11. Clock Agreement Among Parallel Supercomputer Nodes

    DOE Data Explorer

    Jones, Terry R.; Koenig, Gregory A.

    2014-04-30

    This dataset presents measurements that quantify the clock synchronization time-agreement characteristics among several high performance computers including the current world's most powerful machine for open science, the U.S. Department of Energy's Titan machine sited at Oak Ridge National Laboratory. These ultra-fast machines derive much of their computational capability from extreme node counts (over 18000 nodes in the case of the Titan machine). Time-agreement is commonly utilized by parallel programming applications and tools, distributed programming application and tools, and system software. Our time-agreement measurements detail the degree of time variance between nodes and how that variance changes over time. The dataset includes empirical measurements and the accompanying spreadsheets.
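
    The two quantities such a dataset captures, the inter-node disagreement at an instant and how it drifts between sampling epochs, reduce to simple arithmetic over per-node clock offsets. The offset values below are invented for illustration.

```python
# Hypothetical clock offsets (seconds, relative to a reference clock)
# for three nodes, sampled at two epochs.
offsets_t0 = {"node0": 0.0000, "node1": 0.0021, "node2": -0.0014}
offsets_t1 = {"node0": 0.0003, "node1": 0.0034, "node2": -0.0022}

def max_disagreement(offsets):
    """Worst-case pairwise time disagreement across the node set."""
    return max(offsets.values()) - min(offsets.values())

spread_t0 = max_disagreement(offsets_t0)  # agreement at the first epoch
spread_t1 = max_disagreement(offsets_t1)  # agreement at the second epoch
# Per-node drift between the two epochs:
drift = {n: offsets_t1[n] - offsets_t0[n] for n in offsets_t0}
```

    On a real machine the offsets would come from synchronized probe exchanges across thousands of nodes; the arithmetic stays the same.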

  12. Digital pathology access and usage in the UK: results from a national survey on behalf of the National Cancer Research Institute's CM-Path initiative.

    PubMed

    Williams, Bethany Jill; Lee, Jessica; Oien, Karin A; Treanor, Darren

    2018-05-01

    To canvass the UK pathology community to ascertain current levels of digital pathology usage in clinical and academic histopathology departments, and prevalent attitudes to digital pathology. A 15-item survey was circulated to National Health Service and academic pathology departments across the UK using the SurveyMonkey online survey tool. Responses were sought at a departmental or institutional level. Where possible, departmental heads were approached and asked to complete the survey, or forward it to the most relevant individual in their department. Data were collected over a 6-month period from February to July 2017. 41 institutes from across the UK responded to the survey. 60% (23/39) of institutions had access to a digital pathology scanner, and 60% (24/40) had access to a digital pathology workstation. The most popular applications of digital pathology in current use were undergraduate and postgraduate teaching, research and quality assurance. Investigating the deployment of digital pathology in their department was identified as a high or highest priority by 58.5% of institutions, with improvements in efficiency, turnaround times, reporting times and collaboration in their institution anticipated by the respondents. Access to funding for initial hardware, software and staff outlay, pathologist training and guidance from the Royal College of Pathologists were identified as factors that could enable respondent institutions to increase their digital pathology usage. Interest in digital pathology adoption in the UK is high, with usage likely to increase in the coming years. In light of this, pathologists are seeking more guidance on safe usage.

  13. Assessment of Oral Hygiene Knowledge, Practices, and Concepts of Tobacco Usage among Engineering Students in Bhubaneswar, Odisha, India.

    PubMed

    Bandyopadhyay, Alokenath; Bhuyan, Lipsa; Panda, Abikshyeet; Dash, Kailash C; Raghuvanshi, Malvika; Behura, Shyam S

    2017-06-01

    This study aimed to assess oral hygiene-related knowledge and practices among engineering students of Bhubaneswar city, and to evaluate their concepts about the side effects of tobacco usage. The study was conducted using a self-administered, close-ended questionnaire to assess oral hygiene knowledge and practices and to study the concepts of tobacco usage among 362 engineering students of Bhubaneswar city, Odisha, India. The obtained data were statistically analyzed using Statistical Package for the Social Sciences software version 20.0. The survey found that 26.51% of the students had never visited a dentist. Nearly 43.64% of the participants were aware that improper brushing is a cause of tooth decay. About 47% of the participants consumed alcohol and 32.6% had the habit of chewing tobacco, even though 80% were aware that use of smokeless tobacco can impair oral health and cause cancer, and that alcohol use has a detrimental effect on oral health. Knowledge of oral health among engineering students of Bhubaneswar city is adequate regarding the use of fluoridated toothpaste and floss. However, unhealthy snacking habits, prolonged use of toothbrushes, alcohol consumption, and tobacco use show a lack of oral health knowledge in these students. Our study provides an idea of the present scenario of oral hygiene and tobacco usage in young individuals, which can form the basis for oral health education and tobacco cessation programs. Moreover, as the habit of tobacco usage starts early during college life, adequate knowledge about its ill effects could help prevent deadly diseases such as potentially malignant disorders and oral cancer.

  15. Reproducibility of dynamic contrast-enhanced MRI and dynamic susceptibility contrast MRI in the study of brain gliomas: a comparison of data obtained using different commercial software.

    PubMed

    Conte, Gian Marco; Castellano, Antonella; Altabella, Luisa; Iadanza, Antonella; Cadioli, Marcello; Falini, Andrea; Anzalone, Nicoletta

    2017-04-01

    Dynamic susceptibility contrast MRI (DSC) and dynamic contrast-enhanced MRI (DCE) are useful tools in the diagnosis and follow-up of brain gliomas; nevertheless, both techniques leave open the issue of data reproducibility. We evaluated the reproducibility of data obtained using two different commercial software packages for perfusion map calculation and analysis, as one of the potential sources of variability can be the software itself. DSC and DCE analyses from 20 patients with gliomas were tested for both the intrasoftware (intraobserver and interobserver) reproducibility and the intersoftware reproducibility, as well as the impact of different postprocessing choices [vascular input function (VIF) selection and deconvolution algorithms] on the quantification of the perfusion biomarkers plasma volume (Vp), volume transfer constant (Ktrans) and rCBV. Data reproducibility was evaluated with the intraclass correlation coefficient (ICC) and Bland-Altman analysis. For all the biomarkers, the intra- and interobserver reproducibility resulted in almost perfect agreement within each software package, whereas for the intersoftware reproducibility the ICC ranged from 0.311 to 0.577, suggesting fair to moderate agreement; Bland-Altman analysis showed high dispersion of the data, confirming these findings. Comparison of different VIF estimation methods for DCE biomarkers resulted in an ICC of 0.636 for Ktrans and 0.662 for Vp; comparison of two deconvolution algorithms in DSC resulted in an ICC of 0.999. The use of a single software package ensures very good intraobserver and interobserver reproducibility. Caution should be taken when comparing data obtained using different software, or different postprocessing within the same software, as reproducibility is no longer guaranteed.
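
    The Bland-Altman analysis used above boils down to the bias and 95% limits of agreement of the paired differences. A minimal stdlib-only sketch, with invented Ktrans-like values standing in for the two software packages, is:

```python
from statistics import mean, stdev

def bland_altman(a, b):
    """Bias and 95% limits of agreement for paired measurements a, b."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    sd = stdev(diffs)  # sample standard deviation of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical Ktrans values for the same lesions from two packages:
soft_a = [0.10, 0.22, 0.31, 0.40]
soft_b = [0.12, 0.20, 0.35, 0.38]
bias, (lo, hi) = bland_altman(soft_a, soft_b)
```

    A small bias with wide limits of agreement is exactly the "high dispersion" pattern the abstract reports for the intersoftware comparison.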

  16. Effect of self-deflection on a totally asymmetric simple exclusion process with functions of site assignments

    NASA Astrophysics Data System (ADS)

    Tsuzuki, Satori; Yanagisawa, Daichi; Nishinari, Katsuhiro

    2018-04-01

    This study proposes a model of a totally asymmetric simple exclusion process (TASEP) on a single-channel lane with functions of site assignments along the pit lane. The system attempts to insert a new particle at the leftmost site with a certain probability, randomly selecting one of the empty sites in the pit lane and reserving it for the particle; the particle is then directed to stop at that site exactly once during its travel. Recently, the system was found to exhibit a self-deflection effect, in which the site-usage distribution spontaneously biases toward the leftmost site, and the throughput becomes maximal when the site-usage distribution is slightly biased toward the rightmost site. Our exact analysis describes this deflection effect and shows good agreement with simulations.
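
    For context, a plain open-boundary TASEP with random-sequential updates can be simulated in a few lines. This sketch omits the paper's pit-lane site-assignment mechanism entirely; the lattice size, sweep count, and rates alpha and beta are illustrative choices.

```python
import random

def tasep_sweep(lat, alpha, beta):
    """One random-sequential Monte Carlo sweep of an open-boundary TASEP."""
    L = len(lat)
    for _ in range(L + 1):
        i = random.randrange(-1, L)
        if i == -1:                            # particle injection (rate alpha)
            if lat[0] == 0 and random.random() < alpha:
                lat[0] = 1
        elif i == L - 1:                       # particle extraction (rate beta)
            if lat[i] == 1 and random.random() < beta:
                lat[i] = 0
        elif lat[i] == 1 and lat[i + 1] == 0:  # bulk hop to the right
            lat[i], lat[i + 1] = 0, 1

random.seed(0)
lat = [0] * 50
occupancy = [0] * 50
for _ in range(2000):
    tasep_sweep(lat, alpha=0.3, beta=0.7)
    for i, s in enumerate(lat):
        occupancy[i] += s
density = [c / 2000 for c in occupancy]        # time-averaged usage per site
```

    Accumulating per-site occupancy over sweeps gives the site-usage profile; the pit-lane model adds a reservation step on top of exactly this kind of dynamics.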

  17. Analysis of machining accuracy during free form surface milling simulation for different milling strategies

    NASA Astrophysics Data System (ADS)

    Matras, A.; Kowalczyk, R.

    2014-11-01

    The analysis results of machining accuracy after free-form surface milling simulations (based on machining EN AW-7075 alloy) for different machining strategies (Level Z, Radial, Square, Circular) are presented in this work. The milling simulations were performed using CAD/CAM Esprit software. The accuracy of the obtained allowance is defined as the difference between the theoretical surface of the workpiece (the surface designed in CAD software) and the machined surface after a milling simulation. The difference between the two surfaces describes the roughness that results from mapping the tool shape onto the machined surface. The accuracy of the remaining allowance directly indicates the surface quality after finish machining. The described methodology of using CAD/CAM software can shorten the design time of a machining process for free-form surface milling on a 5-axis CNC milling machine, since it avoids having to machine the part just to measure the machining accuracy for the selected strategies and cutting data.

  18. A Software Development Platform for Wearable Medical Applications.

    PubMed

    Zhang, Ruikai; Lin, Wei

    2015-10-01

    Wearable medical devices have become a leading trend in the healthcare industry. Microcontrollers are computers on a chip with sufficient processing power and are the preferred embedded computing units in those devices. We have developed a software platform specifically for the design of wearable medical applications with a small code footprint on microcontrollers. It is supported by the open-source real-time operating system FreeRTOS and supplemented with a set of standard APIs for the architecture-specific hardware interfaces on the microcontrollers for data acquisition and wireless communication. We modified the tick counter routine in FreeRTOS to include a real-time soft clock. When combined with the multitasking features of FreeRTOS, the platform enables quick development of wearable applications and easy porting of application code to different microprocessors. Test results have demonstrated that application software developed using this platform is highly efficient in CPU usage while maintaining a small code footprint to accommodate the limited memory space of microcontrollers.

  19. Weather forecasting with open source software

    NASA Astrophysics Data System (ADS)

    Rautenhaus, Marc; Dörnbrack, Andreas

    2013-04-01

    To forecast the weather situation during aircraft-based atmospheric field campaigns, we employ a tool chain of existing and self-developed open source software tools and open standards. Of particular value are the Python programming language with its extension libraries NumPy, SciPy, PyQt4, Matplotlib and the basemap toolkit, the NetCDF standard with the Climate and Forecast (CF) Metadata conventions, and the Open Geospatial Consortium Web Map Service standard. These open source libraries and open standards helped to implement the "Mission Support System", a Web Map Service based tool to support weather forecasting and flight planning during field campaigns. The tool has been implemented in Python and has also been released as open source (Rautenhaus et al., Geosci. Model Dev., 5, 55-71, 2012). In this presentation we discuss the usage of free and open source software for weather forecasting in the context of research flight planning, and highlight how the field campaign work benefits from using open source tools and open standards.

  20. Image manipulation software portable on different hardware platforms: what is the cost?

    NASA Astrophysics Data System (ADS)

    Ligier, Yves; Ratib, Osman M.; Funk, Matthieu; Perrier, Rene; Girard, Christian; Logean, Marianne

    1992-07-01

    A hospital-wide PACS project is currently under development at the University Hospital of Geneva. The visualization and manipulation of images provided by different imaging modalities constitutes one of the most challenging components of a PACS. Because requirements differ depending on clinical usage, such visualization software had to be provided on different types of workstations in different sectors of the PACS. The user interface has to be the same regardless of the underlying workstation. Besides a standard set of image manipulation and processing tools, there is a need for more specific clinical tools that can be easily adapted to specific medical requirements. To achieve this, the software was designed to run on different operating and windowing systems, namely the standard Unix/X-11/OSF-Motif based workstations and the Macintosh family, and to be easily ported to other systems. This paper describes the design of such a system and discusses the extra cost and effort involved in the development of portable and easily expandable software.

  1. Reliability and accuracy of three imaging software packages used for 3D analysis of the upper airway on cone beam computed tomography images.

    PubMed

    Chen, Hui; van Eijnatten, Maureen; Wolff, Jan; de Lange, Jan; van der Stelt, Paul F; Lobbezoo, Frank; Aarab, Ghizlane

    2017-08-01

    The aim of this study was to assess the reliability and accuracy of three different imaging software packages for three-dimensional analysis of the upper airway using CBCT images. To assess the reliability of the software packages, 15 NewTom 5G® (QR Systems, Verona, Italy) CBCT data sets were randomly and retrospectively selected. Two observers measured the volume, minimum cross-sectional area and length of the upper airway using the Amira® (Visage Imaging Inc., Carlsbad, CA), 3Diagnosys® (3diemme, Cantu, Italy) and OnDemand3D® (CyberMed, Seoul, Republic of Korea) software packages. The intra- and interobserver reliability of the upper airway measurements was determined using intraclass correlation coefficients and Bland-Altman agreement tests. To assess the accuracy of the software packages, one NewTom 5G® CBCT data set was used to print a three-dimensional anthropomorphic phantom with known dimensions to serve as the gold standard. This phantom was subsequently scanned using a NewTom 5G® scanner. Based on the CBCT data set of the phantom, one observer measured the volume, minimum cross-sectional area, and length of the upper airway using Amira®, 3Diagnosys®, and OnDemand3D®, and compared these measurements with the gold standard. The intra- and interobserver reliability of the measurements of the upper airway using the different software packages was excellent (intraclass correlation coefficient ≥0.75), and there was excellent agreement among the three software packages in volume, minimum cross-sectional area and length measurements. All software packages underestimated the upper airway dimensions of the anthropomorphic phantom: the volume by 8.8% to 12.3%, the minimum cross-sectional area by 6.2% to 14.6%, and the length by 1.6% to 2.9%. All three software packages offered reliable volume, minimum cross-sectional area and length measurements of the upper airway, with the length measurements being the most accurate in all packages.

  2. Improving urban district heating systems and assessing the efficiency of the energy usage therein

    NASA Astrophysics Data System (ADS)

    Orlov, M. E.; Sharapov, V. I.

    2017-11-01

    The report describes issues in improving urban district heating systems supplied by combined heat and power plants (CHPs), and proposes ways to improve the reliability and the efficiency of energy usage (often referred to as "energy efficiency") in such systems. The main direction of improvement is a transition to combined heating systems that include structural elements of both centralized and decentralized systems. Such systems provide the basic part of the thermal load through highly efficient extraction of steam from the thermal power plant turbines, while peak loads are covered by decentralized peak thermal power sources mounted at consumers' locations, with the peak sources also serving as reserve thermal power sources. A methodology was developed for assessing the energy efficiency of the combined district heating systems, implemented as a computer software product capable of comparatively calculating the reference-fuel savings of the system.

  3. DSN Resource Scheduling

    NASA Technical Reports Server (NTRS)

    Wang, Yeou-Fang; Baldwin, John

    2007-01-01

    TIGRAS is client-side software which provides tracking-station equipment planning, allocation, and scheduling services to the DSMS (Deep Space Mission System). TIGRAS provides functions for schedulers to coordinate DSN (Deep Space Network) antenna usage time and to resolve resource usage conflicts among tracking passes, antenna calibrations, maintenance, and system testing activities. TIGRAS provides a fully integrated multi-pane graphical user interface for all scheduling operations, a great improvement over the legacy VAX VMS command-line user interface. TIGRAS can handle all aspects of DSN resource scheduling, from long range to real time. It assists NASA mission operations with DSN tracking-station equipment resource request processes, from long-range load forecasts (ten years or longer) to midrange, short-range, and real-time (less than one week) emergency tracking plan changes. TIGRAS can be operated by NASA mission operations worldwide to make schedule requests for DSN station equipment.

  4. University students' notebook computer use.

    PubMed

    Jacobs, Karen; Johnson, Peter; Dennerlein, Jack; Peterson, Denise; Kaufman, Justin; Gold, Joshua; Williams, Sarah; Richmond, Nancy; Karban, Stephanie; Firn, Emily; Ansong, Elizabeth; Hudak, Sarah; Tung, Katherine; Hall, Victoria; Pencina, Karol; Pencina, Michael

    2009-05-01

    Recent evidence suggests that university students self-report musculoskeletal discomfort with computer use at levels similar to those reported by adult workers. The objective of this study was to determine how university students use notebook computers and what ergonomic strategies might be effective in reducing self-reported musculoskeletal discomfort in this population. Two hundred and eighty-nine university students, randomly assigned to one of three towers by the university's Office of Housing, participated in this study. The results showed a significant reduction in self-reported notebook computer-related discomfort from pre- to post-survey in participants who received notebook computer accessories and in those who received accessories and participatory ergonomics training. A significant increase in post-survey rest breaks was also seen, and there was a significant correlation between self-reported computer usage and the amount measured using computer usage software (an "odometer"). More research is needed, however, to determine the most effective ergonomics intervention for university students.
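
    The reported correlation between self-reported and software-measured usage is a plain Pearson coefficient, which can be computed with the standard library alone. The weekly-hours data below are invented for illustration.

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# Hypothetical weekly computer-use hours: self-report vs. software log.
self_report = [10, 15, 20, 25, 30, 35]
odometer = [8, 14, 17, 26, 28, 37]
r = pearson_r(self_report, odometer)
```

    A value of r near 1 means students' self-reports track the logged usage closely; a perfectly linear relationship gives exactly r = 1.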

  5. Comprehensive 3D-elastohydrodynamic simulation of hermetic compressor crank drive

    NASA Astrophysics Data System (ADS)

    Posch, S.; Hopfgartner, J.; Berger, E.; Zuber, B.; Almbauer, R.; Schöllauf, P.

    2017-08-01

    Mechanical, electrical and thermodynamic losses form the major loss mechanisms of hermetic compressors for refrigeration applications. The present work investigates the mechanical losses of a hermetic compressor crank drive. The focus is on 3D elastohydrodynamic (EHD) modelling of the journal bearings, the piston-liner contact and the piston secondary motion, in combination with the multi-body and structural dynamics of the crank drive elements. A detailed description of the model development within the commercial software AVL EXCITE Power Unit is given. The model is used to create a comprehensive analysis of the mechanical losses of a hermetic compressor. Furthermore, a parametric study concerning oil viscosity and compressor speed is carried out, which demonstrates how the model can be used in the development process of hermetic compressors for refrigeration applications. Additionally, the use of the results in an overall thermal network for determining the thermal behaviour of the compressor is discussed.

  6. Information system needs in health promotion: a case study of the Safe Community programme using requirements engineering methods.

    PubMed

    Timpka, Toomas; Olvander, Christina; Hallberg, Niklas

    2008-09-01

    The international Safe Community programme was used as the setting for a case study to explore the need for information system support in health promotion programmes. The 14 Safe Communities active in Sweden during 2002 were invited to participate, and 13 accepted. A questionnaire on computer usage and a critical incident technique instrument were distributed. Sharing of management information, creating social capital for safety promotion, and injury data recording were found to be key areas that need further support from computer-based information systems. Most respondents reported having access to a personal computer workstation with standard office software. Interest in using more advanced computer applications was low, and there was considerable need for technical user support. Areas where information systems can make health promotion practice more efficient were identified, and patterns of computer usage were described.

  7. [Impact of MP3 player usage on the hearing of middle school students: a survey].

    PubMed

    Xu, Zhan; Li, Zonghua; Chen, Yang; He, Ya; Chunyu, Xiujie; Wang, Fangyuan; Zhang, Pengzhi; Gao, Lei; Qiu, Shuping; Liu, Shunli; Qiao, Li; Qiu, Jianhua

    2011-02-01

    To understand MP3 player usage and its effects on hearing among middle school students in Xi'an, and to discuss control strategies. A stratified random cluster sampling method was used to survey 1567 middle school students in Xi'an through a questionnaire, ear examination and hearing examination; the data were analysed with SPSS 13.0 statistical software. 1) The rate of MP3 ownership among the middle school students was 85.2%, and the average daily use time was (1.41 +/- 1.11) h. 2) Pure-tone hearing thresholds in the noise group were significantly higher than in the control group (P<0.01), and the detection rate of hearing loss increased with increasing MP3 use. 3) The detection rate of symptoms also increased with increasing MP3 use. MP3 usage can harm the hearing of middle school students and can result in neurasthenic syndrome.

  8. Searching for Information Online: Using Big Data to Identify the Concerns of Potential Army Recruits

    DTIC Science & Technology

    2016-01-01

    software. For instance, Internet search engines such as Google or Yahoo! often gather anonymized data regarding the topics that people search for, as well as the date and...suggesting that these and other information needs may be further reflected in usage of online search engines. Google makes aggregated and anonymized...

  9. An Overview of the AAVSO's Information Technology Infrastructure From 1967 to 1997

    NASA Astrophysics Data System (ADS)

    Kinne, R. C. S.

    2012-06-01

    Computer technology and data processing swept both society and the sciences like a wave in the latter half of the 20th century. We trace the AAVSO’s usage of computational and data processing technology from its beginnings in 1967, through 1997. We focus on equipment, people, and the purpose such computational power was put to, and compare and contrast the organization’s use of hardware and software with that of the wider industry.

  10. System of Systems Engineering and Integration Process for Network Transport Assessment

    DTIC Science & Technology

    2016-09-01

    SOSE&I CONCEPTS The DOD-sourced “Systems Engineering Guide for Systems of Systems” provides an overview of the SoS environment and SE considerations...usage as a guide in application of systems engineering processes. They are listed verbatim below as defined in the DOD SE guide (ODUSD[A&T]SSE 2008...Technology (A&T), Systems and Software Engineering (SSE). 2008. Systems Engineering Guide for Systems of Systems. Washington, DC: ODUSD(A&T)SSE

  11. Satellite freeze forecast system

    NASA Technical Reports Server (NTRS)

    Martsolf, J. D. (Principal Investigator)

    1983-01-01

    Provisions for back-up operations for the satellite freeze forecast system are discussed, including software and hardware maintenance and DS/1000-1V linkage; troubleshooting; and digitized radar usage. The documentation developed; dissemination of data products via television and the IFAS computer network; data base management; predictive models; the installation of and progress towards the operational status of key stations; and digital data acquisition are also considered. The addition of dew point temperature into the P-model is outlined.

  12. A New Method for Global Optimization Based on Stochastic Differential Equations.

    DTIC Science & Technology

    1984-12-01

    Optimizacion Global de Funciones, Universidad Nacional Autonoma de México, Instituto de Investigaciones en Matematicas Aplicadas y en Sistemas, Report...SIGMA package and its usage are described in full detail in Annex A5; the complete listing of the FORTRAN code is in Annex A6. 5. Test problems Since...software implementation on a number of test problems: and therefore a collection of test problems naturally began to build up during project development

  13. Data Mining and Information Technology: Its Impact on Intelligence Collection and Privacy Rights

    DTIC Science & Technology

    2007-11-26

    sources include: Cameras - Digital cameras (still and video) have been improving in capability while simultaneously dropping in cost at a rate...citizen is caught on camera 300 times each day. The power of extensive video coverage is magnified greatly by the nascent capability for voice and...software on security videos and tracking cell phone usage in the local area. However, it would only return the names and data of those who

  14. An overview of the Hadoop/MapReduce/HBase framework and its current applications in bioinformatics

    PubMed Central

    2010-01-01

    Background Bioinformatics researchers are now confronted with analysis of ultra large-scale data sets, a problem that will only increase at an alarming rate in coming years. Recent developments in open source software, that is, the Hadoop project and associated software, provide a foundation for scaling to petabyte scale data warehouses on Linux clusters, providing fault-tolerant parallelized analysis on such data using a programming style named MapReduce. Description An overview is given of the current usage within the bioinformatics community of Hadoop, a top-level Apache Software Foundation project, and of associated open source software projects. The concepts behind Hadoop and the associated HBase project are defined, and current bioinformatics software that employ Hadoop is described. The focus is on next-generation sequencing, as the leading application area to date. Conclusions Hadoop and the MapReduce programming paradigm already have a substantial base in the bioinformatics community, especially in the field of next-generation sequencing analysis, and such use is increasing. This is due to the cost-effectiveness of Hadoop-based analysis on commodity Linux clusters, and in the cloud via data upload to cloud vendors who have implemented Hadoop/HBase; and due to the effectiveness and ease-of-use of the MapReduce method in parallelization of many data analysis algorithms. PMID:21210976

  15. An overview of the Hadoop/MapReduce/HBase framework and its current applications in bioinformatics.

    PubMed

    Taylor, Ronald C

    2010-12-21

    Bioinformatics researchers are now confronted with analysis of ultra large-scale data sets, a problem that will only increase at an alarming rate in coming years. Recent developments in open source software, that is, the Hadoop project and associated software, provide a foundation for scaling to petabyte scale data warehouses on Linux clusters, providing fault-tolerant parallelized analysis on such data using a programming style named MapReduce. An overview is given of the current usage within the bioinformatics community of Hadoop, a top-level Apache Software Foundation project, and of associated open source software projects. The concepts behind Hadoop and the associated HBase project are defined, and current bioinformatics software that employ Hadoop is described. The focus is on next-generation sequencing, as the leading application area to date. Hadoop and the MapReduce programming paradigm already have a substantial base in the bioinformatics community, especially in the field of next-generation sequencing analysis, and such use is increasing. This is due to the cost-effectiveness of Hadoop-based analysis on commodity Linux clusters, and in the cloud via data upload to cloud vendors who have implemented Hadoop/HBase; and due to the effectiveness and ease-of-use of the MapReduce method in parallelization of many data analysis algorithms.
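    Both records describe the MapReduce programming style without showing it. As a rough illustration only (plain Python, not the Hadoop API; the record names and counts are invented), the map and reduce phases of the canonical word-count look like this:

```python
from itertools import groupby
from operator import itemgetter

def map_phase(records):
    # Emit one (key, 1) pair per word; this mirrors a Hadoop Mapper.
    for record in records:
        for word in record.split():
            yield (word, 1)

def reduce_phase(pairs):
    # Group pairs by key and sum the counts; this mirrors a Hadoop Reducer.
    # Hadoop sorts between the phases; here we sort explicitly.
    for key, group in groupby(sorted(pairs, key=itemgetter(0)), key=itemgetter(0)):
        yield (key, sum(count for _, count in group))

records = ["read1 ACGT", "read2 ACGT", "read1 TTTT"]
print(dict(reduce_phase(map_phase(records))))
```

    In Hadoop the two phases run in parallel across cluster nodes, which is what makes the style attractive for the petabyte-scale sequencing workloads the records describe.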

  16. Remote Sensing Image Analysis Without Expert Knowledge - A Web-Based Classification Tool On Top of Taverna Workflow Management System

    NASA Astrophysics Data System (ADS)

    Selsam, Peter; Schwartze, Christian

    2016-10-01

    Providing software solutions via the internet has been known for quite some time and is now an increasing trend marketed as "software as a service". A lot of business units accept the new methods and streamlined IT strategies by offering web-based infrastructures for external software usage - but geospatial applications featuring very specialized services or functionalities on demand are still rare. Originally applied in desktop environments, the ILMSimage tool for remote sensing image analysis and classification was modified in its communicating structures and enabled to run on a high-power server, benefiting from Taverna software. On top, a GIS-like and web-based user interface guides the user through the different steps in ILMSimage. ILMSimage combines object-oriented image segmentation with pattern recognition features. Basic image elements form a construction set to model large image objects with diverse and complex appearance. There is no need for the user to set up detailed object definitions. Training is done by delineating one or more typical examples (templates) of the desired object using a simple vector polygon. The template can be large and does not need to be homogeneous. The template is completely independent from the segmentation. The object definition is done completely by the software.

  17. Bandwidth Optimization On Design Of Visual Display Information System Based Networking At Politeknik Negeri Bali

    NASA Astrophysics Data System (ADS)

    Sudiartha, IKG; Catur Bawa, IGNB

    2018-01-01

    Information cannot be separated from the social life of a community, especially in the world of education. One field of information is academic calendar information, activity agendas, announcements and campus activity news. In line with technological developments, purely text-based information is becoming obsolete, so creativity is needed to present information more quickly, accurately and attractively by exploiting the development of digital technology and the internet. In this paper, an application for presenting information as a visual display is developed, applied to a computer network system with multimedia applications. The network-based application makes it easy to update data through internet services and offers an attractive presentation with multimedia support. The application "Networking Visual Display Information Unit" can be used as a medium that provides information services for students and academic employees that is more interesting and easier to update than a bulletin board. The information, presented as running text, latest information, agenda, academic calendar and video, provides an attractive presentation in line with technological developments at the Politeknik Negeri Bali. Through this research we aim to create the software "Networking Visual Display Information Unit" with optimal bandwidth usage by combining local data sources and data delivered through the network. This research produces a visual display design with optimal bandwidth usage and an application in the form of supporting software.

  18. Smart energy management system

    NASA Astrophysics Data System (ADS)

    Desai, Aniruddha; Singh, Jugdutt

    2010-04-01

    Peak and average energy usage in domestic and industrial environments is growing rapidly, and the absence of detailed energy consumption metrics is making systematic reduction of energy usage very difficult. The smart energy management system aims at providing a cost-effective solution for managing soaring energy consumption and its impact on greenhouse gas emissions and climate change. The solution is based on seamless integration of existing wired and wireless communication technologies, combined with smart context-aware software, which offers a complete solution for automation of energy measurement and device control. The persuasive software presents users with easy-to-assimilate visual cues identifying problem areas and time periods and encourages a behavioural change to conserve energy. The system allows analysis of real-time/statistical consumption data with the ability to drill down into detailed analysis of power consumption, CO2 emissions and cost. The system generates intelligent projections and suggests potential methods (e.g. reducing standby, tuning heating/cooling temperature, etc.) of reducing energy consumption. The user interface is accessible using web-enabled devices such as PDAs and PCs, or using SMS, email, and instant messaging. A successful real-world trial of the system has demonstrated the potential to save 20 to 30% of energy consumption on average. The low cost of deployment and the ability to easily manage consumption from various web-enabled devices give this system high penetration and impact capability, offering a sustainable solution to act on climate change today.

  19. An evaluation of the state of time synchronization on leadership class supercomputers

    DOE PAGES

    Jones, Terry; Ostrouchov, George; Koenig, Gregory A.; ...

    2017-10-09

    We present a detailed examination of time agreement characteristics for nodes within extreme-scale parallel computers. Using a software tool we introduce in this paper, we quantify attributes of clock skew among nodes in three representative high-performance computers sited at three national laboratories. Our measurements detail the statistical properties of time agreement among nodes and how time agreement drifts over typical application execution durations. We discuss the implications of our measurements, why the current state of the field is inadequate, and propose strategies to address observed shortcomings.

  20. An evaluation of the state of time synchronization on leadership class supercomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Terry; Ostrouchov, George; Koenig, Gregory A.

    We present a detailed examination of time agreement characteristics for nodes within extreme-scale parallel computers. Using a software tool we introduce in this paper, we quantify attributes of clock skew among nodes in three representative high-performance computers sited at three national laboratories. Our measurements detail the statistical properties of time agreement among nodes and how time agreement drifts over typical application execution durations. We discuss the implications of our measurements, why the current state of the field is inadequate, and propose strategies to address observed shortcomings.
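    The records do not detail how the tool measures inter-node time agreement. As a hedged illustration of the underlying idea only, one standard way to estimate the offset of a remote clock is the NTP-style symmetric-delay estimate; the function name and timestamps below are invented:

```python
def estimate_offset(t_send, t_remote, t_recv):
    # Classic NTP-style estimate: assuming the network delay is symmetric,
    # the remote clock's offset is the remote timestamp minus the midpoint
    # of the local send/receive window.
    return t_remote - (t_send + t_recv) / 2.0

# Local node sends at t=100.0 s, the remote node stamps 105.2 s,
# and the reply arrives locally at t=100.4 s.
offset = estimate_offset(100.0, 105.2, 100.4)
print(offset)  # about 5.0 s: the remote clock runs ahead
```

    Repeating such probes across all nodes and over an application's run time yields exactly the kind of skew and drift statistics the paper reports.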

  1. A Novel Rules Based Approach for Estimating Software Birthmark

    PubMed Central

    Binti Alias, Norma; Anwar, Sajid

    2015-01-01

    Software birthmark is a unique quality of software to detect software theft. Comparing birthmarks of software can tell us whether a program or software is a copy of another. Software theft and piracy are rapidly increasing problems of copying, stealing, and misusing the software without proper permission, as mentioned in the desired license agreement. The estimation of birthmark can play a key role in understanding the effectiveness of a birthmark. In this paper, a new technique is presented to evaluate and estimate software birthmark based on the two most sought-after properties of birthmarks, that is, credibility and resilience. For this purpose, the concept of soft computing such as probabilistic and fuzzy computing has been taken into account and fuzzy logic is used to estimate properties of birthmark. The proposed fuzzy rule based technique is validated through a case study and the results show that the technique is successful in assessing the specified properties of the birthmark, its resilience and credibility. This, in turn, shows how much effort will be required to detect the originality of the software based on its birthmark. PMID:25945363
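    The abstract does not give the paper's actual fuzzy rules or membership functions. As a generic sketch of the Mamdani-style idea it describes (all names, shapes, and scores below are invented for illustration), a single rule over credibility and resilience can be evaluated like this:

```python
def tri(x, a, b, c):
    # Triangular membership function on [a, c] peaking at b.
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def birthmark_strength(credibility, resilience):
    # One illustrative rule: IF credibility is high AND resilience is high
    # THEN the birthmark is strong. The rule's firing strength is the
    # minimum (fuzzy AND) of the two memberships.
    high_cred = tri(credibility, 0.5, 1.0, 1.5)
    high_res = tri(resilience, 0.5, 1.0, 1.5)
    return min(high_cred, high_res)

print(birthmark_strength(0.9, 0.7))
```

    A real fuzzy system would aggregate many such rules and defuzzify the result; this sketch only shows how graded properties replace a hard pass/fail threshold.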

  2. An exploratory study of treated-bed nets in Timor-Leste: patterns of intended and alternative usage

    PubMed Central

    2011-01-01

    Background The Timor-Leste Ministry of Health has recently finalized the National Malaria Control Strategy for 2010-2020. A key component of this roadmap is to provide universal national coverage with long-lasting insecticide-treated nets (LLINs) in support of achieving the primary goal of reducing both morbidity and mortality from malaria by 30% in the first three years, followed by a further reduction of 20% by end of the programme cycle in 2020 [1]. The strategic plan calls for this target to be supported by a comprehensive information, education and communication (IEC) programme; however, there is limited prior research into household and personal usage patterns to assist in the creation of targeted, effective, and socio-culturally specific behaviour change materials. Methods Nine separate focus group discussions (FGDs) were carried out in Dili, Manatuto, and Covalima districts, Democratic Republic of Timor-Leste, in July 2010. These focus groups primarily explored themes of perceived malaria risk, causes of malaria, net usage patterns within families, barriers to correct and consistent usage, and the daily experience of users (both male and female) in households with at least one net. Comprehensive qualitative analysis utilized open source analysis software. Results The primary determinants of net usage were a widespread perception that nets could or should only be used by pregnant women and young children, and the availability of sufficient sleeping space under a limited number of nets within households. Both nuisance biting and disease prevention were commonly cited as primary motivations for usage, while seasonality was not a significant factor. Long-term net durability and ease of hanging were seen as key attributes in net design preference. Very frequent washing cycles were common, potentially degrading net effectiveness. 
Finally, extensive re-purposing of nets (fishing, protecting crops) was both reported and observed, and may significantly decrease availability of nighttime sleeping space for all family members if distributed nets do not remain within the household. Conclusions Emphasizing that net usage is acceptable and important for all family members regardless of age or gender, and addressing the complex behavioural economics of alternative net usages could have significant impacts on malaria control efforts in Timor-Leste, as the country's programmes make progress towards universal net coverage. PMID:21777415

  3. Challenges and Requirements for the Application of Industry 4.0: A Special Insight with the Usage of Cyber-Physical System

    NASA Astrophysics Data System (ADS)

    Mueller, Egon; Chen, Xiao-Li; Riedel, Ralph

    2017-09-01

    Considered a top priority of industrial development, Industry 4.0 (Industrie 4.0 in the German version) has been highlighted as a pursuit of both academia and practice in companies. In this paper, based on a review of the state of the art and the state of practice in different countries, a shortcoming is revealed: the lack of an applicable framework for the implementation of Industrie 4.0. Therefore, in order to shed some light on the details, a reference architecture is developed in which four perspectives, namely manufacturing process, devices, software and engineering, are highlighted. Moreover, in view of the importance of Cyber-Physical Systems, the structure of a Cyber-Physical System is established for in-depth analysis. Further cases involving the usage of Cyber-Physical Systems are also arranged, which attempt to match the theoretical findings with the experience of companies. In general, the results of this paper could be useful for extending the theoretical understanding of Industrie 4.0. Additionally, the applied framework and prototypes based on the usage of Cyber-Physical Systems have the potential to help companies to design the layout of sensor networks, to achieve coordination and control of smart machines, to realize synchronous production with a systematic structure, and to extend the usage of information and communication technologies to maintenance scheduling.

  4. Implementation of software-based sensor linearization algorithms on low-cost microcontrollers.

    PubMed

    Erdem, Hamit

    2010-10-01

    Nonlinear sensors and microcontrollers are used in many embedded system designs. As the input-output characteristic of most sensors is nonlinear in nature, obtaining data from a nonlinear sensor by using an integer microcontroller has always been a design challenge. This paper discusses the implementation of six software-based sensor linearization algorithms for low-cost microcontrollers. The comparative study of the linearization algorithms is performed by using a nonlinear optical distance-measuring sensor. The performance of the algorithms is examined with respect to memory space usage, linearization accuracy and algorithm execution time. The implementation and comparison results can be used for selection of a linearization algorithm based on the sensor transfer function, expected linearization accuracy and microcontroller capacity. Copyright © 2010 ISA. Published by Elsevier Ltd. All rights reserved.
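    The six algorithms compared in the paper are not named in this abstract. One of the simplest software-based approaches in this family is piecewise-linear interpolation over a calibration lookup table, sketched below in Python for clarity (on a low-cost integer microcontroller this would typically be fixed-point C); the calibration values are invented:

```python
# Calibration table for a hypothetical nonlinear optical distance sensor:
# (raw ADC reading, distance in cm). Values are invented for illustration.
TABLE = [(80, 150.0), (160, 80.0), (320, 40.0), (640, 20.0)]

def linearize(raw):
    # Clamp readings outside the calibrated range.
    if raw <= TABLE[0][0]:
        return TABLE[0][1]
    if raw >= TABLE[-1][0]:
        return TABLE[-1][1]
    # Linear interpolation between the two bracketing calibration points.
    for (x0, y0), (x1, y1) in zip(TABLE, TABLE[1:]):
        if x0 <= raw <= x1:
            return y0 + (y1 - y0) * (raw - x0) / (x1 - x0)

print(linearize(240))  # 60.0, midway between (160, 80.0) and (320, 40.0)
```

    The trade-off the paper quantifies is visible even here: a larger table costs memory but improves accuracy, while the interpolation itself costs execution time on each reading.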

  5. Applying Jlint to Space Exploration Software

    NASA Technical Reports Server (NTRS)

    Artho, Cyrille; Havelund, Klaus

    2004-01-01

    Java is a very successful programming language which is also becoming widespread in embedded systems, where software correctness is critical. Jlint is a simple but highly efficient static analyzer that checks a Java program for several common errors, such as null pointer exceptions and overflow errors. It also includes checks for multi-threading problems, such as deadlocks and data races. The case study described here shows the effectiveness of Jlint in finding such errors; analyzing the false positives in the multi-threading warnings gives an insight into design patterns commonly used in multi-threaded code. The results show that a few analysis techniques are sufficient to avoid almost all false positives. These techniques include investigating all possible callers and a few code idioms. Verifying the correct application of these patterns is still crucial, because their correct usage is not trivial.

  6. CORAL Server and CORAL Server Proxy: Scalable Access to Relational Databases from CORAL Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valassi, A.; /CERN; Bartoldus, R.

    The CORAL software is widely used at CERN by the LHC experiments to access the data they store on relational databases, such as Oracle. Two new components have recently been added to implement a model involving a middle tier 'CORAL server' deployed close to the database and a tree of 'CORAL server proxies', providing data caching and multiplexing, deployed close to the client. A first implementation of the two new components, released in the summer 2009, is now deployed in the ATLAS online system to read the data needed by the High Level Trigger, allowing the configuration of a farm of several thousand processes. This paper reviews the architecture of the software, its development status and its usage in ATLAS.

  7. Software Tools to Support Research on Airport Departure Planning

    NASA Technical Reports Server (NTRS)

    Carr, Francis; Evans, Antony; Feron, Eric; Clarke, John-Paul

    2003-01-01

    A simple, portable and useful collection of software tools has been developed for the analysis of airport surface traffic. The tools are based on a flexible and robust traffic-flow model, and include calibration, validation and simulation functionality for this model. Several different interfaces have been developed to help promote usage of these tools, including a portable Matlab(TM) implementation of the basic algorithms; a web-based interface which provides online access to automated analyses of airport traffic based on a database of real-world operations data which covers over 250 U.S. airports over a 5-year period; and an interactive simulation-based tool currently in use as part of a college-level educational module. More advanced applications for airport departure traffic include taxi-time prediction and evaluation of "windowing" congestion control.

  8. Hosted Services for Advanced V and V Technologies: An Approach to Achieving Adoption without the Woes of Usage

    NASA Technical Reports Server (NTRS)

    Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.; OMalley, Owen; Brew, William A.

    2003-01-01

    Attempts to achieve widespread use of software verification tools have been notably unsuccessful. Even 'straightforward', classic, and potentially effective verification tools such as lint-like tools face limits on their acceptance. These limits are imposed by the expertise required to apply the tools and interpret the results, the high false positive rate of many verification tools, and the need to integrate the tools into development environments. The barriers are even greater for more complex advanced technologies such as model checking. Web-hosted services for advanced verification technologies may mitigate these problems by centralizing tool expertise. The possible benefits of this approach include eliminating the need for software developer expertise in tool application and results filtering, and improving integration with other development tools.

  9. Using Ready-To-Use Drone Images in Forestry Activities: Case Study of ÇINARPINAR in Kahramanmaras, Turkey

    NASA Astrophysics Data System (ADS)

    Gülci, S.; Akgül, M.; Akay, A. E.; Taş, İ.

    2017-11-01

    This short paper aims to present the pros and cons of the current usage of ready-to-use drone images in the field of forestry, also considering flight planning and photogrammetric processes. The capabilities of the DJI Phantom 4, a low-cost drone produced by the DJI company, were evaluated through sample flights in the Cinarpinar Forest Enterprise Chief in Kahramanmaras, Turkey. In addition, the photogrammetric workflow for the obtained images and the automated flights were presented with respect to the capabilities of available software. The flight plans were created using the Pix4DCapture software with an Android-based cell phone. The results indicated that high-resolution imagery obtained by drone can provide significant data for the assessment of forest resources, forest roads, and stream channels.

  10. Study on Network Error Analysis and Locating based on Integrated Information Decision System

    NASA Astrophysics Data System (ADS)

    Yang, F.; Dong, Z. H.

    2017-10-01

    Integrated information decision system (IIDS) integrates multiple sub-systems developed by many facilities, comprising almost a hundred kinds of software, which provide various services such as email, short messages, drawing and sharing. Because the underlying protocols differ and user standards are not unified, many errors occur during the setup, configuration, and operation stages, which seriously affect usage. Because the errors are varied and may occur in different operation phases, stages, TCP/IP communication protocol layers, and sub-system software, it is necessary to design a network error analysis and locating tool for IIDS to solve the above problems. This paper studies network error analysis and locating based on IIDS, which provides strong theoretical and technological support for the running and communication of IIDS.

  11. Problem solving in magnetic field: Animation in mobile application

    NASA Astrophysics Data System (ADS)

    Najib, A. S. M.; Othman, A. P.; Ibarahim, Z.

    2014-09-01

    This paper is focused on the development of a mobile application for smart phones and tablets (Android, iPhone, and iPad) as a problem-solving tool in magnetic field. The mobile application design consists of animations that were created using the Flash 8 software and could be imported and compiled into a prezi.com slide. The Prezi slide was then duplicated in PowerPoint format, and a question bank with a complete answer scheme was additionally generated as a menu in the application. The published mobile application can be viewed and downloaded at the Infinite Monkey website or at the Google Play Store from your gadgets. Statistics for the application from the Google Play Developer Console show the high impact of the application's usage all over the world.

  12. Design and Realization of Controllable Ultrasonic Fault Detector Automatic Verification System

    NASA Astrophysics Data System (ADS)

    Sun, Jing-Feng; Liu, Hui-Ying; Guo, Hui-Juan; Shu, Rong; Wei, Kai-Li

    Ultrasonic flaw detection equipment with a remote control interface is researched and an automatic verification system is developed. By using extensible markup language to build a protocol instruction set and a data-analysis method database in the system software, controllable design is realized and the diversity of undisclosed device interfaces and protocols is handled. By cascading a signal generator with a fixed attenuator, a dynamic error compensation method is proposed that performs the role of the fixed attenuator in traditional verification and improves the accuracy of verification results. Operating results of the automatic verification system confirm the feasibility of the hardware and software architecture design and the correctness of the analysis method, replacing the cumbersome operations of the traditional verification process and reducing the labour intensity for test personnel.

  13. Parallels in Computer-Aided Design Framework and Software Development Environment Efforts.

    DTIC Science & Technology

    1992-05-01

    design kits, and tool and design management frameworks. Also, books about software engineering environments [Long 91] and electronic design...tool integration [Zarrella 90], and agreement upon a universal design automation framework, such as the CAD Framework Initiative (CFI) [Malasky 91...ments: identification, control, status accounting, and audit and review. The paper by Dart extracts 15 CM concepts from existing SDEs and tools

  14. World Wind Tools Reveal Environmental Change

    NASA Technical Reports Server (NTRS)

    2012-01-01

    Originally developed under NASA's Learning Technologies program as a tool to engage and inspire students, World Wind software was released under the NASA Open Source Agreement license. Honolulu, Hawaii based Intelesense Technologies is one of the companies currently making use of the technology for environmental, public health, and other monitoring applications for nonprofit organizations and Government agencies. The company saved about $1 million in development costs by using the NASA software.

  15. Applying Computerized-Scoring Models of Written Biological Explanations across Courses and Colleges: Prospects and Limitations

    PubMed Central

    Ha, Minsu; Nehm, Ross H.; Urban-Lurain, Mark; Merrill, John E.

    2011-01-01

    Our study explored the prospects and limitations of using machine-learning software to score introductory biology students’ written explanations of evolutionary change. We investigated three research questions: 1) Do scoring models built using student responses at one university function effectively at another university? 2) How many human-scored student responses are needed to build scoring models suitable for cross-institutional application? 3) What factors limit computer-scoring efficacy, and how can these factors be mitigated? To answer these questions, two biology experts scored a corpus of 2556 short-answer explanations (from biology majors and nonmajors) at two universities for the presence or absence of five key concepts of evolution. Human- and computer-generated scores were compared using kappa agreement statistics. We found that machine-learning software was capable in most cases of accurately evaluating the degree of scientific sophistication in undergraduate majors’ and nonmajors’ written explanations of evolutionary change. In cases in which the software did not perform at the benchmark of “near-perfect” agreement (kappa > 0.80), we located the causes of poor performance and identified a series of strategies for their mitigation. Machine-learning software holds promise as an assessment tool for use in undergraduate biology education, but like most assessment tools, it is also characterized by limitations. PMID:22135372

  16. Applying computerized-scoring models of written biological explanations across courses and colleges: prospects and limitations.

    PubMed

    Ha, Minsu; Nehm, Ross H; Urban-Lurain, Mark; Merrill, John E

    2011-01-01

    Our study explored the prospects and limitations of using machine-learning software to score introductory biology students' written explanations of evolutionary change. We investigated three research questions: 1) Do scoring models built using student responses at one university function effectively at another university? 2) How many human-scored student responses are needed to build scoring models suitable for cross-institutional application? 3) What factors limit computer-scoring efficacy, and how can these factors be mitigated? To answer these questions, two biology experts scored a corpus of 2556 short-answer explanations (from biology majors and nonmajors) at two universities for the presence or absence of five key concepts of evolution. Human- and computer-generated scores were compared using kappa agreement statistics. We found that machine-learning software was capable in most cases of accurately evaluating the degree of scientific sophistication in undergraduate majors' and nonmajors' written explanations of evolutionary change. In cases in which the software did not perform at the benchmark of "near-perfect" agreement (kappa > 0.80), we located the causes of poor performance and identified a series of strategies for their mitigation. Machine-learning software holds promise as an assessment tool for use in undergraduate biology education, but like most assessment tools, it is also characterized by limitations.
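    The kappa benchmark used in both versions of this record is Cohen's kappa, which corrects raw rater agreement for agreement expected by chance. A minimal sketch (the ten example labels are invented, 1 = concept present, 0 = absent):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    # Observed agreement minus chance agreement, normalized by the
    # maximum possible improvement over chance.
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

human = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
machine = [1, 1, 0, 1, 0, 1, 1, 1, 0, 1]
print(round(cohens_kappa(human, machine), 3))
```

    For these invented labels the raw agreement is 90%, but kappa is noticeably lower, which is exactly why the study's "near-perfect" benchmark of kappa > 0.80 is stricter than it first appears.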

  17. Developing open-source codes for electromagnetic geophysics using industry support

    NASA Astrophysics Data System (ADS)

    Key, K.

    2017-12-01

    Funding for open-source software development in academia often takes the form of grants and fellowships awarded by government bodies and foundations where there is no conflict-of-interest between the funding entity and the free dissemination of the open-source software products. Conversely, funding for open-source projects in the geophysics industry presents challenges to conventional business models where proprietary licensing offers value that is not present in open-source software. Such proprietary constraints make it easier to convince companies to fund academic software development under exclusive software distribution agreements. A major challenge for obtaining commercial funding for open-source projects is to offer a value proposition that overcomes the criticism that such funding is a give-away to the competition. This work draws upon a decade of experience developing open-source electromagnetic geophysics software for the oil, gas and minerals exploration industry, and examines various approaches that have been effective for sustaining industry sponsorship.

  18. Evaluation of CHO Benchmarks on the Arria 10 FPGA using Intel FPGA SDK for OpenCL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Zheming; Yoshii, Kazutomo; Finkel, Hal

    The OpenCL standard is an open programming model for accelerating algorithms on heterogeneous computing systems. OpenCL extends the C-based programming language for developing portable code across platforms such as CPUs, graphics processing units (GPUs), digital signal processors (DSPs) and field-programmable gate arrays (FPGAs). The Intel FPGA SDK for OpenCL is a suite of tools that allows developers to abstract away the complex FPGA-based development flow behind a high-level software development flow. Users can focus on the design of hardware-accelerated kernel functions in OpenCL and then direct the tools to generate the low-level FPGA implementations. This approach makes FPGA-based development more accessible to software users as the need for hybrid computing using CPUs and FPGAs increases. It can also significantly reduce hardware development time, since users can evaluate different ideas in a high-level language without deep FPGA domain knowledge. Benchmarking an OpenCL-based framework is an effective way to analyze system performance by studying the execution of benchmark applications. CHO is a suite of benchmark applications that provides support for OpenCL [1]; its authors presented CHO as an OpenCL port of the CHStone benchmark. Using the Altera OpenCL (AOCL) compiler to synthesize the benchmark applications, they listed the resource usage and performance of each kernel that could be successfully synthesized by the compiler. In this report, we evaluate the resource usage and performance of the CHO benchmark applications using the Intel FPGA SDK for OpenCL and a Nallatech 385A FPGA board that features an Arria 10 FPGA device. The focus of the report is to gain a better understanding of the resource usage and performance of the kernel implementations on Arria 10 FPGA devices compared to Stratix V FPGA devices. In addition, we report the limitations of the current compiler when it fails to synthesize a benchmark application.

  19. Dem Generation from Close-Range Photogrammetry Using Extended Python Photogrammetry Toolbox

    NASA Astrophysics Data System (ADS)

    Belmonte, A. A.; Biong, M. M. P.; Macatulad, E. G.

    2017-10-01

    Digital elevation models (DEMs) are widely used raster data for terrain-related applications, such as flood modelling, viewshed analysis, mining, land development, and engineering design projects, to name a few. DEMs can be obtained through various methods, including topographic survey, LiDAR, photogrammetry, and internet sources. Terrestrial close-range photogrammetry is one alternative method of producing DEMs, through the processing of images with photogrammetry software. Powerful, commercially available photogrammetry software can already produce high-accuracy DEMs, but at a corresponding cost. Although some of these packages offer free or demo trials, the trials are limited in their usable features and usage time. One alternative is free and open-source software (FOSS), such as the Python Photogrammetry Toolbox (PPT), which provides an interface for performing photogrammetric processes implemented through Python scripts. For relatively small areas, such as mining or construction excavations, a relatively inexpensive, fast and accurate method would be advantageous. In this study, PPT was used to generate 3D point cloud data from images of an open pit excavation. The PPT was extended with an algorithm converting the generated point cloud data into a usable DEM.
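    The added point-cloud-to-DEM conversion step can be illustrated with a simple gridding scheme that takes the mean elevation of the points falling in each raster cell. This is a generic sketch of the technique, not the actual algorithm added to PPT:

```python
def point_cloud_to_dem(points, cell_size):
    """Grid scattered (x, y, z) points into a DEM raster: each cell holds
    the mean elevation of the points that fall inside it (None if empty)."""
    x0 = min(p[0] for p in points)
    y0 = min(p[1] for p in points)
    cells = {}
    for x, y, z in points:
        key = (int((y - y0) // cell_size), int((x - x0) // cell_size))
        cells.setdefault(key, []).append(z)
    nrows = max(r for r, _ in cells) + 1
    ncols = max(c for _, c in cells) + 1
    return [[sum(cells[r, c]) / len(cells[r, c]) if (r, c) in cells else None
             for c in range(ncols)] for r in range(nrows)]

# Three surveyed points binned on a 1 m grid -> a 1 x 2 raster:
print(point_cloud_to_dem([(0.2, 0.3, 10.0), (0.8, 0.4, 12.0), (1.5, 0.2, 8.0)], 1.0))
```

    Production tools typically interpolate (e.g. TIN or inverse-distance weighting) rather than average per cell, but the binning step above is the common core.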

  20. A Simplified Mesh Deformation Method Using Commercial Structural Analysis Software

    NASA Technical Reports Server (NTRS)

    Hsu, Su-Yuen; Chang, Chau-Lyan; Samareh, Jamshid

    2004-01-01

    Mesh deformation in response to redefined or moving aerodynamic surface geometries is a frequently encountered task in many applications. Most existing methods are either mathematically too complex or computationally too expensive for use in practical design and optimization. We propose a simplified mesh deformation method based on linear elastic finite element analyses that can be easily implemented using commercially available structural analysis software. Using a prescribed displacement at the mesh boundaries, a simple structural analysis is constructed with a spatially varying Young's modulus to move the entire mesh in accordance with the surface geometry redefinitions. A variety of surface movements, such as translation, rotation, or the incremental surface reshaping that often takes place in an optimization procedure, may be handled by the present method. We describe the numerical formulation and its implementation using the NASTRAN software in this paper. The use of commercial software bypasses tedious reimplementation and takes advantage of the computational efficiency offered by the vendor. A two-dimensional airfoil mesh and a three-dimensional aircraft mesh were used as test cases to demonstrate the effectiveness of the proposed method. Euler and Navier-Stokes calculations were performed for the deformed two-dimensional meshes.
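    The core idea, treating the mesh as a linear elastic body whose stiffness varies in space, can be illustrated in one dimension. This is a hypothetical toy sketch, not the paper's NASTRAN formulation: a chain of springs with displacement prescribed at the ends, where a stiffer element (a larger local "Young's modulus") deforms less:

```python
def deform_1d_mesh(x, stiffness, u_left, u_right):
    """Deform a 1-D mesh of nodes x by prescribing end displacements and
    solving the spring (linear-elastic) equilibrium for interior nodes.
    stiffness[i] is the spring constant of the element between nodes i and i+1."""
    n = len(x)
    u = [0.0] * n
    u[0], u[-1] = u_left, u_right
    # Equilibrium at interior node i: k[i-1]*(u[i]-u[i-1]) = k[i]*(u[i+1]-u[i]).
    # Solved by Gauss-Seidel iteration, adequate for a small sketch.
    for _ in range(2000):
        for i in range(1, n - 1):
            kl, kr = stiffness[i - 1], stiffness[i]
            u[i] = (kl * u[i - 1] + kr * u[i + 1]) / (kl + kr)
    return [xi + ui for xi, ui in zip(x, u)]
```

    With equal stiffness the boundary motion spreads evenly through the mesh; raising the stiffness of one element shifts more of the deformation into the softer elements, which is how a spatially varying modulus protects cells near the moving surface.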

  1. MoKey: A versatile exergame creator for everyday usage.

    PubMed

    Eckert, Martina; López, Marcos; Lázaro, Carlos; Meneses, Juan

    2017-11-27

    Currently, virtual applications for physical exercise are highly appreciated as rehabilitation instruments. This article presents a middleware called "MoKey" (Motion Keyboard), which converts standard off-the-shelf software into exergames (exercise games). A configurable set of gestures, captured by a motion capture camera, is translated into the key strokes required by the chosen software. The present study assesses the tool with regard to usability and viability in a heterogeneous group of 11 participants, aged 5 to 51, with moderate to severe disabilities, most of them wheelchair-bound. In comparison with FAAST (the Flexible Action and Articulated Skeleton Toolkit), MoKey achieved better results in terms of ease of use and computational load. Its viability as an exergame creation tool was proven with the help of four applications (PowerPoint®, an e-book reader, Skype®, and Tetris). Success rates of up to 91% were achieved, and subjective perception was rated at 4.5 points on a 0-5 scale. The middleware provides increased motivation due to the use of favorite software and the advantage of exploiting it for exercise. Used together with communication software or online games, social inclusion can be stimulated. Therapists can employ the tool to monitor the correctness and progress of the exercises.
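    The middleware idea, a configurable table translating recognized gestures into the keystrokes a stock application expects, can be sketched as follows (gesture names and bindings here are hypothetical, not MoKey's actual configuration):

```python
# A user-configurable gesture-to-key table, as a therapist might set it up
# for playing Tetris (all names illustrative):
GESTURE_KEYMAP = {
    "raise_left_arm": "LEFT",
    "raise_right_arm": "RIGHT",
    "lean_forward": "DOWN",
    "clap": "SPACE",
}

def gestures_to_keys(gesture_stream, keymap=GESTURE_KEYMAP):
    """Translate a stream of recognized gestures into key events,
    silently ignoring gestures with no binding."""
    return [keymap[g] for g in gesture_stream if g in keymap]

print(gestures_to_keys(["clap", "wave", "lean_forward"]))  # ['SPACE', 'DOWN']
```

    A real middleware would inject these as OS-level key events into the focused application; the lookup table is what makes the same exercise set drive PowerPoint, an e-book reader, or a game.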

  2. A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility.

    PubMed

    Zaballos, Agustín; Navarro, Joan; Martín De Pozuelo, Ramon

    2018-02-28

    Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems that typically result in rigid hardware architectures, which are hard to interoperate, manage and adapt to new situations. In recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration and resource reusability. The purpose of this paper is to propose the usage of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a smart grid's data storage and management system prototype by means of SDUs is introduced, which demonstrates the feasibility of this alternative approach. This system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt itself to their ever-evolving needs. The experiments conducted endorse the feasibility of this solution and encourage practitioners to direct their efforts in this direction.

  3. Design, Development and Pre-Flight Testing of the Communications, Navigation, and Networking Reconfigurable Testbed (Connect) to Investigate Software Defined Radio Architecture on the International Space Station

    NASA Technical Reports Server (NTRS)

    Over, Ann P.; Barrett, Michael J.; Reinhart, Richard C.; Free, James M.; Cikanek, Harry A., III

    2011-01-01

    The Communication Navigation and Networking Reconfigurable Testbed (CoNNeCT) is a NASA-sponsored mission that will investigate the usage of software-defined radios (SDRs) as a multi-function communication system for space missions. A software-defined radio system is a communication system in which typical components of the system (e.g., modulators) are implemented in software. The software-defined capability allows flexibility and experimentation with different modulation, coding and other parameters to understand their effects on performance. This flexibility builds inherent redundancy into the system for improved operational efficiency, real-time changes to space missions and enhanced reliability. The CoNNeCT Project is a collaboration between industrial radio providers and NASA: the industrial radio providers are providing the SDRs, and NASA is designing, building and testing the entire flight system. The flight system will be integrated on the Express Logistics Carrier (ELC) on the International Space Station (ISS) after launch on the H-IIB Transfer Vehicle in 2012. This paper provides an overview of the technology research objectives, payload description, design challenges and pre-flight testing results.

  4. Batching System for Superior Service

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Veridian's Portable Batch System (PBS) was the recipient of the 1997 NASA Space Act Award for outstanding software. A batch system is a set of processes for managing queues and jobs. Without a batch system, it is difficult to manage the workload of a computer system. By bundling the enterprise's computing resources, the PBS technology offers users a single coherent interface, resulting in efficient management of the batch services. Users choose which information to package into "containers" for system-wide use. PBS also provides detailed system usage data, a procedure not easily executed without this software. PBS operates on networked, multi-platform UNIX environments. Veridian's new version, PBS Pro™, has additional features and enhancements, including support for additional operating systems. Veridian distributes the original version of PBS as Open Source software via the PBS website. Customers can register and download the software at no cost. PBS Pro is also available via the web and offers additional features such as increased stability, reliability, and fault tolerance. A company using PBS can expect a significant increase in the effective management of its computing resources. Tangible benefits include increased utilization of costly resources and enhanced understanding of computational requirements and user needs.
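    The core of any batch system, a priority queue of jobs dispatched against a pool of resources, can be sketched minimally. This is an illustration of the concept, not PBS's actual scheduler:

```python
import heapq

class MiniBatchQueue:
    """Toy sketch of a batch queue: jobs are submitted with a priority and
    a node count, and dispatched whenever enough nodes are free."""

    def __init__(self, total_nodes):
        self.free_nodes = total_nodes
        self._heap = []
        self._counter = 0  # tie-breaker keeps FIFO order within a priority

    def submit(self, name, nodes, priority=0):
        # Lower priority value = dispatched earlier.
        heapq.heappush(self._heap, (priority, self._counter, name, nodes))
        self._counter += 1

    def dispatch(self):
        """Start every queued job that currently fits; return their names."""
        started, deferred = [], []
        while self._heap:
            prio, c, name, nodes = heapq.heappop(self._heap)
            if nodes <= self.free_nodes:
                self.free_nodes -= nodes
                started.append(name)
            else:
                deferred.append((prio, c, name, nodes))
        for item in deferred:
            heapq.heappush(self._heap, item)
        return started
```

    Submitting jobs needing 3, 2 and 1 nodes to a 4-node pool dispatches the first and third immediately and defers the second until nodes are released, the backfilling behaviour that makes batch scheduling pay off.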

  5. Students Soaring High with Software Spinoff

    NASA Technical Reports Server (NTRS)

    2004-01-01

    An educational software product designed by the Educational Technology Team at Ames Research Center is bringing actual aeronautical work performed by NASA engineers to the public in an interactive format for the very first time, in order to introduce future generations of engineers to the fundamentals of flight. The "Exploring Aeronautics" multimedia CD-ROM was created for use by teachers of students in grades 5 through 8. The software offers an introduction to aeronautics and covers the fundamentals of flight, including how airplanes take off, fly, and land. It contains a historical timeline and a glossary of aeronautical terms, examines different types of aircraft, and familiarizes its audience with the tools used by researchers to test aircraft designs, like wind tunnels and computational fluid dynamics. "'Exploring Aeronautics' was done in cartoon animation to make it appealing to kids," notes Andrew Doser, an Ames graphic artist who helped to produce the CD-ROM, along with a team of multimedia programmers, artists, and educators, in conjunction with numerous Ames scientists. In addition to lively animation, the software features QuickTime movies and highly intuitive tools to promote usage of NASA's scientific methods in the world of aeronautics.

  6. Deductive Glue Code Synthesis for Embedded Software Systems Based on Code Patterns

    NASA Technical Reports Server (NTRS)

    Liu, Jian; Fu, Jicheng; Zhang, Yansheng; Bastani, Farokh; Yen, I-Ling; Tai, Ann; Chau, Savio N.

    2006-01-01

    Automated code synthesis is a constructive process that can be used to generate programs from specifications. It can, thus, greatly reduce the software development cost and time. The use of formal code synthesis approach for software generation further increases the dependability of the system. Though code synthesis has many potential benefits, the synthesis techniques are still limited. Meanwhile, components are widely used in embedded system development. Applying code synthesis to component based software development (CBSD) process can greatly enhance the capability of code synthesis while reducing the component composition efforts. In this paper, we discuss the issues and techniques for applying deductive code synthesis techniques to CBSD. For deductive synthesis in CBSD, a rule base is the key for inferring appropriate component composition. We use the code patterns to guide the development of rules. Code patterns have been proposed to capture the typical usages of the components. Several general composition operations have been identified to facilitate systematic composition. We present the technique for rule development and automated generation of new patterns from existing code patterns. A case study of using this method in building a real-time control system is also presented.
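    The rule-base idea, matching a component's provided interface against another's required interface and instantiating a code pattern as glue, might be sketched like this (rule names, interface labels and templates are all hypothetical):

```python
# Each rule maps a (provided type, required type) pair to a glue-code
# pattern; the synthesizer instantiates the first matching pattern.
RULES = [
    (("sensor_reading", "filter_input"),
     "pipe({producer}.output, {consumer}.input)"),
    (("filter_output", "actuator_command"),
     "pipe({producer}.output, {consumer}.input)"),
]

def synthesize_glue(producer, produces, consumer, consumes):
    """Return glue code for a producer/consumer pair if a rule applies,
    else None (meaning no composition can be deduced)."""
    for (src, dst), template in RULES:
        if produces == src and consumes == dst:
            return template.format(producer=producer, consumer=consumer)
    return None

print(synthesize_glue("imu", "sensor_reading", "kalman", "filter_input"))
```

    A deductive synthesizer generalizes this lookup into logical inference over many rules, but the pattern-instantiation step is the same.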

  7. A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility

    PubMed Central

    2018-01-01

    Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems that typically result in rigid hardware architectures, which are hard to interoperate, manage and adapt to new situations. In the recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration and resource reusability. The purpose of this paper is to propose the usage of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a smart grid’s data storage and management system prototype by means of SDUs is introduced, which exhibits the feasibility of this alternative approach. This system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt itself to their ever-evolving needs. Conducted experimentations endorse the feasibility of this solution and encourage practitioners to point their efforts in this direction. PMID:29495599

  8. An analysis of integrated science and language arts themes in software at the elementary school level

    NASA Astrophysics Data System (ADS)

    Libidinsky, Lisa Jill

    2002-09-01

    There are many demands on the elementary classroom teacher today, and teachers often do not have the time and resources to instruct in a manner that produces effective, meaningful learning. Subjects are often taught in a disjointed fashion, disconnected from one another. When teachers instruct using an integrated approach, students learn more efficiently as they see connections between subjects. Science and language arts, when combined in an integrated approach, show positive associations that enable students to make real-life connections. In addition, with the advent of technology and the increased usage of technological programs in schools, teachers can use technology to support an integrated curriculum. When teachers use a combined instructional focus of science, language arts, and technology to produce lessons, students are able to gain knowledge of the concepts and skills necessary for appropriate academic growth and development. Given that there are many software programs available to teachers for classroom use, it is imperative that quality software be used for instruction. Using criteria based upon an intensive literature review of integrated instruction in science and language arts, this study examines science and language arts software programs to determine whether they contain integrated science and language arts themes. The study also examines whether such integrated themes are more prevalent in science or in language arts software programs. Overall, this study finds a significant difference between language arts software and science software with respect to integrated themes: science software exhibits themes integrated with language arts more often than language arts software does with science. The findings in this study can serve as a reference point for educators when selecting software that is meaningful and effective in the elementary classroom. 
Based on this study, it is apparent that there is a need to evaluate software for appropriate use in the classroom in order to promote effective education.

  9. What Are We Looking for in Computer-Based Learning Interventions in Medical Education? A Systematic Review

    PubMed Central

    Ferreira, Patrícia; Taveira-Gomes, Isabel; Severo, Milton; Ferreira, Maria Amélia

    2016-01-01

    Background Computer-based learning (CBL) has been widely used in medical education, and reports regarding its usage and effectiveness have varied widely. Most work has compared the effectiveness of CBL approaches against traditional methods, and little has examined the comparative effects of one CBL methodology against another. These findings have led authors to recommend such studies in the hope of improving knowledge about which CBL methods work best in which settings. Objective In this systematic review, we aimed to characterize recent studies of the development of software platforms and interventions in medical education, search for common points among studies, and assess whether recommendations for CBL research are being taken into consideration. Methods We conducted a systematic review of the literature published from 2003 through 2013. We included studies written in English, specifically in medical education, regarding either the development of instructional software or interventions using instructional software, during training or practice, that reported learner attitudes, satisfaction, knowledge, skills, or software usage. We conducted 2 latent class analyses to group articles according to platform features and intervention characteristics. In addition, we analyzed references and citations for abstracted articles. Results We analyzed 251 articles. The number of publications rose over time, and they encompassed most medical disciplines, learning settings, and training levels, totaling 25 different platforms specifically for medical education. We uncovered 4 latent classes for educational software, characteristically making use of multimedia (115/251, 45.8%), text (64/251, 25.5%), Web conferencing (54/251, 21.5%), and instructional design principles (18/251, 7.2%). We found 3 classes for intervention outcomes: knowledge and attitudes (175/212, 82.6%), knowledge, attitudes, and skills (11.8%), and online activity (12/212, 5.7%). 
About a quarter of the articles (58/227, 25.6%) did not hold references or citations in common with other articles. The number of common references and citations increased in articles reporting instructional design principles (P=.03), articles measuring online activities (P=.01), and articles citing a review by Cook and colleagues on CBL (P=.04). There was an association between number of citations and studies comparing CBL versus CBL, independent of publication date (P=.02). Conclusions Studies in this field vary highly, and a high number of software systems are being developed. It seems that past recommendations regarding CBL interventions are being taken into consideration. A move into a more student-centered model, a focus on implementing reusable software platforms for specific learning contexts, and the analysis of online activity to track and predict outcomes are relevant areas for future research in this field. PMID:27480053

  10. Quantitative comparison and evaluation of software packages for assessment of abdominal adipose tissue distribution by magnetic resonance imaging.

    PubMed

    Bonekamp, S; Ghosh, P; Crawford, S; Solga, S F; Horska, A; Brancati, F L; Diehl, A M; Smith, S; Clark, J M

    2008-01-01

    To examine five available software packages for the assessment of abdominal adipose tissue with magnetic resonance imaging, compare their features and assess the reliability of measurement results. Feature evaluation and test-retest reliability of software packages (NIHImage, SliceOmatic, Analyze, HippoFat and EasyVision) used in manual, semi-automated or automated segmentation of abdominal adipose tissue. A random sample of 15 obese adults with type 2 diabetes. Axial T1-weighted spin echo images centered at vertebral bodies of L2-L3 were acquired at 1.5 T. Five software packages were evaluated (NIHImage, SliceOmatic, Analyze, HippoFat and EasyVision), comparing manual, semi-automated and automated segmentation approaches. Images were segmented into cross-sectional area (CSA), and the areas of visceral (VAT) and subcutaneous adipose tissue (SAT). Ease of learning and use and the design of the graphical user interface (GUI) were rated. Intra-observer accuracy and agreement between the software packages were calculated using intra-class correlation. Intra-class correlation coefficient was used to obtain test-retest reliability. Three of the five evaluated programs offered a semi-automated technique to segment the images based on histogram values or a user-defined threshold. One software package allowed manual delineation only. One fully automated program demonstrated the drawbacks of uncritical automated processing. The semi-automated approaches reduced variability and measurement error, and improved reproducibility. There was no significant difference in the intra-observer agreement in SAT and CSA. The VAT measurements showed significantly lower test-retest reliability. There were some differences between the software packages in qualitative aspects, such as user friendliness. Four out of five packages provided essentially the same results with respect to the inter- and intra-rater reproducibility. 
Our results using SliceOmatic, Analyze or NIHImage were comparable and could be used interchangeably. Newly developed fully automated approaches should be compared to one of the examined software packages.
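    The intra-class correlation on which this study's reliability results rest can be sketched in its simplest one-way, single-measure form, ICC(1,1); the published analysis may have used a different ICC variant:

```python
def icc_1_1(data):
    """One-way random, single-measure intraclass correlation ICC(1,1).
    data: one row per subject, each row holding k repeated measurements."""
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    means = [sum(row) / k for row in data]
    # Between-subject and within-subject mean squares from one-way ANOVA.
    msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    msw = sum((x - m) ** 2 for row, m in zip(data, means) for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical test-retest areas (cm^2) for three subjects, two scans each:
print(round(icc_1_1([[10.0, 11.0], [12.0, 13.0], [8.0, 9.0]]), 3))
```

    An ICC near 1 means between-subject variance dominates measurement error; the lower test-retest ICC reported for VAT corresponds to relatively larger within-subject scatter.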

  11. Quantitative comparison and evaluation of software packages for assessment of abdominal adipose tissue distribution by magnetic resonance imaging

    PubMed Central

    Bonekamp, S; Ghosh, P; Crawford, S; Solga, SF; Horska, A; Brancati, FL; Diehl, AM; Smith, S; Clark, JM

    2009-01-01

    Objective To examine five available software packages for the assessment of abdominal adipose tissue with magnetic resonance imaging, compare their features and assess the reliability of measurement results. Design Feature evaluation and test–retest reliability of software packages (NIHImage, SliceOmatic, Analyze, HippoFat and EasyVision) used in manual, semi-automated or automated segmentation of abdominal adipose tissue. Subjects A random sample of 15 obese adults with type 2 diabetes. Measurements Axial T1-weighted spin echo images centered at vertebral bodies of L2–L3 were acquired at 1.5 T. Five software packages were evaluated (NIHImage, SliceOmatic, Analyze, HippoFat and EasyVision), comparing manual, semi-automated and automated segmentation approaches. Images were segmented into cross-sectional area (CSA), and the areas of visceral (VAT) and subcutaneous adipose tissue (SAT). Ease of learning and use and the design of the graphical user interface (GUI) were rated. Intra-observer accuracy and agreement between the software packages were calculated using intra-class correlation. Intra-class correlation coefficient was used to obtain test–retest reliability. Results Three of the five evaluated programs offered a semi-automated technique to segment the images based on histogram values or a user-defined threshold. One software package allowed manual delineation only. One fully automated program demonstrated the drawbacks of uncritical automated processing. The semi-automated approaches reduced variability and measurement error, and improved reproducibility. There was no significant difference in the intra-observer agreement in SAT and CSA. The VAT measurements showed significantly lower test–retest reliability. There were some differences between the software packages in qualitative aspects, such as user friendliness. Conclusion Four out of five packages provided essentially the same results with respect to the inter- and intra-rater reproducibility. 
Our results using SliceOmatic, Analyze or NIHImage were comparable and could be used interchangeably. Newly developed fully automated approaches should be compared to one of the examined software packages. PMID:17700582

  12. Model format for a vaccine stability report and software solutions.

    PubMed

    Shin, Jinho; Southern, James; Schofield, Timothy

    2009-11-01

    A session of the International Association for Biologicals Workshop on Stability Evaluation of Vaccine, a Life Cycle Approach was devoted to a model format for a vaccine stability report and to software solutions. Presentations highlighted the utility of a model format that conforms to regulatory requirements and the ICH common technical document. However, there needs to be flexibility to accommodate individual company practices. Adoption of a model format is premised upon agreement on content between industry and regulators, and upon ease of use. Software requirements will include ease of use and protections against inadvertent misspecification of the stability design or misinterpretation of program output.

  13. Averting Denver Airports on a Chip

    NASA Technical Reports Server (NTRS)

    Sullivan, Kevin J.

    1995-01-01

    As a result of recent advances in software engineering capabilities, we are now in a more stable environment. De-facto hardware and software standards are emerging. Work on software architecture and design patterns signals a consensus on the importance of early system-level design decisions, and agreements on the uses of certain paradigmatic software structures. We now routinely build systems that would have been risky or infeasible a few years ago. Unfortunately, technological developments threaten to destabilize software design again. Systems designed around novel computing and peripheral devices will spark ambitious new projects that will stress current software design and engineering capabilities. Micro-electro-mechanical systems (MEMS) and related technologies provide the physical basis for new systems with the potential to produce this kind of destabilizing effect. One important response to anticipated software engineering and design difficulties is carefully directed engineering-scientific research. Two specific problems meriting substantial research attention are: A lack of sufficient means to build software systems by generating, extending, specializing, and integrating large-scale reusable components; and a lack of adequate computational and analytic tools to extend and aid engineers in maintaining intellectual control over complex software designs.

  14. Relationship between Two Types of Coil Packing Densities Relative to Aneurysm Size.

    PubMed

    Park, Keun Young; Kim, Byung Moon; Ihm, Eun Hyun; Baek, Jang Hyun; Kim, Dong Joon; Kim, Dong Ik; Huh, Seung Kon; Lee, Jae Whan

    2015-01-01

    Coil packing density (PD) can be calculated via a formula (PDF) or software (PDS), and the two can differ for the same aneurysm. This study aimed to evaluate the interobserver agreement and the relationship between the two types of PD relative to aneurysm size. A consecutive series of 420 saccular aneurysms was treated with coiling. PD (PDF, [coil volume]/[volume calculated by formula]; PDS, [coil volume]/[volume measured by software]) was calculated and prospectively recorded. Interobserver agreement was evaluated for PDF and PDS. Additionally, the relationship between PDF and PDS relative to aneurysm size was analyzed. Interobserver agreement for PDF and PDS was excellent (intraclass correlation coefficient: PDF, 0.967; PDS, 0.998). The ratio of PDF to PDS was greater for smaller aneurysms and converged toward 1.0 as the maximum dimension (DM) of the aneurysm increased. Compared with PDS, PDF was overestimated by a mean of 28% for DM < 5 mm, by 17% for 5 mm ≤ DM < 10 mm, and by 9% for DM ≥ 10 mm (P < 0.01). Interobserver agreement for PDF and PDS was excellent. However, PDF was overestimated in smaller aneurysms and converged to PDS as aneurysm size increased. Copyright © 2014 by the American Society of Neuroimaging.
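    The formula-based density PDF divides inserted coil volume by an aneurysm volume computed from measured diameters. A sketch assuming the common ellipsoid approximation (the study's exact formula is not given in the abstract, and the coil dimensions below are illustrative):

```python
import math

def coil_volume(coils):
    """Total inserted coil volume; each coil is (wire diameter, length), in mm."""
    return sum(math.pi * (d / 2) ** 2 * length for d, length in coils)

def aneurysm_volume_ellipsoid(a, b, c):
    """Ellipsoid approximation of aneurysm volume from three orthogonal diameters (mm)."""
    return math.pi * a * b * c / 6.0

def packing_density(coils, a, b, c):
    """PDF-style packing density: coil volume over formula-based aneurysm volume."""
    return coil_volume(coils) / aneurysm_volume_ellipsoid(a, b, c)

# 200 mm of 0.25 mm wire in a 5 x 5 x 5 mm aneurysm -> 15% packing density:
print(round(packing_density([(0.25, 200.0)], 5.0, 5.0, 5.0), 3))
```

    The study's point is that the denominator differs between methods: small aneurysms deviate most from the ellipsoid idealization, so PDF overestimates PD there relative to a software-measured volume.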

  15. Variations in algorithm implementation among quantitative texture analysis software packages

    NASA Astrophysics Data System (ADS)

    Foy, Joseph J.; Mitta, Prerana; Nowosatka, Lauren R.; Mendel, Kayla R.; Li, Hui; Giger, Maryellen L.; Al-Hallaq, Hania; Armato, Samuel G.

    2018-02-01

    Open-source texture analysis software allows for the advancement of radiomics research. Variations in texture features, however, result from discrepancies in algorithm implementation. Anatomically matched regions of interest (ROIs) that captured normal breast parenchyma were placed in the magnetic resonance images (MRI) of 20 patients at two time points. Six first-order features and six gray-level co-occurrence matrix (GLCM) features were calculated for each ROI using four texture analysis packages. Features were extracted using package-specific default GLCM parameters and using GLCM parameters modified to yield the greatest consistency among packages. Relative change in the value of each feature between time points was calculated for each ROI. Distributions of relative feature value differences were compared across packages. Absolute agreement among feature values was quantified by the intra-class correlation coefficient. Among first-order features, significant differences were found for max, range, and mean, and only kurtosis showed poor agreement. All six second-order features showed significant differences using package-specific default GLCM parameters, and five second-order features showed poor agreement; with modified GLCM parameters, no significant differences among second-order features were found, and all second-order features showed poor agreement. While relative texture change discrepancies existed across packages, these differences were not significant when consistent parameters were used.
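    The package discrepancies described above come down to GLCM parameter choices. A minimal sketch of a GLCM and one derived feature (contrast) makes those choices explicit; real packages differ in exactly these defaults:

```python
def glcm(image, dx=1, dy=0, levels=4, symmetric=True, normed=True):
    """Gray-level co-occurrence matrix for a single pixel offset (dx, dy).
    The offset, number of gray levels, symmetry and normalization are the
    implementation choices on which texture packages commonly disagree."""
    rows, cols = len(image), len(image[0])
    g = [[0.0] * levels for _ in range(levels)]
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                g[image[r][c]][image[r2][c2]] += 1
    if symmetric:
        g = [[g[i][j] + g[j][i] for j in range(levels)] for i in range(levels)]
    total = sum(map(sum, g))
    if normed and total:
        g = [[v / total for v in row] for row in g]
    return g

def glcm_contrast(g):
    """GLCM contrast: sum over (i, j) of (i - j)^2 * p(i, j)."""
    return sum((i - j) ** 2 * v for i, row in enumerate(g) for j, v in enumerate(row))
```

    Changing any default, e.g. dropping symmetrization or re-quantizing to a different number of levels, changes the derived feature values, which is why matched parameters were needed to reconcile the packages.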

  16. Schodack Smart Roadside Inspection System.

    DOT National Transportation Integrated Search

    2013-02-01

    Under an earlier NYSERDA Agreement (17420) Intelligent Imaging Systems (IIS) supplied and installed Smart Roadside network software and integrated new connected vehicle roadside devices into the Schodack Smart Roadside system. The Smart Roadsid...

  17. The variables V477 Peg and MW Com observation results

    NASA Astrophysics Data System (ADS)

    Bahý, V.; Gajtanska, M.; Hanisko, P.; Krišták, L.

    2018-04-01

    The paper deals with our results of the photometric observations of two variable stars and with basic interprettions of our results. We have observed the V477 Pegassi and MW Comae systems. We have obtained their light curves in the integral light and in the B, V, R and I filters. The color indices have been computed and there have been realized the models of the both systems by the usage of the BM3 software. These models are presented in our study too.

  18. Porting of EPICS to Real Time UNIX, and Usage Ported EPICS for FEL Automation

    NASA Astrophysics Data System (ADS)

    Salikova, Tatiana

    This article describes the concepts and mechanisms used in porting the EPICS (Experimental Physics and Industrial Control System) code base to the UNIX operating system platform. Without disturbing the EPICS architecture, the new port supports the real-time operating system LynxOS/x86 and equipment produced by INP (Budker Institute of Nuclear Physics). Applying the ported EPICS reduces the cost of the software and hardware used for automation of the FEL (Free Electron Laser) complex.

  19. MAPGEN CARTOGRAPHIC SYSTEM.

    USGS Publications Warehouse

    Evenden, Gerald I.; ,

    1986-01-01

    MAPGEN is a software system that facilitates production of cartographic displays in the research and production environment. The system generates a set of metagraphic overlays of application-defined geographical information that can be aggregated in any combination for display without reprocessing the original data. An overview of the control files, available cartographic projections, graphic attributes, overlay generator and ancillary support programs, and the device-independent graphic subsystem are presented, along with examples of usage. System transportability and associated host hardware and operating system requirements are also addressed.

  20. The high-energy physicistʼs guide to MathLink

    NASA Astrophysics Data System (ADS)

    Hahn, T.

    2012-03-01

    MathLink is Wolfram Research's protocol for communicating with the Mathematica Kernel and is used extensively in their own Notebook Frontends. The Mathematica Book insinuates that linking C programs with MathLink is straightforward but in practice there are quite a number of stumbling blocks, in particular in cross-language and cross-platform usage. This write-up tries to clarify the main issues and hopefully makes it easier for software authors to set up Mathematica interfacing in a portable way.

  1. Governance in Open Source Software Development Projects: Towards a Model for Network-Centric Edge Organizations

    DTIC Science & Technology

    2008-06-01

    project is not an isolated OSSD project. Instead, the NetBeans IDE which is the focus of development activities in the NetBeans.org project community...facilitate or constrain the intended usage of the NetBeans IDE. Figure 1 provides a rendering of some of the more visible OSSD projects that...as BioBeans and RefactorIT communities build tools on top of or extending the NetBeans platform or IDE. How do these organizations interact with

  2. Digital beacon receiver for ionospheric TEC measurement developed with GNU Radio

    NASA Astrophysics Data System (ADS)

    Yamamoto, M.

    2008-11-01

    A simple digital receiver named GNU Radio Beacon Receiver (GRBR) was developed for the satellite-ground beacon experiment to measure the ionospheric total electron content (TEC). The open-source software toolkit for software-defined radio, GNU Radio, is utilized to realize the basic functions of the receiver and perform fast signal processing. The software is written in Python for a Linux PC. The open-source hardware called Universal Software Radio Peripheral (USRP), which best matches GNU Radio, is used as a front-end to acquire the satellite beacon signals at 150 and 400 MHz. The first experiment was successful: results from GRBR showed very good agreement with those from the co-located analog beacon receiver. Detailed design information and software code are available at http://www.rish.kyoto-u.ac.jp/digitalbeacon/.
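    The geometry-free relation underlying such dual-frequency beacon measurements (a sketch of the physics, not GRBR's actual code): the ionospheric phase advance at frequency f is approximately K·TEC/(c·f) cycles with K ≈ 40.308 in SI units, so the coherent 150/400 MHz pair accumulates a differential phase proportional to slant TEC, which can be inverted to recover relative TEC:

```python
C = 2.99792458e8           # speed of light, m/s
K = 40.308                 # ionospheric refraction constant (SI)
F1, F2 = 150.0e6, 400.0e6  # beacon frequencies, Hz

def diff_phase_cycles(tec):
    """Differential ionospheric phase (cycles) of the 150/400 MHz pair for a
    slant TEC in electrons/m^2: each carrier advances by ~K*TEC/(c*f) cycles,
    so the pair differs by K*TEC/c * (1/f1 - 1/f2)."""
    return K / C * tec * (1.0 / F1 - 1.0 / F2)

def tec_from_diff_phase(cycles):
    """Invert the relation: relative slant TEC from a measured phase difference."""
    return cycles * C / (K * (1.0 / F1 - 1.0 / F2))

cycles_per_tecu = diff_phase_cycles(1.0e16)   # 1 TECU -> ~5.6 cycles
```

    In practice only relative TEC is obtained this way, since the integer cycle ambiguity of the differential phase is unknown.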

  3. 48 CFR 208.002 - Priorities for use of Government supply sources.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE ACQUISITION PLANNING REQUIRED SOURCES OF SUPPLIES AND SERVICES 208... Acquisition, and Subpart 208.74, Enterprise Software Agreements. [71 FR 39004, July 11, 2006] ...

  4. HSCT4.0 Application: Software Requirements Specification

    NASA Technical Reports Server (NTRS)

    Salas, A. O.; Walsh, J. L.; Mason, B. H.; Weston, R. P.; Townsend, J. C.; Samareh, J. A.; Green, L. L.

    2001-01-01

    The software requirements for the High Performance Computing and Communication Program High Speed Civil Transport application project, referred to as HSCT4.0, are described. The objective of the HSCT4.0 application project is to demonstrate the application of high-performance computing techniques to the problem of multidisciplinary design optimization of a supersonic transport configuration, using high-fidelity analysis simulations. Descriptions of the various functions (and the relationships among them) that make up the multidisciplinary application, as well as the constraints on the software design, are provided. This document serves to establish an agreement between the suppliers and the customer as to what the HSCT4.0 application should do and provides to the software developers the information necessary to design and implement the system.

  5. Aircraft Design Software

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Successful commercialization of the AirCraft SYNThesis (ACSYNT) tool has resulted in the creation of Phoenix Integration, Inc. ACSYNT has been exclusively licensed to the company, an outcome of a seven year, $3 million effort to provide unique software technology to a focused design engineering market. Ames Research Center formulated ACSYNT and in working with the Virginia Polytechnic Institute CAD Laboratory, began to design and code a computer-aided design for ACSYNT. Using a Joint Sponsored Research Agreement, Ames formed an industry-government-university alliance to improve and foster research and development for the software. As a result of the ACSYNT Institute, the software is becoming a predominant tool for aircraft conceptual design. ACSYNT has been successfully applied to high- speed civil transport configuration, subsonic transports, and supersonic fighters.

  6. Federal COBOL Compiler Testing Service Compiler Validation Request Information.

    DTIC Science & Technology

    1977-05-09

    background of the Federal COBOL Compiler Testing Service which was set up by a memorandum of agreement between the National Bureau of Standards and the...Federal Standard, and the requirement of COBOL compiler validation in the procurement process. It also contains a list of all software products...produced by the software Development Division in support of the FCCTS as well as the Validation Summary Reports produced as a result of discharging the

  7. Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment

    NASA Technical Reports Server (NTRS)

    Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun

    2006-01-01

    Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development, chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operation components.
We have also had to determine how to integrate existing mission applications for sequence development, sequence validation, and high level activity planning, and other functions into a component-based environment. For each of these, we used a somewhat different technique based upon the structure and usage of the existing application.

  8. 3D echocardiographic analysis of aortic annulus for transcatheter aortic valve replacement using novel aortic valve quantification software: Comparison with computed tomography.

    PubMed

    Mediratta, Anuj; Addetia, Karima; Medvedofsky, Diego; Schneider, Robert J; Kruse, Eric; Shah, Atman P; Nathan, Sandeep; Paul, Jonathan D; Blair, John E; Ota, Takeyoshi; Balkhy, Husam H; Patel, Amit R; Mor-Avi, Victor; Lang, Roberto M

    2017-05-01

    With the increasing use of transcatheter aortic valve replacement (TAVR) in patients with aortic stenosis (AS), computed tomography (CT) remains the standard for annulus sizing. However, 3D transesophageal echocardiography (TEE) has been an alternative in patients with contraindications to CT. We sought to (1) test the feasibility, accuracy, and reproducibility of prototype 3DTEE analysis software (Philips) for aortic annular measurements and (2) compare the new approach to the existing echocardiographic techniques. We prospectively studied 52 patients who underwent gated contrast CT, procedural 3DTEE, and TAVR. 3DTEE images were analyzed using novel semi-automated software designed for 3D measurements of the aortic root, which uses multiplanar reconstruction, similar to CT analysis. Aortic annulus measurements included area, perimeter, and diameters derived from these measurements. The results were compared to CT-derived values. Additionally, existing 3D echocardiographic measurements (3D planimetry and mitral valve analysis software adapted for the aortic valve) were also compared to the CT reference values. 3DTEE image quality was sufficient for aortic annulus measurement with the new software in 90% of patients; measurements were in good agreement with CT (r-values: .89-.91), with small (<4%), nonsignificant inter-modality biases. Repeated measurements showed <10% variability. The new 3D analysis was more accurate and reproducible than the existing echocardiographic techniques. Novel semi-automated 3DTEE analysis software can accurately measure the aortic annulus in patients with severe AS undergoing TAVR, in better agreement with CT than the existing methodology. Accordingly, intra-procedural TEE could potentially replace CT in patients where CT carries significant risk. © 2017, Wiley Periodicals, Inc.
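    The area- and perimeter-derived effective diameters used in annulus sizing follow from circle relations; for a non-circular (elliptical) annulus the perimeter-derived diameter always exceeds the area-derived one, which is why the two are reported separately. A sketch with illustrative dimensions (not patient data):

```python
import math

def diameter_from_area(area_mm2):
    """Effective diameter of a circle with the same area: d = 2*sqrt(A/pi)."""
    return 2.0 * math.sqrt(area_mm2 / math.pi)

def diameter_from_perimeter(perim_mm):
    """Effective diameter of a circle with the same perimeter: d = P/pi."""
    return perim_mm / math.pi

# Illustrative elliptical annulus: semi-axes 13 x 11 mm (26 x 22 mm overall)
a, b = 13.0, 11.0
area = math.pi * a * b
h = ((a - b) / (a + b)) ** 2  # Ramanujan ellipse-perimeter approximation
perim = math.pi * (a + b) * (1 + 3 * h / (10 + math.sqrt(4 - 3 * h)))

d_area = diameter_from_area(area)         # ~23.9 mm
d_perim = diameter_from_perimeter(perim)  # ~24.0 mm, >= d_area for an ellipse
```

    The gap between the two diameters grows with annular eccentricity, which is one reason multiplanar 3D measurements are preferred over single-plane diameters.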

  9. A tool to include gamma analysis software into a quality assurance program.

    PubMed

    Agnew, Christina E; McGarry, Conor K

    2016-03-01

    To provide a tool to enable gamma analysis software algorithms to be included in a quality assurance (QA) program. Four image sets were created, comprising two geometric images to independently test the distance-to-agreement (DTA) and dose-difference (DD) elements of the gamma algorithm, a clinical step-and-shoot IMRT field, and a clinical VMAT arc. The images were analysed using global and local gamma analysis with 2 in-house and 8 commercially available software packages, encompassing 15 software versions. The effect of image resolution on gamma pass rates was also investigated. All but one package accurately calculated the gamma passing rate for the geometric images. Variation in global gamma passing rates of 1% at 3%/3 mm and over 2% at 1%/1 mm was measured between packages and software versions with analysis of appropriately sampled images. This study provides a suite of test images and the gamma pass rates achieved for a selection of commercially available software. This image suite will enable validation of gamma analysis software within a QA program and provide a frame of reference by which to compare results reported in the literature from various manufacturers and software versions. Copyright © 2015. Published by Elsevier Ireland Ltd.
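    For context, the gamma index combines the DD and DTA criteria: for each reference point it is the minimum, over all evaluated points, of sqrt((Δr/DTA)² + (ΔD/(DD·D_norm))²), where D_norm is the global maximum dose (global gamma) or the local reference dose (local gamma). A brute-force 1-D sketch, not any tested package's implementation:

```python
import numpy as np

def gamma_1d(ref_dose, ref_x, eval_dose, eval_x, dd=0.03, dta=3.0, local=False):
    """Brute-force 1-D gamma index: for each reference point, minimize
    sqrt((dx/DTA)^2 + (dD/(DD*norm))^2) over all evaluated points."""
    global_norm = ref_dose.max()
    gammas = []
    for xr, dr in zip(ref_x, ref_dose):
        norm = dr if local else global_norm        # local vs global normalization
        dist2 = ((eval_x - xr) / dta) ** 2
        dose2 = ((eval_dose - dr) / (dd * norm)) ** 2
        gammas.append(np.sqrt(dist2 + dose2).min())
    return np.array(gammas)

x = np.linspace(0.0, 10.0, 101)                # 0.1 mm sampling
ref = 100.0 * np.exp(-((x - 5.0) ** 2) / 4.0)  # Gaussian reference profile
meas = 1.02 * ref                              # uniform +2% dose error
g = gamma_1d(ref, x, meas, x, dd=0.03, dta=3.0)
pass_rate = 100.0 * (g <= 1.0).mean()          # all points pass at 3%/3 mm
```

    Package-to-package differences of the kind the study reports typically come from interpolation between evaluated points, normalization choices, and dose-threshold handling, none of which this brute-force version models.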

  10. Inter- and Intrarater Reliability Using Different Software Versions of E4D Compare in Dental Education.

    PubMed

    Callan, Richard S; Cooper, Jeril R; Young, Nancy B; Mollica, Anthony G; Furness, Alan R; Looney, Stephen W

    2015-06-01

    The problems associated with intra- and interexaminer reliability when assessing preclinical performance continue to hinder dental educators' ability to provide accurate and meaningful feedback to students. Many studies have been conducted to evaluate the validity of utilizing various technologies to assist educators in achieving that goal. The purpose of this study was to compare two different versions of E4D Compare software to determine if either could be expected to deliver consistent and reliable comparative results, independent of the individual utilizing the technology. Five faculty members obtained E4D digital images of students' attempts (sample model) at ideal gold crown preparations for tooth #30 performed on typodont teeth. These images were compared to an ideal (master model) preparation utilizing two versions of E4D Compare software. The percent correlations between and within these faculty members were recorded and averaged. The intraclass correlation coefficient was used to measure both inter- and intrarater agreement among the examiners. The study found that using the older version of E4D Compare did not result in acceptable intra- or interrater agreement among the examiners. However, the newer version of E4D Compare, when combined with the Nevo scanner, resulted in a remarkable degree of agreement both between and within the examiners. These results suggest that consistent and reliable results can be expected when utilizing this technology under the protocol described in this study.
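    The agreement statistic used here, the intraclass correlation coefficient, comes from a two-way ANOVA decomposition. A minimal ICC(2,1) sketch (two-way random effects, absolute agreement, single measurement, following the Shrout-Fleiss formulation; the study does not state which ICC form it used):

```python
import numpy as np

def icc2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater,
    via the classical mean-squares decomposition (Shrout-Fleiss)."""
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject means
    col_means = ratings.mean(axis=0)   # per-rater means
    ss_total = ((ratings - grand) ** 2).sum()
    ss_rows = k * ((row_means - grand) ** 2).sum()   # between subjects
    ss_cols = n * ((col_means - grand) ** 2).sum()   # between raters
    ss_err = ss_total - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)
```

    With absolute agreement, a constant offset between raters lowers the ICC even when their rankings agree perfectly, which is the behavior one wants when grading student preparations.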

  11. The Self-Perception and Usage of Medical Apps amongst Medical Students in the United States: A Cross-Sectional Survey

    PubMed Central

    Craft, Noah

    2016-01-01

    Background. Mobile medical software applications (apps) are used for clinical decision-making at the point of care. Objectives. To determine (1) the usage, reliability, and popularity of mobile medical apps and (2) medical students' perceptions of app usage effect on the quality of patient-provider interaction in healthcare settings. Methods. An anonymous web-based survey was distributed to medical students. Frequency of use, type of app used, and perceptions of reliability were assessed via univariate analysis. Results. Seven hundred thirty-one medical students responded, equating to a response rate of 29%. The majority (90%) of participants thought that medical apps enhance clinical knowledge, and 61% said that medical apps are as reliable as textbooks. While students thought that medical apps save time, improve the care of their patients, and improve diagnostic accuracy, 53% of participants believed that mobile device use in front of colleagues and patients makes one appear less competent. Conclusion. While medical students believe in the utility and reliability of medical apps, they were hesitant to use them out of fear of appearing less engaged. Higher levels of training correlated with a greater degree of comfort when using medical apps in front of patients. PMID:27688752

  12. Algorithms and software for solving finite element equations on serial and parallel architectures

    NASA Technical Reports Server (NTRS)

    Chu, Eleanor; George, Alan

    1988-01-01

    The primary objective was to compare the performance of state-of-the-art techniques for solving sparse systems with those currently available in the Computational Structural Mechanics (CSM) testbed. One of the first tasks was to become familiar with the structure of the testbed and to install some or all of the SPARSPAK package in it. A brief overview of the CSM Testbed software and its usage is presented, together with an overview of the sparse matrix techniques currently employed in the CSM Testbed. An interface designed and implemented as a research tool for installing and appraising new matrix processors in the CSM Testbed is described. Results of numerical experiments in solving a set of testbed demonstration problems using the processor SPK and other experimental processors are presented.

  13. Tool development in threat assessment: syntax regularization and correlative analysis. Final report Task I and Task II, November 21, 1977-May 21, 1978. [Linguistic analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miron, M.S.; Christopher, C.; Hirshfield, S.

    1978-05-01

    Psycholinguistics provides crisis managers in nuclear threat incidents with a quantitative methodology that can aid in the determination of threat credibility, authorship identification, and perpetrator apprehension. The objective of this contract is to improve and enhance present psycholinguistic software systems by means of newly developed, computer-automated techniques that significantly extend the technology of automated content and stylistic analysis of nuclear threats. In accordance with this overall objective, the first two contract Tasks have been completed and are reported on in this document. The first Task specifies the development of software support for the syntax regularization of vocabulary to root form. The second calls for the exploration and development of alternative approaches to correlative analysis of vocabulary usage.

  14. Leveraging object-oriented development at Ames

    NASA Technical Reports Server (NTRS)

    Wenneson, Greg; Connell, John

    1994-01-01

    This paper presents lessons learned by the Software Engineering Process Group (SEPG) from results of supporting two projects at NASA Ames using an Object Oriented Rapid Prototyping (OORP) approach supported by a full featured visual development environment. Supplemental lessons learned from a large project in progress and a requirements definition are also incorporated. The paper demonstrates how productivity gains can be made by leveraging the developer with a rich development environment, correct and early requirements definition using rapid prototyping, and earlier and better effort estimation and software sizing through object-oriented methods and metrics. Although the individual elements of OO methods, RP approach and OO metrics had been used on other separate projects, the reported projects were the first integrated usage supported by a rich development environment. Overall the approach used was twice as productive (measured by hours per OO Unit) as a C++ development.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bouzaki, Mohammed Moustafa, E-mail: bouzaki-physique1@yahoo.fr; Chadel, Meriem; Université de Lorraine, LMOPS, EA 4423, 57070 Metz

    This contribution analyzes the energy provided by a solar kit dedicated to autonomous usage and installed in Central Europe (longitude 6.10°, latitude 49.21°, altitude 160 m), using the simulation software PVSYST. We focused the analysis on the effect of temperature and solar irradiation on the I-V characteristic of a commercial PV panel. We also consider the influence of charging and discharging the battery on the generator efficiency. Meteorological data are integrated into the simulation software. As expected, the energy delivered by the solar kit varies over the year, with a minimum in December. In the proposed approach, we consider this minimum as the lowest acceptable energy level to satisfy the use. Thus, for the other months, part of the available renewable energy is lost if no storage system is included.
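    The temperature effect the authors examine is commonly modeled with a linear power temperature coefficient and a NOCT-based cell temperature estimate. A simplified sketch (the coefficient values are typical crystalline-silicon figures, not parameters of the panel studied):

```python
def cell_temp(t_amb_c, g_wm2, noct_c=45.0):
    """NOCT-based cell temperature estimate: Tc = Ta + (NOCT - 20)/800 * G."""
    return t_amb_c + (noct_c - 20.0) / 800.0 * g_wm2

def pv_power(g_wm2, t_cell_c, p_stc_w=100.0, gamma_p=-0.004):
    """Simplified panel output: linear in irradiance, derated linearly with
    cell temperature above 25 C (gamma_p ~ -0.4 %/K for crystalline Si)."""
    return p_stc_w * (g_wm2 / 1000.0) * (1.0 + gamma_p * (t_cell_c - 25.0))

# Same irradiance, hotter ambient -> noticeably less power
p_summer = pv_power(800.0, cell_temp(30.0, 800.0))
p_winter = pv_power(800.0, cell_temp(0.0, 800.0))
```

    Tools such as PVSYST use far more detailed one-diode models, but this linear derating already captures why identical irradiance yields less energy on hot days.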

  16. Molecular docking.

    PubMed

    Morris, Garrett M; Lim-Wilby, Marguerita

    2008-01-01

    Molecular docking is a key tool in structural molecular biology and computer-assisted drug design. The goal of ligand-protein docking is to predict the predominant binding mode(s) of a ligand with a protein of known three-dimensional structure. Successful docking methods search high-dimensional spaces effectively and use a scoring function that correctly ranks candidate dockings. Docking can be used to perform virtual screening on large libraries of compounds, rank the results, and propose structural hypotheses of how the ligands inhibit the target, which is invaluable in lead optimization. The setting up of the input structures for the docking is just as important as the docking itself, and analyzing the results of stochastic search methods can sometimes be unclear. This chapter discusses the background and theory of molecular docking software, and covers the usage of some of the most-cited docking software.

  17. ANSYS UIDL-Based CAE Development of Axial Support System for Optical Mirror

    NASA Astrophysics Data System (ADS)

    Yang, De-Hua; Shao, Liang

    2008-09-01

    The Whiffle-tree type axial support mechanism is widely adopted for most relatively large optical mirrors. Based on the secondary development tools offered by the commonly used finite element analysis (FEA) software ANSYS, the ANSYS Parametric Design Language (APDL) is used to create a parameter-driven FEA model of the mirror, and the ANSYS User Interface Design Language (UIDL) to generate custom interactive menus; thereby, a relatively independent, dedicated Computer Aided Engineering (CAE) module is embedded in ANSYS for the calculation and optimization of axial Whiffle-tree supports of optical mirrors. An example illustrates the intuitive and effective use of the dedicated module, which boosts work efficiency and reduces the engineering knowledge demanded of the user. The philosophy of a secondary-developed special module within commonly used software also suggests itself for product development in other industries.

  18. Tools and Approaches for the Construction of Knowledge Models from the Neuroscientific Literature

    PubMed Central

    Burns, Gully A. P. C.; Khan, Arshad M.; Ghandeharizadeh, Shahram; O’Neill, Mark A.; Chen, Yi-Shin

    2015-01-01

    Within this paper, we describe a neuroinformatics project (called “NeuroScholar,” http://www.neuroscholar.org/) that enables researchers to examine, manage, manipulate, and use the information contained within the published neuroscientific literature. The project is built within a multi-level, multi-component framework constructed with the use of software engineering methods that themselves provide code-building functionality for neuroinformaticians. We describe the different software layers of the system. First, we present a hypothetical usage scenario illustrating how NeuroScholar permits users to address large-scale questions in a way that would otherwise be impossible. We do this by applying NeuroScholar to a “real-world” neuroscience question: How is stress-related information processed in the brain? We then explain how the overall design of NeuroScholar enables the system to work and illustrate different components of the user interface. We then describe the knowledge management strategy we use to store interpretations. Finally, we describe the software engineering framework we have devised (called the “View-Primitive-Data Model framework,” [VPDMf]) to provide an open-source, accelerated software development environment for the project. We believe that NeuroScholar will be useful to experimental neuroscientists by helping them interact with the primary neuroscientific literature in a meaningful way, and to neuroinformaticians by providing them with useful, affordable software engineering tools. PMID:15055395

  19. System Testing of Ground Cooling System Components

    NASA Technical Reports Server (NTRS)

    Ensey, Tyler Steven

    2014-01-01

    This internship focused primarily upon software unit testing of Ground Cooling System (GCS) components, one of the three types of tests (unit, integrated, and COTS/regression) utilized in software verification. Unit tests are used to test the software of necessary components before it is implemented into the hardware. A unit test determines that the control data, usage procedures, and operating procedures of a particular component are tested to determine if the program is fit for use. Three different files are used to make and complete an efficient unit test. These files include the following: Model Test file (.mdl), Simulink SystemTest (.test), and autotest (.m). The Model Test file includes the component that is being tested with the appropriate Discrete Physical Interface (DPI) for testing. The Simulink SystemTest is a program used to test all of the requirements of the component. The autotest tests that the component passes Model Advisor and System Testing, and puts the results into proper files. Once unit testing is completed on the GCS components they can then be implemented into the GCS Schematic and the software of the GCS model as a whole can be tested using integrated testing. Unit testing is a critical part of software verification; it allows for the testing of more basic components before a model of higher fidelity is tested, making the process of testing flow in an orderly manner.

  20. Capacity and reliability analyses with applications to power quality

    NASA Astrophysics Data System (ADS)

    Azam, Mohammad; Tu, Fang; Shlapak, Yuri; Kirubarajan, Thiagalingam; Pattipati, Krishna R.; Karanam, Rajaiah

    2001-07-01

    The deregulation of energy markets, the ongoing advances in communication networks, the proliferation of intelligent metering and protective power devices, and the standardization of software/hardware interfaces are creating a dramatic shift in the way facilities acquire and utilize information about their power usage. Currently available power management systems gather a vast amount of information in the form of power usage, voltages, currents, and their time-dependent waveforms from a variety of devices (for example, circuit breakers, transformers, energy and power quality meters, protective relays, programmable logic controllers, motor control centers). What is lacking is an information processing and decision support infrastructure to harness this voluminous information into usable operational and management knowledge, so that facilities can manage equipment health and power quality, minimize downtime and outages, and optimize operations to improve productivity. This paper considers capacity and reliability analyses of power systems with very high availability requirements (e.g., systems providing energy to data centers and communication networks with desired availability of up to 0.9999999). The real-time capacity and margin analysis helps operators plan for additional loads and schedule repair/replacement activities. The reliability analysis, based on a computationally efficient sum of disjoint products, enables analysts to decide on optimum levels of redundancy, aids operators in prioritizing maintenance options for a given budget, and supports monitoring the system for capacity margin. The resulting analytical and software tool is demonstrated on a sample data center.
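    As a toy illustration of the redundancy arithmetic behind such availability targets: independent components of ordinary availability reach "seven nines" (0.9999999) only through redundancy. A k-out-of-n sketch (the paper's sum-of-disjoint-products method additionally handles non-identical components and shared network structure, which this does not):

```python
from math import comb

def k_out_of_n_availability(a, n, k):
    """Availability of a k-out-of-n block of independent components, each
    with availability `a`: binomial sum over subsets with >= k working."""
    return sum(comb(n, i) * a**i * (1 - a)**(n - i) for i in range(k, n + 1))

# A single 0.999 feed misses 0.9999999 by orders of magnitude;
# 1-out-of-3 redundancy exceeds it (unavailability 0.001^3 = 1e-9)
single = k_out_of_n_availability(0.999, 1, 1)
redundant = k_out_of_n_availability(0.999, 3, 1)
```

    The same arithmetic underlies the trade-off the paper optimizes: each added redundant path multiplies the unavailability by roughly the per-component failure probability.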

  1. Hazardous Environment Robotics

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Jet Propulsion Laboratory (JPL) developed video overlay calibration and demonstration techniques for ground-based telerobotics. Through a technology sharing agreement with JPL, Deneb Robotics added this as an option to its robotics software, TELEGRIP. The software is used for remotely operating robots in nuclear and hazardous environments in industries including automotive and medical. The option allows the operator to utilize video to calibrate 3-D computer models with the actual environment, and thus plan and optimize robot trajectories before the program is automatically generated.

  2. Agreement Between Face-to-Face and Free Software Video Analysis for Assessing Hamstring Flexibility in Adolescents.

    PubMed

    Moral-Muñoz, José A; Esteban-Moreno, Bernabé; Arroyo-Morales, Manuel; Cobo, Manuel J; Herrera-Viedma, Enrique

    2015-09-01

    The objective of this study was to determine the level of agreement between face-to-face hamstring flexibility measurements and free software video analysis in adolescents. Reduced hamstring flexibility is common in adolescents (75% of boys and 35% of girls aged 10). The length of the hamstring muscle has an important role in both the effectiveness and the efficiency of basic human movements, and reduced hamstring flexibility is related to various musculoskeletal conditions. There are various approaches to measuring hamstring flexibility with high reliability; the most commonly used approaches in the scientific literature are the sit-and-reach test, hip joint angle (HJA), and active knee extension. The assessment of hamstring flexibility using video analysis could help with adolescent flexibility follow-up. Fifty-four adolescents from a local school participated in a descriptive study of repeated measures using a crossover design. Active knee extension and HJA were measured with an inclinometer and were simultaneously recorded with a video camera. Each video was downloaded to a computer and subsequently analyzed using Kinovea 0.8.15, a free software application for movement analysis. All outcome measures showed reliability estimates with α > 0.90. The lowest reliability was obtained for HJA (α = 0.91). The preliminary findings support the use of a free software tool for assessing hamstring flexibility, offering health professionals a useful tool for adolescent flexibility follow-up.

  3. AnClim and ProClimDB software for data quality control and homogenization of time series

    NASA Astrophysics Data System (ADS)

    Stepanek, Petr

    2015-04-01

    During the last decade, a software package consisting of AnClim, ProClimDB and LoadData for processing (mainly climatological) data has been created. This software offers a complete solution for processing climatological time series, starting from loading the data from a central database (e.g. Oracle, via the LoadData software), through data quality control and homogenization, to time series analysis, extreme value evaluation, and RCM output verification and correction (the ProClimDB and AnClim software). The detection of inhomogeneities is carried out on a monthly scale through the application of AnClim, or newly by R functions called from ProClimDB, while quality control, the preparation of reference series, and the correction of detected breaks are carried out by the ProClimDB software. The software combines many statistical tests, types of reference series, and time scales (monthly, seasonal and annual, daily and sub-daily ones). These can be used to create an "ensemble" of solutions, which may be more reliable than any single method. The AnClim software is suitable for educational purposes, e.g. for students getting acquainted with methods used in climatology; built-in graphical tools and the comparison of various statistical tests help in better understanding a given method. ProClimDB is, by contrast, a tool aimed at processing large climatological datasets. Recently, functions from R may be used within the software, making it more efficient in data processing and capable of easily including new methods (when available in R). An example of usage is the easy comparison of methods for correcting inhomogeneities in daily data (HOM of Paul Della-Marta, the SPLIDHOM method of Olivier Mestre, DAP - our own method, QM of Xiaolan Wang, and others). The software is available, together with further information, at www.climahom.eu . Acknowledgement: this work was partially funded by the project "Building up a multidisciplinary scientific team focused on drought" No. CZ.1.07/2.3.00/20.0248.
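    Among the many tests such packages combine, the standard normal homogeneity test (SNHT) is a typical single-break detector. A minimal sketch (real implementations such as AnClim's additionally use reference series and tabulated critical values, which are omitted here):

```python
import numpy as np

def snht(series):
    """Single-break SNHT statistic: standardize the series, then for every
    split point k evaluate T(k) = k*mean(z[:k])^2 + (n-k)*mean(z[k:])^2.
    A large maximum T suggests a mean shift at the argmax position."""
    z = (series - series.mean()) / series.std()
    n = len(z)
    t = np.array([k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
                  for k in range(1, n)])
    return float(t.max()), int(t.argmax()) + 1  # statistic, break position

# A series with an artificial one-unit shift halfway through
stat, pos = snht(np.r_[np.zeros(50), np.ones(50)])
```

    Running several such tests with different reference series, as the package does, yields the "ensemble" of candidate breaks mentioned above.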

  4. S-Cube: Enabling the Next Generation of Software Services

    NASA Astrophysics Data System (ADS)

    Metzger, Andreas; Pohl, Klaus

    The Service Oriented Architecture (SOA) paradigm is increasingly adopted by industry for building distributed software systems. However, when designing, developing and operating innovative software services and service-based systems, several challenges exist. These challenges include how to manage the complexity of such systems; how to establish, monitor and enforce Quality of Service (QoS) and Service Level Agreements (SLAs); and how to build those systems so that they can proactively adapt to dynamically changing requirements and context conditions. Developing foundational solutions for these challenges requires the joint efforts of different research communities such as Business Process Management, Grid Computing, Service Oriented Computing and Software Engineering. This paper provides an overview of S-Cube, the European Network of Excellence on Software Services and Systems. S-Cube brings together researchers from leading research institutions across Europe, who join their competences to develop foundations, theories, methods and tools for future service-based systems.

  5. Qualitative study of factors associated with antimicrobial usage in seven small animal veterinary practices in the UK.

    PubMed

    Mateus, Ana L P; Brodbelt, David C; Barber, Nick; Stärk, Katharina D C

    2014-11-01

    Responsible use of antimicrobials by veterinarians is essential to contain antimicrobial resistance in pathogens relevant to public health. Inappropriate antimicrobial use has previously been described in practice. However, there is scarce information on factors influencing antimicrobial usage in dogs and cats. We investigated intrinsic and extrinsic factors influencing decision-making on antimicrobial usage in first opinion small animal practices in the UK through the application of qualitative research methods. Semi-structured interviews were conducted with 21 veterinarians from seven veterinary first opinion practices in the UK in 2010. Topics investigated included: a) criteria used for selection of antimicrobials, b) influences by colleagues, c) influences by clients, d) pet characteristics, e) sources of knowledge, f) awareness of guidelines and g) protocols implemented in practice that may affect antimicrobial usage by veterinarians. Hypothetical scenarios selected to assess appropriateness of antimicrobial usage were: a) vomiting in a Yorkshire Terrier due to dietary indiscretion, b) deep pyoderma in a Shar-Pei, c) Feline Lower Urinary Tract Disease in a 7-year-old male neutered cat and d) neutering of a 6-month-old dog. Interviews were recorded and transcribed by the interviewer. Thematic analysis was used to analyse the content of transcribed interviews. Data management and analysis were conducted with the qualitative analysis software NVivo8 (QSR International Pty Ltd). Antimicrobial usage by participants was influenced by factors other than clinical evidence and scientific knowledge. Intrinsic factors included the veterinarian's preference for particular substances and previous experience. Extrinsic factors influencing antimicrobial selection were: perceived efficacy, ease of administration of formulations, perceived compliance, willingness and ability of pet owners to treat, and animal characteristics. 
Cost of therapy was perceived as an influential factor only in low- and mixed-socioeconomic areas. Veterinarians had limited awareness of current recommendations for responsible use in small animal practice. Social norms, particularly verbally agreed protocols, influenced veterinarians. Inappropriate antimicrobial usage was identified in the therapy of non-infectious diseases and in prophylaxis for routine clean surgical procedures. Discussion of clinical cases with peers and effectiveness meetings in the workplace helped veterinarians share scientific knowledge. Effectiveness meetings can be a common ground for veterinarians to discuss and agree protocols for clinical conditions and surgical procedures. Protocols should be evidence-based, follow current recommendations and take into account the resources available in the workplace. Targeted training of veterinarians in the workplace with peer support should be used to promote responsible antimicrobial usage. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. CernVM WebAPI - Controlling Virtual Machines from the Web

    NASA Astrophysics Data System (ADS)

    Charalampidis, I.; Berzano, D.; Blomer, J.; Buncic, P.; Ganis, G.; Meusel, R.; Segal, B.

    2015-12-01

    Lately, there is a trend in scientific projects to look for computing resources in the volunteer community. In addition, to reduce the development effort required to port the scientific software stack to all known platforms, the use of Virtual Machines (VMs) is becoming increasingly popular. Unfortunately, their use further complicates software installation and operation, restricting the volunteer audience to sufficiently expert people. CernVM WebAPI is a software solution addressing this specific case in a way that opens up new application opportunities. It offers a very simple API for setting up, controlling and interfacing with a VM instance on the user's computer, while at the same time relieving the user of the burden of downloading, installing and configuring the hypervisor. WebAPI comes with a lightweight JavaScript library that guides the user through the application installation process. Malicious usage is prevented by a per-domain PKI validation mechanism. In this contribution we overview this new technology, discuss its security features and examine some test cases where it is already in use.

  7. Production Maintenance Infrastructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jason Gabler, David Skinner

    2005-11-01

    PMI is an XML framework for formulating tests of software and software environments that operate in a relatively push-button manner, i.e., can be automated, and that provide results that are readily consumable/publishable via RSS. Insofar as possible, the tests are carried out in a manner congruent with real usage. PMI drives shell scripts via a Perl program that is in charge of timing, validating each test, and controlling the flow through sets of tests. Testing in PMI is built up hierarchically: a suite of tests may start by testing basic functionalities (file system is writable, compiler is found and functions, shell environment behaves as expected, etc.) and work up to larger, more complicated activities (execution of parallel code, file transfers, etc.). At each step in this hierarchy, a failure leads to generation of a text message or RSS item that can be tagged as to who should be notified of the failure. PMI has been directed at two functionalities: 1) regular and automated testing of multi-user environments and 2) version-wise testing of new software releases prior to their deployment in production.
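    The hierarchical, fail-fast flow described above can be sketched as follows; the suite contents, commands and notification targets are hypothetical stand-ins (the real PMI is XML-configured and driven by a Perl program).

```python
import subprocess
import time

# A suite is an ordered list of (name, shell command, who-to-notify);
# basic checks come first, expensive ones later, mirroring PMI's hierarchy.
SUITE = [
    ("filesystem writable", "touch /tmp/pmi_probe && rm /tmp/pmi_probe", "admins"),
    ("shell environment",   'test -n "$HOME"',                           "admins"),
    ("parallel code",       "echo mpirun would run here",                "users"),
]

def run_suite(suite):
    """Run each test in order, timing and validating it; stop at the
    first failure and report the responsible contact."""
    results = []
    for name, cmd, notify in suite:
        start = time.time()
        ok = subprocess.run(cmd, shell=True, capture_output=True).returncode == 0
        results.append((name, ok, round(time.time() - start, 3)))
        if not ok:                  # failure: abort the hierarchy, flag contact
            print(f"FAIL {name}: notify {notify}")
            break
    return results

for name, ok, secs in run_suite(SUITE):
    print(f"{'PASS' if ok else 'FAIL'} {name} ({secs}s)")
```

    The per-test timing and pass/fail records are exactly the kind of output PMI serializes into consumable RSS items.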

  8. High-Performance Mixed Models Based Genome-Wide Association Analysis with omicABEL software

    PubMed Central

    Fabregat-Traver, Diego; Sharapov, Sodbo Zh.; Hayward, Caroline; Rudan, Igor; Campbell, Harry; Aulchenko, Yurii; Bientinesi, Paolo

    2014-01-01

    To raise the power of genome-wide association studies (GWAS) and avoid false-positive results in structured populations, one can rely on mixed model based tests. When large samples are used, and when multiple traits are to be studied in the ’omics’ context, this approach becomes computationally challenging. Here we consider the problem of mixed-model based GWAS for an arbitrary number of traits, and demonstrate that different computational algorithms are optimal for the single-trait and multiple-trait scenarios. We implement these optimal algorithms in a high-performance computing framework that uses state-of-the-art linear algebra kernels, incorporates optimizations, and avoids redundant computations, increasing throughput while reducing memory usage and energy consumption. We show that, compared to existing libraries, our algorithms and software achieve considerable speed-ups. The OmicABEL software described in this manuscript is available under the GNU GPL v. 3 license as part of the GenABEL project for statistical genomics at http://www.genabel.org/packages/OmicABEL. PMID:25717363
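    The computational core of mixed-model GWAS, reusing one covariance factorization across all SNPs, can be sketched on synthetic data; this is a generic generalized-least-squares illustration with the variance components assumed known, not the omicABEL algorithms themselves.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 200, 50                                    # individuals, SNPs
G = rng.binomial(2, 0.3, (n, m)).astype(float)    # genotype matrix (0/1/2)
K = G @ G.T / m                                   # toy relatedness (kinship) matrix
V = 0.5 * K + 0.5 * np.eye(n)                     # assumed-known covariance of y
L = np.linalg.cholesky(V)                         # factorize once, reuse for all SNPs

beta_true = np.zeros(m)
beta_true[7] = 0.8                                # one causal SNP
y = G @ beta_true + L @ rng.normal(size=n)        # phenotype with correlated residuals

# Whiten once with the Cholesky factor, then run plain per-SNP OLS on the
# centered data -- the reusable-factorization structure that high-performance
# mixed-model GWAS codes exploit to avoid redundant computation.
Gc = np.linalg.solve(L, G - G.mean(0))
yc = np.linalg.solve(L, y - y.mean())
beta = (Gc * yc[:, None]).sum(0) / (Gc ** 2).sum(0)
print(int(np.abs(beta).argmax()))                 # index of strongest association
```

    The point of the sketch is the ordering: the expensive factorization is done once, after which each SNP (or trait) costs only cheap vector operations.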

  9. High-Performance Mixed Models Based Genome-Wide Association Analysis with omicABEL software.

    PubMed

    Fabregat-Traver, Diego; Sharapov, Sodbo Zh; Hayward, Caroline; Rudan, Igor; Campbell, Harry; Aulchenko, Yurii; Bientinesi, Paolo

    2014-01-01

    To raise the power of genome-wide association studies (GWAS) and avoid false-positive results in structured populations, one can rely on mixed model based tests. When large samples are used, and when multiple traits are to be studied in the 'omics' context, this approach becomes computationally challenging. Here we consider the problem of mixed-model based GWAS for an arbitrary number of traits, and demonstrate that different computational algorithms are optimal for the single-trait and multiple-trait scenarios. We implement these optimal algorithms in a high-performance computing framework that uses state-of-the-art linear algebra kernels, incorporates optimizations, and avoids redundant computations, increasing throughput while reducing memory usage and energy consumption. We show that, compared to existing libraries, our algorithms and software achieve considerable speed-ups. The OmicABEL software described in this manuscript is available under the GNU GPL v. 3 license as part of the GenABEL project for statistical genomics at http://www.genabel.org/packages/OmicABEL.

  10. The BNM-LPTF software for the frequency comparison of atomic clocks by the carrier phase of the GPS signal.

    PubMed

    Taris, F; Uhrich, P; Petit, G; Jiang, Z; Barillet, R; Hamouda, F

    2000-01-01

    This paper describes the software and equipment used at the Laboratoire Primaire du Temps et des Frequences du Bureau National de Metrologie (BNM-LPTF), Paris, France. Two H-masers in short baseline, one located at the BNM-LPTF and the other at the Laboratoire de l'Horloge Atomique du Centre National de la Recherche Scientifique (CNRS-LHA), Orsay, France, were computed in parallel with the BNM-LPTF software and with the BERNESE V 4.1 software. The comparison of the results issued from both computations shows an agreement within 100 ps (1 sigma). In addition, comparisons with the BNM-LPTF software were made over 10 days with the H-masers located at the Physikalisch-Technische Bundesanstalt (PTB), Braunschweig, Germany, and another at the National Physical Laboratory (NPL), Teddington, United Kingdom. The data collected show that a modulation with an amplitude of 50 ps and a period of 700-800 ps affects the equipment of the NPL. In addition, these comparisons show that the noise of the instruments together with the environmental conditions at the PTB was higher than that of the NPL and the BNM-LPTF during the observation period. The best relative frequency stability obtained, in the BNM-LPTF/NPL comparison, is about 3×10^-15 for averaging periods between 6×10^4 s and 3×10^5 s. This result is in good agreement with the expected stability of H-masers. It demonstrates that the noise brought by the GPS carrier phase measurements can be averaged out at this level.
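    The frequency stabilities quoted above are Allan deviations over the stated averaging periods; a minimal sketch of an overlapping Allan deviation estimator on simulated white frequency noise (synthetic data, not the BNM-LPTF measurements).

```python
import numpy as np

def adev(y, m):
    """Overlapping Allan deviation of fractional-frequency samples y
    at averaging factor m (averaging time tau = m * tau0)."""
    ybar = np.convolve(y, np.ones(m) / m, mode="valid")  # m-sample running means
    d = ybar[m:] - ybar[:-m]                             # adjacent-average differences
    return np.sqrt(0.5 * np.mean(d ** 2))

rng = np.random.default_rng(2)
y = 1e-13 * rng.normal(size=100_000)   # white frequency noise, sigma_y(tau0) = 1e-13
for m in (1, 10, 100):
    print(m, adev(y, m))               # expect roughly 1e-13 / sqrt(m)
```

    For white frequency noise the estimate falls as 1/sqrt(tau), which is the "averaged out" behaviour the abstract refers to for the GPS carrier phase measurement noise.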

  11. The social disutility of software ownership.

    PubMed

    Douglas, David M

    2011-09-01

    Software ownership allows the owner to restrict the distribution of software and to prevent others from reading the software's source code and building upon it. However, free software is released to users under software licenses that give them the right to read the source code, modify it, reuse it, and distribute the software to others. Proponents of free software such as Richard M. Stallman and Eben Moglen argue that the social disutility of software ownership is a sufficient justification for prohibiting it. This social disutility includes the social instability of disregarding laws and agreements covering software use and distribution, inequality of software access, and the inability to help others by sharing software with them. Here I consider these and other social disutility claims against withholding specific software rights from users, in particular, the rights to read the source code, duplicate, distribute, modify, imitate, and reuse portions of the software within new programs. I find that generally while withholding these rights from software users does cause some degree of social disutility, only the rights to duplicate, modify and imitate cannot legitimately be denied to users on this basis. The social disutility of withholding the rights to distribute the software, read its source code and reuse portions of it in new programs is insufficient to prohibit software owners from denying them to users. A compromise between the software owner and user can minimise the social disutility of withholding these particular rights from users. However, the social disutility caused by software patents is sufficient for rejecting such patents as they restrict the methods of reducing social disutility possible with other forms of software ownership.

  12. Modelling of diesel engine fuelled with biodiesel using engine simulation software

    NASA Astrophysics Data System (ADS)

    Said, Mohd Farid Muhamad; Said, Mazlan; Aziz, Azhar Abdul

    2012-06-01

    This paper presents a model of a diesel engine operating on biodiesel fuels. The model is used to simulate and predict the performance and combustion of the engine by simplifying the geometry of engine components in the software. The model is built using the one-dimensional (1D) engine simulation software GT-Power. The fuel properties library in the software is extended to include palm-oil-based biodiesel fuels. Experimental work was performed to investigate the effect of biodiesel fuels on the heat release profiles and the engine performance curves. The model is validated against experimental data and good agreement is observed. The simulation results show that combustion characteristics and engine performance differ when biodiesel fuels are used instead of No. 2 diesel fuel.
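    In 1D engine codes of this kind, the heat release profile is commonly parameterized with a Wiebe function; the burn parameters below are generic illustrative values, not the paper's fitted biodiesel data, and the capped form is a simplification.

```python
import math

def wiebe(theta, theta0, dtheta, a=5.0, m=2.0):
    """Cumulative mass fraction burned versus crank angle theta (deg),
    for combustion starting at theta0 with duration dtheta; a and m
    are the usual Wiebe efficiency and shape parameters."""
    if theta < theta0:
        return 0.0
    x = min((theta - theta0) / dtheta, 1.0)      # normalized burn progress
    return 1.0 - math.exp(-a * x ** (m + 1))

# Illustrative case: burn starts at -5 deg ATDC, 50 deg combustion duration
for theta in (-10, 0, 20, 45):
    print(theta, round(wiebe(theta, -5.0, 50.0), 3))
```

    Fuel-dependent combustion differences, such as those observed between biodiesel and No. 2 diesel, show up as shifts in the start-of-combustion, duration and shape parameters of such a profile.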

  13. Structural Analysis and Design Software

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Collier Research and Development Corporation received a one-of-a-kind computer code for designing exotic hypersonic aircraft, called ST-SIZE, in the first-ever Langley Research Center software copyright license agreement. Collier transformed the NASA computer code into a commercial software package called HyperSizer, which integrates with private-sector Finite Element Modeling and Finite Element Analysis structural analysis programs. ST-SIZE was chiefly conceived as a means to improve and speed up the structural design of a future aerospace plane for the Langley Hypersonic Vehicles Office. Incorporating the NASA computer code into HyperSizer has enabled the company to apply the software to applications beyond aerospace, including improved design and construction for offices, marine structures, cargo containers, commercial and military aircraft, rail cars, and a host of everyday consumer products.

  14. Unexpected Control Structure Interaction on International Space Station

    NASA Technical Reports Server (NTRS)

    Gomez, Susan F.; Platonov, Valery; Medina, Elizabeth A.; Borisenko, Alexander; Bogachev, Alexey

    2017-01-01

    On June 23, 2011, the International Space Station (ISS) was performing a routine 180 degree yaw maneuver in support of a Russian vehicle docking when the on-board Russian Segment (RS) software unexpectedly declared two attitude thrusters failed and switched thruster configurations in response to unanticipated ISS dynamic motion. Flight data analysis after the maneuver indicated that higher than predicted structural loads had been induced at various locations on the United States (U.S.) segment of the ISS. Further analysis revealed that the attitude control system was firing thrusters in response to both structural flex and rigid body rates, which resonated the structure and caused high loads and fatigue cycles. It was later determined that the thrusters themselves were healthy. The RS software logic, which was intended to react to thruster failures, had instead been heavily influenced by interaction between the control system and structural flex. This paper will discuss the technical aspects of the control structure interaction problem that led to the RS control system firing thrusters in response to structural flex, the factors that led to insufficient preflight analysis of the thruster firings, and the ramifications the event had on the ISS. An immediate consequence included limiting which thrusters could be used for attitude control. This complicated the planning of on-orbit thruster events and necessitated the use of suboptimal thruster configurations that increased propellant usage and caused thruster lifetime usage concerns. In addition to the technical aspects of the problem, the team dynamics and communication shortcomings that led to such an event happening in an environment where extensive analysis is performed in support of human space flight will also be examined. Finally, the technical solution will be presented, which required a multidisciplinary effort between the U.S. 
and Russian control system engineers and loads and dynamics structural engineers to develop and implement an extensive modification in the RS software logic for ISS attitude control thruster firings.

  15. Towards Dynamic Service Level Agreement Negotiation:An Approach Based on WS-Agreement

    NASA Astrophysics Data System (ADS)

    Pichot, Antoine; Wäldrich, Oliver; Ziegler, Wolfgang; Wieder, Philipp

    In Grid, e-Science and e-Business environments, Service Level Agreements (SLAs) are often used to establish frameworks for the delivery of services between service providers and the organisations hosting the researchers. While these high-level SLAs define the overall quality of the services, it is desirable for the end-user to have dedicated service quality for individual services as well, such as the orchestration of resources necessary for composed services. Grid-level scheduling services are typically responsible for the orchestration and co-ordination of resources in the Grid. Co-allocation, for example, requires the Grid-level scheduler to co-ordinate resource management systems located in different domains. As site autonomy has to be respected, negotiation is the only way to achieve the intended co-ordination. SLAs have emerged as a new way to negotiate and manage the usage of resources in the Grid and are already adopted by a number of management systems. Therefore, it is natural to look for ways to adopt SLAs for Grid-level scheduling. To do so, efficient and flexible protocols are needed which support dynamic negotiation and creation of SLAs. In this paper we propose and discuss extensions to the WS-Agreement protocol addressing these issues.
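    The offer/counter-offer dynamic behind such SLA negotiation can be sketched in miniature; the `Offer` fields, pricing rule and slot model below are hypothetical simplifications for illustration, not the WS-Agreement schema or the paper's extensions.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Offer:
    """Toy SLA offer: resource amount, requested start slot, offered price."""
    cpus: int
    start: int
    price: float

def counter(offer, free_slots, unit_price=10.0):
    """Provider side of one negotiation round: accept a feasible,
    adequately priced offer, otherwise reply with a counter-offer
    (nearest free slot, floor price) instead of a plain reject."""
    floor = unit_price * offer.cpus
    if offer.start in free_slots and offer.price >= floor:
        return offer                                  # agreement can be created
    slot = min(free_slots, key=lambda s: abs(s - offer.start))
    return replace(offer, start=slot, price=max(offer.price, floor))

request = Offer(cpus=4, start=8, price=30.0)
reply = counter(request, free_slots={9, 12})
print(reply)   # counter-offer with shifted start and adjusted price
```

    Because the provider returns a concrete alternative rather than a bare rejection, the requester (here, a Grid-level scheduler) can converge on a co-allocation without ever seeing the provider's internal schedule, which is the point of negotiating across autonomous sites.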

  16. Comparison of the manual, semiautomatic, and automatic selection and leveling of hot spots in whole slide images for Ki-67 quantification in meningiomas.

    PubMed

    Swiderska, Zaneta; Korzynska, Anna; Markiewicz, Tomasz; Lorent, Malgorzata; Zak, Jakub; Wesolowska, Anna; Roszkowiak, Lukasz; Slodkowska, Janina; Grala, Bartlomiej

    2015-01-01

    Background. This paper presents a study of hot-spot selection in the assessment of whole slide images of tissue sections collected from meningioma patients. The samples were immunohistochemically stained to determine the Ki-67/MIB-1 proliferation index used for prognosis and treatment planning. Objective. Observer performance was examined by comparing results of the proposed method of automatic hot-spot selection in whole slide images, results of traditional scoring under a microscope, and results of a pathologist's manual hot-spot selection. Methods. Results of Ki-67 index scoring obtained by optical scoring under a microscope, by software-based quantification on hot spots selected by two pathologists (respectively once and three times), and by the same software on hot spots selected by the proposed automatic method were compared using Kendall's tau-b statistics. Results. The results show intra- and interobserver agreement. The agreement between Ki-67 scoring with manual and automatic hot-spot selection is high, while agreement between Ki-67 index scoring results in whole slide images and traditional microscopic examination is lower. Conclusions. The agreement observed for the three scoring methods shows that automation of area selection is an effective tool in supporting physicians and in increasing the reliability of Ki-67 scoring in meningioma.
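    Kendall's tau-b, used above to quantify agreement between scoring methods on tied ordinal data, can be sketched directly from its definition; the two score lists are hypothetical, not the study's data.

```python
from itertools import combinations

def kendall_tau_b(x, y):
    """Kendall's tau-b: pairwise concordance with a correction for ties,
    which plain tau-a lacks."""
    c = d = tx = ty = 0
    for (x1, y1), (x2, y2) in combinations(zip(x, y), 2):
        if x1 == x2 and y1 == y2:
            continue                    # tied in both: excluded by tau-b
        elif x1 == x2:
            tx += 1                     # tie in x only
        elif y1 == y2:
            ty += 1                     # tie in y only
        elif (x1 - x2) * (y1 - y2) > 0:
            c += 1                      # concordant pair
        else:
            d += 1                      # discordant pair
    return (c - d) / (((c + d + tx) * (c + d + ty)) ** 0.5)

microscope = [1, 2, 2, 3, 5, 7, 8, 12]   # hypothetical Ki-67 scores, method A
automatic  = [1, 2, 3, 3, 4, 8, 7, 11]   # hypothetical Ki-67 scores, method B
print(round(kendall_tau_b(microscope, automatic), 3))
```

    The tie correction matters here because integer-valued Ki-67 indices from different observers frequently coincide.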

  17. Comparison of the Manual, Semiautomatic, and Automatic Selection and Leveling of Hot Spots in Whole Slide Images for Ki-67 Quantification in Meningiomas

    PubMed Central

    Swiderska, Zaneta; Korzynska, Anna; Markiewicz, Tomasz; Lorent, Malgorzata; Zak, Jakub; Wesolowska, Anna; Roszkowiak, Lukasz; Slodkowska, Janina; Grala, Bartlomiej

    2015-01-01

    Background. This paper presents a study of hot-spot selection in the assessment of whole slide images of tissue sections collected from meningioma patients. The samples were immunohistochemically stained to determine the Ki-67/MIB-1 proliferation index used for prognosis and treatment planning. Objective. Observer performance was examined by comparing results of the proposed method of automatic hot-spot selection in whole slide images, results of traditional scoring under a microscope, and results of a pathologist's manual hot-spot selection. Methods. Results of Ki-67 index scoring obtained by optical scoring under a microscope, by software-based quantification on hot spots selected by two pathologists (respectively once and three times), and by the same software on hot spots selected by the proposed automatic method were compared using Kendall's tau-b statistics. Results. The results show intra- and interobserver agreement. The agreement between Ki-67 scoring with manual and automatic hot-spot selection is high, while agreement between Ki-67 index scoring results in whole slide images and traditional microscopic examination is lower. Conclusions. The agreement observed for the three scoring methods shows that automation of area selection is an effective tool in supporting physicians and in increasing the reliability of Ki-67 scoring in meningioma. PMID:26240787

  18. Climate Science Performance, Data and Productivity on Titan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayer, Benjamin W; Worley, Patrick H; Gaddis, Abigail L

    2015-01-01

    Climate science models are flagship codes for the largest high-performance computing (HPC) resources, both in visibility, with the newly launched Department of Energy (DOE) Accelerated Climate Model for Energy (ACME) effort, and in terms of significant fractions of system usage. The performance of the DOE ACME model is captured with application-level timers and examined through a sizeable run archive. Performance and variability of compute time, queue time and ancillary services are examined. As climate science advances in its use of HPC resources, the human and data systems required to achieve program goals have grown. A description of current workflow processes (hardware, software, human) and planned automation of the workflow, along with historical and projected data-in-motion and data-at-rest usage, is detailed. The combination of these two topics motivates a description of future system requirements for DOE climate modeling efforts, focusing on the growth of data storage and the network and disk bandwidth required to handle data at an acceptable rate.

  19. Modeling relationships between various domains of hearing aid provision.

    PubMed

    Meister, Hartmut; Lausberg, Isabel; Kiessling, Jürgen; von Wedel, Hasso; Walger, Martin

    2003-01-01

    Various inventories have been developed to quantify the success of hearing aid provision. Though numerous parameters including initial measures (hearing disability, handicap) or 'outcome measures' (e.g. benefit, satisfaction and usage) are recorded, relationships and interactions among them are still unclear. A study applying a questionnaire addressing 11 domains relevant to amplification was conducted in order to generate different psychometric models with the AMOS software package for structural equation modeling. The models expose easily interpretable interactions and are helpful in understanding effects occurring with commonly used outcome measures: benefit reflects the difference between the aided and unaided condition but additionally comprises the importance of the hearing aid within a specific situation. Satisfaction is highly reliant on benefit. Usage is strongly dependent on the severity of hearing problems and therefore not appropriate in assessing the success of amplification. Moreover, the models help to predict the outcome of clinically used inventories (i.e. the Glasgow Hearing Aid Benefit Profile). Copyright 2003 S. Karger AG, Basel

  20. Analysis on Operating Parameter Design to Steam Methane Reforming in Heat Application RDE

    NASA Astrophysics Data System (ADS)

    Dibyo, Sukmanto; Sunaryo, Geni Rina; Bakhri, Syaiful; Zuhair; Irianto, Ign. Djoko

    2018-02-01

    High temperature reactors have been developed with various power capacities and can produce both electricity and process heat. One such heat application is hydrogen production. Most hydrogen production occurs by steam reforming operated at high temperature. This study analyzes the feasibility of the heat application design of the RDE reactor for steam methane reforming for hydrogen production using the ChemCAD software. The outlet temperature of the cogeneration heat exchanger is analyzed for use as feed to the steam reformer, and the additional heater and the amount of fuel it consumes are described. Results show that at a low feed mass flow rate, the heat exchanger can produce a temperature of up to 480°C. To achieve the steam methane reforming temperature of 850°C, an additional fired heater is required; its fuel consumption depends on the reformer feed temperature delivered by the heat exchanger of the cogeneration system.
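    The fired-heater sizing described above amounts to a simple enthalpy balance on the feed between the cogeneration outlet (~480°C) and the reformer temperature (850°C); the mass flow, heat capacity, heater efficiency and fuel heating value below are illustrative assumptions, not values from the paper.

```python
# Enthalpy balance for the fired heater (illustrative numbers throughout).
m_dot = 0.5       # feed mass flow rate, kg/s (assumed)
cp    = 2.8e3     # mean heat capacity of the steam/methane feed, J/(kg K) (assumed)
t_in  = 480.0     # deg C, feed temperature from the cogeneration heat exchanger
t_out = 850.0     # deg C, required steam methane reforming temperature
lhv   = 50.0e6    # lower heating value of methane fuel, J/kg (typical)
eta   = 0.85      # fired-heater thermal efficiency (assumed)

q_duty = m_dot * cp * (t_out - t_in)    # heat to add to the feed, W
fuel   = q_duty / (eta * lhv)           # fuel burn rate, kg/s
print(f"duty = {q_duty / 1e6:.2f} MW, fuel = {fuel * 3600:.1f} kg/h")
```

    The same balance shows why fuel usage falls as the cogeneration outlet temperature rises: every degree recovered from the reactor is a degree the fired heater does not have to supply.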
