Sample records for clinical LIMS software

  1. adLIMS: a customized open source software that allows bridging clinical and basic molecular research studies.

    PubMed

    Calabria, Andrea; Spinozzi, Giulio; Benedicenti, Fabrizio; Tenderini, Erika; Montini, Eugenio

    2015-01-01

    Many biological laboratories that deal with genomic samples are facing the problem of sample tracking, both for pure laboratory management and for efficiency. Our laboratory exploits PCR techniques and Next Generation Sequencing (NGS) methods to perform high-throughput integration site monitoring in different clinical trials and scientific projects. Because of the huge number of samples that we process every year, which result in hundreds of millions of sequencing reads, we need to standardize data management and tracking systems, building up a scalable and flexible structure with web-based interfaces, usually called a Laboratory Information Management System (LIMS). We started by collecting end-users' requirements, composed of desired functionalities of the system and Graphical User Interfaces (GUI), and then we evaluated available tools that could address our requirements, spanning from pure LIMS to Content Management Systems (CMS) up to enterprise information systems. Our analysis identified ADempiere ERP, an open source Enterprise Resource Planning system written in Java J2EE, as the best software; it also natively implements some highly desirable technological advances, such as high usability and modularity, which grant high use-case flexibility and software scalability for custom solutions. We extended and customized ADempiere ERP to fulfil LIMS requirements and developed adLIMS. It has been validated by our end-users, who verified functionalities and GUIs through test cases for PCR samples and pre-sequencing data, and it is currently in use in our laboratories. adLIMS implements authorization and authentication policies, allowing management of multiple users and definition of roles that grant specific permissions, operations and data views to each user. For example, adLIMS allows creating sample sheets from stored data using available export operations. This simplicity and process standardization may avoid manual errors and information backtracking, features
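
    The sample-sheet export mentioned above is essentially a query-plus-serialization step. As a minimal illustration only (the field names and CSV layout are assumptions, not the actual adLIMS data model), such an export could look like this in Python:

      import csv
      from dataclasses import dataclass

      @dataclass
      class Sample:
          # Hypothetical fields; the real adLIMS model is far richer.
          sample_id: str
          project: str
          barcode_index: str
          pcr_plate: str

      def write_sample_sheet(samples, path):
          """Serialize tracked samples into a CSV sample sheet for the sequencer."""
          with open(path, "w", newline="") as fh:
              writer = csv.writer(fh)
              writer.writerow(["SampleID", "Project", "Index", "Plate"])
              for s in samples:
                  writer.writerow([s.sample_id, s.project, s.barcode_index, s.pcr_plate])

      # Example usage with made-up records:
      write_sample_sheet(
          [Sample("S001", "TrialA", "ATCACG", "P1"), Sample("S002", "TrialA", "CGATGT", "P1")],
          "sample_sheet.csv",
      )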

  2. LIMS and Clinical Data Management.

    PubMed

    Chen, Yalan; Lin, Yuxin; Yuan, Xuye; Shen, Bairong

    2016-01-01

    In order to achieve more accurate disease prevention, diagnosis, and treatment, clinical and genetic data need extensive and systematically associated study. As one way to achieve precision medicine, a laboratory information management system (LIMS) can effectively associate clinical data in a macrocosmic aspect and genomic data in a microcosmic aspect. This chapter summarizes the application of the LIMS in a clinical data management and implementation mode. It also discusses the principles of a LIMS in clinical data management, as well as the opportunities and challenges in the context of medical informatics.

  3. MendeLIMS: a web-based laboratory information management system for clinical genome sequencing.

    PubMed

    Grimes, Susan M; Ji, Hanlee P

    2014-08-27

    Large clinical genomics studies using next generation DNA sequencing require the ability to select and track samples from a large population of patients through many experimental steps. With the number of clinical genome sequencing studies increasing, it is critical to maintain adequate laboratory information management systems to manage the thousands of patient samples that are subject to this type of genetic analysis. To meet the needs of clinical population studies using genome sequencing, we developed a web-based laboratory information management system (LIMS) with a flexible configuration that is adaptable to continuously evolving experimental protocols of next generation DNA sequencing technologies. Our system, referred to as MendeLIMS, is easily implemented with open source tools and is highly configurable and extensible. MendeLIMS has been invaluable in the management of our clinical genome sequencing studies. We maintain a publicly available demonstration version of the application for evaluation purposes at http://mendelims.stanford.edu. MendeLIMS is programmed in Ruby on Rails (RoR) and accesses data stored in SQL-compliant relational databases. Software is freely available for non-commercial use at http://dna-discovery.stanford.edu/software/mendelims/.
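
    MendeLIMS itself is written in Ruby on Rails, but the core pattern it describes, tracking patient samples through experimental steps in a SQL-compliant relational database, can be sketched in a technology-neutral way. The tables and columns below are illustrative assumptions, not the MendeLIMS schema:

      import sqlite3

      # Toy schema: patients, their samples, and the experimental steps
      # (library prep, capture, sequencing, ...) each sample has passed through.
      conn = sqlite3.connect(":memory:")
      conn.executescript("""
      CREATE TABLE patients (patient_id INTEGER PRIMARY KEY, study TEXT);
      CREATE TABLE samples  (sample_id  INTEGER PRIMARY KEY,
                             patient_id INTEGER REFERENCES patients(patient_id),
                             tissue TEXT);
      CREATE TABLE steps    (step_id    INTEGER PRIMARY KEY,
                             sample_id  INTEGER REFERENCES samples(sample_id),
                             protocol TEXT, performed_on TEXT);
      """)
      conn.execute("INSERT INTO patients VALUES (1, 'sequencing-cohort-A')")
      conn.execute("INSERT INTO samples  VALUES (10, 1, 'tumor')")
      conn.execute("INSERT INTO steps    VALUES (100, 10, 'library prep', '2014-01-15')")

      # Which protocols has each sample from patient 1 been through?
      query = """SELECT s.sample_id, st.protocol, st.performed_on
                 FROM samples s JOIN steps st ON st.sample_id = s.sample_id
                 WHERE s.patient_id = ?"""
      for row in conn.execute(query, (1,)):
          print(row)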

  4. Principles and application of LIMS in mouse clinics.

    PubMed

    Maier, Holger; Schütt, Christine; Steinkamp, Ralph; Hurt, Anja; Schneltzer, Elida; Gormanns, Philipp; Lengger, Christoph; Griffiths, Mark; Melvin, David; Agrawal, Neha; Alcantara, Rafael; Evans, Arthur; Gannon, David; Holroyd, Simon; Kipp, Christian; Raj, Navis Pretheeba; Richardson, David; LeBlanc, Sophie; Vasseur, Laurent; Masuya, Hiroshi; Kobayashi, Kimio; Suzuki, Tomohiro; Tanaka, Nobuhiko; Wakana, Shigeharu; Walling, Alison; Clary, David; Gallegos, Juan; Fuchs, Helmut; de Angelis, Martin Hrabě; Gailus-Durner, Valerie

    2015-10-01

    Large-scale systemic mouse phenotyping, as performed by mouse clinics for more than a decade, requires thousands of mice from a multitude of different mutant lines to be bred, individually tracked and subjected to phenotyping procedures according to a standardised schedule. All these efforts are typically organised in overlapping projects, running in parallel. In terms of logistics, data capture, data analysis, result visualisation and reporting, new challenges have emerged from such projects. These challenges could hardly be met with traditional methods such as pen & paper colony management, spreadsheet-based data management and manual data analysis. Hence, different Laboratory Information Management Systems (LIMS) have been developed in mouse clinics to facilitate or even enable mouse and data management in the described order of magnitude. This review shows that general principles of LIMS can be empirically deduced from LIMS used by different mouse clinics, although these have evolved differently. Supported by LIMS descriptions and lessons learned from seven mouse clinics, this review also shows that the unique LIMS environment in a particular facility strongly influences strategic LIMS decisions and LIMS development. As a major conclusion, this review states that there is no universal LIMS for the mouse research domain that fits all requirements. Still, empirically deduced general LIMS principles can serve as a master decision support template, which is provided as a hands-on tool for mouse research facilities looking for a LIMS.

  5. LimsPortal and BonsaiLIMS: development of a lab information management system for translational medicine.

    PubMed

    Bath, Timothy G; Bozdag, Selcuk; Afzal, Vackar; Crowther, Daniel

    2011-05-13

    Laboratory Information Management Systems (LIMS) are an increasingly important part of modern laboratory infrastructure. As typically very sophisticated software products, LIMS often require considerable resources to select, deploy and maintain. Larger organisations may have access to specialist IT support to assist with requirements elicitation and software customisation; however, smaller groups will often have limited IT support to perform the kind of iterative development that can resolve the difficulties that biologists often have when specifying requirements. Translational medicine aims to accelerate the process of treatment discovery by bringing together multiple disciplines to discover new approaches to treating disease, or novel applications of existing treatments. The diverse set of disciplines and the complexity of the processing procedures involved, especially with the use of high throughput technologies, bring difficulties in customizing a generic LIMS to provide a single system for managing sample related data within a translational medicine research setting, especially where limited IT support is available. We have designed and developed a LIMS, BonsaiLIMS, around a very simple data model that can be easily implemented using a variety of technologies, and can be easily extended as specific requirements dictate. A reference implementation using an Oracle 11g database and the Python framework Django is presented. By focusing on a minimal feature set and a modular design we have been able to deploy the BonsaiLIMS system very quickly. The benefits to our institute have been the avoidance of the prolonged implementation timescales, budget overruns, scope creep, off-specifications and user fatigue issues that typify many enterprise software implementations. The transition away from using local, uncontrolled records in spreadsheet and paper formats to a centrally held, secured and backed-up database brings the immediate benefits of improved data visibility, audit and
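
    The "very simple data model" philosophy described here can be pictured with a short, technology-neutral sketch; the paper's own reference implementation uses an Oracle database and Django, whereas the entity and field names below are assumptions made purely for illustration:

      from dataclasses import dataclass, field
      from typing import Dict, List

      @dataclass
      class Sample:
          # Minimal core fields plus an open-ended attribute dictionary so new
          # requirements do not force schema changes (illustrative only).
          sample_id: str
          subject_ref: str                      # pseudonymised donor reference
          sample_type: str                      # e.g. "plasma", "biopsy"
          attributes: Dict[str, str] = field(default_factory=dict)
          aliquots: List["Sample"] = field(default_factory=list)

      parent = Sample("TM-0001", "SUBJ-17", "plasma")
      parent.attributes["storage"] = "-80C freezer 2, box 4"
      parent.aliquots.append(Sample("TM-0001-A1", "SUBJ-17", "plasma"))
      print(parent)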

  6. LimsPortal and BonsaiLIMS: development of a lab information management system for translational medicine

    PubMed Central

    2011-01-01

    Background: Laboratory Information Management Systems (LIMS) are an increasingly important part of modern laboratory infrastructure. As typically very sophisticated software products, LIMS often require considerable resources to select, deploy and maintain. Larger organisations may have access to specialist IT support to assist with requirements elicitation and software customisation; however, smaller groups will often have limited IT support to perform the kind of iterative development that can resolve the difficulties that biologists often have when specifying requirements. Translational medicine aims to accelerate the process of treatment discovery by bringing together multiple disciplines to discover new approaches to treating disease, or novel applications of existing treatments. The diverse set of disciplines and the complexity of the processing procedures involved, especially with the use of high throughput technologies, bring difficulties in customizing a generic LIMS to provide a single system for managing sample related data within a translational medicine research setting, especially where limited IT support is available. Results: We have designed and developed a LIMS, BonsaiLIMS, around a very simple data model that can be easily implemented using a variety of technologies, and can be easily extended as specific requirements dictate. A reference implementation using an Oracle 11g database and the Python framework Django is presented. Conclusions: By focusing on a minimal feature set and a modular design we have been able to deploy the BonsaiLIMS system very quickly. The benefits to our institute have been the avoidance of the prolonged implementation timescales, budget overruns, scope creep, off-specifications and user fatigue issues that typify many enterprise software implementations. The transition away from using local, uncontrolled records in spreadsheet and paper formats to a centrally held, secured and backed-up database brings the immediate benefits of

  7. Improving the Plasticity of LIMS Implementation: LIMS Extension through Microsoft Excel

    NASA Technical Reports Server (NTRS)

    Culver, Mark

    2017-01-01

    A Laboratory Information Management System (LIMS) is database software with many built-in tools ideal for handling and documenting most laboratory processes in an accurate and consistent manner, making it an indispensable tool for the modern laboratory. However, many LIMS end users will find that in the performance of analyses that have unique considerations such as standard curves, multi-stage incubations, or logical considerations, a base LIMS distribution may not ideally suit their needs. These considerations bring about the need for extension languages, which can extend the functionality of a LIMS. While these languages do provide the implementation team with the functionality required to accommodate these special laboratory analyses, they are usually too complex for the end user to modify to compensate for natural changes in laboratory operations. The LIMS utilized by our laboratory offers a unique and easy-to-use choice for an extension language, one that is already heavily relied upon not only in science but also in most academic and business pursuits: Microsoft Excel. The validity of Microsoft Excel as a pseudo-programming language and its usability and versatility as a LIMS extension language will be discussed. The NELAC implications and overall drawbacks of this LIMS configuration will also be discussed.
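
    Standard curves are a good example of the analysis-specific logic the abstract refers to: a calibration series is fitted and unknowns are back-calculated from the fit. In a spreadsheet this is a SLOPE/INTERCEPT calculation; the same computation, sketched in Python with made-up calibration values, is:

      import numpy as np

      # Hypothetical calibration series: known concentrations vs. measured response.
      conc     = np.array([0.0, 5.0, 10.0, 20.0, 40.0])    # e.g. mg/L standards
      response = np.array([0.02, 0.31, 0.60, 1.18, 2.35])  # instrument signal

      # Ordinary least-squares line, equivalent to SLOPE()/INTERCEPT() in Excel.
      slope, intercept = np.polyfit(conc, response, deg=1)

      def back_calculate(signal):
          """Convert a measured signal for an unknown into a concentration."""
          return (signal - intercept) / slope

      print(round(back_calculate(0.90), 2))  # concentration of an unknown sample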

  8. openBIS ELN-LIMS: an open-source database for academic laboratories.

    PubMed

    Barillari, Caterina; Ottoz, Diana S M; Fuentes-Serna, Juan Mariano; Ramakrishnan, Chandrasekhar; Rinn, Bernd; Rudolf, Fabian

    2016-02-15

    The open-source platform openBIS (open Biology Information System) offers an Electronic Laboratory Notebook and a Laboratory Information Management System (ELN-LIMS) solution suitable for the academic life science laboratories. openBIS ELN-LIMS allows researchers to efficiently document their work, to describe materials and methods and to collect raw and analyzed data. The system comes with a user-friendly web interface where data can be added, edited, browsed and searched. The openBIS software, a user guide and a demo instance are available at https://openbis-eln-lims.ethz.ch. The demo instance contains some data from our laboratory as an example to demonstrate the possibilities of the ELN-LIMS (Ottoz et al., 2014). For rapid local testing, a VirtualBox image of the ELN-LIMS is also available. © The Author 2015. Published by Oxford University Press.

  9. Open-source LIMS in Vietnam: The path toward sustainability and host country ownership.

    PubMed

    Landgraf, Kenneth M; Kakkar, Reshma; Meigs, Michelle; Jankauskas, Paul T; Phan, Thi Thu Huong; Nguyen, Viet Nga; Nguyen, Duy Thai; Duong, Thanh Tung; Nguyen, Thi Hoa; Bond, Kyle B

    2016-09-01

    The objectives of this case report are as follows: to describe the process of establishing a national laboratory information management system (LIMS) program for clinical and public health laboratories in Vietnam; to evaluate the outcomes and lessons learned; and to present a model for sustainability based on the program outcomes that could be applied to diverse laboratory programs. This case report comprises a review of program documentation and records, including planning and budgetary records of the donor, monthly reports from the implementer, direct observation, and ad-hoc field reports from technical advisors and governmental agencies. Additional data on program efficacy and user acceptance were collected from routine monitoring of laboratory policies and operational practices. LIMS software was implemented at 38 hospital, public health and HIV testing laboratories in Vietnam. This LIMS was accepted by users and program managers as a useful tool to support laboratory processes. Implementation cost per laboratory and average duration of deployment decreased over time, and project stakeholders initiated transition of financing (from the donor to local institutions) and of system maintenance functions (from the implementer to governmental and site-level staff). Collaboration between the implementer in Vietnam and the global LIMS user community was strongly established, and knowledge was successfully transferred to staff within Vietnam. Implementing open-sourced LIMS with local development and support was a feasible approach towards establishing a sustainable laboratory informatics program that met the needs of health laboratories in Vietnam. Further effort to institutionalize IT support capacity within key government agencies is ongoing. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  10. LIMS user acceptance testing.

    PubMed

    Klein, Corbett S

    2003-01-01

    Laboratory Information Management Systems (LIMS) play a key role in the pharmaceutical industry. Thorough and accurate validation of such systems is critical and is a regulatory requirement. LIMS user acceptance testing is one aspect of this testing and enables the user to make a decision to accept or reject implementation of the system. This paper discusses key elements in facilitating the development and execution of a LIMS User Acceptance Test Plan (UATP).
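
    User acceptance testing of a LIMS is typically organized as scripted test cases with expected results that end users sign off against. A hedged sketch of how one such case might be automated (the requirement ID and helper function are invented for illustration, not taken from the paper):

      import unittest

      def register_sample(lims, sample_id, matrix):
          """Stand-in for the system call under test; a real UAT script would
          drive the actual LIMS client or API here."""
          return lims.setdefault("samples", {}).setdefault(sample_id, {"matrix": matrix})

      class TestSampleRegistration(unittest.TestCase):
          """Hypothetical UAT case URS-REQ-012: registering a sample assigns a
          unique ID and stores the declared matrix; duplicates are not created."""

          def test_sample_is_registered_once(self):
              lims = {}
              register_sample(lims, "S-0001", "plasma")
              register_sample(lims, "S-0001", "plasma")  # duplicate registration
              self.assertEqual(len(lims["samples"]), 1)
              self.assertEqual(lims["samples"]["S-0001"]["matrix"], "plasma")

      if __name__ == "__main__":
          unittest.main()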

  11. Implementing the LIM code: the structural basis for cell type-specific assembly of LIM-homeodomain complexes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhati, Mugdha; Lee, Christopher; Nancarrow, Amy L.

    2008-09-03

    LIM-homeodomain (LIM-HD) transcription factors form a combinatorial 'LIM code' that contributes to the specification of cell types. In the ventral spinal cord, the binary LIM homeobox protein 3 (Lhx3)/LIM domain-binding protein 1 (Ldb1) complex specifies the formation of V2 interneurons. The additional expression of islet-1 (Isl1) in adjacent cells instead specifies the formation of motor neurons through assembly of a ternary complex in which Isl1 contacts both Lhx3 and Ldb1, displacing Lhx3 as the binding partner of Ldb1. However, little is known about how this molecular switch occurs. Here, we have identified the 30-residue Lhx3-binding domain on Isl1 (Isl1-LBD). Although the LIM interaction domain of Ldb1 (Ldb1-LID) and Isl1-LBD share low levels of sequence homology, X-ray and NMR structures reveal that they bind Lhx3 in an identical manner, that is, Isl1-LBD mimics Ldb1-LID. These data provide a structural basis for the formation of cell type-specific protein-protein interactions in which unstructured linear motifs with diverse sequences compete to bind protein partners. The resulting alternate protein complexes can target different genes to regulate key biological events.

  12. Towards Archetypes-Based Software Development

    NASA Astrophysics Data System (ADS)

    Piho, Gunnar; Roost, Mart; Perkins, David; Tepandi, Jaak

    We present a framework for the archetypes-based engineering of domains, requirements and software (Archetypes-Based Software Development, ABD). An archetype is defined as a primordial object that occurs consistently and universally in business domains and in business software systems. An archetype pattern is a collaboration of archetypes. Archetypes and archetype patterns are used to capture conceptual information into domain specific models that are utilized by ABD. The focus of ABD is on software factories - family-based development artefacts (domain specific languages, patterns, frameworks, tools, micro processes, and others) that can be used to build the family members. We demonstrate the usage of ABD for developing laboratory information management system (LIMS) software for the Clinical and Biomedical Proteomics Group, at the Leeds Institute of Molecular Medicine, University of Leeds.
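
    An archetype in this sense is a modelling element that recurs across business domains (a Party, a role a Party plays, and so on). The following Python fragment is only a toy rendering of the idea and assumes nothing about the authors' actual models:

      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class PartyRole:
          """Archetype: a role a Party plays in some context (customer, analyst, ...)."""
          description: str

      @dataclass
      class Party:
          """Archetype: any identifiable person or organisation."""
          name: str
          roles: List[PartyRole] = field(default_factory=list)

      lab = Party("Clinical and Biomedical Proteomics Group")
      lab.roles.append(PartyRole("sample-processing laboratory"))
      print(lab)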

  13. Leaf LIMS: A Flexible Laboratory Information Management System with a Synthetic Biology Focus.

    PubMed

    Craig, Thomas; Holland, Richard; D'Amore, Rosalinda; Johnson, James R; McCue, Hannah V; West, Anthony; Zulkower, Valentin; Tekotte, Hille; Cai, Yizhi; Swan, Daniel; Davey, Robert P; Hertz-Fowler, Christiane; Hall, Anthony; Caddick, Mark

    2017-12-15

    This paper presents Leaf LIMS, a flexible laboratory information management system (LIMS) designed to address the complexity of synthetic biology workflows. At the project's inception, no available LIMS specifically addressed synthetic biology processes, with most systems focused on either next generation sequencing or biobanks and clinical sample handling. Leaf LIMS implements integrated project, item, and laboratory stock tracking, offering complete sample and construct genealogy, materials and lot tracking, and modular assay data capture. Hence, it enables highly configurable task-based workflows and supports data capture from project inception to completion. As such, in addition to supporting synthetic biology, it is well suited to many laboratory environments with multiple projects and users. The system is deployed as a web application through Docker and is provided under a permissive MIT license. It is freely available for download at https://leaflims.github.io.

  14. LIMS Version 6 Level 3 Dataset

    NASA Technical Reports Server (NTRS)

    Remsberg, Ellis E.; Lingenfelser, Gretchen

    2010-01-01

    This report describes the Limb Infrared Monitor of the Stratosphere (LIMS) Version 6 (V6) Level 3 data products and the assumptions used for their generation. A sequential estimation algorithm was used to obtain daily, zonal Fourier coefficients of the several parameters of the LIMS dataset for 216 days of 1978-79. The coefficients are available at up to 28 pressure levels and at every two degrees of latitude from 64 S to 84 N and at the synoptic time of 12 UT. Example plots were prepared and archived from the data at 10 hPa of January 1, 1979, to illustrate the overall coherence of the features obtained with the LIMS-retrieved parameters.
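
    The "daily, zonal Fourier coefficients" can be read as a truncated Fourier expansion of each retrieved parameter around a latitude circle; a generic form (the truncation limit K is not stated in this summary and is assumed here only to fix notation) is:

      % Zonal Fourier representation of a parameter X at latitude phi,
      % pressure level p and day t, as a function of longitude lambda:
      X(\lambda,\varphi,p,t) = a_0(\varphi,p,t)
        + \sum_{k=1}^{K} \left[ a_k(\varphi,p,t)\cos(k\lambda)
        + b_k(\varphi,p,t)\sin(k\lambda) \right]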

  15. An emerging link between LIM domain proteins and nuclear receptors.

    PubMed

    Sala, Stefano; Ampe, Christophe

    2018-06-01

    Nuclear receptors are ligand-activated transcription factors that partake in several biological processes including development, reproduction and metabolism. Over the last decade, evidence has accumulated that group 2, 3 and 4 LIM domain proteins, primarily known for their roles in actin cytoskeleton organization, also partake in gene transcription regulation. They shuttle between the cytoplasm and the nucleus, among other reasons as a consequence of triggering cells with ligands of nuclear receptors. LIM domain proteins act as important coregulators of nuclear receptor-mediated gene transcription, in which they can function as either coactivators or corepressors. In establishing interactions with nuclear receptors, the LIM domains are important, yet pleiotropy of LIM domain proteins and nuclear receptors frequently occurs. LIM domain protein-nuclear receptor complexes function in diverse physiological processes. Their association is, however, often linked to diseases including cancer.

  16. A guide for the laboratory information management system (LIMS) for light stable isotopes--Versions 7 and 8

    USGS Publications Warehouse

    Coplen, Tyler B.

    2000-01-01

    The reliability and accuracy of isotopic data can be improved by utilizing database software to (i) store information about samples, (ii) store the results of mass spectrometric isotope-ratio analyses of samples, (iii) calculate analytical results using standardized algorithms stored in a database, (iv) normalize stable isotopic data to international scales using isotopic reference materials, and (v) generate multi-sheet paper templates for convenient sample loading of automated mass-spectrometer sample preparation manifolds. Such a database program, the Laboratory Information Management System (LIMS) for Light Stable Isotopes, is presented herein. Major benefits of this system include (i) a dramatic improvement in quality assurance, (ii) an increase in laboratory efficiency, (iii) a reduction in workload due to the elimination or reduction of retyping of data by laboratory personnel, and (iv) a decrease in errors in data reported to sample submitters. Such a database provides a complete record of when and how often laboratory reference materials have been analyzed and provides a record of what correction factors have been used through time. It provides an audit trail for laboratories. LIMS for Light Stable Isotopes is available for both Microsoft Office 97 Professional and Microsoft Office 2000 Professional as versions 7 and 8, respectively. Both source code (mdb file) and precompiled executable files (mde) are available. Numerous improvements have been made for continuous flow isotopic analysis in this version (specifically 7.13 for Microsoft Access 97 and 8.13 for Microsoft Access 2000). It is much easier to import isotopic results from Finnigan ISODAT worksheets, even worksheets on which corrections for amount of sample (linearity corrections) have been added. The capability to determine blank corrections using isotope mass balance from analyses of elemental analyzer samples has been added. It is now possible to calculate and apply drift corrections to isotopic
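
    Normalization to an international scale against two isotopic reference materials is commonly expressed as a two-point linear correction; the generic form below is given only as an illustration of that step, not as the specific algorithm stored in LIMS:

      % Two-point normalization of a measured delta value using reference
      % materials RM1 and RM2 with accepted (true) and measured values:
      \delta_{\mathrm{sample}}^{\mathrm{norm}} =
        \delta_{\mathrm{RM1}}^{\mathrm{true}}
        + \left( \delta_{\mathrm{sample}}^{\mathrm{meas}} - \delta_{\mathrm{RM1}}^{\mathrm{meas}} \right)
          \frac{\delta_{\mathrm{RM2}}^{\mathrm{true}} - \delta_{\mathrm{RM1}}^{\mathrm{true}}}
               {\delta_{\mathrm{RM2}}^{\mathrm{meas}} - \delta_{\mathrm{RM1}}^{\mathrm{meas}}}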

  17. Laboratory Information Management System (LIMS): A case study

    NASA Technical Reports Server (NTRS)

    Crandall, Karen S.; Auping, Judith V.; Megargle, Robert G.

    1987-01-01

    In the late 70's, a refurbishment of the analytical laboratories serving the Materials Division at NASA Lewis Research Center was undertaken. As part of the modernization efforts, a Laboratory Information Management System (LIMS) was to be included. Preliminary studies indicated a custom-designed system as the best choice in order to satisfy all of the requirements. A scaled down version of the original design has been in operation since 1984. The LIMS, a combination of computer hardware, provides the chemical characterization laboratory with an information data base, a report generator, a user interface, and networking capabilities. This paper is an account of the processes involved in designing and implementing that LIMS.

  18. MetaLIMS, a simple open-source laboratory information management system for small metagenomic labs

    PubMed Central

    Gaultier, Nicolas Paul Eugène; Miller, Dana; Purbojati, Rikky Wenang; Lauro, Federico M.

    2017-01-01

    Background: As the cost of sequencing continues to fall, smaller groups increasingly initiate and manage larger sequencing projects and take on the complexity of data storage for high volumes of samples. This has created a need for low-cost laboratory information management systems (LIMS) that contain flexible fields to accommodate the unique nature of individual labs. Many labs do not have a dedicated information technology position, so a LIMS must also be easy to set up and maintain with minimal technical proficiency. Findings: MetaLIMS is a free and open-source web-based application available via GitHub. The focus of MetaLIMS is to store sample metadata prior to sequencing and analysis pipelines. MetaLIMS was initially designed for environmental metagenomics labs; in addition to storing generic sample collection information and DNA/RNA processing information, the user can also add fields specific to the user's lab. MetaLIMS can also produce a basic sequencing submission form compatible with the proprietary Clarity LIMS system used by some sequencing facilities. To help ease the technical burden associated with web deployment, MetaLIMS offers the option of commercial web hosting combined with MetaLIMS bash scripts for ease of setup. Conclusions: MetaLIMS overcomes key challenges common in LIMS by giving labs access to a low-cost and open-source tool that also has the flexibility to meet individual lab needs and an option for easy deployment. By making the web application open source and hosting it on GitHub, we hope to encourage the community to build upon MetaLIMS, making it more robust and tailored to the needs of more researchers. PMID:28430964

  19. MetaLIMS, a simple open-source laboratory information management system for small metagenomic labs.

    PubMed

    Heinle, Cassie Elizabeth; Gaultier, Nicolas Paul Eugène; Miller, Dana; Purbojati, Rikky Wenang; Lauro, Federico M

    2017-06-01

    As the cost of sequencing continues to fall, smaller groups increasingly initiate and manage larger sequencing projects and take on the complexity of data storage for high volumes of samples. This has created a need for low-cost laboratory information management systems (LIMS) that contain flexible fields to accommodate the unique nature of individual labs. Many labs do not have a dedicated information technology position, so a LIMS must also be easy to set up and maintain with minimal technical proficiency. MetaLIMS is a free and open-source web-based application available via GitHub. The focus of MetaLIMS is to store sample metadata prior to sequencing and analysis pipelines. MetaLIMS was initially designed for environmental metagenomics labs; in addition to storing generic sample collection information and DNA/RNA processing information, the user can also add fields specific to the user's lab. MetaLIMS can also produce a basic sequencing submission form compatible with the proprietary Clarity LIMS system used by some sequencing facilities. To help ease the technical burden associated with web deployment, MetaLIMS offers the option of commercial web hosting combined with MetaLIMS bash scripts for ease of setup. MetaLIMS overcomes key challenges common in LIMS by giving labs access to a low-cost and open-source tool that also has the flexibility to meet individual lab needs and an option for easy deployment. By making the web application open source and hosting it on GitHub, we hope to encourage the community to build upon MetaLIMS, making it more robust and tailored to the needs of more researchers. © The Authors 2017. Published by Oxford University Press.
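
    The "flexible fields" idea amounts to letting each lab attach arbitrary key-value metadata to a sample record alongside the fixed collection and processing fields. A minimal illustration in Python (the field names are invented; this is not the MetaLIMS schema):

      import json

      # Fixed core fields plus a free-form dictionary of lab-specific metadata.
      sample = {
          "sample_id": "ENV-0042",
          "collected_on": "2016-11-03",
          "nucleic_acid": "DNA",
          "custom_fields": {            # whatever an individual lab needs to track
              "sampling_depth_m": 2.5,
              "filter_pore_um": 0.22,
          },
      }

      # Stored and exchanged as JSON, so adding a field never requires a schema change.
      print(json.dumps(sample, indent=2))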

  20. MASTR-MS: a web-based collaborative laboratory information management system (LIMS) for metabolomics.

    PubMed

    Hunter, Adam; Dayalan, Saravanan; De Souza, David; Power, Brad; Lorrimar, Rodney; Szabo, Tamas; Nguyen, Thu; O'Callaghan, Sean; Hack, Jeremy; Pyke, James; Nahid, Amsha; Barrero, Roberto; Roessner, Ute; Likic, Vladimir; Tull, Dedreia; Bacic, Antony; McConville, Malcolm; Bellgard, Matthew

    2017-01-01

    An increasing number of research laboratories and core analytical facilities around the world are developing high throughput metabolomic analytical and data processing pipelines that are capable of handling hundreds to thousands of individual samples per year, often over multiple projects, collaborations and sample types. At present, there are no Laboratory Information Management Systems (LIMS) that are specifically tailored for metabolomics laboratories that are capable of tracking samples and associated metadata from the beginning to the end of an experiment, including data processing and archiving, and which are also suitable for use in large institutional core facilities or multi-laboratory consortia as well as single laboratory environments. Here we present MASTR-MS, a downloadable and installable LIMS solution that can be deployed either within a single laboratory or used to link workflows across a multisite network. It comprises a Node Management System that can be used to link and manage projects across one or multiple collaborating laboratories; a User Management System which defines different user groups and privileges of users; a Quote Management System where client quotes are managed; a Project Management System in which metadata is stored and all aspects of project management, including experimental setup, sample tracking and instrument analysis, are defined, and a Data Management System that allows the automatic capture and storage of raw and processed data from the analytical instruments to the LIMS. MASTR-MS is a comprehensive LIMS solution specifically designed for metabolomics. It captures the entire lifecycle of a sample starting from project and experiment design to sample analysis, data capture and storage. It acts as an electronic notebook, facilitating project management within a single laboratory or a multi-node collaborative environment. This software is being developed in close consultation with members of the metabolomics research
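
    The modules listed above cover a sample's whole lifecycle, which can be pictured as a status progression per sample. The states below are an illustrative guess at a typical flow, not the actual MASTR-MS state machine:

      from enum import Enum, auto

      class SampleState(Enum):
          # Hypothetical lifecycle stages from project design to archived data.
          DESIGNED  = auto()
          RECEIVED  = auto()
          PREPARED  = auto()
          ACQUIRED  = auto()   # run on the analytical instrument
          PROCESSED = auto()
          ARCHIVED  = auto()

      def advance(state: SampleState) -> SampleState:
          """Move a sample to the next lifecycle stage, stopping at ARCHIVED."""
          order = list(SampleState)
          i = order.index(state)
          return order[min(i + 1, len(order) - 1)]

      s = SampleState.DESIGNED
      while s is not SampleState.ARCHIVED:
          s = advance(s)
          print(s.name)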

  1. LIM kinase function and renal growth: Potential role for LIM kinases in fetal programming of kidney development.

    PubMed

    Sparrow, Alexander J; Sweetman, Dylan; Welham, Simon J M

    2017-10-01

    Maternal dietary restriction during pregnancy impairs nephron development and results in offspring with fewer nephrons. Cell turnover in the early developing kidney is altered by exposure to maternal dietary restriction and may be regulated by the LIM-kinase family of enzymes. We set out to establish whether disturbance of LIM-kinase activity might play a role in the impairment of nephron formation. E12.5 metanephric kidneys and HK2 cells were grown in culture with the pharmacological LIM-kinase inhibitor BMS5. Organs were injected with DiI, imaged and cell numbers measured over 48h to assess growth. Cells undergoing mitosis were visualised by pH3 labelling. Growth of cultured kidneys reduced to 83% of controls after exposure to BMS5 and final cell number to 25% of control levels after 48h. Whilst control and BMS5 treated organs showed cells undergoing mitosis (100±11 cells/field vs 113±18 cells/field respectively) the proportion in anaphase was considerably diminished with BMS5 treatment (7.8±0.8% vs 0.8±0.6% respectively; P<0.01). This was consistent with effects on HK2 cells highlighting a severe impact of BMS5 on formation of the mitotic spindle and centriole positioning. DiI labelled cells migrated in 100% of control cultures vs 0% BMS5 treated organs. The number of nephrogenic precursor cells appeared depleted in whole organs and formation of new nephrons was blocked by exposure to BMS5. Pharmacological blockade of LIM-kinase function in the early developing kidney results in failure of renal development. This is likely due to prevention of dividing cells from completion of mitosis with their resultant loss. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. Molecular Characterization of abLIM, a Novel Actin-binding and Double Zinc Finger Protein

    PubMed Central

    Roof, Dorothy J.; Hayes, Annmarie; Adamian, Michael; Chishti, Athar H.; Li, Tiansen

    1997-01-01

    Molecules that couple the actin-based cytoskeleton to intracellular signaling pathways are central to the processes of cellular morphogenesis and differentiation. We have characterized a novel protein, the actin-binding LIM (abLIM) protein, which could mediate such interactions between actin filaments and cytoplasmic targets. abLIM protein consists of a COOH-terminal cytoskeletal domain that is fused to an NH2-terminal domain consisting of four double zinc finger motifs. The cytoskeletal domain is ∼50% identical to erythrocyte dematin, an actin-bundling protein of the red cell membrane skeleton, while the zinc finger domains conform to the LIM motif consensus sequence. In vitro expression studies demonstrate that abLIM protein can bind to F-actin through the dematin-like domain. Transcripts corresponding to three distinct isoforms have a widespread tissue distribution. However, a polypeptide corresponding to the full-length isoform is found exclusively in the retina and is enriched in biochemical extracts of retinal rod inner segments. abLIM protein also undergoes extensive phosphorylation in light-adapted retinas in vivo, and its developmental expression in the retina coincides with the elaboration of photoreceptor inner and outer segments. Based on the composite primary structure of abLIM protein, actin-binding capacity, potential regulation via phosphorylation, and isoform expression pattern, we speculate that abLIM may play a general role in bridging the actin-based cytoskeleton with an array of potential LIM protein-binding partners. The developmental time course of abLIM expression in the retina suggests that the retina-specific isoform may have a specialized role in the development or elaboration of photoreceptor inner and outer segments. PMID:9245787

  3. Principles of Contour Information: Reply to Lim and Leek (2012)

    ERIC Educational Resources Information Center

    Singh, Manish; Feldman, Jacob

    2012-01-01

    Lim and Leek (2012) presented a formalization of information along object contours, which they argued was an alternative to the approach taken in our article (Feldman & Singh, 2005). Here, we summarize the 2 approaches, showing that--notwithstanding Lim and Leek's (2012) critical rhetoric--their approach is substantially identical to ours,…

  4. The LIM homeobox protein mLIM3/Lhx3 induces expression of the prolactin gene by a Pit-1/GHF-1-independent pathway in corticotroph AtT20 cells.

    PubMed

    Girardin, S E; Benjannet, S; Barale, J C; Chrétien, M; Seidah, N G

    1998-07-24

    mLIM3, a member of the LIM homeobox family, was recently demonstrated to be critical for proliferation and differentiation of the pituitary cell lineage. Using a pool of degenerate oligonucleotides we determined the DNA sequence ANNAGGAAA(T/C)GA(CIG)AA as the set preferentially recognized by mLIM3. A nearly identical sequence is found in the prolactin (PRL) promoter, within a 15-mer stretch from nucleotides (nts) -218 to -204 which is highly conserved between human, rat, and bovine. In order to test the hypothesis of a transcriptional effect of mLIM3 on the prolactin promoter, stable transfectants of mLIM3 cDNA in AtT20 tumor cells revealed that PRL mRNA expression was induced in 3 separate stable clones. Gel retardation experiments performed using nuclear extracts isolated from one of the AtT20/mLIM3 stable transfectants revealed affinity towards the 15-mer element of the PRL promoter. From these results, we propose that the PRL promoter element (nts -218 to -204) could be functional in vivo. Finally, we demonstrate that in AtT20 cells prolactin mRNA expression is not induced by the Pit-1/GHF-1 pathway and that growth hormone mRNA is not detected concomitantly with prolactin. We conclude that mLIM3 may play a key role in inducing PRL gene expression in lactotrophs by binding to a conserved motif close to a Pit-1/GHF-1 site within the proximal PRL promoter.

  5. Recent advances in the rational design and development of LIM kinase inhibitors are not enough to enter clinical trials.

    PubMed

    Manetti, Fabrizio

    2018-06-08

    LIM kinases are involved in various pathophysiological processes that depend on actin organization. Alteration of microtubule dynamics by LIMK dysregulation is in fact related to tumor progression and metastasis, viral infection, and ocular diseases, such as glaucoma. As a consequence, many efforts have been done in recent years to rationally design small molecules able to inhibit LIMK activity selectively, without affecting other kinases. As a result, compounds optimized in terms of binding affinity and pharmacokinetic parameters have been discovered, that however failed to access clinical trials. In this review, a comprehensive survey of recent LIMK inhibitors is reported, together with SAR considerations and optimization processes. Copyright © 2018 Elsevier Masson SAS. All rights reserved.

  6. Molecular cloning, structure, and chromosomal localization of the mouse LIM/homeobox gene Lhx5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertuzzi, S.; Sheng, Hui Z.; Westphal, H.

    1996-09-01

    Lhx5, the mouse ortholog of the Xenopus Xlim-5, is a LIM/homeobox gene expressed in the central nervous system during both embryonic development and adulthood. During development its domain of expression is mainly localized at the most anterior portion of the neural tube, and it precedes the morphological differentiation of the forebrain; for this reason we believe that Lhx5 could play an important role in forebrain patterning. Here we present the structural organization and the chromosomal localization of the Lhx5 gene. The gene is composed of five exons spanning more than 10 kb of genomic sequence. The first and second LIM domains are encoded by the first and second exon, while the codons of the homeobox are split between the third and the fourth exons. The structure of Lhx5 is similar to that of other LIM/homeodomain proteins, Lhx1/lim1 and Lhx3/lim3, but differs from that of other LIM genes, such as mec3 and LMO1/Rbtn1, in which the codons for the LIM domains are interrupted by introns. We have mapped Lhx5 to the central region of mouse chromosome 5.

  7. A Framework for Software Reuse in Safety-Critical System of Systems

    DTIC Science & Technology

    2008-03-01

    Pressman, on the other hand, defines a software component as a unit of composition with contractually specified and explicit context... R.S. Pressman, Software Engineering: A Practitioner's Approach, Sixth Edition, New York, NY: McGraw-Hill, 2005, pp. 654, 817... W.C. Lim... Pressman, R.S., Software Engineering: A Practitioner's Approach, Sixth Edition, New York, NY: McGraw-Hill, 2005.

  8. Technical Considerations in Remote LIMS Access via the World Wide Web

    PubMed Central

    Schlabach, David M.

    2005-01-01

    The increased dependency on the World Wide Web by both laboratories and their customers has led LIMS developers to take advantage of thin-client web applications that provide both remote data entry and manipulation, along with remote reporting functionality. Use of an LIMS through a web browser allows a person to interact with a distant application, providing both remote administration and real-time analytical result delivery from virtually anywhere in the world. While there are many benefits of web-based LIMS applications, some concern must be given to these new methods of system architecture before justifying them as a suitable replacement for their traditional client-server systems. Developers and consumers alike must consider the security aspects of introducing a wide area network capable system into a production environment, as well as the concerns of data integrity and usability. PMID:18924736
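
    The pattern described here, a thin client talking to the LIMS over the web with authentication and TLS, can be sketched briefly; the host, endpoint and token below are placeholders rather than a real LIMS API:

      import json
      import urllib.request

      BASE_URL = "https://lims.example.org/api"   # placeholder host
      TOKEN = "replace-with-issued-api-token"     # placeholder credential

      # Hypothetical read-only call for one sample's results; HTTPS plus a bearer
      # token stand in for whatever authentication the real system enforces.
      req = urllib.request.Request(
          f"{BASE_URL}/samples/S-0001/results",
          headers={"Authorization": f"Bearer {TOKEN}", "Accept": "application/json"},
      )
      try:
          with urllib.request.urlopen(req, timeout=10) as resp:  # certificate checks on by default
              print(json.loads(resp.read()))
      except OSError as err:
          print(f"request failed (placeholder endpoint): {err}")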

  9. A Manual for a Laboratory Information Management System (LIMS) For Light Stable Isotopes - Version 7.0

    DTIC Science & Technology

    1998-01-01

    A Manual for a Laboratory Information Management System (LIMS) for Light Stable Isotopes, Version 7.0. U.S. Geological Survey Open-File Report 98-284.

  10. Cyclic Testing of the 6-Strand Tang and Modified Lim-Tsai Flexor Tendon Repair Techniques.

    PubMed

    Kang, Gavrielle Hui-Ying; Wong, Yoke-Rung; Lim, Rebecca Qian-Ru; Loke, Austin Mun-Kitt; Tay, Shian-Chao

    2018-03-01

    In this study, we compared the Tang repair technique with the 6-strand modified Lim-Tsai repair technique under cyclic testing conditions. Twenty fresh-frozen porcine flexor tendons were randomized into 2 groups for repair with either the modified Lim-Tsai or the Tang technique using Supramid 4-0 core sutures and Ethilon 6-0 epitendinous running suture. The repaired tendons were subjected to 2 stage cyclic loading. The survival rate and gap formation at the repair site were recorded. Tendons repaired by the Tang technique achieved an 80% survival rate. None of the modified Lim-Tsai repairs survived. The mean gap formed at the end of 1000 cycles was 1.09 mm in the Tang repairs compared with 4.15 mm in the modified Lim-Tsai repairs. The Tang repair is biomechanically stronger than the modified Lim-Tsai repair under cyclic loading. The Tang repair technique may exhibit a higher tolerance for active mobilization after surgery with less propensity for gap formation. Copyright © 2018 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  11. Chromatin Immunoprecipitation and DNA Sequencing Identified a LIMS1/ILK Pathway Regulated by LMO1 in Neuroblastoma.

    PubMed

    Saeki, Norihisa; Saito, Akira; Sugaya, Yuki; Amemiya, Mitsuhiro; Ono, Hiroe; Komatsuzaki, Rie; Yanagihara, Kazuyoshi; Sasaki, Hiroki

    2018-01-01

    Overall survival for the high-risk group of neuroblastoma (NB) remains at 40-50%. An integrative genomics study revealed that LIM domain only 1 (LMO1) encoding a transcriptional regulator to be an NB-susceptibility gene with a tumor-promoting activity, that needs to be revealed. We conducted chromatin immunoprecipitation and DNA sequencing analyses and cell proliferation assays on two NB cell lines. We identified three genes regulated by LMO1 in the cells, LIM and senescent cell antigen-like domains 1 (LIMS1), Ras suppressor protein 1 (RSU1) and relaxin 2 (RLN2). LIMS1 and RSU1 encode proteins functioning with integrin-linked kinase (ILK), and inhibition of LIMS1, ILK or RLN2 by shRNA reduced cell proliferation of the NB cells, which was also suppressed with an ILK inhibiting compound Cpd 22. The downstream of LMO1-regulatory cascade includes a tumor-promoting LIMS1/ILK pathway, which has a potential to be a novel therapeutic target. Copyright© 2018, International Institute of Anticancer Research (Dr. George J. Delinasios), All rights reserved.

  12. Comparison of Southern Hemisphere radiosonde and LIMS temperatures at 100 mb. [limb infrared monitor of stratosphere

    NASA Technical Reports Server (NTRS)

    Miles, T.; Grose, W. L.; Russell, J. M., III; Remsberg, E. E.

    1987-01-01

    Radiosonde (RS) and satellite-derived (Nimbus-7 LIMS) 100-mb temperatures over New Zealand at 12 GMT are compared for the 1978-79 summer. The colocated LIMS temperature information consists of synoptically mapped values (for 12 GMT), as well as the primary nighttime orbital retrievals valid at about 1030 GMT. The RS time series of temperature is dominated by temporal fluctuations associated mainly with the eastward passage of waves which have characteristic periods of 4-5 and 11-12 days and peak-to-peak amplitudes of 10-15 K. The LIMS temperatures and the corresponding temperature time series are also found to exhibit quite close agreement (in terms of temporal phase for the latter) with the RS data. However, the LIMS-mapped temperature fluctuations suffer from a noticeable attenuation in amplitude (approaching 50 percent for higher-frequency fluctuations), which will affect the accuracy of LIMS-derived estimates of dynamical quantities such as wind velocity and relative vorticity in the lower stratosphere.

  13. Phase coexistence and exchange-bias effect in LiMn2O4 nanorods

    NASA Astrophysics Data System (ADS)

    Zhang, X. K.; Yuan, J. J.; Xie, Y. M.; Yu, Y.; Kuang, F. G.; Yu, H. J.; Zhu, X. R.; Shen, H.

    2018-03-01

    In this paper, the magnetic properties of LiMn2O4 nanorods with an average diameter of ~100 nm and length of ~1 μm are investigated. The temperature dependences of dc and ac susceptibility measurements show that LiMn2O4 nanorods experience multiple magnetic phase transitions upon cooling, i.e., paramagnetic (PM), antiferromagnetic (AFM), canted antiferromagnetic (CAFM), and cluster spin glass (SG). The coexistence between a long-range ordered AFM phase due to a Mn4+-Mn4+ interaction and a cluster SG phase originating from frozen AFM clusters at low temperature in LiMn2O4 nanorods is elucidated. Field-cooled hysteresis loops (FC loops) and magnetic training effect (TE) measurements confirm the presence of an exchange-bias (EB) effect in LiMn2O4 nanorods below the Néel temperature (TN ~ 60 K). Furthermore, by analyzing the TE, we conclude that the observed EB effect originates completely from an exchange coupling interaction at the interface between the AFM and cluster SG states. A phenomenological model based on phase coexistence is proposed to interpret the origin of the EB effect below 60 K in the present compound. In turn, the appearance of the EB effect further supports the coexistence of AFM order along with a cluster SG state in LiMn2O4 nanorods.

  14. On the Quality of the Nimbus 7 LIMS Version 6 Water Vapor Profiles and Distributions

    NASA Technical Reports Server (NTRS)

    Remsberg, E. E.; Natarajan, M.; Lingenfelser, G. S.; Thompson, R. E.; Marshall, B. T.; Gordley, L. L.

    2009-01-01

    This report describes the quality of the Nimbus 7 Limb Infrared Monitor of the Stratosphere (LIMS) water vapor (H2O) profiles of 1978/79 that were processed with a Version 6 (V6) algorithm and archived in 2002. The V6 profiles incorporate a better knowledge of the instrument attitude for the LIMS measurements along its orbits, leading to improvements for its temperature profiles and for the registration of its water vapor radiances with pressure. As a result, the LIMS V6 zonal-mean distributions of H2O exhibit better hemispheric symmetry than was the case from the original Version 5 (V5) dataset that was archived in 1982. Estimates of the precision and accuracy of the V6 H2O profiles are developed and provided. Individual profiles have a precision of order 5% and an estimated accuracy of about 19% at 3 hPa, 14% at 10 hPa, and 26% at 50 hPa. Profile segments within about 2 km of the tropopause are often affected by emissions from clouds that appear in the finite field-of-view of the detector for the LIMS H2O channel. Zonally-averaged distributions of the LIMS V6 H2O are compared with those from the more recent Microwave Limb Sounder (MLS) satellite experiment for November, February, and May of 2004/2005. The patterns and values of their respective distributions are similar in many respects. Effects of a strengthened Brewer-Dobson circulation are indicated in the MLS distributions of the recent decade versus those of LIMS from 1978/79. A tropical tape recorder signal is present in the 7-month time series of LIMS V6 H2O with lowest values in February 1979, and the estimated, annually-averaged "entry-level" H2O is 3.5 to 3.8 ppmv. It is judged that this historic LIMS water vapor dataset is of good quality for studies of the near global-scale chemistry and transport for pressure levels from 3 hPa to about 70 to 100 hPa.

  15. The Transcription Factors Islet and Lim3 Combinatorially Regulate Ion Channel Gene Expression

    PubMed Central

    Wolfram, Verena; Southall, Tony D.; Günay, Cengiz; Prinz, Astrid A.; Brand, Andrea H.

    2014-01-01

    Expression of appropriate ion channels is essential to allow developing neurons to form functional networks. Our previous studies have identified LIM-homeodomain (HD) transcription factors (TFs), expressed by developing neurons, that are specifically able to regulate ion channel gene expression. In this study, we use the technique of DNA adenine methyltransferase identification (DamID) to identify putative gene targets of four such TFs that are differentially expressed in Drosophila motoneurons. Analysis of targets for Islet (Isl), Lim3, Hb9, and Even-skipped (Eve) identifies both ion channel genes and genes predicted to regulate aspects of dendritic and axonal morphology. Significantly, some ion channel genes are bound by more than one TF, consistent with the possibility of combinatorial regulation. One such gene is Shaker (Sh), which encodes a voltage-dependent fast K+ channel (Kv1.1). DamID reveals that Sh is bound by both Isl and Lim3. We used body wall muscle as a test tissue because in conditions of low Ca2+, the fast K+ current is carried solely by Sh channels (unlike neurons in which a second fast K+ current, Shal, also contributes). Ectopic expression of isl, but not Lim3, is sufficient to reduce both Sh transcript and Sh current level. By contrast, coexpression of both TFs is additive, resulting in a significantly greater reduction in both Sh transcript and current compared with isl expression alone. These observations provide evidence for combinatorial activity of Isl and Lim3 in regulating ion channel gene expression. PMID:24523544

  16. AJUBA LIM Proteins Limit Hippo Activity in Proliferating Cells by Sequestering the Hippo Core Kinase Complex in the Cytosol

    PubMed Central

    Jagannathan, Radhika; Schimizzi, Gregory V.; Zhang, Kun; Loza, Andrew J.; Yabuta, Norikazu; Nojima, Hitoshi

    2016-01-01

    The Hippo pathway controls organ growth and is implicated in cancer development. Whether and how Hippo pathway activity is limited to sustain or initiate cell growth when needed is not understood. The members of the AJUBA family of LIM proteins are negative regulators of the Hippo pathway. In mammalian epithelial cells, we found that AJUBA LIM proteins limit Hippo regulation of YAP, in proliferating cells only, by sequestering a cytosolic Hippo kinase complex in which LATS kinase is inhibited. At the plasma membranes of growth-arrested cells, AJUBA LIM proteins do not inhibit or associate with the Hippo kinase complex. The ability of AJUBA LIM proteins to inhibit YAP regulation by Hippo and to associate with the kinase complex directly correlate with their capacity to limit Hippo signaling during Drosophila wing development. AJUBA LIM proteins did not influence YAP activity in response to cell-extrinsic or cell-intrinsic mechanical signals. Thus, AJUBA LIM proteins limit Hippo pathway activity in contexts where cell proliferation is needed. PMID:27457617

  17. A Study of Clinically Related Open Source Software Projects

    PubMed Central

    Hogarth, Michael A.; Turner, Stuart

    2005-01-01

    Open source software development has recently gained significant interest due to several successful mainstream open source projects. This methodology has been proposed as being similarly viable and beneficial in the clinical application domain as well. However, the clinical software development venue differs significantly from the mainstream software venue. Existing clinical open source projects have not been well characterized nor formally studied so the ‘fit’ of open source in this domain is largely unknown. In order to better understand the open source movement in the clinical application domain, we undertook a study of existing open source clinical projects. In this study we sought to characterize and classify existing clinical open source projects and to determine metrics for their viability. This study revealed several findings which we believe could guide the healthcare community in its quest for successful open source clinical software projects. PMID:16779056

  18. AJUBA LIM Proteins Limit Hippo Activity in Proliferating Cells by Sequestering the Hippo Core Kinase Complex in the Cytosol.

    PubMed

    Jagannathan, Radhika; Schimizzi, Gregory V; Zhang, Kun; Loza, Andrew J; Yabuta, Norikazu; Nojima, Hitoshi; Longmore, Gregory D

    2016-10-15

    The Hippo pathway controls organ growth and is implicated in cancer development. Whether and how Hippo pathway activity is limited to sustain or initiate cell growth when needed is not understood. The members of the AJUBA family of LIM proteins are negative regulators of the Hippo pathway. In mammalian epithelial cells, we found that AJUBA LIM proteins limit Hippo regulation of YAP, in proliferating cells only, by sequestering a cytosolic Hippo kinase complex in which LATS kinase is inhibited. At the plasma membranes of growth-arrested cells, AJUBA LIM proteins do not inhibit or associate with the Hippo kinase complex. The ability of AJUBA LIM proteins to inhibit YAP regulation by Hippo and to associate with the kinase complex directly correlate with their capacity to limit Hippo signaling during Drosophila wing development. AJUBA LIM proteins did not influence YAP activity in response to cell-extrinsic or cell-intrinsic mechanical signals. Thus, AJUBA LIM proteins limit Hippo pathway activity in contexts where cell proliferation is needed. Copyright © 2016 Jagannathan et al.

  19. Molecular recognition of the Tes LIM2-3 domains by the actin-related protein Arp7A.

    PubMed

    Boëda, Batiste; Knowles, Phillip P; Briggs, David C; Murray-Rust, Judith; Soriano, Erika; Garvalov, Boyan K; McDonald, Neil Q; Way, Michael

    2011-04-01

    Actin-related proteins (Arps) are a highly conserved family of proteins that have extensive sequence and structural similarity to actin. All characterized Arps are components of large multimeric complexes associated with chromatin or the cytoskeleton. In addition, the human genome encodes five conserved but largely uncharacterized "orphan" Arps, which appear to be mostly testis-specific. Here we show that Arp7A, which has 43% sequence identity with β-actin, forms a complex with the cytoskeletal proteins Tes and Mena in the subacrosomal layer of round spermatids. The N-terminal 65-residue extension to the actin-like fold of Arp7A interacts directly with Tes. The crystal structure of the 1-65(Arp7A)·LIM2-3(Tes)·EVH1(Mena) complex reveals that residues 28-49 of Arp7A contact the LIM2-3 domains of Tes. Two alanine residues from Arp7A that occupy equivalent apolar pockets in both LIM domains as well as an intervening GPAK linker that binds the LIM2-3 junction are critical for the Arp7A-Tes interaction. Equivalent occupied apolar pockets are also seen in the tandem LIM domain structures of LMO4 and Lhx3 bound to unrelated ligands. Our results indicate that apolar pocket interactions are a common feature of tandem LIM domain interactions, but ligand specificity is principally determined by the linker sequence.

  20. Voice recognition software for clinical use.

    PubMed

    Korn, K

    1998-11-01

    The current generation of voice recognition products truly offers the promise of voice recognition systems that are financially and operationally acceptable for use in a health care facility. Although the initial capital outlay for such equipment may be substantial, the long-term benefit is felt to outweigh the expense. The ability to use the computer equipment for educational purposes and information management alone helps to justify the cost, and the Internet has become a substantial source of information, providing another functional use for this equipment. Although the implications of such a program for clinical practice are readily apparent, other uses should not be overlooked; uses far beyond the writing of clinic notes and correspondence can easily be envisioned. Voice recognition software offers clinical practices the ability to produce quality printed records in a timely and cost-effective manner. After learning the procedures for the selected product and appropriately configuring word processing software and printers, printed progress notes can be produced in less time than with traditional dictation and transcription methods. Although certain procedures and practices may need to be altered, or may preclude optimal use of this type of system, many advantages are apparent. Facilities should consider voice recognition products such as Dragon Systems Naturally Speaking software, or at least trial this method with one of the limited-feature products, if current dictation practices are unsatisfactory or excessively costly. Free downloadable trial software or single-user software provides a reduced-cost way to evaluate such products if a major commitment is not desired. A list of voice recognition software manufacturer web sites may be accessed through the following: http

  1. A novel muscle LIM-only protein is generated from the paxillin gene locus in Drosophila.

    PubMed

    Yagi, R; Ishimaru, S; Yano, H; Gaul, U; Hanafusa, H; Sabe, H

    2001-09-01

    Paxillin is a protein containing four LIM domains, and functions in integrin signaling. We report here that two transcripts are generated from the paxillin gene locus in Drosophila; one encodes a protein homolog of the vertebrate Paxillin (DPxn37), and the other a protein with only three LIM domains, partly encoded by its own specific exon (PDLP). At the myotendinous junctions of Drosophila embryos where integrins play important roles, both DPxn37 and PDLP are highly expressed with different patterns; DPxn37 is predominantly concentrated at the center of the junctions, whereas PDLP is highly enriched at neighboring sides of the junction centers, primarily expressed in the mesodermal myotubes. Northern blot analysis revealed that DPxn37 is ubiquitously expressed throughout the life cycle, whereas PDLP expression exhibits a biphasic pattern during development, largely concomitant with muscle generation and remodeling. Our results collectively reveal that a unique system exists in Drosophila for the generation of a novel type of LIM-only protein, highly expressed in the embryonic musculature, largely utilizing the Paxillin LIM domains.

  2. Development of quantitative laser ionization mass spectrometry (LIMS). Final report, 1 Aug 87-1 Jan 90

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Odom, R.W.

    1991-06-04

    The objective of the research was to develop quantitative microanalysis methods for dielectric thin films using the laser ionization mass spectrometry (LIMS) technique. The research involved preparation of thin (5,000 Å) films of SiO2, Al2O3, MgF2, TiO2, Cr2O3, Ta2O5, Si3N4, and ZrO2, and doping these films with ion implant impurities of 11B, 40Ca, 56Fe, 68Zn, 81Br, and 121Sb. Laser ionization mass spectrometry (LIMS), secondary ion mass spectrometry (SIMS) and Rutherford backscattering spectrometry (RBS) were performed on these films. The research demonstrated quantitative LIMS analysis down to detection levels of 10-100 ppm, and led to the development of (1) a compound thin film standards product line for the performing organization, (2) routine LIMS analytical methods, and (3) the manufacture of high speed preamplifiers for time-of-flight mass spectrometry (TOF-MS) techniques.

  3. Clinical software development for the Web: lessons learned from the BOADICEA project

    PubMed Central

    2012-01-01

    Background In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. Results We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers

  4. Clinical software development for the Web: lessons learned from the BOADICEA project.

    PubMed

    Cunningham, Alex P; Antoniou, Antonis C; Easton, Douglas F

    2012-04-10

    In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. BOADICEA Web

  5. On the Quality of the Nimbus 7 LIMS Version 6 Ozone for Studies of the Middle Atmosphere

    NASA Technical Reports Server (NTRS)

    Remsberg, Ellis; Lingenfelser, Gretchen; Natarajan, Murali; Gordley, Larry; Thompson, Earl

    2006-01-01

    The Nimbus 7 Limb Infrared Monitor of the Stratosphere (LIMS) radiance profile dataset of 1978/79 was reconditioned and reprocessed to Version 6 (V6) profiles of temperature and species that are improved significantly over those from Version 5 (V5). The LIMS V6 dataset was archived for public use in 2002. Improvements for its ozone include: (1) a more accurate accounting for instrument and spacecraft motion effects in the radiances, (2) the use of better spectroscopic line parameters for its ozone forward model, (3) retrievals of all its scans, (4) more accurate and compatible temperature versus pressure profiles (or T(p)), which are needed for the registration of the ozone radiances and for the removal of temperature effects from them, and (5) a better accounting for interfering species in the lower stratosphere. The retrieved V6 ozone profiles extend from near cloud top altitudes to about 80 km and from 64S to 84N latitude with better sampling along the orbit than for the V5 dataset. Calculated estimates of the single-profile precision and accuracy are provided for the V6 ozone from this study. Precision estimates based on the data themselves are of order 3% or better from 1 to 30 hPa. Estimates of total systematic error for a single profile are hard to generalize because the separate sources of error may not all be of the same sign and they depend somewhat on the atmospheric state. It is estimated that the V6 zonal mean ozone distributions are accurate to within 9% to 7% from 50 hPa to 3 hPa, respectively. Effects of a temperature bias can be significant and may be present at 1 to 2 hPa though. There may be ozone biases of order 10% at those levels due to possible biases of up to +2 K, but there is no indication of a similar problem elsewhere in the stratosphere. Simulation studies show that the LIMS retrievals are also underestimating slightly the small amplitudes of the atmospheric temperature tides, which affect its retrieved day/night ozone differences

  6. Characterisation of Four LIM Protein-Encoding Genes Involved in Infection-Related Development and Pathogenicity by the Rice Blast Fungus Magnaporthe oryzae

    PubMed Central

    Li, Ya; Yue, Xiaofeng; Que, Yawei; Yan, Xia; Ma, Zhonghua; Talbot, Nicholas J.; Wang, Zhengyi

    2014-01-01

    LIM domain proteins contain contiguous double-zinc finger domains and play important roles in cytoskeletal re-organisation and organ development in multi-cellular eukaryotes. Here, we report the characterization of four genes encoding LIM proteins in the rice blast fungus Magnaporthe oryzae. Targeted gene replacement of either the paxillin-encoding gene, PAX1, or LRG1 resulted in a significant reduction in hyphal growth and loss of pathogenicity, while deletion of RGA1 caused defects in conidiogenesis and appressorium development. A fourth LIM domain gene, LDP1, was not required for infection-associated development by M. oryzae. Live cell imaging revealed that Lrg1-GFP and Rga1-GFP both localize to septal pores, while Pax1-GFP is present in the cytoplasm. To explore the function of individual LIM domains, we carried out systematic deletion of each LIM domain, which revealed the importance of the Lrg1-LIM2 and Lrg1-RhoGAP domains for Lrg1 function and overlapping functions of the three LIM domains of Pax1. Interestingly, deletion of either PAX1 or LRG1 led to decreased sensitivity to cell wall-perturbing agents, such as Congo Red and SDS (sodium dodecyl sulfate). qRT-PCR analysis demonstrated the importance of both Lrg1 and Pax1 to regulation of genes associated with cell wall biogenesis. When considered together, our results indicate that LIM domain proteins are key regulators of infection-associated morphogenesis by the rice blast fungus. PMID:24505448

  7. Thermography based prescreening software tool for veterinary clinics

    NASA Astrophysics Data System (ADS)

    Dahal, Rohini; Umbaugh, Scott E.; Mishra, Deependra; Lama, Norsang; Alvandipour, Mehrdad; Umbaugh, David; Marino, Dominic J.; Sackman, Joseph

    2017-05-01

    Under development is a clinical software tool that can be used in veterinary clinics as a prescreening tool for these pathologies: anterior cruciate ligament (ACL) disease, bone cancer and feline hyperthyroidism. Veterinary clinical practice currently uses several imaging techniques, including radiology, computed tomography (CT), and magnetic resonance imaging (MRI), but these techniques have major drawbacks: the harmful radiation involved, expensive equipment, excessive time consumption, and the need for a cooperative patient during imaging. In veterinary procedures it is very difficult for animals to remain still for the time periods required by standard imaging without resorting to sedation, which creates another set of complexities. Clinical application software integrated with a thermal imaging system, together with algorithms of high sensitivity and specificity for these pathologies, can therefore address the major drawbacks of the existing imaging techniques. A graphical user interface (GUI) has been created to allow ease of use for the clinical technician. The technician inputs an image, enters patient information, and selects the camera view associated with the image and the pathology to be diagnosed. The software classifies the image using an optimized classification algorithm developed through thousands of experiments: optimal image features are extracted and the resulting feature vector is compared against the stored image database for classification. Classification success rates as high as 88% for bone cancer, 75% for ACL disease and 90% for feline hyperthyroidism have been achieved. The software is currently undergoing preliminary clinical testing.
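
    As a hedged illustration of that last step only, the Python sketch below classifies a new thermal-image feature vector by nearest-neighbour comparison against a small labelled feature database. The 1-NN rule, the three-element feature vectors and the labels are assumptions made for illustration; they are not the optimized classifier or feature set reported above.

      import numpy as np

      # Hypothetical nearest-neighbour classification of a thermal-image feature
      # vector against a stored, labelled feature database (illustration only).
      def classify(feature_vec, db_features, db_labels):
          """Return the label of the closest stored feature vector (Euclidean 1-NN)."""
          dists = np.linalg.norm(db_features - feature_vec, axis=1)
          return db_labels[int(np.argmin(dists))]

      # Toy database: each row is a feature vector from a previously diagnosed image.
      db_features = np.array([[0.82, 0.10, 0.33],
                              [0.15, 0.71, 0.44],
                              [0.78, 0.12, 0.30]])
      db_labels = ["ACL", "normal", "ACL"]

      print(classify(np.array([0.80, 0.11, 0.31]), db_features, db_labels))  # -> ACL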

  8. Oracle, a novel PDZ-LIM domain protein expressed in heart and skeletal muscle.

    PubMed

    Passier, R; Richardson, J A; Olson, E N

    2000-04-01

    In order to identify novel genes enriched in adult heart, we performed a subtractive hybridization for genes expressed in mouse heart but not in skeletal muscle. We identified two alternative splicing variants of a novel PDZ-LIM domain protein, which we named Oracle. Both variants contain a PDZ domain at the amino-terminus and three LIM domains at the carboxy-terminus. Highest homology of Oracle was found with the human and rat enigma proteins in the PDZ domain (62 and 61%, respectively) and in the LIM domains (60 and 69%, respectively). By Northern hybridization analysis, we showed that expression is highest in adult mouse heart, low in skeletal muscle and undetectable in other adult mouse tissues. In situ hybridization in mouse embryos confirmed and extended these data by showing high expression of Oracle mRNA in atrial and ventricular myocardial cells from E8.5. From E9.5 low expression of Oracle mRNA was detectable in myotomes. These data suggest a role for Oracle in the early development and function of heart and skeletal muscle.

  9. Sustainable Land Management in the Lim River Basin

    NASA Astrophysics Data System (ADS)

    Grujic, Gordana; Petkovic, Sava; Tatomir, Uros

    2017-04-01

    In the cross-border belt between Serbia and Montenegro are located more than one hundred torrential water flows that belong to the Lim River Basin. Under extreme climate events they turned into floods of destructive power and great energy causing enormous damage on the environment and socio-economic development in the wider region of the Western Balkans. In addition, anthropogenic factors influence the land instability, erosion of river beds and loss of topsoil. Consequently, this whole area is affected by pluvial and fluvial erosion of various types and intensity. Terrain on the slopes over 5% is affected by intensive degree of erosion, while strong to medium degree covers 70% of the area. Moreover, in the Lim River Basin were built several hydro-energetic systems and accumulations which may to a certain extent successfully regulate the water regime downstream and to reduce the negative impact on the processes of water erosion. However, siltation of accumulation reduces their useful volume and threatens the basic functions (water reservoirs), especially those ones for water supply, irrigation and energy production that have lost a significant part of the usable volume due to accumulated sediments. Facing the negative impacts of climate change and human activities on the process of land degradation in the Lim River basin imposes urgent need of adequate preventive and protective measures at the local and regional level, which can be effectively applied only through enhanced cross-border cooperation among affected communities in the region. The following set of activities were analyzed to improve the actual management of river catchment: Identifying priorities in the spatial planning, land use and water resources management while respecting the needs of local people and the communities in the cross border region; development of cooperation and partnership between the local population, owners and users of real estate (pastures, agricultural land, forests, fisheries

  10. Role of the New South Wales Department of Primary Industries' Laboratory Information Management System (LIMS) in the 2007 equine influenza emergency animal disease response.

    PubMed

    Croft, M G; Fraser, G C; Gaul, W N

    2011-07-01

    A Laboratory Information Management System (LIMS) was used to manage the laboratory data and support planning and field activities as part of the response to the equine influenza outbreak in Australia in 2007. The database structure of the LIMS and the system configurations that were made to best handle the laboratory implications of the disease response are discussed. The operational aspects of the LIMS and the related procedures used at the laboratory to process the increased sample throughput are reviewed, as is the interaction of the LIMS with other corporate systems used in the management of the response. Outcomes from this tailored configuration and operation of the LIMS resulted in effective provision and control of the laboratory and laboratory information aspects of the response. The extent and immediate availability of the information provided from the LIMS was critical to some of the activities of key operatives involved in controlling the response. © 2011 The Authors. Australian Veterinary Journal © 2011 Australian Veterinary Association.

  11. 77 FR 6782 - In the Matter of: Kok Tong Lim, a/k/a Thomas Lim Blk 258A Compassvale Road #07-551 Singapore...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-09

    ... (50 U.S.C. 1701 et seq. (2000)) (``IEEPA''). Specifically, Lim conspired to illegally export wound... the Internal Security Act of 1950 (50 U.S.C. 783(b)), or section 38 of the Arms Export Control Act (22... Export Administration Act (50 U.S.C. app. Sec. Sec. 2401-2420 (2000)) (``EAA''). Since August 21, 2001...

  12. Comparative study of Arctic sea ice response from NEMO-LIM3 to two different atmospheric forcings

    NASA Astrophysics Data System (ADS)

    Massonnet, Francois; Fichefet, Thierry; Goosse, Hugues; Mathiot, Pierre; König Beatty, Christof; Vancoppenolle, Martin

    2010-05-01

    Sea ice plays a key role within the climate system as it is, e.g., an efficient barrier to transfers of heat, mass and momentum between atmosphere and ocean. In order to simulate the observed sea ice state, global Ocean General Circulation Models (OGCMs) must benefit from good quality atmospheric forcings. NEMO-LIM3 is one of those OGCMs. This model results from the coupling of the sea ice model LIM3 with the ocean model OPA. So far, the NCEP/NCAR reanalysis dataset (2-m atmospheric temperatures and 10-m wind speeds) has been used jointly with monthly climatologies of relative humidity, cloudiness and precipitation to set up and calibrate NEMO-LIM3. Clear biases in model outputs have been tentatively attributed to this forcing. Here, we investigate the consequences of using the ERA-40-based DFS4 forcing on an ORCA1 configuration (1° resolution), with focus on the Arctic sea ice. Using an adequate metric, we measure the discrepancies between the simulations resulting from the respective forcings. A particular attention is paid to the sea ice features along Siberia at the beginning of the 80s, as previous NEMO-LIM3 runs with the NCEP/NCAR forcing exhibit a significant overestimation of ice extent in this area during this time period.
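
    As a hedged illustration of what such a discrepancy metric could look like (the study's actual metric is not specified in this record), the Python sketch below computes the root-mean-square difference between monthly Arctic sea ice extents from two forced runs; the series values are toy numbers.

      import numpy as np

      # Root-mean-square difference between two simulated extent series
      # (an assumed, generic discrepancy metric; units follow the inputs).
      def rms_difference(extent_run_a, extent_run_b):
          a = np.asarray(extent_run_a, dtype=float)
          b = np.asarray(extent_run_b, dtype=float)
          return float(np.sqrt(np.mean((a - b) ** 2)))

      # Toy monthly Arctic sea ice extents (10^6 km^2) from two hypothetical forcings.
      ncep_run = [14.2, 15.0, 15.3, 14.1, 12.0, 9.8]
      dfs4_run = [13.8, 14.6, 15.1, 13.7, 11.5, 9.2]
      print(rms_difference(ncep_run, dfs4_run))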

  13. ms_lims, a simple yet powerful open source laboratory information management system for MS-driven proteomics.

    PubMed

    Helsens, Kenny; Colaert, Niklaas; Barsnes, Harald; Muth, Thilo; Flikka, Kristian; Staes, An; Timmerman, Evy; Wortelkamp, Steffi; Sickmann, Albert; Vandekerckhove, Joël; Gevaert, Kris; Martens, Lennart

    2010-03-01

    MS-based proteomics produces large amounts of mass spectra that require processing, identification and possibly quantification before interpretation can be undertaken. High-throughput studies require automation of these various steps, and management of the data in association with the results obtained. We here present ms_lims (http://genesis.UGent.be/ms_lims), a freely available, open-source system based on a central database to automate data management and processing in MS-driven proteomics analyses.

  14. Expression of LIM-homeodomain transcription factors in the developing and mature mouse retina

    PubMed Central

    Balasubramanian, Revathi; Bui, Andrew; Ding, Qian; Gan, Lin

    2014-01-01

    LIM-homeodomain (LIM-HD) transcription factors have been extensively studied for their role in the development of the central nervous system. Their function is key to several developmental events like cell proliferation, differentiation and subtype specification. However, their roles in retinal neurogenesis remain largely unknown. Here we report a detailed expression study of LIM-HD transcription factors LHX9 and LHX2, LHX3 and LHX4, and LHX6 in the developing and mature mouse retina using immunohistochemistry and in situ hybridization techniques. We show that LHX9 is expressed during the early stages of development in the retinal ganglion cell layer and the inner nuclear layer. We also show that LHX9 is expressed in a subset of amacrine cells in the adult retina. LHX2 is known to be expressed in retinal progenitor cells during development and in Müller glial cells and a subset of amacrine cells in the adult retina. We found that the LHX2 subset of amacrine cells is not cholinergic and that a very few of LHX2 amacrine cells express calretinin. LHX3 and LHX4 are expressed in a subset of bipolar cells in the adult retina. LHX6 is expressed in cells in the ganglion cell layer and the neuroblast layer starting at embryonic stage 13.5 (E13.5) and continues to be expressed in cells in the ganglion cell layer and inner nuclear layer, postnatally, suggesting its likely expression in amacrine cells or a subset thereof. Taken together, our comprehensive assay of expression patterns of LIM-HD transcription factors during mouse retinal development will help further studies elucidating their biological functions in the differentiation of retinal cell subtypes. PMID:24333658

  15. Clinical genomics information management software linking cancer genome sequence and clinical decisions.

    PubMed

    Watt, Stuart; Jiao, Wei; Brown, Andrew M K; Petrocelli, Teresa; Tran, Ben; Zhang, Tong; McPherson, John D; Kamel-Reid, Suzanne; Bedard, Philippe L; Onetto, Nicole; Hudson, Thomas J; Dancey, Janet; Siu, Lillian L; Stein, Lincoln; Ferretti, Vincent

    2013-09-01

    Using sequencing information to guide clinical decision-making requires coordination of a diverse set of people and activities. In clinical genomics, the process typically includes sample acquisition, template preparation, genome data generation, analysis to identify and confirm variant alleles, interpretation of clinical significance, and reporting to clinicians. We describe a software application developed within a clinical genomics study, to support this entire process. The software application tracks patients, samples, genomic results, decisions and reports across the cohort, monitors progress and sends reminders, and works alongside an electronic data capture system for the trial's clinical and genomic data. It incorporates systems to read, store, analyze and consolidate sequencing results from multiple technologies, and provides a curated knowledge base of tumor mutation frequency (from the COSMIC database) annotated with clinical significance and drug sensitivity to generate reports for clinicians. By supporting the entire process, the application provides deep support for clinical decision making, enabling the generation of relevant guidance in reports for verification by an expert panel prior to forwarding to the treating physician. Copyright © 2013 Elsevier Inc. All rights reserved.
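
    A minimal Python sketch of the kind of record linkage such an application must maintain, from patient to sample to confirmed variant to a draft clinician report, is shown below. All class names, fields and values are illustrative assumptions, not the schema of the software described in this record.

      from dataclasses import dataclass, field
      from typing import List, Optional

      # Illustrative record linkage: patient -> sample -> confirmed variant -> report.
      @dataclass
      class Variant:
          gene: str
          protein_change: str
          cosmic_frequency: Optional[float] = None      # curated knowledge-base annotation
          clinical_significance: Optional[str] = None

      @dataclass
      class Sample:
          sample_id: str
          variants: List[Variant] = field(default_factory=list)

      @dataclass
      class Patient:
          patient_id: str
          samples: List[Sample] = field(default_factory=list)

      def draft_report(patient: Patient) -> str:
          """Consolidate confirmed variants into a draft report for expert review."""
          lines = [f"Draft genomic report for {patient.patient_id}"]
          for s in patient.samples:
              for v in s.variants:
                  lines.append(f"  {s.sample_id}: {v.gene} {v.protein_change} "
                               f"({v.clinical_significance or 'unclassified'})")
          return "\n".join(lines)

      p = Patient("PT-001", [Sample("S-01", [Variant("KRAS", "G12D", 0.21, "actionable")])])
      print(draft_report(p))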

  16. Usability study of clinical exome analysis software: top lessons learned and recommendations.

    PubMed

    Shyr, Casper; Kushniruk, Andre; Wasserman, Wyeth W

    2014-10-01

    New DNA sequencing technologies have revolutionized the search for genetic disruptions. Targeted sequencing of all protein coding regions of the genome, called exome analysis, is actively used in research-oriented genetics clinics, with the transition to exomes as a standard procedure underway. This transition is challenging; identification of potentially causal mutation(s) amongst ∼10(6) variants requires specialized computation in combination with expert assessment. This study analyzes the usability of user interfaces for clinical exome analysis software. There are two study objectives: (1) To ascertain the key features of successful user interfaces for clinical exome analysis software based on the perspective of expert clinical geneticists, (2) To assess user-system interactions in order to reveal strengths and weaknesses of existing software, inform future design, and accelerate the clinical uptake of exome analysis. Surveys, interviews, and cognitive task analysis were performed for the assessment of two next-generation exome sequence analysis software packages. The subjects included ten clinical geneticists who interacted with the software packages using the "think aloud" method. Subjects' interactions with the software were recorded in their clinical office within an urban research and teaching hospital. All major user interface events (from the user interactions with the packages) were time-stamped and annotated with coding categories to identify usability issues in order to characterize desired features and deficiencies in the user experience. We detected 193 usability issues, the majority of which concern interface layout and navigation, and the resolution of reports. Our study highlights gaps in specific software features typical within exome analysis. The clinicians perform best when the flow of the system is structured into well-defined yet customizable layers for incorporation within the clinical workflow. The results highlight opportunities to

  17. LimTox: a web tool for applied text mining of adverse event and toxicity associations of compounds, drugs and genes

    PubMed Central

    Cañada, Andres; Rabal, Obdulia; Oyarzabal, Julen; Valencia, Alfonso

    2017-01-01

    Abstract A considerable effort has been devoted to systematically retrieving information on genes and proteins, as well as the relationships between them. Despite the importance of chemical compounds and drugs as central bio-entities in pharmacological and biological research, only a limited number of freely available chemical text-mining/search engine technologies are currently accessible. Here we present LimTox (Literature Mining for Toxicology), a web-based online biomedical search tool with a special focus on adverse hepatobiliary reactions. It integrates a range of text mining, named entity recognition and information extraction components. LimTox relies on machine-learning, rule-based, pattern-based and term-lookup strategies. The system processes scientific abstracts, a set of full-text articles and medical agency assessment reports. Although the main focus of LimTox is on adverse liver events, it also enables basic searches for other organ-level toxicity associations (nephrotoxicity, cardiotoxicity, thyrotoxicity and phospholipidosis). The tool supports specialized search queries for chemical compounds/drugs, genes (with additional emphasis on key enzymes in drug metabolism, namely the P450 cytochromes, CYPs) and biochemical liver markers. The LimTox website is free and open to all users, with no login requirement. LimTox can be accessed at: http://limtox.bioinfo.cnio.es PMID:28531339
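
    As a hedged illustration of the term-lookup component mentioned above, the Python sketch below matches a tiny dictionary of compound, gene and liver-marker names against free text. The lexicon entries and tagging scheme are assumptions for illustration; they are not LimTox's actual lexicons or pipeline.

      import re

      # Assumed toy lexicons; a real system would use curated dictionaries.
      LEXICON = {
          "compound": ["paracetamol", "amiodarone"],
          "gene": ["CYP2E1", "CYP3A4"],
          "liver_marker": ["ALT", "AST", "bilirubin"],
      }

      def lookup_terms(text):
          """Return (entity_type, matched_term, start, end) tuples from dictionary lookup."""
          hits = []
          for etype, terms in LEXICON.items():
              for term in terms:
                  pattern = r"\b" + re.escape(term) + r"\b"
                  for m in re.finditer(pattern, text, flags=re.IGNORECASE):
                      hits.append((etype, m.group(0), m.start(), m.end()))
          return sorted(hits, key=lambda h: h[2])

      text = "Paracetamol overdose raised ALT and AST; CYP2E1 mediates its bioactivation."
      for hit in lookup_terms(text):
          print(hit)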

  18. Inclusion through Access to Outdoor Education: Learning in Motion (LIM)

    ERIC Educational Resources Information Center

    Brodin, Jane

    2009-01-01

    Learning in Motion (LIM) was a European project involving seven partners in five countries: Sweden, Finland, Latvia, Germany and Greece. The project focused on inclusion and access to outdoor education and was financed by the European Commission within the framework of the Socrates-Grundtvig Programme. The aim of the project was to explore if and…

  19. From the SAIN,LIM system to the SENS algorithm: a review of a French approach of nutrient profiling.

    PubMed

    Tharrey, Marion; Maillot, Matthieu; Azaïs-Braesco, Véronique; Darmon, Nicole

    2017-08-01

    Nutrient profiling aims to classify or rank foods according to their nutritional composition to assist policies aimed at improving the nutritional quality of foods and diets. The present paper reviews a French approach of nutrient profiling by describing the SAIN,LIM system and its evolution from its early draft to the simplified nutrition labelling system (SENS) algorithm. Considered in 2010 by WHO as the 'French model' of nutrient profiling, SAIN,LIM classifies foods into four classes based on two scores: a nutrient density score (NDS) called SAIN and a score of nutrients to limit called LIM, and one threshold on each score. The system was first developed by the French Food Standard Agency in 2008 in response to the European regulation on nutrition and health claims (European Commission (EC) 1924/2006) to determine foods that may be eligible for bearing claims. Recently, the European regulation (EC 1169/2011) on the provision of food information to consumers allowed simplified nutrition labelling to facilitate consumer information and help them make fully informed choices. In that context, the SAIN,LIM was adapted to obtain the SENS algorithm, a system able to rank foods for simplified nutrition labelling. The implementation of the algorithm followed a step-by-step, systematic, transparent and logical process where shortcomings of the SAIN,LIM were addressed by integrating specificities of food categories in the SENS, reducing the number of nutrients, ordering the four classes and introducing European reference intakes. Through the French example, this review shows how an existing nutrient profiling system can be specifically adapted to support public health nutrition policies.
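
    A minimal Python sketch of the four-class logic described above, one threshold on each of the two scores, is given below; the threshold values and class labels are assumed for illustration and are not the regulatory values.

      # Assumed cut-offs and class labels; the official values are not reproduced here.
      SAIN_THRESHOLD = 5.0   # nutrient density score cut-off (assumption)
      LIM_THRESHOLD = 7.5    # nutrients-to-limit score cut-off (assumption)

      def sain_lim_class(sain, lim):
          """Classify a food into one of four classes from its SAIN and LIM scores."""
          high_sain = sain >= SAIN_THRESHOLD
          low_lim = lim <= LIM_THRESHOLD
          if high_sain and low_lim:
              return "Class 1: favourable SAIN, low LIM"
          if high_sain:
              return "Class 2: favourable SAIN, high LIM"
          if low_lim:
              return "Class 3: low SAIN, low LIM"
          return "Class 4: low SAIN, high LIM"

      print(sain_lim_class(sain=8.2, lim=3.1))   # -> Class 1
      print(sain_lim_class(sain=2.0, lim=12.4))  # -> Class 4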

  20. Slip control for LIM propelled transit vehicles

    NASA Astrophysics Data System (ADS)

    Wallace, A. K.; Parker, J. H.; Dawson, G. E.

    1980-09-01

    Short-stator linear induction motors, with an iron-backed aluminum sheet reaction rail and powered by a controlled inverter, have been selected as the propulsion system for transit vehicles in an intermediate-capacity system (12,000-20,000 passengers per hour per direction, pphpd). The linear induction motor is capable of adhesion-independent braking and acceleration levels that permit safe, close headways. In addition, simple control is possible, allowing moving-block automatic train control. This paper presents a slip frequency control scheme for the LIM. Experimental results for motoring and braking obtained from a test vehicle are also presented and compared with theoretical predictions.
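
    A minimal sketch of slip-frequency control, under the usual assumption that the inverter frequency is the sum of the mechanical frequency implied by vehicle speed and a commanded slip frequency, is given below in Python; the pole pitch and slip commands are illustrative values, not the test vehicle's parameters.

      # Assumed pole pitch and slip commands; not the test vehicle's parameters.
      POLE_PITCH_M = 0.25   # pole pitch tau (m), assumption

      def inverter_frequency(vehicle_speed_mps, slip_frequency_hz):
          """Supply frequency (Hz) = v / (2*tau) + commanded slip frequency."""
          mechanical_hz = vehicle_speed_mps / (2.0 * POLE_PITCH_M)
          return mechanical_hz + slip_frequency_hz

      # Motoring uses a positive slip command, braking a negative one.
      print(inverter_frequency(vehicle_speed_mps=15.0, slip_frequency_hz=+2.0))  # 32.0 Hz
      print(inverter_frequency(vehicle_speed_mps=15.0, slip_frequency_hz=-2.0))  # 28.0 Hz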

  1. LIM-domain proteins, LIMD1, Ajuba, and WTIP are required for microRNA-mediated gene silencing

    PubMed Central

    James, Victoria; Zhang, Yining; Foxler, Daniel E.; de Moor, Cornelia H.; Kong, Yi Wen; Webb, Thomas M.; Self, Tim J.; Feng, Yungfeng; Lagos, Dimitrios; Chu, Chia-Ying; Rana, Tariq M.; Morley, Simon J.; Longmore, Gregory D.; Bushell, Martin; Sharp, Tyson V.

    2010-01-01

    In recent years there have been major advances with respect to the identification of the protein components and mechanisms of microRNA (miRNA) mediated silencing. However, the complete and precise repertoire of components and mechanism(s) of action remain to be fully elucidated. Herein we reveal the identification of a family of three LIM domain-containing proteins, LIMD1, Ajuba and WTIP (Ajuba LIM proteins) as novel mammalian processing body (P-body) components, which highlight a novel mechanism of miRNA-mediated gene silencing. Furthermore, we reveal that LIMD1, Ajuba, and WTIP bind to Ago1/2, RCK, Dcp2, and eIF4E in vivo, that they are required for miRNA-mediated, but not siRNA-mediated gene silencing and that all three proteins bind to the mRNA 5′ m7GTP cap–protein complex. Mechanistically, we propose the Ajuba LIM proteins interact with the m7GTP cap structure via a specific interaction with eIF4E that prevents 4EBP1 and eIF4G interaction. In addition, these LIM-domain proteins facilitate miRNA-mediated gene silencing by acting as an essential molecular link between the translationally inhibited eIF4E-m7GTP-5′cap and Ago1/2 within the miRISC complex attached to the 3′-UTR of mRNA, creating an inhibitory closed-loop complex. PMID:20616046

  2. LimTox: a web tool for applied text mining of adverse event and toxicity associations of compounds, drugs and genes.

    PubMed

    Cañada, Andres; Capella-Gutierrez, Salvador; Rabal, Obdulia; Oyarzabal, Julen; Valencia, Alfonso; Krallinger, Martin

    2017-07-03

    A considerable effort has been devoted to systematically retrieving information on genes and proteins, as well as the relationships between them. Despite the importance of chemical compounds and drugs as central bio-entities in pharmacological and biological research, only a limited number of freely available chemical text-mining/search engine technologies are currently accessible. Here we present LimTox (Literature Mining for Toxicology), a web-based online biomedical search tool with a special focus on adverse hepatobiliary reactions. It integrates a range of text mining, named entity recognition and information extraction components. LimTox relies on machine-learning, rule-based, pattern-based and term-lookup strategies. The system processes scientific abstracts, a set of full-text articles and medical agency assessment reports. Although the main focus of LimTox is on adverse liver events, it also enables basic searches for other organ-level toxicity associations (nephrotoxicity, cardiotoxicity, thyrotoxicity and phospholipidosis). The tool supports specialized search queries for chemical compounds/drugs, genes (with additional emphasis on key enzymes in drug metabolism, namely the P450 cytochromes, CYPs) and biochemical liver markers. The LimTox website is free and open to all users, with no login requirement. LimTox can be accessed at: http://limtox.bioinfo.cnio.es. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  3. Comparison of horizontal winds from the LIMS satellite instrument with rocket measurements

    NASA Technical Reports Server (NTRS)

    Smith, A. K.; Bailey, P. L.

    1985-01-01

    Statistical results are given for a comparison between horizontal geostrophic winds computed from satellite height data and all available in situ rocket wind soundings during a 7-month period. The satellite data are the daily mapped fields from the Nimbus 7 Limb Infrared Monitor of the Stratosphere (LIMS) instrument, which extend from 100 to 0.1 mbar. Results indicate that in both the tropics and the extratropical Northern Hemisphere, the average zonal and meridional wind speeds agree to within 2-4 m/s throughout the stratosphere. The rms differences are much larger, with values of 5-10 m/s in the lower stratosphere, increasing to 20-40 m/s in the lower mesosphere. Time series show that LIMS and rocketsonde zonal wind speeds show coherent variations with temporal periods of 1-2 weeks and more, and both exhibit irregular variations on time scales of less than one week.
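
    As a hedged reminder of the geostrophic relation underlying such wind estimates, the Python sketch below evaluates u_g = -(g/f) dZ/dy and v_g = (g/f) dZ/dx on a toy height grid by finite differences; the grid spacing and height values are assumptions, not LIMS mapped fields.

      import numpy as np

      G = 9.80665        # gravitational acceleration, m s^-2
      OMEGA = 7.292e-5   # Earth's rotation rate, s^-1

      def geostrophic_wind(z, dx, dy, lat_deg):
          """Return (u_g, v_g) in m/s from a 2-D height field z (m) on a regular grid."""
          f = 2.0 * OMEGA * np.sin(np.radians(lat_deg))
          dzdy, dzdx = np.gradient(z, dy, dx)   # axis 0 is y (rows), axis 1 is x (columns)
          return -(G / f) * dzdy, (G / f) * dzdx

      # Toy 3x3 height field (m) with a pure north-south gradient at 45N, 100-km spacing.
      z = np.array([[5800.0, 5800.0, 5800.0],
                    [5820.0, 5820.0, 5820.0],
                    [5840.0, 5840.0, 5840.0]])
      u, v = geostrophic_wind(z, dx=1.0e5, dy=1.0e5, lat_deg=45.0)
      print(u[1, 1], v[1, 1])   # roughly -19 m/s and 0 m/s for this toy field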

  4. Optimized conditions for selective gold flotation by ToF-SIMS and ToF-LIMS

    NASA Astrophysics Data System (ADS)

    Chryssoulis, S. L.; Dimov, S. S.

    2004-06-01

    This work describes a comprehensive characterization of the factors controlling the floatability of free gold from flotation test using reagents (collectors) at plant concentration levels. A relationship between the collectors loadings on gold particles and their surface composition has been established. The findings of this study show that silver activates gold flotation and there is a strong correlation between the surface concentration of silver and the loading of certain collectors. The organic surface analysis was done by ToF-SIMS while the inorganic surface analysis was carried out by time-of-flight laser ionization mass spectrometry (ToF-LIMS). The developed testing protocol based on ToF-LIMS and ToF-SIMS complementary surface analysis allows for optimization of the flotation scheme and hence improved gold recovery.

  5. Pros and Cons of Clinical Pathway Software Management: A Qualitative Study.

    PubMed

    Aarnoutse, M F; Brinkkemper, S; de Mul, M; Askari, M

    2018-01-01

    In this study we aimed to assess the perceived effectiveness of clinical pathway management software for healthcare professionals. A case study of the clinical pathway management software program Check-It was performed in three departments at an academic medical center. Four months after the implementation of the software, interviews were held with healthcare professionals who work with the system. The questions were posed in a semi-structured interview format, and the participants were asked about the perceived positive or negative effects of Check-It and whether they thought the software was effective for them. The interviews were recorded, transcribed, and analyzed based on grounded theory using different coding techniques. Our results showed fewer overlooked tasks, pre-filled orders and letters, a better overview, and increased protocol insight as positive aspects of using the software; a lack of flexibility was experienced as a negative aspect.

  6. A Couple of "Lim (h→0)-Is-Missing" Problems

    ERIC Educational Resources Information Center

    Lau, Ko Hin

    2007-01-01

    Since most students "hate" the concept of limit, in order to make them "happier," this article suggests a couple of naive "lim (h→0)-is-missing" problems for them to try for fun. Indeed, differential functional equations that are related to difference quotients in calculus are studied in this paper. In particular, two interesting…
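
    One exercise of the kind alluded to here (this specific problem is an assumption, not quoted from the article) hands the student only the simplified difference quotient and asks what the missing limit supplies:

      \[
        \frac{(x+h)^{2}-x^{2}}{h} \;=\; 2x+h ,
        \qquad\text{whereas}\qquad
        f'(x) \;=\; \lim_{h\to 0}\frac{(x+h)^{2}-x^{2}}{h} \;=\; 2x .
      \]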

  7. ROCK1 and LIM kinase modulate retrovirus particle release and cell-cell transmission events.

    PubMed

    Wen, Xiaoyun; Ding, Lingmei; Wang, Jaang-Jiun; Qi, Mingli; Hammonds, Jason; Chu, Hin; Chen, Xuemin; Hunter, Eric; Spearman, Paul

    2014-06-01

    The assembly and release of retroviruses from the host cells require dynamic interactions between viral structural proteins and a variety of cellular factors. It has been long speculated that the actin cytoskeleton is involved in retrovirus production, and actin and actin-related proteins are enriched in HIV-1 virions. However, the specific role of actin in retrovirus assembly and release remains unknown. Here we identified LIM kinase 1 (LIMK1) as a cellular factor regulating HIV-1 and Mason-Pfizer monkey virus (M-PMV) particle release. Depletion of LIMK1 reduced not only particle output but also virus cell-cell transmission and was rescued by LIMK1 replenishment. Depletion of the upstream LIMK1 regulator ROCK1 inhibited particle release, as did a competitive peptide inhibitor of LIMK1 activity that prevented cofilin phosphorylation. Disruption of either ROCK1 or LIMK1 led to enhanced particle accumulation on the plasma membrane as revealed by total internal reflection fluorescence microscopy (TIRFM). Electron microscopy demonstrated a block to particle release, with clusters of fully mature particles on the surface of the cells. Our studies support a model in which ROCK1- and LIMK1-regulated phosphorylation of cofilin and subsequent local disruption of dynamic actin turnover play a role in retrovirus release from host cells and in cell-cell transmission events. Viruses often interact with the cellular cytoskeletal machinery in order to deliver their components to the site of assembly and budding. This study indicates that a key regulator of actin dynamics at the plasma membrane, LIM kinase, is important for the release of viral particles for HIV as well as for particle release by a distantly related retrovirus, Mason-Pfizer monkey virus. Moreover, disruption of LIM kinase greatly diminished the spread of HIV from cell to cell. These findings suggest that LIM kinase and its dynamic modulation of the actin cytoskeleton in the cell may be an important host factor for

  8. Software for MR image overlay guided needle insertions: the clinical translation process

    NASA Astrophysics Data System (ADS)

    Ungi, Tamas; U-Thainual, Paweena; Fritz, Jan; Iordachita, Iulian I.; Flammang, Aaron J.; Carrino, John A.; Fichtinger, Gabor

    2013-03-01

    PURPOSE: Needle guidance software using augmented reality image overlay was translated from the experimental phase to support preclinical and clinical studies. Major functional and structural changes were needed to meet clinical requirements. We present the process applied to fulfill these requirements, and selected features that may be applied in the translational phase of other image-guided surgical navigation systems. METHODS: We used an agile software development process for rapid adaptation to unforeseen clinical requests. The process is based on iterations of operating room test sessions, feedback discussions, and software development sprints. The open-source application framework of 3D Slicer and the NA-MIC kit provided sufficient flexibility and stable software foundations for this work. RESULTS: All requirements were addressed in a process with 19 operating room test iterations. Most features developed in this phase were related to workflow simplification and operator feedback. CONCLUSION: Efficient and affordable modifications were facilitated by an open source application framework and frequent clinical feedback sessions. Results of cadaver experiments show that software requirements were successfully solved after a limited number of operating room tests.

  9. The terminal basal mitosis of chicken retinal Lim1 horizontal cells is not sensitive to cisplatin-induced cell cycle arrest.

    PubMed

    Shirazi Fard, Shahrzad; Thyselius, Malin; All-Ericsson, Charlotta; Hallböök, Finn

    2014-01-01

    For proper development, cells need to coordinate proliferation and cell cycle-exit. This is mediated by a cascade of proteins making sure that each phase of the cell cycle is controlled before the initiation of the next. Retinal progenitor cells divide during the process of interkinetic nuclear migration, where they undergo S-phase on the basal side, followed by mitoses on the apical side of the neuroepithelium. The final cell cycle of chicken retinal horizontal cells (HCs) is an exception to this general cell cycle behavior. Lim1 expressing (+) horizontal progenitor cells (HPCs) have a heterogenic final cell cycle, with some cells undergoing a terminal mitosis on the basal side of the retina. The results in this study show that this terminal basal mitosis of Lim1+ HPCs is not dependent on Chk1/2 for its regulation compared to retinal cells undergoing interkinetic nuclear migration. Neither activating nor blocking Chk1 had an effect on the basal mitosis of Lim1+ HPCs. Furthermore, the Lim1+ HPCs were not sensitive to cisplatin-induced DNA damage and were able to continue into mitosis in the presence of γ-H2AX without activation of caspase-3. However, Nutlin3a-induced expression of p21 did reduce the mitoses, suggesting the presence of a functional p53/p21 response in HPCs. In contrast, the apical mitoses were blocked upon activation of either Chk1/2 or p21, indicating the importance of these proteins during the process of interkinetic nuclear migration. Inhibiting Cdk1 blocked M-phase transition both for apical and basal mitoses. This confirmed that the cyclin B1-Cdk1 complex was active and functional during the basal mitosis of Lim1+ HPCs. The regulation of the final cell cycle of Lim1+ HPCs is of particular interest since it has been shown that the HCs are able to sustain persistent DNA damage, remain in the cell cycle for an extended period of time and, consequently, survive for months.

  10. The LIM Protein Zyxin Binds CARP-1 and Promotes Apoptosis

    PubMed Central

    Hervy, Martial; Hoffman, Laura M.; Jensen, Christopher C.; Smith, Mark; Beckerle, Mary C.

    2010-01-01

    Zyxin is a dual-function LIM domain protein that regulates actin dynamics in response to mechanical stress and shuttles between focal adhesions and the cell nucleus. Here we show that zyxin contributes to UV-induced apoptosis. Exposure of wild-type fibroblasts to UV-C irradiation results in apoptotic cell death, whereas cells harboring a homozygous disruption of the zyxin gene display a statistically significant survival advantage. To gain insight into the molecular mechanism by which zyxin promotes apoptotic signaling, we expressed an affinity-tagged zyxin variant in zyxin-null cells and isolated zyxin-associated proteins from cell lysates under physiological conditions. A 130-kDa protein that was co-isolated with zyxin was identified by microsequence analysis as the Cell Cycle and Apoptosis Regulator Protein-1 (CARP-1). CARP-1 associates with the LIM region of zyxin. Zyxin lacking the CARP-1 binding region shows reduced proapoptotic activity in response to UV-C irradiation. We demonstrate that CARP-1 is a nuclear protein. Zyxin is modified by phosphorylation in cells exposed to UV-C irradiation, and nuclear accumulation of zyxin is induced by UV-C exposure. These findings highlight a novel mechanism for modulating the apoptotic response to UV irradiation. PMID:20852740

  11. PASSIM--an open source software system for managing information in biomedical studies.

    PubMed

    Viksna, Juris; Celms, Edgars; Opmanis, Martins; Podnieks, Karlis; Rucevskis, Peteris; Zarins, Andris; Barrett, Amy; Neogi, Sudeshna Guha; Krestyaninova, Maria; McCarthy, Mark I; Brazma, Alvis; Sarkans, Ugis

    2007-02-09

    One of the crucial aspects of day-to-day laboratory information management is collection, storage and retrieval of information about research subjects and biomedical samples. An efficient link between sample data and experiment results is absolutely imperative for a successful outcome of a biomedical study. Currently available software solutions are largely limited to large-scale, expensive commercial Laboratory Information Management Systems (LIMS). Acquiring such a LIMS can indeed bring laboratory information management to a higher level, but it often requires a substantial investment of time, effort and funds that is not always available. There is a clear need for lightweight open source systems for patient and sample information management. We present a web-based tool for submission, management and retrieval of sample and research subject data. The system secures confidentiality by separating anonymized sample information from individuals' records. It is simple and generic, and can be customised for various biomedical studies. Information can be both entered and accessed using the same web interface. User groups and their privileges can be defined. The system is open-source and is supplied with an on-line tutorial and necessary documentation. It has proven to be successful in a large international collaborative project. The presented system closes the gap between the need and the availability of lightweight software solutions for managing information in biomedical studies involving human research subjects.
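
    A minimal sketch of the confidentiality design described above, with personal records and anonymized sample records in separate tables linked only by an opaque subject code, is given below in Python with SQLite; the table and column names are illustrative assumptions, not PASSIM's actual schema.

      import sqlite3

      # Table and column names are illustrative assumptions.
      conn = sqlite3.connect(":memory:")
      conn.executescript("""
      CREATE TABLE subject_identity (           -- restricted-access table
          subject_code  TEXT PRIMARY KEY,       -- opaque pseudonym
          full_name     TEXT,
          date_of_birth TEXT
      );
      CREATE TABLE sample (                     -- general-access, anonymized table
          sample_id     TEXT PRIMARY KEY,
          subject_code  TEXT REFERENCES subject_identity(subject_code),
          tissue_type   TEXT,
          collected_on  TEXT
      );
      """)
      conn.execute("INSERT INTO subject_identity VALUES ('SUBJ-0001', 'Jane Doe', '1970-01-01')")
      conn.execute("INSERT INTO sample VALUES ('SAMP-0001', 'SUBJ-0001', 'plasma', '2007-02-09')")

      # Laboratory users query only the anonymized table; no personal data is exposed.
      for row in conn.execute("SELECT sample_id, tissue_type FROM sample"):
          print(row)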

  12. Using Clinical Decision Support Software in Health Insurance Company

    NASA Astrophysics Data System (ADS)

    Konovalov, R.; Kumlander, Deniss

    This paper proposes the idea to use Clinical Decision Support software in Health Insurance Company as a tool to reduce the expenses related to Medication Errors. As a prove that this class of software will help insurance companies reducing the expenses, the research was conducted in eight hospitals in United Arab Emirates to analyze the amount of preventable common Medication Errors in drug prescription.

  13. Statistical Validation for Clinical Measures: Repeatability and Agreement of Kinect™-Based Software

    PubMed Central

    Tello, Emanuel; Rodrigo, Alejandro; Valentinuzzi, Max E.

    2018-01-01

    Background The rehabilitation process is a fundamental stage for recovery of people's capabilities. However, the evaluation of the process is performed by physiatrists and medical doctors, mostly based on their observations, that is, a subjective appreciation of the patient's evolution. This paper proposes a tracking platform of the movement made by an individual's upper limb using Kinect sensor(s) to be applied for the patient during the rehabilitation process. The main contribution is the development of quantifying software and the statistical validation of its performance, repeatability, and clinical use in the rehabilitation process. Methods The software determines joint angles and upper limb trajectories for the construction of a specific rehabilitation protocol and quantifies the treatment evolution. In turn, the information is presented via a graphical interface that allows the recording, storage, and report of the patient's data. For clinical purposes, the software information is statistically validated with three different methodologies, comparing the measures with a goniometer in terms of agreement and repeatability. Results The agreement of joint angles measured with the proposed software and goniometer is evaluated with Bland-Altman plots; all measurements fell well within the limits of agreement, meaning interchangeability of both techniques. Additionally, the results of Bland-Altman analysis of repeatability show 95% confidence. Finally, the physiotherapists' qualitative assessment shows encouraging results for the clinical use. Conclusion The main conclusion is that the software is capable of offering a clinical history of the patient and is useful for quantification of the rehabilitation success. The simplicity, low cost, and visualization possibilities enhance the use of the software Kinect for rehabilitation and other applications, and the expert's opinion endorses the choice of our approach for clinical practice. Comparison of the new measurement

  14. Statistical Validation for Clinical Measures: Repeatability and Agreement of Kinect™-Based Software.

    PubMed

    Lopez, Natalia; Perez, Elisa; Tello, Emanuel; Rodrigo, Alejandro; Valentinuzzi, Max E

    2018-01-01

    The rehabilitation process is a fundamental stage for recovery of people's capabilities. However, the evaluation of the process is performed by physiatrists and medical doctors, mostly based on their observations, that is, a subjective appreciation of the patient's evolution. This paper proposes a tracking platform of the movement made by an individual's upper limb using Kinect sensor(s) to be applied for the patient during the rehabilitation process. The main contribution is the development of quantifying software and the statistical validation of its performance, repeatability, and clinical use in the rehabilitation process. The software determines joint angles and upper limb trajectories for the construction of a specific rehabilitation protocol and quantifies the treatment evolution. In turn, the information is presented via a graphical interface that allows the recording, storage, and report of the patient's data. For clinical purposes, the software information is statistically validated with three different methodologies, comparing the measures with a goniometer in terms of agreement and repeatability. The agreement of joint angles measured with the proposed software and goniometer is evaluated with Bland-Altman plots; all measurements fell well within the limits of agreement, meaning interchangeability of both techniques. Additionally, the results of Bland-Altman analysis of repeatability show 95% confidence. Finally, the physiotherapists' qualitative assessment shows encouraging results for the clinical use. The main conclusion is that the software is capable of offering a clinical history of the patient and is useful for quantification of the rehabilitation success. The simplicity, low cost, and visualization possibilities enhance the use of the software Kinect for rehabilitation and other applications, and the expert's opinion endorses the choice of our approach for clinical practice. Comparison of the new measurement technique with established
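
    As a hedged illustration of the Bland-Altman agreement analysis named above, the Python sketch below computes the mean bias and 95% limits of agreement (bias ± 1.96 SD of the paired differences); the angle values are toy numbers, not the study's measurements.

      import numpy as np

      def bland_altman(method_a, method_b):
          """Return (bias, lower limit, upper limit) of agreement for paired measurements."""
          a = np.asarray(method_a, dtype=float)
          b = np.asarray(method_b, dtype=float)
          diff = a - b
          bias = diff.mean()
          sd = diff.std(ddof=1)
          return bias, bias - 1.96 * sd, bias + 1.96 * sd

      # Toy paired joint-angle readings (degrees); not the study's data.
      kinect_deg     = [88.0, 92.5, 45.2, 61.0, 130.4, 75.8]
      goniometer_deg = [87.1, 93.0, 44.0, 62.3, 129.0, 76.5]
      bias, lo, hi = bland_altman(kinect_deg, goniometer_deg)
      print(f"bias = {bias:.2f} deg, limits of agreement = ({lo:.2f}, {hi:.2f}) deg")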

  15. The intelligent clinical laboratory as a tool to increase cancer care management productivity.

    PubMed

    Mohammadzadeh, Niloofar; Safdari, Reza

    2014-01-01

    Studies of the causes of cancer, early detection, prevention or treatment need accurate, comprehensive, and timely cancer data. The clinical laboratory provides important cancer information that physicians need and that influences clinical decisions regarding treatment, diagnosis and patient monitoring. Poor communication between health care providers and clinical laboratory personnel can lead to medical errors and wrong decisions in providing cancer care. Because of the key impact of laboratory information on cancer diagnosis and treatment, the quality of the tests and lab reports and appropriate lab management are very important. A laboratory information management system (LIMS) can play an important role in diagnosis, provide fast and effective access to cancer data, decrease redundancy and costs, and facilitate the integration and collection of data from different types of instruments and systems. Despite these significant advantages, a LIMS is limited by factors such as difficulty adapting to new instruments that may change existing work processes. Applying intelligent software alongside existing information systems can remove these restrictions and offers further benefits, including adding non-laboratory-generated information to reports, facilitating decision making, and improving the quality and productivity of cancer care services. Laboratory systems must be flexible enough to change and capable of developing with, and benefiting from, intelligent devices. Intelligent laboratory information management systems need to benefit from informatics tools and the latest technologies, such as open source software. The aim of this commentary is to survey the application, opportunities and necessity of the intelligent clinical laboratory as a tool to increase cancer care management productivity.

  16. Semi-Automatic Segmentation Software for Quantitative Clinical Brain Glioblastoma Evaluation

    PubMed Central

    Zhu, Y; Young, G; Xue, Z; Huang, R; You, H; Setayesh, K; Hatabu, H; Cao, F; Wong, S.T.

    2012-01-01

    Rationale and Objectives Quantitative measurement provides essential information about disease progression and treatment response in patients with Glioblastoma multiforme (GBM). The goal of this paper is to present and validate a software pipeline for semi-automatic GBM segmentation, called AFINITI (Assisted Follow-up in NeuroImaging of Therapeutic Intervention), using clinical data from GBM patients. Materials and Methods Our software adopts current state-of-the-art tumor segmentation algorithms and combines them into one clinically usable pipeline. The advantages of both traditional voxel-based and deformable shape-based segmentation are embedded into the software pipeline. The former provides an automatic tumor segmentation scheme based on T1- and T2-weighted MR brain data, and the latter refines the segmentation results with minimal manual input. Results Twenty-six clinical MR brain images of GBM patients were processed and compared with manual results. The results can be visualized using the embedded graphical user interface (GUI). Conclusion Validation results using clinical GBM data showed high correlation between the AFINITI results and manual annotation. Compared to voxel-wise segmentation, AFINITI yielded more accurate results in segmenting the enhanced GBM from multimodality MRI data. The proposed pipeline could provide additional information for interpreting MR brain images in neuroradiology. PMID:22591720
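
    The voxel-based starting point of such a pipeline can be sketched, in a very simplified form, as intensity thresholding followed by morphological clean-up; the snippet below (Python with NumPy/SciPy, a synthetic image, and an arbitrary threshold rule) is a generic illustration and not the AFINITI algorithm.

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(0)
        # Synthetic "post-contrast" slice: background noise plus one bright blob
        # standing in for an enhancing tumour region.
        image = rng.normal(100.0, 10.0, size=(128, 128))
        yy, xx = np.mgrid[0:128, 0:128]
        image[(yy - 60) ** 2 + (xx - 70) ** 2 < 15 ** 2] += 80.0

        # Voxel-based step: threshold well above the background intensity (arbitrary rule).
        threshold = image.mean() + 3 * image.std()
        mask = image > threshold

        # Clean-up: remove speckle and keep the largest connected component,
        # a crude stand-in for the interactive shape-based refinement described above.
        mask = ndimage.binary_opening(mask, iterations=1)
        labels, n = ndimage.label(mask)
        if n > 0:
            sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
            mask = labels == (np.argmax(sizes) + 1)

        print("segmented voxels:", int(mask.sum()))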

  17. Interaction of subway LIM vehicle with ballasted track in polygonal wheel wear development

    NASA Astrophysics Data System (ADS)

    Li, Ling; Xiao, Xin-Biao; Jin, Xue-Song

    2011-04-01

    This paper develops a coupled dynamics model for a linear induction motor (LIM) vehicle and a subway track to investigate the influence of polygonal wheels of the vehicle on the dynamic behavior of the system. In the model, the vehicle is modeled as a multi-body system with 35 degrees of freedom. A Timoshenko beam is used to model the rails, which are discretely supported by sleepers. The sleepers are modeled as rigid bodies with their vertical, lateral, and rolling motions being considered. In order to simulate the vehicle running along the track, a moving sleeper support model is introduced to simulate the excitation by the discrete sleeper supports, in which the sleepers are assumed to move backward at a constant speed that is the same as the train speed. The Hertzian contact theory and the Shen-Hedrick-Elkins model are utilized to deal with the normal dynamic forces and the tangential forces between wheels and rails, respectively. In order to better characterize the linear metro system (LMS), Euler beam theory with a modal superposition method is used to model the LIM and the reaction plate (RP). The vertical electromagnetic force and the lateral restoring force between the LIM and RP are also taken into consideration. The former has gap-varying nonlinear characteristics, whilst the latter is considered as a constant restoring force of 1 kN. The numerical analysis considers the effect of the excitation due to polygonal wheels on the dynamic behavior of the system at different wear stages, in which the polygonal wheel-tread wear data used are measured directly at the subway site.

  18. A manual for a laboratory information management system (LIMS) for light stable isotopes

    USGS Publications Warehouse

    Coplen, Tyler B.

    1997-01-01

    The reliability and accuracy of isotopic data can be improved by utilizing database software to (i) store information about samples, (ii) store the results of mass spectrometric isotope-ratio analyses of samples, (iii) calculate analytical results using standardized algorithms stored in a database, (iv) normalize stable isotopic data to international scales using isotopic reference materials, and (v) generate multi-sheet paper templates for convenient sample loading of automated mass-spectrometer sample preparation manifolds. Such a database program is presented herein. Major benefits of this system include (i) an increase in laboratory efficiency, (ii) reduction in the use of paper, (iii) reduction in workload due to the elimination or reduction of retyping of data by laboratory personnel, and (iv) decreased errors in data reported to sample submitters. Such a database provides a complete record of when and how often laboratory reference materials have been analyzed and provides a record of what correction factors have been used through time. It provides an audit trail for stable isotope laboratories. Since the original publication of the manual for LIMS for Light Stable Isotopes, the isotopes 3H, 3He, and 14C, and the chlorofluorocarbons (CFCs), CFC-11, CFC-12, and CFC-113, have been added to this program.
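
    The normalization step in item (iv) is commonly implemented as a two-point linear normalization against two isotopic reference materials; the plain-Python sketch below uses the textbook formula with hypothetical measured values for the VSMOW and SLAP reference waters and is not taken from the LIMS code itself.

        def normalize_two_point(delta_measured, ref1_measured, ref1_true,
                                ref2_measured, ref2_true):
            """Two-point normalization of a measured delta value to an international
            scale, using two reference materials whose accepted (true) values bracket
            the samples. Generic textbook formula, not the algorithm stored in LIMS."""
            slope = (ref2_true - ref1_true) / (ref2_measured - ref1_measured)
            return ref1_true + slope * (delta_measured - ref1_measured)

        # Hypothetical example: normalizing a delta-18O value to the VSMOW-SLAP scale
        # using made-up measured values for the VSMOW and SLAP reference waters.
        print(normalize_two_point(-12.34,
                                  ref1_measured=0.15,  ref1_true=0.0,     # VSMOW
                                  ref2_measured=-54.8, ref2_true=-55.5))  # SLAP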

  19. A manual for a Laboratory Information Management System (LIMS) for light stable isotopes

    USGS Publications Warehouse

    Coplen, Tyler B.

    1998-01-01

    The reliability and accuracy of isotopic data can be improved by utilizing database software to (i) store information about samples, (ii) store the results of mass spectrometric isotope-ratio analyses of samples, (iii) calculate analytical results using standardized algorithms stored in a database, (iv) normalize stable isotopic data to international scales using isotopic reference materials, and (v) generate multi-sheet paper templates for convenient sample loading of automated mass-spectrometer sample preparation manifolds. Such a database program is presented herein. Major benefits of this system include (i) an increase in laboratory efficiency, (ii) reduction in the use of paper, (iii) reduction in workload due to the elimination or reduction of retyping of data by laboratory personnel, and (iv) decreased errors in data reported to sample submitters. Such a database provides a complete record of when and how often laboratory reference materials have been analyzed and provides a record of what correction factors have been used through time. It provides an audit trail for stable isotope laboratories. Since the original publication of the manual for LIMS for Light Stable Isotopes, the isotopes 3H, 3He, and 14C, and the chlorofluorocarbons (CFCs), CFC-11, CFC-12, and CFC-113, have been added to this program.

  20. Evaluation of features to support safety and quality in general practice clinical software

    PubMed Central

    2011-01-01

    Background Electronic prescribing is now the norm in many countries. We wished to find out if clinical software systems used by general practitioners in Australia include features (functional capabilities and other characteristics) that facilitate improved patient safety and care, with a focus on quality use of medicines. Methods Seven clinical software systems used in general practice were evaluated. Fifty software features that were previously rated as likely to have a high impact on safety and/or quality of care in general practice were tested and are reported here. Results The range of results for the implementation of 50 features across the 7 clinical software systems was as follows: 17-31 features (34-62%) were fully implemented, 9-13 (18-26%) partially implemented, and 9-20 (18-40%) not implemented. Key findings included: Access to evidence based drug and therapeutic information was limited. Decision support for prescribing was available but varied markedly between systems. During prescribing there was potential for medicine mis-selection in some systems, and linking a medicine with its indication was optional. The definition of 'current medicines' versus 'past medicines' was not always clear. There were limited resources for patients, and some medicines lists for patients were suboptimal. Results were provided to the software vendors, who were keen to improve their systems. Conclusions The clinical systems tested lack some of the features expected to support patient safety and quality of care. Standards and certification for clinical software would ensure that safety features are present and that there is a minimum level of clinical functionality that clinicians could expect to find in any system.

  1. Lightning Mapping and Leader Propagation Reconstruction using LOFAR-LIM

    NASA Astrophysics Data System (ADS)

    Hare, B.; Ebert, U.; Rutjes, C.; Scholten, O.; Trinh, G. T. N.

    2017-12-01

    LOFAR (LOw Frequency ARray) is a radio telescope that consists of a large number of dual-polarized antennas spread over the northern Netherlands and beyond. The LOFAR for Lightning Imaging project (LOFAR-LIM) has successfully used LOFAR to map out lightning in the Netherlands. Since LOFAR covers a large frequency range (10-90 MHz), has antennas spread over a large area, and saves the raw trace data from the antennas, LOFAR-LIM can combine all the strongest aspects of both lightning mapping arrays and lightning interferometers. These aspects include a nanosecond resolution between pulses, nanosecond timing accuracy, and an ability to map lightning in all 3 spatial dimensions and time. LOFAR should be able to map out overhead lightning with a spatial accuracy on the order of meters. The large amount of complex data provided by LOFAR has presented new data-processing challenges, such as handling the time offsets between stations with large baselines and locating as many sources as possible. New algorithms to handle these challenges have been developed and will be discussed. Since the antennas are dual-polarized, all three components of the electric field can be extracted and the structure of the R.F. pulses can be investigated at a large number of distances and angles relative to the lightning source, potentially allowing for modeling of lightning current distributions relevant to the 10 to 90 MHz frequency range. R.F. pulses due to leader propagation will be presented, which show a complex sub-structure, indicating intricate physics that could potentially be reconstructed.
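
    Locating a source from pulse arrival times at many antennas, as described above, can be posed as a nonlinear least-squares problem; the Python sketch below (NumPy and SciPy, synthetic antenna positions and arrival times) illustrates only the principle and is unrelated to the actual LOFAR-LIM imaging code.

        import numpy as np
        from scipy.optimize import least_squares

        C = 299792458.0  # speed of light, m/s
        rng = np.random.default_rng(0)

        # Hypothetical antenna positions (m) and measured pulse arrival times (s)
        antennas = np.array([[0., 0., 0.], [3000., 0., 0.], [0., 3000., 0.],
                             [3000., 3000., 0.], [1500., 1500., 10.], [-2000., 1000., 0.]])
        true_src = np.array([1200., 2500., 4000.])   # synthetic "lightning" source
        t0_true = 1.0e-3                              # unknown emission time
        times = t0_true + np.linalg.norm(antennas - true_src, axis=1) / C
        times += rng.normal(0.0, 1e-9, times.size)    # ~1 ns timing noise

        def residuals(p):
            # p = (x, y, z, t0); residual = measured time - predicted time
            src, t0 = p[:3], p[3]
            return times - (t0 + np.linalg.norm(antennas - src, axis=1) / C)

        fit = least_squares(residuals, x0=[0.0, 0.0, 1000.0, 0.0])
        print("recovered source position (m):", np.round(fit.x[:3], 1))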

  2. Dynamic software design for clinical exome and genome analyses: insights from bioinformaticians, clinical geneticists, and genetic counselors.

    PubMed

    Shyr, Casper; Kushniruk, Andre; van Karnebeek, Clara D M; Wasserman, Wyeth W

    2016-03-01

    The transition of whole-exome and whole-genome sequencing (WES/WGS) from the research setting to routine clinical practice remains challenging. With almost no previous research specifically assessing interface designs and functionalities of WES and WGS software tools, the authors set out to ascertain perspectives from healthcare professionals in distinct domains on optimal clinical genomics user interfaces. A series of semi-scripted focus groups, structured around professional challenges encountered in clinical WES and WGS, were conducted with bioinformaticians (n = 8), clinical geneticists (n = 9), genetic counselors (n = 5), and general physicians (n = 4). Contrary to popular existing system designs, bioinformaticians preferred command line over graphical user interfaces for better software compatibility and customization flexibility. Clinical geneticists and genetic counselors desired an overarching interactive graphical layout to prioritize candidate variants--a "tiered" system where only functionalities relevant to the user domain are made accessible. They favored a system capable of retrieving consistent representations of external genetic information from third-party sources. To streamline collaboration and patient exchanges, the authors identified user requirements toward an automated reporting system capable of summarizing key evidence-based clinical findings among the vast array of technical details. Successful adoption of a clinical WES/WGS system is heavily dependent on its ability to address the diverse necessities and predilections among specialists in distinct healthcare domains. Tailored software interfaces suitable for each group is likely more appropriate than the current popular "one size fits all" generic framework. This study provides interfaces for future intervention studies and software engineering opportunities. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  3. Dynamic software design for clinical exome and genome analyses: insights from bioinformaticians, clinical geneticists, and genetic counselors

    PubMed Central

    Shyr, Casper; Kushniruk, Andre; van Karnebeek, Clara D.M.

    2016-01-01

    Background The transition of whole-exome and whole-genome sequencing (WES/WGS) from the research setting to routine clinical practice remains challenging. Objectives With almost no previous research specifically assessing interface designs and functionalities of WES and WGS software tools, the authors set out to ascertain perspectives from healthcare professionals in distinct domains on optimal clinical genomics user interfaces. Methods A series of semi-scripted focus groups, structured around professional challenges encountered in clinical WES and WGS, were conducted with bioinformaticians (n = 8), clinical geneticists (n = 9), genetic counselors (n = 5), and general physicians (n = 4). Results Contrary to popular existing system designs, bioinformaticians preferred command line over graphical user interfaces for better software compatibility and customization flexibility. Clinical geneticists and genetic counselors desired an overarching interactive graphical layout to prioritize candidate variants—a “tiered” system where only functionalities relevant to the user domain are made accessible. They favored a system capable of retrieving consistent representations of external genetic information from third-party sources. To streamline collaboration and patient exchanges, the authors identified user requirements toward an automated reporting system capable of summarizing key evidence-based clinical findings among the vast array of technical details. Conclusions Successful adoption of a clinical WES/WGS system is heavily dependent on its ability to address the diverse necessities and predilections among specialists in distinct healthcare domains. Tailored software interfaces suitable for each group is likely more appropriate than the current popular “one size fits all” generic framework. This study provides interfaces for future intervention studies and software engineering opportunities. PMID:26117142

  4. The pH sensibility of actin-bundling LIM proteins is governed by the acidic properties of their C-terminal domain.

    PubMed

    Moes, Danièle; Hoffmann, Céline; Dieterle, Monika; Moreau, Flora; Neumann, Katrin; Papuga, Jessica; Furtado, Angela Tavares; Steinmetz, André; Thomas, Clément

    2015-08-19

    Actin-bundling Arabidopsis LIM proteins are subdivided into two subfamilies differing in their pH sensitivity. Widely-expressed WLIMs are active under low and high physiologically-relevant pH conditions, whereas pollen-enriched PLIMs are inactivated by pH values above 6.8. By a domain swapping approach we identified the C-terminal (Ct) domain of PLIMs as the domain responsible for pH responsiveness. Remarkably, this domain conferred pH sensitivity to LIM proteins, when provided "in trans" (i.e., as a single, independent, peptide), indicating that it operates through the interaction with another domain. An acidic 6xc-Myc peptide functionally mimicked the Ct domain of PLIMs and efficiently inhibited LIM actin bundling activity under high pH conditions. Together, our data suggest a model where PLIMs are regulated by an intermolecular interaction between their acidic Ct domain and another, yet unidentified, domain. Copyright © 2015 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.

  5. [Software for illustrating a cost-quality balance carried out by clinical laboratory practice].

    PubMed

    Nishibori, Masahiro; Asayama, Hitoshi; Kimura, Satoshi; Takagi, Yasushi; Hagihara, Michio; Fujiwara, Mutsunori; Yoneyama, Akiko; Watanabe, Takashi

    2010-09-01

    We have no proper reference indicating the quality of clinical laboratory practice, one that clearly illustrates that better medical tests require greater expense. The Japanese Society of Laboratory Medicine, concerned about the recent difficult medical economy, issued a committee report proposing a guideline for evaluating good laboratory practice. According to the guideline, we developed software that illustrates the cost-quality balance achieved by clinical laboratory practice. We encountered a number of controversial problems, for example, how to measure and weight each quality-related factor, how to calculate the costs of a laboratory test, and how to take the characteristics of a clinical laboratory into account. Consequently, we finished only prototype software within the given period and budget. In this paper, the software implementation of the guideline and the above-mentioned problems are summarized. Aiming to stimulate these discussions, the working software will be put on the Society's homepage for trial.

  6. LIMS Instrument Package (LIP) balloon experiment: Nimbus 7 satellite correlative temperature, ozone, water vapor, and nitric acid measurements

    NASA Technical Reports Server (NTRS)

    Lee, R. B., III; Gandrud, B. W.; Robbins, D. E.; Rossi, L. C.; Swann, N. R. W.

    1982-01-01

    The Limb Infrared Monitor of the Stratosphere (LIMS) Instrument Package (LIP) balloon experiment was used to obtain correlative temperature, ozone, water vapor, and nitric acid data at altitudes between 10 and 36 kilometers. The performance of the LIMS sensor flown on the Nimbus 7 satellite was assessed. The LIP consists of a modified electrochemical concentration cell ozonesonde, an ultraviolet absorption photometer for ozone, a water vapor infrared radiometer sonde, a chemical absorption filter instrument for nitric acid vapor, and an infrared radiometer for nitric acid vapor. The LIMS Instrument Package, its correlative sensors, and the data obtained from one engineering flight and four correlative flights are described.

  7. LIM domain protein TES changes its conformational states in different cellular compartments.

    PubMed

    Zhong, Yingli; Zhu, Jiaolian; Wang, Yan; Zhou, Jianlin; Ren, Kaiqun; Ding, Xiaofeng; Zhang, Jian

    2009-01-01

    The human TESTIN (TES) is a putative tumor suppressor and localizes to the cytoplasm as a component of focal adhesions and cell contacts. TES contains a PET domain in the NH(2)-terminus and three tandem LIM domains in the COOH-terminus. It has been hypothesized that interactions between two termini of TES might lead to a "closed" conformational state of the protein. Here, we provide evidence for different conformational states of TES. We confirmed that the NH(2)-terminus of TES can interact with its third LIM domain in the COOH-terminus by GST pull-down assays. In addition, antisera against the full-length or two truncations of TES were prepared to examine the relationship between the conformation and cellular distribution of the protein. We found that these antisera recognize different regions of TES and showed that TES is co-localised with the marker protein B23 in nucleolus, in addition to its localization in endoplasmic reticulum (ER). Furthermore, our co-immunoprecipitation (co-IP) analysis of TES and B23 demonstrated their co-existence in the same complex. Taken together, our results suggest that TES has different conformational states in different cellular compartments, and a "closed" conformational state of TES may be involved in nucleolar localization.

  8. Global lower mesospheric water vapor revealed by LIMS observations

    NASA Technical Reports Server (NTRS)

    Gordley, L. L.; Russell, J. M., III; Remsberg, E. E.

    1985-01-01

    The Limb Infrared Monitor of the Stratosphere (LIMS) water vapor channel data analysis has been extended from the 1.0 mb level (about 48 km) to the 0.3 mb level (about 60 km) through a radiance averaging procedure and a better understanding of systematic errors. The data show H2O mixing ratio peaks near the 0.5 mb level varying from 4 to 7 ppmv with latitude and season. Above this level the mixing ratio drops off quickly with altitude, but, owing to experimental uncertainties, at an uncertain rate. The stratospheric results are virtually the same as those determined from the archived LIMS results, with a tropical hygropause and enhanced H2O concentrations at lower levels at high winter latitudes.

  9. Custom software development for use in a clinical laboratory

    PubMed Central

    Sinard, John H.; Gershkovich, Peter

    2012-01-01

    In-house software development for use in a clinical laboratory is a controversial issue. Many of the objections raised are based on outdated software development practices, an exaggeration of the risks involved, and an underestimation of the benefits that can be realized. Buy versus build analyses typically do not consider total costs of ownership, and unfortunately decisions are often made by people who are not directly affected by the workflow obstacles or benefits that result from those decisions. We have been developing custom software for clinical use for over a decade, and this article presents our perspective on this practice. A complete analysis of the decision to develop or purchase must ultimately examine how the end result will mesh with the departmental workflow, and custom-developed solutions typically can have the greater positive impact on efficiency and productivity, substantially altering the decision balance sheet. Involving the end-users in preparation of the functional specifications is crucial to the success of the process. A large development team is not needed, and even a single programmer can develop significant solutions. Many of the risks associated with custom development can be mitigated by a well-structured development process, use of open-source tools, and embracing an agile development philosophy. In-house solutions have the significant advantage of being adaptable to changing departmental needs, contributing to efficient and higher quality patient care. PMID:23372985

  10. Custom software development for use in a clinical laboratory.

    PubMed

    Sinard, John H; Gershkovich, Peter

    2012-01-01

    In-house software development for use in a clinical laboratory is a controversial issue. Many of the objections raised are based on outdated software development practices, an exaggeration of the risks involved, and an underestimation of the benefits that can be realized. Buy versus build analyses typically do not consider total costs of ownership, and unfortunately decisions are often made by people who are not directly affected by the workflow obstacles or benefits that result from those decisions. We have been developing custom software for clinical use for over a decade, and this article presents our perspective on this practice. A complete analysis of the decision to develop or purchase must ultimately examine how the end result will mesh with the departmental workflow, and custom-developed solutions typically can have the greater positive impact on efficiency and productivity, substantially altering the decision balance sheet. Involving the end-users in preparation of the functional specifications is crucial to the success of the process. A large development team is not needed, and even a single programmer can develop significant solutions. Many of the risks associated with custom development can be mitigated by a well-structured development process, use of open-source tools, and embracing an agile development philosophy. In-house solutions have the significant advantage of being adaptable to changing departmental needs, contributing to efficient and higher quality patient care.

  11. Effects of polar stratospheric clouds in the Nimbus 7 LIMS Version 6 data set

    NASA Astrophysics Data System (ADS)

    Remsberg, Ellis; Harvey, V. Lynn

    2016-07-01

    The historic Limb Infrared Monitor of the Stratosphere (LIMS) measurements of 1978-1979 from the Nimbus 7 satellite were re-processed with Version 6 (V6) algorithms and archived in 2002. The V6 data set employs updated radiance registration methods, improved spectroscopic line parameters, and a common vertical resolution for all retrieved parameters. Retrieved profiles are spaced about every 1.6° of latitude along orbits and include the additional parameter of geopotential height. Profiles of O3 are sensitive to perturbations from emissions of polar stratospheric clouds (PSCs). This work presents results of implementing a first-order screening for effects of PSCs using simple algorithms based on vertical gradients of the O3 mixing ratio. Their occurrences are compared with the co-located, retrieved temperatures and related to the temperature thresholds needed for saturation of H2O and/or HNO3 vapor onto PSC particles. Observed daily locations where the major PSC screening criteria are satisfied are validated against PSCs observed with the Stratospheric Aerosol Monitor (SAM) II experiment also on Nimbus 7. Remnants of emissions from PSCs are characterized for O3 and HNO3 following the screening. PSCs may also impart a warm bias in the co-located LIMS temperatures, but by no more than 1-2 K at the altitudes of where effects of PSCs are a maximum in the ozone; thus, no PSC screening was applied to the V6 temperatures. Minimum temperatures vary between 187 and 194 K and often occur 1 to 2 km above where PSC effects are first identified in the ozone (most often between about 21 and 28 hPa). Those temperature-pressure values are consistent with conditions for the existence of nitric acid trihydrate (NAT) mixtures and to a lesser extent of super-cooled ternary solution (STS) droplets. A local, temporary uptake of HNO3 vapor of order 1-3 ppbv is indicated during mid-January for the 550 K surface. Seven-month time series of the distributions of LIMS O3 and HNO3 are shown
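
    The screening idea described above, flagging levels where the vertical gradient of the O3 mixing ratio is anomalously large, can be sketched as follows (Python with NumPy, a synthetic profile, and a purely hypothetical threshold); the actual V6 screening criteria are not reproduced here.

        import numpy as np

        # Synthetic O3 profile: mixing ratio (ppmv) on pressure levels (hPa),
        # with an artificial spike imitating PSC emission contaminating the retrieval.
        pressure = np.array([100., 68., 46., 31., 21., 14., 10., 6.8, 4.6])
        o3_ppmv  = np.array([0.5, 1.2, 2.4, 4.0, 9.5, 5.8, 6.5, 7.0, 7.2])

        # Work in log-pressure so the gradient approximates d(O3)/dz up to a constant.
        log_p = np.log(pressure)
        gradient = np.gradient(o3_ppmv, log_p)

        THRESHOLD = 5.0  # hypothetical gradient magnitude marking a suspect level
        suspect = np.abs(gradient) > THRESHOLD

        for p, flag in zip(pressure, suspect):
            if flag:
                print(f"possible PSC contamination near {p:g} hPa")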

  12. Comparative study of sea ice dynamics simulations with a Maxwell elasto-brittle rheology and the elastic-viscous-plastic rheology in NEMO-LIM3

    NASA Astrophysics Data System (ADS)

    Raulier, Jonathan; Dansereau, Véronique; Fichefet, Thierry; Legat, Vincent; Weiss, Jérôme

    2017-04-01

    Sea ice is a highly dynamical environment characterized by a dense mesh of fractures or leads, constantly opening and closing over short time scales. This characteristic geomorphology is linked to the existence of linear kinematic features, which consist of quasi-linear patterns emerging from the observed strain rate field of sea ice. Standard rheologies used in most state-of-the-art sea ice models, like the well-known elastic-viscous-plastic rheology, are thought to misrepresent those linear kinematic features and the observed statistical distribution of deformation rates. Dedicated rheologies built to catch the processes known to be at the origin of the formation of leads are developed but still need evaluations on the global scale. One of them, based on a Maxwell elasto-brittle formulation, is being integrated in the NEMO-LIM3 global ocean-sea ice model (www.nemo-ocean.eu; www.elic.ucl.ac.be/lim). In the present study, we compare the results of the sea ice model LIM3 obtained with two different rheologies: the elastic-viscous-plastic rheology commonly used in LIM3 and a Maxwell elasto-brittle rheology. This comparison is focused on the statistical characteristics of the simulated deformation rate and on the ability of the model to reproduce the existence of leads within the ice pack. The impact of the lead representation on fluxes between ice, atmosphere and ocean is also assessed.

  13. Extended cooperation in clinical studies through exchange of CDISC metadata between different study software solutions.

    PubMed

    Kuchinke, W; Wiegelmann, S; Verplancke, P; Ohmann, C

    2006-01-01

    Our objectives were to analyze the possibility of an exchange of an entire clinical study between two different and independent study software solutions. The question addressed was whether a software-independent transfer of study metadata can be performed without programming efforts and with software routinely used for clinical research. Study metadata was transferred with ODM standard (CDISC). Study software systems employed were MACRO (InferMed) and XTrial (XClinical). For the Proof of Concept, a test study was created with MACRO and exported as ODM. For modification and validation of the ODM export file XML-Spy (Altova) and ODM-Checker (XML4Pharma) were used. Through exchange of a complete clinical study between two different study software solutions, a Proof of Concept of the technical feasibility of a system-independent metadata exchange was conducted successfully. The interchange of study metadata between two different systems at different centers was performed with minimal expenditure. A small number of mistakes had to be corrected in order to generate a syntactically correct ODM file and a "vendor extension" had to be inserted. After these modifications, XTrial exhibited the study, including all data fields, correctly. However, the optical appearance of both CRFs (case report forms) was different. ODM can be used as an exchange format for clinical studies between different study software. Thus, new forms of cooperation through exchange of metadata seem possible, for example the joint creation of electronic study protocols or CRFs at different research centers. Although the ODM standard represents a clinical study completely, it contains no information about the representation of data fields in CRFs.
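
    For readers unfamiliar with ODM, the study metadata travels as XML; the short Python sketch below reads form definitions from a small hand-written ODM-like fragment. The namespace URI is assumed to correspond to ODM 1.3, and the snippet performs none of the validation done with XML-Spy or the ODM-Checker.

        import xml.etree.ElementTree as ET

        # Tiny hand-written fragment shaped like an ODM metadata export (illustrative
        # only; a real export from MACRO or XTrial carries far more detail).
        ODM_XML = """
        <ODM xmlns="http://www.cdisc.org/ns/odm/v1.3">
          <Study OID="ST.TEST">
            <MetaDataVersion OID="MDV.1" Name="Test study">
              <FormDef OID="F.VISIT1" Name="Visit 1 CRF" Repeating="No"/>
              <FormDef OID="F.AE" Name="Adverse events" Repeating="Yes"/>
            </MetaDataVersion>
          </Study>
        </ODM>
        """

        NS = {"odm": "http://www.cdisc.org/ns/odm/v1.3"}   # assumed ODM 1.3 namespace
        root = ET.fromstring(ODM_XML)

        # List the study and its form definitions (roughly the CRF pages).
        for study in root.findall("odm:Study", NS):
            print("Study OID:", study.get("OID"))
            for form in study.findall(".//odm:FormDef", NS):
                print("  FormDef:", form.get("OID"), "-", form.get("Name"))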

  14. The effect of advertising in clinical software on general practitioners' prescribing behaviour.

    PubMed

    Henderson, Joan; Miller, Graeme; Pan, Ying; Britt, Helena

    2008-01-07

    To assess the effect of pharmaceutical advertising embedded in clinical software on the prescribing behaviour of general practitioners. Secondary analysis of data from a random sample of 1336 Australian GPs who participated in Bettering the Evaluation and Care of Health, a national continuous cross-sectional survey of general practice activity, between November 2003 and March 2005. The prescribing behaviour of participants who used the advertising software was compared with that of participants who did not, for seven pharmaceutical products advertised continually throughout the study period. Prescription for advertised product as a proportion (%) of prescriptions for all pharmaceutical products in the same generic class or group. GP age, practice location, accreditation status, patient bulk-billing status and hours worked were significantly associated (P < 0.05) with use of advertising software. We found no significant differences, either before or after adjustment for these confounders, in the prescribing rate of Lipitor (adjusted odds ratio [AOR], 0.90; P = 0.26); Micardis (AOR, 0.98; P = 0.91); Mobic (AOR, 1.02; P = 0.89); Norvasc (AOR, 1.02; P = 0.91); Natrilix (AOR, 0.80; P = 0.32); or Zanidip (AOR, 0.88; P = 0.47). GPs using advertising software prescribed Nexium significantly less often than those not using advertising software (AOR, 0.78; P = 0.02). When all advertised products were combined and compared with products that were not advertised, no difference in the overall prescribing behaviour was demonstrated (AOR, 0.96; P = 0.42). Exposure to advertisements in clinical software has little influence on the prescribing behaviour of GPs.
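
    Adjusted odds ratios of the kind reported above are typically obtained from logistic regression with the confounders entered as covariates; the Python sketch below (pandas and statsmodels, entirely synthetic data and invented column names) shows the general recipe rather than the authors' analysis.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Synthetic encounter-level data: whether the advertised product was prescribed,
        # whether the GP used advertising software, plus one confounder (GP age).
        rng = np.random.default_rng(0)
        n = 2000
        df = pd.DataFrame({
            "advertised_rx": rng.integers(0, 2, n),   # 1 = advertised product prescribed
            "ad_software":   rng.integers(0, 2, n),   # 1 = GP uses advertising software
            "gp_age":        rng.normal(50, 8, n),
        })

        model = smf.logit("advertised_rx ~ ad_software + gp_age", data=df).fit(disp=False)
        aor = np.exp(model.params["ad_software"])     # adjusted odds ratio for software use
        ci_low, ci_high = np.exp(model.conf_int().loc["ad_software"])
        print(f"AOR = {aor:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f}), "
              f"p = {model.pvalues['ad_software']:.3f}")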

  15. An update on the LIM and SH3 domain protein 1 (LASP1): a versatile structural, signaling, and biomarker protein

    PubMed Central

    Orth, Martin F.; Cazes, Alex; Butt, Elke; Grunewald, Thomas G. P.

    2015-01-01

    The gene encoding the LIM and SH3 domain protein (LASP1) was cloned two decades ago from a cDNA library of breast cancer metastases. As the first protein of a class comprising one N-terminal LIM and one C-terminal SH3 domain, LASP1 founded a new LIM-protein subfamily of the nebulin group. Since its discovery LASP1 proved to be an extremely versatile protein because of its exceptional structure allowing interaction with various binding partners, its ubiquitous expression in normal tissues, albeit with distinct expression patterns, and its ability to transmit signals from the cytoplasm into the nucleus. As a result, LASP1 plays key roles in cell structure, physiological processes, and cell signaling. Furthermore, LASP1 overexpression contributes to cancer aggressiveness hinting to a potential value of LASP1 as a cancer biomarker. In this review we summarize published data on structure, regulation, function, and expression pattern of LASP1, with a focus on its role in human cancer and as a biomarker protein. In addition, we provide a comprehensive transcriptome analysis of published microarrays (n=2,780) that illustrates the expression profile of LASP1 in normal tissues and its overexpression in a broad range of human cancer entities. PMID:25622104

  16. A software tool to analyze clinical workflows from direct observations.

    PubMed

    Schweitzer, Marco; Lasierra, Nelia; Hoerbst, Alexander

    2015-01-01

    Observational data of clinical processes need to be managed in a convenient way, so that process information is reliable, valid and viable for further analysis. However, existing tools for allocating observations fail in systematic data collection of specific workflow recordings. We present a software tool which was developed to facilitate the analysis of clinical process observations. The tool was successfully used in the project OntoHealth, to build, store and analyze observations of diabetes routine consultations.

  17. [Development of integrated support software for clinical nutrition].

    PubMed

    Siquier Homar, Pedro; Pinteño Blanco, Manel; Calleja Hernández, Miguel Ángel; Fernández Cortés, Francisco; Martínez Sotelo, Jesús

    2015-09-01

    The aim was to develop an integrated computer software application for specialized nutritional support, integrated into the electronic clinical record, which automatically and early detects patients who are undernourished or at risk of developing undernourishment, identifying points of opportunity for improvement and evaluating the results. The quality standards published by the Nutrition Work Group of the Spanish Society of Hospital Pharmacy (SEFH) and the recommendations of the Pharmacy Group of the Spanish Society of Parenteral and Enteral Nutrition (SENPE) have been taken into account. According to these quality standards, nutritional support has to include the following healthcare stages or sub-processes: nutritional screening, nutritional assessment, plan for nutritional care, prescription, preparation, and administration. The software allows a specific nutritional assessment to be conducted, in an automated way, for patients at nutritional risk, implementing a nutritional treatment plan if necessary, providing follow-up and traceability of the outcomes derived from the implementation of improvement actions, and quantifying how close our practice is to the established standard. The software allows specialized nutritional support to be standardized from a multidisciplinary point of view, introducing the concept of quality control per process and including the patient as the main customer. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.

  18. SU-F-J-72: A Clinical Usable Integrated Contouring Quality Evaluation Software for Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, S; Dolly, S; Cai, B

    Purpose: To introduce the Auto Contour Evaluation (ACE) software, a clinically usable, user-friendly, efficient, all-in-one toolbox for automatically identifying common contouring errors in radiotherapy treatment planning using supervised machine learning techniques. Methods: ACE is developed in C# using the Microsoft .NET framework and Windows Presentation Foundation (WPF) for elegant GUI design and smooth GUI transition animations, through the integration of graphics engines and high dots-per-inch (DPI) settings on modern high-resolution monitors. The industry-standard Model-View-ViewModel (MVVM) design pattern was chosen as the major architecture of ACE for a neat coding structure, deep modularization, easy maintainability, and seamless communication with other clinical software. ACE consists of 1) a patient data importing module integrated with the clinical patient database server, 2) a module that simultaneously displays 2D DICOM images and RT structures, 3) a 3D RT structure visualization module using the Visualization Toolkit (VTK) library, and 4) a contour evaluation module using supervised pattern recognition algorithms to detect contouring errors and display detection results. ACE relies on supervised learning algorithms to handle all image-processing and data-processing jobs. Implementations of the related algorithms are powered by the Accord.NET scientific computing library for better efficiency and effectiveness. Results: ACE can take a patient's CT images and RT structures from commercial treatment planning software via direct user input or from the patient database. All functionalities, including 2D and 3D image visualization and RT contour error detection, have been demonstrated with real clinical patient cases. Conclusion: ACE implements supervised learning algorithms and combines image processing and graphical visualization modules for RT contour verification. ACE has great potential for automated radiotherapy contouring quality

  19. Human ESP1/CRP2, a member of the LIM domain protein family: Characterization of the cDNA and assignment of the gene locus to chromosome 14q32.3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karim, Mohammad Azharul; Ohta, Kohji; Matsuda, Ichiro

    1996-01-15

    The LIM domain is present in a wide variety of proteins with diverse functions and exhibits characteristic arrangements of Cys and His residues with a novel zinc-binding motif. LIM domain proteins have been implicated in development, cell regulation, and cell structure. A LIM domain protein was identified by screening a human cDNA library with rat cysteine-rich intestinal protein (CRIP) as a probe, under conditions of low stringency. Comparison of the predicted amino acid sequence with several LIM domain proteins revealed 93% of the residues to be identical to the rat LIM domain protein termed ESP1 or CRP2. Thus, the protein is hereafter referred to as human ESP1/CRP2. The cDNA encompasses a 1171-base region, including 26, 624, and 521 bases in the 5'-noncoding region, coding region, and 3'-noncoding region, respectively, and encodes the entire ESP1/CRP2 protein. The ESP1/CRP2 protein has two LIM domains, and each shares 35.1% and 77 or 79% identical residues with human cysteine-rich protein (CRP) and rat CRIP, respectively. Northern blot analysis of ESP1/CRP2 in various human tissues showed distinct tissue distributions compared with CRP and CRIP, suggesting that each might serve related but specific roles in tissue organization or function. Using a panel of human-rodent somatic cell hybrids, the ESP1/CRP2 locus was assigned to chromosome 14. Fluorescence in situ hybridization, using cDNA and a genomic DNA fragment of ESP1/CRP2 as probes, confirms this assignment and relegates the regional localization to band 14q32.3. 47 refs., 7 figs.

  20. The Hawaiian Algal Database: a laboratory LIMS and online resource for biodiversity data

    PubMed Central

    Wang, Norman; Sherwood, Alison R; Kurihara, Akira; Conklin, Kimberly Y; Sauvage, Thomas; Presting, Gernot G

    2009-01-01

    Background Organization and presentation of biodiversity data is greatly facilitated by databases that are specially designed to allow easy data entry and organized data display. Such databases also have the capacity to serve as Laboratory Information Management Systems (LIMS). The Hawaiian Algal Database was designed to showcase specimens collected from the Hawaiian Archipelago, enabling users around the world to compare their specimens with our photographs and DNA sequence data, and to provide lab personnel with an organizational tool for storing various biodiversity data types. Description We describe the Hawaiian Algal Database, a comprehensive and searchable database containing photographs and micrographs, geo-referenced collecting information, taxonomic checklists and standardized DNA sequence data. All data for individual samples are linked through unique accession numbers. Users can search online for sample information by accession number, numerous levels of taxonomy, or collection site. At the present time the database contains data representing over 2,000 samples of marine, freshwater and terrestrial algae from the Hawaiian Archipelago. These samples are primarily red algae, although other taxa are being added. Conclusion The Hawaiian Algal Database is a digital repository for Hawaiian algal samples and acts as a LIMS for the laboratory. Users can make use of the online search tool to view and download specimen photographs and micrographs, DNA sequences and relevant habitat data, including georeferenced collecting locations. It is publicly available at . PMID:19728892
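
    The core of such a biodiversity LIMS is a relational schema in which every data type is tied to a unique accession number; the sketch below (Python's built-in sqlite3, with invented table and column names) illustrates that linkage and is not the actual Hawaiian Algal Database schema.

        import sqlite3

        conn = sqlite3.connect(":memory:")   # throwaway database for illustration
        conn.executescript("""
        CREATE TABLE specimen (
            accession     TEXT PRIMARY KEY,   -- unique accession number links all data
            taxon         TEXT,
            habitat       TEXT,
            latitude      REAL,
            longitude     REAL,
            collected_on  TEXT
        );
        CREATE TABLE dna_sequence (
            accession  TEXT REFERENCES specimen(accession),
            marker     TEXT,                  -- e.g. a barcode locus
            sequence   TEXT
        );
        """)
        conn.execute("INSERT INTO specimen VALUES "
                     "('HA-0001', 'Rhodophyta sp.', 'marine', 21.3, -157.8, '2008-06-01')")
        conn.execute("INSERT INTO dna_sequence VALUES ('HA-0001', 'rbcL', 'ATGGCT...')")

        # A search by taxonomy joined back to its sequences, as a web front end might do.
        for row in conn.execute("""SELECT s.accession, s.taxon, d.marker
                                   FROM specimen s JOIN dna_sequence d USING (accession)
                                   WHERE s.taxon LIKE 'Rhodophyta%'"""):
            print(row)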

  1. Description of data on the Nimbus 7 LIMS map archive tape: Ozone and nitric acid

    NASA Technical Reports Server (NTRS)

    Remsberg, E. E.; Kurzeja, R. J.; Haggard, K. V.; Russell, J. M., III; Gordley, L. L.

    1986-01-01

    The Nimbus 7 Limb Infrared Monitor of the Stratosphere (LIMS) data set has been processed into a Fourier coefficient representation with a Kalman filter algorithm applied to profile data at individual latitudes and pressure levels. The algorithm produces synoptic data at noon Greenwich Mean Time (GMT) from the asynoptic orbital profiles. This form of the data set is easy to use and is appropriate for time series analysis and further data manipulation and display. Ozone and nitric acid results are grouped together in this report because the LIMS vertical field of views (FOV's) and analysis characteristics for these species are similar. A comparison of the orbital input data with mixing ratios derived from Kalman filter coefficients indicates errors in mixing ratio of generally less than 5 percent, with 15 percent being a maximum error. The high quality of the mapped data was indicated by coherence of both the phases and the amplitudes of waves with latitude and pressure. Examples of the mapped fields are presented, and details are given concerning the importance of diurnal variations, the removal of polar stratospheric cloud signatures, and the interpretation of bias effects in the data near the tops of profiles.
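
    The Kalman-filter mapping of asynoptic profiles into Fourier coefficients can be illustrated with a toy sequential estimator for a zonal mean plus a wave-1 component at a single latitude and pressure level; the sketch below (Python with NumPy, synthetic data, arbitrary noise settings) follows the generic Kalman update equations and is not the LIMS processing code.

        import numpy as np

        rng = np.random.default_rng(1)

        # True field at one latitude/pressure level: zonal mean plus a wave-1 component.
        a0, a1, b1 = 5.0, 1.2, -0.6               # "true" Fourier coefficients (e.g. ppmv)
        lons = rng.uniform(0, 2 * np.pi, 200)     # asynoptic sampling longitudes
        obs = a0 + a1 * np.cos(lons) + b1 * np.sin(lons) + rng.normal(0, 0.3, lons.size)

        # Kalman filter for the state x = [a0, a1, b1] with a random-walk model.
        x = np.zeros(3)                      # initial coefficient estimate
        P = np.eye(3) * 10.0                 # initial (large) uncertainty
        Q = np.eye(3) * 1e-4                 # process noise: coefficients drift slowly
        R = 0.3 ** 2                         # observation-noise variance

        for lon, y in zip(lons, obs):
            P = P + Q                                        # predict (state unchanged)
            H = np.array([1.0, np.cos(lon), np.sin(lon)])    # observation operator
            K = P @ H / (H @ P @ H + R)                      # Kalman gain
            x = x + K * (y - H @ x)                          # update state
            P = P - np.outer(K, H) @ P                       # update covariance

        print("estimated [a0, a1, b1]:", np.round(x, 2))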

  2. Clinical records anonymisation and text extraction (CRATE): an open-source software system.

    PubMed

    Cardinal, Rudolf N

    2017-04-26

    Electronic medical records contain information of value for research, but contain identifiable and often highly sensitive confidential information. Patient-identifiable information cannot in general be shared outside clinical care teams without explicit consent, but anonymisation/de-identification allows research uses of clinical data without explicit consent. This article presents CRATE (Clinical Records Anonymisation and Text Extraction), an open-source software system with separable functions: (1) it anonymises or de-identifies arbitrary relational databases, with sensitivity and precision similar to previous comparable systems; (2) it uses public secure cryptographic methods to map patient identifiers to research identifiers (pseudonyms); (3) it connects relational databases to external tools for natural language processing; (4) it provides a web front end for research and administrative functions; and (5) it supports a specific model through which patients may consent to be contacted about research. Creation and management of a research database from sensitive clinical records with secure pseudonym generation, full-text indexing, and a consent-to-contact process is possible and practical using entirely free and open-source software.
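
    Cryptographic mapping of patient identifiers to research pseudonyms, as in point (2), is commonly done with a keyed hash; the sketch below (Python standard library, with hypothetical key handling) shows the generic HMAC construction and is not a description of CRATE's actual implementation.

        import hmac
        import hashlib

        # In practice the secret key would live in a protected configuration store,
        # accessible only to the de-identification service, never to researchers.
        SECRET_KEY = b"replace-with-a-long-random-secret"   # hypothetical key

        def pseudonym(patient_identifier: str, length: int = 16) -> str:
            """Deterministic research identifier derived from a patient identifier.
            The same patient always maps to the same pseudonym, but reversing the
            mapping requires the secret key (generic HMAC-SHA256 construction)."""
            digest = hmac.new(SECRET_KEY, patient_identifier.encode("utf-8"),
                              hashlib.sha256).hexdigest()
            return digest[:length].upper()

        print(pseudonym("NHS-1234567890"))   # e.g. a 16-character hex research ID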

  3. Automated pre-processing and multivariate vibrational spectra analysis software for rapid results in clinical settings

    NASA Astrophysics Data System (ADS)

    Bhattacharjee, T.; Kumar, P.; Fillipe, L.

    2018-02-01

    Vibrational spectroscopy, especially FTIR and Raman, has shown enormous potential in disease diagnosis, especially in cancers. Its potential for detecting varied pathological conditions is regularly reported. However, to prove its applicability in clinics, large multi-center, multi-national studies need to be undertaken, and these will result in enormous amounts of data. A parallel effort to develop analytical methods, including user-friendly software that can quickly pre-process data and subject them to the required multivariate analysis, is warranted in order to obtain results in real time. This study reports a MATLAB-based script that can automatically import data, preprocess spectra (interpolation, derivatives, normalization), and then carry out Principal Component Analysis (PCA) followed by Linear Discriminant Analysis (LDA) of the first 10 PCs, all with a single click. The software has been verified on data obtained from cell lines, animal models, and in vivo patient datasets, and gives results comparable to the Minitab 16 software. The software can import a variety of file extensions, .asc, .txt, .xls, and many others. Options to ignore noisy data, plot all possible graphs with PCA factors 1 to 5, and save loading factors, confusion matrices, and other parameters are also present. The software can provide results for a dataset of 300 spectra within 0.01 s. We believe that the software will be vital not only in clinical trials using vibrational spectroscopic data but also for obtaining rapid results when these tools are translated into clinics.
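
    The pre-processing and PCA-LDA chain described above maps naturally onto a scikit-learn pipeline; the Python sketch below (synthetic spectra, arbitrary pre-processing parameters) is a generic re-implementation of that sequence, not the MATLAB script itself.

        import numpy as np
        from scipy.signal import savgol_filter
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import FunctionTransformer, StandardScaler
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        # Synthetic "spectra": 300 samples x 800 wavenumbers, two classes, with a
        # small class-dependent band added so the classifier has something to find.
        X = rng.normal(size=(300, 800))
        y = rng.integers(0, 2, 300)
        X[y == 1, 390:410] += 0.8

        # First-derivative Savitzky-Golay smoothing, a common spectral pre-processing step.
        derivative = FunctionTransformer(
            lambda s: savgol_filter(s, window_length=9, polyorder=3, deriv=1, axis=1))

        pipeline = make_pipeline(
            derivative,
            StandardScaler(),                 # normalization of each variable
            PCA(n_components=10),             # keep the first 10 principal components
            LinearDiscriminantAnalysis(),     # LDA on the PCA scores
        )

        scores = cross_val_score(pipeline, X, y, cv=5)
        print("cross-validated accuracy:", scores.round(2))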

  4. Implementation of Task-Tracking Software for Clinical IT Management.

    PubMed

    Purohit, Anne-Maria; Brutscheck, Clemens; Prokosch, Hans-Ulrich; Ganslandt, Thomas; Schneider, Martin

    2017-01-01

    Often in clinical IT departments, many different methods and IT systems are used for task-tracking and project organization. Based on managers' personal preferences and knowledge about project management methods, tools differ from team to team and even from employee to employee. This causes communication problems, especially when tasks need to be done in cooperation with different teams. Monitoring tasks and resources becomes impossible: there are no defined deliverables, which prevents reliable deadlines. Because of these problems, we implemented task-tracking software which is now in use across all seven teams at the University Hospital Erlangen. Over a period of seven months, a working group defined types of tasks (project, routine task, etc.), workflows, and views to monitor the tasks of the 7 divisions, 20 teams and 340 different IT services. The software has been in use since December 2016.

  5. Laboratory and software applications for clinical trials: the global laboratory environment.

    PubMed

    Briscoe, Chad

    2011-11-01

    The Applied Pharmaceutical Software Meeting is held annually. It is sponsored by The Boston Society, a not-for-profit organization that coordinates a series of meetings within the global pharmaceutical industry. The meeting generally focuses on laboratory applications, but in recent years has expanded to include some software applications for clinical trials. The 2011 meeting emphasized the global laboratory environment. Global clinical trials generate massive amounts of data in many locations that must be centralized and processed for efficient analysis. Thus, the meeting had a strong focus on establishing networks and systems for dealing with the computer infrastructure to support such environments. In addition to the globally installed laboratory information management system, electronic laboratory notebook and other traditional laboratory applications, cloud computing is quickly becoming the answer to provide efficient, inexpensive options for managing the large volumes of data and computing power, and thus it served as a central theme for the meeting.

  6. Ex-vivo transduced autologous skin fibroblasts expressing human Lim Mineralization Protein-3 efficiently form new bone in animal models

    PubMed Central

    Lattanzi, Wanda; Parrilla, Claudio; Fetoni, Annarita; Logroscino, Giandomenico; Straface, Giuseppe; Pecorini, Giovanni; Stigliano, Egidio; Tampieri, Anna; Bedini, Rossella; Pecci, Raffaella; Michetti, Fabrizio; Gambotto, Andrea; Robbins, Paul D.; Pola, Enrico

    2012-01-01

    Local gene transfer of the human LIM Mineralization Protein (LMP), a novel intracellular positive regulator of the osteoblast differentiation program, can induce efficient bone formation in rodents. In order to develop a clinically relevant gene therapy approach to facilitate bone healing, we have used primary dermal fibroblasts transduced ex vivo with Ad.LMP3 and seeded on an hydroxyapatite/collagen matrix prior to autologous implantation. Here we demonstrate that genetically modified autologous dermal fibroblasts expressing Ad.LMP-3 are able to induce ectopic bone formation following implantation of the matrix into the mouse triceps and paravertebral muscles. Moreover, implantation of the Ad.LMP-3-modified dermal fibroblasts into a rat mandibular bone critical size defect model results in efficient healing as determined by X-ray, histology and three dimensional micro computed tomography (3DμCT). These results demonstrate the effectiveness of the non-secreted intracellular osteogenic factor LMP-3, in inducing bone formation in vivo. Moreover, the utilization of autologous dermal fibroblasts implanted on a biomaterial represents a promising approach for possible future clinical applications aimed at inducing new bone formation. PMID:18633445

  7. Visual and computer software-aided estimates of Dupuytren's contractures: correlation with clinical goniometric measurements.

    PubMed

    Smith, R P; Dias, J J; Ullah, A; Bhowal, B

    2009-05-01

    Corrective surgery for Dupuytren's disease represents a significant proportion of a hand surgeon's workload. The decision to go ahead with surgery and the success of surgery requires measuring the degree of contracture of the diseased finger(s). This is performed in clinic with a goniometer, pre- and postoperatively. Monitoring the recurrence of the contracture can inform on surgical outcome, research and audit. We compared visual and computer software-aided estimation of Dupuytren's contractures to clinical goniometric measurements in 60 patients with Dupuytren's disease. Patients' hands were digitally photographed. There were 76 contracted finger joints--70 proximal interphalangeal joints and six distal interphalangeal joints. The degrees of contracture of these images were visually assessed by six orthopaedic staff of differing seniority and re-assessed with computer software. Across assessors, the Pearson correlation between the goniometric measurements and the visual estimations was 0.83 and this significantly improved to 0.88 with computer software. Reliability with intra-class correlations achieved 0.78 and 0.92 for the visual and computer-aided estimations, respectively, and with test-retest analysis, 0.92 for visual estimation and 0.95 for computer-aided measurements. Visual estimations of Dupuytren's contractures correlate well with actual clinical goniometric measurements and improve further if measured with computer software. Digital images permit monitoring of contracture after surgery and may facilitate research into disease progression and auditing of surgical technique.

  8. Enhancing outpatient clinics management software by reducing patients' waiting time.

    PubMed

    Almomani, Iman; AlSarheed, Ahlam

    The Kingdom of Saudi Arabia (KSA) pays great attention to improving the quality of services provided by health care sectors, including outpatient clinics. One of the main drawbacks in outpatient clinics is the long waiting time for patients, which affects the level of patient satisfaction and the quality of services. This article addresses this problem by studying the Outpatient Management Software (OMS) and proposing solutions to reduce waiting times. Many hospitals around the world, such as hospitals in the USA, China, Sri Lanka, and Taiwan, apply solutions to overcome the problem of long waiting times in outpatient clinics. These clinics have succeeded in reducing wait times by 15%, 78%, 60%, and 50%, respectively. Such solutions depend mainly on adding more human resources or changing some business or management policies. The solutions presented in this article reduce waiting times by enhancing the software used to manage outpatient clinic services. Both quantitative and qualitative methods have been used to understand the current OMS and examine the level of patient satisfaction. Five main problems that may cause high or unmeasured waiting times have been identified: appointment type, ticket numbering, late doctor arrival, early-arriving patients, and the patient-distribution list. These problems have been mapped to the corresponding OMS components. Solutions to the above problems have been introduced and evaluated analytically or by simulation experiments. Evaluation of the results shows a reduction in patient waiting time. Solving late doctor arrival can reduce clinic service time by up to 20%; solutions for early-arriving patients reduce vital time by 53.3%, clinic time by 20%, and total waiting time by 30.3% overall; and, finally, well-constructed patient-distribution lists yield improvements of 54.2%. Improvements in patients' waiting time will consequently affect patient satisfaction and improve the quality of health care services

  9. Islamic Education Philosophy Development (Study Analysis on Ta'lim Al-Kitab Al-Zarnuji Muta'allim Works)

    ERIC Educational Resources Information Center

    Asrori, H. Achmad

    2016-01-01

    "Ta'lim Muta'allim" is one of the monumental works of Shaykh Tajuddin Nu'man ibn Ibrahim ibn al-Khalil al-Zarnuji, who lived in the 6th century H/13-14 M. The reason for writing this study ie: (1) it is very rich with the basic values of Islamic education, (2) the values are already widely practiced in the world of education, especially…

  10. PhysioNet: physiologic signals, time series and related open source software for basic, clinical, and applied research.

    PubMed

    Moody, George B; Mark, Roger G; Goldberger, Ary L

    2011-01-01

    PhysioNet provides free web access to over 50 collections of recorded physiologic signals and time series, and related open-source software, in support of basic, clinical, and applied research in medicine, physiology, public health, biomedical engineering and computing, and medical instrument design and evaluation. Its three components (PhysioBank, the archive of signals; PhysioToolkit, the software library; and PhysioNetWorks, the virtual laboratory for collaborative development of future PhysioBank data collections and PhysioToolkit software components) connect researchers and students who need physiologic signals and relevant software with researchers who have data and software to share. PhysioNet's annual open engineering challenges stimulate rapid progress on unsolved or poorly solved questions of basic or clinical interest, by focusing attention on achievable solutions that can be evaluated and compared objectively using freely available reference data.

  11. Nuclear accumulation of myocyte muscle LIM protein is regulated by heme oxygenase 1 and correlates with cardiac function in the transition to failure

    PubMed Central

    Paudyal, Anju; Dewan, Sukriti; Ikie, Cindy; Whalley, Benjamin J; de Tombe, Pieter P.

    2016-01-01

    Key points: The present study investigated the mechanism associated with impaired cardiac mechanosensing that leads to heart failure by examining the factors regulating muscle LIM protein subcellular distribution in myocytes. In myocytes, muscle LIM protein subcellular distribution is regulated by cell contractility rather than passive stretch, via heme oxygenase-1 and histone deacetylase signalling. The results of the present study provide new insights into mechanotransduction in cardiac myocytes. Myocyte mechanosensitivity, as indicated by the muscle LIM protein ratio, is also correlated with cardiac function in the transition to failure in a guinea-pig model of disease. This shows that the loss of mechanosensitivity plays an important role during the transition to failure in the heart. The present study provides the first indication that mechanosensing could be modified pharmacologically during the transition to heart failure. Abstract: Impaired mechanosensing leads to heart failure, and a decreased ratio of cytoplasmic to nuclear CSRP3/muscle LIM protein (MLP ratio) is associated with a loss of mechanosensitivity. In the present study, we tested whether passive or active stress/strain was important in modulating the MLP ratio and determined whether this correlated with heart function during the transition to failure. We exposed cultured neonatal rat myocytes to a 10% cyclic mechanical stretch at 1 Hz, or electrically paced myocytes at 6.8 V (1 Hz), for 48 h. The MLP ratio decreased by 50% (P < 0.05, n = 4) only in response to electrical pacing, suggesting impaired mechanosensitivity. Inhibition of contractility with 10 μM blebbistatin resulted in an ∼3-fold increase in the MLP ratio (n = 8, P < 0.05), indicating that myocyte contractility regulates nuclear MLP. Inhibition of histone deacetylase (HDAC) signalling with trichostatin A increased nuclear MLP following passive stretch, suggesting that HDACs block MLP nuclear accumulation. Inhibition of heme

  12. RARtool: A MATLAB Software Package for Designing Response-Adaptive Randomized Clinical Trials with Time-to-Event Outcomes.

    PubMed

    Ryeznik, Yevgen; Sverdlov, Oleksandr; Wong, Weng Kee

    2015-08-01

    Response-adaptive randomization designs are becoming increasingly popular in clinical trial practice. In this paper, we present RARtool , a user interface software developed in MATLAB for designing response-adaptive randomized comparative clinical trials with censored time-to-event outcomes. The RARtool software can compute different types of optimal treatment allocation designs, and it can simulate response-adaptive randomization procedures targeting selected optimal allocations. Through simulations, an investigator can assess design characteristics under a variety of experimental scenarios and select the best procedure for practical implementation. We illustrate the utility of our RARtool software by redesigning a survival trial from the literature.
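
    RARtool itself is a MATLAB package for time-to-event designs; purely to illustrate the general idea of response-adaptive randomization, the following Python sketch simulates a simple urn-based randomized play-the-winner rule for binary outcomes. All parameters are made up, and the rule shown is not the one implemented in RARtool.

        # Illustrative sketch only: a randomized play-the-winner rule for binary
        # outcomes, used to convey the general idea of response-adaptive
        # randomization; response probabilities and patient numbers are invented.
        import random

        def simulate_rpw(p_a, p_b, n_patients, seed=1):
            """Urn-based randomized play-the-winner allocation."""
            rng = random.Random(seed)
            urn = {"A": 1, "B": 1}          # start with one ball per arm
            successes = {"A": 0, "B": 0}
            assigned = {"A": 0, "B": 0}
            for _ in range(n_patients):
                total = urn["A"] + urn["B"]
                arm = "A" if rng.random() < urn["A"] / total else "B"
                assigned[arm] += 1
                success = rng.random() < (p_a if arm == "A" else p_b)
                if success:
                    successes[arm] += 1
                    urn[arm] += 1           # reward the successful arm
                else:
                    other = "B" if arm == "A" else "A"
                    urn[other] += 1         # shift allocation toward the other arm
            return assigned, successes

        print(simulate_rpw(p_a=0.7, p_b=0.4, n_patients=200))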

  13. Neotectonics and paleoseismology of the Limón and Pedro Miguel faults in Panamá: earthquake hazard to the Panamá Canal

    USGS Publications Warehouse

    Rockwell, Thomas; Gath, Edon; Gonzalez, Tania; Madden, Chris; Verdugo, Danielle; Lippincott, Caitlin; Dawson, Tim; Owen, Lewis A.; Fuchs, Markus; Cadena, Ana; Williams, Pat; Weldon, Elise; Franceschi, Pastora

    2010-01-01

    We present new geologic, tectonic geomorphic, and geochronologic data on the slip rate, timing, and size of past surface ruptures for the right-lateral Limón and Pedro Miguel faults in central Panamá. These faults are part of a system of conjugate faults that accommodate the internal deformation of Panamá resulting from the ongoing collision of Central and South America. There have been at least three surface ruptures on the Limón fault in the past 950-1400 years, with the most recent during the past 365 years. Displacement in this young event is at least 1.2 m (based on trenching) and may be 1.6-2 m (based on small channel offsets). A well-preserved 4.2 m offset suggests that the penultimate event also sustained significant displacement. The Holocene slip rate has averaged about 6 mm/yr, based on a 30-m offset terrace riser incised into a 5-ka abandoned channel. The Pedro Miguel fault has sustained three surface ruptures in the past 1600 years, the most recent being the 2 May 1621 earthquake that partially destroyed Panamá Viejo. At least 2.1 m of slip occurred in this event near the Canal, with geomorphic offsets suggesting 2.5-3 m. The historic Camino de Cruces is offset 2.8 m, indicating multimeter displacement over at least 20 km of fault length. Channel offsets of 100-400 m, together with a climate-induced incision model, suggest a Late Quaternary slip rate of about 5 mm/yr, which is consistent with the paleoseismic results. Comparison of the timing of surface ruptures between the Limón and Pedro Miguel faults suggests that large earthquakes may rupture both faults with 2-3 m of displacement for over 40 km, such as is likely in earthquakes in the M 7 range. Altogether, our observations indicate that the Limón and Pedro Miguel faults represent a significant seismic hazard to central Panamá and, specifically, to the Canal and Panamá City.

  14. PhenoTips: patient phenotyping software for clinical and research use.

    PubMed

    Girdea, Marta; Dumitriu, Sergiu; Fiume, Marc; Bowdin, Sarah; Boycott, Kym M; Chénier, Sébastien; Chitayat, David; Faghfoury, Hanna; Meyn, M Stephen; Ray, Peter N; So, Joyce; Stavropoulos, Dimitri J; Brudno, Michael

    2013-08-01

    We have developed PhenoTips: open source software for collecting and analyzing phenotypic information for patients with genetic disorders. Our software combines an easy-to-use interface, compatible with any device that runs a Web browser, with a standardized database back end. The PhenoTips' user interface closely mirrors clinician workflows so as to facilitate the recording of observations made during the patient encounter. Collected data include demographics, medical history, family history, physical and laboratory measurements, physical findings, and additional notes. Phenotypic information is represented using the Human Phenotype Ontology; however, the complexity of the ontology is hidden behind a user interface, which combines simple selection of common phenotypes with error-tolerant, predictive search of the entire ontology. PhenoTips supports accurate diagnosis by analyzing the entered data, then suggesting additional clinical investigations and providing Online Mendelian Inheritance in Man (OMIM) links to likely disorders. By collecting, classifying, and analyzing phenotypic information during the patient encounter, PhenoTips allows for streamlining of clinic workflow, efficient data entry, improved diagnosis, standardization of collected patient phenotypes, and sharing of anonymized patient phenotype data for the study of rare disorders. Our source code and a demo version of PhenoTips are available at http://phenotips.org. © 2013 WILEY PERIODICALS, INC.

  15. iLAP: a workflow-driven software for experimental protocol development, data acquisition and analysis

    PubMed Central

    2009-01-01

    Background In recent years, the genome biology community has expended considerable effort to confront the challenges of managing heterogeneous data in a structured and organized way and developed laboratory information management systems (LIMS) for both raw and processed data. On the other hand, electronic notebooks were developed to record and manage scientific data, and facilitate data-sharing. Software which enables both management of large datasets and digital recording of laboratory procedures would serve a real need in laboratories using medium and high-throughput techniques. Results We have developed iLAP (Laboratory data management, Analysis, and Protocol development), a workflow-driven information management system specifically designed to create and manage experimental protocols, and to analyze and share laboratory data. The system combines experimental protocol development, wizard-based data acquisition, and high-throughput data analysis into a single, integrated system. We demonstrate the power and the flexibility of the platform using a microscopy case study based on a combinatorial multiple fluorescence in situ hybridization (m-FISH) protocol and 3D-image reconstruction. iLAP is freely available under the open source license AGPL from http://genome.tugraz.at/iLAP/. Conclusion iLAP is a flexible and versatile information management system, which has the potential to close the gap between electronic notebooks and LIMS and can therefore be of great value for a broad scientific community. PMID:19941647

  16. [Evaluation of Web-based software applications for administrating and organising an ophthalmological clinical trial site].

    PubMed

    Kortüm, K; Reznicek, L; Leicht, S; Ulbig, M; Wolf, A

    2013-07-01

    The importance and complexity of clinical trials are continuously increasing, especially in innovative specialties like ophthalmology. Therefore an efficient clinical trial site organisational structure is essential. In the modern internet era, this can be accomplished with web-based applications. In total, 3 software applications (Vibe on Prem, Sharepoint and open source software) were evaluated at a clinical trial site in ophthalmology. Assessment criteria were defined: reliability, ease of administration, usability, scheduling, task list, knowledge management, operating costs and worldwide availability. Vibe on Prem, customised by the local university, met the assessment criteria best; the other applications were not as strong. By introducing a web-based application for administrating and organising an ophthalmological trial site, studies can be conducted in a more efficient and reliable manner. Georg Thieme Verlag KG Stuttgart · New York.

  17. Comparison of direct selective versus nonselective agar media plus LIM broth enrichment for determination of group B streptococcus colonization status in pregnant women.

    PubMed

    Elsayed, Sameer; Gregson, Daniel B; Church, Deirdre L

    2003-06-01

    Group B streptococcus (GBS) is the most common cause of early-onset neonatal sepsis in developed countries, and determination of the GBS colonization status in pregnant patients near term is essential for the provision of prophylactic measures to prevent early-onset disease. To determine if GBS recovery rates and/or result turnaround times for vaginal or combined vaginal/rectal swab specimens from pregnant patients near term are enhanced if swabs are inoculated initially onto selective versus nonselective agar media, in addition to the standard Centers for Disease Control and Prevention method. Prospective laboratory analysis. Urban health region/centralized diagnostic microbiology laboratory. Pregnant women presenting for routine obstetrical care and collection of vaginal or combined vaginal/rectal swab specimens for GBS testing at 35 to 37 weeks' gestation. Culture of specimens directly onto selective (5% sheep blood with colistin and nalidixic acid) or nonselective (5% sheep blood) agar media, in addition to LIM broth enrichment and terminal subculture. Group B streptococcus recovery rate and culture result turnaround time. A total of 639 specimens were tested, with 128 (20%) positive for GBS. Sixty-three isolates were recovered on direct agar media at 24 hours, of which 16 (12.5%) were isolated on selective plates only. An additional 38 isolates were recovered at 48 hours from direct plates. Twenty-seven (21.1%) isolates that failed to grow on direct plates were recovered from the LIM broth subculture only. Three (2.3%) isolates not recovered from LIM broths were detected at 48 hours on the direct selective (2 isolates) and nonselective (1 isolate) agar plates. A 24-hour result turnaround time was achieved for 63 (49.2%) and 47 (36.7%) of the 128 culture-positive specimens for direct selective and nonselective plates, respectively (chi2 = 76.63, P <.001). Use of direct selective agar media, in addition to LIM broth enrichment, for the determination of the GBS

  18. Multiattribute selection of acute stroke imaging software platform for Extending the Time for Thrombolysis in Emergency Neurological Deficits (EXTEND) clinical trial.

    PubMed

    Churilov, Leonid; Liu, Daniel; Ma, Henry; Christensen, Soren; Nagakane, Yoshinari; Campbell, Bruce; Parsons, Mark W; Levi, Christopher R; Davis, Stephen M; Donnan, Geoffrey A

    2013-04-01

    The appropriateness of a software platform for rapid MRI assessment of the amount of salvageable brain tissue after stroke is critical for both the validity of the Extending the Time for Thrombolysis in Emergency Neurological Deficits (EXTEND) Clinical Trial of stroke thrombolysis beyond 4.5 hours and for stroke patient care outcomes. The objective of this research is to develop and implement a methodology for selecting the acute stroke imaging software platform most appropriate for the setting of a multi-centre clinical trial. A multi-disciplinary decision making panel formulated the set of preferentially independent evaluation attributes. Alternative Multi-Attribute Value Measurement methods were used to identify the best imaging software platform followed by sensitivity analysis to ensure the validity and robustness of the proposed solution. Four alternative imaging software platforms were identified. RApid processing of PerfusIon and Diffusion (RAPID) software was selected as the most appropriate for the needs of the EXTEND trial. A theoretically grounded generic multi-attribute selection methodology for imaging software was developed and implemented. The developed methodology assured both a high quality decision outcome and a rational and transparent decision process. This development contributes to stroke literature in the area of comprehensive evaluation of MRI clinical software. At the time of evaluation, RAPID software presented the most appropriate imaging software platform for use in the EXTEND clinical trial. The proposed multi-attribute imaging software evaluation methodology is based on sound theoretical foundations of multiple criteria decision analysis and can be successfully used for choosing the most appropriate imaging software while ensuring both robust decision process and outcomes. © 2012 The Authors. International Journal of Stroke © 2012 World Stroke Organization.
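
    The record describes a Multi-Attribute Value Measurement selection; a minimal sketch of the underlying additive value model V(a) = sum_i w_i * v_i(a) is given below, with invented weights, attributes, and platform scores that do not reproduce the EXTEND panel's actual evaluation.

        # Illustrative sketch only: a weighted additive multi-attribute value
        # model with made-up weights and normalised 0-1 scores.
        weights = {"speed": 0.4, "accuracy": 0.35, "usability": 0.25}

        # hypothetical platforms with normalised scores per attribute
        platforms = {
            "Platform 1": {"speed": 0.9, "accuracy": 0.8, "usability": 0.6},
            "Platform 2": {"speed": 0.7, "accuracy": 0.9, "usability": 0.8},
            "Platform 3": {"speed": 0.5, "accuracy": 0.7, "usability": 0.9},
        }

        def overall_value(scores, weights):
            """Additive value: V(a) = sum_i w_i * v_i(a)."""
            return sum(weights[attr] * scores[attr] for attr in weights)

        ranked = sorted(platforms, key=lambda p: overall_value(platforms[p], weights),
                        reverse=True)
        for name in ranked:
            print(name, round(overall_value(platforms[name], weights), 3))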

  19. The LIM protein complex establishes a retinal circuitry of visual adaptation by regulating Pax6 α-enhancer activity

    PubMed Central

    Kim, Yeha; Lim, Soyeon; Ha, Taejeong; Song, You-Hyang; Sohn, Young-In; Park, Dae-Jin; Paik, Sun-Sook; Kim-Kaneyama, Joo-ri; Song, Mi-Ryoung; Leung, Amanda; Levine, Edward M; Kim, In-Beom; Goo, Yong Sook; Lee, Seung-Hee; Kang, Kyung Hwa; Kim, Jin Woo

    2017-01-01

    The visual responses of vertebrates are sensitive to the overall composition of retinal interneurons including amacrine cells, which tune the activity of the retinal circuitry. The expression of Paired-homeobox 6 (PAX6) is regulated by multiple cis-DNA elements including the intronic α-enhancer, which is active in GABAergic amacrine cell subsets. Here, we report that the transforming growth factor β1-induced transcript 1 protein (Tgfb1i1) interacts with the LIM domain transcription factors Lhx3 and Isl1 to inhibit the α-enhancer in the post-natal mouse retina. Tgfb1i1-/- mice show elevated α-enhancer activity leading to overproduction of the Pax6ΔPD isoform, which supports GABAergic amacrine cell fate maintenance. Consequently, the Tgfb1i1-/- mouse retinas show a sustained light response, which becomes more transient in mice with the auto-stimulation-defective Pax6ΔPBS/ΔPBS mutation. Together, we show that the antagonistic regulation of α-enhancer activity by Pax6 and the LIM protein complex is necessary for the establishment of an inner retinal circuitry, which controls visual adaptation. DOI: http://dx.doi.org/10.7554/eLife.21303.001 PMID:28139974

  20. Perceptions of clinical utility of an Augmented Reality musical software among health care professionals.

    PubMed

    Corrêa, Ana Grasielle Dionísio; de Assis, Gilda Aparecida; do Nascimento, Marilena; de Deus Lopes, Roseli

    2017-04-01

    Augmented Reality musical software (GenVirtual) is a technology that primarily allows users to develop music activities for rehabilitation. This study aimed to analyse the perceptions of health care professionals regarding the clinical utility of GenVirtual. A second objective was to identify improvements to the GenVirtual software and similar technologies. Music therapists, occupational therapists, physiotherapists and speech and language therapists who assist people with physical and cognitive disabilities were enrolled in three focus groups. The quantitative and qualitative data were analysed through inductive thematic analysis. Three main themes were identified: the use of GenVirtual in health care areas; opportunities for realistic application of GenVirtual; and limitations in the use of GenVirtual. The registration units identified were: motor stimulation, cognitive stimulation, verbal learning, recreation activity, musicality, accessibility, motivation, sonic accuracy, interference of lighting, poor sound, children and adults. This research suggested that GenVirtual is a complementary tool to conventional clinical practice and has great potential for motor and cognitive rehabilitation of children and adults. Implications for Rehabilitation: Gaining health professionals' perceptions of the Augmented Reality musical game (GenVirtual) gives valuable information as to the clinical utility of the software. GenVirtual was perceived as a tool that could enhance the motor and cognitive rehabilitation process. GenVirtual was viewed as a tool that could enhance clinical practice and communication among various agencies, but it was suggested that it should be used with caution to avoid confusion and replacement of important services.

  1. Development of an open source laboratory information management system for 2-D gel electrophoresis-based proteomics workflow

    PubMed Central

    Morisawa, Hiraku; Hirota, Mikako; Toda, Tosifusa

    2006-01-01

    Background In the post-genome era, most research scientists working in the field of proteomics are confronted with difficulties in management of large volumes of data, which they are required to keep in formats suitable for subsequent data mining. Therefore, a well-developed open source laboratory information management system (LIMS) should be available for their proteomics research studies. Results We developed an open source LIMS appropriately customized for 2-D gel electrophoresis-based proteomics workflow. The main features of its design are compactness, flexibility and connectivity to public databases. It supports the handling of data imported from mass spectrometry software and 2-D gel image analysis software. The LIMS is equipped with the same input interface for 2-D gel information as a clickable map on public 2DPAGE databases. The LIMS allows researchers to follow their own experimental procedures by reviewing the illustrations of 2-D gel maps and well layouts on the digestion plates and MS sample plates. Conclusion Our new open source LIMS is now available as a basic model for proteome informatics, and is accessible for further improvement. We hope that many research scientists working in the field of proteomics will evaluate our LIMS and suggest ways in which it can be improved. PMID:17018156
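
    As a rough illustration of the kind of sample tracking such a LIMS performs, the sketch below links 2-D gel spots to digestion-plate wells in an in-memory SQLite database; the schema and values are invented for the example and are not the system's actual data model.

        # A minimal sketch with an assumed (made-up) schema: a tiny SQLite model
        # linking 2-D gel spots to wells on a digestion plate, in the spirit of
        # the workflow tracking described above.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
        CREATE TABLE gel  (gel_id INTEGER PRIMARY KEY, sample_name TEXT, run_date TEXT);
        CREATE TABLE spot (spot_id INTEGER PRIMARY KEY,
                           gel_id INTEGER REFERENCES gel(gel_id), x REAL, y REAL);
        CREATE TABLE well (plate TEXT, position TEXT,
                           spot_id INTEGER REFERENCES spot(spot_id),
                           PRIMARY KEY (plate, position));
        """)
        conn.execute("INSERT INTO gel VALUES (1, 'liver lysate', '2006-01-15')")
        conn.execute("INSERT INTO spot VALUES (1, 1, 12.3, 45.6)")
        conn.execute("INSERT INTO well VALUES ('digestion-plate-01', 'A1', 1)")

        # Trace a digestion-plate well back to its source gel and sample.
        row = conn.execute("""SELECT g.sample_name, w.plate, w.position
                              FROM well w JOIN spot s ON s.spot_id = w.spot_id
                              JOIN gel g ON g.gel_id = s.gel_id""").fetchone()
        print(row)   # ('liver lysate', 'digestion-plate-01', 'A1')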

  2. Evaluation of DICOM viewer software for workflow integration in clinical trials

    NASA Astrophysics Data System (ADS)

    Haak, Daniel; Page, Charles E.; Kabino, Klaus; Deserno, Thomas M.

    2015-03-01

    The digital imaging and communications in medicine (DICOM) protocol is nowadays the leading standard for capture, exchange and storage of image data in medical applications. A broad range of commercial, free, and open source software tools supporting a variety of DICOM functionality exists. However, unlike in hospital-based patient care, DICOM has not yet arrived in electronic data capture systems (EDCS) for clinical trials. Due to this missing integration, even simple visualization of patients' image data in electronic case report forms (eCRFs) is impossible. Four increasing levels of integration of DICOM components into EDCS are conceivable, each level raising functionality but also demands on interfaces. Hence, in this paper, a comprehensive evaluation of 27 DICOM viewer software projects is performed, investigating viewing functionality as well as interfaces for integration. Concerning general, integration, and viewing requirements, the survey involves the criteria (i) license, (ii) support, (iii) platform, (iv) interfaces, (v) two-dimensional (2D) and (vi) three-dimensional (3D) image viewing functionality. Optimal viewers are suggested for applications in clinical trials for 3D imaging, hospital communication, and workflow. Focusing on open source solutions, the viewers ImageJ and MicroView are superior for 3D visualization, whereas GingkoCADx is advantageous for hospital integration. Concerning workflow optimization in multi-centered clinical trials, we suggest the open source viewer Weasis. Covering most use cases, an EDCS and PACS interconnection with Weasis is suggested.
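
    To show how DICOM image data can be read programmatically, for instance when wiring images into an EDCS or eCRF, the sketch below uses the third-party pydicom package; the file path and the choice of attributes are assumptions made for the example.

        # A minimal sketch, assuming the third-party pydicom package is installed
        # and "image.dcm" is a path to a local DICOM file supplied by the reader.
        import pydicom

        ds = pydicom.dcmread("image.dcm")

        # A few standard DICOM attributes of the parsed dataset.
        print(ds.PatientID)
        print(ds.Modality)
        print(ds.pixel_array.shape)   # decoding pixel data requires numpy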

  3. Quantitative Neuroimaging Software for Clinical Assessment of Hippocampal Volumes on MR Imaging

    PubMed Central

    Ahdidan, Jamila; Raji, Cyrus A.; DeYoe, Edgar A.; Mathis, Jedidiah; Noe, Karsten Ø.; Rimestad, Jens; Kjeldsen, Thomas K.; Mosegaard, Jesper; Becker, James T.; Lopez, Oscar

    2015-01-01

    Background: Multiple neurological disorders including Alzheimer’s disease (AD), mesial temporal sclerosis, and mild traumatic brain injury manifest with volume loss on brain MRI. Subtle volume loss is particularly seen early in AD. While prior research has demonstrated the value of this additional information from quantitative neuroimaging, very few applications have been approved for clinical use. Here we describe a US FDA cleared software program, Neuroreader™, for assessment of clinical hippocampal volume on brain MRI. Objective: To present the validation of hippocampal volumetrics on a clinical software program. Method: Subjects were drawn (n = 99) from the Alzheimer Disease Neuroimaging Initiative study. Volumetric brain MR imaging was acquired in both 1.5 T (n = 59) and 3.0 T (n = 40) scanners in participants with manual hippocampal segmentation. Fully automated hippocampal segmentation and measurement was done using a multiple atlas approach. The Dice Similarity Coefficient (DSC) measured the level of spatial overlap between Neuroreader™ and gold standard manual segmentation from 0 to 1, with 0 denoting no overlap and 1 representing complete agreement. DSC comparisons between 1.5 T and 3.0 T scanners were done using standard independent samples T-tests. Results: In the bilateral hippocampus, mean DSC was 0.87 with a range of 0.78–0.91 (right hippocampus) and 0.76–0.91 (left hippocampus). Automated segmentation agreement with manual segmentation was essentially equivalent at 1.5 T (DSC = 0.879) versus 3.0 T (DSC = 0.872). Conclusion: This work provides a description and validation of a software program that can be applied in measuring hippocampal volume, a biomarker that is frequently abnormal in AD and other neurological disorders. PMID:26484924
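
    The Dice Similarity Coefficient reported above is straightforward to compute from two binary segmentation masks; a minimal sketch follows, using tiny made-up arrays rather than actual Neuroreader output.

        # A minimal sketch of the Dice Similarity Coefficient on two binary masks;
        # the arrays are invented for illustration.
        import numpy as np

        def dice(mask_a, mask_b):
            """DSC = 2|A ∩ B| / (|A| + |B|); 0 = no overlap, 1 = perfect agreement."""
            a = mask_a.astype(bool)
            b = mask_b.astype(bool)
            denom = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

        automated = np.array([[0, 1, 1], [0, 1, 0], [0, 0, 0]])
        manual    = np.array([[0, 1, 1], [1, 1, 0], [0, 0, 0]])
        print(round(dice(automated, manual), 3))   # 0.857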

  4. LIM Kinase, a Newly Identified Regulator of Presynaptic Remodeling by Rod Photoreceptors After Injury

    PubMed Central

    Wang, Weiwei; Townes-Anderson, Ellen

    2015-01-01

    Purpose Rod photoreceptors retract their axon terminals and develop neuritic sprouts in response to retinal detachment and reattachment, respectively. This study examines the role of LIM kinase (LIMK), a component of RhoA and Rac pathways, in the presynaptic structural remodeling of rod photoreceptors. Methods Phosphorylated LIMK (p-LIMK), the active form of LIMK, was examined in salamander retina with Western blot and confocal microscopy. Axon length within the first 7 hours and process growth after 3 days of culture were assessed in isolated rod photoreceptors treated with inhibitors of upstream regulators ROCK and p21-activated kinase (Pak) (Y27632 and IPA-3) and a direct LIMK inhibitor (BMS-5). Porcine retinal explants were also treated with BMS-5 and analyzed 24 hours after detachment. Because Ca2+ influx contributes to axonal retraction, L-type channels were blocked in some experiments with nicardipine. Results Phosphorylated LIMK is present in rod terminals during retraction and in newly formed processes. Axonal retraction over 7 hours was significantly reduced by inhibition of LIMK or its regulators, ROCK and Pak. Process growth was reduced by LIMK or Pak inhibition especially at the basal (axon-bearing) region of the rod cells. Combining Ca2+ channel and LIMK inhibition had no additional effect on retraction but did further inhibit sprouting after 3 days. In detached porcine retina, LIMK inhibition reduced rod axonal retraction and improved retinal morphology. Conclusions Thus structural remodeling, in the form of either axonal retraction or neuritic growth, requires LIMK activity. LIM kinase inhibition may have therapeutic potential for reducing pathologic rod terminal plasticity after retinal injury. PMID:26658506

  5. Impact of computer-based treatment planning software on clinical judgment of dental students for planning prosthodontic rehabilitation

    PubMed Central

    Deshpande, Saee; Chahande, Jayashree

    2014-01-01

    Purpose Successful prosthodontic rehabilitation involves making many interrelated clinical decisions which have an impact on each other. Self-directed computer-based training has been shown to be a very useful tool to develop synthetic and analytical problem-solving skills among students. Thus, a computer-based case study and treatment planning (CSTP) software program was developed which would allow students to work through the process of comprehensive, multidisciplinary treatment planning for patients in a structured and logical manner. The present study was aimed at assessing the effect of this CSTP software on the clinical judgment of dental students while planning prosthodontic rehabilitation and to assess the students’ perceptions about using the program for its intended use. Methods A CSTP software program was developed and validated. The impact of this program on the clinical decision making skills of dental graduates was evaluated by real life patient encounters, using a modified and validated mini-CEX. Students’ perceptions about the program were obtained by a pre-validated feedback questionnaire. Results The faculty assessment scores of clinical judgment improved significantly after the use of this program. The majority of students felt it was an informative, useful, and innovative way of learning and they strongly felt that they had learnt the logical progression of planning, the insight into decision making, and the need for flexibility in treatment planning after using this program. Conclusion CSTP software was well received by the students. There was significant improvement in students’ clinical judgment after using this program. It should thus be envisaged fundamentally as an adjunct to conventional teaching techniques to improve students’ decision making skills and confidence. PMID:25170288

  6. Free software to analyse the clinical relevance of drug interactions with antiretroviral agents (SIMARV®) in patients with HIV/AIDS.

    PubMed

    Giraldo, N A; Amariles, P; Monsalve, M; Faus, M J

    Highly active antiretroviral therapy has extended the expected lifespan of patients with HIV/AIDS. However, the therapeutic benefits of some drugs used simultaneously with highly active antiretroviral therapy may be adversely affected by drug interactions. The goal was to design and develop free software to facilitate analysis, assessment, and clinical decision making according to the clinical relevance of drug interactions in patients with HIV/AIDS. A comprehensive Medline/PubMed database search of drug interactions was performed. Articles that recognized any drug interactions in HIV disease were selected. The publications accessed were limited to human studies in English or Spanish, with full texts retrieved. Drug interactions were analyzed, assessed, and grouped into four levels of clinical relevance according to gravity and probability. Software to systematize the information regarding drug interactions and their clinical relevance was designed and developed. Overall, 952 different references were retrieved and 446 selected; in addition, 67 articles were selected from the citation lists of identified articles. A total of 2119 pairs of drug interactions were identified; of this group, 2006 (94.7%) were drug-drug interactions, 1982 (93.5%) had an identified pharmacokinetic mechanism, and 1409 (66.5%) were mediated by enzyme inhibition. In terms of clinical relevance, 1285 (60.6%) drug interactions were clinically significant in patients with HIV (levels 1 and 2). With this information, a software program that facilitates identification and assessment of the clinical relevance of antiretroviral drug interactions (SIMARV®) was developed. A free software package with information on 2119 pairs of antiretroviral drug interactions was designed and developed that could facilitate analysis, assessment, and clinical decision making according to the clinical relevance of drug interactions in patients with HIV/AIDS. Copyright © 2016 Elsevier Inc. All rights reserved.
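
    The grouping of interactions into levels of clinical relevance by gravity and probability can be illustrated with a small lookup function; the sketch below uses assumed category names and thresholds and does not reproduce the actual SIMARV grading rules.

        # Illustrative sketch only: one plausible way to derive a clinical-relevance
        # level from gravity and probability ratings; categories and thresholds are
        # assumptions, not the published SIMARV criteria.
        GRAVITY = {"mild": 1, "moderate": 2, "severe": 3}
        PROBABILITY = {"doubtful": 1, "possible": 2, "probable": 3, "definite": 4}

        def relevance_level(gravity, probability):
            """Map a (gravity, probability) pair to level 1 (highest) .. 4 (lowest)."""
            score = GRAVITY[gravity] * PROBABILITY[probability]
            if score >= 9:
                return 1
            if score >= 6:
                return 2
            if score >= 3:
                return 3
            return 4

        print(relevance_level("severe", "probable"))     # level 1
        print(relevance_level("moderate", "possible"))   # level 3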

  7. Treatment delivery software for a new clinical grade ultrasound system for thermoradiotherapy.

    PubMed

    Novák, Petr; Moros, Eduardo G; Straube, William L; Myerson, Robert J

    2005-11-01

    A detailed description of a clinical grade Scanning Ultrasound Reflector Linear Array System (SURLAS) applicator was given in a previous paper [Med. Phys. 32, 230-240 (2005)]. In this paper we concentrate on the design, development, and testing of the personal computer (PC) based treatment delivery software that runs the therapy system. The SURLAS requires the coordinated interaction between the therapy applicator and several peripheral devices for its proper and safe operation. One of the most important tasks was the coordination of the input power sequences for the elements of two parallel opposed ultrasound arrays (eight 1.5 cm x 2 cm elements/array, array 1 and 2 operate at 1.9 and 4.9 MHz, respectively) in coordination with the position of a dual-face scanning acoustic reflector. To achieve this, the treatment delivery software can divide the applicator's treatment window in up to 64 sectors (minimum size of 2 cm x 2 cm), and control the power to each sector independently by adjusting the power output levels from the channels of a 16-channel radio-frequency generator. The software coordinates the generator outputs with the position of the reflector as it scans back and forth between the arrays. Individual sector control and dual frequency operation allows the SURLAS to adjust power deposition in three dimensions to superficial targets coupled to its treatment window. The treatment delivery software also monitors and logs several parameters such as temperatures acquired using a 16-channel thermocouple thermometry unit. Safety (in particular to patients) was the paramount concern and design criterion. Failure mode and effects analysis (FMEA) was applied to the applicator as well as to the entire therapy system in order to identify safety issues and rank their relative importance. This analysis led to the implementation of several safety mechanisms and a software structure where each device communicates with the controlling PC independently of the others. In case

  8. Baobab Laboratory Information Management System: Development of an Open-Source Laboratory Information Management System for Biobanking

    PubMed Central

    Bendou, Hocine; Sizani, Lunga; Reid, Tim; Swanepoel, Carmen; Ademuyiwa, Toluwaleke; Merino-Martinez, Roxana; Meuller, Heimo; Abayomi, Akin

    2017-01-01

    A laboratory information management system (LIMS) is central to the informatics infrastructure that underlies biobanking activities. To date, a wide range of commercial and open-source LIMSs are available and the decision to opt for one LIMS over another is often influenced by the needs of the biobank clients and researchers, as well as available financial resources. The Baobab LIMS was developed by customizing the Bika LIMS software (www.bikalims.org) to meet the requirements of biobanking best practices. The need to implement biobank standard operation procedures as well as stimulate the use of standards for biobank data representation motivated the implementation of Baobab LIMS, an open-source LIMS for Biobanking. Baobab LIMS comprises modules for biospecimen kit assembly, shipping of biospecimen kits, storage management, analysis requests, reporting, and invoicing. The Baobab LIMS is based on the Plone web-content management framework. All the system requirements for Plone are applicable to Baobab LIMS, including the need for a server with at least 8 GB RAM and 120 GB hard disk space. Baobab LIMS is a server–client-based system, whereby the end user is able to access the system securely through the internet on a standard web browser, thereby eliminating the need for standalone installations on all machines. PMID:28375759

  9. Baobab Laboratory Information Management System: Development of an Open-Source Laboratory Information Management System for Biobanking.

    PubMed

    Bendou, Hocine; Sizani, Lunga; Reid, Tim; Swanepoel, Carmen; Ademuyiwa, Toluwaleke; Merino-Martinez, Roxana; Meuller, Heimo; Abayomi, Akin; Christoffels, Alan

    2017-04-01

    A laboratory information management system (LIMS) is central to the informatics infrastructure that underlies biobanking activities. To date, a wide range of commercial and open-source LIMSs are available and the decision to opt for one LIMS over another is often influenced by the needs of the biobank clients and researchers, as well as available financial resources. The Baobab LIMS was developed by customizing the Bika LIMS software ( www.bikalims.org ) to meet the requirements of biobanking best practices. The need to implement biobank standard operation procedures as well as stimulate the use of standards for biobank data representation motivated the implementation of Baobab LIMS, an open-source LIMS for Biobanking. Baobab LIMS comprises modules for biospecimen kit assembly, shipping of biospecimen kits, storage management, analysis requests, reporting, and invoicing. The Baobab LIMS is based on the Plone web-content management framework. All the system requirements for Plone are applicable to Baobab LIMS, including the need for a server with at least 8 GB RAM and 120 GB hard disk space. Baobab LIMS is a server-client-based system, whereby the end user is able to access the system securely through the internet on a standard web browser, thereby eliminating the need for standalone installations on all machines.

  10. Clinical, information and business process modeling to promote development of safe and flexible software.

    PubMed

    Liaw, Siaw-Teng; Deveny, Elizabeth; Morrison, Iain; Lewis, Bryn

    2006-09-01

    Using a factorial vignette survey and modeling methodology, we developed clinical and information models - incorporating evidence base, key concepts, relevant terms, decision-making and workflow needed to practice safely and effectively - to guide the development of an integrated rule-based knowledge module to support prescribing decisions in asthma. We identified workflows, decision-making factors, factor use, and clinician information requirements. The Unified Modeling Language (UML) and public domain software and knowledge engineering tools (e.g. Protégé) were used, with the Australian GP Data Model as the starting point for expressing information needs. A Web Services service-oriented architecture approach was adopted within which to express functional needs, and clinical processes and workflows were expressed in the Business Process Execution Language (BPEL). This formal analysis and modeling methodology to define and capture the process and logic of prescribing best practice in a reference implementation is fundamental to tackling deficiencies in prescribing decision support software.

  11. Expression of LIM kinase 1 is associated with reversible G1/S phase arrest, chromosomal instability and prostate cancer.

    PubMed

    Davila, Monica; Jhala, Darshana; Ghosh, Debashis; Grizzle, William E; Chakrabarti, Ratna

    2007-06-08

    LIM kinase 1 (LIMK1), a LIM domain containing serine/threonine kinase, modulates actin dynamics through inactivation of the actin depolymerizing protein cofilin. Recent studies have indicated an important role of LIMK1 in growth and invasion of prostate and breast cancer cells; however, the molecular mechanism whereby LIMK1 induces tumor progression is unknown. In this study, we investigated the effects of ectopic expression of LIMK1 on cellular morphology, cell cycle progression and expression profile of LIMK1 in prostate tumors. Ectopic expression of LIMK1 in benign prostatic hyperplasia cells (BPH), which naturally express low levels of LIMK1, resulted in appearance of abnormal mitotic spindles, multiple centrosomes and smaller chromosomal masses. Furthermore, a transient G1/S phase arrest and delayed G2/M progression was observed in BPH cells expressing LIMK1. When treated with chemotherapeutic agent Taxol, no metaphase arrest was noted in these cells. We have also noted increased nuclear staining of LIMK1 in tumors with higher Gleason Scores and incidence of metastasis. Our results show that increased expression of LIMK1 results in chromosomal abnormalities, aberrant cell cycle progression and alteration of normal cellular response to microtubule stabilizing agent Taxol; and that LIMK1 expression may be associated with cancerous phenotype of the prostate.

  12. A prospective development study of software-guided radio-frequency ablation of primary and secondary liver tumors: Clinical intervention modelling, planning and proof for ablation cancer treatment (ClinicIMPPACT).

    PubMed

    Reinhardt, Martin; Brandmaier, Philipp; Seider, Daniel; Kolesnik, Marina; Jenniskens, Sjoerd; Sequeiros, Roberto Blanco; Eibisberger, Martin; Voglreiter, Philip; Flanagan, Ronan; Mariappan, Panchatcharam; Busse, Harald; Moche, Michael

    2017-12-01

    Radio-frequency ablation (RFA) is a promising minimally invasive treatment option for early liver cancer; however, monitoring or predicting the size of the resulting tissue necrosis during the RFA procedure is a challenging task, potentially resulting in a significant rate of under- or over-treatment. Currently there is no reliable lesion size prediction method commercially available. ClinicIMPPACT is designed as a multicenter, prospective, non-randomized clinical trial to evaluate the accuracy and efficiency of innovative planning and simulation software. 60 patients with early liver cancer will be included at four European clinical institutions and treated with the same RFA system. The preinterventional imaging datasets will be used for computational planning of the RFA treatment. All ablations will be simulated simultaneously to the actual RFA procedure, using the software environment developed in this project. The primary outcome measure is the comparison of the simulated ablation zones with the true lesions shown in follow-up imaging after one month, to assess the accuracy of the lesion prediction. This unique multicenter clinical trial aims at the clinical integration of a dedicated software solution to accurately predict lesion size and shape after radiofrequency ablation of liver tumors. Accelerated and optimized workflow integration, real-time intraoperative image processing, and inclusion of patient-specific information, e.g. organ perfusion and registration of the real RFA needle position, might make the introduced software a powerful tool for interventional radiologists to optimize patient outcomes.

  13. Numerical Simulations of the 1991 Limón Tsunami, Costa Rica Caribbean Coast

    NASA Astrophysics Data System (ADS)

    Chacón-Barrantes, Silvia; Zamora, Natalia

    2017-08-01

    The second largest recorded tsunami along the Caribbean margin of Central America occurred 25 years ago. On April 22nd, 1991, an earthquake with magnitude Mw 7.6 ruptured along the thrust faults that form the North Panamá Deformed Belt (NPDB). The earthquake triggered a tsunami that affected the Caribbean coast of Costa Rica and Panamá within a few minutes, causing two casualties. These are the only deaths caused by a tsunami in Costa Rica. Coseismic uplift of up to 1.6 m and runup values larger than 2 m were measured at some coastal sites. Here, we consider three solutions for the seismic source as initial conditions to model the tsunami, each considering a single rupture plane. We performed numerical modeling of the tsunami propagation and runup using the NEOWAVE numerical model (Yamazaki et al. in Int J Numer Methods Fluids 67:2081-2107, 2010, doi: 10.1002/fld.2485) on a system of nested grids from the entire Caribbean Sea to Limón city. The modeled surface deformation and tsunami runup agreed with the measured data at most of the coastal sites, with one preferred model that fits the field data. The model results are useful to determine how the 1991 tsunami could have affected regions where tsunami records were not preserved and to simulate how the coseismic coastal surface deformation acted as a buffer against the tsunami. We also performed tsunami modeling to simulate the consequences if a similar event with a larger magnitude of Mw 7.9 were to occur offshore the southern Costa Rican Caribbean coast. Such an event would generate maximum wave heights of more than 5 m, showing that the Limón and northwestern Panamá coastal areas are exposed to moderate-to-large tsunamis. These simulations considering historical events and maximum credible scenarios can be useful for hazard assessment and also as part of studies leading to tsunami evacuation maps and mitigation plans, even though that is not the scope of this paper.

  14. The Clinical Utilisation of Respiratory Elastance Software (CURE Soft): a bedside software for real-time respiratory mechanics monitoring and mechanical ventilation management.

    PubMed

    Szlavecz, Akos; Chiew, Yeong Shiong; Redmond, Daniel; Beatson, Alex; Glassenbury, Daniel; Corbett, Simon; Major, Vincent; Pretty, Christopher; Shaw, Geoffrey M; Benyo, Balazs; Desaive, Thomas; Chase, J Geoffrey

    2014-09-30

    Real-time patient respiratory mechanics estimation can be used to guide mechanical ventilation settings, particularly positive end-expiratory pressure (PEEP). This work presents a software application, Clinical Utilisation of Respiratory Elastance (CURE Soft), which uses a time-varying respiratory elastance model to offer this ability and aid mechanical ventilation treatment. CURE Soft is a desktop application developed in Java. It has two modes of operation: 1) online, for real-time monitoring and decision support, and 2) offline, for user education, auditing, or reviewing patient care. CURE Soft has been tested in mechanically ventilated patients with respiratory failure. The clinical protocol, software testing and use of the data were approved by the New Zealand Southern Regional Ethics Committee. Using CURE Soft, patients' respiratory mechanics and their response to treatment and clinical protocol were monitored. Results showed that the patients' respiratory elastance (stiffness) changed with the use of muscle relaxants, and responded differently to ventilator settings. This information can be used to guide mechanical ventilation therapy and titrate the optimal ventilator PEEP. CURE Soft enables real-time calculation of model-based respiratory mechanics for mechanically ventilated patients. Results showed that the system is able to provide detailed, previously unavailable information on patient-specific respiratory mechanics and response to therapy in real time. The additional insight available to clinicians provides the potential for improved decision-making, and thus improved patient care and outcomes.
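
    As background for the elastance estimation described above, the sketch below fits the standard single-compartment equation of motion Paw = E*V + R*Q + P0 to a synthetic breath by least squares; CURE Soft's time-varying elastance model is more elaborate, and the data here are made up.

        # A minimal sketch, assuming the standard single-compartment equation of
        # motion Paw(t) = E*V(t) + R*Q(t) + P0 and synthetic (invented) airway data;
        # this constant-E fit is not the CURE Soft implementation.
        import numpy as np

        # synthetic breath: volume V (L), flow Q (L/s), airway pressure Paw (cmH2O)
        t = np.linspace(0, 1, 50)
        V = 0.5 * t
        Q = np.full_like(t, 0.5)
        Paw = 20.0 * V + 10.0 * Q + 5.0 + np.random.default_rng(0).normal(0, 0.1, t.size)

        # least-squares estimate of elastance E, resistance R and offset P0
        A = np.column_stack([V, Q, np.ones_like(t)])
        E, R, P0 = np.linalg.lstsq(A, Paw, rcond=None)[0]
        print(f"E ~ {E:.1f} cmH2O/L, R ~ {R:.1f} cmH2O.s/L, P0 ~ {P0:.1f} cmH2O")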

  15. The Drosophila muscle LIM protein, Mlp84B, cooperates with D-titin to maintain muscle structural integrity.

    PubMed

    Clark, Kathleen A; Bland, Jennifer M; Beckerle, Mary C

    2007-06-15

    Muscle LIM protein (MLP) is a cytoskeletal LIM-only protein expressed in striated muscle. Mutations in human MLP are associated with cardiomyopathy; however, the molecular mechanism by which MLP functions is not established. A Drosophila MLP homolog, mlp84B, displays many of the same features as the vertebrate protein, illustrating the utility of the fly for the study of MLP function. Animals lacking Mlp84B develop into larvae with a morphologically intact musculature, but the mutants arrest during pupation with impaired muscle function. Mlp84B displays muscle-specific expression and is a component of the Z-disc and nucleus. Preventing nuclear retention of Mlp84B does not affect its function, indicating that Mlp84B's site of action is likely to be at the Z-disc. Within the Z-disc, Mlp84B is colocalized with the N-terminus of D-titin, a protein crucial for sarcomere organization and stretch mechanics. The mlp84B mutants phenotypically resemble weak D-titin mutants. Furthermore, reducing D-titin activity in the mlp84B background leads to pronounced enhancement of the mlp84B muscle defects and loss of muscle structural integrity. The genetic interactions between mlp84B and D-titin reveal a role for Mlp84B in maintaining muscle structural integrity that was not obvious from analysis of the mlp84B mutants themselves, and suggest that Mlp84B and D-titin cooperate to stabilize muscle sarcomeres.

  16. Agile Acceptance Test–Driven Development of Clinical Decision Support Advisories: Feasibility of Using Open Source Software

    PubMed Central

    Baldwin, Krystal L; Kannan, Vaishnavi; Flahaven, Emily L; Parks, Cassandra J; Ott, Jason M; Willett, Duwayne L

    2018-01-01

    Background Moving to electronic health records (EHRs) confers substantial benefits but risks unintended consequences. Modern EHRs consist of complex software code with extensive local configurability options, which can introduce defects. Defects in clinical decision support (CDS) tools are surprisingly common. Feasible approaches to prevent and detect defects in EHR configuration, including CDS tools, are needed. In complex software systems, use of test–driven development and automated regression testing promotes reliability. Test–driven development encourages modular, testable design and expanding regression test coverage. Automated regression test suites improve software quality, providing a “safety net” for future software modifications. Each automated acceptance test serves multiple purposes, as requirements (prior to build), acceptance testing (on completion of build), regression testing (once live), and “living” design documentation. Rapid-cycle development or “agile” methods are being successfully applied to CDS development. The agile practice of automated test–driven development is not widely adopted, perhaps because most EHR software code is vendor-developed. However, key CDS advisory configuration design decisions and rules stored in the EHR may prove amenable to automated testing as “executable requirements.” Objective We aimed to establish feasibility of acceptance test–driven development of clinical decision support advisories in a commonly used EHR, using an open source automated acceptance testing framework (FitNesse). Methods Acceptance tests were initially constructed as spreadsheet tables to facilitate clinical review. Each table specified one aspect of the CDS advisory’s expected behavior. Table contents were then imported into a test suite in FitNesse, which queried the EHR database to automate testing. Tests and corresponding CDS configuration were migrated together from the development environment to production, with
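
    To convey the table-driven acceptance-test pattern described above in compact form, the sketch below expresses one hypothetical advisory rule and its expected behaviours as rows in plain Python; the real work used FitNesse tables querying the EHR database, and the rule shown is an invented stand-in.

        # Illustrative sketch only: a Python analogue of table-driven acceptance
        # tests; advisory_fires is a hypothetical CDS rule, not the study's actual
        # advisory configuration.
        def advisory_fires(age, on_anticoagulant, egfr):
            """Hypothetical CDS rule: alert for anticoagulated adults with low eGFR."""
            return on_anticoagulant and egfr < 30 and age >= 18

        # each row = one expected behaviour of the advisory (inputs -> expected output)
        ACCEPTANCE_TABLE = [
            # age, on_anticoagulant, egfr, expected
            (70, True, 25, True),
            (70, True, 45, False),
            (70, False, 25, False),
            (16, True, 25, False),
        ]

        failures = [row for row in ACCEPTANCE_TABLE
                    if advisory_fires(*row[:3]) != row[3]]
        print("all acceptance rows passed" if not failures else f"failed: {failures}")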

  17. VARK Learning Preferences and Mobile Anatomy Software Application Use in Pre-Clinical Chiropractic Students

    ERIC Educational Resources Information Center

    Meyer, Amanda J.; Stomski, Norman J.; Innes, Stanley I.; Armson, Anthony J.

    2016-01-01

    Ubiquitous smartphone ownership and reduced face-to-face teaching time may lead to students making greater use of mobile technologies in their learning. This is the first study to report on the prevalence of mobile gross anatomy software applications (apps) usage in pre-clinical chiropractic students and to ascertain if a relationship exists…

  18. Evaluation of a deidentification (De-Id) software engine to share pathology reports and clinical documents for research.

    PubMed

    Gupta, Dilip; Saul, Melissa; Gilbertson, John

    2004-02-01

    We evaluated a comprehensive deidentification engine at the University of Pittsburgh Medical Center (UPMC), Pittsburgh, PA, that uses a complex set of rules, dictionaries, pattern-matching algorithms, and the Unified Medical Language System to identify and replace identifying text in clinical reports while preserving medical information for sharing in research. In our initial data set of 967 surgical pathology reports, the software did not suppress outside (103), UPMC (47), and non-UPMC (56) accession numbers; dates (7); names (9) or initials (25) of case pathologists; or hospital or laboratory names (46). In 150 reports, some clinical information was suppressed inadvertently (overmarking). The engine retained eponymic patient names, eg, Barrett and Gleason. In the second evaluation (1,000 reports), the software did not suppress outside (90) or UPMC (6) accession numbers or names (4) or initials (2) of case pathologists. In the third evaluation, the software removed names of patients, hospitals (297/300), pathologists (297/300), transcriptionists, residents and physicians, dates of procedures, and accession numbers (298/300). By the end of the evaluation, the system was reliably and specifically removing safe-harbor identifiers and producing highly readable deidentified text without removing important clinical information. Collaboration between pathology domain experts and system developers and continuous quality assurance are needed to optimize ongoing deidentification processes.
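
    A toy version of the pattern-matching step of such a de-identification engine is sketched below; the regular expressions, replacement tokens and report text are assumptions made for the example, and the real De-Id engine also uses dictionaries and UMLS look-ups.

        # A minimal sketch of rule/pattern-based de-identification on a made-up
        # report fragment; patterns and tokens are illustrative assumptions only.
        import re

        PATTERNS = [
            (re.compile(r"\b[A-Z]{1,3}\d{2}-\d{4,6}\b"), "**ACCESSION**"),  # e.g. S04-12345
            (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "**DATE**"),
            (re.compile(r"\bDr\.\s+[A-Z][a-z]+\b"), "**PHYSICIAN**"),
        ]

        def deidentify(text):
            for pattern, token in PATTERNS:
                text = pattern.sub(token, text)
            return text

        report = "Specimen S04-12345 received 02/14/2004, reviewed by Dr. Smith."
        print(deidentify(report))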

  19. Ease of adoption of clinical natural language processing software: An evaluation of five systems.

    PubMed

    Zheng, Kai; Vydiswaran, V G Vinod; Liu, Yang; Wang, Yue; Stubbs, Amber; Uzuner, Özlem; Gururaj, Anupama E; Bayer, Samuel; Aberdeen, John; Rumshisky, Anna; Pakhomov, Serguei; Liu, Hongfang; Xu, Hua

    2015-12-01

    In recognition of potential barriers that may inhibit the widespread adoption of biomedical software, the 2014 i2b2 Challenge introduced a special track, Track 3 - Software Usability Assessment, in order to develop a better understanding of the adoption issues that might be associated with the state-of-the-art clinical NLP systems. This paper reports the ease of adoption assessment methods we developed for this track, and the results of evaluating five clinical NLP system submissions. A team of human evaluators performed a series of scripted adoptability test tasks with each of the participating systems. The evaluation team consisted of four "expert evaluators" with training in computer science, and eight "end user evaluators" with mixed backgrounds in medicine, nursing, pharmacy, and health informatics. We assessed how easy it is to adopt the submitted systems along the following three dimensions: communication effectiveness (i.e., how effective a system is in communicating its designed objectives to intended audience), effort required to install, and effort required to use. We used a formal software usability testing tool, TURF, to record the evaluators' interactions with the systems and 'think-aloud' data revealing their thought processes when installing and using the systems and when resolving unexpected issues. Overall, the ease of adoption ratings that the five systems received are unsatisfactory. Installation of some of the systems proved to be rather difficult, and some systems failed to adequately communicate their designed objectives to intended adopters. Further, the average ratings provided by the end user evaluators on ease of use and ease of interpreting output are -0.35 and -0.53, respectively, indicating that this group of users generally deemed the systems extremely difficult to work with. While the ratings provided by the expert evaluators are higher, 0.6 and 0.45, respectively, these ratings are still low indicating that they also experienced

  20. The Ras suppressor Rsu-1 binds to the LIM 5 domain of the adaptor protein PINCH1 and participates in adhesion-related functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dougherty, Gerard W.; Section on Structural Cell Biology, National Institute on Deafness and Communication Disorders; Chopp, Treasa

    2005-05-15

    Rsu-1 is a highly conserved leucine rich repeat (LRR) protein that is expressed ubiquitously in mammalian cells. Rsu-1 was identified based on its ability to inhibit transformation by Ras, and previous studies demonstrated that ectopic expression of Rsu-1 inhibited anchorage-independent growth of Ras-transformed cells and human tumor cell lines. Using GAL4-based yeast two-hybrid screening, the LIM domain protein, PINCH1, was identified as the binding partner of Rsu-1. PINCH1 is an adaptor protein that localizes to focal adhesions and it has been implicated in the regulation of adhesion functions. Subdomain mapping in yeast revealed that Rsu-1 binds to the LIM 5 domain of PINCH1, a region not previously identified as a specific binding domain for any other protein. Additional testing demonstrated that PINCH2, which is highly homologous to PINCH1, except in the LIM 5 domain, does not interact with Rsu-1. Glutathione transferase fusion protein binding studies determined that the LRR region of Rsu-1 interacts with PINCH1. Transient expression studies using epitope-tagged Rsu-1 and PINCH1 revealed that Rsu-1 co-immunoprecipitated with PINCH1 and colocalized with vinculin at sites of focal adhesions in mammalian cells. In addition, endogenous P33 Rsu-1 from 293T cells co-immunoprecipitated with transiently expressed myc-tagged PINCH1. Furthermore, RNAi-induced reduction in Rsu-1 RNA and protein inhibited cell attachment, and while previous studies demonstrated that ectopic expression of Rsu-1 inhibited Jun kinase activation, the depletion of Rsu-1 resulted in activation of Jun and p38 stress kinases. These studies demonstrate that Rsu-1 interacts with PINCH1 in mammalian cells and functions, in part, by altering cell adhesion.

  1. Agile Acceptance Test-Driven Development of Clinical Decision Support Advisories: Feasibility of Using Open Source Software.

    PubMed

    Basit, Mujeeb A; Baldwin, Krystal L; Kannan, Vaishnavi; Flahaven, Emily L; Parks, Cassandra J; Ott, Jason M; Willett, Duwayne L

    2018-04-13

    Moving to electronic health records (EHRs) confers substantial benefits but risks unintended consequences. Modern EHRs consist of complex software code with extensive local configurability options, which can introduce defects. Defects in clinical decision support (CDS) tools are surprisingly common. Feasible approaches to prevent and detect defects in EHR configuration, including CDS tools, are needed. In complex software systems, use of test-driven development and automated regression testing promotes reliability. Test-driven development encourages modular, testable design and expanding regression test coverage. Automated regression test suites improve software quality, providing a "safety net" for future software modifications. Each automated acceptance test serves multiple purposes, as requirements (prior to build), acceptance testing (on completion of build), regression testing (once live), and "living" design documentation. Rapid-cycle development or "agile" methods are being successfully applied to CDS development. The agile practice of automated test-driven development is not widely adopted, perhaps because most EHR software code is vendor-developed. However, key CDS advisory configuration design decisions and rules stored in the EHR may prove amenable to automated testing as "executable requirements." We aimed to establish feasibility of acceptance test-driven development of clinical decision support advisories in a commonly used EHR, using an open source automated acceptance testing framework (FitNesse). Acceptance tests were initially constructed as spreadsheet tables to facilitate clinical review. Each table specified one aspect of the CDS advisory's expected behavior. Table contents were then imported into a test suite in FitNesse, which queried the EHR database to automate testing. Tests and corresponding CDS configuration were migrated together from the development environment to production, with tests becoming part of the production regression test

  2. Description of data on the Nimbus 7 LIMS map archive tape: Water vapor and nitrogen dioxide

    NASA Technical Reports Server (NTRS)

    Haggard, Kenneth V.; Marshall, B. T.; Kurzeja, Robert J.; Remsberg, Ellis E.; Russell, James M., III

    1988-01-01

    This report describes the process by which data from the Limb Infrared Monitor of the Stratosphere (LIMS) experiment were analyzed to produce estimates of synoptic maps of water vapor and nitrogen dioxide. In addition to a detailed description of the analysis procedure, several interesting features in the data are discussed and used to demonstrate how the analysis procedure produced the final maps and how one can estimate the uncertainties in the maps. In addition, features of the analysis are noted that would influence how one might use, or interpret, the results. These include subjects such as smoothing and the interpretation of wave components.

  3. Dynamic Performance of Subway Vehicle with Linear Induction Motor System

    NASA Astrophysics Data System (ADS)

    Wu, Pingbo; Luo, Ren; Hu, Yan; Zeng, Jing

    The light rail vehicle with a Linear Induction Motor (LIM) bogie, which is a new type of urban rail transit vehicle, has the advantages of low cost, wide applicability, low noise, simple maintenance and better dynamic behavior. This kind of vehicle, supported and guided by the wheel and rail, is not driven by the wheel/rail adhesion force, but by the electromagnetic force between the LIM and the reaction plate. In this paper, three different types of suspension and their characteristics are discussed, considering the interactions both between wheel and rail and between LIM and reaction plate. A nonlinear mathematical model of the vehicle with the LIM bogie is set up using the software SIMPACK, and the electromechanical model is set up in Simulink. Then the running behavior of the LIM vehicle is simulated, and the influence of the suspension on the vehicle dynamic performance is investigated.

  4. [Software-based visualization of patient flow at a university eye clinic].

    PubMed

    Greb, O; Abou Moulig, W; Hufendiek, K; Junker, B; Framme, C

    2017-03-01

    This article presents a method for the visualization and navigation of patient flow in outpatient eye clinics with a high level of complexity. A network-based software solution was developed, targeting long-term process optimization through structural analysis and the temporal coordination of process navigation. Each examination unit receives a separate waiting list of patients, in which the flow of every patient is recorded on a timeline. Time periods and points in time can be selected by mouse click and the desired diagnostic procedure entered. Recent progress on any of these diagnostic requests, as well as a variety of information on patient progress, is collated and drawn into the corresponding timeline, which can be viewed by any of the personnel involved. The software, called TimeElement, has been successfully tested in practical use for several months. As an example, the patient flow for intravitreous injections was recorded for 250 patients using time stamps of defined events, and an average attendance time of 169.71 min was found, with the time for each individual stage also recorded automatically. Recording patient flow data is a fundamental component of patient flow management, waiting time reduction, and time-coordinated patient flow navigation, in particular with timeline-based visualization for each individual patient. Long-term changes in process management can be planned and evaluated by comparing patient flow data. Because using the software itself causes structural changes within the organization, a questionnaire for appraisal by the personnel involved is planned.

  5. RAPID-L Highly Automated Fast Reactor Concept Without Any Control Rods (2) Critical experiment of lithium-6 used in LEM and LIM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsunoda, Hirokazu; Sato, Osamu; Okajima, Shigeaki

    2002-07-01

    In order to achieve fully automated reactor operation of the RAPID-L reactor, the innovative reactivity control systems LEM, LIM, and LRM employ lithium-6 as a liquid poison. Because lithium-6 has not been used as a neutron-absorbing material in conventional fast reactors, measurements of the reactivity worth of lithium-6 were performed at the Fast Critical Assembly (FCA) of the Japan Atomic Energy Research Institute (JAERI). The FCA core was composed of highly enriched uranium and stainless steel samples so as to simulate the core spectrum of RAPID-L. Samples of 95% enriched lithium-6 were inserted into the core parallel to the core axis for the measurement of the reactivity worth at each position. It was found that the measured reactivity worth in the core region agreed well with the value calculated by the method used for the core design of RAPID-L. Bias factors for the core design method were obtained by comparing the experimental and calculated results. The factors were used to determine the number of LEM and LIM units equipped in the core to achieve fully automated operation of RAPID-L. (authors)

  6. Nociceptive DRG neurons express muscle lim protein upon axonal injury.

    PubMed

    Levin, Evgeny; Andreadaki, Anastasia; Gobrecht, Philipp; Bosse, Frank; Fischer, Dietmar

    2017-04-04

    Muscle lim protein (MLP) has long been regarded as a cytosolic and nuclear muscular protein. Here, we show that MLP is also expressed in a subpopulation of adult rat dorsal root ganglia (DRG) neurons in response to axonal injury, whereas the protein is not detectable in naïve cells. Detailed immunohistochemical analysis of L4/L5 DRG revealed ~3% MLP-positive neurons 2 days after complete sciatic nerve crush and a maximum of ~10% after 4-14 days. Similarly, in mixed cultures from cervical, thoracic, lumbar and sacral DRG, ~6% of neurons were MLP-positive after 2 days and a maximum of 17% after 3 days. In both histological sections and cell cultures, the protein was detected in the cytosol and axons of small-diameter cells, while the nucleus remained devoid of it. Moreover, the vast majority could not be assigned to any of the well-characterized canonical DRG subpopulations at 7 days after nerve injury. However, further analysis in cell culture revealed that the largest population of MLP-expressing cells originated from non-peptidergic IB4-positive nociceptive neurons, which lose their ability to bind the lectin upon axotomy. Thus, MLP is mostly expressed in a subset of axotomized nociceptive neurons and can be used as a novel marker for this population of cells.

  7. RNA-LIM: a novel procedure for analyzing protein/single-stranded RNA propensity data with concomitant estimation of interface structure.

    PubMed

    Hall, Damien; Li, Songling; Yamashita, Kazuo; Azuma, Ryuzo; Carver, John A; Standley, Daron M

    2015-03-01

    RNA-LIM is a procedure that can analyze various pseudo-potentials describing the affinity between single-stranded RNA (ssRNA) ribonucleotides and surface amino acids to produce a coarse-grained estimate of the structure of the ssRNA at the protein interface. The search algorithm works by evolving an ssRNA chain, of known sequence, as a series of walks between fixed sites on a protein surface. Optimal routes are found by application of a set of minimal "limiting" restraints derived jointly from (i) selective sampling of the ribonucleotide amino acid affinity pseudo-potential data, (ii) limited surface path exploration by prior determination of surface arc lengths, and (iii) RNA structural specification obtained from a statistical potential gathered from a library of experimentally determined ssRNA structures. We describe the general approach using a NAST (Nucleic Acid Simulation Tool)-like approximation of the ssRNA chain and a generalized pseudo-potential reflecting the location of nucleic acid binding residues. Minimum and maximum performance indicators of the methodology are established using both synthetic data, for which the pseudo-potential defining nucleic acid binding affinity is systematically degraded, and a representative real case, where the RNA binding sites are predicted by the amplified antisense RNA (aaRNA) method. Some potential uses and extensions of the routine are discussed. RNA-LIM analysis programs along with detailed instructions for their use are available on request from the authors. Crown Copyright © 2014. Published by Elsevier Inc. All rights reserved.
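
    To make the flavor of this search concrete, the toy sketch below places each nucleotide of a short ssRNA on one of a few protein-surface sites by dynamic programming, scoring a placement as surface arc length plus a per-nucleotide pseudo-potential term with a maximum step length. The sites, arc lengths, affinity values, and scoring form are all invented assumptions; the real RNA-LIM procedure additionally uses NAST-like chain geometry and a statistical RNA potential.

    ```python
    # Hedged toy sketch of an RNA-LIM-style search: assign each base of an
    # ssRNA to a surface site so that consecutive assignments are joined by
    # short surface walks and score well against a pseudo-potential.
    SITES = ("s0", "s1", "s2", "s3")
    ARC = {("s0", "s1"): 1.0, ("s1", "s2"): 1.2, ("s2", "s3"): 0.8,
           ("s0", "s2"): 2.5, ("s1", "s3"): 2.0, ("s0", "s3"): 3.1}

    def arc(a, b):
        """Surface arc length between two sites (0 for staying put)."""
        return 0.0 if a == b else ARC.get((a, b), ARC.get((b, a)))

    # Invented pseudo-potential: affinity of each base for each site (lower = better).
    AFFINITY = {"A": {"s0": 0.2, "s1": 0.9, "s2": 0.5, "s3": 0.7},
                "C": {"s0": 0.8, "s1": 0.3, "s2": 0.6, "s3": 0.4},
                "G": {"s0": 0.5, "s1": 0.4, "s2": 0.2, "s3": 0.9},
                "U": {"s0": 0.7, "s1": 0.6, "s2": 0.4, "s3": 0.3}}

    def best_placement(seq, max_step=1.5):
        """Dynamic programme: cheapest site assignment for each nucleotide in turn."""
        cost = {s: AFFINITY[seq[0]][s] for s in SITES}
        back = [{s: None for s in SITES}]
        for base in seq[1:]:
            new_cost, prev = {}, {}
            for s in SITES:
                reachable = [(cost[p] + arc(p, s), p) for p in cost
                             if arc(p, s) <= max_step]
                if reachable:
                    c, p = min(reachable)
                    new_cost[s], prev[s] = c + AFFINITY[base][s], p
            cost, back = new_cost, back + [prev]
        site = min(cost, key=cost.get)
        path = [site]
        for prev in reversed(back[1:]):      # trace the walk back to the 5' end
            site = prev[site]
            path.append(site)
        return list(reversed(path)), round(min(cost.values()), 2)

    print(best_placement("ACGU"))  # prints the chosen site per base and the total score
    ```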

  8. Randomization in clinical trials: stratification or minimization? The HERMES free simulation software.

    PubMed

    Fron Chabouis, Hélène; Chabouis, Francis; Gillaizeau, Florence; Durieux, Pierre; Chatellier, Gilles; Ruse, N Dorin; Attal, Jean-Pierre

    2014-01-01

    Operative clinical trials are often small and open-label. Randomization is therefore very important. Stratification and minimization are two randomization options in such trials. The first aim of this study was to compare stratification and minimization in terms of predictability and balance in order to help investigators choose the most appropriate allocation method. Our second aim was to evaluate the influence of various parameters on the performance of these techniques. The created software generated patients according to chosen trial parameters (e.g., number of important prognostic factors, number of operators or centers, etc.) and computed predictability and balance indicators for several stratification and minimization methods over a given number of simulations. Block size and proportion of random allocations could be chosen. A reference trial was chosen (50 patients, 1 prognostic factor, and 2 operators) and eight other trials derived from this reference trial were modeled. Predictability and balance indicators were calculated from 10,000 simulations per trial. Minimization performed better with complex trials (e.g., smaller sample size, increasing number of prognostic factors, and operators); stratification imbalance increased when the number of strata increased. An inverse correlation between imbalance and predictability was observed. A compromise between predictability and imbalance still has to be found by the investigator but our software (HERMES) gives concrete reasons for choosing between stratification and minimization; it can be downloaded free of charge. This software will help investigators choose the appropriate randomization method in future two-arm trials.
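
    The abstract compares two standard allocation schemes; the sketch below shows generic versions of both, a Pocock-Simon-style minimization (assign each patient to the arm that best balances the marginal factor counts, with a small random element) and permuted-block stratification. This is not the HERMES code; the factor names, block size, and random fraction are illustrative assumptions.

    ```python
    # Hedged sketch of the two allocation schemes HERMES simulates:
    # marginal-balance minimization vs. permuted-block stratification.
    import random

    ARMS = ("A", "B")

    def minimization(patients, p_random=0.1):
        """Assign each patient to the arm that best balances factor margins."""
        counts = {arm: {} for arm in ARMS}        # counts[arm][(factor, level)]
        allocation = []
        for pt in patients:
            if random.random() < p_random:        # occasional purely random draw
                arm = random.choice(ARMS)
            else:
                imbalance = {a: sum(counts[a].get(f, 0) for f in pt.items())
                             for a in ARMS}
                low = min(imbalance.values())
                arm = random.choice([a for a in ARMS if imbalance[a] == low])
            for f in pt.items():
                counts[arm][f] = counts[arm].get(f, 0) + 1
            allocation.append(arm)
        return allocation

    def stratified_blocks(patients, block_size=4):
        """Permuted blocks within each stratum (combination of factor levels)."""
        blocks, allocation = {}, []
        for pt in patients:
            stratum = tuple(sorted(pt.items()))
            if not blocks.get(stratum):           # start a fresh shuffled block
                block = list(ARMS) * (block_size // len(ARMS))
                random.shuffle(block)
                blocks[stratum] = block
            allocation.append(blocks[stratum].pop())
        return allocation

    # Example mirroring the reference trial in the abstract:
    # 50 patients, one prognostic factor, two operators.
    patients = [{"factor": random.choice(["low", "high"]),
                 "operator": random.choice(["op1", "op2"])} for _ in range(50)]
    print(minimization(patients).count("A"), stratified_blocks(patients).count("A"))
    ```

    Repeating such a simulation many times and recording how often the next assignment could be guessed (predictability) and how unequal the arms end up (imbalance) is the kind of comparison the abstract describes.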

  9. Development of computer tablet software for clinical quantification of lateral knee compartment translation during the pivot shift test.

    PubMed

    Muller, Bart; Hofbauer, Marcus; Rahnemai-Azar, Amir Ata; Wolf, Megan; Araki, Daisuke; Hoshino, Yuichi; Araujo, Paulo; Debski, Richard E; Irrgang, James J; Fu, Freddie H; Musahl, Volker

    2016-01-01

    The pivot shift test is a clinical examination commonly used by orthopedic surgeons to evaluate knee function following injury. However, the test can only be graded subjectively by the examiner. Therefore, the purpose of this study was to develop software for a computer tablet to quantify anterior translation of the lateral knee compartment during the pivot shift test. Based on a simple image analysis method, software for a computer tablet was developed with the following primary design constraint: the software should be easy to use in a clinical setting and should not slow down an outpatient visit. Translation of the lateral compartment of the intact knee was 2.0 ± 0.2 mm and of the anterior cruciate ligament-deficient knee was 8.9 ± 0.9 mm (p < 0.001). Intra-tester (ICC range = 0.913 to 0.999) and inter-tester (ICC = 0.949) reliability were excellent in the repeatability assessments. Overall, the average percent error in measuring simulated translation of the lateral knee compartment with the tablet parallel to the monitor increased from 2.8% at a distance of 50 cm to 7.7% at 200 cm. Deviation from the parallel position of the tablet did not have a significant effect until a tablet angle of 45°. The average percent error for an anterior translation of the lateral knee compartment of 6 mm was 2.2%, compared with 6.2% for 2 mm of translation. The software provides reliable, objective, and quantitative data on translation of the lateral knee compartment during the pivot shift test and meets the design constraints posed by the clinical setting.

  10. Description of data on the Nimbus 7 LIMS map archive tape: Temperature and geopotential height

    NASA Technical Reports Server (NTRS)

    Haggard, K. V.; Remsberg, E. E.; Grose, W. L.; Russell, J. M., III; Marshall, B. T.; Lingenfelser, G.

    1986-01-01

    The process by which the Limb Infrared Monitor of the Stratosphere (LIMS) experiment data were analyzed to produce estimates of synoptic maps of temperature and geopotential height is described. In addition to a detailed description of the analysis procedure, several interesting features in the data are discussed and used to demonstrate how the analysis procedure produced the final maps and how one can estimate the uncertainties in the maps. Features of the analysis that would influence how one might use or interpret the results are also noted, including smoothing and the interpretation of wave components. While some suggestions are made for an improved analysis of the data, it is shown that, in general, the maps are an excellent estimate of the synoptic fields.

  11. Development of preoperative planning software for transforaminal endoscopic surgery and the guidance for clinical applications.

    PubMed

    Chen, Xiaojun; Cheng, Jun; Gu, Xin; Sun, Yi; Politis, Constantinus

    2016-04-01

    Preoperative planning is of great importance for transforaminal endoscopic techniques applied in percutaneous endoscopic lumbar discectomy. In this study, modular preoperative planning software for transforaminal endoscopic surgery was developed and demonstrated. The path-searching method is based on collision detection, with oriented bounding boxes constructed for the anatomical models. Image reformatting algorithms were then developed for multiplanar reconstruction, which provides detailed anatomical information surrounding the virtually planned path. Finally, multithreading was implemented to keep the software stable and responsive. Preoperative planning software for transforaminal endoscopic surgery (TE-Guider) was developed, and seven cases of patients with symptomatic lumbar disc herniations were planned preoperatively using TE-Guider. The distances to the midlines and the directions of the optimal paths were exported, and each result was in line with empirical values. TE-Guider provides an efficient and cost-effective way to search for the ideal path and entry point for the puncture. However, more clinical cases will be evaluated to demonstrate its feasibility and reliability.
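
    The core geometric test behind such collision-based path search can be sketched compactly: reject a candidate straight path from skin entry point to target if it intersects a bounding box around a protected structure. For brevity the sketch below uses axis-aligned boxes and a slab test; the paper uses oriented bounding boxes, which reduce to the same test after transforming the segment into box coordinates. All coordinates are invented.

    ```python
    # Hedged sketch of a collision check for candidate puncture paths:
    # segment-vs-box intersection via the slab method (axis-aligned boxes here).
    def segment_hits_aabb(p0, p1, box_min, box_max):
        """Does the segment p0 -> p1 intersect the axis-aligned box?"""
        t_enter, t_exit = 0.0, 1.0
        for a, b, lo, hi in zip(p0, p1, box_min, box_max):
            d = b - a
            if abs(d) < 1e-12:
                if a < lo or a > hi:        # parallel to this slab and outside it
                    return False
                continue
            t0, t1 = (lo - a) / d, (hi - a) / d
            if t0 > t1:
                t0, t1 = t1, t0
            t_enter, t_exit = max(t_enter, t0), min(t_exit, t1)
            if t_enter > t_exit:
                return False
        return True

    def path_is_safe(entry, target, obstacle_boxes):
        return not any(segment_hits_aabb(entry, target, lo, hi)
                       for lo, hi in obstacle_boxes)

    # Toy example: one obstacle box lying between the entry point and the target.
    print(path_is_safe((0, 0, 0), (10, 0, 0), [((4, -1, -1), (6, 1, 1))]))  # False
    ```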

  12. A Lagrangian analysis of a sudden stratospheric warming - Comparison of a model simulation and LIMS observations

    NASA Technical Reports Server (NTRS)

    Pierce, R. B.; Remsberg, Ellis E.; Fairlie, T. D.; Blackshear, W. T.; Grose, William L.; Turner, Richard E.

    1992-01-01

    Lagrangian area diagnostics and trajectory techniques are used to investigate the radiative and dynamical characteristics of a spontaneous sudden warming which occurred during a 2-yr Langley Research Center model simulation. The ability of the Langley Research Center GCM to simulate the major features of the stratospheric circulation during such highly disturbed periods is illustrated by comparison of the simulated warming to the observed circulation during the LIMS observation period. The apparent sink of vortex area associated with Rossby wave-breaking accounts for the majority of the reduction of the size of the vortex and also acts to offset the radiatively driven increase in the area occupied by the 'surf zone'. Trajectory analysis of selected material lines substantiates the conclusions from the area diagnostics.

  13. Agreement between clinical estimation and a new quantitative analysis by Photoshop software in fundus and angiographic image variables.

    PubMed

    Ramezani, Alireza; Ahmadieh, Hamid; Azarmina, Mohsen; Soheilian, Masoud; Dehghan, Mohammad H; Mohebbi, Mohammad R

    2009-12-01

    To evaluate the validity of a new method for the quantitative analysis of fundus or angiographic images using Photoshop 7.0 (Adobe, USA) software by comparison with clinical evaluation. Four hundred and eighteen fundus and angiographic images of diabetic patients were evaluated by three retina specialists and then analyzed with Photoshop 7.0 software. Four variables were selected for comparison: the amount of hard exudates (HE) on color pictures, the amount of HE on red-free pictures, the severity of leakage, and the size of the foveal avascular zone (FAZ). The agreement (kappa) between the two methods for the amount of HE on color and red-free photographs was 85% (0.69) and 79% (0.59), respectively. The agreement for severity of leakage was 72% (0.46). For the evaluation of FAZ size using the magic wand and magnetic lasso software tools, the agreement was 54% (0.09) and 89% (0.77), respectively. Agreement in the estimation of FAZ size with the magnetic lasso tool was excellent, and agreement was almost as good in the quantification of HE on color and red-free images. Considering the agreement of this new technique with clinical evaluation for the measurement of variables in fundus images using Photoshop software, the method seems to have sufficient validity to be used for the quantitative analysis of HE, leakage, and FAZ size on the angiograms of diabetic patients.
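
    The underlying measurement is pixel counting after a selection or threshold; a minimal sketch of the same idea outside Photoshop, with a synthetic image and an arbitrary threshold (both assumptions), is shown below.

    ```python
    # Hedged sketch of pixel-level quantification analogous to the study's
    # Photoshop workflow: threshold bright hard-exudate-like pixels and report
    # their area in pixels and as a fraction of the image.
    import numpy as np

    rng = np.random.default_rng(0)
    fundus = rng.normal(0.35, 0.05, size=(512, 512))   # synthetic background intensity
    fundus[200:220, 300:340] = 0.9                      # synthetic bright exudate patch

    exudate_mask = fundus > 0.7                         # threshold-based selection
    exudate_area_px = int(exudate_mask.sum())
    exudate_fraction = exudate_area_px / fundus.size

    print(f"hard-exudate area: {exudate_area_px} px "
          f"({100 * exudate_fraction:.2f}% of the image)")
    ```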

  14. The LIM protein LIMD1 influences osteoblast differentiation and function

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luderer, Hilary F.; Bai Shuting; Longmore, Gregory D.

    2008-09-10

    The balance between bone resorption and bone formation involves the coordinated activities of osteoblasts and osteoclasts. Communication between these two cell types is essential for maintenance of normal bone homeostasis; however, the mechanisms regulating this cross talk are not completely understood. Many factors that mediate differentiation and function of both osteoblasts and osteoclasts have been identified. The LIM protein Limd1 has been implicated in the regulation of stress osteoclastogenesis through an interaction with the p62/sequestosome protein. Here we show that Limd1 also influences osteoblast progenitor numbers, differentiation, and function. Limd1-/- calvarial osteoblasts display increased mineralization and accelerated differentiation. While no significant differences in osteoblast number or function were detected in vivo, bone marrow stromal cells isolated from Limd1-/- mice contain significantly more osteoblast progenitors than wild type controls when cultured ex vivo. Furthermore, we observed a significant increase in nuclear β-catenin staining in differentiating Limd1-/- calvarial osteoblasts, suggesting that Limd1 is a negative regulator of canonical Wnt signaling in osteoblasts. These results demonstrate that Limd1 influences not only stress osteoclastogenesis but also osteoblast function and osteoblast progenitor commitment. Together, these data identify Limd1 as a novel regulator of the development and function of both bone osteoclasts and bone osteoblasts.

  15. World Workshop on Oral Medicine VI: Utilization of Oral Medicine-specific software for support of clinical care, research, and education: current status and strategy for broader implementation.

    PubMed

    Brailo, Vlaho; Firriolo, Francis John; Tanaka, Takako Imai; Varoni, Elena; Sykes, Rosemary; McCullough, Michael; Hua, Hong; Sklavounou, Alexandra; Jensen, Siri Beier; Lockhart, Peter B; Mattsson, Ulf; Jontell, Mats

    2015-08-01

    To assess the current scope and status of Oral Medicine-specific software (OMSS) utilized to support clinical care, research, and education in Oral Medicine, and to propose a strategy for broader implementation of OMSS within the global Oral Medicine community. An invitation letter explaining the objectives was sent to the global Oral Medicine community. Respondents were interviewed to obtain information about different aspects of OMSS functionality. Ten OMSS tools were identified. Four were being used for clinical care, one for research, two for education, and three were multipurpose. Clinical software was being utilized as databases developed to integrate different types of clinical information. Research software was designed to facilitate multicenter research. Educational software consisted of interactive, case-oriented technology designed for clinical training in Oral Medicine. Easy access to patient data was the most commonly reported advantage. Difficulty of use and poor integration with other software were the most commonly reported disadvantages. The OMSS presented in this paper demonstrate how information technology (IT) can have an impact on the quality of patient care, research, and education in the field of Oral Medicine. A strategy for broader implementation of OMSS is proposed. Copyright © 2015 Elsevier Inc. All rights reserved.

  16. A data model for clinical legal medicine practice and the development of a dedicated software for both practitioners and researchers.

    PubMed

    Dang, Catherine; Phuong, Thomas; Beddag, Mahmoud; Vega, Anabel; Denis, Céline

    2018-07-01

    To present a data model for clinical legal medicine and the software based on that data model for both practitioners and researchers. The main functionalities of the presented software are computer-assisted production of medical certificates and data capture, storage and retrieval. The data model and the software were jointly developed by the department of forensic medicine of the Jean Verdier Hospital (Bondy, France) and a bioinformatics laboratory (LIMICS, Paris universities 6-13) between November 2015 and May 2016. The data model was built from four sources: i) a template used in our department for producing standardised medical certificates; ii) a random sample of medical certificates produced by the forensic department; iii) prior consensus between four healthcare professionals (two forensic practitioners, a psychologist and a forensic psychiatrist); and iv) anatomical dictionaries. The trial version of the open source software was first designed for the examination of physical assault survivors. A UML-like data model dedicated to clinical legal practice was built. The data model describes the terminology for examinations of sexual assault survivors, physical assault survivors, individuals kept in police custody, and undocumented migrants for age estimation. A trial version of software relying on the data model was developed and tested by three physicians. The software allows file archiving, standardised data collection and extraction, and assistance with certificate generation. It can also be used for research purposes, through data exchange and analysis. Despite some current limitations of use, it is a tool which can be shared and used by other departments of forensic medicine and other specialties, improving data management and exploitation. Full integration with external sources and analytics software and use of a semantic interoperability framework are planned for the coming months. Copyright © 2016 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  17. Data warehouse implementation with clinical pharmacokinetic/pharmacodynamic data.

    PubMed

    Koprowski, S P; Barrett, J S

    2002-03-01

    We have created a data warehouse for human pharmacokinetic (PK) and pharmacodynamic (PD) data generated primarily within the Clinical PK Group of the Drug Metabolism and Pharmacokinetics (DM&PK) Department of DuPont Pharmaceuticals. Data that enter an Oracle-based LIMS, directly from chromatography systems or through files from contract research organizations, are accessed via SAS/PH.Kinetics, GLP-compliant data analysis software residing on individual users' workstations. Upon completion of the final PK or PD analysis, the data are pushed to a predefined location. Data analyzed or created with other software (i.e., WinNonlin, NONMEM, Adapt, etc.) are added to this file repository as well. The warehouse creates views of these data and accumulates metadata on all data sources defined in the warehouse. The warehouse is managed via the SAS/Warehouse Administrator product, which defines the environment, creates summarized data structures, and schedules data refreshes. The clinical PK/PD warehouse encompasses laboratory, biometric, PK and PD data streams. Detailed logical tables for each compound are created and updated as the clinical PK/PD data warehouse is populated. The data model defined in the warehouse is based on a star schema. Summarized data structures such as multidimensional databases (MDDB), infomarts, and datamarts are created from the detail tables. Data mining and querying of highly summarized data, as well as drill-down to detail data, are possible via exploitation tools that front-end the warehouse data. Because the warehouse data are refreshed periodically, these applications always access the most current data available and do not require a manual interface to update or populate the data store. Prototype applications have been web-enabled to facilitate their use by varied data customers across platforms and locations. The warehouse also contains automated mechanisms for the construction of study data listings and SAS transport files for eventual
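
    The star-schema idea the abstract relies on, a central fact table of observations keyed to small dimension tables, can be illustrated with a tiny example; the table and column names below are invented for illustration and are not the DuPont warehouse schema.

    ```python
    # Hedged sketch of a star schema for PK observations: one fact table joined
    # to subject and compound dimensions, supporting summary plus drill-down.
    import pandas as pd

    dim_subject = pd.DataFrame({"subject_id": [1, 2], "sex": ["F", "M"], "age": [34, 51]})
    dim_compound = pd.DataFrame({"compound_id": [10], "compound": ["CMP-001 (hypothetical)"]})
    fact_pk = pd.DataFrame({
        "subject_id":  [1, 1, 2, 2],
        "compound_id": [10, 10, 10, 10],
        "time_h":      [0.5, 1.0, 0.5, 1.0],
        "conc_ng_ml":  [120.0, 95.0, 140.0, 110.0],
    })

    # Summarize ("highly summarized data"), then drill down to the detail rows.
    detail = (fact_pk.merge(dim_subject, on="subject_id")
                     .merge(dim_compound, on="compound_id"))
    summary = detail.groupby(["compound", "sex"])["conc_ng_ml"].mean()
    print(summary)
    print(detail[detail["sex"] == "F"])
    ```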

  18. MNE Scan: Software for real-time processing of electrophysiological data.

    PubMed

    Esch, Lorenz; Sun, Limin; Klüber, Viktor; Lew, Seok; Baumgarten, Daniel; Grant, P Ellen; Okada, Yoshio; Haueisen, Jens; Hämäläinen, Matti S; Dinh, Christoph

    2018-06-01

    Magnetoencephalography (MEG) and electroencephalography (EEG) are noninvasive techniques for studying the electrophysiological activity of the human brain. Thus, they are well suited for real-time monitoring and analysis of neuronal activity. Real-time MEG/EEG data processing allows the stimuli to be adjusted to the subject's responses, optimizing the information acquired, especially by providing dynamically changing displays to enable neurofeedback. We introduce MNE Scan, an acquisition and real-time analysis software package based on the multipurpose software library MNE-CPP. MNE Scan allows the development and application of acquisition and novel real-time processing methods in both research and clinical studies. MNE Scan development follows a strict software engineering process to enable the approvals required for clinical software. We tested the performance of MNE Scan in several device-independent use cases, including a clinical epilepsy study, real-time source estimation, and a Brain Computer Interface (BCI) application. Compared to existing tools, we propose modular software that addresses the clinical software requirements expected by certification authorities while remaining extendable and freely accessible. We conclude that MNE Scan is the first step in creating device-independent open-source software to facilitate the transition from basic neuroscience research to both applied sciences and clinical applications. Copyright © 2018 Elsevier B.V. All rights reserved.

  19. Tissue specific characterisation of Lim-kinase 1 expression during mouse embryogenesis

    PubMed Central

    Lindström, Nils O.; Neves, Carlos; McIntosh, Rebecca; Miedzybrodzka, Zosia; Vargesson, Neil; Collinson, J. Martin

    2012-01-01

    The Lim-kinase (LIMK) proteins are important for the regulation of the actin cytoskeleton, in particular the control of actin nucleation and depolymerisation via regulation of cofilin, and hence may control a large number of processes during development, including cell tensegrity, migration, cell cycling, and axon guidance. LIMK1/LIMK2 knockouts disrupt spinal cord morphogenesis and synapse formation but other tissues and developmental processes that require LIMK are yet to be fully determined. To identify tissues and cell-types that may require LIMK, we characterised the pattern of LIMK1 protein during mouse embryogenesis. We showed that LIMK1 displays an expression pattern that is temporally dynamic and tissue-specific. In several tissues LIMK1 is detected in cell-types that also express Wilms’ tumour protein 1 and that undergo transitions between epithelial and mesenchymal states, including the pleura, epicardium, kidney nephrons, and gonads. LIMK1 was also found in a subset of cells in the dorsal retina, and in mesenchymal cells surrounding the peripheral nerves. This detailed study of the spatial and temporal expression of LIMK1 shows that LIMK1 expression is more dynamic than previously reported, in particular at sites of tissue–tissue interactions guiding multiple developmental processes. PMID:21167960

  20. Palmitoylation of LIM Kinase-1 ensures spine-specific actin polymerization and morphological plasticity

    PubMed Central

    George, Joju; Soares, Cary; Montersino, Audrey; Beique, Jean-Claude; Thomas, Gareth M

    2015-01-01

    Precise regulation of the dendritic spine actin cytoskeleton is critical for neurodevelopment and neuronal plasticity, but how neurons spatially control actin dynamics is not well defined. Here, we identify direct palmitoylation of the actin regulator LIM kinase-1 (LIMK1) as a novel mechanism to control spine-specific actin dynamics. A conserved palmitoyl-motif is necessary and sufficient to target LIMK1 to spines and to anchor LIMK1 in spines. ShRNA knockdown/rescue experiments reveal that LIMK1 palmitoylation is essential for normal spine actin polymerization, for spine-specific structural plasticity and for long-term spine stability. Palmitoylation is critical for LIMK1 function because this modification not only controls LIMK1 targeting, but is also essential for LIMK1 activation by its membrane-localized upstream activator PAK. These novel roles for palmitoylation in the spatial control of actin dynamics and kinase signaling provide new insights into structural plasticity mechanisms and strengthen links between dendritic spine impairments and neuropathological conditions. DOI: http://dx.doi.org/10.7554/eLife.06327.001 PMID:25884247

  1. Visually directed vs. software-based targeted biopsy compared to transperineal template mapping biopsy in the detection of clinically significant prostate cancer.

    PubMed

    Valerio, Massimo; McCartan, Neil; Freeman, Alex; Punwani, Shonit; Emberton, Mark; Ahmed, Hashim U

    2015-10-01

    Targeted biopsy based on cognitive or software registration of magnetic resonance imaging (MRI) to transrectal ultrasound seems to increase the detection rate of clinically significant prostate cancer compared with standard biopsy. However, these strategies have not yet been directly compared against an accurate reference test. The aim of this study was to obtain pilot data on the diagnostic ability of visually directed targeted biopsy vs. software-based targeted biopsy, with transperineal template mapping (TPM) biopsy as the reference test. This prospective paired cohort study included 50 consecutive men undergoing TPM with one or more visible targets detected on preoperative multiparametric MRI. Targets were contoured with the Biojet software. Patients initially underwent software-based targeted biopsies, then visually directed targeted biopsies, and finally systematic TPM. The detection rate of clinically significant disease (Gleason score ≥3+4 and/or maximum cancer core length ≥4 mm) of one strategy against another was compared using 3×3 contingency tables. Secondary analyses were performed using a less stringent threshold of significance (Gleason score ≥4+3 and/or maximum cancer core length ≥6 mm). Median age was 68 years (interquartile range: 63-73); median prostate-specific antigen level was 7.9 ng/mL (6.4-10.2). A total of 79 targets were detected, with a mean of 1.6 targets per patient. Of these, 27 (34%), 28 (35%), and 24 (31%) were scored 3, 4, and 5, respectively. At a patient level, the detection rate was 32 (64%), 34 (68%), and 38 (76%) for visually directed targeted biopsy, software-based targeted biopsy, and TPM, respectively. Combining the two targeted strategies would have led to a detection rate of 39 (78%). At both the patient level and the target level, software-based targeted biopsy found more clinically significant disease than did visually directed targeted biopsy, although the difference was not statistically significant (22% vs. 14%, P = 0.48; 51.9% vs. 44.3%, P = 0.24). Secondary

  2. An overview of wave-mean flow interactions during the winter of 1978-79 derived from LIMS observations. [Limb Infrared Monitor of Stratosphere

    NASA Technical Reports Server (NTRS)

    Gille, J. C.; Lyjak, L. V.

    1984-01-01

    Gradient winds, Eliassen-Palm (EP) fluxes and flux divergences, and the squared refractive index for planetary waves have been calculated from mapped data from the Limb Infrared Monitor of the Stratosphere (LIMS) experiment on Nimbus 7. The changes in the zonal mean atmospheric state, from early winter through three disturbances, are described. Convergence or divergence of the EP fluxes clearly produces changes in the zonal mean wind. The steering of the waves by the refractive index structure is not as clear on a daily basis.

  3. From LIMS to OMPS-LP: limb ozone observations for future reanalyses

    NASA Astrophysics Data System (ADS)

    Wargan, K.; Kramarova, N. A.; Remsberg, E. E.; Coy, L.; Harvey, L.; Livesey, N. J.; Pawson, S.

    2017-12-01

    High vertical resolution and accuracy of ozone data from satellite-borne limb sounders have made them an invaluable tool in scientific studies of the middle and upper atmosphere. However, it was not until recently that these measurements were successfully incorporated in atmospheric reanalyses: of the major multidecadal reanalyses only ECMWF's ERA-Interim/ERA5 and NASA's MERRA-2 use limb ozone data. Validation and comparison studies have demonstrated that the addition of observations from the Microwave Limb Sounder (MLS) on EOS Aura greatly improved the quality of ozone fields in MERRA-2 making these assimilated data sets useful for scientific research. In this presentation, we will show the results of test experiments assimilating retrieved ozone from the Limb Infrared Monitor of the Stratosphere (LIMS, 1978/1979) and Ozone Mapping Profiler Suite Limb Profiler (OMPS-LP, 2012 to present). Our approach builds on the established assimilation methodology used for MLS in MERRA-2 and, in the case of OMPS-LP, extends the excellent record of MLS ozone assimilation into the post-EOS era in Earth observations. We will show case studies, discuss comparisons of the new experiments with MERRA-2, strategies for bias correction and the potential for combined assimilation of multiple limb ozone data types in future reanalyses for studies of multidecadal stratospheric ozone changes including trends.

  4. Architecture Design of Healthcare Software-as-a-Service Platform for Cloud-Based Clinical Decision Support Service.

    PubMed

    Oh, Sungyoung; Cha, Jieun; Ji, Myungkyu; Kang, Hyekyung; Kim, Seok; Heo, Eunyoung; Han, Jong Soo; Kang, Hyunggoo; Chae, Hoseok; Hwang, Hee; Yoo, Sooyoung

    2015-04-01

    To design a cloud computing-based Healthcare Software-as-a-Service (SaaS) Platform (HSP) for delivering healthcare information services with low cost, high clinical value, and high usability. We analyzed the architecture requirements of an HSP, including the interface, business services, cloud SaaS, quality attributes, privacy and security, and multi-lingual capacity. For cloud-based SaaS services, we focused on Clinical Decision Service (CDS) content services, basic functional services, and mobile services. Microsoft's Azure cloud computing for Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) was used. The functional and software views of an HSP were designed in a layered architecture. External systems can be interfaced with the HSP using SOAP and REST/JSON. The multi-tenancy model of the HSP was designed as a shared database, with a separate schema for each tenant through a single application, although healthcare data can be physically located on a cloud or in a hospital, depending on regulations. The CDS services were categorized into rule-based services for medications, alert registration services, and knowledge services. We expect that cloud-based HSPs will allow small and mid-sized hospitals, in addition to large-sized hospitals, to adopt information infrastructures and health information technology with low system operation and maintenance costs.
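
    The "shared database, separate schema for each tenant" model described here can be illustrated with a minimal sketch: a single application resolves each hospital (tenant) to its own schema before building a query. The tenant names, schema names, and tables below are assumptions for illustration; the actual platform described runs on Azure and is not reproduced here.

    ```python
    # Hedged sketch of shared-database, separate-schema multi-tenancy:
    # one application, one database, one schema per hospital tenant.
    TENANT_SCHEMAS = {
        "hospital_a": "hsp_hospital_a",
        "hospital_b": "hsp_hospital_b",
    }

    def table_for(tenant_id: str, table: str) -> str:
        """Qualify a table name with the tenant's schema (KeyError if unknown)."""
        return f"{TENANT_SCHEMAS[tenant_id]}.{table}"

    def build_medication_alert_query(tenant_id: str, patient_id: int) -> str:
        """Build a per-tenant query for a rule-based medication CDS check."""
        orders = table_for(tenant_id, "medication_orders")
        return (f"SELECT drug_code FROM {orders} "
                f"WHERE patient_id = {int(patient_id)} AND active = 1")

    print(build_medication_alert_query("hospital_a", 42))
    ```

    In a real deployment the tenant-to-schema mapping would come from configuration and queries would be parameterized; the point of the sketch is only that tenant isolation happens at the schema level while the application code stays shared.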

  5. Four and a half LIM domain protein signaling and cardiomyopathy.

    PubMed

    Liang, Yan; Bradford, William H; Zhang, Jing; Sheikh, Farah

    2018-06-20

    Four and a half LIM domain (FHL) protein family members, FHL1 and FHL2, are multifunctional proteins that are enriched in cardiac muscle. Although they both localize within the cardiomyocyte sarcomere (titin N2B), they have been shown to have important yet unique functions within the context of cardiac hypertrophy and disease. Studies in FHL1-deficient mice have primarily uncovered mitogen-activated protein kinase (MAPK) scaffolding functions for FHL1 as part of a novel biomechanical stretch sensor within the cardiomyocyte sarcomere, which acts as a positive regulator of pressure overload-mediated cardiac hypertrophy. New data have highlighted a novel role for the serine/threonine protein phosphatase (PP5) as a deactivator of the FHL1-based biomechanical stretch sensor, which has implications in not only cardiac hypertrophy but also heart failure. In contrast, studies in FHL2-deficient mice have primarily uncovered an opposing role for FHL2 as a negative regulator of adrenergic-mediated signaling and cardiac hypertrophy, further suggesting unique functions targeted by FHL proteins in the "stressed" cardiomyocyte. In this review, we provide current knowledge of the role of FHL1 and FHL2 in cardiac muscle as it relates to their actions in cardiac hypertrophy and cardiomyopathy. A specific focus will be to dissect the pathways and protein-protein interactions that underlie FHLs' signaling role in cardiac hypertrophy as well as provide a comprehensive list of FHL mutations linked to cardiac disease, using evidence gained from genetic mouse models and human genetic studies.

  6. Dynamic contrast-enhanced MRI: Study of inter-software accuracy and reproducibility using simulated and clinical data.

    PubMed

    Beuzit, Luc; Eliat, Pierre-Antoine; Brun, Vanessa; Ferré, Jean-Christophe; Gandon, Yves; Bannier, Elise; Saint-Jalmes, Hervé

    2016-06-01

    To test the reproducibility and accuracy of pharmacokinetic parameter measurements in five analysis software packages (SPs) for dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI), using simulated and clinical data. This retrospective study was Institutional Review Board-approved. Simulated tissues consisted of pixel clusters of calculated dynamic signal changes for combinations of Tofts model pharmacokinetic parameters (volume transfer constant [Ktrans], extravascular extracellular volume fraction [ve]) and longitudinal relaxation time (T1). The clinical group comprised 27 patients treated for rectal cancer, with 36 3T DCE-MR scans performed between November 2012 and February 2014, including dual-flip-angle T1 mapping and a dynamic postcontrast T1-weighted, 3D spoiled gradient-echo sequence. The clinical and simulated images were postprocessed with the five SPs to measure Ktrans, ve, and the initial area under the gadolinium curve (iAUGC). Modified Bland-Altman analysis was conducted, and intraclass correlation coefficients (ICCs) and within-subject coefficients of variation were calculated. Thirty-one examinations from 23 patients were of sufficient technical quality and were postprocessed. Measurement errors were observed on the simulated data for all pharmacokinetic parameters and SPs, with a bias ranging from -0.19 min⁻¹ to 0.09 min⁻¹ for Ktrans, -0.15 to 0.01 for ve, and -0.65 to 1.66 mmol·L⁻¹·min for iAUGC. The ICC between SPs revealed moderate agreement for the simulated data (Ktrans: 0.50; ve: 0.67; iAUGC: 0.77) and very poor agreement for the clinical data (Ktrans: 0.10; ve: 0.16; iAUGC: 0.21). Significant errors were found in the calculated DCE-MRI pharmacokinetic parameters for the perfusion analysis SPs, resulting in poor inter-software reproducibility. J. Magn. Reson. Imaging 2016;43:1288-1300. © 2015 Wiley Periodicals, Inc.
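
    For reference, the standard Tofts model relating the reported parameters is sketched below; this is a textbook statement of the model, not the exact implementation used by any of the five packages. Here Ct is the tissue concentration, Cp the arterial plasma concentration, and T the initial integration window for iAUGC (commonly on the order of 60 to 90 s, a choice that differs between packages).

    ```latex
    % Standard Tofts model (reference sketch, not a package-specific implementation)
    C_t(t) = K^{\mathrm{trans}} \int_0^{t} C_p(\tau)\, e^{-k_{ep}\,(t-\tau)}\, \mathrm{d}\tau,
    \qquad k_{ep} = \frac{K^{\mathrm{trans}}}{v_e},
    \qquad \mathrm{iAUGC} = \int_{0}^{T} C_t(t)\, \mathrm{d}t .
    ```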

  7. LIM mineralization protein-1 potentiates bone morphogenetic protein responsiveness via a novel interaction with Smurf1 resulting in decreased ubiquitination of Smads.

    PubMed

    Sangadala, Sreedhara; Boden, Scott D; Viggeswarapu, Manjula; Liu, Yunshan; Titus, Louisa

    2006-06-23

    Development and repair of the skeletal system and other organs is highly dependent on precise regulation of bone morphogenetic proteins (BMPs), their receptors, and their intracellular signaling proteins known as Smads. The use of BMPs clinically to induce bone formation has been limited in part by the requirement of much higher doses of recombinant proteins in primates than were needed in cell culture or rodents. Therefore, control of cellular responsiveness to BMPs is now a critical area that is poorly understood. We determined that LMP-1, a LIM domain protein capable of inducing de novo bone formation, interacts with Smurf1 (Smad ubiquitin regulatory factor 1) and prevents ubiquitination of Smads. In the region of LMP responsible for bone formation, there is a motif that directly interacts with the Smurf1 WW2 domain and can effectively compete with Smad1 and Smad5 for binding. We have shown that small peptides containing this motif can mimic the ability to block Smurf1 from binding Smads. This novel interaction of LMP-1 with the WW2 domain of Smurf1 to block Smad binding results in increased cellular responsiveness to exogenous BMP and demonstrates a novel regulatory mechanism for the BMP signaling pathway.

  8. Towards integration of clinical decision support in commercial hospital information systems using distributed, reusable software and knowledge components.

    PubMed

    Müller, M L; Ganslandt, T; Eich, H P; Lang, K; Ohmann, C; Prokosch, H U

    2001-12-01

    Clinicians' acceptance of clinical decision support depends on its workflow-oriented, context-sensitive accessibility and availability at the point of care, integrated into the Electronic Patient Record (EPR). Commercially available Hospital Information Systems (HIS) often focus on administrative tasks and mostly do not provide additional knowledge-based functionality. Their traditionally monolithic and closed software architecture encumbers integration of and interaction with external software modules. Our aim was to develop methods and interfaces to integrate knowledge sources into two different commercial hospital information systems to provide the best decision support possible within the context of available patient data. An existing, proven standalone scoring system for acute abdominal pain was supplemented by a communication interface. In both HIS we defined data entry forms and developed individual and reusable mechanisms for data exchange with external software modules. We designed an additional knowledge support frontend which controls data exchange between the HIS and the knowledge modules. Finally, we added guidelines and algorithms to the knowledge library. Despite some major drawbacks, which resulted mainly from the HIS' closed software architectures, we showed by example how external knowledge support can be integrated almost seamlessly into different commercial HIS. This paper describes the prototypical design and current implementation and discusses our experiences.

  9. From LIMS to OMPS-LP: Limb Ozone Observations for Future Reanalyses

    NASA Technical Reports Server (NTRS)

    Wargan, K.; Kramarova, N.; Remsberg, E.; Coy, L.; Harvey, L.; Livesey, N.; Pawson, S.

    2017-01-01

    High vertical resolution and accuracy of ozone data from satellite-borne limb sounders has made them an invaluable tool in scientific studies of the middle and upper atmosphere. However, it was not until recently that these measurements were successfully incorporated in atmospheric reanalyses: of the major multidecadal reanalyses only ECMWF's (European Centre for Medium-Range Weather Forecasts') ERA (ECMWF Re-Analysis)-Interim/ERA5 and NASA's MERRA-2 (Modern-Era Retrospective Analysis for Research and Applications-2) use limb ozone data. Validation and comparison studies have demonstrated that the addition of observations from the Microwave Limb Sounder (MLS) on EOS (Earth Observing System) Aura greatly improved the quality of ozone fields in MERRA-2 making these assimilated data sets useful for scientific research. In this presentation, we will show the results of test experiments assimilating retrieved ozone from the Limb Infrared Monitor of the Stratosphere (LIMS, 1978/1979) and Ozone Mapping Profiler Suite Limb Profiler (OMPS-LP, 2012 to present). Our approach builds on the established assimilation methodology used for MLS in MERRA-2 and, in the case of OMPS-LP, extends the excellent record of MLS ozone assimilation into the post-EOS era in Earth observations. We will show case studies, discuss comparisons of the new experiments with MERRA-2, strategies for bias correction and the potential for combined assimilation of multiple limb ozone data types in future reanalyses for studies of multidecadal stratospheric ozone changes including trends.

  10. General guidelines for biomedical software development

    PubMed Central

    Silva, Luis Bastiao; Jimenez, Rafael C.; Blomberg, Niklas; Luis Oliveira, José

    2017-01-01

    Most bioinformatics tools available today were not written by professional software developers, but by people that wanted to solve their own problems, using computational solutions and spending the minimum time and effort possible, since these were just the means to an end. Consequently, a vast number of software applications are currently available, hindering the task of identifying the utility and quality of each. At the same time, this situation has hindered regular adoption of these tools in clinical practice. Typically, they are not sufficiently developed to be used by most clinical researchers and practitioners. To address these issues, it is necessary to re-think how biomedical applications are built and adopt new strategies that ensure quality, efficiency, robustness, correctness and reusability of software components. We also need to engage end-users during the development process to ensure that applications fit their needs. In this review, we present a set of guidelines to support biomedical software development, with an explanation of how they can be implemented and what kind of open-source tools can be used for each specific topic. PMID:28443186

  11. Architecture Design of Healthcare Software-as-a-Service Platform for Cloud-Based Clinical Decision Support Service

    PubMed Central

    Oh, Sungyoung; Cha, Jieun; Ji, Myungkyu; Kang, Hyekyung; Kim, Seok; Heo, Eunyoung; Han, Jong Soo; Kang, Hyunggoo; Chae, Hoseok; Hwang, Hee

    2015-01-01

    Objectives: To design a cloud computing-based Healthcare Software-as-a-Service (SaaS) Platform (HSP) for delivering healthcare information services with low cost, high clinical value, and high usability. Methods: We analyzed the architecture requirements of an HSP, including the interface, business services, cloud SaaS, quality attributes, privacy and security, and multi-lingual capacity. For cloud-based SaaS services, we focused on Clinical Decision Service (CDS) content services, basic functional services, and mobile services. Microsoft's Azure cloud computing for Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) was used. Results: The functional and software views of an HSP were designed in a layered architecture. External systems can be interfaced with the HSP using SOAP and REST/JSON. The multi-tenancy model of the HSP was designed as a shared database, with a separate schema for each tenant through a single application, although healthcare data can be physically located on a cloud or in a hospital, depending on regulations. The CDS services were categorized into rule-based services for medications, alert registration services, and knowledge services. Conclusions: We expect that cloud-based HSPs will allow small and mid-sized hospitals, in addition to large-sized hospitals, to adopt information infrastructures and health information technology with low system operation and maintenance costs. PMID:25995962

  12. LIMS for Lasers 2015 for achieving long-term accuracy and precision of δ2H, δ17O, and δ18O of waters using laser absorption spectrometry

    USGS Publications Warehouse

    Coplen, Tyler B.; Wassenaar, Leonard I

    2015-01-01

    Rationale: Although laser absorption spectrometry (LAS) instrumentation is easy to use, its incorporation into laboratory operations is not easy, owing to extensive offline manipulation of comma-separated-values files for outlier detection, between-sample memory correction, nonlinearity (δ-variation with water amount) correction, drift correction, normalization to VSMOW-SLAP scales, and difficulty in performing long-term QA/QC audits. Methods: A Microsoft Access relational-database application, LIMS (Laboratory Information Management System) for Lasers 2015, was developed. It automates LAS data corrections and manages clients, projects, samples, instrument-sample lists, and triple-isotope (δ17O, δ18O, and δ2H values) instrumental data for liquid-water samples. It enables users to (1) graphically evaluate sample injections for variable water yields and high isotope-delta variance; (2) correct for between-sample carryover, instrumental drift, and δ nonlinearity; and (3) normalize final results to VSMOW-SLAP scales. Results: Cost-free LIMS for Lasers 2015 enables users to obtain improved δ17O, δ18O, and δ2H values with liquid-water LAS instruments, even those with under-performing syringes. For example, LAS δ2HVSMOW measurements of USGS50 Lake Kyoga (Uganda) water using an under-performing syringe having ±10 % variation in water concentration gave +31.7 ± 1.6 ‰ (2-σ standard deviation), compared with the reference value of +32.8 ± 0.4 ‰, after correction for variation in δ value with water concentration, between-sample memory, and normalization to the VSMOW-SLAP scale. Conclusions: LIMS for Lasers 2015 enables users to create systematic, well-founded instrument templates, import δ2H, δ17O, and δ18O results, evaluate performance with automatic graphical plots, correct for δ nonlinearity due to variable water concentration, correct for between-sample memory, adjust for drift, perform VSMOW-SLAP normalization, and perform long-term QA/QC audits.
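
    The final normalization step the abstract mentions is, at heart, a two-point linear mapping of measured delta values onto the VSMOW-SLAP scale using two reference waters. The sketch below shows that mapping only (after memory, nonlinearity and drift corrections would already have been applied); the numeric reference values are illustrative placeholders, not a laboratory calibration.

    ```python
    # Hedged sketch of two-point VSMOW-SLAP normalization: fit a line through the
    # measured vs. accepted delta values of two reference waters, apply to samples.
    def vsmow_slap_normalize(measured, ref_measured, ref_accepted):
        """Map measured delta values (per mil) onto the VSMOW-SLAP scale.

        ref_measured / ref_accepted: (high_ref, low_ref) delta pairs for the
        laboratory's two reference waters.
        """
        (m_hi, m_lo), (a_hi, a_lo) = ref_measured, ref_accepted
        slope = (a_hi - a_lo) / (m_hi - m_lo)
        return [a_lo + slope * (d - m_lo) for d in measured]

    # Toy delta-2H example: two reference waters and three unknown samples.
    samples = vsmow_slap_normalize(
        measured=[-51.3, -120.8, 30.9],
        ref_measured=(1.8, -426.0),      # measured values of the two references
        ref_accepted=(0.0, -427.5),      # accepted scale values (illustrative)
    )
    print([round(d, 1) for d in samples])
    ```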

  13. LIMS for Lasers 2015 for achieving long-term accuracy and precision of δ2H, δ17O, and δ18O of waters using laser absorption spectrometry

    USGS Publications Warehouse

    Coplen, Tyler B.; Wassenaar, Leonard I

    2015-01-01

    Although laser absorption spectrometry (LAS) instrumentation is easy to use, its incorporation into laboratory operations is not easy, owing to extensive offline manipulation of comma-separated-values files for outlier detection, between-sample memory correction, nonlinearity (δ-variation with water amount) correction, drift correction, normalization to VSMOW-SLAP scales, and difficulty in performing long-term QA/QC audits. METHODS: A Microsoft Access relational-database application, LIMS (Laboratory Information Management System) for Lasers 2015, was developed. It automates LAS data corrections and manages clients, projects, samples, instrument-sample lists, and triple-isotope (δ17O, δ18O, and δ2H values) instrumental data for liquid-water samples. It enables users to (1) graphically evaluate sample injections for variable water yields and high isotope-delta variance; (2) correct for between-sample carryover, instrumental drift, and δ nonlinearity; and (3) normalize final results to VSMOW-SLAP scales. RESULTS: Cost-free LIMS for Lasers 2015 enables users to obtain improved δ17O, δ18O, and δ2H values with liquid-water LAS instruments, even those with under-performing syringes. For example, LAS δ2HVSMOW measurements of USGS50 Lake Kyoga (Uganda) water using an under-performing syringe having ±10 % variation in water concentration gave +31.7 ± 1.6 ‰ (2-σ standard deviation), compared with the reference value of +32.8 ± 0.4 ‰, after correction for variation in δ value with water concentration, between-sample memory, and normalization to the VSMOW-SLAP scale. CONCLUSIONS: LIMS for Lasers 2015 enables users to create systematic, well-founded instrument templates, import δ2H, δ17O, and δ18O results, evaluate performance with automatic graphical plots, correct for δ nonlinearity due to variable water concentration, correct for between-sample memory, adjust for drift, perform VSMOW-SLAP normalization, and

  14. Nck-2, a Novel Src Homology2/3-containing Adaptor Protein That Interacts with the LIM-only Protein PINCH and Components of Growth Factor Receptor Kinase-signaling Pathways

    PubMed Central

    Tu, Yizeng; Li, Fugang; Wu, Chuanyue

    1998-01-01

    Many of the protein–protein interactions that are essential for eukaryotic intracellular signal transduction are mediated by protein binding modules including SH2, SH3, and LIM domains. Nck is a SH3- and SH2-containing adaptor protein implicated in coordinating various signaling pathways, including those of growth factor receptors and cell adhesion receptors. We report here the identification, cloning, and characterization of a widely expressed, Nck-related adaptor protein termed Nck-2. Nck-2 comprises primarily three N-terminal SH3 domains and one C-terminal SH2 domain. We show that Nck-2 interacts with PINCH, a LIM-only protein implicated in integrin-linked kinase signaling. The PINCH-Nck-2 interaction is mediated by the fourth LIM domain of PINCH and the third SH3 domain of Nck-2. Furthermore, we show that Nck-2 is capable of recognizing several key components of growth factor receptor kinase-signaling pathways including EGF receptors, PDGF receptor-β, and IRS-1. The association of Nck-2 with EGF receptors was regulated by EGF stimulation and involved largely the SH2 domain of Nck-2, although the SH3 domains of Nck-2 also contributed to the complex formation. The association of Nck-2 with PDGF receptor-β was dependent on PDGF activation and was mediated solely by the SH2 domain of Nck-2. Additionally, we have detected a stable association between Nck-2 and IRS-1 that was mediated primarily via the second and third SH3 domain of Nck-2. Thus, Nck-2 associates with PINCH and components of different growth factor receptor-signaling pathways via distinct mechanisms. Finally, we provide evidence indicating that a fraction of the Nck-2 and/or Nck-1 proteins are associated with the cytoskeleton. These results identify a novel Nck-related SH2- and SH3-domain–containing protein and suggest that it may function as an adaptor protein connecting the growth factor receptor-signaling pathways with the integrin-signaling pathways. PMID:9843575

  15. LIM kinase inhibitors disrupt mitotic microtubule organization and impair tumor cell proliferation

    PubMed Central

    Mardilovich, Katerina; Baugh, Mark; Crighton, Diane; Kowalczyk, Dominika; Gabrielsen, Mads; Munro, June; Croft, Daniel R.; Lourenco, Filipe; James, Daniel; Kalna, Gabriella; McGarry, Lynn; Rath, Oliver; Shanks, Emma; Garnett, Mathew J.; McDermott, Ultan; Brookfield, Joanna; Charles, Mark; Hammonds, Tim; Olson, Michael F.

    2015-01-01

    The actin and microtubule cytoskeletons are critically important for cancer cell proliferation, and drugs that target microtubules are widely-used cancer therapies. However, their utility is compromised by toxicities due to dose and exposure. To overcome these issues, we characterized how inhibition of the actin and microtubule cytoskeleton regulatory LIM kinases could be used in drug combinations to increase efficacy. A previously-described LIMK inhibitor (LIMKi) induced dose-dependent microtubule alterations that resulted in significant mitotic defects, and increased the cytotoxic potency of microtubule polymerization inhibitors. By combining LIMKi with 366 compounds from the GSK Published Kinase Inhibitor Set, effective combinations were identified with kinase inhibitors including EGFR, p38 and Raf. These findings encouraged a drug discovery effort that led to development of CRT0105446 and CRT0105950, which potently block LIMK1 and LIMK2 activity in vitro, and inhibit cofilin phosphorylation and increase αTubulin acetylation in cells. CRT0105446 and CRT0105950 were screened against 656 cancer cell lines, and rhabdomyosarcoma, neuroblastoma and kidney cancer cells were identified as significantly sensitive to both LIMK inhibitors. These large-scale screens have identified effective LIMK inhibitor drug combinations and sensitive cancer types. In addition, the LIMK inhibitory compounds CRT0105446 and CRT0105950 will enable further development of LIMK-targeted cancer therapy. PMID:26540348

  16. Targeting nitrative stress for attenuating cisplatin-induced downregulation of cochlear LIM domain only 4 and ototoxicity.

    PubMed

    Jamesdaniel, Samson; Rathinam, Rajamani; Neumann, William L

    2016-12-01

    Cisplatin-induced ototoxicity remains a primary dose-limiting adverse effect of this highly effective anticancer drug. The clinical utility of cisplatin could be enhanced if the signaling pathways that regulate the toxic side-effects are delineated. In previous studies, we reported cisplatin-induced nitration of cochlear proteins and provided the first evidence for nitration and downregulation of cochlear LIM domain only 4 (LMO4) in cisplatin ototoxicity. Here, we extend these findings to define the critical role of nitrative stress in cisplatin-induced downregulation of LMO4 and its consequent ototoxic effects in UBOC1 cell cultures derived from sensory epithelial cells of the inner ear and in CBA/J mice. Cisplatin treatment increased the levels of nitrotyrosine and active caspase 3 in UBOC1 cells, which was detected by immunocytochemical and flow cytometry analysis, respectively. The cisplatin-induced nitrative stress and apoptosis were attenuated by co-treatment with SRI110, a peroxynitrite decomposition catalyst (PNDC), which also attenuated the cisplatin-induced downregulation of LMO4 in a dose-dependent manner. Furthermore, transient overexpression of LMO4 in UBOC1 cells prevented cisplatin-induced cytotoxicity while repression of LMO4 exacerbated cisplatin-induced cell death, indicating a direct link between LMO4 protein levels and cisplatin ototoxicity. Finally, auditory brainstem responses (ABR) recorded from CBA/J mice indicated that co-treatment with SRI110 mitigated cisplatin-induced hearing loss. Together, these results suggest that cisplatin-induced nitrative stress leads to a decrease in the levels of LMO4, downregulation of LMO4 is a critical determinant in cisplatin-induced ototoxicity, and targeting peroxynitrite could be a promising strategy for mitigating cisplatin-induced hearing loss. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  17. Internet-based virtual classroom and educational management software enhance students' didactic and clinical experiences in perfusion education programs.

    PubMed

    Riley, Jeffrey B; Austin, Jon W; Holt, David W; Searles, Bruce E; Darling, Edward M

    2004-09-01

    A challenge faced by many university-based perfusion education (PE) programs is the need for student clinical rotations at hospital locations that are geographically disparate from the main educational campus. The problem has been addressed through the employment of distance-learning environments. The purpose of this educational study is to evaluate the effectiveness of this teaching model as it is applied to PE. Web-based virtual classroom (VC) environments and educational management system (EMS) software were implemented independently and as adjuncts to live, interactive Internet-based audio/video transmission from classroom to classroom in multiple university-based PE programs. These Internet environments have been used in a variety of ways including: 1) a forum for communication between the university faculty, students, and preceptors at clinical sites, 2) didactic lectures from expert clinicians to students assigned to distant clinical sites, 3) small group problem-based learning modules designed to enhance students' analytical skills, and 4) conversion of traditional face-to-face lectures to asynchronous learning modules. Hypotheses and measures of student and faculty satisfaction, clinical experience, and learning outcomes are proposed, and some early student feedback was collected. For curricula that emphasize both didactic and clinical education, the use of Internet-based VC and EMS software provides significant advancements over traditional models. Recognized advantages include: 1) improved communications between the college faculty and the students and clinical preceptors, 2) enhanced access to a national network of clinical experts in specialized techniques, 3) expanded opportunity for student distant clinical rotations with continued didactic course work, and 4) improved continuity and consistency of clinical experiences between students through implementation of asynchronous learning modules. Students recognize the learning efficiency of on

  18. Synchronization software for automation in anesthesia.

    PubMed

    Bressan, Nadja; Castro, Ana; Brás, Susana; Oliveira, Hélder P; Ribeiro, Lénio; Ferreira, David A; Antunes, Luís; Amorim, Pedro; Nunes, Catarina S

    2007-01-01

    This work presents the development of software for data acquisition and control (ASYS) in a clinical setup. Similar to industrial Supervisory Control And Data Acquisition (SCADA) systems, the software assembles Target Controlled Infusion (TCI) monitoring and supervisory control data in real time from devices in the surgical room. The software is not a full controller, since TCI systems require continuous interaction from the anesthesiologist. Based on pharmacokinetic models, the effect-site and plasma concentrations can be related to the drug dose infused and vice versa. The software determines the infusion rates of the drug, which are given as commands to the infusion pumps. This software provides the anesthesiologist with a trustworthy tool for managing a safe and balanced anesthesia, since it also incorporates the acquisition and display of the patient's brain signals.
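
    The relationship the abstract describes, between an infusion rate and the predicted plasma concentration, can be illustrated with a deliberately simplified sketch. The code below is an assumption-laden illustration only (a single compartment, hypothetical parameters, a naive dosing rule); it is not the ASYS software and not a clinically usable TCI algorithm.

```python
# Hedged illustration: a one-compartment pharmacokinetic model relating
# infusion rate to plasma concentration, plus a naive rule that picks the
# rate needed to approach a target concentration. Vd and k_elim are
# hypothetical, not drug-specific values.

def simulate_tci(target_ug_per_ml, v_d_ml=20000.0, k_elim_per_min=0.1,
                 dt_min=0.1, duration_min=30.0):
    """Return lists of times, concentrations and infusion rates."""
    times, concs, rates = [], [], []
    c = 0.0                      # plasma concentration, ug/mL
    t = 0.0
    while t <= duration_min:
        # rate (ug/min) needed to close the gap to the target over one step,
        # plus the amount eliminated during that step
        deficit = max(target_ug_per_ml - c, 0.0)
        rate = deficit * v_d_ml / dt_min + k_elim_per_min * c * v_d_ml
        # one-compartment model: dC/dt = rate / Vd - k_elim * C
        c += (rate / v_d_ml - k_elim_per_min * c) * dt_min
        times.append(t); concs.append(c); rates.append(rate)
        t += dt_min
    return times, concs, rates

_, concentrations, infusion_rates = simulate_tci(target_ug_per_ml=3.0)
print(f"plasma concentration after 30 min: {concentrations[-1]:.2f} ug/mL")
```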

  19. Comprehensive, powerful, efficient, intuitive: a new software framework for clinical imaging applications

    NASA Astrophysics Data System (ADS)

    Augustine, Kurt E.; Holmes, David R., III; Hanson, Dennis P.; Robb, Richard A.

    2006-03-01

    One of the greatest challenges for a software engineer is to create a complex application that is comprehensive enough to be useful to a diverse set of users, yet focused enough for individual tasks to be carried out efficiently with minimal training. This "powerful yet simple" paradox is particularly prevalent in advanced medical imaging applications. Recent research in the Biomedical Imaging Resource (BIR) at Mayo Clinic has been directed toward development of an imaging application framework that provides powerful image visualization/analysis tools in an intuitive, easy-to-use interface. It is based on two concepts very familiar to physicians - Cases and Workflows. Each case is associated with a unique patient and a specific set of routine clinical tasks, or a workflow. Each workflow is comprised of an ordered set of general-purpose modules which can be re-used for each unique workflow. Clinicians help describe and design the workflows, and then are provided with an intuitive interface to both patient data and analysis tools. Since most of the individual steps are common to many different workflows, the use of general-purpose modules reduces development time and results in applications that are consistent, stable, and robust. While the development of individual modules may reflect years of research by imaging scientists, new customized workflows based on the new modules can be developed extremely fast. If a powerful, comprehensive application is difficult to learn and complicated to use, it will be unacceptable to most clinicians. Clinical image analysis tools must be intuitive and effective or they simply will not be used.
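
    To make the Cases-and-Workflows idea concrete, here is a minimal Python sketch. It is not the Mayo BIR framework; the module names (load_series, segment_organ, measure_volume) and the dictionary-based case record are assumptions chosen only to show how an ordered list of reusable modules can serve several workflows.

```python
# Minimal sketch of a workflow built from reusable, general-purpose modules.
# Module names and the case structure are hypothetical.

from typing import Callable, Dict, List

Module = Callable[[Dict], Dict]

def load_series(case: Dict) -> Dict:
    case["images"] = f"loaded DICOM series for {case['patient_id']}"
    return case

def segment_organ(case: Dict) -> Dict:
    case["segmentation"] = "organ mask"
    return case

def measure_volume(case: Dict) -> Dict:
    case["report"] = "volume report"
    return case

def run_workflow(case: Dict, modules: List[Module]) -> Dict:
    # apply each general-purpose module in order; any module can be
    # reused by other workflows
    for module in modules:
        case = module(case)
    return case

liver_volumetry = [load_series, segment_organ, measure_volume]
result = run_workflow({"patient_id": "case-001"}, liver_volumetry)
print(result["report"])
```

    Because each step is a self-contained function, a new clinical workflow is just a new ordered list of existing modules, which is the development-time saving the abstract emphasizes.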

  20. Neogene-quaternary Ostracoda and paleoenvironments, of the Limón basin, Costa Rica, and Bocas del Toro basin, Panama

    USGS Publications Warehouse

    Borne, P.F.; Cronin, T. M.; Hazel, J.E.

    1999-01-01

    Tropical marine ostracodes from Neogene and Quaternary sediments of the Central American Caribbean region have been the subject of biostratigraphic, ecological, taxonomic, and evolutionary studies. As part of the Panama Paleontology Project (PPP), Neogene and Quaternary ostracodes are being studied from the Central American region. The overall goal of this research is to evaluate the impact of the emergence of the Central American Isthmus as a land barrier between the Caribbean/tropical Atlantic and the Pacific oceans on marine ostracode biodiversity and the oceanic environments in which extant ostracodes evolved. Due to the ecological specificity of many living tropical ostracode species, they are ideally suited for reconstructing paleoenvironments on the basis of their occurrence in fossil assemblages, which in turn can lead to a better understanding of the tropical climatic and tectonic history of Central America. The principal aims of this chapter are: (a) to document the composition of the ostracode assemblages from the Limón Basin of Costa Rica and the Bocas del Toro Basin of Panama, two areas yielding extensive marine ostracode assemblages; (b) to describe the environments of deposition within these basins; and (c) to document the stratigraphic distribution of potentially age-diagnostic ostracode species in the Limón and Bocas del Toro basins in order to enhance their use in Central American biostratigraphy. A secondary, but nonetheless important goal is to assemble a database on the distribution of modern ostracode species in the Caribbean and adjacent areas as a basis for comparison with fossil assemblages. Although the ecological, biostratigraphic and paleoenvironmental conclusions presented here will improve as additional material is studied, these fossil and modern ostracode databases constitute the foundation for future evolutionary and geochemical studies of tropical Caribbean and eastern Pacific Ocean ostracodes. Moreover, we present here evidence

  1. SU-F-P-54: Guidelines to Check Image Registration QA of a Clinical Deformation Registration Software: A Single Institution Preliminary Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gill, G; Souri, S; Rea, A

    Purpose: The objective of this study is to verify and analyze the accuracy of clinical deformable image registration (DIR) software. Methods: To test clinical DIR software qualitatively and quantitatively, we focused on lung radiotherapy and analyzed a single (Lung) patient CT scan. Artificial anatomical changes were applied to account for daily variations during the course of treatment including the planning target volume (PTV) and organs at risk (OAR). The primary CT (pCT) and the structure set (pST) were deformed with a commercial tool (ImSimQA, Oncology Systems Limited) and after artificial deformation (dCT and dST) sent to another commercial tool (VelocityAI, Varian Medical Systems). In Velocity, the deformed CT and structures (dCT and dST) were inversely deformed back to the original primary CT (dbpCT and dbpST). We compared the dbpST and pST structure sets using similarity metrics. Furthermore, a binary deformation field vector (BDF) was created and sent to ImSimQA software for comparison with known “ground truth” deformation vector fields (DVF). Results: An image similarity comparison was made by using “ground truth” DVF and “deformed output” BDF with an output of normalized “cross correlation (CC)” and “mutual information (MI)” in ImSimQA software. Results for the lung case were MI=0.66 and CC=0.99. The artificial structure deformation in both pST and dbpST was analyzed using DICE coefficient, mean distance to conformity (MDC) and deformation field error volume histogram (DFEVH) by comparing them before and after inverse deformation. We noticed an inadequate structure match for the CTV, ITV and PTV, owing to the close proximity of the heart and the overall effect of lung expansion. Conclusion: We observed good similarity between pCT and dbpCT, but weaker agreement between pST and dbpST, because of inadequate structure deformation in the clinical DIR system. This system-based quality assurance test will prepare us for adopting the guidelines of upcoming AAPM task
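
    One of the metrics reported above, the normalized cross-correlation between the "ground truth" and recovered deformation vector fields, can be computed along the lines of the following numpy sketch. This is not the ImSimQA or Velocity implementation; the array shapes and the synthetic fields are assumptions made for illustration.

```python
# Sketch of normalized cross-correlation between two deformation vector
# fields of shape (nx, ny, nz, 3); the fields here are synthetic.

import numpy as np

def normalized_cross_correlation(dvf_a: np.ndarray, dvf_b: np.ndarray) -> float:
    a = dvf_a.ravel().astype(float)
    b = dvf_b.ravel().astype(float)
    a -= a.mean()
    b -= b.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

rng = np.random.default_rng(0)
truth = rng.normal(size=(32, 32, 16, 3))                    # "ground truth" DVF
recovered = truth + rng.normal(scale=0.1, size=truth.shape)  # small recovery error
print(f"CC = {normalized_cross_correlation(truth, recovered):.3f}")
```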

  2. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis.

    PubMed

    Mohammed, Emad A; Naugler, Christopher

    2017-01-01

    Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. This tool will allow anyone with historic test volume data to model future demand.
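
    The kind of time-series modelling described here can be reproduced with standard open-source libraries. The sketch below is not the authors' tool; it assumes a synthetic monthly test-volume series and uses the Holt-Winters implementation in statsmodels to fit additive and multiplicative seasonal models and rank them by in-sample error.

```python
# Hedged sketch of Holt-Winters demand forecasting for laboratory test
# volumes; the volume series is synthetic, not real laboratory data.

import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(1)
months = 48
trend = np.linspace(1000, 1400, months)                     # rising demand
season = 100 * np.sin(2 * np.pi * np.arange(months) / 12)   # yearly seasonality
volumes = trend + season + rng.normal(scale=30, size=months)

fits = {}
for kind in ("add", "mul"):
    model = ExponentialSmoothing(volumes, trend="add", seasonal=kind,
                                 seasonal_periods=12)
    fits[kind] = model.fit()

# rank the candidate models by sum of squared errors, then forecast 6 months
best = min(fits, key=lambda k: fits[k].sse)
print("best seasonal form:", best)
print("next 6 months:", fits[best].forecast(6).round(0))
```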

  3. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis

    PubMed Central

    Mohammed, Emad A.; Naugler, Christopher

    2017-01-01

    Background: Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. Method: In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. Results: This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. Conclusion: This tool will allow anyone with historic test volume data to model future demand. PMID:28400996

  4. Photogrammetric method to measure the discrepancy between clinical and software-designed positions of implants.

    PubMed

    Rivara, Federico; Lumetti, Simone; Calciolari, Elena; Toffoli, Andrea; Forlani, Gianfranco; Manfredi, Edoardo

    2016-06-01

    The position of dental implants placed with software-guided systems should be highly accurate in order to ensure safety and a passive fit of the immediate prosthesis. The purpose of this study was to measure the discrepancy between the clinical and software-planned position of dental implants by applying a photogrammetric method. Two casts were obtained, 1 from the surgical template and 1 from the actual position of the implants on the alveolar ridge of a patient. Photogrammetry was then applied to precisely locate the position of each implant on the casts. Because this mathematical technique required the identification of image points and of the relative spatial coordinates, 4 marks were drilled on the implant screw. The position of the implants was then identified as the geometric center of the 4 marks, while the orientation of the implant axis was represented by a vector normal to the plane fitting the points. A series of 16 convergent images all around the object was made using a high-resolution digital camera. A mathematical method called "rototranslation" was used to superimpose the cast images for the comparison. The tests performed on the casts resulted in an average precision level of 4 μm for the locations and less than 1 degree for the axis of the implants. A series of empirical and numerical tests were performed to assess the performance of the procedure and of the measurement protocol. The photogrammetric method is reproducible and can be used to measure the discrepancy between the software-planned and the real position of dental implants. Considering that the average precision level required for an implant-based prosthesis is approximately 50 μm, the error associated with this method can be considered as negligible. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
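
    The geometric step described above, taking the implant position as the centre of the four drilled marks and the implant axis as the normal of the plane fitted through them, can be written in a few lines of numpy. This is a sketch under assumed coordinates, not the authors' photogrammetric pipeline.

```python
# Sketch: implant position as the centroid of four marks, implant axis as the
# normal of the best-fit plane through them (via SVD). Coordinates are
# illustrative, in mm.

import numpy as np

def implant_position_and_axis(marks: np.ndarray):
    """marks: (4, 3) array of photogrammetric point coordinates."""
    centroid = marks.mean(axis=0)
    # the plane normal is the right singular vector with the smallest singular value
    _, _, vt = np.linalg.svd(marks - centroid)
    normal = vt[-1]
    return centroid, normal / np.linalg.norm(normal)

marks = np.array([[0.0, 0.0, 0.02],
                  [4.0, 0.0, -0.01],
                  [4.0, 4.0, 0.01],
                  [0.0, 4.0, 0.00]])
center, axis = implant_position_and_axis(marks)
print("center:", center.round(3), "axis:", axis.round(3))
```

    The discrepancy between the software-planned and actual implant can then be expressed as the distance between the two centroids and the angle between the two axis vectors.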

  5. Software-based evaluation of toric IOL orientation in a multicenter clinical study.

    PubMed

    Kasthurirangan, Sanjeev; Feuchter, Lucas; Smith, Pamela; Nixon, Donald

    2014-12-01

    To evaluate the rotational stability of a new one-piece hydrophobic acrylic toric intraocular lens (IOL) using custom-developed software for the analysis of slit-lamp photographs. In a prospective, multicenter study, 174 eyes were implanted with the TECNIS Toric IOL (Abbott Medical Optics, Inc., Santa Ana, CA). Custom-developed software was used to analyze high-resolution slit-lamp photographs of 156 eyes taken at day 1 (baseline) and 1, 3, and 6 months postoperatively. The software uses iris and sclera landmarks to align the baseline image and later images for comparison. Validation of the software was performed through repeated analyses of protractor images rotated from 0.1° to 10.0° and randomly selected photographs of 20 eyes. Software validation showed precision (repeatability plus reproducibility variation) of 0.02° using protractor images and 2.22° using slit-lamp photographs. Good-quality slit-lamp images and clear landmarks were necessary for precise measurements. At 6 months, 94.2% of eyes had 5° or less change in IOL orientation versus baseline; only 2 eyes (1.4%) had axis shift greater than 30°. Most eyes were within 5° or less of rotation between 1 and 3 months (92.9%) and 3 and 6 months (94.1%). Mean absolute axis change (± standard deviation) from 1 day to 6 months was 2.70° ± 5.51°. The new custom software was precise and quick in analyzing slit-lamp photographs to determine postoperative toric IOL rotation. Copyright 2014, SLACK Incorporated.
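
    The core measurement, the rotation of a follow-up photograph relative to baseline inferred from matched iris and sclera landmarks, can be sketched as a least-squares rigid fit. The code below is an illustration with synthetic landmarks, not the custom software validated in the study.

```python
# Estimate the 2D rotation between two matched landmark sets with a
# least-squares (Kabsch-style) fit; landmark coordinates are synthetic.

import numpy as np

def rotation_angle_deg(baseline: np.ndarray, followup: np.ndarray) -> float:
    """Both inputs are (n, 2) arrays of corresponding landmark coordinates."""
    a = baseline - baseline.mean(axis=0)
    b = followup - followup.mean(axis=0)
    h = a.T @ b                        # cross-covariance matrix
    u, _, vt = np.linalg.svd(h)
    r = vt.T @ u.T                     # rotation mapping baseline to follow-up
    if np.linalg.det(r) < 0:           # guard against a reflection solution
        vt[-1] *= -1
        r = vt.T @ u.T
    return float(np.degrees(np.arctan2(r[1, 0], r[0, 0])))

theta = np.radians(3.0)                # simulate a 3 degree rotation
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
landmarks = np.array([[10., 2.], [4., 8.], [-6., 5.], [-3., -7.], [8., -4.]])
followup = landmarks @ rot.T
print(f"estimated rotation: {rotation_angle_deg(landmarks, followup):.2f} deg")
```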

  6. [Development and practice evaluation of blood acid-base imbalance analysis software].

    PubMed

    Chen, Bo; Huang, Haiying; Zhou, Qiang; Peng, Shan; Jia, Hongyu; Ji, Tianxing

    2014-11-01

    To develop computer software for blood gas and acid-base imbalance analysis that systematically, rapidly, accurately and automatically determines the type of acid-base imbalance, and to evaluate its clinical application. Using the VBA programming language, computer-aided diagnostic software for the judgment of acid-base balance was developed. The clinical data of 220 patients admitted to the Second Affiliated Hospital of Guangzhou Medical University were retrospectively analyzed. Arterial blood gas values [pH, HCO(3)(-), arterial partial pressure of carbon dioxide (PaCO₂)] and electrolyte data (Na⁺ and Cl⁻) were collected. Data were entered into the software for acid-base imbalance judgment. At the same time, the type of acid-base imbalance was determined manually using the H-H compensation formula. The consistency of the judgment results from the software and the manual calculation was evaluated, and the judgment times of the two methods were compared. The clinical diagnoses of acid-base imbalance for the 220 patients were: 65 normal cases, 90 cases of simple type, 41 cases of mixed type, and 24 cases of triplex type. Compared with manual calculation, the accuracy of the software judgment was 100% for the normal and triplex types, 98.9% for the simple type and 78.0% for the mixed type, and the total accuracy was 95.5%. The Kappa value for agreement between software and manual judgment was 0.935, P=0.000, demonstrating very good consistency. The time for the software to determine acid-base imbalances was significantly shorter than that of manual judgment (18.14 ± 3.80 seconds vs. 43.79 ± 23.86 seconds, t=7.466, P=0.000). Software judgment can replace manual judgment, being rapid, accurate and convenient; it can improve the work efficiency and quality of clinical doctors and has great
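
    The judgment logic evaluated here rests on the Henderson-Hasselbalch relation and compensation formulas. The heavily simplified sketch below illustrates that kind of rule (anion gap, a coarse primary-disorder rule, Winter's formula for expected respiratory compensation); it is not the authors' software and is not suitable for clinical use, and the thresholds and example values are illustrative assumptions.

```python
# Greatly simplified, non-clinical sketch of rule-based acid-base analysis.

import math

def henderson_hasselbalch_ph(hco3_mmol_l: float, paco2_mmHg: float) -> float:
    # pH = 6.1 + log10(HCO3- / (0.03 * PaCO2))
    return 6.1 + math.log10(hco3_mmol_l / (0.03 * paco2_mmHg))

def anion_gap(na_mmol_l: float, cl_mmol_l: float, hco3_mmol_l: float) -> float:
    return na_mmol_l - cl_mmol_l - hco3_mmol_l

def primary_disorder(ph: float, hco3: float, paco2: float) -> str:
    # coarse illustration only; real software uses many more rules
    if ph < 7.35:
        return "metabolic acidosis" if hco3 < 22 else "respiratory acidosis"
    if ph > 7.45:
        return "metabolic alkalosis" if hco3 > 26 else "respiratory alkalosis"
    return "no simple primary disorder"

# Example values: pH 7.25, HCO3- 12 mmol/L, PaCO2 28 mmHg, Na+ 140, Cl- 100
ph, hco3, paco2 = 7.25, 12.0, 28.0
print(primary_disorder(ph, hco3, paco2))
print("anion gap:", anion_gap(140.0, 100.0, hco3))
expected_paco2 = 1.5 * hco3 + 8            # Winter's formula, roughly +/- 2 mmHg
print("expected PaCO2 for a pure metabolic acidosis:", expected_paco2)
print("pH predicted from H-H:", round(henderson_hasselbalch_ph(hco3, paco2), 2))
```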

  7. Automated ultrasound edge-tracking software comparable to established semi-automated reference software for carotid intima-media thickness analysis.

    PubMed

    Shenouda, Ninette; Proudfoot, Nicole A; Currie, Katharine D; Timmons, Brian W; MacDonald, Maureen J

    2018-05-01

    Many commercial ultrasound systems are now including automated analysis packages for the determination of carotid intima-media thickness (cIMT); however, details regarding their algorithms and methodology are not published. Few studies have compared their accuracy and reliability with previously established automated software, and those that have were in asymptomatic adults. Therefore, this study compared cIMT measures from a fully automated ultrasound edge-tracking software (EchoPAC PC, Version 110.0.2; GE Medical Systems, Horten, Norway) to an established semi-automated reference software (Artery Measurement System (AMS) II, Version 1.141; Gothenburg, Sweden) in 30 healthy preschool children (ages 3-5 years) and 27 adults with coronary artery disease (CAD; ages 48-81 years). For both groups, Bland-Altman plots revealed good agreement with a negligible mean cIMT difference of -0·03 mm. Software differences were statistically, but not clinically, significant for preschool images (P = 0·001) and were not significant for CAD images (P = 0·09). Intra- and interoperator repeatability was high and comparable between software for preschool images (ICC, 0·90-0·96; CV, 1·3-2·5%), but slightly higher with the automated ultrasound than the semi-automated reference software for CAD images (ICC, 0·98-0·99; CV, 1·4-2·0% versus ICC, 0·84-0·89; CV, 5·6-6·8%). These findings suggest that the automated ultrasound software produces valid cIMT values in healthy preschool children and adults with CAD. Automated ultrasound software may be useful for ensuring consistency among multisite research initiatives or large cohort studies involving repeated cIMT measures, particularly in adults with documented CAD. © 2017 Scandinavian Society of Clinical Physiology and Nuclear Medicine. Published by John Wiley & Sons Ltd.
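
    The agreement statistics quoted above (mean difference with 95% limits of agreement) follow the standard Bland-Altman calculation, sketched below on synthetic paired cIMT values; this is not the study's analysis code.

```python
# Bland-Altman agreement between two cIMT measurement methods; the paired
# values are synthetic and only mimic the magnitudes reported in the abstract.

import numpy as np

def bland_altman(a: np.ndarray, b: np.ndarray):
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

rng = np.random.default_rng(7)
reference = rng.normal(0.45, 0.05, size=30)              # semi-automated cIMT, mm
automated = reference - 0.03 + rng.normal(0, 0.01, 30)   # automated cIMT, mm
bias, loa = bland_altman(automated, reference)
print(f"mean difference: {bias:.3f} mm, 95% LoA: {loa[0]:.3f} to {loa[1]:.3f} mm")
```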

  8. Software Program: Software Management Guidebook

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The purpose of this NASA Software Management Guidebook is twofold. First, this document defines the core products and activities required of NASA software projects. It defines life-cycle models and activity-related methods but acknowledges that no single life-cycle model is appropriate for all NASA software projects. It also acknowledges that the appropriate method for accomplishing a required activity depends on characteristics of the software project. Second, this guidebook provides specific guidance to software project managers and team leaders in selecting appropriate life cycles and methods to develop a tailored plan for a software engineering project.

  9. The Computer-based Health Evaluation Software (CHES): a software for electronic patient-reported outcome monitoring.

    PubMed

    Holzner, Bernhard; Giesinger, Johannes M; Pinggera, Jakob; Zugal, Stefan; Schöpf, Felix; Oberguggenberger, Anne S; Gamper, Eva M; Zabernigg, August; Weber, Barbara; Rumpold, Gerhard

    2012-11-09

    Patient-reported Outcomes (PROs) capturing, e.g., quality of life, fatigue, depression, medication side-effects or disease symptoms, have become important outcome parameters in medical research and daily clinical practice. Electronic PRO data capture (ePRO) with software packages to administer questionnaires, store data, and present results has facilitated PRO assessment in hospital settings. Compared to conventional paper-pencil versions of PRO instruments, ePRO is more economical with regard to staff resources and time, and allows immediate presentation of results to the medical staff. The objective of our project was to develop software (CHES - Computer-based Health Evaluation System) for ePRO in hospital settings and at home with a special focus on the presentation of individual patient's results. Following the Extreme Programming development approach, the architecture was not fixed up-front but was developed in close, continuous collaboration with software end users (medical staff, researchers and patients) to meet their specific demands. Developed features include sophisticated, longitudinal charts linking patients' PRO data to clinical characteristics and to PRO scores from reference populations, a web-interface for questionnaire administration, and a tool for convenient creation and editing of questionnaires. By 2012, CHES had been implemented at various institutions in Austria, Germany, Switzerland, and the UK, and about 5000 patients had participated in ePRO (with around 15000 assessments in total). Data entry is done by the patients themselves via tablet PCs with a study nurse or an intern approaching patients and supervising questionnaire completion. During the last decade several software packages for ePRO have emerged for different purposes. Whereas commercial products are available primarily for ePRO in clinical trials, academic projects have focused on data collection and presentation in daily clinical practice and on extending cancer registries with PRO data

  10. The Computer-based Health Evaluation Software (CHES): a software for electronic patient-reported outcome monitoring

    PubMed Central

    2012-01-01

    Background Patient-reported Outcomes (PROs) capturing e.g., quality of life, fatigue, depression, medication side-effects or disease symptoms, have become important outcome parameters in medical research and daily clinical practice. Electronic PRO data capture (ePRO) with software packages to administer questionnaires, store data, and present results has facilitated PRO assessment in hospital settings. Compared to conventional paper-pencil versions of PRO instruments, ePRO is more economical with regard to staff resources and time, and allows immediate presentation of results to the medical staff. The objective of our project was to develop software (CHES – Computer-based Health Evaluation System) for ePRO in hospital settings and at home with a special focus on the presentation of individual patient’s results. Methods Following the Extreme Programming development approach, the architecture was not fixed up-front but was developed in close, continuous collaboration with software end users (medical staff, researchers and patients) to meet their specific demands. Developed features include sophisticated, longitudinal charts linking patients’ PRO data to clinical characteristics and to PRO scores from reference populations, a web-interface for questionnaire administration, and a tool for convenient creation and editing of questionnaires. Results By 2012, CHES had been implemented at various institutions in Austria, Germany, Switzerland, and the UK, and about 5000 patients had participated in ePRO (with around 15000 assessments in total). Data entry is done by the patients themselves via tablet PCs with a study nurse or an intern approaching patients and supervising questionnaire completion. Discussion During the last decade several software packages for ePRO have emerged for different purposes. Whereas commercial products are available primarily for ePRO in clinical trials, academic projects have focused on data collection and presentation in daily clinical practice and

  11. Detection of group B streptococci in Lim broth by use of group B streptococcus peptide nucleic acid fluorescent in situ hybridization and selective and nonselective agars.

    PubMed

    Montague, Naomi S; Cleary, Timothy J; Martinez, Octavio V; Procop, Gary W

    2008-10-01

    The sensitivity, specificity, and positive and negative predictive values for the detection of group B streptococci from Lim enrichment broth with sheep blood agar (SBA), with selective Streptococcus agar (SSA), and by a peptide nucleic acid fluorescent in situ hybridization (PNA FISH) assay were as follows: for culture on SBA, 68.4%, 100%, 100%, and 87.9%, respectively; for culture on SSA, 85.5%, 100%, 100%, and 94.1%, respectively; and for the PNA FISH assay, 97.4%, 98.3%, 96.1%, and 98.9%, respectively.
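
    The four statistics reported above are all derived from a 2x2 table against the reference standard. The sketch below shows the arithmetic with hypothetical counts; it does not use the study's raw data.

```python
# Sensitivity, specificity, PPV and NPV from a 2x2 confusion table.
# The counts below are hypothetical, chosen only to illustrate the formulas.

def diagnostic_stats(tp: int, fp: int, fn: int, tn: int):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sensitivity, specificity, ppv, npv

sens, spec, ppv, npv = diagnostic_stats(tp=90, fp=2, fn=10, tn=198)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}, "
      f"PPV {ppv:.1%}, NPV {npv:.1%}")
```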

  12. Software tool for physics chart checks.

    PubMed

    Li, H Harold; Wu, Yu; Yang, Deshan; Mutic, Sasa

    2014-01-01

    Physics chart check has long been a central quality assurance (QA) measure in radiation oncology. The purpose of this work is to describe a software tool that aims to accomplish simplification, standardization, automation, and forced functions in the process. Nationally recognized guidelines, including American College of Radiology and American Society for Radiation Oncology guidelines and technical standards, and the American Association of Physicists in Medicine Task Group reports were identified, studied, and summarized. Meanwhile, the reported events related to physics chart check service were analyzed using an event reporting and learning system. A number of shortfalls in the chart check process were identified. To address these problems, a software tool was designed and developed under Microsoft .NET in C# to hardwire as many components as possible at each stage of the process. The software consists of the following 4 independent modules: (1) chart check management; (2) pretreatment and during-treatment chart check assistant; (3) posttreatment chart check assistant; and (4) quarterly peer-review management. The users were a large group of physicists in the author's radiation oncology clinic. During more than 1 year of use, the tool has proven very helpful in chart checking management, communication, documentation, and maintaining consistency. The software tool presented in this work aims to assist physicists at each stage of the physics chart check process. The software tool is potentially useful for any radiation oncology clinic that is either pursuing or maintaining American College of Radiology accreditation.

  13. Software analysis handbook: Software complexity analysis and software reliability estimation and prediction

    NASA Technical Reports Server (NTRS)

    Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron

    1994-01-01

    This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
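
    As one concrete example of the reliability estimation the handbook describes, the sketch below evaluates the classic Goel-Okumoto reliability-growth model. The choice of model and the parameter values are assumptions made for illustration; the handbook's own tools and models are not reproduced here.

```python
# Goel-Okumoto NHPP model: expected cumulative failures mu(t) = a*(1 - e^(-b*t)).
# From mu(t) we get the probability of failure-free operation over a mission
# of length x after test time t. Parameters a and b are assumed, not fitted.

import math

def mu(t: float, a: float, b: float) -> float:
    """Expected cumulative number of failures by time t."""
    return a * (1.0 - math.exp(-b * t))

def reliability(x: float, t: float, a: float, b: float) -> float:
    """P(no failure in (t, t + x)) under the Goel-Okumoto model."""
    return math.exp(-(mu(t + x, a, b) - mu(t, a, b)))

a, b = 120.0, 0.02        # assumed: 120 latent faults, detection rate 0.02 per hour
print("expected failures after 100 h of test:", round(mu(100, a, b), 1))
print("P(failure-free 10 h mission after 100 h of test):",
      round(reliability(10, 100, a, b), 3))
```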

  14. Clinical Data Systems to Support Public Health Practice: A National Survey of Software and Storage Systems Among Local Health Departments.

    PubMed

    McCullough, J Mac; Goodin, Kate

    2016-01-01

    Numerous software and data storage systems are employed by local health departments (LHDs) to manage clinical and nonclinical data needs. Leveraging electronic systems may yield improvements in public health practice. However, information is lacking regarding current usage patterns among LHDs. To analyze the clinical and nonclinical data storage and software types used by LHDs. Data came from the 2015 Informatics Capacity and Needs Assessment Survey, conducted by Georgia Southern University in collaboration with the National Association of County and City Health Officials. A total of 324 LHDs from all 50 states completed the survey (response rate: 50%). Outcome measures included LHD's primary clinical service data system, nonclinical data system(s) used, and plans to adopt an electronic clinical data system (if not already in use). Predictors of interest included jurisdiction size and governance type, and other informatics capacities within the LHD. Bivariate analyses were performed using χ² and t tests. Up to 38.4% of LHDs reported using an electronic health record (EHR). Usage was especially common among LHDs that provide primary care and/or dental services. LHDs serving smaller populations and those with state-level governance were both less likely to use an EHR. Paper records were a common data storage approach for both clinical data (28.9%) and nonclinical data (59.4%). Among LHDs without an EHR, 84.7% reported implementation plans. Our findings suggest that LHDs are increasingly using EHRs as a clinical data storage solution and that more LHDs are likely to adopt EHRs in the foreseeable future. Yet use of paper records remains common. Correlates of electronic system usage emerged across a range of factors. Program- or system-specific needs may be barriers or facilitators to EHR adoption. Policy makers can tailor resources to address barriers specific to LHD size, governance, service portfolio, existing informatics capabilities, and other pertinent characteristics.

  15. SaDA: From Sampling to Data Analysis—An Extensible Open Source Infrastructure for Rapid, Robust and Automated Management and Analysis of Modern Ecological High-Throughput Microarray Data

    PubMed Central

    Singh, Kumar Saurabh; Thual, Dominique; Spurio, Roberto; Cannata, Nicola

    2015-01-01

    One of the most crucial characteristics of day-to-day laboratory information management is the collection, storage and retrieval of information about research subjects and environmental or biomedical samples. An efficient link between sample data and experimental results is essential for the successful outcome of a collaborative project. Currently available software solutions are largely limited to large scale, expensive commercial Laboratory Information Management Systems (LIMS). Acquiring such a LIMS can indeed bring laboratory information management to a higher level, but most of the time this requires a substantial investment of money, time and technical effort. There is a clear need for a lightweight open source system that can easily be managed on local servers and handled by individual researchers. Here we present software named SaDA for storing, retrieving and analyzing data originating from microorganism monitoring experiments. SaDA is fully integrated in the management of environmental samples, oligonucleotide sequences, microarray data and the subsequent downstream analysis procedures. It is simple and generic software, and can be extended and customized for various environmental and biomedical studies. PMID:26047146

  16. SaDA: From Sampling to Data Analysis-An Extensible Open Source Infrastructure for Rapid, Robust and Automated Management and Analysis of Modern Ecological High-Throughput Microarray Data.

    PubMed

    Singh, Kumar Saurabh; Thual, Dominique; Spurio, Roberto; Cannata, Nicola

    2015-06-03

    One of the most crucial characteristics of day-to-day laboratory information management is the collection, storage and retrieval of information about research subjects and environmental or biomedical samples. An efficient link between sample data and experimental results is essential for the successful outcome of a collaborative project. Currently available software solutions are largely limited to large scale, expensive commercial Laboratory Information Management Systems (LIMS). Acquiring such a LIMS can indeed bring laboratory information management to a higher level, but most of the time this requires a substantial investment of money, time and technical effort. There is a clear need for a lightweight open source system that can easily be managed on local servers and handled by individual researchers. Here we present software named SaDA for storing, retrieving and analyzing data originating from microorganism monitoring experiments. SaDA is fully integrated in the management of environmental samples, oligonucleotide sequences, microarray data and the subsequent downstream analysis procedures. It is simple and generic software, and can be extended and customized for various environmental and biomedical studies.

  17. Glioblastoma Segmentation: Comparison of Three Different Software Packages.

    PubMed

    Fyllingen, Even Hovig; Stensjøen, Anne Line; Berntsen, Erik Magnus; Solheim, Ole; Reinertsen, Ingerid

    2016-01-01

    To facilitate a more widespread use of volumetric tumor segmentation in clinical studies, there is an urgent need for reliable, user-friendly segmentation software. The aim of this study was therefore to compare three different software packages for semi-automatic brain tumor segmentation of glioblastoma; namely BrainVoyager™ QX, ITK-Snap and 3D Slicer, and to make data available for future reference. Pre-operative, contrast enhanced T1-weighted 1.5 or 3 Tesla Magnetic Resonance Imaging (MRI) scans were obtained in 20 consecutive patients who underwent surgery for glioblastoma. MRI scans were segmented twice in each software package by two investigators. Intra-rater, inter-rater and between-software agreement was compared by using differences of means with 95% limits of agreement (LoA), Dice's similarity coefficients (DSC) and Hausdorff distance (HD). Time expenditure of segmentations was measured using a stopwatch. Eighteen tumors were included in the analyses. Inter-rater agreement was highest for BrainVoyager with difference of means of 0.19 mL and 95% LoA from -2.42 mL to 2.81 mL. Between-software agreement and 95% LoA were very similar for the different software packages. Intra-rater, inter-rater and between-software DSC were ≥ 0.93 in all analyses. Time expenditure was approximately 41 min per segmentation in BrainVoyager, and 18 min per segmentation in both 3D Slicer and ITK-Snap. Our main findings were that there is a high agreement within and between the software packages in terms of small intra-rater, inter-rater and between-software differences of means and high Dice's similarity coefficients. Time expenditure was highest for BrainVoyager, but all software packages were relatively time-consuming, which may limit usability in an everyday clinical setting.
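
    The Dice similarity coefficient used above to compare segmentations is straightforward to compute from two binary masks, as in the sketch below (synthetic masks, not the study's data or scripts).

```python
# Dice similarity coefficient between two binary segmentation masks.

import numpy as np

def dice(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

truth = np.zeros((64, 64, 64), dtype=bool)
truth[20:44, 20:44, 20:44] = True            # a cubic "tumor" mask
rater2 = np.roll(truth, shift=2, axis=0)     # a slightly shifted second rating
print(f"DSC = {dice(truth, rater2):.3f}")
```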

  18. LIM Domain Only 2 Regulates Endothelial Proliferation, Angiogenesis, and Tissue Regeneration.

    PubMed

    Meng, Shu; Matrone, Gianfranco; Lv, Jie; Chen, Kaifu; Wong, Wing Tak; Cooke, John P

    2016-10-06

    LIM domain only 2 (LMO2, human gene) is a key transcription factor that regulates hematopoiesis and vascular development. However, its role in adult endothelial function has been incompletely characterized. In vitro loss- and gain-of-function studies on LMO2 were performed in human umbilical vein endothelial cells with lentiviral overexpression or short hairpin RNA knockdown (KD) of LMO2, respectively. LMO2 KD significantly impaired endothelial proliferation. LMO2 controls endothelial G1/S transition through transcriptional regulation of cyclin-dependent kinase 2 and 4 as determined by reverse transcription polymerase chain reaction (PCR), western blot, and chromatin immunoprecipitation, and also influences the expression of Cyclin D1 and Cyclin A1. LMO2 KD also impaired angiogenesis by reducing transforming growth factor-β (TGF-β) expression, whereas supplementation of exogenous TGF-β restored defective network formation in LMO2 KD human umbilical vein endothelial cells. In a zebrafish model of caudal fin regeneration, RT-PCR revealed that the lmo2 (zebrafish gene) gene was upregulated at day 5 postresection. The KD of lmo2 by vivo-morpholino injections in adult Tg(fli1:egfp) y1 zebrafish reduced 5-bromo-2'-deoxyuridine incorporation in endothelial cells, impaired neoangiogenesis in the resected caudal fin, and substantially delayed fin regeneration. The transcriptional factor LMO2 regulates endothelial proliferation and angiogenesis in vitro. Furthermore, LMO2 is required for angiogenesis and tissue healing in vivo. Thus, LMO2 is a critical determinant of vascular and tissue regeneration. © 2016 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.

  19. Screensaver: an open source lab information management system (LIMS) for high throughput screening facilities

    PubMed Central

    2010-01-01

    Background Shared-usage high throughput screening (HTS) facilities are becoming more common in academe as large-scale small molecule and genome-scale RNAi screening strategies are adopted for basic research purposes. These shared facilities require a unique informatics infrastructure that must not only provide access to and analysis of screening data, but must also manage the administrative and technical challenges associated with conducting numerous, interleaved screening efforts run by multiple independent research groups. Results We have developed Screensaver, a free, open source, web-based lab information management system (LIMS), to address the informatics needs of our small molecule and RNAi screening facility. Screensaver supports the storage and comparison of screening data sets, as well as the management of information about screens, screeners, libraries, and laboratory work requests. To our knowledge, Screensaver is one of the first applications to support the storage and analysis of data from both genome-scale RNAi screening projects and small molecule screening projects. Conclusions The informatics and administrative needs of an HTS facility may be best managed by a single, integrated, web-accessible application such as Screensaver. Screensaver has proven useful in meeting the requirements of the ICCB-Longwood/NSRB Screening Facility at Harvard Medical School, and has provided similar benefits to other HTS facilities. PMID:20482787

  20. Screensaver: an open source lab information management system (LIMS) for high throughput screening facilities.

    PubMed

    Tolopko, Andrew N; Sullivan, John P; Erickson, Sean D; Wrobel, David; Chiang, Su L; Rudnicki, Katrina; Rudnicki, Stewart; Nale, Jennifer; Selfors, Laura M; Greenhouse, Dara; Muhlich, Jeremy L; Shamu, Caroline E

    2010-05-18

    Shared-usage high throughput screening (HTS) facilities are becoming more common in academe as large-scale small molecule and genome-scale RNAi screening strategies are adopted for basic research purposes. These shared facilities require a unique informatics infrastructure that must not only provide access to and analysis of screening data, but must also manage the administrative and technical challenges associated with conducting numerous, interleaved screening efforts run by multiple independent research groups. We have developed Screensaver, a free, open source, web-based lab information management system (LIMS), to address the informatics needs of our small molecule and RNAi screening facility. Screensaver supports the storage and comparison of screening data sets, as well as the management of information about screens, screeners, libraries, and laboratory work requests. To our knowledge, Screensaver is one of the first applications to support the storage and analysis of data from both genome-scale RNAi screening projects and small molecule screening projects. The informatics and administrative needs of an HTS facility may be best managed by a single, integrated, web-accessible application such as Screensaver. Screensaver has proven useful in meeting the requirements of the ICCB-Longwood/NSRB Screening Facility at Harvard Medical School, and has provided similar benefits to other HTS facilities.

  1. Object-oriented design of medical imaging software.

    PubMed

    Ligier, Y; Ratib, O; Logean, M; Girard, C; Perrier, R; Scherrer, J R

    1994-01-01

    A special software package for interactive display and manipulation of medical images was developed at the University Hospital of Geneva, as part of a hospital wide Picture Archiving and Communication System (PACS). This software package, called Osiris, was especially designed to be easily usable and adaptable to the needs of noncomputer-oriented physicians. The Osiris software has been developed to allow the visualization of medical images obtained from any imaging modality. It provides generic manipulation tools, processing tools, and analysis tools more specific to clinical applications. This software, based on an object-oriented paradigm, is portable and extensible. Osiris is available on two different operating systems: the Unix X-11/OSF-Motif based workstations, and the Macintosh family.

  2. Detection of Group B Streptococci in Lim Broth by Use of Group B Streptococcus Peptide Nucleic Acid Fluorescent In Situ Hybridization and Selective and Nonselective Agars

    PubMed Central

    Montague, Naomi S.; Cleary, Timothy J.; Martinez, Octavio V.; Procop, Gary W.

    2008-01-01

    The sensitivity, specificity, and positive and negative predictive values for the detection of group B streptococci from Lim enrichment broth with sheep blood agar (SBA), with selective Streptococcus agar (SSA), and by a peptide nucleic acid fluorescent in situ hybridization (PNA FISH) assay were as follows: for culture on SBA, 68.4%, 100%, 100%, and 87.9%, respectively; for culture on SSA, 85.5%, 100%, 100%, and 94.1%, respectively; and for the PNA FISH assay, 97.4%, 98.3%, 96.1%, and 98.9%, respectively. PMID:18667597

  3. The State of Cloud-Based Biospecimen and Biobank Data Management Tools.

    PubMed

    Paul, Shonali; Gade, Aditi; Mallipeddi, Sumani

    2017-04-01

    Biobanks are critical for collecting and managing high-quality biospecimens from donors with appropriate clinical annotation. The high-quality human biospecimens and associated data are required to better understand disease processes. Therefore, biobanks have become an important and essential resource for healthcare research and drug discovery. However, collecting and managing huge volumes of data (biospecimens and associated clinical data) necessitate that biobanks use appropriate data management solutions that can keep pace with the ever-changing requirements of research. To automate biobank data management, biobanks have been investing in traditional Laboratory Information Management Systems (LIMS). However, there are a myriad of challenges faced by biobanks in acquiring traditional LIMS. Traditional LIMS are cost-intensive and often lack the flexibility to accommodate changes in data sources and workflows. Cloud technology is emerging as an alternative that provides the opportunity to small and medium-sized biobanks to automate their operations in a cost-effective manner, even without IT personnel. Cloud-based solutions offer the advantage of heightened security, rapid scalability, dynamic allocation of services, and can facilitate collaboration between different research groups by using a shared environment on a "pay-as-you-go" basis. The benefits offered by cloud technology have resulted in the development of cloud-based data management solutions as an alternative to traditional on-premise software. After evaluating the advantages offered by cloud technology, several biobanks have started adopting cloud-based tools. Cloud-based tools provide biobanks with easy access to biospecimen data for real-time sharing with clinicians. Another major benefit realized by biobanks by implementing cloud-based applications is unlimited data storage on the cloud and automatic backups for protecting any data loss in the face of natural calamities.

  4. OxMaR: open source free software for online minimization and randomization for clinical trials.

    PubMed

    O'Callaghan, Christopher A

    2014-01-01

    Minimization is a valuable method for allocating participants between the control and experimental arms of clinical studies. The use of minimization reduces differences that might arise by chance between the study arms in the distribution of patient characteristics such as gender, ethnicity and age. However, unlike randomization, minimization requires real time assessment of each new participant with respect to the preceding distribution of relevant participant characteristics within the different arms of the study. For multi-site studies, this necessitates centralized computational analysis that is shared between all study locations. Unfortunately, there is no suitable freely available open source or free software that can be used for this purpose. OxMaR was developed to enable researchers in any location to use minimization for patient allocation and to access the minimization algorithm using any device that can connect to the internet such as a desktop computer, tablet or mobile phone. The software is complete in itself and requires no special packages or libraries to be installed. It is simple to set up and run over the internet using online facilities which are very low cost or even free to the user. Importantly, it provides real time information on allocation to the study lead or administrator and generates real time distributed backups with each allocation. OxMaR can readily be modified and customised and can also be used for standard randomization. It has been extensively tested and has been used successfully in a low budget multi-centre study. Hitherto, the logistical difficulties involved in minimization have precluded its use in many small studies and this software should allow more widespread use of minimization which should lead to studies with better matched control and experimental arms. OxMaR should be particularly valuable in low resource settings.
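
    Minimization itself is a compact bookkeeping algorithm. The sketch below is a generic Pocock-Simon-style illustration with a weighted coin, not the OxMaR source code; the arms, factors and probability are assumptions.

```python
# Generic minimization sketch (not OxMaR): assign each new participant to the
# arm that would leave the marginal factor counts least imbalanced, using a
# weighted coin so allocation stays unpredictable.

import random
from collections import defaultdict

ARMS = ("control", "experimental")
counts = {arm: defaultdict(int) for arm in ARMS}   # counts[arm][(factor, level)]

def allocate(participant: dict, p_best: float = 0.8) -> str:
    """participant maps factor name -> level, e.g. {'sex': 'F', 'age': '<65'}."""

    def imbalance_if(arm: str) -> int:
        # total marginal imbalance if this participant joined `arm`
        total = 0
        for factor, level in participant.items():
            hypothetical = [counts[a][(factor, level)] + (a == arm) for a in ARMS]
            total += max(hypothetical) - min(hypothetical)
        return total

    scores = {arm: imbalance_if(arm) for arm in ARMS}
    best, other = sorted(ARMS, key=lambda a: scores[a])
    if scores[best] == scores[other]:
        chosen = random.choice(ARMS)               # tie: pure randomization
    else:
        chosen = best if random.random() < p_best else other
    for factor, level in participant.items():      # update the running counts
        counts[chosen][(factor, level)] += 1
    return chosen

for i in range(8):
    arm = allocate({"sex": random.choice(["F", "M"]),
                    "age": random.choice(["<65", ">=65"])})
    print(f"participant {i + 1}: {arm}")
```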

  5. SMITH: a LIMS for handling next-generation sequencing workflows

    PubMed Central

    2014-01-01

    Background Life-science laboratories make increasing use of Next Generation Sequencing (NGS) for studying bio-macromolecules and their interactions. Array-based methods for measuring gene expression or protein-DNA interactions are being replaced by RNA-Seq and ChIP-Seq. Sequencing is generally performed by specialized facilities that have to keep track of sequencing requests, trace samples, ensure quality and make data available according to predefined privileges. An integrated tool helps to troubleshoot problems, to maintain a high quality standard, to reduce time and costs. Commercial and non-commercial tools called LIMS (Laboratory Information Management Systems) are available for this purpose. However, they often come at prohibitive cost and/or lack the flexibility and scalability needed to adjust seamlessly to the frequently changing protocols employed. In order to manage the flow of sequencing data produced at the Genomic Unit of the Italian Institute of Technology (IIT), we developed SMITH (Sequencing Machine Information Tracking and Handling). Methods SMITH is a web application with a MySQL server at the backend. SMITH was developed by wet-lab scientists of the Centre for Genomic Science and database experts from the Politecnico of Milan in the context of a Genomic Data Model Project. The database schema stores all the information of an NGS experiment, including the descriptions of all protocols and algorithms used in the process. Notably, an attribute-value table allows an unconstrained textual description to be associated with each sample and all the data produced afterwards. This method permits the creation of metadata that can be used to search the database for specific files as well as for statistical analyses. Results SMITH runs automatically and limits direct human interaction mainly to administrative tasks. SMITH data-delivery procedures were standardized, making it easier for biologists and analysts to navigate the data. Automation also helps save time. The
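
    The attribute-value idea mentioned above can be illustrated with a tiny relational example. SMITH itself runs on a MySQL backend and its real schema is not given in the abstract, so the sqlite3 sketch below is purely an assumed, minimal illustration of how free-form sample annotations support metadata searches.

```python
# Hypothetical attribute-value (EAV) schema for sample metadata, shown with
# sqlite3 only for self-containment; this is not SMITH's actual schema.

import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE sample (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
CREATE TABLE sample_annotation (
    sample_id INTEGER REFERENCES sample(id),
    attribute TEXT NOT NULL,
    value     TEXT NOT NULL
);
""")
con.execute("INSERT INTO sample (id, name) VALUES (1, 'S001'), (2, 'S002')")
con.executemany(
    "INSERT INTO sample_annotation VALUES (?, ?, ?)",
    [(1, "library_prep", "ChIP-Seq"), (1, "antibody", "H3K4me3"),
     (2, "library_prep", "RNA-Seq"), (2, "tissue", "liver")],
)

# search by arbitrary metadata, e.g. all ChIP-Seq samples
rows = con.execute("""
    SELECT s.name FROM sample s
    JOIN sample_annotation a ON a.sample_id = s.id
    WHERE a.attribute = 'library_prep' AND a.value = 'ChIP-Seq'
""").fetchall()
print(rows)   # [('S001',)]
```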

  6. SMITH: a LIMS for handling next-generation sequencing workflows.

    PubMed

    Venco, Francesco; Vaskin, Yuriy; Ceol, Arnaud; Muller, Heiko

    2014-01-01

    Life-science laboratories make increasing use of Next Generation Sequencing (NGS) for studying bio-macromolecules and their interactions. Array-based methods for measuring gene expression or protein-DNA interactions are being replaced by RNA-Seq and ChIP-Seq. Sequencing is generally performed by specialized facilities that have to keep track of sequencing requests, trace samples, ensure quality and make data available according to predefined privileges. An integrated tool helps to troubleshoot problems, to maintain a high quality standard, to reduce time and costs. Commercial and non-commercial tools called LIMS (Laboratory Information Management Systems) are available for this purpose. However, they often come at prohibitive cost and/or lack the flexibility and scalability needed to adjust seamlessly to the frequently changing protocols employed. In order to manage the flow of sequencing data produced at the Genomic Unit of the Italian Institute of Technology (IIT), we developed SMITH (Sequencing Machine Information Tracking and Handling). SMITH is a web application with a MySQL server at the backend. SMITH was developed by wet-lab scientists of the Centre for Genomic Science and database experts from the Politecnico of Milan in the context of a Genomic Data Model Project. The database schema stores all the information of an NGS experiment, including the descriptions of all protocols and algorithms used in the process. Notably, an attribute-value table allows an unconstrained textual description to be associated with each sample and all the data produced afterwards. This method permits the creation of metadata that can be used to search the database for specific files as well as for statistical analyses. SMITH runs automatically and limits direct human interaction mainly to administrative tasks. SMITH data-delivery procedures were standardized, making it easier for biologists and analysts to navigate the data. Automation also helps save time. The workflows are available

  7. Cytoskeleton-interacting LIM-domain protein CRP1 suppresses cell proliferation and protects from stress-induced cell death

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Latonen, Leena; Jaervinen, Paeivi M.; Haartman Institute, University of Helsinki, FIN-00014 Helsinki

    2008-02-15

    Members of the cysteine-rich protein (CRP) family are actin cytoskeleton-interacting LIM-domain proteins known to act in muscle cell differentiation. We have earlier found that CRP1, a founding member of this family, is transcriptionally induced by UV radiation in human diploid fibroblasts [M. Gentile, L. Latonen, M. Laiho, Cell cycle arrest and apoptosis provoked by UV radiation-induced DNA damage are transcriptionally highly divergent responses, Nucleic Acids Res. 31 (2003) 4779-4790]. Here we show that CRP1 is induced by growth-inhibitory signals, such as increased cellular density, and cytotoxic stress induced by UV radiation or staurosporine. We found that high levels of CRP1 correlate with differentiation-associated morphology towards the myofibroblast lineage and that expression of ectopic CRP1 suppresses cell proliferation. Following UV- and staurosporine-induced stresses, expression of CRP1 provides a survival advantage evidenced by decreased cellular death and increased cellular metabolic activity and attachment. Our studies identify that CRP1 is a novel stress response factor, and provide evidence for its growth-inhibitory and cytoprotective functions.

  8. Clinical evaluation of reducing acquisition time on single-photon emission computed tomography image quality using proprietary resolution recovery software.

    PubMed

    Aldridge, Matthew D; Waddington, Wendy W; Dickson, John C; Prakash, Vineet; Ell, Peter J; Bomanji, Jamshed B

    2013-11-01

    A three-dimensional model-based resolution recovery (RR) reconstruction algorithm that compensates for collimator-detector response, resulting in an improvement in reconstructed spatial resolution and signal-to-noise ratio of single-photon emission computed tomography (SPECT) images, was tested. The software is said to retain image quality even with reduced acquisition time. Clinically, any improvement in patient throughput without loss of quality is to be welcomed. Furthermore, future restrictions in radiotracer supplies may add value to this type of data analysis. The aims of this study were to assess improvement in image quality using the software and to evaluate the potential of performing reduced time acquisitions for bone and parathyroid SPECT applications. Data acquisition was performed using the local standard SPECT/CT protocols for 99mTc-hydroxymethylene diphosphonate bone and 99mTc-methoxyisobutylisonitrile parathyroid SPECT imaging. The principal modification applied was the acquisition of an eight-frame gated data set acquired using an ECG simulator with a fixed signal as the trigger. This had the effect of partitioning the data such that the effect of reduced time acquisitions could be assessed without conferring additional scanning time on the patient. The set of summed data sets was then independently reconstructed using the RR software to permit a blinded assessment of the effect of acquired counts upon reconstructed image quality as adjudged by three experienced observers. Data sets reconstructed with the RR software were compared with the local standard processing protocols; filtered back-projection and ordered-subset expectation-maximization. Thirty SPECT studies were assessed (20 bone and 10 parathyroid). The images reconstructed with the RR algorithm showed improved image quality for both full-time and half-time acquisitions over local current processing protocols (P<0.05). The RR algorithm improved image quality compared with local processing

  9. Office Computer Software: A Comprehensive Review of Software Programs.

    ERIC Educational Resources Information Center

    Secretary, 1992

    1992-01-01

    Describes types of software including system software, application software, spreadsheets, accounting software, graphics packages, desktop publishing software, database, desktop and personal information management software, project and records management software, groupware, and shareware. (JOW)

  10. Software-assisted post-interventional assessment of radiofrequency ablation

    NASA Astrophysics Data System (ADS)

    Rieder, Christian; Geisler, Benjamin; Bruners, Philipp; Isfort, Peter; Na, Hong-Sik; Mahnken, Andreas H.; Hahn, Horst K.

    2014-03-01

    Radiofrequency ablation (RFA) is becoming a standard procedure for minimally invasive tumor treatment in clinical practice. Due to its common technical procedure, low complication rate, and low cost, RFA has become an alternative to surgical resection in the liver. To evaluate the therapy success of RFA, thorough follow-up imaging is essential. Conventionally, shape, size, and position of tumor and coagulation are visually compared in a side-by-side manner using pre- and post-interventional images. To objectify the verification of treatment success, a novel software assistant allowing for fast and accurate comparison of tumor and coagulation is proposed. In this work, the clinical value of the proposed assessment software is evaluated. In a retrospective clinical study, 39 cases of hepatic tumor ablation are evaluated using the prototype software and conventional image comparison by four radiologists with different levels of experience. The cases are randomized and evaluated in two sessions to avoid any recall bias. Confidence in the correct diagnosis (local recurrence vs. no local recurrence) is rated on a six-point scale for each case by the radiologists. Sensitivity, specificity, positive and negative predictive values as well as receiver operating characteristic (ROC) curves are calculated for both methods. It is shown that the software-assisted method allows physicians to correctly identify local tumor recurrence with a higher percentage than the conventional method (sensitivity: 0.6 vs. 0.35), whereas the percentage of correctly identified successful ablations is slightly reduced (specificity: 0.83 vs. 0.89).
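
    For reference, the reader-study metrics quoted above follow directly from a 2x2 confusion table per method. The sketch below shows only that arithmetic; the counts are hypothetical and do not correspond to the 39 ablation cases.

```python
def reader_study_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),   # recurrences correctly identified
        "specificity": tn / (tn + fp),   # successful ablations correctly identified
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical counts for one reader and one method (not the study's data).
print(reader_study_metrics(tp=12, fp=3, fn=8, tn=16))
```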

  11. Software Library: A Reusable Software Issue.

    DTIC Science & Technology

    1984-06-01

    Keywords: Software Library; Program Library; Reusability; Generator. ... Software Library. A particular example of the Software Library, the Program Library, is described as a prototype of a reusable library. A hierarchical... programming libraries are described. Finally, non-code products in the Software Library are discussed.

  12. Let's get physical!: Comment on "Physical methods for genetic transformation of fungi and yeast" by Ana L. Rivera, Denis Magaña-Ortíz, Miguel Gómez-Lim, Francisco Fernández and Achim M. Loske.

    USDA-ARS?s Scientific Manuscript database

    This comment analyses the paper “Physical methods for genetic transformation of fungi and yeast” by Ana L. Rivera, Denis Magaña-Ortíz, Miguel Gómez-Lim, Francisco Fernández and Achim M. Loske. I examine the methods described and their advantages and disadvantages. I further discuss the other more ...

  13. LIM Protein Ajuba associates with the RPA complex through direct cell cycle-dependent interaction with the RPA70 subunit.

    PubMed

    Fowler, Sandy; Maguin, Pascal; Kalan, Sampada; Loayza, Diego

    2018-06-22

    DNA damage response pathways are essential for genome stability and cell survival. Specifically, the ATR kinase is activated by DNA replication stress. An early event in this activation is the recruitment and phosphorylation of RPA, a single stranded DNA binding complex composed of three subunits, RPA70, RPA32 and RPA14. We have previously shown that the LIM protein Ajuba associates with RPA, and that depletion of Ajuba leads to potent activation of ATR. In this study, we provide evidence that the Ajuba-RPA interaction occurs through direct protein contact with RPA70, and that their association is cell cycle-regulated and is reduced upon DNA replication stress. We propose a model in which Ajuba negatively regulates the ATR pathway by directly interacting with RPA70, thereby preventing inappropriate ATR activation. Our results provide a framework to further our understanding of the mechanism of ATR regulation in human cells in the context of cellular transformation.

  14. A tool to include gamma analysis software into a quality assurance program.

    PubMed

    Agnew, Christina E; McGarry, Conor K

    2016-03-01

    To provide a tool that enables gamma analysis software algorithms to be included in a quality assurance (QA) program. Four image sets were created, comprising two geometric images to independently test the distance-to-agreement (DTA) and dose-difference (DD) elements of the gamma algorithm, a clinical step-and-shoot IMRT field, and a clinical VMAT arc. The images were analysed using global and local gamma analysis with two in-house and eight commercially available software packages encompassing 15 software versions. The effect of image resolution on gamma pass rates was also investigated. All but one software package accurately calculated the gamma passing rate for the geometric images. Variation in global gamma passing rates of 1% at 3%/3 mm and over 2% at 1%/1 mm was measured between software packages and software versions with analysis of appropriately sampled images. This study provides a suite of test images and the gamma pass rates achieved for a selection of commercially available software. This image suite will enable validation of gamma analysis software within a QA program and provide a frame of reference by which to compare results reported in the literature from various manufacturers and software versions. Copyright © 2015. Published by Elsevier Ireland Ltd.
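
    The gamma comparison underlying these tools combines a dose-difference criterion with a distance-to-agreement criterion: for each reference point the gamma value is the minimum of sqrt((distance/DTA)^2 + (dose difference/DD)^2) over evaluated points, and a point passes when gamma is at most 1. The sketch below is a minimal global gamma pass-rate calculation on 1-D dose profiles; the synthetic Gaussian profiles and the absence of interpolation or local normalisation are simplifications, not a reproduction of any of the audited packages.

```python
import numpy as np

def global_gamma_pass_rate(ref_dose, eval_dose, spacing_mm, dd_percent=3.0, dta_mm=3.0):
    """Global gamma pass rate for two 1-D dose profiles sampled on the same grid.

    The dose difference is normalised to the reference maximum (global gamma)
    and the distance-to-agreement search runs over every evaluated point.
    """
    ref_dose = np.asarray(ref_dose, dtype=float)
    eval_dose = np.asarray(eval_dose, dtype=float)
    positions = np.arange(ref_dose.size) * spacing_mm
    dd_abs = dd_percent / 100.0 * ref_dose.max()

    gammas = np.empty(ref_dose.size)
    for i, (r_pos, r_dose) in enumerate(zip(positions, ref_dose)):
        dist_term = ((positions - r_pos) / dta_mm) ** 2
        dose_term = ((eval_dose - r_dose) / dd_abs) ** 2
        gammas[i] = np.sqrt(np.min(dist_term + dose_term))
    return 100.0 * np.mean(gammas <= 1.0)

# Hypothetical 1 mm-spaced Gaussian profiles; the evaluated one is shifted and scaled.
x = np.arange(100)
reference = np.exp(-((x - 50) / 15.0) ** 2)
evaluated = 1.02 * np.exp(-((x - 51) / 15.0) ** 2)
print(global_gamma_pass_rate(reference, evaluated, spacing_mm=1.0))
```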

  15. Product-oriented Software Certification Process for Software Synthesis

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil

    2004-01-01

    The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.

  16. Clinical Severity Classification using Automated Conjunctival Hyperemia Analysis Software in Patients with Superior Limbic Keratoconjunctivitis.

    PubMed

    Kurita, Junki; Shoji, Jun; Inada, Noriko; Yoneda, Tsuyoshi; Sumi, Tamaki; Kobayashi, Masahiko; Hoshikawa, Yasuhiro; Fukushima, Atsuki; Yamagami, Satoru

    2018-06-01

    Digitization of clinical observation is necessary for assessing the severity of superior limbic keratoconjunctivitis (SLK). This study aimed to use a novel quantitative marker to examine hyperemia in patients with SLK. We included six eyes of six patients with both dry eye disease and SLK (SLK group) and eight eyes of eight patients with Sjögren syndrome (SS group). We simultaneously obtained the objective finding scores by using slit-lamp examination and calculated the superior hyperemia index (SHI) with an automated conjunctival hyperemia analysis software by using photographs of the anterior segment. Three objective finding scores, including papillary formation of the superior palpebral conjunctiva, superior limbal hyperemia and swelling, and superior corneal epitheliopathy, were determined. The SHI was calculated as the superior/temporal ratio of bulbar conjunctival hyperemia by using the software. Fisher's exact test was used to compare a high SHI (≥1.07) ratio between the SLK and SS groups. P-Values < 0.05 were considered statistically significant. The SHI (mean ± standard deviation) in the SLK and SS groups was 1.19 ± 0.50 and 0.69 ± 0.24, respectively. The number of patients with a high SHI (≥1.07) was significantly higher in the SLK group than in the SS group (p < 0.05). The sensitivity and specificity of the SHI in the differential diagnosis between SS and SLK were 66.7% and 87.5%, respectively. An analysis of the association between the objective finding scores and SHI showed that the SHI had a tendency to indicate the severity of superior limbal hyperemia and swelling score in the SLK group. The SHI calculated using the automated conjunctival hyperemia analysis software could successfully quantify superior bulbar conjunctival hyperemia and may be a useful tool for the differential diagnosis between SS and SLK and for the quantitative follow-up of patients with SLK.
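
    Read literally, the SHI is simply the ratio of superior to temporal bulbar hyperemia, with a published cut-off of 1.07. The sketch below only illustrates that ratio and threshold; the hyperemia values are hypothetical and the image-analysis step of the commercial software is not reproduced.

```python
def superior_hyperemia_index(superior_hyperemia, temporal_hyperemia):
    """SHI as the ratio of superior to temporal bulbar conjunctival hyperemia."""
    return superior_hyperemia / temporal_hyperemia

def flag_high_shi(shi, cutoff=1.07):
    """Apply the >=1.07 cut-off used in the study to flag SLK-like hyperemia."""
    return shi >= cutoff

# Hypothetical hyperemia scores from an image-analysis step (arbitrary units).
shi = superior_hyperemia_index(superior_hyperemia=0.42, temporal_hyperemia=0.33)
print(round(shi, 2), flag_high_shi(shi))
```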

  17. LIM-domain protein AJUBA suppresses malignant mesothelioma cell proliferation via Hippo signaling cascade.

    PubMed

    Tanaka, I; Osada, H; Fujii, M; Fukatsu, A; Hida, T; Horio, Y; Kondo, Y; Sato, A; Hasegawa, Y; Tsujimura, T; Sekido, Y

    2015-01-02

    Malignant mesothelioma (MM) is one of the most aggressive neoplasms usually associated with asbestos exposure and is highly refractory to current therapeutic modalities. MMs show frequent activation of a transcriptional coactivator Yes-associated protein (YAP), which is attributed to the neurofibromatosis type 2 (NF2)-Hippo pathway dysfunction, leading to deregulated cell proliferation and acquisition of a malignant phenotype. However, the whole mechanism of disordered YAP activation in MMs has not yet been well clarified. In the present study, we investigated various components of the NF2-Hippo pathway, and eventually found that MM cells frequently showed downregulation of LIM-domain protein AJUBA, a binding partner of large tumor suppressor type 2 (LATS2), which is one of the last-step kinases of the NF2-Hippo pathway. Although loss of AJUBA expression was independent of the alteration status of other Hippo pathway components, MM cell lines with AJUBA inactivation showed a more dephosphorylated (activated) level of YAP. Immunohistochemical analysis showed frequent downregulation of AJUBA in primary MMs, which was associated with YAP constitutive activation. We found that AJUBA transduction into MM cells significantly suppressed promoter activities of YAP-target genes, and the suppression of YAP activity by AJUBA was remarkably canceled by knockdown of LATS2. In connection with these results, transduction of AJUBA-expressing lentivirus significantly inhibited the proliferation and anchorage-independent growth of the MM cells that harbored ordinary LATS family expression. Taken together, our findings indicate that AJUBA negatively regulates YAP activity through the LATS family, and inactivation of AJUBA is a novel key mechanism in MM cell proliferation.

  18. Open source software projects of the caBIG In Vivo Imaging Workspace Software special interest group.

    PubMed

    Prior, Fred W; Erickson, Bradley J; Tarbox, Lawrence

    2007-11-01

    The Cancer Bioinformatics Grid (caBIG) program was created by the National Cancer Institute to facilitate sharing of IT infrastructure, data, and applications among the National Cancer Institute-sponsored cancer research centers. The program was launched in February 2004 and now links more than 50 cancer centers. In April 2005, the In Vivo Imaging Workspace was added to promote the use of imaging in cancer clinical trials. At the inaugural meeting, four special interest groups (SIGs) were established. The Software SIG was charged with identifying projects that focus on open-source software for image visualization and analysis. To date, two projects have been defined by the Software SIG. The eXtensible Imaging Platform project has produced a rapid application development environment that researchers may use to create targeted workflows customized for specific research projects. The Algorithm Validation Tools project will provide a set of tools and data structures used to capture measurement information and the associated data needed to allow a gold standard to be defined for a given database, against which change-analysis algorithms can be tested. Through these and future efforts, the caBIG In Vivo Imaging Workspace Software SIG endeavors to advance imaging informatics and provide new open-source software tools to advance cancer research.

  19. Software Defined Radio with Parallelized Software Architecture

    NASA Technical Reports Server (NTRS)

    Heckler, Greg

    2013-01-01

    This software implements software-defined radio processing over multicore, multi-CPU systems in a way that maximizes the use of CPU resources in the system. The software treats each processing step in either a communications or navigation modulator or demodulator system as an independent, threaded block. Each threaded block is defined with a programmable number of input or output buffers; these buffers are implemented using POSIX pipes. In addition, each threaded block is assigned a unique thread upon block installation. A modulator or demodulator system is built by assembly of the threaded blocks into a flow graph, which assembles the processing blocks to accomplish the desired signal processing. This software architecture allows the software to scale effortlessly between single CPU/single-core computers or multi-CPU/multi-core computers without recompilation. NASA spaceflight and ground communications systems currently rely exclusively on ASICs or FPGAs. This software allows low- and medium-bandwidth (100 bps to approximately 50 Mbps) software defined radios to be designed and implemented solely in C/C++ software, while lowering development costs and facilitating reuse and extensibility.
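
    The architecture described here (independent threaded processing blocks joined by POSIX pipes into a flow graph) can be sketched compactly, although the NASA implementation is in C/C++ rather than Python. The block names, the trivial "gain" operation and the byte-sized samples below are illustrative assumptions, not the actual radio blocks.

```python
import os
import threading

def source_block(write_fd, n_samples=16):
    """Produce a stream of byte-sized 'samples' into a POSIX pipe."""
    with os.fdopen(write_fd, "wb") as out:
        for i in range(n_samples):
            out.write(bytes([i]))

def gain_block(read_fd, write_fd, gain=2):
    """Read samples from one pipe, apply a trivial operation, write to the next pipe."""
    with os.fdopen(read_fd, "rb") as src, os.fdopen(write_fd, "wb") as out:
        while (sample := src.read(1)):
            out.write(bytes([(sample[0] * gain) % 256]))

def sink_block(read_fd, results):
    """Collect the processed stream at the end of the flow graph."""
    with os.fdopen(read_fd, "rb") as src:
        results.extend(src.read())

# Assemble a three-block flow graph: source -> gain -> sink, one thread per block.
r1, w1 = os.pipe()
r2, w2 = os.pipe()
results = []
threads = [
    threading.Thread(target=source_block, args=(w1,)),
    threading.Thread(target=gain_block, args=(r1, w2)),
    threading.Thread(target=sink_block, args=(r2, results)),
]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)
```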

  20. Software Defined Radio with Parallelized Software Architecture

    NASA Technical Reports Server (NTRS)

    Heckler, Greg

    2013-01-01

    This software implements software-defined radio processing over multi-core, multi-CPU systems in a way that maximizes the use of CPU resources in the system. The software treats each processing step in either a communications or navigation modulator or demodulator system as an independent, threaded block. Each threaded block is defined with a programmable number of input or output buffers; these buffers are implemented using POSIX pipes. In addition, each threaded block is assigned a unique thread upon block installation. A modulator or demodulator system is built by assembly of the threaded blocks into a flow graph, which assembles the processing blocks to accomplish the desired signal processing. This software architecture allows the software to scale effortlessly between single CPU/single-core computers or multi-CPU/multi-core computers without recompilation. NASA spaceflight and ground communications systems currently rely exclusively on ASICs or FPGAs. This software allows low- and medium-bandwidth (100 bps to approximately 50 Mbps) software defined radios to be designed and implemented solely in C/C++ software, while lowering development costs and facilitating reuse and extensibility.

  1. The relationships between software publications and software systems

    NASA Astrophysics Data System (ADS)

    Hogg, David W.

    2017-01-01

    When we build software systems or software tools for astronomy, we sometimes do and sometimes don't also write and publish standard scientific papers about those software systems. I will discuss the pros and cons of writing such publications. There are impacts of writing such papers immediately (they can affect the design and structure of the software project itself), in the short term (they can promote adoption and legitimize the software), in the medium term (they can provide a platform for all the literature's mechanisms for citation, criticism, and reuse), and in the long term (they can preserve ideas that are embodied in the software, possibly on timescales much longer than the lifetime of any software context). I will argue that as important as pure software contributions are to astronomy—and I am both a preacher and a practitioner—software contributions are even more valuable when they are associated with traditional scientific publications. There are exceptions and complexities of course, which I will discuss.

  2. Software tools for interactive instruction in radiologic anatomy.

    PubMed

    Alvarez, Antonio; Gold, Garry E; Tobin, Brian; Desser, Terry S

    2006-04-01

    To promote active learning in an introductory Radiologic Anatomy course through the use of computer-based exercises. DICOM datasets from our hospital PACS system were transferred to a networked cluster of desktop computers in a medical school classroom. Medical students in the Radiologic Anatomy course were divided into four small groups and assigned to work on a clinical case for 45 minutes. The groups used iPACS viewer software, a free DICOM viewer, to view images and annotate anatomic structures. The classroom instructor monitored and displayed each group's work sequentially on the master screen by running SynchronEyes, a software tool for controlling PC desktops remotely. Students were able to execute the assigned tasks using the iPACS software with minimal oversight or instruction. Course instructors displayed each group's work on the main display screen of the classroom as the students presented the rationale for their decisions. The interactive component of the course received high ratings from the students and overall course ratings were higher than in prior years when the course was given solely in lecture format. DICOM viewing software is an excellent tool for enabling students to learn radiologic anatomy from real-life clinical datasets. Interactive exercises performed in groups can be powerful tools for stimulating students to learn radiologic anatomy.

  3. Software Engineering Program: Software Process Improvement Guidebook

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  4. Management of an affiliated Physics Residency Program using a commercial software tool.

    PubMed

    Zacarias, Albert S; Mills, Michael D

    2010-06-01

    A review of commercially available allied health educational management software tools was performed to evaluate their capacity to manage program data associated with a CAMPEP-accredited Therapy Physics Residency Program. Features of these software tools include: a) didactic course reporting and organization, b) competency reporting by topic, category and didactic course, c) student time management and accounting, and d) student patient case reporting by topic, category and course. The software package includes features for recording school administrative information; setting up lists of courses, faculty, clinical sites, categories, competencies, and time logs; and the inclusion of standardized external documents. There are provisions for developing evaluation and survey instruments. The mentors and program may be evaluated by residents, and residents may be evaluated by faculty members using this feature. Competency documentation includes the time spent on the problem or with the patient, time spent with the mentor, date of the competency, and approval by the mentor and program director. Course documentation includes course and lecture title, lecturer, topic information, date of lecture and approval by the Program Director. These software tools have the facility to include multiple clinical sites, with local subadministrators having the ability to approve competencies and attendance at clinical conferences. In total, these software tools have the capability of managing all components of a CAMPEP-accredited residency program. The application database lends the software to the support of multiple affiliated clinical sites within a single residency program. Such tools are a critical and necessary component if the medical physics profession is to meet the projected needs for qualified medical physicists in future years.

  5. Knowledge and utilization of computer-software for statistics among Nigerian dentists.

    PubMed

    Chukwuneke, F N; Anyanechi, C E; Obiakor, A O; Amobi, O; Onyejiaka, N; Alamba, I

    2013-01-01

    The use of computer software for statistical analysis has transformed health information and data into their simplest form in the areas of access, storage, retrieval and analysis in the field of research. This survey was therefore carried out to assess the level of knowledge and utilization of computer software for statistical analysis among dental researchers in eastern Nigeria. Questionnaires on the use of computer software for statistical analysis were randomly distributed to 65 practicing dental surgeons with more than 5 years of experience in the tertiary academic hospitals in eastern Nigeria. The focus was on: years of clinical experience; research work experience; and knowledge and application of computer software for data processing and statistical analysis. Sixty-two (62/65; 95.4%) of these questionnaires were returned anonymously and used in our data analysis. Twenty-nine (29/62; 46.8%) respondents fell within the group with 5-10 years of clinical experience, none of whom had completed the specialist training programme. Practitioners with more than 10 years of clinical experience numbered 33 (33/62; 53.2%), of whom 15 (15/33; 45.5%) were specialists, representing 24.2% (15/62) of the total number of respondents. All 15 specialists were actively involved in research activities, and only five (5/15; 33.3%) could use statistical analysis software unaided. This study has identified poor utilization of computer software for statistical analysis among dental researchers in eastern Nigeria. This is strongly associated with a lack of early exposure to the use of such software, especially during undergraduate training. This calls for the introduction of computer training programmes into the dental curriculum to enable practitioners to develop the habit of using computer software in their research.

  6. Assembling proteomics data as a prerequisite for the analysis of large scale experiments

    PubMed Central

    Schmidt, Frank; Schmid, Monika; Thiede, Bernd; Pleißner, Klaus-Peter; Böhme, Martina; Jungblut, Peter R

    2009-01-01

    Background Despite the complete determination of the genome sequence of a huge number of bacteria, their proteomes remain relatively poorly defined. Besides new methods to increase the number of identified proteins, new database applications are necessary to store and present results of large-scale proteomics experiments. Results In the present study, a database concept has been developed to address these issues and to offer complete information via a web interface. In our concept, the Oracle-based data repository system SQL-LIMS plays the central role in the proteomics workflow and was applied to the proteomes of Mycobacterium tuberculosis, Helicobacter pylori, Salmonella typhimurium and protein complexes such as the 20S proteasome. Technical operations of our proteomics labs were used as the standard for SQL-LIMS template creation. By means of a Java-based data parser, post-processed data of different approaches, such as LC/ESI-MS, MALDI-MS and 2-D gel electrophoresis (2-DE), were stored in SQL-LIMS. A minimum set of the proteomics data were transferred to our public 2D-PAGE database using a Java-based interface (Data Transfer Tool) meeting the requirements of the PEDRo standardization. Furthermore, the stored proteomics data were extractable from SQL-LIMS via XML. Conclusion The Oracle-based data repository system SQL-LIMS played the central role in the proteomics workflow concept. Technical operations of our proteomics labs were used as standards for SQL-LIMS templates. Using a Java-based parser, post-processed data of different approaches such as LC/ESI-MS, MALDI-MS, 1-DE and 2-DE were stored in SQL-LIMS. Thus, unique data formats of different instruments were unified and stored in SQL-LIMS tables. Moreover, a unique submission identifier allowed fast access to all experimental data. This was the main advantage compared to multi-software solutions, especially if personnel fluctuations are high. Moreover, large-scale and high-throughput experiments must be managed
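
    The core idea of unifying post-processed output from different instruments into common LIMS tables keyed by a submission identifier can be sketched with SQLite standing in for the Oracle-based SQL-LIMS; the table name, columns and records below are hypothetical and not the actual SQL-LIMS schema.

```python
import sqlite3

# Hypothetical unified table standing in for a SQL-LIMS protein-identification table.
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE protein_identification (
           submission_id TEXT,
           method        TEXT,   -- e.g. 'LC/ESI-MS', 'MALDI-MS', '2-DE'
           protein_acc   TEXT,
           score         REAL
       )"""
)

def store_results(submission_id, method, records):
    """Store post-processed identifications from any instrument under one submission id."""
    conn.executemany(
        "INSERT INTO protein_identification VALUES (?, ?, ?, ?)",
        [(submission_id, method, acc, score) for acc, score in records],
    )
    conn.commit()

# Parsed output from two different approaches, unified into the same table.
store_results("SUB-0001", "LC/ESI-MS", [("PROT_A", 55.2), ("PROT_B", 41.8)])
store_results("SUB-0001", "2-DE", [("PROT_A", 12.0)])

# The submission identifier gives fast access to everything from one experiment.
for row in conn.execute(
    "SELECT method, protein_acc, score FROM protein_identification WHERE submission_id = ?",
    ("SUB-0001",),
):
    print(row)
```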

  7. Wildlife software: procedures for publication of computer software

    USGS Publications Warehouse

    Samuel, M.D.

    1990-01-01

    Computers and computer software have become an integral part of the practice of wildlife science. Computers now play an important role in teaching, research, and management applications. Because of the specialized nature of wildlife problems, specific computer software is usually required to address a given problem (e.g., home range analysis). This type of software is not usually available from commercial vendors and therefore must be developed by those wildlife professionals with particular skill in computer programming. Current journal publication practices generally prevent a detailed description of computer software associated with new techniques. In addition, peer review of journal articles does not usually include a review of associated computer software. Thus, many wildlife professionals are usually unaware of computer software that would meet their needs or of major improvements in software they commonly use. Indeed most users of wildlife software learn of new programs or important changes only by word of mouth.

  8. Portable image-manipulation software: what is the extra development cost?

    PubMed

    Ligier, Y; Ratib, O; Funk, M; Perrier, R; Girard, C; Logean, M

    1992-08-01

    A hospital-wide picture archiving and communication system (PACS) project is currently under development at the University Hospital of Geneva. The visualization and manipulation of images provided by different imaging modalities constitutes one of the most challenging components of a PACS. It was necessary to provide this visualization software on a number of types of workstations because of the varying requirements imposed by the range of clinical uses it must serve. The user interface must be the same, independent of the underlying workstation. In addition to a standard set of image-manipulation and processing tools, there is a need for more specific clinical tools that can be easily adapted to specific medical requirements. To achieve this goal, it was decided to develop modular and portable software called OSIRIS. This software is available on two different operating systems (the UNIX standard X-11/OSF-Motif based workstations and the Macintosh family) and can be easily ported to other systems. The extra effort required to design such software in a modular and portable way was worthwhile because it resulted in a platform that can be easily expanded and adapted to a variety of specific clinical applications. Its portability allows users to benefit from the rapidly evolving workstation technology and to adapt the performance to suit their needs.

  9. Space Flight Software Development Software for Intelligent System Health Management

    NASA Technical Reports Server (NTRS)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  10. Software attribute visualization for high integrity software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  11. A new software for prediction of femoral neck fractures.

    PubMed

    Testi, Debora; Cappello, Angelo; Sgallari, Fiorella; Rumpf, Martin; Viceconti, Marco

    2004-08-01

    Femoral neck fractures are an important clinical, social and economic problem. Although many attempts have been made to improve the accuracy of fracture risk prediction, retrospective studies have shown that the standard clinical protocol achieves an accuracy of about 65%. A new procedure was developed that includes not only bone mineral density but also geometric and femoral strength information in the prediction, achieving an accuracy of about 80% in a previous retrospective study. The aim of the present work was to re-engineer these research procedures and develop real-time software for the prediction of femoral fracture risk. The result was efficient, repeatable and easy-to-use software for evaluating femoral neck fracture risk that can be inserted into daily clinical practice, providing a useful tool for improving fracture prediction.

  12. An audit of half-count myocardial perfusion imaging using resolution recovery software.

    PubMed

    Lawson, Richard S; White, Duncan; Nijran, Kuldip; Cade, Sarah C; Hall, David O; Kenny, Bob; Knight, Andy; Livieratos, Lefteris; Murray, Anthony; Towey, David

    2014-05-01

    The Nuclear Medicine Software Quality Group of the Institute of Physics and Engineering in Medicine has conducted a multicentre, multivendor audit to evaluate the use of resolution recovery software from several manufacturers when applied to myocardial perfusion data with half the normal counts acquired under a variety of clinical protocols in a range of departments. The objective was to determine whether centres could obtain clinical results with half-count data processed with resolution recovery software that were equivalent to those obtained using their normal protocols. Sixteen centres selected 50 routine myocardial perfusion studies each, from which the Nuclear Medicine Software Quality Group generated simulated half-count studies using Poisson resampling. These half-count studies were reconstructed using resolution recovery and the clinical reports compared with the original reports from the full-count data. A total of 769 patient studies were processed and compared. Eight centres found only a small number of clinically relevant discrepancies between the two reports, whereas eight had an unacceptably high number of discrepancies. There were no significant differences in acquisition parameters between the two groups, although centres finding a high number of discrepancies were more likely to perform both rest and stress scans on normal studies. Half of the participating centres could potentially make use of resolution recovery to reduce the administered activity for myocardial perfusion scans without changing their routine acquisition protocols. The other half could consider adjusting the reconstruction parameters used with their resolution recovery software if they wish to use reduced activity successfully.
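
    One standard way to realise the Poisson resampling mentioned above is binomial thinning: keeping each detected count independently with probability 0.5 turns a Poisson variable with mean m into one with mean m/2. The sketch below shows that form of resampling under this assumption; whether the audit group used exactly this implementation is not stated in the record.

```python
import numpy as np

def half_count_resample(full_counts, fraction=0.5, seed=None):
    """Thin projection counts binomially to simulate a reduced-count acquisition.

    If a pixel value is Poisson with mean m, keeping every count independently
    with probability `fraction` yields a Poisson variable with mean fraction*m,
    i.e. a statistically consistent half-count study when fraction = 0.5.
    """
    rng = np.random.default_rng(seed)
    return rng.binomial(np.asarray(full_counts, dtype=np.int64), fraction)

# Hypothetical 64-projection 128x128 myocardial perfusion study.
rng = np.random.default_rng(1)
full_study = rng.poisson(20.0, size=(64, 128, 128))
half_study = half_count_resample(full_study, 0.5, seed=2)
print(full_study.mean(), half_study.mean())  # the second is roughly half the first
```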

  13. Software cost/resource modeling: Software quality tradeoff measurement

    NASA Technical Reports Server (NTRS)

    Lawler, R. W.

    1980-01-01

    A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.

  14. SU-E-P-43: A Knowledge Based Approach to Guidelines for Software Safety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salomons, G; Kelly, D

    Purpose: In the fall of 2012, a survey was distributed to medical physicists across Canada. The survey asked the respondents to comment on various aspects of software development and use in their clinic. The survey revealed that most centers employ locally produced (in-house) software of some kind. The respondents also indicated an interest in having software guidelines, but cautioned that the realities of cancer clinics include variations that preclude a simple solution. Traditional guidelines typically involve periodically repeating a set of prescribed tests with defined tolerance limits. However, applying a similar formula to software is problematic since it assumes that the users have a perfect knowledge of how and when to apply the software and that, if the software operates correctly under one set of conditions, it will operate correctly under all conditions. Methods: In the approach presented here the personnel involved with the software are included as an integral part of the system. Activities performed to improve the safety of the software are done with both software and people in mind. A learning-oriented approach is taken, following the premise that the best approach to safety is increasing the understanding of those associated with the use or development of the software. Results: The software guidance document is organized by areas of knowledge related to the use and development of software. The categories include: knowledge of the underlying algorithm and its limitations; knowledge of the operation of the software, such as input values, parameters, error messages, and interpretation of output; and knowledge of the environment for the software, including both data and users. Conclusion: We propose a new approach to developing guidelines which is based on acquiring knowledge rather than performing tests. The ultimate goal is to provide robust software guidelines which will be practical and effective.

  15. Software system safety

    NASA Technical Reports Server (NTRS)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  16. Software for Dosage Individualization of Voriconazole for Immunocompromised Patients

    PubMed Central

    VanGuilder, Michael; Donnelly, J. Peter; Blijlevens, Nicole M. A.; Brüggemann, Roger J. M.; Jelliffe, Roger W.; Neely, Michael N.

    2013-01-01

    The efficacy of voriconazole is potentially compromised by considerable pharmacokinetic variability. There are increasing insights into voriconazole concentrations that are safe and effective for treatment of invasive fungal infections. Therapeutic drug monitoring is increasingly advocated. Software to aid in the individualization of dosing would be an extremely useful clinical tool. We developed software to enable the individualization of voriconazole dosing to attain predefined serum concentration targets. The process of individualized voriconazole therapy was based on concepts of Bayesian stochastic adaptive control. Multiple-model dosage design with feedback control was used to calculate dosages that achieved desired concentration targets with maximum precision. The performance of the software program was assessed using the data from 10 recipients of an allogeneic hematopoietic stem cell transplant (HSCT) receiving intravenous (i.v.) voriconazole. The program was able to model the plasma concentrations with a high level of precision, despite the wide range of concentration trajectories and interindividual pharmacokinetic variability. The voriconazole concentrations predicted after the last dosages were largely concordant with those actually measured. Simulations provided an illustration of the way in which the software can be used to adjust dosages of patients falling outside desired concentration targets. This software appears to be an extremely useful tool to further optimize voriconazole therapy and aid in therapeutic drug monitoring. Further prospective studies are now required to define the utility of the controller in daily clinical practice. PMID:23380734
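
    As a rough illustration of the multiple-model idea, the sketch below re-weights a grid of candidate pharmacokinetic models by how well they explain a measured concentration and then chooses the dose rate that minimises the expected squared deviation from the target. The one-compartment steady-state model (Css = rate/CL), the clearance grid, the Gaussian error model and all numbers are assumptions for illustration, not the published controller.

```python
import numpy as np

def multiple_model_dose(cl_grid, prior, observed_css, given_rate, target_css, sigma=0.5):
    """Toy multiple-model dose controller for a one-compartment steady-state model.

    Each candidate clearance CL_i predicts Css_i = rate / CL_i.  Candidates are
    re-weighted by a Gaussian likelihood of the observed concentration, and the
    new dose rate minimises the weighted squared deviation from the target.
    """
    cl_grid = np.asarray(cl_grid, dtype=float)
    predicted = given_rate / cl_grid
    likelihood = np.exp(-0.5 * ((observed_css - predicted) / sigma) ** 2)
    weights = prior * likelihood
    weights /= weights.sum()
    # argmin over R of sum_i w_i * (R/CL_i - target)^2 has a closed-form solution:
    rate = target_css * np.sum(weights / cl_grid) / np.sum(weights / cl_grid**2)
    return rate, weights

# Hypothetical clearance grid (L/h), flat prior, one measured trough of 1.2 mg/L
# after dosing at 16 mg/h, aiming for a 3 mg/L target concentration.
cl_grid = np.linspace(2.0, 20.0, 50)
prior = np.full(cl_grid.size, 1.0 / cl_grid.size)
new_rate, posterior = multiple_model_dose(cl_grid, prior, observed_css=1.2,
                                           given_rate=16.0, target_css=3.0)
print(round(new_rate, 1), "mg/h")
```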

  17. Avoidable Software Procurements

    DTIC Science & Technology

    2012-09-01

    Keywords: software license, software usage, ELA, Software as a Service (SaaS), Software Asset... Acronyms: PaaS, Platform as a Service; SaaS, Software as a Service; SAM, Software Asset Management; SMS, System Management Server; SEWP, Solutions for Enterprise Wide... With the delivery of full Cloud Services, we will see the transition of the Cloud Computing service model from IaaS to SaaS, or Software as a Service.

  18. Confirmatory Factor Analysis Alternative: Free, Accessible CBID Software.

    PubMed

    Bott, Marjorie; Karanevich, Alex G; Garrard, Lili; Price, Larry R; Mudaranthakam, Dinesh Pal; Gajewski, Byron

    2018-02-01

    New software that performs Classical and Bayesian Instrument Development (CBID) is reported that seamlessly integrates expert (content validity) and participant data (construct validity) to produce entire reliability estimates with smaller sample requirements. The free CBID software can be accessed through a website and used by clinical investigators in new instrument development. Demonstrations are presented of the three approaches using the CBID software: (a) traditional confirmatory factor analysis (CFA), (b) Bayesian CFA using flat uninformative prior, and (c) Bayesian CFA using content expert data (informative prior). Outcomes of usability testing demonstrate the need to make the user-friendly, free CBID software available to interdisciplinary researchers. CBID has the potential to be a new and expeditious method for instrument development, adding to our current measurement toolbox. This allows for the development of new instruments for measuring determinants of health in smaller diverse populations or populations of rare diseases.

  19. Listening to the student voice to improve educational software

    PubMed Central

    van Wyk, Mari; van Ryneveld, Linda

    2017-01-01

    Academics often develop software for teaching and learning purposes with the best of intentions, only to be disappointed by the low acceptance rate of the software by their students once it is implemented. In this study, the focus is on software that was designed to enable veterinary students to record their clinical skills. A pilot of the software clearly showed that the program had not been received as well as had been anticipated, and therefore the researchers used a group interview and a questionnaire with closed-ended and open-ended questions to obtain the students’ feedback. The open-ended questions were analysed with conceptual content analysis, and themes were identified. Students made valuable suggestions about what they regarded as important considerations when a new software program is introduced. The most important lesson learnt was that students cannot always predict their needs accurately if they are asked for input prior to the development of software. For that reason student input should be obtained on a continuous and regular basis throughout the design and development phases. PMID:28678678

  20. Listening to the student voice to improve educational software.

    PubMed

    van Wyk, Mari; van Ryneveld, Linda

    2017-01-01

    Academics often develop software for teaching and learning purposes with the best of intentions, only to be disappointed by the low acceptance rate of the software by their students once it is implemented. In this study, the focus is on software that was designed to enable veterinary students to record their clinical skills. A pilot of the software clearly showed that the program had not been received as well as had been anticipated, and therefore the researchers used a group interview and a questionnaire with closed-ended and open-ended questions to obtain the students' feedback. The open-ended questions were analysed with conceptual content analysis, and themes were identified. Students made valuable suggestions about what they regarded as important considerations when a new software program is introduced. The most important lesson learnt was that students cannot always predict their needs accurately if they are asked for input prior to the development of software. For that reason student input should be obtained on a continuous and regular basis throughout the design and development phases.

  1. Agile methods in biomedical software development: a multi-site experience report.

    PubMed

    Kane, David W; Hohman, Moses M; Cerami, Ethan G; McCormick, Michael W; Kuhlmman, Karl F; Byrd, Jeff A

    2006-05-30

    Agile is an iterative approach to software development that relies on strong collaboration and automation to keep pace with dynamic environments. We have successfully used agile development approaches to create and maintain biomedical software, including software for bioinformatics. This paper reports on a qualitative study of our experiences using these methods. We have found that agile methods are well suited to the exploratory and iterative nature of scientific inquiry. They provide a robust framework for reproducing scientific results and for developing clinical support systems. The agile development approach also provides a model for collaboration between software engineers and researchers. We present our experience using agile methodologies in projects at six different biomedical software development organizations. The organizations include academic, commercial and government development teams, and included both bioinformatics and clinical support applications. We found that agile practices were a match for the needs of our biomedical projects and contributed to the success of our organizations. We found that the agile development approach was a good fit for our organizations, and that these practices should be applicable and valuable to other biomedical software development efforts. Although we found differences in how agile methods were used, we were also able to identify a set of core practices that were common to all of the groups, and that could be a focus for others seeking to adopt these methods.

  2. Agile methods in biomedical software development: a multi-site experience report

    PubMed Central

    Kane, David W; Hohman, Moses M; Cerami, Ethan G; McCormick, Michael W; Kuhlmman, Karl F; Byrd, Jeff A

    2006-01-01

    Background Agile is an iterative approach to software development that relies on strong collaboration and automation to keep pace with dynamic environments. We have successfully used agile development approaches to create and maintain biomedical software, including software for bioinformatics. This paper reports on a qualitative study of our experiences using these methods. Results We have found that agile methods are well suited to the exploratory and iterative nature of scientific inquiry. They provide a robust framework for reproducing scientific results and for developing clinical support systems. The agile development approach also provides a model for collaboration between software engineers and researchers. We present our experience using agile methodologies in projects at six different biomedical software development organizations. The organizations include academic, commercial and government development teams, and included both bioinformatics and clinical support applications. We found that agile practices were a match for the needs of our biomedical projects and contributed to the success of our organizations. Conclusion We found that the agile development approach was a good fit for our organizations, and that these practices should be applicable and valuable to other biomedical software development efforts. Although we found differences in how agile methods were used, we were also able to identify a set of core practices that were common to all of the groups, and that could be a focus for others seeking to adopt these methods. PMID:16734914

  3. The eSMAF: a software for the assessment and follow-up of functional autonomy in geriatrics

    PubMed Central

    Boissy, Patrick; Brière, Simon; Tousignant, Michel; Rousseau, Eric

    2007-01-01

    Background Functional status or disability forms the core of most assessment instruments used to identify mix and level of resources and services needed by older adults who possess common characteristics. The Functional Autonomy Measurement System (SMAF) is a 29-item scale measuring functional ability in five different areas. It has been recommended for use for home care, for allocation of chronic beds, for developing care plans in institutional settings and for epidemiological and evaluative studies. The SMAF can also be used with a case-mix classification system (Iso-SMAF) to allocate resources based on patients' functional autonomy characteristics. The objective of this project was to develop a software version of the SMAF to facilitate the evaluation of the functional status of older adults in health services research and to optimize the clinical decision-making process. Results The eSMAF was developed over a 24-month period using a modified waterfall software engineering process. Requirements and functional specifications were determined using focus groups of stakeholders. Different versions of the software were iteratively field-tested in clinical and research environments and software adaptations made accordingly. User documentation and online help were created to assist the deployment of the software. The software is available in French or English versions under a 30-day unregistered demonstration license or a free restricted registered academic license. It can be used locally on a Windows-based PC or over a network to input SMAF data into a database, search and aggregate client data according to clinical and/or administrative criteria, and generate summary or detailed reports of selected data sets for print or export to another database. Conclusion In the last year, the software has been successfully deployed in the clinical workflow of different institutions in research and clinical applications. The software performed relatively well in terms of

  4. Software For Computing Reliability Of Other Software

    NASA Technical Reports Server (NTRS)

    Nikora, Allen; Antczak, Thomas M.; Lyu, Michael

    1995-01-01

    Computer Aided Software Reliability Estimation (CASRE) is a computer program developed for use in measuring the reliability of other software. It is easier for non-specialists in reliability to use than many other currently available programs developed for the same purpose. CASRE incorporates the mathematical modeling capabilities of the public-domain Statistical Modeling and Estimation of Reliability Functions for Software (SMERFS) computer program and runs in the Windows software environment. It provides a menu-driven command interface; enabling and disabling of menu options guides the user through (1) selection of a set of failure data, (2) execution of a mathematical model, and (3) analysis of results from the model. Written in the C language.

  5. Interactions between Arctic sea ice drift, concentration and thickness modeled by NEMO-LIM3 at different resolutions

    NASA Astrophysics Data System (ADS)

    Docquier, David; Massonnet, François; Raulier, Jonathan; Lecomte, Olivier; Fichefet, Thierry

    2016-04-01

    Sea ice concentration and thickness have substantially decreased in the Arctic since the beginning of the satellite era. As a result, mechanical strength has decreased, allowing more fracturing and leading to increased sea ice drift. However, recent studies have highlighted that the interplay between sea ice thermodynamics and dynamics is poorly represented in contemporary global climate model (GCM) simulations. Thus, the considerable inter-model spread in terms of future sea ice extent projections could be reduced by better understanding the interactions between drift, concentration and thickness. This study focuses on the results coming from the global coupled ocean-sea ice model NEMO-LIM3 between 1979 and 2012. Three different simulations are forced by the Drakkar Forcing Set (DFS) 5.2 and run on the global tripolar ORCA grid at spatial resolutions of 0.25°, 1° and 2°. The relation between modeled sea ice drift, concentration and thickness is further analyzed, compared to observations and discussed in the framework of the above-mentioned poor representation. This relation is proposed as a process-based metric for evaluating model performance. This study forms part of the EU Horizon 2020 PRIMAVERA project aiming at developing a new generation of advanced and well-evaluated high-resolution GCMs.

  6. Secure software practices among Malaysian software practitioners: An exploratory study

    NASA Astrophysics Data System (ADS)

    Mohamed, Shafinah Farvin Packeer; Baharom, Fauziah; Deraman, Aziz; Yahya, Jamaiah; Mohd, Haslina

    2016-08-01

    Secure software practices are increasingly gaining importance among software practitioners and researchers due to the rise of computer crimes in the software industry. They have become one of the determinant factors for producing high-quality software. Even though their importance has been recognized, their current practice in the software industry is still scarce, particularly in Malaysia. Thus, an exploratory study was conducted among software practitioners in Malaysia to study their experiences and practices in real-world projects. This paper discusses the findings from the study, which involved 93 software practitioners. A structured questionnaire was utilized for data collection whilst statistical methods such as frequency, mean, and cross tabulation were used for data analysis. Outcomes from this study reveal that software practitioners are becoming increasingly aware of the importance of secure software practices; however, they lack appropriate implementation, which could affect the quality of the produced software.

  7. Browsing Software of the Visible Korean Data Used for Teaching Sectional Anatomy

    ERIC Educational Resources Information Center

    Shin, Dong Sun; Chung, Min Suk; Park, Hyo Seok; Park, Jin Seo; Hwang, Sung Bae

    2011-01-01

    The interpretation of computed tomographs (CTs) and magnetic resonance images (MRIs) to diagnose clinical conditions requires basic knowledge of sectional anatomy. Sectional anatomy has traditionally been taught using sectioned cadavers, atlases, and/or computer software. The computer software commonly used for this subject is practical and…

  8. [The development and evaluation of software to verify diagnostic accuracy].

    PubMed

    Jensen, Rodrigo; de Moraes Lopes, Maria Helena Baena; Silveira, Paulo Sérgio Panse; Ortega, Neli Regina Siqueira

    2012-02-01

    This article describes the development and evaluation of software that verifies the accuracy of diagnoses made by nursing students. The software was based on a model that uses fuzzy logic concepts and was implemented with Perl and the MySQL database for Internet accessibility, using the NANDA-I 2007-2008 classification system. The software was evaluated in terms of its technical quality and usability through specific instruments. The activity proposed in the software involves four stages in which students establish the relationship values between nursing diagnoses, defining characteristics/risk factors and clinical cases. The relationship values determined by students are compared to those of specialists, generating performance scores for the students. In the evaluation, the software demonstrated satisfactory outcomes regarding technical quality and, according to the students, helped in their learning and may become an educational tool to teach the process of nursing diagnosis.
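
    The authors' fuzzy model is not reproduced here, but the general scheme of scoring a student against specialists by comparing relationship values can be illustrated as below; the diagnosis/characteristic pairs, the values and the simple one-minus-mean-absolute-difference rule are illustrative assumptions only.

```python
def performance_score(student_values, expert_values):
    """Score a student against specialists by comparing relationship values in [0, 1].

    Each key is a (diagnosis, defining characteristic/risk factor) pair and each
    value is the strength of their relationship for a given clinical case.
    The score is 1 minus the mean absolute difference (1.0 = perfect agreement).
    """
    keys = expert_values.keys()
    diffs = [abs(student_values.get(k, 0.0) - expert_values[k]) for k in keys]
    return 1.0 - sum(diffs) / len(diffs)

# Hypothetical relationship values for one clinical case.
expert = {("Impaired gas exchange", "dyspnea"): 0.9,
          ("Impaired gas exchange", "restlessness"): 0.6,
          ("Anxiety", "restlessness"): 0.8}
student = {("Impaired gas exchange", "dyspnea"): 0.7,
           ("Impaired gas exchange", "restlessness"): 0.5,
           ("Anxiety", "restlessness"): 0.9}
print(round(performance_score(student, expert), 2))
```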

  9. Software Design Improvements. Part 1; Software Benefits and Limitations

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    Computer hardware and associated software have been used for many years to process accounting information, to analyze test data and to perform engineering analysis. Now computers and software also control everything from automobiles to washing machines, and the number and type of applications are growing at an exponential rate. The size of individual programs has shown similar growth. Furthermore, software and hardware are used to monitor and/or control potentially dangerous products and safety-critical systems. These uses include everything from airplanes and braking systems to medical devices and nuclear plants. The question is: how can this hardware and software be made more reliable? Also, how can software quality be improved? What methodology needs to be applied to large and small software products to improve the design, and how can software be verified?

  10. Impact of Agile Software Development Model on Software Maintainability

    ERIC Educational Resources Information Center

    Gawali, Ajay R.

    2012-01-01

    Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burdens tightly budgeted information technology (IT) organizations. Agile software development approach delivers business value early, but implications on software maintainability are still unknown. The purpose of this quantitative study…

  11. 21 CFR 862.2570 - Instrumentation for clinical multiplex test systems.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... HUMAN SERVICES (CONTINUED) MEDICAL DEVICES CLINICAL CHEMISTRY AND CLINICAL TOXICOLOGY DEVICES Clinical... hardware components, as well as raw data storage mechanisms, data acquisition software, and software to...

  12. Software Epistemology

    DTIC Science & Technology

    2016-03-01

    in-vitro decision to incubate a startup, Lexumo [7], which is developing a commercial Software as a Service (SaaS) vulnerability assessment... Acronyms: LTS, Label Transition System; MUSE, Mining and Understanding Software Enclaves; RTEMS, Real-Time Executive for Multi-processor Systems; SaaS, Software as a Service; SSA, Static Single Assignment; SWE, Software Epistemology; UD/DU, Def-Use/Use-Def Chains (Dataflow Graph).

  13. Software security checklist for the software life cycle

    NASA Technical Reports Server (NTRS)

    Gilliam, D. P.; Wolfe, T. L.; Sherif, J. S.

    2002-01-01

    A formal approach to security in the software life cycle is essential to protect corporate resources. However, little thought has been given to this aspect of software development. Due to its criticality, security should be integrated as a formal approach in the software life cycle.

  14. Computer software.

    PubMed

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  15. Evaluation of software maintainability with openEHR - a comparison of architectures.

    PubMed

    Atalag, Koray; Yang, Hong Yul; Tempero, Ewan; Warren, James R

    2014-11-01

    To assess whether it is easier to maintain a clinical information system developed using openEHR model-driven development versus mainstream methods. A new open source application (GastrOS) has been developed following openEHR's multi-level modelling approach using .Net/C#, based on the same requirements as an existing clinically used application developed using Microsoft Visual Basic and an Access database. In the latter, almost all the domain knowledge was embedded in the software code and data model; in GastrOS, the same domain knowledge is expressed as a set of openEHR archetypes. We then introduced eight real-world change requests that had accumulated during live clinical usage and implemented these in both systems while measuring the time for various development tasks and the change in software size for each change request. Overall it took half the time to implement changes in GastrOS. However, it was the more difficult application to modify for one change request, suggesting the nature of the change is also important. It was not possible to implement changes by modelling only. Comparison of relative measures of time and software size change within each application highlights how architectural differences affected maintainability across change requests. The use of openEHR model-driven development can result in better software maintainability. The degree to which openEHR affects software maintainability depends on the extent and nature of the domain knowledge involved in changes. Although we used relative measures for time and software size, confounding factors could not be totally excluded as a controlled study design was not feasible. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  16. Implications of the stratospheric water vapor distribution as determined from the Nimbus 7 LIMS experiment. [Limb Infrared Monitor of Stratosphere

    NASA Technical Reports Server (NTRS)

    Remsberg, E. E.; Russell, J. M., III; Gordley, L. L.; Gille, J. C.; Bailey, P. L.

    1984-01-01

    The LIMS experiment on Nimbus 7 has provided new results on the stratospheric water vapor distribution. The data show (1) a latitudinal gradient with mixing ratios that increase by a factor of 2 from the equator to ±60 degrees at 50 mb, (2) a fairly uniform mixing ratio of 5 ppmv at high latitudes most of the time, with more variation during winter, (3) a well-developed hygropause at low to midlatitudes of the lower stratosphere, (4) a source region of water vapor in the upper stratosphere to lower mesosphere that is consistent with methane oxidation chemistry, at least within the uncertainties of the data, (5) an apparent zonal mean H2O distribution that is consistent with the circulation proposed by Brewer in 1949, and (6) a zonal mean distribution in the lower stratosphere that is consistent with the idea of quasi-isentropic transport by eddies in the meridional direction. Limits to the use of the data in the refinement of our understanding of the stratospheric water vapor budget are noted.

  17. Objective Assessment of Joint Stiffness: A Clinically Oriented Hardware and Software Device with an Application to the Shoulder Joint.

    PubMed

    McQuade, Kevin; Price, Robert; Liu, Nelson; Ciol, Marcia A

    2012-08-30

    Examination of articular joints is largely based on subjective assessment of the "end-feel" of the joint in response to manually applied forces at different joint orientations. This technical report aims to describe the development of an objective method to examine joints in general, with specific application to the shoulder, and suitable for clinical use. We adapted existing hardware and developed laptop-based software to objectively record the force/displacement behavior of the glenohumeral joint during three common manual joint examination tests with the arm in six positions. An electromagnetic tracking system recorded three-dimensional positions of sensors attached to a clinician examiner and a patient. A hand-held force transducer recorded manually applied translational forces. The force and joint displacement were time-synchronized and the joint stiffness was calculated as a quantitative representation of the joint "end-feel." A methodology and specific system checks were developed to enhance clinical testing reproducibility and precision. The device and testing protocol were tested on 31 subjects (15 with healthy shoulders, and 16 with a variety of shoulder impairments). Results describe the stiffness responses, and demonstrate the feasibility of using the device and methods in clinical settings.
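
    As a rough illustration of the core computation, joint stiffness can be estimated as the slope of the force-displacement curve obtained from the time-synchronized recordings. The least-squares fit below is a minimal sketch under that assumption; the published device may use a different stiffness model, and all data here are synthetic.

        import numpy as np

        def estimate_stiffness(displacement_mm: np.ndarray, force_n: np.ndarray) -> float:
            """Return stiffness in N/mm as the linear slope of force vs displacement."""
            slope, _intercept = np.polyfit(displacement_mm, force_n, deg=1)
            return slope

        # Synthetic example: 5 N/mm stiffness plus measurement noise.
        rng = np.random.default_rng(0)
        disp = np.linspace(0.0, 10.0, 50)
        force = 5.0 * disp + rng.normal(scale=0.5, size=disp.size)
        print(f"estimated stiffness: {estimate_stiffness(disp, force):.2f} N/mm")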

  18. Evaluation of the BreastSimulator software platform for breast tomography

    NASA Astrophysics Data System (ADS)

    Mettivier, G.; Bliznakova, K.; Sechopoulos, I.; Boone, J. M.; Di Lillo, F.; Sarno, A.; Castriconi, R.; Russo, P.

    2017-08-01

    The aim of this work was to evaluate the software BreastSimulator, a breast x-ray imaging simulation package, as a tool for the creation of 3D uncompressed digital breast models and for the simulation and optimization of computed tomography (CT) scanners dedicated to the breast. Eight 3D digital breast phantoms were created with glandular fractions in the range 10%-35%. The models have different sizes and include realistically modelled anatomical features. X-ray CT projections were simulated for a dedicated cone-beam CT scanner and reconstructed with the FDK algorithm. Projection images were simulated for 5 mono-energetic beams (27, 32, 35, 43 and 51 keV) and 3 poly-energetic x-ray spectra typically employed in current CT scanners dedicated to the breast (49, 60, or 80 kVp). Clinical CT images acquired from two different clinical breast CT scanners were used for comparison purposes. The quantitative evaluation included calculation of the power-law exponent, β, from simulated and real breast tomograms, obtained by fitting the power spectrum with a function of the spatial frequency, f, of the form S(f) = α/f^β. The breast models were validated by comparison against clinical breast CT and published data. We found that the calculated β coefficients were close to those of clinical CT data from a dedicated breast CT scanner and to values reported in the literature. In evaluating BreastSimulator as a tool to generate breast models suitable for breast CT imaging, we found that the phantoms produced with the software can reproduce the anatomical structure of real breasts, as assessed by calculating the β exponent from the power spectral analysis of simulated images. As such, this research tool might contribute considerably to the further development, testing and optimisation of breast CT imaging techniques.
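
    The power-law exponent mentioned above can be recovered with a straight-line fit of the log power spectrum against log spatial frequency. The sketch below radially averages the 2D power spectrum of an image and fits S(f) = α/f^β; it is a simplified illustration and does not reproduce the paper's exact pipeline (ROI selection, windowing, frequency range).

        import numpy as np

        def power_law_exponent(image: np.ndarray) -> float:
            """Estimate beta from the radially averaged power spectrum of a 2D image."""
            spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
            cy, cx = spectrum.shape[0] // 2, spectrum.shape[1] // 2
            y, x = np.indices(spectrum.shape)
            r = np.hypot(y - cy, x - cx).astype(int)
            r_max = min(cy, cx)                      # keep only fully sampled radii
            mask = (r >= 1) & (r < r_max)
            sums = np.bincount(r[mask], weights=spectrum[mask], minlength=r_max)
            counts = np.bincount(r[mask], minlength=r_max)
            radial = sums[1:] / counts[1:]           # mean power at radii 1 .. r_max-1
            f = np.arange(1, r_max)
            slope, _ = np.polyfit(np.log(f), np.log(radial), deg=1)
            return -slope                            # S(f) ~ f**(-beta)  =>  beta = -slope

        beta = power_law_exponent(np.random.rand(256, 256))
        print(f"beta for white noise should be near 0: {beta:.2f}")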

  19. Software Engineering Guidebook

    NASA Technical Reports Server (NTRS)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  20. Software engineering laboratory series: Annotated bibliography of software engineering laboratory literature

    NASA Technical Reports Server (NTRS)

    Morusiewicz, Linda; Valett, Jon

    1992-01-01

    This document is an annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: (1) the Software Engineering Laboratory; (2) the Software Engineering Laboratory: Software Development Documents; (3) Software Tools; (4) Software Models; (5) Software Measurement; (6) Technology Evaluations; (7) Ada Technology; and (8) Data Collection. This document contains an index of these publications classified by individual author.

  1. Implementation of a software for REmote COMparison of PARticlE and photon treatment plans: ReCompare.

    PubMed

    Löck, Steffen; Roth, Klaus; Skripcak, Tomas; Worbs, Mario; Helmbrecht, Stephan; Jakobi, Annika; Just, Uwe; Krause, Mechthild; Baumann, Michael; Enghardt, Wolfgang; Lühr, Armin

    2015-09-01

    To guarantee equal access to optimal radiotherapy, a concept of patient assignment to photon or particle radiotherapy using remote treatment plan exchange and comparison - ReCompare - was proposed. We demonstrate the implementation of this concept and present its clinical applicability. The ReCompare concept was implemented using a client-server based software solution. A clinical workflow for the remote treatment plan exchange and comparison was defined. The steps required by the user and performed by the software for a complete plan transfer were described, and an additional module for dose-response modeling was added. The ReCompare software was successfully tested in cooperation with three external partner clinics and met all required specifications. It was compatible with several standard treatment planning systems, ensured patient data protection, and integrated into the clinical workflow. The ReCompare software can be applied to support non-particle radiotherapy institutions with the patient-specific treatment decision on the optimal irradiation modality by remote treatment plan exchange and comparison. Copyright © 2015. Published by Elsevier GmbH.

  2. The Software Engineering Laboratory: An operational software experience factory

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Caldiera, Gianluigi; Mcgarry, Frank; Pajerski, Rose; Page, Gerald; Waligora, Sharon

    1992-01-01

    For 15 years, the Software Engineering Laboratory (SEL) has been carrying out studies and experiments for the purpose of understanding, assessing, and improving software and software processes within a production software development environment at NASA/GSFC. The SEL comprises three major organizations: (1) NASA/GSFC, Flight Dynamics Division; (2) University of Maryland, Department of Computer Science; and (3) Computer Sciences Corporation, Flight Dynamics Technology Group. These organizations have jointly carried out several hundred software studies, producing hundreds of reports, papers, and documents, all of which describe some aspect of the software engineering technology that was analyzed in the flight dynamics environment at NASA. The studies range from small, controlled experiments (such as analyzing the effectiveness of code reading versus that of functional testing) to large, multiple project studies (such as assessing the impacts of Ada on a production environment). The organization's driving goal is to improve the software process continually, so that sustained improvement may be observed in the resulting products. This paper discusses the SEL as a functioning example of an operational software experience factory and summarizes the characteristics of and major lessons learned from 15 years of SEL operations.

  3. Some Methods of Applied Numerical Analysis to 3d Facial Reconstruction Software

    NASA Astrophysics Data System (ADS)

    Roşu, Şerban; Ianeş, Emilia; Roşu, Doina

    2010-09-01

    This paper deals with the collective work performed by medical doctors from the University of Medicine and Pharmacy Timisoara and engineers from the Politechnical Institute Timisoara in the effort to create the first Romanian 3D reconstruction software based on CT or MRI scans and to test the created software in clinical practice.

  4. Autonomous robot software development using simple software components

    NASA Astrophysics Data System (ADS)

    Burke, Thomas M.; Chung, Chan-Jin

    2004-10-01

    Developing software to control a sophisticated lane-following, obstacle-avoiding, autonomous robot can be demanding and beyond the capabilities of novice programmers - but it doesn't have to be. A creative software design, utilizing only basic image processing and a little algebra, has been employed to control the LTU-AISSIG autonomous robot - a contestant in the 2004 Intelligent Ground Vehicle Competition (IGVC). This paper presents a software design equivalent to that used during the IGVC, but with much of the complexity removed. The result is an autonomous robot software design that is robust, reliable, and can be implemented by programmers with a limited understanding of image processing. This design provides a solid basis for further work in autonomous robot software, as well as an interesting and achievable robotics project for students.

  5. Computer-Aided Software Engineering - An approach to real-time software development

    NASA Technical Reports Server (NTRS)

    Walker, Carrie K.; Turkovich, John J.

    1989-01-01

    A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and a history of CASE is given. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.

  6. NASA software documentation standard software engineering program

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  7. Science and Software

    NASA Astrophysics Data System (ADS)

    Zelt, C. A.

    2017-12-01

    Earth science attempts to understand how the earth works. This research often depends on software for modeling, processing, inverting or imaging. Freely sharing open-source software is essential to prevent reinventing the wheel and allows software to be improved and applied in ways the original author may never have envisioned. For young scientists, releasing software can increase their name ID when applying for jobs and funding, and create opportunities for collaborations when scientists who collect data want the software's creator to be involved in their project. However, we frequently hear scientists say software is a tool, it's not science. Creating software that implements a new or better way of earth modeling or geophysical processing, inverting or imaging should be viewed as earth science. Creating software for things like data visualization, format conversion, storage, or transmission, or programming to enhance computational performance, may be viewed as computer science. The former, ideally with an application to real data, can be published in earth science journals, the latter possibly in computer science journals. Citations in either case should accurately reflect the impact of the software on the community. Funding agencies need to support more software development and open-source releasing, and the community should give more high-profile awards for developing impactful open-source software. Funding support and community recognition for software development can have far reaching benefits when the software is used in foreseen and unforeseen ways, potentially for years after the original investment in the software development. For funding, an open-source release that is well documented should be required, with example input and output files. Appropriate funding will provide the incentive and time to release user-friendly software, and minimize the need for others to duplicate the effort. All funded software should be available through a single web site

  8. 48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... software and commercial computer software documentation. 227.7202 Section 227.7202 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202 Commercial computer software and commercial computer software documentation. ...

  9. 48 CFR 227.7203 - Noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... software and noncommercial computer software documentation. 227.7203 Section 227.7203 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203 Noncommercial computer software and noncommercial computer software documentation. ...

  10. 48 CFR 227.7203 - Noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... software and noncommercial computer software documentation. 227.7203 Section 227.7203 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203 Noncommercial computer software and noncommercial computer software documentation. ...

  11. 48 CFR 227.7203 - Noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... software and noncommercial computer software documentation. 227.7203 Section 227.7203 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203 Noncommercial computer software and noncommercial computer software documentation. ...

  12. 48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... software and commercial computer software documentation. 227.7202 Section 227.7202 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202 Commercial computer software and commercial computer software documentation. ...

  13. 48 CFR 227.7203 - Noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... software and noncommercial computer software documentation. 227.7203 Section 227.7203 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203 Noncommercial computer software and noncommercial computer software documentation. ...

  14. 48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... software and commercial computer software documentation. 227.7202 Section 227.7202 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202 Commercial computer software and commercial computer software documentation. ...

  15. 48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... software and commercial computer software documentation. 227.7202 Section 227.7202 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202 Commercial computer software and commercial computer software documentation. ...

  16. 48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... software and commercial computer software documentation. 227.7202 Section 227.7202 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202 Commercial computer software and commercial computer software documentation. ...

  17. 48 CFR 227.7203 - Noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... software and noncommercial computer software documentation. 227.7203 Section 227.7203 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203 Noncommercial computer software and noncommercial computer software documentation. ...

  18. Expression of the LIM-Homeodomain Protein Isl1 in the Developing and Mature Mouse Retina

    PubMed Central

    Elshatory, Yasser; Deng, Min; Xie, Xiaoling; Gan, Lin

    2010-01-01

    The mammalian retina comprises six major neuronal cell types, which are further subdivided into morphological and physiological subtypes. The transcriptional machinery underlying these subtype fate choices is largely unknown. The LIM-homeodomain protein Isl1 plays an essential role in central nervous system (CNS) differentiation, but its relationship to retinal neurogenesis remains unknown. We report here its dynamic spatiotemporal expression in the mouse retina. Among bipolar interneurons, Isl1 expression commences at postnatal day (P)5 and is later restricted to ON-bipolar cells. The intensity of Isl1 expression segregates the pool of ON-bipolar cells into rod and ON-cone bipolar cells, with higher expression in rod bipolar cells. As bipolar cell development proceeds from P5 to P10, the colocalization of Isl1 and the pan-bipolar cell marker Chx10 reveals the organization of ON-center bipolar cell nuclei in the upper portion of the inner nuclear layer. Further, whereas Isl1 is predominantly a ganglion cell marker prior to embryonic day (E)15.5, at E15.5 and later its expression in nonganglion cells expands. We demonstrate that these Isl1-positive, nonganglion cells acquire the expression of amacrine cell markers embryonically, likely representing nascent cholinergic amacrine cells. Taken together, Isl1 is expressed during the maturation of, and is later maintained in, retinal ganglion cells and subtypes of amacrine and bipolar cells, where it may function in the maintenance of these cells into adulthood. J. Comp. Neurol. 503: 182–197, 2007. PMID:17480014

  19. Software archeology: a case study in software quality assurance and design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macdonald, John M; Lloyd, Jane A; Turner, Cameron J

    2009-01-01

    Ideally, quality is designed into software, just as quality is designed into hardware. However, when dealing with legacy systems, demonstrating that the software meets required quality standards may be difficult to achieve. As the need to demonstrate the quality of existing software was recognized at Los Alamos National Laboratory (LANL), an effort was initiated to uncover and demonstrate that legacy software met the required quality standards. This effort led to the development of a reverse engineering approach referred to as software archaeology. This paper documents the software archaeology approaches used at LANL to document legacy software systems. A case study for the Robotic Integrated Packaging System (RIPS) software is included.

  20. Modeling and analysis of molecular interaction between Smurf1-WW2 domain and various isoforms of LIM mineralization protein.

    PubMed

    Sangadala, Sreedhara; Boden, Scott D; Metpally, Raghu Prasad Rao; Reddy, Boojala Vijay B

    2007-08-15

    LIM Mineralization Protein-1 (LMP-1) has been cloned and shown to be osteoinductive. Our efforts to understand the mode of action of LMP-1 led to the determination that LMP-1 interacts with Smad Ubiquitin Regulatory Factor-1 (Smurf1). Smurf1 targets osteogenic Smads, Smad1/5, for ubiquitin-mediated proteasomal degradation. Smurf1 interaction with LMP-1 or Smads is based on the presence of unique WW-domain interacting motif in these target molecules. By performing site-directed mutagenesis and binding studies in vitro on purified recombinant proteins, we identified a specific motif within the osteogenic region of several LMP isoforms that is necessary for Smurf1 interaction. Similarly, we have identified that the WW2 domain of Smurf1 is necessary for target protein interaction. Here, we present a homology-based modeling of the Smurf1 WW2 domain and its interacting motif of LMP-1. We performed computational docking of the interacting domains in Smurf1 and LMPs to identify the key amino acid residues involved in their binding regions. In support of the computational predictions, we also present biochemical evidence supporting the hypothesis that the physical interaction of Smurf1 and osteoinductive forms of LMP may prevent Smurf1 from targeting osteogenic Smads by ubiquitin-mediated proteasomal degradation.

  1. Recording information on protein complexes in an information management system

    PubMed Central

    Savitsky, Marc; Diprose, Jonathan M.; Morris, Chris; Griffiths, Susanne L.; Daniel, Edward; Lin, Bill; Daenke, Susan; Bishop, Benjamin; Siebold, Christian; Wilson, Keith S.; Blake, Richard; Stuart, David I.; Esnouf, Robert M.

    2011-01-01

    The Protein Information Management System (PiMS) is a laboratory information management system (LIMS) designed for use with the production of proteins in a research environment. The software is distributed under the CCP4 licence, and so is available free of charge to academic laboratories. Like most LIMS, the underlying PiMS data model originally had no support for protein–protein complexes. To support the SPINE2-Complexes project the developers have extended PiMS to meet these requirements. The modifications to PiMS, described here, include data model changes, additional protocols, some user interface changes and functionality to detect when an experiment may have formed a complex. Example data are shown for the production of a crystal of a protein complex. Integration with SPINE2-Complexes Target Tracker application is also described. PMID:21605682

  2. Craniux: A LabVIEW-Based Modular Software Framework for Brain-Machine Interface Research

    DTIC Science & Technology

    2011-01-01

    open-source BMI software solutions are currently available, we feel that the Craniux software package fills a specific need in the realm of BMI...data, such as cortical source imaging using EEG or MEG recordings. It is with these characteristics in mind that we feel the Craniux software package...S. Adee, "Dean Kamen's 'luke arm' prosthesis readies for clinical trials," IEEE Spectrum, February 2008, http://spectrum.ieee.org/biomedical

  3. Occupational therapists' perceptions about the clinical utility of the 3D interior design software.

    PubMed

    Atwa, Anita; Money, Arthur G; Spiliotopoulou, Georgia; Mcintyre, Anne

    2013-07-01

    The 3D interior design software (3DIDS) is a technology, which primarily allows users to simulate their homes and visualize any changes prior to implementing them. This feasibility study aimed to examine occupational therapists' perceptions about the clinical utility of the 3DIDS. A secondary aim was to explore the attitudes of occupational therapists towards technology in general. Three focus groups were conducted with 25 occupational therapists working with older people in the UK. The qualitative data were analysed using inductive thematic analysis. The three main themes that were identified were usage and attitudes of technology, opportunities for realistic application of the 3DIDS and related threats and benefits for the occupational therapy profession. Occupational therapists had a positive attitude towards technology. They suggested that the 3DIDS could be used in discharge planning and in rehabilitation. They viewed it as a tool that could enhance their status within the health care profession and improve communication, but not as a tool that should replace the role of the occupational therapist. This research offers new and important findings about the utilization of the 3DIDS by occupational therapists and provides information as to where this technology should be trialled.

  4. Shade matching assisted by digital photography and computer software.

    PubMed

    Schropp, Lars

    2009-04-01

    To evaluate the efficacy of digital photographs and graphic computer software for color matching compared to conventional visual matching. The shade of a tab from a shade guide (Vita 3D-Master Guide) placed in a phantom head was matched to a second guide of the same type by nine observers. This was done for twelve selected shade tabs (tests). The shade-matching procedure was performed visually in a simulated clinic environment and with digital photographs, and the time spent for both procedures was recorded. An alternative arrangement of the shade tabs was used in the digital photographs. In addition, a graphic software program was used for color analysis. Hue, chroma, and lightness values of the test tab and all tabs of the second guide were derived from the digital photographs. According to the CIE L*C*h* color system, the color differences between the test tab and tabs of the second guide were calculated. The shade guide tab that deviated least from the test tab was determined to be the match. Shade matching performance by means of graphic software was compared with the two visual methods and tested by Chi-square tests (alpha= 0.05). Eight of twelve test tabs (67%) were matched correctly by the computer software method. This was significantly better (p < 0.02) than the performance of the visual shade matching methods conducted in the simulated clinic (32% correct match) and with photographs (28% correct match). No correlation between time consumption for the visual shade matching methods and frequency of correct match was observed. Shade matching assisted by digital photographs and computer software was significantly more reliable than by conventional visual methods.
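
    The matching rule described above amounts to computing a colour difference between the test tab and every guide tab and selecting the tab with the smallest difference. The sketch below converts CIE L*C*h* values to L*a*b* and uses the Euclidean CIELAB distance as delta E; the tab values are hypothetical and the study's exact difference formula may differ.

        import math

        def lch_to_lab(L: float, C: float, h_deg: float) -> tuple:
            """Convert CIE L*C*h* (hue in degrees) to CIE L*a*b*."""
            h = math.radians(h_deg)
            return L, C * math.cos(h), C * math.sin(h)

        def delta_e(lch1: tuple, lch2: tuple) -> float:
            """Euclidean CIELAB colour difference between two L*C*h* triples."""
            return math.dist(lch_to_lab(*lch1), lch_to_lab(*lch2))

        # Hypothetical measurements taken from the photograph: (L*, C*, h in degrees).
        test_tab = (72.0, 18.5, 86.0)
        guide = {"2M2": (73.1, 19.0, 85.0), "3M2": (69.5, 21.0, 84.0), "2M3": (71.0, 24.0, 88.0)}
        best = min(guide, key=lambda name: delta_e(test_tab, guide[name]))
        print("closest shade tab:", best)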

  5. A Probabilistic Software System Attribute Acceptance Paradigm for COTS Software Evaluation

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry

    2005-01-01

    Standard software requirement formats are written from top-down perspectives only, that is, from an ideal notion of a client's needs. Despite the exactness of the standard format, software and system errors in designed systems have abounded. Bad and inadequate requirements have resulted in cost overruns, schedule slips and lost profitability. Commercial off-the-shelf (COTS) software components are even more troublesome than designed systems because they are often provided "as is" and subsequently delivered with unsubstantiated validation of described capabilities. For COTS software, there needs to be a way to express the client's software needs in a consistent and formal manner using software system attributes derived from software quality standards. Additionally, the format needs to be amenable to software evaluation processes that integrate observable evidence garnered from historical data. This paper presents a paradigm that effectively bridges the gap between what a client desires (top-down) and what has been demonstrated (bottom-up) for COTS software evaluation. The paradigm addresses the specification of needs before the software evaluation is performed and can be used to increase the shared understanding between clients and software evaluators about what is required and what is technically possible.

  6. 2009 Navy ManTech Project Book

    DTIC Science & Technology

    2009-01-01

    pieces which are welded together, filled with syntactic foam, and welded to the sail and hull structure. The ManTech project was successful in...cladding has demonstrated the required performance characteristics. The testing demonstrated manufacturability of optical fibers with enhanced hard...using Liquid Injection Molding Simulation (LIMS) and Polyworx software tools for infusion set-up optimization. Test articles fabricated are

  7. Software safety

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy

    1987-01-01

    Software safety and its relationship to other qualities are discussed. It is shown that standard reliability and fault tolerance techniques will not solve the safety problem for the present. A new attitude is required: looking at what you do NOT want the software to do along with what you want it to do, and assuming that things will go wrong. New procedures and changes to the entire software development process are necessary: special software safety analysis techniques are needed, and design techniques, especially eliminating complexity, can be very helpful.

  8. Usefulness of the automatic quantitative estimation tool for cerebral blood flow: clinical assessment of the application software tool AQCEL.

    PubMed

    Momose, Mitsuhiro; Takaki, Akihiro; Matsushita, Tsuyoshi; Yanagisawa, Shin; Yano, Kesato; Miyasaka, Tadashi; Ogura, Yuka; Kadoya, Masumi

    2011-01-01

    AQCEL enables automatic reconstruction of single-photon emission computed tomograms (SPECT) without image degradation and quantitative analysis of cerebral blood flow (CBF) after the input of simple parameters. We ascertained the usefulness and quality of images obtained by the application software AQCEL in clinical practice. Twelve patients underwent brain perfusion SPECT using technetium-99m ethyl cysteinate dimer at rest and after acetazolamide (ACZ) loading. Images reconstructed using AQCEL were compared with those reconstructed using the conventional filtered back projection (FBP) method for qualitative estimation. Two experienced nuclear medicine physicians interpreted the image quality using the following visual scores: 0, same; 1, slightly superior; 2, superior. For quantitative estimation, the mean CBF values of the normal hemisphere of the 12 patients using ACZ calculated by the AQCEL method were compared with those calculated by the conventional method. The CBF values of the 24 regions of the 3-dimensional stereotaxic region of interest template (3DSRT) calculated by the AQCEL method at rest and after ACZ loading were compared to those calculated by the conventional method. No significant qualitative difference was observed between the AQCEL and conventional FBP methods in the rest study. The average score by the AQCEL method was 0.25 ± 0.45 and that by the conventional method was 0.17 ± 0.39 (P = 0.34). There was a significant qualitative difference between the AQCEL and conventional methods in the ACZ loading study. The average score for AQCEL was 0.83 ± 0.58 and that for the conventional method was 0.08 ± 0.29 (P = 0.003). During quantitative estimation using ACZ, the mean CBF values of 12 patients calculated by the AQCEL method were 3-8% higher than those calculated by the conventional method. The square of the correlation coefficient between these methods was 0.995. While comparing the 24 3DSRT regions of 12 patients, the squares of the correlation
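
    The quantitative comparison reported above boils down to correlating regional CBF values from the two reconstructions and looking at their relative difference. The sketch below uses made-up numbers purely for illustration; numpy.corrcoef provides the Pearson correlation whose square is the quantity reported in the abstract.

        import numpy as np

        cbf_aqcel        = np.array([42.1, 38.5, 51.0, 47.3, 44.8, 55.2])  # mL/100 g/min (hypothetical)
        cbf_conventional = np.array([40.7, 37.9, 49.5, 45.8, 43.2, 53.6])

        r = np.corrcoef(cbf_aqcel, cbf_conventional)[0, 1]
        print(f"r^2 = {r**2:.3f}")
        relative_diff = (cbf_aqcel - cbf_conventional) / cbf_conventional * 100
        print(f"AQCEL values are on average {relative_diff.mean():.1f}% higher")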

  9. Software assurance standard

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This standard specifies the software assurance program for the provider of software. It also delineates the assurance activities for the provider and the assurance data that are to be furnished by the provider to the acquirer. In any software development effort, the provider is the entity or individual that actually designs, develops, and implements the software product, while the acquirer is the entity or individual who specifies the requirements and accepts the resulting products. This standard specifies at a high level an overall software assurance program for software developed for and by NASA. Assurance includes the disciplines of quality assurance, quality engineering, verification and validation, nonconformance reporting and corrective action, safety assurance, and security assurance. The application of these disciplines during a software development life cycle is called software assurance. Subsequent lower-level standards will specify the specific processes within these disciplines.

  10. Proprietary software

    NASA Technical Reports Server (NTRS)

    Marnock, M. J.

    1971-01-01

    The protection of intellectual property by a patent, a copyright, or trade secrets is reviewed. The present and future use of computers and software are discussed, along with the governmental uses of software. The popularity of contractual agreements for sale or lease of computer programs and software services is also summarized.

  11. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 15

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  12. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 14

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  13. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 13

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  14. Laboratory Animal Management Assistant (LAMA): a LIMS for active research colonies.

    PubMed

    Milisavljevic, Marko; Hearty, Taryn; Wong, Tony Y T; Portales-Casamar, Elodie; Simpson, Elizabeth M; Wasserman, Wyeth W

    2010-06-01

    Laboratory Animal Management Assistant (LAMA) is an internet-based system for tracking large laboratory mouse colonies. It has a user-friendly interface with powerful search capabilities that ease day-to-day tasks such as tracking breeding cages and weaning litters. LAMA was originally developed to manage hundreds of new mouse strains generated by a large functional genomics program, the Pleiades Promoter Project (http://www.pleiades.org). The software system has proven to be highly flexible, suitable for diverse management approaches to mouse colonies. It allows custom tagging and grouping of animals, simplifying project-specific handling and access to data. Finally, LAMA was developed in close collaboration with mouse technicians to ease the transition from paper- or Excel-based management systems to computerized tracking, allowing data export in a popular spreadsheet format and automatic printing of cage cards. LAMA is an open-access software tool, freely available to the research community at http://launchpad.net/mousedb.

  15. Learning from hackers: open-source clinical trials.

    PubMed

    Dunn, Adam G; Day, Richard O; Mandl, Kenneth D; Coiera, Enrico

    2012-05-02

    Open sharing of clinical trial data has been proposed as a way to address the gap between the production of clinical evidence and the decision-making of physicians. A similar gap was addressed in the software industry by their open-source software movement. Here, we examine how the social and technical principles of the movement can guide the growth of an open-source clinical trial community.

  16. Laboratory cost control and financial management software.

    PubMed

    Mayer, M

    1998-02-09

    Economic constraints within the health care system call for tighter control of costs in clinical laboratories. Detailed cost information forms the basis for cost control and financial management. Based on the cost information, proper decisions regarding priorities, procedure choices, personnel policies and investments can be made. This presentation outlines some principles of cost analysis, describes common limitations of cost analysis, and exemplifies the use of software to achieve optimized cost control. One commercially available cost analysis software package, LabCost, is described in some detail. In addition to providing cost information, LabCost also serves as a general management tool for resource handling, accounting, inventory management and billing. The application of LabCost in the selection process for a new high-throughput analyzer for a large clinical chemistry service is taken as an example of decisions that can be assisted by cost evaluation. It is concluded that laboratory management that wisely utilizes cost analysis to support the decision-making process will undoubtedly have a clear advantage over those laboratories that fail to employ cost considerations to guide their actions.

  17. Design and validation of Segment--freely available software for cardiovascular image analysis.

    PubMed

    Heiberg, Einar; Sjögren, Jane; Ugander, Martin; Carlsson, Marcus; Engblom, Henrik; Arheden, Håkan

    2010-01-11

    Commercially available software for cardiovascular image analysis often has limited functionality and frequently lacks the careful validation that is required for clinical studies. We have already implemented a cardiovascular image analysis software package and released it as freeware for the research community. However, it was distributed as a stand-alone application and other researchers could not extend it by writing their own custom image analysis algorithms. We believe that the work required to make a clinically applicable prototype can be reduced by making the software extensible, so that researchers can develop their own modules or improvements. Such an initiative might then serve as a bridge between image analysis research and cardiovascular research. The aim of this article is therefore to present the design and validation of a cardiovascular image analysis software package (Segment) and to announce its release in a source code format. Segment can be used for image analysis in magnetic resonance imaging (MRI), computed tomography (CT), single photon emission computed tomography (SPECT) and positron emission tomography (PET). Some of its main features include loading of DICOM images from all major scanner vendors, simultaneous display of multiple image stacks and plane intersections, automated segmentation of the left ventricle, quantification of MRI flow, tools for manual and general object segmentation, quantitative regional wall motion analysis, myocardial viability analysis and image fusion tools. Here we present an overview of the validation results and validation procedures for the functionality of the software. We describe a technique to ensure continued accuracy and validity of the software by implementing and using a test script that tests the functionality of the software and validates the output. The software has been made freely available for research purposes in a source code format on the project home page http://segment.heiberg.se. Segment is

  18. AnyWave: a cross-platform and modular software for visualizing and processing electrophysiological signals.

    PubMed

    Colombet, B; Woodman, M; Badier, J M; Bénar, C G

    2015-03-15

    The importance of digital signal processing in clinical neurophysiology is growing steadily, involving clinical researchers and methodologists. There is a need for crossing the gap between these communities by providing efficient delivery of newly designed algorithms to end users. We have developed such a tool which both visualizes and processes data and, additionally, acts as a software development platform. AnyWave was designed to run on all common operating systems. It provides access to a variety of data formats and it employs high fidelity visualization techniques. It also allows using external tools as plug-ins, which can be developed in languages including C++, MATLAB and Python. In the current version, plug-ins allow computation of connectivity graphs (non-linear correlation h2) and time-frequency representation (Morlet wavelets). The software is freely available under the LGPL3 license. AnyWave is designed as an open, highly extensible solution, with an architecture that permits rapid delivery of new techniques to end users. We have developed AnyWave software as an efficient neurophysiological data visualizer able to integrate state of the art techniques. AnyWave offers an interface well suited to the needs of clinical research and an architecture designed for integrating new tools. We expect this software to strengthen the collaboration between clinical neurophysiologists and researchers in biomedical engineering and signal processing. Copyright © 2015 Elsevier B.V. All rights reserved.
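
    As an illustration of the kind of processing such a plug-in performs, the sketch below computes a Morlet-wavelet time-frequency representation of a single channel using plain NumPy. It is a generic implementation of the technique, not AnyWave's plug-in API, and the sampling rate, test signal and frequency grid are assumptions.

        import numpy as np

        def morlet_tfr(x: np.ndarray, fs: float, freqs: np.ndarray, n_cycles: float = 6.0) -> np.ndarray:
            """Return |wavelet transform| with shape (len(freqs), len(x))."""
            tfr = np.empty((len(freqs), len(x)))
            for i, f in enumerate(freqs):
                sigma_t = n_cycles / (2 * np.pi * f)           # std of the Gaussian envelope
                t = np.arange(-4 * sigma_t, 4 * sigma_t, 1 / fs)
                wavelet = np.exp(2j * np.pi * f * t) * np.exp(-t ** 2 / (2 * sigma_t ** 2))
                wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))   # unit-energy normalisation
                tfr[i] = np.abs(np.convolve(x, wavelet, mode="same"))
            return tfr

        fs = 256.0                                             # sampling rate in Hz (assumed)
        t = np.arange(0, 4, 1 / fs)
        x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)  # toy EEG-like trace
        tfr = morlet_tfr(x, fs, np.linspace(2, 60, 30))
        print(tfr.shape)                                       # (30, 1024)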

  19. Using software metrics and software reliability models to attain acceptable quality software for flight and ground support software for avionic systems

    NASA Technical Reports Server (NTRS)

    Lawrence, Stella

    1992-01-01

    This paper is concerned with methods of measuring and developing quality software. Reliable flight and ground support software is a highly important factor in the successful operation of the space shuttle program. Reliability is probably the most important of the characteristics inherent in the concept of 'software quality'. It is the probability of failure free operation of a computer program for a specified time and environment.

  20. Software verification plan for GCS. [guidance and control software

    NASA Technical Reports Server (NTRS)

    Dent, Leslie A.; Shagnea, Anita M.; Hayhurst, Kelly J.

    1990-01-01

    This verification plan is written as part of an experiment designed to study the fundamental characteristics of the software failure process. The experiment will be conducted using several implementations of software that were produced according to industry-standard guidelines, namely the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, Software Consideration in Airborne Systems and Equipment Certification, for the development of flight software. This plan fulfills the DO-178A requirements for providing instructions on the testing of each implementation of software. The plan details the verification activities to be performed at each phase in the development process, contains a step by step description of the testing procedures, and discusses all of the tools used throughout the verification process.

  1. Preliminary clinical evaluation of automated analysis of the sublingual microcirculation in the assessment of patients with septic shock: Comparison of automated versus semi-automated software.

    PubMed

    Sharawy, Nivin; Mukhtar, Ahmed; Islam, Sufia; Mahrous, Reham; Mohamed, Hassan; Ali, Mohamed; Hakeem, Amr A; Hossny, Osama; Refaa, Amera; Saka, Ahmed; Cerny, Vladimir; Whynot, Sara; George, Ronald B; Lehmann, Christian

    2017-01-01

    The outcome of patients in septic shock has been shown to be related to changes within the microcirculation. Modern imaging technologies are available to generate high-resolution video recordings of the microcirculation in humans. However, evaluation of the microcirculation is not yet implemented in the routine clinical monitoring of critically ill patients. This is mainly due to the large amount of time and user interaction required by the current video analysis software. The aim of this study was to validate a newly developed automated method (CCTools®) for microcirculatory analysis of sublingual capillary perfusion in septic patients in comparison to standard semi-automated software (AVA3®). 204 videos from 47 patients were recorded using incident dark field (IDF) imaging. Total vessel density (TVD), proportion of perfused vessels (PPV), perfused vessel density (PVD), microvascular flow index (MFI) and heterogeneity index (HI) were measured using AVA3® and CCTools®. Significant differences between the numeric results obtained by the two software packages were observed. The values for TVD, PVD and MFI were, however, statistically related. The automated software technique succeeded in showing septic shock-induced microcirculation alterations in near real time. However, we found wide degrees of agreement between AVA3® and CCTools® values due to several technical factors that should be considered in future studies.
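
    For orientation, the indices named above are simple derived quantities once vessels and flow have been scored. The sketch below computes them from hypothetical per-vessel lengths, perfusion flags and per-quadrant flow scores, following commonly used consensus-style definitions; the actual AVA3/CCTools pipelines extract these inputs automatically from the video.

        from statistics import mean

        def density_indices(vessel_lengths_mm, perfused_flags, field_area_mm2):
            tvd = sum(vessel_lengths_mm) / field_area_mm2                  # total vessel density, mm/mm^2
            perfused = [l for l, ok in zip(vessel_lengths_mm, perfused_flags) if ok]
            ppv = 100.0 * sum(perfused) / sum(vessel_lengths_mm)           # proportion of perfused vessels, %
            pvd = sum(perfused) / field_area_mm2                           # perfused vessel density, mm/mm^2
            return tvd, ppv, pvd

        def flow_indices(quadrant_flow_scores_per_site):
            """Quadrant scores: 0 absent, 1 intermittent, 2 sluggish, 3 continuous."""
            site_mfi = [mean(q) for q in quadrant_flow_scores_per_site]
            mfi = mean(site_mfi)                                           # microvascular flow index
            hi = (max(site_mfi) - min(site_mfi)) / mfi                     # heterogeneity index
            return mfi, hi

        print(density_indices([1.2, 0.8, 2.0, 1.5], [True, True, False, True], 0.9))
        print(flow_indices([[3, 3, 2, 3], [2, 2, 3, 2], [3, 3, 3, 3]]))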

  2. Driving out errors through tight integration between software and automation.

    PubMed

    Reifsteck, Mark; Swanson, Thomas; Dallas, Mary

    2006-01-01

    A clear case has been made for using clinical IT to improve medication safety, particularly bar-code point-of-care medication administration and computerized practitioner order entry (CPOE) with clinical decision support. The equally important role of automation has been overlooked. When the two are tightly integrated, with pharmacy information serving as a hub, the distinctions between software and automation become blurred. A true end-to-end medication management system drives out errors from the dockside to the bedside. Presbyterian Healthcare Services in Albuquerque has been building such a system since 1999, beginning by automating pharmacy operations to support bar-coded medication administration. Encouraged by those results, it then began layering on software to further support clinician workflow and improve communication, culminating with the deployment of CPOE and clinical decision support. This combination, plus a hard-wired culture of safety, has resulted in a dramatically lower mortality and harm rate that could not have been achieved with a partial solution.

  3. Payload software technology: Software technology development plan

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Programmatic requirements for the advancement of software technology are identified for meeting the space flight requirements in the 1980 to 1990 time period. The development items are described, and software technology item derivation worksheets are presented along with the cost/time/priority assessments.

  4. FluxCTTX: A LIMS-based tool for management and analysis of cytotoxicity assays data

    PubMed Central

    2015-01-01

    Background Cytotoxicity assays have been used by researchers to screen for cytotoxicity in compound libraries. Researchers can either look for cytotoxic compounds or screen "hits" from initial high-throughput drug screens for unwanted cytotoxic effects before investing in their development as a pharmaceutical. These assays may be used as an alternative to animal experimentation and are becoming increasingly important in modern laboratories. However, the execution of these assays on a large scale and in different laboratories requires, among other things, the management of protocols, reagents and cell lines as well as the data produced, which can be a challenge. The management of all this information is greatly improved by the use of computational tools that save time and guarantee quality. However, a tool designed specifically for cytotoxicity assays that performs this task is not yet available. Results In this work, we have used a workflow-based LIMS -- the Flux system -- and the Together Workflow Editor as a framework to develop FluxCTTX, a tool for the management of data from cytotoxicity assays performed at different laboratories. The main work is the development of a workflow that represents all stages of the assay and has been uploaded into Flux. This workflow models the activities of cytotoxicity assays performed as described in the OECD 129 Guidance Document. Conclusions FluxCTTX presents a solution for the management of the data produced by cytotoxicity assays performed in interlaboratory comparisons. Its adoption will contribute to guaranteeing the quality of activities in the process of cytotoxicity testing and enforce the use of Good Laboratory Practices (GLP). Furthermore, the workflow developed is complete and can be adapted to other contexts and different tests for the management of other types of data. PMID:26696462

  5. The Hawaiian Freshwater Algal Database (HfwADB): a laboratory LIMS and online biodiversity resource

    PubMed Central

    2012-01-01

    Background Biodiversity databases serve the important role of highlighting species-level diversity from defined geographical regions. Databases that are specially designed to accommodate the types of data gathered during regional surveys are valuable in allowing full data access and display to researchers not directly involved with the project, while serving as a Laboratory Information Management System (LIMS). The Hawaiian Freshwater Algal Database, or HfwADB, was modified from the Hawaiian Algal Database to showcase non-marine algal specimens collected from the Hawaiian Archipelago by accommodating the additional level of organization required for samples including multiple species. Description The Hawaiian Freshwater Algal Database is a comprehensive and searchable database containing photographs and micrographs of samples and collection sites, geo-referenced collecting information, taxonomic data and standardized DNA sequence data. All data for individual samples are linked through unique 10-digit accession numbers (“Isolate Accession”), the first five of which correspond to the collection site (“Environmental Accession”). Users can search online for sample information by accession number, various levels of taxonomy, habitat or collection site. HfwADB is hosted at the University of Hawaii, and was made publicly accessible in October 2011. At the present time the database houses data for over 2,825 samples of non-marine algae from 1,786 collection sites from the Hawaiian Archipelago. These samples include cyanobacteria, red and green algae and diatoms, as well as lesser representation from some other algal lineages. Conclusions HfwADB is a digital repository that acts as a Laboratory Information Management System for Hawaiian non-marine algal data. Users can interact with the repository through the web to view relevant habitat data (including geo-referenced collection locations) and download images of collection sites, specimen photographs and
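
    The accession scheme described above (a 10-digit "Isolate Accession" whose first five digits form the "Environmental Accession" of the collection site) is easy to validate and split programmatically. The helper below is an illustrative sketch; only the field names come from the abstract, and the example code is hypothetical.

        def split_accession(isolate_accession: str) -> dict:
            """Split a 10-digit Isolate Accession into its site and sample parts."""
            if len(isolate_accession) != 10 or not isolate_accession.isdigit():
                raise ValueError("Isolate Accession must be a 10-digit string")
            return {
                "environmental_accession": isolate_accession[:5],   # collection site
                "isolate_accession": isolate_accession,             # individual sample
            }

        print(split_accession("0123400042"))
        # {'environmental_accession': '01234', 'isolate_accession': '0123400042'}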

  6. Radiological assessment of breast density by visual classification (BI-RADS) compared to automated volumetric digital software (Quantra): implications for clinical practice.

    PubMed

    Regini, Elisa; Mariscotti, Giovanna; Durando, Manuela; Ghione, Gianluca; Luparia, Andrea; Campanino, Pier Paolo; Bianchi, Caterina Chiara; Bergamasco, Laura; Fonio, Paolo; Gandini, Giovanni

    2014-10-01

    This study was done to assess breast density on digital mammography and digital breast tomosynthesis according to the visual Breast Imaging Reporting and Data System (BI-RADS) classification, to compare visual assessment with the Quantra software for automated density measurement, and to establish the role of the software in clinical practice. We analysed 200 digital mammograms acquired in 2D and 3D modality, 100 positive for breast cancer and 100 negative. Radiological density was assessed with the BI-RADS classification; a Quantra density cut-off value was sought on the 2D images only to discriminate between BI-RADS categories 1-2 and BI-RADS 3-4. Breast density was correlated with age, use of hormone therapy, and increased risk of disease. Agreement between the 2D and 3D assessments of BI-RADS density was high (K = 0.96). A cut-off value of 21% best discriminated between BI-RADS categories 1-2 and 3-4. Breast density was negatively correlated with age (r = -0.44) and positively associated with use of hormone therapy (p = 0.0004). Quantra density was higher in breasts with cancer than in healthy breasts. There is no clear difference between the visual assessments of density on 2D and 3D images. Use of the automated system requires the adoption of a cut-off value (set at 21%) to effectively discriminate BI-RADS 1-2 from 3-4; with such a threshold, the software could be useful in clinical practice.
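
    The reported 21% Quantra cut-off amounts to a one-line decision rule. The sketch below only encodes the threshold quoted in the abstract; it is not vendor software, and the function name is an assumption.

      # Illustrative decision rule based on the 21% Quantra cut-off reported above.

      QUANTRA_CUTOFF_PERCENT = 21.0  # threshold quoted in the study

      def birads_density_group(quantra_density_percent: float) -> str:
          """Map an automated volumetric density value to a coarse BI-RADS group."""
          if quantra_density_percent < QUANTRA_CUTOFF_PERCENT:
              return "BI-RADS 1-2 (non-dense)"
          return "BI-RADS 3-4 (dense)"

      print(birads_density_group(18.5))  # BI-RADS 1-2 (non-dense)
      print(birads_density_group(27.0))  # BI-RADS 3-4 (dense)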

  7. Healthcare software assurance.

    PubMed

    Cooper, Jason G; Pauley, Keith A

    2006-01-01

    Software assurance is a rigorous, lifecycle phase-independent set of activities that ensures the completeness, safety, and reliability of software processes and products. This is accomplished by guaranteeing conformance to all requirements, standards, procedures, and regulations. These assurance processes are even more important when coupled with healthcare software systems, embedded software in medical instrumentation, and other healthcare-oriented life-critical systems. The current Food and Drug Administration (FDA) regulatory requirements and guidance documentation do not address certain aspects of complete software assurance activities. In addition, the FDA's software oversight processes require enhancement to include increasingly complex healthcare systems such as Hospital Information Systems (HIS). This paper introduces the importance of complete software assurance, discusses current regulatory requirements and guidance, and highlights the necessity for enhancements to the current processes.

  8. Healthcare Software Assurance

    PubMed Central

    Cooper, Jason G.; Pauley, Keith A.

    2006-01-01

    Software assurance is a rigorous, lifecycle phase-independent set of activities that ensures the completeness, safety, and reliability of software processes and products. This is accomplished by guaranteeing conformance to all requirements, standards, procedures, and regulations. These assurance processes are even more important when coupled with healthcare software systems, embedded software in medical instrumentation, and other healthcare-oriented life-critical systems. The current Food and Drug Administration (FDA) regulatory requirements and guidance documentation do not address certain aspects of complete software assurance activities. In addition, the FDA’s software oversight processes require enhancement to include increasingly complex healthcare systems such as Hospital Information Systems (HIS). This paper introduces the importance of complete software assurance, discusses current regulatory requirements and guidance, and highlights the necessity for enhancements to the current processes. PMID:17238324

  9. Statistical Software Engineering

    DTIC Science & Technology

    1998-04-13

    Indexed reference fragments from the report: … multiversion software subject to coincident errors, IEEE Trans. Software Eng. SE-11:1511-1517; Eckhardt, D.E., A.K. Caglayan, J.C. Knight, L.D. Lee, D.F. …; … J.C. and N.G. Leveson, 1986, Experimental evaluation of the assumption of independence in multiversion software, IEEE Trans. Software …

  10. Dynamic models for estimating the effect of HAART on CD4 in observational studies: Application to the Aquitaine Cohort and the Swiss HIV Cohort Study.

    PubMed

    Prague, Mélanie; Commenges, Daniel; Gran, Jon Michael; Ledergerber, Bruno; Young, Jim; Furrer, Hansjakob; Thiébaut, Rodolphe

    2017-03-01

    Highly active antiretroviral therapy (HAART) has proved efficient in increasing CD4 counts in many randomized clinical trials. Because randomized trials have some limitations (e.g., short duration, highly selected subjects), it is interesting to assess the effect of treatments using observational studies. This is challenging because treatment is started preferentially in subjects with severe conditions. This general problem has been treated using Marginal Structural Models (MSM) relying on the counterfactual formulation. Another approach to causality is based on dynamical models. We present three discrete-time dynamic models based on linear increments models (LIM): the first based on one difference equation for CD4 counts, the second with an equilibrium point, and the third based on a system of two difference equations that allows jointly modeling CD4 counts and viral load. We also consider continuous-time models based on ordinary differential equations with non-linear mixed effects (ODE-NLME). These mechanistic models allow incorporating biological knowledge when available, which leads to increased statistical evidence for detecting treatment effects. Because inference in ODE-NLME is numerically challenging and requires specific methods and software, LIM are a valuable intermediary option in terms of consistency, precision, and complexity. We compare the different approaches in simulations and in an illustration on the ANRS CO3 Aquitaine Cohort and the Swiss HIV Cohort Study. © 2016, The International Biometric Society.
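
    As a rough illustration of the first model class, a linear increments model for CD4 counts can be written as a single difference equation. The parameterization below (in LaTeX) is a plausible sketch under assumed notation, not the authors' exact specification.

      % Illustrative single-equation linear increments model (assumed form):
      % the CD4 increment is modeled linearly in the current state and treatment.
      \[
        \mathrm{CD4}_{i,t+1} - \mathrm{CD4}_{i,t}
          = \beta_0 + \beta_1\,\mathrm{CD4}_{i,t} + \beta_2\,A_{i,t} + \varepsilon_{i,t+1},
      \]
      % where $A_{i,t}$ indicates HAART exposure for subject $i$ at time $t$ and
      % $\varepsilon_{i,t+1}$ is a mean-zero error. The joint model extends this to a
      % two-equation system in CD4 counts and (log) viral load.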

  11. Framework for Small-Scale Experiments in Software Engineering: Guidance and Control Software Project: Software Engineering Case Study

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.

    1998-01-01

    Software is becoming increasingly significant in today's critical avionics systems. To achieve safe, reliable software, government regulatory agencies such as the Federal Aviation Administration (FAA) and the Department of Defense mandate the use of certain software development methods. However, little scientific evidence exists to show a correlation between software development methods and product quality. Given this lack of evidence, a series of experiments has been conducted to understand why and how software fails. The Guidance and Control Software (GCS) project is the latest in this series. The GCS project is a case study of the Requirements and Technical Concepts for Aviation RTCA/DO-178B guidelines, Software Considerations in Airborne Systems and Equipment Certification. All civil transport airframe and equipment vendors are expected to comply with these guidelines in building systems to be certified by the FAA for use in commercial aircraft. For the case study, two implementations of a guidance and control application were developed to comply with the DO-178B guidelines for Level A (critical) software. The development included the requirements, design, coding, verification, configuration management, and quality assurance processes. This paper discusses the details of the GCS project and presents the results of the case study.

  12. Commercial Literacy Software.

    ERIC Educational Resources Information Center

    Balajthy, Ernest

    1997-01-01

    Presents the first year's results of a continuing project to monitor the availability of software of relevance for literacy education purposes. Concludes there is an enormous amount of software available for use by teachers of reading and literacy--whereas drill-and-practice software is the largest category of software available, large numbers of…

  13. Software Configuration Management Guidebook

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The growth in cost and importance of software to NASA has caused NASA to address the improvement of software development across the agency. One of the products of this program is a series of guidebooks that define a NASA concept of the assurance processes which are used in software development. The Software Assurance Guidebook, SMAP-GB-A201, issued in September, 1989, provides an overall picture of the concepts and practices of NASA in software assurance. Lower level guidebooks focus on specific activities that fall within the software assurance discipline, and provide more detailed information for the manager and/or practitioner. This is the Software Configuration Management Guidebook which describes software configuration management in a way that is compatible with practices in industry and at NASA Centers. Software configuration management is a key software development process, and is essential for doing software assurance.

  14. Software Quality Assurance Metrics

    NASA Technical Reports Server (NTRS)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and achieving a degree of excellence and refinement in a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated; the set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process used to develop a product: the process is measured to improve it, and the product is measured to increase quality throughout the software life cycle. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce it, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of software development can be measured. If software metrics are implemented in software development, they can save time and money and allow the organization to identify the causes of defects that have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects but are not currently being used by the SA team, and to report them to the Software Assurance team to see whether any could be implemented in their software assurance life cycle process.

  15. S2O - A software tool for integrating research data from general purpose statistic software into electronic data capture systems.

    PubMed

    Bruland, Philipp; Dugas, Martin

    2017-01-07

    Data capture for clinical registries or pilot studies is often performed in spreadsheet-based applications like Microsoft Excel or IBM SPSS. Usually, data are transferred into statistics software, such as SAS, R or IBM SPSS Statistics, for analysis afterwards. Spreadsheet-based solutions suffer from several drawbacks: it is generally not possible to ensure sufficient rights and role management, and it is not traced who changed which data, when, and why. Therefore, such systems cannot comply with regulatory requirements for electronic data capture in clinical trials. In contrast, Electronic Data Capture (EDC) software enables reliable, secure and auditable collection of data. In this regard, most EDC vendors support the CDISC ODM standard to define, communicate and archive clinical trial metadata and patient data. Advantages of EDC systems are support for multi-user and multicenter clinical trials as well as auditable data. Migration from spreadsheet-based data collection to EDC systems is currently labor-intensive and time-consuming. Hence, the objectives of this work are to develop a mapping model, implement a converter between the IBM SPSS format and the CDISC ODM standard, and evaluate this approach with regard to syntactic and semantic correctness. A mapping model between IBM SPSS and CDISC ODM data structures was developed. SPSS variables and patient values can be mapped and converted into ODM. Statistical and display attributes from SPSS do not correspond to any ODM elements, and study-related ODM elements are not available in SPSS. The S2O converting tool was implemented as a command-line tool using the SPSS internal Java plugin. Syntactic and semantic correctness was validated with different ODM tools and by reverse transformation from ODM into the SPSS format. Clinical data values were also successfully transformed into the ODM structure. Transformation between the spreadsheet format IBM SPSS and the ODM standard for definition and exchange of trial data is feasible
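
    To make the mapping idea concrete, the Python sketch below converts a small variable dictionary (of the kind one might export from SPSS) into CDISC ODM ItemDef elements. It is a simplified, non-validating illustration under assumed input keys and an assumed type mapping; it is not the S2O converter.

      # Simplified illustration of mapping variable metadata to CDISC ODM
      # ItemDef elements. Input keys and the type mapping are assumptions.

      import xml.etree.ElementTree as ET

      SPSS_TO_ODM_TYPE = {"F": "float", "A": "text", "DATE": "date"}  # assumed mapping

      def variables_to_odm(variables):
          """Build a minimal (non-validating) ODM fragment with one ItemDef per variable."""
          odm = ET.Element("ODM")
          meta = ET.SubElement(odm, "MetaDataVersion", OID="MDV.1", Name="Converted")
          for var in variables:
              ET.SubElement(meta, "ItemDef",
                            OID=f"IT.{var['name']}",
                            Name=var["label"],
                            DataType=SPSS_TO_ODM_TYPE.get(var["type"], "text"))
          return ET.tostring(odm, encoding="unicode")

      example = [{"name": "AGE", "label": "Age at enrolment", "type": "F"},
                 {"name": "VISITDT", "label": "Visit date", "type": "DATE"}]
      print(variables_to_odm(example))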

  16. Software Quality Perceptions of Stakeholders Involved in the Software Development Process

    ERIC Educational Resources Information Center

    Padmanabhan, Priya

    2013-01-01

    Software quality is one of the primary determinants of project management success. Stakeholders involved in software development widely agree that quality is important (Barney and Wohlin 2009). However, they may differ on what constitutes software quality, and which of its attributes are more important than others. Although, software quality…

  17. Software Reliability Analysis of NASA Space Flight Software: A Practical Experience

    PubMed Central

    Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S.; Mcginnis, Issac

    2017-01-01

    In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during flight software development. We find that this software was developed in multiple releases, each release spanning all software life-cycle phases. We also find that the software releases were developed and tested for four different hardware platforms, ranging from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGM); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the causes. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Such releases also appear to have been developed under time pressure in order to start testing on the new hardware platform sooner. These releases exhibit poor reliability growth and hence a high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Thus, our analysis provides critical insights and inputs to management to improve the software development process. As NASA has moved towards product line engineering for its flight software development, software for future space missions will be developed in a similar manner, and the analysis results for this mission can therefore be considered a baseline for future flight software missions. PMID:29278255
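
    As a rough illustration of SRGM fitting, the Python sketch below fits a delayed S-shaped NHPP mean value function, m(t) = a(1 - (1 + bt)e^{-bt}), to cumulative defect counts with scipy. The data are made up and the model choice is only an example; this is not the mission data or the authors' tooling.

      # Illustrative SRGM fit: delayed S-shaped NHPP mean value function fitted
      # to made-up cumulative defect counts (not the mission data in the paper).

      import numpy as np
      from scipy.optimize import curve_fit

      def s_shaped_mvf(t, a, b):
          # a: expected total number of defects, b: defect detection rate
          return a * (1.0 - (1.0 + b * t) * np.exp(-b * t))

      weeks = np.arange(1, 13, dtype=float)
      cum_defects = np.array([1, 3, 7, 13, 20, 27, 33, 38, 41, 43, 44, 45], dtype=float)

      (a_hat, b_hat), _ = curve_fit(s_shaped_mvf, weeks, cum_defects, p0=[50.0, 0.3])
      print(f"estimated total defects a = {a_hat:.1f}, rate b = {b_hat:.2f}")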

  18. Software Reliability Analysis of NASA Space Flight Software: A Practical Experience.

    PubMed

    Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S; Mcginnis, Issac

    2016-01-01

    In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during flight software development. We find that this software was developed in multiple releases, each release spanning all software life-cycle phases. We also find that the software releases were developed and tested for four different hardware platforms, ranging from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGM); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the causes. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Such releases also appear to have been developed under time pressure in order to start testing on the new hardware platform sooner. These releases exhibit poor reliability growth and hence a high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Thus, our analysis provides critical insights and inputs to management to improve the software development process. As NASA has moved towards product line engineering for its flight software development, software for future space missions will be developed in a similar manner, and the analysis results for this mission can therefore be considered a baseline for future flight software missions.

  19. Agile Software Development

    ERIC Educational Resources Information Center

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  20. Space Station Software Issues

    NASA Technical Reports Server (NTRS)

    Voigt, S. (Editor); Beskenis, S. (Editor)

    1985-01-01

    Issues in the development of software for the Space Station are discussed. Software acquisition and management, software development environment, standards, information system support for software developers, and a future software advisory board are addressed.

  1. The impact of software quality characteristics on healthcare outcome: a literature review.

    PubMed

    Aghazadeh, Sakineh; Pirnejad, Habibollah; Moradkhani, Alireza; Aliev, Alvosat

    2014-01-01

    The aim of this study was to discover the effect of software quality characteristics on healthcare quality and efficiency indicators. Through a systematic literature review, we selected and analyzed 37 original research papers to investigate the impact of software indicators (derived from the ISO 9126 standard quality characteristics and sub-characteristics) on some important healthcare outcome indicators, and finally ranked these software indicators. The results showed that the software characteristics usability, reliability and efficiency were mostly favored in the studies, indicating their importance. On the other hand, user satisfaction, quality of patient care, clinical workflow efficiency, providers' communication and information exchange, patient satisfaction and care costs were among the healthcare outcome indicators frequently evaluated in relation to these software characteristics. Logistic regression was the most common assessment methodology, and Confirmatory Factor Analysis and Structural Equation Modeling were performed to test the structural model's fit. The software characteristics were considered to impact the healthcare outcome indicators through other intermediate factors (variables).

  2. SU-F-T-22: Clinical Implications When Using TG-186 (ACE) Heterogeneity Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Likhacheva, A; Grade, E; Sadeghi, A

    Purpose: The purpose of this study is to compare dosimetric calculations using the traditional TG-43 formalism and the Oncentra Brachy Advanced Collapsed cone Engine (ACE) TG-186 calculation algorithm in a clinical setting. Methods: We analyzed the dosimetry of four patients treated with accelerated partial breast irradiation using a multi-channel intracavitary device (SAVI). All patients were treated to 34 Gy in 10 fractions using a high-dose-rate (192)Ir source. The plans were designed and treated using the TG-43 model. ACE was used to assess the effect of heterogeneity correction on various dosimetric parameters. Mass density was estimated using Hounsfield units. Results: Compared to the TG-43 formalism, ACE estimated lower doses to targets and organs at risk. The mean difference was 19.8% (range 15.3–24.1%) for PTV-eval V200, 12.0% (range 9.7–17.7%) for PTV-eval V150, 4.3% (range 3.3–6.5%) for PTV-eval D95, 3.3% (range 1.4–5.4%) for PTV-eval D90, 5.4% (range 2.9–9.9%) for maximum rib dose, and 5.7% (range 2.4–7.4%) for maximum skin dose. There was no correlation between the magnitude of the difference and the PTV-eval volume, air volume, or tissue-applicator conformance. Conclusion: Based on our preliminary study, the TG-43 algorithm appears to overestimate the dose to targets and organs at risk when compared to the ACE TG-186 software. We hypothesize that air adjacent to the SAVI struts contributes to a lack of scatter, producing a significant difference in dose calculation when using ACE. We believe that the ACE calculation provides a more realistic isodose distribution than TG-43. We plan to further investigate the impact of heterogeneity correction on brachytherapy planning for a wide variety of clinical scenarios, including skin, cervix/uterus, prostate, and lung.

  3. Payload software technology

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A software analysis was performed of known STS sortie payload elements and their associated experiments. This provided basic data for STS payload software characteristics and sizes. A set of technology drivers was identified based on a survey of future technology needs and an assessment of current software technology. The results will be used to evolve a planned approach to software technology development. The purpose of this plan is to ensure that software technology is advanced at a pace and a depth sufficient to fulfill the identified future needs.

  4. Measuring health care process quality with software quality measures.

    PubMed

    Yildiz, Ozkan; Demirörs, Onur

    2012-01-01

    Existing quality models focus on specific diseases, clinics or clinical areas. Although they contain structure, process, or output type measures, there is no model that measures the quality of health care processes comprehensively. In addition, because overall process quality is not measured, hospitals cannot compare the quality of their processes internally or externally. To address these problems, a new model was developed from software quality measures. We adopted the ISO/IEC 9126 software quality standard for health care processes, and then added JCIAS (Joint Commission International Accreditation Standards for Hospitals) measurable elements to the model scope to unify functional requirements. Measurement results for the assessment (diagnosing) process are provided in this paper. After the application, it was concluded that the model determines weak and strong aspects of the processes, gives a more detailed picture of process quality, and provides quantifiable information that hospitals can use to compare their processes with those of multiple organizations.

  5. Pharmaceutical advertisements in prescribing software: an analysis.

    PubMed

    Harvey, Ken J; Vitry, Agnes I; Roughead, Elizabeth; Aroni, Rosalie; Ballenden, Nicola; Faggotter, Ralph

    2005-07-18

    To assess pharmaceutical advertisements in prescribing software, their adherence to code standards, and the opinions of general practitioners regarding the advertisements. Content analysis of advertisements displayed by Medical Director version 2.81 (Health Communication Network, Sydney, NSW) in early 2005; thematic analysis of a debate on this topic held on the General Practice Computer Group email forum (GPCG_talk) during December 2004. Placement, frequency and type of advertisements; their compliance with the Medicines Australia Code of Conduct, and the views of GPs. 24 clinical functions in Medical Director contained advertisements. These included 79 different advertisements for 41 prescription products marketed by 17 companies, including one generic manufacturer. 57 of 60 (95%) advertisements making a promotional claim appeared noncompliant with one or more requirements of the Code. 29 contributors, primarily GPs, posted 174 emails to GPCG_talk; there was little support for these advertisements, but some concern that the price of software would increase if they were removed. We suggest that pharmaceutical promotion in prescribing software should be banned, and inclusion of independent therapeutic information be mandated.

  6. Report: Scientific Software.

    ERIC Educational Resources Information Center

    Borman, Stuart A.

    1985-01-01

    Discusses various aspects of scientific software, including evaluation and selection of commercial software products; program exchanges, catalogs, and other information sources; major data analysis packages; statistics and chemometrics software; and artificial intelligence. (JN)

  7. Recording information on protein complexes in an information management system.

    PubMed

    Savitsky, Marc; Diprose, Jonathan M; Morris, Chris; Griffiths, Susanne L; Daniel, Edward; Lin, Bill; Daenke, Susan; Bishop, Benjamin; Siebold, Christian; Wilson, Keith S; Blake, Richard; Stuart, David I; Esnouf, Robert M

    2011-08-01

    The Protein Information Management System (PiMS) is a laboratory information management system (LIMS) designed for use with the production of proteins in a research environment. The software is distributed under the CCP4 licence, and so is available free of charge to academic laboratories. Like most LIMS, the underlying PiMS data model originally had no support for protein-protein complexes. To support the SPINE2-Complexes project the developers have extended PiMS to meet these requirements. The modifications to PiMS, described here, include data model changes, additional protocols, some user interface changes and functionality to detect when an experiment may have formed a complex. Example data are shown for the production of a crystal of a protein complex. Integration with SPINE2-Complexes Target Tracker application is also described. Copyright © 2011 Elsevier Inc. All rights reserved.

  8. The Elements of an Effective Software Development Plan - Software Development Process Guidebook

    DTIC Science & Technology

    2011-11-11

    Indexed excerpts from the guidebook: standards and practices required for all XMPL software development; this SDP implements the <corporate> Standard Software Process (SSP) as tailored … • Developing and integrating reusable software products • Approach to managing COTS/Reuse software implementation • COTS/Reuse software selection … final selection and submit to change board for approval • Maintenance: monitor current products for obsolescence or end of support; track new …

  9. Problem-Solving Software

    NASA Technical Reports Server (NTRS)

    1992-01-01

    CBR Express software solves problems by adapting stored solutions to new problems specified by a user. It is applicable to a wide range of situations. The technology was originally developed by Inference Corporation for Johnson Space Center's Advanced Software Development Workstation. The project focused on the reuse of software designs, and Inference used CBR as part of the ACCESS prototype software. The commercial CBR Express is used as a "help desk" for customer support, enabling reuse of existing information when necessary. It has been adopted by several companies, among them American Airlines, which uses it to solve reservation system software problems.

  10. [The Development and Application of the Orthopaedics Implants Failure Database Software Based on WEB].

    PubMed

    Huang, Jiahua; Zhou, Hai; Zhang, Binbin; Ding, Biao

    2015-09-01

    This article describes the development of new web-based failure-database software for orthopaedic implants. The software uses the B/S (browser/server) mode; ASP dynamic web technology is used as the main development technology to achieve data interactivity, and Microsoft Access is used to create the database. These mature technologies make the software easy to extend and upgrade. The article presents the design and development ideas behind the software, its working process and functions, and relevant technical features. With this software, many different types of fault events involving orthopaedic implants can be stored and statistically analyzed; at the macroscopic level, it can be used to evaluate the reliability of orthopaedic implants and operations, and ultimately to guide doctors in improving the level of clinical treatment.
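
    To illustrate the kind of structured failure record such a database stores, the Python sketch below defines a minimal event schema as a dataclass. The field names are assumptions chosen for illustration; they are not the schema used by the authors.

      # Minimal illustrative schema for an implant failure event.
      # Field names are assumed; this is not the authors' database schema.

      from dataclasses import dataclass, asdict
      from datetime import date

      @dataclass
      class ImplantFailureEvent:
          event_id: str
          implant_type: str          # e.g. "locking plate", "intramedullary nail"
          failure_mode: str          # e.g. "fatigue fracture", "loosening"
          implantation_date: date
          failure_date: date
          notes: str = ""

          @property
          def service_days(self) -> int:
              # Time in service before failure, useful for reliability summaries.
              return (self.failure_date - self.implantation_date).days

      event = ImplantFailureEvent("F-0001", "locking plate", "fatigue fracture",
                                  date(2013, 4, 2), date(2014, 11, 20))
      print(asdict(event)["failure_mode"], event.service_days)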

  11. Analysis software can put surgical precision into medical device design.

    PubMed

    Jain, S

    2005-11-01

    Use of finite element analysis software can give design engineers greater freedom to experiment with new designs and materials and allow companies to get products through clinical trials and onto the market faster. This article suggests how.

  12. Software Formal Inspections Standard

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This Software Formal Inspections Standard (hereinafter referred to as Standard) is applicable to NASA software. This Standard defines the requirements that shall be fulfilled by the software formal inspections process whenever this process is specified for NASA software. The objective of this Standard is to define the requirements for a process that inspects software products to detect and eliminate defects as early as possible in the software life cycle. The process also provides for the collection and analysis of inspection data to improve the inspection process as well as the quality of the software.

  13. Software Assurance Competency Model

    DTIC Science & Technology

    2013-03-01

    Indexed excerpts from the competency model: … (COTS) software, and software as a service (SaaS). L2: Define and analyze risks in the acquisition of contracted software, COTS software, and SaaS … [2010a]: Application of technologies and processes to achieve a required level of confidence that software systems and services function in the …

  14. A Quantitative Study of Global Software Development Teams, Requirements, and Software Projects

    ERIC Educational Resources Information Center

    Parker, Linda L.

    2016-01-01

    The study explored the relationship between global software development teams, effective software requirements, and stakeholders' perception of successful software development projects within the field of information technology management. It examined the critical relationship between Global Software Development (GSD) teams creating effective…

  15. Software engineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III; Hiott, Jim; Golej, Jim; Plumb, Allan

    1993-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. The Johnson Space Center (JSC) created a significant set of tools to develop and maintain FORTRAN and C code during development of the space shuttle. This tool set forms the basis for an integrated environment to reengineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. The latest release of the environment was in Feb. 1992.

  16. Software for Optimizing Quality Assurance of Other Software

    NASA Technical Reports Server (NTRS)

    Feather, Martin; Cornford, Steven; Menzies, Tim

    2004-01-01

    Software assurance is the planned and systematic set of activities that ensures that software processes and products conform to requirements, standards, and procedures. Examples of such activities are the following: code inspections, unit tests, design reviews, performance analyses, construction of traceability matrices, etc. In practice, software development projects have only limited resources (e.g., schedule, budget, and availability of personnel) to cover the entire development effort, of which assurance is but a part. Projects must therefore select judiciously from among the possible assurance activities. At its heart, this can be viewed as an optimization problem; namely, to determine the allocation of limited resources (time, money, and personnel) to minimize risk or, alternatively, to minimize the resources needed to reduce risk to an acceptable level. The end result of the work reported here is a means to optimize quality-assurance processes used in developing software.
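
    A toy version of the resource-allocation problem described above can be framed as a budgeted selection of assurance activities. The greedy heuristic below, with made-up costs and benefits, only illustrates that framing; it is not the optimization tool described in the record.

      # Toy illustration: choose assurance activities that maximize risk
      # reduction per unit cost within a fixed budget (greedy heuristic).
      # Numbers are made up for illustration.

      activities = [  # (name, cost, expected risk reduction)
          ("code inspection", 4.0, 9.0),
          ("unit tests", 3.0, 6.0),
          ("design review", 2.0, 5.0),
          ("performance analysis", 5.0, 4.0),
          ("traceability matrix", 1.0, 2.0),
      ]

      def select_activities(budget: float):
          """Greedily pick activities by benefit/cost ratio until the budget is spent."""
          chosen, spent = [], 0.0
          for name, cost, benefit in sorted(activities,
                                            key=lambda a: a[2] / a[1], reverse=True):
              if spent + cost <= budget:
                  chosen.append(name)
                  spent += cost
          return chosen, spent

      print(select_activities(budget=8.0))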

  17. 48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation. (a...

  18. 48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation. (a...

  19. 48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation. (a...

  20. 48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation. (a...

  1. 48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation. (a...

  2. Integrated software for the detection of epileptogenic zones in refractory epilepsy.

    PubMed

    Mottini, Alejandro; Miceli, Franco; Albin, Germán; Nuñez, Margarita; Ferrándo, Rodolfo; Aguerrebere, Cecilia; Fernandez, Alicia

    2010-01-01

    In this paper we present integrated software designed to help nuclear medicine physicians detect epileptogenic zones (EZ) by means of ictal-interictal SPECT and MR images. The tool was designed to be flexible, user-friendly and efficient. A novel detection method (a-contrario analysis) was included along with the classical subtraction analysis method. The software's performance was evaluated with two separate sets of validation studies: visual interpretation of 12 patient images by an experienced observer, and objective analysis of virtual brain phantom experiments by the proposed numerical observers. Our results support the potential use of the proposed software to help nuclear medicine physicians detect EZ in clinical practice.
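
    For orientation, classical subtraction analysis can be sketched in a few lines: the ictal and interictal SPECT volumes are intensity-normalized, subtracted, and thresholded in standard-deviation units. The Python sketch below is a generic illustration (made-up threshold, crude normalization, registration omitted), not the published tool.

      # Generic sketch of ictal-interictal SPECT subtraction: normalize both
      # (already co-registered) volumes, subtract, and keep voxels whose
      # difference exceeds a z-score threshold.

      import numpy as np

      def subtraction_map(ictal: np.ndarray, interictal: np.ndarray,
                          z_threshold: float = 2.0) -> np.ndarray:
          """Return a boolean mask of relatively hyperperfused voxels (illustrative)."""
          ictal_n = ictal / ictal.mean()            # crude global normalization
          inter_n = interictal / interictal.mean()
          diff = ictal_n - inter_n
          z = (diff - diff.mean()) / diff.std()
          return z > z_threshold

      rng = np.random.default_rng(0)
      ictal = rng.normal(100, 10, size=(16, 16, 16))
      interictal = rng.normal(100, 10, size=(16, 16, 16))
      print(int(subtraction_map(ictal, interictal).sum()), "voxels above threshold")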

  3. Conceptualization and application of an approach for designing healthcare software interfaces.

    PubMed

    Kumar, Ajit; Maskara, Reena; Maskara, Sanjeev; Chiang, I-Jen

    2014-06-01

    The aim of this study is to conceptualize a novel approach that facilitates the design of prototype interfaces for healthcare software. Concepts and techniques from various disciplines were used to conceptualize an interface design approach named MORTARS (Map Original Rhetorical To Adapted Rhetorical Situation). The concepts and techniques included in this approach are (1) rhetorical situation - a concept from philosophy provided by Bitzer (1968); (2) move analysis - an applied linguistics technique provided by Swales (1990) and Bhatia (1993); (3) interface design guidelines - a cognitive and computer science concept provided by Johnson (2010); (4) usability evaluation instrument - an interface evaluation questionnaire provided by Lund (2001); (5) user modeling via stereotyping - a cognitive and computer science concept provided by Rich (1979). A prototype interface for outpatient clinic software was designed to introduce the underlying concepts of MORTARS. The prototype interface was evaluated by thirty-two medical informaticians, who found it to be useful (73.3%), easy to use (71.9%), easy to learn (93.1%), and satisfactory (53.2%). The MORTARS approach was found to be effective in designing the prototype user interface for the outpatient clinic software, and might be further used to design interfaces for software in healthcare and other domains. Copyright © 2014 Elsevier Inc. All rights reserved.

  4. Software Metrics

    DTIC Science & Technology

    1988-12-01

    Indexed excerpts from the report: the software development scene is often characterized by schedule and cost estimates that are grossly inaccurate … SEI …; cost models surveyed include c. SPQR Model (Jones) and d. COPMO (Thebaut); time (in seconds) is simply derived from E by dividing by the Stroud number, S: T = E/S; T. Capers Jones has developed a software cost estimation model called the Software Productivity, Quality, and Reliability (SPQR) model, whose basic approach is similar to that of Boehm's … The value

  5. Software Smarts

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Under an SBIR (Small Business Innovative Research) contract with Johnson Space Center, Knowledge Based Systems Inc. (KBSI) developed an intelligent software environment for modeling and analyzing mission planning activities, simulating behavior, and, using a unique constraint propagation mechanism, updating plans with each change in mission planning activities. KBSI developed this technology into a commercial product, PROJECTLINK, a two-way bridge between PROSIm, KBSI's process modeling and simulation software and leading project management software like Microsoft Project and Primavera's SureTrak Project Manager.

  6. Software To Go: A Catalog of Software Available for Loan.

    ERIC Educational Resources Information Center

    Kurlychek, Ken, Comp.

    This catalog lists the holdings of the Software To Go software lending library and clearinghouse for programs and agencies serving students or clients who are deaf or hard of hearing. An introduction describes the clearinghouse and its collection of software, much of it commercial and copyrighted material, for Apple, Macintosh, and IBM (MS-DOS)…

  7. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation. (a...

  8. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation. (a...

  9. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation. (a...

  10. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation. (a...

  11. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation. (a...

  12. Software Management Metrics

    DTIC Science & Technology

    1988-05-01

    Indexed excerpts from the report (bibliography fragments): obtained from Dr. Barry Boehm's Software Engineering Economics [1] and from T. J. … ESD/MITRE Software Center Acquisition … Contract No. F19628-86-C-0001 … References: 1. Boehm, Barry W., Software Engineering Economics, Englewood Cliffs, New … 3. Halstead, M. H., Elements of Software Science, New York … 1983, pp. 639-648 … Bibliography: Beizer, B., Software System Testing and Quality Assurance, New York: Van …; Pressman, Roger S., Software Engineering: …

  13. A software tool for advanced MRgFUS prostate therapy planning and follow up

    NASA Astrophysics Data System (ADS)

    van Straaten, Dörte; Hoogenboom, Martijn; van Amerongen, Martinus J.; Weiler, Florian; Issawi, Jumana Al; Günther, Matthias; Fütterer, Jurgen; Jenne, Jürgen W.

    2017-03-01

    US-guided HIFU/FUS ablation for the therapy of prostate cancer is a clinically established method, while MR-guided HIFU/FUS applications for the prostate have only recently entered clinical evaluation. Although MRI examination is an excellent diagnostic tool for prostate cancer, it is a time-consuming procedure and not practicable within an MRgFUS therapy session. The aim of our ongoing work is to develop software to support therapy planning and post-therapy follow-up for MRgFUS on localized prostate cancer, based on multi-parametric MR protocols. The clinical workflow of diagnosis, therapy and follow-up of MR-guided FUS on prostate cancer was analyzed in depth. Based on this, the image processing workflow was designed and all necessary components, e.g. GUI, viewer, registration tools etc., were defined and implemented. The software is based on MeVisLab, with several C++ modules implemented for the image processing tasks. The developed software, called LTC (Local Therapy Control), automatically registers and visualizes all images (T1w, T2w, DWI etc.) and ADC or perfusion maps gained from the diagnostic MRI session. This maximum of diagnostic information helps to segment all necessary ROIs, e.g. the tumor, for therapy planning. Final therapy planning is performed based on these segmentation data in the following MRgFUS therapy session. In addition, the developed software should help to evaluate therapy success by synchronizing and displaying pre-therapeutic, therapy and follow-up image data, including the therapy plan and thermal dose information. In this ongoing project, the first stand-alone prototype has been completed and will be clinically evaluated.

  14. Software Engineering Improvement Plan

    NASA Technical Reports Server (NTRS)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  15. Space Station Software Recommendations

    NASA Technical Reports Server (NTRS)

    Voigt, S. (Editor)

    1985-01-01

    Four panels of invited experts and NASA representatives focused on the following topics: software management, software development environment, languages, and software standards. Each panel deliberated in private, held two open sessions with audience participation, and developed recommendations for the NASA Space Station Program. The major thrusts of the recommendations were as follows: (1) The software management plan should establish policies, responsibilities, and decision points for software acquisition; (2) NASA should furnish a uniform modular software support environment and require its use for all space station software acquired (or developed); (3) The language Ada should be selected for space station software, and NASA should begin to address issues related to the effective use of Ada; and (4) The space station software standards should be selected (based upon existing standards where possible), and an organization should be identified to promulgate and enforce them. These and related recommendations are described in detail in the conference proceedings.

  16. Comparison of 3D reconstruction of mandible for pre-operative planning using commercial and open-source software

    NASA Astrophysics Data System (ADS)

    Abdullah, Johari Yap; Omar, Marzuki; Pritam, Helmi Mohd Hadi; Husein, Adam; Rajion, Zainul Ahmad

    2016-12-01

    3D printing of the mandible is important for pre-operative planning and diagnostic purposes, as well as for education and training. Currently, the processing of CT data is routinely performed with commercial software, which increases the cost of operation and patient management for a small clinical setting. Use of open-source software as an alternative to commercial software for 3D reconstruction of the mandible from CT data is scarce. The aim of this study is to compare two methods of 3D reconstruction of the mandible using the commercial Materialise Mimics software and the open-source Medical Imaging Interaction Toolkit (MITK) software. Head CT images with a slice thickness of 1 mm and a matrix of 512x512 pixels each were retrieved from the server located at the Radiology Department of Hospital Universiti Sains Malaysia. The CT data were analysed and 3D models of the mandible were reconstructed using both the commercial Materialise Mimics and the open-source MITK software. Both virtual 3D models were saved in STL format and exported to 3matic and MeshLab software for morphometric and image analyses. The models were compared using the Wilcoxon signed-rank test and the Hausdorff distance. No significant differences were obtained between the 3D models of the mandible produced using Mimics and MITK software. The 3D model of the mandible produced using the open-source MITK software is comparable to that from the commercial Mimics software. Therefore, open-source software could be used in a clinical setting for pre-operative planning to minimise operational cost.
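
    The Hausdorff comparison used above can be reproduced in outline with scipy: take point samples from both reconstructed surfaces and keep the larger of the two directed distances. The Python sketch below operates on plain coordinate arrays and is illustrative only; mesh loading and surface sampling (e.g. from the exported STL files) are omitted.

      # Illustrative symmetric Hausdorff distance between two surface point
      # clouds, e.g. vertices from two STL reconstructions of the same mandible.
      # Inputs are (N, 3) coordinate arrays; mesh loading is omitted.

      import numpy as np
      from scipy.spatial.distance import directed_hausdorff

      def symmetric_hausdorff(points_a: np.ndarray, points_b: np.ndarray) -> float:
          d_ab = directed_hausdorff(points_a, points_b)[0]
          d_ba = directed_hausdorff(points_b, points_a)[0]
          return max(d_ab, d_ba)

      rng = np.random.default_rng(1)
      mesh_a = rng.normal(size=(500, 3))
      mesh_b = mesh_a + rng.normal(scale=0.05, size=(500, 3))  # perturbed copy
      print(f"Hausdorff distance: {symmetric_hausdorff(mesh_a, mesh_b):.3f}")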

  17. Computer Software Configuration Item-Specific Flight Software Image Transfer Script Generator

    NASA Technical Reports Server (NTRS)

    Bolen, Kenny; Greenlaw, Ronald

    2010-01-01

    A K-shell UNIX script enables the International Space Station (ISS) Flight Control Team (FCT) operators in NASA s Mission Control Center (MCC) in Houston to transfer an entire or partial computer software configuration item (CSCI) from a flight software compact disk (CD) to the onboard Portable Computer System (PCS). The tool is designed to read the content stored on a flight software CD and generate individual CSCI transfer scripts that are capable of transferring the flight software content in a given subdirectory on the CD to the scratch directory on the PCS. The flight control team can then transfer the flight software from the PCS scratch directory to the Electronically Erasable Programmable Read Only Memory (EEPROM) of an ISS Multiplexer/ Demultiplexer (MDM) via the Indirect File Transfer capability. The individual CSCI scripts and the CSCI Specific Flight Software Image Transfer Script Generator (CFITSG), when executed a second time, will remove all components from their original execution. The tool will identify errors in the transfer process and create logs of the transferred software for the purposes of configuration management.

  18. NASA's Software Safety Standard

    NASA Technical Reports Server (NTRS)

    Ramsay, Christopher M.

    2007-01-01

    NASA relies more and more on software to control, monitor, and verify its safety critical systems, facilities and operations. Since the 1960's there has hardly been a spacecraft launched that does not have a computer on board that will provide command and control services. There have been recent incidents where software has played a role in high-profile mission failures and hazardous incidents. For example, the Mars Orbiter, Mars Polar Lander, the DART (Demonstration of Autonomous Rendezvous Technology), and MER (Mars Exploration Rover) Spirit anomalies were all caused or contributed to by software. The Mission Control Centers for the Shuttle, ISS, and unmanned programs are highly dependent on software for data displays, analysis, and mission planning. Despite this growing dependence on software control and monitoring, there has been little to no consistent application of software safety practices and methodology to NASA's projects with safety critical software. Meanwhile, academia and private industry have been stepping forward with procedures and standards for safety critical systems and software, for example Dr. Nancy Leveson's book Safeware: System Safety and Computers. The NASA Software Safety Standard, originally published in 1997, was widely ignored due to its complexity and poor organization. It also focused on concepts rather than definite procedural requirements organized around a software project lifecycle. Led by NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard has recently undergone a significant update. This new standard provides the procedures and guidelines for evaluating a project for safety criticality and then lays out the minimum project lifecycle requirements to assure the software is created, operated, and maintained in the safest possible manner. This update of the standard clearly delineates the minimum set of software safety requirements for a project without detailing the implementation for those

  19. Software Architecture Evolution

    ERIC Educational Resources Information Center

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  20. Agile Methods for Open Source Safety-Critical Software.

    PubMed

    Gary, Kevin; Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John

    2011-08-01

    The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the right amount of ceremony; therefore, if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project that has employed agile practices since 2004. We started with the assumption that a lighter process is better, focused on evolving code, and only added process elements as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested, almost a decade ago, that they are not suitable for safety-critical systems; we present our experiences as a case study for renewing the discussion.

  1. The SIFT hardware/software systems. Volume 2: Software listings

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.

    1985-01-01

    This document contains software listings of the SIFT operating system and application software. The software is coded for the most part in a variant of the Pascal language, Pascal*. Pascal* is a cross-compiler running on the VAX and Eclipse computers. The output of Pascal* is BDX-390 assembler code. When necessary, modules are written directly in BDX-390 assembler code. The listings in this document supplement the description of the SIFT system found in Volume 1 of this report, A Detailed Description.

  2. Computing and software

    USGS Publications Warehouse

    White, Gary C.; Hines, J.E.

    2004-01-01

    The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementation of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK require that each new model's likelihood must be programmed specifically for that model. They wishfully think that future software might allow the user to combine pieces of likelihood

  3. Software Engineering Laboratory Series: Proceedings of the Twentieth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  4. Software requirements: Guidance and control software development specification

    NASA Technical Reports Server (NTRS)

    Withers, B. Edward; Rich, Don C.; Lowman, Douglas S.; Buckland, R. C.

    1990-01-01

    The software requirements for an implementation of Guidance and Control Software (GCS) are specified. The purpose of the GCS is to provide guidance and engine control to a planetary landing vehicle during its terminal descent onto a planetary surface and to communicate sensory information about that vehicle and its descent to some receiving device. The specification was developed using the structured analysis for real time system specification methodology by Hatley and Pirbhai and was based on a simulation program used to study the probability of success of the 1976 Viking Lander missions to Mars. Three versions of GCS are being generated for use in software error studies.

  5. NASA's Software Safety Standard

    NASA Technical Reports Server (NTRS)

    Ramsay, Christopher M.

    2005-01-01

    NASA (National Aeronautics and Space Administration) relies more and more on software to control, monitor, and verify its safety critical systems, facilities and operations. Since the 1960s there has hardly been a spacecraft (manned or unmanned) launched that did not have a computer on board that provided vital command and control services. Despite this growing dependence on software control and monitoring, there has been no consistent application of software safety practices and methodology to NASA's projects with safety critical software. Led by the NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard (NASA-STD-8719.13B) has recently undergone a significant update in an attempt to provide that consistency. This paper will discuss the key features of the new NASA Software Safety Standard. It will start with a brief history of the use and development of software in safety critical applications at NASA. It will then give a brief overview of the NASA Software Working Group and the approach it took to revise the software engineering process across the Agency.

  6. Isolating the Effects of a Mobile Phone on the Usability and Safety of eHealth Software Applications.

    PubMed

    Borycki, Elizabeth M; Griffith, Janessa; Monkman, Helen; Reid-Haughian, Cheryl

    2017-01-01

    Mobile phones are used in conjunction with mobile eHealth software applications. These mobile software applications can be used to access, review and document clinical information. The objective of this research was to explore the relationship between mobile phones, usability and safety. Clinical simulations and semi-structured interviews were used to investigate this relationship. The findings revealed that mobile phones may lead to specific types of usability issues that may introduce some types of errors.

  7. FMT (Flight Software Memory Tracker) For Cassini Spacecraft-Software Engineering Using JAVA

    NASA Technical Reports Server (NTRS)

    Kan, Edwin P.; Uffelman, Hal; Wax, Allan H.

    1997-01-01

    The software engineering design of the Flight Software Memory Tracker (FMT) Tool is discussed in this paper. FMT is a ground analysis software set, consisting of utilities and procedures, designed to track the flight software, i.e., images of memory load and updatable parameters of the computers on-board Cassini spacecraft. FMT is implemented in Java.

  8. Adopting Open Source Software to Address Software Risks during the Scientific Data Life Cycle

    NASA Astrophysics Data System (ADS)

    Vinay, S.; Downs, R. R.

    2012-12-01

    Software enables the creation, management, storage, distribution, discovery, and use of scientific data throughout the data lifecycle. However, the capabilities offered by software also present risks for the stewardship of scientific data, since future access to digital data is dependent on the use of software. From operating systems to applications for analyzing data, the dependence of data on software presents challenges for the stewardship of scientific data. Adopting open source software provides opportunities to address some of the proprietary risks of data dependence on software. For example, in some cases, open source software can be deployed to avoid licensing restrictions for using, modifying, and transferring proprietary software. The availability of the source code of open source software also enables the inclusion of modifications, which may be contributed by various community members who are addressing similar issues. Likewise, an active community that is maintaining open source software can be a valuable source of help, providing an opportunity to collaborate to address common issues facing adopters. As part of the effort to meet the challenges of software dependence for scientific data stewardship, risks from software dependence have been identified that exist during various times of the data lifecycle. The identification of these risks should enable the development of plans for mitigating software dependencies, where applicable, using open source software, and to improve understanding of software dependency risks for scientific data and how they can be reduced during the data life cycle.

  9. Clinical evaluation of semi-automatic open-source algorithmic software segmentation of the mandibular bone: Practical feasibility and assessment of a new course of action

    PubMed Central

    Wallner, Jürgen; Hochegger, Kerstin; Chen, Xiaojun; Mischak, Irene; Reinbacher, Knut; Pau, Mauro; Zrnc, Tomislav; Schwenzer-Zimmerer, Katja; Zemann, Wolfgang; Schmalstieg, Dieter

    2018-01-01

    Introduction Computer-assisted technologies based on algorithmic software segmentation are an increasing topic of interest in complex surgical cases. However, due to functional instability, time-consuming software processes, personnel resources, or license-based financial costs, many segmentation processes are often outsourced from clinical centers to third parties and industry. Therefore, the aim of this trial was to assess the practical feasibility of an easily available, functionally stable, and license-free segmentation approach for use in clinical practice. Material and methods In this retrospective, randomized, controlled trial, the accuracy and accordance of the open-source segmentation algorithm GrowCut were assessed by comparison with the manually generated ground truth of the same anatomy, using 10 CT lower jaw data-sets from the clinical routine. Assessment parameters were the segmentation time, the volume, the voxel number, the Dice Score and the Hausdorff distance. Results Overall semi-automatic GrowCut segmentation times were about one minute. Mean Dice Score values of over 85% and Hausdorff distances below 33.5 voxels were achieved between the algorithmic GrowCut-based segmentations and the manually generated ground truth schemes. Statistical differences between the assessment parameters were not significant (p<0.05) and correlation coefficients were close to one (r > 0.94) for all of the comparisons made between the two groups. Discussion Complete, functionally stable, and time-saving segmentations with high accuracy and high positive correlation could be performed with the presented interactive open-source approach. In the cranio-maxillofacial complex, this method could represent an algorithmic alternative to image-based segmentation in clinical practice, e.g., for surgical treatment planning or visualization of postoperative results, and offers several advantages. Due to its open-source basis, the method could be

  10. Clinical evaluation of semi-automatic open-source algorithmic software segmentation of the mandibular bone: Practical feasibility and assessment of a new course of action.

    PubMed

    Wallner, Jürgen; Hochegger, Kerstin; Chen, Xiaojun; Mischak, Irene; Reinbacher, Knut; Pau, Mauro; Zrnc, Tomislav; Schwenzer-Zimmerer, Katja; Zemann, Wolfgang; Schmalstieg, Dieter; Egger, Jan

    2018-01-01

    Computer-assisted technologies based on algorithmic software segmentation are an increasing topic of interest in complex surgical cases. However, due to functional instability, time-consuming software processes, personnel resources, or license-based financial costs, many segmentation processes are often outsourced from clinical centers to third parties and industry. Therefore, the aim of this trial was to assess the practical feasibility of an easily available, functionally stable, and license-free segmentation approach for use in clinical practice. In this retrospective, randomized, controlled trial, the accuracy and accordance of the open-source segmentation algorithm GrowCut were assessed by comparison with the manually generated ground truth of the same anatomy, using 10 CT lower jaw data-sets from the clinical routine. Assessment parameters were the segmentation time, the volume, the voxel number, the Dice Score and the Hausdorff distance. Overall semi-automatic GrowCut segmentation times were about one minute. Mean Dice Score values of over 85% and Hausdorff distances below 33.5 voxels were achieved between the algorithmic GrowCut-based segmentations and the manually generated ground truth schemes. Statistical differences between the assessment parameters were not significant (p<0.05) and correlation coefficients were close to one (r > 0.94) for all of the comparisons made between the two groups. Complete, functionally stable, and time-saving segmentations with high accuracy and high positive correlation could be performed with the presented interactive open-source approach. In the cranio-maxillofacial complex, this method could represent an algorithmic alternative to image-based segmentation in clinical practice, e.g., for surgical treatment planning or visualization of postoperative results, and offers several advantages. Due to its open-source basis, the method could be further developed by other groups or specialists
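
    The Dice score reported in this record is a standard overlap metric between two binary segmentation masks. As a rough illustration only (not the authors' evaluation pipeline; the array shapes and example masks below are made up), it can be computed as follows:

    ```python
    import numpy as np

    def dice_score(seg_a: np.ndarray, seg_b: np.ndarray) -> float:
        """Dice similarity coefficient between two binary masks (1 = structure, 0 = background)."""
        seg_a = seg_a.astype(bool)
        seg_b = seg_b.astype(bool)
        intersection = np.logical_and(seg_a, seg_b).sum()
        denom = seg_a.sum() + seg_b.sum()
        return 2.0 * intersection / denom if denom else 1.0

    # Hypothetical example: compare an algorithmic mask against a manual ground truth.
    algorithmic = np.zeros((50, 50, 50), dtype=np.uint8)
    manual = np.zeros((50, 50, 50), dtype=np.uint8)
    algorithmic[10:40, 10:40, 10:40] = 1
    manual[12:40, 10:38, 10:40] = 1
    print(f"Dice score: {dice_score(algorithmic, manual):.3f}")
    ```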

  11. Software Engineering Education Directory

    DTIC Science & Technology

    1990-04-01

    and Engineering (CMSC 735) Codes: GPEV2 * Textbooks: IEEE Tutorial on Models and Metrics for Software Management and Engineering by Basili, Victor R...Software Engineering (Comp 227) Codes: GPRY5 Textbooks: IEEE Tutorial on Software Design Techniques by Freeman, Peter and Wasserman, Anthony 1. Software

  12. Finding Helpful Software Reviews.

    ERIC Educational Resources Information Center

    Kruse, Ted, Comp.

    1987-01-01

    Provides a list of evaluation services currently producing critical reviews of educational software. Includes information about The Apple K-12 Curriculum Software Reference, The Educational Software Preview, The Educational Software Selector, MicroSIFT, and Only The Best: The Discriminating Guide for Preschool-Grade 12. (TW)

  13. Software Reuse Issues

    NASA Technical Reports Server (NTRS)

    Voigt, Susan J. (Editor); Smith, Kathryn A. (Editor)

    1989-01-01

    NASA Langley Research Center sponsored a Workshop on NASA Research in Software Reuse on November 17-18, 1988 in Melbourne, Florida, hosted by Software Productivity Solutions, Inc. Participants came from four NASA centers and headquarters, eight NASA contractor companies, and three research institutes. Presentations were made on software reuse research at the four NASA centers; on Eli, the reusable software synthesis system designed and currently under development by SPS; on Space Station Freedom plans for reuse; and on other reuse research projects. This publication summarizes the presentations made and the issues discussed during the workshop.

  14. Happy software developers solve problems better: psychological measurements in empirical software engineering

    PubMed Central

    Wang, Xiaofeng; Abrahamsson, Pekka

    2014-01-01

    For more than thirty years, it has been claimed that a way to improve software developers’ productivity and software quality is to focus on people and to provide incentives to make developers satisfied and happy. This claim has rarely been verified in software engineering research, which faces an additional challenge in comparison to more traditional engineering fields: software development is an intellectual activity and is dominated by often-neglected human factors (called human aspects in software engineering research). Among the many skills required for software development, developers must possess high analytical problem-solving skills and creativity for the software construction process. According to psychology research, affective states—emotions and moods—deeply influence the cognitive processing abilities and performance of workers, including creativity and analytical problem solving. Nonetheless, little research has investigated the correlation between the affective states, creativity, and analytical problem-solving performance of programmers. This article echoes the call to employ psychological measurements in software engineering research. We report a study with 42 participants to investigate the relationship between the affective states, creativity, and analytical problem-solving skills of software developers. The results offer support for the claim that happy developers are indeed better problem solvers in terms of their analytical abilities. The following contributions are made by this study: (1) providing a better understanding of the impact of affective states on the creativity and analytical problem-solving capacities of developers, (2) introducing and validating psychological measurements, theories, and concepts of affective states, creativity, and analytical-problem-solving skills in empirical software engineering, and (3) raising the need for studying the human factors of software engineering by employing a multidisciplinary viewpoint

  15. Happy software developers solve problems better: psychological measurements in empirical software engineering.

    PubMed

    Graziotin, Daniel; Wang, Xiaofeng; Abrahamsson, Pekka

    2014-01-01

    For more than thirty years, it has been claimed that a way to improve software developers' productivity and software quality is to focus on people and to provide incentives to make developers satisfied and happy. This claim has rarely been verified in software engineering research, which faces an additional challenge in comparison to more traditional engineering fields: software development is an intellectual activity and is dominated by often-neglected human factors (called human aspects in software engineering research). Among the many skills required for software development, developers must possess high analytical problem-solving skills and creativity for the software construction process. According to psychology research, affective states-emotions and moods-deeply influence the cognitive processing abilities and performance of workers, including creativity and analytical problem solving. Nonetheless, little research has investigated the correlation between the affective states, creativity, and analytical problem-solving performance of programmers. This article echoes the call to employ psychological measurements in software engineering research. We report a study with 42 participants to investigate the relationship between the affective states, creativity, and analytical problem-solving skills of software developers. The results offer support for the claim that happy developers are indeed better problem solvers in terms of their analytical abilities. The following contributions are made by this study: (1) providing a better understanding of the impact of affective states on the creativity and analytical problem-solving capacities of developers, (2) introducing and validating psychological measurements, theories, and concepts of affective states, creativity, and analytical-problem-solving skills in empirical software engineering, and (3) raising the need for studying the human factors of software engineering by employing a multidisciplinary viewpoint.

  16. The LIM-homeodomain transcription factor LMX1B regulates expression of NF-kappa B target genes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rascle, Anne; Neumann, Tanja; Raschta, Anne-Sarah

    2009-01-01

    LMX1B is a LIM-homeodomain transcription factor essential for development. Putative LMX1B target genes have been identified through mouse gene targeting studies, but their identity as direct LMX1B targets remains hypothetical. We describe here the first molecular characterization of LMX1B target gene regulation. Microarray analysis using a tetracycline-inducible LMX1B expression system in HeLa cells revealed that a subset of NF-κB target genes, including IL-6 and IL-8, are upregulated in LMX1B-expressing cells. Inhibition of NF-κB activity by short interfering RNA-mediated knock-down of p65 impairs, while activation of NF-κB activity by TNF-α synergizes induction of NF-κB target genes by LMX1B. Chromatin immunoprecipitation demonstrated that LMX1B binds to the proximal promoter of IL-6 and IL-8 in vivo, in the vicinity of the characterized κB site, and that LMX1B recruitment correlates with increased NF-κB DNA association. IL-6 promoter-reporter assays showed that the κB site and an adjacent putative LMX1B binding motif are both involved in LMX1B-mediated transcription. Expression of NF-κB target genes is affected in the kidney of Lmx1b(-/-) knock-out mice, thus supporting the biological relevance of our findings. Together, these data demonstrate for the first time that LMX1B directly regulates transcription of a subset of NF-κB target genes in cooperation with nuclear p50/p65 NF-κB.

  17. Self-assembling software generator

    DOEpatents

    Bouchard, Ann M [Albuquerque, NM; Osbourn, Gordon C [Albuquerque, NM

    2011-11-25

    A technique to generate an executable task includes inspecting a task specification data structure to determine what software entities are to be generated to create the executable task, inspecting the task specification data structure to determine how the software entities will be linked after generating the software entities, inspecting the task specification data structure to determine logic to be executed by the software entities, and generating the software entities to create the executable task.
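
    The patent abstract above describes inspecting a task specification data structure to decide which software entities to generate, how they are linked, and what logic they execute. The following is a purely illustrative sketch of that general idea, not the patented implementation; the specification format, entity names, and chaining scheme are all hypothetical:

    ```python
    from typing import Callable, Dict

    # Hypothetical task specification: which entities to create, their logic, and how they link.
    task_spec: Dict[str, dict] = {
        "read":      {"logic": lambda data: data.strip(), "next": "normalize"},
        "normalize": {"logic": lambda data: data.lower(), "next": "emit"},
        "emit":      {"logic": lambda data: print(data),  "next": None},
    }

    def generate_task(spec: Dict[str, dict], entry: str) -> Callable[[str], None]:
        """Inspect the specification and return an executable task that chains the entities."""
        def task(payload: str) -> None:
            name = entry
            while name is not None:
                entity = spec[name]
                payload = entity["logic"](payload)
                name = entity["next"]
        return task

    executable = generate_task(task_spec, "read")
    executable("  Hello WORLD  ")  # prints "hello world"
    ```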

  18. Development and prospective evaluation of an automated software system for quality control of quantitative 99mTc-MAG3 renal studies.

    PubMed

    Folks, Russell D; Garcia, Ernest V; Taylor, Andrew T

    2007-03-01

    Quantitative nuclear renography has numerous potential sources of error. We previously reported the initial development of a computer software module for comprehensively addressing the issue of quality control (QC) in the analysis of radionuclide renal images. The objective of this study was to prospectively test the QC software. The QC software works in conjunction with standard quantitative renal image analysis using a renal quantification program. The software saves a text file that summarizes QC findings as possible errors in user-entered values, calculated values that may be unreliable because of the patient's clinical condition, and problems relating to acquisition or processing. To test the QC software, a technologist not involved in software development processed 83 consecutive nontransplant clinical studies. The QC findings of the software were then tabulated. QC events were defined as technical (study descriptors that were out of range or were entered and then changed, unusually sized or positioned regions of interest, or missing frames in the dynamic image set) or clinical (calculated functional values judged to be erroneous or unreliable). Technical QC events were identified in 36 (43%) of 83 studies. Clinical QC events were identified in 37 (45%) of 83 studies. Specific QC events included starting the camera after the bolus had reached the kidney, dose infiltration, oversubtraction of background activity, and missing frames in the dynamic image set. QC software has been developed to automatically verify user input, monitor calculation of renal functional parameters, summarize QC findings, and flag potentially unreliable values for the nuclear medicine physician. Incorporation of automated QC features into commercial or local renal software can reduce errors and improve technologist performance and should improve the efficiency and accuracy of image interpretation.
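
    The QC module described above flags out-of-range user-entered values and potentially unreliable calculated parameters. As a schematic illustration only (the field names and thresholds below are hypothetical placeholders, not those of the published software), such a range check might look like this:

    ```python
    # Hypothetical QC ranges for user-entered study descriptors; real limits are protocol-specific.
    QC_RANGES = {
        "injected_dose_mbq": (20.0, 400.0),
        "patient_height_cm": (100.0, 220.0),
        "acquisition_frames": (40, 120),
    }

    def check_study(values: dict) -> list[str]:
        """Return a list of QC warnings for values that are missing or outside their expected ranges."""
        warnings = []
        for field, (low, high) in QC_RANGES.items():
            value = values.get(field)
            if value is None:
                warnings.append(f"{field}: missing value")
            elif not (low <= value <= high):
                warnings.append(f"{field}: {value} outside expected range [{low}, {high}]")
        return warnings

    print(check_study({"injected_dose_mbq": 5.0, "patient_height_cm": 172, "acquisition_frames": 90}))
    ```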

  19. A software tool for pure‑tone audiometry. Classification of audiograms for inclusion of patients in clinical trials. English version.

    PubMed

    Rahne, T; Buthut, F; Plößl, S; Plontke, S K

    2016-03-01

    Selecting subjects for clinical trials on hearing loss therapies relies on the patient meeting the audiological inclusion criteria. In studies on the treatment of idiopathic sudden sensorineural hearing loss, the patient's acute audiogram is usually compared with a previous audiogram, the audiogram of the non-affected ear, or a normal audiogram according to an ISO standard. Generally, many more patients are screened than actually fulfill the particular inclusion criteria. The inclusion criteria often require a calculation of pure-tone averages, selection of the most affected frequencies, and calculation of hearing loss differences. A software tool was developed to simplify and accelerate this inclusion procedure for investigators to estimate the possible recruitment rate during the planning phase of a clinical trial and during the actual study. This tool is Microsoft Excel-based and easy to modify to meet the particular inclusion criteria of a specific clinical trial. The tool was retrospectively evaluated on 100 patients with acute hearing loss comparing the times for classifying automatically and manually. The study sample comprised 100 patients with idiopathic sudden sensorineural hearing loss. The age- and sex-related normative audiogram was calculated automatically by the tool and the hearing impairment was graded. The estimated recruitment rate of our sample was quickly calculated. Information about meeting the inclusion criteria was provided instantaneously. A significant reduction of 30 % in the time required for classifying (30 s per patient) was observed.
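
    Inclusion screening of this kind typically involves computing a pure-tone average over selected frequencies and comparing the affected ear against a reference audiogram. The sketch below is a generic illustration, not the Excel-based tool described in the article; the chosen frequencies and the 30 dB criterion are placeholders, since real inclusion criteria are trial-specific.

    ```python
    def pure_tone_average(thresholds_db: dict[int, float], freqs=(500, 1000, 2000, 4000)) -> float:
        """Average hearing threshold (dB HL) over the given frequencies."""
        return sum(thresholds_db[f] for f in freqs) / len(freqs)

    def meets_inclusion(acute: dict[int, float], reference: dict[int, float], min_loss_db: float = 30.0) -> bool:
        """Hypothetical criterion: acute-ear PTA at least `min_loss_db` worse than the reference audiogram."""
        return pure_tone_average(acute) - pure_tone_average(reference) >= min_loss_db

    acute_ear = {500: 55, 1000: 60, 2000: 65, 4000: 70}
    other_ear = {500: 15, 1000: 20, 2000: 20, 4000: 25}
    print(meets_inclusion(acute_ear, other_ear))  # True
    ```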

  20. Evidence synthesis software.

    PubMed

    Park, Sophie Elizabeth; Thomas, James

    2018-06-07

    It can be challenging to decide which evidence synthesis software to choose when doing a systematic review. This article discusses some of the important questions to consider in relation to the chosen method and synthesis approach. Software can support researchers in a range of ways. Here, a range of review conditions and software solutions is considered: for example, facilitating contemporaneous collaboration across time and geographical space; in-built bias assessment tools; and line-by-line coding for qualitative textual analysis. EPPI-Reviewer is review software for research synthesis managed by the EPPI-Centre, UCL Institute of Education. EPPI-Reviewer has text mining automation technologies. Version 5 supports data sharing and re-use across the systematic review community. Open source software will soon be released. EPPI-Centre will continue to offer the software as a cloud-based service. The software is offered via a subscription with a one-month (extendible) trial available and volume discounts for 'site licences'. It is free to use for Cochrane and Campbell reviews. The next EPPI-Reviewer version is being built in collaboration with the National Institute for Health and Care Excellence using 'surveillance' of newly published research to support 'living' iterative reviews. This is achieved using a combination of machine learning and traditional information retrieval technologies to identify the type of research each new publication describes and determine its relevance for a particular review, domain or guideline. While the amount of available knowledge and research is constantly increasing, the ways in which software can support the focus and relevance of data identification are also developing fast. Software advances are maximising the opportunities for the production of relevant and timely reviews. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise

  1. Availability of software services for a hospital information system.

    PubMed

    Sakamoto, N

    1998-03-01

    Hospital information systems (HISs) are becoming more important and covering more parts in daily hospital operations as order-entry systems become popular and electronic charts are introduced. Thus, HISs today need to be able to provide necessary services for hospital operations for a 24-h day, 365 days a year. The provision of services discussed here does not simply mean the availability of computers, in which all that matters is that the computer is functioning. It means the provision of necessary information for hospital operations by the computer software, and we will call it the availability of software services. HISs these days are mostly client-server systems. To increase availability of software services in these systems, it is not enough to just use system structures that are highly reliable in existing host-centred systems. Four main components which support availability of software services are network systems, client computers, server computers, and application software. In this paper, we suggest how to structure these four components to provide the minimum requested software services even if a part of the system stops functioning. The network system should be double-protected in strata using Asynchronous Transfer Mode (ATM) as its base network. Client computers should be fat clients with as much application logic as possible, and reference information that does not require frequent updates (master files, for example) should be replicated in clients. It would be best if all server computers could be double-protected. However, if that is physically impossible, one database file should be made accessible by several server computers. Still, at least the basic patients' information and the latest clinical records should be double-protected physically. Application software should be tested carefully before introduction. Different versions of the application software should always be kept and managed in case the new version has problems. If a hospital

  2. Software development predictors, error analysis, reliability models and software metric analysis

    NASA Technical Reports Server (NTRS)

    Basili, Victor

    1983-01-01

    The use of dynamic characteristics as predictors for software development was studied. It was found that there are some significant factors that could be useful as predictors. From a study on software errors and complexity, it was shown that meaningful results can be obtained which allow insight into software traits and the environment in which it is developed. Reliability models were studied. The research included the field of program testing because the validity of some reliability models depends on the answers to some unanswered questions about testing. In studying software metrics, data collected from seven software engineering laboratory (FORTRAN) projects were examined and three effort reporting accuracy checks were applied to demonstrate the need to validate a data base. Results are discussed.

  3. Strengthening Software Authentication with the ROSE Software Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, G

    2006-06-15

    Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden "backdoors" is crucial to a project's success. In this context, "authentication" is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, C++ languages and with current collaborations to support Fortran90). We propose to extend ROSE to address a number of security-specific requirements, and apply it to software authentication for nonproliferation and arms control projects.

  4. Software Reviews.

    ERIC Educational Resources Information Center

    Wulfson, Stephen

    1988-01-01

    Presents reviews of six computer software programs for teaching science. Provides the publisher, grade level, cost, and descriptions of software, including: (1) "Recycling Logic"; (2) "Introduction to Biochemistry"; (3) "Food for Thought"; (4) "Watts in a Home"; (5) "Geology in Action"; and (6)…

  5. Software component quality evaluation

    NASA Technical Reports Server (NTRS)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  6. NASA software specification and evaluation system: Software verification/validation techniques

    NASA Technical Reports Server (NTRS)

    1977-01-01

    NASA software requirement specifications were used in the development of a system for validating and verifying computer programs. The software specification and evaluation system (SSES) provides for the effective and efficient specification, implementation, and testing of computer software programs. The system as implemented will produce structured FORTRAN or ANSI FORTRAN programs, but the principles upon which SSES is designed allow it to be easily adapted to other high order languages.

  7. Software testing

    NASA Astrophysics Data System (ADS)

    Price-Whelan, Adrian M.

    2016-01-01

    Now more than ever, scientific results are dependent on sophisticated software and analysis. Why should we trust code written by others? How do you ensure your own code produces sensible results? How do you make sure it continues to do so as you update, modify, and add functionality? Software testing is an integral part of code validation and writing tests should be a requirement for any software project. I will talk about Python-based tools that make managing and running tests much easier and explore some statistics for projects hosted on GitHub that contain tests.
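
    The Python-based testing tools alluded to in this abstract include, for example, pytest. As a minimal illustration (the function under test and its values are hypothetical, not taken from the talk), a unit test file might look like this:

    ```python
    # test_photometry.py -- run with `pytest` from the project root.
    import math
    import pytest

    def flux_to_magnitude(flux: float, zero_point: float = 25.0) -> float:
        """Convert a flux measurement to an astronomical magnitude."""
        if flux <= 0:
            raise ValueError("flux must be positive")
        return zero_point - 2.5 * math.log10(flux)

    def test_known_value():
        assert flux_to_magnitude(100.0) == pytest.approx(20.0)

    def test_rejects_nonpositive_flux():
        with pytest.raises(ValueError):
            flux_to_magnitude(0.0)
    ```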

  8. Software reengineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost-effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools are passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost-effective manner than using older technologies. A beta version of the environment was released in Mar. 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  9. Software Design Improvements. Part 2; Software Quality and the Design and Inspection Process

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    The application of assurance engineering techniques improves the duration of failure-free performance of software. The totality of features and characteristics of a software product are what determine its ability to satisfy customer needs. Software in safety-critical systems is very important to NASA. We follow the System Safety Working Group's definition for system safety software as: 'The optimization of system safety in the design, development, use and maintenance of software and its integration with safety-critical systems in an operational environment.' 'If it is not safe, say so' has become our motto. This paper goes over methods that have been used by NASA to make software design improvements by focusing on software quality and the design and inspection process.

  10. Dtest Testing Software

    NASA Technical Reports Server (NTRS)

    Jain, Abhinandan; Cameron, Jonathan M.; Myint, Steven

    2013-01-01

    This software runs a suite of arbitrary software tests spanning various software languages and types of tests (unit level, system level, or file comparison tests). The dtest utility can be set to automate periodic testing of large suites of software, as well as running individual tests. It supports distributing multiple tests over multiple CPU cores, if available. The dtest tool is a utility program (written in Python) that scans through a directory (and its subdirectories) and finds all directories that match a certain pattern and then executes any tests in that directory as described in simple configuration files.
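
    As a rough illustration of the scan-and-run pattern this abstract describes (not dtest's actual configuration format or internals; the marker-file name and command format below are invented), a small utility might walk a directory tree, look for a test configuration file, and execute the commands it lists:

    ```python
    import os
    import subprocess

    CONFIG_NAME = "TESTS.cfg"  # hypothetical marker file; dtest's real format may differ

    def find_test_dirs(root: str):
        """Yield every directory under `root` that contains a test configuration file."""
        for dirpath, _dirnames, filenames in os.walk(root):
            if CONFIG_NAME in filenames:
                yield dirpath

    def run_tests(root: str) -> int:
        """Run each command listed in the config files; return the number of failures."""
        failures = 0
        for test_dir in find_test_dirs(root):
            with open(os.path.join(test_dir, CONFIG_NAME)) as cfg:
                commands = [line.strip() for line in cfg if line.strip() and not line.startswith("#")]
            for command in commands:
                result = subprocess.run(command, shell=True, cwd=test_dir)
                if result.returncode != 0:
                    failures += 1
        return failures

    if __name__ == "__main__":
        raise SystemExit(run_tests("."))
    ```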

  11. 48 CFR 227.7202-3 - Rights in commercial computer software or commercial computer software documentation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... computer software or commercial computer software documentation. 227.7202-3 Section 227.7202-3 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202-3 Rights in commercial computer software or commercial computer software documentation...

  12. 48 CFR 227.7202-3 - Rights in commercial computer software or commercial computer software documentation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... computer software or commercial computer software documentation. 227.7202-3 Section 227.7202-3 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202-3 Rights in commercial computer software or commercial computer software documentation...

  13. 48 CFR 227.7202-3 - Rights in commercial computer software or commercial computer software documentation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... computer software or commercial computer software documentation. 227.7202-3 Section 227.7202-3 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202-3 Rights in commercial computer software or commercial computer software documentation...

  14. 48 CFR 227.7202-3 - Rights in commercial computer software or commercial computer software documentation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... computer software or commercial computer software documentation. 227.7202-3 Section 227.7202-3 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202-3 Rights in commercial computer software or commercial computer software documentation...

  15. 48 CFR 227.7202-3 - Rights in commercial computer software or commercial computer software documentation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... computer software or commercial computer software documentation. 227.7202-3 Section 227.7202-3 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202-3 Rights in commercial computer software or commercial computer software documentation...

  16. DigitalVHI--a freeware open-source software application to capture the Voice Handicap Index and other questionnaire data in various languages.

    PubMed

    Herbst, Christian T; Oh, Jinook; Vydrová, Jitka; Švec, Jan G

    2015-07-01

    In this short report we introduce DigitalVHI, a free open-source software application for obtaining Voice Handicap Index (VHI) and other questionnaire data, which can be put on a computer in clinics and used in clinical practice. The software can simplify performing clinical studies since it makes the VHI scores directly available for analysis in a digital form. It can be downloaded from http://www.christian-herbst.org/DigitalVHI/.
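
    The standard VHI-30 instrument sums 30 item responses, each scored 0 to 4, giving a total of 0 to 120 with functional, physical, and emotional subscales of 10 items each. The routine below is a generic scoring sketch, not the DigitalVHI code itself, and it assumes the items are supplied grouped by subscale:

    ```python
    def vhi_scores(responses: list[int]) -> dict[str, int]:
        """Total and subscale scores for a 30-item VHI questionnaire (items scored 0-4).
        Assumes items are ordered functional (1-10), physical (11-20), emotional (21-30)."""
        if len(responses) != 30 or any(not 0 <= r <= 4 for r in responses):
            raise ValueError("expected 30 item responses in the range 0-4")
        return {
            "functional": sum(responses[0:10]),
            "physical": sum(responses[10:20]),
            "emotional": sum(responses[20:30]),
            "total": sum(responses),
        }

    print(vhi_scores([2] * 30))  # {'functional': 20, 'physical': 20, 'emotional': 20, 'total': 60}
    ```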

  17. Third-Party Software's Trust Quagmire.

    PubMed

    Voas, J; Hurlburt, G

    2015-12-01

    Current software development has trended toward the idea of integrating independent software sub-functions to create more complete software systems. Software sub-functions are often not homegrown; instead they are developed by unknown 3rd-party organizations and reside in software marketplaces owned or controlled by others. Such software sub-functions carry plausible concerns in terms of quality, origins, functionality, security, and interoperability, to name a few. This article surveys key technical difficulties in confidently building systems from acquired software sub-functions by calling out the principal software supply chain actors.

  18. NASA's Approach to Software Assurance

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha

    2015-01-01

    NASA defines software assurance as: the planned and systematic set of activities that ensure conformance of software life cycle processes and products to requirements, standards, and procedures via quality, safety, reliability, and independent verification and validation. NASA's implementation of this approach to the quality, safety, reliability, security and verification and validation of software is brought together in one discipline, software assurance. Organizationally, NASA has software assurance at each NASA center, a Software Assurance Manager at NASA Headquarters, a Software Assurance Technical Fellow (currently the same person as the SA Manager), and an Independent Verification and Validation Organization with its own facility. An umbrella risk mitigation strategy for safety and mission success assurance of NASA's software, software assurance covers a wide area and is better structured to address the dynamic changes in how software is developed, used, and managed, as well as its increasingly complex functionality. Being flexible, risk-based, and prepared for challenges in software at NASA is essential, especially as much of our software is unique for each mission.

  19. Sustaining Software-Intensive Systems

    DTIC Science & Technology

    2006-05-01

    2.2 Multi-Service Operational Test and Evaluation ... 2.3 Stable Software Baseline ... or equivalent document • completed Multi-Service Operational Test and Evaluation (MOT&E) for the potential production software package (or OT&E if ... not multi-service) • stable software production baseline • complete and current software documentation • Authority to Operate (ATO) for an

  20. Agile Methods for Open Source Safety-Critical Software

    PubMed Central

    Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John

    2011-01-01

    The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the right amount of ceremony; therefore if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project employing agile practices since 2004. We started with the assumption that a lighter process is better, focused on evolving code, and only adding process elements as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested, almost a decade ago, that they are not suitable for safety-critical systems; we present our experiences as a case study for renewing the discussion. PMID:21799545

  1. Diagnostic evaluation of three cardiac software packages using a consecutive group of patients

    PubMed Central

    2011-01-01

    Purpose The aim of this study was to compare the diagnostic performance of the three software packages 4DMSPECT (4DM), Emory Cardiac Toolbox (ECTb), and Cedars Quantitative Perfusion SPECT (QPS) for quantification of myocardial perfusion scintigrams (MPS) using a large group of consecutive patients. Methods We studied 1,052 consecutive patients who underwent 2-day stress/rest 99mTc-sestamibi MPS studies. The reference/gold-standard classifications for the MPS studies were obtained from three physicians, with more than 25 years each of experience in nuclear cardiology, who re-evaluated all MPS images. Automatic processing was carried out using the 4DM, ECTb, and QPS software packages. Total stress defect extent (TDE) and summed stress score (SSS) based on a 17-segment model were obtained from the software packages. Receiver-operating characteristic (ROC) analysis was performed. Results A total of 734 patients were classified as normal and the remaining 318 were classified as having infarction and/or ischemia. The performance of the software packages, calculated as the area under the SSS ROC curve, was 0.87 for 4DM, 0.80 for QPS, and 0.76 for ECTb (QPS vs. ECTb p = 0.03; other differences p < 0.0001). The areas under the TDE ROC curve were 0.87 for 4DM, 0.82 for QPS, and 0.76 for ECTb (QPS vs. ECTb p = 0.0005; other differences p < 0.0001). Conclusion There are considerable differences in performance between the three software packages, with 4DM showing the best performance and ECTb the worst. These differences in performance should be taken into consideration when software packages are used in clinical routine or in clinical studies. PMID:22214226
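
    The area-under-the-ROC-curve comparison reported above can be reproduced in outline with scikit-learn. The arrays below are synthetic stand-ins, not study data: labels play the role of the expert gold standard (0 = normal, 1 = infarction/ischemia) and the score arrays mimic summed stress scores from two hypothetical packages.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)

    # Synthetic stand-ins: gold-standard labels and summed stress scores from two packages.
    labels = rng.integers(0, 2, size=200)                    # 0 = normal, 1 = abnormal
    sss_package_a = labels * 6 + rng.normal(0, 3, size=200)  # stronger separation
    sss_package_b = labels * 3 + rng.normal(0, 3, size=200)  # weaker separation

    print("AUC, package A:", round(roc_auc_score(labels, sss_package_a), 3))
    print("AUC, package B:", round(roc_auc_score(labels, sss_package_b), 3))
    ```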

  2. Statistical modelling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1991-01-01

    During the six-month period from 1 April 1991 to 30 September 1991 the following research papers in statistical modeling of software reliability appeared: (1) A Nonparametric Software Reliability Growth Model; (2) On the Use and the Performance of Software Reliability Growth Models; (3) Research and Development Issues in Software Reliability Engineering; (4) Special Issues on Software; and (5) Software Reliability and Safety.

  3. NASA PC software evaluation project

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Kuan, Julie C.

    1986-01-01

    The USL NASA PC software evaluation project is intended to provide a structured framework for facilitating the development of quality NASA PC software products. The project will assist NASA PC development staff to understand the characteristics and functions of NASA PC software products. Based on the results of the project teams' evaluations and recommendations, users can judge the reliability, usability, acceptability, maintainability and customizability of all the PC software products. The objective here is to provide initial, high-level specifications and guidelines for NASA PC software evaluation. The primary tasks to be addressed in this project are as follows: to gain a strong understanding of what software evaluation entails and how to organize a structured software evaluation process; to define a structured methodology for conducting the software evaluation process; to develop a set of PC software evaluation criteria and evaluation rating scales; and to conduct PC software evaluations in accordance with the identified methodology. Communication Packages, Network System Software, Graphics Support Software, Environment Management Software, General Utilities. This report represents one of the 72 attachment reports to the University of Southwestern Louisiana's Final Report on NASA Grant NGT-19-010-900. Accordingly, appropriate care should be taken in using this report out of context of the full Final Report.

  4. Multi-institutional Validation Study of Commercially Available Deformable Image Registration Software for Thoracic Images.

    PubMed

    Kadoya, Noriyuki; Nakajima, Yujiro; Saito, Masahide; Miyabe, Yuki; Kurooka, Masahiko; Kito, Satoshi; Fujita, Yukio; Sasaki, Motoharu; Arai, Kazuhiro; Tani, Kensuke; Yagi, Masashi; Wakita, Akihisa; Tohyama, Naoki; Jingu, Keiichi

    2016-10-01

    To assess the accuracy of the commercially available deformable image registration (DIR) software for thoracic images at multiple institutions. Thoracic 4-dimensional (4D) CT images of 10 patients with esophageal or lung cancer were used. Datasets for these patients were provided by DIR-lab (dir-lab.com) and included a coordinate list of anatomic landmarks (300 bronchial bifurcations) that had been manually identified. Deformable image registration was performed between the peak-inhale and -exhale images. Deformable image registration error was determined by calculating the difference at each landmark point between the displacement calculated by DIR software and that calculated by the landmark. Eleven institutions participated in this study: 4 used RayStation (RaySearch Laboratories, Stockholm, Sweden), 5 used MIM Software (Cleveland, OH), and 3 used Velocity (Varian Medical Systems, Palo Alto, CA). The ranges of the average absolute registration errors over all cases were as follows: 0.48 to 1.51 mm (right-left), 0.53 to 2.86 mm (anterior-posterior), 0.85 to 4.46 mm (superior-inferior), and 1.26 to 6.20 mm (3-dimensional). For each DIR software package, the average 3-dimensional registration error (range) was as follows: RayStation, 3.28 mm (1.26-3.91 mm); MIM Software, 3.29 mm (2.17-3.61 mm); and Velocity, 5.01 mm (4.02-6.20 mm). These results demonstrate that there was moderate variation among institutions, although the DIR software was the same. We evaluated the commercially available DIR software using thoracic 4D-CT images from multiple centers. Our results demonstrated that DIR accuracy differed among institutions because it was dependent on both the DIR software and procedure. Our results could be helpful for establishing prospective clinical trials and for the widespread use of DIR software. In addition, for clinical care, we should try to find the optimal DIR procedure using thoracic 4D-CT data. Copyright © 2016 Elsevier Inc. All rights

  5. Multi-institutional Validation Study of Commercially Available Deformable Image Registration Software for Thoracic Images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kadoya, Noriyuki, E-mail: kadoya.n@rad.med.tohoku.ac.jp; Nakajima, Yujiro; Saito, Masahide

    Purpose: To assess the accuracy of the commercially available deformable image registration (DIR) software for thoracic images at multiple institutions. Methods and Materials: Thoracic 4-dimensional (4D) CT images of 10 patients with esophageal or lung cancer were used. Datasets for these patients were provided by DIR-lab (dir-lab.com) and included a coordinate list of anatomic landmarks (300 bronchial bifurcations) that had been manually identified. Deformable image registration was performed between the peak-inhale and -exhale images. Deformable image registration error was determined by calculating the difference at each landmark point between the displacement calculated by DIR software and that calculated by the landmark. Results: Eleven institutions participated in this study: 4 used RayStation (RaySearch Laboratories, Stockholm, Sweden), 5 used MIM Software (Cleveland, OH), and 3 used Velocity (Varian Medical Systems, Palo Alto, CA). The ranges of the average absolute registration errors over all cases were as follows: 0.48 to 1.51 mm (right-left), 0.53 to 2.86 mm (anterior-posterior), 0.85 to 4.46 mm (superior-inferior), and 1.26 to 6.20 mm (3-dimensional). For each DIR software package, the average 3-dimensional registration error (range) was as follows: RayStation, 3.28 mm (1.26-3.91 mm); MIM Software, 3.29 mm (2.17-3.61 mm); and Velocity, 5.01 mm (4.02-6.20 mm). These results demonstrate that there was moderate variation among institutions, although the DIR software was the same. Conclusions: We evaluated the commercially available DIR software using thoracic 4D-CT images from multiple centers. Our results demonstrated that DIR accuracy differed among institutions because it was dependent on both the DIR software and procedure. Our results could be helpful for establishing prospective clinical trials and for the widespread use of DIR software. In addition, for clinical care, we should try to find the optimal DIR procedure using
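
    The per-landmark registration error used in this validation is the difference between the displacement predicted by the DIR software and the displacement observed between corresponding manually identified landmarks, reported per axis and as a 3-dimensional distance. A minimal numpy sketch with made-up coordinates (not study data) is shown below:

    ```python
    import numpy as np

    # Hypothetical landmark coordinates (mm) on the inhale scan, the matching exhale positions
    # identified manually, and the displacements predicted by a DIR algorithm at those points.
    inhale_pts = np.array([[12.0, 40.0, 55.0], [30.5, 22.0, 61.0]])
    exhale_pts = np.array([[12.5, 41.2, 49.0], [31.0, 23.5, 56.5]])
    dir_displacement = np.array([[0.4, 1.0, -5.2], [0.6, 1.2, -4.0]])

    landmark_displacement = exhale_pts - inhale_pts
    error_vectors = dir_displacement - landmark_displacement

    per_axis_error = np.abs(error_vectors)            # right-left, anterior-posterior, superior-inferior
    error_3d = np.linalg.norm(error_vectors, axis=1)  # 3-dimensional error per landmark

    print("mean absolute per-axis error (mm):", per_axis_error.mean(axis=0))
    print("mean 3D error (mm):", error_3d.mean())
    ```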

  6. Frequency and analysis of non-clinical errors made in radiology reports using the National Integrated Medical Imaging System voice recognition dictation software.

    PubMed

    Motyer, R E; Liddy, S; Torreggiani, W C; Buckley, O

    2016-11-01

    Voice recognition (VR) dictation of radiology reports has become the mainstay of reporting in many institutions worldwide. Despite its benefits, such software is not without limitations, and transcription errors have been widely reported. The aim was to evaluate the frequency and nature of non-clinical transcription errors made using VR dictation software. Retrospective audit of 378 finalised radiology reports. Errors were counted and categorised by significance, error type and sub-type. Data regarding imaging modality, report length and dictation time were collected. 67 (17.72 %) reports contained ≥1 errors, with 7 (1.85 %) containing 'significant' and 9 (2.38 %) containing 'very significant' errors. A total of 90 errors were identified from the 378 reports analysed, with 74 (82.22 %) classified as 'insignificant', 7 (7.78 %) as 'significant', 9 (10 %) as 'very significant'. 68 (75.56 %) errors were 'spelling and grammar', 20 (22.22 %) 'missense' and 2 (2.22 %) 'nonsense'. 'Punctuation' was the most common error sub-type, accounting for 27 errors (30 %). Complex imaging modalities had higher error rates per report and per sentence. Computed tomography contained 0.040 errors per sentence compared to plain film with 0.030. Longer reports had a higher error rate, with reports >25 sentences containing an average of 1.23 errors per report compared with 0.09 for reports of 0-5 sentences. These findings highlight the limitations of VR dictation software. While most error was deemed insignificant, there were occurrences of error with potential to alter report interpretation and patient management. Longer reports and reports on more complex imaging had higher error rates and this should be taken into account by the reporting radiologist.

  7. Educational Software Acquisition for Microcomputers.

    ERIC Educational Resources Information Center

    Erikson, Warren; Turban, Efraim

    1985-01-01

    Examination of issues involved in acquiring appropriate microcomputer software for higher education focuses on the following points: developing your own software; finding commercially available software; using published evaluations; pre-purchase testing; customizing and adapting commercial software; post-purchase testing; and software use. A…

  8. Software Engineering Laboratory Series: Proceedings of the Twenty-First Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  9. Software Engineering Laboratory Series: Proceedings of the Twenty-Second Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  10. Software Measurement Guidebook

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This Software Measurement Guidebook is based on the extensive experience of several organizations that have each developed and applied significant measurement programs over a period of at least 10 years. The lessons derived from those experiences reflect not only successes but also failures. By applying those lessons, an organization can minimize, or at least reduce, the time, effort, and frustration of introducing a software measurement program. The Software Measurement Guidebook is aimed at helping organizations to begin or improve a measurement program. It does not provide guidance for the extensive application of specific measures (such as how to estimate software cost or analyze software complexity) other than by providing examples to clarify points. It does contain advice for establishing and using an effective software measurement program and for understanding some of the key lessons that other organizations have learned. Some of that advice will appear counterintuitive, but it is all based on actual experience. Although all of the information presented in this guidebook is derived from specific experiences of mature measurement programs, the reader must keep in mind that the characteristics of every organization are unique. Some degree of measurement is critical for all software development and maintenance organizations, and most of the key rules captured in this report will be generally applicable. Nevertheless, each organization must strive to understand its own environment so that the measurement program can be tailored to suit its characteristics and needs.

  11. Microcomputer software development facilities

    NASA Technical Reports Server (NTRS)

    Gorman, J. S.; Mathiasen, C.

    1980-01-01

    A more efficient and cost effective method for developing microcomputer software is to utilize a host computer with high-speed peripheral support. Application programs such as cross assemblers, loaders, and simulators are implemented in the host computer for each of the microcomputers for which software development is a requirement. The host computer is configured to operate in a time share mode for multiusers. The remote terminals, printers, and down loading capabilities provided are based on user requirements. With this configuration a user, either local or remote, can use the host computer for microcomputer software development. Once the software is developed (through the code and modular debug stage) it can be downloaded to the development system or emulator in a test area where hardware/software integration functions can proceed. The microcomputer software program sources reside in the host computer and can be edited, assembled, loaded, and then downloaded as required until the software development project has been completed.

  12. Managing configuration software of ground software applications with glueware

    NASA Technical Reports Server (NTRS)

    Larsen, B.; Herrera, R.; Sesplaukis, T.; Cheng, L.; Sarrel, M.

    2003-01-01

    This paper reports on a simple, low-cost effort to streamline the configuration of the uplink software tools. Even though the existing ground system consisted of JPL and custom Cassini software rather than COTS, we chose a glueware approach--reintegrating with wrappers and bridges and adding minimal new functionality.

  13. SLS Flight Software Testing: Using a Modified Agile Software Testing Approach

    NASA Technical Reports Server (NTRS)

    Bolton, Albanie T.

    2016-01-01

    NASA's Space Launch System (SLS) is an advanced launch vehicle for a new era of exploration beyond Earth's orbit (BEO). The world's most powerful rocket, SLS, will launch crews of up to four astronauts in the agency's Orion spacecraft on missions to explore multiple deep-space destinations. Boeing is developing the SLS core stage, including the avionics that will control the vehicle during flight. The core stage will be built at NASA's Michoud Assembly Facility (MAF) in New Orleans, LA, using state-of-the-art manufacturing equipment. At the same time, the rocket's avionics computer software is being developed at Marshall Space Flight Center in Huntsville, AL. At Marshall, the Flight and Ground Software Division provides comprehensive engineering expertise for the development of flight and ground software. Within that division, the Software Systems Engineering Branch's test and verification (T&V) team uses an agile test approach in the testing and verification of software. The agile software test method opens the door for regular short sprint release cycles. The basic premise behind agile software development and testing is that it is iterative and incremental. Agile testing is an iterative development methodology where requirements and solutions evolve through collaboration between cross-functional teams. With testing and development done incrementally, releases gain increased features and enhanced value. This value can be seen throughout the T&V team processes that are documented in various work instructions within the branch. The T&V team produces procedural test results at a higher rate, resolves issues found in software with designers at an earlier stage rather than at a later release, and team members gain increased knowledge of the system architecture by interfacing with designers. SLS Flight Software teams want to continue uncovering better ways of developing software in an efficient and project-beneficial manner

  14. Clinical value of CT-based preoperative software assisted lung lobe volumetry for predicting postoperative pulmonary function after lung surgery

    NASA Astrophysics Data System (ADS)

    Wormanns, Dag; Beyer, Florian; Hoffknecht, Petra; Dicken, Volker; Kuhnigk, Jan-Martin; Lange, Tobias; Thomas, Michael; Heindel, Walter

    2005-04-01

    This study aimed to evaluate a morphology-based approach for predicting postoperative forced expiratory volume in one second (FEV1) after lung resection from preoperative CT scans. Fifteen patients with surgically treated (lobectomy or pneumonectomy) bronchogenic carcinoma were enrolled in the study. A preoperative chest CT and pulmonary function tests before and after surgery were performed. CT scans were analyzed by prototype software: automated segmentation and volumetry of the lung lobes was performed with minimal user interaction. The determined volumes of the different lung lobes were used to predict postoperative FEV1 as a percentage of the preoperative values. Predicted FEV1 values were compared with the observed postoperative values as the standard of reference. Patients underwent lobectomy in twelve cases (6 upper lobes; 1 middle lobe; 5 lower lobes; 6 right side; 6 left side) and pneumonectomy in three cases. Automated calculation of predicted postoperative lung function was successful in all cases. Predicted FEV1 ranged from 54% to 95% (mean 75% +/- 11%) of the preoperative values. Two cases with obviously erroneous LFT were excluded from the analysis. The mean error of predicted FEV1 was 20 +/- 160 ml, indicating the absence of systematic error; the mean absolute error was 7.4 +/- 3.3% (137 +/- 77 ml/s). The 200 ml reproducibility criterion for FEV1 was met in 11 of 13 cases (85%). In conclusion, software-assisted prediction of postoperative lung function yielded clinically acceptable agreement with the observed postoperative values. This method might add useful information for the evaluation of functional operability of patients with lung cancer.
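
    The prediction principle is that postoperative FEV1 scales with the fraction of lung volume remaining after resection. A minimal Python sketch of this volume-fraction calculation follows; the lobe volumes and preoperative FEV1 are hypothetical, and the prototype software may weight lobes differently (for example, by aerated volume).

        # Minimal sketch of volume-based prediction of postoperative FEV1, assuming the
        # simple volume-fraction model implied above. Values are illustrative.
        lobe_volumes_ml = {"RUL": 900, "RML": 450, "RLL": 1100, "LUL": 1050, "LLL": 1000}
        resected = ["RUL"]                      # hypothetical right upper lobectomy
        preop_fev1_ml = 2400

        total = sum(lobe_volumes_ml.values())
        remaining_fraction = 1 - sum(lobe_volumes_ml[l] for l in resected) / total
        predicted_fev1_ml = preop_fev1_ml * remaining_fraction
        print(f"predicted postoperative FEV1: {predicted_fev1_ml:.0f} ml "
              f"({remaining_fraction:.0%} of preoperative)")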

  15. Software platform for simulation of a prototype proton CT scanner.

    PubMed

    Giacometti, Valentina; Bashkirov, Vladimir A; Piersimoni, Pierluigi; Guatelli, Susanna; Plautz, Tia E; Sadrozinski, Hartmut F-W; Johnson, Robert P; Zatserklyaniy, Andriy; Tessonnier, Thomas; Parodi, Katia; Rosenfeld, Anatoly B; Schulte, Reinhard W

    2017-03-01

    Proton computed tomography (pCT) is a promising imaging technique to substitute for, or at least complement, x-ray CT for more accurate proton therapy treatment planning, as it allows direct calculation of proton relative stopping power from proton energy loss measurements. A proton CT scanner with a silicon-based particle tracking system and a five-stage scintillating energy detector has been completed. In parallel, a modular software platform was developed to characterize the performance of the proposed pCT scanner. The modular pCT software platform consists of (1) a Geant4-based simulation modeling the Loma Linda proton therapy beam line and the prototype proton CT scanner, (2) water equivalent path length (WEPL) calibration of the scintillating energy detector, and (3) an image reconstruction algorithm for the reconstruction of the relative stopping power (RSP) of the scanned object. In this work, each component of the modular pCT software platform is described and validated with respect to experimental data and benchmarked against theoretical predictions. In particular, the RSP reconstruction was validated with experimental scans, water column measurements, and theoretical calculations. The results show that the pCT software platform accurately reproduces the performance of the existing prototype pCT scanner, with an RSP agreement between experimental and simulated values to better than 1.5%. The validated platform is a versatile tool for clinical proton CT performance and application studies in a virtual setting. The platform is flexible and can be modified to simulate not-yet-existing versions of pCT scanners and higher proton energies than those currently clinically available. © 2017 American Association of Physicists in Medicine.

  16. Designing Networks that are Capable of Self-Healing and Adapting

    DTIC Science & Technology

    2017-04-01

    ...Undergraduate Research Fellowship, visiting from Caltech. Eugene Park (undergraduate, Math, Duke): models of self-healing networks (undergraduate senior thesis)... Anastasia Deckard (graduate student, Math, Duke; 3rd/4th-year PhD): wrote software for simulation. Nick Day (undergraduate, Math, LIMS): summer project at... Harer gave a talk on this DTRA grant to undergraduate math majors at Duke. [Quad chart uploaded to the DTRA Basic and Fundamental Research...]

  17. Software Repository

    NASA Technical Reports Server (NTRS)

    Merwarth, P., D.

    1983-01-01

    The Common Software Module Repository (CSMR) is a computerized library system with high product and service visibility to potential users. The online capabilities of the system allow both the librarian and the user to interact with the library. The librarian is responsible for maintaining the information in the CSMR library. The user searches the library to locate software modules that meet his or her current needs.

  18. Self-assembled software and method of overriding software execution

    DOEpatents

    Bouchard, Ann M.; Osbourn, Gordon C.

    2013-01-08

    A computer-implemented software self-assembled system and method for providing an external override and monitoring capability to dynamically self-assembling software containing machines that self-assemble execution sequences and data structures. The method provides an external override machine that can be introduced into a system of self-assembling machines while the machines are executing such that the functionality of the executing software can be changed or paused without stopping the code execution and modifying the existing code. Additionally, a monitoring machine can be introduced without stopping code execution that can monitor specified code execution functions by designated machines and communicate the status to an output device.

  19. Software process improvement in the NASA software engineering laboratory

    NASA Technical Reports Server (NTRS)

    Mcgarry, Frank; Pajerski, Rose; Page, Gerald; Waligora, Sharon; Basili, Victor; Zelkowitz, Marvin

    1994-01-01

    The Software Engineering Laboratory (SEL) was established in 1976 for the purpose of studying and measuring software processes with the intent of identifying improvements that could be applied to the production of ground support software within the Flight Dynamics Division (FDD) at the National Aeronautics and Space Administration (NASA)/Goddard Space Flight Center (GSFC). The SEL has three member organizations: NASA/GSFC, the University of Maryland, and Computer Sciences Corporation (CSC). The concept of process improvement within the SEL focuses on the continual understanding of both process and product as well as goal-driven experimentation and analysis of process change within a production environment.

  20. 48 CFR 227.7203-16 - Providing computer software or computer software documentation to foreign governments, foreign...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... software or computer software documentation to foreign governments, foreign contractors, or international... Rights in Computer Software and Computer Software Documentation 227.7203-16 Providing computer software or computer software documentation to foreign governments, foreign contractors, or international...

  1. 48 CFR 227.7203-16 - Providing computer software or computer software documentation to foreign governments, foreign...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... software or computer software documentation to foreign governments, foreign contractors, or international... Rights in Computer Software and Computer Software Documentation 227.7203-16 Providing computer software or computer software documentation to foreign governments, foreign contractors, or international...

  2. 48 CFR 227.7203-16 - Providing computer software or computer software documentation to foreign governments, foreign...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... software or computer software documentation to foreign governments, foreign contractors, or international... Rights in Computer Software and Computer Software Documentation 227.7203-16 Providing computer software or computer software documentation to foreign governments, foreign contractors, or international...

  3. 48 CFR 227.7203-16 - Providing computer software or computer software documentation to foreign governments, foreign...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... software or computer software documentation to foreign governments, foreign contractors, or international... Rights in Computer Software and Computer Software Documentation 227.7203-16 Providing computer software or computer software documentation to foreign governments, foreign contractors, or international...

  4. 48 CFR 227.7203-16 - Providing computer software or computer software documentation to foreign governments, foreign...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... software or computer software documentation to foreign governments, foreign contractors, or international... Rights in Computer Software and Computer Software Documentation 227.7203-16 Providing computer software or computer software documentation to foreign governments, foreign contractors, or international...

  5. 48 CFR 252.227-7014 - Rights in noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... computer software and noncommercial computer software documentation. 252.227-7014 Section 252.227-7014... Rights in noncommercial computer software and noncommercial computer software documentation. As prescribed in 227.7203-6(a)(1), use the following clause. Rights in Noncommercial Computer Software and...

  6. 48 CFR 252.227-7014 - Rights in noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... computer software and noncommercial computer software documentation. 252.227-7014 Section 252.227-7014... Rights in noncommercial computer software and noncommercial computer software documentation. As prescribed in 227.7203-6(a)(1), use the following clause. Rights in Noncommercial Computer Software and...

  7. 48 CFR 252.227-7014 - Rights in noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... computer software and noncommercial computer software documentation. 252.227-7014 Section 252.227-7014... Rights in noncommercial computer software and noncommercial computer software documentation. As prescribed in 227.7203-6(a)(1), use the following clause. Rights in Noncommercial Computer Software and...

  8. 48 CFR 252.227-7014 - Rights in noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... computer software and noncommercial computer software documentation. 252.227-7014 Section 252.227-7014... Rights in noncommercial computer software and noncommercial computer software documentation. As prescribed in 227.7203-6(a)(1), use the following clause. Rights in Noncommercial Computer Software and...

  9. 48 CFR 252.227-7014 - Rights in noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... computer software and noncommercial computer software documentation. 252.227-7014 Section 252.227-7014... Rights in noncommercial computer software and noncommercial computer software documentation. As prescribed in 227.7203-6(a)(1), use the following clause. Rights in Noncommercial Computer Software and...

  10. Software Reliability 2002

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

    In FY01 we learned that hardware reliability models need substantial changes to account for differences in software, thus making software reliability measurements more effective, accurate, and easier to apply. These reliability models are generally based on familiar distributions or parametric methods. An obvious question is: "What new statistical and probability models can be developed using non-parametric and distribution-free methods instead of the traditional parametric methods?" Two approaches to software reliability engineering appear somewhat promising. The first study, begun in FY01, is based in hardware reliability, a very well established science that has many aspects that can be applied to software. This research effort has investigated mathematical aspects of hardware reliability and has identified those applicable to software. Currently the research effort is applying and testing these approaches to software reliability measurement. These parametric models require much project data that may be difficult to apply and interpret. Projects at GSFC are often complex in both technology and schedules. Assessing and estimating the reliability of the final system is extremely difficult when various subsystems are tested and completed long before others. Parametric and distribution-free techniques may offer a new and accurate way of modeling failure time and other project data to provide earlier and more accurate estimates of system reliability.
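
    The contrast between parametric and distribution-free estimation can be made concrete with a small example: fit an assumed failure-time distribution to inter-failure data, or estimate the same quantity directly from the empirical data. The Python sketch below uses hypothetical inter-failure times and an exponential assumption purely for illustration; it is not one of the models investigated in this effort.

        # Sketch contrasting a parametric estimate with a distribution-free one on the
        # same (hypothetical) inter-failure times.
        import numpy as np

        interfailure_hours = np.array([12.0, 30.0, 7.5, 55.0, 21.0, 90.0, 44.0])

        # Parametric: assume exponential failures; the MLE of the rate is 1 / mean.
        rate = 1.0 / interfailure_hours.mean()
        r_parametric = np.exp(-rate * 24.0)           # P(no failure in the next 24 h)

        # Distribution-free: empirical fraction of observed gaps exceeding 24 h.
        r_empirical = (interfailure_hours > 24.0).mean()

        print(f"parametric R(24h) = {r_parametric:.2f}")
        print(f"empirical  R(24h) = {r_empirical:.2f}")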

  11. Guidance and Control Software,

    DTIC Science & Technology

    1980-05-01

    commitments of function, cost, and schedule. The phrase "software engineering" was intended to contrast with the phrase "computer science"; the latter aims...the software problems of cost, delivery schedule, and quality were gradually being recognized at the highest management levels. Thus, in a project... schedule dates. Although the analysis of software problems indicated that the entire software development process (figure 1) needed new methods, only

  12. Software Engineering Improvement Activities/Plan

    NASA Technical Reports Server (NTRS)

    2003-01-01

    bd Systems personnel accomplished the technical responsibilities for this reporting period, as planned. A close working relationship was maintained with personnel of the MSFC Avionics Department Software Group (ED14). Work accomplishments included development, evaluation, and enhancement of a software cost model, performing a literature search and evaluation of software tools available for code analysis and requirements analysis, and participating in other relevant software engineering activities. Monthly reports were submitted. This support was provided to the Flight Software Group/ED14 in accomplishing the software engineering improvement activities of the Marshall Space Flight Center (MSFC) Software Engineering Improvement Plan.

  13. LIM homeobox transcription factor Isl1 is required for melatonin synthesis in the pig pineal gland.

    PubMed

    Zhang, Jinglin; Qiu, Jingtao; Zhou, Yewen; Wang, Yue; Li, Hongjiao; Zhang, Taojie; Jiang, Ying; Gou, Kemian; Cui, Sheng

    2018-02-26

    Melatonin is a key hormone that regulates circadian rhythms, metabolism, and reproduction. However, the mechanisms of melatonin synthesis and secretion have not been fully defined. The purpose of this study was to investigate the functions of the LIM homeobox transcription factor Isl1 in regulating melatonin synthesis and secretion in the porcine pineal gland. We found that Isl1 is highly expressed in the melatonin-producing cells in the porcine pineal gland. Further functional studies demonstrate that Isl1 knockdown in cultured primary porcine pinealocytes results in the decline of melatonin and arylalkylamine N-acetyltransferase (AANAT) mRNA levels by 29.2% and 72.2%, respectively, whereas Isl1 overexpression raised them by 1.3-fold and 2.7-fold, respectively. In addition, the enhancing effect of norepinephrine (NE) on melatonin synthesis was abolished by Isl1 knockdown. In vivo intracerebroventricular NE injections upregulated Isl1 mRNA and protein levels by about threefold and 4.5-fold in the porcine pineal gland. We then examined the changes in Isl1 expression in the pineal gland and global melatonin levels throughout the day. The results show that the Isl1 protein level at 24:00 is 2.5-fold higher than that at 12:00, which parallels melatonin levels. We further found that Isl1 increases the activity of the AANAT promoter, and the effect of NE on Isl1 expression was blocked by an ERK inhibitor. Collectively, the results presented here demonstrate that Isl1 positively modulates melatonin synthesis by targeting AANAT, via the ERK signaling pathway of NE. These findings suggest that Isl1 plays important roles in maintaining the daily circadian rhythm. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  14. Selecting Software for Libraries.

    ERIC Educational Resources Information Center

    Beiser, Karl

    1993-01-01

    Discusses resources and strategies that libraries can use to evaluate competing database management software for purchase. Needs assessments, types of software available, features of good software, evaluation aids, shareware, and marketing and product trends are covered. (KRN)

  15. Assuring Software Reliability

    DTIC Science & Technology

    2014-08-01

    technologies and processes to achieve a required level of confidence that software systems and services function in the intended manner. 1.3 Security Example...that took three high-voltage lines out of service and a software failure (a race condition) that disabled the computing service that notified the... service had failed. Instead of analyzing the details of the alarm server failure, the reviewers asked why the following software assurance claim had

  16. Software engineering methodologies and tools

    NASA Technical Reports Server (NTRS)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to the industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  17. Software Quality Assurance Audits Guidebooks

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The growth in cost and importance of software to NASA has caused NASA to address the improvement of software development across the agency. One of the products of this program is a series of guidebooks that define a NASA concept of the assurance processes that are used in software development. The Software Assurance Guidebook, NASA-GB-A201, issued in September, 1989, provides an overall picture of the NASA concepts and practices in software assurance. Second level guidebooks focus on specific activities that fall within the software assurance discipline, and provide more detailed information for the manager and/or practitioner. This is the second level Software Quality Assurance Audits Guidebook that describes software quality assurance audits in a way that is compatible with practices at NASA Centers.

  18. Statistical modeling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1992-01-01

    This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.

  19. ACTS: from ATLAS software towards a common track reconstruction software

    NASA Astrophysics Data System (ADS)

    Gumpert, C.; Salzburger, A.; Kiehn, M.; Hrdinka, J.; Calace, N.; ATLAS Collaboration

    2017-10-01

    Reconstruction of charged particles’ trajectories is a crucial task for most particle physics experiments. The high instantaneous luminosity achieved at the LHC leads to a high number of proton-proton collisions per bunch crossing, which has put the track reconstruction software of the LHC experiments through a thorough test. Preserving track reconstruction performance under increasingly difficult experimental conditions, while keeping the usage of computational resources at a reasonable level, is an inherent problem for many HEP experiments. Exploiting concurrent algorithms and using multivariate techniques for track identification are the primary strategies to achieve that goal. Starting from current ATLAS software, the ACTS project aims to encapsulate track reconstruction software into a generic, framework- and experiment-independent software package. It provides a set of high-level algorithms and data structures for performing track reconstruction tasks as well as fast track simulation. The software is developed with special emphasis on thread-safety to support parallel execution of the code and data structures are optimised for vectorisation to speed up linear algebra operations. The implementation is agnostic to the details of the detection technologies and magnetic field configuration which makes it applicable to many different experiments.

  20. Software Fault Tolerance: A Tutorial

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2000-01-01

    Because of our present inability to produce error-free software, software fault tolerance is and will continue to be an important consideration in software systems. The root cause of software design errors is the complexity of the systems. Compounding the problems in building correct software is the difficulty in assessing the correctness of software for highly complex systems. After a brief overview of the software development processes, we note how hard-to-detect design faults are likely to be introduced during development and how software faults tend to be state-dependent and activated by particular input sequences. Although component reliability is an important quality measure for system level analysis, software reliability is hard to characterize and the use of post-verification reliability estimates remains a controversial issue. For some applications software safety is more important than reliability, and fault tolerance techniques used in those applications are aimed at preventing catastrophes. Single version software fault tolerance techniques discussed include system structuring and closure, atomic actions, inline fault detection, exception handling, and others. Multiversion techniques are based on the assumption that software built differently should fail differently and thus, if one of the redundant versions fails, it is expected that at least one of the other versions will provide an acceptable output. Recovery blocks, N-version programming, and other multiversion techniques are reviewed.
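
    The recovery-block technique mentioned above can be sketched compactly: execute the primary version, apply an acceptance test to its result, and fall back to an alternate version if the test fails. The Python example below is a toy illustration with a deliberately faulty primary routine; the routines and acceptance test are hypothetical.

        # Minimal sketch of the recovery-block pattern: run the primary version, check the
        # result with an acceptance test, and fall back to an alternate version on failure.
        def primary_sqrt(x):
            # Deliberately faulty primary version (illustrative).
            return x / 2.0

        def alternate_sqrt(x, iterations=20):
            # Simpler, slower alternate version (Newton iteration).
            guess = x if x > 1 else 1.0
            for _ in range(iterations):
                guess = 0.5 * (guess + x / guess)
            return guess

        def acceptance_test(x, result, tol=1e-6):
            return abs(result * result - x) <= tol * max(1.0, x)

        def recovery_block_sqrt(x):
            for version in (primary_sqrt, alternate_sqrt):
                result = version(x)
                if acceptance_test(x, result):
                    return result
            raise RuntimeError("all versions rejected by acceptance test")

        print(recovery_block_sqrt(2.0))   # primary fails the test; the alternate result is returned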

  1. Experimentation in software engineering

    NASA Technical Reports Server (NTRS)

    Basili, V. R.; Selby, R. W.; Hutchens, D. H.

    1986-01-01

    Experimentation in software engineering supports the advancement of the field through an iterative learning process. In this paper, a framework for analyzing most of the experimental work performed in software engineering over the past several years is presented. A variety of experiments in the framework is described and their contribution to the software engineering discipline is discussed. Some useful recommendations for the application of the experimental process in software engineering are included.

  2. Automating Software Design Metrics.

    DTIC Science & Technology

    1984-02-01

    1.2 HISTORICAL PERSPECTIVE: High quality software is of interest to both the software engineering community and its users. As...contributions of many other software engineering efforts, most notably [MCC 77] and [Boe 83b], which have defined and refined a framework for quantifying...AUTOMATION OF DESIGN METRICS: Software metrics can be useful within the context of an integrated software engineering environment. The purpose of this

  3. GFAST Software Demonstration

    NASA Image and Video Library

    2017-03-17

    NASA engineers and test directors gather in Firing Room 3 in the Launch Control Center at NASA's Kennedy Space Center in Florida, to watch a demonstration of the automated command and control software for the agency's Space Launch System (SLS) and Orion spacecraft. The software is called the Ground Launch Sequencer. It will be responsible for nearly all of the launch commit criteria during the final phases of launch countdowns. The Ground and Flight Application Software Team (GFAST) demonstrated the software. It was developed by the Command, Control and Communications team in the Ground Systems Development and Operations (GSDO) Program. GSDO is helping to prepare the center for the first test flight of Orion atop the SLS on Exploration Mission 1.

  4. Small-College Software Survey.

    ERIC Educational Resources Information Center

    Birch, Anthony D.

    1986-01-01

    Computers have a great number of potential uses at the small college. A survey of the role of software in the effective use of computers is described. Hardware characteristics, spreadsheets, purchasing or developing software, and software information are discussed. (Author/MLW)

  5. Software Engineering for Human Spaceflight

    NASA Technical Reports Server (NTRS)

    Fredrickson, Steven E.

    2014-01-01

    The Spacecraft Software Engineering Branch of NASA Johnson Space Center (JSC) provides world-class products, leadership, and technical expertise in software engineering, processes, technology, and systems management for human spaceflight. The branch contributes to major NASA programs (e.g. ISS, MPCV/Orion) with in-house software development and prime contractor oversight, and maintains the JSC Engineering Directorate CMMI rating for flight software development. Software engineering teams work with hardware developers, mission planners, and system operators to integrate flight vehicles, habitats, robotics, and other spacecraft elements. They seek to infuse automation and autonomy into missions, and apply new technologies to flight processor and computational architectures. This presentation will provide an overview of key software-related projects, software methodologies and tools, and technology pursuits of interest to the JSC Spacecraft Software Engineering Branch.

  6. [Confirming the Utility of RAISUS Antifungal Susceptibility Testing by New-Software].

    PubMed

    Ono, Tomoko; Suematsu, Hiroyuki; Sawamura, Haruki; Yamagishi, Yuka; Mikamo, Hiroshige

    2017-08-15

    Clinical and Laboratory Standards Institute (CLSI) methods for susceptibility testing of yeasts are used in Japan. However, the methods have some disadvantages: 1) readings at 24 and 48 h; 2) an unclear endpoint (approximately 50% inhibition) for determining MICs; and 3) the handling of trailing growth and paradoxical effects. These make it difficult to test the susceptibility of yeasts. The old RAISUS software (Ver. 6.0 series) resolved problems 1) and 2) but did not resolve problem 3). Recently, the new RAISUS software (Ver. 7.0 series) resolved problem 3). We confirmed whether using the new software settled all of these issues. Eighty-four Candida isolates from Aichi Medical University were used in this study. We compared the MICs obtained using RAISUS antifungal susceptibility testing of yeasts RSMY1 (RSMY1) with those obtained using ASTY. The concordance rates (±four-fold of MICs) between the MICs obtained using ASTY and RSMY1 with the new software were more than 90%, except for miconazole (MCZ). The rate for MCZ was low, but the MICs obtained by the CLSI method and the Yeast-like Fungus DP 'EIKEN' (E-DP) method were equivalent to those of RSMY1 with the new software. The frequency of skip effects on RSMY1 with the new software markedly decreased relative to RSMY1 with the old software. In cases showing trailing growth, the new RAISUS software made it possible to choose the correct MICs and to display a trailing-growth indicator on the result screen. The new RAISUS software enhances usability and the accuracy of MICs. Using an automated instrument to determine MICs is useful for easily obtaining objective results.
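
    The 50%-inhibition endpoint and the trailing-growth problem described above can be illustrated with a small calculation: the MIC is read as the lowest concentration whose growth signal falls to half that of the drug-free control, and trailing growth shows up as residual low-level growth persisting above the MIC. The Python sketch below is a generic illustration with hypothetical readings, not the RAISUS Ver. 7.0 algorithm.

        # Illustrative azole-style MIC call (lowest concentration giving >=50% inhibition
        # relative to the drug-free control) with a simple trailing-growth flag.
        concentrations = [0.125, 0.25, 0.5, 1.0, 2.0, 4.0, 8.0]   # ug/mL (hypothetical)
        growth = [0.92, 0.80, 0.41, 0.22, 0.18, 0.17, 0.16]       # fraction of drug-free control

        mic = next((c for c, g in zip(concentrations, growth) if g <= 0.5), None)
        trailing = mic is not None and all(
            0.05 < g <= 0.5 for c, g in zip(concentrations, growth) if c >= mic
        )

        print(f"MIC = {mic} ug/mL, trailing growth flagged: {trailing}")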

  7. Factors That Affect Software Testability

    NASA Technical Reports Server (NTRS)

    Voas, Jeffrey M.

    1991-01-01

    Software faults that infrequently affect software's output are dangerous. When a software fault causes frequent software failures, testing is likely to reveal the fault before the software is released; when the fault remains undetected during testing, it can cause disaster after the software is installed. A technique for predicting whether a particular piece of software is likely to reveal faults within itself during testing is found in [Voas91b]. A piece of software that is likely to reveal faults within itself during testing is said to have high testability. A piece of software that is not likely to reveal faults within itself during testing is said to have low testability. It is preferable to design software with higher testability from the outset, i.e., to create software with as high a degree of testability as possible, to avoid the problems of undetected faults that are associated with low testability. Information loss is a phenomenon that occurs during program execution that increases the likelihood that a fault will remain undetected. In this paper, I identify two broad classes of information loss, define them, and suggest ways of predicting the potential for information loss to occur. This is done in order to decrease the likelihood that faults will remain undetected during testing.
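
    Information loss can be illustrated with a function whose internal computation is wrong for every input but whose coarse (boolean) output hides the fault for most inputs. The Python sketch below uses a hypothetical threshold check; the numbers are illustrative only.

        # Sketch of the information-loss idea: when an internal fault's effect is collapsed
        # into a coarse output, most test inputs cannot reveal it.
        def is_high(reading):
            scaled = reading * 0.98          # fault: should be reading * 1.0
            return scaled > 100.0            # boolean output discards the erroneous value

        # The internal value is wrong for every input, but only readings in the narrow band
        # (100.0, ~102.04] produce a visibly wrong output.
        revealing = [x / 100.0 for x in range(9800, 10600)
                     if is_high(x / 100.0) != (x / 100.0 > 100.0)]
        print(f"{len(revealing)} of 800 sampled inputs reveal the fault")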

  8. Workflow-Based Software Development Environment

    NASA Technical Reports Server (NTRS)

    Izygon, Michel E.

    2013-01-01

    The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process used to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment
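
    The orchestration idea, releasing a task only when its prerequisite tasks are complete, can be sketched as a small dependency model. The Python example below is a generic topological-ordering illustration with a hypothetical process; it does not reflect the actual TieFlow engine or SDA data model.

        # Minimal sketch of dependency-driven task release: a task becomes ready only when
        # all of its prerequisite tasks are complete.
        from collections import deque

        tasks = {                                  # task -> prerequisite tasks (hypothetical)
            "requirements": [],
            "design": ["requirements"],
            "code": ["design"],
            "peer review": ["code"],
            "unit test": ["code"],
            "release": ["peer review", "unit test"],
        }

        def enactment_order(tasks):
            indegree = {t: len(deps) for t, deps in tasks.items()}
            ready = deque(t for t, d in indegree.items() if d == 0)
            order = []
            while ready:
                t = ready.popleft()
                order.append(t)                    # here an engine would notify the assignee
                for other, deps in tasks.items():
                    if t in deps:
                        indegree[other] -= 1
                        if indegree[other] == 0:
                            ready.append(other)
            return order

        print(enactment_order(tasks))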

  9. Dynamic CT myocardial perfusion imaging: performance of 3D semi-automated evaluation software.

    PubMed

    Ebersberger, Ullrich; Marcus, Roy P; Schoepf, U Joseph; Lo, Gladys G; Wang, Yining; Blanke, Philipp; Geyer, Lucas L; Gray, J Cranston; McQuiston, Andrew D; Cho, Young Jun; Scheuering, Michael; Canstein, Christian; Nikolaou, Konstantin; Hoffmann, Ellen; Bamberg, Fabian

    2014-01-01

    To evaluate the performance of three-dimensional semi-automated evaluation software for the assessment of myocardial blood flow (MBF) and blood volume (MBV) at dynamic myocardial perfusion computed tomography (CT). Volume-based software relying on marginal space learning and probabilistic boosting tree-based contour fitting was applied to CT myocardial perfusion imaging data of 37 subjects. In addition, all image data were analysed manually and both approaches were compared with SPECT findings. Study endpoints included time of analysis and conventional measures of diagnostic accuracy. Of 592 analysable segments, 42 showed perfusion defects on SPECT. Average analysis times for the manual and software-based approaches were 49.1 ± 11.2 and 16.5 ± 3.7 min, respectively (P < 0.01). There was strong agreement between the two approaches for the measures of interest (MBF, ICC = 0.91, and MBV, ICC = 0.88, both P < 0.01) and no significant difference in diagnostic accuracy between the manual and software-based approaches for either MBF or MBV (all comparisons P > 0.05). Three-dimensional semi-automated evaluation of dynamic myocardial perfusion CT data provides similar measures and diagnostic accuracy to manual evaluation, albeit with substantially reduced analysis times. This capability may aid the integration of this test into clinical workflows. • Myocardial perfusion CT is attractive for comprehensive coronary heart disease assessment. • Traditional image analysis methods are cumbersome and time-consuming. • Automated 3D perfusion software shortens analysis times. • Automated 3D perfusion software increases standardisation of myocardial perfusion CT. • Automated, standardised analysis fosters myocardial perfusion CT integration into clinical practice.
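
    The agreement statistic reported above, the intraclass correlation coefficient (ICC), can be computed from paired manual and software measurements. The Python sketch below uses the two-way, single-measure, consistency form ICC(3,1) on hypothetical MBF values; the published analysis may have used a different ICC variant.

        # Sketch of ICC(3,1) for manual vs software agreement on hypothetical MBF values.
        import numpy as np

        manual   = np.array([0.9, 1.4, 1.1, 0.7, 1.8, 1.2])   # MBF, manual evaluation
        software = np.array([1.0, 1.3, 1.1, 0.8, 1.7, 1.3])   # MBF, semi-automated evaluation

        x = np.column_stack([manual, software])                # rows = segments, cols = raters
        n, k = x.shape
        grand = x.mean()
        ss_subjects = k * ((x.mean(axis=1) - grand) ** 2).sum()
        ss_raters = n * ((x.mean(axis=0) - grand) ** 2).sum()
        ss_error = ((x - grand) ** 2).sum() - ss_subjects - ss_raters
        bms = ss_subjects / (n - 1)                            # between-subjects mean square
        ems = ss_error / ((n - 1) * (k - 1))                   # residual mean square
        icc = (bms - ems) / (bms + (k - 1) * ems)
        print(f"ICC(3,1) = {icc:.2f}")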

  10. Context-specific function of the LIM homeobox 1 transcription factor in head formation of the mouse embryo.

    PubMed

    Fossat, Nicolas; Ip, Chi Kin; Jones, Vanessa J; Studdert, Joshua B; Khoo, Poh-Lynn; Lewis, Samara L; Power, Melinda; Tourle, Karin; Loebel, David A F; Kwan, Kin Ming; Behringer, Richard R; Tam, Patrick P L

    2015-06-01

    Lhx1 encodes a LIM homeobox transcription factor that is expressed in the primitive streak, mesoderm and anterior mesendoderm of the mouse embryo. Using a conditional Lhx1 flox mutation and three different Cre deleters, we demonstrated that LHX1 is required in the anterior mesendoderm, but not in the mesoderm, for formation of the head. LHX1 enables the morphogenetic movement of cells that accompanies the formation of the anterior mesendoderm, in part through regulation of Pcdh7 expression. LHX1 also regulates, in the anterior mesendoderm, the transcription of genes encoding negative regulators of WNT signalling, such as Dkk1, Hesx1, Cer1 and Gsc. Embryos carrying mutations in Pcdh7, generated using CRISPR-Cas9 technology, and embryos without Lhx1 function specifically in the anterior mesendoderm displayed head defects that partially phenocopied the truncation defects of Lhx1-null mutants. Therefore, disruption of Lhx1-dependent movement of the anterior mesendoderm cells and failure to modulate WNT signalling both resulted in the truncation of head structures. Compound mutants of Lhx1, Dkk1 and Ctnnb1 show an enhanced head truncation phenotype, pointing to a functional link between LHX1 transcriptional activity and the regulation of WNT signalling. Collectively, these results provide comprehensive insight into the context-specific function of LHX1 in head formation: LHX1 enables the formation of the anterior mesendoderm that is instrumental for mediating the inductive interaction with the anterior neuroectoderm and LHX1 also regulates the expression of factors in the signalling cascade that modulate the level of WNT activity. © 2015. Published by The Company of Biologists Ltd.

  11. Java for flight software

    NASA Technical Reports Server (NTRS)

    Benowitz, E.; Niessner, A.

    2003-01-01

    This work involves developing representative mission-critical spacecraft software using the Real-Time Specification for Java (RTSJ). The work currently leverages the design of actual flight software used in NASA's Deep Space 1 (DS1) mission, which flew in 1998.

  12. Astronomical Software Directory Service

    NASA Astrophysics Data System (ADS)

    Hanisch, Robert J.; Payne, Harry; Hayes, Jeffrey

    1997-01-01

    With the support of NASA's Astrophysics Data Program (NRA 92-OSSA-15), we have developed the Astronomical Software Directory Service (ASDS): a distributed, searchable, WWW-based database of software packages and their related documentation. ASDS provides integrated access to 56 astronomical software packages, with more than 16,000 URLs indexed for full-text searching. Users are performing about 400 searches per month. A new aspect of our service is the inclusion of telescope and instrumentation manuals, which prompted us to change the name to the Astronomical Software and Documentation Service. ASDS was originally conceived to serve two purposes: to provide a useful Internet service in an area of expertise of the investigators (astronomical software), and as a research project to investigate various architectures for searching through a set of documents distributed across the Internet. Two of the co-investigators were then installing and maintaining astronomical software as their primary job responsibility. We felt that a service which incorporated our experience in this area would be more useful than a straightforward listing of software packages. The original concept was for a service based on the client/server model, which would function as a directory/referral service rather than as an archive. For performing the searches, we began our investigation with a decision to evaluate the Isite software from the Center for Networked Information Discovery and Retrieval (CNIDR). This software was intended as a replacement for Wide-Area Information Service (WAIS), a client/server technology for performing full-text searches through a set of documents. Isite had some additional features that we considered attractive, and we enjoyed the cooperation of the Isite developers, who were happy to have ASDS as a demonstration project. We ended up staying with the software throughout the project, making modifications to take advantage of new features as they came along, as well as

  13. Software engineering and automatic continuous verification of scientific software

    NASA Astrophysics Data System (ADS)

    Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.

    2011-12-01

    Software engineering of scientific code is challenging for a number of reasons, including pressure to publish and a lack of awareness of the pitfalls of software engineering by scientists. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employ best practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and is coupled to a number of other codes including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code and has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people for which we use bazaar for revision control, making good use of the strong branching and merging capabilities. Using features of Canonical's Launchpad platform, such as code review, blueprints for designing features and bug reporting gives the group, partners and other Fluidity users an easy-to-use platform to collaborate and allows the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK National HPC services. The testing of code in this manner leads to a continuous verification process, not a discrete event performed once development has ceased. Much of the code verification is done via the "gold standard" of comparisons to analytical

  14. The NASA Software Research Infusion Initiative: Successful Technology Transfer for Software Assurance

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Pressburger, Thomas; Markosian, Lawrence; Feather, Martin S.

    2006-01-01

    New processes, methods and tools are constantly appearing in the field of software engineering. Many of these augur great potential in improving software development processes, resulting in higher quality software with greater levels of assurance. However, there are a number of obstacles that impede their infusion into software development practices. These are the recurring obstacles common to many forms of research. Practitioners cannot readily identify the emerging techniques that may most benefit them, and cannot afford to risk time and effort in evaluating and experimenting with them while there is still uncertainty about whether they will have payoff in this particular context. Similarly, researchers cannot readily identify those practitioners whose problems would be amenable to their techniques and lack the feedback from practical applications necessary to help them to evolve their techniques to make them more likely to be successful. This paper describes an ongoing effort conducted by a software engineering research infusion team, and the NASA Research Infusion Initiative, established by NASA s Software Engineering Initiative, to overcome these obstacles.

  15. The software product assurance metrics study: JPL's software systems quality and productivity

    NASA Technical Reports Server (NTRS)

    Bush, Marilyn W.

    1989-01-01

    The findings are reported of the Jet Propulsion Laboratory (JPL)/Software Product Assurance (SPA) Metrics Study, conducted as part of a larger JPL effort to improve software quality and productivity. Until recently, no comprehensive data had been assembled on how JPL manages and develops software-intensive systems. The first objective was to collect data on software development from as many projects and for as many years as possible. Results from five projects are discussed. These results reflect 15 years of JPL software development, representing over 100 data points (systems and subsystems), over a third of a billion dollars, over four million lines of code and 28,000 person months. Analysis of this data provides a benchmark for gauging the effectiveness of past, present and future software development work. In addition, the study is meant to encourage projects to record existing metrics data and to gather future data. The SPA long term goal is to integrate the collection of historical data and ongoing project data with future project estimations.

  16. NASA Software Safety Standard

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda

    1997-01-01

    If software is a critical element in a safety-critical system, it is imperative to implement a systematic approach to software safety as an integral part of the overall system safety program. The NASA-STD-8719.13A, "NASA Software Safety Standard", describes the activities necessary to ensure that safety is designed into software that is acquired or developed by NASA, and that safety is maintained throughout the software life cycle. A PDF version is available on the WWW from Lewis. A Guidebook that will assist in the implementation of the requirements in the Safety Standard is under development at the Lewis Research Center (LeRC). After completion, it will also be available on the WWW from Lewis.

  17. How Safe Is Control Software

    NASA Technical Reports Server (NTRS)

    Dunn, William R.; Corliss, Lloyd D.

    1991-01-01

    Paper examines issue of software safety. Presents four case histories of software-safety analysis. Concludes that, to be safe, software, for all practical purposes, must be free of errors. Backup systems still needed to prevent catastrophic software failures.

  18. Formal verification of software-based medical devices considering medical guidelines.

    PubMed

    Daw, Zamira; Cleaveland, Rance; Vetter, Marcus

    2014-01-01

    Software-based devices have increasingly become an important part of several clinical scenarios. Due to their critical impact on human life, medical devices have very strict safety requirements. It is therefore necessary to apply verification methods to ensure that the safety requirements are met. Verification of software-based devices is commonly limited to the verification of their internal elements without considering the interaction that these elements have with other devices as well as the application environment in which they are used. Medical guidelines define clinical procedures, which contain the necessary information to completely verify medical devices. The objective of this work was to incorporate medical guidelines into the verification process in order to increase the reliability of the software-based medical devices. Medical devices are developed using the model-driven method deterministic models for signal processing of embedded systems (DMOSES). This method uses unified modeling language (UML) models as a basis for the development of medical devices. The UML activity diagram is used to describe medical guidelines as workflows. The functionality of the medical devices is abstracted as a set of actions that is modeled within these workflows. In this paper, the UML models are verified using the UPPAAL model-checker. For this purpose, a formalization approach for the UML models using timed automaton (TA) is presented. A set of requirements is verified by the proposed approach for the navigation-guided biopsy. This shows the capability for identifying errors or optimization points both in the workflow and in the system design of the navigation device. In addition to the above, an open source eclipse plug-in was developed for the automated transformation of UML models into TA models that are automatically verified using UPPAAL. The proposed method enables developers to model medical devices and their clinical environment using clinical workflows as one
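
    The kind of property such a model checker verifies can be illustrated, in much simplified form, by exhaustively exploring a small workflow's state space and confirming that an unsafe ordering is unreachable. The Python sketch below uses a hypothetical biopsy workflow without clocks; it only illustrates the principle and is not the UML-to-timed-automata transformation or UPPAAL itself.

        # Toy illustration of the kind of safety check a model checker automates: explore a
        # small workflow's reachable states and confirm that needle advance cannot occur
        # without a prior registration confirmation. States and transitions are hypothetical.
        transitions = {
            "idle": ["register patient"],
            "register patient": ["plan target"],
            "plan target": ["confirm registration"],
            "confirm registration": ["advance needle"],
            "advance needle": ["sample", "retract"],
            "sample": ["retract"],
            "retract": ["idle"],
        }

        def reachable(start, blocked=frozenset()):
            # Depth-first exploration of the state space, skipping blocked states.
            seen, stack = set(), [start]
            while stack:
                state = stack.pop()
                if state in seen or state in blocked:
                    continue
                seen.add(state)
                stack.extend(transitions.get(state, []))
            return seen

        # Property: every path from "idle" to "advance needle" passes through
        # "confirm registration", i.e. blocking that state makes needle advance unreachable.
        safe = "advance needle" not in reachable("idle", blocked=frozenset({"confirm registration"}))
        print(f"needle advance always preceded by registration confirmation: {safe}")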

  19. UWB Tracking Software Development

    NASA Technical Reports Server (NTRS)

    Gross, Julia; Arndt, Dickey; Ngo, Phong; Phan, Chau; Dusl, John; Ni, Jianjun; Rafford, Melinda

    2006-01-01

    An Ultra-Wideband (UWB) two-cluster Angle of Arrival (AOA) tracking prototype system is currently being developed and tested at NASA Johnson Space Center for space exploration applications. This talk discusses the software development efforts for this UWB two-cluster AOA tracking system. The role the software plays in this system is to take waveform data from two UWB radio receivers as an input, feed this input into an AOA tracking algorithm, and generate the target position as an output. The architecture of the software (Input/Output Interface and Algorithm Core) will be introduced in this talk. The development of this software has three phases. In Phase I, the software is mostly Matlab driven and calls C++ socket functions to provide the communication links to the radios. This is beneficial in the early stage when it is necessary to frequently test changes in the algorithm. Phase II of the development is to have the software mostly C++ driven and call a Matlab function for the AOA tracking algorithm. This is beneficial in order to send the tracking results to other systems and also to improve the tracking update rate of the system. The third phase is part of future work and is to have the software completely C++ driven with a graphics user interface. This software design enables the fine resolution tracking of the UWB two-cluster AOA tracking system.
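
    The geometric core of two-cluster angle-of-arrival tracking is the intersection of two bearing lines measured from receivers at known positions. The Python sketch below shows this in two dimensions with hypothetical cluster positions and angles; the actual JSC system operates in three dimensions with its own algorithm.

        # Geometric sketch of two-cluster angle-of-arrival positioning in 2D: each receiver
        # cluster reports a bearing to the target, and the estimate is the intersection of
        # the two bearing lines. Positions and angles are hypothetical.
        import math

        def aoa_fix(p1, theta1, p2, theta2):
            """Intersect bearing lines from clusters at p1, p2 (angles in radians from +x axis)."""
            d1 = (math.cos(theta1), math.sin(theta1))
            d2 = (math.cos(theta2), math.sin(theta2))
            # Solve p1 + t*d1 = p2 + s*d2 for t (2D cross-product / Cramer's rule).
            denom = d1[0] * d2[1] - d1[1] * d2[0]
            if abs(denom) < 1e-9:
                raise ValueError("bearings are parallel; no unique fix")
            t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
            return (p1[0] + t * d1[0], p1[1] + t * d1[1])

        print(aoa_fix((0.0, 0.0), math.radians(45.0), (10.0, 0.0), math.radians(135.0)))
        # -> approximately (5.0, 5.0)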

  20. NASA Software Documentation Standard

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as "Standard") is designed to support the documentation of all software developed for NASA; its goal is to provide a framework and model for recording the essential information needed throughout the development life cycle and maintenance of a software system. The NASA Software Documentation Standard can be applied to the documentation of all NASA software. The Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. The basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  1. Orbit Software Suite

    NASA Technical Reports Server (NTRS)

    Osgood, Cathy; Williams, Kevin; Gentry, Philip; Brownfield, Dana; Hallstrom, John; Stuit, Tim

    2012-01-01

    Orbit Software Suite is used to support a variety of NASA/DM (Dependable Multiprocessor) mission planning and analysis activities on the IPS (Intrusion Prevention System) platform. The suite of Orbit software tools (Orbit Design and Orbit Dynamics) resides on IPS/Linux workstations, and is used to perform mission design and analysis tasks corresponding to trajectory/launch window, rendezvous, and proximity operations flight segments. A list of tools in Orbit Software Suite represents tool versions established during/after the Equipment Rehost-3 Project.

  2. Detection and Structural Characterization of Nucleophiles Trapped Reactive Metabolites of Limonin Using Liquid Chromatography-Mass Spectrometry

    PubMed Central

    Deng, Yujie; Fu, Yudong; Xu, Shumin; Wang, Ping; Yang, Nailong; Li, Chengqian

    2018-01-01

    Limonin (LIM), a furan-containing limonoid, is one of the most abundant components of Dictamnus dasycarpus Turcz. Recent studies demonstrated that LIM has great potential for inhibiting the activity of drug-metabolizing enzymes. However, the mechanisms of LIM-induced enzyme inactivation processes remain unexplored. The main objective of this study was to identify the reactive metabolites of LIM using liquid chromatography-mass spectrometry. Three nucleophiles, glutathione (GSH), N-acetyl cysteine (NAC), and N-acetyl lysine (NAL), were used to trap the reactive metabolites of LIM in in vitro and in vivo models. Two different mass spectrometers, a hybrid quadrupole time-of-flight (Q-TOF) instrument and an LTQ Velos Pro ion trap, were employed to acquire structural information on the nucleophile adducts of LIM. In total, six nucleophile adducts of LIM (M1–M6) with their isomers were identified; among them, M1 was a GSH and NAL conjugate of LIM, M2–M4 were glutathione adducts of LIM, M5 was a NAC and NAL conjugate of LIM, and M6 was a NAC adduct of LIM. Additionally, CYP3A4 was found to be the key enzyme responsible for the bioactivation of limonin. This metabolism study largely facilitates the understanding of mechanisms of limonin-induced enzyme inactivation processes. PMID:29850372

  3. Software for enhanced video capsule endoscopy: challenges for essential progress.

    PubMed

    Iakovidis, Dimitris K; Koulaouzidis, Anastasios

    2015-03-01

    Video capsule endoscopy (VCE) has revolutionized the diagnostic work-up in the field of small bowel diseases. Furthermore, VCE has the potential to become the leading screening technique for the entire gastrointestinal tract. Computational methods that can be implemented in software can enhance the diagnostic yield of VCE both in terms of efficiency and diagnostic accuracy. Since the appearance of the first capsule endoscope in clinical practice in 2001, information technology (IT) research groups have proposed a variety of such methods, including algorithms for detecting haemorrhage and lesions, reducing the reviewing time, localizing the capsule or lesion, assessing intestinal motility, enhancing the video quality and managing the data. Even though research is prolific (as measured by publication activity), the progress made during the past 5 years can only be considered marginal with respect to clinically significant outcomes. One thing is clear: parallel pathways of medical and IT scientists exist, each publishing in their own area, but where do these research pathways meet? Could the proposed IT plans have any clinical effect and do clinicians really understand the limitations of VCE software? In this Review, we present an in-depth critical analysis that aims to inspire and align the agendas of the two scientific groups.

  4. Software Formal Inspections Guidebook

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.

  5. Software Security Knowledge: Training

    DTIC Science & Technology

    2011-05-01

    eliminating those errors. It can be found at http://cwe.mitre.org/top25. Any programmer who writes code without being aware of those problems and...time on security. Ultimately, these reasons stem from an underlying problem in the software market. Because software is essentially a black box, it is...security of software and start to effect change in the software market. Nevertheless, we still frequently get pushback when we advocate for security

  6. DSS command software update

    NASA Technical Reports Server (NTRS)

    Stinnett, W. G.

    1980-01-01

    The modifications, additions, and testing results for a version of the Deep Space Station command software, generated for support of the Voyager Saturn encounter, are discussed. The software update requirements included efforts to: (1) recode portions of the software to permit recovery of approximately 2000 words of memory; (2) correct five Voyager Ground Data System liens; (3) provide capability to automatically turn off the command processor assembly local printer during periods of low activity; and (4) correct anomalies existing in the software.

  7. On-Orbit Software Analysis

    NASA Technical Reports Server (NTRS)

    Moran, Susanne I.

    2004-01-01

    The On-Orbit Software Analysis Research Infusion Project was done by Intrinsyx Technologies Corporation (Intrinsyx) at the National Aeronautics and Space Administration (NASA) Ames Research Center (ARC). The Project was a joint collaborative effort between NASA Codes IC and SL, Kestrel Technology (Kestrel), and Intrinsyx. The primary objectives of the Project were: discovery and verification of software program properties and dependencies; detection and isolation of software defects across different versions of software; and compilation of historical data and technical expertise for future applications.

  8. Software interface verifier

    NASA Technical Reports Server (NTRS)

    Soderstrom, Tomas J.; Krall, Laura A.; Hope, Sharon A.; Zupke, Brian S.

    1994-01-01

    A Telos study of 40 recent subsystem deliveries into the DSN at JPL found software interface testing to be the single most expensive and error-prone activity, and the study team suggested creating an automated software interface test tool. The resulting Software Interface Verifier (SIV), which was funded by NASA/JPL and created by Telos, employed 92 percent software reuse to quickly create an initial version which incorporated early user feedback. SIV is now successfully used by developers for interface prototyping and unit testing, by test engineers for formal testing, and by end users for non-intrusive data flow tests in the operational environment. Metrics, including cost, are included. Lessons learned include the need for early user training. SIV is ported to many platforms and can be successfully used or tailored by other NASA groups.

  9. Distributed and Collaborative Software Analysis

    NASA Astrophysics Data System (ADS)

    Ghezzi, Giacomo; Gall, Harald C.

    Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of software analysis, such as source code analysis, co-change analysis or bug prediction. However, easy and straightforward synergies between these analyses and tools rarely exist because of their stand-alone nature, their platform dependence, their different input and output formats and the variety of data to analyze. As a consequence, distributed and collaborative software analysis scenarios, and in particular interoperability, are severely limited. We describe a distributed and collaborative software analysis platform that allows for a seamless interoperability of software analysis tools across platform, geographical and organizational boundaries. We realize software analysis tools as services that can be accessed and composed over the Internet. These distributed analysis services shall be widely accessible in our incrementally augmented Software Analysis Broker, where organizations and tool providers can register and share their tools. To allow (semi-)automatic use and composition of these tools, they are classified and mapped into a software analysis taxonomy and adhere to specific meta-models and ontologies for their category of analysis.

  10. Genome puzzle master (GPM): an integrated pipeline for building and editing pseudomolecules from fragmented sequences.

    PubMed

    Zhang, Jianwei; Kudrna, Dave; Mu, Ting; Li, Weiming; Copetti, Dario; Yu, Yeisoo; Goicoechea, Jose Luis; Lei, Yang; Wing, Rod A

    2016-10-15

    Next generation sequencing technologies have revolutionized our ability to rapidly and affordably generate vast quantities of sequence data. Once generated, raw sequences are assembled into contigs or scaffolds. However, these assemblies are mostly fragmented and inaccurate at the whole genome scale, largely due to the inability to integrate additional informative datasets (e.g. physical, optical and genetic maps). To address this problem, we developed a semi-automated software tool, Genome Puzzle Master (GPM), that enables the integration of additional genomic signposts to edit and build 'new-gen-assemblies' that result in high-quality 'annotation-ready' pseudomolecules. With GPM, loaded datasets can be connected to each other via their logical relationships, which accomplishes the tasks of 'group', 'merge', and 'order and orient' for sequences in a draft assembly. Manual editing can also be performed with a user-friendly graphical interface. Final pseudomolecules reflect a user's total data package and are available for long-term project management. GPM is a web-based pipeline and an important part of a Laboratory Information Management System (LIMS) which can be easily deployed on local servers for any genome research laboratory. The GPM (with LIMS) package is available at https://github.com/Jianwei-Zhang/LIMS. Contacts: jzhang@mail.hzau.edu.cn or rwing@mail.arizona.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
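
    To make the "order and orient" step concrete, the sketch below places two hypothetical contigs along a chromosome using marker positions from an external genetic map, flipping a contig when its internal marker order disagrees with the map order. It is an invented miniature, not GPM code; contig names, coordinates, and map positions are illustrative only.

        # Hypothetical sketch of ordering and orienting draft contigs into a
        # pseudomolecule using anchors from a genetic map (positions in cM).
        contigs = {
            "ctg1": {"length": 120_000, "markers": {"m1": 5_000, "m2": 110_000}},
            "ctg2": {"length": 80_000, "markers": {"m3": 70_000, "m4": 10_000}},
        }
        genetic_map = {"m1": 0.0, "m2": 1.2, "m3": 2.0, "m4": 3.5}

        def place(contigs, genetic_map):
            placed = []
            for name, c in contigs.items():
                by_map = sorted(c["markers"], key=genetic_map.get)       # map order
                first, last = c["markers"][by_map[0]], c["markers"][by_map[-1]]
                orientation = "+" if first <= last else "-"              # flip if orders disagree
                placed.append((genetic_map[by_map[0]], name, orientation))
            return [(name, orientation) for _, name, orientation in sorted(placed)]

        print(place(contigs, genetic_map))   # [('ctg1', '+'), ('ctg2', '-')]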

  11. Challenges in small screening laboratories: implementing an on-demand laboratory information management system.

    PubMed

    Lemmon, Vance P; Jia, Yuanyuan; Shi, Yan; Holbrook, S Douglas; Bixby, John L; Buchser, William

    2011-11-01

    The Miami Project to Cure Paralysis, part of the University of Miami Miller School of Medicine, includes a laboratory devoted to High Content Analysis (HCA) of neurons. The goal of the laboratory is to uncover signaling pathways, genes, compounds, or drugs that can be used to promote nerve growth. HCA permits the quantification of neuronal morphology, including the lengths and numbers of axons. HCA of various libraries on primary neurons requires a team-based approach, a variety of process steps and complex manipulations of cells and libraries to obtain meaningful results. HCA itself produces vast amounts of information including images, well-based data and cell-based phenotypic measures. Documenting and integrating the experimental workflows, library data and extensive experimental results is challenging. For academic laboratories generating large data sets from experiments involving thousands of perturbagens, a Laboratory Information Management System (LIMS) is the data tracking solution of choice. With both productivity and efficiency as driving rationales, the Miami Project has equipped its HCA laboratory with an On Demand or Software As A Service (SaaS) LIMS to ensure the quality of its experiments and workflows. The article discusses how the system was selected and integrated into the laboratory. The advantages of a SaaS based LIMS over a client-server based system are described. © 2011 Bentham Science Publishers

  12. Textbook Software versus Professional Software: Which Is Better for Instructional Purposes?

    ERIC Educational Resources Information Center

    Snell, Meggan; Yatsenko, Olga

    2002-01-01

    Compares textbook software with professional packages such as Peachtree for teaching accounting, in terms of cost, availability, ease of teaching and learning, and applicability. Makes suggestions for choosing accounting software. (SK)

  13. Current trends in hardware and software for brain-computer interfaces (BCIs)

    NASA Astrophysics Data System (ADS)

    Brunner, P.; Bianchi, L.; Guger, C.; Cincotti, F.; Schalk, G.

    2011-04-01

    A brain-computer interface (BCI) provides a non-muscular communication channel to people with and without disabilities. BCI devices consist of hardware and software. BCI hardware records signals from the brain, either invasively or non-invasively, using a series of device components. BCI software then translates these signals into device output commands and provides feedback. One may categorize different types of BCI applications into the following four categories: basic research, clinical/translational research, consumer products, and emerging applications. These four categories use BCI hardware and software, but have different sets of requirements. For example, while basic research needs to explore a wide range of system configurations, and thus requires a wide range of hardware and software capabilities, applications in the other three categories may be designed for relatively narrow purposes and thus may only need a very limited subset of capabilities. This paper summarizes technical aspects for each of these four categories of BCI applications. The results indicate that BCI technology is in transition from isolated demonstrations to systematic research and commercial development. This process requires several multidisciplinary efforts, including the development of better integrated and more robust BCI hardware and software, the definition of standardized interfaces, and the development of certification, dissemination and reimbursement procedures.

  14. Software Classifications: Trends in Literacy Software Publication and Marketing.

    ERIC Educational Resources Information Center

    Balajthy, Ernest

    First in a continuing series of reports on trends in marketing and publication of software for literacy education, a study explored the development of a database to track the trends and reported on trends seen in 1995. The final version of the 1995 database consisted of 1011 software titles, 165 of which had been published in 1995 and 846…

  15. The social disutility of software ownership.

    PubMed

    Douglas, David M

    2011-09-01

    Software ownership allows the owner to restrict the distribution of software and to prevent others from reading the software's source code and building upon it. However, free software is released to users under software licenses that give them the right to read the source code, modify it, reuse it, and distribute the software to others. Proponents of free software such as Richard M. Stallman and Eben Moglen argue that the social disutility of software ownership is a sufficient justification for prohibiting it. This social disutility includes the social instability of disregarding laws and agreements covering software use and distribution, inequality of software access, and the inability to help others by sharing software with them. Here I consider these and other social disutility claims against withholding specific software rights from users, in particular, the rights to read the source code, duplicate, distribute, modify, imitate, and reuse portions of the software within new programs. I find that generally while withholding these rights from software users does cause some degree of social disutility, only the rights to duplicate, modify and imitate cannot legitimately be denied to users on this basis. The social disutility of withholding the rights to distribute the software, read its source code and reuse portions of it in new programs is insufficient to prohibit software owners from denying them to users. A compromise between the software owner and user can minimise the social disutility of withholding these particular rights from users. However, the social disutility caused by software patents is sufficient for rejecting such patents as they restrict the methods of reducing social disutility possible with other forms of software ownership.

  16. Software Process Improvement: Supporting the Linking of the Software and the Business Strategies

    NASA Astrophysics Data System (ADS)

    Albuquerque, Adriano Bessa; Rocha, Ana Regina; Lima, Andreia Cavalcanti

    The market is becoming more and more competitive; many products and services depend on software, and software is one of the most important assets influencing organizations' businesses. In this context, companies must deal carefully with software, whether developing or acquiring it. One perspective that can help to take advantage of software, supporting the business effectively, is to invest in the organization's software processes. This paper presents an approach to evaluate and improve the process assets of software organizations, based on internationally well-known standards and process models. This approach is supported by automated tools from the TABA Workstation and is part of a wider improvement strategy constituted of three layers (organizational layer, process execution layer and external entity layer). Moreover, this paper presents the experience of its use and the results obtained.

  17. Culture shock: Improving software quality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    de Jong, K.; Trauth, S.L.

    1988-01-01

    The concept of software quality can represent a significant shock to an individual who has been developing software for many years and who believes he or she has been doing a high quality job. The very idea that software includes lines of code and associated documentation is foreign and difficult to grasp, at best. Implementation of a software quality program hinges on the concept that software is a product whose quality needs improving. When this idea is introduced into a technical community that is largely "self-taught" and has been producing "good" software for some time, a fundamental understanding of the concepts associated with software is often weak. Software developers can react as if to say, "What are you talking about? What do you mean I'm not doing a good job? I haven't gotten any complaints about my code yet!" Coupling such surprise and resentment with the shock that software really is a product and software quality concepts do exist can fuel the volatility of these emotions. In this paper, we demonstrate that the concept of software quality can indeed pose a culture shock to developers. We also show that a "typical" quality assurance approach, that of imposing a standard and providing inspectors and auditors to assure its adherence, contributes to this shock and detracts from the very goal the approach should achieve. We offer an alternative, adopted through experience, to implement a software quality program: cooperative assistance. We show how cooperation, education, consultation and friendly assistance can overcome this culture shock. 3 refs.

  18. Astronomical Software Directory Service

    NASA Technical Reports Server (NTRS)

    Hanisch, R. J.; Payne, H.; Hayes, J.

    1998-01-01

    This is the final report on the development of the Astronomical Software Directory Service (ASDS), a distributable, searchable, WWW-based database of software packages and their related documentation. ASDS provides integrated access to 56 astronomical software packages, with more than 16,000 URLs indexed for full-text searching.

  19. Specifications for Thesaurus Software.

    ERIC Educational Resources Information Center

    Milstead, Jessica L.

    1991-01-01

    Presents specifications for software that is designed to support manual development and maintenance of information retrieval thesauri. Evaluation of existing software and design of custom software is discussed, requirements for integration with larger systems and for the user interface are described, and relationships among terms are discussed.…
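
    To make the term relationships such software must manage concrete, here is a minimal sketch of a thesaurus structure that keeps broader/narrower and related-term links reciprocal. The relationship labels (BT/NT/RT/UF) follow common thesaurus practice rather than the article's specifications, and the sample terms are invented.

        # Hypothetical sketch: a thesaurus that maintains reciprocal term links.
        class Thesaurus:
            def __init__(self):
                # term -> {"BT": broader, "NT": narrower, "RT": related, "UF": used-for}
                self.terms = {}

            def _ensure(self, term):
                return self.terms.setdefault(
                    term, {"BT": set(), "NT": set(), "RT": set(), "UF": set()})

            def add_broader(self, term, broader):
                self._ensure(term)["BT"].add(broader)
                self._ensure(broader)["NT"].add(term)    # reciprocal link kept in sync

            def add_related(self, a, b):
                self._ensure(a)["RT"].add(b)
                self._ensure(b)["RT"].add(a)

        t = Thesaurus()
        t.add_broader("laboratory information management systems", "information systems")
        t.add_related("laboratory information management systems", "clinical data management")
        print(sorted(t.terms["information systems"]["NT"]))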

  20. Antiterrorist Software

    NASA Technical Reports Server (NTRS)

    Clark, David A.

    1998-01-01

    In light of the escalation of terrorism, the Department of Defense spearheaded the development of new antiterrorist software for all Government agencies by issuing a Broad Agency Announcement to solicit proposals. This Government-wide competition resulted in a team that includes NASA Lewis Research Center's Computer Services Division, who will develop the graphical user interface (GUI) and test it in their usability lab. The team launched a program entitled Joint Sphere of Security (JSOS), crafted a design architecture (see the following figure), and is testing the interface. This software system has a state-of-the-art, object-oriented architecture, with a main kernel composed of the Dynamic Information Architecture System (DIAS) developed by Argonne National Laboratory. DIAS will be used as the software "breadboard" for assembling the components of explosions, such as blast and collapse simulations.

  1. Software quality in 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, C.

    1997-11-01

    For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to a parity with manufacturing quality levels. Since software is on the critical path for many engineered products, and for internal business systems as well, the new approaches are starting to affect global competition and attract widespread international interest. It can be hypothesized that success in mastering software quality will be a key strategy for dominating global software markets in the 21st century.

  2. Software packager user's guide

    NASA Technical Reports Server (NTRS)

    Callahan, John R.

    1995-01-01

    Software integration is a growing area of concern for many programmers and software managers because the need to build new programs quickly from existing components is greater than ever. This includes building versions of software products for multiple hardware platforms and operating systems, building programs from components written in different languages, and building systems from components that must execute on different machines in a distributed network. The goal of software integration is to make building new programs from existing components more seamless -- programmers should pay minimal attention to the underlying configuration issues involved. Libraries of reusable components and classes are important tools but only partial solutions to software development problems. Even though software components may have compatible interfaces, there may be other reasons, such as differences between execution environments, why they cannot be integrated. Often, components must be adapted or reimplemented to fit into another application because of implementation differences -- they are implemented in different programming languages, dependent on different operating system resources, or must execute on different physical machines. The software packager is a tool that allows programmers to deal with interfaces between software components and ignore complex integration details. The packager takes modular descriptions of the structure of a software system written in the package specification language and produces an integration program in the form of a makefile. If complex integration tools are needed to integrate a set of components, such as remote procedure call stubs, their use is implied by the packager automatically and stub generation tools are invoked in the corresponding makefile. The programmer deals only with the components themselves and not the details of how to build the system on any given platform.
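
    To make the packaging idea tangible, the sketch below turns a tiny, invented component description into a makefile, the packager's output format described above. The spec layout, component names, and rules are hypothetical and far simpler than the actual package specification language, which, as noted, can also imply stub generation for distributed components.

        # Hypothetical miniature of a "package spec -> makefile" generator.
        spec = {
            "program": "tracker",
            "components": [
                {"name": "core", "sources": ["core.c"]},
                {"name": "ui", "sources": ["ui.c"]},
            ],
        }

        def emit_makefile(spec):
            objs, rules = [], []
            for comp in spec["components"]:
                for src in comp["sources"]:
                    obj = src.rsplit(".", 1)[0] + ".o"
                    objs.append(obj)
                    rules.append(f"{obj}: {src}\n\t$(CC) -c -o $@ $<")
            link = f"{spec['program']}: {' '.join(objs)}\n\t$(CC) -o $@ $^"
            return "\n\n".join([link] + rules) + "\n"

        print(emit_makefile(spec))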

  3. Executable assertions and flight software

    NASA Technical Reports Server (NTRS)

    Mahmood, A.; Andrews, D. M.; Mccluskey, E. J.

    1984-01-01

    Executable assertions are used to test flight control software. The techniques used for testing flight software, however, are different from the techniques used to test other kinds of software. This is because of the redundant nature of flight software. An experimental setup for testing flight software using executable assertions is described. Techniques for writing and using executable assertions to test flight software are presented. The error detection capability of assertions is studied and many examples of assertions are given. The issues of placement and complexity of assertions and the language features to support efficient use of assertions are discussed.
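
    As a small illustration of the idea, the sketch below wraps a toy control-law routine with a runtime post-condition so that out-of-range outputs are recorded during testing. The limits, gain, and function are hypothetical; real flight software would express such assertions in its own implementation language and exercise them against redundant channels.

        # Hypothetical executable assertion: a post-condition checked on every call.
        FAILURES = []

        def executable_assertion(predicate, message):
            def wrap(fn):
                def checked(*args, **kwargs):
                    result = fn(*args, **kwargs)
                    if not predicate(result):
                        FAILURES.append((fn.__name__, message, result))
                    return result
                return checked
            return wrap

        @executable_assertion(lambda cmd: -25.0 <= cmd <= 25.0,
                              "elevator command outside +/-25 degrees")
        def elevator_command(pitch_error, gain=3.0):
            return gain * pitch_error          # simplified stand-in for a control law

        elevator_command(2.0)     # within limits
        elevator_command(40.0)    # violates the assertion during test
        print(FAILURES)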

  4. Software Safety Progress in NASA

    NASA Technical Reports Server (NTRS)

    Radley, Charles F.

    1995-01-01

    NASA has developed guidelines for development and analysis of safety-critical software. These guidelines have been documented in a Guidebook for Safety Critical Software Development and Analysis. The guidelines represent a practical 'how to' approach, to assist software developers and safety analysts in cost effective methods for software safety. They provide guidance in the implementation of the recent NASA Software Safety Standard NSS-1740.13 which was released as 'Interim' version in June 1994, scheduled for formal adoption late 1995. This paper is a survey of the methods in general use, resulting in the NASA guidelines for safety critical software development and analysis.

  5. ICESat (GLAS) Science Processing Software Document Series. Volume 1; Science Software Management Plan; 3.0

    NASA Technical Reports Server (NTRS)

    Hancock, David W., III

    1999-01-01

    This document provides the Software Management Plan for the GLAS Standard Data Software (SDS) supporting the GLAS instrument of the EOS ICESat Spacecraft. The SDS encompasses the ICESat Science Investigator-led Processing System (I-SIPS) Software and the Instrument Support Terminal (IST) Software. For the I-SIPS Software, the SDS will produce Level 0, Level 1, and Level 2 data products as well as the associated product quality assessments and descriptive information. For the IST Software, the SDS will accommodate the GLAS instrument support areas of engineering status, command, performance assessment, and instrument health status.

  6. 48 CFR 227.7203-14 - Conformity, acceptance, and warranty of computer software and computer software documentation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ..., and warranty of computer software and computer software documentation. 227.7203-14 Section 227.7203-14... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-14 Conformity, acceptance, and warranty of computer software and computer...

  7. 48 CFR 227.7203-14 - Conformity, acceptance, and warranty of computer software and computer software documentation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., and warranty of computer software and computer software documentation. 227.7203-14 Section 227.7203-14... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-14 Conformity, acceptance, and warranty of computer software and computer...

  8. 48 CFR 227.7203-14 - Conformity, acceptance, and warranty of computer software and computer software documentation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., and warranty of computer software and computer software documentation. 227.7203-14 Section 227.7203-14... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-14 Conformity, acceptance, and warranty of computer software and computer...

  9. 48 CFR 227.7203-14 - Conformity, acceptance, and warranty of computer software and computer software documentation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ..., and warranty of computer software and computer software documentation. 227.7203-14 Section 227.7203-14... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-14 Conformity, acceptance, and warranty of computer software and computer...

  10. 48 CFR 227.7203-14 - Conformity, acceptance, and warranty of computer software and computer software documentation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., and warranty of computer software and computer software documentation. 227.7203-14 Section 227.7203-14... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-14 Conformity, acceptance, and warranty of computer software and computer...

  11. Automated matching software for clinical trials eligibility: measuring efficiency and flexibility.

    PubMed

    Penberthy, Lynne; Brown, Richard; Puma, Federico; Dahman, Bassam

    2010-05-01

    Clinical trials (CT) serve as the medium that translates clinical research into standards of care. Low or slow recruitment leads to delays in delivery of new therapies to the public. Determination of eligibility in all patients is one of the most important factors to assure unbiased results from the clinical trials process and represents the first step in addressing the issue of underrepresentation and equal access to clinical trials. This is a pilot project evaluating the efficiency, flexibility, and generalizability of an automated clinical trials eligibility screening tool across 5 different clinical trials and clinical trial scenarios. There was a substantial total savings during the study period in research staff time spent in evaluating patients for eligibility, ranging from 165 h to 1329 h. There was a marked enhancement in efficiency with the automated system for all but one study in the pilot. The ratio of mean staff time required per eligible patient identified ranged from 0.8 to 19.4 for the manual versus the automated process. The results of this study demonstrate that automation offers an opportunity to reduce the burden of the manual processes required for CT eligibility screening and to assure that all patients have an opportunity to be evaluated for participation in clinical trials as appropriate. The automated process greatly reduces the time spent on eligibility screening compared with the traditional manual process by effectively transferring the load of the eligibility assessment process to the computer. Copyright (c) 2010 Elsevier Inc. All rights reserved.
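
    A minimal sketch of the kind of automated pre-screening evaluated here: patient attributes are tested against machine-readable inclusion and exclusion rules so that staff review only the candidates who pass. The criteria and patient records below are invented examples, not the pilot's five trials or its data.

        # Hypothetical rule-based eligibility pre-screen.
        def eligible(patient, criteria):
            return all(rule(patient) for rule in criteria)

        trial_criteria = [
            lambda p: 18 <= p["age"] <= 75,
            lambda p: p["diagnosis"] == "breast cancer",
            lambda p: p["ecog"] <= 2,
            lambda p: not p["prior_chemo"],
        ]

        patients = [
            {"id": 1, "age": 54, "diagnosis": "breast cancer", "ecog": 1, "prior_chemo": False},
            {"id": 2, "age": 81, "diagnosis": "breast cancer", "ecog": 0, "prior_chemo": False},
        ]

        print([p["id"] for p in patients if eligible(p, trial_criteria)])   # [1]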

  12. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    PubMed

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

    DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need for culturing them. These technologies usually return short (100-300 base-pairs long) DNA reads, and these reads are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis software tools (AmphoraNet, a web server implementation of AMPHORA2; MG-RAST; and MEGAN5) for their capability of assigning quantitative phylogenetic information to the data, describing the frequency of appearance of the microorganisms of the same taxa in the sample. The difficulties of the task arise from the fact that longer genomes produce more reads from the same organism than shorter genomes, and some software assigns higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature. Because of the complexity of those benchmarks, it is usually difficult to judge the resistance of a metagenomic software tool to this "genome length bias." Therefore, we have made a simple benchmark for the evaluation of the "taxon-counting" in a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads of average length 150 bp, and mixed the reads, creating our simple benchmark. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a software tool fails on that simple task, it will surely fail on most real metagenomes. We applied the three software tools to the benchmark. The ideal quantitative solution would assign the same proportion to the three bacterial taxa. We have found that AMPHORA2/AmphoraNet gave the most accurate results and the other two tools were under
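
    The benchmark construction is simple enough to sketch directly: equal copy numbers of three genomes of different lengths are shredded into roughly 150 bp reads and mixed, so read counts scale with genome length even though each taxon is present in equal copy number, which is exactly the bias a quantitatively sound tool must correct for. Random sequences stand in for real genomes below, and the taxon names and sizes are invented.

        # Hypothetical miniature of the "equal copies, unequal lengths" benchmark.
        import random
        random.seed(0)

        genomes = {"shortBug": 50_000, "midBug": 150_000, "longBug": 450_000}   # bp
        COPIES, READ_LEN = 10, 150

        reads = []
        for taxon, length in genomes.items():
            genome = "".join(random.choice("ACGT") for _ in range(length))
            for _ in range(COPIES):
                # shred one genome copy into non-overlapping ~150 bp reads
                for start in range(0, length - READ_LEN + 1, READ_LEN):
                    reads.append((taxon, genome[start:start + READ_LEN]))

        random.shuffle(reads)
        counts = {t: sum(1 for taxon, _ in reads if taxon == t) for t in genomes}
        print(counts)   # read counts grow with genome length despite equal copy numbers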

  13. Formal Validation of Aerospace Software

    NASA Astrophysics Data System (ADS)

    Lesens, David; Moy, Yannick; Kanig, Johannes

    2013-08-01

    Any single error in critical software can have catastrophic consequences. Even though failures are usually not advertised, some software bugs have become famous, such as the error in the MIM-104 Patriot. For space systems, experience shows that software errors are a serious concern: more than half of all satellite failures from 2000 to 2003 involved software. To address this concern, this paper addresses the use of formal verification of software developed in Ada.

  14. "SABER": A new software tool for radiotherapy treatment plan evaluation.

    PubMed

    Zhao, Bo; Joiner, Michael C; Orton, Colin G; Burmeister, Jay

    2010-11-01

    Both spatial and biological information are necessary in order to perform true optimization of a treatment plan and for predicting clinical outcome. The goal of this work is to develop an enhanced treatment plan evaluation tool which incorporates biological parameters and retains spatial dose information. A software system is developed which provides biological plan evaluation with a novel combination of features. It incorporates hyper-radiosensitivity using the induced-repair model and applies the new concept of dose convolution filter (DCF) to simulate dose wash-out effects due to cell migration, bystander effect, and/or tissue motion during treatment. Further, the concept of spatial DVH (sDVH) is introduced to evaluate and potentially optimize the spatial dose distribution in the target volume. Finally, generalized equivalent uniform dose is derived from both the physical dose distribution (gEUD) and the distribution of equivalent dose in 2 Gy fractions (gEUD2) and the software provides three separate models for calculation of tumor control probability (TCP), normal tissue complication probability (NTCP), and probability of uncomplicated tumor control (P+). TCP, NTCP, and P+ are provided as a function of prescribed dose and multivariable TCP, NTCP, and P+ plots are provided to illustrate the dependence on individual parameters used to calculate these quantities. Ten plans from two clinical treatment sites are selected to test the three calculation models provided by this software. By retaining both spatial and biological information about the dose distribution, the software is able to distinguish features of radiotherapy treatment plans not discernible using commercial systems. Plans that have similar DVHs may have different spatial and biological characteristics and the application of novel tools such as sDVH and DCF within the software may substantially change the apparent plan quality or predicted plan metrics such as TCP and NTCP. For the cases examined
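
    The generalized equivalent uniform dose used above is conventionally computed as gEUD = ((1/N) * sum_i d_i^a)^(1/a) over the voxel doses d_i, with the parameter a chosen per tissue type. The sketch below evaluates that standard formula on invented voxel doses; it does not reproduce SABER's DCF, sDVH, or TCP/NTCP models.

        # Standard gEUD formula on illustrative voxel doses (Gy); parameter 'a'
        # is tissue specific (a < 0 for tumours, a > 0 for serial normal tissues).
        def geud(doses, a):
            n = len(doses)
            return (sum(d ** a for d in doses) / n) ** (1.0 / a)

        target_doses = [58.0, 60.0, 61.5, 59.0, 60.5]   # hypothetical values
        print(round(geud(target_doses, a=-10), 2))      # sensitive to cold spots
        print(round(geud(target_doses, a=8), 2))        # sensitive to hot spots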

  15. 77 FR 50726 - Software Requirement Specifications for Digital Computer Software and Complex Electronics Used in...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... Computer Software and Complex Electronics Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear...-1209, ``Software Requirement Specifications for Digital Computer Software and Complex Electronics used... Electronics Engineers (ANSI/IEEE) Standard 830-1998, ``IEEE Recommended Practice for Software Requirements...

  16. The Implications of Using Integrated Software Support Environment for Design of Guidance and Control Systems Software

    DTIC Science & Technology

    1990-02-01

    inspections are performed before each formal review of each software life cycle phase. * Required software audits are performed. * The software is acceptable... Audits: Software audits are performed by SQA consistent with the general audit rules and an audit report is prepared. Software Quality Inspection (SQI...DSD Software Development Method 3-34 DEFINITION OF ACRONYMS Acronym Full Name or Description MACH Méthode d'Analyse et de Conception Hiérarchisée

  17. GFAST Software Demonstration

    NASA Image and Video Library

    2017-03-17

    NASA engineers and test directors gather in Firing Room 3 in the Launch Control Center at NASA's Kennedy Space Center in Florida, to watch a demonstration of the automated command and control software for the agency's Space Launch System (SLS) and Orion spacecraft. In front, far right, is Charlie Blackwell-Thompson, launch director for Exploration Mission 1 (EM-1). The software is called the Ground Launch Sequencer. It will be responsible for nearly all of the launch commit criteria during the final phases of launch countdowns. The Ground and Flight Application Software Team (GFAST) demonstrated the software. It was developed by the Command, Control and Communications team in the Ground Systems Development and Operations (GSDO) Program. GSDO is helping to prepare the center for the first test flight of Orion atop the SLS on EM-1.

  18. Classification software technique assessment

    NASA Technical Reports Server (NTRS)

    Jayroe, R. R., Jr.; Atkinson, R.; Dasarathy, B. V.; Lybanon, M.; Ramapryian, H. K.

    1976-01-01

    A catalog of software options is presented for the use of local user communities to obtain software for analyzing remotely sensed multispectral imagery. The resources required to utilize a particular software program are described. Descriptions of how a particular program analyzes data and the performance of that program for an application and data set provided by the user are shown. An effort is made to establish a statistical performance base for various software programs with regard to different data sets and analysis applications, to determine the status of the state-of-the-art.

  19. Software metrics: Software quality metrics for distributed systems. [reliability engineering

    NASA Technical Reports Server (NTRS)

    Post, J. V.

    1981-01-01

    Software quality metrics was extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.

  20. VARK learning preferences and mobile anatomy software application use in pre-clinical chiropractic students.

    PubMed

    Meyer, Amanda J; Stomski, Norman J; Innes, Stanley I; Armson, Anthony J

    2016-05-06

    Ubiquitous smartphone ownership and reduced face-to-face teaching time may lead to students making greater use of mobile technologies in their learning. This is the first study to report on the prevalence of mobile gross anatomy software applications (apps) usage in pre-clinical chiropractic students and to ascertain if a relationship exists between preferred learning styles as determined by the validated VARK(©) questionnaire and use of mobile anatomy apps. The majority of the students who completed the VARK questionnaire were multimodal learners with kinesthetic and visual preferences. Sixty-seven percent (73/109) of students owned one or more mobile anatomy apps which were used by 57 students. Most of these students owned one to five apps and spent less than 30 minutes per week using them. Six of the top eight mobile anatomy apps owned and recommended by the students were developed by 3D4Medical. Visual learning preferences were not associated with time spent using mobile anatomy apps (OR = 0.40, 95% CI 0.12-1.40). Similarly, kinesthetic learning preferences (OR = 1.88, 95% CI 0.18-20.2), quadmodal preferences (OR = 0.71, 95% CI 0.06-9.25), or gender (OR = 1.51, 95% CI 0.48-4.81) did not affect the time students spent using mobile anatomy apps. Learning preferences do not appear to influence students' time spent using mobile anatomy apps. Anat Sci Educ 9: 247-254. © 2015 American Association of Anatomists.
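
    For readers unfamiliar with how results such as OR = 0.40 (95% CI 0.12-1.40) are obtained, the sketch below computes an odds ratio and a Wald 95% confidence interval from a 2x2 table. The cell counts are invented for illustration and are not the study's data.

        # Odds ratio with a Wald 95% confidence interval from a 2x2 table.
        import math

        def odds_ratio_ci(a, b, c, d, z=1.96):
            """a, b, c, d: cell counts (exposure by outcome, yes/no)."""
            or_ = (a * d) / (b * c)
            se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of ln(OR)
            lo = math.exp(math.log(or_) - z * se)
            hi = math.exp(math.log(or_) + z * se)
            return or_, lo, hi

        # e.g. visual learners vs. others by app use >30 min/week (made-up counts)
        print(tuple(round(x, 2) for x in odds_ratio_ci(8, 25, 15, 19)))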