Sample records for software development i-used

  1. [Development of a software standardizing optical density with operation settings related to several limitations].

    PubMed

    Tu, Xiao-Ming; Zhang, Zuo-Heng; Wan, Cheng; Zheng, Yu; Xu, Jin-Mei; Zhang, Yuan-Yuan; Luo, Jian-Ping; Wu, Hai-Wei

    2012-12-01

    To develop software that can be used to standardize optical density and normalize the procedures and results of standardization, in order to effectively solve several problems generated during standardization of indirect ELISA results. The software was designed based on the I-STOD method, with operation settings to solve the problems that one might encounter during standardization. Matlab GUI was used as the development tool. The software was tested with the results of the detection of sera of persons from schistosomiasis japonica endemic areas. I-STOD V1.0 (WINDOWS XP/WIN 7, 0.5 GB) was successfully developed to standardize optical density. A series of serum samples from schistosomiasis japonica endemic areas was used to examine the operational effects of the I-STOD V1.0 software. The results indicated that the software successfully overcame several problems, including the reliability of the standard curve, the applicable scope of samples, and the determination of dilution for samples outside that scope, so that I-STOD was performed more conveniently and the results of standardization were more consistent. I-STOD V1.0 is professional software based on I-STOD. It can be easily operated and can effectively standardize the testing results of indirect ELISA.
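
    As a rough illustration of the general idea behind I-STOD-style standardization, the sketch below fits a standard curve to calibrator readings and converts a raw optical density (OD) into a standardized value, flagging readings that fall outside the curve's reliable range. This is a minimal sketch under assumed data and an assumed log-linear curve model; the published method's actual model, parameters, and range checks are not reproduced here.

    import numpy as np

    def fit_standard_curve(concentrations, od_values):
        # Fit a log-linear standard curve: OD = a * log10(concentration) + b
        a, b = np.polyfit(np.log10(concentrations), od_values, 1)
        return a, b

    def standardize(od, curve, od_min, od_max):
        # Convert a raw OD reading into an estimated concentration,
        # refusing readings outside the reliable range of the curve.
        if not (od_min <= od <= od_max):
            return None  # caller should re-test the sample at another dilution
        a, b = curve
        return 10 ** ((od - b) / a)

    # Invented calibrator data, for illustration only.
    curve = fit_standard_curve([1, 10, 100, 1000], [0.12, 0.45, 0.88, 1.30])
    print(standardize(0.65, curve, od_min=0.12, od_max=1.30))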

  2. Reuse at the Software Productivity Consortium

    NASA Technical Reports Server (NTRS)

    Weiss, David M.

    1989-01-01

    The Software Productivity Consortium is sponsored by 14 aerospace companies as a developer of software engineering methods and tools. Software reuse and prototyping are currently the major emphasis areas. The Methodology and Measurement Project in the Software Technology Exploration Division has developed some concepts for reuse which they intend to develop into a synthesis process. They have identified two approaches to software reuse: opportunistic and systematic. The assumptions underlying the systematic approach, phrased as hypotheses, are the following: the redevelopment hypothesis, i.e., software developers solve the same problems repeatedly; the oracle hypothesis, i.e., developers are able to predict variations from one redevelopment to others; and the organizational hypothesis, i.e., software must be organized according to behavior and structure to take advantage of the predictions that the developers make. The conceptual basis for reuse includes: program families, information hiding, abstract interfaces, uses and information hiding hierarchies, and process structure. The primary reusable software characteristics are black-box descriptions, structural descriptions, and composition and decomposition based on program families. Automated support can be provided for systematic reuse, and the Consortium is developing a prototype reuse library and guidebook. The software synthesis process that the Consortium is aiming toward includes modeling, refinement, prototyping, reuse, assessment, and new construction.

  3. Characterization of Morphology using MAMA Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gravelle, Julie

    The MAMA (Morphological Analysis for Material Attribution) software was developed at the Los Alamos National Laboratory, funded through the National Technical Nuclear Forensics Center in the Department of Homeland Security. The software allows images to be analysed and quantified. The largest project I worked on was to quantify images of plutonium oxides and ammonium diuranates prepared by the group, and to provide analyses of the particles in each sample. Images were quantified through MAMA, with a color analysis, a lexicon description and powder x-ray diffraction. Through this we were able to visually see a difference between some of the syntheses. An additional project was to revise the manual for MAMA to help streamline training and provide useful tips so that users can more quickly become acclimated to the software. The third project investigated expanding the scope of MAMA and finding a statistically relevant baseline for the particulates through the analysis of maps in the software, using known measurements to compare the error associated with the software. During this internship, I worked on several different projects dealing with the MAMA software. The revision of the user manual for the MAMA software was the first project I was able to work and collaborate on. I first learned how to use the software through instruction from a skilled user at the laboratory, Dan Schwartz, and by using the existing user manual and examples. After becoming accustomed to the program, I started to go over the manual to correct and change items that were not as useful or descriptive as they could have been. I also added tips that I learned as I explored the software. The updated manual was also worked on by several others who have been developing the program. The goal of these revisions was to ensure that the most concise and simple directions to the software were available to future users. By incorporating tricks and shortcuts that I discovered and picked up from watching other users into the user guide, I believe that anyone who utilizes the software will be able to quickly understand the best way to analyze their image and use the tools the program offers to achieve useful results.

  4. Using iKidTools™ Software Support Systems to Develop and Implement Self-Monitoring Interventions

    ERIC Educational Resources Information Center

    Patti, Angela L.; Miller, Kevin J.

    2011-01-01

    Educational teams often are faced with the task of developing and implementing Behavioral Intervention Plans (BIPs) for students who present challenging and/or disruptive behaviors. This article describes the steps used to develop and implement a self-monitoring BIP that incorporated an innovative software system, iKidTools™. An authentic case…

  5. An Engineering Context for Software Engineering

    DTIC Science & Technology

    2008-09-01

    medium in which I can plant the ideas from this dissertation. I have also written a book on requirements development that is used at NPS by myself and...Addison-Wesley, Anniversary ed., 1995. [Bry00] Bryant, A., “Metaphor, Myth, and Mimicry: The Bases of Software Engineering,” Annals of Software

  6. Clinical software development for the Web: lessons learned from the BOADICEA project

    PubMed Central

    2012-01-01

    Background In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. Results We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. Conclusions We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback. PMID:22490389

  7. Clinical software development for the Web: lessons learned from the BOADICEA project.

    PubMed

    Cunningham, Alex P; Antoniou, Antonis C; Easton, Douglas F

    2012-04-10

    In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback.

  8. Astronomers as Software Developers

    NASA Astrophysics Data System (ADS)

    Pildis, Rachel A.

    2016-01-01

    Astronomers know that their research requires writing, adapting, and documenting computer software. Furthermore, they often have to learn new computer languages and figure out how existing programs work without much documentation or guidance and with extreme time pressure. These are all skills that can lead to a software development job, but recruiters and employers probably won't know that. I will discuss all the highly useful experience that astronomers may not know that they already have, and how to explain that knowledge to others when looking for non-academic software positions. I will also talk about some of the pitfalls I have run into while interviewing for jobs and working as a developer, and encourage you to embrace the curiosity employers might have about your non-standard background.

  9. Lessons Learned through the Development and Publication of AstroImageJ

    NASA Astrophysics Data System (ADS)

    Collins, Karen

    2018-01-01

    As lead author of the scientific image processing software package AstroImageJ (AIJ), I will discuss the reasoning behind why we decided to release AIJ to the public, and the lessons we learned related to the development, publication, distribution, and support of AIJ. I will also summarize the AIJ code language selection, code documentation and testing approaches, code distribution, update, and support facilities used, and the code citation and licensing decisions. Since AIJ was initially developed as part of my graduate research and was my first scientific open source software publication, many of my experiences and difficulties encountered may parallel those of others new to scientific software publication. Finally, I will discuss the benefits and disadvantages of releasing scientific software that I now recognize after having AIJ in the public domain for more than five years.

  10. SNLSimMagic v 2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    This software is an iOS (Apple) Augmented Reality (AR) application that runs on the iPhone and iPad. It is designed to scan in a photograph or graphic and "play" an associated video. This release, SNLSimMagic, was built using Wikitude Augmented Reality (AR) software development kit (SDK) integrated into Apple iOS SDK application and the Cordova libraries. These codes enable the generation of runtime targets using cloud recognition and developer-defined target features which are then accessed by means of a custom application.

  11. Programmable Logic Device (PLD) Design Description for the Integrated Power, Avionics, and Software (iPAS) Space Telecommunications Radio System (STRS) Radio

    NASA Technical Reports Server (NTRS)

    Shalkhauser, Mary Jo W.

    2017-01-01

    The Space Telecommunications Radio System (STRS) provides a common, consistent framework for software defined radios (SDRs) to abstract the application software from the radio platform hardware. The STRS standard aims to reduce the cost and risk of using complex, configurable and reprogrammable radio systems across NASA missions. To promote the use of the STRS architecture for future NASA advanced exploration missions, NASA Glenn Research Center (GRC) developed an STRS-compliant SDR on a radio platform used by the Advance Exploration System program at the Johnson Space Center (JSC) in their Integrated Power, Avionics, and Software (iPAS) laboratory. At the conclusion of the development, the software and hardware description language (HDL) code was delivered to JSC for use in their iPAS test bed to get hands-on experience with the STRS standard, and for development of their own STRS waveforms on the now STRS-compliant platform. The iPAS STRS Radio was implemented on the Reconfigurable, Intelligently-Adaptive Communication System (RIACS) platform, currently being used for radio development at JSC. The platform consists of a Xilinx ML605 Virtex-6 FPGA board, an Analog Devices FMCOMMS1-EBZ RF transceiver board, and an Embedded PC (Axiomtek eBox 620-110-FL) running the Ubuntu 12.4 operating system. Figure 1 shows the RIACS platform hardware. The result of this development is a very low cost STRS-compliant platform that can be used for waveform developments for multiple applications. The purpose of this document is to describe the design of the HDL code for the FPGA portion of the iPAS STRS Radio, particularly the design of the FPGA wrapper and the test waveform.

  12. The roles of the AAS Journals' Data Editors

    NASA Astrophysics Data System (ADS)

    Muench, August; NASA/SAO ADS, CERN/Zenodo.org, Harvard/CfA Wolbach Library

    2018-01-01

    I will summarize the community services provided by the AAS Journals' Data Editors to support authors when citing and preserving the software and data used in the published literature. In addition, I will describe the life of a piece of code as it passes through the current workflows for software citation in astronomy. Using this “lifecycle” I will detail the ongoing work funded by a grant from the Alfred P. Sloan Foundation to the American Astronomical Society to improve the citation of software in the literature. The funded development team and advisory boards, made up of non-profit publishers, literature indexers, and preservation archives, are implementing the Force11 software citation principles for astronomy journals. The outcome of this work will be new workflows for authors and developers that fit into their current practices while enabling versioned citation of software and granular credit for its creators.

  13. Improving the Agency's Software Acquisition Capability

    NASA Technical Reports Server (NTRS)

    Hankinson, Allen

    2003-01-01

    External development of software has often led to unsatisfactory results and great frustration for the assurance community. Contracts frequently omit critical assurance processes or the right to oversee software development activities. At a time when NASA depends more and more on software to implement critical system functions, a combination of three factors exacerbates this problem: 1) the ever-increasing trend to acquire rather than develop software in-house, 2) the trend toward performance-based contracts, and 3) acquisition vehicles that only state software requirements while leaving development standards and assurance methodologies up to the contractor. We propose to identify specific methods and tools that NASA projects can use to mitigate the adverse effects of the three problems. Two broad classes of methods/tools will be explored. The first will be those that provide NASA projects with insight and oversight into contractors' activities. The second will be those that help projects objectively assess, and thus improve, their software acquisition capability. Of particular interest is the Software Engineering Institute's (SEI) Software Acquisition Capability Maturity Model (SA-CMM).

  14. Development of Software to Model AXAF-I Image Quality

    NASA Technical Reports Server (NTRS)

    Ahmad, Anees; Hawkins, Lamar

    1996-01-01

    This draft final report describes the work performed under delivery order number 145 from May 1995 through August 1996. The scope of work included a number of software development tasks for the performance modeling of AXAF-I. A number of new capabilities and functions have been added to the GT software, which is the command mode version of the GRAZTRACE software originally developed by MSFC. A structural data interface has been developed for the EAL (old SPAR) finite element analysis (FEA) program, which is being used by the MSFC Structural Analysis group for the analysis of AXAF-I. This interface utility can read the structural deformation file from EAL and other finite element analysis programs such as NASTRAN and COSMOS/M, and convert the data to a suitable format that can be used for deformation ray-tracing to predict the image quality for a distorted mirror. There is a provision in this utility to expand the data from finite element models assuming 180 degrees symmetry. This utility has been used to predict image characteristics for the AXAF-I HRMA, when subjected to gravity effects in the horizontal x-ray ground test configuration. The development of the metrology data processing interface software has also been completed. It can read the HDOS FITS format surface map files, manipulate and filter the metrology data, and produce a deformation file which can be used by GT for ray tracing of the mirror surface figure errors. This utility has been used to determine the optimum alignment (axial spacing and clocking) for the four pairs of AXAF-I mirrors. Based on this optimized alignment, the geometric images and effective focal lengths for the as-built mirrors were predicted to cross-check the results obtained by Kodak.
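
    The 180-degree symmetry expansion mentioned in this record can be pictured with a small sketch: mirror every off-plane node of the half model across the symmetry plane and append it to the node set. This is a toy illustration only; the real utility's EAL/NASTRAN file handling and sign conventions are not shown, and the coordinates below are invented.

    import numpy as np

    def expand_half_model(nodes):
        # nodes: (N, 3) array of (x, y, z) coordinates for half of the model.
        # Mirror across the x-z symmetry plane (y -> -y) and append, skipping
        # nodes that lie on the plane itself to avoid duplicating them.
        off_plane = nodes[np.abs(nodes[:, 1]) > 1e-12]
        mirrored = off_plane * np.array([1.0, -1.0, 1.0])
        return np.vstack([nodes, mirrored])

    half = np.array([[1.0, 0.0, 2.0], [1.0, 0.5, 2.0]])
    print(expand_half_model(half))  # three nodes: the on-plane node is not doubled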

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peck, T; Sparkman, D; Storch, N

    ''The LLNL Site-Specific Advanced Simulation and Computing (ASCI) Software Quality Engineering Recommended Practices VI.I'' document describes a set of recommended software quality engineering (SQE) practices for ASCI code projects at Lawrence Livermore National Laboratory (LLNL). In this context, SQE is defined as the process of building quality into software products by applying the appropriate guiding principles and management practices. Continual code improvement and ongoing process improvement are expected benefits. Certain practices are recommended, although projects may select the specific activities they wish to improve, and the appropriate time lines for such actions. Additionally, projects can rely on the guidance of this document when generating ASCI Verification and Validation (V&V) deliverables. ASCI program managers will gather information about their software engineering practices and improvement. This information can be shared to leverage the best SQE practices among development organizations. It will further be used to ensure the currency and vitality of the recommended practices. This Overview is intended to provide basic information to the LLNL ASCI software management and development staff from the ''LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices VI.I'' document. Additionally, the Overview provides steps to using the ''LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices VI.I'' document. For definitions of terminology and acronyms, refer to the Glossary and Acronyms sections in the ''LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices VI.I''.

  16. Development of the FITS tools package for multiple software environments

    NASA Technical Reports Server (NTRS)

    Pence, W. D.; Blackburn, J. K.

    1992-01-01

    The HEASARC is developing a package of general purpose software for analyzing data files in FITS format. This paper describes the design philosophy which makes the software both machine-independent (it runs on VAXs, Suns, and DEC-stations) and software environment-independent. Currently the software can be compiled and linked to produce IRAF tasks, or alternatively, the same source code can be used to generate stand-alone tasks using one of two implementations of a user-parameter interface library. The machine independence of the software is achieved by writing the source code in ANSI standard Fortran or C, using the machine-independent FITSIO subroutine interface for all data file I/O, and using a standard user-parameter subroutine interface for all user I/O. The latter interface is based on the Fortran IRAF Parameter File interface developed at STScI. The IRAF tasks are built by linking to the IRAF implementation of this parameter interface library. Two other implementations of this parameter interface library, which have no IRAF dependencies, are now available which can be used to generate stand-alone executable tasks. These stand-alone tasks can simply be executed from the machine operating system prompt either by supplying all the task parameters on the command line or by entering the task name after which the user will be prompted for any required parameters. A first release of this FTOOLS package is now publicly available. The currently available tasks are described, along with instructions on how to obtain a copy of the software.
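
    For readers who want to experiment with the same style of machine-independent FITS access today, a minimal sketch using astropy.io.fits (a modern substitute for, not the same as, the Fortran FITSIO interface described above; the file name is a placeholder) might look like this:

    from astropy.io import fits

    with fits.open("example.fits") as hdul:   # placeholder file name
        hdul.info()                           # list the HDUs, like a simple FTOOL
        header = hdul[0].header
        print(header.get("TELESCOP", "unknown"))  # a standard FITS keyword
        data = hdul[0].data                   # primary array, if present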

  17. Form versus Function: Using Technology to Develop Individualized Education Programs for Students with Disabilities

    ERIC Educational Resources Information Center

    Wilson, Gloria Lodato; Michaels, Craig A.; Margolis, Howard

    2005-01-01

    This article discusses the use of IEP software applications from the perspectives of form (i.e., legally correct documents) and function (i.e., educationally appropriate individualized programs). The article provides an overview of the basic components of two fairly comprehensive IEP software programs and discusses the general strengths and…

  18. Flight Planning Branch NASA Co-op Tour

    NASA Technical Reports Server (NTRS)

    Marr, Aja M.

    2013-01-01

    This semester I worked with the Flight Planning Branch at the NASA Johnson Space Center. I learned about the different aspects of flight planning for the International Space Station as well as the software that is used internally and ISSLive! which is used to help educate the public on the space program. I had the opportunity to do on the job training in the Mission Control Center with the planning team. I transferred old timeline records from the planning team's old software to the new software in order to preserve the data for the future when the software is retired. I learned about the operations of the International Space Station, the importance of good communication between the different parts of the planning team, and enrolled in professional development classes as well as technical classes to learn about the space station.

  19. Technical Performance Assessment: Mission Success in Software Acquisition Management

    DTIC Science & Technology

    2010-04-27

    Examples: Design constraints make software acquisition and development extremely critical. Application domain – Operational Flight Program, Air... environment – used to produce the software. Risk management – established and maintained risk management systems. Milestone reviews...

  20. 48 CFR 970.5227-1 - Rights in data-facilities.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... software. (2) Computer software, as used in this clause, means (i) computer programs which are data... software. The term “data” does not include data incidental to the administration of this contract, such as... this clause, means data, other than computer software, developed at private expense that embody trade...

  1. A self-referential HOWTO on release engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galassi, Mark C.

    Release engineering is a fundamental part of the software development cycle: it is the point at which quality control is exercised and bug fixes are integrated. The way in which software is released also gives the end user her first experience of a software package, while in scientific computing release engineering can guarantee reproducibility. For these reasons and others, the release process is a good indicator of the maturity and organization of a development team. Software teams often do not put in place a release process at the beginning. This is unfortunate because the team does not have early and continuous execution of test suites, and it does not exercise the software in the same conditions as the end users. I describe an approach to release engineering based on the software tools developed and used by the GNU project, together with several specific proposals related to packaging and distribution. I do this in a step-by-step manner, demonstrating how this very paper is written and built using proper release engineering methods. Because many aspects of release engineering are not exercised in the building of the paper, the accompanying software repository also contains examples of software libraries.

  2. Waveform Developer's Guide for the Integrated Power, Avionics, and Software (iPAS) Space Telecommunications Radio System (STRS) Radio

    NASA Technical Reports Server (NTRS)

    Shalkhauser, Mary Jo W.; Roche, Rigoberto

    2017-01-01

    The Space Telecommunications Radio System (STRS) provides a common, consistent framework for software defined radios (SDRs) to abstract the application software from the radio platform hardware. The STRS standard aims to reduce the cost and risk of using complex, configurable and reprogrammable radio systems across NASA missions. To promote the use of the STRS architecture for future NASA advanced exploration missions, NASA Glenn Research Center (GRC) developed an STRS-compliant SDR on a radio platform used by the Advance Exploration System program at the Johnson Space Center (JSC) in their Integrated Power, Avionics, and Software (iPAS) laboratory. The iPAS STRS Radio was implemented on the Reconfigurable, Intelligently-Adaptive Communication System (RIACS) platform, currently being used for radio development at JSC. The platform consists of a Xilinx™ ML605 Virtex™-6 FPGA board, an Analog Devices FMCOMMS1-EBZ RF transceiver board, and an Embedded PC (Axiomtek™ eBox 620-110-FL) running the Ubuntu 12.4 operating system. The result of this development is a very low cost STRS-compliant platform that can be used for waveform developments for multiple applications. The purpose of this document is to describe how to develop a new waveform using the RIACS platform, the Very High Speed Integrated Circuits (VHSIC) Hardware Description Language (VHDL) FPGA wrapper code, and the STRS implementation on the Axiomtek processor.

  3. Applying CASE Tools for On-Board Software Development

    NASA Astrophysics Data System (ADS)

    Brammer, U.; Hönle, A.

    For many space projects the software development is facing great pressure with respect to quality, costs and schedule. One way to cope with these challenges is the application of CASE tools for automatic generation of code and documentation. This paper describes two CASE tools: Rhapsody (I-Logix) featuring UML and ISG (BSSE) that provides modeling of finite state machines. Both tools have been used at Kayser-Threde in different space projects for the development of on-board software. The tools are discussed with regard to the full software development cycle.

  4. Automation of the Environmental Control and Life Support System

    NASA Technical Reports Server (NTRS)

    Dewberry, Brandon S.; Carnes, J. Ray

    1990-01-01

    The objective of the Environmental Control and Life Support System (ECLSS) Advanced Automation Project is to recommend and develop advanced software for the initial and evolutionary Space Station Freedom (SSF) ECLS system which will minimize the crew and ground manpower needed for operations. Another objective includes capturing ECLSS design and development knowledge for future missions. This report summarizes our results from Phase I, the ECLSS domain analysis phase, which we broke down into three steps: 1) analyze and document the baselined ECLS system, 2) envision as our goal an evolution to a fully automated regenerative life support system, built upon an augmented baseline, and 3) document the augmentations (hooks and scars) and advanced software systems which we see as necessary in achieving minimal manpower support for ECLSS operations. In addition, Phase I included development of advanced software life cycle testing tools that will be used in the development of the software, in preparation for Phases II and III, the development and integration phases, respectively. Automated knowledge acquisition, engineering, and verification can capture ECLSS development knowledge for future use, develop more robust and complex software, provide feedback to the KBS tool community, and ensure proper visibility of our efforts.

  5. Data and Analysis Center for Software.

    DTIC Science & Technology

    1980-06-01

    can make use of it in their day-to-day activities of developing, maintaining, and managing software. The bibliographic collection is composed of... which refer to development, design, or programming approaches which view a software system, component, or module in terms of its required or intended... "practices" are also included in this group. PROCEDURES (1 keyword) Procedures is a term used ambiguously in the literature to refer to functions

  6. [Prenatal risk calculation: comparison between Fast Screen pre I plus software and ViewPoint software. Evaluation of the risk calculation algorithms].

    PubMed

    Morin, Jean-François; Botton, Eléonore; Jacquemard, François; Richard-Gireme, Anouk

    2013-01-01

    The Fetal Medicine Foundation (FMF) has developed a new algorithm called Prenatal Risk Calculation (PRC) to evaluate Down syndrome screening based on free hCGβ, PAPP-A and nuchal translucency. The peculiarity of this algorithm is that it uses the degree of extremeness (DoE) instead of the multiple of the median (MoM). Biologists measuring maternal serum markers on Kryptor™ machines (Thermo Fisher Scientific) use the Fast Screen pre I plus software for the prenatal risk calculation. This software integrates the PRC algorithm. Our study evaluates the data of 2,092 patient files, of which 19 show a fœtal abnormality. These files were first evaluated with the ViewPoint software, which is based on MoM. The link between DoE and MoM was analyzed and the different calculated risks compared. The study shows that Fast Screen pre I plus software gives the same risk results as ViewPoint software, but yields significantly fewer false positive results.
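
    For context, a multiple of the median (MoM) is simply the observed marker level divided by the median level expected at the same gestational age. A minimal sketch, with made-up medians that do not come from either software package:

    # Hypothetical gestational-age medians; real screening software derives
    # these from large reference populations, not a three-entry table.
    MEDIANS_PAPP_A = {11: 1.2, 12: 1.6, 13: 2.1}  # IU/L by completed week

    def mom(observed, week, medians=MEDIANS_PAPP_A):
        # MoM = observed marker level / median level for that gestational age
        return observed / medians[week]

    print(mom(0.8, 12))  # 0.5 MoM, i.e. PAPP-A at half the expected median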

  7. 32 CFR 37.550 - May I accept intellectual property as cost sharing?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... software) as cost sharing, because: (1) It is difficult to assign values to these intangible contributions... offer the use of commercially available software for which there is an established license fee for use of the product. The costs of the development of the software would not be a reasonable basis for...

  8. 32 CFR 37.550 - May I accept intellectual property as cost sharing?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... offer the use of commercially available software for which there is an established license fee for use of the product. The costs of the development of the software would not be a reasonable basis for... software) as cost sharing, because: (1) It is difficult to assign values to these intangible contributions...

  9. 48 CFR 52.250-5 - SAFETY Act-Equitable Adjustment.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., engineering services, software development services, software integration services, threat assessments... security, i.e., it will perform as intended, conforms to the seller's specifications, and is safe for use...

  10. 48 CFR 52.250-5 - SAFETY Act-Equitable Adjustment.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ..., engineering services, software development services, software integration services, threat assessments... security, i.e., it will perform as intended, conforms to the seller's specifications, and is safe for use...

  11. 48 CFR 52.250-5 - SAFETY Act-Equitable Adjustment.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ..., engineering services, software development services, software integration services, threat assessments... security, i.e., it will perform as intended, conforms to the seller's specifications, and is safe for use...

  12. 48 CFR 52.250-5 - SAFETY Act-Equitable Adjustment.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., engineering services, software development services, software integration services, threat assessments... security, i.e., it will perform as intended, conforms to the seller's specifications, and is safe for use...

  13. Developing Avionics Hardware and Software for Rocket Engine Testing

    NASA Technical Reports Server (NTRS)

    Aberg, Bryce Robert

    2014-01-01

    My summer was spent working as an intern at Kennedy Space Center in the Propulsion Avionics Branch of the NASA Engineering Directorate Avionics Division. The work that I was involved with was part of Rocket University's Project Neo, a small scale liquid rocket engine test bed. I began by learning about the layout of Neo in order to more fully understand what was required of me. I then developed software in LabView to gather and scale data from two flowmeters and integrated that code into the main control software. Next, I developed more LabView code to control an igniter circuit and integrated that into the main software, as well. Throughout the internship, I performed work that mechanics and technicians would do in order to maintain and assemble the engine.

  14. Test Driven Development of Scientific Models

    NASA Technical Reports Server (NTRS)

    Clune, Thomas L.

    2012-01-01

    Test-Driven Development (TDD) is a software development process that promises many advantages for developer productivity and has become widely accepted among professional software engineers. As the name suggests, TDD practitioners alternate between writing short automated tests and producing code that passes those tests. Although this overly simplified description will undoubtedly sound prohibitively burdensome to many uninitiated developers, the advent of powerful unit-testing frameworks greatly reduces the effort required to produce and routinely execute suites of tests. By testimony, many developers find TDD to be addicting after only a few days of exposure, and find it unthinkable to return to previous practices. Of course, scientific/technical software differs from other software categories in a number of important respects, but I nonetheless believe that TDD is quite applicable to the development of such software and has the potential to significantly improve programmer productivity and code quality within the scientific community. After a detailed introduction to TDD, I will present the experience within the Software Systems Support Office (SSSO) in applying the technique to various scientific applications. This discussion will emphasize the various direct and indirect benefits as well as some of the difficulties and limitations of the methodology. I will conclude with a brief description of pFUnit, a unit testing framework I co-developed to support test-driven development of parallel Fortran applications.
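
    A minimal red-green cycle of the kind described above might look like the following, using Python's unittest rather than pFUnit (which targets parallel Fortran): the tests are written first, then the function is made to pass them. The formula choice and tolerances are illustrative assumptions, not SSSO code.

    import math
    import unittest

    def saturation_vapor_pressure(t_celsius):
        # Magnus approximation in hPa (an illustrative formula choice).
        return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

    class TestSaturationVaporPressure(unittest.TestCase):
        # In TDD these tests exist first and fail until the function is written.
        def test_value_at_freezing_point(self):
            self.assertAlmostEqual(saturation_vapor_pressure(0.0), 6.112, places=3)

        def test_increases_with_temperature(self):
            self.assertGreater(saturation_vapor_pressure(20.0),
                               saturation_vapor_pressure(10.0))

    if __name__ == "__main__":
        unittest.main()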

  15. The International River Interface Cooperative: Public Domain Software for River Flow and Morphodynamics (Invited)

    NASA Astrophysics Data System (ADS)

    Nelson, J. M.; Shimizu, Y.; McDonald, R.; Takebayashi, H.

    2009-12-01

    The International River Interface Cooperative is an informal organization made up of academic faculty and government scientists with the goal of developing, distributing and providing education for a public-domain software interface for modeling river flow and morphodynamics. Formed in late 2007, the group released the first version of this interface (iRIC) in late 2009. iRIC includes models for two and three-dimensional flow, sediment transport, bed evolution, groundwater-surface water interaction, topographic data processing, and habitat assessment, as well as comprehensive data and model output visualization, mapping, and editing tools. All the tools in iRIC are specifically designed for use in river reaches and utilize common river data sets. The models are couched within a single graphical user interface so that a broad spectrum of models is available to users without learning new pre- and post-processing tools. The first version of iRIC was developed by combining the USGS public-domain Multi-Dimensional Surface Water Modeling System (MD_SWMS), developed at the USGS Geomorphology and Sediment Transport Laboratory in Golden, Colorado, with the public-domain river modeling code NAYS developed by the Universities of Hokkaido and Kyoto, Mizuho Corporation, and the Foundation of the River Disaster Prevention Research Institute in Sapporo, Japan. Since this initial effort, other universities and agencies have joined the group, and the interface has been expanded to allow users to integrate their own modeling code using the Extensible Markup Language (XML), which provides easy access and expandability to the iRIC software interface. In this presentation, the current components of iRIC are described and results from several practical modeling applications are presented to illustrate the capabilities and flexibility of the software. In addition, some future extensions to iRIC are demonstrated, including software for Lagrangian particle tracking and the prediction of bedform development and response to time-varying flows. Education and supporting documentation for iRIC, including detailed tutorials, are available at www.i-ric.org. The iRIC model codes, interface, and all supporting documentation are in the public domain.

  16. "Test" is a Four Letter Word

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pope, G M

    2005-05-03

    For a number of years I had the pleasure of teaching Testing Seminars all over the world and meeting and learning from others in our field. Over a twelve year period, I always asked the following questions to Software Developers, Test Engineers, and Managers who took my two or three day seminar on Software Testing: 'When was the first time you heard the word test'? 'Where were you when you first heard the word test'? 'Who said the word test'? 'How did the word test make you feel'? Most of the thousands of responses were similar to 'It was my third grade teacher at school, and I felt nervous and afraid'. Now there were a few exceptions like 'It was my third grade teacher, and I was happy and excited to show how smart I was'. But by and large, my informal survey found that 'testing' is a word to which most people attach negative meanings, based on its historical context. So why is this important to those of us in the software development business? Because I have found that a preponderance of software developers do not get real excited about hearing that the software they just wrote is going to be 'tested' by the Test Group. Typical reactions I have heard over the years run from: 'I'm sure there is nothing wrong with the software, so go ahead and test it, better you find defects than our customers'. to these extremes: 'There is no need to test my software because there is nothing wrong with it'. 'You are not qualified to test my software because you don't know as much as I do about it'. 'If any Test Engineers come into our office again to test our software we will throw them through the third floor window'. So why is there such a strong negative reaction to testing? It is primitive. It goes back to grade school for many of us. It is a negative word that conjures up negative emotions. In other words, 'test' is a four letter word. How many of us associate 'Joy' with 'Test'? Not many. It is hard for most of us to reprogram associations learned at an early age. So what can we do about it (short of hypnotic therapy for software developers)? Well one concept I have used (and still use) is to not call testing 'testing'. Call it something else. Ever wonder why most of the Independent Software Testing groups are called Software Quality Assurance groups? Now you know. Software Quality Assurance is not such a negatively charged phrase, even though Software Quality Assurance is much more than simply testing. It was a real blessing when the concept of Validation and Verification came about for software. Now I define Validation to mean assuring that the product produced does the right thing (usually what the customer wants it to do), and verification means that the product was built the right way (in accordance with some good design principles and practices). So I have deliberately called the System Test Group the Verification and Validation Group, or V&V Group, as a way of avoiding the negative image problem. I remember once having a conversation with a developer colleague who said, in the heat of battle, that it was fine to V&V his code, just don't test it! Once again V&V includes many things besides testing, but it just doesn't sound like an onerous thing to do to software. In my current job, working at a highly regarded national laboratory with world renowned physicists, I have again encountered the negativity about testing software. Except here they don't take kindly to Software Quality Assurance or Software Verification and Validation either.
After all, software is just a trivial tool to automate algorithms that implement physics models. Testing, SQA, and V&V take time and get in the way of completing ground breaking science experiments. So I have again had to change the name of software testing to something less negative in the physics world. I found (the hard way) that if I requested more time to do software experimentation, the physicists' resistance melted. And so the conversation continues, 'We have time to run more software experiments. Just don't waste any time testing the software'! In case the concept of not calling testing 'testing' appeals to you, and there may be an opportunity for you to take the sting out of the name at your place of employment, I have compiled a table of things that testing could be called besides 'testing'. Of course we can embellish this by adding some good sounding prefixes and suffixes also. To come up with alternate names for testing, pick a word from columns A, B, and C in the table below. For instance Unified Acceptance Trials (A2,B7,C3) or Tailored Observational Demonstration (A6,B5,C5) or Agile Criteria Scoring (A3,B8,C8) or Rapid Requirement Proof (A1,B9,C7) or Satisfaction Assurance (B10,C1). You can probably think of some additional combinations appropriate for your industry.

  17. Organization and use of a Software/Hardware Avionics Research Program (SHARP)

    NASA Technical Reports Server (NTRS)

    Karmarkar, J. S.; Kareemi, M. N.

    1975-01-01

    The organization and use of the Software/Hardware Avionics Research Program (SHARP), developed to duplicate the automatic portion of the STOLAND simulator system on a general-purpose computer system (i.e., an IBM 360), are described. The program's uses are: (1) to conduct comparative evaluation studies of current and proposed airborne and ground system concepts via single-run or Monte Carlo simulation techniques, and (2) to provide a software tool for efficient algorithm evaluation and development for the STOLAND avionics computer.
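
    A toy sketch of the Monte Carlo evaluation mode this record describes: run each candidate algorithm many times against randomized disturbances and compare the average performance. The error model and gains are invented for illustration and bear no relation to the actual STOLAND system.

    import random

    def touchdown_error(gain, wind_sigma):
        # Residual cross-track error after one corrective control pass.
        wind = random.gauss(0.0, wind_sigma)
        return abs(wind * (1.0 - gain))

    def monte_carlo(gain, runs=10000, wind_sigma=5.0):
        random.seed(42)  # repeatable runs for a fair comparison
        return sum(touchdown_error(gain, wind_sigma) for _ in range(runs)) / runs

    print("baseline algorithm :", monte_carlo(gain=0.7))
    print("candidate algorithm:", monte_carlo(gain=0.9))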

  18. Desktop Publishing on the Macintosh: A Software Perspective.

    ERIC Educational Resources Information Center

    Devan, Steve

    1987-01-01

    Discussion of factors to be considered in selecting desktop publishing software for the Macintosh microcomputer focuses on the two approaches to such software, i.e., batch and interactive, and three technical considerations, i.e., document, text, and graphics capabilities. Some new developments in graphics software are also briefly described. (MES)

  19. Hardware Interface Description for the Integrated Power, Avionics, and Software (iPAS) Space Telecommunications Radio System (STRS) Radio

    NASA Technical Reports Server (NTRS)

    Shalkhauser, Mary Jo W.; Roche, Rigoberto

    2017-01-01

    The Space Telecommunications Radio System (STRS) provides a common, consistent framework for software defined radios (SDRs) to abstract the application software from the radio platform hardware. The STRS standard aims to reduce the cost and risk of using complex, configurable and reprogrammable radio systems across NASA missions. To promote the use of the STRS architecture for future NASA advanced exploration missions, NASA Glenn Research Center (GRC) developed an STRS-compliant SDR on a radio platform used by the Advance Exploration System program at the Johnson Space Center (JSC) in their Integrated Power, Avionics, and Software (iPAS) laboratory. The iPAS STRS Radio was implemented on the Reconfigurable, Intelligently-Adaptive Communication System (RIACS) platform, currently being used for radio development at JSC. The platform consists of a Xilinx ML605 Virtex-6 FPGA board, an Analog Devices FMCOMMS1-EBZ RF transceiver board, and an Embedded PC (Axiomtek eBox 620-110-FL) running the Ubuntu 12.4 operating system. Figure 1 shows the RIACS platform hardware. The result of this development is a very low cost STRS-compliant platform that can be used for waveform developments for multiple applications. The purpose of this document is to describe how to develop a new waveform using the RIACS platform, the Very High Speed Integrated Circuits (VHSIC) Hardware Description Language (VHDL) FPGA wrapper code, and the STRS implementation on the Axiomtek processor.

  20. Investigation into the development of computer aided design software for space based sensors

    NASA Technical Reports Server (NTRS)

    Pender, C. W.; Clark, W. L.

    1987-01-01

    The described effort is phase one of the development of Computer Aided Design (CAD) software to be used to perform radiometric sensor design. The software package will be referred to as SCAD and is directed toward the preliminary phase of the design of space-based sensor systems. The approach being followed is to develop a modern, graphics-intensive, user-friendly software package using existing software as building blocks. The emphasis will be directed toward the development of a shell containing menus, smart defaults, and interfaces, which can accommodate a wide variety of existing application software packages. The shell will offer expected utilities such as graphics, tailored menus, and a variety of drivers for I/O devices. Following the development of the shell, the development of SCAD is planned chiefly as selection and integration of appropriate building blocks. The phase one development activities have included: the selection of hardware which will be used with SCAD; the determination of the scope of SCAD; the preliminary evaluation of a number of software packages for applicability to SCAD; determination of a method for achieving required capabilities where voids exist; and the establishment of a strategy for binding the software modules into an easy-to-use tool kit.

  1. Multiplexing the Ethernet Interface Among VAX/VMS Users.

    DTIC Science & Technology

    1983-12-01

    Microcomputer Development Systems (MDS) and VAX/VMS systems were used for the implementation and testing of the project. The software is designed in such a...detailed design and implementation of the project. In reality, chapter I constitutes the documentation of the developed software. In some instances, things that...changes. Chapter 4 provides a high-level design of a virtual network, along with some hints which may be useful to those who will work in this

  2. A Software Engineering Approach based on WebML and BPMN to the Mediation Scenario of the SWS Challenge

    NASA Astrophysics Data System (ADS)

    Brambilla, Marco; Ceri, Stefano; Valle, Emanuele Della; Facca, Federico M.; Tziviskou, Christina

    Although Semantic Web Services are expected to produce a revolution in the development of Web-based systems, very few enterprise-wide design experiences are available; one of the main reasons is the lack of sound Software Engineering methods and tools for the deployment of Semantic Web applications. In this chapter, we present an approach to software development for the Semantic Web based on classical Software Engineering methods (i.e., formal business process development, computer-aided and component-based software design, and automatic code generation) and on semantic methods and tools (i.e., ontology engineering, semantic service annotation and discovery).

  3. Automated Reuse of Scientific Subroutine Libraries through Deductive Synthesis

    NASA Technical Reports Server (NTRS)

    Lowry, Michael R.; Pressburger, Thomas; VanBaalen, Jeffrey; Roach, Steven

    1997-01-01

    Systematic software construction offers the potential of elevating software engineering from an art-form to an engineering discipline. The desired result is more predictable software development leading to better quality and more maintainable software. However, the overhead costs associated with the formalisms, mathematics, and methods of systematic software construction have largely precluded their adoption in real-world software development. In fact, many mainstream software development organizations, such as Microsoft, still maintain a predominantly oral culture for software development projects, which is far removed from a formalism-based culture for software development. An exception is the limited domain of safety-critical software, where the high-assurance inherent in systematic software construction justifies the additional cost. We believe that systematic software construction will only be adopted by mainstream software development organizations when the overhead costs have been greatly reduced. Two approaches to cost mitigation are reuse (amortizing costs over many applications) and automation. For the last four years, NASA Ames has funded the Amphion project, whose objective is to automate software reuse through techniques from systematic software construction. In particular, deductive program synthesis (i.e., program extraction from proofs) is used to derive a composition of software components (e.g., subroutines) that correctly implements a specification. The construction of reuse libraries of software components is the standard software engineering solution for improving software development productivity and quality.
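
    A drastically simplified way to picture "deriving a composition of software components that implements a specification" is a search over a library of components whose input/output types chain from the specification's input type to its output type. The sketch below is only that toy picture, with an invented component library; Amphion's actual deductive synthesis extracts programs from proofs rather than searching over types.

    from collections import deque

    # An invented component library: name -> (input type, output type).
    LIBRARY = {
        "julian_date": ("utc_time", "jd"),
        "earth_position": ("jd", "earth_vec"),
        "normalize": ("earth_vec", "unit_vec"),
    }

    def synthesize(in_type, out_type):
        # Breadth-first search for a chain of components from in_type to out_type.
        queue = deque([(in_type, [])])
        seen = {in_type}
        while queue:
            current, plan = queue.popleft()
            if current == out_type:
                return plan
            for name, (src, dst) in LIBRARY.items():
                if src == current and dst not in seen:
                    seen.add(dst)
                    queue.append((dst, plan + [name]))
        return None  # no composition exists in this library

    print(synthesize("utc_time", "unit_vec"))
    # ['julian_date', 'earth_position', 'normalize']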

  4. [Utility of Smartphone in Home Care Medicine - First Trial].

    PubMed

    Takeshige, Toshiyuki; Hirano, Chiho; Nakagawa, Midori; Yoshioka, Rentaro

    2015-12-01

    The use of video calls for home care can reduce anxiety and offer patients peace of mind. The most suitable terminals at facilities to support home care have been iPad Air and iPhone with FaceTime software. However, usage has been limited to specific terminals. In order to eliminate the need for special terminals and software, we have developed a program that has been customized to meet the needs of facilities using Web Real Time Communication (WebRTC) in cooperation with the University of Aizu. With this software, video calls can accommodate the large number of home care patients.

  5. Software Prototyping: Designing Systems for Users.

    ERIC Educational Resources Information Center

    Spies, Phyllis Bova

    1983-01-01

    Reports on a major change in the computer software development process--the prototype model, i.e., implementation of a skeletal system that is enhanced during interaction with users. Expensive and unreliable software, software design errors, traditional development approach, resources required for prototyping, success stories, and systems designer's role…

  6. SMC Message Browser Projects

    NASA Technical Reports Server (NTRS)

    Wichmann, Benjamin C.

    2013-01-01

    I work directly with the System Monitoring and Control (SMC) software engineers who develop, test and release custom and commercial software in support of the Kennedy Space Center Spaceport Command and Control System (SCCS). SMC uses Commercial Off-The-Shelf (COTS) Enterprise Management Systems (EMS) software which provides a centralized subsystem for configuring, monitoring, and controlling SCCS hardware and software used in the Control Rooms. There are multiple projects being worked on using the COTS EMS software. I am currently working with the HP Operations Manager for UNIX (OMU) software, which allows Master Console Operators (MCO) to access, view and interpret messages regarding the status of the SCCS hardware and software. The OMU message browser gets cluttered with messages, which can make it difficult for the MCO to manage. My main project involves determining ways to reduce the number of messages being displayed in the OMU message browser. I plan to accomplish this task in two different ways: (1) by correlating multiple messages into one single message being displayed, and (2) by creating policies that will determine the significance of each message and whether or not it needs to be displayed to the MCO. The core idea is to lessen the number of messages being sent to the OMU message browser so the MCO can use it more effectively.
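
    A generic sketch of the two reduction ideas described in this record, collapsing repeated messages into a single counted entry and dropping messages below a severity threshold, follows. HP OMU has its own policy mechanisms; this is only an illustration of the concepts, with invented message fields.

    from collections import Counter

    SEVERITY = {"normal": 0, "warning": 1, "minor": 2, "major": 3, "critical": 4}

    def reduce_messages(messages, min_severity="warning"):
        # Drop messages below the threshold, then collapse identical
        # (source, text, severity) triples into one entry with a count.
        threshold = SEVERITY[min_severity]
        kept = [m for m in messages if SEVERITY[m["severity"]] >= threshold]
        counts = Counter((m["source"], m["text"], m["severity"]) for m in kept)
        return [{"source": s, "text": t, "severity": sev, "count": n}
                for (s, t, sev), n in counts.items()]

    msgs = [
        {"source": "gse-12", "text": "link lost", "severity": "major"},
        {"source": "gse-12", "text": "link lost", "severity": "major"},
        {"source": "cons-3", "text": "heartbeat", "severity": "normal"},
    ]
    print(reduce_messages(msgs))  # one 'link lost' entry with count=2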

  7. User-friendly tools on handheld devices for observer performance study

    NASA Astrophysics Data System (ADS)

    Matsumoto, Takuya; Hara, Takeshi; Shiraishi, Junji; Fukuoka, Daisuke; Abe, Hiroyuki; Matsusako, Masaki; Yamada, Akira; Zhou, Xiangrong; Fujita, Hiroshi

    2012-02-01

    ROC studies require complex procedures to select cases from many data samples and to set confidence levels for each selected case to generate ROC curves. In some observer performance studies, researchers have to develop software with a specific graphical user interface (GUI) to obtain confidence levels from readers. Because ROC studies can be designed for various clinical situations, preparing software corresponding to every ROC study is a difficult task. In this work, we have developed software for recording confidence levels during observer studies on small personal handheld devices such as the iPhone, iPod touch, and iPad. To confirm the functions of our software, three radiologists performed observer studies to detect lung nodules using the public database of chest radiograms published by the Japan Society of Radiological Technology. The output in text format conforms to the format of the well-known ROC kit from the University of Chicago. The time required to read each case was also recorded precisely.

  8. Software Quality Assurance Metrics

    NASA Technical Reports Server (NTRS)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements as well as a degree of excellence and refinement of a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated; the set includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product: the process is measured to improve it, and the product is measured to increase quality throughout the life cycle of the software. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce it, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of software development can be measured. If software metrics are implemented in software development, they can save time and money and allow the organization to identify the causes of the defects that have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects but are not currently being used by the SA team, and to report them to the Software Assurance team to see if any can be implemented in their software assurance life cycle process.

  9. Software for Experimental Air-Ground Data Link Volume I : Functional Description and Flowcharts.

    DOT National Transportation Integrated Search

    1975-10-01

    Experimental Data Link System which was implemented for flight test during the Air-Ground Data Link Development Program (FAA-TSC Project Number FA-13). The software development is presented in three volumes as follows: Volume I: Functional Des...

  10. A workflow learning model to improve geovisual analytics utility

    PubMed Central

    Roth, Robert E; MacEachren, Alan M; McCabe, Craig A

    2011-01-01

    Introduction This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. Objectives The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use) and utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important, if not more so, than iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. Methodology The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application. To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on the concept of scientific workflows. Second, we implemented an interface in the G-EX Portal Learn Module to demonstrate the workflow learning model. The workflow interface allows users to drag learning artifacts uploaded to the G-EX Portal onto a central whiteboard and then annotate the workflow using text and drawing tools. Once completed, users can visit the assembled workflow to get an idea of the kind, number, and scale of analysis steps, view individual learning artifacts associated with each node in the workflow, and ask questions about the overall workflow or individual learning artifacts through the associated forums. An example learning workflow in the domain of epidemiology is provided to demonstrate the effectiveness of the approach. Results/Conclusions In the context of geovisual analytics, GIScientists are not only responsible for developing software to facilitate visually-mediated reasoning about large and complex spatiotemporal information, but also for ensuring that this software works. The workflow learning model discussed in this paper and demonstrated in the G-EX Portal Learn Module is one approach to improving the utility of geovisual analytics software. While development of the G-EX Portal Learn Module is ongoing, we expect to release it by Summer 2009. PMID:21983545

  11. A workflow learning model to improve geovisual analytics utility.

    PubMed

    Roth, Robert E; MacEachren, Alan M; McCabe, Craig A

    2009-01-01

    INTRODUCTION: This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. OBJECTIVES: The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use) and utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important, if not more so, than iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. METHODOLOGY: The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application. To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on the concept of scientific workflows. Second, we implemented an interface in the G-EX Portal Learn Module to demonstrate the workflow learning model. The workflow interface allows users to drag learning artifacts uploaded to the G-EX Portal onto a central whiteboard and then annotate the workflow using text and drawing tools. Once completed, users can visit the assembled workflow to get an idea of the kind, number, and scale of analysis steps, view individual learning artifacts associated with each node in the workflow, and ask questions about the overall workflow or individual learning artifacts through the associated forums. An example learning workflow in the domain of epidemiology is provided to demonstrate the effectiveness of the approach. RESULTS/CONCLUSIONS: In the context of geovisual analytics, GIScientists are not only responsible for developing software to facilitate visually-mediated reasoning about large and complex spatiotemporal information, but also for ensuring that this software works. The workflow learning model discussed in this paper and demonstrated in the G-EX Portal Learn Module is one approach to improving the utility of geovisual analytics software. While development of the G-EX Portal Learn Module is ongoing, we expect to release it by Summer 2009.

  12. Design and Development of a Virtual Facility Tour Using iPIX(TM) Technology

    NASA Technical Reports Server (NTRS)

    Farley, Douglas L.

    2002-01-01

    This report demonstrates how the capabilities of the iPIX virtual tour software, in conjunction with a web-based interface, create a unique and valuable system that gives users an efficient virtual capability to tour facilities while acquiring the necessary technical content. A user's guide to the Mechanics and Durability Branch's virtual tour is presented. The guide provides the user with instruction on operating both scripted and unscripted tours, as well as a discussion of the tours for Buildings 1148, 1205, and 1256 at NASA Langley Research Center. Furthermore, an in-depth discussion is presented on how to develop a virtual tour using the iPIX software interface with conventional HTML and JavaScript, with emphasis on the network and computing issues associated with using this capability. A discussion of how to take the iPIX pictures and how to manipulate and bond them together to form hemispherical images is also presented. Linking of images with additional multimedia content is discussed. Finally, a method to integrate the iPIX software with conventional HTML and JavaScript to facilitate linking with multimedia is presented.

  13. Virtual airway simulation to improve dexterity among novices performing fibreoptic intubation.

    PubMed

    De Oliveira, G S; Glassenberg, R; Chang, R; Fitzgerald, P; McCarthy, R J

    2013-10-01

    We developed a virtual reality software application (iLarynx) using the built-in accelerometer of the iPhone® or iPad® (Apple Inc., Cupertino, CA, USA) that mimics the hand movements required for fibreoptic intubation skills. Twenty novice medical students were randomly assigned to virtual airway training with the iLarynx software or to no additional training. Eight of the 10 subjects in the standard training group had at least one failed (> 120 s) attempt, compared with two of the 10 participants in the iLarynx group (p = 0.01). There were a total of 24 failed attempts in the standard training group and four in the iLarynx group (p < 0.005). Cusum analysis demonstrated continued improvement in the iLarynx group, but not in the standard training group. Virtual airway simulation using freely available software on a smartphone/tablet device improves dexterity among novices performing upper airway endoscopy. © 2013 The Association of Anaesthetists of Great Britain and Ireland.
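
    A minimal sketch of the kind of cusum learning-curve analysis mentioned above, assuming an illustrative acceptable failure rate p0 and a made-up attempt sequence (the paper's actual cusum parameters are not given in this record); a sustained downward slope indicates failures occurring less often than p0, i.e., improvement.

    ```python
    # CUSUM learning curve: running sum of (failure indicator - p0).
    p0 = 0.2                                    # assumed acceptable failure rate
    attempts = [1, 1, 0, 1, 0, 0, 0, 1, 0, 0]   # 1 = failed (> 120 s), 0 = success

    cusum, s = [], 0.0
    for failed in attempts:
        s += failed - p0        # each failure pushes the curve up, each success down
        cusum.append(s)

    print([round(v, 2) for v in cusum])
    ```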

  14. 2016 Research Outreach Program report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Hye Young; Kim, Yangkyu

    2016-10-13

    This paper is the research activity report for 4 weeks at LANL. Under the guidance of Dr. Lee, who performs nuclear physics research at LANSCE, LANL, I studied the Low Energy NZ (LENZ) setup and how to use it. First, I studied the LENZ chamber and Si detectors and worked on detector calibrations using ROOT (the CERN-developed data analysis tool) and Excel (Microsoft Office software). I also performed calibration experiments that measure alpha particles emitted from a Th-229 source using an S1-type Si detector, and checked the results with Dr. Lee.
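
    A minimal sketch of the linear energy calibration workflow described above, done here in Python rather than ROOT or Excel; the channel centroids and reference energies below are illustrative placeholders, not the actual Th-229 measurement values.

    ```python
    # Linear energy calibration: fit peak centroids (ADC channels) against
    # known alpha-line energies, then apply E = gain * channel + offset.
    import numpy as np

    channels = np.array([812.0, 901.0, 955.0])     # fitted peak centroids (illustrative)
    energies = np.array([4845.0, 5053.0, 5168.0])  # reference energies in keV (illustrative)

    gain, offset = np.polyfit(channels, energies, 1)
    print(f"E(keV) = {gain:.3f} * channel + {offset:.1f}")

    # Apply the calibration to a raw 1024-channel spectrum axis:
    raw_channels = np.arange(1024)
    calibrated_energy = gain * raw_channels + offset
    ```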

  15. Design of a Software for Calculating Isoelectric Point of a Polypeptide According to Their Net Charge Using the Graphical Programming Language LabVIEW

    ERIC Educational Resources Information Center

    Tovar, Glomen

    2018-01-01

    A software to calculate the net charge and to predict the isoelectric point (pI) of a polypeptide is developed in this work using the graphical programming language LabVIEW. Through this instrument, the net charges of the ionizable residues of the protein chains are calculated at different pH values and tabulated, the pI is predicted, and an Excel…
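
    A minimal Python sketch of the calculation the LabVIEW instrument performs: Henderson-Hasselbalch charges for each ionizable group are summed into a net charge, followed by a bisection search for the pH of zero net charge. The pKa values below are one common textbook set; the paper's table may differ.

    ```python
    # Net charge of a polypeptide vs. pH, and bisection for the isoelectric point.
    PKA_POS = {"Nterm": 9.0, "K": 10.5, "R": 12.5, "H": 6.0}   # basic groups
    PKA_NEG = {"Cterm": 2.0, "D": 3.9, "E": 4.1, "C": 8.3, "Y": 10.1}  # acidic groups

    def net_charge(seq, ph):
        groups = ["Nterm", "Cterm"] + list(seq)
        q = 0.0
        for g in groups:
            if g in PKA_POS:
                q += 1.0 / (1.0 + 10 ** (ph - PKA_POS[g]))
            elif g in PKA_NEG:
                q -= 1.0 / (1.0 + 10 ** (PKA_NEG[g] - ph))
        return q

    def isoelectric_point(seq, lo=0.0, hi=14.0, tol=1e-4):
        # net charge decreases monotonically with pH, so bisection converges
        while hi - lo > tol:
            mid = (lo + hi) / 2.0
            if net_charge(seq, mid) > 0:
                lo = mid        # still positive: pI lies at higher pH
            else:
                hi = mid
        return (lo + hi) / 2.0

    print(round(isoelectric_point("ACDKRH"), 2))
    ```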

  16. The Development of Ada (Trademark) Software for Secure Environments

    DTIC Science & Technology

    1986-05-23

    Telecommunications environment. This paper discusses software security and seeks to demonstrate how the Ada programming language can be utilized as a tool...complexity. We use abstraction in our lives every day to control complexity; the principles of abstraction for software engineering are no different...systems. These features directly support the modern software engineering principles discussed in the previous section. This is not surprising

  17. Ensemble Eclipse: A Process for Prefab Development Environment for the Ensemble Project

    NASA Technical Reports Server (NTRS)

    Wallick, Michael N.; Mittman, David S.; Shams, Khawaja, S.; Bachmann, Andrew G.; Ludowise, Melissa

    2013-01-01

    This software simplifies the process of having to set up an Eclipse IDE programming environment for the members of the cross-NASA center project, Ensemble. It achieves this by assembling all the necessary add-ons and custom tools/preferences. This software is unique in that it allows developers in the Ensemble Project (approximately 20 to 40 at any time) across multiple NASA centers to set up a development environment almost instantly and work on Ensemble software. The software automatically includes the source code repositories and other vital information and settings. The Eclipse IDE is an open-source development framework. The NASA (Ensemble-specific) version of the software includes Ensemble-specific plug-ins as well as settings for the Ensemble project. This software saves developers the time and hassle of setting up a programming environment, making sure that everything is set up in the correct manner for Ensemble development. Existing software (i.e., standard Eclipse) requires an intensive setup process that is both time-consuming and error prone. This software is built once by a single user and tested, allowing other developers to simply download and use the software.

  18. Integrated Power, Avionics, and Software (iPAS) Space Telecommunications Radio System (STRS) Radio User's Guide -- Advanced Exploration Systems (AES)

    NASA Technical Reports Server (NTRS)

    Roche, Rigoberto; Shalkhauser, Mary Jo Windmille

    2017-01-01

    The Integrated Power, Avionics, and Software (iPAS) software-defined radio (SDR) was implemented on the Reconfigurable, Intelligently-Adaptive Communication System (RAICS) platform for radio development at NASA Johnson Space Center. Software and hardware description language (HDL) code were delivered by NASA Glenn Research Center for use in the iPAS test bed and for development of Space Telecommunications Radio System (STRS) waveforms on the RAICS platform. The purpose of this document is to describe how to set up and operate the iPAS STRS radio platform with its delivered test waveform.

  19. Software Assurance Curriculum Project Volume 2: Undergraduate Course Outlines

    DTIC Science & Technology

    2010-08-01

    Contents: Acknowledgments; Abstract; An Undergraduate Curriculum Focus on Software Assurance; Computer Science I; Computer Science II...confidence that can be integrated into traditional software development and acquisition process models. Thus, in addition to a technology focus...testing throughout the software development life cycle (SDLC); security and complexity: system development challenges, security failures

  20. International Assessment of Research and Development in Simulation-Based Engineering and Science. Panel Report

    DTIC Science & Technology

    2009-01-01

    University of California, Berkeley. In this session, Dennis Gannon of Indiana University described the use of high performance computing for storm...Software Development (Session Introduction), Dennis Gannon, Indiana University; Software for Mesoscale Storm Prediction: Using Supercomputers for On...Ho, D. Ierardi, I. Kolossvary, J. Klepeis, T. Layman, C. McLeavey, M. Moraes, R. Mueller, E. Priest, Y. Shan, J. Spengler, M. Theobald, B. Towles

  1. A Study of Quantitative Measurements of Programmer Productivity for Fleet Material Support Office (FMSO).

    DTIC Science & Technology

    1982-12-01

    paper examines the various measures discussed in the literature and used in selected corporations which develop software. It presents several methods for...Selected Industry Methods for Measuring Productivity: 1. IBM; 2. Amdahl

  2. Cost Estimation Techniques for C3I System Software.

    DTIC Science & Technology

    1984-07-01

    opment man-month have been determined for maxi, midi, and mini type computers. Small to median size timeshared developments used 0.2 to 1.5 hours...development schedule...Detailed Model: The final codification of the COCOMO regressions was the development of separate effort...regardless of the software structure level being estimated: DEVC -- the expected development computer (maxi, midi, mini, micro); MODE -- the expected

  3. Path generation algorithm for UML graphic modeling of aerospace test software

    NASA Astrophysics Data System (ADS)

    Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Chen, Chao

    2018-03-01

    Traditionally, aerospace software test engineers rely on their own work experience and on communication with software development personnel to describe the software under test and to write test cases by hand, which is time-consuming, inefficient, and prone to loopholes. Using the high-reliability MBT tools developed by our company, a one-time modeling effort can automatically generate test case documents efficiently and accurately. For a UML model to describe a process accurately, the expression must rely on the paths through the model; however, existing path generation algorithms are either too simple, unable to combine branch paths with loops into complete paths, or so cumbersome and complicated that they generate meaningless paths, which are superfluous for aerospace software testing. Drawing on our experience with ten aerospace payloads, we developed a path generation algorithm tailored to UML graphic descriptions of aerospace test software.
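
    A hedged sketch of the kind of path generation the abstract describes: depth-first enumeration of paths through an activity graph in which each edge (and therefore each loop's back edge) is traversed at most a fixed number of times, so branch-and-loop combinations still yield a finite, test-relevant path set. The graph and the traversal bound are illustrative assumptions, not the paper's algorithm.

    ```python
    # Enumerate start->end paths, traversing each edge at most max_edge_visits times.
    def enumerate_paths(graph, start, end, max_edge_visits=1):
        paths = []
        edge_count = {}

        def dfs(node, path):
            if node == end:
                paths.append(path[:])
                return
            for nxt in graph.get(node, []):
                edge = (node, nxt)
                if edge_count.get(edge, 0) >= max_edge_visits:
                    continue  # loop already covered; avoid infinite expansion
                edge_count[edge] = edge_count.get(edge, 0) + 1
                dfs(nxt, path + [nxt])
                edge_count[edge] -= 1

        dfs(start, [start])
        return paths

    # Illustrative activity graph with a loop B -> C -> B:
    g = {"A": ["B"], "B": ["C", "D"], "C": ["B"], "D": []}
    for p in enumerate_paths(g, "A", "D"):
        print(" -> ".join(p))   # A -> B -> C -> B -> D, then A -> B -> D
    ```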

  4. Flight Software Development for the CHEOPS Instrument with the CORDET Framework

    NASA Astrophysics Data System (ADS)

    Cechticky, V.; Ottensamer, R.; Pasetti, A.

    2015-09-01

    CHEOPS is an ESA S-class mission dedicated to the precise measurement of radii of already known exoplanets using ultra-high-precision photometry. The instrument flight software controlling the instrument and handling the science data is developed by the University of Vienna using the CORDET Framework offered by P&P Software GmbH. The CORDET Framework provides a generic software infrastructure for PUS-based applications. This paper describes how the framework is used for the CHEOPS application software to provide a consistent solution for the communication and control services, event handling, and FDIR procedures. This approach is innovative in four respects: (a) it is a true third-party reuse; (b) re-use is done at specification, validation, and code level; (c) the re-usable assets and their qualification data package are entirely open-source; (d) re-use is based on call-back, with the application developer providing functions which are called by the reusable architecture.

  5. Identification of Patient Safety Risks Associated with Electronic Health Records: A Software Quality Perspective.

    PubMed

    Virginio, Luiz A; Ricarte, Ivan Luiz Marques

    2015-01-01

    Although Electronic Health Records (EHR) can offer benefits to the health care process, there is a growing body of evidence that these systems can also incur risks to patient safety when developed or used improperly. This work is a literature review to identify these risks from a software quality perspective. Therefore, the risks were classified based on the ISO/IEC 25010 software quality model. The risks identified were related mainly to the characteristics of "functional suitability" (i.e., software bugs) and "usability" (i.e., interface prone to user error). This work elucidates the fact that EHR quality problems can adversely affect patient safety, resulting in errors such as incorrect patient identification, incorrect calculation of medication dosages, and lack of access to patient data. Therefore, the risks presented here provide the basis for developers and EHR regulating bodies to pay attention to the quality aspects of these systems that can result in patient harm.

  6. Using the iBook in medical education and healthcare settings--the iBook as a reusable learning object; a report of the author's experience using iBooks Author software.

    PubMed

    Payne, Karl Fb; Goodson, Alexander Mc; Tahim, Arpan; Wharrad, Heather J; Fan, Kathleen

    2012-12-01

    The recently launched iBooks 2 from Apple has created a new genre of 'interactive multimedia eBook'. This article aims to describe the benefits of the iBook in medical education and healthcare settings. We discuss the attributes of an iBook as compared with the requirements of the conventional web-based Reusable Learning Object. The structure and user interface within an iBook are highlighted, and the iBook-creating software iBooks Author is discussed in detail. A report of personal experience developing and distributing an iBook for junior trainees in oral and maxillofacial surgery is provided, with discussion of the limitations of this approach and the need for further evidence-based studies.

  7. Improving Video Game Development: Facilitating Heterogeneous Team Collaboration through Flexible Software Processes

    NASA Astrophysics Data System (ADS)

    Musil, Juergen; Schweda, Angelika; Winkler, Dietmar; Biffl, Stefan

    Based on our observations of Austrian video game software development (VGSD) practices, we identified a lack of systematic process/method support and inefficient collaboration between the various involved disciplines, i.e., engineers and artists. VGSD includes heterogeneous disciplines, e.g., creative arts, game/content design, and software. Nevertheless, improving team collaboration and process support is an ongoing challenge in enabling a comprehensive view of game development projects. Lessons learned from software engineering practices can help game developers improve game development processes within a heterogeneous environment. Based on a state-of-the-practice survey in the Austrian games industry, this paper (a) presents first results with a focus on process/method support and (b) suggests a candidate flexible process approach based on Scrum to improve VGSD and team collaboration. Results (a) showed a trend toward highly flexible software processes involving various disciplines and (b) identified the suggested flexible process approach as feasible and useful for project application.

  8. Natural Computing: Its Impact on Software Development

    DTIC Science & Technology

    2000-02-01

    The user can develop new procedures by copying objects from documents and connecting them. These procedures can be saved for future use. Figure 27 shows

  9. Extraction and utilization of the repeating patterns for CP writing in mask making

    NASA Astrophysics Data System (ADS)

    Shoji, Masahiro; Inoue, Tadao; Yamabe, Masaki

    2010-05-01

    In May 2006, the Mask Design, Drawing, and Inspection Technology Research Department (Mask D2I) at the Association of Super-Advanced Electronics Technologies (ASET) launched a 4-year program for reducing mask manufacturing cost and TAT by concurrent optimization of Mask Data Preparation (MDP), mask writing, and mask inspection [1]. Figure 1 shows an outline of the project at Mask D2I at ASET. As one of the tasks pursued at the Mask Design Data Technology Research Laboratory, we have evaluated the effect of reducing writing shot counts by utilizing repeating patterns, which showed a positive impact on mask making using CP writing. During the past four years, we have developed software to extract repeating patterns from fractured OPCed mask data and have evaluated its efficiency in reducing writing shot counts, using many actual device production data sets obtained from the member companies of Mask D2I. To the extraction software, we added new functions for extracting common repeating patterns from a set of multiple masks, and studied how this step affects the shot-count reduction ratio in comparison to utilizing the repeating patterns of a single mask. We have also developed software that uses the extracted repeating patterns to prepare writing data for the MCC/CP writing system developed at the Mask Writing Equipment Technology Research Laboratory. With this software, we have examined how the EB proximity effect in CP writing influences the shot-count reduction when CP shots with large CD errors have to be divided into VSB shots. In this paper we report on making a common CP mask from a set of multiple actual device data sets using these software tools, as well as the results of CP writing and the calculation of writing TAT with the MCC/CP writing system.

  10. Tools for Accurate and Efficient Analysis of Complex Evolutionary Mechanisms in Microbial Genomes. Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakhleh, Luay

    I proposed to develop computationally efficient tools for accurate detection and reconstruction of microbes' complex evolutionary mechanisms, thus enabling rapid and accurate annotation, analysis, and understanding of their genomes. To achieve this goal, I proposed to address three aspects. (1) Mathematical modeling. A major challenge facing the accurate detection of HGT is that of distinguishing between these events on the one hand and other events that have similar "effects" on the other. I proposed to develop a novel mathematical approach for distinguishing among these events. Further, I proposed to develop a set of novel optimization criteria for the evolutionary analysis of microbial genomes in the presence of these complex evolutionary events. (2) Algorithm design. In this aspect of the project, I proposed to develop an array of efficient and accurate algorithms for analyzing microbial genomes based on the formulated optimization criteria. Further, I proposed to test the viability of the criteria and the accuracy of the algorithms in an experimental setting using both synthetic and biological data. (3) Software development. I proposed the final outcome to be a suite of software tools which implements the mathematical models as well as the algorithms developed.

  11. Software for Statistical Analysis of Weibull Distributions with Application to Gear Fatigue Data: User Manual with Verification

    NASA Technical Reports Server (NTRS)

    Kranz, Timothy L.

    2002-01-01

    The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type I censoring. The software was verified by reproducing results published by others.
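
    A minimal sketch, on assumed data rather than the report's gear fatigue tests, of two-parameter Weibull maximum-likelihood fitting with type I censoring: failed units contribute the log-density, and censored (unfailed) units contribute the log-survival function, consistent with the likelihood approach the abstract names.

    ```python
    # Two-parameter Weibull MLE with type I (time-truncated) censoring.
    import numpy as np
    from scipy.optimize import minimize

    times  = np.array([41., 55., 63., 72., 80., 80., 80.])   # hours (illustrative)
    failed = np.array([1, 1, 1, 1, 1, 0, 0], dtype=bool)     # 0 = censored at 80 h

    def neg_log_lik(params):
        shape, scale = params
        if shape <= 0 or scale <= 0:
            return np.inf
        z = times / scale
        log_pdf = np.log(shape / scale) + (shape - 1) * np.log(z) - z**shape
        log_surv = -z**shape                 # log S(t) for units still running
        return -(log_pdf[failed].sum() + log_surv[~failed].sum())

    fit = minimize(neg_log_lik, x0=[1.5, 70.0], method="Nelder-Mead")
    shape_hat, scale_hat = fit.x
    print(f"Weibull shape ~ {shape_hat:.2f}, scale ~ {scale_hat:.1f} h")
    ```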

  12. Fault tolerant testbed evaluation, phase 1

    NASA Technical Reports Server (NTRS)

    Caluori, V., Jr.; Newberry, T.

    1993-01-01

    In recent years, avionics systems development costs have become the driving factor in the development of space systems, military aircraft, and commercial aircraft. A method of reducing avionics development costs is to utilize state-of-the-art software application generator (autocode) tools and methods. The recent maturity of application generator technology has the potential to dramatically reduce development costs by eliminating software development steps that have historically introduced errors and the need for re-work. Application generator tools have been demonstrated to be an effective method for autocoding non-redundant, relatively low-rate input/output (I/O) applications on the Space Station Freedom (SSF) program; however, they have not been demonstrated for fault tolerant, high-rate I/O, flight critical environments. This contract will evaluate the use of application generators in these harsh environments. Using Boeing's quad-redundant avionics system controller as the target system, Space Shuttle Guidance, Navigation, and Control (GN&C) software will be autocoded, tested, and evaluated in the Johnson (Space Center) Avionics Engineering Laboratory (JAEL). The response of the autocoded system will be shown to match the response of the existing Shuttle General Purpose Computers (GPC's), thereby demonstrating the viability of using autocode techniques in the development of future avionics systems.

  13. Public Domain Software for Education.

    ERIC Educational Resources Information Center

    Scholastech, Inc., Cambridge, MA.

    This report describes a project undertaken by Scholastech, a public charity addressing the development needs of computing educators, which was designed to advocate and support the increased use of free software in the college curriculum. The first of five sections provides a brief overview and statement of project objectives, i.e., to serve…

  14. Computational Simulations and the Scientific Method

    NASA Technical Reports Server (NTRS)

    Kleb, Bil; Wood, Bill

    2005-01-01

    As scientific simulation software becomes more complicated, the scientific-software implementor's need for component tests from new model developers becomes more crucial. The community's ability to follow the basic premise of the Scientific Method requires independently repeatable experiments, and model innovators are in the best position to create these test fixtures. Scientific software developers also need to quickly judge the value of the new model, i.e., its cost-to-benefit ratio in terms of gains provided by the new model and implementation risks such as cost, time, and quality. This paper asks two questions. The first is whether other scientific software developers would find published component tests useful, and the second is whether model innovators think publishing test fixtures is a feasible approach.

  15. Simulation Control Graphical User Interface Logging Report

    NASA Technical Reports Server (NTRS)

    Hewling, Karl B., Jr.

    2012-01-01

    One of the many tasks of my project was to revise the code of the Simulation Control Graphical User Interface (SIM GUI) to enable logging functionality to a file. I was also tasked with developing a script that directs the startup and initialization flow of the various LCS software components; this makes sure that a software component will not spin up until all of its dependencies have been configured properly. In addition, I assisted hardware modelers in verifying the configuration of models after they had been upgraded to a new software version, developing code that analyzes the MDL files to determine whether any errors were generated by the upgrade process. Another of the projects assigned to me was supporting the End-to-End Hardware/Software Daily Tag-up meeting.
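
    A minimal sketch of the dependency-ordered startup behavior described above, using a topological sort over a hypothetical component-dependency table (the actual LCS component names and startup script are not shown in this record).

    ```python
    # Start components in dependency order so nothing spins up before
    # everything it depends on is configured.
    from graphlib import TopologicalSorter  # Python 3.9+

    # component -> set of components it depends on (illustrative names)
    deps = {
        "message_bus": set(),
        "config_service": {"message_bus"},
        "hardware_models": {"config_service"},
        "sim_gui": {"config_service", "hardware_models"},
    }

    for component in TopologicalSorter(deps).static_order():
        print("starting", component)   # message_bus first, sim_gui last
    ```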

  16. SAGA: A project to automate the management of software production systems

    NASA Technical Reports Server (NTRS)

    Campbell, Roy H.; Beckman-Davies, C. S.; Benzinger, L.; Beshers, G.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.

    1986-01-01

    Research into software development is required to reduce its production cost and to improve its quality. Modern software systems, such as the embedded software required for NASA's space station initiative, stretch current software engineering techniques. The requirement to build large, reliable, and maintainable software systems increases with time. Much theoretical and practical research is in progress to improve software engineering techniques. One such technique is to build a software system or environment which directly supports the software engineering process. The SAGA project comprises the research necessary to design and build a software development environment which automates the software engineering process. Progress under SAGA is described.

  17. Development of multichannel analyzer using sound card ADC for nuclear spectroscopy system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ibrahim, Maslina Mohd; Yussup, Nolida; Lombigit, Lojius

    This paper describes the development of a Multi-Channel Analyzer (MCA) using a sound card analogue-to-digital converter (ADC) for a nuclear spectroscopy system. The system is divided into a hardware module and a software module. The hardware module consists of a 2" by 2" NaI(Tl) detector, a Pulse Shaping Amplifier (PSA), and the built-in ADC chip readily available in any computer's sound system. The software module is divided into two parts: pre-processing of the raw digital input and the MCA software itself. A band-pass filter and baseline stabilization and correction were implemented for the pre-processing. For the MCA, the pulse height analysis method was used to process the signal before displaying it as a histogram. The development and test results of using the sound card as an MCA are discussed.
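
    A minimal sketch of the pulse-height-analysis step described above: detect pulses in a digitized waveform by threshold crossing, take each pulse's maximum as its height, and histogram the heights into MCA channels. The waveform here is synthetic; the real system samples the sound card ADC and applies the band-pass and baseline corrections first.

    ```python
    # Pulse height analysis on a synthetic digitized waveform.
    import numpy as np

    rng = np.random.default_rng(0)
    signal = rng.normal(0.0, 0.01, 200_000)          # baseline noise
    for i in rng.integers(100, 199_900, 500):        # inject 500 decaying pulses
        signal[i:i + 50] += rng.uniform(0.2, 1.0) * np.exp(-np.arange(50) / 10.0)

    threshold, n_channels = 0.1, 1024
    heights = []
    i = 0
    while i < len(signal):
        if signal[i] > threshold:                    # pulse start
            j = i
            while j < len(signal) and signal[j] > threshold:
                j += 1                               # walk to pulse end
            heights.append(signal[i:j].max())        # pulse height = local maximum
            i = j
        else:
            i += 1

    spectrum, _ = np.histogram(heights, bins=n_channels, range=(0.0, 1.2))
    print(spectrum.sum(), "pulses histogrammed into", n_channels, "channels")
    ```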

  18. USING THE ECLPSS SOFTWARE ENVIRONMENT TO BUILD A SPATIALLY EXPLICIT COMPONENT-BASED MODEL OF OZONE EFFECTS ON FOREST ECOSYSTEMS. (R827958)

    EPA Science Inventory

    We have developed a modeling framework to support grid-based simulation of ecosystems at multiple spatial scales, the Ecological Component Library for Parallel Spatial Simulation (ECLPSS). ECLPSS helps ecologists to build robust spatially explicit simulations of ...

  19. Computing in high-energy physics

    DOE PAGES

    Mount, Richard P.

    2016-05-31

    I present a very personalized journey through more than three decades of computing for experimental high-energy physics, pointing out the enduring lessons that I learned. This is followed by a vision of how the computing environment will evolve in the coming ten years and the technical challenges that this will bring. I then address the scale and cost of high-energy physics software and examine the many current and future challenges, particularly those of management, funding and software-lifecycle management. Lastly, I describe recent developments aimed at improving the overall coherence of high-energy physics software.

  20. Computing in high-energy physics

    NASA Astrophysics Data System (ADS)

    Mount, Richard P.

    2016-04-01

    I present a very personalized journey through more than three decades of computing for experimental high-energy physics, pointing out the enduring lessons that I learned. This is followed by a vision of how the computing environment will evolve in the coming ten years and the technical challenges that this will bring. I then address the scale and cost of high-energy physics software and examine the many current and future challenges, particularly those of management, funding and software-lifecycle management. Finally, I describe recent developments aimed at improving the overall coherence of high-energy physics software.

  1. Computing in high-energy physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mount, Richard P.

    I present a very personalized journey through more than three decades of computing for experimental high-energy physics, pointing out the enduring lessons that I learned. This is followed by a vision of how the computing environment will evolve in the coming ten years and the technical challenges that this will bring. I then address the scale and cost of high-energy physics software and examine the many current and future challenges, particularly those of management, funding and software-lifecycle management. Lastly, I describe recent developments aimed at improving the overall coherence of high-energy physics software.

  2. Software to model AXAF-I image quality

    NASA Technical Reports Server (NTRS)

    Ahmad, Anees; Feng, Chen

    1995-01-01

    A modular user-friendly computer program for the modeling of grazing-incidence type x-ray optical systems has been developed. This comprehensive computer software GRAZTRACE covers the manipulation of input data, ray tracing with reflectivity and surface deformation effects, convolution with x-ray source shape, and x-ray scattering. The program also includes the capabilities for image analysis, detector scan modeling, and graphical presentation of the results. A number of utilities have been developed to interface the predicted Advanced X-ray Astrophysics Facility-Imaging (AXAF-I) mirror structural and thermal distortions with the ray-trace. This software is written in FORTRAN 77 and runs on a SUN/SPARC station. An interactive command mode version and a batch mode version of the software have been developed.

  3. An Accessible User Interface for Geoscience and Programming

    NASA Astrophysics Data System (ADS)

    Sevre, E. O.; Lee, S.

    2012-12-01

    The goal of this research is to develop an interface that will simplify user interaction with software for scientists. The motivating factor of the research is to develop tools that assist scientists with limited motor skills in the efficient generation and use of software tools. Reliance on computers and programming is increasing in the world of geology, and it is increasingly important for geologists and geophysicists to have the computational resources to use advanced software and edit programs for their research. I have developed a prototype of a program to help geophysicists write programs using a simple interface that requires only single mouse clicks to input code. My goal is to minimize the amount of typing necessary to create simple programs and scripts, to increase accessibility for people with disabilities limiting fine motor skills. This interface can be adapted for various programming and scripting languages: it will simplify development of code for C/C++, Java, and GMT, and can be expanded to support any other text-based programming language. The interface is designed around the concept of maximizing the amount of code that can be written using a minimum number of clicks and keystrokes. The screen is split into two sections: a list of click-commands on the left hand side and a text area on the right hand side. When the user clicks on a command on the left hand side, the applicable code is automatically inserted at the insertion point in the text area. Currently, the C/C++ interface has commands for common code segments, such as for loops, comments, print statements, and structured code creation. The primary goal is to provide an interface that will work across many devices for developing code. A simple prototype was first developed for the iPad; due to the limited number of devices that an iOS application can run on, the code has since been re-written in Java to run on a wider range of devices. The software currently works in prototype mode, and our goal is to develop it further into software that can benefit a wide range of people working in the geosciences, making code development practical and accessible for a wider audience of scientists. An interface like this also reduces the potential for errors by reusing known working code.

  4. Software tool for portal dosimetry research.

    PubMed

    Vial, P; Hunt, P; Greer, P B; Oliver, L; Baldock, C

    2008-09-01

    This paper describes a software tool developed for research into the use of an electronic portal imaging device (EPID) to verify dose for intensity-modulated radiation therapy (IMRT) beams. A portal dose image prediction (PDIP) model that predicts the EPID response to IMRT beams has been implemented in a commercially available treatment planning system (TPS). The software tool described in this work was developed to modify the TPS PDIP model by incorporating correction factors into the predicted EPID image to account for the difference in EPID response to open-beam radiation and multileaf collimator (MLC) transmitted radiation. The processes performed by the software tool include: (i) reading the MLC file and the PDIP from the TPS; (ii) calculating the fraction of beam-on time that each point in the IMRT beam is shielded by MLC leaves; (iii) interpolating correction factors from look-up tables; (iv) creating a corrected PDIP image from the product of the original PDIP and the correction factors and writing the corrected image to file; and (v) displaying, analysing, and exporting various image datasets. The software tool was developed using the Microsoft Visual Studio .NET framework with the C# compiler. The operation of the software tool was validated. This software provided useful tools for EPID dosimetry research, and it is being utilised and further developed in ongoing EPID dosimetry and IMRT dosimetry projects.
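
    A hedged numpy sketch of steps (iii) and (iv) above, with illustrative arrays and look-up-table values rather than the commercial TPS data: interpolate a per-pixel correction factor from the shielded-fraction map, then multiply it into the predicted EPID image.

    ```python
    # Apply MLC-transmission correction factors to a predicted EPID image.
    import numpy as np

    pdip = np.ones((256, 256))      # predicted EPID image (illustrative)
    shielded_fraction = np.random.default_rng(1).uniform(0.0, 1.0, pdip.shape)

    # Look-up table: correction factor vs. fraction of beam-on time under MLC
    lut_fraction = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
    lut_cf       = np.array([1.00, 0.99, 0.97, 0.94, 0.90])  # illustrative values

    cf = np.interp(shielded_fraction, lut_fraction, lut_cf)  # step (iii)
    corrected_pdip = pdip * cf                                # step (iv)
    np.save("corrected_pdip.npy", corrected_pdip)             # write to file
    ```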

  5. Increase Return on Investment of Software Development Life Cycle by Managing the Risk - A Case Study

    DTIC Science & Technology

    2015-04-01

    for increasing the return on investment during the Software Development Life Cycle (SDLC) through selected quantitative analyses employing both the...defect rate, return on investment (ROI), software development life cycle (SDLC)...becomes comfortable due to its intricacies and learning cycle. The same may be said with respect to software development life cycle (SDLC) management

  6. The Validation by Measurement Theory of Proposed Object-Oriented Software Metrics

    NASA Technical Reports Server (NTRS)

    Neal, Ralph D.

    1996-01-01

    Moving software development into the engineering arena requires controllability, and to control a process, it must be measurable. Measuring the process does no good if the product is not also measured, i.e., being the best at producing an inferior product does not define a quality process. Also, not every number extracted from software development is a valid measurement. A valid measurement only results when we are able to verify that the number is representative of the attribute that we wish to measure. Many proposed software metrics are used by practitioners without these metrics ever having been validated, leading to costly but often useless calculations. Several researchers have bemoaned the lack of scientific precision in much of the published software measurement work and have called for validation of software metrics by measurement theory. This dissertation applies measurement theory to validate fifty proposed object-oriented software metrics.

  7. Beyond the Evident Content Goals Part I. Tapping the Depth and Flow of the Educational Undercurrent.

    ERIC Educational Resources Information Center

    Dugdale, Sharon; Kibbey, David

    1990-01-01

    In the first of a series of three articles, successful instructional materials from a 15-year software development effort are analyzed and characterized, with special attention given to educational experiences intended to shape students' perceptions of the fundamental nature, interconnectedness, and usefulness of mathematics. The software programs…

  8. IsobariQ: software for isobaric quantitative proteomics using IPTL, iTRAQ, and TMT.

    PubMed

    Arntzen, Magnus Ø; Koehler, Christian J; Barsnes, Harald; Berven, Frode S; Treumann, Achim; Thiede, Bernd

    2011-02-04

    Isobaric peptide labeling plays an important role in relative quantitative comparisons of proteomes. Isobaric labeling techniques utilize MS/MS spectra for relative quantification, which can be either based on the relative intensities of reporter ions in the low mass region (iTRAQ and TMT) or on the relative intensities of quantification signatures throughout the spectrum due to isobaric peptide termini labeling (IPTL). Due to the increased quantitative information found in MS/MS fragment spectra generated by the recently developed IPTL approach, new software was required to extract the quantitative information. IsobariQ was specifically developed for this purpose; however, support for the reporter ion techniques iTRAQ and TMT is also included. In addition, to address recently emphasized issues about heterogeneity of variance in proteomics data sets, IsobariQ employs the statistical software package R and variance stabilizing normalization (VSN) algorithms available therein. Finally, the functionality of IsobariQ is validated with data sets of experiments using 6-plex TMT and IPTL. Notably, protein substrates resulting from cleavage by proteases can be identified as shown for caspase targets in apoptosis.
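
    A minimal sketch of reporter-ion relative quantification as described for iTRAQ/TMT, with illustrative intensities (IsobariQ itself delegates normalization to R's VSN package, which this sketch omits): sum reporter intensities per channel across a protein's spectra and express each channel relative to a reference channel.

    ```python
    # Reporter-ion relative quantification across a protein's MS/MS spectra.
    import numpy as np

    # rows = MS/MS spectra of one protein, columns = 6-plex TMT reporter channels
    reporter = np.array([
        [1.0e4, 1.1e4, 2.2e4, 0.9e4, 1.0e4, 3.1e4],
        [0.8e4, 0.9e4, 1.9e4, 0.7e4, 0.8e4, 2.6e4],
    ])

    channel_totals = reporter.sum(axis=0)
    ratios = channel_totals / channel_totals[0]   # channel 1 as reference
    print(np.round(ratios, 2))
    ```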

  9. [The development and evaluation of software to verify diagnostic accuracy].

    PubMed

    Jensen, Rodrigo; de Moraes Lopes, Maria Helena Baena; Silveira, Paulo Sérgio Panse; Ortega, Neli Regina Siqueira

    2012-02-01

    This article describes the development and evaluation of software that verifies the accuracy of diagnoses made by nursing students. The software is based on a model that uses fuzzy logic concepts; it was implemented in Perl, with a MySQL database for Internet accessibility, and uses the NANDA-I 2007-2008 classification system. The software was evaluated in terms of its technical quality and usability through specific instruments. The activity proposed in the software involves four stages in which students establish the relationship values between nursing diagnoses, defining characteristics/risk factors, and clinical cases. The relationship values determined by students are compared to those of specialists, generating performance scores for the students. In the evaluation, the software demonstrated satisfactory outcomes regarding technical quality and, according to the students, helped in their learning and may become an educational tool for teaching the process of nursing diagnosis.

  10. C3I Rapid Prototype Investigation.

    DTIC Science & Technology

    1986-01-01

    feasibility of applying rapid prototyping techniques to Air Force C3I system developments. This report presents the technical progress during the...computer functions. The cost to use each in terms of hardware, software, analysis, and needed further developments was assessed. Prototyping approaches were...acquirer, and developer are the basis for problems in C3I system developments. These problems destabilize the requirements determination process

  11. Description and Flight Test Results of the NASA F-8 Digital Fly-by-Wire Control System

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A NASA program to develop digital fly-by-wire (DFBW) technology for aircraft applications is discussed. Phase I of the program demonstrated the feasibility of using a digital fly-by-wire system for aircraft control through developing and flight testing a single channel system, which used Apollo hardware, in an F-8C airplane. The objective of Phase II of the program is to establish a technology base for designing practical DFBW systems. It will involve developing and flight testing a triplex digital fly-by-wire system using state-of-the-art airborne computers, system hardware, software, and redundancy concepts. The papers included in this report describe the Phase I system and its development and present results from the flight program. Man-rated flight software and the effects of lightning on digital flight control systems are also discussed.

  12. A Roadmap for using Agile Development in a Traditional System

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara; Starbird, Thomas

    2006-01-01

    I. Ensemble Development Group: a) Produces activity planning software for spacecraft; b) Built on the Eclipse Rich Client Platform (open-source development and runtime software); c) Funded by multiple sources including the Mars Technology Program; d) Incorporated the use of Agile Development. II. Next Generation Uplink Planning System: a) Researches the Activity Planning and Sequencing Subsystem (APSS) for the Mars Science Laboratory; b) APSS includes Ensemble, Activity Modeling, Constraint Checking, Command Editing and Sequencing tools, plus other uplink generation utilities; c) Funded by the Mars Technology Program; d) Integrates all of the tools for APSS.

  13. Tank Monitor and Control System (TMACS) Rev 11.0 Acceptance Test Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HOLM, M.J.

    The purpose of this document is to describe tests performed to validate Revision 11 of the Tank Monitor and Control System (TMACS) and to verify that the software functions as intended by design. This document covers the software portion of TMACS; the tests are performed on the development system. The software to be tested comprises the TMACS knowledge bases (KB) and the I/O drivers/services. The development system does not talk to field equipment; instead, the field equipment is simulated using emulators or multiplexers in the lab.

  14. Structuring Z Specifications with Views

    DTIC Science & Technology

    1994-03-01

  15. The Effect of Software Features on Software Adoption and Training in the Audit Profession

    ERIC Educational Resources Information Center

    Kim, Hyo-Jeong

    2012-01-01

    Although software has been studied with technology adoption and training research, the study of specific software features for professional groups has been limited. To address this gap, I researched the impact of software features of varying complexity on internal audit (IA) professionals. Two studies along with the development of training…

  16. Sculpting in cyberspace: Parallel processing the development of new software

    NASA Technical Reports Server (NTRS)

    Fisher, Rob

    1993-01-01

    Stimulating creativity in problem solving, particularly where software development is involved, is applicable to many disciplines. Metaphorical thinking keeps the problem in focus but in a different light, jarring people out of their mental ruts and sparking fresh insights. It forces the mind to stretch to find patterns between dissimilar concepts, in the hope of discovering unusual ideas in odd associations (Technology Review January 1993, p. 37). With a background in Engineering and Visual Design from MIT, I have for the past 30 years pursued a career as a sculptor of interdisciplinary monumental artworks that bridge the fields of science, engineering and art. Since 1979, I have pioneered the application of computer simulation to solve the complex problems associated with these projects. A recent project for the roof of the Carnegie Science Center in Pittsburgh made particular use of the metaphoric creativity technique described above. The problem-solving process led to the creation of hybrid software combining scientific, architectural and engineering visualization techniques. David Steich, a Doctoral Candidate in Electrical Engineering at Penn State, was commissioned to develop special software that enabled me to create innovative free-form sculpture. This paper explores the process of inventing the software through a detailed analysis of the interaction between an artist and a computer programmer.

  17. Virtual Environment User Interfaces to Support RLV and Space Station Simulations in the ANVIL Virtual Reality Lab

    NASA Technical Reports Server (NTRS)

    Dumas, Joseph D., II

    1998-01-01

    Several virtual reality I/O peripherals were successfully configured and integrated as part of the author's 1997 Summer Faculty Fellowship work. These devices, which were not supported by the developers of VR software packages, use new software drivers and configuration files developed by the author to allow them to be used with simulations developed using those software packages. The successful integration of these devices has added significant capability to the ANVIL lab at MSFC. In addition, the author was able to complete the integration of a networked virtual reality simulation of the Space Shuttle Remote Manipulator System docking Space Station modules which was begun as part of his 1996 Fellowship. The successful integration of this simulation demonstrates the feasibility of using VR technology for ground-based training as well as on-orbit operations.

  18. JTAG-based remote configuration of FPGAs over optical fibers

    DOE PAGES

    Deng, B.; Xu, H.; Liu, C.; ...

    2015-01-28

    In this study, a remote FPGA-configuration method based on JTAG extension over optical fibers is presented. The method takes advantage of commercial components and ready-to-use software such as iMPACT and does not require any hardware or software development. The method combines the advantages of the slow remote JTAG configuration and the fast local flash memory configuration. The method has been verified successfully and used in the Demonstrator of Liquid-Argon Trigger Digitization Board (LTDB) for the ATLAS liquid argon calorimeter Phase-I trigger upgrade. All components on the FPGA side are verified to meet the radiation tolerance requirements.

  19. Random Vibrations

    NASA Technical Reports Server (NTRS)

    Messaro, Semma; Harrison, Phillip

    2010-01-01

    Ares I zonal random vibration environments due to acoustic impingement and combustion processes are developed for liftoff, ascent and reentry. Random vibration test criteria for Ares I Upper Stage pyrotechnic components are developed by enveloping the applicable zonal environments where each component is located. Random vibration tests will be conducted to assure that these components will survive and function appropriately after exposure to the expected vibration environments. Methodology: Random vibration test criteria for Ares I Upper Stage pyrotechnic components were desired that would envelope all the applicable environments where each component was located. Applicable Ares I vehicle drawings and design information needed to be assessed to determine the location(s) for each component on the Ares I Upper Stage. Design and test criteria needed to be developed by plotting and enveloping the applicable environments in Microsoft Excel and documenting them in a report written in Microsoft Word. Conclusion: Random vibration liftoff, ascent, and green run design and test criteria for the Upper Stage pyrotechnic components were developed by using Microsoft Excel to envelope zonal environments applicable to each component. Results were transferred from Excel into a report using Microsoft Word. After the report is reviewed and edited by my mentor, it will be submitted for publication as an attachment to a memorandum. Pyrotechnic component designers will extract criteria from my report for incorporation into the design and test specifications for components. Eventually the hardware will be tested to the environments I developed to assure that the components will survive and function appropriately after exposure to the expected vibration environments.
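
    The enveloping step lends itself to direct computation; below is a minimal sketch in Python (the study itself used Excel, and the frequency grid and PSD levels here are illustrative, not Ares I data) in which the test criterion at each frequency is the maximum power spectral density across all zonal environments applicable to a component.

        # Envelope two zonal random-vibration curves defined on a common grid.
        freqs = [20, 100, 500, 2000]          # Hz (illustrative grid)
        zone_a = [0.01, 0.04, 0.08, 0.02]     # PSD, g^2/Hz, zone A
        zone_b = [0.02, 0.03, 0.10, 0.01]     # PSD, g^2/Hz, zone B

        envelope = [max(a, b) for a, b in zip(zone_a, zone_b)]
        print(list(zip(freqs, envelope)))     # [(20, 0.02), (100, 0.04), ...]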

  20. Hybrid Modeling Improves Health and Performance Monitoring

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Scientific Monitoring Inc. was awarded a Phase I Small Business Innovation Research (SBIR) project by NASA's Dryden Flight Research Center to create a new, simplified health-monitoring approach for flight vehicles and flight equipment. The project developed a hybrid physical model concept that provided a structured approach to simplifying complex design models for use in health monitoring, allowing the output or performance of the equipment to be compared to what the design models predicted, so that deterioration or impending failure could be detected before there would be an impact on the equipment's operational capability. Based on the original modeling technology, Scientific Monitoring released I-Trend, a commercial health- and performance-monitoring software product named for its intelligent trending, diagnostics, and prognostics capabilities, as part of the company's complete ICEMS (Intelligent Condition-based Equipment Management System) suite of monitoring and advanced alerting software. I-Trend uses the hybrid physical model to better characterize the nature of health or performance alarms that result in "no fault found" false alarms. Additionally, the use of physical principles helps I-Trend identify problems sooner. I-Trend technology is currently in use in several commercial aviation programs, and the U.S. Air Force recently tapped Scientific Monitoring to develop next-generation engine health-management software for monitoring its fleet of jet engines. Scientific Monitoring has continued the original NASA work, this time under a Phase III SBIR contract with a joint NASA-Pratt & Whitney aviation security program on propulsion-controlled aircraft operating under missile-damage conditions.

  1. A testing-coverage software reliability model considering fault removal efficiency and error generation.

    PubMed

    Li, Qiuying; Pham, Hoang

    2017-01-01

    In this paper, we propose a software reliability model that considers not only error generation but also fault removal efficiency combined with testing coverage information, based on a nonhomogeneous Poisson process (NHPP). During the past four decades, many software reliability growth models (SRGMs) based on NHPP have been proposed to estimate software reliability measures, most of which share the following assumptions: 1) it is a common phenomenon that the fault detection rate changes throughout the testing phase; 2) as a result of imperfect debugging, fault removal is accompanied by a fault re-introduction rate. However, few SRGMs in the literature differentiate between fault detection and fault removal, i.e., they seldom consider imperfect fault removal efficiency. In the practical software development process, fault removal efficiency cannot always be perfect: detected failures might not be removed completely, the original faults might still exist, and new faults might be introduced meanwhile, which is referred to as the imperfect debugging phenomenon. In this study, a model aiming to incorporate the fault introduction rate, fault removal efficiency and testing coverage into software reliability evaluation is developed, using testing coverage to express the fault detection rate and using fault removal efficiency to account for fault repair. We compare the performance of the proposed model with several existing NHPP SRGMs using three sets of real failure data based on five criteria. The results exhibit that the model gives better fitting and predictive performance.
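
    For orientation, the sketch below fits the classic Goel-Okumoto mean value function m(t) = a(1 - e^(-bt)), the baseline NHPP SRGM that models of this family extend with testing-coverage and imperfect-debugging terms; the failure data are hypothetical and the snippet assumes NumPy and SciPy are available.

        # Fit the Goel-Okumoto NHPP mean value function to cumulative fault counts.
        import numpy as np
        from scipy.optimize import curve_fit

        def m(t, a, b):
            # a: expected total number of faults; b: fault detection rate
            return a * (1.0 - np.exp(-b * t))

        t = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)        # test weeks
        faults = np.array([12, 21, 28, 33, 37, 40, 42, 43], dtype=float)

        (a_hat, b_hat), _ = curve_fit(m, t, faults, p0=(50.0, 0.3))
        print(f"estimated total faults a = {a_hat:.1f}, detection rate b = {b_hat:.2f}")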

  2. A Web-Based Data Collection Platform for Multisite Randomized Behavioral Intervention Trials: Development, Key Software Features, and Results of a User Survey.

    PubMed

    Modi, Riddhi A; Mugavero, Michael J; Amico, Rivet K; Keruly, Jeanne; Quinlivan, Evelyn Byrd; Crane, Heidi M; Guzman, Alfredo; Zinski, Anne; Montue, Solange; Roytburd, Katya; Church, Anna; Willig, James H

    2017-06-16

    Meticulous tracking of study data must begin early in the study recruitment phase and must account for regulatory compliance, minimize missing data, and provide high information integrity and/or reduction of errors. In behavioral intervention trials, participants typically complete several study procedures at different time points. Among HIV-infected patients, behavioral interventions can favorably affect health outcomes. In order to empower newly diagnosed HIV-positive individuals to learn skills to enhance retention in HIV care, we developed the behavioral health intervention Integrating ENGagement and Adherence Goals upon Entry (iENGAGE) funded by the National Institute of Allergy and Infectious Diseases (NIAID), where we deployed an in-clinic behavioral health intervention in 4 urban HIV outpatient clinics in the United States. To scale our intervention strategy homogeneously across sites, we developed software that would function as a behavioral sciences research platform. This manuscript aimed to: (1) describe the design and implementation of a Web-based software application to facilitate deployment of a multisite behavioral science intervention; and (2) report on results of a survey to capture end-user perspectives of the impact of this platform on the conduct of a behavioral intervention trial. In order to support the implementation of the NIAID-funded trial iENGAGE, we developed software to deploy a 4-site behavioral intervention for new clinic patients with HIV/AIDS. We integrated the study coordinator into the informatics team to participate in the software development process. Here, we report the key software features and the results of the 25-item survey to evaluate user perspectives on research and intervention activities specific to the iENGAGE trial (N=13). The key features addressed are study enrollment, participant randomization, real-time data collection, facilitation of longitudinal workflow, reporting, and reusability. We found 100% user agreement (13/13) that participation in the database design and/or testing phase made it easier to understand user roles and responsibilities, and users recommended participation of research teams in developing databases for future studies. Users acknowledged ease of use, color flags, longitudinal workflow, and data storage in one location as the most useful features of the software platform, and issues related to saving participant forms, security restrictions, and worklist layout as the least useful features. The successful development of the iENGAGE behavioral science research platform validated an approach of early and continuous involvement of the study team in design development. In addition, we recommend post-hoc collection of data from users, as this led to important insights on how to enhance future software and inform standard clinical practices. Clinicaltrials.gov NCT01900236 (https://clinicaltrials.gov/ct2/show/NCT01900236; archived by WebCite at http://www.webcitation.org/6qAa8ld7v). ©Riddhi A Modi, Michael J Mugavero, Rivet K Amico, Jeanne Keruly, Evelyn Byrd Quinlivan, Heidi M Crane, Alfredo Guzman, Anne Zinski, Solange Montue, Katya Roytburd, Anna Church, James H Willig. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 16.06.2017.

  3. Performance and blood monitoring in sports: the artificial intelligence evoking target testing in antidoping (AR.I.E.T.T.A.) project.

    PubMed

    Manfredini, A F; Malagoni, A M; Litmanen, H; Zhukovskaja, L; Jeannier, P; Dal Follo, D; Felisatti, M; Besseberg, A; Geistlinger, M; Bayer, P; Carrabre, J E

    2011-03-01

    Substances and methods used to increase oxygen blood transport and physical performance can be detected in the blood, but the screening of the athletes to be tested remains a critical issue for the International Federations. This project, AR.I.E.T.T.A., aimed to develop software capable of analysing athletes' hematological and performance profiles to detect abnormal patterns. One hundred eighty athletes belonging to the International Biathlon Union gave written informed consent to have their hematological data, previously collected according to anti-doping rules, used to develop the AR.I.E.T.T.A. software. The software was developed with the following sections: 1) log-in; 2) data entry, where data are loaded, stored and grouped; 3) analysis, where data are analysed, validated scores are calculated, and parameters are simultaneously displayed as statistics, tables and graphs, and individual or subpopulation profiles; 4) screening, where an immediate evaluation of the risk score of the present sample and/or the athlete under study is obtained. The sample risk score, or AR.I.E.T.T.A. score, is calculated by a simple computational system combining different parameters (absolute values and intra-individual variations) considered concurrently. The AR.I.E.T.T.A. score is obtained as the sum of the deviation units derived from each parameter, considering the shift of the present value from the reference values, based on the number of standard deviations. AR.I.E.T.T.A. enables a quick evaluation of blood results, assisting the International Federations in surveillance programs and in performing timely target-testing controls on athletes. Future studies aiming to validate the AR.I.E.T.T.A. score and improve the diagnostic accuracy will improve the system.
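
    Read literally, the score is a sum of standard-deviation units across parameters; here is a minimal sketch under that assumption (the reference means, standard deviations and parameter set are hypothetical, not the validated AR.I.E.T.T.A. values).

        # Sum of |value - reference mean| / reference SD over all parameters.
        REFERENCE = {                      # hypothetical reference values
            "haemoglobin": (14.8, 1.0),    # (mean g/dL, SD)
            "reticulocytes": (1.1, 0.4),   # (mean %, SD)
        }

        def deviation_score(sample):
            return sum(abs(v - REFERENCE[k][0]) / REFERENCE[k][1]
                       for k, v in sample.items())

        print(deviation_score({"haemoglobin": 17.2, "reticulocytes": 0.2}))  # 4.65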

  4. Creating a transducer electronic datasheet using I2C serial EEPROM memory and PIC32-based microcontroller development board

    NASA Astrophysics Data System (ADS)

    Croitoru, Bogdan; Tulbure, Adrian; Abrudean, Mihail; Secara, Mihai

    2015-02-01

    The present paper describes a software method for creating and managing one type of Transducer Electronic Datasheet (TEDS) according to the IEEE 1451.4 standard, in order to develop a prototype smart multi-sensor platform (with up to ten different analog sensors simultaneously connected) with Plug and Play capabilities over ETHERNET and Wi-Fi. The experiments used one analog temperature sensor, one analog light sensor, one PIC32-based microcontroller development board with analog and digital I/O ports and other computing resources, and one 24LC256 I2C (Inter-Integrated Circuit standard) serial Electrically Erasable Programmable Read-Only Memory (EEPROM) with 32 KB of available space and a 3-byte internal buffer for writes (1 byte for data and 2 bytes for address). A prototype algorithm was developed in standard C for writing and reading TEDS information to/from I2C EEPROM memories (with up to ten different TEDS blocks coexisting in the same EEPROM device at once). The algorithm is able to write and read one type of TEDS: transducer information with standard TEDS content. A second software application, written on the VB.NET platform, was developed in order to access the EEPROM sensor information from a computer through a serial interface (USB).
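
    The essential EEPROM access pattern is a 2-byte memory address followed by data. A sketch of that pattern follows, in Python with the smbus2 library for brevity (the paper's implementation is in C; the bus number, device address 0x50 and payload are assumptions, and page-boundary handling is omitted).

        # Write and read back a small TEDS-like block on a 24LC256 EEPROM.
        import time
        from smbus2 import SMBus, i2c_msg

        EEPROM_ADDR = 0x50                 # typical 24LC256 address (A0-A2 low)

        def eeprom_write(bus, mem_addr, data):
            # single transaction: address high byte, address low byte, data bytes
            msg = i2c_msg.write(EEPROM_ADDR, [mem_addr >> 8, mem_addr & 0xFF] + list(data))
            bus.i2c_rdwr(msg)
            time.sleep(0.005)              # wait out the internal write cycle

        def eeprom_read(bus, mem_addr, length):
            set_ptr = i2c_msg.write(EEPROM_ADDR, [mem_addr >> 8, mem_addr & 0xFF])
            read = i2c_msg.read(EEPROM_ADDR, length)
            bus.i2c_rdwr(set_ptr, read)    # repeated start: set pointer, then read
            return bytes(list(read))

        with SMBus(1) as bus:
            teds = b"\x04TEMP-01"          # placeholder TEDS bytes
            eeprom_write(bus, 0x0000, teds)
            print(eeprom_read(bus, 0x0000, len(teds)))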

  5. Human Motion Tracking and Glove-Based User Interfaces for Virtual Environments in ANVIL

    NASA Technical Reports Server (NTRS)

    Dumas, Joseph D., II

    2002-01-01

    The Army/NASA Virtual Innovations Laboratory (ANVIL) at Marshall Space Flight Center (MSFC) provides an environment where engineers and other personnel can investigate novel applications of computer simulation and Virtual Reality (VR) technologies. Among the many hardware and software resources in ANVIL are several high-performance Silicon Graphics computer systems and a number of commercial software packages, such as Division MockUp by Parametric Technology Corporation (PTC) and Jack by Unigraphics Solutions, Inc. These hardware and software platforms are used in conjunction with various VR peripheral I/O (input / output) devices, CAD (computer aided design) models, etc. to support the objectives of the MSFC Engineering Systems Department/Systems Engineering Support Group (ED42) by studying engineering designs, chiefly from the standpoint of human factors and ergonomics. One of the more time-consuming tasks facing ANVIL personnel involves the testing and evaluation of peripheral I/O devices and the integration of new devices with existing hardware and software platforms. Another important challenge is the development of innovative user interfaces to allow efficient, intuitive interaction between simulation users and the virtual environments they are investigating. As part of his Summer Faculty Fellowship, the author was tasked with verifying the operation of some recently acquired peripheral interface devices and developing new, easy-to-use interfaces that could be used with existing VR hardware and software to better support ANVIL projects.

  6. [Development of ophthalmologic software for handheld devices].

    PubMed

    Grottone, Gustavo Teixeira; Pisa, Ivan Torres; Grottone, João Carlos; Debs, Fernando; Schor, Paulo

    2006-01-01

    The formulas for calculation of intraocular lenses (IOLs) have evolved since the first theoretical formulas by Fyodorov. Among the second-generation formulas, the SRK-I formula is a simple calculation involving only anteroposterior (axial) length, the IOL constant and average keratometry. As the formulas evolved, their complexity increased, making the reconfiguration of parameters in special situations impracticable. The production and development of software for this purpose can therefore help surgeons recalculate those values when needed. To design, develop and test a Brazilian software application for calculation of IOL dioptric power on handheld computers. For the development and programming of the IOL calculation software, we used the PocketC program (OrbWorks Concentrated Software, USA). We compared the results collected from a gold-standard device (Ultrascan/Alcon Labs) with a simulation of 100 fictitious patients, using the same IOL parameters. The results were grouped as ULTRASCAN data and SOFTWARE data. Using the SRK/T formula, the range of parameters included keratometry varying between 35 and 55 D, axial length between 20 and 28 mm, and IOL constants of 118.7, 118.3 and 115.8. A Wilcoxon test showed that the groups do not differ (p=0.314). The Ultrascan sample varied between 11.82 and 27.97 D. In the tested program the variation was practically identical (11.83-27.98 D). The average of the Ultrascan group was 20.93; the software group had a similar average. The standard deviation of the samples was also similar (4.53). The precision of the IOL software for handheld devices was similar to that of the standard device using the SRK/T formula. The software worked properly and was stable, without bugs in the tested models of the operating system.
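
    For context, the original SRK regression formula (the simple second-generation relative of SRK-I mentioned above, commonly given as P = A - 2.5L - 0.9K) fits in a few lines; this is illustrative only, since the study validated the more elaborate SRK/T formula, and the sample values below are made up.

        # Original SRK formula: P = A - 2.5*L - 0.9*K
        def srk_iol_power(A, L, K):
            # A: IOL constant, L: axial length (mm), K: average keratometry (D)
            return A - 2.5 * L - 0.9 * K

        print(srk_iol_power(A=118.7, L=23.5, K=44.0))  # 20.35 dioptres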

  7. The CECAM Electronic Structure Library: community-driven development of software libraries for electronic structure simulations

    NASA Astrophysics Data System (ADS)

    Oliveira, Micael

    The CECAM Electronic Structure Library (ESL) is a community-driven effort to segregate shared pieces of software as libraries that can be contributed to and used by the community. Besides allowing the community to share the burden of developing and maintaining complex pieces of software, these libraries can also become a target for re-coding by software engineers as hardware evolves, ensuring that electronic structure codes remain at the forefront of HPC trends. In a series of workshops hosted at the CECAM HQ in Lausanne, the tools and infrastructure for the project were prepared, and the first contributions were included and made available online (http://esl.cecam.org). In this talk I will present the different aspects and aims of the ESL and how these can be useful for the electronic structure community.

  8. Cognitive Tutor[R] Algebra I. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2009

    2009-01-01

    The "Cognitive Tutor[R] Algebra I" curriculum, published by Carnegie Learning, is an approach that combines algebra textbooks with interactive software. The software is developed around an artificial intelligence model that identifies strengths and weaknesses in each individual student's mastery of mathematical concepts. It then customizes prompts…

  9. An International Survey of Industrial Applications of Formal Methods. Volume 1: Purpose, Approach, Analysis, and Conclusions

    DTIC Science & Technology

    1993-09-30

    …months of effort. The product was important for demonstrating to IBM management the potential of the Cleanroom methodology. 3.2.4 Software Architecture for Oscilloscopes Using Z (Tektronix): Tektronix in Beaverton, Oregon, used Z to develop a reusable software architecture to be shared among a number…

  10. The Value of Open Source Software Tools in Qualitative Research

    ERIC Educational Resources Information Center

    Greenberg, Gary

    2011-01-01

    In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…

  11. The validation by measurement theory of proposed object-oriented software metrics

    NASA Technical Reports Server (NTRS)

    Neal, Ralph D.

    1994-01-01

    Moving software development into the engineering arena requires controllability, and to control a process, it must be measurable. Measuring the process does no good if the product is not also measured, i.e., being the best at producing an inferior product does not define a quality process. Also, not every number extracted from software development is a valid measurement. A valid measurement only results when we are able to verify that the number is representative of the attribute that we wish to measure. Many proposed software metrics are used by practitioners without these metrics ever having been validated, leading to costly but often useless calculations. Several researchers have bemoaned the lack of scientific precision in much of the published software measurement work and have called for validation of software metrics by measurement theory. This dissertation applies measurement theory to validate fifty proposed object-oriented software metrics (Li and Henry, 1993; Chidamber and Kemerer, 1994; Lorenz and Kidd, 1994).

  12. Validity of an Interactive Functional Reach Test.

    PubMed

    Galen, Sujay S; Pardo, Vicky; Wyatt, Douglas; Diamond, Andrew; Brodith, Victor; Pavlov, Alex

    2015-08-01

    Videogaming platforms such as the Microsoft (Redmond, WA) Kinect® are increasingly being used in rehabilitation to improve balance performance and mobility. These gaming platforms do not have built-in clinical measures that offer clinically meaningful data. We have now developed software that enables the Kinect sensor to assess a patient's balance using an interactive functional reach test (I-FRT). The aim of the study was to test the concurrent validity of the I-FRT and to establish the feasibility of implementing the I-FRT in a clinical setting. The concurrent validity of the I-FRT was tested among 20 healthy adults (mean age, 25.8±3.4 years; 14 women). The Functional Reach Test (FRT) was measured simultaneously by both the Kinect sensor using the I-FRT software and the Optotrak Certus® 3D motion-capture system (Northern Digital Inc., Waterloo, ON, Canada). The feasibility of implementing the I-FRT in a clinical setting was assessed by performing the I-FRT in 10 participants with mild balance impairments recruited from the outpatient physical therapy clinic (mean age, 55.8±13.5 years; four women) and obtaining their feedback using a NASA Task Load Index (NASA-TLX) questionnaire. There was moderate to good agreement between FRT measures made by the two measurement systems. The greatest agreement between the two measurement systems was found with the Kinect sensor placed at a distance of 2.5 m [intraclass correlation coefficient (2,k)=0.786; P<0.001] from the participant. Participants with mild balance impairments whose balance was assessed using the I-FRT software scored their experience favorably by assigning lower scores to the Frustration, Mental Demand, and Temporal Demand subscales of the NASA-TLX questionnaire. FRT measures made using the Kinect sensor I-FRT software provide a valid clinical measure that can be used with gaming platforms.
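
    A minimal sketch of the kind of computation involved, assuming the software derives reach from the Kinect skeleton's hand joint as the maximum forward excursion from the starting position (the coordinate frame and data below are hypothetical, not the published implementation).

        # Reach = max horizontal displacement of the hand from the first frame.
        import math

        def functional_reach(hand_positions):
            # hand_positions: (x, y, z) in metres per tracked frame
            x0, _, z0 = hand_positions[0]
            return max(math.hypot(x - x0, z - z0) for x, _, z in hand_positions)

        trial = [(0.10, 1.20, 2.50), (0.18, 1.18, 2.42), (0.32, 1.15, 2.31)]
        print(f"reach = {functional_reach(trial):.2f} m")   # reach = 0.29 m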

  13. Implementing Extreme Programming in Distributed Software Project Teams: Strategies and Challenges

    NASA Astrophysics Data System (ADS)

    Maruping, Likoebe M.

    Agile software development methods and distributed forms of organizing teamwork are two team process innovations that are gaining prominence in today's demanding software development environment. Individually, each of these innovations has yielded gains in the practice of software development. Agile methods have enabled software project teams to meet the challenges of an ever turbulent business environment through enhanced flexibility and responsiveness to emergent customer needs. Distributed software project teams have enabled organizations to access highly specialized expertise across geographic locations. Although much progress has been made in understanding how to more effectively manage agile development teams and how to manage distributed software development teams, managers have little guidance on how to leverage these two potent innovations in combination. In this chapter, I outline some of the strategies and challenges associated with implementing agile methods in distributed software project teams. These are discussed in the context of a study of a large-scale software project in the United States that lasted four months.

  14. National policies for technical change: Where are the increasing returns to economic research?

    PubMed Central

    Pavitt, Keith

    1996-01-01

    Improvements over the past 30 years in statistical data, analysis, and related theory have strengthened the basis for science and technology policy by confirming the importance of technical change in national economic performance. But two important features of scientific and technological activities in the Organization for Economic Cooperation and Development countries are still not addressed adequately in mainstream economics: (i) the justification of public funding for basic research and (ii) persistent international differences in investment in research and development and related activities. In addition, one major gap is now emerging in our systems of empirical measurement—the development of software technology, especially in the service sector. There are therefore dangers of diminishing returns to the usefulness of economic research, which continues to rely completely on established theory and established statistical sources. Alternative propositions that deserve serious consideration are: (i) the economic usefulness of basic research is in the provision of (mainly tacit) skills rather than codified and applicable information; (ii) in developing and exploiting technological opportunities, institutional competencies are just as important as the incentive structures that they face; and (iii) software technology developed in traditional service sectors may now be a more important locus of technical change than software technology developed in “high-tech” manufacturing. PMID:8917481

  15. Model Driven Engineering

    NASA Astrophysics Data System (ADS)

    Gaševic, Dragan; Djuric, Dragan; Devedžic, Vladan

    A relevant initiative from the software engineering community called Model Driven Engineering (MDE) is being developed in parallel with the Semantic Web (Mellor et al. 2003a). The MDE approach to software development suggests that one should first develop a model of the system under study, which is then transformed into the real thing (i.e., an executable software entity). The most important research initiative in this area is the Model Driven Architecture (MDA), which is being developed under the umbrella of the Object Management Group (OMG). This chapter describes the basic concepts of this software engineering effort.

  16. iLAP: a workflow-driven software for experimental protocol development, data acquisition and analysis

    PubMed Central

    2009-01-01

    Background In recent years, the genome biology community has expended considerable effort to confront the challenges of managing heterogeneous data in a structured and organized way and has developed laboratory information management systems (LIMS) for both raw and processed data. On the other hand, electronic notebooks were developed to record and manage scientific data and facilitate data sharing. Software that enables both management of large datasets and digital recording of laboratory procedures would serve a real need in laboratories using medium- and high-throughput techniques. Results We have developed iLAP (Laboratory data management, Analysis, and Protocol development), a workflow-driven information management system specifically designed to create and manage experimental protocols, and to analyze and share laboratory data. The system combines experimental protocol development, wizard-based data acquisition, and high-throughput data analysis into a single, integrated system. We demonstrate the power and the flexibility of the platform using a microscopy case study based on a combinatorial multiple fluorescence in situ hybridization (m-FISH) protocol and 3D-image reconstruction. iLAP is freely available under the open source license AGPL from http://genome.tugraz.at/iLAP/. Conclusion iLAP is a flexible and versatile information management system, which has the potential to close the gap between electronic notebooks and LIMS and can therefore be of great value for a broad scientific community. PMID:19941647

  17. Component-Based Visualization System

    NASA Technical Reports Server (NTRS)

    Delgado, Francisco

    2005-01-01

    A software system has been developed that gives engineers and operations personnel with no "formal" programming expertise, but who are familiar with the Microsoft Windows operating system, the ability to create visualization displays to monitor the health and performance of aircraft/spacecraft. This software system is currently supporting the X38 V201 spacecraft component/system testing and is intended to give users the ability to create, test, deploy, and certify their subsystem displays in a fraction of the time that it would take to do so using previous software and programming methods. Within the visualization system there are three major components: the developer, the deployer, and the widget set. The developer is a blank canvas with widget menu items that give users the ability to easily create displays. The deployer is an application that allows for the deployment of the displays created using the developer application. The deployer has additional functionality that the developer does not have, such as printing of displays, screen captures to files, windowing of displays, and also serves as the interface into the documentation archive and help system. The third major component is the widget set. The widgets are the visual representation of the items that will make up the display (i.e., meters, dials, buttons, numerical indicators, string indicators, and the like). This software was developed using Visual C++ and uses COTS (commercial off-the-shelf) software where possible.

  18. Intelligent Medical Systems for Aerospace Emergency Medical Services

    NASA Technical Reports Server (NTRS)

    Epler, John; Zimmer, Gary

    2004-01-01

    The purpose of this project is to develop a portable, hands-free device for emergency medical decision support to be used in remote or confined settings by non-physician providers. Phase I of the project will entail the development of a voice-activated device that will utilize an intelligent algorithm to provide guidance in establishing an airway in an emergency situation. The interactive, hands-free software will process requests for assistance based on verbal prompts and algorithmic decision-making. The device will allow the CMO to attend to the patient while receiving verbal instruction. The software will also feature graphic representations where these are felt to be helpful in aiding procedures. We will also develop a training program to orient users to the algorithmic approach, the use of the hardware and specific procedural considerations. We will validate the efficacy of this mode of technology application by testing in the Johns Hopkins Department of Emergency Medicine. Phase I of the project will focus on the validation of the proposed algorithm, testing and validation of the decision-making tool and modifications of medical equipment. In Phase II, we will produce the first-generation software for hands-free, interactive medical decision making for use in acute care environments.

  19. ATLAS software configuration and build tool optimisation

    NASA Astrophysics Data System (ADS)

    Rybkin, Grigory; Atlas Collaboration

    2014-06-01

    The ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories on 6 continents. To meet the challenge of configuring and building this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build-time performance, which was optimised through several approaches: reduction of the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of CMT commands used for build; and introduction of package-level build parallelism, i.e., parallelising the build of independent packages. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on CMT command optimisation in general, which made the commands approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CERN VMFS. The use of parallelism, caching and code optimisation reduced software build time and environment setup time significantly (by several times), increased the efficiency of multi-core computing resource utilisation, and considerably improved the software developer and user experience.
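
    Package-level build parallelism of the kind described reduces to scheduling: a package may be built once all packages it depends on have finished. The sketch below shows that scheduling loop in Python (the dependency graph, the use of make as the per-package build command, and the worker count are illustrative stand-ins for CMT's own machinery; a cyclic graph would deadlock this simple loop).

        # Build independent packages concurrently, respecting dependencies.
        import concurrent.futures as cf
        import subprocess

        DEPS = {"Core": set(), "Event": {"Core"},
                "Reco": {"Core", "Event"}, "Analysis": {"Reco"}}

        def build(pkg):
            subprocess.run(["make", "-C", pkg], check=True)  # stand-in build step
            return pkg

        def parallel_build(deps, workers=4):
            done, running = set(), {}
            with cf.ThreadPoolExecutor(max_workers=workers) as pool:
                while len(done) < len(deps):
                    for p, d in deps.items():          # submit every ready package
                        if p not in done and p not in running and d <= done:
                            running[p] = pool.submit(build, p)
                    finished = next(cf.as_completed(running.values()))
                    pkg = finished.result()
                    done.add(pkg)
                    del running[pkg]

        parallel_build(DEPS)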

  20. A Framework for Testing Scientific Software: A Case Study of Testing Amsterdam Discrete Dipole Approximation Software

    NASA Astrophysics Data System (ADS)

    Shao, Hongbing

    Software testing of scientific software systems often suffers from the test oracle problem, i.e., the lack of test oracles. The Amsterdam discrete dipole approximation code (ADDA) is a scientific software system that can be used to simulate light scattering by scatterers of various types, and its testing suffers from the test oracle problem. In this thesis work, I established a framework for testing scientific software systems and evaluated it using ADDA as a case study. To test ADDA, I first used the CMMIE code as a pseudo oracle to test ADDA in simulating light scattering by a homogeneous sphere scatterer. Comparable results were obtained between ADDA and the CMMIE code, validating ADDA for use with homogeneous sphere scatterers. Then I used an experimental result obtained for light scattering by a homogeneous sphere to further validate the use of ADDA with sphere scatterers. ADDA produced a light scattering simulation comparable to the experimentally measured result, further validating the use of ADDA for simulating light scattering by sphere scatterers. Then I used metamorphic testing to generate test cases covering scatterers of various geometries, orientations, and homogeneity or non-homogeneity. ADDA was tested under each of these test cases and all tests passed. The use of statistical analysis together with metamorphic testing is discussed as a future direction. In short, using ADDA as a case study, I established a testing framework, including the use of pseudo oracles, experimental results and metamorphic testing techniques, to test scientific software systems that suffer from test oracle problems. Each of these techniques is necessary and contributes to the testing of the software under test.
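
    The core of metamorphic testing is checking a relation between runs rather than a single expected output. A self-contained toy sketch follows (the "solver" is a stand-in whose sphere cross-section is rotation-invariant by construction; it is not ADDA's interface or physics).

        # Metamorphic relation: rotating a homogeneous sphere must not change
        # the computed cross-section, so two runs must agree within tolerance.
        import math

        def run_solver(params):                       # stand-in for invoking ADDA
            return math.pi * params["radius"] ** 2

        def metamorphic_test(params, transform, rel_tol=1e-9):
            base = run_solver(params)
            follow_up = run_solver(transform(params))
            assert abs(base - follow_up) <= rel_tol * abs(base)

        metamorphic_test({"radius": 1.0, "angle": 0.0},
                         lambda p: {**p, "angle": 45.0})
        print("metamorphic relation holds")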

  1. Hardware and software systems for the determination of charged particle parameters in low pressure plasmas using impedance-tuned Langmuir probes

    NASA Astrophysics Data System (ADS)

    Ye, Yuancai; Marcus, R. Kenneth

    1997-12-01

    A computer-controlled, impedance-tuned Langmuir probe data acquisition system and processing software package have been designed for the diagnostic study of low pressure plasmas. The combination of impedance tuning and a wide range of applied potentials (±100 V) provides a versatile system, applicable to a variety of analytical plasmas without significant modification. The automated probe system can be used to produce complete and undistorted current-voltage (i-V) curves with extremely low noise over the wide potential range. Based on these hardware and software systems, it is possible to determine all of the important charged particle parameters in a plasma: electron number density (n_e), ion number density (n_i), electron temperature (T_e), electron energy distribution function (EEDF), and average electron energy (<ε>). The complete data acquisition system and evaluation software are described in detail. A LabView (National Instruments Corporation, Austin, TX) application program has been developed for the Apple Macintosh line of microcomputers to control all of the operational aspects of the Langmuir probe experiments. The description here is mainly focused on the design aspects of the acquisition system, with the targets of extremely low noise and reduction of the influence of measurement noise in the calculation procedures. This is particularly important in the case of electron energy distribution functions, where multiple derivatives are calculated from the obtained i-V curves. A separate C-language data processing program has been developed and is included here to allow the reader to evaluate data obtained with the described hardware, or any i-V data imported in tab-separated-variable format. Both of the software systems are included on a Macintosh-formatted disk for use in other laboratories desiring these capabilities.
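
    The derivative step the text alludes to is the classic Druyvesteyn analysis: the EEDF is proportional (up to physical constants) to the second derivative of the electron current with respect to probe voltage, which is why noise suppression before differentiation matters so much. A minimal sketch, using a toy i-V curve and a Savitzky-Golay smoothing differentiator (NumPy/SciPy assumed; plasma potential placed at 0 V for simplicity):

        # Estimate an EEDF shape from a noisy i-V curve via d2I/dV2.
        import numpy as np
        from scipy.signal import savgol_filter

        V = np.linspace(-20, 5, 501)                      # probe bias (V)
        I = 1e-3 * np.exp(np.clip(V / 2.0, None, 0.0))    # toy electron current (A)
        I += np.random.normal(0, 1e-6, V.size)            # measurement noise

        # Smooth and differentiate twice in one pass; window/order are tuning choices.
        d2I = savgol_filter(I, window_length=51, polyorder=3, deriv=2,
                            delta=V[1] - V[0])
        eedf_shape = np.sqrt(np.maximum(-V, 0.0)) * d2I   # up to constants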

  2. Test Driven Development of a Parameterized Ice Sheet Component

    NASA Astrophysics Data System (ADS)

    Clune, T.

    2011-12-01

    Test driven development (TDD) is a software development methodology that offers many advantages over traditional approaches, including reduced development and maintenance costs, improved reliability, and superior design quality. Although TDD is widely accepted in many software communities, its suitability for scientific software is largely undemonstrated and warrants a degree of skepticism. Indeed, numerical algorithms pose several challenges to unit testing in general, and TDD in particular. Among these challenges are the need for simple, non-redundant closed-form expressions to compare against the results obtained from the implementation, as well as realistic error estimates. The necessity for serial and parallel performance raises additional concerns for many scientific applications. In previous work I demonstrated that TDD performed well for the development of a relatively simple numerical model that simulates the growth of snowflakes, but the results were anecdotal and of limited relevance to the far more complex software components typical of climate models. This investigation has now been extended by successfully applying TDD to the implementation of a substantial portion of a new parameterized ice sheet component within a full climate model. After a brief introduction to TDD, I will present techniques that address some of the obstacles encountered with numerical algorithms. I will conclude with some quantitative and qualitative comparisons against climate components developed in a more traditional manner.
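
    The pattern of comparing a numerical routine against a closed-form case with a realistic error bound looks like this in miniature (illustrative routine and tolerance, not the ice sheet component itself):

        # A TDD-style unit test: trapezoidal integration vs. a closed form.
        import math

        def trapezoid(f, a, b, n):
            h = (b - a) / n
            return h * (0.5 * f(a) + sum(f(a + i * h) for i in range(1, n)) + 0.5 * f(b))

        def test_trapezoid_matches_closed_form():
            # integral of sin over [0, pi] is exactly 2; tolerance reflects O(h^2) error
            approx = trapezoid(math.sin, 0.0, math.pi, 1000)
            assert abs(approx - 2.0) < 1e-5

        test_trapezoid_matches_closed_form()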

  3. Culture shock: Improving software quality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    de Jong, K.; Trauth, S.L.

    1988-01-01

    The concept of software quality can represent a significant shock to an individual who has been developing software for many years and who believes he or she has been doing a high quality job. The very idea that software includes lines of code and associated documentation is foreign and difficult to grasp, at best. Implementation of a software quality program hinges on the concept that software is a product whose quality needs improving. When this idea is introduced into a technical community that is largely ''self-taught'' and has been producing ''good'' software for some time, a fundamental understanding of the concepts associated with software is often weak. Software developers can react as if to say, ''What are you talking about? What do you mean I'm not doing a good job? I haven't gotten any complaints about my code yet!'' Coupling such surprise and resentment with the shock that software really is a product and software quality concepts do exist can fuel the volatility of these emotions. In this paper, we demonstrate that the concept of software quality can indeed pose a culture shock to developers. We also show that a ''typical'' quality assurance approach, that of imposing a standard and providing inspectors and auditors to assure its adherence, contributes to this shock and detracts from the very goal the approach should achieve. We offer an alternative, adopted through experience, to implement a software quality program: cooperative assistance. We show how cooperation, education, consultation and friendly assistance can overcome this culture shock. 3 refs.

  4. Enabling Agile Testing through Continuous Integration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stolberg, Sean E.

    2009-08-24

    A Continuous Integration system is often considered one of the key elements involved in supporting an agile software development and testing environment. As a traditional software tester transitioning to an agile development environment, it became clear to me that I would need to put this essential infrastructure in place and promote improved development practices in order to make the transition to agile testing possible. This experience report discusses a continuous integration implementation I led last year. The initial motivations for implementing continuous integration are discussed, and a pre- and post-assessment using Martin Fowler's "Practices of Continuous Integration" is provided along with the technical specifics of the implementation. Finally, I'll wrap up with a retrospective of my experiences implementing and promoting continuous integration within the context of agile testing.

  5. Practical Pocket PC Application w/Biometric Security

    NASA Technical Reports Server (NTRS)

    Logan, Julian

    2004-01-01

    I work in the Flight Software Engineering Branch, where we provide design and development of embedded real-time software applications for flight and supporting ground systems to support the NASA Aeronautics and Space Programs. In addition, this branch evaluates, develops and implements new technologies for embedded real-time systems, and maintains a laboratory for applications of embedded technology. The majority of microchips that are used in modern society have been programmed using embedded technology. These small chips can be found in microwaves, calculators, home security systems, cell phones and more. My assignment this summer entails working with an iPAQ HP 5500 Pocket PC. This top-of-the-line hand-held device is one of the first mobile PCs to introduce biometric security capabilities. Biometric security, in this case a fingerprint authentication system, is at the cutting edge of technology for securing information. The benefits of fingerprint authentication are enormous. The most significant are that it is extremely difficult to reproduce someone else's fingerprint, and it is equally difficult to lose or forget your own fingerprint, as opposed to a password or PIN. One of my goals for this summer is to integrate this technology with another Pocket PC application. The second task for the summer is to develop a simple application that provides an Astronaut EVA (Extravehicular Activity) Log Book capability. The Astronaut EVA Log Book is what an astronaut would use to report the status of field missions, crew physical health, successes, future plans, etc. My goal is to develop a user interface into which these data fields can be entered and stored. The applications that I am developing are created using eMbedded Visual C++ 4.0 with the Pocket PC 2003 Software Development Kit provided by Microsoft.

  6. NASA Ares I Crew Launch Vehicle Upper Stage Avionics and Software Overview

    NASA Technical Reports Server (NTRS)

    Nola, Charles L.; Blue, Lisa

    2008-01-01

    Building on the heritage of the Saturn and Space Shuttle Programs for the Design, Development, Test, and Evaluation (DDT&E) of avionics and software for NASA's Ares I Crew Launch Vehicle (CLV), the Ares I Upper Stage Element is a vital part of the Constellation Program's transportation system. The Upper Stage Element's Avionics Subsystem is actively proceeding toward its objective of delivering a flight-certified Upper Stage Avionics System for the Ares I CLV.

  7. Helicopter Controllability

    DTIC Science & Technology

    1989-09-01

    3. Program CC: Systems Technology, Inc. (STI) of Hawthorne, CA, develops and markets PC control system analysis and design software including… is marketed in Palo Alto, CA, by Applied i and can be used for both linear and non-linear control system analysis. Using TUTSIM involves developing… gravity centroid (u_cg) can be calculated as u_cg = (sum of p_i - sum of z_i) / (n - m)  (7-5), where p_i = poles, z_i = zeroes, n = number of poles, m = number of zeroes. If K…

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, William Eugene

    These slides describe different strategies for installing Python software. Although I am a big fan of Python software development, robust strategies for software installation remain a challenge. This talk describes several different installation scenarios. The Good: the user has administrative privileges - installing on Windows with an installer executable, installing with a Linux application utility, installing a Python package from the PyPI repository, and installing a Python package from source. The Bad: the user does not have administrative privileges - using a virtual environment to isolate package installations, and using an installer executable on Windows with a virtual environment. The Ugly: the user needs to install an extension package from source - installing a Python extension package from source, and PyCoinInstall - managing builds for Python extension packages. The last item, PyCoinInstall, describes a utility being developed for the COIN-OR software, which is used within the operations research community. COIN-OR includes a variety of Python and C++ software packages, and this script uses a simple plug-in system to support the management of package builds and installation.
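
    The "virtual environment" strategy for users without administrative privileges can be shown with the standard library alone; a small sketch follows (the environment name and the requests package are arbitrary examples):

        # Create an isolated environment, then install into it (no admin rights).
        import subprocess
        import sys
        import venv

        venv.create("myenv", with_pip=True)      # private site-packages + pip

        pip = "myenv/bin/pip" if sys.platform != "win32" else r"myenv\Scripts\pip.exe"
        subprocess.run([pip, "install", "requests"], check=True)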

  9. A testing-coverage software reliability model considering fault removal efficiency and error generation

    PubMed Central

    Li, Qiuying; Pham, Hoang

    2017-01-01

    In this paper, we propose a software reliability model that considers not only error generation but also fault removal efficiency combined with testing coverage information, based on a nonhomogeneous Poisson process (NHPP). During the past four decades, many software reliability growth models (SRGMs) based on NHPP have been proposed to estimate software reliability measures, most of which share the following assumptions: 1) it is a common phenomenon that the fault detection rate changes throughout the testing phase; 2) as a result of imperfect debugging, fault removal is accompanied by a fault re-introduction rate. However, few SRGMs in the literature differentiate between fault detection and fault removal, i.e., they seldom consider imperfect fault removal efficiency. In the practical software development process, fault removal efficiency cannot always be perfect: detected failures might not be removed completely, the original faults might still exist, and new faults might be introduced meanwhile, which is referred to as the imperfect debugging phenomenon. In this study, a model aiming to incorporate the fault introduction rate, fault removal efficiency and testing coverage into software reliability evaluation is developed, using testing coverage to express the fault detection rate and using fault removal efficiency to account for fault repair. We compare the performance of the proposed model with several existing NHPP SRGMs using three sets of real failure data based on five criteria. The results exhibit that the model gives better fitting and predictive performance. PMID:28750091

  10. Design and implementation of handheld and desktop software for the structured reporting of hepatic masses using the LI-RADS schema.

    PubMed

    Clark, Toshimasa J; McNeeley, Michael F; Maki, Jeffrey H

    2014-04-01

    The Liver Imaging Reporting and Data System (LI-RADS) can enhance communication between radiologists and clinicians if applied consistently. We identified an institutional need to improve liver imaging report standardization and developed handheld and desktop software to serve this purpose. We developed two complementary applications that implement the LI-RADS schema. A mobile application for iOS devices, written in the Objective-C language, allows for rapid characterization of hepatic observations under a variety of circumstances. A desktop application written in the Java language allows for comprehensive observation characterization and standardized report text generation. We chose the applications' languages and feature sets based on the computing resources of the target platforms, anticipated usage scenarios, and ease of application installation, deployment, and updating. Our primary results are the publication of the core source code implementing the LI-RADS algorithm and the availability of the applications for use worldwide via our website, http://www.liradsapp.com/. The Java application is free open-source software that can be integrated into nearly any vendor's reporting system. The iOS application is distributed through Apple's iTunes App Store. The observation categorizations of both programs have been manually validated to be correct. The iOS application has been used to characterize liver tumors during multidisciplinary conferences at our institution, and several faculty members, fellows, and residents have adopted the generated text of the Java application into their diagnostic reports. Although these two applications were developed for the specific reporting requirements of our liver tumor service, we intend to apply this development model to other diseases as well. Through semiautomated structured report generation and observation characterization, we aim to improve patient care while increasing radiologist efficiency. Published by Elsevier Inc.
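
    The report-text step reduces to deterministic templating once an observation has been categorized; a minimal sketch of that idea follows (the template, field names and example values are invented, and the category is supplied by the caller rather than computed, so no LI-RADS rules are restated here).

        # Generate standardized observation text from structured fields.
        def observation_text(obs_id, segment, size_mm, category):
            return (f"Observation {obs_id}: {size_mm} mm observation in "
                    f"hepatic segment {segment}. LI-RADS category {category}.")

        print(observation_text(1, "VII", 14, "LR-3"))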

  11. Digital Oblique Remote Ionospheric Sensing (DORIS) Program Development

    DTIC Science & Technology

    1992-04-01

    …waveforms. A new autoscaling technique for oblique ionograms … with the ARTIST software (Reinisch and Huang, 1983; Gamache et al., 1985), which is … The development and performance of a complete oblique ionogram autoscaling and inversion algorithm is presented. The inversion algorithm uses a three- … OTH radar. Subject terms: Oblique Propagation; Oblique Ionogram Autoscaling; Electron Density Profile Inversion; Simulated …

  12. Development of Software Tools for ADA Compliance Data Collection, Management, and Inquiry

    DOT National Transportation Integrated Search

    2014-07-01

    In this NUTC research project, the UNR research team developed an iOS application (named NDOT ADA Data) to efficiently and intuitively collect ADA inventory data with iPhones or iPads. This tool was developed to facilitate NDOT ADA data collect...

  13. Designing Better Camels: Developing Effective Documentation for Computer Software.

    ERIC Educational Resources Information Center

    Zacher, Candace M.

    This guide to the development of effective documentation for users of computer software begins by identifying five types of documentation, i.e., training manuals, user guides, tutorials, on-screen help comments, and troubleshooting manuals. Six steps in the development process are then outlined and briefly described: (1) planning and preparation;…

  14. A study on a pedicle-screw-based reduction method for artificially induced artifacts

    NASA Astrophysics Data System (ADS)

    Kim, Hyun-Ju; Lee, Hae-Kag; Cho, Jae-Hwan

    2017-09-01

    The purpose of this study is a quantitative analysis of the degree of reduction of the artifacts induced by pedicle screws through the application of the recently developed iterative metal artifact reduction (I-MAR) software. Screw-type implants composed of 4.5 g/cm3 titanium (Ti), with an approximate average computed tomography (CT) value of 6500 Hounsfield units (HU), of the kind used for the treatment of spinal diseases, were placed in paraffin, a tissue-equivalent material, and then dried. After the insertion, the scanning conditions were fixed at 120 kVp and 250 mA using multidetector computed tomography (MDCT) (Siemens, Germany). The slice thickness and the increment were set to 3 mm with a field of view (FOV) of 120 mm; the pitch was 0.8 and the rotation time 1 s. The I-MAR software was applied to the raw data of the acquired images to compare the CT-value changes of the post-correction images. When the I-MAR software was applied to animal vertebrae, the image loss from the black-hole effect could be reduced by 65.7%. When the I-MAR image loss (%) was compared for the white-streak-effect image, the high-intensity image loss due to the white-streak effect could be reduced by 91.34% through the application of the I-MAR software. In conclusion, a metal artifact due to a high-density material can be reduced more effectively when the I-MAR algorithm is applied than with the conventional MAR algorithm. I-MAR can provide information on the various tissues that form around the artifact and the reduced metal structures, which can help radiologists and clinicians reach an accurate diagnosis.

  15. Updates on Software development for a RICH detector

    NASA Astrophysics Data System (ADS)

    Voloshin, Andrew; Benmokhtar, Fatiha; Lendacky, Andrew; Goodwill, Justin

    2017-01-01

    The CLAS12 detector at the Thomas Jefferson National Accelerator Facility (TJNAF) is undergoing an upgrade. One of the improvements is the addition of a Ring Imaging Cherenkov (RICH) detector to improve particle identification in the 3-8 GeV/c momentum range. Approximately 400 multi-anode photomultiplier tubes (MAPMTs) are going to be used to detect Cherenkov radiation in the single-photoelectron spectra (SPS). Software for slow control as well as online monitoring is under development. I will present my work on the development of Java-based monitoring programs and explain their interaction with a MySQL database where the MAPMT information is stored, as well as the techniques used to visualize Cherenkov rings.

  16. Design and implementation of I2Vote--an interactive image-based voting system using windows mobile devices.

    PubMed

    van Ooijen, P M A; Broekema, A; Oudkerk, M

    2011-08-01

    To develop, implement and test a novel audience response system (ARS) that allows image-based interaction for radiology education. The ARS developed in this project is based on standard Personal Digital Assistants (PDAs) (HP iPAQ 114 classic handheld) running Microsoft® Windows Mobile® 6 Classic, with a large 3.5 in. TFT touch screen (320×240 pixel resolution), high luminance and integrated IEEE 802.11b/g wireless. For software development, Visual Studio 2008 Professional (Microsoft) was used, and all components were written in C#. Two test sessions were conducted to test the software technically, followed by two real classroom tests in a radiology class for medical students on thoracic radiology. The novel ARS, called I2Vote, was successfully implemented and provided an easy-to-use, stable setup. The acceptance by both students and teachers was very high, and interaction with the students improved because of the possibility of anonymous interaction. An easy-to-use handheld-based ARS that enables interactive, image-based teaching is achieved. The system effectively adds an extra dimension to the use of an ARS. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
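
    At its core, an image-based ARS only has to move tapped image coordinates from each handheld to the teacher's station; the sketch below shows one way to do that (the UDP transport, port and packet layout are assumptions for illustration, not the published I2Vote protocol).

        # Teacher-side collector: each handheld sends (student_id, x, y) taps.
        import socket
        import struct

        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("0.0.0.0", 5005))

        votes = []
        while len(votes) < 3:                       # e.g. wait for three answers
            payload, addr = sock.recvfrom(16)
            student_id, x, y = struct.unpack("!Hhh", payload[:6])
            votes.append((student_id, x, y))        # tap position in image pixels
        print("answers received:", votes)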

  17. Can your software engineer program your PLC?

    NASA Astrophysics Data System (ADS)

    Borrowman, Alastair J.; Taylor, Philip

    2016-07-01

    The use of Programmable Logic Controllers (PLCs) in the control of large physics experiments is ubiquitous [1, 2, 3]. The programming of these controllers is normally the domain of engineers with a background in electronics; this paper introduces PLC program development from the software engineer's perspective. PLC programs provide the link between control software running on PC-architecture systems and physical hardware controlled and monitored by digital and analog signals. The higher-level software running on the PC is typically responsible for accepting operator input and from this deciding when and how hardware connected to the PLC is controlled. The PLC accepts demands from the PC, considers the current state of its connected hardware and, if correct to do so (based upon interlocks or other constraints), adjusts its hardware output signals appropriately for the PC's demands. A published ICD (Interface Control Document) defines the PLC memory locations available to be written and read by the PC to control and monitor the hardware. Historically the method of programming PLCs has been ladder diagrams that closely resemble circuit diagrams; however, PLC manufacturers nowadays also provide, and promote, the use of higher-level programming languages [4]. Based on techniques used in the development of high-level PC software to control PLCs for multiple telescopes, this paper examines the development of PLC programs to operate the hardware of a medical cyclotron beamline controlled from a PC using the Experimental Physics and Industrial Control System (EPICS), which is also widely used in telescope control [5, 6, 7]. The PLC used is the new-generation Siemens S7-1200, programmed using Siemens' Pascal-based Structured Control Language (SCL), which is their implementation of Structured Text (ST). The approach described is that of a software engineer, utilising the Siemens Totally Integrated Automation (TIA) Portal integrated development environment (IDE) to create modular PLC programs based upon reusable functions capable of being unit tested without the PLC connected to hardware. Emphasis has been placed on designing an interface between EPICS and SCL that enforces correct operation of hardware through stringent separation of PC-accessible PLC memory and hardware I/O addresses used only by the PLC. The paper also introduces the method used to automate the creation, from the same source document, of the PLC memory structure (tag) definitions (defining memory used to access hardware I/O and that accessed by the PC) and of the PC program data structures (EPICS database records) used to access the permitted PLC addresses. From direct experience, this paper demonstrates the advantages of PLC program development being shared between electronic and software engineers, enabling use of the most appropriate processes from both the perspective of the hardware and the higher-level software used to control it.
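
    The single-source generation step can be pictured as a small script that expands one signal list into both artifacts; the sketch below is an illustration of the idea only (the signal list, PLC address syntax, record names and fields are invented, and real EPICS records would also need scan and link fields appropriate to the driver in use).

        # Expand one signal list into PLC tag definitions and EPICS records.
        SIGNALS = [
            # (name, PLC address, type, description) - illustrative entries
            ("BeamValveOpen", "%DB10.DBX0.0", "BOOL", "beamline valve open status"),
            ("MagnetCurrent", "%DB10.DBD4", "REAL", "magnet current readback"),
        ]

        for name, addr, typ, desc in SIGNALS:
            # PLC side: a tag definition line (e.g. for import into TIA Portal)
            print(f"{name}\t{addr}\t{typ}")
            # EPICS side: a matching input record skeleton
            rtyp = "bi" if typ == "BOOL" else "ai"
            print(f'record({rtyp}, "CYC:{name}") {{')
            print(f'  field(DESC, "{desc}")')
            print("}")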

  18. Tcl as a Software Environment for a TCS

    NASA Astrophysics Data System (ADS)

    Terrett, David L.

    2002-12-01

    This paper describes how the Tcl scripting language and its C API have been used as the software environment for a telescope pointing kernel, so that new pointing algorithms and software architectures can be developed and tested without needing a real-time operating system or real-time software environment. It has enabled development to continue outside the framework of a specific telescope project while building a system that is sufficiently complete to be capable of controlling real hardware, with minimum effort expended on replacing the services that would normally be provided by a real-time software environment. Tcl is used as a scripting language for configuring the system at startup and then as the command interface for controlling the running system; the Tcl C language API is used to provide a system-independent interface to file and socket I/O and other operating system services. The pointing algorithms themselves are implemented as a set of C++ objects calling C library functions that implement the algorithms described in [2]. Although originally designed as a test and development environment, the system, running as a soft real-time process on Linux, has been used to test the SOAR mount control system and will be used as the pointing kernel of the SOAR telescope control system.
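
    As a rough Python analogue of the pattern described above (a scripting layer that both configures the kernel at startup and serves as its run-time command interface), here is a toy dispatcher; the command names and kernel fields are invented for this sketch.

```python
"""Toy Python analogue of a script-driven pointing kernel: the same
command table configures the kernel at startup and drives it at run time
(command names and kernel fields are invented for this sketch)."""

class PointingKernel:
    def __init__(self):
        self.target = None
        self.tracking = False

    def slew(self, ra, dec):
        """Accept a new target position (degrees)."""
        self.target = (float(ra), float(dec))

    def track(self, state):
        """Start or stop the pointing loop ('on'/'off')."""
        self.tracking = (state == "on")

def dispatch(kernel, line):
    """Interpret one command line, e.g. 'slew 83.82 -5.39'."""
    command, *args = line.split()
    getattr(kernel, command)(*args)

kernel = PointingKernel()
for line in ["slew 83.82 -5.39", "track on"]:  # startup script, then commands
    dispatch(kernel, line)
print(kernel.target, kernel.tracking)
```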

  19. An Experimental Investigation of Computer Program Development Approaches and Computer Programming Metrics.

    DTIC Science & Technology

    1979-12-01

    team programming in reducing software development costs relative to ad hoc approaches and improving software product quality relative to...are interpreted as demonstrating the advantages of disciplined team programming in reducing software development costs relative to ad hoc approaches...is due partially to the cost and impracticality of a valid experimental setup within a production environment. Thus the question remains, are

  20. Constellation Training Facility Support

    NASA Technical Reports Server (NTRS)

    Flores, Jose M.

    2008-01-01

    The National Aeronautics and Space Administration is developing the next set of vehicles that will take men back to the Moon under the Constellation Program. The Constellation Training Facility (CxTF) is a project in development that will be used to train astronauts, instructors, and flight controllers in the operation of Constellation Program vehicles. It will also be used for procedure verification and for validation of flight software and console tools. The CxTF will have simulations for the Crew Exploration Vehicle (CEV), Crew Module (CM), CEV Service Module (SM), Launch Abort System (LAS), Spacecraft Adapter (SA), Crew Launch Vehicle (CLV), Pressurized Cargo Variant CM, Pressurized Cargo Variant SM, Cargo Launch Vehicle, Earth Departure Stage (EDS), and the Lunar Surface Access Module (LSAM). The facility will consist of part-task and full-task trainers, each with a specific set of mission training capabilities. Part-task trainers will be used for focused training on a single vehicle system or set of related systems. Full-task trainers will be used for training on complete vehicles and all of their subsystems. Support was provided in both the software development and project planning areas of the CxTF project. Simulation software was developed for the hydraulic system of the Thrust Vector Control (TVC) of the Ares I launch vehicle. The TVC system actuates the nozzle gimbal to steer the upper stage of the Ares I rocket. Also, software was developed using C standards to send and receive data to and from hand controllers to be used in CxTF cockpit simulations. The hand controllers provide movement in all six rotational and translational axes. Under Project Planning & Control, support was provided for the development and maintenance of integrated schedules for both the Constellation Training Facility and the Missions Operations Facilities Division. These schedules maintain communication between projects at different levels. The CxTF support provided requires continuous maintenance since the project is still in its initial development phases.
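
    A toy sketch, in Python, of the kind of actuator simulation described above: a rate-limited first-order model stepping a gimbal angle toward its command. The time constant, rate limit, and step size are invented for illustration and are not CxTF or Ares I values.

```python
"""Toy rate-limited first-order model of a hydraulic TVC actuator driving
a gimbal angle toward its command; all constants are invented, not CxTF
or Ares I values."""
TAU = 0.15         # actuator time constant, s (assumed)
RATE_LIMIT = 10.0  # maximum gimbal rate, deg/s (assumed)
DT = 0.01          # integration step, s

def step(angle, command):
    """Advance the gimbal angle one time step toward the command."""
    rate = (command - angle) / TAU
    rate = max(-RATE_LIMIT, min(RATE_LIMIT, rate))
    return angle + rate * DT

angle = 0.0
for _ in range(200):  # simulate 2 s of motion toward a 3-degree command
    angle = step(angle, 3.0)
print(f"gimbal angle after 2 s: {angle:.3f} deg")
```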

  1. Design and development of control unit and software for the ADFOSC instrument of the 3.6 m Devasthal optical telescope

    NASA Astrophysics Data System (ADS)

    Kumar, T. S.

    2016-08-01

    In this paper, we describe the details of the control unit and GUI software for positioning two filter wheels, a slit wheel and a grism wheel in the ADFOSC instrument. This is a first-generation instrument being built for the 3.6 m Devasthal optical telescope. The control hardware consists of five electronic boards based on low-cost 8-bit PIC microcontrollers, distributed over an I2C bus. The four wheels are controlled by four identical boards configured in I2C slave mode, while the fifth board acts as an I2C master for sending commands to and receiving status from the slave boards. The master also communicates with the interfacing PC over the TCP/IP protocol using simple ASCII commands. Stepper motors, along with suitable driver amplifiers, move the wheels. Homing after power-on is achieved using Hall-effect sensors. By implementing distributed control units of identical design, modularity is achieved, enabling easier maintenance and upgrades. GUI-based software for commanding the instrument was developed in Microsoft Visual C++. For operating the system during observations the user selects the normal mode, while an engineering mode offers additional flexibility and low-level control during maintenance and testing. A detailed time-stamped log of commands, status and errors is continuously generated. Both the control unit and the software have been successfully tested and integrated with the ADFOSC instrument.
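
    A minimal Python client for the kind of ASCII-over-TCP/IP commanding described above; the host, port, and command vocabulary below are placeholders, not the instrument's actual protocol.

```python
"""Minimal ASCII-over-TCP/IP command client; host, port, and command
vocabulary are placeholders, not the instrument's actual protocol."""
import socket

def send_command(host, port, command, timeout=5.0):
    """Send one ASCII command line and return the controller's reply."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall((command + "\r\n").encode("ascii"))
        reply = sock.makefile("r", encoding="ascii").readline()
    return reply.strip()

# Hypothetical usage: move filter wheel 1 to slot 3, then query its status.
# print(send_command("192.168.1.50", 5000, "MOVE FW1 3"))
# print(send_command("192.168.1.50", 5000, "STATUS FW1"))
```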

  2. Towards Test Driven Development for Computational Science with pFUnit

    NASA Technical Reports Server (NTRS)

    Rilee, Michael L.; Clune, Thomas L.

    2014-01-01

    Developers working in Computational Science & Engineering (CSE)/High Performance Computing (HPC) must contend with constant change due to advances in computing technology and science. Test Driven Development (TDD) is a methodology that mitigates software development risks due to change, at the cost of adding comprehensive and continuous testing to the development process. Testing frameworks tailored for CSE/HPC, like pFUnit, can lower the barriers to such testing, yet CSE software faces unique constraints foreign to the broader software engineering community. Effective testing of numerical software requires a comprehensive suite of oracles, i.e., use cases with known answers, as well as robust estimates for the unavoidable numerical errors associated with implementation in finite-precision arithmetic. At first glance these concerns often seem exceedingly challenging or even insurmountable for real-world scientific applications. However, we argue that this common perception is incorrect and driven by (1) a conflation between model validation and software verification and (2) the general tendency in the scientific community to develop relatively coarse-grained, large procedures that compound numerous algorithmic steps. We believe TDD can be applied routinely to numerical software if developers pursue fine-grained implementations that permit testing, neatly side-stepping concerns about needing nontrivial oracles as well as the accumulation of errors. We present an example of a successful, complex legacy CSE/HPC code whose development process shares some aspects with TDD, which we contrast with current and potential capabilities. A mix of our proposed methodology and framework support should enable everyday use of TDD by CSE-expert developers.
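
    For concreteness, a sketch of the fine-grained testing style argued for above: a small numerical routine, an exact oracle, and an explicit finite-precision tolerance. pFUnit itself targets Fortran; this is a Python illustration of the same pattern.

```python
"""Fine-grained numerical testing: a small routine, an exact oracle, and
an explicit finite-precision tolerance (Python stand-in for pFUnit)."""
import math
import unittest

def trapezoid(f, a, b, n):
    """Composite trapezoid rule on n subintervals."""
    h = (b - a) / n
    return h * (0.5 * f(a) + sum(f(a + i * h) for i in range(1, n)) + 0.5 * f(b))

class TestTrapezoid(unittest.TestCase):
    def test_against_exact_integral(self):
        # Oracle: the integral of sin over [0, pi] is exactly 2.
        approx = trapezoid(math.sin, 0.0, math.pi, 1000)
        # Trapezoid error is O(h^2), roughly 2.6e-6 here; 1e-5 is a safe bound.
        self.assertAlmostEqual(approx, 2.0, delta=1e-5)

if __name__ == "__main__":
    unittest.main()
```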

  3. Man-machine Integration Design and Analysis System (MIDAS) Task Loading Model (TLM) experimental and software detailed design report

    NASA Technical Reports Server (NTRS)

    Staveland, Lowell

    1994-01-01

    This is the experimental and software detailed design report for the prototype task loading model (TLM) developed as part of the man-machine integration design and analysis system (MIDAS), as implemented and tested in phase 6 of the Army-NASA Aircrew/Aircraft Integration (A3I) Program. The A3I program is an exploratory development effort to advance the capabilities and use of computational representations of human performance and behavior in the design, synthesis, and analysis of manned systems. The MIDAS TLM computationally models the demands designs impose on operators to aid engineers in the conceptual design of aircraft crewstations. This report describes the TLM and the results of a series of experiments which were run during this phase to test its capabilities as a predictive task demand modeling tool. Specifically, it includes discussions of: the inputs and outputs of the TLM, the theories underlying it, the results of the test experiments, the use of the TLM as both a stand-alone tool and part of a complete human operator simulation, and a brief introduction to the TLM software design.

  4. 48 CFR 752.7005 - Submission requirements for development experience documents.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., technologies, management, research, results and experience as outlined in the Agency's ADS Chapter 540, section... paragraph (b)(1)(i) of this clause. (2) Format. (i) Descriptive information is required for all Contractor... descriptive information: (A) Name and version of the application software used to create the file, e.g., Word...

  5. 48 CFR 752.7005 - Submission requirements for development experience documents.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., technologies, management, research, results and experience as outlined in the Agency's ADS Chapter 540, section... paragraph (b)(1)(i) of this clause. (2) Format. (i) Descriptive information is required for all Contractor... descriptive information: (A) Name and version of the application software used to create the file, e.g., Word...

  6. 48 CFR 752.7005 - Submission requirements for development experience documents.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ..., technologies, management, research, results and experience as outlined in the Agency's ADS Chapter 540, section... paragraph (b)(1)(i) of this clause. (2) Format. (i) Descriptive information is required for all Contractor... descriptive information: (A) Name and version of the application software used to create the file, e.g., Word...

  7. 48 CFR 752.7005 - Submission requirements for development experience documents.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ..., technologies, management, research, results and experience as outlined in the Agency's ADS Chapter 540, section... paragraph (b)(1)(i) of this clause. (2) Format. (i) Descriptive information is required for all Contractor... descriptive information: (A) Name and version of the application software used to create the file, e.g., Word...

  8. 48 CFR 752.7005 - Submission requirements for development experience documents.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., technologies, management, research, results and experience as outlined in the Agency's ADS Chapter 540, section... paragraph (b)(1)(i) of this clause. (2) Format. (i) Descriptive information is required for all Contractor... descriptive information: (A) Name and version of the application software used to create the file, e.g., Word...

  9. System Integration and Interface Transition Issues.

    DTIC Science & Technology

    1977-04-01

    ... Systems Design and Documentation - An Introduction to the HIPO Method, Van Nostrand Reinhold Co. (1976). [34] Peter Freeman, "Toward Improved Review of Software Design," Proc. National Computer Conf. 44, AFIPS Press (1975) pp 329-334. [35] Peter G. Neumann, "Software Development & Proofs of Multi-Level

  10. Analysis of Schedule Determination in Software Program Development and Software Development Estimation Models

    DTIC Science & Technology

    1988-09-01

    Contents excerpt: SLIM, 24; SoftCost-R, 26; SPQR/20, 28; ... PRICE-S, 83; SoftCost-R, 84; SPQR/20, 84; System-3, 85; Summary ... 128; Appendix G: SoftCost-R Input Values, 129; Appendix H: SoftCost-R Resources Estimate, 131; Appendix I: SPQR

  11. Institutional Logics, Indie Software Developers and Platform Governance

    ERIC Educational Resources Information Center

    Qiu, Yixin

    2013-01-01

    This two-essay dissertation aims to study institutional logics in the context of Apple's independent third-party software developers. In essay 1, I investigate the embedded agency aspect of the institutional logics theory. It builds on the premise that logics constrain preferences, interests and behaviors of individuals and organizations, thereby…

  12. Multidisciplinary Optimization Branch Experience Using iSIGHT Software

    NASA Technical Reports Server (NTRS)

    Padula, S. L.; Korte, J. J.; Dunn, H. J.; Salas, A. O.

    1999-01-01

    The Multidisciplinary Optimization (MDO) Branch at NASA Langley Research Center is investigating frameworks for supporting multidisciplinary analysis and optimization research. An optimization framework can improve the design process while reducing time and costs. A framework provides software and system services to integrate computational tasks and allows the researcher to concentrate more on the application and less on the programming details. A framework also provides a common working environment and a full range of optimization tools, and so increases the productivity of multidisciplinary research teams. Finally, a framework enables staff members to develop applications for use by disciplinary experts in other organizations. Since the release of version 4.0, the MDO Branch has gained experience with the iSIGHT framework developed by Engineous Software, Inc. This paper describes experiences with four aerospace applications: (1) reusable launch vehicle sizing, (2) aerospike nozzle design, (3) low-noise rotorcraft trajectories, and (4) acoustic liner design. All applications have been successfully tested using the iSIGHT framework, except for the aerospike nozzle problem, which is in progress. Brief overviews of each problem are provided. The problem descriptions include the number and type of disciplinary codes, as well as an estimate of the multidisciplinary analysis execution time. In addition, the optimization methods, objective functions, design variables, and design constraints are described for each problem. Discussions on the experience gained and lessons learned are provided for each problem. These discussions include the advantages and disadvantages of using the iSIGHT framework for each case as well as the ease of use of various advanced features. Potential areas of improvement are identified.

  13. Developing a Promotional Video

    ERIC Educational Resources Information Center

    Epley, Hannah K.

    2014-01-01

    There is a need for Extension professionals to show clientele the benefits of their program. This article shares how promotional videos are one way of reaching audiences online. An example is given of how a promotional video was developed and used with iMovie software. Tips are offered for how professionals can create a promotional video and…

  14. Automated Report Generation for Research Data Repositories: From i2b2 to PDF.

    PubMed

    Thiemann, Volker S; Xu, Tingyan; Röhrig, Rainer; Majeed, Raphael W

    2017-01-01

    We developed an automated toolchain to generate reports of i2b2 data. It is based on free open source software and runs on a Java Application Server. It is successfully used in an ED registry project. The solution is highly configurable and portable to other projects based on i2b2 or compatible factual data sources.

  15. ProteoWizard: open source software for rapid proteomics tools development.

    PubMed

    Kessner, Darren; Chambers, Matt; Burke, Robert; Agus, David; Mallick, Parag

    2008-11-01

    The ProteoWizard software project provides a modular and extensible set of open-source, cross-platform tools and libraries. The tools perform proteomics data analyses; the libraries enable rapid tool creation by providing a robust, pluggable development framework that simplifies and unifies data file access, and performs standard proteomics and LCMS dataset computations. The library, which has been written using modern C++ techniques and design principles and supports a variety of platforms with native compilers, contains readers and writers for the mzML data format. The software has been specifically released under the Apache v2 license to ensure it can be used in both academic and commercial projects. In addition to the library, we also introduce a rapidly growing set of companion tools whose implementation helps to illustrate the simplicity of developing applications on top of the ProteoWizard library. Cross-platform software that compiles using native compilers (i.e. GCC on Linux, MSVC on Windows and XCode on OSX) is available for download free of charge at http://proteowizard.sourceforge.net. This website also provides code examples and documentation. It is our hope that the ProteoWizard project will become a standard platform for proteomics development; consequently, code use, contribution and further development are strongly encouraged.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Price, H.L.

    Much of the polymer composites industry is built around the thermochemical conversion of raw material into useful composites. The raw materials (molding compound, prepreg) often are made up of thermosetting resins and small fibers or particles. While this conversion can follow a large number of paths, only a few paths are efficient, economical and lead to desirable composite properties. Processing instrument (P/I) technology enables a computer to sense and interpret changes taking place during the cure of prepreg or molding compound. P/I technology has been used to make estimates of gel time and cure time, thermal diffusivity measurements and transition temperature measurements. Control and sensing software is comparatively straightforward. The interpretation of results with appropriate software is under development.

  17. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  18. Digital Library Storage using iRODS Data Grids

    NASA Astrophysics Data System (ADS)

    Hedges, Mark; Blanke, Tobias; Hasan, Adil

    Digital repository software provides a powerful and flexible infrastructure for managing and delivering complex digital resources and metadata. However, issues can arise in managing the very large, distributed data files that may constitute these resources. This paper describes an implementation approach that combines the Fedora digital repository software with a storage layer implemented as a data grid, using the iRODS middleware developed by DICE (Data Intensive Cyber Environments) as the successor to SRB. This approach allows us to use Fedora's flexible architecture to manage the structure of resources and to provide application-layer services to users. The grid-based storage layer provides efficient support for managing and processing the underlying distributed data objects, which may be very large (e.g. audio-visual material). The Rule Engine built into iRODS is used to integrate complex workflows at the data level that need not be visible to users, e.g. digital preservation functionality.

  19. Applicability of SREM to the Verification of Management Information System Software Requirements. Volume I.

    DTIC Science & Technology

    1981-04-30

    However, SREM was not designed to harmonize these kinds of problems. Rather, it is a tool to investigate the logic of the processing specified in the... design. Supporting programs were also conducted to perform basic research into such areas as software reliability, static and dynamic validation techniques...development. • Maintain requirements development independent of the target machine and the eventual software design. • Allow for easy response to

  20. Enabling joined-up decision making with geotemporal information

    NASA Astrophysics Data System (ADS)

    Smith, M. J.; Ahmed, S. E.; Purves, D. W.; Emmott, S.; Joppa, L. N.; Caldararu, S.; Visconti, P.; Newbold, T.; Formica, A. F.

    2015-12-01

    While the use of geospatial data to assist in decision making is becoming increasingly common, the use of geotemporal information (information that can be indexed by geographical space and time) is much rarer. I will describe our scientific research and software development efforts intended to advance the availability and use of geotemporal information in general. I will show two recent examples of "stacking" geotemporal information to support land use decision making in the Brazilian Amazon and Kenya, involving data-constrained predictive models and empirically derived datasets of road development, deforestation, carbon, agricultural yields, water purification and poverty alleviation services, and will show how we use trade-off analyses and constraint reasoning algorithms to explore the costs and benefits of different decisions. For the Brazilian Amazon we explore tradeoffs involved in different deforestation scenarios, while for Kenya we explore the impacts of conserving forest to support international carbon conservation initiatives (REDD+). I will also illustrate the cloud-based software tools we have developed to enable anyone to access geotemporal information, gridded (e.g. climate) or non-gridded (e.g. protected areas), for the past, present or future and incorporate such information into their analyses (e.g. www.fetchclimate.org). Finally, I will show how we train new predictive models on such data using Bayesian techniques, combining satellite and ground-measured data with predictive models to forecast how crops might respond to climate change.

  1. DANTi: Detect and Avoid iN The Cockpit

    NASA Technical Reports Server (NTRS)

    Chamberlain, James; Consiglio, Maria; Munoz, Cesar

    2017-01-01

    Mid-air collision risk continues to be a concern for manned aircraft operations, especially near busy non-towered airports. The use of Detect and Avoid (DAA) technologies and draft standards developed for unmanned aircraft systems (UAS), either alone or in combination with other collision avoidance technologies, may be useful in mitigating this collision risk for manned aircraft. This paper describes a NASA research effort known as DANTi (DAA iN The Cockpit), including the initial development of the concept of use, a software prototype, and results from initial flight tests conducted with this prototype. The prototype used a single Automatic Dependent Surveillance - Broadcast (ADS-B) traffic sensor and the own aircraft's position, track, heading and air data information, along with NASA-developed DAA software to display traffic alerts and maneuver guidance to manned aircraft pilots on a portable tablet device. Initial flight tests with the prototype showed a successful DANTi proof-of-concept, but also demonstrated that the traffic separation parameter set specified in the RTCA SC-228 Phase I DAA MOPS may generate excessive false alerts during traffic pattern operations. Several parameter sets with smaller separation values were also tested in flight, one of which yielded more timely alerts for the maneuvers tested. Results from this study may further inform future DANTi efforts as well as Phase II DAA MOPS development.
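
    A highly simplified, horizontal-only sketch of the kind of alerting computation DAA logic performs, for illustration; the thresholds below are placeholders, not the MOPS values discussed above, and NASA's actual DAA software is far more complete.

```python
"""Highly simplified horizontal-only alerting check in the spirit of DAA
well-clear logic; thresholds are placeholders, not MOPS values."""
import math

def projected_min_separation(rx, ry, vx, vy, horizon):
    """Minimum horizontal distance (m) to an intruder over the look-ahead
    window, from relative position (m) and relative velocity (m/s)."""
    speed2 = vx * vx + vy * vy
    t_cpa = 0.0 if speed2 == 0 else -(rx * vx + ry * vy) / speed2
    t_cpa = min(max(t_cpa, 0.0), horizon)  # clamp to [0, horizon]
    return math.hypot(rx + vx * t_cpa, ry + vy * t_cpa)

THRESHOLD_M = 1100.0  # placeholder separation threshold
LOOKAHEAD_S = 60.0    # placeholder look-ahead time

# Intruder 2 km ahead with a 200 m lateral offset, closing at 50 m/s.
d_min = projected_min_separation(2000.0, 200.0, -50.0, 0.0, LOOKAHEAD_S)
print("ALERT" if d_min < THRESHOLD_M else "clear", f"(min sep {d_min:.0f} m)")
```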

  2. Open source tools for ATR development and performance evaluation

    NASA Astrophysics Data System (ADS)

    Baumann, James M.; Dilsavor, Ronald L.; Stubbles, James; Mossing, John C.

    2002-07-01

    Early in almost every engineering project, a decision must be made about tools: should I buy off-the-shelf tools, or should I develop my own? Either choice can involve significant cost and risk. Off-the-shelf tools may be readily available, but they can be expensive to purchase and license, and may not be flexible enough to satisfy all project requirements. On the other hand, developing new tools permits great flexibility, but it can be time- (and budget-) consuming, and the end product still may not work as intended. Open source software has the advantages of both approaches without many of the pitfalls. This paper examines the concept of open source software, including its history, unique culture, and informal yet closely followed conventions. These characteristics influence the quality and quantity of software available, and ultimately its suitability for serious ATR development work. We give an example where Python, an open source scripting language, and OpenEV, a viewing and analysis tool for geospatial data, have been incorporated into ATR performance evaluation projects. While this case highlights the successful use of open source tools, we also offer important insight into risks associated with this approach.
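
    As a flavor of the Python-based evaluation work mentioned above, a small sketch computing probability of detection and false alarms from declared detections matched against ground truth; the greedy matching rule and radius are assumptions for the sketch.

```python
"""Sketch of ATR scoring: probability of detection and false alarms from
detections matched greedily against ground truth (rule and radius are
assumptions, not a project standard)."""
import math

def score(detections, truths, match_radius=10.0):
    """detections and truths are lists of (x, y) locations."""
    unmatched = list(truths)
    hits = 0
    for dx, dy in detections:
        best = min(unmatched, default=None,
                   key=lambda t: math.hypot(t[0] - dx, t[1] - dy))
        if best is not None and math.hypot(best[0] - dx, best[1] - dy) <= match_radius:
            unmatched.remove(best)  # each truth may be claimed only once
            hits += 1
    pd = hits / len(truths) if truths else 0.0
    return pd, len(detections) - hits  # Pd and false-alarm count

pd, fa = score([(12, 9), (50, 50), (200, 10)], [(10, 10), (52, 47)])
print(f"Pd = {pd:.2f}, false alarms = {fa}")
```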

  3. International MODIS and AIRS Processing Package (IMAPP) Implementation of Infusion of Satellite Data into Environmental Applications-International (IDEA-I) for Air Quality Forecasts using Suomi-NPP, Terra and Aqua Aerosol Retrievals

    NASA Astrophysics Data System (ADS)

    Davies, J. E.; Strabala, K.; Pierce, R. B.; Huang, A.

    2016-12-01

    Fine mode aerosols play a significant role in public health through their impact on respiratory and cardiovascular disease. IDEA-I (Infusion of Satellite Data into Environmental Applications-International) is a real-time system for trajectory-based forecasts of aerosol dispersion that can assist in the prediction of poor air quality events. We released a direct broadcast version of IDEA-I for aerosol trajectory forecasts in June 2012 under the International MODIS and AIRS Processing Package (IMAPP). In January 2014 we updated this application with website software to display multi-satellite products. Now we have added VIIRS aerosols from the Suomi National Polar-orbiting Partnership (S-NPP). IMAPP is a NASA-funded and freely distributed software package developed at the Space Science and Engineering Center of the University of Wisconsin-Madison that has over 2,300 registered users worldwide. With IMAPP, any ground station capable of receiving direct broadcast from Terra or Aqua can produce calibrated and geolocated radiances and a suite of environmental products. These products include the MODIS AOD required by IDEA-I. VIIRS AOD for IDEA-I can be generated by the Community Satellite Processing Package (CSPP) VIIRS EDR Version 2.0 Software for Suomi NPP. CSPP is also developed and distributed by the Space Science and Engineering Center. This presentation describes our updated IMAPP implementation of IDEA-I through an example of its operation in a region known for episodic poor air quality events.
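
    A toy forward-trajectory step of the general kind such trajectory-based forecast systems perform, shown in Python; real systems interpolate gridded winds in space, time, and pressure level rather than using the constant wind assumed here.

```python
"""Toy forward-trajectory step: advect a parcel with a constant wind.
Real trajectory systems interpolate gridded winds; this only makes the
basic advection step explicit."""
import math

EARTH_RADIUS_M = 6.371e6

def advect(lat, lon, u, v, dt_s):
    """Advance one parcel by winds u (eastward) and v (northward), m/s."""
    dlat = math.degrees(v * dt_s / EARTH_RADIUS_M)
    dlon = math.degrees(u * dt_s / (EARTH_RADIUS_M * math.cos(math.radians(lat))))
    return lat + dlat, lon + dlon

lat, lon = 43.07, -89.40  # starting point (Madison, WI)
for _ in range(48):       # 48 hourly steps in a 10 m/s westerly wind
    lat, lon = advect(lat, lon, 10.0, 0.0, 3600.0)
print(f"parcel after 48 h: {lat:.2f}N, {lon:.2f}E")
```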

  4. Proposal for hierarchical description of software systems

    NASA Technical Reports Server (NTRS)

    Thauboth, H.

    1973-01-01

    The programming of digital computers has developed into a new dimension full of difficulties, because the hardware of computers has become so powerful that more complex applications are entrusted to computers. The costs of software development, verification, and maintenance are outpacing those of the hardware, and the trend is toward further increases in the sophistication of computer applications and consequently of the software. To obtain better visibility into software systems and to improve the structure of software systems for better tests, verification, and maintenance, a clear but rigorous description and documentation of software is needed. The purpose of the report is to extend the present methods in order to obtain documentation that better reflects the interplay between the various components and functions of a software system at different levels of detail without losing precision in expression. This is done by the use of block diagrams, sequence diagrams, and cross-reference charts. In the appendices, examples from an actual large software system, i.e. the Marshall System for Aerospace Systems Simulation (MARSYAS), are presented. The proposed documentation structure is compatible with automation of updating significant portions of the documentation for better software change control.

  5. Reviews Book: Enjoyable Physics Equipment: SEP Colorimeter Box Book: Pursuing Power and Light Equipment: SEP Bottle Rocket Launcher Equipment: Sciencescope GLE Datalogger Equipment: EDU Logger Book: Physics of Sailing Book: The Lightness of Being Software: Logotron Insight iLog Studio iPhone Apps Lecture: 2010 IOP Schools and Colleges Lecture Web Watch

    NASA Astrophysics Data System (ADS)

    2010-09-01

    WE RECOMMEND Enjoyable Physics Mechanics book makes learning more fun SEP Colorimeter Box A useful and inexpensive colorimeter for the classroom Pursuing Power and Light Account of the development of science in the 19th century SEP Bottle Rocket Launcher An excellent resource for teaching about projectiles GLE Datalogger GPS software is combined with a datalogger EDU Logger Remote datalogger has greater sensing abilities Logotron Insight iLog Studio Software enables datalogging, data analysis and modelling iPhone Apps Mobile phone games aid study of gravity WORTH A LOOK Physics of Sailing Book journeys through the importance of physics in sailing The Lightness of Being Study of what the world is made from LECTURE The 2010 IOP Schools and Colleges Lecture presents the physics of fusion WEB WATCH Planet Scicast pushes boundaries of pupil creativity

  6. Modeling Physical Systems Using Vensim PLE Systems Dynamics Software

    NASA Astrophysics Data System (ADS)

    Widmark, Stephen

    2012-02-01

    Many physical systems are described by time-dependent differential equations or systems of such equations. This makes it difficult for students in an introductory physics class to solve many real-world problems since these students typically have little or no experience with this kind of mathematics. In my high school physics classes, I address this problem by having my students use a variety of software solutions to model physical systems described by differential equations. These include spreadsheets, applets, software my students themselves create, and systems dynamics software. For the latter, cost is often the main issue in choosing a solution for use in a public school, and so I researched no-cost software. I found Sphinx SD, OptiSim, Systems Dynamics, Simile (Trial Edition), and Vensim PLE. In evaluating each of these solutions, I looked for the fewest restrictions in the license for educational use, ease of use by students, power, and versatility. In my opinion, Vensim PLE best fulfills these criteria.
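
    As a worked example of the kind of time-dependent problem mentioned above, here is a falling object with quadratic drag, m dv/dt = m g - c v^2, integrated by plain Euler stepping in Python (parameter values are illustrative):

```python
"""Euler integration of a falling object with quadratic air drag,
m dv/dt = m g - c v^2 (parameter values are illustrative)."""
m, g, c = 80.0, 9.81, 0.27  # kg, m/s^2, drag coefficient (skydiver-like)
dt, v, t = 0.01, 0.0, 0.0

while t < 30.0:              # integrate for 30 simulated seconds
    a = g - (c / m) * v**2   # net acceleration
    v += a * dt
    t += dt

print(f"speed after 30 s: {v:.1f} m/s (terminal ~ {(m * g / c) ** 0.5:.1f} m/s)")
```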

  7. Software System User’s Manual, Reference Manual, and Installation Guide for the Test Engineer’s Assistant System.

    DTIC Science & Technology

    1989-02-28

    AD-A259 245. Research Triangle Institute. Software System User's Manual, Reference Manual, and Installation Guide for the Test Engineer's Assistant System. February 28, 1989. Contract No. DAAL01-86-C-0039. Prepared for: Department of the Army, Electronics Research and Development Command, Fort Monmouth, New Jersey 07703. Prepared by: Center for Digital Systems Research, Research Triangle Institute, Research Triangle Park, NC

  8. PAPARA(ZZ)I: An open-source software interface for annotating photographs of the deep-sea

    NASA Astrophysics Data System (ADS)

    Marcon, Yann; Purser, Autun

    PAPARA(ZZ)I is a lightweight and intuitive image annotation program developed for the study of benthic megafauna. It offers functionalities such as free, grid and random point annotation. Annotations may be made following existing classification schemes for marine biota and substrata or with user-defined, customised lists of keywords, which broadens the range of potential applications of the software to other types of studies (e.g. marine litter distribution assessment). If Internet access is available, PAPARA(ZZ)I can also query and use standardised taxa names directly from the World Register of Marine Species (WoRMS). Program outputs include abundances, densities and size calculations per keyword (e.g. per taxon). These results are written into text files that can be imported into spreadsheet programs for further analyses. PAPARA(ZZ)I is open-source and is available at http://papara-zz-i.github.io. Compiled versions exist for most 64-bit operating systems: Windows, Mac OS X and Linux.

  9. Platform-independent software for medical image processing on the Internet

    NASA Astrophysics Data System (ADS)

    Mancuso, Michael E.; Pathak, Sayan D.; Kim, Yongmin

    1997-05-01

    We have developed a software tool for image processing over the Internet. The tool is a general-purpose, easy to use, flexible, platform-independent image processing software package with functions most commonly used in medical image processing. It provides for processing of medical images located either remotely on the Internet or locally. The software was written in Java - the new programming language developed by Sun Microsystems. It was compiled and tested using Microsoft's Visual Java 1.0 and Microsoft's Just in Time Compiler 1.00.6211. The software is simple and easy to use. In order to use the tool, the user needs to download the software from our site before running it using any Java interpreter, such as those supplied by Sun, Symantec, Borland or Microsoft. Future versions of the operating systems supplied by Sun, Microsoft, Apple, IBM, and others will include Java interpreters. The software is then able to access and process any image on the Internet or on the local computer. Using a 512 × 512 × 8-bit image, a 3 × 3 convolution took 0.88 seconds on an Intel Pentium Pro PC running at 200 MHz with 64 Mbytes of memory. A window/level operation took 0.38 seconds, while a 3 × 3 median filter took 0.71 seconds. These performance numbers demonstrate the feasibility of using this software interactively on desktop computers. Our software tool supports various image processing techniques commonly used in medical image processing and can run without the need of any specialized hardware. It can become an easily accessible resource over the Internet to promote the learning and understanding of image processing algorithms. Also, it could facilitate sharing of medical image databases and collaboration amongst researchers and clinicians, regardless of location.
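
    For concreteness, pure-NumPy sketches of two of the operations timed above, a 3x3 filter and a window/level mapping; this is an illustration in Python, not the Java tool's code.

```python
"""Pure-NumPy sketches of a 3x3 filter and a window/level mapping."""
import numpy as np

def filter3x3(image, kernel):
    """Apply a 3x3 kernel; the one-pixel border is left at zero. For the
    symmetric kernels used here, correlation equals convolution."""
    out = np.zeros_like(image, dtype=np.float64)
    rows, cols = image.shape
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            out[1:-1, 1:-1] += kernel[di + 1, dj + 1] * \
                image[1 + di:rows - 1 + di, 1 + dj:cols - 1 + dj]
    return out

def window_level(image, window, level):
    """Linearly map [level - window/2, level + window/2] onto 0..255."""
    lo = level - window / 2.0
    return np.clip((image - lo) * 255.0 / window, 0, 255).astype(np.uint8)

img = np.random.randint(0, 256, (512, 512)).astype(np.float64)
smooth = filter3x3(img, np.full((3, 3), 1.0 / 9.0))  # 3x3 box blur
display = window_level(smooth, window=128, level=128)
print(display.shape, display.dtype)
```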

  10. Modular modeling system for building distributed hydrologic models with a user-friendly software package

    NASA Astrophysics Data System (ADS)

    Wi, S.; Ray, P. A.; Brown, C.

    2015-12-01

    A software package developed to facilitate building distributed hydrologic models in a modular modeling system is presented. The software package provides a user-friendly graphical user interface that eases its practical use in water resources-related research and practice. The modular modeling system organizes the options available to users when assembling models according to the stages of hydrological cycle, such as potential evapotranspiration, soil moisture accounting, and snow/glacier melting processes. The software is intended to be a comprehensive tool that simplifies the task of developing, calibrating, validating, and using hydrologic models through the inclusion of intelligent automation to minimize user effort, and reduce opportunities for error. Processes so far automated include the definition of system boundaries (i.e., watershed delineation), climate and geographical input generation, and parameter calibration. Built-in post-processing toolkits greatly improve the functionality of the software as a decision support tool for water resources system management and planning. Example post-processing toolkits enable streamflow simulation at ungauged sites with predefined model parameters, and perform climate change risk assessment by means of the decision scaling approach. The software is validated through application to watersheds representing a variety of hydrologic regimes.
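
    A single-bucket soil-moisture accounting step of the kind such modular systems compose, sketched in Python; the capacity and overflow-runoff rule are illustrative assumptions, not the package's actual formulation.

```python
"""One-bucket soil-moisture accounting step (capacity and runoff rule
are illustrative assumptions)."""

CAPACITY_MM = 150.0  # bucket capacity (assumed)

def bucket_step(storage, precip, pet):
    """Update storage (mm) over one day of precipitation and potential ET."""
    et = min(pet, storage)            # actual ET is limited by storage
    storage = storage - et + precip
    runoff = max(0.0, storage - CAPACITY_MM)
    return min(storage, CAPACITY_MM), runoff, et

storage, runoffs = 75.0, []
for precip, pet in [(0, 4), (22, 3), (120, 2), (0, 5)]:  # synthetic days
    storage, runoff, _ = bucket_step(storage, precip, pet)
    runoffs.append(round(runoff, 1))
print(f"final storage {storage:.1f} mm, daily runoff {runoffs}")
```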

  11. The ISO SWS on-line system

    NASA Technical Reports Server (NTRS)

    Roelfsema, P. R.; Kester, D. J. M.; Wesselius, P. R.; Wieprech, E.; Sym, N.

    1992-01-01

    The software which is currently being developed for the Short Wavelength Spectrometer (SWS) of the Infrared Space Observatory (ISO) is described. The spectrometer has a wide range of capabilities in the 2-45 micron infrared band. SWS contains two independent gratings, one for the long and one for the short wavelength section of the band. With the gratings a spectral resolution of approximately 1000 to approximately 2500 can be obtained. The instrument also contains two Fabry-Perot interferometers yielding a resolution between approximately 1000 and approximately 20000. Software is currently being developed for the acquisition, calibration, and analysis of SWS data. The software is firstly required to run in a pipeline mode without human interaction, to process data as they are received from the telescope. However, both for testing and calibration of the instrument as well as for evaluation of the planned operating procedures, the software should also be suitable for interactive use. Thirdly, the same software will be used for long-term characterization of the instrument. The software must work properly within the environment designed by the European Space Agency (ESA) for the spacecraft operations. As a result, strict constraints are put on I/O devices, throughput, etc.

  12. Problem solving in magnetic field: Animation in mobile application

    NASA Astrophysics Data System (ADS)

    Najib, A. S. M.; Othman, A. P.; Ibarahim, Z.

    2014-09-01

    This paper focuses on the development of a mobile application for smartphones and tablets (Android, iPhone, and iPad) as a problem-solving tool for magnetic fields. The application design consists of animations created using Flash 8 software, which could be imported into a Prezi (prezi.com) slide. The Prezi slide was then duplicated in PowerPoint format, and a question bank with a complete answer scheme was additionally generated as a menu in the application. The published mobile application can be viewed and downloaded at the Infinite Monkey website or from the Google Play Store. Statistics for the application from the Google Play Developer Console show the high usage of the application all over the world.

  13. Open Source Software and Design-Based Research Symbiosis in Developing 3D Virtual Learning Environments: Examples from the iSocial Project

    ERIC Educational Resources Information Center

    Schmidt, Matthew; Galyen, Krista; Laffey, James; Babiuch, Ryan; Schmidt, Carla

    2014-01-01

    Design-based research (DBR) and open source software are both acknowledged as potentially productive ways for advancing learning technologies. These approaches have practical benefits for the design and development process and for building and leveraging community to augment and sustain design and development. This report presents a case study of…

  14. Managing Written Directives: A Software Solution to Streamline Workflow.

    PubMed

    Wagner, Robert H; Savir-Baruch, Bital; Gabriel, Medhat S; Halama, James R; Bova, Davide

    2017-06-01

    A written directive is required by the U.S. Nuclear Regulatory Commission for any use of 131I above 1.11 MBq (30 μCi) and for patients receiving radiopharmaceutical therapy. This requirement has also been adopted and must be enforced by the agreement states. As the introduction of new radiopharmaceuticals increases therapeutic options in nuclear medicine, time spent on regulatory paperwork also increases. The pressure of managing these time-consuming regulatory requirements may heighten the potential for inaccurate or incomplete directive data and subsequent regulatory violations. To improve on the paper-trail method of directive management, we created a software tool using a Health Insurance Portability and Accountability Act (HIPAA)-compliant database. This software allows for secure data-sharing among physicians, technologists, and managers while saving time, reducing errors, and eliminating the possibility of loss and duplication. Methods: The software tool was developed using Visual Basic, which is part of the Visual Studio development environment for the Windows platform. Patient data are deposited in an Access database on a local HIPAA-compliant secure server or hard disk. Once a working version had been developed, it was installed at our institution and used to manage directives. Updates and modifications of the software were released regularly until no more significant problems were found with its operation. Results: The software has been used at our institution for over 2 y and has reliably kept track of all directives. All physicians and technologists use the software daily and find it superior to paper directives. They can retrieve active directives at any stage of completion, as well as completed directives. Conclusion: We have developed a software solution for the management of written directives that streamlines and structures the departmental workflow. This solution saves time, centralizes the information for all staff to share, and decreases confusion about the creation, completion, filing, and retrieval of directives. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
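
    A stripped-down Python sketch of a directive record with an enforced lifecycle, to make the workflow idea concrete; the field names and states are illustrative, and the actual tool is Visual Basic over an Access database.

```python
"""Sketch of a written-directive record with an enforced lifecycle
(field names and states are illustrative assumptions)."""
from dataclasses import dataclass, field
from enum import Enum

class Status(Enum):
    CREATED = 1
    SIGNED = 2
    ADMINISTERED = 3
    COMPLETED = 4

@dataclass
class WrittenDirective:
    patient_id: str
    radiopharmaceutical: str
    prescribed_mbq: float
    status: Status = Status.CREATED
    history: list = field(default_factory=list)

    def advance(self, new_status, user):
        """Move the directive forward; out-of-order transitions are refused."""
        if new_status.value != self.status.value + 1:
            raise ValueError(f"cannot go {self.status.name} -> {new_status.name}")
        self.history.append((new_status.name, user))
        self.status = new_status

d = WrittenDirective("MRN0001", "131I NaI", 5550.0)
d.advance(Status.SIGNED, "physician")
d.advance(Status.ADMINISTERED, "technologist")
print(d.status.name, d.history)
```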

  15. Comparison of software tools for kinetic evaluation of chemical degradation data.

    PubMed

    Ranke, Johannes; Wöltjen, Janina; Meinecke, Stefan

    2018-01-01

    For evaluating the fate of xenobiotics in the environment, a variety of degradation or environmental metabolism experiments are routinely conducted. The data generated in such experiments are evaluated by optimizing the parameters of kinetic models so that the model simulation fits the data. No comparison of the main software tools currently in use has been published to date. This article shows a comparison of numerical results as well as an overall, somewhat subjective comparison based on a scoring system using a set of criteria. The scoring was performed separately for two types of uses. Uses of type I are routine evaluations involving standard kinetic models and up to three metabolites in a single compartment. Evaluations involving non-standard model components, more than three metabolites or more than a single compartment belong to use type II. For use type I, usability is most important, while the flexibility of the model definition is most important for use type II. Test datasets were assembled that can be used to compare the numerical results for different software tools. These datasets can also be used to ensure that no unintended or erroneous behaviour is introduced in newer versions. In the comparison of numerical results, good agreement between the parameter estimates was observed for datasets with up to three metabolites. For the now unmaintained reference software DegKinManager/ModelMaker, and for OpenModel, which is still under development, user options were identified that must be set carefully in order to obtain results that are as reliable as possible. Based on the scoring system mentioned above, the software tools gmkin, KinGUII and CAKE received the best scores for use type I. Of the 15 software packages compared with respect to use type II, again gmkin and KinGUII came first, followed by the script-based tool mkin, which is the technical basis for gmkin, and by OpenModel. Based on the evaluation using the system of criteria mentioned above and the comparison of numerical results for the suite of test datasets, the software tools gmkin, KinGUII and CAKE are recommended for use type I, and gmkin and KinGUII for use type II. For users who prefer to work with scripts instead of graphical user interfaces, mkin is recommended. For future software evaluations, it is recommended to include a measure of the total time that a typical user needs for a kinetic evaluation in the scoring scheme. It is the hope of the authors that the publication of test data, source code and overall rankings fosters the evolution of useful and reliable software in the field.
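
    For concreteness, fitting the single first-order (SFO) model C(t) = C0 * exp(-k t), the simplest of the standard degradation kinetics such tools implement, with SciPy; the dataset below is synthetic, not from the article.

```python
"""Fit the single first-order (SFO) model C(t) = C0 * exp(-k t) to a
synthetic degradation dataset."""
import numpy as np
from scipy.optimize import curve_fit

def sfo(t, c0, k):
    return c0 * np.exp(-k * t)

t = np.array([0, 3, 7, 14, 28, 56, 90], dtype=float)     # days
c = np.array([99.2, 91.9, 82.2, 66.4, 44.7, 19.8, 8.1])  # % of applied

(c0, k), _ = curve_fit(sfo, t, c, p0=(100.0, 0.05))
dt50 = np.log(2) / k  # half-life follows directly from the rate constant
print(f"C0 = {c0:.1f}%, k = {k:.4f} 1/day, DT50 = {dt50:.1f} days")
```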

  16. Development of an integrated, unattended assay system for LWR-MOX fuel pellet trays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, J.E.; Hatcher, C.R.; Pollat, L.L.

    1994-08-01

    Four identical unattended plutonium assay systems have been developed for use at the new light-water-reactor mixed oxide (LWR-MOX) fuel fabrication facility at Hanau, Germany. The systems provide quantitative plutonium verification for all MOX pellet trays entering or leaving a large, intermediate store. Pellet-tray transport and storage systems are highly automated. Data from the "I-Point" (information point) assay systems will be shared by the Euratom and International Atomic Energy Agency (IAEA) Inspectorates. The I-Point system integrates, for the first time, passive neutron coincidence counting (NCC) with electro-mechanical sensing (EMS) in unattended mode. Also, provisions have been made for adding high-resolution gamma spectroscopy. The system accumulates data for every tray entering or leaving the store between inspector visits. During an inspection, data are analyzed and compared with operator declarations for the previous inspection period, nominally one month. Specification of the I-Point system resulted from a collaboration between the IAEA, Euratom, Siemens, and Los Alamos. Hardware was developed by Siemens and Los Alamos through a bilateral agreement between the German Federal Ministry of Research and Technology (BMFT) and the US DOE. Siemens also provided the EMS subsystem, including software. Through the US Support Program to the IAEA, Los Alamos developed the NCC software (NCC COLLECT) and also the software for merging and reviewing the EMS and NCC data (MERGE/REVIEW). This paper describes the overall I-Point system, but emphasizes the NCC subsystem, along with the NCC COLLECT and MERGE/REVIEW codes. We also summarize comprehensive testing results that define the quality of assay performance.

  17. Quality measures and assurance for AI (Artificial Intelligence) software

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1988-01-01

    This report is concerned with the application of software quality and evaluation measures to AI software and, more broadly, with the question of quality assurance for AI software. Considered are not only the metrics that attempt to measure some aspect of software quality, but also the methodologies and techniques (such as systematic testing) that attempt to improve some dimension of quality, without necessarily quantifying the extent of the improvement. The report is divided into three parts. Part 1 reviews existing software quality measures, i.e., those that have been developed for, and applied to, conventional software. Part 2 considers the characteristics of AI software, the applicability and potential utility of measures and techniques identified in the first part, and reviews those few methods developed specifically for AI software. Part 3 presents an assessment and recommendations for the further exploration of this important area.

  18. Real-time autocorrelator for fluorescence correlation spectroscopy based on graphical-processor-unit architecture: method, implementation, and comparative studies

    NASA Astrophysics Data System (ADS)

    Laracuente, Nicholas; Grossman, Carl

    2013-03-01

    We developed an algorithm and software to calculate autocorrelation functions from real-time photon-counting data using the fast, parallel capabilities of graphical processor units (GPUs). Recent developments in hardware and software have allowed for general-purpose computing with inexpensive GPU hardware. These devices are better suited to emulating hardware autocorrelators than traditional CPU-based software applications because they emphasize parallel throughput over sequential speed. Incoming data are binned in a standard multi-tau scheme with configurable points-per-bin size and are mapped into a GPU memory pattern to reduce time-expensive memory access. Applications include dynamic light scattering (DLS) and fluorescence correlation spectroscopy (FCS) experiments. We ran the software on a 64-core graphics PCI card in a computer with a 3.2 GHz Intel i5 CPU running Linux. FCS measurements were made on Alexa-546 and Texas Red dyes in a standard buffer (PBS). Software correlations were compared to hardware correlator measurements on the same signals. Supported by HHMI and Swarthmore College.
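
    A plain linear-lag version of the normalized autocorrelation g(tau) = <I(t) I(t + tau)> / <I>^2, the quantity a multi-tau correlator computes more cleverly, in NumPy with no GPU, purely to make the computed quantity explicit.

```python
"""Plain linear-lag normalized autocorrelation of binned photon counts,
g(tau) = <I(t) I(t + tau)> / <I>^2 (NumPy only, no GPU)."""
import numpy as np

def autocorrelation(counts, max_lag):
    counts = np.asarray(counts, dtype=np.float64)
    mean_sq = counts.mean() ** 2
    return np.array([np.mean(counts[:-lag] * counts[lag:]) / mean_sq
                     for lag in range(1, max_lag + 1)])

rng = np.random.default_rng(0)
signal = rng.poisson(5.0, size=100_000)  # uncorrelated photon bins
print(np.round(autocorrelation(signal, max_lag=10), 3))  # ~1.0 at all lags
```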

  19. Non-Algorithmic Issues in Automated Computational Mechanics

    DTIC Science & Technology

    1991-04-30

    Tworzydlo, Senior Research Engineer and Manager of Advanced Projects Group. Professor J. T. Oden, President and Senior Scientist of COMCO, was project...practical applications of the systems reported so far is due to the extremely arduous and complex development and management of a realistic knowledge base...software, designed to effectively implement deep, algorithmic knowledge, and "intelligent" software, designed to manage shallow, heuristic

  20. Wake Turbulence Mitigation for Departures (WTMD) Prototype System - Software Design Document

    NASA Technical Reports Server (NTRS)

    Sturdy, James L.

    2008-01-01

    This document describes the software design of a prototype Wake Turbulence Mitigation for Departures (WTMD) system that was evaluated in shadow mode operation at the Saint Louis (KSTL) and Houston (KIAH) airports. This document describes the software that provides the system framework, communications, user displays, and hosts the Wind Forecasting Algorithm (WFA) software developed by the M.I.T. Lincoln Laboratory (MIT-LL). The WFA algorithms and software are described in a separate document produced by MIT-LL.

  1. Techniques for Developing an Acquisition Strategy by Profiling Software Risks

    DTIC Science & Technology

    2006-08-01

    Contents excerpt: ... Drivers, 13; Figure 8: BMW 745Li Software... The BMW 745Li, shown in Figure 8, is a good illustration of the increasing software control of hardware systems in automobiles. Among the many features...roll stabilization, dynamic brake control, coded drive-away protection, an adaptive automatic transmission, and iDrive systems. This list can be

  2. Expendable Air Vehicles/High Altitude Balloon Technology. Phase 1.

    DTIC Science & Technology

    1991-08-02

    CHR/91-2750. PREFACE: The work described in this Phase II SBIR...Final Technical Report is the implementation of a capability which Coleman Research Corporation demonstrated during a Phase I SBIR (contract number...). CRC has developed a Balloon Drift Pattern Simulation (BDPS). CRC developed this simulation software for digital computers as a product of a Phase II

  3. Automated UHPLC separation of 10 pharmaceutical compounds using software-modeling.

    PubMed

    Zöldhegyi, A; Rieger, H-J; Molnár, I; Fekhretdinova, L

    2018-03-20

    Human mistakes are still one of the main underlying causes of regulatory findings and, for compliance with the FDA's Data Integrity guidance and Analytical Quality by Design (AQbD), must be eliminated. To develop smooth, fast and robust methods that are free of human failures, a state-of-the-art automation approach was presented. For the scope of this study, commercial software (DryLab) and a model mixture of 10 drugs were subjected to testing. Following AQbD principles, the best available working point was selected and confirmatory experimental runs, i.e., the six worst cases from the robustness calculation, were performed. Simulated results were found to be in excellent agreement with the experimental ones, proving the usefulness and effectiveness of automated, software-assisted analytical method development. Copyright © 2018. Published by Elsevier B.V.

  4. Use of Doceri Software for iPad in Online Delivery of Chemistry Content

    ERIC Educational Resources Information Center

    Silverberg, Lee J.; Tierney, John; Bodek, Matthew J.

    2014-01-01

    Doceri software for iPad is useful for both synchronous online and asynchronous online delivery of chemistry course content. Using the Doceri wireless connection between the iPad and a personal computer that is running Adobe Connect, online synchronous instruction can be accomplished in which drawings can be completed by hand on the iPad. For…

  5. Development of 3-D Ice Accretion Measurement Method

    NASA Technical Reports Server (NTRS)

    Lee, Sam; Broeren, Andy P.; Addy, Harold E., Jr.; Sills, Robert; Pifer, Ellen M.

    2012-01-01

    A research plan is currently being implemented by NASA to develop and validate the use of a commercial laser scanner to record and archive fully three-dimensional (3-D) ice shapes from an icing wind tunnel. The plan focused specifically upon measuring ice accreted in the NASA Icing Research Tunnel (IRT). The plan was divided into two phases. The first phase was the identification and selection of the laser scanning system and the post-processing software to purchase and develop further. The second phase was the implementation and validation of the selected system through a series of icing and aerodynamic tests. Phase I of the research plan has been completed. It consisted of evaluating several scanning hardware and software systems against an established selection criteria through demonstrations in the IRT. The results of Phase I showed that all of the scanning systems that were evaluated were equally capable of scanning ice shapes. The factors that differentiated the scanners were ease of use and the ability to operate in a wide range of IRT environmental conditions.

  6. Software Certification - Coding, Code, and Coders

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.
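
    A toy illustration, in Python, of what mechanical standard-checking can mean: flag functions that lack a docstring or exceed a size limit. Real flight-software checks use industrial static analyzers and far richer rule sets; the rule and limit below are invented for the sketch.

```python
"""Toy mechanical coding-standard check: flag functions lacking a
docstring or exceeding a size limit (rule and limit are invented)."""
import ast

MAX_BODY_LINES = 60  # illustrative limit, not an actual JPL rule

def check(source, filename="<src>"):
    findings = []
    for node in ast.walk(ast.parse(source, filename)):
        if isinstance(node, ast.FunctionDef):
            if ast.get_docstring(node) is None:
                findings.append(f"{filename}:{node.lineno} {node.name}: missing docstring")
            if node.end_lineno - node.lineno + 1 > MAX_BODY_LINES:
                findings.append(f"{filename}:{node.lineno} {node.name}: function too long")
    return findings

print(check("def f(x):\n    return x * 2\n"))
```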

  7. WILDFIRE IGNITION RESISTANCE ESTIMATOR WIZARD SOFTWARE DEVELOPMENT REPORT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, M.; Robinson, C.; Gupta, N.

    2012-10-10

    This report describes the development of a software tool, entitled “WildFire Ignition Resistance Estimator Wizard” (WildFIRE Wizard, Version 2.10). This software was developed within the Wildfire Ignition Resistant Home Design (WIRHD) program, sponsored by the U.S. Department of Homeland Security, Science and Technology Directorate, Infrastructure Protection & Disaster Management Division. WildFIRE Wizard is a tool that enables homeowners to take preventive actions that will reduce their home’s vulnerability to wildfire ignition sources (i.e., embers, radiant heat, and direct flame impingement) well in advance of a wildfire event. This report describes the development of the software, its operation, its technical basis and calculations, and steps taken to verify its performance.

  8. The use of emulator-based simulators for on-board software maintenance

    NASA Astrophysics Data System (ADS)

    Irvine, M. M.; Dartnell, A.

    2002-07-01

    Traditionally, onboard software maintenance activities within the space sector are performed using hardware-based facilities. These facilities are developed around the use of hardware emulation or breadboards containing target processors, with some sort of environment provided around the hardware to support the maintenance activities. However, these environments are not easy to use when setting up the required test scenarios, particularly when the onboard software executes in a dynamic I/O environment, e.g. attitude control software or data handling software. In addition, the hardware and/or environment may not support the test set-ups required during investigations into software anomalies (e.g. raise a spurious interrupt, fail a memory location, etc.), and the overall "visibility" of the executing software may be limited. The Software Maintenance Simulator (SOMSIM) is a tool that can complement the traditional maintenance facilities. Some of its main benefits are: a low-cost, flexible extension to an existing product, namely an operational simulator containing a software processor emulator; a system-level, high-fidelity test-bed in which the software "executes"; a high degree of control and configuration over the entire "system", including contingency conditions perhaps not possible with real hardware; and high visibility and control over execution of the emulated software. This paper describes the SOMSIM concept in more detail, and also describes the SOMSIM study being carried out for ESA/ESOC by VEGA IT GmbH.
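
    As a hedged illustration of the kind of control and visibility an emulator-based simulator offers, the following toy sketch steps a fictional CPU model and injects a spurious interrupt and a memory fault, the two contingency cases mentioned above. None of this reflects SOMSIM's actual interfaces:

```python
# Minimal sketch of emulator-style fault injection on a toy CPU model;
# SOMSIM itself wraps a full processor emulator inside an operational simulator.
class ToyCpu:
    def __init__(self, program):
        self.pc = 0
        self.memory = dict(enumerate(program))
        self.pending_irq = None

    def step(self):
        if self.pending_irq is not None:
            # Emulator-level visibility: every interrupt entry is observable.
            print("servicing IRQ", self.pending_irq)
            self.pending_irq = None
        instr = self.memory.get(self.pc, "nop")
        print("pc=%04x %s" % (self.pc, instr))
        self.pc += 1

cpu = ToyCpu(["load r0", "add r0,1", "store r0"])
cpu.step()
cpu.pending_irq = 7          # inject a spurious interrupt, hard to do on real hardware
cpu.memory[2] = "corrupted"  # fail a memory location for an anomaly investigation
cpu.step()
cpu.step()
```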

  9. The use of Graphic User Interface for development of a user-friendly CRS-Stack software

    NASA Astrophysics Data System (ADS)

    Sule, Rachmat; Prayudhatama, Dythia; Perkasa, Muhammad D.; Hendriyana, Andri; Fatkhan; Sardjito; Adriansyah

    2017-04-01

    The development of a user-friendly Common Reflection Surface (CRS) Stack software package built around a Graphical User Interface (GUI) is described in this paper. The original CRS-Stack software developed by the WIT Consortium is compiled in the unix/linux environment and is not user-friendly: the user must write commands and parameters manually in a script file. Because of this limitation the CRS-Stack has remained an unpopular method, although applying it is a promising way to obtain better seismic sections with improved reflector continuity and S/N ratio. After obtaining successful results in tests on several seismic data sets belonging to oil companies in Indonesia, we decided to develop a user-friendly software package in our own laboratory. A Graphical User Interface (GUI) is a type of user interface that allows people to interact with computer programs in a better way: rather than typing commands and module parameters, users can work much more simply and easily. The GUI transforms the text-based interface into graphical icons and visual indicators, and the use of complicated Seismic Unix shell scripts can be avoided. The Java Swing GUI library is used to develop this CRS-Stack GUI; every shell script that represents a seismic processing step is invoked from the Java environment. Besides providing interactive CRS-Stack processing, the GUI is designed to help geophysicists manage projects with complex seismic processing procedures. The CRS-Stack GUI software is composed of input directories, operators, and output directories, which together define a seismic data processing workflow. The CRS-Stack processing workflow involves four steps, i.e. automatic CMP stack, initial CRS-Stack, optimized CRS-Stack, and CRS-Stack Supergather. These operations are visualized in an informative flowchart with a self-explanatory system that guides the user in entering the parameter values for each operation. The knowledge of the CRS-Stack processing procedure is preserved in the software, making it easy and efficient to learn. The software will continue to be developed, and any new innovative seismic processing workflow will also be added to this GUI software.
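
    The paper wraps each Seismic Unix shell script in Java Swing; the following is an analogous sketch of the same invoke-a-script-per-step pattern, written in Python rather than Java, with invented script names and parameters:

```python
import subprocess

# Hypothetical workflow: each step is a shell script taking an input and an
# output directory plus parameters, mirroring the paper's four CRS-Stack steps.
WORKFLOW = [
    ("automatic_cmp_stack.sh", {"vmin": "1450", "vmax": "5500"}),
    ("initial_crs_stack.sh", {"aperture": "250"}),
    ("optimized_crs_stack.sh", {"iterations": "3"}),
    ("crs_supergather.sh", {}),
]

def run_step(script, params, indir, outdir):
    # Parameters collected from GUI widgets are passed as key=value arguments
    # instead of being hand-edited into a script file.
    args = ["sh", script, indir, outdir] + ["%s=%s" % kv for kv in params.items()]
    subprocess.run(args, check=True)

indir = "data/raw"
for script, params in WORKFLOW:
    outdir = "data/" + script.replace(".sh", "")
    run_step(script, params, indir, outdir)
    indir = outdir  # output of one step feeds the next step
```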

  10. Calculation and use of an environment's characteristic software metric set

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Selby, Richard W., Jr.

    1985-01-01

    Since both cost/quality and production environments differ, this study presents an approach for customizing a characteristic set of software metrics to an environment. The approach is applied in the Software Engineering Laboratory (SEL), a NASA Goddard production environment, to 49 candidate process and product metrics of 652 modules from six (51,000 to 112,000 lines) projects. For this particular environment, the method yielded the characteristic metric set (source lines, fault correction effort per executable statement, design effort, code effort, number of I/O parameters, number of versions). The uses examined for a characteristic metric set include forecasting the effort for development, modification, and fault correction of modules based on historical data.
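
    As a sketch of how such a characteristic metric set supports effort forecasting, the following fits a simple least-squares model over hypothetical historical module data. The numbers and the three chosen metrics are illustrative only, not SEL data:

```python
import numpy as np

# Hypothetical historical data: one row per module, columns are a subset of a
# characteristic metric set (source lines, number of I/O parameters, versions);
# y is development effort in hours. The real SEL study had 652 modules.
X = np.array([[120, 4, 2], [450, 9, 5], [300, 6, 3], [800, 14, 7], [210, 5, 2]], dtype=float)
y = np.array([30.0, 110.0, 70.0, 200.0, 52.0])

# Fit a linear effort model with an intercept term by ordinary least squares.
A = np.hstack([X, np.ones((X.shape[0], 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def forecast_effort(source_lines, io_params, versions):
    return np.array([source_lines, io_params, versions, 1.0]) @ coef

print("forecast for a 500-line module: %.1f hours" % forecast_effort(500, 8, 4))
```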

  11. Wearable Notification via Dissemination Service in a Pervasive Computing Environment

    DTIC Science & Technology

    2015-09-01

    context, state, and environment in a manner that would be transparent to a Soldier's common operations. Subject terms: pervasive computing, Android. ...of user context shifts, i.e., changes in the user's position, history, workflow, or resource interests. If the PCE is described as a 2-component... convenient viewing on the Glass's screen just above the line of sight. All of the software developed uses Google's Android open-source software stack

  12. The Missing Link: The Use of Link Words and Phrases as a Link to Manuscript Quality

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.

    2016-01-01

    In this article, I provide a typology of transition words/phrases. This typology comprises 12 dimensions of link words/phrases that capture 277 link words/phrases. Using QDA Miner, WordStat, and SPSS--a computer-assisted mixed methods data analysis software, content analysis software, and statistical software, respectively--I analyzed 74…

  13. Gamified Pedagogy: From Gaming Theory to Creating a Self-Motivated Learning Environment in Studio Art

    ERIC Educational Resources Information Center

    Han, Hsiao-Cheng

    2015-01-01

    This research is an empirical study using gamified pedagogy in a 3-D animation course in a Visual Communication Design Department. By conducting this research, I hope to increase student interest in learning 3-D animation and to decrease student fears of learning professional 3-D software. Through this research, I have developed a theory of…

  14. Use of Soft Computing Technologies For Rocket Engine Control

    NASA Technical Reports Server (NTRS)

    Trevino, Luis C.; Olcmen, Semih; Polites, Michael

    2003-01-01

    The problem to be addressed in this paper is to explore how the use of Soft Computing Technologies (SCT) could be employed to further improve overall engine system reliability and performance. Specifically, this will be presented by enhancing rocket engine control and engine health management (EHM) using SCT coupled with conventional control technologies, and sound software engineering practices used in Marshall's Flight Software Group. The principal goals are to improve software management, software development time and maintenance, processor execution, fault tolerance and mitigation, and nonlinear control in power level transitions. The intent is not to discuss any shortcomings of existing engine control and EHM methodologies, but to provide alternative design choices for control, EHM, implementation, performance, and sustaining engineering. The approaches outlined in this paper will require knowledge in the fields of rocket engine propulsion, software engineering for embedded systems, and soft computing technologies (i.e., neural networks, fuzzy logic, and Bayesian belief networks), much of which is presented in this paper. The first targeted demonstration rocket engine platform is the MC-1 (formerly FASTRAC Engine), which is simulated with hardware and software in the Marshall Avionics & Software Testbed laboratory that

  15. Conversing with Computers

    NASA Technical Reports Server (NTRS)

    2004-01-01

    I/NET, Inc., is making the dream of natural human-computer conversation a practical reality. Through a combination of advanced artificial intelligence research and practical software design, I/NET has taken the complexity out of developing advanced, natural language interfaces. Conversational capabilities like pronoun resolution, anaphora and ellipsis processing, and dialog management that were once available only in the laboratory can now be brought to any application with any speech recognition system using I/NET's conversational engine middleware.

  16. STRS SpaceWire FPGA Module

    NASA Technical Reports Server (NTRS)

    Lux, James P.; Taylor, Gregory H.; Lang, Minh; Stern, Ryan A.

    2011-01-01

    An FPGA module leverages the previous work from Goddard Space Flight Center (GSFC) relating to NASA's Space Telecommunications Radio System (STRS) project. The STRS SpaceWire FPGA Module is written in the Verilog Register Transfer Level (RTL) language, and it encapsulates an unmodified GSFC core (which is written in VHDL). The module has the necessary inputs/outputs (I/Os) and parameters to integrate seamlessly with the SPARC I/O FPGA Interface module (also developed for the STRS operating environment, OE). Software running on the SPARC processor can access the configuration and status registers within the SpaceWire module. This allows software to control and monitor the SpaceWire functions, but it is also used to give software direct access to what is transmitted and received through the link. SpaceWire data characters can be sent/received through the software interface, as well as through the dedicated interface on the GSFC core. Similarly, SpaceWire time codes can be sent/received through the software interface or through a dedicated interface on the core. This innovation is designed for plug-and-play integration in the STRS OE. The SpaceWire module simplifies the interfaces to the GSFC core, and synchronizes all I/O to a single clock. An interrupt output (with optional masking) identifies time-sensitive events within the module. Test modes were added to allow internal loopback of the SpaceWire link and internal loopback of the client-side data interface.
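
    A hedged sketch of the software-side register-access pattern described above. The base address, offsets, and bit assignments below are invented placeholders, not the module's actual register map, and the memory-mapped-I/O mechanism shown is one common embedded-Linux idiom rather than the STRS OE's interface:

```python
import mmap
import os
import struct

# Hypothetical register map, for illustration only; the real STRS SpaceWire
# module's offsets and bit fields are defined by its FPGA design documents.
BASE_ADDR   = 0x40000000    # where the module is assumed to be mapped
REG_CONFIG  = 0x00          # control bits, e.g. link enable, loopback test mode
REG_STATUS  = 0x04          # link state and interrupt flags
REG_TX_DATA = 0x08          # write a SpaceWire data character
REG_RX_DATA = 0x0C          # read a received data character

fd = os.open("/dev/mem", os.O_RDWR | os.O_SYNC)
regs = mmap.mmap(fd, 4096, offset=BASE_ADDR)

def write_reg(offset, value):
    regs[offset:offset + 4] = struct.pack("<I", value)

def read_reg(offset):
    return struct.unpack("<I", regs[offset:offset + 4])[0]

write_reg(REG_CONFIG, 0x1 | 0x4)   # enable link, enable internal loopback (assumed bits)
write_reg(REG_TX_DATA, ord("A"))   # send one data character from software
print("status: 0x%08x rx: %r" % (read_reg(REG_STATUS), chr(read_reg(REG_RX_DATA) & 0xFF)))
```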

  17. Developing a Model Component

    NASA Technical Reports Server (NTRS)

    Fields, Christina M.

    2013-01-01

    The Spaceport Command and Control System (SCCS) Simulation Computer Software Configuration Item (CSCI) is responsible for providing simulations to support test and verification of SCCS hardware and software. The Universal Coolant Transporter System (UCTS) is a Space Shuttle Orbiter support piece of the Ground Servicing Equipment (GSE). The purpose of the UCTS is to provide two support services to the Space Shuttle Orbiter immediately after landing at the Shuttle Landing Facility. The Simulation uses GSE models to stand in for the actual systems to support testing of SCCS systems during their development. As an intern at KSC, my assignment was to develop a model component for the UCTS. I was given a fluid component (drier) to model in Matlab. The drier was a Catch-All replaceable-core type filter-drier. The filter-drier provides maximum protection for the thermostatic expansion valve and solenoid valve from dirt that may be in the system; it also protects the valves from freezing up. I researched fluid dynamics to understand the function of my component, and I completed training for UNIX and Simulink to help aid in my assignment. The filter-drier was modeled by determining the effects it has on the pressure, velocity and temperature of the system. I used Bernoulli's equation to calculate the pressure and velocity differential through the drier. I created my model filter-drier in Simulink and wrote the test script to test the component. I completed component testing and captured test data. The finalized model was sent for peer review for any improvements.
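
    A minimal sketch of the relation presumably applied across the filter-drier, written between inlet (1) and outlet (2) with an added loss term for the pressure drop over the filter core; the loss term is an assumption, since the report's exact formulation is not given here:

```latex
p_1 + \tfrac{1}{2}\rho v_1^2 = p_2 + \tfrac{1}{2}\rho v_2^2 + \Delta p_{\mathrm{loss}},
\qquad
v_2 = v_1 \frac{A_1}{A_2} \quad \text{(continuity, incompressible flow)}
```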

  18. AXAF-1 High Resolution Assembly Image Model and Comparison with X-Ray Ground Test Image

    NASA Technical Reports Server (NTRS)

    Zissa, David E.

    1999-01-01

    The x-ray ground test of the AXAF-I High Resolution Mirror Assembly was completed in 1997 at the X-ray Calibration Facility at Marshall Space Flight Center. Mirror surface measurements by HDOS, alignment results from Kodak, and predicted gravity distortion in the horizontal test configuration are being used to model the x-ray test image. The Marshall Space Flight Center (MSFC) image modeling serves as a cross check with Smithsonian Astrophysical Observatory modeling. The MSFC image prediction software has evolved from the MSFC model of the x-ray test of the largest AXAF-I mirror pair in 1991. The MSFC image modeling software development is being assisted by the University of Alabama in Huntsville. The modeling process, modeling software, and image prediction will be discussed. The image prediction will be compared with the x-ray test results.

  19. Software agents and the route to the information economy.

    PubMed

    Kephart, Jeffrey O

    2002-05-14

    Humans are on the verge of losing their status as the sole economic species on the planet. In private laboratories and in the Internet laboratory, researchers and developers are creating a variety of autonomous economically motivated software agents endowed with algorithms for maximizing profit or utility. Many economic software agents will function as miniature businesses, purchasing information inputs from other agents, combining and refining them into information goods and services, and selling them to humans or other agents. Their mutual interactions will form the information economy: a complex economic web of information goods and services that will adapt to the ever-changing needs of people and agents. The information economy will be the largest multiagent system ever conceived and an integral part of the world's economy. I discuss a possible route toward this vision, beginning with present-day Internet trends suggesting that agents will charge one another for information goods and services. Then, to establish that agents can be competent price setters, I describe some laboratory experiments pitting software bidding agents against human bidders. The agents' superior performance suggests they will be used on a broad scale, which in turn suggests that interactions among agents will become frequent and significant. How will this affect macroscopic economic behavior? I describe some interesting phenomena that my colleagues and I have observed in simulations of large populations of automated buyers and sellers, such as price war cycles. I conclude by discussing fundamental scientific challenges that remain to be addressed as we journey toward the information economy.

  20. Factors in Software Quality. Volume I. Concepts and Definitions of Software Quality

    DTIC Science & Technology

    1977-11-01

    FLEXIBILITY, COMPLEXITY, EXPANDABILITY, PRECISION, DOCUMENTATION, TOLERANCE, REPAIRABILITY, COMPATIBILITY, SERVICEABILITY ...applications. Several standard documents are required by DOD/AF regulations. The following references were used to compile the range of documents...documents are specified by the AF regulations or SPO-local regulations listed above. Each of the document types for a long life/high cost software

  1. Software Development in the Water Sciences: a view from the divide (Invited)

    NASA Astrophysics Data System (ADS)

    Miles, B.; Band, L. E.

    2013-12-01

    While training in statistical methods is an important part of many earth scientists' education, these scientists often learn the bulk of their software development skills in an ad hoc, just-in-time manner. Yet to carry out contemporary research, scientists are spending more and more time developing software. Here I present perspectives - as an earth sciences graduate student with professional software engineering experience - on the challenges scientists face in adopting software engineering practices, with an emphasis on the areas of the science software development lifecycle that could benefit most from improved engineering. This work builds on experience gained as part of the NSF-funded Water Science Software Institute (WSSI) conceptualization award (NSF Award # 1216817). Throughout 2013, the WSSI team held a series of software scoping and development sprints with the goals of: (1) adding features to better model green infrastructure within the Regional Hydro-Ecological Simulation System (RHESSys); and (2) infusing test-driven agile software development practices into the processes employed by the RHESSys team. The goal of efforts such as the WSSI is to ensure that investments by current and future scientists in software engineering training will enable transformative science by improving both scientific reproducibility and researcher productivity. Experience with the WSSI indicates: (1) the potential for achieving this goal; and (2) that while scientists are willing to adopt some software engineering practices, transformative science will require continued collaboration between domain scientists and cyberinfrastructure experts for the foreseeable future.

  2. Using an architectural approach to integrate heterogeneous, distributed software components

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Purtilo, James M.

    1995-01-01

    Many computer programs cannot be easily integrated because their components are distributed and heterogeneous, i.e., they are implemented in diverse programming languages, use different data representation formats, or their runtime environments are incompatible. In many cases, programs are integrated by modifying their components or interposing mechanisms that handle communication and conversion tasks. For example, remote procedure call (RPC) helps integrate heterogeneous, distributed programs. When configuring such programs, however, mechanisms like RPC must be used explicitly by software developers in order to integrate collections of diverse components. Each collection may require a unique integration solution. This paper describes improvements to the concepts of software packaging and some of our experiences in constructing complex software systems from a wide variety of components in different execution environments. Software packaging is a process that automatically determines how to integrate a diverse collection of computer programs based on the types of components involved and the capabilities of available translators and adapters in an environment. Software packaging provides a context that relates such mechanisms to software integration processes and reduces the cost of configuring applications whose components are distributed or implemented in different programming languages. Our software packaging tool subsumes traditional integration tools like UNIX make by providing a rule-based approach to software integration that is independent of execution environments.
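
    A toy sketch of the rule-based idea behind software packaging: given a table of available translators/adapters, search for a chain that connects two component interface types. The types and adapter names below are invented, and the real tool also reasons about languages, data formats, and RPC stubs:

```python
from collections import deque

# (from_type, to_type) -> adapter name; illustrative entries only.
ADAPTERS = {
    ("c-struct", "xdr"): "xdr_encoder",
    ("xdr", "json"): "xdr2json",
    ("json", "python-dict"): "json_loader",
}

def integration_path(src_type, dst_type):
    """Breadth-first search over adapter rules for a shortest adapter chain."""
    queue = deque([(src_type, [])])
    seen = {src_type}
    while queue:
        t, path = queue.popleft()
        if t == dst_type:
            return path
        for (a, b), name in ADAPTERS.items():
            if a == t and b not in seen:
                seen.add(b)
                queue.append((b, path + [name]))
    return None  # no chain of integration rules found

print(integration_path("c-struct", "python-dict"))
# -> ['xdr_encoder', 'xdr2json', 'json_loader']
```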

  3. Research on AutoCAD secondary development and function expansion based on VBA technology

    NASA Astrophysics Data System (ADS)

    Zhang, Runmei; Gu, Yehuan

    2017-06-01

    AutoCAD is the most widely used drawing tool among similar design drawing products. In the process of producing different types of design drawings of the same product, there is a great deal of repetitive, monotonous work. The traditional manual approach to producing drawings in AutoCAD suffers from low efficiency, a high error rate, and high labor cost, among other shortcomings. To solve these problems, this paper presents the design of a parametric drawing system for hot-rolled I-beam (steel beam) cross-sections, built with the VBA secondary development tool and the Access database (for large-capacity data storage), together with an analysis of the functional extension of plane drawing and parametric drawing design. Through this secondary development of AutoCAD functions, drawing work is simplified and working efficiency is greatly improved. Introducing parametric design into the AutoCAD drawing system can promote mass production of standard hot-rolled I-beam products and economic growth in related industries.
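
    The paper's system is implemented with AutoCAD VBA and an Access database; as a language-neutral sketch of the parametric-drawing idea, the following computes the outline vertices of an I-beam cross-section from a few driving dimensions. The parameter names and the sample section values are assumptions, not the paper's schema:

```python
def i_beam_outline(h, b, tw, tf):
    """Return the 12 outline vertices of an I-beam cross-section, centered at
    the origin. h: section height, b: flange width, tw: web thickness,
    tf: flange thickness (root fillets omitted for brevity)."""
    x1, x2 = b / 2.0, tw / 2.0
    y1, y2 = h / 2.0, h / 2.0 - tf
    return [(-x1, -y1), (x1, -y1), (x1, -y2), (x2, -y2),
            (x2, y2), (x1, y2), (x1, y1), (-x1, y1),
            (-x1, y2), (-x2, y2), (-x2, -y2), (-x1, -y2)]

# One parameter record per standard section, as would be stored in the database.
for name, dims in {"I20a": (200.0, 100.0, 7.0, 11.4)}.items():
    print(name, i_beam_outline(*dims))
```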

  4. Assessing ergonomic risks of software: Development of the SEAT.

    PubMed

    Peres, S Camille; Mehta, Ranjana K; Ritchey, Paul

    2017-03-01

    Software utilizing interaction designs that require extensive dragging or clicking of icons may increase users' risks for upper extremity cumulative trauma disorders. The purpose of this research is to develop a Self-report Ergonomic Assessment Tool (SEAT) for assessing the risks of software interaction designs and facilitating mitigation of those risks. A 28-item self-report measure was developed by combining and modifying items from existing industrial ergonomic tools. Data were collected from 166 participants after they completed four different tasks that varied by method of input (touch or keyboard and mouse) and type of task (selecting or typing). Principal component analysis found distinct factors associated with stress (i.e., demands) and strain (i.e., response). Repeated measures analyses of variance showed that participants could discriminate the different strain induced by the input methods and tasks. However, participants' ability to discriminate between the stressors associated with that strain was mixed. Further validation of the SEAT is necessary, but these results indicate that the SEAT may be a viable method of assessing the ergonomic risks presented by software design. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Predictive Software Cost Model Study. Volume I. Final Technical Report.

    DTIC Science & Technology

    1980-06-01

    development phase to identify computer resources necessary to support computer programs after transfer of program management responsibility and system... classical model development with refinements specifically applicable to avionics systems. The refinements are the result of the Phase I literature search

  6. Test Driven Development of Scientific Models

    NASA Technical Reports Server (NTRS)

    Clune, Thomas L.

    2014-01-01

    Test-Driven Development (TDD), a software development process that promises many advantages for developer productivity and software reliability, has become widely accepted among professional software engineers. As the name suggests, TDD practitioners alternate between writing short automated tests and producing code that passes those tests. Although this overly simplified description will undoubtedly sound prohibitively burdensome to many uninitiated developers, the advent of powerful unit-testing frameworks greatly reduces the effort required to produce and routinely execute suites of tests. By testimony, many developers find TDD to be addictive after only a few days of exposure, and find it unthinkable to return to previous practices. After a brief overview of the TDD process and my experience in applying the methodology for development activities at Goddard, I will delve more deeply into some of the challenges that are posed by numerical and scientific software as well as tools and implementation approaches that should address those challenges.
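
    A minimal sketch of the TDD rhythm with a unit-testing framework, using a deliberately simple numerical function; the function and its tests are invented for illustration:

```python
import unittest

def celsius_to_kelvin(t_c):
    """Implementation written only after the tests below were in place."""
    if t_c < -273.15:
        raise ValueError("below absolute zero")
    return t_c + 273.15

class TestCelsiusToKelvin(unittest.TestCase):
    # In TDD these tests are written first and fail until the code passes.
    def test_freezing_point(self):
        self.assertAlmostEqual(celsius_to_kelvin(0.0), 273.15)

    def test_absolute_zero(self):
        self.assertAlmostEqual(celsius_to_kelvin(-273.15), 0.0)

    def test_rejects_unphysical_input(self):
        with self.assertRaises(ValueError):
            celsius_to_kelvin(-300.0)

if __name__ == "__main__":
    unittest.main()
```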

  7. Test Driven Development: Lessons from a Simple Scientific Model

    NASA Astrophysics Data System (ADS)

    Clune, T. L.; Kuo, K.

    2010-12-01

    In the commercial software industry, unit testing frameworks have emerged as a disruptive technology that has permanently altered the process by which software is developed. Unit testing frameworks significantly reduce traditional barriers, both practical and psychological, to creating and executing tests that verify software implementations. A new development paradigm, known as test driven development (TDD), has emerged from unit testing practices, in which low-level tests (i.e. unit tests) are created by developers prior to implementing new pieces of code. Although somewhat counter-intuitive, this approach actually improves developer productivity. In addition to reducing the average time for detecting software defects (bugs), the requirement to provide procedure interfaces that enable testing frequently leads to superior design decisions. Although TDD is widely accepted in many software domains, its applicability to scientific modeling still warrants reasonable skepticism. While the technique is clearly relevant for infrastructure layers of scientific models such as the Earth System Modeling Framework (ESMF), numerical and scientific components pose a number of challenges to TDD that are not often encountered in commercial software. Nonetheless, our experience leads us to believe that the technique has great potential not only for developer productivity, but also as a tool for understanding and documenting the basic scientific assumptions upon which our models are implemented. We will provide a brief introduction to test driven development and then discuss our experience in using TDD to implement a relatively simple numerical model that simulates the growth of snowflakes. Many of the lessons learned are directly applicable to larger scientific models.

  8. iContraception(®): a software tool to assist professionals in choosing contraceptive methods according to WHO medical eligibility criteria.

    PubMed

    Lopez, Ramón Guisado; Polo, Isabel Ramirez; Berral, Jose Eduardo Arjona; Fernandez, Julia Guisado; Castelo-Branco, Camil

    2015-04-01

    To design software to assist health care providers with contraceptive counselling. The Model-View-Controller software architecture pattern was used. Decision logic was incorporated to automatically compute the safety category of each contraceptive option. Decisions are made according to the specific characteristics or known medical conditions of each potential contraception user. The software is an app designed for the iOS and Android platforms and is available in four languages. iContraception(®) facilitates presentation of visual data on medical eligibility criteria for contraceptive treatments. The use of this software was evaluated by a sample of 54 health care providers. The general satisfaction with the use of the app was over 8 on a 0-10 visual analogue scale in 96.3% of cases. iContraception provides easy access to medical eligibility criteria of contraceptive options and may help with contraceptive counselling. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
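
    A hedged sketch of the decision logic described, computing a safety category per method as the most restrictive category across a user's conditions. The table entries below are placeholders and not clinical guidance:

```python
# Sketch of the decision logic only; the category table is a placeholder,
# NOT clinical guidance -- real values come from the WHO Medical Eligibility
# Criteria tables embedded in the app.
MEC = {  # (method, condition) -> category 1 (no restriction) .. 4 (unacceptable risk)
    ("combined-pill", "migraine-with-aura"): 4,
    ("combined-pill", "smoker-under-35"): 2,
    ("copper-iud", "migraine-with-aura"): 1,
    ("copper-iud", "smoker-under-35"): 1,
}

def safety_category(method, conditions):
    # The most restrictive (highest) category across the user's conditions wins.
    return max(MEC.get((method, c), 1) for c in conditions) if conditions else 1

user_conditions = ["migraine-with-aura", "smoker-under-35"]
for method in ("combined-pill", "copper-iud"):
    print(method, "-> category", safety_category(method, user_conditions))
```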

  9. WetNet: Using SSM/I data interactively for global distribution of tropical rainfall and precipitable water

    NASA Technical Reports Server (NTRS)

    Zipser, Edward J.; Mcguirk, James P.

    1993-01-01

    The research objectives were the following: (1) to use SSM/I to categorize, measure, and parameterize effects of rainfall systems around the globe, especially mesoscale convective systems; (2) to use SSM/I to monitor key components of the global hydrologic cycle, including tropical rainfall and precipitable water, and links to increasing sea surface temperatures; and (3) to assist in the development of efficient methods of exchange of massive satellite data bases and of analysis techniques, especially their use at a university. Numerous tasks have been initiated. First and foremost has been the integration and startup of the WetNet computer system into the TAMU computer network; scientific activity was infeasible before completion of this activity. Final hardware delivery was not completed until October 1991, after which followed a period of identification and solution of several hardware and software problems. Accomplishments representing approximately four months' work with the WetNet system are presented.

  10. Multiplex Quantitative Histologic Analysis of Human Breast Cancer Cell Signaling and Cell Fate

    DTIC Science & Technology

    2010-05-01

    Subject terms: breast cancer, cell signaling, cell proliferation, histology, image analysis. ...revealed by individual stains in multiplex combinations; and (3) software (FARSIGHT) for automated multispectral image analysis that (i) segments...Task 3. Develop computational algorithms for multispectral immunohistological image analysis: FARSIGHT software was developed to quantify intrinsic

  11. Professional Ethics of Software Engineers: An Ethical Framework.

    PubMed

    Lurie, Yotam; Mark, Shlomo

    2016-04-01

    The purpose of this article is to propose an ethical framework for software engineers that connects software developers' ethical responsibilities directly to their professional standards. The implementation of such an ethical framework can overcome the traditional dichotomy between professional skills and ethical skills, which plagues the engineering professions, by proposing an approach to the fundamental tasks of the practitioner, i.e., software development, in which the professional standards are intrinsically connected to the ethical responsibilities. In so doing, the ethical framework improves the practitioner's professionalism and ethics. We call this approach Ethical-Driven Software Development (EDSD), as an approach to software development. EDSD manifests the advantages of an ethical framework as an alternative to the all too familiar approach in professional ethics that advocates "stand-alone codes of ethics". We believe that one outcome of this synergy between professional and ethical skills is simply better engineers. Moreover, since there are often different software solutions, which the engineer can provide to an issue at stake, the ethical framework provides a guiding principle, within the process of software development, that helps the engineer evaluate the advantages and disadvantages of different software solutions. It does not and cannot affect the end-product in and of itself. However, it can and should make the software engineer more conscious and aware of the ethical ramifications of certain engineering decisions within the process.

  12. C3I Systems Acquisition and Maintenance in Relation to the use of COTS Products

    DTIC Science & Technology

    2000-12-01

    the NATO C3 Agency and crypto equipment; the GFE STARGATE software subsystem (the prototyped version of which was developed by IAF, Surveillance...been increasing, and dual-use systems (ACCAM, ICC, AOIS, STARGATE and WAN connections) and re-use potentials have been enhanced. Use of COTS information

  13. Orion Integrated Guidance, Navigation, and Control [GN&C]

    NASA Technical Reports Server (NTRS)

    Chevray, Kay

    2009-01-01

    This slide presentation reviews the integrated Guidance, Navigation and Control (iGN&C) system in the design for the Orion spacecraft. Included in the review are the plans for the design and development of the external interfaces, the functional architecture, the iGN&C software, the development and validation process, and the key challenges involved in the development of the iGN&C system.

  14. Naming in a Programming Support Environment.

    DTIC Science & Technology

    1984-02-01

    and Control, 1974. 10. T. E. Cheatham. An Overview of the Harvard Program Development System. In Software Engineering Environments, H. Hunke, Ed., North-Holland Publishing Company, 1981, pp. 253-266. 11. T. E. Cheatham. Comparing Programming Support Environments. In Software Engineering Environments...Company, 1981, Third Edition. 16. F. DeRemer and H. Kron. Programming-in-the-Large Versus Programming-in-the-Small. IEEE Transactions on Software Engineering

  15. Self-Metric Software. Volume I. Summary of Technical Progress.

    DTIC Science & Technology

    1980-04-01

    Development: A CSDL Project History, RADC-TR-77-213, pp. 33-41. A-42186. [3] Goodenough, J. B. and Zara, R. V., "The Effect of Software Structure on Software..." 1979. **Visiting assistant professor.

  16. Interoperability in the e-Government Context

    DTIC Science & Technology

    2012-01-01

    Carnegie Mellon University operates the Software Engineering Institute, a federally funded research and development center; only standard front-matter notices (sponsorship, no-warranty, and permissions text) were captured for this record (CMU/SEI-2011-TN-014).

  17. Knowledge-intensive software design systems: Can too much knowledge be a burden?

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1992-01-01

    While acknowledging the considerable benefits of domain-specific, knowledge-intensive approaches to automated software engineering, it is prudent to carefully examine the costs of such approaches, as well. In adding domain knowledge to a system, a developer makes a commitment to understanding, representing, maintaining, and communicating that knowledge. This substantial overhead is not generally associated with domain-independent approaches. In this paper, I examine the downside of incorporating additional knowledge, and illustrate with examples based on our experience in building the SIGMA system. I also offer some guidelines for developers building domain-specific systems.

  18. Fleet Numerical Oceanography Center Software Development Standards: An Implementation of DoD-STD-2167A

    DTIC Science & Technology

    1989-09-01

    STD-2167A, by William T. Livings, September 1989. Thesis Advisor: Barry A. Frew. Approved for public release; distribution is unlimited. ...easily changed or corrected when errors are found; and programs that are delivered for use months or even years too late (Pressman, 1988, pp. 1-2)

  19. Expert System Software

    NASA Technical Reports Server (NTRS)

    1989-01-01

    C Language Integrated Production System (CLIPS) is a software shell for developing expert systems, designed to allow research and development of artificial intelligence on conventional computers. Originally developed by Johnson Space Center, it enables highly efficient pattern matching. A collection of conditions and actions to be taken if the conditions are met is built into a rule network, and pertinent facts are matched against that network. Using the program, E.I. DuPont de Nemours & Co. is monitoring chemical production machines; California Polytechnic State University is investigating artificial intelligence in computer-aided design; Mentor Graphics has built a new Circuit Synthesis system; and Brooke and Brooke, a law firm, can determine which facts from a file are most important.
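
    A toy sketch of the condition-action matching CLIPS performs. CLIPS uses its own rule language and the far more efficient Rete pattern-matching network; this naive loop is only for illustration:

```python
# Facts are simple tuples; a rule fires when all of its conditions match facts.
facts = {("temperature", "high"), ("pressure", "rising")}

rules = [
    {"name": "shutdown-alarm",
     "conditions": {("temperature", "high"), ("pressure", "rising")},
     "action": lambda: print("ACTION: sound shutdown alarm")},
    {"name": "log-normal",
     "conditions": {("temperature", "normal")},
     "action": lambda: print("ACTION: log normal status")},
]

# Naive forward chaining: fire every rule whose conditions are all satisfied.
for rule in rules:
    if rule["conditions"] <= facts:   # subset test = all conditions matched
        rule["action"]()
```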

  1. Safe Surgery Trainer

    DTIC Science & Technology

    2014-11-15

    design, testing, and development. b) Prototype Development – Continue developing SST software, game-flow, and mechanics. Continue developing art...refined learning objectives into measurement outlines. Update IRB submissions, edit usability game play study, and update I/ITSEC IRB. Provide case...minimal or near zero. 9) Related Activities a) Presenting at the Design of Learning Games Community Workshop, at I/ITSEC, Wednesday, Dec 3rd

  2. ATLAS particle detector CSC ROD software design and implementation, and, Addition of K physics to chi-squared analysis of FDQM

    NASA Astrophysics Data System (ADS)

    Hawkins, Donovan Lee

    In this thesis I present a software framework for use on the ATLAS muon CSC readout driver. This C++ framework uses plug-in Decoders incorporating hand-optimized assembly language routines to perform sparsification and data formatting. The software is designed with both flexibility and performance in mind, and runs on a custom 9U VME board using Texas Instruments TMS320C6203 digital signal processors. I describe the requirements of the software, the methods used in its design, and the results of testing the software with simulated data. I also present modifications to a chi-squared analysis of the Standard Model and Four Down Quark Model (FDQM) originally done by Dr. Dennis Silverman. The addition of four new experiments to the analysis has little effect on the Standard Model but provides important new restrictions on the FDQM. The method used to incorporate these new experiments is presented, and the consequences of their addition are reviewed.
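
    A minimal sketch of the sparsification step, i.e. zero-suppression of digitized strip samples against a pedestal-plus-threshold cut. The real framework performs this in hand-optimized DSP assembly inside plug-in Decoders, and all numbers here are invented:

```python
# Hypothetical digitized samples, one list per cathode strip.
samples = {0: [98, 101, 99, 100], 1: [97, 250, 400, 310], 2: [102, 99, 101, 98]}
PEDESTAL, THRESHOLD = 100, 30

def sparsify(samples):
    """Keep only strips with at least one sample above pedestal + threshold,
    and format the survivors as (strip, peak-minus-pedestal) words."""
    out = []
    for strip, vals in sorted(samples.items()):
        peak = max(vals)
        if peak > PEDESTAL + THRESHOLD:
            out.append((strip, peak - PEDESTAL))
    return out

print(sparsify(samples))  # -> [(1, 300)] : only the hit strip survives
```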

  3. RSEIS and RFOC: Seismic Analysis in R

    NASA Astrophysics Data System (ADS)

    Lees, J. M.

    2015-12-01

    Open software is essential for reproducible scientific exchange. R packages provide a platform for development of seismological investigation software that can be properly documented and traced for data processing. A suite of R packages designed for a wide range of seismic analyses is currently available in the free software platform called R. R is a software platform based on the S language developed at Bell Labs decades ago. Routines in R can be run as standalone function calls or developed in object-oriented mode. R comes with a base set of routines, and thousands of user-developed packages. The packages developed at UNC include subroutines and interactive codes for processing seismic data, analyzing geographic information (GIS) and inverting data involved in a variety of geophysical applications. The packages related to seismic analysis currently available on CRAN (the Comprehensive R Archive Network, http://www.r-project.org/) are RSEIS, Rquake, GEOmap, RFOC, zoeppritz, RTOMO, geophys, Rwave, PEIP, hht, and rFDSN. These include signal processing, data management, mapping, earthquake location, deconvolution, focal mechanisms, wavelet transforms, Hilbert-Huang transforms, tomographic inversion, and Mogi deformation among other useful functionality. All software in R packages is required to have detailed documentation, making the exchange and modification of existing software easy. In this presentation, I will focus on the packages RSEIS and RFOC, showing examples from a variety of seismic analyses. The R approach has similarities to the popular (and expensive) MATLAB platform, although R is open source and free to download.

  4. Photogrammetric 3D Reconstruction in Matlab: Development of a Free Tool

    NASA Astrophysics Data System (ADS)

    Masiero, A.

    2017-11-01

    This paper presents the current state of development of a free Matlab tool for photogrammetric reconstruction developed at the University of Padova, Italy. The goal of this software is mostly educational, i.e. allowing students to have a close look at the specific steps which lead to the computation of a dense point cloud. Like most recently developed photogrammetric software packages, it is based on a Structure-from-Motion approach. Despite being mainly motivated by educational purposes, certain implementation details are clearly inspired by recent research work, e.g. limiting the computational burden of the feature matching by determining a suboptimal set of features to be considered, and using information provided by external sensors to ease the matching process.
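
    One of the implementation details cited, limiting feature-matching cost by keeping a suboptimal subset of features, can be sketched as follows, using OpenCV in Python as a stand-in for the Matlab tool; the keep-count is an arbitrary assumption:

```python
import cv2

def match_with_subset(img1, img2, keep=500):
    orb = cv2.ORB_create(nfeatures=5000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)

    # Suboptimal-subset idea: keep only the strongest keypoints by response,
    # shrinking the quadratic matching cost at a small cost in completeness.
    def strongest(kp, des, n):
        order = sorted(range(len(kp)), key=lambda i: kp[i].response, reverse=True)[:n]
        return [kp[i] for i in order], des[order]

    kp1, des1 = strongest(kp1, des1, keep)
    kp2, des2 = strongest(kp2, des2, keep)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    return sorted(matcher.match(des1, des2), key=lambda m: m.distance)

img1 = cv2.imread("view1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("view2.jpg", cv2.IMREAD_GRAYSCALE)
print(len(match_with_subset(img1, img2)), "matches")
```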

  5. Use of Soft Computing Technologies for a Qualitative and Reliable Engine Control System for Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Trevino, Luis; Brown, Terry; Crumbley, R. T. (Technical Monitor)

    2001-01-01

    The problem to be addressed in this paper is to explore how the use of Soft Computing Technologies (SCT) could be employed to improve overall vehicle system safety, reliability, and rocket engine performance by development of a qualitative and reliable engine control system (QRECS). Specifically, this will be addressed by enhancing rocket engine control using SCT, innovative data mining tools, and sound software engineering practices used in Marshall's Flight Software Group (FSG). The principal goals for addressing the issue of quality are to improve software management, software development time, software maintenance, processor execution, fault tolerance and mitigation, and nonlinear control in power level transitions. The intent is not to discuss any shortcomings of existing engine control methodologies, but to provide alternative design choices for control, implementation, performance, and sustaining engineering, all relative to addressing the issue of reliability. The approaches outlined in this paper will require knowledge in the fields of rocket engine propulsion (system level), software engineering for embedded flight software systems, and soft computing technologies (i.e., neural networks, fuzzy logic, data mining, and Bayesian belief networks), some of which are briefed in this paper. For this effort, the targeted demonstration rocket engine testbed is the MC-1 engine (formerly FASTRAC), which is simulated with hardware and software in the Marshall Avionics & Software Testbed (MAST) laboratory that currently resides at NASA's Marshall Space Flight Center, building 4476, and is managed by the Avionics Department. A brief plan of action for design, development, implementation, and testing a Phase One effort for QRECS is given, along with expected results. Phase One will focus on development of a Smart Start Engine Module and a Mainstage Engine Module for proper engine start and mainstage engine operations. The overall intent is to demonstrate that by employing soft computing technologies, the quality and reliability of the overall scheme for engine controller development is further improved and vehicle safety is further ensured. The final product that this paper proposes is an approach to development of an alternative low cost engine controller that would be capable of performing in unique vision spacecraft vehicles requiring low cost advanced avionics architectures for autonomous operations from engine pre-start to engine shutdown.
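
    Of the soft computing technologies listed, fuzzy logic lends itself to a compact sketch; the toy controller below maps a normalized chamber-pressure error to a valve-trim command. The membership functions and rules are invented, not taken from the QRECS design:

```python
# Triangular membership functions over the normalized pressure error.
def tri(x, a, b, c):
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_valve_trim(error):
    """Toy Mamdani-style controller: fuzzify, apply three rules, defuzzify by
    a weighted average of each rule's output setpoint."""
    mu_neg  = tri(error, -2.0, -1.0, 0.0)   # pressure too high
    mu_zero = tri(error, -1.0,  0.0, 1.0)   # on target
    mu_pos  = tri(error,  0.0,  1.0, 2.0)   # pressure too low
    # Rule outputs: close the valve slightly / hold / open the valve slightly.
    rules = [(mu_neg, -0.1), (mu_zero, 0.0), (mu_pos, +0.1)]
    total = sum(mu for mu, _ in rules)
    return sum(mu * out for mu, out in rules) / total if total else 0.0

for e in (-1.5, -0.2, 0.0, 0.7):
    print("error %+0.2f -> trim %+0.3f" % (e, fuzzy_valve_trim(e)))
```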

  6. TIA Software User's Manual

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott; Syed, Hazari I.

    1995-01-01

    This user's manual describes the installation and operation of TIA, the Thermal-Imaging acquisition and processing Application, developed by the Nondestructive Evaluation Sciences Branch at NASA Langley Research Center, Hampton, Virginia. TIA is a user-friendly graphical interface application for the Macintosh II and higher series computers. The software has been developed to interface with the Perceptics/Westinghouse Pixelpipe(TM) and PixelStore(TM) NuBus cards and the GW Instruments MacADIOS(TM) input-output (I/O) card for the Macintosh for imaging thermal data. The software is also capable of performing generic image-processing functions.

  7. Thomas Leps Internship Abstract

    NASA Technical Reports Server (NTRS)

    Leps, Thomas

    2016-01-01

    An optical navigation system is being flown as the backup system to the primary Deep Space Network telemetry for navigation and guidance purposes on Orion. This is required to ensure Orion can recover from a loss of communication, which would simultaneously cause a loss of DSN telemetry. Images taken of the Moon and Earth are used to give range and position information to the navigation computer for trajectory calculations and maneuver execution. To get telemetry data from these images, the size and location of the Moon need to be calculated with high accuracy and precision. The reentry envelope for the Orion EM-1 mission requires the centroid and radius of the Moon images to be determined within 1/3 of a pixel, 3 sigma. In order to ensure this accuracy and precision can be attained, I was tasked with building precise dot-grid images for camera calibration as well as building a hardware-in-the-loop test stand for flight software and hardware proofing. To calibrate the Op-Nav camera, a dot grid is imaged with the camera; the error between the imaged dot locations and the actual dot locations can then be used to build a distortion map of the camera and lens system, so that images can be corrected to display truth locations. To build the dot-grid images I used the Electro Optics Lab optical bench, Bright Object Simulator System, and gimbal. The gimbal was slewed to a series of elevations and azimuths, and an image of the collimated single-point light source was taken at each position. After a series of 99 images were taken at different locations, the single light spots were extracted from each image and added to a composite image containing all 99 points. During the development of these grids it was noticed that an intermittent error in the artificial "star" locations occurred. Prior to the summer this error was attributed to the gimbal having glitches in its pointing direction, and the gimbal was going to be replaced; however, after further examining the issue I determined it to be a software problem. I have since narrowed the likely source of the error down to a Software Development Kit released by the camera supplier PixeLink, and I developed a workaround in order to build star grids for calibration until the software bug can be isolated and fixed. I was also tasked with building a hardware-in-the-loop test stand in order to test the full Op-Nav system. A 4k screen displays simulated lunar and terrestrial images from a possible Orion trajectory. These images are projected through a collimator and then captured with an Op-Nav camera controlled by an Intel NUC computer running flight software. The flight software analyzes the images to determine attitude and position; these data are then reconstructed into a trajectory and matched to the simulated trajectory in order to determine the accuracy of the attitude and position estimates. In order for the system to work it needs to be precisely and accurately aligned, so I developed an alignment procedure that allows the screen, collimator and camera to be squared, centered and collinear with each other within a micron spatially and 5 arcseconds in rotation. I also designed a rigid mount for the screen that was machined on site in Building 10 by another intern. While I was working in the EOL we received a $500k Orion star tracker for alignment procedure testing. Due to my prior experience in electronics development, as an ancillary duty I was tasked with building the cables required to operate and power the star tracker.
If any errors were made building these cables the star tracker would be destroyed, so I was honored that the director of the lab entrusted such a critical component to me. This internship has cemented my view on public space exploration. I have always preferred the public sector to the private sector because, as a scientist, the most interesting aspects of space for me are not necessarily the most profitable. I was concerned, however, that the public sector was faltering, and that in order to improve human space exploration I would be forced into the private sector. I now know that, at least at JSC, human spaceflight is still progressing, and exciting work is still being done. I am now actively seeking employment at JSC after I complete my Ph.D., and I have met with my branch chiefs and mentor to discuss transitioning to a grad co-op position.

  8. A data model for clinical legal medicine practice and the development of a dedicated software for both practitioners and researchers.

    PubMed

    Dang, Catherine; Phuong, Thomas; Beddag, Mahmoud; Vega, Anabel; Denis, Céline

    2018-07-01

    To present a data model for clinical legal medicine and the software based on that data model, for both practitioners and researchers. The main functionalities of the presented software are computer-assisted production of medical certificates and data capture, storage and retrieval. The data model and the software were jointly developed by the department of forensic medicine of the Jean Verdier Hospital (Bondy, France) and a bioinformatics laboratory (LIMICS, Paris universities 6-13) between November 2015 and May 2016. The data model was built from four sources: i) a template used in our department for producing standardised medical certificates; ii) a random sample of medical certificates produced by the forensic department; iii) prior consensus between four healthcare professionals (two forensic practitioners, a psychologist and a forensic psychiatrist); and iv) anatomical dictionaries. The trial version of the open source software was first designed for examination of physical assault survivors. A UML-like data model dedicated to clinical legal practice was built. The data model describes the terminology for examinations of sexual assault survivors, physical assault survivors, individuals kept in police custody and undocumented migrants for age estimation. A trial version of software relying on the data model was developed and tested by three physicians. The software allows file archiving, standardised data collection and extraction, and assistance with certificate generation. It can be used for research purposes, via data exchange and analysis. Despite some current limitations of use, it is a tool which can be shared and used by other departments of forensic medicine and other specialties, improving data management and exploitation. Full integration with external sources, analytics software and use of a semantic interoperability framework are planned for the next months. Copyright © 2016 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
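
    A minimal sketch of the flavor of such a data model; class and field names are invented for illustration, while the actual model is derived from the department's certificate template and controlled terminologies:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class Injury:
    anatomical_site: str      # drawn from a controlled anatomical dictionary
    lesion_type: str          # e.g. "bruise", "abrasion" (controlled vocabulary)
    description: str = ""

@dataclass
class Examination:
    case_id: str
    exam_type: str            # "physical assault", "sexual assault", "custody", "age estimation"
    exam_date: date
    injuries: List[Injury] = field(default_factory=list)

    def certificate_lines(self):
        """Standardised fragments for computer-assisted certificate production."""
        yield f"Examination {self.case_id} ({self.exam_type}) on {self.exam_date:%d/%m/%Y}."
        for inj in self.injuries:
            yield f"- {inj.lesion_type} at {inj.anatomical_site}. {inj.description}".rstrip()

exam = Examination("2016-0042", "physical assault", date(2016, 5, 12),
                   [Injury("left forearm", "bruise", "3 cm, violaceous.")])
print("\n".join(exam.certificate_lines()))
```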

  9. Value Addition to Cartosat-I Imagery

    NASA Astrophysics Data System (ADS)

    Mohan, M.

    2014-11-01

    In the remote sensing applications sector, the use of stereo data is on a steady rise. An attempt is hereby made to develop a software suite specifically for the exploitation of Cartosat-I data. A few algorithms to enhance the quality of basic Cartosat-I products will be presented. The algorithms heavily exploit the Rational Polynomial Coefficients (RPCs) that are associated with the image. The algorithms include improving the geometric positioning through Bundle Block Adjustment and producing refined RPCs; generating portable stereo views using raw/refined RPCs autonomously; orthorectification and mosaicking; and registering a monoscopic image rapidly with a single seed point. The outputs of these modules (including the refined RPCs) are in standard formats for further exploitation in third-party software. The design focus has been on minimizing user interaction and on heavy customization to suit the Indian context. The core libraries are in C/C++ and some of the applications come with a user-friendly GUI. Further customization to suit a specific workflow is feasible, as the requisite photogrammetric tools are in place and are continuously upgraded. The paper discusses the algorithms and the design considerations in developing the tools. The value-added products produced using these tools will also be presented.
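
    The RPC model these algorithms exploit expresses image coordinates as ratios of cubic polynomials in normalized ground coordinates; a compact sketch of the forward projection follows. The four coefficient vectors come from the image's RPC file, and the call below uses dummy values:

```python
import numpy as np

def rpc_terms(P, L, H):
    """The 20 monomials of the standard cubic RPC basis, in conventional order,
    for normalized latitude P, longitude L, height H."""
    return np.array([1, L, P, H, L*P, L*H, P*H, L*L, P*P, H*H,
                     P*L*H, L**3, L*P*P, L*H*H, L*L*P, P**3,
                     P*H*H, L*L*H, P*P*H, H**3])

def rpc_project(num_line, den_line, num_samp, den_samp, P, L, H):
    """Forward RPC projection: normalized (line, sample) as polynomial ratios.
    The four 20-element coefficient vectors come from the image's RPC file."""
    t = rpc_terms(P, L, H)
    return (num_line @ t) / (den_line @ t), (num_samp @ t) / (den_samp @ t)

# Illustrative call with dummy coefficients; a real RPC file also supplies the
# offsets and scales used to normalize ground and image coordinates.
c = np.zeros(20); c[0] = 1.0
print(rpc_project(c, c, c, c, P=0.1, L=-0.2, H=0.05))
```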

  10. Introducing the CUAHSI Hydrologic Information System Desktop Application (HydroDesktop) and Open Development Community

    NASA Astrophysics Data System (ADS)

    Ames, D.; Kadlec, J.; Horsburgh, J. S.; Maidment, D. R.

    2009-12-01

    The Consortium of Universities for the Advancement of Hydrologic Sciences (CUAHSI) Hydrologic Information System (HIS) project includes extensive development of data storage and delivery tools and standards, including WaterML (a language for sharing hydrologic data sets via web services) and HIS Server (a software tool set for delivering WaterML from a server). These and other CUAHSI HIS tools have been under development and deployment for several years and together present a relatively complete software “stack” to support the consistent storage and delivery of hydrologic and other environmental observation data. This presentation describes the development of a new HIS software tool called “HydroDesktop” and the development of an online open source software development community to update and maintain the software. HydroDesktop is a local (i.e. not server-based) client-side software tool that ultimately will run on multiple operating systems and will provide a highly usable level of access to HIS services. The software provides many key capabilities including data query, map-based visualization, data download, local data maintenance, editing, graphing, data export to selected model-specific data formats, linkage with integrated modeling systems such as OpenMI, and ultimately upload to HIS servers from the local desktop software. As the software is presently in the early stages of development, this presentation will focus on the design approach and paradigm, and is viewed as an opportunity to encourage participation in the open development community. Indeed, recognizing the value of community-based code development as a means of ensuring end-user adoption, this project has adopted an “iterative” or “spiral” software development approach, which will be described in this presentation.

  11. Managing Complexity - Developing the Node Control Software For The International Space Station

    NASA Technical Reports Server (NTRS)

    Wood, Donald B.

    2000-01-01

    On December 4th, 1998 at 3:36 AM, STS-88 (the space shuttle Endeavour) was launched with the "Node 1 Unity Module" in its payload bay. After working on the Space Station program for a very long time, that launch was one of the most beautiful sights I had ever seen! As the Shuttle proceeded to rendezvous with the Russian-American module known as Zarya, I returned to Houston quickly to start monitoring the activation of the software I had spent the last 3 years working on. The FGB module (also known as "Zarya") was grappled by the shuttle robotic arm and connected to the Unity module. Crewmembers then hooked up the power and data connections between Zarya and Unity. On December 7th, 1998 at 9:49 PM CST the Node Control Software was activated. On December 15th, 1998, the Node-1/Zarya "cornerstone" of the International Space Station was left on-orbit. The Node Control Software (NCS) is the first software flown by NASA for the International Space Station (ISS). The ISS Program is considered the most complex international engineering effort ever undertaken; at last count some 18 countries are active partners in this global venture. NCS has performed all of its intended functions on orbit, over 200 miles above us. I'll be describing how we built the NCS software.

  12. Visual programming for next-generation sequencing data analytics.

    PubMed

    Milicchio, Franco; Rose, Rebecca; Bian, Jiang; Min, Jae; Prosperi, Mattia

    2016-01-01

    High-throughput or next-generation sequencing (NGS) technologies have become an established and affordable experimental framework in biological and medical sciences for all basic and translational research. Processing and analyzing NGS data is challenging. NGS data are big, heterogeneous, sparse, and error prone. Although a plethora of tools for NGS data analysis has emerged in the past decade, (i) software development is still lagging behind data generation capabilities, and (ii) there is a 'cultural' gap between the end user and the developer. Generic software template libraries specifically developed for NGS can help in dealing with the former problem, whilst coupling template libraries with visual programming may help with the latter. Here we scrutinize the state-of-the-art low-level software libraries implemented specifically for NGS and graphical tools for NGS analytics. An ideal developing environment for NGS should be modular (with a native library interface), scalable in computational methods (i.e. serial, multithread, distributed), transparent (platform-independent), interoperable (with external software interface), and usable (via an intuitive graphical user interface). These characteristics should facilitate both the run of standardized NGS pipelines and the development of new workflows based on technological advancements or users' needs. We discuss in detail the potential of a computational framework blending generic template programming and visual programming that addresses all of the current limitations. In the long term, a proper, well-developed (although not necessarily unique) software framework will bridge the current gap between data generation and hypothesis testing. This will eventually facilitate the development of novel diagnostic tools embedded in routine healthcare.
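
    One way to picture the modular design the authors advocate is a pipeline of uniform processing steps that a visual front-end could wire together. The sketch below is a toy illustration under that assumption; the `Read` type, the trimming rule, and the step names are invented, not part of any proposed framework.

    ```python
    # Toy "modular NGS pipeline" sketch: every step shares one stream interface,
    # so steps can be chained by a script or a graphical editor alike.
    from dataclasses import dataclass
    from functools import reduce
    from typing import Callable, Iterable, Iterator

    @dataclass
    class Read:
        name: str
        seq: str
        quals: list  # Phred quality scores, one per base

    Step = Callable[[Iterable[Read]], Iterator[Read]]

    def quality_trim(threshold: int) -> Step:
        def step(reads):
            for r in reads:
                # keep the prefix whose bases all meet the quality threshold
                n = next((i for i, q in enumerate(r.quals) if q < threshold),
                         len(r.seq))
                yield Read(r.name, r.seq[:n], r.quals[:n])
        return step

    def min_length(n: int) -> Step:
        def step(reads):
            return (r for r in reads if len(r.seq) >= n)
        return step

    def pipeline(*steps: Step) -> Step:
        return lambda reads: reduce(lambda acc, s: s(acc), steps, reads)

    run = pipeline(quality_trim(20), min_length(30))
    ```

    Because every step shares the same stream interface, a visual front-end could compose, reorder, or parallelize steps without touching their internals.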

  13. Use of Google Earth to strengthen public health capacity and facilitate management of vector-borne diseases in resource-poor environments.

    PubMed

    Lozano-Fuentes, Saul; Elizondo-Quiroga, Darwin; Farfan-Ale, Jose Arturo; Loroño-Pino, Maria Alba; Garcia-Rejon, Julian; Gomez-Carro, Salvador; Lira-Zumbardo, Victor; Najera-Vazquez, Rosario; Fernandez-Salas, Ildefonso; Calderon-Martinez, Joaquin; Dominguez-Galera, Marco; Mis-Avila, Pedro; Morris, Natashia; Coleman, Michael; Moore, Chester G; Beaty, Barry J; Eisen, Lars

    2008-09-01

    Novel, inexpensive solutions are needed for improved management of vector-borne and other diseases in resource-poor environments. Emerging free software providing access to satellite imagery and simple editing tools (e.g. Google Earth) complement existing geographic information system (GIS) software and provide new opportunities for: (i) strengthening overall public health capacity through development of information for city infrastructures; and (ii) display of public health data directly on an image of the physical environment. We used freely accessible satellite imagery and a set of feature-making tools included in the software (allowing for production of polygons, lines and points) to generate information for city infrastructure and to display disease data in a dengue decision support system (DDSS) framework. Two cities in Mexico (Chetumal and Merida) were used to demonstrate that a basic representation of city infrastructure useful as a spatial backbone in a DDSS can be rapidly developed at minimal cost. Data layers generated included labelled polygons representing city blocks, lines representing streets, and points showing the locations of schools and health clinics. City blocks were colour-coded to show presence of dengue cases. The data layers were successfully imported as shapefiles into GIS software. The combination of Google Earth and free GIS software (e.g. HealthMapper, developed by WHO, and SIGEpi, developed by PAHO) has tremendous potential to strengthen overall public health capacity and facilitate decision support system approaches to prevention and control of vector-borne diseases in resource-poor environments.
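
    As a concrete illustration of the feature-making workflow described above, the sketch below emits a colour-coded KML polygon for a single city block, of the kind that can be drawn over satellite imagery in Google Earth and later converted to a shapefile. The coordinates and styling are invented for illustration.

    ```python
    # Minimal sketch of a colour-coded city-block polygon in KML; block name,
    # coordinates, and colours are made up for illustration.
    def block_kml(name, lonlat_ring, dengue_present):
        color = "7f0000ff" if dengue_present else "7f00ff00"  # KML is aabbggrr
        coords = " ".join(f"{lon},{lat},0" for lon, lat in lonlat_ring)
        return f"""<Placemark>
      <name>{name}</name>
      <Style><PolyStyle><color>{color}</color></PolyStyle></Style>
      <Polygon><outerBoundaryIs><LinearRing>
        <coordinates>{coords}</coordinates>
      </LinearRing></outerBoundaryIs></Polygon>
    </Placemark>"""

    ring = [(-88.306, 18.501), (-88.305, 18.501), (-88.305, 18.502),
            (-88.306, 18.502), (-88.306, 18.501)]   # closed ring, invented
    print(block_kml("Block 12", ring, dengue_present=True))
    ```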

  14. NASA Tech Briefs, June 2013

    NASA Technical Reports Server (NTRS)

    2013-01-01

    Topics include: Cloud Absorption Radiometer Autonomous Navigation System - CANS, Software Method for Computed Tomography Cylinder Data Unwrapping, Re-slicing, and Analysis, Discrete Data Qualification System and Method Comprising Noise Series Fault Detection, Simple Laser Communications Terminal for Downlink from Earth Orbit at Rates Exceeding 10 Gb/s, Application Program Interface for the Orion Aerodynamics Database, Hyperspectral Imager-Tracker, Web Application Software for Ground Operations Planning Database (GOPDb) Management, Software Defined Radio with Parallelized Software Architecture, Compact Radar Transceiver with Included Calibration, Phase Change Material Thermal Power Generator, The Thermal Hogan - A Means of Surviving the Lunar Night, Micromachined Active Magnetic Regenerator for Low-Temperature Magnetic Coolers, Nano-Ceramic Coated Plastics, Preparation of a Bimetal Using Mechanical Alloying for Environmental or Industrial Use, Phase Change Material for Temperature Control of Imager or Sounder on GOES Type Satellites in GEO, Dual-Compartment Inflatable Suitlock, Modular Connector Keying Concept, Genesis Ultrapure Water Megasonic Wafer Spin Cleaner, Piezoelectrically Initiated Pyrotechnic Igniter, Folding Elastic Thermal Surface - FETS, Multi-Pass Quadrupole Mass Analyzer, Lunar Sulfur Capture System, Environmental Qualification of a Single-Crystal Silicon Mirror for Spaceflight Use, Planar Superconducting Millimeter-Wave/Terahertz Channelizing Filter, Qualification of UHF Antenna for Extreme Martian Thermal Environments, Ensemble Eclipse: A Process for Prefab Development Environment for the Ensemble Project, ISS Live!, Space Operations Learning Center (SOLC) iPhone/iPad Application, Software to Compare NPP HDF5 Data Files, Planetary Data Systems (PDS) Imaging Node Atlas II, Automatic Calibration of an Airborne Imaging System to an Inertial Navigation Unit, Translating MAPGEN to ASPEN for MER, Support Routines for In Situ Image Processing, and Semi-Supervised Eigenbasis Novelty Detection.

  15. PrimerSuite: A High-Throughput Web-Based Primer Design Program for Multiplex Bisulfite PCR.

    PubMed

    Lu, Jennifer; Johnston, Andrew; Berichon, Philippe; Ru, Ke-Lin; Korbie, Darren; Trau, Matt

    2017-01-24

    The analysis of DNA methylation at CpG dinucleotides has become a major research focus due to its regulatory role in numerous biological processes, but the requisite need for assays which amplify bisulfite-converted DNA represents a major bottleneck due to the unique design constraints imposed on bisulfite-PCR primers. Moreover, a review of the literature indicated no available software solution that accommodates high-throughput primer design, multiplex amplification assays, and primer-dimer prediction. In response, the tri-modular software package PrimerSuite was developed to support bisulfite multiplex PCR applications. This software was constructed to (i) design bisulfite primers against multiple regions simultaneously (PrimerSuite), (ii) screen for primer-primer dimerizing artefacts (PrimerDimer), and (iii) support multiplex PCR assays (PrimerPlex). Moreover, a major focus in the development of this software package was extensive empirical validation, and over 1300 unique primer pairs have been successfully designed and screened, with over 94% of them producing amplicons of the expected size, and an average mapping efficiency of 93% when screened using bisulfite multiplex resequencing. The potential use of the software in other bisulfite-based applications such as methylation-specific PCR is under consideration for future updates. This resource is freely available for use at the PrimerSuite website (www.primer-suite.com).
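
    The design constraints on bisulfite primers stem from the conversion chemistry itself: unmethylated cytosines read as thymines in the converted template, while cytosines in methylated CpG sites are protected. The sketch below converts a top-strand sequence under that standard model; it is illustrative only and is not PrimerSuite's actual algorithm.

    ```python
    # Illustrative top-strand bisulfite conversion: C -> T except methylated
    # CpG cytosines, which are protected. Not PrimerSuite's implementation.
    def bisulfite_convert(seq, methylated_cpg=True):
        out = []
        for i, base in enumerate(seq.upper()):
            if base == "C":
                in_cpg = i + 1 < len(seq) and seq[i + 1].upper() == "G"
                out.append("C" if (in_cpg and methylated_cpg) else "T")
            else:
                out.append(base)
        return "".join(out)

    print(bisulfite_convert("ACGTCCGA"))  # -> 'ACGTTCGA' (CpG cytosines kept)
    ```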

  16. Data Analysis of a Space Experiment: Common Software Tackles Uncommon Task

    NASA Technical Reports Server (NTRS)

    Wilkinson, R. Allen

    1998-01-01

    Presented here are the software adaptations developed by laboratory scientists to process the space experiment data products from three experiments on two International Microgravity Laboratory Missions (IML-1 and IML-2). The challenge was to accommodate interacting with many types of hardware and software developed by both European Space Agency (ESA) and NASA aerospace contractors, where data formats were neither commercial nor familiar to scientists. Some of the data had been corrupted by bit shifting of byte boundaries. Least-significant/most-significant byte swapping also occurred, as might be expected for the various hardware platforms involved. The data consisted of 20 GBytes per experiment of both numerical and image data. A significant percentage of the bytes were consumed in NASA formatting with extra layers of packetizing structure. It was provided in various pieces to the scientists on magnetic tapes, SyQuest cartridges, DAT tapes, CD-ROMs, analog video tapes, and by network FTP. In this paper I will provide some science background and present the software processing used to make the data useful in the months after the missions.
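
    The repairs alluded to above (byte swapping across platforms and bit-shifted byte boundaries) are mechanical once the defect is diagnosed. A sketch of both fixes follows; the word sizes and shift amounts are illustrative, not the missions' actual formats.

    ```python
    # Illustrative low-level repairs: 16-bit byte swapping and undoing a bit
    # shift that crossed byte boundaries. Offsets and sizes are assumptions.
    import struct

    def swap16(buf: bytes) -> bytes:
        """Swap least-/most-significant bytes of 16-bit words."""
        n = len(buf) // 2
        return struct.pack("<%dH" % n, *struct.unpack(">%dH" % n, buf[:2 * n]))

    def unshift_bits(buf: bytes, shift: int) -> bytes:
        """Undo a left bit-shift that smeared data across byte boundaries."""
        val = int.from_bytes(buf, "big") >> shift
        return val.to_bytes(len(buf), "big")

    raw = bytes([0x12, 0x34, 0x56, 0x78])
    print(swap16(raw).hex())           # 34127856
    print(unshift_bits(raw, 4).hex())  # 01234567
    ```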

  17. RELAP-7 Software Verification and Validation Plan - Requirements Traceability Matrix (RTM) Part 2: Code Assessment Strategy, Procedure, and RTM Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Jun Soo; Choi, Yong Joon; Smith, Curtis Lee

    2016-09-01

    This document addresses two subjects involved with the RELAP-7 Software Verification and Validation Plan (SVVP): (i) the principles and plan to assure the independence of RELAP-7 assessment through the code development process, and (ii) the work performed to establish the RELAP-7 assessment plan, i.e., the assessment strategy, literature review, and identification of RELAP-7 requirements. The Requirements Traceability Matrices (RTMs) proposed in a previous document (INL-EXT-15-36684) are then updated. These RTMs provide an efficient way to evaluate the RELAP-7 development status as well as the maturity of RELAP-7 assessment through the development process.

  18. Use of software tools in the development of real time software systems

    NASA Technical Reports Server (NTRS)

    Garvey, R. C.

    1981-01-01

    The transformation of a preexisting software system into a larger and more versatile system with different mission requirements is discussed. The history of this transformation is used to illustrate the use of structured real-time programming techniques and tools to produce maintainable and somewhat transportable systems. The predecessor system is a single ground diagnostic system; its purpose is to exercise a computer-controlled hardware set prior to its deployment in its functional environment, as well as to test the equipment set by supplying certain well-known stimuli. The successor system (FTF) is required to perform certain testing and control functions while this hardware set is in its functional environment. Both systems must deal with heavy user input/output loads, and a new I/O requirement is included in the design of the FTF system. Human factors are enhanced by adding an improved console interface and a special-function keyboard handler. The additional features required the addition of much new software to the original set from which FTF was developed. As a result, it was necessary to split the system into a dual-program configuration with high rates of inter-program communications. A generalized information-routing mechanism is used to support this configuration.

  19. Power consumption analysis of pump station control systems based on fuzzy controllers with discrete terms in iThink software

    NASA Astrophysics Data System (ADS)

    Muravyova, E. A.; Bondarev, A. V.; Sharipov, M. I.; Galiaskarova, G. R.; Kubryak, A. I.

    2018-03-01

    In this article, the power consumption of pump station control systems is discussed. To study the issue, two simulation models of oil-level control were developed in the iThink software: one using a frequency converter only, and one using a frequency converter together with a fuzzy controller. A simulation of the oil-level control was carried out in graphical form, and plots of pump power consumption were obtained. Based on the initial and obtained data, the efficiency of the considered control systems was compared, and the power consumption of the two systems was shown graphically. The analysis of the models has shown that a control circuit with a frequency converter and a fuzzy controller is more economical and safer to use.
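
    A fuzzy level controller of the kind modeled here can be sketched compactly: the level error is fuzzified through triangular membership functions, rules fire with partial strengths, and a crisp frequency correction is recovered by centroid defuzzification. All terms, ranges, and output singletons below are illustrative assumptions, not the article's iThink model.

    ```python
    # Minimal Mamdani-style sketch: oil-level error -> pump frequency correction.
    # Membership ranges and output singletons are invented for illustration.
    def tri(x, a, b, c):
        """Triangular membership function peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def fuzzy_correction(level_error):
        # rule strengths for error = setpoint - level, in metres (assumed)
        low  = tri(level_error, -1.0, -0.5, 0.0)   # level too high -> slow pump
        ok   = tri(level_error, -0.5,  0.0, 0.5)
        high = tri(level_error,  0.0,  0.5, 1.0)   # level too low  -> speed up
        # singleton outputs (Hz) weighted by strength; centroid defuzzification
        outputs = {-5.0: low, 0.0: ok, 5.0: high}
        total = sum(outputs.values()) or 1.0
        return sum(hz * w for hz, w in outputs.items()) / total

    print(fuzzy_correction(0.25))  # -> +2.5 Hz, a moderate speed-up
    ```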

  20. CANES Contracting Strategies for Full Deployment

    DTIC Science & Technology

    2012-01-01

    9 CANES Program Functions in Full Deployment... contractors will design CANES, identifying specific hardware and developing the integration software necessary to consolidate existing C4I functions. At... would be responsible for executing the purchased design and assembling the systems, ensuring that the integration software is functioning. An...

  1. Situational Awareness Geospatial Application (iSAGA)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sher, Benjamin

    Situational Awareness Geospatial Application (iSAGA) is a geospatial situational-awareness software tool that uses an algorithm to extract location data from nearly any internet-based or custom data source and display it geospatially; it allows user-friendly spatial analysis using custom-developed tools, searches complex Geographic Information System (GIS) databases, and accesses high-resolution imagery. iSAGA has application to federal, state and local emergency response, consequence management, law enforcement, and emergency operations organizations and other decision makers as a tool to provide complete, visual situational awareness using data feeds and tools selected by the individual agency or organization. Feeds may be layered and custom tools developed to uniquely suit each subscribing agency or organization. iSAGA may similarly be applied to international agencies and organizations.

  2. UWB Tracking Software Development

    NASA Technical Reports Server (NTRS)

    Gross, Julia; Arndt, Dickey; Ngo, Phong; Phan, Chau; Dusl, John; Ni, Jianjun; Rafford, Melinda

    2006-01-01

    An Ultra-Wideband (UWB) two-cluster Angle of Arrival (AOA) tracking prototype system is currently being developed and tested at NASA Johnson Space Center for space exploration applications. This talk discusses the software development efforts for this UWB two-cluster AOA tracking system. The role the software plays in this system is to take waveform data from two UWB radio receivers as input, feed this input into an AOA tracking algorithm, and generate the target position as output. The architecture of the software (Input/Output Interface and Algorithm Core) will be introduced in this talk. The development of this software has three phases. In Phase I, the software is mostly Matlab driven and calls C++ socket functions to provide the communication links to the radios. This is beneficial in the early stage, when it is necessary to frequently test changes in the algorithm. In Phase II, the software is mostly C++ driven and calls a Matlab function for the AOA tracking algorithm. This is beneficial for sending the tracking results to other systems and for improving the tracking update rate of the system. The third phase is part of future work: to have the software completely C++ driven with a graphical user interface. This software design enables the fine-resolution tracking of the UWB two-cluster AOA tracking system.
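
    The Input/Output Interface vs. Algorithm Core split described above can be sketched in a few lines: a socket layer receives waveform frames from the two receivers and hands them to a pluggable tracking routine. The port, frame layout, and the stubbed-out AOA routine below are placeholders, not the real system's interfaces.

    ```python
    # Sketch of the I/O-interface / algorithm-core separation; the port, frame
    # size, and the dummy AOA core are assumptions for illustration only.
    import socket
    import struct

    FRAME_SAMPLES = 256   # assumed waveform frame length per receiver

    def aoa_core(cluster_a, cluster_b):
        """Algorithm-core stub: return a dummy position from two waveforms."""
        return (0.0, 0.0, 0.0)   # a real AOA tracking algorithm goes here

    def io_interface(host="127.0.0.1", port=5005):
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.bind((host, port))
            while True:
                # one UDP datagram = two frames of float64 samples (assumed)
                data, _ = sock.recvfrom(8 * FRAME_SAMPLES * 2)
                samples = struct.unpack("<%dd" % (FRAME_SAMPLES * 2), data)
                a, b = samples[:FRAME_SAMPLES], samples[FRAME_SAMPLES:]
                print("target:", aoa_core(a, b))
    ```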

  3. Ares I Upper Stage Update

    NASA Technical Reports Server (NTRS)

    Davis, Daniel J.

    2010-01-01

    These presentation slides review progress in the development of the Ares I upper stage. The work includes development of a manufacturing and processing assembly that will reduce the time required by over 100 days; a robotic weld tool that is the largest welder of its kind in the United States; avionics and software; and logistics and operations systems.

  4. Flight code validation simulator

    NASA Astrophysics Data System (ADS)

    Sims, Brent A.

    1996-05-01

    An end-to-end simulation capability for development and validation of missile flight software on the actual embedded computer has been developed, utilizing a 486 PC, an i860 DSP coprocessor, an embedded flight computer, and custom dual-port memory interface hardware. This system allows real-time, interrupt-driven embedded flight software development and checkout. The flight software runs in a Sandia Digital Airborne Computer and reads and writes the actual hardware sensor locations in which Inertial Measurement Unit data resides. The simulator provides six-degree-of-freedom real-time dynamic simulation and accurate real-time discrete sensor data, and acts on commands and discretes from the flight computer. This system was utilized in the development and validation of the successful premier flight of the Digital Miniature Attitude Reference System in January 1995 at the White Sands Missile Range on a two-stage, attitude-controlled sounding rocket.

  5. Implementing Educational Software and Evaluating Its Academic Effectiveness: Part I.

    ERIC Educational Resources Information Center

    Jolicoeur, Karen; Berger, Dale E.

    1988-01-01

    This basic plan for implementing educational software in the classroom incorporates a research design for evaluating its effectiveness. A study of fifth grade classrooms using game and tutorial software for spelling and fractions is used as an example. Topics discussed include software selection, selecting groups of comparable ability, and use of…

  6. Developing a Model Component

    NASA Technical Reports Server (NTRS)

    Fields, Christina M.

    2013-01-01

    The Spaceport Command and Control System (SCCS) Simulation Computer Software Configuration Item (CSCI) is responsible for providing simulations to support test and verification of SCCS hardware and software. The Universal Coolant Transporter System (UCTS) was a Space Shuttle Orbiter support piece of the Ground Servicing Equipment (GSE). The initial purpose of the UCTS was to provide two support services to the Space Shuttle Orbiter immediately after landing at the Shuttle Landing Facility. The UCTS is designed with the capability of servicing future space vehicles, including all Space Station requirements necessary for the MPLM modules. The Simulation uses GSE models to stand in for the actual systems to support testing of SCCS systems during their development. As an intern at Kennedy Space Center (KSC), my assignment was to develop a model component for the UCTS. I was given a fluid component (dryer) to model in Simulink, and I completed training for UNIX and Simulink. The dryer is a Catch-All replaceable-core filter-dryer. The filter-dryer provides maximum protection for the thermostatic expansion valve and solenoid valve from dirt that may be in the system, and also protects the valves from freezing up. I researched fluid dynamics to understand the function of my component. The filter-dryer was modeled by determining the effects it has on the pressure and velocity of the system. I used Bernoulli's equation to calculate the pressure and velocity differential through the dryer. I created my filter-dryer model in Simulink and wrote the test script to test the component. I completed component testing and captured test data. The finalized model was sent for peer review for any improvements. I participated in Simulation meetings and was involved in the subsystem design process and team collaborations. I gained valuable work experience and insight into a career path as an engineer.
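
    Since the abstract names Bernoulli's equation as the basis of the dryer model, a quick worked version helps fix ideas: combining continuity with Bernoulli gives the pressure drop across an area restriction. The density and geometry below are invented for illustration, not the UCTS component's actual values.

    ```python
    # Worked Bernoulli sketch: pressure drop across an area restriction.
    # Density and areas are illustrative assumptions.
    RHO = 1100.0  # coolant density, kg/m^3 (assumed)

    def bernoulli_dp(v1, a1, a2):
        """Pressure drop p1 - p2 when flow accelerates from area a1 into a2."""
        v2 = v1 * a1 / a2                      # continuity: A1*v1 = A2*v2
        return 0.5 * RHO * (v2**2 - v1**2)     # p1 - p2 = rho/2 * (v2^2 - v1^2)

    print(bernoulli_dp(v1=2.0, a1=1e-3, a2=5e-4))  # ~6600 Pa for a 2:1 restriction
    ```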

  7. Closing the Certification Gaps in Adaptive Flight Control Software

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen A.

    2008-01-01

    Over the last five decades, extensive research has been performed to design and develop adaptive control systems for aerospace systems and other applications where the capability to change controller behavior at different operating conditions is highly desirable. Although adaptive flight control has been partially implemented through the use of gain-scheduled control, truly adaptive control systems using learning algorithms and on-line system identification methods have not seen commercial deployment, because the certification process for adaptive flight control software for use in national air space has not yet been decided. The purpose of this paper is to examine the gaps between the state-of-the-art methodologies used to certify conventional (i.e., non-adaptive) flight control system software and what will likely be needed to satisfy FAA airworthiness requirements. These gaps include the lack of a certification plan or process guide, the need to develop verification and validation tools and methodologies to analyze adaptive controller stability and convergence, and the need for metrics to evaluate adaptive controller performance at off-nominal flight conditions. This paper presents the major certification gap areas, a description of the current state of the verification methodologies, and the further research efforts that will likely be needed to close the gaps remaining in current certification practices. It is envisioned that closing the gap will require certain advances in simulation methods, comprehensive methods to determine learning algorithm stability and convergence rates, the development of performance metrics for adaptive controllers, the application of formal software assurance methods, the application of on-line software monitoring tools for adaptive controller health assessment, and the development of a certification case for adaptive system safety of flight.

  8. Development and evaluation of a web-based software for crash data collection, processing and analysis.

    PubMed

    Montella, Alfonso; Chiaradonna, Salvatore; Criscuolo, Giorgio; De Martino, Salvatore

    2017-02-05

    The first step in the development of an effective safety management system is to create reliable crash databases, since the quality of decision making in road safety depends on the quality of the data on which decisions are based. Improving crash data is a worldwide priority, as highlighted in the Global Plan for the Decade of Action for Road Safety adopted by the United Nations, which recognizes that the overall goal of the plan will be attained by improving the quality of data collection at the national, regional and global levels. Crash databases provide the basic information for effective highway safety efforts at any level of government, but lack of uniformity among countries and among the different jurisdictions in the same country is observed. Several existing databases show significant drawbacks which hinder their effective use for safety analysis and improvement. Furthermore, modern technologies offer great potential for significant improvements of existing methods and procedures for crash data collection, processing and analysis. To address these issues, in this paper we present the development and evaluation of a web-based, platform-independent software for crash data collection, processing and analysis. The software is designed for mobile and desktop electronic devices and enables a guided and automated drafting of the crash report, assisting police officers both on-site and in the office. The software development was based both on a detailed critical review of existing Australasian, EU, and U.S. crash databases and software and on continuous consultation with the stakeholders. The evaluation was carried out by comparing the completeness, timeliness, and accuracy of crash data before and after the use of the software in the city of Vico Equense, in the south of Italy, showing significant advantages. The amount of collected information increased from 82 variables to 268 variables, i.e., a 227% increase. The time saving was more than one hour per crash, i.e., a 36% reduction. The on-site data collection did not produce time savings; however, this is a temporary limitation that should disappear as officers become more familiar with the software. The phase of evaluation, processing and analysis carried out in the office was dramatically shortened, i.e., a 69% reduction. Another benefit was standardization, which allowed fast and consistent data analysis and evaluation. While all these benefits are remarkable, the most valuable benefit of the new procedure was the reduction of police officers' mistakes during the manual operations of survey and data evaluation. Because of these benefits, the satisfaction questionnaires administered to the police officers after the testing phase showed very good acceptance of the procedure. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. 77 FR 50128 - Office of Direct Service and Contracting Tribes; National Indian Health Outreach and Education...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-20

    .... If the proposed projects include information technology (i.e., hardware, software, etc.), provide... information developed and disseminated through the projects is appropriate, useful and addresses the most... and HHS. With the limited funds available for...

  10. A Common DPU Platform for ESA JUICE Mission Instruments

    NASA Astrophysics Data System (ADS)

    Aberg, Martin; Hellstrom, Daniel; Samuelsson, Arne; Torelli, Felice

    2016-08-01

    This paper describes the resulting hardware and software platform, based on the GR712RC [1] LEON3-FT, that Cobham Gaisler developed in accordance with the common system requirements of the ten scientific instruments on board the ESA JUICE spacecraft destined for the Jupiter system [8]. The radiation-hardened DPU platform features EDAC-protected boot, application, and working memories of configurable sizes, and SpaceWire, FPGA I/O (32/16/8), GPIO, UART, and SPI I/O interfaces. The design has undergone PSA, risk, WCA, and radiation analyses, etc., to justify component and design choices, resulting in a robust design that can be used in spacecraft requiring a total dose of up to 100 krad(Si). The manufactured prototype board uses engineering models of the flight components to ensure that development is representative. Validated boot, standby, and driver software accommodates the various DPU platform configurations. The boot software performs low-level DPU initialization; the standby software handles OBC SpaceWire communication and the loading and execution of application images, typically stored in the non-volatile application memory.

  11. Software agents and the route to the information economy

    PubMed Central

    Kephart, Jeffrey O.

    2002-01-01

    Humans are on the verge of losing their status as the sole economic species on the planet. In private laboratories and in the Internet laboratory, researchers and developers are creating a variety of autonomous economically motivated software agents endowed with algorithms for maximizing profit or utility. Many economic software agents will function as miniature businesses, purchasing information inputs from other agents, combining and refining them into information goods and services, and selling them to humans or other agents. Their mutual interactions will form the information economy: a complex economic web of information goods and services that will adapt to the ever-changing needs of people and agents. The information economy will be the largest multiagent system ever conceived and an integral part of the world's economy. I discuss a possible route toward this vision, beginning with present-day Internet trends suggesting that agents will charge one another for information goods and services. Then, to establish that agents can be competent price setters, I describe some laboratory experiments pitting software bidding agents against human bidders. The agents' superior performance suggests they will be used on a broad scale, which in turn suggests that interactions among agents will become frequent and significant. How will this affect macroscopic economic behavior? I describe some interesting phenomena that my colleagues and I have observed in simulations of large populations of automated buyers and sellers, such as price war cycles. I conclude by discussing fundamental scientific challenges that remain to be addressed as we journey toward the information economy. PMID:12011399

  12. ELISA, a demonstrator environment for information systems architecture design

    NASA Technical Reports Server (NTRS)

    Panem, Chantal

    1994-01-01

    This paper describes an approach to reusing software engineering technology in the area of ground space system design. System engineers have many needs in common with software developers: sharing a common database, capitalization of knowledge, definition of a common design process, and communication between different technical domains. Moreover, system designers need to simulate their systems dynamically as early as possible. Software development environments, methods and tools have now become operational and widely used. Their architecture is based on a unique object base and a set of common management services, and they host a family of tools for each life-cycle activity. In late 1992, CNES decided to develop a demonstrative software environment supporting some system activities, with the design of ground space data processing systems chosen as the application domain. ELISA (Integrated Software Environment for Architectures Specification) was specified as a 'demonstrator', i.e., a sufficient basis for demonstrations, evaluation and future operational enhancements. A process with three phases was implemented: system requirements definition, design of system architecture models, and selection of physical architectures. Each phase is composed of several activities that can be performed in parallel, with the support of Commercial Off-The-Shelf (COTS) tools. ELISA was delivered to CNES in January 1994 and is currently used for demonstrations and evaluations on real projects (e.g., the SPOT4 Satellite Control Center). New evolutions are under way.

  13. Risk Assessment Methodology for Software Supportability (RAMSS): guidelines for Adapting Software Supportability Evaluations

    DTIC Science & Technology

    1986-04-14

    [Figure residue: program life-cycle phases (concept definition, development/test, operation and maintenance) with associated tasks: track projected programs, review critical issues, prepare inputs to PMO.] ...development and beyond, evaluation criteria must include quantitative goals (the desired value) and thresholds (the value beyond which the charac...

  14. Star Tracker Performance Estimate with IMU

    NASA Technical Reports Server (NTRS)

    Aretskin-Hariton, Eliot D.; Swank, Aaron J.

    2015-01-01

    A software tool for estimating cross-boresight error of a star tracker combined with an inertial measurement unit (IMU) was developed to support trade studies for the Integrated Radio and Optical Communication project (iROC) at the National Aeronautics and Space Administration Glenn Research Center. Typical laser communication systems, such as the Lunar Laser Communication Demonstration (LLCD) and the Laser Communication Relay Demonstration (LCRD), use a beacon to locate ground stations. iROC is investigating the use of beaconless precision laser pointing to enable laser communication at Mars orbits and beyond. Precision attitude knowledge is essential to the iROC mission to enable high-speed steering of the optical link. The preliminary concept to achieve this precision attitude knowledge is to use star trackers combined with an IMU. The Star Tracker Accuracy (STAcc) software was developed to rapidly assess the capabilities of star tracker and IMU configurations. STAcc determines the overall cross-boresight error of a star tracker with an IMU given the characteristic parameters: quantum efficiency, aperture, apparent star magnitude, exposure time, field of view, photon spread, detector pixels, spacecraft slew rate, maximum stars used for quaternion estimation, and IMU angular random walk. This paper discusses the supporting theory used to construct STAcc, verification of the program and sample results.
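
    While STAcc's actual formulation is not given here, a back-of-the-envelope model built from the same inputs conveys the idea: per-star centroid noise scales with the pixel field of view over signal-to-noise, averaging over stars divides the error by the square root of their number, and IMU angular random walk accumulates between star-tracker updates. The sketch below is an assumed simplification for illustration only, not the STAcc algorithm.

    ```python
    # Simplified cross-boresight error estimate; the model and all numbers are
    # illustrative assumptions, not STAcc's formulation.
    import math

    def cross_boresight_error(fov_deg, n_pixels, snr, n_stars,
                              arw_deg_rt_hr, gap_s):
        ifov_arcsec = fov_deg * 3600.0 / n_pixels     # arcsec per pixel
        centroid = ifov_arcsec / snr                   # per-star centroid noise
        tracker = centroid / math.sqrt(n_stars)        # averaging over stars
        imu = arw_deg_rt_hr * 3600.0 * math.sqrt(gap_s / 3600.0)  # ARW, arcsec
        return math.hypot(tracker, imu)                # combined 1-sigma, arcsec

    print(cross_boresight_error(fov_deg=8, n_pixels=1024, snr=50,
                                n_stars=10, arw_deg_rt_hr=0.01, gap_s=1.0))
    ```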

  15. A Data-Driven Solution for Performance Improvement

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Marketed as the "Software of the Future," Optimal Engineering Systems' P.I. EXPERT(TM) technology offers statistical process control and optimization techniques that are critical to businesses looking to restructure or accelerate operations in order to gain a competitive edge. Kennedy Space Center granted Optimal Engineering Systems the funding and aid necessary to develop a prototype of the process monitoring and improvement software. Completion of this prototype demonstrated that it was possible to integrate traditional statistical quality assurance tools with robust optimization techniques in a user-friendly format that is visually compelling. Using an expert system knowledge base, the software allows the user to determine objectives, capture constraints and out-of-control processes, predict results, and compute optimal process settings.

  16. Software Process Assessment (SPA)

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda H.; Sheppard, Sylvia B.; Butler, Scott A.

    1994-01-01

    NASA's environment mirrors the changes taking place in the nation at large, i.e. workers are being asked to do more work with fewer resources. For software developers at NASA's Goddard Space Flight Center (GSFC), the effects of this change are that we must continue to produce quality code that is maintainable and reusable, but we must learn to produce it more efficiently and less expensively. To accomplish this goal, the Data Systems Technology Division (DSTD) at GSFC is trying a variety of both proven and state-of-the-art techniques for software development (e.g., object-oriented design, prototyping, designing for reuse, etc.). In order to evaluate the effectiveness of these techniques, the Software Process Assessment (SPA) program was initiated. SPA was begun under the assumption that the effects of different software development processes, techniques, and tools, on the resulting product must be evaluated in an objective manner in order to assess any benefits that may have accrued. SPA involves the collection and analysis of software product and process data. These data include metrics such as effort, code changes, size, complexity, and code readability. This paper describes the SPA data collection and analysis methodology and presents examples of benefits realized thus far by DSTD's software developers and managers.

  17. The optimization problems of CP operation

    NASA Astrophysics Data System (ADS)

    Kler, A. M.; Stepanova, E. L.; Maximov, A. S.

    2017-11-01

    The problem of enhancing the energy and economic efficiency of CPs is an urgent one, and one of the main methods for solving it is optimization of CP operation. To solve the optimization problems of CP operation, the Energy Systems Institute, SB RAS, has developed software that makes it possible to perform optimization calculations of CP operation. The software is based on the techniques and software tools of mathematical modeling and optimization of heat and power installations. Detailed mathematical models of new equipment have been developed that describe with sufficient accuracy the processes occurring in the installations. The developed models include steam turbine models (based on the checking calculation) which take account of all steam turbine compartments and the regeneration system; they also enable calculations with regenerative heaters disconnected. The software for mathematical modeling of equipment and optimization of CP operation implements the technique for optimizing CP operating conditions as a set of software tools and integrates them in a common user interface. The optimization of CP operation often generates the need to determine the minimum and maximum possible total useful electric capacity of the plant at set heat loads of consumers, i.e., the interval over which the CP capacity may vary. The software has been applied to optimize the operating conditions of the Novo-Irkutskaya CP of JSC “Irkutskenergo”. The efficiency of operating-condition optimization and the possibility of determining the CP energy characteristics needed for optimization of power system operation are shown.
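
    The "minimum and maximum possible capacity at set heat loads" question has the shape of a constrained optimization, which a toy version makes concrete: maximize the total electric output of two extraction turbines subject to meeting the heat demand. The coefficients below are invented for illustration and bear no relation to Novo-Irkutskaya CP data.

    ```python
    # Toy capacity-bound problem posed as a linear program; all coefficients
    # are invented assumptions for illustration.
    from scipy.optimize import linprog

    # variables: electric output p1, p2 (MW); each MW of power forgoes heat
    heat_per_mw = [0.8, 1.1]        # MW heat lost per MW electricity, per turbine
    heat_base   = [300.0, 250.0]    # MW heat delivered at zero extraction
    heat_demand = 420.0

    # maximize p1 + p2  <=>  minimize -(p1 + p2), with delivered heat >= demand
    res = linprog(c=[-1.0, -1.0],
                  A_ub=[heat_per_mw],                   # 0.8 p1 + 1.1 p2 <= slack
                  b_ub=[sum(heat_base) - heat_demand],
                  bounds=[(0, 120), (0, 150)])
    print("max electric output: %.1f MW" % -res.fun)
    # flipping the objective sign gives the minimum end of the interval
    ```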

  18. iStethoscope: a demonstration of the use of mobile devices for auscultation.

    PubMed

    Bentley, Peter J

    2015-01-01

    iStethoscope Pro is the first piece of software (an "App") produced for iOS devices, which enabled users to exploit their smartphones, music players, or tablets as stethoscopes. The software exploits the built-in microphone (and supports externally added microphones) and performs real-time amplification and filtering to enable heart sounds to be heard with high fidelity. The software also enables the heart sounds to be recorded, analyzed using a spectrogram, and to be transmitted to others via e-mail. This chapter describes the motivation, functionality, and results from this work.
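
    The core signal path (amplify and band-pass the microphone signal so that low-frequency heart sounds dominate) can be sketched offline in a few lines. The cutoff frequencies and gain below are assumptions for illustration, not the app's actual parameters.

    ```python
    # Offline sketch of band-pass filtering plus gain for heart sounds;
    # cutoffs (roughly 20-200 Hz) and gain are illustrative assumptions.
    import numpy as np
    from scipy.signal import butter, sosfilt

    def heart_sound_filter(x, fs, lo=20.0, hi=200.0, gain=20.0):
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        return np.clip(gain * sosfilt(sos, x), -1.0, 1.0)

    fs = 8000                                   # assumed sample rate
    t = np.arange(fs) / fs
    mic = 0.01 * np.sin(2 * np.pi * 60 * t) + 0.005 * np.random.randn(fs)
    out = heart_sound_filter(mic, fs)           # filtered, amplified signal
    ```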

  19. Software-codec-based full motion video conferencing on the PC using visual pattern image sequence coding

    NASA Astrophysics Data System (ADS)

    Barnett, Barry S.; Bovik, Alan C.

    1995-04-01

    This paper presents a real-time, full-motion video conferencing system based on the Visual Pattern Image Sequence Coding (VPISC) software codec. The prototype system hardware comprises two personal computers, two camcorders, two frame grabbers, and an Ethernet connection. The prototype system software has a simple structure: it runs under the Disk Operating System and includes a user interface, a video I/O interface, an event-driven network interface, and a free-running or frame-synchronous video codec that also acts as the controller for the video and network interfaces. Two video coders have been tested in this system; simple implementations of Visual Pattern Image Coding and VPISC have both proven to support full-motion video conferencing with good visual quality. Future work will concentrate on expanding this prototype to support the motion-compensated version of VPISC, as well as encompassing point-to-point modem I/O and multiple network protocols. The application will be ported to multiple hardware platforms and operating systems. The motivation for developing this prototype system is to demonstrate the practicality of software-based real-time video codecs. Furthermore, software video codecs are not only cheaper, but are more flexible system solutions, because they enable different computer platforms to exchange encoded video information without requiring on-board protocol-compatible video codec hardware. Software-based solutions enable true low-cost video conferencing that fits the 'open systems' model of interoperability that is so important for building portable hardware and software applications.
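
    In the visual-pattern family of coders, each small block is transmitted as a mean intensity plus the index of a best-matching visual pattern rather than as raw pixels, which is what makes a software-only real-time codec feasible. The sketch below is a heavily simplified flavour of that idea with a four-pattern codebook; the real VPIC/VPISC codecs define a richer pattern set and motion handling.

    ```python
    # Much-simplified visual-pattern block coder: each 4x4 block becomes
    # (mean, pattern index). The pattern set is an illustrative assumption.
    import numpy as np

    P = 4
    patterns = [np.ones((P, P)),                              # uniform
                np.repeat([[1, 1, 0, 0]], P, axis=0),         # vertical edge
                np.repeat([[1], [1], [0], [0]], P, axis=1),   # horizontal edge
                np.tri(P)]                                    # diagonal edge

    def encode_block(block):
        mean = block.mean()
        resid = block - mean
        # pick the zero-mean pattern with the strongest correlation
        scores = [float(np.sum(resid * (p - p.mean()))) for p in patterns]
        return mean, int(np.argmax(np.abs(scores)))

    img = np.random.rand(8, 8)
    codes = [encode_block(img[r:r+P, c:c+P])
             for r in range(0, 8, P) for c in range(0, 8, P)]
    print(codes)  # [(mean, pattern_index), ...] per block
    ```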

  20. The Role of Program Structure in Software Maintenance.

    DTIC Science & Technology

    1986-05-29

    We have entered an era in which it has become increasingly important to develop human engineering principles which will significantly... Programmers use slices when debugging. Communications of the ACM, 25, 446-452. Winer, B. J. (1971). Statistical principles in experimental design. New York... [remainder is unrecoverable OCR residue of a program listing]

  1. A Statistical Testing Approach for Quantifying Software Reliability; Application to an Example System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chu, Tsong-Lun; Varuttamaseni, Athi; Baek, Joo-Seok

    The U.S. Nuclear Regulatory Commission (NRC) encourages the use of probabilistic risk assessment (PRA) technology in all regulatory matters, to the extent supported by the state-of-the-art in PRA methods and data. Although much has been accomplished in the area of risk-informed regulation, risk assessment for digital systems has not been fully developed. The NRC established a plan for research on digital systems to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems in the PRAs of nuclear power plants (NPPs), and (2) incorporating digital systems in the NRC's risk-informed licensing and oversight activities. Under NRC's sponsorship, Brookhaven National Laboratory (BNL) explored approaches for addressing the failures of digital instrumentation and control (I and C) systems in the current NPP PRA framework. Specific areas investigated included PRA modeling of digital hardware, development of a philosophical basis for defining software failure, and identification of desirable attributes of quantitative software reliability methods. Based on the earlier research, statistical testing is considered a promising method for quantifying software reliability. This paper describes a statistical software testing approach for quantifying software reliability and applies it to the loop-operating control system (LOCS) of an experimental loop of the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL).
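
    Statistical testing for reliability quantification ultimately reduces to a sample-size question: how many failure-free, operationally sampled test runs demonstrate a failure probability below a target at a given confidence? The standard zero-failure binomial bound, sketched below, is the textbook form of that calculation; it illustrates the general approach, not necessarily the exact procedure applied to the LOCS.

    ```python
    # Zero-failure binomial bound: runs needed so that, with no observed
    # failures, P(failure per demand) < p0 holds at the stated confidence.
    import math

    def runs_required(p0, confidence):
        """Failure-free tests needed to demonstrate P(failure) < p0."""
        return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p0))

    print(runs_required(1e-3, 0.95))  # ~2995 runs for 10^-3 at 95% confidence
    ```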

  2. DiaFit: The Development of a Smart App for Patients with Type 2 Diabetes and Obesity.

    PubMed

    Modave, François; Bian, Jiang; Rosenberg, Eric; Mendoza, Tonatiuh; Liang, Zhan; Bhosale, Ravi; Maeztu, Carlos; Rodriguez, Camila; Cardel, Michelle I

    2016-01-01

    Optimal management of chronic diseases, such as type 2 diabetes (T2D) and obesity, requires patient-provider communication and proactive self-management from the patient. Mobile apps could be an effective strategy for improving patient-provider communication and provide resources for self-management to patients themselves. The objective of this paper is to describe the development of a mobile tool for patients with T2D and obesity that utilizes an integrative approach to facilitate patient-centered app development, with patient and physician interfaces. Our implementation strategy focused on the building of a multidisciplinary team to create a user-friendly and evidence-based app, to be used by patients in a home setting or at the point-of-care. We present the iterative design, development, and testing of DiaFit, an app designed to improve the self-management of T2D and obesity, using an adapted Agile approach to software implementation. The production team consisted of experts in mobile health, nutrition sciences, and obesity; software engineers; and clinicians. Additionally, the team included citizen scientists and clinicians who acted as the de facto software clients for DiaFit and therefore interacted with the production team throughout the entire app creation, from design to testing. DiaFit (version 1.0) is an open-source, inclusive iOS app that incorporates nutrition data, physical activity data, and medication and glucose values, as well as patient-reported outcomes. DiaFit supports the uploading of data from sensory devices via Bluetooth for physical activity (iOS step counts, FitBit, Apple watch) and glucose monitoring (iHealth glucose meter). The app provides summary statistics and graphics for step counts, dietary information, and glucose values that can be used by patients and their providers to make informed health decisions. The DiaFit iOS app was developed in Swift (version 2.2) with a Web back-end deployed on the Health Insurance Portability and Accountability Act compliant-ready Amazon Web Services cloud computing platform. DiaFit is publicly available on GitHub to the diabetes community at large, under the GNU General Public License agreement. Given the proliferation of health-related apps available to health consumers, it is essential to ensure that apps are evidence-based and user-oriented, with specific health conditions in mind. To this end, we have used a software development approach focusing on community and clinical engagement to create DiaFit, an app that assists patients with T2D and obesity to better manage their health through active communication with their providers and proactive self-management of their diseases.

  3. DiaFit: The Development of a Smart App for Patients with Type 2 Diabetes and Obesity

    PubMed Central

    Modave, François; Bian, Jiang; Rosenberg, Eric; Mendoza, Tonatiuh; Liang, Zhan; Bhosale, Ravi; Maeztu, Carlos; Rodriguez, Camila; Cardel, Michelle I

    2018-01-01

    Background Optimal management of chronic diseases, such as type 2 diabetes (T2D) and obesity, requires patient-provider communication and proactive self-management from the patient. Mobile apps could be an effective strategy for improving patient-provider communication and provide resources for self-management to patients themselves. Objective The objective of this paper is to describe the development of a mobile tool for patients with T2D and obesity that utilizes an integrative approach to facilitate patient-centered app development, with patient and physician interfaces. Our implementation strategy focused on the building of a multidisciplinary team to create a user-friendly and evidence-based app, to be used by patients in a home setting or at the point-of-care. Methods We present the iterative design, development, and testing of DiaFit, an app designed to improve the self-management of T2D and obesity, using an adapted Agile approach to software implementation. The production team consisted of experts in mobile health, nutrition sciences, and obesity; software engineers; and clinicians. Additionally, the team included citizen scientists and clinicians who acted as the de facto software clients for DiaFit and therefore interacted with the production team throughout the entire app creation, from design to testing. Results DiaFit (version 1.0) is an open-source, inclusive iOS app that incorporates nutrition data, physical activity data, and medication and glucose values, as well as patient-reported outcomes. DiaFit supports the uploading of data from sensory devices via Bluetooth for physical activity (iOS step counts, FitBit, Apple watch) and glucose monitoring (iHealth glucose meter). The app provides summary statistics and graphics for step counts, dietary information, and glucose values that can be used by patients and their providers to make informed health decisions. The DiaFit iOS app was developed in Swift (version 2.2) with a Web back-end deployed on the Health Insurance Portability and Accountability Act compliant-ready Amazon Web Services cloud computing platform. DiaFit is publicly available on GitHub to the diabetes community at large, under the GNU General Public License agreement. Conclusions Given the proliferation of health-related apps available to health consumers, it is essential to ensure that apps are evidence-based and user-oriented, with specific health conditions in mind. To this end, we have used a software development approach focusing on community and clinical engagement to create DiaFit, an app that assists patients with T2D and obesity to better manage their health through active communication with their providers and proactive self-management of their diseases. PMID:29388609

  4. Influence of Smartphones and Software on Acoustic Voice Measures

    PubMed Central

    GRILLO, ELIZABETH U.; BROSIOUS, JENNA N.; SORRELL, STACI L.; ANAND, SUPRAJA

    2016-01-01

    This study assessed the within-subject variability of voice measures captured using different recording devices (i.e., smartphones and a head-mounted microphone) and software programs (i.e., Analysis of Dysphonia in Speech and Voice (ADSV), Multi-Dimensional Voice Program (MDVP), and Praat). Correlations between the software programs that calculated the voice measures were also analyzed. Results demonstrated no significant within-subject variability across devices and software, and some of the measures were highly correlated across software programs. The study suggests that certain smartphones may be appropriate for recording daily voice measures representing the effects of vocal loading within individuals. In addition, even though different algorithms are used to compute voice measures across software programs, some of the programs and measures share a similar relationship. PMID:28775797

  5. Preoperative Planning of Orthopedic Procedures using Digitalized Software Systems.

    PubMed

    Steinberg, Ely L; Segev, Eitan; Drexler, Michael; Ben-Tov, Tomer; Nimrod, Snir

    2016-06-01

    The progression from standard celluloid films to digitalized technology led to the development of new software programs to fulfill the needs of preoperative planning. We describe here preoperative digitalized programs and the variety of conditions for which those programs can be used to facilitate preparation for surgery. A PubMed search using the keywords "digitalized software programs," "preoperative planning" and "total joint arthroplasty" was performed for all studies regarding preoperative planning of orthopedic procedures published from 1989 to 2014 in English. Digitalized software programs can import and export all picture archiving and communication system (PACS) files (i.e., X-rays, computerized tomograms, magnetic resonance images) from either the local workstation or any remote PACS. Two-dimensional (2D) and 3D CT scans were found to be reliable tools with high preoperative accuracy in predicting implant size. The short learning curve, user-friendly features, accurate prediction of implant size, decreased implant stock and low-cost maintenance make digitalized software programs an attractive tool in the preoperative planning of total joint replacement, fracture fixation, limb deformity repair and pediatric skeletal disorders.

  6. WTEC monograph on instrumentation, control and safety systems of Canadian nuclear facilities

    NASA Technical Reports Server (NTRS)

    Uhrig, Robert E.; Carter, Richard J.

    1993-01-01

    This report updates a 1989-90 survey of advanced instrumentation and controls (I&C) technologies and associated human factors issues in the U.S. and Canadian nuclear industries carried out by a team from Oak Ridge National Laboratory (Carter and Uhrig 1990). The authors found that the most advanced I&C systems are in the Canadian CANDU plants, where the newest plant (Darlington) has digital systems in almost 100 percent of its control systems and in over 70 percent of its plant protection system. Increased emphasis on human factors and cognitive science in modern control rooms has resulted in a reduced workload for the operators and the elimination of many human errors. Automation implemented through digital instrumentation and control is effectively changing the role of the operator to that of a systems manager. The hypothesis that properly introducing digital systems increases safety is supported by the Canadian experience. The performance of these digital systems has been achieved using appropriate quality assurance programs for both hardware and software development. Recent regulatory authority review of the development of safety-critical software has resulted in the creation of isolated software modules with well defined interfaces and more formal structure in the software generation. The ability of digital systems to detect impending failures and initiate a fail-safe action is a significant safety issue that should be of special interest to nuclear utilities and regulatory authorities around the world.

  7. Supporting Tablet Configuration, Tracking, and Infection Control Practices in Digital Health Interventions: Study Protocol.

    PubMed

    Furberg, Robert D; Ortiz, Alexa M; Zulkiewicz, Brittany A; Hudson, Jordan P; Taylor, Olivia M; Lewis, Megan A

    2016-06-27

    Tablet-based health care interventions have the potential to encourage patient care in a timelier manner, allow physicians convenient access to patient records, and provide an improved method for patient education. However, along with the continued adoption of tablet technologies, there is a concomitant need to develop protocols focusing on the configuration, management, and maintenance of these devices within the health care setting to support the conduct of clinical research. The objective was to develop three protocols to support tablet configuration, tablet management, and tablet maintenance. The Configurator software, Tile technology, and current infection control recommendations were employed to develop three distinct protocols for tablet-based digital health interventions. Configurator is mobile device management software specifically for iPhone operating system (iOS) devices. The capabilities and current applications of Configurator were reviewed and used to develop the protocol to support device configuration. Tile is a tracking tag associated with a free mobile app available for iOS and Android devices. The features associated with Tile were evaluated and used to develop the Tile protocol to support tablet management. Furthermore, current recommendations on preventing health care-related infections were reviewed to develop the infection control protocol to support tablet maintenance. This article provides three protocols: the Configurator protocol, the Tile protocol, and the infection control protocol. These protocols can help to ensure consistent implementation of tablet-based interventions, enhance fidelity when employing tablets for research purposes, and serve as a guide for tablet deployments within clinical settings.

  8. Automated, Parametric Geometry Modeling and Grid Generation for Turbomachinery Applications

    NASA Technical Reports Server (NTRS)

    Harrand, Vincent J.; Uchitel, Vadim G.; Whitmire, John B.

    2000-01-01

    The objective of this Phase I project is to develop a highly automated software system for rapid geometry modeling and grid generation for turbomachinery applications. The proposed system features a graphical user interface for interactive control, a direct interface to commercial CAD/PDM systems, support for IGES geometry output, and a scripting capability for obtaining a high level of automation and end-user customization of the tool. The developed system is fully parametric and highly automated, and therefore significantly reduces the turnaround time for 3D geometry modeling, grid generation and model setup. This facilitates design environments in which a large number of cases need to be generated, such as for parametric analysis and design optimization of turbomachinery equipment. In Phase I we successfully demonstrated the feasibility of the approach. The system has been tested on a wide variety of turbomachinery geometries, including several impellers and a multistage rotor-stator combination. In Phase II, we plan to integrate the developed system with turbomachinery design software and with commercial CAD/PDM software.

  9. Fault Tolerant Considerations and Methods for Guidance and Control Systems

    DTIC Science & Technology

    1987-07-01

    multifunction devices such as microprocessors with software. In striving toward the economic goal, however, a cost is incurred in a different coin, i.e.... therefore been developed which reduces the software risk to acceptable proportions. Several of the techniques thus developed incur no significant cost... complex that their design and implementation need computerized tools in order to be cost-effective (in a broad sense, including the capability of

  10. Automated Software Analysis of Fetal Movement Recorded during a Pregnant Woman's Sleep at Home.

    PubMed

    Nishihara, Kyoko; Ohki, Noboru; Kamata, Hideo; Ryo, Eiji; Horiuchi, Shigeko

    2015-01-01

    Fetal movement is an important biological index of fetal well-being. Since 2008, we have been developing an original capacitive acceleration sensor and device that a pregnant woman can easily use to record fetal movement by herself at home during sleep. In this study, we report a newly developed automated software system for analyzing recorded fetal movement. This study will introduce the system and compare its results to those of a manual analysis of the same fetal movement signals (Experiment I). We will also demonstrate an appropriate way to use the system (Experiment II). In Experiment I, fetal movement data reported previously for six pregnant women at 28-38 gestational weeks were used. We evaluated the agreement of the manual and automated analyses for the same 10-sec epochs using prevalence-adjusted bias-adjusted kappa (PABAK) including quantitative indicators for prevalence and bias. The mean PABAK value was 0.83, which can be considered almost perfect. In Experiment II, twelve pregnant women at 24-36 gestational weeks recorded fetal movement at night once every four weeks. Overall, mean fetal movement counts per hour during maternal sleep significantly decreased along with gestational weeks, though individual differences in fetal development were noted. This newly developed automated analysis system can provide important data throughout late pregnancy.
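
    PABAK itself is a one-line statistic: it rescales the observed proportion of agreement as PABAK = 2*Po - 1, sidestepping the prevalence and bias sensitivity of ordinary kappa. A minimal sketch over paired 10-sec epoch labels follows; the example data are made up.

    ```python
    # PABAK for binary epoch labels: 2 * observed agreement - 1.
    # The example label sequences are invented for illustration.
    def pabak(labels_a, labels_b):
        matches = sum(a == b for a, b in zip(labels_a, labels_b))
        po = matches / len(labels_a)      # observed proportion of agreement
        return 2.0 * po - 1.0

    manual    = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]   # fetal movement per epoch
    automated = [1, 0, 1, 0, 0, 1, 0, 0, 1, 1]
    print(pabak(manual, automated))  # 0.8
    ```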

  11. Automated Software Analysis of Fetal Movement Recorded during a Pregnant Woman’s Sleep at Home

    PubMed Central

    Nishihara, Kyoko; Ohki, Noboru; Kamata, Hideo; Ryo, Eiji; Horiuchi, Shigeko

    2015-01-01

    Fetal movement is an important biological index of fetal well-being. Since 2008, we have been developing an original capacitive acceleration sensor and device that a pregnant woman can easily use to record fetal movement by herself at home during sleep. In this study, we report a newly developed automated software system for analyzing recorded fetal movement. We introduce the system and compare its results to those of a manual analysis of the same fetal movement signals (Experiment I), and we also demonstrate an appropriate way to use the system (Experiment II). In Experiment I, fetal movement data reported previously for six pregnant women at 28-38 gestational weeks were used. We evaluated the agreement of the manual and automated analyses for the same 10-sec epochs using prevalence-adjusted bias-adjusted kappa (PABAK), including quantitative indicators for prevalence and bias. The mean PABAK value was 0.83, which can be considered almost perfect. In Experiment II, twelve pregnant women at 24-36 gestational weeks recorded fetal movement at night once every four weeks. Overall, mean fetal movement counts per hour during maternal sleep decreased significantly with gestational age, though individual differences in fetal development were noted. This newly developed automated analysis system can provide important data throughout late pregnancy. PMID:26083422

  12. iELVis: An open source MATLAB toolbox for localizing and visualizing human intracranial electrode data.

    PubMed

    Groppe, David M; Bickel, Stephan; Dykstra, Andrew R; Wang, Xiuyuan; Mégevand, Pierre; Mercier, Manuel R; Lado, Fred A; Mehta, Ashesh D; Honey, Christopher J

    2017-04-01

    Intracranial electrical recordings (iEEG) and brain stimulation (iEBS) are invaluable human neuroscience methodologies. However, the value of such data is often unrealized as many laboratories lack tools for localizing electrodes relative to anatomy. To remedy this, we have developed a MATLAB toolbox for intracranial electrode localization and visualization, iELVis. NEW METHOD: iELVis uses existing tools (BioImage Suite, FSL, and FreeSurfer) for preimplant magnetic resonance imaging (MRI) segmentation, neuroimaging coregistration, and manual identification of electrodes in postimplant neuroimaging. Subsequently, iELVis implements methods for correcting electrode locations for postimplant brain shift with millimeter-scale accuracy and provides interactive visualization on 3D surfaces or in 2D slices with optional functional neuroimaging overlays. iELVis also localizes electrodes relative to FreeSurfer-based atlases and can combine data across subjects via the FreeSurfer average brain. It takes 30-60 min of user time and 12-24 h of computer time to localize and visualize electrodes from one brain. We demonstrate iELVis's functionality by showing that three methods for mapping primary hand somatosensory cortex (iEEG, iEBS, and functional MRI) provide highly concordant results. COMPARISON WITH EXISTING METHODS: iELVis is the first public software for electrode localization that corrects for brain shift, maps electrodes to an average brain, and supports neuroimaging overlays. Moreover, its interactive visualizations are powerful and its tutorial material is extensive. iELVis promises to speed the progress and enhance the robustness of intracranial electrode research. The software and extensive tutorial materials are freely available as part of the EpiSurg software project: https://github.com/episurg/episurg. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Development of instructional, interactive, multimedia anatomy dissection software: a student-led initiative.

    PubMed

    Inwood, Matthew J; Ahmad, Jamil

    2005-11-01

    Although dissection provides an unparalleled means of teaching gross anatomy, it constitutes a significant logistical and financial investment for educational institutions. The increasing availability and waning cost of computer equipment has enabled many institutions to supplement their anatomy curriculum with Computer Aided Learning (CAL) software. At the Royal College of Surgeons in Ireland, two undergraduate medical students designed and produced instructional anatomy dissection software for use by first and second year medical students. The software consists of full-motion, narrated, QuickTime MPG movies presented in a Macromedia environment. Forty-four movies, between 1 and 11 min in duration, were produced. Each movie corresponds to a dissection class and precisely demonstrates the dissection and educational objectives for that class. The software is distributed to students free of charge and they are encouraged to install it on their Apple iBook computers. Results of a student evaluation indicated that the software was useful, easy to use, and improved the students' experience in the dissection classes. The evaluation also indicated that only a minority of students regularly used the software or had it installed on their laptop computers. Accordingly, effort should also be directed toward making the software more accessible and increasing students' comfort and familiarity with novel instructional media. The successful design and implementation of this software demonstrates that CAL software can be employed to augment, enhance and improve anatomy instruction. In addition, effective, high quality, instructional multimedia software can be tailored to an educational institution's requirements and produced by novice programmers at minimal cost. Copyright 2005 Wiley-Liss, Inc.

  14. Cognitive task analysis-based design and authoring software for simulation training.

    PubMed

    Munro, Allen; Clark, Richard E

    2013-10-01

    The development of more effective medical simulators requires a collaborative team effort where three kinds of expertise are carefully coordinated: (1) exceptional medical expertise focused on providing complete and accurate information about the medical challenges (i.e., critical skills and knowledge) to be simulated; (2) instructional expertise focused on the design of simulation-based training and assessment methods that produce maximum learning and transfer to patient care; and (3) software development expertise that permits the efficient design and development of the software required to capture expertise, present it in an engaging way, and assess student interactions with the simulator. In this discussion, we describe a method of capturing more complete and accurate medical information for simulators and combine it with new instructional design strategies that emphasize the learning of complex knowledge. Finally, we describe three different types of software support (Development/Authoring, Run Time, and Post Run Time) required at different stages in the development of medical simulations and the instructional design elements of the software required at each stage. We describe the contributions expected of each kind of software and the different instructional control authoring support required. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.

  15. Upper Secondary and Vocational Level Teachers at Social Software

    ERIC Educational Resources Information Center

    Valtonen, Teemu; Kontkanen, Sini; Dillon, Patrick; Kukkonen, Jari; Väisänen, Pertti

    2014-01-01

    This study focuses on upper secondary and vocational level teachers as users of social software, i.e., what software they use during their leisure and work, and for what purposes they use software in teaching. The study is theorised within a technological pedagogical content knowledge framework; the emphasis is especially on technological knowledge…

  16. DRIFTER Web App Development Support

    NASA Technical Reports Server (NTRS)

    Davis, Derrick D.; Armstrong, Curtis D.

    2015-01-01

    During my 2015 internship at Stennis Space Center (SSC) I supported the development of a web based tool to enable user interaction with a low-cost environmental monitoring buoy called the DRIFTER. DRIFTERs are designed by SSC's Applied Science and Technology Projects branch and are used to measure parameters such as water temperature and salinity. Data collected by the buoys help verify measurements by NASA satellites, which contributes to NASA's mission to advance understanding of the Earth by developing technologies to improve the quality of life on or home planet. My main objective during this internship was to support the development of the DRIFTER by writing web-based software that allows the public to view and access data collected by the buoys. In addition, this software would enable DRIFTER owners to configure and control the devices.

  17. Open-Source Development of the Petascale Reactive Flow and Transport Code PFLOTRAN

    NASA Astrophysics Data System (ADS)

    Hammond, G. E.; Andre, B.; Bisht, G.; Johnson, T.; Karra, S.; Lichtner, P. C.; Mills, R. T.

    2013-12-01

    Open-source software development has become increasingly popular in recent years. Open-source encourages collaborative and transparent software development and promotes unlimited free redistribution of source code to the public. Open-source development is good for science as it reveals implementation details that are critical to scientific reproducibility, but generally excluded from journal publications. In addition, research funds that would have been spent on licensing fees can be redirected to code development that benefits more scientists. In 2006, the developers of PFLOTRAN open-sourced their code under the U.S. Department of Energy SciDAC-II program. Since that time, the code has gained popularity among code developers and users from around the world seeking to employ PFLOTRAN to simulate thermal, hydraulic, mechanical and biogeochemical processes in the Earth's surface/subsurface environment. PFLOTRAN is a massively-parallel subsurface reactive multiphase flow and transport simulator designed from the ground up to run efficiently on computing platforms ranging from the laptop to leadership-class supercomputers, all from a single code base. The code employs domain decomposition for parallelism and is founded upon the well-established and open-source parallel PETSc and HDF5 frameworks. PFLOTRAN leverages modern Fortran (i.e. Fortran 2003-2008) in its extensible object-oriented design. The use of this progressive, yet domain-friendly programming language has greatly facilitated collaboration in the code's software development. Over the past year, PFLOTRAN's top-level data structures were refactored as Fortran classes (i.e. extendible derived types) to improve the flexibility of the code, ease the addition of new process models, and enable coupling to external simulators. For instance, PFLOTRAN has been coupled to the parallel electrical resistivity tomography code E4D to enable hydrogeophysical inversion while the same code base can be used as a third-party library to provide hydrologic flow, energy transport, and biogeochemical capability to the community land model, CLM, part of the open-source community earth system model (CESM) for climate. In this presentation, the advantages and disadvantages of open source software development in support of geoscience research at government laboratories, universities, and the private sector are discussed. Since the code is open-source (i.e. it's transparent and readily available to competitors), the PFLOTRAN team's development strategy within a competitive research environment is presented. Finally, the developers discuss their approach to object-oriented programming and the leveraging of modern Fortran in support of collaborative geoscience research as the Fortran standard evolves among compiler vendors.

  18. Software Reviews.

    ERIC Educational Resources Information Center

    Bitter, Gary G., Ed.

    1989-01-01

    Reviews three software packages: (1) "Physics," tutorial, grades 11-12, Macintosh; (2) "Hands On Math: Volume I," interactive math exploration/simulation of manipulatives use, grades K-7, Apple II; and (3) "A.I.: An Experience with Artificial Intelligence," simulation, grades 5-12, Apple II. (MVL)

  19. A Prototype Climate Information System

    DTIC Science & Technology

    1993-01-01

    accessing the fields used for matching. The relational database model, developed in the early 1970s by Edgar F. Codd, has become the most important... porting this software to TESS(3). The goal of this research is to develop a CIS to be u...meteorological observations every three hours around the clock. Lt. Matthew F. Maury, upon taking command of the depot in July of 1842, found that the resultant

  20. Pile Driving

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Machine-oriented structural engineering firm TERA, Inc. is engaged in a project to evaluate the reliability of offshore pile driving prediction methods and eventually to predict the best pile driving technique for each new offshore oil platform. In Phase I, pile driving records of 48 offshore platforms, including such information as blow counts, soil composition and pertinent construction details, were digitized. In Phase II, pile driving records were statistically compared with current methods of prediction. The result was the development of modular software, the CRIPS80 Software Design Analyzer System, which companies can use to evaluate other prediction procedures or other databases.

  1. The Planning and Scheduling of HST: Improvements and Enhancements since Launch

    NASA Astrophysics Data System (ADS)

    Taylor, D. K.; Chance, D. R.; Jordan, I. J. E.; Patterson, A. P.; Stanley, M.; Taylor, D. C.

    2001-12-01

    The planning and scheduling (P&S) systems used in operating the Hubble Space Telescope (HST) have undergone such substantial and pervasive re-engineering that today they only dimly resemble those used when HST was launched. Processes (i.e., software, procedures, networking, etc.) which allow program implementation, the generation of a Long Range Plan (LRP), and the scheduling of science and mission activities have improved drastically in nearly 12 years, resulting in a consistently high observing efficiency, a stable LRP that principal investigators can use, exceptionally clean command loads uplinked to the spacecraft, and the capability of very fast response to onboard anomalies or targets of opportunity. In this presentation we describe many of the systems which comprise the P&S ("front-end") system for HST, how and why they were improved, and what benefits have been realized by either the HST user community or the STScI staff. The systems include the Guide Star System, the Remote Proposal Submission System - 2 (RPS2), Artificial Intelligence (AI) planning tools such as Spike, and the science and mission scheduling software. We also describe how using modern software languages such as Python and better development practices allow STScI staff to do more with HST (e.g., to handle much more science data when ACS is installed) without increasing the cost of HST operations.

  2. Design of a software for calculating isoelectric point of a polypeptide according to their net charge using the graphical programming language LabVIEW.

    PubMed

    Tovar, Glomen

    2018-01-01

    A software application to calculate the net charge and predict the isoelectric point (pI) of a polypeptide was developed in this work using the graphical programming language LabVIEW. With this instrument, the net charges of the ionizable residues of the polypeptide chains of proteins are calculated at different pH values and tabulated, the pI is predicted, and an Excel (.xls) file is generated. The experimental pI values of different proteins are compared with the graphically calculated pI values, achieving a correlation coefficient (R) of 0.934746, which represents good reliability at p < 0.01. The generated program can thus serve as an instrument applicable in the laboratory, facilitating the calculation for graduate students and junior researchers. © 2017 The International Union of Biochemistry and Molecular Biology, 46(1):39-46, 2018.
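
    The record does not include the LabVIEW diagram, but the underlying computation is compact enough to sketch in Python. The sketch below assumes illustrative textbook pKa values (published scales differ) and uses bisection on the monotonically decreasing net-charge curve; it is not the paper's code.

    ```python
    # Net charge of a polypeptide vs. pH (Henderson-Hasselbalch) and pI by bisection.
    # pKa values are illustrative textbook numbers, not the paper's parameters.
    PKA_POS = {'Nterm': 9.0, 'K': 10.5, 'R': 12.5, 'H': 6.0}            # basic groups
    PKA_NEG = {'Cterm': 2.0, 'D': 3.9, 'E': 4.1, 'C': 8.3, 'Y': 10.1}   # acidic groups

    def net_charge(seq, pH):
        """Sum the fractional charges of the termini and ionizable side chains."""
        q = 0.0
        for g in ['Nterm', 'Cterm'] + list(seq):
            if g in PKA_POS:
                q += 1.0 / (1.0 + 10 ** (pH - PKA_POS[g]))
            elif g in PKA_NEG:
                q -= 1.0 / (1.0 + 10 ** (PKA_NEG[g] - pH))
        return q

    def isoelectric_point(seq, lo=0.0, hi=14.0, tol=1e-4):
        """Bisect for the pH where net charge crosses zero (charge is monotonic)."""
        while hi - lo > tol:
            mid = (lo + hi) / 2
            if net_charge(seq, mid) > 0:
                lo = mid
            else:
                hi = mid
        return round((lo + hi) / 2, 2)

    print(isoelectric_point("ACDEFGHIKLMNPQRSTVWY"))
    ```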

  3. Reliability Analysis and Optimal Release Problem Considering Maintenance Time of Software Components for an Embedded OSS Porting Phase

    NASA Astrophysics Data System (ADS)

    Tamura, Yoshinobu; Yamada, Shigeru

    OSS (open source software) systems, which serve as key components of critical infrastructure in our social lives, are still ever-expanding. In particular, embedded OSS systems have been gaining a lot of attention in the embedded system area, e.g., Android, BusyBox, TRON, etc. However, poor handling of quality problems and customer support hinders the progress of embedded OSS. Also, it is difficult for developers to assess the reliability and portability of embedded OSS on a single-board computer. In this paper, we propose a method of software reliability assessment based on flexible hazard rates for embedded OSS. We also analyze actual data of software failure-occurrence time-intervals to show numerical examples of software reliability assessment for embedded OSS. Moreover, we compare the proposed hazard rate model for embedded OSS with typical conventional hazard rate models by using comparison criteria of goodness-of-fit. Furthermore, we discuss the optimal software release problem for the porting phase based on the total expected software maintenance cost.
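
    The record does not reproduce the proposed flexible hazard-rate model, but a representative example of the "typical conventional hazard rate models" used in such comparisons is the classical Jelinski-Moranda form, in which the hazard rate drops by a fixed amount each time a fault is found and removed:

    ```latex
    z_k = \phi \left[ N - (k - 1) \right], \qquad k = 1, 2, \dots, N,
    ```

    where $N$ is the initial number of faults and $\phi$ is the per-fault hazard contribution. Whether this particular form is among the paper's baselines is an assumption.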

  4. Better software, better research: the challenge of preserving your research and your reputation

    NASA Astrophysics Data System (ADS)

    Chue Hong, N.

    2017-12-01

    Software is fundamental to research. From short, thrown-together temporary scripts, through an abundance of complex spreadsheets analysing collected data, to the hundreds of software engineers and millions of lines of code behind international efforts such as the Large Hadron Collider and the Square Kilometre Array, software has made an invaluable contribution to advancing our research knowledge. Within the earth and space sciences, data is being generated, collected, processed and analysed in ever greater amounts and detail. However, the pace of this improvement leads to challenges around the persistence of research outputs and artefacts. A specific challenge in this field is that often experiments and measurements cannot be repeated, yet the infrastructure used to manage, store and process this data must be continually updated and developed: constant change just to stay still. The UK-based Software Sustainability Institute (SSI) aims to improve research software sustainability, working with researchers, funders, research software engineers, managers, and other stakeholders across the research spectrum. In this talk, I will present lessons learned and good practice based on the work of the Institute and its collaborators. I will summarise some of the work that is being done to improve the integration of infrastructure for managing research outputs, including around software citation and reward, extending data management plans, and improving researcher skills: "better software, better research". Ultimately, being a modern researcher in the geosciences requires you to efficiently balance the pursuit of new knowledge with making your work reusable and reproducible. And as scientists are placed under greater scrutiny about whether others can trust their results, the preservation of your artefacts has a key role in the preservation of your reputation.

  5. 48 CFR 52.227-14 - Rights in Data-General.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... software. Computer software—(1) Means (i) Computer programs that comprise a series of instructions, rules... or computer software documentation. Computer software documentation means owner's manuals, user's... medium, that explain the capabilities of the computer software or provide instructions for using the...

  6. Prowess - A Software Model for the Ooty Wide Field Array

    NASA Astrophysics Data System (ADS)

    Marthi, Visweshwar Ram

    2017-03-01

    One of the scientific objectives of the Ooty Wide Field Array (OWFA) is to observe the redshifted H i emission from z ~ 3.35. Although predictions spell out optimistic outcomes in reasonable integration times, these studies were based purely on analytical assumptions, without accounting for limiting systematics. A software model for OWFA has therefore been developed with a view to understanding the instrument-induced systematics. This model has been implemented through a suite of programs, together called Prowess, conceived in the dual role of an emulator and observatory data analysis software. The programming philosophy followed in building Prowess enables a general user to define their own set of functions and add new functionality. This paper describes a coordinate system suitable for OWFA in which the baselines are defined. The foregrounds are simulated from their angular power spectra, and the visibilities are then computed from them. These visibilities are then used for further processing, such as calibration and power spectrum estimation. The package allows for rich visualization features in multiple output formats in an interactive fashion, giving the user an intuitive feel for the data. Prowess has been extensively used for numerical predictions of the foregrounds for the OWFA H i experiment.
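
    For context, the visibility computation the abstract refers to is, in the standard flat-sky form (with $A$ the primary beam, $I$ the sky intensity and $(u, v)$ the baseline in wavelengths; the exact conventions Prowess adopts may differ):

    ```latex
    V(u, v) = \iint A(l, m)\, I(l, m)\, e^{-2\pi i (u l + v m)}\, \mathrm{d}l\, \mathrm{d}m .
    ```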

  7. HERMES travels by CAN bus

    NASA Astrophysics Data System (ADS)

    Waller, Lewis G.; Shortridge, Keith; Farrell, Tony J.; Vuong, Minh; Muller, Rolf; Sheinis, Andrew I.

    2014-07-01

    The new HERMES spectrograph represents the first foray by AAO into the use of commercial off-the-shelf industrial field bus technology for instrument control, and we regard the final system, with its relatively simple wiring requirements, as a great success. However, both software and hardware teams had to work together to solve a number of problems integrating the chosen CANopen/CAN bus system into our normal observing systems. A Linux system running in an industrial PC chassis ran the HERMES control software, using a PCI CAN bus interface connected to a number of distributed CANopen/CAN bus I/O devices and servo amplifiers. In the main, the servo amplifiers performed impressively, although some experimentation with homing algorithms was required, and we hit a significant hurdle when we discovered that we needed to disable some of the encoders used during observations; we learned a lot about how servo amplifiers respond when their encoders are turned off, and about how encoders react to losing power. The software was based around a commercial CANopen library from Copley Controls. Early worries about how this heavily multithreaded library would work with our standard data acquisition system led to the development of a very low-level CANopen software simulator to verify the design. This also enabled the software group to develop and test almost all the control software well in advance of the construction of the hardware. In the end, the instrument went from initial installation at the telescope to successful commissioning remarkably smoothly.
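
    The CANopen traffic described here (SDO-style configuration of servo amplifiers over a CAN interface) can be illustrated generically. The sketch below uses the open-source python-can package rather than the commercial Copley library the project actually used; the node ID, object index (0x6081, profile velocity in the CiA 402 drive profile) and channel settings are illustrative assumptions.

    ```python
    # Generic CANopen expedited SDO write using python-can (illustrative only;
    # the HERMES software used a commercial Copley Controls CANopen library).
    import struct
    import can

    def sdo_write_u32(bus, node_id, index, subindex, value):
        """Expedited SDO download of a 32-bit value to object index:subindex."""
        # 0x23 = command specifier for a 4-byte expedited download;
        # index and data are little-endian per the CANopen specification.
        payload = struct.pack('<BHBI', 0x23, index, subindex, value)
        msg = can.Message(arbitration_id=0x600 + node_id,   # SDO receive COB-ID
                          data=payload, is_extended_id=False)
        bus.send(msg)
        resp = bus.recv(timeout=1.0)   # a real driver would filter for 0x580 + node_id
        return resp is not None and resp.data[0] == 0x60    # 0x60 = write acknowledged

    bus = can.Bus(interface='socketcan', channel='can0')
    ok = sdo_write_u32(bus, node_id=3, index=0x6081, subindex=0, value=50000)
    ```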

  8. Automated cell tracking and analysis in phase-contrast videos (iTrack4U): development of Java software based on combined mean-shift processes.

    PubMed

    Cordelières, Fabrice P; Petit, Valérie; Kumasaka, Mayuko; Debeir, Olivier; Letort, Véronique; Gallagher, Stuart J; Larue, Lionel

    2013-01-01

    Cell migration is a key biological process with a role in both physiological and pathological conditions. Locomotion of cells during embryonic development is essential for their correct positioning in the organism; immune cells have to migrate and circulate in response to injury. Failure of cells to migrate, or an inappropriate acquisition of migratory capacities, can result in severe defects such as altered pigmentation, skull and limb abnormalities during development, and defective wound repair, immunosuppression or tumor dissemination. The ability to accurately analyze and quantify cell migration is important for our understanding of development, homeostasis and disease. In vitro cell tracking experiments, using primary or established cell cultures, are often used to study migration, as cells can quickly and easily be genetically or chemically manipulated. Images of the cells are acquired at regular time intervals over several hours using microscopes equipped with a CCD camera. The locations (x,y,t) of each cell on the recorded sequence of frames then need to be tracked. Manual computer-assisted tracking is the traditional method for analyzing the migratory behavior of cells, but this processing is extremely tedious and time-consuming, and most existing tracking algorithms require experience in programming languages that are unfamiliar to most biologists. We therefore developed an automated cell tracking program, written in Java, which uses a mean-shift algorithm and ImageJ as a library. iTrack4U is user-friendly software. Compared to manual tracking, it saves a considerable amount of time in generating and analyzing the variables characterizing cell migration, since they are automatically computed with iTrack4U. Another major advantage of iTrack4U is standardization and the absence of inter-experimenter differences. Finally, iTrack4U is adapted for phase contrast and fluorescent cells.
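
    As an illustration of the tracking principle (not iTrack4U's Java implementation), a minimal mean-shift step can be written in a few lines: a window repeatedly moves to the intensity-weighted centroid of the pixels it covers until the shift is negligible. All parameter values here are arbitrary assumptions.

    ```python
    # Toy mean-shift step for tracking a bright cell in a grayscale frame.
    import numpy as np

    def mean_shift(frame, center, half_win=15, eps=0.5, max_iter=20):
        """Iteratively move a window to the intensity-weighted centroid."""
        cy, cx = center
        for _ in range(max_iter):
            y0 = max(int(cy) - half_win, 0); y1 = min(int(cy) + half_win + 1, frame.shape[0])
            x0 = max(int(cx) - half_win, 0); x1 = min(int(cx) + half_win + 1, frame.shape[1])
            patch = frame[y0:y1, x0:x1].astype(float)
            ys, xs = np.mgrid[y0:y1, x0:x1]          # pixel coordinates of the window
            w = patch.sum()
            if w == 0:
                break
            ny, nx = (ys * patch).sum() / w, (xs * patch).sum() / w
            if np.hypot(ny - cy, nx - cx) < eps:     # converged
                break
            cy, cx = ny, nx
        return cy, cx

    # Example: locate a synthetic bright blob starting from a rough guess.
    frame = np.zeros((100, 100))
    frame[40:50, 60:70] = 1.0
    print(mean_shift(frame, center=(42.0, 58.0)))    # converges near (44.5, 64.5)
    ```

    Tracking across a movie then amounts to feeding each frame's converged center in as the starting center for the next frame.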

  9. 15 CFR 734.3 - Items subject to the EAR.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., and foreign-made technology that is commingled with controlled U.S.-origin technology: (i) In any....S. origin technology or software, as described in § 736.2(b)(3) of the EAR. The term “direct product... technology or software; and Note to paragraph (a)(4): Certain foreign-manufactured items developed or...

  10. 15 CFR 734.3 - Items subject to the EAR.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ..., and foreign-made technology that is commingled with controlled U.S.-origin technology: (i) In any....S. origin technology or software, as described in § 736.2(b)(3) of the EAR. The term “direct product... technology or software; and Note to paragraph (a)(4): Certain foreign-manufactured items developed or...

  11. 15 CFR 734.3 - Items subject to the EAR.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., and foreign-made technology that is commingled with controlled U.S.-origin technology: (i) In any....S. origin technology or software, as described in § 736.2(b)(3) of the EAR. The term “direct product... technology or software; and Note to paragraph (a)(4): Certain foreign-manufactured items developed or...

  12. 15 CFR 734.3 - Items subject to the EAR.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., and foreign-made technology that is commingled with controlled U.S.-origin technology: (i) In any....S. origin technology or software, as described in § 736.2(b)(3) of the EAR. The term “direct product... technology or software; and Note to paragraph (a)(4): Certain foreign-manufactured items developed or...

  13. 15 CFR 734.3 - Items subject to the EAR.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., and foreign-made technology that is commingled with controlled U.S.-origin technology: (i) In any....S. origin technology or software, as described in § 736.2(b)(3) of the EAR. The term “direct product... technology or software; and Note to paragraph (a)(4): Certain foreign-manufactured items developed or...

  14. The Interplay Between Estrogen and Replication Origins in Breast Cancer DNA Amplification

    DTIC Science & Technology

    2014-11-01

    using the CHEF Genomic DNA Plug Kit (Biorad), following the manufacturer’s instructions. Briefly, after the labeling, cells were washed twice with...with BD-CellQuest software. What opportunities for training and professional development has the project provided? During the funded project period I

  15. Developments in Science and Technology.

    DTIC Science & Technology

    1981-01-01

    order to meet API's requirements for image processing, large database transfers, advanced graphic processing, and sharing. The use of DECnet software... Description: a moored plant at an island site, with the electricity supplied by undersea cable to a shore utility grid. Because the primary objective was

  16. Microcomputer spacecraft thermal analysis routines (MSTAR) Phase I: The user interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teti, N.M.

    1993-12-31

    The Microcomputer Spacecraft Thermal Analysis Routines (MSTAR) software package is being developed for NASA/Goddard Space Flight Center by Swales and Associates, Inc. (S&AI). In December 1992, S&AI was awarded a Phase I Small Business Innovative Research contract from NASA to develop a microcomputer-based thermal analysis program to replace the current SSPTA and TRASYS programs. Phase I consists of a six-month effort focused on developing geometric model generation and visualization capabilities using a graphical user interface (GUI). The information contained in this paper encompasses the work performed during the Phase I development cycle, with emphasis on the development of the graphical user interface (GUI). This includes both the theory behind and specific examples of how the MSTAR GUI was implemented. Furthermore, this report discusses new applications and enhancements which will improve the capabilities and commercialization of the MSTAR program.

  17. Microcomputer spacecraft thermal analysis routines (MSTAR) Phase I: The user interface

    NASA Technical Reports Server (NTRS)

    Teti, Nicholas M.

    1993-01-01

    The Microcomputer Spacecraft Thermal Analysis Routines (MSTAR) software package is being developed for NASA/Goddard Space Flight Center by Swales and Associates, Inc. (S&AI). In December 1992, S&AI was awarded a Phase I Small Business Innovative Research contract from NASA to develop a microcomputer-based thermal analysis program to replace the current SSPTA and TRASYS programs. Phase I consists of a six-month effort focused on developing geometric model generation and visualization capabilities using a graphical user interface (GUI). The information contained in this paper encompasses the work performed during the Phase I development cycle, with emphasis on the development of the graphical user interface (GUI). This includes both the theory behind and specific examples of how the MSTAR GUI was implemented. Furthermore, this report discusses new applications and enhancements which will improve the capabilities and commercialization of the MSTAR program.

  18. Bridging the Particle Physics and Big Data Worlds

    NASA Astrophysics Data System (ADS)

    Pivarski, James

    2017-09-01

    For decades, particle physicists have developed custom software because the scale and complexity of our problems were unique. In recent years, however, the "big data" industry has begun to tackle similar problems, and has developed some novel solutions. Incorporating scientific Python libraries, Spark, TensorFlow, and machine learning tools into the physics software stack can improve abstraction, reliability, and in some cases performance. Perhaps more importantly, it can free physicists to concentrate on domain-specific problems. Building bridges isn't always easy, however. Physics software and open-source software from industry differ in many incidental ways and a few fundamental ways. I will show work from the DIANA-HEP project to streamline data flow from ROOT to Numpy and Spark, to incorporate ideas of functional programming into histogram aggregation, and to develop real-time, query-style manipulations of particle data.
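
    As a concrete taste of the ROOT-to-Numpy data flow described here, the sketch below uses the open-source uproot library associated with the DIANA-HEP effort; the file name and branch names are hypothetical.

    ```python
    # Read ROOT tree branches directly into Numpy arrays with uproot.
    import uproot

    with uproot.open("events.root") as file:          # hypothetical file
        tree = file["Events"]                          # hypothetical tree name
        arrays = tree.arrays(["muon_pt", "muon_eta"], library="np")

    print(arrays["muon_pt"][:10])   # plain numpy.ndarray, ready for scipy/sklearn
    ```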

  19. Navigating freely-available software tools for metabolomics analysis.

    PubMed

    Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph

    2017-01-01

    The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas, as well as in regards to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. Our objective was to compile a comprehensive list of the most widely used freely available software and tools that are used primarily in metabolomics. The most widely used tools were selected for inclusion in the review by either ≥ 50 citations on Web of Science (as of 08/09/16) or the use of the tool being reported in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) they are designed for. A comprehensive list of the most used tools was compiled, and each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available at https://github.com/RASpicer/MetabolomicsTools, which is classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised based on their main functionality. As future work, we suggest a direct comparison of the tools' abilities to perform specific data analysis tasks, e.g. peak picking.

  20. FMT (Flight Software Memory Tracker) For Cassini Spacecraft-Software Engineering Using JAVA

    NASA Technical Reports Server (NTRS)

    Kan, Edwin P.; Uffelman, Hal; Wax, Allan H.

    1997-01-01

    The software engineering design of the Flight Software Memory Tracker (FMT) Tool is discussed in this paper. FMT is a ground analysis software set, consisting of utilities and procedures, designed to track the flight software, i.e., images of memory load and updatable parameters of the computers on board the Cassini spacecraft. FMT is implemented in Java.

  1. Framework for End-User Programming of Cross-Smart Space Applications

    PubMed Central

    Palviainen, Marko; Kuusijärvi, Jarkko; Ovaska, Eila

    2012-01-01

    Cross-smart space applications are specific types of software services that enable users to share information, and to monitor and control the physical and logical surroundings in a way that is meaningful for the user's situation. For developing cross-smart space applications, this paper makes two main contributions: it introduces (i) a component design and scripting method for end-user programming of cross-smart space applications and (ii) a backend framework of components that interwork to carry out the bulk of the RDFScript translation, and the use and execution of ontology models. Before end-user programming activities, software professionals must develop easy-to-apply Driver components for the APIs of existing software systems. Thereafter, end-users are able to create applications from the commands of the Driver components with the help of the provided toolset. The paper also introduces the reference implementation of the framework, tools for Driver component development and end-user programming of cross-smart space applications, and the first evaluation results on their application. PMID:23202169
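
    A minimal sketch of the Driver-component idea follows, with entirely hypothetical names (the paper's actual framework API is not given in the record): software professionals register named commands that wrap an existing API, and end-user tooling discovers and invokes them.

    ```python
    # Hedged sketch: a Driver exposes an existing API as named, discoverable commands.
    class Driver:
        def __init__(self):
            self._commands = {}

        def command(self, name):
            """Decorator used by software professionals to register a command."""
            def register(fn):
                self._commands[name] = fn
                return fn
            return register

        def list_commands(self):
            return sorted(self._commands)       # what end-user tools would display

        def invoke(self, name, *args):
            return self._commands[name](*args)  # what end-user scripts would call

    lamp = Driver()

    @lamp.command("switch_on")
    def switch_on():
        print("lamp on")        # a real Driver would call the smart-space API here

    # An end-user script composed from discovered commands:
    print(lamp.list_commands())  # -> ['switch_on']
    lamp.invoke("switch_on")
    ```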

  2. A Software Planning and Development Methodology with Resource Allocation Capability

    DTIC Science & Technology

    1986-01-01

    ACKNOWLEDGEMENTS: There are many people who must be acknowledged for the support they provided during my graduate program at Texas A&M ... Dr. Lee ... acquisition, research/development, and operations/maintenance sources. The concept of a resource ... James, Unpublished ICAM Industry Days address, New Orleans, Louisiana, May 1982. ... Ledbetter, William N., et al., "Education

  3. Universal programming interface with concurrent access

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alferov, Oleg

    2004-10-07

    A number of devices operate by positioning, such as mechanical linear stages, temperature controllers, or filter wheels with discrete states, and most of them have different programming interfaces. The Universal Positioner software handles all of them with a single approach, whereby a particular hardware driver is created from a template by translating the actual commands used by the hardware to and from the universal programming interface. The software contains the universal API module itself, a demo simulation of hardware, and front-end programs to help developers write their own software drivers, along with example drivers for actual hardware controllers. The software allows user application programs to call devices simultaneously without race conditions (multitasking and concurrent access). The template suggested in this package permits developers to integrate various devices easily into their applications using the same API. The drivers can be stacked; i.e., they can call each other via the same interface.
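
    A minimal sketch of the template idea in Python, assuming hypothetical names (the package's actual interface is not shown in the record): a generic API with per-device command translation and a lock to serialize concurrent calls.

    ```python
    # Driver template: one universal API, per-device translation, race-free access.
    import threading

    class PositionerDriver:
        """Template: subclasses translate the generic API to hardware commands."""
        def __init__(self):
            self._lock = threading.Lock()     # prevents races between callers

        def move_to(self, position):
            with self._lock:                  # concurrent calls are serialized
                self._write(self._translate(position))

        def _translate(self, position):
            raise NotImplementedError

        def _write(self, cmd):
            raise NotImplementedError

    class FilterWheel(PositionerDriver):
        def _translate(self, position):
            return f"POS {int(position)}"     # discrete slots

        def _write(self, cmd):
            print(f"-> wheel: {cmd}")         # a real driver would talk to hardware

    FilterWheel().move_to(3)
    ```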

  4. TaxI: a software tool for DNA barcoding using distance methods

    PubMed Central

    Steinke, Dirk; Vences, Miguel; Salzburger, Walter; Meyer, Axel

    2005-01-01

    DNA barcoding is a promising approach to the diagnosis of biological diversity in which DNA sequences serve as the primary key for information retrieval. Most existing software for evolutionary analysis of DNA sequences was designed for phylogenetic analyses and, hence, those algorithms do not offer appropriate solutions for the rapid but precise analyses needed for DNA barcoding, and are also unable to process the often large comparative datasets. We developed a flexible software tool for DNA taxonomy, named TaxI. This program calculates sequence divergences between a query sequence (the taxon to be barcoded) and each sequence of a dataset of reference sequences defined by the user. Because the analysis is based on separate pairwise alignments, this software is also able to work with sequences characterized by multiple insertions and deletions that are difficult to align in large sequence sets (i.e. thousands of sequences) by multiple alignment algorithms because of computational restrictions. Here, we demonstrate the utility of this approach with two datasets of fish larvae and juveniles from Lake Constance and juvenile land snails under different models of sequence evolution. Sets of ribosomal 16S rRNA sequences, characterized by multiple indels, performed as well as or better than cox1 sequence sets in assigning sequences to species, demonstrating the suitability of rRNA genes for DNA barcoding. PMID:16214755
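
    A minimal sketch of the distance computation at the heart of such a tool follows, assuming pre-aligned pairs for brevity (TaxI itself computes a separate pairwise alignment for each comparison) and using invented sequences; it is not TaxI's code.

    ```python
    # Uncorrected p-distance between a query and each pre-aligned reference.
    def p_distance(a, b):
        """Fraction of differing sites, ignoring gap positions."""
        pairs = [(x, y) for x, y in zip(a, b) if x != '-' and y != '-']
        return sum(x != y for x, y in pairs) / len(pairs)

    def best_match(query, references):
        """Assign the query to the reference taxon with the smallest divergence."""
        return min(references, key=lambda name: p_distance(query, references[name]))

    refs = {"Species_A": "ACGTACGTAC", "Species_B": "ACGAACGATC"}
    print(best_match("ACGTACGTTC", refs))   # -> Species_A (distance 0.1 vs 0.2)
    ```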

  5. Supporting Interdisciplinary Collaboration Through Reusable Free Software. A Research Student Experience

    NASA Astrophysics Data System (ADS)

    Dimech, C.

    2013-12-01

    In this contribution, I present a critical evaluation of my experience as a research student conducting an interdisciplinary project that bridges the world of geoscience with that of astronomy. The major challenge consists in studying and modifying existing geophysical software to work with both synthetic solar data (not obtained by direct measurement but useful for testing and evaluation) and data released from the HINODE satellite and the Solar Dynamics Observatory. I have been fortunate to collaborate closely with multiple geoscientists keen to share their software codes and help me understand their implementations so I can extend the methodology to solve problems in solar physics. Moreover, two additional experiences have helped me develop my research and collaborative skills: the first was an opportunity to involve an undergraduate student, and the second was my participation in the GNU Hackers Meeting in Paris. Three aspects need particular attention to enhance the collective productivity of any group of individuals keen to extend existing codes to achieve further interdisciplinary goals: (1) the production of easily reusable code that users can study and modify even when large sets of computations are involved; (2) the transformation of solutions into tools that are 100% free software; and (3) the harmonisation of collaborative interactions that effectively tackle the two aforementioned tasks. Each one will be discussed in detail during this session based on my experience as a research student.

  6. A new free and open source tool for space plasma modeling.

    NASA Astrophysics Data System (ADS)

    Honkonen, I. J.

    2014-12-01

    I will present a new distributed-memory parallel, free and open source computational model for studying space plasma. The model is written in C++ with emphasis on good software development practices and code readability, without sacrificing serial or parallel performance. As such, the model could be especially useful for education, for learning both (magneto)hydrodynamics (MHD) and computational model development. By using the latest features of the C++ standard (2011), it has been possible to develop a very modular program, which improves not only the readability of the code but also the testability of the model, and decreases the effort required to make changes to various parts of the program. Major parts of the model, i.e. functionality not directly related to (M)HD, have been outsourced to other freely available libraries, which has reduced the development time of the model significantly. I will present an overview of the code architecture as well as details of different parts of the model, and will show examples of using the model, including preparing input files and plotting results. A multitude of 1-, 2- and 3-dimensional test cases are included in the software distribution, and the results of, for example, Kelvin-Helmholtz, bow shock, blast wave and reconnection tests will be presented.

  7. Near-Infrared Neuroimaging with NinPy

    PubMed Central

    Strangman, Gary E.; Zhang, Quan; Zeffiro, Thomas

    2009-01-01

    There has been substantial recent growth in the use of non-invasive optical brain imaging in studies of human brain function in health and disease. Near-infrared neuroimaging (NIN) is one of the most promising of these techniques and, although NIN hardware continues to evolve at a rapid pace, software tools supporting optical data acquisition, image processing, statistical modeling, and visualization remain less refined. Python, a modular and computationally efficient development language, can support functional neuroimaging studies of diverse design and implementation. In particular, Python's easily readable syntax and modular architecture allow swift prototyping followed by efficient transition to stable production systems. As an introduction to our ongoing efforts to develop Python software tools for structural and functional neuroimaging, we discuss: (i) the role of non-invasive diffuse optical imaging in measuring brain function, (ii) the key computational requirements to support NIN experiments, (iii) our collection of software tools to support NIN, called NinPy, and (iv) future extensions of these tools that will allow integration of optical with other structural and functional neuroimaging data sources. Source code for the software discussed here will be made available at www.nmr.mgh.harvard.edu/Neural_SystemsGroup/software.html. PMID:19543449
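
    The record does not spell out the core computation, but NIN analysis pipelines of this kind conventionally rest on the modified Beer-Lambert law, relating the measured change in optical density at wavelength $\lambda$ to chromophore concentration changes (stating it here is an assumption about NinPy's internals, not a quotation from the paper):

    ```latex
    \Delta OD_{\lambda} = \left( \varepsilon_{\mathrm{HbO},\lambda}\, \Delta[\mathrm{HbO}]
                        + \varepsilon_{\mathrm{HbR},\lambda}\, \Delta[\mathrm{HbR}] \right)
                          d \,\mathrm{DPF}_{\lambda}
    ```

    where $d$ is the source-detector separation and $\mathrm{DPF}_{\lambda}$ the differential pathlength factor; measuring at two wavelengths yields two equations that can be solved for the two concentration changes.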

  8. Analysis of a Floodplain I-Wall Embedded in Horizontally Stratified Soil Layers During Flood Events Using Corps I-Wall Software Version 1.0

    DTIC Science & Technology

    2016-07-01

    and gap propagation engineering methodology implemented within the software (CI-Wall) makes use of a hydraulic fracturing criterion, as discussed in...moist unit weight). Soil unit weights: Because of the presence of the upper moist (i.e., non-saturated) region R01 clay layer that is immediately...from two series of complete soil-structure interaction (SSI) non-linear finite element studies for I-Walls at New Orleans and other locations

  9. Games as an Artistic Medium: Investigating Complexity Thinking in Game-Based Art Pedagogy

    ERIC Educational Resources Information Center

    Patton, Ryan M.

    2013-01-01

    This action research study examines the making of video games, using an integrated development environment software program called GameMaker, as art education curriculum for students between the ages of 8-13. Through a method I designed, students created video games using the concepts of move, avoid, release, and contact (MARC) to explore their…

  10. Predicting Software Suitability Using a Bayesian Belief Network

    NASA Technical Reports Server (NTRS)

    Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.

    2005-01-01

    The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.
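
    The record names the driving factors but not the network's parameters, so the following is a minimal hand-rolled sketch of the idea with invented conditional probabilities: exact inference by enumeration over a three-parent network predicting product suitability. It is not the paper's model.

    ```python
    # Toy Bayesian-network calculation: P(Suitability) from three parent factors.
    # All probability numbers below are invented for illustration.
    P_skill    = {'high': 0.6, 'low': 0.4}   # development team skill
    P_maturity = {'high': 0.5, 'low': 0.5}   # software process maturity
    P_complex  = {'high': 0.3, 'low': 0.7}   # software problem complexity

    def p_suitable(skill, maturity, cplx):
        """Illustrative conditional probability table, encoded as adjustments."""
        base = 0.9 if skill == 'high' else 0.6
        base -= 0.2 if maturity == 'low' else 0.0
        base -= 0.3 if cplx == 'high' else 0.0
        return max(base, 0.05)

    # Exact inference by enumeration (feasible because the network is tiny):
    # marginalize the suitability node over all parent configurations.
    total = sum(P_skill[s] * P_maturity[m] * P_complex[c] * p_suitable(s, m, c)
                for s in P_skill for m in P_maturity for c in P_complex)
    print(f"P(product suitable) = {total:.3f}")
    ```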

  11. Computer modeling with randomized-controlled trial data informs the development of person-centered aged care homes.

    PubMed

    Chenoweth, Lynn; Vickland, Victor; Stein-Parbury, Jane; Jeon, Yun-Hee; Kenny, Patricia; Brodaty, Henry

    2015-10-01

    The aim was to answer questions on the essential components (services, operations and resources) of a person-centered aged care home (iHome) using computer simulation. iHome was developed with AnyLogic software using extant study data obtained from 60 Australian aged care homes, 900+ clients and 700+ aged care staff. Bayesian analysis of simulated trial data will determine the influence of different iHome characteristics on care service quality and client outcomes. Interim results indicate that a person-centered aged care home (socio-cultural context) and care/lifestyle services (interactional environment) can produce positive outcomes for aged care clients (subjective experiences) in the simulated environment. Further testing will define the essential characteristics of a person-centered care home.

  12. Development and Integration of Control System Models

    NASA Technical Reports Server (NTRS)

    Kim, Young K.

    1998-01-01

    The computer simulation tool TREETOPS has been upgraded and used at NASA/MSFC to model various complicated mechanical systems and to perform dynamics and control analysis of their pointing control systems. A TREETOPS model of the Advanced X-ray Astrophysics Facility - Imaging (AXAF-I) dynamics and control system was developed to evaluate the AXAF-I pointing performance in Normal Pointing Mode. An optical model of the Shooting Star Experiment (SSE) was also developed, and its optical performance analysis was done using the MACOS software.

  13. Training in software used by practising engineers should be included in university curricula

    NASA Astrophysics Data System (ADS)

    Silveira, A.; Perdigones, A.; García, J. L.

    2009-04-01

    Ideally, an engineering education should prepare students, i.e., emerging engineers, to use problem-solving processes that synergistically combine creativity and imagination with rigour and discipline. Recently, pressures on curricula have resulted in the development of software-specific courses, often to the detriment of the understanding of theory [1]. However, it is also true that there is a demand for information technology courses by students other than computer science majors [2]. The emphasis in training engineers may be best placed on answering the needs of industry; indeed, many proposals are now being made to try to reduce the gap between the educational and industrial communities [3]. Training in the use of certain computer programs may be one way of better preparing engineering undergraduates for eventual employment in industry. However, industry's needs in this respect must first be known. The aim of this work was to determine which computer programs are used by practising agricultural engineers, with a view to incorporating training in their use into our department's teaching curriculum. The results showed that 72% of respondents' working hours involved the use of computer programs. The software packages most commonly used were Microsoft Office (used by 79% of respondents) and CAD (56%), as well as budgeting (27%), statistical (21%), engineering (15%) and GIS (13%) programs. As a result of this survey, our university department opened an additional computer suite in order to give students practical experience in the use of Microsoft Excel, budgeting and engineering software. The results of this survey underline the importance of computer software training in this and perhaps other fields of engineering. [1] D. J. Moore and D. R. Voltmer, "Curriculum for an engineering renaissance," IEEE Trans. Educ., vol. 46, pp. 452-455, Nov. 2003. [2] N. Kock, R. Aiken, and C. Sandas, "Using complex IT in specific domains: developing and assessing a course for nonmajors," IEEE Trans. Educ., vol. 45, pp. 50-56, Feb. 2002. [3] I. Vélez and J. F. Sevillano, "A course to train digital hardware designers for industry," IEEE Trans. Educ., vol. 50, pp. 236-243, Aug. 2007. Acknowledgement: This work was supported in part by the Universidad Politécnica de Madrid, Spain.

  14. MMX-I: data-processing software for multimodal X-ray imaging and tomography.

    PubMed

    Bergamaschi, Antoine; Medjoubi, Kadda; Messaoudi, Cédric; Marco, Sergio; Somogyi, Andrea

    2016-05-01

    A new multi-platform freeware package has been developed for the processing and reconstruction of scanning multi-technique X-ray imaging and tomography datasets. The software platform aims to treat different scanning imaging techniques: X-ray fluorescence, phase, absorption and dark field, and any of their combinations, thus providing an easy-to-use data processing tool for the X-ray imaging user community. A dedicated data input stream copes with the input and management of large datasets (several hundred GB) collected during a typical multi-technique fast scan at the Nanoscopium beamline, even on a standard PC. To the authors' knowledge, this is the first software tool that aims at treating all of the modalities of scanning multi-technique imaging and tomography experiments.

  15. HOMER: the Holographic Optical Microscope for Education and Research

    NASA Astrophysics Data System (ADS)

    Luviano, Anali

    Holography was invented in 1948 by Dennis Gabor and has undergone major advancements since the 2000s, leading to the development of commercial digital holographic microscopes (DHM). This noninvasive form of microscopy produces a three-dimensional (3-D) digital model of a sample without altering or destroying the sample, thus allowing the same sample to be studied multiple times. HOMER, the Holographic Optical Microscope for Education and Research, produces a 3-D image from a two-dimensional (2-D) interference pattern captured by a camera and then put through reconstruction software. This 2-D pattern is created when a reference wave interacts with the sample to produce a secondary wave that interferes with the unaltered part of the reference wave. I constructed HOMER to be an efficient, portable, in-line DHM using inexpensive materials and free reconstruction software. HOMER uses three different-colored LEDs as light sources. I am testing the performance of HOMER with the goal of producing tri-color images of samples. I am using small, basic biological samples to test the effectiveness of HOMER and plan to transition to complex cellular and biological specimens as I pursue my interest in biophysics. Norwich University.

  16. A Tour of Big Data, Open Source Data Management Technologies from the Apache Software Foundation

    NASA Astrophysics Data System (ADS)

    Mattmann, C. A.

    2012-12-01

    The Apache Software Foundation, a non-profit foundation charged with the dissemination of open source software for the public good, provides a suite of data management technologies for distributed archiving, data ingestion, data dissemination, processing, triage and a host of other functionalities that are becoming critical in the Big Data regime. Apache is the world's largest open source software organization, boasting over 3000 developers from around the world, all contributing to some of the most pervasive technologies in use today, from the HTTPD web server that powers a majority of Internet web sites to the Hadoop technology that is now projected to be over a $1B industry. Apache data management technologies are emerging as de facto off-the-shelf components for searching, distributing, processing and archiving key science data sets, from the geophysical, space and planetary domains all the way to biomedicine. In this talk, I will give a virtual tour of the Apache Software Foundation, its meritocracy and governance structure, and also its key big data technologies that organizations can take advantage of today to save cost, schedule, and resources in implementing their Big Data needs. I'll illustrate the Apache technologies in the context of several national priority projects, including the U.S. National Climate Assessment (NCA), and the International Square Kilometre Array (SKA) project, which are stretching the boundaries of volume, velocity, complexity, and other key Big Data dimensions.

  17. MO-DE-BRA-02: SIMAC: A Simulation Tool for Teaching Linear Accelerator Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlone, M; Harnett, N; Department of Radiation Oncology, University of Toronto, Toronto, Ontario

    Purpose: The first goal of this work is to develop software that can simulate the physics of linear accelerators (linacs). The second goal is to show that this simulation tool is effective in teaching linac physics to medical physicists and linac service engineers. Methods: Linacs were modeled using analytical expressions that can correctly describe the physical response of a linac to parameter changes in real time. These expressions were programmed with a graphical user interface in order to produce an environment similar to that of linac service mode. The software, "SIMAC", has been used as a learning aid in a professional development course 3 times (2014-2016) as well as in a physics graduate program. Exercises were developed to supplement the didactic components of the courses, consisting of activities designed to reinforce the concepts of beam loading; the effect of steering coil currents on beam symmetry; and the relationship between beam energy and flatness. Results: SIMAC was used to teach 35 professionals (medical physicists, regulators and service engineers; 1-week course) as well as 20 graduate students (1-month project). In the student evaluations, 85% of the students rated the effectiveness of SIMAC as very good or outstanding, and 70% rated the software as the most effective part of the courses. Exercise results were collected showing that 100% of the students were able to use the software correctly. In exercises involving gross changes to linac operating points (i.e. energy changes), the majority of students were able to correctly perform these beam adjustments. Conclusion: Simulation software (SIMAC) can be used to effectively teach linac physics. In short courses, students were able to correctly make gross parameter adjustments that typically require much longer training times using conventional training methods.

  18. GENERAL PURPOSE ADA PACKAGES

    NASA Technical Reports Server (NTRS)

    Klumpp, A. R.

    1994-01-01

    Ten families of subprograms are bundled together for the General-Purpose Ada Packages. The families bring to Ada many features from HAL/S, PL/I, FORTRAN, and other languages. These families are: string subprograms (INDEX, TRIM, LOAD, etc.); scalar subprograms (MAX, MIN, REM, etc.); array subprograms (MAX, MIN, PROD, SUM, GET, and PUT); numerical subprograms (EXP, CUBIC, etc.); service subprograms (DATE_TIME function, etc.); Linear Algebra II; Runge-Kutta integrators; and three text I/O families of packages. In two cases, a family consists of a single non-generic package. In all other cases, a family comprises a generic package and its instances for a selected group of scalar types. All generic packages are designed to be easily instantiated for the types declared in the user facility. The linear algebra package is LINRAG2. This package includes subprograms supplementing those in NPO-17985, An Ada Linear Algebra Package Modeled After HAL/S (LINRAG). Please note that LINRAG2 cannot be compiled without LINRAG. Most packages have widespread applicability, although some are oriented for avionics applications. All are designed to facilitate writing new software in Ada. Several of the packages use conventions introduced by other programming languages. A package of string subprograms is based on HAL/S (a language designed for the avionics software in the Space Shuttle) and PL/I. Packages of scalar and array subprograms are taken from HAL/S or generalized current Ada subprograms. A package of Runge-Kutta integrators is patterned after a built-in MAC (MIT Algebraic Compiler) integrator. Those packages modeled after HAL/S make it easy to translate existing HAL/S software to Ada. The General-Purpose Ada Packages program source code is available on two 360K 5.25" MS-DOS format diskettes. The software was developed using VAX Ada v1.5 under DEC VMS v4.5. It should be portable to any validated Ada compiler and it should execute either interactively or in batch. The largest package requires 205K of main memory on a DEC VAX running VMS. The software was developed in 1989, and is a copyrighted work with all copyright vested in NASA.

  19. Reducing Risk in DoD Software-Intensive Systems Development

    DTIC Science & Technology

    2016-03-01

    intensive systems development risk. This research addresses the use of the Technical Readiness Assessment (TRA) using the nine-level software Technology...The software TRLs are ineffective in reducing technical risk for the software component development. • Without the software TRLs, there is no...effective method to perform software TRA or reduce the technical development risk. The software component will behave as a new, untried technology in nearly

  20. The tensor network theory library

    NASA Astrophysics Data System (ADS)

    Al-Assam, S.; Clark, S. R.; Jaksch, D.

    2017-09-01

    In this technical paper we introduce the tensor network theory (TNT) library—an open-source software project aimed at providing a platform for rapidly developing robust, easy-to-use and highly optimised code for TNT calculations. The objectives of this paper are (i) to give an overview of the structure of the TNT library, and (ii) to help scientists decide whether to use the TNT library in their research. We show how to employ the TNT routines by giving examples of ground-state and dynamical calculations of a one-dimensional bosonic lattice system. We also discuss different options for gaining access to the software, available at www.tensornetworktheory.org.
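
    The TNT library's own routines are not reproduced in the abstract. Purely as an illustration of the kind of contraction such libraries optimise, here is a minimal NumPy sketch (not TNT code) that computes the norm of a random matrix product state for a small one-dimensional lattice:

    ```python
    import numpy as np

    # Generic tensor-network building block (not the TNT library's API):
    # the norm of a random matrix product state (MPS) for a small 1D
    # lattice, computed by contracting site tensors from left to right.
    L, d, chi = 6, 2, 4          # sites, truncated local dimension, bond dimension
    rng = np.random.default_rng(0)
    mps = [rng.normal(size=(1 if i == 0 else chi, d, 1 if i == L - 1 else chi))
           for i in range(L)]

    # E accumulates the partial contraction of <psi|psi> over processed sites.
    E = np.ones((1, 1))
    for A in mps:
        # indices: a,b = left bonds of ket/bra; s = physical; c,e = right bonds
        E = np.einsum('ab,asc,bse->ce', E, A, A.conj())
    print("norm^2 =", float(E.squeeze()))
    ```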

  1. An overview of 3D software visualization.

    PubMed

    Teyseyre, Alfredo R; Campo, Marcelo R

    2009-01-01

    Software visualization studies techniques and methods for graphically representing different aspects of software. Its main goal is to enhance, simplify and clarify the mental representation a software engineer has of a computer system. For many years, visualization in 2D space was actively studied, but in the last decade researchers have begun to explore new 3D representations for visualizing software. In this article, we present an overview of current research in the area, describing several major aspects: visual representations, interaction issues, evaluation methods and development tools. We also survey some representative tools that support different tasks, i.e., software maintenance and comprehension, requirements validation and algorithm animation for educational purposes, among others. Finally, we conclude by identifying future research directions.

  2. 77 FR 50724 - Developing Software Life Cycle Processes for Digital Computer Software Used in Safety Systems of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Developing Software Life Cycle Processes for Digital... Software Life Cycle Processes for Digital Computer Software used in Safety Systems of Nuclear Power Plants... clarifications, the enhanced consensus practices for developing software life-cycle processes for digital...

  3. Student self-assessment in an interactive learning environment: Technological tools for scaffolding and understanding self-assessment practices

    NASA Astrophysics Data System (ADS)

    Eslinger, Eric Martin

    Metacognitive skills are a crucial component of a successful learning career. We define metacognition as the ability to plan, monitor progress toward a goal, reflect on the quality of work and process, and revise the work or plan accordingly. By explicitly addressing certain metacognitive practices in classrooms, researchers have observed improved learning outcomes in both science and mathematical problem solving. Although these efforts were successful, they were also limited in the range of skills that could be addressed at one time and the methods used to address them due to the static nature inherent in traditional pencil-and-paper format. We wished to address these skills in a more dynamic, continuous representation such as that afforded by a computerized learning environment. This paper outlines such an environment and describes pedagogical activities afforded by the system. The ThinkerTools group developed and tested a software scaffold for inquiry projects in a middle-school classroom. By analyzing student use of the software tool, three forms of self-assessment activity were noted: integrated, task and project self-assessment. Each assessment form was related to the degree of interleaving between assessment and work the students engaged in as they developed their inquiry products. I argue that the integrated forms of assessment are more beneficial to student learning, and show that there is a significant relationship between active self-assessment forms and measures of student achievement and product quality. Through the use of case studies including video analysis, I address specific student self-assessment activity that utilized the software as well as self-assessment that took place outside of the software. A model of student self-assessment activity was created, highlighting aspects of activity that afford more productive self-assessment episodes.

  4. Integrating Wireless Networking for Radiation Detection

    NASA Astrophysics Data System (ADS)

    Board, Jeremy; Barzilov, Alexander; Womble, Phillip; Paschal, Jon

    2006-10-01

    As wireless networking becomes more available, new applications are being developed for this technology. Our group has been studying the advantages of wireless networks of radiation detectors. With the prevalence of the IEEE 802.11 standard (``WiFi''), we have developed a wireless detector unit which is comprised of a 5 cm x 5 cm NaI(Tl) detector, amplifier and data acquisition electronics, and a WiFi transceiver. A server may communicate with the detector unit using a TCP/IP network connected to a WiFi access point. Special software on the server will perform radioactive isotope determination and estimate dose-rates. We are developing an enhanced version of the software which utilizes the receiver signal strength index (RSSI) to estimate source strengths and to create maps of radiation intensity.
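
    The abstract does not give the group's algorithm, but the underlying idea of estimating a point source's position and strength from networked detector readings can be sketched under a simple inverse-square assumption. This is a hypothetical stand-in (ignoring attenuation and background) rather than the group's software:

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    # Four detectors at known positions (m); count rates fall off as 1/r^2
    # from a point source of unknown position and strength (made-up units).
    detectors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
    true_src, true_strength = np.array([6.0, 3.0]), 5.0e4
    rates = true_strength / np.sum((detectors - true_src) ** 2, axis=1)

    def residuals(p):
        x, y, s = p
        predicted = s / np.sum((detectors - [x, y]) ** 2, axis=1)
        return predicted - rates

    fit = least_squares(residuals, x0=[5.0, 5.0, 1.0e4])
    print("estimated (x, y, strength):", fit.x)
    ```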

  5. A method for automatically optimizing medical devices for treating heart failure: designing polymeric injection patterns.

    PubMed

    Wenk, Jonathan F; Wall, Samuel T; Peterson, Robert C; Helgerson, Sam L; Sabbah, Hani N; Burger, Mike; Stander, Nielen; Ratcliffe, Mark B; Guccione, Julius M

    2009-12-01

    Heart failure continues to present a significant medical and economic burden throughout the developed world. Novel treatments involving the injection of polymeric materials into the myocardium of the failing left ventricle (LV) are currently being developed, which may reduce elevated myofiber stresses during the cardiac cycle and act to retard the progression of heart failure. A finite element (FE) simulation-based method was developed in this study that can automatically optimize the injection pattern of the polymeric "inclusions" according to a specific objective function, using commercially available software tools. The FE preprocessor TRUEGRID® was used to create a parametric axisymmetric LV mesh matched to experimentally measured end-diastole and end-systole metrics from dogs with coronary microembolization-induced heart failure. Passive and active myocardial material properties were defined by a pseudo-elastic strain energy function and a time-varying elastance model of active contraction, respectively, that were implemented in the FE software LS-DYNA. The companion optimization software LS-OPT was used to communicate directly with TRUEGRID® to determine FE model parameters, such as defining the injection pattern and inclusion characteristics. The optimization resulted in an intuitive optimal injection pattern (i.e., the one with the greatest number of inclusions) when the objective function was weighted to minimize mean end-diastolic and end-systolic myofiber stress and ignore LV stroke volume. In contrast, the optimization resulted in a nonintuitive optimal pattern (i.e., 3 inclusions longitudinally × 6 inclusions circumferentially) when both myofiber stress and stroke volume were incorporated into the objective function with different weights.
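
    The study's actual objective function lives inside LS-OPT; the sketch below merely illustrates the weighting idea described above, with placeholder values and hypothetical weights:

    ```python
    # Hedged sketch of the weighted objective described in the abstract
    # (not the LS-OPT implementation; all numbers are placeholders).
    def objective(mean_ed_stress, mean_es_stress, stroke_volume_ml,
                  w_stress=1.0, w_sv=0.0, sv_target_ml=40.0):
        """Lower is better: penalize end-diastolic/end-systolic myofiber
        stress and, if w_sv > 0, deviation of stroke volume from a target."""
        stress_term = mean_ed_stress + mean_es_stress
        sv_term = (stroke_volume_ml - sv_target_ml) ** 2
        return w_stress * stress_term + w_sv * sv_term

    # With w_sv = 0 the optimizer favors maximal stress relief (many
    # inclusions); a nonzero w_sv trades stress against pump function.
    print(objective(2.1, 8.4, 35.0, w_sv=0.05))
    ```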

  6. i-Tree: Tools to assess and manage structure, function, and value of community forests

    NASA Astrophysics Data System (ADS)

    Hirabayashi, S.; Nowak, D.; Endreny, T. A.; Kroll, C.; Maco, S.

    2011-12-01

    Trees in urban communities can mitigate many adverse effects associated with anthropogenic activities and climate change (e.g. urban heat island, greenhouse gas, air pollution, and floods). To protect environmental and human health, managers need to make informed decisions regarding urban forest management practices. Here we present the i-Tree suite of software tools (www.itreetools.org) developed by the USDA Forest Service and their cooperators. This software suite can help urban forest managers assess and manage the structure, function, and value of urban tree populations regardless of community size or technical capacity. i-Tree is a state-of-the-art, peer-reviewed Windows GUI- or Web-based software suite that is freely available, supported, and continuously refined by the USDA Forest Service and their cooperators. Two major features of i-Tree are 1) to analyze current canopy structures and identify potential planting spots, and 2) to estimate the environmental benefits provided by the trees, such as carbon storage and sequestration, energy conservation, air pollution removal, and storm water reduction. To cover diverse forest topologies, various tools were developed within the i-Tree suite: i-Tree Design for points (individual trees), i-Tree Streets for lines (street trees), and i-Tree Eco, Vue, and Canopy (in order of complexity) for areas (community trees). Once the forest structure is identified with these tools, ecosystem services provided by trees can be estimated with common models and protocols, and reports in the form of texts, charts, and figures are then created for users. Since i-Tree was developed with a client/server architecture, nationwide data in the US such as location-related parameters, weather, streamflow, and air pollution data are stored on the server and retrieved to a user's computer at run-time. Freely available remote-sensed images (e.g. NLCD and Google maps) are also employed to estimate tree canopy characteristics. As the demand for i-Tree grows internationally, environmental databases from more countries will be coupled with the software suite. Two more i-Tree applications, i-Tree Forecast and i-Tree Landscape, are now under development. i-Tree Forecast simulates canopy structures for up to 100 years based on planting and mortality rates and adds capabilities for other i-Tree applications to estimate the benefits of future canopy scenarios. While most i-Tree applications employ a spatially lumped approach, i-Tree Landscape employs a spatially distributed approach that allows users to map changes in canopy cover and ecosystem services through time and space. These new i-Tree tools provide an advanced platform for urban managers to assess the impact of current and future urban forests. i-Tree allows managers to promote effective urban forest management and sound arboricultural practices by providing information for advocacy and planning, baseline data for making informed decisions, and standardization for comparisons with other communities.

  7. The Role of Dynamic Geometry Software in High School Geometry Curricula: An Analysis of Proof Tasks

    ERIC Educational Resources Information Center

    Oner, Diler

    2009-01-01

    In this study, I examine the role of dynamic geometry software (DGS) in curricular proof tasks. I investigated seven US high school geometry textbooks that were categorised into three groups: technology-intensive, standards-based, and traditional curricula. I looked at the frequency and purpose of DGS use in these textbooks. In addition, I…

  8. Importance of software version for measurement of arterial stiffness: Arteriograph as an example.

    PubMed

    Ring, Margareta; Eriksson, Maria J; Nyberg, Gunnar; Caidahl, Kenneth

    2018-01-01

    Current guidelines recommend the measurement of arterial stiffness in terms of aortic pulse wave velocity (PWV) as an important cardiovascular risk marker. Both aortic PWV and the aortic augmentation index (AIxao) can be measured using different techniques, e.g., the Arteriograph and SphygmoCor. A new version of the software for the Arteriograph (v. 3.0.0.1, TensioMed, Budapest, Hungary; Arteriograph II) is now available. We wanted to determine whether this improved software differs from the previous version (Arteriograph v. 1.9.9.12; Arteriograph I). We compared the estimated aortic PWV (ePWVao) and AIxao measured with both versions of Arteriograph software and analysed the agreement of these values with those measured by SphygmoCor (v. 7.01, AtCor Medical, Sydney, Australia). Eighty-seven subjects without known cardiovascular disease (23 men and 64 women) aged 54.2 ± 8.7 years (mean ± standard deviation; range 33-68 years) were included in the study. Estimated PWVao and AIxao were measured by both Arteriograph and SphygmoCor. We compared Arteriograph I and Arteriograph II with each other and with SphygmoCor. Estimated PWVao measured by Arteriograph II was lower than that measured by Arteriograph I, while the AIxao was higher. Divergence in ePWVao values was especially noted above 9 m/s. Estimated PWVao measured by Arteriograph II (7.2 m/s, 6.6-8.0 [median, 25th-75th percentile]) did not differ from that measured by SphygmoCor (7.1 m/s, 6.7-7.9 [median, 25th-75th percentile]). However, the AIxao measured by Arteriograph II was significantly higher (P < 0.001). Regularly upgraded software versions resulting from continuous technical development are needed for quality improvement of methods. However, changes in software, even if the basic patented operational algorithm has not changed, may influence the measured values, as shown in the present study. Therefore, attention should be paid to the software version of the method used when comparing arterial stiffness results in clinical settings or when performing scientific studies.
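
    The study's statistics are not reproduced here, but agreement between two software versions of this kind is commonly summarised with a Bland-Altman analysis. The following sketch uses made-up numbers of roughly the study's size (87 subjects), purely for illustration:

    ```python
    import numpy as np

    # Illustrative agreement check between two software versions
    # (synthetic data, not the study's measurements): Bland-Altman
    # bias and 95% limits of agreement for PWV in m/s.
    rng = np.random.default_rng(1)
    pwv_v1 = rng.normal(8.0, 1.5, 87)                 # "version I"-like values
    pwv_v2 = pwv_v1 - 0.3 + rng.normal(0, 0.4, 87)    # systematically lower

    diff = pwv_v2 - pwv_v1
    bias, sd = diff.mean(), diff.std(ddof=1)
    print(f"bias = {bias:.2f} m/s, 95% limits of agreement = "
          f"[{bias - 1.96 * sd:.2f}, {bias + 1.96 * sd:.2f}] m/s")
    ```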

  9. Advanced multilateration theory, software development, and data processing: The MICRODOT system

    NASA Technical Reports Server (NTRS)

    Escobal, P. R.; Gallagher, J. F.; Vonroos, O. H.

    1976-01-01

    The process of geometric parameter estimation to accuracies of one centimeter, i.e., multilateration, is defined and applications are listed. A brief functional explanation of the theory is presented. Next, various multilateration systems are described in order of increasing system complexity. Expected systems accuracy is discussed from a general point of view and a summary of the errors is listed. An outline of the design of a software processing system for multilateration, called MICRODOT, is presented next. The links of this software, which can be used for multilateration data simulations or operational data reduction, are examined on an individual basis. Functional flow diagrams are presented to aid in understanding the software capability. MICRODOT capability is described with respect to vehicle configurations, interstation coordinate reduction, geophysical parameter estimation, and orbit determination. Numerical results obtained from MICRODOT via data simulations are displayed both for hypothetical and real world vehicle/station configurations such as used in the GEOS-3 Project. These simulations show the inherent power of the multilateration procedure.
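
    MICRODOT itself is a large processing system, but the core multilateration step it implements (geometric parameter estimation from range measurements) reduces to a nonlinear least-squares problem. A minimal illustrative sketch (not MICRODOT's algorithm) with noise-free synthetic ranges:

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    # Known ground-station coordinates (m) and ranges to an unknown
    # vehicle position; recover the position by least squares.
    stations = np.array([[0.0, 0.0, 0.0], [50e3, 0.0, 0.0],
                         [0.0, 50e3, 0.0], [50e3, 50e3, 10.0]])
    truth = np.array([20e3, 30e3, 400e3])
    ranges = np.linalg.norm(stations - truth, axis=1)   # noise-free here

    def residuals(p):
        return np.linalg.norm(stations - p, axis=1) - ranges

    sol = least_squares(residuals, x0=np.array([1e3, 1e3, 100e3]))
    print("estimated position (m):", sol.x)
    ```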

  10. Software evolution. What kind of evolution?

    NASA Astrophysics Data System (ADS)

    Torres-Carbonell, J. J.; Parets-Llorca, J.

    2001-06-01

    Most Software Systems capable of adapting to the environment or of performing some kind of adaptive activity (such as pattern learning, behavior simulations and the like) use concepts and models from Biology. Nevertheless, such approaches are based on the Modern Synthesis, i.e., Darwinism plus Mendelism, and this implies pre-adaptive mutations and subsequent selection of the better-adapted individuals. These pre-adaptive changes usually do not produce the desired effect, are virtually useless and require some kind of backtracking for the system to obtain profit from adaptation. It is our contention that an evolutionary approach in Software Systems development cannot be based on pre-adaptive mutations, but rather on post-adaptive ones, that is, anticipatory mutations and modifications (Lamarckism). A novel way of understanding evolution in Software Systems based on applied Lamarckism is presented, and a framework is proposed that allows the incorporation of modifications according to the necessities of the system and the will of the modeller.

  11. The Software Engineering Prototype.

    DTIC Science & Technology

    1983-06-01

    This only means that the 'claim', i.e., "accepted wisdom" in systems design, was set up as the alternative to the hypothesis, in accord with tradition...conflict and its resolution are most likely to occur when users can exercise their influence in the development process. Conflict itself does not lead...the traditional method of software development often has poor results. Recently, a new approach to software development, the prototype approach

  12. Automatic calibration and signal switching system for the particle beam fusion research data acquisition facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boyer, W.B.

    1979-09-01

    This report describes both the hardware and software components of an automatic calibration and signal switching system (Autocal) for the data acquisition system of the Sandia particle beam fusion research accelerators Hydra, Proto I, and Proto II. The Autocal hardware consists of off-the-shelf commercial equipment. The various hardware components, special modifications, and overall system configuration are described. Special software has been developed to support the Autocal hardware. Software operation and maintenance are described.

  13. THE VALIDITY OF USING ROC SOFTWARE FOR ANALYSING VISUAL GRADING CHARACTERISTICS DATA: AN INVESTIGATION BASED ON THE NOVEL SOFTWARE VGC ANALYZER.

    PubMed

    Hansson, Jonny; Månsson, Lars Gunnar; Båth, Magnus

    2016-06-01

    The purpose of the present work was to investigate the validity of using single-reader-adapted receiver operating characteristics (ROC) software for analysis of visual grading characteristics (VGC) data. VGC data from four published VGC studies on optimisation of X-ray examinations, previously analysed using ROCFIT, were reanalysed using recently developed software dedicated to VGC analysis (VGC Analyzer), and the outcomes [the mean and 95% confidence interval (CI) of the area under the VGC curve (AUCVGC) and the p-value] were compared. The studies included both paired and non-paired data and were reanalysed both for the fixed-reader and the random-reader situations. The results showed good agreement between the software packages for the mean AUCVGC. For non-paired data, wider CIs were obtained with VGC Analyzer than previously reported, whereas for paired data, the previously reported CIs were similar or even broader. Similar observations were made for the p-values. The results indicate that the use of single-reader-adapted ROC software such as ROCFIT for analysing non-paired VGC data may lead to an increased risk of committing Type I errors, especially in the random-reader situation. On the other hand, the use of ROC software for analysis of paired VGC data may lead to an increased risk of committing Type II errors, especially in the fixed-reader situation.
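
    For non-paired rating data, the area under the VGC curve corresponds to the Mann-Whitney U statistic normalised by the product of the sample sizes, which is easy to compute directly. The sketch below (made-up ordinal ratings, not data from the four studies) shows the idea:

    ```python
    import numpy as np
    from scipy.stats import mannwhitneyu

    # Non-paired VGC idea in miniature (not VGC Analyzer itself):
    # AUC_VGC ~ U / (n1 * n2) for ordinal image-quality ratings (1-5).
    ratings_a = np.array([2, 3, 3, 4, 2, 3, 5, 4])   # condition A
    ratings_b = np.array([3, 4, 4, 5, 3, 4, 5, 5])   # condition B

    u, p = mannwhitneyu(ratings_b, ratings_a, alternative='two-sided')
    auc = u / (len(ratings_a) * len(ratings_b))
    print(f"AUC_VGC ~ {auc:.2f}, p = {p:.3f}")
    ```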

  14. The relationships between software publications and software systems

    NASA Astrophysics Data System (ADS)

    Hogg, David W.

    2017-01-01

    When we build software systems or software tools for astronomy, we sometimes do and sometimes don't also write and publish standard scientific papers about those software systems. I will discuss the pros and cons of writing such publications. There are impacts of writing such papers immediately (they can affect the design and structure of the software project itself), in the short term (they can promote adoption and legitimize the software), in the medium term (they can provide a platform for all the literature's mechanisms for citation, criticism, and reuse), and in the long term (they can preserve ideas that are embodied in the software, possibly on timescales much longer than the lifetime of any software context). I will argue that as important as pure software contributions are to astronomy—and I am both a preacher and a practitioner—software contributions are even more valuable when they are associated with traditional scientific publications. There are exceptions and complexities of course, which I will discuss.

  15. The GBT Dynamic Scheduling System: Development and Testing

    NASA Astrophysics Data System (ADS)

    McCarty, M.; Clark, M.; Marganian, P.; O'Neil, K.; Shelton, A.; Sessoms, E.

    2009-09-01

    During the summer trimester of 2008, all observations on the Robert C. Byrd Green Bank Telescope (GBT) were scheduled using the new Dynamic Scheduling System (DSS). Beta testing exercised the policies, algorithms, and software developed for the DSS project. Since observers are located all over the world, the DSS was implemented as a web application. Technologies such as iCalendar, Really Simple Syndication (RSS) feeds, email, and instant messaging are used to transfer as much or as little information to observers as they request. We discuss the software engineering challenges leading to our implementation such as information distribution and building rich user interfaces in the web browser. We also relate our adaptation of agile development practices to design and develop the DSS. Additionally, we describe handling differences in expected versus actual initial conditions in the pool of project proposals for the 08B trimester. We then identify lessons learned from beta testing and present statistics on how the DSS was used during the trimester.

  16. Using articulation and inscription as catalysts for reflection: Design principles for reflective inquiry

    NASA Astrophysics Data System (ADS)

    Loh, Ben Tun-Bin

    2003-07-01

    The demand for students to engage in complex student-driven and information-rich inquiry investigations poses challenges to existing learning environments. Students are not familiar with this style of work, and lack the skills, tools, and expectations it demands, often forging blindly forward in the investigation. If students are to be successful, they need to learn to be reflective inquirers, periodically stepping back from an investigation to evaluate their work. The fundamental goal of my dissertation is to understand how to design learning environments to promote and support reflective inquiry. I have three basic research questions: how to define this mode of work, how to help students learn it, and understanding how it facilitates reflection when enacted in a classroom. I take an exploratory approach in which, through iterative cycles of design, development, and reflection, I develop principles of design for reflective inquiry, instantiate those principles in the design of a software environment, and test that software in the context of classroom work. My work contributes to the understanding of reflective inquiry in three ways: First, I define a task model that describes the kinds of operations (cognitive tasks) that students should engage in as reflective inquirers. These operations are defined in terms of two basic tasks: articulation and inscription, which serve as catalysts for externalizing student thinking as objects of and triggers for reflection. Second, I instantiate the task model in the design of software tools (the Progress Portfolio). And, through proof of concept pilot studies, I examine how the task model and tools helped students with their investigative classroom work. Finally, I take a step back from these implementations and articulate general design principles for reflective inquiry with the goal of informing the design of other reflective inquiry learning environments. There are three design principles: (1) Provide a designated work space for reflection activities to focus student attention on reflection. (2) Help students create and use artifacts that represent their work and their thinking as a means to create referents for reflection. (3) Support and take advantage of social processes that help students reflect on their own work.

  17. Supporting Tablet Configuration, Tracking, and Infection Control Practices in Digital Health Interventions: Study Protocol

    PubMed Central

    Furberg, Robert D; Zulkiewicz, Brittany A; Hudson, Jordan P; Taylor, Olivia M; Lewis, Megan A

    2016-01-01

    Background Tablet-based health care interventions have the potential to encourage patient care in a timelier manner, allow physicians convenient access to patient records, and provide an improved method for patient education. However, along with the continued adoption of tablet technologies, there is a concomitant need to develop protocols focusing on the configuration, management, and maintenance of these devices within the health care setting to support the conduct of clinical research. Objective Develop three protocols to support tablet configuration, tablet management, and tablet maintenance. Methods The Configurator software, Tile technology, and current infection control recommendations were employed to develop three distinct protocols for tablet-based digital health interventions. Configurator is a mobile device management software specifically for iPhone operating system (iOS) devices. The capabilities and current applications of Configurator were reviewed and used to develop the protocol to support device configuration. Tile is a tracking tag associated with a free mobile app available for iOS and Android devices. The features associated with Tile were evaluated and used to develop the Tile protocol to support tablet management. Furthermore, current recommendations on preventing health care–related infections were reviewed to develop the infection control protocol to support tablet maintenance. Results This article provides three protocols: the Configurator protocol, the Tile protocol, and the infection control protocol. Conclusions These protocols can help to ensure consistent implementation of tablet-based interventions, enhance fidelity when employing tablets for research purposes, and serve as a guide for tablet deployments within clinical settings. PMID:27350013

  18. How Do I Start a Property Records System?

    ERIC Educational Resources Information Center

    Whyman, Wynne

    2003-01-01

    A property records system organizes data to be utilized by a camp's facilities department and integrated into other areas. Start by deciding what records to keep and allotting the time. Then develop consistent procedures, including organizing data, creating a catalog, making back-up copies, and integrating procedures. Use software tools. A good…

  19. Minimizing bias in biomass allometry: Model selection and log transformation of data

    Treesearch

    Joseph Mascaro; Flint Hughes; Amanda Uowolo; Stefan A. Schnitzer

    2011-01-01

    Nonlinear regression is increasingly used to develop allometric equations for forest biomass estimation (i.e., as opposed to the traditional approach of log-transformation followed by linear regression). Most statistical software packages, however, assume additive errors by default, violating a key assumption of allometric theory and possibly producing spurious models....
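
    The contrast the abstract draws can be made concrete with synthetic data: a nonlinear fit of M = a·D^b under an additive-error assumption versus ordinary least squares on the log-transformed model, which assumes multiplicative errors. This is an illustrative sketch only, not the paper's analysis:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Synthetic allometric data with multiplicative (lognormal) error.
    rng = np.random.default_rng(2)
    D = rng.uniform(5, 50, 100)                          # diameter (cm)
    M = 0.1 * D**2.4 * np.exp(rng.normal(0, 0.3, 100))   # biomass (kg)

    # Nonlinear fit (additive-error assumption, the software default).
    popt, _ = curve_fit(lambda d, a, b: a * d**b, D, M, p0=[0.1, 2.5])

    # Traditional approach: OLS on ln(M) = ln(a) + b*ln(D).
    b_log = np.polyfit(np.log(D), np.log(M), 1)[0]
    print(f"nonlinear (additive-error) b = {popt[1]:.2f}; "
          f"log-linear (multiplicative-error) b = {b_log:.2f}")
    ```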

  20. Genetic algorithms

    NASA Technical Reports Server (NTRS)

    Wang, Lui; Bayer, Steven E.

    1991-01-01

    Genetic algorithms are mathematical, highly parallel, adaptive search procedures (i.e., problem solving methods) based loosely on the processes of natural genetics and Darwinian survival of the fittest. Basic genetic algorithm concepts and applications are introduced, and results are presented from a project to develop a software tool that will enable the widespread use of genetic algorithm technology.
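
    A minimal genetic algorithm, generic rather than the NASA tool described above, fits in a few lines; this sketch evolves bit strings toward the all-ones optimum ("OneMax") using truncation selection, one-point crossover, and bit-flip mutation:

    ```python
    import random

    def evolve(n_bits=20, pop_size=30, generations=60, p_mut=0.02):
        pop = [[random.randint(0, 1) for _ in range(n_bits)]
               for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=sum, reverse=True)           # fitness = number of 1s
            parents = pop[:pop_size // 2]             # truncation selection
            children = []
            while len(children) < pop_size:
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, n_bits)     # one-point crossover
                child = a[:cut] + b[cut:]
                child = [bit ^ (random.random() < p_mut) for bit in child]
                children.append(child)
            pop = children
        return max(pop, key=sum)

    best = evolve()
    print(sum(best), "of", len(best), "bits set")
    ```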

  1. Software/hardware distributed processing network supporting the Ada environment

    NASA Astrophysics Data System (ADS)

    Wood, Richard J.; Pryk, Zen

    1993-09-01

    A high-performance, fault-tolerant, distributed network has been developed, tested, and demonstrated. The network is based on the MIPS Computer Systems, Inc. R3000 RISC processor, VHSIC ASICs for high-speed, reliable, inter-node communications, and compatible commercial memory and I/O boards. The network is an evolution of the Advanced Onboard Signal Processor (AOSP) architecture. It supports Ada application software with an Ada-implemented operating system. A six-node implementation (capable of expansion up to 256 nodes) of the RISC multiprocessor architecture provides 120 MIPS of scalar throughput, 96 Mbytes of RAM and 24 Mbytes of non-volatile memory. The network provides for all ground processing applications, has merit as a space-qualified RISC-based network, and interfaces to advanced Computer Aided Software Engineering (CASE) tools for application software development.

  2. ACS from development to operations

    NASA Astrophysics Data System (ADS)

    Caproni, Alessandro; Colomer, Pau; Jeram, Bogdan; Sommer, Heiko; Chiozzi, Gianluca; Mañas, Miguel M.

    2016-08-01

    The ALMA Common Software (ACS) provides the infrastructure of the distributed software system of ALMA and other projects. ACS, built on top of CORBA and Data Distribution Service (DDS) middleware, is based on a Component-Container paradigm and hides the complexity of the middleware, allowing the developer to focus on domain-specific issues. The transition of the ALMA observatory from construction to operations means that ACS effort now focuses primarily on scalability, stability and robustness rather than on new features. The transition came together with a shorter release cycle and more extensive testing. For scalability, the most problematic area has been the CORBA notification service, used to implement the publisher-subscriber pattern, because of the asynchronous nature of the paradigm: a lot of effort has been spent to improve its stability and recovery from run-time errors. The original bulk data mechanism, implemented using the CORBA Audio/Video Streaming Service, showed its limitations and has been replaced with a more performant and scalable DDS implementation. Operational needs soon showed the difference between release cycles for online software (i.e., software used during observations) and offline software, which requires much more frequent releases. This paper describes the impact the transition from construction to operations had on ACS, the solutions adopted so far, and a look into future evolution.
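
    The publisher-subscriber pattern that ACS implements over the CORBA Notification Service and DDS can be illustrated with a minimal in-process sketch; this is not ACS code (real middleware adds transport, quality-of-service handling, and the error recovery discussed above), and the channel name and event payload are hypothetical:

    ```python
    from collections import defaultdict
    from typing import Any, Callable

    class Bus:
        """Minimal in-process event bus: many subscribers per channel."""
        def __init__(self) -> None:
            self._subs: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

        def subscribe(self, channel: str, cb: Callable[[Any], None]) -> None:
            self._subs[channel].append(cb)

        def publish(self, channel: str, event: Any) -> None:
            for cb in self._subs[channel]:   # delivery is asynchronous in real systems
                cb(event)

    bus = Bus()
    bus.subscribe("antenna/status", lambda e: print("got:", e))
    bus.publish("antenna/status", {"az": 120.5, "el": 45.0})
    ```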

  3. 48 CFR 227.7103-6 - Contract clauses.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... expense). Do not use the clause when the only deliverable items are computer software or computer software... architect-engineer and construction contracts. (b)(1) Use the clause at 252.227-7013 with its Alternate I in... Software Previously Delivered to the Government, in solicitations when the resulting contract will require...

  4. 48 CFR 227.7103-6 - Contract clauses.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... expense). Do not use the clause when the only deliverable items are computer software or computer software... architect-engineer and construction contracts. (b)(1) Use the clause at 252.227-7013 with its Alternate I in... Software Previously Delivered to the Government, in solicitations when the resulting contract will require...

  5. Cargo Movement Operations System (CMOS) Draft Software Product Specification, Increment I

    DTIC Science & Technology

    1990-12-13

    Excerpt from the specification's Data Item Discrepancy Worksheet (CDRL number A014-02, dated 12/13/90; originator control number SPS1-0002), with accept/reject disposition and open/closed status checkboxes for each comment:
    1. Page 2, paragraph 1.3: Delete "Software Product Specification" from lines 4 & 5 of the first paragraph.
    2. Page 11, paragraph 3.4: Change "Table 3.4" to "Appendix E".
    3. Appendix B (all): Change the term "Deskview", used to describe the development language, to "DESQview".

  6. GeolOkit 1.0: a new Open Source, Cross-Platform software for geological data visualization in Google Earth environment

    NASA Astrophysics Data System (ADS)

    Triantafyllou, Antoine; Bastin, Christophe; Watlet, Arnaud

    2016-04-01

    GIS software suites are today's essential tools to gather and visualise geological data, to apply spatial and temporal analysis and, in fine, to create and share interactive maps for further geoscience investigations. For these purposes, we developed GeolOkit: an open-source, freeware and lightweight software package, written in Python, a high-level, cross-platform programming language. GeolOkit is accessible through a graphical user interface designed to run in parallel with Google Earth. It is a user-friendly toolbox that allows 'geo-users' to import their raw data (e.g. GPS, sample locations, structural data, field pictures, maps), to use fast data analysis tools and to plot these into the Google Earth environment using KML code. This workflow requires no third-party software except Google Earth itself. GeolOkit comes with a large number of geoscience labels, symbols, colours and placemarks and may process: (i) multi-point data, (ii) contours via several interpolation methods, (iii) discrete planar and linear structural data in 2D or 3D, supporting a large range of structure input formats, (iv) clustered stereonets and rose diagrams, (v) drawn cross-sections as vertical sections, (vi) georeferenced maps and vectors, and (vii) field pictures using either geo-tracking metadata from a camera's built-in GPS module or the same-day track of an external GPS. We invite you to discover all the functionalities of the GeolOkit software. As this project is under development, we welcome discussion of your needs, ideas and contributions to the GeolOkit project.
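
    GeolOkit's KML-generation internals are not shown in the abstract. As a generic illustration of the approach (plain KML written with the Python standard library; sample names and coordinates are hypothetical), one might write:

    ```python
    import xml.etree.ElementTree as ET

    def to_kml(points, path="samples.kml"):
        """Write (name, lon, lat) tuples as KML placemarks that Google
        Earth can open directly."""
        kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
        doc = ET.SubElement(kml, "Document")
        for name, lon, lat in points:
            pm = ET.SubElement(doc, "Placemark")
            ET.SubElement(pm, "name").text = name
            pt = ET.SubElement(pm, "Point")
            ET.SubElement(pt, "coordinates").text = f"{lon},{lat},0"
        ET.ElementTree(kml).write(path, xml_declaration=True, encoding="UTF-8")

    to_kml([("sample-01", 4.40, 50.41), ("sample-02", 4.42, 50.43)])
    ```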

  7. Towards a mature measurement environment: Creating a software engineering research environment

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.

    1990-01-01

    Software engineering researchers are building tools and defining methods and models; however, there are problems with the nature and style of the research. The research is typically bottom-up, done in isolation, so the pieces cannot be easily logically or physically integrated. A great deal of the research is essentially the packaging of a particular piece of technology with little indication of how the work would be integrated with other pieces of research. The research is not aimed at solving the real problems of software engineering, i.e., the development and maintenance of quality systems in a productive manner. The research results are not evaluated or analyzed via experimentation or refined and tailored to the application environment. Thus, they cannot be easily transferred into practice. Because of these limitations we have not been able to understand the components of the discipline as a coherent whole and the relationships between various models of the process and product. What is needed is a top-down, experimental, evolutionary framework in which research can be focused, logically and physically integrated to produce quality software productively, and evaluated and tailored to the application environment. This implies the need for experimentation, which in turn implies the need for a laboratory that is associated with the artifact we are studying. This laboratory can only exist in an environment where software is being built, i.e., as part of a real software development and maintenance organization. Thus, we propose that Software Engineering Laboratory (SEL) type activities exist in all organizations to support software engineering research. We describe the SEL from a researcher's point of view, and discuss the corporate and government benefits of the SEL. The discussion focuses on the benefits to the research community.

  8. The Effects of Development Team Skill on Software Product Quality

    NASA Technical Reports Server (NTRS)

    Beaver, Justin M.; Schiavone, Guy A.

    2006-01-01

    This paper provides an analysis of the effect of the skill/experience of the software development team on the quality of the final software product. A method for the assessment of software development team skill and experience is proposed, which was derived from a workforce management tool currently in use by the National Aeronautics and Space Administration. Using data from 26 small-scale software development projects, the team skill measures are correlated to 5 software product quality metrics from the ISO/IEC 9126 Software Engineering Product Quality standard. In the analysis of the results, development team skill is found to be a significant factor in the adequacy of the design and implementation. In addition, the results imply that inexperienced software developers are tasked with responsibilities ill-suited to their skill level, and thus have a significant adverse effect on the quality of the software product. Keywords: software quality, development skill, software metrics

  9. IHE cross-enterprise document sharing for imaging: interoperability testing software

    PubMed Central

    2010-01-01

    Background With the deployments of Electronic Health Records (EHR), interoperability testing in healthcare is becoming crucial. EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems. Interoperability between peers is therefore essential. Achieving interoperability requires various types of testing. Implementations need to be tested using software that simulates communication partners, and that provides test data and test plans. Results In this paper we describe software that is used to test systems that are involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross Enterprise Document Sharing for imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also expose the challenges encountered and discuss the chosen design solutions. Conclusions EHR is being deployed in several countries. The EHR infrastructure will be continuously evolving to embrace advances in the information technology domain. Our software is built on a web framework to allow for an easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations. It can also be used by site integrators to verify and test the interoperability of systems, or by developers to understand specification ambiguities or to resolve implementation difficulties. PMID:20858241

  10. IHE cross-enterprise document sharing for imaging: interoperability testing software.

    PubMed

    Noumeir, Rita; Renaud, Bérubé

    2010-09-21

    With the deployments of Electronic Health Records (EHR), interoperability testing in healthcare is becoming crucial. EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems. Interoperability between peers is therefore essential. Achieving interoperability requires various types of testing. Implementations need to be tested using software that simulates communication partners, and that provides test data and test plans. In this paper we describe software that is used to test systems that are involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross Enterprise Document Sharing for imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also expose the challenges encountered and discuss the chosen design solutions. EHR is being deployed in several countries. The EHR infrastructure will be continuously evolving to embrace advances in the information technology domain. Our software is built on a web framework to allow for an easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations. It can also be used by site integrators to verify and test the interoperability of systems, or by developers to understand specification ambiguities or to resolve implementation difficulties.

  11. Software for Dosage Individualization of Voriconazole for Immunocompromised Patients

    PubMed Central

    VanGuilder, Michael; Donnelly, J. Peter; Blijlevens, Nicole M. A.; Brüggemann, Roger J. M.; Jelliffe, Roger W.; Neely, Michael N.

    2013-01-01

    The efficacy of voriconazole is potentially compromised by considerable pharmacokinetic variability. There are increasing insights into voriconazole concentrations that are safe and effective for treatment of invasive fungal infections. Therapeutic drug monitoring is increasingly advocated. Software to aid in the individualization of dosing would be an extremely useful clinical tool. We developed software to enable the individualization of voriconazole dosing to attain predefined serum concentration targets. The process of individualized voriconazole therapy was based on concepts of Bayesian stochastic adaptive control. Multiple-model dosage design with feedback control was used to calculate dosages that achieved desired concentration targets with maximum precision. The performance of the software program was assessed using the data from 10 recipients of an allogeneic hematopoietic stem cell transplant (HSCT) receiving intravenous (i.v.) voriconazole. The program was able to model the plasma concentrations with a high level of precision, despite the wide range of concentration trajectories and interindividual pharmacokinetic variability. The voriconazole concentrations predicted after the last dosages were largely concordant with those actually measured. Simulations provided an illustration of the way in which the software can be used to adjust dosages of patients falling outside desired concentration targets. This software appears to be an extremely useful tool to further optimize voriconazole therapy and aid in therapeutic drug monitoring. Further prospective studies are now required to define the utility of the controller in daily clinical practice. PMID:23380734
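
    The abstract describes multiple-model Bayesian dosage design only in general terms. The toy sketch below conveys the idea (candidate pharmacokinetic models weighted by posterior probability, dose chosen to minimise expected squared deviation from a target), using made-up parameters and a linear-PK steady-state formula that deliberately ignores voriconazole's known nonlinear pharmacokinetics:

    ```python
    import numpy as np

    # Conceptual multiple-model dosage design (not the actual software):
    # each "model" is a candidate clearance with a posterior probability.
    CL = np.array([4.0, 6.0, 9.0])        # candidate clearances (L/h), assumed
    prob = np.array([0.2, 0.5, 0.3])      # Bayesian posterior weights, assumed
    tau, target = 12.0, 3.0               # dosing interval (h), target (mg/L)

    doses = np.arange(50, 601, 10)        # candidate doses (mg)
    # Steady-state average concentration under linear PK: Css = dose / (CL * tau)
    css = doses[:, None] / (CL[None, :] * tau)
    risk = ((css - target) ** 2 * prob).sum(axis=1)   # expected squared error
    print("best dose:", doses[np.argmin(risk)], "mg every", tau, "h")
    ```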

  12. Predictive Modeling of Terrestrial Radiation Exposure from Geologic Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malchow, Russell L.; Haber, Daniel University of Nevada, Las Vegas; Burnley, Pamela

    2015-01-01

    Aerial gamma ray surveys are important for those working in nuclear security and industry for determining locations of both anthropogenic radiological sources and natural occurrences of radionuclides. During an aerial gamma ray survey, a low-flying aircraft, such as a helicopter, flies in a linear pattern across the survey area while measuring the gamma emissions with a sodium iodide (NaI) detector. Currently, if a gamma ray survey is being flown in an area, the only way to correct for geologic sources of gamma rays is to have flown the area previously. This is prohibitively expensive and would require complete national coverage. This project's goal is to model the geologic contribution to radiological backgrounds using published geochemical data, GIS software, remote sensing, calculations, and modeling software. K, U and Th are the three major gamma emitters in geologic material. U and Th are assumed to be in secular equilibrium with their daughter isotopes. If K, U, and Th abundance values are known for a given geologic unit, the expected gamma ray exposure rate can be calculated using the Grasty equation or by modeling software. Monte Carlo N-Particle Transport software (MCNP), developed by Los Alamos National Laboratory, is modeling software designed to simulate particles and their interactions with matter. Using this software, models have been created that represent various lithologies. These simulations randomly generate gamma ray photons at energy levels expected from natural radiologic sources. The photons take a random path through the simulated geologic media and deposit their energy at the end of their track. A series of nested spheres has been created and filled with simulated atmosphere to record energy deposition. Energies deposited are binned in the same manner as the NaI detectors used during an aerial survey. These models are used in place of the simpler Grasty equation as they take into account absorption properties of the lithology which that equation ignores.
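
    A Grasty-style estimate is a linear combination of the K, eU, and eTh abundances. The sketch below shows the form of such a calculation; the coefficients are approximate values of the kind found in the airborne-gamma-spectrometry literature and should be treated as placeholders (consult the original Grasty publication for authoritative numbers):

    ```python
    # Grasty-style linear estimate of terrestrial gamma exposure rate.
    # Coefficients are approximate literature values, used as placeholders.
    def exposure_rate_uR_per_h(k_pct, eu_ppm, eth_ppm,
                               c_k=1.505, c_u=0.653, c_th=0.287):
        """Exposure rate (uR/h) for a uniform half-space source, assuming
        secular equilibrium in the U and Th decay chains."""
        return c_k * k_pct + c_u * eu_ppm + c_th * eth_ppm

    # e.g. a granite-like unit: ~4% K, ~4 ppm eU, ~15 ppm eTh (illustrative)
    print(f"{exposure_rate_uR_per_h(4.0, 4.0, 15.0):.1f} uR/h")
    ```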

  13. Cross-Platform Mobile Application Development: A Pattern-Based Approach

    DTIC Science & Technology

    2012-03-01

    Additionally, developers should be aware of different hardware capabilities such as external SD cards and forward-facing cameras. Finally, each...applications are written... iTunes library, allowing the user to update software and manage content on each device. However, in iOS5, the PC Free feature removes this constraint

  14. Replacing the IRAF/PyRAF Code-base at STScI: The Advanced Camera for Surveys (ACS)

    NASA Astrophysics Data System (ADS)

    Lucas, Ray A.; Desjardins, Tyler D.; STScI ACS (Advanced Camera for Surveys) Team

    2018-06-01

    IRAF/PyRAF are no longer viable on the latest hardware often used by HST observers, therefore STScI no longer actively supports IRAF or PyRAF for most purposes. STScI instrument teams are in the process of converting all of our data processing and analysis code from IRAF/PyRAF to Python, including our calibration reference file pipelines and data reduction software. This is exemplified by our latest ACS Data Handbook, version 9.0, which was recently published in February 2018. Examples of IRAF and PyRAF commands have now been replaced by code blocks in Python, with references linked to documentation on how to download and install the latest Python software via Conda and AstroConda. With the temporary exception of the ACS slitless spectroscopy tool aXe, all ACS-related software is now independent of IRAF/PyRAF. A concerted effort has been made across STScI divisions to help the astronomical community transition from IRAF/PyRAF to Python, with tools such as Python Jupyter notebooks being made to give users workable examples. In addition to our code changes, the new ACS data handbook discusses the latest developments in charge transfer efficiency (CTE) correction, bias de-striping, and updates to the creation and format of calibration reference files among other topics.

  15. Evaluation of a Mobile Platform for Proof-of-Concept Autonomous Site Selection and Preparation

    NASA Astrophysics Data System (ADS)

    Gammell, Jonathan

    A mobile robotic platform for Autonomous Site Selection and Preparation (ASSP) was developed for an analogue deployment to Mauna Kea, Hawai`i. A team of rovers performed an autonomous Ground Penetrating Radar (GPR) survey and constructed a level landing pad. They used interchangeable payloads that allowed the GPR and blade to be easily exchanged. Autonomy was accomplished by integrating the individual hardware devices with software based on the ArgoSoft framework previously developed at UTIAS. The rovers were controlled by an on-board netbook. The successes and failures of the devices and software modules are evaluated within. Recommendations are presented to address problems discovered during the deployment and to guide future research on the platform.

  16. Automating Embedded Analysis Capabilities and Managing Software Complexity in Multiphysics Simulation, Part II: Application to Partial Differential Equations

    DOE PAGES

    Pawlowski, Roger P.; Phipps, Eric T.; Salinger, Andrew G.; ...

    2012-01-01

    A template-based generic programming approach was presented in Part I of this series of papers [Sci. Program. 20 (2012), 197–219] that separates the development effort of programming a physical model from that of computing additional quantities, such as derivatives, needed for embedded analysis algorithms. In this paper, we describe the implementation details for using the template-based generic programming approach for simulation and analysis of partial differential equations (PDEs). We detail several of the hurdles that we have encountered, and some of the software infrastructure developed to overcome them. We end with a demonstration where we present shape optimization and uncertainty quantification results for a 3D PDE application.

  17. Analysis and fit of stellar spectra using a mega-database of CMFGEN models

    NASA Astrophysics Data System (ADS)

    Fierro-Santillán, Celia; Zsargó, Janos; Klapp, Jaime; Díaz-Azuara, Santiago Alfredo; Arrieta, Anabel; Arias, Lorena

    2017-11-01

    We present a tool for analysis and fit of stellar spectra using a mega-database of 15,000 atmosphere models for OB stars. We have developed software tools that allow us to find the model that best fits an observed spectrum, comparing equivalent widths and line ratios in the observed spectrum with all models in the database. We use the Hα, Hβ, Hγ, and Hδ lines as criteria of stellar gravity, and the ratios He II λ4541/He I λ4471, He II λ4200/(He I+He II λ4026), He II λ4541/He I λ4387, and He II λ4200/He I λ4144 as criteria of Teff.

  18. Develop a Model Component

    NASA Technical Reports Server (NTRS)

    Ensey, Tyler S.

    2013-01-01

    During my internship at NASA, I was a model developer for Ground Support Equipment (GSE). The purpose of a model developer is to develop and unit test model component libraries (fluid, electrical, gas, etc.). The models are designed to simulate software for GSE (Ground Special Power, Crew Access Arm, Cryo, Fire and Leak Detection System, Environmental Control System (ECS), etc.) before it is implemented in hardware. These models support verifying local control and remote software for End-Item Software Under Test (SUT). The model simulates the physical behavior (function, state, limits and I/O) of each end-item and its dependencies as defined in the Subsystem Interface Table, Software Requirements & Design Specification (SRDS), Ground Integrated Schematic (GIS), and System Mechanical Schematic (SMS). The software of each specific model component is simulated through MATLAB's Simulink program. The intensive model development life cycle is as follows: identify source documents; identify model scope; update schedule; preliminary design review; develop model requirements; update model scope; update schedule; detailed design review; create/modify library component; implement library component references; implement subsystem components; develop a test script; run the test script; develop a user's guide; send the model out for peer review; the model is sent out for verification/validation; if there is empirical data, a validation data package is generated; if there is not, a verification package is generated; the test results are then reviewed; and finally, the user requests accreditation, and a statement of accreditation is prepared. Once each component model is reviewed and approved, the models are intertwined together into one integrated model. This integrated model is then itself tested, through a test script and autotest, to confirm that all models work conjointly, for a single purpose. The component I was assigned was a fluid component, a discrete pressure switch. The switch takes a fluid pressure input, and if the pressure is greater than a designated cutoff pressure, the switch stops fluid flow.
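
    As a minimal illustration of such a component (not the actual Simulink library block; the hysteresis deadband is an added assumption to avoid chattering near the cutoff), a discrete pressure switch can be modeled as:

    ```python
    class DiscretePressureSwitch:
        """Flow is stopped when input pressure exceeds the cutoff; it is
        restored once pressure drops below cutoff minus the deadband."""
        def __init__(self, cutoff_psi: float, hysteresis_psi: float = 0.0):
            self.cutoff = cutoff_psi
            self.hysteresis = hysteresis_psi   # assumed simple deadband
            self.flow_enabled = True

        def step(self, pressure_psi: float) -> bool:
            """Update the switch state for one simulation tick."""
            if pressure_psi > self.cutoff:
                self.flow_enabled = False
            elif pressure_psi < self.cutoff - self.hysteresis:
                self.flow_enabled = True
            return self.flow_enabled

    switch = DiscretePressureSwitch(cutoff_psi=100.0, hysteresis_psi=5.0)
    for p in (80, 101, 97, 94):
        print(p, "psi ->", "flow" if switch.step(p) else "no flow")
    ```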

  19. Software architecture of biomimetic underwater vehicle

    NASA Astrophysics Data System (ADS)

    Praczyk, Tomasz; Szymak, Piotr

    2016-05-01

    Autonomous underwater vehicles are vehicles that are entirely or partly independent of human decisions. In order to obtain operational independence, the vehicles have to be equipped with a specialized software. The main task of the software is to move the vehicle along a trajectory with collision avoidance. Moreover, the software has also to manage different devices installed on the vehicle board, e.g. to start and stop cameras, sonars etc. In addition to the software embedded on the vehicle board, the software responsible for managing the vehicle by the operator is also necessary. Its task is to define mission of the vehicle, to start, to stop the mission, to send emergency commands, to monitor vehicle parameters, and to control the vehicle in remotely operated mode. An important objective of the software is also to support development and tests of other software components. To this end, a simulation environment is necessary, i.e. simulation model of the vehicle and all its key devices, the model of the sea environment, and the software to visualize behavior of the vehicle. The paper presents architecture of the software designed for biomimetic autonomous underwater vehicle (BAUV) that is being constructed within the framework of the scientific project financed by Polish National Center of Research and Development.

  20. Geographic Information Systems and Web Page Development

    NASA Technical Reports Server (NTRS)

    Reynolds, Justin

    2004-01-01

    The Facilities Engineering and Architectural Branch is responsible for the design and maintenance of buildings, laboratories, and civil structures. In order to improve efficiency and quality, the FEAB has dedicated itself to establishing a data infrastructure based on Geographic Information Systems (GIS). The value of GIS was explained in an article dating back to 1980 entitled "Need for a Multipurpose Cadastre," which stated, "There is a critical need for a better land-information system in the United States to improve land-conveyance procedures, furnish a basis for equitable taxation, and provide much-needed information for resource management and environmental planning." Scientists and engineers both point to GIS as the solution. What is GIS? According to most textbooks, a Geographic Information System is a class of software that stores, manages, and analyzes mappable features on, above, or below the surface of the earth. GIS software is basically database management software applied to spatial data and information. Simply put, Geographic Information Systems manage, analyze, chart, graph, and map spatial information. At the outset, I was given goals and expectations from my branch and from my mentor with regard to the further implementation of GIS. Those goals are as follows: (1) Continue the development of GIS for the underground structures. (2) Extract and export annotated data from AutoCAD drawing files and construct a database (to serve as a prototype for future work). (3) Examine existing underground record drawings to determine existing and non-existing underground tanks. Once this data was collected and analyzed, I set out on the task of creating a user-friendly database that could be accessed by all members of the branch. It was important that the database be built using programs that most employees already possess, ruling out most AutoCAD-based viewers. Therefore, I set out to create an Access database that translated onto the web using Internet Explorer as the foundation. After some programming, it was possible to view AutoCAD files and other GIS-related applications in Internet Explorer, while providing the user with a variety of editing commands and setting options. I was also given the task of launching a divisional website using Macromedia Flash and other web-development programs.

  1. SAO mission support software and data standards, version 1.0

    NASA Technical Reports Server (NTRS)

    Hsieh, P.

    1993-01-01

    This document defines the software developed by the SAO AXAF Mission Support (MS) Program and defines standards for the software development process and control of data products generated by the software. The SAO MS is tasked to develop and use software to perform a variety of functions in support of the AXAF mission. Software is developed by software engineers and scientists, and commercial off-the-shelf (COTS) software is used either directly or customized through the use of scripts to implement analysis procedures. Software controls real-time laboratory instruments, performs data archiving, displays data, and generates model predictions. Much software is used in the analysis of data to generate data products that are required by the AXAF project, for example, on-orbit mirror performance predictions or detailed characterization of the mirror reflection performance with energy.

  2. An Investigation of Expert Systems Usage for Software Requirements Development in the Strategic Defense Initiative Environment.

    DTIC Science & Technology

    1986-06-10

    the solution of the base could be the solution of the target. If expert systems are to mimic humans, then they should inherently utilize analogy. In the...expert systems environment, the theory of frames for representing knowledge developed partly because humans usually solve problems by first seeing if...Goals," Computer, May 1975, p. 17. 8. A.I. Wasserman, "Some Principles of User Software Engineering for Information Systems," Digest of Papers, COMPCON

  3. Creating Math Videos: Comparing Platforms and Software

    ERIC Educational Resources Information Center

    Abbasian, Reza O.; Sieben, John T.

    2016-01-01

    In this paper we present a short tutorial on creating mini-videos using two platforms--PCs and tablets such as iPads--and software packages that work with these devices. Specifically, we describe the step-by-step process of creating and editing videos using a Wacom Intuos pen-tablet plus Camtasia software on a PC platform and using the software…

  4. Research in Parallel Algorithms and Software for Computational Aerosciences

    DOT National Transportation Integrated Search

    1996-04-01

    Phase I is complete for the development of a Computational Fluid Dynamics code with automatic grid generation and adaptation for the Euler analysis of flow over complex geometries. SPLITFLOW, an unstructured Cartesian grid code developed at Lockheed...

  5. Carbon footprint estimator, phase II : volume I - GASCAP model.

    DOT National Transportation Integrated Search

    2014-03-01

    The GASCAP model was developed to provide a software tool for analysis of the life-cycle GHG emissions associated with the construction and maintenance of transportation projects. This phase of development included techniques for estimating emiss...

  6. MMX-I: data-processing software for multimodal X-ray imaging and tomography

    PubMed Central

    Bergamaschi, Antoine; Medjoubi, Kadda; Messaoudi, Cédric; Marco, Sergio; Somogyi, Andrea

    2016-01-01

    A new multi-platform freeware has been developed for the processing and reconstruction of scanning multi-technique X-ray imaging and tomography datasets. The software platform aims to treat different scanning imaging techniques: X-ray fluorescence, phase, absorption and dark field and any of their combinations, thus providing an easy-to-use data processing tool for the X-ray imaging user community. A dedicated data input stream copes with the input and management of large datasets (several hundred GB) collected during a typical multi-technique fast scan at the Nanoscopium beamline and even on a standard PC. To the authors’ knowledge, this is the first software tool that aims at treating all of the modalities of scanning multi-technique imaging and tomography experiments. PMID:27140159

  7. The ATLAS Tier-3 in Geneva and the Trigger Development Facility

    NASA Astrophysics Data System (ADS)

    Gadomski, S.; Meunier, Y.; Pasche, P.; Baud, J.-P.; ATLAS Collaboration

    2011-12-01

    The ATLAS Tier-3 farm at the University of Geneva provides storage and processing power for analysis of ATLAS data. In addition, the facility is used for development, validation and commissioning of the High Level Trigger of ATLAS [1]. The latter purpose leads to additional requirements on the availability of the latest software and data, which will be presented. The farm is also a part of the WLCG [2], and is available to all members of the ATLAS Virtual Organization. The farm currently provides 268 CPU cores and 177 TB of storage space. A grid Storage Element, implemented with the Disk Pool Manager software [3], is available and integrated with the ATLAS Distributed Data Management system [4]. The batch system can be used directly by local users, or with a grid interface provided by the NorduGrid ARC middleware [5]. In this article we will present the use cases that we support, as well as our experience with the software and hardware we are using. Results of I/O benchmarking tests, which were done for our DPM Storage Element and for the NFS servers we are using, will also be presented.

  8. Converging Work-Talk Patterns in Online Task-Oriented Communities.

    PubMed

    Xuan, Qi; Devanbu, Premkumar; Filkov, Vladimir

    2016-01-01

    Much of what we do is accomplished by working collaboratively with others, and a large portion of our lives is spent working and talking; the patterns embodied in the alternation of working and talking can provide much useful insight into task-oriented social behaviors. The available electronic traces of the different kinds of human activities in online communities are an empirical goldmine that can enable the holistic study and understanding of these social systems. Open Source Software (OSS) projects are prototypical examples of collaborative, task-oriented communities, depending on volunteers for high-quality work. Here, we use sequence analysis methods to identify the work-talk patterns of software developers in online communities of Open Source Software projects. We find that software developers prefer to persist in the same kind of activity, i.e., a string of work activities followed by a string of talk activities and so forth, rather than switching frequently; this tendency strengthens with time, suggesting that developers become more efficient and can work longer with fewer interruptions. This process is accompanied by the formation of community culture: developers' patterns within the same community grow closer with time, while different communities grow relatively more distinct. The emergence of community culture is apparently driven by both "talk" and "work". Finally, we also find that workers with a good balance between "work" and "talk" tend to produce just as much work as those that focus strongly on "work"; however, the former appear more likely to remain active contributors in their communities.

  9. SLR precision analysis for LAGEOS I and II

    NASA Astrophysics Data System (ADS)

    Kizilsu, Gaye; Sahin, Muhammed

    2000-10-01

    This paper deals with the problem of properly weighting satellite observations which are non-uniform in quality. The technique, the variance component estimation method developed by Helmert, was first applied to the 1987 LAGEOS I SLR data by Sahin et al. (1992). This paper investigates the performance of the globally distributed SLR stations using the Helmert type variance component estimation. As well as LAGEOS I data, LAGEOS II data were analysed, in order to compare with the previously analysed 1987 LAGEOS I data. The LAGEOS I and II data used in this research were obtained from the NASA Crustal Dynamics Data Information System (CDDIS), which archives data acquired from stations operated by NASA and by other U.S. and international organizations. The data covers the years 1994, 1995 and 1996. The analysis is based on "full-rate" laser observations, which consist of hundreds to thousands of ranges per satellite pass. The software used is based on the SATAN package (SATellite ANalysis) developed at the Royal Greenwich Observatory in the UK.
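
    The weighting scheme can be illustrated with a small sketch of Helmert-type variance component estimation for a linear model with two observation groups (e.g., two SLR stations). The simplified bookkeeping below (diagonal weights, direct hat-matrix redundancy numbers) is an assumption for illustration, not the SATAN implementation.

        import numpy as np

        def helmert_vce(A, y, groups, n_iter=10):
            """Iteratively estimate one variance factor per observation group."""
            sigma2 = {g: 1.0 for g in set(groups)}          # initial variance factors
            for _ in range(n_iter):
                w = np.array([1.0 / sigma2[g] for g in groups])
                W = np.diag(w)
                N = A.T @ W @ A
                x = np.linalg.solve(N, A.T @ W @ y)
                v = y - A @ x                               # residuals
                # Hat-matrix diagonal gives each observation's redundancy share
                H = A @ np.linalg.solve(N, A.T @ W)
                r = 1.0 - np.diag(H)
                for g in sigma2:
                    idx = np.array([gi == g for gi in groups])
                    sigma2[g] *= float((w[idx] * v[idx] ** 2).sum() / r[idx].sum())
            return x, sigma2

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            A = np.column_stack([np.ones(40), np.linspace(0, 1, 40)])
            groups = ["station_A"] * 20 + ["station_B"] * 20
            noise = np.concatenate([rng.normal(0, 0.01, 20), rng.normal(0, 0.05, 20)])
            y = A @ np.array([1.0, 2.0]) + noise
            x, sigma2 = helmert_vce(A, y, groups)
            print(x, sigma2)   # the factors should separate the 1 cm and 5 cm groups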

  10. Spaceport Processing System Development Lab

    NASA Technical Reports Server (NTRS)

    Dorsey, Michael

    2013-01-01

    The Spaceport Processing System Development Lab (SPSDL), developed and maintained by the Systems Hardware and Engineering Branch (NE-C4), is a development lab with its own private/restricted networks. A private/restricted network is a network with restricted or no communication with other networks. This allows users from different groups to work on their own projects in their own configured environments without interfering with others utilizing resources in the lab. The different networks in the lab have no way to talk to each other because of how they are configured, so how a user configures his software, operating system, or equipment doesn't interfere with or carry over to any of the other networks in the lab. The SPSDL is available for any project at KSC that needs a lab environment. My job in the SPSDL was to assist in maintaining the lab to make sure it is accessible for users. This includes, but is not limited to, making sure the computers in the lab are properly running and patched with updated hardware/software. I was also to assist users who had issues in utilizing the resources in the lab, which could include helping to configure a restricted network for their own environment. All of this was to ensure workers were able to use the SPSDL to work on their projects without difficulty, which would in turn benefit the work done throughout KSC. When I wasn't working in the SPSDL, I would help other coworkers with smaller tasks, which included, but were not limited to, the proper disposal of, relocation of, or search for essential equipment. During free time, I also used NASA's resources to increase my knowledge and skills in a variety of subjects related to my major as a computer engineer, particularly UNIX, networking, and embedded systems.

  11. Development of high performance scientific components for interoperability of computing packages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gulabani, Teena Pratap

    2008-01-01

    Three major high performance quantum chemistry computational packages, NWChem, GAMESS and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software design of each of these packages. Chemistry algorithms are hard and time-consuming to develop; integration of large quantum chemistry packages allows resource sharing and thus avoids reinventing the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called the Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.

  12. 48 CFR 234.7101 - Solicitation provision and contract clause.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Software Data Reporting 234.7101 Solicitation provision and contract clause. (a)(1) Use the provision at 252.234-7003, Notice of Cost and Software Data Reporting System, in all solicitations that include the clause at 252.234-7004, Cost and Software Data Reporting. (2) Use the provision with its Alternate I when...

  13. 48 CFR 234.7101 - Solicitation provision and contract clause.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Software Data Reporting 234.7101 Solicitation provision and contract clause. (a)(1) Use the provision at 252.234-7003, Notice of Cost and Software Data Reporting System, in all solicitations that include the clause at 252.234-7004, Cost and Software Data Reporting. (2) Use the provision with its Alternate I when...

  14. 48 CFR 234.7101 - Solicitation provision and contract clause.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Software Data Reporting 234.7101 Solicitation provision and contract clause. (a)(1) Use the provision at 252.234-7003, Notice of Cost and Software Data Reporting System, in all solicitations that include the clause at 252.234-7004, Cost and Software Data Reporting. (2) Use the provision with its Alternate I when...

  15. 48 CFR 234.7101 - Solicitation provision and contract clause.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Software Data Reporting 234.7101 Solicitation provision and contract clause. (a)(1) Use the provision at 252.234-7003, Notice of Cost and Software Data Reporting System, in all solicitations that include the clause at 252.234-7004, Cost and Software Data Reporting. (2) Use the provision with its Alternate I when...

  16. Enhancing Instruction through Software Infusion.

    ERIC Educational Resources Information Center

    Sia, Archie P.

    The presence of the computer in the classroom is no longer considered an oddity; it has become an ordinary resource for teachers to use for the enhancement of instruction. This paper presents an examination of software infusion, i.e., the use of computer software to enrich instruction in an academic curriculum. The process occurs when a chosen…

  17. Journals May Soon Use Anti-Plagiarism Software on Their Authors

    ERIC Educational Resources Information Center

    Rampell, Catherine

    2008-01-01

    This spring, academic journals may turn the anti-plagiarism software that professors have been using against their students on the professors themselves. CrossRef, a publishing industry association, and the software company iParadigms announced a deal last week to create CrossCheck, an anti-plagiarism program for academic journals. The software…

  18. Towards Portable Large-Scale Image Processing with High-Performance Computing.

    PubMed

    Huo, Yuankai; Blaber, Justin; Damon, Stephen M; Boyd, Brian D; Bao, Shunxing; Parvathaneni, Prasanna; Noguera, Camilo Bermudez; Chaganti, Shikha; Nath, Vishwesh; Greer, Jasmine M; Lyu, Ilwoo; French, William R; Newton, Allen T; Rogers, Baxter P; Landman, Bennett A

    2018-05-03

    High-throughput, large-scale medical image computing demands tight integration of high-performance computing (HPC) infrastructure for data storage, job distribution, and image processing. The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has constructed a large-scale image storage and processing infrastructure that is composed of (1) a large-scale image database using the eXtensible Neuroimaging Archive Toolkit (XNAT), (2) a content-aware job scheduling platform using the Distributed Automation for XNAT pipeline automation tool (DAX), and (3) a wide variety of encapsulated image processing pipelines called "spiders." The VUIIS CCI medical image data storage and processing infrastructure has housed and processed nearly half a million medical image volumes with the Vanderbilt Advanced Computing Center for Research and Education (ACCRE), the HPC facility at Vanderbilt University. The initial deployment was native (i.e., direct installations on a bare-metal server) within the ACCRE hardware and software environments, which led to issues of portability and sustainability. First, it could be laborious to deploy the entire VUIIS CCI medical image data storage and processing infrastructure to another HPC center with different hardware infrastructure, library availability, and software permission policies. Second, the spiders were not developed in an isolated manner, which has led to software dependency issues during system upgrades or remote software installation. To address such issues, herein, we describe recent innovations using containerization techniques with XNAT/DAX which isolate the VUIIS CCI medical image data storage and processing infrastructure from the underlying hardware and software environments. The newly presented XNAT/DAX solution has the following new features: (1) multi-level portability from the system level to the application level, (2) flexible and dynamic software development and expansion, and (3) scalable spider deployment compatible with HPC clusters and local workstations.
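
    As a sketch of the containerized-spider idea, one job launch might look like the following. The image name, mount points, and the spider's command-line interface are assumptions for illustration; the paper's own tooling details are not reproduced here.

        import subprocess
        from pathlib import Path

        def run_spider(image, session_dir, out_dir):
            """Run one containerized processing step on a single imaging session."""
            out_dir.mkdir(parents=True, exist_ok=True)
            cmd = [
                "docker", "run", "--rm",
                "-v", "%s:/input:ro" % session_dir,   # image data from the archive
                "-v", "%s:/output" % out_dir,         # results to be uploaded back
                image,
                "--in", "/input", "--out", "/output",
            ]
            subprocess.run(cmd, check=True)

        if __name__ == "__main__":
            run_spider("example/segmentation-spider:1.0",       # hypothetical image
                       Path("/data/xnat/session001"), Path("/data/out/session001"))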

  19. A Low-Cost Part-Task Flight Training System: An Application of a Head Mounted Display

    DTIC Science & Technology

    1990-12-01

    architecture. The task at hand was to develop a software emulation library that would emulate the function calls used within the Flight and Dog programs. This...represented in two hexadecimal digits for each color. The format of the packed long integer looks like aaggbbrr with each color value representing a...Western Digital ethernet card as the cheapest compatible card available. Good fortune arrived; as I was calling to order the card, I saw an unused card

  20. Development of an impairment-based individualized treatment workflow using an iPad-based software platform.

    PubMed

    Kiran, Swathi; Des Roches, Carrie; Balachandran, Isabel; Ascenso, Elsa

    2014-02-01

    Individuals with language and cognitive deficits following brain damage likely require long-term rehabilitation. Consequently, it is a huge practical problem to provide the continued communication therapy that these individuals require. The present project describes the development of an impairment-based individualized treatment workflow using a software platform called Constant Therapy. This article is organized into two sections. We will first describe the general methods of the treatment workflow for patients involved in this study. There are four steps in this process: (1) the patient's impairment is assessed using standardized tests, (2) the patient is assigned a specific and individualized treatment plan, (3) the patient practices the therapy at home and at the clinic, and (4) the clinician and the patient can analyze the results of the patient's performance remotely and monitor and alter the treatment plan accordingly. The second section provides four case studies that provide a representative sample of participants progressing through their individualized treatment plan. The preliminary results of the patient treatment provide encouraging evidence for the feasibility of a rehabilitation program for individuals with brain damage based on the iPad (Apple Inc., Cupertino, CA).

  1. Polyglot Programming in Applications Used for Genetic Data Analysis

    PubMed Central

    Nowak, Robert M.

    2014-01-01

    Applications used for the analysis of genetic data process large volumes of data with complex algorithms. High performance, flexibility, and a user interface with a web browser are required by these solutions, which can be achieved by using multiple programming languages. In this study, I developed a freely available framework for building software to analyze genetic data, which uses C++, Python, JavaScript, and several libraries. This system was used to build a number of genetic data processing applications and it reduced the time and costs of development. PMID:25197633

  2. Polyglot programming in applications used for genetic data analysis.

    PubMed

    Nowak, Robert M

    2014-01-01

    Applications used for the analysis of genetic data process large volumes of data with complex algorithms. High performance, flexibility, and a user interface with a web browser are required by these solutions, which can be achieved by using multiple programming languages. In this study, I developed a freely available framework for building software to analyze genetic data, which uses C++, Python, JavaScript, and several libraries. This system was used to build a number of genetic data processing applications and it reduced the time and costs of development.
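
    The abstract does not spell out how the language layers connect; one common polyglot pattern consistent with it is a compiled C++ core exposed to Python through a C ABI. The sketch below assumes a hypothetical libaligner.so built from the C++ snippet shown in the comments.

        import ctypes

        # Hypothetical C++ core, compiled separately:
        #   // aligner.cpp -- g++ -shared -fPIC -o libaligner.so aligner.cpp
        #   extern "C" int hamming_distance(const char* a, const char* b, int n) {
        #       int d = 0;
        #       for (int i = 0; i < n; ++i) d += (a[i] != b[i]);
        #       return d;
        #   }

        lib = ctypes.CDLL("./libaligner.so")
        lib.hamming_distance.argtypes = [ctypes.c_char_p, ctypes.c_char_p, ctypes.c_int]
        lib.hamming_distance.restype = ctypes.c_int

        def hamming(a, b):
            """Count mismatches between two equal-length DNA strings via the C++ core."""
            return lib.hamming_distance(a.encode(), b.encode(), len(a))

        print(hamming("ACGTACGT", "ACGTTCGT"))  # -> 1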

  3. Multidisciplinary Optimization Branch Experience Using iSIGHT Software

    NASA Technical Reports Server (NTRS)

    Padula, S. L.; Korte, J. J.; Dunn, H. J.; Salas, A. O.

    1999-01-01

    The Multidisciplinary Optimization (MDO) Branch at NASA Langley is investigating frameworks for supporting multidisciplinary analysis and optimization research. A framework provides software and system services to integrate computational tasks and allows the researcher to concentrate more on the application and less on the programming details. A framework also provides a common working environment and a full range of optimization tools, and so increases the productivity of multidisciplinary research teams. Finally, a framework enables staff members to develop applications for use by disciplinary experts in other organizations. This year, the MDO Branch has gained experience with the iSIGHT framework. This paper describes experiences with four aerospace applications, including: (1) reusable launch vehicle sizing, (2) aerospike nozzle design, (3) low-noise rotorcraft trajectories, and (4) acoustic liner design. Brief overviews of each problem are provided, including the number and type of disciplinary codes and computation time estimates. In addition, the optimization methods, objective functions, design variables, and constraints are described for each problem. For each case, discussions on the advantages and disadvantages of using the iSIGHT framework are provided as well as notes on the ease of use of various advanced features and suggestions for areas of improvement.

  4. Using ProHits to store, annotate and analyze affinity purification - mass spectrometry (AP-MS) data

    PubMed Central

    Liu, Guomin; Zhang, Jianping; Choi, Hyungwon; Lambert, Jean-Philippe; Srikumar, Tharan; Larsen, Brett; Nesvizhskii, Alexey I.; Raught, Brian; Tyers, Mike; Gingras, Anne-Claude

    2012-01-01

    Affinity purification coupled with mass spectrometry (AP-MS) is a robust technique used to identify protein-protein interactions. With recent improvements in sample preparation, and dramatic advances in MS instrumentation speed and sensitivity, this technique is becoming more widely used throughout the scientific community. To meet the needs of research groups both large and small, we have developed software solutions for tracking, scoring and analyzing AP-MS data. Here, we provide details for the installation and utilization of ProHits, a Laboratory Information Management System designed specifically for AP-MS interaction proteomics. This protocol explains: (i) how to install the complete ProHits system, including modules for the management of mass spectrometry files and the analysis of interaction data, and (ii) alternative options for the use of pre-existing search results in simpler versions of ProHits, including a virtual machine implementation of our ProHits Lite software. We also describe how to use the main features of the software to analyze AP-MS data. PMID:22948730

  5. Human factors for capacity building: lessons learned from the OpenMRS implementers network.

    PubMed

    Seebregts, C J; Mamlin, B W; Biondich, P G; Fraser, H S F; Wolfe, B A; Jazayeri, D; Miranda, J; Blaya, J; Sinha, C; Bailey, C T; Kanter, A S

    2010-01-01

    The overall objective of this project was to investigate ways to strengthen the OpenMRS community by (i) developing capacity and implementing a network focusing specifically on the needs of OpenMRS implementers, (ii) strengthening community-driven aspects of OpenMRS and providing a dedicated forum for implementation-specific issues, and (iii) providing regional support for OpenMRS implementations as well as mentorship and training. The methods used included (i) face-to-face networking using meetings and workshops; (ii) online collaboration tools, peer support and mentorship programmes; (iii) capacity and community development programmes; and (iv) community outreach programmes. The community-driven approach, combined with a few simple interventions, has been a key factor in the growth and success of the OpenMRS Implementers Network. It has contributed to implementations in at least twenty-three different countries using basic online tools, and has provided mentorship and peer support through an annual meeting, workshops and an internship programme. The OpenMRS Implementers Network has formed collaborations with several other open source networks and is developing regional OpenMRS Centres of Excellence to provide localized support for OpenMRS development and implementation. These initiatives are increasing the range of functionality and sustainability of open source software in the health domain, resulting in improved adoption and enterprise-readiness. Social organization and capacity development activities are important in growing a successful community-driven open source software model.

  6. NASA's Advanced Multimission Operations System: A Case Study in Formalizing Software Architecture Evolution

    NASA Technical Reports Server (NTRS)

    Barnes, Jeffrey M.

    2011-01-01

    All software systems of significant size and longevity eventually undergo changes to their basic architectural structure. Such changes may be prompted by evolving requirements, changing technology, or other reasons. Whatever the cause, software architecture evolution is commonplace in real world software projects. Recently, software architecture researchers have begun to study this phenomenon in depth. However, this work has suffered from problems of validation; research in this area has tended to make heavy use of toy examples and hypothetical scenarios and has not been well supported by real world examples. To help address this problem, I describe an ongoing effort at the Jet Propulsion Laboratory to re-architect the Advanced Multimission Operations System (AMMOS), which is used to operate NASA's deep-space and astrophysics missions. Based on examination of project documents and interviews with project personnel, I describe the goals and approach of this evolution effort and then present models that capture some of the key architectural changes. Finally, I demonstrate how approaches and formal methods from my previous research in architecture evolution may be applied to this evolution, while using languages and tools already in place at the Jet Propulsion Laboratory.

  7. Parallel software tools at Langley Research Center

    NASA Technical Reports Server (NTRS)

    Moitra, Stuti; Tennille, Geoffrey M.; Lakeotes, Christopher D.; Randall, Donald P.; Arthur, Jarvis J.; Hammond, Dana P.; Mall, Gerald H.

    1993-01-01

    This document gives a brief overview of parallel software tools available on the Intel iPSC/860 parallel computer at Langley Research Center. It is intended to provide a source of information that is somewhat more concise than vendor-supplied material on the purpose and use of various tools. Each of the chapters on tools is organized in a similar manner covering an overview of the functionality, access information, how to effectively use the tool, observations about the tool and how it compares to similar software, known problems or shortfalls with the software, and reference documentation. It is primarily intended for users of the iPSC/860 at Langley Research Center and is appropriate for both the experienced and novice user.

  8. pyam: Python Implementation of YaM

    NASA Technical Reports Server (NTRS)

    Myint, Steven; Jain, Abhinandan

    2012-01-01

    pyam is a software development framework with tools for facilitating the rapid development of software in a concurrent software development environment. pyam provides solutions for development challenges associated with software reuse, managing multiple software configurations, developing software product lines, and multi-platform development and build management. pyam uses release-early, release-often development cycles to allow developers to integrate their changes incrementally into the system on a continual basis. It facilitates the creation and merging of branches to support the isolated development of immature software, to avoid impacting the stability of the development effort. It uses modules and packages to organize and share software across multiple software products, and uses the concepts of link and work modules to reduce sandbox setup times even when the code base is large. One side benefit is the enforcement of strong module-level encapsulation of a module's functionality and interface. This increases design transparency, system stability, and software reuse. pyam is written in Python and is organized as a set of utilities on top of the open-source SVN software version control package. All development software is organized into a collection of modules. pyam packages are defined as sub-collections of the available modules. Developers can set up private sandboxes for module/package development. All module/package development takes place on private SVN branches. High-level pyam commands support the setup, update, and release of modules and packages. Released and pre-built versions of modules are available to developers. Developers can tailor the source/link module mix for their sandboxes so that new sandboxes (even large ones) can be built up easily and quickly by pointing to pre-existing module releases. All inter-module interfaces are publicly exported via links. A minimal, but uniform, convention is used for building modules.
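
    pyam's actual command set is not documented here; the sketch below illustrates the underlying pattern it describes, per-module development branches layered on SVN, using a hypothetical repository layout and module names.

        import subprocess

        REPO = "https://svn.example.org/repo"   # placeholder repository URL

        def svn(*args):
            subprocess.run(["svn"] + list(args), check=True)

        def start_module_branch(module, release, user):
            """Branch a released module so immature work stays off the main line."""
            src = "%s/modules/%s/releases/%s" % (REPO, module, release)
            dst = "%s/modules/%s/branches/%s-dev" % (REPO, module, user)
            svn("copy", src, dst, "-m", "open work branch for %s %s" % (module, release))
            svn("checkout", dst, "sandbox/%s" % module)   # local work module

        if __name__ == "__main__":
            start_module_branch("DynamicsEngine", "R3-01a", "aliced")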

  9. Detection of Cutting Tool Wear using Statistical Analysis and Regression Model

    NASA Astrophysics Data System (ADS)

    Ghani, Jaharah A.; Rizal, Muhammad; Nuawi, Mohd Zaki; Haron, Che Hassan Che; Ramli, Rizauddin

    2010-10-01

    This study presents a new method for detecting cutting tool wear based on measured cutting force signals. A statistics-based method, the Integrated Kurtosis-based Algorithm for Z-Filter technique (I-kaz), was used to develop a regression model and a 3D graphic presentation of the I-kaz 3D coefficient during the machining process. The machining tests were carried out on a Colchester Master Tornado T4 CNC turning machine in dry cutting conditions. A Kistler 9255B dynamometer was used to measure the cutting force signals, which were transmitted, analyzed, and displayed in the DasyLab software. Various force signals from the machining operation were analyzed, and each has its own I-kaz 3D coefficient. This coefficient was examined and its relationship with flank wear land (VB) was determined. A regression model was developed from this relationship, and the results show that the I-kaz 3D coefficient value decreases as tool wear increases. The result is then used for real-time tool wear monitoring.
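
    The abstract does not give the I-kaz 3D formula, so the sketch below uses a representative kurtosis-weighted coefficient as a stand-in and regresses it against hypothetical flank wear values; the synthetic data are arranged to reproduce the reported decreasing trend.

        import numpy as np
        from scipy.stats import kurtosis

        # Stand-in for the I-kaz 3D coefficient: a kurtosis-weighted combination
        # of the three force channels (assumed form, for illustration only).
        def ikaz_like_3d(fx, fy, fz):
            n = len(fx)
            return np.sqrt(sum(kurtosis(c, fisher=False) * np.var(c) ** 2
                               for c in (fx, fy, fz))) / n

        # Synthetic force signals whose energy falls from pass to pass, so the
        # coefficient decreases as the hypothetical flank wear VB grows.
        rng = np.random.default_rng(0)
        passes = [rng.normal(scale=1.5 - 0.25 * k, size=(3, 5000)) for k in range(5)]
        coeff = np.array([ikaz_like_3d(*p) for p in passes])
        vb = np.array([0.05, 0.12, 0.18, 0.25, 0.31])        # hypothetical wear (mm)

        slope, intercept = np.polyfit(vb, coeff, 1)          # linear regression model
        print("coefficient ~ %.4f * VB + %.4f" % (slope, intercept))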

  10. The Anatomy of A.L.I.C.E.

    NASA Astrophysics Data System (ADS)

    Wallace, Richard S.

    This paper is a technical presentation of the Artificial Linguistic Internet Computer Entity (A.L.I.C.E.) and the Artificial Intelligence Markup Language (AIML), set in context by historical and philosophical ruminations on human consciousness. A.L.I.C.E., the first AIML-based personality program, won the Loebner Prize as "the most human computer" at the annual Turing Test contests in 2000, 2001, and 2004. The program, and the organization that develops it, is a product of the world of free software. More than 500 volunteers from around the world have contributed to her development. This paper describes the history of A.L.I.C.E. and AIML free software since 1995, noting that the theme and strategy of deception and pretense upon which AIML is based can be traced through the history of Artificial Intelligence research. The paper goes on to show how to use AIML to create robot personalities like A.L.I.C.E. that pretend to be intelligent and self-aware. The paper winds up with a survey of some of the philosophical literature on the question of consciousness. We consider Searle's Chinese Room and the view that natural language understanding by a computer is impossible. We note that the proposition "consciousness is an illusion" may be undermined by the paradoxes it apparently implies. We conclude that A.L.I.C.E. does pass the Turing Test, at least, to paraphrase Abraham Lincoln, for some of the people some of the time.
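
    A minimal AIML category and the kernel that serves it, in the spirit of the personality-building described above. This sketch assumes the third-party python-aiml package (pip install python-aiml); the category and file name are arbitrary.

        import aiml

        CATEGORY = """<?xml version="1.0" encoding="UTF-8"?>
        <aiml version="1.0">
          <category>
            <pattern>WHO ARE YOU</pattern>
            <template>I am a chatterbot pretending to be intelligent.</template>
          </category>
        </aiml>
        """

        # Write the category to disk, load it, and answer a matching input.
        with open("hello.aiml", "w", encoding="utf-8") as f:
            f.write(CATEGORY)

        kernel = aiml.Kernel()
        kernel.learn("hello.aiml")
        print(kernel.respond("Who are you"))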

  11. Unraveling the development of scientific literacy: Domain-specific inquiry support in a system of cognitive and social interactions

    NASA Astrophysics Data System (ADS)

    Tabak, Iris Ellen

    The goal of this dissertation was to study how to harness technological tools in service of establishing a climate of inquiry in science classrooms. The research is a design experiment drawing on sociocultural and cognitive theory. As part of the BGuILE project, I developed software to support observational research of natural selection, and a complementary high school unit on evolution. Focusing on urban schools, I employed interpretive methods to examine learning as it unfolds in the classroom. I present design principles for realizing a climate of inquiry in technology-infused classrooms. This research contributes to technology design, teaching practice, and educational and cognitive research. My pedagogical approach, Domain-Specific Strategic Support (DSSS), helps students analyze and synthesize primary data by making experts' considerations of content knowledge explicit. Students query data by constructing questions from a selection of comparison and variable types that are privileged in the domain. Students organize their data according to evidence categories that comprise a natural selection argument. I compared the inquiry processes of contrastive cases: an honors group, a regular group, and a lower-track group. DSSS enabled students at different achievement levels to set up systematic comparisons and construct empirically based explanations. Prior knowledge and inquiry experience influenced spontaneous strategy use. Teacher guidance compensated for lack of experience and enabled regular-level students to employ strategies as frequently as honors students. I extend earlier research by proposing a taxonomy of both general and domain-specific reflective inquiry strategies. I argue that software, teacher, and curriculum work in concert to sustain a climate of inquiry. Teachers help realize the potential that technological tools invite. Teachers reinforce software supports by encouraging students to utilize technological tools and by modeling their use. They also establish classroom norms that reflect scientific values. Discussions at the computer allow teachers to provide just-in-time guidance on inquiry actions. Whole-class discussions afford sharing insights across groups and relating findings to normative knowledge. Pretest-to-posttest improvements in both conceptual and strategic knowledge suggest that DSSS helps reconcile the tension that can exist between content and process goals in inquiry settings.

  12. Estimation of water quality parameters of inland and coastal waters with the use of a toolkit for processing of remote sensing data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dekker, A.G.; Hoogenboom, H.J.; Rijkeboer, M.

    1997-06-01

    Deriving thematic maps of water quality parameters from a remote sensing image requires a number of processing steps, such as calibration, atmospheric correction, air/water interface correction, and application of water quality algorithms. A prototype software environment has recently been developed that enables the user to perform and control these processing steps. The main parts of this environment are: (i) access to the MODTRAN 3 radiative transfer code for removing atmospheric and air-water interface influences, (ii) a tool for analyzing algorithms for estimating water quality, and (iii) a spectral database containing apparent and inherent optical properties and associated water quality parameters. The use of the software is illustrated by applying the implemented algorithms for estimating chlorophyll to data from a spectral library of Dutch inland waters with CHL ranging from 1 to 500 µg l⁻¹. The algorithms currently implemented in the Toolkit software are recommended for optically simple waters, but for optically complex waters the development of more advanced retrieval methods is required.
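
    As an illustration of the kind of water quality algorithm such a toolkit applies, the sketch below computes a band-ratio chlorophyll estimate from a reflectance spectrum. The 705/675 nm ratio form and its coefficients are illustrative assumptions, not the Toolkit's own algorithm.

        import numpy as np

        def chl_band_ratio(wavelengths, reflectance, a=117.4, b=-15.3):
            """Estimate chlorophyll-a from a red-edge band ratio (hypothetical fit)."""
            r705 = np.interp(705.0, wavelengths, reflectance)
            r675 = np.interp(675.0, wavelengths, reflectance)
            return a * (r705 / r675) + b        # chlorophyll-a in µg/l

        # Toy spectrum with a reflectance peak near 705 nm, typical of turbid,
        # chlorophyll-rich inland waters.
        wl = np.arange(400.0, 801.0, 5.0)
        refl = 0.02 + 0.01 * np.exp(-((wl - 705.0) / 15.0) ** 2)
        print("CHL ~ %.1f ug/l" % chl_band_ratio(wl, refl))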

  13. Dragging Maintaining Symmetry: Can It Generate the Concept of Inclusivity as well as a Family of Shapes?

    ERIC Educational Resources Information Center

    Forsythe, Susan K.

    2015-01-01

    This article describes a project using Design Based Research methodology to ascertain whether a pedagogical task based on a dynamic figure designed in a Dynamic Geometry Software (DGS) program could be instrumental in developing students' geometrical reasoning. A dragging strategy which I have named "Dragging Maintaining Symmetry" (DMS)…

  14. Tips, Tricks & Techniques: Creating & Teaching with Simple Animation: Making Biology Instruction Come Alive

    ERIC Educational Resources Information Center

    Zanin, Mary K. B.

    2015-01-01

    Over the years, many of my students have reported that they enjoy lectures that include short, simple animations. To keep students engaged, I have developed a small set of teaching animations using PowerPoint and Camtasia Studio software packages. A survey of students who learned four difficult topics with traditional written lessons and with…

  15. Specializing architectures for the type 2 diabetes mellitus care use cases with a focus on process management.

    PubMed

    Uribe, Gustavo A; Blobel, Bernd; López, Diego M; Ruiz, Alonso A

    2015-01-01

    The development of software supporting inter-disciplinary systems such as type 2 diabetes mellitus care requires the deployment of methodologies designed for this type of interoperability. The GCM framework allows the architectural description of such systems and the development of software solutions based on it. The first step of the GCM methodology is the definition of a generic architecture, followed by its specialization for specific use cases. This paper describes the specialization of the generic architecture of a system supporting type 2 diabetes mellitus glycemic control for a pharmacotherapy use case. It focuses on the behavioral aspect of the system, i.e. the policy domain and the definition of the rules governing the system. The design of this architecture reflects the inter-disciplinary nature of the methodology. Finally, the resulting architecture allows building adaptive, intelligent and complete systems.

  16. General consumer communication tools for improved image management and communication in medicine.

    PubMed

    Rosset, Chantal; Rosset, Antoine; Ratib, Osman

    2005-12-01

    We elected to explore new technologies emerging on the general consumer market that can improve and facilitate image and data communication in medical and clinical environments. These new technologies developed for communication and storage of data can improve user convenience and facilitate the communication and transport of images and related data beyond the usual limits and restrictions of a traditional picture archiving and communication system (PACS) network. We specifically tested and implemented three new technologies provided on Apple computer platforms. (1) We adopted the iPod, an MP3 portable player with hard disk storage, to easily and quickly move large numbers of DICOM images. (2) We adopted iChat, a videoconference and instant-messaging software, to transmit DICOM images in real time to a distant computer for teleradiology conferencing. (3) Finally, we developed a direct secure interface to the iDisk service, a file-sharing service based on WebDAV technology, to send and share DICOM files between distant computers. These three technologies were integrated in a new open-source image navigation and display software called OsiriX, allowing manipulation and communication of multimodality and multidimensional DICOM image data sets. This software is freely available as an open-source project at http://homepage.mac.com/rossetantoine/OsiriX. Our experience showed that the implementation of these technologies allowed us to significantly enhance the existing PACS with valuable new features without any additional investment or the need for complex extensions of our infrastructure. The added features such as teleradiology, secure and convenient image and data communication, and the use of external data storage services open the gate to a much broader extension of our imaging infrastructure to the outside world.
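
    The iDisk exchange rests on plain WebDAV, so pushing a DICOM file amounts to an HTTP PUT. A minimal sketch follows; the URL and credentials are placeholders, and a real deployment would add anonymization and encryption.

        import requests

        def put_dicom(path, url, user, password):
            """Upload one DICOM file to a WebDAV share via HTTP PUT."""
            with open(path, "rb") as f:
                r = requests.put(url, data=f, auth=(user, password),
                                 headers={"Content-Type": "application/dicom"})
            r.raise_for_status()

        put_dicom("study/IM0001.dcm",
                  "https://dav.example.org/shared/IM0001.dcm",   # placeholder share
                  "radiologist", "secret")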

  17. Warning: Projects May Be Closer than They Appear

    NASA Technical Reports Server (NTRS)

    Africa, Colby

    2004-01-01

    I had been working for two years as the technical product manager for a large software company, when their partner company gave me a call. They needed good software engineers to customize a new version of software, and they thought I was their guy. They told me what they wanted to do to the software, and they even showed me some prototypes. Their idea was to take the basic software tool that the large company was producing and make it more accessible to the customer. They would do this by building in flexibility based on user skill level and organizational maturity. I thought that was a fascinating approach, and I bought into it in a big way. I decided to leave my job and join up with the smaller company as their director of software engineering.

  18. The Neurological Basis and Potential Modification of Emotional Intelligence through Affective/Behavioral Training

    DTIC Science & Technology

    2010-10-01

    facial trustworthiness; facial displays of anger) presented subliminally. Furthermore, the responsiveness of these regions to subliminal stimulation...develop, or program the computerized stimulation paradigms for use during functional neuroimaging (i.e., MJT; BMAT; EFAT). These paradigms will be...programming began on the computerized functional MRI stimulation paradigms using E-Prime software. • Quarter #2: Programming of all computerized functional

  19. CADBIT II - Computer-Aided Design for Built-In Test. Volume 1

    DTIC Science & Technology

    1993-06-01

    data provided in the CADBIT I Final Report, as indicated in Figure 1.2. CADBIT II implements system concept, requirements, and data developed during...CADBIT II software was developed using de facto computer standards including Unix, C, and the X Windows-based OSF/Motif graphical user interface...export connectivity information. Design Architect is a package for designers that includes schematic capture, a VHDL editor, and libraries of digital

  20. Element Load Data Processor (ELDAP) Users Manual

    NASA Technical Reports Server (NTRS)

    Ramsey, John K., Jr.; Ramsey, John K., Sr.

    2015-01-01

    Often, the shear and tensile forces and moments are extracted from finite element analyses to be used in off-line calculations for evaluating the integrity of structural connections involving bolts, rivets, and welds. Usually the maximum forces and moments are desired for use in the calculations. In situations where there are numerous structural connections of interest for numerous load cases, the effort in finding the true maximum force and/or moment combinations among all fasteners and welds and load cases becomes difficult. The Element Load Data Processor (ELDAP) software described herein makes this effort manageable. This software eliminates the possibility of overlooking the worst-case forces and moments that could result in erroneous positive margins of safety and/or selecting inconsistent combinations of forces and moments resulting in false negative margins of safety. In addition to forces and moments, any scalar quantity output in a PATRAN report file may be evaluated with this software. This software was originally written to fill an urgent need during the structural analysis of the Ares I-X Interstage segment. As such, this software was coded in a straightforward manner with no effort made to optimize or minimize code or to develop a graphical user interface.
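
    A minimal sketch of the worst-case scan ELDAP automates, assuming a hypothetical CSV export rather than the actual PATRAN report format: for each fastener, keep the single load case with the worst combined shear/tension, so the retained forces and moments stay mutually consistent rather than being mixed across cases.

        import csv

        # Hypothetical columns (not the real PATRAN layout):
        # element,load_case,shear,tension,moment
        def worst_cases(report_csv):
            best = {}                      # element id -> (severity, full row)
            with open(report_csv, newline="") as f:
                for row in csv.DictReader(f):
                    severity = (float(row["shear"]) ** 2
                                + float(row["tension"]) ** 2) ** 0.5
                    eid = row["element"]
                    if eid not in best or severity > best[eid][0]:
                        # keep the whole row so all quantities come from one case
                        best[eid] = (severity, row)
            return {eid: row for eid, (_, row) in best.items()}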

  1. Data Collection with Linux in the Undergraduate Physics Lab

    NASA Astrophysics Data System (ADS)

    Ramey, R. Dwayne

    2004-11-01

    Electronic data devices such as photogates can greatly facilitate data collection in the undergraduate physics laboratory. Unfortunately, these devices have several practical drawbacks. While the photogates themselves are not particularly expensive, manufacturers of these devices have created intermediary hardware devices for data buffering and manipulation. These devices, while useful in some contexts, greatly increase the overall price of data collection and, through the use of proprietary software, limit the ability of the end user to customize the software. As an alternative, I outline the procedure for establishing a computer-based data collection system that consists of open-source software and user-constructed connections. The data collection system consists of the wiring needed to connect a data device to a computer and the software needed to collect and manipulate data. Data devices can be connected to a computer either through the USB port or through the gameport of a sound card. Software capable of collecting and manipulating the data from a photogate-type device on a Linux system has been developed and will be discussed. Results for typical undergraduate photogate-based experiments will be shown, and error limits and data collection rates will be discussed for both the gameport and USB connections.
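
    For the gameport hookup, a photogate wired to a button line shows up as a standard Linux joystick device. The sketch below times two gate events using the documented Linux joystick event layout (u32 time in ms, s16 value, u8 type, u8 number); the device path is an assumption about the local setup.

        import struct

        EVENT_FMT = "IhBB"                 # Linux js_event structure
        EVENT_SIZE = struct.calcsize(EVENT_FMT)
        JS_EVENT_BUTTON = 0x01

        with open("/dev/input/js0", "rb") as js:
            timestamps = []
            while len(timestamps) < 2:     # wait for two gate interruptions
                t_ms, value, ev_type, number = struct.unpack(
                    EVENT_FMT, js.read(EVENT_SIZE))
                if ev_type & JS_EVENT_BUTTON and value == 1:
                    timestamps.append(t_ms)
            print("gate interval: %.4f s" % ((timestamps[1] - timestamps[0]) / 1000.0))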

  2. Building a Snow Data Management System using Open Source Software (and IDL)

    NASA Astrophysics Data System (ADS)

    Goodale, C. E.; Mattmann, C. A.; Ramirez, P.; Hart, A. F.; Painter, T.; Zimdars, P. A.; Bryant, A.; Brodzik, M.; Skiles, M.; Seidel, F. C.; Rittger, K. E.

    2012-12-01

    At NASA's Jet Propulsion Laboratory, free and open source software is used every day to support a wide range of projects, from planetary to climate to research and development. In this abstract I will discuss the key role that open source software has played in building a robust science data processing pipeline for snow hydrology research, and how the system is also able to leverage programs written in IDL, making JPL's Snow Data System a hybrid of open source and proprietary software. Main points: (1) the design of the Snow Data System, illustrating how the collection of sub-systems is combined into a complete data processing pipeline; (2) the challenges of moving from a single algorithm on a laptop to running hundreds of parallel algorithms on a cluster of servers, including lessons learned about code changes, software-license related challenges, and storage requirements; (3) system evolution, from data archiving, to data processing, to data on a map, to near-real-time products and maps; (4) the road map for the next 6 months, including how easily we re-used the snowDS code base to support the Airborne Snow Observatory mission. Software in use and licenses: IDL, used for pre- and post-processing of data, licensed under a proprietary license held by Exelis; Apache OODT, used for data management and workflow processing, licensed under the Apache License Version 2; GDAL, a geospatial data processing library currently used for data re-projection, licensed under the X/MIT license; GeoServer, a WMS server, licensed under the General Public License Version 2.0; Leaflet.js, a JavaScript web mapping library, licensed under the Berkeley Software Distribution License; Python, glue code and miscellaneous data processing support, licensed under the Python Software Foundation License; Perl, a script wrapper for running the SCAG algorithm, licensed under the General Public License Version 3; PHP, front-end web application programming, licensed under the PHP License Version 3.01.
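
    As an example of the GDAL re-projection step listed above, a single gdal.Warp call can re-grid a snow-cover product into geographic coordinates. File names and the target EPSG code are placeholders, not the pipeline's actual configuration.

        from osgeo import gdal

        gdal.UseExceptions()
        # Warp a snow-cover grid into WGS84; nearest-neighbour resampling
        # preserves categorical (snow / no-snow) pixel values.
        gdal.Warp("snow_cover_wgs84.tif", "snow_cover_sinusoidal.tif",
                  dstSRS="EPSG:4326", resampleAlg="near")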

  3. The Evolution of On-Board Emergency Training for the International Space Station Crew

    NASA Technical Reports Server (NTRS)

    LaBuff, Skyler

    2015-01-01

    The crew of the International Space Station (ISS) receives extensive ground training in order to safely and effectively respond to any potential emergency event while on orbit, but few people realize that their training is not concluded when they launch into space. The evolution of the emergency On-Board Training events (OBTs) has recently moved from paper "scripts" to an intranet-based software simulation that allows the crew, as well as the flight control teams in Mission Control Centers across the world, to share in an improved and more realistic training event. This emergency OBT simulator ensures that the participants experience the training event as it unfolds, completely unaware of the type, location, or severity of the simulated emergency until the scenario begins. The crew interfaces with the simulation software via iPads that they keep with them as they translate through the ISS modules, receiving prompts and information as they proceed through the response. Personnel in the control centers bring up the simulation via an intranet browser at their console workstations, and can view additional telemetry signatures in simulated ground displays in order to assist the crew and communicate vital information to them as applicable. The Chief Training Officers and emergency instructors set the simulation in motion, choosing the type of emergency (rapid depressurization, fire, or toxic atmosphere) and specific initial conditions to emphasize the desired training objectives. Project development, testing, and implementation were a collaborative effort between ISS emergency instructors, Chief Training Officers, Flight Directors, and the Crew Office, using commercial off-the-shelf (COTS) hardware along with simulation software created in-house. Due to the success of the Emergency OBT simulator, the already-developed software has been leveraged and repurposed to develop a new emulator used during fire-response ground training to deliver the data that the crew receives from the handheld Compound Specific Analyzer for Combustion Products (CSA-CP). This CSA-CP emulator makes use of a portion of the codebase from the Emergency OBT simulator dealing with atmospheric contamination during fire scenarios, and feeds various data signatures to the crew via an iPod Touch with a flight-like CSA-CP display. These innovative simulations, which make use of COTS hardware with custom in-house software, have yielded drastic improvements to emergency training effectiveness and risk reduction for ISS crew and flight control teams during on-orbit and ground training events.

  4. What is Microsoft EMET and Why Should I Care?

    DTIC Science & Technology

    2014-10-22

    Carnegie Mellon University operates the Software Engineering Institute, a federally funded research and development center sponsored by...

  5. Autonomous Real Time Requirements Tracing

    NASA Technical Reports Server (NTRS)

    Plattsmier, George I.; Stetson, Howard K.

    2014-01-01

    One of the more challenging aspects of software development is the ability to verify and validate the functional software requirements dictated by the Software Requirements Specification (SRS) and the Software Detail Design (SDD). Ensuring the software has achieved the intended requirements is the responsibility of the Software Quality team and the Software Test team. The utilization of Timeliner-TLX(sup TM) Auto-Procedures for relocating ground operations positions to ISS automated on-board operations has begun the transition that would be required for manned deep space missions with minimal crew requirements. This transition also moves the auto-procedures from the procedure realm into the flight software arena and as such the operational requirements and testing will be more structured and rigorous. The auto-procedures would be required to meet NASA software standards as specified in the Software Safety Standard (NASA-STD-8719), the Software Engineering Requirements (NPR 7150), the Software Assurance Standard (NASA-STD-8739) and also the Human Rating Requirements (NPR-8705). The Autonomous Fluid Transfer System (AFTS) test-bed utilizes the Timeliner-TLX(sup TM) Language for development of autonomous command and control software. The Timeliner-TLX(sup TM) system has the unique feature of providing the current line of the statement in execution during real-time execution of the software. The feature of execution line number internal reporting unlocks the capability of monitoring the execution autonomously by use of a companion Timeliner-TLX(sup TM) sequence, as the line number reporting is embedded inside the Timeliner-TLX(sup TM) execution engine. This negates I/O processing of this type of data, as the line number status of executing sequences is built in as a function reference. This paper will outline the design and capabilities of the AFTS Autonomous Requirements Tracker, which traces and logs SRS requirements as they are being met during real-time execution of the targeted system. It is envisioned that real-time requirements tracing will greatly assist the movement of auto-procedures to flight software, enhancing the software assurance of auto-procedures and also their acceptance as reliable commanders.
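
    Timeliner-TLX internals are proprietary, but the line-number-reporting idea can be sketched in plain Python with sys.settrace: a companion tracer logs each SRS requirement the moment the line that satisfies it executes. The sequence, the requirement IDs, and their mapping to lines are hypothetical.

        import sys

        REQUIREMENT_LINES = {}             # (function name, line number) -> SRS id
        met = set()

        def tracer(frame, event, arg):
            """Log a requirement the first time its mapped line executes."""
            if event == "line":
                req = REQUIREMENT_LINES.get((frame.f_code.co_name, frame.f_lineno))
                if req and req not in met:
                    met.add(req)
                    print("requirement met: %s (line %d)" % (req, frame.f_lineno))
            return tracer

        def transfer_sequence():           # stand-in for the sequence under test
            valve_open = True              # suppose this line satisfies SRS-041
            flow_ok = valve_open           # and this one SRS-042
            return flow_ok

        base = transfer_sequence.__code__.co_firstlineno
        REQUIREMENT_LINES.update({
            ("transfer_sequence", base + 1): "SRS-041 open transfer valve",
            ("transfer_sequence", base + 2): "SRS-042 verify flow",
        })

        sys.settrace(tracer)
        transfer_sequence()
        sys.settrace(None)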

  6. Autonomous Real Time Requirements Tracing

    NASA Technical Reports Server (NTRS)

    Plattsmier, George; Stetson, Howard

    2014-01-01

    One of the more challenging aspects of software development is the ability to verify and validate the functional software requirements dictated by the Software Requirements Specification (SRS) and the Software Detail Design (SDD). Ensuring the software has achieved the intended requirements is the responsibility of the Software Quality team and the Software Test team. The utilization of Timeliner-TLX(sup TM) Auto-Procedures for relocating ground operations positions to ISS automated on-board operations has begun the transition that would be required for manned deep space missions with minimal crew requirements. This transition also moves the auto-procedures from the procedure realm into the flight software arena and as such the operational requirements and testing will be more structured and rigorous. The auto-procedures would be required to meet NASA software standards as specified in the Software Safety Standard (NASA-STD-8719), the Software Engineering Requirements (NPR 7150), the Software Assurance Standard (NASA-STD-8739) and also the Human Rating Requirements (NPR-8705). The Autonomous Fluid Transfer System (AFTS) test-bed utilizes the Timeliner-TLX(sup TM) Language for development of autonomous command and control software. The Timeliner-TLX(sup TM) system has the unique feature of providing the current line of the statement in execution during real-time execution of the software. The feature of execution line number internal reporting unlocks the capability of monitoring the execution autonomously by use of a companion Timeliner-TLX(sup TM) sequence, as the line number reporting is embedded inside the Timeliner-TLX(sup TM) execution engine. This negates I/O processing of this type of data, as the line number status of executing sequences is built in as a function reference. This paper will outline the design and capabilities of the AFTS Autonomous Requirements Tracker, which traces and logs SRS requirements as they are being met during real-time execution of the targeted system. It is envisioned that real-time requirements tracing will greatly assist the movement of auto-procedures to flight software, enhancing the software assurance of auto-procedures and also their acceptance as reliable commanders.

  7. Structural Area Inspection Frequency Evaluation (SAIFE). Volume 4. Software Documentation and User’s Manual. Book 1 - Initial Program

    DTIC Science & Technology

    1978-04-01

    ... the number of structural modifications developed is printed. This number includes modifications because of fatigue ... failures or aircraft service ... ACRVT - This array contains the identification numbers of the ... aircraft. APID - This array contains the ...

  8. Software Development as Music Education Research

    ERIC Educational Resources Information Center

    Brown, Andrew R.

    2007-01-01

    This paper discusses how software development can be used as a method for music education research. It explains how software development can externalize ideas, stimulate action and reflection, and provide evidence to support the educative value of new software-based experiences. Parallels between the interactive software development process and…

  9. Architectural Heritage Documentation Using a Low-Cost UAV with a Fisheye Lens: Otag-i Humayun in Istanbul as a Case Study

    NASA Astrophysics Data System (ADS)

    Yastikli, N.; Özerdem, Ö. Z.

    2017-11-01

    The digital documentation of architectural heritage is important for monitoring, preserving, and managing heritage sites, as well as for 3D BIM modelling and time-space VR (virtual reality) applications. Unmanned aerial vehicles (UAVs) have been widely used in these applications thanks to rapid developments in technology which enable images with resolutions at the millimeter level. Moreover, it has become possible to produce highly accurate 3D point clouds with structure from motion (SfM) and multi-view stereo (MVS), and to obtain a surface reconstruction of a realistic 3D architectural heritage model by using high-overlap images and 3D modeling software such as ContextCapture, Pix4Dmapper, and PhotoScan. In this study, the digital documentation of Otag-i Humayun (the Ottoman Empire Sultan's summer palace), located in Davutpaşa, Istanbul/Turkey, is undertaken using a low-cost UAV. The data collection was made with the low-cost 3DR Solo UAV carrying a GoPro Hero 4 camera with a fisheye lens. The data processing was accomplished using the commercial Pix4D software. Dense point clouds, a true orthophoto, and a 3D solid model of the Otag-i Humayun were produced as results, and the quality of the produced point clouds was checked. The results obtained from Otag-i Humayun in Istanbul proved that a low-cost UAV with a fisheye lens can be successfully used for architectural heritage documentation.

  10. A Comparison of Authoring Software for Developing Mathematics Self-Learning Software Packages.

    ERIC Educational Resources Information Center

    Suen, Che-yin; Pok, Yang-ming

    Four years ago, the authors started to develop a self-paced mathematics learning software called NPMaths by using an authoring package called Tencore. However, NPMaths had some weak points. A development team was hence formed to develop similar software called Mathematics On Line. This time the team used another development language called…

  11. Arterial pressure-based cardiac output monitoring: a multicenter validation of the third-generation software in septic patients.

    PubMed

    De Backer, Daniel; Marx, Gernot; Tan, Andrew; Junker, Christopher; Van Nuffelen, Marc; Hüter, Lars; Ching, Willy; Michard, Frédéric; Vincent, Jean-Louis

    2011-02-01

    Second-generation FloTrac software has been shown to reliably measure cardiac output (CO) in cardiac surgical patients. However, concerns have been raised regarding its accuracy in vasoplegic states. The aim of the present multicenter study was to investigate the accuracy of the third-generation software in patients with sepsis, particularly when total systemic vascular resistance (TSVR) is low. Fifty-eight septic patients were included in this prospective observational study in four university-affiliated ICUs. Reference CO was measured by bolus pulmonary thermodilution (iCO) using 3-5 cold saline boluses. Simultaneously, CO was computed from the arterial pressure curve recorded on a computer using the second-generation (CO(G2)) and third-generation (CO(G3)) FloTrac software. CO was also measured by semi-continuous pulmonary thermodilution (CCO). A total of 401 simultaneous measurements of iCO, CO(G2), CO(G3), and CCO were recorded. The mean (95%CI) biases between CO(G2) and iCO, CO(G3) and iCO, and CCO and iCO were -10 (-15 to -5)% [-0.8 (-1.1 to -0.4) L/min], 0 (-4 to 4)% [0 (-0.3 to 0.3) L/min], and 9 (6-13)% [0.7 (0.5-1.0) L/min], respectively. The percentage errors were 29 (20-37)% for CO(G2), 30 (24-37)% for CO(G3), and 28 (22-34)% for CCO. The difference between iCO and CO(G2) was significantly correlated with TSVR (r(2) = 0.37, p < 0.0001). A very weak (r(2) = 0.05) relationship was also observed for the difference between iCO and CO(G3). In patients with sepsis, the third-generation FloTrac software is more accurate, as precise, and less influenced by TSVR than the second-generation software.

  12. 31 CFR 560.511 - Exportation or supply of insubstantial United States content for use in foreign-made products or...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...-made end product: (i) U.S.-origin goods (excluding software) comprise less than 10 percent of the foreign-made good (excluding software); (ii) U.S.-origin software comprises less than 10 percent of the foreign-made software; (iii) U.S.-origin technology comprises less than 10 percent of the foreign-made...

  13. Semi-automatic computerized approach to radiological quantification in rheumatoid arthritis

    NASA Astrophysics Data System (ADS)

    Steiner, Wolfgang; Schoeffmann, Sylvia; Prommegger, Andrea; Boegl, Karl; Klinger, Thomas; Peloschek, Philipp; Kainberger, Franz

    2004-04-01

    Rheumatoid arthritis (RA) is a common systemic disease predominantly involving the joints. Precise diagnosis and follow-up therapy require objective quantification. For this purpose, radiological analyses using standardized scoring systems are considered the most appropriate method. The aim of our study was to develop semi-automatic image analysis software especially applicable to the scoring of joints in rheumatic disorders. The X-Ray RheumaCoach software provides several scoring systems (the Larsen score and the Ratingen-Rau score) which can be applied by the scorer. In addition to the qualitative assessment of joints performed by the radiologist, semi-automatic image analysis for joint detection and for measurements of bone diameters and swollen tissue supports the image assessment process. More than 3000 radiographs of hands and feet from more than 200 RA patients were collected, analyzed, and statistically evaluated. Radiographs were quantified using the conventional paper-based Larsen score and the X-Ray RheumaCoach software. The use of the software shortened the scoring time by about 25 percent and reduced the rate of erroneous scorings in all our studies. Compared to paper-based scoring methods, the X-Ray RheumaCoach software offers several advantages: (i) structured data analysis and input that minimizes variance by standardization, (ii) faster and more precise calculation of sum scores and indices, (iii) permanent data storage and fast access to the software's database, (iv) the possibility of cross-calculation to other scores, (v) semi-automatic assessment of images, and (vi) reliable documentation of results in the form of graphical printouts.

  14. Providing an empirical basis for optimizing the verification and testing phases of software development

    NASA Technical Reports Server (NTRS)

    Briand, Lionel C.; Basili, Victor R.; Hetmanski, Christopher J.

    1992-01-01

    Applying equal testing and verification effort to all parts of a software system is not very efficient, especially when resources are limited and scheduling is tight. Therefore, one needs to be able to differentiate low from high fault density components so that the testing/verification effort can be concentrated where needed. Such a strategy is expected to detect more faults and thus improve the resulting reliability of the overall system. This paper presents an alternative approach for constructing such models that is intended to fulfill specific software engineering needs (i.e., dealing with partial/incomplete information and creating models that are easy to interpret). Our approach to classification is as follows: (1) measure the software system to be considered; and (2) build multivariate stochastic models for prediction, as sketched below. We present experimental results obtained by classifying FORTRAN components developed at NASA/GSFC into two fault density classes: low and high. We also evaluate the accuracy of the model and the insights it provides into the software process.
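
    As a rough illustration of the classification step only, the sketch below fits a stand-in model (a logistic regression on synthetic component metrics) to separate low from high fault-density components; the paper's actual models are multivariate stochastic models built from NASA/GSFC measurement data, which are not reproduced here.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        # Synthetic component metrics (e.g., size, complexity, revisions);
        # purely illustrative, not the NASA/GSFC measurements.
        X = rng.normal(size=(200, 3))
        # Synthetic labels: heavier metric load -> high fault-density class.
        y = (X @ np.array([0.8, 1.2, 0.5]) + rng.normal(0, 1, 200) > 0.5).astype(int)

        # Stand-in for the paper's multivariate stochastic models.
        clf = LogisticRegression().fit(X[:150], y[:150])
        print("holdout accuracy:", clf.score(X[150:], y[150:]))
        print("P(high fault density):", clf.predict_proba([[1.0, 2.0, 0.3]])[0, 1])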

  15. Interactive Computer-Enhanced Remote Viewing System (ICERVS): Final report, November 1994--September 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1997-05-01

    The Interactive Computer-Enhanced Remote Viewing System (ICERVS) is a software tool for complex three-dimensional (3-D) visualization and modeling. Its primary purpose is to facilitate the use of robotic and telerobotic systems in remote and/or hazardous environments, where spatial information is provided by 3-D mapping sensors. ICERVS provides a robust, interactive system for viewing sensor data in 3-D and combines this with interactive geometric modeling capabilities that allow an operator to construct CAD models to match the remote environment. Part I of this report traces the development of ICERVS through three evolutionary phases: (1) development of first-generation software to render orthogonal view displays and wireframe models; (2) expansion of this software to include interactive viewpoint control, surface-shaded graphics, material (scalar and nonscalar) property data, cut/slice planes, color and visibility mapping, and generalized object models; (3) demonstration of ICERVS as a tool for the remediation of underground storage tanks (USTs) and the dismantlement of contaminated processing facilities. Part II of this report details the software design of ICERVS, with particular emphasis on its object-oriented architecture and user interface.

  16. Automated Geospatial Watershed Assessment (AGWA) 3.0 Software Tool

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment (AGWA) tool has been developed under an interagency research agreement between the U.S. Environmental Protection Agency, Office of Research and Development, and the U.S. Department of Agriculture, Agricultural Research Service. AGWA i...

  17. Modeling and analysis of selected space station communications and tracking subsystems

    NASA Technical Reports Server (NTRS)

    Richmond, Elmer Raydean

    1993-01-01

    The Communications and Tracking System on board Space Station Freedom (SSF) provides space-to-ground, space-to-space, audio, and video communications, as well as tracking data reception and processing services. Each major category of service is provided by a communications subsystem which is controlled and monitored by software. Among these subsystems, the Assembly/Contingency Subsystem (ACS) and the Space-to-Ground Subsystem (SGS) provide communications with the ground via the Tracking and Data Relay Satellite (TDRS) System. The ACS is effectively SSF's command link, while the SGS is primarily intended as the data link for SSF payloads. The research activities of this project focused on the ACS and SGS antenna management algorithms identified in the Flight System Software Requirements (FSSR) documentation, including: (1) software modeling and evaluation of antenna management (positioning) algorithms; and (2) analysis and investigation of selected variables and parameters of these antenna management algorithms, i.e., descriptions and definitions of their ranges, scopes, and dimensions. In a related activity, to assist those responsible for monitoring the development of this flight system software, a brief summary of software metrics concepts, terms, measures, and uses was prepared.

  18. 78 FR 47012 - Developing Software Life Cycle Processes Used in Safety Systems of Nuclear Power Plants

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Developing Software Life Cycle Processes Used in... revised regulatory guide (RG), revision 1 of RG 1.173, ``Developing Software Life Cycle Processes for... Developing a Software Project Life Cycle Process,'' issued 2006, with the clarifications and exceptions as...

  19. The winding road to being a code monkey

    NASA Astrophysics Data System (ADS)

    Sarahan, Michael

    2017-09-01

    I am now a software engineer at a company that provides data analytics services, and helps support the open source data science community. I have been a computer nerd for a very long time, but it was my CEU experience at Texas A&M with Sherry Yennello (2003-2005) that helped me put my nerd skills to productive use. My project then was simulation of pulse shape discrimination electronics, and it was an excellent introduction to core computational concerns, such as digitization: when you see a line on the screen, that's not really how the computer sees it. I wandered in graduate school through a chemistry program into using electron microscopes. My programming interest got me into image and signal processing, which led naturally to jobs in analyzing data, and also in acquiring data. Throughout, it was always difficult just to make software work. I got pretty good at making it work. That's what I do for a living now - package software so that it is easy for other people to do great science with.

  20. Virtual Instrument Simulator for CERES

    NASA Technical Reports Server (NTRS)

    Chapman, John J.

    1997-01-01

    A benchtop virtual instrument simulator for CERES (Clouds and the Earth's Radiant Energy System) has been built at NASA Langley Research Center in Hampton, VA. The CERES instruments will fly on several Earth-orbiting platforms, notably NASDA's Tropical Rainfall Measuring Mission (TRMM) and NASA's Earth Observing System (EOS) satellites. CERES measures top-of-the-atmosphere radiative fluxes using microprocessor-controlled scanning radiometers. The CERES Virtual Instrument Simulator consists of electronic circuitry identical to the flight unit's twin microprocessors and telemetry interface to the supporting spacecraft electronics, plus two personal computers (PCs) connected to the I/O ports that control the azimuth and elevation gimbals. Software consists of the unmodified TRW-developed flight code and ground support software, which serves as the instrument monitor, and NASA/TRW-developed engineering models of the scanners. The CERES Instrument Simulator will serve as a testbed for testing custom instrument commands intended to solve in-flight anomalies of the instruments which could arise during the CERES mission. One of the supporting computers supports the telemetry display which monitors the simulator microprocessors during the development and testing of custom instrument commands. The CERES engineering development software models have been modified to provide a virtual instrument running on a second supporting computer linked in real time to the instrument flight microprocessor control ports. The CERES Instrument Simulator will be used to verify memory uploads by the CERES Flight Operations Team at NASA. Plots of the virtual scanner models match the actual instrument scan plots. A high-speed logic analyzer has been used to track the performance of the flight microprocessor. The concept of using an identical but non-flight-qualified microprocessor and electronics ensemble linked to a virtual instrument with identical system software affords a relatively inexpensive simulation system capable of high fidelity.

  1. Knowledge and utilization of computer-software for statistics among Nigerian dentists.

    PubMed

    Chukwuneke, F N; Anyanechi, C E; Obiakor, A O; Amobi, O; Onyejiaka, N; Alamba, I

    2013-01-01

    The use of computer software for statistical analysis has transformed health information and data to their simplest form in the areas of access, storage, retrieval, and analysis in the field of research. This survey was therefore carried out to assess the level of knowledge and utilization of computer software for statistical analysis among dental researchers in eastern Nigeria. Questionnaires on the use of computer software for statistical analysis were randomly distributed to 65 practicing dental surgeons with more than 5 years of experience in the tertiary academic hospitals in eastern Nigeria. The focus was on: years of clinical experience; research work experience; and knowledge and application of computer software for data processing and statistical analysis. Sixty-two (62/65; 95.4%) of these questionnaires were returned anonymously and used in our data analysis. Twenty-nine (29/62; 46.8%) respondents fell within those with 5-10 years of clinical experience, of whom none had completed the specialist training programme. Practitioners with more than 10 years of clinical experience numbered 33 (33/62; 53.2%), of whom 15 (15/33; 45.5%) were specialists, representing 24.2% (15/62) of the total number of respondents. All 15 specialists were actively involved in research activities, and only five (5/15; 33.3%) could utilize statistical analysis software unaided. This study has identified poor utilization of computer software for statistical analysis among dental researchers in eastern Nigeria. This is strongly associated with a lack of exposure to the use of this software early enough, especially during undergraduate training. This calls for the introduction of a computer training programme in the dental curriculum to enable practitioners to develop the habit of using computer software for their research.

  2. Clinical results of HIS, RIS, PACS integration using data integration CASE tools

    NASA Astrophysics Data System (ADS)

    Taira, Ricky K.; Chan, Hing-Ming; Breant, Claudine M.; Huang, Lu J.; Valentino, Daniel J.

    1995-05-01

    Current infrastructure research in PACS is dominated by the development of communication networks (local area networks, teleradiology, ATM networks, etc.), multimedia display workstations, and hierarchical image storage architectures. However, limited work has been performed on developing flexible, expansible, and intelligent information processing architectures for the vast decentralized image and text data repositories prevalent in healthcare environments. Patient information is often distributed among multiple data management systems. Current large-scale efforts to integrate medical information and knowledge sources have been costly and have delivered limited retrieval functionality. Software integration strategies to unify distributed data and knowledge sources are still lacking commercially. Systems heterogeneity (i.e., differences in hardware platforms, communication protocols, database management software, nomenclature, etc.) is at the heart of the problem and is unlikely to be standardized in the near future. In this paper, we demonstrate the use of newly available CASE (computer-aided software engineering) tools to rapidly integrate HIS, RIS, and PACS information systems. The advantages of these tools include fast development time (low-level code is generated from graphical specifications) and easy system maintenance (excellent documentation, easy-to-perform changes, and a centralized code repository in an object-oriented database). The CASE tools are used to develop and manage the `middle-ware' in our client-mediator-server architecture for systems integration. Our architecture is scalable and can accommodate heterogeneous database and communication protocols.

  3. ELER software - a new tool for urban earthquake loss assessment

    NASA Astrophysics Data System (ADS)

    Hancilar, U.; Tuzun, C.; Yenidogan, C.; Erdik, M.

    2010-12-01

    Rapid loss estimation after potentially damaging earthquakes is critical for effective emergency response and public information. A methodology and software package, ELER (Earthquake Loss Estimation Routine), for rapid estimation of earthquake shaking and losses throughout the Euro-Mediterranean region was developed under the Joint Research Activity-3 (JRA3) of the EC FP6 project entitled "Network of Research Infrastructures for European Seismology (NERIES)". Recently, a new version (v2.0) of the ELER software has been released. The multi-level methodology developed is capable of incorporating regional variability and uncertainty originating from ground motion predictions, fault finiteness, site modifications, the inventory of physical and social elements subjected to earthquake hazard, and the associated vulnerability relationships. Although primarily intended for quasi-real-time estimation of earthquake shaking and losses, the routine is equally capable of incorporating scenario-based earthquake loss assessments. This paper introduces the urban earthquake loss assessment module (Level 2) of the ELER software, which makes use of the most detailed inventory databases of physical and social elements at risk in combination with analytical vulnerability relationships and building damage-related casualty vulnerability models for the estimation of building damage and casualty distributions, respectively. The spectral capacity-based loss assessment methodology and its vital components are presented. The analysis methods of the Level 2 module, i.e. the Capacity Spectrum Method (ATC-40, 1996), the Modified Acceleration-Displacement Response Spectrum Method (FEMA 440, 2005), the Reduction Factor Method (Fajfar, 2000), and the Coefficient Method (ASCE 41-06, 2006), are applied to the selected building types for validation and verification purposes. The damage estimates are compared to the results obtained from other studies available in the literature, i.e. SELENA v4.0 (Molina et al., 2008) and ATC-55 (Yang, 2005). An urban loss assessment exercise for a scenario earthquake for the city of Istanbul is conducted, and physical and social losses are presented. Damage to the urban environment is compared to the results obtained from similar software, i.e. KOERILoss (KOERI, 2002) and DBELA (Crowley et al., 2004). The European rapid loss estimation tool is expected to help enable effective emergency response, on both the local and global levels, as well as public information.

  4. Reconfigurable, Cognitive Software-Defined Radio

    NASA Technical Reports Server (NTRS)

    Bhat, Arvind

    2015-01-01

    Software-defined radio (SDR) technology allows radios to be reconfigured to perform different communication functions without using multiple radios to accomplish each task. Intelligent Automation, Inc., has developed SDR platforms that switch adaptively between different operation modes. The innovation works by modifying both transmit waveforms and receiver signal processing tasks. In Phase I of the project, the company developed SDR cognitive capabilities, including adaptive modulation and coding (AMC), automatic modulation recognition (AMR), and spectrum sensing. In Phase II, these capabilities were integrated into SDR platforms. The reconfigurable transceiver design employs high-speed field-programmable gate arrays, enabling multimode operation and scalable architecture. Designs are based on commercial off-the-shelf (COTS) components and are modular in nature, making it easier to upgrade individual components rather than redesigning the entire SDR platform as technology advances.
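
    At its core, adaptive modulation and coding is a lookup from sensed channel quality to the highest-rate scheme whose SNR requirement is still met. The thresholds below are illustrative placeholders, not those of the platform described above.

        # Illustrative AMC table: (minimum SNR in dB, modulation, code rate),
        # sorted from highest to lowest throughput.
        AMC_TABLE = [
            (22.0, "64-QAM", 3 / 4),
            (16.0, "16-QAM", 3 / 4),
            (10.0, "QPSK", 1 / 2),
            (4.0, "BPSK", 1 / 2),
        ]

        def select_mcs(snr_db):
            """Pick the highest-throughput scheme whose SNR floor is met."""
            for floor, modulation, rate in AMC_TABLE:
                if snr_db >= floor:
                    return modulation, rate
            return "BPSK", 1 / 4  # fallback: most robust setting

        print(select_mcs(18.3))  # -> ('16-QAM', 0.75)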

  5. Serving the enterprise and beyond with informatics for integrating biology and the bedside (i2b2)

    PubMed Central

    Weber, Griffin; Mendis, Michael; Gainer, Vivian; Chueh, Henry C; Churchill, Susanne; Kohane, Isaac

    2010-01-01

    Informatics for Integrating Biology and the Bedside (i2b2) is one of seven projects sponsored by the NIH Roadmap National Centers for Biomedical Computing (http://www.ncbcs.org). Its mission is to provide clinical investigators with the tools necessary to integrate medical record and clinical research data in the genomics age, a software suite to construct and integrate the modern clinical research chart. i2b2 software may be used by an enterprise's research community to find sets of interesting patients from electronic patient medical record data, while preserving patient privacy through a query tool interface. Project-specific mini-databases (“data marts”) can be created from these sets to make highly detailed data available on these specific patients to the investigators on the i2b2 platform, as reviewed and restricted by the Institutional Review Board. The current version of this software has been released into the public domain and is available at the URL: http://www.i2b2.org/software. PMID:20190053

  6. Building a Trusted Path for Applications Using COTS Components

    DTIC Science & Technology

    2004-11-01

    against attacks by malicious software. Trojan horse programs, i.e., programs with additional hidden, often malicious, functions, are more and more...cannot be imitated by untrusted software." Wiseman et al. (1988) propose a user interface for the SMITE system to prevent Trojan horses from...input, two of which can also be used for the hologram service. 7.0 CONCLUSION Trojan horse programs, i.e., programs with additional hidden, often

  7. Inexpensive Instruments for a Sound Unit

    NASA Astrophysics Data System (ADS)

    Brazzle, Bob

    2011-04-01

    My unit on sound and waves is embedded within a long-term project in which my high school students construct a musical instrument out of common materials. The unit culminates with a performance assessment: students play the first four measures of "Somewhere Over the Rainbow"—chosen because of the octave interval of the first two notes—in the key of C, and write a short paper describing the theory underlying their instrument. My students have done this project for the past three years, and it continues to evolve. This year I added new instructional materials that I developed using a freeware program called Audacity. This software is very intuitive, and my students used it to develop their musical instruments. In this paper I will describe some of the inexpensive instructional materials in my sound unit, and how they fit with my learning goals.

  8. Automated system for the on-line monitoring of powder blending processes using near-infrared spectroscopy. Part I. System development and control.

    PubMed

    Hailey, P A; Doherty, P; Tapsell, P; Oliver, T; Aldridge, P K

    1996-03-01

    An automated system for the on-line monitoring of powder blending processes is described. The system employs near-infrared (NIR) spectroscopy using fibre-optics and a graphical user interface (GUI) developed in the LabVIEW environment. The complete supervisory control and data analysis (SCADA) software controls blender and spectrophotometer operation and performs statistical spectral data analysis in real time. A data analysis routine using standard deviation is described to demonstrate an approach to the real-time determination of blend homogeneity.
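
    A moving-block standard deviation is one common way to turn successive NIR spectra into a homogeneity decision: when the scan-to-scan variation at each wavelength settles below a threshold, the blend is declared homogeneous. The block size and threshold below are illustrative, not the paper's settings.

        import numpy as np

        def blend_homogeneous(spectra, block=5, threshold=0.01):
            """spectra: sequence of absorbance arrays, newest last.
            True when the mean across-wavelength standard deviation of the
            last `block` scans drops below `threshold` (illustrative values)."""
            if len(spectra) < block:
                return False
            window = np.asarray(spectra[-block:])
            return window.std(axis=0).mean() < threshold

        # Simulated scans whose scan-to-scan noise shrinks as mixing proceeds.
        rng = np.random.default_rng(1)
        scans = [rng.normal(0.5, 0.05 / (k + 1), 100) for k in range(12)]
        print(blend_homogeneous(scans))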

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rudkevich, Aleksandr; Goldis, Evgeniy

    This research conducted by the Newton Energy Group, LLC (NEG) is dedicated to the development of pCloud: a cloud-based power market simulation environment. pCloud offers power industry stakeholders the capability to model electricity markets and is organized around the Software as a Service (SaaS) concept -- a software application delivery model in which software is centrally hosted and provided to many users via the internet. During Phase I of this project, NEG developed a prototype design for pCloud as a SaaS-based commercial service offering and a system architecture supporting that design, ensured the feasibility of the architecture's key elements, formed technological partnerships and negotiated commercial agreements with partners, conducted market research and other related activities, and secured funding to continue development of pCloud between the end of Phase I and the beginning of Phase II, if awarded. Based on the results of Phase I activities, NEG has established that the development of a cloud-based power market simulation environment within the Windows Azure platform is technologically feasible and can be accomplished within the budget and timeframe available through the Phase II SBIR award with additional external funding. NEG believes that pCloud has the potential to become a game-changing technology for the modeling and analysis of electricity markets. This potential is due to the following critical advantages of pCloud over its competition: - Standardized access to advanced and proven power market simulators offered by third parties. - Automated parallelization of simulations and dynamic provisioning of computing resources on the cloud. This combination of automation and scalability dramatically reduces turn-around time while offering the capability to increase the number of analyzed scenarios by a factor of 10, 100, or even 1000. - Access to ready-to-use data and to cloud-based resources, leading to a reduction in software, hardware, and IT costs. - A competitive pricing structure, which will make high-volume usage of simulation services affordable. - Availability and affordability of high-quality power simulators, which presently only large corporate clients can afford, will level the playing field in developing regional energy policies, determining prudent cost recovery mechanisms, and assuring just and reasonable rates to consumers. - Users that presently do not have the resources to internally maintain modeling capabilities will now be able to run simulations. This will invite more players into the industry, ultimately leading to more transparent and liquid power markets.

  10. Cognitive software defined radar: waveform design for clutter and interference suppression

    NASA Astrophysics Data System (ADS)

    Kirk, Benjamin H.; Owen, Jonathan W.; Narayanan, Ram M.; Blunt, Shannon D.; Martone, Anthony F.; Sherbondy, Kelly D.

    2017-05-01

    Clutter and radio frequency interference (RFI) are prevalent issues in the field of radar and are of specific interest to cognitive radar. Here, methods for applying and testing the utility of cognitive radar for clutter and RFI mitigation are explored. Using the adaptable transmit capability, environmental database, and general "awareness" of a cognitive radar system (i.e. spectrum sensing, geographical location, etc.), a matched waveform is synthesized that improves the signal-to-clutter ratio (SCR), assuming at least an estimate of the target response and the environmental clutter response are known a priori. RFI may also be mitigated by sensing the RF spectrum and adapting the transmit center frequency and bandwidth using methods that optimize bandwidth and signal-to-interference-plus-noise ratio (SINR) (i.e. the spectrum sensing, multi-objective (SS-MO) algorithm); a sketch of this adaptation appears below. The improvement is shown by a decrease in the noise floor. The effectiveness of the above methods is examined via a test-bed developed around a software defined radio (SDR). Testing with, and the general use of, commercial off-the-shelf (COTS) devices is desirable for cost effectiveness, general ease of use, and technical and community support, but these devices present design challenges that must be overcome for them to be effective. The universal software radio peripheral (USRP) X310 SDR is a relatively cheap and portable device that has all the system components of a basic cognitive radar. Design challenges of the SDR include phase coherency between channels, bandwidth limitations, dynamic range, and speed of computation and data communication/recording.
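
    The frequency-adaptation step can be illustrated with a much-simplified stand-in for the SS-MO algorithm: scan the sensed power spectrum and retune to the widest contiguous run of bins whose interference power is below a threshold.

        import numpy as np

        def select_subband(psd_dbm, threshold_dbm=-90.0):
            """Return (start, stop) bin indices of the widest contiguous run
            of bins sensed below threshold_dbm; a simplified stand-in for
            multi-objective bandwidth/SINR selection."""
            clear = np.asarray(psd_dbm) < threshold_dbm
            best, run_start, n = (0, 0), None, len(clear)
            for i in range(n + 1):
                if i < n and clear[i]:
                    run_start = i if run_start is None else run_start
                elif run_start is not None:
                    if i - run_start > best[1] - best[0]:
                        best = (run_start, i)
                    run_start = None
            return best

        # Example sweep: interference occupies bins 30-59 of a 128-bin scan.
        psd = np.full(128, -100.0)
        psd[30:60] = -70.0
        print(select_subband(psd))  # -> (60, 128)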

  11. Modeling Latent Growth Curves With Incomplete Data Using Different Types of Structural Equation Modeling and Multilevel Software

    ERIC Educational Resources Information Center

    Ferrer, Emilio; Hamagami, Fumiaki; McArdle, John J.

    2004-01-01

    This article offers different examples of how to fit latent growth curve (LGC) models to longitudinal data using a variety of different software programs (i.e., LISREL, Mx, Mplus, AMOS, SAS). The article shows how the same model can be fitted using both structural equation modeling and multilevel software, with nearly identical results, even in…

  12. Effectiveness of computer-aided diagnosis of colorectal lesions using novel software for magnifying narrow-band imaging: a pilot study.

    PubMed

    Tamai, Naoto; Saito, Yutaka; Sakamoto, Taku; Nakajima, Takeshi; Matsuda, Takahisa; Sumiyama, Kazuki; Tajiri, Hisao; Koyama, Ryosuke; Kido, Shoji

    2017-08-01

    Magnifying narrow-band imaging (M-NBI) enables detailed observation of microvascular architecture and can be used in the endoscopic diagnosis of colorectal lesions. However, in clinical practice, differential diagnosis and estimation of invasion depth of colorectal lesions based on M-NBI findings require experience. Therefore, developing computer-aided diagnosis (CAD) for M-NBI would be beneficial for clinical practice. The aim of this study was to evaluate the effectiveness of software for CAD of colorectal lesions. In collaboration with Yamaguchi University, we developed novel software that enables CAD of colorectal lesions using M-NBI images. This software for CAD further divides the original Sano's colorectal M-NBI classification into 3 groups (group A, capillary pattern [CP] type I; group B, CP type II + CP type IIIA; group C, CP type IIIB), which describe hyperplastic polyps (HPs), adenoma/adenocarcinoma (intramucosal [IM] to submucosal [SM]-superficial) lesions, and SM-deep lesions, respectively. We retrospectively reviewed 121 lesions evaluated using M-NBI. The 121 reviewed lesions included 21 HP, 80 adenoma/adenocarcinoma (IM to SM-superficial), and 20 SM-deep lesions. The concordance rate between the CAD and the diagnosis of the experienced endoscopists was 90.9%. The sensitivity, specificity, positive and negative predictive values, and accuracy of the CAD for neoplastic lesions were 83.9%, 82.6%, 53.1%, 95.6%, and 82.8%, respectively. The values for SM-deep lesions were 83.9%, 82.6%, 53.1%, 95.6%, and 82.8%, respectively. Relatively high diagnostic values were obtained using CAD. This software for CAD could possibly lead to wider use of M-NBI in the endoscopic diagnosis of colorectal lesions.

  13. Using failure mode and effects analysis to plan implementation of smart i.v. pump technology.

    PubMed

    Wetterneck, Tosha B; Skibinski, Kathleen A; Roberts, Tanita L; Kleppin, Susan M; Schroeder, Mark E; Enloe, Myra; Rough, Steven S; Hundt, Ann Schoofs; Carayon, Pascale

    2006-08-15

    Failure mode and effects analysis (FMEA) was used to evaluate a smart i.v. pump as it was implemented into a redesigned medication-use process. A multidisciplinary team conducted an FMEA to guide the implementation of a smart i.v. pump that was designed to prevent pump programming errors. The smart i.v. pump was equipped with a dose-error reduction system that included a predefined drug library in which dosage limits were set for each medication. Monitoring for potential failures and errors occurred for three months post-implementation of the FMEA. Specific measures were used to determine the success of the actions that were implemented as a result of the FMEA. The FMEA process at the hospital identified key failure modes in the medication process with the use of the old and new pumps, and actions were taken to avoid errors and adverse events; a generic example of FMEA scoring appears below. I.V. pump software and hardware design changes were also recommended. Thirteen of the 18 failure modes reported in practice after pump implementation had been identified by the team. A beneficial outcome of the FMEA was the development of a multidisciplinary team that provided the infrastructure for safe technology implementation and effective event investigation after implementation. With the continual updating of i.v. pump software and hardware after implementation, FMEA can be an important starting place for safe technology choice and implementation, and can produce site experts to follow technology and process changes over time. FMEA was useful in identifying potential problems in the medication-use process with the implementation of new smart i.v. pumps. Monitoring for system failures and errors after implementation remains necessary.
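
    FMEA prioritization is conventionally summarized with a risk priority number, RPN = severity x occurrence x detectability, each scored on a 1-10 scale. The failure modes and scores below are hypothetical, not those from the hospital team's analysis.

        # Hypothetical smart-pump failure modes: (name, S, O, D) on 1-10 scales.
        failure_modes = [
            ("drug library entry missing", 8, 4, 6),
            ("user bypasses dose-error software", 9, 5, 7),
            ("wireless library update fails", 6, 3, 4),
        ]

        # Rank by RPN = S * O * D so the riskiest modes are addressed first.
        for name, s, o, d in sorted(failure_modes,
                                    key=lambda m: m[1] * m[2] * m[3],
                                    reverse=True):
            print(f"RPN {s * o * d:4d}  {name}")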

  14. Constellation Program Electrical Ground Support Equipment Research and Development

    NASA Technical Reports Server (NTRS)

    McCoy, Keegan S.

    2010-01-01

    At the Kennedy Space Center, I engaged in the research and development of electrical ground support equipment for NASA's Constellation Program. Timing characteristics play a crucial role in ground support communications. Latency and jitter are two problems that must be understood so that communications are timely and consistent within the Kennedy Ground Control System (KGCS). I conducted latency and jitter tests using Allen-Bradley programmable logic controllers (PLCs) so that these two intrinsic network properties can be reduced; the sketch below shows how such measurements are summarized. Time stamping and clock synchronization also play significant roles in launch processing and operations. Using RSLogix 5000 project files and the Wireshark network protocol analyzer, I verified master/slave PLC Ethernet module clock synchronization, master/slave IEEE 1588 communications, and time stamping capabilities. All of the timing and synchronization test results are useful in assessing the current KGCS operational level and determining improvements for the future.
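
    Latency and jitter reduce to simple statistics over matched timestamps: mean delay for latency and, in one common convention, the standard deviation of the delays for jitter. The timestamps below are hypothetical, not KGCS measurements.

        import statistics

        def latency_jitter(send_times, recv_times):
            """Summarize one-way delays from matched send/receive timestamps.
            Jitter is reported as the standard deviation of the delays;
            RFC 3550 uses a smoothed inter-arrival estimator instead."""
            delays = [r - s for s, r in zip(send_times, recv_times)]
            return {
                "mean_latency_ms": statistics.mean(delays) * 1e3,
                "jitter_ms": statistics.stdev(delays) * 1e3,
                "worst_case_ms": max(delays) * 1e3,
            }

        sent = [0.000, 0.100, 0.200, 0.300]  # hypothetical seconds
        rcvd = [0.012, 0.115, 0.211, 0.318]
        print(latency_jitter(sent, rcvd))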

  15. Validation of software for calculating the likelihood ratio for parentage and kinship.

    PubMed

    Drábek, J

    2009-03-01

    Although the likelihood ratio is a well-known statistical technique, commercial off-the-shelf (COTS) software products for its calculation are not sufficiently validated to suit the general requirements for the competence of testing and calibration laboratories (the EN/ISO/IEC 17025:2005 norm) per se. The software in question can be considered critical, as it directly weighs the forensic evidence allowing judges to decide on guilt or innocence or to identify persons or kin (e.g., in mass fatalities). For these reasons, accredited laboratories shall validate likelihood ratio software in accordance with the above norm. To validate software for calculating the likelihood ratio in parentage/kinship scenarios, I assessed available vendors, chose two programs (Paternity Index and familias) for testing, and finally validated them using tests derived from an elaboration of the available guidelines in the fields of forensics, biomedicine, and software engineering. MS Excel calculations using known likelihood ratio formulas, or peer-reviewed results of difficult paternity cases, were used as a reference. Using seven testing cases, it was found that both programs satisfied the requirements for basic paternity cases. However, only a combination of the two software programs fulfills the criteria needed for our purpose across the whole spectrum of functions under validation, with the exception of providing algebraic formulas in cases of mutation and/or silent alleles.
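
    For a single locus in a standard trio, the paternity index has a closed form that makes a handy reference calculation of the kind used to cross-check the programs: with obligate paternal allele q of population frequency p, PI = 1/p if the alleged father is homozygous for q and 1/(2p) if heterozygous, assuming no mutation or silent alleles. The genotypes and allele frequencies below are illustrative.

        def paternity_index(father_genotype, paternal_allele, allele_freq):
            """Single-locus PI for a trio with an unambiguous obligate
            paternal allele (no mutation, no silent alleles)."""
            count = father_genotype.count(paternal_allele)
            if count == 2:
                return 1.0 / allele_freq
            if count == 1:
                return 1.0 / (2.0 * allele_freq)
            return 0.0

        # Combined LR over independent loci is the product of per-locus PIs.
        loci = [(("12", "14"), "12", 0.11),
                (("9", "9"), "9", 0.21)]
        lr = 1.0
        for genotype, allele, p in loci:
            lr *= paternity_index(genotype, allele, p)
        print(f"combined LR = {lr:.1f}")  # 1/(2*0.11) * 1/0.21 = 21.6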

  16. FOCU:S--future operator control unit: soldier

    NASA Astrophysics Data System (ADS)

    O'Brien, Barry J.; Karan, Cem; Young, Stuart H.

    2009-05-01

    The U.S. Army Research Laboratory's (ARL) Computational and Information Sciences Directorate (CISD) has long been involved in autonomous asset control, specifically as it relates to small robots. Over the past year, CISD has been making strides in the implementation of three areas of small robot autonomy, namely platform autonomy, Soldier-robot interface, and tactical behaviors. It is CISD's belief that these three areas must be considered as a whole in order to provide Soldiers with useful capabilities. In addressing the Soldier-robot interface aspect, CISD has begun development on a unique dismounted controller called the Future Operator Control Unit: Soldier (FOCU:S) that is based on an Apple iPod Touch. The iPod Touch's small form factor, unique touch-screen input device, and the presence of general purpose computing applications such as a web browser combine to give this device the potential to be a disruptive technology. Setting CISD's implementation apart from other similar iPod or iPhone-based devices is the ARL software that allows multiple robotic platforms to be controlled from a single OCU. The FOCU:S uses the same Agile Computing Infrastructure (ACI) that all other assets in the ARL robotic control system use, enabling automated asset discovery on any type of network. Further, a custom ad hoc routing implementation allows the FOCU:S to communicate with the ARL ad hoc communications system and enables it to extend the range of the network. This paper will briefly describe the current robotic control architecture employed by ARL and provide short descriptions of existing capabilities. Further, the paper will discuss FOCU:S specific software developed for the iPod Touch, including unique capabilities enabled by the device's unique hardware.

  17. Usability evaluation of mobile applications using ISO 9241 and ISO 25062 standards.

    PubMed

    Moumane, Karima; Idri, Ali; Abran, Alain

    2016-01-01

    This paper presents an empirical study based on a set of measures to evaluate the usability of mobile applications running on different mobile operating systems, including Android, iOS, and Symbian. The aim is to empirically evaluate a framework that we have developed on the use of the software quality standard ISO 9126 in mobile environments, especially the usability characteristic. To do so, 32 users participated in the experiment, and we used the ISO 25062 and ISO 9241 standards for objective measures, working with two widely used mobile applications: Google Apps and Google Maps. The QUIS 7.0 questionnaire was used to collect measures assessing the users' level of satisfaction when using these two mobile applications. By analyzing the results, we highlighted a set of mobile usability issues that are related to the hardware as well as to the software and that need to be taken into account by designers and developers in order to improve the usability of mobile applications.

  18. Software for Automated Reading of STEP Files by I-DEAS(TM)

    NASA Technical Reports Server (NTRS)

    Pinedo, John

    2003-01-01

    A program called "readstep" enables the I-DEAS(tm) computer-aided-design (CAD) software to automatically read Standard for the Exchange of Product Model Data (STEP) files. (The STEP format is one of several used to transfer data between dissimilar CAD programs.) Prior to the development of "readstep," it was necessary to read STEP files into I-DEAS(tm) one at a time in a slow process that required repeated intervention by the user. In operation, "readstep" prompts the user for the location of the desired STEP files and the names of the I-DEAS(tm) project and model file, then generates an I-DEAS(tm) program file called "readstep.prg" and two Unix shell programs called "runner" and "controller." The program "runner" runs I-DEAS(tm) sessions that execute readstep.prg, while "controller" controls the execution of "runner" and edits readstep.prg if necessary. The user sets "runner" and "controller" into execution simultaneously, and then no further intervention by the user is required. When "runner" has finished, the user should see only parts from successfully read STEP files present in the model file. STEP files that could not be read successfully (e.g., because of format errors) should be regenerated before attempting to read them again.

  19. Big Software for Big Data: Scaling Up Photometry for LSST (Abstract)

    NASA Astrophysics Data System (ADS)

    Rawls, M.

    2017-06-01

    (Abstract only) The Large Synoptic Survey Telescope (LSST) will capture mosaics of the sky every few nights, each containing more data than your computer's hard drive can store. As a result, the software to process these images is as critical to the science as the telescope and the camera. I discuss the algorithms and software being developed by the LSST Data Management team to handle such a large volume of data. All of our work is open source and available to the community. Once LSST comes online, our software will produce catalogs of objects and a stream of alerts. These will bring exciting new opportunities for follow-up observations and collaborations with LSST scientists.

  20. Software Tools for Software Maintenance

    DTIC Science & Technology

    1988-10-01

    Cobol ... ASTEC ... IBM mainframe, DOS ... CA-Converter ... ASTEC; Source: VSM; Company: MAINTEC, Inc. ... Change-Man; Phone: (612) 831-2122; Company: SERENA Consulting; Function: RF; Phone: (800) 621-0851

  1. ELF/VLF/LF Radio Propagation and Systems Aspects (La Propagation des Ondes Radio ELF/VLF/LF et les Aspects Systemes)

    DTIC Science & Technology

    1993-05-01

    ... a limitation of the software package would not allow the program to run over 2359 to 0001 UT. This was ... (frequencies: 18.1, 19.0, 21.4, 24.0 kHz) ... the Long Wavelength Propagation Capability (LWPC), a software package developed at NOSC (Ferguson et al., 1989) and adapted by us to the Macintosh personal computer. We find that this software works very well. Our investigations are to evaluate and devise geophysical models to be used with LWPC in assessing VLF communications and ...

  2. A Standing Location Detector Enabling People with Developmental Disabilities to Control Environmental Stimulation through Simple Physical Activities with Nintendo Wii Balance Boards

    ERIC Educational Resources Information Center

    Shih, Ching-Hsiang

    2011-01-01

    This study evaluated whether two people with developmental disabilities would be able to actively perform simple physical activities by controlling their favorite environmental stimulation using Nintendo Wii Balance Boards with a newly developed standing location detection program (SLDP, i.e., a new software program turning a Nintendo Wii Balance…

  3. Using Software Testing Techniques for Efficient Handling of Programming Exercises in an e-Learning Platform

    ERIC Educational Resources Information Center

    Schwieren, Joachim; Vossen, Gottfried; Westerkamp, Peter

    2006-01-01

    e-Learning has become a major field of interest in recent years, and multiple approaches and solutions have been developed. A typical form of e-learning application comprises exercise submission and assessment systems that allow students to work on assignments whenever and where they want (i.e., dislocated, asynchronous work). In basic computer…

  4. Input and Output Mechanisms and Devices. Phase I: Adding Voice Output to a Speaker-Independent Recognition System.

    ERIC Educational Resources Information Center

    Scott Instruments Corp., Denton, TX.

    This project was designed to develop techniques for adding low-cost speech synthesis to educational software. Four tasks were identified for the study: (1) select a microcomputer with a built-in analog-to-digital converter that is currently being used in educational environments; (2) determine the feasibility of implementing expansion and playback…

  5. An Object Location Detector Enabling People with Developmental Disabilities to Control Environmental Stimulation through Simple Occupational Activities with Battery-Free Wireless Mice

    ERIC Educational Resources Information Center

    Shih, Ching-Hsiang

    2011-01-01

    This study assessed whether two persons with developmental disabilities would be able to actively perform simple occupational activities by controlling their favorite environmental stimulation using battery-free wireless mice with a newly developed object location detection program (OLDP, i.e., a new software program turning a battery-free…

  6. Developing infrared array controller with software real time operating system

    NASA Astrophysics Data System (ADS)

    Sako, Shigeyuki; Miyata, Takashi; Nakamura, Tomohiko; Motohara, Kentaro; Uchimoto, Yuka Katsuno; Onaka, Takashi; Kataza, Hirokazu

    2008-07-01

    Real-time capabilities are required in a controller for a large-format array to reduce the dead time attributable to readout and data transfer. Real-time processing has traditionally been achieved with dedicated processors, including DSP, CPLD, and FPGA devices. However, the dedicated processors have problems with memory resources, inflexibility, and high cost. Meanwhile, a recent PC has sufficient CPU and memory resources to control an infrared array and to process a large amount of frame data in real time. In this study, we have developed an infrared array controller with a software real-time operating system (RTOS) instead of dedicated processors. A Linux PC equipped with the RTAI extension and a dual-core CPU is used as the main computer, and one of the CPU cores is allocated to real-time processing. A digital I/O board with DMA functions is used as the I/O interface. The signal-processing cores are integrated into the OS kernel as a real-time driver module, which is composed of two virtual devices: the clock-processor task and the frame-processor task. The array controller with the RTOS realizes complicated operations easily, flexibly, and at low cost.
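
    One ingredient of the design, reserving a CPU core for the real-time path, can be approximated from user space on Linux; the controller described above does this inside the RTAI kernel with a real-time driver module, which the sketch below does not reproduce.

        import os

        # Pin this process to core 1, leaving core 0 for general OS load --
        # a user-space approximation of dedicating one core of a dual-core
        # CPU to real-time frame processing.
        os.sched_setaffinity(0, {1})  # Linux-only
        print("running on cores:", os.sched_getaffinity(0))

        # Optionally request a real-time scheduling policy (needs privileges).
        try:
            os.sched_setscheduler(0, os.SCHED_FIFO, os.sched_param(50))
        except PermissionError:
            print("SCHED_FIFO needs elevated privileges; staying best-effort")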

  7. Enabling Co-Design of Multi-Layer Exascale Storage Architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carothers, Christopher

    Growing demands for computing power in applications such as energy production, climate analysis, computational chemistry, and bioinformatics have propelled computing systems toward the exascale: systems with 10^18 floating-point operations per second. These systems, to be designed and constructed over the next decade, will create unprecedented challenges in component counts, power consumption, resource limitations, and system complexity. Data storage and access are an increasingly important and complex component in extreme-scale computing systems, and significant design work is needed to develop successful storage hardware and software architectures at exascale. Co-design of these systems will be necessary to find the best possible design points for exascale systems. The goal of this work has been to enable the exploration and co-design of exascale storage systems by providing a detailed, accurate, and highly parallel simulation of exascale storage and the surrounding environment. Specifically, this simulation has (1) portrayed realistic application checkpointing and analysis workloads, (2) captured the complexity, scale, and multilayer nature of exascale storage hardware and software, and (3) executed in a timeframe that enables "what if" exploration of design concepts. We developed models of the major hardware and software components in an exascale storage system, as well as the application I/O workloads that drive them. We used our simulation system to investigate critical questions in reliability and concurrency at exascale, helping guide the design of future exascale hardware and software architectures. Additionally, we provided this system to interested vendors and researchers so that others can explore the design space. We validated the capabilities of our simulation environment by configuring the simulation to represent the Argonne Leadership Computing Facility Blue Gene/Q system and comparing simulation results for application I/O patterns to the results of executions of these I/O kernels on the actual system.

  8. Inertial Navigation System Standardized Software Development. Volume II. INS Survey and Analytical Development

    DTIC Science & Technology

    1978-06-01

    ... unit magnitude, mutually orthogonal, right-handed (in the particular case i, j, k are along the x, y, z axes of the body frame). A good concise ... shown. ... The signal flow form is shown in Figure A-16 ... For the same reason as before, ...

  9. Introduction to benchmark dose methods and U.S. EPA's benchmark dose software (BMDS) version 2.1.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, J. Allen, E-mail: davis.allen@epa.gov; Gift, Jeffrey S.; Zhao, Q. Jay

    2011-07-15

    Traditionally, the No-Observed-Adverse-Effect-Level (NOAEL) approach has been used to determine the point of departure (POD) from animal toxicology data for use in human health risk assessments. However, this approach is subject to substantial limitations that have been well defined, such as strict dependence on the dose selection, dose spacing, and sample size of the study from which the critical effect has been identified. Also, the NOAEL approach fails to take into consideration the shape of the dose-response curve and other related information. The benchmark dose (BMD) method, originally proposed as an alternative to the NOAEL methodology in the 1980s, addresses many of the limitations of the NOAEL method. It is less dependent on dose selection and spacing, and it takes into account the shape of the dose-response curve. In addition, the estimation of a BMD 95% lower bound confidence limit (BMDL) results in a POD that appropriately accounts for study quality (i.e., sample size). With the recent advent of user-friendly BMD software programs, including the U.S. Environmental Protection Agency's (U.S. EPA) Benchmark Dose Software (BMDS), BMD has become the method of choice for many health organizations world-wide. This paper discusses the BMD methods and corresponding software (i.e., BMDS version 2.1.1) that have been developed by the U.S. EPA, and includes a comparison with recently released European Food Safety Authority (EFSA) BMD guidance.
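
    For a quantal endpoint, the BMD computation reduces to fitting a dose-response model by maximum likelihood and inverting it at the benchmark response. The sketch below uses an illustrative dataset and a log-logistic model with background, and solves for 10% extra risk; it omits the profile-likelihood step BMDS uses to obtain the BMDL.

        import numpy as np
        from scipy.optimize import minimize

        # Illustrative quantal data: dose, number affected, group size.
        dose = np.array([0.0, 10.0, 50.0, 100.0])
        affected = np.array([2, 5, 14, 26])
        n = np.array([50, 50, 50, 50])

        def prob(d, g, a, b):
            """Log-logistic with background g:
            P(d) = g + (1 - g) / (1 + exp(-a - b*ln d)), with P(0) = g."""
            p = np.full_like(d, g, dtype=float)
            pos = d > 0
            p[pos] = g + (1 - g) / (1 + np.exp(-a - b * np.log(d[pos])))
            return p

        def nll(theta):
            g, a, b = theta
            p = np.clip(prob(dose, g, a, b), 1e-9, 1 - 1e-9)
            return -np.sum(affected * np.log(p) + (n - affected) * np.log(1 - p))

        fit = minimize(nll, x0=[0.05, -5.0, 1.0],
                       bounds=[(1e-6, 0.99), (-20, 20), (1e-6, 10)])
        g, a, b = fit.x
        # Extra risk (P(d) - g)/(1 - g) equals 0.1 when a + b*ln d = logit(0.1).
        bmd = np.exp((np.log(0.1 / 0.9) - a) / b)
        print(f"BMD at 10% extra risk ~ {bmd:.1f} dose units")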

  10. Information Technology Progress in North Korea and Its Prospects

    DTIC Science & Technology

    2005-08-01

    North Korean company agreed to develop a Korean- Chinese language-interpretation software together with a South Korean company, L&I Soft. The North...Chosun cuisine , and Chosun stamps. Furthermore, governmental agencies and university research institutes developed roughly 10 programs for the Science

  11. Study and Analysis of The Robot-Operated Material Processing Systems (ROMPS)

    NASA Technical Reports Server (NTRS)

    Nguyen, Charles C.

    1996-01-01

    This report presents the progress of a research grant funded by NASA for work performed during 1 Oct. 1994 - 30 Sep. 1995. The report deals with the development and investigation of the potential use of software for data processing for the Robot-Operated Material Processing System (ROMPS). It reports on the progress of data processing of calibration samples processed by ROMPS in space and on Earth. First, data were retrieved using the I/O software and manually processed using Microsoft Excel. Then the data retrieval and processing were automated using a program written in C which is able to read the telemetry data and produce plots of time responses of sample temperatures and other desired variables. LabVIEW was also employed to automatically retrieve and process the telemetry data.

  12. Basic Techniques in Environmental Simulation.

    DTIC Science & Technology

    1982-07-01

    ... the developer is liable for all necessary changes in the model or its supporting computer software. After the 90-day warranty expires, the user ... processing unit, that part of a computer which accomplishes arithmetic and logical operations ... DCFLOS: dynamic cloud-free line-of-sight, a simulation ... Software Development ... Operational Environment, Interfaces, and Constraints ... Effectiveness Evaluation, Value Analysis, and ...

  13. C3I Analysis Tools for Development Planning. Volume 1

    DTIC Science & Technology

    1985-09-27

    ... personal author(s): Vail, G.H. Weissman, J.G. Wohl; type of report: Final ... support for decisions about the relative value of acquisition programs. The software tool developed on a personal computer demonstrates that ...

  14. Degraded Operational Environment: Integration of Social Network Infrastructure Concept in a Traditional Military C2 System

    DTIC Science & Technology

    2013-06-01

    Communication Applet) UNIGE – D.I.M.E., using a free application such as "MIT App Inventor" and the Android Software Development Kit...DEGRADED C2 ICCRTS 2013...operate on an upgradable Android operating system, on which will be developed a simplified ACA (Android Communication Applet) that will call C24U...(Server) IP number...portable COTS devices; ACA – C24U (Android Communication Applet); sending/receiving SEFL (Simple Exchange

  15. Development of Measures to Assess Product Modularity and Reconfigurability

    DTIC Science & Technology

    2010-03-01

    mission needs. For example, a thermal blanket is the only "module" currently being used to control spacecraft temperature (i.e., no active cooling). If...infrastructure, and thermal control. The spacecraft components include the autonomous flight software; the quantity of high-performance computing; power...thermal requirements are satisfied using this thermal blanket, then there may not be a need for active cooling to improve the thermal range of the

  16. SDO FlatSat Facility

    NASA Technical Reports Server (NTRS)

    Amason, David L.

    2008-01-01

    The goal of the Solar Dynamics Observatory (SDO) is to understand and, ideally, predict the solar variations that influence life and society. Its instruments will measure the properties of the Sun and will take high-definition images of the Sun every few seconds, all day, every day. The FlatSat is a high-fidelity electrical and functional representation of the SDO spacecraft bus. It is a high-fidelity test bed for Integration & Test (I&T), flight software, and flight operations. For I&T purposes, FlatSat will be a driver to develop and dry-run electrical integration procedures, STOL test procedures, page displays, and the command and telemetry database. FlatSat will also serve as a platform for flight software acceptance and systems testing for the flight software system components, including the spacecraft main processors, power supply electronics, attitude control electronics, gimbal control electronics, and the S-band communications card. FlatSat will also benefit the flight operations team through development and verification of post-launch flight software code and table updates, and through verification of new and updated flight operations products. This document highlights the benefits of FlatSat; describes the building of FlatSat; provides FlatSat facility requirements, access, roles, and responsibilities; and discusses FlatSat mechanical and electrical integration and functional testing.

  17. Molecular Genetics Information System (MOLGENIS): alternatives in developing local experimental genomics databases.

    PubMed

    Swertz, Morris A; De Brock, E O; Van Hijum, Sacha A F T; De Jong, Anne; Buist, Girbe; Baerends, Richard J S; Kok, Jan; Kuipers, Oscar P; Jansen, Ritsert C

    2004-09-01

    Genomic research laboratories need adequate infrastructure to support management of their data production and research workflow. But what makes infrastructure adequate? A lack of appropriate criteria makes any decision on buying or developing a system difficult. Here, we report on the decision process for the case of a molecular genetics group establishing a microarray laboratory. Five typical requirements for experimental genomics database systems were identified: (i) the ability to evolve to keep up with the fast-developing genomics field; (ii) a suitable data model to deal with local diversity; (iii) suitable storage of data files in the system; (iv) easy exchange with other software; and (v) low maintenance costs. The computer scientists and the researchers of the local microarray laboratory considered alternative solutions for these five requirements and chose the following options: (i) use of automatic code generation; (ii) a customized data model based on standards; (iii) storage of datasets as black boxes instead of decomposing them into database tables; (iv) loosely linking to other programs for improved flexibility; and (v) a low-maintenance web-based user interface. Our team evaluated existing microarray databases and then decided to build a new system, Molecular Genetics Information System (MOLGENIS), implemented using code generation in a period of three months. This case can provide valuable insights and lessons to both software developers and a user community embarking on large-scale genomic projects. http://www.molgenis.nl

  18. Software is a Product...Not

    DTIC Science & Technology

    1992-09-01

    understand the process if we consider software as a service, not a product. Let me expand on this statement. I do not believe we must do any of the...software-building activities differently. Instead, from the perspective of scheduling, budgeting, and delivering software, we should use the service...While we're not perfect, we do a fairly good job of managing hardware engineering...its upgrades. The pricing scheme...Software as a service. What is a

  19. UDECON: deconvolution optimization software for restoring high-resolution records from pass-through paleomagnetic measurements

    NASA Astrophysics Data System (ADS)

    Xuan, Chuang; Oda, Hirokuni

    2015-11-01

    The rapid accumulation of continuous paleomagnetic and rock magnetic records acquired from pass-through measurements on superconducting rock magnetometers (SRM) has greatly contributed to our understanding of the paleomagnetic field and paleo-environment. Pass-through measurements are inevitably smoothed and altered by the convolution effect of SRM sensor response, and deconvolution is needed to restore high-resolution paleomagnetic and environmental signals. Although various deconvolution algorithms have been developed, the lack of easy-to-use software has hindered the practical application of deconvolution. Here, we present the standalone graphical software UDECON as a convenient tool to perform optimized deconvolution for pass-through paleomagnetic measurements using the algorithm recently developed by Oda and Xuan (Geochem Geophys Geosyst 15:3907-3924, 2014). With the preparation of a format file, UDECON can directly read pass-through paleomagnetic measurement files collected at different laboratories. After the SRM sensor response is determined and loaded to the software, optimized deconvolution can be conducted using two different approaches (i.e., "Grid search" and "Simplex method") with adjustable initial values or ranges for smoothness, corrections of sample length, and shifts in measurement position. UDECON provides a suite of tools to conveniently view and check various types of original measurement and deconvolution data. Multiple steps of measurement and/or deconvolution data can be compared simultaneously to check the consistency and to guide further deconvolution optimization. Deconvolved data together with the loaded original measurement and SRM sensor response data can be saved and reloaded for further treatment in UDECON. Users can also export the optimized deconvolution data to a text file for analysis in other software.
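    To make the convolution problem concrete, here is a minimal, hypothetical sketch of regularized deconvolution for a pass-through record: a synthetic magnetization is smoothed by an assumed Gaussian sensor response and then recovered by Tikhonov-regularized least squares. This illustrates the general inverse problem only; it is not the Oda and Xuan (2014) algorithm that UDECON implements.

```python
# Minimal sketch of regularized deconvolution for a pass-through record,
# assuming a Gaussian sensor response (7 cm half-width, invented) and a
# second-difference smoothness penalty. Not the UDECON algorithm.
import numpy as np

dz = 1.0                                   # measurement spacing (cm)
z = np.arange(0, 150, dz)                  # positions along the core
true_m = np.sin(z / 8.0) + 0.3 * np.sin(z / 2.0)   # synthetic magnetization

# Assumed sensor response kernel, normalized to unit area
kernel_z = np.arange(-20, 21) * dz
kernel = np.exp(-0.5 * (kernel_z / 7.0) ** 2)
kernel /= kernel.sum()

# Build the convolution matrix A so that measured = A @ true_m
n = len(z)
A = np.zeros((n, n))
for i in range(n):
    for j, k in enumerate(kernel):
        col = i + j - len(kernel) // 2
        if 0 <= col < n:
            A[i, col] = k

measured = A @ true_m + np.random.default_rng(0).normal(0, 0.01, n)

# Tikhonov inversion: minimize ||A m - d||^2 + lam * ||D m||^2
lam = 0.1                                  # smoothness weight (tunable)
D = np.diff(np.eye(n), n=2, axis=0)        # second-difference operator
m_hat = np.linalg.solve(A.T @ A + lam * D.T @ D, A.T @ measured)
print("max reconstruction error:", np.abs(m_hat - true_m).max())
```

    The smoothness weight plays the role of the adjustable smoothness parameter the abstract mentions; UDECON's "Grid search" and "Simplex method" options optimize such parameters rather than fixing them by hand.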

  20. Development of a Mammographic Image Processing Environment Using MATLAB.

    DTIC Science & Technology

    1994-12-01

    Correctness, reliability, efficiency, integrity, usability (Figure 1.2: McCall's software quality factors [Pressman, 1987])...Each quality factor is itself related to independent attributes called criteria [Cooper and Fisher, 1979], or metrics [Pressman, 1987], that can be used to judge, define, and measure quality [Cooper and Fisher, 1979]. Figure 3.9 shows the criteria that are used

  1. Current Practice in Software Development for Computational Neuroscience and How to Improve It

    PubMed Central

    Gewaltig, Marc-Oliver; Cannon, Robert

    2014-01-01

    Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research. PMID:24465191

  2. Current practice in software development for computational neuroscience and how to improve it.

    PubMed

    Gewaltig, Marc-Oliver; Cannon, Robert

    2014-01-01

    Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research.

  3. Software solution for autonomous observations with H2RG detectors and SIDECAR ASICs for the RATIR camera

    NASA Astrophysics Data System (ADS)

    Klein, Christopher R.; Kubánek, Petr; Butler, Nathaniel R.; Fox, Ori D.; Kutyrev, Alexander S.; Rapchun, David A.; Bloom, Joshua S.; Farah, Alejandro; Gehrels, Neil; Georgiev, Leonid; González, J. Jesús; Lee, William H.; Lotkin, Gennadiy N.; Moseley, Samuel H.; Prochaska, J. Xavier; Ramirez-Ruiz, Enrico; Richer, Michael G.; Robinson, Frederick D.; Román-Zúñiga, Carlos; Samuel, Mathew V.; Sparr, Leroy M.; Tucker, Corey; Watson, Alan M.

    2012-07-01

    The Reionization And Transients InfraRed (RATIR) camera has been built for rapid Gamma-Ray Burst (GRB) followup and will provide quasi-simultaneous imaging in ugriZY JH. The optical component uses two 2048 × 2048 pixel Finger Lakes Imaging ProLine detectors, one optimized for the SDSS u, g, and r bands and one optimized for the SDSS i band. The infrared portion incorporates two 2048 × 2048 pixel Teledyne HgCdTe HAWAII-2RG detectors, one with a 1.7-micron cutoff and one with a 2.5-micron cutoff. The infrared detectors are controlled by Teledyne's SIDECAR (System for Image Digitization Enhancement Control And Retrieval) ASICs (Application Specific Integrated Circuits). While other ground-based systems have used the SIDECAR before, this system also utilizes Teledyne's JADE2 (JWST ASIC Drive Electronics) interface card and IDE (Integrated Development Environment). Here we present a summary of the software developed to interface the RATIR detectors with Remote Telescope System, 2nd Version (RTS2) software. RTS2 is an integrated open source package for remote observatory control under the Linux operating system and will autonomously coordinate observatory dome, telescope pointing, detector, filter wheel, focus stage, and dewar vacuum compressor operations. Where necessary we have developed custom interfaces between RTS2 and RATIR hardware, most notably for cryogenic focus stage motor drivers and temperature controllers. All detector and hardware interface software developed for RATIR is freely available and open source as part of the RTS2 distribution.

  4. Reusable Software and Open Data Incorporate Ecological Understanding To Optimize Agriculture and Improve Crops.

    NASA Astrophysics Data System (ADS)

    LeBauer, D.

    2015-12-01

    Humans need a secure and sustainable food supply, and science can help. We have an opportunity to transform agriculture by combining knowledge of organisms and ecosystems to engineer ecosystems that sustainably produce food, fuel, and other services. The challenge is that the information we have, in the form of measurements, theories, and laws found in publications, notebooks, software, and human brains, is difficult to combine. We homogenize, encode, and automate the synthesis of data and mechanistic understanding in a way that links understanding at different scales and across domains. This allows extrapolation, prediction, and assessment. Reusable components allow automated construction of new knowledge that can be used to assess, predict, and optimize agro-ecosystems. Developing reusable software and open-access databases is hard, and examples will illustrate how we use the Predictive Ecosystem Analyzer (PEcAn, pecanproject.org), the Biofuel Ecophysiological Traits and Yields database (BETYdb, betydb.org), and ecophysiological crop models to predict crop yield, decide which crops to plant, and determine which traits can be selected for the next generation of data-driven crop improvement. A next step is to automate the use of sensors mounted on robots, drones, and tractors to assess plants in the field. The TERRA Reference Phenotyping Platform (TERRA-Ref, terraref.github.io) will provide an open-access database and computing platform on which researchers can use and develop tools that use sensor data to assess and manage agricultural and other terrestrial ecosystems. TERRA-Ref will adopt existing standards and develop modular software components and common interfaces, in collaboration with researchers from iPlant, NEON, AgMIP, USDA, rOpenSci, ARPA-E, and many scientists and industry partners. Our goal is to advance science by enabling efficient use, reuse, exchange, and creation of knowledge.

  5. [Application of Stata software to test heterogeneity in meta-analysis method].

    PubMed

    Wang, Dan; Mou, Zhen-yun; Zhai, Jun-xia; Zong, Hong-xia; Zhao, Xiao-dong

    2008-07-01

    To introduce the application of Stata software to heterogeneity testing in meta-analysis. A data set was set up according to the example in the study, and the corresponding commands of the methods in Stata 9 software were applied to test the example. The methods used were the Q-test and I2 statistic attached to the fixed-effect model forest plot, the H statistic, and the Galbraith plot. The existence of heterogeneity among studies could be detected by the Q-test and H statistic, and the degree of heterogeneity could be detected by the I2 statistic. The outliers which were the sources of the heterogeneity could be spotted from the Galbraith plot. Heterogeneity testing in meta-analysis can be completed by the four methods in Stata software simply and quickly. The H and I2 statistics are more robust, and among the four methods, the outliers responsible for heterogeneity are most clearly seen in the Galbraith plot.
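    For readers outside Stata, the fixed-effect heterogeneity statistics named above follow directly from the inverse-variance study weights; a small sketch with made-up effect sizes is given below.

```python
# Small sketch (not Stata) of the heterogeneity statistics the abstract
# describes: Cochran's Q, H, and I-squared, computed from per-study effect
# estimates and standard errors (all values invented for illustration).
import numpy as np

effects = np.array([0.12, 0.30, 0.25, 0.55, 0.18])   # study effect sizes
se      = np.array([0.10, 0.12, 0.09, 0.15, 0.11])   # standard errors

w = 1.0 / se**2                           # inverse-variance weights
pooled = np.sum(w * effects) / np.sum(w)  # fixed-effect pooled estimate

Q  = np.sum(w * (effects - pooled)**2)    # Cochran's Q
df = len(effects) - 1
H  = np.sqrt(Q / df)                      # H statistic (H > 1 suggests heterogeneity)
I2 = max(0.0, (Q - df) / Q) * 100         # I^2: % of variation due to heterogeneity

print(f"Q = {Q:.2f} on {df} df, H = {H:.2f}, I^2 = {I2:.1f}%")
```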

  6. Coordinating the Complexity of Tools, Tasks, and Users: On Theory-Based Approaches to Authoring Tool Usability

    ERIC Educational Resources Information Center

    Murray, Tom

    2016-01-01

    Intelligent Tutoring Systems authoring tools are highly complex educational software applications used to produce highly complex software applications (i.e. ITSs). How should our assumptions about the target users (authors) impact the design of authoring tools? In this article I first reflect on the factors leading to my original 1999 article on…

  7. Big Hand Produces CD-I World Disc--First Title Authored Entirely with MediaMogul.

    ERIC Educational Resources Information Center

    Buckman, Brad; Grant, Valerie

    1993-01-01

    Discusses Big Hand Productions' development of CD-I-WORLD, an interactive multimedia magazine on compact disk, designed for mass appeal and complete with advertising. Features of MediaMogul, the prepackaged authoring software that made production of this innovation possible, are described. Sample screen displays are included. (EA)

  8. DHS-STEM Internship at Lawrence Livermore National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feldman, B

    2008-08-18

    This summer I had the fortunate opportunity through the DHS-STEM program to attend Lawrence Livermore National Laboratory (LLNL) to work with Tom Slezak on the bioinformatics team. The bioinformatics team, among other things, helps to develop TaqMan and microarray probes for the identification of pathogens. My main project at the laboratory was to test such probe identification capabilities against metagenomic (unsequenced) data from around the world. Using various sequence analysis tools (Vmatch and Blastall) and several we developed ourselves, about 120 metagenomic sequencing projects were compared against a collection of all completely sequenced genomes and Lawrence Livermore National Laboratory's (LLNL) current probe database. For the probes, the Blastall algorithms compared each individual metagenomic project using various parameters allowing for the natural ambiguities of in vitro hybridization (mismatches, deletions, insertions, hairpinning, etc.). A low-level cutoff was used to eliminate poor sequence matches, and to leave a large variety of higher quality matches for future research into the hybridization of sequences with mutations and variations. Any hits with at least 80% base pair conservation over 80% of the length of the match were retained. Because of the size of our whole genome database, we utilized the exact match algorithm of Vmatch to quickly search and compare genomes for exact matches with varying lower level limits on sequence length. I also provided preliminary feasibility analyses to support a potential industry-funded project to develop a multiplex assay on several genera and species. Each genus and species was evaluated based on the number of sequenced genomes, the number of sequenced near-neighbor genomes, the presence of identifying genes--metabolic or antibiotic-resistance genes--and the availability of research on the identification of the specific genera or species. Utilizing the bioinformatics team's software, I was able to develop and/or update several TaqMan probes for these and develop a plan of identification for the more difficult ones. One suggestion for a genus with low conservation was to separate species into several groups, look for probes within these, and then use a combination of probes to identify a genus. This has the added benefit of also providing subgenus identification in larger genera. During both projects I developed a set of computer programs to simplify or consolidate several processes. These programs were constructed with the intent of being reused to either repeat these results, further this research, or start a similar project. A big problem in the bioinformatics/sequencing field is the variability of data storage formats, which makes using data from various sources extremely difficult. Excluding for the moment the many errors present in online database genome sequences, there are still many difficulties in converting one data type into another successfully every time. Dealing with hundreds of files, each hundreds of megabytes, requires automation, which in turn requires good data mining software. The programs I developed will help ease this issue and make more genomic sources available for use. With these programs it is extremely easy to gather the data, cleanse it, convert it, and run it through some analysis software, and even analyze the output of this software. When dealing with vast amounts of data it is vital for the researcher to optimize the process, which became clear to me with only ten weeks to work with.
    Due to the time constraint of the internship, I was unable to finish my metagenomic project; I did, however, successfully finish my second project, developing TaqMan identification for genera and species. Although I did not complete my first project, I made significant findings along the way that suggest the need for further research on the subject. I found several instances of false positives in the metagenomic data from our microarrays, which indicates the need to sequence more metagenomic samples. My initial research shows the importance of expanding our known metagenomic world; at this point there is always the likelihood of developing probes with unknown interactions because there is not enough sequencing. On the other hand, my research did point out the sensitivity and quality of LLNL's microarrays when it identified a parvoviridae infection in a mosquito metagenomic sample from southern California. It also uniquely identified the presence of several species of adenovirus, which could mean that there was some archaic strain of adenovirus present in the metagenomic sample or that the sample was contaminated, a question requiring further investigation.
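    The 80%/80% screening step described above can be expressed compactly; the sketch below filters BLAST tabular output on those thresholds. The column layout assumes BLAST's tabular format (-outfmt 6), and the probe-length lookup and file path are hypothetical.

```python
# Hypothetical sketch of the 80%/80% screening step: keep BLAST hits with
# >= 80% identity over >= 80% of the probe length. Assumes BLAST tabular
# output (-outfmt 6): qseqid sseqid pident length mismatch gapopen ...
def filter_hits(blast_tab_path, probe_lengths, min_ident=80.0, min_cov=0.80):
    kept = []
    with open(blast_tab_path) as fh:
        for line in fh:
            cols = line.rstrip("\n").split("\t")
            qseqid, sseqid = cols[0], cols[1]
            pident = float(cols[2])      # percent identity of the alignment
            aln_len = int(cols[3])       # alignment length in bases
            coverage = aln_len / probe_lengths[qseqid]
            if pident >= min_ident and coverage >= min_cov:
                kept.append((qseqid, sseqid, pident, coverage))
    return kept
```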

  9. Automated smear counting and data processing using a notebook computer in a biomedical research facility.

    PubMed

    Ogata, Y; Nishizawa, K

    1995-10-01

    An automated smear counting and data processing system for a life science laboratory was developed to facilitate routine surveys and eliminate human errors by using a notebook computer. This system was composed of a personal computer, a liquid scintillation counter, and a well-type NaI(Tl) scintillation counter. The radioactivity of smear samples was automatically measured by these counters. The personal computer received raw signals from the counters through an RS-232C interface. The software for the computer evaluated the surface density of each radioisotope and printed out that value along with other items as a report. The software was programmed in the Pascal language. This system was successfully applied to routine surveys for contamination in our facility.
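    The abstract does not give the exact formula the software used; a plausible sketch of the standard smear-survey calculation (net count rate, counting efficiency, wiped area, and an assumed 10% smear collection factor) is shown below. All parameter values are illustrative assumptions.

```python
# Hedged sketch of a typical surface-density calculation for smear surveys:
# net count rate -> activity via counting efficiency -> Bq/cm^2 via the
# wiped area and an assumed smear (wipe) collection factor. Values invented.
def surface_density_bq_cm2(gross_cpm: float, bkg_cpm: float,
                           efficiency: float, wiped_area_cm2: float = 100.0,
                           smear_factor: float = 0.1) -> float:
    net_cps = max(gross_cpm - bkg_cpm, 0.0) / 60.0   # counts/s above background
    activity_bq = net_cps / efficiency               # decays/s on the smear
    return activity_bq / (wiped_area_cm2 * smear_factor)

print(f"{surface_density_bq_cm2(350, 50, 0.3):.3f} Bq/cm^2")
```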

  10. The social disutility of software ownership.

    PubMed

    Douglas, David M

    2011-09-01

    Software ownership allows the owner to restrict the distribution of software and to prevent others from reading the software's source code and building upon it. However, free software is released to users under software licenses that give them the right to read the source code, modify it, reuse it, and distribute the software to others. Proponents of free software such as Richard M. Stallman and Eben Moglen argue that the social disutility of software ownership is a sufficient justification for prohibiting it. This social disutility includes the social instability of disregarding laws and agreements covering software use and distribution, inequality of software access, and the inability to help others by sharing software with them. Here I consider these and other social disutility claims against withholding specific software rights from users, in particular, the rights to read the source code, duplicate, distribute, modify, imitate, and reuse portions of the software within new programs. I find that generally while withholding these rights from software users does cause some degree of social disutility, only the rights to duplicate, modify and imitate cannot legitimately be denied to users on this basis. The social disutility of withholding the rights to distribute the software, read its source code and reuse portions of it in new programs is insufficient to prohibit software owners from denying them to users. A compromise between the software owner and user can minimise the social disutility of withholding these particular rights from users. However, the social disutility caused by software patents is sufficient for rejecting such patents as they restrict the methods of reducing social disutility possible with other forms of software ownership.

  11. Understanding Acceptance of Software Metrics--A Developer Perspective

    ERIC Educational Resources Information Center

    Umarji, Medha

    2009-01-01

    Software metrics are measures of software products and processes. Metrics are widely used by software organizations to help manage projects, improve product quality and increase efficiency of the software development process. However, metrics programs tend to have a high failure rate in organizations, and developer pushback is one of the sources…

  12. A high order approach to flight software development and testing

    NASA Technical Reports Server (NTRS)

    Steinbacher, J.

    1981-01-01

    The use of a software development facility is discussed as a means of producing a reliable and maintainable ECS software system, and as a means of providing efficient use of the ECS hardware test facility. Principles applied to software design are given, including modularity, abstraction, hiding, and uniformity. The general objectives of each phase of the software life cycle are also given, including testing, maintenance, code development, and requirement specifications. Software development facility tools are summarized, and tool deficiencies recognized in the code development and testing phases are considered. Due to limited lab resources, the functional simulation capabilities may be indispensable in the testing phase.

  13. Modular Infrastructure for Rapid Flight Software Development

    NASA Technical Reports Server (NTRS)

    Pires, Craig

    2010-01-01

    This slide presentation reviews the use of modular infrastructure to assist in the development of flight software. A feature of this program is the use of a model-based approach for application-unique software. Two programs on which this approach was used are reviewed: the development of software for the Hover Test Vehicle (HTV) and the Lunar Atmosphere and Dust Environment Explorer (LADEE).

  14. NAPR: a Cloud-Based Framework for Neuroanatomical Age Prediction.

    PubMed

    Pardoe, Heath R; Kuzniecky, Ruben

    2018-01-01

    The availability of cloud computing services has enabled the widespread adoption of the "software as a service" (SaaS) approach for software distribution, which utilizes network-based access to applications running on centralized servers. In this paper we apply the SaaS approach to neuroimaging-based age prediction. Our system, named "NAPR" (Neuroanatomical Age Prediction using R), provides access to predictive modeling software running on a persistent cloud-based Amazon Web Services (AWS) compute instance. The NAPR framework allows external users to estimate the age of individual subjects using cortical thickness maps derived from their own locally processed T1-weighted whole brain MRI scans. As a demonstration of the NAPR approach, we have developed two age prediction models that were trained using healthy control data from the ABIDE, CoRR, DLBS and NKI Rockland neuroimaging datasets (total N = 2367, age range 6-89 years). The provided age prediction models were trained using (i) relevance vector machines and (ii) Gaussian processes machine learning methods applied to cortical thickness surfaces obtained using Freesurfer v5.3. We believe that this transparent approach to out-of-sample evaluation and comparison of neuroimaging age prediction models will facilitate the development of improved age prediction models and allow for robust evaluation of the clinical utility of these methods.
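    As a rough illustration of the Gaussian-process variant of such an age-prediction model (not the NAPR code itself, and with synthetic features standing in for Freesurfer cortical thickness maps), one could write:

```python
# Illustrative sketch of Gaussian-process age prediction from cortical
# thickness features, using scikit-learn. X and y are synthetic stand-ins;
# NAPR's actual models, training data, and preprocessing are not shown here.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))                   # 200 subjects x 50 features
y = 40 + X[:, 0] * 10 + rng.normal(0, 3, 200)    # synthetic "age" in years

gp = GaussianProcessRegressor(kernel=RBF(length_scale=10.0) + WhiteKernel(),
                              normalize_y=True)
gp.fit(X[:150], y[:150])                         # train on 150 subjects
pred, std = gp.predict(X[150:], return_std=True) # predict held-out ages
print("mean abs error (years):", np.abs(pred - y[150:]).mean())
```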

  15. An information model for use in software management estimation and prediction

    NASA Technical Reports Server (NTRS)

    Li, Ningda R.; Zelkowitz, Marvin V.

    1993-01-01

    This paper describes the use of cluster analysis for determining the information model within collected software engineering development data at the NASA/GSFC Software Engineering Laboratory. We describe the Software Management Environment tool that allows managers to predict development attributes during early phases of a software project and the modifications we propose to allow it to develop dynamic models for better predictions of these attributes.
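    The abstract names cluster analysis over collected project data as the basis for prediction; a hedged sketch of that general idea, with invented project features, is shown below: past projects are clustered, a new project is assigned to its nearest cluster, and attributes are predicted from that cluster's members.

```python
# Schematic sketch of cluster-based prediction of development attributes
# (the general technique the abstract names, not the SME tool itself).
# The project features and values are invented for illustration.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# columns: effort (staff-months), size (KSLOC), error rate (faults/KSLOC)
projects = rng.normal(loc=[30, 50, 5], scale=[10, 20, 2], size=(40, 3))

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(projects)

new_project = np.array([[25, 45, 4]])     # early measurements of a new project
cluster = km.predict(new_project)[0]      # most similar historical cluster
peers = projects[km.labels_ == cluster]
print("predicted effort from cluster mean:", peers[:, 0].mean())
```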

  16. Automated Theorem Proving in High-Quality Software Design

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Swanson, Keith (Technical Monitor)

    2001-01-01

    The amount and complexity of software developed during the last few years has increased tremendously. In particular, programs are being used more and more in embedded systems (from car brakes to plant control). Many of these applications are safety-relevant, i.e., a malfunction of hardware or software can cause severe damage or loss. Tremendous risks are typically present in the area of aviation, (nuclear) power plants, or (chemical) plant control. Here, even small problems can lead to thousands of casualties and huge financial losses. Large financial risks also exist when computer systems are used in the area of telecommunication (telephone, electronic commerce) or space exploration. Computer applications in this area are not only subject to safety considerations; security issues are important as well. All these systems must be designed and developed to guarantee high quality with respect to safety and security. Even in an industrial setting which is (or at least should be) aware of the high requirements in software engineering, many incidents occur. For example, the Warsaw Airbus crash was caused by an incomplete requirements specification. Uncontrolled reuse of an Ariane 4 software module was the reason for the Ariane 5 disaster. Some recent incidents in the telecommunication area, like the illegal "cloning" of smart cards of D2 GSM mobile phones, or the extraction of (secret) passwords from German T-Online users, show that serious flaws can happen in this area as well. Due to the inherent complexity of computer systems, most authors claim that only a rigorous application of formal methods in all stages of the software life cycle can ensure high quality of the software and lead to truly safe and secure systems. In this paper, we examine to what extent automated theorem proving can contribute to a more widespread application of formal methods and their tools, and what automated theorem provers (ATPs) must provide in order to be useful.

  17. 78 FR 32169 - Facilitating the Deployment of Text-to-911 and Other Next Generation 911 Applications

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-29

    ...providers of interconnected text messaging services (i.e., all providers of software applications that enable a consumer to send text...

  18. Virtual pools for interactive analysis and software development through an integrated Cloud environment

    NASA Astrophysics Data System (ADS)

    Grandi, C.; Italiano, A.; Salomoni, D.; Calabrese Melcarne, A. K.

    2011-12-01

    WNoDeS, an acronym for Worker Nodes on Demand Service, is software developed at CNAF-Tier1, the National Computing Centre of the Italian Institute for Nuclear Physics (INFN) located in Bologna. WNoDeS provides on demand, integrated access to both Grid and Cloud resources through virtualization technologies. Besides the traditional use of computing resources in batch mode, users need to have interactive and local access to a number of systems. WNoDeS can dynamically select these computers instantiating Virtual Machines, according to the requirements (computing, storage and network resources) of users through either the Open Cloud Computing Interface API, or through a web console. An interactive use is usually limited to activities in user space, i.e. where the machine configuration is not modified. In some other instances the activity concerns development and testing of services and thus implies the modification of the system configuration (and, therefore, root-access to the resource). The former use case is a simple extension of the WNoDeS approach, where the resource is provided in interactive mode. The latter implies saving the virtual image at the end of each user session so that it can be presented to the user at subsequent requests. This work describes how the LHC experiments at INFN-Bologna are testing and making use of these dynamically created ad-hoc machines via WNoDeS to support flexible, interactive analysis and software development at the INFN Tier-1 Computing Centre.

  19. ICESat (GLAS) Science Processing Software Document Series. Volume 1; Science Software Management Plan; 3.0

    NASA Technical Reports Server (NTRS)

    Hancock, David W., III

    1999-01-01

    This document provides the Software Management Plan for the GLAS Standard Data Software (SDS) supporting the GLAS instrument of the EOS ICESat Spacecraft. The SDS encompasses the ICESat Science Investigator-led Processing System (I-SIPS) Software and the Instrument Support Terminal (IST) Software. For the I-SIPS Software, the SDS will produce Level 0, Level 1, and Level 2 data products as well as the associated product quality assessments and descriptive information. For the IST Software, the SDS will accommodate the GLAS instrument support areas of engineering status, command, performance assessment, and instrument health status.

  20. Simulation training for medical emergencies in the dental setting using an inexpensive software application.

    PubMed

    Kishimoto, N; Mukai, N; Honda, Y; Hirata, Y; Tanaka, M; Momota, Y

    2017-11-09

    Every dental provider needs to be educated about medical emergencies to provide safe dental care. Simulation training is available with simulators such as advanced life support manikins and robot patients. However, the purchase and development costs of these simulators are high. We have developed a simulation training course on medical emergencies using an inexpensive software application. The purpose of this study was to evaluate the educational effectiveness of this course. Fifty-one dental providers participated in this study from December 2014 to March 2015. Medical simulation software was used to simulate a patient's vital signs. We evaluated participants' ability to diagnose and treat vasovagal syncope or anaphylaxis with an evaluation sheet and conducted a questionnaire before and after the scenario-based simulation training. The median evaluation sheet score for vasovagal syncope increased significantly from 7/9 before to 9/9 after simulation training. The median score for anaphylaxis also increased significantly from 8/12 to 12/12 (P < .01). For the item "I can treat vasovagal syncope/anaphylaxis adequately," the percentage responding "Strongly agree" or "Agree" increased from 14% to 56% for vasovagal syncope and from 6% to 42% for anaphylaxis with simulation training. This simulation course improved participants' ability to diagnose and treat medical emergencies and improved their confidence. This course can be offered inexpensively using a software application.

  1. EpHLA software: a timesaving and accurate tool for improving identification of acceptable mismatches for clinical purposes.

    PubMed

    Filho, Herton Luiz Alves Sales; da Mata Sousa, Luiz Claudio Demes; von Glehn, Cristina de Queiroz Carrascosa; da Silva, Adalberto Socorro; dos Santos Neto, Pedro de Alcântara; do Nascimento, Ferraz; de Castro, Adail Fonseca; do Nascimento, Liliane Machado; Kneib, Carolina; Bianchi Cazarote, Helena; Mayumi Kitamura, Daniele; Torres, Juliane Roberta Dias; da Cruz Lopes, Laiane; Barros, Aryela Loureiro; da Silva Edlin, Evelin Nildiane; de Moura, Fernanda Sá Leal; Watanabe, Janine Midori Figueiredo; do Monte, Semiramis Jamil Hadad

    2012-06-01

    The HLAMatchmaker algorithm, which allows the identification of “safe” acceptable mismatches (AMMs) for recipients of solid organ and cell allografts, is rarely used, in part due to the difficulty of using it in the current Excel format. The automation of this algorithm may universalize its use to benefit the allocation of allografts. Recently, we have developed new software called EpHLA, which is the first computer program automating the use of the HLAMatchmaker algorithm. Herein, we present the experimental validation of the EpHLA program by showing its time efficiency and quality of operation. The same results, obtained by a single antigen bead assay with sera from 10 sensitized patients waiting for kidney transplants, were analyzed either by the conventional HLAMatchmaker method or by the automated EpHLA method. Users testing these two methods were asked to record: (i) time required for completion of the analysis (in minutes); (ii) number of eplets obtained for class I and class II HLA molecules; (iii) categorization of eplets as reactive or non-reactive based on the MFI cutoff value; and (iv) determination of AMMs based on eplets' reactivities. We showed that although both methods had similar accuracy, the automated EpHLA method was over 8 times faster than the conventional HLAMatchmaker method. In particular, the EpHLA software was faster and more reliable than, yet equally accurate as, the conventional method for defining AMMs for allografts. The EpHLA software is an accurate and quick method for the identification of AMMs, and thus it may be a very useful tool in the decision-making process of organ allocation for highly sensitized patients, as well as in many other applications.
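    Schematically, the eplet logic that HLAMatchmaker automates reduces to set operations; in the toy sketch below (hypothetical eplet assignments and MFI values, not the EpHLA implementation), an allele is marked as an acceptable mismatch when none of its eplets are reactive at the chosen MFI cutoff.

```python
# Toy sketch of eplet-based acceptable-mismatch logic. The allele-to-eplet
# assignments, bead MFI values, and cutoff are all invented; EpHLA's actual
# data tables and workflow are not reproduced here.
allele_eplets = {                       # hypothetical eplet assignments
    "A*02:01": {"62GE", "144TKH"},
    "A*24:02": {"62EE", "144TKH"},
}
eplet_mfi = {"62GE": 3200, "62EE": 450, "144TKH": 600}   # toy bead MFIs
MFI_CUTOFF = 1000

# An eplet is reactive if its bead MFI reaches the cutoff
reactive = {e for e, mfi in eplet_mfi.items() if mfi >= MFI_CUTOFF}

# Acceptable mismatch: allele sharing no reactive eplets with the serum
acceptable = [a for a, eps in allele_eplets.items() if not (eps & reactive)]
print("acceptable mismatches:", acceptable)   # -> ['A*24:02']
```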

  2. Software Process Assurance for Complex Electronics

    NASA Technical Reports Server (NTRS)

    Plastow, Richard A.

    2007-01-01

    Complex Electronics (CE) now perform tasks that were previously handled in software, such as communication protocols. Many methods used to develop software bear a close resemblance to CE development. Field Programmable Gate Arrays (FPGAs) can have over a million logic gates, while system-on-chip (SOC) devices can combine a microprocessor, input and output channels, and sometimes an FPGA for programmability. With this increased intricacy, the possibility of software-like bugs such as incorrect design, logic, and unexpected interactions within the logic is great. With CE devices obscuring the hardware/software boundary, we propose that mature software methodologies may be utilized with slight modifications in the development of these devices. Software Process Assurance for Complex Electronics (SPACE) is a research project that used standardized S/W Assurance/Engineering practices to provide an assurance framework for development activities. Tools such as checklists, best practices, and techniques were used to detect missing requirements and bugs earlier in the development cycle, creating a development process for CE that was more easily maintained, consistent, and configurable based on the device used.

  3. Flight dynamics system software development environment (FDS/SDE) tutorial

    NASA Technical Reports Server (NTRS)

    Buell, John; Myers, Philip

    1986-01-01

    A sample development scenario using the Flight Dynamics System Software Development Environment (FDS/SDE) is presented. The SDE uses a menu-driven, fill-in-the-blanks format that provides online help at all steps, thus eliminating lengthy training and allowing immediate use of this new software development tool.

  4. Precise Documentation: The Key to Better Software

    NASA Astrophysics Data System (ADS)

    Parnas, David Lorge

    The prime cause of the sorry “state of the art” in software development is our failure to produce good design documentation. Poor documentation is the cause of many errors and reduces efficiency in every phase of a software product's development and use. Most software developers believe that “documentation” refers to a collection of wordy, unstructured, introductory descriptions, thousands of pages that nobody wanted to write and nobody trusts. In contrast, engineers in more traditional disciplines think of precise blueprints, circuit diagrams, and mathematical specifications of component properties. Software developers do not know how to produce precise documents for software. Software developers also think that documentation is something written after the software has been developed. In other fields of engineering, much of the documentation is written before and during the development. It represents forethought, not afterthought. Among the benefits of better documentation would be: easier reuse of old designs, better communication about requirements, more useful design reviews, easier integration of separately written modules, more effective code inspection, more effective testing, and more efficient corrections and improvements. This paper explains how to produce and use precise software documentation and illustrates the methods with several examples.

  5. NA-42 TI Shared Software Component Library FY2011 Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knudson, Christa K.; Rutz, Frederick C.; Dorow, Kevin E.

    The NA-42 TI program initiated an effort in FY2010 to standardize its software development efforts with the long-term goal of migrating toward a software management approach that will allow for the sharing and reuse of code developed within the TI program, improve integration, ensure a level of software documentation, and reduce development costs. The Pacific Northwest National Laboratory (PNNL) has been tasked with two activities that support this mission. PNNL has been tasked with the identification, selection, and implementation of a Shared Software Component Library. The intent of the library is to provide a common repository that is accessible by all authorized NA-42 software development teams. The repository facilitates software reuse through a searchable and easy-to-use, web-based interface. As software is submitted to the repository, the component registration process captures meta-data and provides version control for compiled libraries, documentation, and source code. This meta-data is then available for retrieval and review as part of library search results. In FY2010, PNNL and staff from the Remote Sensing Laboratory (RSL) teamed up to develop a software application with the goal of replacing the aging Aerial Measuring System (AMS). The application under development includes an Advanced Visualization and Integration of Data (AVID) framework and associated AMS modules. Throughout development, PNNL and RSL have utilized a common AMS code repository for collaborative code development. The AMS repository is hosted by PNNL, is restricted to the project development team, is accessed via two different geographic locations, and continues to be used. The knowledge gained from the collaboration and hosting of this repository, in conjunction with PNNL software development and systems engineering capabilities, was used in the selection of a package to be used in the implementation of the software component library on behalf of NA-42 TI. The second task managed by PNNL is the development and continued maintenance of the NA-42 TI Software Development Questionnaire. This questionnaire is intended to help software development teams working under NA-42 TI in documenting their development activities. When sufficiently completed, the questionnaire illustrates that the software development activities recorded incorporate significant aspects of the software engineering lifecycle. The questionnaire template is updated as comments are received from NA-42 and/or its development teams, and revised versions are distributed to those using the questionnaire. PNNL also maintains a list of questionnaire recipients. The blank questionnaire template, the AVID and AMS software being developed, and the completed AVID AMS-specific questionnaire are being used as the initial content to be established in the TI Component Library. This report summarizes the approach taken to identify requirements, search for and evaluate technologies, and the approach taken for installation of the software needed to host the component library. Additionally, it defines the process by which users request access for the contribution and retrieval of library content.

  6. Estimating Software-Development Costs With Greater Accuracy

    NASA Technical Reports Server (NTRS)

    Baker, Dan; Hihn, Jairus; Lum, Karen

    2008-01-01

    COCOMOST is a computer program for use in estimating software development costs. The goal in the development of COCOMOST was to increase estimation accuracy in three ways: (1) develop a set of sensitivity software tools that return not only estimates of costs but also the estimation error; (2) using the sensitivity software tools, precisely define the quantities of data needed to adequately tune cost estimation models; and (3) build a repository of software-cost-estimation information that NASA managers can retrieve to improve the estimates of costs of developing software for their project. COCOMOST implements a methodology, called '2cee', in which a unique combination of well-known pre-existing data-mining and software-development- effort-estimation techniques are used to increase the accuracy of estimates. COCOMOST utilizes multiple models to analyze historical data pertaining to software-development projects and performs an exhaustive data-mining search over the space of model parameters to improve the performances of effort-estimation models. Thus, it is possible to both calibrate and generate estimates at the same time. COCOMOST is written in the C language for execution in the UNIX operating system.
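    The abstract does not detail COCOMOST's internal models, but estimators in this family typically start from a COCOMO-style power law; as background, a minimal sketch using the classic intermediate-COCOMO "semi-detached" coefficients is shown below. The coefficients and the effort adjustment factor (EAF) value are illustrative only and do not represent COCOMOST's calibrated parameters.

```python
# Background sketch: a basic COCOMO-style effort estimate,
# effort = a * KLOC^b * EAF, with Boehm's "semi-detached" coefficients.
# Not COCOMOST's model; shown only to illustrate the general approach.
def cocomo_effort(kloc: float, eaf: float = 1.0,
                  a: float = 3.0, b: float = 1.12) -> float:
    """Estimated development effort in person-months."""
    return a * (kloc ** b) * eaf

print(f"{cocomo_effort(50, eaf=1.15):.0f} person-months for 50 KLOC")
```

    Tools like COCOMOST improve on this baseline by calibrating the parameters against historical project data and reporting the estimation error alongside the estimate.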

  7. Software Development for the Hobby-Eberly Telescope's Segment Alignment Maintenance System using LABView

    NASA Technical Reports Server (NTRS)

    Hall, Drew P.; Ly, William; Howard, Richard T.; Weir, John; Rakoczy, John; Roe, Fred (Technical Monitor)

    2002-01-01

    The software development for an upgrade to the Hobby-Eberly Telescope (HET) was done in LABView. In order to improve the performance of the HET at the McDonald Observatory, a closed-loop system had to be implemented to keep the mirror segments aligned during periods of observation. The control system, called the Segment Alignment Maintenance System (SAMs), utilized inductive sensors to measure the relative motions of the mirror segments. Software was developed in LABView to tie the sensors, operator interface, and mirror-control motors together. Developing the software in LABView allowed the system to be flexible, understandable, and able to be modified by the end users. Since LABView is built using block diagrams, the software naturally followed the designed control system's block and flow diagrams, and individual software blocks could be easily verified. LABView's many built-in display routines allowed easy visualization of diagnostic and health-monitoring data during testing. Also, since LABView is a multi-platform software package, different programmers could develop the code remotely on various types of machines. LABView's ease of use facilitated rapid prototyping and field testing. There were some unanticipated difficulties in the software development, but the use of LABView as the software "language" for the development of SAMs contributed to the overall success of the project.

  8. Development of Software to Model AXAF-I Image Quality

    NASA Technical Reports Server (NTRS)

    Geary, Joseph; Hawkins, Lamar; Ahmad, Anees; Gong, Qian

    1997-01-01

    This report describes work conducted on Delivery Order 181 between October 1996 through June 1997. During this period software was written to: compute axial PSD's from RDOS AXAF-I mirror surface maps; plot axial surface errors and compute PSD's from HDOS "Big 8" axial scans; plot PSD's from FITS format PSD files; plot band-limited RMS vs axial and azimuthal position for multiple PSD files; combine and organize PSD's from multiple mirror surface measurements formatted as input to GRAZTRACE; modify GRAZTRACE to read FITS formatted PSD files; evaluate AXAF-I test results; improve and expand the capabilities of the GT x-ray mirror analysis package. During this period work began on a more user-friendly manual for the GT program, and improvements were made to the on-line help manual.

  9. Playing with Plug-ins

    ERIC Educational Resources Information Center

    Thompson, Douglas E.

    2013-01-01

    In today's complex music software packages, many features can remain unexplored and unused. Software plug-ins--available in most every music software package, yet easily overlooked in the software's basic operations--are one such feature. In this article, I introduce readers to plug-ins and offer tips for purchasing plug-ins I have…

  10. Journal of Open Source Software (JOSS): design and first-year review

    NASA Astrophysics Data System (ADS)

    Smith, Arfon M.

    2018-01-01

    JOSS is a free and open-access journal that publishes articles describing research software across all disciplines. It has the dual goals of improving the quality of the software submitted and providing a mechanism for research software developers to receive credit. While designed to work within the current merit system of science, JOSS addresses the dearth of rewards for key contributions to science made in the form of software. JOSS publishes articles that encapsulate scholarship contained in the software itself, and its rigorous peer review targets the software components: functionality, documentation, tests, continuous integration, and the license. A JOSS article contains an abstract describing the purpose and functionality of the software, references, and a link to the software archive. JOSS published more than 100 articles in its first year, many from the scientific python ecosystem (including a number of articles related to astronomy and astrophysics). JOSS is a sponsored project of the nonprofit organization NumFOCUS and is an affiliate of the Open Source Initiative. In this presentation, I describe the motivation, design, and progress of the Journal of Open Source Software (JOSS) and how it compares to other avenues for publishing research software in astronomy.

  11. Results of a Survey Software Development Project Management in the U.S. Aerospace Industry. Volume II. Project Management Techniques, Procedures and Tools.

    DTIC Science & Technology

    1979-12-18

    ...Project manager or person appointed by him; SE/TD project manager; Senior ADP Manager; Director, computer programming; software program design

  12. 1025: MAGIC 2010 Multi Autonomous Ground International Challenge. Volume I

    DTIC Science & Technology

    2010-10-22

    the creation of software required to interact with the sensors for each subsystem. Most of the systems have been extensively developed and tested with...varying levels of success. All of the systems have been developed from the ground up and have been discussed in the report....The system was broken down into several components. These were: (i) the ability to perform accurate localisation both indoors and outside

  13. Managing Critical Infrastructures C.I.M. Suite

    ScienceCinema

    Dudenhoeffer, Donald

    2018-05-23

    See how a new software package developed by INL researchers could help protect infrastructure during natural disasters, terrorist attacks and electrical outages. For more information about INL research, visit http://www.facebook.com/idahonationallaboratory.

  14. The Particle-in-Cell and Kinetic Simulation Software Center

    NASA Astrophysics Data System (ADS)

    Mori, W. B.; Decyk, V. K.; Tableman, A.; Fonseca, R. A.; Tsung, F. S.; Hu, Q.; Winjum, B. J.; An, W.; Dalichaouch, T. N.; Davidson, A.; Hildebrand, L.; Joglekar, A.; May, J.; Miller, K.; Touati, M.; Xu, X. L.

    2017-10-01

    The UCLA Particle-in-Cell and Kinetic Simulation Software Center (PICKSC) aims to support an international community of PIC and plasma kinetic software developers, users, and educators; to increase the use of this software for accelerating the rate of scientific discovery; and to be a repository of knowledge and history for PIC. We discuss progress towards making available and documenting illustrative open-source software programs and distinct production programs; developing and comparing different PIC algorithms; coordinating the development of resources for the educational use of kinetic software; and the outcomes of our first sponsored OSIRIS users workshop. We also welcome input and discussion from anyone interested in using or developing kinetic software, in obtaining access to our codes, in collaborating, in sharing their own software, or in commenting on how PICKSC can better serve the DPP community. Supported by NSF under Grant ACI-1339893 and by the UCLA Institute for Digital Research and Education.

  15. Comparison of Virtual Nutri Plus® and Dietpro 5i® software systems for the assessment of nutrient intake before and after Roux-en-Y gastric bypass

    PubMed Central

    Marques da Silva, Mariane; Sala, Priscila Campos; Cardinelli, Camila Siqueira; Torrinhas, Raquel Suzana; Waitzberg, Dan Linetzky

    2014-01-01

    OBJECTIVES: The assessment of nutritional intake before and after bariatric surgery assists in identifying eating disorders, nutritional deficiencies and weight loss/maintenance. The 7-day record is the gold standard for such an assessment and is interpreted using specialized software. This study sought to compare the Virtual Nutri Plus® and Dietpro 5i® software systems in assessing nutrient intake in obese patients with type 2 diabetes mellitus who underwent a Roux-en-Y gastric bypass. METHODS: Nutritional intake was assessed in 10 obese women with type 2 diabetes mellitus before and 3 months after Roux-en-Y gastric bypass. The 7-day record was used to assess food intake and then, the Virtual Nutri Plus® and Dietpro 5i® software systems were used to calculate calorie, macronutrient and micronutrient intake based on validated food chemical composition databases. Clinicaltrials.gov: NCT01251016. RESULTS: During the preoperative period, deficits in the ingestion of total fiber and 15 out of 22 estimated micronutrients were observed when using the Virtual Nutri Plus®, compared to deficiencies in total fiber and 4 micronutrients when using the Dietpro 5i®. During the postoperative period, both the Virtual Nutri Plus® and Dietpro 5i® systems detected deficits in the ingestion of total fiber, carbohydrates and 19 micronutrients, but only the Virtual Nutri Plus® detected deficits in complex B vitamins (except B12) and minerals. CONCLUSION: Virtual Nutri Plus® was more sensitive than Dietpro 5i® for the identification of deficits in nutrient intake in obese, type 2 diabetes mellitus patients undergoing Roux-en-Y gastric bypass. PMID:25518027

  16. Software Safety Progress in NASA

    NASA Technical Reports Server (NTRS)

    Radley, Charles F.

    1995-01-01

    NASA has developed guidelines for development and analysis of safety-critical software. These guidelines have been documented in a Guidebook for Safety Critical Software Development and Analysis. The guidelines represent a practical 'how to' approach, to assist software developers and safety analysts in cost effective methods for software safety. They provide guidance in the implementation of the recent NASA Software Safety Standard NSS-1740.13 which was released as 'Interim' version in June 1994, scheduled for formal adoption late 1995. This paper is a survey of the methods in general use, resulting in the NASA guidelines for safety critical software development and analysis.

  17. AcquiControl: Seismic Data Logger Control via iPhone

    NASA Astrophysics Data System (ADS)

    Golden, S.; Horkley, B.

    2010-12-01

    Seismic stations are often placed in remote areas that are accessible only a few times per year. A typical stand-alone seismic station consists of the seismometer and a data logger, which records the data to attached disk drives or flash memory for later collection by a field crew. Even if the station uses telemetry, maintenance visits may be minimized but are rarely avoided entirely. During station visits, field personnel use laptops or handheld devices to control the data logger and seismometer, check their status, and adjust their configuration if necessary. The efficiency and reliability of these on-site quality control tasks have a significant impact on the overall performance of seismic field operations. One widespread seismic data logger is the RT130 by REFTEK Inc., which is traditionally controlled through REFTEK proprietary software designed to run on Palm-compatible devices. While this software functions well, compatible Palm handhelds have gone out of production and are becoming hard to find, and we felt that its user interface still offered room for improvement. We therefore developed a new RT130 control application, named AcquiControl, which runs on Apple’s iPhone or iPod Touch and features a redesigned, user-friendly interface. The Palm handheld communicates with the data logger through a serial cable. While this is technically also possible with the iPhone or iPod Touch, the production and licensing costs of the required custom cable have so far kept us from pursuing this path further. Instead, AcquiControl makes use of the wireless networking capabilities inherent to any iPhone or iPod Touch. To wirelessly connect to an RT130 data logger, we use a wireless-to-serial “dongle” manufactured by Serialio.com, which attaches directly to the data logger’s serial port. First experiments with this setup have shown that it is actually more convenient to use than a directly attached serial cable, especially under less-than-ideal environmental conditions such as working in the rain. AcquiControl offers a third-party alternative for interfacing with RT130 data loggers in the field. It currently covers the most frequently used capabilities of the older Palm software through equivalent features, with plans to add more as needed. Beyond that, the application could easily be adapted to support data loggers other than the RT130, possibly even from other manufacturers. This would give users a more uniform interface regardless of data logger model, which could be an advantage during mixed-equipment campaigns. The same software could also be expanded to allow the direct input of various field notes, which could be downloaded together with automatically logged configuration data to ease the preparation of seismic metadata or the building of a project-central seismic database.
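
    The wireless setup the authors describe, a serial-to-WiFi dongle presenting the data logger's serial port over the network, typically appears to a client as a raw TCP socket. The Python sketch below shows that pattern in generic form; the host, port, and command bytes are placeholders, since the RT130 command protocol is proprietary and the dongle's actual network settings are not given in the record.

        import socket

        # Placeholder address of the wireless-to-serial bridge (assumed values).
        BRIDGE_HOST = "192.168.1.50"
        BRIDGE_PORT = 9100

        # Hypothetical framed command; NOT the real RT130 protocol.
        STATUS_REQUEST = b"\x02STATUS\x03"

        with socket.create_connection((BRIDGE_HOST, BRIDGE_PORT), timeout=5) as sock:
            sock.sendall(STATUS_REQUEST)   # bytes are relayed verbatim to the serial port
            reply = sock.recv(4096)        # whatever the logger sends back
            print("logger replied:", reply.hex())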

  18. Perceptions and Effects of Classroom Capture Software on Course Performance among Selected Online Community College Mathematics Students

    ERIC Educational Resources Information Center

    Smith, Rachel Naomi

    2017-01-01

    The purpose of this mixed-methods research study was two-fold. First, I compared the findings on the success rates of online mathematics students with the perceived effects of classroom capture software, in the hope of finding convergence. Second, I used multiple methods in different phases of the study to expand the breadth and range of the effects of…

  19. Adopting Industry Standards for Control Systems Within Advanced Life Support

    NASA Technical Reports Server (NTRS)

    Young, James Scott; Boulanger, Richard

    2002-01-01

    This paper describes the OPC (Object Linking and Embedding for Process Control) standards for process control and outlines the experience at JSC in using these standards to interface with I/O hardware from three independent vendors. The I/O hardware was integrated with a commercially available SCADA/HMI software package to form the control and monitoring system for the Environmental Systems Test Stand (ESTS). The OPC standards were used to communicate with the I/O hardware, while the SCADA/HMI software implemented monitoring, PC-based distributed control, and redundant data storage over an Ethernet physical layer using an embedded DIN-rail-mounted PC.
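
    The value of OPC that the paper highlights, namely a single client API in front of interchangeable vendor I/O servers, can be sketched with the open-source OpenOPC library for Python, shown below against the Matrikon OPC simulation server. This is an assumption-laden stand-in: it is not the SCADA/HMI package used at JSC, and the server and tag names belong to the simulator rather than to any ESTS hardware.

        import OpenOPC  # open-source OPC DA client (requires Windows/COM)

        opc = OpenOPC.client()
        opc.connect('Matrikon.OPC.Simulation')      # any compliant server works here

        value, quality, timestamp = opc.read('Random.Int4')   # monitoring: read a tag
        print(value, quality, timestamp)

        opc.write(('Bucket Brigade.Real8', 42.0))             # control: write a setpoint

        opc.close()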

  20. The Implementation of a Multi-Backend Database System (MDBS). Part I. Software Engineering Strategies and Efforts Towards a Prototype MDBS.

    DTIC Science & Technology

    1983-06-01

    for DEC PDP-11 systems. MAINSAIL was developed and is marketed with a set of integrated tools for program development. The syntax of the language is...stack, and to test for stack-full and stack-empty conditions. This technique is useful in enforcing data integrity and in controlling concurrent...and market MAINSAIL. The language is distinguished by its portability. The same compiler and runtime system, both written in MAINSAIL, are the basis
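
    The fragment alludes to a bounded stack with explicit stack-full and stack-empty tests used for data integrity and concurrency control. Since the MAINSAIL source is not reproduced in the record, the sketch below restates that technique in Python: a fixed-capacity stack whose checks run under a lock, the lock standing in for MDBS's concurrency-control machinery.

        import threading

        class BoundedStack:
            """Fixed-capacity stack with explicit full/empty integrity checks."""

            def __init__(self, capacity):
                self._items = []
                self._capacity = capacity
                self._lock = threading.Lock()   # serializes concurrent access

            def push(self, item):
                with self._lock:
                    if len(self._items) >= self._capacity:
                        raise OverflowError("stack-full")   # reject before corrupting state
                    self._items.append(item)

            def pop(self):
                with self._lock:
                    if not self._items:
                        raise IndexError("stack-empty")     # reject before underflow
                    return self._items.pop()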
