Sample records for software previously delivered

  1. 48 CFR 252.227-7028 - Technical data or computer software previously delivered to the government.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... software previously delivered to the government. 252.227-7028 Section 252.227-7028 Federal Acquisition... computer software previously delivered to the government. As prescribed in 227.7103-6(d), 227.7104(f)(2), or 227.7203-6(e), use the following provision: Technical Data or Computer Software Previously...

  2. 48 CFR 252.227-7028 - Technical data or computer software previously delivered to the government.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... software previously delivered to the government. 252.227-7028 Section 252.227-7028 Federal Acquisition... computer software previously delivered to the government. As prescribed in 227.7103-6(d), 227.7104(f)(2), or 227.7203-6(e), use the following provision: Technical Data or Computer Software Previously...

  3. 48 CFR 252.227-7028 - Technical data or computer software previously delivered to the government.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... software previously delivered to the government. 252.227-7028 Section 252.227-7028 Federal Acquisition... computer software previously delivered to the government. As prescribed in 227.7103-6(d), 227.7104(f)(2), or 227.7203-6(e), use the following provision: Technical Data or Computer Software Previously...

  4. 48 CFR 252.227-7028 - Technical data or computer software previously delivered to the government.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... software previously delivered to the government. 252.227-7028 Section 252.227-7028 Federal Acquisition... computer software previously delivered to the government. As prescribed in 227.7103-6(d), 227.7104(f)(2), or 227.7203-6(e), use the following provision: Technical Data or Computer Software Previously...

  5. 48 CFR 252.227-7028 - Technical data or computer software previously delivered to the government.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... software previously delivered to the government. 252.227-7028 Section 252.227-7028 Federal Acquisition... computer software previously delivered to the government. As prescribed in 227.7103-6(d), 227.7104(f)(2), or 227.7203-6(e), use the following provision: Technical Data or Computer Software Previously...

  6. 48 CFR 227.7103-6 - Contract clauses.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... expense). Do not use the clause when the only deliverable items are computer software or computer software... architect-engineer and construction contracts. (b)(1) Use the clause at 252.227-7013 with its Alternate I in... Software Previously Delivered to the Government, in solicitations when the resulting contract will require...

  7. 48 CFR 227.7103-6 - Contract clauses.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... expense). Do not use the clause when the only deliverable items are computer software or computer software... architect-engineer and construction contracts. (b)(1) Use the clause at 252.227-7013 with its Alternate I in... Software Previously Delivered to the Government, in solicitations when the resulting contract will require...

  8. Agile

    NASA Technical Reports Server (NTRS)

    Trimble, Jay Phillip

    2013-01-01

    This is based on a previous talk on agile development. Methods for delivering software on a short cycle are described, including interactions with the customer, the effect on the team, and how to be more effective, streamlined, and efficient.

  9. Software engineering and data management for automated payload experiment tool

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by UAH and provide versions of the software in a Macintosh and Windows compatible format.

  10. Software engineering and data management for automated payload experiment tool

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by University of Alabama in Huntsville (UAH) and provide versions of the software in a Macintosh and Windows compatible format. Appendix 1 science requirements document (SRD) Users Manual is attached.

  11. Delivering Software Process-Specific Project Courses in Tertiary Education Environment: Challenges and Solution

    ERIC Educational Resources Information Center

    Rong, Guoping; Shao, Dong

    2012-01-01

    The importance of delivering software process courses to software engineering students has been more and more recognized in China in recent years. However, students usually cannot fully appreciate the value of software process courses by only learning methodology and principle in the classroom. Therefore, a process-specific project course was…

  12. A software bus for thread objects

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Li, Dehuai

    1995-01-01

    The authors have implemented a software bus for lightweight threads in an object-oriented programming environment that allows for rapid reconfiguration and reuse of thread objects in discrete-event simulation experiments. While previous research in object-oriented, parallel programming environments has focused on direct communication between threads, our lightweight software bus, called the MiniBus, provides a means to isolate threads from their contexts of execution by restricting communications between threads to message-passing via their local ports only. The software bus maintains a topology of connections between these ports. It routes, queues, and delivers messages according to this topology. This approach allows for rapid reconfiguration and reuse of thread objects in other systems without making changes to the specifications or source code. A layered approach that provides the needed transparency to developers is presented. Examples of using the MiniBus are given, and the value of bus architectures in building and conducting simulations of discrete-event systems is discussed.
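
    The paragraph above describes the core idea behind the bus: thread objects never address each other directly, only their local ports, while the bus owns the connection topology and routes, queues, and delivers messages. A minimal Python sketch of that pattern follows; the class and method names are invented for illustration and are not the MiniBus interfaces.

        # Toy illustration of the bus idea described above: objects talk only through
        # local ports; the bus owns the topology and routes, queues, and delivers
        # messages. Names and interfaces are invented for this sketch, not the MiniBus API.
        import queue
        import threading

        class Port:
            """A local mailbox; the owning object never sees other objects directly."""
            def __init__(self):
                self.inbox = queue.Queue()

        class Bus:
            def __init__(self):
                self.topology = {}                 # output port -> list of input ports

            def connect(self, src: Port, dst: Port):
                self.topology.setdefault(src, []).append(dst)

            def send(self, src: Port, message):
                # Route according to the topology; delivery is just a queue put,
                # so receivers can be reconfigured without touching sender code.
                for dst in self.topology.get(src, []):
                    dst.inbox.put(message)

        def worker(name: str, port: Port):
            msg = port.inbox.get()
            print(f"{name} received: {msg}")

        bus = Bus()
        a_out, b_in = Port(), Port()
        bus.connect(a_out, b_in)

        t = threading.Thread(target=worker, args=("B", b_in))
        t.start()
        bus.send(a_out, "event: simulation step 1")
        t.join()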

  13. Media processors using a new microsystem architecture designed for the Internet era

    NASA Astrophysics Data System (ADS)

    Wyland, David C.

    1999-12-01

    The demands of digital image processing, communications and multimedia applications are growing more rapidly than traditional design methods can fulfill them. Previously, only custom hardware designs could provide the performance required to meet the demands of these applications. However, hardware design has reached a crisis point. Hardware design can no longer deliver a product with the required performance and cost in a reasonable time for a reasonable risk. Software based designs running on conventional processors can deliver working designs in a reasonable time and with low risk but cannot meet the performance requirements. What is needed is a media processing approach that combines very high performance, a simple programming model, complete programmability, short time to market and scalability. The Universal Micro System (UMS) is a solution to these problems. The UMS is a completely programmable (including I/O) system on a chip that combines hardware performance with the fast time to market, low cost and low risk of software designs.

  14. Volpe SuperFar V6.0 Software and Support Documentation; Letter Report V324-FB48B3-LR3

    DOT National Transportation Integrated Search

    2017-09-29

    This Letter Report serves to deliver the third external release version of the USDOT Volpe Centers SuperFAR Spectral Aircraft Noise Processing Software (Version 6.0). Earlier versions of the software were delivered to FAA in February 2015 and Marc...

  15. The simulation library of the Belle II software system

    NASA Astrophysics Data System (ADS)

    Kim, D. Y.; Ritter, M.; Bilka, T.; Bobrov, A.; Casarosa, G.; Chilikin, K.; Ferber, T.; Godang, R.; Jaegle, I.; Kandra, J.; Kodys, P.; Kuhr, T.; Kvasnicka, P.; Nakayama, H.; Piilonen, L.; Pulvermacher, C.; Santelj, L.; Schwenker, B.; Sibidanov, A.; Soloviev, Y.; Starič, M.; Uglov, T.

    2017-10-01

    SuperKEKB, the next generation B factory, has been constructed in Japan as an upgrade of KEKB. This brand new e+ e- collider is expected to deliver a very large data set for the Belle II experiment, 50 times larger than the previous Belle sample. Both the triggered physics event rate and the background event rate will increase by at least a factor of 10 over their previous values, creating a challenging data-taking environment for the Belle II detector. The software system of the Belle II experiment is designed to execute this ambitious plan. A full detector simulation library, which is part of the Belle II software system, has been created based on Geant4 and tested thoroughly. Recently the library was upgraded to Geant4 version 10.1. The library behaves as expected and is actively used to produce Monte Carlo data sets for various studies. In this paper, we explain the structure of the simulation library and the various interfaces to other packages, including geometry and beam background simulation.

  16. Multi-stage learning aids applied to hands-on software training.

    PubMed

    Rother, Kristian; Rother, Magdalena; Pleus, Alexandra; Upmeier zu Belzen, Annette

    2010-11-01

    Delivering hands-on tutorials on bioinformatics software and web applications is a challenging didactic scenario. The main reason is that trainees have heterogeneous backgrounds, different levels of previous knowledge, and vary in learning speed. In this article, we demonstrate how multi-stage learning aids can be used to allow all trainees to progress at a similar speed. In this technique, the trainees can use cards with hints and answers to guide themselves independently through a complex task. We have successfully conducted a tutorial for the molecular viewer PyMOL using two sets of learning aid cards. The trainees responded positively, were able to complete the task, and the trainer had spare time to respond to individual questions. This encourages us to conclude that multi-stage learning aids overcome many disadvantages of established forms of hands-on software training.

  17. ’Pushing a Big Rock Up a Steep Hill’: Acquisition Lessons Learned from DoD Applications Storefront

    DTIC Science & Technology

    2014-04-30

    software patches, web applications, widgets, and mobile application packages. The envisioned application store will deliver software from a central...automated delivery of software patches, web applications, widgets, and mobile application packages. The envisioned application store will deliver... mobile technologies, hoping to enhance warfighter situational awareness and access to information. Unfortunately, the Defense Acquisition System has not

  18. IEEE Computer Society/Software Engineering Institute Watts S. Humphrey Software Process Achievement (SPA) Award 2016: Nationwide

    DTIC Science & Technology

    2017-04-05

    Information Technology at Nationwide v Abstract vi 1 Business Imperatives 1 1.1 Deliver the Right Work 1 1.2 Deliver the Right Way 1 1.3 Deliver with...an Engaged Workforce 1 2 Challenges and Opportunities 2 2.1 Responding to Demand 2 2.2 Standards and Capabilities 2 2.3 Information Technology ...release and unlimited distribution. Information Technology at Nationwide Nationwide Information Technology (IT) is comprised of seven offices

  19. Integrated Power, Avionics, and Software (iPAS) Space Telecommunications Radio System (STRS) Radio User's Guide -- Advanced Exploration Systems (AES)

    NASA Technical Reports Server (NTRS)

    Roche, Rigoberto; Shalkhauser, Mary Jo Windmille

    2017-01-01

    The Integrated Power, Avionics and Software (IPAS) software defined radio (SDR) was implemented on the Reconfigurable, Intelligently-Adaptive Communication System (RAICS) platform, for radio development at NASA Johnson Space Center. Software and hardware description language (HDL) code were delivered by NASA Glenn Research Center for use in the IPAS test bed and for development of their own Space Telecommunications Radio System (STRS) waveforms on the RAICS platform. The purpose of this document is to describe how to setup and operate the IPAS STRS Radio platform with its delivered test waveform.

  20. Impact of Agile Software Development Model on Software Maintainability

    ERIC Educational Resources Information Center

    Gawali, Ajay R.

    2012-01-01

    Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burdens tightly budgeted information technology (IT) organizations. Agile software development approach delivers business value early, but implications on software maintainability are still unknown. The purpose of this quantitative study…

  1. 48 CFR 227.7203-6 - Contract clauses.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Software and Computer Software Documentation 227.7203-6 Contract clauses. (a)(1) Use the clause at 252.227-7014, Rights in Noncommercial Computer Software and Noncommercial Computer Software Documentation, in solicitations and contracts when the successful offeror(s) will be required to deliver computer software or...

  2. 48 CFR 227.7203-6 - Contract clauses.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Software and Computer Software Documentation 227.7203-6 Contract clauses. (a)(1) Use the clause at 252.227-7014, Rights in Noncommercial Computer Software and Noncommercial Computer Software Documentation, in solicitations and contracts when the successful offeror(s) will be required to deliver computer software or...

  3. 48 CFR 227.7203-6 - Contract clauses.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Software and Computer Software Documentation 227.7203-6 Contract clauses. (a)(1) Use the clause at 252.227-7014, Rights in Noncommercial Computer Software and Noncommercial Computer Software Documentation, in solicitations and contracts when the successful offeror(s) will be required to deliver computer software or...

  4. 48 CFR 227.7203-6 - Contract clauses.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Software and Computer Software Documentation 227.7203-6 Contract clauses. (a)(1) Use the clause at 252.227-7014, Rights in Noncommercial Computer Software and Noncommercial Computer Software Documentation, in solicitations and contracts when the successful offeror(s) will be required to deliver computer software or...

  5. 48 CFR 227.7203-6 - Contract clauses.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Software and Computer Software Documentation 227.7203-6 Contract clauses. (a)(1) Use the clause at 252.227-7014, Rights in Noncommercial Computer Software and Noncommercial Computer Software Documentation, in solicitations and contracts when the successful offeror(s) will be required to deliver computer software or...

  6. High-efficiency space-based software radio architectures & algorithms (a minimum size, weight, and power TeraOps processor)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dunham, Mark Edward; Baker, Zachary K; Stettler, Matthew W

    2009-01-01

    Los Alamos has recently completed the latest in a series of Reconfigurable Software Radios, which incorporates several key innovations in both hardware design and algorithms. Due to our focus on satellite applications, each design must extract the best size, weight, and power performance possible from the ensemble of Commodity Off-the-Shelf (COTS) parts available at the time of design. In this case we have achieved 1 TeraOps/second signal processing on a 1920 Megabit/second datastream, while using only 53 Watts mains power, 5.5 kg, and 3 liters. This processing capability enables very advanced algorithms, such as our wideband RF compression scheme, to operate remotely, allowing network bandwidth constrained applications to deliver previously unattainable performance.
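
    A quick back-of-the-envelope check of the processing density implied by the figures quoted above (the raw numbers come from the abstract; the derived ratios and rounding are ours):

        # Back-of-the-envelope check of the processing density quoted above.
        ops_per_second = 1e12           # 1 TeraOps/s, from the abstract
        datastream_bps = 1920e6         # 1920 Megabit/s, from the abstract
        power_watts = 53                # mains power, from the abstract

        ops_per_bit = ops_per_second / datastream_bps        # roughly 520 operations per bit
        gops_per_watt = ops_per_second / 1e9 / power_watts    # roughly 19 GOPS per watt

        print(f"{ops_per_bit:.0f} ops/bit, {gops_per_watt:.1f} GOPS/W")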

  7. Quality Assurance in Software Development: An Exploratory Investigation in Software Project Failures and Business Performance

    ERIC Educational Resources Information Center

    Ichu, Emmanuel A.

    2010-01-01

    Software quality is perhaps one of the most sought-after attributes in product development; however, this goal remains unattained. Problem factors in software development, and how these have affected the maintainability of the delivered software systems, require a thorough investigation. It was, therefore, very important to understand software…

  8. Shifter: Containers for HPC

    NASA Astrophysics Data System (ADS)

    Gerhardt, Lisa; Bhimji, Wahid; Canon, Shane; Fasel, Markus; Jacobsen, Doug; Mustafa, Mustafa; Porter, Jeff; Tsulaia, Vakho

    2017-10-01

    Bringing HEP computing to HPC can be difficult. Software stacks are often very complicated, with numerous dependencies that are difficult to install on an HPC system. To address this issue, NERSC has created Shifter, a framework that delivers Docker-like functionality to HPC. It works by extracting images from native formats and converting them to a common format that is optimally tuned for the HPC environment. We have used Shifter to deliver the CVMFS software stack for ALICE, ATLAS, and STAR on the supercomputers at NERSC. As well as enabling the distribution of multi-TB sized CVMFS stacks to HPC, this approach also offers performance advantages. Software startup times are significantly reduced and load times scale with minimal variation to 1000s of nodes. We profile several successful examples of scientists using Shifter to make scientific analysis easily customizable and scalable. We will describe the Shifter framework and several efforts in HEP and NP to use Shifter to deliver their software on the Cori HPC system.

  9. Perceived Effectiveness of Web Conferencing Software in the Digital Environment to Deliver a Graduate Course in Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Hudson, Tina M.; Knight, Victoria; Collins, Belva C.

    2012-01-01

    This article provides an overview of the planning and instructional delivery of a course in Applied Behavior Analysis using Adobe Connect Pro™. A description of software features used by course instructors is provided along with how each feature compares to resources found to deliver instruction in a traditional classroom setting. In addition, the…

  10. Animated software training via the internet: lessons learned

    NASA Technical Reports Server (NTRS)

    Scott, C. J.

    2000-01-01

    The Mission Execution and Automation Section, Information Technologies and Software Systems Division at the Jet Propulsion Laboratory, recently delivered an animated software training module for the TMOD UPLINK Consolidation Task for operator training at the Deep Space Network.

  11. 48 CFR 27.404-1 - Unlimited rights data.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... software). (b) Form, fit, and function data delivered under contract. (c) Data (except as may be included with restricted computer software) that constitute manuals or instructional and training material for... rights data or restricted computer software (see 27.404-2). ...

  12. 48 CFR 27.404-1 - Unlimited rights data.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... software). (b) Form, fit, and function data delivered under contract. (c) Data (except as may be included with restricted computer software) that constitute manuals or instructional and training material for... rights data or restricted computer software (see 27.404-2). ...

  13. 48 CFR 27.404-2 - Limited rights data and restricted computer software.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... restricted computer software. 27.404-2 Section 27.404-2 Federal Acquisition Regulations System FEDERAL... Copyrights 27.404-2 Limited rights data and restricted computer software. (a) General. The basic clause at 52... restricted computer software by withholding the data from the Government and instead delivering form, fit...

  14. 48 CFR 27.404-2 - Limited rights data and restricted computer software.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... restricted computer software. 27.404-2 Section 27.404-2 Federal Acquisition Regulations System FEDERAL... Copyrights 27.404-2 Limited rights data and restricted computer software. (a) General. The basic clause at 52... restricted computer software by withholding the data from the Government and instead delivering form, fit...

  15. 48 CFR 27.404-2 - Limited rights data and restricted computer software.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... restricted computer software. 27.404-2 Section 27.404-2 Federal Acquisition Regulations System FEDERAL... Copyrights 27.404-2 Limited rights data and restricted computer software. (a) General. The basic clause at 52... restricted computer software by withholding the data from the Government and instead delivering form, fit...

  16. 48 CFR 27.404-2 - Limited rights data and restricted computer software.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... restricted computer software. 27.404-2 Section 27.404-2 Federal Acquisition Regulations System FEDERAL... Copyrights 27.404-2 Limited rights data and restricted computer software. (a) General. The basic clause at 52... restricted computer software by withholding the data from the Government and instead delivering form, fit...

  17. 48 CFR 27.404-2 - Limited rights data and restricted computer software.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... restricted computer software. 27.404-2 Section 27.404-2 Federal Acquisition Regulations System FEDERAL... Copyrights 27.404-2 Limited rights data and restricted computer software. (a) General. The basic clause at 52... restricted computer software by withholding the data from the Government and instead delivering form, fit...

  18. 48 CFR 227.7103-7 - Use and non-disclosure agreement.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... recipient shall not, for example, enhance, decompile, disassemble, or reverse engineer the software; time... (b) of this subsection, technical data or computer software delivered to the Government with..., display, or disclose technical data subject to limited rights or computer software subject to restricted...

  19. 48 CFR 227.7103-7 - Use and non-disclosure agreement.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... recipient shall not, for example, enhance, decompile, disassemble, or reverse engineer the software; time... (b) of this subsection, technical data or computer software delivered to the Government with..., display, or disclose technical data subject to limited rights or computer software subject to restricted...

  20. 48 CFR 227.7103-7 - Use and non-disclosure agreement.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... recipient shall not, for example, enhance, decompile, disassemble, or reverse engineer the software; time... (b) of this subsection, technical data or computer software delivered to the Government with..., display, or disclose technical data subject to limited rights or computer software subject to restricted...

  1. 48 CFR 227.7103-7 - Use and non-disclosure agreement.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... recipient shall not, for example, enhance, decompile, disassemble, or reverse engineer the software; time... (b) of this subsection, technical data or computer software delivered to the Government with..., display, or disclose technical data subject to limited rights or computer software subject to restricted...

  2. Analyzing the costs to deliver medication therapy management services.

    PubMed

    Rupp, Michael T

    2011-01-01

    To provide pharmacy managers and consultant pharmacists with a step-by-step approach for analyzing the costs of delivering medication therapy management (MTM) services and to describe use of a free online software application for determining the costs of delivering MTM. The process described is applicable to community pharmacies and to consultant pharmacists who provide MTM services from nonpharmacy settings. The PharmAccount Service Cost Calculator is an Internet-based software application that uses a guided online interview to collect the information needed to conduct a comprehensive cost analysis of any specialized pharmacy service. In addition to direct variable and fixed costs, the software automatically allocates indirect and overhead costs to the service and generates an itemized report that details the components of service delivery costs. The service cost calculator is sufficiently flexible to support the analysis of virtually any specialized pharmacy service, irrespective of whether the service is being delivered from a physical pharmacy. The software application allows users to perform sensitivity analysis to quickly determine the potential impact that alternate scenarios would have on service delivery cost. It is therefore particularly well suited to assist in the design and planning of a new pharmacy service. Good management requires that the cost implications of service delivery decisions are known and considered. Analyzing the cost of an MTM service is an important step in developing a sustainable business model.
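
    The abstract describes a cost analysis that combines direct variable costs, direct fixed costs, and allocated indirect/overhead costs into a per-service figure. A minimal sketch of that kind of roll-up follows; it is not the PharmAccount tool, and the dollar figures and the labor-hours allocation rule are hypothetical assumptions.

        # Illustrative cost roll-up of the kind described above (not the PharmAccount tool).
        # The cost categories come from the abstract; the figures and the simple
        # labor-hours allocation rule are hypothetical assumptions for this sketch.

        def mtm_cost_per_encounter(direct_variable, direct_fixed_annual, indirect_annual,
                                   service_labor_hours, total_labor_hours, encounters_per_year):
            """Estimate the cost of delivering one MTM encounter."""
            # Allocate indirect/overhead cost to the service in proportion to labor hours.
            allocated_overhead = indirect_annual * (service_labor_hours / total_labor_hours)
            annual_service_cost = direct_fixed_annual + allocated_overhead
            return direct_variable + annual_service_cost / encounters_per_year

        cost = mtm_cost_per_encounter(direct_variable=18.50,       # pharmacist time, supplies
                                      direct_fixed_annual=4_000,   # software licenses, training
                                      indirect_annual=150_000,     # rent, utilities, admin
                                      service_labor_hours=300,
                                      total_labor_hours=6_000,
                                      encounters_per_year=500)
        print(f"Estimated cost per MTM encounter: ${cost:.2f}")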

  3. Situational Lightning Climatologies for Central Florida: Phase III

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III

    2008-01-01

    This report describes work done by the Applied Meteorology Unit (AMU) to add composite soundings to the Advanced Weather Interactive Processing System (AWIPS), which allows National Weather Service (NWS) forecasters to compare the current atmospheric state with climatology. In a previous phase, the AMU created composite soundings for four rawinsonde observation stations in Florida, for each of eight flow regimes. The composite soundings were delivered to the NWS Melbourne (MLB) office for display using the NSHARP software program. NWS MLB requested that the AMU make the composite soundings available for display in AWIPS. The AMU first created a procedure to customize AWIPS so composite soundings could be displayed. A unique four-character identifier was created for each of the 32 composite soundings. The AMU wrote a Tool Command Language/Toolkit (Tcl/Tk) software program to convert the composite soundings from NSHARP to Network Common Data Form (NetCDF) format. The NetCDF files were then displayable by AWIPS.
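
    As a rough illustration of the conversion target described above, the sketch below writes a small sounding to a NetCDF file. The AMU's converter was written in Tcl/Tk; this Python version only illustrates the file format, and the dimension, variable, and file names are assumptions rather than the actual AWIPS or NSHARP schema.

        # Minimal sketch of writing a composite sounding to NetCDF, the target format
        # described above. Variable, dimension, and file names are assumptions, not the
        # AWIPS/NSHARP schema; the AMU's real converter was written in Tcl/Tk.
        from netCDF4 import Dataset

        levels_hpa = [1000.0, 925.0, 850.0, 700.0, 500.0]   # hypothetical sounding levels
        temps_c    = [25.1,   21.3,  17.8,   9.2,  -7.5]    # hypothetical temperatures

        ds = Dataset("xmrse2_composite.nc", "w")             # made-up station/regime identifier
        ds.createDimension("level", len(levels_hpa))

        pres = ds.createVariable("pressure", "f4", ("level",))
        temp = ds.createVariable("temperature", "f4", ("level",))
        pres.units, temp.units = "hPa", "degC"

        pres[:] = levels_hpa
        temp[:] = temps_c
        ds.close()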

  4. 48 CFR 1852.227-19 - Commercial computer software-Restricted rights.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... software-Restricted rights. 1852.227-19 Section 1852.227-19 Federal Acquisition Regulations System NATIONAL... Provisions and Clauses 1852.227-19 Commercial computer software—Restricted rights. (a) As prescribed in 1827... regarding any computer software delivered under this contract/purchase order, the NASA Contracting Officer...

  5. 48 CFR 1852.227-19 - Commercial computer software-Restricted rights.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... software-Restricted rights. 1852.227-19 Section 1852.227-19 Federal Acquisition Regulations System NATIONAL... Provisions and Clauses 1852.227-19 Commercial computer software—Restricted rights. (a) As prescribed in 1827... regarding any computer software delivered under this contract/purchase order, the NASA Contracting Officer...

  6. 48 CFR 1852.227-19 - Commercial computer software-Restricted rights.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... software-Restricted rights. 1852.227-19 Section 1852.227-19 Federal Acquisition Regulations System NATIONAL... Provisions and Clauses 1852.227-19 Commercial computer software—Restricted rights. (a) As prescribed in 1827... regarding any computer software delivered under this contract/purchase order, the NASA Contracting Officer...

  7. 48 CFR 1852.227-19 - Commercial computer software-Restricted rights.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... software-Restricted rights. 1852.227-19 Section 1852.227-19 Federal Acquisition Regulations System NATIONAL... Provisions and Clauses 1852.227-19 Commercial computer software—Restricted rights. (a) As prescribed in 1827... regarding any computer software delivered under this contract/purchase order, the NASA Contracting Officer...

  8. 48 CFR 1852.227-19 - Commercial computer software-Restricted rights.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... software-Restricted rights. 1852.227-19 Section 1852.227-19 Federal Acquisition Regulations System NATIONAL... Provisions and Clauses 1852.227-19 Commercial computer software—Restricted rights. (a) As prescribed in 1827... regarding any computer software delivered under this contract/purchase order, the NASA Contracting Officer...

  9. 48 CFR 227.7103-7 - Use and non-disclosure agreement.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... release, disclosure, or authorized use of technical data or computer software subject to special license... (b) of this subsection, technical data or computer software delivered to the Government with..., display, or disclose technical data subject to limited rights or computer software subject to restricted...

  10. MaROS: Information Management Service

    NASA Technical Reports Server (NTRS)

    Allard, Daniel A.; Gladden, Roy E.; Wright, Jesse J.; Hy, Franklin H.; Rabideau, Gregg R.; Wallick, Michael N.

    2011-01-01

    This software is provided by the Mars Relay Operations Service (MaROS) task to a variety of Mars projects for the purpose of coordinating communications sessions between landed spacecraft assets and orbiting spacecraft assets at Mars. The Information Management Service centralizes a set of functions previously distributed across multiple spacecraft operations teams, and as such, greatly improves visibility into the end-to-end strategic coordination process. Most of the process revolves around the scheduling of communications sessions between the spacecraft during periods of time when a landed asset on Mars is geometrically visible by an orbiting spacecraft. These relay sessions are used to transfer data both to and from the landed asset via the orbiting asset on behalf of Earth-based spacecraft operators. This software component is an application process running as a Java virtual machine. The component provides all service interfaces via a Representational State Transfer (REST) protocol over https to external clients. There are two general interaction modes with the service: upload and download of data. For data upload, the service must execute logic specific to the upload data type and trigger any applicable calculations including pass delivery latencies and overflight conflicts. For data download, the software must retrieve and correlate requested information and deliver to the requesting client. The provision of this service enables several key advancements over legacy processes and systems. For one, this service represents the first time that end-to-end relay information is correlated into a single shared repository. The software also provides the first multimission latency calculator; previous latency calculations had been performed on a mission-by-mission basis.
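
    The abstract describes a REST-over-https service with two interaction modes, upload and download of relay data. A minimal client sketch in that style follows; the host, endpoint paths, payload fields, and query parameters are hypothetical placeholders, not the actual MaROS interface.

        # Minimal REST client sketch for an upload/download service like the one
        # described above. Endpoints, fields, and host are hypothetical placeholders,
        # not the real MaROS API.
        import requests

        BASE = "https://example.invalid/maros/api"   # placeholder host
        session = requests.Session()
        session.verify = True                        # the service is described as https-only

        # Upload mode: post a relay-pass record; the server applies type-specific logic
        # (e.g., latency calculations, overflight-conflict checks) on receipt.
        pass_record = {"orbiter": "MRO", "lander": "MSL", "start_utc": "2011-01-01T00:00:00Z"}
        resp = session.post(f"{BASE}/passes", json=pass_record, timeout=30)
        resp.raise_for_status()

        # Download mode: retrieve correlated information for the requesting client.
        resp = session.get(f"{BASE}/passes", params={"lander": "MSL"}, timeout=30)
        resp.raise_for_status()
        for rec in resp.json():
            print(rec)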

  11. Design Your Own Instructional Software: It's Easy.

    ERIC Educational Resources Information Center

    Pauline, Ronald F.

    Computer Assisted Instruction (CAI) is, quite simply, an instance in which instructional content and activities are delivered via a computer. Many commercially available software programs, although excellent, may not be acceptable for each individual teacher's classroom. One way to ensure that software is not only acceptable but also targets…

  12. Software-as-a-Service Vendors: Are They Ready to Successfully Deliver?

    NASA Astrophysics Data System (ADS)

    Heart, Tsipi; Tsur, Noa Shamir; Pliskin, Nava

    Software as a service (SaaS) is a software sourcing option that allows organizations to remotely access enterprise applications without having to install the application in-house. In this work we study vendors' readiness to deliver SaaS, a topic scarcely studied before. The innovation classification (evolutionary vs. revolutionary) and a new Seven Fundamental Organizational Capabilities (FOCs) model are used as the theoretical frameworks. The Seven FOCs model suggests a generic yet comprehensive set of capabilities required for organizational success: 1) sensing the stakeholders, 2) sensing the business environment, 3) sensing the knowledge environment, 4) process control, 5) process improvement, 6) new process development, and 7) appropriate resolution.

  13. Proposal for a New ’Rights in Software’ Clause for Software Acquisitions by the Department of Defense.

    DTIC Science & Technology

    1986-09-01

    point here is that the capital cost of design and development (including the cost of software tools and/or CAD/CAM programs which aided in the development...and capitalization, software is in many ways more like a hardware component than it is like the technical documentation which supports the hardware...invoked, the owner of intellectual property rights in software may attach appropriate copyright notices to software delivered under this contract. 2.2.2

  14. From Product- to Service-Oriented Strategies in the Enterprise Software Market

    ERIC Educational Resources Information Center

    Xin, Mingdi

    2009-01-01

    The enterprise software market is seeing the rise of a new business model--selling Software-as-a-Service (SaaS), in which a standard piece of software is owned and managed remotely by the vendor and delivered as a service over the Internet. Despite the hype, questions remain regarding the rise of this new service model and how it would impact the…

  15. Pathways to Lean Software Development: An Analysis of Effective Methods of Change

    ERIC Educational Resources Information Center

    Hanson, Richard D.

    2014-01-01

    This qualitative Delphi study explored the challenges that exist in delivering software on time, within budget, and with the original scope identified. The literature review identified many attempts over the past several decades to reform the methods used to develop software. These attempts found that the classical waterfall method, which is…

  16. Developing of an automation for therapy dosimetry systems by using labview software

    NASA Astrophysics Data System (ADS)

    Aydin, Selim; Kam, Erol

    2018-06-01

    Traceability, accuracy, and consistency of radiation measurements are essential in radiation dosimetry, particularly in radiotherapy, where the outcome of treatments is highly dependent on the radiation dose delivered to patients. It is therefore very important to provide reliable, accurate, and fast calibration services for therapy dosimeters, since the radiation dose delivered to a radiotherapy patient is directly related to the accuracy and reliability of these devices. In this study, we report the performance of in-house developed, computer-controlled data acquisition and monitoring software for commercially available radiation therapy electrometers. The LabVIEW® software suite is used to provide reliable, fast, and accurate calibration services. The software also collects environmental data such as temperature, pressure, and humidity in order to use them in correction factor calculations. By using this software tool, better control over the calibration process is achieved and the need for human intervention is reduced. This is the first software that can control dosimeter systems frequently used in the radiation therapy field at hospitals, such as Unidos Webline, Unidos E, Dose-1, and PC Electrometers.
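
    One example of the correction-factor calculations mentioned above is the standard temperature-pressure (air density) correction applied to readings from vented ion chambers. The sketch below assumes the common 20 °C / 101.325 kPa reference conditions; the protocol actually used by a given calibration laboratory may differ.

        # Sketch of the standard temperature-pressure (air density) correction used
        # with vented ion-chamber dosimeters, the kind of calculation the environmental
        # data above would feed. Reference conditions assume the common 20 degC /
        # 101.325 kPa convention; check the protocol in use before relying on these values.

        T_REF_C = 20.0        # assumed reference temperature, degrees Celsius
        P_REF_KPA = 101.325   # assumed reference pressure, kPa

        def k_tp(temperature_c: float, pressure_kpa: float) -> float:
            """Air-density correction factor k_TP for an open (vented) ion chamber."""
            return ((273.2 + temperature_c) / (273.2 + T_REF_C)) * (P_REF_KPA / pressure_kpa)

        # Example: a warm, low-pressure treatment room increases the correction.
        print(f"k_TP = {k_tp(24.0, 99.0):.4f}")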

  17. Software is a Product...Not

    DTIC Science & Technology

    1992-09-01

    understand the process if we consider software as a service, not a product. Let me expand on this statement. I do not believe we must do any of the... software-building activities differently. Instead, from the perspective of scheduling, budgeting, and delivering software, we should use the service... While we're not perfect, we do a fairly good job of managing hardware...

  18. A Quantitative Inquiry into Software Developers' Intentions to Use Agile Scrum Method

    ERIC Educational Resources Information Center

    Huq, M. Shamsul

    2017-01-01

    In recent years, organizations have shown increasing willingness to adopt the agile scrum method (ASM) to meet the demands of modern-day software development; that is, to deliver faster and better software, with built-in flexibility to absorb last-minute changes in requirements. This research study was undertaken to uncover the underlying factors that…

  19. Non-Classroom Use of "Presentation Software" in Accelerated Classes: Student Use and Perceptions of Value

    ERIC Educational Resources Information Center

    Davies, Thomas; Korte, Leon; Cornelsen, Erin

    2016-01-01

    Numerous articles found in education literature discuss the advantages and disadvantages of using "presentation" software to deliver critical course content to students. Frequently the perceived value of the use of software such as PowerPoint is dependent upon how it is used, for instance, the extent to which bells and whistles are…

  20. Towards a New Paradigm of Software Development: an Ambassador Driven Process in Distributed Software Companies

    NASA Astrophysics Data System (ADS)

    Kumlander, Deniss

    The globalization of companies' operations and the competition between software vendors demand improved quality of delivered software and a lower overall cost. At the same time, this introduces many problems into the software development process, as it produces distributed organizations that break the co-location rule of modern software development methodologies. Here we propose a reformulation of the ambassador position, increasing its productivity in order to bridge the communication and workflow gap by managing the entire communication process rather than concentrating purely on the communication result.

  1. The Role of Organizational Sub-Cultures in Higher Education Adoption of Open Source Software (OSS) for Teaching/Learning

    ERIC Educational Resources Information Center

    Williams van Rooij, Shahron

    2010-01-01

    This paper contrasts the arguments offered in the literature advocating the adoption of open source software (OSS)--software delivered with its source code--for teaching and learning applications, with the reality of limited enterprise-wide deployment of those applications in U.S. higher education. Drawing on the fields of organizational…

  2. Advancements in Curriculum and Assessment by the Use of IMMEX Technology in the Organic Laboratory

    ERIC Educational Resources Information Center

    Cox, Charles T., Jr.; Cooper, Melanie M.; Pease, Rebecca; Buchanan, Krystal; Hernandez-Cruz, Laura; Stevens, Ron; Picione, John; Holme, Thomas

    2008-01-01

    The use of web-based software and course management systems for the delivery of online assessments in the chemistry classroom is becoming more common. IMMEX software, like other web-based software, can be used for delivering assessments and providing feedback, but differs in that it offers additional features designed to give insights and promote…

  3. Integrating External Software into SMART Board™ Calculus Lessons

    ERIC Educational Resources Information Center

    Wolmer, Allen; Khazanov, Leonid

    2011-01-01

    Interactive Whiteboards (IWBs) are becoming commonplace throughout primary, secondary, and postsecondary classrooms. However, the focus of the associated lesson creation & management software tools delivered with IWBs has been the primary grades, while secondary and postsecondary mathematics lessons have requirements beyond what is delivered…

  4. Lecturing with a Virtual Whiteboard

    NASA Astrophysics Data System (ADS)

    Milanovic, Zoran

    2006-09-01

    Recent advances in computer technology, word processing software, and projection systems have made traditional whiteboard lecturing obsolete. Tablet personal computers connected to display projectors and running handwriting software have replaced the marker-on-whiteboard method of delivering a lecture. Since the notes can be saved into an electronic file, they can be uploaded to a class website to be perused by the students later. This paper will describe the author's experiences in using this new technology to deliver physics lectures at an engineering school. The benefits and problems discovered will be reviewed and results from a survey of student opinions will be discussed.

  5. Delivering Alert Messages to Members of a Work Force

    NASA Technical Reports Server (NTRS)

    Loftis, Julia; Nickens, Stephanie; Pell, Melissa; Pell, Vince

    2008-01-01

    Global Alert Resolution Network (GARNET) is a software system for delivering emergency alerts as well as less-urgent messages to members of the Goddard Space Flight Center work force via an intranet or the Internet, and can be adapted to similar use in other large organizations.

  6. Elearn: A Collaborative Educational Virtual Environment.

    ERIC Educational Resources Information Center

    Michailidou, Anna; Economides, Anastasios A.

    Virtual Learning Environments (VLEs) that support collaboration are one of the new technologies that have attracted great interest. VLEs are learning management software systems composed of computer-mediated communication software and online methods of delivering course material. This paper presents ELearn, a collaborative VLE for teaching…

  7. HR, Streamlined

    ERIC Educational Resources Information Center

    Ramaswami, Rama

    2008-01-01

    Human Resources (HR) administrators are finding that as software modules are installed to automate various processes, they have more time to focus on strategic objectives. And as compliance with affirmative action and other employment regulations comes under increasing scrutiny, HR staffers are finding that software can deliver and track data with…

  8. 48 CFR 1852.227-86 - Commercial computer software-Licensing.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... software-Licensing. 1852.227-86 Section 1852.227-86 Federal Acquisition Regulations System NATIONAL... Provisions and Clauses 1852.227-86 Commercial computer software—Licensing. As prescribed in 1827.409-70, insert the following clause: Commercial Computer Software—Licensing (DEC 1987) (a) Any delivered...

  9. 48 CFR 1852.227-86 - Commercial computer software-Licensing.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... software-Licensing. 1852.227-86 Section 1852.227-86 Federal Acquisition Regulations System NATIONAL... Provisions and Clauses 1852.227-86 Commercial computer software—Licensing. As prescribed in 1827.409-70, insert the following clause: Commercial Computer Software—Licensing (DEC 1987) (a) Any delivered...

  10. 48 CFR 1852.227-86 - Commercial computer software-Licensing.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... software-Licensing. 1852.227-86 Section 1852.227-86 Federal Acquisition Regulations System NATIONAL... Provisions and Clauses 1852.227-86 Commercial computer software—Licensing. As prescribed in 1827.409-70, insert the following clause: Commercial Computer Software—Licensing (DEC 1987) (a) Any delivered...

  11. 48 CFR 1852.227-86 - Commercial computer software-Licensing.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... software-Licensing. 1852.227-86 Section 1852.227-86 Federal Acquisition Regulations System NATIONAL... Provisions and Clauses 1852.227-86 Commercial computer software—Licensing. As prescribed in 1827.409-70, insert the following clause: Commercial Computer Software—Licensing (DEC 1987) (a) Any delivered...

  12. 48 CFR 1852.227-86 - Commercial computer software-Licensing.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... software-Licensing. 1852.227-86 Section 1852.227-86 Federal Acquisition Regulations System NATIONAL... Provisions and Clauses 1852.227-86 Commercial computer software—Licensing. As prescribed in 1827.409-70, insert the following clause: Commercial Computer Software—Licensing (DEC 1987) (a) Any delivered...

  13. User-centered design of multi-gene sequencing panel reports for clinicians.

    PubMed

    Cutting, Elizabeth; Banchero, Meghan; Beitelshees, Amber L; Cimino, James J; Fiol, Guilherme Del; Gurses, Ayse P; Hoffman, Mark A; Jeng, Linda Jo Bone; Kawamoto, Kensaku; Kelemen, Mark; Pincus, Harold Alan; Shuldiner, Alan R; Williams, Marc S; Pollin, Toni I; Overby, Casey Lynnette

    2016-10-01

    The objective of this study was to develop a high-fidelity prototype for delivering multi-gene sequencing panel (GS) reports to clinicians that simulates the user experience of a final application. The delivery and use of GS reports can occur within complex and high-paced healthcare environments. We employ a user-centered software design approach in a focus group setting in order to facilitate gathering rich contextual information from a diverse group of stakeholders potentially impacted by the delivery of GS reports relevant to two precision medicine programs at the University of Maryland Medical Center. Responses from focus group sessions were transcribed, coded and analyzed by two team members. Notification mechanisms and information resources preferred by participants from our first phase of focus groups were incorporated into scenarios and the design of a software prototype for delivering GS reports. The goal of our second phase of focus groups, to gain input on the prototype software design, was accomplished through conducting task walkthroughs with GS reporting scenarios. Preferences for notification, content and consultation from genetics specialists appeared to depend upon familiarity with scenarios for ordering and delivering GS reports. Despite familiarity with some aspects of the scenarios we proposed, many of our participants agreed that they would likely seek consultation from a genetics specialist after viewing the test reports. In addition, participants offered design and content recommendations. Findings illustrated a need to support customized notification approaches, user-specific information, and access to genetics specialists with GS reports. These design principles can be incorporated into software applications that deliver GS reports. Our user-centered approach to conducting this assessment and the specific input we received from clinicians may also be relevant to others working on similar projects. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Group Projects and the Computer Science Curriculum

    ERIC Educational Resources Information Center

    Joy, Mike

    2005-01-01

    Group projects in computer science are normally delivered with reference to good software engineering practice. The discipline of software engineering is rapidly evolving, and the application of the latest 'agile techniques' to group projects causes a potential conflict with constraints imposed by regulating bodies on the computer science…

  15. A case study : Chart II software upgrade, using a design competition to procure ITS software

    DOT National Transportation Integrated Search

    2000-03-01

    This is one of a series of case studies that examine procurement approaches used to deliver Intelligent Transportation System (ITS) projects. The purpose of these reports is to provide examples of successful strategies that have been used to overcome...

  16. Baby Boy Jones Interactive Case-Based Learning Activity: A Web-Delivered Teaching Strategy.

    PubMed

    Cleveland, Lisa M; Carmona, Elenice Valentim; Paper, Bruce; Solis, Linda; Taylor, Bonnie

    2015-01-01

    Faced with limited resources, nurse educators are challenged with transforming nursing education while preparing enough qualified nurses to meet future demand; therefore, innovative approaches to teaching are needed. In this article, we describe the development of an innovative teaching activity. Baby Boy Jones is a Web-delivered, case-based learning activity focused on neonatal infection. It was created using e-learning authoring software and delivered through a learning management system.

  17. Cross Sectional Study of Agile Software Development Methods and Project Performance

    ERIC Educational Resources Information Center

    Lambert, Tracy

    2011-01-01

    Agile software development methods, characterized by delivering customer value via incremental and iterative time-boxed development processes, have moved into the mainstream of the Information Technology (IT) industry. However, despite a growing body of research which suggests that a predictive manufacturing approach, with big up-front…

  18. Courseware Authoring and Delivering System for Chinese Language Instruction. Final Report.

    ERIC Educational Resources Information Center

    Mao, Tang

    A study investigated technical methods for simplifying and improving the creation of software for teaching uncommonly taught languages such as Chinese. Research consisted of assessment of existing authoring systems, domestic and overseas, available hardware, peripherals, and software packages that could be integrated into this project. Then some…

  19. 48 CFR 252.227-7017 - Identification and assertion of use, release, or disclosure restrictions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... and Computer Software—Small Business Innovation Research (SBIR) Program clause. (2) If a successful offeror will not be required to deliver technical data, the Rights in Noncommercial Computer Software and Noncommercial Computer Software Documentation clause, or, if this solicitation contemplates a contract under the...

  20. 48 CFR 252.227-7017 - Identification and assertion of use, release, or disclosure restrictions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... and Computer Software—Small Business Innovation Research (SBIR) Program clause. (2) If a successful offeror will not be required to deliver technical data, the Rights in Noncommercial Computer Software and Noncommercial Computer Software Documentation clause, or, if this solicitation contemplates a contract under the...

  1. 48 CFR 252.227-7017 - Identification and assertion of use, release, or disclosure restrictions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... and Computer Software—Small Business Innovative Research (SBIR) Program clause. (2) If a successful offeror will not be required to deliver technical data, the Rights in Noncommercial Computer Software and Noncommercial Computer Software Documentation clause, or, if this solicitation contemplates a contract under the...

  2. 48 CFR 252.227-7017 - Identification and assertion of use, release, or disclosure restrictions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... and Computer Software—Small Business Innovation Research (SBIR) Program clause. (2) If a successful offeror will not be required to deliver technical data, the Rights in Noncommercial Computer Software and Noncommercial Computer Software Documentation clause, or, if this solicitation contemplates a contract under the...

  3. 48 CFR 252.227-7017 - Identification and assertion of use, release, or disclosure restrictions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... and Computer Software—Small Business Innovation Research (SBIR) Program clause. (2) If a successful offeror will not be required to deliver technical data, the Rights in Noncommercial Computer Software and Noncommercial Computer Software Documentation clause, or, if this solicitation contemplates a contract under the...

  4. PLATO[R] Achieve Now. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2010

    2010-01-01

    "PLATO[R] Achieve Now" is a software-based curriculum for the elementary and middle school grades. Instructional content is delivered via the PlayStation Portable (PSP[R]) system, allowing students to access learning materials in various settings. Software-based assessments are used to customize individual instruction, allowing students…

  5. Discovering and Mitigating Software Vulnerabilities through Large-Scale Collaboration

    ERIC Educational Resources Information Center

    Zhao, Mingyi

    2016-01-01

    In today's rapidly digitizing society, people place their trust in a wide range of digital services and systems that deliver latest news, process financial transactions, store sensitive information, etc. However, this trust does not have a solid foundation, because software code that supports this digital world has security vulnerabilities. These…

  6. Designing Online Tutor Training for Language Courses: A Case Study

    ERIC Educational Resources Information Center

    Stickler, Ursula; Hampel, Regine

    2007-01-01

    In 2003-04 the Open University offered its first German beginners' course with a dual tuition strand: tutorials were delivered either face-to-face or online using synchronous, audio-graphic, Internet-based conferencing software. For the new online tutors, a special training programme was designed and delivered. We evaluated the benefits of our…

  7. The Object Formerly Known as the Textbook

    ERIC Educational Resources Information Center

    Young, Jeffrey R.

    2013-01-01

    Textbook publishers argue that their newest digital products should not even be called "textbooks." They are really software programs built to deliver a mix of text, videos, and homework assignments. But delivering them is just the beginning. No old-school textbook was able to be customized for each student in the classroom. The books never graded…

  8. Design and implementation of an online systemic human anatomy course with laboratory.

    PubMed

    Attardi, Stefanie M; Rogers, Kem A

    2015-01-01

    Systemic Human Anatomy is a full credit, upper year undergraduate course with a (prosection) laboratory component at Western University Canada. To meet enrollment demands beyond the physical space of the laboratory facility, a fully online section was developed to run concurrently with the traditional face to face (F2F) course. Lectures given to F2F students are simultaneously broadcasted to online students using collaborative software (Blackboard Collaborate). The same collaborative software is used by a teaching assistant to deliver laboratory demonstrations in which three-dimensional (3D) virtual anatomical models are manipulated. Ten commercial software programs were reviewed to determine their suitability for demonstrating the virtual models, resulting in the selection of Netter's 3D Interactive Anatomy. Supplementary online materials for the central nervous system were developed by creating 360° images of plastinated prosected brain specimens and a website through which they could be accessed. This is the first description of a fully online undergraduate anatomy course with a live, interactive laboratory component. Preliminary data comparing the online and F2F student grades suggest that previous student academic performance, and not course delivery format, predicts performance in anatomy. Future qualitative studies will reveal student perceptions about their learning experiences in both of the course delivery formats. © 2014 American Association of Anatomists.

  9. Poor drug distribution as a possible explanation for the results of the PRECISE trial.

    PubMed

    Sampson, John H; Archer, Gary; Pedain, Christoph; Wembacher-Schröder, Eva; Westphal, Manfred; Kunwar, Sandeep; Vogelbaum, Michael A; Coan, April; Herndon, James E; Raghavan, Raghu; Brady, Martin L; Reardon, David A; Friedman, Allan H; Friedman, Henry S; Rodríguez-Ponce, M Inmaculada; Chang, Susan M; Mittermeyer, Stephan; Croteau, David; Puri, Raj K

    2010-08-01

    Convection-enhanced delivery (CED) is a novel intracerebral drug delivery technique with considerable promise for delivering therapeutic agents throughout the CNS. Despite this promise, Phase III clinical trials employing CED have failed to meet clinical end points. Although this may be due to inactive agents or a failure to rigorously validate drug targets, the authors have previously demonstrated that catheter positioning plays a major role in drug distribution using this technique. The purpose of the present work was to retrospectively analyze the expected drug distribution based on catheter positioning data available from the CED arm of the PRECISE trial. Data on catheter positioning from all patients randomized to the CED arm of the PRECISE trial were available for analyses. BrainLAB iPlan Flow software was used to estimate the expected drug distribution. Only 49.8% of catheters met all positioning criteria. Still, catheter positioning score (hazard ratio 0.93, p = 0.043) and the number of optimally positioned catheters (hazard ratio 0.72, p = 0.038) had a significant effect on progression-free survival. Estimated coverage of relevant target volumes was low, however, with only 20.1% of the 2-cm penumbra surrounding the resection cavity covered on average. Although tumor location and resection cavity volume had no effect on coverage volume, estimations of drug delivery to relevant target volumes did correlate well with catheter score (p < 0.003), and optimally positioned catheters had larger coverage volumes (p < 0.002). Only overall survival (p = 0.006) was higher for investigators considered experienced after adjusting for patient age and Karnofsky Performance Scale score. The potential efficacy of drugs delivered by CED may be severely constrained by ineffective delivery in many patients. Routine use of software algorithms and alternative catheter designs and infusion parameters may improve the efficacy of drugs delivered by CED.

  10. Computer Simulation Modeling: A Method for Predicting the Utilities of Alternative Computer-Aided Threat Evaluation Algorithms

    DTIC Science & Technology

    1990-09-01

    1988). Current versions of the ADATS have CATE systems installed, but the software is still under development by the radar manufacturer, Contraves Italiana, a subcontractor to Martin Marietta (USA). Contraves Italiana will deliver the final version of the software to Martin Marietta in 1991. Until then

  11. Delivering Savings with Open Architecture and Product Lines

    DTIC Science & Technology

    2011-04-30

    p.m. Chair: Christopher Deegan, Executive Director, Program Executive Office for Integrated Warfare Systems. Delivering Savings with Open...Architectures, Walt Scacchi and Thomas Alspaugh, Institute for Software Research. Christopher Deegan, Executive Director, Program Executive Officer...Integrated Warfare Systems (PEO IWS). Mr. Deegan directs the development, acquisition, and fleet support of 150 combat weapon system programs managed by 350

  12. Computer-Assisted, Counselor-Delivered Smoking Cessation Counseling for Community College Students: Intervention Approach and Sample Characteristics

    ERIC Educational Resources Information Center

    Prokhorov, Alexander V.; Fouladi, Rachel T.; de Moor, Carl; Warneke, Carla L.; Luca, Mario; Jones, Mary Mullin; Rosenblum, Carol; Emmons, Karen M.; Hudmon, Karen Suchanek; Yost, Tracey E.; Gritz, Ellen R.

    2007-01-01

    This report presents the experimental approach and baseline findings from "Look at Your Health," an ongoing study to develop and evaluate a computer-assisted, counselor-delivered smoking cessation program for community college students. It describes the expert system software program used for data collection and for provision of tailored feedback,…

  13. Organizational management practices for achieving software process improvement

    NASA Technical Reports Server (NTRS)

    Kandt, Ronald Kirk

    2004-01-01

    The crisis in developing software has been known for over thirty years. Problems that existed in developing software in the early days of computing still exist today. These problems include the delivery of low-quality products, actual development costs that exceed expected development costs, and actual development time that exceeds expected development time. Several solutions have been offered to overcome our inability to deliver high-quality software on time and within budget. One of these solutions involves software process improvement. However, such efforts often fail because of organizational management issues. This paper discusses business practices that organizations should follow to improve their chances of initiating and sustaining successful software process improvement efforts.

  14. Configuration management and software measurement in the Ground Systems Development Environment (GSDE)

    NASA Technical Reports Server (NTRS)

    Church, Victor E.; Long, D.; Hartenstein, Ray; Perez-Davila, Alfredo

    1992-01-01

    A set of functional requirements for software configuration management (CM) and metrics reporting for Space Station Freedom ground systems software are described. This report is one of a series from a study of the interfaces among the Ground Systems Development Environment (GSDE), the development systems for the Space Station Training Facility (SSTF) and the Space Station Control Center (SSCC), and the target systems for SSCC and SSTF. The focus is on the CM of the software following delivery to NASA and on the software metrics that relate to the quality and maintainability of the delivered software. The CM and metrics requirements address specific problems that occur in large-scale software development. Mechanisms to assist in the continuing improvement of mission operations software development are described.

  15. Clinical experience with Mobius FX software for 3D dose verification for prostate VMAT plans and comparison with physical measurements

    NASA Astrophysics Data System (ADS)

    Vazquez-Quino, L. A.; Huerta-Hernandez, C. I.; Rangaraj, D.

    2017-05-01

    MobiusFX, an add-on software module from Mobius Medical Systems for IMRT and VMAT QA, uses measurements in linac treatment logs to calculate and verify the 3D dose delivered to patients. In this study, 10 volumetric-modulated arc therapy (VMAT) prostate plans were planned and delivered on a Varian TrueBeam linac. The plans consisted of beams with 6 and 10 MV energy and 2 to 3 arcs per plan. The average gamma agreement between MobiusFX and the TPS was 99.96% using 3%/3 mm criteria and 98.70% using 2%/2 mm criteria. Further comparison with ArcCheck measurements was conducted.
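
    The 3%/3 mm and 2%/2 mm figures are gamma-index criteria combining dose difference and distance-to-agreement. As a rough, hedged illustration of what such a comparison computes, the sketch below implements a simplified one-dimensional global-gamma passing rate in Python; it is not the MobiusFX or TPS algorithm, which operates on full 3D dose grids with interpolation.

```python
# Simplified 1-D global gamma passing rate (illustrative only, not MobiusFX/TPS code).
import numpy as np

def gamma_pass_rate(dose_ref, dose_eval, spacing_mm, dose_crit=0.03, dist_crit_mm=3.0):
    """Return the fraction of reference points with gamma <= 1."""
    norm = dose_crit * dose_ref.max()                  # global dose normalization
    positions = np.arange(len(dose_ref)) * spacing_mm  # spatial coordinate of each sample
    gammas = []
    for i, d_ref in enumerate(dose_ref):
        dose_term = (dose_eval - d_ref) / norm                 # dose-difference term
        dist_term = (positions - positions[i]) / dist_crit_mm  # distance-to-agreement term
        gammas.append(np.min(np.sqrt(dose_term**2 + dist_term**2)))
    return float(np.mean(np.array(gammas) <= 1.0))

# Example: two nearly identical 1-D profiles sampled every 1 mm; all points should pass.
ref = np.array([10.0, 50.0, 100.0, 50.0, 10.0])
ev = np.array([10.5, 49.0, 99.0, 51.0, 10.0])
print(gamma_pass_rate(ref, ev, spacing_mm=1.0))
```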

  16. The Importance of Architecture in DoD Software

    DTIC Science & Technology

    1991-07-01

    Dr. Barry M. Horowitz, July 1991. ...resource utilization: architecture determines how the system sustains operations when parts of the system fail. The architecture also determines...software maintainers, to ensure that we deliver to them whatever is necessary for them to sustain and use the architecture.

  17. CASL Dakota Capabilities Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Simmons, Chris; Williams, Brian J.

    2017-10-10

    The Dakota software project serves the mission of Sandia National Laboratories and supports a worldwide user community by delivering state-of-the-art research and robust, usable software for optimization and uncertainty quantification. These capabilities enable advanced exploration and risk-informed prediction with a wide range of computational science and engineering models. Dakota is the verification and validation (V&V) / uncertainty quantification (UQ) software delivery vehicle for CASL, allowing analysts across focus areas to apply these capabilities to myriad nuclear engineering analyses.

  18. Designing, Developing and Implementing a Software Tool for Scenario Based Learning

    ERIC Educational Resources Information Center

    Norton, Geoff; Taylor, Mathew; Stewart, Terry; Blackburn, Greg; Jinks, Audrey; Razdar, Bahareh; Holmes, Paul; Marastoni, Enrique

    2012-01-01

    The pedagogical value of problem-based and inquiry-based learning activities has led to increased use of this approach in many courses. While scenarios or case studies were initially presented to learners as text-based material, the development of modern software technology provides the opportunity to deliver scenarios as e-learning modules,…

  19. Code Pulse: Software Assurance (SWA) Visual Analytics for Dynamic Analysis of Code

    DTIC Science & Technology

    2014-09-01

    Market Analysis...competitive market analysis to assess the tool potential. The final transition targets were selected and expressed along with our research on the topic...public release milestones. Details of our testing methodology are in our Software Test Plan deliverable, CP-STP-0001. A summary of this approach is

  20. Using Adobe Connect to Deliver Online Library Instruction to the RN to BSN Program

    ERIC Educational Resources Information Center

    Carlson, Kathleen

    2011-01-01

    This paper describes how one academic health sciences librarian brought mediated literature searching to distance RN to BSN nursing students and explains why Adobe Connect was selected as the webinar software for delivering online instruction. The article explains how students participated in a pre-class survey…

  1. Cyber security best practices for the nuclear industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Badr, I.

    2012-07-01

    When deploying software-based systems, such as digital instrumentation and controls for the nuclear industry, it is vital to include cyber security assessment as part of the architecture and development process. When integrating and delivering software-intensive systems for the nuclear industry, engineering teams should make use of a secure, requirements-driven software development life cycle, ensuring security compliance and optimum return on investment. Reliability protections, data loss prevention, and privacy enforcement provide a strong case for installing strict cyber security policies. (authors)

  2. Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment

    NASA Technical Reports Server (NTRS)

    Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun

    2006-01-01

    Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development, chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operation components. We have also had to determine how to integrate existing mission applications for sequence development, sequence validation, high level activity planning, and other functions into a component-based environment. For each of these, we used a somewhat different technique based upon the structure and usage of the existing application.
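
    One pattern for the "middle ground" described above is to wrap an existing, non-component application behind a thin adapter so other components can invoke it through a uniform interface. The sketch below is a hedged illustration of that adapter idea only; the Component protocol and the legacy_sequencer command are invented for the example, and this is not how Ensemble or Eclipse actually integrate plug-ins.

```python
# Hedged sketch: exposing a legacy command-line tool behind a component-style contract.
import subprocess
from typing import Protocol

class Component(Protocol):
    """Illustrative component contract: every component answers a request dict."""
    def invoke(self, request: dict) -> dict: ...

class LegacyToolComponent:
    """Adapter that wraps an existing command-line application as a component."""
    def __init__(self, executable: str):
        self.executable = executable

    def invoke(self, request: dict) -> dict:
        # Translate the generic request into command-line arguments for the legacy tool.
        args = [self.executable] + [f"--{key}={value}" for key, value in request.items()]
        result = subprocess.run(args, capture_output=True, text=True)
        return {"exit_code": result.returncode, "output": result.stdout}

# Other components call invoke() without knowing a legacy process runs underneath.
sequencer = LegacyToolComponent("legacy_sequencer")   # hypothetical executable name
# response = sequencer.invoke({"plan": "sol_102.plan"})
```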

  3. Clinical software development for the Web: lessons learned from the BOADICEA project

    PubMed Central

    2012-01-01

    Background In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. Results We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. Conclusions We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback. PMID:22490389

  4. Clinical software development for the Web: lessons learned from the BOADICEA project.

    PubMed

    Cunningham, Alex P; Antoniou, Antonis C; Easton, Douglas F

    2012-04-10

    In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback.

  5. The six critical attributes of the next generation of quality management software systems.

    PubMed

    Clark, Kathleen

    2011-07-01

    Driven by both the need to meet regulatory requirements and a genuine desire to drive improved quality, quality management systems encompassing standard operating procedure, corrective and preventative actions and related processes have existed for many years, both in paper and electronic form. The impact of quality management systems on 'actual' quality, however, is often reported as far less than desired. A quality management software system that moves beyond formal forms-driven processes to include a true closed loop design, manage disparate processes across the enterprise, provide support for collaborative processes and deliver insight into the overall state of control has the potential to close the gap between simply accomplishing regulatory compliance and delivering measurable improvements in quality and efficiency.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamlet, Benjamin R.; Harris, James M.; Burns, John F.

    This document contains 4 use case realizations generated from the model contained in Rational Software Architect. These use case realizations are the current versions of the realizations originally delivered in Elaboration Iteration 3.

  7. Managing the Software Development Process

    NASA Technical Reports Server (NTRS)

    Lubelczky, Jeffrey T.; Parra, Amy

    1999-01-01

    The goal of any software development project is to produce a product that is delivered on time, within the allocated budget, and with the capabilities expected by the customer; unfortunately, this goal is rarely achieved. However, a properly managed project in a mature software engineering environment can consistently achieve this goal. In this paper we provide an introduction to three project success factors: a properly managed project, a competent project manager, and a mature software engineering environment. We will also present an overview of the benefits of a mature software engineering environment based on 24 years of data from the Software Engineering Lab, and suggest some first steps that an organization can take to begin benefiting from this environment. The depth and breadth of software engineering exceed the scope of this paper; various references are cited with a goal of raising awareness and encouraging further investigation into software engineering and project management practices.

  8. A decentralized software bus based on IP multicasting

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Montgomery, Todd

    1995-01-01

    We describe a decentralized, reconfigurable implementation of a conference management system based on the low-level Internet Protocol (IP) multicasting protocol. IP multicasting allows low-cost, world-wide, two-way transmission of data between large numbers of conferencing participants through the Multicasting Backbone (MBone). Each conference is structured as a software bus -- a messaging system that provides a run-time interconnection model that acts as a separate agent (i.e., the bus) for routing, queuing, and delivering messages between distributed programs. Unlike the client-server interconnection model, the software bus model provides a level of indirection that enhances the flexibility and reconfigurability of a distributed system. Current software bus implementations like POLYLITH, however, rely on a centralized bus process and point-to-point protocols (i.e., TCP/IP) to route, queue, and deliver messages. We implement a software bus called the MULTIBUS that relies on a separate process only for routing and uses a reliable IP multicasting protocol for delivery of messages. The use of multicasting means that interconnections are independent of IP machine addresses. This approach allows reconfiguration of bus participants during system execution without notifying other participants of new IP addresses. The use of IP multicasting also permits an economy of scale in the number of participants. We describe the MULTIBUS protocol elements and show how our implementation performs better than centralized bus implementations.
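
    The mechanism behind such a bus is that every participant joins one multicast group and can then send to, and receive from, all other participants without maintaining point-to-point connections. The sketch below shows that core idea with standard UDP multicast sockets; the group address, port, and message handling are assumptions for illustration, and it omits the reliability layer that MULTIBUS adds on top of raw IP multicast.

```python
# Hedged sketch of a multicast-based bus participant (group, port, and framing are assumed).
import socket
import struct

GROUP = "239.1.2.3"   # assumed administratively scoped multicast group
PORT = 5007

def open_bus_socket():
    """Join the multicast group so this participant can both send and receive."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))
    mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return sock

def publish(sock, message: bytes):
    """Deliver a message to every participant without knowing their addresses."""
    sock.sendto(message, (GROUP, PORT))

def receive(sock) -> bytes:
    """Block until the next bus message arrives."""
    data, _sender = sock.recvfrom(65535)
    return data

# Example usage (commented out so the module can be imported without binding the port):
# bus = open_bus_socket()
# publish(bus, b"hello, conference")
# print(receive(bus))
```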

  9. CBT Pilot Program Instructional Guide. Basic Drafting Skills Curriculum Delivered through CAD Workstations and Artificial Intelligence Software.

    ERIC Educational Resources Information Center

    Smith, Richard J.; Sauer, Mardelle A.

    This guide is intended to assist teachers in using computer-aided design (CAD) workstations and artificial intelligence software to teach basic drafting skills. The guide outlines a 7-unit shell program that may also be used as a generic authoring system capable of supporting computer-based training (CBT) in other subject areas. The first section…

  10. Earth Science Goes E-Commerce

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Software packages commercially marketed by Agri ImaGIS allow customers to analyze farm fields. Agri ImaGIS provides satellite images of farmland and agricultural views to US clients. The company approached NASA-MSU TechLink for access to technology that would improve the company's capabilities to deliver satellite images over the Internet. TechLink found that software with the desired functions had already been developed through NASA's Remote Sensing Database Program. Agri ImaGIS formed a partnership with the University of Minnesota group that allows the company to further develop the software to meet its Internet commerce needs.

  11. Flight Software for the LADEE Mission

    NASA Technical Reports Server (NTRS)

    Cannon, Howard N.

    2015-01-01

    The Lunar Atmosphere and Dust Environment Explorer (LADEE) spacecraft was launched on September 6, 2013, and completed its mission on April 17, 2014 with a directed impact to the Lunar Surface. Its primary goals were to examine the lunar atmosphere, measure lunar dust, and demonstrate high rate laser communications. The LADEE mission was a resounding success, achieving all mission objectives, much of which can be attributed to careful planning and preparation. This paper discusses some of the highlights from the mission, and then discusses the techniques used for developing the onboard Flight Software. A large emphasis for the Flight Software was to develop it within tight schedule and cost constraints. To accomplish this, the Flight Software team leveraged heritage software, used model based development techniques, and utilized an automated test infrastructure. This resulted in the software being delivered on time and within budget. The resulting software was able to meet all system requirements, and had very few problems in flight.

  12. Initial Progress Toward Development of a Voice-Based Computer-Delivered Motivational Intervention for Heavy Drinking College Students: An Experimental Study

    PubMed Central

    Lechner, William J; MacGlashan, James; Wray, Tyler B; Littman, Michael L

    2017-01-01

    Background Computer-delivered interventions have been shown to be effective in reducing alcohol consumption in heavy drinking college students. However, these computer-delivered interventions rely on mouse, keyboard, or touchscreen responses for interactions between the users and the computer-delivered intervention. The principles of motivational interviewing suggest that in-person interventions may be effective, in part, because they encourage individuals to think through and speak aloud their motivations for changing a health behavior, which current computer-delivered interventions do not allow. Objective The objective of this study was to take the initial steps toward development of a voice-based computer-delivered intervention that can ask open-ended questions and respond appropriately to users’ verbal responses, more closely mirroring a human-delivered motivational intervention. Methods We developed (1) a voice-based computer-delivered intervention that was run by a human controller and that allowed participants to speak their responses to scripted prompts delivered by speech generation software and (2) a text-based computer-delivered intervention that relied on the mouse, keyboard, and computer screen for all interactions. We randomized 60 heavy drinking college students to interact with the voice-based computer-delivered intervention and 30 to interact with the text-based computer-delivered intervention and compared their ratings of the systems as well as their motivation to change drinking and their drinking behavior at 1-month follow-up. Results Participants reported that the voice-based computer-delivered intervention engaged positively with them in the session and delivered content in a manner consistent with motivational interviewing principles. At 1-month follow-up, participants in the voice-based computer-delivered intervention condition reported significant decreases in quantity, frequency, and problems associated with drinking, and increased perceived importance of changing drinking behaviors. In comparison to the text-based computer-delivered intervention condition, those assigned to voice-based computer-delivered intervention reported significantly fewer alcohol-related problems at the 1-month follow-up (incident rate ratio 0.60, 95% CI 0.44-0.83, P=.002). The conditions did not differ significantly on perceived importance of changing drinking or on measures of drinking quantity and frequency of heavy drinking. Conclusions Results indicate that it is feasible to construct a series of open-ended questions and a bank of responses and follow-up prompts that can be used in a future fully automated voice-based computer-delivered intervention that may mirror more closely human-delivered motivational interventions to reduce drinking. Such efforts will require using advanced speech recognition capabilities and machine-learning approaches to train a program to mirror the decisions made by human controllers in the voice-based computer-delivered intervention used in this study. In addition, future studies should examine enhancements that can increase the perceived warmth and empathy of voice-based computer-delivered intervention, possibly through greater personalization, improvements in the speech generation software, and embodying the computer-delivered intervention in a physical form. PMID:28659259

  13. Cost-aware request routing in multi-geography cloud data centres using software-defined networking

    NASA Astrophysics Data System (ADS)

    Yuan, Haitao; Bi, Jing; Li, Bo Hu; Tan, Wei

    2017-03-01

    Current geographically distributed cloud data centres (CDCs) require gigantic energy and bandwidth costs to provide multiple cloud applications to users around the world. Previous studies only focus on energy cost minimisation in distributed CDCs. However, a CDC provider needs to deliver gigantic data between users and distributed CDCs through internet service providers (ISPs). Geographical diversity of bandwidth and energy costs brings a highly challenging problem of how to minimise the total cost of a CDC provider. With the recently emerging software-defined networking, we study the total cost minimisation problem for a CDC provider by exploiting geographical diversity of energy and bandwidth costs. We formulate the total cost minimisation problem as a mixed integer non-linear programming (MINLP). Then, we develop heuristic algorithms to solve the problem and to provide a cost-aware request routing for joint optimisation of the selection of ISPs and the number of servers in distributed CDCs. Besides, to tackle the dynamic workload in distributed CDCs, this article proposes a regression-based workload prediction method to obtain future incoming workload. Finally, this work evaluates the cost-aware request routing by trace-driven simulation and compares it with the existing approaches to demonstrate its effectiveness.
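
    The abstract does not reproduce the formulation, but the general shape of such a total-cost MINLP can be sketched as below; the symbols are illustrative assumptions (p_j the electricity price at data centre j, e_j(m_j) the energy drawn by m_j active servers, b_ij the unit bandwidth price for serving user region i via an ISP path to data centre j, lambda_i the request rate from region i, and mu the per-server service rate), not the paper's notation.

```latex
% Hedged sketch of a total-cost MINLP of this general form; symbols are illustrative,
% not the notation used in the cited article.
\begin{aligned}
\min_{x_{ij},\, m_j}\quad & \sum_{j} p_j\, e_j(m_j) \;+\; \sum_{i}\sum_{j} b_{ij}\,\lambda_i\, x_{ij} \\
\text{s.t.}\quad & \sum_{j} x_{ij} = 1 \quad \forall i, \qquad
                   \sum_{i} \lambda_i\, x_{ij} \le \mu\, m_j \quad \forall j, \\
                 & x_{ij} \in \{0,1\}, \qquad m_j \in \mathbb{Z}_{\ge 0}.
\end{aligned}
```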

  14. Comparison of flat cleaved and cylindrical diffusing fibers as treatment sources for interstitial photodynamic therapy.

    PubMed

    Baran, Timothy M; Foster, Thomas H

    2014-02-01

    For interstitial photodynamic therapy (iPDT) of bulky tumors, careful treatment planning is required in order to ensure that a therapeutic dose is delivered to the tumor, while minimizing damage to surrounding normal tissue. In clinical contexts, iPDT has typically been performed with either flat cleaved or cylindrical diffusing optical fibers as light sources. Here, the authors directly compare these two source geometries in terms of the number of fibers and duration of treatment required to deliver a prescribed light dose to a tumor volume. Treatment planning software for iPDT was developed based on graphics processing unit enhanced Monte Carlo simulations. This software was used to optimize the number of fibers, total energy delivered by each fiber, and the position of individual fibers in order to deliver a target light dose (D90) to 90% of the tumor volume. Treatment plans were developed using both flat cleaved and cylindrical diffusing fibers, based on tissue volumes derived from CT data from a head and neck cancer patient. Plans were created for four cases: fixed energy per fiber, fixed number of fibers, and in cases where both or neither of these factors were fixed. When the number of source fibers was fixed at eight, treatment plans based on flat cleaved fibers required each to deliver 7180-8080 J in order to deposit 90 J/cm(2) in 90% of the tumor volume. For diffusers, each fiber was required to deliver 2270-2350 J (333-1178 J/cm) in order to achieve this same result. For the case of fibers delivering a fixed 900 J, 13 diffusers or 19 flat cleaved fibers at a spacing of 1 cm were required to deliver the desired dose. With energy per fiber fixed at 2400 J and the number of fibers fixed at eight, diffuser fibers delivered the desired dose to 93% of the tumor volume, while flat cleaved fibers delivered this dose to 79%. With both energy and number of fibers allowed to vary, six diffusers delivering 3485-3600 J were required, compared to ten flat cleaved fibers delivering 2780-3600 J. For the same number of fibers, cylindrical diffusers allow for a shorter treatment duration compared to flat cleaved fibers. For the same energy delivered per fiber, diffusers allow for the insertion of fewer fibers in order to deliver the same light dose to a target volume.

  15. Comparison of flat cleaved and cylindrical diffusing fibers as treatment sources for interstitial photodynamic therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baran, Timothy M., E-mail: timothy.baran@rochester.edu; Foster, Thomas H.

    Purpose: For interstitial photodynamic therapy (iPDT) of bulky tumors, careful treatment planning is required in order to ensure that a therapeutic dose is delivered to the tumor, while minimizing damage to surrounding normal tissue. In clinical contexts, iPDT has typically been performed with either flat cleaved or cylindrical diffusing optical fibers as light sources. Here, the authors directly compare these two source geometries in terms of the number of fibers and duration of treatment required to deliver a prescribed light dose to a tumor volume. Methods: Treatment planning software for iPDT was developed based on graphics processing unit enhanced Monte Carlo simulations. This software was used to optimize the number of fibers, total energy delivered by each fiber, and the position of individual fibers in order to deliver a target light dose (D90) to 90% of the tumor volume. Treatment plans were developed using both flat cleaved and cylindrical diffusing fibers, based on tissue volumes derived from CT data from a head and neck cancer patient. Plans were created for four cases: fixed energy per fiber, fixed number of fibers, and in cases where both or neither of these factors were fixed. Results: When the number of source fibers was fixed at eight, treatment plans based on flat cleaved fibers required each to deliver 7180–8080 J in order to deposit 90 J/cm(2) in 90% of the tumor volume. For diffusers, each fiber was required to deliver 2270–2350 J (333–1178 J/cm) in order to achieve this same result. For the case of fibers delivering a fixed 900 J, 13 diffusers or 19 flat cleaved fibers at a spacing of 1 cm were required to deliver the desired dose. With energy per fiber fixed at 2400 J and the number of fibers fixed at eight, diffuser fibers delivered the desired dose to 93% of the tumor volume, while flat cleaved fibers delivered this dose to 79%. With both energy and number of fibers allowed to vary, six diffusers delivering 3485–3600 J were required, compared to ten flat cleaved fibers delivering 2780–3600 J. Conclusions: For the same number of fibers, cylindrical diffusers allow for a shorter treatment duration compared to flat cleaved fibers. For the same energy delivered per fiber, diffusers allow for the insertion of fewer fibers in order to deliver the same light dose to a target volume.

  16. Security in Full-Force

    NASA Technical Reports Server (NTRS)

    2002-01-01

    When fully developed for NASA, Vanguard Enforcer(TM) software, which emulates the activities of highly technical security system programmers, auditors, and administrators, was among the first intrusion detection programs to prevent human errors from affecting security, and to ensure the integrity of a computer's operating systems, as well as the protection of mission critical resources. Vanguard Enforcer was delivered in 1991 to Johnson Space Center and has been protecting systems and critical data there ever since. In August of 1999, NASA granted Vanguard exclusive rights to commercialize the Enforcer system for the private sector. In return, Vanguard continues to supply NASA with ongoing research, development, and support of Enforcer. The Vanguard Enforcer 4.2 is one of several surveillance technologies that make up the Vanguard Security Solutions line of products. Using a mainframe environment, Enforcer 4.2 achieves previously unattainable levels of automated security management.

  17. Shuttle avionics software trials, tribulations and success

    NASA Technical Reports Server (NTRS)

    Henderson, O. L.

    1985-01-01

    The early problems and the solutions developed to provide the required quality software needed to support the space shuttle engine development program are described. The decision to use a programmable digital control system on the space shuttle engine was primarily based upon the need for a flexible control system capable of supporting the total engine mission on a large complex pump fed engine. The mission definition included all control phases from ground checkout through post shutdown propellant dumping. The flexibility of the controller through reprogrammable software allowed the system to respond to the technical challenges and innovation required to develop both the engine and controller hardware. This same flexibility, however, placed a severe strain on the capability of the software development and verification organization. The overall development program required that the software facility accommodate significant growth in both the software requirements and the number of software packages delivered. This challenge was met by reorganization and evolution in the process of developing and verifying software.

  18. Spitzer Telemetry Processing System

    NASA Technical Reports Server (NTRS)

    Stanboli, Alice; Martinez, Elmain M.; McAuley, James M.

    2013-01-01

    The Spitzer Telemetry Processing System (SirtfTlmProc) was designed to address objectives of JPL's Multi-mission Image Processing Lab (MIPL) in processing spacecraft telemetry and distributing the resulting data to the science community. To minimize costs and maximize operability, the software design focused on automated error recovery, performance, and information management. The system processes telemetry from the Spitzer spacecraft and delivers Level 0 products to the Spitzer Science Center. SirtfTlmProc is a unique system with automated error notification and recovery, with a real-time continuous service that can go quiescent after periods of inactivity. The software can process 2 GB of telemetry and deliver Level 0 science products to the end user in four hours. It provides analysis tools so the operator can manage the system and troubleshoot problems. It automates telemetry processing in order to reduce staffing costs.

  19. A Probabilistic Software System Attribute Acceptance Paradigm for COTS Software Evaluation

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry

    2005-01-01

    Standard software requirement formats are written from top-down perspectives only, that is, from an ideal notion of a client's needs. Despite the exactness of the standard format, software and system errors in designed systems have abounded. Bad and inadequate requirements have resulted in cost overruns, schedule slips and lost profitability. Commercial off-the-shelf (COTS) software components are even more troublesome than designed systems because they are often provided as is and subsequently delivered with unsubstantiated validation of described capabilities. For COTS software, there needs to be a way to express the client's software needs in a consistent and formal manner using software system attributes derived from software quality standards. Additionally, the format needs to be amenable to software evaluation processes that integrate observable evidence garnered from historical data. This paper presents a paradigm that effectively bridges the gap between what a client desires (top-down) and what has been demonstrated (bottom-up) for COTS software evaluation. The paradigm addresses the specification of needs before the software evaluation is performed and can be used to increase the shared understanding between clients and software evaluators about what is required and what is technically possible.

  20. Exploring the Use of a Test Automation Framework

    NASA Technical Reports Server (NTRS)

    Cervantes, Alex

    2009-01-01

    It is known that software testers, more often than not, lack the time needed to fully test the delivered software product within the time period allotted to them. When problems occur in the implementation phase of a development project, they normally cause the software delivery date to slide. As a result, testers either need to work longer hours, or supplementary resources need to be added to the test team in order to meet aggressive test deadlines. One solution to this problem is to provide testers with a test automation framework to facilitate the development of automated test solutions.

  1. A Software Rejuvenation Framework for Distributed Computing

    NASA Technical Reports Server (NTRS)

    Chau, Savio

    2009-01-01

    A performability-oriented conceptual framework for software rejuvenation has been constructed as a means of increasing levels of reliability and performance in distributed stateful computing. As used here, performability-oriented signifies that the construction of the framework is guided by the concept of analyzing the ability of a given computing system to deliver services with gracefully degradable performance. The framework is especially intended to support applications that involve stateful replicas of server computers.

  2. Initial Progress Toward Development of a Voice-Based Computer-Delivered Motivational Intervention for Heavy Drinking College Students: An Experimental Study.

    PubMed

    Kahler, Christopher W; Lechner, William J; MacGlashan, James; Wray, Tyler B; Littman, Michael L

    2017-06-28

    Computer-delivered interventions have been shown to be effective in reducing alcohol consumption in heavy drinking college students. However, these computer-delivered interventions rely on mouse, keyboard, or touchscreen responses for interactions between the users and the computer-delivered intervention. The principles of motivational interviewing suggest that in-person interventions may be effective, in part, because they encourage individuals to think through and speak aloud their motivations for changing a health behavior, which current computer-delivered interventions do not allow. The objective of this study was to take the initial steps toward development of a voice-based computer-delivered intervention that can ask open-ended questions and respond appropriately to users' verbal responses, more closely mirroring a human-delivered motivational intervention. We developed (1) a voice-based computer-delivered intervention that was run by a human controller and that allowed participants to speak their responses to scripted prompts delivered by speech generation software and (2) a text-based computer-delivered intervention that relied on the mouse, keyboard, and computer screen for all interactions. We randomized 60 heavy drinking college students to interact with the voice-based computer-delivered intervention and 30 to interact with the text-based computer-delivered intervention and compared their ratings of the systems as well as their motivation to change drinking and their drinking behavior at 1-month follow-up. Participants reported that the voice-based computer-delivered intervention engaged positively with them in the session and delivered content in a manner consistent with motivational interviewing principles. At 1-month follow-up, participants in the voice-based computer-delivered intervention condition reported significant decreases in quantity, frequency, and problems associated with drinking, and increased perceived importance of changing drinking behaviors. In comparison to the text-based computer-delivered intervention condition, those assigned to voice-based computer-delivered intervention reported significantly fewer alcohol-related problems at the 1-month follow-up (incident rate ratio 0.60, 95% CI 0.44-0.83, P=.002). The conditions did not differ significantly on perceived importance of changing drinking or on measures of drinking quantity and frequency of heavy drinking. Results indicate that it is feasible to construct a series of open-ended questions and a bank of responses and follow-up prompts that can be used in a future fully automated voice-based computer-delivered intervention that may mirror more closely human-delivered motivational interventions to reduce drinking. Such efforts will require using advanced speech recognition capabilities and machine-learning approaches to train a program to mirror the decisions made by human controllers in the voice-based computer-delivered intervention used in this study. In addition, future studies should examine enhancements that can increase the perceived warmth and empathy of voice-based computer-delivered intervention, possibly through greater personalization, improvements in the speech generation software, and embodying the computer-delivered intervention in a physical form. ©Christopher W Kahler, William J Lechner, James MacGlashan, Tyler B Wray, Michael L Littman. Originally published in JMIR Mental Health (http://mental.jmir.org), 28.06.2017.

  3. Subaperture metrology technologies extend capabilities in optics manufacturing

    NASA Astrophysics Data System (ADS)

    Tricard, Marc; Forbes, Greg; Murphy, Paul

    2005-10-01

    Subaperture polishing technologies have radically changed the landscape of precision optics manufacturing and enabled the production of higher precision optics with increasingly difficult figure requirements. However, metrology is a critical piece of the optics fabrication process, and the dependence on interferometry is especially acute for computer-controlled, deterministic finishing. Without accurate full-aperture metrology, figure correction using subaperture polishing technologies would not be possible. QED Technologies has developed the Subaperture Stitching Interferometer (SSI) that extends the effective aperture and dynamic range of a phase measuring interferometer. The SSI's novel developments in software and hardware improve the capacity and accuracy of traditional interferometers, overcoming many of the limitations previously faced. The SSI performs high-accuracy automated measurements of spheres, flats, and mild aspheres up to 200 mm in diameter by stitching subaperture data. The system combines a six-axis precision workstation, a commercial Fizeau interferometer of 4" or 6" aperture, and dedicated software. QED's software automates the measurement design, data acquisition, and mathematical reconstruction of the full-aperture phase map. The stitching algorithm incorporates a general framework for compensating several types of errors introduced by the interferometer and stage mechanics. These include positioning errors, viewing system distortion, the system reference wave error, etc. The SSI has been proven to deliver the accurate and flexible metrology that is vital to precision optics fabrication. This paper will briefly review the capabilities of the SSI as a production-ready metrology system that enables cost-effective manufacturing of precision optical surfaces.

  4. Use of Fisheye Parrot Bebop 2 Images for 3d Modelling Using Commercial Photogrammetric Software

    NASA Astrophysics Data System (ADS)

    Pagliari, D.; Pinto, L.

    2018-05-01

    Fisheye cameras installed on board mass-market UAS are becoming very popular, and such platforms are increasingly used for photogrammetric purposes. The interest in wide-angle images for 3D modelling is confirmed by the introduction of fisheye models in several commercial software packages. The paper examines the different mathematical models implemented in the most widely used commercial photogrammetric software packages, highlighting the different processing pipelines and analysing the achievable results in terms of checkpoint residuals, as well as the quality of the delivered 3D point clouds. A two-step approach based on the creation of undistorted images has been tested too. An experimental test has been carried out using a Parrot Bebop 2 UAS by performing a flight over a historical complex located near Piacenza (Northern Italy), which is characterized by the simultaneous presence of horizontal, vertical and oblique surfaces. Different flight configurations have been tested to evaluate the potential and possible drawbacks of the previously mentioned UAS platform. Results confirmed that the fisheye images acquired with the Parrot Bebop 2 are suitable for 3D modelling, ensuring accuracies of the photogrammetric blocks of the order of the GSD (about 0.05 m normal to the optic axis in case of a flight height equal to 35 m). The generated point clouds have been compared to a reference scan, acquired by means of a MS60 MultiStation, resulting in differences below 0.05 m in all directions.

  5. Progressive retry for software error recovery in distributed systems

    NASA Technical Reports Server (NTRS)

    Wang, Yi-Min; Huang, Yennun; Fuchs, W. K.

    1993-01-01

    In this paper, we describe a method of execution retry for bypassing software errors based on checkpointing, rollback, message reordering and replaying. We demonstrate how rollback techniques, previously developed for transient hardware failure recovery, can also be used to recover from software faults by exploiting message reordering to bypass software errors. Our approach intentionally increases the degree of nondeterminism and the scope of rollback when a previous retry fails. Examples from our experience with telecommunications software systems illustrate the benefits of the scheme.
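
    The escalation idea (roll back further and inject more nondeterminism, such as message reordering, each time a shallower retry fails) can be sketched as follows; the checkpoint, restore, and replay interfaces are hypothetical placeholders for illustration, not the authors' implementation.

```python
# Hedged sketch of progressive retry: each attempt widens the rollback scope and
# reorders replayed messages. The process/checkpoint interfaces are hypothetical.
import random

def progressive_retry(process, checkpoints, max_attempts=3):
    """Retry a failed process, escalating rollback depth and reordering on each attempt."""
    for attempt in range(1, max_attempts + 1):
        # Roll further back in the checkpoint history on each successive attempt.
        rollback_depth = min(attempt, len(checkpoints))
        state = checkpoints[-rollback_depth]
        messages = list(state.pending_messages)     # hypothetical checkpointed message log
        if attempt > 1:
            # Introduce nondeterminism: replay the logged messages in a different order.
            random.shuffle(messages)
        try:
            process.restore(state)                  # hypothetical rollback interface
            process.replay(messages)                # hypothetical replay interface
            return True                             # the retry bypassed the software error
        except Exception:
            continue                                # escalate on the next attempt
    return False                                    # give up after exhausting retries
```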

  6. Transportable Payload Operations Control Center reusable software: Building blocks for quality ground data systems

    NASA Technical Reports Server (NTRS)

    Mahmot, Ron; Koslosky, John T.; Beach, Edward; Schwarz, Barbara

    1994-01-01

    The Mission Operations Division (MOD) at Goddard Space Flight Center builds Mission Operations Centers which are used by Flight Operations Teams to monitor and control satellites. Reducing system life cycle costs through software reuse has always been a priority of the MOD. The MOD's Transportable Payload Operations Control Center development team established an extensive library of 14 subsystems with over 100,000 delivered source instructions of reusable, generic software components. Nine TPOCC-based control centers to date support 11 satellites and achieved an average software reuse level of more than 75 percent. This paper shares experiences of how the TPOCC building blocks were developed and how building block developer's, mission development teams, and users are all part of the process.

  7. Electronic Books.

    ERIC Educational Resources Information Center

    Barker, Philip; Giller, Susan

    1992-01-01

    Classifies types of electronic books: archival, informational, instructional, and interrogational; evaluates five commercially available examples and two in-house examples; and describes software tools for creating and delivering electronic books. Identifies crucial design considerations: interactive end-user interfaces; use of hypermedia;…

  8. Knowledge Retrieval Solutions.

    ERIC Educational Resources Information Center

    Khan, Kamran

    1998-01-01

    Excalibur RetrievalWare offers true knowledge retrieval solutions. Its fundamental technologies, Adaptive Pattern Recognition Processing and Semantic Networks, have capabilities for knowledge discovery and knowledge management of full-text, structured and visual information. The software delivers a combination of accuracy, extensibility,…

  9. NASA Ares I Crew Launch Vehicle Upper Stage Avionics and Software Overview

    NASA Technical Reports Server (NTRS)

    Nola, Charles L.; Blue, Lisa

    2008-01-01

    Building on the heritage of the Saturn and Space Shuttle Programs for the Design, Development, Test, and Evaluation (DDT and E) of avionics and software for NASA's Ares I Crew Launch Vehicle (CLV), the Ares I Upper Stage Element is a vital part of the Constellation Program's transportation system. The Upper Stage Element's Avionics Subsystem is actively proceeding toward its objective of delivering a flight-certified Upper Stage Avionics System for the Ares I CLV.

  10. The Role of Networks in Cloud Computing

    NASA Astrophysics Data System (ADS)

    Lin, Geng; Devine, Mac

    The confluence of technology advancements and business developments in Broadband Internet, Web services, computing systems, and application software over the past decade has created a perfect storm for cloud computing. The "cloud model" of delivering and consuming IT functions as services is poised to fundamentally transform the IT industry and rebalance the inter-relationships among end users, enterprise IT, software companies, and the service providers in the IT ecosystem (Armbrust et al., 2009; Lin, Fu, Zhu, & Dasmalchi, 2009).

  11. Cognitive issues in autonomous spacecraft-control operations: An investigation of software-mediated decision making in a scaled environment

    NASA Astrophysics Data System (ADS)

    Murphy, Elizabeth Drummond

    As advances in technology are applied in complex, semi-automated domains, human controllers are distanced from the controlled process. This physical and psychological distance may both facilitate and degrade human performance. To investigate cognitive issues in spacecraft ground-control operations, the present experimental research was undertaken. The primary issue concerned the ability of operations analysts who do not monitor operations to make timely, accurate decisions when autonomous software calls for human help. Another key issue involved the potential effects of spatial-visualization ability (SVA) in environments that present data in graphical formats. Hypotheses were derived largely from previous findings and predictions in the literature. Undergraduate psychology students were assigned at random to a monitoring condition or an on-call condition in a scaled environment. The experimental task required subjects to decide on the veracity of a problem diagnosis delivered by a software process on-board a simulated spacecraft. To support decision-making, tabular and graphical data displays presented information on system status. A level of software confidence in the problem diagnosis was displayed, and subjects reported their own level of confidence in their decisions. Contrary to expectations, the performance of on-call subjects did not differ significantly from that of continuous monitors. Analysis yielded a significant interaction of sex and condition: Females in the on-call condition had the lowest mean accuracy. Results included a preference for bar charts over line graphs and faster performance with tables than with line graphs. A significant correlation was found between subjective confidence and decision accuracy. SVA was found to be predictive of accuracy but not speed; and SVA was found to be a stronger predictor of performance for males than for females. Low-SVA subjects reported that they relied more on software confidence than did medium- or high-SVA subjects. These and other findings have implications for the design of user interfaces to support human decision-making in on-call situations and to accommodate low-SVA users.

  12. Nicotine and Cotinine Exposure from Electronic Cigarettes: A Population Approach

    PubMed Central

    de Mendizábal, Nieves Vélez; Jones, David R.; Jahn, Andy; Bies, Robert R.; Brown, Joshua W.

    2015-01-01

    Background and Objectives Electronic cigarettes (e-cigarettes) are a recent technology that has gained rapid acceptance. Still, little is known about them in terms of safety and effectiveness. A basic question is how effectively they deliver nicotine, however the literature is surprisingly unclear on this point. Here, a population pharmacokinetic (PK) model was developed for nicotine and its major metabolite cotinine with the aim to provide a reliable framework for the simulation of nicotine and cotinine concentrations over time, based solely on inhalation airflow recordings and individual covariates (i.e. weight and breath carbon monoxide CO levels). Methods This study included 10 adults self-identified as heavy smokers (at least one pack per day). Plasma nicotine and cotinine concentrations were measured at regular 10-minute intervals for 90 minutes while human subjects inhaled nicotine vapor from a modified e-cigarette. Airflow measurements were recorded every 200 milliseconds throughout the session. A population PK model for nicotine and cotinine was developed based on previously published PK parameters and the airflow recordings. All the analyses were performed with the nonlinear mixed-effect modelling software NONMEM 7.2. Results The results show that e-cigarettes deliver nicotine effectively, although the pharmacokinetic profiles are lower than those achieved with regular cigarettes. Our PK model effectively predicts plasma nicotine and cotinine concentrations from the inhalation volume, and initial breath CO. Conclusion E-cigarettes are effective at delivering nicotine. This new PK model of e-cigarette usage might be used for pharmacodynamic analysis where the PK profiles are not available. PMID:25503588
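
    As a hedged illustration of the structural part of such a parent-metabolite model (a depot compartment fed by the inhalation input, first-order absorption into plasma nicotine, and first-order conversion to cotinine), the sketch below integrates the corresponding ODEs; the rate constants, metabolic fraction, and puff input function are illustrative assumptions rather than the published population estimates, and the mixed-effects (inter-individual) layer fit in NONMEM is omitted.

```python
# Hedged sketch of a nicotine/cotinine structural PK model; parameters are illustrative.
import numpy as np
from scipy.integrate import odeint

def pk_model(y, t, ka, ke_nic, fm, ke_cot, puff_rate):
    depot, nic, cot = y
    d_depot = puff_rate(t) - ka * depot           # nicotine vapor awaiting absorption
    d_nic = ka * depot - ke_nic * nic             # plasma nicotine amount
    d_cot = fm * ke_nic * nic - ke_cot * cot      # cotinine formed from metabolized nicotine
    return [d_depot, d_nic, d_cot]

times = np.linspace(0, 90, 91)                    # minutes, matching the sampling window
puffs = lambda t: 0.02 if t < 60 else 0.0         # assumed constant inhalation input (mg/min)
profile = odeint(pk_model, [0.0, 0.0, 0.0], times,
                 args=(0.5, 0.05, 0.7, 0.002, puffs))
# Columns of `profile` are depot, nicotine, and cotinine amounts over the 90-minute session.
```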

  13. GPS Software Packages Deliver Positioning Solutions

    NASA Technical Reports Server (NTRS)

    2010-01-01

    "To determine a spacecraft s position, the Jet Propulsion Laboratory (JPL) developed an innovative software program called the GPS (global positioning system)-Inferred Positioning System and Orbit Analysis Simulation Software, abbreviated as GIPSY-OASIS, and also developed Real-Time GIPSY (RTG) for certain time-critical applications. First featured in Spinoff 1999, JPL has released hundreds of licenses for GIPSY and RTG, including to Longmont, Colorado-based DigitalGlobe. Using the technology, DigitalGlobe produces satellite imagery with highly precise latitude and longitude coordinates and then supplies it for uses within defense and intelligence, civil agencies, mapping and analysis, environmental monitoring, oil and gas exploration, infrastructure management, Internet portals, and navigation technology."

  14. Super Strypi Navigation, Guidance & Control Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Mark A.; Bigelow, Matthew; Gilkey, Jeff C.

    The Super Strypi Navigation, Guidance & Control Software is a real-time implementation of the navigation, guidance and control algorithms designed to deliver a payload to a desired orbit for the rail-launched Super Strypi launch vehicle. The software contains all flight control algorithms required from pre-launch until orbital insertion. The flight sequencer module calls the NG&C functions at the appropriate times of flight. Additional functionality includes all the low-level drivers and I/O for communicating with other systems within the launch vehicle and with the ground support equipment. The software is designed such that the launch location and desired orbit can be changed without recompiling the code.

  15. Resource Allocation Planning Helper (RALPH): Lessons learned

    NASA Technical Reports Server (NTRS)

    Durham, Ralph; Reilly, Norman B.; Springer, Joe B.

    1990-01-01

    The current task of the Resource Allocation Process includes the planning and apportionment of JPL's Ground Data System, composed of the Deep Space Network and Mission Control and Computing Center facilities. The addition of the data-driven, rule-based planning system, RALPH, has expanded the planning horizon from 8 weeks to 10 years and has resulted in large labor savings. Use of the system has also resulted in important improvements in science return through enhanced resource utilization. In addition, RALPH has been instrumental in supporting rapid turnaround for an increased volume of special what-if studies. The status of RALPH is briefly reviewed, and the focus is placed on important lessons learned from the creation of a highly functional design team, from an evolutionary design and implementation period in which an AI shell was selected, prototyped, and ultimately abandoned, and from the fundamental changes to the very process that spawned the tool kit. Principal topics include proper integration of software tools within the planning environment, transition from prototype to delivered software, changes in the planning methodology as a result of evolving software capabilities, and creation of the ability to develop and process generic requirements to allow planning flexibility.

  16. The development and application of composite complexity models and a relative complexity metric in a software maintenance environment

    NASA Technical Reports Server (NTRS)

    Hops, J. M.; Sherif, J. S.

    1994-01-01

    A great deal of effort is now being devoted to the study, analysis, prediction, and minimization of expected software maintenance cost, long before software is delivered to users or customers. It has been estimated that, on average, the effort spent on software maintenance is as costly as the effort spent on all other software activities. Software design methods should be the starting point to aid in alleviating the problems of software maintenance complexity and high costs. Two aspects of maintenance deserve attention: (1) protocols for locating and rectifying defects, and for ensuring that no new defects are introduced in the development phase of the software process; and (2) protocols for modification, enhancement, and upgrading. This article focuses primarily on the second aspect, the development of protocols to help increase the quality and reduce the costs associated with modifications, enhancements, and upgrades of existing software. This study developed parsimonious models and a relative complexity metric for complexity measurement of software that were used to rank the modules in the system relative to one another. Some success was achieved in using the models and the relative metric to identify maintenance-prone modules.
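    The abstract does not spell out the composite model itself, so the following is only a generic sketch of the idea behind a relative complexity ranking: several hypothetical per-module metrics are standardized and averaged into a single score used to rank modules against one another. The module names, metrics, and equal weighting are assumptions for illustration, not the paper's model.

        import numpy as np

        # Hypothetical per-module metrics: lines of code, cyclomatic complexity,
        # and change-request count.  A generic z-score composite ranks modules;
        # this is an illustration, not the paper's actual composite model.
        modules = ["io.c", "parse.c", "sched.c", "ui.c"]
        raw = np.array([
            [1200, 35,  9],
            [ 400, 12,  2],
            [2100, 58, 14],
            [ 650, 20,  4],
        ], dtype=float)

        z = (raw - raw.mean(axis=0)) / raw.std(axis=0)   # standardize each metric
        relative_complexity = z.mean(axis=1)             # equal-weight composite score

        for name, score in sorted(zip(modules, relative_complexity), key=lambda p: -p[1]):
            print(f"{name:10s} relative complexity {score:+.2f}")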

  17. Investigation of the accuracy of MV radiation isocentre calculations in the Elekta cone-beam CT software XVI.

    PubMed

    Riis, Hans L; Moltke, Lars N; Zimmermann, Sune J; Ebert, Martin A; Rowshanfarzad, Pejman

    2016-06-07

    Accurate determination of the megavoltage (MV) radiation isocentre of a linear accelerator (linac) is an important task in radiotherapy. The localization of the MV radiation isocentre is crucial for correct calibration of the in-room lasers and the cone-beam CT scanner used for patient positioning prior to treatment. Linac manufacturers offer tools for MV radiation isocentre localization, but users have no access to the documentation for the underlying method and calculation algorithm used in the commercial software. The idea of this work was to evaluate the accuracy of the software tool for MV radiation isocentre calculation as delivered by Elekta using independent software. The image acquisition was based on the scheme designed by the manufacturer. Eight MV images of a ball-bearing (BB) phantom attached to the treatment couch were acquired in each series. The images were recorded at cardinal angles of the gantry using the electronic portal imaging device (EPID). Eight Elekta linacs with three different types of multileaf collimators (MLCs) were included in the test. The influence of MLC orientation, x-ray energy, and phantom modifications was examined. The acquired images were analysed using the Elekta x-ray volume imaging (XVI) software and in-house developed (IHD) MATLAB code. Results from the two software tools were compared. A discrepancy in the longitudinal direction of the isocentre localization was found, averaging 0.23 mm and reaching a maximum of 0.75 mm. Neither the MLC orientation nor the phantom asymmetry in the longitudinal direction appears to cause the discrepancy. The main cause of the differences could not be clearly identified. However, it is our opinion that the commercial software delivered by the linac manufacturer should be improved to reach better stability and more precise results in the MV radiation isocentre calculations.
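    Neither the XVI algorithm nor the authors' MATLAB code is public, so the snippet below is only a simplified illustration of the general approach: locate the BB shadow by an intensity-weighted centroid on each EPID image and combine the projected offsets seen at the cardinal gantry angles. The synthetic image and the offset values are assumptions, not measured data.

        import numpy as np

        def bb_centroid(image):
            """Intensity-weighted centroid of a dark BB shadow on a bright EPID image."""
            attenuation = image.max() - image                 # BB appears as a dark spot
            ys, xs = np.indices(image.shape)
            return (np.average(xs, weights=attenuation), np.average(ys, weights=attenuation))

        img = np.full((64, 64), 1000.0)                       # synthetic flat EPID image
        img[30:34, 40:44] = 200.0                             # dark BB shadow near (41.5, 31.5)
        print("BB centroid (x, y):", bb_centroid(img))

        # Hypothetical (lateral, longitudinal) BB offsets in mm at gantry 0/90/180/270 deg,
        # already scaled from the imager plane to the isocentre plane.  The longitudinal
        # component is visible at every angle, so a simple average estimates it.
        offsets_mm = {0: (0.30, 0.10), 90: (-0.20, 0.15), 180: (0.25, 0.05), 270: (-0.30, 0.20)}
        longitudinal = np.mean([off[1] for off in offsets_mm.values()])
        print(f"estimated longitudinal isocentre offset ~ {longitudinal:.2f} mm")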

  18. IDC Re-Engineering Phase 2 Iteration E2 Use Case Realizations Version 1.2.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamlet, Benjamin R.; Harris, James M.; Burns, John F.

    2016-12-01

    This document contains 4 use case realizations generated from the model contained in Rational Software Architect. These use case realizations are the current versions of the realizations originally delivered in Elaboration Iteration 2.

  19. Integrated optomechanical analysis and testing software development at MIT Lincoln Laboratory

    NASA Astrophysics Data System (ADS)

    Stoeckel, Gerhard P.; Doyle, Keith B.

    2013-09-01

    Advanced analytical software capabilities are being developed to advance the design of prototypical hardware in the Engineering Division at MIT Lincoln Laboratory. The current effort is focused on the integration of analysis tools tailored to the work flow, organizational structure, and current technology demands. These tools are being designed to provide superior insight into the interdisciplinary behavior of optical systems and enable rapid assessment and execution of design trades to optimize the design of optomechanical systems. The custom software architecture is designed to exploit and enhance the functionality of existing industry standard commercial software, provide a framework for centralizing internally developed tools, and deliver greater efficiency, productivity, and accuracy through standardization, automation, and integration. Specific efforts have included the development of a feature-rich software package for Structural-Thermal-Optical Performance (STOP) modeling, advanced Line Of Sight (LOS) jitter simulations, and improved integration of dynamic testing and structural modeling.

  20. Application of Data Provenance in Healthcare Analytics Software: Information Visualisation of User Activities

    PubMed Central

    Xu, Shen; Rogers, Toby; Fairweather, Elliot; Glenn, Anthony; Curran, James; Curcin, Vasa

    2018-01-01

    Data provenance is a technique that describes the history of digital objects. In health data settings, it can be used to deliver auditability and transparency and to achieve trust in a software system. However, implementing data provenance in analytics software at an enterprise level presents a different set of challenges from the research environments where data provenance was originally devised. In this paper, the challenges of reporting provenance information to the user are presented. Provenance captured from analytics software can be large and complex, and visualizing a series of tasks over a long period can be overwhelming even for a domain expert, requiring visual aggregation mechanisms that fit with the complex human cognitive activities involved in the process. This research studied how provenance-based reporting can be integrated into health data analytics software, using the example of the Atmolytics visual reporting tool. PMID:29888084

  1. Volttron version 5.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    VOLTTRON is an agent execution platform providing services to its agents that allow them to easily communicate with physical devices and other resources. VOLTTRON delivers an innovative distributed control and sensing software platform that supports modern control strategies, including agent-based and transaction-based controls. It enables mobile and stationary software agents to perform information gathering, processing, and control actions. VOLTTRON can independently manage a wide range of applications, such as HVAC systems, electric vehicles, distributed energy or entire building loads, leading to improved operational efficiency.

  2. Multiple IMU system test plan, volume 4. [subroutines for space shuttle requirements

    NASA Technical Reports Server (NTRS)

    Landey, M.; Vincent, K. T., Jr.; Whittredge, R. S.

    1974-01-01

    Operating procedures for this redundant system are described. A test plan is developed with two objectives. First, performance of the hardware and software delivered is demonstrated. Second, applicability of multiple IMU systems to the space shuttle mission is shown through detailed experiments with FDI algorithms and other multiple IMU software: gyrocompassing, calibration, and navigation. Gimbal flip is examined in light of its possible detrimental effects on FDI and navigation. For Vol. 3, see N74-10296.

  3. Hybrid methods for cybersecurity analysis :

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Warren Leon,; Dunlavy, Daniel M.

    2014-01-01

    Early 2010 saw a significant change in adversarial techniques aimed at network intrusion: a shift from malware delivered via email attachments toward the use of hidden, embedded hyperlinks to initiate sequences of downloads and interactions with web sites and network servers containing malicious software. Enterprise security groups were well poised and experienced in defending against the former attacks, but the new types of attacks were larger in number, more challenging to detect, dynamic in nature, and required the development of new technologies and analytic capabilities. The Hybrid LDRD project was aimed at delivering new capabilities in large-scale data modeling and analysis to enterprise security operators and analysts and understanding the challenges of detection and prevention of emerging cybersecurity threats. Leveraging previous LDRD research efforts and capabilities in large-scale relational data analysis, large-scale discrete data analysis and visualization, and streaming data analysis, new modeling and analysis capabilities were quickly brought to bear on the problems in email phishing and spear phishing attacks in the Sandia enterprise security operational groups at the onset of the Hybrid project. As part of this project, a software development and deployment framework was created within the security analyst workflow tool sets to facilitate the delivery and testing of new capabilities as they became available, and machine learning algorithms were developed to address the challenge of dynamic threats. Furthermore, researchers from the Hybrid project were embedded in the security analyst groups for almost a full year, engaged in daily operational activities and routines, creating an atmosphere of trust and collaboration between the researchers and security personnel. The Hybrid project has altered the way that research ideas can be incorporated into the production environments of Sandia's enterprise security groups, reducing time to deployment from months and years to hours and days for the application of new modeling and analysis capabilities to emerging threats. The development and deployment framework has been generalized into the Hybrid Framework and incorporated into several LDRD, WFO, and DOE/CSL projects and proposals. And most importantly, the Hybrid project has provided Sandia security analysts with new, scalable, extensible analytic capabilities that have resulted in alerts not detectable using their previous workflow tool sets.

  4. The AIROPA software package: milestones for testing general relativity in the strong gravity regime with AO

    NASA Astrophysics Data System (ADS)

    Witzel, Gunther; Lu, Jessica R.; Ghez, Andrea M.; Martinez, Gregory D.; Fitzgerald, Michael P.; Britton, Matthew; Sitarski, Breann N.; Do, Tuan; Campbell, Randall D.; Service, Maxwell; Matthews, Keith; Morris, Mark R.; Becklin, E. E.; Wizinowich, Peter L.; Ragland, Sam; Doppmann, Greg; Neyman, Chris; Lyke, James; Kassis, Marc; Rizzi, Luca; Lilley, Scott; Rampy, Rachel

    2016-07-01

    General relativity can be tested in the strong gravity regime by monitoring stars orbiting the supermassive black hole at the Galactic Center with adaptive optics. However, the limiting source of uncertainty is the spatial PSF variability due to atmospheric anisoplanatism and instrumental aberrations. The Galactic Center Group at UCLA has completed a project developing algorithms to predict PSF variability for Keck AO images. We have created a new software package (AIROPA), based on modified versions of StarFinder and Arroyo, that takes atmospheric turbulence profiles, instrumental aberration maps, and images as inputs and delivers improved photometry and astrometry on crowded fields. This software package will be made publicly available soon.

  5. Delivering Science on Day One

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Timothy J.

    2016-03-01

    While benchmarking software is useful for testing the performance limits and stability of Argonne National Laboratory’s new Theta supercomputer, there is no substitute for running real applications to explore the system’s potential. The Argonne Leadership Computing Facility’s Theta Early Science Program, modeled after its highly successful code migration program for the Mira supercomputer, has one primary aim: to deliver science on day one. Here is a closer look at the type of science problems that will be getting early access to Theta, a next-generation machine being rolled out this year.

  6. RoadLIFE GPS : software application for processing GPS data from US550 in northwestern New Mexico.

    DOT National Transportation Integrated Search

    2008-04-01

    Public-private partnerships as an alternative means of delivering goods and services are receiving increased attention as state departments of transportation consider ways to maximize limited resources. In 1998 the New Mexico Department of Transporta...

  7. Documents in Microform.

    ERIC Educational Resources Information Center

    Lyons, Janet, Ed.

    1976-01-01

    This issue of "Illinois Libraries" contains the papers delivered at a 1974 workshop on government publications in microform. Twelve articles focus on such issues as: 1) reasons to collect microforms; 2) criteria for selecting microform documents; 3) microform hardware and software; 4) procurement procedures; 5) bibliographic control; and…

  8. A Mobile-Based E-Learning System

    ERIC Educational Resources Information Center

    Ojokoh, Bolanle Adefowoke; Doyeni, Olubimtan Ayo; Adewale, Olumide Sunday; Isinkaye, Folasade Olubusola

    2013-01-01

    E-learning is an innovative approach for delivering electronically mediated, well-designed, learner-centred interactive learning environments by utilizing internet and digital technologies with respect to instructional design principles. This paper presents the application of Software Development techniques in the development of a Mobile Based…

  9. Low Cost Ways to Keep Software Current.

    ERIC Educational Resources Information Center

    Schultheis, Robert A.

    1992-01-01

    Discusses strategies for providing students with current computer software technology including acquiring previous versions of software, obtaining demonstration software, using student versions, getting examination software, buying from mail order firms, buying few copies, exploring site licenses, acquiring shareware or freeware, and applying for…

  10. Treatment planning and dose analysis for interstitial photodynamic therapy of prostate cancer

    NASA Astrophysics Data System (ADS)

    Davidson, Sean R. H.; Weersink, Robert A.; Haider, Masoom A.; Gertner, Mark R.; Bogaards, Arjen; Giewercer, David; Scherz, Avigdor; Sherar, Michael D.; Elhilali, Mostafa; Chin, Joseph L.; Trachtenberg, John; Wilson, Brian C.

    2009-04-01

    With the development of new photosensitizers that are activated by light at longer wavelengths, interstitial photodynamic therapy (PDT) is emerging as a feasible alternative for the treatment of larger volumes of tissue. Described here is the application of PDT treatment planning software developed by our group to ensure complete coverage of larger, geometrically complex target volumes such as the prostate. In a phase II clinical trial of TOOKAD vascular targeted photodynamic therapy (VTP) for prostate cancer in patients who failed prior radiotherapy, the software was used to generate patient-specific treatment prescriptions for the number of treatment fibres, their lengths, their positions and the energy each delivered. The core of the software is a finite element solution to the light diffusion equation. Validation against in vivo light measurements indicated that the software could predict the location of an iso-fluence contour to within approximately ±2 mm. The same software was used to reconstruct the treatments that were actually delivered, thereby providing an analysis of the threshold light dose required for TOOKAD-VTP of the post-irradiated prostate. The threshold light dose for VTP-induced prostate damage, as measured one week post-treatment using contrast-enhanced MRI, was found to be highly heterogeneous, both within and between patients. The minimum light dose received by 90% of the prostate, D90, was determined from each patient's dose-volume histogram and compared to six-month sextant biopsy results. No patient with a D90 less than 23 J cm⁻² had complete biopsy response, while 8/13 (62%) of patients with a D90 greater than 23 J cm⁻² had negative biopsies at six months. The doses received by the urethra and the rectal wall were also investigated.
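    D90 here is the minimum light dose received by the hottest 90% of the prostate volume, i.e. the 10th-percentile dose of the voxel-level distribution behind the dose-volume histogram. The short sketch below computes it from hypothetical per-voxel doses; the numbers are not the trial's data.

        import numpy as np

        def d90(voxel_doses):
            """Dose received by at least 90% of the volume (the 10th-percentile dose)."""
            return np.percentile(voxel_doses, 10)

        # Hypothetical per-voxel light doses (J/cm^2) inside a prostate contour.
        doses = np.random.default_rng(0).lognormal(mean=3.4, sigma=0.5, size=5000)
        print(f"D90 = {d90(doses):.1f} J/cm^2 (biopsy-response threshold cited above: 23 J/cm^2)")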

  11. Selecting wool-type fabrics for sensorial comfort in women office clothing for the cold season, using the multi-criteria decision analysis

    NASA Astrophysics Data System (ADS)

    Harpa, Rodica

    2017-10-01

    This article presents the strategy and the procedure used to achieve the declared goal: fabric selection, pursuing the sensorial comfort of a specific women's clothing item, by using multi-criteria decision analysis. First, the objective evaluation of seven wool-type woven fabrics, suitable to the quality profile expected for the defined destination, was accomplished. Then, a survey was conducted on a sample of 187 consumers, women aged between 18 and 60 years with a background in the textile field, regarding both the preferences manifested when purchasing products and the importance of various sensory perceptions when handling materials used in clothing products. Finally, the MCDM, applied through the previously developed STAT-ADM software, allowed the preferred wool-type fabric to be chosen in order to obtain the expected sensorial comfort of women's office trousers for the cold season, according to the previously established criteria. This overall approach, based on multi-criteria decision analysis and a rating scale delivered by customers who were knowledgeable in the textile field but not experts in fabric hand evaluation, showed good results in selecting fabrics that assure sensorial comfort in women's clothing.
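    STAT-ADM itself is not described in this abstract, so the snippet below is only a generic weighted-sum illustration of multi-criteria ranking: each fabric gets normalized scores against a few sensorial criteria, survey-derived weights combine them, and the highest aggregate wins. The fabrics, criteria, scores, and weights are all hypothetical.

        import numpy as np

        # Hypothetical normalized scores (0-1, higher is better) for seven wool-type
        # fabrics against three sensorial criteria, e.g. smoothness, softness, warmth.
        fabrics = [f"F{i}" for i in range(1, 8)]
        scores = np.random.default_rng(1).uniform(0.3, 1.0, size=(7, 3))
        weights = np.array([0.5, 0.3, 0.2])      # survey-derived criterion weights (sum to 1)

        ranking = scores @ weights               # simple weighted-sum aggregation
        order = np.argsort(-ranking)
        for idx in order:
            print(f"{fabrics[idx]}: aggregate score {ranking[idx]:.3f}")
        print("preferred fabric:", fabrics[order[0]])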

  12. Research on an expert system for database operation of simulation-emulation math models. Volume 1, Phase 1: Results

    NASA Technical Reports Server (NTRS)

    Kawamura, K.; Beale, G. O.; Schaffer, J. D.; Hsieh, B. J.; Padalkar, S.; Rodriguez-Moscoso, J. J.

    1985-01-01

    The results of the first phase of Research on an Expert System for Database Operation of Simulation/Emulation Math Models are described. Techniques from artificial intelligence (AI) were brought to bear on task domains of interest to NASA Marshall Space Flight Center. One such domain is simulation of spacecraft attitude control systems. Two related software systems were developed and delivered to NASA. One was a generic simulation model for spacecraft attitude control, written in FORTRAN. The second was an expert system which understands the usage of a class of spacecraft attitude control simulation software and can assist the user in running the software. This NASA Expert Simulation System (NESS), written in LISP, contains general knowledge about digital simulation, specific knowledge about the simulation software, and self knowledge.

  13. Virtual Satellite

    NASA Technical Reports Server (NTRS)

    Hammrs, Stephan R.

    2008-01-01

    Virtual Satellite (VirtualSat) is a computer program that creates an environment that facilitates the development, verification, and validation of flight software for a single spacecraft or for multiple spacecraft flying in formation. In this environment, enhanced functionality and autonomy of navigation, guidance, and control systems of a spacecraft are provided by a virtual satellite, that is, a computational model that simulates the dynamic behavior of the spacecraft. Within this environment, it is possible to execute any associated software, the development of which could benefit from knowledge of, and possible interaction (typically, exchange of data) with, the virtual satellite. Examples of associated software include programs for simulating spacecraft power and thermal-management systems. This environment is independent of the flight hardware that will eventually host the flight software, making it possible to develop the software simultaneously with, or even before, the hardware is delivered. Optionally, by use of interfaces included in VirtualSat, actual hardware can be used in place of simulations. The flight software, coded in the C or C++ programming language, is compilable and loadable into VirtualSat without any special modifications. Thus, VirtualSat can serve as a relatively inexpensive software test-bed for development testing, integration, and post-launch maintenance of spacecraft flight software.

  14. A bridge role metric model for nodes in software networks.

    PubMed

    Li, Bo; Feng, Yanli; Ge, Shiyu; Li, Dashe

    2014-01-01

    A bridge role metric model is put forward in this paper. Compared with previous metric models, our solution of treating a large-scale object-oriented software system as a complex network is inherently more realistic. To acquire nodes and links in an undirected network, a new model is presented that captures the crucial connectivity of a module or the hub, instead of only centrality as in previous metric models. Two previous metric models are described for comparison. In addition, the relationship between the bridge role metric results and node degrees can be well fitted by a power law. The model represents many realistic characteristics of actual software structures, and a hydropower simulation system is taken as an example. This paper makes additional contributions to an accurate understanding of module design of software systems and is expected to be beneficial to software engineering practices.
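    A common way to check the power-law claim is a straight-line fit in log-log space; the sketch below does this with synthetic (degree, metric) pairs, since the paper's data are not reproduced here.

        import numpy as np

        # Synthetic pairs roughly following metric ~ c * degree**alpha with multiplicative noise.
        rng = np.random.default_rng(42)
        degree = rng.integers(1, 200, size=300).astype(float)
        metric = 0.7 * degree ** 1.3 * rng.lognormal(0.0, 0.1, size=degree.size)

        # Power law: log(metric) = log(c) + alpha * log(degree), so fit a line in log-log space.
        alpha, log_c = np.polyfit(np.log(degree), np.log(metric), deg=1)
        print(f"fitted exponent alpha ~ {alpha:.2f}, prefactor c ~ {np.exp(log_c):.2f}")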

  15. A Bridge Role Metric Model for Nodes in Software Networks

    PubMed Central

    Li, Bo; Feng, Yanli; Ge, Shiyu; Li, Dashe

    2014-01-01

    A bridge role metric model is put forward in this paper. Compared with previous metric models, our solution of treating a large-scale object-oriented software system as a complex network is inherently more realistic. To acquire nodes and links in an undirected network, a new model is presented that captures the crucial connectivity of a module or the hub, instead of only centrality as in previous metric models. Two previous metric models are described for comparison. In addition, the relationship between the metric results and node degrees can be well fitted by a power law. The model represents many realistic characteristics of actual software structures, and a hydropower simulation system is taken as an example. This paper makes additional contributions to an accurate understanding of module design of software systems and is expected to be beneficial to software engineering practices. PMID:25364938

  16. Implementation of workflow engine technology to deliver basic clinical decision support functionality.

    PubMed

    Huser, Vojtech; Rasmussen, Luke V; Oberg, Ryan; Starren, Justin B

    2011-04-10

    Workflow engine technology represents a new class of software with the ability to graphically model step-based knowledge. We present an application of this novel technology to the domain of clinical decision support. Successful implementation of decision support within an electronic health record (EHR) remains an unsolved research challenge. Previous research efforts were mostly based on healthcare-specific representation standards and execution engines and did not reach wide adoption. We focus on two challenges in decision support systems: the ability to test decision logic on retrospective data prior to prospective deployment, and the challenge of user-friendly representation of clinical logic. We present our implementation of a workflow engine technology that addresses these two challenges in delivering clinical decision support. Our system is based on the cross-industry XML (extensible markup language) Process Definition Language (XPDL) standard. The core components of the system are a workflow editor for modeling clinical scenarios and a workflow engine for execution of those scenarios. We demonstrate, with an open-source and publicly available workflow suite, that clinical decision support logic can be executed on retrospective data. The same flowchart-based representation can also function in a prospective mode where the system can be integrated with an EHR system and respond to real-time clinical events. We limit the scope of our implementation to decision support content generation (which can be EHR system vendor independent). We do not focus on supporting complex decision support content delivery mechanisms due to the lack of standardization of EHR systems in this area. We present results of our evaluation of the flowchart-based graphical notation as well as an architectural evaluation of our implementation using an established evaluation framework for clinical decision support architecture. We describe an implementation of a free workflow technology software suite (available at http://code.google.com/p/healthflow) and its application in the domain of clinical decision support. Our implementation seamlessly supports clinical logic testing on retrospective data and offers a user-friendly knowledge representation paradigm. With the presented software implementation, we demonstrate that workflow engine technology can provide a decision support platform which evaluates well against an established clinical decision support architecture evaluation framework. Due to cross-industry usage of workflow engine technology, we can expect significant future functionality enhancements that will further improve the technology's capacity to serve as a clinical decision support platform.

  17. Using Free Internet Videogames in Upper Extremity Motor Training for Children with Cerebral Palsy.

    PubMed

    Sevick, Marisa; Eklund, Elizabeth; Mensch, Allison; Foreman, Matthew; Standeven, John; Engsberg, Jack

    2016-06-07

    Movement therapy is one type of upper extremity intervention for children with cerebral palsy (CP) to improve function. It requires high-intensity, repetitive and task-specific training. Tedium and lack of motivation are substantial barriers to completing the training. An approach to overcome these barriers is to couple the movement therapy with videogames. This investigation: (1) tested the feasibility of delivering a free Internet videogame upper extremity motor intervention to four children with CP (aged 8-17 years) with mild to moderate limitations to upper limb function; and (2) determined the level of intrinsic motivation during the intervention. The intervention used free Internet videogames in conjunction with the Microsoft Kinect motion sensor and the Flexible Action and Articulated Skeleton Toolkit (FAAST) software. Results indicated that the intervention could be successfully delivered in the laboratory and the home, and that pre- and post-intervention assessments of impairment, function and performance were possible. Results also indicated a high level of motivation among the participants. It was concluded that the use of inexpensive hardware and software in conjunction with free Internet videogames has the potential to be very motivating in helping to improve the upper extremity abilities of children with CP. Future work should include results from additional participants and from a control group in a randomized controlled trial to establish efficacy.

  18. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics

    PubMed Central

    2016-01-01

    Background: We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. Objective: To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. Methods: The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Results: Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. Conclusions: IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise. PMID:27729304
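    The measures the review notes as missing from IBMWA output (sensitivity, specificity, odds ratio, confusion matrix) are straightforward to recover once predictions are exported; the sketch below computes them from hypothetical binary labels and predictions, independent of any particular analytics product.

        import numpy as np

        # Hypothetical binary outcomes and model predictions exported from an analytics tool.
        y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])
        y_pred = np.array([1, 0, 0, 1, 0, 1, 1, 0, 1, 0])

        tp = int(np.sum((y_true == 1) & (y_pred == 1)))   # true positives
        tn = int(np.sum((y_true == 0) & (y_pred == 0)))   # true negatives
        fp = int(np.sum((y_true == 0) & (y_pred == 1)))   # false positives
        fn = int(np.sum((y_true == 1) & (y_pred == 0)))   # false negatives

        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        odds_ratio = (tp * tn) / (fp * fn) if fp and fn else float("inf")
        print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} odds ratio={odds_ratio:.1f}")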

  19. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.

    PubMed

    Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita

    2016-10-11

    We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise.

  20. Delivering real-time status and arrival information to commuter rail passengers at complex stations

    DOT National Transportation Integrated Search

    2003-08-01

    Software was developed for calculating real-time train status in an Automated Train Information Display System (ATIDS) at NJ Transit. Interfaces were developed for passing schedules and real-time train position and routing data from a rail traffic co...

  1. Realistic 3D computer model of the gerbil middle ear, featuring accurate morphology of bone and soft tissue structures.

    PubMed

    Buytaert, Jan A N; Salih, Wasil H M; Dierick, Manual; Jacobs, Patric; Dirckx, Joris J J

    2011-12-01

    In order to improve realism in middle ear (ME) finite-element modeling (FEM), comprehensive and precise morphological data are needed. To date, micro-scale X-ray computed tomography (μCT) recordings have been used as geometric input data for FEM models of the ME ossicles. Previously, attempts were made to obtain these data on ME soft tissue structures as well. However, due to low X-ray absorption of soft tissue, quality of these images is limited. Another popular approach is using histological sections as data for 3D models, delivering high in-plane resolution for the sections, but the technique is destructive in nature and registration of the sections is difficult. We combine data from high-resolution μCT recordings with data from high-resolution orthogonal-plane fluorescence optical-sectioning microscopy (OPFOS), both obtained on the same gerbil specimen. State-of-the-art μCT delivers high-resolution data on the 3D shape of ossicles and other ME bony structures, while the OPFOS setup generates data of unprecedented quality both on bone and soft tissue ME structures. Each of these techniques is tomographic and non-destructive and delivers sets of automatically aligned virtual sections. The datasets coming from different techniques need to be registered with respect to each other. By combining both datasets, we obtain a complete high-resolution morphological model of all functional components in the gerbil ME. The resulting 3D model can be readily imported in FEM software and is made freely available to the research community. In this paper, we discuss the methods used, present the resulting merged model, and discuss the morphological properties of the soft tissue structures, such as muscles and ligaments.
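    The abstract does not name the registration algorithm used to bring the μCT and OPFOS datasets into the same frame, so as a hedged illustration of one standard option, the sketch below aligns corresponding landmarks picked in both datasets with a least-squares rigid (Kabsch) fit; the landmark coordinates are invented for the example.

        import numpy as np

        def kabsch(P, Q):
            """Least-squares rotation R and translation t mapping landmark set P onto Q."""
            Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)
            U, _, Vt = np.linalg.svd(Pc.T @ Qc)
            d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against an improper rotation
            R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
            t = Q.mean(axis=0) - R @ P.mean(axis=0)
            return R, t

        # Hypothetical corresponding landmarks (e.g. ossicle features) picked in both datasets.
        ct_pts = np.array([[0.0, 0.0, 0.0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
        rot_90z = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
        opfos_pts = ct_pts @ rot_90z.T + np.array([0.5, -0.2, 0.1])

        R, t = kabsch(ct_pts, opfos_pts)
        residual = ct_pts @ R.T + t - opfos_pts
        print(f"post-registration RMS landmark error: {np.sqrt((residual ** 2).sum(axis=1).mean()):.2e}")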

  2. Extracorporeal Stimulation of Sacral Nerve Roots for Observation of Pelvic Autonomic Nerve Integrity: Description of a Novel Methodological Setup.

    PubMed

    Moszkowski, Tomasz; Kauff, Daniel W; Wegner, Celine; Ruff, Roman; Somerlik-Fuchs, Karin H; Kruger, Thilo B; Augustyniak, Piotr; Hoffmann, Klaus-Peter; Kneist, Werner

    2018-03-01

    Neurophysiologic monitoring can improve autonomic nerve sparing during critical phases of rectal cancer surgery. To develop a system for extracorporeal stimulation of sacral nerve roots. Dedicated software controlled a ten-electrode stimulation array by switching between different electrode configurations and current levels. A built-in impedance and current level measurement assessed the effectiveness of current injection. Intra-anal surface electromyography (sEMG) informed on targeting the sacral nerve roots. All tests were performed on five pig specimens. During switching between electrode configurations, the system delivered 100% of the set current (25 mA, 30 Hz, 200 μs cathodic pulses) in 93% of 250 stimulation trains across all specimens. The impedance measured between single stimulation array contacts and corresponding anodes across all electrode configurations and specimens equaled 3.7 ± 2.5 kΩ. The intra-anal sEMG recorded a signal amplitude increase as previously observed in the literature. When the stimulation amplitude was tested in the range from 1 to 21 mA using the interconnected contacts of the stimulation array and the intra-anal anode, the impedance remained below 250 Ω and the system delivered 100% of the set current in all cases. Intra-anal sEMG showed an amplitude increase for current levels exceeding 6 mA. The system delivered stable electric current, which was proved by built-in impedance and current level measurements. Intra-anal sEMG confirmed the ability to target the branches of the autonomous nervous system originating from the sacral nerve roots. Stimulation outside of the operative field during rectal cancer surgery is feasible and may improve the practicality of pelvic intraoperative neuromonitoring.

  3. Enabling drug discovery project decisions with integrated computational chemistry and informatics

    NASA Astrophysics Data System (ADS)

    Tsui, Vickie; Ortwine, Daniel F.; Blaney, Jeffrey M.

    2017-03-01

    Computational chemistry/informatics scientists and software engineers in Genentech Small Molecule Drug Discovery collaborate with experimental scientists in a therapeutic project-centric environment. Our mission is to enable and improve pre-clinical drug discovery design and decisions. Our goal is to deliver timely data, analysis, and modeling to our therapeutic project teams using best-in-class software tools. We describe our strategy, the organization of our group, and our approaches to reach this goal. We conclude with a summary of the interdisciplinary skills required for computational scientists and recommendations for their training.

  4. Developing a strategy for computational lab skills training through Software and Data Carpentry: Experiences from the ELIXIR Pilot action

    PubMed Central

    Pawlik, Aleksandra; van Gelder, Celia W.G.; Nenadic, Aleksandra; Palagi, Patricia M.; Korpelainen, Eija; Lijnzaad, Philip; Marek, Diana; Sansone, Susanna-Assunta; Hancock, John; Goble, Carole

    2017-01-01

    Quality training in computational skills for life scientists is essential to allow them to deliver robust, reproducible and cutting-edge research. A pan-European bioinformatics programme, ELIXIR, has adopted a well-established and progressive programme of computational lab and data skills training from Software and Data Carpentry, aimed at increasing the number of skilled life scientists and building a sustainable training community in this field. This article describes the Pilot action, which introduced the Carpentry training model to the ELIXIR community. PMID:28781745

  5. Developing a strategy for computational lab skills training through Software and Data Carpentry: Experiences from the ELIXIR Pilot action.

    PubMed

    Pawlik, Aleksandra; van Gelder, Celia W G; Nenadic, Aleksandra; Palagi, Patricia M; Korpelainen, Eija; Lijnzaad, Philip; Marek, Diana; Sansone, Susanna-Assunta; Hancock, John; Goble, Carole

    2017-01-01

    Quality training in computational skills for life scientists is essential to allow them to deliver robust, reproducible and cutting-edge research. A pan-European bioinformatics programme, ELIXIR, has adopted a well-established and progressive programme of computational lab and data skills training from Software and Data Carpentry, aimed at increasing the number of skilled life scientists and building a sustainable training community in this field. This article describes the Pilot action, which introduced the Carpentry training model to the ELIXIR community.

  6. Students' Different Understandings of Class Diagrams

    ERIC Educational Resources Information Center

    Boustedt, Jonas

    2012-01-01

    The software industry needs well-trained software designers and one important aspect of software design is the ability to model software designs visually and understand what visual models represent. However, previous research indicates that software design is a difficult task to many students. This article reports empirical findings from a…

  7. Cost-Sensitive Radial Basis Function Neural Network Classifier for Software Defect Prediction

    PubMed Central

    Venkatesan, R.

    2016-01-01

    Effective prediction of defect-prone software modules will enable software developers to achieve efficient allocation of resources and to concentrate on quality assurance activities. The software development life cycle basically includes design, analysis, implementation, testing, and release phases. Generally, software testing is a critical task in the software development process, wherein the aim is to save time and budget by detecting defects as early as possible and to deliver a product without defects to the customers. This testing phase should be carefully operated in an effective manner to release a defect-free (bug-free) software product to the customers. In order to improve the software testing process, fault prediction methods identify the software parts that are most likely to be defect-prone. This paper proposes a prediction approach based on a conventional radial basis function neural network (RBFNN) and the novel adaptive dimensional biogeography based optimization (ADBBO) model. The developed ADBBO-based RBFNN model is tested with five publicly available datasets from the NASA data program repository. The computed results prove the effectiveness of the proposed ADBBO-RBFNN classifier approach with respect to the considered metrics in comparison with the early predictors available in the literature for the same datasets. PMID:27738649
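    The ADBBO optimizer is not reproduced here, but the underlying RBFNN classifier is easy to sketch: inputs pass through Gaussian basis functions centred on prototype points, and a linear output layer turns the activations into a defect score. The sketch below uses random centres and least-squares output weights on synthetic module metrics; all data and settings are assumptions for illustration.

        import numpy as np

        def rbf_features(X, centers, gamma):
            """Gaussian RBF activation of every sample against every centre."""
            d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
            return np.exp(-gamma * d2)

        rng = np.random.default_rng(0)
        # Hypothetical module metrics (e.g. size, complexity) and defect labels.
        X = rng.normal(size=(200, 2))
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.3, size=200) > 0).astype(float)

        # Pick RBF centres as a random subset of samples (a stand-in for ADBBO's search).
        centers = X[rng.choice(len(X), size=10, replace=False)]
        Phi = np.c_[rbf_features(X, centers, gamma=1.0), np.ones(len(X))]

        # Fit the linear output layer by least squares, a common RBFNN training shortcut.
        w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
        pred = (Phi @ w > 0.5).astype(float)
        print(f"training accuracy: {np.mean(pred == y):.2f}")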

  8. Cost-Sensitive Radial Basis Function Neural Network Classifier for Software Defect Prediction.

    PubMed

    Kumudha, P; Venkatesan, R

    Effective prediction of defect-prone software modules will enable software developers to achieve efficient allocation of resources and to concentrate on quality assurance activities. The software development life cycle basically includes design, analysis, implementation, testing, and release phases. Generally, software testing is a critical task in the software development process, wherein the aim is to save time and budget by detecting defects as early as possible and to deliver a product without defects to the customers. This testing phase should be carefully operated in an effective manner to release a defect-free (bug-free) software product to the customers. In order to improve the software testing process, fault prediction methods identify the software parts that are most likely to be defect-prone. This paper proposes a prediction approach based on a conventional radial basis function neural network (RBFNN) and the novel adaptive dimensional biogeography based optimization (ADBBO) model. The developed ADBBO-based RBFNN model is tested with five publicly available datasets from the NASA data program repository. The computed results prove the effectiveness of the proposed ADBBO-RBFNN classifier approach with respect to the considered metrics in comparison with the early predictors available in the literature for the same datasets.

  9. Technical Note: scuda: A software platform for cumulative dose assessment.

    PubMed

    Park, Seyoun; McNutt, Todd; Plishker, William; Quon, Harry; Wong, John; Shekhar, Raj; Lee, Junghoon

    2016-10-01

    Accurate tracking of anatomical changes and computation of actually delivered dose to the patient are critical for successful adaptive radiation therapy (ART). Additionally, efficient data management and fast processing are practically important for the adoption in clinic as ART involves a large amount of image and treatment data. The purpose of this study was to develop an accurate and efficient Software platform for CUmulative Dose Assessment (scuda) that can be seamlessly integrated into the clinical workflow. scuda consists of deformable image registration (DIR), segmentation, dose computation modules, and a graphical user interface. It is connected to our image PACS and radiotherapy informatics databases from which it automatically queries/retrieves patient images, radiotherapy plan, beam data, and daily treatment information, thus providing an efficient and unified workflow. For accurate registration of the planning CT and daily CBCTs, the authors iteratively correct CBCT intensities by matching local intensity histograms during the DIR process. Contours of the target tumor and critical structures are then propagated from the planning CT to daily CBCTs using the computed deformations. The actual delivered daily dose is computed using the registered CT and patient setup information by a superposition/convolution algorithm, and accumulated using the computed deformation fields. Both DIR and dose computation modules are accelerated by a graphics processing unit. The cumulative dose computation process has been validated on 30 head and neck (HN) cancer cases, showing 3.5 ± 5.0 Gy (mean±STD) absolute mean dose differences between the planned and the actually delivered doses in the parotid glands. On average, DIR, dose computation, and segmentation take 20 s/fraction and 17 min for a 35-fraction treatment including additional computation for dose accumulation. The authors developed a unified software platform that provides accurate and efficient monitoring of anatomical changes and computation of actually delivered dose to the patient, thus realizing an efficient cumulative dose computation workflow. Evaluation on HN cases demonstrated the utility of our platform for monitoring the treatment quality and detecting significant dosimetric variations that are keys to successful ART.
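    The dose-accumulation step described above maps each daily dose back to the planning anatomy with the computed deformation field and sums over fractions. As a deliberately over-simplified, one-dimensional stand-in for that warping-and-summing step (not the GPU implementation in scuda), consider:

        import numpy as np

        def warp_dose_1d(daily_dose, deformation_mm, voxel_mm=1.0):
            """Resample a daily dose onto the planning grid with a 1-D deformation field.

            deformation_mm[i] is the displacement taking planning voxel i to its
            position in the daily image (a toy stand-in for a full 3-D DVF).
            """
            grid = np.arange(daily_dose.size) * voxel_mm
            return np.interp(grid + deformation_mm, grid, daily_dose)

        rng = np.random.default_rng(0)
        accumulated = np.zeros(100)
        for fraction in range(35):                                        # e.g. a 35-fraction course
            daily = 2.0 * np.exp(-((np.arange(100) - 50) ** 2) / 200.0)   # toy 2 Gy daily dose profile
            dvf = rng.normal(scale=1.5, size=100)                         # toy daily deformation (mm)
            accumulated += warp_dose_1d(daily, dvf)

        print(f"accumulated maximum dose ~ {accumulated.max():.1f} Gy over 35 fractions")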

  10. Technical Note: SCUDA: A software platform for cumulative dose assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Seyoun; McNutt, Todd; Quon, Harry

    Purpose: Accurate tracking of anatomical changes and computation of actually delivered dose to the patient are critical for successful adaptive radiation therapy (ART). Additionally, efficient data management and fast processing are practically important for the adoption in clinic as ART involves a large amount of image and treatment data. The purpose of this study was to develop an accurate and efficient Software platform for CUmulative Dose Assessment (SCUDA) that can be seamlessly integrated into the clinical workflow. Methods: SCUDA consists of deformable image registration (DIR), segmentation, dose computation modules, and a graphical user interface. It is connected to our image PACS and radiotherapy informatics databases from which it automatically queries/retrieves patient images, radiotherapy plan, beam data, and daily treatment information, thus providing an efficient and unified workflow. For accurate registration of the planning CT and daily CBCTs, the authors iteratively correct CBCT intensities by matching local intensity histograms during the DIR process. Contours of the target tumor and critical structures are then propagated from the planning CT to daily CBCTs using the computed deformations. The actual delivered daily dose is computed using the registered CT and patient setup information by a superposition/convolution algorithm, and accumulated using the computed deformation fields. Both DIR and dose computation modules are accelerated by a graphics processing unit. Results: The cumulative dose computation process has been validated on 30 head and neck (HN) cancer cases, showing 3.5 ± 5.0 Gy (mean±STD) absolute mean dose differences between the planned and the actually delivered doses in the parotid glands. On average, DIR, dose computation, and segmentation take 20 s/fraction and 17 min for a 35-fraction treatment including additional computation for dose accumulation. Conclusions: The authors developed a unified software platform that provides accurate and efficient monitoring of anatomical changes and computation of actually delivered dose to the patient, thus realizing an efficient cumulative dose computation workflow. Evaluation on HN cases demonstrated the utility of our platform for monitoring the treatment quality and detecting significant dosimetric variations that are keys to successful ART.

  11. Pathways to lean software development: An analysis of effective methods of change

    NASA Astrophysics Data System (ADS)

    Hanson, Richard D.

    This qualitative Delphi study explored the challenges that exist in delivering software on time, within budget, and with the original scope identified. The literature review identified many attempts over the past several decades to reform the methods used to develop software. These attempts found that the classical waterfall method, which is firmly entrenched in American business today, was to blame for this difficulty (Chatterjee, 2010). Proponents of each of the new methods sought to remove waste, lighten the process, and implement lean principles in software development. Through this study, the experts evaluated the barriers to effective development principles and defined the leadership qualities necessary to overcome these barriers. The barriers identified were resistance to change, risk and reward issues, and lack of management buy-in. Thirty experts in software development from several Fortune 500 companies across the United States explored each of these issues in detail. The conclusion reached by these experts was that visionary leadership is necessary to overcome these challenges.

  12. Cloud Computing for the Grid: GridControl: A Software Platform to Support the Smart Grid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    GENI Project: Cornell University is creating a new software platform for grid operators called GridControl that will utilize cloud computing to more efficiently control the grid. In a cloud computing system, there are minimal hardware and software demands on users. The user can tap into a network of computers that is housed elsewhere (the cloud) and the network runs computer applications for the user. The user only needs interface software to access all of the cloud’s data resources, which can be as simple as a web browser. Cloud computing can reduce costs, facilitate innovation through sharing, empower users, and improve the overall reliability of a dispersed system. Cornell’s GridControl will focus on 4 elements: delivering the state of the grid to users quickly and reliably; building networked, scalable grid-control software; tailoring services to emerging smart grid uses; and simulating smart grid behavior under various conditions.

  13. Factors Influencing Micro-Enterprises' Information Technology Adoption

    ERIC Educational Resources Information Center

    Song, Changsoo

    2014-01-01

    Public and non-profit organizations are operating different types of programs to help micro-enterprises appropriately adopt and utilize information technology (IT) for their businesses. Some programs provide mentoring or consultation services; some simply deliver discounted hardware and software; and some offer training services. However, it is…

  14. Making Your Blackboard Courses Talk!

    ERIC Educational Resources Information Center

    Burcham, Tim M.

    This presentation shows how to deliver audio/video (AV) lectures to online students using relatively inexpensive AV software (i.e., Camtasia Studio) and the standard Blackboard interface. The first section describes two types of production programs: presentation media converters and screen capture utilities. The second section covers making an AV…

  15. An Instructional Satellite System for the United States: Preliminary Considerations.

    ERIC Educational Resources Information Center

    DuMolin, James R.; Morgan, Robert P.

    Based on educational, social, political, and other considerations, an instructional satellite system, AVSIN (Audio-Visual Satellite Instruction), is hypothesized which represents one possible organizational and administrative arrangement for delivering large amounts of quality software to schools and learning centers. The AVSIN system is conceived…

  16. Web-Based Teacher Training and Coaching/Feedback: A Case Study

    ERIC Educational Resources Information Center

    Wilczynski, Susan M.; Labrie, Allison; Baloski, Ann; Kaake, Amanda; Marchi, Nick; Zoder-Martell, Kimberly

    2017-01-01

    The present case study evaluated web-based training with coaching and feedback delivered through videoconferencing software to increase teacher use of behavioral methods associated with increased compliance. The participant, a preschool special education teacher, increased both her knowledge of efficacious interventions for autism spectrum…

  17. Engineering specification and system design for CAD/CAM of custom shoes: UMC project effort

    NASA Technical Reports Server (NTRS)

    Bao, Han P.

    1991-01-01

    The goal of this project is to supplement the footwear design system of North Carolina State University (NCSU) with a software module to design and manufacture a combination sole. The four areas of concentration were: customization of NASCAD (NASA Computer Aided Design) to the footwear project; use of CENCIT data; computer aided manufacturing activities; and beginning work for the bottom elements of shoes. The task of generating a software module for producing a sole was completed with a demonstrated product realization. The software written in C was delivered to NCSU for inclusion in their design system for custom footwear known as LASTMOD. The machining process of the shoe last was improved using a spiral tool path approach.

  18. DigiSeis—A software component for digitizing seismic signals using the PC sound card

    NASA Astrophysics Data System (ADS)

    Amin Khan, Khalid; Akhter, Gulraiz; Ahmad, Zulfiqar

    2012-06-01

    An innovative software-based approach to developing an inexpensive experimental seismic recorder is presented. This approach requires no additional hardware, as the built-in PC sound card is used to digitize the seismic signals. DigiSeis, an ActiveX component, is developed to capture the digitized seismic signals from the sound card and deliver them to applications for processing and display. A seismic recorder application, SeisWave, is developed on top of this component; it provides real-time monitoring and display of seismic events picked up by a pair of external geophones. This recorder can be used as an educational aid for conducting seismic experiments. It can also be connected to suitable seismic sensors to record earthquakes. The software application and the ActiveX component are available for download. The component can be used to develop seismic recording applications according to user-specific requirements.
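
    For readers who want to experiment with the same sound-card idea, a minimal Python sketch is shown below. It is not the DigiSeis/SeisWave code (which is a Windows ActiveX component); it assumes the third-party sounddevice package and treats the two stereo line-in channels as a pair of geophones.

```python
# Illustrative sketch only: capture two "geophone" channels from the PC sound
# card, mirroring the DigiSeis idea (this is not the authors' ActiveX code).
# Assumes the third-party 'sounddevice' package and a stereo line-in.
import numpy as np
import sounddevice as sd

FS = 44100          # sound-card sampling rate in Hz
DURATION = 5.0      # seconds to record

# Record two channels (left/right line-in, one per geophone).
samples = sd.rec(int(FS * DURATION), samplerate=FS, channels=2, dtype="float32")
sd.wait()  # block until the recording is finished

for ch in range(samples.shape[1]):
    trace = samples[:, ch]
    print(f"channel {ch}: peak amplitude {np.abs(trace).max():.4f}, "
          f"RMS {np.sqrt(np.mean(trace**2)):.4f}")
```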

  19. An experimental microcomputer controlled system for synchronized pulsating anti-gravity suit.

    PubMed

    Moore, T W; Foley, J; Reddy, B R; Kepics, F; Jaron, D

    1987-07-01

    An experimental system to deliver synchronized external pressure pulsations to the lower body is described in this technical note. The system is designed using a microcomputer with a real time interface and an electro-pneumatic subsystem capable of delivering pressure pulses to a modified anti-G suit at a fast rate. It is versatile, containing many options for synchronizing, phasing and sequencing of the pressure pulsations and controlling the pressure level in the suit bladders. Details of its software and hardware are described along with the results of initial testing in a Dynamic Flight Simulator on human volunteers.

  20. Antenna pattern study, task 2

    NASA Technical Reports Server (NTRS)

    Harper, Warren

    1989-01-01

    Two electromagnetic scattering codes, NEC-BSC and ESP3, were delivered and installed on a NASA VAX computer for use by Marshall Space Flight Center antenna design personnel. The existing codes and certain supplementary software were updated; the codes were installed on a computer to be delivered to the customer; capability was provided for graphic display of the data computed with the codes; and the customer was assisted in the solution of specific problems that demonstrate the use of the codes. With the exception of one code revision, all of these tasks were performed.

  1. The use of lower resolution viewing devices for mammographic interpretation: implications for education and training.

    PubMed

    Chen, Yan; James, Jonathan J; Turnbull, Anne E; Gale, Alastair G

    2015-10-01

    To establish whether lower resolution, lower cost viewing devices have the potential to deliver mammographic interpretation training. On three occasions over eight months, fourteen consultant radiologists and reporting radiographers read forty challenging digital mammography screening cases on three different displays: a digital mammography workstation, a standard LCD monitor, and a smartphone. Standard image manipulation software was available for use on all three devices. Receiver operating characteristic (ROC) analysis and ANOVA (Analysis of Variance) were used to determine the significance of differences in performance between the viewing devices with/without the application of image manipulation software. The effect of reader's experience was also assessed. Performance was significantly higher (p < .05) on the mammography workstation compared to the other two viewing devices. When image manipulation software was applied to images viewed on the standard LCD monitor, performance improved to mirror levels seen on the mammography workstation with no significant difference between the two. Image interpretation on the smartphone was uniformly poor. Film reader experience had no significant effect on performance across all three viewing devices. Lower resolution standard LCD monitors combined with appropriate image manipulation software are capable of displaying mammographic pathology, and are potentially suitable for delivering mammographic interpretation training. • This study investigates potential devices for training in mammography interpretation. • Lower resolution standard LCD monitors are potentially suitable for mammographic interpretation training. • The effect of image manipulation tools on mammography workstation viewing is insignificant. • Reader experience had no significant effect on performance in all viewing devices. • Smart phones are not suitable for displaying mammograms.
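
    The device comparison described above rests on ROC analysis. The snippet below is a minimal sketch of that kind of per-device AUC computation on hypothetical reader confidence scores; it assumes scikit-learn and uses made-up data, not the study's readings.

```python
# Minimal sketch of the ROC comparison described above, using hypothetical
# reader confidence scores (not the study's data). Assumes scikit-learn.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
truth = rng.integers(0, 2, size=40)              # 40 cases: 1 = malignancy present

# Hypothetical 1-5-style confidence ratings per viewing device.
scores = {
    "workstation": truth * 2.0 + rng.normal(2.5, 0.8, 40),
    "lcd_monitor": truth * 1.5 + rng.normal(2.5, 1.0, 40),
    "smartphone":  truth * 0.5 + rng.normal(2.5, 1.2, 40),
}

for device, s in scores.items():
    print(f"{device:12s} AUC = {roc_auc_score(truth, s):.3f}")
```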

  2. Telemedicine using free voice over internet protocol (VoIP) technology.

    PubMed

    Miller, David J; Miljkovic, Nikola; Chiesa, Chad; Callahan, John B; Webb, Brad; Boedeker, Ben H

    2011-01-01

    Though dedicated videoteleconference (VTC) systems deliver high quality, low-latency audio and video for telemedical applications, they require expensive hardware and extensive infrastructure. The purpose of this study was to investigate free commercially available Voice over Internet Protocol (VoIP) software as a low cost alternative for telemedicine.

  3. Cloud Computing. Technology Briefing. Number 1

    ERIC Educational Resources Information Center

    Alberta Education, 2013

    2013-01-01

    Cloud computing is Internet-based computing in which shared resources, software and information are delivered as a service that computers or mobile devices can access on demand. Cloud computing is already used extensively in education. Free or low-cost cloud-based services are used daily by learners and educators to support learning, social…

  4. Creating Simulations for An "Introduction to Research Methods" Course

    ERIC Educational Resources Information Center

    Adamson, Bob

    2010-01-01

    This paper describes the production of a software program for Master of Education students studying the "Introduction to Research Methods" course at a tertiary institution in Hong Kong. The course was originally delivered in a lecture mode, which proved unsatisfactory in providing sufficient learning support for the students. The paper…

  5. Customized News in Your Mailbox.

    ERIC Educational Resources Information Center

    Rudich, Joe

    1996-01-01

    Customized Internet services deliver news and selected research via e-mail, fax, Web browser, or their own software. Some are clipping services while others are full-fledged online newspapers. Most charge a monthly subscription fee, but a few are free to registered users. Provides the addresses, cost, scope, and evaluation of eight services. (PEN)

  6. Web Page Design in Distance Education

    ERIC Educational Resources Information Center

    Isman, Aytekin; Dabaj, Fahme; Gumus, Agah; Altinay, Fahriye; Altinay, Zehra

    2004-01-01

    Distance education is a contemporary form of education. It facilitates fast, easy delivery of information through concrete hardware and software tools. The development of high technology, the Internet, and web design has made them effective delivery systems for students. Within the global perspective, even all the work…

  7. Web Page Design in Distance Education

    ERIC Educational Resources Information Center

    Isman, Aytekin; Dabaj, Fahme; Gumus, Agah; Altinay, Fahriye; Altinay, Zehra

    2004-01-01

    Distance education is a contemporary form of education. It facilitates fast, easy delivery of information through concrete hardware and software tools. The development of high technology, the Internet, and web design has made them effective delivery systems for students. Within the global perspective, even all the work…

  8. Helping Students Adapt to Computer-Based Encrypted Examinations

    ERIC Educational Resources Information Center

    Baker-Eveleth, Lori; Eveleth, Daniel M.; O'Neill, Michele; Stone, Robert W.

    2006-01-01

    The College of Business and Economics at the University of Idaho conducted a pilot study that used commercially available encryption software called Securexam to deliver computer-based examinations. A multi-step implementation procedure was developed, implemented, and then evaluated on the basis of what students viewed as valuable. Two key aspects…

  9. Application of Tablet PCs to Lecture Demonstrations on Optical Mineralogy

    ERIC Educational Resources Information Center

    Hoisch, Thomas D.; Austin, Barbara A.; Newell, Shawn L.; Manone, Mark F.

    2010-01-01

    Learning optical mineralogy requires students to integrate a complex theory with microscope manipulations and image interpretation. To assist student learning, we performed lecture demonstrations during which digital photomicrographs were taken and delivered to students using Tablet PCs, whereupon they were imported into note-taking software and…

  10. "Dropbox" Brings Course Management Back to Teachers

    ERIC Educational Resources Information Center

    Niles, Thaddeus M.

    2013-01-01

    Course management software (CMS) allows teachers to deliver content electronically and manage collaborative coursework, either blending with face-to-face interactions or as the core of an entirely virtual classroom environment. CMS often takes the form of an electronic storehouse of course materials with which students can interact, a virtual…

  11. Ubiquitous Wireless Laptops in Upper Elementary Mathematics

    ERIC Educational Resources Information Center

    Clariana, Roy

    2009-01-01

    This quasi-experimental investigation considers the second year of implementation of wireless laptops (1:1 ratio) in three 6th grade mathematics classrooms in one school compared to non-laptop classrooms (5:1 ratio) in seven other schools in the district. Comprehensive mathematics software from CompassLearning delivered via the internet was…

  12. Visual-Auditory Integration during Speech Imitation in Autism

    ERIC Educational Resources Information Center

    Williams, Justin H. G.; Massaro, Dominic W.; Peel, Natalie J.; Bosseler, Alexis; Suddendorf, Thomas

    2004-01-01

    Children with autistic spectrum disorder (ASD) may have poor audio-visual integration, possibly reflecting dysfunctional "mirror neuron" systems which have been hypothesised to be at the core of the condition. In the present study, a computer program, utilizing speech synthesizer software and a "virtual" head (Baldi), delivered speech stimuli for…

  13. The Impact of Graphic Organisers on Learning from Presentations

    ERIC Educational Resources Information Center

    Casteleyn, Jordi; Mottart, André; Valcke, Martin

    2013-01-01

    There is abundant educational research indicating that graphic organisers (knowledge maps, concept maps, or mind maps) have a beneficial impact on learning, but hardly any research has examined this in the context of presentations. This study therefore investigated how graphic organisers -- as delivered via presentation software -- affect learning…

  14. Digital Geological Mapping for Earth Science Students

    NASA Astrophysics Data System (ADS)

    England, Richard; Smith, Sally; Tate, Nick; Jordan, Colm

    2010-05-01

    This SPLINT (SPatial Literacy IN Teaching) supported project is developing pedagogies for the introduction of teaching of digital geological mapping to Earth Science students. Traditionally students are taught to make geological maps on a paper basemap with a notebook to record their observations. Learning to use a tablet pc with GIS based software for mapping and data recording requires emphasis on training staff and students in specific GIS and IT skills and beneficial adjustments to the way in which geological data is recorded in the field. A set of learning and teaching materials are under development to support this learning process. Following the release of the British Geological Survey's Sigma software we have been developing generic methodologies for the introduction of digital geological mapping to students that already have experience of mapping by traditional means. The teaching materials introduce the software to the students through a series of structured exercises. The students learn the operation of the software in the laboratory by entering existing observations, preferably data that they have collected. Through this the students benefit from being able to reflect on their previous work, consider how it might be improved and plan new work. Following this they begin fieldwork in small groups using both methods simultaneously. They are able to practise what they have learnt in the classroom and review the differences, advantages and disadvantages of the two methods, while adding to the work that has already been completed. Once the field exercises are completed students use the data that they have collected in the production of high quality map products and are introduced to the use of integrated digital databases which they learn to search and extract information from. The relatively recent development of the technologies which underpin digital mapping also means that many academic staff also require training before they are able to deliver the course materials. Consequently, a set of staff training materials are being developed in parallel to those for the students. These focus on the operation of the software and an introduction to the structure of the exercises. The presentation will review the teaching exercises and student and staff responses to their introduction.

  15. An improved real time superresolution FPGA system

    NASA Astrophysics Data System (ADS)

    Lakshmi Narasimha, Pramod; Mudigoudar, Basavaraj; Yue, Zhanfeng; Topiwala, Pankaj

    2009-05-01

    In numerous computer vision applications, enhancing the quality and resolution of captured video can be critical. Acquired video is often grainy and low quality due to motion, transmission bottlenecks, etc. Postprocessing can enhance it. Superresolution greatly decreases camera jitter to deliver a smooth, stabilized, high quality video. In this paper, we extend previous work on a real-time superresolution application implemented in ASIC/FPGA hardware. A gradient based technique is used to register the frames at the sub-pixel level. Once we get the high resolution grid, we use an improved regularization technique in which the image is iteratively modified by applying back-projection to get a sharp and undistorted image. The algorithm was first tested in software and migrated to hardware, to achieve 320x240 -> 1280x960, about 30 fps, a stunning superresolution by 16X in total pixels. Various input parameters, such as size of input image, enlarging factor and the number of nearest neighbors, can be tuned conveniently by the user. We use a maximum word size of 32 bits to implement the algorithm in Matlab Simulink as well as in FPGA hardware, which gives us a fine balance between the number of bits and performance. The proposed system is robust and highly efficient. We have shown the performance improvement of the hardware superresolution over the software version (C code).
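
    As a rough illustration of the back-projection step described above (the hardware pipeline and the gradient-based sub-pixel registration are omitted), a NumPy sketch of iterative back-projection over already-registered low-resolution frames might look like this:

```python
# Illustrative NumPy sketch of iterative back-projection superresolution,
# assuming the low-resolution frames are already registered (the paper uses a
# gradient-based sub-pixel registration step that is omitted here).
import numpy as np

def downsample(img, f):
    """Average f x f blocks: a simple stand-in for the camera's sampling model."""
    h, w = img.shape
    return img.reshape(h // f, f, w // f, f).mean(axis=(1, 3))

def upsample(img, f):
    """Nearest-neighbour expansion used to spread the error back onto the HR grid."""
    return np.kron(img, np.ones((f, f)))

def back_project(lr_frames, factor=4, iters=20):
    # Start from the upsampled average of the registered low-resolution frames.
    hr = upsample(np.mean(lr_frames, axis=0), factor)
    for _ in range(iters):
        for lr in lr_frames:
            error = lr - downsample(hr, factor)             # mismatch in LR space
            hr += upsample(error, factor) / len(lr_frames)  # back-project it
    return hr

lr_frames = np.random.rand(8, 60, 80)   # eight registered 60x80 frames
print(back_project(lr_frames).shape)    # (240, 320)
```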

  16. Intraperitoneal Injection Is Not a Suitable Administration Route for Single-Walled Carbon Nanotubes in Biomedical Applications.

    PubMed

    Liu, Xudong; Guo, Qing; Zhang, Yuchao; Li, Jinquan; Li, Rui; Wu, Yang; Ma, Ping; Yang, Xu

    2016-01-01

    Given the extensive application of carbon nanotubes (CNTs) in biomedical fields, there is increasing concern regarding unintentional health impacts. Research into safe usage is therefore increasingly necessary. This study investigated the responses of the mouse brain to single-walled CNTs (SWCNTs) delivered via intraperitoneal (IP) injection and compared these results with the previous study where SWCNTs were delivered via intravenous (IV) injection so as to explore which administration route is potentially better for SWCNTs application. This study suggests SWCNTs delivered via IP injection can have negative effects on the mouse brain through oxidative stress and inflammation at high concentration exposure, but these responses were not consistent and showed no dose-dependent effect. In a previous study, the results showed that IV-delivered SWCNTs induced a more consistent and dose-dependent effect. The comparison of the 2 studies suggested that using SWCNTs at a safe dosage delivered via IV injection may be a better administration route for SWCNTs in biomedical applications.

  17. The Impact of Software Culture on the Management of Community Data

    NASA Astrophysics Data System (ADS)

    Collins, J. A.; Pulsifer, P. L.; Sheffield, E.; Lewis, S.; Oldenburg, J.

    2013-12-01

    The Exchange for Local Observations and Knowledge of the Arctic (ELOKA), a program hosted at the National Snow and Ice Data Center (NSIDC), supports the collection, curation, and distribution of Local and Traditional Knowledge (LTK) data, as well as some quantitative data products. Investigations involving LTK data often involve community participation, and therefore require flexible and robust user interfaces to support a reliable process of data collection and management. Often, investigators focused on LTK and community-based monitoring choose to use ELOKA's data services based on our ability to provide rapid proof-of-concepts and economical delivery of a usable product. To satisfy these two overarching criteria, ELOKA is experimenting with modifications to its software development culture both in terms of how the software applications are developed as well as the kind of software applications (or components) being developed. Over the past several years, NSIDC has shifted its software development culture from one of assigning individual scientific programmers to support particular principal investigators or projects, to an Agile Software Methodology implementation using Scrum practices. ELOKA has participated in this process by working with other product owners to schedule and prioritize development work which is then implemented by a team of application developers. Scrum, along with practices such as Test Driven Development (TDD) and paired programming, improves the quality of the software product delivered to the user community. To meet the need for rapid prototyping and to maximize product development and support with limited developer input, our software development efforts are now focused on creating a platform of application modules that can be quickly customized to suit the needs of a variety of LTK projects. This approach is in contrast to the strategy of delivering custom applications for individual projects. To date, we have integrated components of the Nunaliit Atlas framework (a Java/JavaScript client-server web-based application) with an existing Ruby on Rails application. This approach requires transitioning individual applications to expose a service layer, thus allowing interapplication communication via RESTful services. In this presentation we will report on our experiences using Agile Scrum practices, our efforts to move from custom solutions to a platform of customizable modules, and the impact of each on our ability to support researchers and Arctic residents in the domain of community-based observations and knowledge.

  18. Instrument control software development process for the multi-star AO system ARGOS

    NASA Astrophysics Data System (ADS)

    Kulas, M.; Barl, L.; Borelli, J. L.; Gässler, W.; Rabien, S.

    2012-09-01

    The ARGOS project (Advanced Rayleigh guided Ground layer adaptive Optics System) will upgrade the Large Binocular Telescope (LBT) with an AO System consisting of six Rayleigh laser guide stars. This adaptive optics system integrates several control loops and many different components like lasers, calibration swing arms and slope computers that are dispersed throughout the telescope. The purpose of the instrument control software (ICS) is running this AO system and providing convenient client interfaces to the instruments and the control loops. The challenges for the ARGOS ICS are the development of a distributed and safety-critical software system with no defects in a short time, the creation of huge and complex software programs with a maintainable code base, the delivery of software components with the desired functionality and the support of geographically distributed project partners. To tackle these difficult tasks, the ARGOS software engineers reuse existing software like the novel middleware from LINC-NIRVANA, an instrument for the LBT, provide many tests at different functional levels like unit tests and regression tests, agree about code and architecture style and deliver software incrementally while closely collaborating with the project partners. Many ARGOS ICS components are already successfully in use in the laboratories for testing ARGOS control loops.

  19. UTM TCL 2.0 Software Version Description (SVD) Document

    NASA Technical Reports Server (NTRS)

    Mcguirk, Patrick

    2017-01-01

    This is the Unmanned Aircraft Systems (UAS) Traffic Management (UTM) Technical Capability Level(TCL) 2.0 Software Version Description (SVD) document. This UTM TCL 2.0 SVD describes the following four topics: 1. Software Release Contents: A listing of the files comprising this release 2. Installation Instructions: How to install the release and get it running 3. Changes Since Previous Release: General updates since the previous UTM release 4. Known Issues: Known issues and limitations in this release

  20. NASA Orbital Debris Engineering Model ORDEM2008 (Beta Version)

    NASA Technical Reports Server (NTRS)

    Stansbery, Eugene G.; Krisko, Paula H.

    2009-01-01

    This is an interim document intended to accompany the beta-release of the ORDEM2008 model. As such it provides the user with a guide for its use, a list of its capabilities, a brief summary of model development, and appendices included to educate the user as to typical runtimes for different orbit configurations. More detailed documentation will be delivered with the final product. ORDEM2008 supersedes NASA's previous model - ORDEM2000. The availability of new sensor and in situ data, the re-analysis of older data, and the development of new analytical techniques, has enabled the construction of this more comprehensive and sophisticated model. Integrated with the software is an upgraded graphical user interface (GUI), which uses project-oriented organization and provides the user with graphical representations of numerous output data products. These range from the conventional average debris size vs. flux magnitude for chosen analysis orbits, to the more complex color-contoured two-dimensional (2-D) directional flux diagrams in terms of local spacecraft pitch and yaw.

  1. A dense array stimulator to generate arbitrary spatio-temporal tactile stimuli

    PubMed Central

    Killebrew, Justin H.; Bensmaïa, Sliman J.; Dammann, John F.; Denchev, Peter; Hsiao, Steven S.; Craig, James C.

    2007-01-01

    The generation and presentation of tactile stimuli presents a unique challenge. Unlike vision and audition, in which standard equipment such as monitors and audio systems can be used for most experiments, tactile stimuli and/or stimulators often have to be tailor-made for a given study. Here, we present a novel tactile stimulator designed to present arbitrary spatio-temporal stimuli to the skin. The stimulator consists of 400 pins, arrayed over a 1 cm2 area, each under independent computer control. The dense array allows for an unprecedented number of stimuli to be presented within an experimental session (e.g., up to 1200 stimuli per minute) and for stimuli to be generated adaptively. The stimulator can be used in a variety of modes and can deliver indented and scanned patterns as well as stimuli defined by mathematical spatio-temporal functions (e.g., drifting sinusoids). We describe the hardware and software of the system, and discuss previous and prospective applications. PMID:17134760
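
    A sketch of the kind of mathematically defined spatio-temporal stimulus mentioned above, a sinusoidal grating drifting across a 20 x 20 pin grid, is given below; the pin pitch, update rate, and amplitudes are illustrative assumptions, not the stimulator's actual command interface.

```python
# Sketch of a spatio-temporal stimulus of the kind the dense array can deliver:
# a sinusoidal grating drifting across a 20 x 20 pin grid (1 cm^2). Parameters
# and the update rate are illustrative, not the stimulator's real interface.
import numpy as np

PINS = 20                    # 20 x 20 = 400 pins over 1 cm^2
PITCH_MM = 0.5               # assumed inter-pin spacing
FPS = 100                    # command frames per second

def drifting_sinusoid(freq_cyc_per_mm=0.5, speed_mm_s=10.0, amp_um=300.0, t=0.0):
    """Indentation depth (micrometres) for every pin at time t (seconds)."""
    x = np.arange(PINS) * PITCH_MM
    xx, _ = np.meshgrid(x, x)
    phase = 2 * np.pi * freq_cyc_per_mm * (xx - speed_mm_s * t)
    return amp_um * 0.5 * (1 + np.sin(phase))    # depths in [0, amp_um]

# One second of command frames for the pin array.
frames = [drifting_sinusoid(t=i / FPS) for i in range(FPS)]
print(len(frames), frames[0].shape)              # 100 (20, 20)
```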

  2. LSSGalPy: Interactive Visualization of the Large-scale Environment Around Galaxies

    NASA Astrophysics Data System (ADS)

    Argudo-Fernández, M.; Duarte Puertas, S.; Ruiz, J. E.; Sabater, J.; Verley, S.; Bergond, G.

    2017-05-01

    New tools are needed to handle the growth of data in astrophysics delivered by recent and upcoming surveys. We aim to build open-source, light, flexible, and interactive software designed to visualize extensive three-dimensional (3D) tabular data. We have developed interactive tools, written entirely in the Python language, to browse and visualize the positions of galaxies in the universe and their positions with respect to its large-scale structures (LSS). Motivated by a previous study, we created two codes using Mollweide projection and wedge diagram visualizations, where survey galaxies can be overplotted on the LSS of the universe. These are interactive representations where the visualizations can be controlled by widgets. We have released these open-source codes, which have been designed to be easily re-used and customized by the scientific community to fulfill their needs. The codes are adaptable to other kinds of 3D tabular data and are robust enough to handle several million objects.
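
    For orientation, a non-interactive sketch of the Mollweide-projection view is shown below using matplotlib on a random mock catalogue; LSSGalPy itself adds the widgets, the wedge diagrams, and real survey data.

```python
# Minimal, non-interactive sketch of the Mollweide-projection view described
# above, plotted with matplotlib on a random mock catalogue (not LSSGalPy).
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
ra = rng.uniform(0.0, 360.0, 5000)                      # right ascension, degrees
dec = np.degrees(np.arcsin(rng.uniform(-1, 1, 5000)))   # uniform on the sphere

# Matplotlib's Mollweide axes expect longitude in [-pi, pi] and latitude in radians.
lon = np.radians(np.where(ra > 180.0, ra - 360.0, ra))
lat = np.radians(dec)

ax = plt.subplot(111, projection="mollweide")
ax.scatter(lon, lat, s=1, alpha=0.3)
ax.grid(True)
ax.set_title("Mock galaxy positions (Mollweide projection)")
plt.show()
```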

  3. Teaching a laboratory-intensive online introductory electronics course*

    NASA Astrophysics Data System (ADS)

    Markes, Mark

    2008-03-01

    Most current online courses provide little or no hands-on laboratory content. This talk will describe the development and initial experiences with presenting an introductory online electronics course with significant hands-on laboratory content. The course is delivered using a Linux-based Apache web server, a Darwin Streaming Server, a SMART Board interactive white board, SMART Notebook software and a video camcorder. The laboratory uses primarily the Global Specialties PB-505 trainer and a Tenma 20 MHz oscilloscope that are provided to the students for the duration of the course and then returned. Testing is performed using the Blackboard course management software.

  4. Using Free Internet Videogames in Upper Extremity Motor Training for Children with Cerebral Palsy

    PubMed Central

    Sevick, Marisa; Eklund, Elizabeth; Mensch, Allison; Foreman, Matthew; Standeven, John; Engsberg, Jack

    2016-01-01

    Movement therapy is one type of upper extremity intervention for children with cerebral palsy (CP) to improve function. It requires high-intensity, repetitive and task-specific training. Tedium and lack of motivation are substantial barriers to completing the training. An approach to overcome these barriers is to couple the movement therapy with videogames. This investigation: (1) tested the feasibility of delivering a free Internet videogame upper extremity motor intervention to four children with CP (aged 8–17 years) with mild to moderate limitations to upper limb function; and (2) determined the level of intrinsic motivation during the intervention. The intervention used free Internet videogames in conjunction with the Microsoft Kinect motion sensor and the Flexible Action and Articulated Skeleton Toolkit (FAAST) software. Results indicated that the intervention could be successfully delivered in the laboratory and the home, and pre- and post-intervention assessments of impairment, function and performance were possible. Results also indicated a high level of motivation among the participants. It was concluded that the use of inexpensive hardware and software in conjunction with free Internet videogames has the potential to be very motivating in helping to improve the upper extremity abilities of children with CP. Future work should include results from additional participants and from a control group in a randomized controlled trial to establish efficacy. PMID:27338485

  5. Trends in software reliability for digital flight control

    NASA Technical Reports Server (NTRS)

    Hecht, H.; Hecht, M.

    1983-01-01

    Software error data from major recent digital flight control system development programs are presented. The report summarizes the data, compares them with similar data from previous surveys, and identifies trends and disciplines to improve software reliability.

  6. Patch Transporter: Incentivized, Decentralized Software Patch System for WSN and IoT Environments

    PubMed Central

    Lee, JongHyup

    2018-01-01

    In the complicated settings of WSN (Wireless Sensor Networks) and IoT (Internet of Things) environments, keeping a number of heterogeneous devices updated is a challenging job, especially with respect to effectively discovering target devices and rapidly delivering the software updates. In this paper, we convert the traditional software update process to a distributed service. We set an incentive system for faithfully transporting the patches to the recipient devices. The incentive system motivates independent, self-interested transporters for helping the devices to be updated. To ensure the system correctly operates, we employ the blockchain system that enforces the commitment in a decentralized manner. We also present a detailed specification for the proposed protocol and validate it by model checking and simulations for correctness. PMID:29438337
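
    The commitment idea can be pictured with a toy hash-chained receipt log, sketched below in Python; this is only an illustration of tamper-evident chaining, not the paper's blockchain protocol or its incentive mechanism.

```python
# Toy sketch of recording patch-delivery commitments in a tamper-evident chain.
# This illustrates hash chaining only, not the paper's blockchain protocol or
# incentive mechanism.
import hashlib
import json
import time

def add_block(chain, transporter_id, device_id, patch_hash):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    record = {
        "transporter": transporter_id,
        "device": device_id,
        "patch": patch_hash,
        "time": time.time(),
        "prev": prev_hash,
    }
    # The block hash covers the previous hash, so any later tampering is visible.
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    chain.append(record)
    return record

chain = []
add_block(chain, "transporter-A", "sensor-17", hashlib.sha256(b"patch-v2").hexdigest())
add_block(chain, "transporter-B", "sensor-42", hashlib.sha256(b"patch-v2").hexdigest())
print(len(chain), chain[-1]["hash"][:16])
```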

  7. Patch Transporter: Incentivized, Decentralized Software Patch System for WSN and IoT Environments.

    PubMed

    Lee, JongHyup

    2018-02-13

    In the complicated settings of WSN (Wireless Sensor Networks) and IoT (Internet of Things) environments, keeping a number of heterogeneous devices updated is a challenging job, especially with respect to effectively discovering target devices and rapidly delivering the software updates. In this paper, we convert the traditional software update process to a distributed service. We set an incentive system for faithfully transporting the patches to the recipient devices. The incentive system motivates independent, self-interested transporters for helping the devices to be updated. To ensure the system correctly operates, we employ the blockchain system that enforces the commitment in a decentralized manner. We also present a detailed specification for the proposed protocol and validate it by model checking and simulations for correctness.

  8. Absorbing Software Testing into the Scrum Method

    NASA Astrophysics Data System (ADS)

    Tuomikoski, Janne; Tervonen, Ilkka

    In this paper we study how to absorb software testing into the Scrum method. We conducted the research as action research during the years 2007-2008, with three iterations. The results showed that testing can, and even should, be absorbed into the Scrum method. The testing team was merged into the Scrum teams. The teams can now deliver better working software in a shorter time, because testing keeps track of the progress of the development. Team spirit is also higher, because the Scrum team members are committed to the same goal. The biggest change from the test manager’s point of view was the organized Product Owner Team. The test manager no longer has a dedicated testing team, and in the future all testing tasks have to be assigned through the Product Backlog.

  9. SSME digital control design characteristics

    NASA Technical Reports Server (NTRS)

    Mitchell, W. T.; Searle, R. F.

    1985-01-01

    To protect against a latent programming error (software fault) existing in an untried branch combination that would render the space shuttle out of control in a critical flight phase, the Backup Flight System (BFS) was chartered to provide a safety alternative. The BFS is designed to operate in critical flight phases (ascent and descent) by monitoring the activities of the space shuttle flight subsystems that are under control of the primary flight software (PFS) (e.g., navigation, crew interface, propulsion), then, upon manual command by the flightcrew, to assume control of the space shuttle and deliver it to a noncritical flight condition (safe orbit or touchdown). The problems associated with the selection of the PFS/BFS system architecture, the internal BFS architecture, the fault tolerant software mechanisms, and the long term BFS utility are discussed.

  10. A Virtual Approach to Teaching Safety Skills to Children with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Self, Trisha; Scudder, Rosalind R.; Weheba, Gamal; Crumrine, Daiquirie

    2007-01-01

    Recent advancements in the development of hardware/software configurations for delivering virtual reality (VR) environments to individuals with disabilities have included approaches for children with autism spectrum disorder (ASD). This article describes a study comparing benefits of using VR to benefits of an integrated/visual treatment model…

  11. Course Management Systems: Time for Users to Get What They Need

    ERIC Educational Resources Information Center

    Ioannou, Andri; Hannafin, Robert D.

    2008-01-01

    Course management systems (CMSs) are software systems designed to manage course content and course activities. These tools integrate technological and pedagogical features into a web-based system that allows instructors, even those who are unfamiliar with web-based technologies, to design, deliver, and manage an online course. However, CMSs have…

  12. Turbocharged Diesels

    NASA Technical Reports Server (NTRS)

    1984-01-01

    In a number of feasibility studies of turbine rotor designs, engineers of Cummins Engine Company, Inc.'s turbocharger group have utilized a computer program from COSMIC. Part of Cummins research effort is aimed toward introduction of advanced turbocharged engines that deliver extra power with greater fuel efficiency. Company claims use of COSMIC program substantially reduced software development costs.

  13. Integrating the Multimedia Builder Software as an Educational Tool to Deliver Fairy Tales: Promoting Multiliteracies and Multimodality

    ERIC Educational Resources Information Center

    Eteokleous, Nikleia; Pavlou, Victoria; Tsolakidis, Simos

    2015-01-01

    As a way to respond to the contemporary challenges for promoting multiliteracies and multimodality in education, the current study proposes a theoretical framework--the multiliteracies model--in identifying, developing and evaluating multimodal material. The article examines, first theoretically and then empirically, the promotion of…

  14. Second-Grade Urban Learners: Preliminary Findings for a Computer-Assisted, Culturally Relevant, Repeated Reading Intervention

    ERIC Educational Resources Information Center

    Bennett, Jessica G.; Gardner, Ralph, III; Cartledge, Gwendolyn; Ramnath, Rajiv; Council, Morris R., III

    2017-01-01

    This study investigated the effects of a multicomponent, supplemental intervention on the reading fluency of second-grade African-American urban students who showed reading and special education risk. The packaged intervention combined repeated readings and culturally relevant stories, delivered through a novel computer software program to enhance…

  15. Characterization Test Report for the Mnemonics-UCS Wireless Surface Acoustic Wave Sensor System

    NASA Technical Reports Server (NTRS)

    Duncan, Joshua J.; Youngquist, Robert C.

    2013-01-01

    The scope of this testing includes the Surface Acoustic Wave Sensor System delivered to KSC: two interrogator (transceiver) systems, four temperature sensors with wooden mounting blocks, two antennas, two power supplies, network cables, and analysis software. Also included are a number of additional temperature sensors and newly developed hydrogen sensors.

  16. Electronic Mail Is One High-Tech Management Tool that Really Delivers.

    ERIC Educational Resources Information Center

    Parker, Donald C.

    1987-01-01

    Describes an electronic mail system used by the Horseheads (New York) Central School District's eight schools and central office that saves time and enhances productivity. This software calls up information from the district's computer network and sends it to other users' special files--electronic "mailboxes" set aside for messages and…

  17. Strengthening Scientific Verbal Behavior: An Experimental Comparison of Progressively Prompted and Unprompted Programmed Instruction and Prose Tutorials

    ERIC Educational Resources Information Center

    Davis, Darrel R.; Bostow, Darrel E.; Heimisson, Gudmundur T.

    2007-01-01

    Web-based software was used to deliver and record the effects of programmed instruction that progressively added formal prompts until attempts were successful, programmed instruction with one attempt, and prose tutorials. Error-contingent progressive prompting took significantly longer than programmed instruction and prose. Both forms of…

  18. Delivery of Hardware for Syracuse University Faculty Loaner Program.

    ERIC Educational Resources Information Center

    Jares, Terry

    This paper describes the Faculty Assistance and Computing Education Services (FACES) loaner program at Syracuse University and the method used by FACES staff to deliver and keep track of hardware, software, and documentation. The roles of the various people involved in the program are briefly discussed, i.e., the administrator, who handles the…

  19. Communication Skills Training Exploiting Multimodal Emotion Recognition

    ERIC Educational Resources Information Center

    Bahreini, Kiavash; Nadolski, Rob; Westera, Wim

    2017-01-01

    The teaching of communication skills is a labour-intensive task because of the detailed feedback that should be given to learners during their prolonged practice. This study investigates to what extent our FILTWAM facial and vocal emotion recognition software can be used for improving a serious game (the Communication Advisor) that delivers a…

  20. E-Classical Fairy Tales: Multimedia Builder as a Tool

    ERIC Educational Resources Information Center

    Eteokleous, Nikleia; Ktoridou, Despo; Tsolakidis, Symeon

    2011-01-01

    The study examines pre-service teachers' experiences in delivering a traditional-classical fairy tale using the Multimedia Builder software, in other words an e-fairy tale. A case study approach was employed, collecting qualitative data through classroom observations and focus groups. The results focus on pre-service teachers' reactions, opinions,…

  1. The Challenges of Being Agile in DoD

    DTIC Science & Technology

    2013-02-01

    term “Agile” will serve as an overarching term to represent all forms of iterative development whether Scrum , Lean Software Development, extreme...occur? • How do we know what the development team will deliver at the end of the Sprint? (A basic unit of development in Scrum that lasts for “time

  2. The WHATs and HOWs of Maturing Computational and Software Engineering Skills in Russian Higher Education Institutions

    ERIC Educational Resources Information Center

    Semushin, I. V.; Tsyganova, J. V.; Ugarov, V. V.; Afanasova, A. I.

    2018-01-01

    Russian higher education institutions' tradition of teaching large-enrollment classes is impairing student striving for individual prominence, one-upmanship, and hopes for originality. Intending to convert these drawbacks into benefits, a Project-Centred Education Model (PCEM) has been introduced to deliver Computational Mathematics and…

  3. Cloud Computing in Support of Applied Learning: A Baseline Study of Infrastructure Design at Southern Polytechnic State University

    ERIC Educational Resources Information Center

    Conn, Samuel S.; Reichgelt, Han

    2013-01-01

    Cloud computing represents an architecture and paradigm of computing designed to deliver infrastructure, platforms, and software as constructible computing resources on demand to networked users. As campuses are challenged to better accommodate academic needs for applications and computing environments, cloud computing can provide an accommodating…

  4. Cloud Based Educational Systems and Its Challenges and Opportunities and Issues

    ERIC Educational Resources Information Center

    Paul, Prantosh Kr.; Lata Dangwal, Kiran

    2014-01-01

    Cloud Computing (CC) is a set of hardware, software, networks, storage, services, and interfaces combined to deliver aspects of computing as a service. Cloud Computing (CC) uses central remote servers to maintain data and applications. Practically, Cloud Computing (CC) is an extension of grid computing with independency and…

  5. Cloud Computing Technologies in Writing Class: Factors Influencing Students' Learning Experience

    ERIC Educational Resources Information Center

    Wang, Jenny

    2017-01-01

    The interactive online group within cloud computing technologies proposed as the main contribution of this paper provides easy and simple access to a cloud-based Software as a Service (SaaS) system and delivers effective educational tools for students and the teacher in after-class group writing assignment activities. Therefore, this study…

  6. ELIXIR-UK role in bioinformatics training at the national level and across ELIXIR

    PubMed Central

    Larcombe, L.; Hendricusdottir, R.; Attwood, T.K.; Bacall, F.; Beard, N.; Bellis, L.J.; Dunn, W.B.; Hancock, J.M.; Nenadic, A.; Orengo, C.; Overduin, B.; Sansone, S-A; Thurston, M.; Viant, M.R.; Winder, C.L.; Goble, C.A.; Ponting, C.P.; Rustici, G.

    2017-01-01

    ELIXIR-UK is the UK node of ELIXIR, the European infrastructure for life science data. Since its foundation in 2014, ELIXIR-UK has played a leading role in training both within the UK and in the ELIXIR Training Platform, which coordinates and delivers training across all ELIXIR members. ELIXIR-UK contributes to the Training Platform’s coordination and supports the development of training to address key skill gaps amongst UK scientists. As part of this work it acts as a conduit for nationally-important bioinformatics training resources to promote their activities to the ELIXIR community. ELIXIR-UK also leads ELIXIR’s flagship Training Portal, TeSS, which collects information about a diverse range of training and makes it easily accessible to the community. ELIXIR-UK also works with others to provide key digital skills training, partnering with the Software Sustainability Institute to provide Software Carpentry training to the ELIXIR community and to establish the Data Carpentry initiative, and taking a lead role amongst national stakeholders to deliver the StaTS project – a coordinated effort to drive engagement with training in statistics. PMID:28781748

  7. Design and Construction of a Microcontroller-Based Ventilator Synchronized with Pulse Oximeter.

    PubMed

    Gölcük, Adem; Işık, Hakan; Güler, İnan

    2016-07-01

    This study aims to introduce a novel device in which a mechanical ventilator and a pulse oximeter work in synchronization. A serial communication technique was used to enable communication between the pulse oximeter and the ventilator. The SpO2 value and the pulse rate read on the pulse oximeter were transmitted to the mechanical ventilator through transmitter (Tx) and receiver (Rx) lines. The fuzzy-logic-based software developed for the mechanical ventilator interprets these values and calculates the percentage of oxygen (FiO2) and Positive End-Expiratory Pressure (PEEP) to be delivered to the patient. The fuzzy-logic-based software was developed to monitor the changing medical states of patients and to produce new results (FiO2 and PEEP) for each new state. FiO2 and PEEP values delivered from the ventilator to the patient can be calculated in this way without requiring any arterial blood gas analysis. Our experiments and the feedback from physicians show that this device makes it possible to obtain more successful results when compared to current practice.
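
    A schematic stand-in for the SpO2-to-settings mapping described above is sketched below; the breakpoints are invented for illustration, they are not the authors' fuzzy rule base and certainly not clinical guidance.

```python
# Schematic stand-in for the controller described above: map the pulse-oximeter
# SpO2 reading to FiO2 and PEEP settings. The breakpoints are hypothetical and
# purely illustrative; not the authors' fuzzy rules, not clinical guidance.
def suggest_settings(spo2_percent: float) -> tuple[float, float]:
    """Return (FiO2 fraction, PEEP in cmH2O) for a given SpO2 percentage."""
    if spo2_percent >= 96.0:
        return 0.30, 5.0
    if spo2_percent >= 92.0:
        return 0.40, 6.0
    if spo2_percent >= 88.0:
        return 0.60, 8.0
    return 0.80, 10.0          # severe desaturation: escalate both settings

for reading in (98.0, 93.5, 89.0, 85.0):
    fio2, peep = suggest_settings(reading)
    print(f"SpO2 {reading:5.1f}%  ->  FiO2 {fio2:.2f}, PEEP {peep:.1f} cmH2O")
```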

  8. X-train: teaching professionals remotely.

    PubMed

    Santerre, Charles R

    2005-05-01

    Increased popularity of the Internet, along with the development of new software applications, has dramatically improved our ability to create and deliver online continuing education trainings to professionals in the areas of nutrition and food safety. In addition, these technological advances permit effective and affordable measurement of training outcomes, i.e., changes in knowledge, attitude, and behavior that result from these educational efforts. Impact assessment of engagement programs is becoming increasingly important for demonstrating the value of training activities to stakeholders. A novel software program, called X-Train, takes advantage of technological advances (databases, computer graphics, Web-based interfaces, and network speed) to deliver high-quality trainings to teachers and health care professionals. X-Train automatically collects outcome data, generates and sends certificates of completion, and communicates with participants through electronic messages. X-Train can be used as a collaborative tool whereby experts from various academic institutions are brought together to develop Web-based trainings. Finally, X-Train uses a unique approach that encourages cooperative extension specialists and educators to promote these educational opportunities within their state or county.

  9. The SHIP: A SIP to HTTP Interaction Protocol

    NASA Astrophysics Data System (ADS)

    Zeiß, Joachim; Gabner, Rene; Bessler, Sandford; Happenhofer, Marco

    IMS is capable of providing a wide range of services. As a result, terminal software becomes more and more complex to deliver network intelligence to user applications. Currently mobile terminal software needs to be permanently updated so that the latest network services and functionality can be delivered to the user. In the Internet, browser based user interfaces assure that an interface is made available to the user which offers the latest services in the net immediately. Our approach combines the benefits of the Session Initiation Protocol (SIP) and those of the HTTP protocol to bring the same type of user interfacing to IMS. SIP (IMS) realizes authentication, session management, charging and Quality of Service (QoS), HTTP provides access to Internet services and allows the user interface of an application to run on a mobile terminal while processing and orchestration is done on the server. A SHIP enabled IMS client only needs to handle data transport and session management via SIP, HTTP and RTP and render streaming media, HTML and Javascript. SHIP allows new kinds of applications, which combine audio, video and data within a single multimedia session.

  10. ELIXIR-UK role in bioinformatics training at the national level and across ELIXIR.

    PubMed

    Larcombe, L; Hendricusdottir, R; Attwood, T K; Bacall, F; Beard, N; Bellis, L J; Dunn, W B; Hancock, J M; Nenadic, A; Orengo, C; Overduin, B; Sansone, S-A; Thurston, M; Viant, M R; Winder, C L; Goble, C A; Ponting, C P; Rustici, G

    2017-01-01

    ELIXIR-UK is the UK node of ELIXIR, the European infrastructure for life science data. Since its foundation in 2014, ELIXIR-UK has played a leading role in training both within the UK and in the ELIXIR Training Platform, which coordinates and delivers training across all ELIXIR members. ELIXIR-UK contributes to the Training Platform's coordination and supports the development of training to address key skill gaps amongst UK scientists. As part of this work it acts as a conduit for nationally-important bioinformatics training resources to promote their activities to the ELIXIR community. ELIXIR-UK also leads ELIXIR's flagship Training Portal, TeSS, which collects information about a diverse range of training and makes it easily accessible to the community. ELIXIR-UK also works with others to provide key digital skills training, partnering with the Software Sustainability Institute to provide Software Carpentry training to the ELIXIR community and to establish the Data Carpentry initiative, and taking a lead role amongst national stakeholders to deliver the StaTS project - a coordinated effort to drive engagement with training in statistics.

  11. Spinoff 2015

    NASA Technical Reports Server (NTRS)

    2015-01-01

    Topics covered include: 3D Endoscope to Boost Safety, Cut Cost of Surgery; Audio App Brings a Better Night's Sleep Liquid Cooling Technology Increases Exercise Efficiency; Algae-Derived Dietary Ingredients Nourish Animals; Space Grant Research Launches Rehabilitation Chair; Vision Trainer Teaches Focusing Techniques at Home; Aircraft Geared Architecture Reduces Fuel Cost and Noise; Ubiquitous Supercritical Wing Design Cuts Billions in Fuel Costs; Flight Controller Software Protects Lightweight Flexible Aircraft; Cabin Pressure Monitors Notify Pilots to Save Lives; Ionospheric Mapping Software Ensures Accuracy of Pilots' GPS; Water Mapping Technology Rebuilds Lives in Arid Regions; Shock Absorbers Save Structures and Lives during Earthquakes; Software Facilitates Sharing of Water Quality Data Worldwide; Underwater Adhesives Retrofit Pipelines with Advanced Sensors; Laser Imaging Video Camera Sees through Fire, Fog, Smoke; 3D Lasers Increase Efficiency, Safety of Moving Machines; Air Revitalization System Enables Excursions to the Stratosphere; Magnetic Fluids Deliver Better Speaker Sound Quality; Bioreactor Yields Extracts for Skin Cream; Private Astronaut Training Prepares Commercial Crews of Tomorrow; Activity Monitors Help Users Get Optimum Sun Exposure; LEDs Illuminate Bulbs for Better Sleep, Wake Cycles; Charged Particles Kill Pathogens and Round Up Dust; Balance Devices Train Golfers for a Consistent Swing; Landsat Imagery Enables Global Studies of Surface Trends; Ruggedized Spectrometers Are Built for Tough Jobs; Gas Conversion Systems Reclaim Fuel for Industry; Remote Sensing Technologies Mitigate Drought; Satellite Data Inform Forecasts of Crop Growth; Probes Measure Gases for Environmental Research; Cloud Computing Technologies Facilitate Earth Research; Software Cuts Homebuilding Costs, Increases Energy Efficiency; Portable Planetariums Teach Science; Schedule Analysis Software Saves Time for Project Planners; Sound Modeling Simplifies Vehicle Noise Management; Custom 3D Printers Revolutionize Space Supply Chain; Improved Calibration Shows Images' True Colors; Micromachined Parts Advance Medicine, Astrophysics, and More; Metalworking Techniques Unlock a Unique Alloy; Low-Cost Sensors Deliver Nanometer-Accurate Measurements; Electrical Monitoring Devices Save on Time and Cost; Dry Lubricant Smooths the Way for Space Travel, Industry; and Compact Vapor Chamber Cools Critical Components.

  12. Prima Platform: A Scheme for Managing Equipment-Dependent Onboard Functions and Impacts on the Avionics Software Production Process

    NASA Astrophysics Data System (ADS)

    Candia, Sante; Lisio, Giovanni; Campolo, Giovanni; Pascucci, Dario

    2010-08-01

    The Avionics Software (ASW), in charge of controlling the Low Earth Orbit (LEO) spacecraft PRIMA Platform (Piattaforma Ri-configurabile Italiana Multi-Applicativa), is evolving towards a highly modular and re-usable architecture based on an architectural framework allowing the effective integration of the software building blocks (SWBBs) providing the on-board control functions. During recent years, the PRIMA ASW design and production processes have been improved to reach the following objectives: (a) at PUS Services level, separation of the mission-independent software mechanisms from the mission-dependent configuration information; (b) at Application level, identification of mission-independent recurrent functions for promoting abstraction and obtaining a more efficient and safe ASW production, with positive implications also on the software validation activities. This paper is dedicated to the characterisation activity performed at Application level for a software component abstracting a set of functions for the generic On-Board Assembly (OBA), a set of hardware units used to deliver an on-board service. Moreover, the ASW production process is described to show how it changes after the introduction of the new design features.

  13. Evaluating Predictive Models of Software Quality

    NASA Astrophysics Data System (ADS)

    Ciaschini, V.; Canaparo, M.; Ronchieri, E.; Salomoni, D.

    2014-06-01

    Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to obtain an approximation of the risk before releasing a program, so as to only deliver software with a risk lower than an agreed threshold. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent of discovering the risk factor of each product and comparing it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and finally we conclude by suggesting directions for further studies.
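
    As an illustration of the kind of predictive quality model discussed here, the sketch below trains a logistic-regression defect predictor on synthetic per-module metrics; the features and data are invented, and the EMI models evaluated in the paper are not reproduced.

```python
# Illustrative sketch of a quality/risk model: a logistic-regression defect
# predictor trained on made-up per-module code metrics (not the EMI models).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 400
X = np.column_stack([
    rng.poisson(200, n),        # lines of code per module
    rng.poisson(15, n),         # cyclomatic complexity
    rng.poisson(5, n),          # number of recent changes (churn)
])
# Synthetic ground truth: riskier modules are larger, more complex, more churned.
logit = 0.004 * X[:, 0] + 0.05 * X[:, 1] + 0.15 * X[:, 2] - 3.0
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
risk = model.predict_proba(X_te)[:, 1]          # per-module fault risk
print("mean predicted risk:", risk.mean().round(3))
print("modules below an agreed risk threshold of 0.5:",
      int((risk < 0.5).sum()), "of", len(risk))
```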

  14. Mobile Software as a Medical Device (SaMD) for the Treatment of Epilepsy: Development of Digital Therapeutics Comprising Behavioral and Music-Based Interventions for Neurological Disorders

    PubMed Central

    Afra, Pegah; Bruggers, Carol S.; Sweney, Matthew; Fagatele, Lilly; Alavi, Fareeha; Greenwald, Michael; Huntsman, Merodean; Nguyen, Khanhly; Jones, Jeremiah K.; Shantz, David; Bulaj, Grzegorz

    2018-01-01

    Digital health technologies for people with epilepsy (PWE) include internet-based resources and mobile apps for seizure management. Since non-pharmacological interventions, such as listening to specific Mozart's compositions, cognitive therapy, psychosocial and educational interventions were shown to reduce epileptic seizures, these modalities can be integrated into mobile software and delivered by mobile medical apps as digital therapeutics. Herein, we describe: (1) a survey study among PWE about preferences to use mobile software for seizure control, (2) a rationale for developing digital therapies for epilepsy, (3) creation of proof-of-concept mobile software intended for use as an adjunct digital therapeutic to reduce seizures, and (4) broader applications of digital therapeutics for the treatment of epilepsy and other chronic disorders. A questionnaire was used to survey PWE with respect to preferred features in a mobile app for seizure control. Results from the survey suggested that over 90% of responders would be interested in using a mobile app to manage their seizures, while 75% were interested in listening to specific music that can reduce seizures. To define digital therapeutic for the treatment of epilepsy, we designed and created a proof-of-concept mobile software providing digital content intended to reduce seizures. The rationale for all components of such digital therapeutic is described. The resulting web-based app delivered a combination of epilepsy self-care, behavioral interventions, medication reminders and the antiseizure music, such as the Mozart's sonata K.448. To improve long-term patient engagement, integration of mobile medical app with music and multimedia streaming via smartphones, tablets and computers is also discussed. This work aims toward development and regulatory clearance of software as medical device (SaMD) for seizure control, yielding the adjunct digital therapeutic for epilepsy, and subsequently a drug-device combination product together with specific antiseizure medications. Mobile medical apps, music, therapeutic video games and their combinations with prescription medications present new opportunities to integrate pharmacological and non-pharmacological interventions for PWE, as well as those living with other chronic disorders, including depression and pain. PMID:29780310

  15. Mobile Software as a Medical Device (SaMD) for the Treatment of Epilepsy: Development of Digital Therapeutics Comprising Behavioral and Music-Based Interventions for Neurological Disorders.

    PubMed

    Afra, Pegah; Bruggers, Carol S; Sweney, Matthew; Fagatele, Lilly; Alavi, Fareeha; Greenwald, Michael; Huntsman, Merodean; Nguyen, Khanhly; Jones, Jeremiah K; Shantz, David; Bulaj, Grzegorz

    2018-01-01

    Digital health technologies for people with epilepsy (PWE) include internet-based resources and mobile apps for seizure management. Since non-pharmacological interventions such as listening to specific Mozart compositions, cognitive therapy, and psychosocial and educational interventions have been shown to reduce epileptic seizures, these modalities can be integrated into mobile software and delivered by mobile medical apps as digital therapeutics. Herein, we describe: (1) a survey study among PWE about preferences to use mobile software for seizure control, (2) a rationale for developing digital therapies for epilepsy, (3) creation of proof-of-concept mobile software intended for use as an adjunct digital therapeutic to reduce seizures, and (4) broader applications of digital therapeutics for the treatment of epilepsy and other chronic disorders. A questionnaire was used to survey PWE with respect to preferred features in a mobile app for seizure control. Results from the survey suggested that over 90% of responders would be interested in using a mobile app to manage their seizures, while 75% were interested in listening to specific music that can reduce seizures. To define a digital therapeutic for the treatment of epilepsy, we designed and created proof-of-concept mobile software providing digital content intended to reduce seizures. The rationale for all components of such a digital therapeutic is described. The resulting web-based app delivered a combination of epilepsy self-care, behavioral interventions, medication reminders and antiseizure music, such as Mozart's sonata K.448. To improve long-term patient engagement, integration of the mobile medical app with music and multimedia streaming via smartphones, tablets and computers is also discussed. This work aims toward development and regulatory clearance of software as a medical device (SaMD) for seizure control, yielding an adjunct digital therapeutic for epilepsy, and subsequently a drug-device combination product together with specific antiseizure medications. Mobile medical apps, music, therapeutic video games and their combinations with prescription medications present new opportunities to integrate pharmacological and non-pharmacological interventions for PWE, as well as those living with other chronic disorders, including depression and pain.

  16. A multi-GPU real-time dose simulation software framework for lung radiotherapy.

    PubMed

    Santhanam, A P; Min, Y; Neelakkantan, H; Papp, N; Meeks, S L; Kupelian, P A

    2012-09-01

    Medical simulation frameworks facilitate both the preoperative and postoperative analysis of the patient's pathophysical condition. Of particular importance is the simulation of radiation dose delivery for real-time radiotherapy monitoring and retrospective analyses of the patient's treatment. In this paper, a software framework tailored for the development of simulation-based real-time radiation dose monitoring medical applications is discussed. A multi-GPU-based computational framework coupled with inter-process communication methods is introduced for simulating the radiation dose delivery on a deformable 3D volumetric lung model and its real-time visualization. The model deformation and the corresponding dose calculation are allocated among the GPUs in a task-specific manner and executed in a pipelined fashion. Radiation dose calculations are computed on two different GPU hardware architectures. The integration of this computational framework with a front-end software layer and back-end patient database repository is also discussed. Real-time simulation of the delivered dose is achieved once every 120 ms using the proposed framework. With a linear increase in the number of GPU cores, the computational time of the simulation decreased linearly. The inter-process communication time also improved with an increase in the hardware memory. Variations in the delivered dose and computational speedup for variations in the data dimensions are investigated using D70 and D90 as well as gEUD as metrics for a set of 14 patients. Computational speed-up increased with an increase in the beam dimensions when compared with a CPU-based commercial software, while the error in the dose calculation was <1%. Our analyses show that the framework applied to deformable lung model-based radiotherapy is an effective tool for performing both real-time and retrospective analyses.
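
    The task-specific, pipelined allocation described above can be pictured with a small stand-in: one worker plays the role of the GPU that deforms the lung model while a second computes dose on each deformed model as it arrives. The stage names and frame data are hypothetical; the actual framework uses CUDA-capable GPUs and inter-process communication rather than Python threads.

        # Pure-Python stand-in for task-specific, pipelined GPU allocation; the paper's
        # framework runs these stages on separate GPUs with inter-process communication.
        import queue
        import threading

        def deform_stage(frames, out_q):
            """Stage assigned to one GPU: deform the lung model for each imaging frame."""
            for f in frames:
                out_q.put(f"deformed_model({f})")    # placeholder for a GPU deformation kernel
            out_q.put(None)                          # sentinel: no more work

        def dose_stage(in_q, results):
            """Stage assigned to another GPU: compute dose on each deformed model as it arrives."""
            while (model := in_q.get()) is not None:
                results.append(f"dose_on({model})")  # placeholder for a GPU dose kernel

        frames = [f"frame_{i}" for i in range(5)]
        q, results = queue.Queue(maxsize=2), []
        producer = threading.Thread(target=deform_stage, args=(frames, q))
        consumer = threading.Thread(target=dose_stage, args=(q, results))
        producer.start(); consumer.start()
        producer.join(); consumer.join()
        print(results)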

  17. SU-F-T-489: 4-Years Experience of QA in TomoTherapy MVCT: What Do We Look Out For?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, F; Chan, K

    2016-06-15

    Purpose: To evaluate the QA results of TomoTherapy MVCT from March 2012 to February 2016, and to identify issues that may affect consistency in HU numbers and reconstructed treatment dose in MVCT. Methods: Monthly QA was performed on our TomoHD system. A phantom with rod inserts of various mass densities was imaged in MVCT and compared to baseline to evaluate HU number consistency. To evaluate treatment dose reconstructed from the delivered sinogram and MVCT, a treatment plan was designed on a humanoid skull phantom. The phantom was imaged with MVCT and the treatment plan was delivered to obtain the sinogram. The dose reconstructed with the Planned Adaptive software was compared to the dose in the original plan. The QA tolerance for HU numbers was ±30 HU, and ±2% for discrepancy between original plan dose and reconstructed dose. Tolerances were referenced to AAPM TG148. Results: Several technical modifications or maintenance activities to the system have been identified which affected QA results: 1) Upgrade in console system software which added a weekly HU calibration procedure; 2) Linac or MLC replacement leading to change in Accelerator Output Machine (AOM) parameters; 3) Upgrade in planning system algorithm affecting MVCT dose reconstruction. These events caused abrupt changes in QA results, especially for the reconstructed dose. In the past 9 months, when no such modifications were done to the system, the reconstructed dose was consistent, with maximum deviation from baseline less than 0.6%. The HU number deviated less than 5 HU. Conclusion: Routine QA is essential for MVCT, especially if the MVCT is used for daily dose reconstruction to monitor delivered dose to patients. Several technical events which may affect consistency of this are software changes and linac or MLC replacement. QA results reflected changes which justify re-calibration or system adjustment. In normal circumstances, the system should be relatively stable and quarterly QA may be sufficient.

  18. Collaborative Software Development Approach Used to Deliver the New Shuttle Telemetry Ground Station

    NASA Technical Reports Server (NTRS)

    Kirby, Randy L.; Mann, David; Prenger, Stephen G.; Craig, Wayne; Greenwood, Andrew; Morsics, Jonathan; Fricker, Charles H.; Quach, Son; Lechese, Paul

    2003-01-01

    United Space Alliance (USA) developed and used a new software development method to meet technical, schedule, and budget challenges faced during the development and delivery of the new Shuttle Telemetry Ground Station at Kennedy Space Center. This method, called Collaborative Software Development, enabled KSC to effectively leverage industrial software and build additional capabilities to meet shuttle system and operational requirements. Application of this method resulted in reduced time to market, reduced development cost, improved product quality, and improved programmer competence while developing technologies of benefit to a small company in California (AP Labs Inc.). Many modifications were made to the baseline software product (VMEwindow), which improved its quality and functionality. In addition, six new software capabilities were developed, which are the subject of this article and add useful functionality to the VMEwindow environment. These new software programs are written in C or VXWorks and are used in conjunction with other ground station software packages, such as VMEwindow, Matlab, Dataviews, and PVWave. The Space Shuttle Telemetry Ground Station receives frequency-modulation (FM) and pulse-code-modulated (PCM) signals from the shuttle and support equipment. The hardware architecture (see figure) includes Sun workstations connected to multiple PCM- and FM-processing VersaModule Eurocard (VME) chassis. A reflective memory network transports raw data from PCM Processors (PCMPs) to the programmable digital-to-analog (D/A) converters, strip chart recorders, and analysis and controller workstations.

  19. Kinect Fusion improvement using depth camera calibration

    NASA Astrophysics Data System (ADS)

    Pagliari, D.; Menna, F.; Roncella, R.; Remondino, F.; Pinto, L.

    2014-06-01

    3D modelling of scenes, gesture recognition and motion tracking are fields in rapid and continuous development which have caused growing demand for interactivity in the video-game and e-entertainment market. Starting from the idea of creating a sensor that allows users to play without having to hold any remote controller, the Microsoft Kinect device was created. The Kinect has always attracted researchers in different fields, from robotics to Computer Vision (CV) and biomedical engineering, as well as third-party communities that have released several Software Development Kit (SDK) versions for the Kinect in order to use it not only as a game device but as a measurement system. The Microsoft Kinect Fusion control libraries (first released in March 2013) allow using the device as a 3D scanner and producing meshed polygonal models of a static scene simply by moving the Kinect around. A drawback of this sensor is the geometric quality of the delivered data and its low repeatability. For this reason the authors carried out an investigation to evaluate the accuracy and repeatability of the depth measurements delivered by the Kinect. The paper presents a thorough calibration analysis of the Kinect imaging sensor, with the aim of establishing the accuracy and precision of the delivered information: a straightforward calibration of the depth sensor is presented and the 3D data are then corrected accordingly. By integrating the depth correction algorithm and correcting the interior and exterior orientation parameters of the IR camera, the Fusion libraries are corrected and a new reconstruction software is created to produce more accurate models.
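
    A minimal sketch of the depth-correction step is shown below, assuming the calibration has produced per-pixel gain and offset maps; the correction form and the numeric values are illustrative, not the calibration model derived in the paper.

        # Per-pixel linear depth correction d' = a*d + b, with gain/offset maps assumed to
        # come from a prior calibration (values below are illustrative only).
        import numpy as np

        def correct_depth(raw_depth_mm, gain, offset_mm):
            """Apply the correction independently at each pixel of the depth map."""
            return gain * raw_depth_mm + offset_mm

        rng = np.random.default_rng(0)
        raw = rng.uniform(800, 4000, size=(480, 640))   # raw Kinect depth map in millimetres
        gain = np.full((480, 640), 0.98)                # assumed per-pixel gain from calibration
        offset = np.full((480, 640), 15.0)              # assumed per-pixel offset in millimetres
        corrected = correct_depth(raw, gain, offset)
        print(f"mean correction: {float((corrected - raw).mean()):.1f} mm")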

  20. R-189 (C-620) air compressor control logic software documentation. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walter, K.E.

    1995-06-08

    This relates to FFTF plant air compressors. Purpose of this document is to provide an updated Computer Software Description for the software to be used on R-189 (C-620-C) air compressor programmable controllers. Logic software design changes were required to allow automatic starting of a compressor that had not been previously started.

  1. Implementation of workflow engine technology to deliver basic clinical decision support functionality

    PubMed Central

    2011-01-01

    Background Workflow engine technology represents a new class of software with the ability to graphically model step-based knowledge. We present application of this novel technology to the domain of clinical decision support. Successful implementation of decision support within an electronic health record (EHR) remains an unsolved research challenge. Previous research efforts were mostly based on healthcare-specific representation standards and execution engines and did not reach wide adoption. We focus on two challenges in decision support systems: the ability to test decision logic on retrospective data prior to prospective deployment and the challenge of user-friendly representation of clinical logic. Results We present our implementation of a workflow engine technology that addresses the two above-described challenges in delivering clinical decision support. Our system is based on a cross-industry standard of XML (extensible markup language) process definition language (XPDL). The core components of the system are a workflow editor for modeling clinical scenarios and a workflow engine for execution of those scenarios. We demonstrate, with an open-source and publicly available workflow suite, that clinical decision support logic can be executed on retrospective data. The same flowchart-based representation can also function in a prospective mode where the system can be integrated with an EHR system and respond to real-time clinical events. We limit the scope of our implementation to decision support content generation (which can be EHR system vendor independent). We do not focus on supporting complex decision support content delivery mechanisms due to lack of standardization of EHR systems in this area. We present results of our evaluation of the flowchart-based graphical notation as well as architectural evaluation of our implementation using an established evaluation framework for clinical decision support architecture. Conclusions We describe an implementation of a free workflow technology software suite (available at http://code.google.com/p/healthflow) and its application in the domain of clinical decision support. Our implementation seamlessly supports clinical logic testing on retrospective data and offers a user-friendly knowledge representation paradigm. With the presented software implementation, we demonstrate that workflow engine technology can provide a decision support platform which evaluates well against an established clinical decision support architecture evaluation framework. Due to cross-industry usage of workflow engine technology, we can expect significant future functionality enhancements that will further improve the technology's capacity to serve as a clinical decision support platform. PMID:21477364
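
    The point that the same flowchart logic can run retrospectively and prospectively can be sketched as follows; the rule, patient fields and event hook are hypothetical stand-ins for an XPDL workflow and an EHR interface.

        # Hand-rolled stand-in for an XPDL workflow step; rule and field names are hypothetical.
        def decision_logic(patient):
            """Flowchart-style rule: flag diabetic patients overdue for an HbA1c test."""
            if patient["diabetic"] and patient["months_since_hba1c"] > 6:
                return f"ALERT: order HbA1c for {patient['id']}"
            return None

        # Retrospective mode: replay stored encounters to test the logic before deployment.
        retrospective_db = [
            {"id": "p1", "diabetic": True, "months_since_hba1c": 9},
            {"id": "p2", "diabetic": False, "months_since_hba1c": 14},
        ]
        print([decision_logic(p) for p in retrospective_db])

        # Prospective mode: the same function is invoked as real-time events arrive from an EHR.
        def on_new_encounter(event):
            alert = decision_logic(event)
            if alert:
                print(alert)

        on_new_encounter({"id": "p3", "diabetic": True, "months_since_hba1c": 12})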

  2. Application of QC_DR software for acceptance testing and routine quality control of direct digital radiography systems: initial experiences using the Italian Association of Physicist in Medicine quality control protocol.

    PubMed

    Nitrosi, Andrea; Bertolini, Marco; Borasi, Giovanni; Botti, Andrea; Barani, Adriana; Rivetti, Stefano; Pierotti, Luisa

    2009-12-01

    Ideally, medical x-ray imaging systems should be designed to deliver maximum image quality at an acceptable radiation risk to the patient. Quality assurance procedures are employed to ensure that these standards are maintained. A quality control protocol for direct digital radiography (DDR) systems is described and discussed. Software to automatically process and analyze the required images was developed. In this paper, the initial results obtained on equipment from different DDR manufacturers are reported. The protocol was developed to highlight even small discrepancies in standard operating performance.

  3. Topography Analysis and Visualization Software Supports a Guided Comparative Planetology Education Exhibit at the Smithsonian's Air and Space Museum

    NASA Technical Reports Server (NTRS)

    Roark, J. H.; Masuoka, C. M.; Frey, H. V.; Keller, J.; Williams, S.

    2005-01-01

    The Planetary Geodynamics Laboratory (http://geodynamics.gsfc.nasa.gov) of NASA's Goddard Space Flight Center designed, produced and recently delivered a "museum-friendly" version of GRIDVIEW, a grid visualization and analysis application, to the Smithsonian's National Air and Space Museum where it will be used in a guided comparative planetology education exhibit. The software was designed to enable museum visitors to interact with the same Earth and Mars topographic data and tools typically used by planetary scientists, and experience the thrill of discovery while learning about the geologic differences between Earth and Mars.

  4. PHM Enabled Autonomous Propellant Loading Operations

    NASA Technical Reports Server (NTRS)

    Walker, Mark; Figueroa, Fernando

    2017-01-01

    The utility of Prognostics and Health Management (PHM) software capability applied to Autonomous Operations (AO) remains an active research area within aerospace applications. The ability to gain insight into which assets and subsystems are functioning properly, along with the derivation of confident predictions concerning future ability, reliability, and availability, are important enablers for making sound mission planning decisions. When coupled with software that fully supports mission planning and execution, an integrated solution can be developed that leverages state assessment and estimation for the purposes of delivering autonomous operations. The authors have been applying this integrated, model-based approach to the autonomous loading of cryogenic spacecraft propellants at Kennedy Space Center.

  5. Beyond the Renderer: Software Architecture for Parallel Graphics and Visualization

    NASA Technical Reports Server (NTRS)

    Crockett, Thomas W.

    1996-01-01

    As numerous implementations have demonstrated, software-based parallel rendering is an effective way to obtain the needed computational power for a variety of challenging applications in computer graphics and scientific visualization. To fully realize their potential, however, parallel renderers need to be integrated into a complete environment for generating, manipulating, and delivering visual data. We examine the structure and components of such an environment, including the programming and user interfaces, rendering engines, and image delivery systems. We consider some of the constraints imposed by real-world applications and discuss the problems and issues involved in bringing parallel rendering out of the lab and into production.

  6. Yleaf: Software for Human Y-Chromosomal Haplogroup Inference from Next-Generation Sequencing Data.

    PubMed

    Ralf, Arwin; Montiel González, Diego; Zhong, Kaiyin; Kayser, Manfred

    2018-05-01

    Next-generation sequencing (NGS) technologies offer immense possibilities given the large genomic data they simultaneously deliver. The human Y-chromosome serves as a good example of how NGS benefits various applications in evolution, anthropology, genealogy, and forensics. Prior to NGS, the Y-chromosome phylogenetic tree consisted of a few hundred branches; based on NGS data, it now contains many thousands. The complexity of both the Y tree and NGS data provides challenges for haplogroup assignment. For effective analysis and interpretation of Y-chromosome NGS data, we present Yleaf, a publicly available, automated, user-friendly software for high-resolution Y-chromosome haplogroup inference independent of library and sequencing methods.
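
    The core idea of tree-based haplogroup inference can be sketched as below: descend the Y tree while the sample shows the derived allele of each branch's defining marker. The tree, markers and alleles here are toy values, not Yleaf's reference data or scoring scheme.

        # Toy Y tree and defining markers; Yleaf's reference data and scoring are more elaborate.
        TREE = {"ROOT": ["A", "BT"], "BT": ["B", "CT"], "CT": ["DE", "CF"]}
        DEFINING_SNP = {"A": ("M91", "T"), "BT": ("M42", "T"), "B": ("M60", "C"),
                        "CT": ("M168", "T"), "DE": ("M145", "A"), "CF": ("P143", "A")}

        def assign_haplogroup(genotypes, node="ROOT"):
            """Descend into the child branch whose defining SNP shows the derived allele."""
            for child in TREE.get(node, []):
                marker, derived = DEFINING_SNP[child]
                if genotypes.get(marker) == derived:
                    return assign_haplogroup(genotypes, child)
            return node

        sample = {"M42": "T", "M168": "T", "P143": "A"}   # derived calls from NGS genotyping
        print(assign_haplogroup(sample))                  # -> CF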

  7. The Development of a Dynamic Geomagnetic Cutoff Rigidity Model for the International Space Station

    NASA Technical Reports Server (NTRS)

    Smart, D. F.; Shea, M. A.

    1999-01-01

    We have developed a computer model of geomagnetic vertical cutoffs applicable to the orbit of the International Space Station. This model accounts for the change in geomagnetic cutoff rigidity as a function of geomagnetic activity level. The model was delivered to NASA Johnson Space Center in July 1999 and tested on the Space Radiation Analysis Group DEC-Alpha computer system to ensure that it will properly interface with other software currently used at NASA JSC. The software was designed to be easily upgraded as improved models of geomagnetic cutoff as a function of magnetic activity are developed.

  8. Ethoscopes: An open platform for high-throughput ethomics.

    PubMed

    Geissmann, Quentin; Garcia Rodriguez, Luis; Beckwith, Esteban J; French, Alice S; Jamasb, Arian R; Gilestro, Giorgio F

    2017-10-01

    Here, we present the use of ethoscopes, which are machines for high-throughput analysis of behavior in Drosophila and other animals. Ethoscopes provide a software and hardware solution that is reproducible and easily scalable. They perform, in real-time, tracking and profiling of behavior by using a supervised machine learning algorithm, are able to deliver behaviorally triggered stimuli to flies in a feedback-loop mode, and are highly customizable and open source. Ethoscopes can be built easily by using 3D printing technology and rely on Raspberry Pi microcomputers and Arduino boards to provide affordable and flexible hardware. All software and construction specifications are available at http://lab.gilest.ro/ethoscope.
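
    The feedback-loop mode can be sketched as follows: positions are tracked in real time and a stimulus is triggered once a behavioral criterion is met. The thresholds and stimulus are illustrative; the actual implementation is available at the project URL above.

        # Closed-loop, behaviorally triggered stimulus; thresholds and sampling are illustrative.
        import random

        def immobile_for(track, seconds):
            """True if the fly moved less than 1 px over the last `seconds` samples (1 Hz)."""
            recent = track[-seconds:]
            return len(recent) == seconds and max(recent) - min(recent) < 1.0

        positions = []
        for t in range(30):
            positions.append(10.0 + random.uniform(-0.2, 0.2))   # stand-in for real-time tracking
            if immobile_for(positions, 20):
                print(f"t={t}s: deliver stimulus (e.g. rotate the tube via the Arduino board)")
                break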

  9. Increased User Satisfaction Through an Improved Message System

    NASA Technical Reports Server (NTRS)

    Weissert, C. L.

    1997-01-01

    With all of the enhancements in software methodology and testing, there is no guarantee that software can be delivered such that no user errors occur. How to handle these errors when they occur has become a major research topic within human-computer interaction (HCI). Users of the Multimission Spacecraft Analysis Subsystem (MSAS) at the Jet Propulsion Laboratory (JPL), a system of X and Motif graphical user interfaces for analyzing spacecraft data, complained about the lack of information about the error cause and suggested that recovery actions be included in the system error messages...The system was evaluated through usability surveys and was shown to be successful.

  10. From Here to Technology. How To Fund Hardware, Software, and More.

    ERIC Educational Resources Information Center

    Hunter, Barbara M.

    Faced with shrinking state and local tax support and an increased demand for K-12 educational reform, school leaders must use creative means to find money to improve and deliver instruction and services to their schools. This handbook describes innovative strategies that school leaders have used to find scarce dollars for purchasing educational…

  11. Fostering Cooperative Learning with Scrum in a Semi-Capstone Systems Analysis and Design Course

    ERIC Educational Resources Information Center

    Magana, Alejandra J.; Seah, Ying Ying; Thomas, Paul

    2018-01-01

    Agile methods such as Scrum that emphasize technical, communication, and teamwork skills have been practiced by IT professionals to effectively deliver software products of good quality. The same methods combined with pedagogies of engagement can potentially be used in the setting of higher education to promote effective group learning in software…

  12. A Comparison of Text, Voice, and Screencasting Feedback to Online Students

    ERIC Educational Resources Information Center

    Orlando, John

    2016-01-01

    The emergence of simple video and voice recording software has allowed faculty to deliver online course content in a variety of rich formats. But most faculty are still using traditional text comments for feedback to students. The author launched a study comparing student and faculty perceptions of text, voice, and screencasting feedback. The…

  13. Technological evaluation of gesture and speech interfaces for enabling dismounted soldier-robot dialogue

    NASA Astrophysics Data System (ADS)

    Kattoju, Ravi Kiran; Barber, Daniel J.; Abich, Julian; Harris, Jonathan

    2016-05-01

    With the increasing necessity for intuitive Soldier-robot communication in military operations and advancements in interactive technologies, autonomous robots have transitioned from assistance tools to functional and operational teammates able to service an array of military operations. Despite improvements in gesture and speech recognition technologies, their effectiveness in supporting Soldier-robot communication is still uncertain. The purpose of the present study was to evaluate the performance of gesture and speech interface technologies to facilitate Soldier-robot communication during a spatial-navigation task with an autonomous robot. Semantically based gesture and speech spatial-navigation commands leveraged existing lexicons for visual and verbal communication from the U.S. Army field manual for visual signaling and a previously established Squad Level Vocabulary (SLV). Speech commands were recorded by a lapel microphone and a Microsoft Kinect, and classified by commercial off-the-shelf automatic speech recognition (ASR) software. Visual signals were captured and classified using a custom wireless gesture glove and software. Participants in the experiment commanded a robot to complete a simulated ISR mission in a scaled-down urban scenario by delivering a sequence of gesture and speech commands, both individually and simultaneously, to the robot. Performance and reliability of the gesture and speech hardware interfaces and recognition tools were analyzed and reported. Analysis of the experimental results demonstrated that the employed gesture technology has significant potential for enabling bidirectional Soldier-robot team dialogue, based on the high classification accuracy and minimal training required to perform gesture commands.

  14. The Globus Galaxies Platform. Delivering Science Gateways as a Service

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Madduri, Ravi; Chard, Kyle; Chard, Ryan

    The use of public cloud computers to host sophisticated scientific data and software can transform scientific practice by enabling broad access to capabilities previously available only to the few. The primary obstacle to more widespread use of public clouds to host scientific software ('cloud-based science gateways') has thus far been the considerable gap between the specialized needs of science applications and the capabilities provided by cloud infrastructures. We describe here a domain-independent, cloud-based science gateway platform, the Globus Galaxies platform, which overcomes this gap by providing a set of hosted services that directly address the needs of science gateway developers. The design and implementation of this platform leverages our several years of experience with Globus Genomics, a cloud-based science gateway that has served more than 200 genomics researchers across 30 institutions. Building on that foundation, we have also implemented a platform that leverages the popular Galaxy system for application hosting and workflow execution; Globus services for data transfer, user and group management, and authentication; and a cost-aware elastic provisioning model specialized for public cloud resources. We describe here the capabilities and architecture of this platform, present six scientific domains in which we have successfully applied it, report on user experiences, and analyze the economics of our deployments. Published 2015. This article is a U.S. Government work and is in the public domain in the USA.

  15. Assistive Software for Disabled Learners

    ERIC Educational Resources Information Center

    Clark, Sharon; Baggaley, Jon

    2004-01-01

    Previous reports in this series (#32 and 36) have discussed online software features of value to disabled learners in distance education. The current report evaluates four specific assistive software products with useful features for visually and hearing impaired learners: "ATutor", "ACollab", "Natural Voice", and "Just Vanilla". The evaluative…

  16. THE VALIDITY OF USING ROC SOFTWARE FOR ANALYSING VISUAL GRADING CHARACTERISTICS DATA: AN INVESTIGATION BASED ON THE NOVEL SOFTWARE VGC ANALYZER.

    PubMed

    Hansson, Jonny; Månsson, Lars Gunnar; Båth, Magnus

    2016-06-01

    The purpose of the present work was to investigate the validity of using single-reader-adapted receiver operating characteristics (ROC) software for the analysis of visual grading characteristics (VGC) data. VGC data from four published VGC studies on optimisation of X-ray examinations, previously analysed using ROCFIT, were reanalysed using recently developed software dedicated to VGC analysis (VGC Analyzer), and the outcomes [the mean and 95% confidence interval (CI) of the area under the VGC curve (AUC_VGC) and the p-value] were compared. The studies included both paired and non-paired data and were reanalysed for both the fixed-reader and the random-reader situations. The results showed good agreement between the two programs for the mean AUC_VGC. For non-paired data, wider CIs were obtained with VGC Analyzer than previously reported, whereas for paired data the previously reported CIs were similar or even broader. Similar observations were made for the p-values. The results indicate that the use of single-reader-adapted ROC software such as ROCFIT for analysing non-paired VGC data may lead to an increased risk of committing Type I errors, especially in the random-reader situation. On the other hand, the use of ROC software for the analysis of paired VGC data may lead to an increased risk of committing Type II errors, especially in the fixed-reader situation.
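
    For readers unfamiliar with the measure, the area under the VGC curve can be estimated non-parametrically from the ordinal ratings of the two conditions, as in the toy sketch below; VGC Analyzer additionally provides resampling-based confidence intervals, which are omitted here.

        # Non-parametric estimate of the area under the VGC curve from ordinal ratings
        # (toy data; confidence intervals from resampling are omitted).
        import numpy as np

        def auc_vgc(ratings_ref, ratings_test):
            """AUC_VGC as the probability that a test-condition rating exceeds a reference
            rating, counting ties as one half (Mann-Whitney style estimate)."""
            ref = np.asarray(ratings_ref)[:, None]
            test = np.asarray(ratings_test)[None, :]
            wins = (test > ref).sum() + 0.5 * (test == ref).sum()
            return wins / (ref.size * test.size)

        ref_scores = [2, 3, 3, 4, 2, 3]    # image-quality ratings of the reference protocol (1-5)
        test_scores = [3, 4, 4, 5, 3, 4]   # ratings of the modified protocol
        print(f"AUC_VGC = {auc_vgc(ref_scores, test_scores):.2f}")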

  17. Viability of Cross-Flow Fan with Helical Blades for Vertical Take-off and Landing Aircraft

    DTIC Science & Technology

    2012-09-01

    Using the computational fluid dynamics (CFD) software ANSYS CFX, a three-dimensional (3-D) straight-bladed model was validated against a previous study's experimental results...

  18. A land-surface Testbed for EOSDIS

    NASA Technical Reports Server (NTRS)

    Emery, William; Kelley, Tim

    1994-01-01

    The main objective of the Testbed project was to deliver satellite images via the Internet to scientific and educational users free of charge. The main method of operations was to store satellite images on a low-cost tape library system, visually browse the raw satellite data, access the raw data files, navigate the imagery through 'C' programming and X-Windows interface software, and deliver the finished image to the end user over the Internet by means of file transfer protocol methods. The conclusion is that the distribution of satellite imagery by means of the Internet is feasible, and that the archiving of large data sets can be accomplished with low-cost storage systems allowing multiple users.

  19. Towards fast online intrafraction replanning for free-breathing stereotactic body radiation therapy with the MR-linac.

    PubMed

    Kontaxis, C; Bol, G H; Stemkens, B; Glitzner, M; Prins, F M; Kerkmeijer, L G W; Lagendijk, J J W; Raaymakers, B W

    2017-08-21

    The hybrid MRI-radiotherapy machines, like the MR-linac (Elekta AB, Stockholm, Sweden) installed at the UMC Utrecht (Utrecht, The Netherlands), will be able to provide real-time patient imaging during treatment. In order to take advantage of the system's capabilities and enable online adaptive treatments, a new generation of software should be developed, ranging from motion estimation to treatment plan adaptation. In this work we present a proof of principle adaptive pipeline designed for high precision stereotactic body radiation therapy (SBRT) suitable for sites affected by respiratory motion, like renal cell carcinoma (RCC). We utilized our research MRL treatment planning system (MRLTP) to simulate a single fraction 25 Gy free-breathing SBRT treatment for RCC by performing inter-beam replanning for two patients and one volunteer. The simulated pipeline included a combination of (pre-beam) 4D-MRI and (online) 2D cine-MR acquisitions. The 4DMRI was used to generate the mid-position reference volume, while the cine-MRI, via an in-house motion model, provided three-dimensional (3D) deformable vector fields (DVFs) describing the anatomical changes during treatment. During the treatment fraction, at an inter-beam interval, the mid-position volume of the patient was updated and the delivered dose was accurately reconstructed on the underlying motion calculated by the model. Fast online replanning, targeting the latest anatomy and incorporating the previously delivered dose was then simulated with MRLTP. The adaptive treatment was compared to a conventional mid-position SBRT plan with a 3 mm planning target volume margin reconstructed on the same motion trace. We demonstrate that our system produced tighter dose distributions and thus spared the healthy tissue, while delivering more dose to the target. The pipeline was able to account for baseline variations/drifts that occurred during treatment ensuring target coverage at the end of the treatment fraction.

  20. Towards fast online intrafraction replanning for free-breathing stereotactic body radiation therapy with the MR-linac

    NASA Astrophysics Data System (ADS)

    Kontaxis, C.; Bol, G. H.; Stemkens, B.; Glitzner, M.; Prins, F. M.; Kerkmeijer, L. G. W.; Lagendijk, J. J. W.; Raaymakers, B. W.

    2017-09-01

    The hybrid MRI-radiotherapy machines, like the MR-linac (Elekta AB, Stockholm, Sweden) installed at the UMC Utrecht (Utrecht, The Netherlands), will be able to provide real-time patient imaging during treatment. In order to take advantage of the system’s capabilities and enable online adaptive treatments, a new generation of software should be developed, ranging from motion estimation to treatment plan adaptation. In this work we present a proof of principle adaptive pipeline designed for high precision stereotactic body radiation therapy (SBRT) suitable for sites affected by respiratory motion, like renal cell carcinoma (RCC). We utilized our research MRL treatment planning system (MRLTP) to simulate a single fraction 25 Gy free-breathing SBRT treatment for RCC by performing inter-beam replanning for two patients and one volunteer. The simulated pipeline included a combination of (pre-beam) 4D-MRI and (online) 2D cine-MR acquisitions. The 4DMRI was used to generate the mid-position reference volume, while the cine-MRI, via an in-house motion model, provided three-dimensional (3D) deformable vector fields (DVFs) describing the anatomical changes during treatment. During the treatment fraction, at an inter-beam interval, the mid-position volume of the patient was updated and the delivered dose was accurately reconstructed on the underlying motion calculated by the model. Fast online replanning, targeting the latest anatomy and incorporating the previously delivered dose was then simulated with MRLTP. The adaptive treatment was compared to a conventional mid-position SBRT plan with a 3 mm planning target volume margin reconstructed on the same motion trace. We demonstrate that our system produced tighter dose distributions and thus spared the healthy tissue, while delivering more dose to the target. The pipeline was able to account for baseline variations/drifts that occurred during treatment ensuring target coverage at the end of the treatment fraction.

  1. Knowledgeable Neighbors: a mobile clinic model for disease prevention and screening in underserved communities.

    PubMed

    Hill, Caterina; Zurakowski, David; Bennet, Jennifer; Walker-White, Rainelle; Osman, Jamie L; Quarles, Aaron; Oriol, Nancy

    2012-03-01

    The Family Van mobile health clinic uses a "Knowledgeable Neighbor" model to deliver cost-effective screening and prevention activities in underserved neighborhoods in Boston, MA. We have described the Knowledgeable Neighbor model and used operational data collected from 2006 to 2009 to evaluate the service. The Family Van successfully reached mainly minority low-income men and women. Of the clients screened, 60% had previously undetected elevated blood pressure, 14% had previously undetected elevated blood glucose, and 38% had previously undetected elevated total cholesterol. This represents an important model for reaching underserved communities to deliver proven cost-effective prevention activities, both to help control health care costs and to reduce health disparities.

  2. The Consumer Juggernaut: Web-Based and Mobile Applications as Innovation Pioneer

    NASA Astrophysics Data System (ADS)

    Messerschmitt, David G.

    As happened previously in electronics, software targeted at consumers is increasingly the focus of investment and innovation. Some of the areas where it is leading are animated interfaces, treating users as a community, audio and video information, software as a service, agile software development, and the integration of business models with software design. As a risk-taking and experimental market, and as a source of ideas, consumer software can benefit other areas of application software. The influence of consumer software can be magnified by research into the internal organizations and processes of the innovative firms at its foundation.

  3. Writing references and using citation management software.

    PubMed

    Sungur, Mukadder Orhan; Seyhan, Tülay Özkan

    2013-09-01

    The correct citation of references is obligatory to gain scientific credibility, to honor the original ideas of previous authors and to avoid plagiarism. Currently, researchers can easily find, cite and store references using citation management software. In this review, two popular citation management software programs (EndNote and Mendeley) are summarized.

  4. Application of Real Options Theory to Software Engineering for Strategic Decision Making in Software Related Capital Investments

    DTIC Science & Technology

    2008-12-01

    between our current project and the historical projects. Therefore to refine the historical volatility estimate of the previously completed software... historical volatility estimates obtained in the form of beliefs and plausibility based on subjective probabilities that take into consideration unique

  5. 77 FR 31758 - Airworthiness Directives; the Boeing Company Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-30

    .... That NPRM proposed to inspect for part numbers of the operational program software of the flight... operational program software (OPS) of the flight control computers (FCC), and doing corrective actions if... previous NPRM (75 FR 57885, September 23, 2010), we have determined that the software installation required...

  6. Validation of highly reliable, real-time knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.

    1988-01-01

    Knowledge-based systems have the potential to greatly increase the capabilities of future aircraft and spacecraft and to significantly reduce support manpower needed for the space station and other space missions. However, a credible validation methodology must be developed before knowledge-based systems can be used for life- or mission-critical applications. Experience with conventional software has shown that the use of good software engineering techniques and static analysis tools can greatly reduce the time needed for testing and simulation of a system. Since exhaustive testing is infeasible, reliability must be built into the software during the design and implementation phases. Unfortunately, many of the software engineering techniques and tools used for conventional software are of little use in the development of knowledge-based systems. Therefore, research at Langley is focused on developing a set of guidelines, methods, and prototype validation tools for building highly reliable, knowledge-based systems. The use of a comprehensive methodology for building highly reliable, knowledge-based systems should significantly decrease the time needed for testing and simulation. A proven record of delivering reliable systems at the beginning of the highly visible testing and simulation phases is crucial to the acceptance of knowledge-based systems in critical applications.

  7. NASA Tech Briefs, April 2003

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Topics include: Tool for Bending a Metal Tube Precisely in a Confined Space; Multiple-Use Mechanisms for Attachment to Seat Tracks; Force-Measuring Clamps; Cellular Pressure-Actuated Joint; Block QCA Fault-Tolerant Logic Gates; Hybrid VLSI/QCA Architecture for Computing FFTs; Arrays of Carbon Nanotubes as RF Filters in Waveguides; Carbon Nanotubes as Resonators for RF Spectrum Analyzers; Software for Viewing Landsat Mosaic Images; Updated Integrated Mission Program; Software for Sharing and Management of Information; Optical-Quality Thin Polymer Membranes; Rollable Thin Shell Composite-Material Paraboloidal Mirrors; Folded Resonant Horns for Power Ultrasonic Applications; Touchdown Ball-Bearing System for Magnetic Bearings; Flux-Based Deadbeat Control of Induction-Motor Torque; Block Copolymers as Templates for Arrays of Carbon Nanotubes; Throttling Cryogen Boiloff To Control Cryostat Temperature; Collaborative Software Development Approach Used to Deliver the New Shuttle Telemetry Ground Station; Turbulence in Supercritical O2/H2 and C7H16/N2 Mixing Layers; and Time-Resolved Measurements in Optoelectronic Microbioanal.

  8. The GeoViz Toolkit: Using component-oriented coordination methods for geographic visualization and analysis

    PubMed Central

    Hardisty, Frank; Robinson, Anthony C.

    2010-01-01

    In this paper we present the GeoViz Toolkit, an open-source, internet-delivered program for geographic visualization and analysis that features a diverse set of software components which can be flexibly combined by users who do not have programming expertise. The design and architecture of the GeoViz Toolkit allows us to address three key research challenges in geovisualization: allowing end users to create their own geovisualization and analysis component set on-the-fly, integrating geovisualization methods with spatial analysis methods, and making geovisualization applications sharable between users. Each of these tasks necessitates a robust yet flexible approach to inter-tool coordination. The coordination strategy we developed for the GeoViz Toolkit, called Introspective Observer Coordination, leverages and combines key advances in software engineering from the last decade: automatic introspection of objects, software design patterns, and reflective invocation of methods. PMID:21731423
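
    A simplified, hypothetical analogue of the coordination strategy is sketched below in Python (the GeoViz Toolkit itself is written in Java): components are introspected for matching selection-broadcast and selection-receive methods and wired together reflectively, without bespoke glue code.

        # Reflection-based wiring of visualization components, in the spirit of
        # Introspective Observer Coordination (class and method names are hypothetical).
        class MapView:
            """Component that owns a selection and can broadcast changes to observers."""
            def __init__(self):
                self.listeners = []
                self.selection = set()
            def add_selection_listener(self, fn):
                self.listeners.append(fn)
            def set_selection(self, ids):
                self.selection = set(ids)
                for fn in self.listeners:
                    fn(self.selection)

        class ScatterPlot:
            """Component that can receive a selection and highlight it."""
            def set_selection(self, ids):
                print(f"ScatterPlot highlights {sorted(ids)}")

        def coordinate(source, target, prop="selection"):
            """Introspect both objects; if the source broadcasts `prop` and the target
            accepts it, register the target as an observer via reflective lookup."""
            add = getattr(source, f"add_{prop}_listener", None)
            setter = getattr(target, f"set_{prop}", None)
            if callable(add) and callable(setter):
                add(setter)

        m, s = MapView(), ScatterPlot()
        coordinate(m, s)
        m.set_selection([3, 7, 11])   # the scatter plot reacts without bespoke wiring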

  9. Project Management Software for Distributed Industrial Companies

    NASA Astrophysics Data System (ADS)

    Dobrojević, M.; Medjo, B.; Rakin, M.; Sedmak, A.

    This paper gives an overview of the development of a new software solution for project management, intended mainly for use in an industrial environment. The main concern of the proposed solution is its application in everyday engineering practice in various, mainly distributed, industrial companies. With this in mind, special care has been devoted to developing appropriate tools for tracking, storing and analyzing information about a project and delivering it in time to the right team members or other responsible persons. The proposed solution is Internet-based and uses the LAMP/WAMP (Linux or Windows - Apache - MySQL - PHP) platform because of its stability, versatility, open-source technology and simple maintenance. The modular structure of the software makes it easy to customize according to client-specific needs, with a very short implementation period. Its main advantages are simple usage, quick implementation, easy system maintenance, and the short training and only basic computer skills needed for operators.

  10. L2 Learners' Engagement with High Stakes Listening Tests: Does Technology Have a Beneficial Role to Play?

    ERIC Educational Resources Information Center

    East, Martin; King, Chris

    2012-01-01

    In the listening component of the IELTS examination candidates hear the input once, delivered at "normal" speed. This format for listening can be problematic for test takers who often perceive normal speed input to be too fast for effective comprehension. The study reported here investigated whether using computer software to slow down…

  11. The CEBAF Element Database and Related Operational Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larrieu, Theodore; Slominski, Christopher; Keesee, Marie

    The newly commissioned 12GeV CEBAF accelerator relies on a flexible, scalable and comprehensive database to define the accelerator. This database delivers the configuration for CEBAF operational tools, including hardware checkout, the downloadable optics model, control screens, and much more. The presentation will describe the flexible design of the CEBAF Element Database (CED), its features and assorted use case examples.

  12. Computer-generated graphical presentations: use of multimedia to enhance communication.

    PubMed

    Marks, L S; Penson, D F; Maller, J J; Nielsen, R T; deKernion, J B

    1997-01-01

    Personal computers may be used to create, store, and deliver graphical presentations. With computer-generated combinations of the five media (text, images, sound, video, and animation)--that is, multimedia presentations--the effectiveness of message delivery can be greatly increased. The basic tools are (1) a personal computer; (2) presentation software; and (3) a projector to enlarge the monitor images for audience viewing. Use of this new method has grown rapidly in the business-conference world, but has yet to gain widespread acceptance at medical meetings. We review herein the rationale for multimedia presentations in medicine (vis-à-vis traditional slide shows) as an improved means for increasing audience attention, comprehension, and retention. The evolution of multimedia is traced from earliest times to the present. The steps involved in making a multimedia presentation are summarized, emphasizing advances in technology that bring the new method within practical reach of busy physicians. Specific attention is given to software, digital image processing, storage devices, and delivery methods. Our development of a urology multimedia presentation--delivered May 4, 1996, before the Society for Urology and Engineering and now Internet-accessible at http://www.usrf.org--was the impetus for this work.

  13. Hybrid architecture for building secure sensor networks

    NASA Astrophysics Data System (ADS)

    Owens, Ken R., Jr.; Watkins, Steve E.

    2012-04-01

    Sensor networks have various communication and security architectural concerns. Three approaches are defined to address these concerns for sensor networks. The first area is the utilization of new computing architectures that leverage embedded virtualization software on the sensor. Deploying a small, embedded virtualization operating system on the sensor nodes that is designed to communicate to low-cost cloud computing infrastructure in the network is the foundation to delivering low-cost, secure sensor networks. The second area focuses on securing the sensor. Sensor security components include developing an identification scheme, and leveraging authentication algorithms and protocols that address security assurance within the physical, communication network, and application layers. This function will primarily be accomplished through encrypting the communication channel and integrating sensor network firewall and intrusion detection/prevention components to the sensor network architecture. Hence, sensor networks will be able to maintain high levels of security. The third area addresses the real-time and high priority nature of the data that sensor networks collect. This function requires that a quality-of-service (QoS) definition and algorithm be developed for delivering the right data at the right time. A hybrid architecture is proposed that combines software and hardware features to handle network traffic with diverse QoS requirements.
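
    The QoS requirement can be illustrated with a small priority-queue sketch in which high-priority sensor events are always delivered before routine telemetry; the policy shown is illustrative only, not the algorithm proposed in the paper.

        # Priority-queue delivery policy: high-priority events are forwarded before telemetry.
        import heapq
        import itertools

        class QoSQueue:
            def __init__(self):
                self._heap = []
                self._counter = itertools.count()   # preserves FIFO order within a priority
            def push(self, priority, reading):
                # Lower number = higher priority.
                heapq.heappush(self._heap, (priority, next(self._counter), reading))
            def pop(self):
                return heapq.heappop(self._heap)[2]

        q = QoSQueue()
        q.push(2, "temperature=21.3C")        # routine telemetry
        q.push(0, "intrusion_alarm=door_3")   # real-time, high-priority event
        q.push(1, "battery=67%")
        print([q.pop() for _ in range(3)])    # the alarm is delivered first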

  14. AccessMRS: integrating OpenMRS with smart forms on Android.

    PubMed

    Fazen, Louis E; Chemwolo, Benjamin T; Songok, Julia J; Ruhl, Laura J; Kipkoech, Carolyne; Green, James M; Ikemeri, Justus E; Christoffersen-Deb, Astrid

    2013-01-01

    We present a new open-source Android application, AccessMRS, for interfacing with an electronic medical record system (OpenMRS) and loading 'Smart Forms' on a mobile device. AccessMRS functions as a patient-centered interface for viewing OpenMRS data; managing patient information in reminders, task lists, and previous encounters; and launching patient-specific 'Smart Forms' for electronic data collection and dissemination of health information. We present AccessMRS in the context of related software applications we developed to serve Community Health Workers, including AccessInfo, AccessAdmin, AccessMaps, and AccessForms. The specific features and design of AccessMRS are detailed in relationship to the requirements that drove development: the workflows of the Kenyan Ministry of Health Community Health Volunteers (CHVs) supported by the AMPATH Primary Health Care Program. Specifically, AccessMRS was designed to improve the quality of community-based Maternal and Child Health services delivered by CHVs in Kosirai Division. AccessMRS is currently in use by more than 80 CHVs in Kenya and undergoing formal assessment of acceptability, effectiveness, and cost.

  15. IAC level "O" program development

    NASA Technical Reports Server (NTRS)

    Vos, R. G.

    1982-01-01

    The current status of the IAC development activity is summarized. The listed prototype software and documentation were delivered, and detailed plans were made for development of the level 1 operational system. The planned end-product IAC is required to support LSST design analysis and performance evaluation, with emphasis on the coupling of the required technical disciplines. The long-term IAC effectively provides two distinct features: a specific set of analysis modules (thermal, structural, controls, antenna radiation performance and instrument optical performance) that will function together with the IAC supporting software in an integrated and user-friendly manner; and a general framework whereby new analysis modules can readily be incorporated into the IAC or be allowed to communicate with it.

  16. Moving base Gravity Gradiometer Survey System (GGSS) program

    NASA Astrophysics Data System (ADS)

    Pfohl, Louis; Rusnak, Walter; Jircitano, Albert; Grierson, Andrew

    1988-04-01

    The GGSS program began in early 1983 with the objective of delivering a landmobile and airborne system capable of fast, accurate, and economical gravity gradient surveys of large areas anywhere in the world. The objective included the development and use of post-mission data reduction software to process the survey data into solutions for the gravity disturbance vector components (north, east and vertical). This document describes the GGSS equipment hardware and software, integration and lab test procedures and results, and airborne and land survey procedures and results. Included are discussions on test strategies, post-mission data reduction algorithms, and the data reduction processing experience. Perspectives and conclusions are drawn from the results.

  17. DART, a platform for the creation and registration of cone beam digital tomosynthesis datasets.

    PubMed

    Sarkar, Vikren; Shi, Chengyu; Papanikolaou, Niko

    2011-04-01

    Digital tomosynthesis is an imaging modality that allows for tomographic reconstructions using only a fraction of the images needed for CT reconstruction. Since it offers the advantages of tomographic images with a smaller imaging dose delivered to the patient, the technique offers much promise for use in patient positioning prior to radiation delivery. This paper describes a software environment developed to help in the creation of digital tomosynthesis image sets from digital portal images using three different reconstruction algorithms. The software then allows for use of the tomograms for patient positioning or for dose recalculation if shifts are not applied, possibly as part of an adaptive radiotherapy regimen.
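
    As an illustration of the kind of algorithm such a platform hosts, the sketch below implements shift-and-add, one of the simplest tomosynthesis reconstruction methods, on a toy one-dimensional detector; DART's actual algorithms and geometry handling are described in the paper.

        # Shift-and-add tomosynthesis on a toy 1D detector: structures in the selected plane
        # are aligned by the shifts and reinforce, off-plane structures blur out.
        import numpy as np

        def shift_and_add(projections, shifts_px):
            """Reconstruct one plane by shifting each projection and averaging."""
            plane = np.zeros_like(projections[0], dtype=float)
            for proj, shift in zip(projections, shifts_px):
                plane += np.roll(proj, shift)
            return plane / len(projections)

        # Three toy projections of a point object seen from different gantry angles.
        p = [np.zeros(16) for _ in range(3)]
        p[0][7], p[1][8], p[2][9] = 1.0, 1.0, 1.0
        print(shift_and_add(p, shifts_px=[1, 0, -1]))   # the point refocuses at index 8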

  18. Ethoscopes: An open platform for high-throughput ethomics

    PubMed Central

    Geissmann, Quentin; Garcia Rodriguez, Luis; Beckwith, Esteban J.; French, Alice S.; Jamasb, Arian R.

    2017-01-01

    Here, we present the use of ethoscopes, which are machines for high-throughput analysis of behavior in Drosophila and other animals. Ethoscopes provide a software and hardware solution that is reproducible and easily scalable. They perform, in real-time, tracking and profiling of behavior by using a supervised machine learning algorithm, are able to deliver behaviorally triggered stimuli to flies in a feedback-loop mode, and are highly customizable and open source. Ethoscopes can be built easily by using 3D printing technology and rely on Raspberry Pi microcomputers and Arduino boards to provide affordable and flexible hardware. All software and construction specifications are available at http://lab.gilest.ro/ethoscope. PMID:29049280

  19. TU-H-CAMPUS-JeP2-04: Deriving Delivered Doses to Assess the Viability of 2.5 Mm Margins in Head and Neck SBRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, S; Shang, Q; Godley, A

    Purpose: To calculate the delivered dose for head and neck SBRT patients using pre-treatment images. This delivered dose was then used to determine the viability of 2.5 mm margins. Methods: Daily cone beam CTs (CBCTs) were collected for 20 patients along with a planning CT, planned dose, and planning structures. The day 1 CBCT was aligned to the planning CT using the treatment shifts (six degrees of freedom) and then the dose and contours were transferred to the CBCT. The day 1 CBCT becomes the reference image for days 2–5. The day 2–5 CBCTs were also aligned to the planning CT using the treatment shifts given and the dose transferred. The day 2–5 CBCTs were then deformably registered to the day 1 CBCT. The doses delivered on days 2–5 were then deformed to the day 1 CBCT where they could be accumulated. This was achieved with MIM 6.5.1 (MIM Software, Cleveland OH). The accumulated doses for the 20 patients were evaluated against the planned doses using the initial planning criteria as points of comparison. Results: The delivered CTV dose conformed to the planned 98.6% coverage, with an average decrease of 2.6% between planned and delivered coverage. This implies the 2.5 mm margin was sufficient. Larger CTVs correlated to smaller differences between planned and delivered coverage. Delivered dose to critical structures including the spinal cord, mandible, brain, brainstem, and larynx was acceptable, with differences between planned and delivered max dose <5% on average. Similarly for the parotid glands, globes, cochlear, optic nerve, lens, and submandibular glands, differences between planned and delivered doses were generally <5%. Conclusion: The 2.5 mm margin provided acceptable CTV coverage, adequately accounting for setup errors. Organ at risk sparing was also satisfactory. Small tumor volumes (<20 cc) may require a larger margin to treat effectively.
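
    The dose-accumulation step can be sketched in one dimension: each fraction's dose grid is warped onto the day 1 reference anatomy with that day's deformable vector field and the warped grids are summed. The warping scheme and numbers are illustrative; the study performed this in 3D with MIM.

        # Toy 1D dose accumulation: warp each fraction's dose to the day 1 grid with that
        # day's DVF (nearest-neighbour pull-back) and sum the warped grids.
        import numpy as np

        def warp_to_reference(dose, dvf_vox):
            """Reference voxel i receives the dose found at i + dvf[i] on the daily grid."""
            idx = np.clip(np.arange(dose.size) + np.round(dvf_vox).astype(int), 0, dose.size - 1)
            return dose[idx]

        daily_doses = [np.array([0.0, 1.0, 4.0, 1.0, 0.0]) for _ in range(5)]         # Gy per fraction
        daily_dvfs = [np.zeros(5), np.ones(5), -np.ones(5), np.zeros(5), np.ones(5)]  # voxel shifts
        accumulated = sum(warp_to_reference(d, v) for d, v in zip(daily_doses, daily_dvfs))
        print(accumulated)   # delivered dose expressed on the day 1 reference grid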

  20. Introducing the CUAHSI Hydrologic Information System Desktop Application (HydroDesktop) and Open Development Community

    NASA Astrophysics Data System (ADS)

    Ames, D.; Kadlec, J.; Horsburgh, J. S.; Maidment, D. R.

    2009-12-01

    The Consortium of Universities for the Advancement of Hydrologic Sciences (CUAHSI) Hydrologic Information System (HIS) project includes extensive development of data storage and delivery tools and standards including WaterML (a language for sharing hydrologic data sets via web services) and HIS Server (a software tool set for delivering WaterML from a server). These and other CUAHSI HIS tools have been under development and deployment for several years and together present a relatively complete software “stack” to support the consistent storage and delivery of hydrologic and other environmental observation data. This presentation describes the development of a new HIS software tool called “HydroDesktop” and the development of an online open source software development community to update and maintain the software. HydroDesktop is a local (i.e. not server-based) client side software tool that ultimately will run on multiple operating systems and will provide a highly usable level of access to HIS services. The software provides many key capabilities including data query, map-based visualization, data download, local data maintenance, editing, graphing, data export to selected model-specific data formats, linkage with integrated modeling systems such as OpenMI, and ultimately upload to HIS servers from the local desktop software. As the software is presently in the early stages of development, this presentation will focus on design approach and paradigm and is viewed as an opportunity to encourage participation in the open development community. Indeed, recognizing the value of community based code development as a means of ensuring end-user adoption, this project has adopted an “iterative” or “spiral” software development approach which will be described in this presentation.

  1. Writing references and using citation management software

    PubMed Central

    Sungur, Mukadder Orhan; Seyhan, Tülay Özkan

    2013-01-01

    The correct citation of references is obligatory to gain scientific credibility, to honor the original ideas of previous authors and to avoid plagiarism. Currently, researchers can easily find, cite and store references using citation management software. In this review, two popular citation management software programs (EndNote and Mendeley) are summarized. PMID:26328132

  2. Extensive Evaluation of Using a Game Project in a Software Architecture Course

    ERIC Educational Resources Information Center

    Wang, Alf Inge

    2011-01-01

    This article describes an extensive evaluation of introducing a game project to a software architecture course. In this project, university students have to construct and design a type of software architecture, evaluate the architecture, implement an application based on the architecture, and test this implementation. In previous years, the domain…

  3. Implementing Kanban for agile process management within the ALMA Software Operations Group

    NASA Astrophysics Data System (ADS)

    Reveco, Johnny; Mora, Matias; Shen, Tzu-Chiang; Soto, Ruben; Sepulveda, Jorge; Ibsen, Jorge

    2014-07-01

    After the inauguration of the Atacama Large Millimeter/submillimeter Array (ALMA), the Software Operations Group in Chile has refocused its objectives to: (1) providing software support to tasks related to System Integration, Scientific Commissioning and Verification, as well as Early Science observations; (2) testing the remaining software features, still under development by the Integrated Computing Team across the world; and (3) designing and developing processes to optimize and increase the level of automation of operational tasks. Because of their different stakeholders, these tasks vary widely in importance, lifespan, and complexity. Aiming to provide the proper priority and traceability for every task without stressing our engineers, we introduced the Kanban methodology into our processes in order to balance the demand on the team against the throughput of the delivered work. The aim of this paper is to share experiences gained during the implementation of Kanban in our processes, describing the difficulties we found and the solutions and adaptations that led us to our current but still evolving implementation, which has greatly improved our throughput, prioritization, and problem traceability.

  4. VLBI Analysis with the Multi-Technique Software GEOSAT

    NASA Technical Reports Server (NTRS)

    Kierulf, Halfdan Pascal; Andersen, Per-Helge; Boeckmann, Sarah; Kristiansen, Oddgeir

    2010-01-01

    GEOSAT is a multi-technique geodetic analysis software package developed at Forsvarets Forsknings Institutt (Norwegian defense research establishment). The Norwegian Mapping Authority has now installed the software and has, together with Forsvarets Forsknings Institutt, adapted the software to deliver datum-free normal equation systems in SINEX format. The goal is to be accepted as an IVS Associate Analysis Center and to provide contributions to the IVS EOP combination on a routine basis. GEOSAT is based on an upper diagonal factorized Kalman filter which allows estimation of time-variable parameters like the troposphere and clocks as stochastic parameters. The tropospheric delays in various directions are mapped to a tropospheric zenith delay using ray-tracing. Meteorological data from ECMWF with a resolution of six hours are used to perform the ray-tracing, which depends on both elevation and azimuth. Other models follow the IERS and IVS conventions. The Norwegian Mapping Authority has submitted test SINEX files produced with GEOSAT to IVS. The results have been compared with the existing IVS combined products. In this paper the outcome of these comparisons is presented.
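
    As a rough illustration of the delay-mapping step, slant tropospheric delays observed at elevation e are commonly related to a single zenith delay through a mapping function. The sketch below uses the crude 1/sin(e) approximation and a least-squares fit, which is an assumption for illustration only; GEOSAT itself uses azimuth- and elevation-dependent ray-tracing through ECMWF fields.

    ```python
    # Hedged sketch: estimate a tropospheric zenith delay from slant delays using
    # the simple 1/sin(elevation) mapping function (illustrative only; GEOSAT uses
    # ray-tracing through six-hourly ECMWF meteorological fields).
    import numpy as np

    def estimate_zenith_delay(slant_delays_m, elevations_deg):
        e = np.radians(np.asarray(elevations_deg, dtype=float))
        m = 1.0 / np.sin(e)                      # simple mapping function
        d = np.asarray(slant_delays_m, dtype=float)
        # Least-squares fit of d ~ m * ztd for a single zenith delay ztd.
        return float(np.dot(m, d) / np.dot(m, m))

    # Example with illustrative numbers: three slant delays at different elevations.
    print(estimate_zenith_delay([4.8, 3.3, 2.5], [30.0, 45.0, 70.0]))
    ```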

  5. Mobile medical computing driven by the complexity of neurologic diagnosis.

    PubMed

    Segal, Michael M

    2006-07-01

    Medical computing has been split between palm-sized computers optimized for mobility and desktop computers optimized for capability. This split was due to technology too immature to deliver both mobility and capability in the same computer and the lack of medical software that demanded both mobility and capability. Advances in hardware and software are ushering in an era in which fully capable computers will be available ubiquitously. As a result, medical practice, education and publishing will change. Medical practice will be improved by the use of software that not only assists with diagnosis but can do so at the bedside, where the doctor can act immediately upon suggestions such as useful findings to check. Medical education will shift away from a focus on details of unusual diseases and toward a focus on skills of physical examination and using computerized tools. Medical publishing, in contrast, will shift toward greater detail: it will be increasingly important to quantitate the frequency of findings in diseases and their time course since such information can have a major impact clinically when added to decision support software.

  6. Automated registration of large deformations for adaptive radiation therapy of prostate cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Godley, Andrew; Ahunbay, Ergun; Peng Cheng

    2009-04-15

    Available deformable registration methods are often inaccurate over the large organ variations encountered, for example, in the rectum and bladder. The authors developed a novel approach to accurately and effectively register large deformations in the prostate region for adaptive radiation therapy. A software tool combining a fast symmetric demons algorithm and the use of masks was developed in C++ based on ITK libraries to register CT images acquired at planning and before treatment fractions. The deformation field determined was subsequently used to deform the delivered dose to match the anatomy of the planning CT. The large deformations involved required that the bladder and rectum volumes be masked with uniform intensities of -1000 and 1000 HU, respectively, in both the planning and treatment CTs. The tool was tested for five prostate IGRT patients. The average rectum planning-to-treatment contour overlap improved from 67% to 93%; the lowest initial overlap was 43%. The average bladder overlap improved from 83% to 98%, with a lowest initial overlap of 60%. Registration regions were set to include a volume receiving 4% of the maximum dose. The average region was 320x210x63, taking approximately 9 min to register on a dual 2.8 GHz Linux system. The prostate and seminal vesicles were correctly placed even though they were not masked. The accumulated doses for multiple fractions with large deformation were computed and verified. The tool developed can effectively supply the previously delivered dose for adaptive planning to correct for interfractional changes.
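
    The key preprocessing step described above, overwriting the bladder and rectum volumes with uniform intensities before deformable registration, can be sketched as follows; SimpleITK is assumed purely for illustration (the authors' tool was written in C++ on ITK), and the file names are hypothetical.

    ```python
    # Hedged sketch of the masking step: overwrite bladder with -1000 HU and rectum
    # with 1000 HU before symmetric demons registration. SimpleITK is assumed for
    # illustration; the original tool was C++/ITK and file names are hypothetical.
    import SimpleITK as sitk

    def load_masked_ct(ct_path, bladder_path, rectum_path):
        """Return the CT with the bladder set to -1000 HU and the rectum to 1000 HU."""
        ct = sitk.Cast(sitk.ReadImage(ct_path), sitk.sitkFloat32)
        bladder = sitk.Cast(sitk.ReadImage(bladder_path) > 0, sitk.sitkFloat32)
        rectum = sitk.Cast(sitk.ReadImage(rectum_path) > 0, sitk.sitkFloat32)
        keep = 1.0 - sitk.Cast((bladder + rectum) > 0, sitk.sitkFloat32)
        return ct * keep + bladder * (-1000.0) + rectum * 1000.0

    # Masks are drawn separately on the planning and treatment CTs.
    fixed = load_masked_ct("planning_ct.nii.gz", "bladder_plan.nii.gz", "rectum_plan.nii.gz")
    moving = load_masked_ct("treatment_ct.nii.gz", "bladder_tx.nii.gz", "rectum_tx.nii.gz")

    demons = sitk.FastSymmetricForcesDemonsRegistrationFilter()
    demons.SetNumberOfIterations(200)
    displacement_field = demons.Execute(fixed, moving)
    ```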

  7. Knowledgeable Neighbors: A Mobile Clinic Model for Disease Prevention and Screening in Underserved Communities

    PubMed Central

    Zurakowski, David; Bennet, Jennifer; Walker-White, Rainelle; Osman, Jamie L.; Quarles, Aaron; Oriol, Nancy

    2012-01-01

    The Family Van mobile health clinic uses a “Knowledgeable Neighbor” model to deliver cost-effective screening and prevention activities in underserved neighborhoods in Boston, MA. We have described the Knowledgeable Neighbor model and used operational data collected from 2006 to 2009 to evaluate the service. The Family Van successfully reached mainly minority low-income men and women. Of the clients screened, 60% had previously undetected elevated blood pressure, 14% had previously undetected elevated blood glucose, and 38% had previously undetected elevated total cholesterol. This represents an important model for reaching underserved communities to deliver proven cost-effective prevention activities, both to help control health care costs and to reduce health disparities. PMID:22390503

  8. MITT writer and MITT writer advanced development: Developing authoring and training systems for complex technical domains

    NASA Technical Reports Server (NTRS)

    Wiederholt, Bradley J.; Browning, Elica J.; Norton, Jeffrey E.; Johnson, William B.

    1991-01-01

    MITT Writer is a software system for developing computer based training for complex technical domains. A training system produced by MITT Writer allows a student to learn and practice troubleshooting and diagnostic skills. The MITT (Microcomputer Intelligence for Technical Training) architecture is a reasonable approach to simulation based diagnostic training. MITT delivers training on available computing equipment, delivers challenging training and simulation scenarios, and has economical development and maintenance costs. A 15 month effort was undertaken in which the MITT Writer system was developed. A workshop was also conducted to train instructors in how to use MITT Writer. Earlier versions were used to develop an Intelligent Tutoring System for troubleshooting the Minuteman Missile Message Processing System.

  9. Tracing catchment fine sediment sources using the new SIFT (SedIment Fingerprinting Tool) open source software.

    PubMed

    Pulley, S; Collins, A L

    2018-09-01

    The mitigation of diffuse sediment pollution requires reliable provenance information so that measures can be targeted. Sediment source fingerprinting represents one approach for supporting these needs, but recent methodological developments have resulted in an increasing complexity of data processing methods, rendering the approach less accessible to non-specialists. A comprehensive new software programme (SIFT; SedIment Fingerprinting Tool) has therefore been developed which guides the user through critical data analysis decisions and automates all calculations. Multiple source group configurations and composite fingerprints are identified and tested using multiple methods of uncertainty analysis. This aims to explore the sediment provenance information provided by the tracers more comprehensively than a single model, and allows for model configurations with high uncertainties to be rejected. This paper provides an overview of its application to an agricultural catchment in the UK to determine if the approach used can provide a reduction in uncertainty and increase in precision. Five source group classifications were used: three were formed using k-means cluster analysis with 2, 3, and 4 clusters, and two were a priori groups based upon catchment geology. Three different composite fingerprints were used for each classification, and bi-plots, range tests, tracer variability ratios, and virtual mixtures tested the reliability of each model configuration. Some model configurations performed poorly when apportioning the composition of virtual mixtures, and different model configurations could produce different sediment provenance results despite using composite fingerprints able to discriminate robustly between the source groups. Despite this uncertainty, dominant sediment sources were identified, and those in close proximity to each sediment sampling location were found to be of greatest importance. This new software, by integrating recent methodological developments in tracer data processing, guides users through key steps. Critically, by applying multiple model configurations and uncertainty assessment, it delivers more robust solutions for informing catchment management of the sediment problem than many previously used approaches. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
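
    The multiple source-group configurations described above (k-means clusters of the source samples alongside a priori geology groups) could be generated roughly as in the sketch below; scikit-learn, the file layout, and the tracer column naming are assumptions for illustration, not SIFT's actual implementation.

    ```python
    # Hedged sketch: form candidate source-group configurations from tracer data
    # with k-means (2, 3 and 4 clusters), as one input to fingerprinting models.
    # File layout and column names are illustrative assumptions only.
    import pandas as pd
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    sources = pd.read_csv("source_samples.csv")      # one row per source sample
    tracer_cols = [c for c in sources.columns if c.startswith("tracer_")]
    X = StandardScaler().fit_transform(sources[tracer_cols])

    for k in (2, 3, 4):
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
        sources[f"group_k{k}"] = labels

    sources.to_csv("source_samples_grouped.csv", index=False)
    ```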

  10. TEMPUS: Simulating personnel and tasks in a 3-D environment

    NASA Technical Reports Server (NTRS)

    Badler, N. I.; Korein, J. D.

    1985-01-01

    The latest TEMPUS installation occurred in March, 1985. Another update is slated for early June, 1985. An updated User's Manual is in preparation and will be delivered approximately mid-June, 1985. NASA JSC has full source code listings and internal documentation for installed software. NASA JSC staff has received instruction in the use of TEMPUS. Telephone consultations have augmented on-site instruction.

  11. Mixed Methods Student Evaluation of an Online Systemic Human Anatomy Course with Laboratory

    ERIC Educational Resources Information Center

    Attardi, Stefanie M.; Choi, Suwhan; Barnett, John; Rogers, Kem A.

    2016-01-01

    A fully online section of an existing face-to-face (F2F) systemic human anatomy course with a prosection laboratory was offered for the first time in 2012-2013. Lectures for F2F students (N = 365) were broadcast in both live and archived format to online students (N = 40) using virtual classroom software. Laboratories were delivered online by a…

  12. A Taxonomy of Operational Risks

    DTIC Science & Technology

    2005-09-01

    the operational organization. Contractual constraints or requirements can impose risk if the mission delivers products or services under contract... Carnegie Mellon Software Engineering Institute, A Taxonomy of Operational Risks, CMU/SEI-2005-TN-036, Brian P. Gallagher, Pamela J. Case, Rita C. Creel, Susan Kushner, Ray C. Williams, September 2005, Acquisition Support Program

  13. CMMI for Services (SVC): The Strategic Landscape for Service

    DTIC Science & Technology

    2012-01-01

    processes. • Many existing models are designed for specific services or industries. • Other existing models do not provide a clear improvement path... Production, such as engineering and manufacturing; disciplines and industries, such as education, health care, insurance, utilities, and hospitality... as a Service: More and more major businesses and industries are being run on software and delivered as online services—from movies to agriculture

  14. Single-crystal diffraction instrument TriCS at SINQ

    NASA Astrophysics Data System (ADS)

    Schefer, J.; Könnecke, M.; Murasik, A.; Czopnik, A.; Strässle, Th; Keller, P.; Schlumpf, N.

    2000-03-01

    The single-crystal diffractometer TriCS at the Swiss Continuous Spallation Source (SINQ) is presently in the commissioning phase. A two-dimensional wire detector produced by EMBL was delivered in March 1999. The instrument is presently tested with a single detector. First measurements on magnetic structures have been performed. The instrument is remotely controlled using JAVA-based software and a UNIX DEC-α host computer.

  15. Advances in the Acquisition of Secure Systems Based on Open Architectures

    DTIC Science & Technology

    2011-04-30

    2011, 11:15 a.m. – 12:45 p.m. Chair: Christopher Deegan, Executive Director, Program Executive Office for Integrated Warfare Systems. Delivering... Systems Based on Open Architectures, Walt Scacchi and Thomas Alspaugh, Institute for Software Research. Christopher Deegan, Executive Director, Program... Executive Officer, Integrated Warfare Systems (PEO IWS). Mr. Deegan directs the development, acquisition, and fleet support of 150 combat weapon system

  16. The CREATE Program Software Applications for the Design and Analysis of Air Vehicles, Naval Vessels, Radio Frequency Antennas, and Ground Vehicles

    DTIC Science & Technology

    2015-07-10

    Kramer noted in the Q4 2010 Earnings Call: “Our innovation engine again delivered in 2010. The percentage of new products in our overall lineup is...their use (not to redistribute the code, reverse engineer it, etc.) and their intended use. They also agree to abide by the ITAR procedures which have

  17. Applied Cognitive Models of Behavior and Errors Patterns

    DTIC Science & Technology

    2017-09-01

    methods offer an opportunity to deliver good, effective introductory and basic training, thus potentially enabling a single human instructor to train... emergency medical technician (EMT) domain, which offers a standardized curriculum on which we can create training scenarios. 2. Develop... complexity of software integration and limited access to physical devices can result in commitment to a design that turns out to not offer many training

  18. Lessons Learned in Cyberspace Security

    DTIC Science & Technology

    2014-06-01

    software; something undesirable is packaged together with something desirable. A classic example was the Elf Bowling attachment, which ran rampant through... the authors’ former school. It combined a fun program featuring elves as bowling pins; however, it was packaged with SubSeven (Sub7) malware that... allowed remote access to the infected machine. IExpress, which is delivered in the Windows OS, is one of the legitimate tools for packaging multiple

  19. Multi-Country Experience in Delivering a Joint Course on Software Engineering--Numerical Results

    ERIC Educational Resources Information Center

    Budimac, Zoran; Putnik, Zoran; Ivanovic, Mirjana; Bothe, Klaus; Zdravkova, Katerina; Jakimovski, Boro

    2014-01-01

    A joint course, created as a result of a project under the auspices of the "Stability Pact of South-Eastern Europe" and DAAD, has been conducted in several Balkan countries: in Novi Sad, Serbia, for the last six years in several different forms, in Skopje, FYR of Macedonia, for two years, for several types of students, and in Tirana,…

  20. SU-C-202-05: Pilot Study of Online Treatment Evaluation and Adaptive Re-Planning for Laryngeal SBRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mao, W; Henry Ford Health System, Detroit, MI; Liu, C

    Purpose: We have initiated a phase I trial of 5-fraction stereotactic body radiotherapy (SBRT) for advanced-stage laryngeal cancer. We conducted this pilot dosimetric study to confirm the potential utility of online adaptive re-planning to preserve treatment quality. Methods: Ten cases of larynx cancer were evaluated. Baseline and daily SBRT treatment plans were generated per trial protocol. Daily volumetric images were acquired prior to every fraction of treatment. Reference simulation CT images were deformably registered to daily volumetric images using Eclipse. Planning contours were then deformably propagated to daily images. Reference SBRT plans were directly copied to calculate delivered dose distributions on deformed reference CT images. An in-house software platform was developed to calculate cumulative dose over a course of treatment in four steps: 1) deforming the delivered dose grid to the reference CT images using deformation information exported from Eclipse; 2) generating tetrahedrons using the deformed dose grid as vertices; 3) resampling dose to a high resolution within every tetrahedron; 4) calculating dose-volume histograms. Our in-house software was benchmarked against a commercial software package, Mirada. Results: In all ten cases, comprising 49 treatment fractions, delivered daily doses were completely evaluated and treatment could be re-planned within 10 minutes. Prescription dose coverage of the PTV was less than intended in 53% of treatment fractions (mean: 94%, range: 84%–98%), while minimum coverage of the CTV and GTV was 94% and 97%, respectively. Maximum bystander point dose limits to the arytenoids, parotids, and spinal cord remained respected in all cases, although variances in carotid artery doses were observed in a minority of cases. Conclusion: Although GTV and CTV coverage is preserved by in-room 3D image guidance of larynx SBRT, PTV coverage can vary significantly from intended plans. Online adaptive treatment evaluation and re-planning is potentially necessary, and our procedure is clinically applicable to fully preserve treatment quality. This project is supported by CPRIT Individual Investigator Research Award RP150386.
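
    The final step of the in-house workflow, computing dose-volume histograms from the accumulated dose, can be illustrated with the short sketch below; the array-based approach, structure, and numbers are assumptions for illustration and deliberately omit the tetrahedral resampling performed in steps 2 and 3.

    ```python
    # Hedged sketch: cumulative DVH from an accumulated dose grid and a binary
    # structure mask (illustrative only; omits the tetrahedral resampling of the
    # in-house platform's steps 2 and 3).
    import numpy as np

    def cumulative_dvh(dose_gy, structure_mask, bin_width_gy=0.1):
        """Return (dose bins, fraction of structure volume receiving >= that dose)."""
        voxels = dose_gy[structure_mask]
        bins = np.arange(0.0, voxels.max() + bin_width_gy, bin_width_gy)
        frac = np.array([(voxels >= d).mean() for d in bins])
        return bins, frac

    # Example with synthetic data: a 50x50x50 dose grid and a spherical "PTV".
    dose = np.random.normal(40.0, 2.0, size=(50, 50, 50))        # illustrative Gy values
    zz, yy, xx = np.mgrid[:50, :50, :50]
    ptv = (xx - 25) ** 2 + (yy - 25) ** 2 + (zz - 25) ** 2 < 10 ** 2

    bins, frac = cumulative_dvh(dose, ptv)
    idx = np.searchsorted(bins, 40.0)
    print("Fraction of PTV receiving >= 40 Gy:", frac[idx])
    ```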

  1. Using mobile technology to deliver a cognitive behaviour therapy-informed intervention in early psychosis (Actissist): study protocol for a randomised controlled trial.

    PubMed

    Bucci, Sandra; Barrowclough, Christine; Ainsworth, John; Morris, Rohan; Berry, Katherine; Machin, Matthew; Emsley, Richard; Lewis, Shon; Edge, Dawn; Buchan, Iain; Haddock, Gillian

    2015-09-10

    Cognitive behaviour therapy (CBT) is recommended for the treatment of psychosis; however, only a small proportion of service users have access to this intervention. Smartphone technology using software applications (apps) could increase access to psychological approaches for psychosis. This paper reports the protocol development for a clinical trial of smartphone-based CBT. We present a study protocol that describes a single-blind randomised controlled trial comparing a cognitive behaviour therapy-informed software application (Actissist) plus Treatment As Usual (TAU) with a symptom monitoring software application (ClinTouch) plus TAU in early psychosis. The study consists of a 12-week intervention period. We aim to recruit and randomly assign 36 participants registered with early intervention services (EIS) across the North West of England, UK in a 2:1 ratio to each arm of the trial. Our primary objective is to determine whether in people with early psychosis the Actissist app is feasible to deliver and acceptable to use. Secondary aims are to determine whether Actissist impacts on predictors of first episode psychosis (FEP) relapse and enhances user empowerment, functioning and quality of life. Assessments will take place at baseline, 12 weeks (post-treatment) and 22-weeks (10 weeks post-treatment) by assessors blind to treatment condition. The trial will report on the feasibility and acceptability of Actissist and compare outcomes between the randomised arms. The study also incorporates semi-structured interviews about the experience of participating in the Actissist trial that will be qualitatively analysed to inform future developments of the Actissist protocol and app. To our knowledge, this is the first controlled trial to test the feasibility, acceptability, uptake, attrition and potential efficacy of a CBT-informed smartphone app for early psychosis. Mobile applications designed to deliver a psychologically-informed intervention offer new possibilities to extend the reach of traditional mental health service delivery across a range of serious mental health problems and provide choice about available care. ISRCTN34966555. Date of first registration: 12 June 2014.

  2. SU-E-T-616: Plan Quality Assessment of Both Treatment Planning System Dose and Measurement-Based 3D Reconstructed Dose in the Patient

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olch, A

    2015-06-15

    Purpose: Systematic radiotherapy plan quality assessment promotes quality improvement. Software tools can perform this analysis by applying site-specific structure dose metrics. The next step is to similarly evaluate the quality of the dose delivery. This study defines metrics for acceptable doses to targets and normal organs for a particular treatment site and scores each plan accordingly. The input can be the TPS or the measurement-based 3D patient dose. From this analysis, one can determine whether the dose distribution delivered to the patient receives a score comparable to the TPS plan score; otherwise, replanning may be indicated. Methods: Eleven neuroblastoma patient plans were exported from Eclipse to the Quality Reports program. A scoring algorithm defined a score for each normal and target structure based on dose-volume parameters. Each plan was scored by this algorithm and the percentage of total possible points was obtained. Each plan also underwent IMRT QA measurements with a Mapcheck2 or ArcCheck. These measurements were input into the 3DVH program to compute the patient 3D dose distribution, which was analyzed using the same scoring algorithm as the TPS plan. Results: The mean quality score for the TPS plans was 75.37% (std dev=14.15%) compared to 71.95% (std dev=13.45%) for the 3DVH dose distribution. For 3/11 plans, the 3DVH-based quality score was higher than the TPS score, by between 0.5 and 8.4 percentage points. For 8/11 plans, scores decreased by 1.2 to 18.6 points based on the IMRT QA measurements. Conclusion: Software was used to determine the degree to which the plan quality score differed between the TPS and measurement-based dose. Although the delivery score was generally in good agreement with the planned dose score, some scores improved while one plan's delivered dose quality was significantly less than planned. This methodology helps evaluate both planned and delivered dose quality. Sun Nuclear Corporation has provided a license for the software described.
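
    The structure-by-structure scoring described above, awarding points when a dose-volume metric meets its criterion and reporting the percentage of possible points, might look roughly like the sketch below; the metrics, thresholds, and point weights are purely illustrative assumptions, not the actual Quality Reports algorithm.

    ```python
    # Hedged sketch of a plan-quality score: each structure metric earns points if
    # its dose-volume value meets a threshold; the plan score is the percentage of
    # possible points. Metrics, thresholds and weights are illustrative only.
    OBJECTIVES = [
        # (structure, metric, threshold, comparison, points)
        ("PTV",        "V95%_percent", 95.0, ">=", 3),
        ("SpinalCord", "Dmax_Gy",      45.0, "<=", 2),
        ("Parotid_L",  "Dmean_Gy",     26.0, "<=", 1),
    ]

    def score_plan(dose_metrics):
        """dose_metrics: {(structure, metric): value}, from the TPS or 3DVH dose."""
        earned = possible = 0
        for structure, metric, threshold, comparison, points in OBJECTIVES:
            possible += points
            value = dose_metrics.get((structure, metric))
            if value is None:
                continue
            met = value >= threshold if comparison == ">=" else value <= threshold
            earned += points if met else 0
        return 100.0 * earned / possible

    # Example: the same scoring applied to planned and measurement-reconstructed dose.
    planned   = {("PTV", "V95%_percent"): 97.1, ("SpinalCord", "Dmax_Gy"): 41.0, ("Parotid_L", "Dmean_Gy"): 24.0}
    delivered = {("PTV", "V95%_percent"): 94.2, ("SpinalCord", "Dmax_Gy"): 42.5, ("Parotid_L", "Dmean_Gy"): 25.5}
    print(score_plan(planned), score_plan(delivered))
    ```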

  3. Forty Projects by Groups of Kids.

    ERIC Educational Resources Information Center

    National Commission on Resources for Youth, Inc., New York, NY.

    Ways in which young people have delivered needed services to their communities and have improved on previously established systems for delivering these services are described. The forty projects suggest some of the ways to provide teenagers with learning experiences that meet their own particular needs and, at the same time, offer a genuine and…

  4. Navigating the Challenges of Delivering Secondary School Courses by Videoconference

    ERIC Educational Resources Information Center

    Rehn, Nicole; Maor, Dorit; McConney, Andrew

    2017-01-01

    The purpose of this research is to unpack and learn from the experiences of teachers who deliver courses to remote secondary school students by videoconference. School districts are using videoconferencing to connect students and teachers who are separated geographically through regular live, real-time conferences. Previous studies have shown the…

  5. Collaboration across Cultures: Planning and Delivering Professional Development for Inclusive Education in India

    ERIC Educational Resources Information Center

    Rose, Richard; Doveston, Mary

    2015-01-01

    In recent years a number of western universities have established professional development courses in international contexts. These have often involved tutors travelling to countries with which they may have previously had little contact, in order to deliver courses that have been long established in their own universities. This article discusses…

  6. Current and emerging formulation strategies for the effective transdermal delivery of HIV inhibitors.

    PubMed

    Ham, Anthony S; Buckheit, Robert W

    2015-02-01

    Transdermal drug delivery offers several distinct advantages over traditional dosage forms. Current antiretroviral drugs used for the treatment of HIV infection include a variety of highly active small molecule compounds with significantly limited skin permeability, and thus new and novel means of enhancing transport through the skin are needed. Current and emerging formulation strategies are poised to open transdermal drug delivery to a broader range of compounds that do not fit the traditional requirements for successful transdermal drug delivery, allowing the development of new patch technologies to deliver antiretroviral drugs that were previously incapable of being delivered through transdermal means. Thus, with continuing research into skin permeability and patch formulation strategies, there is a large potential for antiretroviral transdermal drug delivery.

  7. Current and emerging formulation strategies for the effective transdermal delivery of HIV inhibitors

    PubMed Central

    Ham, Anthony S; Buckheit, Robert W

    2015-01-01

    Transdermal drug delivery offers several distinct advantages over traditional dosage forms. Current antiretroviral drugs used for the treatment of HIV infection include a variety of highly active small molecule compounds with significantly limited skin permeability, and thus new and novel means of enhancing transport through the skin are needed. Current and emerging formulation strategies are poised to open transdermal drug delivery to a broader range of compounds that do not fit the traditional requirements for successful transdermal drug delivery, allowing the development of new patch technologies to deliver antiretroviral drugs that were previously incapable of being delivered through transdermal means. Thus, with continuing research into skin permeability and patch formulation strategies, there is a large potential for antiretroviral transdermal drug delivery. PMID:25690088

  8. The Development of Ada (Trademark) Software for Secure Environments

    DTIC Science & Technology

    1986-05-23

    Telecommunications environment. This paper discusses software security and seeks to demonstrate how the Ada programming language can be utilized as a tool... complexity. We use abstraction in our lives every day to control complexity; the principles of abstraction for software engineering are no different... systems. These features directly support the modern software engineering principles discussed in the previous section. This is not surprising

  9. Design Considerations in Development of a Mobile Health Intervention Program: The TEXT ME and TEXTMEDS Experience

    PubMed Central

    Thakkar, Jay; Barry, Tony; Thiagalingam, Aravinda; Redfern, Julie; McEwan, Alistair L; Rodgers, Anthony

    2016-01-01

    Background Mobile health (mHealth) has huge potential to deliver preventative health services. However, there is paucity of literature on theoretical constructs, technical, practical, and regulatory considerations that enable delivery of such services. Objectives The objective of this study was to outline the key considerations in the development of a text message-based mHealth program; thus providing broad recommendations and guidance to future researchers designing similar programs. Methods We describe the key considerations in designing the intervention with respect to functionality, technical infrastructure, data management, software components, regulatory requirements, and operationalization. We also illustrate some of the potential issues and decision points utilizing our experience of developing text message (short message service, SMS) management systems to support 2 large randomized controlled trials: TEXT messages to improve MEDication adherence & Secondary prevention (TEXTMEDS) and Tobacco, EXercise and dieT MEssages (TEXT ME). Results The steps identified in the development process were: (1) background research and development of the text message bank based on scientific evidence and disease-specific guidelines, (2) pilot testing with target audience and incorporating feedback, (3) software-hardware customization to enable delivery of complex personalized programs using prespecified algorithms, and (4) legal and regulatory considerations. Additional considerations in developing text message management systems include: balancing the use of customized versus preexisting software systems, the level of automation versus need for human inputs, monitoring, ensuring data security, interface flexibility, and the ability for upscaling. Conclusions A merging of expertise in clinical and behavioral sciences, health and research data management systems, software engineering, and mobile phone regulatory requirements is essential to develop a platform to deliver and manage support programs to hundreds of participants simultaneously as in TEXT ME and TEXTMEDS trials. This research provides broad principles that may assist other researchers in developing mHealth programs. PMID:27847350

  10. Design Considerations in Development of a Mobile Health Intervention Program: The TEXT ME and TEXTMEDS Experience.

    PubMed

    Thakkar, Jay; Barry, Tony; Thiagalingam, Aravinda; Redfern, Julie; McEwan, Alistair L; Rodgers, Anthony; Chow, Clara K

    2016-11-15

    Mobile health (mHealth) has huge potential to deliver preventative health services. However, there is paucity of literature on theoretical constructs, technical, practical, and regulatory considerations that enable delivery of such services. The objective of this study was to outline the key considerations in the development of a text message-based mHealth program; thus providing broad recommendations and guidance to future researchers designing similar programs. We describe the key considerations in designing the intervention with respect to functionality, technical infrastructure, data management, software components, regulatory requirements, and operationalization. We also illustrate some of the potential issues and decision points utilizing our experience of developing text message (short message service, SMS) management systems to support 2 large randomized controlled trials: TEXT messages to improve MEDication adherence & Secondary prevention (TEXTMEDS) and Tobacco, EXercise and dieT MEssages (TEXT ME). The steps identified in the development process were: (1) background research and development of the text message bank based on scientific evidence and disease-specific guidelines, (2) pilot testing with target audience and incorporating feedback, (3) software-hardware customization to enable delivery of complex personalized programs using prespecified algorithms, and (4) legal and regulatory considerations. Additional considerations in developing text message management systems include: balancing the use of customized versus preexisting software systems, the level of automation versus need for human inputs, monitoring, ensuring data security, interface flexibility, and the ability for upscaling. A merging of expertise in clinical and behavioral sciences, health and research data management systems, software engineering, and mobile phone regulatory requirements is essential to develop a platform to deliver and manage support programs to hundreds of participants simultaneously as in TEXT ME and TEXTMEDS trials. This research provides broad principles that may assist other researchers in developing mHealth programs. ©Jay Thakkar, Tony Barry, Aravinda Thiagalingam, Julie Redfern, Alistair L McEwan, Anthony Rodgers, Clara K Chow. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 15.11.2016.
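
    As a toy illustration of step (3), delivering a personalized program from a prespecified algorithm and a pre-approved message bank, the sketch below selects and schedules messages per participant; the bank contents, field names, and the four-messages-per-week frequency are assumptions for illustration, not the actual TEXT ME/TEXTMEDS implementation.

    ```python
    # Hedged sketch: pick personalized messages from a pre-approved bank using a
    # prespecified rule, then schedule them. Bank contents, fields and frequency
    # are illustrative assumptions only.
    import random
    from datetime import date, timedelta

    MESSAGE_BANK = {
        "smoking":  ["Tip: set a quit date and tell a friend.", "A craving usually passes within minutes."],
        "diet":     ["Aim for five serves of vegetables today.", "Swap one sugary drink for water."],
        "exercise": ["A 30-minute walk counts as exercise.", "Take the stairs where you can."],
    }

    def build_schedule(risk_factors, start, weeks=24, per_week=4, seed=0):
        """Return a list of (send_date, message) tailored to the participant's risk factors."""
        rng = random.Random(seed)
        topics = [t for t in MESSAGE_BANK if risk_factors.get(t)]
        schedule = []
        for week in range(weeks):
            for slot in range(per_week):
                topic = topics[(week * per_week + slot) % len(topics)]
                message = rng.choice(MESSAGE_BANK[topic])
                send_date = start + timedelta(weeks=week, days=rng.randrange(5))
                schedule.append((send_date, message))
        return schedule

    schedule = build_schedule({"smoking": True, "diet": True, "exercise": False}, date(2016, 1, 4))
    print(len(schedule), schedule[0])
    ```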

  11. SEL's Software Process-Improvement Program

    NASA Technical Reports Server (NTRS)

    Basili, Victor; Zelkowitz, Marvin; McGarry, Frank; Page, Jerry; Waligora, Sharon; Pajerski, Rose

    1995-01-01

    The goals and operations of the Software Engineering Laboratory (SEL) are reviewed. For nearly 20 years the SEL has worked to understand, assess, and improve software and the development process within the production environment of the Flight Dynamics Division (FDD) of NASA's Goddard Space Flight Center. The SEL was established in 1976 with the goals of reducing: (1) the defect rate of delivered software, (2) the cost of software to support flight projects, and (3) the average time to produce mission-support software. Results from studying over 125 FDD projects have guided the standards, management practices, technologies, and training within the division. The results of the studies have been a 75 percent reduction in defects, a 50 percent reduction in cost, and a 25 percent reduction in development time. Over time the goals of the SEL have been clarified. The goals are now stated as: (1) understand baseline processes and product characteristics, (2) assess improvements that have been incorporated into the development projects, and (3) package and infuse improvements into the standard SEL process. The SEL improvement goal is to demonstrate continual improvement of the software process by carrying out analysis, measurement, and feedback to projects within the FDD environment. The SEL supports understanding of the process by studying several process characteristics, including effort distribution and error detection rates. The SEL assesses and refines the processes. Once the assessment and refinement of a process is completed, the SEL packages the process by capturing it in standards, tools, and training.

  12. Semi-automatic computerized approach to radiological quantification in rheumatoid arthritis

    NASA Astrophysics Data System (ADS)

    Steiner, Wolfgang; Schoeffmann, Sylvia; Prommegger, Andrea; Boegl, Karl; Klinger, Thomas; Peloschek, Philipp; Kainberger, Franz

    2004-04-01

    Rheumatoid Arthritis (RA) is a common systemic disease predominantly involving the joints. Precise diagnosis and follow-up therapy require objective quantification. For this purpose, radiological analyses using standardized scoring systems are considered to be the most appropriate method. The aim of our study is to develop semi-automatic image analysis software especially applicable to the scoring of joints in rheumatic disorders. The X-Ray RheumaCoach software delivers various scoring systems (Larsen-Score and Ratingen-Rau-Score) which can be applied by the scorer. In addition to the qualitative assessment of joints performed by the radiologist, a semi-automatic image analysis for joint detection and measurements of bone diameters and swollen tissue supports the image assessment process. More than 3000 radiographs from hands and feet of more than 200 RA patients were collected, analyzed, and statistically evaluated. Radiographs were quantified using the conventional paper-based Larsen score and the X-Ray RheumaCoach software. The use of the software shortened the scoring time by about 25 percent and reduced the rate of erroneous scorings in all our studies. Compared to paper-based scoring methods, the X-Ray RheumaCoach software offers several advantages: (i) structured data analysis and input that minimizes variance by standardization, (ii) faster and more precise calculation of sum scores and indices, (iii) permanent data storage and fast access to the software's database, (iv) the possibility of cross-calculation to other scores, (v) semi-automatic assessment of images, and (vi) reliable documentation of results in the form of graphical printouts.
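
    A minimal sketch of the score aggregation such a tool performs, summing per-joint Larsen grades (0-5) into a total and expressing it against the maximum possible, is shown below; the joint list and reporting are illustrative assumptions, not the RheumaCoach implementation.

    ```python
    # Hedged sketch: aggregate per-joint Larsen grades (0-5) into a sum score and
    # report it as a percentage of the maximum. The joint list is illustrative only.
    JOINTS = (
        [f"MCP_{side}_{i}" for side in ("L", "R") for i in range(1, 6)] +
        [f"PIP_{side}_{i}" for side in ("L", "R") for i in range(2, 6)] +
        ["Wrist_L", "Wrist_R"]
    )

    def larsen_sum_score(grades):
        """grades: {joint_name: grade 0-5}; joints without a grade are skipped."""
        scored = {j: g for j, g in grades.items() if j in JOINTS and 0 <= g <= 5}
        total = sum(scored.values())
        max_possible = 5 * len(scored)
        percent = 100.0 * total / max_possible if max_possible else 0.0
        return total, percent

    example = {"MCP_L_1": 2, "MCP_L_2": 3, "Wrist_R": 4}
    print(larsen_sum_score(example))   # -> (9, 60.0)
    ```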

  13. Developing the E-Scape Software System

    ERIC Educational Resources Information Center

    Derrick, Karim

    2012-01-01

    Most innovations have contextual pre-cursors that prompt new ways of thinking and in their turn help to give form to the new reality. This was the case with the e-scape software development process. The origins of the system existed in software components and ideas that we had developed through previous projects, but the ultimate direction we took…

  14. Top down, bottom up structured programming and program structuring

    NASA Technical Reports Server (NTRS)

    Hamilton, M.; Zeldin, S.

    1972-01-01

    New design and programming techniques for shuttle software. Based on previous Apollo experience, recommendations are made to apply top-down structured programming techniques to shuttle software. New software verification techniques for large software systems are recommended. HAL, the higher order language selected for the shuttle flight code, is discussed and found to be adequate for implementing these techniques. Recommendations are made to apply the workable combination of top-down, bottom-up methods in the management of shuttle software. Program structuring is discussed relevant to both programming and management techniques.

  15. Development, Validation and Integration of the ATLAS Trigger System Software in Run 2

    NASA Astrophysics Data System (ADS)

    Keyes, Robert; ATLAS Collaboration

    2017-10-01

    The trigger system of the ATLAS detector at the LHC is a combination of hardware, firmware, and software, associated to various sub-detectors that must seamlessly cooperate in order to select one collision of interest out of every 40,000 delivered by the LHC every millisecond. These proceedings discuss the challenges, organization and work flow of the ongoing trigger software development, validation, and deployment. The goal of this development is to ensure that the most up-to-date algorithms are used to optimize the performance of the experiment. The goal of the validation is to ensure the reliability and predictability of the software performance. Integration tests are carried out to ensure that the software deployed to the online trigger farm during data-taking run as desired. Trigger software is validated by emulating online conditions using a benchmark run and mimicking the reconstruction that occurs during normal data-taking. This exercise is computationally demanding and thus runs on the ATLAS high performance computing grid with high priority. Performance metrics ranging from low-level memory and CPU requirements, to distributions and efficiencies of high-level physics quantities are visualized and validated by a range of experts. This is a multifaceted critical task that ties together many aspects of the experimental effort and thus directly influences the overall performance of the ATLAS experiment.

  16. OpenSQUID: A Flexible Open-Source Software Framework for the Control of SQUID Electronics

    DOE PAGES

    Jaeckel, Felix T.; Lafler, Randy J.; Boyd, S. T. P.

    2013-02-06

    We report that commercially available computer-controlled SQUID electronics are usually delivered with software providing a basic user interface for adjustment of SQUID tuning parameters, such as bias current, flux offset, and feedback loop settings. However, in a research context it would often be useful to be able to modify this code and/or to have full control over all these parameters from researcher-written software. In the case of the STAR Cryoelectronics PCI/PFL family of SQUID control electronics, the supplied software contains modules for automatic tuning and noise characterization, but does not provide an interface for user code. On the other hand, the Magnicon SQUIDViewer software package includes a public application programming interface (API), but lacks auto-tuning and noise characterization features. To overcome these and other limitations, we are developing an "open-source" framework for controlling SQUID electronics which should provide maximal interoperability with user software, a unified user interface for electronics from different manufacturers, and a flexible platform for the rapid development of customized SQUID auto-tuning and other advanced features. Finally, we have completed a first implementation for the STAR Cryoelectronics hardware and have made the source code for this ongoing project available to the research community on SourceForge (http://opensquid.sourceforge.net) under the GNU public license.

  17. Transvaginal ultrasonographic measurement of cervical length in asymptomatic high-risk women with a short cervical length in the previous pregnancy.

    PubMed

    Crane, J M G; Hutchens, D

    2011-07-01

    To determine if asymptomatic women at high risk of preterm delivery who had a short cervical length in their previous pregnancy and delivered at term are at increased risk of having a short cervical length in their next pregnancy, and whether they are at increased risk of preterm birth. This retrospective cohort study included high-risk (those with a history of spontaneous preterm birth, uterine anomaly or excisional treatment for cervical dysplasia) asymptomatic women who were pregnant with a singleton gestation delivering between April 2003 and March 2010, who had had a previous pregnancy and who had transvaginal ultrasonographic cervical length measurement performed at 16-30 weeks' gestation in both pregnancies. Comparison was among women who had a short cervical length (< 3.0 cm) in their previous pregnancy but delivered at term in that pregnancy (Short Term Group), women with a history of a normal cervical length (≥ 3.0 cm) in their previous pregnancy delivering at term (Long Term Group), and women who had a short cervical length (< 3.0 cm) in their previous pregnancy delivering preterm (Short Preterm Group). Primary outcomes were spontaneous preterm birth at < 37 weeks' gestation and cervical length. Secondary outcomes were spontaneous preterm birth at < 35 weeks and < 32 weeks, low birth weight, maternal outcomes and neonatal morbidity. A total of 62 women were included. Women in the Short Term Group were more likely to have a short cervical length in their next pregnancy compared with those in the Long Term Group (10/23 (43.5%) vs. 4/26 (15.4%), respectively) but not as likely as women in the Short Preterm Group (9/13 (69.2%); P=0.003). Women in the Short Term Group were not at an increased risk of spontaneous preterm birth at < 37 weeks in the next pregnancy compared with women in the Long Term Group (2/23 (8.7%) vs. 2/26 (7.7%), respectively), but women in the Short Preterm Group were at an increased risk (6/13 (46.2%); P<0.0001). Compared with women in the Short Term and Long Term groups, women in the Short Preterm Group were also at an increased risk of threatened preterm labor (6/23 (26.1%) and 4/26 (15.4%) vs. 9/13 (69.2%), respectively; P=0.002) and of receiving corticosteroids for fetal lung maturation (6/23 (26.1%) and 4/26 (15.4%) vs. 11/13 (84.6%), respectively; P<0.0001). Although high-risk asymptomatic women with a short cervical length in their previous pregnancy who delivered at term are at increased risk of having a short cervix in their next pregnancy, they are not at increased risk of preterm birth. Copyright © 2011 ISUOG. Published by John Wiley & Sons, Ltd.

  18. An Investigation of Agility Issues in Scrum Teams Using Agility Indicators

    NASA Astrophysics Data System (ADS)

    Pikkarainen, Minna; Wang, Xiaofeng

    Agile software development methods have emerged and become increasingly popular in recent years; however, the issues encountered by software development teams that strive to achieve agility using agile methods have yet to be explored systematically. Building upon a previous study that established a set of indicators of agility, this study investigates what issues are manifested in software development teams using agile methods. It focuses particularly on Scrum teams. In other words, the goal of the chapter is to evaluate Scrum teams using agility indicators and thereby to further validate the previously presented agility indicators within additional cases. A multiple case study research method is employed. The findings of the study reveal that the teams using Scrum do not necessarily achieve agility in terms of team autonomy, sharing, stability and embraced uncertainty. The possible reasons include a previously plan-driven organizational culture, resistance towards the Scrum roles, and changing resources.

  19. A CMMI-based approach for medical software project life cycle study.

    PubMed

    Chen, Jui-Jen; Su, Wu-Chen; Wang, Pei-Wen; Yen, Hung-Chi

    2013-01-01

    In terms of medical techniques, Taiwan has gained international recognition in recent years. However, the medical information system industry in Taiwan is still at a developing stage compared with the software industries in other nations. In addition, systematic development processes are indispensable elements of software development. They can help developers increase their productivity and efficiency and also avoid unnecessary risks arising during the development process. Thus, this paper presents an application of Light-Weight Capability Maturity Model Integration (LW-CMMI) to the Chang Gung Medical Research Project (CMRP) in the nuclear medicine field. This application was intended to integrate the user requirements, system design, and testing activities of the software development process into a three-layer (Domain, Concept, and Instance) model, expressed in structural Systems Modeling Language (SysML) diagrams, and to convert part of the manual effort necessary for project management maintenance into computational effort, for example (semi-)automatic traceability management. In this application, it supports establishing the artifacts "requirement specification document", "project execution plan document", "system design document", and "system test document", and can deliver a prototype of a lightweight project management tool for the nuclear medicine software project. The results of this application can serve as a reference for other medical institutions in developing medical information systems and supporting project management to achieve the aim of patient safety.

  20. Training, Quality Assurance Factors, and Tools Investigation: a Work Report and Suggestions on Software Quality Assurance

    NASA Technical Reports Server (NTRS)

    Lee, Pen-Nan

    1991-01-01

    Previously, several research tasks have been conducted, some observations were obtained, and several possible suggestions have been contemplated involving software quality assurance engineering at NASA Johnson. These research tasks are briefly described. Also, a brief discussion is given on the role of software quality assurance in software engineering along with some observations and suggestions. A brief discussion on a training program for software quality assurance engineers is provided. A list of assurance factors as well as quality factors are also included. Finally, a process model which can be used for searching and collecting software quality assurance tools is presented.

  1. Modernized build and test infrastructure for control software at ESO: highly flexible building, testing, and automatic quality practices for telescope control software

    NASA Astrophysics Data System (ADS)

    Pellegrin, F.; Jeram, B.; Haucke, J.; Feyrin, S.

    2016-07-01

    The paper describes the introduction of a new automated build and test infrastructure, based on the open-source software Jenkins, into the ESO Very Large Telescope control software to replace the preexisting in-house solution. A brief introduction to software quality practices is given, followed by a description of the previous solution, its limitations, and new upcoming requirements. The modifications required to adapt the new system are described, along with how they were applied to the current software and the results obtained. An overview of how the new system may be used in future projects is also presented.

  2. An Interactive Tool for Discrete Phase Analysis in Two-Phase Flows

    NASA Technical Reports Server (NTRS)

    Dejong, Frederik J.; Thoren, Stephen J.

    1993-01-01

    Under a NASA MSFC SBIR Phase 1 effort an interactive software package has been developed for the analysis of discrete (particulate) phase dynamics in two-phase flows in which the discrete phase does not significantly affect the continuous phase. This package contains a Graphical User Interface (based on the X Window system and the Motif tool kit) coupled to a particle tracing program, which allows the user to interactively set up and run a case for which a continuous phase grid and flow field are available. The software has been applied to a solid rocket motor problem, to demonstrate its ease of use and its suitability for problems of engineering interest, and has been delivered to NASA Marshall Space Flight Center.
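
    As a rough illustration of what a discrete-phase particle tracing step involves, the sketch below advances particles through a given continuous-phase velocity field with forward-Euler integration and Stokes drag; the drag law, the analytic velocity field, and all parameter values are illustrative assumptions, not the delivered package's algorithm.

    ```python
    # Hedged sketch: one-way coupled particle tracing in a frozen gas velocity
    # field using forward Euler and Stokes drag. The velocity field, drag law and
    # constants are illustrative assumptions only.
    import numpy as np

    def trace_particle(x0, v0, gas_velocity, tau_p, dt, steps):
        """Integrate particle position/velocity; gas_velocity(x) gives local gas velocity."""
        x, v = np.array(x0, float), np.array(v0, float)
        path = [x.copy()]
        for _ in range(steps):
            u = gas_velocity(x)              # continuous-phase velocity at the particle
            a = (u - v) / tau_p              # Stokes drag acceleration
            v = v + a * dt
            x = x + v * dt
            path.append(x.copy())
        return np.array(path)

    # Example: uniform axial flow with a weak swirl component (illustrative).
    def gas_velocity(x):
        return np.array([10.0, -0.5 * x[2], 0.5 * x[1]])

    path = trace_particle([0.0, 0.1, 0.0], [0.0, 0.0, 0.0], gas_velocity,
                          tau_p=5e-3, dt=1e-4, steps=2000)
    print(path[-1])
    ```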

  3. End-to-end operations at the National Radio Astronomy Observatory

    NASA Astrophysics Data System (ADS)

    Radziwill, Nicole M.

    2008-07-01

    In 2006 NRAO launched a formal organization, the Office of End to End Operations (OEO), to broaden access to its instruments (VLA/EVLA, VLBA, GBT and ALMA) in the most cost-effective ways possible. The VLA, VLBA and GBT are mature instruments, and the EVLA and ALMA are currently under construction, which presents unique challenges for integrating software across the Observatory. This article 1) provides a survey of the new developments over the past year, and those planned for the next year, 2) describes the business model used to deliver many of these services, and 3) discusses the management models being applied to ensure continuous innovation in operations, while preserving the flexibility and autonomy of telescope software development groups.

  4. SARS: Safeguards Accounting and Reporting Software

    NASA Astrophysics Data System (ADS)

    Mohammedi, B.; Saadi, S.; Ait-Mohamed, S.

    In order to satisfy the recording and reporting requirements of the SSAC (State System for Accounting and Control of nuclear materials), this computer program bridges the gap between nuclear facility operators and national inspectors verifying records and delivering reports. The SARS maintains and generates at-facility safeguards accounting records and generates International Atomic Energy Agency (IAEA) safeguards reports based on accounting data input by the user at any nuclear facility. A database structure was built, and the BORLAND DELPHI programming language was used. The software is designed to be user-friendly, with extensive and flexible management of menus and graphs. SARS functions include basic physical inventory tracking, transaction histories, and reporting. Access controls are enforced by different passwords.

  5. Wyoming greater sage-grouse habitat prioritization: A collection of multi-scale seasonal models and geographic information systems land management tools

    USGS Publications Warehouse

    O'Donnell, Michael S.; Aldridge, Cameron L.; Doherty, Kevin E.; Fedy, Bradley C.

    2015-01-01

    We deliver all products described herein as online geographic information system data for visualization and downloading. We outline the data properties for each model and their data inputs, describe the process of selecting appropriate data products for multifarious applications, describe all data products and software, provide newly derived model composites, and discuss how land managers may use the models to inform future sage-grouse studies and potentially refine conservation efforts. The models, software tools, and associated opportunities for novel applications of these products should provide a suite of additional, but not exclusive, tools for assessing Wyoming Greater Sage-grouse habitats, which land managers, conservationists, and scientists can apply to myriad applications.

  6. Browndye: A Software Package for Brownian Dynamics

    PubMed Central

    McCammon, J. Andrew

    2010-01-01

    A new software package, Browndye, is presented for simulating the diffusional encounter of two large biological molecules. It can be used to estimate second-order rate constants and encounter probabilities, and to explore reaction trajectories. Browndye builds upon previous knowledge and algorithms from software packages such as UHBD, SDA, and Macrodox, while implementing algorithms that scale to larger systems. PMID:21132109
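
    For orientation, the elementary move such Brownian dynamics packages propagate is the Ermak-McCammon update, in which each step adds a deterministic drift proportional to the force and a Gaussian random displacement with variance 2*D*dt; the sketch below applies that update to the relative coordinate of a diffusing pair with a simple screened Coulomb force, and every parameter value is an illustrative assumption rather than a Browndye default.

    ```python
    # Hedged sketch of an Ermak-McCammon Brownian dynamics step for the relative
    # coordinate of two diffusing molecules (no hydrodynamic interactions; a rough
    # screened Coulomb force). All values are illustrative assumptions.
    import numpy as np

    KBT = 4.11e-21          # thermal energy at ~298 K, in J
    D_REL = 1.0e-9          # relative diffusion coefficient, m^2/s (assumed)
    DT = 1.0e-12            # time step, s (assumed)

    def screened_coulomb_force(r_vec, q1q2=2.56e-38, kappa=1.0e9, eps=80 * 8.85e-12):
        """Very rough screened (Yukawa-type) Coulomb force between two charges, in N."""
        r = np.linalg.norm(r_vec)
        magnitude = q1q2 * np.exp(-kappa * r) * (1 + kappa * r) / (4 * np.pi * eps * r ** 2)
        return magnitude * r_vec / r

    def bd_step(r_vec, rng):
        drift = D_REL * DT / KBT * screened_coulomb_force(r_vec)
        noise = rng.normal(scale=np.sqrt(2 * D_REL * DT), size=3)
        return r_vec + drift + noise

    rng = np.random.default_rng(0)
    r = np.array([5e-9, 0.0, 0.0])          # start the pair 5 nm apart
    for _ in range(1000):
        r = bd_step(r, rng)
    print("separation after 1 ns:", np.linalg.norm(r))
    ```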

  7. XML Flight/Ground Data Dictionary Management

    NASA Technical Reports Server (NTRS)

    Wright, Jesse; Wiklow, Colette

    2007-01-01

    A computer program generates Extensible Markup Language (XML) files that effect coupling between the command- and telemetry-handling software running aboard a spacecraft and the corresponding software running in ground support systems. The XML files are produced by use of information from the flight software and from flight-system engineering. The XML files are converted to legacy ground-system data formats for command and telemetry, transformed into Web-based and printed documentation, and used in developing new ground-system data-handling software. Previously, the information about telemetry and command was scattered in various paper documents that were not synchronized. The process of searching and reading the documents was time-consuming and introduced errors. In contrast, the XML files contain all of the information in one place. XML structures can evolve in such a manner as to enable the addition, to the XML files, of the metadata necessary to track the changes and the associated documentation. The use of this software has reduced the extent of manual operations in developing a ground data system, thereby saving considerable time and removing errors that previously arose in the translation and transcription of software information from the flight to the ground system.
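
    To make the coupling idea concrete, a generator of this kind walks the flight-software command and telemetry definitions and emits one XML entry per item; the element and attribute names in the sketch below are invented for illustration and do not reflect the actual dictionary schema.

    ```python
    # Hedged sketch: emit a small XML telemetry dictionary from in-memory
    # definitions. Element and attribute names are illustrative, not the real schema.
    import xml.etree.ElementTree as ET

    telemetry_channels = [
        {"id": "THRM-0001", "name": "battery_temp", "type": "float32", "units": "degC"},
        {"id": "PWR-0042",  "name": "bus_voltage",  "type": "float32", "units": "V"},
    ]

    root = ET.Element("data_dictionary", version="1.0")
    telemetry = ET.SubElement(root, "telemetry")
    for channel in telemetry_channels:
        entry = ET.SubElement(telemetry, "channel", id=channel["id"], name=channel["name"])
        ET.SubElement(entry, "type").text = channel["type"]
        ET.SubElement(entry, "units").text = channel["units"]

    ET.ElementTree(root).write("dictionary.xml", encoding="utf-8", xml_declaration=True)
    print(ET.tostring(root, encoding="unicode"))
    ```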

  8. Effects of an iPad iBook on Reading Comprehension, Electrodermal Activity, and Engagement for Adolescents with Disabilities

    ERIC Educational Resources Information Center

    Pollitt, Daniel T.

    2013-01-01

    The purpose of this study was to investigate the effects of an iPad iBook for adolescents with disabilities. With its release in 2012, the iBooks Author software for the Apple iPad allows classroom teachers to create accessible and engaging textbooks. Leveraging media and interactive widgets, iBooks Author holds promise for delivering content to…

  9. Strategic Supply

    DTIC Science & Technology

    2002-01-01

    The ultimate challenge is to bring the right product to the right market in the right quantity at the right price. Much of the current focus of...added services to enhance process integration in order to deliver the right product at the right price and time, through the right distribution...328. 7 Kathleen Kiley, “Optimization Software Designed to Make Sure That the Price is Right”, 4 February 2002, KPMG Insiders, <www.kpmginsiders.com

  10. VALIDATING the Accuracy of Sighten's Automated Shading Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Solar companies - including installers, financiers, and distributors - leverage Sighten software to deliver accurate shading calculations and solar proposals. Sighten recently partnered with Google Project Sunroof to provide automated remote shading analysis directly within the Sighten platform. The National Renewable Energy Laboratory (NREL), in partnership with Sighten, independently verified the accuracy of Sighten's remote-shading solar access values (SAVs) on an annual basis for locations in Los Angeles, California, and Denver, Colorado.

  11. An Open Service Provider Concept for Enterprise Complex Automation

    NASA Astrophysics Data System (ADS)

    Ivaschenko, A. V.; Sitnikov, P. V.; Tanonykhina, M. O.

    2017-01-01

    The paper introduces a solution for IT services representation and management in the integrated information space of distributed enterprises. It is proposed to develop an Open Service Provider as a software platform for interaction between IT services providers and their users. Implementation of the proposed concept and approach is illustrated by an after-sales customer support system for a large manufacturing corporation delivered by SEC “Open Code”.

  12. Intracellular Protein Delivery for Treating Breast Cancer

    DTIC Science & Technology

    2012-06-01

    are efficiently internalized by mammalian cells lines as characterized by confocal microscopy, and rhodamine-labeled apoptin can be observed in the...To determine the cellular localization of delivered proteins, confocal images were taken with HeLa, MCF-7, or HEF cells incubated with 20 nM of S-S...and analyzed by Nikon NIS Element software. Fluorescence images were acquired on a Yokogawa spinning-disk confocal scanner system using a Nikon

  13. Open Source Software Compliance within the Government

    DTIC Science & Technology

    2016-12-01

    The exception to this rule is the various General Public License (GPLs), which consider all distributions to contractors as outside distribution...is developed by a contractor at the government’s expense or for the government’s exclusive use. The third condition that must be met is that ERDC...executables and source code can only be offered by an authorized delivering entity to an authorized receiving entity. This means that contractors , with

  14. The EPA Comptox Chemistry Dashboard: A Web-Based Data ...

    EPA Pesticide Factsheets

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data-driven approaches that integrate chemistry, exposure and biological data. As an outcome of these efforts, the National Center for Computational Toxicology (NCCT) has measured, assembled and delivered an enormous quantity and diversity of data for the environmental sciences, including high-throughput in vitro screening data, in vivo and functional use data, exposure models and chemical databases with associated properties. A series of software applications and databases have been produced over the past decade to deliver these data, but recent developments have focused on a new software architecture that assembles the resources into a single platform. A new web application, the CompTox Chemistry Dashboard, provides access to data associated with ~720,000 chemical substances. These data include experimental and predicted physicochemical property data, bioassay screening data associated with the ToxCast program, product and functional use information and a myriad of related data of value to environmental scientists. The dashboard provides chemical-based searching based on chemical names, synonyms and CAS Registry Numbers. Flexible search capabilities allow for chemical identification.
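
    The search-by-name/synonym/CASRN behaviour can be illustrated with a toy in-memory lookup. The records and lookup logic below are hypothetical and do not represent the CompTox Chemistry Dashboard's actual data model or API.

      # Toy illustration of chemical-based searching by name, synonym, or CAS
      # Registry Number over a small in-memory table; purely hypothetical.
      records = [
          {"name": "Bisphenol A", "casrn": "80-05-7", "synonyms": {"BPA"}},
          {"name": "Atrazine", "casrn": "1912-24-9", "synonyms": {"Gesaprim"}},
      ]

      def search(query):
          q = query.strip().lower()
          return [r for r in records
                  if q == r["name"].lower()
                  or q == r["casrn"]
                  or q in {s.lower() for s in r["synonyms"]}]

      print(search("80-05-7")[0]["name"])  # Bisphenol A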

  15. Validation of a Portable Low-Power Deep Brain Stimulation Device Through Anxiolytic Effects in a Laboratory Rat Model.

    PubMed

    Kouzani, Abbas Z; Kale, Rajas P; Zarate-Garza, Pablo Patricio; Berk, Michael; Walder, Ken; Tye, Susannah J

    2017-09-01

    Deep brain stimulation (DBS) devices deliver electrical pulses to neural tissue through an electrode. To study the mechanisms and therapeutic benefits of deep brain stimulation, murine preclinical research is necessary. However, conducting naturalistic, long-term, uninterrupted animal behavioral experiments can be difficult with bench-top systems. Reducing the size, weight, power consumption, and cost of DBS devices can assist the progress of this research in animal studies. A low-power, low-weight, miniature DBS device is presented in this paper. This device consists of electronic hardware and software components, including a low-power microcontroller, an adjustable current source, an n-channel metal-oxide-semiconductor field-effect transistor, a coin-cell battery, electrode wires, and a software program to operate the device. The performance of the device was evaluated in terms of battery lifetime and functionality through bench and in vivo tests. The bench test revealed that this device can deliver continuous stimulation current pulses of strength [Formula: see text], width [Formula: see text], and frequency 130 Hz for over 22 days. The in vivo tests demonstrated that chronic stimulation of the nucleus accumbens (NAc) with this device significantly increased psychomotor activity, together with a dramatic reduction in anxiety-like behavior in the elevated zero-maze test.
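
    The link between pulse parameters and battery lifetime can be sketched with a back-of-envelope duty-cycle calculation. Only the 130 Hz pulse frequency comes from the abstract; the capacity, pulse current, pulse width, and quiescent draw below are assumed values for illustration.

      # Back-of-envelope battery-life estimate for a pulse-train stimulator.
      # Only the 130 Hz frequency comes from the abstract; all other numbers
      # are assumptions chosen purely for illustration.
      def battery_life_days(capacity_mAh, pulse_current_mA, pulse_width_s,
                            frequency_hz, quiescent_mA):
          duty_cycle = pulse_width_s * frequency_hz
          avg_current_mA = pulse_current_mA * duty_cycle + quiescent_mA
          return capacity_mAh / avg_current_mA / 24.0

      print(battery_life_days(capacity_mAh=240.0,   # e.g. a coin cell
                              pulse_current_mA=0.2,
                              pulse_width_s=90e-6,
                              frequency_hz=130.0,
                              quiescent_mA=0.4))    # ~25 days under these assumptions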

  16. Formal verification of mathematical software

    NASA Technical Reports Server (NTRS)

    Sutherland, D.

    1984-01-01

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  17. The effect of nurses’ preparedness and nurse practitioner status on triage call management in primary care: A secondary analysis of cross-sectional data from the ESTEEM trial

    PubMed Central

    Varley, Anna; Warren, Fiona C.; Richards, Suzanne H.; Calitri, Raff; Chaplin, Katherine; Fletcher, Emily; Holt, Tim A.; Lattimer, Valerie; Murdoch, Jamie; Richards, David A.; Campbell, John

    2016-01-01

    Background Nurse-led telephone triage is increasingly used to manage demand for general practitioner consultations in UK general practice. Previous studies are equivocal about the relationship between clinical experience and the call outcomes of nurse triage. Most research is limited to investigating nurse telephone triage in out-of-hours settings. Objective To investigate whether the professional characteristics of primary care nurses undertaking computer-decision-support software telephone triage are related to call disposition. Design Questionnaire survey of nurses delivering the nurse intervention arm of the ESTEEM trial, to capture role type (practice nurse or nurse practitioner), prescriber status, number of years’ nursing experience, graduate status, previous experience of triage, and perceived preparedness for triage. Our main outcome was the proportion of triaged patients recommended for follow-up within the practice (call disposition), including all contact types (face-to-face, telephone or home visit), by a general practitioner or nurse. Settings 15 general practices and 7012 patients receiving the nurse triage intervention in four regions of the UK. Participants 45 nurse practitioners and practice nurses trained in the use of clinical decision support software. Methods We investigated the associations between nursing characteristics and triage call disposition for patient ‘same-day’ appointment requests in general practice using multivariable logistic regression modelling. Results Valid responses from 35 nurses (78%) from 14 practices: 31/35 (89%) had ≥10 years’ experience with 24/35 (69%) having ≥20 years. Most patient contacts (3842/4605; 86%) were recommended for follow-up within the practice. Nurse practitioners were less likely to recommend patients for follow-up (odds ratio 0.19, 95% confidence interval 0.07–0.49) than practice nurses. Nurses who reported that their previous experience had prepared them less well for triage were more likely to recommend patients for follow-up (OR 3.17, 95% CI 1.18–5.55). Conclusion Nurse characteristics were associated with disposition of triage calls to within-practice follow-up. Nurse practitioners or those who reported feeling ‘more prepared’ for the role were more likely to manage the call definitively. Practices considering nurse triage should ensure that nurses transitioning into new roles feel adequately prepared. While standardised training is necessary, it may not be sufficient to ensure successful implementation. PMID:27087294

  18. Leveraging Cloud Computing to Improve Storage Durability, Availability, and Cost for MER Maestro

    NASA Technical Reports Server (NTRS)

    Chang, George W.; Powell, Mark W.; Callas, John L.; Torres, Recaredo J.; Shams, Khawaja S.

    2012-01-01

    The Maestro for MER (Mars Exploration Rover) software is the premier operations and activity planning software for the Mars rovers, and it is required to deliver all of the processed image products to scientists on demand. These data span multiple storage arrays sized at 2 TB, and a backup scheme ensures data is not lost. In a catastrophe, these data would currently be recovered at 20 GB/hour, taking several days for a full restoration. A seamless solution provides access to highly durable, highly available, scalable, and cost-effective storage capabilities. This approach also employs a novel technique that enables storage of the majority of data on the cloud and some data locally. This feature is used to store the most recent data locally in order to guarantee utmost reliability in case of an outage or disconnect from the Internet. This also obviates any changes to the software that generates the most recent data set, as it still has the same interface to the file system as it did before the updates.

  19. NASA Tech Briefs, March 2003

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Topics covered include: Tool for Bending a Metal Tube Precisely in a Confined Space; Multiple-Use Mechanisms for Attachment to Seat Tracks; Force-Measuring Clamps; Cellular Pressure-Actuated Joint; Block QCA Fault-Tolerant Logic Gates; Hybrid VLSI/QCA Architecture for Computing FFTs; Arrays of Carbon Nanotubes as RF Filters in Waveguides; Carbon Nanotubes as Resonators for RF Spectrum Analyzers; Software for Viewing Landsat Mosaic Images; Updated Integrated Mission Program; Software for Sharing and Management of Information; Update on Integrated Optical Design Analyzer; Optical-Quality Thin Polymer Membranes; Rollable Thin Shell Composite-Material Paraboloidal Mirrors; Folded Resonant Horns for Power Ultrasonic Applications; Touchdown Ball-Bearing System for Magnetic Bearings; Flux-Based Deadbeat Control of Induction-Motor Torque; Block Copolymers as Templates for Arrays of Carbon Nanotubes; Throttling Cryogen Boiloff To Control Cryostat Temperature; Collaborative Software Development Approach Used to Deliver the New Shuttle Telemetry Ground Station; Turbulence in Supercritical O2/H2 and C7H16/N2 Mixing Layers; and Time-Resolved Measurements in Optoelectronic Microbioanal.

  20. Design for Run-Time Monitor on Cloud Computing

    NASA Astrophysics Data System (ADS)

    Kang, Mikyung; Kang, Dong-In; Yun, Mira; Park, Gyung-Leen; Lee, Junghoon

    Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications as services over the Internet as well as infrastructure. A cloud is a type of parallel and distributed system consisting of a collection of interconnected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. The large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring the system status change, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design a Run-Time Monitor (RTM), which is system software that monitors application behavior at run-time, analyzes the collected information, and optimizes resources on cloud computing. RTM monitors application software through library instrumentation as well as the underlying hardware through performance counters, optimizing its computing configuration based on the analyzed data.
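
    The library-instrumentation side of such monitoring can be sketched with a simple decorator that records call counts and wall-clock time per function; the hardware performance-counter side described in the paper is omitted here, and the function names are invented for illustration.

      # Minimal sketch of run-time monitoring via library instrumentation:
      # a decorator records call counts and wall-clock time per function so a
      # monitor can analyze application behavior at run-time.
      import time
      from collections import defaultdict

      call_stats = defaultdict(lambda: {"calls": 0, "seconds": 0.0})

      def monitored(func):
          def wrapper(*args, **kwargs):
              start = time.perf_counter()
              try:
                  return func(*args, **kwargs)
              finally:
                  stats = call_stats[func.__name__]
                  stats["calls"] += 1
                  stats["seconds"] += time.perf_counter() - start
          return wrapper

      @monitored
      def handle_request(n):
          return sum(range(n))

      for _ in range(3):
          handle_request(100_000)
      print(dict(call_stats))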

  1. Porting the Core Flight System to the Dellingr Cubesat

    NASA Technical Reports Server (NTRS)

    Cudmore, Alan

    2017-01-01

    Dellingr is a 6U Cubesat developed by NASA Goddard Space Flight Center. It was delivered to the International Space Station in August 2017, and is scheduled to be deployed in November 2017. Compared to a typical NASA satellite, the Dellingr Cubesat had an extremely low budget and short schedule. Although the Dellingr Cubesat has minimal hardware resources, the cFS was ultimately chosen for the flight software. Using the cFS on the Dellingr Cubesat presented a few challenges, but also offered opportunities to help speed up development and verify the ACS flight software. This presentation will cover the lessons learned in porting the cFS to the Dellingr Cubesat, including working with the limited hardware resources, porting the cFS to FreeRTOS, and overcoming limitations related to data storage and file transfer. This presentation will also cover how hardware abstraction was used to run the flight software on multiple platforms and interface with the 42 dynamic simulator.

  2. Contingency Contractor Optimization Phase 3 Sustainment Software Design Document - Contingency Contractor Optimization Tool - Prototype

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa

    This document describes the final software design of the Contingency Contractor Optimization Tool - Prototype. Its purpose is to provide the overall architecture of the software and the logic behind this architecture. Documentation for the individual classes is provided in the application Javadoc. The Contingency Contractor Optimization project is intended to address Department of Defense mandates by delivering a centralized strategic planning tool that allows senior decision makers to quickly and accurately assess the impacts, risks, and mitigation strategies associated with utilizing contract support. The Contingency Contractor Optimization Tool - Prototype was developed in Phase 3 of the OSD ATL Contingency Contractor Optimization project to support strategic planning for contingency contractors. The planning tool uses a model to optimize the Total Force mix by minimizing the combined total costs for selected mission scenarios. The model optimizes the match of personnel types (military, DoD civilian, and contractors) and capabilities to meet mission requirements as effectively as possible, based on risk, cost, and other requirements.
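
    The cost-minimizing force-mix idea can be illustrated with a tiny linear program: choose counts of military, DoD civilian, and contractor personnel that cover a required workload at minimum cost. The costs, caps, and requirement below are invented for illustration; the actual tool's model is far richer.

      # Hedged sketch of a cost-minimizing force-mix optimization; all numbers
      # are invented and the real model includes risk and capability matching.
      from scipy.optimize import linprog

      cost = [120.0, 100.0, 90.0]        # $k per person-year: military, civilian, contractor
      A_ub = [[-1.0, -1.0, -1.0]]        # -(m + c + k) <= -500  => total >= 500 work-years
      b_ub = [-500.0]
      bounds = [(0, 300), (0, 200), (0, None)]   # e.g. policy caps on billets

      res = linprog(c=cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
      print(res.x, res.fun)              # optimal mix and total cost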

  3. The WHATs and HOWs of maturing computational and software engineering skills in Russian higher education institutions

    NASA Astrophysics Data System (ADS)

    Semushin, I. V.; Tsyganova, J. V.; Ugarov, V. V.; Afanasova, A. I.

    2018-05-01

    Russian higher education institutions' tradition of teaching large-enrolled classes is impairing students' striving for individual prominence, one-upmanship, and hopes for originality. Intending to convert these drawbacks into benefits, a Project-Centred Education Model (PCEM) has been introduced to deliver Computational Mathematics and Information Science courses. The model combines a Frontal Competitive Approach and a Project-Driven Learning (PDL) framework. The PDL framework has been developed by stating and solving three design problems: (i) enhance the diversity of project assignments on specific computational methods and algorithmic approaches, (ii) balance similarity and dissimilarity of the project assignments, and (iii) develop a software assessment tool suitable for evaluating the technological maturity of students' project deliverables, thus reducing the instructor's workload and possible oversights. The positive experience accumulated over 15 years shows that implementing the PCEM keeps students motivated to strive for success in rising to higher levels of their computational and software engineering skills.

  4. Chimpanzees return favors at a personal cost.

    PubMed

    Schmelz, Martin; Grueneisen, Sebastian; Kabalak, Alihan; Jost, Jürgen; Tomasello, Michael

    2017-07-11

    Humans regularly provide others with resources at a personal cost to themselves. Chimpanzees engage in some cooperative behaviors in the wild as well, but their motivational underpinnings are unclear. In three experiments, chimpanzees ( Pan troglodytes ) always chose between an option delivering food both to themselves and a partner and one delivering food only to themselves. In one condition, a conspecific partner had just previously taken a personal risk to make this choice available. In another condition, no assistance from the partner preceded the subject's decision. Chimpanzees made significantly more prosocial choices after receiving their partner's assistance than when no assistance was given (experiment 1) and, crucially, this was the case even when choosing the prosocial option was materially costly for the subject (experiment 2). Moreover, subjects appeared sensitive to the risk of their partner's assistance and chose prosocially more often when their partner risked losing food by helping (experiment 3). These findings demonstrate experimentally that chimpanzees are willing to incur a material cost to deliver rewards to a conspecific, but only if that conspecific previously assisted them, and particularly when this assistance was risky. Some key motivations involved in human cooperation thus may have deeper phylogenetic roots than previously suspected.

  5. Update on PISCES

    NASA Technical Reports Server (NTRS)

    Pearson, Don; Hamm, Dustin; Kubena, Brian; Weaver, Jonathan K.

    2010-01-01

    An updated version of the Platform Independent Software Components for the Exploration of Space (PISCES) software library is available. A previous version was reported in Library for Developing Spacecraft-Mission-Planning Software (MSC-22983), NASA Tech Briefs, Vol. 25, No. 7 (July 2001), page 52. To recapitulate: This software provides for Web-based, collaborative development of computer programs for planning trajectories and trajectory- related aspects of spacecraft-mission design. The library was built using state-of-the-art object-oriented concepts and software-development methodologies. The components of PISCES include Java-language application programs arranged in a hierarchy of classes that facilitates the reuse of the components. As its full name suggests, the PISCES library affords platform-independence: The Java language makes it possible to use the classes and application programs with a Java virtual machine, which is available in most Web-browser programs. Another advantage is expandability: Object orientation facilitates expansion of the library through creation of a new class. Improvements in the library since the previous version include development of orbital-maneuver- planning and rendezvous-launch-window application programs, enhancement of capabilities for propagation of orbits, and development of a desktop user interface.

  6. Readiness of the Belgian network of sentinel general practitioners to deliver electronic health record data for surveillance purposes: results of survey study.

    PubMed

    Boffin, Nicole; Bossuyt, Nathalie; Vanthomme, Katrien; Van Casteren, Viviane

    2010-06-25

    In order to proceed from a paper based registration to a surveillance system that is based on extraction of electronic health records (EHR), knowledge is needed on the number and representativeness of sentinel GPs using a government-certified EHR system and the quality of EHR data for research, expressed in the compliance rate with three criteria: recording of home visits, use of prescription module and diagnostic subject headings. Data were collected by annual postal surveys between 2005 and 2009 among all sentinel GPs. We tested relations between four key GP characteristics (age, gender, language community, practice organisation) and use of a certified EHR system by multivariable logistic regression. The relation between EHR software package, GP characteristics and compliance with three quality criteria was equally measured by multivariable logistic regression. A response rate of 99% was obtained. Of 221 sentinel GPs, 55% participated in the surveillance without interruption from 2005 onwards, i.e. all five years, and 78% were participants in 2009. Sixteen certified EHR systems were used among 91% of the Dutch and 63% of the French speaking sentinel GPs. The EHR software package was strongly related to the community and only one EHR system was used by a comparable number of sentinel GPs in both communities. Overall, the prescription module was always used and home visits were usually recorded. Uniform subject headings were only sometimes used and the compliance with this quality criterion was almost exclusively related to the EHR software package in use. The challenge is to progress towards a sentinel network of GPs delivering care-based data that are (partly) extracted from well performing EHR systems and still representative for Belgian general practice.

  7. Workflow for high-content, individual cell quantification of fluorescent markers from universal microscope data, supported by open source software.

    PubMed

    Stockwell, Simon R; Mittnacht, Sibylle

    2014-12-16

    Advances in understanding the control mechanisms governing the behavior of cells in adherent mammalian tissue culture models are becoming increasingly dependent on modes of single-cell analysis. Methods which deliver composite data reflecting the mean values of biomarkers from cell populations risk losing subpopulation dynamics that reflect the heterogeneity of the studied biological system. In keeping with this, traditional approaches are being replaced by, or supported with, more sophisticated forms of cellular assay developed to allow assessment by high-content microscopy. These assays potentially generate large numbers of images of fluorescent biomarkers, which enabled by accompanying proprietary software packages, allows for multi-parametric measurements per cell. However, the relatively high capital costs and overspecialization of many of these devices have prevented their accessibility to many investigators. Described here is a universally applicable workflow for the quantification of multiple fluorescent marker intensities from specific subcellular regions of individual cells suitable for use with images from most fluorescent microscopes. Key to this workflow is the implementation of the freely available Cell Profiler software(1) to distinguish individual cells in these images, segment them into defined subcellular regions and deliver fluorescence marker intensity values specific to these regions. The extraction of individual cell intensity values from image data is the central purpose of this workflow and will be illustrated with the analysis of control data from a siRNA screen for G1 checkpoint regulators in adherent human cells. However, the workflow presented here can be applied to analysis of data from other means of cell perturbation (e.g., compound screens) and other forms of fluorescence based cellular markers and thus should be useful for a wide range of laboratories.
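
    The per-cell quantification step can be sketched with scikit-image rather than the CellProfiler-based pipeline the authors describe: threshold a nuclear-marker channel, label individual cells, and report a second marker's mean intensity per labelled region. The image data below is synthetic and purely illustrative.

      # Sketch of per-cell fluorescence quantification using scikit-image;
      # synthetic data only, not the authors' CellProfiler pipeline.
      import numpy as np
      from skimage.filters import threshold_otsu
      from skimage.measure import label, regionprops

      rng = np.random.default_rng(0)
      nuclei = np.zeros((128, 128))
      nuclei[20:40, 20:40] = 1.0
      nuclei[70:95, 60:90] = 1.0
      marker = nuclei * rng.uniform(0.5, 1.0, nuclei.shape)   # second channel, cell-restricted

      mask = nuclei > threshold_otsu(nuclei)
      labels = label(mask)
      for region in regionprops(labels, intensity_image=marker):
          print(f"cell {region.label}: area={region.area}, "
                f"mean marker intensity={region.mean_intensity:.3f}")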

  8. High-performance hardware implementation of a parallel database search engine for real-time peptide mass fingerprinting

    PubMed Central

    Bogdán, István A.; Rivers, Jenny; Beynon, Robert J.; Coca, Daniel

    2008-01-01

    Motivation: Peptide mass fingerprinting (PMF) is a method for protein identification in which a protein is fragmented by a defined cleavage protocol (usually proteolysis with trypsin), and the masses of these products constitute a ‘fingerprint’ that can be searched against theoretical fingerprints of all known proteins. In the first stage of PMF, the raw mass spectrometric data are processed to generate a peptide mass list. In the second stage this protein fingerprint is used to search a database of known proteins for the best protein match. Although current software solutions can typically deliver a match in a relatively short time, a system that can find a match in real time could change the way in which PMF is deployed and presented. In a paper published earlier we presented a hardware design of a raw mass spectra processor that, when implemented in Field Programmable Gate Array (FPGA) hardware, achieves almost 170-fold speed gain relative to a conventional software implementation running on a dual processor server. In this article we present a complementary hardware realization of a parallel database search engine that, when running on a Xilinx Virtex 2 FPGA at 100 MHz, delivers 1800-fold speed-up compared with an equivalent C software routine, running on a 3.06 GHz Xeon workstation. The inherent scalability of the design means that processing speed can be multiplied by deploying the design on multiple FPGAs. The database search processor and the mass spectra processor, running on a reconfigurable computing platform, provide a complete real-time PMF protein identification solution. Contact: d.coca@sheffield.ac.uk PMID:18453553
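
    A naive software analogue of the search the FPGA engine accelerates is shown below: count how many observed peptide masses match a protein's theoretical masses within a tolerance and rank proteins by that score. The database and spectrum are made up for illustration.

      # Naive peptide-mass-fingerprint scoring; made-up masses for illustration.
      def pmf_score(observed, theoretical, tol_da=0.2):
          return sum(any(abs(o - t) <= tol_da for t in theoretical) for o in observed)

      database = {
          "PROT_A": [523.3, 845.4, 1180.6, 1533.7, 2211.1],
          "PROT_B": [611.2, 902.5, 1180.5, 1765.9],
      }
      observed_masses = [845.5, 1180.6, 1533.6]

      ranked = sorted(database, key=lambda p: pmf_score(observed_masses, database[p]),
                      reverse=True)
      print(ranked[0])  # best-matching protein: PROT_A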

  9. CCSDS File Delivery Protocol (CFDP): Why it's Useful and How it Works

    NASA Technical Reports Server (NTRS)

    Ray, Tim

    2003-01-01

    Reliable delivery of data products is often required across space links. For example, a NASA mission will require reliable delivery of images produced by an on-board detector. Many missions have their own (unique) way of accomplishing this, requiring custom software. Many missions also require manual operations (e.g. the telemetry receiver software keeps track of what data is missing, and a person manually inputs the appropriate commands to request retransmissions). The Consultative Committee for Space Data Systems (CCSDS) developed the CCSDS File Delivery Protocol (CFDP) specifically for this situation. CFDP is an international standard communication protocol that provides reliable delivery of data products. It is designed for use across space links. It will work well if run over the widely used CCSDS Telemetry and Telecommand protocols. However, it can be run over any protocol, and will work well as long as the underlying protocol delivers a reasonable portion of the data. The CFDP receiver will autonomously determine what data is missing, and request retransmissions as needed. The CFDP sender will autonomously perform the requested transmissions. When the entire data product is delivered, the CFDP receiver will let the CFDP sender know that the transaction has completed successfully. The result is that custom software becomes standard, and manual operations become autonomous. This paper will consider various ways of achieving reliable file delivery, explain why CFDP is the optimal choice for use over space links, explain how the core protocol works, and give some guidance on how to best utilize CFDP within various mission scenarios. It will also touch on additional features of CFDP, as well as other uses for CFDP (e.g. the loading of on-board memory and tables).
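
    The receiver-driven retransmission idea can be sketched with a simplified loop: the sender transmits file segments over a lossy link, the receiver works out which segments are missing and asks for them, and the sender resends until the file is complete. Real CFDP PDUs, checksums, and timers are omitted; this is only an illustration of the principle.

      # Simplified sketch of receiver-driven retransmission (not actual CFDP PDUs).
      import random

      def send_once(segments, loss_rate=0.3):
          """Simulate one pass over the link, randomly dropping segments."""
          return {i: seg for i, seg in segments.items() if random.random() > loss_rate}

      def deliver_file(data, segment_size=4):
          segments = {i: data[i:i + segment_size] for i in range(0, len(data), segment_size)}
          received = {}
          missing = set(segments)
          while missing:
              received.update(send_once({i: segments[i] for i in missing}))
              missing = set(segments) - set(received)   # receiver's retransmission request list
          return b"".join(received[i] for i in sorted(received))

      assert deliver_file(b"reliable delivery across a space link") == \
             b"reliable delivery across a space link"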

  10. TH-CD-202-12: Online Inter-Beam Replanning Based On Real-Time Dose Reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamerling, CP; Fast, MF; Ziegenhein, P

    Purpose: This work provides a proof-of-concept study for online replanning during treatment delivery for step-and-shoot prostate SBRT, based on real-time dose reconstruction. Online replanning is expected to improve the trade-off between target coverage and organ-at-risk dose in the presence of intra-fractional motion. Methods: We have implemented an online replanning workflow on top of our previously reported real-time dose reconstruction software which connects to an Elekta research linac. The treatment planning system DynaPlan was extended to (1) re-optimize and sequence treatment plans (in clockwise beam order) before each beam, based on actual delivered dose, in a timeframe limited by the gantrymore » rotation between subsequent beams, and (2) send the respective segments to the delivery control software DynaTrack which starts/continues treatment immediately.To investigate the impact of a reduced safety margin, we have created and delivered (on a linac emulator) a conventional CTV+5/3mm (I) and a reduced CTV+1mm margin (II) treatment plan for a prostate patient. We have assessed CTV coverage with and without inter-beam replanning, all exposed to a gradual target shift of 0–5mm in posterior and inferior direction from start until the end of delivery. Results: For the reconstructed conventional plan (I), D98 for CTV was 100% of D98 of the planned dose. For the reconstructed margin-reduced plan (II), D98 for CTV was 95% of the planned D98 without replanning, but could be recovered to 99% by replanning for each beam. Plan (II) with replanning resulted in a decrease for bladder V90% by 88% and an increase to rectum V90% by 9% compared to the conventional plan (I). Dose calculation/accumulation was performed in <15ms per MLC aperture, replanning in <15s per beam. Conclusion: We have shown that online inter-beam replanning is technically feasible and potentially allows for a margin reduction. Future investigation considering motion-robust replanning optimization parameters is in progress. We acknowledge support of the MLC research from Elekta AB. This work is supported by Cancer Research UK under Programme C33589/A19908. Research at ICR is also supported by Cancer Research UK under Programme C33589/A19727 and NHS funding to the NIHR Biomedical Research Centre at RMH and ICR.« less

  11. Effect of solar radio bursts on GNSS signal reception over Europe for the period 1999-2013

    NASA Astrophysics Data System (ADS)

    Chevalier, Jean-Marie; Bergeot, Nicolas; Marqué, Christophe; Aerts, Wim; Bruyninx, Carine

    2015-04-01

    Intense solar radio bursts (SRB) emitted at L-band frequencies can affect the carrier-to-noise C/N0 ratio of Global Navigation Satellite Systems (GNSS) signals by increasing the background noise. Such space weather events can consequently decrease the quality of GNSS-based results, especially for kinematic high-precision positioning. It is thus important to develop a method capable of detecting such events in near real time over a wide area. For this purpose, the ROB-IONO software was adapted for analysing the effect of SRBs on the dense EUREF Permanent GNSS Network (EPN). First, S1 and S2 raw data extracted from RINEX files were converted into the C/N0 unit (dB.Hz) taking into account manufacturer corrections. Then, the differences (ΔC/N0) between all these C/N0 observables and their medians over the 7 previous satellite ground track repeat cycles, i.e. their normal quiet state, were computed. The mean of all these well-calibrated ΔC/N0 values from different GNSS receivers and satellites offers at each epoch a reliable metric to detect and quantify the impact of an SRB. We investigated the degradation of GPS and GLONASS C/N0 on the entire EPN during 10 intense SRBs occurring in daylight over Europe between 1999 and 2013. The analysis shows that: (1) GPS and GLONASS ΔC/N0 agree at the 0.1±0.2 dB.Hz level; (2) the standard deviation of the mean ΔC/N0 of the EPN GNSS receivers is below 1 dB.Hz 96% of the time, and below 0.6 dB.Hz 76% of the time; (3) maximum ΔC/N0 degradation occurs at the epoch of maximum solar peak flux delivered by the solar ground observatories; (4) C/N0 degradation becomes larger with increasing solar zenith angle. Consequently, the ROB-IONO software is capable of detecting the degradation of GNSS signal reception over Europe due to SRBs. In addition, by taking advantage of the increasing number of EPN stations delivering C/N0 data since 2005, even less intense SRB events can now be detected. Finally, the developed method can be applied entirely in near real time.
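
    The core differencing step can be illustrated as follows: for each epoch, the current C/N0 sample is compared with the median of the samples taken at the same point of the 7 previous ground-track repeat cycles. The numbers below are synthetic; the real processing starts from S1/S2 RINEX observables with manufacturer corrections.

      # Illustration of the quiet-reference differencing step; synthetic data only.
      import numpy as np

      def delta_cn0(current_cycle, previous_cycles):
          """current_cycle: (n_epochs,), previous_cycles: (7, n_epochs), in dB.Hz."""
          quiet_reference = np.median(previous_cycles, axis=0)
          return current_cycle - quiet_reference

      rng = np.random.default_rng(1)
      previous = 45.0 + rng.normal(0, 0.3, size=(7, 60))   # 7 quiet repeat cycles
      current = 45.0 + rng.normal(0, 0.3, size=60)
      current[30:35] -= 4.0                                 # simulated radio-burst fade
      print(delta_cn0(current, previous)[30:35].round(1))   # ~ -4 dB.Hz depression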

  12. 3D Visualization for Phoenix Mars Lander Science Operations

    NASA Technical Reports Server (NTRS)

    Edwards, Laurence; Keely, Leslie; Lees, David; Stoker, Carol

    2012-01-01

    Planetary surface exploration missions present considerable operational challenges in the form of substantial communication delays, limited communication windows, and limited communication bandwidth. 3D visualization software was developed and delivered to the 2008 Phoenix Mars Lander (PML) mission. The components of the system include an interactive 3D visualization environment called Mercator, terrain reconstruction software called the Ames Stereo Pipeline, and a server providing distributed access to terrain models. The software was successfully utilized during the mission for science analysis, site understanding, and science operations activity planning. A terrain server was implemented that provided distribution of terrain models from a central repository to clients running the Mercator software. The Ames Stereo Pipeline generates accurate, high-resolution, texture-mapped, 3D terrain models from stereo image pairs. These terrain models can then be visualized within the Mercator environment. The central cross-cutting goal for these tools is to provide an easy-to-use, high-quality, full-featured visualization environment that enhances the mission science team's ability to develop low-risk, productive science activity plans. In addition, for the Mercator and Viz visualization environments, extensibility and adaptability to different missions and application areas are key design goals.

  13. Technology to Augment Early Home Visitation for Child Maltreatment Prevention: A Pragmatic Randomized Trial.

    PubMed

    Ondersma, Steven J; Martin, Joanne; Fortson, Beverly; Whitaker, Daniel J; Self-Brown, Shannon; Beatty, Jessica; Loree, Amy; Bard, David; Chaffin, Mark

    2017-11-01

    Early home visitation (EHV) for child maltreatment prevention is widely adopted but has received inconsistent empirical support. Supplementation with interactive software may facilitate attention to major risk factors and use of evidence-based approaches. We developed eight 20-min computer-delivered modules for use by mothers during the course of EHV. These modules were tested in a randomized trial in which 413 mothers were assigned to software-supplemented e-Parenting Program ( ePP), services as usual (SAU), or community referral conditions, with evaluation at 6 and 12 months. Outcomes included satisfaction, working alliance, EHV retention, child maltreatment, and child maltreatment risk factors. The software was well-received overall. At the 6-month follow-up, working alliance ratings were higher in the ePP condition relative to the SAU condition (Cohen's d = .36, p < .01), with no differences at 12 months. There were no between-group differences in maltreatment or major risk factors at either time point. Despite good acceptability and feasibility, these findings provide limited support for use of this software within EHV. These findings contribute to the mixed results seen across different models of EHV for child maltreatment prevention.

  14. Software for real-time localization of baleen whale calls using directional sonobuoys: A case study on Antarctic blue whales.

    PubMed

    Miller, Brian S; Calderan, Susannah; Gillespie, Douglas; Weatherup, Graham; Leaper, Russell; Collins, Kym; Double, Michael C

    2016-03-01

    Directional frequency analysis and recording (DIFAR) sonobuoys can allow real-time acoustic localization of baleen whales for underwater tracking and remote sensing, but limited availability of hardware and software has prevented wider usage. These software limitations were addressed by developing a module in the open-source software PAMGuard. A case study is presented demonstrating that this software provides greater efficiency and accessibility than previous methods for detecting, localizing, and tracking Antarctic blue whales in real time. Additionally, this software can easily be extended to track other low and mid frequency sounds including those from other cetaceans, pinnipeds, icebergs, shipping, and seismic airguns.
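
    The basic geometry behind localization from directional sonobuoys can be sketched by intersecting two bearing lines on a local flat-earth approximation. PAMGuard's DIFAR module does considerably more (beamforming, measurement error handling); the coordinates and bearings below are invented for illustration.

      # Sketch of a two-bearing cross fix; positions are (x east, y north) in km,
      # bearings are degrees clockwise from north. Illustrative geometry only.
      import math

      def cross_fix(p1, brg1_deg, p2, brg2_deg):
          d1 = (math.sin(math.radians(brg1_deg)), math.cos(math.radians(brg1_deg)))
          d2 = (math.sin(math.radians(brg2_deg)), math.cos(math.radians(brg2_deg)))
          denom = d1[0] * (-d2[1]) - d1[1] * (-d2[0])
          if abs(denom) < 1e-9:
              raise ValueError("bearings are parallel; no unique fix")
          rhs = (p2[0] - p1[0], p2[1] - p1[1])
          t = (rhs[0] * (-d2[1]) - rhs[1] * (-d2[0])) / denom
          return (p1[0] + t * d1[0], p1[1] + t * d1[1])

      print(cross_fix((0.0, 0.0), 45.0, (10.0, 0.0), 315.0))  # fix at (5.0, 5.0)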

  15. Dispatcher assistance and automated external defibrillator performance among elders.

    PubMed

    Ecker, R; Rea, T D; Meischke, H; Schaeffer, S M; Kudenchuk, P; Eisenberg, M S

    2001-10-01

    Automated external defibrillators (AEDs) provide an opportunity to improve survival in out-of-hospital, ventricular fibrillation (VF) cardiac arrest by enabling laypersons not trained in rhythm recognition to deliver lifesaving therapy. The potential role of emergency dispatchers in the layperson use of AEDs is uncertain. This study was performed to examine whether dispatcher telephone assistance affected AED skill performance during a simulated VF cardiac arrest among a cohort of older adults. The hypothesis was that dispatcher assistance would increase the proportion who were able to correctly deliver a shock, but might require additional time. One hundred fifty community-dwelling persons aged 58-84 years were recruited from eight senior centers in King County, Washington. All participants had received AED training approximately six months previously. For this study, the participants were randomized to AED operation with or without dispatcher assistance during a simulated VF cardiac arrest. The proportions who successfully delivered a shock and the time intervals from collapse to shock were compared between the two groups. The participants who received dispatcher assistance were more likely to correctly deliver a shock with the AED during the simulated VF cardiac arrest (91% vs 68%, p = 0.001). Among those who were able to deliver a shock, the participants who received dispatcher assistance required a longer time interval from collapse to shock [median (25th, 75th percentile) = 193 seconds (165, 225) for dispatcher assistance, and 148 seconds (138, 166) for no dispatcher assistance, p = 0.001]. Among older laypersons previously trained in AED operation, dispatcher assistance may increase the proportion who can successfully deliver a shock during a VF cardiac arrest.

  16. Evolving software reengineering technology for the emerging innovative-competitive era

    NASA Technical Reports Server (NTRS)

    Hwang, Phillip Q.; Lock, Evan; Prywes, Noah

    1994-01-01

    This paper reports on a multi-tool commercial/military environment combining software Domain Analysis techniques with Reusable Software and Reengineering of Legacy Software. It is based on the development of a military version for the Department of Defense (DOD). The integrated tools in the military version are: Software Specification Assistant (SSA) and Software Reengineering Environment (SRE), developed by Computer Command and Control Company (CCCC) for Naval Surface Warfare Center (NSWC) and Joint Logistics Commanders (JLC), and the Advanced Research Project Agency (ARPA) STARS Software Engineering Environment (SEE) developed by Boeing for NAVAIR PMA 205. The paper describes transitioning these integrated tools to commercial use. There is a critical need for the transition for the following reasons: First, to date, 70 percent of programmers' time is applied to software maintenance. The work of these users has not been facilitated by existing tools. The addition of Software Reengineering will also facilitate software maintenance and upgrading. In fact, the integrated tools will support the entire software life cycle. Second, the integrated tools are essential to Business Process Reengineering, which seeks radical process innovations to achieve breakthrough results. Done well, process reengineering delivers extraordinary gains in process speed, productivity and profitability. Most importantly, it discovers new opportunities for products and services in collaboration with other organizations. Legacy computer software must be changed rapidly to support innovative business processes. The integrated tools will provide commercial organizations important competitive advantages. This, in turn, will increase employment by creating new business opportunities. Third, the integrated system will produce much higher quality software than use of the tools separately. The reason for this is that producing or upgrading software requires keen understanding of extremely complex applications which is facilitated by the integrated tools. The radical savings in the time and cost associated with software, due to use of CASE tools that support combined Reuse of Software and Reengineering of Legacy Code, will add an important impetus to improving the automation of enterprises. This will be reflected in continuing operations, as well as in innovating new business processes. The proposed multi-tool software development is based on state of the art technology, which will be further advanced through the use of open systems for adding new tools and experience in their use.

  17. Calibration of work zone impact analysis software for Missouri.

    DOT National Transportation Integrated Search

    2013-12-01

    This project calibrated two software programs used for estimating the traffic impacts of work zones. The WZ Spreadsheet and VISSIM programs were recommended in a previous study by the authors. The two programs were calibrated using field data fro...

  18. Using Automation to Improve the Flight Software Testing Process

    NASA Technical Reports Server (NTRS)

    O'Donnell, James R., Jr.; Andrews, Stephen F.; Morgenstern, Wendy M.; Bartholomew, Maureen O.; McComas, David C.; Bauer, Frank H. (Technical Monitor)

    2001-01-01

    One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, attitude control, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on previous missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the perceived benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.
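
    The flavor of such automated test verification is sketched below: align flight-software test telemetry with high-fidelity simulation output on a common time base and flag any channel that departs from the expected value by more than a tolerance. The channel name, signal shape, and tolerance are illustrative assumptions, not MAP's actual test artifacts.

      # Hedged sketch of automated flight-software test verification; illustrative data.
      import numpy as np

      def verify_channel(t_fsw, y_fsw, t_sim, y_sim, tolerance):
          y_sim_on_fsw_times = np.interp(t_fsw, t_sim, y_sim)   # synchronize time bases
          error = np.abs(y_fsw - y_sim_on_fsw_times)
          return bool(np.all(error <= tolerance)), float(error.max())

      t_sim = np.linspace(0.0, 10.0, 1001)
      t_fsw = np.linspace(0.0, 10.0, 101)
      wheel_speed_sim = 50.0 * (1.0 - np.exp(-t_sim / 3.0))     # rad/s, simulated
      wheel_speed_fsw = 50.0 * (1.0 - np.exp(-t_fsw / 3.0)) + 0.02

      passed, worst = verify_channel(t_fsw, wheel_speed_fsw, t_sim, wheel_speed_sim,
                                     tolerance=0.1)
      print(passed, worst)   # True, ~0.02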

  19. Design and evaluation of a personal digital assistant- based alerting service for clinicians.

    PubMed

    Johnson, E Diane; Pancoast, Paul E; Mitchell, Joyce A; Shyu, Chi-Ren

    2004-10-01

    This study describes the system architecture and user acceptance of a suite of programs that deliver information about newly updated library resources to clinicians' personal digital assistants (PDAs). Participants received headlines delivered to their PDAs alerting them to new books, National Guideline Clearinghouse guidelines, Cochrane Reviews, and National Institutes of Health (NIH) Clinical Alerts, as well as updated content in UpToDate, Harrison's Online, Scientific American Medicine, and Clinical Evidence. Participants could request additional information for any of the headlines, and the information was delivered via e-mail during their next synchronization. Participants completed a survey at the conclusion of the study to gauge their opinions about the service. Of the 816 headlines delivered to the 16 study participants' PDAs during the project, Scientific American Medicine generated the highest proportion of headline requests at 35%. Most users of the PDA Alerts software reported that they learned about new medical developments sooner than they otherwise would have, and half reported that they learned about developments that they would not have heard about at all. While some users liked the PDA platform for receiving headlines, it seemed that a Web database that allowed tailored searches and alerts could be configured to satisfy both PDA-oriented and e-mail-oriented users.

  20. Design and evaluation of a personal digital assistant–based alerting service for clinicians

    PubMed Central

    Johnson, E. Diane; Pancoast, Paul E.; Mitchell, Joyce A.; Shyu, Chi-Ren

    2004-01-01

    Purpose: This study describes the system architecture and user acceptance of a suite of programs that deliver information about newly updated library resources to clinicians' personal digital assistants (PDAs). Description: Participants received headlines delivered to their PDAs alerting them to new books, National Guideline Clearinghouse guidelines, Cochrane Reviews, and National Institutes of Health (NIH) Clinical Alerts, as well as updated content in UpToDate, Harrison's Online, Scientific American Medicine, and Clinical Evidence. Participants could request additional information for any of the headlines, and the information was delivered via email during their next synchronization. Participants completed a survey at the conclusion of the study to gauge their opinions about the service. Results/Outcome: Of the 816 headlines delivered to the 16 study participants' PDAs during the project, Scientific American Medicine generated the highest proportion of headline requests at 35%. Most users of the PDA Alerts software reported that they learned about new medical developments sooner than they otherwise would have, and half reported that they learned about developments that they would not have heard about at all. While some users liked the PDA platform for receiving headlines, it seemed that a Web database that allowed tailored searches and alerts could be configured to satisfy both PDA-oriented and email-oriented users. PMID:15494759

  1. CLOUDCLOUD : general-purpose instrument monitoring and data managing software

    NASA Astrophysics Data System (ADS)

    Dias, António; Amorim, António; Tomé, António

    2016-04-01

    An effective experiment is dependent on the ability to store and deliver data and information to all participant parties regardless of their degree of involvement in the specific parts that make the experiment a whole. Having fast, efficient and ubiquitous access to data will increase visibility and discussion, such that the outcome will have already been reviewed several times, strengthening the conclusions. The CLOUD project aims at providing users with a general purpose data acquisition, management and instrument monitoring platform that is fast, easy to use, lightweight and accessible to all participants of an experiment. This work is now implemented in the CLOUD experiment at CERN and will be fully integrated with the experiment as of 2016. Despite being used in an experiment of the scale of CLOUD, this software can also be used in any size of experiment or monitoring station, from single computers to large networks of computers to monitor any sort of instrument output without influencing the individual instrument's DAQ. Instrument data and meta data is stored and accessed via a specially designed database architecture and any type of instrument output is accepted using our continuously growing parsing application. Multiple databases can be used to separate different data taking periods or a single database can be used if for instance an experiment is continuous. A simple web-based application gives the user total control over the monitored instruments and their data, allowing data visualization and download, upload of processed data and the ability to edit existing instruments or add new instruments to the experiment. When in a network, new computers are immediately recognized and added to the system and are able to monitor instruments connected to them. Automatic computer integration is achieved by a locally running python-based parsing agent that communicates with a main server application guaranteeing that all instruments assigned to that computer are monitored with parsing intervals as fast as milliseconds. This software (server+agents+interface+database) comes in easy and ready-to-use packages that can be installed in any operating system, including Android and iOS systems. This software is ideal for use in modular experiments or monitoring stations with large variability in instruments and measuring methods or in large collaborations, where data requires homogenization in order to be effectively transmitted to all involved parties. This work presents the software and provides performance comparison with previously used monitoring systems in the CLOUD experiment at CERN.
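
    The agent/server split described above can be sketched as a locally running agent that parses an instrument's latest output and pushes it to a central server at a fixed interval. The endpoint URL, payload fields, and file format below are assumptions for illustration, not the CLOUD software's actual interface.

      # Hedged sketch of a parsing agent pushing instrument readings to a server.
      # The URL, payload fields, and file format are hypothetical.
      import json, time, urllib.request

      SERVER_URL = "http://monitoring-server.local/api/readings"   # hypothetical endpoint

      def parse_last_reading(path):
          with open(path) as f:
              last = f.readlines()[-1].strip()      # e.g. "2016-04-01T12:00:00,23.4"
          timestamp, value = last.split(",")
          return {"instrument": path, "time": timestamp, "value": float(value)}

      def push(reading):
          req = urllib.request.Request(SERVER_URL,
                                       data=json.dumps(reading).encode(),
                                       headers={"Content-Type": "application/json"})
          urllib.request.urlopen(req, timeout=5)

      while True:                                   # agent main loop
          push(parse_last_reading("/data/thermometer_01.csv"))
          time.sleep(1.0)                           # parsing interval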

  2. Identifying impact of software dependencies on replicability of biomedical workflows.

    PubMed

    Miksa, Tomasz; Rauber, Andreas; Mina, Eleni

    2016-12-01

    Complex data driven experiments form the basis of biomedical research. Recent findings warn that the context in which the software is run, that is the infrastructure and the third party dependencies, can have a crucial impact on the final results delivered by a computational experiment. This implies that in order to replicate the same result, not only the same data must be used, but also it must be run on an equivalent software stack. In this paper we present the VFramework that enables assessing replicability of workflows. It identifies whether any differences in software dependencies among two executions of the same workflow exist and whether they have impact on the produced results. We also conduct a case study in which we investigate the impact of software dependencies on replicability of Taverna workflows used in biomedical research of Huntington's disease. We re-execute analysed workflows in environments differing in operating system distribution and configuration. The results show that the VFramework can be used to identify the impact of software dependencies on the replicability of biomedical workflows. Furthermore, we observe that despite the fact that the workflows are executed in a controlled environment, they still depend on specific tools installed in the environment. The context model used by the VFramework improves the deficiencies of provenance traces and documents also such tools. Based on our findings we define guidelines for workflow owners that enable them to improve replicability of their workflows. Copyright © 2016 Elsevier Inc. All rights reserved.
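
    A minimal illustration of the dependency comparison that the VFramework performs in a much more complete way: given the package versions recorded for two executions of the same workflow, report anything that differs and might explain a change in results. The version listings below are invented examples.

      # Toy dependency diff between two workflow executions; invented versions.
      def diff_dependencies(run_a, run_b):
          names = sorted(set(run_a) | set(run_b))
          return {n: (run_a.get(n, "absent"), run_b.get(n, "absent"))
                  for n in names if run_a.get(n) != run_b.get(n)}

      original_run = {"python": "2.7.3", "numpy": "1.6.2", "taverna": "2.4"}
      re_execution = {"python": "2.7.9", "numpy": "1.6.2", "taverna": "2.4",
                      "libxml2": "2.9.1"}

      print(diff_dependencies(original_run, re_execution))
      # {'libxml2': ('absent', '2.9.1'), 'python': ('2.7.3', '2.7.9')}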

  3. The image-guided surgery toolkit IGSTK: an open source C++ software toolkit.

    PubMed

    Enquobahrie, Andinet; Cheng, Patrick; Gary, Kevin; Ibanez, Luis; Gobbi, David; Lindseth, Frank; Yaniv, Ziv; Aylward, Stephen; Jomier, Julien; Cleary, Kevin

    2007-11-01

    This paper presents an overview of the image-guided surgery toolkit (IGSTK). IGSTK is an open source C++ software library that provides the basic components needed to develop image-guided surgery applications. It is intended for fast prototyping and development of image-guided surgery applications. The toolkit was developed through a collaboration between academic and industry partners. Because IGSTK was designed for safety-critical applications, the development team has adopted lightweight software processes that emphasize safety and robustness while, at the same time, supporting geographically separated developers. A software process that is philosophically similar to agile software methods was adopted, emphasizing iterative, incremental, and test-driven development principles. The guiding principle in the architecture design of IGSTK is patient safety. The IGSTK team implemented a component-based architecture and used state machine software design methodologies to improve the reliability and safety of the components. Every IGSTK component has a well-defined set of features that are governed by state machines. The state machine ensures that the component is always in a valid state and that all state transitions are valid and meaningful. Realizing that the continued success and viability of an open source toolkit depend on a strong user community, the IGSTK team is following several key strategies to build an active user community. These include maintaining a users and developers' mailing list, providing documentation (application programming interface reference document and book), presenting demonstration applications, and delivering tutorial sessions at relevant scientific conferences.
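
    The state-machine-governed component pattern can be sketched in Python (IGSTK itself is C++): every request maps to a transition, and transitions not defined for the current state are rejected rather than executed. The state and input names below are illustrative, not IGSTK's actual tracker states.

      # Sketch of a state-machine-governed component; names are illustrative.
      class TrackerComponent:
          TRANSITIONS = {
              ("Idle", "Open"): "Communicating",
              ("Communicating", "StartTracking"): "Tracking",
              ("Tracking", "StopTracking"): "Communicating",
              ("Communicating", "Close"): "Idle",
          }

          def __init__(self):
              self.state = "Idle"

          def request(self, event):
              next_state = self.TRANSITIONS.get((self.state, event))
              if next_state is None:
                  print(f"'{event}' is invalid in state '{self.state}'; ignoring")
                  return
              self.state = next_state

      tracker = TrackerComponent()
      tracker.request("StartTracking")   # rejected: not valid while Idle
      tracker.request("Open")
      tracker.request("StartTracking")
      print(tracker.state)               # Tracking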

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kowalkowski, Jim; Lyon, Adam; Paterno, Marc

    Over the past few years, container technology has become increasingly promising as a means to seamlessly make our software available across a wider range of platforms. In December 2015, we decided to put together a set of docker images that serve as a demonstration of this container technology for managing a run-time environment for art-related software projects, and also serve as a set of test cases for evaluation of performance. Docker[1] containers provide a way to “wrap up a piece of software in a complete filesystem that contains everything it needs to run”. In combination with Shifter[2], such containers provide a way to run software developed and deployed on “typical” HEP platforms (such as SLF 6, in common use at Fermilab and on OSG platforms) on HPC facilities at NERSC. Docker containers provide a means of delivering software that can be run on a variety of hosts without needing to be compiled specially for each OS to be supported. This could substantially reduce the effort required to create and validate a new release, since one build could be suitable for use on both grid machines (both FermiGrid and OSG) as well as any machine capable of running the Docker container. In addition, docker containers may provide for a quick and easy way for users to install and use a software release in a standardized environment. This report contains the results and status of this demonstration and evaluation.

  5. Neurovascular bundle–sparing radiotherapy for prostate cancer using MRI-CT registration: A dosimetric feasibility study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cassidy, R.J., E-mail: richardjcassidy@emory.edu; Yang, X.; Liu, T.

    Purpose: Sexual dysfunction after radiotherapy for prostate cancer remains an important late adverse toxicity. The neurovascular bundles (NVB) that lie posterolaterally to the prostate are typically spared during prostatectomy, but in traditional radiotherapy planning they are not contoured as an organ-at-risk with dose constraints. Our goal was to determine the dosimetric feasibility of “NVB-sparing” prostate radiotherapy while still delivering adequate dose to the prostate. Methods: Twenty-five consecutive patients with prostate cancer (with no extraprostatic disease on pelvic magnetic resonance imaging [MRI]) who were treated with external beam radiotherapy, with the same primary planning target volume margins, to a dose of 79.2 Gy were evaluated. Pelvic MRI and simulation computed tomography scans were registered using dedicated software to allow for bilateral NVB target delineation on T2-weighted MRI. A volumetric modulated arc therapy plan was generated using the NVB bilaterally with 2 mm margin as an organ to spare and compared to the patient’s previously delivered plan. Dose-volume histogram endpoints for NVB, rectum, bladder, and planning target volume 79.2 were compared between the 2 plans using a 2-tailed paired t-test. Results: The V70 for the NVB was significantly lower on the NVB-sparing plan (p <0.01), while rectum and bladder endpoints were similar. Target V100% was similar but V105% was higher for the NVB-sparing plans (p <0.01). Conclusions: “NVB-sparing” radiotherapy is dosimetrically feasible using CT-MRI registration, and for volumetric modulated arc therapy technology, target coverage is acceptable without increased dose to other normal structures, but with higher target dose inhomogeneity. The clinical impact of “NVB-sparing” radiotherapy is currently under study at our institution.
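
    The dose-volume histogram comparison described above reduces to a paired test across patients. A minimal Python sketch of that kind of analysis follows; the V70 values are invented placeholders, not data from the study.

      # Paired comparison of a DVH endpoint (e.g., NVB V70) between a standard
      # plan and an NVB-sparing plan; values below are hypothetical placeholders.
      from scipy import stats

      standard_v70 = [62.1, 58.4, 71.0, 66.3, 59.8]   # % volume receiving >= 70 Gy
      sparing_v70  = [41.5, 39.2, 52.7, 47.1, 40.6]

      t_stat, p_value = stats.ttest_rel(standard_v70, sparing_v70)  # 2-tailed paired t-test
      print(f"t = {t_stat:.2f}, p = {p_value:.4f}")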

  6. A model of neonatal tidal liquid ventilation mechanics.

    PubMed

    Costantino, M L; Fiore, G B

    2001-09-01

    Tidal liquid ventilation (TLV) with perfluorocarbons (PFC) has been proposed to treat surfactant-deficient lungs of preterm neonates, since it may prevent pulmonary instability by abating saccular surface tension. With a previous model describing gas exchange, we showed that ventilator settings are crucial for CO(2) scavenging during neonatal TLV. The present work focuses on some mechanical aspects of neonatal TLV that have scarcely been studied, i.e., the distribution of mechanical loads in the lungs, which is expected to differ substantially from that in gas ventilation. A new computational model is presented, describing pulmonary PFC hydrodynamics, in which viscous losses, kinetic energy changes and lung compliance are accounted for. The model was implemented in a software package (LVMech) aimed at calculating pressures (and approximately estimating shear stresses) within the bronchial tree at different ventilator regimes. Simulations were run taking the previous model's outcomes into account. Results show that the pressure decrease due to high saccular compliance may compensate for the increased pressure drops due to PFC viscosity, and keep airway pressure low. Saccules are exposed to pressures remarkably different from those at the airway opening; during expiration, negative pressures, which may cause airway collapse, are moderate and appear in the upper airways only. Delivering the fluid with a slightly smoothed square flow wave is preferable to a sine wave. The use of LVMech allows users to become familiar with LV treatment management while taking the lungs' mechanical load into account, consistent with proper respiratory support.
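
    To illustrate the kind of hydrodynamic bookkeeping such a model performs, the sketch below estimates the viscous pressure drop of PFC flow through a single airway segment using the Hagen-Poiseuille relation; the geometry, viscosity, and flow values are illustrative assumptions, not parameters from LVMech.

      # Hagen-Poiseuille pressure drop for laminar PFC flow in a cylindrical airway.
      # All numerical values are illustrative assumptions for a neonatal-scale airway.
      import math

      mu = 1.3e-3      # PFC dynamic viscosity, Pa*s (assumed)
      length = 0.02    # airway segment length, m
      radius = 1.5e-3  # airway radius, m
      flow = 2.0e-6    # volumetric flow rate, m^3/s (assumed tidal flow)

      delta_p = 8.0 * mu * length * flow / (math.pi * radius ** 4)
      print(f"Viscous pressure drop: {delta_p:.1f} Pa "
            f"({delta_p / 98.0665:.2f} cmH2O)")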

  7. 2002 Industry Studies: Strategic Supply

    DTIC Science & Technology

    2002-01-01

    quantity at the right price. Much of the current focus of supply chain practitioners is on the strategic response to demand. An integrated and aligned...value-added services to enhance process integration in order to deliver the right product at the right price and time, through the right distribution...Kiley, “Optimization Software Designed to Make Sure That the Price is Right”, 4 February 2002, KPMG Insiders, <www.kpmginsiders.com> (25 February 2002

  8. DoD Information Assurance and Agile: Challenges and Recommendations Gathered Through Interviews with Agile Program Managers and DoD Accreditation Reviewers

    DTIC Science & Technology

    2012-11-01

    Tradeoff Analysis Method; ATAM, Capability Maturity Model, Capability Maturity Modeling, Carnegie Mellon, CERT, CERT Coordination Center, CMM, CMMI...Hermansen, Product Design, Sphere of Influence (https://www.SphereOfInfluence.com) Joel McAteer, Information Assurance Manager, Modeling ...use of them does introduce some challenges related to delivering software features rapidly and/or incrementally. • Challenges with respect to

  9. China SLAT Plan Template

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dietrich, Richard E.

    2016-07-01

    This document serves as the System-Level Acceptance Test (SLAT) Plan for Site Name, City, Country. This test plan is to provide independent testing of the Radiation Detection System (RDS) installed at Site Name to verify that Customs has been delivered a fully-functioning system as required by all contractual commitments. The system includes all installed hardware and software components. The SLAT plan will verify that separate components are working individually and collectively from a system perspective.

  10. Delivering Faster Congestion Feedback with the Mark-Front Strategy

    NASA Technical Reports Server (NTRS)

    Liu, Chunlei; Jain, Raj

    2001-01-01

    Computer networks use congestion feedback from the routers and destinations to control the transmission load. Delivering timely congestion feedback is essential to the performance of networks. Reaction to the congestion can be more effective if faster feedback is provided. Current TCP/IP networks use timeout, duplicate Acknowledgement Packets (ACKs) and explicit congestion notification (ECN) to deliver the congestion feedback, each providing faster feedback than the previous method. In this paper, we propose a mark-front strategy that delivers even faster congestion feedback. With analytical and simulation results, we show that the mark-front strategy reduces buffer size requirement, improves link efficiency and provides better fairness among users. Keywords: Explicit Congestion Notification, mark-front, congestion control, buffer size requirement, fairness.
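
    As a rough illustration of the idea (not the authors' implementation), the sketch below contrasts conventional tail marking with mark-front: when the queue exceeds a threshold, the ECN bit is set on the packet at the head of the queue, so the congestion signal reaches the receiver sooner. The Packet class and threshold are assumptions for illustration.

      # Sketch of mark-front vs. mark-tail ECN marking in a router queue.
      from collections import deque

      class Packet:
          def __init__(self, seq):
              self.seq = seq
              self.ecn_marked = False

      class Queue:
          def __init__(self, threshold, mark_front=True):
              self.buf = deque()
              self.threshold = threshold
              self.mark_front = mark_front

          def enqueue(self, pkt):
              self.buf.append(pkt)
              if len(self.buf) > self.threshold:
                  # Mark-front: the congestion bit rides on the packet that will
                  # be transmitted next, so feedback arrives one queueing delay
                  # earlier than marking the newly arrived (tail) packet.
                  target = self.buf[0] if self.mark_front else self.buf[-1]
                  target.ecn_marked = True

          def dequeue(self):
              return self.buf.popleft() if self.buf else None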

  11. An empirical evaluation of software quality assurance practices and challenges in a developing country: a comparison of Nigeria and Turkey.

    PubMed

    Sowunmi, Olaperi Yeside; Misra, Sanjay; Fernandez-Sanz, Luis; Crawford, Broderick; Soto, Ricardo

    2016-01-01

    The importance of quality assurance in the software development process cannot be overemphasized because its adoption results in high reliability and easy maintenance of the software system and other software products. Software quality assurance includes different activities such as quality control, quality management, quality standards, quality planning, process standardization and improvement amongst others. The aim of this work is to further investigate the software quality assurance practices of practitioners in Nigeria. While our previous work covered areas on quality planning, adherence to standardized processes and the inherent challenges, this work has been extended to include quality control, software process improvement and international quality standard organization membership. It also makes a comparison based on a similar study carried out in Turkey. The goal is to generate more robust findings that can properly support decision making by the software community. A qualitative research approach, specifically the use of questionnaire research instruments, was applied to acquire data from software practitioners. In addition to the previous results, it was observed that quality assurance practices are quite neglected, and this can be a cause of low patronage. Moreover, software practitioners are neither aware of international standards organizations nor of the required process improvement techniques; as such, their claimed standards are not aligned with those of accredited bodies and are limited to their local experience and knowledge, which makes them questionable. The comparison with Turkey also yielded similar findings, making the results typical of developing countries. The research instrument used was tested for internal consistency using Cronbach's alpha and was found to be reliable. For the software industry in developing countries to grow strong and become a viable source of external revenue, software assurance practices have to be taken seriously because their effect is evident in the final product. Moreover, quality frameworks and tools that require minimal time and cost are greatly needed in these countries.
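
    For reference, Cronbach's alpha for a k-item instrument is alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch of that computation follows; the response matrix is a made-up placeholder, not the survey data from the study.

      # Cronbach's alpha for internal-consistency reliability of a questionnaire.
      # Rows are respondents, columns are items; values below are placeholders.
      import numpy as np

      responses = np.array([
          [4, 5, 4, 3],
          [3, 4, 4, 4],
          [5, 5, 5, 4],
          [2, 3, 2, 3],
          [4, 4, 5, 5],
      ], dtype=float)

      k = responses.shape[1]                         # number of items
      item_vars = responses.var(axis=0, ddof=1)      # variance of each item
      total_var = responses.sum(axis=1).var(ddof=1)  # variance of the summed scores
      alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
      print(f"Cronbach's alpha = {alpha:.3f}")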

  12. SU-E-T-133: Assessing IMRT Treatment Delivery Accuracy and Consistency On a Varian TrueBeam Using the SunNuclear PerFraction EPID Dosimetry Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dieterich, S; Trestrail, E; Holt, R

    2015-06-15

    Purpose: To assess if the TrueBeam HD120 collimator is delivering small IMRT fields accurately and consistently throughout the course of treatment using the SunNuclear PerFraction software. Methods: 7-field IMRT plans for 8 canine patients who passed IMRT QA using SunNuclear Mapcheck DQA were selected for this study. The animals were set up using CBCT image guidance. The EPID fluence maps were captured for each treatment field and each treatment fraction, with the first fraction EPID data serving as the baseline for comparison. The Sun Nuclear PerFraction Software was used to compare the EPID data for subsequent fractions using a Gamma (3%/3mm) pass rate of 90%. To simulate requirements for SRS, the data was reanalyzed using a Gamma (3%/1mm) pass rate of 90%. Low-dose, low-gradient, and high-gradient thresholds were used to focus the analysis on clinically relevant parts of the dose distribution. Results: Not all fractions could be analyzed, because during some of the treatment courses the DICOM tags in the EPID images intermittently changed from CU to US (unspecified), which would indicate a temporary loss of EPID calibration. This technical issue is still being investigated. For the remaining fractions, the vast majority (7/8 of patients, 95% of fractions, and 96.6% of fields) passed the less stringent Gamma criteria. The more stringent Gamma criteria caused a drop in pass rate (90% of fractions, 84% of fields). For the patient with the lowest pass rate, wet towel bolus was used. Another patient with low pass rates experienced masseter muscle wasting. Conclusion: EPID dosimetry using the PerFraction software demonstrated that the majority of fields passed a Gamma (3%/3mm) for IMRT treatments delivered with a TrueBeam HD120 MLC. Pass rates dropped for a DTA of 1mm to model SRS tolerances. PerFraction pass rates can flag missing bolus or internal shields. Sanjeev Saini is an employee of Sun Nuclear Corporation. For this study, a pre-release version of PerFRACTION 1.1 software from Sun Nuclear Corporation was used.
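
    The pass-rate figures above come from a gamma comparison of EPID fluence maps. The sketch below implements a naive 1-D gamma index with 3%/3 mm style criteria to show what such a comparison computes; it is a simplified illustration, not the PerFraction algorithm, and the dose profiles are invented.

      # Naive 1-D gamma index: for every evaluation point, search reference
      # points for the minimum combined dose-difference / distance-to-agreement
      # metric. Profiles and criteria below are illustrative, not clinical data.
      import numpy as np

      def gamma_1d(ref_dose, eval_dose, positions, dose_crit=0.03, dta_mm=3.0):
          ref_max = ref_dose.max()
          gammas = np.empty_like(eval_dose)
          for i, (x_e, d_e) in enumerate(zip(positions, eval_dose)):
              dose_term = (ref_dose - d_e) / (dose_crit * ref_max)
              dist_term = (positions - x_e) / dta_mm
              gammas[i] = np.sqrt(dose_term ** 2 + dist_term ** 2).min()
          return gammas

      positions = np.linspace(-20, 20, 81)                          # mm
      reference = np.exp(-positions ** 2 / (2 * 8.0 ** 2))          # synthetic profile
      evaluated = np.exp(-(positions - 1.0) ** 2 / (2 * 8.0 ** 2))  # 1 mm shift

      gamma = gamma_1d(reference, evaluated, positions)
      print(f"Gamma pass rate (gamma <= 1): {100 * np.mean(gamma <= 1):.1f}%")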

  13. Envelope Interactions in Multi-Channel Amplitude Modulation Frequency Discrimination by Cochlear Implant Users.

    PubMed

    Galvin, John J; Oba, Sandra I; Başkent, Deniz; Chatterjee, Monita; Fu, Qian-Jie

    2015-01-01

    Previous cochlear implant (CI) studies have shown that single-channel amplitude modulation frequency discrimination (AMFD) can be improved when coherent modulation is delivered to additional channels. It is unclear whether the multi-channel advantage is due to increased loudness, multiple envelope representations, or to component channels with better temporal processing. Measuring envelope interference may shed light on how modulated channels can be combined. In this study, multi-channel AMFD was measured in CI subjects using a 3-alternative forced-choice, non-adaptive procedure ("which interval is different?"). For the reference stimulus, the reference AM (100 Hz) was delivered to all 3 channels. For the probe stimulus, the target AM (101, 102, 104, 108, 116, 132, 164, 228, or 256 Hz) was delivered to 1 of 3 channels, and the reference AM (100 Hz) delivered to the other 2 channels. The spacing between electrodes was varied to be wide or narrow to test different degrees of channel interaction. Results showed that CI subjects were highly sensitive to interactions between the reference and target envelopes. However, performance was non-monotonic as a function of target AM frequency. For the wide spacing, there was significantly less envelope interaction when the target AM was delivered to the basal channel. For the narrow spacing, there was no effect of target AM channel. The present data were also compared to a related previous study in which the target AM was delivered to a single channel or to all 3 channels. AMFD was much better with multiple than with single channels whether the target AM was delivered to 1 of 3 or to all 3 channels. For very small differences between the reference and target AM frequencies (2-4 Hz), there was often greater sensitivity when the target AM was delivered to 1 of 3 channels versus all 3 channels, especially for narrowly spaced electrodes. Besides the increased loudness, the present results also suggest that multiple envelope representations may contribute to the multi-channel advantage observed in previous AMFD studies. The different patterns of results for the wide and narrow spacing suggest a peripheral contribution to multi-channel temporal processing. Because the effect of target AM frequency was non-monotonic in this study, adaptive procedures may not be suitable to measure AMFD thresholds with interfering envelopes. Envelope interactions among multiple channels may be quite complex, depending on the envelope information presented to each channel and the relative independence of the stimulated channels.

  14. Envelope Interactions in Multi-Channel Amplitude Modulation Frequency Discrimination by Cochlear Implant Users

    PubMed Central

    2015-01-01

    Rationale Previous cochlear implant (CI) studies have shown that single-channel amplitude modulation frequency discrimination (AMFD) can be improved when coherent modulation is delivered to additional channels. It is unclear whether the multi-channel advantage is due to increased loudness, multiple envelope representations, or to component channels with better temporal processing. Measuring envelope interference may shed light on how modulated channels can be combined. Methods In this study, multi-channel AMFD was measured in CI subjects using a 3-alternative forced-choice, non-adaptive procedure (“which interval is different?”). For the reference stimulus, the reference AM (100 Hz) was delivered to all 3 channels. For the probe stimulus, the target AM (101, 102, 104, 108, 116, 132, 164, 228, or 256 Hz) was delivered to 1 of 3 channels, and the reference AM (100 Hz) delivered to the other 2 channels. The spacing between electrodes was varied to be wide or narrow to test different degrees of channel interaction. Results Results showed that CI subjects were highly sensitive to interactions between the reference and target envelopes. However, performance was non-monotonic as a function of target AM frequency. For the wide spacing, there was significantly less envelope interaction when the target AM was delivered to the basal channel. For the narrow spacing, there was no effect of target AM channel. The present data were also compared to a related previous study in which the target AM was delivered to a single channel or to all 3 channels. AMFD was much better with multiple than with single channels whether the target AM was delivered to 1 of 3 or to all 3 channels. For very small differences between the reference and target AM frequencies (2–4 Hz), there was often greater sensitivity when the target AM was delivered to 1 of 3 channels versus all 3 channels, especially for narrowly spaced electrodes. Conclusions Besides the increased loudness, the present results also suggest that multiple envelope representations may contribute to the multi-channel advantage observed in previous AMFD studies. The different patterns of results for the wide and narrow spacing suggest a peripheral contribution to multi-channel temporal processing. Because the effect of target AM frequency was non-monotonic in this study, adaptive procedures may not be suitable to measure AMFD thresholds with interfering envelopes. Envelope interactions among multiple channels may be quite complex, depending on the envelope information presented to each channel and the relative independence of the stimulated channels. PMID:26431043

  15. Software Aids for radiologists: Part 1, Useful Photoshop skills.

    PubMed

    Gross, Joel A; Thapa, Mahesh M

    2012-12-01

    The purpose of this review is to describe the use of several essential techniques and tools in Adobe Photoshop image-editing software. The techniques shown expand on those previously described in the radiologic literature. Radiologists, especially those with minimal experience with image-editing software, can quickly apply a few essential Photoshop tools to minimize the frustration that can result from attempting to navigate a complex user interface.

  16. Automated complex for research of electric drives control

    NASA Astrophysics Data System (ADS)

    Avlasko, P. V.; Antonenko, D. A.

    2018-05-01

    This article describes an automated complex intended for studying various control modes of electric motors, including the doubly fed induction motor. The complex is built on the National Instruments platform. The platform's embedded controller is supplied with a real-time operating system for building measurement and control systems. The software, developed in LabVIEW, consists of several interconnected modules residing in different elements of the complex. In addition to the software for automated control of the experimental installation, a program suite was developed for modelling processes in the electric drive. As a result, the simulated transient characteristics of the electric drive can be compared with those obtained experimentally in various operating modes.

  17. Mendel-GPU: haplotyping and genotype imputation on graphics processing units

    PubMed Central

    Chen, Gary K.; Wang, Kai; Stram, Alex H.; Sobel, Eric M.; Lange, Kenneth

    2012-01-01

    Motivation: In modern sequencing studies, one can improve the confidence of genotype calls by phasing haplotypes using information from an external reference panel of fully typed unrelated individuals. However, the computational demands are so high that they prohibit researchers with limited computational resources from haplotyping large-scale sequence data. Results: Our graphics processing unit based software delivers haplotyping and imputation accuracies comparable to competing programs at a fraction of the computational cost and peak memory demand. Availability: Mendel-GPU, our OpenCL software, runs on Linux platforms and is portable across AMD and nVidia GPUs. Users can download both code and documentation at http://code.google.com/p/mendel-gpu/. Contact: gary.k.chen@usc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:22954633

  18. The Raid distributed database system

    NASA Technical Reports Server (NTRS)

    Bhargava, Bharat; Riedl, John

    1989-01-01

    Raid, a robust and adaptable distributed database system for transaction processing (TP), is described. Raid is a message-passing system, with server processes on each site to manage concurrent processing, consistent replicated copies during site failures, and atomic distributed commitment. A high-level layered communications package provides a clean location-independent interface between servers. The latest design of the package delivers messages via shared memory in a configuration with several servers linked into a single process. Raid provides the infrastructure to investigate various methods for supporting reliable distributed TP. Measurements on TP and server CPU time are presented, along with data from experiments on communications software, consistent replicated copy control during site failures, and concurrent distributed checkpointing. A software tool for evaluating the implementation of TP algorithms in an operating-system kernel is proposed.

  19. Software development: A paradigm for the future

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.

    1989-01-01

    A new paradigm for software development that treats software development as an experimental activity is presented. It provides built-in mechanisms for learning how to develop software better and reusing previous experience in the forms of knowledge, processes, and products. It uses models and measures to aid in the tasks of characterization, evaluation and motivation. An organization scheme is proposed for separating the project-specific focus from the organization's learning and reuse focuses of software development. The implications of this approach for corporations, research and education are discussed and some research activities currently underway at the University of Maryland that support this approach are presented.

  20. Validation of thermal effects of LED package by using Elmer finite element simulation method

    NASA Astrophysics Data System (ADS)

    Leng, Lai Siang; Retnasamy, Vithyacharan; Mohamad Shahimin, Mukhzeer; Sauli, Zaliman; Taniselass, Steven; Bin Ab Aziz, Muhamad Hafiz; Vairavan, Rajendaran; Kirtsaeng, Supap

    2017-02-01

    The overall performance of the light-emitting diode (LED) package is critically affected by thermal effects. In this study, the open source software Elmer FEM was utilized to perform the thermal analysis of the LED package. In order to carry out a complete simulation study, both the Salome software and the ParaView software were introduced as pre- and postprocessors. The thermal effect of the LED package was evaluated with this software. The results were validated against commercially licensed software on the basis of previous work. The percentage difference between the two simulation results is less than 5%, which is tolerable and comparable.

  1. Ground station software for receiving and handling Irecin telemetry data

    NASA Astrophysics Data System (ADS)

    Ferrante, M.; Petrozzi, M.; Di Ciolo, L.; Ortenzi, A.; Troso, G

    2004-11-01

    The on-board resources available to perform mission tasks are very limited in nano-satellites. This paper proposes a software system to receive, manage, and process in real time the telemetry data coming from the IRECIN nanosatellite, and to transmit operator manual commands and operative procedures. During the receiving phase, it displays the IRECIN subsystem physical values, visualizes the IRECIN attitude, and performs other related functions. The IRECIN ground station program is in charge of exchanging information between IRECIN and the ground segment. In real time, during the IRECIN transmission phase, it carries out IRECIN attitude drawing, sun-direction drawing, computation of the power received from the Sun, visualization of the telemetry data, visualization of the Earth's magnetic field, and several other functions. The received data are stored and interpreted by a parser module and distributed to the appropriate modules. The program also allows manual and automatic commands to be sent: manual commands are issued by an operator, whereas automatic commands are provided by pre-configured operative procedures developed in an earlier configuration phase. The program can also carry out a test session by means of the scheduler and commanding modules, allowing execution of specific tasks without operator control. A log module stores received and transmitted data, and an offline post-analysis phase, based on data extracted from the log module, analyzes, filters, and visualizes the collected data. The ground station software can also work over a network, allowing data and commands to be managed, received, and sent from different sites. The proposed system constitutes the software of the IRECIN ground station. IRECIN is a modular nanosatellite weighing less than 2 kg, consisting of sixteen external sides with surface-mounted solar cells and three internal Al plates held together by four steel bars. Lithium-ion batteries are used. Attitude is determined by two three-axis magnetometers and the solar panel data. Control is provided by an active magnetic control system. The spacecraft will be spin-stabilized with the spin axis normal to the orbit. All IRECIN electronic components use SMD technology in order to reduce weight and size. The electronic boards were completely developed, realized, and tested at Vitrociset S.P.A. under the control of the Research and Development Group.
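
    To illustrate the parser/distribution step described above, the following Python sketch decodes a hypothetical fixed-layout telemetry frame and routes the fields to subsystem handlers. The frame format, field names, and scaling are assumptions for illustration only, not the actual IRECIN downlink format.

      # Sketch of a telemetry parser that decodes a (hypothetical) fixed-layout
      # frame and dispatches the values to subsystem handlers, as a ground-station
      # parser module might. Frame layout and scale factors are invented.
      import struct

      # Hypothetical frame: sync word, battery voltage (mV), bus current (mA),
      # three magnetometer axes (nT), board temperature (0.1 degC), big-endian.
      FRAME_FORMAT = ">HHhhhhh"

      def parse_frame(raw: bytes) -> dict:
          sync, v_batt, i_bus, mx, my, mz, temp = struct.unpack(FRAME_FORMAT, raw)
          if sync != 0xCAFE:
              raise ValueError("bad sync word")
          return {
              "power":    {"v_batt_V": v_batt / 1000.0, "i_bus_A": i_bus / 1000.0},
              "attitude": {"mag_nT": (mx, my, mz)},
              "thermal":  {"board_temp_C": temp / 10.0},
          }

      def dispatch(telemetry: dict, handlers: dict) -> None:
          # Route each subsystem's values to its registered handler
          # (e.g., display, attitude estimator, log module).
          for subsystem, values in telemetry.items():
              handlers.get(subsystem, print)(values)

      if __name__ == "__main__":
          frame = struct.pack(FRAME_FORMAT, 0xCAFE, 7400, 320, 12000, -3500, 800, 215)
          dispatch(parse_frame(frame), {"power": lambda v: print("POWER", v)})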

  2. Services supporting collaborative alignment of engineering networks

    NASA Astrophysics Data System (ADS)

    Jansson, Kim; Uoti, Mikko; Karvonen, Iris

    2015-08-01

    Large-scale facilities such as power plants, process factories, ships and communication infrastructures are often engineered and delivered through geographically distributed operations. The competencies required are usually distributed across several contributing organisations. In these complicated projects, it is of key importance that all partners work coherently towards a common goal. VTT and a number of industrial organisations in the marine sector have participated in a national collaborative research programme addressing these needs. The main output of this programme was development of the Innovation and Engineering Maturity Model for Marine-Industry Networks. The recently completed European Union Framework Programme 7 project COIN developed innovative solutions and software services for enterprise collaboration and enterprise interoperability. One area of focus in that work was services for collaborative project management. This article first addresses a number of central underlying research themes and previous research results that have influenced the development work mentioned above. This article presents two approaches for the development of services that support distributed engineering work. Experience from use of the services is analysed, and potential for development is identified. This article concludes with a proposal for consolidation of the two above-mentioned methodologies. This article outlines the characteristics and requirements of future services supporting collaborative alignment of engineering networks.

  3. A novel frame-level constant-distortion bit allocation for smooth H.264/AVC video quality

    NASA Astrophysics Data System (ADS)

    Liu, Li; Zhuang, Xinhua

    2009-01-01

    It is known that quality fluctuation has a major negative effect on visual perception. In previous work, we introduced a constant-distortion bit allocation method [1] for the H.263+ encoder. However, the method in [1] cannot be adapted directly to the newer H.264/AVC encoder because of the well-known chicken-and-egg dilemma arising from the rate-distortion optimization (RDO) decision process. To solve this problem, we propose a new two-stage constant-distortion bit allocation (CDBA) algorithm with enhanced rate control for the H.264/AVC encoder. In stage 1, the algorithm performs the RD optimization process with a constant quantization parameter QP. Based on prediction residual signals from stage 1 and the target distortion for smooth video quality, the frame-level bit target is allocated by using a closed-form approximation of the rate-distortion relationship similar to [1], and a fast stage-2 encoding process is performed with enhanced basic-unit rate control. Experimental results show that, compared with the original rate control algorithm provided by the H.264/AVC reference software JM12.1, the proposed constant-distortion frame-level bit allocation scheme reduces quality fluctuation and delivers much smoother PSNR on all test sequences.
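
    A toy illustration of the stage-2 idea follows: allocating each frame's bit budget from a common target distortion via a closed-form rate-distortion approximation. The classical high-rate Gaussian model R(D) = 0.5*log2(sigma^2/D) bits per sample is used here only as a stand-in for the relationship actually fitted in the paper, and all numbers are hypothetical.

      # Toy constant-distortion bit allocation: given per-frame prediction-residual
      # variances from a first coding pass and a common target distortion, estimate
      # each frame's bit budget with the high-rate Gaussian R(D) model
      #   R(D) = 0.5 * log2(sigma^2 / D)   bits per sample.
      # Stand-in illustration, not the closed-form model used in the paper.
      import math

      def allocate_bits(residual_variances, target_distortion, samples_per_frame):
          budgets = []
          for sigma2 in residual_variances:
              rate = max(0.0, 0.5 * math.log2(sigma2 / target_distortion))  # bits/sample
              budgets.append(rate * samples_per_frame)
          return budgets

      variances = [180.0, 95.0, 310.0, 60.0]   # hypothetical per-frame residual variances
      bits = allocate_bits(variances, target_distortion=40.0, samples_per_frame=352 * 288)
      for i, b in enumerate(bits):
          print(f"frame {i}: {b / 1000:.1f} kbits")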

  4. The Use of Uas for Rapid 3d Mapping in Geomatics Education

    NASA Astrophysics Data System (ADS)

    Teo, Tee-Ann; Tian-Yuan Shih, Peter; Yu, Sz-Cheng; Tsai, Fuan

    2016-06-01

    With the development of technology, UAS has become an advanced technology to support rapid mapping for disaster response. The aim of this study is to develop educational modules for UAS data processing in rapid 3D mapping. The modules designed for this study focus on UAV data processing with available freeware or trial software for educational purposes. The key modules include orientation modelling, 3D point cloud generation, image georeferencing and visualization. The orientation modelling module adopts VisualSFM to determine the projection matrix for each image station. Besides, the approximate ground control points are measured from OpenStreetMap for absolute orientation. The second module uses SURE and the orientation files from the previous module for 3D point cloud generation. Then, ground point selection and digital terrain model generation can be achieved with LAStools. The third module stitches individual rectified images into a mosaic image using Microsoft ICE (Image Composite Editor). The last module visualizes and measures the generated dense point clouds in CloudCompare. These comprehensive UAS processing modules allow the students to gain the skills to process and deliver UAS photogrammetric products in rapid 3D mapping. Moreover, they can also apply the photogrammetric products for analysis in practice.

  5. eXascale PRogramming Environment and System Software (XPRESS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chapman, Barbara; Gabriel, Edgar

    Exascale systems, with a thousand times the compute capacity of today’s leading edge petascale computers, are expected to emerge during the next decade. Their software systems will need to facilitate the exploitation of exceptional amounts of concurrency in applications, and ensure that jobs continue to run despite the occurrence of system failures and other kinds of hard and soft errors. Adapting computations at runtime to cope with changes in the execution environment, as well as to improve power and performance characteristics, is likely to become the norm. As a result, considerable innovation is required to develop system support to meet the needs of future computing platforms. The XPRESS project aims to develop and prototype a revolutionary software system for extreme-scale computing for both exascale and strong-scaled problems. The XPRESS collaborative research project will advance the state-of-the-art in high performance computing and enable exascale computing for current and future DOE mission-critical applications and supporting systems. The goals of the XPRESS research project are to: A. enable exascale performance capability for DOE applications, both current and future, B. develop and deliver a practical computing system software X-stack, OpenX, for future practical DOE exascale computing systems, and C. provide programming methods and environments for effective means of expressing application and system software for portable exascale system execution.

  6. ELISA, a demonstrator environment for information systems architecture design

    NASA Technical Reports Server (NTRS)

    Panem, Chantal

    1994-01-01

    This paper describes an approach to the reuse of software engineering technology in the area of ground space system design. System engineers have many needs similar to those of software developers: sharing of a common data base, capitalization of knowledge, definition of a common design process, and communication between different technical domains. Moreover, system designers need to simulate their system dynamically as early as possible. Software development environments, methods and tools have now become operational and widely used. Their architecture is based on a single object base and a set of common management services, and they host a family of tools for each life cycle activity. In late '92, CNES decided to develop a demonstrative software environment supporting some system activities. The design of ground space data processing systems was chosen as the application domain. ELISA (Integrated Software Environment for Architectures Specification) was specified as a 'demonstrator', i.e. a sufficient basis for demonstrations, evaluation and future operational enhancements. A process with three phases was implemented: system requirements definition, design of system architecture models, and selection of physical architectures. Each phase is composed of several activities that can be performed in parallel, with the support of commercial off-the-shelf tools. ELISA was delivered to CNES in January 1994 and is currently used for demonstrations and evaluations on real projects (e.g. the SPOT4 Satellite Control Center). New evolutions are under way.

  7. Exploring the Role of Usability in the Software Process: A Study of Irish Software SMEs

    NASA Astrophysics Data System (ADS)

    O'Connor, Rory V.

    This paper explores the software processes and usability techniques used by Small and Medium Enterprises (SMEs) that develop web applications. The significance of this research is that it looks at development processes used by SMEs in order to assess to what degree usability is integrated into the process. This study seeks to gain an understanding of the level of awareness of usability within SMEs today and their commitment to usability in practice. The motivation for this research is to explore the current development processes used by SMEs in developing web applications and to understand how usability is represented in those processes. The background for this research is provided by the growth of the web application industry beyond informational web sites to more sophisticated applications delivering a broad range of functionality. This paper presents an analysis of the practices of several Irish SMEs that develop web applications through a series of case studies, with a focus on SMEs that develop web applications as Management Information Systems rather than e-commerce sites, informational sites, online communities or web portals. This study gathered data about the usability techniques practiced by these companies and their awareness of usability in the context of the software process in those SMEs. The contribution of this study is to further the understanding of the current role of usability within the software development processes of SMEs that develop web applications.

  8. [Epidemiology of shoulder dystocia].

    PubMed

    Deneux-Tharaux, C; Delorme, P

    2015-12-01

    To synthesize the available evidence regarding the incidence and risk factors of shoulder dystocia (SD). Consultation of the Medline database, and of national guidelines. Shoulder dystocia is defined as a vaginal delivery that requires additional obstetric manoeuvres to deliver the foetus after the head has delivered and gentle traction has failed. With this definition, the incidence of SD in population-based studies is about 0.5-1% of vaginal deliveries. Many risk factors have been described, but most associations are not independent, or have not been consistently found. The 2 characteristics consistently found as independent risk factors for SD in the literature are previous SD (incidence of SD of about 10% in parturients with previous SD) and foetal macrosomia. Maternal diabetes and obesity also are associated with a higher risk of SD (2- to 4-fold), but these associations may be completely explained by foetal macrosomia. However, even factors independently and consistently associated with SD do not allow a valid prediction of SD because they are not discriminant; 50 to 70% of SD cases occur in their absence, and the great majority of deliveries in which they are present are not associated with SD. Shoulder dystocia is defined by the need for additional obstetric manoeuvres to deliver the foetus after the head has delivered and gentle traction has failed, and it complicates 0.5-1% of vaginal deliveries. Its main risk factors are previous SD and macrosomia, but they are poorly predictive. SD remains a non-predictable obstetric emergency. Knowledge of SD risk factors should increase the vigilance of clinicians in at-risk contexts. Copyright © 2015. Published by Elsevier Masson SAS.

  9. Investigation on the ability of an ultrasound bubble detector to deliver size measurements of gaseous bubbles in fluid lines by using a glass bead model.

    PubMed

    Eitschberger, S; Henseler, A; Krasenbrink, B; Oedekoven, B; Mottaghy, K

    2001-01-01

    Detectors based on ultrasonic principles are today's state-of-the-art devices for detecting gaseous bubbles that may be present in extracorporeal circuits (ECC) for various reasons. Referring to theoretical considerations and other studies, it also seems possible to use this technology to measure the size of detected bubbles, thus offering the chance to evaluate their potentially hazardous effect if introduced into a patient's circulation. Based on these considerations, a commercially available ultrasound bubble detector has been developed by Hatteland Instrumentering, Norway, to deliver bubble size measurements by means of supplementary software. This device consists of an ultrasound sensor that can be clamped onto the ECC tubing, and the necessary electronic equipment to amplify and rectify the received signals. It is supplemented by software that processes these signals and presents them as specific data. On the basis of our knowledge and experience with bubble detection by ultrasound technology, we believe it is particularly difficult to meet all the requirements for size measurements, especially if these are to be achieved by a mathematical procedure rather than exact devices. Therefore, we tried to evaluate the quality of the offered bubble detector in measuring bubble sizes. After establishing a standardized test stand, including a roller pump and a temperature sensor, we performed several sets of experiments using the manufacturer's software and a program specifically designed at our department for this purpose. The first set revealed that the manufacturer's recommended calibration material did not meet essential requirements as established by other authors. Having solved that problem, we could actually demonstrate that the ultrasonic field, as generated by the bubble detector, has been correctly calculated by the manufacturer. In short, it is a field with the strongest reflecting region in the center, losing strength toward the edge of the ECC tubing. The following set of experiments revealed that the supplementary software not only fails to compensate for the ultrasonic field's inhomogeneity but, furthermore, delivers results that are inappropriate for the applied calibration material. In the last set of experiments, we were able to demonstrate that the signals recorded by the bubble detector depend heavily on the circulating fluid's temperature, a fact that the manufacturer does not address. Therefore, it seems impossible to resolve all these sensor-related problems by ever-increasing mathematical intervention. We believe it is more appropriate to develop a new kind of ultrasound device, free of these shortcomings. This seems particularly worthwhile, because the problem of determining the size of gaseous bubbles in ECC is not yet solved.

  10. RELAP-7 Software Verification and Validation Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.

  11. Sociocultural determinants of home delivery in Ethiopia: a qualitative study.

    PubMed

    Kaba, Mirgissa; Bulto, Tesfaye; Tafesse, Zergu; Lingerh, Wassie; Ali, Ismael

    2016-01-01

    Maternal health remains a major public health problem in Ethiopia. Despite the government's measures to ensure institutional delivery assisted by skilled attendants, home delivery remains high, estimated at over 80% of all pregnant women. The study aims to identify determinants that sustain home delivery in Ethiopia. A total of 48 women who delivered their most recent child at home, 56 women who delivered their most recent child in a health facility, 55 husbands of women who delivered within 1 year preceding the study, and 23 opinion leaders in selected districts of Amhara, Oromia, Southern Nations, Nationalities, and Peoples' Region, and Tigray regions were involved in the study. Key informant interview, in-depth interviews, and focus group discussions were conducted to collect data using checklists developed for this purpose. Data reduction and analysis were facilitated by Maxqda qualitative data analysis software version 11. Findings show that pregnancy and delivery is a normal and natural life event. Research participants unanimously argue that such a life event should not be linked with health problems. Home is considered a natural space for delivery and most women aspire to deliver at home where rituals during labor and after delivery are considered enjoyable. Even those who delivered in health facilities appreciate events in connection to home delivery. Efforts are underway to create home-like environments in health facilities, but health facilities are not yet recognized as a natural place of delivery. The positive tendency to deliver at home is further facilitated by poor service delivery at the facility level. Perceived poor competence of providers and limited availability of supplies and equipment were found to maintain the preference to deliver at home. The government's endeavor to improve maternal health has generated positive results with more women now attending antenatal care. Yet over 80% of women deliver at home and this was found to be the preferred option. Thus, the current form of intervention needs to focus on factors that determine decisions to deliver at home and also focus on investing in improving service delivery at health facilities.

  12. Sociocultural determinants of home delivery in Ethiopia: a qualitative study

    PubMed Central

    Kaba, Mirgissa; Bulto, Tesfaye; Tafesse, Zergu; Lingerh, Wassie; Ali, Ismael

    2016-01-01

    Background Maternal health remains a major public health problem in Ethiopia. Despite the government’s measures to ensure institutional delivery assisted by skilled attendants, home delivery remains high, estimated at over 80% of all pregnant women. Objective The study aims to identify determinants that sustain home delivery in Ethiopia. Methods A total of 48 women who delivered their most recent child at home, 56 women who delivered their most recent child in a health facility, 55 husbands of women who delivered within 1 year preceding the study, and 23 opinion leaders in selected districts of Amhara, Oromia, Southern Nations, Nationalities, and Peoples’ Region, and Tigray regions were involved in the study. Key informant interview, in-depth interviews, and focus group discussions were conducted to collect data using checklists developed for this purpose. Data reduction and analysis were facilitated by Maxqda qualitative data analysis software version 11. Results Findings show that pregnancy and delivery is a normal and natural life event. Research participants unanimously argue that such a life event should not be linked with health problems. Home is considered a natural space for delivery and most women aspire to deliver at home where rituals during labor and after delivery are considered enjoyable. Even those who delivered in health facilities appreciate events in connection to home delivery. Efforts are underway to create home-like environments in health facilities, but health facilities are not yet recognized as a natural place of delivery. The positive tendency to deliver at home is further facilitated by poor service delivery at the facility level. Perceived poor competence of providers and limited availability of supplies and equipment were found to maintain the preference to deliver at home. Conclusion The government’s endeavor to improve maternal health has generated positive results with more women now attending antenatal care. Yet over 80% of women deliver at home and this was found to be the preferred option. Thus, the current form of intervention needs to focus on factors that determine decisions to deliver at home and also focus on investing in improving service delivery at health facilities. PMID:27114718

  13. Enhancing the immersive reality of virtual simulators for easily accessible laparoscopic surgical training

    NASA Astrophysics Data System (ADS)

    McKenna, Kyra; McMenemy, Karen; Ferguson, R. S.; Dick, Alistair; Potts, Stephen

    2008-02-01

    Computer simulators are a popular method of training surgeons in the techniques of laparoscopy. However, for the trainee to feel totally immersed in the process, the graphical display should be as lifelike as possible and two-handed force feedback interaction is required. This paper reports on how a compelling immersive experience can be delivered at low cost using commonly available hardware components. Three specific themes are brought together. Firstly, programmable shaders executing on a standard PC graphics adapter deliver the appearance of anatomical realism, including the effects of translucent tissue surfaces, semi-transparent membranes, multilayer image texturing and real-time shadowing. Secondly, relatively inexpensive 'off the shelf' force feedback devices contribute to a holistic immersive experience. The final element described is the custom software that brings these together with hierarchically organized and optimized polygonal models of abdominal anatomy.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    The plpdfa software is a product of an LDRD project at LLNL entitled "Adaptive Sampling for Very High Throughput Data Streams" (tracking number 11-ERD-035). This software was developed by a graduate student summer intern, Chris Challis, who worked under project PI Dan Merl during the summer of 2011. The source code implements a statistical analysis technique for clustering and classification of text-valued data. The method had been previously published by the PI in the open literature.

  15. Auto-Coding UML Statecharts for Flight Software

    NASA Technical Reports Server (NTRS)

    Benowitz, Edward G; Clark, Ken; Watney, Garth J.

    2006-01-01

    Statecharts have been used as a means to communicate behaviors in a precise manner between system engineers and software engineers. Hand-translating a statechart to code, as done on some previous space missions, introduces the possibility of errors in the transformation from chart to code. To improve auto-coding, we have developed a process that generates flight code from UML statecharts. Our process is being used for the flight software on the Space Interferometer Mission (SIM).
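
    As a schematic of what auto-coding produces, the sketch below shows a trivial statechart-to-code generator: given a declarative chart description, it emits a dispatch function as source text. The chart contents and the generated style are hypothetical and unrelated to the SIM flight code, which is generated by the project's own tooling.

      # Sketch of a trivial statechart-to-code generator. Chart contents and the
      # generated code style are hypothetical, purely to illustrate the idea of
      # producing source code from a statechart description instead of by hand.
      CHART = {
          "name": "HeaterControl",
          "initial": "Off",
          "transitions": [
              ("Off", "cmd_on", "Warming"),
              ("Warming", "at_temp", "Regulating"),
              ("Regulating", "cmd_off", "Off"),
          ],
      }

      def generate(chart: dict) -> str:
          lines = [f"def {chart['name']}_step(state, event):"]
          first = True
          for src, event, dst in chart["transitions"]:
              kw = "if" if first else "elif"
              lines.append(f"    {kw} state == {src!r} and event == {event!r}:")
              lines.append(f"        return {dst!r}")
              first = False
          lines.append("    return state  # ignore events not valid in this state")
          return "\n".join(lines)

      if __name__ == "__main__":
          print(generate(CHART))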

  16. Long range targeting for space based rendezvous

    NASA Technical Reports Server (NTRS)

    Everett, Louis J.; Redfield, R. C.

    1995-01-01

    The work performed under this grant supported the Dexterous Flight Experiment on STS-62. The project required developing hardware and software for automating a TRAC sensor on orbit. The hardware developed for the flight has been documented through standard NASA channels, since it had to pass safety, environmental, and other reviews. The software has not been documented previously; therefore, this report provides a software manual for the TRAC code developed under the grant.

  17. A Comparison of Software Schedule Estimators

    DTIC Science & Technology

    1990-09-01

    [Table-of-contents fragments: SLIM; SPQR/20; System-4; PRICE-S Outputs; COCOMO Factors by Category; SPQR/20 Activities.] ...actual schedules experienced on the projects. The models analyzed were REVIC, PRICE-S, System-4, SPQR/20, and SEER.

  18. Perceptions of the software skills of graduates by employers in the financial services industry

    NASA Astrophysics Data System (ADS)

    Kyng, Tim; Tickle, Leonie; Wood, Leigh N.

    2013-12-01

    Software, particularly spreadsheet software, is ubiquitous in the financial services workplace. Yet little is known about the extent to which universities should, and do, prepare graduates for this aspect of the modern workplace. We have investigated this issue through a survey of financial services employers of graduates, the results of which are reported in this paper, as well as surveys of university graduates and academics, reported previously. Financial services employers rate software skills as important, would like their employees to be more highly skilled in the use of such software, and tend to prefer 'on-the-job' training rather than university training for statistical, database and specialized actuarial/financial software. There is a perception among graduates that employers do not provide adequate formal workplace training in the use of technical software.

  19. Preliminary Studies for a CBCT Imaging Protocol for Offline Organ Motion Analysis: Registration Software Validation and CTDI Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Falco, Maria Daniela, E-mail: mdanielafalco@hotmail.co; Fontanarosa, Davide; Miceli, Roberto

    2011-04-01

    Cone-beam X-ray volumetric imaging in the treatment room allows online correction of set-up errors and offline assessment of residual set-up errors and organ motion. In this study the registration algorithm of the X-ray volume imaging software (XVI, Elekta, Crawley, United Kingdom), which manages a commercial cone-beam computed tomography (CBCT)-based positioning system, has been tested using a homemade and an anthropomorphic phantom to: (1) assess its performance in detecting known translational and rotational set-up errors and (2) transfer the transformation matrix of its registrations into a commercial treatment planning system (TPS) for offline organ motion analysis. Furthermore, the CBCT dose index has been measured for a particular site (prostate: 120 kV, 1028.8 mAs, approximately 640 frames) using a standard Perspex cylindrical body phantom (diameter 32 cm, length 15 cm) and a 10-cm-long pencil ionization chamber. We have found that known displacements were correctly calculated by the registration software to within 1.3 mm and 0.4°. For the anthropomorphic phantom, only translational displacements have been considered. Both studies have shown errors within the intrinsic uncertainty of our system for translational displacements (estimated as 0.87 mm) and rotational displacements (estimated as 0.22°). The resulting table translations proposed by the system to correct the displacements were also checked with portal images and found to place the isocenter of the plan on the linac isocenter within an error of 1 mm, which is the dimension of the spherical lead marker inserted at the center of the homemade phantom. The registration matrix translated into the TPS image fusion module correctly reproduced the alignment between planning CT scans and CBCT scans. Finally, measurements on the CBCT dose index indicate that CBCT acquisition delivers less dose than conventional CT scans and electronic portal imaging device portals. The registration software was found to be accurate, its registration matrix can be easily translated into the TPS, and a low dose is delivered to the patient during image acquisition. These results can help in designing imaging protocols for offline evaluations.

  20. Page Recognition: Quantum Leap In Recognition Technology

    NASA Astrophysics Data System (ADS)

    Miller, Larry

    1989-07-01

    No milestone has proven as elusive as the always-approaching "year of the LAN," but the "year of the scanner" might claim the silver medal. Desktop scanners have been around almost as long as personal computers. And everyone thinks they are used for obvious desktop-publishing and business tasks like scanning business documents, magazine articles and other pages, and translating those words into files your computer understands. But, until now, the reality fell far short of the promise. Because it's true that scanners deliver an accurate image of the page to your computer, but the software to recognize this text has been woefully disappointing. Old optical-character recognition (OCR) software recognized such a limited range of pages as to be virtually useless to real users. (For example, one OCR vendor specified 12-point Courier font from an IBM Selectric typewriter: the same font in 10-point, or from a Diablo printer, was unrecognizable!) Computer dealers have told me the chasm between OCR expectations and reality is so broad and deep that nine out of ten prospects leave their stores in disgust when they learn the limitations. And this is a very important, very unfortunate gap. Because the promise of recognition -- what people want it to do -- carries with it tremendous improvements in our productivity and ability to get tons of written documents into our computers where we can do real work with it. The good news is that a revolutionary new development effort has led to the new technology of "page recognition," which actually does deliver the promise we've always wanted from OCR. I'm sure every reader appreciates the breakthrough represented by the laser printer and page-makeup software, a combination so powerful it created new reasons for buying a computer. A similar breakthrough is happening right now in page recognition: the Macintosh (and, I must admit, other personal computers) equipped with a moderately priced scanner and OmniPage software (from Caere Corporation) can recognize not only different fonts (omnifont recognition) but different page (omnipage) formats, as well.

  1. Open-source, small-animal magnetic resonance-guided focused ultrasound system.

    PubMed

    Poorman, Megan E; Chaplin, Vandiver L; Wilkens, Ken; Dockery, Mary D; Giorgio, Todd D; Grissom, William A; Caskey, Charles F

    2016-01-01

    MR-guided focused ultrasound or high-intensity focused ultrasound (MRgFUS/MRgHIFU) is a non-invasive therapeutic modality with many potential applications in areas such as cancer therapy, drug delivery, and blood-brain barrier opening. However, the large financial costs involved in developing preclinical MRgFUS systems represent a barrier to research groups interested in developing new techniques and applications. We aim to mitigate these challenges by detailing a validated, open-source preclinical MRgFUS system capable of delivering thermal and mechanical FUS in a quantifiable and repeatable manner under real-time MRI guidance. A hardware and software package was developed that includes closed-loop feedback controlled thermometry code and CAD drawings for a therapy table designed for a preclinical MRI scanner. For thermal treatments, the modular software uses a proportional integral derivative controller to maintain a precise focal temperature rise in the target given input from MR phase images obtained concurrently. The software computes the required voltage output and transmits it to a FUS transducer that is embedded in the delivery table within the magnet bore. The delivery table holds the FUS transducer, a small animal and its monitoring equipment, and a transmit/receive RF coil. The transducer is coupled to the animal via a water bath and is translatable in two dimensions from outside the magnet. The transducer is driven by a waveform generator and amplifier controlled by real-time software in Matlab. MR acoustic radiation force imaging is also implemented to confirm the position of the focus for mechanical and thermal treatments. The system was validated in tissue-mimicking phantoms and in vivo during murine tumor hyperthermia treatments. Sonications were successfully controlled over a range of temperatures and thermal doses for up to 20 min with minimal temperature overshoot. MR thermometry was validated with an optical temperature probe, and focus visualization was achieved with acoustic radiation force imaging. We developed an MRgFUS platform for small-animal treatments that robustly delivers accurate, precise, and controllable sonications over extended time periods. This system is an open source and could increase the availability of low-cost small-animal systems to interdisciplinary researchers seeking to develop new MRgFUS applications and technology.
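
    To illustrate the closed-loop thermometry control described above, the sketch below runs a textbook PID update against a crude first-order tissue-heating model; the gains, plant constants, and setpoint are invented placeholders and unrelated to the published system's parameters.

      # Textbook PID loop driving a crude first-order heating model, illustrating
      # closed-loop control of a focal temperature rise from MR-thermometry
      # feedback. Gains, plant constants, and the setpoint are invented.
      def simulate(setpoint=10.0, dt=1.0, steps=120,
                   kp=1.2, ki=0.15, kd=0.4,
                   gain=0.8, tau=25.0, max_drive=5.0):
          temp_rise, integral, prev_error = 0.0, 0.0, 0.0
          history = []
          for _ in range(steps):
              error = setpoint - temp_rise            # from MR phase-difference thermometry
              integral += error * dt
              derivative = (error - prev_error) / dt
              drive = kp * error + ki * integral + kd * derivative
              drive = min(max(drive, 0.0), max_drive)  # clamp transducer output
              prev_error = error
              # First-order plant: heating proportional to drive, cooling by perfusion.
              temp_rise += dt * (gain * drive - temp_rise / tau)
              history.append(temp_rise)
          return history

      if __name__ == "__main__":
          trace = simulate()
          print(f"final temperature rise: {trace[-1]:.2f} degC")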

  2. A comprehensive data acquisition and management system for an ecosystem-scale peatland warming and elevated CO2 experiment

    NASA Astrophysics Data System (ADS)

    Krassovski, M. B.; Riggs, J. S.; Hook, L. A.; Nettles, W. R.; Hanson, P. J.; Boden, T. A.

    2015-07-01

    Ecosystem-scale manipulation experiments represent large science investments that require well-designed data acquisition and management systems to provide reliable, accurate information to project participants and third party users. The SPRUCE Project (Spruce and Peatland Responses Under Climatic and Environmental Change, http://mnspruce.ornl.gov) is such an experiment funded by the Department of Energy's (DOE), Office of Science, Terrestrial Ecosystem Science (TES) Program. The SPRUCE experimental mission is to assess ecosystem-level biological responses of vulnerable, high carbon terrestrial ecosystems to a range of climate warming manipulations and an elevated CO2 atmosphere. SPRUCE provides a platform for testing mechanisms controlling the vulnerability of organisms, biogeochemical processes, and ecosystems to climatic change (e.g., thresholds for organism decline or mortality, limitations to regeneration, biogeochemical limitations to productivity, the cycling and release of CO2 and CH4 to the atmosphere). The SPRUCE experiment will generate a wide range of continuous and discrete measurements. To successfully manage SPRUCE data collection, achieve SPRUCE science objectives, and support broader climate change research, the research staff has designed a flexible data system using proven network technologies and software components. The primary SPRUCE data system components are: 1. Data acquisition and control system - set of hardware and software to retrieve biological and engineering data from sensors, collect sensor status information, and distribute feedback to control components. 2. Data collection system - set of hardware and software to deliver data to a central depository for storage and further processing. 3. Data management plan - set of plans, policies, and practices to control consistency, protect data integrity, and deliver data. This publication presents our approach to meeting the challenges of designing and constructing an efficient data system for managing high volume sources of in-situ observations in a remote, harsh environmental location. The approach covers data flow starting from the sensors and ending at the archival/distribution points, discusses types of hardware and software used, examines design considerations that were used to choose them, and describes the data management practices chosen to control and enhance the value of the data.
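
    To make the three-tier structure above concrete, here is a minimal, hypothetical sketch of the "data collection system" stage: a loop that polls sensor readings, timestamps them, and appends them to a local depository for later shipment upstream. The function names, readings, and file layout are assumptions for illustration and are not part of the SPRUCE implementation.

      # Hypothetical sensor-polling and local-archiving loop (not SPRUCE code).
      import csv
      import time
      from datetime import datetime, timezone

      def poll_sensors():
          # Placeholder: a real system would query dataloggers over the network.
          return {"air_temp_c": 21.4, "co2_ppm": 415.0}

      def collect(archive_path="sprucelike_archive.csv", interval_s=1, cycles=3):
          """Append timestamped readings to a local archive; a separate job ships batches upstream."""
          with open(archive_path, "a", newline="") as f:
              writer = csv.writer(f)
              for _ in range(cycles):
                  reading = poll_sensors()
                  writer.writerow([datetime.now(timezone.utc).isoformat()] + list(reading.values()))
                  f.flush()
                  time.sleep(interval_s)

      collect(cycles=2, interval_s=1)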

  3. A comprehensive data acquisition and management system for an ecosystem-scale peatland warming and elevated CO2 experiment

    NASA Astrophysics Data System (ADS)

    Krassovski, M. B.; Riggs, J. S.; Hook, L. A.; Nettles, W. R.; Hanson, P. J.; Boden, T. A.

    2015-11-01

    Ecosystem-scale manipulation experiments represent large science investments that require well-designed data acquisition and management systems to provide reliable, accurate information to project participants and third party users. The SPRUCE project (Spruce and Peatland Responses Under Climatic and Environmental Change, http://mnspruce.ornl.gov) is such an experiment funded by the Department of Energy's (DOE), Office of Science, Terrestrial Ecosystem Science (TES) Program. The SPRUCE experimental mission is to assess ecosystem-level biological responses of vulnerable, high carbon terrestrial ecosystems to a range of climate warming manipulations and an elevated CO2 atmosphere. SPRUCE provides a platform for testing mechanisms controlling the vulnerability of organisms, biogeochemical processes, and ecosystems to climatic change (e.g., thresholds for organism decline or mortality, limitations to regeneration, biogeochemical limitations to productivity, and the cycling and release of CO2 and CH4 to the atmosphere). The SPRUCE experiment will generate a wide range of continuous and discrete measurements. To successfully manage SPRUCE data collection, achieve SPRUCE science objectives, and support broader climate change research, the research staff has designed a flexible data system using proven network technologies and software components. The primary SPRUCE data system components are the following: 1. data acquisition and control system - set of hardware and software to retrieve biological and engineering data from sensors, collect sensor status information, and distribute feedback to control components; 2. data collection system - set of hardware and software to deliver data to a central depository for storage and further processing; 3. data management plan - set of plans, policies, and practices to control consistency, protect data integrity, and deliver data. This publication presents our approach to meeting the challenges of designing and constructing an efficient data system for managing high volume sources of in situ observations in a remote, harsh environmental location. The approach covers data flow starting from the sensors and ending at the archival/distribution points, discusses types of hardware and software used, examines design considerations that were used to choose them, and describes the data management practices chosen to control and enhance the value of the data.

  4. Design and development of an IBM/VM menu system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cazzola, D.J.

    1992-10-01

    This report describes a full screen menu system developed using IBM's Interactive System Productivity Facility (ISPF) and the REXX programming language. The software was developed for the 2800 IBM/VM Electrical Computer Aided Design (ECAD) system. The system was developed to deliver electronic drawing definitions to a corporate drawing release system. Although this report documents the status of the menu system when it was retired, the methodologies used and the requirements defined are very applicable to replacement systems.

  5. The right stuff ... meeting your customer needs.

    PubMed

    Rubin, P; Carrington, S

    1999-11-01

    Meeting (and exceeding) your customers' needs is a requirement for competing in the current business world. New tools and techniques must be employed to deal with the rapidly changing global environment. This article describes the success of a global supply chain integration project for a division of a large multinational corporation. A state-of-the-art ERP software package was implemented in conjunction with major process changes to improve the organization's ability to promise and deliver product to their customers.

  6. Virtual patients: practical advice for clinical authors using Labyrinth.

    PubMed

    Begg, Michael

    2010-09-01

    Labyrinth is a tool originally developed in the University of Edinburgh's Learning Technology Section for authoring and delivering branching case scenarios. The scenarios can incorporate game-informed elements such as scoring, randomising, avatars and counters. Labyrinth has grown more popular internationally since a version of the build was made available on the open source network Source Forge. This paper offers help and advice for clinical educators interested in creating cases. Labyrinth is increasingly recognised as a tool offering great potential for delivering cases that promote rich, situated learning opportunities for learners. There are, however, significant challenges to generating such cases, not least of which is the challenge for potential authors in approaching the process of constructing narrative-rich, context-sensitive cases in an unfamiliar authoring environment. This paper offers a brief overview of the principles informing Labyrinth cases (game-informed learning), and offers some practical advice to better prepare educators with little or no prior experience. Labyrinth has continued to grow and develop, from its roots as a research and development environment to one that is optimised for use by non-technical clinical educators. The process becomes increasingly iterative and better informed as the teaching community pushes the software further. The positive implication of providing practical advice and conceptual insight to new case authors is that it ideally leads to a broader base of users who will inform future iterations of the software. © Blackwell Publishing Ltd 2010.
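
    As a rough illustration of the kind of branching structure such a case encodes, the sketch below models a case as nodes with choices and a simple score counter. The node names, choices, and scoring are invented for illustration and do not reflect Labyrinth's internal data model.

      # Hypothetical branching-case structure (illustrative; not Labyrinth's data model).
      from dataclasses import dataclass, field

      @dataclass
      class Node:
          text: str
          options: dict = field(default_factory=dict)   # choice label -> (next node id, score delta)

      case = {
          "start": Node("A patient presents with chest pain.",
                        {"Order ECG": ("ecg", +2), "Send home": ("end_bad", -5)}),
          "ecg": Node("ECG shows ST elevation.",
                      {"Activate cath lab": ("end_good", +5)}),
          "end_good": Node("Timely treatment delivered."),
          "end_bad": Node("The case ends with a missed diagnosis."),
      }

      def play(case, node_id="start", choices=("Order ECG", "Activate cath lab")):
          """Walk the case following the given choices and accumulate the score."""
          score = 0
          for choice in choices:
              node = case[node_id]
              print(node.text)
              node_id, delta = node.options[choice]
              score += delta
          print(case[node_id].text, "| score:", score)

      play(case)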

  7. A Predictive Approach to Eliminating Errors in Software Code

    NASA Technical Reports Server (NTRS)

    2006-01-01

    NASA's Metrics Data Program Data Repository is a database that stores problem, product, and metrics data. The primary goal of this data repository is to provide project data to the software community. In doing so, the Metrics Data Program collects artifacts from a large NASA dataset, generates metrics on the artifacts, and then generates reports that are made available to the public at no cost. The data that are made available to general users have been sanitized and authorized for publication through the Metrics Data Program Web site by officials representing the projects from which the data originated. The data repository is operated by NASA's Independent Verification and Validation (IV&V) Facility, which is located in Fairmont, West Virginia, a high-tech hub for emerging innovation in the Mountain State. The IV&V Facility was founded in 1993, under the NASA Office of Safety and Mission Assurance, as a direct result of recommendations made by the National Research Council and the Report of the Presidential Commission on the Space Shuttle Challenger Accident. Today, under the direction of Goddard Space Flight Center, the IV&V Facility continues its mission to provide the highest achievable levels of safety and cost-effectiveness for mission-critical software. By extending its data to public users, the facility has helped improve the safety, reliability, and quality of complex software systems throughout private industry and other government agencies. Integrated Software Metrics, Inc., is one of the organizations that has benefited from studying the metrics data. As a result, the company has evolved into a leading developer of innovative software-error prediction tools that help organizations deliver better software, on time and on budget.
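
    As a loose illustration of metrics-based error prediction of the kind mentioned above, the sketch below flags modules whose size or complexity exceeds simple thresholds derived from past projects. The metric names, thresholds, and data are assumptions for illustration, not models actually built from the Metrics Data Program repository.

      # Illustrative threshold-based defect-risk screen (not an actual MDP model).
      def flag_risky_modules(modules, loc_limit=400, complexity_limit=15):
          """Return module names whose size or cyclomatic complexity exceeds the limits."""
          return [m["name"] for m in modules
                  if m["loc"] > loc_limit or m["cyclomatic"] > complexity_limit]

      history = [
          {"name": "guidance.c", "loc": 1250, "cyclomatic": 32},
          {"name": "telemetry.c", "loc": 310, "cyclomatic": 9},
      ]
      print(flag_risky_modules(history))   # ['guidance.c']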

  8. A Cloud-based, Open-Source, Command-and-Control Software Paradigm for Space Situational Awareness (SSA)

    NASA Astrophysics Data System (ADS)

    Melton, R.; Thomas, J.

    With the rapid growth in the number of space actors, there has been a marked increase in the complexity and diversity of software systems utilized to support SSA target tracking, indication, warning, and collision avoidance. Historically, most SSA software has been constructed with "closed" proprietary code, which limits interoperability, inhibits the code transparency that some SSA customers need to develop domain expertise, and prevents the rapid injection of innovative concepts into these systems. Open-source aerospace software, a rapidly emerging, alternative trend in code development, is based on open collaboration, which has the potential to bring greater transparency, interoperability, flexibility, and reduced development costs. Open-source software is easily adaptable, geared to rapidly changing mission needs, and can generally be delivered at lower costs to meet mission requirements. This paper outlines Ball's COSMOS C2 system, a fully open-source, web-enabled, command-and-control software architecture which provides several unique capabilities to move the current legacy SSA software paradigm to an open source model that effectively enables pre- and post-launch asset command and control. Among the unique characteristics of COSMOS is the ease with which it can integrate with diverse hardware. This characteristic enables COSMOS to serve as the command-and-control platform for the full life-cycle development of SSA assets, from board test, to box test, to system integration and test, to on-orbit operations. The use of a modern scripting language, Ruby, also permits automated procedures to provide highly complex decision making for the tasking of SSA assets based on both telemetry data and data received from outside sources. Detailed logging enables quick anomaly detection and resolution. Integrated real-time and offline data graphing renders the visualization of both ground and on-orbit assets simple and straightforward.
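
    To illustrate the kind of telemetry-driven automated tasking decision the abstract describes, here is a small sketch (written in Python rather than the Ruby scripting the system actually uses) that decides whether to schedule a tracking task from telemetry values and external alerts. The telemetry names, thresholds, and alert format are invented for illustration and are not the COSMOS API.

      # Hypothetical tasking rule driven by telemetry (illustrative; not the COSMOS API).
      def should_task_sensor(telemetry, external_alerts):
          """Task the SSA asset only if it is healthy and an external source reports a conjunction."""
          healthy = telemetry.get("battery_v", 0.0) > 26.0 and telemetry.get("mode") == "OPERATIONAL"
          return healthy and any(a["type"] == "conjunction" for a in external_alerts)

      telemetry = {"battery_v": 27.3, "mode": "OPERATIONAL"}
      alerts = [{"type": "conjunction", "object": "2023-042A"}]
      print(should_task_sensor(telemetry, alerts))   # True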

  9. Practical experience with test-driven development during commissioning of the multi-star AO system ARGOS

    NASA Astrophysics Data System (ADS)

    Kulas, M.; Borelli, Jose Luis; Gässler, Wolfgang; Peter, Diethard; Rabien, Sebastian; Orban de Xivry, Gilles; Busoni, Lorenzo; Bonaglia, Marco; Mazzoni, Tommaso; Rahmer, Gustavo

    2014-07-01

    Commissioning time for an instrument at an observatory is precious, especially the night time. Whenever astronomers come up with a software feature request or point out a software defect, the software engineers have the task to find a solution and implement it as fast as possible. In this project phase, the software engineers work under time pressure and stress to deliver functional instrument control software (ICS). The shortness of development time during commissioning is a constraint for software engineering teams and applies to the ARGOS project as well. The goal of the ARGOS (Advanced Rayleigh guided Ground layer adaptive Optics System) project is the upgrade of the Large Binocular Telescope (LBT) with an adaptive optics (AO) system consisting of six Rayleigh laser guide stars and wavefront sensors. For developing the ICS, we used the technique of Test-Driven Development (TDD), whose main rule demands that the programmer write test code before production code. Thereby, TDD can yield a software system that grows without defects and eases maintenance. Having applied TDD in a calm and relaxed environment like the office and laboratory, the ARGOS team had profited from the benefits of TDD. Before the commissioning, we were worried that the time pressure in that tough project phase would force us to drop TDD because we would spend more time writing test code than it would be worth. Despite this concern, we were able to keep TDD most of the time in this project phase as well. This report describes the practical application and performance of TDD, including its benefits, limitations and problems, during the ARGOS commissioning. Furthermore, it covers our experience with pair programming and continuous integration at the telescope.
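
    As a minimal illustration of the test-first rule described above (write the test before the production code), the sketch below shows a unit test for a hypothetical intensity-centroid helper together with the implementation that makes it pass. The function and its behaviour are invented for illustration and are not part of the ARGOS ICS.

      # Test-first sketch: the test is written before the code it exercises (illustrative only).
      import unittest

      def centroid(pixels):
          """Return the intensity-weighted centroid of (position, intensity) samples."""
          total = sum(i for _, i in pixels)
          return sum(x * i for x, i in pixels) / total

      class CentroidTest(unittest.TestCase):
          def test_symmetric_spot_centers_at_one(self):
              self.assertAlmostEqual(centroid([(0, 1.0), (1, 2.0), (2, 1.0)]), 1.0)

      if __name__ == "__main__":
          unittest.main()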

  10. The TRIDEC Project: Future-Saving FOSS GIS Applications for Tsunami Early Warning

    NASA Astrophysics Data System (ADS)

    Loewe, P.; Wächter, J.; Hammitzsch, M.

    2011-12-01

    The Boxing Day Tsunami of 2004 killed over 240,000 people in 14 countries and inundated the affected shorelines with waves reaching heights of up to 30 m. This natural disaster coincided with an information catastrophe, as potentially life-saving early warning information existed, yet no means were available to deliver it to the communities under imminent threat. Tsunami early warning capabilities have improved in the meantime through continuing development of modular Tsunami Early Warning Systems (TEWS). However, recent tsunami events, like the Chile 2010 and the Tohoku 2011 tsunami, demonstrate that the key challenge for ongoing TEWS research on the supranational scale still lies in the timely issuing of reliable early warning messages. Since 2004, the GFZ German Research Centre for Geosciences has built up expertise in the field of TEWS. Within GFZ, the Centre for GeoInformation Technology (CEGIT) has focused its work on the geoinformatics aspects of TEWS in two projects already: the German Indonesian Tsunami Early Warning System (GITEWS), funded by the German Federal Ministry of Education and Research (BMBF), and the Distant Early Warning System (DEWS), a European project funded under the sixth Framework Programme (FP6). These developments are continued in the TRIDEC project (Collaborative, Complex, and Critical Decision Processes in Evolving Crises), funded under the European Union's seventh Framework Programme (FP7). This ongoing project focuses on real-time intelligent information management in Earth management and its long-term application. All TRIDEC developments are based on Free and Open Source Software (FOSS) components and industry standards wherever possible. Tsunami early warning in TRIDEC is also based on mature system architecture models to ensure long-term usability and the flexibility to adapt to future generations of tsunami sensors. All open-source software produced by the project consortium is to be published on FOSSLAB, a publicly available software repository provided by CEGIT. FOSSLAB serves as a platform for the development of FOSS projects in a geospatial context, allowing results achieved in previous and ongoing project activities to be saved, advanced and reused, and enabling further development and collaboration with a wide community including scientists, developers, users and stakeholders. FOSSLAB's potential to preserve and advance existing best practices for reuse in new scenarios is documented by a first case study: for TEWS education and public outreach, a comprehensive approach to generating high-resolution globe maps was compiled using GRASS GIS and the POV-Ray rendering software. The task resulted in the merging of isolated technical know-how, previously maintained in disparate GIS and rendering communities, into publicly available best practices. Beyond the scope of TRIDEC, FOSSLAB constitutes an umbrella encompassing several geoinformatics-related activities, such as the documentation of best practices for experiences and results while working with Spatial Data Infrastructures (SDI), Geographic Information Systems (GIS), geomatics, and future spatial processing on computation clusters and in cloud computing.

  11. Predictors and Correlates of Abortion in the Fragile Families and Well-Being Study: Paternal Behavior, Substance Use, and Partner Violence

    ERIC Educational Resources Information Center

    Coleman, Priscilla K.; Maxey, Charles David; Spence, Maria; Nixon, Charisse L.

    2009-01-01

    This study was designed to identify predictors of the choice to abort or deliver a child within 18 months of a previous birth and to compare mothers who chose to abort or deliver relative to substance use and adverse partner behavior. Using a systems perspective, data from the Fragile Families and Well-Being Study were examined. The sample…

  12. Automated support for experience-based software management

    NASA Technical Reports Server (NTRS)

    Valett, Jon D.

    1992-01-01

    To effectively manage a software development project, the software manager must have access to key information concerning a project's status. This information includes not only data relating to the project of interest, but also the experience of past development efforts within the environment. This paper describes the concepts and functionality of a software management tool designed to provide this information. This tool, called the Software Management Environment (SME), enables the software manager to compare an ongoing development effort with previous efforts and with models of the 'typical' project within the environment, to predict future project status, to analyze a project's strengths and weaknesses, and to assess the project's quality. In order to provide these functions, the tool utilizes a vast corporate memory that includes a data base of software metrics, a set of models and relationships that describe the software development environment, and a set of rules that capture other knowledge and experience of software managers within the environment. Integrating these major concepts into one software management tool, the SME is a model of the type of management tool needed for all software development organizations.
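
    The comparison of an ongoing effort against past projects can be illustrated with a small sketch that normalizes a current metric against the historical mean and standard deviation. The metric names and data are invented and are not drawn from the SME's actual corporate memory.

      # Illustrative baseline comparison (not the SME implementation).
      from statistics import mean, stdev

      def deviation_from_history(current_value, history):
          """Return how many standard deviations the current value sits from the historical mean."""
          return (current_value - mean(history)) / stdev(history)

      past_error_rates = [4.2, 5.1, 3.8, 4.6, 5.0]        # errors per KLOC on past projects (invented)
      print(round(deviation_from_history(6.4, past_error_rates), 2))  # unusually high -> investigate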

  13. Top 10 metrics for life science software good practices.

    PubMed

    Artaza, Haydee; Chue Hong, Neil; Corpas, Manuel; Corpuz, Angel; Hooft, Rob; Jimenez, Rafael C; Leskošek, Brane; Olivier, Brett G; Stourac, Jan; Svobodová Vařeková, Radka; Van Parys, Thomas; Vaughan, Daniel

    2016-01-01

    Metrics for assessing adoption of good development practices are a useful way to ensure that software is sustainable, reusable and functional. Sustainability means that the software used today will be available - and continue to be improved and supported - in the future. We report here an initial set of metrics that measure good practices in software development. This initiative differs from previously developed efforts in being a community-driven grassroots approach where experts from different organisations propose good software practices that have reasonable potential to be adopted by the communities they represent. We not only focus our efforts on understanding and prioritising good practices but also assess their feasibility for implementation, and we publish them here.

  14. Top 10 metrics for life science software good practices

    PubMed Central

    2016-01-01

    Metrics for assessing adoption of good development practices are a useful way to ensure that software is sustainable, reusable and functional. Sustainability means that the software used today will be available - and continue to be improved and supported - in the future. We report here an initial set of metrics that measure good practices in software development. This initiative differs from previously developed efforts in being a community-driven grassroots approach where experts from different organisations propose good software practices that have reasonable potential to be adopted by the communities they represent. We not only focus our efforts on understanding and prioritising good practices but also assess their feasibility for implementation, and we publish them here. PMID:27635232

  15. Screen Miniatures as Icons for Backward Navigation in Content-Based Software.

    ERIC Educational Resources Information Center

    Boling, Elizabeth; Ma, Guoping; Tao, Chia-Wen; Askun, Cengiz; Green, Tim; Frick, Theodore; Schaumburg, Heike

    Users of content-based software programs, including hypertexts and instructional multimedia, rely on the navigation functions provided by the designers of those programs. Typical navigation schemes use abstract symbols (arrows) to label basic navigational functions like moving forward or backward through screen displays. In a previous study, the…

  16. COMPILATION OF SATURATED AND UNSATURATED ZONE MODELING SOFTWARE

    EPA Science Inventory

    The full report provides readers an overview of available ground-water modeling programs and related software. It is an update of EPA/600/R-93/118 and EPA/600/R-94/028, two previous reports from the same program at the International Ground Water Modeling Center (IGWMC) in Colora...

  17. Improving Flight Software Module Validation Efforts : a Modular, Extendable Testbed Software Framework

    NASA Technical Reports Server (NTRS)

    Lange, R. Connor

    2012-01-01

    Ever since Explorer-1, the United States' first Earth satellite, was developed and launched in 1958, JPL has developed many more spacecraft, including landers and orbiters. While these spacecraft vary greatly in their missions, capabilities, and destinations, they all have something in common. All of the components of these spacecraft had to be comprehensively tested. While thorough testing is important to mitigate risk, it is also a very expensive and time consuming process. Thankfully, since virtually all of the software testing procedures for SMAP are computer controlled, these procedures can be automated. Most people testing SMAP flight software (FSW) would only need to write tests that exercise specific requirements and then check the filtered results to verify everything occurred as planned. This gives developers the ability to automatically launch tests on the testbed, distill the resulting logs into only the important information, generate validation documentation, and then deliver the documentation to management. With many of the steps in FSW testing automated, developers can use their limited time more effectively and can validate SMAP FSW modules quicker and test them more rigorously. As a result of the various benefits of automating much of the testing process, management is considering the use of these automated tools in future FSW validation efforts.
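
    A rough sketch of the kind of automation described above (launch a testbed procedure, then distill the log down to the lines that matter) is shown below. The command name, log format, and keywords are assumptions for illustration, not the actual SMAP test tooling.

      # Hypothetical test-launch and log-filtering sketch (not the SMAP test framework).
      import subprocess

      def run_test_and_filter(test_cmd, keywords=("ERROR", "FAIL", "REQUIREMENT")):
          """Run a testbed procedure and keep only the log lines containing the given keywords."""
          result = subprocess.run(test_cmd, capture_output=True, text=True)
          important = [line for line in result.stdout.splitlines()
                       if any(k in line for k in keywords)]
          return result.returncode, important

      # Example with a hypothetical procedure name:
      # rc, lines = run_test_and_filter(["./run_fsw_test.sh", "mode_switch"])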

  18. MPST Software: MoonKommand

    NASA Technical Reports Server (NTRS)

    Kwok, John H.; Call, Jared A.; Khanampornpan, Teerapat

    2012-01-01

    This software automatically processes Sally Ride Science (SRS) delivered MoonKAM camera control files (ccf) into uplink products for the GRAIL-A and GRAIL-B spacecraft as part of an education and public outreach (EPO) extension to the GRAIL mission. Once properly validated and deemed safe for execution onboard the spacecraft, MoonKommand generates the command products via the Automated Sequence Processor (ASP) and generates uplink (.scmf) files for radiation to the GRAIL-A and/or GRAIL-B spacecraft. Any errors detected along the way are reported back to SRS via email. With MoonKommand, SRS can control their EPO instrument as part of a fully automated process. Inputs are received from SRS as either image capture files (.ccficd) for new image requests, or downlink/delete files (.ccfdl) for requesting image downlink from the instrument and on-board memory management. The MoonKommand outputs are command and file-load (.scmf) files that will be uplinked by the Deep Space Network (DSN). Without MoonKommand software, uplink product generation for the MoonKAM instrument would be a manual process. The software is specific to the MoonKAM instrument on the GRAIL mission. At the time of this writing, the GRAIL mission was making final preparations to begin the science phase, which was scheduled to continue until June 2012.

  19. Assessment of nursing care using indicators generated by software.

    PubMed

    Lima, Ana Paula Souza; Chianca, Tânia Couto Machado; Tannure, Meire Chucre

    2015-01-01

    The aim was to analyze the efficacy of the Nursing Process in an Intensive Care Unit using indicators generated by software. This was a cross-sectional study using data collected over four months. RNs and students daily registered patients, took histories (at admission), performed physical assessments, established nursing diagnoses and nursing plans/prescriptions, and assessed care delivered to 17 patients using the software. Indicators concerning the incidence and prevalence of nursing diagnoses, rate of effectiveness, risk diagnoses, and rate of effective prevention of complications were computed. The Risk for imbalanced body temperature was the most frequent diagnosis (23.53%), while the least frequent was Risk for constipation (0%). The Risk for impaired skin integrity was prevalent in 100% of the patients, while Risk for acute confusion was the least prevalent (11.76%). Risk for constipation and Risk for impaired skin integrity obtained a rate of risk diagnostic effectiveness of 100%. The rate of effective prevention of acute confusion and falls was 100%. The efficacy of the Nursing Process was thus analyzed through these indicators, because they reveal how nurses have identified patients' risks and conditions and planned care in a systematized manner.

  20. Design and Development of a Run-Time Monitor for Multi-Core Architectures in Cloud Computing

    PubMed Central

    Kang, Mikyung; Kang, Dong-In; Crago, Stephen P.; Park, Gyung-Leen; Lee, Junghoon

    2011-01-01

    Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications as services over the Internet as well as infrastructure. A cloud is a type of parallel and distributed system consisting of a collection of inter-connected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. The large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design and develop a Run-Time Monitor (RTM), which is system software that monitors application behavior at run-time, analyzes the collected information, and optimizes cloud computing resources for multi-core architectures. RTM monitors application software through library instrumentation, as well as the underlying hardware through a performance counter, optimizing its computing configuration based on the analyzed data. PMID:22163811
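
    A highly simplified sketch of run-time monitoring and adaptation in this spirit is shown below: a wrapper records call latency (standing in for library instrumentation) and a trivial policy adjusts a worker count. The thresholds and policy are invented for illustration and bear no relation to the actual RTM.

      # Illustrative run-time monitor and adaptation policy (not the RTM described in the paper).
      import time
      from functools import wraps

      latencies = []

      def monitored(fn):
          """Record wall-clock latency of each call, standing in for library instrumentation."""
          @wraps(fn)
          def wrapper(*args, **kwargs):
              start = time.perf_counter()
              try:
                  return fn(*args, **kwargs)
              finally:
                  latencies.append(time.perf_counter() - start)
          return wrapper

      def recommend_workers(current, target_latency_s=0.05):
          """Naive policy: add a worker when recent latency exceeds the target."""
          recent = latencies[-5:]
          average = sum(recent) / max(len(recent), 1)
          return current + 1 if average > target_latency_s else current

      @monitored
      def service_request():
          time.sleep(0.08)      # simulated work

      for _ in range(3):
          service_request()
      print(recommend_workers(current=4))   # likely 5 under the simulated load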

  1. Section 3: Quality and Value-Based Requirements

    NASA Astrophysics Data System (ADS)

    Mylopoulos, John

    Traditionally, research and practice in software engineering have focused their attention on specific software qualities, such as functionality and performance. According to this perspective, a system is deemed to be of good quality if it delivers all required functionality (“fitness-for-purpose”) and its performance is above required thresholds. Increasingly, primarily in research but also in practice, other qualities are attracting attention. To facilitate evolution, maintainability and adaptability are gaining popularity. Usability, universal accessibility, innovativeness, and enjoyability are being studied as novel types of non-functional requirements that we do not know how to define, let alone accommodate, but which we realize are critical under some contingencies. The growing importance of the business context in the design of software-intensive systems has also thrust economic value, legal compliance, and potential social and ethical implications into the forefront of requirements topics. A focus on the broader user environment and experience, as well as the organizational and societal implications of system use, thus has become more central to the requirements discourse. This section includes three contributions to this broad and increasingly important topic.

  2. Design and development of a run-time monitor for multi-core architectures in cloud computing.

    PubMed

    Kang, Mikyung; Kang, Dong-In; Crago, Stephen P; Park, Gyung-Leen; Lee, Junghoon

    2011-01-01

    Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications as services over the Internet as well as infrastructure. A cloud is a type of parallel and distributed system consisting of a collection of inter-connected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. The large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design and develop a Run-Time Monitor (RTM), which is system software that monitors application behavior at run-time, analyzes the collected information, and optimizes cloud computing resources for multi-core architectures. RTM monitors application software through library instrumentation, as well as the underlying hardware through a performance counter, optimizing its computing configuration based on the analyzed data.

  3. plas.io: Open Source, Browser-based WebGL Point Cloud Visualization

    NASA Astrophysics Data System (ADS)

    Butler, H.; Finnegan, D. C.; Gadomski, P. J.; Verma, U. K.

    2014-12-01

    Point cloud data, in the form of Light Detection and Ranging (LiDAR), RADAR, or semi-global matching (SGM) image processing, are rapidly becoming a foundational data type to quantify and characterize geospatial processes. Visualization of these data, due to overall volume and irregular arrangement, is often difficult. Technological advancement in web browsers, in the form of WebGL and HTML5, has made interactivity and visualization capabilities that once existed only in desktop software ubiquitously available. plas.io is an open source JavaScript application that provides point cloud visualization, exploitation, and compression features in a web-browser platform, reducing the reliance on client-based desktop applications. The wide reach of WebGL and browser-based technologies means plas.io's capabilities can be delivered to a diverse list of devices -- from phones and tablets to high-end workstations -- with very little custom software development. These properties make plas.io an ideal open platform for researchers and software developers to communicate visualizations of complex and rich point cloud data to devices to which everyone has easy access.

  4. The MERMAID project

    NASA Technical Reports Server (NTRS)

    Cowderoy, A. J. C.; Jenkins, John O.; Poulymenakou, A

    1992-01-01

    The tendency for software development projects to be completed over schedule and over budget has been documented extensively. Additionally, many projects are completed within budgetary and schedule targets only as a result of the customer agreeing to accept reduced functionality. In his classic book, The Mythical Man-Month, Fred Brooks exposes the fallacy that effort and schedule are freely interchangeable. All current cost models are produced on the assumption that there is very limited scope for schedule compression unless there is a corresponding reduction in delivered functionality. The Metrication and Resources Modeling Aid (MERMAID) project, partially financed by the Commission of the European Communities (CEC) as Project 2046, began in Oct. 1988, and its goals were as follows: (1) to improve understanding of the relationships between software development productivity and product and process metrics; (2) to facilitate widespread technology transfer from the Consortium to the European software industry; and (3) to facilitate the widespread uptake of cost estimation techniques by the provision of prototype cost estimation tools. MERMAID developed a family of methods for cost estimation, many of which have had tools implemented in prototypes. These prototypes are best considered as toolkits or workbenches.

  5. Evolution of Software-Only-Simulation at NASA IV and V

    NASA Technical Reports Server (NTRS)

    McCarty, Justin; Morris, Justin; Zemerick, Scott

    2014-01-01

    Software-Only-Simulations have been an emerging but quickly developing field of study throughout NASA. The NASA Independent Verification and Validation (IV&V) Independent Test Capability (ITC) team has been rapidly building a collection of simulators for a wide range of NASA missions. ITC specializes in full end-to-end simulations that enable developers, V&V personnel, and operators to test-as-you-fly. In four years, the team has delivered a wide variety of spacecraft simulations that have ranged from lower-complexity science missions, such as the Global Precipitation Measurement (GPM) satellite and the Deep Space Climate Observatory (DSCOVR), to extremely complex missions such as the James Webb Space Telescope (JWST) and the Space Launch System (SLS). This paper describes the evolution of ITC's technologies and processes that have been utilized to design, implement, and deploy end-to-end simulation environments for various NASA missions. A comparison of mission simulators is discussed, with a focus on technology and lessons learned in complexity, hardware modeling, and continuous integration. The paper also describes the methods for executing the missions' unmodified flight software binaries (not cross-compiled) for verification and validation activities.

  6. Open systems storage platforms

    NASA Technical Reports Server (NTRS)

    Collins, Kirby

    1992-01-01

    The building blocks for an open storage system include a system platform, a selection of storage devices and interfaces, system software, and storage applications. CONVEX storage systems are based on the DS Series Data Server systems. These systems are a variant of the C3200 supercomputer with expanded I/O capabilities. These systems support a variety of medium- and high-speed interfaces to networks and peripherals. System software is provided in the form of ConvexOS, a POSIX-compliant derivative of 4.3BSD UNIX. Storage applications include products such as UNITREE and EMASS. With the DS Series of storage systems, Convex has developed a set of products which provide open system solutions for storage management applications. The systems are highly modular, assembled from off-the-shelf components with industry standard interfaces. The C Series system architecture provides a stable base, with the performance and reliability of a general purpose platform. This combination of a proven system architecture with a variety of choices in peripherals and application software allows wide flexibility in configurations, and delivers the benefits of open systems to the mass storage world.

  7. Department of Energy's Virtual Lab Infrastructure for Integrated Earth System Science Data

    NASA Astrophysics Data System (ADS)

    Williams, D. N.; Palanisamy, G.; Shipman, G.; Boden, T.; Voyles, J.

    2014-12-01

    The U.S. Department of Energy (DOE) Office of Biological and Environmental Research (BER) Climate and Environmental Sciences Division (CESD) produces a diversity of data, information, software, and model codes across its research and informatics programs and facilities. This information includes raw and reduced observational and instrumentation data, model codes, model-generated results, and integrated data products. Currently, most of these data and information are prepared and shared for program-specific activities, corresponding to CESD organization research. A major challenge facing BER CESD is how best to inventory, integrate, and deliver these vast and diverse resources for the purpose of accelerating Earth system science research. This talk provides a concept for a CESD Integrated Data Ecosystem and an initial roadmap for its implementation to address this integration challenge in the "Big Data" domain. Towards this end, a new BER Virtual Laboratory Infrastructure will be presented, which will include services and software connecting the heterogeneous CESD data holdings and will be constructed with open-source software based on industry standards, protocols, and state-of-the-art technology.

  8. A simplified, low power system for effective vessel sealing

    NASA Astrophysics Data System (ADS)

    Lyle, Allison B.; Kennedy, Jenifer S.; Schmaltz, Dale F.; Kennedy, Aaron S.

    2015-03-01

    The first bipolar vessel sealing system was developed nearly 15 years ago and has since become standard of care in surgery. These systems make use of radio frequency current that is delivered between bipolar graspers to permanently seal arteries, veins and tissue bundles. Conventional vessel sealing generators are based on traditional electrosurgery generator architecture and deliver high power (150-300 Watts) and high current, using complex control and sense algorithms to adjust the output for vessel sealing applications. In recent years, a need for small-scale surgical vessel sealers has developed as surgeons strive to further reduce their footprint on patients. There are many technical challenges associated with miniaturization of vessel sealing devices, including maintaining electrical isolation while delivering high current in a saline environment. Research into creating a small, 3mm diameter vessel sealer revealed that a highly simplified generator system could be used to achieve excellent results, and subsequently a low power vessel sealing system was developed. This system delivers 25 Watts constant power while limiting voltage (<= Vrms) and current (<= Amps) until an impedance endpoint is achieved, eliminating the use of complicated control and sensing software. The result is optimized tissue effect, where high seal strength is maintained (> 360mmHg) but seal times (1.7 +/- 0.7s versus 4.1 +/- 0.7s), thermal spread (<1mm vs <=2mm) and total energy delivery are reduced when compared to an existing high power system.
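
    The control scheme described above (constant power with voltage and current ceilings, terminating at an impedance endpoint) can be sketched as follows. The limits, endpoint, time step, and tissue model are illustrative assumptions, not the device's actual parameters.

      # Illustrative constant-power vessel-sealing loop (assumed limits; not the device firmware).
      def seal_cycle(read_impedance, power_w=25.0, v_limit=100.0, i_limit=1.5,
                     z_endpoint=500.0, dt=0.05, max_time_s=5.0):
          """Hold constant power within V/I limits until tissue impedance reaches the endpoint."""
          t = 0.0
          while t < max_time_s:
              z = read_impedance()
              if z >= z_endpoint:
                  return True, t                       # seal complete
              v = min((power_w * z) ** 0.5, v_limit)   # V = sqrt(P*Z), clamped
              i = min(v / z, i_limit)                  # I = V/Z, clamped
              # the drive electronics would apply v, i to the electrodes for dt seconds here
              t += dt
          return False, t                              # timed out before reaching the endpoint

      # Toy tissue model: impedance rises as the vessel desiccates.
      z = [50.0]
      def read_impedance():
          z[0] *= 1.15
          return z[0]

      print(seal_cycle(read_impedance))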

  9. The QCDOC Project

    NASA Astrophysics Data System (ADS)

    Boyle, P.; Chen, D.; Christ, N.; Clark, M.; Cohen, S.; Cristian, C.; Dong, Z.; Gara, A.; Joo, B.; Jung, C.; Kim, C.; Levkova, L.; Liao, X.; Liu, G.; Li, S.; Lin, H.; Mawhinney, R.; Ohta, S.; Petrov, K.; Wettig, T.; Yamaguchi, A.

    2005-03-01

    The QCDOC project has developed a supercomputer optimised for the needs of Lattice QCD simulations. It provides a very competitive price to sustained performance ratio of around $1 per sustained Megaflop/s in combination with outstanding scalability. Thus very large systems delivering over 5 TFlop/s of performance on the evolution of a single lattice are possible. Large prototypes have been built and are functioning correctly. The software environment raises the state of the art in such custom supercomputers. It is based on a lean custom node operating system that eliminates many unnecessary overheads that plague other systems. Despite the custom nature, the operating system implements a standards-compliant UNIX-like programming environment, easing the porting of software from other systems. The SciDAC QMP interface adds internode communication in a fashion that provides a uniform cross-platform programming environment.

  10. Real-time acquisition and tracking system with multiple Kalman filters

    NASA Astrophysics Data System (ADS)

    Beard, Gary C.; McCarter, Timothy G.; Spodeck, Walter; Fletcher, James E.

    1994-07-01

    The design of a real-time, ground-based, infrared tracking system with proven field success in tracking boost vehicles through burnout is presented with emphasis on the software design. The system was originally developed to deliver relative angular positions during boost, and thrust termination time to a sensor fusion station in real-time. Autonomous target acquisition and angle-only tracking features were developed to ensure success under stressing conditions. A unique feature of the system is the incorporation of multiple copies of a Kalman filter tracking algorithm running in parallel in order to minimize run-time. The system is capable of updating the state vector for an object at measurement rates approaching 90 Hz. This paper will address the top-level software design, details of the algorithms employed, system performance history in the field, and possible future upgrades.
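
    As a generic illustration of the Kalman-filter tracking update at the heart of such a system, the sketch below implements one predict-and-update step of a one-dimensional constant-velocity filter. The state layout, noise values, and measurement rate are illustrative assumptions, unrelated to the fielded tracker.

      # Generic 1-D constant-velocity Kalman filter step (illustrative; not the fielded tracker).
      import numpy as np

      def kalman_step(x, P, z, dt=1.0 / 90.0, q=1e-4, r=1e-3):
          """Predict-then-update for state [angle, angular_rate] with a measurement z of the angle."""
          F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition
          H = np.array([[1.0, 0.0]])                 # we measure the angle only
          Q = q * np.eye(2)
          # Predict
          x = F @ x
          P = F @ P @ F.T + Q
          # Update
          S = H @ P @ H.T + r
          K = P @ H.T / S
          x = x + (K * (z - H @ x)).flatten()
          P = (np.eye(2) - K @ H) @ P
          return x, P

      x, P = np.zeros(2), np.eye(2)
      for z in [0.010, 0.012, 0.013]:                # simulated angle measurements (radians)
          x, P = kalman_step(x, P, z)
      print(x)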

  11. Developing a Hydrologic Assessment Tool for Designing Bioretention in a watershed

    NASA Astrophysics Data System (ADS)

    Baek, Sangsoo; Ligaray, Mayzonee; Park, Jeong-Pyo; Kwon, Yongsung; Cho, Kyung Hwa

    2017-04-01

    Continuous urbanization has negatively impacted the ecological and hydrological environments at the global, regional, and local scales. This issue has been addressed by developing Low Impact Development (LID) practices to deliver better hydrologic function and improve environmental, economic, social and cultural outcomes. This study developed modeling software to simulate and optimize bioretention cells, one type of LID practice, in a given watershed. The model calculates the detailed soil infiltration process in a bioretention cell under given hydrological conditions and hydraulic facilities (e.g. riser and underdrain) and also generates an optimized plan using a Flow Duration Curve (FDC). The optimization result from the simulation demonstrated that the location and size of the bioretention cell, as well as the soil texture, are important elements for an efficient bioretention design. We hope that the software developed in this study will be useful for establishing an appropriate scheme of LID installation.
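
    The Flow Duration Curve used in the optimization can be illustrated with a short sketch that ranks simulated flows and computes exceedance probabilities. The flow series and plotting-position formula here are invented for illustration and are not part of the described tool.

      # Illustrative flow-duration-curve computation (not the authors' software).
      import numpy as np

      def flow_duration_curve(flows):
          """Return (exceedance probability in %, flows sorted in descending order)."""
          q = np.sort(np.asarray(flows, dtype=float))[::-1]
          ranks = np.arange(1, len(q) + 1)
          exceedance = 100.0 * ranks / (len(q) + 1)    # Weibull plotting position
          return exceedance, q

      pre_development = [0.8, 1.2, 0.5, 2.4, 0.9, 3.1, 0.7]    # m^3/s, invented values
      p, q = flow_duration_curve(pre_development)
      print(np.interp(50.0, p, q))                              # flow exceeded 50% of the time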

  12. VoIP for Telerehabilitation: A Pilot Usability Study for HIPAA Compliance

    PubMed Central

    Watzlaf, Valerie R.; Ondich, Briana

    2012-01-01

    Consumer-based, free Voice and video over the Internet Protocol (VoIP) software systems such as Skype and others are used by health care providers to deliver telerehabilitation and other health-related services to clients. Privacy and security applications as well as HIPAA compliance within these protocols have been questioned by practitioners, health information managers, and other healthcare entities. This pilot usability study examined whether four respondents who used the top three, free consumer-based, VoIP software systems perceived these VoIP technologies to be private, secure, and HIPAA compliant; most did not. While the pilot study limitations include the number of respondents and systems assessed, the protocol can be applied to future research and replicated for instructional purposes. Recommendations are provided for VoIP companies, providers, and clients/consumers. PMID:25945194

  13. Situation Awareness and Levels of Automation

    NASA Technical Reports Server (NTRS)

    Kaber, David B.

    1999-01-01

    During the first year of this project, a taxonomy of theoretical levels of automation (LOAs) was applied to the advanced commercial aircraft by categorizing actual modes of McDonnell Douglas MD-11 autoflight system operation in terms of the taxonomy. As well, high LOAs included in the taxonomy (e.g., supervisory control) were modeled in the context of MD-11 autoflight systems through development of a virtual flight simulator. The flight simulator was an integration of a re-configurable simulator developed by the Georgia Institute of Technology and new software prototypes of autoflight system modules found in the MD-11 cockpit. In addition to this work, a version of the Situation Awareness Global Assessment Technique (SAGAT) was developed for application to commercial piloting tasks. A software package was developed to deliver the SAGAT and was integrated with the virtual flight simulator.

  14. Ultraviolet spectrometer and polarimeter (UVSP) software development and hardware tests for the solar maximum mission

    NASA Technical Reports Server (NTRS)

    Bruner, M. E.; Haisch, B. M.

    1986-01-01

    The Ultraviolet Spectrometer/Polarimeter Instrument (UVSP) for the Solar Maximum Mission (SMM) was based on the re-use of the engineering model of the high resolution ultraviolet spectrometer developed for the OSO-8 mission. Lockheed assumed four distinct responsibilities in the UVSP program: technical evaluation of the OSO-8 engineering model; technical consulting on the electronic, optical, and mechanical modifications to the OSO-8 engineering model hardware; design and development of the UVSP software system; and scientific participation in the operations and analysis phase of the mission. Lockheed also provided technical consulting and assistance with instrument hardware performance anomalies encountered during the post-launch operation of the SMM observatory. An index to the quarterly reports delivered under the contract is included and serves as a useful capsule history of the program activity.

  15. Runtime Verification of Pacemaker Functionality Using Hierarchical Fuzzy Colored Petri-nets.

    PubMed

    Majma, Negar; Babamir, Seyed Morteza; Monadjemi, Amirhassan

    2017-02-01

    Today, implanted medical devices are increasingly used for many patients and for diverse health problems. However, several runtime problems and errors are reported by the relevant organizations, some even resulting in patient death. One of those devices is the pacemaker. The pacemaker is a device that helps the patient regulate the heartbeat by connecting to the cardiac vessels. This device is directed by its software, so any failure in this software causes a serious malfunction. Therefore, this study aims at a better way to monitor the device's software behavior in order to decrease the risk of failure. Accordingly, we supervise the runtime function and status of the software. Software verification here means examining, while the software runs, whether the limitations and needs of the system's users are satisfied. In this paper, a method to verify the pacemaker software, based on the fuzzy function of the device, is presented. The functional limitations of the device are identified and expressed as fuzzy rules, and the device is then verified against a hierarchical Fuzzy Colored Petri-net (FCPN) formed from these software limits. Building on our previous studies, which used 1) Fuzzy Petri-nets (FPN) to verify insulin pumps, 2) Colored Petri-nets (CPN) to verify the pacemaker, and 3) a software agent with Petri-net-based knowledge to verify the pacemaker, this paper examines the runtime behavior of the pacemaker software with a hierarchical FCPN (HFCPN). This is a further development compared to the earlier work. The HFCPN used here reduces complexity compared to the FPN and CPN used in our previous studies. By presenting the Petri-net (PN) in a hierarchical form, the verification runtime decreased by 90.61% compared to the verification runtime in the earlier work. Since the runtime verification needs an inference engine, we used the HFCPN to enhance the performance of the inference engine.
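
    A toy illustration of encoding a device limit as a fuzzy rule and checking it at run time is sketched below. The variable, membership function, and alarm threshold are invented and are far simpler than the hierarchical Fuzzy Colored Petri-net described in the paper.

      # Toy fuzzy run-time check (illustrative; much simpler than the paper's HFCPN).
      def too_fast_degree(rate_bpm, low=100.0, high=140.0):
          """Fuzzy membership in 'pacing rate too fast': 0 below low, 1 above high, linear between."""
          if rate_bpm <= low:
              return 0.0
          if rate_bpm >= high:
              return 1.0
          return (rate_bpm - low) / (high - low)

      def verify_sample(rate_bpm, alarm_threshold=0.7):
          """Raise an alarm when the fuzzy degree of violation exceeds the threshold."""
          degree = too_fast_degree(rate_bpm)
          return ("ALARM" if degree >= alarm_threshold else "OK", round(degree, 2))

      for rate in (72, 118, 135):
          print(rate, verify_sample(rate))   # OK, OK, ALARM for these sample rates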

  16. A new approach for instrument software at Gemini

    NASA Astrophysics Data System (ADS)

    Gillies, Kim; Nunez, Arturo; Dunn, Jennifer

    2008-07-01

    Gemini Observatory is now developing its next generation of astronomical instruments, the Aspen instruments. These new instruments are sophisticated and costly requiring large distributed, collaborative teams. Instrument software groups often include experienced team members with existing mature code. Gemini has taken its experience from the previous generation of instruments and current hardware and software technology to create an approach for developing instrument software that takes advantage of the strengths of our instrument builders and our own operations needs. This paper describes this new software approach that couples a lightweight infrastructure and software library with aspects of modern agile software development. The Gemini Planet Imager instrument project, which is currently approaching its critical design review, is used to demonstrate aspects of this approach. New facilities under development will face similar issues in the future, and the approach presented here can be applied to other projects.

  17. Evaluation of a new syringe presentation of reduced-antigen content diphtheria, tetanus, and acellular pertussis vaccine in healthy adolescents - A single blind randomized trial

    PubMed Central

    Pavia-Ruz, Noris; Abarca, Katia; Lepetic, Alejandro; Cervantes-Apolinar, Maria Yolanda; Hardt, Karin; Jayadeva, Girish; Kuriyakose, Sherine; Han, Htay Htay; de la O, Manuel

    2015-01-01

    Reduced-antigen-content diphtheria-tetanus-acellular pertussis (dTpa) vaccine, Boostrix™, is indicated for booster vaccination of children, adolescents and adults. The original prefilled disposable dTpa syringe presentation was recently replaced by another prefilled-syringe presentation with latex-free tip-caps and plunger-stoppers. 671 healthy adolescents aged 10–15 years, who had previously received 5 or 6 DT(P)/dT(pa) vaccine doses, were randomized (1:1) to receive the dTpa booster injected using the new (dTpa-new) or previous (dTpa-previous) syringe presentation. Immunogenicity was assessed before and 1 month post-booster vaccination; safety/reactogenicity were assessed during the 31 days post-vaccination. Non-inferiority of dTpa-new versus dTpa-previous was demonstrated for all antigens (upper limits of the 95% CIs for GMC ratios ranged from 1.03 to 1.13). At 1 month post-booster, immune responses were in similar ranges for all antigens with both syringe presentations. dTpa delivered using either syringe presentation was well-tolerated. These clinical results complement the technical data and support the use of the new syringe presentation to deliver the dTpa vaccine. PMID:26075317

  18. Design and development of an IBM/VM menu system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cazzola, D.J.

    1992-10-01

    This report describes a full screen menu system developed using IBM's Interactive System Productivity Facility (ISPF) and the REXX programming language. The software was developed for the 2800 IBM/VM Electrical Computer Aided Design (ECAD) system. The system was developed to deliver electronic drawing definitions to a corporate drawing release system. Although this report documents the status of the menu system when it was retired, the methodologies used and the requirements defined are very applicable to replacement systems.

  19. Microcomputer Intelligence for Technical Training (MITT): The evolution of an intelligent tutoring system

    NASA Technical Reports Server (NTRS)

    Norton, Jeffrey E.; Wiederholt, Bradley J.; Johnson, William B.

    1990-01-01

    Microcomputer Intelligence for Technical Training (MITT) uses Intelligent Tutoring System (ITS) technology to deliver diagnostic training in a variety of complex technical domains. Over the past six years, MITT technology has been used to develop training systems for nuclear power plant diesel generator diagnosis, Space Shuttle fuel cell diagnosis, and message processing diagnosis for the Minuteman missile. Presented here is an overview of the MITT system, describing the evolution of the MITT software and the benefits of using the MITT system.

  20. Control system development for a 1 MW(e) solar thermal power plant

    NASA Technical Reports Server (NTRS)

    Daubert, E. R.; Bergthold, F. M., Jr.; Fulton, D. G.

    1981-01-01

    The point-focusing distributed receiver power plant considered consists of a number of power modules delivering power to a central collection point. Each power module contains a parabolic dish concentrator with a closed-cycle receiver/turbine/alternator assembly. Currently, a single-module prototype plant is under construction. The major control system tasks required are related to concentrator pointing control, receiver temperature control, and turbine speed control. Attention is given to operational control details, control hardware and software, and aspects of CRT output display.

  1. An Automated System for the Maintenance of Multiform Documentation

    NASA Astrophysics Data System (ADS)

    Rousseau, Bertrand; Ruggier, Mario; Smith, Matthiew

    Software documentation for the user often exists in several forms, including paper, electronic, on-line help, etc. We have built a system to help with the writing and maintenance of such documentation, which relies on the FrameMaker product. As an example, we show how it is used to maintain the ADAMO documentation, delivered in four incarnations: paper, WWW hypertext, KUIP and running examples. The use of the system results in both time savings and quality improvements.

  2. Relay Forward-Link File Management Services (MaROS Phase 2)

    NASA Technical Reports Server (NTRS)

    Allard, Daniel A.; Wallick, Michael N.; Hy, Franklin H.; Gladden, Roy E.

    2013-01-01

    This software provides the service-level functionality to manage the delivery of files from a lander mission repository to an orbiter mission repository for eventual spacelink relay by the orbiter asset on a specific communications pass. It provides further functions to deliver and track a set of mission-defined messages detailing lander authorization instructions and orbiter data delivery state. All of the information concerning these transactions is persisted in a database providing a high level of accountability of the forward-link relay process.

  3. Direct comparison of the impact of head tracking, reverberation, and individualized head-related transfer functions on the spatial perception of a virtual speech source

    NASA Technical Reports Server (NTRS)

    Begault, D. R.; Wenzel, E. M.; Anderson, M. R.

    2001-01-01

    A study of sound localization performance was conducted using headphone-delivered virtual speech stimuli, rendered via HRTF-based acoustic auralization software and hardware, and blocked-meatus HRTF measurements. The independent variables were chosen to evaluate commonly held assumptions in the literature regarding improved localization: inclusion of head tracking, individualized HRTFs, and early and diffuse reflections. Significant effects were found for azimuth and elevation error, reversal rates, and externalization.

  4. Endoscopic measurements using a panoramic annular lens

    NASA Technical Reports Server (NTRS)

    Gilbert, John A.; Matthys, Donald R.

    1992-01-01

    The objective of this project was to design, build, demonstrate, and deliver a prototype system for making measurements within cavities. The system was to utilize structured lighting as the means for making measurements and was to rely on a stationary probe, equipped with a unique panoramic annular lens, to capture a cylindrical view of the illuminated cavity. Panoramic images, acquired with a digitizing camera and stored in a desktop computer, were to be linearized and analyzed by mouse-driven interactive software.
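
    The linearization step mentioned above, converting an annular (donut-shaped) panoramic image into a flat cylindrical strip, is essentially a polar-to-rectangular resampling. The sketch below shows one way to do it with NumPy; the centre coordinates, radii, and output size are placeholder assumptions, not parameters of the actual probe.

```python
import numpy as np

def unwrap_annulus(img, cx, cy, r_inner, r_outer, out_w=720, out_h=128):
    """Resample an annular panoramic image into a rectangular strip.

    img: 2-D grayscale array; (cx, cy): annulus centre in pixels;
    r_inner/r_outer: radii bounding the useful annulus.
    Nearest-neighbour sampling keeps the example short.
    """
    thetas = np.linspace(0.0, 2.0 * np.pi, out_w, endpoint=False)
    radii = np.linspace(r_inner, r_outer, out_h)
    # Build a grid of (radius, angle) pairs and map each back to a source pixel.
    rr, tt = np.meshgrid(radii, thetas, indexing="ij")
    xs = np.clip((cx + rr * np.cos(tt)).round().astype(int), 0, img.shape[1] - 1)
    ys = np.clip((cy + rr * np.sin(tt)).round().astype(int), 0, img.shape[0] - 1)
    return img[ys, xs]

# Tiny synthetic example: a 200x200 image with an arbitrary pattern.
demo = np.fromfunction(lambda y, x: (x + y) % 255, (200, 200))
strip = unwrap_annulus(demo, cx=100, cy=100, r_inner=30, r_outer=90)
print(strip.shape)   # (128, 720)
```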

  5. Machine translation project alternatives analysis

    NASA Technical Reports Server (NTRS)

    Bajis, Catherine J.; Bedford, Denise A. D.

    1993-01-01

    The Machine Translation Project consists of several components, two of which, the Project Plan and the Requirements Analysis, have already been delivered. The Project Plan details the overall rationale, objectives and time-table for the project as a whole. The Requirements Analysis compares a number of available machine translation systems, their capabilities, possible configurations, and costs. The Alternatives Analysis has resulted in a number of conclusions and recommendations to the NASA STI program concerning the acquisition of specific MT systems and related hardware and software.

  6. Evaluation of SuperLU on multicore architectures

    NASA Astrophysics Data System (ADS)

    Li, X. S.

    2008-07-01

    The Chip Multiprocessor (CMP) will be the basic building block for computer systems ranging from laptops to supercomputers. New software developments at all levels are needed to fully utilize these systems. In this work, we evaluate performance of different high-performance sparse LU factorization and triangular solution algorithms on several representative multicore machines. We included both Pthreads and MPI implementations in this study and found that the Pthreads implementation consistently delivers good performance and that a left-looking algorithm is usually superior.
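
    As a point of reference for readers who want to experiment with sequential SuperLU from Python, SciPy exposes the library through scipy.sparse.linalg.splu; the sketch below factorizes a small sparse system and performs the triangular solves. It illustrates the operation being benchmarked, not the multicore Pthreads/MPI codes evaluated in the paper, and the matrix is an arbitrary example.

```python
import numpy as np
from scipy.sparse import csc_matrix
from scipy.sparse.linalg import splu

# Small sparse test matrix in compressed sparse column format.
A = csc_matrix(np.array([[4.0, 1.0, 0.0],
                         [1.0, 3.0, 1.0],
                         [0.0, 1.0, 2.0]]))
b = np.array([1.0, 2.0, 3.0])

lu = splu(A)          # sparse LU factorization (sequential SuperLU under the hood)
x = lu.solve(b)       # forward/backward triangular solves
print(x, np.allclose(A @ x, b))
```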

  7. TAPRegExt: a VOResource Schema Extension for Describing TAP Services Version 1.0

    NASA Astrophysics Data System (ADS)

    Demleitner, Markus; Dowler, Patrick; Plante, Ray; Rixon, Guy; Taylor, Mark; Demleitner, Markus

    2012-08-01

    This document describes an XML encoding standard for metadata about services implementing the table access protocol TAP [TAP], referred to as TAPRegExt. Instance documents are part of the service's registry record or can be obtained from the service itself. They deliver information to both humans and software on the languages, output formats, and upload methods supported by the service, as well as data models implemented by the exposed tables, optional language features, and certain limits enforced by the service.

  8. Touring the Cosmos through Your Computer: A Guide to Free Desktop Planetarium Software

    NASA Astrophysics Data System (ADS)

    McCool, M.

    2009-11-01

    It always amazes me, unfortunately often in a negative way, how only a few people know how to make and deliver a good presentation. For many scientists it's usually their Achilles' heel. Many get so caught up in their work that when they present it at a scientific meeting or to the general public, their presentation often looks confusing, boring or sometimes even scary. The good news is that there are some general rules that can work magic with presentations.

  9. Content Classification and Context-Based Retrieval System for E-Learning

    ERIC Educational Resources Information Center

    Mittal, Ankush; Krishnan, Pagalthivarthi V.; Altman, Edward

    2006-01-01

    A recent focus in web based learning systems has been the development of reusable learning materials that can be delivered as personalized courses depending on a number of factors such as the user's background, his/her learning preferences, current knowledge based on previous assessments, or previous browsing patterns. The student is often…

  10. Developing Teaching Material Software Assisted for Numerical Methods

    NASA Astrophysics Data System (ADS)

    Handayani, A. D.; Herman, T.; Fatimah, S.

    2017-09-01

    The NCTM vision shows the importance of two things in school mathematics: knowing the mathematics of the 21st century, and the need to continue improving mathematics education to answer the challenges of a changing world. One of the competencies associated with the great challenges of the 21st century is the use of aids and tools (including IT), such as knowing the existence of various tools for mathematical activity. One of the significant challenges in mathematical learning is how to teach students abstract concepts. Here, technology in the form of mathematics learning software can be used more widely to embed abstract concepts in mathematics. In mathematics learning, the use of mathematical software can make high-level mathematical activity easier for students to grasp. Technology can strengthen student learning by delivering numerical, graphic, and symbolic content without spending time on complex manual computation. The purpose of this research is to design and develop software-assisted teaching materials for numerical methods. Development of the teaching material proceeds from a defining step, in which the learning material is designed using information obtained from an early analysis of learners, materials, and supporting tasks, through a design step, and finally a development step. The resulting software-assisted teaching material for numerical methods is valid in content; validator assessment of the material is good, and it can be used with little revision.
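
    To make concrete the kind of content such software-assisted material delivers, here is a short, generic numerical-methods example (Newton's method for a root of x² − 2); it illustrates the topic only and is not code from the teaching material described above.

```python
def newton(f, df, x0, tol=1e-10, max_iter=50):
    """Newton's method: iterate x_{k+1} = x_k - f(x_k)/f'(x_k)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Approximate sqrt(2) as the positive root of x^2 - 2 = 0.
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0)
print(root)   # ~1.41421356
```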

  11. Desktop document delivery using portable document format (PDF) files and the Web.

    PubMed Central

    Shipman, J P; Gembala, W L; Reeder, J M; Zick, B A; Rainwater, M J

    1998-01-01

    Desktop access to electronic full-text literature was rated one of the most desirable services in a client survey conducted by the University of Washington Libraries. The University of Washington Health Sciences Libraries (UW HSL) conducted a ten-month pilot test from August 1996 to May 1997 to determine the feasibility of delivering electronic journal articles via the Internet to remote faculty. Articles were scanned into Adobe Acrobat Portable Document Format (PDF) files and delivered to individuals using Multipurpose Internet Mail Extensions (MIME) standard e-mail attachments and the Web. Participants retrieved scanned articles and used the Adobe Acrobat Reader software to view and print files. The pilot test required a special programming effort to automate the client notification and file deletion processes. Test participants were satisfied with the pilot test despite some technical difficulties. Desktop delivery is now offered as a routine delivery method from the UW HSL. PMID:9681165

  12. Desktop document delivery using portable document format (PDF) files and the Web.

    PubMed

    Shipman, J P; Gembala, W L; Reeder, J M; Zick, B A; Rainwater, M J

    1998-07-01

    Desktop access to electronic full-text literature was rated one of the most desirable services in a client survey conducted by the University of Washington Libraries. The University of Washington Health Sciences Libraries (UW HSL) conducted a ten-month pilot test from August 1996 to May 1997 to determine the feasibility of delivering electronic journal articles via the Internet to remote faculty. Articles were scanned into Adobe Acrobat Portable Document Format (PDF) files and delivered to individuals using Multipurpose Internet Mail Extensions (MIME) standard e-mail attachments and the Web. Participants retrieved scanned articles and used the Adobe Acrobat Reader software to view and print files. The pilot test required a special programming effort to automate the client notification and file deletion processes. Test participants were satisfied with the pilot test despite some technical difficulties. Desktop delivery is now offered as a routine delivery method from the UW HSL.
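
    For readers curious what delivering a PDF as a MIME e-mail attachment looks like in code today, a minimal sketch using Python's standard library follows. The addresses, file name, and SMTP host are placeholders, and this is not the automation the UW HSL actually built for the 1996-97 pilot.

```python
import smtplib
from email.message import EmailMessage

# Placeholder values -- substitute real addresses, file, and SMTP host.
msg = EmailMessage()
msg["From"] = "library@example.edu"
msg["To"] = "faculty.member@example.edu"
msg["Subject"] = "Requested article (PDF)"
msg.set_content("The article you requested is attached as a PDF file.")

with open("article.pdf", "rb") as fh:          # a previously scanned article
    msg.add_attachment(fh.read(),
                       maintype="application",
                       subtype="pdf",
                       filename="article.pdf")

with smtplib.SMTP("smtp.example.edu") as server:
    server.send_message(msg)
```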

  13. Wireless Network Simulation in Aircraft Cabins

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Youssef, Mennatoallah; Vahala, Linda

    2004-01-01

    An electromagnetic propagation prediction tool was used to predict electromagnetic field strength inside airplane cabins. A commercial software package, Wireless Insite, was used to predict power levels inside aircraft cabins and the data was compared with previously collected experimental data. It was concluded that the software could qualitatively predict electromagnetic propagation inside the aircraft cabin environment.

  14. Real Time with the Librarian: Using Web Conferencing Software to Connect to Distance Students

    ERIC Educational Resources Information Center

    Riedel, Tom; Betty, Paul

    2013-01-01

    A pilot program to provide real-time library webcasts to Regis University distance students using Adobe Connect software was initiated in fall of 2011. Previously, most interaction between librarians and online students had been accomplished by asynchronous discussion threads in the Learning Management System. Library webcasts were offered in…

  15. Defining Geodetic Reference Frame using Matlab®: PlatEMotion 2.0

    NASA Astrophysics Data System (ADS)

    Cannavò, Flavio; Palano, Mimmo

    2016-03-01

    We describe the main features of the developed software tool, namely PlatE-Motion 2.0 (PEM2), which allows inferring the Euler pole parameters by inverting the observed velocities at a set of sites located on a rigid block (inverse problem). PEM2 also allows calculating the expected velocity at any point on the Earth, given an Euler pole (direct problem). PEM2 is the updated version of a previous software tool initially developed for easy-to-use file exchange with the GAMIT/GLOBK software package. The software tool is developed in the Matlab® framework and, like the previous version, includes a set of MATLAB functions (m-files), GUIs (fig-files), map data files (mat-files) and a user's manual, as well as some example input files. New features in PEM2 include (1) bug fixes, (2) improvements in the code, (3) improvements in the statistical analysis, and (4) new input/output file formats. In addition, PEM2 can now be run under the majority of operating systems. The tool is open source and freely available to the scientific community.
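
    The direct problem mentioned above, predicting the velocity of a surface site given an Euler pole, reduces to v = ω × r on a spherical Earth. A small NumPy sketch of that calculation follows; the pole parameters and site coordinates are arbitrary examples, and this is not code from PEM2 itself.

```python
import numpy as np

R_EARTH = 6371e3  # mean Earth radius in metres (spherical approximation)

def plate_velocity(pole_lat, pole_lon, omega_deg_per_myr, site_lat, site_lon):
    """Velocity v = omega x r (in mm/yr, ECEF frame) of a site on a rigid plate."""
    # Angular velocity vector in rad/yr.
    w = np.radians(omega_deg_per_myr) / 1e6
    plat, plon = np.radians([pole_lat, pole_lon])
    omega = w * np.array([np.cos(plat) * np.cos(plon),
                          np.cos(plat) * np.sin(plon),
                          np.sin(plat)])
    # Site position vector on the sphere, in metres.
    slat, slon = np.radians([site_lat, site_lon])
    r = R_EARTH * np.array([np.cos(slat) * np.cos(slon),
                            np.cos(slat) * np.sin(slon),
                            np.sin(slat)])
    return np.cross(omega, r) * 1e3   # m/yr -> mm/yr

# Arbitrary example pole and site, purely for illustration.
print(plate_velocity(48.0, -102.0, 0.25, 37.5, 15.0))
```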

  16. A Core Plug and Play Architecture for Reusable Flight Software Systems

    NASA Technical Reports Server (NTRS)

    Wilmot, Jonathan

    2006-01-01

    The Flight Software Branch, at Goddard Space Flight Center (GSFC), has been working on a run-time approach to facilitate a formal software reuse process. The reuse process is designed to enable rapid development and integration of high-quality software systems and to more accurately predict development costs and schedule. Previous reuse practices have been somewhat successful when the same teams are moved from project to project. But this typically requires taking the software system in an all-or-nothing approach where useful components cannot be easily extracted from the whole. As a result, the system is less flexible and scalable with limited applicability to new projects. This paper will focus on the rationale behind, and implementation of the run-time executive. This executive is the core for the component-based flight software commonality and reuse process adopted at Goddard.

  17. NASA's Space Launch System Program Update

    NASA Technical Reports Server (NTRS)

    May, Todd; Lyles, Garry

    2015-01-01

    Hardware and software for the world's most powerful launch vehicle for exploration is being welded, assembled, and tested today in high bays, clean rooms and test stands across the United States. NASA's Space Launch System (SLS) continued to make significant progress in the past year, including firing tests of both main propulsion elements, manufacturing of flight hardware, and the program Critical Design Review (CDR). Developed with the goals of safety, affordability, and sustainability, SLS will deliver unmatched capability for human and robotic exploration. The initial Block 1 configuration will deliver more than 70 metric tons (t) (154,000 pounds) of payload to low Earth orbit (LEO). The evolved Block 2 design will deliver some 130 t (286,000 pounds) to LEO. Both designs offer enormous opportunity and flexibility for larger payloads, simplifying payload design as well as ground and on-orbit operations, shortening interplanetary transit times, and decreasing overall mission risk. Over the past year, every vehicle element has manufactured or tested hardware, including flight hardware for Exploration Mission 1 (EM-1). This paper will provide an overview of the progress made over the past year and provide a glimpse of upcoming milestones on the way to a 2018 launch readiness date.

  18. Virtual microscopy and digital pathology in training and education.

    PubMed

    Hamilton, Peter W; Wang, Yinhai; McCullough, Stephen J

    2012-04-01

    Traditionally, education and training in pathology has been delivered using textbooks, glass slides and conventional microscopy. Over the last two decades, the number of web-based pathology resources has expanded dramatically with centralized pathological resources being delivered to many students simultaneously. Recently, whole slide imaging technology allows glass slides to be scanned and viewed on a computer screen via dedicated software. This technology is referred to as virtual microscopy and has created enormous opportunities in pathological training and education. Students are able to learn key histopathological skills, e.g. to identify areas of diagnostic relevance from an entire slide, via a web-based computer environment. Students no longer need to be in the same room as the slides. New human-computer interfaces are also being developed using more natural touch technology to enhance the manipulation of digitized slides. Several major initiatives are also underway introducing online competency and diagnostic decision analysis using virtual microscopy and have important future roles in accreditation and recertification. Finally, researchers are investigating how pathological decision-making is achieved using virtual microscopy and modern eye-tracking devices. Virtual microscopy and digital pathology will continue to improve how pathology training and education is delivered. © 2012 The Authors APMIS © 2012 APMIS.

  19. Service quality of delivered care from the perception of women with caesarean section and normal delivery.

    PubMed

    Tabrizi, Jafar S; Askari, Samira; Fardiazar, Zahra; Koshavar, Hossein; Gholipour, Kamal

    2014-01-01

    Our aim was to determine the service quality of delivered care for women with Caesarean section and normal delivery. A cross-sectional study was conducted among 200 women who had a caesarean section or normal delivery in Al-Zahra Teaching Hospital in Tabriz, north-western Iran. Service quality was calculated using Service Quality = 10 - (Importance × Performance), based on the importance and performance of service quality aspects from the postpartum women's perspective. A hierarchical regression analysis was applied in two steps using the enter method to examine the associations between demographics and SQ scores. Data were analysed using the SPSS-17 software. "Confidentiality", "autonomy", "choice of care provider" and "communication" achieved scores at the highest level of quality, while "support group", "prompt attention", "prevention and early detection", "continuity of care", "dignity", "safety", "accessibility" and "basic amenities" received service quality scores of less than eight. A statistically significant relationship was found between service quality score and continuity of care (P=0.008). There was a notable gap between the participants' expectations and what they actually received in most aspects of the provided care, so there is an opportunity to improve the quality of delivered care.
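
    Taking the stated formula at face value, the sketch below shows how a service-quality score could be computed per aspect and screened against the threshold of eight mentioned in the abstract. The abstract does not give the measurement scales, so the scales and the importance/performance values here are invented assumptions.

```python
# Illustrative computation of the quoted formula SQ = 10 - (importance x performance).
# Assumed scales: importance rated 0-10, performance taken as the proportion of
# encounters where the aspect was NOT delivered (0-1). Values are invented.
aspects = {
    "confidentiality":    (9.0, 0.05),   # -> SQ = 9.55
    "prompt attention":   (8.0, 0.40),   # -> SQ = 6.80
    "continuity of care": (7.0, 0.35),   # -> SQ = 7.55
}

for name, (importance, performance) in aspects.items():
    sq = 10 - (importance * performance)
    status = "high quality" if sq >= 8 else "below the threshold of eight"
    print(f"{name:20s} SQ = {sq:4.2f}  ({status})")
```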

  20. VOIP for Telerehabilitation: A Risk Analysis for Privacy, Security and HIPAA Compliance: Part II

    PubMed Central

    Watzlaf, Valerie J.M.; Moeini, Sohrab; Matusow, Laura; Firouzan, Patti

    2011-01-01

    In a previous publication the authors developed a privacy and security checklist to evaluate Voice over Internet Protocol (VoIP) videoconferencing software used between patients and therapists to provide telerehabilitation (TR) therapy. In this paper, the privacy and security checklist that was previously developed is used to perform a risk analysis of the top ten VoIP videoconferencing software to determine if their policies provide answers to the privacy and security checklist. Sixty percent of the companies claimed they do not listen into video-therapy calls unless maintenance is needed. Only 50% of the companies assessed use some form of encryption, and some did not specify what type of encryption was used. Seventy percent of the companies assessed did not specify any form of auditing on their servers. Statistically significant differences across company websites were found for sharing information outside of the country (p=0.010), encryption (p=0.006), and security evaluation (p=0.005). Healthcare providers considering use of VoIP software for TR services may consider using this privacy and security checklist before deciding to incorporate a VoIP software system for TR. Other videoconferencing software that is specific for TR with strong encryption, good access controls, and hardware that meets privacy and security standards should be considered for use with TR. PMID:25945177

  1. VOIP for Telerehabilitation: A Risk Analysis for Privacy, Security and HIPAA Compliance: Part II.

    PubMed

    Watzlaf, Valerie J M; Moeini, Sohrab; Matusow, Laura; Firouzan, Patti

    2011-01-01

    In a previous publication the authors developed a privacy and security checklist to evaluate Voice over Internet Protocol (VoIP) videoconferencing software used between patients and therapists to provide telerehabilitation (TR) therapy. In this paper, the privacy and security checklist that was previously developed is used to perform a risk analysis of the top ten VoIP videoconferencing software to determine if their policies provide answers to the privacy and security checklist. Sixty percent of the companies claimed they do not listen into video-therapy calls unless maintenance is needed. Only 50% of the companies assessed use some form of encryption, and some did not specify what type of encryption was used. Seventy percent of the companies assessed did not specify any form of auditing on their servers. Statistically significant differences across company websites were found for sharing information outside of the country (p=0.010), encryption (p=0.006), and security evaluation (p=0.005). Healthcare providers considering use of VoIP software for TR services may consider using this privacy and security checklist before deciding to incorporate a VoIP software system for TR. Other videoconferencing software that is specific for TR with strong encryption, good access controls, and hardware that meets privacy and security standards should be considered for use with TR.

  2. UltraTrack: Software for semi-automated tracking of muscle fascicles in sequences of B-mode ultrasound images.

    PubMed

    Farris, Dominic James; Lichtwark, Glen A

    2016-05-01

    Dynamic measurements of human muscle fascicle length from sequences of B-mode ultrasound images have become increasingly prevalent in biomedical research. Manual digitisation of these images is time consuming and algorithms for automating the process have been developed. Here we present a freely available software implementation of a previously validated algorithm for semi-automated tracking of muscle fascicle length in dynamic ultrasound image recordings, "UltraTrack". UltraTrack implements an affine extension to an optic flow algorithm to track movement of the muscle fascicle end-points throughout dynamically recorded sequences of images. The underlying algorithm has been previously described and its reliability tested, but here we present the software implementation with features for: tracking multiple fascicles in multiple muscles simultaneously; correcting temporal drift in measurements; manually adjusting tracking results; saving and re-loading of tracking results and loading a range of file formats. Two example runs of the software are presented detailing the tracking of fascicles from several lower limb muscles during a squatting and walking activity. We have presented a software implementation of a validated fascicle-tracking algorithm and made the source code and standalone versions freely available for download. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
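
    For orientation only, the snippet below shows the generic point-tracking primitive that underlies this class of method, pyramidal Lucas-Kanade optical flow in OpenCV, applied to two consecutive frames. UltraTrack's actual algorithm adds an affine extension and fascicle-specific handling that are not reproduced here, and the frames and tracked point are synthetic placeholders.

```python
import numpy as np
import cv2

# Two synthetic 8-bit "frames"; real use would load consecutive ultrasound images.
frame0 = np.zeros((120, 160), dtype=np.uint8)
frame1 = np.zeros((120, 160), dtype=np.uint8)
cv2.circle(frame0, (60, 60), 5, 255, -1)     # a feature in frame 0
cv2.circle(frame1, (63, 61), 5, 255, -1)     # the same feature, slightly displaced

# Fascicle end-point to track, as an Nx1x2 float32 array (OpenCV's expected shape).
p0 = np.array([[[60.0, 60.0]]], dtype=np.float32)

# Pyramidal Lucas-Kanade optical flow between the two frames.
p1, status, err = cv2.calcOpticalFlowPyrLK(frame0, frame1, p0, None,
                                           winSize=(21, 21), maxLevel=2)
print("tracked to:", p1.ravel(), "status:", status.ravel())
```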

  3. Monitoring software development through dynamic variables

    NASA Technical Reports Server (NTRS)

    Doerflinger, Carl W.; Basili, Victor R.

    1983-01-01

    Research conducted by the Software Engineering Laboratory (SEL) on the use of dynamic variables as a tool to monitor software development is described. Project independent measures which may be used in a management tool for monitoring software development are identified. Several FORTRAN projects with similar profiles are examined. The staff was experienced in developing these types of projects. The projects developed serve similar functions. Because these projects are similar some underlying relationships exist that are invariant between projects. These relationships, once well defined, may be used to compare the development of different projects to determine whether they are evolving the same way previous projects in this environment evolved.

  4. Prognostic communication preferences of migrant patients and their relatives.

    PubMed

    Mitchison, D; Butow, P; Sze, M; Aldridge, L; Hui, R; Vardy, J; Eisenbruch, M; Iedema, R; Goldstein, D

    2012-05-01

    Migrant patients comprise a significant proportion of Western oncologists' clientele. Although previous research has found that barriers exist in the communication between ethnically diverse patients and health professionals, little is known about their personal preferences for communication and information, or the concordance of views held between patients and family members. Seventy-three patients (31 Anglo-Australians, and 20 Chinese, 11 Arabic and 11 Greek migrants) and 65 relatives (25 Anglo-Australians, and 23 Chinese, 11 Arabic and 7 Greek migrants) were recruited through nine Sydney oncology clinics. Following prognostic consultations, participants were interviewed in their preferred language about their experiences and ideals regarding information and communication with oncologists. Interviews were audio-taped, translated and transcribed, and then thematically analysed using N-Vivo software. Consistency was found in patient preferences, regardless of ethnicity, in that almost all patients preferred prognostic information to be delivered in a caring and personalised manner from an authoritative oncologist. Contrary to previous research, migrant patients often expressed a desire for prognostic disclosure. Discordance was found between migrant patients and their families. These families displayed traditional non-Western preferences of non-disclosure of prognosis and wanted to actively influence consultations by meeting with oncologists separately beforehand and directing the oncologists on what and how information should be conveyed to patients. Many of the communication issues facing patients in the metastatic cancer setting are shared amongst Anglo-Australian and migrant patients alike. Understanding the dynamics within migrant families is also an important component in providing culturally sensitive communication. Future directions for research are provided. Copyright © 2011 John Wiley & Sons, Ltd.

  5. The software architecture to control the Cherenkov Telescope Array

    NASA Astrophysics Data System (ADS)

    Oya, I.; Füßling, M.; Antonino, P. O.; Conforti, V.; Hagge, L.; Melkumyan, D.; Morgenstern, A.; Tosti, G.; Schwanke, U.; Schwarz, J.; Wegner, P.; Colomé, J.; Lyard, E.

    2016-07-01

    The Cherenkov Telescope Array (CTA) project is an initiative to build two large arrays of Cherenkov gamma-ray telescopes. CTA will be deployed as two installations, one in the northern and the other in the southern hemisphere, containing dozens of telescopes of different sizes. CTA is a big step forward in the field of ground-based gamma-ray astronomy, not only because of the expected scientific return, but also due to the order-of-magnitude larger scale of the instrument to be controlled. The performance requirements associated with such a large and distributed astronomical installation require a thoughtful analysis to determine the best software solutions. The array control and data acquisition (ACTL) work-package within the CTA initiative will deliver the software to control and acquire the data from the CTA instrumentation. In this contribution we present the current status of the formal ACTL system decomposition into software building blocks and the relationships among them. The system is modelled via the Systems Modelling Language (SysML) formalism. To cope with the complexity of the system, this architecture model is sub-divided into different perspectives. The relationships with the stakeholders and external systems are used to create the first perspective, the context of the ACTL software system. Use cases are employed to describe the interaction of those external elements with the ACTL system and are traced to a hierarchy of functionalities (abstract system functions) describing the internal structure of the ACTL system. These functions are then traced to fully specified logical elements (software components), the deployment of which as technical elements is also described. This modelling approach allows us to decompose the ACTL software into elements to be created and to describe the flow of information within the system, providing us with a clear way to identify sub-system interdependencies. This architectural approach allows us to build the ACTL system model and trace requirements to deliverables (source code, documentation, etc.), and permits the implementation of a flexible use-case-driven software development approach thanks to the traceability from use cases to the logical software elements. The ALMA Common Software (ACS) container/component framework, used for the control of the Atacama Large Millimeter/submillimeter Array (ALMA), is the basis for the ACTL software and as such is considered an integral part of the software architecture.

  6. Integrating interface slicing into software engineering processes

    NASA Technical Reports Server (NTRS)

    Beck, Jon

    1993-01-01

    Interface slicing is a tool which was developed to facilitate software engineering. As previously presented, it was described in terms of its techniques and mechanisms. The integration of interface slicing into specific software engineering activities is considered by discussing a number of potential applications of interface slicing. The applications discussed specifically address the problems, issues, or concerns raised in a previous project. Because a complete interface slicer is still under development, these applications must be phrased in future tenses. Nonetheless, the interface slicing techniques which were presented can be implemented using current compiler and static analysis technology. Whether implemented as a standalone tool or as a module in an integrated development or reverse engineering environment, they require analysis no more complex than that required for current system development environments. By contrast, conventional slicing is a methodology which, while showing much promise and intuitive appeal, has yet to be fully implemented in a production language environment despite 12 years of development.

  7. Women's involvement in the decision to be delivered by caesarean section in Sub Saharan Africa.

    PubMed

    Nnaji, G A; Okafor, C; Muoghalu, K; Onyejimbe, U N; Umeononihu, O S

    2012-01-01

    The aim of this study is to determine the degree and nature of women's involvement in the decision to deliver by Caesarean section. A cross-sectional, descriptive, multi-centre study was conducted on post partum women who were delivered by Caesarean section in the three study centres. The five most common indications for Caesarean section were cephalo-pelvic disproportion, prolonged labour, malpresentation, pregnancy-induced hypertension and ante partum haemorrhage, which together accounted for 70% of the indications for Caesarean section. The commonest influence on respondents' decision to have Caesarean section was physician factors, followed by religious and cultural factors. Husbands influenced the majority of the respondents during the decision for Caesarean section. The respondents' agreement with the decision for Caesarean section varied significantly with the number of previous Caesarean sections, being highest among women with four previous Caesarean sections. In conclusion, the majority of women were found to be involved in the decision to have Caesarean section, and the most influential factors during the decision process were physician factors and the husband's presence. In this environment, the greater the number of previous Caesarean sections a woman has had, the more likely she is to accept Caesarean section in subsequent deliveries as a better option. There is a need for further studies to determine the effect of socio-demographic factors on the decision to have Caesarean section, as well as satisfaction with the outcome.

  8. Generic Safety Requirements for Developing Safe Insulin Pump Software

    PubMed Central

    Zhang, Yi; Jetley, Raoul; Jones, Paul L; Ray, Arnab

    2011-01-01

    Background The authors previously introduced a highly abstract generic insulin infusion pump (GIIP) model that identified common features and hazards shared by most insulin pumps on the market. The aim of this article is to extend our previous work on the GIIP model by articulating safety requirements that address the identified GIIP hazards. These safety requirements can be validated by manufacturers, and may ultimately serve as a safety reference for insulin pump software. Together, these two publications can serve as a basis for discussing insulin pump safety in the diabetes community. Methods In our previous work, we established a generic insulin pump architecture that abstracts functions common to many insulin pumps currently on the market and near-future pump designs. We then carried out a preliminary hazard analysis based on this architecture that included consultations with many domain experts. Further consultation with domain experts resulted in the safety requirements used in the modeling work presented in this article. Results Generic safety requirements for the GIIP model are presented, as appropriate, in parameterized format to accommodate clinical practices or specific insulin pump criteria important to safe device performance. Conclusions We believe that there is considerable value in having the diabetes, academic, and manufacturing communities consider and discuss these generic safety requirements. We hope that the communities will extend and revise them, make them more representative and comprehensive, experiment with them, and use them as a means for assessing the safety of insulin pump software designs. One potential use of these requirements is to integrate them into model-based engineering (MBE) software development methods. We believe, based on our experiences, that implementing safety requirements using MBE methods holds promise in reducing design/implementation flaws in insulin pump development and evolutionary processes, therefore improving overall safety of insulin pump software. PMID:22226258

  9. SU-F-T-300: Impact of Electron Density Modeling of ArcCHECK Cylindricaldiode Array On 3DVH Patient Specific QA Software Tool Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patwe, P; Mhatre, V; Dandekar, P

    Purpose: 3DVH software is a patient specific quality assurance tool which estimates the 3D dose to the patient specific geometry with the help of the Planned Dose Perturbation algorithm. The purpose of this study is to evaluate the impact of the HU value of the ArcCHECK phantom entered in the Eclipse TPS on 3D dose & DVH QA analysis. Methods: The manufacturer of the ArcCHECK phantom provides a CT data set of the phantom & recommends considering it as a homogeneous phantom with electron density (1.19 gm/cc or 282 HU) close to PMMA. We performed this study with the Eclipse TPS (V13, VMS), a trueBEAM STx (VMS) Linac & an ArcCHECK phantom (SNC). Plans were generated for a 6MV photon beam, 20cm×20cm field size at isocentre & SPD (source to phantom distance) of 86.7 cm to deliver 100cGy at isocentre. The 3DVH software requires the patient DICOM data generated by the TPS & the plan delivered on the ArcCHECK phantom. Plans were generated in the TPS by assigning different HU values to the phantom. We analyzed the gamma index & the dose profile for all plans along the vertical down direction of the beam's central axis for entry, exit & isocentre dose. Results: The global gamma passing rate (2% & 2mm) for the manufacturer recommended HU value of 282 was 96.3%. Detector entry, isocentre & detector exit doses were 1.9048 (1.9270), 1.00 (1.0199) & 0.5078 (0.527) Gy for TPS (measured) respectively. The global gamma passing rate for electron density 1.1302 gm/cc was 98.6%. Detector entry, isocentre & detector exit doses were 1.8714 (1.8873), 1.00 (0.9988) & 0.5211 (0.516) Gy for TPS (measured) respectively. Conclusion: The electron density value assigned by the manufacturer does not hold true for every user. Proper modeling of the electron density of the ArcCHECK in the TPS is essential to avoid systematic error in dose calculation for patient specific QA.

  10. Maternal intention to breast-feed and breast-feeding outcomes in term and preterm infants: Pregnancy Risk Assessment Monitoring System (PRAMS), 2000-2003.

    PubMed

    Colaizy, Tarah T; Saftlas, Audrey F; Morriss, Frank H

    2012-04-01

    To determine the effect of intention to breast-feed on short-term breast-feeding outcomes in women delivering term and preterm infants. Data from the US Centers for Disease Control and Prevention's Pregnancy Risk Assessment Monitoring System (PRAMS) for three states, Ohio, Michigan and Arkansas, during 2000-2003 were analysed. SAS 9·1·3 and SUDAAN 10 statistical software packages were used for analyses. Arkansas, Michigan and Ohio, USA. Mothers of recently delivered infants, selected by birth certificate sampling. Of 16,839 mothers included, 9·7% delivered preterm. Some 52·2% expressed definite intention to breast-feed, 16·8% expressed tentative intention, 4·3% were uncertain and 26·8% had no intention to breast-feed. Overall 65·2% initiated breast-feeding, 52·0% breast-fed for ≥4 weeks and 30·8% breast-fed for ≥10 weeks. Women with definite intention were more likely to initiate (OR = 24·3, 95% CI 18·4, 32·1), to breast-feed for ≥4 weeks (OR = 7·12, 95% CI 5·95, 8·51) and to breast-feed for ≥10 weeks (OR = 2·75, 95% CI 2·20, 3·45) compared with women with tentative intention. Levels of intention did not differ between women delivering preterm and term. Women delivering at <34 weeks were more likely to initiate breast-feeding (OR = 2·24, 95% CI 1·64, 3·06) and to breast-feed for ≥4 weeks (OR = 2·58, 95% CI 1·96, 3·41), but less likely to breast-feed for ≥10 weeks (OR = 0·55, 95% CI 0·44, 0·68), compared with those delivering at term. Women delivering between 34 and 36 weeks were less likely to breast-feed for ≥10 weeks than those delivering at term (OR = 0·63, 95% CI 0·49, 0·81). Prenatal intention to breast-feed is a powerful predictor of short-term breast-feeding outcomes in women delivering both at term and prematurely.
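
    As a reminder of how odds ratios and confidence intervals like those reported here are derived (the published analysis used SAS and SUDAAN with survey weights, which this sketch does not reproduce), below is the unweighted 2×2-table calculation with hypothetical counts.

```python
import math

# Hypothetical 2x2 table: rows = definite vs tentative intention to breast-feed,
# columns = initiated vs did not initiate breast-feeding. Counts are invented.
a, b = 820, 95    # definite intention: initiated / did not
c, d = 310, 240   # tentative intention: initiated / did not

or_hat = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)     # standard error of log(OR)
lo = math.exp(math.log(or_hat) - 1.96 * se_log_or)
hi = math.exp(math.log(or_hat) + 1.96 * se_log_or)
print(f"OR = {or_hat:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```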

  11. The Medical Imaging Interaction Toolkit: challenges and advances : 10 years of open-source development.

    PubMed

    Nolden, Marco; Zelzer, Sascha; Seitel, Alexander; Wald, Diana; Müller, Michael; Franz, Alfred M; Maleike, Daniel; Fangerau, Markus; Baumhauer, Matthias; Maier-Hein, Lena; Maier-Hein, Klaus H; Meinzer, Hans-Peter; Wolf, Ivo

    2013-07-01

    The Medical Imaging Interaction Toolkit (MITK) has been available as open-source software for almost 10 years now. In this period the requirements of software systems in the medical image processing domain have become increasingly complex. The aim of this paper is to show how MITK evolved into a software system that is able to cover all steps of a clinical workflow including data retrieval, image analysis, diagnosis, treatment planning, intervention support, and treatment control. MITK provides modularization and extensibility on different levels. In addition to the original toolkit, a module system, micro services for small, system-wide features, a service-oriented architecture based on the Open Services Gateway initiative (OSGi) standard, and an extensible and configurable application framework allow MITK to be used, extended and deployed as needed. A refined software process was implemented to deliver high-quality software, ease the fulfillment of regulatory requirements, and enable teamwork in mixed-competence teams. MITK has been applied by a worldwide community and integrated into a variety of solutions, either at the toolkit level or as an application framework with custom extensions. The MITK Workbench has been released as a highly extensible and customizable end-user application. Optional support for tool tracking, image-guided therapy, diffusion imaging as well as various external packages (e.g. CTK, DCMTK, OpenCV, SOFA, Python) is available. MITK has also been used in several FDA/CE-certified applications, which demonstrates the high-quality software and rigorous development process. MITK provides a versatile platform with a high degree of modularization and interoperability and is well suited to meet the challenging tasks of today's and tomorrow's clinically motivated research.

  12. Issues and approaches for electronic document approval and transmittal using digital signatures and text authentication: Prototype documentation

    NASA Astrophysics Data System (ADS)

    Boling, M. E.

    1989-09-01

    Prototypes were assembled pursuant to recommendations made in report K/DSRD-96, Issues and Approaches for Electronic Document Approval and Transmittal Using Digital Signatures and Text Authentication, and to examine and discover the possibilities for integrating available hardware and software to provide cost effective systems for digital signatures and text authentication. These prototypes show that on a LAN, a multitasking, windowed, mouse/keyboard menu-driven interface can be assembled to provide easy and quick access to bit-mapped images of documents, electronic forms and electronic mail messages with a means to sign, encrypt, deliver, receive or retrieve and authenticate text and signatures. In addition they show that some of this same software may be used in a classified environment using host to terminal transactions to accomplish these same operations. Finally, a prototype was developed demonstrating that binary files may be signed electronically and sent by point to point communication and over ARPANET to remote locations where the authenticity of the code and signature may be verified. Related studies on the subject of electronic signatures and text authentication using public key encryption were done within the Department of Energy. These studies include timing studies of public key encryption software and hardware and testing of experimental user-generated host resident software for public key encryption. This software used commercially available command-line source code. These studies are responsive to an initiative within the Office of the Secretary of Defense (OSD) for the protection of unclassified but sensitive data. It is notable that these related studies are all built around the same commercially available public key encryption products from the private sector and that the software selection was made independently by each study group.
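
    The sign/verify step described above can today be expressed in a few lines with a modern public-key library. The sketch below uses Ed25519 from the cryptography package purely as an illustration of the concept; it is not the RSA-era commercial command-line software the 1989 prototypes were built on.

```python
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

# Generate a signing key pair and sign a document (any byte string).
private_key = ed25519.Ed25519PrivateKey.generate()
public_key = private_key.public_key()

document = b"Approved: engineering change notice 42"   # placeholder content
signature = private_key.sign(document)

# The recipient authenticates the text and signature with the public key.
try:
    public_key.verify(signature, document)
    print("signature valid")
except InvalidSignature:
    print("signature INVALID")
```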

  13. General object-oriented software development

    NASA Technical Reports Server (NTRS)

    Seidewitz, Edwin V.; Stark, Mike

    1986-01-01

    Object-oriented design techniques are gaining increasing popularity for use with the Ada programming language. A general approach to object-oriented design is described which synthesizes the principles of previous object-oriented methods into the overall software life-cycle, providing transitions from specification to design and from design to code. It therefore provides the basis for a general object-oriented development methodology.

  14. Digital Modeling in Design Foundation Coursework: An Exploratory Study of the Effectiveness of Conceptual Design Software

    ERIC Educational Resources Information Center

    Guidera, Stan; MacPherson, D. Scot

    2008-01-01

    This paper presents the results of a study that was conducted to identify and document student perceptions of the effectiveness of computer modeling software introduced in a design foundations course that had previously utilized only conventional manually-produced representation techniques. Rather than attempt to utilize a production-oriented CAD…

  15. KinFin: Software for Taxon-Aware Analysis of Clustered Protein Sequences.

    PubMed

    Laetsch, Dominik R; Blaxter, Mark L

    2017-10-05

    The field of comparative genomics is concerned with the study of similarities and differences between the information encoded in the genomes of organisms. A common approach is to define gene families by clustering protein sequences based on sequence similarity, and analyze protein cluster presence and absence in different species groups as a guide to biology. Due to the high dimensionality of these data, downstream analysis of protein clusters inferred from large numbers of species, or species with many genes, is nontrivial, and few solutions exist for transparent, reproducible, and customizable analyses. We present KinFin, a streamlined software solution capable of integrating data from common file formats and delivering aggregative annotation of protein clusters. KinFin delivers analyses based on systematic taxonomy of the species analyzed, or on user-defined groupings of taxa, for example, sets based on attributes such as life history traits, organismal phenotypes, or competing phylogenetic hypotheses. Results are reported through graphical and detailed text output files. We illustrate the utility of the KinFin pipeline by addressing questions regarding the biology of filarial nematodes, which include parasites of veterinary and medical importance. We resolve the phylogenetic relationships between the species and explore functional annotation of proteins in clusters in key lineages and between custom taxon sets, identifying gene families of interest. KinFin can easily be integrated into existing comparative genomic workflows, and promotes transparent and reproducible analysis of clustered protein data. Copyright © 2017 Laetsch and Blaxter.

  16. Decision support and disease management: a logic engineering approach.

    PubMed

    Fox, J; Thomson, R

    1998-12-01

    This paper describes the development and application of PROforma, a unified technology for clinical decision support and disease management. Work leading to the implementation of PROforma has been carried out in a series of projects funded by European agencies over the past 13 years. The work has been based on logic engineering, a distinct design and development methodology that combines concepts from knowledge engineering, logic programming, and software engineering. Several of the projects have used the approach to demonstrate a wide range of applications in primary and specialist care and clinical research. Concurrent academic research projects have provided a sound theoretical basis for the safety-critical elements of the methodology. The principal technical results of the work are the PROforma logic language for defining clinical processes and an associated suite of software tools for delivering applications, such as decision support and disease management procedures. The language supports four standard objects (decisions, plans, actions, and enquiries), each of which has an intuitive meaning with well-understood logical semantics. The development toolset includes a powerful visual programming environment for composing applications from these standard components, for verifying consistency and completeness of the resulting specification and for delivering stand-alone or embeddable applications. Tools and applications that have resulted from the work are described and illustrated, with examples from specialist cancer care and primary care. The results of a number of evaluation activities are included to illustrate the utility of the technology.
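
    To give a feel for the four standard objects mentioned above (decisions, plans, actions, and enquiries), the sketch below models them as plain Python data classes. This is a hypothetical, simplified rendering for illustration only, not the PROforma language, its semantics, or its tool suite.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical, simplified rendering of PROforma's four standard task classes.

@dataclass
class Enquiry:            # requests data from the user or a record system
    name: str
    data_items: List[str]

@dataclass
class Decision:           # weighs candidate options
    name: str
    candidates: List[str]

@dataclass
class Action:             # an external act, e.g. prescribe or refer
    name: str
    procedure: str

@dataclass
class Plan:               # a container that sequences other tasks
    name: str
    components: List[object] = field(default_factory=list)

triage = Plan("chest_pain_triage", [
    Enquiry("symptoms", ["pain_score", "duration"]),
    Decision("risk_level", ["low", "high"]),
    Action("refer", "urgent cardiology referral"),
])
print([type(t).__name__ for t in triage.components])
```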

  17. Portable widefield imaging device for ICG-detection of the sentinel lymph node

    NASA Astrophysics Data System (ADS)

    Govone, Angelo Biasi; Gómez-García, Pablo Aurelio; Carvalho, André Lopes; Capuzzo, Renato de Castro; Magalhães, Daniel Varela; Kurachi, Cristina

    2015-06-01

    Metastasis is one of the major cancer complications, since malignant cells detach from the primary tumor and reach other organs or tissues. The sentinel lymph node (SLN) is the first lymphatic structure to be affected by the malignant cells, but its location is still a great challenge for the medical team. This is because the lymph nodes are located between the muscle fibers, making their visualization difficult. Seeking to aid the surgeon in the detection of the SLN, the present study aims to develop a widefield fluorescence imaging device using indocyanine green as the fluorescence marker. The system is basically composed of a 780 nm illumination unit, optical components for 810 nm fluorescence detection, two CCD cameras, a laptop, and dedicated software. The illumination unit has 16 diode lasers. A dichroic mirror and bandpass filters select and deliver the excitation light to the interrogated tissue, and select and deliver the fluorescence light to the camera. One camera is responsible for the acquisition of visible light and the other for the acquisition of the ICG fluorescence. The software, developed on the LabVIEW® platform, generates a real-time merged image in which it is possible to observe the fluorescence spots, related to the lymph nodes, superimposed on the image under white light. The system was tested in a mouse model, and a first patient with tongue cancer was imaged. Both results showed the potential of the presented fluorescence imaging system for sentinel lymph node detection.
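
    The real-time merge described above, overlaying the near-infrared fluorescence channel on the visible image, can be illustrated with a simple weighted blend. The snippet below is a schematic NumPy/OpenCV sketch with synthetic frames; it is not the LabVIEW software used in the device, and frame sizes and weights are arbitrary.

```python
import numpy as np
import cv2

# Synthetic stand-ins for the two camera streams (real code would grab frames).
visible = np.full((240, 320, 3), 90, dtype=np.uint8)          # dim grey scene
fluorescence = np.zeros((240, 320), dtype=np.uint8)
cv2.circle(fluorescence, (160, 120), 20, 255, -1)              # "lymph node" hotspot

# Map the ICG signal to a green pseudo-colour layer and blend it over the visible image.
overlay = np.zeros_like(visible)
overlay[:, :, 1] = fluorescence                                # green channel only
merged = cv2.addWeighted(visible, 1.0, overlay, 0.7, 0)

print(merged.shape, merged[120, 160])   # blended pixel at the hotspot centre
```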

  18. Software dependability in the Tandem GUARDIAN system

    NASA Technical Reports Server (NTRS)

    Lee, Inhwan; Iyer, Ravishankar K.

    1995-01-01

    Based on extensive field failure data for Tandem's GUARDIAN operating system this paper discusses evaluation of the dependability of operational software. Software faults considered are major defects that result in processor failures and invoke backup processes to take over. The paper categorizes the underlying causes of software failures and evaluates the effectiveness of the process pair technique in tolerating software faults. A model to describe the impact of software faults on the reliability of an overall system is proposed. The model is used to evaluate the significance of key factors that determine software dependability and to identify areas for improvement. An analysis of the data shows that about 77% of processor failures that are initially considered due to software are confirmed as software problems. The analysis shows that the use of process pairs to provide checkpointing and restart (originally intended for tolerating hardware faults) allows the system to tolerate about 75% of reported software faults that result in processor failures. The loose coupling between processors, which results in the backup execution (the processor state and the sequence of events) being different from the original execution, is a major reason for the measured software fault tolerance. Over two-thirds (72%) of measured software failures are recurrences of previously reported faults. Modeling, based on the data, shows that, in addition to reducing the number of software faults, software dependability can be enhanced by reducing the recurrence rate.

  19. Pattern Matcher for Trees Constructed from Lists

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    A software library has been developed that takes a high-level description of a pattern to be satisfied and applies it to a target. If the two match, it returns success; otherwise, it indicates a failure. The target is semantically a tree that is constructed from elements of terminal and non-terminal nodes represented through lists and symbols. Additionally, functionality is provided for finding the element in a set that satisfies a given pattern and doing a tree search, finding all occurrences of leaf nodes that match a given pattern. This process is valuable because it is a new algorithmic approach that significantly improves the productivity of the programmers and has the potential of making their resulting code more efficient by the introduction of a novel semantic representation language. This software has been used in many applications delivered to NASA and private industry, and the cost savings that have resulted from it are significant.
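
    A toy version of the idea, matching a pattern expressed as nested lists and symbols against a target tree built from lists and symbols, is sketched below in Python. The wildcard convention ('_' matches anything, a '?name' string binds a variable) is an assumption made for illustration, not the NASA library's actual syntax.

```python
def match(pattern, target, bindings=None):
    """Match a nested-list pattern against a nested-list target.

    '_' matches any single node; a string starting with '?' binds that node
    to a variable. Returns a bindings dict on success, or None on failure.
    (Wildcard syntax is illustrative, not the delivered library's own.)
    """
    bindings = dict(bindings or {})
    if pattern == "_":
        return bindings
    if isinstance(pattern, str) and pattern.startswith("?"):
        if pattern in bindings and bindings[pattern] != target:
            return None
        bindings[pattern] = target
        return bindings
    if isinstance(pattern, list) and isinstance(target, list):
        if len(pattern) != len(target):
            return None
        for p, t in zip(pattern, target):
            bindings = match(p, t, bindings)
            if bindings is None:
                return None
        return bindings
    return bindings if pattern == target else None

tree = ["assign", "x", ["plus", "x", "1"]]
print(match(["assign", "?var", ["plus", "?var", "_"]], tree))
# -> {'?var': 'x'}
```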

  20. Provision of Training for the IT Industry: The ELEVATE Project

    NASA Astrophysics Data System (ADS)

    Paraskakis, Iraklis; Konstantinidis, Andreas; Bouras, Thanassis; Perakis, Kostas; Pantelopoulos, Stelios; Hatziapostolou, Thanos

    This paper presents ELEVATE, which aims to deliver an innovative training, educational and certification environment integrating the application software to be taught with the training procedure. ELEVATE aspires to address the training needs of software development SMEs, and the solution proposed is based on three basic notions: to provide competence training that is tailored to the needs of the individual trainee, to allow the trainee to carry out authentic activities as well as problem-based learning that draws from real-life scenarios, and finally to allow for the assessment and certification of the skills and competences acquired. In order to achieve the desired results, the ELEVATE architecture utilises an Interactive Interoperability Layer, an Intelligent Personalization Trainer, as well as the Training, Evaluation & Certification component. The ELEVATE pedagogical model is based on blended learning, the e-Training component (an intelligent system that provides tailored training) and Learning 2.0.

  1. Digital Library Storage using iRODS Data Grids

    NASA Astrophysics Data System (ADS)

    Hedges, Mark; Blanke, Tobias; Hasan, Adil

    Digital repository software provides a powerful and flexible infrastructure for managing and delivering complex digital resources and metadata. However, issues can arise in managing the very large, distributed data files that may constitute these resources. This paper describes an implementation approach that combines the Fedora digital repository software with a storage layer implemented as a data grid, using the iRODS middleware developed by DICE (Data Intensive Cyber Environments) as the successor to SRB. This approach allows us to use Fedora's flexible architecture to manage the structure of resources and to provide application-layer services to users. The grid-based storage layer provides efficient support for managing and processing the underlying distributed data objects, which may be very large (e.g. audio-visual material). The Rule Engine built into iRODS is used to integrate complex workflows at the data level that need not be visible to users, e.g. digital preservation functionality.

  2. A low cost Doppler system for vascular dialysis access surveillance.

    PubMed

    Molina, P S C; Moraes, R; Baggio, J F R; Tognon, E A

    2004-01-01

    The National Kidney Foundation guidelines for vascular access recommend access surveillance to avoid morbidity among patients undergoing hemodialysis. Methods to detect access failure based on a CW Doppler system are being proposed to implement surveillance programs at lower cost. This work describes a low-cost Doppler system implemented on a PC notebook designed to carry out this task. A Doppler board samples the blood flow velocity and delivers demodulated quadrature Doppler signals. These signals are sampled by the notebook sound card. Software for Windows OS (running on the notebook) applies a complex FFT (CFFT) to consecutive 11.6 ms intervals of the Doppler signals. The sonogram is presented on the screen in real time. The software also calculates the maximum and the intensity-weighted mean frequency envelopes. Since similar systems employ DSP boards to process the Doppler signals, a cost reduction was achieved. The Doppler board electronic circuits and the routines to process the Doppler signals are presented.
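
    The processing chain described, an FFT over consecutive ~11.6 ms windows followed by maximum and intensity-weighted mean frequency envelopes, can be sketched in a few lines of NumPy. The signal here is synthetic, and the window length assumes a 44.1 kHz sound-card rate (512 samples ≈ 11.6 ms), which may differ from the actual board; the 5% power threshold for the maximum envelope is likewise an assumption.

```python
import numpy as np

FS = 44100                  # assumed sound-card sample rate (Hz)
N = 512                     # 512 samples at 44.1 kHz is roughly 11.6 ms per interval

# Synthetic stand-in for a demodulated Doppler signal: a 2 kHz tone plus noise.
t = np.arange(FS) / FS
signal = np.sin(2 * np.pi * 2000 * t) + 0.1 * np.random.randn(FS)

freqs = np.fft.rfftfreq(N, d=1 / FS)
mean_env, max_env = [], []
for start in range(0, len(signal) - N, N):          # consecutive 11.6 ms intervals
    window = signal[start:start + N] * np.hanning(N)
    power = np.abs(np.fft.rfft(window)) ** 2        # one column of the sonogram
    mean_env.append(np.sum(freqs * power) / np.sum(power))   # intensity-weighted mean
    above = np.nonzero(power > 0.05 * power.max())[0]        # crude maximum envelope
    max_env.append(freqs[above[-1]])

print(f"{len(mean_env)} columns; mean ~{np.mean(mean_env):.0f} Hz, max ~{np.mean(max_env):.0f} Hz")
```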

  3. Photo-realistic Terrain Modeling and Visualization for Mars Exploration Rover Science Operations

    NASA Technical Reports Server (NTRS)

    Edwards, Laurence; Sims, Michael; Kunz, Clayton; Lees, David; Bowman, Judd

    2005-01-01

    Modern NASA planetary exploration missions employ complex systems of hardware and software managed by large teams of engineers and scientists in order to study remote environments. The most complex and successful of these recent projects is the Mars Exploration Rover mission. The Computational Sciences Division at NASA Ames Research Center delivered a 3D visualization program, Viz, to the MER mission that provides an immersive, interactive environment for science analysis of the remote planetary surface. In addition, Ames provided the Athena Science Team with high-quality terrain reconstructions generated with the Ames Stereo-pipeline. The on-site support team for these software systems responded to unanticipated opportunities to generate 3D terrain models during the primary MER mission. This paper describes Viz, the Stereo-pipeline, and the experiences of the on-site team supporting the scientists at JPL during the primary MER mission.

  4. ELOPTA: a novel microcontroller-based operant device.

    PubMed

    Hoffman, Adam M; Song, Jianjian; Tuttle, Elaina M

    2007-11-01

    Operant devices have been used for many years in animal behavior research, yet such devices are generally highly specialized and quite expensive. Although commercial models are somewhat adaptable and resilient, they are also extremely expensive and are controlled by difficult-to-learn proprietary software. As an alternative to commercial devices, we have designed and produced a fully functional, programmable operant device using a PICmicro microcontroller (Microchip Technology, Inc.). The electronic operant testing apparatus (ELOPTA) is designed to deliver food when a study animal, in this case a bird, successfully depresses the correct sequence of illuminated keys. The device logs each keypress and can detect and log whenever a test animal is positioned at the device. Data can be easily transferred to a computer and imported into any statistical analysis software. At about 3% of the cost of a commercial device, ELOPTA will advance the behavioral sciences, including behavioral ecology, animal learning and cognition, and ethology.

  5. Mobile applications for handheld devices to screen and randomize acute stroke patients in clinical trials.

    PubMed

    Qureshi, Ai; Connelly, B; Abbott, Ei; Maland, E; Kim, J; Blake, J

    2012-08-01

    The availability of internet connectivity and mobile application software used by low-power handheld devices makes smart phones of unique value in time-sensitive clinical trials. Trial-specific applications can be downloaded by investigators from various mobile software distribution platforms or web applications delivered over HTTP. The Antihypertensive Treatment in Acute Cerebral Hemorrhage (ATACH) II investigators in collaboration with MentorMate released the ATACH-II Patient Recruitment mobile application available on iPhone, Android, and Blackberry in 2011. The mobile application provides tools for pre-screening, assessment of eligibility, and randomization of patients. Since the release of ATACH-II mobile application, the CLEAR-IVH (Clot Lysis Evaluating Accelerated Resolution of Intraventricular Hemorrhage) trial investigators have also adopted such a mobile application. The video-conferencing capabilities of the most recent mobile devices open up additional opportunities to involve central coordinating centers in the recruitment process in real time.

  6. Mercury BLASTP: Accelerating Protein Sequence Alignment

    PubMed Central

    Jacob, Arpith; Lancaster, Joseph; Buhler, Jeremy; Harris, Brandon; Chamberlain, Roger D.

    2008-01-01

    Large-scale protein sequence comparison is an important but compute-intensive task in molecular biology. BLASTP is the most popular tool for comparative analysis of protein sequences. In recent years, an exponential increase in the size of protein sequence databases has required either exponentially more running time or a cluster of machines to keep pace. To address this problem, we have designed and built a high-performance FPGA-accelerated version of BLASTP, Mercury BLASTP. In this paper, we describe the architecture of the portions of the application that are accelerated in the FPGA, and we also describe the integration of these FPGA-accelerated portions with the existing BLASTP software. We have implemented Mercury BLASTP on a commodity workstation with two Xilinx Virtex-II 6000 FPGAs. We show that the new design runs 11-15 times faster than software BLASTP on a modern CPU while delivering close to 99% identical results. PMID:19492068

  7. Implementation of electronic medical records requires more than new software: Lessons on integrating and managing health technologies from Mbarara, Uganda.

    PubMed

    Madore, Amy; Rosenberg, Julie; Muyindike, Winnie R; Bangsberg, David R; Bwana, Mwebesa B; Martin, Jeffrey N; Kanyesigye, Michael; Weintraub, Rebecca

    2015-12-01

    Implementation lessons: • Technology alone does not necessarily lead to improvement in health service delivery, in contrast to the common assumption that advanced technology goes hand in hand with progress. • Implementation of electronic medical record (EMR) systems is a complex, resource-intensive process that, in addition to software, hardware, and human resource investments, requires careful planning, change management skills, adaptability, and continuous engagement of stakeholders. • Research requirements and goals must be balanced with service delivery needs when determining how much information is essential to collect and who should be interfacing with the EMR system. • EMR systems require ongoing monitoring and regular updates to ensure they are responsive to evolving clinical use cases and research questions. • High-quality data and analyses are essential for EMRs to deliver value to providers, researchers, and patients. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. An Independent Orbit Determination Simulation for the OSIRIS-REx Asteroid Sample Return Mission

    NASA Technical Reports Server (NTRS)

    Getzandanner, Kenneth; Rowlands, David; Mazarico, Erwan; Antreasian, Peter; Jackman, Coralie; Moreau, Michael

    2016-01-01

    After arriving at the near-Earth asteroid (101955) Bennu in late 2018, the OSIRIS-REx spacecraft will execute a series of observation campaigns and orbit phases to accurately characterize Bennu and ultimately collect a sample of pristine regolith from its surface. While in the vicinity of Bennu, the OSIRIS-REx navigation team will rely on a combination of ground-based radiometric tracking data and optical navigation (OpNav) images to generate and deliver precision orbit determination products. Long before arrival at Bennu, the navigation team is performing multiple orbit determination simulations and thread tests to verify navigation performance and ensure interfaces between multiple software suites function properly. In this paper, we will summarize the results of an independent orbit determination simulation of the Orbit B phase of the mission performed to test the interface between the OpNav image processing and orbit determination software packages.

  9. Development and implementation of software systems for imaging spectroscopy

    USGS Publications Warehouse

    Boardman, J.W.; Clark, R.N.; Mazer, A.S.; Biehl, L.L.; Kruse, F.A.; Torson, J.; Staenz, K.

    2006-01-01

    Specialized software systems have played a crucial role throughout the twenty-five year course of the development of the new technology of imaging spectroscopy, or hyperspectral remote sensing. By their very nature, hyperspectral data place unique and demanding requirements on the computer software used to visualize, analyze, process, and interpret them. Often described as a marriage of the two technologies of reflectance spectroscopy and airborne/spaceborne remote sensing, imaging spectroscopy in fact produces data sets with unique qualities, unlike previous remote sensing or spectrometer data. Because of these unique spatial and spectral properties, hyperspectral data are not readily processed or exploited with legacy software systems inherited from either of the two parent fields of study. This paper provides brief reviews of seven important software systems developed specifically for imaging spectroscopy.

  10. Revealing the ISO/IEC 9126-1 Clique Tree for COTS Software Evaluation

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry

    2007-01-01

    Previous research has shown that acyclic dependency models, if they exist, can be extracted from software quality standards and that these models can be used to assess software safety and product quality. In the case of commercial off-the-shelf (COTS) software, the extracted dependency model can be used in a probabilistic Bayesian network context for COTS software evaluation. Furthermore, while experts typically employ Bayesian networks to encode domain knowledge, secondary structures (clique trees) from Bayesian network graphs can be used to determine the probabilistic distribution of any software variable (attribute) using any clique that contains that variable. Secondary structures, therefore, provide insight into the fundamental nature of graphical networks. This paper will apply secondary structure calculations to reveal the clique tree of the acyclic dependency model extracted from the ISO/IEC 9126-1 software quality standard. Suggestions will be provided to describe how the clique tree may be exploited to aid efficient transformation of an evaluation model.
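    A clique tree can be recovered from an acyclic dependency model with the standard junction-tree construction: moralize the directed graph, triangulate it, take the maximal cliques as nodes, and connect them by a maximum-weight spanning tree over separator sizes. The sketch below is a simplified illustration using networkx, not code from the paper, and the quality-attribute edges are hypothetical rather than the actual ISO/IEC 9126-1 model.

```python
import itertools
import networkx as nx

def clique_tree(dag_edges):
    """Build a clique (junction) tree from a directed acyclic dependency model.

    Steps: moralize the DAG, triangulate the moral graph, enumerate its
    maximal cliques, then take a maximum-weight spanning tree of the clique
    graph (edge weight = size of the separator shared by two cliques).
    """
    dag = nx.DiGraph(dag_edges)

    # 1. Moralize: connect parents of a common child, then drop directions.
    moral = nx.Graph(dag.to_undirected())
    for child in dag.nodes:
        for u, v in itertools.combinations(dag.predecessors(child), 2):
            moral.add_edge(u, v)

    # 2. Triangulate (chordal completion).
    chordal, _ = nx.complete_to_chordal_graph(moral)

    # 3. Maximal cliques of the chordal graph become the tree's nodes.
    cliques = [frozenset(c) for c in nx.chordal_graph_cliques(chordal)]

    # 4. Clique graph weighted by separator size; max spanning tree = clique tree.
    cg = nx.Graph()
    cg.add_nodes_from(cliques)
    for a, b in itertools.combinations(cliques, 2):
        sep = len(a & b)
        if sep:
            cg.add_edge(a, b, weight=sep)
    return nx.maximum_spanning_tree(cg)

# Illustrative dependency edges (hypothetical, not the ISO/IEC 9126-1 model):
edges = [("functionality", "reliability"), ("functionality", "usability"),
         ("reliability", "maintainability"), ("usability", "maintainability")]
for a, b in clique_tree(edges).edges:
    print(sorted(a), "--", sorted(b))
```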

  11. Software Process Assurance for Complex Electronics

    NASA Technical Reports Server (NTRS)

    Plastow, Richard A.

    2007-01-01

    Complex Electronics (CE) now perform tasks that were previously handled in software, such as communication protocols. Many methods used to develop software bear a close resemblance to CE development. Field Programmable Gate Arrays (FPGAs) can have over a million logic gates, while system-on-chip (SOC) devices can combine a microprocessor, input and output channels, and sometimes an FPGA for programmability. With this increased intricacy, the possibility of software-like bugs such as incorrect design, faulty logic, and unexpected interactions within the logic is great. With CE devices obscuring the hardware/software boundary, we propose that mature software methodologies may be utilized with slight modifications in the development of these devices. Software Process Assurance for Complex Electronics (SPACE) is a research project that used standardized software assurance and engineering practices to provide an assurance framework for development activities. Tools such as checklists, best practices, and techniques were used to detect missing requirements and bugs earlier in the development cycle, creating a development process for CE that was more easily maintained, consistent, and configurable based on the device used.

  12. Undergraduate allergy teaching in a UK medical school: comparison of the described and delivered curriculum.

    PubMed

    Shehata, Yasser; Ross, Michael; Sheikh, Aziz

    2007-02-01

    Concerns have been raised about the adequacy of allergy teaching in UK undergraduate medical curricula. Our previous work, which involved undertaking a systematic analysis of the documented curricular learning objectives relating to allergy teaching in a UK medical school, found references to allergy teaching in each of the five years of study but also identified some apparent omissions in allergy teaching. These may represent actual gaps in relation to allergy training, or alternatively may reflect dissonance between the described and delivered curricula. To compare the described and delivered undergraduate curricula on allergy and allergy-related topics in a UK medical school. We identified and e-mailed the individuals responsible for each of the 43 modules in the five-year undergraduate medical programme at the University of Edinburgh, enquiring about the delivery of allergy-related teaching within their modules. We then compared these responses with the results of the previous study mapping allergy-related teaching across the undergraduate curriculum. Fifty-one individuals were identified as being responsible for leading the 43 modules in the curriculum. Forty-nine (96%) of these module organisers responded to our enquiry; these individuals represented 41 of the 43 modules (95%). Module organisers reported that allergy-related teaching and learning was delivered in 14 modules (33%), was absent in 13 (30%) modules, and may occur to varying degrees within a further 10 (23%) modules. Module organisers' responses about the delivered curriculum on allergy were consistent with the findings from documented learning objectives in 21 (49%) modules. They also reported allergy teaching and learning in modules which had not been identified by examination of the learning objectives; however, there were still important gaps in the allergy-related curriculum. Information gathered from teaching staff confirms that specific teaching and learning on allergic disorders is currently being delivered in all five years of the undergraduate curriculum. However, comparison between the described and delivered curricula on allergy revealed discrepancies highlighting the complex nature of the undergraduate curriculum and the difficulties involved in mapping specific teaching themes within them. This assessment has revealed gaps in allergy training which need to be addressed.

  13. User's Manual for LEWICE Version 3.2

    NASA Technical Reports Server (NTRS)

    Wright, William

    2008-01-01

    A research project is underway at NASA Glenn to produce a computer code which can accurately predict ice growth under a wide range of meteorological conditions for any aircraft surface. This report will present a description of the code inputs and outputs from version 3.2 of this software, which is called LEWICE. This version differs from release 2.0 due to the addition of advanced thermal analysis capabilities for de-icing and anti-icing applications using electrothermal heaters or bleed air, the addition of automated Navier-Stokes analysis, an empirical model for supercooled large droplets (SLD), and a pneumatic boot option. An extensive effort was also undertaken to compare the results against the database of electrothermal results which have been generated in the NASA Glenn Icing Research Tunnel (IRT), as was performed for the validation effort for version 2.0. This report will primarily describe the features of the software related to the use of the program. Appendix A has been included to list some of the inner workings of the software or the physical models used. This information is also available in the form of several unpublished documents internal to NASA. This report is intended as a replacement for all previous user manuals of LEWICE. In addition to describing the changes and improvements made for this version, information from previous manuals may be duplicated so that the user will not need to consult previous manuals to use this software.

  14. Multimedia communications and services for the healthcare community

    NASA Astrophysics Data System (ADS)

    Robinson, James M.

    1994-11-01

    The NYNEX Media Broadband Service Trials in Boston examined the use of several multiple media applications from healthcare in conjunction with high speed fiber optic networks. As part of these trials, NYNEX developed a network-based software technology that simplifies and coordinates the delivery of complex voice, data, image, and video information. This permits two or more users to interact and collaborate with one another while sharing, displaying, and manipulating various media types. Different medical applications were trialed at four of Boston's major hospitals, ranging from teleradiology (which tested the quality of the diagnostic images and the need to collaborate) to telecardiology (which displayed diagnostic quality digital movies played in synchronicity). These trials allowed NYNEX to uniquely witness the needs and opportunities in the healthcare community for broadband communications with the necessary control capabilities and simplified user interface. As a result of the success of the initial trials, NYNEX has created a new business unit, Media Communications Services (MCS), to deliver a service offering based on this capability. New England Medical Center, as one of the initial trial sites, was chosen as a beta trial candidate, and wanted to further its previous work in telecardiology as well as telepsychiatry applications. Initial and subsequent deployments have been completed, and medical use is in progress.

  15. APNEA list mode data acquisition and real-time event processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogle, R.A.; Miller, P.; Bramblett, R.L.

    1997-11-01

    The LMSC Active Passive Neutron Examinations and Assay (APNEA) Data Logger is a VME-based data acquisition system using commercial off-the-shelf hardware with application-specific software. It receives TTL inputs from eighty-eight ³He detector tubes and eight timing signals. Two data sets are generated concurrently for each acquisition session: (1) List Mode recording of all detector and timing signals, timestamped to 3-microsecond resolution; (2) Event Accumulations generated in real time by counting events into short (tens of microseconds) and long (seconds) time bins following repetitive triggers. List Mode data sets can be post-processed to (1) determine the optimum time bins for TRU assay of waste drums, (2) analyze a given data set in several ways to match different assay requirements and conditions, and (3) confirm assay results by examining details of the raw data. Data Logger events are processed and timestamped by an array of 15 TMS320C40 DSPs and delivered to an embedded controller (PowerPC 604) for interim disk storage. Three acquisition modes, corresponding to different trigger sources, are provided. A standard network interface to a remote host system (Windows NT or SunOS) provides for system control, status, and transfer of previously acquired data. 6 figs.
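    As a rough illustration of the Event Accumulation scheme, counting timestamped events into short and long time bins after each trigger, a list-mode post-processing step might look like the NumPy sketch below. The bin widths and counts are hypothetical, and this is not the APNEA software itself.

```python
import numpy as np

def accumulate_events(event_times_us, trigger_times_us,
                      short_bin_us=20.0, n_short=50, long_bin_s=1.0, n_long=10):
    """Histogram list-mode event timestamps relative to each trigger.

    event_times_us, trigger_times_us: 1-D arrays of timestamps in microseconds.
    Returns (short_hist, long_hist): counts per bin summed over all triggers,
    with short bins of tens of microseconds and long bins of seconds, loosely
    mirroring the Event Accumulation data set described above.
    """
    short_edges = np.arange(n_short + 1) * short_bin_us
    long_edges = np.arange(n_long + 1) * long_bin_s * 1e6   # seconds -> microseconds
    short_hist = np.zeros(n_short, dtype=np.int64)
    long_hist = np.zeros(n_long, dtype=np.int64)
    for t0 in trigger_times_us:
        dt = event_times_us - t0
        dt = dt[dt >= 0.0]                      # keep events following this trigger
        short_hist += np.histogram(dt, bins=short_edges)[0]
        long_hist += np.histogram(dt, bins=long_edges)[0]
    return short_hist, long_hist
```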

  16. Analysis of SSEM Sensor Data Using BEAM

    NASA Technical Reports Server (NTRS)

    Zak, Michail; Park, Han; James, Mark

    2004-01-01

    A report describes analysis of space shuttle main engine (SSME) sensor data using Beacon-based Exception Analysis for Multimissions (BEAM) [NASA Tech Briefs articles, the two most relevant being Beacon-Based Exception Analysis for Multimissions (NPO- 20827), Vol. 26, No.9 (September 2002), page 32 and Integrated Formulation of Beacon-Based Exception Analysis for Multimissions (NPO- 21126), Vol. 27, No. 3 (March 2003), page 74] for automated detection of anomalies. A specific implementation of BEAM, using the Dynamical Invariant Anomaly Detector (DIAD), is used to find anomalies commonly encountered during SSME ground test firings. The DIAD detects anomalies by computing coefficients of an autoregressive model and comparing them to expected values extracted from previous training data. The DIAD was trained using nominal SSME test-firing data. DIAD detected all the major anomalies including blade failures, frozen sense lines, and deactivated sensors. The DIAD was particularly sensitive to anomalies caused by faulty sensors and unexpected transients. The system offers a way to reduce SSME analysis time and cost by automatically indicating specific time periods, signals, and features contributing to each anomaly. The software described here executes on a standard workstation and delivers analyses in seconds, a computing time comparable to or faster than the test duration itself, offering potential for real-time analysis.
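    The detection step described above, fitting autoregressive coefficients to a data window and comparing them against values learned from nominal training data, can be sketched as follows. This is a simplified stand-in rather than the BEAM/DIAD flight code; the model order and the normalization of the score are assumptions.

```python
import numpy as np

def ar_coefficients(x, order=4):
    """Least-squares fit of an autoregressive model x[t] = sum_k a_k * x[t-k]."""
    X = np.column_stack([x[order - k - 1 : len(x) - k - 1] for k in range(order)])
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def anomaly_score(window, training_windows, order=4):
    """Distance between the window's AR coefficients and the training-data mean,
    normalized by the training spread; large values flag anomalous dynamics."""
    train = np.array([ar_coefficients(w, order) for w in training_windows])
    mean, std = train.mean(axis=0), train.std(axis=0) + 1e-12
    return float(np.linalg.norm((ar_coefficients(window, order) - mean) / std))
```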

  17. Trajectory Software With Upper Atmosphere Model

    NASA Technical Reports Server (NTRS)

    Barrett, Charles

    2012-01-01

    Trajectory Software Applications 6.0 for the DEC Alpha platform includes an implementation of the Jacchia-Lineberry Upper Atmosphere Density Model used in the Mission Control Center for International Space Station support. Previous trajectory software required an upper atmosphere model to support atmospheric drag calculations in the Mission Control Center. Functional operation will differ depending on the end use of the module. In general, the calling routine uses function-calling arguments to specify input to the processor; the atmosphere model then computes and returns atmospheric density at the time of interest.
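    The calling pattern described above, pass inputs as function arguments and receive a density value back, is illustrated below with a deliberately simple exponential-scale-height model standing in for Jacchia-Lineberry; all reference values are assumptions for illustration only, and the real model also accounts for solar flux and geomagnetic activity.

```python
import math

def atmospheric_density(altitude_km, ref_density_kg_m3=3.8e-12,
                        ref_altitude_km=400.0, scale_height_km=60.0):
    """Return mass density (kg/m^3) from a simple exponential falloff.

    Placeholder for illustration only; not the Jacchia-Lineberry model.
    """
    return ref_density_kg_m3 * math.exp(-(altitude_km - ref_altitude_km) / scale_height_km)

def drag_acceleration(altitude_km, speed_m_s, cd_area_over_mass_m2_kg):
    """a_drag = 0.5 * rho * v^2 * (Cd * A / m), the calculation the density feeds."""
    rho = atmospheric_density(altitude_km)
    return 0.5 * rho * speed_m_s ** 2 * cd_area_over_mass_m2_kg
```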

  18. SU-E-T-83: A Study On Evaluating the Directional Dependency of 2D Seven29 Ion Chamber Array Clinically with Different IMRT Plans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, Syam; Aswathi, C.P.

    Purpose: To evaluate the directional dependency of the 2D seven29 ion chamber array clinically with different IMRT plans. Methods: 25 patients already treated with IMRT plans were selected for the study. Verification plans were created for each treatment plan in the Eclipse 10 treatment planning system using the AAA algorithm with the 2D array and the Octavius CT phantom. Verification plans were created twice for each patient: the first with the real IMRT gantry angles (plan-related approach) and the second with a zero-degree gantry angle (field-related approach). Measurements were performed on a Varian Clinac iX linear accelerator equipped with a Millennium 120 multileaf collimator. Fluence was measured for all delivered plans and analyzed using the VeriSoft software. Comparison was done between the fluence delivered at a static (zero-degree) gantry angle and IMRT delivered with real gantry angles. Results: The gamma pass percentage is greater than 97% for all IMRT delivered at zero gantry angle and between 95% and 98% for real gantry angles. The dose difference between the TPS-calculated and measured values for IMRT delivered at zero gantry angle was between 0.03 and 0.06 Gy, and with real gantry angles between 0.02 and 0.05 Gy. There is a significant difference in the gamma analysis between the zero-degree and true-angle deliveries, with a significance of 0.002. The standard deviation of the gamma pass percentage for the IMRT plans at zero gantry angle was 0.68, and for IMRT with true gantry angles it was 0.74. Conclusion: The gamma analysis for IMRT at zero-degree gantry angles shows a higher pass percentage than IMRT delivered with true gantry angles. Verification plans delivered with true gantry angles lower the verification accuracy when the 2D array is used for measurement.
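    For readers unfamiliar with the gamma pass percentage, the sketch below shows a simplified global gamma computation that combines a dose-difference and a distance-to-agreement criterion per point. It is a brute-force illustration with assumed 3%/3 mm criteria and grid spacing, not the VeriSoft implementation.

```python
import numpy as np

def gamma_pass_rate(measured, calculated, spacing_mm=2.5,
                    dose_tol=0.03, dta_mm=3.0):
    """Simplified global 2-D gamma analysis (brute force).

    measured, calculated: 2-D dose arrays on the same grid (Gy).
    dose_tol: dose-difference criterion as a fraction of the max calculated dose.
    dta_mm:   distance-to-agreement criterion in mm.
    Returns the percentage of measured points with gamma <= 1.
    """
    dd_norm = dose_tol * calculated.max()
    ys, xs = np.indices(measured.shape)
    pos = np.stack([ys, xs], axis=-1) * spacing_mm          # physical coordinates
    flat_calc = calculated.ravel()
    flat_pos = pos.reshape(-1, 2)
    gammas = np.empty(measured.shape)
    for iy, ix in np.ndindex(measured.shape):
        dist2 = ((flat_pos - pos[iy, ix]) ** 2).sum(axis=1) / dta_mm ** 2
        dose2 = ((flat_calc - measured[iy, ix]) / dd_norm) ** 2
        gammas[iy, ix] = np.sqrt((dist2 + dose2).min())
    return 100.0 * np.mean(gammas <= 1.0)
```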

  19. Software for keratometry measurements using portable devices

    NASA Astrophysics Data System (ADS)

    Iyomasa, C. M.; Ventura, L.; De Groote, J. J.

    2010-02-01

    In this work we present image processing software for automatic astigmatism measurements, developed for a handheld keratometer. The system projects 36 light spots, from LEDs, arranged in a precise circle onto the lachrymal film of the examined cornea. The displacement, size, and deformation of the reflected image of these light spots are analyzed, providing the keratometry. The purpose of this research is to develop software that performs fast and precise calculations on mainstream mobile devices; in other words, software that can be implemented in portable computer systems that are low cost and easy to handle. This project allows portability for keratometers and is preliminary work toward a portable corneal topographer.
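    The underlying keratometric calculation reduces to the classical convex-mirror relation between target size, reflected-image size, and working distance, followed by conversion to dioptric power with the keratometric index. The sketch below shows that relation; the numeric example is hypothetical and this is not the device's actual image processing code.

```python
def corneal_radius_mm(object_size_mm, image_size_mm, working_distance_mm):
    """Convex-mirror approximation used by classical keratometers:
    r = 2 * d * h' / h, with object size h, reflected-image size h',
    and working distance d (all in mm)."""
    return 2.0 * working_distance_mm * image_size_mm / object_size_mm

def keratometric_power_diopters(radius_mm, keratometric_index=1.3375):
    """Convert radius of curvature to corneal power: K = (n - 1) / r."""
    return (keratometric_index - 1.0) / (radius_mm / 1000.0)

# Hypothetical example: 60 mm target ring, 3.0 mm reflected image, 75 mm distance
r = corneal_radius_mm(60.0, 3.0, 75.0)           # -> 7.5 mm
print(round(keratometric_power_diopters(r), 2))  # -> 45.0 D
```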

  20. Trends in computer hardware and software.

    PubMed

    Frankenfeld, F M

    1993-04-01

    Previously identified and current trends in the development of computer systems and in the use of computers for health care applications are reviewed. Trends identified in a 1982 article were increasing miniaturization and archival ability, increasing software costs, increasing software independence, user empowerment through new software technologies, shorter computer-system life cycles, and more rapid development and support of pharmaceutical services. Most of these trends continue today. Current trends in hardware and software include the increasing use of reduced instruction-set computing, migration to the UNIX operating system, the development of large software libraries, microprocessor-based smart terminals that allow remote validation of data, speech synthesis and recognition, application generators, fourth-generation languages, computer-aided software engineering, object-oriented technologies, and artificial intelligence. Current trends specific to pharmacy and hospitals are the withdrawal of vendors of hospital information systems from the pharmacy market, improved linkage of information systems within hospitals, and increased regulation by government. The computer industry and its products continue to undergo dynamic change. Software development continues to lag behind hardware, and its high cost is offsetting the savings provided by hardware.
