Extreme Programming: Maestro Style
NASA Technical Reports Server (NTRS)
Norris, Jeffrey; Fox, Jason; Rabe, Kenneth; Shu, I-Hsiang; Powell, Mark
2009-01-01
"Extreme Programming: Maestro Style" is the name of a computer programming methodology that has evolved as a custom version of a methodology, called extreme programming that has been practiced in the software industry since the late 1990s. The name of this version reflects its origin in the work of the Maestro team at NASA's Jet Propulsion Laboratory that develops software for Mars exploration missions. Extreme programming is oriented toward agile development of software resting on values of simplicity, communication, testing, and aggressiveness. Extreme programming involves use of methods of rapidly building and disseminating institutional knowledge among members of a computer-programming team to give all the members a shared view that matches the view of the customers for whom the software system is to be developed. Extreme programming includes frequent planning by programmers in collaboration with customers, continually examining and rewriting code in striving for the simplest workable software designs, a system metaphor (basically, an abstraction of the system that provides easy-to-remember software-naming conventions and insight into the architecture of the system), programmers working in pairs, adherence to a set of coding standards, collaboration of customers and programmers, frequent verbal communication, frequent releases of software in small increments of development, repeated testing of the developmental software by both programmers and customers, and continuous interaction between the team and the customers. The environment in which the Maestro team works requires the team to quickly adapt to changing needs of its customers. In addition, the team cannot afford to accept unnecessary development risk. Extreme programming enables the Maestro team to remain agile and provide high-quality software and service to its customers. However, several factors in the Maestro environment have made it necessary to modify some of the conventional extreme-programming practices. The single most influential of these factors is that continuous interaction between customers and programmers is not feasible.
Arbitrating Control of Control and Display Units
NASA Technical Reports Server (NTRS)
Sugden, Paul C.
2007-01-01
The ARINC 739 Switch is a computer program that arbitrates control of two multi-function control and display units (MCDUs) between (1) a commercial flight-management computer (FMC) and (2) NASA software used in research on transport aircraft. (MCDUs are the primary interfaces between pilots and FMCs on many commercial aircraft.) This program was recently redesigned into a software library that can be embedded in research application programs. As part of the redesign, this software was combined with software for creating custom pages of information to be displayed on a CDU. This software commands independent switching of the left (pilot's) and right (copilot's) MCDUs. For example, a custom CDU page can control the left CDU while the FMC controls the right CDU. The software uses menu keys to switch control of the CDU between the FMC and a custom CDU page. The software provides an interface that enables custom CDU pages to insert keystrokes into the FMC's CDU input interface. This feature allows the custom CDU pages to manipulate the FMC as if it were a pilot.
Current trends for customized biomedical software tools.
Khan, Haseeb Ahmad
2017-01-01
In the past, biomedical scientists were solely dependent on expensive commercial software packages for various applications. However, the advent of user-friendly programming languages and open source platforms has revolutionized the development of simple and efficient customized software tools for solving specific biomedical problems. Many of these tools are designed and developed by biomedical scientists independently or with the support of computer experts and often made freely available for the benefit of scientific community. The current trends for customized biomedical software tools are highlighted in this short review.
Repository-Based Software Engineering (RBSE) program
NASA Technical Reports Server (NTRS)
1992-01-01
Support of a software engineering program was provided in the following areas: client/customer liaison; research representation/outreach; and program support management. Additionally, a list of deliverables is presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mundy, D; Tryggestad, E; Beltran, C
Purpose: To develop daily and monthly quality assurance (QA) programs in support of a new spot-scanning proton treatment facility using a combination of commercial and custom equipment and software. Emphasis was placed on efficiency and evaluation of key quality parameters. Methods: The daily QA program was developed to test output, spot size and position, proton beam energy, and image guidance using the Sun Nuclear Corporation rf-DQA™3 device and Atlas QA software. The program utilizes standard Atlas linear accelerator tests repurposed for proton measurements and a custom jig for indexing the device to the treatment couch. The monthly QA program was designed to test mechanical performance, image quality, radiation quality, isocenter coincidence, and safety features. Many of these tests are similar to linear accelerator QA counterparts, but many require customized test design and equipment. Coincidence of imaging, laser marker, mechanical, and radiation isocenters, for instance, is verified using a custom film-based device devised and manufactured at our facility. Proton spot size and position as a function of energy are verified using a custom spot pattern incident on film and analysis software developed in-house. More details concerning the equipment and software developed for monthly QA are included in the supporting document. Thresholds for daily and monthly tests were established via perturbation analysis, early experience, and/or proton system specifications and associated acceptance test results. Results: The periodic QA program described here has been in effect for approximately 9 months and has proven efficient and sensitive to sub-clinical variations in treatment delivery characteristics. Conclusion: Tools and professional guidelines for periodic proton system QA are not as well developed as their photon and electron counterparts. The program described here efficiently evaluates key quality parameters and, while specific to the needs of our facility, could be readily adapted to other proton centers.
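The abstract mentions in-house analysis software that verifies proton spot size and position from film, but does not describe that software. As a hedged illustration only (not the facility's code), the following Python sketch estimates a spot's centroid and 1-sigma width from a background-subtracted, digitized film image using intensity-weighted moments; the pixel size and array shapes are assumptions.

```python
import numpy as np

def spot_centroid_and_sigma(image, pixel_mm=0.1):
    """Estimate spot position (centroid) and size (1-sigma width) from a
    background-subtracted 2D intensity array, using intensity-weighted moments.
    pixel_mm is an assumed film-scanner resolution in millimetres per pixel."""
    img = np.clip(image.astype(float), 0.0, None)
    total = img.sum()
    if total <= 0:
        raise ValueError("image contains no signal")
    ys, xs = np.indices(img.shape)
    cx = (xs * img).sum() / total                        # centroid column (pixels)
    cy = (ys * img).sum() / total                        # centroid row (pixels)
    sx = np.sqrt(((xs - cx) ** 2 * img).sum() / total)   # 1-sigma width, x
    sy = np.sqrt(((ys - cy) ** 2 * img).sum() / total)   # 1-sigma width, y
    return (cx * pixel_mm, cy * pixel_mm), (sx * pixel_mm, sy * pixel_mm)

# Example: a synthetic Gaussian spot with sigma = 5 pixels centred at (60, 40)
yy, xx = np.mgrid[0:100, 0:120]
spot = np.exp(-((xx - 60) ** 2 + (yy - 40) ** 2) / (2 * 5.0 ** 2))
centre, sigma = spot_centroid_and_sigma(spot, pixel_mm=0.1)
print("centre (mm):", centre, "sigma (mm):", sigma)
```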
NASA Technical Reports Server (NTRS)
1982-01-01
Use of computer program STRCMACS has enabled Illinois Bell Telephone, a subsidiary of American Telephone and Telegraph, to cut software development costs by about 10 percent by reducing program maintenance and by allowing the department to bring other software into operation more quickly. It has also been useful in company training of programming staff.
Computer software management, evaluation, and dissemination
NASA Technical Reports Server (NTRS)
1983-01-01
The activities of the Computer Software Management and Information Center involving the collection, processing, and distribution of software developed under the auspices of NASA and certain other federal agencies are reported. Program checkout and evaluation, inventory control, customer services and marketing, dissemination, program maintenance, and special development tasks are discussed.
Operation of the Computer Software Management and Information Center (COSMIC)
NASA Technical Reports Server (NTRS)
1983-01-01
The major operational areas of the COSMIC center are described. Quantitative data on the software submittals, program verification, and evaluation are presented. The dissemination activities are summarized. Customer services and marketing activities of the center for the calendar year are described. Those activities devoted to the maintenance and support of selected programs are described. A Customer Information system, the COSMIC Abstract Recording System Project, and the COSMIC Microfiche Project are summarized. Operational cost data are summarized.
Data-Driven Software Framework for Web-Based ISS Telescience
NASA Technical Reports Server (NTRS)
Tso, Kam S.
2005-01-01
Software that enables authorized users to monitor and control scientific payloads aboard the International Space Station (ISS) from diverse terrestrial locations equipped with Internet connections is undergoing development. This software reflects a data-driven approach to distributed operations. A Web-based software framework leverages prior developments in Java and Extensible Markup Language (XML) to create portable code and portable data, to which one can gain access via Web-browser software on almost any common computer. Open-source software is used extensively to minimize cost; the framework also accommodates enterprise-class server software to satisfy needs for high performance and security. To accommodate the diversity of ISS experiments and users, the framework emphasizes openness and extensibility. Users can take advantage of available viewer software to create their own client programs according to their particular preferences, and can upload these programs for custom processing of data, generation of views, and planning of experiments. The same software system, possibly augmented with a subset of data and additional software tools, could be used for public outreach by enabling public users to replay telescience experiments, conduct their experiments with simulated payloads, and create their own client programs and other custom software.
NASA Technical Reports Server (NTRS)
Fisher, Marcus S.; Northey, Jeffrey; Stanton, William
2014-01-01
The purpose of this presentation is to outline how the NASA Independent Verification and Validation (IVV) Program helps to build reliability into the Space Mission Software Systems (SMSSs) that its customers develop.
Johnston, Sharon; Wong, Sabrina T; Blackman, Stephanie; Chau, Leena W; Grool, Anne M; Hogg, William
2017-11-16
Recruiting family physicians into primary care research studies requires researchers to continually manage information coming in, going out, and coming in again. In many research groups, Microsoft Excel and Access are the usual data management tools, but they are very basic and do not support any automation, linking, or reminder systems to manage and integrate recruitment information and processes. We explored whether a commercial customer relationship management (CRM) software program - designed for sales people in businesses to improve customer relations and communications - could be used to make the research recruitment system faster, more effective, and more efficient. We found that while there was potential for long-term studies, it simply did not adapt effectively enough for our shorter study and recruitment budget. The amount of training required to master the software and our need for ongoing flexible and timely support were greater than the benefit of using CRM software for our study.
ERIC Educational Resources Information Center
Lieberth, Ann K.; Martin, Doug R.
1995-01-01
Because of the diversity of clients served by speech-language pathologists and audiologists, available commercial software may not meet all needs. Authoring programs allow the clinician to design software that can be customized for individual clients. This article describes an authoring program called HyperCard and its use in preparing hypermedia…
Review of the activities of COSMIC
NASA Technical Reports Server (NTRS)
Carmon, J. L.
1983-01-01
The activities of the Computer Software Management and Information Center involving the collection, processing, and distribution of software developed under the auspices of NASA and certain other federal agencies are reported. Program checkout and evaluation, inventory control, customer services and marketing, dissemination, program maintenance, and special development tasks are discussed.
COSTMODL: An automated software development cost estimation tool
NASA Technical Reports Server (NTRS)
Roush, George B.
1991-01-01
The cost of developing computer software continues to consume an increasing portion of many organizations' total budgets, both in the public and private sector. As this trend develops, the capability to produce reliable estimates of the effort and schedule required to develop a candidate software product takes on increasing importance. The COSTMODL program was developed to provide an in-house capability to perform development cost estimates for NASA software projects. COSTMODL is an automated software development cost estimation tool which incorporates five cost estimation algorithms including the latest models for the Ada language and incrementally developed products. The principal characteristic which sets COSTMODL apart from other software cost estimation programs is its capacity to be completely customized to a particular environment. The estimation equations can be recalibrated to reflect the programmer productivity characteristics demonstrated by the user's organization, and the set of significant factors which affect software development costs can be customized to reflect any unique properties of the user's development environment. Careful use of a capability such as COSTMODL can significantly reduce the risk of cost overruns and failed projects.
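COSTMODL's own equations are not reproduced in the abstract, so the sketch below uses a generic COCOMO-style effort model, Effort = a * KSLOC^b, and shows the kind of recalibration the abstract describes: refitting the multiplier from an organization's completed projects. All names and numbers are illustrative assumptions, not data from the report.

```python
import math

def effort_person_months(ksloc, a, b=1.05):
    """Generic COCOMO-style effort equation: effort = a * KSLOC**b."""
    return a * ksloc ** b

def recalibrate_a(history, b=1.05):
    """Recalibrate the multiplier 'a' from completed projects.
    history: list of (ksloc, actual_person_months) pairs.
    A log-space average keeps large projects from dominating the fit."""
    logs = [math.log(pm / ksloc ** b) for ksloc, pm in history]
    return math.exp(sum(logs) / len(logs))

# Illustrative project history (KSLOC, actual person-months) - not real data
history = [(12.0, 38.0), (45.0, 160.0), (7.5, 21.0)]
a = recalibrate_a(history)
print(f"calibrated a = {a:.2f}")
print(f"estimate for a 20 KSLOC project: {effort_person_months(20.0, a):.1f} person-months")
```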
Scientific Software: How to Find What You Need and Get What You Pay for.
ERIC Educational Resources Information Center
Gabaldon, Diana J.
1984-01-01
Provides examples of software for the sciences, including: packages for pathology/toxicology laboratories (costing over $15,000), DNA sequencing, and data acquisition/analysis; general-purpose software for scientific uses; and "custom" packages, including a program to maintain a listing of "Escherichia coli" strains and a…
Data-driven traffic impact assessment tool for work zones.
DOT National Transportation Integrated Search
2017-03-01
Traditionally, traffic impacts of work zones have been assessed using planning software such as Quick Zone, custom spreadsheets, and others. These software programs generate delay, queuing, and other mobility measures but are difficult to validate du...
NASA Astrophysics Data System (ADS)
Boyle, P.; Chen, D.; Christ, N.; Clark, M.; Cohen, S.; Cristian, C.; Dong, Z.; Gara, A.; Joo, B.; Jung, C.; Kim, C.; Levkova, L.; Liao, X.; Liu, G.; Li, S.; Lin, H.; Mawhinney, R.; Ohta, S.; Petrov, K.; Wettig, T.; Yamaguchi, A.
2005-03-01
The QCDOC project has developed a supercomputer optimised for the needs of Lattice QCD simulations. It provides a very competitive price to sustained performance ratio of around $1 USD per sustained Megaflop/s in combination with outstanding scalability. Thus very large systems delivering over 5 TFlop/s of performance on the evolution of a single lattice are possible. Large prototypes have been built and are functioning correctly. The software environment raises the state of the art in such custom supercomputers. It is based on a lean custom node operating system that eliminates many unnecessary overheads that plague other systems. Despite the custom nature, the operating system implements a standards compliant UNIX-like programming environment easing the porting of software from other systems. The SciDAC QMP interface adds internode communication in a fashion that provides a uniform cross-platform programming environment.
Khomtchouk, Bohdan B; Van Booven, Derek J; Wahlestedt, Claes
2014-01-01
The graphical visualization of gene expression data using heatmaps has become an integral component of modern-day medical research. Heatmaps are used extensively to plot quantitative differences in gene expression levels, such as those measured with RNAseq and microarray experiments, to provide qualitative large-scale views of the transcriptomic landscape. Creating high-quality heatmaps is a computationally intensive task, often requiring considerable programming experience, particularly for customizing features to a specific dataset at hand. Software to create publication-quality heatmaps is developed with the R programming language, C++ programming language, and OpenGL application programming interface (API) to create industry-grade high performance graphics. We create a graphical user interface (GUI) software package called HeatmapGenerator for Windows OS and Mac OS X as an intuitive, user-friendly alternative for researchers with minimal prior coding experience to allow them to create publication-quality heatmaps using R graphics without sacrificing their desired level of customization. The simplicity of HeatmapGenerator is that it only requires the user to upload a preformatted input file and download the publicly available R software language, among a few other operating system-specific requirements. Advanced features such as color, text labels, scaling, legend construction, and even database storage can be easily customized with no prior programming knowledge. We provide an intuitive and user-friendly software package, HeatmapGenerator, to create high-quality, customizable heatmaps generated using the high-resolution color graphics capabilities of R. The software is available for Microsoft Windows and Apple Mac OS X. HeatmapGenerator is released under the GNU General Public License and publicly available at: http://sourceforge.net/projects/heatmapgenerator/. The Mac OS X direct download is available at: http://sourceforge.net/projects/heatmapgenerator/files/HeatmapGenerator_MAC_OSX.tar.gz/download. The Windows OS direct download is available at: http://sourceforge.net/projects/heatmapgenerator/files/HeatmapGenerator_WINDOWS.zip/download.
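HeatmapGenerator itself drives R graphics; the tool's code is not shown here. As a language-neutral illustration of the customizations the abstract lists (color, text labels, scaling, legend construction), the following Python/matplotlib sketch produces a small heatmap from an illustrative genes-by-samples matrix. The data, labels, and output file name are assumptions made for the example.

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative expression matrix (genes x samples); the real tool reads a
# preformatted input file as described in the abstract.
genes = ["GENE_A", "GENE_B", "GENE_C", "GENE_D"]
samples = ["ctrl_1", "ctrl_2", "treat_1", "treat_2"]
expr = np.array([[2.1, 1.9, 6.4, 6.0],
                 [5.5, 5.2, 1.1, 0.9],
                 [3.3, 3.0, 3.1, 3.4],
                 [0.5, 0.8, 4.2, 4.6]])

# Row-wise z-score scaling, one of the customizations mentioned in the abstract
scaled = (expr - expr.mean(axis=1, keepdims=True)) / expr.std(axis=1, keepdims=True)

fig, ax = plt.subplots(figsize=(4, 3))
im = ax.imshow(scaled, cmap="RdBu_r", aspect="auto")   # color customization
ax.set_xticks(range(len(samples)))
ax.set_xticklabels(samples)                            # text labels
ax.set_yticks(range(len(genes)))
ax.set_yticklabels(genes)
fig.colorbar(im, ax=ax, label="row z-score")           # legend construction
fig.tight_layout()
fig.savefig("heatmap.png", dpi=300)                    # high-resolution output
```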
Applying program comprehension techniques to improve software inspections
NASA Technical Reports Server (NTRS)
Rifkin, Stan; Deimel, Lionel
1994-01-01
Software inspections are widely regarded as a cost-effective mechanism for removing defects in software, though performing them does not always reduce the number of customer-discovered defects. We present a case study in which an attempt was made to reduce such defects through inspection training that introduced program comprehension ideas. The training was designed to address the problem of understanding the artifact being reviewed, as well as other perceived deficiencies of the inspection process itself. Measures, both formal and informal, suggest that explicit training in program understanding may improve inspection effectiveness.
A Computer Program for the Management of Prescription-Based Problems.
ERIC Educational Resources Information Center
Cotter, Patricia M.; Gumtow, Robert H.
1991-01-01
The Prescription Management Program, a software program using Apple's HyperCard on a Macintosh, was developed to simplify the creation, storage, modification, and general management of prescription-based problems. Pharmacy instructors may customize the program to serve their individual teaching needs. (Author/DB)
ERIC Educational Resources Information Center
Association of Data Processing Service Organizations, Arlington, VA.
The problem of unauthorized computer software duplication impedes the production of upgraded products by software developers, who find thousands of illegal computer program copies have been made by customers who either innocently believe they are doing nothing wrong, or simply choose to ignore the law. Unauthorized duplication and use of software…
Software engineering project management - A state-of-the-art report
NASA Technical Reports Server (NTRS)
Thayer, R. H.; Lehman, J. H.
1977-01-01
The management of software engineering projects in the aerospace industry was investigated. The survey assessed such features as contract type, specification preparation techniques, software documentation required by customers, planning and cost-estimating, quality control, the use of advanced program practices, software tools and test procedures, the education levels of project managers, programmers and analysts, work assignment, automatic software monitoring capabilities, design and coding reviews, production times, success rates, and organizational structure of the projects.
12 CFR 517.1 - Purpose and scope.
Code of Federal Regulations, 2010 CFR
2010-01-01
..., expert witnesses, customized training, relocation services, information systems technology (computer systems, database management, software and office automation), or micrographic services; or in support of...-Owned Businesses Outreach Program (Outreach Program) is to ensure that firms owned and operated by...
12 CFR 517.1 - Purpose and scope.
Code of Federal Regulations, 2012 CFR
2012-01-01
..., expert witnesses, customized training, relocation services, information systems technology (computer systems, database management, software and office automation), or micrographic services; or in support of...-Owned Businesses Outreach Program (Outreach Program) is to ensure that firms owned and operated by...
12 CFR 517.1 - Purpose and scope.
Code of Federal Regulations, 2011 CFR
2011-01-01
..., expert witnesses, customized training, relocation services, information systems technology (computer systems, database management, software and office automation), or micrographic services; or in support of...-Owned Businesses Outreach Program (Outreach Program) is to ensure that firms owned and operated by...
PARIS II: DESIGNING GREENER SOLVENTS
PARIS II (the program for assisting the replacement of industrial solvents, version II), developed at the USEPA, is a unique software tool that can be used for customizing the design of replacement solvents and for the formulation of new solvents. This program helps users avoid ...
Pedrami, Farnoush; Asenso, Pamela; Devi, Sachin
2016-08-25
Objective. To identify trends in pharmacy education during last two decades using text mining. Methods. Articles published in the American Journal of Pharmaceutical Education (AJPE) in the past two decades were compiled in a database. Custom text analytics software was written using Visual Basic programming language in the Visual Basic for Applications (VBA) editor of Excel 2007. Frequency of words appearing in article titles was calculated using the custom VBA software. Data were analyzed to identify the emerging trends in pharmacy education. Results. Three educational trends emerged: active learning, interprofessional, and cultural competency. Conclusion. The text analytics program successfully identified trends in article topics and may be a useful compass to predict the future course of pharmacy education.
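The study's text analytics were written in VBA inside Excel; that code is not reproduced here. The core operation it describes, counting word frequencies across article titles, is shown below as a minimal Python sketch. The CSV file name, column name, and stop-word list are illustrative assumptions.

```python
from collections import Counter
import csv
import re

STOP_WORDS = {"the", "a", "an", "of", "in", "for", "and", "to", "on", "with"}

def title_word_frequencies(csv_path, title_column="title"):
    """Count how often each word appears in article titles.
    csv_path and title_column are illustrative; the original study compiled
    AJPE article titles into an Excel database."""
    counts = Counter()
    with open(csv_path, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            words = re.findall(r"[a-z]+", row[title_column].lower())
            counts.update(w for w in words if w not in STOP_WORDS)
    return counts

if __name__ == "__main__":
    freq = title_word_frequencies("ajpe_titles.csv")
    for word, n in freq.most_common(20):
        print(f"{word:20s} {n}")
```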
NASA Technical Reports Server (NTRS)
1991-01-01
The NASA-developed Artificial Satellite Analysis Program (ASAP), was purchased from COSMIC and used to enhance OPNET, a program for developing simulations of communications satellite networks. OPNET's developer, MIL3, applied ASAP to support predictions of low Earth orbit, enabling the company to offer satellite modeling capability to customers earlier than if they had to actually develop the program.
Pratt and Whitney Overview and Advanced Health Management Program
NASA Technical Reports Server (NTRS)
Inabinett, Calvin
2008-01-01
Hardware Development Activity: Design and Test Custom Multi-layer Circuit Boards for use in the Fault Emulation Unit; Logic design performed using VHDL; Layout power system for lab hardware; Work lab issues with software developers and software testers; Interface with Engine Systems personnel with performance of Engine hardware components; Perform off nominal testing with new engine hardware.
COSMIC monthly progress report
NASA Technical Reports Server (NTRS)
1994-01-01
Activities of the Computer Software Management and Information Center (COSMIC) are summarized for the month of April 1994. Tables showing the current inventory of programs available from COSMIC are presented and program processing and evaluation activities are summarized. Five articles were prepared for publication in the NASA Tech Brief Journal. These articles (included in this report) describe the following software items: GAP 1.0 - Groove Analysis Program, Version 1.0; SUBTRANS - Subband/Transform MATLAB Functions for Image Processing; CSDM - COLD-SAT Dynamic Model; CASRE - Computer Aided Software Reliability Estimation; and XOPPS - OEL Project Planner/Scheduler Tool. Activities in the areas of marketing, customer service, benefits identification, maintenance and support, and disseminations are also described along with a budget summary.
NASA Technical Reports Server (NTRS)
2003-01-01
When NASA needed a real-time, online database system capable of tracking documentation changes in its propulsion test facilities, engineers at Stennis Space Center joined with ECT International, of Brookfield, Wisconsin, to create a solution. Through NASA's Dual-Use Program, ECT developed Exdata, a software program that works within the company's existing Promise software. Exdata not only satisfied NASA's requirements, but also expanded ECT's commercial product line. Promise, ECT's primary product, is an intelligent software program with specialized functions for designing and documenting electrical control systems. An add-on to AutoCAD software, Promise generates control system schematics, panel layouts, bills of material, wire lists, and terminal plans. The drawing functions include symbol libraries, macros, and automatic line breaking. Primary Promise customers include manufacturing companies, utilities, and other organizations with complex processes to control.
Another Program For Generating Interactive Graphics
NASA Technical Reports Server (NTRS)
Costenbader, Jay; Moleski, Walt; Szczur, Martha; Howell, David; Engelberg, Norm; Li, Tin P.; Misra, Dharitri; Miller, Philip; Neve, Leif; Wolf, Karl;
1991-01-01
VAX/Ultrix version of Transportable Applications Environment Plus (TAE+) computer program provides integrated, portable software environment for developing and running interactive window, text, and graphical-object-based application software systems. Enables programmer or nonprogrammer to construct easily custom software interface between user and application program and to move resulting interface program and its application program to different computers. When used throughout company for wide range of applications, makes both application program and computer seem transparent, with noticeable improvements in learning curve. Available in form suitable for following six different groups of computers: DEC VAX station and other VMS VAX computers, Macintosh II computers running AUX, Apollo Domain Series 3000, DEC VAX and reduced-instruction-set-computer workstations running Ultrix, Sun 3- and 4-series workstations running Sun OS and IBM RT/PC's and PS/2 computers running AIX, and HP 9000 S
Comparative study of age estimation using dentinal translucency by digital and conventional methods.
Bommannavar, Sushma; Kulkarni, Meena
2015-01-01
Estimating age using the dentition plays a significant role in identification of the individual in forensic cases. Teeth are one of the most durable and strongest structures in the human body. The morphology and arrangement of teeth vary from person-to-person and is unique to an individual as are the fingerprints. Therefore, the use of dentition is the method of choice in the identification of the unknown. Root dentin translucency is considered to be one of the best parameters for dental age estimation. Traditionally, root dentin translucency was measured using calipers. Recently, the use of custom built software programs have been proposed for the same. The present study describes a method to measure root dentin translucency on sectioned teeth using a custom built software program Adobe Photoshop 7.0 version (Adobe system Inc, Mountain View California). A total of 50 single rooted teeth were sectioned longitudinally to derive a 0.25 mm uniform thickness and the root dentin translucency was measured using digital and caliper methods and compared. The Gustafson's morphohistologic approach is used in this study. Correlation coefficients of translucency measurements to age were statistically significant for both the methods (P < 0.125) and linear regression equations derived from both methods revealed better ability of the digital method to assess age. The custom built software program used in the present study is commercially available and widely used image editing software. Furthermore, this method is easy to use and less time consuming. The measurements obtained using this method are more precise and thus help in more accurate age estimation. Considering these benefits, the present study recommends the use of digital method to assess translucency for age estimation.
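The study derives linear regression equations relating root dentin translucency to age; its actual measurements and coefficients are in the paper and are not reproduced here. The following Python sketch, using made-up numbers, shows the kind of least-squares fit and age prediction described.

```python
import numpy as np

# Illustrative (translucency length in mm, known age in years) pairs;
# the study's real measurements and regression coefficients are not shown here.
translucency = np.array([3.1, 4.0, 5.2, 6.3, 7.1, 8.0])
age = np.array([28.0, 35.0, 41.0, 50.0, 57.0, 63.0])

# Ordinary least-squares fit: age = slope * translucency + intercept
slope, intercept = np.polyfit(translucency, age, deg=1)
r = np.corrcoef(translucency, age)[0, 1]
print(f"age = {slope:.2f} * translucency + {intercept:.2f}   (r = {r:.3f})")

# Estimate age for a new specimen with 5.8 mm of root dentin translucency
print(f"estimated age: {slope * 5.8 + intercept:.1f} years")
```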
Comparative study of age estimation using dentinal translucency by digital and conventional methods
Bommannavar, Sushma; Kulkarni, Meena
2015-01-01
Introduction: Estimating age using the dentition plays a significant role in identification of the individual in forensic cases. Teeth are one of the most durable and strongest structures in the human body. The morphology and arrangement of teeth vary from person-to-person and is unique to an individual as are the fingerprints. Therefore, the use of dentition is the method of choice in the identification of the unknown. Root dentin translucency is considered to be one of the best parameters for dental age estimation. Traditionally, root dentin translucency was measured using calipers. Recently, the use of custom built software programs have been proposed for the same. Objectives: The present study describes a method to measure root dentin translucency on sectioned teeth using a custom built software program Adobe Photoshop 7.0 version (Adobe system Inc, Mountain View California). Materials and Methods: A total of 50 single rooted teeth were sectioned longitudinally to derive a 0.25 mm uniform thickness and the root dentin translucency was measured using digital and caliper methods and compared. The Gustafson's morphohistologic approach is used in this study. Results: Correlation coefficients of translucency measurements to age were statistically significant for both the methods (P < 0.125) and linear regression equations derived from both methods revealed better ability of the digital method to assess age. Conclusion: The custom built software program used in the present study is commercially available and widely used image editing software. Furthermore, this method is easy to use and less time consuming. The measurements obtained using this method are more precise and thus help in more accurate age estimation. Considering these benefits, the present study recommends the use of digital method to assess translucency for age estimation. PMID:25709325
Advanced software development workstation project ACCESS user's guide
NASA Technical Reports Server (NTRS)
1990-01-01
ACCESS is a knowledge based software information system designed to assist the user in modifying retrieved software to satisfy user specifications. A user's guide is presented for the knowledge engineer who wishes to create for ACCESS a knowledge base consisting of representations of objects in some software system. This knowledge is accessible to an end user who wishes to use the catalogued software objects to create a new application program or an input stream for an existing system. The application specific portion of an ACCESS knowledge base consists of a taxonomy of object classes, as well as instances of these classes. All objects in the knowledge base are stored in an associative memory. ACCESS provides a standard interface for the end user to browse and modify objects. In addition, the interface can be customized by the addition of application specific data entry forms and by specification of display order for the taxonomy and object attributes. These customization options are described.
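The abstract describes a knowledge base holding a taxonomy of software object classes plus instances, browsable through a standard interface. The Python sketch below is a minimal illustration of that kind of structure, not ACCESS's actual schema; the class names, attributes, and instances are invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class ObjectClass:
    """A node in a taxonomy of catalogued software objects."""
    name: str
    parent: "ObjectClass | None" = None
    attributes: dict = field(default_factory=dict)
    instances: list = field(default_factory=list)
    children: list = field(default_factory=list)

    def add_child(self, name, **attributes):
        child = ObjectClass(name, parent=self, attributes=attributes)
        self.children.append(child)
        return child

    def browse(self, depth=0):
        """Print the taxonomy the way a browse interface might list it."""
        print("  " * depth + f"{self.name}  {self.attributes}")
        for inst in self.instances:
            print("  " * (depth + 1) + f"- instance: {inst}")
        for child in self.children:
            child.browse(depth + 1)

# Illustrative taxonomy: catalogued software objects grouped by purpose
root = ObjectClass("software_object")
numeric = root.add_child("numeric_routine", language="Fortran")
numeric.instances.append({"name": "matrix_invert", "inputs": ["A"], "outputs": ["A_inv"]})
io = root.add_child("input_stream_template", format="namelist")
io.instances.append({"name": "trajectory_case", "fields": ["t0", "tf", "step"]})
root.browse()
```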
Process description language: an experiment in robust programming for manufacturing systems
NASA Astrophysics Data System (ADS)
Spooner, Natalie R.; Creak, G. Alan
1998-10-01
Maintaining stable, robust, and consistent software is difficult in the face of the increasing rate of change of customers' preferences, materials, manufacturing techniques, computer equipment, and other characteristic features of manufacturing systems. It is argued that software is commonly difficult to keep up to date because many of the implications of these changing features on software details are obscure. A possible solution is to use a software generation system in which the transformation of system properties into system software is made explicit. The proposed generation system stores the system properties, such as machine properties, product properties, and information on manufacturing techniques, in databases. As a result, this information, on which system control is based, can also be made available to other programs. In particular, artificial intelligence programs, such as fault diagnosis programs, can benefit from using the same information as the control system, rather than a separate database which must be developed and maintained separately to ensure consistency. Experience in developing a simplified model of such a system is presented.
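The paper's generation system and its databases are not specified in the abstract, so the Python sketch below only illustrates the general idea of property-driven generation: machine and product properties held in shared records are turned into a controller configuration rather than being hard-coded. The property names and values are assumptions made for the example.

```python
# Illustrative machine/product properties of the kind the paper stores in
# shared databases (names and values are made up for this sketch).
machine = {"name": "press_3", "max_force_kN": 250, "cycle_time_s": 4.5}
product = {"name": "bracket_A", "required_force_kN": 180, "tolerance_mm": 0.05}

def generate_controller_config(machine, product):
    """Derive control settings from the property records instead of
    hard-coding them, so a change in the database propagates to the
    control software (and to any diagnosis program reading the same data)."""
    if product["required_force_kN"] > machine["max_force_kN"]:
        raise ValueError("product cannot be made on this machine")
    return {
        "machine": machine["name"],
        "product": product["name"],
        "force_setpoint_kN": product["required_force_kN"],
        "force_limit_kN": machine["max_force_kN"],
        "cycle_time_s": machine["cycle_time_s"],
        "position_tolerance_mm": product["tolerance_mm"],
    }

print(generate_controller_config(machine, product))
```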
Program For Generating Interactive Displays
NASA Technical Reports Server (NTRS)
Costenbader, Jay; Moleski, Walt; Szczur, Martha; Howell, David; Engelberg, Norm; Li, Tin P.; Misra, Dharitri; Miller, Philip; Neve, Leif; Wolf, Karl;
1991-01-01
Sun/Unix version of Transportable Applications Environment Plus (TAE+) computer program provides integrated, portable software environment for developing and running interactive window, text, and graphical-object-based application software systems. Enables programmer or nonprogrammer to construct easily custom software interface between user and application program and to move resulting interface program and its application program to different computers. Plus viewed as productivity tool for application developers and application end users, who benefit from resultant consistent and well-designed user interface sheltering them from intricacies of computer. Available in form suitable for following six different groups of computers: DEC VAX station and other VMS VAX computers, Macintosh II computers running AUX, Apollo Domain Series 3000, DEC VAX and reduced-instruction-set-computer workstations running Ultrix, Sun 3- and 4-series workstations running Sun OS and IBM RT/PC and PS/2 compute
DockoMatic 2.0: high throughput inverse virtual screening and homology modeling.
Bullock, Casey; Cornia, Nic; Jacob, Reed; Remm, Andrew; Peavey, Thomas; Weekes, Ken; Mallory, Chris; Oxford, Julia T; McDougal, Owen M; Andersen, Timothy L
2013-08-26
DockoMatic is a free and open source application that unifies a suite of software programs within a user-friendly graphical user interface (GUI) to facilitate molecular docking experiments. Here we describe the release of DockoMatic 2.0; significant software advances include the ability to (1) conduct high throughput inverse virtual screening (IVS); (2) construct 3D homology models; and (3) customize the user interface. Users can now efficiently setup, start, and manage IVS experiments through the DockoMatic GUI by specifying receptor(s), ligand(s), grid parameter file(s), and docking engine (either AutoDock or AutoDock Vina). DockoMatic automatically generates the needed experiment input files and output directories and allows the user to manage and monitor job progress. Upon job completion, a summary of results is generated by Dockomatic to facilitate interpretation by the user. DockoMatic functionality has also been expanded to facilitate the construction of 3D protein homology models using the Timely Integrated Modeler (TIM) wizard. The wizard TIM provides an interface that accesses the basic local alignment search tool (BLAST) and MODELER programs and guides the user through the necessary steps to easily and efficiently create 3D homology models for biomacromolecular structures. The DockoMatic GUI can be customized by the user, and the software design makes it relatively easy to integrate additional docking engines, scoring functions, or third party programs. DockoMatic is a free comprehensive molecular docking software program for all levels of scientists in both research and education.
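DockoMatic automates the generation of input files and output directories for receptor-ligand docking runs; the sketch below is not DockoMatic code, but a minimal Python illustration of the batch setup an inverse virtual screening experiment needs, writing one AutoDock Vina-style configuration file per receptor-ligand pair. The file names and box parameters are illustrative assumptions.

```python
import itertools
from pathlib import Path

receptors = ["receptor_A.pdbqt", "receptor_B.pdbqt"]   # illustrative file names
ligands = ["ligand_01.pdbqt", "ligand_02.pdbqt"]
box = {"center_x": 10.0, "center_y": 12.5, "center_z": -3.0,
       "size_x": 20.0, "size_y": 20.0, "size_z": 20.0}

def write_vina_configs(receptors, ligands, box, out_dir="ivs_jobs"):
    """Write one AutoDock Vina-style config file per receptor-ligand pair,
    mirroring the kind of batch setup an IVS experiment requires."""
    jobs = []
    for rec, lig in itertools.product(receptors, ligands):
        job_dir = Path(out_dir) / f"{Path(rec).stem}__{Path(lig).stem}"
        job_dir.mkdir(parents=True, exist_ok=True)
        lines = [f"receptor = {rec}", f"ligand = {lig}"]
        lines += [f"{key} = {value}" for key, value in box.items()]
        lines += [f"out = {job_dir / 'poses.pdbqt'}", "exhaustiveness = 8"]
        (job_dir / "conf.txt").write_text("\n".join(lines) + "\n")
        jobs.append(job_dir)
    return jobs

for job in write_vina_configs(receptors, ligands, box):
    print("prepared", job)
```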
Framework for Development of Object-Oriented Software
NASA Technical Reports Server (NTRS)
Perez-Poveda, Gus; Ciavarella, Tony; Nieten, Dan
2004-01-01
The Real-Time Control (RTC) Application Framework is a high-level software framework written in C++ that supports the rapid design and implementation of object-oriented application programs. This framework provides built-in functionality that solves common software development problems within distributed client-server, multi-threaded, and embedded programming environments. When using the RTC Framework to develop software for a specific domain, designers and implementers can focus entirely on the details of the domain-specific software rather than on creating custom solutions, utilities, and frameworks for the complexities of the programming environment. The RTC Framework was originally developed as part of a Space Shuttle Launch Processing System (LPS) replacement project called Checkout and Launch Control System (CLCS). As a result of the framework's development, CLCS software development time was reduced by 66 percent. The framework is generic enough for developing applications outside of the launch-processing system domain. Other applicable high-level domains include command and control systems and simulation/training systems.
ERIC Educational Resources Information Center
Roberson, E. Wayne; Glowinski, Debra J.
The Computer Assisted Diagnostic Prescriptive Program (CADPP) is a customized databased curriculum management system which permits the user to load the following into a filing/retrieval software system: (1) learning characteristics of individual students (e.g., age, instructional level, learning modality); (2) skill-oriented characteristics of…
Flexible control techniques for a lunar base
NASA Technical Reports Server (NTRS)
Kraus, Thomas W.
1992-01-01
The fundamental elements found in every terrestrial control system can be employed in all lunar applications. These elements include sensors which measure physical properties, controllers which acquire sensor data and calculate a control response, and actuators which apply the control output to the process. The unique characteristics of the lunar environment will certainly require the development of new control system technology. However, weightlessness, harsh atmospheric conditions, temperature extremes, and radiation hazards will most significantly impact the design of sensors and actuators. The controller and associated control algorithms, which are the most complex element of any control system, can be derived in their entirety from existing technology. Lunar process control applications -- ranging from small-scale research projects to full-scale processing plants -- will benefit greatly from the controller advances being developed today. In particular, new software technology aimed at commercial process monitoring and control applications will almost completely eliminate the need for custom programs and the lengthy development and testing cycle they require. The applicability of existing industrial software to lunar applications has other significant advantages in addition to cost and quality. This software is designed to run on standard hardware platforms and takes advantage of existing LAN and telecommunications technology. Further, in order to exploit the existing commercial market, the software is being designed to be implemented by users of all skill levels -- typically users who are familiar with their process, but not necessarily with software or control theory. This means that specialized technical support personnel will not need to be on-hand, and the associated costs are eliminated. Finally, the latest industrial software designed for the commercial market is extremely flexible, in order to fit the requirements of many types of processing applications with little or no customization. This means that lunar process control projects will not be delayed by unforeseen problems or last minute process modifications. The software will include all of the tools needed to adapt to virtually any changes. In contrast to other space programs which required the development of tremendous amounts of custom software, lunar-based processing facilities will benefit from the use of existing software technology which is being proven in commercial applications on Earth.
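The passage describes the basic control loop found in every such system: a controller acquires sensor data and calculates a control response for an actuator. As a concrete, hedged illustration of that loop (the gains, setpoint, and toy process below are invented and not from the paper), here is a minimal discrete PID controller in Python.

```python
class PID:
    """Minimal discrete PID controller: reads a measurement, returns an
    actuator command that drives the process toward the setpoint."""
    def __init__(self, kp, ki, kd, setpoint, dt):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint, self.dt = setpoint, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, measurement):
        error = self.setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Illustrative process: a habitat volume warming toward a 21 degC setpoint.
pid = PID(kp=2.0, ki=0.5, kd=0.1, setpoint=21.0, dt=1.0)
temperature = 15.0
for step in range(10):
    heater_power = max(0.0, pid.update(temperature))                  # actuator command
    temperature += 0.05 * heater_power - 0.02 * (temperature - 10.0)  # toy process model
    print(f"t={step:2d}s  power={heater_power:5.2f}  temp={temperature:5.2f} degC")
```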
Cost-Sensitive Radial Basis Function Neural Network Classifier for Software Defect Prediction
Venkatesan, R.
2016-01-01
Effective prediction of defect-prone software modules will enable software developers to achieve efficient allocation of resources and to concentrate on quality assurance activities. The software development life cycle basically includes design, analysis, implementation, testing, and release phases. Generally, software testing is a critical task in the software development process: it saves time and budget by detecting defects as early as possible and delivering a defect-free product to the customers. This testing phase should be carefully operated in an effective manner to release a defect-free (bug-free) software product to the customers. In order to improve the software testing process, fault prediction methods identify the software parts that are most likely to be defect-prone. This paper proposes a prediction approach based on a conventional radial basis function neural network (RBFNN) and the novel adaptive dimensional biogeography based optimization (ADBBO) model. The developed ADBBO-based RBFNN model is tested with five publicly available datasets from the NASA data program repository. The computed results prove the effectiveness of the proposed ADBBO-RBFNN classifier approach with respect to the considered metrics in comparison with the early predictors available in the literature for the same datasets. PMID:27738649
Cost-Sensitive Radial Basis Function Neural Network Classifier for Software Defect Prediction.
Kumudha, P; Venkatesan, R
Effective prediction of defect-prone software modules will enable software developers to achieve efficient allocation of resources and to concentrate on quality assurance activities. The software development life cycle basically includes design, analysis, implementation, testing, and release phases. Generally, software testing is a critical task in the software development process: it saves time and budget by detecting defects as early as possible and delivering a defect-free product to the customers. This testing phase should be carefully operated in an effective manner to release a defect-free (bug-free) software product to the customers. In order to improve the software testing process, fault prediction methods identify the software parts that are most likely to be defect-prone. This paper proposes a prediction approach based on a conventional radial basis function neural network (RBFNN) and the novel adaptive dimensional biogeography based optimization (ADBBO) model. The developed ADBBO-based RBFNN model is tested with five publicly available datasets from the NASA data program repository. The computed results prove the effectiveness of the proposed ADBBO-RBFNN classifier approach with respect to the considered metrics in comparison with the early predictors available in the literature for the same datasets.
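The paper's ADBBO optimizer is novel and is not reproduced here. As a hedged sketch of the model family being tuned, the following Python example builds a plain RBF-network classifier (randomly chosen centers plus a least-squares output layer) on synthetic two-feature "module metrics" data; every detail below is an illustrative assumption, not the authors' method.

```python
import numpy as np

def rbf_features(X, centers, gamma):
    """Gaussian RBF activations of each sample with respect to each center."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def train_rbfnn(X, y, n_centers=8, gamma=1.0, seed=0):
    """Plain RBFNN: a random subset of samples as centers, then a linear
    output layer solved by least squares (ADBBO would instead tune the
    centers and widths; that optimizer is not reproduced here)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=n_centers, replace=False)]
    H = rbf_features(X, centers, gamma)
    weights, *_ = np.linalg.lstsq(H, y, rcond=None)
    return centers, weights

def predict(X, centers, weights, gamma=1.0):
    return (rbf_features(X, centers, gamma) @ weights > 0.5).astype(int)

# Synthetic "software module metrics" (2 features) with a defect label
rng = np.random.default_rng(1)
clean = rng.normal(loc=[0.0, 0.0], scale=0.6, size=(100, 2))
buggy = rng.normal(loc=[2.0, 2.0], scale=0.6, size=(100, 2))
X = np.vstack([clean, buggy])
y = np.array([0] * 100 + [1] * 100)

centers, weights = train_rbfnn(X, y)
accuracy = (predict(X, centers, weights) == y).mean()
print(f"training accuracy on synthetic data: {accuracy:.2f}")
```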
PScan 1.0: flexible software framework for polygon based multiphoton microscopy
NASA Astrophysics Data System (ADS)
Li, Yongxiao; Lee, Woei Ming
2016-12-01
Multiphoton laser scanning microscopes exhibit highly localized nonlinear optical excitation and are powerful instruments for in-vivo deep tissue imaging. Customized multiphoton microscopy has a significantly superior performance for in-vivo imaging because of precise control over the scanning and detection system. To date, there have been several flexible software platforms catered to custom-built microscopy systems, e.g., ScanImage, HelioScan, and MicroManager, that perform at imaging speeds of 30-100 fps. In this paper, we describe a flexible software framework for high speed imaging systems capable of operating from 5 fps to 1600 fps. The software is based on the MATLAB image processing toolbox. It has the capability to communicate directly with a high performing imaging card (Matrox Solios eA/XA), thus retaining high speed acquisition. The program is also designed to communicate with LabVIEW and Fiji for instrument control and image processing. PScan 1.0 can handle high imaging rates and contains sufficient flexibility for users to adapt to their high speed imaging systems.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-20
... Commissioner of CBP with authority to conduct limited test programs or procedures designed to evaluate planned.... Specifically, CBP is looking for test participants to include: 2-3 Ocean Carriers. At least one must be filing... their software ready to test with CBP once CBP begins the certification process. CBP will post the...
ERIC Educational Resources Information Center
Peterson, Dale
1984-01-01
Discusses the works of Darcy Gerbarg, Ruth Leavitt, David Em, Duane Palyka, and Harold Cohen, visual artists who work with computers to create art works by relying on standard hardware/software tools, using custom tools created for nonartistic tasks, manipulating images at the programing level, and programing creativity into computers themselves.…
Rigby, Darrell K; Reichheld, Frederick F; Schefter, Phil
2002-02-01
Customer relationship management is one of the hottest management tools today. But more than half of all CRM initiatives fail to produce the anticipated results. Why? And what can companies do to reverse that negative trend? The authors--three senior Bain consultants--have spent the past ten years analyzing customer-loyalty initiatives, both successful and unsuccessful, at more than 200 companies in a wide range of industries. They've found that CRM backfires in part because executives don't understand what they are implementing, let alone how much it will cost or how long it will take. The authors' research unveiled four common pitfalls that managers stumble into when trying to implement CRM. Each pitfall is a consequence of a single flawed assumption--that CRM is software that will automatically manage customer relationships. It isn't. Rather, CRM is the creation of customer strategies and processes to build customer loyalty, which are then supported by the technology. This article looks at best practices in CRM at several companies, including the New York Times Company, Square D, GE Capital, Grand Expeditions, and BMC Software. It provides an intellectual framework for any company that wants to start a CRM program or turn around a failing one.
DockoMatic 2.0: High Throughput Inverse Virtual Screening and Homology Modeling
Bullock, Casey; Cornia, Nic; Jacob, Reed; Remm, Andrew; Peavey, Thomas; Weekes, Ken; Mallory, Chris; Oxford, Julia T.; McDougal, Owen M.; Andersen, Timothy L.
2013-01-01
DockoMatic is a free and open source application that unifies a suite of software programs within a user-friendly Graphical User Interface (GUI) to facilitate molecular docking experiments. Here we describe the release of DockoMatic 2.0; significant software advances include the ability to: (1) conduct high throughput Inverse Virtual Screening (IVS); (2) construct 3D homology models; and (3) customize the user interface. Users can now efficiently setup, start, and manage IVS experiments through the DockoMatic GUI by specifying a receptor(s), ligand(s), grid parameter file(s), and docking engine (either AutoDock or AutoDock Vina). DockoMatic automatically generates the needed experiment input files and output directories, and allows the user to manage and monitor job progress. Upon job completion, a summary of results is generated by Dockomatic to facilitate interpretation by the user. DockoMatic functionality has also been expanded to facilitate the construction of 3D protein homology models using the Timely Integrated Modeler (TIM) wizard. The wizard TIM provides an interface that accesses the basic local alignment search tool (BLAST) and MODELLER programs, and guides the user through the necessary steps to easily and efficiently create 3D homology models for biomacromolecular structures. The DockoMatic GUI can be customized by the user, and the software design makes it relatively easy to integrate additional docking engines, scoring functions, or third party programs. DockoMatic is a free comprehensive molecular docking software program for all levels of scientists in both research and education. PMID:23808933
Improved Air Combat Awareness; with AESA and Next-Generation Signal Processing
2002-09-01
Briefing-slide residue; recoverable topics only: a competence network covering building techniques, the software development environment, communication, computer architecture, modeling, real-time programming, and radar; processor notes on memory access (skewed load and store, 3.2 GB/s bandwidth) and performance (400 MFLOPS); and a runtime environment consisting of custom runtime routines, driver routines, and hardware.
Experiences with Extreme Programming
ERIC Educational Resources Information Center
Sherrell, Linda; Krishna, Bhagavathy; Velaga, Natasha; Vejandla, Pavan; Satharla, Mahesh
2010-01-01
Agile methodologies have become increasingly popular among software developers as evidenced by industrial participation at related conferences. The popularity of agile practices over traditional techniques partly stems from the fact that these practices provide for more customer involvement and better accommodate rapidly changing requirements,…
Designing Control System Application Software for Change
NASA Technical Reports Server (NTRS)
Boulanger, Richard
2001-01-01
The Unified Modeling Language (UML) was used to design the Environmental Systems Test Stand (ESTS) control system software. The UML was chosen for its ability to facilitate a clear dialog between software designer and customer, from which requirements are discovered and documented in a manner which transposes directly to program objects. Applying the UML to control system software design has resulted in a baseline set of documents from which change and the effort of that change can be accurately measured. As the Environmental Systems Test Stand evolves, accurate estimates of the time and effort required to change the control system software will be made. Accurate quantification of the cost of software change can be made before implementation, improving schedule and budget accuracy.
Program Aids Visualization Of Data
NASA Technical Reports Server (NTRS)
Truong, L. V.
1995-01-01
Living Color Frame System (LCFS) computer program developed to solve some problems that arise in connection with generation of real-time graphical displays of numerical data and of statuses of systems. Need for program like LCFS arises because computer graphics often applied for better understanding and interpretation of data under observation and these graphics become more complicated when animation required during run time. Eliminates need for custom graphical-display software for application programs. Written in Turbo C++.
NASA Technical Reports Server (NTRS)
2000-01-01
Software packages commercially marketed by Agri ImaGIS allow customers to analyze farm fields. Agri ImaGIS provides satellite images of farmland and agricultural views to US clients. The company approached NASA-MSU TechLink for access to technology that would improve the company's capabilities to deliver satellite images over the Internet. TechLink found that software with the desired functions had already been developed through NASA's Remote Sensing Database Program. Agri ImaGIS formed a partnership with the University of Minnesota group that allows the company to further develop the software to meet its Internet commerce needs.
2010-01-01
We present an extensible software model for the genotype and phenotype community, XGAP. Readers can download a standard XGAP (http://www.xgap.org) or auto-generate a custom version using MOLGENIS with programming interfaces to R-software and web-services or user interfaces for biologists. XGAP has simple load formats for any type of genotype, epigenotype, transcript, protein, metabolite or other phenotype data. Current functionality includes tools ranging from eQTL analysis in mouse to genome-wide association studies in humans. PMID:20214801
Swertz, Morris A; Velde, K Joeri van der; Tesson, Bruno M; Scheltema, Richard A; Arends, Danny; Vera, Gonzalo; Alberts, Rudi; Dijkstra, Martijn; Schofield, Paul; Schughart, Klaus; Hancock, John M; Smedley, Damian; Wolstencroft, Katy; Goble, Carole; de Brock, Engbert O; Jones, Andrew R; Parkinson, Helen E; Jansen, Ritsert C
2010-01-01
We present an extensible software model for the genotype and phenotype community, XGAP. Readers can download a standard XGAP (http://www.xgap.org) or auto-generate a custom version using MOLGENIS with programming interfaces to R-software and web-services or user interfaces for biologists. XGAP has simple load formats for any type of genotype, epigenotype, transcript, protein, metabolite or other phenotype data. Current functionality includes tools ranging from eQTL analysis in mouse to genome-wide association studies in humans.
Bibliographies without Tears: Bibliography-Managers Round-Up.
ERIC Educational Resources Information Center
Science Software Quarterly, 1984
1984-01-01
Reviews and compares "Sci-Mate,""Reference Manager," and "BIBLIOPHILE" software packages used for storage and retrieval tasks involving bibliographic data. Each program handles search tasks well; major differences are in the amount of flexibility in customizing the database structure, their import and export…
A Mobile Computing Solution for Collecting Functional Analysis Data on a Pocket PC
Jackson, James; Dixon, Mark R
2007-01-01
The present paper provides a task analysis for creating a computerized data system using a Pocket PC and Microsoft Visual Basic. With Visual Basic software and any handheld device running the Windows Mobile operating system, this task analysis will allow behavior analysts to program and customize their own functional analysis data-collection system. The program will allow the user to select the type of behavior to be recorded, choose between interval and frequency data collection, and summarize data for graphing and analysis. We also provide suggestions for customizing the data-collection system for idiosyncratic research and clinical needs. PMID:17624078
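The paper's task analysis targets Visual Basic on a Pocket PC; that code is not shown here. The recording logic it describes, choosing between frequency and partial-interval data collection, is language-independent, so the following Python sketch illustrates it. The class name, interval length, and session length are illustrative assumptions.

```python
import math

class BehaviorRecorder:
    """Minimal sketch of the two recording modes described: frequency
    (count of events) and partial-interval (did the behavior occur in
    each interval of fixed length?)."""
    def __init__(self, mode="frequency", interval_s=10, session_s=300):
        self.mode = mode
        self.interval_s = interval_s
        self.n_intervals = math.ceil(session_s / interval_s)
        self.count = 0
        self.intervals = [False] * self.n_intervals

    def record_event(self, t_seconds):
        """Called each time the observer taps the target-behavior button."""
        self.count += 1
        idx = min(int(t_seconds // self.interval_s), self.n_intervals - 1)
        self.intervals[idx] = True

    def summary(self):
        if self.mode == "frequency":
            return {"events": self.count}
        occurred = sum(self.intervals)
        return {"intervals_with_behavior": occurred,
                "percent_intervals": 100.0 * occurred / self.n_intervals}

rec = BehaviorRecorder(mode="interval", interval_s=10, session_s=120)
for t in (3, 14, 15, 88):          # illustrative tap times in seconds
    rec.record_event(t)
print(rec.summary())
```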
COSMIC monthly progress report
NASA Technical Reports Server (NTRS)
1994-01-01
Activities of the Computer Software Management and Information Center (COSMIC) are summarized for the month of January 1994. Tables showing the current inventory of programs available from COSMIC are presented and program processing and evaluation activities are discussed. Marketing and customer service activities in this period are presented as is the progress report of NASTRAN maintenance and support. Tables of disseminations and budget summary conclude the report.
ERIC Educational Resources Information Center
Enders, Craig K.
2005-01-01
The Bollen-Stine bootstrap can be used to correct for standard error and fit statistic bias that occurs in structural equation modeling (SEM) applications due to nonnormal data. The purpose of this article is to demonstrate the use of a custom SAS macro program that can be used to implement the Bollen-Stine bootstrap with existing SEM software.…
Dalmaijer, Edwin S; Mathôt, Sebastiaan; Van der Stigchel, Stefan
2014-12-01
The PyGaze toolbox is an open-source software package for Python, a high-level programming language. It is designed for creating eyetracking experiments in Python syntax with the least possible effort, and it offers programming ease and script readability without constraining functionality and flexibility. PyGaze can be used for visual and auditory stimulus presentation; for response collection via keyboard, mouse, joystick, and other external hardware; and for the online detection of eye movements using a custom algorithm. A wide range of eyetrackers of different brands (EyeLink, SMI, and Tobii systems) are supported. The novelty of PyGaze lies in providing an easy-to-use layer on top of the many different software libraries that are required for implementing eyetracking experiments. Essentially, PyGaze is a software bridge for eyetracking research.
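PyGaze detects eye movements online with its own custom algorithm, which is not reproduced here. As a hedged illustration of the general approach such toolboxes take, the Python sketch below flags saccades with a simple velocity threshold over sampled gaze positions; the sampling rate, threshold, and gaze trace are invented for the example.

```python
import math

def detect_saccades(samples, hz=500.0, velocity_threshold=900.0):
    """Flag samples whose point-to-point velocity (pixels/second) exceeds a
    threshold - a simple velocity-based detector, not PyGaze's own algorithm.
    samples: list of (x, y) gaze positions recorded at a fixed sampling rate."""
    saccade_flags = [False]
    for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
        velocity = math.hypot(x1 - x0, y1 - y0) * hz
        saccade_flags.append(velocity > velocity_threshold)
    return saccade_flags

# Illustrative trace: fixation, a fast jump of about 100 pixels, then fixation
trace = [(400, 300)] * 5 + [(450, 310), (500, 320)] + [(500, 320)] * 5
flags = detect_saccades(trace, hz=500.0)
print([i for i, moving in enumerate(flags) if moving])  # indices flagged as saccadic
```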
BASIC Data Manipulation And Display System (BDMADS)
NASA Technical Reports Server (NTRS)
Szuch, J. R.
1983-01-01
BDMADS, a BASIC Data Manipulation and Display System, is a collection of software programs that run on an Apple II Plus personal computer. BDMADS provides a user-friendly environment for the engineer in which to perform scientific data processing. The computer programs and their use are described. Jet engine performance calculations are used to illustrate the use of BDMADS. Source listings of the BDMADS programs are provided and should permit users to customize the programs for their particular applications.
TealLock 5.20 security software program for handheld devices.
Tahil, Fatimah A
2004-07-01
The TealLock has a simple graphic interface, and the program is user-friendly with well thought out options to customize security settings. The program is inexpensive and works seamlessly with the Palm OS platform's built-in basic Security application. The developer offers a 30-day free trial version and there is no downside to trying it to see if it meets your needs. It seems to be an effective security software program for psychiatrists who keep confidential and sensitive patient information on their PDAs. In keeping with HIPAA regulations, the TealLock bolsters security for protected health information stored on PDAs or other handheld devices by providing safeguards that address authentication, access control, encryption, and selected aspects of transmission.
SAO mission support software and data standards, version 1.0
NASA Technical Reports Server (NTRS)
Hsieh, P.
1993-01-01
This document defines the software developed by the SAO AXAF Mission Support (MS) Program and defines standards for the software development process and control of data products generated by the software. The SAO MS is tasked to develop and use software to perform a variety of functions in support of the AXAF mission. Software is developed by software engineers and scientists, and commercial off-the-shelf (COTS) software is used either directly or customized through the use of scripts to implement analysis procedures. Software controls real-time laboratory instruments, performs data archiving, displays data, and generates model predictions. Much software is used in the analysis of data to generate data products that are required by the AXAF project, for example, on-orbit mirror performance predictions or detailed characterization of the mirror reflection performance with energy.
Source Lines Counter (SLiC) Version 4.0
NASA Technical Reports Server (NTRS)
Monson, Erik W.; Smith, Kevin A.; Newport, Brian J.; Gostelow, Roli D.; Hihn, Jairus M.; Kandt, Ronald K.
2011-01-01
Source Lines Counter (SLiC) is a software utility designed to measure software source code size using logical source statements and other common measures for 22 of the programming languages commonly used at NASA and in the aerospace industry. Such metrics can be used in a wide variety of applications, from parametric cost estimation to software defect analysis. SLiC has a variety of unique features such as automatic code search, automatic file detection, hierarchical directory totals, and spreadsheet-compatible output. SLiC was written for extensibility; new programming language support can be added with minimal effort in a short amount of time. SLiC runs on a variety of platforms including UNIX, Windows, and Mac OSX. Its straightforward command-line interface allows for customization and incorporation into the software build process for tracking development metrics.
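SLiC's internals are not shown in the abstract; the sketch below only illustrates, under simplifying assumptions, what counting "logical source statements" can look like for a C-like language. The termination rules here are deliberately naive compared with what a production counter such as SLiC must handle.

    # Simplified illustration of logical-statement counting for a C-like language.
    # Real counters handle continuations, preprocessor directives, strings, and
    # per-language rules; the single rule below (one statement per ';') is an assumption.
    import re

    def count_logical_statements(source: str) -> int:
        source = re.sub(r"/\*.*?\*/", "", source, flags=re.S)   # strip block comments
        total = 0
        for line in source.splitlines():
            code = line.split("//", 1)[0].strip()               # strip line comments
            if code:
                total += code.count(";")                        # one statement per ';'
                # note: ';' inside for-headers inflates the count; real tools special-case this
        return total

    print(count_logical_statements("int x = 0; /* init */\nx += 2; // add\n"))  # -> 2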
Pozzi, Alessandro; Arcuri, Lorenzo; Moy, Peter K
2018-03-01
The growing interest in minimally invasive implant placement and delivery of a prefabricated provisional prosthesis immediately, thus minimizing "time to teeth," has led to the development of numerous 3-dimensional (3D) planning software programs. Given the enhancements associated with fully digital workflows, such as better 3D soft-tissue visualization and virtual tooth rendering, computer-guided implant surgery and immediate function has become an effective and reliable procedure. This article describes how modern implant planning software programs provide a comprehensive digital platform that enables efficient interplay between the surgical and restorative aspects of implant treatment. These new technologies that streamline the overall digital workflow allow transformation of the digital wax-up into a personalized, CAD/CAM-milled provisional restoration. Thus, collaborative digital workflows provide a novel approach for time-efficient delivery of a customized, screw-retained provisional restoration on the day of implant surgery, resulting in improved predictability for immediate function in the partially edentate patient.
Development and realization of the open fault diagnosis system based on XPE
NASA Astrophysics Data System (ADS)
Deng, Hui; Wang, TaiYong; He, HuiLong; Xu, YongGang; Zeng, JuXiang
2005-12-01
To keep complex mechanical equipment in good service, the technology for realizing an embedded open system is introduced systematically, including open hardware configuration, a customized embedded operating system, and an open software structure. The ETX technology is adopted in this system, integrating the CPU main-board functions and achieving quick, real-time signal acquisition and intelligent data analysis by applying a DSP- and CPLD-based data acquisition card. Under the open configuration, the signal bus mode (such as PCI, ISA, or PC/104) can be selected, and the types of signals can be chosen as well. In addition, by customizing the XPE system and adopting the EWF (Enhanced Write Filter), an authentically open system is realized and the stability of the system is enhanced. Multi-thread and multi-task programming techniques are adopted in the software development process. Interconnecting with the remote fault diagnosis center via the network interface, cooperative diagnosis is conducted and the level of intelligence of the fault diagnosis is improved.
Deployable Command and Control System for Over the Horizon Small Boat Operations
2006-09-01
the HP iPAQ Navigation System bundle. There is no programmable Application Programming Interface (API), nor otherwise accessible methods to ... High Point Software, which comes complete with a C# library to allow customized programs to access Bluetooth-enabled GPS devices. GPSAccess ... data could be displayed along with ownship's positional data, but the program was designed to only work with the Ross radios and the MS Windows XP
An Internal Data Non-hiding Type Real-time Kernel and its Application to the Mechatronics Controller
NASA Astrophysics Data System (ADS)
Yoshida, Toshio
For the mechatronics equipment controller that controls robots and machine tools, high-speed motion control processing is essential. The software system of the controller, like that of other embedded systems, is composed of three software layers on the dedicated hardware: a real-time kernel layer, a middleware layer, and an application software layer. The application layer at the top is composed of many tasks, and the application function of the system is realized by cooperation among these tasks. In this paper we propose an internal data non-hiding type real-time kernel in which the task control can be customized solely by changing the program code on the task side, without any changes to the program code of the real-time kernel. Reducing the overhead caused by the real-time kernel's task control is necessary to speed up the motion control of the mechatronics equipment, and this requires customizing the task control function. We developed the internal data non-hiding type real-time kernel ZRK to evaluate this method and applied it to the control of a multi-system automatic lathe. The speed-up of the task cooperation processing was confirmed by combining task control processing in the task-side program code using the internal data non-hiding type real-time kernel ZRK.
Thakkar, Jay; Barry, Tony; Thiagalingam, Aravinda; Redfern, Julie; McEwan, Alistair L; Rodgers, Anthony
2016-01-01
Background: Mobile health (mHealth) has huge potential to deliver preventative health services. However, there is a paucity of literature on the theoretical constructs and the technical, practical, and regulatory considerations that enable delivery of such services. Objectives: The objective of this study was to outline the key considerations in the development of a text message-based mHealth program, thus providing broad recommendations and guidance to future researchers designing similar programs. Methods: We describe the key considerations in designing the intervention with respect to functionality, technical infrastructure, data management, software components, regulatory requirements, and operationalization. We also illustrate some of the potential issues and decision points utilizing our experience of developing text message (short message service, SMS) management systems to support 2 large randomized controlled trials: TEXT messages to improve MEDication adherence & Secondary prevention (TEXTMEDS) and Tobacco, EXercise and dieT MEssages (TEXT ME). Results: The steps identified in the development process were: (1) background research and development of the text message bank based on scientific evidence and disease-specific guidelines, (2) pilot testing with the target audience and incorporating feedback, (3) software-hardware customization to enable delivery of complex personalized programs using prespecified algorithms, and (4) legal and regulatory considerations. Additional considerations in developing text message management systems include: balancing the use of customized versus preexisting software systems, the level of automation versus the need for human inputs, monitoring, ensuring data security, interface flexibility, and the ability for upscaling. Conclusions: A merging of expertise in clinical and behavioral sciences, health and research data management systems, software engineering, and mobile phone regulatory requirements is essential to develop a platform to deliver and manage support programs for hundreds of participants simultaneously, as in the TEXT ME and TEXTMEDS trials. This research provides broad principles that may assist other researchers in developing mHealth programs. PMID:27847350
Thakkar, Jay; Barry, Tony; Thiagalingam, Aravinda; Redfern, Julie; McEwan, Alistair L; Rodgers, Anthony; Chow, Clara K
2016-11-15
Mobile health (mHealth) has huge potential to deliver preventative health services. However, there is a paucity of literature on the theoretical constructs and the technical, practical, and regulatory considerations that enable delivery of such services. The objective of this study was to outline the key considerations in the development of a text message-based mHealth program, thus providing broad recommendations and guidance to future researchers designing similar programs. We describe the key considerations in designing the intervention with respect to functionality, technical infrastructure, data management, software components, regulatory requirements, and operationalization. We also illustrate some of the potential issues and decision points utilizing our experience of developing text message (short message service, SMS) management systems to support 2 large randomized controlled trials: TEXT messages to improve MEDication adherence & Secondary prevention (TEXTMEDS) and Tobacco, EXercise and dieT MEssages (TEXT ME). The steps identified in the development process were: (1) background research and development of the text message bank based on scientific evidence and disease-specific guidelines, (2) pilot testing with the target audience and incorporating feedback, (3) software-hardware customization to enable delivery of complex personalized programs using prespecified algorithms, and (4) legal and regulatory considerations. Additional considerations in developing text message management systems include: balancing the use of customized versus preexisting software systems, the level of automation versus the need for human inputs, monitoring, ensuring data security, interface flexibility, and the ability for upscaling. A merging of expertise in clinical and behavioral sciences, health and research data management systems, software engineering, and mobile phone regulatory requirements is essential to develop a platform to deliver and manage support programs for hundreds of participants simultaneously, as in the TEXT ME and TEXTMEDS trials. This research provides broad principles that may assist other researchers in developing mHealth programs. ©Jay Thakkar, Tony Barry, Aravinda Thiagalingam, Julie Redfern, Alistair L McEwan, Anthony Rodgers, Clara K Chow. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 15.11.2016.
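The TEXTMEDS/TEXT ME systems themselves are not published in these abstracts; the sketch below only illustrates, with invented field names and an invented four-messages-per-week schedule, the kind of "prespecified algorithm" personalization the authors describe, in which messages are drawn from a vetted bank according to participant attributes.

    # Hypothetical illustration of message-bank personalization and scheduling;
    # not the TEXTMEDS/TEXT ME software. Field names and schedule are assumptions.
    import random
    from datetime import date, timedelta

    MESSAGE_BANK = {
        "general":  ["Take a 30 minute walk today.", "Remember your medications tonight."],
        "smoking":  ["A day without a cigarette is money saved. Keep going!"],
        "diabetes": ["Check your blood sugar as advised by your doctor."],
    }

    def weekly_schedule(participant, week_start: date, per_week=4, rng=random.Random(42)):
        pools = ["general"]
        if participant.get("is_smoker"):
            pools.append("smoking")
        if participant.get("has_diabetes"):
            pools.append("diabetes")
        candidates = [m for p in pools for m in MESSAGE_BANK[p]]
        days = rng.sample(range(7), k=min(per_week, 7))          # spread across the week
        return [(week_start + timedelta(days=d), rng.choice(candidates))
                for d in sorted(days)]

    print(weekly_schedule({"is_smoker": True}, date(2016, 1, 4)))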
Computer programming for generating visual stimuli.
Bukhari, Farhan; Kurylo, Daniel D
2008-02-01
Critical to vision research is the generation of visual displays with precise control over stimulus metrics. Generating stimuli often requires adapting commercial software or developing specialized software for specific research applications. In order to facilitate this process, we give here an overview that allows nonexpert users to generate and customize stimuli for vision research. We first give a review of relevant hardware and software considerations, to allow the selection of display hardware, operating system, programming language, and graphics packages most appropriate for specific research applications. We then describe the framework of a generic computer program that can be adapted for use with a broad range of experimental applications. Stimuli are generated in the context of trial events, allowing the display of text messages, the monitoring of subject responses and reaction times, and the inclusion of contingency algorithms. This approach allows direct control and management of computer-generated visual stimuli while utilizing the full capabilities of modern hardware and software systems. The flowchart and source code for the stimulus-generating program may be downloaded from www.psychonomic.org/archive.
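The article's downloadable program targets particular graphics packages chosen as discussed above; the skeleton below is only a language-agnostic Python sketch of the trial-event structure the abstract describes (present a stimulus, collect a response, record reaction time, apply a contingency). The rendering call is a placeholder, and the staircase rule is an assumption for illustration.

    # Minimal skeleton of the trial-event structure described above; draw_stimulus
    # is a placeholder for whatever graphics package the experimenter selects.
    import time

    def draw_stimulus(params):
        print(f"[stimulus shown: {params}]")        # placeholder for actual rendering

    def run_trial(params, get_response=input):
        draw_stimulus(params)
        t0 = time.perf_counter()
        response = get_response("response? ").strip()
        return {"params": params, "response": response,
                "rt_s": time.perf_counter() - t0}

    def run_block(n_trials=5, step=0.1):
        contrast, results = 0.5, []
        for _ in range(n_trials):
            result = run_trial({"contrast": round(contrast, 2)})
            results.append(result)
            # simple contingency: make the stimulus harder after a correct response
            contrast += -step if result["response"] == "y" else step
            contrast = min(max(contrast, 0.05), 1.0)
        return results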
Application and systems software in Ada: Development experiences
NASA Technical Reports Server (NTRS)
Kuschill, Jim
1986-01-01
In its most basic sense software development involves describing the tasks to be solved, including the given objects and the operations to be performed on those objects. Unfortunately, the way people describe objects and operations usually bears little resemblance to source code in most contemporary computer languages. There are two ways around this problem. One is to allow users to describe what they want the computer to do in everyday, typically imprecise English. The PRODOC methodology and software development environment is based on a second, more flexible, and possibly even easier-to-use approach. Rather than hiding program structure, PRODOC represents such structure graphically using visual programming techniques. In addition, the program terminology used in PRODOC may be customized to match the way human experts in any given application area naturally describe the relevant data and operations. The PRODOC methodology is described in detail.
Astronaut Health Participant Summary Application
NASA Technical Reports Server (NTRS)
Johnson, Kathy; Krog, Ralph; Rodriguez, Seth; Wear, Mary; Volpe, Robert; Trevino, Gina; Eudy, Deborah; Parisian, Diane
2011-01-01
The Longitudinal Study of Astronaut Health (LSAH) Participant Summary software captures data based on a custom information model designed to gather all relevant, discrete medical events for its study participants. This software provides a summarized view of the study participant's entire medical record. The manual collapsing of all the data in a participant's medical record into a summarized form eliminates redundancy, and allows for the capture of entire medical events. The coding tool could be incorporated into commercial electronic medical record software for use in areas like public health surveillance, hospital systems, clinics, and medical research programs.
[Utility of Smartphone in Home Care Medicine - First Trial].
Takeshige, Toshiyuki; Hirano, Chiho; Nakagawa, Midori; Yoshioka, Rentaro
2015-12-01
The use of video calls for home care can reduce anxiety and offer patients peace of mind. The most suitable terminals at facilities to support home care have been the iPad Air and iPhone with FaceTime software. However, usage has been limited to specific terminals. In order to eliminate the need for special terminals and software, we have developed a program that has been customized to meet the needs of facilities using Web Real Time Communication (WebRTC) in cooperation with the University of Aizu. With this software, video calls can accommodate a large number of home care patients.
The Impact of Software Culture on the Management of Community Data
NASA Astrophysics Data System (ADS)
Collins, J. A.; Pulsifer, P. L.; Sheffield, E.; Lewis, S.; Oldenburg, J.
2013-12-01
The Exchange for Local Observations and Knowledge of the Arctic (ELOKA), a program hosted at the National Snow and Ice Data Center (NSIDC), supports the collection, curation, and distribution of Local and Traditional Knowledge (LTK) data, as well as some quantitative data products. Investigations involving LTK data often involve community participation, and therefore require flexible and robust user interfaces to support a reliable process of data collection and management. Often, investigators focused on LTK and community-based monitoring choose to use ELOKA's data services based on our ability to provide rapid proof-of-concepts and economical delivery of a usable product. To satisfy these two overarching criteria, ELOKA is experimenting with modifications to its software development culture both in terms of how the software applications are developed as well as the kind of software applications (or components) being developed. Over the past several years, NSIDC has shifted its software development culture from one of assigning individual scientific programmers to support particular principal investigators or projects, to an Agile Software Methodology implementation using Scrum practices. ELOKA has participated in this process by working with other product owners to schedule and prioritize development work which is then implemented by a team of application developers. Scrum, along with practices such as Test Driven Development (TDD) and paired programming, improves the quality of the software product delivered to the user community. To meet the need for rapid prototyping and to maximize product development and support with limited developer input, our software development efforts are now focused on creating a platform of application modules that can be quickly customized to suit the needs of a variety of LTK projects. This approach is in contrast to the strategy of delivering custom applications for individual projects. To date, we have integrated components of the Nunaliit Atlas framework (a Java/JavaScript client-server web-based application) with an existing Ruby on Rails application. This approach requires transitioning individual applications to expose a service layer, thus allowing interapplication communication via RESTful services. In this presentation we will report on our experiences using Agile Scrum practices, our efforts to move from custom solutions to a platform of customizable modules, and the impact of each on our ability to support researchers and Arctic residents in the domain of community-based observations and knowledge.
NASA Technical Reports Server (NTRS)
1987-01-01
In a complex computer environment there is ample opportunity for error, a mistake by a programmer, or a software-induced undesirable side effect. In insurance, errors can cost a company heavily, so protection against inadvertent change is a must for the efficient firm. The data processing center at Transport Life Insurance Company has taken a step to guard against accidental changes by adopting a software package called EQNINT (Equations Interpreter Program). EQNINT cross checks the basic formulas in a program against the formulas that make up the major production system. EQNINT assures that formulas are coded correctly and helps catch errors before they affect the customer service or its profitability.
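EQNINT is a commercial package whose internals are not given above; the fragment below is only a hedged Python illustration of the general idea of cross-checking that a formula coded in one program matches a reference formula, here by comparing normalized expression trees. It is not EQNINT's method, and the formulas are invented.

    # Not EQNINT: a small illustration of cross-checking a coded formula against a
    # reference formula by comparing parsed expression trees (whitespace-insensitive,
    # but not algebra-aware).
    import ast

    def same_formula(expr_a: str, expr_b: str) -> bool:
        return ast.dump(ast.parse(expr_a, mode="eval")) == ast.dump(ast.parse(expr_b, mode="eval"))

    print(same_formula("premium * (1 + rate)", "premium*(1 + rate)"))   # True
    print(same_formula("premium * (1 + rate)", "premium * 1 + rate"))   # False: grouping differs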
Proffitt, Rachel; Lange, Belinda
2015-01-01
The objective of this study was to determine the feasibility of a 6-week, game-based, in-home telerehabilitation exercise program using the Microsoft Kinect® for individuals with chronic stroke. Four participants with chronic stroke completed the intervention based on games designed with the customized Mystic Isle software. The games were tailored to each participant's specific rehabilitation needs to facilitate the attainment of individualized goals determined through the Canadian Occupational Performance Measure. Likert scale questionnaires assessed the feasibility and utility of the game-based intervention. Supplementary clinical outcome data were collected. All participants played the games with moderately high enjoyment. Participant feedback helped identify barriers to use (especially limited free time) and possible improvements. An in-home, customized, virtual reality game intervention to provide rehabilitative exercises for persons with chronic stroke is practicable. However, future studies are necessary to determine the intervention's impact on participant function, activity, and involvement.
Back to the Source, or It's A You-Bet-Your-Business Game!
ERIC Educational Resources Information Center
Galvin, Wayne W.
1987-01-01
Many administrators are signing contracts for software products that leave their institutions completely unprotected in the event of a default by the vendor. It is proper for a customer to include contractual provisions whereby they may gain legal access to the program source code. (MLW)
Eco-Visualization: Promoting Environmental Stewardship in the Museum
ERIC Educational Resources Information Center
Holmes, Tiffany
2007-01-01
Eco-visualizations are artworks that reinterpret environmental data with custom software to promote stewardship. Eco-visualization technology offers a new way to dynamically picture environmental data and make it meaningful to a museum population. The questions are: How might museums create new projects and programs around place-based information?…
Repository-based software engineering program: Concept document
NASA Technical Reports Server (NTRS)
1992-01-01
This document provides the context for Repository-Based Software Engineering's (RBSE's) evolving functional and operational product requirements, and it is the parent document for development of detailed technical and management plans. When furnished, requirements documents will serve as the governing RBSE product specification. The RBSE Program Management Plan will define resources, schedules, and technical and organizational approaches to fulfilling the goals and objectives of this concept. The purpose of this document is to provide a concise overview of RBSE, describe the rationale for the RBSE Program, and define a clear, common vision for RBSE team members and customers. The document also provides the foundation for developing RBSE user and system requirements and a corresponding Program Management Plan. The concept is used to express the program mission to RBSE users and managers and to provide an exhibit for community review.
Lapin Data Interchange Among Database, Analysis and Display Programs Using XML-Based Text Files
NASA Technical Reports Server (NTRS)
2005-01-01
The purpose of grant NCC3-966 was to investigate and evaluate the interchange of application-specific data among multiple programs each carrying out part of the analysis and design task. This has been carried out previously by creating a custom program to read data produced by one application and then write that data to a file whose format is specific to the second application that needs all or part of that data. In this investigation, data of interest is described using the XML markup language that allows the data to be stored in a text-string. Software to transform output data of a task into an XML-string and software to read an XML string and extract all or a portion of the data needed for another application is used to link two independent applications together as part of an overall design effort. This approach was initially used with a standard analysis program, Lapin, along with standard applications (a standard spreadsheet program, a relational database program, and a conventional dialog and display program) to demonstrate the successful sharing of data among independent programs. Most of the effort beyond that demonstration has been concentrated on the inclusion of more complex display programs. Specifically, a custom-written windowing program organized around dialogs to control the interactions has been combined with an independent CAD program (Open Cascade) that supports sophisticated display of CAD elements such as lines, spline curves, and surfaces and turbine-blade data produced by an independent blade design program (UD0300).
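The grant report does not reproduce the actual element layout used; the Python sketch below only illustrates the interchange idea it describes, writing one task's output to an XML string and letting a second application extract just the fields it needs. The element and attribute names are made up for this example.

    # Minimal sketch of the XML-string interchange idea, with an invented element layout.
    import xml.etree.ElementTree as ET

    def to_xml_string(task_name, values: dict) -> str:
        root = ET.Element("taskOutput", name=task_name)
        for key, val in values.items():
            item = ET.SubElement(root, "item", name=key)
            item.text = str(val)
        return ET.tostring(root, encoding="unicode")

    def read_items(xml_string, wanted):
        root = ET.fromstring(xml_string)
        return {el.get("name"): float(el.text)
                for el in root.findall("item") if el.get("name") in wanted}

    xml_out = to_xml_string("blade_design", {"chord_m": 0.12, "twist_deg": 31.5})
    print(read_items(xml_out, {"chord_m"}))   # downstream app extracts only what it needs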
Ensemble Eclipse: A Process for Prefab Development Environment for the Ensemble Project
NASA Technical Reports Server (NTRS)
Wallick, Michael N.; Mittman, David S.; Shams, Khawaja, S.; Bachmann, Andrew G.; Ludowise, Melissa
2013-01-01
This software simplifies the process of having to set up an Eclipse IDE programming environment for the members of the cross-NASA center project, Ensemble. It achieves this by assembling all the necessary add-ons and custom tools/preferences. This software is unique in that it allows developers in the Ensemble Project (approximately 20 to 40 at any time) across multiple NASA centers to set up a development environment almost instantly and work on Ensemble software. The software automatically has the source code repositories and other vital information and settings included. The Eclipse IDE is an open-source development framework. The NASA (Ensemble-specific) version of the software includes Ensemble-specific plug-ins as well as settings for the Ensemble project. This software saves developers the time and hassle of setting up a programming environment, making sure that everything is set up in the correct manner for Ensemble development. Existing software (i.e., standard Eclipse) requires an intensive setup process that is both time-consuming and error prone. This software is built once by a single user and tested, allowing other developers to simply download and use the software.
Use of three-dimensional computer graphic animation to illustrate cleft lip and palate surgery.
Cutting, C; Oliker, A; Haring, J; Dayan, J; Smith, D
2002-01-01
Three-dimensional (3D) computer animation is not commonly used to illustrate surgical techniques. This article describes the surgery-specific processes that were required to produce animations to teach cleft lip and palate surgery. Three-dimensional models were created using CT scans of two Chinese children with unrepaired clefts (one unilateral and one bilateral). We programmed several custom software tools, including an incision tool, a forceps tool, and a fat tool. Three-dimensional animation was found to be particularly useful for illustrating surgical concepts. Positioning the virtual "camera" made it possible to view the anatomy from angles that are impossible to obtain with a real camera. Transparency allows the underlying anatomy to be seen during surgical repair while maintaining a view of the overlaying tissue relationships. Finally, the representation of motion allows modeling of anatomical mechanics that cannot be done with static illustrations. The animations presented in this article can be viewed on-line at http://www.smiletrain.org/programs/virtual_surgery2.htm. Sophisticated surgical procedures are clarified with the use of 3D animation software and customized software tools. The next step in the development of this technology is the creation of interactive simulators that recreate the experience of surgery in a safe, digital environment. Copyright 2003 Wiley-Liss, Inc.
ScanImage: flexible software for operating laser scanning microscopes.
Pologruto, Thomas A; Sabatini, Bernardo L; Svoboda, Karel
2003-05-17
Laser scanning microscopy is a powerful tool for analyzing the structure and function of biological specimens. Although numerous commercial laser scanning microscopes exist, some of the more interesting and challenging applications demand custom design. A major impediment to custom design is the difficulty of building custom data acquisition hardware and writing the complex software required to run the laser scanning microscope. We describe a simple, software-based approach to operating a laser scanning microscope without the need for custom data acquisition hardware. Data acquisition and control of laser scanning are achieved through standard data acquisition boards. The entire burden of signal integration and image processing is placed on the CPU of the computer. We quantitate the effectiveness of our data acquisition and signal conditioning algorithm under a variety of conditions. We implement our approach in an open source software package (ScanImage) and describe its functionality. We present ScanImage, software to run a flexible laser scanning microscope that allows easy custom design.
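ScanImage itself is not shown here; the NumPy fragment below is only a hedged illustration of the general software-integration idea the abstract describes, in which a stream of digitized detector samples is binned into image pixels on the CPU given a fixed number of samples per pixel. It is not ScanImage's actual algorithm, and the acquisition parameters are invented.

    # Illustration of software-side signal integration: bin a stream of digitized
    # samples into image pixels, assuming a fixed number of ADC samples per pixel.
    import numpy as np

    def samples_to_image(samples, samples_per_pixel, width, height):
        n_pixels = width * height
        needed = n_pixels * samples_per_pixel
        pixels = samples[:needed].reshape(n_pixels, samples_per_pixel).sum(axis=1)
        return pixels.reshape(height, width)

    rng = np.random.default_rng(0)
    stream = rng.poisson(lam=3.0, size=512 * 512 * 4)      # synthetic photon-count samples
    image = samples_to_image(stream, samples_per_pixel=4, width=512, height=512)
    print(image.shape, image.dtype)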
Modernization of software quality assurance
NASA Technical Reports Server (NTRS)
Bhaumik, Gokul
1988-01-01
Customer satisfaction depends not only on functional performance but also on the quality characteristics of the software products. An examination of this quality aspect of software products will provide a clear, well-defined framework for quality assurance functions, which improve the life-cycle activities of software development. Software developers must be aware of the following aspects, which have been expressed by many quality experts: quality cannot be added on; the level of quality built into a program is a function of the quality attributes employed during the development process; and finally, quality must be managed. These concepts have guided our development of the following definition for a Software Quality Assurance function: Software Quality Assurance is a formal, planned approach of actions designed to evaluate the degree of an identifiable set of quality attributes present in all software systems and their products. This paper is an explanation of how this definition was developed and how it is used.
NASA Astrophysics Data System (ADS)
Hussain, Azham; Mkpojiogu, Emmanuel O. C.; Yusof, Muhammad Mat
2016-08-01
This paper reports the effect of proposed software product features on the satisfaction and dissatisfaction of potential customers of proposed software products. The Kano model's functional and dysfunctional technique was used along with Berger et al.'s customer satisfaction coefficients. The results show that only two feature categories had the greatest influence on the satisfaction and dissatisfaction of would-be customers of the proposed software product: attractive and one-dimensional features had the highest impact on the satisfaction and dissatisfaction of customers. This result will benefit requirements analysts, developers, designers, and project and sales managers in preparing for proposed products. Additional analysis showed that the Kano model's satisfaction and dissatisfaction scores were highly related to Park et al.'s average satisfaction coefficient (r=96%), implying that these variables can be used interchangeably or in place of one another to elicit customer satisfaction. Furthermore, average satisfaction coefficients and satisfaction and dissatisfaction indexes were all positively and linearly correlated.
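For readers unfamiliar with the coefficients cited above, the short worked example below computes the Berger et al. satisfaction and dissatisfaction coefficients from Kano category counts. The counts themselves are invented for illustration and are not taken from the study.

    # Worked illustration of the Berger et al. coefficients referenced above.
    # A, O, M, I are counts of respondents classifying a feature as Attractive,
    # One-dimensional, Must-be, or Indifferent (invented numbers).
    def kano_coefficients(A, O, M, I):
        total = A + O + M + I
        satisfaction = (A + O) / total          # how much the feature raises satisfaction
        dissatisfaction = -(O + M) / total      # how much its absence lowers satisfaction
        return satisfaction, dissatisfaction

    cs, ds = kano_coefficients(A=18, O=12, M=6, I=4)
    print(f"CS = {cs:.2f}, DS = {ds:.2f}")      # CS = 0.75, DS = -0.45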
Software Acquisition Program Dynamics
2011-10-24
greatest capability, which requires latest technologies • Contractors prefer using latest technologies to boost staff competency for future bids • Risk ... mistakes • Build foundation to test future mitigation/solution approaches to assess value • Qualitatively validate new approaches before applying them to ... classroom training, eLearning, certification, and more—to serve the needs of customers and partners worldwide.
National Science Foundation 1989 Engineering Senior Design Projects To Aid the Disabled.
ERIC Educational Resources Information Center
Enderle, John D., Ed.
Through the Bioengineering and Research to Aid the Disabled program of the National Science Foundation, design projects were awarded competitively to 16 universities. Senior engineering students at each of the universities constructed custom devices and software for disabled individuals. This compendium contains a description of each project in…
The Object Formerly Known as the Textbook
ERIC Educational Resources Information Center
Young, Jeffrey R.
2013-01-01
Textbook publishers argue that their newest digital products should not even be called "textbooks." They are really software programs built to deliver a mix of text, videos, and homework assignments. But delivering them is just the beginning. No old-school textbook was able to be customized for each student in the classroom. The books never graded…
Embracing Open Source for NASA's Earth Science Data Systems
NASA Technical Reports Server (NTRS)
Baynes, Katie; Pilone, Dan; Boller, Ryan; Meyer, David; Murphy, Kevin
2017-01-01
The overarching purpose of NASA's Earth Science program is to develop a scientific understanding of Earth as a system. Scientific knowledge is most robust and actionable when resulting from transparent, traceable, and reproducible methods. Reproducibility includes open access to the data as well as the software used to arrive at results. Additionally, software that is custom-developed for NASA should be open to the greatest degree possible, to enable re-use across Federal agencies, reduce overall costs to the government, remove barriers to innovation, and promote consistency through the use of uniform standards. Finally, Open Source Software (OSS) practices facilitate collaboration between agencies and the private sector. To best meet these ends, NASA's Earth Science Division promotes the full and open sharing of not only all data, metadata, products, information, documentation, models, images, and research results but also the source code used to generate, manipulate and analyze them. This talk focuses on the challenges to open sourcing NASA-developed software within ESD and the growing pains associated with establishing policies running the gamut of tracking issues, properly documenting build processes, engaging the open source community, maintaining internal compliance, and accepting contributions from external sources. This talk also covers the adoption of existing open source technologies and standards to enhance our custom solutions and our contributions back to the community. Finally, we will be introducing the most recent OSS contributions from NASA's Earth Science program and promoting these projects for wider community review and adoption.
1994-03-01
... and 60's, when programming and systems development was in its infancy, virtually all software was custom-made. The programmer designed, coded ...
Web-Based Software for Managing Research
NASA Technical Reports Server (NTRS)
Hoadley, Sherwood T.; Ingraldi, Anthony M.; Gough, Kerry M.; Fox, Charles; Cronin, Catherine K.; Hagemann, Andrew G.; Kemmerly, Guy T.; Goodman, Wesley L.
2007-01-01
aeroCOMPASS is a software system, originally designed to aid in the management of wind tunnels at Langley Research Center, that could be adapted to provide similar aid to other enterprises in which research is performed in common laboratory facilities by users who may be geographically dispersed. Included in aeroCOMPASS is Web-interface software that provides a single, convenient portal to a set of project- and test-related software tools and other application programs. The heart of aeroCOMPASS is a user-oriented document-management software subsystem that enables geographically dispersed users to easily share and manage a variety of documents. A principle of "write once, read many" is implemented throughout aeroCOMPASS to eliminate the need for multiple entry of the same information. The Web framework of aeroCOMPASS provides links to client-side application programs that are fully integrated with databases and server-side application programs. Other subsystems of aeroCOMPASS include ones for reserving hardware, tracking of requests and feedback from users, generating interactive notes, administration of a customer-satisfaction questionnaire, managing execution of tests, managing archives of metadata about tests, planning tests, and providing online help and instruction for users.
NASA Technical Reports Server (NTRS)
Gill, Esther Naomi
1986-01-01
A review was conducted of software packages currently on the market which might be integrated with the interface language and aid in reaching the objectives of customization, standardization, transparency, reliability, maintainability, language substitutions, expandability, portability, and flexibility. Recommendations are given for best choices in hardware and software acquisition for in-house testing of these possible integrations. Recommended acquisitions include software tools to aid expert-system development and/or novice program development, artificially intelligent voice technology, touch-screen, joystick, or mouse utilization, and networking. Other recommendations concerned using the language Ada for the user interface language shell, because of its high level of standardization, its structure, its ability to accept and execute programs written in other programming languages, and its DOD ownership and control, and keeping the user interface language simple so that a wide range of users will find the commercialization of space within their realm of possibility, which is, after all, the purpose of the Space Station.
ERIC Educational Resources Information Center
Gilden, Deborah
This paper discusses how presentation software can be used to design custom materials for a variety of people with special needs, including children and adults with low vision, people with developmental disabilities, and stroke patients with cognitive impairments. Benefits of using presentation software include: (1) presentation software gives the…
Code of Federal Regulations, 2010 CFR
2010-07-01
... applicant submits proof satisfactory to the U.S. Customs Service that the goods, software, or technology... satisfactory to the U.S. Customs Service of the location of goods, software, or technology outside the... 31 Money and Finance: Treasury 3 2010-07-01 2010-07-01 false Importation of goods, software, or...
yourSky: Custom Sky-Image Mosaics via the Internet
NASA Technical Reports Server (NTRS)
Jacob, Joseph
2003-01-01
yourSky (http://yourSky.jpl.nasa.gov) is a computer program that supplies custom astronomical image mosaics of sky regions specified by requesters using client computers connected to the Internet. [yourSky is an upgraded version of the software reported in Software for Generating Mosaics of Astronomical Images (NPO-21121), NASA Tech Briefs, Vol. 25, No. 4 (April 2001), page 16a.] A requester no longer has to engage in the tedious process of determining what subset of images is needed, nor even to know how the images are indexed in image archives. Instead, in response to a requester's specification of the size and location of the sky area (and optionally of the desired set and type of data, resolution, coordinate system, projection, and image format), yourSky automatically retrieves the component image data from archives totaling tens of terabytes stored on computer tape and disk drives at multiple sites and assembles the component images into a mosaic image by use of a high-performance parallel code. yourSky runs on the server computer where the mosaics are assembled. Because yourSky includes a Web-interface component, no special client software is needed: ordinary Web browser software is sufficient.
FloWave.US: validated, open-source, and flexible software for ultrasound blood flow analysis.
Coolbaugh, Crystal L; Bush, Emily C; Caskey, Charles F; Damon, Bruce M; Towse, Theodore F
2016-10-01
Automated software improves the accuracy and reliability of blood velocity, vessel diameter, blood flow, and shear rate ultrasound measurements, but existing software offers limited flexibility to customize and validate analyses. We developed FloWave.US, open-source software to automate ultrasound blood flow analysis, and demonstrated the validity of its blood velocity (aggregate relative error, 4.32%) and vessel diameter (0.31%) measures with a skeletal muscle ultrasound flow phantom. Compared with a commercial, manual analysis software program, FloWave.US produced equivalent in vivo cardiac cycle time-averaged mean (TAMean) velocities at rest and following a 10-s muscle contraction (mean bias <1 pixel for both conditions). Automated analysis of ultrasound blood flow data was 9.8 times faster than the manual method. Finally, a case study of a lower extremity muscle contraction experiment highlighted the ability of FloWave.US to measure small fluctuations in TAMean velocity, vessel diameter, and mean blood flow at specific time points in the cardiac cycle. In summary, the collective features of our newly designed software (accuracy, reliability, reduced processing time, cost-effectiveness, and flexibility) offer advantages over existing proprietary options. Further, public distribution of FloWave.US allows researchers to easily access and customize code to adapt ultrasound blood flow analysis to a variety of vascular physiology applications. Copyright © 2016 the American Physiological Society.
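The FloWave.US code itself is not reproduced here; the short Python sketch below only illustrates the standard textbook relations between the quantities the abstract names (TAMean velocity, vessel diameter, volumetric flow, and a Poiseuille shear-rate estimate), using invented input values.

    # Illustration of the standard quantities named above, computed from assumed values;
    # not FloWave.US code.
    import math

    def blood_flow_ml_per_min(tamean_cm_s, diameter_cm):
        area_cm2 = math.pi * (diameter_cm / 2) ** 2     # vessel cross-sectional area
        return tamean_cm_s * area_cm2 * 60              # cm^3/s -> mL/min

    def shear_rate_per_s(tamean_cm_s, diameter_cm):
        return 8 * tamean_cm_s / diameter_cm            # Poiseuille estimate, 8*V/D

    print(blood_flow_ml_per_min(tamean_cm_s=15.0, diameter_cm=0.4))   # ~113 mL/min
    print(shear_rate_per_s(15.0, 0.4))                                # 300 1/s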
Component Prioritization Schema for Achieving Maximum Time and Cost Benefits from Software Testing
NASA Astrophysics Data System (ADS)
Srivastava, Praveen Ranjan; Pareek, Deepak
Software testing is any activity aimed at evaluating an attribute or capability of a program or system and determining that it meets its required results. Defining the end of software testing is a crucial aspect of any software development project. A premature release will involve risks like undetected bugs, the cost of fixing faults later, and discontented customers. Any software organization would want to achieve maximum possible benefits from software testing with minimum resources. Testing time and cost need to be optimized for achieving a competitive edge in the market. In this paper, we propose a schema, called the Component Prioritization Schema (CPS), to achieve an effective and uniform prioritization of the software components. This schema serves as an extension to the Non-Homogeneous Poisson Process based Cumulative Priority Model. We also introduce an approach for handling time-intensive versus cost-intensive projects.
HSCT4.0 Application: Software Requirements Specification
NASA Technical Reports Server (NTRS)
Salas, A. O.; Walsh, J. L.; Mason, B. H.; Weston, R. P.; Townsend, J. C.; Samareh, J. A.; Green, L. L.
2001-01-01
The software requirements for the High Performance Computing and Communication Program High Speed Civil Transport application project, referred to as HSCT4.0, are described. The objective of the HSCT4.0 application project is to demonstrate the application of high-performance computing techniques to the problem of multidisciplinary design optimization of a supersonic transport configuration, using high-fidelity analysis simulations. Descriptions of the various functions (and the relationships among them) that make up the multidisciplinary application as well as the constraints on the software design are provided. This document serves to establish an agreement between the suppliers and the customer as to what the HSCT4.0 application should do and provides to the software developers the information necessary to design and implement the system.
NASA Technical Reports Server (NTRS)
Laubenthal, N. A.; Bertsch, D.; Lal, N.; Etienne, A.; Mcdonald, L.; Mattox, J.; Sreekumar, P.; Nolan, P.; Fierro, J.
1992-01-01
The Energetic Gamma Ray Telescope Experiment (EGRET) on the Compton Gamma Ray Observatory has been in orbit for more than a year and is being used to map the full sky for gamma rays in a wide energy range from 30 to 20,000 MeV. Already these measurements have resulted in a wide range of exciting new information on quasars, pulsars, galactic sources, and diffuse gamma ray emission. The central part of the analysis is done with sky maps that typically cover an 80 x 80 degree section of the sky for an exposure time of several days. Specific software developed for this program generates the counts, exposure, and intensity maps. The analysis is done on a network of UNIX based workstations and takes full advantage of a custom-built user interface called X-dialog. The maps that are generated are stored in the FITS format for a collection of energies. These, along with similar diffuse emission background maps generated from a model calculation, serve as input to a maximum likelihood program that produces maps of likelihood with optional contours that are used to evaluate regions for sources. Likelihood also evaluates the background corrected intensity at each location for each energy interval from which spectra can be generated. Being in a standard FITS format permits all of the maps to be easily accessed by the full complement of tools available in several commercial astronomical analysis systems. In the EGRET case, IDL is used to produce graphics plots in two and three dimensions and to quickly implement any special evaluation that might be desired. Other custom-built software, such as the spectral and pulsar analyses, take advantage of the XView toolkit for display and Postscript output for the color hard copy. This poster paper outlines the data flow and provides examples of the user interfaces and output products. It stresses the advantages that are derived from the integration of the specific instrument-unique software and powerful commercial tools for graphics and statistical evaluation. This approach has several proven advantages including flexibility, a minimum of development effort, ease of use, and portability.
Lapin Data Interchange Among Database, Analysis and Display Programs Using XML-Based Text Files
NASA Technical Reports Server (NTRS)
2004-01-01
The purpose was to investigate and evaluate the interchange of application-specific data among multiple programs each carrying out part of the analysis and design task. This has been carried out previously by creating a custom program to read data produced by one application and then write that data to a file whose format is specific to the second application that needs all or part of that data. In this investigation, data of interest is described using the XML markup language that allows the data to be stored in a text-string. Software to transform output data of a task into an XML-string and software to read an XML string and extract all or a portion of the data needed for another application is used to link two independent applications together as part of an overall design effort. This approach was initially used with a standard analysis program, Lapin, along with standard applications (a standard spreadsheet program, a relational database program, and a conventional dialog and display program) to demonstrate the successful sharing of data among independent programs. See Engineering Analysis Using a Web-Based Protocol by J.D. Schoeffler and R.W. Claus, NASA TM-2002-211981, October 2002. Most of the effort beyond that demonstration has been concentrated on the inclusion of more complex display programs. Specifically, a custom-written windowing program organized around dialogs to control the interactions has been combined with an independent CAD program (Open Cascade) that supports sophisticated display of CAD elements such as lines, spline curves, and surfaces and turbine-blade data produced by an independent blade design program (UD0300).
Meta-tools for software development and knowledge acquisition
NASA Technical Reports Server (NTRS)
Eriksson, Henrik; Musen, Mark A.
1992-01-01
The effectiveness of tools that provide support for software development is highly dependent on the match between the tools and their task. Knowledge-acquisition (KA) tools constitute a class of development tools targeted at knowledge-based systems. Generally, KA tools that are custom-tailored for particular application domains are more effective than are general KA tools that cover a large class of domains. The high cost of custom-tailoring KA tools manually has encouraged researchers to develop meta-tools for KA tools. Current research issues in meta-tools for knowledge acquisition are the specification styles, or meta-views, for target KA tools used, and the relationships between the specification entered in the meta-tool and other specifications for the target program under development. We examine different types of meta-views and meta-tools. Our current project is to provide meta-tools that produce KA tools from multiple specification sources--for instance, from a task analysis of the target application.
Software synthesis using generic architectures
NASA Technical Reports Server (NTRS)
Bhansali, Sanjay
1993-01-01
A framework for synthesizing software systems based on abstracting software system designs and the design process is described. The result of such an abstraction process is a generic architecture and the process knowledge for customizing the architecture. The customization process knowledge is used to assist a designer in customizing the architecture, as opposed to completely automating the design of systems. Our approach is illustrated using an implemented example of a generic tracking architecture that was customized in two different domains. We also describe how the designs produced using KASE compare to the original designs of the two systems, as well as current work and plans for extending KASE to other application areas.
A Mobile Computing Solution for Collecting Functional Analysis Data on a Pocket PC
ERIC Educational Resources Information Center
Jackson, James; Dixon, Mark R.
2007-01-01
The present paper provides a task analysis for creating a computerized data system using a Pocket PC and Microsoft Visual Basic. With Visual Basic software and any handheld device running the Windows Mobile operating system, this task analysis will allow behavior analysts to program and customize their own functional analysis data-collection…
Spacecraft Avionics Software Development Then and Now: Different but the Same
NASA Technical Reports Server (NTRS)
Mangieri, Mark L.; Garman, John (Jack); Vice, Jason
2012-01-01
NASA has always been in the business of balancing new technologies and techniques to achieve human space travel objectives. NASA's historic Software Production Facility (SPF) was developed to serve complex avionics software solutions during an era dominated by mainframes, tape drives, and lower level programming languages. These systems have proven themselves resilient enough to serve the Shuttle Orbiter Avionics life cycle for decades. The SPF and its predecessor, the Software Development Lab (SDL) at NASA's Johnson Space Center (JSC), hosted flight software (FSW) engineering, development, simulation, and test. It was active from the beginning of Shuttle Orbiter development in 1972 through the end of the shuttle program in the summer of 2011, almost 40 years. NASA's Kedalion engineering analysis lab is on the forefront of validating and using many contemporary avionics HW/SW development and integration techniques, which represent new paradigms to NASA's heritage culture in avionics software engineering. Kedalion has validated many of the Orion project's HW/SW engineering techniques borrowed from the adjacent commercial aircraft avionics environment, inserting new techniques and skills into the Multi-Purpose Crew Vehicle (MPCV) Orion program. Using contemporary agile techniques, COTS products, early rapid prototyping, in-house expertise and tools, and customer collaboration, NASA has adopted a cost effective paradigm that is currently serving Orion effectively. This paper will explore and contrast differences in technology employed over the years of NASA's space program, due largely to technological advances in hardware and software systems, while acknowledging that the basic software engineering and integration paradigms share many similarities.
PROFFITT, RACHEL; LANGE, BELINDA
2015-01-01
The objective of this study was to determine the feasibility of a 6-week, game-based, in-home telerehabilitation exercise program using the Microsoft Kinect® for individuals with chronic stroke. Four participants with chronic stroke completed the intervention based on games designed with the customized Mystic Isle software. The games were tailored to each participant’s specific rehabilitation needs to facilitate the attainment of individualized goals determined through the Canadian Occupational Performance Measure. Likert scale questionnaires assessed the feasibility and utility of the game-based intervention. Supplementary clinical outcome data were collected. All participants played the games with moderately high enjoyment. Participant feedback helped identify barriers to use (especially limited free time) and possible improvements. An in-home, customized, virtual reality game intervention to provide rehabilitative exercises for persons with chronic stroke is practicable. However, future studies are necessary to determine the intervention’s impact on participant function, activity, and involvement. PMID:27563384
Implementing Extreme Programming in Distributed Software Project Teams: Strategies and Challenges
NASA Astrophysics Data System (ADS)
Maruping, Likoebe M.
Agile software development methods and distributed forms of organizing teamwork are two team process innovations that are gaining prominence in today's demanding software development environment. Individually, each of these innovations has yielded gains in the practice of software development. Agile methods have enabled software project teams to meet the challenges of an ever turbulent business environment through enhanced flexibility and responsiveness to emergent customer needs. Distributed software project teams have enabled organizations to access highly specialized expertise across geographic locations. Although much progress has been made in understanding how to more effectively manage agile development teams and how to manage distributed software development teams, managers have little guidance on how to leverage these two potent innovations in combination. In this chapter, I outline some of the strategies and challenges associated with implementing agile methods in distributed software project teams. These are discussed in the context of a study of a large-scale software project in the United States that lasted four months.
Building an open-source robotic stereotaxic instrument.
Coffey, Kevin R; Barker, David J; Ma, Sisi; West, Mark O
2013-10-29
This protocol includes the designs and software necessary to upgrade an existing stereotaxic instrument to a robotic (CNC) stereotaxic instrument for around $1,000 (excluding a drill), using industry standard stepper motors and CNC controlling software. Each axis has variable speed control and may be operated simultaneously or independently. The robot's flexibility and open coding system (g-code) make it capable of performing custom tasks that are not supported by commercial systems. Its applications include, but are not limited to, drilling holes, sharp edge craniotomies, skull thinning, and lowering electrodes or cannula. In order to expedite the writing of g-coding for simple surgeries, we have developed custom scripts that allow individuals to design a surgery with no knowledge of programming. However, for users to get the most out of the motorized stereotax, it would be beneficial to be knowledgeable in mathematical programming and G-Coding (simple programming for CNC machining). The recommended drill speed is greater than 40,000 rpm. The stepper motor resolution is 1.8°/Step, geared to 0.346°/Step. A standard stereotax has a resolution of 2.88 μm/step. The maximum recommended cutting speed is 500 μm/sec. The maximum recommended jogging speed is 3,500 μm/sec. The maximum recommended drill bit size is HP 2.
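The authors' surgery-design scripts are not included in the abstract; the sketch below only illustrates, with invented coordinates and a feed rate chosen to stay under the 500 μm/s cutting limit quoted above, how a simple circular craniotomy path could be emitted as g-code by a Python helper. The specific commands and unit handling are assumptions and would need to match the user's CNC controller configuration.

    # Illustrative g-code generator (not the authors' scripts): emit a circular
    # craniotomy path around a target point, segment by segment. 24000 um/min = 400 um/s,
    # inside the 500 um/s cutting limit quoted above. Coordinates are invented.
    import math

    def craniotomy_gcode(cx_um, cy_um, radius_um, depth_um, feed_um_min=24000, steps=72):
        lines = ["G90 ; absolute positioning (units as configured on the controller)",
                 f"G1 Z{-depth_um:.1f} F{feed_um_min} ; plunge to cutting depth"]
        for i in range(steps + 1):
            angle = 2 * math.pi * i / steps
            x = cx_um + radius_um * math.cos(angle)
            y = cy_um + radius_um * math.sin(angle)
            lines.append(f"G1 X{x:.1f} Y{y:.1f} F{feed_um_min}")
        lines.append("G1 Z1000.0 ; retract the drill")
        return "\n".join(lines)

    print(craniotomy_gcode(cx_um=2000, cy_um=-1500, radius_um=750, depth_um=400))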
Software for minimalistic data management in large camera trap studies
Krishnappa, Yathin S.; Turner, Wendy C.
2014-01-01
The use of camera traps is now widespread and their importance in wildlife studies well understood. Camera trap studies can produce millions of photographs and there is a need for software to help manage photographs efficiently. In this paper, we describe a software system that was built to successfully manage a large behavioral camera trap study that produced more than a million photographs. We describe the software architecture and the design decisions that shaped the evolution of the program over the study’s three year period. The software system has the ability to automatically extract metadata from images, and add customized metadata to the images in a standardized format. The software system can be installed as a standalone application on popular operating systems. It is minimalistic, scalable and extendable so that it can be used by small teams or individual researchers for a broad variety of camera trap studies. PMID:25110471
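As a rough illustration of the metadata-harvesting step described above (this is not the authors' system), the sketch below reads EXIF capture times from a folder of camera-trap JPEGs and writes a CSV index. It assumes the Pillow library is installed; file names and paths are hypothetical.

```python
# Minimal sketch: index camera-trap photos by EXIF capture time into a CSV file.
import csv
from pathlib import Path
from PIL import Image, ExifTags

# Map EXIF tag names to their numeric IDs (e.g. "DateTime" -> 306).
TAG_BY_NAME = {name: tag for tag, name in ExifTags.TAGS.items()}

def index_photos(photo_dir, out_csv):
    with open(out_csv, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["file", "capture_time"])
        for path in sorted(Path(photo_dir).glob("*.JPG")):
            exif = Image.open(path).getexif()
            capture_time = exif.get(TAG_BY_NAME["DateTime"], "")
            writer.writerow([path.name, capture_time])

# Example (hypothetical paths):
# index_photos("camera01/", "camera01_index.csv")
```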
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia
2009-01-01
The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Administration (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to the SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. This plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitments to improving software products by applying cost-effective SQE practices. This plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.
Simplified programming and control of automated radiosynthesizers through unit operations.
Claggett, Shane B; Quinn, Kevin M; Lazari, Mark; Moore, Melissa D; van Dam, R Michael
2013-07-15
Many automated radiosynthesizers for producing positron emission tomography (PET) probes provide a means for the operator to create custom synthesis programs. The programming interfaces are typically designed with the engineer rather than the radiochemist in mind, requiring lengthy programs to be created from sequences of low-level, non-intuitive hardware operations. In some cases, the user is even responsible for adding steps to update the graphical representation of the system. In light of these unnecessarily complex approaches, we have created software to perform radiochemistry on the ELIXYS radiosynthesizer with the goal of being intuitive and easy to use. Radiochemists were consulted, and a wide range of radiosyntheses were analyzed to determine a comprehensive set of basic chemistry unit operations. Based around these operations, we created a software control system with a client-server architecture. In an attempt to maximize flexibility, the client software was designed to run on a variety of portable multi-touch devices. The software was used to create programs for the synthesis of several 18F-labeled probes on the ELIXYS radiosynthesizer, with [18F]FDG detailed here. To gauge the user-friendliness of the software, program lengths were compared to those from other systems. A small sample group with no prior radiosynthesizer experience was tasked with creating and running a simple protocol. The software was successfully used to synthesize several 18F-labeled PET probes, including [18F]FDG, with synthesis times and yields comparable to literature reports. The resulting programs were significantly shorter and easier to debug than programs from other systems. The sample group of naive users created and ran a simple protocol within a couple of hours, revealing a very short learning curve. The client-server architecture provided reliability, enabling continuity of the synthesis run even if the computer running the client software failed. The architecture enabled a single user to control the hardware while others observed the run in progress or created programs for other probes. We developed a novel unit operation-based software interface to control automated radiosynthesizers that reduced the program length and complexity and also exhibited a short learning curve. The client-server architecture provided robustness and flexibility.
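The following sketch suggests what a unit-operation style program might look like in Python. The class and operation names are hypothetical illustrations of the concept, not the actual ELIXYS software interface; the point is that a synthesis reads as chemistry steps rather than low-level valve and motor commands.

```python
# Hypothetical unit-operation program builder (illustrative only).
class UnitOperationProgram:
    def __init__(self, name):
        self.name = name
        self.steps = []

    def add(self, reagent, reactor, volume_ml):
        self.steps.append(("ADD", reagent, reactor, volume_ml))

    def react(self, reactor, temperature_c, duration_s):
        self.steps.append(("REACT", reactor, temperature_c, duration_s))

    def evaporate(self, reactor, temperature_c, duration_s):
        self.steps.append(("EVAPORATE", reactor, temperature_c, duration_s))

# A fragment of an FDG-like sequence expressed as unit operations (example values):
program = UnitOperationProgram("[18F]FDG demo")
program.add(reagent="K222/K2CO3", reactor=1, volume_ml=1.0)
program.evaporate(reactor=1, temperature_c=110, duration_s=300)
program.add(reagent="mannose triflate", reactor=1, volume_ml=1.0)
program.react(reactor=1, temperature_c=105, duration_s=180)

for step in program.steps:
    print(step)
```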
Porter, Mark W; Porter, Mark William; Milley, David; Oliveti, Kristyn; Ladd, Allen; O'Hara, Ryan J; Desai, Bimal R; White, Peter S
2008-11-06
Flexible, highly accessible collaboration tools can inherently conflict with controls placed on information sharing by offices charged with privacy protection, compliance, and maintenance of the general business environment. Our implementation of a commercial enterprise wiki within the academic research environment addresses concerns of all involved through the development of a robust user training program, a suite of software customizations that enhance security elements, a robust auditing program, allowance for inter-institutional wiki collaboration, and wiki-specific governance.
Do-It-Yourself Learning Games: Software That Lets You Pick the Questions--and Answers.
ERIC Educational Resources Information Center
Hively, Wells
1984-01-01
Reviews user-adaptable learning games that can be customized for any subject, including Tic Tac Show and the Game Show from Computer Advanced Ideas, which are question-answer learning programs based on game shows, and Master Match from Computer Advanced Ideas and Square Pairs from Scholastic Inc., which are based on the card game Concentration.…
2017-09-30
training sessions; internet service was either not available or was restricted. We overcame this problem by using Creo, a high-end software for... based on participants, and customize the training program. Similarly, we have also identified and catalogued a number of training aids in the form... align with AST2 interests in providing services, particularly associated with digital manufacturing and workforce training. Additionally, a number of
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brant Peery; Sam Alessi; Randy Lee
2014-06-01
There is a need for a spatial decision support application that allows users to create customized metrics for comparing proposed locations of a new solar installation. This document discusses how PVMapper was designed to overcome the customization problem through the development of loosely coupled spatial and decision components in a JavaScript plugin architecture. This allows the user to easily add functionality and data to the system. The paper also explains how PVMapper provides the user with a dynamic and customizable decision tool that enables them to visually modify the formulas that are used in the decision algorithms that convert data to comparable metrics. The technologies that make up the presentation and calculation software stack are outlined. This document also explains the architecture that allows the tool to grow through custom plugins created by the software users. Some discussion is given on the difficulties encountered while designing the system.
Supporting secure programming in web applications through interactive static analysis.
Zhu, Jun; Xie, Jing; Lipford, Heather Richter; Chu, Bill
2014-07-01
Many security incidents are caused by software developers' failure to adhere to secure programming practices. Static analysis tools have been used to detect software vulnerabilities. However, their wide usage by developers is limited by the special training required to write rules customized to application-specific logic. Our approach, interactive static analysis, integrates static analysis into the Integrated Development Environment (IDE) and provides in-situ secure programming support to help developers prevent vulnerabilities during code construction. No additional training is required, nor are there any assumptions on the way programs are built. Our work is motivated in part by the observation that many vulnerabilities are introduced due to failure to practice secure programming by knowledgeable developers. We implemented a prototype interactive static analysis tool as a plug-in for Java in Eclipse. Our technical evaluation of our prototype detected multiple zero-day vulnerabilities in a large open source project. Our evaluations also suggest that false positives may be limited to a very small class of use cases.
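For readers unfamiliar with the class of issue such tools flag, the sketch below contrasts an injectable query with a parameterized one. The prototype described above targets Java in Eclipse; this Python/sqlite3 example only illustrates the insecure versus secure pattern.

```python
# Illustrative contrast between an injectable and a parameterized SQL query.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_insecure(name):
    # Vulnerable: attacker-controlled input is concatenated into the SQL string.
    return conn.execute("SELECT role FROM users WHERE name = '" + name + "'").fetchall()

def find_user_secure(name):
    # Safe: a parameterized query keeps user data out of the SQL syntax.
    return conn.execute("SELECT role FROM users WHERE name = ?", (name,)).fetchall()

print(find_user_secure("alice"))
```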
Prior, Fred W; Erickson, Bradley J; Tarbox, Lawrence
2007-11-01
The cancer Biomedical Informatics Grid (caBIG) program was created by the National Cancer Institute to facilitate sharing of IT infrastructure, data, and applications among the National Cancer Institute-sponsored cancer research centers. The program was launched in February 2004 and now links more than 50 cancer centers. In April 2005, the In Vivo Imaging Workspace was added to promote the use of imaging in cancer clinical trials. At the inaugural meeting, four special interest groups (SIGs) were established. The Software SIG was charged with identifying projects that focus on open-source software for image visualization and analysis. To date, two projects have been defined by the Software SIG. The eXtensible Imaging Platform project has produced a rapid application development environment that researchers may use to create targeted workflows customized for specific research projects. The Algorithm Validation Tools project will provide a set of tools and data structures that will be used to capture measurement information and the associated data needed to define a gold standard for a given database against which change-analysis algorithms can be tested. Through these and future efforts, the caBIG In Vivo Imaging Workspace Software SIG endeavors to advance imaging informatics and provide new open-source software tools to advance cancer research.
Sato, Kuniya; Ooba, Masahiro; Takagi, Tomohiko; Furukawa, Zengo; Komiya, Seiichi; Yaegashi, Rihito
2013-12-01
In agile software development, requirements are gathered through direct discussion between customers and the development staff at each iteration, and the customers evaluate the appropriateness of each requirement. If the customers divide a complicated requirement into individual requirements, the engineers in charge of software development can understand it more easily; this is called division of requirement. However, customers often do not know how far, or how, to divide a requirement. This paper proposes a method for dividing a complicated requirement into individual requirements and describes the development of a requirement specification editor that can express the individual requirements, so that the engineers in charge of software development can understand them easily.
Programs Model the Future of Air Traffic Management
NASA Technical Reports Server (NTRS)
2010-01-01
Through Small Business Innovation Research (SBIR) contracts with Ames Research Center, Intelligent Automation Inc., based in Rockville, Maryland, advanced specialized software the company had begun developing with U.S. Department of Defense funding. The agent-based infrastructure now allows NASA's Airspace Concept Evaluation System to explore ways of improving the utilization of the National Airspace System (NAS), providing flexible modeling of every part of the NAS down to individual planes, airports, control centers, and even weather. The software has been licensed to a number of aerospace and robotics customers, and has even been used to model the behavior of crowds.
Features of commercial computer software systems for medical examiners and coroners.
Hanzlick, R L; Parrish, R G; Ing, R
1993-12-01
There are many ways of automating medical examiner and coroner offices, one of which is to purchase commercial software products specifically designed for death investigation. We surveyed four companies that offer such products and requested information regarding each company and its hardware, software, operating systems, peripheral devices, applications, networking options, programming language, querying capability, coding systems, prices, customer support, and number and size of offices using the product. Although the four products (CME2, ForenCIS, InQuest, and Medical Examiner's Software System) are similar in many respects and each can be installed on personal computers, there are differences among the products with regard to cost, applications, and the other features. Death investigators interested in office automation should explore these products to determine the usefulness of each in comparison with the others and in comparison with general-purpose, off-the-shelf databases and software adaptable to death investigation needs.
Transportable Applications Environment Plus, Version 5.1
NASA Technical Reports Server (NTRS)
1994-01-01
Transportable Applications Environment Plus (TAE+) computer program providing integrated, portable programming environment for developing and running application programs based on interactive windows, text, and graphical objects. Enables both programmers and nonprogrammers to construct own custom application interfaces easily and to move interfaces and application programs to different computers. Used to define corporate user interface, with noticeable improvements in application developer's and end user's learning curves. Main components are: WorkBench, What You See Is What You Get (WYSIWYG) software tool for design and layout of user interface; and WPT (Window Programming Tools) Package, set of callable subroutines controlling user interface of application program. WorkBench and WPTs are written in C++, and remaining code is written in C.
Strengthening Software Authentication with the ROSE Software Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, G
2006-06-15
Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden "backdoors" is crucial to a project's success. In this context, "authentication" is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, C++ languages and with current collaborations to support Fortran90). We propose to extend ROSE to address a number of security-specific requirements, and apply it to software authentication for nonproliferation and arms control projects.
Ephus: Multipurpose Data Acquisition Software for Neuroscience Experiments
Suter, Benjamin A.; O'Connor, Timothy; Iyer, Vijay; Petreanu, Leopoldo T.; Hooks, Bryan M.; Kiritani, Taro; Svoboda, Karel; Shepherd, Gordon M. G.
2010-01-01
Physiological measurements in neuroscience experiments often involve complex stimulus paradigms and multiple data channels. Ephus (http://www.ephus.org) is an open-source software package designed for general-purpose data acquisition and instrument control. Ephus operates as a collection of modular programs, including an ephys program for standard whole-cell recording with single or multiple electrodes in typical electrophysiological experiments, and a mapper program for synaptic circuit mapping experiments involving laser scanning photostimulation based on glutamate uncaging or channelrhodopsin-2 excitation. Custom user functions allow user-extensibility at multiple levels, including on-line analysis and closed-loop experiments, where experimental parameters can be changed based on recently acquired data, such as during in vivo behavioral experiments. Ephus is compatible with a variety of data acquisition and imaging hardware. This paper describes the main features and modules of Ephus and their use in representative experimental applications. PMID:21960959
The NASA computer aided design and test system
NASA Technical Reports Server (NTRS)
Gould, J. M.; Juergensen, K.
1973-01-01
A family of computer programs facilitating the design, layout, evaluation, and testing of digital electronic circuitry is described. CADAT (computer aided design and test system) is intended for use by NASA and its contractors and is aimed predominantly at providing cost effective microelectronic subsystems based on custom designed metal oxide semiconductor (MOS) large scale integrated circuits (LSIC's). CADAT software can be easily adopted by installations with a wide variety of computer hardware configurations. Its structure permits ease of update to more powerful component programs and to newly emerging LSIC technologies. The components of the CADAT system are described stressing the interaction of programs rather than detail of coding or algorithms. The CADAT system provides computer aids to derive and document the design intent, includes powerful automatic layout software, permits detailed geometry checks and performance simulation based on mask data, and furnishes test pattern sequences for hardware testing.
Implementation of a data management software system for SSME test history data
NASA Technical Reports Server (NTRS)
Abernethy, Kenneth
1986-01-01
The implementation of a software system for managing Space Shuttle Main Engine (SSME) test/flight historical data is presented. The software system uses the database management system RIM7 for primary data storage and routine data management, but includes several FORTRAN programs, described here, which provide customized access to the RIM7 database. The consolidation, modification, and transfer of data from the database THIST to the RIM7 database THISRM is discussed. The RIM7 utility modules for generating some standard reports from THISRM and performing some routine updating and maintenance are briefly described. The FORTRAN accessing programs described include programs for initial loading of large data sets into the database, capturing data from files for database inclusion, and producing specialized statistical reports which cannot be provided by the RIM7 report generator utility. An expert system tutorial, constructed using the expert system shell product INSIGHT2, is described. Finally, a potential expert system, which would analyze data in the database, is outlined. This system could use INSIGHT2 as well and would take advantage of RIM7's compatibility with the microcomputer database system RBase 5000.
Bridging the Particle Physics and Big Data Worlds
NASA Astrophysics Data System (ADS)
Pivarski, James
2017-09-01
For decades, particle physicists have developed custom software because the scale and complexity of our problems were unique. In recent years, however, the ``big data'' industry has begun to tackle similar problems, and has developed some novel solutions. Incorporating scientific Python libraries, Spark, TensorFlow, and machine learning tools into the physics software stack can improve abstraction, reliability, and in some cases performance. Perhaps more importantly, it can free physicists to concentrate on domain-specific problems. Building bridges isn't always easy, however. Physics software and open-source software from industry differ in many incidental ways and a few fundamental ways. I will show work from the DIANA-HEP project to streamline data flow from ROOT to Numpy and Spark, to incorporate ideas of functional programming into histogram aggregation, and to develop real-time, query-style manipulations of particle data.
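The idea of treating histogram filling as a functional aggregation can be sketched in a few lines of plain NumPy (this is only an illustration of the concept, not the DIANA-HEP libraries themselves): partial histograms are produced per chunk of events and combined with an associative sum, which is what makes the operation easy to distribute.

```python
# Functional-style histogram aggregation over chunks of events (illustrative).
import numpy as np
from functools import reduce

BINS = np.linspace(0.0, 100.0, 51)   # 50 bins, e.g. for a transverse-momentum spectrum

def fill(chunk):
    counts, _ = np.histogram(chunk, bins=BINS)
    return counts

def combine(h1, h2):
    # Associative and commutative, so partial results can be merged in any order.
    return h1 + h2

# Simulated event chunks standing in for data read from ROOT files.
chunks = [np.random.exponential(scale=20.0, size=10_000) for _ in range(8)]
total = reduce(combine, map(fill, chunks))
print(total.sum(), "entries aggregated into", len(total), "bins")
```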
DOE Office of Scientific and Technical Information (OSTI.GOV)
HUBER, J.H.
An Enraf Densitometer is installed on tank 241-AY-102. The Densitometer will frequently be tasked to obtain and log density profiles. The activity can be effected in a number of ways. Enraf Incorporated provides a software package called "Logger18" to its customers for the purpose of in-shop testing of their gauges. Logger18 is capable of accepting an input file which can direct the gauge to obtain a density profile for a given tank level and bottom limit. Logger18 is a complex, DOS-based program which will require trained technicians and/or tank farm entries to obtain the data. ALARA considerations have prompted the development of a more user-friendly, computer-based interface to the Enraf densitometers. This document records the plan by which this new Enraf data acquisition software will be developed, reviewed, verified, and released. This plan applies to the development and implementation of a one-time-use software program, which will be called "Enraf Control Panel." The software will be primarily used for remote operation of Enraf Densitometers for the purpose of obtaining and logging tank product density profiles.
National Water-Quality Assessment (NAWQA) Area-Characterization Toolbox
Price, Curtis
2010-01-01
This is release 1.0 of the National Water-Quality Assessment (NAWQA) Area-Characterization Toolbox. These tools are designed to be accessed using ArcGIS Desktop software (versions 9.3 and 9.3.1). The toolbox is composed of a collection of custom tools that implement geographic information system (GIS) techniques used by the NAWQA Program to characterize aquifer areas, drainage basins, and sampled wells.
Development and Testing of Control Laws for the Active Aeroelastic Wing Program
NASA Technical Reports Server (NTRS)
Dibley, Ryan P.; Allen, Michael J.; Clarke, Robert; Gera, Joseph; Hodgkinson, John
2005-01-01
The Active Aeroelastic Wing research program was a joint program between the U.S. Air Force Research Laboratory and NASA established to investigate the characteristics of an aeroelastic wing and the technique of using wing twist for roll control. The flight test program employed an F/A-18 aircraft modified by reducing the wing torsional stiffness and adding a custom research flight control system. The research flight control system was optimized to maximize roll rate using only wing surfaces to twist the wing while simultaneously maintaining design load limits, stability margins, and handling qualities. NASA Dryden Flight Research Center developed control laws using the software design tool called CONDUIT, which employs a multi-objective function optimization to tune selected control system design parameters. Modifications were made to the Active Aeroelastic Wing implementation in this new software design tool to incorporate the NASA Dryden Flight Research Center nonlinear F/A-18 simulation for time history analysis. This paper describes the design process, including how the control law requirements were incorporated into constraints for the optimization of this specific software design tool. Predicted performance is also compared to results from flight.
2009-11-12
Cloud computing types, classified by type of capability and by access, include Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS), and Software-as-a-Service (SaaS). Software-as-a-Service (SaaS) provides application-specific capabilities, e.g., a service that provides customer management, and allows organizations... Software as a Service (SaaS) is a model of software deployment in which a provider licenses an application to customers for use as a service on
Approaches to Linked Open Data at data.oceandrilling.org
NASA Astrophysics Data System (ADS)
Fils, D.
2012-12-01
The data.oceandrilling.org web application applies Linked Open Data (LOD) patterns to expose Deep Sea Drilling Project (DSDP), Ocean Drilling Program (ODP) and Integrated Ocean Drilling Program (IODP) data. Ocean drilling data is represented in a rich range of data formats: high resolution images, file based data sets and sample based data. This richness of data types has been well met by semantic approaches and will be demonstrated. Data has been extracted from CSV, HTML and RDBMS through custom software and existing packages for loading into a SPARQL 1.1 compliant triple store. Practices have been developed to streamline the maintenance of the RDF graphs and properly expose them using LOD approaches like VoID and HTML embedded structured data. Custom and existing vocabularies are used to allow semantic relations between resources. Use of the W3C draft RDF Data Cube Vocabulary and other approaches for encoding time scales, taxonomic fossil data and other graphs will be shown. A software layer written in Google Go mediates the RDF to web pipeline. The approach used is general and can be applied to other similar environments like node.js or Python Twisted. To facilitate communication, user-interface software libraries such as D3 and packages such as S2S and LodLive have been used. Additionally, OpenSearch APIs, structured data in HTML and SPARQL endpoints provide various access methods for applications. data.oceandrilling.org is not viewed as a web site but as an application that communicates with a range of clients. This approach helps guide the development more along software practices than along web site authoring approaches.
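A client of such an endpoint might look like the sketch below, which uses the SPARQLWrapper package to run a query and iterate over the JSON bindings. The endpoint URL and the trivial query are assumptions for illustration; consult the site for its actual endpoint and vocabularies.

```python
# Minimal SPARQL client sketch (endpoint URL is a hypothetical example).
from SPARQLWrapper import SPARQLWrapper, JSON

endpoint = SPARQLWrapper("http://data.oceandrilling.org/sparql")  # assumed URL
endpoint.setQuery("""
    SELECT ?s ?p ?o
    WHERE { ?s ?p ?o }
    LIMIT 10
""")
endpoint.setReturnFormat(JSON)
results = endpoint.query().convert()

# Print each returned triple.
for binding in results["results"]["bindings"]:
    print(binding["s"]["value"], binding["p"]["value"], binding["o"]["value"])
```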
Judicious use of custom development in an open source component architecture
NASA Astrophysics Data System (ADS)
Bristol, S.; Latysh, N.; Long, D.; Tekell, S.; Allen, J.
2014-12-01
Modern software engineering is not as much programming from scratch as innovative assembly of existing components. Seamlessly integrating disparate components into scalable, performant architecture requires sound engineering craftsmanship and can often result in increased cost efficiency and accelerated capabilities if software teams focus their creativity on the edges of the problem space. ScienceBase is part of the U.S. Geological Survey scientific cyberinfrastructure, providing data and information management, distribution services, and analysis capabilities in a way that strives to follow this pattern. ScienceBase leverages open source NoSQL and relational databases, search indexing technology, spatial service engines, numerous libraries, and one proprietary but necessary software component in its architecture. The primary engineering focus is cohesive component interaction, including construction of a seamless Application Programming Interface (API) across all elements. The API allows researchers and software developers alike to leverage the infrastructure in unique, creative ways. Scaling the ScienceBase architecture and core API with increasing data volume (more databases) and complexity (integrated science problems) is a primary challenge addressed by judicious use of custom development in the component architecture. Other data management and informatics activities in the earth sciences have independently resolved to a similar design of reusing and building upon established technology and are working through similar issues for managing and developing information (e.g., U.S. Geoscience Information Network; NASA's Earth Observing System Clearing House; GSToRE at the University of New Mexico). Recent discussions facilitated through the Earth Science Information Partners are exploring potential avenues to exploit the implicit relationships between similar projects for explicit gains in our ability to more rapidly advance global scientific cyberinfrastructure.
Clos, Lawrence J; Jofre, M Fransisca; Ellinger, James J; Westler, William M; Markley, John L
2013-06-01
To facilitate the high-throughput acquisition of nuclear magnetic resonance (NMR) experimental data on large sets of samples, we have developed a simple and straightforward automated methodology that capitalizes on recent advances in Bruker BioSpin NMR spectrometer hardware and software. Given the daunting challenge for non-NMR experts to collect quality spectra, our goal was to increase user accessibility, provide customized functionality, and improve the consistency and reliability of resultant data. This methodology, NMRbot, is encoded in a set of scripts written in the Python programming language accessible within the Bruker BioSpin TopSpin ™ software. NMRbot improves automated data acquisition and offers novel tools for use in optimizing experimental parameters on the fly. This automated procedure has been successfully implemented for investigations in metabolomics, small-molecule library profiling, and protein-ligand titrations on four Bruker BioSpin NMR spectrometers at the National Magnetic Resonance Facility at Madison. The investigators reported benefits from ease of setup, improved spectral quality, convenient customizations, and overall time savings.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Minana, Molly A.; Sturtevant, Judith E.; Heaphy, Robert
2005-01-01
The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in DOE/AL Quality Criteria (QC-1) as conformance to customer requirements and expectations. This quality plan defines the ASC program software quality practices and provides mappings of these practices to the SNL Corporate Process Requirements (CPR 1.3.2 and CPR 1.3.6) and the Department of Energy (DOE) document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines (GP&G). This quality plan identifies ASC management and software project teams' responsibilities for cost-effective software engineering quality practices. The SNL ASC Software Quality Plan establishes the signatories' commitment to improving software products by applying cost-effective software engineering quality practices. This document explains the project teams' opportunities for tailoring and implementing the practices; enumerates the practices that compose the development of SNL ASC's software products; and includes a sample assessment checklist that was developed based upon the practices in this document.
The National Map Customer Requirements: Findings from Interviews and Surveys
Sugarbaker, Larry; Coray, Kevin E.; Poore, Barbara
2009-01-01
The purpose of this study was to receive customer feedback and to understand data and information requirements for The National Map. This report provides results and findings from interviews and surveys and will guide policy and operations decisions about data and information requirements leading to the development of a 5-year strategic plan for the National Geospatial Program. These findings are based on feedback from approximately 2,200 customers between February and August 2008. The U.S. Geological Survey (USGS) conducted more than 160 interviews with 200 individuals. The American Society for Photogrammetry and Remote Sensing (ASPRS) and the International Map Trade Association (IMTA) surveyed their memberships and received feedback from over 400 members. The Environmental Systems Research Institute (ESRI) received feedback from over 1,600 of its U.S.-based software users through an online survey sent to customers attending the ESRI International User Conference in the summer of 2008. The results of these surveys were shared with the USGS and have been included in this report.
A general UNIX interface for biocomputing and network information retrieval software.
Kiong, B K; Tan, T W
1993-10-01
We describe a UNIX program, HYBROW, which can integrate without modification a wide range of UNIX biocomputing and network information retrieval software. HYBROW works in conjunction with a separate set of ASCII files containing embedded hypertext-like links. The program operates like a hypertext browser featuring five basic links: file link, execute-only link, execute-display link, directory-browse link and field-filling link. Useful features of the interface may be developed using combinations of these links with simple shell scripts and examples of these are briefly described. The system manager who supports biocomputing users should find the program easy to maintain, and useful in assisting new and infrequent users; it is also simple to incorporate new programs. Moreover, the individual user can customize the interface, create dynamic menus, hypertext a document, invoke shell scripts and new programs simply with a basic understanding of the UNIX operating system and any text editor. This program was written in C language and uses the UNIX curses and termcap libraries. It is freely available as a tar compressed file (by anonymous FTP from nuscc.nus.sg).
NASA Technical Reports Server (NTRS)
Hussey, K. J.; Hall, J. R.; Mortensen, R. A.
1986-01-01
Image processing methods and software used to animate nonimaging remotely sensed data on cloud cover are described. Three FORTRAN programs were written in the VICAR2/TAE image processing domain to perform 3D perspective rendering, to interactively select parameters controlling the projection, and to interpolate parameter sets for animation images between key frames. Operation of the 3D programs and transfer of the images to film are automated using executive control language and custom hardware to link the computer and camera.
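The key-frame interpolation step can be illustrated with a short Python sketch (the original programs were FORTRAN; the parameter names and values below are hypothetical): projection parameters set at a few key frames are linearly interpolated to produce a parameter set for every in-between frame.

```python
# Linear interpolation of projection parameters between key frames (illustrative).
import numpy as np

# Hypothetical parameter sets (azimuth_deg, elevation_deg, range_km) at key frames.
key_frames = [0, 30, 60]
key_params = np.array([
    [0.0, 25.0, 5000.0],
    [45.0, 35.0, 3500.0],
    [90.0, 30.0, 4000.0],
])

def interpolate_parameters(frame):
    """Return the interpolated (azimuth, elevation, range) for an in-between frame."""
    return np.array([np.interp(frame, key_frames, key_params[:, i])
                     for i in range(key_params.shape[1])])

for f in range(0, 61, 15):
    print(f, interpolate_parameters(f))
```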
Software Engineering Improvement Plan
NASA Technical Reports Server (NTRS)
2006-01-01
In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.
Software IV and V Research Priorities and Applied Program Accomplishments Within NASA
NASA Technical Reports Server (NTRS)
Blazy, Louis J.
2000-01-01
The mission of this research is to be world-class creators and facilitators of innovative, intelligent, high performance, reliable information technologies that enable NASA missions to (1) increase software safety and quality through error avoidance, early detection and resolution of errors, by utilizing and applying empirically based software engineering best practices; (2) ensure customer software risks are identified and/or that requirements are met and/or exceeded; (3) research, develop, apply, verify, and publish software technologies for competitive advantage and the advancement of science; and (4) facilitate the transfer of science and engineering data, methods, and practices to NASA, educational institutions, state agencies, and commercial organizations. The goals are to become a national Center Of Excellence (COE) in software and system independent verification and validation, and to become an international leading force in the field of software engineering for improving the safety, quality, reliability, and cost performance of software systems. This project addresses the following problems: Ensure safety of NASA missions, ensure requirements are met, minimize programmatic and technological risks of software development and operations, improve software quality, reduce costs and time to delivery, and improve the science of software engineering
Support for Diagnosis of Custom Computer Hardware
NASA Technical Reports Server (NTRS)
Molock, Dwaine S.
2008-01-01
The Coldfire SDN Diagnostics software is a flexible means of exercising, testing, and debugging custom computer hardware. The software is a set of routines that, collectively, serve as a common software interface through which one can gain access to various parts of the hardware under test and/or cause the hardware to perform various functions. The routines can be used to construct tests to exercise, and verify the operation of, various processors and hardware interfaces. More specifically, the software can be used to gain access to memory, to execute timer delays, to configure interrupts, and configure processor cache, floating-point, and direct-memory-access units. The software is designed to be used on diverse NASA projects, and can be customized for use with different processors and interfaces. The routines are supported, regardless of the architecture of a processor that one seeks to diagnose. The present version of the software is configured for Coldfire processors on the Subsystem Data Node processor boards of the Solar Dynamics Observatory. There is also support for the software with respect to Mongoose V, RAD750, and PPC405 processors or their equivalents.
Experience with custom processors in space flight applications
NASA Technical Reports Server (NTRS)
Fraeman, M. E.; Hayes, J. R.; Lohr, D. A.; Ballard, B. W.; Williams, R. L.; Henshaw, R. M.
1991-01-01
The Applied Physics Laboratory (APL) has developed a magnetometer instrument for a Swedish satellite named Freja with launch scheduled for August 1992 on a Chinese Long March rocket. The magnetometer controller utilized a custom microprocessor designed at APL with the Genesil silicon compiler. The processor evolved from our experience with an older bit-slice design and two prior single-chip efforts. The architecture of our microprocessor greatly lowered software development costs because it was optimized to provide an interactive and extensible programming environment hosted by the target hardware. Radiation tolerance of the microprocessor was also tested and was adequate for Freja's mission -- 20 kRad(Si) total dose and very infrequent latch-up and single-event upsets.
ESnet authentication services and trust federations
NASA Astrophysics Data System (ADS)
Muruganantham, Dhivakaran; Helm, Mike; Genovese, Tony
2005-01-01
ESnet provides authentication services and trust federation support for SciDAC projects, collaboratories, and other distributed computing applications. The ESnet ATF team operates the DOEGrids Certificate Authority, available to all DOE Office of Science programs, plus several custom CAs, including one for the National Fusion Collaboratory and one for NERSC. The secure hardware and software environment developed to support CAs is suitable for supporting additional custom authentication and authorization applications that your program might require. Seamless, secure interoperation across organizational and international boundaries is vital to collaborative science. We are fostering the development of international PKI federations by founding the TAGPMA, the American regional PMA, and the worldwide IGTF Policy Management Authority (PMA), as well as participating in European and Asian regional PMAs. We are investigating and prototyping distributed authentication technology that will allow us to support the "roaming scientist" (distributed wireless via eduroam), as well as more secure authentication methods (one-time password tokens).
QuantWorm: a comprehensive software package for Caenorhabditis elegans phenotypic assays.
Jung, Sang-Kyu; Aleman-Meza, Boanerges; Riepe, Celeste; Zhong, Weiwei
2014-01-01
Phenotypic assays are crucial in genetics; however, traditional methods that rely on human observation are unsuitable for quantitative, large-scale experiments. Furthermore, there is an increasing need for comprehensive analyses of multiple phenotypes to provide multidimensional information. Here we developed an automated, high-throughput computer imaging system for quantifying multiple Caenorhabditis elegans phenotypes. Our imaging system is composed of a microscope equipped with a digital camera and a motorized stage connected to a computer running the QuantWorm software package. Currently, the software package contains one data acquisition module and four image analysis programs: WormLifespan, WormLocomotion, WormLength, and WormEgg. The data acquisition module collects images and videos. The WormLifespan software counts the number of moving worms by using two time-lapse images; the WormLocomotion software computes the velocity of moving worms; the WormLength software measures worm body size; and the WormEgg software counts the number of eggs. To evaluate the performance of our software, we compared the results of our software with manual measurements. We then demonstrated the application of the QuantWorm software in a drug assay and a genetic assay. Overall, the QuantWorm software provided accurate measurements at a high speed. Software source code, executable programs, and sample images are available at www.quantworm.org. Our software package has several advantages over current imaging systems for C. elegans. It is an all-in-one package for quantifying multiple phenotypes. The QuantWorm software is written in Java and its source code is freely available, so it does not require use of commercial software or libraries. It can be run on multiple platforms and easily customized to cope with new methods and requirements.
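The general approach of counting moving animals from two time-lapse frames (difference, threshold, connected components) can be sketched as follows. QuantWorm itself is written in Java; this NumPy/SciPy version only demonstrates the idea, and the thresholds are arbitrary example values.

```python
# Illustrative sketch: count moving objects from two grayscale time-lapse frames.
import numpy as np
from scipy import ndimage

def count_moving_objects(frame1, frame2, diff_threshold=30, min_pixels=20):
    """frame1/frame2: 2-D grayscale arrays taken a fixed interval apart."""
    diff = np.abs(frame1.astype(np.int32) - frame2.astype(np.int32))
    mask = diff > diff_threshold                  # pixels that changed between frames
    labeled, n_regions = ndimage.label(mask)      # connected components of change
    sizes = ndimage.sum(mask, labeled, index=range(1, n_regions + 1))
    return int(np.sum(np.asarray(sizes) >= min_pixels))  # ignore tiny noise blobs

# Example with synthetic frames:
a = np.zeros((100, 100), dtype=np.uint8)
b = a.copy()
b[40:50, 40:60] = 200   # a "worm" that appeared or moved between frames
print(count_moving_objects(a, b))   # -> 1
```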
Hypoxia, Monitoring, and Mitigation System
2015-08-01
Abbreviations defined in the report include: Pulse-Oximeter (oxygen saturation measurement), SRS (Software Requirements Specification), SW (Software), TI (Texas Instruments), uPROC (Micro-Processor), USAARL... (Financial) Figure 1: Pulse OX custom module... Tasks 3, 4 and 5 have not been exercised. Sensor definition testing continued on the custom pulse-ox design. Additional refinement on the pulse
Parallelizing Data-Centric Programs
2013-09-25
results than current techniques, such as ImageWebs [HGO+10], given the same budget of matches performed. (Section 4.2, Scalable Parallel Similarity Search) The work... algorithms. (Section 5, Data-Driven Applications in the Cloud) In this project, we investigated what happens when data-centric software is moved from expensive custom... returns appropriate answer tuples. Figure 9(b) shows the mutual constraint satisfaction that takes place in answering for 122. The intent is that
Extending Cross-Generational Knowledge Flow Research in Edge Organizations
2008-06-01
letting Protégé generate the basic user interface, and then gradually write widgets and plug-ins to customize its look-and-feel and behavior. ... 2007a) focused on cross-generational knowledge flows in edge organizations. We found that cross-generational biases affect tacit knowledge transfer... the software engineering field, many matured methodologies already exist, such as Rational Unified Process (Hunt, 2003) or Extreme Programming (Beck
Map_plot and bgg_plot: software for integration of geoscience datasets
NASA Astrophysics Data System (ADS)
Gaillot, Philippe; Punongbayan, Jane T.; Rea, Brice
2004-02-01
Since 1985, the Ocean Drilling Program (ODP) has been supporting multidisciplinary research in exploring the structure and history of Earth beneath the oceans. After more than 200 Legs, complementary datasets covering different geological environments, periods and space scales have been obtained and distributed world-wide using the ODP-Janus and Lamont Doherty Earth Observatory-Borehole Research Group (LDEO-BRG) database servers. In Earth Sciences, more than in any other science, the ensemble of these data is characterized by heterogeneous formats and graphical representation modes. In order to fully and quickly assess this information, a set of Unix/Linux and Generic Mapping Tool-based C programs has been designed to convert and integrate datasets acquired during the present ODP and the future Integrated ODP (IODP) Legs. Using ODP Leg 199 datasets, we show examples of the capabilities of the proposed programs. The program map_plot is used to easily display datasets onto 2-D maps. The program bgg_plot (borehole geology and geophysics plot) displays data with respect to depth and/or time. The latter program includes depth shifting, filtering and plotting of core summary information, continuous and discrete-sample core measurements (e.g. physical properties, geochemistry, etc.), in situ continuous logs, magneto- and bio-stratigraphies, specific sedimentological analyses (lithology, grain size, texture, porosity, etc.), as well as core and borehole wall images. Outputs from both programs are initially produced in PostScript format, which can be easily converted to Portable Document Format (PDF) or standard image formats (GIF, JPEG, etc.) using widely distributed conversion programs. Based on command-line operations and customization of parameter files, these programs can be included in other shell- or database-scripts, automating plotting procedures of data requests. As open-source software, these programs can be customized and interfaced to fulfill any specific plotting need of geoscientists using ODP-like datasets.
GfaPy: a flexible and extensible software library for handling sequence graphs in Python.
Gonnella, Giorgio; Kurtz, Stefan
2017-10-01
GFA 1 and GFA 2 are recently defined formats for representing sequence graphs, such as assembly, variation or splicing graphs. The formats are adopted by several software tools. Here, we present GfaPy, a software package for creating, parsing and editing GFA graphs using the programming language Python. GfaPy supports GFA 1 and GFA 2, using the same interface and allows for interconversion between both formats. The software package provides a simple interface for custom record types, which is an important new feature of GFA 2 (compared to GFA 1). This enables new applications of the format. GfaPy is available open source at https://github.com/ggonnella/gfapy and installable via pip. gonnella@zbh.uni-hamburg.de. Supplementary data are available at Bioinformatics online.
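A brief usage sketch follows, assuming GfaPy's documented from_file constructor and segments accessor (check the GfaPy documentation for the exact API of the version you install). The GFA content shown is made up for illustration.

```python
# Write a tiny GFA 1 graph to disk, then parse it with GfaPy (illustrative).
import gfapy

with open("toy.gfa", "w") as fh:
    fh.write("H\tVN:Z:1.0\n")
    fh.write("S\tA\tACGTACGT\n")
    fh.write("S\tB\tTTGCA\n")
    fh.write("L\tA\t+\tB\t+\t4M\n")

graph = gfapy.Gfa.from_file("toy.gfa")
print(graph.version)               # reported GFA version
print(len(graph.segments))         # 2 segments
for segment in graph.segments:
    print(str(segment))            # the original S lines
```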
ATK-ForceField: a new generation molecular dynamics software package
NASA Astrophysics Data System (ADS)
Schneider, Julian; Hamaekers, Jan; Chill, Samuel T.; Smidstrup, Søren; Bulin, Johannes; Thesen, Ralph; Blom, Anders; Stokbro, Kurt
2017-12-01
ATK-ForceField is a software package for atomistic simulations using classical interatomic potentials. It is implemented as a part of the Atomistix ToolKit (ATK), which is a Python programming environment that makes it easy to create and analyze both standard and highly customized simulations. This paper will focus on the atomic interaction potentials, molecular dynamics, and geometry optimization features of the software; however, many more advanced modeling features are available. The implementation details of these algorithms and their computational performance will be shown. We present three illustrative examples of the types of calculations that are possible with ATK-ForceField: modeling thermal transport properties in a silicon germanium crystal, vapor deposition of selenium molecules on a selenium surface, and a simulation of creep in a copper polycrystal.
Towards a Methodology for Identifying Program Constraints During Requirements Analysis
NASA Technical Reports Server (NTRS)
Romo, Lilly; Gates, Ann Q.; Della-Piana, Connie Kubo
1997-01-01
Requirements analysis is the activity that involves determining the needs of the customer, identifying the services that the software system should provide and understanding the constraints on the solution. The result of this activity is a natural language document, typically referred to as the requirements definition document. Some of the problems that exist in defining requirements in large-scale software projects include synthesizing knowledge from various domain experts and communicating this information across multiple levels of personnel. One approach that addresses part of this problem is called context monitoring and involves identifying the properties of and relationships between objects that the system will manipulate. This paper examines several software development methodologies, discusses the support that each provides for eliciting such information from experts and specifying the information, and suggests refinements to these methodologies.
System, methods and apparatus for program optimization for multi-threaded processor architectures
Bastoul, Cedric; Lethin, Richard A; Leung, Allen K; Meister, Benoit J; Szilagyi, Peter; Vasilache, Nicolas T; Wohlford, David E
2015-01-06
Methods, apparatus and computer software product for source code optimization are provided. In an exemplary embodiment, a first custom computing apparatus is used to optimize the execution of source code on a second computing apparatus. In this embodiment, the first custom computing apparatus contains a memory, a storage medium and at least one processor with at least one multi-stage execution unit. The second computing apparatus contains at least two multi-stage execution units that allow for parallel execution of tasks. The first custom computing apparatus optimizes the code for parallelism, locality of operations and contiguity of memory accesses on the second computing apparatus. This Abstract is provided for the sole purpose of complying with the Abstract requirement rules. This Abstract is submitted with the explicit understanding that it will not be used to interpret or to limit the scope or the meaning of the claims.
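One kind of locality transformation such an optimizer can apply is loop tiling. The sketch below shows a blocked matrix multiply that keeps sub-blocks resident in cache and keeps memory accesses contiguous; it illustrates the concept only and is not the patented optimization procedure itself.

```python
# Blocked (tiled) matrix multiply as an illustration of a locality optimization.
import numpy as np

def matmul_tiled(A, B, tile=32):
    n, k = A.shape
    k2, m = B.shape
    assert k == k2
    C = np.zeros((n, m), dtype=A.dtype)
    for i0 in range(0, n, tile):
        for j0 in range(0, m, tile):
            for p0 in range(0, k, tile):
                # Each small block fits in cache; accesses within it stay contiguous.
                C[i0:i0+tile, j0:j0+tile] += (
                    A[i0:i0+tile, p0:p0+tile] @ B[p0:p0+tile, j0:j0+tile]
                )
    return C

A = np.random.rand(128, 128)
B = np.random.rand(128, 128)
assert np.allclose(matmul_tiled(A, B), A @ B)
```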
Bottom-feeding for blockbuster businesses.
Rosenblum, David; Tomlinson, Doug; Scott, Larry
2003-03-01
Marketing experts tell companies to analyze their customer portfolios and weed out buyer segments that don't generate attractive returns. Loyalty experts stress the need to aim retention programs at "good" customers--profitable ones--and encourage the "bad" ones to buy from competitors. And customer-relationship-management software provides ever more sophisticated ways to identify and eliminate poorly performing customers. On the surface, the movement to banish unprofitable customers seems reasonable. But writing off a customer relationship simply because it is currently unprofitable is at best rash and at worst counterproductive. Executives shouldn't be asking themselves, How can we shun unprofitable customers? They need to ask, How can we make money off the customers that everyone else is shunning? When you look at apparently unattractive segments through this lens, you often see opportunities to serve those segments in ways that fundamentally change customer economics. Consider Paychex, a payroll-processing company that built a nearly billion-dollar business by serving small companies. Established players had ignored these customers on the assumption that small companies couldn't afford the service. When founder Tom Golisano couldn't convince his bosses at Electronic Accounting Systems that they were missing a major opportunity, he started a company that now serves 390,000 U.S. customers, each employing around 14 people. In this article, the authors look closely at bottom-feeders--companies that assessed the needs of supposedly unattractive customers and redesigned their business models to turn a profit by fulfilling those needs. And they offer lessons other executives can use to do the same.
Sun, Ryan; Bouchard, Matthew B.; Hillman, Elizabeth M. C.
2010-01-01
Camera-based in-vivo optical imaging can provide detailed images of living tissue that reveal structure, function, and disease. High-speed, high resolution imaging can reveal dynamic events such as changes in blood flow and responses to stimulation. Despite these benefits, commercially available scientific cameras rarely include software that is suitable for in-vivo imaging applications, making this highly versatile form of optical imaging challenging and time-consuming to implement. To address this issue, we have developed a novel, open-source software package to control high-speed, multispectral optical imaging systems. The software integrates a number of modular functions through a custom graphical user interface (GUI) and provides extensive control over a wide range of inexpensive IEEE 1394 Firewire cameras. Multispectral illumination can be incorporated through the use of off-the-shelf light emitting diodes which the software synchronizes to image acquisition via a programmed microcontroller, allowing arbitrary high-speed illumination sequences. The complete software suite is available for free download. Here we describe the software’s framework and provide details to guide users with development of this and similar software. PMID:21258475
Guzman, Jessica; Lee, Elizabeth; Draper, David; Valivullah, Zaheer; Yu, Guoyun; Sincan, Murat; Gahl, William A.; Adams, David R.
2015-01-01
The Undiagnosed Diseases Program (UDP) was started in 2008 with the goals of making diagnoses and facilitating related translational research. The individuals and families seen by the UDP are often unique and medically complex. Approximately 40% of UDP cases are pediatric. The Undiagnosed Diseases Program Integrated Collaboration System (UDPICS) was designed to create a collaborative workspace for researchers, clinicians and families. We describe our progress in developing the system to date, focusing on design rationale, challenges and issues that are likely to be common in the development of similar systems in the future. PMID:27417368
Collaboration, Communication and Co-ordination in Agile Software Development Practice
NASA Astrophysics Data System (ADS)
Robinson, Hugh; Sharp, Helen
This chapter analyses the results of a series of observational studies of agile software development practice.
Educational Software Acquisition for Microcomputers.
ERIC Educational Resources Information Center
Erikson, Warren; Turban, Efraim
1985-01-01
Examination of issues involved in acquiring appropriate microcomputer software for higher education focuses on the following points: developing your own software; finding commercially available software; using published evaluations; pre-purchase testing; customizing and adapting commercial software; post-purchase testing; and software use. A…
Chew, Avenell L.; Lamey, Tina; McLaren, Terri; De Roach, John
2016-01-01
Purpose To present en face optical coherence tomography (OCT) images generated by graph-search theory algorithm-based custom software and examine correlation with other imaging modalities. Methods En face OCT images derived from high density OCT volumetric scans of 3 healthy subjects and 4 patients using a custom algorithm (graph-search theory) and commercial software (Heidelberg Eye Explorer software (Heidelberg Engineering)) were compared and correlated with near infrared reflectance, fundus autofluorescence, adaptive optics flood-illumination ophthalmoscopy (AO-FIO) and microperimetry. Results Commercial software was unable to generate accurate en face OCT images in eyes with retinal pigment epithelium (RPE) pathology due to segmentation error at the level of Bruch’s membrane (BM). Accurate segmentation of the basal RPE and BM was achieved using custom software. The en face OCT images from eyes with isolated interdigitation or ellipsoid zone pathology were of similar quality between custom software and Heidelberg Eye Explorer software in the absence of any other significant outer retinal pathology. En face OCT images demonstrated angioid streaks, lesions of acute macular neuroretinopathy, hydroxychloroquine toxicity and Bietti crystalline deposits that correlated with other imaging modalities. Conclusions Graph-search theory algorithm helps to overcome the limitations of outer retinal segmentation inaccuracies in commercial software. En face OCT images can provide detailed topography of the reflectivity within a specific layer of the retina which correlates with other forms of fundus imaging. Our results highlight the need for standardization of image reflectivity to facilitate quantification of en face OCT images and longitudinal analysis. PMID:27959968
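As a hedged illustration of the en face principle described here (a minimal sketch, not the authors' graph-search segmentation code), an en face image can be produced by averaging the reflectivity of an OCT volume between two segmented surfaces; the volume and surfaces below are synthetic.

```python
import numpy as np

# Synthetic OCT volume: (B-scans, A-scans, depth) reflectivity values
rng = np.random.default_rng(0)
volume = rng.random((64, 64, 256))

# Hypothetical segmented surfaces (depth index per A-scan), e.g. basal RPE and BM
top = np.full((64, 64), 180)      # upper boundary of the slab
bottom = np.full((64, 64), 200)   # lower boundary of the slab

def en_face(vol, top_surface, bottom_surface):
    """Average reflectivity between two segmented surfaces for each A-scan."""
    rows, cols, _ = vol.shape
    image = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            z0, z1 = int(top_surface[i, j]), int(bottom_surface[i, j])
            image[i, j] = vol[i, j, z0:z1].mean()
    return image

projection = en_face(volume, top, bottom)
print(projection.shape, projection.min(), projection.max())
```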
NASA Technical Reports Server (NTRS)
Lederer, S. M.; Hickson, P.; Cowardin, H. M.; Buckalew, B.; Frith, J.; Alliss, R.
2017-01-01
In June 2015, the construction of the Meter Class Autonomous Telescope was completed and MCAT saw the light of the stars for the first time. In 2017, MCAT was newly dedicated as the Eugene Stansbery-MCAT telescope by NASA's Orbital Debris Program Office (ODPO), in honor of his inspiration and dedication to this newest optical member of the NASA ODPO. Since that time, MCAT has viewed the skies with one engineering camera and two scientific cameras, and the ODPO optical team has begun the process of vetting the entire system. The full system vetting includes verification and validation of: (1) the hardware comprising the system (e.g. the telescopes and its instruments, the dome, weather systems, all-sky camera, FLIR cloud infrared camera, etc.), (2) the custom-written Observatory Control System (OCS) master software designed to autonomously control this complex system of instruments, each with its own control software, and (3) the custom written Orbital Debris Processing software for post-processing the data. ES-MCAT is now capable of autonomous observing to include Geosynchronous survey, TLE (Two-line element) tracking of individual catalogued debris at all orbital regimes (Low-Earth Orbit all the way to Geosynchronous (GEO) orbit), tracking at specified non-sidereal rates, as well as sidereal rates for proper calibration with standard stars. Ultimately, the data will be used for validation of NASA's Orbital Debris Engineering Model, ORDEM, which aids in engineering designs of spacecraft that require knowledge of the orbital debris environment and long-term risks for collisions with Resident Space Objects (RSOs).
Steady-State Cycle Deck Launcher Developed for Numerical Propulsion System Simulation
NASA Technical Reports Server (NTRS)
VanDrei, Donald E.
1997-01-01
One of the objectives of NASA's High Performance Computing and Communications Program's (HPCCP) Numerical Propulsion System Simulation (NPSS) is to reduce the time and cost of generating aerothermal numerical representations of engines, called customer decks. These customer decks, which are delivered to airframe companies by various U.S. engine companies, numerically characterize an engine's performance as defined by the particular U.S. airframe manufacturer. Until recently, all numerical models were provided with a Fortran-compatible interface in compliance with the Society of Automotive Engineers (SAE) document AS681F, and data communication was performed via a standard, labeled common structure in compliance with AS681F. Recently, the SAE committee began to develop a new standard: AS681G. AS681G addresses multiple language requirements for customer decks along with alternative data communication techniques. Along with the SAE committee, the NPSS Steady-State Cycle Deck project team developed a standard Application Program Interface (API) supported by a graphical user interface. This work will result in Aerospace Recommended Practice 4868 (ARP4868). The Steady-State Cycle Deck work was validated against the Energy Efficient Engine customer deck, which is publicly available. The Energy Efficient Engine wrapper was used not only to validate ARP4868 but also to demonstrate how to wrap an existing customer deck. The graphical user interface for the Steady-State Cycle Deck facilitates the use of the new standard and makes it easier to design and analyze a customer deck. This software was developed following I. Jacobson's Object-Oriented Design methodology and is implemented in C++. The AS681G standard will establish a common generic interface for U.S. engine companies and airframe manufacturers. This will lead to more accurate cycle models, quicker model generation, and faster validation leading to specifications. The standard will facilitate cooperative work between industry and NASA. The NPSS Steady-State Cycle Deck team released a batch version of the Steady-State Cycle Deck in March 1996. Version 1.1 was released in June 1996. During fiscal 1997, NPSS accepted enhancements and modifications to the Steady-State Cycle Deck launcher. Consistent with NPSS' commercialization plan, these modifications will be done by a third party that can provide long-term software support.
COSTMODL - AN AUTOMATED SOFTWARE DEVELOPMENT COST ESTIMATION TOOL
NASA Technical Reports Server (NTRS)
Roush, G. B.
1994-01-01
The cost of developing computer software consumes an increasing portion of many organizations' budgets. As this trend continues, the capability to estimate the effort and schedule required to develop a candidate software product becomes increasingly important. COSTMODL is an automated software development estimation tool which fulfills this need. Adapting COSTMODL to any organization's particular environment can yield significant reduction in the risk of cost overruns and failed projects. This user-customization capability is unmatched by any other available estimation tool. COSTMODL accepts a description of a software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of the defined set of development life-cycle phases. This is accomplished by the five cost estimation algorithms incorporated into COSTMODL: the NASA-developed KISS model; the Basic, Intermediate, and Ada COCOMO models; and the Incremental Development model. This choice affords the user the ability to handle project complexities ranging from small, relatively simple projects to very large projects. Unique to COSTMODL is the ability to redefine the life-cycle phases of development and the capability to display a graphic representation of the optimum organizational structure required to develop the subject project, along with required staffing levels and skills. The program is menu-driven and mouse sensitive with an extensive context-sensitive help system that makes it possible for a new user to easily install and operate the program and to learn the fundamentals of cost estimation without having prior training or separate documentation. The implementation of these functions, along with the customization feature, into one program makes COSTMODL unique within the industry. COSTMODL was written for IBM PC compatibles, and it requires Turbo Pascal 5.0 or later and Turbo Professional 5.0 for recompilation. An executable is provided on the distribution diskettes. COSTMODL requires 512K RAM. The standard distribution medium for COSTMODL is three 5.25 inch 360K MS-DOS format diskettes. The contents of the diskettes are compressed using the PKWARE archiving tools. The utility to unarchive the files, PKUNZIP.EXE, is included. COSTMODL was developed in 1991. IBM PC is a registered trademark of International Business Machines. Borland and Turbo Pascal are registered trademarks of Borland International, Inc. Turbo Professional is a trademark of TurboPower Software. MS-DOS is a registered trademark of Microsoft Corporation.
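For orientation, the Basic COCOMO model (one of the five estimators the abstract says COSTMODL incorporates) relates effort and schedule to code size with simple power laws. The sketch below uses the published organic-mode coefficients as an illustration; it is not COSTMODL's own calibration.

```python
def basic_cocomo(kloc, a=2.4, b=1.05, c=2.5, d=0.38):
    """Basic COCOMO, organic mode: effort in person-months, schedule in months."""
    effort = a * kloc ** b
    schedule = c * effort ** d
    avg_staff = effort / schedule
    return effort, schedule, avg_staff

effort, schedule, staff = basic_cocomo(32.0)  # a 32 KLOC project
print(f"effort ~ {effort:.1f} person-months, "
      f"schedule ~ {schedule:.1f} months, "
      f"average staffing ~ {staff:.1f} people")
```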
Technology: Making the Connections. Innovations in the Apparel Industry. Resources in Technology.
ERIC Educational Resources Information Center
Threlfall, K. Denise
1996-01-01
Describes the partnership between Levi Strauss & Co., the largest brand-name apparel manufacturer in the world, and Custom Clothing Technology, the developer of software to customize jeans for female customers. (JOW)
Brunner, J; Krummenauer, F; Lehr, H A
2000-04-01
Study end-points in microcirculation research are usually video-taped images rather than numeric computer print-outs. Analysis of these video-taped images for the quantification of microcirculatory parameters usually requires computer-based image analysis systems. Most software programs for image analysis are custom-made, expensive, and limited in their applicability to selected parameters and study end-points. We demonstrate herein that an inexpensive, commercially available computer software (Adobe Photoshop), run on a Macintosh G3 computer with inbuilt graphic capture board provides versatile, easy to use tools for the quantification of digitized video images. Using images obtained by intravital fluorescence microscopy from the pre- and postischemic muscle microcirculation in the skinfold chamber model in hamsters, Photoshop allows simple and rapid quantification (i) of microvessel diameters, (ii) of the functional capillary density and (iii) of postischemic leakage of FITC-labeled high molecular weight dextran from postcapillary venules. We present evidence of the technical accuracy of the software tools and of a high degree of interobserver reliability. Inexpensive commercially available imaging programs (i.e., Adobe Photoshop) provide versatile tools for image analysis with a wide range of potential applications in microcirculation research.
Flight Dynamics and Control of a Morphing UAV: Bio inspired by Natural Fliers
2017-02-17
Tornado, a Vortex Lattice Method (VLM) software package programmed in MATLAB, was used for aerodynamic prediction; it was selected for its fast solving time and its ability to be controlled through custom MATLAB scripts. Tornado models the wing as a thin sheet of discrete vortices and computes the pressure and force distributions around the wing.
Software Manages Documentation in a Large Test Facility
NASA Technical Reports Server (NTRS)
Gurneck, Joseph M.
2001-01-01
The 3MCS computer program assists an instrumentation engineer in performing the three essential functions of design, documentation, and configuration management of measurement and control systems in a large test facility. Services provided by 3MCS are acceptance of input from multiple engineers and technicians working at multiple locations; standardization of drawings; automated cross-referencing; identification of errors; listing of components and resources; downloading of test settings; and provision of information to customers.
A Case Study of Human-in-the-loop for Telescope Operation
2014-08-22
Camera control used the operator's preferred commercial camera control software (required for autofocus and advanced mount model configuration), and dome control used a custom Python program. For his overnight telescope shifts the operator was essentially self-taught, having learned on his personally owned telescope, which was a different model from the AFIT telescope.
A Discussion of the Software Quality Assurance Role
NASA Technical Reports Server (NTRS)
Kandt, Ronald Kirk
2010-01-01
The basic idea underlying this paper is that the conventional understanding of the role of a Software Quality Assurance (SQA) engineer is unduly limited. This is because few have asked who the customers of a SQA engineer are. Once you do this, you can better define what tasks a SQA engineer should perform, as well as identify the knowledge and skills that such a person should have. The consequence of doing this is that a SQA engineer can provide greater value to his or her customers. It is the position of this paper that a SQA engineer providing significant value to his or her customers must not only assume the role of an auditor, but also that of a software and systems engineer. This is because software engineers and their managers particularly value contributions that directly impact products and their development. These ideas are summarized as lessons learned, based on my experience at Jet Propulsion Laboratory (JPL).
State of the Art of Network Security Perspectives in Cloud Computing
NASA Astrophysics Data System (ADS)
Oh, Tae Hwan; Lim, Shinyoung; Choi, Young B.; Park, Kwang-Roh; Lee, Heejo; Choi, Hyunsang
Cloud computing is now regarded as a social phenomenon that satisfies customers' needs. Arguably, cloud computing is the realization of customers' needs combined with the primary principle of economy: gaining maximum benefit from minimum investment. We live in a connected society with a flood of information; without computers connected to the Internet, our daily activities and work would be impossible. Cloud computing can provide customers with custom-tailored application software features and user environments based on the customer's needs, by adopting on-demand outsourcing of computing resources through the Internet. It also provides cloud computing users with high-end computing power and expensive application software packages; accordingly, users access their data and application software located on remote systems. As cloud computing systems are connected to the Internet, addressing their network security issues is mandatory before real-world service. In this paper, a survey of network security issues in cloud computing is presented and discussed from the perspective of real-world service environments.
Instrumentino: An Open-Source Software for Scientific Instruments.
Koenka, Israel Joel; Sáiz, Jorge; Hauser, Peter C
2015-01-01
Scientists often need to build dedicated computer-controlled experimental systems. For this purpose, it is becoming common to employ open-source microcontroller platforms, such as the Arduino. These boards and associated integrated software development environments provide affordable yet powerful solutions for the implementation of hardware control of transducers and acquisition of signals from detectors and sensors. It is, however, a challenge to write programs that allow interactive use of such arrangements from a personal computer. This task is particularly complex if some of the included hardware components are connected directly to the computer and not via the microcontroller. A graphical user interface framework, Instrumentino, was therefore developed to allow the creation of control programs for complex systems with minimal programming effort. By writing a single code file, a powerful custom user interface is generated, which enables the automatic running of elaborate operation sequences and observation of acquired experimental data in real time. The framework, which is written in Python, allows extension by users, and is made available as an open source project.
Cenozoic Antarctic DiatomWare/BugCam: An aid for research and teaching
Wise, S.W.; Olney, M.; Covington, J.M.; Egerton, V.M.; Jiang, S.; Ramdeen, D.K.; Schrader, H.; Sims, P.A.; Wood, A.S.; Davis, A.; Davenport, D.R.; Doepler, N.; Falcon, W.; Lopez, C.; Pressley, T.; Swedberg, O.L.; Harwood, D.M.
2007-01-01
Cenozoic Antarctic DiatomWare/BugCam© is an interactive, icon-driven digital-image database/software package that displays over 500 illustrated Cenozoic Antarctic diatom taxa along with original descriptions (including over 100 generic and 20 family-group descriptions). This digital catalog is designed primarily for use by micropaleontologists working in the field (at sea or on the Antarctic continent) where hard-copy literature resources are limited. This new package will also be useful for classroom/lab teaching as well as for any paleontologists making or refining taxonomic identifications at the microscope. The database (Cenozoic Antarctic DiatomWare) is displayed via a custom software program (BugCam) written in Visual Basic for use on PCs running Windows 95 or later operating systems. BugCam is a flexible image display program that utilizes an intuitive thumbnail “tree” structure for navigation through the database. The data are stored in Microsoft EXCEL spreadsheets, hence no separate relational database program is necessary to run the package.
COSMIC monthly progress report
NASA Technical Reports Server (NTRS)
1994-01-01
Activities of the Computer Software Management and Information Center (COSMIC) are summarized for the month of May 1994. Tables showing the current inventory of programs available from COSMIC are presented and program processing and evaluation activities are summarized. Nine articles were prepared for publication in the NASA Tech Brief Journal. These articles (included in this report) describe the following software items: (1) WFI - Windowing System for Test and Simulation; (2) HZETRN - A Free Space Radiation Transport and Shielding Program; (3) COMGEN-BEM - Composite Model Generation-Boundary Element Method; (4) IDDS - Interactive Data Display System; (5) CET93/PC - Chemical Equilibrium with Transport Properties, 1993; (6) SDVIC - Sub-pixel Digital Video Image Correlation; (7) TRASYS - Thermal Radiation Analyzer System (HP9000 Series 700/800 Version without NASADIG); (8) NASADIG - NASA Device Independent Graphics Library, Version 6.0 (VAX VMS Version); and (9) NASADIG - NASA Device Independent Graphics Library, Version 6.0 (UNIX Version). Activities in the areas of marketing, customer service, benefits identification, maintenance and support, and dissemination are also described along with a budget summary.
Linear sine wave profiling to machine instability targets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmidt, Derek William; Martinez, John Israel
2016-08-01
Specialized machining processes and programming have been developed to deliver thin tin and copper Richtmyer-Meshkov instability targets that have different amplitude perturbations across the face of one 4-in.-diameter target. Typical targets have anywhere from two to five different regions of sine waves that have different amplitudes varying from 4 to 200 μm across the face of the target. The puck is composed of multiple rings that are zero press fit together and diamond turned to create a flat platform with a tolerance of 2 μm for the shock experiment. A custom software program was written in LabVIEW to write the point-to-point program for the diamond-turning profiler through the X-Y-Z movements to cut the pure planar straight sine wave geometry. As a result, the software is optimized to push the profile of the whole part into the face while eliminating any unneeded passes that do not cut any material.
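A hedged sketch of the point-to-point idea described above (an illustration only, not the Los Alamos LabVIEW code): generating X-Z coordinate pairs for a planar sine-wave profile of a given amplitude and wavelength that a profiler could step through.

```python
import math

def sine_wave_path(length_mm, wavelength_mm, amplitude_um, step_mm=0.01):
    """Return (x, z) points in mm for a straight sine-wave profile."""
    points = []
    n = int(length_mm / step_mm) + 1
    for i in range(n):
        x = i * step_mm
        z = (amplitude_um / 1000.0) * math.sin(2.0 * math.pi * x / wavelength_mm)
        points.append((x, z))
    return points

# Example: a 10 mm long region with a 1 mm wavelength and 200 micrometre amplitude
path = sine_wave_path(10.0, 1.0, 200.0)
print(len(path), "points; first few:", path[:3])
```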
Martin, Roger L
2011-06-01
A few years ago the software development company Intuit realized that it needed a new approach to galvanizing customers. The company's Net Promoter Score was faltering, and customer recommendations of new products were especially disappointing. Intuit decided to hold a two-day, off-site meeting for the company's top 300 managers with a focus on the role of design in innovation. One of the days was dedicated to a program called Design for Delight. The centerpiece of the day was a PowerPoint presentation by Intuit founder Scott Cook, who realized midway through that he was no Steve Jobs: The managers listened dutifully, but there was little energy in the room. By contrast, a subsequent exercise in which the participants worked through a design challenge by creating prototypes, getting feedback, iterating, and refining, had them mesmerized. The eventual result was the creation of a team of nine design-thinking coaches--"innovation catalysts"--from across Intuit who were made available to help any work group create prototypes, run experiments, and learn from customers. The process includes a "painstorm" (to determine the customer's greatest pain point), a "soljam" (to generate and then winnow possible solutions), and a "code-jam" (to write code "good enough" to take to customers within two weeks). Design for Delight has enabled employees throughout Intuit to move from satisfying customers to delighting them.
NASA Technical Reports Server (NTRS)
2000-01-01
A former Ames employee, Monte Zweben, founded a new company, Blue Martini Software, that provides software to companies seeking to personalize their products to individual customers. This customer targeting approach is accomplished through the use of artificial intelligence concepts Zweben worked on while at Ames. The Ames AI research has found applications in clickstream mining and purchasing behavior data collection.
Employees' Perception of Learning New Software from Customized Training Materials
ERIC Educational Resources Information Center
Dean, Kristi L.
2010-01-01
The purpose of this research is to conduct a descriptive survey research study that will look at the value of using customized training materials to train employees to learn how to use software. The data will be repeatedly compared; ensuring the design of the research and the corresponding data collection method provides a panoramic and…
Cleanroom Software Engineering Reference Model. Version 1.0.
1996-11-01
... teams. It also serves as a baseline for continued evolution of Cleanroom practice. The scope of the CRM is software management, specification ... In addition to project staff, participants include management, peer organization representatives, and customer representatives as appropriate ... Review the status of the process with management, the project team, peer groups, and the customer. These verification activities include ...
Engineering specification and system design for CAD/CAM of custom shoes: UMC project effort
NASA Technical Reports Server (NTRS)
Bao, Han P.
1991-01-01
The goal of this project is to supplement the footwear design system of North Carolina State University (NCSU) with a software module to design and manufacture a combination sole. The four areas of concentration were: customization of NASCAD (NASA Computer Aided Design) to the footwear project; use of CENCIT data; computer aided manufacturing activities; and beginning work for the bottom elements of shoes. The task of generating a software module for producing a sole was completed with a demonstrated product realization. The software written in C was delivered to NCSU for inclusion in their design system for custom footwear known as LASTMOD. The machining process of the shoe last was improved using a spiral tool path approach.
NASA Technical Reports Server (NTRS)
Gerard, Mireille (Editor); Edwards, Pamela W. (Editor)
1988-01-01
Technological and planning issues for data management, processing, and communication on Space Station Freedom are discussed in reviews and reports by U.S., European, and Japanese experts. The space-information-system strategies of NASA, ESA, and NASDA are discussed; customer needs are analyzed; and particular attention is given to communication and data systems, standards and protocols, integrated system architectures, software and automation, and plans and approaches being developed on the basis of experience from past programs. Also included are the reports from workshop sessions on design to meet customer needs, the accommodation of growth and new technologies, and system interoperability.
OVERFLOW-Interaction with Industry
NASA Technical Reports Server (NTRS)
Buning, Pieter G.; George, Michael W. (Technical Monitor)
1996-01-01
A Navier-Stokes flow solver, OVERFLOW, has been developed by researchers at NASA Ames Research Center to use overset (Chimera) grids to simulate the flow about complex aerodynamic shapes. Primary customers of the OVERFLOW flow solver and related software include McDonnell Douglas and Boeing, as well as the NASA Focused Programs for Advanced Subsonic Technology (AST) and High Speed Research (HSR). Code development has focused on customer issues, including improving code performance, ability to run on workstation clusters and the NAS SP2, and direct interaction with industry on accuracy assessment and validation. Significant interaction with NAS has produced a capability tailored to the Ames computing environment, and code contributions have come from a wide range of sources, both within and outside Ames.
Cone-beam micro-CT system based on LabVIEW software.
Ionita, Ciprian N; Hoffmann, Keneth R; Bednarek, Daniel R; Chityala, Ravishankar; Rudin, Stephen
2008-09-01
Construction of a cone-beam computed tomography (CBCT) system for laboratory research usually requires integration of different software and hardware components. As a result, building and operating such a complex system require the expertise of researchers with significantly different backgrounds. Additionally, writing flexible code to control the hardware components of a CBCT system combined with designing a friendly graphical user interface (GUI) can be cumbersome and time consuming. An intuitive and flexible program structure, as well as the program GUI for CBCT acquisition, is presented in this note. The program was developed in National Instrument's Laboratory Virtual Instrumentation Engineering Workbench (LabVIEW) graphical language and is designed to control a custom-built CBCT system but has been also used in a standard angiographic suite. The hardware components are commercially available to researchers and are in general provided with software drivers which are LabVIEW compatible. The program structure was designed as a sequential chain. Each step in the chain takes care of one or two hardware commands at a time; the execution of the sequence can be modified according to the CBCT system design. We have scanned and reconstructed over 200 specimens using this interface and present three examples which cover different areas of interest encountered in laboratory research. The resulting 3D data are rendered using a commercial workstation. The program described in this paper is available for use or improvement by other researchers.
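The sequential-chain structure described for the LabVIEW program can be paraphrased in a short, hedged Python sketch; the hardware calls below are placeholders standing in for the instrument drivers, not the actual system's interfaces.

```python
import time

# Placeholder hardware actions; in a real system each step would issue one or
# two commands to a rotation stage, x-ray generator, and detector.
def rotate_stage(angle_deg):
    print(f"rotate stage to {angle_deg:.1f} deg")

def expose_and_grab(angle_deg):
    print(f"expose and capture projection at {angle_deg:.1f} deg")
    return {"angle": angle_deg}

def acquire_cbct(n_projections=360, arc_deg=360.0, settle_s=0.01):
    """Sequential acquisition chain: move, settle, expose, store, repeat."""
    projections = []
    for k in range(n_projections):
        angle = k * arc_deg / n_projections
        rotate_stage(angle)                          # step 1: motion command
        time.sleep(settle_s)                         # step 2: wait for the stage to settle
        projections.append(expose_and_grab(angle))   # step 3: exposure and frame grab
    return projections

data = acquire_cbct(n_projections=4)  # tiny run for illustration
print(len(data), "projections acquired")
```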
Wang, Ximing; Liu, Brent J; Martinez, Clarisa; Zhang, Xuejun; Winstein, Carolee J
2015-01-01
Imaging based clinical trials can benefit from a solution to efficiently collect, analyze, and distribute multimedia data at various stages within the workflow. Currently, the data management needs of these trials are typically addressed with custom-built systems. However, software development of custom-built systems for versatile workflows can be resource-consuming. To address these challenges, we present a system with a workflow engine for imaging based clinical trials. The system enables a project coordinator to build a data collection and management system specifically related to study protocol workflow without programming. A Web Access to DICOM Objects (WADO) module with novel features is integrated to further facilitate imaging related study. The system was initially evaluated by an imaging based rehabilitation clinical trial. The evaluation shows that the cost of developing the system can be much reduced compared to a custom-built system. By providing a solution to customize a system and automate the workflow, the system will save on development time and reduce errors, especially for imaging clinical trials. PMID:25870169
Siebel, T
2001-03-01
There is a growing awareness among corporations that the quality of the customer experience they provide directly affects their bottom line. Many are turning to high-flying software maker Siebel Systems for help in managing those relationships. The young company holds a leadership position in an explosive market-enterprise application software. But customer satisfaction, not dot-com chic, is foremost on the mind of Siebel Systems' founder, chairman, and CEO, Tom Siebel. The buttoned-down Siebel rejects the freewheeling management style and culture that characterize many Silicon Valley companies. As the former CEO of Gain Technology and a former executive at Oracle, Siebel believes in putting customers ahead of technology, discipline ahead of inspiration. In this interview, conducted at the company's San Mateo, California, headquarters, Siebel describes how this obsessive focus on customer satisfaction has been the driving force behind the company's success. He talks about how the organization remains true to its core values: a deep commitment to providing customer satisfaction; responsible fiscal practices that have created a cash-positive business amid today's cash-negative dot-coms; and general professionalism. "The notion of dressing in jeans and a T-shirt to greet the CEO of a major financial institution who just got off the plane from Munich is not acceptable," he says. Siebel Systems rejects the concept of going to war with rivals; instead, the CEO says, the company has forged an ecosystem of partnerships that allows it to support and integrate its own systems with other companies' software products and ultimately ease the customer's software installations. Indeed, Siebel says, the CEO's most important job is to understand what customers need and deliver that.
Hynes, Martin; Wang, Han; Kilmartin, Liam
2009-01-01
Over the last decade, there has been substantial research interest in the application of accelerometry data for many forms of automated gait and activity analysis algorithms. This paper introduces a summary of new "off-the-shelf" mobile phone handset platforms containing embedded accelerometers which support the development of custom software to implement real time analysis of the accelerometer data. An overview of the main software programming environments which support the development of such software, including the Java ME based JSR 256 API, the C++ based Motion Sensor API and the Python based "aXYZ" module, is provided. Finally, a sample application is introduced and its performance evaluated in order to illustrate how a standard mobile phone can be used to detect gait activity using such a non-intrusive and easily accepted sensing platform.
A user's guide to Sandia's latin hypercube sampling software : LHS UNIX library/standalone version.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swiler, Laura Painton; Wyss, Gregory Dane
2004-07-01
This document is a reference guide for the UNIX Library/Standalone version of the Latin Hypercube Sampling Software. This software has been developed to generate Latin hypercube multivariate samples. This version runs on Linux or UNIX platforms. This manual covers the use of the LHS code in a UNIX environment, run either as a standalone program or as a callable library. The underlying code in the UNIX Library/Standalone version of LHS is almost identical to the updated Windows version of LHS released in 1998 (SAND98-0210). However, some modifications were made to customize it for a UNIX environment and as a library that is called from the DAKOTA environment. This manual covers the use of the LHS code as a library and in the standalone mode under UNIX.
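As a hedged illustration of what Latin hypercube sampling does (a minimal sketch, not the Sandia LHS code), each of the N samples is drawn from a distinct stratum of [0, 1) in every dimension, and the strata are independently permuted per dimension so they pair up randomly.

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng=None):
    """Return an (n_samples, n_dims) Latin hypercube design on the unit hypercube."""
    rng = np.random.default_rng(rng)
    samples = np.empty((n_samples, n_dims))
    for d in range(n_dims):
        # One random point inside each of the n equal-width strata...
        strata = (np.arange(n_samples) + rng.random(n_samples)) / n_samples
        # ...shuffled so strata are paired randomly across dimensions.
        samples[:, d] = rng.permutation(strata)
    return samples

design = latin_hypercube(5, 3, rng=42)
print(design)
```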
Infrastructure for Rapid Development of Java GUI Programs
NASA Technical Reports Server (NTRS)
Jones, Jeremy; Hostetter, Carl F.; Wheeler, Philip
2006-01-01
The Java Application Shell (JAS) is a software framework that accelerates the development of Java graphical-user-interface (GUI) application programs by enabling the reuse of common, proven GUI elements, as distinguished from writing custom code for GUI elements. JAS is a software infrastructure upon which Java interactive application programs and graphical user interfaces (GUIs) for those programs can be built as sets of plug-ins. JAS provides an application-programming interface that is extensible by application-specific plug-ins that describe and encapsulate both specifications of a GUI and application-specific functionality tied to the specified GUI elements. The desired GUI elements are specified in Extensible Markup Language (XML) descriptions instead of in compiled code. JAS reads and interprets these descriptions, then creates and configures a corresponding GUI from a standard set of generic, reusable GUI elements. These elements are then attached (again, according to the XML descriptions) to application-specific compiled code and scripts. An application program constructed by use of JAS as its core can be extended by writing new plug-ins and replacing existing plug-ins. Thus, JAS solves many problems that Java programmers generally solve anew for each project, thereby reducing development and testing time.
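To make the declarative idea concrete, here is a hedged, language-shifted sketch (Python/Tkinter rather than JAS's Java, and an invented XML schema rather than JAS's actual descriptions): a GUI description is read from XML and mapped onto generic, reusable widgets.

```python
import tkinter as tk
import xml.etree.ElementTree as ET

# Hypothetical GUI description; JAS's actual XML schema is not reproduced here.
GUI_XML = """
<window title="Demo Tool">
  <label text="Target name:"/>
  <entry name="target"/>
  <button text="Run" action="run"/>
</window>
"""

# Application-specific functionality tied to named GUI elements
ACTIONS = {"run": lambda widgets: print("run pressed, target =", widgets["target"].get())}

def build_gui(xml_text):
    """Create a window from generic widgets named in the XML description."""
    spec = ET.fromstring(xml_text)
    root = tk.Tk()
    root.title(spec.get("title", ""))
    widgets = {}
    for node in spec:
        if node.tag == "label":
            tk.Label(root, text=node.get("text")).pack()
        elif node.tag == "entry":
            entry = tk.Entry(root)
            entry.pack()
            widgets[node.get("name")] = entry
        elif node.tag == "button":
            action = ACTIONS[node.get("action")]
            tk.Button(root, text=node.get("text"),
                      command=lambda a=action: a(widgets)).pack()
    return root

if __name__ == "__main__":
    build_gui(GUI_XML).mainloop()
```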
Implementation of a School-wide Clinical Intervention Documentation System
Stevenson, T. Lynn; Fox, Brent I.; Andrus, Miranda; Carroll, Dana
2011-01-01
Objective. To evaluate the effectiveness and impact of a customized Web-based software program implemented in 2006 for school-wide documentation of clinical interventions by pharmacy practice faculty members, pharmacy residents, and student pharmacists. Methods. The implementation process, directed by a committee of faculty members and school administrators, included preparation and refinement of the software, user training, development of forms and reports, and integration of the documentation process within the curriculum. Results. Use of the documentation tool consistently increased from May 2007 to December 2010. Over 187,000 interventions were documented with over $6.2 million in associated cost avoidance. Conclusions. Successful implementation of a school-wide documentation tool required considerable time from the oversight committee and a comprehensive training program for all users, with ongoing monitoring of data collection practices. Data collected proved to be useful to show the impact of faculty members, residents, and student pharmacists at affiliated training sites. PMID:21829264
Situational Lightning Climatologies for Central Florida: Phase III
NASA Technical Reports Server (NTRS)
Barrett, Joe H., III
2008-01-01
This report describes work done by the Applied Meteorology Unit (AMU) to add composite soundings to the Advanced Weather Interactive Processing System (AWIPS). This allows National Weather Service (NWS) forecasters to compare the current atmospheric state with climatology. In a previous phase, the AMU created composite soundings for four rawinsonde observation stations in Florida, for each of eight flow regimes. The composite soundings were delivered to the NWS Melbourne (MLB) office for display using the NSHARP software program. NWS MLB requested that the AMU make the composite soundings available for display in AWIPS. The AMU first created a procedure to customize AWIPS so composite soundings could be displayed. A unique four-character identifier was created for each of the 32 composite soundings. The AMU wrote a Tool Command Language/Toolkit (Tcl/Tk) software program to convert the composite soundings from NSHARP to Network Common Data Form (NetCDF) format. The NetCDF files were then displayable by AWIPS.
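A hedged sketch of the conversion step, in Python with the netCDF4 library rather than the AMU's Tcl/Tk program, and with invented variable names rather than AWIPS's actual sounding schema: one composite sounding is written as pressure, temperature and dewpoint profiles in a NetCDF file.

```python
from netCDF4 import Dataset
import numpy as np

# Hypothetical composite sounding: pressure (hPa), temperature and dewpoint (degC)
pressure = np.array([1000.0, 925.0, 850.0, 700.0, 500.0, 300.0])
temperature = np.array([25.0, 21.0, 16.0, 8.0, -8.0, -32.0])
dewpoint = np.array([22.0, 18.0, 12.0, 2.0, -18.0, -45.0])

with Dataset("composite_sounding.nc", "w") as nc:
    nc.createDimension("level", len(pressure))
    for name, values, units in [("pressure", pressure, "hPa"),
                                ("temperature", temperature, "degC"),
                                ("dewpoint", dewpoint, "degC")]:
        var = nc.createVariable(name, "f4", ("level",))
        var.units = units
        var[:] = values
    nc.station_id = "XMR1"  # hypothetical four-character identifier
print("wrote composite_sounding.nc")
```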
Supporting Coral Reef Ecosystem Management Decisions Appropriate to Climate Change
NASA Astrophysics Data System (ADS)
Hendee, J. C.; Fletcher, P.; Shein, K. A.
2013-05-01
There has been a perception that the myriad of environmental information products derived from satellite and other instrumental sources means ipso facto that there is a direct use for them by environmental managers. Trouble is, as information providers, for the most part we don't really know what decisions managers face daily, nor is it a trivial matter to ascertain the effect of management decisions on the environment, at least in a time frame that facilitates timely maintenance and enhancement of decision support software. To bridge this gap in understanding, we conducted a Needs Assessment (using methodology from the NOAA/Coastal Services Center's Product Design and Evaluation training program) from December, 2011 through May, 2012, in which we queried 15 resource managers in southeast Florida to identify the types of climate data and information products they needed to understand the effects of climate change in their region of purview, and how best these products should be delivered and subsequently enhanced or corrected. Our intent has been to develop a suite of software and information products customized specifically for environmental managers. This report summarizes our success to date, including a report on the development of software for gathering and presenting specific types of climate data, and a narrative about how some U.S. government sponsored efforts, such as Giovanni and TerraVis, as well as non-governmental sponsored efforts such as Marxan, Zonation, SimCLIM, and other off-the-shelf software might be customized for use in specific regions.
Air Markets Program Data (AMPD)
The Air Markets Program Data tool allows users to search EPA data to answer scientific, general, policy, and regulatory questions about industry emissions. Air Markets Program Data (AMPD) is a web-based application that allows users easy access to both current and historical data collected as part of EPA's emissions trading programs. This site allows you to create and view reports and to download emissions data for further analysis. AMPD provides a query tool so users can create custom queries of industry source emissions data, allowance data, compliance data, and facility attributes. In addition, AMPD provides interactive maps, charts, reports, and pre-packaged datasets. AMPD does not require any additional software, plug-ins, or security controls and can be accessed using a standard web browser.
An Integrated System for Wildlife Sensing
2014-08-14
A custom Sensor Controller application was developed for the Android device to collect and log readings from that device's sensors, and a custom Camera Controller application was developed for the Android device; the two functions are split into two separate Android applications (Figure 4). The Sensor Controller logs readings periodically from the Android device's organic sensors.
Accelerating semantic graph databases on commodity clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morari, Alessandro; Castellana, Vito G.; Haglin, David J.
We are developing a full software system for accelerating semantic graph databases on commodity clusters that scales to hundreds of nodes while maintaining constant query throughput. Our framework comprises a SPARQL to C++ compiler, a library of parallel graph methods and a custom multithreaded runtime layer, which provides a Partitioned Global Address Space (PGAS) programming model with fork/join parallelism and automatic load balancing over commodity clusters. We present preliminary results for the compiler and for the runtime.
SNS programming environment user's guide
NASA Technical Reports Server (NTRS)
Tennille, Geoffrey M.; Howser, Lona M.; Humes, D. Creig; Cronin, Catherine K.; Bowen, John T.; Drozdowski, Joseph M.; Utley, Judith A.; Flynn, Theresa M.; Austin, Brenda A.
1992-01-01
The computing environment is briefly described for the Supercomputing Network Subsystem (SNS) of the Central Scientific Computing Complex of NASA Langley. The major SNS computers are a CRAY-2, a CRAY Y-MP, a CONVEX C-210, and a CONVEX C-220. The software is described that is common to all of these computers, including: the UNIX operating system, computer graphics, networking utilities, mass storage, and mathematical libraries. Also described is file management, validation, SNS configuration, documentation, and customer services.
NASA Technology Transfer System
NASA Technical Reports Server (NTRS)
Tran, Peter B.; Okimura, Takeshi
2017-01-01
NTTS is the IT infrastructure for the Agency's Technology Transfer (T2) program, containing a portfolio of more than 60,000 technologies and supporting all ten NASA field centers and HQ. It is the enterprise IT system for facilitating the Agency's technology transfer process, which includes reporting of new technologies (e.g., technology invention disclosures, NF1679), protecting intellectual property (e.g., patents), and commercializing technologies through various technology licenses, software releases, spinoffs, and success stories, using custom-built workflow, reporting, data consolidation, integration, and search engines.
Universal mechatronics coordinator
NASA Astrophysics Data System (ADS)
Muir, Patrick F.
1999-11-01
Mechatronic systems incorporate multiple actuators and sensors which must be properly coordinated to achieve the desired system functionality. Many mechatronic systems are designed as one-of-a-kind custom projects without consideration for facilitating future systems or alterations and extensions to the current system. Thus, subsequent changes to the system are slow, difficult, and costly. It has become apparent that manufacturing processes, and thus the mechatronics which embody them, need to be agile in order to more quickly and easily respond to changing customer demands or market pressures. To achieve agility, both the hardware and software of the system need to be designed such that the creation of new systems and the alteration and extension of current systems are fast and easy. This paper describes the design of a Universal Mechatronics Coordinator (UMC) which facilitates agile setup and changeover of coordination software for mechatronic systems. The UMC is capable of sequencing continuous and discrete actions that are programmed as stimulus-response pairs, as state machines, or a combination of the two. It facilitates the modular, reusable programming of continuous actions such as servo control algorithms, data collection code, and safety checking routines; and discrete actions such as reporting achieved states, and turning on/off binary devices. The UMC has been applied to the control of a z-theta assembly robot for the Minifactory project and is applicable to a spectrum of widely differing mechatronic systems.
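A hedged sketch of the stimulus-response programming style the paper describes (illustrative only, not the UMC's actual interfaces): continuous and discrete actions are registered as condition/response pairs that a coordinator evaluates on each cycle.

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class StimulusResponse:
    stimulus: Callable[[dict], bool]   # predicate over sensor readings
    response: Callable[[dict], None]   # action to run when the predicate holds

@dataclass
class Coordinator:
    rules: List[StimulusResponse] = field(default_factory=list)

    def step(self, sensors: dict) -> None:
        """One coordination cycle: fire every response whose stimulus is true."""
        for rule in self.rules:
            if rule.stimulus(sensors):
                rule.response(sensors)

coord = Coordinator([
    # Safety check: stop the actuator if the limit switch is pressed
    StimulusResponse(lambda s: s["limit_switch"], lambda s: print("stop z axis")),
    # Report when the commanded position has been reached
    StimulusResponse(lambda s: abs(s["z"] - s["z_goal"]) < 0.01,
                     lambda s: print("z goal reached")),
])

coord.step({"limit_switch": False, "z": 4.999, "z_goal": 5.0})
```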
A Web interface generator for molecular biology programs in Unix.
Letondal, C
2001-01-01
Almost all users encounter problems using sequence analysis programs. Not only are they difficult to learn because of their parameters, syntax and semantics, but many of them also differ from one another. That is why we have developed a Web interface generator for more than 150 molecular biology command-line driven programs, including: phylogeny, gene prediction, alignment, RNA, DNA and protein analysis, motif discovery, structure analysis and database searching programs. The generator uses XML as a high-level description language for the legacy software parameters. Its aim is to provide users with the equivalent of a basic Unix environment, with program combination, customization and basic scripting through macro registration. The program has been used for three years by about 15,000 users throughout the world; it has recently been installed on other sites and evaluated as a standard user interface for EMBOSS programs.
Brettin, Thomas; Davis, James J.; Disz, Terry; ...
2015-02-10
The RAST (Rapid Annotation using Subsystem Technology) annotation engine was built in 2008 to annotate bacterial and archaeal genomes. It works by offering a standard software pipeline for identifying genomic features (i.e., protein-encoding genes and RNA) and annotating their functions. Recently, in order to make RAST a more useful research tool and to keep pace with advancements in bioinformatics, it has become desirable to build a version of RAST that is both customizable and extensible. In this paper, we describe the RAST tool kit (RASTtk), a modular version of RAST that enables researchers to build custom annotation pipelines. RASTtk offers a choice of software for identifying and annotating genomic features as well as the ability to add custom features to an annotation job. RASTtk also accommodates the batch submission of genomes and the ability to customize annotation protocols for batch submissions. This is the first major software restructuring of RAST since its inception.
Spreadsheet-based program for alignment of overlapping DNA sequences.
Anbazhagan, R; Gabrielson, E
1999-06-01
Molecular biology laboratories frequently face the challenge of aligning small overlapping DNA sequences derived from a long DNA segment. Here, we present a short program that can be used to adapt Excel spreadsheets as a tool for aligning DNA sequences, regardless of their orientation. The program runs on any Windows or Macintosh operating system computer with Excel 97 or Excel 98. The program is available for use as an Excel file, which can be downloaded from the BioTechniques Web site. Upon execution, the program opens a specially designed customized workbook and is capable of identifying overlapping regions between two sequence fragments and displaying the sequence alignment. It also performs a number of specialized functions such as recognition of restriction enzyme cutting sites and CpG island mapping without costly specialized software.
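The overlap-finding idea, independent of Excel, can be reduced to a few lines of Python (a naive sketch, not the authors' macro): look for the longest suffix of one sequence that matches a prefix of the other, also checking the reverse complement to handle either orientation.

```python
def reverse_complement(seq):
    return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

def best_overlap(a, b, min_len=5):
    """Longest suffix of a matching a prefix of b (or of b's reverse complement)."""
    best = ("", None)
    for candidate, label in [(b, "forward"), (reverse_complement(b), "revcomp")]:
        for k in range(min(len(a), len(candidate)), min_len - 1, -1):
            if a[-k:] == candidate[:k]:
                if k > len(best[0]):
                    best = (a[-k:], label)
                break
    return best

left = "GATTACACGTTAGG"
right = "CGTTAGGTTTAAACC"
overlap, orientation = best_overlap(left, right)
print(f"overlap '{overlap}' ({len(overlap)} bp, {orientation} orientation)")
```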
Customer Communication Challenges and Solutions in Globally Distributed Agile Software Development
NASA Astrophysics Data System (ADS)
Pikkarainen, Minna; Korkala, Mikko
Working in the globally distributed market is one of the key trends among software organizations all over the world [1-5]. Several factors have contributed to the growth of distributed software development; time-zone independent "follow the sun" development, access to well-educated labour, maturation of the technical infrastructure and reduced costs are some of the most commonly cited benefits of distributed development [3, 6-8]. Furthermore, customers are often located in different countries because of the companies' internationalization purposes or good market opportunities.
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to assess the process improvement, quality management and analytical techniques taught to students in U.S. colleges' and universities' undergraduate and graduate systems engineering and computing science degree programs (e.g., software engineering, computer science, and information technology) that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The research study identifies and provides a detailed discussion of the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. universities' systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis that identifies the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrate the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
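The Monte Carlo element of such a study can be hedged into a toy sketch (illustrative weights and distributions, not the dissertation's actual model): uncertain process-performance drivers are sampled and propagated to a distribution of predicted customer satisfaction index scores.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Hypothetical drivers of satisfaction, each scored 0-100 with its own uncertainty
quality = rng.normal(82, 5, n)       # product quality score
timeliness = rng.normal(75, 8, n)    # on-time delivery score
support = rng.normal(70, 10, n)      # support responsiveness score

# Hypothetical linear weighting, used only for illustration
predicted_index = np.clip(0.5 * quality + 0.3 * timeliness + 0.2 * support, 0, 100)

print(f"mean predicted index: {predicted_index.mean():.1f}")
print(f"90% interval: {np.percentile(predicted_index, 5):.1f}"
      f" to {np.percentile(predicted_index, 95):.1f}")
print(f"P(index >= 80): {(predicted_index >= 80).mean():.2%}")
```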
Detailed Design Documentation, without the Pain
NASA Astrophysics Data System (ADS)
Ramsay, C. D.; Parkes, S.
2004-06-01
Producing detailed forms of design documentation, such as pseudocode and structured flowcharts, to describe the procedures of a software system: (1) allows software developers to model and discuss their understanding of a problem and the design of a solution free from the syntax of a programming language, (2) facilitates deeper involvement of non-technical stakeholders, such as the customer or project managers, whose influence ensures the quality, correctness and timeliness of the resulting system, and (3) forms comprehensive documentation of the system for its future maintenance, reuse and/or redeployment. However, such forms of documentation require effort to create and maintain. This paper describes a software tool which is currently being developed within the Space Systems Research Group at the University of Dundee which aims to improve the utility of, and the incentive for, creating detailed design documentation for the procedures of a software system. The rationale for creating such a tool is briefly discussed, followed by a description of the tool itself, a summary of its perceived benefits, and plans for future work.
Developing Simulated Cyber Attack Scenarios Against Virtualized Adversary Networks
2017-03-01
MAST is a custom software framework originally designed to facilitate the training of network administrators on live networks using SimWare. The MAST framework supports scenario development and testing in a virtual test environment, together with commercial and custom software tools that provide the ability to conduct network attacks.
ERIC Educational Resources Information Center
Biju, Soly Mathew
2008-01-01
Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…
Warning: Projects May Be Closer than They Appear
NASA Technical Reports Server (NTRS)
Africa, Colby
2004-01-01
I had been working for two years as the technical product manager for a large software company, when their partner company gave me a call. They needed good software engineers to customize a new version of software, and they thought I was their guy. They told me what they wanted to do to the software, and they even showed me some prototypes. Their idea was to take the basic software tool that the large company was producing and make it more accessible to the customer. They would do this by building in flexibility based on user skill level and organizational maturity. I thought that was a fascinating approach, and I bought into it in a big way. I decided to leave my job and join up with the smaller company as their director of software engineering.
Handheld emissions detector (HED): overview and development
NASA Astrophysics Data System (ADS)
Valentino, George J.; Schimmel, David
2009-05-01
Nova Engineering, Cincinnati OH, a division of L-3 Communications (L-3 Nova), under the sponsorship of Program Manager Soldier Warrior (PM-SWAR), Fort Belvoir, VA, has developed a Soldier portable, light-weight, hand-held, geolocation sensor and processing system called the Handheld Emissions Detector (HED). The HED is a broadband custom receiver and processor that allows the user to easily sense, direction find, and locate a broad range of emitters in the user's surrounding area. Now in its second design iteration, the HED incorporates a set of COTS components that are complemented with L-3 Nova custom RF, power, digital, and mechanical components, plus custom embedded and application software. The HED user interfaces are designed to provide complex information in a readily-understandable form, thereby providing actionable results for operators. This paper provides, where possible, the top-level characteristics of the HED as well as the rationale behind its design philosophy along with its applications in both DOD and Commercial markets.
Generic Kalman Filter Software
NASA Technical Reports Server (NTRS)
Lisano, Michael E., II; Crues, Edwin Z.
2005-01-01
The Generic Kalman Filter (GKF) software provides a standard basis for the development of application-specific Kalman-filter programs. Historically, Kalman filters have been implemented by customized programs that must be written, coded, and debugged anew for each unique application, then tested and tuned with simulated or actual measurement data. Total development times for typical Kalman-filter application programs have ranged from weeks to months. The GKF software can simplify the development process and reduce the development time by eliminating the need to re-create the fundamental implementation of the Kalman filter for each new application. The GKF software is written in the ANSI C programming language. It contains a generic Kalman-filter-development directory that, in turn, contains code for a generic Kalman-filter function; more specifically, it contains a generically designed and generically coded implementation of linear, linearized, and extended Kalman filtering algorithms, including algorithms for state- and covariance-update and -propagation functions. The mathematical theory that underlies the algorithms is well known and has been reported extensively in the open technical literature. Also contained in the directory are a header file that defines generic Kalman-filter data structures and prototype functions, and template versions of application-specific subfunction and calling navigation/estimation routine code and headers. Once the user has provided a calling routine and the required application-specific subfunctions, the application-specific Kalman-filter software can be compiled and executed immediately. During execution, the generic Kalman-filter function is called from a higher-level navigation or estimation routine that preprocesses measurement data and post-processes output data. The generic Kalman-filter function uses the aforementioned data structures and five implementation-specific subfunctions, which have been developed by the user on the basis of the aforementioned templates. The GKF software can be used to develop many different types of unfactorized Kalman filters. A developer can choose to implement either a linearized or an extended Kalman filter algorithm, without having to modify the GKF software. Control dynamics can be taken into account or neglected in the filter-dynamics model. Filter programs developed by use of the GKF software can be made to propagate equations of motion for linear or nonlinear dynamical systems that are deterministic or stochastic. In addition, filter programs can be made to operate in user-selectable "covariance analysis" and "propagation-only" modes that are useful in design and development stages.
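For reference, the propagation and update steps implemented by such linear filters can be written in standard textbook form (the notation below is generic and is not drawn from the GKF code itself):

\begin{align*}
\hat{x}_{k|k-1} &= F_k\,\hat{x}_{k-1|k-1} + B_k u_k, &
P_{k|k-1} &= F_k P_{k-1|k-1} F_k^{T} + Q_k,\\
K_k &= P_{k|k-1} H_k^{T}\bigl(H_k P_{k|k-1} H_k^{T} + R_k\bigr)^{-1}, &
\hat{x}_{k|k} &= \hat{x}_{k|k-1} + K_k\bigl(z_k - H_k \hat{x}_{k|k-1}\bigr),\\
& & P_{k|k} &= \bigl(I - K_k H_k\bigr) P_{k|k-1},
\end{align*}

where F_k is the state-transition matrix, H_k the measurement matrix, Q_k and R_k the process- and measurement-noise covariances, and K_k the Kalman gain; the linearized and extended variants replace F_k and H_k with Jacobians of the nonlinear dynamics and measurement models.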
Investigation of roughing machining simulation by using visual basic programming in NX CAM system
NASA Astrophysics Data System (ADS)
Hafiz Mohamad, Mohamad; Nafis Osman Zahid, Muhammed
2018-03-01
This paper outlines a simulation study to investigate the characteristic of roughing machining simulation in 4th axis milling processes by utilizing visual basic programming in NX CAM systems. The selection and optimization of cutting orientation in rough milling operation is critical in 4th axis machining. The main purpose of roughing operation is to approximately shape the machined parts into finished form by removing the bulk of material from workpieces. In this paper, the simulations are executed by manipulating a set of different cutting orientation to generate estimated volume removed from the machine parts. The cutting orientation with high volume removal is denoted as an optimum value and chosen to execute a roughing operation. In order to run the simulation, customized software is developed to assist the routines. Operations build-up instructions in NX CAM interface are translated into programming codes via advanced tool available in the Visual Basic Studio. The codes is customized and equipped with decision making tools to run and control the simulations. It permits the integration with any independent program files to execute specific operations. This paper aims to discuss about the simulation program and identifies optimum cutting orientations for roughing processes. The output of this study will broaden up the simulation routines performed in NX CAM systems.
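The abstract does not give the selection logic itself; the following Python sketch only illustrates the general idea of evaluating a set of candidate 4th-axis cutting orientations and keeping the one with the largest estimated removed volume. The function estimate_removed_volume is a hypothetical stand-in for the NX CAM simulation call, which in the actual work is driven through Visual Basic.

import math

def estimate_removed_volume(orientation_deg):
    # Hypothetical stand-in for the NX CAM roughing simulation; a toy model
    # is used here so the sketch runs on its own.
    return 100.0 + 20.0 * math.cos(math.radians(orientation_deg - 45.0))

def best_cutting_orientation(candidates_deg):
    """Return the candidate orientation (degrees) with the largest estimated removed volume."""
    results = {a: estimate_removed_volume(a) for a in candidates_deg}
    best = max(results, key=results.get)
    return best, results[best]

if __name__ == "__main__":
    best, volume = best_cutting_orientation(range(0, 360, 15))
    print(f"optimum orientation: {best} deg, estimated removed volume: {volume:.1f}")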
ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra
NASA Astrophysics Data System (ADS)
Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.
2011-08-01
Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single or few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present a novel software package, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs, such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, it has currently only been tested with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request.
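The CSV output lends itself to downstream scripting. As a minimal sketch, assuming a hypothetical results file with one row per spectrum and one column per integrated signal (the actual column layout produced by ImatraNMR is not specified in the abstract), batch post-processing in Python might look like this:

import csv
from statistics import mean

# Hypothetical file layout: first column names the spectrum, remaining
# columns hold integral values for each signal region.
with open("imatranmr_results.csv", newline="") as f:
    rows = list(csv.DictReader(f))

signals = [c for c in rows[0] if c != "spectrum"]
for s in signals:
    values = [float(r[s]) for r in rows]
    print(f"{s}: mean integral {mean(values):.3f} over {len(values)} spectra")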
A Roadmap for Using Agile Development in a Traditional Environment
NASA Technical Reports Server (NTRS)
Streiffert, Barbara; Starbird, Thomas; Grenander, Sven
2006-01-01
One of the newer classes of software engineering techniques is called 'Agile Development'. In Agile Development software engineers take small implementation steps and, in some cases, they program in pairs. In addition, they develop automatic tests prior to implementing their small functional piece. Agile Development focuses on rapid turnaround, incremental planning, customer involvement and continuous integration. Agile Development is not the traditional waterfall method or even a rapid prototyping method (although this methodology is closer to Agile Development). At the Jet Propulsion Laboratory (JPL) a few groups have begun Agile Development software implementations. The difficulty with this approach becomes apparent when Agile Development is used in an organization that has specific criteria and requirements handed down for how software development is to be performed. The work at the JPL is performed for the National Aeronautics and Space Administration (NASA). Both organizations have specific requirements, rules and processes for developing software. This paper will discuss some of the initial uses of the Agile Development methodology, the spread of this method and the current status of the successful incorporation into the current JPL development policies and processes.
A Roadmap for Using Agile Development in a Traditional Environment
NASA Technical Reports Server (NTRS)
Streiffert, Barbara A.; Starbird, Thomas; Grenander, Sven
2006-01-01
One of the newer classes of software engineering techniques is called 'Agile Development'. In Agile Development software engineers take small implementation steps and, in some cases, they program in pairs. In addition, they develop automatic tests prior to implementing their small functional piece. Agile Development focuses on rapid turnaround, incremental planning, customer involvement and continuous integration. Agile Development is not the traditional waterfall method or even a rapid prototyping method (although this methodology is closer to Agile Development). At the Jet Propulsion Laboratory (JPL) a few groups have begun Agile Development software implementations. The difficulty with this approach becomes apparent when Agile Development is used in an organization that has specific criteria and requirements handed down for how software development is to be performed. The work at the JPL is performed for the National Aeronautics and Space Administration (NASA). Both organizations have specific requirements, rules and procedures for developing software. This paper will discuss some of the initial uses of the Agile Development methodology, the spread of this method and the current status of the successful incorporation into the current JPL development policies.
31 CFR 800.301 - Transactions that are covered transactions.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., and maintained relationships with its prior customers, all of which were transferred to Corporation A..., customer list, equipment, and inventory management software used to operate the facility. Under these facts... physical facility, and would not include customer lists, intellectual property, or other proprietary...
31 CFR 800.301 - Transactions that are covered transactions.
Code of Federal Regulations, 2014 CFR
2014-07-01
..., and maintained relationships with its prior customers, all of which were transferred to Corporation A..., customer list, equipment, and inventory management software used to operate the facility. Under these facts... physical facility, and would not include customer lists, intellectual property, or other proprietary...
31 CFR 800.301 - Transactions that are covered transactions.
Code of Federal Regulations, 2012 CFR
2012-07-01
..., and maintained relationships with its prior customers, all of which were transferred to Corporation A..., customer list, equipment, and inventory management software used to operate the facility. Under these facts... physical facility, and would not include customer lists, intellectual property, or other proprietary...
31 CFR 800.301 - Transactions that are covered transactions.
Code of Federal Regulations, 2013 CFR
2013-07-01
..., and maintained relationships with its prior customers, all of which were transferred to Corporation A..., customer list, equipment, and inventory management software used to operate the facility. Under these facts... physical facility, and would not include customer lists, intellectual property, or other proprietary...
31 CFR 800.301 - Transactions that are covered transactions.
Code of Federal Regulations, 2011 CFR
2011-07-01
..., and maintained relationships with its prior customers, all of which were transferred to Corporation A..., customer list, equipment, and inventory management software used to operate the facility. Under these facts... physical facility, and would not include customer lists, intellectual property, or other proprietary...
Issues in Defining Software Architectures in a GIS Environment
NASA Technical Reports Server (NTRS)
Acosta, Jesus; Alvorado, Lori
1997-01-01
The primary mission of the Pan-American Center for Earth and Environmental Studies (PACES) is to advance the research areas that are relevant to NASA's Mission to Planet Earth program. One of the activities at PACES is the establishment of a repository for geographical, geological and environmental information that covers various regions of Mexico and the southwest region of the U.S. and that is acquired from NASA and other sources through remote sensing, ground studies or paper-based maps. The center will be providing access to this information for other government entities in the U.S. and Mexico, and research groups from universities, national laboratories and industry. Geographical Information Systems (GIS) provide the means to manage, manipulate, analyze and display geographically referenced information that will be managed by PACES. Excellent off-the-shelf software exists for a complete GIS as well as software for storing and managing spatial databases, processing images, networking and viewing maps with layered information. This allows the user flexibility in combining systems to create a GIS or to mix these software packages with custom-built application programs. Software architectural languages provide the ability to specify the computational components and interactions among these components, an important topic in the domain of GIS because of the need to integrate numerous software packages. This paper discusses the characteristics that architectural languages address with respect to the issues relating to the data that must be communicated between software systems and components when systems interact. The paper presents a background on GIS in section 2. Section 3 gives an overview of software architecture and architectural languages. Section 4 suggests issues that may be of concern when defining the software architecture of a GIS. The last section discusses the future research effort and finishes with a summary.
Kedalion: NASA's Adaptable and Agile Hardware/Software Integration and Test Lab
NASA Technical Reports Server (NTRS)
Mangieri, Mark L.; Vice, Jason
2011-01-01
NASA's Kedalion engineering analysis lab at Johnson Space Center is on the forefront of validating and using many contemporary avionics hardware/software development and integration techniques, which represent new paradigms to heritage NASA culture. Kedalion has validated many of the Orion hardware/software engineering techniques borrowed from the adjacent commercial aircraft avionics solution space, with the intention to build upon such techniques to better align with today's aerospace market. Using agile techniques, commercial products, early rapid prototyping, in-house expertise and tools, and customer collaboration, Kedalion has demonstrated that cost-effective contemporary paradigms hold the promise to serve future NASA endeavors within a diverse range of system domains. Kedalion provides a readily adaptable solution for medium/large scale integration projects. The Kedalion lab is currently serving as an in-line resource for the project and the Multipurpose Crew Vehicle (MPCV) program.
Zhou, Zhi; de Bedout, Juan Manuel; Kern, John Michael; Biyik, Emrah; Chandra, Ramu Sharat
2013-01-22
A system for optimizing customer utility usage in a utility network of customer sites, each having one or more utility devices, where customer site information is communicated between each of the customer sites and an optimization server having software for optimizing customer utility usage over one or more networks, including private and public networks. A customer site model for each of the customer sites is generated based upon the customer site information, and the customer utility usage is optimized based upon the customer site information and the customer site model. The optimization server can be hosted by an external source or within the customer site. In addition, the optimization processing can be partitioned between the customer site and an external source.
Ham, Timothy S; Dmytriv, Zinovii; Plahar, Hector; Chen, Joanna; Hillson, Nathan J; Keasling, Jay D
2012-10-01
The Joint BioEnergy Institute Inventory of Composable Elements (JBEI-ICEs) is an open source registry platform for managing information about biological parts. It is capable of recording information about 'legacy' parts, such as plasmids, microbial host strains and Arabidopsis seeds, as well as DNA parts in various assembly standards. ICE is built on the idea of a web of registries and thus provides strong support for distributed interconnected use. The information deposited in an ICE installation instance is accessible both via a web browser and through the web application programming interfaces, which allows automated access to parts via third-party programs. JBEI-ICE includes several useful web browser-based graphical applications for sequence annotation, manipulation and analysis that are also open source. As with open source software, users are encouraged to install, use and customize JBEI-ICE and its components for their particular purposes. As a web application programming interface, ICE provides well-developed parts storage functionality for other synthetic biology software projects. A public instance is available at public-registry.jbei.org, where users can try out features, upload parts or simply use it for their projects. The ICE software suite is available via Google Code, a hosting site for community-driven open source projects.
NASA Technical Reports Server (NTRS)
Rabideau, Gregg; Chien, Steve; Knight, Russell; Schaffer, Steven; Tran, Daniel; Cichy, Benjamin; Sherwood, Robert
2006-01-01
The Automated Scheduling and Planning Environment (ASPEN) computer program has been updated to version 3.0. ASPEN is a modular, reconfigurable, application software framework for solving batch problems that involve reasoning about time, activities, states, and resources. Applications of ASPEN can include planning spacecraft missions, scheduling of personnel, and managing supply chains, inventories, and production lines. ASPEN 3.0 can be customized for a wide range of applications and for a variety of computing environments that include various central processing units and random access memories.
NASA Technical Reports Server (NTRS)
1995-01-01
The Interactive Data Language (IDL), developed by Research Systems, Inc., is a tool for scientists to investigate their data without having to write a custom program for each study. IDL is based on the Mariners Mars spectral Editor (MMED) developed for studies from NASA's Mars spacecraft flights. The company has also developed Environment for Visualizing Images (ENVI), an image processing system for easily analyzing remotely sensed data written in IDL. The Visible Human CD, another Research Systems product, is the first complete digital reference of photographic images for exploring human anatomy.
APEX 3: a multi-purpose test platform for auditory psychophysical experiments.
Francart, Tom; van Wieringen, Astrid; Wouters, Jan
2008-07-30
APEX 3 is a software test platform for auditory behavioral experiments. It provides a generic means of setting up experiments without any programming. The supported output devices include sound cards and cochlear implants from Cochlear Corporation and Advanced Bionics Corporation. Many psychophysical procedures are provided and there is an interface to add custom procedures. Plug-in interfaces are provided for data filters and external controllers. APEX 3 is supported under Linux and Windows and is available free of charge.
Rodríguez-Tizcareño, Mario H; Barajas, Lizbeth; Pérez-Gásque, Marisol; Gómez, Salvador
2012-06-01
This report presents a protocol used to transfer the virtual treatment plan data to the surgical and prosthetic reality and its clinical application, bone site augmentation with computer-custom milled bovine bone graft blocks to their ideal architecture form, implant insertion based on image-guided stent fabrication, and the restorative manufacturing process through computed tomography-based software programs and navigation systems and the computer-aided design and manufacturing techniques for the treatment of the edentulous maxilla.
Neural-Network-Development Program
NASA Technical Reports Server (NTRS)
Phillips, Todd A.
1993-01-01
NETS, software tool for development and evaluation of neural networks, provides simulation of neural-network algorithms plus computing environment for development of such algorithms. Uses back-propagation learning method for all of networks it creates. Enables user to customize patterns of connections between layers of network. Also provides features for saving, during learning process, values of weights, providing more-precise control over learning process. Written in ANSI standard C language. Machine-independent version (MSC-21588) includes only code for command-line-interface version of NETS 3.0.
Generalizing the extensibility of a dynamic geometry software
NASA Astrophysics Data System (ADS)
Herceg, Đorđe; Radaković, Davorka; Herceg, Dejana
2012-09-01
Plug-and-play visual components in a Dynamic Geometry Software (DGS) enable development of visually attractive, rich and highly interactive dynamic drawings. We are developing SLGeometry, a DGS that contains a custom programming language, a computer algebra system (CAS engine) and a graphics subsystem. The basic extensibility framework of SLGeometry supports dynamic addition of new functions from attribute-annotated classes that implement runtime metadata registration in code. We present a general plug-in framework for dynamic importing of arbitrary Silverlight user interface (UI) controls into SLGeometry at runtime. The CAS engine maintains a metadata storage that describes each imported visual component and enables two-way communication between the expressions stored in the engine and the UI controls on the screen.
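SLGeometry itself is a Silverlight/.NET system and its attribute-based registration code is not reproduced in the abstract; the following Python sketch is only an analogy for the underlying idea of registering a function together with descriptive metadata at load time so a CAS-style engine can later discover it by name.

# Analogy only: a registry that records functions plus metadata when the
# module is imported, roughly mirroring attribute-annotated registration.
FUNCTION_REGISTRY = {}

def register(name, arity, description):
    def decorator(func):
        FUNCTION_REGISTRY[name] = {"callable": func, "arity": arity,
                                   "description": description}
        return func
    return decorator

@register("Midpoint", arity=2, description="Midpoint of two (x, y) points")
def midpoint(p, q):
    return ((p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0)

# The engine can now look the function up by name and inspect its metadata.
entry = FUNCTION_REGISTRY["Midpoint"]
print(entry["description"], entry["callable"]((0, 0), (2, 4)))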
EPANET Multi-Species Extension Software and User's Manual ...
EPANET is used in homeland security research to model contamination threats to water systems. Historically, EPANET has been limited to tracking the dynamics of a single chemical transported through a network of pipes and storage tanks, such as fluoride used in a tracer study or free chlorine used in a disinfection decay study. Recently, the NHSRC released a new extension to EPANET called EPANET-MSX (Multi-Species eXtension) that allows for the consideration of multiple interacting species in the bulk flow and on the pipe walls. This capability has been incorporated into both a stand-alone executable program as well as a toolkit library of functions that programmers can use to build customized applications.
The Bio-Community Perl toolkit for microbial ecology.
Angly, Florent E; Fields, Christopher J; Tyson, Gene W
2014-07-01
The development of bioinformatic solutions for microbial ecology in Perl is limited by the lack of modules to represent and manipulate microbial community profiles from amplicon and meta-omics studies. Here we introduce Bio-Community, an open-source, collaborative toolkit that extends BioPerl. Bio-Community interfaces with commonly used programs using various file formats, including BIOM, and provides operations such as rarefaction and taxonomic summaries. Bio-Community will help bioinformaticians to quickly piece together custom analysis pipelines and develop novel software. Availability and implementation: Bio-Community is cross-platform Perl code available from http://search.cpan.org/dist/Bio-Community under the Perl license. A readme file describes software installation and how to contribute. © The Author 2014. Published by Oxford University Press.
Specifications for Thesaurus Software.
ERIC Educational Resources Information Center
Milstead, Jessica L.
1991-01-01
Presents specifications for software that is designed to support manual development and maintenance of information retrieval thesauri. Evaluation of existing software and design of custom software is discussed, requirements for integration with larger systems and for the user interface are described, and relationships among terms are discussed.…
NASA Technical Reports Server (NTRS)
Mitchell, Sherry L.
2018-01-01
The Customer Avionics Interface Development and Analysis (CAIDA) supports the testing of the Launch Control System (LCS), NASA's command and control system for the Space Launch System (SLS), Orion Multi-Purpose Crew Vehicle (MPCV), and ground support equipment. The objective of the semester-long internship was to support day-to-day operations of CAIDA and help prepare for verification and validation of CAIDA software.
Mobile Food Ordering Application using Android OS Platform
NASA Astrophysics Data System (ADS)
Yosep Ricky, Michael
2014-03-01
The purpose of this research is to develop a food ordering application based on Android with New Order, Order History, Restaurant Profile, Order Status, Tracking Order, and Setting Profile features. The research method used is the waterfall model of the System Development Life Cycle (SDLC), with the following phases: requirements definition, in which the features needed in the application are analyzed and defined in detail; system and software design, in which the flow of the application is designed using storyboards, user experience design, Unified Modeling Language (UML) diagrams, and the database structure; implementation and unit testing, in which the database is built, the designs are translated into program code, and unit testing is performed; integration and system testing, in which the units are integrated into one system and system testing is performed; and operation and maintenance, in which the tested system is operated and, if any changes or repairs are needed, earlier phases can be revisited. The result of this research is an Android-based food ordering application for customer and courier users, and a website for restaurant and admin users. The conclusion of this research is that the application helps customers make orders easily, gives customers the detailed information they need, helps restaurants receive orders, and helps couriers during delivery.
Virtually fabricated guide for placement of the C-tube miniplate.
Paek, Janghyun; Jeong, Do-Min; Kim, Yong; Kim, Seong-Hun; Chung, Kyu-Rhim; Nelson, Gerald
2014-05-01
This paper introduces a virtually planned and stereolithographically fabricated guiding system that will allow the clinician to plan carefully for the best location of the device and to achieve an accurate position without complications. The scanned data from preoperative dental casts were edited to obtain preoperative 3-dimensional (3D) virtual models of the dentition. After the 3D virtual models were repositioned, the 3D virtual surgical guide was fabricated. A surgical guide was created onscreen, and then these virtual guides were materialized into real ones using the stereolithographic technique. Whereas the previously described guide required laboratory work to be performed by the orthodontist, our technique is more convenient because the laboratory work is done remotely by computer-aided design/computer-aided manufacturing technology. Because the miniplate is firmly held in place as the patient holds his or her mandibular teeth against the occlusal pad of the surgical guide, there is no risk that the miniscrews can slide on the bone surface during placement. The software program (2.5-dimensional software) in this study combines 2-dimensional cephalograms with 3D virtual dental models. This software is an effective and efficient alternative to 3D software when 3D computed tomography data are not available. To confidently and safely place a miniplate with screw fixation, a simple customized guide for an orthodontic miniplate was introduced. The use of a custom-made, rigid guide when placing miniplates will minimize complications such as vertical mislocation or slippage of the miniplate during placement. Copyright © 2014 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
Software design by reusing architectures
NASA Technical Reports Server (NTRS)
Bhansali, Sanjay; Nii, H. Penny
1992-01-01
Abstraction fosters reuse by providing a class of artifacts that can be instantiated or customized to produce a set of artifacts meeting different specific requirements. It is proposed that significant leverage can be obtained by abstracting software system designs and the design process. The result of such an abstraction is a generic architecture and a set of knowledge-based, customization tools that can be used to instantiate the generic architecture. An approach for designing software systems based on the above idea is described. The approach is illustrated through an implemented example, and the advantages and limitations of the approach are discussed.
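As a minimal illustration of the instantiation idea (not the authors' actual knowledge-based tools), a generic architecture can be expressed as an abstract skeleton whose customization points are filled in per application:

from abc import ABC, abstractmethod

class GenericPipeline(ABC):
    """Abstract 'architecture': the overall control flow is fixed,
    while the customization points are supplied per application."""
    def run(self, raw):
        return self.report(self.analyze(self.acquire(raw)))

    @abstractmethod
    def acquire(self, raw): ...
    @abstractmethod
    def analyze(self, data): ...
    def report(self, results):          # default behavior, may be overridden
        print(results)
        return results

class WordCountPipeline(GenericPipeline):
    # One instantiation of the generic architecture for a specific requirement.
    def acquire(self, raw):
        return raw.split()
    def analyze(self, data):
        return {"words": len(data)}

WordCountPipeline().run("reuse the architecture, customize the parts")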
Test Generator for MATLAB Simulations
NASA Technical Reports Server (NTRS)
Henry, Joel
2011-01-01
MATLAB Automated Test Tool, version 3.0 (MATT 3.0) is a software package that provides automated tools that reduce the time needed for extensive testing of simulation models that have been constructed in the MATLAB programming language by use of the Simulink and Real-Time Workshop programs. MATT 3.0 runs on top of the MATLAB engine application-program interface to communicate with the Simulink engine. MATT 3.0 automatically generates source code from the models, generates custom input data for testing both the models and the source code, and generates graphs and other presentations that facilitate comparison of the outputs of the models and the source code for the same input data. Context-sensitive and fully searchable help is provided in HyperText Markup Language (HTML) format.
NASA's Lunar Impact Monitoring Program
NASA Technical Reports Server (NTRS)
Suggs, Robert M.; Cooke, William; Swift, Wesley; Hollon, Nicholas
2007-01-01
NASA's Meteoroid Environment Office has implemented a program to monitor the Moon for meteoroid impacts from the Marshall Space Flight Center. Using off-the-shelf telescopes and video equipment, the Moon is monitored for as many as 10 nights per month, depending on weather. Custom software automatically detects flashes, which are confirmed by a second telescope, photometrically calibrated using background stars, and published on a website for correlation with other observations. Hypervelocity impact tests at the Ames Vertical Gun Facility have been performed to determine the luminous efficiency and ejecta characteristics. The purpose of this research is to define the impact ejecta environment for use by lunar spacecraft designers of the Constellation (manned lunar) Program. The observational techniques and preliminary results will be discussed.
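The abstract does not describe the detection algorithm itself; a common approach for this kind of video monitoring, shown here only as an illustrative sketch rather than the team's actual code, is to difference consecutive frames and flag pixels that brighten by more than a threshold above the background noise.

import numpy as np

def detect_flashes(frames, k_sigma=6.0):
    """Illustrative frame-differencing detector.
    frames: iterable of 2-D numpy arrays (grayscale video frames).
    Returns a list of (frame_index, y, x) candidate flash locations."""
    candidates = []
    prev = None
    for i, frame in enumerate(frames):
        frame = frame.astype(float)
        if prev is not None:
            diff = frame - prev
            threshold = k_sigma * diff.std()
            ys, xs = np.where(diff > threshold)
            candidates.extend((i, int(y), int(x)) for y, x in zip(ys, xs))
        prev = frame
    return candidates

# Example with synthetic data: a single bright pixel appearing in frame 5.
rng = np.random.default_rng(0)
frames = [rng.normal(100, 2, (64, 64)) for _ in range(10)]
frames[5][30, 40] += 200
print(detect_flashes(frames)[:3])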
Viewing ISS Data in Real Time via the Internet
NASA Technical Reports Server (NTRS)
Myers, Gerry; Chamberlain, Jim
2004-01-01
EZStream is a computer program that enables authorized users at diverse terrestrial locations to view, in real time, data generated by scientific payloads aboard the International Space Station (ISS). The only computation/communication resource needed for use of EZStream is a computer equipped with standard Web-browser software and a connection to the Internet. EZStream runs in conjunction with the TReK software, described in a prior NASA Tech Briefs article, that coordinates multiple streams of data for the ground communication system of the ISS. EZStream includes server components that interact with TReK within the ISS ground communication system and client components that reside in the users' remote computers. Once an authorized client has logged in, a server component of EZStream pulls the requested data from a TReK application-program interface and sends the data to the client. Future EZStream enhancements will include (1) extensions that enable the server to receive and process arbitrary data streams on its own and (2) a Web-based graphical-user-interface-building subprogram that enables a client who lacks programming expertise to create customized display Web pages.
An integrated CAD/CAM/robotic milling method for custom cementless femoral prostheses.
Wen-ming, Xi; Ai-min, Wang; Qi, Wu; Chang-hua, Liu; Jian-fei, Zhu; Fang-fang, Xia
2015-09-01
Aseptic loosening is the primary cause of cementless femoral prosthesis failure and is related to the primary stability of the cementless femoral prosthesis in the femoral cavity. The primary stability affects both the osseointegration and the long-term stability of cementless femoral prostheses. A custom cementless femoral prosthesis can improve the fit and fill of the prosthesis in the femoral cavity and decrease the micromotion of the proximal prosthesis such that the primary stability of the custom prosthesis can be improved and osseointegration of the proximal prosthesis is achieved. These results will help to achieve long-term stability in total hip arthroplasty (THA). In this paper, we introduce an integrated CAD/CAM/robotic method of milling custom cementless femoral prostheses. A 3D model is reconstructed from femoral CT images, and 3D design software is used to design a CAD model of the custom prosthesis. After the transformation matrices between the two units of the robotic system are calibrated, consistency between the CAM software and the robotic system can be achieved, and errors in the robotic milling can be limited. According to the CAD model of the custom prosthesis, the positions of the robotic tool points are produced by the CAM software of the CNC machine. The normal vector of three adjacent robotic tool point positions determines the pose of the robotic tool point. In conclusion, the fit rate of custom pig femur stems in the femoral cavities was 90.84%. After the custom femoral prostheses were inserted into the femoral cavities, the maximum gaps between the prostheses and the cavities measured less than 1 mm at the diaphysis and 1.3 mm at the metaphysis. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.
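The pose computation mentioned above amounts to taking the plane normal through three neighboring tool points; a minimal numpy sketch of that cross-product step (the surrounding robot kinematics and calibration are not shown, and the function name is ours) is:

import numpy as np

def tool_axis_from_points(p1, p2, p3):
    """Unit normal of the plane through three adjacent tool points,
    illustrating how a tool-axis direction can be derived from them."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    n = np.cross(p2 - p1, p3 - p1)
    return n / np.linalg.norm(n)

# Example: three nearby points on a machined surface patch.
print(tool_axis_from_points((0, 0, 0), (1, 0, 0.1), (0, 1, 0.05)))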
Software support for improving technology infusion
NASA Technical Reports Server (NTRS)
Feather, M. S.; Hicks, K. A.; Johnson, K. R.; Cornford, S. L.
2003-01-01
This paper focuses on describing the custom software tool, DDP, that was developed to support the TIMA process, and on showing how the needs of the TIMA process have influenced the development of the structure and capabilities of the DDP software.
Software Acquisition Improvement in the Aeronautical Systems Center
2008-09-01
software fielded, a variety of different methods were suggested by the interviewees. These included blocks, suites and other tailored processes developed... Selection of Research Method ...DoD look to the commercial market to buy tools, methods, environments, and application software, instead of custom-built software (DSB: 1987). These
ERIC Educational Resources Information Center
What Works Clearinghouse, 2010
2010-01-01
The combination of "Carnegie Learning Curricula and Cognitive Tutor[R] Software" merges algebra textbooks with interactive software developed around an artificial intelligence model that identifies strengths and weaknesses in an individual student's mastery of mathematical concepts. The software customizes prompts to focus on areas in…
Managing configuration software of ground software applications with glueware
NASA Technical Reports Server (NTRS)
Larsen, B.; Herrera, R.; Sesplaukis, T.; Cheng, L.; Sarrel, M.
2003-01-01
This paper reports on a simple, low-cost effort to streamline the configuration of the uplink software tools. Even though the existing ground system consisted of JPL and custom Cassini software rather than COTS, we chose a glueware approach--reintegrating with wrappers and bridges and adding minimal new functionality.
SeedVicious: Analysis of microRNA target and near-target sites.
Marco, Antonio
2018-01-01
Here I describe seedVicious, a versatile microRNA target site prediction software that can be easily fitted into annotation pipelines and run over custom datasets. SeedVicious finds microRNA canonical sites plus other, less efficient, target sites. Among other novel features, seedVicious can compute evolutionary gains/losses of target sites using maximum parsimony, and also detect near-target sites, which have one nucleotide different from a canonical site. Near-target sites are important to study population variation in microRNA regulation. Some analyses suggest that near-target sites may also be functional sites, although there is no conclusive evidence for that, and they may actually be target alleles segregating in a population. SeedVicious does not aim to outperform but to complement existing microRNA prediction tools. For instance, the precision of TargetScan is almost doubled (from 11% to ~20%) when we filter predictions by the distance between target sites using this program. Interestingly, two adjacent canonical target sites are more likely to be present in bona fide target transcripts than pairs of target sites at slightly longer distances. The software is written in Perl and runs on 64-bit Unix computers (Linux and MacOS X). Users with no computing experience can also run the program in a dedicated web-server by uploading custom data, or browse pre-computed predictions. SeedVicious and its associated web-server and database (SeedBank) are distributed under the GPL/GNU license.
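To make the distinction between canonical and near-target sites concrete, here is a small Python sketch (not seedVicious code; the real tool handles multiple site types, alignments and parsimony reconstruction) that scans a transcript for perfect matches to a miRNA seed and for windows that differ from it by exactly one nucleotide.

# Illustrative only: canonical site = perfect match to the reverse complement
# of the miRNA seed (positions 2-8); near-target site = one mismatch away.
COMP = str.maketrans("ACGU", "TGCA")

def seed_site(mirna):
    """Reverse complement (in DNA alphabet) of miRNA positions 2-8."""
    return mirna[1:8].translate(COMP)[::-1]

def scan(transcript, mirna):
    site = seed_site(mirna)
    hits = {"canonical": [], "near": []}
    for i in range(len(transcript) - len(site) + 1):
        window = transcript[i:i + len(site)]
        mismatches = sum(a != b for a, b in zip(window, site))
        if mismatches == 0:
            hits["canonical"].append(i)
        elif mismatches == 1:
            hits["near"].append(i)
    return hits

mirna = "UGAGGUAGUAGGUUGUAUAGUU"   # let-7a, written 5'->3' in RNA
print(seed_site(mirna))            # DNA sequence a canonical site must contain
print(scan("AAACTACCTCAAAACTACCACAAA", mirna))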
The discounting model selector: Statistical software for delay discounting applications.
Gilroy, Shawn P; Franck, Christopher T; Hantula, Donald A
2017-05-01
Original, open-source computer software was developed and validated against established delay discounting methods in the literature. The software executed approximate Bayesian model selection methods from user-supplied temporal discounting data and computed the effective delay 50 (ED50) from the best performing model. The software was custom-designed to enable behavior analysts to conveniently apply recent statistical methods to temporal discounting data with the aid of a graphical user interface (GUI). The results of independent validation of the approximate Bayesian model selection methods indicated that the program provided results identical to those of the original source paper and its methods. Monte Carlo simulation (n = 50,000) confirmed that the true model was selected most often in each setting. Simulation code and data for this study were posted to an online repository for use by other researchers. The model selection approach was applied to three existing delay discounting data sets from the literature in addition to the data from the source paper. Comparisons of model-selected ED50 were consistent with traditional indices of discounting. Conceptual issues related to the development and use of computer software by behavior analysts and the opportunities afforded by free and open-source software are discussed, and a review of possible expansions of this software is provided. © 2017 Society for the Experimental Analysis of Behavior.
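As general background (not specific to this program), one of the discounting models typically included in such comparisons is Mazur's hyperbolic model, for which the ED50 has a simple closed form:

V(D) = \frac{A}{1 + kD}, \qquad V(\mathrm{ED50}) = \frac{A}{2} \;\Longrightarrow\; \mathrm{ED50} = \frac{1}{k},

where A is the undelayed amount, D the delay, and k the fitted discount rate; other candidate models (e.g., exponential) yield their own ED50 expressions, and the software reports the ED50 of whichever model is selected as best performing.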
GenomeGraphs: integrated genomic data visualization with R.
Durinck, Steffen; Bullard, James; Spellman, Paul T; Dudoit, Sandrine
2009-01-06
Biological studies involve a growing number of distinct high-throughput experiments to characterize samples of interest. There is a lack of methods to visualize these different genomic datasets in a versatile manner. In addition, genomic data analysis requires integrated visualization of experimental data along with constantly changing genomic annotation and statistical analyses. We developed GenomeGraphs, as an add-on software package for the statistical programming environment R, to facilitate integrated visualization of genomic datasets. GenomeGraphs uses the biomaRt package to perform on-line annotation queries to Ensembl and translates these to gene/transcript structures in viewports of the grid graphics package. This allows genomic annotation to be plotted together with experimental data. GenomeGraphs can also be used to plot custom annotation tracks in combination with different experimental data types together in one plot using the same genomic coordinate system. GenomeGraphs is a flexible and extensible software package which can be used to visualize a multitude of genomic datasets within the statistical programming environment R.
Shao, Zhen-Xuan; Wang, Jian-Shun; Lin, Zhong-Ke; Ni, Wen-Fei; Wang, Xiang-Yang
2017-01-01
Transpedicular transdiscal screw fixation is an alternative technique used in lumbar spine fixation; however, it requires an accurate screw trajectory. The aim of this study is to design a novel 3D-printed custom drill guide and investigate its accuracy in guiding the trajectory of transpedicular transdiscal (TPTD) lumbar screw fixation. DICOM images of thirty lumbar functional segment units (FSU, two segments) of L1–L4 were acquired from the PACS system in our hospital (patients who underwent a CT scan for other abdominal diseases and had normal spine anatomy) and imported into reverse design software for three-dimensional reconstruction. The images were used to print 3D lumbar models and were imported into CAD software to design an optimal TPTD screw trajectory and a matched custom drill guide. After both the 3D-printed FSU models and the 3D-printed custom drill guide were prepared, the TPTD screws were guided with the custom drill guide and introduced into the 3D-printed FSU models. No statistically significant difference in screw trajectory angles was observed between the digital model and the 3D-printed model (P > 0.05). Our study found that, with the help of CAD software, it is feasible to design a TPTD screw custom drill guide that can guide an accurate TPTD screw trajectory on 3D-printed lumbar models. PMID:28717599
Custom software development for use in a clinical laboratory
Sinard, John H.; Gershkovich, Peter
2012-01-01
In-house software development for use in a clinical laboratory is a controversial issue. Many of the objections raised are based on outdated software development practices, an exaggeration of the risks involved, and an underestimation of the benefits that can be realized. Buy versus build analyses typically do not consider total costs of ownership, and unfortunately decisions are often made by people who are not directly affected by the workflow obstacles or benefits that result from those decisions. We have been developing custom software for clinical use for over a decade, and this article presents our perspective on this practice. A complete analysis of the decision to develop or purchase must ultimately examine how the end result will mesh with the departmental workflow, and custom-developed solutions typically can have the greater positive impact on efficiency and productivity, substantially altering the decision balance sheet. Involving the end-users in preparation of the functional specifications is crucial to the success of the process. A large development team is not needed, and even a single programmer can develop significant solutions. Many of the risks associated with custom development can be mitigated by a well-structured development process, use of open-source tools, and embracing an agile development philosophy. In-house solutions have the significant advantage of being adaptable to changing departmental needs, contributing to efficient and higher quality patient care. PMID:23372985
Custom software development for use in a clinical laboratory.
Sinard, John H; Gershkovich, Peter
2012-01-01
In-house software development for use in a clinical laboratory is a controversial issue. Many of the objections raised are based on outdated software development practices, an exaggeration of the risks involved, and an underestimation of the benefits that can be realized. Buy versus build analyses typically do not consider total costs of ownership, and unfortunately decisions are often made by people who are not directly affected by the workflow obstacles or benefits that result from those decisions. We have been developing custom software for clinical use for over a decade, and this article presents our perspective on this practice. A complete analysis of the decision to develop or purchase must ultimately examine how the end result will mesh with the departmental workflow, and custom-developed solutions typically can have the greater positive impact on efficiency and productivity, substantially altering the decision balance sheet. Involving the end-users in preparation of the functional specifications is crucial to the success of the process. A large development team is not needed, and even a single programmer can develop significant solutions. Many of the risks associated with custom development can be mitigated by a well-structured development process, use of open-source tools, and embracing an agile development philosophy. In-house solutions have the significant advantage of being adaptable to changing departmental needs, contributing to efficient and higher quality patient care.
Combining Agile and Traditional: Customer Communication in Distributed Environment
NASA Astrophysics Data System (ADS)
Korkala, Mikko; Pikkarainen, Minna; Conboy, Kieran
Distributed development is a radically increasing phenomenon in modern software development environments. At the same time, traditional and agile methodologies and combinations of those are being used in the industry. Agile approaches place a large emphasis on customer communication. However, existing knowledge on customer communication in distributed agile development seems to be lacking. In order to shed light on this topic and provide practical guidelines for companies in distributed agile environments, a qualitative case study was conducted in a large globally distributed software company. The key finding was that it might be difficult for an agile organization to get relevant information from a traditional type of customer organization, even though the customer communication was indicated to be active and utilized via multiple different communication media. Several challenges discussed in this paper referred to "information blackout" indicating the importance of an environment fostering meaningful communication. In order to evaluate if this environment can be created a set of guidelines is proposed.
A Reconfigurable Simulation-Based Test System for Automatically Assessing Software Operating Skills
ERIC Educational Resources Information Center
Su, Jun-Ming; Lin, Huan-Yu
2015-01-01
In recent years, software operating skills, the ability in computer literacy to solve problems using specific software, has become much more important. A great deal of research has also proven that students' software operating skills can be efficiently improved by practicing customized virtual and simulated examinations. However, constructing…
IT Software Development and IT Operations Strategic Alignment: An Agile DevOps Model
ERIC Educational Resources Information Center
Hart, Michael
2017-01-01
Information Technology (IT) departments that include development and operations are essential to develop software that meet customer needs. DevOps is a term originally constructed from software development and IT operations. DevOps includes the collaboration of all stakeholders such as software engineers and systems administrators involved in the…
ImatraNMR: novel software for batch integration and analysis of quantitative NMR spectra.
Mäkelä, A V; Heikkilä, O; Kilpeläinen, I; Heikkinen, S
2011-08-01
Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D (1)H and (13)C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single or few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present a novel software package, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs, such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, it has currently only been tested with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request. Copyright © 2011 Elsevier Inc. All rights reserved.
Reusable experiment controllers, case studies
NASA Astrophysics Data System (ADS)
Buckley, Brian A.; Gaasbeck, Jim Van
1996-03-01
Congress has given NASA and the science community a reality check. The tight and ever shrinking budgets are trimming the fat from many space science programs. No longer can a Principal Investigator (PI) afford to waste development dollars on re-inventing spacecraft controllers, experiment/payload controllers, ground control systems, or test sets. Inheritance of the Ground Support Equipment (GSE) from one program to another is not a significant re-use of technology to develop a science mission in these times. Reduction of operational staff and highly autonomous experiments are needed to reduce the sustaining cost of a mission. The re-use of an infrastructure from one program to another is needed to truly attain the cost and time savings required. Interface and Control Systems, Inc. (ICS) has a long history of re-usable software. Navy, Air Force, and NASA programs have benefited from the re-use of a common control system from program to program. Several standardization efforts in the AIAA have adopted the Spacecraft Command Language (SCL) architecture as a point solution to satisfy requirements for re-use and autonomy. The Environmental Research Institute of Michigan (ERIM) has been a long-standing customer of ICS and is working on its 4th-generation system using SCL. Much of the hardware and software infrastructure has been re-used from mission to mission with little cost for re-hosting a new experiment. The same software infrastructure has successfully been used on Clementine, and an end-to-end system is being deployed for the Far Ultraviolet Spectroscopic Explorer (FUSE) for Johns Hopkins University. A case study of the ERIM programs, Clementine, and FUSE will be detailed in this paper.
Cui, Yang; Hanley, Luke
2015-06-01
ChiMS is an open-source data acquisition and control software program written within LabVIEW for high speed imaging and depth profiling mass spectrometers. ChiMS can also transfer large datasets from a digitizer to computer memory at high repetition rate, save data to hard disk at high throughput, and perform high speed data processing. The data acquisition mode generally simulates a digital oscilloscope, but with peripheral devices integrated for control as well as advanced data sorting and processing capabilities. Customized user-designed experiments can be easily written based on several included templates. ChiMS is additionally well suited to non-laser-based mass spectrometer imaging and various other experiments in laser physics, physical chemistry, and surface science.
Cui, Yang; Hanley, Luke
2015-01-01
ChiMS is an open-source data acquisition and control software program written within LabVIEW for high speed imaging and depth profiling mass spectrometers. ChiMS can also transfer large datasets from a digitizer to computer memory at high repetition rate, save data to hard disk at high throughput, and perform high speed data processing. The data acquisition mode generally simulates a digital oscilloscope, but with peripheral devices integrated for control as well as advanced data sorting and processing capabilities. Customized user-designed experiments can be easily written based on several included templates. ChiMS is additionally well suited to non-laser-based mass spectrometer imaging and various other experiments in laser physics, physical chemistry, and surface science. PMID:26133872
NASA Astrophysics Data System (ADS)
Cui, Yang; Hanley, Luke
2015-06-01
ChiMS is an open-source data acquisition and control software program written within LabVIEW for high speed imaging and depth profiling mass spectrometers. ChiMS can also transfer large datasets from a digitizer to computer memory at high repetition rate, save data to hard disk at high throughput, and perform high speed data processing. The data acquisition mode generally simulates a digital oscilloscope, but with peripheral devices integrated for control as well as advanced data sorting and processing capabilities. Customized user-designed experiments can be easily written based on several included templates. ChiMS is additionally well suited to non-laser-based mass spectrometer imaging and various other experiments in laser physics, physical chemistry, and surface science.
A Generic Ground Framework for Image Expertise Centres and Small-Sized Production Centres
NASA Astrophysics Data System (ADS)
Sellé, A.
2009-05-01
Initiated by the Pleiades Earth Observation Program, the CNES (French Space Agency) has developed a generic collaborative framework for its image quality centre, highly customisable for any upcoming expertise centre. This collaborative framework has been designed to be used by a group of experts or scientists who want to share data and processing chains and manage interfaces with external entities. Its flexible and scalable architecture complies with the core requirements: defining a user data model with no impact on the software (generic data access), integrating user processing chains with a GUI builder and built-in APIs, and offering a scalable architecture to fit any performance requirement and accompany growing projects. The CNES has granted licences to two software companies that will be able to redistribute this framework to any customer.
Computer Technology for Industry
NASA Technical Reports Server (NTRS)
1979-01-01
In this age of the computer, more and more business firms are automating their operations for increased efficiency in a great variety of jobs, from simple accounting to managing inventories, from precise machining to analyzing complex structures. In the interest of national productivity, NASA is providing assistance both to longtime computer users and newcomers to automated operations. Through a special technology utilization service, NASA saves industry time and money by making available already developed computer programs which have secondary utility. A computer program is essentially a set of instructions which tells the computer how to produce desired information or effect by drawing upon its stored input. Developing a new program from scratch can be costly and time-consuming. Very often, however, a program developed for one purpose can readily be adapted to a totally different application. To help industry take advantage of existing computer technology, NASA operates the Computer Software Management and Information Center (COSMIC) (registered trademark), located at the University of Georgia. COSMIC maintains a large library of computer programs developed for NASA, the Department of Defense, the Department of Energy and other technology-generating agencies of the government. The Center gets a continual flow of software packages, screens them for adaptability to private sector usage, stores them and informs potential customers of their availability.
A Language for Specifying Compiler Optimizations for Generic Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willcock, Jeremiah J.
2007-01-01
Compiler optimization is important to software performance, and modern processor architectures make optimization even more critical. However, many modern software applications use libraries providing high levels of abstraction. Such libraries often hinder effective optimization — the libraries are difficult to analyze using current compiler technology. For example, high-level libraries often use dynamic memory allocation and indirectly expressed control structures, such as iterator-based loops. Programs using these libraries often cannot achieve an optimal level of performance. On the other hand, software libraries have also been recognized as potentially aiding in program optimization. One proposed implementation of library-based optimization is to allow the library author, or a library user, to define custom analyses and optimizations. Only limited systems have been created to take advantage of this potential, however. One problem in creating a framework for defining new optimizations and analyses is how users are to specify them: implementing them by hand inside a compiler is difficult and prone to errors. Thus, a domain-specific language for library-based compiler optimizations would be beneficial. Many optimization specification languages have appeared in the literature, but they tend to be either limited in power or unnecessarily difficult to use. Therefore, I have designed, implemented, and evaluated the Pavilion language for specifying program analyses and optimizations, designed for library authors and users. These analyses and optimizations can be based on the implementation of a particular library, its use in a specific program, or on the properties of a broad range of types, expressed through concepts. The new system is intended to provide a high level of expressiveness, even though the intended users are unlikely to be compiler experts.
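Pavilion's own syntax is not given in this abstract, so the following Python sketch is only a loose analogy for the kind of user-defined, library-level rewrite such a system lets authors express: a pattern (sum over a list comprehension) is matched and replaced by a cheaper equivalent (sum over a generator expression).

import ast

class SumOfListComp(ast.NodeTransformer):
    """Rewrite sum([expr for ...]) into sum(expr for ...) so no temporary list is built."""
    def visit_Call(self, node):
        self.generic_visit(node)
        if (isinstance(node.func, ast.Name) and node.func.id == "sum"
                and len(node.args) == 1 and isinstance(node.args[0], ast.ListComp)):
            lc = node.args[0]
            node.args[0] = ast.GeneratorExp(elt=lc.elt, generators=lc.generators)
        return node

tree = ast.parse("total = sum([f(i) for i in items])")
tree = ast.fix_missing_locations(SumOfListComp().visit(tree))
print(ast.unparse(tree))   # requires Python 3.9+; prints: total = sum(f(i) for i in items)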
Rush, Perry O; Boone, William R
2009-01-01
This article provides information regarding the introduction of virtual education into classroom instruction, wherein a method of classroom instruction was developed with the use of a computer, digital camera, and various software programs. This approach simplified testing procedures, thus reducing institutional costs substantially by easing the demand for manpower, and seemed to improve average grade performance. Organized files with hundreds of digital pictures have created a range of instructor resources. Much of the new course material was organized onto compact disks to complement course notes. Customizing presentations with digital technology holds potential benefits for students, instructors and the institution.
Numerical aerodynamic simulation facility. Preliminary study extension
NASA Technical Reports Server (NTRS)
1978-01-01
The production of an optimized design of key elements of the candidate facility was the primary objective of this report. This was accomplished by effort in the following tasks: (1) to further develop, optimize, and describe the functional description of the custom hardware; (2) to delineate trade off areas between performance, reliability, availability, serviceability, and programmability; (3) to develop metrics and models for validation of the candidate system's performance; (4) to conduct a functional simulation of the system design; (5) to perform a reliability analysis of the system design; and (6) to develop the software specifications, including a user-level high-level programming language and a correspondence between the programming language and the instruction set, and to outline the operating system requirements.
External Data and Attribute Hyperlink Programs for Promis*e(Registered Trademark)
NASA Technical Reports Server (NTRS)
Derengowski, Rich; Gruel, Andrew
2001-01-01
External Data and Attribute Hyperlink are computer programs that can be added to Promis*e(trademark), which is a commercial software system that automates routine tasks in the design (including drawing schematic diagrams) of electrical control systems. The programs were developed under the Stennis Space Center's (SSC) Dual Use Technology Development Program to provide capabilities for SSC's BMCS configuration management system, which uses Promis*e(trademark). The External Data program enables the storage and management of information in an external database linked to a drawing. Changes can be made either in the database or on the drawing. Information that originates outside Promis*e(trademark) can be stored in custom fields that can be added to the database. Although this information is not available in Promis*e(trademark) printed drawings, it can be associated with symbols in the drawings, and can be retrieved through the drawings when the software is running. The Attribute Hyperlink program enables the addition of hyperlink information as attributes of symbols. This program enables the formation of a direct hyperlink between a schematic diagram and an Internet site or a file on a compact disk, on the user's hard drive, or on another computer on a network to which the user's computer is connected. The user can then obtain information directly related to the part (e.g., maintenance or troubleshooting information) associated with the hyperlink.
NASA Data Acquisition System Software Development for Rocket Propulsion Test Facilities
NASA Technical Reports Server (NTRS)
Herbert, Phillip W., Sr.; Elliot, Alex C.; Graves, Andrew R.
2015-01-01
Current NASA propulsion test facilities include Stennis Space Center in Mississippi, Marshall Space Flight Center in Alabama, Plum Brook Station in Ohio, and White Sands Test Facility in New Mexico. Within and across these centers, a diverse set of data acquisition systems exist with different hardware and software platforms. The NASA Data Acquisition System (NDAS) is a software suite designed to operate and control many critical aspects of rocket engine testing. The software suite combines real-time data visualization, data recording to a variety of formats, short-term and long-term acquisition system calibration capabilities, test stand configuration control, and a variety of data post-processing capabilities. Additionally, data stream conversion functions exist to translate test facility data streams to and from downstream systems, including engine customer systems. The primary design goals for NDAS are flexibility, extensibility, and modularity. Providing a common user interface for a variety of hardware platforms helps drive consistency and error reduction during testing. In addition, with an understanding that test facilities have different requirements and setups, the software is designed to be modular. One engine program may require real-time displays and data recording; others may require more complex data stream conversion, measurement filtering, or test stand configuration management. The NDAS suite allows test facilities to choose which components to use based on their specific needs. The NDAS code is primarily written in LabVIEW, a graphical, data-flow driven language. Although LabVIEW is a general-purpose programming language, large-scale software development in the language is relatively rare compared to more commonly used languages. The NDAS software suite also makes extensive use of a new, advanced development framework called the Actor Framework. The Actor Framework provides a level of code reuse and extensibility that has previously been difficult to achieve using LabVIEW.
eSciMart: Web Platform for Scientific Software Marketplace
NASA Astrophysics Data System (ADS)
Kryukov, A. P.; Demichev, A. P.
2016-10-01
In this paper we suggest a design of a web marketplace where users of scientific application software and databases, presented in the form of web services, as well as their providers, will have a simultaneous presence. The model, which will be the basis for the web marketplace, is close to the customer-to-customer (C2C) model, which has been successfully used, for example, on auction sites such as eBay (ebay.com). Unlike the classical C2C model, the suggested marketplace focuses on application software in the form of web services, and on standardization of the API through which application software will be integrated into the web marketplace. A prototype of such a platform, entitled eSciMart, is currently being developed at SINP MSU.
Using Rose and Compass for Authentication
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, G
2009-07-09
Many recent non-proliferation software projects include a software authentication component. In this context, 'authentication' is defined as determining that a software package performs only its intended purpose and performs that purpose correctly and reliably over many years. In addition to visual inspection by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs both to aid the visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and have limited extensibility. An open-source, extensible tool can be customized to the unique needs of each project. ROSE is an LLNL-developed robust source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C, C++, and FORTRAN. It continues to be extended to support the automated analysis of binaries (x86, ARM, and PowerPC). We continue to extend ROSE to address a number of security specific requirements and apply it to software authentication for non-proliferation projects. We will give an update on the status of our work.
NASA Technical Reports Server (NTRS)
Avila, Edwin M. Martinez; Muniz, Ricardo; Szafran, Jamie; Dalton, Adam
2011-01-01
Lines of code (LOC) analysis is one of the methods used to measure programmer productivity and estimate schedules of programming projects. The Launch Control System (LCS) had previously used this method to estimate the amount of work and to plan development efforts. The disadvantage of using LOC as a measure of effort is that only 30% to 35% of the total effort of a software project involves coding [8]. In this application, function points are used instead of LOC for a better estimation of the hours needed to develop each piece of software. Because of these disadvantages, Jamie Szafran of the System Software Branch of Control And Data Systems (NE-C3) at Kennedy Space Center developed a web application called Function Point Analysis (FPA) Depot. The objective of this web application is that the LCS software architecture team can use the data to more accurately estimate the effort required to implement customer requirements. This paper describes the evolution of the domain model used for function point analysis as project managers continually strive to generate more accurate estimates.
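The function point counting that the FPA Depot abstract above refers to can be sketched compactly. The sketch below is not part of the FPA Depot application; it uses the standard IFPUG average-complexity weights for external inputs (EI), external outputs (EO), external inquiries (EQ), internal logical files (ILF), and external interface files (EIF), and the component counts are hypothetical.

    # Unadjusted function point count from component counts (illustrative only).
    AVERAGE_WEIGHTS = {"EI": 4, "EO": 5, "EQ": 4, "ILF": 10, "EIF": 7}

    def unadjusted_function_points(counts, weights=AVERAGE_WEIGHTS):
        """Sum each component count multiplied by its complexity weight."""
        return sum(weights[kind] * n for kind, n in counts.items())

    # Hypothetical subsystem: 6 inputs, 4 outputs, 3 inquiries, 2 internal files,
    # and 1 external interface file.
    counts = {"EI": 6, "EO": 4, "EQ": 3, "ILF": 2, "EIF": 1}
    print(unadjusted_function_points(counts))   # 24 + 20 + 12 + 20 + 7 = 83

The unadjusted total would then typically be scaled by a value adjustment factor and converted to hours with a historically calibrated productivity rate.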
Aoun, Bachir
2016-05-05
A new Reverse Monte Carlo (RMC) package "fullrmc" for atomic or rigid body and molecular, amorphous, or crystalline materials is presented. fullrmc's main purpose is to provide fully modular, fast, and flexible software, thoroughly documented, complex molecules enabled, written in a modern programming language (python, cython, C and C++ when performance is needed) and complying with modern programming practices. fullrmc's approach to solving an atomic or molecular structure is different from existing RMC algorithms and software. In a nutshell, traditional RMC methods and software randomly adjust atom positions until the whole system has the greatest consistency with a set of experimental data. In contrast, fullrmc applies smart moves endorsed with reinforcement machine learning to groups of atoms. While fullrmc allows running traditional RMC modeling, the uniqueness of this approach resides in its ability to customize grouping atoms in any convenient way with no additional programming efforts and to apply smart and more physically meaningful moves to the defined groups of atoms. In addition, fullrmc provides a unique way, at almost no additional computational cost, to recur a group's selection, allowing the system to escape local minima by refining a group's position or by exploring, through and beyond disallowed positions and energy barriers, the unrestricted three-dimensional space around a group. © 2016 Wiley Periodicals, Inc.
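For contrast with fullrmc's group-based moves, the sketch below shows the traditional single-atom RMC step the abstract above refers to: a random displacement accepted or rejected according to the change in misfit against the experimental data. It is a generic illustration under assumed inputs (an N-by-3 positions array and a user-supplied compute_chi2 callable), not fullrmc's API.

    import numpy as np

    def rmc_step(positions, compute_chi2, max_move=0.2, rng=None):
        """One traditional RMC move: displace a random atom, accept if the fit
        to the data improves, or with a Boltzmann-like probability otherwise
        (generic illustration, not fullrmc's interface)."""
        rng = rng or np.random.default_rng()
        chi2_old = compute_chi2(positions)
        trial = positions.copy()
        i = rng.integers(len(trial))
        trial[i] += rng.uniform(-max_move, max_move, size=3)   # random displacement
        chi2_new = compute_chi2(trial)
        if chi2_new <= chi2_old or rng.random() < np.exp(-(chi2_new - chi2_old) / 2.0):
            return trial, chi2_new    # accept the move
        return positions, chi2_old    # reject and keep the old configuration

fullrmc's contribution, as described above, is to replace this single-atom displacement with customizable, physically meaningful moves applied to user-defined groups of atoms.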
Aoun, Bachir
2016-01-22
Here, a new Reverse Monte Carlo (RMC) package ‘fullrmc’ for atomic or rigid body and molecular, amorphous or crystalline materials is presented. fullrmc's main purpose is to provide fully modular, fast, and flexible software, thoroughly documented, complex molecules enabled, written in a modern programming language (python, cython, C and C++ when performance is needed) and complying with modern programming practices. fullrmc's approach to solving an atomic or molecular structure is different from existing RMC algorithms and software. In a nutshell, traditional RMC methods and software randomly adjust atom positions until the whole system has the greatest consistency with a set of experimental data. In contrast, fullrmc applies smart moves endorsed with reinforcement machine learning to groups of atoms. While fullrmc allows running traditional RMC modelling, the uniqueness of this approach resides in its ability to customize grouping atoms in any convenient way with no additional programming efforts and to apply smart and more physically meaningful moves to the defined groups of atoms. Also, fullrmc provides a unique way, at almost no additional computational cost, to recur a group's selection, allowing the system to escape local minima by refining a group's position or by exploring, through and beyond disallowed positions and energy barriers, the unrestricted three-dimensional space around a group.
Bar-Code System for a Microbiological Laboratory
NASA Technical Reports Server (NTRS)
Law, Jennifer; Kirschner, Larry
2007-01-01
A bar-code system has been assembled for a microbiological laboratory that must examine a large number of samples. The system includes a commercial bar-code reader, computer hardware and software components, plus custom-designed database software. The software generates a user-friendly, menu-driven interface.
Improved Airborne System for Sensing Wildfires
NASA Technical Reports Server (NTRS)
McKeown, Donald; Richardson, Michael
2008-01-01
The Wildfire Airborne Sensing Program (WASP) is engaged in a continuing effort to develop an improved airborne instrumentation system for sensing wildfires. The system could also be used for other aerial-imaging applications, including mapping and military surveillance. Unlike prior airborne fire-detection instrumentation systems, the WASP system would not be based on custom-made multispectral line scanners and associated custom-made complex optomechanical servomechanisms, sensors, readout circuitry, and packaging. Instead, the WASP system would be based on commercial off-the-shelf (COTS) equipment that would include (1) three or four electronic cameras (one for each of three or four wavelength bands) instead of a multispectral line scanner; (2) all associated drive and readout electronics; (3) a camera-pointing gimbal; (4) an inertial measurement unit (IMU) and a Global Positioning System (GPS) receiver for measuring the position, velocity, and orientation of the aircraft; and (5) a data-acquisition subsystem. It would be necessary to custom-develop an integrated sensor optical-bench assembly, a sensor-management subsystem, and software. The use of mostly COTS equipment is intended to reduce development time and cost, relative to those of prior systems.
Evaluation of customer satisfaction level of different projects.
Das, Nandini; Samanta, Niladri
2005-01-01
Customer satisfaction as the key element for success in business is a major concern for any industry. In this paper we propose a customer satisfaction index using principal component analysis for a software solution company. This index was used as an input to the marketing division to identify their potential customers from their past experience. Since this is a very common problem for any industry, the same approach can be used in similar situations.
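A principal-component index of the kind the abstract above proposes can be sketched as follows; the attribute ratings are hypothetical, and in practice the sign of the first component must be checked so that a higher index corresponds to higher satisfaction.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    def satisfaction_index(scores):
        """Project standardized survey responses onto the first principal
        component and rescale to a 0-100 index (illustrative sketch only)."""
        z = StandardScaler().fit_transform(scores)
        pc1 = PCA(n_components=1).fit_transform(z).ravel()
        return 100.0 * (pc1 - pc1.min()) / (pc1.max() - pc1.min())

    # Five hypothetical customers rating four service attributes on a 1-10 scale.
    scores = np.array([[8, 7, 9, 8], [5, 6, 4, 5], [9, 9, 8, 9],
                       [3, 4, 2, 3], [7, 6, 7, 7]])
    print(satisfaction_index(scores).round(1))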
Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design
NASA Technical Reports Server (NTRS)
Wuerer, J. E.; Gran, M.; Held, T. W.
1994-01-01
The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.
Distribution Feeder Modeling for Time-Series Simulation of Voltage Management Strategies: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giraldez Miner, Julieta I; Gotseff, Peter; Nagarajan, Adarsh
This paper presents techniques to create baseline distribution models using a utility feeder from Hawai'ian Electric Company. It describes the software-to-software conversion, steady-state, and time-series validations of a utility feeder model. It also presents a methodology to add secondary low-voltage circuit models to accurately capture the voltage at the customer meter level. This enables preparing models to perform studies that simulate how customer-sited resources integrate into legacy utility distribution system operations.
Development of Analytical Plug-ins for ENSITE: Version 1.0
2017-11-01
ENSITE’s core-software platform builds upon leading geospatial platforms already in use by the Army and is designed to offer an easy-to-use, customized set of workflows for CB planners. Within this platform are added software compo...
Automated Flight Dynamics Product Generation for the EOS AM-1 Spacecraft
NASA Technical Reports Server (NTRS)
Matusow, Carla
1999-01-01
As part of NASA's Earth Science Enterprise, the Earth Observing System (EOS) AM-1 spacecraft is designed to monitor long-term, global, environmental changes. Because of the complexity of the AM-1 spacecraft, the mission operations center requires more than 80 distinct flight dynamics products (reports). To create these products, the AM-1 Flight Dynamics Team (FDT) will use a combination of modified commercial software packages (e.g., Analytical Graphic's Satellite ToolKit) and NASA-developed software applications. While providing the most cost-effective solution to meeting the mission requirements, the integration of these software applications raises several operational concerns: (1) Routine product generation requires knowledge of multiple applications executing on a variety of hardware platforms. (2) Generating products is a highly interactive process requiring a user to interact with each application multiple times to generate each product. (3) Routine product generation requires several hours to complete. (4) User interaction with each application introduces the potential for errors, since users are required to manually enter filenames and input parameters as well as run applications in the correct sequence. Generating products requires some level of flight dynamics expertise to determine the appropriate inputs and sequencing. To address these issues, the FDT developed an automation software tool called AutoProducts, which runs on a single hardware platform and provides all necessary coordination and communication among the various flight dynamics software applications. AutoProducts autonomously retrieves necessary files, sequences and executes applications with correct input parameters, and delivers the final flight dynamics products to the appropriate customers. Although AutoProducts will normally generate pre-programmed sets of routine products, its graphical interface allows for easy configuration of customized and one-of-a-kind products. Additionally, AutoProducts has been designed as a mission-independent tool, and can be easily reconfigured to support other missions or incorporate new flight dynamics software packages. After the AM-1 launch, AutoProducts will run automatically at pre-determined time intervals. The AutoProducts tool reduces many of the concerns associated with the flight dynamics product generation. Although AutoProducts required a significant effort to develop because of the complexity of the interfaces involved, its use will provide significant cost savings through reduced operator time and maximum product reliability. In addition, user satisfaction is significantly improved and flight dynamics experts have more time to perform valuable analysis work. This paper will describe the evolution of the AutoProducts tool, highlighting the cost savings and customer satisfaction resulting from its development. It will also provide details about the tool including its graphical interface and operational capabilities.
Parallel Wavefront Analysis for a 4D Interferometer
NASA Technical Reports Server (NTRS)
Rao, Shanti R.
2011-01-01
This software provides a programming interface for automating data collection with a PhaseCam interferometer from 4D Technology, and distributing the image-processing algorithm across a cluster of general-purpose computers. Multiple instances of 4Sight (4D Technology's proprietary software) run on a networked cluster of computers. Each connects to a single server (the controller) and waits for instructions. The controller directs the interferometer to acquire several images, then assigns each image to a different computer for processing. When the image processing is finished, the server directs one of the computers to collate and combine the processed images, saving the resulting measurement in a file on a disk. The available software captures approximately 100 images and analyzes them immediately. This software separates the capture and analysis processes, so that analysis can be done at a different time and faster by running the algorithm in parallel across several processors. The PhaseCam family of interferometers can measure an optical system in milliseconds, but it takes many seconds to process the data so that it is usable. In characterizing an adaptive optics system, like the next generation of astronomical observatories, thousands of measurements are required, and the processing time quickly becomes excessive. A programming interface distributes data processing for a PhaseCam interferometer across a Windows computing cluster. A scriptable controller program coordinates data acquisition from the interferometer, storage on networked hard disks, and parallel processing. Idle time of the interferometer is minimized. This architecture is implemented in Python and JavaScript, and may be altered to fit a customer's needs.
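The controller/worker split described above can be illustrated with a single-machine analogue; in this sketch Python's multiprocessing stands in for the networked 4Sight instances, and process_frame is a hypothetical placeholder for the proprietary per-image analysis.

    from multiprocessing import Pool

    def process_frame(frame_path):
        """Hypothetical stand-in for the per-image wavefront analysis that
        4Sight performs on each captured interferogram."""
        return frame_path, "wavefront-map"

    def analyze_capture(frame_paths, workers=8):
        """Farm independent frames out to worker processes, then collate the
        results, mirroring the controller/worker architecture described above."""
        with Pool(workers) as pool:
            results = pool.map(process_frame, frame_paths)
        return dict(results)

    if __name__ == "__main__":
        print(analyze_capture(["frame_%03d.dat" % i for i in range(100)]))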
NASA Astrophysics Data System (ADS)
Ames, D. P.; Osorio-Murillo, C.; Over, M. W.; Rubin, Y.
2012-12-01
The Method of Anchored Distributions (MAD) is an inverse modeling technique that is well-suited for estimation of spatially varying parameter fields using limited observations and Bayesian methods. This presentation will discuss the design, development, and testing of a free software implementation of the MAD technique using the open source DotSpatial geographic information system (GIS) framework, R statistical software, and the MODFLOW groundwater model. This new tool, dubbed MAD-GIS, is built using a modular architecture that supports the integration of external analytical tools and models for key computational processes including a forward model (e.g. MODFLOW, HYDRUS) and geostatistical analysis (e.g. R, GSLIB). The GIS-based graphical user interface provides a relatively simple way for new users of the technique to prepare the spatial domain, to identify observation and anchor points, to perform the MAD analysis using a selected forward model, and to view results. MAD-GIS uses the Managed Extensibility Framework (MEF) provided by the Microsoft .NET programming platform to support integration of different modeling and analytical tools at run-time through a custom "driver." Each driver establishes a connection with external programs through a programming interface, which provides the elements for communicating with core MAD software. This presentation gives an example of adapting MODFLOW to serve as the external forward model in MAD-GIS for inferring the distribution functions of key MODFLOW parameters. Additional drivers for other models are being developed and it is expected that the open source nature of the project will engender the development of additional model drivers by third-party scientists.
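The driver concept described above can be sketched generically; the class and function names below are hypothetical Python stand-ins for MAD-GIS's actual .NET/MEF interfaces, shown only to illustrate how a forward model is adapted to a common interface and selected at run time.

    class ForwardModelDriver:
        """Common interface that every external forward model is adapted to."""
        def run(self, parameter_field):
            raise NotImplementedError

    class ModflowDriver(ForwardModelDriver):
        """Hypothetical adapter: write model inputs, launch the code, read results."""
        def run(self, parameter_field):
            # ... write input files, invoke the forward model, parse simulated heads ...
            return {"simulated_heads": parameter_field}

    DRIVERS = {"modflow": ModflowDriver()}

    def run_forward_model(name, parameter_field):
        """Dispatch to whichever registered driver the user selected."""
        return DRIVERS[name].run(parameter_field)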
Using UML Modeling to Facilitate Three-Tier Architecture Projects in Software Engineering Courses
ERIC Educational Resources Information Center
Mitra, Sandeep
2014-01-01
This article presents the use of a model-centric approach to facilitate software development projects conforming to the three-tier architecture in undergraduate software engineering courses. Many instructors intend that such projects create software applications for use by real-world customers. While it is important that the first version of these…
Rapid Development of Custom Software Architecture Design Environments
1999-08-01
This dissertation describes a new approach to capturing and using architectural design expertise in software architecture design environments. A language and tools are presented for capturing and encapsulating software architecture design expertise within a conceptual framework of architectural styles and design rules. The design expertise thus captured is supported with an incrementally configurable software architecture ...
Li, Peng; Tang, Youchao; Li, Jia; Shen, Longduo; Tian, Weidong; Tang, Wei
2013-09-01
The aim of this study is to describe the sequential software processing of computed tomography (CT) dataset for reconstructing the finite element analysis (FEA) mandibular model with custom-made plate, and to provide a theoretical basis for clinical usage of this reconstruction method. A CT scan was done on one patient who had mandibular continuity defects. This CT dataset in DICOM format was imported into Mimics 10.0 software in which a three-dimensional (3-D) model of the facial skeleton was reconstructed and the mandible was segmented out. With Geomagic Studio 11.0, one custom-made plate and nine virtual screws were designed. All parts of the reconstructed mandible were converted into NURBS and saved as IGES format for importing into pro/E 4.0. After Boolean operation and assembly, the model was switched to ANSYS Workbench 12.0. Finally, after applying the boundary conditions and material properties, an analysis was performed. As results, a 3-D FEA model was successfully developed using the softwares above. The stress-strain distribution precisely indicated biomechanical performance of the reconstructed mandible on the normal occlusion load, without stress concentrated areas. The Von-Mises stress in all parts of the model, from the maximum value of 50.9MPa to the minimum value of 0.1MPa, was lower than the ultimate tensile strength. In conclusion, the described strategy could speedily and successfully produce a biomechanical model of a reconstructed mandible with custom-made plate. Using this FEA foundation, the custom-made plate may be improved for an optimal clinical outcome. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gearhart, Jared Lee; Adair, Kristin Lynn; Durfee, Justin David.
When developing linear programming models, issues such as budget limitations, customer requirements, or licensing may preclude the use of commercial linear programming solvers. In such cases, one option is to use an open-source linear programming solver. A survey of linear programming tools was conducted to identify potential open-source solvers. From this survey, four open-source solvers were tested using a collection of linear programming test problems and the results were compared to IBM ILOG CPLEX Optimizer (CPLEX) [1], an industry standard. The solvers considered were: COIN-OR Linear Programming (CLP) [2], [3], GNU Linear Programming Kit (GLPK) [4], lp_solve [5] and Modular In-core Nonlinear Optimization System (MINOS) [6]. As no open-source solver outperforms CPLEX, this study demonstrates the power of commercial linear programming software. CLP was found to be the top performing open-source solver considered in terms of capability and speed. GLPK also performed well but cannot match the speed of CLP or CPLEX. lp_solve and MINOS were considerably slower and encountered issues when solving several test problems.
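For readers unfamiliar with the problem class being benchmarked, the sketch below poses and solves one small linear program; it uses SciPy's HiGHS backend purely as an illustration and is not one of the survey's test problems or solvers.

    from scipy.optimize import linprog

    # Minimize -x - 2y subject to x + y <= 4, x + 3y <= 6, x >= 0, y >= 0.
    result = linprog(c=[-1, -2],
                     A_ub=[[1, 1], [1, 3]],
                     b_ub=[4, 6],
                     bounds=[(0, None), (0, None)],
                     method="highs")
    print(result.x, result.fun)   # optimal point (3, 1), objective value -5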
31 CFR 1024.220 - Customer identification programs for mutual funds.
Code of Federal Regulations, 2011 CFR
2011-07-01
§ 1024.220 Customer identification programs for mutual funds. (a) Customer identification program: minimum requirements—(1) In general. A mutual fund must implement a written Customer...
CamBAfx: Workflow Design, Implementation and Application for Neuroimaging
Ooi, Cinly; Bullmore, Edward T.; Wink, Alle-Meije; Sendur, Levent; Barnes, Anna; Achard, Sophie; Aspden, John; Abbott, Sanja; Yue, Shigang; Kitzbichler, Manfred; Meunier, David; Maxim, Voichita; Salvador, Raymond; Henty, Julian; Tait, Roger; Subramaniam, Naresh; Suckling, John
2009-01-01
CamBAfx is a workflow application designed for both researchers who use workflows to process data (consumers) and those who design them (designers). It provides a front-end (user interface) optimized for data processing designed in a way familiar to consumers. The back-end uses a pipeline model to represent workflows since this is a common and useful metaphor used by designers and is easy to manipulate compared to other representations like programming scripts. As an Eclipse Rich Client Platform application, CamBAfx's pipelines and functions can be bundled with the software or downloaded post-installation. The user interface contains all the workflow facilities expected by consumers. Using the Eclipse Extension Mechanism designers are encouraged to customize CamBAfx for their own pipelines. CamBAfx wraps a workflow facility around neuroinformatics software without modification. CamBAfx's design, licensing and Eclipse Branding Mechanism allow it to be used as the user interface for other software, facilitating exchange of innovative computational tools between originating labs. PMID:19826470
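The pipeline metaphor that CamBAfx presents to designers reduces to ordered composition of processing stages; the sketch below is a generic illustration with hypothetical stage names, not CamBAfx's Eclipse-based wrappers.

    def run_pipeline(stages, data):
        """Apply each processing stage in order, feeding its output to the next."""
        for stage in stages:
            data = stage(data)
        return data

    # Hypothetical neuroimaging stages standing in for wrapped external tools.
    pipeline = [lambda d: d + ["motion-corrected"],
                lambda d: d + ["smoothed"],
                lambda d: d + ["statistics"]]
    print(run_pipeline(pipeline, ["raw-scan"]))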
High pressure single-crystal micro X-ray diffraction analysis with GSE_ADA/RSV software
NASA Astrophysics Data System (ADS)
Dera, Przemyslaw; Zhuravlev, Kirill; Prakapenka, Vitali; Rivers, Mark L.; Finkelstein, Gregory J.; Grubor-Urosevic, Ognjen; Tschauner, Oliver; Clark, Simon M.; Downs, Robert T.
2013-08-01
GSE_ADA/RSV is a free software package for custom analysis of single-crystal micro X-ray diffraction (SCμXRD) data, developed with particular emphasis on data from samples enclosed in diamond anvil cells and subject to high pressure conditions. The package has been in extensive use at the high pressure beamlines of Advanced Photon Source (APS), Argonne National Laboratory and Advanced Light Source (ALS), Lawrence Berkeley National Laboratory. The software is optimized for processing of wide-rotation images and includes a variety of peak intensity corrections and peak filtering features, which are custom-designed to make processing of high pressure SCμXRD easier and more reliable.
Laying the cornerstone: an employee-driven customer service program.
Davis, Stephen M; Chinnis, Ann S; Dunmire, J Erin
2006-01-01
In the 21st-century healthcare environment, customer service remains critical to the fiscal viability of healthcare organizations. Continued competition for patients and diminishing reimbursements have necessitated the establishment of customer service programs to attract patients and retain outstanding employees. These programs should increase quality experiences for both internal customers (employees) and external customers (patients). This article describes a unique employee-driven customer service initiative titled Serving Together Achieving Results. Obstacles to implementing a customer service program in a multifaceted academic setting are highlighted, and the use of a novel tool, Q technique, to prioritize employee feedback is discussed.
Software-based evaluation of toric IOL orientation in a multicenter clinical study.
Kasthurirangan, Sanjeev; Feuchter, Lucas; Smith, Pamela; Nixon, Donald
2014-12-01
To evaluate the rotational stability of a new one-piece hydrophobic acrylic toric intraocular lens (IOL) using a custom-developed software for analysis of slit-lamp photographs. In a prospective, multicenter study, 174 eyes were implanted with the TECNIS Toric IOL (Abbott Medical Optics, Inc., Santa Ana, CA). A custom-developed software was used to analyze high-resolution slit-lamp photographs of 156 eyes taken at day 1 (baseline) and 1, 3, and 6 months postoperatively. The software uses iris and sclera landmarks to align the baseline image and later images for comparison. Validation of software was performed through repeated analyses of protractor images rotated from 0.1° to 10.0° and randomly selected photographs of 20 eyes. Software validation showed precision (repeatability plus reproducibility variation) of 0.02° using protractor images and 2.22° using slit-lamp photographs. Good quality slit-lamp images and clear landmarks were necessary for precise measurements. At 6 months, 94.2% of eyes had 5° or less change in IOL orientation versus baseline; only 2 eyes (1.4%) had axis shift greater than 30°. Most eyes were within 5° or less of rotation between 1 and 3 months (92.9%) and 3 and 6 months (94.1%). Mean absolute axis change (± standard deviation) from 1 day to 6 months was 2.70° ± 5.51°. The new custom software was precise and quick in analyzing slit-lamp photographs to determine postoperative toric IOL rotation. Copyright 2014, SLACK Incorporated.
Aslanidou, Katerina; Kau, Chung How; Vlachos, Christos; Saleh, Tayem Abou
2017-01-01
The aim of this case report was to present the procedure of fabricating a customized occlusal splint, through a revolutionary software that combines cone beam computed tomography (CBCT) with jaw motion tracking (JMT) data and superimposes a digital impression. The case report was conducted on a 46-year-old female patient diagnosed with a temporomandibular disorder. A CBCT scan and an optical impression were obtained. The range of the patient's mandibular movements was captured with a JMT device. The data were combined in the SICAT software (SICAT, Sirona, Bonn, Germany). The software enabled the visualization of patient-specific mandibular movements and provided a real dynamic anatomical evaluation of the condylar position in the glenoid fossa. After the assessment of the range of movements during opening, protrusion, and lateral movements, all the data were sent to SICAT and a customized occlusal splint was manufactured. The SICAT software provides a three-dimensional real-dynamic simulation of mandibular movements relative to the patient-specific anatomy of the jaw; thus, it opens new possibilities and potentials for the management of temporomandibular disorders.
For operation of the Computer Software Management and Information Center (COSMIC)
NASA Technical Reports Server (NTRS)
Carmon, J. L.
1983-01-01
Progress report on current status of computer software management and information center (COSMIC) includes the following areas: inventory, evaluation and publication, marketing, customer service, maintenance and support, and budget summary.
Key ingredients needed when building large data processing systems for scientists
NASA Technical Reports Server (NTRS)
Miller, K. C.
2002-01-01
Why is building a large science software system so painful? Weren't teams of software engineers supposed to make life easier for scientists? Does it sometimes feel as if it would be easier to write the million lines of code in Fortran 77 yourself? The cause of this dissatisfaction is that many of the needs of the science customer remain hidden in discussions with software engineers until after a system has already been built. In fact, many of the hidden needs of the science customer conflict with stated needs and are therefore very difficult to meet unless they are addressed from the outset in a system's architectural requirements. What's missing is the consideration of a small set of key software properties in initial agreements about the requirements, the design and the cost of the system.
Viceconti, M; Testi, D; Gori, R; Zannoni, C
2000-01-01
The present work describes a technology transfer project called HIPCOM devoted to the re-engineering of the process used by a medical devices manufacturer to design custom-made hip prostheses. Although it started with insufficient support from the end-user management, a very tight schedule and a moderate budget, the project developed into what is considered by all partners a success story. In particular, the development of the design software, called HIPCOM Interactive Design Environment (HIDE), was completed in a time shorter than any optimistic expectation. The software was quite stable since its first beta version, and once introduced at the user site it fully replaced the original procedure in less than two months. One year after the early adoption, more than 80 custom-made prostheses had been designed with HIDE and the user had reported only two bugs, both cosmetic. The scope of the present work was to report the development experience and to investigate the reasons for these positive results, with particular reference to the development procedure and the software architecture. The choice of TCL/TK as the development language and the adoption of a well-defined software architecture were found to be the key success factors. Other important determinants were found to be the adoption of an incremental software engineering strategy, well suited for small to medium projects, and the presence of a technology transfer expert on the development staff.
ERIC Educational Resources Information Center
Islam, Kaliym A.
2017-01-01
The problem addressed in this study was that customer education programs are intended to strengthen customer loyalty; however, research on the effects of customer education on customer loyalty remains insufficient. This phenomenological study investigated how the lived experiences of customers' participating in financial services' customer…
Observing System Simulation Experiment (OSSE) for the HyspIRI Spectrometer Mission
NASA Technical Reports Server (NTRS)
Turmon, Michael J.; Block, Gary L.; Green, Robert O.; Hua, Hook; Jacob, Joseph C.; Sobel, Harold R.; Springer, Paul L.; Zhang, Qingyuan
2010-01-01
The OSSE software provides an integrated end-to-end environment to simulate an Earth observing system by iteratively running a distributed modeling workflow based on the HyspIRI Mission, including atmospheric radiative transfer, surface albedo effects, detection, and retrieval for agile exploration of the mission design space. The software enables an Observing System Simulation Experiment (OSSE) and can be used for design trade space exploration of science return for proposed instruments by modeling the whole ground truth, sensing, and retrieval chain and to assess retrieval accuracy for a particular instrument and algorithm design. The OSSE infrastructure is extensible to future National Research Council (NRC) Decadal Survey concept missions where integrated modeling can improve the fidelity of coupled science and engineering analyses for systematic analysis and science return studies. This software has a distributed architecture that gives it a distinct advantage over other similar efforts. The workflow modeling components are typically legacy computer programs implemented in a variety of programming languages, including MATLAB, Excel, and FORTRAN. Integration of these diverse components is difficult and time-consuming. In order to hide this complexity, each modeling component is wrapped as a Web Service, and each component is able to pass analysis parameterizations, such as reflectance or radiance spectra, on to the next component downstream in the service workflow chain. In this way, the interface to each modeling component becomes uniform and the entire end-to-end workflow can be run using any existing or custom workflow processing engine. The architecture lets users extend workflows as new modeling components become available, chain together the components using any existing or custom workflow processing engine, and distribute them across any Internet-accessible Web Service endpoints. The workflow components can be hosted on any Internet-accessible machine. This has the advantages that the computations can be distributed to make best use of the available computing resources, and each workflow component can be hosted and maintained by their respective domain experts.
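A minimal sketch of the service chaining described above, assuming three hypothetical HTTP endpoints and JSON payloads; the real OSSE component services, their URLs, and their payload schemas are not reproduced here.

    import requests

    STAGES = [
        "http://rt-model.example/run",       # radiative transfer (hypothetical URL)
        "http://sensor-model.example/run",   # instrument/detection model (hypothetical URL)
        "http://retrieval.example/run",      # retrieval algorithm (hypothetical URL)
    ]

    def run_workflow(scene):
        """Pass one parameterization through each wrapped component in turn,
        feeding each stage's output to the next stage downstream."""
        payload = scene
        for url in STAGES:
            response = requests.post(url, json=payload, timeout=600)
            response.raise_for_status()
            payload = response.json()
        return payload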
NASA Technical Reports Server (NTRS)
Feather, M. S.
2001-01-01
Risk assessment and mitigation is the focus of the Defect Detection and Prevention (DDP) process, which has been applied to spacecraft technology assessments and planning, both hardware and software. DDP's major elements and their relevance to core requirement engineering concerns are summarized. The accompanying research demonstration illustrates DDP's tool support, and further customizations for application to software.
Evaluating a Service-Oriented Architecture
2007-09-01
Software as a service (SaaS) is a software delivery model where customers don’t own a copy of the application... By Phil Bianco (Software Engineering Institute), Rick Kotermanski (Summa Technologies), and Paulo Merson.
COSMIC monthly progress report
NASA Technical Reports Server (NTRS)
1993-01-01
Activities of the Computer Software Management and Information Center (COSMIC) are summarized for the month of August, 1993. Tables showing the current inventory of programs available from COSMIC are presented and program processing and evaluation activities are discussed. Ten articles were prepared for publication in the NASA Tech Brief Journal. These articles (included in this report) describe the following software items: (1) MOM3D - A Method of Moments Code for Electromagnetic Scattering (UNIX Version); (2) EM-Animate - Computer Program for Displaying and Animating the Steady-State Time-Harmonic Electromagnetic Near Field and Surface-Current Solutions; (3) MOM3D - A Method of Moments Code for Electromagnetic Scattering (IBM PC Version); (4) M414 - MIL-STD-414 Variable Sampling Procedures Computer Program; (5) MEDOF - Minimum Euclidean Distance Optimal Filter; (6) CLIPS 6.0 - C Language Integrated Production System, Version 6.0 (Macintosh Version); (7) CLIPS 6.0 - C Language Integrated Production System, Version 6.0 (IBM PC Version); (8) CLIPS 6.0 - C Language Integrated Production System, Version 6.0 (UNIX Version); (9) CLIPS 6.0 - C Language Integrated Production System, Version 6.0 (DEC VAX VMS Version); and (10) TFSSRA - Thick Frequency Selective Surface with Rectangular Apertures. Activities in the areas of marketing, customer service, benefits identification, maintenance and support, and dissemination are also described along with a budget summary.
NASA Technical Reports Server (NTRS)
Roth, Don J.; Rapchun, David A.; Jones, Hollis H.
2001-01-01
The Cloud Absorption Radiometer (CAR) instrument has been the most frequently used airborne instrument built in-house at NASA Goddard Space Flight Center, having flown scientific research missions on-board various aircraft to many locations in the United States, Azores, Brazil, and Kuwait since 1983. The CAR instrument is capable of measuring light scattered by clouds in fourteen spectral bands in the UV, visible and near-infrared region. This document describes the control, data acquisition, display, and file storage software for the new version of CAR. This software completely replaces the prior CAR Data System and Control Panel with a compact and robust virtual instrument computer interface. Additionally, the instrument is now usable for the first time for taking data in an off-aircraft mode. The new instrument is controlled via a LabVIEW v5.1.1-developed software interface that utilizes (1) serial port writes to write commands to the controller module of the instrument, and (2) serial port reads to acquire data from the controller module of the instrument. Step-by-step operational procedures are provided in this document. A suite of other software programs has been developed to complement the actual CAR virtual instrument. These programs include: (1) a simulator mode that allows pretesting of new features that might be added in the future, as well as demonstrations to CAR customers, and development at times when the instrument/hardware is off-location, and (2) a post-experiment data viewer that can be used to view all segments of individual data cycles and to locate positions where 'start' and 'stop' byte sequences were incorrectly formulated by the instrument controller. The CAR software described here is expected to be the basis for CAR operation for many missions and many years to come.
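The serial write/read pattern described above can be sketched with pyserial; the port name, baud rate, and command string below are hypothetical, since the actual CAR command set and framing bytes are defined by the instrument controller.

    import serial  # pyserial

    def query_instrument(port="COM1", command=b"STATUS\r\n", reply_len=64):
        """Write one command to the controller module and read back a raw reply."""
        with serial.Serial(port, baudrate=9600, timeout=2) as link:
            link.write(command)
            return link.read(reply_len)

    print(query_instrument())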
FastScript3D - A Companion to Java 3D
NASA Technical Reports Server (NTRS)
Koenig, Patti
2005-01-01
FastScript3D is a computer program, written in the Java 3D(TM) programming language, that establishes an alternative language that helps users who lack expertise in Java 3D to use Java 3D for constructing three-dimensional (3D)-appearing graphics. The FastScript3D language provides a set of simple, intuitive, one-line text-string commands for creating, controlling, and animating 3D models. The first word in a string is the name of a command; the rest of the string contains the data arguments for the command. The commands can also be used as an aid to learning Java 3D. Developers can extend the language by adding custom text-string commands. The commands can define new 3D objects or load representations of 3D objects from files in formats compatible with such other software systems as X3D. The text strings can be easily integrated into other languages. FastScript3D facilitates communication between scripting languages [which enable programming of hyper-text markup language (HTML) documents to interact with users] and Java 3D. The FastScript3D language can be extended and customized on both the scripting side and the Java 3D side.
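The one-line command convention is simple to illustrate: the first word names the command and the rest of the string carries its data arguments. The sketch below is a generic parser with hypothetical command strings, not the actual FastScript3D vocabulary.

    def parse_command(line):
        """Split a one-line command into its name and its data arguments."""
        name, _, rest = line.strip().partition(" ")
        return name, rest.split()

    for cmd in ["create sphere1 0 0 0 1.5", "rotate sphere1 y 45"]:
        print(parse_command(cmd))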
Organizing Diverse, Distributed Project Information
NASA Technical Reports Server (NTRS)
Keller, Richard M.
2003-01-01
SemanticOrganizer is a software application designed to organize and integrate information generated within a distributed organization or as part of a project that involves multiple, geographically dispersed collaborators. SemanticOrganizer incorporates the capabilities of database storage, document sharing, hypermedia navigation, and semantic-interlinking into a system that can be customized to satisfy the specific information-management needs of different user communities. The program provides a centralized repository of information that is both secure and accessible to project collaborators via the World Wide Web. SemanticOrganizer's repository can be used to collect diverse information (including forms, documents, notes, data, spreadsheets, images, and sounds) from computers at collaborators' work sites. The program organizes the information using a unique network-structured conceptual framework, wherein each node represents a data record that contains not only the original information but also metadata (in effect, standardized data that characterize the information). Links among nodes express semantic relationships among the data records. The program features a Web interface through which users enter, interlink, and/or search for information in the repository. By use of this repository, the collaborators have immediate access to the most recent project information, as well as to archived information. A key advantage of SemanticOrganizer is its ability to interlink information together in a natural fashion using customized terminology and concepts that are familiar to a user community.
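The network-structured repository idea can be sketched as nodes that hold content plus metadata and named links that express semantic relationships; the record names and the relationship used below are hypothetical, not SemanticOrganizer's actual schema.

    from dataclasses import dataclass, field

    @dataclass
    class Node:
        """One repository record: original content plus standardized metadata."""
        name: str
        content: str
        metadata: dict = field(default_factory=dict)
        links: dict = field(default_factory=dict)   # relationship name -> list of Nodes

    def link(source, relationship, target):
        """Record a named semantic relationship from one node to another."""
        source.links.setdefault(relationship, []).append(target)

    sample = Node("sample-42", "field spectrum", {"site": "collaborator A"})
    report = Node("report-7", "weekly summary")
    link(report, "describes", sample)
    print(report.links["describes"][0].name)   # sample-42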
Shams, Assadollah; Yarmohammadian, Mohammad Hosein; Abbarik, Hadi Hayati
2012-01-01
Today, the challenges of quality improvement and customer focus as well as systems development are important and inevitable matters in higher education institutes. There are some highly competitive challenges among educational institutes, including accountability to social needs, increasing costs of education, diversity in educational methods and centers and their consequent increasing competition, and the need for adaptation of new information and knowledge to focus on students as the main customers. Hence, the purpose of this study was to determine the rate of customer focus based on Isfahan University of Medical Sciences students' viewpoints and to suggest solutions to improve this rate. This was a cross-sectional study carried out in 2011. The statistical population included all the students of seven faculties of Isfahan University of Medical Sciences. According to statistical formulae, the sample size consisted of 384 subjects. Data collection tools included a researcher-made questionnaire whose reliability was found to be 87% by Cronbach's alpha coefficient. Finally, using the SPSS statistical software and statistical methods of independent t-test and one-way analysis of variance (ANOVA), Likert scale based data were analyzed. The mean of overall score for customer focus (student-centered) of Isfahan University of Medical Sciences was 46.54. Finally, there was a relation between the mean of overall score for customer focus and gender, educational levels, and students' faculties. The researchers suggest further investigation comparing this medical university with others. There is a difference between medical sciences universities and others regarding the customer focus area, and students' gender must be considered as an effective factor in the quality of healthcare services provided. In order to improve the customer focus, it is essential to take facilities, field of study, faculties, and syllabus into consideration.
Shams, Assadollah; Yarmohammadian, Mohammad Hosein; Abbarik, Hadi Hayati
2012-01-01
Background: Today, the challenges of quality improvement and customer focus as well as systems development are important and inevitable matters in higher education institutes. There are some highly competitive challenges among educational institutes, including accountability to social needs, increasing costs of education, diversity in educational methods and centers and their consequent increasing competition, and the need for adaptation of new information and knowledge to focus on students as the main customers. Hence, the purpose of this study was to determine the rate of customer focus based on Isfahan University of Medical Sciences students’ viewpoints and to suggest solutions to improve this rate. Materials and Methods: This was a cross-sectional study carried out in 2011. The statistical population included all the students of seven faculties of Isfahan University of Medical Sciences. According to statistical formulae, the sample size consisted of 384 subjects. Data collection tools included a researcher-made questionnaire whose reliability was found to be 87% by Cronbach's alpha coefficient. Finally, using the SPSS statistical software and statistical methods of independent t-test and one-way analysis of variance (ANOVA), Likert scale based data were analyzed. Results: The mean of overall score for customer focus (student-centered) of Isfahan University of Medical Sciences was 46.54. Finally, there was a relation between the mean of overall score for customer focus and gender, educational levels, and students’ faculties. The researchers suggest further investigation comparing this medical university with others. Conclusion: There is a difference between medical sciences universities and others regarding the customer focus area, and students’ gender must be considered as an effective factor in the quality of healthcare services provided. In order to improve the customer focus, it is essential to take facilities, field of study, faculties, and syllabus into consideration. PMID:23555127
A Mechanism of Modeling and Verification for SaaS Customization Based on TLA
NASA Astrophysics Data System (ADS)
Luan, Shuai; Shi, Yuliang; Wang, Haiyang
With the gradual maturing of SOA and the rapid development of the Internet, SaaS has become a popular software service mode. SaaS customization actions are usually subject to internal and external dependency relationships. This paper first introduces a method for modeling the customization process based on the Temporal Logic of Actions, and then proposes a verification algorithm to assure that each step in customization will not cause unpredictable effects on the system and will follow the related rules defined by the SaaS provider.
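For reference, a TLA model of the kind the abstract describes conventionally takes the canonical specification form below, written here in LaTeX; Init, Next, and vars are placeholders for the provider-defined initial predicate, the customization actions, and the tuple of state variables, and the dependency rules themselves are not shown.

    Spec \;\triangleq\; Init \;\wedge\; \Box [\mathit{Next}]_{\mathit{vars}}

The verification algorithm sketched in the abstract would then check each customization step against such a specification and the provider-defined rules.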
Issues and Methods for Assessing COTS Reliability, Maintainability, and Availability
NASA Technical Reports Server (NTRS)
Schneidewind, Norman F.; Nikora, Allen P.
1998-01-01
Many vendors produce products that are not domain specific (e.g., network server) and have limited functionality (e.g., mobile phone). In contrast, many customers of COTS develop systems that are domain specific (e.g., target tracking system) and have great variability in functionality (e.g., corporate information system). This discussion takes the viewpoint of how the customer can ensure the quality of COTS components. In evaluating the benefits and costs of using COTS, we must consider the environment in which COTS will operate. Thus we must distinguish between using a non-mission critical application like a spreadsheet program to produce a budget and a mission critical application like military strategic and tactical operations. Whereas customers will tolerate an occasional bug in the former, zero tolerance is the rule in the latter. We emphasize the latter because this is the arena where there are major unresolved problems in the application of COTS. Furthermore, COTS components may be embedded in the larger customer system. We refer to these as embedded systems. These components must be reliable, maintainable, and available, and must be compatible with the larger system in order for the customer to benefit from the advertised advantages of lower development and maintenance costs. Interestingly, when the claims of COTS advantages are closely examined, one finds that to a great extent these COTS components consist of hardware and office products, not mission critical software [1]. Obviously, COTS components are different from custom components with respect to one or more of the following attributes: source, development paradigm, safety, reliability, maintainability, availability, security, and other attributes. However, the important question is whether they should be treated differently when deciding to deploy them for operational use; we suggest the answer is no. We use reliability as an example to justify our answer. In order to demonstrate its reliability, a COTS component must pass the same reliability evaluations as the custom components, otherwise the COTS components will be the weakest link in the chain of components and will be the determinant of software system reliability. The challenge is that there will be less information available for evaluating COTS components than for custom components, but this does not mean we should despair and do nothing. Actually, there is a lot we can do even in the absence of documentation on COTS components because the customer will have information about how COTS components are to be used in the larger system. To illustrate our approach, we will consider the reliability, maintainability, and availability (RMA) of COTS components as used in larger systems. Finally, COTS suppliers might consider increasing visibility into their products to assist customers in determining the components' fitness for use in a particular application. We offer ideas of information that would be useful to customers, and what vendors might do to provide it.
MODTRAN6: a major upgrade of the MODTRAN radiative transfer code
NASA Astrophysics Data System (ADS)
Berk, Alexander; Conforti, Patrick; Kennett, Rosemary; Perkins, Timothy; Hawes, Frederick; van den Bosch, Jeannette
2014-06-01
The MODTRAN6 radiative transfer (RT) code is a major advancement over earlier versions of the MODTRAN atmospheric transmittance and radiance model. This version of the code incorporates modern software architecture including an application programming interface, enhanced physics features including a line-by-line algorithm, a supplementary physics toolkit, and new documentation. The application programming interface has been developed for ease of integration into user applications. The MODTRAN code has been restructured towards a modular, object-oriented architecture to simplify upgrades as well as facilitate integration with other developers' codes. MODTRAN now includes a line-by-line algorithm for high resolution RT calculations as well as coupling to optical scattering codes for easy implementation of custom aerosols and clouds.
2013-01-01
Background Immunoassays that employ multiplexed bead arrays produce high information content per sample. Such assays are now frequently used to evaluate humoral responses in clinical trials. Integrated software is needed for the analysis, quality control, and secure sharing of the high volume of data produced by such multiplexed assays. Software that facilitates data exchange and provides flexibility to perform customized analyses (including multiple curve fits and visualizations of assay performance over time) could increase scientists’ capacity to use these immunoassays to evaluate human clinical trials. Results The HIV Vaccine Trials Network and the Statistical Center for HIV/AIDS Research and Prevention collaborated with LabKey Software to enhance the open source LabKey Server platform to facilitate workflows for multiplexed bead assays. This system now supports the management, analysis, quality control, and secure sharing of data from multiplexed immunoassays that leverage Luminex xMAP® technology. These assays may be custom or kit-based. Newly added features enable labs to: (i) import run data from spreadsheets output by Bio-Plex Manager™ software; (ii) customize data processing, curve fits, and algorithms through scripts written in common languages, such as R; (iii) select script-defined calculation options through a graphical user interface; (iv) collect custom metadata for each titration, analyte, run and batch of runs; (v) calculate dose–response curves for titrations; (vi) interpolate unknown concentrations from curves for titrated standards; (vii) flag run data for exclusion from analysis; (viii) track quality control metrics across runs using Levey-Jennings plots; and (ix) automatically flag outliers based on expected values. Existing system features allow researchers to analyze, integrate, visualize, export and securely share their data, as well as to construct custom user interfaces and workflows. Conclusions Unlike other tools tailored for Luminex immunoassays, LabKey Server allows labs to customize their Luminex analyses using scripting while still presenting users with a single, graphical interface for processing and analyzing data. The LabKey Server system also stands out among Luminex tools for enabling smooth, secure transfer of data, quality control information, and analyses between collaborators. LabKey Server and its Luminex features are freely available as open source software at http://www.labkey.com under the Apache 2.0 license. PMID:23631706
Eckels, Josh; Nathe, Cory; Nelson, Elizabeth K; Shoemaker, Sara G; Nostrand, Elizabeth Van; Yates, Nicole L; Ashley, Vicki C; Harris, Linda J; Bollenbeck, Mark; Fong, Youyi; Tomaras, Georgia D; Piehler, Britt
2013-04-30
Immunoassays that employ multiplexed bead arrays produce high information content per sample. Such assays are now frequently used to evaluate humoral responses in clinical trials. Integrated software is needed for the analysis, quality control, and secure sharing of the high volume of data produced by such multiplexed assays. Software that facilitates data exchange and provides flexibility to perform customized analyses (including multiple curve fits and visualizations of assay performance over time) could increase scientists' capacity to use these immunoassays to evaluate human clinical trials. The HIV Vaccine Trials Network and the Statistical Center for HIV/AIDS Research and Prevention collaborated with LabKey Software to enhance the open source LabKey Server platform to facilitate workflows for multiplexed bead assays. This system now supports the management, analysis, quality control, and secure sharing of data from multiplexed immunoassays that leverage Luminex xMAP® technology. These assays may be custom or kit-based. Newly added features enable labs to: (i) import run data from spreadsheets output by Bio-Plex Manager™ software; (ii) customize data processing, curve fits, and algorithms through scripts written in common languages, such as R; (iii) select script-defined calculation options through a graphical user interface; (iv) collect custom metadata for each titration, analyte, run and batch of runs; (v) calculate dose-response curves for titrations; (vi) interpolate unknown concentrations from curves for titrated standards; (vii) flag run data for exclusion from analysis; (viii) track quality control metrics across runs using Levey-Jennings plots; and (ix) automatically flag outliers based on expected values. Existing system features allow researchers to analyze, integrate, visualize, export and securely share their data, as well as to construct custom user interfaces and workflows. Unlike other tools tailored for Luminex immunoassays, LabKey Server allows labs to customize their Luminex analyses using scripting while still presenting users with a single, graphical interface for processing and analyzing data. The LabKey Server system also stands out among Luminex tools for enabling smooth, secure transfer of data, quality control information, and analyses between collaborators. LabKey Server and its Luminex features are freely available as open source software at http://www.labkey.com under the Apache 2.0 license.
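The script-based customization both records describe can be illustrated with a minimal curve-fit sketch. This is not LabKey Server code; it only shows the kind of four-parameter logistic (4PL) fit and interpolation of unknowns that such a system typically delegates to R or Python scripts, with made-up standard-curve data:

```python
# Minimal sketch (not LabKey Server code): fit a four-parameter logistic
# dose-response curve to a titrated standard and interpolate an unknown.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    # a = lower asymptote, d = upper asymptote, c = EC50, b = slope
    return d + (a - d) / (1.0 + (x / c) ** b)

conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])        # standard dilutions (made up)
mfi = np.array([12.0, 35.0, 110.0, 300.0, 520.0, 610.0])  # measured MFI (made up)

params, _ = curve_fit(four_pl, conc, mfi, p0=[10, 1, 3, 650], maxfev=10000)
a, b, c, d = params

# Invert the fitted curve to interpolate the concentration of an unknown sample.
unknown_mfi = 250.0
est_conc = c * ((a - d) / (unknown_mfi - d) - 1.0) ** (1.0 / b)
print(f"estimated concentration: {est_conc:.2f}")
```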
Application of computer graphics in the design of custom orthopedic implants.
Bechtold, J E
1986-10-01
Implementation of newly developed computer modelling techniques and computer graphics displays and software has greatly aided the orthopedic design engineer and physician in creating a custom implant with good anatomic conformity in a short turnaround time. Further advances in computerized design and manufacturing will continue to simplify the development of custom prostheses and enlarge their niche in the joint replacement market.
Surviving Nuclear Winter Towards a Service-Led Business
NASA Astrophysics Data System (ADS)
Rocha, Michael; Chou, Timothy
During the tech-led recession in 2001, a little-known transformation occurred at the world's largest business software company. This transformation was led by a realization that existing customers of mature software need service for the products they have purchased more than they need to purchase new products. Organizing around the installed base of customers defined both new organizations and new technology to power the specialists. This paper gives a glimpse of the Oracle transformation and lays out some fundamental tenets for anyone interested in a service-led business.
NASA Astrophysics Data System (ADS)
Li, Yajie; Zhao, Yongli; Zhang, Jie; Yu, Xiaosong; Chen, Haoran; Zhu, Ruijie; Zhou, Quanwei; Yu, Chenbei; Cui, Rui
2017-01-01
A Virtual Network Operator (VNO) is a provider and reseller of network services from other telecommunications suppliers. These network providers are categorized as virtual because they do not own the underlying telecommunication infrastructure. In terms of business operation, a VNO can provide customers with personalized services by leasing network infrastructure from traditional network providers. The unique business modes of VNOs lead to the emergence of network on demand (NoD) services. Conventional network provisioning involves a series of manual operations and configurations, which is costly in time. Considering the advantages of Software Defined Networking (SDN), this paper proposes a novel NoD service provisioning solution to satisfy the private network needs of VNOs. The solution is first verified in a real software-defined multi-domain optical network with multi-vendor OTN equipment. With the proposed solution, NoD services can be deployed via online web portals in near-real time. This reinvents the customer experience and redefines how network services are delivered to customers via an online self-service portal. Ultimately, it means a customer will be able to simply go online, click a few buttons, and have new services almost instantaneously.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-14
... for Unauthorized Access to Customer Information and Customer Notice AGENCY: Office of Thrift...: Interagency Guidance on Response Programs for Unauthorized Access to Customer Information and Customer Notice... physical safeguards to: (1) Ensure the security and confidentiality of customer records and information; (2...
Pragmatic quality metrics for evolutionary software development models
NASA Technical Reports Server (NTRS)
Royce, Walker
1990-01-01
Due to the large number of product, project, and people parameters which impact large custom software development efforts, measurement of software product quality is a complex undertaking. Furthermore, the absolute perspective from which quality is measured (customer satisfaction) is intangible. While we probably can't say what the absolute quality of a software product is, we can determine the relative quality, the adequacy of this quality with respect to pragmatic considerations, and identify good and bad trends during development. While no two software engineers will ever agree on an optimum definition of software quality, they will agree that the most important perspective of software quality is its ease of change. We can call this flexibility, adaptability, or some other vague term, but the critical characteristic of software is that it is soft. The easier the product is to modify, the easier it is to achieve any other software quality perspective. This paper presents objective quality metrics derived from consistent lifecycle perspectives of rework which, when used in concert with an evolutionary development approach, can provide useful insight to produce better quality per unit cost/schedule or to achieve adequate quality more efficiently. The usefulness of these metrics is evaluated by applying them to a large, real world, Ada project.
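As a toy illustration of a rework-oriented metric of the general kind described (not Royce's actual definitions), one can track the fraction of each build's effort spent reworking existing code; a falling trend suggests the product is becoming easier to change. The build data below are invented:

```python
# Toy illustration (not Royce's metric definitions): track the ratio of
# rework effort to total effort across incremental builds; a falling trend
# suggests the design is becoming easier to change.

builds = [
    {"build": 1, "total_hours": 400.0, "rework_hours": 60.0},
    {"build": 2, "total_hours": 380.0, "rework_hours": 95.0},
    {"build": 3, "total_hours": 420.0, "rework_hours": 70.0},
]

for b in builds:
    ratio = b["rework_hours"] / b["total_hours"]
    print(f"build {b['build']}: rework ratio = {ratio:.2%}")
```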
17 CFR 270.0-11 - Customer identification programs.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 17 Commodity and Securities Exchanges 3 2010-04-01 2010-04-01 false Customer identification... (CONTINUED) RULES AND REGULATIONS, INVESTMENT COMPANY ACT OF 1940 § 270.0-11 Customer identification programs... implementing regulation at 31 CFR 103.131, which requires a customer identification program to be implemented...
Soil Carbon Data: long tail recovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
2017-07-25
The software is intended to be part of an open source effort regarding soils data. The software provides customized data ingestion scripts for soil carbon related data sets and scripts for output databases that conform to common templates.
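A tiny sketch of the kind of ingestion step described, reading a lab-specific soil carbon CSV and mapping its columns onto a common output template; the column names and mapping are invented, not taken from the released scripts:

```python
# Tiny ingestion sketch (column names invented, not the released scripts):
# map a lab-specific soil carbon CSV onto a common output template.
import csv

COLUMN_MAP = {
    "SiteID": "site_id",
    "Depth_cm": "depth_cm",
    "OC_pct": "organic_carbon_percent",
}

def ingest(in_path, out_path):
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=list(COLUMN_MAP.values()))
        writer.writeheader()
        for row in reader:
            writer.writerow({new: row[old] for old, new in COLUMN_MAP.items()})
```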
Insurance coverage of customers induces dishonesty of sellers in markets for credence goods.
Kerschbamer, Rudolf; Neururer, Daniel; Sutter, Matthias
2016-07-05
Honesty is a fundamental pillar for cooperation in human societies and thus for their economic welfare. However, humans do not always act in an honest way. Here, we examine how insurance coverage affects the degree of honesty in credence goods markets. Such markets are plagued by strong incentives for fraudulent behavior of sellers, resulting in estimated annual costs of billions of dollars to customers and society as a whole. Prime examples of credence goods are all kinds of repair services, the provision of medical treatments, the sale of software programs, and the provision of taxi rides in unfamiliar cities. We examine in a natural field experiment how computer repair shops take advantage of customers' insurance for repair costs. In a control treatment, the average repair price is about EUR 70, whereas the repair bill increases by more than 80% when the service provider is informed that insurance will reimburse the bill. Our design allows decomposing the sources of this economically impressive difference, showing that it is mainly due to the overprovision of parts and overcharging of working time. A survey among repair shops shows that the higher bills are mainly ascribed to insured customers being less likely to be concerned about minimizing costs because a third party (the insurer) pays the bill. Overall, our results strongly suggest that insurance coverage greatly increases the extent of dishonesty in important sectors of the economy, with potentially huge costs to customers and whole economies.
Arc_Mat: a Matlab-based spatial data analysis toolbox
NASA Astrophysics Data System (ADS)
Liu, Xingjian; Lesage, James
2010-03-01
This article presents an overview of Arc_Mat, a Matlab-based spatial data analysis software package whose source code has been placed in the public domain. An earlier version of the Arc_Mat toolbox was developed to extract map polygon and database information from ESRI shapefiles and provide high quality mapping in the Matlab software environment. We discuss revisions to the toolbox that: utilize enhanced computing and graphing capabilities of more recent versions of Matlab, restructure the toolbox with object-oriented programming features, and provide more comprehensive functions for spatial data analysis. The Arc_Mat toolbox functionality includes basic choropleth mapping; exploratory spatial data analysis that provides exploratory views of spatial data through various graphs, for example, histogram, Moran scatterplot, three-dimensional scatterplot, density distribution plot, and parallel coordinate plots; and more formal spatial data modeling that draws on the extensive Spatial Econometrics Toolbox functions. A brief review of the design aspects of the revised Arc_Mat is described, and we provide some illustrative examples that highlight representative uses of the toolbox. Finally, we discuss programming with and customizing the Arc_Mat toolbox functionalities.
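Arc_Mat itself is Matlab code; purely to illustrate the exploratory statistic behind the Moran scatterplot it exposes, the following sketch computes global Moran's I for a small made-up dataset in Python:

```python
# Illustrative computation of global Moran's I (the statistic behind the
# Moran scatterplot mentioned above); not Arc_Mat code, which is Matlab.
import numpy as np

def morans_i(values, weights):
    """values: 1-D array of observations; weights: n x n spatial weights matrix."""
    z = values - values.mean()
    w = weights / weights.sum()               # divide by S0 = sum of all weights
    num = (w * np.outer(z, z)).sum()
    den = (z ** 2).sum() / len(values)
    return num / den

vals = np.array([3.0, 2.5, 4.1, 3.8, 1.9])
W = np.array([[0, 1, 0, 0, 1],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [1, 0, 0, 1, 0]], dtype=float)   # made-up contiguity matrix
print(morans_i(vals, W))
```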
NASA Technical Reports Server (NTRS)
Gladden, Roy E.; Khanampornpan, Teerapat; Fisher, Forest W.
2010-01-01
Version 5.0 of the AutoGen software has been released. Previous versions, variously denoted Autogen and autogen, were reported in two articles: Automated Sequence Generation Process and Software (NPO-30746), Software Tech Briefs (Special Supplement to NASA Tech Briefs), September 2007, page 30, and Autogen Version 2.0 (NPO-41501), NASA Tech Briefs, Vol. 31, No. 10 (October 2007), page 58. To recapitulate: AutoGen (now signifying "automatic sequence generation") automates the generation of sequences of commands in a standard format for uplink to spacecraft. AutoGen requires fewer workers than are needed for older manual sequence-generation processes, and greatly reduces sequence-generation times. The sequences are embodied in spacecraft activity sequence files (SASFs). AutoGen automates generation of SASFs by use of another previously reported program called APGEN. AutoGen encodes knowledge of different mission phases and of how the resultant commands must differ among the phases. AutoGen also provides means for customizing sequences through use of configuration files. The approach followed in developing AutoGen has involved encoding the behaviors of a system into a model and encoding algorithms for context-sensitive customizations of the modeled behaviors. This version of AutoGen addressed the MRO (Mars Reconnaissance Orbiter) primary science phase (PSP) mission phase. On previous Mars missions this phase has more commonly been referred to as the mapping phase. This version addressed the unique aspects of sequencing orbital operations and specifically the mission-specific adaptation of orbital operations for MRO. This version also includes capabilities for MRO's role in Mars relay support for UHF relay communications with the MER rovers and the Phoenix lander.
Repeat Customer Success in Extension
ERIC Educational Resources Information Center
Bess, Melissa M.; Traub, Sarah M.
2013-01-01
Four multi-session research-based programs were offered by two Extension specialists in one rural Missouri county. Eleven participants who came to multiple Extension programs could be called "repeat customers." Based on the total number of participants for all four programs, 25% could be deemed repeat customers. Repeat customers had…
Testing Software Development Project Productivity Model
NASA Astrophysics Data System (ADS)
Lipkin, Ilya
Software development is an increasingly influential factor in today's business environment, and a major issue affecting software development is how an organization estimates projects. If the organization underestimates cost, schedule, and quality requirements, the end results will not meet customer needs. On the other hand, if the organization overestimates these criteria, resources that could have been used more profitably will be wasted. There is no accurate model or measure available that can guide an organization in estimating software development, and existing estimation models often underestimate software development effort by as much as 500 to 600 percent. To address this issue, existing models are usually calibrated using local data with a small sample size, but the resulting estimates do not offer improved cost analysis. This study presents a conceptual model for accurately estimating software development, built on an extensive literature review and a theoretical analysis grounded in Sociotechnical Systems (STS) theory. The conceptual model serves as a solution to bridge organizational and technological factors and is validated using an empirical dataset provided by the DoD. The practical implications of this study allow practitioners to concentrate on the specific constructs of interest that provide the best value for the least amount of time. This study outlines key contributing constructs that are unique to Software Size (E-SLOC), Man-hours Spent, and Quality of the Product, the constructs having the largest contribution to project productivity. This study discusses customer characteristics and provides a framework for a simplified project analysis for source selection evaluation and audit task reviews for customers and suppliers. The theoretical contribution of this study is an initial theory-based, hypothesized project productivity model that can be used as a generic overall model across several application domains such as IT, command and control, and simulation. This research validates findings from previous work concerning software project productivity and leverages those results. The hypothesized project productivity model provides statistical support and validation of expert opinions used by practitioners in the field of software project estimation.
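The study's own STS-based model is not reproduced here; as a point of reference for the kind of parametric estimate being calibrated, the sketch below shows the classic power-law effort model (effort = a * KSLOC^b, COCOMO-style) with nominal, illustrative coefficients that practitioners recalibrate from local project data:

```python
# Illustrative COCOMO-style power-law estimate (not the study's STS-based
# model); coefficients a and b are nominal values and would be recalibrated
# from local project data in practice.

def effort_person_months(ksloc, a=2.94, b=1.0997):
    return a * ksloc ** b

for size in (10, 50, 100):
    print(f"{size} KSLOC -> {effort_person_months(size):.0f} person-months")
```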
O'Callaghan, Sean; De Souza, David P; Isaac, Andrew; Wang, Qiao; Hodkinson, Luke; Olshansky, Moshe; Erwin, Tim; Appelbe, Bill; Tull, Dedreia L; Roessner, Ute; Bacic, Antony; McConville, Malcolm J; Likić, Vladimir A
2012-05-30
Gas chromatography-mass spectrometry (GC-MS) is a technique frequently used in targeted and non-targeted measurements of metabolites. Most existing software tools for processing of raw instrument GC-MS data tightly integrate data processing methods with a graphical user interface facilitating interactive data processing. While interactive processing remains critically important in GC-MS applications, high-throughput studies increasingly dictate the need for command line tools, suitable for scripting of high-throughput, customized processing pipelines. PyMS comprises a library of functions for processing of instrument GC-MS data developed in Python. PyMS currently provides a complete set of GC-MS processing functions, including reading of standard data formats (ANDI-MS/NetCDF and JCAMP-DX), noise smoothing, baseline correction, peak detection, peak deconvolution, peak integration, and peak alignment by dynamic programming. A novel common ion single quantitation algorithm allows automated, accurate quantitation of GC-MS electron impact (EI) fragmentation spectra when a large number of experiments are being analyzed. PyMS implements parallel processing for by-row and by-column data processing tasks based on the Message Passing Interface (MPI), allowing processing to scale on multiple CPUs in distributed computing environments. A set of specifically designed experiments was performed in-house and used to comparatively evaluate the performance of PyMS and three widely used software packages for GC-MS data processing (AMDIS, AnalyzerPro, and XCMS). PyMS is a novel software package for the processing of raw GC-MS data, particularly suitable for scripting of customized processing pipelines and for data processing in batch mode. PyMS provides limited graphical capabilities and can be used both for routine data processing and interactive/exploratory data analysis. In real-life GC-MS data processing scenarios PyMS performs as well as or better than leading software packages. We demonstrate data processing scenarios that are simple to implement in PyMS yet difficult to achieve with many conventional GC-MS data processing software packages. Automated sample processing and quantitation with PyMS can provide substantial time savings compared to more traditional interactive software systems that tightly integrate data processing with the graphical user interface.
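The following is not the PyMS API; it is a generic sketch of the kind of scripted, GUI-free pipeline the abstract describes (smoothing, a crude baseline estimate, and peak detection), using SciPy on a synthetic chromatogram:

```python
# Generic batch-processing sketch (not the PyMS API): smooth a chromatogram,
# subtract a simple baseline, and detect peaks, all without a GUI.
import numpy as np
from scipy.signal import savgol_filter, find_peaks

def process_trace(intensity):
    smoothed = savgol_filter(intensity, window_length=11, polyorder=3)
    baseline = np.percentile(smoothed, 10)          # crude baseline estimate
    corrected = np.clip(smoothed - baseline, 0, None)
    peaks, _ = find_peaks(corrected, height=corrected.max() * 0.05)
    return corrected, peaks

if __name__ == "__main__":
    t = np.linspace(0, 10, 500)
    trace = (100 * np.exp(-((t - 3) ** 2) / 0.05)
             + 60 * np.exp(-((t - 7) ** 2) / 0.1) + 5)   # synthetic chromatogram
    corrected, peaks = process_trace(trace)
    print("peak indices:", peaks)
```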
Rahmani, Zienolabedin; Ranjbar, Mansour; Gara, Ali Asgar Nadi; Gorji, Mohammad Ali Heidari
2017-06-01
Healthcare providers are competitive, owing to heightened customers' awareness and expectations of health care services. The aim of this study was to determine the relationship between customer value creation and loyalty, with trust and customer satisfaction as mediators. This is a cross-sectional survey study. Participants were 196 patients referred to private hospitals in Sari city, Iran, from May to June 2014, selected by the convenience sampling method. Data were collected using questionnaires. Data were analyzed using the structural equation modeling software Smart PLS. The results revealed a relationship between customer value creation and customer loyalty in a Sari city private hospital, and customer satisfaction and trust mediate the relationship between customer value creation and customer loyalty. The results also revealed a significant positive relationship between customer satisfaction and trust (p=0.000, r=0.585). Customer satisfaction and trust mediate the relationship between customer value creation and customer loyalty.
2005 5th Annual CMMI Technology Conference and User Group. Volume 4: Thursday
2005-11-17
Session listing includes: Identification and Involvement in the CMMI (Mr. James R. Armstrong, Systems and Software Consortium); Ensuring the Right Process is Deployed Right; and material on the customer-driven organization drawing on Philip Kotler's Marketing Management: Analysis, Planning, Implementation and Control (Prentice Hall).
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-08-01
An estimated 85% of the installed base of software is a custom application with a production quantity of one. In practice, almost 100% of military software systems are custom software. Paradoxically, the marginal costs of producing additional units are near zero. So why hasn't the software market, a market with high design costs and low production costs, evolved like other similar custom widget industries, such as automobiles and hardware chips? The military software industry seems immune to market pressures that have motivated a multilevel supply chain structure in other widget industries: design cost recovery, improved quality through specialization, and rapid assembly from purchased components. The primary goal of the ComponentWare Consortium (CWC) technology plan was to overcome barriers to building and deploying mission-critical information systems by using verified, reusable software components (ComponentWare). The adoption of the ComponentWare infrastructure is predicated upon a critical mass of the leading platform vendors' inevitable adoption of emerging, object-based, distributed computing frameworks--initially CORBA and COM/OLE. The long-range goal of this work is to build and deploy military systems from verified reusable architectures. The promise of component-based applications is to enable developers to snap together new applications by mixing and matching prefabricated software components. A key result of this effort is the concept of reusable software architectures. A second important contribution is the notion that a software architecture is something that can be captured in a formal language and reused across multiple applications. The formalization and reuse of software architectures provide major cost and schedule improvements. The Unified Modeling Language (UML) is fast becoming the industry standard for object-oriented analysis and design notation for object-based systems. However, the lack of a standard real-time distributed object operating system, the lack of a standard Computer-Aided Software Environment (CASE) tool notation, and the lack of a standard CASE tool repository have limited the realization of component software. The approach to fulfilling this need is the software component factory innovation. The factory approach takes advantage of emerging standards such as UML, CORBA, Java and the Internet. The key technical innovation of the software component factory is the ability to assemble and test new system configurations as well as assemble new tools on demand from existing tools and architecture design repositories.
Differentiation of tumor from viable myocardium using cardiac tagging with MR imaging.
Bouton, S; Yang, A; McCrindle, B W; Kidd, L; McVeigh, E R; Zerhouni, E A
1991-01-01
We report the application of myocardial tagging by MR to define tissue planes and differentiate contractile from noncontractile tissue in a neonate with congenital cardiac rhabdomyoma. Using custom-written pulse programming software, six 2 mm thick radiofrequency (RF) slice-selective presaturation pulses (tags) were used to label the chest wall and myocardium in a star pattern in diastole, approximately 60 ms before the R-wave gating trigger. This method successfully delineated the myocardium from noncontractile tumor, providing information that influenced clinical management. This RF tagging technique allowed us to confirm the exact intramyocardial location of a congenital cardiac tumor.
Macintosh/LabVIEW based control and data acquisition system for a single photon counting fluorometer
NASA Astrophysics Data System (ADS)
Stryjewski, Wieslaw J.
1991-08-01
A flexible software system has been developed for controlling fluorescence decay measurements using the virtual instrument approach offered by LabVIEW. The time-correlated single photon counting instrument operates under computer control in both manual and automatic mode. Implementation time was short and the equipment is now easier to use, reducing the training time required for new investigators. It is not difficult to customize the front panel or adapt the program to a different instrument. We found LabVIEW much more convenient to use for this application than traditional, textual computer languages.
DPOI: Distributed software system development platform for ocean information service
NASA Astrophysics Data System (ADS)
Guo, Zhongwen; Hu, Keyong; Jiang, Yongguo; Sun, Zhaosui
2015-02-01
Ocean information management is of great importance as it has been employed in many areas of ocean science and technology. However, the development of Ocean Information Systems (OISs) often suffers from low efficiency because of repetitive work and continuous modifications caused by dynamic requirements. In this paper, the basic requirements of OISs are analyzed first, and then a novel platform, DPOI, is proposed to improve development efficiency and enhance the software quality of OISs by providing off-the-shelf resources. In the platform, the OIS is decomposed hierarchically into a set of modules, which can be reused in different system developments. These modules include the acquisition middleware and data loader that collect data from instruments and files respectively, the database that stores data consistently, the components that support fast application generation, the web services that make the data from distributed sources syntactically consistent by use of predefined schemas, and the configuration toolkit that enables software customization. With the assistance of the development platform, software development needs no programming and the development procedure is thus greatly accelerated. We have applied the development platform in practical developments and evaluated its efficiency in several development practices and different development approaches. The results show that DPOI significantly improves development efficiency and software quality.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klise, Katherine A.; Murray, Regan; Bynum, Michael
Water utilities are vulnerable to a wide variety of human-caused and natural disasters. These disruptive events can result in loss of water service, contaminated water, pipe breaks, and failed equipment. Furthermore, long term changes in water supply and customer demand can have a large impact on the operating conditions of the network. The ability to maintain drinking water service during and following these types of events is critical. Simulation and analysis tools can help water utilities explore how their network will respond to disruptive events and plan effective mitigation strategies. The U.S. Environmental Protection Agency and Sandia National Laboratories are developing new software tools to meet this need. The Water Network Tool for Resilience (WNTR, pronounced winter) is a Python package designed to help water utilities investigate resilience of water distribution systems over a wide range of hazardous scenarios and to evaluate resilience-enhancing actions. The following documentation includes installation instructions and examples, description of software features, and software license. It is assumed that the reader is familiar with the Python Programming Language. References are included for additional background on software components. Online documentation, hosted at http://wntr.readthedocs.io/, will be updated as new features are added. The online version includes API documentation and information for developers.
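A minimal usage sketch following WNTR's documented pattern (the exact module paths and the input file name are assumptions and may differ across package versions): load an EPANET model, run a hydraulic simulation, and inspect junction pressures.

```python
# Minimal WNTR usage sketch, following the package's documented pattern;
# module paths and the .inp file name are illustrative and version-dependent.
import wntr

# Build a network model from an EPANET input file (file name is illustrative).
wn = wntr.network.WaterNetworkModel('Net3.inp')

# Run a hydraulic simulation and inspect pressures at network junctions.
sim = wntr.sim.EpanetSimulator(wn)
results = sim.run_sim()
pressure = results.node['pressure']
print(pressure.head())
```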
Payload Operations Support Team Tools
NASA Technical Reports Server (NTRS)
Askew, Bill; Barry, Matthew; Burrows, Gary; Casey, Mike; Charles, Joe; Downing, Nicholas; Jain, Monika; Leopold, Rebecca; Luty, Roger; McDill, David;
2007-01-01
Payload Operations Support Team Tools is a software system that assists in (1) development and testing of software for payloads to be flown aboard the space shuttles and (2) training of payload customers, flight controllers, and flight crews in payload operations
LABORATORY PROCESS CONTROLLER USING NATURAL LANGUAGE COMMANDS FROM A PERSONAL COMPUTER
NASA Technical Reports Server (NTRS)
Will, H.
1994-01-01
The complex environment of the typical research laboratory requires flexible process control. This program provides natural language process control from an IBM PC or compatible machine. Sometimes process control schedules require changes frequently, even several times per day. These changes may include adding, deleting, and rearranging steps in a process. This program sets up a process control system that can either run without an operator, or be run by workers with limited programming skills. The software system includes three programs. Two of the programs, written in FORTRAN77, record data and control research processes. The third program, written in Pascal, generates the FORTRAN subroutines used by the other two programs to identify the user commands with the user-written device drivers. The software system also includes an input data set which allows the user to define the user commands which are to be executed by the computer. To set the system up the operator writes device driver routines for all of the controlled devices. Once set up, this system requires only an input file containing natural language command lines which tell the system what to do and when to do it. The operator can make up custom commands for operating and taking data from external research equipment at any time of the day or night without the operator in attendance. This process control system requires a personal computer operating under MS-DOS with suitable hardware interfaces to all controlled devices. The program requires a FORTRAN77 compiler and user-written device drivers. This program was developed in 1989 and has a memory requirement of about 62 Kbytes.
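The original system is FORTRAN77 and Pascal under MS-DOS; the sketch below only illustrates the general idea of a command table that maps operator-defined, natural-language-style command lines onto user-written device-driver routines. All command phrases and driver functions here are hypothetical:

```python
# Conceptual sketch (not the original FORTRAN/Pascal code): map natural
# language style command lines to user-written device-driver functions.

def set_furnace(temperature):
    print(f"furnace setpoint -> {temperature} C")   # stand-in device driver

def read_pressure():
    print("reading pressure gauge")                 # stand-in device driver

# Command table: the operator defines which phrases invoke which drivers.
COMMANDS = {
    "set furnace to": lambda arg: set_furnace(float(arg)),
    "read pressure": lambda arg: read_pressure(),
}

def execute(line):
    for phrase, handler in COMMANDS.items():
        if line.lower().startswith(phrase):
            handler(line[len(phrase):].strip())
            return
    print(f"unknown command: {line}")

for cmd in ["Set furnace to 450", "Read pressure"]:
    execute(cmd)
```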
Open source software to control Bioflo bioreactors.
Burdge, David A; Libourel, Igor G L
2014-01-01
Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design cannot be performed without developing custom software. In addition, support for third party or custom designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32 and 64 bit windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW.
Open Source Software to Control Bioflo Bioreactors
Burdge, David A.; Libourel, Igor G. L.
2014-01-01
Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design cannot be performed without developing custom software. In addition, support for third party or custom designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32 and 64 bit windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW. PMID:24667828
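The CSV-scripted protocol idea described in both records can be sketched as follows. This is not the BiofloSoftware implementation; the column names and the send_setpoint() stub are assumptions for illustration only:

```python
# Illustrative sketch (not the BiofloSoftware implementation): read a simple
# CSV protocol of timed setpoints and apply them in order. Column names and
# the send_setpoint() stub are assumptions for the example.
import csv
import time

def send_setpoint(parameter, value):
    print(f"setting {parameter} to {value}")   # stand-in for the control command

def run_protocol(path):
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            time.sleep(float(row["hold_seconds"]))
            send_setpoint(row["parameter"], float(row["value"]))

# Example protocol file contents (hypothetical):
#   parameter,value,hold_seconds
#   agitation,200,0
#   temperature,37,60
```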
Rahmani, Zienolabedin; Ranjbar, Mansour; Gara, Ali Asgar Nadi; Gorji, Mohammad Ali Heidari
2017-01-01
Background Healthcare providers are competitive, owing to heightened customers’ awareness and expectations of health care services. Objective The aim of this study was to determine the relationship between customer value creation and loyalty, with trust and customer satisfaction as mediators. Methods This is a cross-sectional survey study. Participants were 196 patients referred to private hospitals in Sari city, Iran, from May to June 2014, selected by the convenience sampling method. Data were collected using questionnaires. Data were analyzed using the structural equation modeling software Smart PLS. Results The results revealed a relationship between customer value creation and customer loyalty in a Sari city private hospital, and customer satisfaction and trust mediate the relationship between customer value creation and customer loyalty. The results also revealed a significant positive relationship between customer satisfaction and trust (p=0.000, r=0.585). Conclusion Customer satisfaction and trust mediate the relationship between customer value creation and customer loyalty. PMID:28848619
NSSDC provides network access to key data via NDADS
NASA Technical Reports Server (NTRS)
Behnke, Jeanne; King, Joseph
1994-01-01
The National Space Science Data Center (NSSDC) is making a growing fraction of its most customer-desirable data electronically accessible via both local and wide area networks. NSSDC is witnessing a great increase in its data dissemination owing to this network accessibility. To provide its customers the best data accessibility, the NSSDC makes data available from a nearline, mass storage system, the NSSDC Data Archive and Dissemination Service (NDADS). NDADS, whose initial version was made available in January 1992, is a customized system of hardware and software that provides users access to the nearline data via ANONYMOUS FTP, an e-mail interface (ARMS), and a C-based software library. In January 1992, NDADS registered 416 requests for 1,957 files. By December of 1994, NDADS had been populated with 800 gigabytes of electronically accessible data and had registered 1,458 requests for 20,887 files. In this report we describe the NDADS system, both hardware and software. Later in the report, we discuss some of the lessons that were learned as a result of operating NDADS, particularly in the area of ingest and dissemination.
OpenSQUID: A Flexible Open-Source Software Framework for the Control of SQUID Electronics
Jaeckel, Felix T.; Lafler, Randy J.; Boyd, S. T. P.
2013-02-06
We report that commercially available computer-controlled SQUID electronics are usually delivered with software providing a basic user interface for adjustment of SQUID tuning parameters, such as bias current, flux offset, and feedback loop settings. However, in a research context it would often be useful to be able to modify this code and/or to have full control over all these parameters from researcher-written software. In the case of the STAR Cryoelectronics PCI/PFL family of SQUID control electronics, the supplied software contains modules for automatic tuning and noise characterization, but does not provide an interface for user code. On the other hand, the Magnicon SQUIDViewer software package includes a public application programming interface (API), but lacks auto-tuning and noise characterization features. To overcome these and other limitations, we are developing an "open-source" framework for controlling SQUID electronics which should provide maximal interoperability with user software, a unified user interface for electronics from different manufacturers, and a flexible platform for the rapid development of customized SQUID auto-tuning and other advanced features. Finally, we have completed a first implementation for the STAR Cryoelectronics hardware and have made the source code for this ongoing project available to the research community on SourceForge (http://opensquid.sourceforge.net) under the GNU public license.
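The OpenSQUID source is not shown here; the sketch below is only a conceptual illustration of an auto-tuning loop of the kind mentioned, sweeping a bias parameter and keeping the value that minimizes a measured noise figure, with stubbed hardware calls:

```python
# Conceptual auto-tuning sketch (not OpenSQUID source): sweep the SQUID bias
# current and keep the value that minimizes the measured noise figure.
# set_bias() and measure_noise() stand in for real electronics calls.
import random

def set_bias(microamps):
    pass  # would write the bias setting to the SQUID control electronics

def measure_noise(microamps):
    # stand-in: a noise curve with a minimum near 18 uA plus measurement jitter
    return (microamps - 18.0) ** 2 + random.uniform(0, 0.5)

def auto_tune(bias_range):
    best_bias, best_noise = None, float("inf")
    for bias in bias_range:
        set_bias(bias)
        noise = measure_noise(bias)
        if noise < best_noise:
            best_bias, best_noise = bias, noise
    return best_bias

print("selected bias:", auto_tune([10 + 0.5 * i for i in range(40)]))
```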
Software selection based on analysis and forecasting methods, practised in 1C
NASA Astrophysics Data System (ADS)
Vazhdaev, A. N.; Chernysheva, T. Y.; Lisacheva, E. I.
2015-09-01
The research focuses on the built-in mechanisms of the “1C: Enterprise 8” platform for data analysis and forecasting. It is important to evaluate and select proper software to develop effective strategies for customer relationship management in terms of sales, as well as for the implementation and further maintenance of software. The research data allow creating new forecast models to schedule further software distribution.
System integration test plan for HANDI 2000 business management system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, D.
This document presents the system integration test plan for the commercial-off-the-shelf (COTS) PassPort and PeopleSoft software and the custom software created to work with the COTS products. The PassPort (PP) software is an integrated application for AP, Contract Management, Inventory Management, Purchasing, and Material Safety Data Sheet. The PeopleSoft (PS) software is an integrated application for Project Costing, General Ledger, Human Resources/Training, Payroll, and Base Benefits.
A distributed data acquisition software scheme for the Laboratory Telerobotic Manipulator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butler, P.L.; Glassell, R.L.; Rowe, J.C.
1990-01-01
A custom software architecture was developed for use in the Laboratory Telerobotic Manipulator (LTM) to provide support for the distributed data acquisition electronics. This architecture was designed to provide a comprehensive development environment that proved to be useful for both hardware and software debugging. This paper describes the development environment and the operational characteristics of the real-time data acquisition software. 8 refs., 5 figs.
ERIC Educational Resources Information Center
Villano, Matt
2007-01-01
In the corporate world, the notion of customer relationship management (CRM) is nothing new. That particular technology sector is now jam-packed with software that enables organizations to monitor and manage every interaction with a customer, from the very first experience on, throughout the lifecycle of the relationship. That relationship spans…
The endothelial sample size analysis in corneal specular microscopy clinical examinations.
Abib, Fernando C; Holzchuh, Ricardo; Schaefer, Artur; Schaefer, Tania; Godois, Ronialci
2012-05-01
To evaluate endothelial cell sample size and statistical error in corneal specular microscopy (CSM) examinations. One hundred twenty examinations were conducted with 4 types of corneal specular microscopes: 30 each with the Bio-Optics, CSO, Konan, and Topcon instruments. All endothelial image data were analyzed by the respective instrument software and also by the Cells Analyzer software with a method developed in our lab. A reliability degree (RD) of 95% and a relative error (RE) of 0.05 were used as cut-off values to analyze images of the counted endothelial cells, called samples. The mean sample size was the number of cells evaluated on the images obtained with each device. Only examinations with RE < 0.05 were considered statistically correct and suitable for comparisons with future examinations. The Cells Analyzer software was used to calculate the RE and a customized sample size for all examinations. Bio-Optics: sample size, 97 ± 22 cells; RE, 6.52 ± 0.86; only 10% of the examinations had a sufficient endothelial cell quantity (RE < 0.05); customized sample size, 162 ± 34 cells. CSO: sample size, 110 ± 20 cells; RE, 5.98 ± 0.98; only 16.6% of the examinations had a sufficient endothelial cell quantity (RE < 0.05); customized sample size, 157 ± 45 cells. Konan: sample size, 80 ± 27 cells; RE, 10.6 ± 3.67; none of the examinations had a sufficient endothelial cell quantity (RE > 0.05); customized sample size, 336 ± 131 cells. Topcon: sample size, 87 ± 17 cells; RE, 10.1 ± 2.52; none of the examinations had a sufficient endothelial cell quantity (RE > 0.05); customized sample size, 382 ± 159 cells. A very high number of CSM examinations had sample errors based on the Cells Analyzer software. The endothelial sample size per examination needs to include more cells to be reliable and reproducible. The Cells Analyzer tutorial routine will be useful for CSM examination reliability and reproducibility.
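The abstract does not give the exact statistics used by the Cells Analyzer, but the kind of calculation involved can be sketched with the standard relation between sample size, the coefficient of variation (CV) of cell area, and the relative error of the mean at 95% reliability, n = (z * CV / RE)^2; the CV values below are assumed:

```python
# Illustrative only (the abstract does not give the formula used by Cells
# Analyzer): required sample size for the mean at 95% reliability,
# n = (z * CV / RE)^2, with assumed coefficients of variation of cell area.

def required_cells(cv, relative_error=0.05, z=1.96):
    return (z * cv / relative_error) ** 2

for cv in (0.25, 0.35, 0.45):
    print(f"CV = {cv:.2f} -> about {required_cells(cv):.0f} cells")
```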
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-02
.... As such, marketing fee programs,\\7\\ and customer posting incentive programs,\\8\\ are based on... its current Priority Customer Rebate Program (the ``Program'') until October 31, 2013.\\3\\ The Program... Priority Customer \\6\\ order transmitted by that Member which is executed on the Exchange in all multiply...
visPIG--a web tool for producing multi-region, multi-track, multi-scale plots of genetic data.
Scales, Matthew; Jäger, Roland; Migliorini, Gabriele; Houlston, Richard S; Henrion, Marc Y R
2014-01-01
We present VISual Plotting Interface for Genetics (visPIG; http://vispig.icr.ac.uk), a web application to produce multi-track, multi-scale, multi-region plots of genetic data. visPIG has been designed to allow users not well versed with mathematical software packages and/or programming languages such as R, Matlab®, Python, etc., to integrate data from multiple sources for interpretation and to easily create publication-ready figures. While web tools such as the UCSC Genome Browser or the WashU Epigenome Browser allow custom data uploads, such tools are primarily designed for data exploration. This is also true for the desktop-run Integrative Genomics Viewer (IGV). Other locally run data visualisation software such as Circos requires significant computer skills of the user. The visPIG web application is a menu-based interface that allows users to upload custom data tracks and set track-specific parameters. Figures can be downloaded as PDF or PNG files. For sensitive data, the underlying R code can also be downloaded and run locally. visPIG is multi-track: it can display many different data types (e.g., association, functional annotation, intensity, interaction, and heat map data). It also allows annotation of genes and other custom features in the plotted region(s). Data tracks can be plotted individually or on a single figure. visPIG is multi-region: it supports plotting multiple regions, be they kilo- or megabases apart or even on different chromosomes. Finally, visPIG is multi-scale: a sub-region of particular interest can be 'zoomed' in on. We describe the various features of visPIG and illustrate its utility with examples. visPIG is freely available through http://vispig.icr.ac.uk under a GNU General Public License (GPLv3).
Robotic NDE inspection of advanced solid rocket motor casings
NASA Technical Reports Server (NTRS)
Mcneelege, Glenn E.; Sarantos, Chris
1994-01-01
The Advanced Solid Rocket Motor program determined the need to inspect ASRM forgings and segments for potentially catastrophic defects. To minimize costs, an automated eddy current inspection system was designed and manufactured for inspection of ASRM forgings in the initial phases of production. This system utilizes custom manipulators and motion control algorithms and integrated six-channel eddy current data acquisition and analysis hardware and software. Total system integration is through a personal computer based workcell controller. Segment inspection demands the use of a gantry robot for the EMAT/ET inspection system. The EMAT/ET system utilized similar mechanical compliancy and software logic to accommodate complex part geometries. EMAT provides volumetric inspection capability while eddy current is limited to surface and near-surface inspection. Each aspect of the systems is applicable to other industries, such as inspection of pressure vessels, weld inspection, and traditional ultrasonic inspection applications.
Laboratory Animal Management Assistant (LAMA): a LIMS for active research colonies.
Milisavljevic, Marko; Hearty, Taryn; Wong, Tony Y T; Portales-Casamar, Elodie; Simpson, Elizabeth M; Wasserman, Wyeth W
2010-06-01
Laboratory Animal Management Assistant (LAMA) is an internet-based system for tracking large laboratory mouse colonies. It has a user-friendly interface with powerful search capabilities that ease day-to-day tasks such as tracking breeding cages and weaning litters. LAMA was originally developed to manage hundreds of new mouse strains generated by a large functional genomics program, the Pleiades Promoter Project ( http://www.pleiades.org ). The software system has proven to be highly flexible, suitable for diverse management approaches to mouse colonies. It allows custom tagging and grouping of animals, simplifying project-specific handling and access to data. Finally, LAMA was developed in close collaboration with mouse technicians to ease the transition from paper- or Excel-based management systems to computerized tracking, allowing data export in a popular spreadsheet format and automatic printing of cage cards. LAMA is an open-access software tool, freely available to the research community at http://launchpad.net/mousedb .
Engqvist, Martin K M; Nielsen, Jens
2015-08-21
The Ambiguous Nucleotide Tool (ANT) is a desktop application that generates and evaluates degenerate codons. Degenerate codons are used to represent DNA positions that have multiple possible nucleotide alternatives. This is useful for protein engineering and directed evolution, where primers specified with degenerate codons are used as a basis for generating libraries of protein sequences. ANT is intuitive and can be used in a graphical user interface or by interacting with the code through a defined application programming interface. ANT comes with full support for nonstandard, user-defined, or expanded genetic codes (translation tables), which is important because synthetic biology is being applied to an ever widening range of natural and engineered organisms. The Python source code for ANT is freely distributed so that it may be used without restriction, modified, and incorporated in other software or custom data pipelines.
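Not ANT's own code, but a small sketch of what a degenerate codon encodes: expanding IUPAC ambiguity codes into every concrete codon they represent, here for the common NNK library codon:

```python
# Small sketch (not ANT source code): expand a degenerate codon written with
# IUPAC ambiguity codes into every concrete codon it represents.
from itertools import product

IUPAC = {
    "A": "A", "C": "C", "G": "G", "T": "T",
    "R": "AG", "Y": "CT", "S": "GC", "W": "AT", "K": "GT", "M": "AC",
    "B": "CGT", "D": "AGT", "H": "ACT", "V": "ACG", "N": "ACGT",
}

def expand(degenerate_codon):
    options = [IUPAC[base] for base in degenerate_codon.upper()]
    return ["".join(codon) for codon in product(*options)]

print(expand("NNK"))        # the common NNK library codon
print(len(expand("NNK")))   # 32 concrete codons
```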
P-MartCancer-Interactive Online Software to Enable Analysis of Shotgun Cancer Proteomic Datasets.
Webb-Robertson, Bobbie-Jo M; Bramer, Lisa M; Jensen, Jeffrey L; Kobold, Markus A; Stratton, Kelly G; White, Amanda M; Rodland, Karin D
2017-11-01
P-MartCancer is an interactive web-based software environment that enables statistical analyses of peptide or protein data, quantitated from mass spectrometry-based global proteomics experiments, without requiring in-depth knowledge of statistical programming. P-MartCancer offers a series of statistical modules associated with quality assessment, peptide and protein statistics, protein quantification, and exploratory data analyses driven by the user via customized workflows and interactive visualization. Currently, P-MartCancer offers access and the capability to analyze multiple cancer proteomic datasets generated through the Clinical Proteomics Tumor Analysis Consortium at the peptide, gene, and protein levels. P-MartCancer is deployed as a web service (https://pmart.labworks.org/cptac.html), alternatively available via Docker Hub (https://hub.docker.com/r/pnnl/pmart-web/). Cancer Res; 77(21); e47-50. ©2017 American Association for Cancer Research (AACR).
Integrated prototyping environment for programmable automation
NASA Astrophysics Data System (ADS)
da Costa, Francis; Hwang, Vincent S. S.; Khosla, Pradeep K.; Lumia, Ronald
1992-11-01
We propose a rapid prototyping environment for robotic systems, based on tenets of modularity, reconfigurability and extendibility that may help build robot systems 'faster, better, and cheaper.' Given a task specification (e.g., repair brake assembly), the user browses through a library of building blocks that include both hardware and software components. Software advisors or critics recommend how blocks may be 'snapped' together to speedily construct alternative ways to satisfy task requirements. Mechanisms to allow 'swapping' competing modules for comparative test and evaluation studies are also included in the prototyping environment. After some iterations, a stable configuration or 'wiring diagram' emerges. This customized version of the general prototyping environment still contains all the hooks needed to incorporate future improvements in component technologies and to obviate unplanned obsolescence. The prototyping environment so described is relevant for both interactive robot programming (telerobotics) and iterative robot system development (prototyping).
National Water-Quality Assessment (NAWQA) area-characterization toolbox
Price, Curtis V.; Nakagaki, Naomi; Hitt, Kerie J.
2010-01-01
This is release 1.0 of the National Water-Quality Assessment (NAWQA) Area-Characterization Toolbox. These tools are designed to be accessed using ArcGIS Desktop software (versions 9.3 and 9.3.1). The toolbox is composed of a collection of custom tools that implement geographic information system (GIS) techniques used by the NAWQA Program to characterize aquifer areas, drainage basins, and sampled wells. These tools are built on top of standard functionality included in ArcGIS Desktop running at the ArcInfo license level. Most of the tools require a license for the ArcGIS Spatial Analyst extension. ArcGIS is a commercial GIS software system produced by ESRI, Inc. (http://www.esri.com). The NAWQA Area-Characterization Toolbox is not supported by ESRI, Inc. or its technical support staff. Any use of trade, product, or firm names is for descriptive purposes only and does not imply endorsement by the U.S. Government.
NASA Technical Reports Server (NTRS)
1997-01-01
Cogent Software, Inc. was formed in January 1995 by David Atkinson and Irene Woerner, both former employees of the Jet Propulsion Laboratory (JPL). Several other Cogent employees also worked at JPL. Atkinson headed JPL's Information Systems Technology section and Woerner lead the Advanced User Interfaces Group. Cogent's mission is to help companies organize and manage their online content by developing advanced software for the next generation of online directories and information catalogs. The company offers a complete range of Internet solutions, including Internet access, Web site design, local and wide-area networks, and custom software for online commerce applications. Cogent also offers DesignSphere Online, an electronic community for the communications arts industry. Customers range from small offices to manufacturers with thousands of employees, including Chemi-Con, one of the largest manufacturers of capacitors in the world.
[Software for performing a global phenotypic and genotypic nutritional assessment].
García de Diego, L; Cuervo, M; Martínez, J A
2013-01-01
The nutritional assessment of a patient requires simultaneously managing extensive information and a great number of databases, as both aspects of the nutrition process and the clinical situation of the patient are analyzed. The introduction of computers in the nutritional area constitutes an extraordinary advance in the administration of nutrition information, providing a complete assessment of nutritional aspects in a quick and easy way. The objective was to develop a computer program that can be used as a tool for assessing the nutritional status of the patient, for the education of clinical staff, for epidemiological studies, and for educational purposes. The work is based on a computer program which assists the health specialist in performing a full nutritional evaluation of the patient through the registration and assessment of phenotypic and genotypic features. The application provides nutritional prognosis based on anthropometric and biochemical parameters, images of states of malnutrition, questionnaires to characterize diseases, diagnostic criteria, identification of alleles associated with the development of specific metabolic illnesses, and quality-of-life questionnaires, allowing a customized course of action. The program includes, as part of the nutritional assessment of the patient, food intake analysis, design of diets and promotion of physical activity, introducing food frequency questionnaires, dietary recalls, healthy eating indexes, model diets, fitness tests, and physical activity recommendations, recalls and questionnaires. The program was developed with Java Swing, using an SQLite database and some external libraries such as JFreeChart for plotting graphs. This newly designed software is composed of five blocks organized into ten modules: Patients, Anthropometry, Clinical History, Biochemistry, Dietary History, Diagnostic (with genetic makeup), Quality of Life, Physical Activity, Energy Expenditure, and Diets. Each module has a specific function which evaluates a different aspect of the nutritional status of the patient. UNyDIET is a global computer program, customized and upgradeable, easy to use and versatile, aimed at health specialists, medical staff, dietitians, nutritionists, scientists and educators. This tool can be used as a working instrument in programs promoting health, in nutritional and clinical assessments, in the evaluation of health care quality, in epidemiological studies, in nutrition intervention programs, and in teaching. Copyright © AULA MEDICA EDICIONES 2013. Published by AULA MEDICA. All rights reserved.
Process for Managing and Customizing HPC Operating Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, David ML
2014-04-02
A process for maintaining a custom HPC operating system was developed at the Environmental Molecular Sciences Laboratory (EMSL) over the past ten years. This process is generic and flexible enough to manage continuous change and to keep systems updated, while managing communication through well-defined pieces of software.
Customized News in Your Mailbox.
ERIC Educational Resources Information Center
Rudich, Joe
1996-01-01
Customized Internet services deliver news and selected research via e-mail, fax, Web browser, or their own software. Some are clipping services while others are full-fledged online newspapers. Most charge a monthly subscription fee, but a few are free to registered users. Provides the addresses, cost, scope, and evaluation of eight services. (PEN)
78 FR 13673 - HTC America, Inc.; Analysis of Proposed Consent Order To Aid Public Comment
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-28
... modifying various pre-installed applications and components in order to differentiate its products from.... As the customized applications and components are pre-installed on the device, consumers do not... together, failed to provide reasonable and appropriate security in the design and customization of software...
ECCE Toolkit: Prototyping Sensor-Based Interaction.
Bellucci, Andrea; Aedo, Ignacio; Díaz, Paloma
2017-02-23
Building and exploring physical user interfaces requires high technical skills and hours of specialized work. The behavior of multiple devices with heterogeneous input/output channels and connectivity has to be programmed in a context where not only the software interface matters, but also the hardware components are critical (e.g., sensors and actuators). Prototyping physical interaction is hindered by the challenges of: (1) programming interactions among physical sensors/actuators and digital interfaces; (2) implementing functionality for different platforms in different programming languages; and (3) building custom electronic-incorporated objects. We present ECCE (Entities, Components, Couplings and Ecosystems), a toolkit for non-programmers that copes with these issues by abstracting from low-level implementations, thus lowering the complexity of prototyping small-scale, sensor-based physical interfaces to support the design process. A user evaluation provides insights and use cases of the kind of applications that can be developed with the toolkit.
The Integrated Medical Model: Outcomes from Independent Review
NASA Technical Reports Server (NTRS)
Myers, J.; Garcia, Y.; Griffin, D.; Arellano, J.; Boley, L.; Goodenow, D. A.; Kerstman, E.; Reyes, D.; Saile, L.; Walton, M.;
2017-01-01
In 2016, the Integrated Medical Model (IMM) v4.0 underwent an extensive external review in preparation for transition to an operational status. In order to ensure impartiality of the review process, the Exploration Medical Capabilities Element of NASA's Human Research Program convened the review through the Systems Review Office at NASA Goddard Space Flight Center (GSFC). The review board convened by GSFC consisted of persons from both NASA and academia with expertise in the fields of statistics, epidemiology, modeling, software development, aerospace medicine, and project management (see Figure 1). The board reviewed software and code standards, as well as the evidence pedigree associated with both the input and outcomes information. The board also assessed the model's verification, validation, sensitivity to parameters, and ability to answer operational questions. This talk will discuss the process for designing the review, how the review progressed, and the findings from the board, as well as summarize the IMM project responses to those findings. Overall, the board found that the IMM is scientifically sound, represents a necessary, comprehensive approach to identifying medical and environmental risks facing astronauts in long-duration missions, and is an excellent tool for communication between engineers and physicians. The board also found that the IMM and its customer(s) should convene an additional review of the IMM data sources and develop a sustainable approach to augment, peer review, and maintain the information utilized in the IMM. The board found this critically important because medical knowledge continues to evolve. Delivery of IMM v4.0 to the Crew Health and Safety (CHS) Program will occur in 2017. Once delivered for operational decision support, IMM v4.0 will provide CHS with additional quantitative capability to assess astronaut medical risks and required medical capabilities, helping to drive down overall mission risks.
NASA Technical Reports Server (NTRS)
Srivastava, Priyaka; Kraus, Jeff; Murawski, Robert; Golden, Bertsel, Jr.
2015-01-01
NASA's Space Communications and Navigation (SCaN) program manages three active networks: the Near Earth Network, the Space Network, and the Deep Space Network. These networks simultaneously support NASA missions and provide communications services to customers worldwide. To efficiently manage these resources and their capabilities, a team of student interns at the NASA Glenn Research Center is developing a distributed system to model the SCaN networks. Once complete, the system shall provide a platform that enables users to perform capacity modeling of current and prospective missions, with finer-grained control of information between several simulation and modeling tools. This will enable the SCaN program to access a holistic view of its networks and simulate the effects of modifications in order to provide NASA with decisional information. The development of this capacity modeling system is managed by NASA's Strategic Center for Education, Networking, Integration, and Communication (SCENIC). Three primary third-party software tools offer their unique abilities in different stages of the simulation process. MagicDraw provides UML/SysML modeling, AGI's Systems Tool Kit simulates the physical transmission parameters and de-conflicts scheduled communication, and Riverbed Modeler (formerly OPNET) simulates communication protocols and packet-based networking. SCENIC developers are building custom software extensions to integrate these components in an end-to-end space communications modeling platform. A central control module acts as the hub for report-based messaging between client wrappers. Backend databases provide information related to mission parameters and ground station configurations, while the end user defines scenario-specific attributes for the model. The eight SCENIC interns are working under the direction of their mentors to complete an initial version of this capacity modeling system during the summer of 2015. The intern team is composed of four students in Computer Science, two in Computer Engineering, one in Electrical Engineering, and one studying Space Systems Engineering.
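The "central control module as a hub for report-based messaging between client wrappers" can be pictured with a small publish/subscribe sketch. This is only an illustration of the pattern; the class names, topics and message fields below are invented, not SCENIC's actual interfaces.

```python
# Illustrative hub-and-wrapper messaging pattern (hypothetical names, not SCENIC code).
from collections import defaultdict
from typing import Callable

class ControlHub:
    """Routes report messages between registered client wrappers by topic."""
    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, report: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(report)

hub = ControlHub()
# A wrapper around a link simulator might consume scenario definitions...
hub.subscribe("scenario", lambda r: print("simulating links for", r["mission"]))
# ...while a protocol-simulator wrapper consumes the resulting contact schedule.
hub.subscribe("contacts", lambda r: print("modeling packets over", len(r["passes"]), "passes"))

hub.publish("scenario", {"mission": "example-mission"})
hub.publish("contacts", {"passes": [1, 2, 3]})
```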
Tool for Analysis and Reduction of Scientific Data
NASA Technical Reports Server (NTRS)
James, Mark
2006-01-01
The Automated Scheduling and Planning Environment (ASPEN) computer program has been updated to version 3.0. ASPEN as a whole (up to version 2.0) has been summarized, and selected aspects of ASPEN have been discussed in several previous NASA Tech Briefs articles. Restated briefly, ASPEN is a modular, reconfigurable, application software framework for solving batch problems that involve reasoning about time, activities, states, and resources. Applications of ASPEN can include planning spacecraft missions, scheduling of personnel, and managing supply chains, inventories, and production lines. ASPEN 3.0 can be customized for a wide range of applications and for a variety of computing environments that include various central processing units and random-access memories. Domain-specific reasoning modules (e.g., modules for determining orbits for spacecraft) can easily be plugged into ASPEN 3.0. Improvements over other, similar software that have been incorporated into ASPEN 3.0 include a provision for more expressive time-line values, new parsing capabilities afforded by an ASPEN language based on Extensible Markup Language, improved search capabilities, and improved interfaces to other, utility-type software (notably including MATLAB).
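The "plug-in domain-specific reasoning module" idea can be sketched in a few lines. The following Python toy is not ASPEN's API; the planner, module and activity fields are hypothetical and serve only to show how independently written modules can be consulted per activity.

```python
# Hedged sketch of plug-in reasoning modules (hypothetical interfaces, not ASPEN's).
class OrbitModule:
    """Toy domain module: checks that an observation activity has its target visible."""
    def supports(self, activity: dict) -> bool:
        return activity["type"] == "observation"
    def feasible(self, activity: dict) -> bool:
        return activity.get("target_visible", False)

class Planner:
    def __init__(self) -> None:
        self.modules = []
    def register(self, module) -> None:
        self.modules.append(module)
    def check(self, activity: dict) -> bool:
        # An activity is accepted only if every module that claims it agrees.
        return all(m.feasible(activity) for m in self.modules if m.supports(activity))

planner = Planner()
planner.register(OrbitModule())
print(planner.check({"type": "observation", "target_visible": True}))   # True
print(planner.check({"type": "observation", "target_visible": False}))  # False
```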
Auto-Generated Semantic Processing Services
NASA Technical Reports Server (NTRS)
Davis, Rodney; Hupf, Greg
2009-01-01
Auto-Generated Semantic Processing (AGSP) Services is a suite of software tools for automated generation of other computer programs, denoted cross-platform semantic adapters, that support interoperability of computer-based communication systems that utilize a variety of both new and legacy communication software running in a variety of operating-system/computer-hardware combinations. AGSP has numerous potential uses in military, space-exploration, and other government applications as well as in commercial telecommunications. The cross-platform semantic adapters take advantage of common features of computer-based communication systems to enforce semantics, messaging protocols, and standards of processing of streams of binary data to ensure integrity of data and consistency of meaning among interoperating systems. The auto-generation aspect of AGSP Services reduces development time and effort by emphasizing specification and minimizing implementation: In effect, the design, building, and debugging of software for effecting conversions among complex communication protocols, custom device mappings, and unique data-manipulation algorithms is replaced with metadata specifications that map to an abstract platform-independent communications model. AGSP Services is modular and has been shown to be easily integrable into new and legacy NASA flight and ground communication systems.
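The core idea of driving binary message handling from a declarative metadata specification can be illustrated compactly. The tiny field-spec format and field names below are invented for this sketch; AGSP's actual specification language and adapters are not described at this level of detail in the record.

```python
# Illustrative only: a declarative metadata spec drives automatic encoding/decoding
# of a binary message, in the spirit of auto-generated adapters described above.
import struct

TYPE_CODES = {"uint16": "H", "int32": "i", "float32": "f"}

def build_codec(spec: list[tuple[str, str]]):
    """Turn a declarative (field name, type) spec into encode/decode functions."""
    fmt = ">" + "".join(TYPE_CODES[t] for _, t in spec)   # '>' = network byte order
    names = [n for n, _ in spec]
    encode = lambda msg: struct.pack(fmt, *(msg[n] for n in names))
    decode = lambda data: dict(zip(names, struct.unpack(fmt, data)))
    return encode, decode

spec = [("packet_id", "uint16"), ("timestamp", "int32"), ("value", "float32")]
encode, decode = build_codec(spec)
wire = encode({"packet_id": 7, "timestamp": 1234567, "value": 3.5})
print(decode(wire))  # {'packet_id': 7, 'timestamp': 1234567, 'value': 3.5}
```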
Sekiguchi, Yuki; Yamamoto, Masaki; Oroguchi, Tomotaka; Takayama, Yuki; Suzuki, Shigeyuki; Nakasako, Masayoshi
2014-11-01
Using our custom-made diffraction apparatus KOTOBUKI-1 and two multiport CCD detectors, cryogenic coherent X-ray diffraction imaging experiments have been undertaken at the SPring-8 Angstrom Compact free electron LAser (SACLA) facility. To efficiently perform experiments and data processing, two software suites with user-friendly graphical user interfaces have been developed. The first is a program suite named IDATEN, which was developed to easily conduct four procedures during experiments: aligning KOTOBUKI-1, loading a flash-cooled sample into the cryogenic goniometer stage inside the vacuum chamber of KOTOBUKI-1, adjusting the sample position with respect to the X-ray beam using a pair of telescopes, and collecting diffraction data by raster scanning the sample with X-ray pulses. Named G-SITENNO, the other suite is an automated version of the original SITENNO suite, which was designed for processing diffraction data. These user-friendly software suites are now indispensable for collecting a large number of diffraction patterns and for processing the diffraction patterns immediately after collecting data within a limited beam time.
Radiation effects in reconfigurable FPGAs
NASA Astrophysics Data System (ADS)
Quinn, Heather
2017-04-01
Field-programmable gate arrays (FPGAs) are co-processing hardware used in image and signal processing. FPGAs are programmed with custom implementations of an algorithm. These algorithms are highly parallel hardware designs that are faster than software implementations. This flexibility and speed have made FPGAs attractive for many space programs that need in situ, high-speed signal processing for data categorization and data compression. Most commercial FPGAs are affected by the space radiation environment, though. Problems with total ionizing dose (TID) have restricted the use of flash-based FPGAs. Static random access memory (SRAM)-based FPGAs must be mitigated to suppress errors from single-event upsets. This paper provides a review of radiation-effects issues in reconfigurable FPGAs and discusses methods for mitigating these problems. With careful design it is possible to use these components effectively and resiliently.
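One widely used mitigation for single-event upsets in SRAM-based FPGAs is triple modular redundancy (TMR), where three copies of a register are voted bit by bit. The snippet below is only a software model of that majority vote, not the paper's design flow or any vendor tool.

```python
# Software model of a bitwise 2-of-3 TMR majority voter (illustration only).
def majority_vote(a: int, b: int, c: int) -> int:
    """Bitwise majority over three redundant copies of a register value."""
    return (a & b) | (a & c) | (b & c)

golden = 0b1011_0101
upset = golden ^ 0b0000_1000          # one copy suffers a single-bit flip
print(bin(majority_vote(golden, golden, upset)))  # flipped bit is out-voted
```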
Dufendach, Kevin R; Koch, Sabine; Unertl, Kim M; Lehmann, Christoph U
2017-10-26
Early involvement of stakeholders in the design of medical software is particularly important due to the need to incorporate complex knowledge and actions associated with clinical work. Standard user-centered design methods include focus groups and participatory design sessions with individual stakeholders, which generally limit user involvement to a small number of individuals due to the significant time investments from designers and end users. The goal of this project was to reduce the effort for end users to participate in co-design of a software user interface by developing an interactive web-based crowdsourcing platform. In a randomized trial, we compared a new web-based crowdsourcing platform to standard participatory design sessions. We developed an interactive, modular platform that allows responsive remote customization and design feedback on a visual user interface based on user preferences. The responsive canvas is a dynamic HTML template that responds in real time to user preference selections. Upon completion, the design team can view the user's interface creations through an administrator portal and download the structured selections through a REDCap interface. We have created a software platform that allows users to customize a user interface and see the results of that customization in real time, receiving immediate feedback on the impact of their design choices. Neonatal clinicians used the new platform to successfully design and customize a neonatal handoff tool. They received no specific instruction and yet were able to use the software easily and reported high usability. VandAID, a new web-based crowdsourcing platform, can involve multiple users in user-centered design simultaneously and provides means of obtaining design feedback remotely. The software can provide design feedback at any stage in the design process, but it will be of greatest utility for specifying user requirements and evaluating iterative designs with multiple options.
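As a rough picture of how a "responsive canvas" can re-render a layout from a user's preference selections, the following server-side Python toy regenerates an HTML fragment from a preferences dictionary. The actual VandAID platform is a dynamic HTML/JavaScript interface with REDCap export; the field names, labels and template here are invented.

```python
# Toy re-rendering of a handoff-card template from user preference selections
# (hypothetical fields; not VandAID's implementation).
from string import Template

HANDOFF_TEMPLATE = Template("""
<div class="handoff-card">
  <h2>$patient_label</h2>
  $vitals_section
</div>
""")

def render(preferences: dict) -> str:
    vitals = "<table><tr><td>HR</td><td>SpO2</td></tr></table>" \
        if preferences.get("show_vitals_table") else "<p>Vitals hidden</p>"
    label = "Bed {bed} - {name}" if preferences.get("lead_with_bed") else "{name}"
    return HANDOFF_TEMPLATE.substitute(
        patient_label=label.format(bed="12A", name="Example, Baby"),
        vitals_section=vitals,
    )

print(render({"show_vitals_table": True, "lead_with_bed": True}))
```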
20 CFR 669.330 - How are services delivered to the customer?
Code of Federal Regulations, 2010 CFR
2010-04-01
... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false How are services delivered to the customer... Farmworker Jobs Program Customers and Available Program Services § 669.330 How are services delivered to the customer? To ensure that all services are focused on the customer's needs, services are provided through a...
20 CFR 669.330 - How are services delivered to the customer?
Code of Federal Regulations, 2011 CFR
2011-04-01
... 20 Employees' Benefits 3 2011-04-01 2011-04-01 false How are services delivered to the customer... Farmworker Jobs Program Customers and Available Program Services § 669.330 How are services delivered to the customer? To ensure that all services are focused on the customer's needs, services are provided through a...
Cormier, Nathan; Kolisnik, Tyler; Bieda, Mark
2016-07-05
There has been an enormous expansion of use of chromatin immunoprecipitation followed by sequencing (ChIP-seq) technologies. Analysis of large-scale ChIP-seq datasets involves a complex series of steps and production of several specialized graphical outputs. A number of systems have emphasized custom development of ChIP-seq pipelines. These systems are primarily based on custom programming of a single, complex pipeline or supply libraries of modules and do not produce the full range of outputs commonly produced for ChIP-seq datasets. It is desirable to have more comprehensive pipelines, in particular ones addressing common metadata tasks, such as pathway analysis, and pipelines producing standard complex graphical outputs. It is advantageous if these are highly modular systems, available as both turnkey pipelines and individual modules, that are easily comprehensible, modifiable and extensible to allow rapid alteration in response to new analysis developments in this growing area. Furthermore, it is advantageous if these pipelines allow data provenance tracking. We present a set of 20 ChIP-seq analysis software modules implemented in the Kepler workflow system; most (18/20) were also implemented as standalone, fully functional R scripts. The set consists of four full turnkey pipelines and 16 component modules. The turnkey pipelines in Kepler allow data provenance tracking. Implementation emphasized use of common R packages and widely-used external tools (e.g., MACS for peak finding), along with custom programming. This software presents comprehensive solutions and easily repurposed code blocks for ChIP-seq analysis and pipeline creation. Tasks include mapping raw reads, peakfinding via MACS, summary statistics, peak location statistics, summary plots centered on the transcription start site (TSS), gene ontology, pathway analysis, and de novo motif finding, among others. These pipelines range from those performing a single task to those performing full analyses of ChIP-seq data. The pipelines are supplied as both Kepler workflows, which allow data provenance tracking, and, in the majority of cases, as standalone R scripts. These pipelines are designed for ease of modification and repurposing.
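A single stage of such a pipeline, peak finding followed by summary statistics, can be sketched as follows. The published modules are Kepler workflows and R scripts; this Python sketch assumes MACS2 is installed and that its default narrowPeak output naming applies, and all file names are placeholders.

```python
# Minimal sketch of one ChIP-seq pipeline stage: call MACS2, then summarize peaks.
import subprocess
from pathlib import Path

def call_peaks(treatment_bam: str, control_bam: str, name: str, outdir: str = "macs_out"):
    subprocess.run(
        ["macs2", "callpeak",
         "-t", treatment_bam, "-c", control_bam,
         "-n", name, "-g", "hs", "--outdir", outdir],
        check=True,
    )
    return Path(outdir) / f"{name}_peaks.narrowPeak"

def summarize(peaks_file: Path) -> dict:
    widths = []
    with open(peaks_file) as fh:
        for line in fh:
            chrom, start, end, *_ = line.split("\t")
            widths.append(int(end) - int(start))
    return {"n_peaks": len(widths),
            "mean_width": sum(widths) / len(widths) if widths else 0.0}

# Example usage (placeholder file names):
# peaks = call_peaks("chip.bam", "input.bam", "sample1")
# print(summarize(peaks))
```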
Automatic Requirements Specification Extraction from Natural Language (ARSENAL)
2014-10-01
... designers, implementers) involved in the design of software systems. However, natural language descriptions can be informal, incomplete, imprecise ... communication of technical descriptions between the various stakeholders (e.g., customers, designers, implementers) involved in the design of software systems ... the accuracy of the natural language processing stage, the degree of automation, and robustness to noise.
National Software Reference Library (NSRL)
National Institute of Standards and Technology Data Gateway
National Software Reference Library (NSRL) (PC database for purchase) A collaboration of the National Institute of Standards and Technology (NIST), the National Institute of Justice (NIJ), the Federal Bureau of Investigation (FBI), the Defense Computer Forensics Laboratory (DCFL), the U.S. Customs Service, software vendors, and state and local law enforcement organizations, the NSRL is a tool to assist in fighting crime involving computers.
35 Ways to Take a "Byte" out of Software Costs. Fund Raising Ideas from COMPress Customers.
ERIC Educational Resources Information Center
COMPress, Wentworth, NH.
Based on a survey sponsored by COMPress Quarterly of various schools to determine the extent of the problem of lack of funds for purchasing computer software and how schools have coped with the problem, this booklet describes numerous ways to raise funds for software purchases. Nearly 1,000 questionnaires were returned and this booklet was…
NASA Technical Reports Server (NTRS)
Mangieri, Mark
2005-01-01
ARED flight instrumentation software is associated with an overall custom-designed resistive exercise system that will be deployed on the International Space Station (ISS). This innovative software application fuses together many diverse and new technologies into a robust and usable package. The software takes advantage of touchscreen user interface technology by providing a graphical user interface on a Windows-based tablet PC, meeting a design constraint of keyboard-less interaction with flight crewmembers. The software interacts with modified commercial data acquisition (DAQ) hardware to acquire multiple channels of sensor measurement from the ARED device. This information is recorded on the tablet PC and made available, via International Space Station (ISS) Wireless LAN (WLAN) and telemetry subsystems, to ground-based mission medics and trainers for analysis. The software includes a feature to accept electronically encoded prescriptions of exercises that guide crewmembers through a customized regimen of resistive weight training, based on personal analysis. These electronically encoded prescriptions are provided to the crew via ISS WLAN and telemetry subsystems. All personal data is securely associated with an individual crew member, based on a PIN ID mechanism.
Accelerating a MPEG-4 video decoder through custom software/hardware co-design
NASA Astrophysics Data System (ADS)
Díaz, Jorge L.; Barreto, Dacil; García, Luz; Marrero, Gustavo; Carballo, Pedro P.; Núñez, Antonio
2007-05-01
In this paper we present a novel methodology to accelerate an MPEG-4 video decoder using software/hardware co-design for wireless DAB/DMB networks. Software support includes the services provided by the embedded kernel μC/OS-II, and the application tasks mapped to software. Hardware support includes several custom co-processors and a communication architecture with bridges to the main system bus and with a dual port SRAM. Synchronization among tasks is achieved at two levels, by a hardware protocol and by kernel level scheduling services. Our reference application is an MPEG-4 video decoder composed of several software functions and written using a special C++ library named CASSE. Profiling and design-space exploration techniques were used previously over the Advanced Simple Profile (ASP) MPEG-4 decoder to determine the best HW/SW partition developed here. This research is part of the ARTEMI project and its main goal is the establishment of methodologies for the design of real-time complex digital systems using Programmable Logic Devices with embedded microprocessors as target technology and the design of multimedia systems for broadcasting networks as reference application.
Technical Note: Unified imaging and robotic couch quality assurance.
Cook, Molly C; Roper, Justin; Elder, Eric S; Schreibmann, Eduard
2016-09-01
To introduce a simplified quality assurance (QA) procedure that integrates tests for the linac's imaging components and the robotic couch. Current QA procedures for evaluating the alignment of the imaging system and linac require careful positioning of a phantom at isocenter before image acquisition and analysis. A complementary procedure for the robotic couch requires an initial displacement of the phantom and then evaluates the accuracy of repositioning the phantom at isocenter. We propose a two-in-one procedure that introduces a custom software module and incorporates both checks into one motion for increased efficiency. The phantom was manually set with random translational and rotational shifts, imaged with the in-room imaging system, and then registered to the isocenter using a custom software module. The software measured positioning accuracy by comparing the location of the repositioned phantom with a CAD model of the phantom at isocenter, which is physically verified using the MV port graticule. Repeatability of the custom software was tested by an assessment of internal marker location extraction on a series of scans taken over differing kV and CBCT acquisition parameters. The proposed method was able to correctly position the phantom at isocenter within acceptable 1 mm and 1° SRS tolerances, verified by both physical inspection and the custom software. Residual errors for mechanical accuracy were 0.26 mm vertically, 0.21 mm longitudinally, 0.55 mm laterally, 0.21° in pitch, 0.1° in roll, and 0.67° in yaw. The software module was shown to be robust across various scan acquisition parameters, detecting markers within 0.15 mm translationally in kV acquisitions and within 0.5 mm translationally and 0.3° rotationally across CBCT acquisitions with significant variations in voxel size. Agreement with vendor registration methods was well within 0.5 mm; differences were not statistically significant. As compared to the current two-step approach, the proposed QA procedure streamlines the workflow, accounts for rotational errors in imaging alignment, and simulates a broad range of variations in setup errors seen in clinical practice.
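The residual translation and rotation errors quoted above come from registering detected phantom marker positions to their reference (CAD) positions. A rigid-body (Kabsch) fit of that kind can be sketched as follows; this is a generic illustration of the computation, not the authors' software module, and the marker coordinates are made up.

```python
# Rigid (Kabsch) fit of measured marker positions to reference positions,
# yielding residual translation and rotation (illustration only).
import numpy as np

def rigid_fit(measured: np.ndarray, reference: np.ndarray):
    """Return rotation R and translation t such that measured ~= R @ reference + t."""
    mc, rc = measured.mean(axis=0), reference.mean(axis=0)
    H = (reference - rc).T @ (measured - mc)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mc - R @ rc
    return R, t

reference = np.array([[0, 0, 0], [50, 0, 0], [0, 50, 0], [0, 0, 50]], dtype=float)
measured = reference + np.array([0.3, -0.2, 0.1])          # a small residual offset
R, t = rigid_fit(measured, reference)
angle = np.degrees(np.arccos(np.clip((np.trace(R) - 1) / 2, -1.0, 1.0)))
print("translation residual (mm):", np.round(t, 3))
print("rotation residual (deg):", round(float(angle), 3))
```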
MassCascade: Visual Programming for LC-MS Data Processing in Metabolomics.
Beisken, Stephan; Earll, Mark; Portwood, David; Seymour, Mark; Steinbeck, Christoph
2014-04-01
Liquid chromatography coupled to mass spectrometry (LC-MS) is commonly applied to investigate the small molecule complement of organisms. Several software tools are typically joined in custom pipelines to semi-automatically process and analyse the resulting data. General workflow environments like the Konstanz Information Miner (KNIME) offer the potential of an all-in-one solution to process LC-MS data by allowing easy integration of different tools and scripts. We describe MassCascade and its workflow plug-in for processing LC-MS data. The Java library integrates frequently used algorithms in a modular fashion, thus enabling it to serve as back-end for graphical front-ends. The functions available in MassCascade have been encapsulated in a plug-in for the workflow environment KNIME, allowing combined use with e.g. statistical workflow nodes from other providers and making the tool intuitive to use without knowledge of programming. The design of the software guarantees a high level of modularity where processing functions can be quickly replaced or concatenated. MassCascade is an open-source library for LC-MS data processing in metabolomics. It embraces the concept of visual programming through its KNIME plug-in, simplifying the process of building complex workflows. The library was validated using open data.
Chance, K G; Green, C G
2001-01-01
It has been shown in the for-profit sector (business, service, and manufacturing) that the success of an organization depends on its ability to satisfy customer requirements while eliminating waste and reducing costs. The purpose of this article was to examine the impact of current practices in customer focus on program participation rates in the Virginia WIC Program. The results of this study showed that the use of customer-focused strategies was correlated to program participation rates in the WIC Program. The mean data showed that teamwork and accessibility were at unsatisfactory levels in Virginia.
PT-SAFE: a software tool for development and annunciation of medical audible alarms.
Bennett, Christopher L; McNeer, Richard R
2012-03-01
Recent reports by The Joint Commission as well as the Anesthesia Patient Safety Foundation have indicated that medical audible alarm effectiveness needs to be improved. Several recent studies have explored various approaches to improving the audible alarms, motivating the authors to develop real-time software capable of comparing such alarms. We sought to devise software that would allow for the development of a variety of audible alarm designs that could also integrate into existing operating room equipment configurations. The software is meant to be used as a tool for alarm researchers to quickly evaluate novel alarm designs. A software tool was developed for the purpose of creating and annunciating audible alarms. The alarms consisted of annunciators that were mapped to vital sign data received from a patient monitor. An object-oriented approach to software design was used to create a tool that is flexible and modular at run-time, can annunciate wave-files from disk, and can be programmed with MATLAB by the user to create custom alarm algorithms. The software was tested in a simulated operating room to measure technical performance and to validate the time-to-annunciation against existing equipment alarms. The software tool showed efficacy in a simulated operating room environment by providing alarm annunciation in response to physiologic and ventilator signals generated by a human patient simulator, on average 6.2 seconds faster than existing equipment alarms. Performance analysis showed that the software was capable of supporting up to 15 audible alarms on a mid-grade laptop computer before audio dropouts occurred. These results suggest that this software tool provides a foundation for rapidly staging multiple audible alarm sets from the laboratory to a simulation environment for the purpose of evaluating novel alarm designs, thus producing valuable findings for medical audible alarm standardization.
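The mapping of annunciators to vital-sign data can be pictured with a small sketch. The real tool is MATLAB-programmable and plays wave files through the audio device; the thresholds, alarm names and file names below are invented for illustration.

```python
# Toy annunciator-to-vital-sign mapping (hypothetical thresholds and sound files).
from dataclasses import dataclass
from typing import Callable

@dataclass
class Annunciator:
    name: str
    condition: Callable[[dict], bool]   # maps a vital-sign sample to alarm/no-alarm
    sound_file: str                     # placeholder for the wave file to annunciate

ALARMS = [
    Annunciator("SpO2 low", lambda v: v["spo2"] < 90, "spo2_low.wav"),
    Annunciator("Apnea", lambda v: v["resp_rate"] == 0, "apnea.wav"),
]

def process_sample(sample: dict) -> None:
    for alarm in ALARMS:
        if alarm.condition(sample):
            # A real system would stream the wave file to the audio device here.
            print(f"ANNUNCIATE: {alarm.name} -> {alarm.sound_file}")

process_sample({"spo2": 86, "resp_rate": 14})   # triggers the SpO2 alarm
```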
Espino, Jeremy U; Wagner, M; Szczepaniak, C; Tsui, F C; Su, H; Olszewski, R; Liu, Z; Chapman, W; Zeng, X; Ma, L; Lu, Z; Dara, J
2004-09-24
Computer-based outbreak and disease surveillance requires high-quality software that is well-supported and affordable. Developing software in an open-source framework, which entails free distribution and use of software and continuous, community-based software development, can produce software with such characteristics, and can do so rapidly. The objective of the Real-Time Outbreak and Disease Surveillance (RODS) Open Source Project is to accelerate the deployment of computer-based outbreak and disease surveillance systems by writing software and catalyzing the formation of a community of users, developers, consultants, and scientists who support its use. The University of Pittsburgh seeded the Open Source Project by releasing the RODS software under the GNU General Public License. An infrastructure was created, consisting of a website, mailing lists for developers and users, designated software developers, and shared code-development tools. These resources are intended to encourage growth of the Open Source Project community. Progress is measured by assessing website usage, number of software downloads, number of inquiries, number of system deployments, and number of new features or modules added to the code base. During September--November 2003, users generated 5,370 page views of the project website, 59 software downloads, 20 inquiries, one new deployment, and addition of four features. Thus far, health departments and companies have been more interested in using the software as is than in customizing or developing new features. The RODS laboratory anticipates that after initial installation has been completed, health departments and companies will begin to customize the software and contribute their enhancements to the public code base.
The right stuff ... meeting your customer needs.
Rubin, P; Carrington, S
1999-11-01
Meeting (and exceeding) your customers' needs is a requirement for competing in the current business world. New tools and techniques must be employed to deal with the rapidly changing global environment. This article describes the success of a global supply chain integration project for a division of a large multinational corporation. A state-of-the-art ERP software package was implemented in conjunction with major process changes to improve the organization's ability to promise and deliver product to their customers.
Trade-off decisions in distribution utility management
NASA Astrophysics Data System (ADS)
Slavickas, Rimas Anthony
As a result of the "unbundling" of traditional monopolistic electricity generation and transmission enterprises into a free-market economy, power distribution utilities are faced with very difficult decisions pertaining to electricity supply options and quality of service to the customers. The management of distribution utilities has become increasingly complex, versatile, and dynamic to the extent that conventional, non-automated management tools are almost useless and obsolete. This thesis presents a novel and unified approach to managing electricity supply options and quality of service to customers. The technique formulates the problem in terms of variables, parameters, and constraints. An advanced Mixed Integer Programming (MIP) optimization formulation is developed together with novel, logical, decision-making algorithms. These tools enable the utility management to optimize various cost components and assess their time-trend impacts, taking into account the intangible issues such as customer perception, customer expectation, social pressures, and public response to service deterioration. The above concepts are further generalized and a Logical Proportion Analysis (LPA) methodology and associated software have been developed. Solutions using numbers are replaced with solutions using words (character strings) which more closely emulate the human decision-making process and advance the art of decision-making in the power utility environment. Using practical distribution utility operation data and customer surveys, the developments outlined in this thesis are successfully applied to several important utility management problems. These involve the evaluation of alternative electricity supply options, the impact of rate structures on utility business, and the decision of whether to continue to purchase from a main grid or generate locally (partially or totally) by building Non-Utility Generation (NUG).
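The flavor of the Mixed Integer Programming formulation, choosing between purchasing from the main grid and building local Non-Utility Generation, can be shown with a miniature model. This is not the thesis's formulation; it uses the PuLP library, and all costs and limits are made-up numbers.

```python
# Miniature MIP illustrating a grid-purchase vs. local-generation trade-off (PuLP).
from pulp import LpProblem, LpMinimize, LpVariable, value

demand_mwh = 1000
grid_price, local_price = 80.0, 40.0      # $/MWh (illustrative)
build_cost, local_capacity = 10000.0, 600  # fixed cost, MWh cap (illustrative)

prob = LpProblem("supply_option", LpMinimize)
grid = LpVariable("grid_mwh", lowBound=0)
local = LpVariable("local_mwh", lowBound=0)
build = LpVariable("build_plant", cat="Binary")

prob += grid_price * grid + local_price * local + build_cost * build   # total cost
prob += grid + local == demand_mwh                                     # meet demand
prob += local <= local_capacity * build                                # no plant, no local energy

prob.solve()
print("build plant:", int(value(build)),
      "| grid MWh:", value(grid), "| local MWh:", value(local))
```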
Analysis of Air Traffic Track Data with the AutoBayes Synthesis System
NASA Technical Reports Server (NTRS)
Schumann, Johann Martin Philip; Cate, Karen; Lee, Alan G.
2010-01-01
The Next Generation Air Traffic System (NGATS) is aiming to provide substantial computer support for the air traffic controllers. Algorithms for the accurate prediction of aircraft movements are of central importance for such software systems, but trajectory prediction has to work reliably in the presence of unknown parameters and uncertainties. We are using the AutoBayes program synthesis system to generate customized data analysis algorithms that process large sets of aircraft radar track data in order to estimate parameters and uncertainties. In this paper, we present how the tasks of finding structure in track data, estimation of important parameters in climb trajectories, and the detection of continuous descent approaches can be accomplished with compact task-specific AutoBayes specifications. We present an overview of the AutoBayes architecture and describe how its schema-based approach generates customized analysis algorithms, documented C/C++ code, and detailed mathematical derivations. Results of experiments with actual air traffic control data are discussed.
Exploring the Role of Value Networks for Software Innovation
NASA Astrophysics Data System (ADS)
Morgan, Lorraine; Conboy, Kieran
This paper describes a research-in-progress that aims to explore the applicability and implications of open innovation practices in two firms - one that employs agile development methods and another that utilizes open source software. The open innovation paradigm has a lot in common with open source and agile development methodologies. A particular strength of agile approaches is that they move away from 'introverted' development, involving only the development personnel, and intimately involves the customer in all areas of software creation, supposedly leading to the development of a more innovative and hence more valuable information system. Open source software (OSS) development also shares two key elements of the open innovation model, namely the collaborative development of the technology and shared rights to the use of the technology. However, one shortfall with agile development in particular is the narrow focus on a single customer representative. In response to this, we argue that current thinking regarding innovation needs to be extended to include multiple stakeholders both across and outside the organization. Additionally, for firms utilizing open source, it has been found that their position in a network of potential complementors determines the amount of superior value they create for their customers. Thus, this paper aims to get a better understanding of the applicability and implications of open innovation practices in firms that employ open source and agile development methodologies. In particular, a conceptual framework is derived for further testing.
Using Selection Pressure as an Asset to Develop Reusable, Adaptable Software Systems
NASA Technical Reports Server (NTRS)
Berrick, Stephen; Lynnes, Christopher
2007-01-01
The Goddard Earth Sciences Data and Information Services Center (GES DISC) at NASA has over the years developed and honed several reusable architectural components for supporting large-scale data centers with a large customer base. These include a processing system (S4PM) and an archive system (S4PA), based upon a workflow engine called the Simple Scalable Script-based Science Processor (S4P), and an online data visualization and analysis system (Giovanni). These subsystems are currently reused internally in a variety of combinations to implement customized data management on behalf of instrument science teams and other science investigators. Some of these subsystems (S4P and S4PM) have also been reused by other data centers for operational science processing. Our experience has been that development and utilization of robust, interoperable and reusable software systems can actually flourish in environments defined by heterogeneous commodity hardware systems, an emphasis on value-added customer service, and the continual goal of achieving higher cost efficiencies. The repeated internal reuse that is fostered by such an environment encourages and even forces changes to the software that make it more reusable and adaptable. Allowing and even encouraging such selective pressures on software development has been a key factor in the success of S4P and S4PM, which are now available to the open source community under the NASA Open Source Agreement.
A Formal Approach to Domain-Oriented Software Design Environments
NASA Technical Reports Server (NTRS)
Lowry, Michael; Philpot, Andrew; Pressburger, Thomas; Underwood, Ian; Lum, Henry, Jr. (Technical Monitor)
1994-01-01
This paper describes a formal approach to domain-oriented software design environments, based on declarative domain theories, formal specifications, and deductive program synthesis. A declarative domain theory defines the semantics of a domain-oriented specification language and its relationship to implementation-level subroutines. Formal specification development and reuse is made accessible to end-users through an intuitive graphical interface that guides them in creating diagrams denoting formal specifications. The diagrams also serve to document the specifications. Deductive program synthesis ensures that end-user specifications are correctly implemented. AMPHION has been applied to the domain of solar system kinematics through the development of a declarative domain theory, which includes an axiomatization of JPL's SPICELIB subroutine library. Testing over six months with planetary scientists indicates that AMPHION's interactive specification acquisition paradigm enables users to develop, modify, and reuse specifications at least an order of magnitude more rapidly than manual program development. Furthermore, AMPHION synthesizes one to two page programs consisting of calls to SPICELIB subroutines from these specifications in just a few minutes. Test results obtained by metering AMPHION's deductive program synthesis component are examined. AMPHION has been installed at JPL and is currently undergoing further refinement in preparation for distribution to hundreds of SPICELIB users worldwide. Current work to support end-user customization of AMPHION's specification acquisition subsystem is briefly discussed, as well as future work to enable domain-expert creation of new AMPHION applications through development of suitable domain theories.
An ontology based information system for the management of institutional repository's collections
NASA Astrophysics Data System (ADS)
Tsolakidis, A.; Kakoulidis, P.; Skourlas, C.
2015-02-01
In this paper we discuss a simple methodological approach to create and customize institutional repositories for the domain of technological education. The use of the open source software platform DSpace is proposed to build up the repository application and provide access to digital resources including research papers, dissertations, administrative documents, educational material, etc. The use of OWL ontologies is also proposed for indexing and accessing the various heterogeneous items stored in the repository. Customization and operation of a platform for the selection and use of terms or parts of similar existing OWL ontologies is also described. This platform could be based on the open source software Protégé, which supports OWL, is widely used, and also supports visualization, SPARQL, etc. The combined use of the OWL platform and the DSpace repository forms a basis for creating customized ontologies, accommodating the semantic metadata of items and facilitating searching.
19 CFR 115.14 - Meeting on program.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 19 Customs Duties 1 2012-04-01 2012-04-01 false Meeting on program. 115.14 Section 115.14 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY CARGO CONTAINER AND ROAD VEHICLE CERTIFICATION PURSUANT TO INTERNATIONAL CUSTOMS CONVENTIONS...
19 CFR 115.14 - Meeting on program.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 19 Customs Duties 1 2014-04-01 2014-04-01 false Meeting on program. 115.14 Section 115.14 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY CARGO CONTAINER AND ROAD VEHICLE CERTIFICATION PURSUANT TO INTERNATIONAL CUSTOMS CONVENTIONS...
19 CFR 115.14 - Meeting on program.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 19 Customs Duties 1 2013-04-01 2013-04-01 false Meeting on program. 115.14 Section 115.14 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY CARGO CONTAINER AND ROAD VEHICLE CERTIFICATION PURSUANT TO INTERNATIONAL CUSTOMS CONVENTIONS...
19 CFR 115.14 - Meeting on program.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 19 Customs Duties 1 2010-04-01 2010-04-01 false Meeting on program. 115.14 Section 115.14 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY CARGO CONTAINER AND ROAD VEHICLE CERTIFICATION PURSUANT TO INTERNATIONAL CUSTOMS CONVENTIONS...
19 CFR 115.14 - Meeting on program.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 19 Customs Duties 1 2011-04-01 2011-04-01 false Meeting on program. 115.14 Section 115.14 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY CARGO CONTAINER AND ROAD VEHICLE CERTIFICATION PURSUANT TO INTERNATIONAL CUSTOMS CONVENTIONS...
Customer Education: The Silent Revolution.
ERIC Educational Resources Information Center
Zemke, Ron
1985-01-01
Discusses the marketing value and strategic necessity of planned and promoted customer education. The article examines customer training by the manufacturer as a definite trend in the microcomputer industry. Elements of a good customer training program are described along with suggestions for starting such a program. (CT)
Customer-experienced rapid prototyping
NASA Astrophysics Data System (ADS)
Zhang, Lijuan; Zhang, Fu; Li, Anbo
2008-12-01
In order to describe accurately and comprehend quickly the complete GIS requirements, this article integrates the ideas of QFD (Quality Function Deployment) and UML (Unified Modeling Language), analyzes the deficiencies of the prototype development model, and proposes the idea of Customer-Experienced Rapid Prototyping (CE-RP), describing in detail the process and framework of the CE-RP from the perspective of the characteristics of Modern GIS. The CE-RP is mainly composed of Customer Tool-Sets (CTS), Developer Tool-Sets (DTS) and a Barrier-Free Semantic Interpreter (BF-SI), and is performed by two roles: customer and developer. The main purpose of the CE-RP is to produce unified and authorized requirements data models between customer and software developer.
An integrated database with system optimization and design features
NASA Technical Reports Server (NTRS)
Arabyan, A.; Nikravesh, P. E.; Vincent, T. L.
1992-01-01
A customized, mission-specific relational database package was developed to allow researchers working on the Mars oxygen manufacturing plant to enter physical description, engineering, and connectivity data through a uniform, graphical interface and to store the data in formats compatible with other software also developed as part of the project. These latter components include an optimization program to maximize or minimize various criteria as the system evolves into its final design; programs to simulate the behavior of various parts of the plant in Martian conditions; an animation program which, in different modes, provides visual feedback to designers and researchers about the location of and temperature distribution among components as well as heat, mass, and data flow through the plant as it operates in different scenarios; and a control program to investigate the stability and response of the system under different disturbance conditions. All components of the system are interconnected so that changes entered through one component are reflected in the others.
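The "physical description, engineering, and connectivity data" idea can be sketched as a tiny relational schema. The table layout, component names and stream labels below are invented; the original package and its schema are not described at this level of detail in the record.

```python
# Hypothetical sketch of storing plant components and their connectivity in SQLite.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE components (id TEXT PRIMARY KEY, kind TEXT, mass_kg REAL);
CREATE TABLE connections (src TEXT, dst TEXT, flow TEXT,
                          FOREIGN KEY (src) REFERENCES components(id),
                          FOREIGN KEY (dst) REFERENCES components(id));
""")
conn.executemany("INSERT INTO components VALUES (?, ?, ?)", [
    ("compressor", "rotating machinery", 12.0),
    ("zirconia_cell", "electrolysis", 8.5),
    ("o2_tank", "storage", 20.0),
])
conn.executemany("INSERT INTO connections VALUES (?, ?, ?)", [
    ("compressor", "zirconia_cell", "CO2"),
    ("zirconia_cell", "o2_tank", "O2"),
])

# Which components feed the oxygen tank, and with what stream?
for src, flow in conn.execute("SELECT src, flow FROM connections WHERE dst = 'o2_tank'"):
    print(f"{src} -> o2_tank ({flow})")
```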
DOE Office of Scientific and Technical Information (OSTI.GOV)
Conger, Robin L.; Spanner, Gary E.
2011-11-02
The businesses that have utilized PNNL's Technology Assistance Program were sent a survey to solicit feedback about the program and to determine what, if any, outcomes resulted from the assistance provided. As part of its small business outreach, Pacific Northwest National Laboratory (PNNL) offers technology assistance to businesses with fewer than 500 employees throughout the nation and to businesses of any size in the 2 counties that contain the Hanford site. Upon request, up to 40 staff-hours of a researcher's time can be provided to address technology issues at no charge to the requesting firm. During FY 2011, PNNL completed assistance for 54 firms. Topics of the technology assistance covered a broad range, including environment, energy, industrial processes, medical, materials, computers and software, and sensors. In FY 2011, PNNL's Technology Assistance Program (TAP) was funded by PNNL Overheads. Over the past 16 years, the Technology Assistance Program has received total funding of nearly $2.8 million from several federal and private sources.
Extreme Programming in a Research Environment
NASA Technical Reports Server (NTRS)
Wood, William A.; Kleb, William L.
2002-01-01
This article explores the applicability of Extreme Programming in a scientific research context. The cultural environment at a government research center differs from the customer-centric business view. The chief theoretical difficulty lies in defining the customer to developer relationship. Specifically, can Extreme Programming be utilized when the developer and customer are the same person? Eight of Extreme Programming's 12 practices are perceived to be incompatible with the existing research culture. Further, six of the nine 'environments that I know don't do well with XP' apply. A pilot project explores the use of Extreme Programming in scientific research. The applicability issues are addressed and it is concluded that Extreme Programming can function successfully in situations for which it appears to be ill-suited. A strong discipline for mentally separating the customer and developer roles is found to be key for applying Extreme Programming in a field that lacks a clear distinction between the customer and the developer.
Stellar Inertial Navigation Workstation
NASA Technical Reports Server (NTRS)
Johnson, W.; Johnson, B.; Swaminathan, N.
1989-01-01
Software and hardware assembled to support specific engineering activities. Stellar Inertial Navigation Workstation (SINW) is integrated computer workstation providing systems and engineering support functions for Space Shuttle guidance and navigation-system logistics, repair, and procurement activities. Consists of personal-computer hardware, packaged software, and custom software integrated together into user-friendly, menu-driven system. Designed to operate on IBM PC XT. Applied in business and industry to develop similar workstations.
Multimodal visualization interface for data management, self-learning and data presentation.
Van Sint Jan, S; Demondion, X; Clapworthy, G; Louryan, S; Rooze, M; Cotten, A; Viceconti, M
2006-10-01
A multimodal visualization software application, called the Data Manager (DM), has been developed to increase interdisciplinary communication around the topic of visualization and modeling of various aspects of the human anatomy. Numerous tools used in Radiology are integrated in the interface, which runs on standard personal computers. The available tools, combined with hierarchical data management and custom layouts, allow analysis of medical imaging data using advanced features outside radiological premises (for example, for patient review, conference presentation or tutorial preparation). The system is free and based on an open-source software development architecture, and therefore updates of the system for custom applications are possible.
Integrated Component-based Data Acquisition Systems for Aerospace Test Facilities
NASA Technical Reports Server (NTRS)
Ross, Richard W.
2001-01-01
The Multi-Instrument Integrated Data Acquisition System (MIIDAS), developed by the NASA Langley Research Center, uses commercial off-the-shelf (COTS) products, integrated with custom software, to provide a broad range of capabilities at a low cost throughout the system's entire life cycle. MIIDAS combines data acquisition capabilities with online and post-test data reduction computations. COTS products lower purchase and maintenance costs by reducing the level of effort required to meet system requirements. Object-oriented methods are used to enhance modularity, encourage reusability, and to promote adaptability, reducing software development costs. Using only COTS products and custom software supported on multiple platforms reduces the cost of porting the system to other platforms. The post-test data reduction capabilities of MIIDAS have been installed at four aerospace testing facilities at NASA Langley Research Center. The systems installed at these facilities provide a common user interface, reducing the training time required for personnel that work across multiple facilities. The techniques employed by MIIDAS enable NASA to build a system with a lower initial purchase price and reduced sustaining maintenance costs. With MIIDAS, NASA has built a highly flexible next generation data acquisition and reduction system for aerospace test facilities that meets customer expectations.
Research pressure instrumentation for NASA Space Shuttle main engine, modification no. 5
NASA Technical Reports Server (NTRS)
Anderson, P. J.; Nussbaum, P.; Gustafson, G.
1984-01-01
The objective of the research project described is to define and demonstrate methods to advance the state of the art of pressure sensors for the space shuttle main engine (SSME). Silicon piezoresistive technology was utilized in completing tasks: generation and testing of three transducer design concepts for solid state applications; silicon resistor characterization at cryogenic temperatures; experimental chip mounting characterization; frequency response optimization and prototype design and fabrication. Excellent silicon sensor performance was demonstrated at liquid nitrogen temperature. A silicon resistor ion implant dose was customized for SSME temperature requirements. A basic acoustic modeling software program was developed as a design tool to evaluate frequency response characteristics.
The Hyper Suprime-Cam software pipeline
NASA Astrophysics Data System (ADS)
Bosch, James; Armstrong, Robert; Bickerton, Steven; Furusawa, Hisanori; Ikeda, Hiroyuki; Koike, Michitaro; Lupton, Robert; Mineo, Sogo; Price, Paul; Takata, Tadafumi; Tanaka, Masayuki; Yasuda, Naoki; AlSayyad, Yusra; Becker, Andrew C.; Coulton, William; Coupon, Jean; Garmilla, Jose; Huang, Song; Krughoff, K. Simon; Lang, Dustin; Leauthaud, Alexie; Lim, Kian-Tat; Lust, Nate B.; MacArthur, Lauren A.; Mandelbaum, Rachel; Miyatake, Hironao; Miyazaki, Satoshi; Murata, Ryoma; More, Surhud; Okura, Yuki; Owen, Russell; Swinbank, John D.; Strauss, Michael A.; Yamada, Yoshihiko; Yamanoi, Hitomi
2018-01-01
In this paper, we describe the optical imaging data processing pipeline developed for the Subaru Telescope's Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope's Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.
Implementation of an OAIS Repository Using Free, Open Source Software
NASA Astrophysics Data System (ADS)
Flathers, E.; Gessler, P. E.; Seamon, E.
2015-12-01
The Northwest Knowledge Network (NKN) is a regional data repository located at the University of Idaho that focuses on the collection, curation, and distribution of research data. To support our home institution and others in the region, we offer services to researchers at all stages of the data lifecycle—from grant application and data management planning to data distribution and archive. In this role, we recognize the need to work closely with other data management efforts at partner institutions and agencies, as well as with larger aggregation efforts such as our state geospatial data clearinghouses, data.gov, DataONE, and others. In the past, one of our challenges with monolithic, prepackaged data management solutions is that customization can be difficult to implement and maintain, especially as new versions of the software are released that are incompatible with our local codebase. Our solution is to break the monolith up into its constituent parts, which offers us several advantages. First, any customizations that we make are likely to fall into areas that can be accessed through Application Program Interfaces (API) that are likely to remain stable over time, so our code stays compatible. Second, as components become obsolete or insufficient to meet new demands that arise, we can replace the individual components with minimal effect on the rest of the infrastructure, causing less disruption to operations. Other advantages include increased system reliability, staggered rollout of new features, enhanced compatibility with legacy systems, reduced dependence on a single software company as a point of failure, and the separation of development into manageable tasks. In this presentation, we describe our application of the Service Oriented Architecture (SOA) design paradigm to assemble a data repository that conforms to the Open Archival Information System (OAIS) Reference Model primarily using a collection of free and open-source software. We detail the design of the repository, based upon open standards to support interoperability with other institutions' systems and with future versions of our own software components. We also describe the implementation process, including our use of GitHub as a collaboration tool and code repository.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-29
... DEPARTMENT OF HOMELAND SECURITY U.S. Customs and Border Protection Modification of Two National Customs Automation Program (NCAP) Tests Concerning Automated Commercial Environment (ACE) Document Image System (DIS) and Simplified Entry (SE); Correction AGENCY: U.S. Customs and Border Protection, Department...
Front End Software for Online Database Searching. Part 2: The Marketplace.
ERIC Educational Resources Information Center
Levy, Louise R.; Hawkins, Donald T.
1986-01-01
This article analyzes the front end software marketplace and discusses some of the complex forces influencing it. Discussion covers intermediary market; end users (library customers, scientific and technical professionals, corporate business specialists, consumers); marketing strategies; a British front end development firm; competitive pressures;…
Accelerated Math[TM]. What Works Clearinghouse Intervention Report
ERIC Educational Resources Information Center
What Works Clearinghouse, 2011
2011-01-01
"Accelerated Math"[TM], published by Renaissance Learning, is a software tool used to customize assignments and monitor progress in math for students in grades 1-12. The "Accelerated Math"[TM] software creates individualized assignments aligned with state standards and national guidelines, scores student work, and generates…
NASA Technical Reports Server (NTRS)
1992-01-01
CBR Express software solves problems by adapting stored solutions to new problems specified by a user. It is applicable to a wide range of situations. The technology was originally developed by Inference Corporation for Johnson Space Center's Advanced Software Development Workstation. The project focused on the reuse of software designs, and Inference used CBR as part of the ACCESS prototype software. The commercial CBR Express is used as a "help desk" for customer support, enabling reuse of existing information when necessary. It has been adopted by several companies, among them American Airlines, which uses it to solve reservation system software problems.
Conklin, Emily E; Lee, Kathyann L; Schlabach, Sadie A; Woods, Ian G
2015-01-01
Differences in nervous system function can result in differences in behavioral output. Measurements of animal locomotion enable the quantification of these differences. Automated tracking of animal movement is less labor-intensive and bias-prone than direct observation, and allows for simultaneous analysis of multiple animals, high spatial and temporal resolution, and data collection over extended periods of time. Here, we present a new video-tracking system built on Python-based software that is free, open source, and cross-platform, and that can analyze video input from widely available video capture devices such as smartphone cameras and webcams. We validated this software through four tests on a variety of animal species, including larval and adult zebrafish (Danio rerio), Siberian dwarf hamsters (Phodopus sungorus), and wild birds. These tests highlight the capacity of our software for long-term data acquisition, parallel analysis of multiple animals, and application to animal species of different sizes and movement patterns. We applied the software to an analysis of the effects of ethanol on thigmotaxis (wall-hugging) behavior in adult zebrafish, and found that acute ethanol treatment decreased thigmotaxis behaviors without affecting overall amounts of motion. The open source nature of our software enables flexibility, customization, and scalability in behavioral analyses. Moreover, our system presents a free alternative to commercial video-tracking systems and is thus broadly applicable to a wide variety of educational settings and research programs.
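A hedged sketch of the general approach (frame-by-frame centroid tracking plus a simple thigmotaxis index), using OpenCV background subtraction rather than the authors' actual software; the video path, arena geometry, and wall-band fraction are assumptions for illustration.

# Hedged sketch: track an animal's centroid per frame and compute a simple
# thigmotaxis index (fraction of frames near the arena wall). Not the authors' code.
import cv2
import numpy as np

def track_centroids(video_path):
    cap = cv2.VideoCapture(video_path)
    subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)
    centroids = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)
        moments = cv2.moments(mask)
        if moments["m00"] > 0:
            centroids.append((moments["m10"] / moments["m00"],
                              moments["m01"] / moments["m00"]))
    cap.release()
    return np.array(centroids)

def thigmotaxis_index(centroids, arena_center, arena_radius, wall_band=0.2):
    """Fraction of frames spent within wall_band * radius of the wall."""
    distances = np.linalg.norm(centroids - np.asarray(arena_center), axis=1)
    return float(np.mean(distances > (1.0 - wall_band) * arena_radius))

# Example usage (assumes a video file exists at this hypothetical path):
# points = track_centroids("zebrafish_trial.mp4")
# print(thigmotaxis_index(points, arena_center=(320, 240), arena_radius=220))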
ArrayInitiative - a tool that simplifies creating custom Affymetrix CDFs
2011-01-01
Background Probes on a microarray represent a frozen view of a genome and are quickly outdated when new sequencing studies extend our knowledge, resulting in significant measurement error when analyzing any microarray experiment. There are several bioinformatics approaches to improve probe assignments, but without in-house programming expertise, standardizing these custom array specifications as a usable file (e.g. as Affymetrix CDFs) is difficult, owing mostly to the complexity of the specification file format. However, without correctly standardized files there is a significant barrier for testing competing analysis approaches since this file is one of the required inputs for many commonly used algorithms. The need to test combinations of probe assignments and analysis algorithms led us to develop ArrayInitiative, a tool for creating and managing custom array specifications. Results ArrayInitiative is a standalone, cross-platform, rich client desktop application for creating correctly formatted, custom versions of manufacturer-provided (default) array specifications, requiring only minimal knowledge of the array specification rules and file formats. Users can import default array specifications, import probe sequences for a default array specification, design and import a custom array specification, export any array specification to multiple output formats, export the probe sequences for any array specification and browse high-level information about the microarray, such as version and number of probes. The initial release of ArrayInitiative supports the Affymetrix 3' IVT expression arrays we currently analyze, but as an open source application, we hope that others will contribute modules for other platforms. Conclusions ArrayInitiative allows researchers to create new array specifications, in a standard format, based upon their own requirements. This makes it easier to test competing design and analysis strategies that depend on probe definitions. Since the custom array specifications are easily exported to the manufacturer's standard format, researchers can analyze these customized microarray experiments using established software tools, such as those available in Bioconductor. PMID:21548938
Is NIPARS Working as Advertised? An Analysis of NIPARS Program Customer Service
1992-09-01
Agile Development Methods for Space Operations
NASA Technical Reports Server (NTRS)
Trimble, Jay; Webster, Chris
2012-01-01
Mainstream industry software development practice has gone from a traditional waterfall process to agile iterative development that allows for fast response to customer inputs and produces higher quality software at lower cost. How can we, the space ops community, adopt state-of-the-art software development practice, achieve greater productivity at lower cost, and maintain safe and effective space flight operations? At NASA Ames, we are developing Mission Control Technologies Software, in collaboration with Johnson Space Center (JSC) and, more recently, the Jet Propulsion Laboratory (JPL).
Demonstrating High-Accuracy Orbital Access Using Open-Source Tools
NASA Technical Reports Server (NTRS)
Gilbertson, Christian; Welch, Bryan
2017-01-01
Orbit propagation is fundamental to almost every space-based analysis. Currently, many system analysts use commercial software to predict the future positions of orbiting satellites. This is one of many capabilities that can be replicated, with great accuracy, without using expensive, proprietary software. NASA's SCaN (Space Communication and Navigation) Center for Engineering, Networks, Integration, and Communications (SCENIC) project plans to provide its analysis capabilities using a combination of internal and open-source software, allowing for a much greater measure of customization and flexibility, while reducing recurring software license costs. MATLAB and the open-source Orbit Determination Toolbox created by Goddard Space Flight Center (GSFC) were utilized to develop tools with the capability to propagate orbits, perform line-of-sight (LOS) availability analyses, and visualize the results. The developed programs are modular and can be applied for mission planning and viability analysis in a variety of Solar System applications. The tools can perform 2- and N-body orbit propagation, find inter-satellite and satellite-to-ground-station LOS access (accounting for intermediate oblate spheroid body blocking, geometric restrictions of the antenna field-of-view (FOV), and relativistic corrections), and create animations of planetary movement, satellite orbits, and LOS accesses. The code is the basis for SCENIC's broad analysis capabilities, including dynamic link analysis, dilution-of-precision navigation analysis, and orbital availability calculations.
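As a simplified stand-in for the MATLAB/ODTBX tools described above, the following sketch shows the core idea of two-body orbit propagation with a fixed-step Runge-Kutta integrator; the initial state and step size are illustrative.

# Illustrative two-body propagation with a fixed-step RK4 integrator; a
# simplified sketch, not the SCENIC project's actual tools.
import numpy as np

MU_EARTH = 398600.4418  # Earth's gravitational parameter, km^3/s^2

def two_body(state):
    r, v = state[:3], state[3:]
    a = -MU_EARTH * r / np.linalg.norm(r) ** 3
    return np.concatenate([v, a])

def rk4_step(state, dt):
    k1 = two_body(state)
    k2 = two_body(state + 0.5 * dt * k1)
    k3 = two_body(state + 0.5 * dt * k2)
    k4 = two_body(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def propagate(state0, duration, dt=10.0):
    state = np.asarray(state0, dtype=float)
    for _ in range(int(duration / dt)):
        state = rk4_step(state, dt)
    return state

# Roughly circular low Earth orbit: radius 7000 km, speed ~sqrt(mu/r)
initial = [7000.0, 0.0, 0.0, 0.0, 7.546, 0.0]
print(propagate(initial, duration=5400.0))  # position/velocity after ~90 minutes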
3D OCT imaging in clinical settings: toward quantitative measurements of retinal structures
NASA Astrophysics Data System (ADS)
Zawadzki, Robert J.; Fuller, Alfred R.; Zhao, Mingtao; Wiley, David F.; Choi, Stacey S.; Bower, Bradley A.; Hamann, Bernd; Izatt, Joseph A.; Werner, John S.
2006-02-01
The acquisition speed of current FD-OCT (Fourier Domain - Optical Coherence Tomography) instruments allows rapid screening of three-dimensional (3D) volumes of human retinas in clinical settings. To take advantage of this ability requires software used by physicians to be capable of displaying and accessing volumetric data as well as supporting post processing in order to access important quantitative information such as thickness maps and segmented volumes. We describe our clinical FD-OCT system used to acquire 3D data from the human retina over the macula and optic nerve head. B-scans are registered to remove motion artifacts and post-processed with customized 3D visualization and analysis software. Our analysis software includes standard 3D visualization techniques along with a machine learning support vector machine (SVM) algorithm that allows a user to semi-automatically segment different retinal structures and layers. Our program makes possible measurements of the retinal layer thickness as well as volumes of structures of interest, despite the presence of noise and structural deformations associated with retinal pathology. Our software has been tested successfully in clinical settings for its efficacy in assessing 3D retinal structures in healthy as well as diseased cases. Our tool facilitates diagnosis and treatment monitoring of retinal diseases.
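A hedged illustration of the semi-automatic segmentation idea: a user labels a few voxels, simple per-voxel features are extracted, and an SVM predicts labels for the rest of the volume. This is a generic scikit-learn sketch on synthetic data, not the authors' implementation; the feature choices are assumptions.

# Hedged sketch of SVM-based semi-automatic segmentation on a synthetic volume.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
volume = rng.normal(0.0, 1.0, size=(20, 64, 64))   # synthetic "OCT" volume
volume[:, 20:30, :] += 3.0                          # a brighter synthetic layer

def voxel_features(vol, coords):
    """Features per voxel: intensity, depth index, local mean."""
    feats = []
    for z, y, x in coords:
        patch = vol[z, max(y - 1, 0):y + 2, max(x - 1, 0):x + 2]
        feats.append([vol[z, y, x], y, patch.mean()])
    return np.array(feats)

# A few user-labeled seed voxels (1 = target layer, 0 = background)
seeds = [(0, 25, 10), (5, 24, 30), (10, 26, 50), (0, 5, 10), (5, 50, 30), (10, 40, 50)]
labels = [1, 1, 1, 0, 0, 0]

clf = SVC(kernel="rbf", gamma="scale").fit(voxel_features(volume, seeds), labels)

# Classify every voxel in one B-scan; the labeled area is a crude thickness proxy.
coords = [(3, y, x) for y in range(64) for x in range(64)]
predicted = clf.predict(voxel_features(volume, coords)).reshape(64, 64)
print("segmented voxels in slice:", int(predicted.sum()))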
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldberg, Amelie; Hedman, Bruce; Taylor, Robert P.
Many states have implemented ratepayer-funded programs to acquire energy efficiency as a predictable and reliable resource for meeting existing and future energy demand. These programs have become a fixture in many U.S. electricity and natural gas markets as they help postpone or eliminate the need for expensive generation and transmission investments. Industrial energy efficiency (IEE) is an energy efficiency resource that is not only a low cost option for many of these efficiency programs, but offers productivity and competitive benefits to manufacturers as it reduces their energy costs. However, some industrial customers are less enthusiastic about participating in these programs. IEE ratepayer programs suffer low participation by industries across many states today despite a continual increase in energy efficiency program spending across all types of customers, and significant energy efficiency funds can often go unused for industrial customers. This paper provides four detailed case studies of companies that benefited from participation in their utility's energy efficiency program offerings and highlights the business value brought to them by participation in these programs. The paper is designed both for rate-payer efficiency program administrators interested in improving the attractiveness and effectiveness of industrial efficiency programs for their industrial customers and for industrial customers interested in maximizing the value of participating in efficiency programs.
BIRCH: a user-oriented, locally-customizable, bioinformatics system.
Fristensky, Brian
2007-02-09
Molecular biologists need sophisticated analytical tools which often demand extensive computational resources. While finding, installing, and using these tools can be challenging, pipelining data from one program to the next is particularly awkward, especially when using web-based programs. At the same time, system administrators tasked with maintaining these tools do not always appreciate the needs of research biologists. BIRCH (Biological Research Computing Hierarchy) is an organizational framework for delivering bioinformatics resources to a user group, scaling from a single lab to a large institution. The BIRCH core distribution includes many popular bioinformatics programs, unified within the GDE (Genetic Data Environment) graphic interface. Of equal importance, BIRCH provides the system administrator with tools that simplify the job of managing a multiuser bioinformatics system across different platforms and operating systems. These include tools for integrating locally-installed programs and databases into BIRCH, and for customizing the local BIRCH system to meet the needs of the user base. BIRCH can also act as a front end to provide a unified view of already-existing collections of bioinformatics software. Documentation for the BIRCH and locally-added programs is merged in a hierarchical set of web pages. In addition to manual pages for individual programs, BIRCH tutorials employ step by step examples, with screen shots and sample files, to illustrate both the important theoretical and practical considerations behind complex analytical tasks. BIRCH provides a versatile organizational framework for managing software and databases, and making these accessible to a user base. Because of its network-centric design, BIRCH makes it possible for any user to do any task from anywhere.
BIRCH: A user-oriented, locally-customizable, bioinformatics system
Fristensky, Brian
2007-01-01
Background Molecular biologists need sophisticated analytical tools which often demand extensive computational resources. While finding, installing, and using these tools can be challenging, pipelining data from one program to the next is particularly awkward, especially when using web-based programs. At the same time, system administrators tasked with maintaining these tools do not always appreciate the needs of research biologists. Results BIRCH (Biological Research Computing Hierarchy) is an organizational framework for delivering bioinformatics resources to a user group, scaling from a single lab to a large institution. The BIRCH core distribution includes many popular bioinformatics programs, unified within the GDE (Genetic Data Environment) graphic interface. Of equal importance, BIRCH provides the system administrator with tools that simplify the job of managing a multiuser bioinformatics system across different platforms and operating systems. These include tools for integrating locally-installed programs and databases into BIRCH, and for customizing the local BIRCH system to meet the needs of the user base. BIRCH can also act as a front end to provide a unified view of already-existing collections of bioinformatics software. Documentation for the BIRCH and locally-added programs is merged in a hierarchical set of web pages. In addition to manual pages for individual programs, BIRCH tutorials employ step by step examples, with screen shots and sample files, to illustrate both the important theoretical and practical considerations behind complex analytical tasks. Conclusion BIRCH provides a versatile organizational framework for managing software and databases, and making these accessible to a user base. Because of its network-centric design, BIRCH makes it possible for any user to do any task from anywhere. PMID:17291351
Real-time animation software for customized training to use motor prosthetic systems.
Davoodi, Rahman; Loeb, Gerald E
2012-03-01
Research on control of human movement and development of tools for restoration and rehabilitation of movement after spinal cord injury and amputation can benefit greatly from software tools for creating precisely timed animation sequences of human movement. Despite their ability to create sophisticated animation and high quality rendering, existing animation software are not adapted for application to neural prostheses and rehabilitation of human movement. We have developed a software tool known as MSMS (MusculoSkeletal Modeling Software) that can be used to develop models of human or prosthetic limbs and the objects with which they interact and to animate their movement using motion data from a variety of offline and online sources. The motion data can be read from a motion file containing synthesized motion data or recordings from a motion capture system. Alternatively, motion data can be streamed online from a real-time motion capture system, a physics-based simulation program, or any program that can produce real-time motion data. Further, animation sequences of daily life activities can be constructed using the intuitive user interface of Microsoft's PowerPoint software. The latter allows expert and nonexpert users alike to assemble primitive movements into a complex motion sequence with precise timing by simply arranging the order of the slides and editing their properties in PowerPoint. The resulting motion sequence can be played back in an open-loop manner for demonstration and training or in closed-loop virtual reality environments where the timing and speed of animation depends on user inputs. These versatile animation utilities can be used in any application that requires precisely timed animations but they are particularly suited for research and rehabilitation of movement disorders. MSMS's modeling and animation tools are routinely used in a number of research laboratories around the country to study the control of movement and to develop and test neural prostheses for patients with paralysis or amputations.
Cross Sectional Study of Agile Software Development Methods and Project Performance
ERIC Educational Resources Information Center
Lambert, Tracy
2011-01-01
Agile software development methods, characterized by delivering customer value via incremental and iterative time-boxed development processes, have moved into the mainstream of the Information Technology (IT) industry. However, despite a growing body of research which suggests that a predictive manufacturing approach, with big up-front…
Using Knowledge Management to Revise Software-Testing Processes
ERIC Educational Resources Information Center
Nogeste, Kersti; Walker, Derek H. T.
2006-01-01
Purpose: This paper aims to use a knowledge management (KM) approach to effectively revise a utility retailer's software testing process. This paper presents a case study of how the utility organisation's customer services IT production support group improved their test planning skills through applying the American Productivity and Quality Center…
Cognitive Tutor[R] Algebra I. What Works Clearinghouse Intervention Report
ERIC Educational Resources Information Center
What Works Clearinghouse, 2009
2009-01-01
The "Cognitive Tutor[R] Algebra I" curriculum, published by Carnegie Learning, is an approach that combines algebra textbooks with interactive software. The software is developed around an artificial intelligence model that identifies strengths and weaknesses in each individual student's mastery of mathematical concepts. It then customizes prompts…
PLATO[R] Achieve Now. What Works Clearinghouse Intervention Report
ERIC Educational Resources Information Center
What Works Clearinghouse, 2010
2010-01-01
"PLATO[R] Achieve Now" is a software-based curriculum for the elementary and middle school grades. Instructional content is delivered via the PlayStation Portable (PSP[R]) system, allowing students to access learning materials in various settings. Software-based assessments are used to customize individual instruction, allowing students…
DOT National Transportation Integrated Search
2015-04-01
The principal objectives and scope of this project were to provide a software tracking tool to improve : decision-making for highway safety. A literature search revealed that purchasing and customizing : existing software was not feasible and a new s...
ERIC Educational Resources Information Center
Ye, Lei; Recker, Mimi; Walker, Andrew; Leary, Heather; Yuan, Min
2015-01-01
This article reports results from a scale-up study of the impact of a software tool designed to support teachers in the digital learning era. This tool, the Curriculum Customization Service (CCS), enables teachers to access open educational resources from multiple providers, customize them for classroom instruction, and share them with other…
Customer Relationship Management: A Case Study from a Metropolitan Campus of a Regional University
ERIC Educational Resources Information Center
Pember, Edward R.; Owens, Alison; Yaghi, Shazhi
2014-01-01
This paper investigates the users and uses of a centralised customer relationship management (CRM) system at a regional Australian university to improve the understanding of the staff experience of interacting with this customised technology. How and why the software is used by a cross section of university departments is explored through…
NASA Technical Reports Server (NTRS)
Wilmot, Jonathan
2005-01-01
The contents include the following: High availability. Hardware is in harsh environment. Flight processor (constraints) very widely due to power and weight constraints. Software must be remotely modifiable and still operate while changes are being made. Many custom one of kind interfaces for one of a kind missions. Sustaining engineering. Price of failure is high, tens to hundreds of millions of dollars.
Migrating To The Cloud: Preparing The USMC CDET For MCEITS
2016-03-01
... Infrastructure as a Service (IaaS), Platform as a Service (PaaS), Software as a Service (SaaS), and Data as a Service (DaaS) (Takai, 2012). A closer examination of each... NIST described SaaS as a model of cloud computing where the service provider offers its customers fee-based access
Insurance coverage of customers induces dishonesty of sellers in markets for credence goods
Kerschbamer, Rudolf; Neururer, Daniel; Sutter, Matthias
2016-01-01
Honesty is a fundamental pillar for cooperation in human societies and thus for their economic welfare. However, humans do not always act in an honest way. Here, we examine how insurance coverage affects the degree of honesty in credence goods markets. Such markets are plagued by strong incentives for fraudulent behavior of sellers, resulting in estimated annual costs of billions of dollars to customers and the society as a whole. Prime examples of credence goods are all kinds of repair services, the provision of medical treatments, the sale of software programs, and the provision of taxi rides in unfamiliar cities. We examine in a natural field experiment how computer repair shops take advantage of customers’ insurance for repair costs. In a control treatment, the average repair price is about EUR 70, whereas the repair bill increases by more than 80% when the service provider is informed that an insurance would reimburse the bill. Our design allows decomposing the sources of this economically impressive difference, showing that it is mainly due to the overprovision of parts and overcharging of working time. A survey among repair shops shows that the higher bills are mainly ascribed to insured customers being less likely to be concerned about minimizing costs because a third party (the insurer) pays the bill. Overall, our results strongly suggest that insurance coverage greatly increases the extent of dishonesty in important sectors of the economy with potentially huge costs to customers and whole economies. PMID:27325784
2012-01-01
Background Gas chromatography–mass spectrometry (GC-MS) is a technique frequently used in targeted and non-targeted measurements of metabolites. Most existing software tools for processing of raw instrument GC-MS data tightly integrate data processing methods with a graphical user interface, facilitating interactive data processing. While interactive processing remains critically important in GC-MS applications, high-throughput studies increasingly dictate the need for command line tools, suitable for scripting of high-throughput, customized processing pipelines. Results PyMS comprises a library of functions for processing of instrument GC-MS data developed in Python. PyMS currently provides a complete set of GC-MS processing functions, including reading of standard data formats (ANDI-MS/NetCDF and JCAMP-DX), noise smoothing, baseline correction, peak detection, peak deconvolution, peak integration, and peak alignment by dynamic programming. A novel common ion single quantitation algorithm allows automated, accurate quantitation of GC-MS electron impact (EI) fragmentation spectra when a large number of experiments are being analyzed. PyMS implements parallel processing for by-row and by-column data processing tasks based on Message Passing Interface (MPI), allowing processing to scale on multiple CPUs in distributed computing environments. A set of specifically designed experiments was performed in-house and used to comparatively evaluate the performance of PyMS and three widely used software packages for GC-MS data processing (AMDIS, AnalyzerPro, and XCMS). Conclusions PyMS is a novel software package for the processing of raw GC-MS data, particularly suitable for scripting of customized processing pipelines and for data processing in batch mode. PyMS provides limited graphical capabilities and can be used both for routine data processing and interactive/exploratory data analysis. In real-life GC-MS data processing scenarios PyMS performs as well as or better than leading software packages. We demonstrate data processing scenarios simple to implement in PyMS, yet difficult to achieve with many conventional GC-MS data processing software packages. Automated sample processing and quantitation with PyMS can provide substantial time savings compared to more traditional interactive software systems that tightly integrate data processing with the graphical user interface. PMID:22647087
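A generic sketch of the kind of per-chromatogram steps named above (noise smoothing, baseline correction, peak detection, peak integration), written with numpy/scipy on synthetic data rather than PyMS's own API; all parameter values are illustrative.

# Generic GC-MS-style chromatogram processing sketch; not the PyMS API.
import numpy as np
from scipy.signal import savgol_filter, find_peaks

rng = np.random.default_rng(2)
time = np.arange(0, 600, 0.5)                      # retention time (s)
signal = (rng.normal(0, 5, time.size)              # detector noise
          + 0.05 * time                            # slow baseline drift
          + 400 * np.exp(-0.5 * ((time - 200) / 3) ** 2)
          + 250 * np.exp(-0.5 * ((time - 420) / 4) ** 2))

smoothed = savgol_filter(signal, window_length=11, polyorder=3)   # noise smoothing
baseline = np.percentile(smoothed, 10)                            # crude baseline estimate
corrected = smoothed - baseline                                   # baseline correction
peaks, props = find_peaks(corrected, height=50, prominence=30)    # peak detection

for idx in peaks:
    # Trapezoidal integration over a fixed window around each apex
    lo, hi = max(idx - 20, 0), min(idx + 20, corrected.size - 1)
    area = np.trapz(corrected[lo:hi], time[lo:hi])
    print(f"peak at {time[idx]:.1f} s, area ~ {area:.0f}")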
Validation of a Custom-made Software for DQE Assessment in Mammography Digital Detectors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ayala-Dominguez, L.; Perez-Ponce, H.; Brandan, M. E.
2010-12-07
This work presents the validation of custom-made software, designed and developed in Matlab, intended for routine evaluation of the detective quantum efficiency (DQE) according to algorithms described in the IEC 62220-1-2 standard. DQE, the normalized noise power spectrum (NNPS), and the pre-sampling modulation transfer function (MTF) were calculated from RAW images from a GE Senographe DS (FineView disabled) and a Siemens Novation system. The calculated MTF is in close agreement with results obtained with alternative codes: MTF_tool (Maidment), an ImageJ plug-in (Perez-Ponce), and MIQuaELa (Ayala). Overall agreement better than approximately 90% was found in the MTF; the largest differences were observed at frequencies close to the Nyquist limit. For the measurement of NNPS and DQE, agreement is similar to that obtained in the MTF. These results suggest that the developed software can be used with confidence for image quality assessment.
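For reference, the relationship among the three quantities named above, in the form commonly used in IEC 62220-1-style evaluations (here q denotes the incident photon fluence per unit area); this is the standard textbook relation, not a formula quoted from the paper.

\[
  \mathrm{DQE}(f) \;=\; \frac{\mathrm{MTF}^{2}(f)}{q \cdot \mathrm{NNPS}(f)}
\]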
Hardware for dynamic quantum computing.
Ryan, Colm A; Johnson, Blake R; Ristè, Diego; Donovan, Brian; Ohki, Thomas A
2017-10-01
We describe the hardware, gateware, and software developed at Raytheon BBN Technologies for dynamic quantum information processing experiments on superconducting qubits. In dynamic experiments, real-time qubit state information is fed back or fed forward within a fraction of the qubits' coherence time to dynamically change the implemented sequence. The hardware presented here covers both control and readout of superconducting qubits. For readout, we created a custom signal processing gateware and software stack on commercial hardware to convert pulses in a heterodyne receiver into qubit state assignments with minimal latency, alongside data taking capability. For control, we developed custom hardware with gateware and software for pulse sequencing and steering information distribution that is capable of arbitrary control flow in a fraction of superconducting qubit coherence times. Both readout and control platforms make extensive use of field programmable gate arrays to enable tailored qubit control systems in a reconfigurable fabric suitable for iterative development.
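A software sketch of the readout signal path that the abstract says is implemented in gateware: demodulate a heterodyne record against a digital reference, integrate, and threshold the result into a qubit state assignment. This is a NumPy illustration under assumed frequencies and thresholds, not the Raytheon BBN gateware or its API.

# Sketch of heterodyne demodulation, integration, and thresholding into a
# qubit state assignment; all parameters are illustrative.
import numpy as np

def assign_state(record, sample_rate_hz, if_freq_hz, threshold=0.0):
    t = np.arange(record.size) / sample_rate_hz
    reference = np.exp(-2j * np.pi * if_freq_hz * t)   # digital local oscillator
    iq = np.mean(record * reference)                   # demodulate and integrate
    return 1 if iq.real > threshold else 0             # threshold into 0/1

rng = np.random.default_rng(3)
fs, f_if, n = 1e9, 50e6, 1024
t = np.arange(n) / fs
ground = -0.2 * np.cos(2 * np.pi * f_if * t) + rng.normal(0, 0.05, n)
excited = 0.2 * np.cos(2 * np.pi * f_if * t) + rng.normal(0, 0.05, n)
print(assign_state(ground, fs, f_if), assign_state(excited, fs, f_if))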
Experience Transitioning Models and Data at the NOAA Space Weather Prediction Center
NASA Astrophysics Data System (ADS)
Berger, Thomas
2016-07-01
The NOAA Space Weather Prediction Center has a long history of transitioning research data and models into operations and with the validation activities required. The first stage in this process involves demonstrating that the capability has sufficient value to customers to justify the cost needed to transition it and to run it continuously and reliably in operations. Once the overall value is demonstrated, a substantial effort is then required to develop the operational software from the research codes. The next stage is to implement and test the software and product generation on the operational computers. Finally, effort must be devoted to establishing long-term measures of performance, maintaining the software, and working with forecasters, customers, and researchers to improve over time the operational capabilities. This multi-stage process of identifying, transitioning, and improving operational space weather capabilities will be discussed using recent examples. Plans for future activities will also be described.
Customizing graphical user interface technology for spacecraft control centers
NASA Technical Reports Server (NTRS)
Beach, Edward; Giancola, Peter; Gibson, Steven; Mahmot, Ronald
1993-01-01
The Transportable Payload Operations Control Center (TPOCC) project is applying the latest in graphical user interface technology to the spacecraft control center environment. This project of the Mission Operations Division's (MOD) Control Center Systems Branch (CCSB) at NASA Goddard Space Flight Center (GSFC) has developed an architecture for control centers which makes use of a distributed processing approach and the latest in Unix workstation technology. The TPOCC project is committed to following industry standards and using commercial off-the-shelf (COTS) hardware and software components wherever possible to reduce development costs and to improve operational support. TPOCC's most successful use of commercial software products and standards has been in the development of its graphical user interface. This paper describes TPOCC's successful use and customization of four separate layers of commercial software products to create a flexible and powerful user interface that is uniquely suited to spacecraft monitoring and control.
NASA Astrophysics Data System (ADS)
Tavakkoli-Moghaddam, Reza; Forouzanfar, Fateme; Ebrahimnejad, Sadoullah
2013-07-01
This paper considers a single-sourcing network design problem for a three-level supply chain. For the first time, a novel mathematical model is presented that simultaneously considers risk pooling, inventory held at distribution centers (DCs) under demand uncertainty, the existence of several alternatives for transporting the product between facilities, and routing of vehicles from distribution centers to customers in a stochastic supply chain system. This problem is formulated as a bi-objective stochastic mixed-integer nonlinear programming model. The aim of this model is to determine the number of located distribution centers, their locations and capacity levels, and to allocate customers to distribution centers and distribution centers to suppliers. It also determines the inventory control decisions on the amount of ordered products and the amount of safety stock at each opened DC, and selects a type of vehicle for transportation. Moreover, it determines routing decisions, such as the determination of vehicles' routes starting from an opened distribution center to serve its allocated customers and returning to that distribution center. All of these decisions are made such that the total system cost and the total transportation time are minimized. The Lingo software is used to solve the presented model. The computational results are illustrated in this paper.
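A heavily simplified sketch of the bi-objective structure only; the paper's full stochastic MINLP contains many additional terms (risk pooling, safety stock, vehicle routing), and all symbols below are illustrative rather than the authors' notation.

\begin{align*}
  \min_{x,\,y}\quad & Z_{1} = \sum_{j} f_{j}\, y_{j} \;+\; \sum_{i}\sum_{j} c_{ij}\, x_{ij}
      && \text{(DC opening cost + assignment/transport cost)}\\
  \min_{x,\,y}\quad & Z_{2} = \sum_{i}\sum_{j} t_{ij}\, x_{ij}
      && \text{(total transportation time)}\\
  \text{s.t.}\quad  & \sum_{j} x_{ij} = 1 \;\; \forall i,
      \qquad x_{ij} \le y_{j} \;\; \forall i,j,
      \qquad x_{ij},\, y_{j} \in \{0,1\}.
\end{align*}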
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kramer, Chris; Fadrhonc, Emily Martin; Goldman, Charles
Utility customer-supported financing programs are receiving increased attention as a strategy for achieving energy saving goals. Rationales for using utility customer funds to support financing initiatives
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-04
... Priority Customer Rebate Program (the ``Program'') until November 30, 2013.\\3\\ The Program currently... Customer \\6\\ order transmitted by that Member which is executed on the Exchange in all multiply-listed... thresholds are calculated based on the customer average daily volume over the course of the month. Volume...
Software Design Improvements. Part 2; Software Quality and the Design and Inspection Process
NASA Technical Reports Server (NTRS)
Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom
1997-01-01
The application of assurance engineering techniques improves the duration of failure-free performance of software. The totality of features and characteristics of a software product is what determines its ability to satisfy customer needs. Software in safety-critical systems is very important to NASA. We follow the System Safety Working Group's definition of system safety software: 'The optimization of system safety in the design, development, use and maintenance of software and its integration with safety-critical systems in an operational environment.' 'If it is not safe, say so' has become our motto. This paper goes over methods that have been used by NASA to make software design improvements by focusing on software quality and the design and inspection process.
[Effects of a Customized Birth Control Program for Married Immigrant Postpartum Mothers].
Kim, So Young; Choi, So Young
2016-12-01
This study was conducted to develop a customized birth control program and identify its effects on attitude, subjective norm, behavioral control, intention, and behavior of contraception among immigrant postpartum mothers. In this experimental study, Vietnamese, Filipino or Cambodian married immigrant postpartum mothers were recruited. They were assigned to the experimental group (n=21) or the control group (n=21). The customized birth control program was provided to the experimental group for 4 weeks. The experimental group showed a significant increase in the scores for attitude, subjective norm, behavioral control, intention, and behavior of contraception. Findings in this study indicate that the customized postpartum birth control program, a systematic and integrative intervention program composed of customized health education, counseling and telephone monitoring, is able to provide effective planning for postpartum health promotion and birth control behavior practice in married immigrant women.
The effect of requirements prioritization on avionics system conceptual design
NASA Astrophysics Data System (ADS)
Lorentz, John
This dissertation will provide a detailed approach and analysis of a new collaborative requirements prioritization methodology that has been used successfully on four Coast Guard avionics acquisition and development programs valued at $400M+. A statistical representation of participant study results will be discussed and analyzed in detail. Many technically compliant projects fail to deliver levels of performance and capability that the customer desires. Some of these systems completely meet "threshold" levels of performance; however, the distribution of resources in the process devoted to the development and management of the requirements does not always represent the voice of the customer. This is especially true for technically complex projects such as modern avionics systems. A simplified facilitated process for prioritization of system requirements will be described. The collaborative prioritization process, and resulting artifacts, aids the systems engineer during early conceptual design. All requirements are not the same in terms of customer priority. While there is a tendency to have many thresholds inside of a system design, there is usually a subset of requirements and system performance that is of the utmost importance to the design. These critical capabilities and critical levels of performance typically represent the reason the system is being built. The systems engineer needs processes to identify these critical capabilities, the associated desired levels of performance, and the risks associated with the specific requirements that define the critical capability. The facilitated prioritization exercise is designed to collaboratively draw out these critical capabilities and levels of performance so they can be emphasized in system design. Developing the purpose, scheduling and process for prioritization events are key elements of systems engineering and modern project management. The benefits of early collaborative prioritization flow throughout the project schedule, resulting in greater success during system deployment and operational testing. This dissertation will discuss the data and findings from participant studies, present a literature review of systems engineering and design processes, and test the hypothesis that the prioritization process had no effect on stakeholder sentiment related to the conceptual design. In addition, the "Requirements Rationalization" process will be discussed in detail. Avionics, like many other systems, has transitioned from a discrete electronics engineering, hard engineering discipline to incorporate software engineering as a core process of the technology development cycle. As with other software-based systems, avionics now has significant soft system attributes that must be considered in the design process. The boundless opportunities that exist in software design demand prioritization to focus effort onto the critical functions that the software must provide. This has been a well documented and understood phenomenon in the software development community for many years. This dissertation will attempt to link the effect of software integrated avionics to the benefits of prioritization of requirements in the problem space and demonstrate the sociological and technical benefits of early prioritization practices.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldman, C.; Hopper, N.; Sezgen, O.
2004-07-01
There is growing interest in policies, programs and tariffs that encourage customer loads to provide demand response (DR) to help discipline wholesale electricity markets. Proposals at the retail level range from eliminating fixed rate tariffs as the default service for some or all customer groups to reinstituting utility-sponsored load management programs with market-based inducements to curtail. Alternative rate designs include time-of-use (TOU), day-ahead real-time pricing (RTP), critical peak pricing, and even pricing usage at real-time market balancing prices. Some Independent System Operators (ISOs) have implemented their own DR programs whereby load curtailment capabilities are treated as a system resource and are paid an equivalent value. The resulting load reductions from these tariffs and programs provide a variety of benefits, including limiting the ability of suppliers to increase spot and long-term market-clearing prices above competitive levels (Neenan et al., 2002; Borenstein, 2002; Ruff, 2002). Unfortunately, there is little information in the public domain to characterize and quantify how customers actually respond to these alternative dynamic pricing schemes. A few empirical studies of large customer RTP response have shown modest results for most customers, with a few very price-responsive customers providing most of the aggregate response (Herriges et al., 1993; Schwarz et al., 2002). However, these studies examined response to voluntary, two-part RTP programs implemented by utilities in states without retail competition. Furthermore, the researchers had limited information on customer characteristics so they were unable to identify the drivers to price response. In the absence of a compelling characterization of why customers join RTP programs and how they respond to prices, many initiatives to modernize retail electricity rates seem to be stymied.
Using Selection Pressure as an Asset to Develop Reusable, Adaptable Software Systems
NASA Astrophysics Data System (ADS)
Berrick, S. W.; Lynnes, C.
2007-12-01
The Goddard Earth Sciences Data and Information Services Center (GES DISC) at NASA has over the years developed and honed a number of reusable architectural components for supporting large-scale data centers with a large customer base. These include a processing system (S4PM) and an archive system (S4PA) based upon a workflow engine called the Simple, Scalable, Script-based Science Processor (S4P); an online data visualization and analysis system (Giovanni); and the radically simple and fast data search tool, Mirador. These subsystems are currently reused internally in a variety of combinations to implement customized data management on behalf of instrument science teams and other science investigators. Some of these subsystems (S4P and S4PM) have also been reused by other data centers for operational science processing. Our experience has been that development and utilization of robust, interoperable, and reusable software systems can actually flourish in environments defined by heterogeneous commodity hardware systems, the emphasis on value-added customer service, and continual cost reduction pressures. The repeated internal reuse that is fostered by such an environment encourages and even forces changes to the software that make it more reusable and adaptable. Allowing and even encouraging such selective pressures to software development has been a key factor in the success of S4P and S4PM, which are now available to the open source community under the NASA Open Source Agreement.
Real time data acquisition of a countrywide commercial microwave link network
NASA Astrophysics Data System (ADS)
Chwala, Christian; Keis, Felix; Kunstmann, Harald
2015-04-01
Research in recent years has shown that data from commercial microwave link networks can provide very valuable precipitation information. Since these networks comprise the backbone of the cell phone network, they provide countrywide coverage. However, acquiring the necessary data from the network operators is still difficult. Data is usually made available to researchers with a large time delay and often on an irregular basis. This, of course, hinders the exploitation of commercial microwave link data in operational applications like QPE forecasts running at national meteorological services. To overcome this, we have developed custom software in joint cooperation with our industry partner Ericsson. The software is installed on a dedicated server at Ericsson and is capable of acquiring data from the countrywide microwave link network in Germany. In its current first operational testing phase, data from several hundred microwave links in southern Germany is recorded. All data is instantaneously sent to our server where it is stored and organized in an emerging database. Time resolution for the Ericsson data is one minute. The custom acquisition software, however, is capable of processing higher sampling rates. Additionally, we acquire and manage 1 Hz data from four microwave links operated by the skiing resort in Garmisch-Partenkirchen. We will present the concept of the data acquisition and show details of the custom-built software. Additionally, we will showcase the accessibility and basic processing of real time microwave link data via our database web frontend.
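The physical link between microwave attenuation and rainfall that makes these data valuable is the standard power law A = a R^b used throughout the commercial-microwave-link literature; the coefficients depend on frequency and polarization, and the values in this small sketch are illustrative only, not the project's retrieval code.

# Illustrative inversion of the k-R power law A = a * R**b for path-averaged rain rate.
def rain_rate_from_attenuation(path_attenuation_db, path_length_km, a=0.12, b=1.05):
    """Return the path-averaged rain rate R in mm/h from rain-induced attenuation."""
    specific_attenuation = path_attenuation_db / path_length_km   # dB/km
    if specific_attenuation <= 0:
        return 0.0
    return (specific_attenuation / a) ** (1.0 / b)

# Example: 3 dB of rain-induced attenuation over a 5 km link
print(round(rain_rate_from_attenuation(3.0, 5.0), 1), "mm/h")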
Systematic Propulsion Optimization Tools (SPOT)
NASA Technical Reports Server (NTRS)
Bower, Mark; Celestian, John
1992-01-01
This paper describes a computer program written by senior-level Mechanical Engineering students at the University of Alabama in Huntsville which is capable of optimizing user-defined delivery systems for carrying payloads into orbit. The custom propulsion system is designed by the user through the input of configuration, payload, and orbital parameters. The primary advantages of the software, called Systematic Propulsion Optimization Tools (SPOT), are a user-friendly interface and a modular FORTRAN 77 code designed for ease of modification. The optimization of variables in an orbital delivery system is of critical concern in the propulsion environment. The mass of the overall system must be minimized within the maximum stress, force, and pressure constraints. SPOT utilizes the Design Optimization Tools (DOT) program for the optimization techniques. The SPOT program is divided into a main program and five modules: aerodynamic losses, orbital parameters, liquid engines, solid engines, and nozzles. The program is designed to be upgraded easily and expanded to meet specific user needs. A user's manual and a programmer's manual are currently being developed to facilitate implementation and modification.
Computing Q-D Relationships for Storage of Rocket Fuels
NASA Technical Reports Server (NTRS)
Jester, Keith
2005-01-01
The Quantity Distance Measurement Tool is a GIS-based computer program that aids safety engineers by calculating quantity-distance (Q-D) relationships for vessels that contain explosive chemicals used in testing rocket engines. (Q-D relationships are standard relationships between specified quantities of specified explosive materials and minimum distances by which they must be separated from persons, objects, and other explosives to obtain specified types and degrees of protection.) The program uses customized geographic-information-system (GIS) software and calculates Q-D relationships in accordance with NASA's Safety Standard For Explosives, Propellants, and Pyrotechnics. Displays generated by the program enable the identification of hazards, showing the relationships of propellant-storage-vessel safety buffers to inhabited facilities and public roads. Current Q-D information is calculated and maintained in graphical form for all vessels that contain propellants or other chemicals, the explosiveness of which is expressed in TNT equivalents [amounts of trinitrotoluene (TNT) having equivalent explosive effects]. The program is useful in the acquisition, siting, construction, and/or modification of storage vessels and other facilities in the development of an improved test-facility safety program.
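For context, quantity-distance siting calculations of this kind are generally based on the cube-root scaling law below, where W is the net explosive weight in TNT equivalents and K is a factor set by the type and degree of protection required; this is the standard relation, offered here for illustration rather than quoted from the program's documentation.

\[
  D \;=\; K \, W^{1/3}
\]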
Hammel, J M; Van Der Loos, H F; Lepage, P; Burgar, C; Perkash, I; Shafer, D; Topp, E; Lees, D
1994-01-01
This paper describes the results of the program-development phase of the Vocational Training Facility (VTF) taking place at the Palo Alto Veterans Affairs Medical Center Rehabilitation Research and Development Center. The VTF staff has developed a self-paced, multimedia curriculum comprised of adapted training packages, interactive videos, and additional training and testing materials designed to teach entry-level desktop publishing and reasonable accommodation skills to individuals with spinal cord injuries. The curriculum is taught via the Macintosh™ computer to allow independent, "hands-off" access to training materials. Each student is given an integrated workstation that is equipped with the Desktop Vocational Assistant Robot (De VAR); a set of low-and high-technology assistive hardware, software, and devices; and ergonomic furniture and adaptations customized to fit individual learning and access needs. Each student completes a 12-week, full-time training program followed by a 3-month internship with a local corporate sponsor. This paper summarizes the evaluation results of the VTF program by the first nine students, with spinal cord injuries ranging paraplegia to high-level quadriplegia, who have completed the program.
NASA Technical Reports Server (NTRS)
Sproles, Darrell W.; Bavuso, Salvatore J.
1994-01-01
The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide host of highly reliable fault-tolerant system architectures and is also applicable to electronic systems in general. The tool system was designed at the outset to be compatible with most computing platforms and operating systems and some programs have been beta tested within the aerospace community for over 8 years. This document is a user's guide for the HiRel graphical postprocessor program HARPO (HARP Output). HARPO reads ASCII files generated by HARP. It provides an interactive plotting capability that can be used to display alternate model data for trade-off analyses. File data can also be imported to other commercial software programs.
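An illustrative overlay plot of reliability-model outputs for the kind of trade-off comparison described above. This sketch assumes a simple two-column (time, unreliability) text format as a stand-in; it is not HARP's actual output layout or HARPO's code.

# Overlay two reliability-model result files for a trade-off comparison.
# Synthetic files are written first so the example is self-contained.
import numpy as np
import matplotlib.pyplot as plt

t = np.linspace(1, 1000, 50)
np.savetxt("model_a.txt", np.column_stack([t, 1 - np.exp(-1e-5 * t)]))
np.savetxt("model_b.txt", np.column_stack([t, 1 - np.exp(-4e-6 * t)]))

def load_model(path):
    data = np.loadtxt(path)          # columns: mission time, unreliability
    return data[:, 0], data[:, 1]

fig, ax = plt.subplots()
for label, path in [("baseline", "model_a.txt"), ("alternate", "model_b.txt")]:
    times, unrel = load_model(path)
    ax.plot(times, unrel, label=label)
ax.set_xlabel("Mission time (h)")
ax.set_ylabel("Unreliability")
ax.set_yscale("log")
ax.legend()
plt.show()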
ERIC Educational Resources Information Center
Brewer, Julie; And Others
1995-01-01
Presents three articles that discuss customer service in libraries, with a focus on planning for service management, a customer service program for library staff, and a quality improvement process. Highlights include developing and implementing service strategies, dealing with requests, redefining work relationships, coworkers as customers,…
NASA Astrophysics Data System (ADS)
Cinquini, L.; Bell, G. M.; Williams, D.; Harney, J.
2012-12-01
The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing state-of-the-art services for the management and access of Earth system data. ESGF is currently used to serve the totality of the model output used for the forthcoming IPCC 5th assessment report on climate change, as well as supporting observational and reanalysis datasets. Also, it has been adopted by several other projects that focus on global, regional and local climate modeling. The ESGF software stack is composed of several modular applications that cover related but disjoint areas of functionality: data publishing, data search and discovery, data access, user management, security, and federation. Overall, the ESGF infrastructure offers a configurable end-to-end solution to the problem of enabling web-based access to large amounts of geospatial data. This talk will present the architectural and configuration options that are available to a data provider leveraging ESGF to serve their data: which services to expose, how to scale to larger data collections, how to establish access control, how to customize the user interface, and others. Additionally, the framework provides extension points that allow each site to plug in custom functionality such as crawling of specific metadata repositories, exposing domain-specific analysis and visualization services, and developing custom access clients that interact with the system APIs. These configuration and extension capabilities are based on simple but effective domain-specific object models that underpin the software applications: the data model, the security model, and the federation model. The ESGF software stack is developed collaboratively by software engineers at many institutions around the world, and is made freely available to the community under an open source license to promote adoption, reuse, inspection and continuous improvement.
Information technologies in optimization process of monitoring of software and hardware status
NASA Astrophysics Data System (ADS)
Nikitin, P. V.; Savinov, A. N.; Bazhenov, R. I.; Ryabov, I. V.
2018-05-01
The article describes a model of a hardware and software monitoring system for a large company that provides customers with software as a service (a SaaS solution). The main functions of the monitoring system are: providing up-to-date data for analyzing the state of the IT infrastructure, rapid detection of faults, and their effective elimination. The main risks associated with the provision of these services are described, comparative characteristics of the software are given, and the authors' methods of monitoring the status of software and hardware are proposed.
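A minimal sketch of the kind of check such a monitoring system runs: poll a service endpoint, record its status and latency, and flag a fault quickly. The endpoint URL, polling interval, and alert handling here are hypothetical, not the system described in the article.

# Minimal poll-and-alert health check; names and thresholds are illustrative.
import time
import urllib.request
import urllib.error

def check_service(url: str, timeout_s: float = 5.0) -> dict:
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout_s) as response:
            status = response.status
    except (urllib.error.URLError, OSError):
        status = None
    return {"url": url, "status": status,
            "latency_s": round(time.monotonic() - start, 3),
            "healthy": status == 200}

def poll(urls, interval_s=60, cycles=1):
    for _ in range(cycles):
        for result in map(check_service, urls):
            if not result["healthy"]:
                print("ALERT:", result)   # hand off to notification/escalation here
            else:
                print("ok:", result)
        time.sleep(interval_s)

poll(["https://example.com/health"], interval_s=1, cycles=1)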
Gheewala, Pankti A; Peterson, Gregory M; Zaidi, Syed Tabish R; Jose, Matthew D; Castelino, Ronald L
2018-06-14
Community pharmacists are well positioned to deliver chronic kidney disease (CKD) screening services. However, little is known about the challenges faced by pharmacists during service implementation. This study aimed to explore community pharmacists' experiences and perceived barriers of implementing a CKD risk assessment service. Data collection was performed by using semistructured, open-ended interview questions. Pharmacists who had implemented a CKD screening service in Tasmania, Australia, were eligible to participate. A purposeful sampling strategy was used to select pharmacists, with variation in demographics and pharmacy location. A conventional content analysis approach was used to conduct the qualitative study. Transcripts were thematically analyzed by using the NVivo 11 software program. Initially, a list of free nodes was generated and data were coded exhaustively into relevant nodes. These nodes were then regrouped to form highly conceptualized themes. Five broad themes emerged from the analysis: contextual fit within community pharmacy; perceived scope of pharmacy practice; customer perception toward disease prevention; CKD - an underestimated disease; and remuneration for a beneficial service. Pharmacists found the CKD service efficient, user-friendly, and of substantial benefit to their customers. However, several pharmacists observed that their customers lacked interest in disease prevention, and had limited understanding of CKD. More importantly, pharmacists perceived the scope of pharmacy practice to depend substantially on interprofessional collaboration between pharmacists and general practitioners, and customer acknowledgment of pharmacists' role in disease prevention. Community pharmacists perceived the CKD service to be worth incorporating into pharmacy practice. To increase uptake, future CKD services should aim to improve customer awareness about CKD before providing risk assessment. Further research investigating strategies to enhance general practitioner involvement in pharmacist-initiated disease prevention services is also needed.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-15
... Effectiveness of Proposed Rule Change To Adopt a Priority Customer Rebate Program July 9, 2013. Pursuant to... of the Proposed Rule Change The Exchange is filing a proposal to adopt a Priority Customer Rebate... Priority Customer Rebate Program (the ``Program'') for the period beginning July 1, 2013 and ending...
Integration of Computer Technology Into an Introductory-Level Neuroscience Laboratory
ERIC Educational Resources Information Center
Evert, Denise L.; Goodwin, Gregory; Stavnezer, Amy Jo
2005-01-01
We describe 3 computer-based neuroscience laboratories. In the first 2 labs, we used commercially available interactive software to enhance the study of functional and comparative neuroanatomy and neurophysiology. In the remaining lab, we used customized software and hardware in 2 psychophysiological experiments. With the use of the computer-based…
Selling Software: How Vendors Manipulate Research and Cheat Students
ERIC Educational Resources Information Center
Oppenheimer, Todd
2007-01-01
Educational software makers are often rebuffed by educational authorities, whose endorsements could lead to governmental stamps of approval, and thus explosive sales. But they usually get warmer receptions in the offices of the nation's school superintendents, who are, after all, their primary customers. The system was not supposed to work this…
Source Code Analysis Laboratory (SCALe) for Energy Delivery Systems
2010-12-01
the software for reevaluation. Once the reevaluation process is completed, CERT provides the client a report detailing the software's conformance... True-positive to flagged-nonconformity (TP/FNC) ratios reported for example software systems include: Mozilla Firefox version 2.0, 6/12 (50%); Linux kernel version 2.6.15, 10/126 (8%); Wine... inappropriately tuned for analysis of the Linux kernel, which has anomalous results. Customizing SCALe to work with energy system software will help
Comparing Acquisition Strategies: Open Architecture versus Product Lines
2010-04-30
software • New SOW language for accepting software deliveries – Enables third-party reuse • Additional SOW language regarding conducting software code walkthroughs and for using integrated development environments ...change the business environment must be the primary factor that drives the technical approach. Accordingly, there are business case decisions to be...elements of a system design should be made available to the customer to observe throughout the design process. Electronic access to the design environment
Technology Transfer Challenges for High-Assurance Software Engineering Tools
NASA Technical Reports Server (NTRS)
Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.
2003-01-01
In this paper, we describe our experience with the challenges that we are currently facing in our effort to develop advanced software verification and validation tools. We categorize these challenges into several areas: cost benefits modeling, tool usability, customer application domain, and organizational issues. We provide examples of challenges in each area and identify open research issues in areas which limit our ability to transfer high-assurance software engineering tools into practice.
A customer-friendly Space Station
NASA Technical Reports Server (NTRS)
Pivirotto, D. S.
1984-01-01
This paper discusses the relationship of customers to the Space Station Program currently being defined by NASA. Emphasis is on definition of the Program such that the Space Station will be conducive to use by customers, that is, by people who utilize the services provided by the Space Station and its associated platforms and vehicles. Potential types of customers are identified. Scenarios are developed for ways in which different types of customers can utilize the Space Station. Both management and technical issues involved in making the Station 'customer friendly' are discussed.
The application of virtual reality systems as a support of digital manufacturing and logistics
NASA Astrophysics Data System (ADS)
Golda, G.; Kampa, A.; Paprocka, I.
2016-08-01
Modern trends in the development of computer-aided techniques are heading toward the integration of the design of competitive products with so-called "digital manufacturing and logistics", supported by computer simulation software. All phases of the product lifecycle, starting from the design of a new product, through planning and control of manufacturing, assembly, internal logistics and repairs, quality control, distribution to customers and after-sale service, up to its recycling or utilization, should be aided and managed by advanced packages of product lifecycle management software. This paper describes important problems in providing an efficient flow of materials in supply chain management across the whole product lifecycle using computer simulation. The authors pay attention to the processes of acquiring relevant information and correct data, necessary for virtual modeling and computer simulation of integrated manufacturing and logistics systems. The article describes possibilities for using virtual reality software to model and simulate production and logistics processes in an enterprise across different aspects of product lifecycle management. The authors demonstrate an effective method of creating computer simulations for digital manufacturing and logistics, show modeled and programmed examples and solutions, and discuss development trends and application options that go beyond the enterprise.
Discrepancy Reporting Management System
NASA Technical Reports Server (NTRS)
Cooper, Tonja M.; Lin, James C.; Chatillon, Mark L.
2004-01-01
Discrepancy Reporting Management System (DRMS) is a computer program designed for use in the stations of NASA's Deep Space Network (DSN) to help establish the operational history of equipment items; acquire data on the quality of service provided to DSN customers; enable measurement of service performance; provide early insight into the need to improve processes, procedures, and interfaces; and enable the tracing of a data outage to a change in software or hardware. DRMS is a Web-based software system designed to include a distributed database and replication feature to achieve location-specific autonomy while maintaining a consistent high quality of data. DRMS incorporates commercial Web and database software. DRMS collects, processes, replicates, communicates, and manages information on spacecraft data discrepancies, equipment resets, and physical equipment status, and maintains an internal station log. All discrepancy reports (DRs), Master discrepancy reports (MDRs), and Reset data are replicated to a master server at NASA's Jet Propulsion Laboratory; Master DR data are replicated to all the DSN sites; and Station Logs are internal to each of the DSN sites and are not replicated. Data are validated according to several logical mathematical criteria. Queries can be performed on any combination of data.
Using the ATL HDI 1000 to collect demodulated RF data for monitoring HIFU lesion formation
NASA Astrophysics Data System (ADS)
Anand, Ajay; Kaczkowski, Peter J.; Daigle, Ron E.; Huang, Lingyun; Paun, Marla; Beach, Kirk W.; Crum, Lawrence A.
2003-05-01
The ability to accurately track and monitor the progress of lesion formation during HIFU (High Intensity Focused Ultrasound) therapy is important for the success of HIFU-based treatment protocols. To aid in the development of algorithms for accurately targeting and monitoring formation of HIFU-induced lesions, we have developed a software system to perform RF data acquisition during HIFU therapy using a commercially available clinical ultrasound scanner (ATL HDI 1000, Philips Medical Systems, Bothell, WA). The HDI 1000 scanner functions on a software-dominant architecture, permitting straightforward external control of its operation and relatively easy access to quadrature-demodulated RF data. A PC running a custom-developed program sends control signals to the HIFU module via GPIB and to the HDI 1000 via Telnet, alternately interleaving HIFU exposures and RF frame acquisitions. The system was tested during experiments in which HIFU lesions were created in excised animal tissue. No crosstalk between the HIFU beam and the ultrasound imager was detected, thus demonstrating synchronization. Newly developed acquisition modes allow greater user control in setting the image geometry and scanline density, and enable high-frame-rate acquisition. This system facilitates rapid development of signal-processing-based HIFU therapy monitoring algorithms and their implementation in image-guided thermal therapy systems. In addition, the HDI 1000 system can be easily customized for use with other emerging imaging modalities that require access to the RF data such as elastographic methods and new Doppler-based imaging and tissue characterization techniques.
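The interleaving scheme described above, HIFU exposures commanded over GPIB alternating with RF frame acquisitions requested over Telnet, can be outlined with a short control-loop sketch. The class names, timing values, and stubbed instrument calls below are illustrative assumptions, not the authors' actual program; real code would use a GPIB library and a Telnet client in place of the stubs.

```python
import time

# Hypothetical sketch of the interleaved HIFU / RF-acquisition loop described
# above. Hardware interfaces are reduced to stub classes for illustration.

class HifuModule:
    """Stands in for the GPIB-controlled HIFU generator."""
    def fire(self, duration_s: float) -> None:
        # a real system would write a burst command over GPIB here
        time.sleep(duration_s)

class Hdi1000:
    """Stands in for the Telnet-controlled ultrasound scanner."""
    def acquire_rf_frame(self) -> list:
        # a real system would request and read back a quadrature-demodulated
        # RF frame over Telnet; here we return a dummy frame
        return [0.0] * 1024

def run_therapy(n_cycles: int, exposure_s: float) -> list:
    """Alternate HIFU exposures and RF frame acquisitions."""
    hifu, scanner = HifuModule(), Hdi1000()
    frames = [scanner.acquire_rf_frame()]          # baseline frame before therapy
    for _ in range(n_cycles):
        hifu.fire(exposure_s)                      # therapy exposure (imager idle)
        frames.append(scanner.acquire_rf_frame())  # monitoring frame (HIFU off)
    return frames

if __name__ == "__main__":
    rf_frames = run_therapy(n_cycles=3, exposure_s=0.01)
    print(f"acquired {len(rf_frames)} RF frames")
```

The essential point, which the stubs preserve, is that exposure and acquisition never overlap, which is what prevents crosstalk between the HIFU beam and the imager.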
A Framework for the Design of Service Systems
NASA Astrophysics Data System (ADS)
Tan, Yao-Hua; Hofman, Wout; Gordijn, Jaap; Hulstijn, Joris
We propose a framework for the design and implementation of service systems, especially to design controls for long-term sustainable value co-creation. The framework is based on the software support tool e3-control. To illustrate the framework we use a large-scale case study, the Beer Living Lab, for simplification of customs procedures in international trade. The BeerLL shows how value co-creation can be achieved by reduction of administrative burden in international beer export due to electronic customs. Participants in the BeerLL are Heineken, IBM and Dutch Tax & Customs.
BioWord: A sequence manipulation suite for Microsoft Word
2012-01-01
Background The ability to manipulate, edit and process DNA and protein sequences has rapidly become a necessary skill for practicing biologists across a wide swath of disciplines. In spite of this, most everyday sequence manipulation tools are distributed across several programs and web servers, sometimes requiring installation and typically involving frequent switching between applications. To address this problem, here we have developed BioWord, a macro-enabled self-installing template for Microsoft Word documents that integrates an extensive suite of DNA and protein sequence manipulation tools. Results BioWord is distributed as a single macro-enabled template that self-installs with a single click. After installation, BioWord will open as a tab in the Office ribbon. Biologists can then easily manipulate DNA and protein sequences using a familiar interface and minimize the need to switch between applications. Beyond simple sequence manipulation, BioWord integrates functionality ranging from dyad search and consensus logos to motif discovery and pair-wise alignment. Written in Visual Basic for Applications (VBA) as an open source, object-oriented project, BioWord allows users with varying programming experience to expand and customize the program to better meet their own needs. Conclusions BioWord integrates a powerful set of tools for biological sequence manipulation within a handy, user-friendly tab in a widely used word processing software package. The use of a simple scripting language and an object-oriented scheme facilitates customization by users and provides a very accessible educational platform for introducing students to basic bioinformatics algorithms. PMID:22676326
BioWord: a sequence manipulation suite for Microsoft Word.
Anzaldi, Laura J; Muñoz-Fernández, Daniel; Erill, Ivan
2012-06-07
The ability to manipulate, edit and process DNA and protein sequences has rapidly become a necessary skill for practicing biologists across a wide swath of disciplines. In spite of this, most everyday sequence manipulation tools are distributed across several programs and web servers, sometimes requiring installation and typically involving frequent switching between applications. To address this problem, here we have developed BioWord, a macro-enabled self-installing template for Microsoft Word documents that integrates an extensive suite of DNA and protein sequence manipulation tools. BioWord is distributed as a single macro-enabled template that self-installs with a single click. After installation, BioWord will open as a tab in the Office ribbon. Biologists can then easily manipulate DNA and protein sequences using a familiar interface and minimize the need to switch between applications. Beyond simple sequence manipulation, BioWord integrates functionality ranging from dyad search and consensus logos to motif discovery and pair-wise alignment. Written in Visual Basic for Applications (VBA) as an open source, object-oriented project, BioWord allows users with varying programming experience to expand and customize the program to better meet their own needs. BioWord integrates a powerful set of tools for biological sequence manipulation within a handy, user-friendly tab in a widely used word processing software package. The use of a simple scripting language and an object-oriented scheme facilitates customization by users and provides a very accessible educational platform for introducing students to basic bioinformatics algorithms.
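BioWord itself is implemented in VBA inside Word; the flavour of everyday manipulation it bundles, such as reverse complementing a sequence or searching for simple dyads (inverted repeats), can be sketched in a few lines of Python. The function names and the dyad definition below are illustrative assumptions, not BioWord's actual API.

```python
# Illustrative sketch of two everyday sequence operations of the kind BioWord
# bundles; this is not BioWord's VBA code or its API.

COMPLEMENT = str.maketrans("ACGTacgt", "TGCAtgca")

def reverse_complement(seq: str) -> str:
    """Return the reverse complement of a DNA sequence."""
    return seq.translate(COMPLEMENT)[::-1]

def find_dyads(seq: str, half_len: int = 6, spacer: int = 0) -> list:
    """Find positions where a word is followed, after `spacer` bases, by its
    reverse complement (a simple inverted-repeat / dyad search)."""
    hits = []
    seq = seq.upper()
    for i in range(len(seq) - 2 * half_len - spacer + 1):
        left = seq[i:i + half_len]
        right = seq[i + half_len + spacer:i + 2 * half_len + spacer]
        if right == reverse_complement(left):
            hits.append(i)
    return hits

if __name__ == "__main__":
    s = "TTGACAAAAAATGTCAA"
    print(reverse_complement("ATGC"))            # GCAT
    print(find_dyads(s, half_len=6, spacer=5))   # dyad starting at position 0
```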
A class Hierarchical, object-oriented approach to virtual memory management
NASA Technical Reports Server (NTRS)
Russo, Vincent F.; Campbell, Roy H.; Johnston, Gary M.
1989-01-01
The Choices family of operating systems exploits class hierarchies and object-oriented programming to facilitate the construction of customized operating systems for shared memory and networked multiprocessors. The software is being used in the Tapestry laboratory to study the performance of algorithms, mechanisms, and policies for parallel systems. Described here are the architectural design and class hierarchy of the Choices virtual memory management system. The software and hardware mechanisms and policies of a virtual memory system implement a memory hierarchy that exploits the trade-off between response times and storage capacities. In Choices, the notion of a memory hierarchy is captured by abstract classes. Concrete subclasses of those abstractions implement a virtual address space, segmentation, paging, physical memory management, secondary storage, and remote (that is, networked) storage. Captured in the notion of a memory hierarchy are classes that represent memory objects. These classes provide a storage mechanism that contains encapsulated data and have methods to read or write the memory object. Each of these classes provides specializations to represent the memory hierarchy.
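The organization described above, abstract classes capturing the memory hierarchy, concrete subclasses implementing each level, and memory objects that encapsulate data behind read and write methods, can be illustrated with a small sketch. Choices itself is written in C++; the Python classes and names below only illustrate the pattern and are not the actual Choices interfaces.

```python
from abc import ABC, abstractmethod

# Hypothetical illustration of the pattern described above: an abstract memory
# object, a concrete physical store, and a paged virtual address space layered
# on top of it. These names are not the real Choices C++ classes.

class MemoryObject(ABC):
    """Encapsulated storage with read/write methods."""
    @abstractmethod
    def read(self, offset: int, length: int) -> bytes: ...
    @abstractmethod
    def write(self, offset: int, data: bytes) -> None: ...

class PhysicalMemory(MemoryObject):
    def __init__(self, size: int):
        self._buf = bytearray(size)
    def read(self, offset, length):
        return bytes(self._buf[offset:offset + length])
    def write(self, offset, data):
        self._buf[offset:offset + len(data)] = data

class PagedAddressSpace(MemoryObject):
    """A virtual address space mapping fixed-size pages onto a backing store."""
    def __init__(self, backing: MemoryObject, page_size: int = 4096):
        self.backing, self.page_size = backing, page_size
        self.page_table = {}          # virtual page -> physical page
    def _translate(self, vaddr: int) -> int:
        vpage, off = divmod(vaddr, self.page_size)
        # identity mapping stands in for a real page-fault / placement policy
        ppage = self.page_table.setdefault(vpage, vpage)
        return ppage * self.page_size + off
    def read(self, offset, length):
        return self.backing.read(self._translate(offset), length)
    def write(self, offset, data):
        self.backing.write(self._translate(offset), data)

if __name__ == "__main__":
    space = PagedAddressSpace(PhysicalMemory(64 * 1024))
    space.write(8192, b"choices")
    print(space.read(8192, 7))   # b'choices'
```

The point of the pattern is that policies (paging, placement, backing store) can be specialized in subclasses without changing the read/write interface the rest of the system sees.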
Engineering Software Suite Validates System Design
NASA Technical Reports Server (NTRS)
2007-01-01
EDAptive Computing Inc.'s (ECI) EDAstar engineering software tool suite, created to capture and validate system design requirements, was significantly funded by NASA's Ames Research Center through five Small Business Innovation Research (SBIR) contracts. These programs specifically developed Syscape, used to capture executable specifications of multi-disciplinary systems, and VectorGen, used to automatically generate tests to ensure system implementations meet specifications. According to the company, the VectorGen tests considerably reduce the time and effort required to validate implementation of components, thereby ensuring their safe and reliable operation. EDASHIELD, an additional product offering from ECI, can be used to diagnose, predict, and correct errors after a system has been deployed using EDAstar-created models. Initial commercialization for EDAstar included application by a large prime contractor in a military setting, and customers include various branches within the U.S. Department of Defense, industry giants like the Lockheed Martin Corporation, Science Applications International Corporation, and Ball Aerospace and Technologies Corporation, as well as NASA's Langley and Glenn Research Centers.
Object-oriented approach to fast display of electrophysiological data under MS-windows.
Marion-Poll, F
1995-12-01
Microcomputers provide neuroscientists an alternative to a host of laboratory equipment to record and analyze electrophysiological data. Object-oriented programming tools provide an essential link between custom needs for data acquisition and analysis and general software packages. In this paper, we outline the layout of basic objects that display and manipulate electrophysiological data files. Visual inspection of the recordings is a basic requirement of any data analysis software. We present an approach that allows flexible and fast display of large data sets. This approach involves constructing an intermediate representation of the data in order to lower the number of actual points displayed while preserving the aspect of the data. The second group of objects is related to the management of lists of data files. Typical experiments designed to test the biological activity of pharmacological products include scores of files. Data manipulation and analysis are facilitated by creating multi-document objects that include the names of all experiment files. Implementation steps of both objects are described for an MS-Windows-hosted application.
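The intermediate-representation idea described above, reducing the number of plotted points while preserving the visual aspect of the trace, is commonly realized as per-bin min/max decimation. The following sketch shows one such reduction; it illustrates the general technique under that assumption and is not the authors' MS-Windows implementation.

```python
# Min/max decimation: collapse each bin of raw samples to its extremes so the
# plotted envelope looks like the full-resolution trace. Illustrative only.

def minmax_decimate(samples, max_points=2000):
    """Return a reduced sequence whose plot preserves the envelope of `samples`."""
    n = len(samples)
    n_bins = max_points // 2
    if n <= max_points or n_bins == 0:
        return list(samples)
    reduced = []
    for b in range(n_bins):
        lo = b * n // n_bins
        hi = (b + 1) * n // n_bins
        chunk = samples[lo:hi]
        reduced.append(min(chunk))   # bin minimum
        reduced.append(max(chunk))   # bin maximum
    return reduced

if __name__ == "__main__":
    import math
    raw = [math.sin(i / 50.0) * (1 + 0.1 * (i % 7)) for i in range(100000)]
    view = minmax_decimate(raw, max_points=1000)
    print(len(raw), "->", len(view))   # 100000 -> 1000
```

Because each bin keeps both its minimum and maximum, spikes and fast transients remain visible even after a large reduction in point count.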
An advanced software suite for the processing and analysis of silicon luminescence images
NASA Astrophysics Data System (ADS)
Payne, D. N. R.; Vargas, C.; Hameiri, Z.; Wenham, S. R.; Bagnall, D. M.
2017-06-01
Luminescence imaging is a versatile characterisation technique used for a broad range of research and industrial applications, particularly for the field of photovoltaics where photoluminescence and electroluminescence imaging is routinely carried out for materials analysis and quality control. Luminescence imaging can reveal a wealth of material information, as detailed in extensive literature, yet these techniques are often only used qualitatively instead of being utilised to their full potential. Part of the reason for this is the time and effort required for image processing and analysis in order to convert image data to more meaningful results. In this work, a custom-built, MATLAB-based software suite is presented which aims to dramatically simplify luminescence image processing and analysis. The suite includes four individual programs which can be used in isolation or in conjunction to achieve a broad array of functionality, including but not limited to, point spread function determination and deconvolution, automated sample extraction, image alignment and comparison, minority carrier lifetime calibration and iron impurity concentration mapping.
The Use Of Videography For Three-Dimensional Motion Analysis
NASA Astrophysics Data System (ADS)
Hawkins, D. A.; Hawthorne, D. L.; DeLozier, G. S.; Campbell, K. R.; Grabiner, M. D.
1988-02-01
Special video path editing capabilities with custom hardware and software have been developed for use in conjunction with existing video acquisition hardware and firmware. This system has simplified the task of quantifying the kinematics of human movement. A set of retro-reflective markers is secured to a subject performing a given task (e.g., walking, throwing, swinging a golf club). Multiple cameras, a video processor, and a computer workstation collect video data while the task is performed. Software has been developed to edit video files, create centroid data, and identify marker paths. Multi-camera path files are combined to form a 3D path file using the DLT method of cinematography. A separate program converts the 3D path file into kinematic data by creating a set of local coordinate axes and performing a series of coordinate transformations from one local system to the next. The kinematic data is then displayed for appropriate review and/or comparison.
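The DLT step reconstructs each marker's 3D position from its 2D image coordinates in two or more calibrated cameras. The sketch below solves the standard 11-parameter DLT equations by linear least squares, assuming each camera's coefficients are already known from calibration; it is a generic illustration, not the authors' software.

```python
import numpy as np

# Generic DLT reconstruction: given each camera's 11 DLT coefficients and the
# observed (u, v) of one marker in every camera, solve for (X, Y, Z) by
# linear least squares. Illustrative sketch only.

def dlt_reconstruct(dlt_params, image_points):
    """dlt_params: list of length-11 coefficient sequences (one per camera);
    image_points: list of (u, v) marker coordinates, in the same order."""
    A, b = [], []
    for L, (u, v) in zip(dlt_params, image_points):
        L = np.asarray(L, dtype=float)
        A.append([L[0] - u * L[8], L[1] - u * L[9], L[2] - u * L[10]])
        b.append(u - L[3])
        A.append([L[4] - v * L[8], L[5] - v * L[9], L[6] - v * L[10]])
        b.append(v - L[7])
    xyz, *_ = np.linalg.lstsq(np.asarray(A), np.asarray(b), rcond=None)
    return xyz

if __name__ == "__main__":
    # Two toy cameras with made-up DLT coefficients, used to project a known
    # point and then recover it.
    cam1 = [1, 0, 0, 0,  0, 1, 0, 0,  0, 0, 0.001]
    cam2 = [0, 1, 0, 0,  0, 0, 1, 0,  0.001, 0, 0]
    point = np.array([10.0, 20.0, 30.0])

    def project(L, P):
        L = np.asarray(L, float)
        d = L[8] * P[0] + L[9] * P[1] + L[10] * P[2] + 1.0
        return ((L[0] * P[0] + L[1] * P[1] + L[2] * P[2] + L[3]) / d,
                (L[4] * P[0] + L[5] * P[1] + L[6] * P[2] + L[7]) / d)

    uv = [project(cam1, point), project(cam2, point)]
    print(dlt_reconstruct([cam1, cam2], uv))   # approximately [10. 20. 30.]
```

With more than two cameras the same least-squares form simply gains extra rows, which is why the method extends naturally to redundant camera setups.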
Media processors using a new microsystem architecture designed for the Internet era
NASA Astrophysics Data System (ADS)
Wyland, David C.
1999-12-01
The demands of digital image processing, communications and multimedia applications are growing more rapidly than traditional design methods can fulfill them. Previously, only custom hardware designs could provide the performance required to meet the demands of these applications. However, hardware design has reached a crisis point. Hardware design can no longer deliver a product with the required performance and cost in a reasonable time for a reasonable risk. Software based designs running on conventional processors can deliver working designs in a reasonable time and with low risk but cannot meet the performance requirements. What is needed is a media processing approach that combines very high performance, a simple programming model, complete programmability, short time to market and scalability. The Universal Micro System (UMS) is a solution to these problems. The UMS is a completely programmable (including I/O) system on a chip that combines hardware performance with the fast time to market, low cost and low risk of software designs.
ERIC Educational Resources Information Center
Bayer, Jerrie; Llewellyn, Steven
2011-01-01
Library customers have more remote information choices than ever before, so we must ensure that when they do come to the library, they experience a welcoming environment, a high standard of service, and receive equitable levels of service across campus. Developing a customer service program was a logical next step to reinforce the ongoing…
Anssari Moin, David; Derksen, Wiebe; Waars, Hugo; Hassan, Bassam; Wismeijer, Daniel
2017-05-01
The aim of this study was to introduce a new concept for computer-assisted template-guided placement of a custom 3D-designed/3D-printed implant with congruent custom 3D-designed/3D-printed surgical tooling and to test the feasibility and accuracy of this method in vitro. One partially edentulous human mandibular cadaver was scanned with a cone-beam computed tomography (CBCT) system and an intra-oral scan system. The 3D data of this cadaver were imported into specialized software and used to analyse the region of a missing tooth. Based on the functional and anatomical parameters, an individual implant with congruent surgical tooling and a guided surgical template was designed and 3D-printed. The guided osteotomy was performed, and the custom implant inserted. To evaluate the planned implant position in comparison with the placed implant position, the mandible with implant was scanned again with the CBCT system and software matching was applied to measure the accuracy of the procedure. The angular deflection from the planned implant position was 0.40°. Comparing the 3D positions of the shoulder, there was a deviation of 0.72 mm, resulting in an apical deviation of 0.72 mm. With the use of currently available technology, it is entirely feasible to create, in a virtual simulation, a custom implant with congruent custom surgical tooling and to transfer this to a clinical setting. However, further research on multiple levels is needed to explore this novel approach. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Simscape Modeling Verification in the Simulink Development Environment
NASA Technical Reports Server (NTRS)
Volle, Christopher E. E.
2014-01-01
The purpose of the Simulation Product Group of the Control and Data Systems division of the NASA Engineering branch at Kennedy Space Center is to provide a real-time model and simulation of the Ground Subsystems participating in vehicle launching activities. The simulation software is part of the Spaceport Command and Control System (SCCS) and is designed to support integrated launch operation software verification and console operator training. Using Mathworks Simulink tools, modeling engineers currently build models from custom-built blocks to accurately represent ground hardware. This is time-consuming and costly because rigorous testing and peer reviews must be conducted for each custom-built block. Using Mathworks Simscape tools, modeling time can be reduced since no custom code would need to be developed. After careful research, the group concluded that it is feasible to use Simscape's blocks in MATLAB's Simulink. My project this fall was to verify the accuracy of the Crew Access Arm model developed using Simscape tools running in the Simulink development environment.
Creating a successful relationship with customers.
Cotton, L; Sparrow, E
1998-01-01
In 1997, several employers commissioned an inpatient survey for a group of businesses that included hospitals in southeast Michigan. Its results indicated that the University of Michigan Health System (UMHS) needed to become more customer-focused. To meet this challenge, UMHS mandated that customer service to its patients and their families should be its first priority. A pilot project in the radiology department's pediatric division was established to recognize and reward employees for outstanding service to customers. The program is now used to reward employees throughout the radiology department, on the assumption that when employees feel special, so will their customers. Management's focus is on employees--they are the health system. The department also invested in employee development, a continuous training program that centers on customer service and teaches tools and skills for better communication. The goal of the development program at UMHS is to exceed the needs of its customers.
State formulating lifeline program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1978-09-01
The Board of Public Utilities (BPU) of New Jersey is formulating a lifeline program which would provide low-income and elderly customers with reduced utility rates. It is estimated that 30% of the households in New Jersey will qualify for the program. While the legislation calls for the lowest effective rate of any customer class, each utility would have its own lifeline program because of differing rates among utility companies. Eligibility requirements would be applied statewide. The utilities will fund the new program by restructuring the existing rates for regular customers, in which case lifeline recipients' rates would decrease while regular customers' bills would increase. Eventually, the BPU expects to fund about 10% of the senior citizens' portion of the program with the state's casino gambling revenues.
Software agents for the dissemination of remote terrestrial sensing data
NASA Technical Reports Server (NTRS)
Toomey, Christopher N.; Simoudis, Evangelos; Johnson, Raymond W.; Mark, William S.
1994-01-01
Remote terrestrial sensing (RTS) data is constantly being collected from a variety of space-based and earth-based sensors. The collected data, and especially 'value-added' analyses of the data, are finding growing application for commercial, government, and scientific purposes. The scale of this data collection and analysis is truly enormous; e.g., by 1995, the amount of data available in just one sector, NASA space science, will reach 5 petabytes. Moreover, the amount of data, and the value of analyzing the data, are expected to increase dramatically as new satellites and sensors become available (e.g., NASA's Earth Observing System satellites). Lockheed and other companies are beginning to provide data and analysis commercially. A critical issue for the exploitation of collected data is the dissemination of data and value-added analyses to a diverse and widely distributed customer base. Customers must be able to use their computational environment (eventually the National Information Infrastructure) to obtain timely and complete information, without having to know the details of where the relevant data resides and how it is accessed. Customers must be able to routinely use standard, widely available (and, therefore, low cost) analyses, while also being able to readily create on demand highly customized analyses to make crucial decisions. The diversity of user needs creates a difficult software problem: how can users easily state their needs, while the computational environment assumes the responsibility of finding (or creating) relevant information, and then delivering the results in a form that users understand? A software agent is a self-contained, active software module that contains an explicit representation of its operational knowledge. This explicit representation allows agents to examine their own capabilities in order to modify their goals to meet changing needs and to take advantage of dynamic opportunities. In addition, the explicit representation allows agents to advertize their capabilities and results to other agents, thereby allowing the collection of agents to reuse each others work.
Burton, R; Mauk, D
1993-03-01
By integrating customer satisfaction planning and industrial engineering techniques when examining internal costs and efficiencies, materiel managers are able to better realize what concepts will best meet their customers' needs. Defining your customer(s), applying industrial engineering techniques, completing work sampling studies, itemizing recommendations and benefits to each alternative, performing feasibility and cost-analysis matrixes and utilizing resources through productivity monitoring will get you on the right path toward selecting concepts to use. This article reviews the above procedures as they applied to one hospital's decision-making process to determine whether to incorporate a stockless inventory program. Through an analysis of customer demand, the hospital realized that stockless was the way to go, but not by outsourcing the function--the hospital incorporated an in-house stockless inventory program.
The Hyper Suprime-Cam software pipeline
Bosch, James; Armstrong, Robert; Bickerton, Steven; ...
2017-10-12
In this article, we describe the optical imaging data processing pipeline developed for the Subaru Telescope’s Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope’s Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.
The Hyper Suprime-Cam software pipeline
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bosch, James; Armstrong, Robert; Bickerton, Steven
In this article, we describe the optical imaging data processing pipeline developed for the Subaru Telescope’s Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope’s Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.
A Statewide Management Information System for the Control of Sexually Transmitted Diseases
Fichtner, Ronald R.; Blount, Joseph H.; Spencer, Jack N.
1983-01-01
The persistent endemicity in the U.S. of infectious syphilis and gonorrhea, together with increasing diagnoses of gonococcal-related pelvic inflammatory disease in women and genital herpes infections, has intensified pressures on state and local VD control programs to measure, analyze, and interpret the distribution and transmission of these and other sexually transmitted diseases. In response, the Division of Venereal Disease Control (DVDC) of the Centers for Disease Control (CDC) is participating in the development of three state-wide, prototype sexually transmitted disease (STD) management information systems. A systems analysis of a typical state-wide STD control program indicated that timely, comprehensive, informational support to public health managers and policy makers should be combined with rapid, direct support of program activities using an online, integrated database computer system with telecommunications capability. This methodology uses a database management system, a query facility for ad hoc inquiries, and custom design philosophies, but utilizes distinct hardware and software implementations.
FASTER - A tool for DSN forecasting and scheduling
NASA Technical Reports Server (NTRS)
Werntz, David; Loyola, Steven; Zendejas, Silvino
1993-01-01
FASTER (Forecasting And Scheduling Tool for Earth-based Resources) is a suite of tools designed for forecasting and scheduling JPL's Deep Space Network (DSN). The DSN is a set of antennas and other associated resources that must be scheduled for satellite communications, astronomy, maintenance, and testing. FASTER consists of MS-Windows based programs that replace two existing programs (RALPH and PC4CAST). FASTER was designed to be more flexible, maintainable, and user friendly. FASTER makes heavy use of commercial software to allow for customization by users. FASTER implements scheduling as a two pass process: the first pass calculates a predictive profile of resource utilization; the second pass uses this information to calculate a cost function used in a dynamic programming optimization step. This information allows the scheduler to 'look ahead' at activities that are not as yet scheduled. FASTER has succeeded in allowing wider access to data and tools, reducing the amount of effort expended and increasing the quality of analysis.
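The two-pass approach described above can be illustrated with a small sketch: pass one builds a predictive profile of resource utilization from the unscheduled requests, and pass two places requests so that the profile-derived cost of the slots they occupy is minimized. The activity model, the uniform-spreading profile, the single shared resource, and the fixed request order in the dynamic program below are all simplifying assumptions for illustration; this is not the FASTER code.

```python
from functools import lru_cache

# Simplified two-pass scheduler in the spirit described above. Activities are
# (duration, earliest_start, latest_start) in integer time slots on one shared
# resource. Illustrative only.

def predicted_profile(activities, horizon):
    """Pass 1: spread each request's expected load uniformly over its window."""
    profile = [0.0] * horizon
    for dur, earliest, latest in activities:
        window = latest + dur - earliest
        for t in range(earliest, latest + dur):
            profile[t] += dur / window
    return profile

def schedule(activities, horizon):
    """Pass 2: choose non-overlapping start times (requests taken in the given
    order) minimizing the profile-based cost of the slots they occupy."""
    profile = predicted_profile(activities, horizon)

    @lru_cache(maxsize=None)
    def best(i, free_from):
        if i == len(activities):
            return 0.0, ()
        dur, earliest, latest = activities[i]
        options = []
        for start in range(max(earliest, free_from), latest + 1):
            cost = sum(profile[start:start + dur])       # "look ahead" cost
            tail_cost, tail = best(i + 1, start + dur)
            options.append((cost + tail_cost, (start,) + tail))
        return min(options) if options else (float("inf"), ())

    return best(0, 0)

if __name__ == "__main__":
    acts = [(3, 0, 5), (2, 2, 8), (4, 0, 6)]   # (duration, earliest, latest start)
    total_cost, starts = schedule(acts, horizon=12)
    print(starts, round(total_cost, 2))
```

The profile is what lets the placement step "look ahead" at demand from activities that have not yet been scheduled, which is the essential idea attributed to the tool.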
Customized Training Marketing Plan.
ERIC Educational Resources Information Center
Lay, Ted
This report outlines Oregon's Lane Community College's (LCC's) plan for marketing its customized training program for business, community organizations, public agencies, and their employees. Following a mission statement for the customized training program, a brief analysis is provided of the economic environment; of competition from educational…
Recent Survey and Application of the simSUNDT Software
NASA Astrophysics Data System (ADS)
Persson, G.; Wirdelius, H.
2010-02-01
The simSUNDT software is based on a previously developed program (SUNDT). The latest version has been customized in order to generate realistic synthetic data (including a grain noise model), compatible with a number of off-line analysis software packages. The software consists of a Windows®-based preprocessor and postprocessor together with a mathematical kernel (UTDefect), dealing with the actual mathematical modeling. The model employs various integral transforms and integral equation techniques and enables simulations of the entire ultrasonic testing situation. The model is completely three-dimensional, though the simulated component is two-dimensional, bounded by the scanning surface and, as an option, a planar back surface. It is of great importance that the inspection methods applied are properly validated and that their capability to detect cracks and defects is quantified. In order to achieve this, statistical methods such as Probability of Detection (POD) are often applied, with the aim of estimating detectability as a function of defect size. Besides being very expensive, the proposed procedure based on test pieces also tends to introduce a number of possible misalignments between the actual NDT situation to be performed and the experimental simulation. The presentation describes the developed model, which enables simulation of a phased-array NDT inspection, and the ambition to use this simulation software to generate POD information. The paper also includes the most recent developments of the model, including some initial experimental validation of the phased-array probe model.
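Probability of Detection, mentioned above, is usually estimated by fitting a parametric curve to hit/miss inspection outcomes as a function of defect size. The sketch below fits a logistic POD model in log defect size to simulated hit/miss data and reports the size detected with 90% probability; it is a generic illustration of the POD concept, not part of simSUNDT.

```python
import numpy as np

# Generic hit/miss POD curve fit (logistic model in log defect size), of the
# kind used to quantify detectability versus defect size. Illustrative only.

def fit_pod(sizes, hits, iters=20000, lr=0.5):
    """Fit POD(a) = 1/(1+exp(-(b0 + b1*log a))) to binary hit/miss data by
    simple gradient ascent on the mean log-likelihood."""
    x = np.log(np.asarray(sizes, float))
    y = np.asarray(hits, float)
    b0, b1 = 0.0, 0.0
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))
        b0 += lr * np.mean(y - p)          # gradient with respect to b0
        b1 += lr * np.mean((y - p) * x)    # gradient with respect to b1
    return b0, b1

def pod(a, b0, b1):
    return 1.0 / (1.0 + np.exp(-(b0 + b1 * np.log(a))))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sizes = rng.uniform(0.5, 10.0, 400)            # defect sizes, e.g. in mm
    true_p = pod(sizes, b0=-3.0, b1=2.5)
    hits = (rng.random(400) < true_p).astype(float)
    b0, b1 = fit_pod(sizes, hits)
    a90 = np.exp((np.log(9.0) - b0) / b1)          # size with POD = 90%
    print(round(b0, 2), round(b1, 2), "a90 ~", round(a90, 2), "mm")
```

Replacing the experimental test-piece outcomes with simulated ones is exactly where a validated simulation model can reduce the cost of building such curves.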
31 CFR 1023.220 - Customer identification programs for broker-dealers.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Finance (Continued) FINANCIAL CRIMES ENFORCEMENT NETWORK, DEPARTMENT OF THE TREASURY RULES FOR BROKERS OR DEALERS IN SECURITIES Programs § 1023.220 Customer identification programs for broker-dealers. (a...
31 CFR 1023.220 - Customer identification programs for broker-dealers.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Finance (Continued) FINANCIAL CRIMES ENFORCEMENT NETWORK, DEPARTMENT OF THE TREASURY RULES FOR BROKERS OR DEALERS IN SECURITIES Programs § 1023.220 Customer identification programs for broker-dealers. (a...
31 CFR 1023.220 - Customer identification programs for broker-dealers.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Finance (Continued) FINANCIAL CRIMES ENFORCEMENT NETWORK, DEPARTMENT OF THE TREASURY RULES FOR BROKERS OR DEALERS IN SECURITIES Programs § 1023.220 Customer identification programs for broker-dealers. (a...
31 CFR 1023.220 - Customer identification programs for broker-dealers.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Finance (Continued) FINANCIAL CRIMES ENFORCEMENT NETWORK, DEPARTMENT OF THE TREASURY RULES FOR BROKERS OR DEALERS IN SECURITIES Programs § 1023.220 Customer identification programs for broker-dealers. (a...
International Space Station Increment Operations Services
NASA Astrophysics Data System (ADS)
Michaelis, Horst; Sielaff, Christian
2002-01-01
The Industrial Operator (IO) has defined End-to-End services to perform efficiently all required operations tasks for the Manned Space Program (MSP) as agreed during the Ministerial Council in Edinburgh in November 2001. Those services are the result of a detailed task analysis based on the operations processes as derived from the Space Station Program Implementation Plans (SPIP) and defined in the Operations Processes Documents (OPD). These services are related to ISS Increment Operations and ATV Mission Operations. Each of these End-to-End services is typically characterised by the following properties: It has a clearly defined starting point, where all requirements on the end-product are fixed and the associated performance metrics of the customer are well defined. It has a clearly defined ending point, when the product or service is delivered to the customer and accepted by him, according to the performance metrics defined at the start point. The implementation of the process might be restricted by external boundary conditions and constraints mutually agreed with the customer; as long as those are respected, the IO has free choice of the methods and means of implementation. The ISS Increment Operations Service (IOS) activities required for the MSP Exploitation program cover the complete increment-specific cycle, starting with support to strategic planning and ending with the post-increment evaluation. These activities are divided into sub-services including the following tasks: ISS Planning Support, covering support to strategic and tactical planning up to the generation; Development & Payload Integration Support; ISS Increment Preparation; and ISS Increment Execution. These processes are tied together by Increment Integration Management, which provides the planning and scheduling of all activities as well as the technical management of the overall process. The paper describes the entire End-to-End ISS Increment Operations service and the implementation to support the Columbus Flight 1E-related increment and subsequent ISS increments. Special attention is paid to the implications caused by long-term operations on hardware, software, and operations personnel.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-14
... Parts 1, 3, 22 et al. Enhancing Protections Afforded Customers and Customer Funds Held by Futures... Customers and Customer Funds Held by Futures Commission Merchants and Derivatives Clearing Organizations... amend existing regulations to require enhanced customer protections, risk management programs, internal...
Egger, Jan; Gall, Markus; Tax, Alois; Ücal, Muammer; Zefferer, Ulrike; Li, Xing; von Campe, Gord; Schäfer, Ute; Schmalstieg, Dieter; Chen, Xiaojun
2017-01-01
In this publication, the interactive planning and reconstruction of cranial 3D implants under the medical prototyping platform MeVisLab is introduced as an alternative to commercial planning software. In doing so, a MeVisLab prototype consisting of a customized data-flow network and a custom C++ module was set up. As a result, the Computer-Aided Design (CAD) software prototype guides a user through the whole workflow to generate an implant. The workflow begins with loading and mirroring the patient's head for an initial curvature of the implant. Then, the user can perform an additional Laplacian smoothing, followed by a Delaunay triangulation. The result is an aesthetically pleasing and well-fitting 3D implant, which can be stored in a CAD file format, e.g. STereoLithography (STL), for 3D printing. The 3D-printed implant can finally be used for an in-depth pre-surgical evaluation or even as a real implant for the patient. In a nutshell, our research and development shows that a customized MeVisLab software prototype can be used as an alternative to complex commercial planning software, which may also not be available in every clinic, so that one need not conform directly to available commercial software and can look for other options that might improve the workflow.
Results of the Software Process Improvement Efforts of the Early Adopters in NAVAIR 4.0
2007-12-01
and customer satisfaction. AIRSpeed utilizes a structured, problem solving methodology called DMAIC (Define, Measure, Analyze, Improve, Control...widely used in business. DMAIC leads project teams through the logical steps from problem definition to problem resolution. Each phase has a specific set...costs and improving productivity and customer satisfaction. AIRSpeed utilizes the DMAIC (Define, Measure, Analyze, Improve, Control) structured problem
Insider Threats in the Software Development Lifecycle
2014-11-05
employee, contractor, or other business partner who • has or had authorized access to an organization's network, system or data and • intentionally... organization's network, system, or data and who, through • their action/inaction without malicious intent • cause harm or substantially increase... and female Male Target Network, systems, or data PII or Customer Information IP (trade secrets) or Customer Information Access Used
19 CFR 24.22 - Fees for certain services.
Code of Federal Regulations, 2014 CFR
2014-04-01
... following address: U.S. Customs and Border Protection, Attn: DTOPS Program Administrator, 6650 Telecom Drive... address: U.S. Customs and Border Protection, Attn: DTOPS Program Administrator, 6650 Telecom Drive, Suite....S. Customs and Border Protection, Revenue Division, Attn: User Fee Team, 6650 Telecom Drive, Suite...
19 CFR 24.22 - Fees for certain services.
Code of Federal Regulations, 2013 CFR
2013-04-01
... following address: U.S. Customs and Border Protection, Attn: DTOPS Program Administrator, 6650 Telecom Drive... address: U.S. Customs and Border Protection, Attn: DTOPS Program Administrator, 6650 Telecom Drive, Suite....S. Customs and Border Protection, Revenue Division, Attn: User Fee Team, 6650 Telecom Drive, Suite...
Databank Software for the 1990s and Beyond--Part 1: The User's Wish List.
ERIC Educational Resources Information Center
Basch, Reva
1990-01-01
Describes desired software enhancements identified by the Southern California Online Users Group in the areas of search language, database selection, document retrieval and display, user interface, customer support, and cost and economic issues. The need to prioritize these wishes and to determine whether features should reside in the mainframe or…
Selecting Advanced Software Technology in Two Small Manufacturing Enterprises
2004-05-01
improving workflow to further reduce delivery times, enhance customer service, and obtain a competitive advantage. The company wanted help... environment, stakeholders' needs, ecommerce, shop floor visualization, and collaboration capability. These statements are not significantly different... for the purpose of describing a software environment. This identification does not imply any recommendation or endorsement by NIST, the SEI, CMU, or
A Talking Computers System for Persons with Vision and Speech Handicaps. Final Report.
ERIC Educational Resources Information Center
Visek & Maggs, Urbana, IL.
This final report contains a detailed description of six software systems designed to assist individuals with blindness and/or speech disorders in using inexpensive, off-the-shelf computers rather than expensive custom-made devices. The developed software is not written in the native machine language of any particular brand of computer, but in the…
Standardization in software conversion of (ROM) estimating
NASA Technical Reports Server (NTRS)
Roat, G. H.
1984-01-01
Technical problems and their solutions comprise by far the majority of work involved in space simulation engineering. Fixed price contracts with schedule award fees are becoming more and more prevalent. Accurate estimation of these jobs is critical to maintain costs within limits and to predict realistic contract schedule dates. Computerized estimating may hold the answer to these new problems, though up to now computerized estimating has been complex, expensive, and geared to the business world, not to technical people. The objective of this effort was to provide a simple program on a desk top computer capable of providing a Rough Order of Magnitude (ROM) estimate in a short time. This program is not intended to provide a highly detailed breakdown of costs to a customer, but to provide a number which can be used as a rough estimate on short notice. With more debugging and fine tuning, a more detailed estimate can be made.
An approach to software cost estimation
NASA Technical Reports Server (NTRS)
Mcgarry, F.; Page, J.; Card, D.; Rohleder, M.; Church, V.
1984-01-01
A general procedure for software cost estimation in any environment is outlined. The basic concepts of work and effort estimation are explained, some popular resource estimation models are reviewed, and the accuracy of source estimates is discussed. A software cost prediction procedure based on the experiences of the Software Engineering Laboratory in the flight dynamics area and incorporating management expertise, cost models, and historical data is described. The sources of information and relevant parameters available during each phase of the software life cycle are identified. The methodology suggested incorporates these elements into a customized management tool for software cost prediction. Detailed guidelines for estimation in the flight dynamics environment developed using this methodology are presented.
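A procedure like the one described typically culminates in a parametric effort equation calibrated on historical data, on the order of effort = a x (size in KSLOC)^b, scaled by environment multipliers and split across lifecycle phases. The sketch below illustrates that shape; the constants, multipliers, and phase split are placeholder assumptions, not the Software Engineering Laboratory's calibrated values.

```python
# Illustrative parametric effort model of the form effort = a * KSLOC ** b,
# scaled by environment/productivity multipliers and split across lifecycle
# phases. All constants are placeholder assumptions.

PHASE_SPLIT = {"design": 0.25, "code_and_test": 0.45, "system_test": 0.30}

def estimate_effort(ksloc, a=2.8, b=1.05, multipliers=()):
    """Return (total staff-months, per-phase breakdown)."""
    effort = a * ksloc ** b
    for m in multipliers:            # e.g. team experience, reuse, complexity
        effort *= m
    breakdown = {phase: effort * frac for phase, frac in PHASE_SPLIT.items()}
    return effort, breakdown

if __name__ == "__main__":
    total, phases = estimate_effort(ksloc=32, multipliers=(0.9, 1.1))
    print(f"total ~ {total:.1f} staff-months")
    for phase, sm in phases.items():
        print(f"  {phase}: {sm:.1f}")
```

The value of such a tool lies less in the exact constants than in recalibrating them against the organization's own historical data, which is the customization step the abstract describes.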
31 CFR 103.131 - Customer identification programs for mutual funds.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Finance FINANCIAL RECORDKEEPING AND REPORTING OF CURRENCY AND FOREIGN TRANSACTIONS Anti-Money Laundering Programs Anti-Money Laundering Programs § 103.131 Customer identification programs for mutual funds. (a... mutual fund's anti-money laundering program required under the regulations implementing 31 U.S.C. 5318(h...
31 CFR 103.122 - Customer identification programs for broker-dealers.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Finance FINANCIAL RECORDKEEPING AND REPORTING OF CURRENCY AND FOREIGN TRANSACTIONS Anti-Money Laundering Programs Anti-Money Laundering Programs § 103.122 Customer identification programs for broker-dealers. (a... anti-money laundering compliance program required under 31 U.S.C. 5318(h). (2) Identity verification...
Knowledge management: Role of the the Radiation Safety Information Computational Center (RSICC)
NASA Astrophysics Data System (ADS)
Valentine, Timothy
2017-09-01
The Radiation Safety Information Computational Center (RSICC) at Oak Ridge National Laboratory (ORNL) is an information analysis center that collects, archives, evaluates, synthesizes and distributes information, data and codes that are used in various nuclear technology applications. RSICC retains more than 2,000 software packages that have been provided by code developers from various federal and international agencies. RSICC's customers (scientists, engineers, and students from around the world) obtain access to such computing codes (source and/or executable versions) and processed nuclear data files to promote on-going research, to ensure nuclear and radiological safety, and to advance nuclear technology. The role of such information analysis centers is critical for supporting and sustaining nuclear education and training programs both domestically and internationally, as the majority of RSICC's customers are students attending U.S. universities. Additionally, RSICC operates a secure CLOUD computing system to provide access to sensitive export-controlled modeling and simulation (M&S) tools that support both domestic and international activities. This presentation will provide a general review of RSICC's activities, services, and systems that support knowledge management and education and training in the nuclear field.
Applied Meteorology Unit (AMU) Quarterly Report Third Quarter FY · 13
NASA Technical Reports Server (NTRS)
Bauman, William; Crawford, Winifred; Watson, Leela; Shafer, Jaclyn; Huddleston, Lisa
2013-01-01
The AMU team worked on seven tasks for their customers: (1) Ms. Crawford completed the objective lightning forecast tool for east-central Florida airports and delivered the tool and the final report to the customers. (2) Ms. Shafer continued work for Vandenberg Air Force Base on an automated tool to relate pressure gradients to peak winds. (3) Dr. Huddleston updated and delivered the tool that shows statistics on the timing of the first lightning strike of the day in the Kennedy Space Center (KSC)/Cape Canaveral Air Force Station (CCAFS) area. (4) Dr. Bauman continued work on a severe weather forecast tool focused on the Eastern Range (ER). (5) Ms. Crawford acquired the software and radar data needed to create a dual-Doppler analysis over the east-central Florida and KSC/CCAFS areas. (6) Mr. Decker continued developing a wind pairs database for the Launch Services Program to use when evaluating upper-level winds for launch vehicles. (7) Dr. Watson continued work to assimilate observational data into the high-resolution model configurations she created for Wallops Flight Facility and the ER.
Hard real-time closed-loop electrophysiology with the Real-Time eXperiment Interface (RTXI)
George, Ansel; Dorval, Alan D.; Christini, David J.
2017-01-01
The ability to experimentally perturb biological systems has traditionally been limited to static pre-programmed or operator-controlled protocols. In contrast, real-time control allows dynamic probing of biological systems with perturbations that are computed on-the-fly during experimentation. Real-time control applications for biological research are available; however, these systems are costly and often restrict the flexibility and customization of experimental protocols. The Real-Time eXperiment Interface (RTXI) is an open source software platform for achieving hard real-time data acquisition and closed-loop control in biological experiments while retaining the flexibility needed for experimental settings. RTXI has enabled users to implement complex custom closed-loop protocols in single cell, cell network, animal, and human electrophysiology studies. RTXI is also used as a free and open source, customizable electrophysiology platform in open-loop studies requiring online data acquisition, processing, and visualization. RTXI is easy to install, can be used with an extensive range of external experimentation and data acquisition hardware, and includes standard modules for implementing common electrophysiology protocols. PMID:28557998
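The closed-loop pattern RTXI supports, read a sample, compute a perturbation from it on-the-fly, and write it out within each fixed period, can be sketched as follows. Real RTXI modules are C++ plugins running under a hard real-time kernel; the loop below only illustrates the structure, and the period, gain, and I/O stubs are made-up values.

```python
import time

# Illustrative closed-loop structure: each period, read the membrane potential,
# compute a feedback current from it, and write it back out. Stubbed I/O; a
# real system would use hard real-time scheduling and actual DAQ hardware.

PERIOD_S = 0.0001          # 10 kHz loop rate (assumption)
TARGET_MV = -50.0          # hypothetical target potential
GAIN_NA_PER_MV = 0.02      # hypothetical feedback gain

def read_voltage_mv():
    return -65.0           # stub for an analog-input read

def write_current_na(i_na):
    pass                   # stub for an analog-output write

def run(n_steps=1000):
    next_deadline = time.perf_counter()
    for _ in range(n_steps):
        v = read_voltage_mv()
        i = GAIN_NA_PER_MV * (TARGET_MV - v)   # perturbation computed on-the-fly
        write_current_na(i)
        next_deadline += PERIOD_S
        while time.perf_counter() < next_deadline:
            pass                               # busy-wait stand-in for RT scheduling

if __name__ == "__main__":
    run(n_steps=1000)
    print("closed-loop sketch finished")
```

The defining property of the hard real-time case is that the read-compute-write cycle must complete within every period, which is why such loops are normally implemented against a real-time kernel rather than in ordinary user-space Python.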
Sustainable Software Decisions for Long-term Projects (Invited)
NASA Astrophysics Data System (ADS)
Shepherd, A.; Groman, R. C.; Chandler, C. L.; Gaylord, D.; Sun, M.
2013-12-01
Adopting new, emerging technologies can be difficult for established projects that are positioned to exist for years to come. In some cases the challenge lies in the pre-existing software architecture. In others, the challenge lies in the fluctuation of resources like people, time and funding. The Biological and Chemical Oceanography Data Management Office (BCO-DMO) was created in late 2006 by combining the data management offices for the U.S. GLOBEC and U.S. JGOFS programs to publish data for researchers funded by the National Science Foundation (NSF). Since its inception, BCO-DMO has been supporting access and discovery of these data through web-accessible software systems, and the office has worked through many of the challenges of incorporating new technologies into its software systems. From migrating human readable, flat file metadata storage into a relational database, and now, into a content management system (Drupal) to incorporating controlled vocabularies, new technologies can radically affect the existing software architecture. However, through the use of science-driven use cases, effective resource management, and loosely coupled software components, BCO-DMO has been able to adapt its existing software architecture to adopt new technologies. One of the latest efforts at BCO-DMO revolves around applying metadata semantics for publishing linked data in support of data discovery. This effort primarily affects the metadata web interface software at http://bco-dmo.org and the geospatial interface software at http://mapservice.bco-dmo.org/. With guidance from science-driven use cases and consideration of our resources, implementation decisions are made using a strategy to loosely couple the existing software systems to the new technologies. The results of this process led to the use of REST web services and a combination of contributed and custom Drupal modules for publishing BCO-DMO's content using the Resource Description Framework (RDF) via an instance of the Virtuoso Open-Source triplestore.
Langer, Dominik; van 't Hoff, Marcel; Keller, Andreas J; Nagaraja, Chetan; Pfäffli, Oliver A; Göldi, Maurice; Kasper, Hansjörg; Helmchen, Fritjof
2013-04-30
Intravital microscopy such as in vivo imaging of brain dynamics is often performed with custom-built microscope setups controlled by custom-written software to meet specific requirements. Continuous technological advancement in the field has created a need for new control software that is flexible enough to support the biological researcher with innovative imaging techniques and provide the developer with a solid platform for quickly and easily implementing new extensions. Here, we introduce HelioScan, a software package written in LabVIEW, as a platform serving this dual role. HelioScan is designed as a collection of components that can be flexibly assembled into microscope control software tailored to the particular hardware and functionality requirements. Moreover, HelioScan provides a software framework, within which new functionality can be implemented in a quick and structured manner. A specific HelioScan application assembles at run-time from individual software components, based on user-definable configuration files. Due to its component-based architecture, HelioScan can exploit synergies of multiple developers working in parallel on different components in a community effort. We exemplify the capabilities and versatility of HelioScan by demonstrating several in vivo brain imaging modes, including camera-based intrinsic optical signal imaging for functional mapping of cortical areas, standard two-photon laser-scanning microscopy using galvanometric mirrors, and high-speed in vivo two-photon calcium imaging using either acousto-optic deflectors or a resonant scanner. We recommend HelioScan as a convenient software framework for the in vivo imaging community. Copyright © 2013 Elsevier B.V. All rights reserved.
An Integrated Unix-based CAD System for the Design and Testing of Custom VLSI Chips
NASA Technical Reports Server (NTRS)
Deutsch, L. J.
1985-01-01
A computer aided design (CAD) system that is being used at the Jet Propulsion Laboratory for the design of custom and semicustom very large scale integrated (VLSI) chips is described. The system consists of a Digital Equipment Corporation VAX computer with the UNIX operating system and a collection of software tools for the layout, simulation, and verification of microcircuits. Most of these tools were written by the academic community and are, therefore, available to JPL at little or no cost. Some small pieces of software have been written in-house in order to make all the tools interact with each other with a minimal amount of effort on the part of the designer.
Knowledge-based approach for generating target system specifications from a domain model
NASA Technical Reports Server (NTRS)
Gomaa, Hassan; Kerschberg, Larry; Sugumaran, Vijayan
1992-01-01
Several institutions in industry and academia are pursuing research efforts in domain modeling to address unresolved issues in software reuse. To demonstrate the concepts of domain modeling and software reuse, a prototype software engineering environment is being developed at George Mason University to support the creation of domain models and the generation of target system specifications. This prototype environment, which is application domain independent, consists of an integrated set of commercial off-the-shelf software tools and custom-developed software tools. This paper describes the knowledge-based tool that was developed as part of the environment to generate target system specifications from a domain model.
Link Analysis in the Mission Planning Lab
NASA Technical Reports Server (NTRS)
McCarthy, Jessica A.; Cervantes, Benjamin W.; Daugherty, Sarah C.; Arroyo, Felipe; Mago, Divyang
2011-01-01
The legacy communications link analysis software currently used at Wallops Flight Facility involves processes that are different for command destruct, radar, and telemetry. There is a clear advantage to developing an easy-to-use tool that combines all the processes in one application. Link Analysis in the Mission Planning Lab (MPL) uses custom software and algorithms integrated with Analytical Graphics Inc. Satellite Toolkit (AGI STK). The MPL link analysis tool uses pre/post-mission data to conduct a dynamic link analysis between ground assets and the launch vehicle. Just as the legacy methods do, the MPL link analysis tool calculates signal strength and signal-to-noise according to the accepted processes for command destruct, radar, and telemetry assets. Graphs and other custom data are generated rapidly in formats for reports and presentations. STK is used for analysis as well as to depict plume angles and antenna gain patterns in 3D. The MPL has developed two interfaces with the STK software (see figure). The first interface is an HTML utility, which was developed in Visual Basic to enhance analysis for plume modeling and to offer a more user-friendly, flexible tool. A graphical user interface (GUI) written in MATLAB (see figure upper right-hand corner) is also used to quickly depict link budget information for multiple ground assets. This new method yields a dramatic decrease in the time it takes to provide launch managers with the required link budgets to make critical pre-mission decisions. The software code used for these two custom utilities is a product of NASA's MPL.
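A minimal version of the link-budget arithmetic such a tool automates combines transmit power, antenna gains, free-space path loss, and miscellaneous losses into a received signal level, then compares it against a required level to get margin. The sketch below shows that calculation; the numeric values and the required-signal threshold are illustrative assumptions, not actual Wallops assets or requirements.

```python
import math

# Minimal telemetry link-budget sketch: free-space path loss plus gains gives
# received power, compared against a required threshold to get link margin.
# All values below are illustrative assumptions.

def free_space_path_loss_db(distance_km, freq_mhz):
    """Standard free-space path loss in dB for distance in km and frequency in MHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

def link_margin_db(tx_power_dbm, tx_gain_dbi, rx_gain_dbi,
                   distance_km, freq_mhz, misc_losses_db, required_dbm):
    fspl = free_space_path_loss_db(distance_km, freq_mhz)
    received_dbm = tx_power_dbm + tx_gain_dbi + rx_gain_dbi - fspl - misc_losses_db
    return received_dbm, received_dbm - required_dbm

if __name__ == "__main__":
    rx, margin = link_margin_db(tx_power_dbm=37.0,   # 5 W transmitter
                                tx_gain_dbi=0.0,     # vehicle antenna
                                rx_gain_dbi=35.0,    # ground telemetry antenna
                                distance_km=400.0,
                                freq_mhz=2250.0,
                                misc_losses_db=3.0,
                                required_dbm=-95.0)
    print(f"received {rx:.1f} dBm, margin {margin:.1f} dB")
```

Evaluating this calculation along a trajectory, for every ground asset and time step, is essentially what turns a static link budget into the dynamic link analysis the tool performs.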