Science.gov

Sample records for complex software specifications

  1. 77 FR 50726 - Software Requirement Specifications for Digital Computer Software and Complex Electronics Used in...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

...The U.S. Nuclear Regulatory Commission (NRC or the Commission) is issuing for public comment draft regulatory guide (DG), DG-1209, "Software Requirement Specifications for Digital Computer Software and Complex Electronics Used in Safety Systems of Nuclear Power Plants." DG-1209 is the proposed Revision 1 of RG 1.172, dated September 1997. This revision endorses, with clarifications, the......

  2. Master Software Requirements Specification

    NASA Technical Reports Server (NTRS)

    Hu, Chaumin

    2003-01-01

    A basic function of a computational grid such as the NASA Information Power Grid (IPG) is to allow users to execute applications on remote computer systems. The Globus Resource Allocation Manager (GRAM) provides this functionality in the IPG and many other grids at this time. While the functionality provided by GRAM clients is adequate, GRAM does not support useful features such as staging several sets of files, running more than one executable in a single job submission, and maintaining historical information about execution operations. This specification is intended to provide the environmental and software functional requirements for the IPG Job Manager V2.0 being developed by AMTI for NASA.

  3. Compositional Specification of Software Architecture

    NASA Technical Reports Server (NTRS)

    Penix, John; Lau, Sonie (Technical Monitor)

    1998-01-01

This paper describes our experience using parameterized algebraic specifications to model properties of software architectures. The goal is to model the decomposition of requirements independent of the style used to implement the architecture. We begin by providing an overview of the role of architecture specification in software development. We then describe how architecture specifications are built up from component and connector specifications and give an overview of insights gained from a case study used to validate the method.

  4. SSL: A software specification language

    NASA Technical Reports Server (NTRS)

    Austin, S. L.; Buckles, B. P.; Ryan, J. P.

    1976-01-01

SSL (Software Specification Language) is a new formalism for defining specifications for software systems. The language provides a linear format for representing the information normally displayed in a two-dimensional module interdependency diagram. Compared with algorithmic (procedural) languages such as FORTRAN or ALGOL, SSL is largely complementary: it can explicitly represent module interconnections and global data flow, information that is deeply embedded in the algorithmic languages. On the other hand, SSL is not designed to depict the control flow within modules. The SSL level of software design explicitly depicts intermodule data flow as a functional specification.
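The core idea, a linear textual form from which module interconnections and global data flow can be recovered, can be sketched in a few lines. The mini-notation below is invented for illustration and is not SSL's actual syntax.

```python
# Hypothetical sketch: each module declares the global data items it
# consumes and produces; module-to-module data-flow edges (the arrows of
# a module interdependency diagram) are then derived automatically.

modules = {
    "READ_INPUT":   {"consumes": [],             "produces": ["raw_data"]},
    "VALIDATE":     {"consumes": ["raw_data"],   "produces": ["clean_data"]},
    "TRANSFORM":    {"consumes": ["clean_data"], "produces": ["results"]},
    "WRITE_OUTPUT": {"consumes": ["results"],    "produces": []},
}

def interconnections(mods):
    """Derive (producer, consumer, data item) edges from the declarations."""
    producers = {}
    for name, decl in mods.items():
        for item in decl["produces"]:
            producers[item] = name
    edges = []
    for name, decl in mods.items():
        for item in decl["consumes"]:
            edges.append((producers[item], name, item))
    return edges

for src, dst, item in interconnections(modules):
    print(f"{src} --{item}--> {dst}")
```

Note that control flow within a module never appears in the declarations, matching the abstract's point that SSL depicts intermodule data flow rather than intramodule logic.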

  5. Reflight certification software design specifications

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The PDSS/IMC Software Design Specification for the Payload Development Support System (PDSS)/Image Motion Compensator (IMC) is contained. The PDSS/IMC is to be used for checkout and verification of the IMC flight hardware and software by NASA/MSFC.

  6. Does software design complexity affect maintenance effort?

    NASA Technical Reports Server (NTRS)

    Epping, Andreas; Lott, Christopher M.

    1994-01-01

    The design complexity of a software system may be characterized within a refinement level (e.g., data flow among modules), or between refinement levels (e.g., traceability between the specification and the design). We analyzed an existing set of data from NASA's Software Engineering Laboratory to test whether changing software modules with high design complexity requires more personnel effort than changing modules with low design complexity. By analyzing variables singly, we identified strong correlations between software design complexity and change effort for error corrections performed during the maintenance phase. By analyzing variables in combination, we found patterns which identify modules in which error corrections were costly to perform during the acceptance test phase.
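The single-variable analysis described above, correlating a module's design complexity with the effort to change it, can be illustrated with a simple Pearson correlation. The data below are made up for illustration; they are not the SEL data set.

```python
# Illustrative sketch: does higher design complexity correlate with
# higher change effort? (Hypothetical per-module data.)
import math

complexity = [3, 5, 8, 12, 15, 20]   # e.g., inter-module fan-out per module
effort_hrs = [2, 4, 7, 11, 13, 19]   # personnel hours per error correction

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(complexity, effort_hrs)
print(f"correlation between design complexity and change effort: r = {r:.3f}")
```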

  7. Software Complexity Threatens Performance Portability

    SciTech Connect

    Gamblin, T.

    2015-09-11

    Modern HPC software packages are rarely self-contained. They depend on a large number of external libraries, and many spend large fractions of their runtime in external subroutines. Performance portability depends not only on the effort of application teams, but also on the availability of well-tuned libraries. At most sites, the burden of maintaining libraries is shared by code teams and facilities. Facilities typically provide well-tuned default versions, but code teams frequently build with bleeding-edge compilers to achieve high performance. For this reason, HPC has no “standard” software stack, unlike other domains where performance is not critical. Incompatibilities among compilers and software versions force application teams and facility staff to re-build custom versions of libraries for each new toolchain. Because the number of potential configurations is combinatorial, and because HPC software is notoriously difficult to port to new machines [3, 7, 8], the tuning effort required to support and maintain performance-portable libraries outstrips the available manpower at most sites. Software complexity is a growing obstacle to performance portability for HPC.

  8. Software Performs Complex Design Analysis

    NASA Technical Reports Server (NTRS)

    2008-01-01

Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and the prediction of failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: 1) inadequate shape parameterization algorithms, and 2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its arbitrary shape deformation (ASD) capability, a major advancement in solving these two problems, to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing CFD designers to freely create their own shape parameters, eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation, which eliminates the extremely costly process of remeshing the grid for every desired shape change. The program can perform a design change in a markedly reduced amount of time; the traditional process of returning to the CAD model to reshape and then remesh has been known to take hours, days, even weeks or months, depending upon the size of the model.
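The idea behind smooth volumetric deformation, displacing existing grid points with a smooth field instead of remeshing, can be sketched as follows. The control point, displacement, and Gaussian weighting here are invented for illustration and are not Optimal Solutions' actual algorithm.

```python
# Hedged sketch: every grid point is shifted by a displacement scaled by
# a smooth (Gaussian) falloff around a control point, so the mesh deforms
# continuously and never needs to be regenerated.
import math

def deform(points, control, displacement, radius):
    """Shift each point by `displacement`, weighted by distance to `control`."""
    out = []
    for x, y, z in points:
        d2 = (x - control[0])**2 + (y - control[1])**2 + (z - control[2])**2
        w = math.exp(-d2 / (2 * radius**2))   # smooth weight in [0, 1]
        dx, dy, dz = (w * c for c in displacement)
        out.append((x + dx, y + dy, z + dz))
    return out

grid = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (5.0, 0.0, 0.0)]
moved = deform(grid, control=(0.0, 0.0, 0.0),
               displacement=(0.0, 1.0, 0.0), radius=1.0)
print(moved)  # nearby points move most; distant points barely move
```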

  9. Software requirements: Guidance and control software development specification

    NASA Technical Reports Server (NTRS)

    Withers, B. Edward; Rich, Don C.; Lowman, Douglas S.; Buckland, R. C.

    1990-01-01

    The software requirements for an implementation of Guidance and Control Software (GCS) are specified. The purpose of the GCS is to provide guidance and engine control to a planetary landing vehicle during its terminal descent onto a planetary surface and to communicate sensory information about that vehicle and its descent to some receiving device. The specification was developed using the structured analysis for real time system specification methodology by Hatley and Pirbhai and was based on a simulation program used to study the probability of success of the 1976 Viking Lander missions to Mars. Three versions of GCS are being generated for use in software error studies.

  10. Propagation and stability in software: A complex network perspective

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Wang, Ping

    2015-09-01

    In this paper, we attempt to understand the propagation and stability feature of large-scale complex software from the perspective of complex networks. Specifically, we introduced the concept of "propagation scope" to investigate the problem of change propagation in complex software. Although many complex software networks exhibit clear "small-world" and "scale-free" features, we found that the propagation scope of complex software networks is much lower than that of small-world networks and scale-free networks. Furthermore, because the design of complex software always obeys the principles of software engineering, we introduced the concept of "edge instability" to quantify the structural difference among complex software networks, small-world networks and scale-free networks. We discovered that the edge instability distribution of complex software networks is different from that of small-world networks and scale-free networks. We also found a typical structure that contributes to the edge instability distribution of complex software networks. Finally, we uncovered the correlation between propagation scope and edge instability in complex networks by eliminating the edges with different instability ranges.
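The "propagation scope" concept can be illustrated concretely: in a software dependency network, a change to one node may ripple along reverse dependencies, and a node's scope is the fraction of the network it can reach. The toy graph below is invented; the paper's definition may differ in detail.

```python
# Minimal sketch of propagation scope via breadth-first reachability.
from collections import deque

# Toy dependency network: an edge a -> b means "b depends on a", so a
# change in a can propagate to b.
graph = {
    "core": ["util", "io"],
    "util": ["app"],
    "io":   ["app"],
    "app":  [],
    "docs": [],
}

def propagation_scope(graph, start):
    """Fraction of other nodes reachable from `start`."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nbr in graph[node]:
            if nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return (len(seen) - 1) / (len(graph) - 1)  # exclude the start node

for node in graph:
    print(f"scope({node}) = {propagation_scope(graph, node):.2f}")
```

A low average scope across nodes, as the paper reports for software networks, would mean most changes stay localized despite small-world connectivity.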

  11. Development of simulation computer complex specification

    NASA Technical Reports Server (NTRS)

    1973-01-01

The Training Simulation Computer Complex Study was one of three studies contracted in support of preparations for procurement of a shuttle mission simulator for shuttle crew training. The subject study was concerned with definition of the software loads to be imposed on the computer complex to be associated with the shuttle mission simulator and the development of procurement specifications based on the resulting computer requirements. These procurement specifications cover the computer hardware and system software as well as the data conversion equipment required to interface the computer to the simulator hardware. The development of the necessary hardware and software specifications required the execution of a number of related tasks, which included: (1) simulation software sizing, (2) computer requirements definition, (3) data conversion equipment requirements definition, (4) system software requirements definition, (5) a simulation management plan, (6) a background survey, and (7) preparation of the specifications.

  12. Bias and design in software specifications

    NASA Technical Reports Server (NTRS)

    Straub, Pablo A.; Zelkowitz, Marvin V.

    1990-01-01

    Implementation bias in a specification is an arbitrary constraint in the solution space. Presented here is a model of bias in software specifications. Bias is defined in terms of the specification process and a classification of the attributes of the software product. Our definition of bias provides insight into both the origin and the consequences of bias. It also shows that bias is relative and essentially unavoidable. Finally, we describe current work on defining a measure of bias, formalizing our model, and relating bias to software defects.

  13. Specifications and programs for computer software validation

    NASA Technical Reports Server (NTRS)

    Browne, J. C.; Kleir, R.; Davis, T.; Henneman, M.; Haller, A.; Lasseter, G. L.

    1973-01-01

    Three software products developed during the study are reported and include: (1) FORTRAN Automatic Code Evaluation System, (2) the Specification Language System, and (3) the Array Index Validation System.

  14. Facing software complexity on large telescopes

    NASA Astrophysics Data System (ADS)

    Swart, Gerhard; Bester, Deon; Brink, Janus; Gumede, Clifford; Schalekamp, Hendrik J.

    2004-09-01

    The successful development of any complex control system requires a blend of good software management, an appropriate computer architecture and good software engineering. Due to the large number of controlled parts, high performance goals and required operational efficiency, the control systems for large telescopes are particularly challenging to develop and maintain. In this paper the authors highlight some of the specific challenges that need to be met by control system developers to meet the requirements within a limited budget and schedule. They share some of the practices applied during the development of the Southern African Large Telescope (SALT) and describe specific aspects of the design that contribute to meeting these challenges. The topics discussed include: development methodology, defining the level of system integration, computer architecture, interface management, software standards, language selection, user interface design and personnel selection. Time will reveal the full truth, but the authors believe that the significant progress achieved in commissioning SALT (now 6 months from telescope completion), can largely be attributed to the combined application of these practices and design concepts.

  15. Software complex for geophysical data visualization

    NASA Astrophysics Data System (ADS)

    Kryukov, Ilya A.; Tyugin, Dmitry Y.; Kurkin, Andrey A.; Kurkina, Oxana E.

    2013-04-01

The effectiveness of current research in geophysics is largely determined by the degree to which data processing and visualization exploit modern information technology. Realistic and informative visualization of the results of three-dimensional modeling of geophysical processes contributes significantly to the naturalness of physical modeling and a detailed view of the phenomena. The main difficulty is interpreting the results of the calculations: one must be able to observe the various parameters of the three-dimensional models, build sections on different planes to evaluate certain characteristics, and make rapid assessments. Programs for interpreting and visualizing simulations are used all over the world, for example software systems such as ParaView, Golden Software Surfer, Voxler, and Flow Vision. However, it is not always possible to solve the visualization problem with a single software package. Preprocessing, data transfer between packages, and setting up a uniform visualization style can turn into long and routine work. In addition, special display modes are sometimes required for specific data, and existing products tend to offer mostly common features that are not always fully applicable to special cases. Rendering dynamic data may require scripting languages, which does not relieve the user from writing code. Therefore, the task was to develop a new and original software complex for visualizing simulation results. Its primary features are briefly listed here. The software complex is a graphical application with a convenient and simple user interface that displays the results of the simulation. It can also interactively manipulate the image, resize the image without loss of quality, apply two-dimensional and three-dimensional regular grids, set coordinate axes with data labels, and perform slices of the data.

  16. Domain and Specification Models for Software Engineering

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui

    1992-01-01

This paper discusses our approach to representing application domain knowledge for specific software engineering tasks. Application domain knowledge is embodied in a domain model. Domain models are used to assist in the creation of specification models. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model. One aspect of the system, hierarchical organization, is described in detail.

  17. Domain specific software design for decision aiding

    NASA Technical Reports Server (NTRS)

    Keller, Kirby; Stanley, Kevin

    1992-01-01

McDonnell Aircraft Company (MCAIR) is involved in many large multi-discipline design and development efforts of tactical aircraft. These involve a number of design disciplines that must be coordinated to produce an integrated design and a successful product. Our interpretation of a domain specific software design (DSSD) is that of a representation or framework that is specialized to support a limited problem domain. A DSSD is an abstract software design that is shaped by the problem characteristics. This parallels the theme of object-oriented analysis and design of letting the problem model directly drive the design. The DSSD concept extends the notion of software reusability to include representations or frameworks. It supports the entire software life cycle and specifically leads to improved prototyping capability, supports system integration, and promotes reuse of software designs and supporting frameworks. The example presented in this paper is the task network architecture or design which was developed for the MCAIR Pilot's Associate program. The task network concept supported both module development and system integration within the domain of operator decision aiding. It is presented as an instance where a software design exhibited many of the attributes associated with the DSSD concept.

  18. Software engineering with application-specific languages

    NASA Technical Reports Server (NTRS)

    Campbell, David J.; Barker, Linda; Mitchell, Deborah; Pollack, Robert H.

    1993-01-01

    Application-Specific Languages (ASL's) are small, special-purpose languages that are targeted to solve a specific class of problems. Using ASL's on software development projects can provide considerable cost savings, reduce risk, and enhance quality and reliability. ASL's provide a platform for reuse within a project or across many projects and enable less-experienced programmers to tap into the expertise of application-area experts. ASL's have been used on several software development projects for the Space Shuttle Program. On these projects, the use of ASL's resulted in considerable cost savings over conventional development techniques. Two of these projects are described.
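The ASL idea, a tiny special-purpose notation that lets application-area experts state intent directly, can be sketched with a toy translator. The syntax and domain below are entirely invented and are not from the Space Shuttle projects described.

```python
# Toy illustration: a one-statement "language" in which a domain expert
# writes range checks in near-domain terms; a small translator turns each
# line into an executable predicate.

def parse_rule(line):
    """Parse rules of the (hypothetical) form: LIMIT <name> BETWEEN <lo> AND <hi>"""
    tokens = line.split()
    assert tokens[0] == "LIMIT" and tokens[2] == "BETWEEN" and tokens[4] == "AND"
    name, lo, hi = tokens[1], float(tokens[3]), float(tokens[5])
    return lambda readings: lo <= readings[name] <= hi

# Rules as an application expert might write them:
source = [
    "LIMIT temperature BETWEEN 10 AND 40",
    "LIMIT pressure BETWEEN 0.5 AND 2.0",
]
checks = [parse_rule(line) for line in source]

readings = {"temperature": 25.0, "pressure": 2.4}
print([check(readings) for check in checks])  # pressure rule fails here
```

The cost-saving claim in the abstract follows from this structure: the expert edits two declarative lines rather than the generated checking code.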

  19. Reducing the complexity of software systems - A strategic software perspective

    NASA Technical Reports Server (NTRS)

    Hornstein, Rhoda S.

    1992-01-01

    The results of a combined management and technical initiative aimed at reducing the size and complexity associated with developing operations planning, scheduling, and resource management software systems are presented. The initiative has produced operations concepts, functional requirements, system architectures, a comprehensive lexicon, and software tools to revolutionize the traditional software technology and development practices for planning, scheduling, and resource management systems used in space operations control centers. Examples of technology and practices to reduce complexity include a method for projecting design consequences from an operations concept, a universal architecture for heuristic algorithms, an object-oriented framework for describing large classes of problems that parametrically adapt to all domain peculiarities, the identification of general approaches which respond to changes with minimum impact on systems implementations, and a management structure for prototyping to minimize the risks of ill-conceived designs.

  20. Software errors and complexity: An empirical investigation

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Perricone, Berry T.

    1983-01-01

    The distributions and relationships derived from the change data collected during the development of a medium scale satellite software project show that meaningful results can be obtained which allow an insight into software traits and the environment in which it is developed. Modified and new modules were shown to behave similarly. An abstract classification scheme for errors which allows a better understanding of the overall traits of a software project is also shown. Finally, various size and complexity metrics are examined with respect to errors detected within the software yielding some interesting results.
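The kind of complexity metric examined in such studies can be sketched concretely. The function below approximates cyclomatic complexity by counting decision points in a syntax tree; it uses Python's `ast` module for illustration rather than any metric tool from the original study.

```python
# Hedged sketch: approximate cyclomatic complexity = decision points + 1.
import ast

def approx_cyclomatic(source):
    """Count if/for/while/boolean-operator nodes in a piece of source."""
    tree = ast.parse(source)
    decisions = sum(
        isinstance(node, (ast.If, ast.For, ast.While, ast.BoolOp))
        for node in ast.walk(tree)
    )
    return decisions + 1  # straight-line code has complexity 1

code = """
def classify(x):
    if x < 0:
        return "negative"
    for i in range(x):
        if i % 2 == 0 and i > 2:
            return i
    return "none"
"""
print(approx_cyclomatic(code))
```

Relating a metric like this to the number of errors detected per module is the essence of the empirical investigation described above.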

  21. Domain-specific functional software testing: A progress report

    NASA Technical Reports Server (NTRS)

    Nonnenmann, Uwe

    1992-01-01

Software Engineering is a knowledge intensive activity that involves defining, designing, developing, and maintaining software systems. In order to build effective systems to support Software Engineering activities, Artificial Intelligence techniques are needed. The application of Artificial Intelligence technology to Software Engineering is called Knowledge-based Software Engineering (KBSE). The goal of KBSE is to change the software life cycle such that software maintenance and evolution occur by modifying the specifications and then rederiving the implementation, rather than by directly modifying the implementation. The use of domain knowledge in developing KBSE systems is crucial. Our work is mainly related to one area of KBSE called automatic specification acquisition. One example is the WATSON prototype on which our current work is based. WATSON is an automatic programming system for formalizing specifications for telephone switching software, mainly restricted to POTS, i.e., plain old telephone service. Our current approach differs from other approaches in two opposing ways. On the one hand, we address a large and complex real-world problem instead of a 'toy domain' as in many research prototypes. On the other hand, to allow such scaling we had to relax the ambitious goal of complete automatic programming to the easier task of automatic testing.

  22. Incubator Display Software Cost Reduction Toolset Software Requirements Specification

    NASA Technical Reports Server (NTRS)

    Moran, Susanne; Jeffords, Ralph

    2005-01-01

    The Incubator Display Software Requirements Specification was initially developed by Intrinsyx Technologies Corporation (Intrinsyx) under subcontract to Lockheed Martin, Contract Number NAS2-02090, for the National Aeronautics and Space Administration (NASA) Ames Research Center (ARC) Space Station Biological Research Project (SSBRP). The Incubator Display is a User Payload Application (UPA) used to control an Incubator subrack payload for the SSBRP. The Incubator Display functions on-orbit as part of the subrack payload laptop, on the ground as part of the Communication and Data System (CDS) ground control system, and also as part of the crew training environment.

  23. Assessment Environment for Complex Systems Software Guide

    NASA Technical Reports Server (NTRS)

    2013-01-01

    This Software Guide (SG) describes the software developed to test the Assessment Environment for Complex Systems (AECS) by the West Virginia High Technology Consortium (WVHTC) Foundation's Mission Systems Group (MSG) for the National Aeronautics and Space Administration (NASA) Aeronautics Research Mission Directorate (ARMD). This software is referred to as the AECS Test Project throughout the remainder of this document. AECS provides a framework for developing, simulating, testing, and analyzing modern avionics systems within an Integrated Modular Avionics (IMA) architecture. The purpose of the AECS Test Project is twofold. First, it provides a means to test the AECS hardware and system developed by MSG. Second, it provides an example project upon which future AECS research may be based. This Software Guide fully describes building, installing, and executing the AECS Test Project as well as its architecture and design. The design of the AECS hardware is described in the AECS Hardware Guide. Instructions on how to configure, build and use the AECS are described in the User's Guide. Sample AECS software, developed by the WVHTC Foundation, is presented in the AECS Software Guide. The AECS Hardware Guide, AECS User's Guide, and AECS Software Guide are authored by MSG. The requirements set forth for AECS are presented in the Statement of Work for the Assessment Environment for Complex Systems authored by NASA Dryden Flight Research Center (DFRC). The intended audience for this document includes software engineers, hardware engineers, project managers, and quality assurance personnel from WVHTC Foundation (the suppliers of the software), NASA (the customer), and future researchers (users of the software). Readers are assumed to have general knowledge in the field of real-time, embedded computer software development.

  24. Scheduling Software for Complex Scenarios

    NASA Technical Reports Server (NTRS)

    2006-01-01

    Preparing a vehicle and its payload for a single launch is a complex process that involves thousands of operations. Because the equipment and facilities required to carry out these operations are extremely expensive and limited in number, optimal assignment and efficient use are critically important. Overlapping missions that compete for the same resources, ground rules, safety requirements, and the unique needs of processing vehicles and payloads destined for space impose numerous constraints that, when combined, require advanced scheduling. Traditional scheduling systems use simple algorithms and criteria when selecting activities and assigning resources and times to each activity. Schedules generated by these simple decision rules are, however, frequently far from optimal. To resolve mission-critical scheduling issues and predict possible problem areas, NASA historically relied upon expert human schedulers who used their judgment and experience to determine where things should happen, whether they will happen on time, and whether the requested resources are truly necessary.
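The "simple decision rules" of traditional schedulers can be made concrete with a greedy sketch: take activities in order and assign each to the earliest-available resource. The activities, durations, and "processing bay" framing below are invented for illustration.

```python
# Minimal greedy scheduler: each activity goes to whichever resource
# frees up first. This is exactly the kind of simple rule that can be
# far from optimal once real-world constraints are added.
import heapq

activities = [("fuel_check", 4), ("payload_mate", 6), ("avionics_test", 3),
              ("leak_test", 5), ("closeout", 2)]  # (name, duration)
n_bays = 2  # limited facilities

def greedy_schedule(activities, n_resources):
    # min-heap of (time the resource becomes free, resource id)
    free_at = [(0, r) for r in range(n_resources)]
    heapq.heapify(free_at)
    schedule = []
    for name, duration in activities:
        start, res = heapq.heappop(free_at)
        schedule.append((name, res, start, start + duration))
        heapq.heappush(free_at, (start + duration, res))
    return schedule

for name, res, start, end in greedy_schedule(activities, n_bays):
    print(f"{name}: bay {res}, t={start}..{end}")
```

Because this rule ignores ground rules, safety requirements, and inter-activity dependencies, a human expert or an advanced scheduler can often find a shorter or safer plan than the greedy one.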

  25. Software Accelerates Computing Time for Complex Math

    NASA Technical Reports Server (NTRS)

    2014-01-01

    Ames Research Center awarded Newark, Delaware-based EM Photonics Inc. SBIR funding to utilize graphic processing unit (GPU) technology- traditionally used for computer video games-to develop high-computing software called CULA. The software gives users the ability to run complex algorithms on personal computers with greater speed. As a result of the NASA collaboration, the number of employees at the company has increased 10 percent.

  26. Software analysis handbook: Software complexity analysis and software reliability estimation and prediction

    NASA Technical Reports Server (NTRS)

    Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron

    1994-01-01

    This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
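The reliability-estimation idea described above, a quantitative probability of failure-free operation, can be sketched with the simplest possible model: a constant failure rate estimated from observed inter-failure times. The handbook's actual models and JSC tools are more elaborate; the data here are made up.

```python
# Hedged sketch: exponential reliability model. Estimate the failure
# rate from inter-failure times, then compute P(no failure in t hours).
import math

interfailure_hours = [12.0, 30.0, 25.0, 44.0, 39.0]  # hypothetical test log

mtbf = sum(interfailure_hours) / len(interfailure_hours)  # mean time between failures
lam = 1.0 / mtbf                                          # failure rate per hour

def reliability(t_hours):
    """Probability of failure-free operation over t_hours (exponential model)."""
    return math.exp(-lam * t_hours)

print(f"MTBF = {mtbf:.1f} h, R(24 h) = {reliability(24):.3f}")
```

Design tradeoffs of the kind the handbook mentions then amount to asking how much testing (which raises the estimated MTBF) is needed to push R(t) above a required threshold.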

  27. Software Process Assurance for Complex Electronics (SPACE)

    NASA Technical Reports Server (NTRS)

    Plastow, Richard A.

    2007-01-01

Complex Electronics (CE) are now programmed to perform tasks that were previously handled in software, such as communication protocols. Many of the methods used to develop software bear a close resemblance to CE development. For instance, Field Programmable Gate Arrays (FPGAs) can have over a million logic gates, while system-on-chip (SOC) devices can combine a microprocessor, input and output channels, and sometimes an FPGA for programmability. With this increased intricacy, the possibility of software-like bugs such as incorrect design, logic, and unexpected interactions within the logic is great. Since CE devices are obscuring the hardware/software boundary, we propose that mature software methodologies may be utilized with slight modifications in the development of these devices. Software Process Assurance for Complex Electronics (SPACE) is a research project that looks at using standardized S/W Assurance/Engineering practices to provide an assurance framework for development activities. Tools such as checklists, best practices and techniques can be used to detect missing requirements and bugs earlier in the development cycle, creating a development process for CE that will be more easily maintained, consistent and configurable based on the device used.

  28. Software Process Assurance for Complex Electronics

    NASA Technical Reports Server (NTRS)

    Plastow, Richard A.

    2007-01-01

Complex Electronics (CE) now perform tasks that were previously handled in software, such as communication protocols. Many methods used to develop software bear a close resemblance to CE development. Field Programmable Gate Arrays (FPGAs) can have over a million logic gates, while system-on-chip (SOC) devices can combine a microprocessor, input and output channels, and sometimes an FPGA for programmability. With this increased intricacy, the possibility of software-like bugs such as incorrect design, logic, and unexpected interactions within the logic is great. With CE devices obscuring the hardware/software boundary, we propose that mature software methodologies may be utilized with slight modifications in the development of these devices. Software Process Assurance for Complex Electronics (SPACE) is a research project that used standardized S/W Assurance/Engineering practices to provide an assurance framework for development activities. Tools such as checklists, best practices and techniques were used to detect missing requirements and bugs earlier in the development cycle, creating a development process for CE that was more easily maintained, consistent and configurable based on the device used.

  29. Orbital flight simulation utility software unit specifications

    NASA Technical Reports Server (NTRS)

    Wilson, S. W.

    1986-01-01

The HP PASCAL source code contained in pages 6 through 104 was developed for the Mission Planning and Analysis Division (MPAD) and takes the place of detailed flow charts defining the specifications for a Utility Software Unit designed to support orbital flight simulators such as MANHANDLE and GREAS (General Research and Engineering Analysis Simulator). Besides providing basic input/output, mathematical, vector, matrix, quaternion, and statistical routines for such simulators, one of the primary functions of the Utility Software Unit is to isolate all system-dependent code in one well-defined compartment, thereby facilitating transportation of the simulations from one computer to another. Directives to the PASCAL compilers of the HP-9000 Series 200 PASCAL 3.0 operating system and the HP-9000 Series 500 HP-UX 5.0 operating system are also provided.

  10. Domain specific software architectures: Command and control

    NASA Technical Reports Server (NTRS)

    Braun, Christine; Hatch, William; Ruegsegger, Theodore; Balzer, Bob; Feather, Martin; Goldman, Neil; Wile, Dave

    1992-01-01

    GTE is the Command and Control contractor for the Domain Specific Software Architectures program. The objective of this program is to develop and demonstrate an architecture-driven, component-based capability for the automated generation of command and control (C2) applications. Such a capability will significantly reduce the cost of C2 applications development and will lead to improved system quality and reliability through the use of proven architectures and components. A major focus of GTE's approach is the automated generation of application components in particular subdomains. Our initial work in this area has concentrated in the message handling subdomain; we have defined and prototyped an approach that can automate one of the most software-intensive parts of C2 systems development. This paper provides an overview of the GTE team's DSSA approach and then presents our work on automated support for message processing.

  11. Requirements management system browser software requirements specification

    SciTech Connect

    Frank, D.D.

    1996-10-01

    The purpose of this document is to define the essential user requirements for the Requirements Management System Browser (RMSB) application. This includes specifications for the Graphical User Interface (GUI) and the supporting database structures. The RMSB application is needed to provide an easy-to-use PC-based interface for browsing system engineering data stored and managed in a UNIX software application. The system engineering data include functions, requirements, and architectures that make up the Tank Waste Remediation System (TWRS) technical baseline. This document also covers the requirements for a software application titled "RMSB Data Loader (RMSB-DL)," referred to as the "Parser." The Parser is needed to read and parse a data file and load the data structure supporting the Browser.

  12. Light duty utility arm software requirements specification

    SciTech Connect

    Kiebel, G.R.

    1995-12-18

    This document defines the software requirements for the integrated control and data acquisition system of the Light Duty Utility Arm (LDUA) System. It is intended to be used to guide the design of the application software, to be a basis for assessing the application software design, and to establish what is to be tested in the finished application software product.

  13. Working Notes from the 1992 AAAI Workshop on Automating Software Design. Theme: Domain Specific Software Design

    NASA Technical Reports Server (NTRS)

    Keller, Richard M. (Editor); Barstow, David; Lowry, Michael R.; Tong, Christopher H.

    1992-01-01

    The goal of this workshop is to identify different architectural approaches to building domain-specific software design systems and to explore issues unique to domain-specific (vs. general-purpose) software design. Some general issues that cut across the particular software design domain include: (1) knowledge representation, acquisition, and maintenance; (2) specialized software design techniques; and (3) user interaction and user interface.

  14. 78 FR 47015 - Software Requirement Specifications for Digital Computer Software Used in Safety Systems of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-02

    ... issued with a temporary identification as Draft Regulatory Guide, DG-1209 on August 22, 2012 (77 FR 50726... COMMISSION Software Requirement Specifications for Digital Computer Software Used in Safety Systems of... 1 of RG 1.172, ``Software Requirement Specifications for Digital Computer Software used in...

  15. Software Analyzes Complex Systems in Real Time

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Expert system software programs, also known as knowledge-based systems, are computer programs that emulate the knowledge and analytical skills of one or more human experts related to a specific subject. SHINE (Spacecraft Health Inference Engine) is one such program, a software inference engine (expert system) designed by NASA for the purpose of monitoring, analyzing, and diagnosing both real-time and non-real-time systems. It was developed to meet many of the Agency's demanding and rigorous artificial intelligence goals for current and future needs. NASA developed the sophisticated and reusable software based on the experience and requirements of its Jet Propulsion Laboratory's (JPL) Artificial Intelligence Research Group in developing expert systems for space flight operations, specifically the diagnosis of spacecraft health. It was designed to be efficient enough to operate in demanding real-time and limited-hardware environments, and to be utilized by non-expert-system applications written in conventional programming languages. The technology is currently used in several ongoing NASA applications, including the Mars Exploration Rovers, and was used in the Spacecraft Health Automatic Reasoning Pilot (SHARP) program for the diagnosis of telecommunication anomalies during the Voyager Neptune encounter. It is also finding applications outside of the Space Agency.

  16. Software for Simulating a Complex Robot

    NASA Technical Reports Server (NTRS)

    Goza, S. Michael

    2003-01-01

    RoboSim (Robot Simulation) is a computer program that simulates the poses and motions of the Robonaut, a developmental anthropomorphic robot that has a complex system of joints with 43 degrees of freedom and multiple modes of operation and control. RoboSim performs a full kinematic simulation of all degrees of freedom. It also includes interface components that duplicate the functionality of the real Robonaut's interface with control software and human operators. Essentially, users see no difference between the real Robonaut and the simulation. Consequently, new control algorithms can be tested by computational simulation, without risk to the Robonaut hardware and without using excessive Robonaut-hardware experimental time, which is always at a premium. Previously developed software incorporated into RoboSim includes Enigma (for graphical displays), OSCAR (for kinematical computations), and NDDS (for communication between the Robonaut and external software). In addition, RoboSim incorporates unique inverse-kinematical algorithms for chains of joints that have fewer than six degrees of freedom (e.g., finger joints). In comparison with the algorithms of OSCAR, these algorithms are more readily adaptable and provide better results when using equivalent sets of data.
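    Inverse kinematics for a reduced-DOF chain of the kind mentioned for finger joints can be illustrated with the closed-form solution for a planar two-link chain. This is a generic textbook sketch, not RoboSim's or OSCAR's actual algorithm, and the link lengths and target point are hypothetical:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Closed-form inverse kinematics for a planar two-link chain
    (elbow-down branch). Returns joint angles (t1, t2) in radians."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if abs(c2) > 1.0:
        raise ValueError("target out of reach")
    t2 = math.acos(c2)  # elbow angle
    t1 = math.atan2(y, x) - math.atan2(l2 * math.sin(t2),
                                       l1 + l2 * math.cos(t2))
    return t1, t2

def two_link_fk(t1, t2, l1, l2):
    """Forward kinematics, used here to verify the IK solution."""
    return (l1 * math.cos(t1) + l2 * math.cos(t1 + t2),
            l1 * math.sin(t1) + l2 * math.sin(t1 + t2))
```

    A forward-kinematics round trip is a cheap way to validate such a solver, which is essentially what a simulator like RoboSim enables at system scale.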

  17. Salvo: Seismic imaging software for complex geologies

    SciTech Connect

    OBER,CURTIS C.; GJERTSEN,ROB; WOMBLE,DAVID E.

    2000-03-01

    This report describes Salvo, a three-dimensional seismic-imaging software package for complex geologies. Regions of complex geology, such as overthrusts and salt structures, can cause difficulties for many seismic-imaging algorithms used in production today. The paraxial wave equation and finite-difference methods used within Salvo can produce high-quality seismic images in these difficult regions. However, this approach comes with higher computational costs, which have been too expensive for standard production. Salvo uses improved numerical algorithms and methods, along with parallel computing, to produce high-quality images and to reduce the computational and data input/output (I/O) costs. This report documents the numerical algorithms implemented for the paraxial wave equation, including absorbing boundary conditions, phase corrections, imaging conditions, phase encoding, and reduced-source migration. It also describes I/O algorithms for large seismic data sets and images, parallelization methods used to obtain high efficiencies for both the computations and the I/O of seismic data sets, the steps required to compile, port, and optimize the Salvo software, and the validation data sets used to help verify a working copy of Salvo.
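    The role of absorbing boundary conditions in such finite-difference schemes can be sketched in one dimension. The toy scheme below (second-order differences for the full wave equation with first-order Mur absorbing boundaries) is only an illustration of the concept, not Salvo's 3-D paraxial implementation; all grid parameters are arbitrary:

```python
import math

def propagate(nx=201, nt=600, c=1.0, dx=1.0, dt=0.5):
    """Second-order finite differences for u_tt = c^2 u_xx with
    first-order absorbing (Mur) boundaries at both ends. A Gaussian
    pulse splits, travels outward, and exits with little reflection."""
    r = c * dt / dx                      # Courant number (must be <= 1)
    k = (c * dt - dx) / (c * dt + dx)    # Mur boundary coefficient
    u_prev = [math.exp(-((i - nx // 2) * dx) ** 2 / 50.0) for i in range(nx)]
    u = u_prev[:]                        # zero initial velocity
    for _ in range(nt):
        u_next = u[:]
        for i in range(1, nx - 1):
            u_next[i] = (2 * u[i] - u_prev[i]
                         + r * r * (u[i + 1] - 2 * u[i] + u[i - 1]))
        u_next[0] = u[1] + k * (u_next[1] - u[0])        # left ABC
        u_next[-1] = u[-2] + k * (u_next[-2] - u[-1])    # right ABC
        u_prev, u = u, u_next
    return u
```

    Without the boundary updates, the pulse would reflect off the grid edges and contaminate the interior, which is exactly the artifact absorbing boundary conditions suppress in imaging codes.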

  18. NASA software specification and evaluation system design, part 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The research to develop methods for reducing the effort expended in software development and verification is reported. The development of a formal software requirements methodology, a formal specifications language, a programming language, a language preprocessor, and code analysis tools is discussed.

  19. Formalization and visualization of domain-specific software architectures

    NASA Technical Reports Server (NTRS)

    Bailor, Paul D.; Luginbuhl, David R.; Robinson, John S.

    1992-01-01

    This paper describes a domain-specific software design system based on the concepts of software architectures engineering and domain-specific models and languages. In this system, software architectures are used as high level abstractions to formulate a domain-specific software design. The software architecture serves as a framework for composing architectural fragments (e.g., domain objects, system components, and hardware interfaces) that make up the knowledge (or model) base for solving a problem in a particular application area. A corresponding software design is generated by analyzing and describing a system in the context of the software architecture. While the software architecture serves as the framework for the design, this concept is insufficient by itself for supplying the additional details required for a specific design. Additional domain knowledge is still needed to instantiate components of the architecture and develop optimized algorithms for the problem domain. One possible way to obtain the additional details is through the use of domain-specific languages. Thus, the general concept of a software architecture and the specific design details provided by domain-specific languages are combined to create what can be termed a domain-specific software architecture (DSSA).

  20. NASA software specification and evaluation system design, part 2

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A survey and analysis of the existing methods, tools, and techniques employed in the development of software are presented, along with recommendations for the construction of reliable software. Functional designs for a software specification language and a database verifier are presented.

  1. NASA software specification and evaluation system: Software verification/validation techniques

    NASA Technical Reports Server (NTRS)

    1977-01-01

    NASA software requirement specifications were used in the development of a system for validating and verifying computer programs. The software specification and evaluation system (SSES) provides for the effective and efficient specification, implementation, and testing of computer software programs. The system as implemented will produce structured FORTRAN or ANSI FORTRAN programs, but the principles upon which SSES is designed allow it to be easily adapted to other high order languages.

  2. Preparation guide for class B software specification documents

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1979-01-01

    General conceptual requirements and specific application rules and procedures are provided for the production of software specification documents in conformance with deep space network software standards and class B standards. Class B documentation is identified as the appropriate level applicable to implementation, sustaining engineering, and operational uses by qualified personnel. Special characteristics of class B documents are defined.

  3. Industrial Source Complex (ISC) dispersion model. Software

    SciTech Connect

    Schewe, G.; Sieurin, E.

    1980-01-01

    The model updates various EPA dispersion-model algorithms and combines them in two computer programs that can be used to assess the air quality impact of emissions from the wide variety of source types associated with an industrial source complex. The ISC Model short-term program, ISCST, an updated version of the EPA Single Source (CRSTER) Model, uses sequential hourly meteorological data to calculate values of average concentration or total dry deposition for time periods of 1, 2, 3, 4, 6, 8, 12, and 24 hours. Additionally, ISCST may be used to calculate 'N'-day averages, where 'N' may be as large as 366 days. The ISC Model long-term computer program, ISCLT, a sector-averaged model that updates and combines basic features of the EPA Air Quality Display Model (AQDM) and the EPA Climatological Dispersion Model (CDM), uses STAR summaries to calculate seasonal and/or annual average concentration or total deposition values. Both the ISCST and ISCLT programs make the same basic dispersion-model assumptions, and both use either a polar or a Cartesian receptor grid. Software Description: The programs are written in the FORTRAN IV programming language for implementation on a UNIVAC 1110 computer and also on medium-to-large IBM or CDC systems. 65,000 words of core storage are required to operate the model.
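    The Gaussian-plume relation that underlies dispersion models of this family can be written down directly. The following is the standard textbook form with total ground reflection, not the exact algorithm coded in ISCST/ISCLT, and the example inputs are arbitrary:

```python
import math

def plume_concentration(q, u, y, z, h, sigma_y, sigma_z):
    """Gaussian plume concentration (g/m^3) at crosswind distance y (m)
    and height z (m), for emission rate q (g/s), wind speed u (m/s),
    effective stack height h (m), and dispersion parameters sigma_y,
    sigma_z (m). Includes the image-source term that models total
    reflection of the plume at the ground."""
    lateral = math.exp(-y * y / (2.0 * sigma_y ** 2))
    vertical = (math.exp(-(z - h) ** 2 / (2.0 * sigma_z ** 2))
                + math.exp(-(z + h) ** 2 / (2.0 * sigma_z ** 2)))
    return q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical
```

    For a ground-level source evaluated at the ground-level centerline (y = z = h = 0), the two vertical terms coincide and the expression reduces to q / (pi * u * sigma_y * sigma_z), a convenient sanity check.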

  4. SEPAC flight software detailed design specifications, volume 1

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The detailed design specifications (as built) for the SEPAC Flight Software are defined. The design includes a description of the total software system and of each individual module within the system. The design specifications describe the decomposition of the software system into its major components. The system structure is expressed in the following forms: the control-flow hierarchy of the system, the data-flow structure of the system, the task hierarchy, the memory structure, and the software-to-hardware configuration mapping. The component design description includes details on the following elements: register conventions, module (subroutine) invocation, module functions, interrupt servicing, data definitions, and database structure.

  5. Tips on Creating Complex Geometry Using Solid Modeling Software

    ERIC Educational Resources Information Center

    Gow, George

    2008-01-01

    Three-dimensional computer-aided drafting (CAD) software, sometimes referred to as "solid modeling" software, is easy to learn, fun to use, and becoming the standard in industry. However, many users have difficulty creating complex geometry with the solid modeling software. And the problem is not entirely a student problem. Even some teachers and…

  6. Reasoning about software specifications - A case study

    NASA Technical Reports Server (NTRS)

    Wild, Chris; Chen, JI; Eckhardt, Dave

    1989-01-01

    The launch intercept condition (LIC) problem was analyzed to better understand the complexity of the automated reasoning process and the nature and role of background knowledge. An extension to resolution-based logic processing systems, called constraint logic programming, is described. Consideration is also given to a knowledge-driven incremental reasoning architecture which integrates a generic constraint logic programming system with a knowledge base of the domain.

  7. Early Childhood Educational Software: Specific Features and Issues of Localization

    ERIC Educational Resources Information Center

    Nikolopoulou, Kleopatra

    2007-01-01

    The computer has now become a recognized tool in the education of young children and when used appropriately can reinforce their learning experiences. This paper reviews specific features (relating to pedagogic design, software content and user-interface design) of early childhood educational software and discusses issues in favor of its…

  8. Analyzing Software Specifications for Mode Confusion Potential

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy G.; Pinnel, L. Denise; Sandys, Sean David; Koga, Shuichi; Reese, Jon Damon

    1998-01-01

    Increased automation in complex systems has led to changes in the human controller's role and to new types of technology-induced human error. Attempts to mitigate these errors have primarily involved giving more authority to the automation, enhancing operator training, or changing the interface. While these responses may be reasonable under many circumstances, an alternative is to redesign the automation in ways that do not reduce necessary or desirable functionality or to change functionality where the tradeoffs are judged to be acceptable. This paper describes an approach to detecting error-prone automation features early in the development process while significant changes can still be made to the conceptual design of the system. The information about such error-prone features can also be useful in the design of the operator interface, operational procedures, or operator training.

  9. A measurement system for large, complex software programs

    NASA Technical Reports Server (NTRS)

    Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.

    1994-01-01

    This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.
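    A multiplicative quality model of the kind described, in which an expected number of discovered errors is driven by size and scaled by criticality, environment, and team-competence factors, can be sketched as follows. The functional form and every parameter value here are hypothetical illustrations, not the calibrated models from the paper:

```python
def expected_errors(ksloc, criticality=1.0, environment=1.0,
                    competence=1.0, base_rate=5.0):
    """Expected number of discovered errors for a product of `ksloc`
    thousand source lines. A base error rate (errors per KSLOC) is
    scaled by multiplicative adjustment factors; values > 1.0 model
    conditions that increase the error yield. All numbers are
    illustrative, not calibrated."""
    return base_rate * ksloc * criticality * environment * competence
```

    Models of this shape make the cost/quality trade-off explicit: tightening any one factor changes the error forecast, and therefore the labor budgeted for independent verification and validation.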

  10. Independent verification and validation of large software requirement specification databases

    SciTech Connect

    Twitchell, K.E.

    1992-04-01

    To enhance quality, an independent verification and validation (IV&V) review is conducted as software requirements are defined. Requirements are inspected for consistency and completeness. IV&V strives to detect defects early in the software development life cycle and to prevent problems before they occur. The IV&V review process of a massive software requirements specification, the Reserve Component Automation System (RCAS) Functional Description (FD), is explored. Analysis of the RCAS FD error history determined that there are no predictors of errors. The size of the FD mandates electronic analysis of the databases. Software which successfully performs automated consistency and completeness checks is discussed. The process of verifying the quality of analysis software is described. The use of intuitive ad hoc techniques, in addition to the automatic analysis of the databases, is required because of the varying content of the requirements databases. The ad hoc investigation process is discussed. Case studies are provided to illustrate how the process works. This thesis demonstrates that it is possible to perform an IV&V review on a massive software requirements specification. Automatic analysis enables inspecting for completeness and consistency. The work with the RCAS FD clearly indicates that the IV&V review process is not static; it must continually grow, adapt, and change as conditions warrant. The ad hoc investigation process provides this required flexibility. This process also analyzes errors discovered by manual review and automatic processing. The analysis results in the development of new algorithms and the addition of new programs to the automatic inspection software.
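    Automated consistency and completeness checks of the kind described reduce to set comparisons between database tables. The sketch below assumes a hypothetical schema in which each requirement record names the function it covers; the RCAS FD's real schema is not published here:

```python
def check_requirements(functions, requirements):
    """Completeness: every defined function is covered by at least one
    requirement. Consistency: every requirement references a defined
    function. Returns (uncovered_functions, dangling_requirements)."""
    covered = {req["function"] for req in requirements}
    uncovered = sorted(set(functions) - covered)
    dangling = sorted(covered - set(functions))
    return uncovered, dangling
```

    Checks like this scale to very large databases precisely because they are mechanical; the ad hoc investigation described above handles what such rules cannot express.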

  11. AM, administrative software ease complex Maryland job

    SciTech Connect

    Troch, S.J.; Agnes, D.C.; Catonzaro, J.S.; Oberlechner, L.E.

    1995-06-01

    A gas distribution looping project, in three segments that traversed a complete range of installation and alignment issues, was recently completed by Baltimore Gas and Electric Co. (BG&E) in northern Maryland. The major projects unit in the company's gas system engineering and design section was responsible for total oversight of the three projects. This included design, engineering, permitting, right-of-way acquisition, construction, testing and restoration, as well as liaison with other company divisions. A specially selected subcontractor team was organized to provide the latest technology. A project management system, composed mainly of personal computer applications, was implemented to provide: engineering and design coordination; accurate interface among easement, real estate acquisition data, plats, surveys, permitting and design documents; accurate right-of-way identification; data storage and accessibility of all real estate information for use in design and budgeting; an interface of environmental conditions with topography and design; and a computer database compatible with existing computer libraries and industry-available software, for producing drawings. Controls for project costs, budget and schedule were provided by the project management system. This was accomplished by interaction of four data systems: real estate, accounting/budget, geographical information system (GIS), and global positioning system (GPS). Construction progress was monitored with a scheduling application that ultimately provided justification for contractor progress payments. The amount of pipe laid in any given time span, as documented by field inspector reports, was entered into the scheduling application. The scheduling software calculated the percent completed and provided information for monitoring progress.

  12. Psychosocial Risks Generated By Assets Specific Design Software

    NASA Astrophysics Data System (ADS)

    Remus, Furtună; Angela, Domnariu; Petru, Lazăr

    2015-07-01

    Human work activity in any occupation results from the interaction between psycho-biological, socio-cultural, and organizational-occupational factors. Technological development, automation, and computerization, which are found in all branches of activity, the speed at which things develop, and their growing complexity require fewer and fewer physical aptitudes and more cognitive qualifications. The person included in the work process is bound in most cases to adapt to the organizational-occupational situations that are specific to the demands of the job. The role of the programmer is essential in the process of developing commissioned software, and truly brilliant ideas can only come from well-rested minds concentrated on their tasks. The actual requirements of these jobs, besides their many benefits and opportunities, also create a series of psychosocial risks, which can increase the level of stress during work activity, especially for those who work under pressure.

  13. The KASE approach to domain-specific software systems

    NASA Technical Reports Server (NTRS)

    Bhansali, Sanjay; Nii, H. Penny

    1992-01-01

    Designing software systems, like all design activities, is a knowledge-intensive task. Several studies have found that the predominant cause of failures among system designers is lack of knowledge: knowledge about the application domain, knowledge about design schemes, knowledge about design processes, etc. The goal of domain-specific software design systems is to explicitly represent knowledge relevant to a class of applications and use it to partially or completely automate various aspects of designing systems within that domain. The hope is that this would reduce the intellectual burden on the human designers and lead to more efficient software development. In this paper, we present a domain-specific system built on top of KASE, a knowledge-assisted software engineering environment being developed at the Stanford Knowledge Systems Laboratory. We introduce the main ideas underlying the construction of domain-specific systems within KASE, illustrate the application of these ideas in the synthesis of a system for tracking aircraft from radar signals, and discuss some of the issues in constructing domain-specific systems.

  14. Software Requirements Specification Verifiable Fuel Cycle Simulation (VISION) Model

    SciTech Connect

    D. E. Shropshire; W. H. West

    2005-11-01

    The purpose of this Software Requirements Specification (SRS) is to define the top-level requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). This simulation model is intended to serve a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including costs estimates) and Generation IV reactor development studies.

  15. Complexity for Artificial Substrates (CASU): Software for Creating and Visualising Habitat Complexity

    PubMed Central

    Loke, Lynette H. L.; Jachowski, Nicholas R.; Bouma, Tjeerd J.; Ladle, Richard J.; Todd, Peter A.

    2014-01-01

    Physical habitat complexity regulates the structure and function of biological communities, although the mechanisms underlying this relationship remain unclear. Urbanisation, pollution, unsustainable resource exploitation and climate change have resulted in the widespread simplification (and loss) of habitats worldwide. One way to restore physical complexity to anthropogenically simplified habitats is through the use of artificial substrates, which also offer excellent opportunities to explore the effects of different components (variables) of complexity on biodiversity and community structure that would be difficult to separate in natural systems. Here, we describe a software program (CASU) that enables users to visualise static, physical complexity. CASU also provides output files that can be used to create artificial substrates for experimental and/or restoration studies. It has two different operational modes: simple and advanced. In simple mode, users can adjust the five main variables of informational complexity (i.e. the number of object types, relative abundance of object types, density of objects, variability and range in the objects’ dimensions, and their spatial arrangement) and visualise the changes as they do so. The advanced mode allows users to design artificial substrates by fine-tuning the complexity variables as well as alter object-specific parameters. We illustrate how CASU can be used to create tiles of different designs for application in a marine environment. Such an ability to systematically influence physical complexity could greatly facilitate ecological restoration by allowing conservationists to rebuild complexity in degraded and simplified habitats. PMID:24551074
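    The five informational-complexity variables listed above map naturally onto the parameters of a small grid generator. The sketch below is a hypothetical illustration of that idea, not CASU's actual data model or output format:

```python
import random

def generate_tile(rows, cols, object_types, abundance, density,
                  size_range=(1.0, 1.0), seed=0):
    """Place objects on a rows x cols grid. The arguments correspond to
    CASU's five complexity variables: number of object types
    (len(object_types)), relative abundance of types (weights),
    density of objects (fill probability per cell), variability and
    range in object dimensions (size_range), and spatial arrangement
    (random here, reproducible via the seed)."""
    rng = random.Random(seed)
    tile = []
    for r in range(rows):
        row = []
        for c in range(cols):
            if rng.random() < density:
                kind = rng.choices(object_types, weights=abundance)[0]
                size = rng.uniform(*size_range)
                row.append((kind, round(size, 2)))
            else:
                row.append(None)
        tile.append(row)
    return tile
```

    Varying one parameter at a time while holding the others fixed is what lets experiments isolate the contribution of each complexity component, which is the design rationale the paper describes.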

  16. Computer Software Configuration Item-Specific Flight Software Image Transfer Script Generator

    NASA Technical Reports Server (NTRS)

    Bolen, Kenny; Greenlaw, Ronald

    2010-01-01

    A K-shell UNIX script enables the International Space Station (ISS) Flight Control Team (FCT) operators in NASA's Mission Control Center (MCC) in Houston to transfer an entire or partial computer software configuration item (CSCI) from a flight software compact disk (CD) to the onboard Portable Computer System (PCS). The tool is designed to read the content stored on a flight software CD and generate individual CSCI transfer scripts that are capable of transferring the flight software content in a given subdirectory on the CD to the scratch directory on the PCS. The flight control team can then transfer the flight software from the PCS scratch directory to the Electronically Erasable Programmable Read Only Memory (EEPROM) of an ISS Multiplexer/Demultiplexer (MDM) via the Indirect File Transfer capability. The individual CSCI scripts and the CSCI Specific Flight Software Image Transfer Script Generator (CFITSG), when executed a second time, will remove all components from their original execution. The tool will identify errors in the transfer process and create logs of the transferred software for the purposes of configuration management.


  18. Towards a Domain Specific Software Architecture for Scientific Data Distribution

    NASA Astrophysics Data System (ADS)

    Wilson, A.; Lindholm, D. M.

    2011-12-01

    A reference architecture is a "design that satisfies a clearly distinguished subset of the functional capabilities identified in the reference requirements within the boundaries of certain design and implementation constraints, also identified in reference requirements." [Tracz, 1995] Recognizing the value of a reference architecture, NASA's ESDSWG Standards Process Group (SPG) is introducing a multi-disciplinary science data systems (SDS) reference architecture in order to provide an implementation-neutral, template solution for an architecture to support scientific data systems in general [Burnett, et al, 2011]. This reference architecture describes common features and patterns in scientific data systems, and can thus provide guidelines in building and improving such systems. But guidelines alone may not be sufficient to actually build a system. A domain specific software architecture (DSSA) is "an assemblage of software components, specialized for a particular type of task (domain), generalized for effective use across that domain, composed in a standardized structure (topology) effective for building successful applications." [Tracz, 1995] It can be thought of as a relatively specific reference architecture. The "DSSA Process" is a software life cycle developed at Carnegie Mellon's Software Engineering Institute that is based on the development and use of domain-specific software architectures, components, and tools. The process has four distinct activities: 1) develop a domain-specific base/model, 2) populate and maintain the library, 3) build applications, 4) operate and maintain applications [Armitage, 1993]. The DSSA process may provide the missing link between guidelines and actual system construction. In this presentation we focus specifically on the realm of scientific data access and distribution. Assuming the role of domain experts in building data access systems, we report the results of creating a DSSA for scientific data distribution.

  19. Observation-Driven Configuration of Complex Software Systems

    NASA Astrophysics Data System (ADS)

    Sage, Aled

    2010-06-01

The ever-increasing complexity of software systems makes them hard to comprehend, predict and tune due to emergent properties and non-deterministic behaviour. Complexity arises from the size of software systems and the wide variety of possible operating environments: the increasing choice of platforms and communication policies leads to ever more complex performance characteristics. In addition, software systems exhibit different behaviour under different workloads. Many software systems are designed to be configurable so that policies can be chosen to meet the needs of various stakeholders. For complex software systems it can be difficult to accurately predict the effects of a change and to know which configuration is most appropriate. This thesis demonstrates that it is useful to run automated experiments that measure a selection of system configurations. Experiments can find configurations that meet the stakeholders' needs, find interesting behavioural characteristics, and help produce predictive models of the system's behaviour. The design and use of ACT (Automated Configuration Tool) for running such experiments is described, in combination with a number of search strategies for deciding on the configurations to measure. Design Of Experiments (DOE) is discussed, with emphasis on Taguchi Methods. These statistical methods have been used extensively in manufacturing, but have not previously been used for configuring software systems. The novel contribution here is an industrial case study, applying the combination of ACT and Taguchi Methods to DC-Directory, a product from Data Connection Ltd (DCL). The case study investigated the applicability of Taguchi Methods for configuring complex software systems. Taguchi Methods were found to be useful for modelling and configuring DC-Directory, making them a valuable addition to the techniques available to system administrators and developers.
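The orthogonal-array screening at the heart of Taguchi Methods can be sketched in a few lines. The factors, levels, and the measurement function below are invented for illustration; they are not taken from ACT or DC-Directory.

```python
# Hypothetical factors and levels; an L4(2^3) orthogonal array covers the
# main effects of three two-level factors in only four trials.
L4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]

FACTORS = ["cache_size", "thread_pool", "batch_writes"]
LEVELS = {
    "cache_size": [64, 256],
    "thread_pool": [4, 16],
    "batch_writes": [False, True],
}

def measure(config):
    """Stand-in for running the real system under a workload and recording
    a response such as mean latency (lower is better)."""
    return (100
            - 20 * config["cache_size"] / 256
            - 10 * config["thread_pool"] / 16
            - 5 * config["batch_writes"])

def main_effects():
    """Run the four array trials and return each factor's main effect
    (mean response at level 1 minus mean response at level 0)."""
    results = []
    for row in L4:
        config = {f: LEVELS[f][lvl] for f, lvl in zip(FACTORS, row)}
        results.append((row, measure(config)))
    effects = {}
    for i, f in enumerate(FACTORS):
        lo = sum(r for row, r in results if row[i] == 0) / 2
        hi = sum(r for row, r in results if row[i] == 1) / 2
        effects[f] = hi - lo  # negative => level 1 lowers (improves) latency
    return effects
```

Four trials suffice to rank the main effects of the three factors; an exhaustive two-level sweep would need eight, and the gap widens quickly as factors are added.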

  20. Solid Waste Information and Tracking System (SWITS) Software Requirements Specification

    SciTech Connect

    MAY, D.L.

    2000-03-22

This document is the primary document establishing requirements for the Solid Waste Information and Tracking System (SWITS) as it is converted to a client-server architecture. The purpose is to provide the customer and the performing organizations with the requirements for the SWITS in the new environment. This Software Requirements Specification (SRS) describes the system requirements for the SWITS Project, and follows the PHMC Engineering Requirements, HNF-PRO-1819, and Computer Software Quality Assurance Requirements, HNF-PRO-309, policies. This SRS includes sections on general description, specific requirements, references, appendices, and index. The SWITS system defined in this document stores information about the solid waste inventory on the Hanford site. Waste is tracked as it is generated, analyzed, shipped, stored, and treated. In addition to inventory reports, a number of reports for regulatory agencies are produced.

  1. The Influence of Software Complexity on the Maintenance Effort: Case Study on Software Developed within Educational Process

    ERIC Educational Resources Information Center

    Radulescu, Iulian Ionut

    2006-01-01

Software complexity is the most important software quality attribute and a very useful instrument in the study of software quality. It is one of the factors that affect most of the software quality characteristics, including maintainability. It is very important to quantify this influence and identify the means to keep it under control; by using…

  2. Engine Structures Analysis Software: Component Specific Modeling (COSMO)

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.; Maffeo, R. J.; Schwartz, S.

    1994-01-01

    A component specific modeling software program has been developed for propulsion systems. This expert program is capable of formulating the component geometry as finite element meshes for structural analysis which, in the future, can be spun off as NURB geometry for manufacturing. COSMO currently has geometry recipes for combustors, turbine blades, vanes, and disks. Component geometry recipes for nozzles, inlets, frames, shafts, and ducts are being added. COSMO uses component recipes that work through neutral files with the Technology Benefit Estimator (T/BEST) program which provides the necessary base parameters and loadings. This report contains the users manual for combustors, turbine blades, vanes, and disks.

  3. Engine structures analysis software: Component Specific Modeling (COSMO)

    NASA Astrophysics Data System (ADS)

    McKnight, R. L.; Maffeo, R. J.; Schwartz, S.

    1994-08-01

    A component specific modeling software program has been developed for propulsion systems. This expert program is capable of formulating the component geometry as finite element meshes for structural analysis which, in the future, can be spun off as NURB geometry for manufacturing. COSMO currently has geometry recipes for combustors, turbine blades, vanes, and disks. Component geometry recipes for nozzles, inlets, frames, shafts, and ducts are being added. COSMO uses component recipes that work through neutral files with the Technology Benefit Estimator (T/BEST) program which provides the necessary base parameters and loadings. This report contains the users manual for combustors, turbine blades, vanes, and disks.

  4. Managing scientific software complexity with Bocca and CCA

    SciTech Connect

    Benjamin, Allan A.; Norris, Boyana; Elwasif, Wael R; Armstrong, Robert C.

    2008-01-01

    In high-performance scientific software development, the emphasis is often on short time to first solution. Even when the development of new components mostly reuses existing components or libraries and only small amounts of new code must be created, dealing with the component glue code and software build processes to obtain complete applications is still tedious and error-prone. Component-based software meant to reduce complexity at the application level increases complexity to the extent that the user must learn and remember the interfaces and conventions of the component model itself. To address these needs, we introduce Bocca, the first tool to enable application developers to perform rapid component prototyping while maintaining robust software-engineering practices suitable to HPC environments. Bocca provides project management and a comprehensive build environment for creating and managing applications composed of Common Component Architecture components. Of critical importance for high-performance computing (HPC) applications, Bocca is designed to operate in a language-agnostic way, simultaneously handling components written in any of the languages commonly used in scientific applications: C, C++, Fortran, Python and Java. Bocca automates the tasks related to the component glue code, freeing the user to focus on the scientific aspects of the application. Bocca embraces the philosophy pioneered by Ruby on Rails for web applications: start with something that works, and evolve it to the user's purpose.

  5. Managing Scientific Software Complexity with Bocca and CCA

    DOE PAGESBeta

    Allan, Benjamin A.; Norris, Boyana; Elwasif, Wael R.; Armstrong, Robert C.

    2008-01-01

In high-performance scientific software development, the emphasis is often on short time to first solution. Even when the development of new components mostly reuses existing components or libraries and only small amounts of new code must be created, dealing with the component glue code and software build processes to obtain complete applications is still tedious and error-prone. Component-based software meant to reduce complexity at the application level increases complexity to the extent that the user must learn and remember the interfaces and conventions of the component model itself. To address these needs, we introduce Bocca, the first tool to enable application developers to perform rapid component prototyping while maintaining robust software-engineering practices suitable to HPC environments. Bocca provides project management and a comprehensive build environment for creating and managing applications composed of Common Component Architecture components. Of critical importance for high-performance computing (HPC) applications, Bocca is designed to operate in a language-agnostic way, simultaneously handling components written in any of the languages commonly used in scientific applications: C, C++, Fortran, Python and Java. Bocca automates the tasks related to the component glue code, freeing the user to focus on the scientific aspects of the application. Bocca embraces the philosophy pioneered by Ruby on Rails for web applications: start with something that works, and evolve it to the user's purpose.

  6. Air Traffic Complexity Measurement Environment (ACME): Software User's Guide

    NASA Technical Reports Server (NTRS)

    1996-01-01

A user's guide for the Air Traffic Complexity Measurement Environment (ACME) software is presented. The ACME consists of two major components: a complexity analysis tool and a user interface. The Complexity Analysis Tool (CAT) analyzes complexity off-line, producing data files which may be examined interactively via the Complexity Data Analysis Tool (CDAT). The Complexity Analysis Tool is composed of three independently executing processes that communicate via PVM (Parallel Virtual Machine) and Unix sockets. The Runtime Data Management and Control process (RUNDMC) extracts flight plan and track information from a SAR input file, and sends the information to GARP (Generate Aircraft Routes Process) and CAT (Complexity Analysis Task). GARP in turn generates aircraft trajectories, which are utilized by CAT to calculate sector complexity. CAT writes flight plan, track, and complexity data to an output file, which can be examined interactively. The Complexity Data Analysis Tool (CDAT) provides an interactive graphic environment for examining the complexity data produced by the Complexity Analysis Tool (CAT). CDAT can also play back track data extracted from System Analysis Recording (SAR) tapes. The CDAT user interface consists of a primary window, a controls window, and miscellaneous pop-ups. Aircraft track and position data are displayed in the main viewing area of the primary window. The controls window contains miscellaneous control and display items. Complexity data are displayed in pop-up windows. CDAT plays back sector complexity and aircraft track and position data as a function of time. Controls are provided to start and stop playback, adjust the playback rate, and reposition the display to a specified time.

  7. Fuzzy complexes: Specific binding without complete folding.

    PubMed

    Sharma, Rashmi; Raduly, Zsolt; Miskei, Marton; Fuxreiter, Monika

    2015-09-14

Specific molecular recognition is assumed to require a well-defined set of contacts and to be devoid of conformational and interaction ambiguities. Growing experimental evidence demonstrates, however, that structural multiplicity or dynamic disorder can be retained in protein complexes, termed fuzziness. Fuzzy regions establish alternative contacts between specific partners, usually via transient interactions. Nature often tailors the dynamic properties of these segments via post-translational modifications or alternative splicing to fine-tune affinity. Most experimentally characterized fuzzy complexes are involved in regulation of gene expression, signal transduction and cell-cycle regulation. Fuzziness is also characteristic of viral protein complexes, cytoskeleton structure and, surprisingly, a few metabolic enzymes. A plausible role of fuzzy complexes in increasing the half-life of intrinsically disordered proteins is also discussed. PMID:26226339

  8. Applications of Formal Methods to Specification and Safety of Avionics Software

    NASA Technical Reports Server (NTRS)

    Hoover, D. N.; Guaspari, David; Humenn, Polar

    1996-01-01

This report treats several topics in applications of formal methods to avionics software development. Most of these topics concern decision tables, an orderly, easy-to-understand format for formally specifying complex choices among alternative courses of action. The topics relating to decision tables include: generalizations of decision tables that are more concise and support the use of decision tables in a refinement-based formal software development process; a formalism for systems of decision tables with behaviors; an exposition of Parnas tables for users of decision tables; and test coverage criteria and decision tables. We outline features of a revised version of ORA's decision table tool, Tablewise, which will support many of the new ideas described in this report. We also survey formal safety analysis of specifications and software.
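As a hedged illustration of the decision-table format the report formalizes, a table can be represented directly as data, which makes checks such as completeness mechanical. The conditions and actions below are invented, not drawn from the report's avionics examples.

```python
from itertools import product

# A decision table as plain data: each rule maps a tuple of condition
# outcomes to exactly one action. Condition order: (altitude_ok, speed_ok).
TABLE = {
    (True,  True):  "continue",
    (True,  False): "reduce_speed",
    (False, True):  "climb",
    (False, False): "abort",
}

def decide(altitude_ok, speed_ok):
    """Look up the action prescribed for the given condition outcomes."""
    return TABLE[(altitude_ok, speed_ok)]

def is_complete(table, n_conditions):
    """Completeness check: every combination of condition outcomes is
    covered by exactly one rule, a basic well-formedness property of a
    decision table."""
    return set(table) == set(product([True, False], repeat=n_conditions))
```

Because the rules are data rather than nested conditionals, the same representation supports the report's other concerns, such as deriving test coverage criteria from the rule set.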

  9. Specific Language Impairment at Adolescence: Avoiding Complexity

    ERIC Educational Resources Information Center

    Tuller, Laurice; Henry, Celia; Sizaret, Eva; Barthez, Marie-Anne

    2012-01-01

    This study explores complex language in adolescents with specific language impairment (SLI) with the aim of finding out how aspects of language characteristic of typical syntactic development after childhood fare and, in particular, whether there is evidence that individuals with SLI avoid using structures whose syntactic derivation involves…

  10. Software programs that address site-specific inventory characteristics issues.

    SciTech Connect

    Dare, J. H.; Cournoyer, M. E.

    2001-01-01

The proper characterization of Hazardous, Mixed Low-Level, and Mixed Transuranic waste enhances productivity and safety. Hazardous material criteria that need to be considered include physical and health hazards inherent to the waste stream. Other factors that may influence characterization include particulate diameter, complexing or chelating agent properties, lead and mercury content, pressurized containers, and P-listed wastes. Meeting these requirements is only a simple matter of generating a database with the proper fields. Manufacturers' and institutional databases bank huge sources of information, such as work control documents, substance identification, container types, components of mixtures, physical property data, and regulatory data. In this report, the use of commercially available software programs to take advantage of these resources in addressing waste characterization issues is presented. The application of user-friendly programs eliminates part of the tediousness associated with the complex requirements of certifying to general waste acceptance criteria, with minimal impact on programmatic work. In other words, tapping into manufacturer and institutional databases provides a way to take advantage of the combined expertise of these resources in managing a cost-effective waste certification program, as well as adding a quality assurance element to the program.

  11. Specificity, promiscuity, and the structure of complex information processing networks

    NASA Astrophysics Data System (ADS)

    Myers, Christopher

    2006-03-01

    Both the top-down designs of engineered systems and the bottom-up serendipities of biological evolution must negotiate tradeoffs between specificity and control: overly specific interactions between components can make systems brittle and unevolvable, while more generic interactions can require elaborate control in order to aggregate specificity from distributed pieces. Complex information processing systems reveal network organizations that navigate this landscape of constraints: regulatory and signaling networks in cells involve the coordination of molecular interactions that are surprisingly promiscuous, and object-oriented design in software systems emphasizes the polymorphic composition of objects of minimal necessary specificity [C.R. Myers, Phys Rev E 68, 046116 (2003)]. Models of information processing arising both in systems biology and engineered computation are explored to better understand how particular network organizations can coordinate the activity of promiscuous components to achieve robust and evolvable function.

  12. SIM_EXPLORE: Software for Directed Exploration of Complex Systems

    NASA Technical Reports Server (NTRS)

    Burl, Michael; Wang, Esther; Enke, Brian; Merline, William J.

    2013-01-01

Physics-based numerical simulation codes are widely used in science and engineering to model complex systems that would be infeasible to study otherwise. While such codes may provide the highest-fidelity representation of system behavior, they are often so slow to run that insight into the system is limited. Trying to understand the effects of inputs on outputs by conducting an exhaustive grid-based sweep over the input parameter space is simply too time-consuming. An alternative approach called "directed exploration" has been developed to harvest information from numerical simulators more efficiently. The basic idea is to employ active learning and supervised machine learning to cleverly choose at each step which simulation trials to run next based on the results of previous trials. SIM_EXPLORE is a new computer program that uses directed exploration to efficiently explore complex systems represented by numerical simulations. The software sequentially identifies and runs simulation trials that it believes will be most informative given the results of previous trials. The results of new trials are incorporated into the software's model of the system behavior. The updated model is then used to pick the next round of new trials. This process, implemented as a closed-loop system wrapped around existing simulation code, provides a means to improve the speed and efficiency with which a set of simulations can yield scientifically useful results. The software focuses on the case in which the feedback from the simulation trials is binary-valued, i.e., the learner is only informed of the success or failure of the simulation trial to produce a desired output. The software offers a number of choices for the supervised learning algorithm (the method used to model the system behavior given the results so far) and a number of choices for the active learning strategy (the method used to choose which new simulation trials to run given the current behavior model).
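The closed-loop idea with binary feedback can be reduced to a toy sketch. The simulator stub and the bisection-style acquisition rule below are illustrative assumptions, not the learners or strategies SIM_EXPLORE actually provides.

```python
def simulate(x):
    """Stand-in for an expensive simulation run with binary feedback:
    here the trial succeeds iff the input parameter reaches 0.37."""
    return x >= 0.37

def directed_exploration(n_trials=12):
    """Closed loop: each trial is chosen from the results of previous
    trials. With one parameter and a monotone success region, the most
    informative next trial is the midpoint of the bracketing interval."""
    lo, hi = 0.0, 1.0           # assumed known failure / success endpoints
    history = []
    for _ in range(n_trials):
        x = (lo + hi) / 2       # pick the most informative trial
        ok = simulate(x)        # run the (expensive) trial
        history.append((x, ok))
        if ok:
            hi = x              # success boundary is at or below x
        else:
            lo = x              # success boundary is above x
    return (lo + hi) / 2, history
```

Twelve trials localize the success boundary to within about 2^-12 of the parameter range, where a uniform grid of the same resolution would cost thousands of runs; the real tool generalizes this trade-off to many dimensions and richer behavior models.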

  13. Orbital flight simulation utility software unit specifications, revision 1

    NASA Technical Reports Server (NTRS)

    Wilson, S. W.

    1986-01-01

The HP PASCAL source code defines the specifications for a Utility Software Unit (USU) designed to support orbital flight simulators such as MANHANDLE and GREAS (General Research and Engineering Analysis Simulator). Besides providing basic input/output, mathematical, matrix, quaternion, and statistical routines for such simulators, one of the primary functions of the USU is to isolate all system-dependent codes in one well-defined compartment, thereby facilitating transportation of the simulations from one computer to another. Directives are given for the PASCAL compilers of the HP-9000 Series 200 Pascal 3.0 and the HP-9000 Series 500 HP-UX 5.0 operating systems that produce a single file of relocatable code from four separate files of source code. Three of the source code files are common to both operating systems. The fourth source code file (utilspif.I) contains all of the system-dependent PASCAL code for the USU. A fifth file of source code written in C is required to interface utilspif.I with the HP-UX I/O package. The Pascal 3.0 compiler directives and the driver source code for a unit test program and counterparts for the HP-UX 5.0 operating system are given. The major portion of the unit test program source code is common to both operating systems. Unit test results from the Pascal 3.0 operating system and results from the HP-UX operating system are given.

  14. Computer software requirements specification for the world model light duty utility arm system

    SciTech Connect

    Ellis, J.E.

    1996-02-01

This Computer Software Requirements Specification defines the software requirements for the world model of the Light Duty Utility Arm (LDUA) System. It is intended to be used to guide the design of the application software, to be a basis for assessing the application software design, and to establish what is to be tested in the finished application software product. (This system deploys end effectors into underground storage tanks by means of a robotic arm on the end of a telescoping mast.)

  15. Software Transition Project Retrospectives and the Application of SEL Effort Estimation Model and Boehm's COCOMO to Complex Software Transition Projects

    NASA Technical Reports Server (NTRS)

    McNeill, Justin

    1995-01-01

The Multimission Image Processing Subsystem (MIPS) at the Jet Propulsion Laboratory (JPL) has managed transitions of application software sets from one operating system and hardware platform to multiple operating systems and hardware platforms. As a part of these transitions, cost estimates were generated from the personal experience of in-house developers and managers to calculate the total effort required for such projects. Productivity measures have been collected for two such transitions, one very large and the other relatively small in terms of source lines of code. These estimates used a cost estimation model similar to the Software Engineering Laboratory (SEL) Effort Estimation Model. Experience in transitioning software within JPL MIPS has uncovered a high incidence of interface complexity. Interfaces, both internal and external to individual software applications, have contributed to software transition project complexity, and thus to scheduling difficulties and larger than anticipated design work on software to be ported.
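For reference, the basic form of Boehm's COCOMO named in the title is effort = a · KLOC^b. A minimal sketch follows, using Boehm's published basic-model coefficients; the usage figure below is invented, and transition projects like those described would additionally need the cost drivers for interface complexity that the basic model omits.

```python
# Boehm's basic COCOMO: effort (person-months) = a * KLOC ** b, with
# the published (a, b) pairs for the three basic-model project classes.
COCOMO_BASIC = {
    "organic":      (2.4, 1.05),
    "semidetached": (3.0, 1.12),
    "embedded":     (3.6, 1.20),
}

def effort_person_months(kloc, mode="organic"):
    """Estimate total effort for a project of the given size and class."""
    a, b = COCOMO_BASIC[mode]
    return a * kloc ** b
```

A hypothetical 10-KLOC organic project, for example, comes out to roughly 27 person-months, while the same size in embedded mode roughly doubles the estimate, illustrating how strongly the project class weighs on the result.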

  16. CarbBuilder: Software for building molecular models of complex oligo- and polysaccharide structures.

    PubMed

    Kuttel, Michelle M; Ståhle, Jonas; Widmalm, Göran

    2016-08-15

    CarbBuilder is a portable software tool for producing three-dimensional molecular models of carbohydrates from the simple text specification of a primary structure. CarbBuilder can generate a wide variety of carbohydrate structures, ranging from monosaccharides to large, branched polysaccharides. Version 2.0 of the software, described in this article, supports monosaccharides of both mammalian and bacterial origin and a range of substituents for derivatization of individual sugar residues. This improved version has a sophisticated building algorithm to explore the range of possible conformations for a specified carbohydrate molecule. Illustrative examples of models of complex polysaccharides produced by CarbBuilder demonstrate the capabilities of the software. CarbBuilder is freely available under the Artistic License 2.0 from https://people.cs.uct.ac.za/~mkuttel/Downloads.html. © 2016 Wiley Periodicals, Inc. PMID:27317625

  17. CHSSI Software for Geometrically Complex Unsteady Aerodynamic Applications

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Meakin, Robert L.; Potsdam, Mark A.

    2001-01-01

    A comprehensive package of scalable overset grid CFD software is reviewed. The software facilitates accurate simulation of complete aircraft aerodynamics, including viscous effects, unsteadiness, and relative motion between component parts. The software significantly lowers the manpower and computer costs normally associated with such efforts. The software is discussed in terms of current capabilities and planned future enhancements.

  18. HSCT4.0 Application: Software Requirements Specification

    NASA Technical Reports Server (NTRS)

    Salas, A. O.; Walsh, J. L.; Mason, B. H.; Weston, R. P.; Townsend, J. C.; Samareh, J. A.; Green, L. L.

    2001-01-01

The software requirements for the High Performance Computing and Communication Program High Speed Civil Transport application project, referred to as HSCT4.0, are described. The objective of the HSCT4.0 application project is to demonstrate the application of high-performance computing techniques to the problem of multidisciplinary design optimization of a supersonic transport configuration, using high-fidelity analysis simulations. Descriptions of the various functions (and the relationships among them) that make up the multidisciplinary application as well as the constraints on the software design are provided. This document serves to establish an agreement between the suppliers and the customer as to what the HSCT4.0 application should do and provides to the software developers the information necessary to design and implement the system.

  19. Delivering Software Process-Specific Project Courses in Tertiary Education Environment: Challenges and Solution

    ERIC Educational Resources Information Center

    Rong, Guoping; Shao, Dong

    2012-01-01

    The importance of delivering software process courses to software engineering students has been more and more recognized in China in recent years. However, students usually cannot fully appreciate the value of software process courses by only learning methodology and principle in the classroom. Therefore, a process-specific project course was…

  20. Acquiring Software Project Specifications in a Virtual World

    ERIC Educational Resources Information Center

    Ng, Vincent; Tang, Zoe

    2012-01-01

    In teaching software engineering, it is often interesting to introduce real life scenarios for students to experience and to learn how to collect information from respective clients. The ideal arrangement is to have some real clients willing to spend time to provide their ideas of a target system through interviews. However, this arrangement…

  1. Assessment of the integration capability of system architectures from a complex and distributed software systems perspective

    NASA Astrophysics Data System (ADS)

    Leuchter, S.; Reinert, F.; Müller, W.

    2014-06-01

Procurement and design of system architectures capable of network centric operations demand an assessment scheme in order to compare different alternative realizations. In this contribution an assessment method for system architectures targeted at the C4ISR domain is presented. The method addresses the integration capability of software systems from a complex and distributed software system perspective, focusing on communication, interfaces, and software. The aim is to evaluate the capability to integrate a system or its functions within a system-of-systems network. This method uses approaches from software architecture quality assessment and applies them on the system architecture level. It features a specific goal tree of several dimensions that are relevant for enterprise integration. These dimensions have to be weighed against each other and aggregated using methods from normative decision theory in order to reflect the intention of the particular enterprise integration effort. The indicators and measurements for many of the considered quality features rely on a model-based view of systems, networks, and the enterprise. That means the method is applicable to system-of-systems specifications based on enterprise architectural frameworks relying on defined meta-models or domain ontologies for defining views and viewpoints. In the defense context we use the NATO Architecture Framework (NAF) to ground the respective system models. The proposed assessment method allows evaluating and comparing competing system designs with regard to their future integration potential. It is a contribution to the system-of-systems engineering methodology.
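The weighted aggregation step can be sketched as a simple additive value model, the most basic aggregation rule from normative decision theory. The dimension names, weights, and scores below are invented for illustration; they are not the paper's goal tree.

```python
# Stakeholder weights over assessment dimensions (must be chosen to
# reflect the intention of the particular integration effort).
WEIGHTS = {"interfaces": 0.5, "communication": 0.3, "software": 0.2}

# Normalized per-dimension scores for two competing system designs.
CANDIDATES = {
    "architecture_A": {"interfaces": 0.9, "communication": 0.4, "software": 0.7},
    "architecture_B": {"interfaces": 0.6, "communication": 0.8, "software": 0.8},
}

def total_score(scores):
    """Weighted sum of the normalized per-dimension scores."""
    return sum(WEIGHTS[dim] * scores[dim] for dim in WEIGHTS)

def rank(candidates):
    """Order competing system designs by aggregated score, best first."""
    return sorted(candidates,
                  key=lambda name: total_score(candidates[name]),
                  reverse=True)
```

Note how sensitive the ranking is to the weights: with these numbers the two designs are nearly tied, so a small shift of weight from interfaces to communication would reverse the outcome, which is why the weighting must be justified per integration effort.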

  2. As-built design specification for proportion estimate software subsystem

    NASA Technical Reports Server (NTRS)

    Obrien, S. (Principal Investigator)

    1980-01-01

    The Proportion Estimate Processor evaluates four estimation techniques in order to get an improved estimate of the proportion of a scene that is planted in a selected crop. The four techniques to be evaluated were provided by the techniques development section and are: (1) random sampling; (2) proportional allocation, relative count estimate; (3) proportional allocation, Bayesian estimate; and (4) sequential Bayesian allocation. The user is given two options for computation of the estimated mean square error. These are referred to as the cluster calculation option and the segment calculation option. The software for the Proportion Estimate Processor is operational on the IBM 3031 computer.
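The two simplest of the four techniques are easy to sketch. The stratum data below are invented and the code is a simplified reading of the abstract, not the IBM 3031 implementation.

```python
import random

def srs_estimate(labels, n, rng):
    """Technique 1, simple random sampling: the estimate is the sample
    proportion of units planted in the selected crop (label 1)."""
    sample = rng.sample(labels, n)
    return sum(sample) / n

def proportional_allocation_estimate(strata, n, rng):
    """Technique 2, proportional allocation: sample each stratum in
    proportion to its size and weight the per-stratum sample proportions
    by the stratum's share of the scene."""
    total = sum(len(s) for s in strata)
    estimate = 0.0
    for s in strata:
        k = max(1, round(n * len(s) / total))   # allocation for this stratum
        sample = rng.sample(s, min(k, len(s)))
        estimate += (len(s) / total) * (sum(sample) / len(sample))
    return estimate
```

When the strata (clusters or segments) are more homogeneous than the scene as a whole, the stratified estimate attains a lower mean square error for the same sample size, which is the comparison the processor was built to make.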

  3. Modeling Complex Cross-Systems Software Interfaces Using SysML

    NASA Technical Reports Server (NTRS)

    Mandutianu, Sanda; Morillo, Ron; Simpson, Kim; Liepack, Otfrid; Bonanne, Kevin

    2013-01-01

The complex flight and ground systems for NASA human space exploration are designed, built, operated and managed as separate programs and projects. However, each system relies on one or more of the other systems in order to accomplish specific mission objectives, creating a complex, tightly coupled architecture. Thus, there is a fundamental need to understand how each system interacts with the other. To determine if a model-based system engineering approach could be utilized to assist with understanding the complex system interactions, the NASA Engineering and Safety Center (NESC) sponsored a task to develop an approach for performing cross-system behavior modeling. This paper presents the results of applying Model Based Systems Engineering (MBSE) principles using the System Modeling Language (SysML) to define cross-system behaviors and how they map to cross-system software interfaces documented in system-level Interface Control Documents (ICDs).

  4. On the nature of bias and defects in the software specification process

    NASA Technical Reports Server (NTRS)

    Straub, Pablo A.; Zelkowitz, Marvin V.

    1992-01-01

    Implementation bias in a specification is an arbitrary constraint in the solution space. This paper describes the problem of bias. Additionally, this paper presents a model of the specification and design processes describing individual subprocesses in terms of precision/detail diagrams and a model of bias in multi-attribute software specifications. While studying how bias is introduced into a specification we realized that software defects and bias are dual problems of a single phenomenon. This was used to explain the large proportion of faults found during the coding phase at the Software Engineering Laboratory at NASA/GSFC.

  5. A requirements specification for a software design support system

    NASA Technical Reports Server (NTRS)

    Noonan, Robert E.

    1988-01-01

Most existing software design systems (SDSS) support the use of only a single design methodology. A good SDSS should support a wide variety of design methods and languages including structured design, object-oriented design, and finite state machines. It might seem that a multiparadigm SDSS would be expensive in both time and money to construct. However, it is proposed that instead an extensible SDSS that directly implements only minimal database and graphical facilities be constructed. In particular, it should not directly implement tools to facilitate language definition and analysis. It is believed that such a system could be rapidly developed and put into limited production use, with the experience gained used to refine and evolve the system over time.

  6. The optimal community detection of software based on complex networks

    NASA Astrophysics Data System (ADS)

    Huang, Guoyan; Zhang, Peng; Zhang, Bing; Yin, Tengteng; Ren, Jiadong

    2016-02-01

The community structure is important for software in terms of understanding the design patterns and controlling the development and maintenance process. In order to detect the optimal community structure in a software network, a method called Optimal Partition Software Network (OPSN) is proposed, based on the dependency relationships among software functions. First, by analyzing the information of multiple execution traces of the software, we construct the Software Execution Dependency Network (SEDN). Second, based on the relationships among the function nodes in the network, we define Fault Accumulation (FA) to measure the importance of each function node and sort the nodes by this measure. Third, we select the top K (K=1,2,…) nodes as the cores of the primal communities (each community has exactly one core node). By comparing the dependency relationships between each remaining node and the K communities, we put the node into the existing community with which it has the closest relationship. Finally, we calculate the modularity for different initial K to obtain the optimal division. Experiments verify that OPSN efficiently detects the optimal community structure in various software systems.
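The final selection step, keeping the division with the highest modularity, can be sketched with the standard Newman-Girvan modularity formula. The graph representation below is an assumption for illustration, not the paper's SEDN data structures.

```python
def modularity(edges, partition):
    """Newman-Girvan modularity Q of a partition: the fraction of edges
    inside communities minus the fraction expected under random rewiring.
    edges: list of (u, v); partition: dict node -> community id."""
    m = len(edges)
    degree = {}
    for u, v in edges:
        degree[u] = degree.get(u, 0) + 1
        degree[v] = degree.get(v, 0) + 1
    # Fraction of edges that fall inside a community...
    q = sum(1 / m for u, v in edges if partition[u] == partition[v])
    # ...minus the expected fraction given the community degree sums.
    for c in set(partition.values()):
        d_c = sum(d for node, d in degree.items() if partition[node] == c)
        q -= (d_c / (2 * m)) ** 2
    return q

def best_partition(edges, candidate_partitions):
    """Keep the candidate division (e.g. one per initial K) with maximal Q."""
    return max(candidate_partitions, key=lambda p: modularity(edges, p))
```

On a graph of two triangles joined by a single edge, splitting at the bridge scores Q = 6/7 - 1/2 ≈ 0.36, while lumping everything into one community scores 0, so the split is correctly preferred.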

  7. Application of software quality assurance to a specific scientific code development task

    SciTech Connect

    Dronkers, J.J.

    1986-03-01

    This paper describes an application of software quality assurance to a specific scientific code development program. The software quality assurance program consists of three major components: administrative control, configuration management, and user documentation. The program attempts to be consistent with existing local traditions of scientific code development while at the same time providing a controlled process of development.

  8. The development and application of composite complexity models and a relative complexity metric in a software maintenance environment

    NASA Technical Reports Server (NTRS)

    Hops, J. M.; Sherif, J. S.

    1994-01-01

    A great deal of effort is now being devoted to the study, analysis, prediction, and minimization of expected software maintenance costs, long before software is delivered to users or customers. It has been estimated that, on average, the effort spent on software maintenance is as costly as the effort spent on all other software activities. Software design methods should be the starting point for alleviating the problems of software maintenance complexity and high costs. Two aspects of maintenance deserve attention: (1) protocols for locating and rectifying defects, and for ensuring that no new defects are introduced in the development phase of the software process; and (2) protocols for modification, enhancement, and upgrading. This article focuses primarily on the second aspect, the development of protocols to help increase the quality and reduce the costs associated with modifications, enhancements, and upgrades of existing software. This study developed parsimonious models and a relative complexity metric for the complexity measurement of software, which were used to rank the modules in the system relative to one another. Some success was achieved in using the models and the relative metric to identify maintenance-prone modules.
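    A relative complexity metric of the kind the study describes (several per-module metrics combined into one composite score used to rank modules against one another) can be sketched as follows. The module names, metrics, and values are invented for illustration; the paper derives its own parsimonious models rather than this simple z-score average.

```python
# Illustrative relative-complexity ranking: standardize each metric across
# modules, then average the z-scores into one composite per module.
modules = {
    "telemetry":  {"sloc": 1200, "cyclomatic": 45, "fan_out": 12},
    "cmd_parser": {"sloc": 300,  "cyclomatic": 60, "fan_out": 4},
    "logger":     {"sloc": 150,  "cyclomatic": 8,  "fan_out": 2},
}
metrics = ["sloc", "cyclomatic", "fan_out"]

def zscores(metric):
    vals = [m[metric] for m in modules.values()]
    mean = sum(vals) / len(vals)
    sd = (sum((v - mean) ** 2 for v in vals) / len(vals)) ** 0.5
    return {name: (m[metric] - mean) / sd for name, m in modules.items()}

z = {k: zscores(k) for k in metrics}
relative = {name: sum(z[k][name] for k in metrics) / len(metrics)
            for name in modules}
ranked = sorted(relative, key=relative.get, reverse=True)
print(ranked)  # most maintenance-prone first
```

    Ranking by a composite rather than any single metric is what lets a metric like this flag a module that is unremarkable on size alone but extreme on branching or coupling.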

  9. The Development and Application of Composite Complexity Models and a Relative Complexity Metric in a Software Maintenance Environment

    NASA Astrophysics Data System (ADS)

    Hops, J. M.; Sherif, J. S.

    1994-01-01

    A great deal of effort is now being devoted to the study, analysis, prediction, and minimization of software maintenance expected cost, long before software is delivered to users or customers. It has been estimated that, on the average, the effort spent on software maintenance is as costly as the effort spent on all other software costs. Software design methods should be the starting point to aid in alleviating the problems of software maintenance complexity and high costs. Two aspects of maintenance deserve attention: (1) protocols for locating and rectifying defects, and for ensuring that no new defects are introduced in the development phase of the software process, and (2) protocols for modification, enhancement, and upgrading. This article focuses primarily on the second aspect, the development of protocols to help increase the quality and reduce the costs associated with modifications, enhancements, and upgrades of existing software. This study developed parsimonious models and a relative complexity metric for complexity measurement of software that were used to rank the modules in the system relative to one another. Some success was achieved in using the models and the relative metric to identify maintenance-prone modules.

  10. Reference and PDF-manager software: complexities, support and workflow.

    PubMed

    Mead, Thomas L; Berryman, Donna R

    2010-10-01

    In the past, librarians taught reference management by training library users to use established software programs such as RefWorks or EndNote. In today's environment, there is a proliferation of Web-based programs that are being used by library clientele that offer a new twist on the well-known reference management programs. Basically, these new programs are PDF-manager software (e.g., Mendeley or Papers). Librarians are faced with new questions, issues, and concerns, given the new workflows and pathways that these PDF-manager programs present. This article takes a look at some of those. PMID:21058181

  11. Core Community Specifications for Electron Microprobe Operating Systems: Software, Quality Control, and Data Management Issues

    NASA Technical Reports Server (NTRS)

    Fournelle, John; Carpenter, Paul

    2006-01-01

    Modern electron microprobe systems have become increasingly sophisticated. These systems utilize either UNIX or PC computer systems for measurement, automation, and data reduction. These systems have undergone major improvements in processing, storage, display, and communications, due to increased capabilities of hardware and software. Instrument specifications are typically utilized at the time of purchase and concentrate on hardware performance. The microanalysis community includes analysts, researchers, software developers, and manufacturers, who could benefit from exchange of ideas and the ultimate development of core community specifications (CCS) for hardware and software components of microprobe instrumentation and operating systems.

  12. Generating Safety-Critical PLC Code From a High-Level Application Software Specification

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The benefits of automatic application code generation are widely accepted within the software engineering community. These benefits include raised abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at Kennedy Space Center recognized the need for PLC code generation while developing the new ground checkout and launch processing system, called the Launch Control System (LCS). Engineers developed a process and a prototype software tool that automatically translates a high-level representation or specification of application software into ladder logic that executes on a PLC. All the computer hardware in the LCS is planned to be commercial off-the-shelf (COTS), including industrial controllers or PLCs that are connected to the sensors and end items out in the field. Most of the software in LCS is also planned to be COTS, with only small adapter software modules that must be developed in order to interface between the various COTS software products. A domain-specific language (DSL) is a programming language designed to perform tasks and to solve problems in a particular domain, such as ground processing of launch vehicles. The LCS engineers created a DSL for developing test sequences of ground checkout and launch operations of future launch vehicle and spacecraft elements, and they are developing a tabular specification format that uses the DSL keywords and functions familiar to the ground and flight system users. The tabular specification format, or tabular spec, allows most ground and flight system users to document how the application software is intended to function and requires little or no software programming knowledge or experience. A small sample from a prototype tabular spec application is

  13. Controlling Combinatorial Complexity in Software and Malware Behavior Computation

    SciTech Connect

    Pleszkoch, Mark G; Linger, Richard C

    2015-01-01

    Virtually all software is out of intellectual control in that no one knows its full behavior. Software Behavior Computation (SBC) is a new technology for understanding everything software does. SBC applies the mathematics of denotational semantics, implemented by function composition in Functional Trace Tables (FTTs), to compute the behavior of programs, expressed as disjoint cases of conditional concurrent assignments. In some circumstances, combinatorial explosions in the number of cases can occur when calculating the behavior of sequences of multiple branching structures. This paper describes computational methods that avoid combinatorial explosions. The predicates that control branching structures such as if-then-elses can be organized into three categories: (1) independent, resulting in no behavior-case explosion; (2) coordinated, resulting in two behavior cases; or (3) goal-oriented, with potential exponential growth in the number of cases. Traditional FTT-based behavior computation can be augmented by two additional computational methods, namely, Single-Value Function Abstractions (SVFAs) and, introduced in this paper, Relational Trace Tables (RTTs). These methods can be applied to the three predicate categories to avoid combinatorial growth in behavior cases while maintaining mathematical correctness.
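    The contrast between naive case-by-case composition and exploiting predicate independence can be illustrated with a toy enumeration. The predicates and assignments below are symbolic stand-ins, not the FTT machinery itself; the point is only the case counts.

```python
from itertools import product

# Three independent branches (each predicate tests and sets a distinct
# variable); symbolic stand-ins for conditional concurrent assignments.
branches = [("p1(x)", "x := x+1", "x := 0"),
            ("p2(y)", "y := y*2", "y := 1"),
            ("p3(z)", "z := -z",  "z := 9")]

per_branch = [[(p, t), ("not " + p, f)] for p, t, f in branches]

# Naive sequential composition: the case count multiplies at every branch.
naive_cases = list(product(*per_branch))
print(len(naive_cases))   # 2**3 = 8 disjoint behavior cases

# Independent predicates touch disjoint variables, so the behavior factors
# into one small table per variable: growth is additive, not multiplicative.
factored_size = sum(len(cases) for cases in per_branch)
print(factored_size)      # 2 * 3 = 6 cases total
```

    With three branches the difference is small, but at thirty independent branches the naive product has over a billion cases while the factored form still has sixty, which is the explosion the paper's methods are designed to avoid.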

  14. Managing Complexity in the MSL/Curiosity Entry, Descent, and Landing Flight Software and Avionics Verification and Validation Campaign

    NASA Technical Reports Server (NTRS)

    Stehura, Aaron; Rozek, Matthew

    2013-01-01

    The complexity of the Mars Science Laboratory (MSL) mission presented the Entry, Descent, and Landing systems engineering team with many challenges in its Verification and Validation (V&V) campaign. This paper describes some of the logistical hurdles related to managing a complex set of requirements, test venues, test objectives, and analysis products in the implementation of a specific portion of the overall V&V program to test the interaction of flight software with the MSL avionics suite. Application-specific solutions to these problems are presented herein, which can be generalized to other space missions and to similar formidable systems engineering problems.

  15. Geometric Algebra Software for Teaching Complex Numbers, Vectors and Spinors.

    ERIC Educational Resources Information Center

    Lounesto, Pertti; And Others

    1990-01-01

    Presents a calculator-type computer program, CLICAL, in conjunction with complex number, vector, and other geometric algebra computations. Compares the CLICAL with other symbolic programs for algebra. (Author/YP)

  16. Preliminary flight software specification for the petite amateur Navy satellite (PANSAT)

    NASA Astrophysics Data System (ADS)

    Ford, Teresa O.

    1994-03-01

    PANSAT is a small, spread-spectrum, communications satellite under design at the Naval Postgraduate School. It will support a store-and-forward bulletin board system for use by the amateur radio community. The flight software is responsible for the autonomous telemetry collection and hardware control operations of the satellite, the communications and file transfer protocols allowing access to the bulletin board system, and command interpretation and response to ground control commands. In this thesis, the complete flight software architecture and module interfaces are specified using the Estelle Formal Description Technique. The module bodies dealing with communications and file transfer protocols are specified in detail in Estelle. The current design goals for the remainder of the flight software modules are discussed. Appendices include the preliminary flight software specification itself, a data flow diagram interpretation of the specification, and a summary of the Estelle syntax used.

  17. Specific features of technetium mononuclear octahedral oxo complexes: A review

    SciTech Connect

    Sergienko, V. S.; Churakov, A. V.

    2013-01-15

    The specific structural features of technetium mononuclear octahedral oxo complexes have been considered. The structures of d²-Tc(V) mono- and dioxo complexes, d²-Tc(V) pseudodioxo compounds (Tc(V) mono-oxo complexes with an additional multiply bonded RO⁻ ligand), and d⁰-Tc(VII) trioxo compounds are analyzed.

  18. A Project Management Approach to Using Simulation for Cost Estimation on Large, Complex Software Development Projects

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    It is very difficult for project managers to develop accurate cost and schedule estimates for large, complex software development projects. None of the approaches or tools available today can estimate the true cost of software with any high degree of accuracy early in a project. This paper provides an approach that utilizes a software development process simulation model that considers and conveys the level of uncertainty that exists when developing an initial estimate. A NASA project will be analyzed using simulation and data from the Software Engineering Laboratory to show the benefits of such an approach.
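    The core idea (expressing an early estimate as a distribution that conveys its uncertainty, rather than a single point value) can be sketched with a small Monte Carlo run. The phase breakdown and triangular distributions below are invented for illustration and are not drawn from the Software Engineering Laboratory data.

```python
import random

random.seed(42)

# Hypothetical per-phase effort distributions (person-months), each given as
# (optimistic, most likely, pessimistic); the numbers are illustrative only.
phases = {"requirements":  (4, 6, 12),
          "design":        (6, 9, 18),
          "code_and_test": (10, 16, 30)}

def one_run():
    # random.triangular takes (low, high, mode)
    return sum(random.triangular(lo, hi, mode) for lo, mode, hi in phases.values())

runs = sorted(one_run() for _ in range(10_000))
p10, p50, p90 = (runs[int(len(runs) * q)] for q in (0.10, 0.50, 0.90))
print(f"P10={p10:.1f}  P50={p50:.1f}  P90={p90:.1f} person-months")
```

    Reporting the estimate as a P10/P50/P90 band, instead of one number, is what lets a manager see how wide the uncertainty still is at project start.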

  19. Mathematical model and software complex for computer simulation of field emission electron sources

    SciTech Connect

    Nikiforov, Konstantin

    2015-03-10

    The software complex, developed in MATLAB, allows modelling of the operation of diode and triode structures based on field-emission electron sources with complex sub-micron geometry, calculation of their volt-ampere characteristics, and calculation of the electric field distribution, for educational and research needs. The goal of this paper is to describe the physical-mathematical model, calculation methods, and algorithms the software complex is based on, to demonstrate the principles of its operation, and to show results of its work. For getting acquainted with the complex, a demo version with a graphical user interface is presented.

  20. Surveillance Analysis Computer System (SACS): Software requirements specification (SRS). Revision 2

    SciTech Connect

    Glasscock, J.A.

    1995-03-08

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) database, an Impact Level 3Q system. SACS stores information on tank temperatures, surface levels, and interstitial liquid levels. This information is retrieved by the customer through a PC-based interface and is then available to a number of other software tools. The software requirements specification (SRS) describes the system requirements for the SACS Project, and follows the Standard Engineering Practices (WHC-CM-6-1), Software Practices (WHC-CM-3-10) and Quality Assurance (WHC-CM-4-2, QR 19.0) policies.

  1. CARDS: A blueprint and environment for domain-specific software reuse

    NASA Technical Reports Server (NTRS)

    Wallnau, Kurt C.; Solderitsch, Anne Costa; Smotherman, Catherine

    1992-01-01

    CARDS (Central Archive for Reusable Defense Software) exploits advances in domain analysis and domain modeling to identify, specify, develop, archive, retrieve, understand, and reuse domain-specific software components. An important element of CARDS is to provide visibility into the domain model artifacts produced by, and services provided by, commercial computer-aided software engineering (CASE) technology. The use of commercial CASE technology is important to provide rich, robust support for the varied roles involved in a reuse process. We refer to this kind of use of knowledge representation systems as supporting 'knowledge-based integration.'

  2. Demonstration of Integrated Optimization Software at the Baldwin Energy Complex

    SciTech Connect

    Rob James; John McDermott; Sanjay Patnaik; Steve Piché

    2009-01-07

    This project encompassed the design, development, and demonstration of integrated online optimization systems at Dynegy Midwest Generation's Baldwin Energy Complex (BEC) located in Baldwin, Illinois. The overall project objective was to improve coal-based generation's emission profile, efficiency, maintenance requirements, and plant asset life in order to enhance the long-term viability of the United States' abundant coal resources. Five separate but integrated optimization products were developed, addressing combustion, sootblowing, SCR operations, overall unit thermal performance, and plant-wide availability optimization. Optimization results are inherently unit-specific and cannot be known for a particular generating unit in advance. However, NeuCo believed that the following were reasonable targets for the completed, integrated set of products: furnace NOx reduction by 5%; heat rate improvement by 1.5%; an increase of annual available MWh by 1.5%; commensurate reductions in greenhouse gases, mercury, and particulates; and commensurate increases in profitability from lower costs, improved reliability, and greater commercial availability. The goal during Phase I was to establish each system and demonstrate their integration in unified plant optimization. Efforts during Phase I focused on: (1) developing, deploying, integrating, and testing prototypes for each of the five products; (2) identifying and addressing issues required for the products to integrate with plant operations; and (3) systematically collecting and assimilating feedback to improve subsequent product releases. As described in the Phase II continuation application, NeuCo successfully achieved the goal for Phase I. The goal of Phase II was to improve upon the products installed and tested in Phase I and to quantify the benefits of the integrated system. As this report documents, NeuCo has also successfully achieved the goal for Phase II.
The overall results of the project, compared with the

  3. Forecast and restoration of geomagnetic activity indices by using the software-computational neural network complex

    NASA Astrophysics Data System (ADS)

    Barkhatov, Nikolay; Revunov, Sergey

    2010-05-01

    It is known that the currently used indices of geomagnetic activity reflect, to some extent, the physical processes occurring when the perturbed solar wind interacts with Earth's magnetosphere. They are therefore connected to each other and to the parameters of near-Earth space, and establishing such nonlinear connections is of interest. For such purposes, when the physical problem is complex or has many parameters, the technology of artificial neural networks is applied. This approach is used to develop an automated method for forecasting and restoring geomagnetic activity indices, implemented as a software-computational neural network complex. Each neural network experiment carried out with this complex aims to find a specific nonlinear relation between the analyzed indices and parameters. At the core of the program is a scheme combining artificial neural networks (ANN) of different types: a back-propagation Elman network, a feed-forward network, a fuzzy-logic network, and a Kohonen classification layer. The settings available in the main window of the application allow the user to change the number of hidden layers, the number of neurons per layer, the input and target data, and the number of training cycles. The training process and its quality are shown as a dynamic plot of the training error, and the result of training is a plot comparing the network response with the test sequence. The last-trained neural network, with its established nonlinear connection, can be rerun for repeated numerical experiments; in this case no additional training is performed, and the previously trained network acts as a filter through which the input parameters are passed, with the outputs compared against the test event. To support large numbers of different experiments, the ability to run the program in a "batch" mode is provided. For this purpose the user a

  4. Automated Translation of Safety Critical Application Software Specifications into PLC Ladder Logic

    NASA Technical Reports Server (NTRS)

    Leucht, Kurt W.; Semmel, Glenn S.

    2008-01-01

    The numerous benefits of automatic application code generation are widely accepted within the software engineering community. A few of these benefits include raising the abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at the NASA Kennedy Space Center (KSC) recognized the need for PLC code generation while developing their new ground checkout and launch processing system. They developed a process and a prototype software tool that automatically translates a high-level representation or specification of safety critical application software into ladder logic that executes on a PLC. This process and tool are expected to increase the reliability of the PLC code over that which is written manually, and may even lower life-cycle costs and shorten the development schedule of the new control system at KSC. This paper examines the problem domain and discusses the process and software tool that were prototyped by the KSC software engineers.
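    The translation idea can be sketched with a toy tabular format: each row names an output coil and the input conditions that drive it, and the generator emits one ladder rung per row. The row format and signal names below are invented and do not reflect the actual KSC DSL or tabular spec, though XIC/XIO/OTE are standard ladder-logic instruction mnemonics.

```python
# Minimal sketch of spec-to-ladder translation, under an invented tabular
# format: each row is (output_coil, list_of_input_contacts).
spec = [
    ("OPEN_VENT_VALVE", ["TANK_PRESSURE_HIGH", "NOT SAFING_ACTIVE"]),
    ("START_PURGE_FAN", ["VENT_VALVE_OPEN"]),
]

def to_ladder(rows):
    rungs = []
    for coil, contacts in rows:
        terms = []
        for c in contacts:
            if c.startswith("NOT "):
                terms.append(f"XIO({c[4:]})")   # examine-if-open (normally closed)
            else:
                terms.append(f"XIC({c})")       # examine-if-closed (normally open)
        rungs.append(" ".join(terms) + f" OTE({coil})")
    return rungs

rungs = to_ladder(spec)
for rung in rungs:
    print(rung)
```

    Even this toy version shows the payoff: the spec row stays readable to a domain user, while the generated rung is uniform and mechanically checkable.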

  5. End-to-end observatory software modeling using domain specific languages

    NASA Astrophysics Data System (ADS)

    Filgueira, José M.; Bec, Matthieu; Liu, Ning; Peng, Chien; Soto, José

    2014-07-01

    The Giant Magellan Telescope (GMT) is a 25-meter extremely large telescope that is being built by an international consortium of universities and research institutions. Its software and control system is being developed using a set of Domain Specific Languages (DSL) that supports a model-driven development methodology integrated with an Agile management process. This approach promotes the use of standardized models that capture the component architecture of the system, facilitate the construction of technical specifications in a uniform way, ease communication between developers and domain experts, and provide a framework to ensure the successful integration of the software subsystems developed by the GMT partner institutions.

  6. An algorithm to find critical execution paths of software based on complex network

    NASA Astrophysics Data System (ADS)

    Huang, Guoyan; Zhang, Bing; Ren, Rong; Ren, Jiadong

    2015-01-01

    The critical execution paths play an important role in software systems in terms of reducing the amount of test data, detecting vulnerabilities in the software structure, and analyzing software reliability. However, there have been no efficient methods to discover them so far. Thus, in this paper, a complex network-based software algorithm is put forward to find critical execution paths (FCEP) in a software execution network. First, by analyzing the number of sources and sinks, the software execution network is divided into AOE subgraphs, and meanwhile a Software Execution Network Serialization (SENS) approach is designed to generate the execution path set in each AOE subgraph, which not only reduces the influence of ring structures on path generation, but also guarantees the integrity of the nodes in the network. Second, according to a novel path similarity metric, a similarity matrix is created to calculate the similarity among sets of path sequences. Third, an efficient method is used to cluster paths through the similarity matrices, and the maximum-length path in each cluster is extracted as the critical execution path. At last, a set of critical execution paths is derived. The experimental results show that the FCEP algorithm is efficient in mining critical execution paths in software complex networks.
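    The clustering-and-extraction steps (group similar paths, then keep the longest path per cluster) can be sketched as follows. The paths, the Jaccard node-set similarity, and the 0.5 threshold are illustrative stand-ins; the paper defines its own path-similarity metric and AOE-based path generation.

```python
# Sketch of the clustering step only: greedily assign each path to the first
# cluster whose representative is similar enough, then pick the longest path
# in each cluster as that cluster's critical execution path.
paths = [["a", "b", "c", "d"],
         ["a", "b", "c"],
         ["e", "f", "g"],
         ["e", "f"]]

def sim(p, q):
    # Jaccard similarity over the node sets of the two paths
    s, t = set(p), set(q)
    return len(s & t) / len(s | t)

clusters = []
for p in paths:
    for c in clusters:
        if sim(p, c[0]) >= 0.5:      # threshold is illustrative
            c.append(p)
            break
    else:
        clusters.append([p])

critical = [max(c, key=len) for c in clusters]
print(critical)  # → [['a', 'b', 'c', 'd'], ['e', 'f', 'g']]
```

    The two short paths are absorbed by the longer paths that cover them, so only one representative path per behavior family survives, which is what makes the set useful for reducing test data.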

  7. Specification-based software sizing: An empirical investigation of function metrics

    NASA Technical Reports Server (NTRS)

    Jeffery, Ross; Stathis, John

    1993-01-01

    For some time the software industry has espoused the need for improved specification-based software size metrics. This paper reports on a study of nineteen recently developed systems in a variety of application domains. The systems were developed by a single software services corporation using a variety of languages. The study investigated several metric characteristics. It shows that: earlier research into inter-item correlation within the overall function count is partially supported; a priori function counts, by themselves, do not explain the majority of the effort variation in software development in the organization studied; documentation quality is critical to accurate function identification; and rater error is substantial in manual function counting. The implications of these findings for organizations using function-based metrics are explored.
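    An unadjusted function point count, the kind of a priori specification-based size metric the study investigates, is a weighted sum over counted function types. The element counts below are invented; the weights are the standard IFPUG average-complexity weights.

```python
# Unadjusted function points (UFP): weighted sum of counted function types.
# EI = external inputs, EO = external outputs, EQ = external inquiries,
# ILF = internal logical files, EIF = external interface files.
weights = {"EI": 4, "EO": 5, "EQ": 4, "ILF": 10, "EIF": 7}  # IFPUG average weights
counts  = {"EI": 12, "EO": 8, "EQ": 5, "ILF": 6, "EIF": 2}  # invented example counts

ufp = sum(weights[k] * counts[k] for k in weights)
print(ufp)  # → 182
```

    Because the count is taken from the specification alone, it is exactly the kind of early-sizing input whose rater error and documentation sensitivity the study measures.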

  8. Computer software.

    PubMed

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips. PMID:3536223

  9. The role of reliability graph models in assuring dependable operation of complex hardware/software systems

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Davis, Gloria J.; Pedar, A.

    1991-01-01

    The complexity of computer systems currently being designed for critical applications in the scientific, commercial, and military arenas requires the development of new techniques for utilizing models of system behavior in order to assure 'ultra-dependability'. The complexity of these systems, such as Space Station Freedom and the Air Traffic Control System, stems from their highly integrated designs containing both hardware and software as critical components. Reliability graph models, such as fault trees and digraphs, are used frequently to model hardware systems. Their applicability for software systems has also been demonstrated for software safety analysis and the analysis of software fault tolerance. This paper discusses further uses of graph models in the design and implementation of fault management systems for safety critical applications.

  10. Counselor Cognitions: General and Domain-Specific Complexity

    ERIC Educational Resources Information Center

    Welfare, Laura E.; Borders, L. DiAnne

    2010-01-01

    Counselor cognitive complexity is an important factor in counseling efficacy. The Counselor Cognitions Questionnaire (L. E. Welfare, 2006) and the Washington University Sentence Completion Test (J. Loevinger & R. Wessler, 1970) were used to explore the nature of general and domain-specific cognitive complexity. Counseling experience, supervisory…

  11. Treated effluent disposal system process control computer software requirements and specification

    SciTech Connect

    Graf, F.A. Jr.

    1994-06-03

    The software requirements for the monitor and control system that will be associated with the effluent collection pipeline system known as the 200 Area Treated Effluent Disposal System are covered. The control logic for the two pump stations and specific requirements for the graphic displays are detailed.

  12. Managing Complexity in Next Generation Robotic Spacecraft: From a Software Perspective

    NASA Technical Reports Server (NTRS)

    Reinholtz, Kirk

    2008-01-01

    This presentation highlights the challenges in the design of software to support robotic spacecraft. Robotic spacecraft offer a higher degree of autonomy; however, ever greater capability is now required, primarily in the software, while the same or a higher degree of reliability must be provided. The complexity of designing such an autonomous system is great, particularly while attempting to address the needs for increased capability and high reliability without increased demands on time or money. The efforts to develop programming models for the new hardware and to integrate the software architecture are highlighted.

  13. Hepsoft - an approach for up to date multi-platform deployment of HEP specific software

    NASA Astrophysics Data System (ADS)

    Roiser, S.

    2011-12-01

    LHC experiments depend on a rich palette of software components to build their specific applications. These underlying software components include the ROOT analysis framework, the Geant4 simulation toolkit, Monte Carlo generators, grid middleware, graphics libraries, scripting languages, databases, tools, etc., which are provided centrally in up-to-date versions on multiple platforms (Linux, Mac, Windows). Until recently this set of packages was tested and released in a tree-like structure as a consistent set of versions across operating systems, architectures, and compilers for LHC experiments only. Because of the tree-like deployment, these releases were only usable in connection with a configuration management tool, which provided the proper build and run-time environments and hindered other parties outside the LHC from easily using this palette of packages. In a new approach the releases will be grouped in a "flat" structure such that interested parties can start using them without configuration management, retaining all the above-mentioned advantages. In addition to increased usability, the software shall also be distributed via system-provided package deployment systems (rpm, apt, etc.). The approach to software deployment follows the idea of providing a wide range of HEP-specific software packages and tools in a coherent, up-to-date, and modular way on multiple platforms. The target audience for such software deployments is individual developers or smaller development groups/experiments who don't have the resources to maintain this kind of infrastructure. This new software deployment strategy has already been successfully implemented for groups at CERN.

  14. Using Colored Stochastic Petri Net (CS-PN) software for protocol specification, validation, and evaluation

    NASA Technical Reports Server (NTRS)

    Zenie, Alexandre; Luguern, Jean-Pierre

    1987-01-01

    The specification, verification, validation, and evaluation steps that make up the CS-PN software are outlined. The colored stochastic Petri net software is applied to a Wound/Wait protocol decomposable into two principal modules: a request or couple (transaction, granule) treatment module and a wound treatment module. Each module is specified, verified, validated, and then evaluated separately, to deduce a verification, validation, and evaluation of the complete protocol. The colored stochastic Petri net tool is shown to be a natural extension of the stochastic tool, adapted to distributed systems and protocols, because the color conveniently takes into account the numerous sites, transactions, granules, and messages.
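    The token-game semantics that colored stochastic Petri nets extend can be sketched with a tiny uncolored, untimed place/transition net. The places and transitions below loosely echo the granule-granting flavor of the Wound/Wait example but are invented; colors, rates, and the actual protocol modules are beyond this sketch.

```python
# Minimal place/transition net executor: a transition is enabled when every
# input place holds enough tokens; firing consumes and produces tokens.
marking = {"req_pending": 1, "granule_free": 1, "granted": 0}
transitions = {
    "grant":   ({"req_pending": 1, "granule_free": 1},  # tokens consumed
                {"granted": 1}),                        # tokens produced
    "release": ({"granted": 1}, {"granule_free": 1}),
}

def enabled(name):
    pre, _ = transitions[name]
    return all(marking[p] >= n for p, n in pre.items())

def fire(name):
    assert enabled(name), f"{name} not enabled"
    pre, post = transitions[name]
    for p, n in pre.items():
        marking[p] -= n
    for p, n in post.items():
        marking[p] += n

fire("grant")    # request meets a free granule -> granted
fire("release")  # granule returns to the free pool
print(marking)   # → {'req_pending': 0, 'granule_free': 1, 'granted': 0}
```

    Colored nets attach data values to tokens (site, transaction, granule identities) and stochastic nets attach firing rates, but both keep exactly this enable/consume/produce core.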

  15. Formal Methods Specification and Verification Guidebook for Software and Computer Systems. Volume 1; Planning and Technology Insertion

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Formal Methods Specification and Verification Guidebook for Software and Computer Systems describes a set of techniques called Formal Methods (FM), and outlines their use in the specification and verification of computer systems and software. Development of increasingly complex systems has created a need for improved specification and verification techniques. NASA's Safety and Mission Quality Office has supported the investigation of techniques such as FM, which are now an accepted method for enhancing the quality of aerospace applications. The guidebook provides information for managers and practitioners who are interested in integrating FM into an existing systems development process. Information includes technical and administrative considerations that must be addressed when establishing the use of FM on a specific project. The guidebook is intended to aid decision makers in the successful application of FM to the development of high-quality systems at reasonable cost. This is the first volume of a planned two-volume set. The current volume focuses on administrative and planning considerations for the successful application of FM.

  16. Specific Requirements of Physiotherapists on the Practical Use of Software in the Therapeutical Process.

    PubMed

    Messer-Misak, Karin; Egger, Rudolf

    2016-01-01

    The current healthcare system requires more effective management. New media and technology are expected to support the demands of the current healthcare system. Using physiotherapy as an example, the primary objective of this study was to define the specific requirements of therapists on the practical use of software that covers the administration, documentation, and evaluation of the entire therapy process, including a database of pictures/videos of exercises which can be adapted individually by the therapists. Another objective was to show what conditions must be fulfilled for a successful implementation of advanced applications during the entire treatment process. A mixed-methods design was chosen. In the first part a two-stage qualitative study was carried out, followed by a quantitative survey. The results show that use of the therapy-related part of the software depends on how adaptable the software is to the special needs of the therapists, that the whole treatment process is mapped in the software, and that additional training during professional practice must be provided in order to deploy the software successfully in the therapeutic process. PMID:27139399

  17. Ethical education in software engineering: responsibility in the production of complex systems.

    PubMed

    Génova, Gonzalo; González, M Rosario; Fraga, Anabel

    2007-12-01

    Among the various contemporary schools of moral thinking, consequence-based ethics, as opposed to rule-based ethics, seems to enjoy good acceptance among professionals such as software engineers. But naïve consequentialism is intellectually too weak to serve as a practical guide in the profession. Besides, the complexity of software systems makes it very hard to know in advance the consequences that will derive from professional activities in the production of software. Therefore, following the spirit of well-known codes of ethics such as the ACM/IEEE's, we advocate a more solid position in the ethical education of software engineers, which we call 'moderate deontologism', that takes into account both rules and consequences to assess the goodness of actions, and at the same time pays adequate consideration to the absolute values of human dignity. In order to educate responsible professionals, however, this position should be complemented with a pedagogical approach to virtue ethics. PMID:18066681

  18. On the recognition of complex structures: Computer software using artificial intelligence applied to pattern recognition

    NASA Technical Reports Server (NTRS)

    Yakimovsky, Y.

    1974-01-01

    An approach to simultaneous interpretation of objects in complex structures so as to maximize a combined utility function is presented. Results of the application of a computer software system to assign meaning to regions in a segmented image based on the principles described in this paper and on a special interactive sequential classification learning system, which is referenced, are demonstrated.

  19. Combining Ontologies with Domain Specific Languages: A Case Study from Network Configuration Software

    NASA Astrophysics Data System (ADS)

    Miksa, Krzysztof; Sabina, Pawel; Kasztelnik, Marek

    One of the important aspects of Model-Driven Engineering (MDE) is to consider application-domain variability, which leads to the creation of Domain Specific Languages (DSLs). Because DSL models are concise and easy to understand and maintain, this approach greatly increases productivity and software quality. Usually, the DSLs in MDE are described with a metamodel and a concrete syntax definition. The models expressed in the DSL are linguistic instantiations of the language concepts found in the metamodel.
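    The metamodel/instantiation relationship described above can be sketched in a few lines; the class names and the tiny network-configuration concept below are illustrative assumptions, not artifacts from the paper.

```python
# Minimal sketch of a DSL metamodel and one linguistic instantiation of it.
# 'MetaClass', 'ModelElement', and the 'Device' concept are invented for
# illustration; they do not come from the case study.

class MetaClass:
    """A language concept in the DSL metamodel."""
    def __init__(self, name, attributes):
        self.name = name
        self.attributes = set(attributes)

class ModelElement:
    """A model element that instantiates a metamodel concept."""
    def __init__(self, meta, **values):
        unknown = set(values) - meta.attributes
        if unknown:
            raise ValueError(f"attributes not in metamodel: {unknown}")
        self.meta = meta
        self.values = values

# Metamodel for a tiny network-configuration DSL
Device = MetaClass("Device", {"hostname", "os"})

# A concrete model: a linguistic instantiation of the 'Device' concept
router = ModelElement(Device, hostname="edge-1", os="ios")
print(router.meta.name, router.values["hostname"])
```

    Rejecting attributes absent from the metamodel mirrors the way models in a DSL are constrained to the concepts the metamodel defines.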

  20. Host computer software specifications for a zero-g payload manhandling simulator

    NASA Technical Reports Server (NTRS)

    Wilson, S. W.

    1986-01-01

    The HP PASCAL source code was developed for the Mission Planning and Analysis Division (MPAD) of NASA/JSC, and takes the place of detailed flow charts defining the host computer software specifications for MANHANDLE, a digital/graphical simulator that can be used to analyze the dynamics of on-orbit (zero-g) payload manhandling operations. Input and output data for representative test cases are included.

  1. Tools to aid the specification and design of flight software, appendix B

    NASA Technical Reports Server (NTRS)

    Bristow, G.

    1980-01-01

    The tasks that are normally performed during the specification and architecture design stages of software development are identified. Ways that tools could perform, or aid the performance of, such tasks are also identified. Much of the verification and analysis that is suggested is currently rarely performed during these early stages, but it is believed that this analysis should be done as early as possible so as to detect errors as early as possible.

  2. METU-SNP: an integrated software system for SNP-complex disease association analysis.

    PubMed

    Ustünkar, Gürkan; Aydın Son, Yeşim

    2011-01-01

    Recently, there has been increasing research to discover genomic biomarkers, haplotypes, and potentially other variables that together contribute to the development of diseases. Single Nucleotide Polymorphisms (SNPs) are the most common form of genomic variation and can represent an individual's genetic variability in the greatest detail. Genome-wide association studies (GWAS) of SNPs, which are high-dimensional case-control studies, are among the most promising approaches for identifying disease-causing variants. The METU-SNP software is a Java-based integrated desktop application specifically designed for the prioritization of SNP biomarkers and the discovery of genes and pathways related to diseases via analysis of GWAS case-control data. Outputs of METU-SNP can easily be utilized for downstream biomarker research to allow the prediction and diagnosis of diseases and other personalized medical approaches. Here, we introduce and describe the system functionality and architecture of METU-SNP. We believe that METU-SNP will help researchers with the reliable identification of SNPs that are involved in the etiology of complex diseases, ultimately supporting the development of personalized medicine approaches and targeted drug discoveries. PMID:22156365

  3. A discussion of higher order software concepts as they apply to functional requirements and specifications. [space shuttles and guidance

    NASA Technical Reports Server (NTRS)

    Hamilton, M.

    1973-01-01

    The entry guidance software functional requirements (requirements design phase), its architectural requirements (specifications design phase), and the entry guidance software verified code are discussed. It was found that the proper integration of designs at both the requirements and specifications levels is a high-priority consideration.

  4. An Automated Method for Identifying Inconsistencies within Diagrammatic Software Requirements Specifications

    NASA Technical Reports Server (NTRS)

    Zhang, Zhong

    1997-01-01

    The development of large-scale, composite software in a geographically distributed environment is an evolutionary process. Often, in such evolving systems, striving for consistency is complicated by many factors, because development participants have various locations, skills, responsibilities, roles, opinions, languages, terminology, and different degrees of abstraction they employ. This naturally leads to many partial specifications, or viewpoints. These multiple views on the system being developed usually overlap. At the same time, these multiple views give rise to the potential for inconsistency. Existing CASE tools do not efficiently manage inconsistencies in a distributed development environment for a large-scale project. Based on the ViewPoints framework, the WHERE (Web-Based Hypertext Environment for Requirements Evolution) toolkit aims to tackle inconsistency management issues within geographically distributed software development projects. Consequently, the WHERE project helps make software more robust and supports the software assurance process. The long-term goal of the WHERE tools is inconsistency analysis and management in requirements specifications. A framework based on Graph Grammar theory and the TCMJAVA toolkit is proposed to detect inconsistencies among viewpoints. This systematic approach uses three basic operations (UNION, DIFFERENCE, INTERSECTION) to study the static behaviors of graphic and tabular notations. From these operations, subgraph Query, Selection, Merge, and Replacement operations can be derived. The approach uses graph PRODUCTIONS (rewriting rules) to study the dynamic transformations of graphs. We discuss the feasibility of implementing these operations. We also present in this thesis the process of porting the original TCM (Toolkit for Conceptual Modeling) project from C++ to Java. A scenario based on a NASA International Space Station specification is discussed to show the applicability of our approach. Finally
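    The three set-theoretic operations named in the abstract can be illustrated on toy viewpoint graphs; the edge triples below are invented for illustration and are not drawn from the thesis or the ISS specification.

```python
# Inconsistency detection between two requirement viewpoints modeled as
# graphs of (source, relation, target) edges, using the UNION /
# DIFFERENCE / INTERSECTION operations described above.

def union(g1, g2):
    return g1 | g2

def intersection(g1, g2):
    return g1 & g2

def difference(g1, g2):
    return g1 - g2

# Two overlapping viewpoints on the same (hypothetical) system
view_a = {("Sensor", "sends", "Controller"),
          ("Controller", "drives", "Valve")}
view_b = {("Sensor", "sends", "Controller"),
          ("Controller", "drives", "Pump")}

shared = intersection(view_a, view_b)   # structure both viewpoints agree on
only_a = difference(view_a, view_b)     # edges unique to each viewpoint
only_b = difference(view_b, view_a)
merged = union(view_a, view_b)

# A (source, relation) pair that maps to different targets in the two
# viewpoints is flagged as a potential inconsistency.
conflicts = {(s, r) for (s, r, _) in only_a} & {(s, r) for (s, r, _) in only_b}
print(sorted(conflicts))  # [('Controller', 'drives')]
```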

  5. siRNA Design Software for a Target Gene-Specific RNA Interference

    PubMed Central

    Naito, Yuki; Ui-Tei, Kumiko

    2012-01-01

    RNA interference (RNAi) is a mechanism through which small interfering RNA (siRNA) induces sequence-specific posttranscriptional gene silencing. RNAi is commonly recognized as a powerful tool not only for functional genomics but also for therapeutic applications. Twenty-one-nucleotide-long siRNA suppresses the expression of the intended gene whose transcript possesses perfect complementarity to the siRNA guide strand. Hence, its silencing effect has been assumed to be extremely specific. However, accumulated evidence has revealed that siRNA can downregulate unintended genes with partial complementarity, mainly to the seven-nucleotide seed region of siRNA. This phenomenon is referred to as the off-target effect. We have revealed that the capability to induce the off-target effect is strongly correlated with the thermodynamic stability of the siRNA seed-target duplex. For understanding accurate target gene function and successful therapeutic application, it may be critical to select a target gene-specific siRNA with a minimized off-target effect. Here we present our siRNA design software for target-specific RNAi. In addition, we also introduce software programs open to the public for designing functional siRNAs. PMID:22701467
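    A crude way to see the seed-stability idea is to score the 7-nt seed (guide positions 2-8) with the Wallace GC-content rule; this is only a rough proxy, not the nearest-neighbor thermodynamic model the authors use, and the sequences below are invented.

```python
# Toy seed-stability score for an siRNA guide strand (5'->3', RNA).
# Wallace rule: 2 degrees per A/U base, 4 degrees per G/C base. A more
# GC-rich (thermodynamically stable) seed implies higher off-target risk.

def seed_stability(guide):
    seed = guide[1:8]                  # guide positions 2-8 (0-based slice)
    return sum(4 if base in "GC" else 2 for base in seed)

low_risk = seed_stability("UUAUAAUAGUCGAUCGAUCGA")    # A/U-rich seed -> 14
high_risk = seed_stability("UGCGGCCGGUCGAUCGAUCGA")   # G/C-rich seed -> 28
print(low_risk, high_risk)
```

    Under this proxy, an siRNA designer would prefer candidates whose seed score is low, all else being equal.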

  6. siRNA Design Software for a Target Gene-Specific RNA Interference.

    PubMed

    Naito, Yuki; Ui-Tei, Kumiko

    2012-01-01

    RNA interference (RNAi) is a mechanism through which small interfering RNA (siRNA) induces sequence-specific posttranscriptional gene silencing. RNAi is commonly recognized as a powerful tool not only for functional genomics but also for therapeutic applications. Twenty-one-nucleotide-long siRNA suppresses the expression of the intended gene whose transcript possesses perfect complementarity to the siRNA guide strand. Hence, its silencing effect has been assumed to be extremely specific. However, accumulated evidence has revealed that siRNA can downregulate unintended genes with partial complementarity, mainly to the seven-nucleotide seed region of siRNA. This phenomenon is referred to as the off-target effect. We have revealed that the capability to induce the off-target effect is strongly correlated with the thermodynamic stability of the siRNA seed-target duplex. For understanding accurate target gene function and successful therapeutic application, it may be critical to select a target gene-specific siRNA with a minimized off-target effect. Here we present our siRNA design software for target-specific RNAi. In addition, we also introduce software programs open to the public for designing functional siRNAs. PMID:22701467

  7. UQLab - A Software Platform for Uncertainty Quantification of Complex System Models

    NASA Astrophysics Data System (ADS)

    Wang, C.; Duan, Q.; Gong, W.

    2014-12-01

    UQLab (Uncertainty Quantification Laboratory) is a flexible, user-friendly software platform that integrates different kinds of UQ methods, including experimental design, sensitivity analysis, uncertainty analysis, surrogate modeling, and optimization methods, to characterize the uncertainty of complex system models. It is written in the Python language and can run on all common operating systems. UQLab has a graphical user interface (GUI) that allows users to enter commands and output analysis results via pull-down menus. It is equipped with a model driver generator that allows any system model to be linked with the software; the only requirement is to make the executable code, control file, and output file of interest of a model accessible to the software. Through two geophysics models, the Sacramento Soil Moisture Accounting Model (SAC-SMA) and the Common Land Model (CoLM), this presentation intends to demonstrate that UQLab is an effective and easy-to-use UQ tool that can be applied to a wide range of applications.
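    A minimal sketch of the kind of uncertainty analysis such a platform automates, assuming nothing about UQLab's actual API: sample the uncertain parameters, run the model, and summarize the output spread. The two-parameter "model" is a stand-in, not SAC-SMA or CoLM.

```python
# Brute-force Monte Carlo uncertainty propagation through a toy model.
import random
import statistics

def model(k, c):
    # Hypothetical system model with two uncertain parameters.
    return k * 2.0 + c ** 2

random.seed(0)                       # reproducible sampling
samples = [model(random.uniform(0.5, 1.5), random.uniform(0.0, 1.0))
           for _ in range(1000)]

mean = statistics.mean(samples)
spread = statistics.stdev(samples)
print(f"output mean={mean:.2f}, spread={spread:.2f}")
```

    A platform like UQLab wraps exactly this loop behind a model driver, so the sampling plan, the model executable, and the output statistics are configured rather than hand-coded.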

  8. Using formal specification in the Guidance and Control Software (GCS) experiment. Formal design and verification technology for life critical systems

    NASA Technical Reports Server (NTRS)

    Weber, Doug; Jamsek, Damir

    1994-01-01

    The goal of this task was to investigate how formal methods could be incorporated into a software engineering process for flight-control systems under DO-178B and to demonstrate that process by developing a formal specification for NASA's Guidance and Controls Software (GCS) Experiment. GCS is software to control the descent of a spacecraft onto a planet's surface. The GCS example is simplified from a real example spacecraft, but exhibits the characteristics of realistic spacecraft control software. The formal specification is written in Larch.

  9. Managing Complexity - Developing the Node Control Software For The International Space Station

    NASA Technical Reports Server (NTRS)

    Wood, Donald B.

    2000-01-01

    On December 4th, 1998 at 3:36 AM, STS-88 (the space shuttle Endeavour) was launched with the "Node 1 Unity Module" in its payload bay. After working on the Space Station program for a very long time, that launch was one of the most beautiful sights I had ever seen! As the Shuttle proceeded to rendezvous with the Russian-American module known as Zarya, I returned to Houston quickly to start monitoring the activation of the software I had spent the last 3 years working on. The FGB module (also known as "Zarya") was grappled by the shuttle robotic arm and connected to the Unity module. Crewmembers then hooked up the power and data connections between Zarya and Unity. On December 7th, 1998 at 9:49 PM CST, the Node Control Software was activated. On December 15th, 1998, the Node-1/Zarya "cornerstone" of the International Space Station was left on-orbit. The Node Control Software (NCS) is the first software flown by NASA for the International Space Station (ISS). The ISS Program is considered the most complex international engineering effort ever undertaken. At last count some 18 countries are active partners in this global venture. NCS has performed all of its intended functions on orbit, over 200 miles above us. I'll be describing how we built the NCS software.

  10. Practical Findings from Applying the PSD Model for Evaluating Software Design Specifications

    NASA Astrophysics Data System (ADS)

    Räisänen, Teppo; Lehto, Tuomas; Oinas-Kukkonen, Harri

    This paper presents practical findings from applying the PSD model to evaluating the support for persuasive features in software design specifications for a mobile Internet device. On the one hand, our experiences suggest that the PSD model fits relatively well for evaluating design specifications. On the other hand, the model would benefit from more specific heuristics for evaluating each technique to avoid unnecessary subjectivity. Better distinction between the design principles in the social support category would also make the model easier to use. Practitioners who have no theoretical background can apply the PSD model to increase the persuasiveness of the systems they design. The greatest benefit of the PSD model for researchers designing new systems may be achieved when it is applied together with a sound theory, such as the Elaboration Likelihood Model. Using the ELM together with the PSD model, one may increase the chances for attitude change.

  11. Assessing software upgrades, plan properties and patient geometry using intensity modulated radiation therapy (IMRT) complexity metrics

    SciTech Connect

    McGarry, Conor K.; Chinneck, Candice D.; O'Toole, Monica M.; O'Sullivan, Joe M; Prise, Kevin M.; Hounsell, Alan R.

    2011-04-15

    Purpose: The aim of this study is to compare the sensitivity of different metrics in detecting differences in the complexity of intensity modulated radiation therapy (IMRT) plans following upgrades, changes to planning parameters, and patient geometry. Correlations between complexity metrics are also assessed. Method: A program was developed to calculate a series of metrics used to describe the complexity of IMRT fields using monitor units (MUs) and multileaf collimator files: modulation index (MI), modulation complexity score (MCS), and plan intensity map variation (PIMV). Each metric, including the MUs, was used to assess changes in beam complexity for six prostate patients, following upgrades in the inverse planning optimization software designed to incorporate direct aperture optimization (DAO). All beams were delivered to a 2D ionization chamber array and compared to those calculated using gamma analysis. Each complexity metric was then calculated for all beams, on a different set of six prostate IMRT patients, to assess differences between plans calculated using different minimum field sizes and different maximum segment numbers. Different geometries, including C-shape, prostate, and head-and-neck phantoms, were also assessed using the metrics. Correlations between complexity metrics were calculated for 20 prostate IMRT patients. Results: MU, MCS, MI, and PIMV could all detect reduced complexity following an upgrade to the optimization leaf sequencer, although only MI and MCS could detect a reduction in complexity when one-step optimization (DAO) was employed rather than two-step optimization. All metrics detected a reduction in complexity when the minimum field size was increased from 1 to 4 cm, and all apart from PIMV detected reduced complexity when the number of segments was significantly reduced. All metrics apart from MI showed differences in complexity depending on the treatment site. Significant correlations exist between all metrics apart from MI and PIMV for
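    As an illustration of what an aperture-based beam complexity metric measures, the toy score below weights the variability of leaf gaps in each segment by its monitor units. It is an invented example in the spirit of such metrics, not the published MI, MCS, or PIMV formulas, and the segment data are made up.

```python
# Toy MU-weighted aperture-variability score for an IMRT beam.
# Each segment is (monitor units, [leaf gap widths in cm]).
import statistics

def aperture_variability(segments):
    total_mu = sum(mu for mu, _ in segments)
    score = 0.0
    for mu, gaps in segments:
        # Coefficient of variation of the leaf gaps in this segment
        cv = statistics.pstdev(gaps) / statistics.mean(gaps)
        score += (mu / total_mu) * cv
    return score

simple_beam = [(50, [4.0, 4.0, 4.0]), (50, [4.0, 4.1, 4.0])]
complex_beam = [(50, [0.5, 6.0, 1.0]), (50, [5.0, 0.8, 3.5])]
print(aperture_variability(simple_beam), aperture_variability(complex_beam))
```

    A heavily modulated beam, with irregular apertures carrying most of the MUs, scores higher than a beam of nearly uniform apertures.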

  12. Hardware-Software Complex for a Study of High-Power Microwave Pulse Parameters

    NASA Astrophysics Data System (ADS)

    Gal'chenko, V. G.; Gladkova, T. A.

    2016-06-01

    An instrumental complex is developed for the study of high-power microwave pulse parameters. The complex includes a bench for calibrating detectors and a measuring instrument for evaluating the microwave pulse parameters. The calibration of the measurement channels of microwave pulses propagating through different elements of the experimental setup is an important problem in experimental research. The available software for calibration of the measuring channels has a significant disadvantage related to the need to enter a number of additional parameters directly into the program. Software implemented in C++ with Qt 4.5 is presented, which significantly simplifies the process of calibration data input in a dialog mode for setting the parameters of the medium of microwave pulse propagation.

  13. Reducing the complexity of the software design process with object-oriented design

    NASA Technical Reports Server (NTRS)

    Schuler, M. P.

    1991-01-01

    Designing software is a complex process. How object-oriented design (OOD), coupled with formalized documentation and tailored object diagramming techniques, can reduce the complexity of the software design process is described and illustrated. The described OOD methodology uses a hierarchical decomposition approach in which parent objects are decomposed into layers of lower level child objects. A method of tracking the assignment of requirements to design components is also included. Increases in the reusability, portability, and maintainability of the resulting products are also discussed. This method was built on a combination of existing technology, teaching experience, consulting experience, and feedback from design method users. The discussed concepts are applicable to hierarchical OOD processes in general. Emphasis is placed on improving the design process by documenting the details of the procedures involved and incorporating improvements into those procedures as they are developed.
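    The hierarchical decomposition and requirement-tracking ideas described above can be sketched together; the object and requirement names are illustrative, not from the paper.

```python
# Parent objects decompose into layers of child objects; each object
# records the requirements assigned to it, so coverage can be rolled up.

class DesignObject:
    def __init__(self, name, requirements=()):
        self.name = name
        self.requirements = set(requirements)
        self.children = []

    def decompose(self, child):
        self.children.append(child)
        return child

    def covered(self):
        """Requirements covered by this object and all its descendants."""
        reqs = set(self.requirements)
        for child in self.children:
            reqs |= child.covered()
        return reqs

system = DesignObject("FlightSW", {"R1"})
nav = system.decompose(DesignObject("Navigation", {"R2"}))
nav.decompose(DesignObject("StarTracker", {"R3"}))
print(sorted(system.covered()))  # ['R1', 'R2', 'R3']
```

    Rolling requirement coverage up the decomposition tree is one simple way to check that every requirement has been assigned to some design component.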

  14. Hardware-software complex for chlorophyll estimation in phytocenoses under field conditions

    NASA Astrophysics Data System (ADS)

    Yatsenko, V.; Kochubey, S.; Donets, V.; Kazantsev, T.

    2005-10-01

    Vegetation is a sensitive indicator suitable for the detection of ecological stresses and natural anomalies of technogenic character. First, this is determined by the prompt response of the photosynthetic apparatus to changes in environmental conditions, mainly through changes in green pigment (chlorophyll) content in leaves. Second, the specific shape of the reflectance spectrum of leaves is due to the chlorophyll present in them, and the region in the range of 500-800 nm is extremely sensitive to variations in pigment content. Third, interesting results are now being obtained concerning the spectral properties of leaves and crop canopies with high-resolution spectroscopy. These data are highly informative with respect to the content of chlorophyll and some other biochemical constituents of the cell. Methods developed on the basis of such spectral data are highly resistant to various types of noise. We have developed a method for chlorophyll estimation using first-derivative plots of reflectance spectral curves. As our simulation models show, the method gives good results for plant-soil systems with both complete (100%) and incomplete projective cover. Field measurements of chlorophyll content in crops with closed and open canopies confirm the results. We have produced a hardware-software complex for chlorophyll determination under field conditions. It consists of spectral and computing blocks. The first is a high-resolution two-beam spectrometer equipped with a system to visualize the measured object. Irradiance and temperature sensors are included in the spectral block, as well as a GPS receiver. The block has the following technical characteristics: spectral range 500-800 nm, band-pass 1.5 nm, field of view 16×16°, scanning time 0.1-1.0 s, dynamic range of signal 1:1024 (10 bit), signal/noise ratio 400, number of pixels in image 1240, range of estimated chlorophyll concentrations 1.5-8.0 mg/dm2, supply voltage 12 V, weight 8 kg. The computing block is intended for
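    The first-derivative approach can be sketched as follows: differentiate a reflectance spectrum and locate the red-edge position (the wavelength of steepest slope), which shifts with chlorophyll content. The spectrum below is synthetic, not measured data from the instrument.

```python
# Locate the red edge of a (synthetic) reflectance spectrum from its
# first derivative.

def first_derivative(wavelengths, reflectance):
    return [(reflectance[i + 1] - reflectance[i]) /
            (wavelengths[i + 1] - wavelengths[i])
            for i in range(len(reflectance) - 1)]

# Synthetic spectrum, 680-740 nm: low reflectance in the chlorophyll
# absorption band rising steeply across the red edge.
wl = [680, 690, 700, 710, 720, 730, 740]
refl = [0.05, 0.07, 0.15, 0.30, 0.42, 0.47, 0.48]

deriv = first_derivative(wl, refl)
red_edge = wl[deriv.index(max(deriv))]   # wavelength of steepest slope
print(red_edge)  # 700
```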

  15. Specificity and complexity in bacterial quorum-sensing systems

    PubMed Central

    Hawver, Lisa A.; Jung, Sarah A.; Ng, Wai-Leung

    2016-01-01

    Quorum sensing (QS) is a microbial cell-to-cell communication process that relies on the production and detection of chemical signals called autoinducers (AIs) to monitor cell density and species complexity in the population. QS allows bacteria to behave as a cohesive group and coordinate collective behaviors. While most QS receptors display high specificity to their AI ligands, others are quite promiscuous in signal detection. How do specific QS receptors respond to their cognate signals with high fidelity? Why do some receptors maintain low signal recognition specificity? In addition, many QS systems are composed of multiple intersecting signaling pathways: what are the benefits of preserving such a complex signaling network when a simple linear ‘one-to-one’ regulatory pathway seems sufficient to monitor cell density? Here, we will discuss different molecular mechanisms employed by various QS systems that ensure productive and specific QS responses. Moreover, the network architectures of some well-characterized QS circuits will be reviewed to understand how the wiring of different regulatory components achieves different biological goals. PMID:27354348

  16. Complex software dedicated for design and simulation of LPCE process for heavy water detritiation

    SciTech Connect

    Bornea, A.; Petrutiu, C.; Zamfirache, M.

    2015-03-15

    The main purpose of this paper is to present a comprehensive software package, SICA, designed to be used in the water-hydrogen liquid phase catalytic exchange (LPCE) process. The software calculates the water-gas catalytic isotopic exchange process, following the transfer of any H, D, or T isotope from water to gas and vice versa. The software is useful for both design and laboratory-based research; the type of catalytic filling (ordered or random) can be defined in either case, the isotopic calculation being specific to the package type. For laboratory-based research, the performance of a catalytic packing can be determined by knowing its type and by using experimental results. The performance of a mixed catalytic packing is defined by mass transfer constants for each catalytic and hydrophilic package in that specific arrangement, and for the isotope whose transfer between phases is studied. A link has also been established between these constants and HETP (Height Equivalent of a Theoretical Plate), a parameter commonly used to characterize packing performance. To demonstrate the performance of the software, we present a comparative analysis of water-gas catalytic isotopic exchange on a column equipped with three types of filling: successive layers, random, and structured (an ordered package filled with catalyst). The program can be used for LPCE process calculations, a process used at detritiation facilities for CANDU reactors or fusion reactors. (authors)
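    The HETP figure mentioned above relates packed height to separation performance: HETP = packed height / number of theoretical plates. The sketch below estimates the plate count with a Fenske-style total-reflux calculation; the column height, compositions, and separation factor are invented numbers for illustration, not values from the paper.

```python
# HETP from a Fenske-style estimate of theoretical plates (total reflux).
import math

def fenske_plates(x_top, x_bottom, alpha):
    """Theoretical plates needed for a binary separation at total reflux."""
    ratio = (x_top / (1 - x_top)) / (x_bottom / (1 - x_bottom))
    return math.log(ratio) / math.log(alpha)

height_m = 2.0                                   # packed column height
n = fenske_plates(x_top=0.95, x_bottom=0.05, alpha=1.8)
hetp = height_m / n                              # ~0.20 m per plate
print(f"N={n:.1f} plates, HETP={hetp * 100:.1f} cm")
```

    A lower HETP means the packing achieves a theoretical plate in less height, which is how the mass transfer constants of different fillings can be compared on a common scale.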

  17. A frame-based domain-specific language for rapid prototyping of FPGA-based software-defined radios

    NASA Astrophysics Data System (ADS)

    Ouedraogo, Ganda Stephane; Gautier, Matthieu; Sentieys, Olivier

    2014-12-01

    The field-programmable gate array (FPGA) technology is expected to play a key role in the development of software-defined radio (SDR) platforms. Yet as this technology has evolved, low-level design methods for prototyping FPGA-based applications have remained unchanged for decades. In the context of SDR, it is important to rapidly implement new waveforms to fulfill such a stringent flexibility paradigm. To date, several proposals have defined, through software-based approaches, efficient methods to prototype SDR waveforms in a processor-based running environment. This paper describes a novel design flow for FPGA-based SDR applications. The flow relies upon high-level synthesis (HLS) principles and leverages nascent HLS tools. Its entry point is a domain-specific language (DSL) which handles the complexity of programming an FPGA and integrates SDR features so as to enable automatic waveform control generation from a data frame model. Two waveforms (IEEE 802.15.4 and IEEE 802.11a) have been designed and explored via this new methodology, and the results are highlighted in this paper.

  18. FACET: A simulation software framework for modeling complex societal processes and interactions

    SciTech Connect

    Christiansen, J. H.

    2000-06-02

    FACET, the Framework for Addressing Cooperative Extended Transactions, was developed at Argonne National Laboratory to address the need for a simulation software architecture in the style of an agent-based approach, but with sufficient robustness, expressiveness, and flexibility to be able to deal with the levels of complexity seen in real-world social situations. FACET is an object-oriented software framework for building models of complex, cooperative behaviors of agents. It can be used to implement simulation models of societal processes such as the complex interplay of participating individuals and organizations engaged in multiple concurrent transactions in pursuit of their various goals. These transactions can be patterned on, for example, clinical guidelines and procedures, business practices, government and corporate policies, etc. FACET can also address other complex behaviors such as biological life cycles or manufacturing processes. To date, for example, FACET has been applied to such areas as land management, health care delivery, avian social behavior, and interactions between natural and social processes in ancient Mesopotamia.

  19. Immunogenetic Management Software: a new tool for visualization and analysis of complex immunogenetic datasets

    PubMed Central

    Johnson, Z. P.; Eady, R. D.; Ahmad, S. F.; Agravat, S.; Morris, T.; Else, J.; Lank, S. M.; Wiseman, R. W.; O’Connor, D. H.; Penedo, M. C. T.; Larsen, C. P.

    2012-01-01

    Here we describe the Immunogenetic Management Software (IMS) system, a novel web-based application that permits multiplexed analysis of complex immunogenetic traits that are necessary for the accurate planning and execution of experiments involving large animal models, including nonhuman primates. IMS is capable of housing complex pedigree relationships, microsatellite-based MHC typing data, as well as MHC pyrosequencing expression analysis of class I alleles. It includes a novel, automated MHC haplotype naming algorithm and has accomplished an innovative visualization protocol that allows users to view multiple familial and MHC haplotype relationships through a single, interactive graphical interface. Detailed DNA- and RNA-based data can also be queried and analyzed in a highly accessible fashion, and flexible search capabilities allow experimental choices to be made based on multiple, individualized and expandable immunogenetic factors. This web application is implemented in Java, MySQL, Tomcat, and Apache, with supported browsers including Internet Explorer and Firefox on Windows and Safari on Mac OS. The software is freely available for distribution to noncommercial users by contacting Leslie.kean@emory.edu. A demonstration site for the software is available at http://typing.emory.edu/typing_demo, user name: imsdemo7@gmail.com and password: imsdemo. PMID:22080300

  20. Immunogenetic Management Software: a new tool for visualization and analysis of complex immunogenetic datasets.

    PubMed

    Johnson, Z P; Eady, R D; Ahmad, S F; Agravat, S; Morris, T; Else, J; Lank, S M; Wiseman, R W; O'Connor, D H; Penedo, M C T; Larsen, C P; Kean, L S

    2012-04-01

    Here we describe the Immunogenetic Management Software (IMS) system, a novel web-based application that permits multiplexed analysis of complex immunogenetic traits that are necessary for the accurate planning and execution of experiments involving large animal models, including nonhuman primates. IMS is capable of housing complex pedigree relationships, microsatellite-based MHC typing data, as well as MHC pyrosequencing expression analysis of class I alleles. It includes a novel, automated MHC haplotype naming algorithm and has accomplished an innovative visualization protocol that allows users to view multiple familial and MHC haplotype relationships through a single, interactive graphical interface. Detailed DNA and RNA-based data can also be queried and analyzed in a highly accessible fashion, and flexible search capabilities allow experimental choices to be made based on multiple, individualized and expandable immunogenetic factors. This web application is implemented in Java, MySQL, Tomcat, and Apache, with supported browsers including Internet Explorer and Firefox on Windows and Safari on Mac OS. The software is freely available for distribution to noncommercial users by contacting Leslie.kean@emory.edu. A demonstration site for the software is available at http://typing.emory.edu/typing_demo , user name: imsdemo7@gmail.com and password: imsdemo. PMID:22080300

  1. Actinide-specific complexing agents: their structural and solution chemistry

    SciTech Connect

    Raymond, K.N.; Freeman, G.E.; Kappel, M.J.

    1983-07-01

    The synthesis of a series of tetracatecholate ligands designed to be specific for Pu(IV) and other actinide(IV) ions has been achieved. Although these compounds are very effective as in vivo plutonium removal agents, potentiometric and voltammetric data indicate that at neutral pH full complexation of the Pu(IV) ion by all four catecholate groups does not occur. Spectroscopic results indicate that the tetracatecholates, 3,4,3-LICAMS and 3,4,3-LICAMC, complex Am(III). The Am(IV)/(III)-catecholate couple (where catecholate = 3,4,3-LICAMS or 3,4,3-LICAMC) is not observed, but may not be observable due to the large currents associated with ligand oxidation. However, within the potential range where ligand oxidation does not occur, these experiments indicate that the reduction potential of the free Am(IV)/(III) couple is probably +2.6 V vs NHE or higher. Proof of the complexation of americium in the trivalent oxidation state by 3,4,3-LICAMS and 3,4,3-LICAMC eliminates the possibility of tetracatecholates stabilizing Am(IV) in vivo.

  2. CellProfiler Analyst: data exploration and analysis software for complex image-based screens

    PubMed Central

    Jones, Thouis R; Kang, In Han; Wheeler, Douglas B; Lindquist, Robert A; Papallo, Adam; Sabatini, David M; Golland, Polina; Carpenter, Anne E

    2008-01-01

    Background Image-based screens can produce hundreds of measured features for each of hundreds of millions of individual cells in a single experiment. Results Here, we describe CellProfiler Analyst, open-source software for the interactive exploration and analysis of multidimensional data, particularly data from high-throughput, image-based experiments. Conclusion The system enables interactive data exploration for image-based screens and automated scoring of complex phenotypes that require combinations of multiple measured features per cell. PMID:19014601

  3. Project evaluates POSC specifications for infill drilling. [Petrotechnical Open Software Corp.]

    SciTech Connect

    Zahniser, D.L.; Merritt, R.W.; Chan, C.K.

    1994-05-16

    A project is under way to build data-loading tools and create an integrated oil and gas production data base using specifications developed during the last 3 years by the Petrotechnical Open Software Corp. (POSC). The Industry Pilot Project (IPP) Phase 1 is a collaborative effort among seven oil companies. The participating oil companies have provided a large set of data from a producing North American oil and gas field, along with money, hardware, and personnel. Many companies have already streamlined their infill drilling processes and receive significant incremental benefits, but current information technology can often be a stumbling block. Cross-disciplinary use of information is the key goal of these streamlining efforts, but much time is lost in finding, reformatting, accessing, and determining the quality of data. This project sets out to prove how POSC specifications can help reduce the cost and time for developing a field, improve the quality of the decision-making process, minimize the number of communication barriers and, most importantly, change technology from a hurdle to a seamless step. The project demonstrates that POSC specifications are an enabling technology that can dramatically improve the way oil companies and suppliers operate.

  4. Would Boys and Girls Benefit from Gender-Specific Educational Software?

    ERIC Educational Resources Information Center

    Luik, Piret

    2011-01-01

    Most boys and girls interact differently with educational software and have different preferences for the design of educational software. The question is whether the usage of educational software has the same consequences for both genders. This paper investigates the characteristics of drill-and-practice programmes or drills that are efficient for…

  5. Increasing quality and managing complexity in neuroinformatics software development with continuous integration

    PubMed Central

    Zaytsev, Yury V.; Morrison, Abigail

    2013-01-01

    High quality neuroscience research requires accurate, reliable and well maintained neuroinformatics applications. As software projects become larger, offering more functionality and developing a denser web of interdependence between their component parts, we need more sophisticated methods to manage their complexity. If complexity is allowed to get out of hand, either the quality of the software or the speed of development suffers, and in many cases both do. To address this issue, here we develop a scalable, low-cost and open source solution for continuous integration (CI), a technique which ensures the quality of changes to the code base during the development procedure, rather than relying on a pre-release integration phase. We demonstrate that a CI-based workflow, due to rapid feedback about code integration problems and tracking of code health measures, enabled substantial increases in productivity for a major neuroinformatics project and additional benefits for three further projects. Beyond the scope of the current study, we identify multiple areas in which CI can be employed to further increase the quality of neuroinformatics projects by improving development practices and incorporating appropriate development tools. Finally, we discuss what measures can be taken to lower the barrier for developers of neuroinformatics applications to adopt this useful technique. PMID:23316158

  6. The Evolution of Software and Its Impact on Complex System Design in Robotic Spacecraft Embedded Systems

    NASA Technical Reports Server (NTRS)

    Butler, Roy

    2013-01-01

    The growth in computer hardware performance, coupled with reduced energy requirements, has led to a rapid expansion of the resources available to software systems, driving them towards greater logical abstraction, flexibility, and complexity. This shift in focus from compacting functionality into a limited field towards developing layered, multi-state architectures in a grand field has both driven and been driven by the history of embedded processor design in the robotic spacecraft industry. The combinatorial growth of interprocess conditions is accompanied by benefits (concurrent development, situational autonomy, and evolution of goals) and drawbacks (late integration, non-deterministic interactions, and multifaceted anomalies) in achieving mission success, as illustrated by the case of the Mars Reconnaissance Orbiter. Approaches to optimizing the benefits while mitigating the drawbacks have taken the form of the formalization of requirements, modular design practices, extensive system simulation, and spacecraft data trend analysis. The growth of hardware capability and software complexity can be expected to continue, with future directions including stackable commodity subsystems, computer-generated algorithms, runtime reconfigurable processors, and greater autonomy.

  7. A recombinant antibody with the antigen-specific, major histocompatibility complex-restricted specificity of T cells.

    PubMed Central

    Andersen, P S; Stryhn, A; Hansen, B E; Fugger, L; Engberg, J; Buus, S

    1996-01-01

    Specific recognition of peptide/major histocompatibility complex (MHC) molecule complexes by the T-cell receptor is a key reaction in the specific immune response. Antibodies against peptide/MHC complexes would therefore be valuable tools in studying MHC function and T-cell recognition and might lead to novel approaches in immunotherapy. However, it has proven difficult to generate antibodies with the specificity of T cells by conventional hybridoma techniques. Here we report that the phage display technology is a feasible alternative to generate antibodies recognizing specific, predetermined peptide/MHC complexes. Images Fig. 2 PMID:8700842

  8. 77 FR 9252 - Meeting for Software Developers on the Technical Specifications for Common Formats for Patient...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-16

    ... events as announced in the Federal Register on November 1, 2011: 76 FR 67456-67457. The Software... Rule), published in the Federal Register on November 21, 2008: 73 FR 70731-70814. AHRQ coordinates the... HUMAN SERVICES Agency for Healthcare Research and Quality Meeting for Software Developers on...

  9. Modeling of the competition life cycle using the software complex of cellular automata PyCAlab

    NASA Astrophysics Data System (ADS)

    Berg, D. B.; Beklemishev, K. A.; Medvedev, A. N.; Medvedeva, M. A.

    2015-11-01

    The aim of this work is to develop a numerical model of the competition life cycle on the basis of the PyCAlab cellular automata software complex. The model is based on the general patterns of growth of various systems in resource-limited settings. Examples show that the transition from unlimited growth of the market agents to the stage of competitive growth takes quite a long time and may be characterized as monotonic. During this period two main strategies of competitive selection coexist: 1) capturing maximum market space at any reasonable cost; 2) saving by reducing costs. The results lead to the conclusion that the competitive strategies of companies must combine the two types of behavior mentioned, an issue that deserves adequate attention in the academic literature on management. The numerical model may be used for market research when developing strategies for promoting new goods and services.
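    The resource-limited growth pattern underlying such models can be illustrated with a minimal cellular automaton. This is a hypothetical toy sketch, not the PyCAlab complex itself; the grid size, von Neumann neighborhood, and single-agent colonization rule are illustrative assumptions.

    ```python
    import random

    def step(grid, n):
        """One step: each occupied cell tries to colonize one random empty neighbor."""
        new = [row[:] for row in grid]
        for i in range(n):
            for j in range(n):
                if grid[i][j]:
                    di, dj = random.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
                    ni, nj = i + di, j + dj
                    if 0 <= ni < n and 0 <= nj < n and not grid[ni][nj]:
                        new[ni][nj] = 1
        return new

    random.seed(0)
    n = 30
    grid = [[0] * n for _ in range(n)]
    grid[n // 2][n // 2] = 1  # a single initial market agent
    occupied = []
    for t in range(120):
        occupied.append(sum(map(sum, grid)))
        grid = step(grid, n)
    # early growth is unlimited (near-exponential); as free cells run out,
    # the curve saturates and further growth comes only at competitors' expense
    print(occupied[0], occupied[len(occupied) // 2], occupied[-1])
    ```

    The transition between the two regimes in this toy run is gradual and monotonic, mirroring the long transition period described in the abstract.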

  10. Software/firmware design specification for 10-MWe solar-thermal central-receiver pilot plant

    SciTech Connect

    Ladewig, T.D.

    1981-03-01

    The software and firmware employed for the operation of the Barstow Solar Pilot Plant are completely described. The systems allow operator control of up to 2048 heliostats, and include the capability of operator-commanded control, graphic displays, status displays, alarm generation, system redundancy, and interfaces to the Operational Control System, the Data Acquisition System, and the Beam Characterization System. The requirements are decomposed into eleven software modules for execution in the Heliostat Array Controller computer, one firmware module for execution in the Heliostat Field Controller microprocessor, and one firmware module for execution in the Heliostat Controller microprocessor. The design of the modules to satisfy requirements, the interfaces between the computers, the software system structure, and the computers in which the software and firmware will execute are detailed. The testing sequence for validation of the software/firmware is described. (LEW)

  11. Exploring the specificities of glycan-binding proteins using glycan array data and the GlycoSearch software

    PubMed Central

    Kletter, Doron; Curnutte, Bryan; Maupin, Kevin; Bern, Marshall; Haab, Brian B.

    2015-01-01

    Summary The glycan array is a powerful tool for investigating the specificities of glycan-binding proteins. By incubating a glycan-binding protein on a glycan array, the relative binding to hundreds of different oligosaccharides can be quantified in parallel. Based on these data, much information can be obtained about the preference of a glycan-binding protein for specific subcomponents of oligosaccharides, or motifs. In many cases the analysis and interpretation of glycan array data can be time consuming and imprecise if done manually. Recently we developed software, called GlycoSearch, to facilitate the analysis and interpretation of glycan array data based on the previously developed methods called Motif Segregation and Outlier Motif Analysis. Here we describe the principles behind the software, the use of the software, and an example application. The automated, objective, and precise analysis of glycan array data should enhance the value of the data for a broad range of research applications. PMID:25753713

  12. Software design specification. Part 2: Orbital Flight Test (OFT) detailed design specification. Volume 3: Applications. Book 2: System management

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The functions performed by the systems management (SM) application software are described along with the design employed to accomplish these functions. The operational sequences (OPS) control segments and the cyclic processes they control are defined. The SM specialist function control (SPEC) segments and the display controlled 'on-demand' processes that are invoked by either an OPS or SPEC control segment as a direct result of an item entry to a display are included. Each processing element in the SM application is described including an input/output table and a structured control flow diagram. The flow through the module and other information pertinent to that process and its interfaces to other processes are included.

  13. Network, system, and status software enhancements for the autonomously managed electrical power system breadboard. Volume 3: Commands specification

    NASA Technical Reports Server (NTRS)

    Mckee, James W.

    1990-01-01

    This volume (3 of 4) contains the specification of the command language for the AMPS system. The volume contains a requirements specification for the operating system and commands and a design specification for the operating system and commands. The operating system and commands sit on top of the protocol. The commands are an extension of the present set of AMPS commands in that they are more compact, allow multiple sub-commands to be bundled into one command, and have provisions for identifying the sender and the intended receiver. The commands make no change to the actual software that implements them.

  14. Data verification of a hardware-software complex of sounding an ionosphere and ionosonde DPS-4

    NASA Astrophysics Data System (ADS)

    Smirnov, Vladimir; Ruzhin, Yuri; Smirnova, Elena; Skobelkin, Vladimir; Tynyankin, Sergey

    Opportunities that have appeared in recent years to use the spacecraft of the GLONASS and GPS global navigation satellite systems as signal sources for determining ionospheric parameters are not yet in widespread use at ionospheric wave frequency and radio centers and dispatch services. Given the urgency of this area of research, a long-term experiment was started in 2013 to compare the results of determining the critical frequency of the ionospheric F2 layer in two ways: vertical sounding (ionosonde DPS-4) and radio translucence of the "satellite-Earth" path using GLONASS and GPS signals. For the comparative analysis, the hardware-software complex of ionospheric sounding (HSCIS) was located on the territory of the Pushkov Institute of Terrestrial Magnetism, Ionosphere and Radio Wave Propagation of the Russian Academy of Sciences. The HSCIS includes a personal computer with specialized software, a dual-frequency navigation receiver and a small receiving antenna. The receiver, developed by NovAtel, receives the signals of the GPS/GLONASS navigation systems and processes them in real time. The receiver location was determined autonomously: antenna position 55.76° N, 37.94° E; coordinates of ionosonde DPS-4, 55.5° N, 37.3° E. Both devices were thus in close proximity, ensuring identical observation conditions, and both operate in real time: ionosonde DPS-4 provided ionospheric parameters every 15 minutes, the HSCIS every minute. Information from both instruments was displayed on monitors and recorded in computer memory. Along with the numerical parameters, the HSCIS displayed the time course of the critical frequency of the ionospheric F2 layer obtained from observations of the nearest navigation satellite.
When observations are limited to elevations above 15°, simultaneous use of navigation satellites can

  15. Web-based software tool for constraint-based design specification of synthetic biological systems.

    PubMed

    Oberortner, Ernst; Densmore, Douglas

    2015-06-19

    miniEugene provides computational support for solving combinatorial design problems, enabling users to specify and enumerate designs for novel biological systems based on sets of biological constraints. This technical note presents a brief tutorial for biologists and software engineers in the field of synthetic biology on how to use miniEugene. After reading this technical note, users should know which biological constraints are available in miniEugene, understand the syntax and semantics of these constraints, and be able to follow a step-by-step guide to specify the design of a classical synthetic biological system, the genetic toggle switch [1]. We also provide links and references to more information on the miniEugene web application and the integration of the miniEugene software library into sophisticated Computer-Aided Design (CAD) tools for synthetic biology (www.eugenecad.org). PMID:25426642

  16. Waste Receiving and Processing Facility Module 1 Data Management System Software Requirements Specification

    SciTech Connect

    Brann, E.C. II

    1994-09-09

    This document provides the software requirements for Waste Receiving and Processing (WRAP) Module 1 Data Management System (DMS). The DMS is one of the plant computer systems for the new WRAP 1 facility (Project W-026). The DMS will collect, store and report data required to certify the low level waste (LLW) and transuranic (TRU) waste items processed at WRAP 1 as acceptable for shipment, storage, or disposal.

  17. Waste Receiving and Processing Facility Module 1 Data Management System software requirements specification

    SciTech Connect

    Rosnick, C.K.

    1996-04-19

    This document provides the software requirements for Waste Receiving and Processing (WRAP) Module 1 Data Management System (DMS). The DMS is one of the plant computer systems for the new WRAP 1 facility (Project W-026). The DMS will collect, store and report data required to certify the low level waste (LLW) and transuranic (TRU) waste items processed at WRAP 1 as acceptable for shipment, storage, or disposal.

  18. Datgan, a reusable software system for facile interrogation and visualization of complex transcription profiling data

    PubMed Central

    2011-01-01

    Background We introduce Glaucoma Discovery Platform (GDP), an online environment for facile visualization and interrogation of complex transcription profiling datasets for glaucoma. We also report the availability of Datgan, the suite of scripts that was developed to construct GDP. This reusable software system complements existing repositories such as NCBI GEO or EBI ArrayExpress as it allows the construction of searchable databases to maximize understanding of user-selected transcription profiling datasets. Description Datgan scripts were used to construct both the underlying data tables and the web interface that form GDP. GDP is populated using data from a mouse model of glaucoma. The data was generated using the DBA/2J strain, a widely used mouse model of glaucoma. The DBA/2J-Gpnmb+ strain provided a genetically matched control strain that does not develop glaucoma. We separately assessed both the retina and the optic nerve head, important tissues in glaucoma. We used hierarchical clustering to identify early molecular stages of glaucoma that could not be identified using morphological assessment of disease. GDP has two components. First, an interactive search and retrieve component provides the ability to assess gene(s) of interest in all identified stages of disease in both the retina and optic nerve head. The output is returned in graphical and tabular format with statistically significant differences highlighted for easy visual analysis. Second, a bulk download component allows lists of differentially expressed genes to be retrieved as a series of files compatible with Excel. To facilitate access to additional information available for genes of interest, GDP is linked to selected external resources including Mouse Genome Informatics and Online Mendelian Inheritance in Man (OMIM). Conclusion Datgan-constructed databases allow user-friendly access to datasets that involve temporally ordered stages of disease or developmental stages. Datgan and GDP are

  19. STATIC AND KINETIC SITE-SPECIFIC PROTEIN-DNA PHOTOCROSSLINKING: ANALYSIS OF BACTERIAL TRANSCRIPTION INITIATION COMPLEXES

    PubMed Central

    Naryshkin, Nikolai; Druzhinin, Sergei; Revyakin, Andrei; Kim, Younggyu; Mekler, Vladimir; Ebright, Richard H.

    2009-01-01

    Static site-specific protein-DNA photocrosslinking permits identification of protein-DNA interactions within multiprotein-DNA complexes. Kinetic site-specific protein-DNA photocrosslinking--involving rapid-quench-flow mixing and pulsed-laser irradiation--permits elucidation of pathways and kinetics of formation of protein-DNA interactions within multiprotein-DNA complexes. We present detailed protocols for application of static and kinetic site-specific protein-DNA photocrosslinking to bacterial transcription initiation complexes. PMID:19378179

  20. A theory for graph-based language specification, analysis, and mapping with application to the development of parallel software

    SciTech Connect

    Bailor, P.D.

    1989-01-01

    This investigation develops a generalized formal language theory model and associated theorems for the specification, analysis, and mapping of graphs and graph-based languages. The developed model is defined as a graph generative system, and the model is analyzed from a set theoretic, formal language, algebraic, and abstract automata perspective. As a result of the analysis, numerous theorems pertaining to the properties of the model, graphs, and graph-based languages are derived. Additionally, the graph generative system model serves as the basis for applying graph-based languages to areas such as the specification and design of software and visual programming. The specific application area emphasized is the use of graph-based languages as user friendly interfaces for wide-spectrum languages that include structures for representing parallelism. The goal of this approach is to provide an effective, efficient, and formal method for the specification, design, and rapid prototyping of parallel software. To demonstrate the utility of the theory and the feasibility of the application, two models of parallel computation are chosen. The widely used Petri net model of parallel computation is formalized as a graph-based language. The Petri net syntax is formally mapped into the corresponding syntax of a Communicating Sequential Processes (CSP) model of parallel computation where CSP is used as the formalism for extended wide-spectrum languages. Finally, the Petri net to CSP mapping is analyzed from a behavioral perspective to demonstrate that the CSP specification formally behaves in a manner equivalent to the Petri net model.

  1. Theory for graph-based language specification, analysis, and mapping with application to the development of parallel software. Doctoral thesis

    SciTech Connect

    Bailor, P.D.

    1989-09-01

    A generalized formal language theory model and associated theorems were developed for the specification, analysis, and mapping of graphs and graph-based languages. The developed model, defined as a graph-generative system, is analyzed from a set theoretic, formal language, algebraic, and abstract automata perspective. As a result of the analysis, numerous theorems pertaining to the properties of the model, graphs, and graph-based languages are derived. The graph generative system model also serves as the basis for applying graph-based languages to areas such as the specification and design of software and visual programming. The specific application area emphasized is the use of graph-based languages as user-friendly interfaces for wide-spectrum languages that include structures for representing parallelism. The goal of this approach is to provide an effective, efficient, and formal method for the specification, design, and rapid prototyping of parallel software. To demonstrate the theory's utility and the feasibility of the application, two models of parallel computation are chosen. The widely used Petri net model of parallel computation is formalized as a graph-based language. The Petri net syntax is formally mapped into the corresponding syntax of a Communicating Sequential Processes (CSP) model of parallel computation where CSP is used as the formalism for extended wide-spectrum languages. Finally, the Petri net to CSP mapping is analyzed to demonstrate that the CSP specification formally behaves in a manner equivalent to the Petri net model.

  2. Tissue specificity in the nuclear envelope supports its functional complexity

    PubMed Central

    de las Heras, Jose I; Meinke, Peter; Batrakou, Dzmitry G; Srsen, Vlastimil; Zuleger, Nikolaj; Kerr, Alastair RW; Schirmer, Eric C

    2013-01-01

    Links between the nuclear envelope and inherited disease pose the conundrum of how mutations in near-ubiquitous proteins can yield many distinct pathologies, each focused in different tissues. One conundrum-resolving hypothesis is that tissue-specific partner proteins mediate these pathologies. Such partner proteins may have now been identified with recent proteome studies determining nuclear envelope composition in different tissues. These studies revealed that the majority of the total nuclear envelope proteins are tissue restricted in their expression. Moreover, functions have been found for a number of these tissue-restricted nuclear envelope proteins that fit with mechanisms proposed to explain how the nuclear envelope could mediate disease, including defects in mechanical stability, cell cycle regulation, signaling, genome organization, gene expression, nucleocytoplasmic transport, and differentiation. The wide range of functions to which these proteins contribute is consistent with not only their involvement in tissue-specific nuclear envelope disease pathologies, but also tissue evolution. PMID:24213376

  3. Onco-Regulon: an integrated database and software suite for site specific targeting of transcription factors of cancer genes

    PubMed Central

    Tomar, Navneet; Mishra, Akhilesh; Mrinal, Nirotpal; Jayaram, B.

    2016-01-01

    Transcription factors (TFs) bind at multiple sites in the genome and regulate expression of many genes. Regulating TF binding in a gene specific manner remains a formidable challenge in drug discovery because the same binding motif may be present at multiple locations in the genome. Here, we present Onco-Regulon (http://www.scfbio-iitd.res.in/software/onco/NavSite/index.htm), an integrated database of regulatory motifs of cancer genes combined with Unique Sequence-Predictor (USP), a software suite that identifies unique sequences for each of these regulatory DNA motifs at the specified position in the genome. USP works by extending a given DNA motif in the 5′→3′ direction, the 3′→5′ direction, or both, adding one nucleotide at each step, and calculates the frequency of each extended motif in the genome with the Frequency Counter programme. This step is iterated till the frequency of the extended motif becomes unity in the genome. Thus, for each given motif, we get three possible unique sequences. The Closest Sequence Finder program predicts off-target drug binding in the genome. Inclusion of DNA-protein structural information further makes Onco-Regulon a highly informative repository for gene specific drug development. We believe that Onco-Regulon will help researchers to design drugs which will bind to an exclusive site in the genome with, theoretically, no off-target effects. Database URL: http://www.scfbio-iitd.res.in/software/onco/NavSite/index.htm PMID:27515825
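    The extend-until-unique loop described above can be sketched as follows. This is a hypothetical toy re-implementation, not the USP code: the greedy choice among the four nucleotides is an illustrative assumption (the real tool extends the motif within its actual genomic context), and `genome` here is a toy string rather than a reference genome.

    ```python
    def count_occurrences(genome, motif):
        """Count (possibly overlapping) occurrences of motif in genome."""
        count = start = 0
        while True:
            idx = genome.find(motif, start)
            if idx == -1:
                return count
            count += 1
            start = idx + 1

    def extend_to_unique(genome, motif, direction="3'"):
        """Extend motif one nucleotide at a time until it occurs exactly once.

        direction: "3'" appends on the right, "5'" prepends on the left.
        Greedily picks the extension with the lowest nonzero frequency;
        returns the unique extended motif, or None if no extension works.
        """
        current = motif
        while count_occurrences(genome, current) > 1:
            best = None
            for nt in "ACGT":
                candidate = current + nt if direction == "3'" else nt + current
                freq = count_occurrences(genome, candidate)
                if freq >= 1 and (best is None or freq < best[0]):
                    best = (freq, candidate)
            if best is None:
                return None  # motif cannot be extended further in this genome
            current = best[1]
        return current

    print(extend_to_unique("ACGTACGGACGT", "ACG"))  # prints ACGG
    ```

    In the toy genome, "ACG" occurs three times, but the one-step extension "ACGG" occurs exactly once, so the loop stops there; running the search in both directions and in 5′→3′ alone yields the three candidate unique sequences the abstract mentions.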

  4. Onco-Regulon: an integrated database and software suite for site specific targeting of transcription factors of cancer genes.

    PubMed

    Tomar, Navneet; Mishra, Akhilesh; Mrinal, Nirotpal; Jayaram, B

    2016-01-01

    Transcription factors (TFs) bind at multiple sites in the genome and regulate expression of many genes. Regulating TF binding in a gene specific manner remains a formidable challenge in drug discovery because the same binding motif may be present at multiple locations in the genome. Here, we present Onco-Regulon (http://www.scfbio-iitd.res.in/software/onco/NavSite/index.htm), an integrated database of regulatory motifs of cancer genes combined with Unique Sequence-Predictor (USP), a software suite that identifies unique sequences for each of these regulatory DNA motifs at the specified position in the genome. USP works by extending a given DNA motif in the 5'→3' direction, the 3'→5' direction, or both, adding one nucleotide at each step, and calculates the frequency of each extended motif in the genome with the Frequency Counter programme. This step is iterated till the frequency of the extended motif becomes unity in the genome. Thus, for each given motif, we get three possible unique sequences. The Closest Sequence Finder program predicts off-target drug binding in the genome. Inclusion of DNA-protein structural information further makes Onco-Regulon a highly informative repository for gene specific drug development. We believe that Onco-Regulon will help researchers to design drugs which will bind to an exclusive site in the genome with, theoretically, no off-target effects. Database URL: http://www.scfbio-iitd.res.in/software/onco/NavSite/index.htm. PMID:27515825

  5. siDirect: highly effective, target-specific siRNA design software for mammalian RNA interference

    PubMed Central

    Naito, Yuki; Yamada, Tomoyuki; Ui-Tei, Kumiko; Morishita, Shinichi; Saigo, Kaoru

    2004-01-01

    siDirect (http://design.RNAi.jp/) is a web-based online software system for computing highly effective small interfering RNA (siRNA) sequences with maximum target-specificity for mammalian RNA interference (RNAi). Highly effective siRNA sequences are selected using novel guidelines that were established through an extensive study of the relationship between siRNA sequences and RNAi activity. Our efficient software avoids off-target gene silencing to enumerate potential cross-hybridization candidates that the widely used BLAST search may overlook. The website accepts an arbitrary sequence as input and quickly returns siRNA candidates, providing a wide scope of applications in mammalian RNAi, including systematic functional genomics and therapeutic gene silencing. PMID:15215364

  6. Simplifying the construction of domain-specific automatic programming systems: The NASA automated software development workstation project

    NASA Technical Reports Server (NTRS)

    Allen, Bradley P.; Holtzman, Peter L.

    1987-01-01

    An overview is presented of the Automated Software Development Workstation Project, an effort to explore knowledge-based approaches to increasing software productivity. The project focuses on applying the concept of domain specific automatic programming systems (D-SAPSs) to application domains at NASA's Johnson Space Center. A version of a D-SAPS developed in Phase 1 of the project for the domain of space station momentum management is described. How problems encountered during its implementation led researchers to concentrate on simplifying the process of building and extending such systems is discussed. Researchers propose to do this by attacking three observed bottlenecks in the D-SAPS development process through the increased automation of the acquisition of programming knowledge and the use of an object oriented development methodology at all stages of the program design. How these ideas are being implemented in the Bauhaus, a prototype workstation for D-SAPS development is discussed.

  7. Simplifying the construction of domain-specific automatic programming systems: The NASA automated software development workstation project

    NASA Technical Reports Server (NTRS)

    Allen, Bradley P.; Holtzman, Peter L.

    1988-01-01

    An overview is presented of the Automated Software Development Workstation Project, an effort to explore knowledge-based approaches to increasing software productivity. The project focuses on applying the concept of domain specific automatic programming systems (D-SAPSs) to application domains at NASA's Johnson Space Center. A version of a D-SAPS developed in Phase 1 of the project for the domain of space station momentum management is described. How problems encountered during its implementation led researchers to concentrate on simplifying the process of building and extending such systems is discussed. Researchers propose to do this by attacking three observed bottlenecks in the D-SAPS development process through the increased automation of the acquisition of programming knowledge and the use of an object oriented development methodology at all stages of the program design. How these ideas are being implemented in the Bauhaus, a prototype workstation for D-SAPS development is discussed.

  8. Data systems and computer science: Software Engineering Program

    NASA Technical Reports Server (NTRS)

    Zygielbaum, Arthur I.

    1991-01-01

    An external review of the Integrated Technology Plan for the Civil Space Program is presented. This review is specifically concerned with the Software Engineering Program. The goals of the Software Engineering Program are as follows: (1) improve NASA's ability to manage development, operation, and maintenance of complex software systems; (2) decrease NASA's cost and risk in engineering complex software systems; and (3) provide technology to assure safety and reliability of software in mission critical applications.

  9. The definitive analysis of the Bendandi's methodology performed with a specific software

    NASA Astrophysics Data System (ADS)

    Ballabene, Adriano; Pescerelli Lagorio, Paola; Georgiadis, Teodoro

    2015-04-01

    The presentation aims to clarify the "Bendandi Method", supposed in the past to enable earthquake forecasting, which the geophysicist from Faenza never explicitly passed on to posterity. The geoethics implications of Bendandi's forecasts, and of the speculation about possible earthquakes inferred from supposed "Bendandian" methodologies, emerged in previous years through social alarms over predicted earthquakes that never happened but were widely spread by the media following some 'well informed' non-conventional scientists. The analysis was conducted through an extensive literature search of the 'Raffaele Bendandi' archive at the Geophysical Observatory of Faenza, and the forecasts were analyzed using specially developed software, called "Bendandiano Dashboard", which can reproduce the planetary configurations reported in the graphs made by the Italian geophysicist. This analysis should serve to clarify 'definitively' the basis of Bendandi's calculations, as well as to prevent future unwarranted warnings issued on the basis of supposed prophecies and illusory legacy documents.

  10. As-built design specification for P1A software system modified display subsystem

    NASA Technical Reports Server (NTRS)

    Horton, C. L.; Story, A. S. (Principal Investigator)

    1980-01-01

    This document contains the design of the proportional estimate processor which was written to satisfy the software requirement of Part A of the P1A experiment. The purposes of the project are: (1) to select the dots to be labelled; (2) to create tables of green numbers and brightness values for all selected dots per acquisition; (3) to create scatter plots of green numbers vs brightness for each acquisition for all selected dots. If labels have been provided then scatter plots of only categories of interest can be optionally produced; and (4) to produce trajectory plots of green number vs brightness at differing acquisition times for each dot. These plots need to be in the same order as the list of selected dots. When labels are provided only plots of dots of categories of interest are to be produced.

  11. Hydrological modelling of large river basins using the ECOMAG software complex

    NASA Astrophysics Data System (ADS)

    Motovilov, Yuri

    2014-05-01

    According to some hydrologists, the characteristic scale of river basins when using traditional physically based models of runoff formation is limited to the size of a small (elementary) river basin. Within its limits, these models can describe hydrological processes on the different parts of the slopes and in the river network in great detail. For hydrological simulation of large river basins, it is reasonable to use larger computational cells of hundreds or even thousands of square kilometers. The problem is to find new (as compared to point-scale) computational elements of a certain scale, to generalize (filter out) micro-scale fluctuations of characteristics that are of secondary importance at this level of consideration, and to parameterize models of hydrological processes at the meso- and macroscale levels. In this case, the fine spatial resolution of detailed physically based models is no longer needed to describe hydrological processes, since aggregated models operate with flows averaged over the elementary catchments. In particular, such an ideology is adopted in the semi-distributed hydrological model ECOMAG, in which a major river basin is covered with a grid of elementary catchments; for each of these, a physically based model with lumped parameters is described by a system of ordinary differential equations, most of which are obtained by integrating the basic equations of detailed physically based models over space. For solving practical and research tasks with the help of an up-to-date informational and technological background, a software complex (SC) was developed on the basis of the ECOMAG model with a daily time step resolution, which included a specialized geographical information system (GIS); databases of archival and operational data on hydrological, meteorological and water management monitoring for the whole of Russia; watershed characteristics; as well as the command shell. An ability of hydrological simulation of large river basins using SC ECOMAG is
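The lumped-parameter idea described above can be sketched with a single elementary catchment treated as a linear reservoir; the equation and all parameter values here are illustrative assumptions, not ECOMAG's actual formulation:

```python
# Toy lumped catchment (not ECOMAG's equations): storage S [mm] obeys
# dS/dt = P - E - S/k with precipitation P, evaporation E, and residence
# time k [days]; runoff is Q = S/k. Integrated with a daily Euler step,
# matching the daily time step mentioned for the software complex.

def simulate_catchment(precip, evap, k=5.0, s0=10.0):
    """Return daily runoff [mm/day] for a lumped linear-reservoir catchment."""
    s, runoff = s0, []
    for p, e in zip(precip, evap):
        q = s / k                      # linear outflow from current storage
        s = max(s + p - e - q, 0.0)    # daily water balance, storage >= 0
        runoff.append(q)
    return runoff

q = simulate_catchment(precip=[0, 20, 5, 0, 0], evap=[1, 1, 2, 2, 1])
print([round(x, 2) for x in q])  # → [2.0, 1.4, 4.92, 4.54, 3.23]
```

In a semi-distributed model, one such unit would run per elementary catchment, with the resulting flows routed through the river network.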

  12. The Differential Effects of Task Complexity on Domain-Specific and Peer Assessment Skills

    ERIC Educational Resources Information Center

    van Zundert, Marjo J.; Sluijsmans, Dominique M. A.; Konings, Karen D.; van Merrienboer, Jeroen J. G.

    2012-01-01

    In this study the relationship between domain-specific skills and peer assessment skills as a function of task complexity is investigated. We hypothesised that peer assessment skills were superposed on domain-specific skills and will therefore suffer more when higher cognitive load is induced by increased task complexity. In a mixed factorial…

  13. The Decision-Adoption of Software Currency and Specificity Levels: A Quantitative Study of Information Technology (IT) Training

    ERIC Educational Resources Information Center

    Somsen, W. Randy

    2009-01-01

    IT educators and learners are continually challenged to adapt to software changes and remain current in software applications. This study researched if it is necessary for IT educators to prepare IT students in the most current software releases--i.e., at the highest software currency preparation level and in the same software application that the…

  14. A Software System to Collect Expert Relevance Ratings of Medical Record Items for Specific Clinical Tasks

    PubMed Central

    Krishnaraj, Arun; Alkasab, Tarik K

    2014-01-01

    Development of task-specific electronic medical record (EMR) searches and user interfaces has the potential to improve the efficiency and safety of health care while curbing rising costs. The development of such tools must be data-driven and guided by a strong understanding of practitioner information requirements with respect to specific clinical tasks or scenarios. To acquire this important data, this paper describes a model by which expert practitioners are leveraged to identify which components of the medical record are most relevant to a specific clinical task. We also describe the computer system that was created to efficiently implement this model of data gathering. The system extracts medical record data from the EMR of patients matching a given clinical scenario, de-identifies the data, breaks the data up into separate medical record items (eg, radiology reports, operative notes, laboratory results, etc), presents each individual medical record item to experts under the hypothetical of the given clinical scenario, and records the experts’ ratings regarding the relevance of each medical record item to that specific clinical scenario or task. After an iterative process of data collection, these expert relevance ratings can then be pooled and used to design point-of-care EMR searches and user interfaces tailored to the task-specific needs of practitioners. PMID:25600925
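The pooling step described above might look like the following sketch; the data model (item types, a 1-5 relevance scale) is a hypothetical illustration, not the paper's actual schema:

```python
# Hypothetical pooling of expert relevance ratings: each expert scores a
# medical-record item's relevance to a clinical scenario on a 1-5 scale;
# pooled means then rank item types for a task-specific EMR display.

from collections import defaultdict
from statistics import mean

def pool_ratings(ratings):
    """ratings: list of (item_type, score). Returns (mean, type) pairs, descending."""
    by_type = defaultdict(list)
    for item_type, score in ratings:
        by_type[item_type].append(score)
    return sorted(((mean(v), t) for t, v in by_type.items()), reverse=True)

ratings = [("radiology_report", 5), ("radiology_report", 4),
           ("lab_result", 3), ("operative_note", 2), ("lab_result", 4)]
print(pool_ratings(ratings))
```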

  15. Software system safety

    NASA Technical Reports Server (NTRS)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  16. Technical Specifications for Hardware and Software, and Maintenance in Support of Computer Literacy Program. Volume II.

    ERIC Educational Resources Information Center

    District of Columbia Public Schools, Washington, DC.

    Designed for use by vendors, this guide provides an overview of the objectives for the 5-year computer literacy program to be implemented in the District of Columbia Public Schools; outlines requirements which are mandatory elements of vendors' bids unless explicitly designated "desirable"; and details specifications for computing equipment,…

  17. Software Design for Wireless Sensor-based Site-specific Irrigation

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In-field sensor-based site-specific irrigation management is of benefit to producers for efficient water management. Integration of the decision making process with the controls is a viable option for determining when and where to irrigate, and how much water to apply. This research presents the des...

  18. 75 FR 16817 - Meeting for Software Developers on the Technical Specifications for Common Formats for Patient...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-02

    ... Safety Rule), published in the Federal Register on November 21, 2008: 73 FR 70731- 70814. As authorized... reporting formats (Common Formats) Version 1.1 that allow for reporting of patient safety information to... information on the Common Formats Version 1.1, including the technical specifications, can be obtained...

  19. Software for Managing Complex Learning: Examples from an Educational Psychology Course.

    ERIC Educational Resources Information Center

    Schwartz, Daniel L.; Brophy, Sean; Lin, Xiaodong; Bransford, John D.

    1999-01-01

    Describes a software shell, STAR.Legacy, designed to guide attempts to help students learn from case, problem, and project-based learning. STAR.Legacy supports the integration of four types of learning environments: learner-centered, knowledge-centered, assessment-centered, and community-centered. Describes how a STAR.Legacy constructed for an…

  20. Software Program: Software Management Guidebook

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The purpose of this NASA Software Management Guidebook is twofold. First, this document defines the core products and activities required of NASA software projects. It defines life-cycle models and activity-related methods but acknowledges that no single life-cycle model is appropriate for all NASA software projects. It also acknowledges that the appropriate method for accomplishing a required activity depends on characteristics of the software project. Second, this guidebook provides specific guidance to software project managers and team leaders in selecting appropriate life cycles and methods to develop a tailored plan for a software engineering project.

  1. The Effect of Software Features on Software Adoption and Training in the Audit Profession

    ERIC Educational Resources Information Center

    Kim, Hyo-Jeong

    2012-01-01

    Although software has been studied with technology adoption and training research, the study of specific software features for professional groups has been limited. To address this gap, I researched the impact of software features of varying complexity on internal audit (IA) professionals. Two studies along with the development of training…

  2. Scientific Software Component Technology

    SciTech Connect

    Kohn, S.; Dykman, N.; Kumfert, G.; Smolinski, B.

    2000-02-16

    We are developing new software component technology for high-performance parallel scientific computing to address issues of complexity, re-use, and interoperability for laboratory software. Component technology enables cross-project code re-use, reduces software development costs, and provides additional simulation capabilities for massively parallel laboratory application codes. The success of our approach will be measured by its impact on DOE mathematical and scientific software efforts. Thus, we are collaborating closely with library developers and application scientists in the Common Component Architecture forum, the Equation Solver Interface forum, and other DOE mathematical software groups to gather requirements, write and adopt a variety of design specifications, and develop demonstration projects to validate our approach. Numerical simulation is essential to the science mission at the laboratory. However, it is becoming increasingly difficult to manage the complexity of modern simulation software. Computational scientists develop complex, three-dimensional, massively parallel, full-physics simulations that require the integration of diverse software packages written by outside development teams. Currently, the integration of a new software package, such as a new linear solver library, can require several months of effort. Current industry component technologies such as CORBA, JavaBeans, and COM have all been used successfully in the business domain to reduce software development costs and increase software quality. However, these existing industry component infrastructures will not scale to support massively parallel applications in science and engineering. In particular, they do not address issues related to high-performance parallel computing on ASCI-class machines, such as fast in-process connections between components, language interoperability for scientific languages such as Fortran, parallel data redistribution between components, and massively

  3. Software engineering methodologies and tools

    NASA Technical Reports Server (NTRS)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to the industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  4. Space Shuttle Main Engine Quantitative Risk Assessment: Illustrating Modeling of a Complex System with a New QRA Software Package

    NASA Technical Reports Server (NTRS)

    Smart, Christian

    1998-01-01

    During 1997, a team from Hernandez Engineering, MSFC, Rocketdyne, Thiokol, Pratt & Whitney, and USBI completed the first phase of a two-year Quantitative Risk Assessment (QRA) of the Space Shuttle. The models for the Shuttle systems were entered and analyzed by a new QRA software package. This system, termed the Quantitative Risk Assessment System (QRAS), was designed by NASA and programmed by the University of Maryland. The software is a groundbreaking PC-based risk assessment package that allows the user to model complex systems in a hierarchical fashion. Features of the software include the ability to easily select quantifications of failure modes, draw Event Sequence Diagrams (ESDs) interactively, perform uncertainty and sensitivity analysis, and document the modeling. This paper illustrates both the approach used in modeling and the particular features of the software package. The software is general and can be used in a QRA of any complex engineered system. The author is the project lead for the modeling of the Space Shuttle Main Engines (SSMEs), and this paper focuses on the modeling completed for the SSMEs during 1997. In particular, the groundrules for the study, the databases used, the way in which ESDs were used to model catastrophic failure of the SSMEs, the methods used to quantify the failure rates, and how QRAS was used in the modeling effort are discussed. Groundrules were necessary to limit the scope of such a complex study, especially with regard to a liquid rocket engine such as the SSME, which can be shut down after ignition either on the pad or in flight. The SSME was divided into its constituent components and subsystems. These were ranked on the basis of the possibility of being upgraded and risk of catastrophic failure. Once this was done the Shuttle program Hazard Analysis and Failure Modes and Effects Analysis (FMEA) were used to create a list of potential failure modes to be modeled. The groundrules and other criteria were used to screen
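As a toy illustration of how an event sequence diagram is quantified (not the QRAS implementation; event names and probabilities below are invented), the probability of each end state is the product of the branch probabilities along its path:

```python
# Toy ESD quantification: an initiating event is followed by pivotal
# events, each failing with a given probability; failure of any pivotal
# event leads to its own (e.g. catastrophic) end state, and success of
# all of them leads to the nominal end state.

def esd_end_states(init_freq, pivotal):
    """pivotal: list of (name, p_fail). Returns {end_state: frequency}."""
    out, p_sofar = {}, init_freq
    for name, p_fail in pivotal:
        out[f"fail:{name}"] = p_sofar * p_fail   # branch off at this event
        p_sofar *= (1.0 - p_fail)                # continue down success path
    out["nominal"] = p_sofar
    return out

# Invented example: two pivotal events after an initiating event.
states = esd_end_states(1.0, [("ignition", 1e-3), ("shutdown", 1e-4)])
print(states)
```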

  5. Specificity of IS6110-based amplification assays for Mycobacterium tuberculosis complex.

    PubMed Central

    Hellyer, T J; DesJardin, L E; Assaf, M K; Bates, J H; Cave, M D; Eisenach, K D

    1996-01-01

    The specificity of IS6110 for the Mycobacterium tuberculosis complex has recently been questioned. We observed no cross-reaction with 27 nontuberculous mycobacteria using strand displacement- and PCR-based amplification of the nucleotide 970 to 1026 and 762 to 865 regions of IS6110. These data support use of selected regions of IS6110 as M. tuberculosis complex-specific targets. PMID:8897197

  6. Specific antibody for pesticide residue determination produced by antibody-pesticide complex

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A new method for specific antibody production was developed using antibody (Ab)-pesticide complex as a unique immunogen. Parathion (PA) was the targeted pesticide, and rabbit polyclonal antibody (Pab) and mouse monoclonal antibody (Mab) were used as carrier proteins. The Ab-PA complexes were genera...

  7. Three-dimensional representations of complex carbohydrates and polysaccharides--SweetUnityMol: a video game-based computer graphic software.

    PubMed

    Pérez, Serge; Tubiana, Thibault; Imberty, Anne; Baaden, Marc

    2015-05-01

    A molecular visualization program tailored to deal with the range of 3D structures of complex carbohydrates and polysaccharides, either alone or in their interactions with other biomacromolecules, has been developed using advanced technologies elaborated by the video games industry. All the specific structural features displayed by the simplest to the most complex carbohydrate molecules have been considered and can be depicted. This concerns the monosaccharide identification and classification, conformations, location in single or multiple branched chains, depiction of secondary structural elements and the essential constituting elements in very complex structures. Particular attention was given to cope with the accepted nomenclature and pictorial representation used in glycoscience. This achievement provides a continuum between the most popular ways to depict the primary structures of complex carbohydrates to visualizing their 3D structures while giving the users many options to select the most appropriate modes of representations including new features such as those provided by the use of textures to depict some molecular properties. These developments are incorporated in a stand-alone viewer capable of displaying molecular structures, biomacromolecule surfaces and complex interactions of biomacromolecules, with powerful, artistic and illustrative rendering methods. They result in an open source software compatible with multiple platforms, i.e., Windows, MacOS and Linux operating systems, web pages, and producing publication-quality figures. The algorithms and visualization enhancements are demonstrated using a variety of carbohydrate molecules, from glycan determinants to glycoproteins and complex protein-carbohydrate interactions, as well as very complex mega-oligosaccharides and bacterial polysaccharides and multi-stranded polysaccharide architectures. PMID:25475093

  8. A Software System for Filling Complex Holes in 3D Meshes by Flexible Interacting Particles

    NASA Astrophysics Data System (ADS)

    Yamazaki, Daisuke; Savchenko, Vladimir

    3D meshes generated by acquisition devices such as laser range scanners often contain holes due to occlusion, etc. In practice, these holes are extremely geometrically and topologically complex. We propose a heuristic hole filling technique using particle systems to fill complex holes with arbitrary topology in 3D meshes. Our approach includes the following steps: hole identification, base surface creation, particle distribution, triangulation, and mesh refinement. We demonstrate the functionality of the proposed surface retouching system on synthetic and real data.
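The first step above, hole identification, can be sketched with a standard observation (the paper's own method may differ in detail): in a manifold triangle mesh, an edge used by exactly one triangle lies on a boundary, and chaining such edges yields the loops around each hole.

```python
# Find boundary edges of a triangle mesh: count how many triangles use
# each undirected edge; edges with count 1 bound a hole (or the mesh rim).

from collections import Counter

def boundary_edges(triangles):
    """triangles: list of (i, j, k) vertex-index triples.
    Returns sorted edges (as sorted index pairs) used by exactly one triangle."""
    count = Counter()
    for i, j, k in triangles:
        for e in ((i, j), (j, k), (k, i)):
            count[tuple(sorted(e))] += 1
    return sorted(e for e, c in count.items() if c == 1)

# Two triangles sharing edge (1, 2): every other edge is on the boundary.
print(boundary_edges([(0, 1, 2), (1, 3, 2)]))  # → [(0, 1), (0, 2), (1, 3), (2, 3)]
```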

  9. Detailed analysis of complex single molecule FRET data with the software MASH

    NASA Astrophysics Data System (ADS)

    Hadzic, Mélodie C. A. S.; Kowerko, Danny; Börner, Richard; Zelger-Paulus, Susann; Sigel, Roland K. O.

    2016-04-01

    The processing and analysis of surface-immobilized single molecule FRET (Förster resonance energy transfer) data follows systematic steps (e.g. single molecule localization, clearance of different sources of noise, selection of the conformational and kinetic model, etc.) that require solid knowledge of optics, photophysics, signal processing and statistics. The present proceeding aims at standardizing and facilitating procedures for single molecule detection by guiding the reader through an optimization protocol for a particular experimental data set. Relevant features were determined from synthetically recreated single molecule movies (SMM) imaging Cy3- and Cy5-labeled Sc.ai5γ group II intron molecules, to test the performance of four different detection algorithms. Up to 120 different parameterizations per method were routinely evaluated to finally establish an optimum detection procedure. The present protocol is adaptable to any movie displaying surface-immobilized molecules, and can be easily reproduced with our home-written software MASH (multifunctional analysis software for heterogeneous data) and script routines (both available in the download section of www.chem.uzh.ch/rna).

  10. Application of the Software as a Service Model to the Control of Complex Building Systems

    SciTech Connect

    Stadler, Michael; Donadee, Jon; Marnay, Chris; Mendes, Goncalo; Appen, Jan von; Mégel, Oliver; Bhattacharya, Prajesh; DeForest, Nicholas; Lai, Judy

    2011-03-18

    In an effort to create broad access to its optimization software, Lawrence Berkeley National Laboratory (LBNL), in collaboration with the University of California at Davis (UC Davis) and OSISoft, has recently developed a Software as a Service (SaaS) Model for reducing energy costs, cutting peak power demand, and reducing carbon emissions for multipurpose buildings. UC Davis currently collects and stores energy usage data from buildings on its campus. Researchers at LBNL sought to demonstrate that a SaaS application architecture could be built on top of this data system to optimize the scheduling of electricity and heat delivery in the building. The SaaS interface, known as WebOpt, consists of two major parts: a) the investment & planning and b) the operations module, which builds on the investment & planning module. The operational scheduling and load shifting optimization models within the operations module use data from load prediction and electrical grid emissions models to create an optimal operating schedule for the next week, reducing peak electricity consumption while maintaining quality of energy services. LBNL's application also provides facility managers with suggested energy infrastructure investments for achieving their energy cost and emission goals based on historical data collected with OSISoft's system. This paper describes these models as well as the SaaS architecture employed by LBNL researchers to provide asset scheduling services to UC Davis. The peak demand, emissions, and cost implications of the asset operation schedule and investments suggested by this optimization model are analyzed.
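The peak-shaving objective described above can be illustrated with a greedy storage-dispatch sketch; this is a hypothetical simplification (WebOpt itself solves a formal optimization model over a weekly horizon):

```python
# Toy peak shaving: discharge a storage asset whenever load exceeds a
# target cap, recharge when load is below it, subject to energy limits.
# All load values and parameters below are invented for illustration.

def peak_shave(load, cap, storage=10.0, max_storage=10.0):
    """Return the net grid load after storage dispatch against a target cap."""
    net = []
    for l in load:
        if l > cap:                                  # discharge to shave the peak
            d = min(l - cap, storage)
            storage -= d
            net.append(l - d)
        else:                                        # recharge within headroom
            c = min(cap - l, max_storage - storage)
            storage += c
            net.append(l + c)
    return net

print(peak_shave([5, 12, 15, 6, 4], cap=10.0))  # → [5.0, 10.0, 10.0, 10.0, 7.0]
```

The greedy rule flattens the profile toward the cap; a real scheduler would co-optimize against prices and grid emissions as the abstract describes.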

  11. Application of the Software as a Service Model to the Control of Complex Building Systems

    SciTech Connect

    Stadler, Michael; Donadee, Jonathan; Marnay, Chris; Mendes, Goncalo; Appen, Jan von; Megel, Oliver; Bhattacharya, Prajesh; DeForest, Nicholas; Lai, Judy

    2011-03-17

    In an effort to create broad access to its optimization software, Lawrence Berkeley National Laboratory (LBNL), in collaboration with the University of California at Davis (UC Davis) and OSISoft, has recently developed a Software as a Service (SaaS) Model for reducing energy costs, cutting peak power demand, and reducing carbon emissions for multipurpose buildings. UC Davis currently collects and stores energy usage data from buildings on its campus. Researchers at LBNL sought to demonstrate that a SaaS application architecture could be built on top of this data system to optimize the scheduling of electricity and heat delivery in the building. The SaaS interface, known as WebOpt, consists of two major parts: a) the investment & planning and b) the operations module, which builds on the investment & planning module. The operational scheduling and load shifting optimization models within the operations module use data from load prediction and electrical grid emissions models to create an optimal operating schedule for the next week, reducing peak electricity consumption while maintaining quality of energy services. LBNL's application also provides facility managers with suggested energy infrastructure investments for achieving their energy cost and emission goals based on historical data collected with OSISoft's system. This paper describes these models as well as the SaaS architecture employed by LBNL researchers to provide asset scheduling services to UC Davis. The peak demand, emissions, and cost implications of the asset operation schedule and investments suggested by this optimization model are analysed.

  12. Single Event Testing on Complex Devices: Test Like You Fly Versus Test-Specific Design Structures

    NASA Technical Reports Server (NTRS)

    Berg, Melanie D.; Label, Kenneth

    2016-01-01

    We present a mechanism for evaluating complex digital systems targeted for harsh radiation environments such as space. Focus is limited to analyzing the single event upset (SEU) susceptibility of designs implemented inside Field Programmable Gate Array (FPGA) devices. Tradeoffs are provided between application-specific versus test-specific test structures.

  13. GenoMatrix: A Software Package for Pedigree-Based and Genomic Prediction Analyses on Complex Traits.

    PubMed

    Nazarian, Alireza; Gezan, Salvador Alejandro

    2016-07-01

    Genomic and pedigree-based best linear unbiased prediction methodologies (G-BLUP and P-BLUP) have proven themselves efficient for partitioning the phenotypic variance of complex traits into its components, estimating the individuals' genetic merits, and predicting unobserved (or yet-to-be observed) phenotypes in many species and fields of study. The GenoMatrix software, presented here, is a user-friendly package to facilitate the process of using genome-wide marker data and parentage information for G-BLUP and P-BLUP analyses on complex traits. It provides users with a collection of applications which help them on a set of tasks from performing quality control on data to constructing and manipulating the genomic and pedigree-based relationship matrices and obtaining their inverses. Such matrices will be then used in downstream analyses by other statistical packages. The package also enables users to obtain predicted values for unobserved individuals based on the genetic values of observed related individuals. GenoMatrix is available to the research community as a Windows 64bit executable and can be downloaded free of charge at: http://compbio.ufl.edu/software/genomatrix/. PMID:27025440
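Constructing a genomic relationship matrix for G-BLUP can be sketched as follows, using VanRaden's commonly cited centering-and-scaling scheme; GenoMatrix's own implementation may differ in detail, and the genotypes below are invented:

```python
# Sketch of a genomic relationship matrix (GRM): center the 0/1/2
# allele-count matrix by twice the allele frequencies, then scale the
# cross-product by 2 * sum(p * (1 - p)) over markers.

def grm(M):
    """M: n_individuals x n_markers nested lists of 0/1/2 allele counts."""
    n, m = len(M), len(M[0])
    p = [sum(row[j] for row in M) / (2.0 * n) for j in range(m)]   # allele freqs
    Z = [[M[i][j] - 2.0 * p[j] for j in range(m)] for i in range(n)]
    denom = 2.0 * sum(pj * (1.0 - pj) for pj in p)
    return [[sum(Z[i][k] * Z[j][k] for k in range(m)) / denom
             for j in range(n)] for i in range(n)]

G = grm([[0, 1, 2], [1, 1, 0], [2, 0, 1]])
print([round(G[i][i], 3) for i in range(3)])  # diagonal ≈ 1 + genomic inbreeding
```

The resulting symmetric matrix (or its inverse) is what downstream mixed-model software consumes in place of the pedigree-based numerator relationship matrix.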

  14. Software Measurement Guidebook

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This Software Measurement Guidebook is based on the extensive experience of several organizations that have each developed and applied significant measurement programs over a period of at least 10 years. The lessons derived from those experiences reflect not only successes but also failures. By applying those lessons, an organization can minimize, or at least reduce, the time, effort, and frustration of introducing a software measurement program. The Software Measurement Guidebook is aimed at helping organizations to begin or improve a measurement program. It does not provide guidance for the extensive application of specific measures (such as how to estimate software cost or analyze software complexity) other than by providing examples to clarify points. It does contain advice for establishing and using an effective software measurement program and for understanding some of the key lessons that other organizations have learned. Some of that advice will appear counterintuitive, but it is all based on actual experience. Although all of the information presented in this guidebook is derived from specific experiences of mature measurement programs, the reader must keep in mind that the characteristics of every organization are unique. Some degree of measurement is critical for all software development and maintenance organizations, and most of the key rules captured in this report will be generally applicable. Nevertheless, each organization must strive to understand its own environment so that the measurement program can be tailored to suit its characteristics and needs.

  15. Use of checkpoint-restart for complex HEP software on traditional architectures and Intel MIC

    NASA Astrophysics Data System (ADS)

    Arya, Kapil; Cooperman, Gene; Dotti, Andrea; Elmer, Peter

    2014-06-01

    Process checkpoint-restart is a technology with great potential for use in HEP workflows. Use cases include debugging; reducing the startup time of applications, both in offline batch jobs and in the High Level Trigger; permitting job preemption in environments where spare CPU cycles are used opportunistically; and efficiently scheduling a mix of multicore and single-threaded jobs. We report on tests of checkpoint-restart technology using CMS software, Geant4-MT (multi-threaded Geant4), and the DMTCP (Distributed Multithreaded Checkpointing) package. We analyze both single- and multi-threaded applications and test on both standard Intel x86 architectures and on Intel MIC. The tests with multi-threaded applications on Intel MIC are used to consider scalability and performance. These are considered an indicator of what the future may hold for many-core computing.
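DMTCP checkpoints a whole process transparently, with no application changes. The underlying resume-from-saved-state idea can be illustrated with an application-level sketch (the file name and state layout here are hypothetical, not part of DMTCP):

```python
import os
import pickle

CKPT = "state.ckpt"  # hypothetical checkpoint file name

def long_computation(state):
    """Accumulate a sum, checkpointing after every step."""
    while state["i"] < 10:
        state["total"] += state["i"]
        state["i"] += 1
        with open(CKPT, "wb") as f:  # save progress so a restart can resume
            pickle.dump(state, f)
    return state["total"]

# Restart logic: resume from the last checkpoint if one exists.
if os.path.exists(CKPT):
    with open(CKPT, "rb") as f:
        state = pickle.load(f)
else:
    state = {"i": 0, "total": 0}

result = long_computation(state)
os.remove(CKPT)  # clean up once the job completes
assert result == sum(range(10))
```

A system-level tool like DMTCP does this for the entire process image (threads, sockets, open files), which is what makes preemption and fast restart possible without modifying the application.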

  16. Effect of formal specifications on program complexity and reliability: An experimental study

    NASA Technical Reports Server (NTRS)

    Goel, Amrit L.; Sahoo, Swarupa N.

    1990-01-01

    The results are presented of an experimental study undertaken to assess the improvement in program quality obtained by using formal specifications. Specifications in the Z notation were developed for a simple but realistic antimissile system. Two programmers then used these specifications to develop two versions in C. Another set of three versions in Ada was independently developed from informal specifications in English. A comparison of the reliability and complexity of the resulting programs suggests the advantages of using formal specifications in terms of the number of errors detected and fault avoidance.

  17. TASSEL: Software for Association Mapping of Complex Traits in Diverse Samples

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The association mapping of complex traits is becoming the method of choice in plant and animal genetics. In most samples, researchers have to deal with both population and family structure. TASSEL (Trait Analysis by aSSociation, Evolution and Linkage) implements general linear model and mixed line...

  18. Software safety

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy

    1987-01-01

    Software safety and its relationship to other qualities are discussed. It is shown that standard reliability and fault tolerance techniques will not solve the safety problem for the present. A new attitude is required: look at what you do NOT want the software to do along with what you want it to do, and assume that things will go wrong. New procedures and changes to the entire software development process are necessary: special software safety analysis techniques are needed, and design techniques, especially eliminating complexity, can be very helpful.

  19. Modeling software systems by domains

    NASA Technical Reports Server (NTRS)

    Dippolito, Richard; Lee, Kenneth

    1992-01-01

    The Software Architectures Engineering (SAE) Project at the Software Engineering Institute (SEI) has developed engineering modeling techniques that both reduce the complexity of software for domain-specific computer systems and result in systems that are easier to build and maintain. These techniques allow maximum freedom for system developers to apply their domain expertise to software. We have applied these techniques to several types of applications, including training simulators operating in real time, engineering simulators operating in non-real time, and real-time embedded computer systems. Our modeling techniques result in software that mirrors both the complexity of the application and the domain knowledge requirements. We submit that the proper measure of software complexity reflects neither the number of software component units nor the code count, but the locus of and amount of domain knowledge. As a result of using these techniques, domain knowledge is isolated by fields of engineering expertise and removed from the concern of the software engineer. In this paper, we will describe kinds of domain expertise, describe engineering by domains, and provide relevant examples of software developed for simulator applications using the techniques.

  20. RT-Syn: A real-time software system generator

    NASA Technical Reports Server (NTRS)

    Setliff, Dorothy E.

    1992-01-01

    This paper presents research into providing highly reusable and maintainable components by using automatic software synthesis techniques. This proposal uses domain knowledge combined with automatic software synthesis techniques to engineer large-scale mission-critical real-time software. The hypothesis centers on a software synthesis architecture that specifically incorporates application-specific (in this case real-time) knowledge. This architecture synthesizes complex system software to meet a behavioral specification and external interaction design constraints. Some examples of these external constraints are communication protocols, precisions, timing, and space limitations. The incorporation of application-specific knowledge facilitates the generation of mathematical software metrics which are used to narrow the design space, thereby making software synthesis tractable. Success has the potential to dramatically reduce mission-critical system life-cycle costs not only by reducing development time but, more importantly, by facilitating maintenance, modifications, and extensions of complex mission-critical software systems, which currently dominate life-cycle costs.

  1. Screening Leishmania donovani complex-specific genes required for visceral disease.

    PubMed

    Zhang, Wen-Wei; Matlashewski, Greg

    2015-01-01

    Leishmania protozoan parasites are the causative agents of leishmaniasis. Depending on the infecting species, Leishmania infection can cause a wide variety of diseases, such as self-healing cutaneous lesions by L. major and fatal visceral leishmaniasis by L. donovani and L. infantum. Comparison of the genome of the visceral-disease-causing L. infantum with the genomes of the cutaneous-disease-causing L. major and L. braziliensis has identified 25 L. infantum (L. donovani complex) species-specific genes that are absent or pseudogenes in L. major and L. braziliensis. To investigate whether these L. donovani complex species-specific genes are involved in visceral infection, we cloned these genes from L. donovani, introduced them into L. major, and then determined whether the transgenic L. major had an increased ability to survive in the liver and spleen of BALB/c mice. Several of these L. donovani complex-specific genes, including the A2 and Ld2834 genes, were found to significantly increase L. major survival in visceral organs of BALB/c mice, while down-regulation of these genes in L. donovani by either antisense RNA or gene knockout dramatically reduced L. donovani virulence in BALB/c mice. This demonstrates that L. donovani complex species-specific genes play important roles in visceral infection. In this chapter, we describe procedures to screen for L. donovani complex-specific genes required for visceral infection by cross-species transgenic expression, gene deletion targeting, and measuring infection levels in mice. PMID:25388124

  2. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis

    PubMed Central

    Li, Zhenye; Pan, Chao; Duan, Huilong

    2015-01-01

    The study of complex proteomes places greater demands on quantification methods based on mass spectrometry. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis that makes full use of tandem mass spectrometry (MS/MS) spectral count, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, MS/MS total ion count coupled with spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant improve the accuracy of quantification with better dynamic range. PMID:26665161
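The abstract mentions combining spectral count with protein sequence length. One standard way of doing that is the normalized spectral abundance factor (NSAF); freeQuant's exact algorithm may differ, so this is only a sketch of the general technique:

```python
def nsaf(spectral_counts, lengths):
    """Normalized spectral abundance factor per protein.

    spectral_counts: MS/MS spectral count for each protein
    lengths: sequence length (residues) for each protein
    """
    # Length-normalize the counts, then normalize across the sample
    saf = [c / l for c, l in zip(spectral_counts, lengths)]
    total = sum(saf)
    return [s / total for s in saf]

# Toy sample: three proteins with their spectral counts and lengths
abund = nsaf([120, 30, 50], [400, 100, 250])
assert abs(sum(abund) - 1.0) < 1e-9  # NSAF values sum to 1
```

Length normalization corrects for the fact that longer proteins yield more peptides, and hence more spectra, at equal abundance.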

  3. FACET: an object-oriented software framework for modeling complex social behavior patterns

    SciTech Connect

    Dolph, J. E.; Christiansen, J. H.; Sydelko, P. J.

    2000-06-30

    The Framework for Addressing Cooperative Extended Transactions (FACET) is a flexible, object-oriented architecture for implementing models of dynamic behavior of multiple individuals, or agents, in a simulation. These agents can be human (individuals or organizations) or animal and may exhibit any type of organized social behavior that can be logically articulated. FACET was developed by Argonne National Laboratory's (ANL) Decision and Information Sciences Division (DIS) out of the need to integrate societal processes into natural system simulations. The FACET architecture includes generic software components that provide the agents with various mechanisms for interaction, such as step sequencing and logic, resource management, conflict resolution, and preemptive event handling. FACET components provide a rich environment within which patterns of behavior can be captured in a highly expressive manner. Interactions among agents in FACET are represented by Course of Action (COA) object-based models. Each COA contains a directed graph of individual actions, which represents any known pattern of social behavior. The agents' behavior in a FACET COA, in turn, influences the natural landscape objects in a simulation (i.e., vegetation, soil, and habitat) by updating their states. The modular design of the FACET architecture provides the flexibility to create multiple and varied simulation scenarios by changing social behavior patterns, without disrupting the natural process models. This paper describes the FACET architecture and presents several examples of FACET models that have been developed to assess the effects of anthropogenic influences on the dynamics of the natural environment.

  4. Complex modeling: a strategy and software program for combining multiple information sources to solve ill posed structure and nanostructure inverse problems.

    PubMed

    Juhás, Pavol; Farrow, Christopher L; Yang, Xiaohao; Knox, Kevin R; Billinge, Simon J L

    2015-11-01

    A strategy is described for regularizing ill posed structure and nanostructure scattering inverse problems (i.e. structure solution) from complex material structures. This paper describes both the philosophy and strategy of the approach, and a software implementation, DiffPy Complex Modeling Infrastructure (DiffPy-CMI). PMID:26522405

  5. THREE-DIMENSIONAL RADIO AND X-RAY MODELING AND DATA ANALYSIS SOFTWARE: REVEALING FLARE COMPLEXITY

    SciTech Connect

    Nita, Gelu M.; Fleishman, Gregory D.; Gary, Dale E.; Kuznetsov, Alexey A.; Kontar, Eduard P.

    2015-02-01

    Many problems in solar physics require analysis of imaging data obtained in multiple wavelength domains with differing spatial resolution in a framework supplied by advanced three-dimensional (3D) physical models. To facilitate this goal, we have undertaken a major enhancement of our IDL-based simulation tools developed earlier for modeling microwave and X-ray emission. The enhanced software architecture allows the user to (1) import photospheric magnetic field maps and perform magnetic field extrapolations to generate 3D magnetic field models; (2) investigate the magnetic topology by interactively creating field lines and associated flux tubes; (3) populate the flux tubes with user-defined nonuniform thermal plasma and anisotropic, nonuniform, nonthermal electron distributions; (4) investigate the spatial and spectral properties of radio and X-ray emission calculated from the model; and (5) compare the model-derived images and spectra with observational data. The package integrates shared-object libraries containing fast gyrosynchrotron emission codes, IDL-based soft and hard X-ray codes, and potential and linear force-free field extrapolation routines. The package accepts user-defined radiation and magnetic field extrapolation plug-ins. We use this tool to analyze a relatively simple single-loop flare and use the model to constrain the magnetic 3D structure and spatial distribution of the fast electrons inside this loop. We iteratively compute multi-frequency microwave and multi-energy X-ray images from realistic magnetic flux tubes obtained from pre-flare extrapolations, and compare them with imaging data obtained by SDO, NoRH, and RHESSI. We use this event to illustrate the tool's use for the general interpretation of solar flares to address disparate problems in solar physics.

  6. Three-dimensional Radio and X-Ray Modeling and Data Analysis Software: Revealing Flare Complexity

    NASA Astrophysics Data System (ADS)

    Nita, Gelu M.; Fleishman, Gregory D.; Kuznetsov, Alexey A.; Kontar, Eduard P.; Gary, Dale E.

    2015-02-01

    Many problems in solar physics require analysis of imaging data obtained in multiple wavelength domains with differing spatial resolution in a framework supplied by advanced three-dimensional (3D) physical models. To facilitate this goal, we have undertaken a major enhancement of our IDL-based simulation tools developed earlier for modeling microwave and X-ray emission. The enhanced software architecture allows the user to (1) import photospheric magnetic field maps and perform magnetic field extrapolations to generate 3D magnetic field models; (2) investigate the magnetic topology by interactively creating field lines and associated flux tubes; (3) populate the flux tubes with user-defined nonuniform thermal plasma and anisotropic, nonuniform, nonthermal electron distributions; (4) investigate the spatial and spectral properties of radio and X-ray emission calculated from the model; and (5) compare the model-derived images and spectra with observational data. The package integrates shared-object libraries containing fast gyrosynchrotron emission codes, IDL-based soft and hard X-ray codes, and potential and linear force-free field extrapolation routines. The package accepts user-defined radiation and magnetic field extrapolation plug-ins. We use this tool to analyze a relatively simple single-loop flare and use the model to constrain the magnetic 3D structure and spatial distribution of the fast electrons inside this loop. We iteratively compute multi-frequency microwave and multi-energy X-ray images from realistic magnetic flux tubes obtained from pre-flare extrapolations, and compare them with imaging data obtained by SDO, NoRH, and RHESSI. We use this event to illustrate the tool's use for the general interpretation of solar flares to address disparate problems in solar physics.

  7. An empirical comparison of a dynamic software testability metric to static cyclomatic complexity

    NASA Technical Reports Server (NTRS)

    Voas, Jeffrey M.; Miller, Keith W.; Payne, Jeffrey E.

    1993-01-01

    This paper compares the dynamic testability prediction technique termed 'sensitivity analysis' to the static testability technique termed cyclomatic complexity. The application that we chose in this empirical study is a CASE generated version of a B-737 autoland system. For the B-737 system we analyzed, we isolated those functions that we predict are more prone to hide errors during system/reliability testing. We also analyzed the code with several other well-known static metrics. This paper compares and contrasts the results of sensitivity analysis to the results of the static metrics.
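Cyclomatic complexity, the static metric used for comparison above, is McCabe's V(G) = E - N + 2P computed over a control-flow graph with E edges, N nodes, and P connected components. A minimal sketch:

```python
def cyclomatic_complexity(edges, num_nodes, num_components=1):
    """McCabe's V(G) = E - N + 2P for a control-flow graph."""
    return len(edges) - num_nodes + 2 * num_components

# Control-flow graph of a function containing one loop whose body
# has an if/else: nodes 0..5, edges as (src, dst) pairs.
edges = [(0, 1), (1, 2), (1, 3), (2, 4), (3, 4), (4, 1), (4, 5)]
assert cyclomatic_complexity(edges, 6) == 3  # 7 - 6 + 2
```

V(G) also equals the number of decision points plus one, which is how most static analyzers compute it in practice.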

  8. Software packager user's guide

    NASA Technical Reports Server (NTRS)

    Callahan, John R.

    1995-01-01

    Software integration is a growing area of concern for many programmers and software managers because the need to build new programs quickly from existing components is greater than ever. This includes building versions of software products for multiple hardware platforms and operating systems, building programs from components written in different languages, and building systems from components that must execute on different machines in a distributed network. The goal of software integration is to make building new programs from existing components more seamless -- programmers should pay minimal attention to the underlying configuration issues involved. Libraries of reusable components and classes are important tools but only partial solutions to software development problems. Even though software components may have compatible interfaces, there may be other reasons, such as differences between execution environments, why they cannot be integrated. Often, components must be adapted or reimplemented to fit into another application because of implementation differences -- they are implemented in different programming languages, dependent on different operating system resources, or must execute on different physical machines. The software packager is a tool that allows programmers to deal with interfaces between software components and ignore complex integration details. The packager takes modular descriptions of the structure of a software system written in the package specification language and produces an integration program in the form of a makefile. If complex integration tools, such as remote procedure call stubs, are needed to integrate a set of components, the packager infers their use automatically and invokes the stub generation tools in the corresponding makefile. The programmer deals only with the components themselves and not the details of how to build the system on any given platform.
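As a rough illustration of the packager's idea, a structural description of components can be mechanically translated into makefile rules. The actual package specification language is not shown in the abstract, so the component format below is purely hypothetical:

```python
# Hypothetical component descriptions standing in for the real
# package specification language, which is not shown here.
components = [
    {"name": "parser",  "sources": ["parser.c"]},
    {"name": "backend", "sources": ["backend.c"]},
]

def make_rules(components, target="app"):
    """Emit makefile text that links all components into one target."""
    objs = " ".join(c["name"] + ".o" for c in components)
    lines = [f"{target}: {objs}", f"\tcc -o {target} {objs}", ""]
    for c in components:
        srcs = " ".join(c["sources"])
        lines += [f"{c['name']}.o: {srcs}", f"\tcc -c {srcs}", ""]
    return "\n".join(lines)

makefile = make_rules(components)
assert makefile.startswith("app: parser.o backend.o")
```

The real packager goes further, inserting stub-generation steps when components cross language or machine boundaries, but the generated artifact is the same kind of makefile.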

  9. New Rust Resistance Specificities Associated with Recombination in the Rp1 Complex in Maize

    PubMed Central

    Richter, T. E.; Pryor, T. J.; Bennetzen, J. L.; Hulbert, S. H.

    1995-01-01

    We address the question of whether genetic reassortment events, including unequal crossing over and gene conversion, at the Rp1 complex are capable of generating novel resistance specificities that were not present in the parents. Some 176 events involving genetic reassortment within the Rp1 complex were screened for novel resistance specificities with a set of 11 different rust biotypes. Most (150/176) of the events were susceptible to all tested rust biotypes, providing no evidence for new specificities. Eleven events selected as double-resistant recombinants, when screened with the 11 test biotypes, showed the combined resistance of the two parental types consistent with a simple recombination and pyramiding of the parental resistances. Nine events selected either as having partial resistance or complete susceptibility to a single biotype possessed resistance to a subset of the biotypes that the parents were resistant to, suggesting segregation of resistance genes present in the parental Rp1 complex. Four events gave rise to novel specificities being resistant to at least one rust biotype to which both parents were susceptible. All four had flanking marker exchange, demonstrating that crossing over within the Rp1 complex is associated with the appearance of new rust resistance specificities. PMID:8536984

  10. An unstructured-grid software system for solving complex aerodynamic problems

    NASA Technical Reports Server (NTRS)

    Frink, Neal T.; Pirzadeh, Shahyar; Parikh, Paresh

    1995-01-01

    A coordinated effort has been underway over the past four years to elevate unstructured-grid methodology to a mature level. The goal of this endeavor is to provide a validated capability to non-expert users for performing rapid aerodynamic analysis and design of complex configurations. The Euler component of the system is well developed, and is impacting a broad spectrum of engineering needs with capabilities such as rapid grid generation and inviscid flow analysis, inverse design, interactive boundary layers, and propulsion effects. Progress is also being made in the more tenuous Navier-Stokes component of the system. A robust grid generator is under development for constructing quality thin-layer tetrahedral grids, along with a companion Navier-Stokes flow solver. This paper presents an overview of this effort, along with a perspective on the present and future status of the methodology.

  11. Children with Specific Language Impairment: The Effect of Argument-Structure Complexity on Auditory Sentence Comprehension

    ERIC Educational Resources Information Center

    Pizzioli, Fabrizio; Schelstraete, Marie-Anne

    2011-01-01

    Children with specific language impairment (SLI) demonstrate consistent comprehension problems. The present study investigated whether these problems are driven primarily by structural complexity or length. A picture-sentence matching task was presented to 30 children: (1) 10 children with SLI, (2) 10 comprehension-matched children with typical…

  12. Rapid and Specific Method for Evaluating Streptomyces Competitive Dynamics in Complex Soil Communities

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Quantifying target microbial populations in complex communities remains a barrier to studying species interactions in soil environments. Quantitative real-time PCR (qPCR) offers a rapid and specific means to assess populations of target microorganisms. SYBR Green and TaqMan-based qPCR assays were de...

  13. 26-10 Fab-digoxin complex: affinity and specificity due to surface complementarity.

    PubMed Central

    Jeffrey, P D; Strong, R K; Sieker, L C; Chang, C Y; Campbell, R L; Petsko, G A; Haber, E; Margolies, M N; Sheriff, S

    1993-01-01

    We have determined the three-dimensional structures of the antigen-binding fragment of the anti-digoxin monoclonal antibody 26-10 in the uncomplexed state at 2.7 A resolution and as a complex with digoxin at 2.5 A resolution. Neither the antibody nor digoxin undergoes any significant conformational changes upon forming the complex. Digoxin interacts primarily with the antibody heavy chain and is oriented such that the carbohydrate groups are exposed to solvent and the lactone ring is buried in a deep pocket at the bottom of the combining site. Despite extensive interactions between antibody and antigen, no hydrogen bonds or salt links are formed between 26-10 and digoxin. Thus the 26-10-digoxin complex is unique among the known three-dimensional structures of antibody-antigen complexes in that specificity and high affinity arise primarily from shape complementarity. PMID:8234291

  14. A flexible object-based software framework for modeling complex systems with interacting natural and societal processes.

    SciTech Connect

    Christiansen, J. H.

    2000-06-15

    The Dynamic Information Architecture System (DIAS) is a flexible, extensible, object-based framework for developing and maintaining complex multidisciplinary simulations. The DIAS infrastructure makes it feasible to build and manipulate complex simulation scenarios in which many thousands of objects can interact via dozens to hundreds of concurrent dynamic processes. The flexibility and extensibility of the DIAS software infrastructure stem mainly from (1) the abstraction of object behaviors, (2) the encapsulation and formalization of model functionality, and (3) the mutability of domain object contents. DIAS simulation objects are inherently capable of highly flexible and heterogeneous spatial realizations. Geospatial graphical representation of DIAS simulation objects is addressed via the GeoViewer, an object-based GIS toolkit application developed at ANL. DIAS simulation capabilities have been extended by inclusion of societal process models generated by the Framework for Addressing Cooperative Extended Transactions (FACET), another object-based framework developed at Argonne National Laboratory. By using FACET models to implement societal behaviors of individuals and organizations within larger DIAS-based natural systems simulations, it has become possible to conveniently address a broad range of issues involving interaction and feedback among natural and societal processes. Example DIAS application areas discussed in this paper include a dynamic virtual oceanic environment, detailed simulation of clinical, physiological, and logistical aspects of health care delivery, and studies of agricultural sustainability of urban centers under environmental stress in ancient Mesopotamia.

  15. Complexity and Specificity of the Neutrophil Transcriptomes in Juvenile Idiopathic Arthritis.

    PubMed

    Hu, Zihua; Jiang, Kaiyu; Frank, Mark Barton; Chen, Yanmin; Jarvis, James N

    2016-01-01

    NIH projects such as ENCODE and Roadmap Epigenomics have revealed surprising complexity in the transcriptomes of mammalian cells. In this study, we explored transcriptional complexity in human neutrophils, cells generally regarded as nonspecific in their functions and responses. We studied distinct human disease phenotypes and found that, at the gene, gene isoform, and miRNA level, neutrophils exhibit considerable specificity in their transcriptomes. Thus, even cells whose responses are considered non-specific show tailoring of their transcriptional repertoire toward specific physiologic or pathologic contexts. We also found that miRNAs had a global impact on neutrophil transcriptome and are associated with innate immunity in juvenile idiopathic arthritis (JIA). These findings have important implications for our understanding of the link between genes, non-coding transcripts and disease phenotypes. PMID:27271962

  16. Complexity and Specificity of the Neutrophil Transcriptomes in Juvenile Idiopathic Arthritis

    PubMed Central

    Hu, Zihua; Jiang, Kaiyu; Frank, Mark Barton; Chen, Yanmin; Jarvis, James N.

    2016-01-01

    NIH projects such as ENCODE and Roadmap Epigenomics have revealed surprising complexity in the transcriptomes of mammalian cells. In this study, we explored transcriptional complexity in human neutrophils, cells generally regarded as nonspecific in their functions and responses. We studied distinct human disease phenotypes and found that, at the gene, gene isoform, and miRNA level, neutrophils exhibit considerable specificity in their transcriptomes. Thus, even cells whose responses are considered non-specific show tailoring of their transcriptional repertoire toward specific physiologic or pathologic contexts. We also found that miRNAs had a global impact on neutrophil transcriptome and are associated with innate immunity in juvenile idiopathic arthritis (JIA). These findings have important implications for our understanding of the link between genes, non-coding transcripts and disease phenotypes. PMID:27271962

  17. Understanding Interactions between Cellular Matrices and Metal Complexes: Methods To Improve Silver Nanodot-Specific Staining.

    PubMed

    Choi, Sungmoon; Yu, Junhua

    2016-08-26

    Metal complexes are frequently used for biological applications due to their special photophysical and chemical characteristics. Because of strong interactions between metals and biomacromolecules, random staining of the cytoplasm or nucleoplasm by these complexes results in a low signal-to-background ratio. In this study, we used luminescent silver nanodots as a model to investigate the major driving force for non-specific staining in cellular matrices. Even though some silver nanodot emitters exhibited excellent specific staining of nucleoli, labeling with nanodots was problematic owing to severe non-specific staining. Binding between silver and the sulfhydryl groups of proteins appeared to be the major factor driving the silver staining. Oxidizing the thiol groups in cells with hexacyanoferrate(III) dramatically weakened the silver-cell interaction and consequently significantly improved the efficiency of targeted staining. PMID:27380586

  18. Automated Software Development Workstation (ASDW)

    NASA Technical Reports Server (NTRS)

    Fridge, Ernie

    1990-01-01

    Software development is a serious bottleneck in the construction of complex automated systems. An increase of the reuse of software designs and components has been viewed as a way to relieve this bottleneck. One approach to achieving software reusability is through the development and use of software parts composition systems. A software parts composition system is a software development environment comprised of a parts description language for modeling parts and their interfaces, a catalog of existing parts, a composition editor that aids a user in the specification of a new application from existing parts, and a code generator that takes a specification and generates an implementation of a new application in a target language. The Automated Software Development Workstation (ASDW) is an expert system shell that provides the capabilities required to develop and manipulate these software parts composition systems. The ASDW is now in Beta testing at the Johnson Space Center. Future work centers on responding to user feedback for capability and usability enhancement, expanding the scope of the software lifecycle that is covered, and in providing solutions to handling very large libraries of reusable components.

  19. Using protein-binding microarrays to study transcription factor specificity: homologs, isoforms and complexes

    PubMed Central

    Andrilenas, Kellen K.; Penvose, Ashley

    2015-01-01

    Protein–DNA binding is central to specificity in gene regulation, and methods for characterizing transcription factor (TF)–DNA binding remain crucial to studies of regulatory specificity. High-throughput (HT) technologies have revolutionized our ability to characterize protein–DNA binding by significantly increasing the number of binding measurements that can be performed. Protein-binding microarrays (PBMs) are a robust and powerful HT platform for studying DNA-binding specificity of TFs. Analysis of PBM-determined DNA-binding profiles has provided new insight into the scope and mechanisms of TF binding diversity. In this review, we focus specifically on the PBM technique and discuss its application to the study of TF specificity, in particular, the binding diversity of TF homologs and multi-protein complexes. PMID:25431149

  20. A cryptobiosis-specific 19S protein complex of Artemia salina gastrulae.

    PubMed

    De Herdt, E; De Voeght, F; Clauwaert, J; Kondo, M; Slegers, H

    1981-01-15

    The postribosomal supernatant of Artemia salina cryptobiotic embryos contains a large quantity of a 19S protein complex. An amount of 3.6 mg/g of cysts is measured by immunoprecipitation with anti-(19S protein complex) antibody. The quantity of this complex decreases during further development to nauplius larvae to only 15% of the quantity present in cryptobiotic embryos. The complex was no longer detectable after 7 days of growth. The 27000-Mr protein subunit of the 19S complex is not synthesized by mRNA isolated from cryptobiotic embryos. The cryptobiosis-specific complex has Mr 573000 and 610000 as calculated from light-scattering and sedimentation-diffusion measurements respectively. The 19S homocomplex contains 20-23 27000-Mr proteins and has no function in the translation of homologous mRNA. From hydrodynamic data a hydration of 1.25 g of water/g of protein is calculated. The abundant presence of the 19S protein complex in cryptobiotic embryos and the absence of synthesis during development to nauplius larvae indicate a functional role during the cryptobiotic process in early embryogenesis. A role in maintaining the water content of the cytoplasm above a critical threshold during desiccation is suggested. PMID:7305995

  1. Characterization of a novel meiosis-specific protein within the central element of the synaptonemal complex.

    PubMed

    Hamer, Geert; Gell, Katarina; Kouznetsova, Anna; Novak, Ivana; Benavente, Ricardo; Höög, Christer

    2006-10-01

    During the first meiotic prophase, alignment and synapsis of the homologous chromosomes are mediated by the synaptonemal complex. Incorrect assembly of this complex results in cell death, impaired meiotic recombination and formation of aneuploid germ cells. We have identified a novel mouse meiosis-specific protein, TEX12, and shown it to be a component of the central element structure of the synaptonemal complex at synapsed homologous chromosomes. Only two other central element proteins, SYCE1 and SYCE2, have been identified to date and, using several mouse knockout models, we show that these proteins and TEX12 specifically depend on the synaptonemal transverse filament protein SYCP1 for localization to the meiotic chromosomes. Additionally, we show that TEX12 exactly co-localized with SYCE2, having the same, often punctate, localization pattern. SYCE1, on the other hand, co-localized with SYCP1 and these proteins displayed the same more continuous expression pattern. These co-localization studies were confirmed by co-immunoprecipitation experiments that showed that TEX12 specifically co-precipitated with SYCE2. Our results suggest a molecular network within the central elements, in which TEX12 and SYCE2 form a complex that interacts with SYCE1. SYCE1 interacts more directly with SYCP1 and could thus anchor the central element proteins to the transverse filaments. PMID:16968740

  2. Characterization of the Mammalian CORVET and HOPS Complexes and Their Modular Restructuring for Endosome Specificity.

    PubMed

    van der Kant, Rik; Jonker, Caspar T H; Wijdeven, Ruud H; Bakker, Jeroen; Janssen, Lennert; Klumperman, Judith; Neefjes, Jacques

    2015-12-18

    Trafficking of cargo through the endosomal system depends on endosomal fusion events mediated by SNARE proteins, Rab-GTPases, and multisubunit tethering complexes. The CORVET and HOPS tethering complexes, respectively, regulate early and late endosomal tethering and have been characterized in detail in yeast where their sequential membrane targeting and assembly is well understood. Mammalian CORVET and HOPS subunits significantly differ from their yeast homologues, and novel proteins with high homology to CORVET/HOPS subunits have evolved. However, an analysis of the molecular interactions between these subunits in mammals is lacking. Here, we provide a detailed analysis of interactions within the mammalian CORVET and HOPS as well as an additional endosomal-targeting complex (VIPAS39-VPS33B) that does not exist in yeast. We show that core interactions within CORVET and HOPS are largely conserved but that the membrane-targeting module in HOPS has significantly changed to accommodate binding to mammalian-specific RAB7 interacting lysosomal protein (RILP). Arthrogryposis-renal dysfunction-cholestasis (ARC) syndrome-associated mutations in VPS33B selectively disrupt recruitment to late endosomes by RILP or binding to its partner VIPAS39. Within the shared core of CORVET/HOPS, we find that VPS11 acts as a molecular switch that binds either CORVET-specific TGFBRAP1 or HOPS-specific VPS39/RILP thereby allowing selective targeting of these tethering complexes to early or late endosomes to time fusion events in the endo/lysosomal pathway. PMID:26463206

  3. Use of Kidspiration[C] Software to Enhance the Reading Comprehension of Story Grammar Components for Elementary-Age Students with Specific Learning Disabilities

    ERIC Educational Resources Information Center

    Wade, Erin; Boon, Richard T.; Spencer, Vicky G.

    2010-01-01

    The aim of this research brief was to explore the efficacy of story mapping, with the integration of Kidspiration[C] software, to enhance the reading comprehension skills of story grammar components for elementary-age students. Three students served as the participants, two in third grade and one in fourth, with specific learning disabilities…

  4. Effects of Text-to-Speech Software on the Reading Rate and Comprehension Skills of High School Students with Specific Learning Disabilities

    ERIC Educational Resources Information Center

    Moorman, Amanda; Boon, Richard T.; Keller-Bell, Yolanda; Stagliano, Christina; Jeffs, Tara

    2010-01-01

    The purpose of this study was to examine the effects of a text-to-speech software program known as "Read Please" on the reading rate and reading comprehension accuracy of two high school students with specific learning disabilities (SLD) in reading. A single-subject A-B-A-B "withdrawal" research design (Alberto & Troutman, 2009) was used to…

  5. Spectroscopic investigation on the interaction of ruthenium complexes with tumor specific lectin, jacalin.

    PubMed

    Ayaz Ahmed, Khan Behlol; Reshma, Elamvazhuthi; Mariappan, Mariappan; Anbazhagan, Veerappan

    2015-02-25

    Several ruthenium complexes are regarded as anticancer agents and considered an alternative to the widely used platinum complexes. Owing to the preferential interaction of jacalin with tumor-associated T-antigen, we report the interaction of jacalin with four ruthenium complexes, namely tris(1,10-phenanthroline)ruthenium(II)chloride, bis(1,10-phenanthroline)(N-[1,10]phenanthrolin-5-yl-pyrenylmethanimine)ruthenium(II)chloride, bis(1,10-phenanthroline)(dipyrido[3,2-a:2',3'-c]-phenazine)ruthenium(II)chloride, and bis(1,10-phenanthroline)(11-(9-acridinyl)dipyrido[3,2-a:2',3'-c]phenazine)ruthenium(II) chloride. Fluorescence spectroscopic analysis revealed that the ruthenium complexes strongly quenched the intrinsic fluorescence of jacalin through a static quenching procedure, and a non-radiative energy transfer occurred within the molecules. Association constants obtained for the interaction of different ruthenium complexes with jacalin are in the order of 10^5 M^-1, which is in the same range as those obtained for the interaction of lectin with carbohydrate and hydrophobic ligand. Each subunit of the tetrameric jacalin binds one ruthenium complex, and the stoichiometry is found to be unaffected by the presence of the specific sugar, galactose. In addition, agglutination activity of jacalin is largely unaffected by the presence of the ruthenium complexes, indicating that the binding sites for the carbohydrate and the ruthenium complexes are different. These results suggest that the development of a lectin-ruthenium complex conjugate would be feasible for targeting malignant cells in chemotherapeutics. PMID:25306128
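    An association constant on the order of 10^5 M^-1 can be made concrete with a simple 1:1 binding isotherm per jacalin subunit. The sketch below is illustrative only; the Ka value is the order of magnitude quoted in the abstract and the ligand concentrations are arbitrary, not the study's conditions.

```python
# Illustrative only: fraction of jacalin subunits occupied at a given free
# ligand concentration, assuming simple 1:1 binding per subunit with
# Ka ~ 1e5 M^-1 (the order of magnitude reported in the abstract).
def fraction_bound(ka_per_molar: float, ligand_molar: float) -> float:
    """Langmuir isotherm: f = Ka*[L] / (1 + Ka*[L])."""
    return ka_per_molar * ligand_molar / (1.0 + ka_per_molar * ligand_molar)

KA = 1e5  # M^-1, assumed order of magnitude
for conc in (1e-6, 1e-5, 1e-4):  # 1 uM, 10 uM, 100 uM free ruthenium complex
    print(f"[L] = {conc:.0e} M -> fraction bound = {fraction_bound(KA, conc):.2f}")
```

    At [L] = 1/Ka (here 10 uM) the sites are half-occupied, which is why binding with Ka ~ 10^5 M^-1 saturates in the low-micromolar range.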

  6. Spectroscopic investigation on the interaction of ruthenium complexes with tumor specific lectin, jacalin

    NASA Astrophysics Data System (ADS)

    Ayaz Ahmed, Khan Behlol; Reshma, Elamvazhuthi; Mariappan, Mariappan; Anbazhagan, Veerappan

    2015-02-01

    Several ruthenium complexes are regarded as anticancer agents and considered an alternative to the widely used platinum complexes. Owing to the preferential interaction of jacalin with tumor-associated T-antigen, we report the interaction of jacalin with four ruthenium complexes, namely tris(1,10-phenanthroline)ruthenium(II)chloride, bis(1,10-phenanthroline)(N-[1,10]phenanthrolin-5-yl-pyrenylmethanimine)ruthenium(II)chloride, bis(1,10-phenanthroline)(dipyrido[3,2-a:2′,3′-c]-phenazine)ruthenium(II)chloride, and bis(1,10-phenanthroline)(11-(9-acridinyl)dipyrido[3,2-a:2′,3′-c]phenazine)ruthenium(II) chloride. Fluorescence spectroscopic analysis revealed that the ruthenium complexes strongly quenched the intrinsic fluorescence of jacalin through a static quenching procedure, and a non-radiative energy transfer occurred within the molecules. Association constants obtained for the interaction of different ruthenium complexes with jacalin are in the order of 10^5 M^-1, which is in the same range as those obtained for the interaction of lectin with carbohydrate and hydrophobic ligand. Each subunit of the tetrameric jacalin binds one ruthenium complex, and the stoichiometry is found to be unaffected by the presence of the specific sugar, galactose. In addition, agglutination activity of jacalin is largely unaffected by the presence of the ruthenium complexes, indicating that the binding sites for the carbohydrate and the ruthenium complexes are different. These results suggest that the development of a lectin-ruthenium complex conjugate would be feasible for targeting malignant cells in chemotherapeutics.

  7. Dissociation free-energy profiles of specific and nonspecific DNA-protein complexes.

    PubMed

    Yonetani, Yoshiteru; Kono, Hidetoshi

    2013-06-27

    DNA-binding proteins recognize DNA sequences with at least two different binding modes: specific and nonspecific. Experimental structures of such complexes provide us a static view of the bindings. However, it is difficult to reveal further mechanisms of their target-site search and recognition only from static information because the transition process between the bound and unbound states is not clarified by static information. What is the difference between specific and nonspecific bindings? Here we performed adaptive biasing force molecular dynamics simulations with the specific and nonspecific structures of DNA-Lac repressor complexes to investigate the dissociation process. The resultant free-energy profiles showed that the specific complex has a sharp, deep well consistent with tight binding, whereas the nonspecific complex has a broad, shallow well consistent with loose binding. The difference in the well depth, ~5 kcal/mol, was in fair agreement with the experimentally obtained value and was found to mainly come from the protein conformational difference, particularly in the C-terminal tail. Also, the free-energy profiles were found to be correlated with changes in the number of protein-DNA contacts and that of surface water molecules. The derived protein spatial distributions around the DNA indicate that any large dissociation occurs rarely, regardless of the specific and nonspecific sites. Comparison of the free-energy barrier for sliding [~8.7 kcal/mol; Furini J. Phys. Chem. B 2010, 114, 2238] and that for dissociation (at least ~16 kcal/mol) calculated in this study suggests that sliding is much preferred to dissociation. PMID:23713479
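    The reported ~5 kcal/mol well-depth difference between specific and nonspecific binding translates into an affinity ratio via the Boltzmann factor exp(ΔΔG/RT). The arithmetic below assumes a temperature of 300 K; the free-energy gap is the figure quoted in the abstract.

```python
import math

R = 1.9872e-3  # kcal/(mol*K), gas constant
T = 300.0      # K, assumed near-physiological temperature

delta_delta_g = 5.0  # kcal/mol, specific vs. nonspecific well-depth gap (abstract)
ratio = math.exp(delta_delta_g / (R * T))
print(f"Specific/nonspecific affinity ratio ~ {ratio:.0f}")  # a few thousand-fold
```

    A modest 5 kcal/mol gap thus corresponds to roughly a thousand-fold preference for the specific site, consistent with tight versus loose binding.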

  8. Tissue-specific regulatory circuits reveal variable modular perturbations across complex diseases.

    PubMed

    Marbach, Daniel; Lamparter, David; Quon, Gerald; Kellis, Manolis; Kutalik, Zoltán; Bergmann, Sven

    2016-04-01

    Mapping perturbed molecular circuits that underlie complex diseases remains a great challenge. We developed a comprehensive resource of 394 cell type- and tissue-specific gene regulatory networks for human, each specifying the genome-wide connectivity among transcription factors, enhancers, promoters and genes. Integration with 37 genome-wide association studies (GWASs) showed that disease-associated genetic variants-including variants that do not reach genome-wide significance-often perturb regulatory modules that are highly specific to disease-relevant cell types or tissues. Our resource opens the door to systematic analysis of regulatory programs across hundreds of human cell types and tissues (http://regulatorycircuits.org). PMID:26950747

  9. Immunization with Immune Complexes Modulates the Fine Specificity of Antibody Responses to a Flavivirus Antigen

    PubMed Central

    Tsouchnikas, Georgios; Zlatkovic, Juergen; Jarmer, Johanna; Strauß, Judith; Vratskikh, Oksana; Kundi, Michael; Stiasny, Karin

    2015-01-01

    The antibody response to proteins may be modulated by the presence of preexisting antigen-specific antibodies and the formation of immune complexes (ICs). Effects such as a general increase or decrease of the response as well as epitope-specific phenomena have been described. In this study, we investigated influences of IC immunization on the fine specificity of antibody responses in a structurally well-defined system, using the envelope (E) protein of tick-borne encephalitis (TBE) virus as an immunogen. TBE virus occurs in Europe and Asia and—together with the yellow fever, dengue, West Nile, and Japanese encephalitis viruses—represents one of the major human-pathogenic flaviviruses. Mice were immunized with a dimeric soluble form of E (sE) alone or in complex with monoclonal antibodies specific for each of the three domains of E, and the antibody response induced by these ICs was compared to that seen after immunization with sE alone. Immunoassays using recombinant domains and domain combinations of TBE virus sE as well as the distantly related West Nile virus sE allowed the dissection and quantification of antibody subsets present in postimmunization sera, thus generating fine-specificity patterns of the polyclonal responses. There were substantially different responses with two of the ICs, and the differences could be mechanistically related to (i) epitope shielding and (ii) antibody-mediated structural changes leading to dissociation of the sE dimer. The phenomena described may also be relevant for polyclonal responses upon secondary infections and/or booster immunizations and may affect antibody responses in an individual-specific way. IMPORTANCE Infections with flaviviruses such as yellow fever, dengue, Japanese encephalitis, West Nile, and tick-borne encephalitis (TBE) viruses pose substantial public health problems in different parts of the world. Antibodies to viral envelope protein E induced by natural infection or vaccination were shown to

  10. Modern Tools for Modern Software

    SciTech Connect

    Kumfert, G; Epperly, T

    2001-10-31

    This is a proposal for a new software configure/build tool for building, maintaining, deploying, and installing software. At its completion, this new tool will replace current standard tool suites such as ''autoconf'', ''automake'', ''libtool'', and the de facto standard build tool, ''make''. This ambitious project is born out of the realization that as scientific software has grown in size and complexity over the years, the difficulty of configuring and building software has increased as well. For high performance scientific software, additional complexities often arise from the need for portability to multiple platforms (including many one-of-a-kind platforms), multilanguage implementations, use of third party libraries, and a need to adapt algorithms to the specific features of the hardware. Development of scientific software is being hampered by the quality of configuration and build tools commonly available. Inordinate amounts of time and expertise are required to develop and maintain the configure and build system for a moderately complex project. Better build and configure tools will increase developer productivity. This proposal is a first step in a process of shoring up the foundation upon which DOE software is created and used.

  11. Energy conservation and analysis and evaluation. [specifically at Slidell Computer Complex

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The survey assembled and made recommendations directed at conserving utilities and reducing the use of energy at the Slidell Computer Complex. Specific items included were: (1) scheduling and controlling the use of gas and electricity, (2) building modifications to reduce energy, (3) replacement of old, inefficient equipment, (4) modifications to control systems, (5) evaluations of economizer cycles in HVAC systems, and (6) corrective settings for thermostats, ductstats, and other temperature and pressure control devices.

  12. Characterising the atypical 5'-CG DNA sequence specificity of 9-aminoacridine carboxamide Pt complexes.

    PubMed

    Kava, Hieronimus W; Galea, Anne M; Md Jamil, Farhana; Feng, Yue; Murray, Vincent

    2014-08-01

    In this study, the DNA sequence specificity of four DNA-targeted 9-aminoacridine carboxamide Pt complexes was compared with cisplatin, using two specially constructed plasmid templates. One plasmid contained 5'-CG and 5'-GA insert sequences while the other plasmid contained a G-rich transferrin receptor gene promoter insert sequence. The damage profiles of each compound on the different DNA templates were quantified via a polymerase stop assay with fluorescently labelled primers and capillary electrophoresis. With the plasmid that contained 5'-CG and 5'-GA dinucleotides, the four 9-aminoacridine carboxamide Pt complexes produced distinctly different damage profiles as compared with cisplatin. These 9-aminoacridine complexes had greatly increased levels of DNA damage at CG and GA dinucleotides as compared with cisplatin. It was shown that the presence of a CG or GA dinucleotide was sufficient to reveal the altered DNA sequence selectivity of the 9-aminoacridine carboxamide Pt analogues. The DNA sequence specificity of the Pt complexes was also found to be similarly altered utilising the transferrin receptor DNA sequence. PMID:24827388

  13. Software design and documentation language

    NASA Technical Reports Server (NTRS)

    Kleine, H.

    1977-01-01

    A communications medium to support the design and documentation of complex software applications is studied. The medium provides the following: (1) a processor which can convert design specifications into an intelligible, informative machine reproducible document; (2) a design and documentation language with forms and syntax that are simple, unrestrictive, and communicative; and (3) methodology for effective use of the language and processor.

  14. Software design and documentation language

    NASA Technical Reports Server (NTRS)

    Kleine, H.

    1980-01-01

    Language supports design and documentation of complex software. Included are: design and documentation language for expressing design concepts; processor that produces intelligible documentation based on design specifications; and methodology for using language and processor to create well-structured top-down programs and documentation. Processor is written in SIMSCRIPT II.5 programming language for use on UNIVAC, IBM, and CDC machines.

  15. Nuquantus: Machine learning software for the characterization and quantification of cell nuclei in complex immunofluorescent tissue images

    PubMed Central

    Gross, Polina; Honnorat, Nicolas; Varol, Erdem; Wallner, Markus; Trappanese, Danielle M.; Sharp, Thomas E.; Starosta, Timothy; Duran, Jason M.; Koller, Sarah; Davatzikos, Christos; Houser, Steven R.

    2016-01-01

    Determination of fundamental mechanisms of disease often hinges on histopathology visualization and quantitative image analysis. Currently, the analysis of multi-channel fluorescence tissue images is primarily achieved by manual measurements of tissue cellular content and sub-cellular compartments. Since the current manual methodology for image analysis is a tedious and subjective approach, there is clearly a need for an automated analytical technique to process large-scale image datasets. Here, we introduce Nuquantus (Nuclei quantification utility software) - a novel machine learning-based analytical method, which identifies, quantifies and classifies nuclei based on cells of interest in composite fluorescent tissue images, in which cell borders are not visible. Nuquantus is an adaptive framework that learns the morphological attributes of intact tissue in the presence of anatomical variability and pathological processes. Nuquantus allowed us to robustly perform quantitative image analysis on remodeling cardiac tissue after myocardial infarction. Nuquantus reliably classifies cardiomyocyte versus non-cardiomyocyte nuclei and detects cell proliferation, as well as cell death in different cell classes. Broadly, Nuquantus provides innovative computerized methodology to analyze complex tissue images that significantly facilitates image analysis and minimizes human bias. PMID:27005843

  16. Nuquantus: Machine learning software for the characterization and quantification of cell nuclei in complex immunofluorescent tissue images

    NASA Astrophysics Data System (ADS)

    Gross, Polina; Honnorat, Nicolas; Varol, Erdem; Wallner, Markus; Trappanese, Danielle M.; Sharp, Thomas E.; Starosta, Timothy; Duran, Jason M.; Koller, Sarah; Davatzikos, Christos; Houser, Steven R.

    2016-03-01

    Determination of fundamental mechanisms of disease often hinges on histopathology visualization and quantitative image analysis. Currently, the analysis of multi-channel fluorescence tissue images is primarily achieved by manual measurements of tissue cellular content and sub-cellular compartments. Since the current manual methodology for image analysis is a tedious and subjective approach, there is clearly a need for an automated analytical technique to process large-scale image datasets. Here, we introduce Nuquantus (Nuclei quantification utility software) - a novel machine learning-based analytical method, which identifies, quantifies and classifies nuclei based on cells of interest in composite fluorescent tissue images, in which cell borders are not visible. Nuquantus is an adaptive framework that learns the morphological attributes of intact tissue in the presence of anatomical variability and pathological processes. Nuquantus allowed us to robustly perform quantitative image analysis on remodeling cardiac tissue after myocardial infarction. Nuquantus reliably classifies cardiomyocyte versus non-cardiomyocyte nuclei and detects cell proliferation, as well as cell death in different cell classes. Broadly, Nuquantus provides innovative computerized methodology to analyze complex tissue images that significantly facilitates image analysis and minimizes human bias.

  17. Nuquantus: Machine learning software for the characterization and quantification of cell nuclei in complex immunofluorescent tissue images.

    PubMed

    Gross, Polina; Honnorat, Nicolas; Varol, Erdem; Wallner, Markus; Trappanese, Danielle M; Sharp, Thomas E; Starosta, Timothy; Duran, Jason M; Koller, Sarah; Davatzikos, Christos; Houser, Steven R

    2016-01-01

    Determination of fundamental mechanisms of disease often hinges on histopathology visualization and quantitative image analysis. Currently, the analysis of multi-channel fluorescence tissue images is primarily achieved by manual measurements of tissue cellular content and sub-cellular compartments. Since the current manual methodology for image analysis is a tedious and subjective approach, there is clearly a need for an automated analytical technique to process large-scale image datasets. Here, we introduce Nuquantus (Nuclei quantification utility software) - a novel machine learning-based analytical method, which identifies, quantifies and classifies nuclei based on cells of interest in composite fluorescent tissue images, in which cell borders are not visible. Nuquantus is an adaptive framework that learns the morphological attributes of intact tissue in the presence of anatomical variability and pathological processes. Nuquantus allowed us to robustly perform quantitative image analysis on remodeling cardiac tissue after myocardial infarction. Nuquantus reliably classifies cardiomyocyte versus non-cardiomyocyte nuclei and detects cell proliferation, as well as cell death in different cell classes. Broadly, Nuquantus provides innovative computerized methodology to analyze complex tissue images that significantly facilitates image analysis and minimizes human bias. PMID:27005843

  18. Software attribute visualization for high integrity software

    SciTech Connect

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.
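    The core idea of monitoring an executing program against prespecified requirements constraints can be sketched with a tiny runtime monitor. This is not the SAGE language or the SAVAnT tool itself; the `Monitor` class and the bounded-queue constraint are invented stand-ins for how such constraint checking might look.

```python
# Minimal sketch of runtime requirements monitoring (hypothetical; SAGE's
# actual constraint language and SAVAnT's visualization are not reproduced).
from typing import Callable

class Monitor:
    def __init__(self) -> None:
        self.constraints: list[tuple[str, Callable[[dict], bool]]] = []
        self.violations: list[str] = []

    def require(self, name: str, predicate: Callable[[dict], bool]) -> None:
        """Register a named constraint as a predicate over program state."""
        self.constraints.append((name, predicate))

    def check(self, state: dict) -> None:
        """Evaluate every constraint against a snapshot of the state."""
        for name, predicate in self.constraints:
            if not predicate(state):
                self.violations.append(f"{name} violated at state {state}")

monitor = Monitor()
monitor.require("queue bounded", lambda s: s["queue_len"] <= 10)

# Simulated execution trace: the monitor observes the state at each step.
for step, qlen in enumerate([3, 7, 12, 5]):
    monitor.check({"step": step, "queue_len": qlen})

print(monitor.violations)  # one violation, at queue_len = 12
```

    A real tool would attach such checks to an instrumented executable and visualize violations; the design point is that constraints are declared separately from the program they monitor.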

  19. A Hybrid Approach to Finding Relevant Social Media Content for Complex Domain Specific Information Needs

    PubMed Central

    Cameron, Delroy; Sheth, Amit P.; Jaykumar, Nishita; Thirunarayan, Krishnaprasad; Anand, Gaurish; Smith, Gary A.

    2015-01-01

    While contemporary semantic search systems offer to improve classical keyword-based search, they are not always adequate for complex domain specific information needs. The domain of prescription drug abuse, for example, requires knowledge of both ontological concepts and “intelligible constructs” not typically modeled in ontologies. These intelligible constructs convey essential information that include notions of intensity, frequency, interval, dosage and sentiments, which could be important to the holistic needs of the information seeker. In this paper, we present a hybrid approach to domain specific information retrieval that integrates ontology-driven query interpretation with synonym-based query expansion and domain specific rules, to facilitate search in social media on prescription drug abuse. Our framework is based on a context-free grammar (CFG) that defines the query language of constructs interpretable by the search system. The grammar provides two levels of semantic interpretation: 1) a top-level CFG that facilitates retrieval of diverse textual patterns, which belong to broad templates and 2) a low-level CFG that enables interpretation of specific expressions belonging to such textual patterns. These low-level expressions occur as concepts from four different categories of data: 1) ontological concepts, 2) concepts in lexicons (such as emotions and sentiments), 3) concepts in lexicons with only partial ontology representation, called lexico-ontology concepts (such as side effects and routes of administration (ROA)), and 4) domain specific expressions (such as date, time, interval, frequency and dosage) derived solely through rules. Our approach is embodied in a novel Semantic Web platform called PREDOSE, which provides search support for complex domain specific information needs in prescription drug abuse epidemiology. When applied to a corpus of over 1 million drug abuse-related web forum posts, our search framework proved effective in retrieving
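    The two-level interpretation described above, a top-level grammar recognizing broad textual templates and low-level rules classifying individual expressions, can be sketched as follows. Everything here is a hypothetical placeholder: the lexicons, the dosage rule, and the comma-split "template" are invented for illustration and are not PREDOSE's actual grammar or data.

```python
import re

# Hypothetical sketch of two-level query interpretation: a top-level template
# splits the query into parts, and low-level rules assign each part to a
# concept category (ontology/lexicon term or rule-derived expression).
DRUG_LEXICON = {"loperamide", "buprenorphine"}   # stand-in ontology concepts
ROA_LEXICON = {"oral", "intranasal"}             # stand-in lexico-ontology concepts
DOSAGE_RULE = re.compile(r"^\d+(\.\d+)?\s*mg$")  # stand-in rule-derived pattern

def classify(token: str) -> str:
    """Low-level interpretation: assign a token to a concept category."""
    if token in DRUG_LEXICON:
        return "drug"
    if token in ROA_LEXICON:
        return "route"
    if DOSAGE_RULE.match(token):
        return "dosage"
    return "other"

def interpret(query: str) -> list[tuple[str, str]]:
    """Top-level interpretation: a comma-split stands in for a CFG template."""
    return [(part.strip(), classify(part.strip())) for part in query.split(",")]

print(interpret("loperamide, 16 mg, oral"))
```

    The separation matters for search: the top level retrieves diverse posts matching a broad pattern, while the low level makes each matched expression queryable by category.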

  20. The Impact of Specific and Complex Trauma on the Mental Health of Homeless Youth.

    PubMed

    Wong, Carolyn F; Clark, Leslie F; Marlotte, Lauren

    2016-03-01

    This study investigates the relative impact of trauma experiences that occurred prior to and since becoming homeless on depressive symptoms, posttraumatic stress disorder (PTSD) symptoms, and self-injurious behaviors among a sample of homeless youth (N = 389). Youth (aged 13 to 25) who had been homeless or precariously housed in the past year completed a survey about housing history, experiences of violence and victimization, mental health, and service utilization. In addition to examining the impact associated with specific trauma types, we also considered the effect of "early-on" poly-victimization (i.e., cumulative number of reported traumas prior to homelessness) and the influence of a compound sexual trauma variable created to represent earlier complex trauma. This created variable takes values of no reported trauma, single trauma, multiple non-sexual traumas, or multiple traumas that co-occurred with sexual abuse. Multivariate analyses revealed that specific traumatic experiences prior to homelessness, including sexual abuse, emotional abuse/neglect, and adverse home environment, predicted greater mental health symptoms. Poly-victimization did not add to the prediction of mental health symptoms after the inclusion of specific traumas. Results with early compound sexual trauma revealed significant differences between lower-order trauma exposures and multiple-trauma exposures. Specifically, experience of multiple traumas that co-occurred with sexual trauma was significantly more detrimental in predicting PTSD symptoms than multiple traumas of non-sexual nature. Findings support the utility of an alternate/novel conceptualization of complex trauma, and support the need to carefully evaluate complex traumatic experiences that occurred prior to homelessness, which can impact the design and implementation of mental health care and services for homeless youth. PMID:25392379

  1. Specific and Rapid Detection of Mycobacterium tuberculosis Complex in Clinical Samples by Polymerase Chain Reaction

    PubMed Central

    Singh, Anamika; Kashyap, Vijendra Kumar

    2012-01-01

    Background. Tuberculosis, a global health problem and highly prevalent in India, has always been a serious problem with respect to definitive diagnosis. Polymerase chain reaction (PCR) techniques are now widely used for early detection and species differentiation of mycobacteria, but mostly with their own limitations. We aim to detect and differentiate Mycobacterium tuberculosis (Mtb) infections by choosing appropriate target sequences, ideally present in all mycobacterial species (MTB complex) and absent in others. Methods. Amplification of three target sequences from unrelated genes, namely, hsp 65 (165 bp), dnaJ (365 bp), and insertion element IS 6110 (541 bp) by PCR was carried out in clinical samples from suspected cases of tuberculosis/mycobacterioses and healthy controls. Results. The sensitivity of this method ranged from 73.33% to 84.61%, and the specificity was 80%. The PCR method was significantly better (P = 0.03 and P = 0.009) than both smear and culture methods. Conclusion. Our trimarker-based PCR method could specifically distinguish M. tuberculosis and MTB complex infection from major pathogenic NTM and nonpathogenic mycobacteria. This method, by well distinguishing between MTB complex and NTM, presented a fast and accurate method to detect and diagnose mycobacterial infections more efficiently and could thereby help in better patient management particularly considering the increase in mycobacterial infections due to emergence of NTM over the past decades. PMID:23093958
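    Sensitivity and specificity figures like those reported above come directly from confusion-matrix counts. The counts in this sketch are hypothetical, chosen only so the arithmetic reproduces two of the abstract's percentages; they are not the study's raw data.

```python
# Hypothetical counts to illustrate how diagnostic sensitivity and
# specificity are computed; not the study's actual sample sizes.
def sensitivity(tp: int, fn: int) -> float:
    """True-positive rate among confirmed cases: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True-negative rate among controls: TN / (TN + FP)."""
    return tn / (tn + fp)

# e.g. 22 of 30 confirmed samples detected, 16 of 20 controls negative
print(f"sensitivity = {sensitivity(22, 8):.2%}")   # 73.33%
print(f"specificity = {specificity(16, 4):.2%}")   # 80.00%
```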

  2. Specific Nucleotide Binding and Rebinding to Individual DNA Polymerase Complexes Captured on a Nanopore

    PubMed Central

    Hurt, Nicholas; Wang, Hongyun; Akeson, Mark; Lieberman, Kate R.

    2009-01-01

    Nanoscale pores are a tool for single molecule analysis of DNA or RNA processing enzymes. Monitoring catalytic activity in real time using this technique requires that these enzymes retain function while held atop a nanopore in an applied electric field. Using an α-hemolysin nanopore, we measured the dwell time for complexes of DNA with the Klenow fragment of Escherichia coli DNA polymerase I (KF) as a function of the concentration of deoxynucleoside triphosphate (dNTP) substrate. We analyzed these dwell time measurements in the framework of a two-state model for captured complexes (DNA-KF binary and DNA-KF-dNTP ternary states). Average nanopore dwell time increased without saturating as a function of correct dNTP concentration across four orders of magnitude. This arises from two factors that are proportional to dNTP concentration: 1) The fraction of complexes that are in the ternary state when initially captured predominantly affects dwell time at low dNTP concentrations; 2) The rate of binding and rebinding of dNTP to captured complexes affects dwell time at higher dNTP concentrations. Thus there are two regimes that display a linear relationship between average dwell time and dNTP concentration. The transition from one linear regime to the other occurs near the equilibrium dissociation constant (Kd) for dNTP binding to KF-DNA complexes in solution. We conclude from the combination of titration experiments and modeling that DNA-KF complexes captured atop the nanopore retain iterative, sequence-specific dNTP binding, as required for catalysis and fidelity in DNA synthesis. PMID:19275265
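    The two factors named in the abstract, the ternary fraction at capture and dNTP rebinding atop the pore, can be combined into a toy dwell-time model that shows two linear regimes with a transition near Kd. All parameter values below are invented for illustration; this is not the paper's fitted model.

```python
# Toy two-state dwell-time model (illustrative; parameters are hypothetical,
# not fitted values from the study). Mean dwell time grows with [dNTP]
# through (1) the fraction of complexes captured in the ternary state and
# (2) rebinding of dNTP to the captured complex.
KD = 5e-6        # M, assumed dissociation constant for dNTP binding
T_BINARY = 1.0   # s, mean dwell of a binary complex (hypothetical)
T_TERNARY = 4.0  # s, extra dwell for a ternary-state capture (hypothetical)
K_REBIND = 2e5   # s per molar dNTP, rebinding contribution (hypothetical)

def mean_dwell(dntp_molar: float) -> float:
    ternary_fraction = dntp_molar / (KD + dntp_molar)  # saturates near Kd
    return T_BINARY + ternary_fraction * T_TERNARY + K_REBIND * dntp_molar

for c in (1e-7, 1e-6, 1e-5, 1e-4):
    print(f"[dNTP] = {c:.0e} M -> mean dwell ~ {mean_dwell(c):.2f} s")
```

    Well below Kd both terms are proportional to [dNTP]; well above Kd the ternary fraction saturates and only the rebinding term keeps growing, giving the second linear regime described in the abstract.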

  3. A Rapid and Specific Method for the Detection of Indole in Complex Biological Samples

    PubMed Central

    Chappell, Cynthia; Gonzales, Christopher; Okhuysen, Pablo

    2015-01-01

    Indole, a bacterial product of tryptophan degradation, has a variety of important applications in the pharmaceutical industry and is a biomarker in biological and clinical specimens. Yet, specific assays to quantitate indole are complex and require expensive equipment and a high level of training. Thus, indole in biological samples is often estimated using the simple and rapid Kovács assay, which nonspecifically detects a variety of commonly occurring indole analogs. We demonstrate here a sensitive, specific, and rapid method for measuring indole in complex biological samples using a specific reaction between unsubstituted indole and hydroxylamine. We compared the hydroxylamine-based indole assay (HIA) to the Kovács assay and confirmed that the two assays are capable of detecting microgram amounts of indole. However, the HIA is specific to indole and does not detect other naturally occurring indole analogs. We further demonstrated the utility of the HIA in measuring indole levels in clinically relevant biological materials, such as fecal samples and bacterial cultures. Mean and median fecal indole concentrations from 53 healthy adults were 2.59 mM and 2.73 mM, respectively, but varied widely (0.30 mM to 6.64 mM) among individuals. We also determined that enterotoxigenic Escherichia coli strain H10407 produces 3.3 ± 0.22 mM indole during a 24-h period in the presence of 5 mM tryptophan. The sensitive and specific HIA should be of value in a variety of settings, such as the evaluation of various clinical samples and the study of indole-producing bacterial species in the gut microbiota. PMID:26386049

  4. Confident Assignment of Site-Specific Glycosylation in Complex Glycoproteins in a Single Step

    PubMed Central

    2015-01-01

    A glycoprotein may contain several sites of glycosylation, each of which is heterogeneous. As a consequence of glycoform diversity and signal suppression from nonglycosylated peptides that ionize more efficiently, typical reversed-phase LC–MS and bottom-up proteomics database searching workflows do not perform well for identification of site-specific glycosylation for complex glycoproteins. We present an LC–MS system for enrichment, separation, and analysis of glycopeptides from complex glycoproteins (>4 N-glycosylation sequons) in a single step. This system uses an online HILIC enrichment trap prior to reversed-phase C18-MS analysis. We demonstrated the effectiveness of the system using a set of glycoproteins including human transferrin (2 sequons), human alpha-1-acid glycoprotein (5 sequons), and influenza A virus hemagglutinin (9 sequons). The online enrichment renders glycopeptides the most abundant ions detected, thereby facilitating the generation of high-quality data-dependent tandem mass spectra. The tandem mass spectra exhibited product ions from both glycan and peptide backbone dissociation for a majority of the glycopeptides tested using collisionally activated dissociation that served to confidently assign site-specific glycosylation. We demonstrated the value of our system to define site-specific glycosylation using a hemagglutinin containing 9 N-glycosylation sequons from a single HILIC-C18-MS acquisition. PMID:25153361

  5. Rapid and specific SPRi detection of L. pneumophila in complex environmental water samples.

    PubMed

    Foudeh, Amir M; Trigui, Hana; Mendis, Nilmini; Faucher, Sebastien P; Veres, Teodor; Tabrizian, Maryam

    2015-07-01

    Legionellosis is a devastating disease worldwide, mainly because of unpredictable outbreaks in man-made water systems. Developing a highly specific and sensitive rapid detection system that detects only metabolically active bacteria is a main priority for water quality assessment. We previously developed a versatile technique for sensitive and specific detection of synthetic RNA. In the present work, we further investigated the performance of the developed biosensor for the detection of Legionella pneumophila in complex environmental samples, particularly those containing protozoa. The specificity and sensitivity of the detection system were verified using total RNA extracted from L. pneumophila in spiked water co-cultured with amoebae. We demonstrated that the expression level of ribosomal RNA (rRNA) is strongly dependent on environmental conditions. The presence of amoebae with L. pneumophila, especially in nutrient-deprived samples, increased the amount of L. pneumophila 15-fold after 1 week, as measured through the expression of 16S rRNA. Using the developed surface plasmon resonance imaging (SPRi) detection method, we were also able to detect L. pneumophila within 3 h, both in the presence and absence of amoebae, in complex environmental samples obtained from a cooling water tower. These findings suggest that the developed biosensing system is a viable method for rapid, real-time, and effective detection of L. pneumophila in environmental samples and for assessing the risk associated with the use of water contaminated with other pathogens. PMID:25935681

  6. Complexity from Specificity: Light Scattering and Colloidal Studies of Dscam Self-Association

    NASA Astrophysics Data System (ADS)

    Collins, Jesse; Arkus, Natalie; Meng, Guangnan; Brenner, Michael; Schmucker, Dietmar; Manoharan, Vinothan

    2008-03-01

    The self-assembly of complex structures from nanometer-sized building blocks is of great technological importance (e.g., for the development of tissue scaffolds and photonic crystals) and of significant basic scientific interest. Here I present light scattering and colloidal aggregation studies of Dscam, a protein with over 18,000 splice variants, all (or almost all) of which exhibit exclusively homophilic binding, and which is necessary for the generation of structural complexity in the insect brain. Static and dynamic light scattering data reveal the statistical mechanical properties of Dscam self-association, including the free energy, second virial coefficient, and oligomer molecular weight. Finally, I demonstrate how to exploit Dscam's unprecedented level of molecular diversity and specificity for the self-assembly of custom nano- and microstructures from Dscam-conjugated colloids.

  7. Complex I and complex III inhibition specifically increase cytosolic hydrogen peroxide levels without inducing oxidative stress in HEK293 cells

    PubMed Central

    Forkink, Marleen; Basit, Farhan; Teixeira, José; Swarts, Herman G.; Koopman, Werner J.H.; Willems, Peter H.G.M.

    2015-01-01

    Inhibitor studies with isolated mitochondria demonstrated that complex I (CI) and complex III (CIII) of the electron transport chain (ETC) can act as relevant sources of mitochondrial reactive oxygen species (ROS). Here we studied ROS generation and oxidative stress induction during chronic (24 h) inhibition of CI and CIII using rotenone (ROT) and antimycin A (AA), respectively, in intact HEK293 cells. Both inhibitors stimulated oxidation of the ROS sensor hydroethidine (HEt) and increased mitochondrial NAD(P)H levels without major effects on cell viability. Integrated analysis of cells stably expressing cytosolic or mitochondria-targeted variants of the reporter molecules HyPer (H2O2-sensitive and pH-sensitive) and SypHer (H2O2-insensitive and pH-sensitive) revealed that CI and CIII inhibition increased cytosolic but not mitochondrial H2O2 levels. Total and mitochondria-specific lipid peroxidation was not increased in the inhibited cells, as reported by the C11-BODIPY581/591 and MitoPerOx biosensors. Expression of the superoxide-detoxifying enzymes CuZnSOD (cytosolic) and MnSOD (mitochondrial) was also unaffected. Oxyblot analysis revealed that protein carbonylation was not stimulated by CI and CIII inhibition. Our findings suggest that chronic inhibition of CI and CIII (i) increases the levels of HEt-oxidizing ROS, (ii) specifically elevates cytosolic but not mitochondrial H2O2 levels, and (iii) does not induce oxidative stress or substantial cell death. We conclude that the increased ROS levels are below the stress-inducing threshold and might play a role in redox signaling. PMID:26516986
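    Pairing the pH-sensitive HyPer reporter with the H2O2-insensitive SypHer control suggests a simple ratiometric correction: divide the HyPer fold change by the SypHer fold change so that pure pH drift cancels out. This is a hedged, illustrative sketch of that idea, not the authors' published analysis pipeline; all signal values below are made up:

    ```python
    def h2o2_component(hyper, sypher, hyper_baseline, sypher_baseline):
        """HyPer fold change corrected for the pH-only (SypHer) fold change."""
        return (hyper / hyper_baseline) / (sypher / sypher_baseline)

    # Pure pH shift: both reporters rise equally, so the ratio stays at 1.0
    print(h2o2_component(1.3, 1.3, 1.0, 1.0))  # -> 1.0
    # H2O2 increase: HyPer rises more than SypHer, so the ratio exceeds 1.0
    print(h2o2_component(1.8, 1.2, 1.0, 1.0))  # -> 1.5
    ```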

  8. PAF Complex Plays Novel Subunit-Specific Roles in Alternative Cleavage and Polyadenylation

    PubMed Central

    Yang, Yan; Li, Wencheng; Hoque, Mainul; Hou, Liming; Shen, Steven; Tian, Bin; Dynlacht, Brian D.

    2016-01-01

    The PAF complex (Paf1C) has been shown to regulate chromatin modifications, gene transcription, and RNA polymerase II (PolII) elongation. Here, we provide the first genome-wide profiles for the distribution of the entire complex in mammalian cells using chromatin immunoprecipitation and high throughput sequencing. We show that Paf1C is recruited not only to promoters and gene bodies, but also to regions downstream of cleavage/polyadenylation (pA) sites at 3’ ends, a profile that sharply contrasted with the yeast complex. Remarkably, we identified novel, subunit-specific links between Paf1C and regulation of alternative cleavage and polyadenylation (APA) and upstream antisense transcription using RNAi coupled with deep sequencing of the 3’ ends of transcripts. Moreover, we found that depletion of Paf1C subunits resulted in the accumulation of PolII over gene bodies, which coincided with APA. Depletion of specific Paf1C subunits led to global loss of histone H2B ubiquitylation, although there was little impact of Paf1C depletion on other histone modifications, including tri-methylation of histone H3 on lysines 4 and 36 (H3K4me3 and H3K36me3), previously associated with this complex. Our results provide surprising differences with yeast, while unifying observations that link Paf1C with PolII elongation and RNA processing, and indicate that Paf1C subunits could play roles in controlling transcript length through suppression of PolII accumulation at transcription start site (TSS)-proximal pA sites and regulating pA site choice in 3’UTRs. PMID:26765774

  9. Conflict and Reconciliation in Software Design

    NASA Astrophysics Data System (ADS)

    Mandel, Eric

    2014-01-01

    Data analysis software is as open-ended and complex as the research it supports. The written specification is never the full story in an arena where users can’t always know what they want to do next. Requirements often are too vague or too concrete, missing or implicit. They sometimes conflict with one another. How can we design high quality software amidst these variables? In this talk, I will discuss provisional conclusions I have reached concerning software design, based on thirty years of experience developing astronomical software.

  10. Programming Language Software For Graphics Applications

    NASA Technical Reports Server (NTRS)

    Beckman, Brian C.

    1993-01-01

    New approach reduces repetitive development of features common to different applications. High-level programming language and interactive environment with access to graphical hardware and software created by adding graphical commands and other constructs to standardized, general-purpose programming language, "Scheme". Designed for use in developing other software incorporating interactive computer-graphics capabilities into application programs. Provides alternative to programming entire applications in C or FORTRAN, specifically ameliorating design and implementation of complex control and data structures typifying applications with interactive graphics. Enables experimental programming and rapid development of prototype software, and yields high-level programs serving as executable versions of software-design documentation.