Sample records for software factory techniques

  1. Object-Oriented Algorithm For Evaluation Of Fault Trees

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Koen, B. V.

    1992-01-01

    Algorithm for direct evaluation of fault trees incorporates techniques of object-oriented programming. Reduces number of calls needed to solve trees with repeated events. Provides significantly improved software environment for such computations as quantitative analyses of safety and reliability of complicated systems of equipment (e.g., spacecraft or factories).
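
    A minimal sketch of the object-oriented idea, assuming AND/OR gates over independent basic events; the class names are illustrative and this is not the report's algorithm. Each node caches its computed probability, so a basic event shared by several gates is evaluated once per tree evaluation rather than once per reference (the report's exact handling of repeated events is more involved than this independence-based sketch).

    ```cpp
    #include <memory>
    #include <optional>
    #include <utility>
    #include <vector>

    // Illustrative sketch only: each node caches its computed probability so that
    // a basic event shared by several gates is evaluated once per tree evaluation.
    // Gate formulas below assume independent inputs, which is a simplification.
    class Node {
    public:
        virtual ~Node() = default;
        double probability() {
            if (!cached_) cached_ = compute();   // compute on first request only
            return *cached_;
        }
        void invalidate() { cached_.reset(); }   // call when basic-event data change
    protected:
        virtual double compute() = 0;
    private:
        std::optional<double> cached_;
    };

    // Basic event with a fixed failure probability.
    class BasicEvent : public Node {
    public:
        explicit BasicEvent(double p) : p_(p) {}
    protected:
        double compute() override { return p_; }
    private:
        double p_;
    };

    // AND gate: output fails only if all inputs fail.
    class AndGate : public Node {
    public:
        explicit AndGate(std::vector<std::shared_ptr<Node>> in) : in_(std::move(in)) {}
    protected:
        double compute() override {
            double p = 1.0;
            for (auto& n : in_) p *= n->probability();
            return p;
        }
    private:
        std::vector<std::shared_ptr<Node>> in_;
    };

    // OR gate: output fails if at least one input fails.
    class OrGate : public Node {
    public:
        explicit OrGate(std::vector<std::shared_ptr<Node>> in) : in_(std::move(in)) {}
    protected:
        double compute() override {
            double q = 1.0;
            for (auto& n : in_) q *= 1.0 - n->probability();
            return 1.0 - q;
        }
    private:
        std::vector<std::shared_ptr<Node>> in_;
    };
    ```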

  2. Prototyping machine vision software on the World Wide Web

    NASA Astrophysics Data System (ADS)

    Karantalis, George; Batchelor, Bruce G.

    1998-10-01

    Interactive image processing is a proven technique for analyzing industrial vision applications and building prototype systems. Several of the previous implementations have used dedicated hardware to perform the image processing, with a top layer of software providing a convenient user interface. More recently, self-contained software packages have been devised and these run on a standard computer. The advent of the Java programming language has made it possible to write platform-independent software, operating over the Internet, or a company-wide Intranet. Thus, there arises the possibility of designing at least some shop-floor inspection/control systems, without the vision engineer ever entering the factories where they will be used. If successful, this project will have a major impact on the productivity of vision systems designers.

  3. Problem Solving Software for Math Classes.

    ERIC Educational Resources Information Center

    Troutner, Joanne

    1987-01-01

    Described are 10 computer software programs for problem solving related to mathematics. Programs described are: (1) Box Solves Story Problems; (2) Safari Search; (3) Puzzle Tanks; (4) The King's Rule; (5) The Factory; (6) The Royal Rules; (7) The Enchanted Forest; (8) Gears; (9) The Super Factory; and (10) Creativity Unlimited. (RH)

  4. The Personal Software Process: Downscaling the factory

    NASA Technical Reports Server (NTRS)

    Roy, Daniel M.

    1994-01-01

    It is argued that the next wave of software process improvement (SPI) activities will be based on a people-centered paradigm. The most promising such paradigm, Watts Humphrey's personal software process (PSP), is summarized and its advantages are listed. The concepts of the PSP are shown also to fit a down-scaled version of Basili's experience factory. The author's data and lessons learned while practicing the PSP are presented along with personal experience, observations, and advice from the perspective of a consultant and teacher for the personal software process.

  5. The National Shipbuilding Research Program, Computer Aided Process Planning for Shipyards

    DTIC Science & Technology

    1986-08-01

    Excerpted section titles include "Financial Justification of State-of-the-Art Investment: A Study Using CAPP" and "Factory Simulation: Approach to Integration of Computer-Based Factory Simulation with Conventional Factory Planning Techniques."

  6. The combination of simulation and response methodology and its application in an aggregate production plan

    NASA Astrophysics Data System (ADS)

    Chen, Zhiming; Feng, Yuncheng

    1988-08-01

    This paper describes an algorithmic structure for combining simulation and optimization techniques both in theory and practice. Response surface methodology is used to optimize the decision variables in the simulation environment. A simulation-optimization software package has been developed and successfully implemented, and its application to an aggregate production planning simulation-optimization model is reported. The model's objective is to minimize the production cost and to generate an optimal production plan and inventory control strategy for an aircraft factory.

  7. Cameo: A Python Library for Computer Aided Metabolic Engineering and Optimization of Cell Factories.

    PubMed

    Cardoso, João G R; Jensen, Kristian; Lieven, Christian; Lærke Hansen, Anne Sofie; Galkina, Svetlana; Beber, Moritz; Özdemir, Emre; Herrgård, Markus J; Redestig, Henning; Sonnenschein, Nikolaus

    2018-04-20

    Computational systems biology methods enable rational design of cell factories on a genome-scale and thus accelerate the engineering of cells for the production of valuable chemicals and proteins. Unfortunately, the majority of these methods' implementations are either not published, rely on proprietary software, or do not provide documented interfaces, which has precluded their mainstream adoption in the field. In this work we present cameo, a platform-independent software that enables in silico design of cell factories and targets both experienced modelers and users new to the field. It is written in Python and implements state-of-the-art methods for enumerating and prioritizing knockout, knock-in, overexpression, and down-regulation strategies and combinations thereof. Cameo is an open source software project and is freely available under the Apache License 2.0. A dedicated Web site including documentation, examples, and installation instructions can be found at http://cameo.bio. Users can also give cameo a try at http://try.cameo.bio .

  8. The Software Engineering Laboratory: An operational software experience factory

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Caldiera, Gianluigi; Mcgarry, Frank; Pajerski, Rose; Page, Gerald; Waligora, Sharon

    1992-01-01

    For 15 years, the Software Engineering Laboratory (SEL) has been carrying out studies and experiments for the purpose of understanding, assessing, and improving software and software processes within a production software development environment at NASA/GSFC. The SEL comprises three major organizations: (1) NASA/GSFC, Flight Dynamics Division; (2) University of Maryland, Department of Computer Science; and (3) Computer Sciences Corporation, Flight Dynamics Technology Group. These organizations have jointly carried out several hundred software studies, producing hundreds of reports, papers, and documents, all of which describe some aspect of the software engineering technology that was analyzed in the flight dynamics environment at NASA. The studies range from small, controlled experiments (such as analyzing the effectiveness of code reading versus that of functional testing) to large, multiple project studies (such as assessing the impacts of Ada on a production environment). The organization's driving goal is to improve the software process continually, so that sustained improvement may be observed in the resulting products. This paper discusses the SEL as a functioning example of an operational software experience factory and summarizes the characteristics of and major lessons learned from 15 years of SEL operations.

  9. MapFactory - Towards a mapping design pattern for big geospatial data

    NASA Astrophysics Data System (ADS)

    Rautenbach, Victoria; Coetzee, Serena

    2018-05-01

    With big geospatial data emerging, cartographers and geographic information scientists have to find new ways of dealing with the volume, variety, velocity, and veracity (4Vs) of the data. This requires the development of tools that allow processing, filtering, analysing, and visualising of big data through multidisciplinary collaboration. In this paper, we present the MapFactory design pattern that will be used for the creation of different maps according to the (input) design specification for big geospatial data. The design specification is based on elements from ISO19115-1:2014 Geographic information - Metadata - Part 1: Fundamentals that would guide the design and development of the map or set of maps to be produced. The results of the exploratory research suggest that the MapFactory design pattern will help with software reuse and communication. The MapFactory design pattern will aid software developers to build the tools that are required to automate map making with big geospatial data. The resulting maps would assist cartographers and others to make sense of big geospatial data.

  10. LEGOS: Object-based software components for mission-critical systems. Final report, June 1, 1995--December 31, 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-08-01

    An estimated 85% of the installed base of software is a custom application with a production quantity of one. In practice, almost 100% of military software systems are custom software. Paradoxically, the marginal costs of producing additional units are near zero. So why hasn't the software market, a market with high design costs and low production costs, evolved like other similar custom widget industries, such as automobiles and hardware chips? The military software industry seems immune to market pressures that have motivated a multilevel supply chain structure in other widget industries: design cost recovery, improved quality through specialization, and rapid assembly from purchased components. The primary goal of the ComponentWare Consortium (CWC) technology plan was to overcome barriers to building and deploying mission-critical information systems by using verified, reusable software components (ComponentWare). The adoption of the ComponentWare infrastructure is predicated upon a critical mass of the leading platform vendors' inevitable adoption of emerging, object-based, distributed computing frameworks--initially CORBA and COM/OLE. The long-range goal of this work is to build and deploy military systems from verified reusable architectures. The promise of component-based applications is to enable developers to snap together new applications by mixing and matching prefabricated software components. A key result of this effort is the concept of reusable software architectures. A second important contribution is the notion that a software architecture is something that can be captured in a formal language and reused across multiple applications. The formalization and reuse of software architectures provide major cost and schedule improvements. The Unified Modeling Language (UML) is fast becoming the industry standard for object-oriented analysis and design notation for object-based systems. However, the lack of a standard real-time distributed object operating system, the lack of a standard Computer-Aided Software Environment (CASE) tool notation, and the lack of a standard CASE tool repository have limited the realization of component software. The approach to fulfilling this need is the software component factory innovation. The factory approach takes advantage of emerging standards such as UML, CORBA, Java and the Internet. The key technical innovation of the software component factory is the ability to assemble and test new system configurations as well as assemble new tools on demand from existing tools and architecture design repositories.

  11. A reference architecture for the component factory

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Caldiera, Gianluigi; Cantone, Giovanni

    1992-01-01

    Software reuse can be achieved through an organization that focuses on utilization of life cycle products from previous developments. The component factory is both an example of the more general concepts of experience and domain factory and an organizational unit worth being considered independently. The critical features of such an organization are flexibility and continuous improvement. In order to achieve these features we can represent the architecture of the factory at different levels of abstraction and define a reference architecture from which specific architectures can be derived by instantiation. A reference architecture is an implementation and organization independent representation of the component factory and its environment. The paper outlines this reference architecture, discusses the instantiation process, and presents some examples of specific architectures by comparing them in the framework of the reference model.

  12. Domain analysis for the reuse of software development experiences

    NASA Technical Reports Server (NTRS)

    Basili, V. R.; Briand, L. C.; Thomas, W. M.

    1994-01-01

    We need to be able to learn from past experiences so we can improve our software processes and products. The Experience Factory is an organizational structure designed to support and encourage the effective reuse of software experiences. This structure consists of two organizations, separating project development concerns from the organizational concerns of experience packaging and learning. The experience factory provides the processes and support for analyzing, packaging, and improving the organization's stored experience. The project organization is structured to reuse this stored experience in its development efforts. However, a number of questions arise: What past experiences are relevant? Can they all be used (reused) on our current project? How do we take advantage of what has been learned in other parts of the organization? How do we take advantage of experience in the world-at-large? Can someone else's best practices be used in our organization with confidence? This paper describes approaches to help answer these questions. We propose both quantitative and qualitative approaches for effectively reusing software development experiences.

  13. Planning for the semiconductor manufacturer of the future

    NASA Technical Reports Server (NTRS)

    Fargher, Hugh E.; Smith, Richard A.

    1992-01-01

    Texas Instruments (TI) is currently contracted by the Air Force Wright Laboratory and the Defense Advanced Research Projects Agency (DARPA) to develop the next generation flexible semiconductor wafer fabrication system called Microelectronics Manufacturing Science & Technology (MMST). Several revolutionary concepts are being pioneered on MMST, including the following: new single-wafer rapid thermal processes, in-situ sensors, cluster equipment, and advanced Computer Integrated Manufacturing (CIM) software. The objective of the project is to develop a manufacturing system capable of achieving an order of magnitude improvement in almost all aspects of wafer fabrication. TI was awarded the contract in Oct., 1988, and will complete development with a fabrication facility demonstration in April, 1993. An important part of MMST is development of the CIM environment responsible for coordinating all parts of the system. The CIM architecture being developed is based on a distributed object oriented framework made of several cooperating subsystems. The software subsystems include the following: process control for dynamic control of factory processes; modular processing system for controlling the processing equipment; generic equipment model which provides an interface between processing equipment and the rest of the factory; specification system which maintains factory documents and product specifications; simulator for modelling the factory for analysis purposes; scheduler for scheduling work on the factory floor; and the planner for planning and monitoring of orders within the factory. This paper first outlines the division of responsibility between the planner, scheduler, and simulator subsystems. It then describes the approach to incremental planning and the way in which uncertainty is modelled within the plan representation. Finally, current status and initial results are described.

  14. Improvement of productivity in low volume production industry layout by using witness simulation software

    NASA Astrophysics Data System (ADS)

    Jaffrey, V.; Mohamed, N. M. Z. N.; Rose, A. N. M.

    2017-10-01

    In almost all manufacturing industries, increased productivity and better efficiency of the production line are the most important goals. Most factories, especially small-scale factories, have little awareness or knowledge of manufacturing system optimization and rely on traditional management methods. Problems commonly identified in such factories are high labour idle time and low output. This study was carried out in a Small and Medium Enterprises (SME) low volume production company. Data were collected and problems affecting productivity and efficiency were identified. Witness simulation software was used to simulate the layout, with the output focused on improving the layout in terms of productivity and efficiency. The layout was rearranged to reduce the travel time from one workstation to another. The improved layout was then modelled, machine and labour statistics were taken for both the original and improved layouts, and productivity and efficiency were calculated for both layouts and compared.

  15. Software technology insertion: A study of success factors

    NASA Technical Reports Server (NTRS)

    Lydon, Tom

    1990-01-01

    Managing software development in large organizations has become increasingly difficult due to increasing technical complexity, stricter government standards, a shortage of experienced software engineers, competitive pressure for improved productivity and quality, the need to co-develop hardware and software together, and the rapid changes in both hardware and software technology. The 'software factory' approach to software development minimizes risks while maximizing productivity and quality through standardization, automation, and training. However, in practice, this approach is relatively inflexible when adopting new software technologies. The methods that a large multi-project software engineering organization can use to increase the likelihood of successful software technology insertion (STI), especially in a standardized engineering environment, are described.

  16. A new technology for manufacturing scheduling derived from space system operations

    NASA Technical Reports Server (NTRS)

    Hornstein, R. S.; Willoughby, J. K.

    1993-01-01

    A new technology for producing finite capacity schedules has been developed in response to complex requirements for operating space systems such as the Space Shuttle, the Space Station, and the Deep Space Network for telecommunications. This technology has proven its effectiveness in manufacturing environments where popular scheduling techniques associated with Materials Resources Planning (MRPII) and with factory simulation are not adequate for shop-floor work planning and control. The technology has three components. The first is a set of data structures that accommodate an extremely general description of a factory's resources, its manufacturing activities, and the constraints imposed by the environment. The second component is a language and set of software utilities that enable a rapid synthesis of functional capabilities. The third component is an algorithmic architecture called the Five Ruleset Model which accommodates the unique needs of each factory. Using the new technology, systems can model activities that generate, consume, and/or obligate resources. This allows work-in-process (WIP) to be generated and used; it permits constraints to be imposed on intermediate as well as finished goods inventories. It is also possible to match as closely as possible both the current factory state and future conditions such as promise dates. Schedule revisions can be accommodated without impacting the entire production schedule. Applications have been successful in both discrete and process manufacturing environments. The availability of a high-quality finite capacity production planning capability enhances the data management capabilities of MRP II systems. These schedules can be integrated with shop-floor data collection systems and accounting systems. Using the new technology, semi-custom systems can be developed at costs that are comparable to products that do not have equivalent functional capabilities and/or extensibility.
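
    The resource-centred activity description can be pictured with a small data-structure sketch; the type and field names below are hypothetical, not the product's actual schema. An activity generates, consumes, or obligates (reserves) quantities of named resources over its duration, which is how work-in-process and intermediate inventories fall out of the same model as machines and labour.

    ```cpp
    #include <string>
    #include <vector>

    // Hypothetical sketch of a finite-capacity scheduling model in which
    // activities generate, consume, or obligate (reserve) resources.
    enum class ResourceEffect { Generate, Consume, Obligate };

    struct ResourceUse {
        std::string resource;      // e.g. "milling_machine", "WIP_subassembly_A"
        ResourceEffect effect;     // how the activity touches the resource
        double quantity;           // amount generated, consumed, or reserved
    };

    struct Activity {
        std::string name;
        double duration_hours;
        std::vector<ResourceUse> uses;          // resource interactions
        std::vector<std::string> predecessors;  // precedence constraints
        double due_time;                        // e.g. a promise date, if any
    };

    // A schedule assigns a start time to each activity, subject to resource
    // availability profiles and the precedence constraints above.
    struct ScheduledActivity {
        Activity activity;
        double start_time;
    };
    ```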

  17. Proceedings of the Eighteenth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The workshop provided a forum for software practitioners from around the world to exchange information on the measurement, use, and evaluation of software methods, models, and tools. This year, approximately 450 people attended the workshop, which consisted of six sessions on the following topics: the Software Engineering Laboratory, measurement, technology assessment, advanced concepts, process, and software engineering issues in NASA. Three presentations were given in each of the topic areas. The content of those presentations and the research papers detailing the work reported are included in these proceedings. The workshop concluded with a tutorial session on how to start an Experience Factory.

  18. The experience factory: Can it make you a 5? or what is its relationship to other quality and improvement concepts?

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.

    1992-01-01

    The concepts of quality improvements have permeated many businesses. It is clear that the nineties will be the quality era for software and there is a growing need to develop or adapt quality improvement approaches to the software business. Thus we must understand software as an artifact and software as a business. Since the business we are dealing with is software, we must understand the nature of software and software development. The software discipline is evolutionary and experimental; it is a laboratory science. Software is development, not production. The technologies of the discipline are human based. There is a lack of models that allow us to reason about the process and the product. All software is not the same; process is a variable, goals are variable, etc. Packaged, reusable experiences require additional resources in the form of organization, processes, people, etc. There have been a variety of organizational frameworks proposed to improve quality for various businesses. The ones discussed in this presentation include: Plan-Do-Check-Act, a quality improvement process based upon a feedback cycle for optimizing a single process model/production line; the Experience Factory/Quality Improvement Paradigm, continuous improvements through the experimentation, packaging, and reuse of experiences based upon a business's needs; Total Quality Management, a management approach to long term success through customer satisfaction based on the participation of all members of an organization; the SEI capability maturity model, a staged process improvement based upon assessment with regard to a set of key process areas until you reach a level 5 which represents a continuous process improvement; and Lean (software) Development, a principle supporting the concentration of the production on 'value added' activities and the elimination or reduction of 'non-value added' activities.

  19. Developing Software For Monitoring And Diagnosis

    NASA Technical Reports Server (NTRS)

    Edwards, S. J.; Caglayan, A. K.

    1993-01-01

    Expert-system software shell produces executable code. Report discusses beginning phase of research directed toward development of artificial intelligence for real-time monitoring of, and diagnosis of faults in, complicated systems of equipment. Motivated by need for onboard monitoring and diagnosis of electronic sensing and controlling systems of advanced aircraft. Also applicable to such equipment systems as refineries, factories, and powerplants.

  20. Efficient processing of two-dimensional arrays with C or C++

    USGS Publications Warehouse

    Donato, David I.

    2017-07-20

    Because fast and efficient serial processing of raster-graphic images and other two-dimensional arrays is a requirement in land-change modeling and other applications, the effects of 10 factors on the runtimes for processing two-dimensional arrays with C and C++ are evaluated in a comparative factorial study. This study’s factors include the choice among three C or C++ source-code techniques for array processing; the choice of Microsoft Windows 7 or a Linux operating system; the choice of 4-byte or 8-byte array elements and indexes; and the choice of 32-bit or 64-bit memory addressing. This study demonstrates how programmer choices can reduce runtimes by 75 percent or more, even after compiler optimizations. Ten points of practical advice for faster processing of two-dimensional arrays are offered to C and C++ programmers. Further study and the development of a C and C++ software test suite are recommended. Key words: array processing, C, C++, compiler, computational speed, land-change modeling, raster-graphic image, two-dimensional array, software efficiency
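
    To make one of those source-code choices concrete, here is a hedged sketch (an assumed illustration, not part of the study's test suite) contrasting a single contiguous row-major buffer with a vector-of-vectors; the width of the index type (here std::size_t, 8 bytes on a typical 64-bit build) is itself one of the factors the study varies.

    ```cpp
    #include <cstddef>
    #include <vector>

    // Contiguous row-major storage with a computed index: one allocation,
    // sequential memory access when the column index runs in the inner loop.
    double sum_contiguous(const std::vector<double>& a, std::size_t rows, std::size_t cols) {
        double total = 0.0;
        for (std::size_t r = 0; r < rows; ++r)
            for (std::size_t c = 0; c < cols; ++c)
                total += a[r * cols + c];   // index arithmetic, no pointer chasing
        return total;
    }

    // Vector-of-vectors: each row is a separate allocation, so every element
    // access goes through an extra level of indirection.
    double sum_nested(const std::vector<std::vector<double>>& a) {
        double total = 0.0;
        for (const auto& row : a)
            for (double v : row)
                total += v;
        return total;
    }
    ```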

  1. Virtual Collaborative Simulation Environment for Integrated Product and Process Development

    NASA Technical Reports Server (NTRS)

    Gulli, Michael A.

    1997-01-01

    Deneb Robotics is a leader in the development of commercially available, leading-edge three-dimensional simulation software tools for virtual prototyping, simulation-based design, manufacturing process simulation, and factory floor simulation and training applications. Deneb has developed and commercially released a preliminary Virtual Collaborative Engineering (VCE) capability for Integrated Product and Process Development (IPPD). This capability allows distributed, real-time visualization and evaluation of design concepts, manufacturing processes, and total factories and enterprises in one seamless simulation environment.

  2. Treatment of dyeing wastewater by TiO2/H2O2/UV process: experimental design approach for evaluating total organic carbon (TOC) removal efficiency.

    PubMed

    Lee, Seung-Mok; Kim, Young-Gyu; Cho, Il-Hyoung

    2005-01-01

    Optimal operating conditions for treating dyeing wastewater were investigated using factorial design and response surface methodology (RSM). The experiment was statistically designed and carried out according to a 2^2 full factorial design with four factorial points, three center points, and four axial points. Linear and nonlinear regression was then applied to the data using the SAS software package. The independent variables were TiO2 dosage and H2O2 concentration, and the total organic carbon (TOC) removal efficiency of the dyeing wastewater was the dependent variable. From the factorial design and RSM, the maximum TOC removal efficiency (85%) of the dyeing wastewater was obtained at a TiO2 dosage of 1.82 gL(-1) and an H2O2 concentration of 980 mgL(-1) with a 20-min oxidation reaction.

  3. Automated Scheduling Via Artificial Intelligence

    NASA Technical Reports Server (NTRS)

    Biefeld, Eric W.; Cooper, Lynne P.

    1991-01-01

    Artificial-intelligence software that automates scheduling developed in Operations Mission Planner (OMP) research project. Software used in both generation of new schedules and modification of existing schedules in view of changes in tasks and/or available resources. Approach based on iterative refinement. Although project focused upon scheduling of operations of scientific instruments and other equipment aboard spacecraft, also applicable to such terrestrial problems as scheduling production in factory.

  4. Towards Archetypes-Based Software Development

    NASA Astrophysics Data System (ADS)

    Piho, Gunnar; Roost, Mart; Perkins, David; Tepandi, Jaak

    We present a framework for the archetypes based engineering of domains, requirements and software (Archetypes-Based Software Development, ABD). An archetype is defined as a primordial object that occurs consistently and universally in business domains and in business software systems. An archetype pattern is a collaboration of archetypes. Archetypes and archetype patterns are used to capture conceptual information into domain specific models that are utilized by ABD. The focus of ABD is on software factories - family-based development artefacts (domain specific languages, patterns, frameworks, tools, micro processes, and others) that can be used to build the family members. We demonstrate the usage of ABD for developing laboratory information management system (LIMS) software for the Clinical and Biomedical Proteomics Group, at the Leeds Institute of Molecular Medicine, University of Leeds.

  5. Status of the MIND simulation and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cervera Villanueva, A.; Martin-Albo, J.; Laing, A.

    2010-03-30

    A realistic simulation of the Neutrino Factory detectors is required in order to fully understand the sensitivity of such a facility to the remaining parameters and degeneracies of the neutrino mixing matrix. Here described is the status of a modular software framework being developed to accommodate such a study. The results of initial studies of the reconstruction software and expected efficiency curves in the context of the golden channel are given.

  6. Rapid prototyping 3D virtual world interfaces within a virtual factory environment

    NASA Technical Reports Server (NTRS)

    Kosta, Charles Paul; Krolak, Patrick D.

    1993-01-01

    On-going work into user requirements analysis using CLIPS (NASA/JSC) expert systems as an intelligent event simulator has led to research into three-dimensional (3D) interfaces. Previous work involved CLIPS and two-dimensional (2D) models. Integral to this work was the development of the University of Massachusetts Lowell parallel version of CLIPS, called PCLIPS. This allowed us to create both a Software Bus and a group problem-solving environment for expert systems development. By shifting the PCLIPS paradigm to use the VEOS messaging protocol we have merged VEOS (HITL/Seattle) and CLIPS into a distributed virtual worlds prototyping environment (VCLIPS). VCLIPS uses the VEOS protocol layer to allow multiple experts to cooperate on a single problem. We have begun to look at the control of a virtual factory. In the virtual factory there are actors and objects as found in our Lincoln Logs Factory of the Future project. In this artificial reality architecture there are three VCLIPS entities in action. One entity is responsible for display and user events in the 3D virtual world. Another is responsible for either simulating the virtual factory or communicating with the real factory. The third is a user interface expert. The interface expert maps user input levels, within the current prototype, to control information for the factory. The interface to the virtual factory is based on a camera paradigm. The graphics subsystem generates camera views of the factory on standard X-Window displays. The camera allows for view control and object control. Control of the factory is accomplished by the user reaching into the camera views to perform object interactions. All communication between the separate CLIPS expert systems is done through VEOS.

  7. Impacts of object-oriented technologies: Seven years of SEL studies

    NASA Technical Reports Server (NTRS)

    Stark, Mike

    1993-01-01

    This paper examines the premise that object-oriented technology (OOT) is the most significant technology ever examined by the Software Engineering Laboratory. The evolution of the use of OOT in the Software Engineering Laboratory (SEL) 'Experience Factory' is described in terms of the SEL's original expectations, focusing on how successive generations of projects have used OOT. General conclusions are drawn on how the usage of the technology has evolved in this environment.

  8. TEAM (Technologies Enabling Agile Manufacturing) shop floor control requirements guide: Version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-03-28

    TEAM will create a shop floor control system (SFC) to link the pre-production planning to shop floor execution. SFC must meet the requirements of a multi-facility corporation, where control must be maintained between co-located facilities down to individual workstations within each facility. SFC must also meet the requirements of a small corporation, where there may only be one small facility. A hierarchical architecture is required to meet these diverse needs. The hierarchy contains the following levels: Enterprise, Factory, Cell, Station, and Equipment. SFC is focused on the top three levels. Each level of the hierarchy is divided into three basic functions: Scheduler, Dispatcher, and Monitor. The requirements of each function depend on the hierarchical level in which it is to be used. For example, the scheduler at the Enterprise level must allocate production to individual factories and assign due-dates; the scheduler at the Cell level must provide detailed start and stop times of individual operations. Finally, the system shall have the following features: distributed and open-architecture. Open architecture software is required in order that the appropriate technology be used at each level of the SFC hierarchy, and even at different instances within the same hierarchical level (for example, Factory A uses discrete-event simulation scheduling software, and Factory B uses an optimization-based scheduler). A distributed implementation is required to reduce the computational burden of the overall system, and allow for localized control. A distributed, open-architecture implementation will also require standards for communication between hierarchical levels.
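
    A compact sketch of that hierarchy, using hypothetical type names rather than anything from the requirements guide: every control node exposes the same three functions, and the concrete scheduler can differ per node (for example, simulation-based in one factory and optimization-based in another).

    ```cpp
    #include <memory>
    #include <string>
    #include <vector>

    // Hypothetical sketch of a hierarchical shop floor control structure:
    // every level provides a scheduler, a dispatcher, and a monitor.
    enum class Level { Enterprise, Factory, Cell, Station, Equipment };

    struct Scheduler  { virtual ~Scheduler() = default;  virtual void buildSchedule() = 0; };
    struct Dispatcher { virtual ~Dispatcher() = default; virtual void releaseWork() = 0; };
    struct Monitor    { virtual ~Monitor() = default;    virtual void reportStatus() = 0; };

    struct ControlNode {
        Level level;
        std::string name;                       // e.g. "Factory A", "Cell 3"
        std::unique_ptr<Scheduler>  scheduler;  // e.g. simulation- or optimization-based
        std::unique_ptr<Dispatcher> dispatcher;
        std::unique_ptr<Monitor>    monitor;
        std::vector<ControlNode*>   children;   // next level down in the hierarchy
    };
    ```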

  9. Optimization of antibacterial activity by Gold-Thread (Coptidis Rhizoma Franch) against Streptococcus mutans using evolutionary operation-factorial design technique.

    PubMed

    Choi, Ung-Kyu; Kim, Mi-Hyang; Lee, Nan-Hee

    2007-11-01

    This study was conducted to find the optimum extraction conditions of Gold-Thread for antibacterial activity against Streptococcus mutans using the evolutionary operation (EVOP)-factorial design technique. Higher antibacterial activity was achieved at a higher extraction temperature (R2 = -0.79) and a longer extraction time (R2 = -0.71). Antibacterial activity was not affected by variation of the ethanol concentration in the extraction solvent (R2 = -0.12). The maximum antibacterial activity of clove against S. mutans determined by the EVOP-factorial technique was obtained at an 80 degrees C extraction temperature, 26 h extraction time, and 50% ethanol concentration. The population of S. mutans decreased from 6.110 logCFU/ml in the initial set to 4.125 logCFU/ml in the third set.

  10. Multi-registration of software library resources

    DOEpatents

    Archer, Charles J [Rochester, MN]; Blocksome, Michael A [Rochester, MN]; Ratterman, Joseph D [Rochester, MN]; Smith, Brian E [Rochester, MN]

    2011-04-05

    Data communications, including issuing, by an application program to a high level data communications library, a request for initialization of a data communications service; issuing to a low level data communications library a request for registration of data communications functions; registering the data communications functions, including instantiating a factory object for each of the one or more data communications functions; issuing by the application program an instruction to execute a designated data communications function; issuing, to the low level data communications library, an instruction to execute the designated data communications function, including passing to the low level data communications library a call parameter that identifies a factory object; creating with the identified factory object the data communications object that implements the data communications function according to the protocol; and executing by the low level data communications library the designated data communications function.
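
    The registration-and-dispatch flow described above resembles a conventional factory-object registry. The sketch below is a generic illustration under that reading, with invented names; it is not the patented implementation.

    ```cpp
    #include <map>
    #include <memory>
    #include <string>
    #include <utility>

    // Generic illustration of factory-object registration and dispatch (invented names).
    // A data communications function is modelled as an abstract interface; each
    // registered factory object knows how to instantiate one implementation of it.
    struct CommsFunction {
        virtual ~CommsFunction() = default;
        virtual void execute() = 0;   // carry out the data communications operation
    };

    struct CommsFactory {
        virtual ~CommsFactory() = default;
        virtual std::unique_ptr<CommsFunction> create() const = 0;   // instantiate the function object
    };

    class LowLevelLibrary {
    public:
        // Registration: one factory object per named data communications function.
        void registerFunction(const std::string& name, std::shared_ptr<CommsFactory> factory) {
            registry_[name] = std::move(factory);
        }
        // Execution: the call parameter identifies which factory object to use;
        // the factory creates the object that implements the function, then it runs.
        void executeFunction(const std::string& name) {
            auto it = registry_.find(name);
            if (it == registry_.end()) return;   // unknown function name: nothing to do
            auto fn = it->second->create();
            fn->execute();
        }
    private:
        std::map<std::string, std::shared_ptr<CommsFactory>> registry_;
    };
    ```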

  11. Some Solved Problems with the SLAC PEP-II B-Factory Beam-Position Monitor System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Ronald G.

    2000-05-05

    The Beam-Position Monitor (BPM) system for the SLAC PEP-II B-Factory has been in operation for over two years. Although the BPM system has met all of its specifications, several problems with the system have been identified and solved. The problems include errors and limitations in both the hardware and software. Solutions of such problems have led to improved performance and reliability. In this paper the authors report on this experience. The process of identifying problems is not at an end and they expect continued improvement of the BPM system.

  12. Factory approach can streamline patient accounting.

    PubMed

    Rands, J; Muench, M

    1991-08-01

    Although they may seem fundamentally different, similarities exist between operations of factories and healthcare organizations' business offices. As a result, a patient accounting approach based on manufacturing firms' management techniques may help smooth healthcare business processes. Receivables performance management incorporates the Japanese techniques of "just-in-time" and total quality management to reduce unbilled accounts and information backlog and accelerate payment. A preliminary diagnostic assessment of a patient accounting process helps identify bottlenecks and set priorities for work flow.

  13. Plant Factory

    NASA Astrophysics Data System (ADS)

    Ikeda, Hideo

    Recently, much attention has been paid to the plant factory, as it enables plants to be grown stably under extreme climate conditions such as high or low air temperatures and low rainfall. Many questions, such as reducing investment costs, realizing stable plant production, and developing new growing techniques, must be solved before this growing system becomes widespread. However, I think that highly developed Japanese industrial know-how can be introduced into the plant factory system to create business opportunities in the world market.

  14. Systems biology of yeast: enabling technology for development of cell factories for production of advanced biofuels.

    PubMed

    de Jong, Bouke; Siewers, Verena; Nielsen, Jens

    2012-08-01

    Transportation fuels will gradually shift from oil based fuels towards alternative fuel resources like biofuels. Current bioethanol and biodiesel can, however, not cover the increasing demand for biofuels and there is therefore a need for advanced biofuels with superior fuel properties. Novel cell factories will provide a production platform for advanced biofuels. However, deep cellular understanding is required for improvement of current biofuel cell factories. Fast screening and analysis (-omics) methods and metabolome-wide mathematical models are promising techniques. An integrated systems approach of these techniques drives diversity and quantity of several new biofuel compounds. This review will cover the recent technological developments that support improvement of the advanced biofuels 1-butanol, biodiesels and jetfuels. Copyright © 2011 Elsevier Ltd. All rights reserved.

  15. Radiative Penguin Decays at the B Factories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koneke, Karsten; /MIT, LNS

    2007-11-16

    In this article, I review the most recent results in radiative penguin decays from the B factories Belle and BABAR. Most notably, I will talk about the recent new observations in the decays B → (ρ/ω)γ, a new analysis technique in b → sγ, and first measurements of radiative penguin decays in the B_s^0 meson system. Finally, I will summarize the current status and future prospects of radiative penguin B physics at the B factories.

  16. Performing Contrast Analysis in Factorial Designs: From NHST to Confidence Intervals and Beyond

    PubMed Central

    Wiens, Stefan; Nilsson, Mats E.

    2016-01-01

    Because of the continuing debates about statistics, many researchers may feel confused about how to analyze and interpret data. Current guidelines in psychology advocate the use of effect sizes and confidence intervals (CIs). However, researchers may be unsure about how to extract effect sizes from factorial designs. Contrast analysis is helpful because it can be used to test specific questions of central interest in studies with factorial designs. It weights several means and combines them into one or two sets that can be tested with t tests. The effect size produced by a contrast analysis is simply the difference between means. The CI of the effect size informs directly about direction, hypothesis exclusion, and the relevance of the effects of interest. However, any interpretation in terms of precision or likelihood requires the use of likelihood intervals or credible intervals (Bayesian). These various intervals and even a Bayesian t test can be obtained easily with free software. This tutorial reviews these methods to guide researchers in answering the following questions: When I analyze mean differences in factorial designs, where can I find the effects of central interest, and what can I learn about their effect sizes? PMID:29805179
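
    As a concrete illustration (standard textbook formulas, not reproduced from the article): for cell means m_1, ..., m_k with sample sizes n_i and contrast weights c_i that sum to zero, the contrast, its standard error, and its confidence interval are

    ```latex
    L = \sum_{i=1}^{k} c_i m_i, \qquad
    SE_L = \sqrt{MS_{\mathrm{error}} \sum_{i=1}^{k} \frac{c_i^{2}}{n_i}}, \qquad
    CI = L \pm t_{\alpha/2,\; df_{\mathrm{error}}} \, SE_L
    ```

    With weights (+1, -1) the contrast reduces to a simple difference between two means, matching the statement above that the effect size from a contrast analysis is simply the difference between means.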

  17. A method to accelerate creation of plasma etch recipes using physics and Bayesian statistics

    NASA Astrophysics Data System (ADS)

    Chopra, Meghali J.; Verma, Rahul; Lane, Austin; Willson, C. G.; Bonnecaze, Roger T.

    2017-03-01

    Next generation semiconductor technologies like high density memory storage require precise 2D and 3D nanopatterns. Plasma etching processes are essential to achieving the nanoscale precision required for these structures. Current plasma process development methods rely primarily on iterative trial and error or factorial design of experiment (DOE) to define the plasma process space. Here we evaluate the efficacy of the software tool Recipe Optimization for Deposition and Etching (RODEo) against standard industry methods at determining the process parameters of a high density O2 plasma system with three case studies. In the first case study, we demonstrate that RODEo is able to predict etch rates more accurately than a regression model based on a full factorial design while using 40% fewer experiments. In the second case study, we demonstrate that RODEo performs significantly better than a full factorial DOE at identifying optimal process conditions to maximize anisotropy. In the third case study we experimentally show how RODEo maximizes etch rates while using half the experiments of a full factorial DOE method. With enhanced process predictions and more accurate maps of the process space, RODEo reduces the number of experiments required to develop and optimize plasma processes.

  18. Baby factories taint surrogacy in Nigeria.

    PubMed

    Makinde, Olusesan Ayodeji; Makinde, Olufunmbi Olukemi; Olaleye, Olalekan; Brown, Brandon; Odimegwu, Clifford O

    2016-01-01

    The practice of reproductive medicine in Nigeria is facing new challenges with the proliferation of 'baby factories'. Baby factories are buildings, hospitals or orphanages that have been converted into places for young girls and women to give birth to children for sale on the black market, often to infertile couples, or into trafficking rings. This practice illegally provides outcomes (children) similar to surrogacy. While surrogacy has not been well accepted in this environment, the proliferation of baby factories further threatens its acceptance. The involvement of medical and allied health workers in the operation of baby factories raises ethical concerns. The lack of a properly defined legal framework and code of practice for surrogacy makes it difficult to prosecute baby factory owners, especially when they are health workers claiming to be providing services to clients. In this environment, surrogacy and other assisted reproductive techniques urgently require regulation in order to define when ethico-legal lines have been crossed in providing surrogacy or surrogacy-like services. Copyright © 2015 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.

  19. Formulation of topical bioadhesive gel of aceclofenac using 3-level factorial design.

    PubMed

    Singh, Sanjay; Parhi, Rabinarayan; Garg, Anuj

    2011-01-01

    The objective of this work was to develop a bioadhesive topical gel of Aceclofenac with the help of a response-surface approach. Experiments were performed according to a 3-level factorial design to evaluate the effects of two independent variables [amount of Poloxamer 407 (PL-407 = X1) and hydroxypropylmethyl cellulose K100 M (HPMC = X2)] on the bioadhesive character of the gel, its rheological property (consistency index), and in-vitro drug release. The best model was selected to fit the data. A mathematical equation was generated by Design Expert® software for the model, which assists in determining the effect of the independent variables. Response surface plots were also generated by the software for analyzing the effect of the independent variables on the responses. A quadratic model was found to be the best for all the responses. Both independent variables (X1 and X2) were found to have a synergistic effect on bioadhesion (Y1), but the effect of HPMC was more pronounced than that of PL-407. The consistency index was enhanced by increasing the level of both independent variables. An antagonistic effect of both independent variables was found on the cumulative percentage release of drug in 2 h (Y3) and 8 h (Y4). Both independent variables contributed approximately equally to the antagonistic effect on Y3, whereas the antagonistic effect of HPMC was more pronounced than that of PL-407. The effect of formulation variables on the product characteristics can be easily predicted and precisely interpreted by using a 3-level factorial experimental design and the generated quadratic mathematical equations.
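
    For reference, a two-factor quadratic response-surface model of the kind fitted here has the general textbook form (coefficients left symbolic; the fitted values are not reproduced from the article):

    ```latex
    Y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \beta_{12} X_1 X_2 + \beta_{11} X_1^{2} + \beta_{22} X_2^{2}
    ```

    Positive coefficients correspond to the synergistic effects reported for bioadhesion and consistency index, and negative coefficients to the antagonistic effects reported for drug release.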

  20. Development of new multilocus variable number of tandem repeat analysis (MLVA) for Listeria innocua and its application in a food processing plant.

    PubMed

    Takahashi, Hajime; Ohshima, Chihiro; Nakagawa, Miku; Thanatsang, Krittaporn; Phraephaisarn, Chirapiphat; Chaturongkasumrit, Yuphakhun; Keeratipibul, Suwimon; Kuda, Takashi; Kimura, Bon

    2014-01-01

    Listeria innocua is an important hygiene indicator bacterium in food industries because it behaves similarly to Listeria monocytogenes, which is pathogenic to humans. PFGE is often used to characterize bacterial strains and to track contamination sources. However, because PFGE is an expensive, complicated, and time-consuming protocol and poses difficulties for data sharing, development of a new typing method is necessary. MLVA is a technique that identifies bacterial strains on the basis of the number of tandem repeats present in the genome, which varies among strains. MLVA has gained attention due to its high reproducibility and ease of data sharing. In this study, we developed an MLVA protocol to assess L. innocua and evaluated it by tracking the contamination source of L. innocua in an actual food manufacturing factory by typing the bacterial strains isolated from the factory. Three VNTR regions of the L. innocua genome were chosen for use in the MLVA. The number of repeat units in each VNTR region was calculated based on the results of PCR product analysis using capillary electrophoresis (CE). The calculated number of repetitions was compared with the results of the gene sequence analysis to demonstrate the accuracy of the CE repeat number analysis. The developed technique was evaluated using 60 L. innocua strains isolated from a food factory. These 60 strains were classified into 11 patterns using MLVA. Many of the strains were classified into ST-6, revealing that this MLVA strain type can contaminate each manufacturing process in the factory. The MLVA protocol developed in this study for L. innocua allowed rapid and easy analysis through the use of CE. This technique was found to be very useful in hygiene control in factories because it allowed us to track contamination sources and provided information regarding whether the bacteria were present in the factories.

  1. The Potential Role of Drexon LaserCards in Optical Publishing.

    ERIC Educational Resources Information Center

    Schwerin, Julie B.

    1985-01-01

    Describes Drexon LaserCard (credit card size format holding two megabytes of digital data that can be recorded at factory or by information distributors) as a viable option to rotating optical media for distribution of computer software, technical manuals, periodicals, and other document applications, and projects its future in optical publishing.…

  2. The Experience Factory: Strategy and Practice

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Caldiera, Gianluigi

    1995-01-01

    The quality movement, which has had a dramatic impact on all industrial sectors in recent years, has recently reached the system and software industry. Although some concepts of quality management, originally developed for other product types, can be applied to software, its specificity as a product which is developed and not produced requires a special approach. This paper introduces a quality paradigm specifically tailored to the problems of the systems and software industry. Reuse of products, processes and experiences originating from the system life cycle is seen today as a feasible solution to the problem of developing higher quality systems at a lower cost. In fact, quality improvement is very often achieved by defining and developing an appropriate set of strategic capabilities and core competencies to support them. A strategic capability is, in this context, a corporate goal defined by the business position of the organization and implemented by key business processes. Strategic capabilities are supported by core competencies, which are aggregate technologies tailored to the specific needs of the organization in performing the needed business processes. Core competencies are non-transitional, have a consistent evolution, and are typically fueled by multiple technologies. Their selection and development requires commitment, investment and leadership. The paradigm introduced in this paper for developing core competencies is the Quality Improvement Paradigm which consists of six steps: (1) Characterize the environment, (2) Set the goals, (3) Choose the process, (4) Execute the process, (5) Analyze the process data, and (6) Package experience. The process must be supported by a goal oriented approach to measurement and control, and an organizational infrastructure, called Experience Factory. The Experience Factory is a logical and physical organization distinct from the project organizations it supports. Its goal is development and support of core competencies through capitalization and reuse of its life cycle experience and products. The paper introduces the major concepts of the proposed approach, discusses their relationship with other approaches used in the industry, and presents a case in which those concepts have been successfully applied.

  3. Gearing up to the factory of the future

    NASA Astrophysics Data System (ADS)

    Godfrey, D. E.

    1985-01-01

    The features of factories and manufacturing techniques and tools of the near future are discussed. The spur to incorporate new technologies on the factory floor will originate in management, who must guide the interfacing of computer-enhanced equipment with traditional manpower, materials and machines. Electronic control with responsiveness and flexibility will be the key concept in an integrated approach to processing materials. Microprocessor controlled laser and fluid cutters add accuracy to cutting operations. Unattended operation will become feasible when automated inspection is added to a work station through developments in robot vision. Optimum shop management will be achieved through AI programming of parts manufacturing, optimized work flows, and cost accounting. The automation enhancements will allow designers to affect directly parts being produced on the factory floor.

  4. DOCLIB: a software library for document processing

    NASA Astrophysics Data System (ADS)

    Jaeger, Stefan; Zhu, Guangyu; Doermann, David; Chen, Kevin; Sampat, Summit

    2006-01-01

    Most researchers would agree that research in the field of document processing can benefit tremendously from a common software library through which institutions are able to develop and share research-related software and applications across academic, business, and government domains. However, despite several attempts in the past, the research community still lacks a widely-accepted standard software library for document processing. This paper describes a new library called DOCLIB, which tries to overcome the drawbacks of earlier approaches. Many of DOCLIB's features are unique either in themselves or in their combination with others, e.g. the factory concept for support of different image types, the juxtaposition of image data and metadata, or the add-on mechanism. We cherish the hope that DOCLIB serves the needs of researchers better than previous approaches and will readily be accepted by a larger group of scientists.

  5. SWITCH: a dynamic CRISPR tool for genome engineering and metabolic pathway control for cell factory construction in Saccharomyces cerevisiae.

    PubMed

    Vanegas, Katherina García; Lehka, Beata Joanna; Mortensen, Uffe Hasbro

    2017-02-08

    The yeast Saccharomyces cerevisiae is increasingly used as a cell factory. However, cell factory construction time is a major obstacle towards using yeast for bio-production. Hence, tools to speed up cell factory construction are desirable. In this study, we have developed a new Cas9/dCas9 based system, SWITCH, which allows Saccharomyces cerevisiae strains to iteratively alternate between a genetic engineering state and a pathway control state. Since Cas9 induced recombination events are crucial for SWITCH efficiency, we first developed a technique TAPE, which we have successfully used to address protospacer efficiency. As proof of concept of the use of SWITCH in cell factory construction, we have exploited the genetic engineering state of a SWITCH strain to insert the five genes necessary for naringenin production. Next, the naringenin cell factory was switched to the pathway control state where production was optimized by downregulating an essential gene TSC13, hence, reducing formation of a byproduct. We have successfully integrated two CRISPR tools, one for genetic engineering and one for pathway control, into one system and successfully used it for cell factory construction.

  6. A novel method about detecting missing holes on the motor carling

    NASA Astrophysics Data System (ADS)

    Xu, Hongsheng; Tan, Hao; Li, Guirong

    2018-03-01

    After a thorough analysis of how to use an image processing system to detect missing holes on the motor carling, we design the whole system around the actual production conditions of the motor carling. We then explain the system's hardware and software in detail, introducing their general functions and discussing the hardware and software modules and the theory behind their design. The steps of confirming the region of interest for image processing, edge detection, and circle detection with a randomized Hough transform are explained in detail. Finally, the results of testing the system in the laboratory and in the factory are presented.
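
    A hedged sketch of the circle-detection step, using OpenCV's standard gradient-based Hough circle transform as a stand-in for the randomized Hough transform described above; the thresholds, radii, and expected hole count are placeholders that would need tuning to the actual carling images.

    ```cpp
    #include <opencv2/imgproc.hpp>
    #include <vector>

    // Detect candidate hole circles in a grayscale region of interest of the carling image.
    // If fewer circles are found than the expected hole count, flag a missing hole.
    bool holesMissing(const cv::Mat& gray_roi, int expected_holes) {
        cv::Mat blurred;
        cv::GaussianBlur(gray_roi, blurred, cv::Size(9, 9), 2.0);   // suppress noise before edge detection

        std::vector<cv::Vec3f> circles;   // each detected circle: (x, y, radius)
        cv::HoughCircles(blurred, circles, cv::HOUGH_GRADIENT,
                         1.0,     // accumulator resolution ratio
                         20.0,    // minimum distance between circle centres (placeholder)
                         100.0,   // Canny high threshold (placeholder)
                         30.0,    // accumulator vote threshold (placeholder)
                         5, 30);  // min/max radius in pixels (placeholders)

        return static_cast<int>(circles.size()) < expected_holes;
    }
    ```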

  7. Effect of Items Direction (Positive or Negative) on the Factorial Construction and Criterion Related Validity in Likert Scale

    ERIC Educational Resources Information Center

    Naji Qasem, Mamun Ali; Ahmad Gul, Showkeen Bilal

    2014-01-01

    The study was conducted to determine the effect of items direction (positive or negative) on the factorial construction and criterion-related validity of a Likert scale. The descriptive survey research method was used for the study, and the sample consisted of 510 undergraduate students selected using a random sampling technique. A scale developed by…

  8. Quantum computation with realistic magic-state factories

    NASA Astrophysics Data System (ADS)

    O'Gorman, Joe; Campbell, Earl T.

    2017-03-01

    Leading approaches to fault-tolerant quantum computation dedicate a significant portion of the hardware to computational factories that churn out high-fidelity ancillas called magic states. Consequently, efficient and realistic factory design is of paramount importance. Here we present the most detailed resource assessment to date of magic-state factories within a surface code quantum computer, along the way introducing a number of techniques. We show that the block codes of Bravyi and Haah [Phys. Rev. A 86, 052329 (2012), 10.1103/PhysRevA.86.052329] have been systematically undervalued; we track correlated errors both numerically and analytically, providing fidelity estimates without appeal to the union bound. We also introduce a subsystem code realization of these protocols with constant time and low ancilla cost. Additionally, we confirm that magic-state factories have space-time costs that scale as a constant factor of surface code costs. We find that the magic-state factory required for postclassical factoring can be as small as 6.3 million data qubits, ignoring ancilla qubits, assuming 10^-4 error gates and the availability of long-range interactions.

  9. Characterizing and Modeling the Cost of Rework in a Library of Reusable Software Components

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Condon, Steven E.; ElEmam, Khaled; Hendrick, Robert B.; Melo, Walcelio

    1997-01-01

    In this paper we characterize and model the cost of rework in a Component Factory (CF) organization. A CF is responsible for developing and packaging reusable software components. Data were collected on corrective maintenance activities for the Generalized Support Software reuse asset library located at the Flight Dynamics Division of NASA's GSFC. We then constructed a predictive model of the cost of rework using the C4.5 system for generating a logical classification model. The predictor variables for the model are measures of internal software product attributes. The model demonstrates good prediction accuracy, and can be used by managers to allocate resources for corrective maintenance activities. Furthermore, we used the model to generate proscriptive coding guidelines to improve programming practices so that the cost of rework can be reduced in the future. The general approach we have used is applicable to other environments.
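    A hedged sketch of the kind of classification model described above is given below. C4.5 itself is not available in common Python libraries, so scikit-learn's CART-style decision tree is used as a stand-in; the internal product metrics and the high/low rework label are synthetic illustrations, not the SEL data.

    ```python
    # Hypothetical sketch of a classification-tree model of rework cost, in the
    # spirit of the C4.5 model described above (scikit-learn's CART classifier is
    # a stand-in; metrics and labels are illustrative only).
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier, export_text

    rng = np.random.default_rng(0)
    n = 200
    # Internal product attributes: size (SLOC), cyclomatic complexity, fan-out.
    X = np.column_stack([
        rng.integers(50, 2000, n),   # module size
        rng.integers(1, 40, n),      # cyclomatic complexity
        rng.integers(0, 15, n),      # number of called modules
    ])
    # Synthetic "high rework cost" label loosely driven by size and complexity.
    y = ((X[:, 0] > 1000) & (X[:, 1] > 20)).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
    print("holdout accuracy:", tree.score(X_te, y_te))
    print(export_text(tree, feature_names=["sloc", "complexity", "fan_out"]))
    ```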

  10. Elementary Preservice Teachers' Reasoning about Modeling a "Family Factory" with TinkerPlots--A Pilot Study

    ERIC Educational Resources Information Center

    Biehler, Rolf; Frischemeier, Daniel; Podworny, Susanne

    2017-01-01

    Connecting data and chance is fundamental in statistics curricula. The use of software like TinkerPlots can bridge both worlds because the TinkerPlots Sampler supports learners in expressive modeling. We conducted a study with elementary preservice teachers with a basic university education in statistics. They were asked to set up and evaluate…
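    TinkerPlots is a graphical tool, but the sampler behaviour at the heart of a "family factory" task can be imitated in a few lines of code. The sketch below assumes (this is our assumption, not stated in the abstract) that the device draws families of four children, each independently boy or girl with probability 0.5, and tallies the number of girls per family.

    ```python
    # Hedged re-creation of a "family factory" sampler in plain Python.
    # Assumption (not from the abstract): families of four children, each child
    # independently boy/girl with probability 0.5; we study the distribution of
    # the number of girls per family.
    import random
    from collections import Counter

    def family(n_children=4):
        return [random.choice("BG") for _ in range(n_children)]

    def simulate(n_families=10_000):
        counts = Counter("".join(fam).count("G")
                         for fam in (family() for _ in range(n_families)))
        return {girls: counts[girls] / n_families for girls in sorted(counts)}

    print(simulate())  # relative frequency of 0..4 girls per family
    ```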

  11. Support for life-cycle product reuse in NASA's SSE

    NASA Technical Reports Server (NTRS)

    Shotton, Charles

    1989-01-01

    The Software Support Environment (SSE) is a software factory for the production of Space Station Freedom Program operational software. The SSE is to be centrally developed and maintained and used to configure software production facilities in the field. The PRC product TTCQF provides for an automated qualification process and analysis of existing code that can be used for software reuse. The interrogation subsystem permits user queries of the reusable data and components which have been identified by an analyzer and qualified with associated metrics. The concept includes reuse of non-code life-cycle components such as requirements and designs. Possible types of reusable life-cycle components include templates, generics, and as-is items. Qualification of reusable elements requires analysis (separation of candidate components into primitives), qualification (evaluation of primitives for reusability according to reusability criteria) and loading (placing qualified elements into appropriate libraries). There can be different qualifications for different installations, methodologies, applications and components. Identifying reusable software and related components is labor-intensive and is best carried out as an integrated function of an SSE.

  12. The Living Cell as a Multi-agent Organisation: A Compositional Organisation Model of Intracellular Dynamics

    NASA Astrophysics Data System (ADS)

    Jonker, C. M.; Snoep, J. L.; Treur, J.; Westerhoff, H. V.; Wijngaards, W. C. A.

    Within the areas of Computational Organisation Theory and Artificial Intelligence, techniques have been developed to simulate and analyse dynamics within organisations in society. Usually these modelling techniques are applied to factories and to the internal organisation of their process flows, thus obtaining models of complex organisations at various levels of aggregation. The dynamics in living cells are often interpreted in terms of well-organised processes, a bacterium being considered a (micro)factory. This suggests that organisation modelling techniques may also benefit their analysis. Using the example of Escherichia coli it is shown how indeed agent-based organisational modelling techniques can be used to simulate and analyse E.coli's intracellular dynamics. Exploiting the abstraction levels entailed by this perspective, a concise model is obtained that is readily simulated and analysed at the various levels of aggregation, yet shows the cell's essential dynamic patterns.

  13. Evolution of the ATLAS Nightly Build System

    NASA Astrophysics Data System (ADS)

    Undrus, A.

    2012-12-01

    The ATLAS Nightly Build System is a major component in the ATLAS collaborative software organization, validation, and code approval scheme. For over 10 years of development it has evolved into a factory for automatic release production and grid distribution. The 50 multi-platform branches of ATLAS releases provide vast opportunities for testing new packages, verification of patches to existing software, and migration to new platforms and compilers for ATLAS code that currently contains 2200 packages with 4 million C++ and 1.4 million python scripting lines written by about 1000 developers. Recent development was focused on the integration of ATLAS Nightly Build and Installation systems. The nightly releases are distributed and validated and some are transformed into stable releases used for data processing worldwide. The ATLAS Nightly System is managed by the NICOS control tool on a computing farm with 50 powerful multiprocessor nodes. NICOS provides the fully automated framework for the release builds, testing, and creation of distribution kits. The ATN testing framework of the Nightly System runs unit and integration tests in parallel suites, fully utilizing the resources of multi-core machines, and provides the first results even before compilations complete. The NICOS error detection system is based on several techniques and classifies the compilation and test errors according to their severity. It is periodically tuned to place greater emphasis on certain software defects by highlighting the problems on NICOS web pages and sending automatic e-mail notifications to responsible developers. These and other recent developments will be presented and future plans will be described.

  14. An intelligent CNC machine control system architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, D.J.; Loucks, C.S.

    1996-10-01

    Intelligent, agile manufacturing relies on automated programming of digitally controlled processes. Currently, processes such as Computer Numerically Controlled (CNC) machining are difficult to automate because of highly restrictive controllers and poor software environments. It is also difficult to utilize sensors and process models for adaptive control, or to integrate machining processes with other tasks within a factory floor setting. As part of a Laboratory Directed Research and Development (LDRD) program, a CNC machine control system architecture based on object-oriented design and graphical programming has been developed to address some of these problems and to demonstrate automated agile machining applications using platform-independent software.

  15. Internet SCADA Utilizing API's as Data Source

    NASA Astrophysics Data System (ADS)

    Robles, Rosslin John; Kim, Haeng-Kon; Kim, Tai-Hoon

    An Application programming interface or API is an interface implemented by a software program that enables it to interact with other software. Many companies provide free API services which can be utilized in Control Systems. SCADA is an example of a control system and it is a system that collects data from various sensors at a factory, plant or in other remote locations and then sends this data to a central computer which then manages and controls the data. In this paper, we designed a scheme for Weather Condition in Internet SCADA Environment utilizing data from external API services. The scheme was designed to double check the weather information in SCADA.
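    The double-checking idea can be sketched as follows: a temperature read from the SCADA historian is compared against a value fetched from an external weather API. The endpoint URL, the JSON field names, and the SCADA read function are hypothetical placeholders, not a real service.

    ```python
    # Sketch of the double-check described above: compare a locally measured
    # temperature from the SCADA system with a value fetched from an external
    # weather API. URL, response schema, and the SCADA read are placeholders.
    import requests

    API_URL = "https://api.example.com/weather"   # placeholder external service
    MAX_DIVERGENCE_C = 3.0

    def read_scada_temperature() -> float:
        # Placeholder for a read from the plant's SCADA historian.
        return 21.7

    def read_api_temperature(city: str) -> float:
        resp = requests.get(API_URL, params={"q": city}, timeout=10)
        resp.raise_for_status()
        return float(resp.json()["temperature_c"])   # assumed response field

    local = read_scada_temperature()
    remote = read_api_temperature("Daegu")
    if abs(local - remote) > MAX_DIVERGENCE_C:
        print(f"ALERT: SCADA reading {local} C diverges from API value {remote} C")
    else:
        print("SCADA weather reading confirmed by external API")
    ```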

  16. Examining robustness of model selection with half-normal and LASSO plots for unreplicated factorial designs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jang, Dae -Heung; Anderson-Cook, Christine Michaela

    When there are constraints on resources, an unreplicated factorial or fractional factorial design can allow efficient exploration of numerous factor and interaction effects. A half-normal plot is a common graphical tool used to compare the relative magnitude of effects and to identify important effects from these experiments when no estimate of error from the experiment is available. An alternative is to use a least absolute shrinkage and selection operation plot to examine the pattern of model selection terms from an experiment. We examine how both the half-normal and least absolute shrinkage and selection operation plots are impacted by the absence of individual observations or an outlier, and the robustness of conclusions obtained from these 2 techniques for identifying important effects from factorial experiments. Finally, the methods are illustrated with 2 examples from the literature.

  17. Examining robustness of model selection with half-normal and LASSO plots for unreplicated factorial designs

    DOE PAGES

    Jang, Dae -Heung; Anderson-Cook, Christine Michaela

    2017-04-12

    When there are constraints on resources, an unreplicated factorial or fractional factorial design can allow efficient exploration of numerous factor and interaction effects. A half-normal plot is a common graphical tool used to compare the relative magnitude of effects and to identify important effects from these experiments when no estimate of error from the experiment is available. An alternative is to use a least absolute shrinkage and selection operation plot to examine the pattern of model selection terms from an experiment. We examine how both the half-normal and least absolute shrinkage and selection operation plots are impacted by the absence of individual observations or an outlier, and the robustness of conclusions obtained from these 2 techniques for identifying important effects from factorial experiments. Finally, the methods are illustrated with 2 examples from the literature.
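    For readers unfamiliar with half-normal plots, the sketch below constructs one for a synthetic unreplicated 2⁴ factorial (these are not the examples from the paper): effects are estimated from contrasts, and their absolute values are plotted against half-normal quantiles so that active effects stand out from the near-zero noise effects.

    ```python
    # Minimal sketch of a half-normal plot of effects from an unreplicated 2^4
    # factorial (synthetic data, not the examples from the paper).
    import itertools
    import numpy as np
    import matplotlib.pyplot as plt
    from scipy import stats

    rng = np.random.default_rng(1)
    levels = np.array(list(itertools.product([-1, 1], repeat=4)))   # 16-run design
    A, B, C, D = levels.T
    # Synthetic response: A, C and the AC interaction are active.
    y = 5 + 4 * A + 3 * C + 2 * A * C + rng.normal(0, 1, 16)

    labels, contrasts = [], []
    for k in range(1, 5):
        for combo in itertools.combinations(range(4), k):
            labels.append("".join("ABCD"[i] for i in combo))
            contrasts.append(np.prod(levels[:, combo], axis=1))
    effects = np.array([c @ y / 8 for c in contrasts])   # contrast / (n/2)

    order = np.argsort(np.abs(effects))
    m = len(effects)
    quantiles = stats.halfnorm.ppf((np.arange(1, m + 1) - 0.5) / m)

    plt.scatter(quantiles, np.abs(effects)[order])
    for q, e, lab in zip(quantiles, np.abs(effects)[order], np.array(labels)[order]):
        plt.annotate(lab, (q, e))
    plt.xlabel("half-normal quantile")
    plt.ylabel("|effect|")
    plt.show()
    ```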

  18. Proceedings of the Twenty-Fourth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    2000-01-01

    On December 1 and 2, the Software Engineering Laboratory (SEL), a consortium composed of NASA/Goddard, the University of Maryland, and CSC, held the 24th Software Engineering Workshop (SEW), the last of the millennium. Approximately 240 people attended the 2-day workshop. Day 1 was composed of four sessions: International Influence of the Software Engineering Laboratory; Object Oriented Testing and Reading; Software Process Improvement; and Space Software. For the first session, three internationally known software process experts discussed the influence of the SEL with respect to software engineering research. In the Space Software session, prominent representatives from three different NASA sites- GSFC's Marti Szczur, the Jet Propulsion Laboratory's Rick Doyle, and the Ames Research Center IV&V Facility's Lou Blazy- discussed the future of space software in their respective centers. At the end of the first day, the SEW sponsored a reception at the GSFC Visitors' Center. Day 2 also provided four sessions: Using the Experience Factory; A panel discussion entitled "Software Past, Present, and Future: Views from Government, Industry, and Academia"; Inspections; and COTS. The day started with an excellent talk by CSC's Frank McGarry on "Attaining Level 5 in CMM Process Maturity." Session 2, the panel discussion on software, featured NASA Chief Information Officer Lee Holcomb (Government), our own Jerry Page (Industry), and Mike Evangelist of the National Science Foundation (Academia). Each presented his perspective on the most important developments in software in the past 10 years, in the present, and in the future.

  19. An Examination of Sampling Characteristics of Some Analytic Factor Transformation Techniques.

    ERIC Educational Resources Information Center

    Skakun, Ernest N.; Hakstian, A. Ralph

    Two population raw data matrices were constructed by computer simulation techniques. Each consisted of 10,000 subjects and 12 variables, and each was constructed according to an underlying factorial model consisting of four major common factors, eight minor common factors, and 12 unique factors. The computer simulation techniques were employed to…

  20. Building an experience factory for maintenance

    NASA Technical Reports Server (NTRS)

    Valett, Jon D.; Condon, Steven E.; Briand, Lionel; Kim, Yong-Mi; Basili, Victor R.

    1994-01-01

    This paper reports the preliminary results of a study of the software maintenance process in the Flight Dynamics Division (FDD) of the National Aeronautics and Space Administration/Goddard Space Flight Center (NASA/GSFC). This study is being conducted by the Software Engineering Laboratory (SEL), a research organization sponsored by the Software Engineering Branch of the FDD, which investigates the effectiveness of software engineering technologies when applied to the development of applications software. This software maintenance study began in October 1993 and is being conducted using the Quality Improvement Paradigm (QIP), a process improvement strategy based on three iterative steps: understanding, assessing, and packaging. The preliminary results represent the outcome of the understanding phase, during which SEL researchers characterized the maintenance environment, product, and process. Findings indicate that a combination of quantitative and qualitative analysis is effective for studying the software maintenance process, that additional measures should be collected for maintenance (as opposed to new development), and that characteristics such as effort, error rate, and productivity are best considered on a 'release' basis rather than on a project basis. The research thus far has documented some basic differences between new development and software maintenance. It lays the foundation for further application of the QIP to investigate means of improving the maintenance process and product in the FDD.

  1. Software Quality Control at Belle II

    NASA Astrophysics Data System (ADS)

    Ritter, M.; Kuhr, T.; Hauth, T.; Gebard, T.; Kristof, M.; Pulvermacher, C.; Belle Software Group, II

    2017-10-01

    Over the last seven years the software stack of the next generation B factory experiment Belle II has grown to over one million lines of C++ and Python code, counting only the part included in offline software releases. There are several thousand commits to the central repository by about 100 individual developers per year. Keeping the software stack coherent and of high quality, so that it can be sustained and used efficiently for data acquisition, simulation, reconstruction, and analysis over the lifetime of the Belle II experiment, is a challenge. A set of tools is employed to monitor the quality of the software and provide fast feedback to the developers. They are integrated in a machinery that is controlled by a buildbot master and automates the quality checks. The tools include different compilers, cppcheck, the clang static analyzer, valgrind memcheck, doxygen, a geometry overlap checker, a check for missing or extra library links, unit tests, steering file level tests, a sophisticated high-level validation suite, and an issue tracker. The technological development infrastructure is complemented by organizational means to coordinate the development.

  2. Virus-host interactions: insights from the replication cycle of the large Paramecium bursaria chlorella virus.

    PubMed

    Milrot, Elad; Mutsafi, Yael; Fridmann-Sirkis, Yael; Shimoni, Eyal; Rechav, Katya; Gurnon, James R; Van Etten, James L; Minsky, Abraham

    2016-01-01

    The increasing interest in cytoplasmic factories generated by eukaryotic-infecting viruses stems from the realization that these highly ordered assemblies may contribute fundamental novel insights to the functional significance of order in cellular biology. Here, we report the formation process and structural features of the cytoplasmic factories of the large dsDNA virus Paramecium bursaria chlorella virus 1 (PBCV-1). By combining diverse imaging techniques, including scanning transmission electron microscopy tomography and focused ion beam technologies, we show that the architecture and mode of formation of PBCV-1 factories are significantly different from those generated by their evolutionary relatives Vaccinia and Mimivirus. Specifically, PBCV-1 factories consist of a network of single membrane bilayers acting as capsid templates in the central region, and viral genomes spread throughout the host cytoplasm but excluded from the membrane-containing sites. In sharp contrast, factories generated by Mimivirus have viral genomes in their core, with membrane biogenesis region located at their periphery. Yet, all viral factories appear to share structural features that are essential for their function. In addition, our studies support the notion that PBCV-1 infection, which was recently reported to result in significant pathological outcomes in humans and mice, proceeds through a bacteriophage-like infection pathway. © 2015 John Wiley & Sons Ltd.

  3. Permethrin-Treated Clothing as Protection against the Dengue Vector, Aedes aegypti: Extent and Duration of Protection

    PubMed Central

    DeRaedt Banks, Sarah; Orsborne, James; Gezan, Salvador A.; Kaur, Harparkash; Wilder-Smith, Annelies; Lindsey, Steve W.; Logan, James G.

    2015-01-01

    Introduction Dengue transmission by the mosquito vector, Aedes aegypti, occurs indoors and outdoors during the day. Personal protection of individuals, particularly when outside, is challenging. Here we assess the efficacy and durability of different types of insecticide-treated clothing on laboratory-reared Ae. aegypti. Methods Standardised World Health Organisation Pesticide Evaluation Scheme (WHOPES) cone tests and arm-in-cage assays were used to assess knockdown (KD) and mortality of Ae. aegypti tested against factory-treated fabric, home-dipped fabric and microencapsulated fabric. Based on the testing of these three different treatment types, the most protective was selected for further analysis using arm-in cage assays with the effect of washing, ultra-violet light, and ironing investigated using high pressure liquid chromatography. Results Efficacy varied between the microencapsulated and factory dipped fabrics in cone testing. Factory-dipped clothing showed the greatest effect on KD (3 min 38.1%; 1 hour 96.5%) and mortality (97.1%) with no significant difference between this and the factory dipped school uniforms. Factory-dipped clothing was therefore selected for further testing. Factory dipped clothing provided 59% (95% CI = 49.2%– 66.9%) reduction in landing and a 100% reduction in biting in arm-in-cage tests. Washing duration and technique had a significant effect, with insecticidal longevity shown to be greater with machine washing (LW50 = 33.4) compared to simulated hand washing (LW50 = 17.6). Ironing significantly reduced permethrin content after 1 week of simulated use, with a 96.7% decrease after 3 months although UV exposure did not reduce permethrin content within clothing significantly after 3 months simulated use. Conclusion Permethrin-treated clothing may be a promising intervention in reducing dengue transmission. However, our findings also suggest that clothing may provide only short-term protection due to the effect of washing and ironing, highlighting the need for improved fabric treatment techniques. PMID:26440967

  4. Permethrin-Treated Clothing as Protection against the Dengue Vector, Aedes aegypti: Extent and Duration of Protection.

    PubMed

    DeRaedt Banks, Sarah; Orsborne, James; Gezan, Salvador A; Kaur, Harparkash; Wilder-Smith, Annelies; Lindsey, Steve W; Logan, James G

    2015-01-01

    Dengue transmission by the mosquito vector, Aedes aegypti, occurs indoors and outdoors during the day. Personal protection of individuals, particularly when outside, is challenging. Here we assess the efficacy and durability of different types of insecticide-treated clothing on laboratory-reared Ae. aegypti. Standardised World Health Organisation Pesticide Evaluation Scheme (WHOPES) cone tests and arm-in-cage assays were used to assess knockdown (KD) and mortality of Ae. aegypti tested against factory-treated fabric, home-dipped fabric and microencapsulated fabric. Based on the testing of these three different treatment types, the most protective was selected for further analysis using arm-in cage assays with the effect of washing, ultra-violet light, and ironing investigated using high pressure liquid chromatography. Efficacy varied between the microencapsulated and factory dipped fabrics in cone testing. Factory-dipped clothing showed the greatest effect on KD (3 min 38.1%; 1 hour 96.5%) and mortality (97.1%) with no significant difference between this and the factory dipped school uniforms. Factory-dipped clothing was therefore selected for further testing. Factory dipped clothing provided 59% (95% CI = 49.2%- 66.9%) reduction in landing and a 100% reduction in biting in arm-in-cage tests. Washing duration and technique had a significant effect, with insecticidal longevity shown to be greater with machine washing (LW50 = 33.4) compared to simulated hand washing (LW50 = 17.6). Ironing significantly reduced permethrin content after 1 week of simulated use, with a 96.7% decrease after 3 months although UV exposure did not reduce permethrin content within clothing significantly after 3 months simulated use. Permethrin-treated clothing may be a promising intervention in reducing dengue transmission. However, our findings also suggest that clothing may provide only short-term protection due to the effect of washing and ironing, highlighting the need for improved fabric treatment techniques.

  5. Modeling software systems by domains

    NASA Technical Reports Server (NTRS)

    Dippolito, Richard; Lee, Kenneth

    1992-01-01

    The Software Architectures Engineering (SAE) Project at the Software Engineering Institute (SEI) has developed engineering modeling techniques that both reduce the complexity of software for domain-specific computer systems and result in systems that are easier to build and maintain. These techniques allow maximum freedom for system developers to apply their domain expertise to software. We have applied these techniques to several types of applications, including training simulators operating in real time, engineering simulators operating in non-real time, and real-time embedded computer systems. Our modeling techniques result in software that mirrors both the complexity of the application and the domain knowledge requirements. We submit that the proper measure of software complexity reflects neither the number of software component units nor the code count, but the locus of and amount of domain knowledge. As a result of using these techniques, domain knowledge is isolated by fields of engineering expertise and removed from the concern of the software engineer. In this paper, we will describe kinds of domain expertise, describe engineering by domains, and provide relevant examples of software developed for simulator applications using the techniques.

  6. Dynamic Generalizations of Systems Factorial Technology for Modeling Perception of Fused Information

    DTIC Science & Technology

    2017-01-11

    2004). Nonetheless, in all of these applications the stimuli were highly controlled and presented in isolation with little or no extraneous...50 maximum compensation). Twenty members of the Wright State University community were recruited to participate, sixteen of whom completed all five ex...aligned to minimize the need for registering the images from each sensor post-collection, although further registration was done with software developed

  7. PEOPLE’S LIBERATION ARMY AFTER NEXT

    DTIC Science & Technology

    2000-08-01

    machine tools and workforce training), and, above all, money (to support the modernization of Chinese factories and product lines)—but also “software...shipbuilding capacity. This overcapacity is exacerbated by an endemic lack of sufficient capital. The government often does not have enough money to put...arms industries reportedly are among the biggest money-losers. As a result, most SOEs are burdened with considerable debt, much of which is

  8. Evolution of solid rocket booster component testing

    NASA Technical Reports Server (NTRS)

    Lessey, Joseph A.

    1989-01-01

    The evolution of one of the new generation of test sets developed for the Solid Rocket Booster of the U.S. Space Transportation System is described. Requirements leading to factory checkout of the test set are explained, including the evolution from manual to semiautomated toward fully automated status. Individual improvements in the built-in test equipment, self-calibration, and software flexibility are addressed, and the insertion of fault detection to improve reliability is discussed.

  9. SAGA: A project to automate the management of software production systems

    NASA Technical Reports Server (NTRS)

    Campbell, Roy H.; Beckman-Davies, C. S.; Benzinger, L.; Beshers, G.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.

    1986-01-01

    Research into software development is required to reduce its production cost and to improve its quality. Modern software systems, such as the embedded software required for NASA's space station initiative, stretch current software engineering techniques. The requirement to build large, reliable, and maintainable software systems increases with time. Much theoretical and practical research is in progress to improve software engineering techniques. One such technique is to build a software system or environment which directly supports the software engineering process. This is the aim of the SAGA project, which comprises the research necessary to design and build a software development environment that automates the software engineering process. Progress under SAGA is described.

  10. Study of fault tolerant software technology for dynamic systems

    NASA Technical Reports Server (NTRS)

    Caglayan, A. K.; Zacharias, G. L.

    1985-01-01

    The major aim of this study is to investigate the feasibility of using systems-based failure detection isolation and compensation (FDIC) techniques in building fault-tolerant software and extending them, whenever possible, to the domain of software fault tolerance. First, it is shown that systems-based FDIC methods can be extended to develop software error detection techniques by using system models for software modules. In particular, it is demonstrated that systems-based FDIC techniques can yield consistency checks that are easier to implement than acceptance tests based on software specifications. Next, it is shown that systems-based failure compensation techniques can be generalized to the domain of software fault tolerance in developing software error recovery procedures. Finally, the feasibility of using fault-tolerant software in flight software is investigated. In particular, possible system and version instabilities, and functional performance degradation that may occur in N-Version programming applications to flight software are illustrated. Finally, a comparative analysis of N-Version and recovery block techniques in the context of generic blocks in flight software is presented.
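    The systems-based consistency check described above can be illustrated with a toy example: a simple kinematic model predicts what a software module's output should roughly be, and a large discrepancy triggers recovery. The module, model, and tolerance below are hypothetical.

    ```python
    # Hedged sketch of a systems-based consistency check: a simple kinematic
    # model predicts what the navigation module's output should roughly be, and
    # the prediction is used to detect software errors at run time.
    def navigation_module(position: float, velocity: float, dt: float) -> float:
        # Stand-in for a complex (possibly faulty) software module.
        return position + velocity * dt

    def consistency_check(prev_position: float, new_position: float,
                          velocity: float, dt: float, tol: float = 0.5) -> bool:
        # System-model prediction: position cannot change faster than velocity allows.
        predicted = prev_position + velocity * dt
        return abs(new_position - predicted) <= tol

    pos, vel, dt = 100.0, 2.0, 1.0
    new_pos = navigation_module(pos, vel, dt)
    if not consistency_check(pos, new_pos, vel, dt):
        print("consistency check failed: switch to backup routine")
    else:
        print("output accepted:", new_pos)
    ```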

  11. Top down, bottom up structured programming and program structuring

    NASA Technical Reports Server (NTRS)

    Hamilton, M.; Zeldin, S.

    1972-01-01

    New design and programming techniques for shuttle software. Based on previous Apollo experience, recommendations are made to apply top-down structured programming techniques to shuttle software. New software verification techniques for large software systems are recommended. HAL, the higher order language selected for the shuttle flight code, is discussed and found to be adequate for implementing these techniques. Recommendations are made to apply the workable combination of top-down, bottom-up methods in the management of shuttle software. Program structuring is discussed relevant to both programming and management techniques.

  12. Tamarind seed gum-hydrolyzed polymethacrylamide-g-gellan beads for extended release of diclofenac sodium using 3² full factorial design.

    PubMed

    Nandi, Gouranga; Nandi, Amit Kumar; Khan, Najim Sarif; Pal, Souvik; Dey, Sibasish

    2018-07-15

    Development of tamarind seed gum (TSG)-hydrolyzed polymethacrylamide-g-gellan (h-Pmaa-g-GG) composite beads for extended release of diclofenac sodium using a 3² full factorial design is the main purpose of this study. The ratio of h-Pmaa-g-GG to TSG and the concentration of the cross-linker CaCl₂ were taken as independent factors, each at three levels. Effects of polymer ratio and CaCl₂ on drug entrapment efficiency (DEE), drug release, bead size and swelling were investigated. Responses such as DEE and different drug release parameters were statistically analyzed within the 3² full factorial design using Design-Expert software, and finally the formulation factors were optimized to obtain the USP-reference release profile. The drug release rate was found to decrease with a decrease in the h-Pmaa-g-GG:TSG ratio and an increase in the concentration of Ca²⁺ ions in the cross-linking medium. The optimized formulation showed a DEE of 93.25% and an extended drug release profile over a period of 10 h with f₂ = 80.13. Kinetic modeling unveiled a case-I Fickian diffusion based drug release mechanism. Copyright © 2018 Elsevier B.V. All rights reserved.
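    Design-Expert is commercial software, but the analysis of a 3² full factorial can be sketched with an ordinary least-squares fit of a quadratic response-surface model, as below. The coded factor levels follow the usual -1/0/+1 convention; the response values are synthetic, not the paper's data.

    ```python
    # Sketch of fitting a quadratic model to a 3x3 (two-factor, three-level) full
    # factorial, as a stand-in for the Design-Expert analysis described above.
    # Coded factor levels are -1/0/+1; the DEE responses below are synthetic.
    import itertools
    import numpy as np

    runs = np.array(list(itertools.product([-1, 0, 1], repeat=2)), dtype=float)
    x1, x2 = runs.T                      # x1: polymer ratio, x2: CaCl2 level (coded)
    dee = np.array([78, 82, 85, 83, 88, 90, 86, 91, 93], dtype=float)  # synthetic %

    # Quadratic model: b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
    X = np.column_stack([np.ones(9), x1, x2, x1 * x2, x1**2, x2**2])
    coeffs, *_ = np.linalg.lstsq(X, dee, rcond=None)
    for name, b in zip(["b0", "b1", "b2", "b12", "b11", "b22"], coeffs):
        print(f"{name} = {b:+.2f}")
    ```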

  13. Mathematical model for dynamic cell formation in fast fashion apparel manufacturing stage

    NASA Astrophysics Data System (ADS)

    Perera, Gayathri; Ratnayake, Vijitha

    2018-05-01

    This paper presents a mathematical programming model for dynamic cell formation to minimize changeover-related costs (i.e., machine relocation costs and machine setup cost) and inter-cell material handling cost, in order to cope with the volatile production environments of the apparel manufacturing industry. The model is formulated from the findings of a comprehensive literature review and is validated with data collected from three different factories in the apparel industry that manufacture fast fashion products. Program code is developed using the Lingo 16.0 software package to generate optimal cells for the model and to determine the possible cost-saving percentage when the existing layouts used in the three factories are replaced by the generated optimal cells. The optimal cells generated by the developed mathematical model result in significant cost savings compared with the existing product layouts used in the production/assembly departments of the selected factories. The developed model can therefore be considered effective in minimizing the considered cost terms in the dynamic production environment of fast fashion apparel manufacturing. The findings of this paper can be used for further research on minimizing changeover-related costs in the fast fashion apparel production stage.

  14. An artificial reality environment for remote factory control and monitoring

    NASA Technical Reports Server (NTRS)

    Kosta, Charles Paul; Krolak, Patrick D.

    1993-01-01

    Work has begun on the merger of two well known systems, VEOS (HITLab) and CLIPS (NASA). In the recent past, the University of Massachusetts Lowell developed a parallel version of NASA CLIPS, called P-CLIPS. This modification allows users to create smaller expert systems which are able to communicate with each other to jointly solve problems. With the merger of a VEOS message system, PCLIPS-V can now act as a group of entities working within VEOS. To display the 3D virtual world we have been using a graphics package called HOOPS, from Ithaca Software. The artificial reality environment we have set up contains actors and objects as found in our Lincoln Logs Factory of the Future project. The environment allows us to view and control the objects within the virtual world. All communication between the separate CLIPS expert systems is done through VEOS. A graphical renderer generates camera views on X-Windows devices; Head Mounted Devices are not required. This allows more people to make use of this technology. We are experimenting with different types of virtual vehicles to give the user a sense that he or she is actually moving around inside the factory looking ahead through windows and virtual monitors.

  15. Determination of the elemental composition of aerosol samples in the working environment of a secondary lead smelting company in Nigeria using EDXRF technique

    NASA Astrophysics Data System (ADS)

    Obiajunwa, E. I.; Johnson-Fatokun, F. O.; Olaniyi, H. B.; Olowole, A. F.

    2002-07-01

    Energy dispersive X-ray fluorescence technique was employed to determine the concentrations of elements in aerosol samples collected in the working environment of a secondary lead smelting company in Nigeria. Sampling was done using Whatman-41 cellulose filters mounted in Negretti air samplers at 10 locations within the factory. The concentrations of eight elements (K, Ca, Ti, Mn, Fe, Cu, Zn and Pb) were determined. The TSP values ranged from 70 to 7963 μg/m³ and the concentration of Pb was found to be between 2.98 and 538.47 μg/m³. The high Pb concentration is a danger signal to the health of the factory workers.

  16. Data Processing Factory for the Sloan Digital Sky Survey

    NASA Astrophysics Data System (ADS)

    Stoughton, Christopher; Adelman, Jennifer; Annis, James T.; Hendry, John; Inkmann, John; Jester, Sebastian; Kent, Steven M.; Kuropatkin, Nickolai; Lee, Brian; Lin, Huan; Peoples, John, Jr.; Sparks, Robert; Tucker, Douglas; Vanden Berk, Dan; Yanny, Brian; Yocum, Dan

    2002-12-01

    The Sloan Digital Sky Survey (SDSS) data handling presents two challenges: large data volume and timely production of spectroscopic plates from imaging data. A data processing factory, using technologies both old and new, handles this flow. Distribution to end users is via disk farms, to serve corrected images and calibrated spectra, and a database, to efficiently process catalog queries. For distribution of modest amounts of data from Apache Point Observatory to Fermilab, scripts use rsync to update files, while larger data transfers are accomplished by shipping magnetic tapes commercially. All data processing pipelines are wrapped in scripts to address consecutive phases: preparation, submission, checking, and quality control. We constructed the factory by chaining these pipelines together while using an operational database to hold processed imaging catalogs. The science database catalogs all imaging and spectroscopic objects, with pointers to the various external files associated with them. Diverse computing systems address particular processing phases. UNIX computers handle tape reading and writing, as well as calibration steps that require access to a large amount of data with relatively modest computational demands. Commodity CPUs process steps that require access to a limited amount of data with more demanding computational requirements. Disk servers optimized for cost per Gbyte serve terabytes of processed data, while servers optimized for disk read speed run SQLServer software to process queries on the catalogs. This factory produced data for the SDSS Early Data Release in June 2001, and it is currently producing Data Release One, scheduled for January 2003.

  17. A methodology for producing reliable software, volume 1

    NASA Technical Reports Server (NTRS)

    Stucki, L. G.; Moranda, P. B.; Foshee, G.; Kirchoff, M.; Omre, R.

    1976-01-01

    An investigation into the areas having an impact on producing reliable software including automated verification tools, software modeling, testing techniques, structured programming, and management techniques is presented. This final report contains the results of this investigation, analysis of each technique, and the definition of a methodology for producing reliable software.

  18. Simulation Assessment Validation Environment (SAVE). Software User’s Manual

    DTIC Science & Technology

    2000-09-01

    requirements and decisions are made. The integration is leveraging work from other DoD organizations so that high-end results are attainable much faster than...planning through the modeling and simulation data capture and visualization process. The planners can complete the manufacturing process plan with a high ...technologies. This tool is also used to perform "high-level" factory process simulation prior to full CAD model development and help define feasible

  19. Rotary Kiln Gasification of Solid Waste for Base Camps

    DTIC Science & Technology

    2017-10-02

    Garbage bags containing waste feedstock are placed into feed bin FB-101. Ram feeder RF-102...Environmental Science and Technology using the FactoryTalk SCADA software running on a laptop computer. A wireless Ethernet router that is located within the...pyrolysis oil produced required consistent draining from the system during operation and became a liquid waste disposal problem. A 5-hour test run could

  20. Section Work--Sleeves; Apparel Manufacturing: 9377.08.

    ERIC Educational Resources Information Center

    Dade County Public Schools, Miami, FL.

    This course involves practice in sleeve-making techniques. Prior to entry into this course the vocational student will have completed "Section Work--Pocket Setting." Upon completion of the course the student will be able to understand the underlying principles of sewing individual sections of garments using factory techniques comparable…

  1. A Study of Item Bias for Attitudinal Measurement Using Maximum Likelihood Factor Analysis.

    ERIC Educational Resources Information Center

    Mayberry, Paul W.

    A technique for detecting item bias that is responsive to attitudinal measurement considerations is a maximum likelihood factor analysis procedure comparing multivariate factor structures across various subpopulations, often referred to as SIFASP. The SIFASP technique allows for factorial model comparisons in the testing of various hypotheses…

  2. In-situ data collection at the photon factory macromolecular crystallography beamlines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamada, Yusuke, E-mail: yusuke.yamada@kek.jp; Matsugaki, Naohiro; Kato, Ryuichi

    Crystallization trials are one of the most important but time-consuming steps in macromolecular crystallography, and in-situ diffraction experiments enable researchers to carry out this step more efficiently. At the Photon Factory, a new tabletop diffractometer for in-situ diffraction experiments has been developed. It consists of XYZ translation stages with a plate handler, an on-axis viewing system and a plate rack with a capacity for ten crystallization plates. These components sit on a common plate and can be placed on the existing diffractometer table. A CCD detector with a large active area and a pixel array detector with a small active area are used for acquiring diffraction images from crystals. Dedicated control software and a user interface have also been developed. The new diffractometer has been operational for users and used for evaluation of crystallization screening since 2014.

  3. Increasing the reliability of ecological models using modern software engineering techniques

    Treesearch

    Robert M. Scheller; Brian R. Sturtevant; Eric J. Gustafson; Brendan C. Ward; David J. Mladenoff

    2009-01-01

    Modern software development techniques are largely unknown to ecologists. Typically, ecological models and other software tools are developed for limited research purposes, and additional capabilities are added later, usually in an ad hoc manner. Modern software engineering techniques can substantially increase scientific rigor and confidence in ecological models and...

  4. Nonmedical influences on medical decision making: an experimental technique using videotapes, factorial design, and survey sampling.

    PubMed Central

    Feldman, H A; McKinlay, J B; Potter, D A; Freund, K M; Burns, R B; Moskowitz, M A; Kasten, L E

    1997-01-01

    OBJECTIVE: To study nonmedical influences on the doctor-patient interaction. A technique using simulated patients and "real" doctors is described. DATA SOURCES: A random sample of physicians, stratified on such characteristics as demographics, specialty, or experience, and selected from commercial and professional listings. STUDY DESIGN: A medical appointment is depicted on videotape by professional actors. The patient's presenting complaint (e.g., chest pain) allows a range of valid interpretation. Several alternative versions are taped, featuring the same script with patient-actors of different age, sex, race, or other characteristics. Fractional factorial design is used to select a balanced subset of patient characteristics, reducing costs without biasing the outcome. DATA COLLECTION: Each physician is shown one version of the videotape appointment and is asked to describe how he or she would diagnose or treat such a patient. PRINCIPAL FINDINGS: Two studies using this technique have been completed to date, one involving chest pain and dyspnea and the other involving breast cancer. The factorial design provided sufficient power, despite limited sample size, to demonstrate with statistical significance various influences of the experimental and stratification variables, including the patient's gender and age and the physician's experience. Persistent recruitment produced a high response rate, minimizing selection bias and enhancing validity. CONCLUSION: These techniques permit us to determine, with a degree of control unattainable in observational studies, whether medical decisions as described by actual physicians and drawn from a demographic or professional group of interest, are influenced by a prescribed set of nonmedical factors. PMID:9240285

  5. Executable assertions and flight software

    NASA Technical Reports Server (NTRS)

    Mahmood, A.; Andrews, D. M.; Mccluskey, E. J.

    1984-01-01

    Executable assertions are used to test flight control software. The techniques used for testing flight software, however, are different from the techniques used to test other kinds of software because of the redundant nature of flight software. An experimental setup for testing flight software using executable assertions is described. Techniques for writing and using executable assertions to test flight software are presented. The error detection capability of assertions is studied and many examples of assertions are given. The issues of placement and complexity of assertions and the language features to support efficient use of assertions are discussed.
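    A toy example of an executable assertion is shown below: the assertion encodes a physical expectation about a control-law output and fires during testing if it is violated. The control law, limits, and test inputs are hypothetical.

    ```python
    # Toy illustration of an executable assertion embedded in control software:
    # the assertion encodes a physical expectation about the output and fires
    # during testing if it is violated. Ranges and the control law are hypothetical.
    def elevator_command(pitch_error_deg: float, gain: float = 0.8) -> float:
        cmd = gain * pitch_error_deg
        # Executable assertions: the commanded deflection must stay within actuator
        # limits, and for this proportional law it must share the sign of the error.
        assert -25.0 <= cmd <= 25.0, f"deflection {cmd} outside actuator range"
        assert cmd == 0 or (cmd > 0) == (pitch_error_deg > 0), "command has wrong sign"
        return cmd

    for err in (-10.0, 0.0, 12.5):
        print(err, "->", elevator_command(err))
    ```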

  6. Laser electro-optic system for rapid three-dimensional /3-D/ topographic mapping of surfaces

    NASA Technical Reports Server (NTRS)

    Altschuler, M. D.; Altschuler, B. R.; Taboada, J.

    1981-01-01

    It is pointed out that the generic utility of a robot in a factory/assembly environment could be substantially enhanced by providing a vision capability to the robot. A standard videocamera for robot vision provides a two-dimensional image which contains insufficient information for a detailed three-dimensional reconstruction of an object. Approaches which supply the additional information needed for the three-dimensional mapping of objects with complex surface shapes are briefly considered and a description is presented of a laser-based system which can provide three-dimensional vision to a robot. The system consists of a laser beam array generator, an optical image recorder, and software for controlling the required operations. The projection of a laser beam array onto a surface produces a dot pattern image which is viewed from one or more suitable perspectives. Attention is given to the mathematical method employed, the space coding technique, the approaches used for obtaining the transformation parameters, the optics for laser beam array generation, the hardware for beam array coding, and aspects of image acquisition.

  7. Assessing Orchestrated Simulation Through Modeling to Quantify the Benefits of Unmanned-Teaming in a Tactical ASW Scenario

    DTIC Science & Technology

    2018-03-01

    Results are compared to a previous study using a similar design of experiments but different simulation software. The baseline scenario for exploring the...behaviors are mimicked in this research, enabling Solem’s MANA results to be compared to our LITMUS results. By design, the principal difference...missions when using the second-order NOLH, and compares favorably with the over six million in the full factorial design.

  8. Integrating Testing into Software Engineering Courses Supported by a Collaborative Learning Environment

    ERIC Educational Resources Information Center

    Clarke, Peter J.; Davis, Debra; King, Tariq M.; Pava, Jairo; Jones, Edward L.

    2014-01-01

    As software becomes more ubiquitous and complex, the cost of software bugs continues to grow at a staggering rate. To remedy this situation, there needs to be major improvement in the knowledge and application of software validation techniques. Although there are several software validation techniques, software testing continues to be one of the…

  9. Software safety

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy

    1987-01-01

    Software safety and its relationship to other qualities are discussed. It is shown that standard reliability and fault tolerance techniques will not solve the safety problem for the present. A new attitude requires: looking at what you do NOT want software to do along with what you want it to do; and assuming things will go wrong. New procedures and changes to entire software development process are necessary: special software safety analysis techniques are needed; and design techniques, especially eliminating complexity, can be very helpful.

  10. Survey of Software Assurance Techniques for Highly Reliable Systems

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy

    2004-01-01

    This document provides a survey of software assurance techniques for highly reliable systems including a discussion of relevant safety standards for various industries in the United States and Europe, as well as examples of methods used during software development projects. It contains one section for each industry surveyed: Aerospace, Defense, Nuclear Power, Medical Devices and Transportation. Each section provides an overview of applicable standards and examples of a mission or software development project, software assurance techniques used and reliability achieved.

  11. On The Modeling of Educational Systems: II

    ERIC Educational Resources Information Center

    Grauer, Robert T.

    1975-01-01

    A unified approach to model building is developed from the separate techniques of regression, simulation, and factorial design. The methodology is applied in the context of a suburban school district. (Author/LS)

  12. Visible light assisted photoelectrocatalytic degradation of sugarcane factory wastewater by sprayed CZTS thin films

    NASA Astrophysics Data System (ADS)

    Hunge, Y. M.; Mahadik, M. A.; Patil, V. L.; Pawar, A. R.; Gadakh, S. R.; Moholkar, A. V.; Patil, P. S.; Bhosale, C. H.

    2017-12-01

    Highly crystalline Cu2ZnSnS4 (CZTS) thin films have been deposited onto glass and FTO-coated glass substrates by a simple chemical spray-pyrolysis technique. CZTS is an important material for solar energy conversion through both photovoltaics and photocatalysis. The effect of substrate temperature on the physico-chemical properties of the CZTS films is studied. The XRD study shows the formation of single-phase CZTS with the kesterite structure. FE-SEM analysis reveals a nanoflake architecture with a pinhole- and crack-free, well-adherent surface. The film deposited at the optimized substrate temperature exhibits an optical band gap energy of 1.90 eV, which lies in the visible region of the solar spectrum and is useful for photocatalysis applications. The photoelectrocatalytic activity of the large-area (10 × 10 cm²) CZTS thin film photocatalysts was evaluated for the degradation of sugarcane factory wastewater under visible light irradiation. The results show that the CZTS thin film photocatalyst achieved about 90% degradation of sugarcane factory wastewater. The mineralization of the sugarcane factory wastewater is studied by measuring chemical oxygen demand (COD) values.

  13. Maintaining the Health of Software Monitors

    NASA Technical Reports Server (NTRS)

    Person, Suzette; Rungta, Neha

    2013-01-01

    Software health management (SWHM) techniques complement the rigorous verification and validation processes that are applied to safety-critical systems prior to their deployment. These techniques are used to monitor deployed software in its execution environment, serving as the last line of defense against the effects of a critical fault. SWHM monitors use information from the specification and implementation of the monitored software to detect violations, predict possible failures, and help the system recover from faults. Changes to the monitored software, such as adding new functionality or fixing defects, therefore, have the potential to impact the correctness of both the monitored software and the SWHM monitor. In this work, we describe how the results of a software change impact analysis technique, Directed Incremental Symbolic Execution (DiSE), can be applied to monitored software to identify the potential impact of the changes on the SWHM monitor software. The results of DiSE can then be used by other analysis techniques, e.g., testing, debugging, to help preserve and improve the integrity of the SWHM monitor as the monitored software evolves.

  14. Fault-tolerant software - Experiment with the sift operating system. [Software Implemented Fault Tolerance computer

    NASA Technical Reports Server (NTRS)

    Brunelle, J. E.; Eckhardt, D. E., Jr.

    1985-01-01

    Results are presented of an experiment conducted in the NASA Avionics Integrated Research Laboratory (AIRLAB) to investigate the implementation of fault-tolerant software techniques on fault-tolerant computer architectures, in particular the Software Implemented Fault Tolerance (SIFT) computer. The N-version programming and recovery block techniques were implemented on a portion of the SIFT operating system. The results indicate that, to effectively implement fault-tolerant software design techniques, system requirements will be impacted and suggest that retrofitting fault-tolerant software on existing designs will be inefficient and may require system modification.

  15. Software Fault Tolerance: A Tutorial

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2000-01-01

    Because of our present inability to produce error-free software, software fault tolerance is and will continue to be an important consideration in software systems. The root cause of software design errors is the complexity of the systems. Compounding the problems in building correct software is the difficulty in assessing the correctness of software for highly complex systems. After a brief overview of the software development processes, we note how hard-to-detect design faults are likely to be introduced during development and how software faults tend to be state-dependent and activated by particular input sequences. Although component reliability is an important quality measure for system level analysis, software reliability is hard to characterize and the use of post-verification reliability estimates remains a controversial issue. For some applications software safety is more important than reliability, and fault tolerance techniques used in those applications are aimed at preventing catastrophes. Single version software fault tolerance techniques discussed include system structuring and closure, atomic actions, inline fault detection, exception handling, and others. Multiversion techniques are based on the assumption that software built differently should fail differently and thus, if one of the redundant versions fails, it is expected that at least one of the other versions will provide an acceptable output. Recovery blocks, N-version programming, and other multiversion techniques are reviewed.
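    The N-version idea reviewed above can be sketched in a few lines: several independently written versions of the same function run on the same input and a majority voter selects the output. The function, tolerance, and voting rule below are illustrative only.

    ```python
    # Minimal sketch of N-version programming: three independently written
    # versions of the same function run on the same input, and a majority voter
    # selects the output.
    import math

    def version_a(x: float) -> float:
        return math.sqrt(x)

    def version_b(x: float) -> float:
        return x ** 0.5

    def version_c(x: float) -> float:
        # Newton's method, a deliberately different algorithm.
        g = x / 2 or 1.0
        for _ in range(40):
            g = 0.5 * (g + x / g)
        return g

    def majority_vote(results, tol=1e-9):
        # Group results that agree within tol and return the largest group's value.
        groups = []
        for r in results:
            for g in groups:
                if abs(g[0] - r) <= tol:
                    g.append(r)
                    break
            else:
                groups.append([r])
        best = max(groups, key=len)
        if len(best) <= len(results) // 2:
            raise RuntimeError("no majority among versions")
        return best[0]

    x = 2.0
    print(majority_vote([version_a(x), version_b(x), version_c(x)]))
    ```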

  16. Determining production level under uncertainty using fuzzy simulation and bootstrap technique, a case study

    NASA Astrophysics Data System (ADS)

    Hamidi, Mohammadreza; Shahanaghi, Kamran; Jabbarzadeh, Armin; Jahani, Ehsan; Pousti, Zahra

    2017-12-01

    In every production plant, it is necessary to have an estimate of the production level. Often there are many parameters affecting this estimate. In this paper, we try to find an appropriate estimate of the production level for an industrial factory called Barez in an uncertain environment. We consider a part of the production line that has different production times for different kinds of products, implying both environmental and system uncertainty. To solve the problem we simulate the line and, because of the uncertainty in the times, fuzzy simulation is used. The required fuzzy numbers are estimated by means of the bootstrap technique. The results have been used in the production planning process by factory experts with satisfying consequences, and the experts' opinions about the efficiency of this methodology are also reported.
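    The bootstrap step can be sketched as follows: a small sample of observed processing times is resampled with replacement to approximate the uncertainty in units produced per shift. The observed times are synthetic, and the fuzzy-number construction used in the paper is not reproduced here.

    ```python
    # Sketch of the bootstrap step described above: resample a small sample of
    # observed processing times to approximate the uncertainty in daily output.
    import numpy as np

    rng = np.random.default_rng(42)
    observed_minutes = np.array([12.1, 11.4, 13.0, 12.7, 11.9, 14.2, 12.3, 13.5])
    shift_minutes = 8 * 60

    def daily_output(times, n_boot=5000):
        outputs = []
        for _ in range(n_boot):
            resample = rng.choice(times, size=times.size, replace=True)
            outputs.append(shift_minutes / resample.mean())   # units per shift
        return np.array(outputs)

    boot = daily_output(observed_minutes)
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"estimated output: {boot.mean():.1f} units/shift (95% interval {lo:.1f}-{hi:.1f})")
    ```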

  17. Self Management Techniques and Disclosure of Sero Status

    ERIC Educational Resources Information Center

    Falaye, Ajibola; Afolayan, Joel Adeleke

    2015-01-01

    This study looked at using Self Management Technique (SMT) to promote self-disclosure of Sero status in Kwara State, Nigeria. A pre-test, post-test and control group quasi experimental design using a 2x2x2 factorial matrix was adopted. Sixty participants were sampled by balloting from two HIV/AIDS screening centres. Four instruments were used such…

  18. All-Possible-Subsets for MANOVA and Factorial MANOVAs: Less than a Weekend Project

    ERIC Educational Resources Information Center

    Nimon, Kim; Zientek, Linda Reichwein; Kraha, Amanda

    2016-01-01

    Multivariate techniques are increasingly popular as researchers attempt to accurately model a complex world. MANOVA is a multivariate technique used to investigate the dimensions along which groups differ, and how these dimensions may be used to predict group membership. A concern in a MANOVA analysis is to determine if a smaller subset of…
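    An all-possible-subsets MANOVA can be sketched by looping over the subsets of dependent variables and fitting a MANOVA to each, as below. The data are synthetic and statsmodels is used for the fits; subsets of size one are skipped because MANOVA requires more than one response.

    ```python
    # Hedged sketch of an all-possible-subsets MANOVA: every subset of at least
    # two dependent variables is tested against the grouping factor on synthetic data.
    import itertools
    import numpy as np
    import pandas as pd
    from statsmodels.multivariate.manova import MANOVA

    rng = np.random.default_rng(7)
    n = 60
    group = np.repeat(["a", "b", "c"], n // 3)
    shift = {"a": 0.0, "b": 0.5, "c": 1.0}
    df = pd.DataFrame({
        "group": group,
        "y1": rng.normal([shift[g] for g in group], 1.0),
        "y2": rng.normal(0, 1, n),
        "y3": rng.normal([2 * shift[g] for g in group], 1.0),
    })

    dvs = ["y1", "y2", "y3"]
    for k in range(2, len(dvs) + 1):
        for subset in itertools.combinations(dvs, k):
            formula = " + ".join(subset) + " ~ group"
            res = MANOVA.from_formula(formula, data=df).mv_test()
            wilks_p = res.results["group"]["stat"].loc["Wilks' lambda", "Pr > F"]
            print(f"{subset}: Wilks p = {wilks_p:.4f}")
    ```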

  19. Clinical, information and business process modeling to promote development of safe and flexible software.

    PubMed

    Liaw, Siaw-Teng; Deveny, Elizabeth; Morrison, Iain; Lewis, Bryn

    2006-09-01

    Using a factorial vignette survey and modeling methodology, we developed clinical and information models - incorporating evidence base, key concepts, relevant terms, decision-making and workflow needed to practice safely and effectively - to guide the development of an integrated rule-based knowledge module to support prescribing decisions in asthma. We identified workflows, decision-making factors, factor use, and clinician information requirements. The Unified Modeling Language (UML) and public domain software and knowledge engineering tools (e.g. Protégé) were used, with the Australian GP Data Model as the starting point for expressing information needs. A Web Services service-oriented architecture approach was adopted within which to express functional needs, and clinical processes and workflows were expressed in the Business Process Execution Language (BPEL). This formal analysis and modeling methodology to define and capture the process and logic of prescribing best practice in a reference implementation is fundamental to tackling deficiencies in prescribing decision support software.

  20. Spatial epidemiological techniques in cholera mapping and analysis towards a local scale predictive modelling

    NASA Astrophysics Data System (ADS)

    Rasam, A. R. A.; Ghazali, R.; Noor, A. M. M.; Mohd, W. M. N. W.; Hamid, J. R. A.; Bazlan, M. J.; Ahmad, N.

    2014-02-01

    Cholera spatial epidemiology is the study of the spatial pattern, spread and control of the disease and its epidemics. Previous studies have shown that multi-factorial causes such as human behaviour, ecology and other infectious risk factors influence outbreaks. Understanding the spatial pattern of outbreaks and the possible interrelationships among these factors therefore warrants in-depth study. This study focuses on integrating geographical information system (GIS) and epidemiological techniques to explore the spatial pattern and distribution of cholera in a selected district of Sabah. Spatial Statistics and Pattern tools in ArcGIS and Microsoft Excel were used to map and analyze the reported cholera cases and other data, while a cohort study, an epidemiological technique, was applied to investigate multiple outcomes of disease exposure. The overall spatial pattern of cholera was highly clustered, indicating that the disease spreads easily from place to place and person to person, especially within 1500 meters of an infected person or location. Although the cholera outbreaks in the districts are not critical, the disease could become endemic in crowded areas, unhygienic environments, and places close to contaminated water. It is also strongly suspected that the coastal waters of the study areas are related to cholera transmission and phytoplankton blooms, since these areas recorded more cases. GIS proves to be a vital spatial epidemiological technique for determining the distribution pattern of the disease and for generating hypotheses about it. Future research will apply more advanced geo-analysis methods and additional disease risk factors to produce a meaningful local-scale predictive risk model of the disease in Malaysia.

  1. Linking structural biology with genome research: Beamlines for the Berlin ``Protein Structure Factory'' initiative

    NASA Astrophysics Data System (ADS)

    Illing, Gerd; Saenger, Wolfram; Heinemann, Udo

    2000-06-01

    The Protein Structure Factory will be established to characterize proteins encoded by human genes or cDNAs, which will be selected by criteria of potential structural novelty or medical or biotechnological usefulness. It represents an integrative approach to structure analysis combining bioinformatics techniques, automated gene expression and purification of gene products, generation of a biophysical fingerprint of the proteins and the determination of their three-dimensional structures either by NMR spectroscopy or by X-ray diffraction. The use of synchrotron radiation will be crucial to the Protein Structure Factory: high brilliance and tunable wavelengths are prerequisites for fast data collection, the use of small crystals and multiwavelength anomalous diffraction (MAD) phasing. With the opening of BESSY II, direct access to a third-generation XUV storage ring source with excellent conditions is available nearby. An insertion device with two MAD beamlines and one constant-energy station will be set up by 2001.

  2. Using Blocked Fractional Factorial Designs to Construct Discrete Choice Experiments for Health Care Studies

    PubMed Central

    Jaynes, Jessica; Wong, Weng Kee; Xu, Hongquan

    2016-01-01

    Discrete choice experiments (DCEs) are increasingly used for studying and quantifying subjects' preferences in a wide variety of health care applications. They provide a rich source of data to assess real-life decision-making processes, which involve trade-offs between desirable characteristics pertaining to health and health care, and to identify key attributes affecting health care. The choice of the design for a DCE is critical because it determines which attributes' effects and which of their interactions are identifiable. We apply blocked fractional factorial designs to construct DCEs and address some identification issues by utilizing the known structure of blocked fractional factorial designs. Our design techniques can be applied to several situations, including DCEs where attributes have different numbers of levels. We demonstrate our design methodology using two health care studies that evaluate (1) asthma patients' preferences for symptom-based outcome measures, and (2) patient preferences for breast screening services. PMID:26823156
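    To make the construction concrete, the sketch below generates a small blocked two-level fractional factorial design. The generator (D = ABC) and the blocking factor (the AB interaction) are illustrative assumptions; the paper's actual DCE designs may use different defining relations and block sizes.

        # Sketch: a 2^(4-1) fractional factorial (defining relation D = ABC),
        # split into two blocks on the AB interaction. Illustrative only.
        from itertools import product

        runs = []
        for a, b, c in product([-1, 1], repeat=3):
            d = a * b * c          # generator: D = ABC
            block = a * b          # blocks defined by the AB interaction
            runs.append({"A": a, "B": b, "C": c, "D": d, "block": 1 if block == 1 else 2})

        for run in sorted(runs, key=lambda r: r["block"]):
            print(run)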

  3. Thermal photons in heavy ion collisions at 158 A GeV

    NASA Astrophysics Data System (ADS)

    Dutt, Sunil

    2018-05-01

    The essence of experimental ultra-relativistic heavy ion collision physics is the production and study of strongly interacting matter at extreme energy densities and temperatures, and the consequent search for the equation of state of nuclear matter. The focus of the analysis has been to examine pseudo-rapidity distributions obtained for the γ-like particles in the pre-shower photon multiplicity detector. This allows the extension of scaled factorial moment analysis to bin sizes smaller than those accessible to other experimental techniques. Scaled factorial moments are calculated using corrected horizontal and vertical analyses. The results are compared with simulation analysis using the VENUS event generator.
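    For reference, the sketch below computes scaled factorial moments in one common, horizontally averaged form, F_q = ⟨(1/M) Σ_m n_m(n_m−1)…(n_m−q+1)⟩ / ⟨(1/M) Σ_m n_m⟩^q, on a synthetic event sample. The Poisson-generated multiplicities are an assumption for illustration, not PMD data, and the paper's corrected horizontal/vertical procedures involve additional steps.

        # Sketch: horizontally averaged scaled factorial moments on synthetic data.
        import numpy as np

        rng = np.random.default_rng(0)
        events = rng.poisson(lam=3.0, size=(500, 16))   # 500 events, 16 pseudorapidity bins

        def falling_factorial(n, q):
            out = np.ones_like(n, dtype=float)
            for k in range(q):
                out *= (n - k)                  # n(n-1)...(n-q+1), zero for n < q
            return out

        def scaled_factorial_moment(events, q):
            num = falling_factorial(events, q).mean(axis=1).mean()
            den = events.mean(axis=1).mean() ** q
            return num / den

        for q in (2, 3, 4):
            print(f"F_{q} =", round(scaled_factorial_moment(events, q), 4))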

  4. Software-Enabled Project Management Techniques and Their Relationship to the Triple Constraints

    ERIC Educational Resources Information Center

    Elleh, Festus U.

    2013-01-01

    This study investigated the relationship between software-enabled project management techniques and the triple constraints (time, cost, and scope). There is a dearth of academic literature that focuses on the relationship between software-enabled project management techniques and the triple constraints (time, cost, and scope). Based on the gap…

  5. Adaptive Educational Software by Applying Reinforcement Learning

    ERIC Educational Resources Information Center

    Bennane, Abdellah

    2013-01-01

    The introduction of intelligence into teaching software is the object of this paper. In the software development process, learning techniques are used to adapt the teaching software to the characteristics of the student. Generally, artificial intelligence techniques such as reinforcement learning and Bayesian networks are used in order to adapt…

  6. Software component quality evaluation

    NASA Technical Reports Server (NTRS)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  7. Increasing walking among older people: A test of behaviour change techniques using factorial randomised N-of-1 trials

    PubMed Central

    Nyman, Samuel R.; Goodwin, Kelly; Kwasnicka, Dominika; Callaway, Andrew

    2016-01-01

    Objective: Evaluations of techniques to promote physical activity usually adopt a randomised controlled trial (RCT). Such designs inform how a technique performs on average but cannot be used for treatment of individuals. Our objective was to conduct the first N-of-1 RCTs of behaviour change techniques with older people and test the effectiveness of the techniques for increasing walking within individuals. Design: Eight adults aged 60–87 were randomised to a 2 (goal-setting vs. active control) × 2 (self-monitoring vs. active control) factorial RCT over 62 days. The time series data were analysed for each single case using linear regressions. Main outcome measures: Walking was objectively measured using pedometers. Results: Compared to control days, goal-setting increased walking in 4 out of 8 individuals and self-monitoring increased walking in 7 out of 8 individuals. While the probability for self-monitoring to be effective in 7 out of 8 participants was beyond chance (p = .03), no intervention effect was significant for individual participants. Two participants had a significant but small linear decrease in walking over time. Conclusion: We demonstrate the utility of N-of-1 trials for advancing scientific enquiry of behaviour change and in practice for increasing older people’s physical activity. PMID:26387689
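    To illustrate the per-participant analysis described above, the sketch below fits an ordinary least-squares regression of daily step counts on the two randomised conditions for a single (synthetic) participant. The data are fabricated, and the trial's actual single-case models may also have addressed time trends and autocorrelation.

        # Sketch: one participant's N-of-1 factorial data analysed with OLS.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(42)
        days = 62
        goal_setting = rng.integers(0, 2, size=days)      # 1 = goal-setting day
        self_monitoring = rng.integers(0, 2, size=days)   # 1 = self-monitoring day
        steps = 4000 + 300 * goal_setting + 600 * self_monitoring + rng.normal(0, 800, days)

        X = sm.add_constant(np.column_stack([goal_setting, self_monitoring]))
        model = sm.OLS(steps, X).fit()
        print(model.params)    # intercept, goal-setting effect, self-monitoring effect
        print(model.pvalues)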

  8. Secure web book to store structural genomics research data.

    PubMed

    Manjasetty, Babu A; Höppner, Klaus; Mueller, Uwe; Heinemann, Udo

    2003-01-01

    Recently established collaborative structural genomics programs aim at significantly accelerating the crystal structure analysis of proteins. These large-scale projects require efficient data management systems to ensure seamless collaboration between different groups of scientists working towards the same goal. Within the Berlin-based Protein Structure Factory, the synchrotron X-ray data collection and the subsequent crystal structure analysis tasks are located at BESSY, a third-generation synchrotron source. To organize file-based communication and data transfer at the BESSY site of the Protein Structure Factory, we have developed the web-based BCLIMS, the BESSY Crystallography Laboratory Information Management System. BCLIMS is a relational data management system which is powered by MySQL as the database engine and Apache HTTP as the web server. The database interface routines are written in the Python programming language. The software is freely available to academic users. Here we describe the storage, retrieval and manipulation of laboratory information, mainly pertaining to the synchrotron X-ray diffraction experiments and the subsequent protein structure analysis, using BCLIMS.

  9. Free Factories: Unified Infrastructure for Data Intensive Web Services

    PubMed Central

    Zaranek, Alexander Wait; Clegg, Tom; Vandewege, Ward; Church, George M.

    2010-01-01

    We introduce the Free Factory, a platform for deploying data-intensive web services using small clusters of commodity hardware and free software. Independently administered virtual machines called Freegols give application developers the flexibility of a general purpose web server, along with access to distributed batch processing, cache and storage services. Each cluster exploits idle RAM and disk space for cache, and reserves disks in each node for high bandwidth storage. The batch processing service uses a variation of the MapReduce model. Virtualization allows every CPU in the cluster to participate in batch jobs. Each 48-node cluster can achieve 4-8 gigabytes per second of disk I/O. Our intent is to use multiple clusters to process hundreds of simultaneous requests on multi-hundred terabyte data sets. Currently, our applications achieve 1 gigabyte per second of I/O with 123 disks by scheduling batch jobs on two clusters, one of which is located in a remote data center. PMID:20514356

  10. A control strategy for grid-side converter of DFIG under unbalanced condition based on DIgSILENT/PowerFactory

    NASA Astrophysics Data System (ADS)

    Han, Pingping; Zhang, Haitian; Chen, Lingqi; Zhang, Xiaoan

    2018-01-01

    The models of a doubly fed induction generator (DFIG) and its grid-side converter (GSC) are established under unbalanced grid conditions based on DIgSILENT/PowerFactory. According to the mathematical model, the vector equations for the positive- and negative-sequence voltages and currents are derived in the positive-sequence synchronous rotating reference frame (d-q-0), taking the characteristics of the simulation software fully into account. Moreover, the reference values of the GSC current components in the positive-sequence d-q-0 frame under unbalanced conditions can be obtained to improve the traditional GSC control, incorporating the national limits on unbalanced current. The simulation results indicate that the control strategy can effectively suppress the negative-sequence current and the double-frequency power oscillation on the GSC's AC side. The DC bus voltage can be maintained constant, ensuring uninterrupted operation of the DFIG under unbalanced grid conditions.
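    As background for the positive/negative-sequence separation discussed above, the sketch below applies the standard Fortescue transform to a set of unbalanced phasors. The phasor values are hypothetical, and the paper itself works with rotating d-q-0 quantities inside DIgSILENT/PowerFactory rather than static phasors.

        # Sketch: symmetrical (sequence) components of an unbalanced three-phase set.
        import cmath

        a = cmath.exp(1j * 2 * cmath.pi / 3)   # 120-degree rotation operator

        def sequence_components(va, vb, vc):
            v0 = (va + vb + vc) / 3                 # zero sequence
            v1 = (va + a * vb + a**2 * vc) / 3      # positive sequence
            v2 = (va + a**2 * vb + a * vc) / 3      # negative sequence
            return v0, v1, v2

        # Unbalanced three-phase voltages (per unit, hypothetical; phase in radians)
        va = cmath.rect(1.00, 0.0)
        vb = cmath.rect(0.85, -2.2)
        vc = cmath.rect(1.05, 2.0)
        print(sequence_components(va, vb, vc))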

  11. Evaluation of a Multicolor, Single-Tube Technique To Enumerate Lymphocyte Subpopulations▿

    PubMed Central

    Colombo, F.; Cattaneo, A.; Lopa, R.; Portararo, P.; Rebulla, P.; Porretti, L.

    2008-01-01

    To evaluate the fully automated FACSCanto software, we compared lymphocyte subpopulation counts obtained using three-color FACSCalibur-CELLQuest and six-color FACSCanto-FACSCanto software techniques. High correlations were observed between data obtained with these techniques. Our study indicated that FACSCanto clinical software is accurate and sensitive in single-platform lymphocyte immunophenotyping. PMID:18448621

  12. Transient Faults in Computer Systems

    NASA Technical Reports Server (NTRS)

    Masson, Gerald M.

    1993-01-01

    A powerful technique particularly appropriate for the detection of errors caused by transient faults in computer systems was developed. The technique can be implemented in either software or hardware; the research conducted thus far primarily considered software implementations. The error detection technique developed has the distinct advantage of having provably complete coverage of all errors caused by transient faults that affect the output produced by the execution of a program. In other words, the technique does not have to be tuned to a particular error model to enhance error coverage. Also, the correctness of the technique can be formally verified. The technique uses time and software redundancy. The foundation for an effective, low-overhead, software-based certification trail approach to real-time error detection resulting from transient fault phenomena was developed.
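    The certification-trail idea can be made concrete with a small example: the primary computation emits extra information (the trail) that lets a second, much simpler computation check the result cheaply. The sorting example below is a textbook illustration under my own assumptions, not one of the programs studied in the report.

        # Sketch: certification trail for sorting; the checker runs in linear time.
        def sort_with_trail(data):
            trail = sorted(range(len(data)), key=lambda i: data[i])  # permutation indices
            result = [data[i] for i in trail]
            return result, trail

        def check_with_trail(data, result, trail):
            # Verify the trail is a permutation, reproduces the output, and the output is ordered.
            if sorted(trail) != list(range(len(data))):
                return False
            if [data[i] for i in trail] != result:
                return False
            return all(result[i] <= result[i + 1] for i in range(len(result) - 1))

        data = [5, 3, 8, 1]
        result, trail = sort_with_trail(data)
        print(check_with_trail(data, result, trail))  # True unless a fault corrupted something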

  13. Infusing Reliability Techniques into Software Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  14. Infusing Software Assurance Research Techniques into Use

    NASA Technical Reports Server (NTRS)

    Pressburger, Thomas; DiVito, Ben; Feather, Martin S.; Hinchey, Michael; Markosian, Lawrence; Trevino, Luis C.

    2006-01-01

    Research in the software engineering community continues to lead to new development techniques that encompass processes, methods and tools. However, a number of obstacles impede their infusion into software development practices. These are the recurring obstacles common to many forms of research. Practitioners cannot readily identify the emerging techniques that may benefit them, and cannot afford to risk time and effort evaluating and trying one out while there remains uncertainty about whether it will work for them. Researchers cannot readily identify the practitioners whose problems would be amenable to their techniques and, lacking feedback from practical applications, are hard-pressed to gauge where and in what ways to evolve their techniques to make them more likely to be successful. This paper describes an ongoing effort conducted by a software engineering research infusion team established by NASA's Software Engineering Initiative to overcome these obstacles.

  15. Use of doubly labeled water technique in soldiers training for jungle warfare

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forbes-Ewan, C.H.; Morrissey, B.L.; Gregg, G.C.

    1989-07-01

    The doubly labeled water method was used to estimate the energy expended by four members of an Australian Army platoon (34 soldiers) engaged in training for jungle warfare. Each subject received an oral isotope dose sufficient to raise isotope levels by 200-250 (¹⁸O) and 100-120 ppm (²H). The experimental period was 7 days. Concurrently, a factorial estimate of the energy expenditure of the platoon was conducted. Also, a food intake-energy balance study was conducted for the platoon. Mean daily energy expenditure by the doubly labeled water method was 4,750 kcal (range 4,152-5,394 kcal). The factorial estimate of mean daily energy expenditure was 4,535 kcal. Because of inherent inaccuracies in the food intake-energy balance technique, we were able to conclude only that energy expenditure, as measured by this method, was greater than the estimated mean daily intake of 4,040 kcal. The doubly labeled water technique was well tolerated, is noninvasive, and appears to be suitable in a wide range of field applications.

  16. A Human Reliability Based Usability Evaluation Method for Safety-Critical Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillippe Palanque; Regina Bernhaupt; Ronald Boring

    2006-04-01

    Recent years have seen an increasing use of sophisticated interaction techniques, including in the field of safety-critical interactive software [8]. The use of such techniques has been required in order to increase the bandwidth between users and systems and thus to help them deal efficiently with increasingly complex systems. These techniques come from research and innovation done in the field of human-computer interaction (HCI). A significant effort is currently being undertaken by the HCI community in order to apply and extend current usability evaluation techniques to these new kinds of interaction techniques. However, very little has been done to improve the reliability of software offering these kinds of interaction techniques. Even testing basic graphical user interfaces remains a challenge that has rarely been addressed in the field of software engineering [9]. However, the non-reliability of interactive software can jeopardize usability evaluation by showing unexpected or undesired behaviors. The aim of this SIG is to provide a forum for both researchers and practitioners interested in testing interactive software. Our goal is to define a roadmap of activities to cross-fertilize usability and reliability testing of these kinds of systems to minimize duplicate efforts in both communities.

  17. FY 2002 Report on Software Visualization Techniques for IV and V

    NASA Technical Reports Server (NTRS)

    Fotta, Michael E.

    2002-01-01

    One of the major challenges software engineers often face in performing IV&V is developing an understanding of a system created by a development team they have not been part of. As budgets shrink and software increases in complexity, this challenge will become even greater as these software engineers face increased time and resource constraints. This research will determine which current aspects of providing this understanding (e.g., code inspections, use of control graphs, use of adjacency matrices, requirements traceability) are critical to performing IV&V and amenable to visualization techniques. We will then develop state-of-the-art software visualization techniques to facilitate the use of these aspects to understand software and perform IV&V.

  18. A Secure and Robust Approach to Software Tamper Resistance

    NASA Astrophysics Data System (ADS)

    Ghosh, Sudeep; Hiser, Jason D.; Davidson, Jack W.

    Software tamper-resistance mechanisms have increasingly assumed significance as a technique to prevent unintended uses of software. Closely related to anti-tampering techniques are obfuscation techniques, which make code difficult to understand or analyze and therefore challenging to modify meaningfully. This paper describes a secure and robust approach to software tamper resistance and obfuscation using process-level virtualization. The proposed techniques involve novel uses of software checksumming guards and encryption to protect an application. In particular, a virtual machine (VM) is assembled with the application at software build time such that the application cannot run without the VM. The VM provides just-in-time decryption of the program and dynamism for the application's code. The application's code is used to protect the VM to ensure a level of circular protection. Finally, to prevent the attacker from obtaining an analyzable snapshot of the code, the VM periodically discards all decrypted code. We describe a prototype implementation of these techniques and evaluate the run-time performance of applications using our system. We also discuss how our system provides stronger protection against tampering attacks than previously described tamper-resistance approaches.

  19. A Novel Rules Based Approach for Estimating Software Birthmark

    PubMed Central

    Binti Alias, Norma; Anwar, Sajid

    2015-01-01

    A software birthmark is a unique characteristic of software that can be used to detect software theft: comparing the birthmarks of two programs can tell us whether one is a copy of the other. Software theft and piracy are rapidly increasing problems of copying, stealing, and misusing software without the permission specified in the license agreement. Estimating a birthmark can play a key role in understanding its effectiveness. In this paper, a new technique is presented to evaluate and estimate a software birthmark based on the two most sought-after properties of birthmarks, credibility and resilience. For this purpose, concepts from soft computing such as probabilistic and fuzzy computing have been taken into account, and fuzzy logic is used to estimate the properties of the birthmark. The proposed fuzzy rule based technique is validated through a case study, and the results show that the technique is successful in assessing the specified properties of the birthmark, its resilience and credibility. This, in turn, shows how much effort will be required to detect the originality of the software based on its birthmark. PMID:25945363
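    To show the general shape of a fuzzy rule based estimate, the sketch below evaluates two invented rules over triangular membership functions for credibility and resilience and combines them into a single quality score. The membership functions, rule base and output singletons are my own assumptions, not the rule base defined in the paper.

        # Sketch: a tiny Mamdani-style fuzzy evaluation of birthmark quality.
        import numpy as np

        def tri(x, a, b, c):
            """Triangular membership function with support [a, c] and peak at b."""
            return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

        def estimate_quality(credibility, resilience):
            # Rule 1: IF credibility is high AND resilience is high THEN quality is high
            # Rule 2: IF credibility is low  OR  resilience is low  THEN quality is low
            cred_high = tri(credibility, 0.5, 1.0, 1.5)
            res_high  = tri(resilience, 0.5, 1.0, 1.5)
            cred_low  = tri(credibility, -0.5, 0.0, 0.5)
            res_low   = tri(resilience, -0.5, 0.0, 0.5)
            w_high = min(cred_high, res_high)
            w_low  = max(cred_low, res_low)
            # Weighted average of singleton outputs: quality_high = 1.0, quality_low = 0.0
            return w_high / (w_high + w_low) if (w_high + w_low) > 0 else 0.5

        print(estimate_quality(credibility=0.8, resilience=0.6))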

  20. Techniques for development of safety-related software for surgical robots.

    PubMed

    Varley, P

    1999-12-01

    Regulatory bodies require evidence that software controlling potentially hazardous devices is developed to good manufacturing practices. Effective techniques used in other industries assume long timescales and high staffing levels and can be unsuitable for use without adaptation in developing electronic healthcare devices. This paper discusses a set of techniques used in practice to develop software for a particular innovative medical product, an endoscopic camera manipulator. These techniques include identification of potential hazards and tracing their mitigating factors through the project lifecycle.

  1. Application of an integrated multi-criteria decision making AHP-TOPSIS methodology for ETL software selection.

    PubMed

    Hanine, Mohamed; Boutkhoum, Omar; Tikniouine, Abdessadek; Agouti, Tarik

    2016-01-01

    Today, a wide range of ETL software (Extract, Transform and Load) is available, constituting a major investment market. Each ETL tool uses its own techniques for extracting, transforming and loading data into a data warehouse, which makes the task of evaluating ETL software very difficult. However, choosing the right ETL software is critical to the success or failure of any Business Intelligence project. Because many factors affect the selection of ETL software, the selection process is considered a complex multi-criteria decision making (MCDM) problem. In this study, a decision-making methodology that employs two well-known MCDM techniques, namely the Analytic Hierarchy Process (AHP) and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), is designed. In this respect, the aim of using AHP is to analyze the structure of the ETL software selection problem and obtain the weights of the selected criteria. Then, the TOPSIS technique is used to calculate the alternatives' ratings. An example is given to illustrate the proposed methodology. Finally, a software prototype demonstrating both methods is implemented.
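    The sketch below shows the TOPSIS ranking step, assuming the criteria weights have already been obtained (for instance from AHP pairwise comparisons). The decision matrix, weights and criterion directions are hypothetical, not the ETL alternatives evaluated in the paper.

        # Sketch: TOPSIS closeness coefficients for a small decision matrix.
        import numpy as np

        decision = np.array([          # rows: alternatives, columns: criteria
            [7.0, 9.0, 6.0],
            [8.0, 6.0, 7.0],
            [6.0, 8.0, 9.0],
        ])
        weights = np.array([0.5, 0.3, 0.2])       # e.g. from AHP
        benefit = np.array([True, True, True])    # all criteria treated as "larger is better"

        norm = decision / np.sqrt((decision ** 2).sum(axis=0))   # vector normalisation
        weighted = norm * weights
        ideal = np.where(benefit, weighted.max(axis=0), weighted.min(axis=0))
        anti_ideal = np.where(benefit, weighted.min(axis=0), weighted.max(axis=0))
        d_plus = np.sqrt(((weighted - ideal) ** 2).sum(axis=1))
        d_minus = np.sqrt(((weighted - anti_ideal) ** 2).sum(axis=1))
        closeness = d_minus / (d_plus + d_minus)
        print("Closeness coefficients:", closeness.round(3))   # rank alternatives by this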

  2. Sexual behavior and perceived risk of HIV/AIDS among young migrant factory workers in Nepal.

    PubMed

    Puri, M; Cleland, J

    2006-03-01

    To analyze the sexual behavior, perceived risk of contracting STIs and HIV/AIDS, and protective behaviors of migrant workers aged 14-19 years in carpet and garment factories in the Kathmandu Valley, Nepal. A common assumption in Nepal is that young migrant workers experience an increase in vulnerability. Moving away from the social controls of family and community, they become exposed to a mixed-gender environment and therefore might initiate sex earlier or have more casual encounters than might otherwise be the case. The analysis is based on a representative sample survey of 1050 factory workers. Information was also obtained from 23 in-depth case histories. Both bivariate and multivariate techniques were applied to identify the factors associated with involvement in risky sexual behavior. Despite religious and cultural restrictions, one in five boys and one in eight unmarried girls reported experience of sexual intercourse. Early sexual experimentation, multiple partners, and low and irregular use of condoms are not uncommon. Instances of sexual exploitation by factory owners or managers were documented but were rare. Most nonregular sex partners were described as friends from the same factory or community. Despite high-risk behavior, relatively few young people considered themselves to be at risk of getting STIs or HIV/AIDS. Information on the possible consequences of unsafe sex is inadequate. Programs aimed at promotion of safer sex practices and life skill training that facilitates communication and utilization of sexual health services should target vulnerable migrant young people.

  3. Study on a novel laser target detection system based on software radio technique

    NASA Astrophysics Data System (ADS)

    Song, Song; Deng, Jia-hao; Wang, Xue-tian; Gao, Zhen; Sun, Ji; Sun, Zhi-hui

    2008-12-01

    This paper presents the application of software radio techniques to a laser target detection system using pseudo-random code modulation. Based on the theory of software radio, the basic framework of the system, the hardware platform, and the implementation of the software system are detailed. The block diagram of the system, the DSP circuit, the block diagram of the pseudo-random code generator, and the software flow diagram of the signal processing are also designed. Experimental results have shown that the application of software radio techniques provides a novel way to realize a modular, miniaturized and intelligent laser target detection system, and makes upgrading and improving the system simpler, more convenient, and cheaper.
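    The core signal-processing step behind pseudo-random code modulation is correlating the noisy return with the transmitted code to recover the round-trip delay. The sketch below illustrates that step with an arbitrary code length, echo amplitude and noise level; real systems implement the correlator in DSP or dedicated hardware rather than NumPy.

        # Sketch: recovering the delay of a PN-coded echo by correlation.
        import numpy as np

        rng = np.random.default_rng(7)
        pn_code = rng.choice([-1, 1], size=127)          # transmitted PN sequence
        delay = 40                                       # unknown round-trip delay (samples)
        received = np.zeros(300)
        received[delay:delay + len(pn_code)] = 0.5 * pn_code   # attenuated echo
        received += rng.normal(0, 0.4, size=received.shape)    # additive noise

        correlation = np.correlate(received, pn_code, mode="valid")
        print("Estimated delay:", int(np.argmax(correlation)), "samples")   # ~40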

  4. Software Health Management: A Short Review of Challenges and Existing Techniques

    NASA Technical Reports Server (NTRS)

    Pipatsrisawat, Knot; Darwiche, Adnan; Mengshoel, Ole J.; Schumann, Johann

    2009-01-01

    Modern spacecraft (as well as most other complex mechanisms like aircraft, automobiles, and chemical plants) rely more and more on software, to a point where software failures have caused severe accidents and loss of missions. Software failures during a manned mission can cause loss of life, so there are severe requirements to make the software as safe and reliable as possible. Typically, verification and validation (V&V) has the task of making sure that all software errors are found before the software is deployed and that it always conforms to the requirements. Experience, however, shows that this gold standard of error-free software cannot be reached in practice. Even if the software alone is free of glitches, its interoperation with the hardware (e.g., with sensors or actuators) can cause problems. Unexpected operational conditions or changes in the environment may ultimately cause a software system to fail. Is there a way to surmount this problem? In most modern aircraft and many automobiles, hardware such as central electrical, mechanical, and hydraulic components are monitored by IVHM (Integrated Vehicle Health Management) systems. These systems can recognize, isolate, and identify faults and failures, both those that already occurred as well as imminent ones. With the help of diagnostics and prognostics, appropriate mitigation strategies can be selected (replacement or repair, switch to redundant systems, etc.). In this short paper, we discuss some challenges and promising techniques for software health management (SWHM). In particular, we identify unique challenges for preventing software failure in systems which involve both software and hardware components. We then present our classifications of techniques related to SWHM. These classifications are performed based on dimensions of interest to both developers and users of the techniques, and hopefully provide a map for dealing with software faults and failures.

  5. The RAVEN Toolbox and Its Use for Generating a Genome-scale Metabolic Model for Penicillium chrysogenum

    PubMed Central

    Agren, Rasmus; Liu, Liming; Shoaie, Saeed; Vongsangnak, Wanwipa; Nookaew, Intawat; Nielsen, Jens

    2013-01-01

    We present the RAVEN (Reconstruction, Analysis and Visualization of Metabolic Networks) Toolbox: a software suite that allows for semi-automated reconstruction of genome-scale models. It makes use of published models and/or the KEGG database, coupled with extensive gap-filling and quality control features. The software suite also contains methods for visualizing simulation results and omics data, as well as a range of methods for performing simulations and analyzing the results. The software is a useful tool for system-wide data analysis in a metabolic context and for streamlined reconstruction of metabolic networks based on protein homology. The RAVEN Toolbox workflow was applied in order to reconstruct a genome-scale metabolic model for the important microbial cell factory Penicillium chrysogenum Wisconsin54-1255. The model was validated in a bibliomic study of in total 440 references, and it comprises 1471 unique biochemical reactions and 1006 ORFs. It was then used to study the roles of ATP and NADPH in the biosynthesis of penicillin, and to identify potential metabolic engineering targets for maximization of penicillin production. PMID:23555215

  6. The simulation library of the Belle II software system

    NASA Astrophysics Data System (ADS)

    Kim, D. Y.; Ritter, M.; Bilka, T.; Bobrov, A.; Casarosa, G.; Chilikin, K.; Ferber, T.; Godang, R.; Jaegle, I.; Kandra, J.; Kodys, P.; Kuhr, T.; Kvasnicka, P.; Nakayama, H.; Piilonen, L.; Pulvermacher, C.; Santelj, L.; Schwenker, B.; Sibidanov, A.; Soloviev, Y.; Starič, M.; Uglov, T.

    2017-10-01

    SuperKEKB, the next generation B factory, has been constructed in Japan as an upgrade of KEKB. This brand new e+ e- collider is expected to deliver a very large data set for the Belle II experiment, 50 times larger than the previous Belle sample. Both the triggered physics event rate and the background event rate will be increased by at least a factor of 10 over the previous ones, creating a challenging data-taking environment for the Belle II detector. The software system of the Belle II experiment is designed to execute this ambitious plan. A full detector simulation library, which is a part of the Belle II software system, has been created based on Geant4 and tested thoroughly. Recently the library has been upgraded to Geant4 version 10.1. The library is behaving as expected and is actively utilized in producing Monte Carlo data sets for various studies. In this paper, we explain the structure of the simulation library and the various interfaces to other packages, including geometry and beam background simulation.

  7. A Reference Model for Software and System Inspections. White Paper

    NASA Technical Reports Server (NTRS)

    He, Lulu; Shull, Forrest

    2009-01-01

    Software Quality Assurance (SQA) is an important component of the software development process. SQA processes provide assurance that the software products and processes in the project life cycle conform to their specified requirements by planning, enacting, and performing a set of activities to provide adequate confidence that quality is being built into the software. Typical techniques include: (1) testing, (2) simulation, (3) model checking, (4) symbolic execution, (5) management reviews, (6) technical reviews, (7) inspections, (8) walk-throughs, (9) audits, (10) analysis (complexity analysis, control flow analysis, algorithmic analysis), and (11) formal methods. Our work over the last few years has resulted in substantial knowledge about SQA techniques, especially in the areas of technical reviews and inspections. But can we apply the same QA techniques to the system development process? If yes, what kind of tailoring do we need before applying them in the system engineering context? If not, what types of QA techniques are actually used at the system level? And is there any room for improvement? After a brief examination of the system engineering literature (especially focused on NASA and DoD guidance) we found that: (1) the system and software development processes interact with each other at different phases of the development life cycle; (2) reviews are emphasized in both system and software development, and for some reviews (e.g., SRR, PDR, CDR) there are both system versions and software versions; (3) analysis techniques are emphasized (e.g., Fault Tree Analysis, Preliminary Hazard Analysis) and some details are given about how to apply them; (4) reviews are expected to use the outputs of the analysis techniques; in other words, these particular analyses are usually conducted in preparation for (before) reviews. The goal of our work is to explore the interaction between the Quality Assurance (QA) techniques at the system level and the software level.

  8. The NASA Software Research Infusion Initiative: Successful Technology Transfer for Software Assurance

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Pressburger, Thomas; Markosian, Lawrence; Feather, Martin S.

    2006-01-01

    New processes, methods and tools are constantly appearing in the field of software engineering. Many of these augur great potential for improving software development processes, resulting in higher quality software with greater levels of assurance. However, there are a number of obstacles that impede their infusion into software development practices. These are the recurring obstacles common to many forms of research. Practitioners cannot readily identify the emerging techniques that may most benefit them, and cannot afford to risk time and effort in evaluating and experimenting with them while there is still uncertainty about whether they will pay off in their particular context. Similarly, researchers cannot readily identify those practitioners whose problems would be amenable to their techniques, and lack the feedback from practical applications necessary to help them evolve their techniques to make them more likely to be successful. This paper describes an ongoing effort conducted by a software engineering research infusion team, and the NASA Research Infusion Initiative established by NASA's Software Engineering Initiative, to overcome these obstacles.

  9. Protective Coatings for Metals

    NASA Technical Reports Server (NTRS)

    Ruggieri, D. J.; Rowe, A. P.

    1986-01-01

    Report evaluates protective coatings for metal structures in seashore and acid-cloud environments. Evaluation result of study of coating application characteristics, repair techniques, and field performance. Products from variety of manufacturers included in study. Also factory-coated panels and industrial galvanized panels with and without topcoats.

  10. Industrial Internship and Entrepreneurship Competencies on Vocational High School Students

    NASA Astrophysics Data System (ADS)

    Wendi, H. F.; Kusumah, I. H.

    2018-02-01

    The purpose of this research is to explore the influence of internships and vocational skills on students' entrepreneurship competencies. The research used an ex post facto approach. The population comprises all students of a Vocational High School in Bandung, Indonesia, and a sample of 40 respondents was determined by a proportional random sampling technique. Data were collected with a questionnaire and a test, and analyzed using descriptive statistics and multiple linear regression. The results show that almost half of the students have low entrepreneurship competencies. Hypothesis testing shows that factory teaching has a positive and significant effect on entrepreneurship competencies; similarly, vocational skills have a positive and significant influence on these competencies. Taken together, factory teaching and vocational skills also have a joint effect on entrepreneurship competencies. Therefore, factory teaching and vocational skills are effective in building students' entrepreneurship competencies.

  11. Artificial intelligence approaches to software engineering

    NASA Technical Reports Server (NTRS)

    Johannes, James D.; Macdonald, James R.

    1988-01-01

    Artificial intelligence approaches to software engineering are examined. The software development life cycle is a sequence of not so well-defined phases. Improved techniques for developing systems have been formulated over the past 15 years, but pressure to reduce costs continues, and software development technology seems to be standing still. The primary objective of the knowledge-based approach to software development presented in this paper is to avoid problem areas that lead to schedule slippages, cost overruns, or software products that fall short of their desired goals. Identifying and resolving software problems early, often in the phase in which they first occur, has been shown to contribute significantly to reducing risks in software development. Software development is not a mechanical process but a basic human activity. It requires clear thinking, work, and rework to be successful. The artificial intelligence approaches to software engineering presented here support the software development life cycle by changing current practices and methods: these should be replaced by better techniques that improve the process of software development and the quality of the resulting products. The software development process can then be structured into well-defined steps whose interfaces are standardized, supported and checked by automated procedures that provide error detection, produce the documentation and ultimately support the actual design of complex programs.

  12. Design Features of Pedagogically-Sound Software in Mathematics.

    ERIC Educational Resources Information Center

    Haase, Howard; And Others

    Weaknesses in educational software currently available in the domain of mathematics are discussed. A technique used for the design and production of mathematics software aimed at improving problem-solving skills, combining sound pedagogy and innovative programming, is presented. To illustrate the design portion of this technique, a…

  13. Graphical Technique to Support the Teaching/Learning Process of Software Process Reference Models

    NASA Astrophysics Data System (ADS)

    Espinosa-Curiel, Ismael Edrein; Rodríguez-Jacobo, Josefina; Fernández-Zepeda, José Alberto

    In this paper, we propose a set of diagrams to visualize software process reference models (PRM). The diagrams, called dimods, are the combination of some visual and process modeling techniques such as rich pictures, mind maps, IDEF and RAD diagrams. We show the use of this technique by designing a set of dimods for the Mexican Software Industry Process Model (MoProSoft). Additionally, we perform an evaluation of the usefulness of dimods. The result of the evaluation shows that dimods may be a support tool that facilitates the understanding, memorization, and learning of software PRMs in both, software development organizations and universities. The results also show that dimods may have advantages over the traditional description methods for these types of models.

  14. de novo computational enzyme design.

    PubMed

    Zanghellini, Alexandre

    2014-10-01

    Recent advances in systems and synthetic biology as well as metabolic engineering are poised to transform industrial biotechnology by allowing us to design cell factories for the sustainable production of valuable fuels and chemicals. To deliver on their promises, such cell factories, as much as their brick-and-mortar counterparts, will require appropriate catalysts, especially for classes of reactions that are not known to be catalyzed by enzymes in natural organisms. A recently developed methodology, de novo computational enzyme design, can be used to create enzymes catalyzing novel reactions. Here we review the different classes of chemical reactions for which active protein catalysts have been designed as well as the results of detailed biochemical and structural characterization studies. We also discuss how combining de novo computational enzyme design with more traditional protein engineering techniques can alleviate the shortcomings of state-of-the-art computational design techniques and create novel enzymes with catalytic proficiencies on par with natural enzymes. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Web-Based Smoking-Cessation Program

    PubMed Central

    Strecher, Victor J.; McClure, Jennifer B.; Alexander, Gwen L.; Chakraborty, Bibhas; Nair, Vijay N.; Konkel, Janine M.; Greene, Sarah M.; Collins, Linda M.; Carlier, Carola C.; Wiese, Cheryl J.; Little, Roderick J.; Pomerleau, Cynthia S.; Pomerleau, Ovide F.

    2009-01-01

    Background Initial trials of web-based smoking-cessation programs have generally been promising. The active components of these programs, however, are not well understood. This study aimed to (1) identify active psychosocial and communication components of a web-based smoking-cessation intervention and (2) examine the impact of increasing the tailoring depth on smoking cessation. Design Randomized fractional factorial design. Setting Two HMOs: Group Health in Washington State and Henry Ford Health System in Michigan. Participants 1866 smokers. Intervention A web-based smoking-cessation program plus nicotine patch. Five components of the intervention were randomized using a fractional factorial design: high- versus low-depth tailored success story, outcome expectation, and efficacy expectation messages; high- versus low-personalized source; and multiple versus single exposure to the intervention components. Measurements Primary outcome was 7 day point-prevalence abstinence at the 6-month follow-up. Findings Abstinence was most influenced by high-depth tailored success stories and a high-personalized message source. The cumulative assignment of the three tailoring depth factors also resulted in increasing the rates of 6-month cessation, demonstrating an effect of tailoring depth. Conclusions The study identified relevant components of smoking-cessation interventions that should be generalizable to other cessation interventions. The study also demonstrated the importance of higher-depth tailoring in smoking-cessation programs. Finally, the use of a novel fractional factorial design allowed efficient examination of the study aims. The rapidly changing interfaces, software, and capabilities of eHealth are likely to require such dynamic experimental approaches to intervention discovery. PMID:18407003

  16. Optimization and influence of parameter affecting the compressive strength of geopolymer concrete containing recycled concrete aggregate: using full factorial design approach

    NASA Astrophysics Data System (ADS)

    Krishnan, Thulasirajan; Purushothaman, Revathi

    2017-07-01

    There are several parameters that influence the properties of geopolymer concrete containing recycled concrete aggregate as the coarse aggregate. In the present study, the vital parameters affecting the compressive strength of geopolymer concrete containing recycled concrete aggregate are analyzed by varying four parameters at two levels each, using a full factorial design in the statistical software Minitab® 17. The objective of the present work is to gain insight into the optimization, main parameter effects, their interactions and the predicted response of the model generated using factorial design. The parameters considered are the molarity of sodium hydroxide (8 M and 12 M), curing time (6 hrs and 24 hrs), curing temperature (60°C and 90°C) and percentage of recycled concrete aggregate (0% and 100%). The results show that curing time, molarity of sodium hydroxide and curing temperature were the significant parameters, in that order, and that the percentage of recycled concrete aggregate (RCA) was statistically insignificant in the production of geopolymer concrete. Thus, it is notable that the RCA content had a negligible effect on the compressive strength of geopolymer concrete. The responses predicted by the generated model showed satisfactory and rational agreement with the experimental data, with an R2 value of 97.70%. Thus, geopolymer concrete comprising recycled concrete aggregate can help address major social and environmental concerns such as the depletion of naturally available aggregate sources and the disposal of construction and demolition waste into landfill.
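    To show how main effects are extracted from such a two-level design, the sketch below codes the four factors as -1/+1 over the 16 runs of a full factorial and fits a linear model. The factor names follow the abstract, but the response values are synthetic and do not reproduce the paper's measured compressive strengths.

        # Sketch: main-effect coefficients from a 2^4 full factorial with coded factors.
        import numpy as np
        from itertools import product

        factors = ["molarity", "curing_time", "curing_temp", "rca_percent"]
        design = np.array(list(product([-1, 1], repeat=4)), dtype=float)   # 16 runs

        rng = np.random.default_rng(3)
        # Synthetic response: strong time/molarity/temperature effects, negligible RCA effect
        strength = (40 + 6 * design[:, 1] + 4 * design[:, 0] + 3 * design[:, 2]
                    + 0.2 * design[:, 3] + rng.normal(0, 1, 16))

        X = np.column_stack([np.ones(16), design])
        coef, *_ = np.linalg.lstsq(X, strength, rcond=None)
        for name, c in zip(["intercept"] + factors, coef):
            print(f"{name:12s} coefficient = {c:6.2f}")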

  17. Factors that influence the tribocharging of pulverulent materials in compressed-air devices

    NASA Astrophysics Data System (ADS)

    Das, S.; Medles, K.; Mihalcioiu, A.; Beleca, R.; Dragan, C.; Dascalescu, L.

    2008-12-01

    Tribocharging of pulverulent materials in compressed-air devices is a typical multi-factorial process. This paper aims at demonstrating the value of using the design of experiments methodology in association with virtual instrumentation for quantifying the effects of various process variables and of their interactions, as a prerequisite for the development of new tribocharging devices for industrial applications. The study is focused on the tribocharging of PVC powders in compressed-air devices similar to those employed in electrostatic painting. A classical 2³ full-factorial design (3 factors at two levels) was employed for conducting the experiments. The response function was the charge/mass ratio of the material collected in a modified Faraday cage at the exit of the tribocharging device. The charge/mass ratio was found to increase with the injection pressure and the vortex pressure in the tribocharging device, and to decrease with increasing feed rate. In the present study an in-house design-of-experiments software package was employed for statistical analysis of the experimental data and validation of the experimental model.

  18. Advanced Software V&V for Civil Aviation and Autonomy

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume P.

    2017-01-01

    With the advances in high-computing platform (e.g., advanced graphical processing units or multi-core processors), computationally-intensive software techniques such as the ones used in artificial intelligence or formal methods have provided us with an opportunity to further increase safety in the aviation industry. Some of these techniques have facilitated building safety at design time, like in aircraft engines or software verification and validation, and others can introduce safety benefits during operations as long as we adapt our processes. In this talk, I will present how NASA is taking advantage of these new software techniques to build in safety at design time through advanced software verification and validation, which can be applied earlier and earlier in the design life cycle and thus help also reduce the cost of aviation assurance. I will then show how run-time techniques (such as runtime assurance or data analytics) offer us a chance to catch even more complex problems, even in the face of changing and unpredictable environments. These new techniques will be extremely useful as our aviation systems become more complex and more autonomous.

  19. Validation and Verification of LADEE Models and Software

    NASA Technical Reports Server (NTRS)

    Gundy-Burlet, Karen

    2013-01-01

    The Lunar Atmosphere Dust Environment Explorer (LADEE) mission will orbit the moon in order to measure the density, composition and time variability of the lunar dust environment. The ground-side and onboard flight software for the mission is being developed using a Model-Based Software methodology. In this technique, models of the spacecraft and flight software are developed in a graphical dynamics modeling package. Flight Software requirements are prototyped and refined using the simulated models. After the model is shown to work as desired in this simulation framework, C-code software is automatically generated from the models. The generated software is then tested in real time Processor-in-the-Loop and Hardware-in-the-Loop test beds. Travelling Road Show test beds were used for early integration tests with payloads and other subsystems. Traditional techniques for verifying computational sciences models are used to characterize the spacecraft simulation. A lightweight set of formal methods analysis, static analysis, formal inspection and code coverage analyses are utilized to further reduce defects in the onboard flight software artifacts. These techniques are applied early and often in the development process, iteratively increasing the capabilities of the software and the fidelity of the vehicle models and test beds.

  20. Proactive Security Testing and Fuzzing

    NASA Astrophysics Data System (ADS)

    Takanen, Ari

    Software is bound to have security-critical flaws, and no testing or code auditing can ensure that software is flawless. But software security testing requirements have improved radically during the past years, largely due to criticism from security-conscious consumers and enterprise customers. Whereas in the past security flaws were taken for granted (and patches were quietly and humbly installed), they are now probably one of the most common reasons why people switch vendors or software providers. The maintenance costs of security updates often add up to become one of the biggest cost items for large enterprise users. Fortunately, test automation techniques have also improved. Techniques like model-based testing (MBT) enable efficient generation of security tests that reach good confidence levels in discovering zero-day mistakes in software. This technique is called fuzzing.
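    The sketch below shows only the core fuzzing loop: mutate a seed input at random and feed it to the target until it misbehaves. The target parser and seed are invented stand-ins; production fuzzers add model-based test generation, coverage feedback and crash triage on top of this loop.

        # Sketch: a minimal mutation fuzzer.
        import random

        def mutate(data: bytes, n_flips: int = 3) -> bytes:
            buf = bytearray(data)
            for _ in range(n_flips):
                i = random.randrange(len(buf))
                buf[i] ^= random.randrange(1, 256)   # flip some bits of one byte
            return bytes(buf)

        def target_parser(data: bytes) -> None:
            # Stand-in for the software under test; raises on malformed input.
            if not data.startswith(b"HDR"):
                raise ValueError("bad header")

        seed = b"HDR:payload-0001"
        random.seed(0)
        for i in range(1000):
            case = mutate(seed)
            try:
                target_parser(case)
            except Exception as exc:
                print(f"input {i} triggered {type(exc).__name__}: {case!r}")
                break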

  1. Displacement of screw-retained splinted and nonsplinted restorations into implants with conical internal connections.

    PubMed

    Yilmaz, Burak; Seidt, Jeremy D; Clelland, Nancy L

    2014-01-01

    Variable abutment displacement could potentially affect proximal contacts, incisal edge position, or occlusion of implant-supported prostheses. This study aimed to measure and compare displacements of splinted and nonsplinted restorations into implants featuring internal conical connections as screws were tightened by hand or by torque driver. A stereolithic resin model was printed using computed tomography data from a patient missing mandibular left first and second molars. Two 5.0 × 11-mm implants were placed in the edentulous site using a surgical guide. Two sets (splinted and nonsplinted) of gold screw-retained prostheses were made indirectly to fit the implants in the stereolithic model representing the patient. The axial position of the crowns relative to a fixed location on the model was recorded following hand tightening using the three-dimensional image correlation technique and image correlation software. A pair of high-resolution digital cameras provided a synchronized view of the model during the experiment. Relative crown positions were again recorded after tightening with a torque driver to 25 Ncm. Testing was repeated randomly three times for each set of crowns. Displacement data after torque tightening were compared using a factorial analysis of variance with JMP 9.0 software (SAS) followed by a Tukey-Kramer post hoc test (α = .05). Interproximal contacts were evaluated using an 8-μm tin foil shim after tightening by hand and torque driver. Displacements for splinted and nonsplinted restorations differed only in a buccal direction. The nonsplinted crowns displaced significantly more than splinted crowns. Discernible differences were observed for the tin foil shim when dragged through proximal contacts following hand versus torque tightening. Differences between screw tightening by hand or torque driver should be taken into consideration during laboratory and clinical adjustments to prevent esthetic and functional complications.
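    The factorial analysis of variance described above can be mirrored with standard statistical tooling; the sketch below runs a two-factor ANOVA of displacement on splinting and tightening method. The data frame values are fabricated for illustration and are not the study's measurements, and the study's actual model also included displacement direction.

        # Sketch: two-way factorial ANOVA with statsmodels on invented data.
        import pandas as pd
        import statsmodels.formula.api as smf
        from statsmodels.stats.anova import anova_lm

        data = pd.DataFrame({
            "splinting":  ["splinted"] * 6 + ["nonsplinted"] * 6,
            "tightening": (["hand"] * 3 + ["torque"] * 3) * 2,
            "displacement_um": [12, 14, 13, 18, 20, 19, 15, 16, 14, 30, 33, 31],
        })
        model = smf.ols("displacement_um ~ C(splinting) * C(tightening)", data=data).fit()
        print(anova_lm(model, typ=2))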

  2. Spacecraft Avionics Software Development Then and Now: Different but the Same

    NASA Technical Reports Server (NTRS)

    Mangieri, Mark L.; Garman, John (Jack); Vice, Jason

    2012-01-01

    NASA has always been in the business of balancing new technologies and techniques to achieve human space travel objectives. NASA's historic Software Production Facility (SPF) was developed to serve complex avionics software solutions during an era dominated by mainframes, tape drives, and lower-level programming languages. These systems have proven themselves resilient enough to serve the Shuttle Orbiter avionics life cycle for decades. The SPF and its predecessor, the Software Development Lab (SDL) at NASA's Johnson Space Center (JSC), hosted flight software (FSW) engineering, development, simulation, and test. It was active from the beginning of Shuttle Orbiter development in 1972 through the end of the shuttle program in the summer of 2011, almost 40 years. NASA's Kedalion engineering analysis lab is on the forefront of validating and using many contemporary avionics HW/SW development and integration techniques, which represent new paradigms to NASA's heritage culture in avionics software engineering. Kedalion has validated many of the Orion project's HW/SW engineering techniques borrowed from the adjacent commercial aircraft avionics environment, inserting new techniques and skills into the Multi-Purpose Crew Vehicle (MPCV) Orion program. Using contemporary agile techniques, COTS products, early rapid prototyping, in-house expertise and tools, and customer collaboration, NASA has adopted a cost-effective paradigm that is currently serving Orion effectively. This paper will explore and contrast differences in technology employed over the years of NASA's space program, due largely to technological advances in hardware and software systems, while acknowledging that the basic software engineering and integration paradigms share many similarities.

  3. Evolution of cooperative behavior in simulation agents

    NASA Astrophysics Data System (ADS)

    Stroud, Phillip D.

    1998-03-01

    A simulated automobile factory paint shop is used as a testbed for exploring the emulation of human decision-making behavior. A discrete-events simulation of the paint shop as a collection of interacting Java actors is described. An evolutionary cognitive architecture is under development for building software actors to emulate humans in simulations of human- dominated complex systems. In this paper, the cognitive architecture is extended by implementing a persistent population of trial behaviors with an incremental fitness valuation update strategy, and by allowing a group of cognitive actors to share information. A proof-of-principle demonstration is presented.

  4. Software Defined GPS API: Development and Implementation of GPS Correlator Architectures Using MATLAB with Focus on SDR Implementations

    DTIC Science & Technology

    2014-05-18

    intention of offering improved software libraries for GNSS signal acquisition. It has been the team mission to implement new and improved techniques to...

  5. A systematic approach to designing statistically powerful heteroscedastic 2 × 2 factorial studies while minimizing financial costs.

    PubMed

    Jan, Show-Li; Shieh, Gwowen

    2016-08-31

    The 2 × 2 factorial design is widely used for assessing the existence of interaction and the extent of generalizability of two factors where each factor has only two levels. Accordingly, research problems associated with the main effects and interaction effects can be analyzed with the selected linear contrasts. To correct for the potential heterogeneity of variance structure, the Welch-Satterthwaite test is commonly used as an alternative to the t test for detecting the substantive significance of a linear combination of mean effects. This study concerns the optimal allocation of group sizes for the Welch-Satterthwaite test in order to minimize the total cost while maintaining adequate power. The existing method suggests that the optimal ratio of sample sizes is proportional to the ratio of the population standard deviations divided by the square root of the ratio of the unit sampling costs. Instead, a systematic approach using optimization techniques and a screening search is presented to find the optimal solution. Numerical assessments revealed that the current allocation scheme generally does not give the optimal solution. Alternatively, the suggested approaches to power and sample size calculations give accurate and superior results under various treatment and cost configurations. The proposed approach improves upon the current method in both its methodological soundness and overall performance. Supplementary algorithms are also developed to aid the usefulness and implementation of the recommended technique in planning 2 × 2 factorial designs.
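
    As a rough illustration of the allocation rule quoted above (not the authors' optimization procedure), the Python sketch below computes the existing-method sample-size ratio and a Welch-Satterthwaite approximate degrees of freedom for an equal-weight contrast; all numbers are hypothetical.

    import math

    def existing_allocation_ratio(sigma1, sigma2, cost1, cost2):
        """Existing rule quoted in the abstract: n1/n2 = (sigma1/sigma2) / sqrt(cost1/cost2)."""
        return (sigma1 / sigma2) / math.sqrt(cost1 / cost2)

    def welch_satterthwaite_df(sd, n):
        """Approximate df for a contrast of independent group means (equal weights assumed)."""
        num = sum(s**2 / m for s, m in zip(sd, n)) ** 2
        den = sum((s**2 / m) ** 2 / (m - 1) for s, m in zip(sd, n))
        return num / den

    # Hypothetical two-group case with unequal spread and unequal unit sampling costs
    print(existing_allocation_ratio(sigma1=12.0, sigma2=6.0, cost1=4.0, cost2=1.0))  # 1.0
    print(welch_satterthwaite_df(sd=[12.0, 6.0], n=[40, 40]))                        # about 57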

  6. Computer experimental analysis of the CHP performance of a 100 kWe SOFC Field Unit by a factorial design

    NASA Astrophysics Data System (ADS)

    Calì, M.; Santarelli, M. G. L.; Leone, P.

    Gas Turbine Technologies (GTT) and Politecnico di Torino, both located in Torino (Italy), have been involved in the design and installation of a SOFC laboratory in order to analyse the operation, in cogenerative configuration, of the CHP 100 kWe SOFC Field Unit, built by Siemens-Westinghouse Power Corporation (SWPC), which is at present (May 2005) starting its operation and which will supply electric and thermal power to the GTT factory. In order to take better advantage of the analysis of the on-site operation, and especially to correctly design the scheduled experimental tests on the system, we developed a mathematical model and ran a simulated experimental campaign, applying a rigorous statistical approach to the analysis of the results. The aim of this work is the computer experimental analysis, through a statistical methodology (2^k factorial experiments), of the CHP 100 performance. First, the mathematical model was calibrated with the results acquired during the first CHP100 demonstration at EDB/ELSAM in Westerwoort. Afterwards, the simulated tests were performed in the form of a computer experimental session, and the measurement uncertainties were simulated with perturbations imposed on the model's independent variables. The statistical methodology used for the computer experimental analysis is factorial design (Yates' technique): using the ANOVA technique, the effect of the main independent variables (air utilization factor U_ox, fuel utilization factor U_F, internal fuel and air preheating, and anodic recycling flow rate) has been investigated in a rigorous manner. The analysis accounts for the effects of the parameters on stack electric power, recovered thermal power, single-cell voltage, cell operative temperature, consumed fuel flow, and steam-to-carbon ratio. Each main effect and interaction effect of the parameters is shown, with particular attention to the generated electric power and the stack heat recovered.
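
    To make the factorial machinery concrete, the sketch below applies Yates' algorithm to a toy single-replicate 2^2 design in Python; the response values are invented and the code is not part of the SOFC model described above.

    def yates_effects(y, k):
        """Yates' algorithm for a single-replicate 2^k factorial.
        `y` holds responses in standard (Yates) order; returns the grand mean
        followed by the estimated main and interaction effects."""
        assert len(y) == 2 ** k
        col = list(y)
        for _ in range(k):
            sums = [col[i + 1] + col[i] for i in range(0, len(col), 2)]
            diffs = [col[i + 1] - col[i] for i in range(0, len(col), 2)]
            col = sums + diffs
        mean = col[0] / 2 ** k
        effects = [c / 2 ** (k - 1) for c in col[1:]]
        return mean, effects

    # Toy 2^2 example, runs in standard order (1), a, b, ab with made-up responses
    mean, (eff_a, eff_b, eff_ab) = yates_effects([60.0, 72.0, 54.0, 68.0], k=2)
    print(mean, eff_a, eff_b, eff_ab)   # 63.5, 13.0 (A), -5.0 (B), 1.0 (AB)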

  7. Using Combined SFTA and SFMECA Techniques for Space Critical Software

    NASA Astrophysics Data System (ADS)

    Nicodemos, F. G.; Lahoz, C. H. N.; Abdala, M. A. D.; Saotome, O.

    2012-01-01

    This work addresses the combined Software Fault Tree Analysis (SFTA) and Software Failure Modes, Effects and Criticality Analysis (SFMECA) techniques applied to space critical software of satellite launch vehicles. The combined approach is under research as part of the Verification and Validation (V&V) efforts to increase software dependability and as future application in other projects under development at Instituto de Aeronáutica e Espaço (IAE). The applicability of such approach was conducted on system software specification and applied to a case study based on the Brazilian Satellite Launcher (VLS). The main goal is to identify possible failure causes and obtain compensating provisions that lead to inclusion of new functional and non-functional system software requirements.

  8. Selecting a software development methodology. [of digital flight control systems

    NASA Technical Reports Server (NTRS)

    Jones, R. E.

    1981-01-01

    The state-of-the-art analytical techniques for the development and verification of digital flight control software are studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error preventing and detecting capabilities of the chosen technique in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically, the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgements are expressed about the effectiveness and cost of the techniques. The reasons why quantitative assessments are not possible will be documented.

  9. Using Symbolic-Logic Matrices To Improve Confirmatory Factor Analysis Techniques.

    ERIC Educational Resources Information Center

    Creighton, Theodore B.; Coleman, Donald G.; Adams, R. C.

    A continuing and vexing problem associated with survey instrument development is the creation of items, initially, that correlate favorably a posteriori with constructs being measured. This study tests the use of symbolic-logic matrices developed by D. G. Coleman (1979) in creating factorially "pure" statistically discrete constructs in…

  10. Artistic Tasks Outperform Nonartistic Tasks for Stress Reduction

    ERIC Educational Resources Information Center

    Abbott, Kayleigh A.; Shanahan, Matthew J.; Neufeld, Richard W. J.

    2013-01-01

    Art making has been documented as an effective stress reduction technique. In this between-subjects experimental study, possible mechanisms of stress reduction were examined in a sample of 52 university students randomly assigned to one of four conditions generated by factorially crossing Activity Type (artistic or nonartistic) with Coping…

  11. Software Engineering Education Directory

    DTIC Science & Technology

    1990-04-01

    and Engineering (CMSC 735) Codes: GPEV2 * Textbooks: IEEE Tutorial on Models and Metrics for Software Management and Engineering by Basili, Victor R...Software Engineering (Comp 227) Codes: GPRY5 Textbooks: IEEE Tutorial on Software Design Techniques by Freeman, Peter and Wasserman, Anthony 1. Software

  12. RT-Syn: A real-time software system generator

    NASA Technical Reports Server (NTRS)

    Setliff, Dorothy E.

    1992-01-01

    This paper presents research into providing highly reusable and maintainable components by using automatic software synthesis techniques. This proposal uses domain knowledge combined with automatic software synthesis techniques to engineer large-scale mission-critical real-time software. The hypothesis centers on a software synthesis architecture that specifically incorporates application-specific (in this case real-time) knowledge. This architecture synthesizes complex system software to meet a behavioral specification and external interaction design constraints. Some examples of these external constraints are communication protocols, precisions, timing, and space limitations. The incorporation of application-specific knowledge facilitates the generation of mathematical software metrics which are used to narrow the design space, thereby making software synthesis tractable. Success has the potential to dramatically reduce mission-critical system life-cycle costs not only by reducing development time, but, more importantly, by facilitating maintenance, modifications, and extensions of complex mission-critical software systems, which are currently dominating life cycle costs.

  13. Quality measures and assurance for AI (Artificial Intelligence) software

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1988-01-01

    This report is concerned with the application of software quality and evaluation measures to AI software and, more broadly, with the question of quality assurance for AI software. Considered are not only the metrics that attempt to measure some aspect of software quality, but also the methodologies and techniques (such as systematic testing) that attempt to improve some dimension of quality, without necessarily quantifying the extent of the improvement. The report is divided into three parts. Part 1 reviews existing software quality measures, i.e., those that have been developed for, and applied to, conventional software. Part 2 considers the characteristics of AI software, the applicability and potential utility of measures and techniques identified in the first part, and reviews those few methods developed specifically for AI software. Part 3 presents an assessment and recommendations for the further exploration of this important area.

  14. Proceedings of Tenth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Papers are presented on the following topics: measurement of software technology, recent studies of the Software Engineering Lab, software management tools, expert systems, error seeding as a program validation technique, software quality assurance, software engineering environments (including knowledge-based environments), the Distributed Computing Design System, and various Ada experiments.

  15. The environmental control and life support system advanced automation project

    NASA Technical Reports Server (NTRS)

    Dewberry, Brandon S.

    1991-01-01

    The objective of the ECLSS Advanced Automation project includes reduction of the risk associated with the integration of new, beneficial software techniques. Demonstrations of this software to baseline engineering and test personnel will show the benefits of these techniques. The advanced software will be integrated into ground testing and ground support facilities, familiarizing key personnel with its usage.

  16. ISWHM: Tools and Techniques for Software and System Health Management

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Mengshoel, Ole J.; Darwiche, Adnan

    2010-01-01

    This presentation presents status and results of research on Software Health Management done within the NRA "ISWHM: Tools and Techniques for Software and System Health Management." Topics include: Ingredients of a Guidance, Navigation, and Control System (GN and C); Selected GN and C Testbed example; Health Management of major ingredients; ISWHM testbed architecture; and Conclusions and next Steps.

  17. Software system safety

    NASA Technical Reports Server (NTRS)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  18. Development of a New VLBI Data Analysis Software

    NASA Technical Reports Server (NTRS)

    Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.

    2010-01-01

    We present an overview of new VLBI analysis software under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system as well as incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software. We formulate the main goals of the software. The software should be flexible and modular to implement models and estimation techniques that currently exist or will appear in the future. On the other hand, it should be reliable and possess production quality for processing standard VLBI sessions. Also, it needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.

  19. An implementation and performance measurement of the progressive retry technique

    NASA Technical Reports Server (NTRS)

    Suri, Gaurav; Huang, Yennun; Wang, Yi-Min; Fuchs, W. Kent; Kintala, Chandra

    1995-01-01

    This paper describes a recovery technique called progressive retry for bypassing software faults in message-passing applications. The technique is implemented as reusable modules to provide application-level software fault tolerance. The paper describes the implementation of the technique and presents results from the application of progressive retry to two telecommunications systems. The results presented show that the technique is helpful in reducing the total recovery time for message-passing applications.
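
    The abstract does not spell out the individual retry steps, so the Python sketch below only illustrates the general idea of escalating the recovery scope on each failed retry; the step names are hypothetical and the actual progressive retry modules are considerably richer.

    import logging

    def progressive_retry(operation, recovery_steps):
        """Retry `operation`, widening the recovery scope one level at a time.
        `recovery_steps` is ordered from cheapest (e.g. local message replay)
        to most disruptive (e.g. process restart); names are illustrative."""
        try:
            return operation()
        except Exception as err:
            logging.warning("initial attempt failed: %s", err)
        for level, recover in enumerate(recovery_steps, start=1):
            recover()                  # apply recovery at this scope
            try:
                return operation()     # retry after recovery
            except Exception as err:
                logging.warning("retry at level %d failed: %s", level, err)
        raise RuntimeError("operation failed after all progressive retry levels")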

  20. How Do I Sample the Environment and Equipment?

    NASA Astrophysics Data System (ADS)

    Kornacki, Jeffrey L.

    Food product contamination from the post-processing environment is likely the most frequent cause of contaminated processed food product recalls, a significant source of poisoning outbreaks, and a source of shelf-life problems with processed ready-to-eat foods in North America. Conditions exist for the growth of microorganisms in most food processing factories. Failure to clean and effectively sanitize a microbial growth niche can lead to biofilm formation. Biofilms may be orders of magnitude more resistant to destruction by sanitizers. Cells in some biofilms have been shown to be 1,000 times more resistant to destruction than those which are freely suspended. This has implications for cleaning, sanitizing, sampling, and training. Sampling the factory environment is one means of monitoring the efficacy of microbiological control as well as a powerful tool for in-factory contamination investigation. Many sampling techniques exist and are discussed. It is important to recognize the difference between cleaning (removal of soil) and sanitization (reduction of microbial populations). Knowing where, when, and how to sample, how many samples to take, what to test for, and how to interpret test information is critical in finding and preventing contamination.

  1. Summary and recommendations. [reduced gravitational effects on materials manufactured in space

    NASA Technical Reports Server (NTRS)

    1975-01-01

    An economic analysis using econometric and cost benefit analysis techniques was performed to determine the feasibility of space processing of certain products. The overall objectives of the analysis were (1) to determine specific products or processes uniquely connected with space manufacturing, (2) to select a specific product or process from each of the areas of semiconductors, metals, and biochemicals, and (3) to determine the overall price/cost structure of each product or process considered. The economic elements of the analysis involved a generalized decision making format for analyzing space manufacturing, a comparative cost study of the selected processes in space vs. earth manufacturing, and a supply and demand study of the economic relationships of one of the manufacturing processes. Space processing concepts were explored. The first involved the use of the shuttle as the factory with all operations performed during individual flights. The second concept involved a permanent unmanned space factory which would be launched separately. The shuttle in this case would be used only for maintenance and refurbishment. Finally, some consideration was given to a permanent manned space factory.

  2. An empirical research on customer satisfaction study: a consideration of different levels of performance.

    PubMed

    Lee, Yu-Cheng; Wang, Yu-Che; Lu, Shu-Chiung; Hsieh, Yi-Fang; Chien, Chih-Hung; Tsai, Sang-Bing; Dong, Weiwei

    2016-01-01

    Customer satisfaction is the key factor for success and depends highly on the behaviors of frontline service providers. Customers should be managed as assets, and customers vary in their needs, preferences, and buying behavior. This study applied the Taiwan Customer Satisfaction Index model to a tourism factory to analyze customer satisfaction and loyalty. We surveyed 242 customers served by one tourism factory organization in Taiwan. A partial least squares analysis was performed to analyze and test the theoretical model. The results show that perceived quality had the greatest influence on customer satisfaction for satisfied and dissatisfied customers. In addition, in terms of customer loyalty, customer satisfaction is more important than image for satisfied and dissatisfied customers. The contribution of this paper is to propose two satisfaction levels of CSI models for analyzing customer satisfaction and loyalty, thereby helping tourism factory managers improve customer satisfaction effectively. Compared with traditional techniques, we believe that our method is more appropriate for making decisions about allocating resources and for assisting managers in establishing appropriate priorities in customer satisfaction management.

  3. Measurement of Employability Skills on Teaching Factory Learning

    NASA Astrophysics Data System (ADS)

    Subekti, S.; Ana, A.

    2018-02-01

    Vocational high schools, as educational institutions responsible for preparing skilled labor, face the challenge of improving the quality of human resources so that graduates can compete and survive in a changing climate of work. BPS noted an increase of 564,272 people in the non-working population (BAK) among vocational graduates in 2015-2017. The ability to adapt to and maintain jobs in a variety of conditions is called employability skills. This study aims to measure the development of employability skills, namely communication skills, problem-solving skills, and teamwork skills, in the implementation of teaching factory learning in SMK Negeri 1 Cibadak, THPH Skills Program, bakery competency. This research uses a mixed-methods approach with a concurrent triangulation design. Data collection techniques used interviews and questionnaires. The results show that students' employability skills in communication, problem solving, and teamwork increased during teaching factory learning. Learning principles that apply learning by doing, student centering, and learning arrangements resembling workplace situations and conditions have an impact on improving students' employability skills.

  4. Engineering Translation in Mammalian Cell Factories to Increase Protein Yield: The Unexpected Use of Long Non-Coding SINEUP RNAs.

    PubMed

    Zucchelli, Silvia; Patrucco, Laura; Persichetti, Francesca; Gustincich, Stefano; Cotella, Diego

    2016-01-01

    Mammalian cells are an indispensable tool for the production of recombinant proteins in contexts where function depends on post-translational modifications. Among them, Chinese Hamster Ovary (CHO) cells are the primary factories for the production of therapeutic proteins, including monoclonal antibodies (MAbs). To improve expression and stability, several methodologies have been adopted, including methods based on media formulation, selective pressure and cell- or vector engineering. This review presents current approaches aimed at improving mammalian cell factories that are based on the enhancement of translation. Among well-established techniques (codon optimization and improvement of mRNA secondary structure), we describe SINEUPs, a family of antisense long non-coding RNAs that are able to increase translation of partially overlapping protein-coding mRNAs. By exploiting their modular structure, SINEUP molecules can be designed to target virtually any mRNA of interest, and thus to increase the production of secreted proteins. Thus, synthetic SINEUPs represent a new versatile tool to improve the production of secreted proteins in biomanufacturing processes.

  5. Advances in metabolic pathway and strain engineering paving the way for sustainable production of chemical building blocks.

    PubMed

    Chen, Yun; Nielsen, Jens

    2013-12-01

    Bio-based production of chemical building blocks from renewable resources is an attractive alternative to petroleum-based platform chemicals. Metabolic pathway and strain engineering is the key element in constructing robust microbial chemical factories within the constraints of cost effective production. Here we discuss how the development of computational algorithms, novel modules and methods, omics-based techniques combined with modeling refinement are enabling reduction in development time and thus advance the field of industrial biotechnology. We further discuss how recent technological developments contribute to the development of novel cell factories for the production of the building block chemicals: adipic acid, succinic acid and 3-hydroxypropionic acid. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Optimization of the Critical Parameters of the Spherical Agglomeration Crystallization Method by the Application of the Quality by Design Approach.

    PubMed

    Gyulai, Orsolya; Kovács, Anita; Sovány, Tamás; Csóka, Ildikó; Aigner, Zoltán

    2018-04-20

    This research work presents the use of the Quality by Design (QbD) concept for optimization of the spherical agglomeration crystallization method in the case of the active agent, ambroxol hydrochloride (AMB HCl). AMB HCl spherical crystals were formulated by the spherical agglomeration method, which was applied as an antisolvent technique. Spherical crystals have good flowing properties, which makes the direct compression tableting method applicable. This means that the amount of additives used can be reduced and smaller tablets can be formed. For the risk assessment, LeanQbD Software was used. According to its results, four independent variables (mixing type and time, dT (temperature difference between solvent and antisolvent), and composition (solvent/antisolvent volume ratio)) and three dependent variables (mean particle size, aspect ratio, and roundness) were selected. Based on these, a 2-3 mixed-level factorial design was constructed, crystallization was accomplished, and the results were evaluated using the Statistica for Windows 13 program. Product assay was performed and it was revealed that improvements in the mean particle size (from ~13 to ~200 µm), roundness (from ~2.4 to ~1.5), aspect ratio (from ~1.7 to ~1.4), and flow properties were observed while polymorphic transitions were avoided.

  7. Statistical Modelling of Temperature and Moisture Uptake of Biochars Exposed to Selected Relative Humidity of Air.

    PubMed

    Bastistella, Luciane; Rousset, Patrick; Aviz, Antonio; Caldeira-Pires, Armando; Humbert, Gilles; Nogueira, Manoel

    2018-02-09

    New experimental techniques, as well as modern variants on known methods, have recently been employed to investigate the fundamental reactions underlying the oxidation of biochar. The purpose of this paper was to experimentally and statistically study how the relative humidity of air, mass, and particle size of four biochars influenced the adsorption of water and the increase in temperature. A random factorial design was employed using the intuitive statistical software Xlstat. A simple linear regression model and an analysis of variance with a pairwise comparison were performed. The experimental study was carried out on the wood of Quercus pubescens, Cyclobalanopsis glauca, Trigonostemon huangmosun, and Bambusa vulgaris, and involved five relative humidity conditions (22, 43, 75, 84, and 90%), two mass samples (0.1 and 1 g), and two particle sizes (powder and piece). Two response variables including water adsorption and temperature increase were analyzed and discussed. The temperature did not increase linearly with the adsorption of water. Temperature was modeled by nine explanatory variables, while water adsorption was modeled by eight. Five variables, including factors and their interactions, were found to be common to the two models. Sample mass and relative humidity influenced the two qualitative variables, while particle size and biochar type only influenced the temperature.

  8. Mobile shearography in applications

    NASA Astrophysics Data System (ADS)

    Kalms, Michael

    2007-09-01

    Modern optical methods such as digital shearography have attracted interest not only for laboratory investigations but also for applications on the factory floor because they can be sensitive, accurate, non-tactile and non-destructive. Optical inspection and measurement systems are used more and more across the entire manufacturing process. Shearography, as a coherent optical method, has been widely accepted as a useful NDT tool. It is a robust interferometric method to determine locations of maximum stress on various material structures. However, limitations of this technique lie in the bulky equipment components, the interpretation of the complex shearographic result images, and the difficulty of working with challenging surfaces such as dark, absorbing or bright, reflecting materials. We report a mobile shearography system that was especially designed for investigations of aircraft structures. The great advantage of this system is that all individual elements are balanced into a complete measurement procedure integrated in a handy body. Only by coordinating all involved parameters, such as loading, laser source, sensor unit and software, is it feasible to obtain optimal measurement results. This paper describes a complete mobile shearographic procedure, including loading and image processing facilities, for structural testing and flaw recognition on aircraft. The mobile system was successfully tested, e.g., on the up-to-date EADS multi-role combat aircraft Eurofighter.

  9. Constructing the Japanese version of the Maslach Burnout Inventory-Student Survey: Confirmatory factor analysis.

    PubMed

    Tsubakita, Takashi; Shimazaki, Kazuyo

    2016-01-01

    To examine the factorial validity of the Maslach Burnout Inventory-Student Survey, using a sample of 2061 Japanese university students majoring in the medical and natural sciences (67.9% male, 31.8% female; Mage  = 19.6 years, standard deviation = 1.5). The back-translated scale used unreversed items to assess inefficacy. The inventory's descriptive properties and Cronbach's alphas were calculated using SPSS software. The present authors compared fit indices of the null, one factor, and default three factor models via confirmatory factor analysis with maximum-likelihood estimation using AMOS software, version 21.0. Intercorrelations between exhaustion, cynicism, and inefficacy were relatively higher than in prior studies. Cronbach's alphas were 0.76, 0.85, and 0.78, respectively. Although fit indices of the hypothesized three factor model did not meet the respective criteria, the model demonstrated better fit than did the null and one factor models. The present authors added four paths between error variables within items, but the modified model did not show satisfactory fit. Subsequent analysis revealed that a bi-factor model fit the data better than did the hypothesized or modified three factor models. The Japanese version of the Maslach Burnout Inventory-Student Survey needs minor changes to improve the fit of its three factor model, but the scale as a whole can be used to adequately assess overall academic burnout in Japanese university students. Although the scale was back-translated, two items measuring exhaustion whose expressions overlapped should be modified, and all items measuring inefficacy should be reversed in order to statistically clarify the factorial difference between the scale's three factors. © 2015 The Authors. Japan Journal of Nursing Science © 2015 Japan Academy of Nursing Science.

  10. A study of software standards used in the avionics industry

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.

    1994-01-01

    Within the past decade, software has become an increasingly common element in computing systems. In particular, the role of software used in the aerospace industry, especially in life- or safety-critical applications, is rapidly expanding. This intensifies the need to use effective techniques for achieving and verifying the reliability of avionics software. Although certain software development processes and techniques are mandated by government regulating agencies, no one methodology has been shown to consistently produce reliable software. The knowledge base for designing reliable software simply has not reached the maturity of its hardware counterpart. In an effort to increase our understanding of software, the Langley Research Center conducted a series of experiments over 15 years with the goal of understanding why and how software fails. As part of this program, the effectiveness of current industry standards for the development of avionics is being investigated. This study involves the generation of a controlled environment to conduct scientific experiments on software processes.

  11. Novel Application of Density Estimation Techniques in Muon Ionization Cooling Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mohayai, Tanaz Angelina; Snopok, Pavel; Neuffer, David

    The international Muon Ionization Cooling Experiment (MICE) aims to demonstrate muon beam ionization cooling for the first time and constitutes a key part of the R&D towards a future neutrino factory or muon collider. Beam cooling reduces the size of the phase space volume occupied by the beam. Non-parametric density estimation techniques allow very precise calculation of the muon beam phase-space density and its increase as a result of cooling. These density estimation techniques are investigated in this paper and applied in order to estimate the reduction in muon beam size in MICE under various conditions.
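
    As a toy illustration of non-parametric density estimation applied to beam cooling (not the MICE analysis itself), the Python sketch below uses a Gaussian kernel density estimate on synthetic two-dimensional phase-space samples; a cooled beam occupies a smaller volume, so its estimated core density is higher.

    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(seed=1)

    # Synthetic (x, x') samples; real analyses use measured multi-dimensional tracker data.
    before = rng.multivariate_normal([0, 0], [[4.0, 0.5], [0.5, 1.0]], size=5000).T
    after = rng.multivariate_normal([0, 0], [[2.5, 0.3], [0.3, 0.7]], size=5000).T

    def core_density(sample):
        """Kernel density estimate of the phase-space density at the beam core."""
        kde = gaussian_kde(sample)        # bandwidth from Scott's rule
        return kde([[0.0], [0.0]])[0]     # density at the origin of phase space

    print("core density before cooling:", core_density(before))
    print("core density after cooling :", core_density(after))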

  12. Theoretical and software considerations for nonlinear dynamic analysis

    NASA Technical Reports Server (NTRS)

    Schmidt, R. J.; Dodds, R. H., Jr.

    1983-01-01

    In the finite element method for structural analysis, it is generally necessary to discretize the structural model into a very large number of elements to accurately evaluate displacements, strains, and stresses. As the complexity of the model increases, the number of degrees of freedom can easily exceed the capacity of present-day software systems. Improvements to structural analysis software, including more efficient use of existing hardware and improved structural modeling techniques, are discussed. One modeling technique that is used successfully in static linear and nonlinear analysis is multilevel substructuring. This research extends the use of multilevel substructure modeling to include dynamic analysis and defines the requirements for a general purpose software system capable of efficient nonlinear dynamic analysis. The multilevel substructuring technique is presented, the analytical formulations and computational procedures for dynamic analysis and nonlinear mechanics are reviewed, and an approach to the design and implementation of a general purpose structural software system is presented.
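
    The static building block of substructuring can be shown in a few lines: condensing the interior degrees of freedom of one substructure onto its boundary. The NumPy sketch below is a generic Guyan-style static condensation with a made-up two-spring example, not the software system discussed in the paper.

    import numpy as np

    def condense(K, boundary, interior):
        """Static condensation of one substructure: eliminate interior DOFs and
        return the reduced boundary stiffness K_bb - K_bi K_ii^{-1} K_ib."""
        K_bb = K[np.ix_(boundary, boundary)]
        K_bi = K[np.ix_(boundary, interior)]
        K_ib = K[np.ix_(interior, boundary)]
        K_ii = K[np.ix_(interior, interior)]
        return K_bb - K_bi @ np.linalg.solve(K_ii, K_ib)

    # Two unit springs in series, DOFs ordered [left end, middle, right end]
    K = np.array([[ 1.0, -1.0,  0.0],
                  [-1.0,  2.0, -1.0],
                  [ 0.0, -1.0,  1.0]])
    # Condensing out the middle DOF leaves the ends coupled by one spring of k = 0.5
    print(condense(K, boundary=[0, 2], interior=[1]))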

  13. NanoDesign: Concepts and Software for a Nanotechnology Based on Functionalized Fullerenes

    NASA Technical Reports Server (NTRS)

    Globus, Al; Jaffe, Richard; Chancellor, Marisa K. (Technical Monitor)

    1996-01-01

    Eric Drexler has proposed a hypothetical nanotechnology based on diamond and investigated the properties of such molecular systems. While attractive, diamondoid nanotechnology is not physically accessible with straightforward extensions of current laboratory techniques. We propose a nanotechnology based on functionalized fullerenes and investigate carbon nanotube based gears with teeth added via a benzyne reaction known to occur with C60. The gears are single-walled carbon nanotubes with appended coenzyme groups for teeth. Fullerenes are in widespread laboratory use and can be functionalized in many ways. Companion papers computationally demonstrate the properties of these gears (they appear to work) and the accessibility of the benzyne/nanotube reaction. This paper describes the molecular design techniques and rationale as well as the software that implements these design techniques. The software is a set of persistent C++ objects controlled by TCL command scripts. The c++/tcl interface is automatically generated by a software system called tcl_c++ developed by the author and described here. The objects keep track of different portions of the molecular machinery to allow different simulation techniques and boundary conditions to be applied as appropriate. This capability has been required to demonstrate (computationally) our gear's feasibility. A new distributed software architecture featuring a WWW universal client, CORBA distributed objects, and agent software is under consideration. The software architecture is intended to eventually enable a widely dispersed group to develop complex simulated molecular machines.

  14. Software Engineering Guidebook

    NASA Technical Reports Server (NTRS)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  15. Men's Preferences for Physical Activity Interventions: An Exploratory Study Using a Factorial Survey Design Created With R Software.

    PubMed

    Chatfield, Sheryl L; Gamble, Abigail; Hallam, Jeffrey S

    2018-03-01

    Effective exercise interventions are needed to improve quality of life and decrease the impact of chronic disease. Researchers suggest males have been underrepresented in exercise intervention studies, resulting in less understanding of their exercise practices. Findings from preference survey methods suggest reasonable association between preference and behavior. The purpose of the research described in this article was to use factorial survey, a preference method, to identify the characteristics of exercise interventions most likely to appeal to male participants, so preferences might be incorporated into future intervention research. The research was guided by the framework of Bandura's social cognitive theory, such that variations in individual, environmental, and behavioral factors were incorporated into vignettes. Participants included 53 adult male nonadministrative staff and contract employees at a public university in the Southeastern United States, who each scored 8 vignettes resulting in 423 observations. Multilevel models were used to assess the influence of the factors. Participants scored vignettes that included exercising with a single partner, playing basketball, and exercising in the evening higher than vignettes with other options. Qualitative analysis of an open response item identified additional alternatives in group size, participant desire for coaching support, and interest in programs that incorporate a range of activity alternatives. Findings from this research were consistent with elements of social cognitive theory as applied to health promotion. Factorial surveys potentially provide a resource effective means of identifying participants' preferences for use when planning interventions. The addition of a single qualitative item helped clarify and expand findings from statistical analysis.
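
    A factorial survey crosses vignette dimensions and randomly assigns a deck of vignettes to each respondent. The short Python sketch below (the dimension levels are invented, and the study itself used R) shows the basic mechanics of generating and sampling such a vignette universe.

    import itertools
    import random

    # Illustrative vignette dimensions loosely echoing those named in the abstract
    dimensions = {
        "group": ["alone", "with one partner", "with a small group"],
        "activity": ["basketball", "weight training", "jogging"],
        "time": ["morning", "lunchtime", "evening"],
    }

    def vignette_universe():
        """Full factorial crossing of all dimension levels (27 vignettes here)."""
        keys = list(dimensions)
        return [dict(zip(keys, combo)) for combo in itertools.product(*dimensions.values())]

    def sample_deck(n_vignettes=8, seed=None):
        """Random deck of vignettes scored by one respondent."""
        rng = random.Random(seed)
        return rng.sample(vignette_universe(), n_vignettes)

    for v in sample_deck(seed=42):
        print(f"{v['activity']}, {v['group']}, in the {v['time']}")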

  16. Putting Automated Visual Inspection Systems To Work On The Factory Floor: What's Missing?

    NASA Astrophysics Data System (ADS)

    Waltz, Frederick M.; Snyder, Michael A.; Batchelor, Bruce G.

    1990-02-01

    Machine vision systems and other automated visual inspection (AVI) systems have been proving their usefulness in factories for more than a decade. In spite of this, the number of installed systems is far below the number that could profitably be employed. In the opinion of the authors, the primary reason for this is the high cost of customizing vision systems to meet applications requirements. A three-part approach to this problem has proven to be useful: 1. A multi-phase paradigm for customer interaction, system specification, system development, and system installation; 2. A powerful and easy-to-use system development environment, including a. a flexible laboratory lighting setup, plus software-based tools to assist in the design of image acquisition systems, b. an image processing environment with a very large repertoire of image processing and feature extraction operations and an easy-to-use command interpreter having macro capabilities, and c. an image analysis environment with high-level constructs, a flexible and powerful syntax, and a "seamless" interface to the image processing level; and 3. A moderately-priced high-speed "target" system fully compatible with the development environment, so that algorithms developed thereon can be transferred directly to the factory environment without further development costs or reprogramming. Items 1 and 2 are covered in other papers [1,2,3,4,5] and are touched on here only briefly. Item 3 is the main subject of this paper. Our major motivation in presenting this paper is to offer suggestions to vendors developing commercial boards and systems, in hopes that the special needs of industrial inspection can be met.

  17. Software techniques for a distributed real-time processing system. [for spacecraft

    NASA Technical Reports Server (NTRS)

    Lesh, F.; Lecoq, P.

    1976-01-01

    The paper describes software techniques developed for the Unified Data System (UDS), a distributed processor network for control and data handling onboard a planetary spacecraft. These techniques include a structured language for specifying the programs contained in each module, and a small executive program in each module which performs scheduling and implements the module task.

  18. The Loci Mnemonic Technique in Learning and Memory.

    ERIC Educational Resources Information Center

    Montague, William E.; Carter, John

    This study investigated the facilitative effects of the loci system using mental imagery for acquisition and recall within a retroactive inhibition (RI) paradigm. Fifty-five college undergraduates were randomly assigned to five treatment conditions. Four groups formed cells in a 2x2 factorial design which included (1) an RI factor (AB-AD vs.…

  19. Multilevel Factor Analyses of Family Data from the Hawai'i Family Study of Cognition

    ERIC Educational Resources Information Center

    McArdle, John J.; Hamagami, Fumiaki; Bautista, Randy; Onoye, Jane; Hishinuma, Earl S.; Prescott, Carol A.; Takeshita, Junji; Zonderman, Alan B.; Johnson, Ronald C.

    2014-01-01

    In this study, we reanalyzed the classic Hawai'i Family Study of Cognition (HFSC) data using contemporary multilevel modeling techniques. We used the HFSC baseline data ("N" = 6,579) and reexamined the factorial structure of 16 cognitive variables using confirmatory (restricted) measurement models in an explicit sequence. These models…

  20. Will Technological Convergence Reverse Globalization (Strategic Forum, Number 297)

    DTIC Science & Technology

    2016-07-01

    for counterfeiting high-value products they are contracted to produce. One technique is for the Chinese contractor to build a duplicate factory...

  1. Superwoman Schema: Using Structural Equation Modeling to Investigate Measurement Invariance in a Questionnaire

    ERIC Educational Resources Information Center

    Steed, Teneka C.

    2013-01-01

    Evaluating the psychometric properties of a newly developed instrument is critical to understanding how well an instrument measures what it intends to measure, and ensuring proposed use and interpretation of questionnaire scores are valid. The current study uses Structural Equation Modeling (SEM) techniques to examine the factorial structure and…

  2. Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment

    NASA Technical Reports Server (NTRS)

    Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun

    2006-01-01

    Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development, chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operation components. We have also had to determine how to integrate existing mission applications for sequence development, sequence validation, high level activity planning, and other functions into a component-based environment. For each of these, we used a somewhat different technique based upon the structure and usage of the existing application.

  3. Involuntary eye motion correction in retinal optical coherence tomography: Hardware or software solution?

    PubMed

    Baghaie, Ahmadreza; Yu, Zeyun; D'Souza, Roshan M

    2017-04-01

    In this paper, we review state-of-the-art techniques to correct eye motion artifacts in Optical Coherence Tomography (OCT) imaging. The methods for eye motion artifact reduction can be categorized into two major classes: (1) hardware-based techniques and (2) software-based techniques. In the first class, additional hardware is mounted onto the OCT scanner to gather information about the eye motion patterns during OCT data acquisition. This information is later processed and applied to the OCT data for creating an anatomically correct representation of the retina, either in an offline or online manner. In software based techniques, the motion patterns are approximated either by comparing the acquired data to a reference image, or by considering some prior assumptions about the nature of the eye motion. Careful investigations done on the most common methods in the field provides invaluable insight regarding future directions of the research in this area. The challenge in hardware-based techniques lies in the implementation aspects of particular devices. However, the results of these techniques are superior to those obtained from software-based techniques because they are capable of capturing secondary data related to eye motion during OCT acquisition. Software-based techniques on the other hand, achieve moderate success and their performance is highly dependent on the quality of the OCT data in terms of the amount of motion artifacts contained in them. However, they are still relevant to the field since they are the sole class of techniques with the ability to be applied to legacy data acquired using systems that do not have extra hardware to track eye motion. Copyright © 2017 Elsevier B.V. All rights reserved.
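
    One common software-only building block mentioned above, comparing acquired data against a reference image, can be sketched with phase correlation; the NumPy example below recovers a known synthetic shift and is far simpler than a real OCT motion-correction pipeline.

    import numpy as np

    def estimate_shift(reference, frame):
        """Estimate the (row, col) translation of `frame` relative to `reference`
        from the peak of their normalized phase correlation."""
        R = np.fft.fft2(reference)
        F = np.fft.fft2(frame)
        cross_power = np.conj(R) * F
        cross_power /= np.abs(cross_power) + 1e-12
        corr = np.fft.ifft2(cross_power).real
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        # shifts beyond half the image size wrap around to negative values
        return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

    # Synthetic check: shift a random "image" by (3, -5) pixels and recover it
    rng = np.random.default_rng(0)
    ref = rng.random((128, 128))
    moved = np.roll(ref, shift=(3, -5), axis=(0, 1))
    print(estimate_shift(ref, moved))   # expected (3, -5)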

  4. Designing application software in wide area network settings

    NASA Technical Reports Server (NTRS)

    Makpangou, Mesaac; Birman, Ken

    1990-01-01

    Progress in methodologies for developing robust local area network software has not been matched by similar results for wide area settings. The design of application software spanning multiple local area environments is examined. For important classes of applications, simple design techniques are presented that yield fault tolerant wide area programs. An implementation of these techniques as a set of tools for use within the ISIS system is described.

  5. Software Aids for radiologists: Part 1, Useful Photoshop skills.

    PubMed

    Gross, Joel A; Thapa, Mahesh M

    2012-12-01

    The purpose of this review is to describe the use of several essential techniques and tools in Adobe Photoshop image-editing software. The techniques shown expand on those previously described in the radiologic literature. Radiologists, especially those with minimal experience with image-editing software, can quickly apply a few essential Photoshop tools to minimize the frustration that can result from attempting to navigate a complex user interface.

  6. Advanced Automation for Ion Trap Mass Spectrometry-New Opportunities for Real-Time Autonomous Analysis

    NASA Technical Reports Server (NTRS)

    Palmer, Peter T.; Wong, C. M.; Salmonson, J. D.; Yost, R. A.; Griffin, T. P.; Yates, N. A.; Lawless, James G. (Technical Monitor)

    1994-01-01

    The utility of MS/MS for both target compound analysis and the structure elucidation of unknowns has been described in a number of references. A broader acceptance of this technique has not yet been realized as it requires large, complex, and costly instrumentation which has not been competitive with more conventional techniques. Recent advancements in ion trap mass spectrometry promise to change this situation. Although the ion trap's small size, sensitivity, and ability to perform multiple stages of mass spectrometry have made it eminently suitable for on-line, real-time monitoring applications, advanced automation techniques are required to make these capabilities more accessible to non-experts. Towards this end we have developed custom software for the design and implementation of MS/MS experiments. This software allows the user to take full advantage of the ion trap's versatility with respect to ionization techniques, scan proxies, and ion accumulation/ejection methods. Additionally, expert system software has been developed for autonomous target compound analysis. This software has been linked to ion trap control software and a commercial data system to bring all of the steps in the analysis cycle under control of the expert system. These software development efforts and their utilization for a number of trace analysis applications will be described.

  7. Software Dependability and Safety Evaluations ESA's Initiative

    NASA Astrophysics Data System (ADS)

    Hernek, M.

    ESA has allocated funds for an initiative to evaluate dependability and safety methods for software. The objectives of this initiative are: (1) more extensive validation of safety and dependability techniques for software, and (2) provision of valuable results to improve the quality of software, thus promoting the application of dependability and safety methods and techniques. ESA space systems are being developed according to defined PA requirement specifications. These requirements may be implemented through various design concepts, e.g. redundancy, diversity, etc., varying from project to project. Analysis methods (FMECA, FTA, HA, etc.) are frequently used during requirements analysis and design activities to assure the correct implementation of system PA requirements. The criticality level of failures, functions and systems is determined, and by doing that the critical sub-systems are identified, on which dependability and safety techniques are to be applied during development. Proper performance of the software development requires the development of a technical specification for the products at the beginning of the life cycle. Such a technical specification comprises both functional and non-functional requirements. These non-functional requirements address characteristics of the product such as quality, dependability, safety and maintainability. Software in space systems is more and more used in critical functions. Also, the trend towards more frequent use of COTS and reusable components poses new difficulties in terms of assuring reliable and safe systems. Because of this, software dependability and safety must be carefully analysed. ESA identified and documented techniques, methods and procedures to ensure that software dependability and safety requirements are specified and taken into account during the design and development of a software system, and to verify/validate that the implemented software systems comply with these requirements [R1].

  8. Automated Construction of Node Software Using Attributes in a Ubiquitous Sensor Network Environment

    PubMed Central

    Lee, Woojin; Kim, Juil; Kang, JangMook

    2010-01-01

    In sensor networks, nodes must often operate in a demanding environment facing restrictions such as restricted computing resources, unreliable wireless communication and power shortages. Such factors make the development of ubiquitous sensor network (USN) applications challenging. To help developers construct a large amount of node software for sensor network applications easily and rapidly, this paper proposes an approach to the automated construction of node software for USN applications using attributes. In the proposed technique, application construction proceeds by first developing a model for the sensor network and then designing node software by setting the values of the predefined attributes. After that, the sensor network model and the design of node software are verified. The final source codes of the node software are automatically generated from the sensor network model. We illustrate the efficiency of the proposed technique by using a gas/light monitoring application through a case study of a Gas and Light Monitoring System based on the Nano-Qplus operating system. We evaluate the technique using a quantitative metric—the memory size of execution code for node software. Using the proposed approach, developers are able to easily construct sensor network applications and rapidly generate a large number of node softwares at a time in a ubiquitous sensor network environment. PMID:22163678
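
    The generation step can be pictured as filling a code template from the attribute values set by the developer; the Python sketch below is only a schematic stand-in, with invented attribute names, template, and validation, and is not the Nano-Qplus toolchain.

    # Hypothetical attributes a developer would set for one sensor node
    node_attributes = {
        "node_id": 3,
        "sensor": "gas",
        "sample_period_ms": 500,
        "alarm_threshold": 0.8,
    }

    def validate(attrs):
        """Tiny stand-in for the verification step described in the abstract."""
        assert attrs["sample_period_ms"] > 0
        assert 0.0 < attrs["alarm_threshold"] <= 1.0

    def generate_node_source(attrs):
        """Render the attribute values into node source text (schematic only)."""
        validate(attrs)
        return "\n".join([
            "// auto-generated node software (sketch only)",
            f"#define NODE_ID          {attrs['node_id']}",
            f'#define SENSOR_TYPE      "{attrs["sensor"]}"',
            f"#define SAMPLE_PERIOD_MS {attrs['sample_period_ms']}",
            f"#define ALARM_THRESHOLD  {attrs['alarm_threshold']}",
        ])

    print(generate_node_source(node_attributes))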

  9. Automated construction of node software using attributes in a ubiquitous sensor network environment.

    PubMed

    Lee, Woojin; Kim, Juil; Kang, JangMook

    2010-01-01

    In sensor networks, nodes must often operate in a demanding environment facing restrictions such as restricted computing resources, unreliable wireless communication and power shortages. Such factors make the development of ubiquitous sensor network (USN) applications challenging. To help developers construct a large amount of node software for sensor network applications easily and rapidly, this paper proposes an approach to the automated construction of node software for USN applications using attributes. In the proposed technique, application construction proceeds by first developing a model for the sensor network and then designing node software by setting the values of the predefined attributes. After that, the sensor network model and the design of node software are verified. The final source codes of the node software are automatically generated from the sensor network model. We illustrate the efficiency of the proposed technique by using a gas/light monitoring application through a case study of a Gas and Light Monitoring System based on the Nano-Qplus operating system. We evaluate the technique using a quantitative metric-the memory size of execution code for node software. Using the proposed approach, developers are able to easily construct sensor network applications and rapidly generate a large number of node softwares at a time in a ubiquitous sensor network environment.

  10. Static and Dynamic Verification of Critical Software for Space Applications

    NASA Astrophysics Data System (ADS)

    Moreira, F.; Maia, R.; Costa, D.; Duro, N.; Rodríguez-Dapena, P.; Hjortnaes, K.

    Space technology is no longer used only for highly specialised research activities or for sophisticated manned space missions. Modern society relies more and more on space technology and applications for everyday activities. Worldwide telecommunications, Earth observation, navigation and remote sensing are only a few examples of space applications on which we rely daily. The European driven global navigation system Galileo and its associated applications, e.g. air traffic management, vessel and car navigation, will significantly expand the already stringent safety requirements for space-based applications. Apart from their usefulness and practical applications, every single piece of onboard software deployed into space represents an enormous investment. With a long lifetime of operation and being extremely difficult to maintain and upgrade, at least when compared with "mainstream" software development, the importance of ensuring their correctness before deployment is immense. Verification & Validation techniques and technologies have a key role in ensuring that the onboard software is correct and error free, or at least free from errors that can potentially lead to catastrophic failures. Many RAMS techniques, including both static criticality analysis and dynamic verification techniques, have been used as a means to verify and validate critical software and to ensure its correctness. But, traditionally, these have been applied in isolation. One of the main reasons is the immaturity of this field with respect to its application to the increasing software products within space systems. This paper presents an innovative way of combining both static and dynamic techniques, exploiting their synergy and complementarity for software fault removal. The methodology proposed is based on the combination of Software FMEA and FTA with fault-injection techniques. The case study herein described is implemented with support from two tools: the SoftCare tool for the SFMEA and SFTA, and the Xception tool for fault injection. Keywords: Verification & Validation, RAMS, Onboard software, SFMEA, SFTA, Fault injection. This work is being performed under the project STADY (Applied Static And Dynamic Verification Of Critical Software), ESA/ESTEC Contract Nr. 15751/02/NL/LvH.

  11. Software Process Assessment (SPA)

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda H.; Sheppard, Sylvia B.; Butler, Scott A.

    1994-01-01

    NASA's environment mirrors the changes taking place in the nation at large, i.e. workers are being asked to do more work with fewer resources. For software developers at NASA's Goddard Space Flight Center (GSFC), the effects of this change are that we must continue to produce quality code that is maintainable and reusable, but we must learn to produce it more efficiently and less expensively. To accomplish this goal, the Data Systems Technology Division (DSTD) at GSFC is trying a variety of both proven and state-of-the-art techniques for software development (e.g., object-oriented design, prototyping, designing for reuse, etc.). In order to evaluate the effectiveness of these techniques, the Software Process Assessment (SPA) program was initiated. SPA was begun under the assumption that the effects of different software development processes, techniques, and tools, on the resulting product must be evaluated in an objective manner in order to assess any benefits that may have accrued. SPA involves the collection and analysis of software product and process data. These data include metrics such as effort, code changes, size, complexity, and code readability. This paper describes the SPA data collection and analysis methodology and presents examples of benefits realized thus far by DSTD's software developers and managers.

  12. Preliminary design of the redundant software experiment

    NASA Technical Reports Server (NTRS)

    Campbell, Roy; Deimel, Lionel; Eckhardt, Dave, Jr.; Kelly, John; Knight, John; Lauterbach, Linda; Lee, Larry; Mcallister, Dave; Mchugh, John

    1985-01-01

    The goal of the present experiment is to characterize the fault distributions of highly reliable software replicates constructed using techniques and environments similar to those used in contemporary industrial software facilities. The fault distributions and their effect on the reliability of fault-tolerant configurations of the software will be determined through extensive life testing of the replicates against carefully constructed, randomly generated test data. Each detected error will be analyzed in detail to provide insight into its nature and cause. A direct objective is to develop techniques for reducing the intensity of coincident errors, thus increasing the reliability gain which can be achieved with fault tolerance. Data on the reliability gains realized, and on the cost of the fault-tolerant configurations, can be used to design a companion experiment to determine the cost effectiveness of the fault-tolerant strategy. Finally, the data and analysis produced by this experiment will be valuable to the software engineering community as a whole because they will provide useful insight into the nature and causes of the subtle, hard-to-find faults which escape standard software engineering validation techniques and thus persist far into the software life cycle.
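
    The life-testing idea can be illustrated with a small sketch: independently written replicates of one function are run against randomly generated inputs, a majority vote forms the fault-tolerant output, and inputs on which two or more replicates fail together are counted as coincident errors. The replicates and seeded faults below are invented stand-ins, not the experiment's actual programs.

    ```python
    import random
    from collections import Counter

    # Three hypothetical replicates of one specification: the median of three integers.
    def replicate_a(x, y, z):
        return sorted([x, y, z])[1]

    def replicate_b(x, y, z):               # seeded fault: mishandles z == 0
        return 0 if z == 0 else max(min(x, y), min(max(x, y), z))

    def replicate_c(x, y, z):               # seeded fault: returns the minimum when z is even
        return min(x, y, z) if z % 2 == 0 else sorted([x, y, z])[1]

    def oracle(x, y, z):
        return sorted([x, y, z])[1]

    def life_test(trials=100_000):
        single = coincident = 0
        for _ in range(trials):
            args = tuple(random.randint(0, 9) for _ in range(3))
            results = [r(*args) for r in (replicate_a, replicate_b, replicate_c)]
            failures = sum(r != oracle(*args) for r in results)
            if failures == 1:
                single += 1
            elif failures >= 2:
                coincident += 1             # errors the majority vote cannot mask
            voted = Counter(results).most_common(1)[0][0]
            assert failures >= 2 or voted == oracle(*args)
        return single / trials, coincident / trials

    print("single / coincident failure rates:", life_test())
    ```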

  13. Use of an embedded, micro-randomised trial to investigate non-compliance in telehealth interventions.

    PubMed

    Law, Lisa M; Edirisinghe, Nuwani; Wason, James Ms

    2016-08-01

    Many types of telehealth interventions rely on activity from the patient in order to have a beneficial effect on their outcome. Remote monitoring systems require the patient to record regular measurements at home, for example, blood pressure, so clinicians can see whether the patient's health changes over time and intervene if necessary. A big problem in this type of intervention is non-compliance. Most telehealth trials report compliance rates, but they rarely compare compliance among various options of telehealth delivery, of which there may be many. Optimising telehealth delivery is vital for improving compliance and, therefore, clinical outcomes. We propose a trial design which investigates ways of improving compliance. For efficiency, this trial is embedded in a larger trial for evaluating clinical effectiveness. It employs a technique called micro-randomisation, where individual patients are randomised multiple times throughout the study. The aims of this article are (1) to verify whether the presence of an embedded secondary trial still allows valid analysis of the primary research and (2) to demonstrate the usefulness of the micro-randomisation technique for comparing compliance interventions. Simulation studies were used to simulate a large number of clinical trials, in which no embedded trial was used, a micro-randomised embedded trial was used, and a factorial embedded trial was used. Each simulation recorded the operating characteristics of the primary and secondary trials. We show that the type I error rate of the primary analysis was not affected by the presence of an embedded secondary trial. Furthermore, we show that micro-randomisation is superior to a factorial design as it reduces the variation caused by within-patient correlation. It therefore requires smaller sample sizes - our simulations showed a requirement of 128 patients for a micro-randomised trial versus 760 patients for a factorial design, in the presence of within-patient correlation. We believe that an embedded, micro-randomised trial is a feasible technique that can potentially be highly useful in telehealth trials. © The Author(s) 2016.
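
    A rough sketch of the micro-randomisation idea with made-up effect sizes: each simulated patient is re-randomised between two compliance-support strategies at every measurement occasion, so each patient acts as their own control and the within-patient correlation that inflates a factorial (between-patient) comparison largely cancels out. None of the parameter values below come from the article.

    ```python
    import random
    import statistics

    def simulate_micro_randomised(n_patients=128, occasions=30,
                                  base_p=0.6, effect=0.05, patient_sd=0.15):
        """Per-occasion randomisation of a compliance-support option.
        All parameter values are illustrative, not taken from the trial."""
        diffs = []
        for _ in range(n_patients):
            patient_shift = random.gauss(0.0, patient_sd)   # within-patient correlation
            complied = {0: [], 1: []}
            for _ in range(occasions):
                arm = random.randint(0, 1)                  # micro-randomisation
                p = min(max(base_p + patient_shift + effect * arm, 0.0), 1.0)
                complied[arm].append(random.random() < p)
            if complied[0] and complied[1]:
                diffs.append(statistics.mean(complied[1]) - statistics.mean(complied[0]))
        # Each patient acts as their own control, so the patient shift cancels out.
        mean = statistics.mean(diffs)
        stderr = statistics.stdev(diffs) / len(diffs) ** 0.5
        return mean, stderr

    estimate, stderr = simulate_micro_randomised()
    print(f"estimated compliance gain per occasion: {estimate:.3f} +/- {stderr:.3f}")
    ```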

  14. The CORONIS Trial. International study of caesarean section surgical techniques: a randomised fractional, factorial trial

    PubMed Central

    2007-01-01

    Background Caesarean section is one of the most commonly performed operations on women throughout the world. Rates have increased in recent years to about 20-25% in many developed countries, while rates in other parts of the world vary widely. A variety of surgical techniques for all elements of the caesarean section operation are in use. Many have not yet been rigorously evaluated in randomised controlled trials, and it is not known whether any are associated with better outcomes for women and babies. Because huge numbers of women undergo caesarean section, even small differences in post-operative morbidity rates between techniques could translate into improved health for substantial numbers of women, and significant cost savings. Design CORONIS is a multicentre, fractional, factorial randomised controlled trial and will be conducted in centres in Argentina, Ghana, India, Kenya, Pakistan and Sudan. Women are eligible if they are undergoing their first or second caesarean section through a transverse abdominal incision. Five comparisons will be carried out in one trial, using a 2 × 2 × 2 × 2 × 2 fractional factorial design. This design has rarely been used, but is appropriate for the evaluation of several procedures which will be used together in clinical practice. The interventions are: blunt versus sharp abdominal entry; exteriorisation of the uterus for repair versus intra-abdominal repair; single versus double layer closure of the uterus; closure versus non-closure of the peritoneum (pelvic and parietal); and chromic catgut versus Polyglactin-910 for uterine repair. The primary outcome is death or maternal infectious morbidity (one or more of the following: antibiotic use for maternal febrile morbidity during postnatal hospital stay, antibiotic use for endometritis, wound infection or peritonitis) or further operative procedures; or blood transfusion. The sample size required is 15,000 women in total; at least 7,586 women in each comparison. Discussion Improvements in health from optimising caesarean section techniques are likely to be more significant in developing countries, because the rates of postoperative morbidity in these countries tend to be higher. More women could therefore benefit from improvements in techniques. Trial registration The CORONIS Trial is registered in the Current Controlled Trials registry, ISRCTN31089967. PMID:18336721

  15. Resolution V fractional factorial design for screening of factors affecting weakly basic drugs liposomal systems.

    PubMed

    Nageeb El-Helaly, Sara; Habib, Basant A; Abd El-Rahman, Mohamed K

    2018-07-01

    This study aims to investigate factors affecting liposomal systems of weakly basic drugs. A resolution V fractional factorial design (2_V^(5-1)) is used as an example of the screening designs that are best applied as a preliminary step before proceeding to detailed factor-effect or optimization studies. Five factors likely to affect liposomal systems of weakly basic drugs were investigated using amisulpride as a model drug. The factors studied were A: preparation technique, B: phosphatidylcholine (PhC) amount (mg), C: cholesterol:PhC molar ratio, D: hydration volume (ml) and E: sonication type. The levels investigated were ammonium sulphate-pH gradient technique or transmembrane zinc chelation-pH gradient technique, 200 or 400 mg, 0 or 0.5, 10 or 20 ml, and bath or probe sonication for A, B, C, D and E, respectively. The responses measured were particle size (PS, nm), zeta potential (ZP, mV) and entrapment efficiency percent (EE%). An ion-selective electrode was used as a novel method for measuring unentrapped drug concentration and calculating entrapment efficiency without the need for liposomal separation. The factors mainly affecting the studied responses were the cholesterol:PhC ratio and hydration volume for PS, the preparation technique for ZP, and the preparation technique and hydration volume for EE%. The applied 2_V^(5-1) design required only 16 trial combinations to screen the influence of five factors on liposomal systems of weakly basic drugs, which illustrates the value of screening experiments before extensive investigation of selected factors in detailed optimization studies. Copyright © 2018 Elsevier B.V. All rights reserved.
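
    The 2_V^(5-1) layout used above (16 runs for five two-level factors) can be generated from a full 2^4 design by setting the fifth factor equal to the product of the other four, i.e. the defining relation E = ABCD of a resolution V half fraction. The sketch below is generic and is not tied to the study's actual factor levels.

    ```python
    from itertools import product

    FACTORS = ["A", "B", "C", "D", "E"]   # e.g. technique, PhC amount, ratio, volume, sonication

    def half_fraction_resolution_v():
        """16-run 2^(5-1) design with defining relation E = ABCD (coded -1/+1 levels)."""
        runs = []
        for a, b, c, d in product((-1, 1), repeat=4):
            e = a * b * c * d             # generator of the half fraction
            runs.append(dict(zip(FACTORS, (a, b, c, d, e))))
        return runs

    design = half_fraction_resolution_v()
    print(len(design), "trial combinations")   # -> 16
    for run in design[:4]:
        print(run)
    ```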

  16. Determination of tailored filter sets to create rayfiles including spatial and angular resolved spectral information.

    PubMed

    Rotscholl, Ingo; Trampert, Klaus; Krüger, Udo; Perner, Martin; Schmidt, Franz; Neumann, Cornelius

    2015-11-16

    To simulate and optimize optical designs with respect to perceived color and homogeneity in commercial ray-tracing software, realistic light source models are needed. Spectral rayfiles provide angularly and spatially varying spectral information. We propose a spectral reconstruction method for creating spectral rayfiles that minimizes the number of time-consuming goniophotometric near-field measurements with optical filters. Our discussion focuses on the selection of the ideal optical filter combination for any arbitrary spectrum out of a given filter set, taking measurement uncertainties into account with Monte Carlo simulations. We minimize the simulation time by a preselection of all filter combinations, which is based on factorial design.
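
    The filter-selection step can be pictured as follows: for each candidate filter combination, the spectral reconstruction is repeated many times with simulated measurement noise and the combination with the smallest expected error is retained. The transmission values, noise level and reconstruction (a plain least-squares solve) below are invented placeholders rather than the paper's method.

    ```python
    import itertools
    import numpy as np

    # Hypothetical transmissions of five optical filters over four spectral bands.
    FILTERS = {
        "F1": [0.90, 0.60, 0.20, 0.05],
        "F2": [0.30, 0.80, 0.40, 0.10],
        "F3": [0.10, 0.40, 0.80, 0.40],
        "F4": [0.05, 0.20, 0.60, 0.90],
        "F5": [0.50, 0.50, 0.50, 0.50],
    }
    TRUE_SPECTRUM = np.array([1.0, 2.0, 3.0, 1.5])    # placeholder band powers
    RNG = np.random.default_rng(0)

    def expected_error(combo, noise_sd=0.02, trials=2000):
        """Monte Carlo estimate of the reconstruction error for one filter set."""
        T = np.array([FILTERS[f] for f in combo])     # measurement matrix
        clean = T @ TRUE_SPECTRUM
        err = 0.0
        for _ in range(trials):
            noisy = clean + RNG.normal(0.0, noise_sd, size=clean.shape)
            recon, *_ = np.linalg.lstsq(T, noisy, rcond=None)
            err += float(np.sum((recon - TRUE_SPECTRUM) ** 2)) / trials
        return err

    # Pre-select the four-filter subset whose reconstruction is least noise sensitive.
    best = min(itertools.combinations(FILTERS, 4), key=expected_error)
    print("selected filter set:", best)
    ```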

  17. Rapid recipe formulation for plasma etching of new materials

    NASA Astrophysics Data System (ADS)

    Chopra, Meghali; Zhang, Zizhuo; Ekerdt, John; Bonnecaze, Roger T.

    2016-03-01

    A fast and inexpensive scheme for etch rate prediction using flexible continuum models and Bayesian statistics is demonstrated. Bulk etch rates of MgO are predicted using a steady-state model with volume-averaged plasma parameters and classical Langmuir surface kinetics. Plasma particle and surface kinetics are modeled within a global plasma framework using single-component Metropolis-Hastings methods and limited data. The accuracy of these predictions is evaluated with synthetic and experimental etch rate data for magnesium oxide in an ICP-RIE system. This approach is compared with, and shown to be superior to, factorial models generated from JMP, a software package frequently employed for recipe creation and optimization.
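
    The Bayesian calibration step can be sketched as a single-component Metropolis-Hastings sampler over the parameters of a Langmuir-type etch-rate model. The model form, priors, proposal widths and synthetic data below are placeholders, not the paper's plasma model.

    ```python
    import math
    import random

    def etch_rate(ion_flux, k_ads, k_reac):
        """Toy Langmuir-style surface kinetics: the rate saturates with ion flux."""
        return k_reac * k_ads * ion_flux / (1.0 + k_ads * ion_flux)

    # Synthetic "measurements" generated from known parameters plus noise.
    TRUE_PARAMS = (0.05, 40.0)
    FLUX = [10, 20, 50, 100, 200, 400]
    DATA = [etch_rate(f, *TRUE_PARAMS) + random.gauss(0.0, 0.5) for f in FLUX]

    def log_likelihood(params, sigma=0.5):
        k_ads, k_reac = params
        if k_ads <= 0 or k_reac <= 0:            # flat prior restricted to positive values
            return -math.inf
        sse = sum((d - etch_rate(f, k_ads, k_reac)) ** 2 for f, d in zip(FLUX, DATA))
        return -sse / (2.0 * sigma ** 2)

    def metropolis_hastings(steps=20000, step_sizes=(0.005, 2.0)):
        params = [0.1, 20.0]                     # initial guess
        current = log_likelihood(params)
        samples = []
        for _ in range(steps):
            for i in range(len(params)):         # single-component updates
                proposal = params.copy()
                proposal[i] += random.gauss(0.0, step_sizes[i])
                candidate = log_likelihood(proposal)
                if math.log(random.random()) < candidate - current:
                    params, current = proposal, candidate
            samples.append(tuple(params))
        return samples

    posterior = metropolis_hastings()[5000:]     # discard burn-in
    means = [sum(s[i] for s in posterior) / len(posterior) for i in range(2)]
    print("posterior mean k_ads, k_reac:", means)
    ```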

  18. Fully Automated Data Collection Using PAM and the Development of PAM/SPACE Reversible Cassettes

    NASA Astrophysics Data System (ADS)

    Hiraki, Masahiko; Watanabe, Shokei; Chavas, Leonard M. G.; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Wakatsuki, Soichi; Fujihashi, Masahiro; Miki, Kunio; Baba, Seiki; Ueno, Go; Yamamoto, Masaki; Suzuki, Mamoru; Nakagawa, Atsushi; Watanabe, Nobuhisa; Tanaka, Isao

    2010-06-01

    To remotely control and automatically collect data in high-throughput X-ray data collection experiments, the Structural Biology Research Center at the Photon Factory (PF) developed and installed sample exchange robots PAM (PF Automated Mounting system) at PF macromolecular crystallography beamlines; BL-5A, BL-17A, AR-NW12A and AR-NE3A. We developed and installed software that manages the flow of the automated X-ray experiments; sample exchanges, loop-centering and X-ray diffraction data collection. The fully automated data collection function has been available since February 2009. To identify sample cassettes, PAM employs a two-dimensional bar code reader. New beamlines, BL-1A at the Photon Factory and BL32XU at SPring-8, are currently under construction as part of Targeted Proteins Research Program (TPRP) by the Ministry of Education, Culture, Sports, Science and Technology of Japan. However, different robots, PAM and SPACE (SPring-8 Precise Automatic Cryo-sample Exchanger), will be installed at BL-1A and BL32XU, respectively. For the convenience of the users of both facilities, pins and cassettes for PAM and SPACE are developed as part of the TPRP.

  19. 2,4,6-Trinitrotoluene in soil and groundwater under a waste lagoon at the former Explosives Factory Maribyrnong (EFM), Victoria, Australia

    NASA Astrophysics Data System (ADS)

    Martel, Richard; Robertson, Timothy James; Doan, Minh Quan; Thiboutot, Sonia; Ampleman, Guy; Provatas, Arthur; Jenkins, Thomas

    2008-01-01

    Energetic materials contamination was investigated at the former Explosives Factory Maribyrnong, Victoria, Australia. Spectrophotometric/high performance liquid chromatography (HPLC) analysis was utilised to delineate a 5 tonne crystalline 2,4,6-trinitrotoluene (TNT) source in a former process waste lagoon that was found to be supplying contaminant leachate to the surficial clay aquitard with a maximum-recorded concentration of 7.0 ppm TNT. Groundwater within underlying sand and gravel aquifers was found to be uncontaminated due to upward hydraulic gradients resulting in slow plume development and propagation. Adsorption and microcosm test results from a parallel study were used as input parameters to simulate aqueous TNT transport in the clay aquitard using ATRANS20 software. The simulated TNT plume was localised within a few metres of the source, and at steady state, though leaching rate calculations suggest that without mitigation or other changes to the system, persistence of the source would be approximately 2,000 years. Remediation strategies may involve removal of the near surface source zone and infilling with an impermeable capping to impede leaching while facilitating ongoing natural attenuation by anaerobic degradation.

  20. Study of fault-tolerant software technology

    NASA Technical Reports Server (NTRS)

    Slivinski, T.; Broglio, C.; Wild, C.; Goldberg, J.; Levitt, K.; Hitt, E.; Webb, J.

    1984-01-01

    Presented is an overview of the current state of the art of fault-tolerant software and an analysis of quantitative techniques and models developed to assess its impact. It examines research efforts as well as experience gained from commercial application of these techniques. The paper also addresses the computer architecture and design implications on hardware, operating systems and programming languages (including Ada) of using fault-tolerant software in real-time aerospace applications. It concludes that fault-tolerant software has progressed beyond the pure research state. The paper also finds that, although not perfectly matched, newer architectural and language capabilities provide many of the notations and functions needed to effectively and efficiently implement software fault-tolerance.

  1. Experimental software engineering: Seventeen years of lessons in the SEL

    NASA Technical Reports Server (NTRS)

    Mcgarry, Frank E.

    1992-01-01

    Seven key principles developed by the Software Engineering Laboratory (SEL) at the Goddard Space Flight Center (GSFC) of the National Aeronautics and Space Administration (NASA) are described. For the past 17 years, the SEL has been experimentally analyzing the development of production software as varying techniques and methodologies are applied in this one environment. The SEL has collected, archived, and studied detailed measures from more than 100 flight dynamics projects, thereby gaining significant insight into the effectiveness of numerous software techniques, as well as extensive experience in the overall effectiveness of 'Experimental Software Engineering'. This experience has helped formulate follow-on studies in the SEL, and it has helped other software organizations better understand just what can be accomplished and what cannot be accomplished through experimentation.

  2. New Results in Software Model Checking and Analysis

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.

    2010-01-01

    This introductory article surveys new techniques, supported by automated tools, for the analysis of software to ensure reliability and safety. Special focus is on model checking techniques. The article also introduces the five papers that are enclosed in this special journal volume.

  3. The Influence of Accreditation on the Sustainability of Organizations with the Brazilian Accreditation Methodology

    PubMed Central

    de Paiva, Anderson Paulo

    2018-01-01

    This research evaluates the influence of the Brazilian accreditation methodology on the sustainability of organizations. Critical factors for implementing accreditation were also examined, including the relationships between these factors and organizational sustainability. The study was based on a survey applied to organizations accredited by ONA (National Accreditation Organization); 288 responses were received from top-level managers. Quantitative data from the measurement models were analysed with principal-component factor analysis, and the final model was evaluated using confirmatory factor analysis and structural equation modeling techniques. The results are important for identifying the factors that affect accreditation processes, providing a better understanding for accredited organizations and for Brazilian accreditation. PMID:29599939

  4. Impact of synthetic biology and metabolic engineering on industrial production of fine chemicals.

    PubMed

    Jullesson, David; David, Florian; Pfleger, Brian; Nielsen, Jens

    2015-11-15

    Industrial bio-processes for fine chemical production are increasingly relying on cell factories developed through metabolic engineering and synthetic biology. The use of high throughput techniques and automation for the design of cell factories, and especially platform strains, has played an important role in the transition from laboratory research to industrial production. Model organisms such as Saccharomyces cerevisiae and Escherichia coli remain widely used host strains for industrial production due to their robust and desirable traits. This review describes some of the bio-based fine chemicals that have reached the market, key metabolic engineering tools that have allowed this to happen and some of the companies that are currently utilizing these technologies for developing industrial production processes. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. NASA software specification and evaluation system design, part 2

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A survey and analysis of the existing methods, tools and techniques employed in the development of software are presented, along with recommendations for the construction of reliable software. Functional designs for a software specification language and a database verifier are presented.

  6. A toolbox for developing bioinformatics software

    PubMed Central

    Potrzebowski, Wojciech; Puton, Tomasz; Rother, Magdalena; Wywial, Ewa; Bujnicki, Janusz M.

    2012-01-01

    Creating useful software is a major activity of many scientists, including bioinformaticians. Nevertheless, software development in an academic setting is often unsystematic, which can lead to problems associated with maintenance and long-term availability. Unfortunately, well-documented software development methodology is difficult to adopt, and technical measures that directly improve bioinformatic programming have not been described comprehensively. We have examined 22 software projects and have identified a set of practices for software development in an academic environment. We found them useful for planning a project, supporting the involvement of experts (e.g. experimentalists), and promoting higher quality and maintainability of the resulting programs. This article describes 12 techniques that facilitate a quick start into software engineering. We describe 3 of the 22 projects in detail and give many examples to illustrate the usage of particular techniques. We expect this toolbox to be useful for many bioinformatics programming projects and for the training of scientific programmers. PMID:21803787

  7. Flight Software for the LADEE Mission

    NASA Technical Reports Server (NTRS)

    Cannon, Howard N.

    2015-01-01

    The Lunar Atmosphere and Dust Environment Explorer (LADEE) spacecraft was launched on September 6, 2013, and completed its mission on April 17, 2014 with a directed impact to the lunar surface. Its primary goals were to examine the lunar atmosphere, measure lunar dust, and demonstrate high-rate laser communications. The LADEE mission was a resounding success, achieving all mission objectives, much of which can be attributed to careful planning and preparation. This paper discusses some of the highlights from the mission and then discusses the techniques used for developing the onboard flight software. A major emphasis for the flight software was to develop it within tight schedule and cost constraints. To accomplish this, the flight software team leveraged heritage software, used model-based development techniques, and utilized an automated test infrastructure. As a result, the software was delivered on time and within budget, met all system requirements, and had very few problems in flight.

  8. MMX-I: data-processing software for multimodal X-ray imaging and tomography.

    PubMed

    Bergamaschi, Antoine; Medjoubi, Kadda; Messaoudi, Cédric; Marco, Sergio; Somogyi, Andrea

    2016-05-01

    A new multi-platform freeware has been developed for the processing and reconstruction of scanning multi-technique X-ray imaging and tomography datasets. The software platform aims to treat different scanning imaging techniques: X-ray fluorescence, phase, absorption and dark field and any of their combinations, thus providing an easy-to-use data processing tool for the X-ray imaging user community. A dedicated data input stream copes with the input and management of large datasets (several hundred GB) collected during a typical multi-technique fast scan at the Nanoscopium beamline and even on a standard PC. To the authors' knowledge, this is the first software tool that aims at treating all of the modalities of scanning multi-technique imaging and tomography experiments.

  9. Retinal Image Simulation of Subjective Refraction Techniques.

    PubMed

    Perches, Sara; Collados, M Victoria; Ares, Jorge

    2016-01-01

    Refraction techniques make it possible to determine the most appropriate sphero-cylindrical lens prescription to achieve the best possible visual quality. Among these techniques, subjective refraction (i.e., refraction guided by the patient's responses) is the most commonly used approach. In this context, this paper's main goal is to present simulation software that implements, in a virtual manner, various subjective refraction techniques, including the Jackson Cross-Cylinder (JCC) test, all relying on the observation of computer-generated retinal images. The software has also been used to evaluate visual quality when the JCC test is performed on multifocal contact lens wearers. The results reveal the software's usefulness for simulating the retinal image quality that a particular visual compensation provides. Moreover, it can help to gain deeper insight into, and improve, existing refraction techniques, and it can be used for simulated training.

  10. A comparison of time-shared vs. batch development of space software

    NASA Technical Reports Server (NTRS)

    Forthofer, M.

    1977-01-01

    In connection with a study regarding the ground support software development for the Space Shuttle, an investigation was conducted concerning the most suitable software development techniques to be employed. A time-sharing 'trial period' was used to determine whether or not time-sharing would be a cost-effective software development technique for the Ground Based Shuttle system. It was found that time-sharing substantially improved job turnaround and programmer access to the computer for the representative group of ground support programmers. Moreover, this improvement resulted in an estimated saving of over fifty programmer days during the trial period.

  11. Towards a balanced software team formation based on Belbin team role using fuzzy technique

    NASA Astrophysics Data System (ADS)

    Omar, Mazni; Hasan, Bikhtiyar; Ahmad, Mazida; Yasin, Azman; Baharom, Fauziah; Mohd, Haslina; Darus, Norida Muhd

    2016-08-01

    In software engineering (SE), team roles have a significant impact on project success. To ensure the optimal outcome of the project the team is working on, it is essential that team members are assigned to the right roles with the right characteristics. One prevalent team-role model is the Belbin team role model, and a successful team must have a balance of team roles. This study demonstrates the steps taken to determine the balance of a software team formation based on Belbin team roles using a fuzzy technique. The fuzzy technique was chosen because it allows imprecise data to be analysed and the selected criteria to be classified. In this study, two Belbin team roles, Shaper (Sh) and Plant (Pl), were chosen for assignment within the software team. Results show that the technique can be used to determine the balance of team roles. Future work will focus on validating the proposed method using empirical data in an industrial setting.
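
    A toy version of the fuzzy classification step, assuming hypothetical questionnaire scores on a 0-10 scale: ramp-shaped membership functions grade each member's affinity to the Shaper and Plant roles, and a team is flagged as balanced when both roles are covered above a threshold. None of the membership shapes or thresholds come from the study.

    ```python
    def ramp(x, a, b):
        """Increasing linear fuzzy membership: 0 below a, 1 above b."""
        if x <= a:
            return 0.0
        if x >= b:
            return 1.0
        return (x - a) / (b - a)

    def role_memberships(drive_score, creativity_score):
        """Map (illustrative) questionnaire scores to fuzzy Belbin-role degrees."""
        return {
            "Shaper": ramp(drive_score, 4, 9),
            "Plant":  ramp(creativity_score, 4, 9),
        }

    def is_balanced(team, threshold=0.5):
        """A team is balanced if some member covers each role to at least `threshold`."""
        coverage = {"Shaper": 0.0, "Plant": 0.0}
        for member in team:
            for role, degree in role_memberships(**member).items():
                coverage[role] = max(coverage[role], degree)
        return all(degree >= threshold for degree in coverage.values()), coverage

    team = [
        {"drive_score": 9, "creativity_score": 3},   # strong Shaper candidate
        {"drive_score": 2, "creativity_score": 7},   # moderate Plant candidate
        {"drive_score": 5, "creativity_score": 5},
    ]
    print(is_balanced(team))
    ```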

  12. Software reliability models for fault-tolerant avionics computers and related topics

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1987-01-01

    Software reliability research is briefly described. General research topics are reliability growth models, quality of software reliability prediction, the complete monotonicity property of reliability growth, conceptual modelling of software failure behavior, assurance of ultrahigh reliability, and analysis techniques for fault-tolerant systems.

  13. Relationships among Reading Performance, Locus of Control and Achievement for Marginal Admission Students.

    ERIC Educational Resources Information Center

    Pepper, Roger S.; Drexler, John A., Jr.

    The first phase of the study was a 2 x 2 factorial design, with locus of control and instructional method (lecture and demonstration) as independent variables and honor point average (HPA) as the dependent variable. The second phase used correlational techniques to test the extent to which reading performance and traditional predictors of…

  14. Software engineering techniques and CASE tools in RD13

    NASA Astrophysics Data System (ADS)

    Buono, S.; Gaponenko, I.; Jones, R.; Khodabandeh, A.; Mapelli, L.; Mornacchi, G.; Prigent, D.; Sanchez-Corral, E.; Skiadelli, M.; Toppers, A.; Duval, P. Y.; Ferrato, D.; Le Van Suu, A.; Qian, Z.; Rondot, C.; Ambrosini, G.; Fumagalli, G.; Polesello, G.; Aguer, M.; Huet, M.

    1994-12-01

    The RD13 project was approved in April 1991 for the development of a scalable data-taking system suitable for hosting various LHC studies. One of its goals is the exploitation of software engineering techniques, in order to indicate their overall suitability for data acquisition (DAQ), software design and implementation. This paper describes how such techniques have been applied to the development of components of the RD13 DAQ used in test-beam runs at CERN. We describe our experience with the Artifex CASE tool and its associated methodology. The issues raised when code generated by a CASE tool has to be integrated into an existing environment are also discussed.

  15. Simulation verification techniques study. Subsystem simulation validation techniques

    NASA Technical Reports Server (NTRS)

    Duncan, L. M.; Reddell, J. P.; Schoonmaker, P. B.

    1974-01-01

    Techniques for validation of software modules which simulate spacecraft onboard systems are discussed. An overview of the simulation software hierarchy for a shuttle mission simulator is provided. A set of guidelines for the identification of subsystem/module performance parameters and critical performance parameters are presented. Various sources of reference data to serve as standards of performance for simulation validation are identified. Environment, crew station, vehicle configuration, and vehicle dynamics simulation software are briefly discussed from the point of view of their interfaces with subsystem simulation modules. A detailed presentation of results in the area of vehicle subsystems simulation modules is included. A list of references, conclusions and recommendations are also given.

  16. A Role-Playing Game for a Software Engineering Lab: Developing a Product Line

    ERIC Educational Resources Information Center

    Zuppiroli, Sara; Ciancarini, Paolo; Gabbrielli, Maurizio

    2012-01-01

    Software product line development refers to software engineering practices and techniques for creating families of similar software systems from a basic set of reusable components, called shared assets. Teaching how to deal with software product lines in a university lab course is a challenging task, because there are several practical issues that…

  17. A Quantitative Analysis of Open Source Software's Acceptability as Production-Quality Code

    ERIC Educational Resources Information Center

    Fischer, Michael

    2011-01-01

    The difficulty in writing defect-free software has been long acknowledged both by academia and industry. A constant battle occurs as developers seek to craft software that works within aggressive business schedules and deadlines. Many tools and techniques are used in attempt to manage these software projects. Software metrics are a tool that has…

  18. Practical Methods for Estimating Software Systems Fault Content and Location

    NASA Technical Reports Server (NTRS)

    Nikora, A.; Schneidewind, N.; Munson, J.

    1999-01-01

    Over the past several years, we have developed techniques to discriminate between fault-prone software modules and those that are not, to estimate a software system's residual fault content, to identify those portions of a software system having the highest estimated number of faults, and to estimate the effects of requirements changes on software quality.

  19. A methodology for model-based development and automated verification of software for aerospace systems

    NASA Astrophysics Data System (ADS)

    Martin, L.; Schatalov, M.; Hagner, M.; Goltz, U.; Maibaum, O.

    Today's software for aerospace systems is typically very complex, due to the increasing number of features as well as the high demands on safety, reliability, and quality. This complexity also leads to significantly higher software development costs. To handle the software complexity, a structured development process is necessary. Additionally, compliance with relevant standards for quality assurance is a mandatory concern. To assure high software quality, techniques for verification are necessary. Besides traditional techniques like testing, automated verification techniques such as model checking are becoming more popular. The latter examine the whole state space and, consequently, result in full test coverage. Nevertheless, despite the obvious advantages, this technique is still rarely used for the development of aerospace systems. In this paper, we propose a tool-supported methodology for the development and formal verification of safety-critical software in the aerospace domain. The methodology relies on the V-Model and defines a comprehensive workflow for model-based software development as well as automated verification in compliance with the European standard series ECSS-E-ST-40C. Furthermore, our methodology supports the generation and deployment of code. For tool support we use SCADE Suite (Esterel Technologies), an integrated design environment that covers all the requirements of our methodology. The SCADE Suite is well established in the avionics and defense, rail transportation, energy and heavy equipment industries. For evaluation purposes, we apply our approach to an up-to-date case study of the TET-1 satellite bus; in particular, the attitude and orbit control software is considered. The behavioral models for the subsystem are developed, formally verified, and optimized.

  20. Development and validation of techniques for improving software dependability

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    1992-01-01

    A collection of document abstracts are presented on the topic of improving software dependability through NASA grant NAG-1-1123. Specific topics include: modeling of error detection; software inspection; test cases; Magnetic Stereotaxis System safety specifications and fault trees; and injection of synthetic faults into software.

  1. The CSSIAR v.1.00 Software: A new tool based on SIAR to assess soil redistribution using Compound Specific Stable Isotopes

    NASA Astrophysics Data System (ADS)

    Sergio, de los Santos-Villalobos; Claudio, Bravo-Linares; dos Anjos Roberto, Meigikos; Renan, Cardoso; Max, Gibbs; Andrew, Swales; Lionel, Mabit; Gerd, Dercon

    Soil erosion is one of the biggest challenges for food production around the world. Many techniques have been used to evaluate and mitigate soil degradation. Nowadays, isotopic techniques are becoming a powerful tool for assessing soil apportionment. One of the innovative techniques used is Compound Specific Stable Isotope (CSSI) analysis, which has been used to track sediments and identify their sources through the isotopic signature of δ13C in specific fatty acids. The application of this technique to soil apportionment has been developed only recently; however, there is a lack of user-friendly software for data processing and interpretation. The aim of this article is to introduce a new open-source tool, the CSSIAR v.1.00 Software, for working with data sets generated by the CSSI technique to assess soil apportionment.

  2. Manufacturing engineering: Principles for optimization

    NASA Astrophysics Data System (ADS)

    Koenig, Daniel T.

    Various subjects in the area of manufacturing engineering are addressed. The topics considered include: manufacturing engineering organization concepts and management techniques, factory capacity and loading techniques, capital equipment programs, machine tool and equipment selection and implementation, producibility engineering, methods, planning and work management, and process control engineering in job shops. Also discussed are: maintenance engineering, numerical control of machine tools, fundamentals of computer-aided design/computer-aided manufacture, computer-aided process planning and data collection, group technology basis for plant layout, environmental control and safety, and the Integrated Productivity Improvement Program.

  3. MMX-I: data-processing software for multimodal X-ray imaging and tomography

    PubMed Central

    Bergamaschi, Antoine; Medjoubi, Kadda; Messaoudi, Cédric; Marco, Sergio; Somogyi, Andrea

    2016-01-01

    A new multi-platform freeware has been developed for the processing and reconstruction of scanning multi-technique X-ray imaging and tomography datasets. The software platform aims to treat different scanning imaging techniques: X-ray fluorescence, phase, absorption and dark field and any of their combinations, thus providing an easy-to-use data processing tool for the X-ray imaging user community. A dedicated data input stream copes with the input and management of large datasets (several hundred GB) collected during a typical multi-technique fast scan at the Nanoscopium beamline and even on a standard PC. To the authors’ knowledge, this is the first software tool that aims at treating all of the modalities of scanning multi-technique imaging and tomography experiments. PMID:27140159

  4. Location and allocation decision for supply chain network of Cajeput oil (Case in XYZ company)

    NASA Astrophysics Data System (ADS)

    Mahardika, F. A.; Hisjam, M.; Widodo, B.; Kurniawan, B.

    2017-11-01

    Cajeput oil is a very promising business, and the current supply of cajeput oil in Indonesia is still insufficient because the production rate of cajeput leaves in Indonesia remains low. In Indonesia, the XYZ company manages forests in 7 regions and is currently developing its cajeput oil business. XYZ is improving the productivity of this business by planting cajeput trees in Location 3, Sragen, and, alongside the planting program, plans to construct a cajeput leaf distillery. The purpose of the research in this paper is to minimize the total cost of XYZ's cajeput oil supply chain network and to determine whether the construction of a cajeput distillery should be undertaken. The paper uses mixed integer linear programming to build the mathematical model, and IBM® ILOG® CPLEX software to minimize the total cost. From the calculation it can be seen that the minimum total cost is obtained if XYZ opens a new distillery with a capacity of 25,000 kg and a new factory with a capacity of 10,000 kg. In addition, all of the trucks owned can be used at optimal capacity. The total cost computed with IBM® ILOG® CPLEX is IDR 113,406,250.
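
    A stripped-down sketch of the kind of facility location and allocation MILP solved in the study, written here with the open-source PuLP modeller rather than IBM ILOG CPLEX; the sites, capacities, supplies and costs are invented placeholders, not the paper's data.

    ```python
    import pulp

    # Hypothetical data: candidate distillery sites and leaf-supply regions.
    SITES = {"Sragen": {"capacity": 25000, "open_cost": 80_000_000},
             "Site_B": {"capacity": 10000, "open_cost": 45_000_000}}
    REGIONS = {"R1": 12000, "R2": 9000, "R3": 6000}              # leaf supply (kg)
    SHIP_COST = {("Sragen", "R1"): 900, ("Sragen", "R2"): 1200, ("Sragen", "R3"): 1500,
                 ("Site_B", "R1"): 1400, ("Site_B", "R2"): 800, ("Site_B", "R3"): 700}

    prob = pulp.LpProblem("cajeput_network", pulp.LpMinimize)
    open_site = {s: pulp.LpVariable(f"open_{s}", cat="Binary") for s in SITES}
    flow = {(s, r): pulp.LpVariable(f"flow_{s}_{r}", lowBound=0)
            for s in SITES for r in REGIONS}

    # Objective: fixed opening costs plus per-kg transport costs.
    prob += (pulp.lpSum(SITES[s]["open_cost"] * open_site[s] for s in SITES)
             + pulp.lpSum(SHIP_COST[s, r] * flow[s, r] for s in SITES for r in REGIONS))

    for r, supply in REGIONS.items():        # every region's leaves must be processed
        prob += pulp.lpSum(flow[s, r] for s in SITES) == supply
    for s, info in SITES.items():            # capacity is available only if the site opens
        prob += pulp.lpSum(flow[s, r] for r in REGIONS) <= info["capacity"] * open_site[s]

    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    print("total cost:", pulp.value(prob.objective))
    print({s: int(open_site[s].value()) for s in SITES})
    ```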

  5. Modeling Student Software Testing Processes: Attitudes, Behaviors, Interventions, and Their Effects

    ERIC Educational Resources Information Center

    Buffardi, Kevin John

    2014-01-01

    Effective software testing identifies potential bugs and helps correct them, producing more reliable and maintainable software. As software development processes have evolved, incremental testing techniques have grown in popularity, particularly with introduction of test-driven development (TDD). However, many programmers struggle to adopt TDD's…

  6. Advanced software techniques for data management systems. Volume 1: Study of software aspects of the phase B space shuttle avionics system

    NASA Technical Reports Server (NTRS)

    Martin, F. H.

    1972-01-01

    An overview of the executive system design task is presented. The flight software executive system, software verification, phase B baseline avionics system review, higher order languages and compilers, and computer hardware features are also discussed.

  7. Working Notes from the 1992 AAAI Workshop on Automating Software Design. Theme: Domain Specific Software Design

    NASA Technical Reports Server (NTRS)

    Keller, Richard M. (Editor); Barstow, David; Lowry, Michael R.; Tong, Christopher H.

    1992-01-01

    The goal of this workshop is to identify different architectural approaches to building domain-specific software design systems and to explore issues unique to domain-specific (vs. general-purpose) software design. Some general issues that cut across the particular software design domain include: (1) knowledge representation, acquisition, and maintenance; (2) specialized software design techniques; and (3) user interaction and user interface.

  8. Self-assembling software generator

    DOEpatents

    Bouchard, Ann M [Albuquerque, NM; Osbourn, Gordon C [Albuquerque, NM

    2011-11-25

    A technique to generate an executable task includes inspecting a task specification data structure to determine what software entities are to be generated to create the executable task, inspecting the task specification data structure to determine how the software entities will be linked after generating the software entities, inspecting the task specification data structure to determine logic to be executed by the software entities, and generating the software entities to create the executable task.
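
    The generator described above can be illustrated with a toy version: a task specification data structure lists which software entities to create, how their outputs feed one another, and what logic each executes, and the generator assembles them into one executable task. The structure and names below are illustrative, not the patented implementation.

    ```python
    # Toy task specification: entities, their logic, and how outputs feed inputs.
    TASK_SPEC = {
        "entities": {
            "reader":  {"logic": lambda _: [3, 1, 2]},          # produces data
            "sorter":  {"logic": lambda xs: sorted(xs)},        # transforms it
            "printer": {"logic": lambda xs: print("result:", xs)},
        },
        "links": [("reader", "sorter"), ("sorter", "printer")],
    }

    def generate_task(spec):
        """Inspect the specification, build the entities, wire the links,
        and return a single executable task."""
        order = []                                   # execution order follows the links
        for src, dst in spec["links"]:
            for name in (src, dst):
                if name not in order:
                    order.append(name)

        def task():
            value = None
            for name in order:
                value = spec["entities"][name]["logic"](value)
            return value

        return task

    executable = generate_task(TASK_SPEC)
    executable()                                     # prints: result: [1, 2, 3]
    ```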

  9. An overview to CERSSO's self evaluation of the cost-benefit on the investment in occupational safety and health in the textile factories: "a step by step methodology".

    PubMed

    Amador-Rodezno, Rafael

    2005-01-01

    The Pan American Health Organization (PAHO) and CERSSO collaborated to develop a new Tool Kit (TK), which became available in May 2002. PAHO already had a TK in place, and CERSSO requested that one be developed for their needs. CERSSO wanted to enable managers and line workers in garment factories to self-diagnose plant and workstation hazards and to estimate the costs and benefits of investing in occupational safety and health (OSH) as a way to improve productivity and competitiveness. For consistency, the collaborating organizations agreed to construct the TK according to PAHO's methodology. The instrument was developed to be comprehensive enough that any user can collect the data easily. It integrates epidemiologic, risk assessment, clinic, engineering, and accountability issues, organized to include step-by-step training in: (a) performing risk assessments in the workplaces (risk factors); (b) making cause-effect relationships; (c) improving decision making on OSH interventions; (d) doing calculations of direct and indirect costs and savings; and (e) doing calculation of the overall cost-benefit of OSH interventions. Since July 2002, about 2,400 employees and officials from 736 garment factories, Ministries of Labor, Health, Social Security Institutes, and Technical Training Institutions of Central America and the Dominican Republic have used this instrument. Systematically, they have calculated a positive relationship of the investment (3 to 33 times). Employers are now aware of the financial rewards of investing in OSH. The TK is available in Spanish, Korean, and English. In July 2003, a software program in Spanish and English was developed (180 persons have been trained in the region), which requires less time to execute with better reliability.

  10. Estimated Nutritive Value of Low-Price Model Lunch Sets Provided to Garment Workers in Cambodia

    PubMed Central

    Makurat, Jan; Pillai, Aarati; Wieringa, Frank T.; Chamnan, Chhoun; Krawinkel, Michael B.

    2017-01-01

    Background: The establishment of staff canteens is expected to improve the nutritional situation of Cambodian garment workers. The objective of this study is to assess the nutritive value of low-price model lunch sets provided at a garment factory in Phnom Penh, Cambodia. Methods: Exemplary lunch sets were served to female workers through a temporary canteen at a garment factory in Phnom Penh. Dish samples were collected repeatedly to examine mean serving sizes of individual ingredients. Food composition tables and NutriSurvey software were used to assess mean amounts and contributions to recommended dietary allowances (RDAs) or adequate intake of energy, macronutrients, dietary fiber, vitamin C (VitC), iron, vitamin A (VitA), folate and vitamin B12 (VitB12). Results: On average, lunch sets provided roughly one third of RDA or adequate intake of energy, carbohydrates, fat and dietary fiber. Contribution to RDA of protein was high (46% RDA). The sets contained a high mean share of VitC (159% RDA), VitA (66% RDA), and folate (44% RDA), but were low in VitB12 (29% RDA) and iron (20% RDA). Conclusions: Overall, lunches satisfied recommendations of caloric content and macronutrient composition. Sets on average contained a beneficial amount of VitC, VitA and folate. Adjustments are needed for a higher iron content. Alternative iron-rich foods are expected to be better suited, compared to increasing portions of costly meat/fish components. Lunch provision at Cambodian garment factories holds the potential to improve food security of workers, approximately at costs of <1 USD/person/day at large scale. Data on quantitative total dietary intake as well as physical activity among workers are needed to further optimize the concept of staff canteens. PMID:28754003

  11. Testing Scientific Software: A Systematic Literature Review.

    PubMed

    Kanewala, Upulee; Bieman, James M

    2014-10-01

    Scientific software plays an important role in critical decision making, for example making weather predictions based on climate models, and computation of evidence for research publications. Recently, scientists have had to retract publications due to errors caused by software faults. Systematic testing can identify such faults in code. This study aims to identify specific challenges, proposed solutions, and unsolved problems faced when testing scientific software. We conducted a systematic literature survey to identify and analyze relevant literature. We identified 62 studies that provided relevant information about testing scientific software. We found that challenges faced when testing scientific software fall into two main categories: (1) testing challenges that occur due to characteristics of scientific software such as oracle problems and (2) testing challenges that occur due to cultural differences between scientists and the software engineering community such as viewing the code and the model that it implements as inseparable entities. In addition, we identified methods to potentially overcome these challenges and their limitations. Finally we describe unsolved challenges and how software engineering researchers and practitioners can help to overcome them. Scientific software presents special challenges for testing. Specifically, cultural differences between scientist developers and software engineers, along with the characteristics of the scientific software make testing more difficult. Existing techniques such as code clone detection can help to improve the testing process. Software engineers should consider special challenges posed by scientific software such as oracle problems when developing testing techniques.
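
    One widely used response to the oracle problem mentioned above is metamorphic testing (a technique named here for illustration, not attributed to the reviewed studies), which checks relations between outputs instead of exact expected values. The tiny example below applies two such relations to a stand-in numerical integrator.

    ```python
    def integrate(f, a, b, n=10_000):
        """Midpoint-rule integrator standing in for a piece of scientific code."""
        h = (b - a) / n
        return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

    def test_metamorphic_relations():
        f = lambda x: x ** 2 + 1.0
        a, b, m = 0.0, 2.0, 1.3
        # Relation 1: splitting the interval must not change the result.
        whole = integrate(f, a, b)
        parts = integrate(f, a, m) + integrate(f, m, b)
        assert abs(whole - parts) < 1e-6
        # Relation 2: scaling the integrand scales the integral by the same factor.
        assert abs(integrate(lambda x: 3.0 * f(x), a, b) - 3.0 * whole) < 1e-6

    test_metamorphic_relations()
    print("metamorphic relations hold")
    ```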

  12. Intelligent Extruder

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    AlperEker; Mark Giammattia; Paul Houpt

    The ''Intelligent Extruder'' described in this report is a software system, with associated support services, for monitoring and control of compounding extruders to improve material quality and reduce waste and energy use, with minimal addition of new sensors or changes to factory-floor system components. The emphasis is on process improvements to the mixing, melting and de-volatilization of base resins, fillers, pigments, fire retardants and other additives in the ''finishing'' stage of high-value-added engineering polymer materials. While GE Plastics materials were used for the experimental studies throughout the program, the concepts and principles are broadly applicable to other manufacturers' materials. The project involved a joint collaboration among GE Global Research, GE Industrial Systems and Coperion Werner & Pfleiderer, USA, a major manufacturer of compounding equipment. The scope of the program included development of algorithms for monitoring process material viscosity without rheological sensors or waste streams, a novel scheme for rapid detection of process upsets, and an adaptive feedback control system to compensate for process upsets where at-line adjustments are feasible. The software algorithms were implemented and tested on a laboratory-scale extruder (50 lb/hr) at GE Global Research, and data from a production-scale system (2,000 lb/hr) at GE Plastics were used to validate the monitoring and detection software. Although not evaluated experimentally, a new concept for extruder process monitoring through estimation of high-frequency drive torque without strain gauges is developed and demonstrated in simulation. A plan to commercialize the software system is outlined, but commercialization has not been completed.

  13. Factorials of real negative and imaginary numbers - A new perspective.

    PubMed

    Thukral, Ashwani K

    2014-01-01

    Presently, factorials of real negative numbers and imaginary numbers, except for zero and negative integers, are interpolated using Euler's gamma function. In the present paper, the concept of factorials is generalised to apply to real and imaginary numbers and to multifactorials. New functions based on Euler's factorial function are proposed for the factorials of real negative and imaginary numbers. Under the present concept, the factorials of real negative numbers are complex numbers; the factorials of real negative integers have an imaginary part equal to zero and are thus real numbers. Similarly, the factorials of imaginary numbers are complex numbers. The moduli of the complex factorials of real negative numbers and of imaginary numbers are equal to the factorials of the corresponding real positive numbers. Fractional factorials and multifactorials are defined in a new perspective. The proposed concept is also extended to Euler's gamma function for real negative numbers and imaginary numbers, and to the beta function.
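
    For reference, the classical interpolation that the paper generalises is Euler's gamma function; a minimal statement of that standard background (not of the paper's proposed new functions) is:

    ```latex
    % Classical Euler interpolation of the factorial (textbook background only):
    \[
      n! = \Gamma(n+1), \qquad
      \Gamma(z) = \int_{0}^{\infty} t^{\,z-1} e^{-t}\, dt \quad (\operatorname{Re} z > 0),
    \]
    % Since \Gamma has simple poles at z = 0, -1, -2, \ldots, this classical
    % interpolation leaves the factorials of negative integers undefined.
    ```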

  14. The cost of software fault tolerance

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1982-01-01

    The proposed use of software fault tolerance techniques as a means of reducing software costs in avionics and as a means of addressing the issue of system unreliability due to faults in software is examined. A model is developed to provide a view of the relationships among cost, redundancy, and reliability which suggests strategies for software development and maintenance which are not conventional.

  15. Translations on North Korea No. 622

    DTIC Science & Technology

    1978-10-13

    Contents include reports on the Pyongyang Power Station, 5 July Electric Factory, Hamhung Machine Tool Factory, Kosan Plastic Pipe Factory, Sog'wangea Plastic Pipe Factory, 8 August Factory, Double Chollima Hamhung Disabled Veterans' Plastic Goods Factory, Mangyongdae Machine Tool Factory, Kangso Coal Mine and Tongdaewon Garment Factory (21 Jul 78 p 4), innovation in machine tool production (NC 21 Jul 78 p 2), and a 10 percent rise in coal production during the "days of combat".

  16. [Shoes stitched, workers unstitched: a study on working and health conditions among women factory workers in the footwear industry in Franca, São Paulo State, Brazil].

    PubMed

    Prazeres, Taísa Junqueira; Navarro, Vera Lucia

    2011-10-01

    This study aimed to analyze associations between working conditions and health problems reported by women workers assigned to mechanical stitching in the footwear industry in Franca, São Paulo State, Brazil. The qualitative study's theory and methodology were based on historical and dialectical materialism and combined sociological and ethnographic research techniques. Data were collected with taped interviews, focusing on the workers' life and work stories, systematic observation of the work process, consultation of historical documents, and imagistic production. Analysis of the data revealed the effects of work in mechanical stitching on the health of women workers employed in the factory and at home, who experience precarious labor conditions involving workday intensification and extension, preset production targets, job insecurity, and unhealthy workplaces.

  17. Software resilience and the effectiveness of software mitigation in microcontrollers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quinn, Heather; Baker, Zachary; Fairbanks, Tom

    Commercially available microprocessors could be useful to the space community for noncritical computations. There are many possible components that are smaller, lower-power, and less expensive than traditional radiation-hardened microprocessors. Many commercial microprocessors have issues with single-event effects (SEEs), such as single-event upsets (SEUs) and single-event transients (SETs), that can cause the microprocessor to calculate an incorrect result or crash. In this paper we present the Trikaya technique for masking SEUs and SETs through software mitigation techniques. Furthermore, test results show that this technique can be very effective at masking errors, making it possible to fly these microprocessors for a variety of missions.

  18. Software resilience and the effectiveness of software mitigation in microcontrollers

    DOE PAGES

    Quinn, Heather; Baker, Zachary; Fairbanks, Tom; ...

    2015-12-01

    Commercially available microprocessors could be useful to the space community for noncritical computations. There are many possible components that are smaller, lower-power, and less expensive than traditional radiation-hardened microprocessors. Many commercial microprocessors have issues with single-event effects (SEEs), such as single-event upsets (SEUs) and single-event transients (SETs), that can cause the microprocessor to calculate an incorrect result or crash. In this paper we present the Trikaya technique for masking SEUs and SETs through software mitigation techniques. Furthermore, test results show that this technique can be very effective at masking errors, making it possible to fly these microprocessors for a variety of missions.

  19. Development and Confirmatory Factory Analysis of the Achievement Task Value Scale for University Students

    ERIC Educational Resources Information Center

    Lou, Yu-Chiung; Lin, Hsiao-Fang; Lin, Chin-Wen

    2013-01-01

    The aims of the study were (a) to develop a scale to measure university students' task value and (b) to use confirmatory factor analytic techniques to investigate the construct validity of the scale. The questionnaire items were developed based on theoretical considerations and the final version contained 38 items divided into 4 subscales.…

  20. On Improved Least Squares Regression and Artificial Neural Network Meta-Models for Simulation via Control Variates

    DTIC Science & Technology

    2016-09-15

    [18] under the context of robust parameter design for simulation. Bellucci's technique is used in this research, primarily because the interior-point... Table of contents excerpt: Fundamentals of Radial Basis Neural Networks (RBNN); Design of Experiments with Neural Nets; Factorial Design.

  1. Asymmetric B-factory note

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calderon, M.

    Three main issues gave purpose to our visit to CERN, ESRF and DESY: to assess the current thinking at CERN on whether Eta, the gas desorption coefficient, would continue to decrease with continued beam cleaning; to determine whether the time between NEG reconditionings could be extended; and to acquire knowledge of the basic fabrication processes and techniques for producing copper beam vacuum chambers.

  2. Assessing the Factors Associated with Sexual Harassment among Young Female Migrant Workers in Nepal

    ERIC Educational Resources Information Center

    Puri, Mahesh; Cleland, John

    2007-01-01

    This article explores the extent of, and factors associated with, sexual harassment of young female migrant workers in the carpet and garment factories in Kathmandu Valley. Information is drawn from a survey of 550 female workers aged 14 to 19 and 12 in-depth case histories. Bivariate and multivariate techniques were applied to identify the…

  3. Bringing Adam Smith's Pin Factory to Life: Field Trips and Discussions as Forms of Experiential Learning

    ERIC Educational Resources Information Center

    Galizzi, Monica

    2014-01-01

    Educators are often aware of the need to implement a variety of teaching techniques to reach out to students with different learning styles. I describe an attempt to target multimodal learners by bringing classical economic texts and concepts to life through discussions, field visits and role playing exercises. In my Labor Economics class I…

  4. Performance of mixed pine-hardwood stands 16 years after fell-and-burn treatments

    Treesearch

    Elizabeth M. Blizzard; David H. van Lear; G. Geoff Wang; Thomas A. Waldrop

    2006-01-01

    Four variations of the fell-and-burn technique were compared for height and volume production on dry Piedmont sites. A two-factor factorial randomized complete block design (winter versus spring felling, with and without a summer burn) was implemented, followed by planting of loblolly pine (Pinus taeda L.) at 15 x 15 foot spacing. After 16 growing seasons...

  5. Expert system verification and validation study: ES V/V Workshop

    NASA Technical Reports Server (NTRS)

    French, Scott; Hamilton, David

    1992-01-01

    The primary purpose of this document is to build a foundation for applying principles of verification and validation (V&V) to expert systems. To achieve this, some background on V&V as applied to conventionally implemented software is required. Part one will discuss this background from the perspective of (1) what V&V of software is and (2) V&V's role in developing software. Part one will also overview some common analysis techniques that are applied when performing V&V of software. All of these materials are presented on the assumption that the reader has little or no background in V&V or in developing procedural software. The primary purpose of part two is to explain the major techniques that have been developed for V&V of expert systems.

  6. Combining chemometric tools for assessing hazard sources and factors acting simultaneously in contaminated areas. Case study: "Mar Piccolo" Taranto (South Italy).

    PubMed

    Mali, Matilda; Dell'Anna, Maria Michela; Notarnicola, Michele; Damiani, Leonardo; Mastrorilli, Piero

    2017-10-01

    Almost all marine coastal ecosystems possess complex structural and dynamic characteristics, which are influenced by both anthropogenic causes and natural processes. Revealing the impact of the sources and factors controlling the spatial distributions of contaminants within highly polluted areas is a fundamental propaedeutic step in their quality evaluation. The combination of different pattern recognition techniques, applied to one of the most polluted Mediterranean coastal basins, resulted in a more reliable hazard assessment. PCA/CA and factorial ANOVA were exploited as complementary techniques for apprehending the impact of multiple sources and factors acting simultaneously and leading to similarities or differences in the spatial contamination pattern. The combination of PCA/CA and factorial ANOVA allowed, on the one hand, determination of the main processes and factors controlling the contamination trend within different layers and different basins and, on the other hand, ascertainment of possible synergistic effects. This approach showed the significance of the spatially representative overview given by the combination of PCA-CA/ANOVA in inferring the historical anthropogenic sources loading on the area.
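
    As a generic illustration of the pattern-recognition side of such an analysis (a sketch, not the authors' workflow), the following applies principal component analysis to a contaminant concentration matrix with scikit-learn; the file name and column layout are hypothetical.

      # PCA sketch for exploring contamination patterns across samples.
      # "sediment_contaminants.csv" is a placeholder data source.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      # rows = sediment samples, columns = contaminant concentrations
      X = np.loadtxt("sediment_contaminants.csv", delimiter=",", skiprows=1)

      X_std = StandardScaler().fit_transform(X)   # autoscale each contaminant
      pca = PCA(n_components=3)
      scores = pca.fit_transform(X_std)           # sample scores on PC1-PC3

      print("Explained variance ratio:", pca.explained_variance_ratio_)
      print("Loadings (components x variables):", pca.components_)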

  7. Semantic Entity-Component State Management Techniques to Enhance Software Quality for Multimodal VR-Systems.

    PubMed

    Fischbach, Martin; Wiebusch, Dennis; Latoschik, Marc Erich

    2017-04-01

    Modularity, modifiability, reusability, and API usability are important software qualities that determine the maintainability of software architectures. Virtual, Augmented, and Mixed Reality (VR, AR, MR) systems, modern computer games, as well as interactive human-robot systems often include various dedicated input-, output-, and processing subsystems. These subsystems collectively maintain a real-time simulation of a coherent application state. The resulting interdependencies between individual state representations, mutual state access, overall synchronization, and flow of control imply a close conceptual coupling, whereas software quality asks for decoupling to develop maintainable solutions. This article presents five semantics-based software techniques that address this contradiction: semantic grounding, code from semantics, grounded actions, semantic queries, and decoupling by semantics. These techniques are applied to extend the well-established entity-component-system (ECS) pattern to overcome some of this pattern's deficits with respect to the implied state access. A walk-through of central implementation aspects of a multimodal (speech and gesture) VR interface is used to highlight the techniques' benefits. This use case is chosen as a prototypical example of the complex architectures with multiple interacting subsystems found in many VR, AR and MR architectures. Finally, implementation hints are given, lessons learned regarding maintainability are pointed out, and performance implications are discussed.
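
    To make the baseline concrete, the sketch below shows a bare entity-component-system (ECS) arrangement of the kind the article extends; it illustrates only the direct state access that the semantics-based techniques aim to manage, not the semantic machinery itself, and all names are illustrative.

      # Minimal ECS sketch: entities are plain IDs, components are data
      # records keyed by entity, and systems iterate over the component
      # stores they need. Illustrative only.
      from dataclasses import dataclass

      @dataclass
      class Position:
          x: float
          y: float

      @dataclass
      class Velocity:
          dx: float
          dy: float

      class World:
          def __init__(self):
              self.next_id = 0
              self.positions = {}   # entity id -> Position
              self.velocities = {}  # entity id -> Velocity

          def create_entity(self):
              self.next_id += 1
              return self.next_id

      def movement_system(world, dt):
          # Direct, untyped state access like this is the coupling that the
          # article's semantic techniques are meant to decouple.
          for eid, vel in world.velocities.items():
              pos = world.positions.get(eid)
              if pos is not None:
                  pos.x += vel.dx * dt
                  pos.y += vel.dy * dt

      world = World()
      e = world.create_entity()
      world.positions[e] = Position(0.0, 0.0)
      world.velocities[e] = Velocity(1.0, 0.5)
      movement_system(world, dt=0.016)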

  8. State transition storyboards: A tool for designing the Goldstone solar system radar data acquisition system user interface software

    NASA Technical Reports Server (NTRS)

    Howard, S. D.

    1987-01-01

    Effective user interface design in software systems is a complex task that takes place without adequate modeling tools. By combining state transition diagrams and the storyboard technique of filmmakers, State Transition Storyboards were developed to provide a detailed modeling technique for the Goldstone Solar System Radar Data Acquisition System human-machine interface. Illustrations are included with a description of the modeling technique.

  9. Level 1 Daq System for Kloe

    NASA Astrophysics Data System (ADS)

    Aloisio, A.; Cavaliere, S.; Cevenini, F.; Della Volpe, D.; Merola, L.; Anastasio, A.; Fiore, D. J.

    KLOE is a general purpose detector optimized to observe CP violation in K0 decays. This detector will be installed at the DAΦNE Φ-factory, in Frascati (Italy) and it is expected to run at the end of 1997. The KLOE DAQ system can be divided mainly into the front-end fast readout section (the Level 1 DAQ), the FDDI Switch and the processor farm. The total bandwidth requirement is estimated to be of the order of 50 Mbyte/s. In this paper, we describe the Level 1 DAQ section, which is based on custom protocols and hardware controllers, developed to achieve high data transfer rates and event building capabilities without software overhead.

  10. ISIS and META projects

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth; Cooper, Robert; Marzullo, Keith

    1990-01-01

    The ISIS project has developed a new methodology, virtual synchrony, for writing robust distributed software. High performance multicast, large scale applications, and wide area networks are the focus of interest. Several interesting applications that exploit the strengths of ISIS, including an NFS-compatible replicated file system, are being developed. The META project addresses distributed control in a soft real-time environment incorporating feedback. This domain encompasses examples as diverse as monitoring inventory and consumption on a factory floor, and performing load-balancing on a distributed computing system. One of the first uses of META is for distributed application management: the tasks of configuring a distributed program, dynamically adapting to failures, and monitoring its performance. Recent progress and current plans are reported.

  11. AADL and Model-based Engineering

    DTIC Science & Technology

    2014-10-20

    [Extraction residue from briefing slides (Feiler, Oct 20, 2014, © 2014 Carnegie Mellon University); recoverable fragments note that we rely on embedded software systems for safe aircraft operation, that data stream characteristics such as latency cause confusion between the application developer, system engineer, and hardware engineer, and ask why system-level failures still occur despite fault tolerance techniques being deployed in systems.]

  12. Highly selective and efficient removal of lead with magnetic nano-adsorbent: Multivariate optimization, isotherm and thermodynamic studies.

    PubMed

    Khani, Rouhollah; Sobhani, Sara; Beyki, Mostafa Hossein

    2016-03-15

    2-Hydroxyethylammonium sulfonate immobilized on γ-Fe2O3 nanoparticles (γ-Fe2O3-2-HEAS) was synthesized by the reaction of n-butylsulfonated γ-Fe2O3 with ethanolamine. The structure of the resulting product was confirmed by Fourier transform infrared (FT-IR) spectroscopy, X-ray diffraction (XRD), transmission electron microscopy (TEM), thermogravimetric analysis (TGA), elemental analysis, N2 adsorption-desorption, and vibrating sample magnetometer (VSM) techniques. The supported ionic liquid on γ-Fe2O3 was applied as a new and green adsorbent to remove Pb(II) from aqueous solution. The effects of adsorption parameters such as pH, shaking time, and amount of adsorbent were investigated using a two-level, three-factor (2^3) full factorial central composite design with the help of the Design-Expert (Stat-Ease Inc.) version 9.0 software. The significance of the independent variables and their interactions was tested by analysis of variance (ANOVA) with 95% confidence limits (α = 0.05). The thermodynamic parameters of the adsorption process were estimated, and the process was found to be exothermic and spontaneous. The Langmuir and Freundlich models were also applied to evaluate the removal efficiency, and the data correlated well with the Freundlich model.

  13. Educational Software: A Developer's Perspective.

    ERIC Educational Resources Information Center

    Armstrong, Timothy C.; Loane, Russell F.

    1994-01-01

    Examines the current status and short-term future of computer software development in higher education. Topics discussed include educational advantages of software; current program development techniques, including object oriented programming; and market trends, including IBM versus Macintosh and multimedia programs. (LRW)

  14. Software for Automated Image-to-Image Co-registration

    NASA Technical Reports Server (NTRS)

    Benkelman, Cody A.; Hughes, Heidi

    2007-01-01

    The project objectives are to: a) develop software to fine-tune image-to-image co-registration, presuming images are orthorectified prior to input; b) create a reusable software development kit (SDK) to enable incorporation of these tools into other software; c) provide automated testing for quantitative analysis; and d) develop software that applies multiple techniques to achieve subpixel precision in the co-registration of image pairs.

  15. Selection of morphoagronomic descriptors for the characterization of accessions of cassava of the Eastern Brazilian Amazon.

    PubMed

    Silva, R S; Moura, E F; Farias-Neto, J T; Ledo, C A S; Sampaio, J E

    2017-04-13

    The aim of this study was to select morphoagronomic descriptors to characterize cassava accessions representative of Eastern Brazilian Amazonia. A total of 262 accessions were characterized using 21 qualitative descriptors. The multiple-correspondence analysis (MCA) technique was applied using the following criteria: contribution of the descriptor to the last factorial axis of the analysis in successive cycles (SMCA); reverse order of the descriptor's contribution to the last factorial axis of the analysis with all descriptors (O'p), following Jolliffe's method; mean of the descriptor's contribution orders on the first three factorial axes of the analysis with all descriptors (Os), together with O'p; and order of the weighted mean contribution on the first three factorial axes of the analysis with all descriptors (Oz). The dissimilarity coefficient was measured by the method for multicategorical variables. The correlation between the matrix generated with all descriptors and the matrices based on each criterion varied (r = 0.21, r = 0.97, r = 0.98, and r = 0.13 for SMCA, Os, Oz, and O'p, respectively). The least informative descriptors were discarded independently according to both the Os and Oz criteria. Thirteen descriptors were capable of discriminating the accessions and representing the morphological variability of the accessions sampled in Brazilian Eastern Amazonia: color of apical leaves, petiole color, color of stem exterior, external color of storage root, color of stem cortex, color of root pulp, texture of root epidermis, color of leaf vein, color of stem epidermis, color of end branches of adult plant, branching habit, root shape, and constriction of root.

  16. The simulation of air recirculation and fire/explosion phenomena within a semiconductor factory.

    PubMed

    I, Yet-Pole; Chiu, Yi-Long; Wu, Shi-Jen

    2009-04-30

    The semiconductor industry is a collection of capital-intensive firms that employ a variety of hazardous chemicals and engage in the design and fabrication of semiconductor devices. Owing to its processing characteristics, the fully confined structure of the fabrication area (fab) and the vertical airflow ventilation design restrict the applications of traditional consequence analysis techniques that are commonly used in other industries. This adverse situation also limits the advancement of fire/explosion prevention design for the industry. In this research, a realistic model of a semiconductor factory with a fab, sub-fabrication area, supply air plenum, and return air plenum structures was constructed, and a computational fluid dynamics algorithm was employed to simulate the possible fire/explosion range and its severity. The semiconductor factory has fan module units with high efficiency particulate air filters that keep the airflow uniform within the cleanroom. This condition was modeled by 25 fans, three layers of porous ceiling, and one layer of porous floor. The obtained results predicted the real airflow pattern in the semiconductor factory very well. Different released gases, leak locations, and leak rates were applied to investigate their influence on the hazard range and severity. Common mitigation measures such as a water spray system and a pressure relief panel were also included to study their potential effectiveness in relieving thermal radiation and overpressure hazards within a fab. The semiconductor industry can use this simulation procedure as a reference on how to implement a consequence analysis for a flammable gas release accident within an air recirculation cleanroom.

  17. A Critical Appraisal of Techniques, Software Packages, and Standards for Quantitative Proteomic Analysis

    PubMed Central

    Lawless, Craig; Hubbard, Simon J.; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R.

    2012-01-01

    Abstract New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we focus on the review of some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards to support the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and a guide to the selection and use of different pipelines to perform quantitative proteomics data analysis. We have produced a web-based tool (http://www.proteosuite.org/?q=other_resources) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616

  18. More flexibility in representing geometric distortion in astronomical images

    NASA Astrophysics Data System (ADS)

    Shupe, David L.; Laher, Russ R.; Storrie-Lombardi, Lisa; Surace, Jason; Grillmair, Carl; Levitan, David; Sesar, Branimir

    2012-09-01

    A number of popular software tools in the public domain are used by astronomers, professional and amateur alike, but some of the tools that have similar purposes cannot be easily interchanged, owing to the lack of a common standard. For the case of image distortion, SCAMP and SExtractor, available from Astromatic.net, perform astrometric calibration and source-object extraction on image data, and image-data geometric distortion is computed in celestial coordinates with polynomial coefficients stored in the FITS header with the PVi_j keywords. Another widely-used astrometric-calibration service, Astrometry.net, solves for distortion in pixel coordinates using the SIP convention that was introduced by the Spitzer Science Center. Up until now, due to the complexity of these distortion representations, it was very difficult to use the output of one of these packages as input to the other. New Python software, along with faster-computing C-language translations, has been developed at the Infrared Processing and Analysis Center (IPAC) to convert FITS-image headers from PV to SIP and vice versa. It is now possible to straightforwardly use Astrometry.net for astrometric calibration and then SExtractor for source-object extraction. The new software also enables astrometric calibration by SCAMP followed by image visualization with tools that support SIP distortion, but not PV. The software has been incorporated into the image-processing pipelines of the Palomar Transient Factory (PTF), which generate FITS images with headers containing both distortion representations. The software permits the conversion of archived images, such as those from the Spitzer Heritage Archive and the NASA/IPAC Infrared Science Archive, from SIP to PV or vice versa. This new capability renders unnecessary any new representation, such as the proposed TPV distortion convention.
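
    As a small companion illustration (not the IPAC conversion code itself), the sketch below inspects a FITS header with astropy and reports which distortion convention it appears to carry; "image.fits" is a placeholder file name.

      # Check whether a FITS image header carries SIP or PV distortion keywords.
      from astropy.io import fits

      header = fits.getheader("image.fits")

      has_sip = "A_ORDER" in header or "-SIP" in str(header.get("CTYPE1", ""))
      has_pv = any(key.startswith("PV1_") or key.startswith("PV2_")
                   for key in header.keys())

      print("SIP distortion keywords present:", has_sip)
      print("PV distortion keywords present:", has_pv)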

  19. The study on network security based on software engineering

    NASA Astrophysics Data System (ADS)

    Jia, Shande; Ao, Qian

    2012-04-01

    Developing a security policy (SP) is a sensitive task because the SP itself can lead to security weaknesses if it does not conform to the required security properties. Hence, appropriate techniques are necessary to overcome such problems. These techniques must accompany the policy throughout its deployment phases. The main contribution of this paper is the proposition of three of these activities: validation, testing, and multi-SP conflict management. Our techniques are inspired by the well-established techniques of software engineering, for which we have found some similarities with the security domain.

  20. Methods for characterizing plant fibers.

    PubMed

    Cruthers, Natasha; Carr, Debra; Niven, Brian; Girvan, Elizabeth; Laing, Raechel

    2005-08-01

    The effectiveness of different microscopy techniques for measuring the dimensions of ultimate fibers from harakeke (Phormium tenax, New Zealand flax) was investigated using a factorial experimental design. Constant variables were geographical location, location of specimens along the leaf, season (winter), individual plant, a fourth leaf from a north-facing fan, age of plant, and cultivars (two). Experimental variables were microscopy techniques and measurement axis. Measurements of width and length of harakeke ultimate fibers depended on the microscopic preparation/technique used as well as the cultivar examined. The best methods were (i) transverse sections of leaf specimens 4 microm thick, embedded in Paraplast and observed using light microscopy, and (ii) nonfixed ultimate fibers observed using scanning electron microscopy.

  1. Testing Scientific Software: A Systematic Literature Review

    PubMed Central

    Kanewala, Upulee; Bieman, James M.

    2014-01-01

    Context Scientific software plays an important role in critical decision making, for example making weather predictions based on climate models, and computation of evidence for research publications. Recently, scientists have had to retract publications due to errors caused by software faults. Systematic testing can identify such faults in code. Objective This study aims to identify specific challenges, proposed solutions, and unsolved problems faced when testing scientific software. Method We conducted a systematic literature survey to identify and analyze relevant literature. We identified 62 studies that provided relevant information about testing scientific software. Results We found that challenges faced when testing scientific software fall into two main categories: (1) testing challenges that occur due to characteristics of scientific software such as oracle problems and (2) testing challenges that occur due to cultural differences between scientists and the software engineering community such as viewing the code and the model that it implements as inseparable entities. In addition, we identified methods to potentially overcome these challenges and their limitations. Finally we describe unsolved challenges and how software engineering researchers and practitioners can help to overcome them. Conclusions Scientific software presents special challenges for testing. Specifically, cultural differences between scientist developers and software engineers, along with the characteristics of the scientific software make testing more difficult. Existing techniques such as code clone detection can help to improve the testing process. Software engineers should consider special challenges posed by scientific software such as oracle problems when developing testing techniques. PMID:25125798

  2. Remote software upload techniques in future vehicles and their performance analysis

    NASA Astrophysics Data System (ADS)

    Hossain, Irina

    Updating software in vehicle Electronic Control Units (ECUs) will become a mandatory requirement for a variety of reasons, for example, to update or fix the functionality of an existing system, to add new functionality, to remove software bugs, and to cope with evolving ITS infrastructure. Software modules of advanced vehicles can be updated using the Remote Software Upload (RSU) technique. RSU employs an infrastructure-based wireless communication technique in which the software supplier sends the software to the targeted vehicle via a roadside Base Station (BS). However, security is critically important in RSU to avoid any disasters due to malfunctions of the vehicle and to protect proprietary algorithms from hackers, competitors, or people with malicious intent. In this thesis, a mechanism for secure software upload in advanced vehicles is presented which employs mutual authentication of the software provider and the vehicle using a pre-shared authentication key before sending the software. The software packets are sent encrypted with a secret key along with a Message Digest (MD). In order to increase the security level, it is proposed that the vehicle receive more than one copy of the software, each accompanied by its MD. The vehicle installs the new software only when it receives more than one identical copy of the software. In order to validate the proposition, analytical expressions for the average number of packet transmissions required for a successful software update are determined. Different cases are investigated depending on the vehicle's buffer size and verification method. The analytical and simulation results show that it is sufficient to send two copies of the software to the vehicle to thwart any security attack while uploading the software. The above-mentioned unicast method for RSU is suitable when software needs to be uploaded to a single vehicle. Since multicasting is the most efficient method of group communication, updating software in the ECUs of a large number of vehicles could benefit from it. However, as with unicast RSU, meeting the security requirements of multicast communication, i.e., authenticity, confidentiality, and integrity of the transmitted software and access control of the group members, is challenging. In this thesis, an infrastructure-based mobile multicasting scheme for RSU in vehicle ECUs is proposed in which an ECU receives the software from a remote software distribution center using the roadside BSs as gateways. The Vehicular Software Distribution Network (VSDN) is divided into small regions, each administered by a Regional Group Manager (RGM). Two multicast Group Key Management (GKM) techniques are proposed based on the degree of trust in the BSs, named the Fully-trusted (FT) and Semi-trusted (ST) systems. Analytical models are developed to find the multicast session establishment latency and the handover latency for these two protocols. The average latency to perform mutual authentication of the software vendor and a vehicle and to send the multicast session key during multicast session initialization, and the handoff latency during a multicast session, are calculated. Analytical and simulation results show that the link establishment latency per vehicle of the proposed schemes is in the range of a few seconds, with the ST system requiring a few milliseconds more than the FT system. The handoff latency is also in the range of a few seconds, and in some cases the ST system requires less handoff time than the FT system. Thus, it is possible to build an efficient GKM protocol without placing too much trust in the BSs.
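
    As a hedged illustration of the packet-protection idea described above (not the thesis' actual protocol or key management), the sketch below encrypts a software image with a shared secret key, attaches a message digest, and accepts the update only after two identical verified copies arrive; key handling and names are hypothetical.

      # Secure-upload sketch: encrypt + digest each copy, install only when
      # more than one identical verified copy has been received.
      import hashlib
      from cryptography.fernet import Fernet

      def make_protected_copy(software_image: bytes, fernet: Fernet):
          digest = hashlib.sha256(software_image).hexdigest()
          return fernet.encrypt(software_image), digest

      def verify_copy(ciphertext: bytes, digest: str, fernet: Fernet) -> bytes:
          plaintext = fernet.decrypt(ciphertext)
          if hashlib.sha256(plaintext).hexdigest() != digest:
              raise ValueError("Digest mismatch: copy rejected")
          return plaintext

      key = Fernet.generate_key()          # stands in for the pre-shared secret
      f = Fernet(key)
      image = b"new ECU firmware build"    # placeholder software image

      copies = [make_protected_copy(image, f) for _ in range(2)]
      received = [verify_copy(ct, dg, f) for ct, dg in copies]

      if len(received) >= 2 and all(r == received[0] for r in received):
          print("Two identical verified copies received; installing update")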

  3. Software attribute visualization for high integrity software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  4. Applying Hypertext Structures to Software Documentation.

    ERIC Educational Resources Information Center

    French, James C.; And Others

    1997-01-01

    Describes a prototype system for software documentation management called SLEUTH (Software Literacy Enhancing Usefulness to Humans) being developed at the University of Virginia. Highlights include information retrieval techniques, hypertext links that are installed automatically, a WAIS (Wide Area Information Server) search engine, user…

  5. Optimization of permeability for quality improvement by using factorial design

    NASA Astrophysics Data System (ADS)

    Said, Rahaini Mohd; Miswan, Nor Hamizah; Juan, Ng Shu; Hussin, Nor Hafizah; Ahmad, Aminah; Kamal, Mohamad Ridzuan Mohamad

    2017-05-01

    Sand castings are used worldwide in the manufacturing processes of the metal casting industry, and green sand is the most commonly used sand mould type in sand casting. Defects on the surface of the casting product are one of the problems in the sand casting industry; problems related to the composition of green sand include blowholes, pinholes, shrinkage, and porosity. Our objective is to optimize the composition of green sand in order to minimize the occurrence of defects. Sand specimens with different levels of four parameters (silica sand, bentonite, coal dust, and water) were designed and prepared to undergo permeability testing. A 2^4 factorial design experiment with the four factors at different compositions was run, giving a total of 16 experimental runs. The necessary models based on the experimental design were obtained; the model had a high coefficient of determination (R2 = 0.9841), and the predicted values fitted the actual experimental data well. Using the Design-Expert software analysis, we identified that bentonite and water form the main interaction effect in the experiments. The optimal settings for the green sand composition are 100 g silica sand, 21 g bentonite, 6.5 g water, and 6 g coal dust. This composition gives a permeability number of 598.3 GP.
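
    As a generic illustration of how such a two-level, four-factor design is laid out (a sketch, not the authors' Design-Expert analysis), the following enumerates the 16 runs and estimates main effects from hypothetical permeability responses.

      # 2^4 full factorial design sketch: 16 runs over four factors coded
      # as -1 (low) and +1 (high). The response values are made up for
      # illustration; only the layout mirrors the study.
      from itertools import product

      factors = ["silica_sand", "bentonite", "water", "coal_dust"]
      design = list(product([-1, +1], repeat=len(factors)))   # 16 runs

      responses = [520, 530, 510, 545, 500, 525, 515, 560,
                   505, 540, 512, 555, 498, 535, 520, 598]   # hypothetical

      def main_effect(i):
          # Average response at the high level minus average at the low level.
          high = [y for run, y in zip(design, responses) if run[i] == +1]
          low = [y for run, y in zip(design, responses) if run[i] == -1]
          return sum(high) / len(high) - sum(low) / len(low)

      for i, name in enumerate(factors):
          print(f"Main effect of {name}: {main_effect(i):.1f}")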

  6. Assessment of land allotment support power industry in Grati, Pasuruan Regency

    NASA Astrophysics Data System (ADS)

    Muzaqqi, M. A. R.

    2017-06-01

    The industrial sector always needs land for factories as well as other supporting facilities, while on the other hand the carrying capacity of the environment is uneven, and not every area can support intensive activities such as industry. Land uses that are not adapted to this carrying capacity will cause pollution, damage, disasters, and losses for environmental users in general. The purpose of this research was to assess the environmental carrying capacity of Grati district in connection with the plan to build an industrial area in accordance with the land-use directions in the spatial plan of the Pasuruan Regency area. In this study, the land carrying capacity is assessed by comparing land capability and land use. The analysis technique used is overlay, carried out with the ArcGIS 10.1 software. The land capability parameters are adapted to the characteristics of land for industry: slopes of 0-25% are suitable, slopes of 25-45% can be developed for industry with contour improvement, and slopes above 45% are not allocated as industrial areas; the other parameters are soil types that are not prone to sliding, rainfall intensity of less than 3000 mm, landslide potential, and flood-prone lowlands. Each parameter is given a score between 1 and 5: a score of 1 is given to the land condition least favourable for industry, and a score of 5 to the condition that best supports an industrial location. The resulting scores are divided into 5 classes: very bad (5-9), bad (9.1-13), medium (13.1-17), good (17.1-21), and very good (21.1-25). The need for industrial land was calculated from the area of existing industries. Based on the research results, the land capability in the study area falls into 3 of the 5 classes: good, medium, and bad. From the comparison between the land capability and the land needs of industry, it can be concluded that the carrying capacity of the land to support industry in Grati has not yet been exceeded.

  7. Integrating Text-to-Speech Software into Pedagogically Sound Teaching and Learning Scenarios

    ERIC Educational Resources Information Center

    Rughooputh, S. D. D. V.; Santally, M. I.

    2009-01-01

    This paper presents a new technique for the delivery of classes--an instructional technique which will no doubt revolutionize teaching and learning, whether for on-campus, blended or online modules. This is based on the simple task of instructionally incorporating text-to-speech software embedded in the lecture slides that will simulate exactly the…

  8. Techniques for Down-Sampling a Measured Surface Height Map for Model Validation

    NASA Technical Reports Server (NTRS)

    Sidick, Erkin

    2012-01-01

    This software allows one to down-sample a measured surface map for model validation, not only without introducing any re-sampling errors, but also eliminating the existing measurement noise and measurement errors. The software tool implementing the two new techniques can be used in all optical model validation processes involving large space optical surfaces.

  9. Systems Security Engineering

    DTIC Science & Technology

    2010-08-22

    [Extraction residue from the report; recoverable fragments reference the International Electrotechnical Commission (IEC) and the ISO/IEC security-techniques standards "Information technology — Security techniques — Code of practice for information security management (ISO/IEC 27002)" and "Information technology — Security techniques — Information security management systems — Requirements", and mention a draft ISO standard on systems and software engineering, systems and software assurance, created by systems engineers.]

  10. Prediction of Software Reliability using Bio Inspired Soft Computing Techniques.

    PubMed

    Diwaker, Chander; Tomar, Pradeep; Poonia, Ramesh C; Singh, Vijander

    2018-04-10

    Many models have been proposed for predicting software reliability, but these reliability models are restricted to particular types of methodologies and a limited number of parameters. A number of techniques and methodologies may be used for reliability prediction, and there is a need to focus on parameter selection when estimating reliability. The reliability of a system may increase or decrease depending on the parameters selected, so it is necessary to identify the factors that most heavily affect system reliability. Nowadays, reusability is used in many areas of research; it is the basis of Component-Based Systems (CBS), and cost, time, and human skill can be saved using Component-Based Software Engineering (CBSE) concepts. CBSE metrics may be used to assess which techniques are more suitable for estimating system reliability. Soft computing is used for small as well as large-scale problems where it is difficult to find accurate results due to uncertainty or randomness. Several possibilities exist for applying soft computing techniques to problems in medicine: clinical medicine makes significant use of fuzzy logic and neural network methodologies, while basic medical science most frequently and preferably uses neural networks and genetic algorithms, and medical scientists show strong interest in applying soft computing methodologies in the genetics, physiology, radiology, cardiology, and neurology disciplines. CBSE encourages users to reuse past and existing software when making new products, providing quality while saving time, memory space, and money. This paper focuses on the assessment of commonly used soft computing techniques such as the Genetic Algorithm (GA), Neural Network (NN), Fuzzy Logic, Support Vector Machine (SVM), Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), and Artificial Bee Colony (ABC). It presents the working of these soft computing techniques and assesses them for reliability prediction; the parameters considered while estimating and predicting reliability are also discussed. This study can be used in estimating and predicting the reliability of various instruments used in medical systems, software engineering, computer engineering, and mechanical engineering. These concepts can be applied to both software and hardware to predict reliability using CBSE.
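
    As a toy illustration of one technique from the list above (not the paper's experiments), the sketch below fits a support vector regression model to hypothetical component metrics to predict a reliability score; all metrics and values are made up.

      # Support vector regression (SVR) sketch for reliability prediction.
      import numpy as np
      from sklearn.svm import SVR
      from sklearn.preprocessing import StandardScaler
      from sklearn.pipeline import make_pipeline

      # Columns: lines of code, coupling, reuse ratio (illustrative metrics).
      X = np.array([[1200, 5, 0.8],
                    [4500, 12, 0.3],
                    [800, 3, 0.9],
                    [3000, 9, 0.5],
                    [2200, 7, 0.6]])
      y = np.array([0.97, 0.82, 0.99, 0.88, 0.91])   # observed reliability

      model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
      model.fit(X, y)

      new_component = np.array([[1500, 6, 0.7]])
      print("Predicted reliability:", model.predict(new_component)[0])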

  11. Modeling and managing risk early in software development

    NASA Technical Reports Server (NTRS)

    Briand, Lionel C.; Thomas, William M.; Hetmanski, Christopher J.

    1993-01-01

    In order to improve the quality of the software development process, we need to be able to build empirical multivariate models based on data collectable early in the software process. These models need to be both useful for prediction and easy to interpret, so that remedial actions may be taken in order to control and optimize the development process. We present an automated modeling technique which can be used as an alternative to regression techniques. We show how it can be used to facilitate the identification and aid the interpretation of the significant trends which characterize 'high risk' components in several Ada systems. Finally, we evaluate the effectiveness of our technique based on a comparison with logistic regression based models.

  12. Reflections of Computing Experiences in a Steel Factory in the Early 1960s

    NASA Astrophysics Data System (ADS)

    Järvinen, Pertti

    We can best see many things from a historical perspective. What were the first pioneers doing in the information technology departments of Finnish manufacturing companies? In the early 1960s, I had a special chance to work in a steel company that had a long tradition of using rather advanced tools and methods to improve its productivity. The first computer in our company had such novel properties as movable disk packs, making direct access to stored data possible. In this paper, we describe the following issues and innovations in some depth: (a) transitioning from punched card machines to the new computer era, (b) using an advanced programming language to speed the production of new computer software, (c) drawing pictures using a line printer, (d) supporting steel making with mathematical software, (e) storing executable programs in disk memory and calling and moving them from there to core memory for execution, and (f) building a simple report generator. I also pay attention to the breakthroughs in these innovations and in this way demonstrate how some computing solutions were emerging at that time.

  13. Total fume and metal concentrations during welding in selected factories in Jeddah, Saudi Arabia.

    PubMed

    Balkhyour, Mansour Ahmed; Goknil, Mohammad Khalid

    2010-07-01

    Welding is a major industrial process used for joining metals. Occupational exposure to welding fumes is a serious occupational health problem all over the world. The degree of risk to welder's health from fumes depends on composition, concentration, and the length of exposure. The aim of this study was to investigate workers' welding fume exposure levels in some industries in Jeddah, Saudi Arabia. In each factory, the air in the breathing zone within 0.5 m from welders was sampled during 8-hour shifts. Total particulates, manganese, copper, and molybdenum concentrations of welding fumes were determined. Mean values of eight-hour average particulate concentrations measured during welding at the welders breathing zone were 6.3 mg/m(3) (Factory 1), 5.3 mg/m(3) (Factory 2), 11.3 mg/m(3) (Factory 3), 6.8 mg/m(3) (Factory 4), 4.7 mg/m(3) (Factory 5), and 3.0 mg/m(3) (Factory 6). Mean values of airborne manganese, copper, and molybdenum levels measured during welding were in the range of 0.010 mg/m(3)-0.477 mg/m(3), 0.001 mg/m(3)-0.080 mg/m(3) and 0.001 mg/m(3)-0.058 mg/m(3) respectively. Mean values of calculated equivalent exposure values were: 1.50 (Factory 1), 1.56 (Factory 2), 5.14 (Factory 3), 2.21 (Factory 4), 2.89 (Factory 5), and 1.20 (Factory 6). The welders in factories 1, 2, 3, and 4 were exposed to welding fume concentration above the SASO limit value, which may increase the risk of respiratory health problems.

  14. A simple model for factory distribution: Historical effect in an industry city

    NASA Astrophysics Data System (ADS)

    Uehara, Takashi; Sato, Kazunori; Morita, Satoru; Maeda, Yasunobu; Yoshimura, Jin; Tainaka, Kei-ichi

    2016-02-01

    The construction and discontinuance processes of factories are complicated problems in sociology. We focus on the spatial and temporal changes of factories in Hamamatsu city, Japan. Real data indicate that the clumping degree of factories decreases as the density of factories increases. To represent the spatial and temporal changes of factories, we apply the "contact process", a type of cellular automaton. This model roughly explains the dynamics of the factory distribution. We also find a "historical effect" in the spatial distribution: the recent factories are dispersed as a result of the past distribution formed during the period of the economic bubble. This effect may be related to the heavy shock in the Japanese stock market.
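
    As a generic illustration of the contact process on a lattice (grid size and rates are arbitrary and not fitted to the Hamamatsu data), the sketch below lets occupied sites ("factories") vanish at a constant rate and colonize random neighbours.

      # 2D contact process sketch on a torus: each occupied cell disappears
      # with probability `death` and tries to colonize a random neighbour
      # with probability `birth`. Parameters are illustration values only.
      import random

      SIZE, STEPS = 50, 200
      birth, death = 0.3, 0.1

      occupied = {(SIZE // 2, SIZE // 2)}   # start from a single factory

      def neighbours(cell):
          x, y = cell
          return [((x + dx) % SIZE, (y + dy) % SIZE)
                  for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]

      for _ in range(STEPS):
          next_state = set(occupied)
          for cell in occupied:
              if random.random() < death:
                  next_state.discard(cell)
              if random.random() < birth:
                  next_state.add(random.choice(neighbours(cell)))
          occupied = next_state

      print("Occupied sites after", STEPS, "steps:", len(occupied))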

  15. Symbolically Modeling Concurrent MCAPI Executions

    NASA Technical Reports Server (NTRS)

    Fischer, Topher; Mercer, Eric; Rungta, Neha

    2011-01-01

    Improper use of Inter-Process Communication (IPC) within concurrent systems often creates data races which can lead to bugs that are challenging to discover. Techniques that use Satisfiability Modulo Theories (SMT) problems to symbolically model possible executions of concurrent software have recently been proposed for use in the formal verification of software. In this work we describe a new technique for modeling executions of concurrent software that use a message passing API called MCAPI. Our technique uses an execution trace to create an SMT problem that symbolically models all possible concurrent executions and follows the same sequence of conditional branch outcomes as the provided execution trace. We check if there exists a satisfying assignment to the SMT problem with respect to specific safety properties. If such an assignment exists, it provides the conditions that lead to the violation of the property. We show how our method models behaviors of MCAPI applications that are ignored in previously published techniques.
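
    As a tiny illustration of posing an ordering question to an SMT solver (not the authors' MCAPI encoding), the sketch below uses the Z3 Python bindings to ask whether two sends can be delivered in an order that violates a property; all variable names are illustrative.

      # Toy SMT encoding of a message-ordering question with Z3.
      from z3 import Ints, Int, Solver, If, sat

      t_send1, t_send2, t_recv = Ints("t_send1 t_send2 t_recv")
      observed = Int("observed")

      s = Solver()
      # Both sends happen before the receive, at distinct times.
      s.add(t_send1 < t_recv, t_send2 < t_recv, t_send1 != t_send2)
      # The receiver observes the value of whichever send arrives last.
      s.add(observed == If(t_send1 > t_send2, 1, 2))
      # Safety property under test: the receiver should observe value 2.
      # Ask whether a schedule violating the property exists.
      s.add(observed != 2)

      if s.check() == sat:
          print("Property can be violated, e.g.:", s.model())
      else:
          print("No interleaving violates the property")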

  16. Kedalion: NASA's Adaptable and Agile Hardware/Software Integration and Test Lab

    NASA Technical Reports Server (NTRS)

    Mangieri, Mark L.; Vice, Jason

    2011-01-01

    NASA's Kedalion engineering analysis lab at Johnson Space Center is on the forefront of validating and using many contemporary avionics hardware/software development and integration techniques, which represent new paradigms to heritage NASA culture. Kedalion has validated many of the Orion hardware/software engineering techniques borrowed from the adjacent commercial aircraft avionics solution space, with the intention to build upon such techniques to better align with today's aerospace market. Using agile techniques, commercial products, early rapid prototyping, in-house expertise and tools, and customer collaboration, Kedalion has demonstrated that cost effective contemporary paradigms hold the promise to serve future NASA endeavors within a diverse range of system domains. Kedalion provides a readily adaptable solution for medium/large scale integration projects. The Kedalion lab is currently serving as an in-line resource for the project and the Multipurpose Crew Vehicle (MPCV) program.

  17. The Impact of Interactive Environment and Metacognitive Support on Academic Achievement and Transactional Distance in Online Learning

    ERIC Educational Resources Information Center

    Yilmaz, Ramazan; Keser, Hafize

    2017-01-01

    The aim of the present study is to reveal the impact of the interactive environment and metacognitive support (MS) in online learning on academic achievement and transactional distance (TD). The study is designed as 2 × 2 factorial design, and both qualitative and quantitative research techniques are used. The study was carried out on 127…

  18. Measurements of Weight Bearing Asymmetry Using the Nintendo Wii Fit Balance Board Are Not Reliable for Older Adults and Individuals With Stroke.

    PubMed

    Liuzzo, Derek M; Peters, Denise M; Middleton, Addie; Lanier, Wes; Chain, Rebecca; Barksdale, Brittany; Fritz, Stacy L

    Clinicians and researchers have used bathroom scales, balance performance monitors with feedback, postural scale analysis, and force platforms to evaluate weight bearing asymmetry (WBA). Now video game consoles offer a novel alternative for assessing this construct. By using specialized software, the Nintendo Wii Fit balance board can provide reliable measurements of WBA in healthy, young adults. However, reliability of measurements obtained using only the factory settings to assess WBA in older adults and individuals with stroke has not been established. To determine whether measurements of WBA obtained using the Nintendo Wii Fit balance board and default settings are reliable in older adults and individuals with stroke. Weight bearing asymmetry was assessed using the Nintendo Wii Fit balance board in 2 groups of participants-individuals older than 65 years (n = 41) and individuals with stroke (n = 41). Participants were given a standardized set of instructions and were not provided auditory or visual feedback. Two trials were performed. Intraclass correlation coefficients (ICC), standard error of measure (SEM), and minimal detectable change (MDC) scores were determined for each group. The ICC for the older adults sample was 0.59 (0.35-0.76) with SEM95 = 6.2% and MDC95 = 8.8%. The ICC for the sample including individuals with stroke was 0.60 (0.47-0.70) with SEM95 = 9.6% and MDC95 = 13.6%. Although measurements of WBA obtained using the Nintendo Wii Fit balance board, and its default factory settings, demonstrate moderate reliability in older adults and individuals with stroke, the relatively high associated SEM and MDC values substantially reduce the clinical utility of the Nintendo Wii Fit balance board as an assessment tool for WBA. Weight bearing asymmetry cannot be measured reliably in older adults and individuals with stroke using the Nintendo Wii Fit balance board without the use of specialized software.
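
    As a quick consistency check on the indices reported above: if the reported SEM95 already incorporates the 1.96 confidence factor, the standard relation MDC95 = 1.96 x sqrt(2) x SEM reduces to MDC95 = sqrt(2) x SEM95. The sketch below checks the reported pairs against this assumed convention.

      # Consistency check: MDC95 = sqrt(2) * SEM95 for the two samples above.
      import math

      reported = {
          "older adults": {"sem95": 6.2, "mdc95": 8.8},
          "stroke": {"sem95": 9.6, "mdc95": 13.6},
      }

      for group, vals in reported.items():
          expected_mdc = math.sqrt(2) * vals["sem95"]
          print(f"{group}: reported MDC95 = {vals['mdc95']}%, "
                f"sqrt(2) * SEM95 = {expected_mdc:.1f}%")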

  19. Measurements of Weight Bearing Asymmetry Using the Nintendo Wii Fit Balance Board Are Not Reliable for Older Adults and Individuals With Stroke

    PubMed Central

    Liuzzo, Derek M.; Peters, Denise M.; Middleton, Addie; Lanier, Wes; Chain, Rebecca; Barksdale, Brittany; Fritz, Stacy L.

    2015-01-01

    Background Clinicians and researchers have used bathroom scales, balance performance monitors with feedback, postural scale analysis, and force platforms to evaluate weight bearing asymmetry (WBA). Now video game consoles offer a novel alternative for assessing this construct. By using specialized software, the Nintendo Wii Fit balance board can provide reliable measurements of WBA in healthy, young adults. However, reliability of measurements obtained using only the factory settings to assess WBA in older adults and individuals with stroke has not been established. Purpose To determine whether measurements of WBA obtained using the Nintendo Wii Fit balance board and default settings are reliable in older adults and individuals with stroke. Methods Weight bearing asymmetry was assessed using the Nintendo Wii Fit balance board in 2 groups of participants—individuals older than 65 years (n = 41) and individuals with stroke (n = 41). Participants were given a standardized set of instructions and were not provided auditory or visual feedback. Two trials were performed. Intraclass correlation coefficients (ICC), standard error of measure (SEM), and minimal detectable change (MDC) scores were determined for each group. Results The ICC for the older adults sample was 0.59 (0.35–0.76) with SEM95= 6.2% and MDC95= 8.8%. The ICC for the sample including individuals with stroke was 0.60 (0.47–0.70) with SEM95= 9.6% and MDC95= 13.6%. Discussion Although measurements of WBA obtained using the Nintendo Wii Fit balance board, and its default factory settings, demonstrate moderate reliability in older adults and individuals with stroke, the relatively high associated SEM and MDC values substantially reduce the clinical utility of the Nintendo Wii Fit balance board as an assessment tool for WBA. Conclusions Weight bearing asymmetry cannot be measured reliably in older adults and individuals with stroke using the Nintendo Wii Fit balance board without the use of specialized software. PMID:26288237

  20. Doi-Peliti path integral methods for stochastic systems with partial exclusion

    NASA Astrophysics Data System (ADS)

    Greenman, Chris D.

    2018-09-01

    Doi-Peliti methods are developed for stochastic models with finite maximum occupation numbers per site. We provide a generalized framework for the different Fock spaces reported in the literature. Paragrassmannian techniques are then utilized to construct path integral formulations of factorial moments. We show that for many models of interest, a Magnus expansion is required to construct a suitable action, meaning actions containing a finite number of terms are not always feasible. However, for such systems, perturbative techniques are still viable, and for some examples, including carrying capacity population dynamics, and diffusion with partial exclusion, the expansions are exactly summable.

  1. Assessing the factors associated with sexual harassment among young female migrant workers in Nepal.

    PubMed

    Puri, Mahesh; Cleland, John

    2007-11-01

    This article explores the extent of, and factors associated with, sexual harassment of young female migrant workers in the carpet and garment factories in Kathmandu Valley. Information is drawn from a survey of 550 female workers aged 14 to 19 and 12 in-depth case histories. Bivariate and multivariate techniques were applied to identify the factors associated with harassment. The survey found that 1 in 10 young women had experienced sexual harassment or coercion. Those who were exposed to pornographic movies were more likely than those with no exposure to any kind of movies to report sexual harassment. Perpetrators included coworkers, boyfriends, employers, and relatives. Case histories revealed that the inability of young women to communicate effectively with their peers and sex partners, lack of self-esteem, job insecurity, and other socioeconomic problems made them vulnerable to these abuses. The results suggest the need for advocacy and a range of factory-based interventions.

  2. Development of a software safety process and a case study of its use

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    1993-01-01

    The goal of this research is to continue the development of a comprehensive approach to software safety and to evaluate the approach with a case study. The case study is a major part of the project, and it involves the analysis of a specific safety-critical system from the medical equipment domain. The particular application being used was selected because of the availability of a suitable candidate system. We consider the results to be generally applicable and in no way particularly limited by the domain. The research is concentrating on issues raised by the specification and verification phases of the software lifecycle since they are central to our previously-developed rigorous definitions of software safety. The theoretical research is based on our framework of definitions for software safety. In the area of specification, the main topics being investigated are the development of techniques for building system fault trees that correctly incorporate software issues and the development of rigorous techniques for the preparation of software safety specifications. The research results are documented. Another area of theoretical investigation is the development of verification methods tailored to the characteristics of safety requirements. Verification of the correct implementation of the safety specification is central to the goal of establishing safe software. The empirical component of this research is focusing on a case study in order to provide detailed characterizations of the issues as they appear in practice, and to provide a testbed for the evaluation of various existing and new theoretical results, tools, and techniques. The Magnetic Stereotaxis System is summarized.

  3. Simulation verification techniques study: Simulation performance validation techniques document. [for the space shuttle system

    NASA Technical Reports Server (NTRS)

    Duncan, L. M.; Reddell, J. P.; Schoonmaker, P. B.

    1975-01-01

    Techniques and support software for the efficient performance of simulation validation are discussed. Overall validation software structure, the performance of validation at various levels of simulation integration, guidelines for check case formulation, methods for real time acquisition and formatting of data from an all up operational simulator, and methods and criteria for comparison and evaluation of simulation data are included. Vehicle subsystems modules, module integration, special test requirements, and reference data formats are also described.

  4. Dose Estimating Application Software Modification: Additional Function of a Size-Specific Effective Dose Calculator and Auto Exposure Control.

    PubMed

    Kobayashi, Masanao; Asada, Yasuki; Matsubara, Kosuke; Suzuki, Shouichi; Matsunaga, Yuta; Haba, Tomonobu; Kawaguchi, Ai; Daioku, Tomihiko; Toyama, Hiroshi; Kato, Ryoichi

    2017-05-01

    Adequate dose management during computed tomography is important. In the present study, the dosimetric application software ImPACT was extended with a size-specific dose estimate calculator function and with scan settings for the auto exposure control (AEC) technique. This study aimed to assess the practicality and accuracy of the modified ImPACT software for dose estimation. We compared the conversion factors identified by the software with the values reported by the American Association of Physicists in Medicine Task Group 204 and noted similar results. Moreover, doses were calculated with the AEC technique and with a fixed tube current of 200 mA for the chest-pelvis region. The modified ImPACT software could estimate each organ dose based on the modulated tube current. The ability to perform beneficial modifications indicates the flexibility of the ImPACT software, which can be further modified for estimation of other doses.

  5. Dynamic visualization techniques for high consequence software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pollock, G.M.

    1998-02-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification. The prototype tool is described along with the requirements constraint language after a brief literature review is presented. Examples of how the tool can be used are also presented. In conclusion, the most significant advantage of this tool is to provide a first step in evaluating specification completeness, and to provide a more productive method for program comprehension and debugging. The expected payoff is increased software surety confidence, increased program comprehension, and reduced development and debugging time.

  6. NASA software specification and evaluation system: Software verification/validation techniques

    NASA Technical Reports Server (NTRS)

    1977-01-01

    NASA software requirement specifications were used in the development of a system for validating and verifying computer programs. The software specification and evaluation system (SSES) provides for the effective and efficient specification, implementation, and testing of computer software programs. The system as implemented will produce structured FORTRAN or ANSI FORTRAN programs, but the principles upon which SSES is designed allow it to be easily adapted to other high order languages.

  7. Exploring machine-learning-based control plane intrusion detection techniques in software defined optical networks

    NASA Astrophysics Data System (ADS)

    Zhang, Huibin; Wang, Yuqiao; Chen, Haoran; Zhao, Yongli; Zhang, Jie

    2017-12-01

    In software defined optical networks (SDON), the centralized control plane may encounter numerous intrusion threats which compromise the security level of provisioned services. In this paper, the issue of control plane security is studied and two machine-learning-based control plane intrusion detection techniques are proposed for SDON with properly selected features such as bandwidth, route length, etc. We validate the feasibility and efficiency of the proposed techniques by simulations. Results show that an accuracy of 83% for intrusion detection can be achieved with the proposed machine-learning-based control plane intrusion detection techniques.
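
    As a generic illustration of a feature-based detector of this kind (not the authors' model, features, or data), the sketch below trains a random forest on synthetic control-plane connection features such as requested bandwidth and route length.

      # Toy ML intrusion detector for control-plane requests; all data are synthetic.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import accuracy_score

      rng = np.random.default_rng(0)
      n = 500
      # Features: requested bandwidth (Gb/s), route length (hops), request rate (req/s)
      X = np.column_stack([rng.uniform(1, 100, n),
                           rng.integers(1, 12, n),
                           rng.uniform(0.1, 50, n)])
      # Synthetic labeling rule: flag requests with extreme bandwidth and rate.
      y = ((X[:, 0] > 80) & (X[:, 2] > 30)).astype(int)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
      print("Detection accuracy:", accuracy_score(y_te, clf.predict(X_te)))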

  8. Integrated Environment for Development and Assurance

    DTIC Science & Technology

    2015-01-26

    [Extraction residue from briefing slides (Jan 26, 2015, © 2015 Carnegie Mellon University); recoverable fragments note that we rely on embedded software systems for safe aircraft operation, that such systems introduce a new class of problems, that data stream characteristics such as latency jitter affect the application software and runtime architecture, and that embedded software is a major source of system-level failures despite deployed fault tolerance techniques.]

  9. A Component Approach to Collaborative Scientific Software Development: Tools and Techniques Utilized by the Quantum Chemistry Science Application Partnership

    DOE PAGES

    Kenny, Joseph P.; Janssen, Curtis L.; Gordon, Mark S.; ...

    2008-01-01

    Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE) has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as several other simulation areas, using the component model advocated by the Common Component Architecture (CCA) Forum. While programming models do indeed enable sound software engineering practices, the selection of programming model is just one building block in a comprehensive approach to large-scale collaborative development which must also address interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques which we have adopted to address these challenges and highlighting the advantages which the CCA approach offers for collaborative development.

  10. Making the PACS workstation a browser of image processing software: a feasibility study using inter-process communication techniques.

    PubMed

    Wang, Chunliang; Ritter, Felix; Smedby, Orjan

    2010-07-01

    To enhance the functional expandability of a picture archiving and communication system (PACS) workstation and to facilitate the integration of third-party image-processing modules, we propose a browser-server style method. In the proposed solution, the PACS workstation shows the front-end user interface defined in an XML file while the image processing software is running in the background as a server. Inter-process communication (IPC) techniques allow an efficient exchange of image data, parameters, and user input between the PACS workstation and stand-alone image-processing software. Using a predefined communication protocol, the PACS workstation developer or image processing software developer does not need detailed information about the other system, but will still be able to achieve seamless integration between the two systems, and the IPC procedure is totally transparent to the final user. A browser-server style solution was built between OsiriX (PACS workstation software) and MeVisLab (Image-Processing Software). Ten example image-processing modules were easily added to OsiriX by converting existing MeVisLab image processing networks. Image data transfer using shared memory added <10 ms of processing time while the other IPC methods cost 1-5 s in our experiments. The browser-server style communication based on IPC techniques is an appealing method that allows PACS workstation developers and image processing software developers to cooperate while focusing on different interests.
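
    As a generic illustration of why shared memory is the fastest of the IPC options mentioned, the sketch below passes an image buffer between two processes without copying it over a socket. It uses Python's multiprocessing.shared_memory as a stand-in; it is not the OsiriX/MeVisLab protocol, and the "server" processing step is a trivial placeholder.

        # Generic sketch of exchanging image data between two processes via shared
        # memory (illustrative of the IPC idea, not the OsiriX/MeVisLab protocol).
        import numpy as np
        from multiprocessing import Process, shared_memory

        def server(shm_name, shape, dtype):
            # The "image-processing server" attaches to the same buffer and works in place.
            shm = shared_memory.SharedMemory(name=shm_name)
            img = np.ndarray(shape, dtype=dtype, buffer=shm.buf)
            img += 1                           # trivial stand-in for an image-processing module
            shm.close()

        if __name__ == "__main__":
            image = np.zeros((512, 512), dtype=np.int16)       # "PACS workstation" image
            shm = shared_memory.SharedMemory(create=True, size=image.nbytes)
            shared = np.ndarray(image.shape, dtype=image.dtype, buffer=shm.buf)
            shared[:] = image                                  # server sees this buffer directly
            p = Process(target=server, args=(shm.name, image.shape, image.dtype))
            p.start(); p.join()
            print("processed pixel value:", shared[0, 0])      # -> 1
            shm.close(); shm.unlink()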

  11. Moving Target Techniques: Leveraging Uncertainty for Cyber Defense

    DTIC Science & Technology

    2015-08-24

    vulnerability (a flaw or bug that an attacker can exploit to penetrate or disrupt a system) to successfully compromise systems. Defenders, however...device drivers, numerous software applications, and hardware components. Within the cyberspace, this imbalance between a simple, one-bug attack...parsing code itself could have security-relevant software bugs. Dynamic Network Techniques in the dynamic network domain change the properties

  12. Using Software Simulators to Enhance the Learning of Digital Logic Design for the Information Technology Students

    ERIC Educational Resources Information Center

    Alsadoon, Abeer; Prasad, P. W. C.; Beg, Azam

    2017-01-01

    Making students understand the theoretical concepts of digital logic design is one of the major issues faced by academics; therefore, teachers have tried different techniques to link the theoretical information to practical knowledge. Use of software simulations is a technique for learning and practice that can be applied…

  13. Deploying an Intelligent Pairing Assistant for Air Operation Centers

    DTIC Science & Technology

    2016-06-23

    primary contributions of this case study are applying artificial intelligence techniques to a novel domain and discussing the software evaluation...their standard workflows....users for more efficient and accurate pairing? Participants in the evaluation consisted of three SMEs employed at Intelligent Software

  14. Unstructured Grid Generation Techniques and Software

    NASA Technical Reports Server (NTRS)

    Posenau, Mary-Anne K. (Editor)

    1993-01-01

    The Workshop on Unstructured Grid Generation Techniques and Software was conducted for NASA to assess its unstructured grid activities, improve the coordination among NASA centers, and promote technology transfer to industry. The proceedings represent contributions from Ames, Langley, and Lewis Research Centers, and the Johnson and Marshall Space Flight Centers. This report is a compilation of the presentations made at the workshop.

  15. Deductive Glue Code Synthesis for Embedded Software Systems Based on Code Patterns

    NASA Technical Reports Server (NTRS)

    Liu, Jian; Fu, Jicheng; Zhang, Yansheng; Bastani, Farokh; Yen, I-Ling; Tai, Ann; Chau, Savio N.

    2006-01-01

    Automated code synthesis is a constructive process that can be used to generate programs from specifications. It can, thus, greatly reduce the software development cost and time. The use of formal code synthesis approach for software generation further increases the dependability of the system. Though code synthesis has many potential benefits, the synthesis techniques are still limited. Meanwhile, components are widely used in embedded system development. Applying code synthesis to component based software development (CBSD) process can greatly enhance the capability of code synthesis while reducing the component composition efforts. In this paper, we discuss the issues and techniques for applying deductive code synthesis techniques to CBSD. For deductive synthesis in CBSD, a rule base is the key for inferring appropriate component composition. We use the code patterns to guide the development of rules. Code patterns have been proposed to capture the typical usages of the components. Several general composition operations have been identified to facilitate systematic composition. We present the technique for rule development and automated generation of new patterns from existing code patterns. A case study of using this method in building a real-time control system is also presented.
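
    A toy sketch of the rule-base idea follows: rules match a goal interface against the interfaces components provide and require, and backward-chaining emits a composition order. The component names and the single composition rule are hypothetical; this is not the paper's synthesis engine or its code-pattern rule base.

        # Toy sketch of rule-driven component composition (hypothetical component
        # names; not the paper's deductive synthesis engine).
        components = {
            "SensorReader":  {"provides": {"raw_samples"}, "requires": set()},
            "KalmanFilter":  {"provides": {"filtered_state"}, "requires": {"raw_samples"}},
            "PIDController": {"provides": {"actuator_cmd"}, "requires": {"filtered_state"}},
        }

        def synthesize(goal, available):
            """Backward-chain from the goal interface to a component pipeline."""
            for name, spec in available.items():
                if goal in spec["provides"]:
                    plan = []
                    for need in spec["requires"]:
                        plan += synthesize(need, available)   # recursively satisfy requirements
                    return plan + [name]
            raise ValueError(f"no component provides '{goal}'")

        pipeline = synthesize("actuator_cmd", components)
        print(" -> ".join(pipeline))   # SensorReader -> KalmanFilter -> PIDController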

  16. The systematic evolution of a NASA software technology, Appendix C

    NASA Technical Reports Server (NTRS)

    Deregt, M. P.; Dulfer, J. E.

    1972-01-01

    A long range program is described whose ultimate purpose is to make possible the production of software in NASA within predictable schedule and budget constraints and with major characteristics such as size, run-time, and correctness predictable within reasonable tolerances. As part of the program a pilot NASA computer center will be chosen to apply software development and management techniques systematically and determine a set which is effective. The techniques will be developed by a Technology Group, which will guide the pilot project and be responsible for its success. The application of the technology will involve a sequence of NASA programming tasks graduated from simpler ones at first to complex systems in late phases of the project. The evaluation of the technology will be made by monitoring the operation of the software at the users' installations. In this way a coherent discipline for software design, production, maintenance, and management will be evolved.

  17. Evaluation of the Next-Gen Exercise Software Interface in the NEEMO Analog

    NASA Technical Reports Server (NTRS)

    Hanson, Andrea; Kalogera, Kent; Sandor, Aniko; Hardy, Marc; Frank, Andrew; English, Kirk; Williams, Thomas; Perera, Jeevan; Amonette, William

    2017-01-01

    NSBRI (National Space Biomedical Research Institute) funded research grant to develop the 'NextGen' exercise software for the NEEMO (NASA Extreme Environment Mission Operations) analog. Develop a software architecture to integrate instructional, motivational and socialization techniques into a common portal to enhance exercise countermeasures in remote environments. Increase user efficiency and satisfaction, and institute commonality across multiple exercise systems. Utilized GUI (Graphical User Interface) design principles focused on intuitive ease of use to minimize training time and realize early user efficiency. Project requirement to test the software in an analog environment. Top Level Project Aims: 1) Improve the usability of crew interface software to exercise CMS (Crew Management System) through common app-like interfaces. 2) Introduce virtual instructional motion training. 3) Use virtual environment to provide remote socialization with family and friends, improve exercise technique, adherence, motivation and ultimately performance outcomes.

  18. GEOSTATISTICS FOR WASTE MANAGEMENT: A USER'S MANUAL FOR THE GEOPACK (VERSION 1.0) GEOSTATISTICAL SOFTWARE SYSTEM

    EPA Science Inventory

    GEOPACK, a comprehensive user-friendly geostatistical software system, was developed to help in the analysis of spatially correlated data. The software system was developed to be used by scientists, engineers, regulators, etc., with little experience in geostatistical techniques...

  19. The Case for Open Source Software: The Interactional Discourse Lab

    ERIC Educational Resources Information Center

    Choi, Seongsook

    2016-01-01

    Computational techniques and software applications for the quantitative content analysis of texts are now well established, and many qualitative data software applications enable the manipulation of input variables and the visualization of complex relations between them via interactive and informative graphical interfaces. Although advances in…

  20. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  1. Product-oriented Software Certification Process for Software Synthesis

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil

    2004-01-01

    The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.

  2. Software Reliability Issues Concerning Large and Safety Critical Software Systems

    NASA Technical Reports Server (NTRS)

    Kamel, Khaled; Brown, Barbara

    1996-01-01

    This research was undertaken to provide NASA with a survey of state-of-the-art techniques used in industry and academia to provide safe, reliable, and maintainable software to drive large systems. Such systems must match the complexity and strict safety requirements of NASA's shuttle system. In particular, the Launch Processing System (LPS) is being considered for replacement. The LPS is responsible for monitoring and commanding the shuttle during test, repair, and launch phases. NASA built this system in the 1970's using mostly hardware techniques to provide for increased reliability, but it did so often using custom-built equipment, which has not been able to keep up with current technologies. This report surveys the major techniques used in industry and academia to ensure reliability in large and critical computer systems.

  3. Does bottle type and acid-washing influence trace element analyses by ICP-MS on water samples? A test covering 62 elements and four bottle types: high density polyethene (HDPE), polypropene (PP), fluorinated ethene propene copolymer (FEP) and perfluoroalkoxy polymer (PFA).

    PubMed

    Reimann, C; Siewers, U; Skarphagen, H; Banks, D

    1999-10-01

    Groundwater samples from 15 boreholes in crystalline bedrock aquifers in South Norway (Oslo area) have been collected in parallel in five different clear plastic bottle types (high density polyethene [HDPE], polypropene [PP, two manufacturers], fluorinated ethene propene copolymer [FEP] and perfluoroalkoxy polymer [PFA]). In the cases of polyethene and polypropene, parallel samples have been collected in factory-new (unwashed) bottles and acid-washed factory-new bottles. Samples have been analysed by ICP-MS techniques for a wide range of inorganic elements down to the ppt (ng/l) range. It was found that acid-washing of factory-new flasks had no clear systematic beneficial effect on analytical results. On the contrary, for the PP-bottles concentrations of Pb and Sn were clearly elevated in the acid-washed bottles. Likewise, for the vast majority of elements, bottle type was of no importance for analytical results. For six elements (Al, Cr, Hf, Hg, Pb and Sn) some systematic differences for one or more bottle types could be tentatively discerned, but in no case was the discrepancy a major cause for concern. The most pronounced effect was for Cr, with clearly elevated concentrations returned from the samples collected in HDPE bottles, regardless of acid-washing or not. For the above six elements, FEP or PFA bottles seemed to be marginally preferable to PP and HDPE. In general, cheap HDPE, factory-new, unwashed flasks are suitable for sampling waters for ICP-MS ultra-trace analysis of the elements tested.

  4. The 1994 EUROMET collection of micrometeorites at Cap-Prudhomme, Antarctica

    NASA Astrophysics Data System (ADS)

    Maurette, M.; Immel, G.; Engrand, C.; Kurat, G.; Pillinger, C. T.

    1994-07-01

    Advance funding from IFRTP (Institut Francais pour la Recherche et pour la Technique Polaire) for micrometeorite collection at Cap-Prudhomme has allowed construction of a new micrometeorite 'factory,' conceived to greatly reduce contamination of the ultraclean ice by our activities. The potential problems include fly ash, rust grains, fuel spills 'sticking to the shoes,' and trace elements from the plasticizers used in plastic tubing. In the new factory, intended to produce and then cycle 10-15 tons of melt ice water per day, all parts exposed to water were replaced by either stainless steel or teflon. After examination with a microscope and their transfer into teflon and/or glass vials, all samples were frozen the day of their collection. The factory operated from December 15, 1993, through February 6, 1994. Problems included injuries as well as very bad weather conditions, characterized by both the heaviest snowfalls observed and unexpected gusts from a blizzard. Also, several new components of the factory did not function properly under the extreme conditions of Antarctica. Our major objective was to obtain the 'cleanest' and 'purest' 25-50 microns micrometeorites ever collected in Antarctica, for comparison with stratospheric Interplanetary Dust Particles (IDPs). We could not fulfill this objective, but we recovered the best 100-400-micron-size fraction of 'giant' micrometeorites ever collected on Earth. Our 26 daily collections are listed, referring to an 'index of quality.' Aliquots of these daily collections will be distributed to major institutions in Austria, England, the U.S.A., and Japan, to be preserved for future generations. The Antarctica ice sheet is truly a gigantic, ultraclean, and inexhaustible micrometeorite collector, but it is very tricky to recover 'ultraclean' micrometeorites from it.

  5. Ruggedized minicomputer hardware and software topics, 1981: Proceedings of the 4th ROLM MIL-SPEC Computer User's Group Conference

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Presentations of a conference on the use of ruggedized minicomputers are summarized. The following topics are discussed: (1) the role of minicomputers in the development and/or certification of commercial or military airplanes in both the United States and Europe; (2) generalized software error detection techniques; (3) real time software development tools; (4) a redundancy management research tool for aircraft navigation/flight control sensors; (5) extended memory management techniques using a high order language; and (6) some comments on establishing a system maintenance scheme. Copies of presentation slides are also included.

  6. Behavior driven testing in ALMA telescope calibration software

    NASA Astrophysics Data System (ADS)

    Gil, Juan P.; Garces, Mario; Broguiere, Dominique; Shen, Tzu-Chiang

    2016-07-01

    ALMA software development cycle includes well defined testing stages that involve developers, testers and scientists. We adapted Behavior Driven Development (BDD) to testing activities applied to Telescope Calibration (TELCAL) software. BDD is an agile technique that encourages communication between roles by defining test cases using natural language to specify features and scenarios, which allows participants to share a common language and provides a high level set of automated tests. This work describes how we implemented and maintain BDD testing for TELCAL, the infrastructure needed to support it and proposals to expand this technique to other subsystems.
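
    To illustrate how a natural-language scenario maps onto automated steps, the sketch below uses the Python behave library as one possible BDD framework (the paper does not name the framework it uses), and the feature text and step names are invented, not the TELCAL scenarios.

        # Generic behave-style step definitions (illustrative; not the actual TELCAL
        # features). The natural-language scenario would live in a .feature file:
        #
        #   Feature: Phase calibration
        #     Scenario: Calibrate a short observation
        #       Given a raw dataset with 4 antennas
        #       When the phase calibration is executed
        #       Then a calibration result is produced for every antenna
        #
        from behave import given, when, then

        @given("a raw dataset with {n:d} antennas")
        def step_dataset(context, n):
            context.dataset = {"antennas": n}

        @when("the phase calibration is executed")
        def step_run(context):
            # stand-in for invoking the calibration software under test
            context.result = {"solutions": context.dataset["antennas"]}

        @then("a calibration result is produced for every antenna")
        def step_check(context):
            assert context.result["solutions"] == context.dataset["antennas"]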

  7. Software engineering project management - A state-of-the-art report

    NASA Technical Reports Server (NTRS)

    Thayer, R. H.; Lehman, J. H.

    1977-01-01

    The management of software engineering projects in the aerospace industry was investigated. The survey assessed such features as contract type, specification preparation techniques, software documentation required by customers, planning and cost-estimating, quality control, the use of advanced program practices, software tools and test procedures, the education levels of project managers, programmers and analysts, work assignment, automatic software monitoring capabilities, design and coding reviews, production times, success rates, and organizational structure of the projects.

  8. Designing for Change: Minimizing the Impact of Changing Requirements in the Later Stages of a Spaceflight Software Project

    NASA Technical Reports Server (NTRS)

    Allen, B. Danette

    1998-01-01

    In the traditional 'waterfall' model of the software project life cycle, the Requirements Phase ends and flows into the Design Phase, which ends and flows into the Development Phase. Unfortunately, the process rarely, if ever, works so smoothly in practice. Instead, software developers often receive new requirements, or modifications to the original requirements, well after the earlier project phases have been completed. In particular, projects with shorter than ideal schedules are highly susceptible to frequent requirements changes, as the software requirements analysis phase is often forced to begin before the overall system requirements and top-level design are complete. This results in later modifications to the software requirements, even though the software design and development phases may be complete. Requirements changes received in the later stages of a software project inevitably lead to modification of existing developed software. Presented here is a series of software design techniques that can greatly reduce the impact of last-minute requirements changes. These techniques were successfully used to add built-in flexibility to two complex software systems in which the requirements were expected to (and did) change frequently. These large, real-time systems were developed at NASA Langley Research Center (LaRC) to test and control the Lidar In-Space Technology Experiment (LITE) instrument which flew aboard the space shuttle Discovery as the primary payload on the STS-64 mission.

  9. Ultrasonic non invasive techniques for microbiological instrumentation

    NASA Astrophysics Data System (ADS)

    Elvira, L.; Sierra, C.; Galán, B.; Resa, P.

    2010-01-01

    Non-invasive techniques based on ultrasounds have advantageous features to study, characterize and monitor microbiological and enzymatic reactions. These processes may change the sound speed, viscosity or particle distribution size of the medium where they take place, which makes possible their analysis using ultrasonic techniques. In this work, two different systems for the analysis of microbiological liquid media based on ultrasounds are presented. First, an industrial application based on an ultrasonic monitoring technique for microbiological growth detection in milk is shown. Such a system may improve the quality control strategies in food production factories, being able to decrease the time required to detect possible contaminations in packed products. Second, a study of the growth of Escherichia coli DH5α under different conditions is presented. It is shown that the use of ultrasonic non-invasive characterization techniques in combination with other conventional measurements like optical density provides complementary information about the metabolism of these bacteria.

  10. A Mechanized Decision Support System for Academic Scheduling.

    DTIC Science & Technology

    1986-03-01

    an operational system called software. The first step in the development phase is Design. Designers distribute software control by factoring the Data...Scheduling, Decision Support System, Software Design...scheduling system. It will also examine software-design techniques to identify the most appropriate methodology for this problem. Chapter 3 will

  11. A second generation experiment in fault-tolerant software

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1986-01-01

    The primary goal was to determine whether the application of fault tolerance to software increases its reliability if the cost of production is the same as for an equivalent non-fault-tolerant version derived from the same requirements specification. Software development protocols are discussed. The feasibility of adapting the technique of N-fold Modular Redundancy with majority voting to software design fault tolerance was studied.
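
    A minimal sketch of N-fold modular redundancy with majority voting is shown below. The three "versions" are placeholders for independently developed implementations of the same specification; the seeded fault is contrived to show the voter masking a single-version error.

        # Minimal sketch of N-version execution with majority voting (the three
        # "versions" are placeholders for independently developed implementations).
        from collections import Counter

        def version_a(x): return x * x
        def version_b(x): return x ** 2
        def version_c(x): return x * x + (1 if x == 3 else 0)   # seeded design fault

        def majority_vote(versions, x):
            outputs = [v(x) for v in versions]
            value, count = Counter(outputs).most_common(1)[0]
            if count <= len(versions) // 2:
                raise RuntimeError(f"no majority among outputs {outputs}")
            return value

        for x in range(5):
            print(x, majority_vote([version_a, version_b, version_c], x))   # fault at x=3 is outvoted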

  12. Developing Confidence Limits For Reliability Of Software

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.

    1991-01-01

    Technique developed for estimating reliability of software by use of Moranda geometric de-eutrophication model. Pivotal method enables straightforward construction of exact bounds with associated degree of statistical confidence about reliability of software. Confidence limits thus derived provide precise means of assessing quality of software. Limits take into account number of bugs found while testing and effects of sampling variation associated with random order of discovering bugs.
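
    For orientation, the geometric de-eutrophication model is commonly written with failure intensity lambda_i = D * k^(i-1), 0 < k < 1, after the (i-1)th fault is removed. The sketch below only simulates inter-failure times under that form with made-up D and k; it does not reproduce the pivotal construction of exact confidence bounds described above.

        # Sketch of the geometric de-eutrophication failure intensity,
        # lambda_i = D * k**(i-1) with 0 < k < 1, and simulated inter-failure times.
        # (Illustrative only; D and k are made-up values and the pivotal
        # confidence-bound construction is not reproduced here.)
        import random

        def simulate_interfailure_times(D=0.5, k=0.8, n_failures=10, seed=1):
            random.seed(seed)
            times = []
            for i in range(1, n_failures + 1):
                rate = D * k ** (i - 1)                    # intensity drops after each fix
                times.append(random.expovariate(rate))     # exponential time to next failure
            return times

        for i, t in enumerate(simulate_interfailure_times(), start=1):
            print(f"failure {i:2d}: expected gap {1/(0.5*0.8**(i-1)):6.1f}, sampled {t:6.1f}")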

  13. Toward Reusable Graphics Components in Ada

    DTIC Science & Technology

    1993-03-01

    Then alternatives for obtaining well-engineered reusable software components were examined. Finally, the alternatives were analyzed, and the most...reusable software components. Chapter 4 describes detailed design and implementation strategies in building a well-engineered reusable set of components in...study. 2.2 The Object-Oriented Paradigm 2.2.1 The Need for Object-Oriented Techniques. Among software engineers the software crisis is a well known

  14. Model-based software process improvement

    NASA Technical Reports Server (NTRS)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  15. Parallel Logic Programming and Parallel Systems Software and Hardware

    DTIC Science & Technology

    1989-07-29

    Conference, Dallas TX. January 1985. (55) [Rous75] Roussel, P., "PROLOG: Manuel de Reference et d'Utilisation", Groupe d'Intelligence Artificielle, Universite d...completed. Tools were provided for software development using artificial intelligence techniques. AI software for massively parallel architectures was started. 1. Introduction We describe research conducted

  16. Contributions to optimization of storage and transporting industrial goods

    NASA Astrophysics Data System (ADS)

    Babanatsas, T.; Babanatis Merce, R. M.; Glăvan, D. O.; Glăvan, A.

    2018-01-01

    Optimization of the storage and transport of industrial goods in a factory, whether from a constructive, functional, or technological point of view, is a determining parameter in scheduling the manufacturing process; the performance of the whole process depends on the correlation between these two factors (optimization and process scheduling). It is imperative to take into consideration each type of production program (range), to restrict the floor area used as much as possible, and to minimize execution times, all in order to satisfy the client's needs and to classify those needs so that a global software package (with general rules) can be defined that fulfils each client's requirements.

  17. Production Techniques for Computer-Based Learning Material.

    ERIC Educational Resources Information Center

    Moonen, Jef; Schoenmaker, Jan

    Experiences in the development of educational software in the Netherlands have included the use of individual and team approaches, the determination of software content and how it should be presented, and the organization of the entire development process, from experimental programs to prototype to final product. Because educational software is a…

  18. A Characteristics Approach to the Evaluation of Economics Software Packages.

    ERIC Educational Resources Information Center

    Lumsden, Keith; Scott, Alex

    1988-01-01

    Utilizes Bloom's Taxonomy to identify elements of teacher and student interest. Depicts the way in which these interests are developed into characteristics for use in analytically evaluating software. Illustrates the use of this evaluating technique by appraising the much used software package "Running the British Economy." (KO)

  19. Leveraging Code Comments to Improve Software Reliability

    ERIC Educational Resources Information Center

    Tan, Lin

    2009-01-01

    Commenting source code has long been a common practice in software development. This thesis, consisting of three pieces of work, made novel use of the code comments written in natural language to improve software reliability. Our solution combines Natural Language Processing (NLP), Machine Learning, Statistics, and Program Analysis techniques to…

  20. Knowledge Sharing through Pair Programming in Learning Environments: An Empirical Study

    ERIC Educational Resources Information Center

    Kavitha, R. K.; Ahmed, M. S.

    2015-01-01

    Agile software development is an iterative and incremental methodology, where solutions evolve from self-organizing, cross-functional teams. Pair programming is a type of agile software development technique where two programmers work together with one computer for developing software. This paper reports the results of the pair programming…

  1. Interactive Visualization of Assessment Data: The Software Package Mondrian

    ERIC Educational Resources Information Center

    Unlu, Ali; Sargin, Anatol

    2009-01-01

    Mondrian is state-of-the-art statistical data visualization software featuring modern interactive visualization techniques for a wide range of data types. This article reviews the capabilities, functionality, and interactive properties of this software package. Key features of Mondrian are illustrated with data from the Programme for International…

  2. Evaluating software development by analysis of changes: The data from the software engineering laboratory

    NASA Technical Reports Server (NTRS)

    1982-01-01

    An effective data collection methodology for evaluating software development methodologies was applied to four different software development projects. Goals of the data collection included characterizing changes and errors, characterizing projects and programmers, identifying effective error detection and correction techniques, and investigating ripple effects. The data collected consisted of changes (including error corrections) made to the software after code was written and baselined, but before testing began. Data collection and validation were concurrent with software development. Changes reported were verified by interviews with programmers.

  3. The influence of cutting-bill requirements on lumber yield using a fractional-factorial design part II, correlation and number of part sizes

    Treesearch

    Urs Buehlmann; D. Earl Kline; Janice K. Wiedenbeck; R., Jr. Noble

    2008-01-01

    Cutting-bill requirements, among other factors, influence the yield obtained when cutting lumber into parts. The first part of this 2-part series described how different cutting-bill part sizes, when added to an existing cutting-bill, affect lumber yield, and quantified these observations. To accomplish this, the study employed linear least squares estimation technique...

  4. Estimation of rumen outflow in dairy cows fed grass silage-based diets by use of reticular sampling as an alternative to sampling from the omasal canal

    USDA-ARS?s Scientific Manuscript database

    A study was conducted to compare nutrient flows determined by a reticular sampling technique with those made by sampling of digesta from the omasal canal. Six lactating dairy cows fitted with ruminal cannulas were used in a design with a 3 x 2 factorial arrangement of treatments and 4 periods. Trea...

  5. The Development of Educational Environment Suited to the Japan-Specific Educational Service Using Requirements Engineering Techniques: Case Study of Running Sakai with PostgreSQL

    ERIC Educational Resources Information Center

    Terawaki, Yuki; Takahashi, Yuichi; Kodama, Yasushi; Yana, Kazuo

    2011-01-01

    This paper describes an integration of different Relational Database Management System (RDBMS) of two Course Management Systems (CMS) called Sakai and the Common Factory for Inspiration and Value in Education (CFIVE). First, when the service of CMS is provided campus-wide, the problems of user support, CMS operation and customization of CMS are…

  6. Sparse and incomplete factorial matrices to screen membrane protein 2D crystallization

    PubMed Central

    Lasala, R.; Coudray, N.; Abdine, A.; Zhang, Z.; Lopez-Redondo, M.; Kirshenbaum, R.; Alexopoulos, J.; Zolnai, Z.; Stokes, D.L.; Ubarretxena-Belandia, I.

    2014-01-01

    Electron crystallography is well suited for studying the structure of membrane proteins in their native lipid bilayer environment. This technique relies on electron cryomicroscopy of two-dimensional (2D) crystals, grown generally by reconstitution of purified membrane proteins into proteoliposomes under conditions favoring the formation of well-ordered lattices. Growing these crystals presents one of the major hurdles in the application of this technique. To identify conditions favoring crystallization a wide range of factors that can lead to a vast matrix of possible reagent combinations must be screened. However, in 2D crystallization these factors have traditionally been surveyed in a relatively limited fashion. To address this problem we carried out a detailed analysis of published 2D crystallization conditions for 12 β-barrel and 138 α-helical membrane proteins. From this analysis we identified the most successful conditions and applied them in the design of new sparse and incomplete factorial matrices to screen membrane protein 2D crystallization. Using these matrices we have run 19 crystallization screens for 16 different membrane proteins totaling over 1,300 individual crystallization conditions. Six membrane proteins have yielded diffracting 2D crystals suitable for structure determination, indicating that these new matrices show promise to accelerate the success rate of membrane protein 2D crystallization. PMID:25478971

  7. 1. VIEW TO SOUTHEAST (NORTHWEST CORNER OF EDIBLE FATS FACTORY) ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VIEW TO SOUTHEAST (NORTHWEST CORNER OF EDIBLE FATS FACTORY) - Wilson's Oil House, Lard Refinery, & Edible Fats Factory, Edible Fats Factory, 2801 Southwest Fifteenth Street, Oklahoma City, Oklahoma County, OK

  8. 3. VIEW TO SOUTHWEST (NORTHEAST CORNER OF EDIBLE FATS FACTORY) ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. VIEW TO SOUTHWEST (NORTHEAST CORNER OF EDIBLE FATS FACTORY) - Wilson's Oil House, Lard Refinery, & Edible Fats Factory, Edible Fats Factory, 2801 Southwest Fifteenth Street, Oklahoma City, Oklahoma County, OK

  9. Applications of multigrid software in the atmospheric sciences

    NASA Technical Reports Server (NTRS)

    Adams, J.; Garcia, R.; Gross, B.; Hack, J.; Haidvogel, D.; Pizzo, V.

    1992-01-01

    Elliptic partial differential equations from different areas in the atmospheric sciences are efficiently and easily solved utilizing the multigrid software package named MUDPACK. It is demonstrated that the multigrid method is more efficient than other commonly employed techniques, such as Gaussian elimination and fixed-grid relaxation. The efficiency relative to other techniques, both in terms of storage requirement and computational time, increases quickly with grid size.

  10. Securing mobile code.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Link, Hamilton E.; Schroeppel, Richard Crabtree; Neumann, William Douglas

    2004-10-01

    If software is designed so that the software can issue functions that will move that software from one computing platform to another, then the software is said to be 'mobile'. There are two general areas of security problems associated with mobile code. The 'secure host' problem involves protecting the host from malicious mobile code. The 'secure mobile code' problem, on the other hand, involves protecting the code from malicious hosts. This report focuses on the latter problem. We have found three distinct camps of opinions regarding how to secure mobile code. There are those who believe special distributed hardware is necessary, those who believe special distributed software is necessary, and those who believe neither is necessary. We examine all three camps, with a focus on the third. In the distributed software camp we examine some commonly proposed techniques including Java, D'Agents and Flask. For the specialized hardware camp, we propose a cryptographic technique for 'tamper-proofing' code over a large portion of the software/hardware life cycle by careful modification of current architectures. This method culminates by decrypting/authenticating each instruction within a physically protected CPU, thereby protecting against subversion by malicious code. Our main focus is on the camp that believes that neither specialized software nor hardware is necessary. We concentrate on methods of code obfuscation to render an entire program or a data segment on which a program depends incomprehensible. The hope is to prevent or at least slow down reverse engineering efforts and to prevent goal-oriented attacks on the software and execution. The field of obfuscation is still in a state of development with the central problem being the lack of a basis for evaluating the protection schemes. We give a brief introduction to some of the main ideas in the field, followed by an in-depth analysis of a technique called 'white-boxing'. We put forth some new attacks and improvements on this method as well as demonstrating its implementation for various algorithms. We also examine cryptographic techniques to achieve obfuscation including encrypted functions and offer a new application to digital signature algorithms. To better understand the lack of security proofs for obfuscation techniques, we examine in detail general theoretical models of obfuscation. We explain the need for formal models in order to obtain provable security and the progress made in this direction thus far. Finally we tackle the problem of verifying remote execution. We introduce some methods of verifying remote exponentiation computations and some insight into generic computation checking.
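
    As a toy illustration of the table-lookup idea behind white-boxing (the secret never appears in the deployed code because it is folded into a precomputed table), consider the sketch below. It is deliberately simplified, offers no real security, and is not the report's white-box construction.

        # Toy illustration of the table-lookup idea behind "white-boxing": the secret
        # key never appears in the deployed code because it is folded into a
        # precomputed table. Deliberately simplified; not a secure construction.
        SECRET_KEY = 0x5A                                  # known only at table-generation time

        # Build-time step (run by the software publisher):
        TABLE = bytes((b ^ SECRET_KEY) for b in range(256))

        # Deployed code: transforms one byte at a time using only the table.
        def encrypt_byte(b: int) -> int:
            return TABLE[b]

        ciphertext = bytes(encrypt_byte(b) for b in b"mobile code")
        print(ciphertext.hex())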

  11. 4. SOUTHEAST CORNER OF EDIBLE FATS FACTORY (CONNECTING BUILDING ON ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. SOUTHEAST CORNER OF EDIBLE FATS FACTORY (CONNECTING BUILDING ON THE LEFT) - Wilson's Oil House, Lard Refinery, & Edible Fats Factory, Edible Fats Factory, 2801 Southwest Fifteenth Street, Oklahoma City, Oklahoma County, OK

  12. Progressive retry for software error recovery in distributed systems

    NASA Technical Reports Server (NTRS)

    Wang, Yi-Min; Huang, Yennun; Fuchs, W. K.

    1993-01-01

    In this paper, we describe a method of execution retry for bypassing software errors based on checkpointing, rollback, message reordering and replaying. We demonstrate how rollback techniques, previously developed for transient hardware failure recovery, can also be used to recover from software faults by exploiting message reordering to bypass software errors. Our approach intentionally increases the degree of nondeterminism and the scope of rollback when a previous retry fails. Examples from our experience with telecommunications software systems illustrate the benefits of the scheme.
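
    A minimal sketch of the checkpoint/rollback/replay idea follows: state is checkpointed before message delivery, and when a software error occurs the state is rolled back and the messages are replayed in a different order. The failing handler is contrived (it fails only for one particular ordering); this is not the authors' telecommunications system.

        # Minimal sketch of progressive retry: checkpoint the state, and on a software
        # error roll back and replay with a reordered message schedule. The failing
        # handler is contrived to fail for one particular ordering.
        import copy
        from itertools import permutations

        def handle(state, msg):
            state["log"].append(msg)
            if state["log"][:2] == ["B", "A"]:             # latent ordering-dependent bug
                raise RuntimeError("software error triggered by message order")

        def run_with_progressive_retry(messages):
            checkpoint = {"log": []}                       # checkpointed state before delivery
            for attempt, order in enumerate(permutations(messages), start=1):
                state = copy.deepcopy(checkpoint)          # roll back to the checkpoint
                try:
                    for msg in order:
                        handle(state, msg)
                    return state
                except RuntimeError as err:
                    print(f"attempt {attempt} with order {order} failed: {err}")
            raise RuntimeError("all reorderings exhausted")

        print(run_with_progressive_retry(["B", "A", "C"]))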

  13. Knowledge-based system verification and validation

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.

    1990-01-01

    The objective of this task is to develop and evaluate a methodology for verification and validation (V&V) of knowledge-based systems (KBS) for space station applications with high reliability requirements. The approach consists of three interrelated tasks. The first task is to evaluate the effectiveness of various validation methods for space station applications. The second task is to recommend requirements for KBS V&V for Space Station Freedom (SSF). The third task is to recommend modifications to the SSF to support the development of KBS using effective software engineering and validation techniques. To accomplish the first task, three complementary techniques will be evaluated: (1) Sensitivity Analysis (Worcester Polytechnic Institute); (2) Formal Verification of Safety Properties (SRI International); and (3) Consistency and Completeness Checking (Lockheed AI Center). During FY89 and FY90, each contractor will independently demonstrate the use of his technique on the fault detection, isolation, and reconfiguration (FDIR) KBS for the manned maneuvering unit (MMU), a rule-based system implemented in LISP. During FY91, the application of each of the techniques to other knowledge representations and KBS architectures will be addressed. After evaluation of the results of the first task and examination of Space Station Freedom V&V requirements for conventional software, a comprehensive KBS V&V methodology will be developed and documented. Development of highly reliable KBS's cannot be accomplished without effective software engineering methods. Using the results of current in-house research to develop and assess software engineering methods for KBS's as well as assessment of techniques being developed elsewhere, an effective software engineering methodology for space station KBS's will be developed, and modification of the SSF to support these tools and methods will be addressed.

  14. A Bibliography of Externally Published Works by the SEI Engineering Techniques Program

    DTIC Science & Technology

    1992-08-01

    media, and virtual reality * model-based engineering * programming languages * reuse * software architectures * software engineering as a discipline...Knowledge-Based Engineering Environments." IEEE Expert 3, 2 (May 1988): 18-23, 26-32. Audience: Practitioner [Klein89b] Klein, D.V. "Comparison of...Terms with Software Reuse Terminology: A Model-Based Approach." ACM SIGSOFT Software Engineering Notes 16, 2 (April 1991): 45-51. Audience: Practitioner

  15. Support Materials for the Software Technical Review Process

    DTIC Science & Technology

    1988-04-01

    the Software Technical Review Process Software reviewing is a general term applied to techniques for the use of human intellectual power to detect...more systematic than random. It utilizes data supplied by students, rather than relying solely on the subjective opinions of the instructor. The...The experience of other users is now essential.) • Are the resulting grades accurate? (Thus far, they appear to correlate with student grades on

  16. Software Acquisition Risk Management Key Process Area (KPA) - A Guidebook Version 1.0.

    DTIC Science & Technology

    1997-08-01

    Budget - Software Project Management Practices and Techniques. McGraw-Hill International (UK) Limited, 1992. [Boehm 81] Boehm, Barry. Software...Engineering Economics. Englewood Cliffs, N.J.: Prentice-Hall, Inc., 1981. [Boehm 89] Boehm, Barry. IEEE Tutorial on Software Risk Management. New York: IEEE...[95] [Mayrhauser 90] [Moran 90] [Myers 96] [NRC 89] [Osborn 53] [Paulk 95] [Pressman 92] [Pulford 96] [Scholtes 88] [Sisti 94] [STSC 96

  17. Improvement of Computer Software Quality through Software Automated Tools.

    DTIC Science & Technology

    1986-08-31

    requirement for increased emphasis on software quality assurance has led to the creation of various methods of verification and validation. Experience...result was a vast array of methods, systems, languages and automated tools to assist in the process. Given that the primary role of quality assurance is...Unfortunately, there is no single method, tool or technique that can ensure accurate, reliable and cost-effective software. Therefore, government and industry

  18. A Research Agenda for Service-Oriented Architecture (SOA): Maintenance and Evolution of Service-Oriented Systems

    DTIC Science & Technology

    2010-03-01

    service consumers, and infrastructure. Techniques from any iterative and incremental software development methodology followed by the organization... Service-Oriented Architecture Environment (CMU/SEI-2008-TN-008). Software Engineering Institute, Carnegie Mellon University, 2008. http://www.sei.cmu.edu...Integrating Legacy Software into a Service Oriented Architecture." Proceedings of the 10th European Conference on Software Maintenance (CSMR 2006). Bari

  19. Analysis of DDT and its metabolites in soil and water samples obtained in the vicinity of a closed-down factory in Bangladesh using various extraction methods.

    PubMed

    Al Mahmud, M N U; Khalil, Farzana; Rahman, Md Musfiqur; Mamun, M I R; Shoeb, Mohammad; Abd El-Aty, A M; Park, Jong-Hyouk; Shin, Ho-Chul; Nahar, Nilufar; Shim, Jae-Han

    2015-12-01

    This study was conducted to monitor the spread of dichlorodiphenyltrichloroethane (DDT) and its metabolites (dichlorodiphenyldichloroethylene (DDE), dichlorodiphenyldichloroethane (DDD)) in soil and water to regions surrounding a closed DDT factory in Bangladesh. This was accomplished using inter-method and inter-laboratory validation studies. DDTs (DDT and its metabolites) from soil samples were extracted using microwave-assisted extraction (MAE), supercritical fluid extraction (SFE), and solvent extraction (SE). Inter-laboratory calibration was assessed by SE, and all methods were validated by intra- and inter-day accuracy (expressed as recovery %) and precision (expressed as relative standard deviation (RSD)) in the same laboratory, at three fortified concentrations (n = 4). DDTs were extracted from water samples by liquid-liquid partitioning, and all samples were analyzed by gas chromatography (GC)-electron capture detector (ECD) and confirmed by GC/mass spectrometry (GC/MS). Linearities expressed as determination coefficients (R (2)) were ≥0.995 for matrix-matched calibrations. The recovery rate was in the range of 72-120 and 83-110%, with <15% RSD in soil and water, respectively. The limit of quantification (LOQ) was 0.0165 mg kg(-1) in soil and 0.132 μg L(-1) in water. Greater quantities of DDTs were extracted from soil using the MAE and SE techniques than with the SFE method. Higher amounts of DDTs were discovered in the southern (2.2-936 × 10(2) mg kg(-1)) or southwestern (86.3-2067 × 10(2) mg kg(-1)) direction from the factory than in the eastern direction (1.0-48.6 × 10(2) mg kg(-1)). An exception was the soil sample collected 50 ft (15.24 m) east (2904 × 10(2) mg kg(-1)) of the factory. The spread of DDTs in the water bodies (0.59-3.01 μg L(-1)) was approximately equal in all directions. We concluded that DDTs might have been dumped randomly around the warehouse after the closing of the factory.

  20. Getting expert systems off the ground: Lessons learned from integrating model-based diagnostics with prototype flight hardware

    NASA Technical Reports Server (NTRS)

    Stephan, Amy; Erikson, Carol A.

    1991-01-01

    As an initial attempt to introduce expert system technology into an onboard environment, a model based diagnostic system using the TRW MARPLE software tool was integrated with prototype flight hardware and its corresponding control software. Because this experiment was designed primarily to test the effectiveness of the model based reasoning technique used, the expert system ran on a separate hardware platform, and interactions between the control software and the model based diagnostics were limited. While this project met its objective of showing that model based reasoning can effectively isolate failures in flight hardware, it also identified the need for an integrated development path for expert system and control software for onboard applications. In developing expert systems that are ready for flight, artificial intelligence techniques must be evaluated to determine whether they offer a real advantage onboard, identify which diagnostic functions should be performed by the expert systems and which are better left to the procedural software, and work closely with both the hardware and the software developers from the beginning of a project to produce a well designed and thoroughly integrated application.

  1. Software reliability models for critical applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pham, H.; Pham, M.

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  3. Raman scattering spectroscopy for explosives identification

    NASA Astrophysics Data System (ADS)

    Nagli, L.; Gaft, M.

    2007-04-01

    Real-time detection and identification of explosives at a standoff distance is a major issue in efforts to develop defense against so-called Improvised Explosive Devices (IED). It is recognized that the only technique that is potentially capable of standoff detection of minimal amounts of explosives is laser-based spectroscopy. The LDS technique belongs to trace detection, namely to its micro-particle variety. We applied gated Raman and time-resolved luminescence spectroscopy for detection of main explosive materials, both factory-made and homemade. A Raman system was developed and tested by LDS for field remote detection and identification of minimal amounts of explosives on relevant surfaces at a distance of up to 30 meters.

  4. Software Design Improvements. Part 2; Software Quality and the Design and Inspection Process

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    The application of assurance engineering techniques improves the duration of failure-free performance of software. The totality of features and characteristics of a software product are what determine its ability to satisfy customer needs. Software in safety-critical systems is very important to NASA. We follow the System Safety Working Group's definition of system safety software: 'The optimization of system safety in the design, development, use and maintenance of software and its integration with safety-critical systems in an operational environment.' 'If it is not safe, say so' has become our motto. This paper goes over methods that have been used by NASA to make software design improvements by focusing on software quality and the design and inspection process.

  5. Novel Sessile Drop Software for Quantitative Estimation of Slag Foaming in Carbon/Slag Interactions

    NASA Astrophysics Data System (ADS)

    Khanna, Rita; Rahman, Mahfuzur; Leow, Richard; Sahajwalla, Veena

    2007-08-01

    Novel video-processing software has been developed for the sessile drop technique for a rapid and quantitative estimation of slag foaming. The data processing was carried out in two stages: the first stage involved the initial transformation of digital video/audio signals into a format compatible with computing software, and the second stage involved the computation of slag droplet volume and area of contact in a chosen video frame. Experimental results are presented on slag foaming from synthetic graphite/slag system at 1550 °C. This technique can be used for determining the extent and stability of foam as a function of time.
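
    The second-stage computation can be illustrated as below: given a binarized silhouette of an (assumed axisymmetric) droplet from one video frame and a known mm-per-pixel scale, the volume is the sum of thin disk slices and the contact area follows from the base row. The mask here is synthetic; this is not the authors' software.

        # Sketch of the second processing stage: estimate droplet volume and contact
        # area from a binarized, axisymmetric silhouette in one video frame
        # (synthetic mask; not the actual sessile-drop software).
        import numpy as np

        def droplet_volume_and_contact(mask, mm_per_px):
            """mask: 2D boolean array, True where the droplet is; base of drop = last row."""
            widths_px = mask.sum(axis=1).astype(float)          # silhouette width in each row
            radii_mm = 0.5 * widths_px * mm_per_px
            slice_height_mm = mm_per_px
            volume_mm3 = np.sum(np.pi * radii_mm**2 * slice_height_mm)   # stack of thin disks
            contact_radius_mm = radii_mm[np.nonzero(widths_px)[0][-1]]   # lowest non-empty row
            contact_area_mm2 = np.pi * contact_radius_mm**2
            return volume_mm3, contact_area_mm2

        # Synthetic hemispherical droplet of radius 2 mm at 0.05 mm/pixel.
        mm_per_px = 0.05
        R_px = int(2.0 / mm_per_px)
        y, x = np.mgrid[0:R_px, -R_px:R_px + 1]
        mask = (x**2 + (R_px - y)**2) <= R_px**2             # flat base on the last row
        vol, area = droplet_volume_and_contact(mask, mm_per_px)
        print(f"volume ~ {vol:.1f} mm^3 (exact {2/3*np.pi*2**3:.1f}), contact area ~ {area:.1f} mm^2")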

  6. Use of software engineering techniques in the design of the ALEPH data acquisition system

    NASA Astrophysics Data System (ADS)

    Charity, T.; McClatchey, R.; Harvey, J.

    1987-08-01

    The SASD methodology is being used to provide a rigorous design framework for various components of the ALEPH data acquisition system. The Entity-Relationship data model is used to describe the layout and configuration of the control and acquisition systems and detector components. State Transition Diagrams are used to specify control applications such as run control and resource management and Data Flow Diagrams assist in decomposing software tasks and defining interfaces between processes. These techniques encourage rigorous software design leading to enhanced functionality and reliability. Improved documentation and communication ensures continuity over the system life-cycle and simplifies project management.
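
    A State Transition Diagram such as the one used for run control can be encoded directly as a transition table, as in the minimal sketch below; the states and events are invented for illustration and are not the ALEPH design.

        # Minimal sketch of a run-control State Transition Diagram encoded as a
        # transition table (states and events are invented; not the ALEPH design).
        TRANSITIONS = {
            ("IDLE",       "configure"): "CONFIGURED",
            ("CONFIGURED", "start"):     "RUNNING",
            ("RUNNING",    "pause"):     "PAUSED",
            ("PAUSED",     "resume"):    "RUNNING",
            ("RUNNING",    "stop"):      "IDLE",
            ("PAUSED",     "stop"):      "IDLE",
        }

        class RunControl:
            def __init__(self):
                self.state = "IDLE"

            def handle(self, event):
                key = (self.state, event)
                if key not in TRANSITIONS:
                    raise ValueError(f"event '{event}' not allowed in state {self.state}")
                self.state = TRANSITIONS[key]
                return self.state

        rc = RunControl()
        for ev in ["configure", "start", "pause", "resume", "stop"]:
            print(ev, "->", rc.handle(ev))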

  7. Automatic Parameter Tuning for the Morpheus Vehicle Using Particle Swarm Optimization

    NASA Technical Reports Server (NTRS)

    Birge, B.

    2013-01-01

    A high fidelity simulation using a PC based Trick framework has been developed for Johnson Space Center's Morpheus test bed flight vehicle. There is an iterative development loop of refining and testing the hardware, refining the software, comparing the software simulation to hardware performance and adjusting either or both the hardware and the simulation to extract the best performance from the hardware as well as the most realistic representation of the hardware from the software. A Particle Swarm Optimization (PSO) based technique has been developed that increases speed and accuracy of the iterative development cycle. Parameters in software can be automatically tuned to make the simulation match real world subsystem data from test flights. Special considerations for scale, linearity, discontinuities, can be all but ignored with this technique, allowing fast turnaround both for simulation tune up to match hardware changes as well as during the test and validation phase to help identify hardware issues. Software models with insufficient control authority to match hardware test data can be immediately identified and using this technique requires very little to no specialized knowledge of optimization, freeing model developers to concentrate on spacecraft engineering. Integration of the PSO into the Morpheus development cycle will be discussed as well as a case study highlighting the tool's effectiveness.
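
    A compact global-best PSO sketch is given below: it tunes two parameters of a toy exponential model so the simulated output matches reference "flight" data. The model, parameters, and data are invented; this is not the Morpheus/Trick tool itself, only the standard PSO update it describes.

        # Compact particle swarm optimization sketch for parameter tuning: fit two
        # model parameters so a toy simulation matches reference "flight" data.
        # (Illustrative only; not the Morpheus/Trick integration.)
        import numpy as np

        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 5.0, 50)
        true_params = np.array([2.0, 0.7])                           # hypothetical gain, damping
        flight_data = true_params[0] * np.exp(-true_params[1] * t)   # stand-in for telemetry

        def cost(p):
            sim = p[0] * np.exp(-p[1] * t)                            # simulation under candidate params
            return np.sum((sim - flight_data) ** 2)

        n, dims, iters = 30, 2, 200
        w, c1, c2 = 0.7, 1.5, 1.5                                     # inertia and acceleration weights
        pos = rng.uniform(0.0, 5.0, size=(n, dims))
        vel = np.zeros((n, dims))
        pbest = pos.copy()
        pbest_cost = np.array([cost(p) for p in pos])
        gbest = pbest[np.argmin(pbest_cost)].copy()

        for _ in range(iters):
            r1, r2 = rng.random((n, dims)), rng.random((n, dims))
            vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
            pos = pos + vel
            costs = np.array([cost(p) for p in pos])
            improved = costs < pbest_cost
            pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
            gbest = pbest[np.argmin(pbest_cost)].copy()

        print("tuned parameters:", np.round(gbest, 3), "cost:", round(cost(gbest), 6))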

  8. Conceptualization and application of an approach for designing healthcare software interfaces.

    PubMed

    Kumar, Ajit; Maskara, Reena; Maskara, Sanjeev; Chiang, I-Jen

    2014-06-01

    The aim of this study is to conceptualize a novel approach that facilitates the design of prototype interfaces for healthcare software. Concepts and techniques from various disciplines were used to conceptualize an interface design approach named MORTARS (Map Original Rhetorical To Adapted Rhetorical Situation). The concepts and techniques included in this approach are (1) rhetorical situation - a concept of philosophy provided by Bitzer (1968); (2) move analysis - an applied linguistic technique provided by Swales (1990) and Bhatia (1993); (3) interface design guidelines - a cognitive and computer science concept provided by Johnson (2010); (4) usability evaluation instrument - an interface evaluation questionnaire provided by Lund (2001); (5) user modeling via stereotyping - a cognitive and computer science concept provided by Rich (1979). A prototype interface for outpatient clinic software was designed to introduce the underlying concepts of MORTARS. The prototype interface was evaluated by thirty-two medical informaticians. The medical informaticians found the designed prototype interface to be useful (73.3%), easy to use (71.9%), easy to learn (93.1%), and satisfactory (53.2%). The MORTARS approach was found to be effective in designing the prototype user interface for the outpatient clinic software. This approach might be further used to design interfaces for various software pertaining to healthcare and other domains.

  9. Volumetric Analysis of Alveolar Bone Defect Using Three-Dimensional-Printed Models Versus Computer-Aided Engineering.

    PubMed

    Du, Fengzhou; Li, Binghang; Yin, Ningbei; Cao, Yilin; Wang, Yongqian

    2017-03-01

    Knowing the volume of a graft is essential in repairing alveolar bone defects. This study investigates two advanced preoperative volume measurement methods: three-dimensional (3D) printing and computer-aided engineering (CAE). Ten unilateral alveolar cleft patients were enrolled in this study. Their computed tomographic data were sent to 3D printing and CAE software. A simulated graft was used on the 3D-printed model, and the graft volume was measured by water displacement. The volume calculated by CAE software used a mirror-reverse technique. The authors compared the actual volumes of the simulated grafts with the CAE software-derived volumes. The average volume of the simulated bone grafts from the 3D-printed models was 1.52 mL, higher than the mean volume of 1.47 mL calculated by CAE software. The difference between the two volumes ranged from -0.18 to 0.42 mL. The paired Student t test showed no statistically significant difference between the volumes derived from the two methods. This study demonstrated that the mirror-reversed technique in CAE software is as accurate as the simulated operation on 3D-printed models in unilateral alveolar cleft patients. These findings further validate the use of 3D printing and the CAE technique in alveolar defect repair.

  10. Comparative test-retest reliability of metabolite values assessed with magnetic resonance spectroscopy of the brain. The LCModel versus the manufacturer software.

    PubMed

    Fayed, Nicolas; Modrego, Pedro J; Medrano, Jaime

    2009-06-01

    Reproducibility is an essential strength of any diagnostic technique for cross-sectional and longitudinal studies. To compare short-term test-retest reliability in vivo, magnetic resonance spectroscopy (MRS) of the brain was analyzed using both the manufacturer's software package and the widely used linear combination of model spectra (LCModel) technique. Single-voxel H-MRS was performed in a series of patients with different pathologies on a 1.5 T clinical scanner. Four areas of the brain were explored with the point resolved spectroscopy acquisition mode; the echo time was 35 milliseconds and the repetition time was 2000 milliseconds. We enrolled 15 patients for every area, and the intra-individual variations of metabolites were studied in two consecutive scans without removing the patient from the scanner. Curve fitting and analysis of metabolites were made with the GE software and the LCModel. Spectra not fulfilling the minimum quality criteria for linewidth and signal/noise ratio were rejected. The intraclass correlation coefficients for the N-acetylaspartate/creatine (NAA/Cr) ratios were 0.93, 0.89, 0.9 and 0.8 for the posterior cingulate gyrus, occipital, prefrontal and temporal regions, respectively, with the GE software. For the LCModel, the coefficients were 0.9, 0.89, 0.87 and 0.84, respectively. For the absolute value of NAA, the GE software was also slightly more reproducible than the LCModel. However, for the choline/Cr and myo-inositol/Cr ratios, the LCModel was more reliable than the GE software. The variability we observed hovers around the percentages reported previously (around 10% for the NAA/Cr ratios). We did not find the LCModel software to be superior to the manufacturer's software. Reproducibility of metabolite values depends more on observance of the quality parameters than on the software used.

  11. [Example of product development by industry and research solidarity].

    PubMed

    Seki, Masayoshi

    2014-01-01

    When an industrial firm develops a product, using results from research institutions and reflecting ideas from users can significantly improve that product. This article takes a jointly developed software product as an example to describe the adopted development technique and its results, and to consider industry-research collaboration and joint development from the company's perspective. Software development methods each have merits and drawbacks, and the optimal technique must be chosen for the system being developed. We jointly developed dose distribution browsing software and adopted the prototype model as the development method. To display dose distribution information, four objects, the CT image, Structure Set, RT-Plan, and RT-Dose, must be loaded and displayed in a composite manner. The prototype model adopted in this joint development proved well suited to developing the dose distribution browsing software. Because the detailed design was derived from the program source code after the program was completed, the prototype model shortened the documentation period and kept the design and implementation consistent. The software was eventually released to the public as open source, and the release version of the dose distribution browsing software was developed from this prototype. Developing this type of novel software normally takes two to three years, but joint development shortened the development period to one year. The shorter development period minimized the company's development cost, which is reflected in the product price. Requests made by specialists from the user's point of view are important, and involving more such specialists in product development raises expectations that the product will meet users' demands.
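
    A hedged sketch of the loading step described above is shown below, using the pydicom library (which the article does not mention) to read the four DICOM-RT objects and scale the dose grid; file paths are placeholders and the compositing/rendering itself is omitted.

      import pydicom

      # Placeholder paths to the four objects needed for a composite display.
      paths = {
          "CT": "ct_slice.dcm",
          "RTSTRUCT": "structure_set.dcm",
          "RTPLAN": "rt_plan.dcm",
          "RTDOSE": "rt_dose.dcm",
      }

      datasets = {}
      for kind, path in paths.items():
          ds = pydicom.dcmread(path)
          # Sanity-check that each file really carries the expected modality.
          assert ds.Modality == kind, f"{path}: expected {kind}, got {ds.Modality}"
          datasets[kind] = ds

      # Dose values in Gy: the stored integer grid times DoseGridScaling.
      dose = datasets["RTDOSE"]
      dose_gy = dose.pixel_array * float(dose.DoseGridScaling)
      print("dose grid shape:", dose_gy.shape, "max dose [Gy]:", dose_gy.max())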

  12. Multi-objective problem of the modified distributed parallel machine and assembly scheduling problem (MDPMASP) with eligibility constraints

    NASA Astrophysics Data System (ADS)

    Amallynda, I.; Santosa, B.

    2017-11-01

    This paper proposes a new generalization of the distributed parallel machine and assembly scheduling problem (DPMASP) with eligibility constraints, referred to as the modified distributed parallel machine and assembly scheduling problem (MDPMASP) with eligibility constraints. Within this generalization, we assume a set of non-identical factories or production lines, each with a set of unrelated parallel machines of different speeds feeding a single assembly machine in series. A set of different products is manufactured through an assembly program from a set of components (jobs) according to the requested demand, and each product requires several kinds of jobs of different sizes. We also consider the multi-objective problem (MOP) of simultaneously minimizing mean flow time and the number of tardy products. The problem is known to be NP-hard and is important in practice, as these criteria reflect the customer's demand and the manufacturer's perspective. Because this is a realistic and complex problem with a wide range of possible solutions, we propose four simple heuristics and two metaheuristics to solve it. Various parameters of the proposed metaheuristic algorithms are discussed and calibrated by means of the Taguchi technique. All proposed algorithms are tested in Matlab. Our computational experiments indicate that the proposed problem and the four proposed algorithms can be implemented and used to solve moderately sized instances, giving efficient solutions that are close to optimal in most cases.
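
    As a hedged illustration of the kind of simple heuristic the paper refers to (not the authors' algorithms), the sketch below assigns jobs to unrelated parallel machines by earliest completion time within one factory; the machine speeds and job sizes are invented.

      # Illustrative list-scheduling heuristic for unrelated parallel machines:
      # assign each job to the machine that finishes it earliest, given
      # machine-dependent processing times (job size / machine speed).

      def earliest_completion_schedule(job_sizes, machine_speeds):
          finish = [0.0] * len(machine_speeds)    # current completion time per machine
          assignment = []
          for j, size in enumerate(job_sizes):
              # candidate completion time of job j on each machine
              candidates = [(finish[m] + size / s, m) for m, s in enumerate(machine_speeds)]
              done, m = min(candidates)
              finish[m] = done
              assignment.append((j, m, done))
          return assignment, max(finish)

      if __name__ == "__main__":
          jobs = [4.0, 2.0, 6.0, 3.0, 5.0]        # hypothetical job sizes
          speeds = [1.0, 1.5, 0.8]                # hypothetical machine speeds
          plan, makespan = earliest_completion_schedule(jobs, speeds)
          for job, machine, done in plan:
              print(f"job {job} -> machine {machine}, finishes at {done:.2f}")
          print("makespan:", makespan)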

  13. Process description language: an experiment in robust programming for manufacturing systems

    NASA Astrophysics Data System (ADS)

    Spooner, Natalie R.; Creak, G. Alan

    1998-10-01

    Maintaining stable, robust, and consistent software is difficult in the face of the increasing rate of change of customers' preferences, materials, manufacturing techniques, computer equipment, and other characteristic features of manufacturing systems. It is argued that software is commonly difficult to keep up to date because many of the implications of these changing features for software details are obscure. A possible solution is to use a software generation system in which the transformation of system properties into system software is made explicit. The proposed generation system stores the system properties, such as machine properties, product properties and information on manufacturing techniques, in databases. As a result this information, on which system control is based, can also be made available to other programs. In particular, artificial intelligence programs such as fault diagnosis programs can benefit from using the same information as the control system, rather than a separate database which must be developed and maintained separately to ensure consistency. Experience in developing a simplified model of such a system is presented.

  14. Unisys' experience in software quality and productivity management of an existing system

    NASA Technical Reports Server (NTRS)

    Munson, John B.

    1988-01-01

    A summary of Quality Improvement techniques, implementation, and results in the maintenance, management, and modification of large software systems for the Space Shuttle Program's ground-based systems is provided.

  15. CrossTalk. The Journal of Defense Software Engineering. Volume 13, Number 6, June 2000

    DTIC Science & Technology

    2000-06-01

    Techniques for Efficiently Generating and Testing Software: This paper presents a proven process that uses advanced tools to design, develop and test... optimal software. by Keith R. Wegner. Large Software Systems—Back to Basics: Development methods that work on small problems seem to not scale well to... Ability Requirements for Teamwork: Implications for Human Resource Management, Journal of Management, Vol. 20, No. 2, 1994. 11. Ferguson, Pat, Watts S

  16. Using neural networks in software repositories

    NASA Technical Reports Server (NTRS)

    Eichmann, David (Editor); Srinivas, Kankanahalli; Boetticher, G.

    1992-01-01

    The first topic is an exploration of the use of neural network techniques to improve the effectiveness of retrieval in software repositories. The second topic relates to a series of experiments conducted to evaluate the feasibility of using adaptive neural networks as a means of deriving (or more specifically, learning) measures on software. Taken together, these two efforts illuminate a very promising mechanism supporting software infrastructures - one based upon a flexible and responsive technology.

  17. Building America Top Innovations 2014 Profile: Cost-Optimized Attic Insulation Solution for Factory-Built Homes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    none,

    This 2014 Top Innovation profile describes a low-cost, low-tech attic insulation technique developed by the ARIES Building America team with help from Southern Energy Homes and Johns Manville. Increasing attic insulation in manufactured housing has been a significant challenge due to cost, production and transportation constraints. The simplicity of this dense-pack solution to increasing attic insulation R-value promises real hope for widespread industry adoption.

  18. Tools for 3D scientific visualization in computational aerodynamics at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon; Plessel, Todd; Merritt, Fergus; Watson, Val

    1989-01-01

    Hardware, software, and techniques used by the Fluid Dynamics Division (NASA) for performing visualization of computational aerodynamics, which can be applied to the visualization of flow fields from computer simulations of fluid dynamics about the Space Shuttle, are discussed. Three visualization techniques applied, post-processing, tracking, and steering, are described, as well as the post-processing software packages used, PLOT3D, SURF (Surface Modeller), GAS (Graphical Animation System), and FAST (Flow Analysis software Toolkit). Using post-processing methods a flow simulation was executed on a supercomputer and, after the simulation was complete, the results were processed for viewing. It is shown that the high-resolution, high-performance three-dimensional workstation combined with specially developed display and animation software provides a good tool for analyzing flow field solutions obtained from supercomputers.

  19. Security Verification Techniques Applied to PatchLink COTS Software

    NASA Technical Reports Server (NTRS)

    Gilliam, David P.; Powell, John D.; Bishop, Matt; Andrew, Chris; Jog, Sameer

    2006-01-01

    Verification of the security of software artifacts is a challenging task. An integrated approach that combines verification techniques can increase the confidence in the security of software artifacts. Such an approach has been developed by the Jet Propulsion Laboratory (JPL) and the University of California at Davis (UC Davis). Two security verification instruments were developed and then piloted on PatchLink's UNIX Agent, a Commercial-Off-The-Shelf (COTS) software product, to assess the value of the instruments and the approach. The two instruments are the Flexible Modeling Framework (FMF) -- a model-based verification instrument (JPL), and a Property-Based Tester (UC Davis). Security properties were formally specified for the COTS artifact and then verified using these instruments. The results were then reviewed to determine the effectiveness of the approach and the security of the COTS product.

  20. Aspect-Oriented Model-Driven Software Product Line Engineering

    NASA Astrophysics Data System (ADS)

    Groher, Iris; Voelter, Markus

    Software product line engineering aims to reduce development time, effort, cost, and complexity by taking advantage of the commonality within a portfolio of similar products. The effectiveness of a software product line approach directly depends on how well feature variability within the portfolio is implemented and managed throughout the development lifecycle, from early analysis through maintenance and evolution. This article presents an approach that facilitates variability implementation, management, and tracing by integrating model-driven and aspect-oriented software development. Features are separated in models and composed using aspect-oriented composition techniques at the model level. Model transformations support the transition from problem-space to solution-space models. Aspect-oriented techniques enable the explicit expression and modularization of variability at the model, template, and code levels. The presented concepts are illustrated with a case study of a home automation system.

  1. Myc Dynamically and Preferentially Relocates to a Transcription Factory Occupied by Igh

    PubMed Central

    Osborne, Cameron S; Chakalova, Lyubomira; Mitchell, Jennifer A; Horton, Alice; Wood, Andrew L; Bolland, Daniel J; Corcoran, Anne E; Fraser, Peter

    2007-01-01

    Transcription in mammalian nuclei is highly compartmentalized in RNA polymerase II-enriched nuclear foci known as transcription factories. Genes in cis and trans can share the same factory, suggesting that genes migrate to preassembled transcription sites. We used fluorescent in situ hybridization to investigate the dynamics of gene association with transcription factories during immediate early (IE) gene induction in mouse B lymphocytes. Here, we show that induction involves rapid gene relocation to transcription factories. Importantly, we find that the Myc proto-oncogene on Chromosome 15 is preferentially recruited to the same transcription factory as the highly transcribed Igh gene located on Chromosome 12. Myc and Igh are the most frequent translocation partners in plasmacytoma and Burkitt lymphoma. Our results show that transcriptional activation of IE genes involves rapid relocation to preassembled transcription factories. Furthermore, the data imply a direct link between the nonrandom interchromosomal organization of transcribed genes at transcription factories and the incidence of specific chromosomal translocations. PMID:17622196

  2. Experimental analysis of computer system dependability

    NASA Technical Reports Server (NTRS)

    Iyer, Ravishankar K.; Tang, Dong

    1993-01-01

    This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
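
    Since the survey mentions importance sampling as a way to accelerate Monte Carlo dependability simulation, here is a generic hedged sketch (not tied to FOCUS or DEPEND) that estimates a small failure probability by sampling from a biased distribution and reweighting each sample by the likelihood ratio.

      import numpy as np

      # Estimate P(X > threshold) for X ~ Exponential(1) when the threshold is
      # large enough that plain Monte Carlo rarely observes the event.
      rng = np.random.default_rng(0)
      threshold, n = 12.0, 100_000
      exact = np.exp(-threshold)

      # Plain Monte Carlo: sample from the nominal density.
      x = rng.exponential(1.0, n)
      plain = np.mean(x > threshold)

      # Importance sampling: sample from a heavier-tailed Exponential(mean=threshold)
      # and reweight each sample by the likelihood ratio p(y)/q(y).
      mean_q = threshold
      y = rng.exponential(mean_q, n)
      weights = np.exp(-y) / (np.exp(-y / mean_q) / mean_q)
      importance = np.mean((y > threshold) * weights)

      print(f"exact {exact:.3e}  plain MC {plain:.3e}  importance sampling {importance:.3e}")

    With the same number of samples, the reweighted estimator observes the rare event often enough to give a usable estimate, which is exactly why the technique is attractive for simulating highly dependable systems.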

  3. [Dissolution behavior of Fuzi Lizhong pill based on simultaneous determination of two components in Glycyrrhizae Radix et Rhizoma].

    PubMed

    Jiang, Mao-Yuan; Zhang, Zhen; Shi, Jin-Feng; Zhang, Jin-Ming; Fu, Chao-Mei; Lin, Xia; Liu, Yu-Mei

    2018-03-01

    To preliminarily investigate the dissolution behavior of Fuzi Lizhong pill, provide a basis for its quality control, and lay a foundation for in vivo dissolution studies, the dissolution rates of liquiritin and glycyrrhizic acid were determined. A high-performance liquid chromatography (HPLC) method for simultaneous content determination of the two active ingredients, liquiritin and glycyrrhizic acid, in Fuzi Lizhong pill was established. The dissolution amounts of these two active ingredients in fifteen batches of Fuzi Lizhong pill from five manufacturers were obtained at different time points; the cumulative dissolution rates were then calculated and cumulative dissolution curves were drawn. The similarity of the cumulative dissolution curves was evaluated across batches within the same factory and across factories for the same active ingredient. A dissolution model of Fuzi Lizhong pill based on the two active ingredients was established by fitting the dissolution data. The best dissolution medium was 0.25% sodium lauryl sulfate. The dissolution behavior of liquiritin and glycyrrhizic acid in Fuzi Lizhong pill was essentially the same, with sustained release over 48 h. The three batches from each of factories 2, 3, 4 and 5 appeared similar in dissolution behavior, indicating similarity in dissolution behavior in most factories, whereas two of the three batches from factory 1 were not similar in the dissolution behavior of liquiritin and glycyrrhizic acid. The dissolution data of the active ingredients from different factories fitted the same models, and the Weibull model was the best model for these batches. Fuzi Lizhong pill in 15 batches from 5 factories showed sustained release over 48 h, demonstrating clearly slow-release characteristics consistent with the traditional description that the "pill is lenitive and keeps a long-time efficacy". The generally good dissolution behavior also suggested that the quality of different batches from most factories was stable. The dissolution behavior of liquiritin and glycyrrhizic acid differed between factories, suggesting that the source of medicinal materials and the preparation technology parameters in the five factories were different. Copyright© by the Chinese Pharmaceutical Association.
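
    Because the paper fits the cumulative dissolution data with a Weibull model, a hedged sketch of such a fit is shown below; the time points and dissolution percentages are invented, and the Weibull form F(t) = Fmax (1 - exp(-(t/td)^beta)) is one common parameterization rather than necessarily the authors' exact equation.

      import numpy as np
      from scipy.optimize import curve_fit

      def weibull_release(t, f_max, t_d, beta):
          """Cumulative fraction released at time t (one common Weibull form)."""
          return f_max * (1.0 - np.exp(-(t / t_d) ** beta))

      # Hypothetical cumulative dissolution data (hours, percent released).
      t = np.array([1, 2, 4, 8, 12, 24, 36, 48], dtype=float)
      released = np.array([8, 15, 27, 45, 57, 78, 88, 93], dtype=float)

      params, _ = curve_fit(weibull_release, t, released, p0=[100.0, 10.0, 1.0])
      f_max, t_d, beta = params
      print(f"Fmax={f_max:.1f}%  td={t_d:.1f} h  beta={beta:.2f}")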

  4. Techniques for Developing an Acquisition Strategy by Profiling Software Risks

    DTIC Science & Technology

    2006-08-01

    Figure 8: BMW 745Li Software... The BMW 745Li, shown in Figure 8, is a good illustration of the increasing software control of hardware systems in automobiles. Among the many features...roll stabilization, dynamic brake control, coded drive-away protection, an adaptive automatic transmission, and iDrive systems. This list can be

  5. An application generator for rapid prototyping of Ada real-time control software

    NASA Technical Reports Server (NTRS)

    Johnson, Jim; Biglari, Haik; Lehman, Larry

    1990-01-01

    The need to increase engineering productivity and decrease software life cycle costs in real-time system development establishes a motivation for a method of rapid prototyping. The design by iterative rapid prototyping technique is described. A tool which facilitates such a design methodology for the generation of embedded control software is described.

  6. AspectAssay: A Technique for Expanding the Pool of Available Aspect Mining Test Data Using Concern Seeding

    ERIC Educational Resources Information Center

    Moore, David G., Jr.

    2013-01-01

    Aspect-oriented software design (AOSD) enables better and more complete separation of concerns in software-intensive systems. By extracting aspect code and relegating crosscutting functionality to aspects, software engineers can improve the maintainability of their code by reducing code tangling and coupling of code concerns. Further, the number…

  7. The Virtual Genetics Lab II: Improvements to a Freely Available Software Simulation of Genetics

    ERIC Educational Resources Information Center

    White, Brian T.

    2012-01-01

    The Virtual Genetics Lab II (VGLII) is an improved version of the highly successful genetics simulation software, the Virtual Genetics Lab (VGL). The software allows students to use the techniques of genetic analysis to design crosses and interpret data to solve realistic genetics problems involving a hypothetical diploid insect. This is a brief…

  8. Holographic radar imaging privacy techniques utilizing dual-frequency implementation

    NASA Astrophysics Data System (ADS)

    McMakin, Douglas L.; Hall, Thomas E.; Sheen, David M.

    2008-04-01

    Over the last 15 years, the Pacific Northwest National Laboratory has performed significant research and development activities to enhance the state of the art of holographic radar imaging systems to be used at security checkpoints for screening people for concealed threats hidden under their garments. These enhancement activities included improvements to privacy techniques to remove human features and providing automatic detection of body-worn concealed threats. The enhanced privacy and detection methods used both physical and software imaging techniques. The physical imaging techniques included polarization-diversity illumination and reception, dual-frequency implementation, and high-frequency imaging at 60 GHz. Software imaging techniques to enhance the privacy of the person under surveillance included extracting concealed threat artifacts from the imagery to automatically detect the threat. This paper will focus on physical privacy techniques using dual-frequency implementation.

  9. Holographic Radar Imaging Privacy Techniques Utilizing Dual-Frequency Implementation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McMakin, Douglas L.; Hall, Thomas E.; Sheen, David M.

    2008-04-18

    Over the last 15 years, the Pacific Northwest National Laboratory has performed significant research and development activities to enhance the state of the art of holographic radar imaging systems to be used at security checkpoints for screening people for concealed threats hidden under their garments. These enhancement activities included improvements to privacy techniques to remove human features and providing automatic detection of body-worn concealed threats. The enhanced privacy and detection methods used both physical and software imaging techniques. The physical imaging techniques included polarization-diversity illumination and reception, dual-frequency implementation, and high-frequency imaging at 60 GHz. Software imaging techniques to enhance the privacy of the person under surveillance included extracting concealed threat artifacts from the imagery to automatically detect the threat. This paper will focus on physical privacy techniques using dual-frequency implementation.

  10. Secure UNIX socket-based controlling system for high-throughput protein crystallography experiments.

    PubMed

    Gaponov, Yurii; Igarashi, Noriyuki; Hiraki, Masahiko; Sasajima, Kumiko; Matsugaki, Naohiro; Suzuki, Mamoru; Kosuge, Takashi; Wakatsuki, Soichi

    2004-01-01

    A control system for high-throughput protein crystallography experiments has been developed based on a multilevel secure (SSL v2/v3) UNIX socket under the Linux operating system. The main stages of protein crystallography experiments (purification, crystallization, loop preparation, data collection, data processing) are dealt with by the software. All information necessary to perform protein crystallography experiments is stored in a relational database (MySQL), except raw X-ray data, which are stored on a Network File Server. The system consists of several servers and clients. TCP/IP secure UNIX sockets with four predefined behaviors [(a) listening to a request followed by a reply, (b) sending a request and waiting for a reply, (c) listening to a broadcast message, and (d) sending a broadcast message] support communications between all servers and clients, allowing one to control experiments, view data, edit experimental conditions and perform data processing remotely. The interface software is well suited to developing well-organized control software with a hierarchical structure of different software units (Gaponov et al., 1998), which pass and receive different types of information. All communication is divided into two parts: low and top levels. Large and complicated control tasks are split into several smaller ones, which can be processed by control clients independently. For communicating with experimental equipment (beamline optical elements, robots, and specialized experimental equipment, etc.), the STARS server, developed at the Photon Factory, is used (Kosuge et al., 2002). The STARS server allows any application with an open socket to be connected with any other clients that control experimental equipment. The majority of the source code is written in C/C++. GUI modules of the system were built mainly using the Glade user interface builder for GTK+ and Gnome under the Red Hat Linux 7.1 operating system.
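
    The first two of the four socket behaviors described above can be illustrated with a minimal TLS-wrapped request/reply pair; the sketch below is generic Python using the standard ssl and socket modules, not the beamline control code, and the host, port, and certificate paths are placeholders.

      import socket
      import ssl

      HOST, PORT = "beamline-control.example", 5000     # placeholder endpoint
      CERT, KEY = "server.crt", "server.key"            # placeholder credentials

      def serve_one_request():
          """Behavior (a): listen for a request, then send a reply."""
          ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
          ctx.load_cert_chain(CERT, KEY)
          with socket.create_server(("", PORT)) as srv:
              with ctx.wrap_socket(srv.accept()[0], server_side=True) as conn:
                  request = conn.recv(4096).decode()
                  conn.sendall(f"ACK {request}".encode())

      def send_request(message):
          """Behavior (b): send a request and wait for a reply."""
          ctx = ssl.create_default_context(cafile=CERT)  # trust the server certificate
          with socket.create_connection((HOST, PORT)) as raw:
              with ctx.wrap_socket(raw, server_hostname=HOST) as conn:
                  conn.sendall(message.encode())
                  return conn.recv(4096).decode()

      if __name__ == "__main__":
          print(send_request("GET crystal_status"))      # assumes the server is running

    The broadcast behaviors (c) and (d) follow the same pattern with a fan-out loop on the server side; they are omitted here for brevity.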

  11. Methodology of decreasing software complexity using ontology

    NASA Astrophysics Data System (ADS)

    DÄ browska-Kubik, Katarzyna

    2015-09-01

    In this paper a model of a web application's source code, based on the OSD ontology (Ontology for Software Development), is proposed. This model is applied to the implementation and maintenance phases of the software development process through the DevOntoCreator tool [5]. The aim of this solution is to decrease the software complexity of the source code by applying many different maintenance techniques, such as creating documentation and eliminating dead code, cloned code, or previously known bugs [1][2]. With this approach, savings on the software maintenance costs of web applications will be possible.

  12. Development of a support software system for real-time HAL/S applications

    NASA Technical Reports Server (NTRS)

    Smith, R. S.

    1984-01-01

    Methodologies employed in defining and implementing a software support system for the HAL/S computer language for real-time operations on the Shuttle are detailed. Attention is also given to the management and validation techniques used during software development and software maintenance. Utilities developed to support the real-time operating conditions are described. With the support system being produced on Cyber computers and executable code then processed through Cyber or PDP machines, the support system has a production level status and can serve as a model for other software development projects.

  13. Classification software technique assessment

    NASA Technical Reports Server (NTRS)

    Jayroe, R. R., Jr.; Atkinson, R.; Dasarathy, B. V.; Lybanon, M.; Ramapryian, H. K.

    1976-01-01

    A catalog of software options is presented for the use of local user communities to obtain software for analyzing remotely sensed multispectral imagery. The resources required to utilize a particular software program are described. Descriptions of how a particular program analyzes data and the performance of that program for an application and data set provided by the user are shown. An effort is made to establish a statistical performance base for various software programs with regard to different data sets and analysis applications, to determine the status of the state-of-the-art.

  14. Space Station: NASA's software development approach increases safety and cost risks. Report to the Chairman, Committee on Science, Space, and Technology, House of Representatives

    NASA Astrophysics Data System (ADS)

    1992-06-01

    The House Committee on Science, Space, and Technology asked NASA to study software development issues for the space station, and specifically how well NASA has implemented key software engineering practices for the station. The objectives were to determine: (1) whether independent verification and validation techniques are being used to ensure that critical software meets specified requirements and functions; (2) whether NASA has incorporated software risk management techniques into the program; (3) whether standards are in place that will prescribe a disciplined, uniform approach to software development; and (4) whether software support tools will help, as intended, to maximize efficiency in developing and maintaining the software. To meet these objectives, the study proceeded by: (1) reviewing and analyzing software development objectives and strategies contained in NASA conference publications; (2) reviewing and analyzing NASA, other government, and industry guidelines for establishing good software development practices; (3) reviewing and analyzing technical proposals and contracts; (4) reviewing and analyzing software management plans, risk management plans, and program requirements; (5) reviewing and analyzing reports prepared by NASA and contractor officials that identified key issues and challenges facing the program; (6) obtaining expert opinions on what constitutes appropriate independent V&V and software risk management activities; (7) interviewing program officials at NASA headquarters in Washington, DC; at the Space Station Program Office in Reston, Virginia; and at the three work package centers: Johnson in Houston, Texas; Marshall in Huntsville, Alabama; and Lewis in Cleveland, Ohio; and (8) interviewing contractor officials doing work for NASA at Johnson and Marshall. The audit work was performed between April 1991 and May 1992, in accordance with generally accepted government auditing standards.

  15. Real-time surgical simulation for deformable soft-tissue objects with a tumour using Boundary Element techniques

    NASA Astrophysics Data System (ADS)

    Wang, P.; Becker, A. A.; Jones, I. A.; Glover, A. T.; Benford, S. D.; Vloeberghs, M.

    2009-08-01

    A virtual-reality real-time simulation of surgical operations that incorporates the inclusion of a hard tumour is presented. The software is based on the Boundary Element (BE) technique. A review of the BE formulation for real-time analysis of two-domain deformable objects, using the pre-solution technique, is presented. The two-domain BE software is incorporated into a surgical simulation system called VIRS to simulate the initiation of a cut on the surface of the soft tissue and the extension of the cut deeper until the tumour is reached.

  16. Sequence Factorial and Its Applications

    ERIC Educational Resources Information Center

    Asiru, Muniru A.

    2012-01-01

    In this note, we introduce sequence factorial and use this to study generalized M-bonomial coefficients. For the sequence of natural numbers, the twin concepts of sequence factorial and generalized M-bonomial coefficients, respectively, extend the corresponding concepts of factorial of an integer and binomial coefficients. Some latent properties…

  17. REDIR: Automated Static Detection of Obfuscated Anti-Debugging Techniques

    DTIC Science & Technology

    2014-03-27

    analyzing code samples that resist other forms of analysis. 2.5.6 RODS and HASTI: Software Engineering Cognitive Support Software Engineering (SE) is another...and (c) this method is resistant to common obfuscation techniques. To achieve this goal, the Data/Frame sensemaking theory guides the process of...No Starch Press, 2012. [46] C.-W. Hsu, S. W. Shieh et al., “Divergence Detector: A Fine-Grained Approach to Detecting VM-Awareness Malware,” in

  18. Computer-aided system design

    NASA Technical Reports Server (NTRS)

    Walker, Carrie K.

    1991-01-01

    A technique has been developed for combining features of a systems architecture design and assessment tool and a software development tool. This technique reduces simulation development time and expands simulation detail. The Architecture Design and Assessment System (ADAS), developed at the Research Triangle Institute, is a set of computer-assisted engineering tools for the design and analysis of computer systems. The ADAS system is based on directed graph concepts and supports the synthesis and analysis of software algorithms mapped to candidate hardware implementations. Greater simulation detail is provided by the ADAS functional simulator. With the functional simulator, programs written in either Ada or C can be used to provide a detailed description of graph nodes. A Computer-Aided Software Engineering tool developed at the Charles Stark Draper Laboratory (CSDL CASE) automatically generates Ada or C code from engineering block diagram specifications designed with an interactive graphical interface. A technique to use the tools together has been developed, which further automates the design process.

  19. Software error detection

    NASA Technical Reports Server (NTRS)

    Buechler, W.; Tucker, A. G.

    1981-01-01

    Several methods were employed to detect both the occurrence and source of errors in the operational software of the AN/SLQ-32, a large embedded real-time electronic warfare command and control system for the ROLM 1606 computer. The ROLM computer provides information about invalid addressing, improper use of privileged instructions, stack overflows, and unimplemented instructions. Additionally, software techniques were developed to detect invalid jumps, indices out of range, infinite loops, stack underflows, and field size errors. Finally, data are saved to provide information about the status of the system when an error is detected. This information includes I/O buffers, interrupt counts, stack contents, and recently passed locations. The various errors detected, techniques to assist in debugging problems, and segment simulation on a nontarget computer are discussed. These error detection techniques were a major factor in the success of finding the primary cause of error in 98% of over 500 system dumps.

  20. Rational Design Methodology.

    DTIC Science & Technology

    1978-09-01

    This report describes an effort to specify a software design methodology applicable to the Air Force software environment. Available methodologies...of techniques for proof of correctness, design specification, and performance assessment of static designs. The rational methodology selected is a

  1. Applications of integrated human error identification techniques on the chemical cylinder change task.

    PubMed

    Cheng, Ching-Min; Hwang, Sheue-Ling

    2015-03-01

    This paper outlines the human error identification (HEI) techniques that currently exist to assess latent human errors. Many formal error identification techniques have existed for years, but few have been validated to cover latent human error analysis in different domains. This study considers many possible error modes and influential factors, including external error modes, internal error modes, psychological error mechanisms, and performance shaping factors, and integrates several execution procedures and frameworks of HEI techniques. The case study in this research was the operational process of changing chemical cylinders in a factory. In addition, the integrated HEI method was used to assess the operational processes and the system's reliability. It was concluded that the integrated method is a valuable aid to develop much safer operational processes and can be used to predict human error rates on critical tasks in the plant. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  2. Promon's participation in the Brasilsat program: first & second generations

    NASA Astrophysics Data System (ADS)

    Depaiva, Ricardo N.

    This paper presents an overview of the Brasilsat program, space and ground segments, developed by Hughes and Promon. Promon is a Brazilian engineering company that has been actively participating in the Brasilsat Satellite Telecommunications Program since its beginning. During the first generation, as subcontractor of the Spar/Hughes/SED consortium, Promon had a significant participation in the site installation of the Ground Segment, including the antennas. During the second generation, as partner of a consortium with Hughes, Promon participated in the upgrade of Brasilsat's Ground Segment systems: the TT&C (TCR1, TCR2, and SCC) and the COCC (Communications and Operations Control Center). This upgrade consisted of the design and development of hardware and software to support the second generation requirements, followed by integration and tests, factory acceptance tests, transport to site, site installation, site acceptance tests and warranty support. The upgraded systems are distributed over four sites with remote access to the main ground station. The solutions adopted provide a high level of automation, and easy operator interaction. The hardware and software technologies were selected to provide the flexibility to incorporate new technologies and services from the demanding satellite telecommunications market.

  3. Investigation of cadmium pollution in the spruce saplings near the metal production factory.

    PubMed

    Hashemi, Seyed Armin; Farajpour, Ghasem

    2016-02-01

    Toxic metals such as lead and cadmium are among the pollutants that are created by metal production factories and disseminated in nature. In order to study the extent of cadmium pollution in the environment of the metal production factories, 50 saplings of the spruce species at the peripheries of the metal production factories were examined, and samples of the leaves, roots, and stems of saplings planted around the factory, along with the soil of the factory environment, were studied to investigate pollution with cadmium. They were compared to the soil and spruce saplings planted outside the factory, taken as the observer region. The results showed that the quantity of pollution in the leaves, stems, and roots of the trees planted inside the factory environment was estimated at 1.1, 1.5, and 2.5 mg/kg, respectively, which indicated a significant difference from the observer region (p < 0.05). The quantity of cadmium in the soil at the peripheries of the metal production factory was estimated at 6.8 mg/kg at a depth of 0-10 cm beneath the soil surface. The root length of the saplings planted around the metal production factory was 11 cm, compared with 14.5 cm in the observer region, a significant difference (p < 0.05). The quantity of cadmium pollution in soil resources and the spruce species in the region has been influenced by the production processes in the factory. © The Author(s) 2013.

  4. Testing Factorial Invariance in Multilevel Data: A Monte Carlo Study

    ERIC Educational Resources Information Center

    Kim, Eun Sook; Kwok, Oi-man; Yoon, Myeongsun

    2012-01-01

    Testing factorial invariance has recently gained more attention in different social science disciplines. Nevertheless, when examining factorial invariance, it is generally assumed that the observations are independent of each other, which might not be always true. In this study, we examined the impact of testing factorial invariance in multilevel…

  5. The MICRO-BOSS scheduling system: Current status and future efforts

    NASA Technical Reports Server (NTRS)

    Sadeh, Norman M.

    1992-01-01

    In this paper, a micro-opportunistic approach to factory scheduling was described that closely monitors the evolution of bottlenecks during the construction of the schedule and continuously redirects search towards the bottleneck that appears to be most critical. This approach differs from earlier opportunistic approaches, as it does not require scheduling large resource subproblems or large job subproblems before revising the current scheduling strategy. This micro-opportunistic approach was implemented in the context of the MICRO-BOSS factory scheduling system. A study comparing MICRO-BOSS against a macro-opportunistic scheduler suggests that the additional flexibility of the micro-opportunistic approach to scheduling generally yields important reductions in both tardiness and inventory. Current research efforts include: adaptation of MICRO-BOSS to deal with sequence-dependent setups and development of micro-opportunistic reactive scheduling techniques that will enable the system to patch the schedule in the presence of contingencies such as machine breakdowns, raw materials arriving late, job cancellations, etc.

  6. Statistical moments in superposition models and strongly intensive measures

    NASA Astrophysics Data System (ADS)

    Broniowski, Wojciech; Olszewski, Adam

    2017-06-01

    First, we present a concise glossary of formulas for composition of standard, cumulant, factorial, and factorial cumulant moments in superposition (compound) models, where final particles are created via independent emission from a collection of sources. Explicit mathematical formulas for the composed moments are given to all orders. We discuss the composition laws for various types of moments via the generating-function methods and list the formulas for the unfolding of the unwanted fluctuations. Second, the technique is applied to the difference of the scaled multiplicities of two particle types. This allows for a systematic derivation and a simple algebraic interpretation of the so-called strongly intensive fluctuation measures. With the help of the formalism we obtain several new strongly intensive measures involving higher-rank moments. The reviewed as well as the new results may be useful in investigations of mechanisms of particle production and event-by-event fluctuations in high-energy nuclear and hadronic collisions, and in particular in the search for signatures of the QCD phase transition at a finite baryon density.
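
    As a reference point for the composition laws discussed above, the standard compound-distribution relations can be written explicitly. The following is a sketch in generic notation (not necessarily the authors' conventions), for a total multiplicity n = m_1 + ... + m_N with the per-source multiplicities m_i independent, identically distributed, and independent of the number of sources N:

      H(z) \;=\; \sum_{n} P(n)\, z^{n} \;=\; F\!\bigl(G(z)\bigr),
      \qquad
      \langle n \rangle \;=\; \langle N \rangle\,\langle m \rangle,
      \qquad
      \operatorname{Var}(n) \;=\; \langle N \rangle\,\operatorname{Var}(m)
        \;+\; \operatorname{Var}(N)\,\langle m \rangle^{2},

    where F and G are the probability generating functions of N and of a single source's multiplicity m. Higher-order standard, factorial, and cumulant moments follow from further derivatives of H(z) and of log H(z) evaluated at z = 1.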

  7. Factors affecting the musculoskeletal disorders of workers in the frozen food manufacturing factories in Thailand.

    PubMed

    Thetkathuek, Anamai; Meepradit, Parvena; Jaidee, Wanlop

    2016-01-01

    The purpose of this research was to study factors affecting musculoskeletal disorders. The sample population of the study was 528 factory workers from the frozen food industry, as well as a control group of 255 office workers. The samples were collected during interviews using the Nordic questionnaire to assess musculoskeletal disorders and to assess risk by the rapid upper limb assessment and rapid entire body assessment techniques. The findings of the study were that most symptoms were found in the dissecting department, at rates higher than in the control group. The details of the symptoms were as follows: elbow pain (adjusted odds ratio, 35.1; 95% CI [17.4, 70.9]). Workers who consumed alcohol were also exposed to greater risk. It is suggested that workers' health should be monitored regularly, and that people who work in a cold environment should be encouraged to wear body protection and to avoid drinking.

  8. Adaptive laboratory evolution -- principles and applications for biotechnology.

    PubMed

    Dragosits, Martin; Mattanovich, Diethard

    2013-07-01

    Adaptive laboratory evolution is a frequent method in biological studies to gain insights into the basic mechanisms of molecular evolution and the adaptive changes that accumulate in microbial populations during long term selection under specified growth conditions. Although the approach has been performed regularly for more than 25 years, the advent of transcript and cheap next-generation sequencing technologies has resulted in many recent studies that successfully applied this technique in order to engineer microbial cells for biotechnological applications. Adaptive laboratory evolution has some major benefits as compared with classical genetic engineering but also some inherent limitations. However, recent studies show how some of the limitations may be overcome in order to successfully incorporate adaptive laboratory evolution in microbial cell factory design. Over the last two decades important insights into nutrient and stress metabolism of relevant model species were acquired, whereas some other aspects such as niche-specific differences of non-conventional cell factories are not completely understood. Altogether the current status and its future perspectives highlight the importance and potential of adaptive laboratory evolution as an approach in biotechnological engineering.

  9. Software safety - A user's practical perspective

    NASA Technical Reports Server (NTRS)

    Dunn, William R.; Corliss, Lloyd D.

    1990-01-01

    Software safety assurance philosophy and practices at NASA Ames are discussed. It is shown that, to be safe, software must be error-free. Software developments on two digital flight control systems and two ground facility systems are examined, including the overall system and software organization and function, the software-safety issues, and their resolution. The effectiveness of safety assurance methods is discussed, including conventional life-cycle practices, verification and validation testing, software safety analysis, and formal design methods. It is concluded (1) that a practical software safety technology does not yet exist, (2) that it is unlikely that a set of general-purpose analytical techniques can be developed for proving that software is safe, and (3) that successful software safety-assurance practices will have to take into account the detailed design processes employed and show that the software will execute correctly under all possible conditions.

  10. Predicting Software Suitability Using a Bayesian Belief Network

    NASA Technical Reports Server (NTRS)

    Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.

    2005-01-01

    The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.
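
    The cause-effect structure described above can be made concrete with a toy Bayesian network evaluated by plain enumeration; the node names, binary discretization, and probability values below are invented for illustration and do not reproduce the authors' model.

      # Toy Bayesian belief network: Skill, Maturity, Complexity -> Suitability.
      # All variables are binary (0 = low, 1 = high); probabilities are invented.

      P_SKILL = {1: 0.6, 0: 0.4}
      P_MATURITY = {1: 0.5, 0: 0.5}
      P_COMPLEXITY = {1: 0.3, 0: 0.7}

      def p_suitable(skill, maturity, complexity):
          """P(Suitability = high | parents); an invented table, clipped to (0, 1)."""
          base = 0.2 + 0.35 * skill + 0.25 * maturity - 0.2 * complexity
          return min(max(base, 0.05), 0.95)

      def posterior_suitability(evidence):
          """P(Suitability = high | evidence) by enumerating the parent variables."""
          num = den = 0.0
          for s in (0, 1):
              for m in (0, 1):
                  for c in (0, 1):
                      if any(evidence.get(k, v) != v for k, v in
                             (("skill", s), ("maturity", m), ("complexity", c))):
                          continue
                      w = P_SKILL[s] * P_MATURITY[m] * P_COMPLEXITY[c]
                      num += w * p_suitable(s, m, c)
                      den += w
          return num / den

      if __name__ == "__main__":
          print("no evidence:      ", round(posterior_suitability({}), 3))
          print("skilled, complex: ", round(posterior_suitability({"skill": 1, "complexity": 1}), 3))

    In a real model the conditional probability tables would be elicited from historical project data rather than hand-coded, but the inference step is the same idea at a larger scale.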

  11. Software manual for operating particle displacement tracking data acquisition and reduction system

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.

    1991-01-01

    The software manual is presented. The necessary steps required to record, analyze, and reduce Particle Image Velocimetry (PIV) data using the Particle Displacement Tracking (PDT) technique are described. The new PDT system is an all electronic technique employing a CCD video camera and a large memory buffer frame-grabber board to record low velocity (less than or equal to 20 cm/s) flows. Using a simple encoding scheme, a time sequence of single exposure images are time coded into a single image and then processed to track particle displacements and determine 2-D velocity vectors. All the PDT data acquisition, analysis, and data reduction software is written to run on an 80386 PC.
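
    A hedged sketch of the displacement-tracking step is given below: given particle centroids extracted from two successive exposures, each particle is paired with its nearest neighbour in the next frame, and the pair yields a 2-D velocity vector. The coordinates, time step, and search radius are invented, and the original PC/frame-grabber code is not reproduced.

      import numpy as np

      def track_displacements(frame_a, frame_b, dt, max_disp):
          """Pair each centroid in frame_a with its nearest neighbour in frame_b
          (within max_disp pixels) and return positions with 2-D velocity vectors."""
          vectors = []
          for p in frame_a:
              d = np.linalg.norm(frame_b - p, axis=1)
              j = np.argmin(d)
              if d[j] <= max_disp:
                  vectors.append((p, (frame_b[j] - p) / dt))
          return vectors

      if __name__ == "__main__":
          # Invented centroids (pixels) from two exposures separated by dt seconds.
          a = np.array([[10.0, 12.0], [40.0, 55.0], [80.0, 20.0]])
          b = np.array([[12.5, 12.2], [41.0, 58.0], [79.0, 24.5]])
          for pos, vel in track_displacements(a, b, dt=0.05, max_disp=6.0):
              print(f"particle at {pos} -> velocity {vel} px/s")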

  12. A Data-Driven Solution for Performance Improvement

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Marketed as the "Software of the Future," Optimal Engineering Systems P.I. EXPERT(TM) technology offers statistical process control and optimization techniques that are critical to businesses looking to restructure or accelerate operations in order to gain a competitive edge. Kennedy Space Center granted Optimal Engineering Systems the funding and aid necessary to develop a prototype of the process monitoring and improvement software. Completion of this prototype demonstrated that it was possible to integrate traditional statistical quality assurance tools with robust optimization techniques in a user- friendly format that is visually compelling. Using an expert system knowledge base, the software allows the user to determine objectives, capture constraints and out-of-control processes, predict results, and compute optimal process settings.
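
    As a generic illustration of the statistical process control techniques mentioned (not P.I. EXPERT itself), the sketch below computes Shewhart X-bar chart control limits from subgroup data and flags out-of-control points; the measurements are invented.

      import numpy as np

      def xbar_limits(subgroups):
          """Center line and approximate 3-sigma limits for an X-bar chart: the
          sigma of the subgroup mean is estimated from the average within-subgroup
          standard deviation (the c4 bias correction is omitted for brevity)."""
          n = subgroups.shape[1]
          means = subgroups.mean(axis=1)
          center = means.mean()
          sigma_xbar = subgroups.std(axis=1, ddof=1).mean() / np.sqrt(n)
          return center, center - 3 * sigma_xbar, center + 3 * sigma_xbar, means

      if __name__ == "__main__":
          rng = np.random.default_rng(1)
          data = rng.normal(10.0, 0.2, size=(25, 5))   # 25 subgroups of 5 measurements
          data[20] += 0.9                              # inject a process shift
          center, lcl, ucl, means = xbar_limits(data)
          out = np.where((means < lcl) | (means > ucl))[0]
          print(f"CL={center:.3f}  LCL={lcl:.3f}  UCL={ucl:.3f}  out-of-control subgroups: {out}")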

  13. Verification and Validation of Autonomy Software at NASA

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles

    2000-01-01

    Autonomous software holds the promise of new operation possibilities, easier design and development and lower operating costs. However, as those systems close control loops and arbitrate resources on board with specialized reasoning, the range of possible situations becomes very large and uncontrollable from the outside, making conventional scenario-based testing very inefficient. Analytic verification and validation (V&V) techniques, and model checking in particular, can provide significant help for designing autonomous systems in a more efficient and reliable manner, by providing better coverage and allowing early error detection. This article discusses the general issue of V&V of autonomy software, with an emphasis towards model-based autonomy, model-checking techniques and concrete experiments at NASA.

  14. Verification and Validation of Autonomy Software at NASA

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles

    2000-01-01

    Autonomous software holds the promise of new operation possibilities, easier design and development, and lower operating costs. However, as those systems close control loops and arbitrate resources on-board with specialized reasoning, the range of possible situations becomes very large and uncontrollable from the outside, making conventional scenario-based testing very inefficient. Analytic verification and validation (V&V) techniques, and model checking in particular, can provide significant help for designing autonomous systems in a more efficient and reliable manner, by providing better coverage and allowing early error detection. This article discusses the general issue of V&V of autonomy software, with an emphasis towards model-based autonomy, model-checking techniques, and concrete experiments at NASA.

  15. From gene to structure: The protein factory of the NBICS Centre of Kurchatov Institute

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boyko, K. M.; Lipkin, A. V.; Popov, V. O., E-mail: vpopov@inbi.ras.ru

    2013-05-15

    The Protein Factory was established at the Centre for Nano, Bio, Info, Cognitive, and Social Sciences and Technologies (NBICS Centre) of the National Research Centre 'Kurchatov Institute' in 2010. The Protein Factory, together with the Centre for Synchrotron Radiation and Nanotechnology, promote research on structural biology. This paper presents the technology platforms developed at the Protein Factory and the facilities available for researchers. The main projects currently being performed at the Protein Factory are briefly described.

  16. Architecture of the software for LAMOST fiber positioning subsystem

    NASA Astrophysics Data System (ADS)

    Peng, Xiaobo; Xing, Xiaozheng; Hu, Hongzhuan; Zhai, Chao; Li, Weimin

    2004-09-01

    The architecture of the software that controls the LAMOST fiber positioning subsystem is described. The software is composed of two parts: a main control program running on a computer and a unit controller program in the ROM of an MCS51 single-chip microcomputer. The software's functions include establishment of the client/server model, observation planning, collision handling, data transmission, pulse generation, CCD control, image capture and processing, and data analysis. Particular attention is paid to the ways in which different parts of the software communicate. Software techniques for multithreading, socket programming, Microsoft Windows message handling, and serial communications are also discussed.

  17. Approaching the socialist factory and its workforce: considerations from fieldwork in (former) Yugoslavia

    PubMed Central

    Archer, Rory; Musić, Goran

    2017-01-01

    The socialist factory, as the ‘incubator’ of the new socialist (wo)man, is a productive entry point for the study of socialist modernization and its contradictions. By outlining some theoretical and methodological insights gathered through field-research in factories in former Yugoslavia, we seek to connect the state of labour history in the Balkans to recent breakthroughs made by labour historians of other socialist countries. The first part of this article sketches some of the specificities of the Yugoslav self-managed factory and its heterogeneous workforce. It presents the ambiguous relationship between workers and the factory and demonstrates the variety of life trajectories for workers in Yugoslav state-socialism (from model communists to alienated workers). The second part engages with the available sources for conducting research inside and outside the factory advocating an approach which combines factory and local archives, print media and oral history. PMID:28190894

  18. Approaching the socialist factory and its workforce: considerations from fieldwork in (former) Yugoslavia.

    PubMed

    Archer, Rory; Musić, Goran

    2017-01-01

    The socialist factory, as the 'incubator' of the new socialist (wo)man, is a productive entry point for the study of socialist modernization and its contradictions. By outlining some theoretical and methodological insights gathered through field-research in factories in former Yugoslavia, we seek to connect the state of labour history in the Balkans to recent breakthroughs made by labour historians of other socialist countries. The first part of this article sketches some of the specificities of the Yugoslav self-managed factory and its heterogeneous workforce. It presents the ambiguous relationship between workers and the factory and demonstrates the variety of life trajectories for workers in Yugoslav state-socialism (from model communists to alienated workers). The second part engages with the available sources for conducting research inside and outside the factory advocating an approach which combines factory and local archives, print media and oral history.

  19. Expert system verification and validation study. Delivery 3A and 3B: Trip summaries

    NASA Technical Reports Server (NTRS)

    French, Scott

    1991-01-01

    Key results are documented from attending the 4th workshop on verification, validation, and testing. The most interesting part of the workshop was when representatives from the U.S., Japan, and Europe presented surveys of VV&T within their respective regions. Another interesting part focused on current efforts to define industry standards for artificial intelligence and how that might affect approaches to VV&T of expert systems. The next part of the workshop focused on VV&T methods of applying mathematical techniques to verification of rule bases and techniques for capturing information relating to the process of developing software. The final part focused on software tools. A summary is also presented of the EPRI conference on 'Methodologies, Tools, and Standards for Cost Effective Reliable Software Verification and Validation. The conference was divided into discussion sessions on the following issues: development process, automated tools, software reliability, methods, standards, and cost/benefit considerations.

  20. Reverse Engineering and Software Products Reuse to Teach Collaborative Web Portals: A Case Study with Final-Year Computer Science Students

    ERIC Educational Resources Information Center

    Medina-Dominguez, Fuensanta; Sanchez-Segura, Maria-Isabel; Mora-Soto, Arturo; Amescua, Antonio

    2010-01-01

    The development of collaborative Web applications does not follow a software engineering methodology. This is because when university students study Web applications in general, and collaborative Web portals in particular, they are not being trained in the use of software engineering techniques to develop collaborative Web portals. This paper…

  1. Future Software Sizing Metrics and Estimation Challenges

    DTIC Science & Technology

    2011-07-01

    systems 4. Ultrahigh software system assurance 5. Legacy maintenance and Brownfield development 6. Agile and Lean/Kanban development. This paper...refined as the design of the maintenance modifications or Brownfield re-engineering is determined. VII. 6. AGILE AND LEAN/KANBAN DEVELOPMENT The...difficulties of software maintenance estimation can often be mitigated by using lean workflow management techniques such as Kanban [25]. In Kanban

  2. Analysis and Synthesis of Robust Data Structures

    DTIC Science & Technology

    1990-08-01

    1.3.2 Multiversion Software ... 1.3.3 Robust Data Structure ... 1.4...context are: multiversion software, which is an adaptation of the N-modulo redundancy (NMR) technique; recovery blocks, which is an adaptation of...implementations using these features for such a hybrid approach. 1.3.2 Multiversion Software. Avizienis [AC77] was the first to adapt the NMR technique into

  3. Digital adaptive controllers for VTOL vehicles. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Hartmann, G. L.; Stein, G.; Pratt, S. G.

    1979-01-01

    The VTOL approach and landing test (VALT) adaptive software is documented. Two self-adaptive algorithms, one based on an implicit model reference design and the other on an explicit parameter estimation technique, were evaluated. The organization of the software, user options, and a nominal set of input data are presented, along with a flow chart and program listing of each algorithm.
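
    The abstract does not reproduce the algorithms themselves; as a hedged, textbook-style illustration of what a model-reference self-adaptive update looks like, the sketch below applies the classical MIT-rule gain adaptation to a first-order plant. The plant, reference model, gains, and command signal are invented for the example and are not the documented VALT designs.

      # Illustrative model-reference adaptive control (MIT rule), not the VALT algorithms.
      dt, T = 0.01, 100.0
      am, b, bm = 2.0, 0.5, 2.0     # plant gain b is treated as unknown; ideal gain is bm/b = 4
      gamma = 2.0                   # adaptation gain
      theta, y, ym, e = 0.0, 0.0, 0.0, 0.0
      for k in range(int(T / dt)):
          r = 1.0 if (k * dt) % 10 < 5 else -1.0   # square-wave command
          u = theta * r                            # adjustable feedforward controller
          y += dt * (-am * y + b * u)              # plant (same pole as model, unknown gain)
          ym += dt * (-am * ym + bm * r)           # reference model
          e = y - ym                               # model-following error
          theta += dt * (-gamma * e * ym)          # MIT-rule gradient update
      print(f"adapted gain {theta:.2f} (ideal {bm / b:.2f})")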

  4. An Incremental Life-cycle Assurance Strategy for Critical System Certification

    DTIC Science & Technology

    2014-11-04

    for Safe Aircraft Operation Embedded software systems introduce a new class of problems not addressed by traditional system modeling & analysis...Platform Runtime Architecture Application Software Embedded SW System Engineer Data Stream Characteristics Latency jitter affects control behavior...do system level failures still occur despite fault tolerance techniques being deployed in systems ? Embedded software system as major source of

  5. Software Requirements Engineering Methodology (Development)

    DTIC Science & Technology

    1979-06-01

    Higher Order Software [20]; and the Michael Jackson Design Methodology [21]. Although structured programming constructs have proven to be more useful...reviewed here. Similarly, the manual techniques for software design (e.g., HIPO Diagrams, Nassi-Schneidermann charts, Top-Down Design, the Michael ... Jackson Design Methodology, Yourdon’s Structured Design) are not addressed. 6.1.3 Research Programs There are a number of research programs underway

  6. The Virtual Factory Teaching System (VFTS): Project Review and Results.

    ERIC Educational Resources Information Center

    Kazlauskas, E. J.; Boyd, E. F., III; Dessouky, M. M.

    This paper presents a review of the Virtual Factory Teaching (VFTS) project, a Web-based, multimedia collaborative learning network. The system allows students, working alone or in teams, to build factories, forecast demand for products, plan production, establish release rules for new work into the factory, and set scheduling rules for…

  7. Sequence Factorial of "g"-Gonal Numbers

    ERIC Educational Resources Information Center

    Asiru, Muniru A.

    2013-01-01

    The gamma function, which has the property of interpolating the factorial whenever the argument is an integer, is a special case (the case "g" = 2) of the general term of the sequence factorial of "g"-gonal numbers. In relation to this special case, a formula for calculating the general term of the sequence factorial of any…
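
    As a small illustration of the objects involved (using only the standard polygonal-number formula, not the paper's general result), the sketch below computes g-gonal numbers and their sequence factorial, and checks that the case g = 2 reduces to the ordinary factorial interpolated by the gamma function.

      # Illustration only: g-gonal numbers and their "sequence factorial".
      from math import factorial, prod

      def gonal(g, n):
          # n-th g-gonal (polygonal) number
          return ((g - 2) * n * n - (g - 4) * n) // 2

      def sequence_factorial(g, n):
          # product of the first n g-gonal numbers
          return prod(gonal(g, k) for k in range(1, n + 1))

      assert sequence_factorial(2, 6) == factorial(6)     # g = 2 gives 6! = 720
      print([gonal(3, k) for k in range(1, 6)])           # triangular numbers: 1 3 6 10 15
      print(sequence_factorial(3, 5))                     # 1*3*6*10*15 = 2700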

  8. A survey of airborne and skin exposures to chemicals in footwear and equipment factories in Thailand.

    PubMed

    Todd, Lori A; Mottus, Kathleen; Mihlan, Gary J

    2008-03-01

    This research reports on a pilot industrial hygiene study that was performed at four footwear factories and two equipment factories in Thailand. Workers in these factories were exposed through inhalation and dermal contact to a large number of organic vapors from solvents and cements that were hand applied. In addition, these workers were exposed to highly toxic isocyanates primarily through the dermal route. A total of 286 personal air samples were obtained at the four footwear factories using organic vapor monitors; individual job tasks were monitored using a real-time MIRAN Spectrometer. A total of 64 surface, tool, or hand samples were monitored for isocyanates using surface contamination detectors. Real-time measurements were also obtained for organic vapors in two equipment factories. From 8% to 21% of the workers sampled in each footwear factory were overexposed to mixtures of chemicals from solvents and cements. Up to 100% of the workers performing specific job tasks were overexposed to mixtures of chemicals. From 39% to 69% of the surface samples were positive for unreacted isocyanates. Many of the real-time measurements obtained in the equipment factories exceeded occupational exposure limits. Personal protective equipment and engineering controls were inadequate in all of the factories.

  9. Prediction of quantitative intrathoracic fluid volume to diagnose pulmonary oedema using LabVIEW.

    PubMed

    Urooj, Shabana; Khan, M; Ansari, A Q; Lay-Ekuakille, Aimé; Salhan, Ashok K

    2012-01-01

    Pulmonary oedema is a life-threatening disease that requires special attention in the area of research and clinical diagnosis. Computer-based techniques are rarely used to quantify the intrathoracic fluid volume (IFV) for diagnostic purposes. This paper discusses a software program developed to detect and diagnose pulmonary oedema using LabVIEW. The software runs on anthropometric dimensions and physiological parameters, mainly transthoracic electrical impedance (TEI). This technique is accurate and faster than existing manual techniques. The LabVIEW software was used to compute the parameters required to quantify IFV. An equation relating per cent control and IFV was obtained. The results of predicted TEI and measured TEI were compared with previously reported data to validate the developed program. It was found that the predicted values of TEI obtained from the computer-based technique were much closer to the measured values of TEI. Six new subjects were enrolled to measure and predict transthoracic impedance and hence to quantify IFV. A similar difference was also observed in the measured and predicted values of TEI for the new subjects.

  10. Textual data compression in computational biology: a synopsis.

    PubMed

    Giancarlo, Raffaele; Scaturro, Davide; Utro, Filippo

    2009-07-01

    Textual data compression, and the associated techniques coming from information theory, are often perceived as being of interest for data communication and storage. However, they are also deeply related to classification and data mining and analysis. In recent years, a substantial effort has been made for the application of textual data compression techniques to various computational biology tasks, ranging from storage and indexing of large datasets to comparison and reverse engineering of biological networks. The main focus of this review is on a systematic presentation of the key areas of bioinformatics and computational biology where compression has been used. When possible, a unifying organization of the main ideas and techniques is also provided. It goes without saying that most of the research results reviewed here offer software prototypes to the bioinformatics community. The Supplementary Material provides pointers to software and benchmark datasets for a range of applications of broad interest. In addition to providing references to software, the Supplementary Material also gives a brief presentation of some fundamental results and techniques related to this paper. It is at: http://www.math.unipa.it/~raffaele/suppMaterial/compReview/

  11. AnyWave: a cross-platform and modular software for visualizing and processing electrophysiological signals.

    PubMed

    Colombet, B; Woodman, M; Badier, J M; Bénar, C G

    2015-03-15

    The importance of digital signal processing in clinical neurophysiology is growing steadily, involving clinical researchers and methodologists. There is a need for crossing the gap between these communities by providing efficient delivery of newly designed algorithms to end users. We have developed such a tool which both visualizes and processes data and, additionally, acts as a software development platform. AnyWave was designed to run on all common operating systems. It provides access to a variety of data formats and it employs high fidelity visualization techniques. It also allows using external tools as plug-ins, which can be developed in languages including C++, MATLAB and Python. In the current version, plug-ins allow computation of connectivity graphs (non-linear correlation h2) and time-frequency representation (Morlet wavelets). The software is freely available under the LGPL3 license. AnyWave is designed as an open, highly extensible solution, with an architecture that permits rapid delivery of new techniques to end users. We have developed AnyWave software as an efficient neurophysiological data visualizer able to integrate state of the art techniques. AnyWave offers an interface well suited to the needs of clinical research and an architecture designed for integrating new tools. We expect this software to strengthen the collaboration between clinical neurophysiologists and researchers in biomedical engineering and signal processing. Copyright © 2015 Elsevier B.V. All rights reserved.
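
    As an illustration of the Morlet-wavelet time-frequency representation mentioned among the plug-ins (a numpy sketch, not AnyWave code), the example below convolves a synthetic signal with complex Morlet wavelets; the sampling rate, test signal, and number of cycles are assumptions made for the example.

      # Illustration only: Morlet-wavelet time-frequency power of a synthetic burst.
      import numpy as np

      fs = 250.0                                        # assumed sampling rate (Hz)
      t = np.arange(0, 2.0, 1.0 / fs)
      signal = np.sin(2 * np.pi * 10 * t) * (t > 1.0)   # 10 Hz appearing after 1 s

      def morlet_power(x, freqs, fs, n_cycles=7):
          # power of the convolution with complex Morlet wavelets, one row per frequency
          power = np.empty((len(freqs), len(x)))
          for i, f in enumerate(freqs):
              sigma_t = n_cycles / (2 * np.pi * f)
              wt = np.arange(-3 * sigma_t, 3 * sigma_t, 1.0 / fs)
              wavelet = np.exp(2j * np.pi * f * wt) * np.exp(-wt ** 2 / (2 * sigma_t ** 2))
              wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))
              power[i] = np.abs(np.convolve(x, wavelet, mode="same")) ** 2
          return power

      freqs = np.arange(5, 31, 5)
      tf = morlet_power(signal, freqs, fs)
      print(tf.shape)                                   # (6, 500): frequencies x samples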

  12. Evaluation of three different validation procedures regarding the accuracy of template-guided implant placement: an in vitro study.

    PubMed

    Vasak, Christoph; Strbac, Georg D; Huber, Christian D; Lettner, Stefan; Gahleitner, André; Zechner, Werner

    2015-02-01

    The study aims to evaluate the accuracy of the NobelGuide™ (Medicim/Nobel Biocare, Göteborg, Sweden) concept while maximally reducing the influence of clinical and surgical parameters. Moreover, the study was designed to compare and validate two validation procedures against a reference method. Overall, 60 implants were placed in 10 artificial edentulous mandibles according to the NobelGuide™ protocol. For merging the pre- and postoperative DICOM data sets, three different fusion methods (Triple Scan Technique, NobelGuide™ Validation software, and AMIRA® software [VSG - Visualization Sciences Group, Burlington, MA, USA] as reference) were applied. Discrepancies between the virtual and the actual implant positions were measured. The mean deviations measured with AMIRA® were 0.49 mm (implant shoulder), 0.69 mm (implant apex), and 1.98° (implant axis). The Triple Scan Technique as well as the NobelGuide™ Validation software revealed similar deviations compared with the reference method. A significant correlation between angular and apical deviations was seen (r = 0.53; p < .001). A greater implant diameter was associated with greater deviations (p = .03). The Triple Scan Technique, as a system-independent validation procedure, as well as the NobelGuide™ Validation software are in accordance with the AMIRA® software. The NobelGuide™ system showed similar or smaller spatial and angular deviations compared with others. © 2013 Wiley Periodicals, Inc.

  13. Methods, Software and Tools for Three Numerical Applications. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E. R. Jessup

    2000-03-01

    This is a report of the results of the authors' work supported by DOE contract DE-FG03-97ER25325. They proposed to study three numerical problems: (1) the extension of the PMESC parallel programming library; (2) the development of algorithms and software for certain generalized eigenvalue and singular value (SVD) problems; and (3) the application of techniques of linear algebra to an information retrieval technique known as latent semantic indexing (LSI).

  14. Algorithms and software for solving finite element equations on serial and parallel architectures

    NASA Technical Reports Server (NTRS)

    George, Alan

    1989-01-01

    Over the past 15 years numerous new techniques have been developed for solving systems of equations and eigenvalue problems arising in finite element computations. A package called SPARSPAK has been developed by the author and his co-workers which exploits these new methods. The broad objective of this research project is to incorporate some of this software in the Computational Structural Mechanics (CSM) testbed, and to extend the techniques for use on multiprocessor architectures.
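
    As a present-day illustration of the kind of sparse symmetric system such packages target (a SciPy sketch, not SPARSPAK itself), the example below assembles the tridiagonal stiffness matrix of a one-dimensional finite-element problem and solves it with a sparse direct solver.

      # Illustration only: solve a 1-D finite-element (Poisson) system with a sparse direct solver.
      import numpy as np
      import scipy.sparse as sp
      import scipy.sparse.linalg as spla

      n = 100                                            # interior nodes
      main = 2.0 * np.ones(n)
      off = -1.0 * np.ones(n - 1)
      K = sp.diags([off, main, off], offsets=[-1, 0, 1], format="csc")   # stiffness matrix
      f = np.ones(n) / (n + 1) ** 2                      # uniform unit load, element size h = 1/(n+1)
      u = spla.spsolve(K, f)                             # sparse direct solve
      print(u.max())                                     # peak deflection, about 0.125 at the midpoint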

  15. The software and algorithms for hyperspectral data processing

    NASA Astrophysics Data System (ADS)

    Shyrayeva, Anhelina; Martinov, Anton; Ivanov, Victor; Katkovsky, Leonid

    2017-04-01

    Hyperspectral remote sensing is widely used for collecting and processing information about objects on the Earth's surface. Hyperspectral data are combined to form a three-dimensional (x, y, λ) data cube. The Department of Aerospace Research of the Institute of Applied Physical Problems of the Belarusian State University presents a general model of software for hyperspectral image data analysis and processing. The software runs in a Windows XP/7/8/8.1/10 environment on any personal computer. The complex has been written in C++ using the Qt framework and OpenGL for graphical data visualization. The software has a flexible structure consisting of a set of independent plugins. Each plugin is compiled as a Qt plugin and represents a Windows dynamic library (DLL). Plugins can be categorized in terms of data reading types, data visualization (3D, 2D, 1D), and data processing. The software has various built-in functions for statistical and mathematical analysis and signal processing, such as direct smoothing with a moving average, the Savitzky-Golay smoothing technique, RGB correction, histogram transformation, and atmospheric correction. The software provides two of the authors' engineering techniques for solving the atmospheric correction problem: an iterative method for refining spectral albedo parameters using Libradtran, and an analytical least-squares method. The main advantages of these methods are a high processing rate (several minutes for 1 GB of data) and a low relative error in albedo retrieval (less than 15%). The software also supports work with spectral libraries, region of interest (ROI) selection, and spectral analysis such as cluster-type image classification and automatic comparison of hypercube spectra, by a similarity criterion, with similar spectra from spectral libraries, and vice versa. The software deals with different kinds of spectral information in order to identify and distinguish spectrally unique materials. Fast and low-memory hypercube manipulation, a user-friendly interface, modularity, and expandability should also be noted as advantages.
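
    As a small, hedged illustration of the smoothing functions named above (not the authors' C++/Qt code), the following Python sketch applies a moving average and a Savitzky-Golay filter to one synthetic noisy spectrum; the wavelength grid, band shape, and noise level are invented.

      # Illustration only: moving-average and Savitzky-Golay smoothing of a synthetic spectrum.
      import numpy as np
      from scipy.signal import savgol_filter

      wavelengths = np.linspace(400, 1000, 301)                     # nm, assumed grid
      spectrum = np.exp(-((wavelengths - 700) / 80) ** 2)           # synthetic absorption band
      noisy = spectrum + np.random.normal(0, 0.02, spectrum.size)

      moving_avg = np.convolve(noisy, np.ones(9) / 9, mode="same")      # direct smoothing
      savgol = savgol_filter(noisy, window_length=11, polyorder=3)      # Savitzky-Golay smoothing

      print(float(np.abs(savgol - spectrum).mean()))                # mean residual after smoothing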

  16. Does open-air exposure to volatile organic compounds near a plastic recycling factory cause health effects?

    PubMed

    Yorifuji, Takashi; Noguchi, Miyuki; Tsuda, Toshihide; Suzuki, Etsuji; Takao, Soshi; Kashima, Saori; Yanagisawa, Yukio

    2012-01-01

    After a plastic reprocessing factory began to operate in August 2004, the residents around the factory in Neyagawa, Osaka, Japan, began to complain of symptoms. Therefore, we conducted an exposure assessment and a population-based epidemiological study in 2006. To assess exposure, volatile organic compounds (VOCs) and total VOCs were measured at two locations in the vicinity of the factory. In the population-based study, a total of 3,950 residents were targeted. A self-administered questionnaire was used to collect information about subjects' mucocutaneous or respiratory symptoms. Using logistic regression models, we compared the prevalence of symptoms in July 2006 by employing the farthest area from the factory as a reference, and prevalence odds ratios (PORs) and their 95% confidence intervals (CIs) were estimated. The concentration of total VOCs was higher in the vicinity of the factory. The prevalence of mucocutaneous and respiratory symptoms was the highest among the residents in the closest area to the factory. Some symptoms were significantly increased among the residents within 500 m of the factory compared with residents of an area 2800 m from the factory: e.g., sore throat (POR=3.2, 95% CI: 1.3-8.0), eye itch (POR=3.0, 95% CI: 1.5-6.0), eye discharge (POR=6.0, 95% CI: 2.3-15.9), eczema (POR=3.0, 95% CI: 1.1-7.9) and sputum (POR=2.4, 95% CI: 1.1-5.1). Despite the limitations of this study, these results imply a possible association of open-air VOCs with mucocutaneous and respiratory symptoms. Because this kind of plastic recycling factory only recently came into operation, more attention should be paid to the operation of plastic recycling factories in the environment.

  17. Technical considerations to avoid delayed and non-union.

    PubMed

    McMillan, Tristan E; Johnstone, Alan J

    2017-06-01

    For many years intramedullary nails have been a well-accepted and successful method of diaphyseal fracture fixation. However, delayed and non-unions with this technique still occur and are associated with significant patient morbidity. The reasons for this can be multi-factorial. We discuss a number of technical considerations to maximise fracture reduction, fracture stability and fracture vascularity in order to achieve bony union. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.

  18. Determination of total Cr in wastewaters of Cr electroplating factories in the I.organize industry region (Kayseri, Turkey) by ICP-AES.

    PubMed

    Yilmaz, Selehattin; Türe, Melike; Sadikoglu, Murat; Duran, Ali

    2010-08-01

    Wastewater pollution in industrial areas is one of the most important environmental problems. Heavy metal pollution, especially chromium pollution in wastewater from electroplating, dyeing, and tannery operations, has affected life on earth. This pollution can affect all ecosystems and human health directly or through the food chain. Therefore, the determination of total chromium in this study is of great importance. In this study, an accurate, rapid, sensitive, selective, simple, and low-cost technique for the direct determination of total Cr by inductively coupled plasma-atomic emission spectrometry has been developed for wastewater samples collected from some Cr electroplating factories in March 2008. The analysis of a given sample is completed in about 15 min with this technique. As a result of the chromium analysis, the total Cr concentrations were found to exceed the limit value (0.05 mg L(-1); WHO, EPA, TSE 266, and inland water quality classification), at 1,898.78+/-0.34 mg/L at station 1 and 3,189.02+/-0.56 mg/L at station 2. Based on the total Cr concentrations found, the water was determined to be class IV quality according to the inland water classification. In order to validate the applied method, recovery studies were performed.

  19. Efficiency in Complexity: Composition and Dynamic Nature of Mimivirus Replication Factories

    PubMed Central

    Milrot, Elad; Mutsafi, Yael; Ben-Dor, Shifra; Levin, Yishai; Savidor, Alon; Kartvelishvily, Elena

    2016-01-01

    ABSTRACT The recent discovery of multiple giant double-stranded DNA (dsDNA) viruses blurred the consensual distinction between viruses and cells due to their size, as well as to their structural and genetic complexity. A dramatic feature revealed by these viruses as well as by many positive-strand RNA viruses is their ability to rapidly form elaborate intracellular organelles, termed “viral factories,” where viral progeny are continuously generated. Here we report the first isolation of viral factories at progressive postinfection time points. The isolated factories were subjected to mass spectrometry-based proteomics, bioinformatics, and imaging analyses. These analyses revealed that numerous viral proteins are present in the factories but not in mature virions, thus implying that multiple and diverse proteins are required to promote the efficiency of viral factories as “production lines” of viral progeny. Moreover, our results highlight the dynamic and highly complex nature of viral factories, provide new and general insights into viral infection, and substantiate the intriguing notion that viral factories may represent the living state of viruses. IMPORTANCE Large dsDNA viruses such as vaccinia virus and the giant mimivirus, as well as many positive-strand RNA viruses, generate elaborate cytoplasmic organelles in which the multiple and diverse transactions required for viral replication and assembly occur. These organelles, which were termed “viral factories,” are attracting much interest due to the increasing realization that the rapid and continuous production of viral progeny is a direct outcome of the elaborate structure and composition of the factories, which act as efficient production lines. To get new insights into the nature and function of viral factories, we devised a method that allows, for the first time, the isolation of these organelles. Analyses of the isolated factories generated at different times postinfection by mass spectrometry-based proteomics provide new perceptions of their role and reveal the highly dynamic nature of these organelles. PMID:27581975

  20. Software Assurance in Acquisition: Mitigating Risks to the Enterprise. A Reference Guide for Security-Enhanced Software Acquisition and Outsourcing

    DTIC Science & Technology

    2009-02-01

    management, available at <http://www.iso.org/iso/en/CatalogueDetailPage.CatalogueDetail?CSNUMBER=39612&ICS1=35&ICS2=40&ICS3=>. ISO/IEC 27001. Information...Management of the Systems Engineering Process. [ISO/IEC 27001] ISO/IEC 27001:2005. Information technology -- Security techniques -- Information security...software life cycles [ISO/IEC 15026]. Software assurance is a key element of national security and homeland security. It is critical because dramatic

  1. Estimation and enhancement of real-time software reliability through mutation analysis

    NASA Technical Reports Server (NTRS)

    Geist, Robert; Offutt, A. J.; Harris, Frederick C., Jr.

    1992-01-01

    A simulation-based technique for obtaining numerical estimates of the reliability of N-version, real-time software is presented. An extended stochastic Petri net is employed to represent the synchronization structure of N versions of the software, where dependencies among versions are modeled through correlated sampling of module execution times. Test results utilizing specifications for NASA's planetary lander control software indicate that mutation-based testing could hold greater potential for enhancing reliability than the desirable but perhaps unachievable goal of independence among N versions.

  2. Wildlife software: procedures for publication of computer software

    USGS Publications Warehouse

    Samuel, M.D.

    1990-01-01

    Computers and computer software have become an integral part of the practice of wildlife science. Computers now play an important role in teaching, research, and management applications. Because of the specialized nature of wildlife problems, specific computer software is usually required to address a given problem (e.g., home range analysis). This type of software is not usually available from commercial vendors and therefore must be developed by those wildlife professionals with particular skill in computer programming. Current journal publication practices generally prevent a detailed description of computer software associated with new techniques. In addition, peer review of journal articles does not usually include a review of associated computer software. Thus, many wildlife professionals are usually unaware of computer software that would meet their needs or of major improvements in software they commonly use. Indeed most users of wildlife software learn of new programs or important changes only by word of mouth.

  3. Real time flaw detection and characterization in tube through partial least squares and SVR: Application to eddy current testing

    NASA Astrophysics Data System (ADS)

    Ahmed, Shamim; Miorelli, Roberto; Calmon, Pierre; Anselmi, Nicola; Salucci, Marco

    2018-04-01

    This paper describes a Learning-By-Examples (LBE) technique for performing quasi-real-time flaw localization and characterization within a conductive tube based on Eddy Current Testing (ECT) signals. Within the framework of LBE, a combination of full-factorial (i.e., GRID) sampling and Partial Least Squares (PLS) feature extraction (i.e., GRID-PLS) techniques is applied to generate a suitable training set in the offline phase. Support Vector Regression (SVR) is utilized for model development and inversion during the offline and online phases, respectively. The performance and robustness of the proposed GRID-PLS/SVR strategy on a noisy test set are evaluated and compared with the standard GRID/SVR approach.
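
    The sketch below is a hedged, synthetic-data illustration of the offline/online split described above, using scikit-learn's PLS for feature extraction and SVR for the inverse model; the toy "ECT" signals, the single flaw parameter, and all settings are assumptions for the example, not the authors' GRID-PLS/SVR implementation.

      # Illustration only: PLS feature extraction followed by SVR inversion on synthetic data.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.svm import SVR

      rng = np.random.default_rng(0)
      flaw_depth = np.linspace(0.1, 2.0, 200)                 # parameter to be inverted
      grid = np.linspace(0.0, 1.0, 50)                        # toy probe positions
      signals = np.outer(flaw_depth, grid) + 0.3 * np.outer(flaw_depth ** 2, grid ** 2)
      signals += rng.normal(0, 0.01, signals.shape)           # measurement noise

      pls = PLSRegression(n_components=2)                     # offline: feature extraction
      features = pls.fit_transform(signals, flaw_depth)[0]

      svr = SVR(kernel="rbf", C=10.0)                         # offline: train the inverse model
      svr.fit(features, flaw_depth)

      test_features = pls.transform(signals[::20])            # online: fast inversion of new signals
      print(svr.predict(test_features)[:5])                   # estimated flaw depths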

  4. Genome engineering for microbial natural product discovery.

    PubMed

    Choi, Si-Sun; Katsuyama, Yohei; Bai, Linquan; Deng, Zixin; Ohnishi, Yasuo; Kim, Eung-Soo

    2018-03-03

    The discovery and development of microbial natural products (MNPs) have played pivotal roles in the fields of human medicine and its related biotechnology sectors over the past several decades. The post-genomic era has witnessed the development of microbial genome mining approaches to isolate previously unsuspected MNP biosynthetic gene clusters (BGCs) hidden in the genome, followed by various BGC awakening techniques to visualize compound production. Additional microbial genome engineering techniques have allowed higher MNP production titers, which could complement a traditional culture-based MNP chasing approach. Here, we describe recent developments in the MNP research paradigm, including microbial genome mining, NP BGC activation, and NP overproducing cell factory design. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. Iteration and Prototyping in Creating Technical Specifications.

    ERIC Educational Resources Information Center

    Flynt, John P.

    1994-01-01

    Claims that the development process for computer software can be greatly aided by the writers of specifications if they employ basic iteration and prototyping techniques. Asserts that computer software configuration management practices provide ready models for iteration and prototyping. (HB)

  6. Engineering software development with HyperCard

    NASA Technical Reports Server (NTRS)

    Darko, Robert J.

    1990-01-01

    The successful and unsuccessful techniques used in the development of software using HyperCard are described. The viability of the HyperCard for engineering is evaluated and the future use of HyperCard by this particular group of developers is discussed.

  7. 76 FR 77175 - New York Fun Factory Fireworks Display, Western Long Island Sound; Mamaroneck, NY

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-12

    ...-AA00 New York Fun Factory Fireworks Display, Western Long Island Sound; Mamaroneck, NY AGENCY: Coast... in support of the New York Fun Factory Fireworks display. This action is necessary to provide for the... the Coast Guard to define regulatory safety zones. On May 10, 2012 New York Fun Factory Events is...

  8. Team Software Development for Aerothermodynamic and Aerodynamic Analysis and Design

    NASA Technical Reports Server (NTRS)

    Alexandrov, N.; Atkins, H. L.; Bibb, K. L.; Biedron, R. T.; Carpenter, M. H.; Gnoffo, P. A.; Hammond, D. P.; Jones, W. T.; Kleb, W. L.; Lee-Rausch, E. M.

    2003-01-01

    A collaborative approach to software development is described. The approach employs the agile development techniques: project retrospectives, Scrum status meetings, and elements of Extreme Programming to efficiently develop a cohesive and extensible software suite. The software product under development is a fluid dynamics simulator for performing aerodynamic and aerothermodynamic analysis and design. The functionality of the software product is achieved both through the merging, with substantial rewrite, of separate legacy codes and the authorship of new routines. Examples of rapid implementation of new functionality demonstrate the benefits obtained with this agile software development process. The appendix contains a discussion of coding issues encountered while porting legacy Fortran 77 code to Fortran 95, software design principles, and a Fortran 95 coding standard.

  9. As-built design specification for proportion estimate software subsystem

    NASA Technical Reports Server (NTRS)

    Obrien, S. (Principal Investigator)

    1980-01-01

    The Proportion Estimate Processor evaluates four estimation techniques in order to get an improved estimate of the proportion of a scene that is planted in a selected crop. The four techniques to be evaluated were provided by the techniques development section and are: (1) random sampling; (2) proportional allocation, relative count estimate; (3) proportional allocation, Bayesian estimate; and (4) sequential Bayesian allocation. The user is given two options for computation of the estimated mean square error. These are referred to as the cluster calculation option and the segment calculation option. The software for the Proportion Estimate Processor is operational on the IBM 3031 computer.
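
    As a hedged toy version of two of the four estimators listed above (simple random sampling and a proportional-allocation estimate), not the IBM 3031 software, the sketch below estimates a crop proportion from synthetic pixel labels grouped into four segments.

      # Illustration only: two simple proportion estimators on synthetic labelled pixels.
      import numpy as np

      rng = np.random.default_rng(1)
      true_labels = rng.random(10_000) < 0.3            # pixels planted in the crop (30%)
      strata = rng.integers(0, 4, true_labels.size)     # four clusters/segments

      # (1) simple random sampling of n pixels
      n = 400
      sample = rng.choice(true_labels.size, size=n, replace=False)
      p_random = true_labels[sample].mean()

      # (2) proportional allocation: sample each stratum in proportion to its size
      p_strata, weights = [], []
      for s in range(4):
          idx = np.flatnonzero(strata == s)
          k = max(1, round(n * idx.size / true_labels.size))
          p_strata.append(true_labels[rng.choice(idx, size=k, replace=False)].mean())
          weights.append(idx.size / true_labels.size)
      p_proportional = float(np.dot(weights, p_strata))

      print(p_random, p_proportional)                   # both should be near the true 0.30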

  10. Binary-mask generation for diffractive optical elements using microcomputers.

    PubMed

    O'Shea, D C; Beletic, J W; Poutous, M

    1993-05-10

    A new technique for generation of binary masks for the fabrication of diffractive optical elements is investigated. This technique, which uses commercially available desktop-publishing hardware and software in conjunction with a standard photoreduction camera, is much faster and less expensive than the conventional methods. The short turnaround time and low cost should give researchers a much greater degree of flexibility in the field of binary optics and enable wider application of diffractive-optics technology. Techniques for generating optical elements by using standard software packages that produce PostScript output are described. An evaluation of the dimensional fidelity of the mask reproduction from design to its realization in photoresist is presented.

  11. Selecting reusable components using algebraic specifications

    NASA Technical Reports Server (NTRS)

    Eichmann, David A.

    1992-01-01

    A significant hurdle confronts the software reuser attempting to select candidate components from a software repository - discriminating between those components without resorting to inspection of the implementation(s). We outline a mixed classification/axiomatic approach to this problem based upon our lattice-based faceted classification technique and Guttag and Horning's algebraic specification techniques. This approach selects candidates by natural language-derived classification, by their interfaces, using signatures, and by their behavior, using axioms. We briefly outline our problem domain and related work. Lattice-based faceted classifications are described; the reader is referred to surveys of the extensive literature for algebraic specification techniques. Behavioral support for reuse queries is presented, followed by the conclusions.

  12. Benchmarking the ATLAS software through the Kit Validation engine

    NASA Astrophysics Data System (ADS)

    De Salvo, Alessandro; Brasolin, Franco

    2010-04-01

    The measurement of the experiment software performance is a very important metric in order to choose the most effective resources to be used and to discover the bottlenecks of the code implementation. In this work we present the benchmark techniques used to measure the ATLAS software performance through the ATLAS offline testing engine Kit Validation and the online portal Global Kit Validation. The performance measurements, the data collection, the online analysis and display of the results will be presented. The results of the measurement on different platforms and architectures will be shown, giving a full report on the CPU power and memory consumption of the Monte Carlo generation, simulation, digitization and reconstruction of the most CPU-intensive channels. The impact of the multi-core computing on the ATLAS software performance will also be presented, comparing the behavior of different architectures when increasing the number of concurrent processes. The benchmark techniques described in this paper have been used in the HEPiX group since the beginning of 2008 to help defining the performance metrics for the High Energy Physics applications, based on the real experiment software.

  13. A Model Independent S/W Framework for Search-Based Software Testing

    PubMed Central

    Baik, Jongmoon

    2014-01-01

    In Model-Based Testing (MBT) area, Search-Based Software Testing (SBST) has been employed to generate test cases from the model of a system under test. However, many types of models have been used in MBT. If the type of a model has changed from one to another, all functions of a search technique must be reimplemented because the types of models are different even if the same search technique has been applied. It requires too much time and effort to implement the same algorithm over and over again. We propose a model-independent software framework for SBST, which can reduce redundant works. The framework provides a reusable common software platform to reduce time and effort. The software framework not only presents design patterns to find test cases for a target model but also reduces development time by using common functions provided in the framework. We show the effectiveness and efficiency of the proposed framework with two case studies. The framework improves the productivity by about 50% when changing the type of a model. PMID:25302314
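
    As an illustration of the search step at the core of search-based software testing (a generic textbook example, not the framework described in the paper), the sketch below hill-climbs a two-integer input until a target branch condition is satisfied, using the usual branch-distance fitness.

      # Illustration only: hill climbing on a branch-distance fitness to cover one branch.
      import random

      def branch_distance(x, y):
          # fitness for the target branch `if x == 2 * y:`; 0 means the branch is taken
          return abs(x - 2 * y)

      def hill_climb(max_iters=50_000):
          x, y = random.randint(-1000, 1000), random.randint(-1000, 1000)
          best = branch_distance(x, y)
          for _ in range(max_iters):
              if best == 0:
                  break
              nx, ny = x + random.choice([-1, 1]), y + random.choice([-1, 1])
              d = branch_distance(nx, ny)
              if d < best:                       # keep the neighbour only if fitness improves
                  x, y, best = nx, ny, d
          return (x, y), best

      test_input, fitness = hill_climb()
      print(test_input, fitness)                 # fitness 0 => branch-covering test input found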

  14. Investigation of near-surface chemical, physical and mechanical properties of silicon carbide crystals and fibers modified by ion implantation

    NASA Astrophysics Data System (ADS)

    Spitznagel, J. A.; Wood, Susan

    1988-08-01

    The Software Engineering Institute is a federally funded research and development center sponsored by the Department of Defense (DOD). It was chartered by the Undersecretary of Defense for Research and Engineering on June 15, 1984. The SEI was established and is operated by Carnegie Mellon University (CMU) under contract F19628-C-0003, which was competitively awarded on December 28, 1984, by the Air Force Electronic Systems Division. The mission of the SEI is to provide the means to bring the ablest minds and the most effective technology to bear on the rapid improvement of the quality of operational software in mission-critical computer systems; to accelerate the reduction to practice of modern software engineering techniques and methods; to promulgate the use of modern techniques and methods throughout the mission-critical systems community; and to establish standards of excellence for the practice of software engineering. This report provides a summary of the programs and projects, staff, facilities, and service accomplishments of the Software Engineering Institute during 1987.

  15. Demonstration of a Safety Analysis on a Complex System

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Alfaro, Liliana; Alvarado, Christine; Brown, Molly; Hunt, Earl B.; Jaffe, Matt; Joslyn, Susan; Pinnell, Denise; Reese, Jon; Samarziya, Jeffrey

    1997-01-01

    For the past 17 years, Professor Leveson and her graduate students have been developing a theoretical foundation for safety in complex systems and building a methodology upon that foundation. The methodology includes special management structures and procedures, system hazard analyses, software hazard analysis, requirements modeling and analysis for completeness and safety, special software design techniques including the design of human-machine interaction, verification, operational feedback, and change analysis. The Safeware methodology is based on system safety techniques that are extended to deal with software and human error. Automation is used to enhance our ability to cope with complex systems. Identification, classification, and evaluation of hazards is done using modeling and analysis. To be effective, the models and analysis tools must consider the hardware, software, and human components in these systems. They also need to include a variety of analysis techniques and orthogonal approaches: There exists no single safety analysis or evaluation technique that can handle all aspects of complex systems. Applying only one or two may make us feel satisfied, but will produce limited results. We report here on a demonstration, performed as part of a contract with NASA Langley Research Center, of the Safeware methodology on the Center-TRACON Automation System (CTAS) portion of the air traffic control (ATC) system and procedures currently employed at the Dallas/Fort Worth (DFW) TRACON (Terminal Radar Approach CONtrol). CTAS is an automated system to assist controllers in handling arrival traffic in the DFW area. Safety is a system property, not a component property, so our safety analysis considers the entire system and not simply the automated components. Because safety analysis of a complex system is an interdisciplinary effort, our team included system engineers, software engineers, human factors experts, and cognitive psychologists.

  16. Logic Model Checking of Unintended Acceleration Claims in Toyota Vehicles

    NASA Technical Reports Server (NTRS)

    Gamble, Ed

    2012-01-01

    Part of the US Department of Transportation investigation of Toyota sudden unintended acceleration (SUA) involved analysis of the throttle control software. The JPL Laboratory for Reliable Software applied several techniques, including static analysis and logic model checking, to the software. A handful of logic models were built and some weaknesses were identified; however, no cause for SUA was found. The full NASA report includes numerous other analyses.

  17. Remote sensing for urban planning

    NASA Technical Reports Server (NTRS)

    Davis, Bruce A.; Schmidt, Nicholas; Jensen, John R.; Cowen, Dave J.; Halls, Joanne; Narumalani, Sunil; Burgess, Bryan

    1994-01-01

    Utility companies are challenged to provide services to a highly dynamic customer base. With factory closures and shifts in employment becoming a routine occurrence, the utility industry must develop new techniques to maintain records and plan for expected growth. BellSouth Telecommunications, the largest of the Bell telephone companies, currently serves over 13 million residences and 2 million commercial customers. Tracking the movement of customers and scheduling the delivery of service are major tasks for BellSouth that require intensive manpower and sophisticated information management techniques. Through NASA's Commercial Remote Sensing Program Office, BellSouth is investigating the utility of remote sensing and geographic information system techniques to forecast residential development. This paper highlights the initial results of this project, which indicate a high correlation between the U.S. Bureau of Census block group statistics and statistics derived from remote sensing data.

  18. Performing Quantitative Imaging Acquisition, Analysis and Visualization Using the Best of Open Source and Commercial Software Solutions.

    PubMed

    Shenoy, Shailesh M

    2016-07-01

    A challenge in any imaging laboratory, especially one that uses modern techniques, is to achieve a sustainable and productive balance between using open source and commercial software to perform quantitative image acquisition, analysis and visualization. In addition to considering the expense of software licensing, one must consider factors such as the quality and usefulness of the software's support, training and documentation. Also, one must consider the reproducibility with which multiple people generate results using the same software to perform the same analysis, how one may distribute their methods to the community using the software and the potential for achieving automation to improve productivity.

  19. Factorial Structure of the New Ecological Paradigm Scale in Two French Samples

    ERIC Educational Resources Information Center

    Fleury-Bahi, Ghozlane; Marcouyeux, Aurore; Renard, Elise; Roussiau, Nicolas

    2015-01-01

    The principal objective of this research is to test the factorial structure of the New Ecological Paradigm scale on a population of men and women residing in France. The tested model is a second-order factorial model. This factorial structure is evaluated on two separate samples to test the stability of the solution (a first sample of 253…

  20. The Vendors' Corner: Biblio-Techniques' Library and Information System (BLIS).

    ERIC Educational Resources Information Center

    Library Software Review, 1984

    1984-01-01

    Describes online catalog and integrated library computer system designed to enhance Washington Library Network's software. Highlights include system components; implementation options; system features (integrated library functions, database design, system management facilities); support services (installation and training, software maintenance and…

  1. Do code of conduct audits improve chemical safety in garment factories? Lessons on corporate social responsibility in the supply chain from Fair Wear Foundation.

    PubMed

    Lindholm, Henrik; Egels-Zandén, Niklas; Rudén, Christina

    2016-10-01

    In managing chemical risks to the environment and human health in supply chains, voluntary corporate social responsibility (CSR) measures, such as auditing code of conduct compliance, play an important role. To examine how well suppliers' chemical health and safety performance complies with buyers' CSR policies and whether audited factories improve their performance. CSR audits (n = 288) of garment factories conducted by Fair Wear Foundation (FWF), an independent non-profit organization, were analyzed using descriptive statistics and statistical modeling. Forty-three per cent of factories did not comply with the FWF code of conduct, i.e. received remarks on chemical safety. Only among factories audited 10 or more times was there a significant increase in the number of factories receiving no remarks. Compliance with chemical safety requirements in garment supply chains is low and auditing is statistically correlated with improvements only at factories that have undergone numerous audits.

  2. Do code of conduct audits improve chemical safety in garment factories? Lessons on corporate social responsibility in the supply chain from Fair Wear Foundation

    PubMed Central

    2016-01-01

    Background In managing chemical risks to the environment and human health in supply chains, voluntary corporate social responsibility (CSR) measures, such as auditing code of conduct compliance, play an important role. Objectives To examine how well suppliers’ chemical health and safety performance complies with buyers’ CSR policies and whether audited factories improve their performance. Methods CSR audits (n = 288) of garment factories conducted by Fair Wear Foundation (FWF), an independent non-profit organization, were analyzed using descriptive statistics and statistical modeling. Results Forty-three per cent of factories did not comply with the FWF code of conduct, i.e. received remarks on chemical safety. Only among factories audited 10 or more times was there a significant increase in the number of factories receiving no remarks. Conclusions Compliance with chemical safety requirements in garment supply chains is low and auditing is statistically correlated with improvements only at factories that have undergone numerous audits. PMID:27611103

  3. Effect of factory effluents on physiological and biochemical contents of Gossypium hirsutum l.

    PubMed

    Muthusamy, A; Jayabalan, N

    2001-10-01

    The effect of sago and sugar factory effluents was studied on Gossypium hirsutum L. var. MCU 5 and MCU 11. Plants were irrigated with 0, 25, 50, 75 and 100% effluents from both factories. At the lower concentration (25%), sugar factory effluents had a stimulatory effect on all biochemical contents observed. Moreover, all concentrations of sago factory effluents were found to have an inhibitory effect on all biochemical contents except proline, which increased with increasing concentration of both effluents. Plants growing adjacent to sago and sugar factories, or irrigated with such polluted water, may accumulate the heavy metals found in both effluents at higher levels in plant products, which if consumed may have similar effects on living organisms.

  4. Advanced techniques and technology for efficient data storage, access, and transfer

    NASA Technical Reports Server (NTRS)

    Rice, Robert F.; Miller, Warner

    1991-01-01

    Advanced techniques for efficiently representing most forms of data are being implemented in practical hardware and software form through the joint efforts of three NASA centers. These techniques adapt to local statistical variations to continually provide near optimum code efficiency when representing data without error. Demonstrated in several earlier space applications, these techniques are the basis of initial NASA data compression standards specifications. Since the techniques clearly apply to most NASA science data, NASA invested in the development of both hardware and software implementations for general use. This investment includes high-speed single-chip very large scale integration (VLSI) coding and decoding modules as well as machine-transferable software routines. The hardware chips were tested in the laboratory at data rates as high as 700 Mbits/s. A coding module's definition includes a predictive preprocessing stage and a powerful adaptive coding stage. The function of the preprocessor is to optimally process incoming data into a standard form data source that the second stage can handle. The built-in preprocessor of the VLSI coder chips is ideal for high-speed sampled data applications such as imaging and high-quality audio, but additionally, the second stage adaptive coder can be used separately with any source that can be externally preprocessed into the 'standard form'. This generic functionality assures that the applicability of these techniques and their recent high-speed implementations should be equally broad outside of NASA.
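
    As a toy sketch of the two stages named above (a predictive preprocessor producing a "standard form" source, followed by an adaptive coder), the example below uses first-difference prediction with zig-zag mapping and a fixed-parameter Rice (Golomb power-of-two) code. It is an illustration only, not the flight VLSI implementation or the NASA standard; a real adaptive coder would also select the parameter k per block of samples.

      # Illustration only: first-difference preprocessing followed by Rice coding.
      def preprocess(samples):
          # first-difference prediction; residuals zig-zag mapped to 0, 1, 2, ...
          # (the first sample is assumed transmitted separately as a reference)
          out = []
          for prev, s in zip(samples, samples[1:]):
              d = s - prev
              out.append(2 * d if d >= 0 else -2 * d - 1)
          return out

      def rice_encode(value, k):
          # Rice code: unary quotient, '0' terminator, then k-bit remainder
          q, r = value >> k, value & ((1 << k) - 1)
          return "1" * q + "0" + format(r, f"0{k}b")

      samples = [100, 102, 101, 105, 104, 104, 98]
      mapped = preprocess(samples)
      k = 2
      bits = "".join(rice_encode(v, k) for v in mapped)
      print(mapped)                                     # [4, 1, 8, 1, 0, 11]
      print(len(bits), "coded bits vs", 8 * len(mapped), "raw bits for the residuals")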

  5. Clinical software development for the Web: lessons learned from the BOADICEA project

    PubMed Central

    2012-01-01

    Background In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. Results We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. Conclusions We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback. PMID:22490389

  6. Clinical software development for the Web: lessons learned from the BOADICEA project.

    PubMed

    Cunningham, Alex P; Antoniou, Antonis C; Easton, Douglas F

    2012-04-10

    In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback.

  7. The International Design Study for the Neutrino Factory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, K.

    2008-02-21

    The International Design Study for a future Neutrino Factory and super-beam facility (the ISS) established the physics case for a high-precision programme of long-baseline neutrino-oscillation measurements. The ISS also identified baseline specifications for the Neutrino Factory accelerator complex and the neutrino detector systems. This paper summarises the objectives of the International Design Study for the Neutrino Factory (the IDS-NF). The IDS-NF will build on the work of the ISS to deliver a Reference Design Report for the Neutrino Factory by 2012/13 and an Interim Design Report by 2010/11.

  8. 19 CFR Appendix to Part 102 - Textile and Apparel Manufacturer Identification

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... factories in Macau start with the same words, “Fabrica de Artigos de Vestuario,” which means “Factory of Clothing.” For a factory named “Fabrica de Artigos de Vestuario JUMP HIGH Ltd,” the portion of the factory...; VE20TCEN5880CAR Fabrica de Artigos de Vestuario TOP JOB, Grand River Building, FI 2-4, Macau; MOTOPJOB24MAC THE...

  9. 19 CFR Appendix to Part 102 - Textile and Apparel Manufacturer Identification

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... factories in Macau start with the same words, “Fabrica de Artigos de Vestuario,” which means “Factory of Clothing.” For a factory named “Fabrica de Artigos de Vestuario JUMP HIGH Ltd,” the portion of the factory...; VE20TCEN5880CAR Fabrica de Artigos de Vestuario TOP JOB, Grand River Building, FI 2-4, Macau; MOTOPJOB24MAC THE...

  10. Bulgar Factories (Trading Posts) in the Kama River Area as a Factor of Adjustment to Feudalism

    ERIC Educational Resources Information Center

    Krylasova, Natalia B.; Belavin, Andrei M.; Podosenova, Yulia A.

    2016-01-01

    At the start of the 2nd millennium AD a number of trading posts, or factories, emerged in the Cis-Ural region with participation of Bulgar handicraftsmen and merchants. They were townships populated by various ethnic groups. Several centuries later similar factories were set up by natives of the Cis-Ural region in Western Siberia. These factories have…

  11. Photovoltaic pilot projects in the European community

    NASA Astrophysics Data System (ADS)

    Treble, F. C.; Grassi, G.; Schnell, W.

    The paper presents proposals received for the construction of photovoltaic pilot plants as part of the Commission of the European Communities' second 4-year solar energy R and D program. The proposed plants range from 30 to 300 kWp and cover a variety of applications including rural electrification, water pumping, desalination, dairy farming, factories, hospitals, schools and vacation centers. Fifteen projects will be accepted with a total generating capacity of 1 MWp, with preference given to those projects involving the development of new techniques, components and systems.

  12. The Factory of the Future

    NASA Technical Reports Server (NTRS)

    Byman, J. E.

    1985-01-01

    A brief history of aircraft production techniques is given. A flexible machining cell is then described. It is a computer-controlled system capable of performing 4-axis machining, part cleaning, dimensional inspection, and materials handling functions in an unmanned environment. The cell was designed to: allow processing of similar and dissimilar parts in random order without disrupting production; allow serial (one-shipset-at-a-time) manufacturing; reduce work-in-process inventory; maximize machine utilization through remote set-up; and maximize throughput while minimizing labor.

  13. Evaluation of cavity size, kind, and filling technique of composite shrinkage by finite element.

    PubMed

    Jafari, Toloo; Alaghehmad, Homayoon; Moodi, Ehsan

    2018-01-01

    Cavity preparation reduces the rigidity of a tooth and its resistance to deformation. The purpose of this study was to evaluate the dimensional changes of repaired teeth using two types of light-cure composite and two filling methods (incremental and bulk) by means of the finite element method. In this computerized in vitro experimental study, an intact maxillary premolar was scanned using a cone beam computed tomography instrument (SCANORA, Switzerland), and each section of the tooth image was transferred to Ansys software via AUTOCAD. Eight cavity preparation sizes and two restoration methods (bulk and incremental), using two different composite resin materials (Heliomolar, Brilliant), were then modelled in the software, and the analysis was completed with Ansys. Dimensional change increased with the widening and deepening of the cavities; it also increased with the Brilliant composite resin and with the incremental filling technique. Cavity depth and filling technique had the greatest influence on dimensional change after curing, whereas the type of composite resin did not play a significant role.

  14. A common distributed language approach to software integration

    NASA Technical Reports Server (NTRS)

    Antonelli, Charles J.; Volz, Richard A.; Mudge, Trevor N.

    1989-01-01

    An important objective in software integration is the development of techniques to allow programs written in different languages to function together. Several approaches are discussed toward achieving this objective and the Common Distributed Language Approach is presented as the approach of choice.

  15. Industry best practices for the software development life cycle

    DOT National Transportation Integrated Search

    2007-11-01

    In the area of software development, there are many different views of what constitutes a best practice. The goal of this project was to identify a set of industry best practice techniques that fit the needs of the Montana Department of Transportatio...

  16. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the openSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.
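
    Self-composition is a general way to reduce a 2-safety property (a statement about pairs of runs, such as non-propagation of secrets or errors) to an ordinary assertion over a program composed with a renamed copy of itself. The Python sketch below illustrates the idea on a toy keystream function; the function, the property checked and the sampling loop are illustrative assumptions and do not represent the authors' RC4 verification conditions.

        # Minimal sketch of self-composition (illustration only, not the RC4 proof).
        # 2-safety property checked: two runs that agree on the public input also
        # agree on the output, i.e. the secret does not propagate to the result.

        def keystream_byte(public_counter: int, secret_key: int) -> int:
            # Toy stand-in for one cipher step; deliberately independent of
            # secret_key so the property below actually holds.
            return (public_counter * 31 + 7) % 256

        def self_composed_check(pub: int, secret_a: int, secret_b: int) -> None:
            # The program composed with a renamed copy of itself: same public
            # input, two arbitrary secrets; the property becomes a plain assert.
            assert keystream_byte(pub, secret_a) == keystream_byte(pub, secret_b), \
                "secret input propagated to the observable output"

        if __name__ == "__main__":
            # A deductive verifier would discharge this for all inputs; here we
            # merely exercise a few concrete cases as a sanity check.
            for pub in range(256):
                self_composed_check(pub, secret_a=0x13, secret_b=0xAF)
            print("no propagation observed on the sampled inputs")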

  17. Standards guide for space and earth sciences computer software

    NASA Technical Reports Server (NTRS)

    Mason, G.; Chapman, R.; Klinglesmith, D.; Linnekin, J.; Putney, W.; Shaffer, F.; Dapice, R.

    1972-01-01

    Guidelines for the preparation of systems analysis and programming work statements are presented. The data is geared toward the efficient administration of available monetary and equipment resources. Language standards and the application of good management techniques to software development are emphasized.

  18. Large Eddy Simulations using oodlesDST

    DTIC Science & Technology

    2016-01-01

    Research Agency DST-Group-TR-3205 ABSTRACT The oodlesDST code is based on OpenFOAM software and performs Large Eddy Simulations of......maritime platforms using a variety of simulation techniques. He is currently using OpenFOAM software to perform both Reynolds Averaged Navier-Stokes

  19. A simple method of measuring tibial tubercle to trochlear groove distance on MRI: description of a novel and reliable technique.

    PubMed

    Camp, Christopher L; Heidenreich, Mark J; Dahm, Diane L; Bond, Jeffrey R; Collins, Mark S; Krych, Aaron J

    2016-03-01

    Tibial tubercle-trochlear groove (TT-TG) distance is a variable that helps guide surgical decision-making in patients with patellar instability. The purpose of this study was to compare the accuracy and reliability of an MRI TT-TG measuring technique using a simple external alignment method to a previously validated gold standard technique that requires advanced software read by radiologists. TT-TG was calculated by MRI on 59 knees with a clinical diagnosis of patellar instability in a blinded and randomized fashion by two musculoskeletal radiologists using advanced software and by two orthopaedists using the study technique, which utilizes measurements taken on a simple electronic imaging platform. Interrater reliability between the two radiologists and the two orthopaedists and intermethods reliability between the two techniques were calculated using intraclass correlation coefficients (ICC) and concordance correlation coefficients (CCC). ICC and CCC values greater than 0.75 were considered to represent excellent agreement. The mean TT-TG distance was 14.7 mm (Standard Deviation (SD) 4.87 mm) and 15.4 mm (SD 5.41) as measured by the radiologists and orthopaedists, respectively. Excellent interobserver agreement was noted between the radiologists (ICC 0.941; CCC 0.941), the orthopaedists (ICC 0.978; CCC 0.976), and the two techniques (ICC 0.941; CCC 0.933). The simple TT-TG distance measurement technique analysed in this study resulted in excellent agreement and reliability as compared to the gold standard technique. This method can predictably be performed by orthopaedic surgeons without advanced radiologic software. II.
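
    For readers who want to reproduce the agreement statistics on their own measurements, Lin's concordance correlation coefficient can be computed directly from paired values. The Python sketch below uses invented TT-TG distances purely for illustration; it is not the study data and does not reproduce the blinded measurement protocol.

        import numpy as np

        def concordance_ccc(x, y):
            # Lin's concordance correlation coefficient between two raters/methods:
            # CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2)
            x, y = np.asarray(x, float), np.asarray(y, float)
            sxy = np.cov(x, y, bias=True)[0, 1]
            return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

        # Hypothetical TT-TG distances (mm) for the same knees by two methods.
        radiologist  = [12.1, 15.3, 18.7, 9.8, 14.2, 20.5, 16.0, 11.4]
        orthopaedist = [12.5, 15.0, 19.1, 10.2, 13.8, 21.0, 16.4, 11.0]
        print(f"CCC = {concordance_ccc(radiologist, orthopaedist):.3f}")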

  20. Development of a Prototype Automation Simulation Scenario Generator for Air Traffic Management Software Simulations

    NASA Technical Reports Server (NTRS)

    Khambatta, Cyrus F.

    2007-01-01

    A technique for automated development of scenarios for use in the Multi-Center Traffic Management Advisor (McTMA) software simulations is described. The resulting software is designed and implemented to automate the generation of simulation scenarios, with the intent of reducing the time it currently takes using an observational approach. The software program is effective in achieving this goal. The scenarios created for use in the McTMA simulations are based on data files from the McTMA system, and were manually edited before incorporation into the simulations to ensure accuracy. Despite the software's overall favorable performance, several key software issues are identified. Proposed solutions to these issues are discussed. Future enhancements to the scenario generator software may address the limitations identified in this paper.

  1. Analysis of replication factories in human cells by super-resolution light microscopy

    PubMed Central

    2009-01-01

    Background DNA replication in human cells is performed in discrete sub-nuclear locations known as replication foci or factories. These factories form in the nucleus during S phase and are sites of DNA synthesis and high local concentrations of enzymes required for chromatin replication. Why these structures are required, and how they are organised internally, has yet to be identified. It has been difficult to analyse the structure of these factories as they are small in size and thus below the resolution limit of the standard confocal microscope. We have used stimulated emission depletion (STED) microscopy, which improves on the resolving power of the confocal microscope, to probe the structure of these factories at sub-diffraction limit resolution. Results Using immunofluorescent imaging of PCNA (proliferating cell nuclear antigen) and RPA (replication protein A) we show that factories are smaller in size (approximately 150 nm diameter), and greater in number (up to 1400 in an early S-phase nucleus), than is determined by confocal imaging. The replication inhibitor hydroxyurea caused an approximately 40% reduction in number and a 30% increase in diameter of replication factories, changes that were not clearly identified by standard confocal imaging. Conclusions These measurements for replication factory size now approach the dimensions suggested by electron microscopy. The agreement between these two methods, which use very different sample preparation and imaging conditions, suggests that we have arrived at a true measurement for the size of these structures. The number of individual factories present in a single nucleus that we measure using this system is greater than has been previously reported. This analysis therefore suggests that each replication factory contains fewer active replication forks than previously envisaged. PMID:20015367

  2. GNU Radio Sandia Utilities v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, Jacob; Knee, Peter

    This software adds a data handling module to the GNU Radio (GR) software defined radio (SDR) framework as well as some general-purpose function blocks (filters, metadata control, etc.). This software is useful for processing bursty RF transmissions with GR, and serves as a base for applying SDR signal processing techniques to a whole burst of data at a time, as opposed to the streaming data that GR has primarily focused on.

  3. Thermography based prescreening software tool for veterinary clinics

    NASA Astrophysics Data System (ADS)

    Dahal, Rohini; Umbaugh, Scott E.; Mishra, Deependra; Lama, Norsang; Alvandipour, Mehrdad; Umbaugh, David; Marino, Dominic J.; Sackman, Joseph

    2017-05-01

    Under development is a clinical software tool which can be used in veterinary clinics as a prescreening tool for these pathologies: anterior cruciate ligament (ACL) disease, bone cancer and feline hyperthyroidism. Currently, veterinary clinical practice uses several imaging techniques, including radiology, computed tomography (CT), and magnetic resonance imaging (MRI). However, the harmful radiation involved in imaging, expensive equipment setup, excessive time consumption, and the need for a cooperative patient during imaging are major drawbacks of these techniques. In veterinary procedures, it is very difficult for animals to remain still for the time periods necessary for standard imaging without resorting to sedation - which creates another set of complexities. Therefore, clinical application software integrated with a thermal imaging system, together with algorithms of high sensitivity and specificity for these pathologies, can address the major drawbacks of the existing imaging techniques. A graphical user interface (GUI) has been created to allow ease of use for the clinical technician. The technician inputs an image, enters patient information, and selects the camera view associated with the image and the pathology to be diagnosed. The software classifies the image using an optimized classification algorithm that has been developed through thousands of experiments. Optimal image features are extracted and the feature vector is then used in conjunction with the stored image database for classification. Classification success rates as high as 88% for bone cancer, 75% for ACL and 90% for feline hyperthyroidism have been achieved. The software is currently undergoing preliminary clinical testing.
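
    The workflow sketched in the abstract (extract a feature vector from each thermogram, then classify it against a stored image database) can be illustrated with a few lines of scikit-learn. The features, the nearest-neighbour classifier and the synthetic data below are assumptions made for illustration; they are not the optimized algorithm or the clinical image database described by the authors.

        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.model_selection import cross_val_predict
        from sklearn.metrics import confusion_matrix

        def thermal_features(image):
            # Illustrative intensity/texture features from one thermogram.
            return np.array([image.mean(), image.std(),
                             np.percentile(image, 90),
                             np.abs(np.diff(image, axis=0)).mean()])

        # Hypothetical database: 100 thermograms, label 1 = pathology present.
        rng = np.random.default_rng(0)
        images = rng.normal(30.0, 2.0, size=(100, 64, 64))
        labels = rng.integers(0, 2, size=100)
        X = np.array([thermal_features(im) for im in images])

        pred = cross_val_predict(KNeighborsClassifier(n_neighbors=5), X, labels, cv=5)
        tn, fp, fn, tp = confusion_matrix(labels, pred).ravel()
        print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")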

  4. Digital Image Correlation from Commercial to FOS Software: a Mature Technique for Full-Field Displacement Measurements

    NASA Astrophysics Data System (ADS)

    Belloni, V.; Ravanelli, R.; Nascetti, A.; Di Rita, M.; Mattei, D.; Crespi, M.

    2018-05-01

    In the last few decades, there has been growing interest in non-contact methods for full-field displacement and strain measurement. Among such techniques, Digital Image Correlation (DIC) has received particular attention thanks to its ability to provide this information by comparing digital images of a sample surface before and after deformation. The method is now commonly adopted in civil, mechanical and aerospace engineering, and various companies and research groups have implemented 2D and 3D DIC software. This work first reviews the current status of DIC software. It then presents a free and open source 2D DIC software package, named py2DIC, developed in Python at the Geodesy and Geomatics Division of DICEA of the University of Rome "La Sapienza". Its potential was evaluated by processing images captured during tensile tests performed in the Structural Engineering Lab of the University of Rome "La Sapienza" and comparing the results to those obtained using the commercial software Vic-2D developed by Correlated Solutions Inc, USA. The agreement of these results at the one-hundredth-of-a-millimetre level demonstrates that this open source software can be used as a valuable 2D DIC tool to measure full-field displacements on the investigated sample surface.
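
    At its core, 2D DIC tracks small subsets of the reference image in the deformed image, usually by maximising a normalised cross-correlation score. The sketch below shows that core step for a single subset using OpenCV template matching; the subset size, search window, synthetic speckle pattern and lack of sub-pixel refinement are simplifying assumptions, and the code does not reproduce py2DIC or Vic-2D.

        import numpy as np
        import cv2

        def subset_displacement(ref, deformed, center, subset=31, search=20):
            # Track one square subset of the reference image in the deformed image
            # by maximising normalised cross-correlation (integer-pixel accuracy).
            r = subset // 2
            y, x = center
            template = ref[y - r:y + r + 1, x - r:x + r + 1]
            window = deformed[y - r - search:y + r + search + 1,
                              x - r - search:x + r + search + 1]
            score = cv2.matchTemplate(window, template, cv2.TM_CCOEFF_NORMED)
            _, _, _, max_loc = cv2.minMaxLoc(score)
            return max_loc[0] - search, max_loc[1] - search   # (dx, dy)

        # Synthetic check: shift a speckle pattern by (dx, dy) = (3, 5) pixels.
        rng = np.random.default_rng(1)
        ref = (rng.random((200, 200)) * 255).astype(np.float32)
        deformed = np.roll(ref, shift=(5, 3), axis=(0, 1))
        print(subset_displacement(ref, deformed, center=(100, 100)))  # expect (3, 5)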

  5. The software-cycle model for re-engineering and reuse

    NASA Technical Reports Server (NTRS)

    Bailey, John W.; Basili, Victor R.

    1992-01-01

    This paper reports on the progress of a study which will contribute to our ability to perform high-level, component-based programming by describing means to obtain useful components, methods for the configuration and integration of those components, and an underlying economic model of the costs and benefits associated with this approach to reuse. One goal of the study is to develop and demonstrate methods to recover reusable components from domain-specific software through a combination of tools, to perform the identification, extraction, and re-engineering of components, and domain experts, to direct the applications of those tools. A second goal of the study is to enable the reuse of those components by identifying techniques for configuring and recombining the re-engineered software. This component-recovery or software-cycle model addresses not only the selection and re-engineering of components, but also their recombination into new programs. Once a model of reuse activities has been developed, the quantification of the costs and benefits of various reuse options will enable the development of an adaptable economic model of reuse, which is the principal goal of the overall study. This paper reports on the conception of the software-cycle model and on several supporting techniques of software recovery, measurement, and reuse which will lead to the development of the desired economic model.

  6. Permutation testing of orthogonal factorial effects in a language-processing experiment using fMRI.

    PubMed

    Suckling, John; Davis, Matthew H; Ooi, Cinly; Wink, Alle Meije; Fadili, Jalal; Salvador, Raymond; Welchew, David; Sendur, Levent; Maxim, Vochita; Bullmore, Edward T

    2006-05-01

    The block-paradigm of the Functional Image Analysis Contest (FIAC) dataset was analysed with the Brain Activation and Morphological Mapping software. Permutation methods in the wavelet domain were used for inference on cluster-based test statistics of orthogonal contrasts relevant to the factorial design of the study, namely: the average response across all active blocks, the main effect of speaker, the main effect of sentence, and the interaction between sentence and speaker. Extensive activation was seen with all these contrasts. In particular, different vs. same-speaker blocks produced elevated activation in bilateral regions of the superior temporal lobe and repetition suppression for linguistic materials (same vs. different-sentence blocks) in left inferior frontal regions. These are regions previously reported in the literature. Additional regions were detected in this study, perhaps due to the enhanced sensitivity of the methodology. Within-block sentence suppression was tested post-hoc by regression of an exponential decay model onto the extracted time series from the left inferior frontal gyrus, but no strong evidence of such an effect was found. The significance levels set for the activation maps are P-values at which we expect <1 false-positive cluster per image. Nominal type I error control was verified by empirical testing of a test statistic corresponding to a randomly ordered design matrix. The small size of the BOLD effect necessitates sensitive methods of detection of brain activation. Permutation methods permit the necessary flexibility to develop novel test statistics to meet this challenge.
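
    The permutation logic used here rests on a simple principle: recompute the test statistic under many random relabellings of the design to build its null distribution, then compare the observed statistic against it. The sketch below shows a generic two-condition permutation test on synthetic block responses; it illustrates only the principle and not the wavelet-domain, cluster-based statistics used in the study.

        import numpy as np

        def permutation_pvalue(a, b, n_perm=5000, seed=0):
            # Two-sided p-value for the difference in means, with the null
            # distribution obtained by randomly re-assigning condition labels.
            rng = np.random.default_rng(seed)
            pooled = np.concatenate([a, b])
            observed = a.mean() - b.mean()
            exceed = 0
            for _ in range(n_perm):
                perm = rng.permutation(pooled)
                diff = perm[:len(a)].mean() - perm[len(a):].mean()
                exceed += abs(diff) >= abs(observed)
            return (exceed + 1) / (n_perm + 1)

        rng = np.random.default_rng(42)
        same_speaker = rng.normal(0.0, 1.0, 20)   # hypothetical block responses
        diff_speaker = rng.normal(0.6, 1.0, 20)
        print(f"p = {permutation_pvalue(same_speaker, diff_speaker):.4f}")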

  7. Prediction of hearing loss among the noise-exposed workers in a steel factory using artificial intelligence approach.

    PubMed

    Aliabadi, Mohsen; Farhadian, Maryam; Darvishi, Ebrahim

    2015-08-01

    Prediction of hearing loss in noisy workplaces is considered an important aspect of a hearing conservation program. Artificial intelligence, as a new approach, can be used to predict complex phenomena such as hearing loss. Using artificial neural networks, this study aims to present an empirical model for the prediction of the hearing loss threshold among noise-exposed workers. Two hundred and ten workers employed in a steel factory were chosen, and their occupational exposure histories were collected. To determine the hearing loss threshold, an audiometric test was carried out using a calibrated audiometer. Personal noise exposure was also measured using a noise dosimeter at the workers' workstations. Finally, data on five variables that can influence hearing loss were used to develop the prediction model. Multilayer feed-forward neural networks with different structures were developed using MATLAB software. The network structures had one hidden layer with approximately 5 to 15 neurons. The best network, with one hidden layer of ten neurons, could accurately predict the hearing loss threshold with RMSE = 2.6 dB and R(2) = 0.89. The results also confirmed that neural networks provide more accurate predictions than multiple regression. Since occupational hearing loss is frequently non-curable, the results of accurate prediction can be used by occupational health experts to modify and improve noise exposure conditions.
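
    The reported model is a standard feed-forward network with a single hidden layer of about ten neurons. An equivalent setup can be sketched with scikit-learn as below; the five predictor variables, their ranges and the synthetic target are placeholders for illustration and are not the study's measurements or its MATLAB implementation.

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import mean_squared_error, r2_score

        # Hypothetical predictors: age, exposure years, noise dose (dB(A)),
        # smoking index, baseline threshold.  Target: hearing loss threshold (dB).
        rng = np.random.default_rng(7)
        X = rng.uniform([20, 1, 75, 0, 0], [60, 30, 100, 10, 25], size=(210, 5))
        y = 0.4 * X[:, 1] + 0.6 * (X[:, 2] - 75) + rng.normal(0, 2.5, 210)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
        model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
        model.fit(X_tr, y_tr)
        pred = model.predict(X_te)
        print(f"RMSE = {mean_squared_error(y_te, pred) ** 0.5:.2f} dB, "
              f"R^2 = {r2_score(y_te, pred):.2f}")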

  8. Determination of the clean-up efficiency of the solid-phase extraction of rosemary extracts: Application of full-factorial design in hyphenation with Gaussian peak fit function.

    PubMed

    Meischl, Florian; Kirchler, Christian Günter; Jäger, Michael Andreas; Huck, Christian Wolfgang; Rainer, Matthias

    2018-02-01

    We present a novel method for the quantitative determination of the clean-up efficiency to provide a calculated parameter for peak purity through iterative fitting in conjunction with design of experiments. Rosemary extracts were used and analyzed before and after solid-phase extraction using a self-fabricated mixed-mode sorbent based on poly(N-vinylimidazole/ethylene glycol dimethacrylate). Optimization was performed by variation of washing steps using a full three-level factorial design and response surface methodology. Separation efficiency of rosmarinic acid from interfering compounds was calculated using an iterative fit of Gaussian-like signals and quantifications were performed by the separate integration of the two interfering peak areas. Results and recoveries were analyzed using Design-Expert® software and revealed significant differences between the washing steps. Optimized parameters were considered and used for all further experiments. Furthermore, the solid-phase extraction procedure was tested and compared with commercial available sorbents. In contrast to generic protocols of the manufacturers, the optimized procedure showed excellent recoveries and clean-up rates for the polymer with ion exchange properties. Finally, rosemary extracts from different manufacturing areas and application types were studied to verify the developed method for its applicability. The cleaned-up extracts were analyzed by liquid chromatography with tandem mass spectrometry for detailed compound evaluation to exclude any interference from coeluting molecules. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
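
    The peak-purity calculation described here depends on fitting overlapping Gaussian-like signals and integrating the resolved components separately. A minimal scipy sketch of that step is shown below; the retention times, widths and amplitudes are invented for illustration, and the full-factorial optimisation of the washing steps is not reproduced.

        import numpy as np
        from scipy.optimize import curve_fit

        def two_gaussians(t, a1, t1, s1, a2, t2, s2):
            # Sum of two Gaussian peaks (analyte plus interfering compound).
            return (a1 * np.exp(-0.5 * ((t - t1) / s1) ** 2) +
                    a2 * np.exp(-0.5 * ((t - t2) / s2) ** 2))

        # Synthetic chromatogram: two partially overlapping peaks plus noise.
        t = np.linspace(0, 10, 500)
        rng = np.random.default_rng(3)
        signal = two_gaussians(t, 1.0, 4.2, 0.30, 0.45, 5.0, 0.35) \
                 + rng.normal(0, 0.01, t.size)

        popt, _ = curve_fit(two_gaussians, t, signal, p0=[1, 4, 0.3, 0.5, 5, 0.3])
        a1, t1, s1, a2, t2, s2 = popt
        area1 = a1 * abs(s1) * np.sqrt(2 * np.pi)   # analytic area of each peak
        area2 = a2 * abs(s2) * np.sqrt(2 * np.pi)
        print(f"analyte area = {area1:.3f}, interferent area = {area2:.3f}, "
              f"purity = {area1 / (area1 + area2):.2%}")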

  9. Collaborative Software Development in Support of Fast Adaptive AeroSpace Tools (FAAST)

    NASA Technical Reports Server (NTRS)

    Kleb, William L.; Nielsen, Eric J.; Gnoffo, Peter A.; Park, Michael A.; Wood, William A.

    2003-01-01

    A collaborative software development approach is described. The software product is an adaptation of proven computational capabilities combined with new capabilities to form the Agency's next generation aerothermodynamic and aerodynamic analysis and design tools. To efficiently produce a cohesive, robust, and extensible software suite, the approach uses agile software development techniques; specifically, project retrospectives, the Scrum status meeting format, and a subset of Extreme Programming's coding practices are employed. Examples are provided which demonstrate the substantial benefits derived from employing these practices. Also included is a discussion of issues encountered when porting legacy Fortran 77 code to Fortran 95 and a Fortran 95 coding standard.

  10. Software development for safety-critical medical applications

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    1992-01-01

    There are many computer-based medical applications in which safety and not reliability is the overriding concern. Reduced, altered, or no functionality of such systems is acceptable as long as no harm is done. A precise, formal definition of what software safety means is essential, however, before any attempt can be made to achieve it. Without this definition, it is not possible to determine whether a specific software entity is safe. A set of definitions pertaining to software safety will be presented and a case study involving an experimental medical device will be described. Some new techniques aimed at improving software safety will also be discussed.

  11. Adapting a Computerized Medical Dictation System to Prepare Academic Papers in Radiology.

    PubMed

    Sánchez, Yadiel; Prabhakar, Anand M; Uppot, Raul N

    2017-09-14

    Every day, radiologists use dictation software to compose clinical reports of imaging findings. The dictation software is tailored for medical use and to the speech pattern of each radiologist. Over the past 10 years we have used dictation software to compose academic manuscripts, correspondence letters, and the texts of educational exhibits. The advantage of using voice dictation is faster composition of manuscripts; however, use of such software requires preparation. The purpose of this article is to review the steps of adapting clinical dictation software for dictating academic manuscripts and to detail the advantages and limitations of this technique. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Model-based engineering for medical-device software.

    PubMed

    Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi

    2010-01-01

    This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. By using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for i) fast and efficient construction of executable device prototypes, ii) creation of a standard, reusable baseline software architecture for a particular device family, iii) formal verification of the design against safety requirements, and iv) creation of a safety framework that reduces verification costs for future versions of the device software.

  13. High-energy physics software parallelization using database techniques

    NASA Astrophysics Data System (ADS)

    Argante, E.; van der Stok, P. D. V.; Willers, I.

    1997-02-01

    A programming model for software parallelization, called CoCa, is introduced that copes with problems caused by typical features of high-energy physics software. By basing CoCa on the database transaction paradigm, the complexity induced by the parallelization is largely transparent to the programmer, resulting in a higher level of abstraction than native message passing software. CoCa is implemented on a Meiko CS-2 and on a SUN SPARCcenter 2000 parallel computer. On the CS-2, the performance is comparable with that of native PVM and MPI.
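
    The transaction idea is what makes the parallelism largely invisible to the physics code: each event is treated as an isolated, atomic unit of work with no shared state, so events can be scheduled onto workers freely. The sketch below illustrates that idea with ordinary Python worker processes; it is only a loose analogy and not the CoCa implementation, which targeted the Meiko CS-2 and SPARCcenter 2000 with message passing.

        from multiprocessing import Pool

        def process_event(event):
            # One event handled as an isolated "transaction": it reads its own
            # input and returns its own result, sharing no state with other events.
            track_count = sum(1 for hit in event["hits"] if hit > 0.5)
            return {"id": event["id"], "tracks": track_count}

        if __name__ == "__main__":
            events = [{"id": i,
                       "hits": [((i * 7 + j) % 10) / 10 for j in range(100)]}
                      for i in range(1000)]
            with Pool(processes=4) as pool:        # workers pick up events freely
                results = pool.map(process_event, events)
            print(results[:3])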

  14. Dental caries experience in high risk soft drinks factory workers of South India: a comparative study.

    PubMed

    Kumar, Sandeep; Acharya, Shashidhar; Vasthare, Ramprasad; Singh, Siddharth Kumar; Gupta, Anjali; Debnath, Nitai

    2014-01-01

    The consumption of soft drinks has been associated with dental caries development. The aim was to evaluate dental caries experience among workers in soft-drink industries located in South India, to compare it with that of other factory workers, and to evaluate the validity of the specific caries index (SCI), a newer index for caries diagnosis. This was a cross-sectional study carried out among 420 workers (210 in a soft-drinks factory and 210 in other factories), in the age group of 20-45 years, of Udupi district, Karnataka, India. The indices used for clinical examination were the decayed, missing, filled surfaces (DMFS) index and the SCI. The mean and standard deviation (SD) of decayed surfaces (5.8 ± 1.8), missing surfaces (4.3 ± 2) and filled surfaces (1.94 ± 1.95), and the total DMFS score (12.11 ± 3.8), in soft-drinks factory workers were found to be significantly higher than in the other factory workers. The total SCI score (mean and SD) was also found to be significantly higher in soft-drinks factory workers (5.83 ± 1.80) compared with other factory workers (4.56 ± 1.45). There was a high correlation between the SCI score and the DMFS score, with the regression equation DMFS = 1.178 + 1.866 × (SCI score). The caries experience was higher in the soft-drinks factory workers, and this study also showed that the specific caries index can be used as a valid index for assessing dental caries experience.
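
    As a worked check on the reported regression, substituting the soft-drinks group's mean SCI score of 5.83 gives a predicted DMFS of 1.178 + 1.866 × 5.83 ≈ 12.1, which closely matches that group's observed mean DMFS of 12.11.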

  15. Correlations between lead, cadmium, copper, zinc, and iron concentrations in frozen tuna fish

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galindo, L.; Hardisson, A.; Montelongo, F.G.

    1986-04-01

    The presence of metallic pollutants in marine ecosystems has prompted wide research plans to evaluate pollution levels in marine organisms. However, little is known concerning the environmental and physiological processes that regulate the concentration of trace metals in marine organisms. Even though the toxicity of lead and cadmium is well established, copper, zinc and iron are considered essential elements for mammals. Little is known about the concentrations of heavy metals other than mercury in fresh and frozen tuna fish. Fifty samples obtained at the entrance of a canning factory in Santa Cruz de Tenerife (Canary Islands) were analyzed by atomic absorption spectrophotometry. Results were treated by applying the Statistical Package for the Social Sciences compiled and linked in the software of a Digital VAX/VMS 11/780 computer.

  16. An Airbus arrives at KSC with third MPLM

    NASA Technical Reports Server (NTRS)

    2001-01-01

    An Airbus '''Beluga''' air cargo plane, The Super Transporter, lands at KSC's Shuttle Landing Facility. Its cargo, from the factory of Alenia Aerospazio in Turin, Italy, is the Italian Space Agency's Multi-Purpose Logistics Module Donatello, the third of three for the International Space Station. The module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.

  17. An Airbus arrives at KSC with third MPLM

    NASA Technical Reports Server (NTRS)

    2001-01-01

    An Airbus '''Beluga''' air cargo plane, The Super Transporter, arrives at KSC's Shuttle Landing Facility from the factory of Alenia Aerospazio in Turin, Italy. Its cargo is the Italian Space Agency's Multi-Purpose Logistics Module Donatello, the third of three for the International Space Station. The module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.

  18. The Role of Spatial Analysis in Detecting the Consequence of the Factory Sites : Case Study of Assalaya Factory-Sudan

    NASA Astrophysics Data System (ADS)

    Khair, Amar Sharaf Eldin; Purwanto; RyaSunoko, Henna; Abdullah, Omer Adam

    2018-02-01

    Spatial analysis is considered one of the most important sciences for identifying the most appropriate sites for industrialization and for alleviating the environmental ramifications caused by factories. This study aims at analyzing the Assalaya sugarcane factory site by means of spatial analysis to determine whether it has ramifications for the White Nile River. The methodology employed uses the Global Positioning System (GPS) to identify the coordinates of the study phenomena and other related factors, and a Geographical Information System (GIS) to implement the spatial analysis. Satellite data (a Landsat DEM - Digital Elevation Model) covering the study area and the factory were used to identify the consequences of the factory's location through several analyses, such as hydrological, contour line and geological analysis. The data analysis reveals that the factory site is inappropriate and, according to observations on the ground, it has consequences for the White Nile River. Based on these findings, the study offers recommendations for avoiding the aftermath of any factory in general, taking advantage of this new technological method to aid in selecting the most apt locations for industries, which will help create an ambient environment.

  19. Factorial Experiments: Efficient Tools for Evaluation of Intervention Components

    PubMed Central

    Collins, Linda M.; Dziak, John J.; Kugler, Kari C.; Trail, Jessica B.

    2014-01-01

    Background An understanding of the individual and combined effects of a set of intervention components is important for moving the science of preventive medicine interventions forward. This understanding can often be achieved in an efficient and economical way via a factorial experiment, in which two or more independent variables are manipulated. The factorial experiment is a complement to the randomized controlled trial (RCT); the two designs address different research questions. Purpose This article offers an introduction to factorial experiments aimed at investigators trained primarily in the RCT. Method The factorial experiment is compared and contrasted with other experimental designs used commonly in intervention science to highlight where each is most efficient and appropriate. Results Several points are made: factorial experiments make very efficient use of experimental subjects when the data are properly analyzed; a factorial experiment can have excellent statistical power even if it has relatively few subjects per experimental condition; and when conducting research to select components for inclusion in a multicomponent intervention, interactions should be studied rather than avoided. Conclusions Investigators in preventive medicine and related areas should begin considering factorial experiments alongside other approaches. Experimental designs should be chosen from a resource management perspective, which states that the best experimental design is the one that provides the greatest scientific benefit without exceeding available resources. PMID:25092122

  20. Dual-surface dielectric depth detector for holographic millimeter-wave security scanners

    NASA Astrophysics Data System (ADS)

    McMakin, Douglas L.; Keller, Paul E.; Sheen, David M.; Hall, Thomas E.

    2009-05-01

    The Transportation Security Administration (TSA) is presently deploying millimeter-wave whole body scanners at over 20 airports in the United States. Threats that may be concealed on a person are displayed to the security operator of this scanner. "Passenger privacy is ensured through the anonymity of the image. The officer attending the passenger cannot view the image, and the officer viewing the image is remotely located and cannot see the passenger. Additionally, the image cannot be stored, transmitted or printed and is deleted immediately after being viewed. Finally, the facial area of the image has been blurred to further ensure privacy." Pacific Northwest National Laboratory (PNNL) originated research into this novel security technology which has been independently commercialized by L-3 Communications, SafeView, Inc. PNNL continues to perform fundamental research into improved software techniques which are applicable to the field of holographic security screening technology. This includes performing significant research to remove human features from the imagery. Both physical and software imaging techniques have been employed. The physical imaging techniques include polarization diversity illumination and reception, dual frequency implementation, and high frequency imaging at 100 GHz. This paper will focus on a software privacy technique using a dual surface dielectric depth detector method.

  1. Computer sciences

    NASA Technical Reports Server (NTRS)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  2. [An Introduction to A Newly-developed "Acupuncture Needle Manipulation Training-evaluation System" Based on Optical Motion Capture Technique].

    PubMed

    Zhang, Ao; Yan, Xing-Ke; Liu, An-Guo

    2016-12-25

    In the present paper, the authors introduce a newly-developed "Acupuncture Needle Manipulation Training-evaluation System" based on an optical motion capture technique. It is composed of two parts, a sensor and software, and overcomes some shortcomings of mechanical motion capture techniques. The device is able to analyze data on the operations of the pressing hand and the needle-insertion hand during acupuncture performance, and its software is available in personal computer (PC), Android, and Apple iOS versions. It is capable of recording and analyzing information on any operator's needling manipulations, and is quite helpful for teachers in teaching, training and examining students in clinical practice.

  3. Performance evaluation of the RITG148+ set of TomoTherapy quality assurance tools using RTQA2 radiochromic film.

    PubMed

    Lobb, Eric C

    2016-07-08

    Version 6.3 of the RITG148+ software package offers eight automated analysis routines for quality assurance of the TomoTherapy platform. A performance evaluation of each routine was performed in order to compare RITG148+ results with traditionally accepted analysis techniques and to verify that simulated changes in machine parameters are correctly identified by the software. Reference films were exposed according to AAPM TG-148 methodology for each routine, and the RITG148+ results were compared with either alternative software analysis techniques or manual analysis techniques in order to assess baseline agreement. Changes in machine performance were simulated through translational and rotational adjustments to subsequently irradiated films, and these films were analyzed to verify that the applied changes were accurately detected by each of the RITG148+ routines. For the Hounsfield unit routine, an assessment of the "Frame Averaging" functionality and the effects of phantom roll on the routine results are presented. All RITG148+ routines reported acceptable baseline results consistent with alternative analysis techniques, with 9 of the 11 baseline test results showing agreement of 0.1 mm/0.1° or better. Simulated changes were correctly identified by the RITG148+ routines within approximately 0.2 mm/0.2°, with the exception of the Field Center vs. Jaw Setting routine, which was found to have limited accuracy in cases where field centers were not aligned for all jaw settings, due to inaccurate autorotation of the film during analysis. The performance of the RITG148+ software package was found to be acceptable for introduction into our clinical environment as an automated alternative to traditional analysis techniques for routine TomoTherapy quality assurance testing.

  4. Supernova Dust Factory in M74

    NASA Image and Video Library

    2006-06-09

    Astronomers using NASA's Spitzer Space Telescope have spotted a dust factory 30 million light-years away in the spiral galaxy M74. The factory is located at the scene of a massive star's explosive death, or supernova.

  5. An overview of 3D software visualization.

    PubMed

    Teyseyre, Alfredo R; Campo, Marcelo R

    2009-01-01

    Software visualization studies techniques and methods for graphically representing different aspects of software. Its main goal is to enhance, simplify and clarify the mental representation a software engineer has of a computer system. During many years, visualization in 2D space has been actively studied, but in the last decade, researchers have begun to explore new 3D representations for visualizing software. In this article, we present an overview of current research in the area, describing several major aspects like: visual representations, interaction issues, evaluation methods and development tools. We also perform a survey of some representative tools to support different tasks, i.e., software maintenance and comprehension, requirements validation and algorithm animation for educational purposes, among others. Finally, we conclude identifying future research directions.

  6. Inclusion of LCCA in Alaska flexible pavement design software manual.

    DOT National Transportation Integrated Search

    2012-10-01

    Life cycle cost analysis is a key part for selecting materials and techniques that optimize the service life of a pavement in terms of cost and performance. While the Alaska : Flexible Pavement Design software has been in use since 2004, there is no ...

  7. Heterogeneous Software System Interoperability Through Computer-Aided Resolution of Modeling Differences

    DTIC Science & Technology

    2002-06-01

    techniques for addressing the software component retrieval problem. Steigerwald [Ste91] introduced the use of algebraic specifications for defining the...provided in terms of a specification written using Luqi’s Prototype Specification Description Language (PSDL) [LBY88] augmented with an algebraic

  8. Software structure for Vega/Chara instrument

    NASA Astrophysics Data System (ADS)

    Clausse, J.-M.

    2008-07-01

    VEGA (Visible spEctroGraph and polArimeter) is one of the focal instruments of the CHARA array at Mount Wilson near Los Angeles. Its control system is based on techniques developed for the GI2T interferometer (Grand Interferometre a 2 Telescopes) and the SIRIUS fibered hyper-telescope testbed at OCA (Observatoire de la Cote d'Azur). This article describes the software and electronics architecture of the instrument. It is based on a local network architecture and also uses Virtual Private Network connections. The server part is based on Windows XP (VC++), and the control software runs on Linux (C, GTK). For the control of the science detector and the fringe-tracking systems, distributed APIs use real-time techniques. The control software gathers all the necessary information about the instrument and allows automatic management of the instrument through an original task scheduler. This architecture is intended to allow the instrument to be driven from remote sites, such as our institute in the south of France.

  9. New generation of exploration tools: interactive modeling software and microcomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krajewski, S.A.

    1986-08-01

    Software packages offering interactive modeling techniques are now available for use on microcomputer hardware systems. These packages are reasonably priced for both company and independent explorationists; they do not require users to have high levels of computer literacy; they are capable of rapidly completing complex ranges of sophisticated geologic and geophysical modeling tasks; and they can produce presentation-quality output for comparison with real-world data. For example, interactive packages are available for mapping, log analysis, seismic modeling, reservoir studies, and financial projects, as well as for applying a variety of statistical and geostatistical techniques to the analysis of exploration data. More importantly, these packages enable explorationists to directly apply their geologic expertise when developing and fine-tuning models for identifying new prospects and for extending producing fields. As a result of these features, microcomputers and interactive modeling software are becoming common tools in many exploration offices. Gravity and magnetics software programs illustrate some of the capabilities of such exploration tools.

  10. Software and languages for microprocessors

    NASA Astrophysics Data System (ADS)

    Williams, David O.

    1986-08-01

    This paper forms the basis for lectures given at the 6th Summer School on Computing Techniques in Physics, organised by the Computational Physics group of the European Physics Society, and held at the Hotel Ski, Nové Město na Moravě, Czechoslovakia, on 17-26 September 1985. Various types of microprocessor applications are discussed and the main emphasis of the paper is devoted to 'embedded' systems, where the software development is not carried out on the target microprocessor. Some information is provided on the general characteristics of microprocessor hardware. Various types of microprocessor operating system are compared and contrasted. The selection of appropriate languages and software environments for use with microprocessors is discussed. Mechanisms for interworking between different languages, including reasonable error handling, are treated. The CERN developed cross-software suite for the Motorola 68000 family is described. Some remarks are made concerning program tools applicable to microprocessors. PILS, a Portable Interactive Language System, which can be interpreted or compiled for a range of microprocessors, is described in some detail, and the implementation techniques are discussed.

  11. Baby Factories in Nigeria: Starting the Discussion Toward a National Prevention Policy.

    PubMed

    Makinde, Olusesan Ayodeji; Olaleye, Olalekan; Makinde, Olufunmbi Olukemi; Huntley, Svetlana S; Brown, Brandon

    2017-01-01

    Baby factories and baby harvesting are relatively new terms that involve breeding, trafficking, and abuse of infants and their biological mothers. Since it was first described in a United Nations Educational, Scientific and Cultural Organization report in Nigeria in 2006, several more baby factories have been discovered over the years. Infertile women are noted to be major patrons of these baby factories due to the stigmatization of childless couples in Southern Nigeria and issues around cultural acceptability of surrogacy and adoption. These practices have contributed to the growth in the industry which results in physical, psychological, and sexual violence to the victims. Tackling baby factories will involve a multifaceted approach that includes advocacy and enacting of legislation barring baby factories and infant trafficking and harsh consequences for their patrons. Also, programs to educate young girls on preventing unwanted pregnancies are needed. Methods of improving awareness and acceptability of adoption and surrogacy and reducing the administrative and legal bottlenecks associated with these options for infertile couples should be explored to diminish the importance of baby factories. © The Author(s) 2015.

  12. A practical limit to trials needed in one-person randomized controlled experiments.

    PubMed

    Alemi, Roshan; Alemi, Farrokh

    2007-01-01

    Recently in this journal, J. Olsson and colleagues suggested the use of factorial experimental designs to guide a patient's efforts to choose among multiple interventions. These authors argue that a factorial design, where every possible combination of the interventions is tried, is superior to sequential trial and error. Factorial design is efficient in identifying the effectiveness of interventions (factor effects). Most patients, however, care only about feeling better and not about why their condition is improving. If the goal of the patient is to get better and not to estimate the factor effects, then no control groups are needed. In this article, we show a modification of the factorial design of experiments proposed by Olsson and colleagues in which a full-factorial design is planned but experimentation is stopped when the patient's condition improves. With this modification, the number of trials is far smaller than the number needed by a full-factorial design. For example, a patient trying out 4 different interventions with a median probability of success of .50 is expected to need 2 trials before stopping the experimentation, in comparison with 32 in a full-factorial design.
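
    The 2-trial figure is consistent with a simple geometric-expectation argument: if each tried combination relieves the condition independently with probability p, the expected number of combinations tried before stopping is 1/p, and p = .50 gives 1/0.5 = 2 trials on average.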

  13. Can IR scene projectors reduce total system cost?

    NASA Astrophysics Data System (ADS)

    Ginn, Robert; Solomon, Steven

    2006-05-01

    There is an incredible amount of system engineering involved in turning the typical infrared system needs of probability of detection, probability of identification, and probability of false alarm into focal plane array (FPA) requirements of noise equivalent irradiance (NEI), modulation transfer function (MTF), fixed pattern noise (FPN), and defective pixels. Unfortunately, there are no analytic solutions to this problem so many approximations and plenty of "seat of the pants" engineering is employed. This leads to conservative specifications, which needlessly drive up system costs by increasing system engineering costs, reducing FPA yields, increasing test costs, increasing rework and the never ending renegotiation of requirements in an effort to rein in costs. These issues do not include the added complexity to the FPA factory manager of trying to meet varied, and changing, requirements for similar products because different customers have made different approximations and flown down different specifications. Scene generation technology may well be mature and cost effective enough to generate considerable overall savings for FPA based systems. We will compare the costs and capabilities of various existing scene generation systems and estimate the potential savings if implemented at several locations in the IR system fabrication cycle. The costs of implementing this new testing methodology will be compared to the probable savings in systems engineering, test, rework, yield improvement and others. The diverse requirements and techniques required for testing missile warning systems, missile seekers, and FLIRs will be defined. Last, we will discuss both the hardware and software requirements necessary to meet the new test paradigm and discuss additional cost improvements related to the incorporation of these technologies.

  14. GenoGAM: genome-wide generalized additive models for ChIP-Seq analysis.

    PubMed

    Stricker, Georg; Engelhardt, Alexander; Schulz, Daniel; Schmid, Matthias; Tresch, Achim; Gagneur, Julien

    2017-08-01

    Chromatin immunoprecipitation followed by deep sequencing (ChIP-Seq) is a widely used approach to study protein-DNA interactions. Often, the quantities of interest are the differential occupancies relative to controls, between genetic backgrounds, treatments, or combinations thereof. Current methods for differential occupancy of ChIP-Seq data, however, rely on binning or sliding-window techniques, for which the choices of window and bin sizes are subjective. Here, we present GenoGAM (Genome-wide Generalized Additive Model), which brings the well-established and flexible generalized additive models framework to genomic applications using a data parallelism strategy. We model ChIP-Seq read count frequencies as products of smooth functions along chromosomes. Smoothing parameters are objectively estimated from the data by cross-validation, eliminating ad hoc binning and windowing needed by current approaches. GenoGAM provides base-level and region-level significance testing for full factorial designs. Application to a ChIP-Seq dataset in yeast showed increased sensitivity over existing differential occupancy methods while controlling for type I error rate. By analyzing a set of DNA methylation data and illustrating an extension to a peak caller, we further demonstrate the potential of GenoGAM as a generic statistical modeling tool for genome-wide assays. Software is available from Bioconductor: https://www.bioconductor.org/packages/release/bioc/html/GenoGAM.html . gagneur@in.tum.de. Supplementary information is available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
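
    The central modelling idea, treating read-count frequencies as smooth functions of genomic position, can be illustrated with an unpenalised Poisson regression on a B-spline basis. The sketch below is a deliberate simplification (no factorial design, no cross-validated smoothness, no data parallelism) and does not use the GenoGAM package; the synthetic counts and basis size are assumptions for illustration.

        import numpy as np
        import statsmodels.api as sm
        from patsy import dmatrix

        # Synthetic ChIP-Seq-like counts: a smooth occupancy peak along 2000 bp.
        rng = np.random.default_rng(11)
        pos = np.arange(2000)
        true_rate = 2 + 6 * np.exp(-0.5 * ((pos - 900) / 120) ** 2)
        counts = rng.poisson(true_rate)

        # B-spline basis over position; a true GAM would also penalise wiggliness.
        basis = np.asarray(dmatrix("bs(pos, df=20) - 1", {"pos": pos}))
        fit = sm.GLM(counts, sm.add_constant(basis),
                     family=sm.families.Poisson()).fit()
        smooth = fit.predict(sm.add_constant(basis))
        print(f"fitted rate at peak centre = {smooth[900]:.2f} "
              f"(true = {true_rate[900]:.2f})")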

  15. Dereplication of Natural Products Using GC-TOF Mass Spectrometry: Improved Metabolite Identification by Spectral Deconvolution Ratio Analysis.

    PubMed

    Carnevale Neto, Fausto; Pilon, Alan C; Selegato, Denise M; Freire, Rafael T; Gu, Haiwei; Raftery, Daniel; Lopes, Norberto P; Castro-Gamboa, Ian

    2016-01-01

    Dereplication based on hyphenated techniques has been extensively applied in plant metabolomics, thereby avoiding re-isolation of known natural products. However, due to the complex nature of biological samples and their large concentration range, dereplication requires the use of chemometric tools to comprehensively extract information from the acquired data. In this work we developed a reliable GC-MS-based method for the identification of non-targeted plant metabolites by combining the Ratio Analysis of Mass Spectrometry deconvolution tool (RAMSY) with Automated Mass Spectral Deconvolution and Identification System software (AMDIS). Plant species from Solanaceae, Chrysobalanaceae and Euphorbiaceae were selected as model systems due to their molecular diversity, ethnopharmacological potential, and economic value. The samples were analyzed by GC-MS after methoximation and silylation reactions. Dereplication was initiated with the use of a factorial design of experiments to determine the best AMDIS configuration for each sample, considering linear retention indices and mass spectral data. A heuristic factor (CDF, compound detection factor) was developed and applied to the AMDIS results in order to decrease the false-positive rates. Despite the enhancement in deconvolution and peak identification, the empirical AMDIS method was not able to fully deconvolute all GC-peaks, leading to low MF values and/or missing metabolites. RAMSY was applied as a complementary deconvolution method to AMDIS for peaks exhibiting substantial overlap, resulting in recovery of low-intensity co-eluted ions. The results from this combination of optimized AMDIS with RAMSY attested to the ability of this approach as an improved dereplication method for complex biological samples such as plant extracts.

  16. Dereplication of Natural Products Using GC-TOF Mass Spectrometry: Improved Metabolite Identification by Spectral Deconvolution Ratio Analysis

    PubMed Central

    Carnevale Neto, Fausto; Pilon, Alan C.; Selegato, Denise M.; Freire, Rafael T.; Gu, Haiwei; Raftery, Daniel; Lopes, Norberto P.; Castro-Gamboa, Ian

    2016-01-01

    Dereplication based on hyphenated techniques has been extensively applied in plant metabolomics, thereby avoiding re-isolation of known natural products. However, due to the complex nature of biological samples and their large concentration range, dereplication requires the use of chemometric tools to comprehensively extract information from the acquired data. In this work we developed a reliable GC-MS-based method for the identification of non-targeted plant metabolites by combining the Ratio Analysis of Mass Spectrometry deconvolution tool (RAMSY) with Automated Mass Spectral Deconvolution and Identification System software (AMDIS). Plant species from Solanaceae, Chrysobalanaceae and Euphorbiaceae were selected as model systems due to their molecular diversity, ethnopharmacological potential, and economic value. The samples were analyzed by GC-MS after methoximation and silylation reactions. Dereplication was initiated with the use of a factorial design of experiments to determine the best AMDIS configuration for each sample, considering linear retention indices and mass spectral data. A heuristic factor (CDF, compound detection factor) was developed and applied to the AMDIS results in order to decrease the false-positive rates. Despite the enhancement in deconvolution and peak identification, the empirical AMDIS method was not able to fully deconvolute all GC-peaks, leading to low MF values and/or missing metabolites. RAMSY was applied as a complementary deconvolution method to AMDIS for peaks exhibiting substantial overlap, resulting in recovery of low-intensity co-eluted ions. The results from this combination of optimized AMDIS with RAMSY attested to the ability of this approach as an improved dereplication method for complex biological samples such as plant extracts. PMID:27747213

  17. Modeling human behaviors and reactions under dangerous environment.

    PubMed

    Kang, J; Wright, D K; Qin, S F; Zhao, Y

    2005-01-01

    This paper describes the framework of a real-time simulation system to model human behavior and reactions in dangerous environments. The system utilizes the latest 3D computer animation techniques, combined with artificial intelligence, robotics and psychology, to model human behavior, reactions and decision making under expected/unexpected dangers, in real time, in virtual environments. The development of the system includes: classification of the conscious/subconscious behaviors and reactions of different people; capturing different motion postures with the Eagle Digital System; establishing 3D character animation models; establishing 3D models for the scene; planning the scenario and the contents; and programming within Virtools Dev. The programming within Virtools Dev is subdivided into modeling dangerous events, modeling the character's perceptions, modeling the character's decision making, modeling the character's movements, modeling the character's interaction with the environment, and setting up the virtual cameras. The real-time simulation of human reactions in hazardous environments is invaluable in military defense, fire escape, rescue operation planning, traffic safety studies, safety planning in chemical factories, and the design of buildings, airplanes, ships and trains. Currently, human motion modeling can be realized through established technology, whereas integrating perception and intelligence into a virtual human's motion is still a huge undertaking. The challenges here are the synchronization of motion and intelligence; the accurate modeling of human vision, smell, touch and hearing; and the diversity and effects of emotion and personality in decision making. There are three types of software platform that could be employed to realize the motion and intelligence within one system, and their advantages and disadvantages are discussed.

  18. FAST: A multi-processed environment for visualization of computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon V.; Merritt, Fergus J.; Plessel, Todd C.; Kelaita, Paul G.; Mccabe, R. Kevin

    1991-01-01

    Three-dimensional, unsteady, multi-zoned fluid dynamics simulations over full-scale aircraft are typical of the problems being investigated at NASA Ames' Numerical Aerodynamic Simulation (NAS) facility on CRAY-2 and CRAY Y-MP supercomputers. With multiple-processor workstations available in the 10-30 Mflop range, we feel that these new developments in scientific computing warrant a new approach to the design and implementation of analysis tools. These larger, more complex problems create a need for new visualization techniques not possible with the existing software or systems available as of this writing. The visualization techniques will change as the supercomputing environment, and hence the scientific methods employed, evolves even further. The Flow Analysis Software Toolkit (FAST), an implementation of a software system for fluid mechanics analysis, is discussed.

  19. Fault Tree Analysis Application for Safety and Reliability

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

    Many commercial software tools exist for fault tree analysis (FTA), an accepted method for mitigating risk in systems. The method embedded in these tools identifies root causes in system components, but when software is identified as a root cause, the tools do not build trees into the software component. No commercial software tools have been built specifically for the development and analysis of software fault trees. Research indicates that the methods of FTA could be applied to software, but the method is not practical without automated tool support. With appropriate automated tool support, software fault tree analysis (SFTA) may be a practical technique for identifying the underlying causes of software faults that may lead to critical system failures. We strive to demonstrate that existing commercial tools for FTA can be adapted for use with SFTA and that, when applied to a safety-critical system, SFTA can be used to identify serious potential problems long before integration and system testing.
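
    To make the fault tree method concrete, the sketch below shows a minimal quantitative evaluation: basic-event probabilities are propagated through AND/OR gates up to the top event, assuming independent events. The tree structure, event names and probabilities are invented for illustration and are not taken from the cited work.

```python
# Minimal sketch of quantitative fault tree evaluation with AND/OR gates,
# assuming independent basic events. Example tree and numbers are made up.

def evaluate(node, probs):
    """Recursively compute the failure probability of a gate or basic event."""
    if isinstance(node, str):                      # basic event: look up its probability
        return probs[node]
    gate, children = node
    p = [evaluate(c, probs) for c in children]
    if gate == "AND":                              # all inputs must fail
        result = 1.0
        for x in p:
            result *= x
        return result
    if gate == "OR":                               # any single input failing suffices
        result = 1.0
        for x in p:
            result *= (1.0 - x)
        return 1.0 - result
    raise ValueError(f"unknown gate {gate}")

# Top event: loss of control = (sensor fault AND redundancy unavailable) OR software fault
tree = ("OR", [("AND", ["sensor_fault", "redundancy_unavailable"]), "software_fault"])
probs = {"sensor_fault": 1e-3, "redundancy_unavailable": 1e-2, "software_fault": 5e-4}
print(f"P(top event) = {evaluate(tree, probs):.2e}")
```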

  20. Software-implemented fault insertion: An FTMP example

    NASA Technical Reports Server (NTRS)

    Czeck, Edward W.; Siewiorek, Daniel P.; Segall, Zary Z.

    1987-01-01

    This report presents a model for fault insertion through software; describes its implementation on a fault-tolerant computer, FTMP; presents a summary of fault detection, identification, and reconfiguration data collected with software-implemented fault insertion; and compares the results to hardware fault insertion data. Experimental results show detection time to be a function of the time of insertion and the system workload. For fault detection time there is no correlation between software-inserted faults and hardware-inserted faults; this is because hardware-inserted faults must manifest as errors before detection, whereas software-inserted faults immediately exercise the error-detection mechanisms. In summary, software-implemented fault insertion can be used as an evaluation technique for the fault-handling capabilities of a system in fault detection, identification and recovery. Although software-inserted faults do not map directly to hardware-inserted faults, the experiments show that software-implemented fault insertion is capable of emulating hardware fault insertion, with greater ease and automation.
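
    As a rough illustration of the fault-insertion experiments described above, the sketch below injects a fault into toy application state at a chosen point in a workload and records the latency until a periodic checker flags it, showing why detection time depends on when the fault is inserted relative to the detection mechanism. The workload, checker and timing model are assumptions for illustration only; FTMP's actual detection mechanisms are not modeled.

```python
# Toy sketch of software-implemented fault insertion with latency measurement.
# Workload, checker, and timing model are illustrative assumptions.
import random

def run_experiment(insertion_step, workload_steps=1000, check_period=10):
    data = [0] * 64                   # application state whose invariant is "all zero"
    detected_at = None
    for step in range(workload_steps):
        if step == insertion_step:
            data[random.randrange(len(data))] ^= 0x1   # inject the fault: flip one bit
        if step % check_period == 0:                    # periodic error-detection check
            if any(word != 0 for word in data):
                detected_at = step
                break
    return None if detected_at is None else detected_at - insertion_step

# Detection latency varies with the insertion time relative to the checker's period.
latencies = [run_experiment(insertion_step=s) for s in (3, 7, 95)]
print("detection latencies (steps):", latencies)
```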
