Science.gov

Sample records for acm sigsoft software

  1. Air Traffic Complexity Measurement Environment (ACME): Software User's Guide

    NASA Technical Reports Server (NTRS)

    1996-01-01

    A user's guide for the Air Traffic Complexity Measurement Environment (ACME) software is presented. The ACME consists of two major components, a complexity analysis tool and user interface. The Complexity Analysis Tool (CAT) analyzes complexity off-line, producing data files which may be examined interactively via the Complexity Data Analysis Tool (CDAT). The Complexity Analysis Tool is composed of three independently executing processes that communicate via PVM (Parallel Virtual Machine) and Unix sockets. The Runtime Data Management and Control process (RUNDMC) extracts flight plan and track information from a SAR input file, and sends the information to GARP (Generate Aircraft Routes Process) and CAT (Complexity Analysis Task). GARP in turn generates aircraft trajectories, which are utilized by CAT to calculate sector complexity. CAT writes flight plan, track and complexity data to an output file, which can be examined interactively. The Complexity Data Analysis Tool (CDAT) provides an interactive graphic environment for examining the complexity data produced by the Complexity Analysis Tool (CAT). CDAT can also play back track data extracted from System Analysis Recording (SAR) tapes. The CDAT user interface consists of a primary window, a controls window, and miscellaneous pop-ups. Aircraft track and position data is displayed in the main viewing area of the primary window. The controls window contains miscellaneous control and display items. Complexity data is displayed in pop-up windows. CDAT plays back sector complexity and aircraft track and position data as a function of time. Controls are provided to start and stop playback, adjust the playback rate, and reposition the display to a specified time.
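
    The pipeline described above (RUNDMC feeding GARP, which feeds CAT) is a multi-process design communicating over PVM and Unix sockets. As a purely illustrative sketch of that data flow, the fragment below uses Python multiprocessing queues in place of PVM; the process names follow the abstract, but every function body, message format, and the complexity metric are assumptions, not the actual ACME code.

        # Illustrative only: three cooperating processes, with queues standing in for PVM/Unix sockets.
        from multiprocessing import Process, Queue

        def rundmc(sar_records, to_garp):
            # Extract flight-plan and track information from SAR input (stubbed).
            for rec in sar_records:
                to_garp.put({"flight_plan": rec, "track": rec})
            to_garp.put(None)                                  # end-of-data marker

        def garp(to_garp, to_cat):
            # Generate an aircraft trajectory for each flight plan (stubbed).
            while (msg := to_garp.get()) is not None:
                to_cat.put({"trajectory": msg["flight_plan"], "track": msg["track"]})
            to_cat.put(None)

        def cat(to_cat, out_path):
            # Compute a toy "sector complexity" value and write flight data to the output file.
            with open(out_path, "w") as out:
                while (msg := to_cat.get()) is not None:
                    complexity = len(str(msg["trajectory"]))   # placeholder metric
                    out.write(f"{msg['track']}\t{complexity}\n")

        if __name__ == "__main__":
            q1, q2 = Queue(), Queue()
            procs = [Process(target=rundmc, args=(["AAL123", "UAL456"], q1)),
                     Process(target=garp, args=(q1, q2)),
                     Process(target=cat, args=(q2, "complexity.out"))]
            for p in procs:
                p.start()
            for p in procs:
                p.join()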

  2. ACM TOMS replicated computational results initiative

    SciTech Connect

    Heroux, Michael Allen

    2015-06-03

    The scientific community relies on the peer review process to assure the quality of published material, the goal being to build a body of work we can trust. Computational journals such as the ACM Transactions on Mathematical Software (TOMS) use this process to rigorously promote clarity and completeness of content and citation of prior work. At the same time, it is unusual to independently confirm computational results.

  3. ACM TOMS replicated computational results initiative

    DOE PAGES

    Heroux, Michael Allen

    2015-06-03

    The scientific community relies on the peer review process to assure the quality of published material, the goal being to build a body of work we can trust. Computational journals such as the ACM Transactions on Mathematical Software (TOMS) use this process to rigorously promote clarity and completeness of content and citation of prior work. At the same time, it is unusual to independently confirm computational results.

  4. ACME-III and ACME-IV Final Campaign Reports

    SciTech Connect

    Biraud, S. C.

    2016-01-01

    The goals of the Atmospheric Radiation Measurement (ARM) Climate Research Facility’s third and fourth Airborne Carbon Measurements (ACME) field campaigns, ACME-III and ACME-IV, are: 1) to measure and model the exchange of CO2, water vapor, and other greenhouse gases by the natural, agricultural, and industrial ecosystems of the Southern Great Plains (SGP) region; 2) to develop quantitative approaches to relate these local fluxes to the concentration of greenhouse gases measured at the Central Facility tower and in the atmospheric column above the ARM SGP Central Facility; 3) to develop and test bottom-up measurement and modeling approaches to estimate regional scale carbon balances; and 4) to develop and test inverse modeling approaches to estimate regional scale carbon balance and anthropogenic sources over continental regions. Regular soundings of the atmosphere from near the surface into the mid-troposphere are essential for this research.

  5. A Distributed, Cross-Agency Software Architecture for Sharing Climate Models and Observational Data Sets (Invited)

    NASA Astrophysics Data System (ADS)

    Crichton, D. J.; Mattmann, C. A.; Braverman, A. J.; Cinquini, L.

    2010-12-01

    in linking together multiple agency data systems along with plans for extending the architecture and its implementation. [1] D. Crichton, D. Williams, A. Braverman, Y. Chao, C. Mattmann. Facilitating Climate Research by Integrating NASA and the Earth System Grid, sg-pcmdi.llnl.gov/review-folder/collaborations_partnerships/ESG-NASA.pdf , 2008. [2] C. Mattmann, A. Braverman, D. Crichton. Understanding Architectural Tradeoffs Necessary to Increase Climate Model Intercomparison Efficiency. ACM SIGSOFT Software Engineering Notes, vol. 35, no. 3, pp. 1-6, May 2010.

  6. Quark ACM with topologically generated gluon mass

    NASA Astrophysics Data System (ADS)

    Choudhury, Ishita Dutta; Lahiri, Amitabha

    2016-03-01

    We investigate the effect of a small, gauge-invariant mass of the gluon on the anomalous chromomagnetic moment (ACM) of quarks by perturbative calculations at the one-loop level. The mass of the gluon is taken to have been generated via a topological mass generation mechanism, in which the gluon acquires a mass through its interaction with an antisymmetric tensor field Bμν. For a small gluon mass (< 10 MeV), we calculate the ACM at momentum transfer q² = -M_Z². We compare these with the ACM calculated for a gluon mass arising from a Proca mass term. We find that the ACMs of the up, down, strange, and charm quarks vary significantly with the gluon mass, while the ACMs of the top and bottom quarks show negligible gluon mass dependence. The mechanism of gluon mass generation is most important for the strange quark's ACM, but not so much for the other quarks. We also show the results at q² = -m_t². We find that the dependence on the gluon mass at q² = -m_t² is much weaker than at q² = -M_Z² for all quarks.

  7. Additive Construction with Mobile Emplacement (ACME)

    NASA Technical Reports Server (NTRS)

    Vickers, John

    2015-01-01

    The Additive Construction with Mobile Emplacement (ACME) project is developing technology to build structures on planetary surfaces using in-situ resources. The project focuses on the construction of both 2D (landing pads, roads, and structure foundations) and 3D (habitats, garages, radiation shelters, and other structures) infrastructure needs for planetary surface missions. The ACME project seeks to raise the Technology Readiness Level (TRL) of two components needed for planetary surface habitation and exploration: 3D additive construction (e.g., contour crafting), and excavation and handling technologies (to effectively and continuously produce in-situ feedstock). Additionally, the ACME project supports the research and development of new materials for planetary surface construction, with the goal of reducing the amount of material to be launched from Earth.

  8. How to recycle asbestos containing materials (ACM)

    SciTech Connect

    Jantzen, C.M.

    2000-04-11

    The current disposal of asbestos-containing materials (ACM) in the private sector consists of sealing asbestos wetted with water in plastic for safe transportation and burial in regulated landfills. This disposal methodology requires large disposal volumes, especially for asbestos-covered pipe and asbestos/fiberglass adhering to metal framework, e.g., filters. This wrap-and-bury technology precludes recycling of the asbestos, the pipe, and/or the metal frameworks. Safe disposal of ACM at U.S. Department of Energy (DOE) sites likewise requires large disposal volumes in landfills for non-radioactive ACM and large disposal volumes in radioactive burial grounds for radioactive and suspect contaminated ACM. The availability of regulated disposal sites is rapidly diminishing, making recycling a more attractive option. Asbestos adhering to metal (e.g., pipes) can be recycled by safely removing the asbestos from the metal in a patented hot caustic bath, which prevents airborne contamination/inhalation of asbestos fibers. The dissolution residue (caustic and asbestos) can be wet-slurry fed to a melter and vitrified into a glass or glass-ceramic. Palex glasses, which are commercially manufactured, are shown to be preferred over conventional borosilicate glasses. Palex glasses are alkali magnesium silicate glasses derived by substituting MgO for B2O3 in borosilicate-type glasses. Palex glasses are very tolerant of the high MgO and high CaO content of the fillers used in forming asbestos coverings for pipes and found in boiler lagging, e.g., hydromagnesite (3MgCO3·Mg(OH)2·3H2O) and plaster of Paris, gypsum (CaSO4). The high temperature of the vitrification process destroys the asbestos fibers and renders the asbestos non-hazardous, e.g., a glass or glass-ceramic. In this manner the glass or glass-ceramic produced can be recycled, e.g., as glassphalt or glasscrete, as can the clean metal pipe or metal framework.

  9. In-situ Data Analysis Framework for ACME Land Simulations

    NASA Astrophysics Data System (ADS)

    Wang, D.; Yao, C.; Jia, Y.; Steed, C.; Atchley, S.

    2015-12-01

    The realistic representation of key biogeophysical and biogeochemical functions is fundamental to process-based ecosystem models. Investigating the behavior of those ecosystem functions within a real-time model simulation can be very challenging due to the complexity of both the model and the software structure of an environmental model, such as the Accelerated Climate Model for Energy (ACME) Land Model (ALM). In this research, the authors describe the urgent needs and challenges for in-situ data analysis for ALM simulations, and lay out methods and strategies to meet these challenges. Specifically, an in-situ data analysis framework is designed to allow users to interactively observe the biogeophysical and biogeochemical processes during an ALM simulation. The key components of this framework are automatically instrumented ecosystem simulation, in-situ data communication, and a large-scale data exploratory toolkit. This effort is developed by leveraging several active projects, including a scientific unit testing platform, a common communication interface, and an extreme-scale data exploratory toolkit. The authors believe that, based on advanced computing technologies such as compiler-based software system analysis, automatic code instrumentation, and in-memory data transport, this software system provides not only much-needed capability for real-time observation and in-situ data analytics for environmental model simulation, but also the potential for in-situ model behavior adjustment via simulation steering.
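
    As a minimal sketch of the in-situ idea described above (an instrumented simulation step publishing selected state to an in-memory channel that an analysis consumer reads while the model runs), with entirely hypothetical variable names and without the real ALM instrumentation or transport layers:

        # Hypothetical sketch: publish selected model state each time step for in-situ analysis.
        import queue
        import threading

        analysis_queue = queue.Queue()

        def instrumented_step(step, state):
            # ... advance the ecosystem model by one time step (omitted) ...
            analysis_queue.put({"step": step, "gpp": state["gpp"], "soil_c": state["soil_c"]})

        def analysis_consumer():
            # Runs alongside the simulation; could feed a visualization or steering tool.
            while True:
                sample = analysis_queue.get()
                if sample is None:
                    break
                print(f"step {sample['step']}: GPP={sample['gpp']:.2f}, soil C={sample['soil_c']:.1f}")

        consumer = threading.Thread(target=analysis_consumer)
        consumer.start()
        for step in range(3):
            instrumented_step(step, {"gpp": 1.0 + step, "soil_c": 100.0 - step})
        analysis_queue.put(None)   # signal the consumer to stop
        consumer.join()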

  10. AcmD, a Homolog of the Major Autolysin AcmA of Lactococcus lactis, Binds to the Cell Wall and Contributes to Cell Separation and Autolysis

    PubMed Central

    Visweswaran, Ganesh Ram R.; Steen, Anton; Leenhouts, Kees; Szeliga, Monika; Ruban, Beata; Hesseling-Meinders, Anne; Dijkstra, Bauke W.; Kuipers, Oscar P.; Kok, Jan; Buist, Girbe

    2013-01-01

    Lactococcus lactis expresses the homologous glucosaminidases AcmB, AcmC, AcmA and AcmD. The latter two have three C-terminal LysM repeats for peptidoglycan binding. AcmD has much shorter intervening sequences separating the LysM repeats and a lower isoelectric point (4.3) than AcmA (10.3). Under standard laboratory conditions AcmD was mainly secreted into the culture supernatant. An L. lactis acmA acmD double mutant formed longer chains than the acmA single mutant, indicating that AcmD contributes to cell separation. This phenotype could be complemented by plasmid-encoded expression of AcmD in the double mutant. No clear difference in cellular lysis and protein secretion was observed between the two mutants. Nevertheless, overexpression of AcmD resulted in increased autolysis when AcmA was present (as in the wild type strain) or when AcmA was added to the culture medium of an AcmA-minus strain. Possibly, AcmD is mainly active within the cell wall, at places where proper conditions are present for its binding and catalytic activity. Various fusion proteins carrying either the three LysM repeats of AcmA or AcmD were used to study and compare their cell wall binding characteristics. Whereas binding of the LysM domain of AcmA took place at pHs ranging from 4 to 8, the LysM domain of AcmD seems to bind most strongly at pH 4. PMID:23951292

  11. Model Diagnostics for the Department of Energy's Accelerated Climate Modeling for Energy (ACME) Project

    NASA Astrophysics Data System (ADS)

    Smith, B.

    2015-12-01

    In 2014, eight Department of Energy (DOE) national laboratories, four academic institutions, one company, and the National Center for Atmospheric Research combined forces in a project called Accelerated Climate Modeling for Energy (ACME) with the goal of speeding Earth system model development for climate and energy. Over the planned 10-year span, the project will conduct simulations and modeling on DOE's most powerful high-performance computing systems at the Oak Ridge, Argonne, and Lawrence Berkeley Leadership Computing Facilities. A key component of the ACME project is the development of an interactive test bed for the advanced Earth system model. Its execution infrastructure will accelerate model development and testing cycles. The ACME Workflow Group is leading the efforts to automate labor-intensive tasks, provide intelligent support for complex tasks, and reduce duplication of effort through collaboration support. As part of this new workflow environment, we have created a diagnostic, metric, and intercomparison Python framework, called UVCMetrics, to aid in the testing-to-production execution of the ACME model. The framework exploits similarities among different diagnostics to compactly support diagnosis of new models. It presently focuses on atmosphere and land but is designed to support ocean and sea ice model components as well. This framework is built on top of the existing open-source software framework known as the Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT). Because of its flexible framework design, scientists and modelers can now generate thousands of possible diagnostic outputs. These diagnostics can compare model runs, compare models against observations, or simply verify that a model is physically realistic. Additional diagnostics are easily integrated into the framework, and our users have already added several. Diagnostics can be generated, viewed, and manipulated from the UV-CDAT graphical user interface, Python command-line scripts, and programs.

  12. Prevalence and genetic diversity of arginine catabolic mobile element (ACME) in clinical isolates of coagulase-negative staphylococci: identification of ACME type I variants in Staphylococcus epidermidis.

    PubMed

    Onishi, Mayumi; Urushibara, Noriko; Kawaguchiya, Mitsuyo; Ghosh, Souvik; Shinagawa, Masaaki; Watanabe, Naoki; Kobayashi, Nobumichi

    2013-12-01

    Arginine catabolic mobile element (ACME), a genomic island consisting of the arc and/or opp3 gene clusters found in staphylococcal species, is related to increased bacterial adaptability to hosts. Staphylococcus epidermidis is considered a major ACME reservoir; however, the prevalence and genetic diversity of ACME in coagulase-negative staphylococci (CNS) have not yet been well characterized for clinical isolates in Japan. A total of 271 clinical isolates of CNS in a Japanese hospital were investigated for the presence and genotype of ACME and SCCmec. The prevalence of ACME-arcA was significantly higher (p<0.001) in S. epidermidis (45.8%) than in other CNS species (3.7%). ACME elements in S. epidermidis isolates (n=87) were differentiated into type I (n=33), variant forms of type I (ΔI, n=26) newly identified in this study, type II (n=6), and type ΔII (n=19). ACME-type ΔI elements, which were further classified into three subtypes, lacked some genetic components present between the arc and opp3 clusters in archetypal type I, whereas the arc and opp3 clusters themselves were intact. The arc cluster exhibited high sequence identity (95.8-100%) to that of type I ACME; in contrast, the opp3 cluster was highly diverse, and showed relatively lower identities (94.8-98.7%) to the corresponding regions in type I ACME. Twenty-one isolates of ΔI ACME-carrying S. epidermidis possessed SCCmec IVa and belonged to ST5 (clonal complex 2). Phylogenetic analysis revealed that isolates harboring ACME ΔI in this study clustered with previously reported S. epidermidis strains of other lineages, suggesting that S. epidermidis originally had some genetic variations in the opp3 cluster. In summary, ACME type ΔI, a truncated variant of ACME-I, was identified for the first time in S. epidermidis and was found to be prevalent in ST5 MRSE clinical isolates with SCCmec IVa.

  13. Sealing Force Increasing of ACM Gasket through Electron Beam Radiation

    NASA Astrophysics Data System (ADS)

    dos Santos, D. J.; Batalha, G. F.

    2011-01-01

    Rubber is an engineering material widely used for sealing parts, in the form of O-rings, solid gaskets, and liquid gaskets (materials applied in the liquid state with subsequent vulcanization and sealing). Stress relaxation is a characteristic of rubber that negatively impacts such industrial applications (rings and solid gaskets). The purpose of this work is to investigate the use of electron beam (EB) radiation as a technology able to decrease stress relaxation in acrylic rubber (ACM), consequently increasing the sealing capability of this material. ACM samples were irradiated with doses of 100 kGy and 250 kGy, and their behavior was comparatively investigated using dynamic mechanical analysis (DMA) and compression stress relaxation (CSR) experiments. The results obtained by DMA showed an increase in Tg and changes in dynamic mechanical behavior.

  14. Autonomous collaborative mission systems (ACMS) for multi-UAV missions

    NASA Astrophysics Data System (ADS)

    Chen, Y.-L.; Peot, M.; Lee, J.; Sundareswaran, V.; Altshuler, T.

    2005-05-01

    UAVs are a key element of the Army's vision for Force Transformation, and are expected to be employed in large numbers per FCS Unit of Action (UoA). This necessitates a multi-UAV level of autonomous collaboration behavior capability that meets RSTA and other mission needs of FCS UoAs. Autonomous Collaborative Mission Systems (ACMS) is a scalable architecture and behavior planning / collaborative approach to achieve this level of capability. The architecture is modular and the modules may be run in different locations/platforms to accommodate the constraints of available hardware, processing resources and mission needs. The Mission Management Module determines the role of member autonomous entities by employing collaboration mechanisms (e.g., market-based, etc.), the individual Entity Management Modules work with the Mission Manager in determining the role and task of the entity, the individual Entity Execution Modules monitor task execution and platform navigation and sensor control, and the World Model Module hosts local and global versions of the environment and the Common Operating Picture (COP). The modules and uniform interfaces provide a consistent and platform-independent baseline mission collaboration mechanism and signaling protocol across different platforms. Further, the modular design allows flexible and convenient addition of new autonomous collaborative behaviors to the ACMS through: adding new behavioral templates in the Mission Planner component, adding new components in appropriate ACMS modules to provide new mission specific functionality, adding or modifying constraints or parameters to the existing components, or any combination of these. We describe the ACMS architecture, its main features, current development status and future plans for simulations in this report.
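
    As an illustration of how such platform-independent module boundaries could be declared in code (the method names below are assumptions for illustration, not the published ACMS interfaces), a brief Python sketch:

        # Illustrative module boundaries; module names follow the abstract, methods are assumed.
        from abc import ABC, abstractmethod

        class MissionManagementModule(ABC):
            @abstractmethod
            def assign_roles(self, entities, mission):
                """Allocate entity roles, e.g. via a market-based collaboration mechanism."""

        class EntityManagementModule(ABC):
            @abstractmethod
            def negotiate_task(self, mission_manager):
                """Work with the mission manager to settle this entity's role and task."""

        class EntityExecutionModule(ABC):
            @abstractmethod
            def execute(self, task):
                """Monitor task execution, platform navigation, and sensor control."""

        class WorldModelModule(ABC):
            @abstractmethod
            def update_cop(self, observations):
                """Maintain local/global environment models and the Common Operating Picture."""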

  15. Towards an Autonomic Cluster Management System (ACMS) with Reflex Autonomicity

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt; Hinchey, Mike; Sterritt, Roy

    2005-01-01

    Cluster computing, whereby a large number of simple processors or nodes are combined together to apparently function as a single powerful computer, has emerged as a research area in its own right. The approach offers a relatively inexpensive means of providing a fault-tolerant environment and achieving significant computational capabilities for high-performance computing applications. However, the task of manually managing and configuring a cluster quickly becomes daunting as the cluster grows in size. Autonomic computing, with its vision to provide self-management, can potentially solve many of the problems inherent in cluster management. We describe the development of a prototype Autonomic Cluster Management System (ACMS) that exploits autonomic properties in automating cluster management and its evolution to include reflex reactions via pulse monitoring.
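
    A minimal, self-contained sketch of the pulse-monitoring reflex idea (invented node names and timeout; not the ACMS implementation itself):

        # Minimal heartbeat-based reflex loop with illustrative thresholds and node names.
        import time

        HEARTBEAT_TIMEOUT = 5.0   # seconds without a pulse before the reflex fires

        last_pulse = {"node-01": time.time(), "node-02": time.time()}

        def record_pulse(node):
            # Called whenever a node's heartbeat message arrives.
            last_pulse[node] = time.time()

        def reflex_check():
            # Reflex reaction: flag (in a real system, restart) any node whose pulse has gone silent.
            now = time.time()
            for node, seen in last_pulse.items():
                if now - seen > HEARTBEAT_TIMEOUT:
                    print(f"{node}: no heartbeat for {now - seen:.1f}s, triggering recovery")

        record_pulse("node-01")
        reflex_check()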

  16. Design and implementation of GaAs HBT circuits with ACME

    NASA Technical Reports Server (NTRS)

    Hutchings, Brad L.; Carter, Tony M.

    1993-01-01

    GaAs HBT circuits offer high performance (5-20 GHz) and radiation hardness (500 Mrad) that are attractive for space applications. ACME is a CAD tool specifically developed for HBT circuits. ACME implements a novel physical schematic-capture design technique where designers simultaneously view the structure and physical organization of a circuit. ACME's design interface is similar to schematic capture; however, unlike conventional schematic capture, designers can directly control the physical placement of both function and interconnect at the schematic level. In addition, ACME provides design-time parasitic extraction, complex wire models, and extensions to Multi-Chip Modules (MCMs). A GaAs HBT gate-array and semi-custom circuits have been developed with ACME; several circuits have been fabricated and found to be fully functional.

  17. Pomegranate MR images analysis using ACM and FCM algorithms

    NASA Astrophysics Data System (ADS)

    Morad, Ghobad; Shamsi, Mousa; Sedaaghi, M. H.; Alsharif, M. R.

    2011-10-01

    Segmentation of an image plays an important role in image processing applications. In this paper, segmentation of pomegranate magnetic resonance (MR) images is explored. Pomegranate has valuable nutritional and medicinal properties, and its maturity indices and the quality of its internal tissues play an important role in the sorting process; these features cannot be easily determined by a human operator. Seeds and soft tissues are the main internal components of pomegranate. For research purposes, such as non-destructive investigation to determine the ripening index and the percentage of seeds during the growth period, segmentation of the internal structures should be performed as exactly as possible. In this paper, we present an automatic algorithm to segment the internal structure of pomegranate. Since the intensity of the stem and calyx is close to that of the internal tissues, stem and calyx pixels are usually mislabeled as internal tissue by the segmentation algorithm. To solve this problem, first the fruit shape is extracted from its background using an active contour model (ACM). Then the stem and calyx are removed using morphological filters. Finally, the image is segmented by fuzzy c-means (FCM). The experimental results show an accuracy of 95.91% in the presence of the stem and calyx, while the segmentation accuracy increases to 97.53% when the stem and calyx are first removed by morphological filters.
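
    A rough sketch of the three-stage pipeline described above (active contour around the fruit, morphological removal of thin stem/calyx structures, fuzzy c-means clustering of intensities), assuming scikit-image for the first two steps and a hand-rolled FCM update; the parameters, structuring-element size, and cluster count are illustrative guesses, not the values used in the paper.

        # Illustrative pipeline: ACM boundary -> morphological filtering -> fuzzy c-means (FCM).
        import numpy as np
        from skimage import filters, morphology
        from skimage.segmentation import active_contour

        def segment_pomegranate(img, n_clusters=3, m=2.0, n_iter=50):
            # 1) Active contour around the whole fruit, initialized as a circle.
            s = np.linspace(0, 2 * np.pi, 200)
            r0, c0 = np.array(img.shape) / 2
            init = np.column_stack([r0 + 0.9 * r0 * np.sin(s), c0 + 0.9 * c0 * np.cos(s)])
            snake = active_contour(filters.gaussian(img, 3), init, alpha=0.015, beta=10)

            # 2) Morphological opening to suppress thin stem/calyx structures (size is a guess).
            mask = img > filters.threshold_otsu(img)
            mask = morphology.binary_opening(mask, morphology.disk(7))

            # 3) Fuzzy c-means on the intensities inside the mask.
            x = img[mask].astype(float).ravel()
            centers = np.linspace(x.min(), x.max(), n_clusters)
            for _ in range(n_iter):
                d = np.abs(x[:, None] - centers[None, :]) + 1e-9       # distances to centers
                u = 1.0 / d ** (2.0 / (m - 1.0))
                u /= u.sum(axis=1, keepdims=True)                      # membership degrees
                centers = (u ** m * x[:, None]).sum(axis=0) / (u ** m).sum(axis=0)
            labels = np.argmax(u, axis=1)
            return snake, mask, labels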

  18. ARM Airborne Carbon Measurements VI (ACME VI) Science Plan

    SciTech Connect

    Biraud, S

    2015-12-01

    From October 1 through September 30, 2016, the Atmospheric Radiation Measurement (ARM) Aerial Facility will deploy the Cessna 206 aircraft over the Southern Great Plains (SGP) site, collecting observations of trace-gas mixing ratios over the ARM’s SGP facility. The aircraft payload includes two Atmospheric Observing Systems, Inc., analyzers for continuous measurements of CO2 and a 12-flask sampler for analysis of carbon cycle gases (CO2, CO, CH4, N2O, 13CO2, 14CO2, carbonyl sulfide, and trace hydrocarbon species, including ethane). The aircraft payload also includes instrumentation for solar/infrared radiation measurements. This research is supported by the U.S. Department of Energy’s ARM Climate Research Facility and Terrestrial Ecosystem Science Program and builds upon previous ARM Airborne Carbon Measurements (ARM-ACME) missions. The goal of these measurements is to improve understanding of 1) the carbon exchange at the SGP site, 2) how CO2 and associated water and energy fluxes influence radiative forcing, convective processes and CO2 concentrations over the SGP site, and 3) how greenhouse gases are transported on continental scales.

  19. Comparative study of numerical schemes of TVD3, UNO3-ACM and optimized compact scheme

    NASA Technical Reports Server (NTRS)

    Lee, Duck-Joo; Hwang, Chang-Jeon; Ko, Duck-Kon; Kim, Jae-Wook

    1995-01-01

    Three different schemes are employed to solve the benchmark problem. The first one is a conventional TVD-MUSCL (Monotone Upwind Schemes for Conservation Laws) scheme. The second scheme is a UNO3-ACM (Uniformly Non-Oscillatory Artificial Compression Method) scheme. The third scheme is an optimized compact finite difference scheme modified by us: the 4th-order Runge-Kutta time stepping, the 4th-order pentadiagonal compact spatial discretization with the maximum resolution characteristics. The problems of category 1 are solved by using the second (UNO3-ACM) and third (Optimized Compact) schemes. The problems of category 2 are solved by using the first (TVD3) and second (UNO3-ACM) schemes. The problem of category 5 is solved by using the first (TVD3) scheme. It can be concluded from the present calculations that the Optimized Compact scheme and the UNO3-ACM show good resolutions for category 1 and category 2, respectively.
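
    For reference, the classical 4th-order Runge-Kutta time stepping mentioned above advances a semi-discretized system du/dt = f(t, u) as in the generic sketch below (a plain RK4 stepper with a simple upwind advection operator as the example right-hand side; this is not the authors' optimized compact solver).

        # Generic classical RK4 step for du/dt = f(t, u); the spatial operator f is problem-specific.
        import numpy as np

        def rk4_step(f, t, u, dt):
            k1 = f(t, u)
            k2 = f(t + dt / 2, u + dt / 2 * k1)
            k3 = f(t + dt / 2, u + dt / 2 * k2)
            k4 = f(t + dt, u + dt * k3)
            return u + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

        # Example right-hand side: linear advection du/dt = -a du/dx with first-order upwinding.
        def advection_rhs(t, u, a=1.0, dx=0.01):
            return -a * (u - np.roll(u, 1)) / dx

        x = np.linspace(0.0, 1.0, 100, endpoint=False)
        u = np.exp(-200.0 * (x - 0.5) ** 2)        # initial Gaussian pulse
        dt = 0.005                                 # CFL number a*dt/dx = 0.5
        for n in range(100):
            u = rk4_step(advection_rhs, n * dt, u, dt)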

  20. The acellular matrix (ACM) for bladder tissue engineering: A quantitative magnetic resonance imaging study.

    PubMed

    Cheng, Hai-Ling Margaret; Loai, Yasir; Beaumont, Marine; Farhat, Walid A

    2010-08-01

    Bladder acellular matrices (ACMs) derived from natural tissue are gaining increasing attention for their role in tissue engineering and regeneration. Unlike conventional scaffolds based on biodegradable polymers or gels, ACMs possess native biomechanical and many acquired biologic properties. Efforts to optimize ACM-based scaffolds are ongoing and would be greatly assisted by a noninvasive means to characterize scaffold properties and monitor interaction with cells. MRI is well suited to this role, but research with MRI for scaffold characterization has been limited. This study presents initial results from quantitative MRI measurements for bladder ACM characterization and investigates the effects of incorporating hyaluronic acid, a natural biomaterial useful in tissue engineering and regeneration. Measured MR relaxation times (T1, T2) and diffusion coefficient were consistent with increased water uptake and glycosaminoglycan content observed on biochemistry in hyaluronic acid ACMs. Multicomponent MRI provided greater specificity, with diffusion data showing an acellular environment and T2 components distinguishing the separate effects of increased glycosaminoglycans and hydration. These results suggest that quantitative MRI may provide useful information on matrix composition and structure, which is valuable in guiding further development using bladder ACMs for organ regeneration and in strategies involving the use of hyaluronic acid.
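
    For orientation, the quantitative parameters named above are commonly obtained by fitting standard relaxation and diffusion signal models of the following generic forms (the exact fitting models used in this study are not stated in the abstract):

        % Mono-exponential relaxation and diffusion signal models (generic forms)
        S(\mathrm{TE}) = S_0 \, e^{-\mathrm{TE}/T_2}, \qquad
        S(\mathrm{TI}) = S_0 \left( 1 - 2\, e^{-\mathrm{TI}/T_1} \right), \qquad
        S(b) = S_0 \, e^{-b D}

        % Multicomponent T2: a sum of exponentials with component amplitudes A_i
        S(\mathrm{TE}) = \sum_i A_i \, e^{-\mathrm{TE}/T_{2,i}}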

  1. CNTF-ACM promotes mitochondrial respiration and oxidative stress in cortical neurons through upregulating L-type calcium channel activity.

    PubMed

    Sun, Meiqun; Liu, Hongli; Xu, Huanbai; Wang, Hongtao; Wang, Xiaojing

    2016-09-01

    A specialized culture medium termed ciliary neurotrophic factor-treated astrocyte-conditioned medium (CNTF-ACM) allows investigators to assess the peripheral effects of CNTF-induced activated astrocytes upon cultured neurons. CNTF-ACM has been shown to upregulate neuronal L-type calcium channel current activity, which has been previously linked to changes in mitochondrial respiration and oxidative stress. Therefore, the aim of this study was to evaluate CNTF-ACM's effects upon mitochondrial respiration and oxidative stress in rat cortical neurons. Cortical neurons, CNTF-ACM, and untreated control astrocyte-conditioned medium (UC-ACM) were prepared from neonatal Sprague-Dawley rat cortical tissue. Neurons were cultured in either CNTF-ACM or UC-ACM for a 48-h period. Changes in the following parameters before and after treatment with the L-type calcium channel blocker isradipine were assessed: (i) intracellular calcium levels, (ii) mitochondrial membrane potential (ΔΨm), (iii) oxygen consumption rate (OCR) and adenosine triphosphate (ATP) formation, (iv) intracellular nitric oxide (NO) levels, (v) mitochondrial reactive oxygen species (ROS) production, and (vi) susceptibility to the mitochondrial complex I toxin rotenone. CNTF-ACM neurons displayed the following significant changes relative to UC-ACM neurons: (i) increased intracellular calcium levels (p < 0.05), (ii) elevation in ΔΨm (p < 0.05), (iii) increased OCR and ATP formation (p < 0.05), (iv) increased intracellular NO levels (p < 0.05), (v) increased mitochondrial ROS production (p < 0.05), and (vi) increased susceptibility to rotenone (p < 0.05). Treatment with isradipine was able to partially rescue these negative effects of CNTF-ACM (p < 0.05). CNTF-ACM promotes mitochondrial respiration and oxidative stress in cortical neurons through elevating L-type calcium channel activity. PMID:27514537

  2. An audience-channel-message-evaluation (ACME) framework for health communication campaigns.

    PubMed

    Noar, Seth M

    2012-07-01

    Recent reviews of the literature have indicated that a number of health communication campaigns continue to fail to adhere to principles of effective campaign design. The lack of an integrated, organizing framework for the design, implementation, and evaluation of health communication campaigns may contribute to this state of affairs. The current article introduces an audience-channel-message-evaluation (ACME) framework that organizes the major principles of health campaign design, implementation, and evaluation. ACME also explicates the relationships and linkages between the varying principles. Insights from ACME include the following: The choice of audience segment(s) to focus on in a campaign affects all other campaign design choices, including message strategy and channel/component options. Although channel selection influences options for message design, choice of message design also influences channel options. Evaluation should not be thought of as a separate activity, but rather should be infused and integrated throughout the campaign design and implementation process, including formative, process, and outcome evaluation activities. Overall, health communication campaigns that adhere to this integrated set of principles of effective campaign design will have a greater chance of success than those using principles idiosyncratically. These design, implementation, and evaluation principles are embodied in the ACME framework.

  3. An audience-channel-message-evaluation (ACME) framework for health communication campaigns.

    PubMed

    Noar, Seth M

    2012-07-01

    Recent reviews of the literature have indicated that a number of health communication campaigns continue to fail to adhere to principles of effective campaign design. The lack of an integrated, organizing framework for the design, implementation, and evaluation of health communication campaigns may contribute to this state of affairs. The current article introduces an audience-channel-message-evaluation (ACME) framework that organizes the major principles of health campaign design, implementation, and evaluation. ACME also explicates the relationships and linkages between the varying principles. Insights from ACME include the following: The choice of audience segment(s) to focus on in a campaign affects all other campaign design choices, including message strategy and channel/component options. Although channel selection influences options for message design, choice of message design also influences channel options. Evaluation should not be thought of as a separate activity, but rather should be infused and integrated throughout the campaign design and implementation process, including formative, process, and outcome evaluation activities. Overall, health communication campaigns that adhere to this integrated set of principles of effective campaign design will have a greater chance of success than those using principles idiosyncratically. These design, implementation, and evaluation principles are embodied in the ACME framework. PMID:21441207

  4. 76 FR 64943 - Proposed Cercla Administrative Cost Recovery Settlement; ACM Smelter and Refinery Site, Located...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-19

    ... administrative settlement for recovery of past and projected future response costs concerning the ACM Smelter and... past response costs, as well as future response costs under the settlement. The settlement includes a... considerations which indicate that the settlement is inappropriate, improper, or inadequate. The...

  5. SUPERFUND TREATABILITY CLEARINGHOUSE: FINAL REPORT, PHASE I - IMMEDIATE ASSESSMENT, ACME SOLVENTS SITE

    EPA Science Inventory

    This is a site assessment and feasibility study of incineration alternatives at the ACME Solvents Site at Rockford, Illinois. The document contains laboratory results that are reported to simulate incineration conditions but no details on test methods were provided. The d...

  6. Optimizing the Advanced Ceramic Material (ACM) for Diesel Particulate Filter Applications

    SciTech Connect

    Dillon, Heather E.; Stewart, Mark L.; Maupin, Gary D.; Gallant, Thomas R.; Li, Cheng; Mao, Frank H.; Pyzik, Aleksander J.; Ramanathan, Ravi

    2006-10-02

    This paper describes the application of pore-scale filtration simulations to the ‘Advanced Ceramic Material’ (ACM) developed by Dow Automotive for use in advanced diesel particulate filters. The application required the generation of a three dimensional substrate geometry to provide the boundary conditions for the flow model. An innovative stochastic modeling technique was applied matching chord length distribution and the porosity profile of the material. Additional experimental validation was provided by the single channel experimental apparatus. Results show that the stochastic reconstruction techniques provide flexibility and appropriate accuracy for the modeling efforts. Early optimization efforts imply that needle length may provide a mechanism for adjusting performance of the ACM for DPF applications. New techniques have been developed to visualize soot deposition in both traditional and new DPF substrate materials. Loading experiments have been conducted on a variety of single channel DPF substrates to develop a deeper understanding of soot penetration, soot deposition characteristics, and to confirm modeling results.

  7. Use of a new microporous insulation in a sub car at Acme Steel

    SciTech Connect

    Harvey, H.; Gamble, F.C.; MacKenzie, I.B.

    1996-12-31

    Acme Steel Co. is a small integrated steel company headquartered in Riverdale, IL, with its blast furnace and coke plant operations located in the city of Chicago. Rail transportation between the two plants is by Conrail with two crews assigned exclusively to Acme. The torpedo cars used for this service are specially reinforced, with 36 in. wheels and additional braking capability for safety on public rail tracks. Over a seven-month period, microporous insulating panels 0.28 in. thick in the No. 49 sub ladle saved an average of 24 degrees in the iron on arrival at the BOF compared to the average for the rest of the fleet. The microporous insulation replaced 0.25 in. of compressed fiber panel.

  8. Categorization of Computing Education Resources into the ACM Computing Classification System

    SciTech Connect

    Chen, Yinlin; Bogen, Paul Logasa; Fox, Dr. Edward A.; Hsieh, Dr. Haowei; Cassel, Dr. Lillian N.

    2012-01-01

    The Ensemble Portal harvests resources from multiple heterogeneous federated collections. Managing these dynamically growing collections requires an automatic mechanism to categorize records into corresponding topics. We propose an approach that uses existing ACM DL metadata to build classifiers for harvested resources in the Ensemble project. We also present our experience in utilizing the Amazon Mechanical Turk platform to build ground-truth training data sets from Ensemble collections.
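
    A compact sketch of the kind of classifier this describes, training on ACM DL metadata (record text plus ACM CCS category labels) and applying it to harvested records; scikit-learn is used here only as a stand-in, and the actual features, model, and Mechanical Turk labeling workflow used by the Ensemble project are not specified in the abstract.

        # Illustrative topic classifier: TF-IDF features + logistic regression.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        # Toy training data standing in for ACM DL metadata (text, CCS top-level category).
        train_text = ["parallel algorithms for sparse matrix factorization",
                      "assessment practices in introductory programming courses",
                      "query optimization in relational database systems"]
        train_label = ["Theory of computation",
                       "Social and professional topics",
                       "Information systems"]

        clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                            LogisticRegression(max_iter=1000))
        clf.fit(train_text, train_label)

        # Categorize a harvested resource description.
        print(clf.predict(["lecture notes on SQL indexing and query plans"]))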

  9. On the modeling of a single-stage, entrained-flow gasifier using Aspen Custom Modeler (ACM)

    SciTech Connect

    Kasule, J.; Turton, R.; Bhattacharyya, D.; Zitney, S.

    2010-01-01

    Coal-fired gasifiers are the centerpiece of integrated gasification combined cycle (IGCC) power plants. The gasifier produces synthesis gas that is subsequently converted into electricity through combustion in a gas turbine. Several mathematical models have been developed to study the physical and chemical processes taking place inside the gasifier. Such models range from simple one-dimensional (1D) steady-state models to sophisticated dynamic 3D computational fluid dynamics (CFD) models that incorporate turbulence effects in the reactor. The practical operation of the gasifier is dynamic in nature but most 1D and some higher-dimensional models are often steady state. On the other hand, many higher order CFD-based models are dynamic in nature, but are too computationally expensive to be used directly in operability and controllability dynamic studies. They are also difficult to incorporate in the framework of process simulation software such as Aspen Plus Dynamics. Thus lower-dimensional dynamic models are still useful in these types of studies. In the current study, a 1D dynamic model for a single-stage, downward-firing, entrained-flow GE-type gasifier is developed using Aspen Custom Modeler® (ACM), which is a commercial equation-based simulator for creating, editing, and re-using models of process units. The gasifier model is based on mass, momentum, and energy balances for the solid and gas phases. The physical and chemical reactions considered in the model are drying, devolatilization/pyrolysis, gasification, combustion, and the homogeneous gas phase reactions. The dynamic gasifier model is being developed for use in a plant-wide dynamic model of an IGCC power plant. For dynamic simulation, the resulting highly nonlinear system of partial differential algebraic equations (PDAE) is solved in ACM using the well-known Method of Lines (MoL) approach. The MoL discretizes the space domain and leaves the time domain continuous, thereby converting the PDAE to
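
    The Method of Lines step mentioned above (discretize space, leave time continuous, hand the resulting ODE system to a stiff integrator) is illustrated below with a toy 1D heat equation standing in for the far more complex coupled gasifier PDAE system; SciPy is assumed, and none of this reflects the actual ACM model equations.

        # Toy Method of Lines example: du/dt = alpha * d2u/dx2 on a 1D grid.
        # Space is discretized with central differences; time is left to an ODE integrator.
        import numpy as np
        from scipy.integrate import solve_ivp

        alpha, nx = 1.0e-3, 50
        x = np.linspace(0.0, 1.0, nx)
        dx = x[1] - x[0]

        def rhs(t, u):
            dudt = np.zeros_like(u)
            dudt[1:-1] = alpha * (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
            return dudt                        # endpoints fixed (Dirichlet boundaries)

        u0 = np.exp(-100.0 * (x - 0.5) ** 2)   # initial temperature-like profile
        sol = solve_ivp(rhs, (0.0, 50.0), u0, method="BDF")
        print(sol.y[:, -1].max())              # peak value after diffusion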

  10. TWO NOVEL ACM (ACTIVE CONTOUR MODEL) METHODS FOR INTRAVASCULAR ULTRASOUND IMAGE SEGMENTATION

    SciTech Connect

    Chen, Chi Hau; Potdat, Labhesh; Chittineni, Rakesh

    2010-02-22

    One of the attractive image segmentation methods is the Active Contour Model (ACM), which has been widely used in medical imaging as it always produces sub-regions with continuous boundaries. Intravascular ultrasound (IVUS) is a catheter-based medical imaging technique used for quantitative assessment of atherosclerotic disease. Two ACM realizations are presented in this paper. The gradient descent flow based on minimizing an energy functional can be used for segmentation of IVUS images. However, this local operation alone may not be adequate for complex IVUS images. The first method presented essentially combines local geodesic active contours and global region-based active contours. The advantage of combining the local and global operations is to allow curves deforming under the energy to find only significant local minima and delineate object borders despite noise, poor edge information, and heterogeneous intensity profiles. Results for this algorithm are compared to standard techniques to demonstrate the method's robustness and accuracy. In the second method, the energy function is appropriately modified and minimized using a Hopfield neural network. Proper modifications in the definition of the bias of the neurons have been introduced to incorporate image characteristics. The method overcomes distortions in the expected image pattern, such as those due to the presence of calcium, and employs a specialized structure of the neural network and boundary correction schemes that are based on a priori knowledge of the vessel geometry. The presented method is very fast and has been evaluated using sequences of IVUS frames.
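
    For reference, a representative global region-based energy of the kind the first method combines with local geodesic active contours is the Chan-Vese functional below; whether the paper uses exactly this form is not stated in the abstract.

        % Representative region-based active contour energy (Chan-Vese form)
        E(c_1, c_2, C) = \mu \,\mathrm{Length}(C)
          + \lambda_1 \int_{\mathrm{inside}(C)} \lvert I(\mathbf{x}) - c_1 \rvert^2 \, d\mathbf{x}
          + \lambda_2 \int_{\mathrm{outside}(C)} \lvert I(\mathbf{x}) - c_2 \rvert^2 \, d\mathbf{x}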

  11. Superfund Record of Decision (EPA Region 5): Acme Solvents, Morristown, Illinois, September 1985. Final report

    SciTech Connect

    Not Available

    1985-09-27

    The Acme Solvents Reclaiming, Inc. facility is located approximately five miles south of Rockford, Illinois. From 1960 until 1973, the facility served as a disposal site for paints, oils and still bottoms from the solvent reclamation plant located in Rockford. In addition, empty drums were stored onsite. Wastes were dumped into depressions created from either previous quarrying activities or by scraping overburden from the near surface bedrock to form berms. In September 1972, the Illinois Pollution Control Board (IPCB) ordered Acme to remove all drums and wastes from the facility and to backfill the lagoons. Follow-up inspections revealed that wastes and crushed drums were being left onsite and merely covered with soil. Sampling of the site revealed high concentrations of chlorinated organics in the drinking water. The major source of hazardous substances at the facility are the waste disposal mounds. These mounds contain volatile and semi-volatile organic compounds and concentrations of PCBs up to several hundred mg/kg. The selected remedial action is included.

  12. AIHA position statement on the removal of asbestos-containing materials (ACM) from buildings

    SciTech Connect

    Not Available

    1991-06-01

    The health risks associated with asbestos exposure for building occupants have been demonstrated to be very low. The decision to remove asbestos-containing materials (ACM) in undamaged, intact condition that are not readily accessible to occupants should be made only after assessing all other options. Both technical and financial issues should be fully explored by a team of trained specialists, including industrial hygienists, architects, and engineers. The optimal solution will vary from building to building, based on factors unique to each situation. One important consideration is the use of a well-designed air-monitoring program to identify changes in airborne levels of asbestos. Special training and maintenance programs are needed to ensure the safety and health of building and contract workers who may encounter asbestos or who may disturb it during routine or nonroutine activities. Each building owner who has ACM in a building should identify an in-house asbestos manager, and it is also necessary to provide appropriate resources, including professional consultants, to develop and manage a responsible and effective in-place management program throughout the life of a building containing asbestos.

  13. Are Academic Programs Adequate for the Software Profession?

    ERIC Educational Resources Information Center

    Koster, Alexis

    2010-01-01

    According to the Bureau of Labor Statistics, close to 1.8 million people, or 77% of all computer professionals, were working in the design, development, deployment, maintenance, and management of software in 2006. The ACM [Association for Computing Machinery] model curriculum for the BS in computer science proposes that about 42% of the core body…

  14. Paleoenvironmental conditions for the development of calcareous nannofossil acme during the late Miocene in the eastern equatorial Pacific

    NASA Astrophysics Data System (ADS)

    Beltran, Catherine; Rousselle, Gabrielle; Backman, Jan; Wade, Bridget S.; Sicre, Marie Alexandrine

    2014-03-01

    Repeated monospecific coccolithophore dominance intervals (acmes) of specimens belonging to the Noelaerhabdaceae family—including the genus Reticulofenestra and modern descendants Emiliania and Gephyrocapsa—occurred during the Neogene. Such an acme was recognized during the late Miocene (~ 8.6 Ma), at a time of a major reorganization of nannofossil assemblages resulting in a worldwide temporary disappearance of larger forms of the genus Reticulofenestra (R. pseudoumbilicus) and the gradual recovery and dominance of its smaller forms (< 5 µm). In this study, we present a multiproxy investigation of late Miocene sediments from the east equatorial Pacific Integrated Ocean Drilling Program Site U1338 where small reticulofenestrid-type placoliths with a closed central area—known as small Dictyococcites spp. (< 3 µm)—formed an acme. We report on oxygen and carbon stable isotope records of multispecies planktic calcite and alkenone-derived sea surface temperature. Our data indicate that, during this 100 kyr long acme, the east equatorial Pacific thermocline remained deep and stable. Local surface stratification state fails to explain this acme and thus contradicts the model-based hypothesis of a Southern Ocean high-latitude nutrient control of the surface waters in the east equatorial Pacific. Instead, our findings suggest that external forcing such as an extended period of low eccentricity may have created favorable conditions for the growth of small Dictyococcites spp.

  15. Public health assessment for ACME Solvent Reclaiming Incorporated, Winnebago, Winnebago County, Illinois, region 6. Cerclis No. ILD053219259. Final report

    SciTech Connect

    1995-08-11

    Acme Solvent Reclaiming, Inc. (ACME), covers approximately 20 acres 5 miles south of Rockford on Lindenwood Road in Winnebago County. The wastes disposed on-site included paints, oils, still-bottoms, sludges, and non-recoverable solvents. Disposal practices resulted in soils contaminated with numerous inorganic and organic compounds including metals, volatiles, semi-volatiles, and polychlorinated biphenyls (PCBs). In addition to the soil contamination, a contaminant plume migrating south-southwest has been identified in groundwater beneath and around the ACME site. Based on available information, this site is considered to be a public health hazard because of the risk to human health resulting from past, present, and potential future exposure to groundwater contaminated with various inorganic and organic compounds, including metals, volatiles, semi-volatiles, and polychlorinated biphenyls (PCBs), at concentrations that may result in an increased risk of adverse health effects.

  16. Autonomic Cluster Management System (ACMS): A Demonstration of Autonomic Principles at Work

    NASA Technical Reports Server (NTRS)

    Baldassari, James D.; Kopec, Christopher L.; Leshay, Eric S.; Truszkowski, Walt; Finkel, David

    2005-01-01

    Cluster computing, whereby a large number of simple processors or nodes are combined together to apparently function as a single powerful computer, has emerged as a research area in its own right. The approach offers a relatively inexpensive means of achieving significant computational capabilities for high-performance computing applications, while simultaneously affording the ability to increase that capability simply by adding more (inexpensive) processors. However, the task of manually managing and configuring a cluster quickly becomes impossible as the cluster grows in size. Autonomic computing is a relatively new approach to managing complex systems that can potentially solve many of the problems inherent in cluster management. We describe the development of a prototype Autonomic Cluster Management System (ACMS) that exploits autonomic properties in automating cluster management.

  17. Health assessment for Acme Solvents Reclamation, Inc. , Winnebago County, Illinois, Region 5. CERCLIS No. ILD053219259. Final report

    SciTech Connect

    Not Available

    1988-08-01

    The Acme Solvents Reclamation, Inc. (Acme) National Priorities List Site is located in Winnebago County, Illinois. There are volatile organic compounds, base neutral extractable compounds, polychlorinated biphenyls, and several metals present in the soil, sediment, ground water, air, and/or leachate at or around the site. The Record of Decision signed September 1985, mandated several remedial actions which included the provision of interim alternate water, excavation, and incineration of waste and contaminated soil, landfilling of non-incinerable waste in an off-site Resource Conservation and Recovery Act landfill, and continued investigation of the connection between the ground water flow and the bedrock.

  18. AcmB Is an S-Layer-Associated β-N-Acetylglucosaminidase and Functional Autolysin in Lactobacillus acidophilus NCFM

    PubMed Central

    Johnson, Brant R.

    2016-01-01

    Autolysins, also known as peptidoglycan hydrolases, are enzymes that hydrolyze specific bonds within bacterial cell wall peptidoglycan during cell division and daughter cell separation. Within the genome of Lactobacillus acidophilus NCFM, there are 11 genes encoding proteins with peptidoglycan hydrolase catalytic domains, 9 of which are predicted to be functional. Notably, 5 of the 9 putative autolysins in L. acidophilus NCFM are S-layer-associated proteins (SLAPs) noncovalently colocalized along with the surface (S)-layer at the cell surface. One of these SLAPs, AcmB, a β-N-acetylglucosaminidase encoded by the gene lba0176 (acmB), was selected for functional analysis. In silico analysis revealed that acmB orthologs are found exclusively in S-layer-forming species of Lactobacillus. Chromosomal deletion of acmB resulted in aberrant cell division, autolysis, and autoaggregation. Complementation of acmB in the ΔacmB mutant restored the wild-type phenotype, confirming the role of this SLAP in cell division. The absence of AcmB within the exoproteome had a pleiotropic effect on the extracellular proteins covalently and noncovalently bound to the peptidoglycan, which likely led to the observed decrease in the binding capacity of the ΔacmB strain for mucin and extracellular matrices fibronectin, laminin, and collagen in vitro. These data suggest a functional association between the S-layer and the multiple autolysins noncovalently colocalized at the cell surface of L. acidophilus NCFM and other S-layer-producing Lactobacillus species. IMPORTANCE Lactobacillus acidophilus is one of the most widely used probiotic microbes incorporated in many dairy foods and dietary supplements. This organism produces a surface (S)-layer, which is a self-assembling crystalline array found as the outermost layer of the cell wall. The S-layer, along with colocalized associated proteins, is an important mediator of probiotic activity through intestinal adhesion and modulation of

  19. Sparse Polynomial Chaos Surrogate for ACME Land Model via Iterative Bayesian Compressive Sensing

    NASA Astrophysics Data System (ADS)

    Sargsyan, K.; Ricciuto, D. M.; Safta, C.; Debusschere, B.; Najm, H. N.; Thornton, P. E.

    2015-12-01

    For computationally expensive climate models, Monte-Carlo approaches of exploring the input parameter space are often prohibitive due to slow convergence with respect to ensemble size. To alleviate this, we build inexpensive surrogates using uncertainty quantification (UQ) methods employing Polynomial Chaos (PC) expansions that approximate the input-output relationships using as few model evaluations as possible. However, when many uncertain input parameters are present, such UQ studies suffer from the curse of dimensionality. In particular, for 50-100 input parameters non-adaptive PC representations have infeasible numbers of basis terms. To this end, we develop and employ Weighted Iterative Bayesian Compressive Sensing to learn the most important input parameter relationships for efficient, sparse PC surrogate construction with posterior uncertainty quantified due to insufficient data. Besides drastic dimensionality reduction, the uncertain surrogate can efficiently replace the model in computationally intensive studies such as forward uncertainty propagation and variance-based sensitivity analysis, as well as design optimization and parameter estimation using observational data. We applied the surrogate construction and variance-based uncertainty decomposition to Accelerated Climate Model for Energy (ACME) Land Model for several output QoIs at nearly 100 FLUXNET sites covering multiple plant functional types and climates, varying 65 input parameters over broad ranges of possible values. This work is supported by the U.S. Department of Energy, Office of Science, Biological and Environmental Research, Accelerated Climate Modeling for Energy (ACME) project. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
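
    For orientation, a polynomial chaos surrogate and the sparsity-promoting regression at the heart of compressive sensing take the generic forms below (standard notation; the weighted, iterative Bayesian variant used in this work additionally places priors on the coefficients and quantifies their posterior uncertainty, which is not shown here).

        % Truncated polynomial chaos surrogate of a model output f in terms of the germ \xi
        f(\xi) \approx \sum_{k=0}^{K} c_k \, \Psi_k(\xi)

        % Sparse recovery of the coefficients from N model evaluations (basis pursuit denoising)
        \min_{\mathbf{c}} \; \lVert \mathbf{c} \rVert_1
        \quad \text{subject to} \quad
        \lVert \mathbf{f} - \boldsymbol{\Psi}\,\mathbf{c} \rVert_2 \le \varepsilon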

  20. Preliminary survey report: control technology for the ceramic industry at Acme Brick Company, Malvern, Arkansas

    SciTech Connect

    Godbey, F.W.

    1983-06-01

    Health-hazard control methods, work processes, and existing control technologies used in the manufacture of brick were surveyed at Acme Brick Company, Malvern, Arkansas in June, 1983. The company employed about 32 workers to produce structural brick from alluvial clay, free clay, shale, and aggregate. A potential hazard existed from silica exposure since the clays contained about 20% quartz. Raw materials were transported in a cab-enclosed front-end loader to feeders that delivered the materials to a crusher. Blended coarsely crushed material was moved by conveyor to a hammer mill for fine crushing. Production-size product was transported by overhead conveyor to storage silos in the production building. The entire material particle-size reduction process was completely automated. The clay-preparation building and raw-material storage area were isolated from the production building, and only two workers performed the crushing and grinding operations. Material transfer points had removable covers, and a water-mist spray was used on one conveyor of each line. The operation was monitored from a totally enclosed air-conditioned control room. Head and eye protection were required. The author does not recommend an in-depth study of control technologies of the company.

  1. Improving the Limit on the Electron EDM: Data Acquisition and Systematics Studies in the ACME Experiment

    NASA Astrophysics Data System (ADS)

    Hess, Paul William

    The ACME collaboration has completed a measurement setting a new upper limit on the size of the electron's permanent electric dipole moment (EDM). The existence of the EDM is well motivated by theories extending the standard model of particle physics, with predicted sizes very close to the current experimental limit. The new limit was set by measuring spin precession within the metastable H state of the polar molecule thorium monoxide (ThO). A particular focus here is on the automated data acquisition system developed to search for a precession phase odd under internal and external reversal of the electric field. Automated switching of many different experimental controls allowed a rapid diagnosis of major systematics, including the dominant systematic caused by non-reversing electric fields and laser polarization gradients. Polarimetry measurements made it possible to quantify and minimize the polarization gradients in our state preparation and probe lasers. Three separate measurements were used to determine the electric field that did not reverse when we tried to switch the field direction. The new bound of |de| < 8.7 × 10⁻²⁹ e·cm is over an order of magnitude smaller than previous limits, and strongly limits T-violating physics at TeV energy scales.

  2. Geologic, geotechnical, and geophysical properties of core from the Acme Fire-Pit-1 drill hole, Sheridan County, Wyoming

    USGS Publications Warehouse

    Collins, Donley S.

    1983-01-01

    A preliminary core study from the Acme Fire-Pit-1 drill hole, Sheridan County, Wyoming, revealed that the upper portion of the core had been baked by a fire confined to the underlying Monarch coal bed. The baked (clinkered) sediment above the Monarch coal bed was determined to have higher point-load strength values (greater than 2 MPa) than the sediment under the burned coal bed.

  3. Segmentation of solid subregion of high grade gliomas in MRI images based on active contour model (ACM)

    NASA Astrophysics Data System (ADS)

    Seow, P.; Win, M. T.; Wong, J. H. D.; Abdullah, N. A.; Ramli, N.

    2016-03-01

    Gliomas are tumours arising from the interstitial tissue of the brain which are heterogeneous, infiltrative and possess ill-defined borders. Tumour subregions (e.g. solid enhancing part, edema and necrosis) are often used for tumour characterisation. Tumour demarcation into substructures facilitates glioma staging and provides essential information. Manual segmentation has several drawbacks: it is laborious, time consuming, subject to intra- and inter-rater variability, and hindered by the diversity in the appearance of tumour tissues. In this work, an active contour model (ACM) was used to segment the solid enhancing subregion of the tumour. 2D brain images of four histologically proven high-grade glioma patients were acquired using a 3T MRI fast spoiled gradient echo sequence post gadolinium. Preprocessing of the images, which included subtraction and skull stripping, was performed, followed by ACM segmentation. The results of the automatic segmentation method were compared against the manual delineation of the tumour by a trainee radiologist. Both results were further validated by an experienced neuroradiologist, and brief quantitative evaluations (pixel area and difference ratio) were performed. Preliminary results on the clinical data showed the potential of the ACM for fast and large-scale tumour segmentation in medical imaging.
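
    A minimal sketch of the general approach, assuming a Chan-Vese active contour from scikit-image rather than the authors' exact implementation; the input file name is a hypothetical placeholder for a contrast-subtracted, skull-stripped slice.

        # Hedged sketch: segment an enhancing region with a Chan-Vese active
        # contour (scikit-image); not the authors' pipeline, illustration only.
        from skimage import io, img_as_float
        from skimage.segmentation import chan_vese

        slice_img = img_as_float(io.imread("subtracted_slice.png", as_gray=True))  # hypothetical input
        mask = chan_vese(slice_img)          # evolve the contour with default parameters
        tumour_pixels = int(mask.sum())      # pixel-area estimate of the segmented region
        print(f"segmented area: {tumour_pixels} pixels "
              f"({100.0 * tumour_pixels / mask.size:.1f}% of the slice)")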

  4. Computer Software.

    ERIC Educational Resources Information Center

    Kay, Alan

    1984-01-01

    Discusses the nature and development of computer software. Programing, programing languages, types of software (including dynamic spreadsheets), and software of the future are among the topics considered. (JN)

  5. Development and first application of an Aerosol Collection Module (ACM) for quasi online compound specific aerosol measurements

    NASA Astrophysics Data System (ADS)

    Hohaus, Thorsten; Kiendler-Scharr, Astrid; Trimborn, Dagmar; Jayne, John; Wahner, Andreas; Worsnop, Doug

    2010-05-01

    Atmospheric aerosols influence climate and human health on regional and global scales (IPCC, 2007). In many environments organics are a major fraction of the aerosol, influencing its properties. Due to the huge variety of organic compounds present in atmospheric aerosol, current measurement techniques are far from providing a full speciation of organic aerosol (Hallquist et al., 2009). The development of new techniques for compound-specific measurements with high time resolution is a timely issue in organic aerosol research. Here we present first laboratory characterisations of an aerosol collection module (ACM) which was developed to allow for the sampling and transfer of atmospheric PM1 aerosol. The system consists of an aerodynamic lens system focusing particles into a beam. This beam is directed onto a 3.4 mm diameter surface which is cooled to -30 °C with liquid nitrogen. After collection the aerosol sample can be evaporated from the surface by heating it to up to 270 °C. The sample is transferred through a 60 cm long line with a carrier gas. In order to test the ACM for linearity and sensitivity we combined it with a GC-MS system. The tests were performed with octadecane aerosol. The octadecane mass as measured with the ACM-GC-MS was compared against the mass calculated from the SMPS-derived total volume. The data correlate well (R² = 0.99, slope of linear fit = 1.1), indicating 100 % collection efficiency. From 150 °C to 270 °C no effect of desorption temperature on transfer efficiency could be observed. The ACM-GC-MS system was proven to be linear over the mass range 2-100 ng and has a detection limit of ~2 ng. First experiments applying the ACM-GC-MS system were conducted at the Jülich Aerosol Chamber. Secondary organic aerosol (SOA) was formed from ozonolysis of 600 ppbv of β-pinene. The major oxidation product nopinone was detected in the aerosol and could be shown to decrease from 2 % of the total aerosol to 0.5 % over the 48 hours of the experiment.
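
    A small illustrative sketch of the linearity check described above, with made-up numbers standing in for the SMPS-derived reference masses and the ACM-GC-MS response:

        # Illustrative sketch (hypothetical numbers): least-squares check of
        # ACM-GC-MS linearity against SMPS-derived octadecane mass.
        import numpy as np

        smps_mass_ng = np.array([2.0, 5.0, 10.0, 20.0, 50.0, 100.0])   # reference masses (assumed)
        acm_mass_ng  = np.array([2.3, 5.4, 11.2, 21.8, 54.9, 109.6])   # ACM-GC-MS response (assumed)

        slope, intercept = np.polyfit(smps_mass_ng, acm_mass_ng, 1)
        r = np.corrcoef(smps_mass_ng, acm_mass_ng)[0, 1]
        print(f"slope = {slope:.2f}, intercept = {intercept:.2f} ng, R^2 = {r**2:.3f}")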

  6. A Prediction of the Damping Properties of Hindered Phenol AO-60/polyacrylate Rubber (AO-60/ACM) Composites through Molecular Dynamics Simulation

    NASA Astrophysics Data System (ADS)

    Yang, Da-Wei; Zhao, Xiu-Ying; Zhang, Geng; Li, Qiang-Guo; Wu, Si-Zhu

    2016-05-01

    Molecular dynamics (MD) simulation, a molecular-level method, was applied to predict the damping properties of AO-60/polyacrylate rubber (AO-60/ACM) composites before experimental measurements were performed. MD simulation results revealed that two types of hydrogen bond were formed: type A, (AO-60)-OH•••O=C-(ACM), and type B, (AO-60)-OH•••O=C-(AO-60). Then, the AO-60/ACM composites were fabricated and tested by dynamic mechanical thermal analysis (DMTA) to verify the accuracy of the MD simulation. DMTA results showed that the introduction of AO-60 could remarkably improve the damping properties of the composites, including an increase in the glass transition temperature (Tg) and in the loss factor (tan δ), and indicated that the AO-60/ACM (98/100) composite had the best damping performance amongst the composites, which was verified experimentally.

  7. Selecting Software.

    ERIC Educational Resources Information Center

    Pereus, Steven C.

    2002-01-01

    Describes a comprehensive computer software selection and evaluation process, including documenting district needs, evaluating software packages, weighing the alternatives, and making the purchase. (PKP)

  8. Insights into Degron Recognition by APC/C Coactivators from the Structure of an Acm1-Cdh1 Complex

    PubMed Central

    He, Jun; Chao, William C.H.; Zhang, Ziguo; Yang, Jing; Cronin, Nora; Barford, David

    2013-01-01

    Summary The anaphase-promoting complex/cyclosome (APC/C) regulates sister chromatid segregation and the exit from mitosis. Selection of most APC/C substrates is controlled by coactivator subunits (either Cdc20 or Cdh1) that interact with substrate destruction motifs—predominantly the destruction (D) box and KEN box degrons. How coactivators recognize D box degrons and how this is inhibited by APC/C regulatory proteins is not defined at the atomic level. Here, from the crystal structure of S. cerevisiae Cdh1 in complex with its specific inhibitor Acm1, which incorporates D and KEN box pseudosubstrate motifs, we describe the molecular basis for D box recognition. Additional interactions between Acm1 and Cdh1 identify a third protein-binding site on Cdh1 that is likely to confer coactivator-specific protein functions including substrate association. We provide a structural rationalization for D box and KEN box recognition by coactivators and demonstrate that many noncanonical APC/C degrons bind APC/C coactivators at the D box coreceptor. PMID:23707760

  9. Superfund Record of Decision (EPA Region 5): Acme Solvent Reclaiming, Winnebago County, IL. (Second remedial action), December 1990

    SciTech Connect

    Not Available

    1990-12-31

    The 20-acre Acme Solvent Reclaiming site is a former industrial disposal site in Winnebago County, Illinois. Land use in the area is mixed agricultural and residential. From 1960 to 1973, Acme Solvent Reclaiming disposed of paints, oils, and still bottoms onsite from its solvent reclamation plant. Wastes were dumped into depressions created from previous quarrying and landscaping operations, and empty drums also were stored onsite. State investigations in 1981 identified elevated levels of chlorinated organic compounds in ground water. A 1985 Record of Decision (ROD) provided for excavation and onsite incineration of 26,000 cubic yards of contaminated soil and sludge, supplying home carbon treatment units to affected residences, and further study of ground water and bedrock. During illegal removal actions taken by PRPs in 1986, 40,000 tons of soil and sludge were removed from the site. The selected remedial action for the site includes excavating and treating 6,000 tons of soil and sludge from two waste areas, using low-temperature thermal stripping; treating residuals using solidification, if necessary, followed by onsite or offsite disposal; treating the remaining contaminated soil and possibly bedrock using soil/bedrock vapor extraction; consolidating the remaining contaminated soil onsite with any treatment residuals, followed by capping; incinerating offsite 8,000 gallons of liquids and sludge from two remaining tanks, and disposing of the tanks offsite; providing an alternate water supply to residents with contaminated wells; pumping and onsite treatment of VOC-contaminated ground water.

  10. CLIC-ACM: generic modular rad-hard data acquisition system based on CERN GBT versatile link

    NASA Astrophysics Data System (ADS)

    Bielawski, B.; Locci, F.; Magnoni, S.

    2015-01-01

    CLIC is a world-wide collaboration to study the next "terascale" lepton collider, relying upon the innovative concept of two-beam acceleration. This accelerator, currently under study, will be composed of a sequence of 21,000 two-beam modules. Each module requires more than 300 analogue and digital signals which need to be acquired and controlled in a synchronous way. CLIC-ACM (Acquisition and Control Module) is the generic control and acquisition module developed to accommodate the control of all these signals for the various sub-systems and their related specifications in terms of data bandwidth, triggering and timing synchronization. This paper describes the system architecture with respect to its radiation tolerance, power consumption and scalability.

  11. Software Reviews.

    ERIC Educational Resources Information Center

    Smith, Richard L., Ed.

    1985-01-01

    Reviews software packages by providing extensive descriptions and discussions of their strengths and weaknesses. Software reviewed include (1) "VISIFROG: Vertebrate Anatomy" (grade seven-adult); (2) "Fraction Bars Computer Program" (grades three to six) and (3) four telecommunications utilities. (JN)

  12. Software Program: Software Management Guidebook

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The purpose of this NASA Software Management Guidebook is twofold. First, this document defines the core products and activities required of NASA software projects. It defines life-cycle models and activity-related methods but acknowledges that no single life-cycle model is appropriate for all NASA software projects. It also acknowledges that the appropriate method for accomplishing a required activity depends on characteristics of the software project. Second, this guidebook provides specific guidance to software project managers and team leaders in selecting appropriate life cycles and methods to develop a tailored plan for a software engineering project.

  13. Proprietary software

    NASA Technical Reports Server (NTRS)

    Marnock, M. J.

    1971-01-01

    The protection of intellectual property by a patent, a copyright, or trade secrets is reviewed. The present and future use of computers and software are discussed, along with the governmental uses of software. The popularity of contractual agreements for sale or lease of computer programs and software services is also summarized.

  14. Characterization of PVL/ACME-positive methicillin-resistant Staphylococcus aureus (genotypes ST8-MRSA-IV and ST5-MRSA-II) isolated from a university hospital in Japan.

    PubMed

    Kawaguchiya, Mitsuyo; Urushibara, Noriko; Yamamoto, Dai; Yamashita, Toshiharu; Shinagawa, Masaaki; Watanabe, Naoki; Kobayashi, Nobumichi

    2013-02-01

    The ST8 methicillin-resistant Staphylococcus aureus (MRSA) with Staphylococcal cassette chromosome mec (SCCmec) type IVa, known as USA300, is a prevalent community-acquired MRSA (CA-MRSA) clone in the United States and has been spreading worldwide. The USA300 characteristically harbors Panton-Valentine Leukocidin (PVL) genes and the arginine catabolic mobile element (ACME, type I). Prevalence and molecular characteristics of PVL(+) and/or ACME(+) S. aureus were investigated in a university hospital located in northern Japan, for 1,366 S. aureus isolates, including 601 MRSA strains, derived from clinical specimens collected from 2008 to 2010. The PVL gene was identified in three MRSA strains with SCCmec IV, which belonged to ST8, spa type t008, coagulase type III, and agr type I. Two PVL-positive MRSA strains also had type I ACME, and were isolated from skin abscesses of outpatients who had not travelled abroad recently. One of these PVL(+)/ACME(+) strains carried tet(K), msrA, and aph(3')-IIIa, showing resistance to kanamycin, tetracycline, erythromycin, and ciprofloxacin, suggesting acquisition of more resistance than the ST8 CA-MRSA reported in Japan previously. In contrast, another PVL(+)/ACME(+) strain and a PVL(+)/ACME(-) strain were susceptible to more antimicrobials and had fewer virulence factors than PVL(-)/ACME(+) MRSA strains. Besides the two PVL(+) MRSA strains, ACME (type-ΔII) was identified in seven MRSA strains with SCCmec II belonging to ST5, one of the three spa types (t002, t067, and t071), coagulase type II, and agr type II. These PVL(-)/ACME(+) MRSA strains showed multiple drug resistance and harbored various toxin genes, as observed for ST5 PVL(-)/ACME(-) MRSA-II. The present study suggested the spread of ST8-MRSA-IV in northern Japan, and a potential significance of ACME-positive ST5-MRSA-II as an emerging MRSA clone in a hospital.

  15. Computer software.

    PubMed

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  16. Proceeding of the ACM/IEEE-CS Joint Conference on Digital Libraries (1st, Roanoke, Virginia, June 24-28, 2001).

    ERIC Educational Resources Information Center

    Association for Computing Machinery, New York, NY.

    Papers in this Proceedings of the ACM/IEEE-CS Joint Conference on Digital Libraries (Roanoke, Virginia, June 24-28, 2001) discuss: automatic genre analysis; text categorization; automated name authority control; automatic event generation; linked active content; designing e-books for legal research; metadata harvesting; mapping the…

  17. Software Engineering Code of Ethics and Professional Practice.

    PubMed

    2001-04-01

    The Software Engineering Code of Ethics and Professional Practice, intended as a standard for teaching and practicing software engineering, documents the ethical and professional obligations of software engineers. The code should instruct practitioners about the standards society expects them to meet, about what their peers strive for, and about what to expect of one another. In addition, the code should also inform the public about the responsibilities that are important to the profession. Adopted in 2000 by the IEEE Computer Society and the ACM--two leading international computing societies--the code of ethics is intended as a guide for members of the evolving software engineering profession. The code was developed by a multinational task force with additional input from other professionals from industry, government posts, military installations, and educational professions.

  18. Software safety

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy

    1987-01-01

    Software safety and its relationship to other qualities are discussed. It is shown that standard reliability and fault tolerance techniques will not solve the safety problem for the present. A new attitude requires: looking at what you do NOT want software to do along with what you want it to do; and assuming things will go wrong. New procedures and changes to entire software development process are necessary: special software safety analysis techniques are needed; and design techniques, especially eliminating complexity, can be very helpful.

  19. Software Reviews.

    ERIC Educational Resources Information Center

    Beezer, Robert A.; And Others

    1988-01-01

    Reviews for three software packages are given. Those packages are: Linear Algebra Computer Companion; Probability and Statistics Demonstrations and Tutorials; and Math Utilities: CURVES, SURFS, AND DIFFS. (PK)

  20. Hydrologic effects of phreatophyte control, Acme-Artesia reach of the Pecos River, New Mexico, 1967-82

    USGS Publications Warehouse

    Welder, G.E.

    1988-01-01

    The U.S. Bureau of Reclamation began a phreatophyte clearing and control program in the bottom land of the Acme-Artesia reach of the Pecos River in March 1967. The initial cutting of 19,000 acres of saltcedar trees, the dominant phreatophyte in the area, was completed in May 1969. Saltcedar regrowth continued each year until July 1975, when root plowing eradicated most of the regrowth. The major objective of the clearing and control program was to salvage water that could be put to beneficial use. Measurements of changes in the water table in the bottom land and changes in the base flow of the Pecos River were made in order to determine the hydrologic effects of the program. Some salvage of water was indicated, but it is not readily recognized as an increase in base flow. The quantity of salvage probably is less than the average annual base-flow gain of 19,110 acre-ft in the reach during 1967-82. (Author's abstract)

  1. Detection of structural and numerical chromosomal abnormalities by ACM-FISH analysis in sperm of oligozoospermic infertility patients

    SciTech Connect

    Schmid, T E; Brinkworth, M H; Hill, F; Sloter, E; Kamischke, A; Marchetti, F; Nieschlag, E; Wyrobek, A J

    2003-11-10

    Modern reproductive technologies are enabling the treatment of infertile men with severe disturbances of spermatogenesis. The possibility of elevated frequencies of genetically and chromosomally defective sperm has become an issue of concern with the increased usage of intracytoplasmic sperm injection (ICSI), which can enable men with severely impaired sperm production to father children. Several papers have been published about aneuploidy in oligozoospermic patients, but relatively little is known about chromosome structural aberrations in the sperm of these patients. We examined sperm from infertile, oligozoospermic individuals for structural and numerical chromosomal abnormalities using a multicolor ACM FISH assay that utilizes DNA probes specific for three regions of chromosome 1 to detect human sperm that carry numerical chromosomal abnormalities plus two categories of structural aberrations: duplications and deletions of 1pter and 1cen, and chromosomal breaks within the 1cen-1q12 region. There was a significant increase in the average frequencies of sperm with duplications and deletions in the infertility patients compared with the healthy concurrent controls. There was also a significantly elevated level of breaks within the 1cen-1q12 region. There was no evidence for an increase in chromosome-1 disomy or in diploidy. Our data reveal that oligozoospermia is associated with chromosomal structural abnormalities, suggesting that oligozoospermic men carry a higher burden of transmissible chromosome damage. The findings raise the possibility of elevated levels of transmissible chromosomal defects following ICSI treatment.

  2. Human factors in computing systems: focus on patient-centered health communication at the ACM SIGCHI conference.

    PubMed

    Wilcox, Lauren; Patel, Rupa; Chen, Yunan; Shachak, Aviv

    2013-12-01

    Health Information Technologies, such as electronic health records (EHR) and secure messaging, have already transformed interactions among patients and clinicians. In addition, technologies supporting asynchronous communication outside of clinical encounters, such as email, SMS, and patient portals, are being increasingly used for follow-up, education, and data reporting. Meanwhile, patients are increasingly adopting personal tools to track various aspects of health status and therapeutic progress, wishing to review these data with clinicians during consultations. These issues have drawn increasing interest from the human-computer interaction (HCI) community, with special focus on critical challenges in patient-centered interactions and design opportunities that can address these challenges. We saw this community presenting and interacting at ACM SIGCHI 2013, the Conference on Human Factors in Computing Systems (also known as CHI), held April 27 to May 2, 2013 at the Palais de Congrès de Paris in France. CHI 2013 featured many formal avenues to pursue patient-centered health communication: a well-attended workshop, tracks of original research, and a lively panel discussion. In this report, we highlight these events and the main themes we identified. We hope that it will help bring the health care communication and the HCI communities closer together.

  3. Assessment of Two Planetary Boundary Layer Schemes (ACM2 and YSU) within the Weather Research and Forecasting (WRF) Model

    NASA Astrophysics Data System (ADS)

    Wolff, J.; Harrold, M.; Xu, M.

    2014-12-01

    The Weather Research and Forecasting (WRF) model is a highly configurable numerical weather prediction system used in both research and operational forecasting applications. Rigorously testing select configurations and evaluating the performance for specific applications is necessary due to the flexibility offered by the system. The Developmental Testbed Center (DTC) performed extensive testing and evaluation with the Advanced Research WRF (ARW) dynamic core for two physics suite configurations, with the goal of assessing the impact that the planetary boundary layer (PBL) scheme had on the final forecast performance. The baseline configuration was run with the Air Force Weather Agency's physics suite, which includes the Yonsei University (YSU) PBL scheme, while the second configuration substituted the Asymmetric Convective Model (ACM2) PBL scheme. This presentation will focus on assessing the forecast performance of the two configurations; both configurations were run over the same set of cases, allowing for a direct comparison of performance. The evaluation was performed over a 15 km CONUS domain for a testing period from September 2013 through August 2014. Simulations were initialized every 36 hours and run out to 48 hours; a 6-hour "warm start" spin-up, including data assimilation using the Gridpoint Statistical Interpolation system, preceded each simulation. The extensive testing period allows for robust results as well as the ability to investigate seasonal and regional differences between the two configurations. Results will focus on the evaluation of traditional verification metrics for surface and upper air variables, along with an assessment of statistical and practical significance.
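
    A small illustrative sketch of the kind of surface-verification comparison described above, with synthetic numbers standing in for observed and forecast 2-m temperatures from the two configurations:

        # Illustrative sketch (synthetic values): bias and RMSE of two WRF
        # configurations (YSU vs. ACM2 PBL scheme) against observations.
        import numpy as np

        obs       = np.array([284.2, 285.1, 286.0, 287.3, 288.9])   # hypothetical observed 2-m T (K)
        ysu_fcst  = np.array([283.5, 284.8, 286.4, 288.0, 289.6])   # hypothetical YSU forecasts (K)
        acm2_fcst = np.array([284.0, 285.5, 286.1, 287.0, 289.2])   # hypothetical ACM2 forecasts (K)

        def bias_rmse(forecast, observed):
            err = forecast - observed
            return err.mean(), np.sqrt((err ** 2).mean())

        for name, fcst in [("YSU", ysu_fcst), ("ACM2", acm2_fcst)]:
            b, r = bias_rmse(fcst, obs)
            print(f"{name}: bias = {b:+.2f} K, RMSE = {r:.2f} K")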

  4. Software Bridge

    NASA Technical Reports Server (NTRS)

    1995-01-01

    I-Bridge is a commercial version of software developed by I-Kinetics under a NASA Small Business Innovation Research (SBIR) contract. The software allows users of Windows applications to gain quick, easy access to databases, programs and files on UNIX services. Information goes directly onto spreadsheets and other applications; users need not manually locate, transfer and convert data.

  5. Software Reviews.

    ERIC Educational Resources Information Center

    Miller, Anne, Ed.; Radziemski, Cathy, Ed.

    1988-01-01

    Reviews two software packages for the Macintosh series. "Course Builder 2.0," a courseware authoring system, allows the user to create programs which stand alone and may be used independently in the classroom. "World Builder," an artificial intelligence software package, allows creative thinking, problem-solving, and decision-making. (YP)

  6. Software Reviews.

    ERIC Educational Resources Information Center

    Bitter, Gary G., Ed.

    1990-01-01

    Reviews three computer software: (1) "Elastic Lines: The Electronic Geoboard" on elementary geometry; (2) "Wildlife Adventures: Whales" on environmental science; and (3) "What Do You Do with a Broken Calculator?" on computation and problem solving. Summarizes the descriptions, strengths and weaknesses, and applications of each software. (YP)

  7. Software Reviews.

    ERIC Educational Resources Information Center

    Wulfson, Stephen

    1988-01-01

    Presents reviews of six computer software programs for teaching science. Provides the publisher, grade level, cost, and descriptions of software, including: (1) "Recycling Logic"; (2) "Introduction to Biochemistry"; (3) "Food for Thought"; (4) "Watts in a Home"; (5) "Geology in Action"; and (6) "Biomes." All are for Apple series microcomputers.…

  8. Software Reviews.

    ERIC Educational Resources Information Center

    Science and Children, 1988

    1988-01-01

    Reviews six software packages for the Apple II family. Programs reviewed include "Science Courseware: Earth Science Series"; "Heat and Light"; "In Search of Space: Introduction to Model Rocketry"; "Drug Education Series: Drugs--Their Effects on You'"; "Uncertainties and Measurement"; and "Software Films: Learning about Science Series," which…

  9. Software Smarts

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Under an SBIR (Small Business Innovative Research) contract with Johnson Space Center, Knowledge Based Systems Inc. (KBSI) developed an intelligent software environment for modeling and analyzing mission planning activities, simulating behavior, and, using a unique constraint propagation mechanism, updating plans with each change in mission planning activities. KBSI developed this technology into a commercial product, PROJECTLINK, a two-way bridge between PROSIm, KBSI's process modeling and simulation software and leading project management software like Microsoft Project and Primavera's SureTrak Project Manager.

  10. Software testing

    NASA Astrophysics Data System (ADS)

    Price-Whelan, Adrian M.

    2016-01-01

    Now more than ever, scientific results are dependent on sophisticated software and analysis. Why should we trust code written by others? How do you ensure your own code produces sensible results? How do you make sure it continues to do so as you update, modify, and add functionality? Software testing is an integral part of code validation and writing tests should be a requirement for any software project. I will talk about Python-based tools that make managing and running tests much easier and explore some statistics for projects hosted on GitHub that contain tests.
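
    A minimal sketch of the kind of test the talk advocates, assuming pytest as the Python test runner; the function under test and the file name are illustrative only (run with `pytest test_stats.py`):

        # Hedged sketch: a tiny function plus two pytest-style tests that pin
        # down its expected behaviour and its error handling.
        import math
        import pytest

        def weighted_mean(values, weights):
            """Weighted arithmetic mean of `values` with non-negative `weights`."""
            total = sum(weights)
            if total == 0:
                raise ValueError("weights must not all be zero")
            return sum(v * w for v, w in zip(values, weights)) / total

        def test_weighted_mean_matches_plain_mean_for_equal_weights():
            assert math.isclose(weighted_mean([1.0, 2.0, 3.0], [1, 1, 1]), 2.0)

        def test_weighted_mean_rejects_all_zero_weights():
            with pytest.raises(ValueError):
                weighted_mean([1.0, 2.0], [0, 0])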

  11. Prescriptions for ACME's Future.

    ERIC Educational Resources Information Center

    Felch, William Campbell

    1991-01-01

    Five prescriptions for the future agenda of the Alliance for Continuing Medical Education are (1) a core curriculum; (2) informatics; (3) remedial continuing medical education (CME); (4) focus on the individual learner; and (5) practice-oriented CME. (SK)

  12. WTP Calculation Sheet: Determining the LAW Glass Former Constituents and Amounts for G2 and Acm Models. 24590-LAW-M4C-LFP-00002, Rev. B

    SciTech Connect

    Gimpel, Rodney F.; Kruger, Albert A.

    2013-12-16

    The purpose of this calculation is to determine the LAW glass former recipe and additives with their respective amounts. The methodology and equations contained herein are to be used in the G2 and ACM models until better information is supplied by R&T efforts. This revision includes calculations that determine the mass and volume of the bulk chemicals/minerals needed per batch. In addition, it contains calculations (for the G2 model) to help prevent overflow in the LAW Feed Preparation Vessel.

  13. Software Reviews.

    ERIC Educational Resources Information Center

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  14. Software Reviews.

    ERIC Educational Resources Information Center

    McGrath, Diane

    1990-01-01

    Reviews two programs: (1) "The Weather Machine" on understanding weather and weather forecasting and (2) "The Mystery of the Hotel Victoria" on problem solving in mathematics. Presents the descriptions, advantages, and weaknesses of the software. (YP)

  15. Software Reviews.

    ERIC Educational Resources Information Center

    Wulfson, Stephen, Ed.

    1990-01-01

    Reviewed are six computer software packages including "Lunar Greenhouse,""Dyno-Quest,""How Weather Works,""Animal Trackers,""Personal Science Laboratory," and "The Skeletal and Muscular Systems." Availability, functional, and hardware requirements are discussed. (CW)

  16. Software Reviews.

    ERIC Educational Resources Information Center

    Classroom Computer Learning, 1990

    1990-01-01

    Reviewed are three computer software packages including "Martin Luther King, Jr.: Instant Replay of History,""Weeds to Trees," and "The New Print Shop, School Edition." Discussed are hardware requirements, costs, grade levels, availability, emphasis, strengths, and weaknesses. (CW)

  17. Software Reviews.

    ERIC Educational Resources Information Center

    Davis, Shelly J., Ed.; Knaupp, Jon, Ed.

    1984-01-01

    Reviewed is computer software on: (1) classification of living things, a tutorial program for grades 5-10; and (2) polynomial practice using tiles, a drill-and-practice program for algebra students. (MNS)

  18. Software Reviews.

    ERIC Educational Resources Information Center

    Wulfson, Stephen, Ed.

    1987-01-01

    Reviews seven computer software programs that can be used in science education programs. Describes courseware which deals with muscles and bones, terminology, classifying animals without backbones, molecular structures, drugs, genetics, and shaping the earth's surface. (TW)

  19. Software Reviews.

    ERIC Educational Resources Information Center

    Mathematics and Computer Education, 1988

    1988-01-01

    Presents reviews of six software packages. Includes (1) "Plain Vanilla Statistics"; (2) "MathCAD 2.0"; (3) "GrFx"; (4) "Trigonometry"; (5) "Algebra II"; (6) "Algebra Drill and Practice I, II, and III." (PK)

  20. Software Reviews.

    ERIC Educational Resources Information Center

    Wulfson, Eugene T., Ed.

    1988-01-01

    Presents reviews by classroom teachers of software for teaching science. Includes material on the work of geologists, genetics, earth science, classification of living things, astronomy, endangered species, skeleton, drugs, and heartbeat. Provides information on availability and equipment needed. (RT)

  1. Software Reviews.

    ERIC Educational Resources Information Center

    Wulfson, Stephen, Ed.

    1987-01-01

    Provides a review of four science software programs. Includes topics such as plate tectonics, laboratory experiment simulations, the human body, and light and temperature. Contains information on ordering and reviewers' comments. (ML)

  2. Software Reviews.

    ERIC Educational Resources Information Center

    Wulfson, Stephen, Ed.

    1987-01-01

    Provides reviews of six computer software programs designed for use in elementary science education programs. Provides the title, publisher, grade level, and descriptions of courseware on ant farms, drugs, genetics, beachcombing, matter, and test generation. (TW)

  3. A 90-day subchronic feeding study of genetically modified maize expressing Cry1Ac-M protein in Sprague-Dawley rats.

    PubMed

    Liu, Pengfei; He, Xiaoyun; Chen, Delong; Luo, Yunbo; Cao, Sishuo; Song, Huan; Liu, Ting; Huang, Kunlun; Xu, Wentao

    2012-09-01

    The cry1Ac-M gene, coding for one of the Bacillus thuringiensis (Bt) crystal proteins, was introduced into the maize H99 × Hi IIB genome to produce the insect-resistant GM maize BT-38. The food safety assessment of the BT-38 maize was conducted in Sprague-Dawley rats in a 90-day feeding study. We incorporated maize grains from BT-38 and H99 × Hi IIB into rodent diets at three concentrations (12.5%, 25%, 50%) and administered them to Sprague-Dawley rats (n=10/sex/group) for 90 days. A commercialized rodent diet was fed to an additional group as the control group. Body weight, feed consumption and toxicological response variables were measured, and gross as well as microscopic pathology were examined. Moreover, detection of residual Cry1Ac-M protein in the serum of rats fed with GM maize was conducted. No deaths or adverse effects were observed in the current feeding study. No adverse differences in the values of the response variables were observed between rats that consumed diets containing GM maize BT-38 and non-GM maize H99 × Hi IIB. No detectable Cry1Ac-M protein was found in the serum of rats after feeding diets containing GM maize for 3 months. The results demonstrated that BT-38 maize is as safe as conventional non-GM maize.

  4. Software reengineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost-effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost-effective manner than using older technologies. A beta version of the environment was released in March 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  5. Antiterrorist Software

    NASA Technical Reports Server (NTRS)

    Clark, David A.

    1998-01-01

    In light of the escalation of terrorism, the Department of Defense spearheaded the development of new antiterrorist software for all Government agencies by issuing a Broad Agency Announcement to solicit proposals. This Government-wide competition resulted in a team that includes NASA Lewis Research Center's Computer Services Division, who will develop the graphical user interface (GUI) and test it in their usability lab. The team launched a program entitled Joint Sphere of Security (JSOS), crafted a design architecture, and is testing the interface. This software system has a state-of-the-art, object-oriented architecture, with a main kernel composed of the Dynamic Information Architecture System (DIAS) developed by Argonne National Laboratory. DIAS will be used as the software "breadboard" for assembling the components of explosions, such as blast and collapse simulations.

  6. [Software version and medical device software supervision].

    PubMed

    Peng, Liang; Liu, Xiaoyan

    2015-01-01

    The importance of the software version in medical device software supervision does not receive enough attention at present. First, the role of the software version in medical device software supervision is discussed, and then the necessity of considering the software version in supervision is analyzed, based on a discussion of common misunderstandings of software versions. Finally, concrete suggestions are proposed on software version naming rules, supervision of the software versions in medical devices, and a software version supervision scheme.

  7. Software Reviews.

    ERIC Educational Resources Information Center

    Wulfson, Stephen, Ed.

    1990-01-01

    Reviewed are six software packages for Apple and/or IBM computers. Included are "Autograph,""The New Game Show,""Science Probe-Earth Science,""Pollution Patrol,""Investigating Plant Growth," and "AIDS: The Investigation." Discussed are the grade level, function, availability, cost, and hardware requirements of each. (CW)

  8. Software Reviews.

    ERIC Educational Resources Information Center

    Science and Children, 1989

    1989-01-01

    Reviews of seven software packages are presented including "The Environment I: Habitats and EcoSystems; II Cycles and Interactions"; "Super Sign Maker"; "The Great Knowledge Race: Substance Abuse"; "Exploring Science: Temperature"; "Fast Food Calculator and RD Aide"; "The Human Body: Circulation and Respiration" and "Forces in Liquids and Gases."…

  9. Star Software.

    ERIC Educational Resources Information Center

    Kloza, Brad

    2000-01-01

    Presents a collection of computer software programs designed to spark learning enthusiasm at every grade level and across the curriculum. They include Reader Rabbit's Learn to Read, Spelling Power, Mind Twister Math, Community Construction Kit, Breaking the Code, Encarta Africana 2000, Virtual Serengeti, Operation: Frog (Deluxe), and My First…

  10. Software Reviews.

    ERIC Educational Resources Information Center

    Science and Children, 1988

    1988-01-01

    Reviews five software packages for use with school age children. Includes "Science Toolkit Module 2: Earthquake Lab"; "Adaptations and Identification"; "Geoworld"; "Body Systems II Series: The Blood System: A Liquid of Life," all for Apple II, and "Science Courseware: Life Science/Biology" for Apple II and IBM. (CW)

  11. Software Update.

    ERIC Educational Resources Information Center

    Currents, 2000

    2000-01-01

    A chart of 40 alumni-development database systems provides information on vendor/Web site, address, contact/phone, software name, price range, minimum suggested workstation/suggested server, standard reports/reporting tools, minimum/maximum record capacity, and number of installed sites/client type. (DB)

  12. Software Comparison

    NASA Technical Reports Server (NTRS)

    Blanchard, D. C.

    1986-01-01

    Software Comparison Package (SCP) compares similar files. Normally, these are 90-character files produced by the CDC UPDATE utility from program libraries that contain FORTRAN source code plus identifiers. SCP is also used to compare load maps, cross-reference outputs, and UPDATE correction sets. Helps wherever line-by-line comparison of similarly structured files is required.
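
    The same line-by-line idea can be sketched with Python's standard difflib; this is only an illustration of the technique, not the SCP tool itself, and the file names are hypothetical:

        # Hedged sketch: report line-by-line differences between two similarly
        # structured text files (illustrative stand-in for SCP-style comparison).
        import difflib
        from pathlib import Path

        old_lines = Path("build_a.map").read_text().splitlines()   # hypothetical load maps
        new_lines = Path("build_b.map").read_text().splitlines()

        for line in difflib.unified_diff(old_lines, new_lines,
                                         fromfile="build_a.map", tofile="build_b.map",
                                         lineterm=""):
            print(line)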

  13. Software Reviews.

    ERIC Educational Resources Information Center

    Classroom Computer Learning, 1990

    1990-01-01

    Reviewed are two computer software packages: "Super Solvers Midnight Rescue!" a problem-solving program for IBM PCs; and "Interactive Physics," a simulation program for the Macintosh computer. The functions of the package are discussed including strengths and weaknesses and teaching suggestions. (CW)

  14. Software Reviews.

    ERIC Educational Resources Information Center

    Bitter, Gary G., Ed.

    1989-01-01

    Describes three software packages: (1) "MacMendeleev"--database/graphic display for chemistry, grades 10-12, Macintosh; (2) "Geometry One: Foundations"--geometry tutorial, grades 7-12, IBM; (3) "Mathematics Exploration Toolkit"--algebra and calculus tutorial, grades 8-12, IBM. (MVL)

  15. Software reengineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III

    1991-01-01

    Programs in use today generally have all of the function and information processing capabilities required to do their specified job. However, older programs usually use obsolete technology, are not integrated properly with other programs, and are difficult to maintain. Reengineering is becoming a prominent discipline as organizations try to move their systems to more modern and maintainable technologies. The Johnson Space Center (JSC) Software Technology Branch (STB) is researching and developing a system to support reengineering older FORTRAN programs into more maintainable forms that can also be more readily translated to modern languages such as FORTRAN 8x, Ada, or C. This activity has led to the development of maintenance strategies for design recovery and reengineering. These strategies include a set of standards, methodologies, and the concepts for a software environment to support design recovery and reengineering. A brief description of the problem being addressed and the approach that is being taken by the STB toward providing an economic solution to the problem is provided. A statement of the maintenance problems, the benefits and drawbacks of three alternative solutions, and a brief history of the STB experience in software reengineering are followed by the STB's new FORTRAN standards, methodology, and the concepts for a software environment.

  16. Software Patents.

    ERIC Educational Resources Information Center

    Burke, Edmund B.

    1994-01-01

    Outlines basic patent law information that pertains to computer software programs. Topics addressed include protection in other countries; how to obtain patents; kinds of patents; duration; classes of patentable subject matter, including machines and processes; patentability searches; experimental use prior to obtaining a patent; and patent…

  17. Software Reviews.

    ERIC Educational Resources Information Center

    Mathematics and Computer Education, 1987

    1987-01-01

    Presented are reviews of several microcomputer software programs. Included are reviews of: (1) Microstat (Zenith); (2) MathCAD (MathSoft); (3) Discrete Mathematics (True Basic); (4) CALCULUS (True Basic); (5) Linear-Kit (John Wiley); and (6) Geometry Sensei (Broderbund). (RH)

  18. Software Reviews.

    ERIC Educational Resources Information Center

    Smith, Richard L., Ed.

    1988-01-01

    Reviews two software packages, "Solutions Unlimited" and "BASIC Data Base System." Provides a description, summary, strengths and weaknesses, availability and costs. Includes reviews of three structured BASIC packages: "True BASIC (2.0)"; "Turbo BASIC (1.0)"; and "QuickBASIC (3.0)." Explains significant features such as graphics, costs,…

  19. Reviews: Software.

    ERIC Educational Resources Information Center

    Mackenzie, Norma N.; And Others

    1988-01-01

    Reviews four computer software packages including: "The Physical Science Series: Sound" which demonstrates making waves, speed of sound, doppler effect, and human hearing; "Andromeda" depicting celestial motions in any direction; "Biology Quiz: Humans" covering chemistry, cells, viruses, and human biology; and "MacStronomy" covering information on…

  20. Reviews, Software.

    ERIC Educational Resources Information Center

    Science Teacher, 1988

    1988-01-01

    Reviews two software programs for Apple series computers. Includes "Orbital Mech," a basic planetary orbital simulation for the Macintosh, and "START: Stimulus and Response Tools for Experiments in Memory, Learning, Cognition, and Perception," a program that demonstrates basic psychological principles and experiments. (CW)

  1. Software Reviews.

    ERIC Educational Resources Information Center

    Teles, Elizabeth, Ed.; And Others

    1990-01-01

    Reviewed are two computer software packages for Macintosh microcomputers including "Phase Portraits," an exploratory graphics tool for studying first-order planar systems; and "MacMath," a set of programs for exploring differential equations, linear algebra, and other mathematical topics. Features, ease of use, cost, availability, and hardware…

  2. Software Reviews.

    ERIC Educational Resources Information Center

    Wulfson, Stephen, Ed.

    1989-01-01

    Six software packages are described in this review. Included are "Molecules and Atoms: Exploring the Essence of Matter"; "Heart Probe"; "GM Sunraycer"; "Six Puzzles"; "Information Laboratory--Life Science"; and "Science Test Builder." Hardware requirements, prices, and a summary of the abilities of each program are presented. (CW)

  3. Software Reviews.

    ERIC Educational Resources Information Center

    Wulfson, Stephen, Ed.

    1989-01-01

    Presents comments by classroom teachers on software for science teaching including topics on: the size of a molecule, matter, leaves, vitamins and minerals, dinosaurs, and collecting and measuring data. Each is an Apple computer series. Availability and costs are included. (RT)

  4. Software Reviews.

    ERIC Educational Resources Information Center

    Smith, Richard L., Ed.

    1987-01-01

    Reviewed are three computer software programs: the Astronomer (astronomy program for middle school students and older); Hands-on-Statistics: Explorations with a Microcomputer (statistics program for secondary school students and older); and CATGEN (a genetics program for secondary school students and older). Each review provides information on:…

  5. Software Review.

    ERIC Educational Resources Information Center

    McGrath, Diane, Ed.

    1989-01-01

    Reviewed is a computer software package entitled "Audubon Wildlife Adventures: Grizzly Bears" for Apple II and IBM microcomputers. Included are availability, hardware requirements, cost, and a description of the program. The murder-mystery flavor of the program is stressed in this program that focuses on illegal hunting and game management. (CW)

  6. Software Reviews.

    ERIC Educational Resources Information Center

    Science and Children, 1990

    1990-01-01

    Reviewed are seven computer software packages for IBM and/or Apple Computers. Included are "Windows on Science: Volume 1--Physical Science"; "Science Probe--Physical Science"; "Wildlife Adventures--Grizzly Bears"; "Science Skills--Development Programs"; "The Clean Machine"; "Rock Doctor"; and "Geology Search." Cost, quality, hardware, and…

  7. Software Reviews.

    ERIC Educational Resources Information Center

    McGrath, Diane, Ed.

    1989-01-01

    Reviewed are two computer software programs for Apple II computers on weather for upper elementary and middle school grades. "Weather" introduces the major factors (temperature, humidity, wind, and air pressure) affecting weather. "How Weather Works" uses simulation and auto-tutorial formats on sun, wind, fronts, clouds, and storms. (YP)

  8. Software Reviews.

    ERIC Educational Resources Information Center

    Bitter, Gary G., Ed.

    1989-01-01

    Reviews three software packages: (1) "The Weather Machine Courseware Kit" for grades 7-12; (2) "Exploring Measurement, Time, and Money--Level I," for primary level mathematics; and (3) "Professor DOS with SmartGuide for DOS" providing an extensive tutorial covering DOS 2.1 to 4.0. Discusses the strengths and weaknesses of each package. (YP)

  9. Software Reviews.

    ERIC Educational Resources Information Center

    Science and Children, 1990

    1990-01-01

    Reviewed are six computer software packages including "Invisible Bugs,""Chaos Plus...,""The Botanist's Apprentice,""A Baby is Born," Storyboard Plus-Version 2.0," and "Weather." Hardware requirements, functions, performance, and use in the classroom are discussed. (CW)

  10. Software Reviews.

    ERIC Educational Resources Information Center

    Science and Children, 1988

    1988-01-01

    Reviews six software packages for use with school age children ranging from grade 3 to grade 12. Includes "The Microcomputer Based Lab Project: Motion, Sound"; "Genetics"; "Geologic History"; "The Microscope Simulator"; and "Wiz Works" all for Apple II and "Reading for Information: Level II" for IBM. (CW)

  11. Software Reviews.

    ERIC Educational Resources Information Center

    Mackenzie, Norma N.; And Others

    1988-01-01

    Describes computer software for use with various age groups. Topics include activities involving temperature, simulations, earth science, the circulatory system, human body, reading in science, and ecology. Provides information on equipment needed, availability, package contents, and price. Comments of reviews are presented by classroom teachers.…

  12. Experimental determination of the partitioning coefficient and volatility of important BVOC oxidation products using the Aerosol Collection Module (ACM) coupled to a PTR-ToF-MS

    NASA Astrophysics Data System (ADS)

    Gkatzelis, G.; Hohaus, T.; Tillmann, R.; Schmitt, S. H.; Yu, Z.; Schlag, P.; Wegener, R.; Kaminski, M.; Kiendler-Scharr, A.

    2015-12-01

    Atmospheric aerosol can alter the Earth's radiative budget and global climate but can also affect human health. A dominant contributor to the submicrometer particulate matter (PM) is organic aerosol (OA). OA can be either directly emitted through e.g. combustion processes (primary OA) or formed through the oxidation of organic gases (secondary organic aerosol, SOA). A detailed understanding of SOA formation is of importance as it constitutes a major contribution to the total OA. The partitioning between the gas and particle phase as well as the volatility of individual components of SOA are as yet poorly understood, adding uncertainties and thus complicating climate modelling. In this work, a new experimental methodology was used for compound-specific analysis of organic aerosol. The Aerosol Collection Module (ACM) is a newly developed instrument that deploys an aerodynamic lens to separate the gas and particle phase of an aerosol. The particle phase is directed to a cooled sampling surface. After collection, particles are thermally desorbed and transferred to a detector for further analysis. In the present work, the ACM was coupled to a Proton Transfer Reaction-Time of Flight-Mass Spectrometer (PTR-ToF-MS) to detect and quantify organic compounds partitioning between the gas and particle phase. This experimental approach was used in a set of experiments at the atmosphere simulation chamber SAPHIR to investigate SOA formation. Ozone oxidation, with subsequent photochemical aging, of β-pinene, limonene and real plant emissions from Pinus sylvestris (Scots pine) was studied. Simultaneous measurement of the gas and particle phase using the ACM-PTR-ToF-MS allows partitioning coefficients of important BVOC oxidation products to be reported. Additionally, volatility trends and changes of the SOA with photochemical aging are investigated and compared for all systems studied.
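
    A small illustrative sketch of how a particle-phase fraction and a Pankow-style partitioning coefficient could be computed from such simultaneous gas- and particle-phase measurements; all concentrations below are made-up placeholder values:

        # Illustrative sketch (hypothetical numbers): particle-phase fraction and
        # gas-particle partitioning coefficient Kp for one oxidation product.
        gas_conc = 1.8        # gas-phase concentration of the compound (ug m^-3), assumed
        particle_conc = 0.6   # particle-phase concentration of the compound (ug m^-3), assumed
        oa_mass = 12.0        # total organic-aerosol mass loading (ug m^-3), assumed

        particle_fraction = particle_conc / (particle_conc + gas_conc)
        kp = particle_conc / (gas_conc * oa_mass)   # m^3 ug^-1
        print(f"particle fraction = {particle_fraction:.2f}, Kp = {kp:.3f} m3/ug")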

  13. Space Station Software Issues

    NASA Technical Reports Server (NTRS)

    Voigt, S. (Editor); Beskenis, S. (Editor)

    1985-01-01

    Issues in the development of software for the Space Station are discussed. Software acquisition and management, software development environment, standards, information system support for software developers, and a future software advisory board are addressed.

  14. Analysis Software

    NASA Technical Reports Server (NTRS)

    1994-01-01

    General Purpose Boundary Element Solution Technology (GPBEST) software employs the boundary element method of mechanical engineering analysis, as opposed to finite element. It is, according to one of its developers, 10 times faster in data preparation and more accurate than other methods. Its use results in less expensive products because the time between design and manufacturing is shortened. A commercial derivative of a NASA-developed computer code, it is marketed by Best Corporation to solve problems in stress analysis, heat transfer, fluid analysis and yielding and cracking of solids. Other applications include designing tractor and auto parts, household appliances and acoustic analysis.

  15. Scheduling Software

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Advanced Scheduling Environment is a software product designed and marketed by AVYX, Inc. to provide scheduling solutions for complex manufacturing environments. It can be adapted to specific scheduling and manufacturing processes and has led to substantial cost savings. The system was originally developed for NASA use in scheduling Space Shuttle flights and satellite activities. AVYX, Inc. is an offshoot of a company formed to provide computer-related services to NASA. TREES-plus, the company's initial product became the programming language for the advanced scheduling environment system.

  16. Space Software

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Xontech, Inc.'s software package, XonVu, simulates the missions of Voyager 1 at Jupiter and Saturn, Voyager 2 at Jupiter, Saturn, Uranus and Neptune, and Giotto in close encounter with Comet Halley. With the program, the user can generate scenes of the planets, moons, stars or Halley's nucleus and tail as seen by Giotto, all graphically reproduced with high accuracy in wireframe representation. Program can be used on a wide range of computers, including PCs. User friendly and interactive, with many options, XonVu can be used by a space novice or a professional astronomer. With a companion user's manual, it sells for $79.

  17. Simulation Software

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Various NASA Small Business Innovation Research grants from Marshall Space Flight Center, Langley Research Center and Ames Research Center were used to develop the 'kernel' of COMCO's modeling and simulation software, the PHLEX finite element code. NASA needed it to model designs of flight vehicles; one of many customized commercial applications is UNISIM, a PHLEX-based code for analyzing underground flows in oil reservoirs for Texaco, Inc. COMCO's products simulate a computational mechanics problem, estimate the solution's error and produce the optimal hp-adapted mesh for the accuracy the user chooses. The system is also used as a research or training tool in universities and in mechanical design in industrial corporations.

  18. Seminar Software

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Society for Computer Simulation International is a professional technical society that distributes information on methodology techniques and uses of computer simulation. The society uses NETS, a NASA-developed program, to assist seminar participants in learning to use neural networks for computer simulation. NETS is a software system modeled after the human brain; it is designed to help scientists exploring artificial intelligence to solve pattern matching problems. Examples from NETS are presented to seminar participants, who can then manipulate, alter or enhance them for their own applications.

  19. Software system safety

    NASA Technical Reports Server (NTRS)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  20. Ethical education in software engineering: responsibility in the production of complex systems.

    PubMed

    Génova, Gonzalo; González, M Rosario; Fraga, Anabel

    2007-12-01

    Among the various contemporary schools of moral thinking, consequence-based ethics, as opposed to rule-based, seems to have a good acceptance among professionals such as software engineers. But naïve consequentialism is intellectually too weak to serve as a practical guide in the profession. Besides, the complexity of software systems makes it very hard to know in advance the consequences that will derive from professional activities in the production of software. Therefore, following the spirit of well-known codes of ethics such as the ACM/IEEE's, we advocate for a more solid position in the ethical education of software engineers, which we call 'moderate deontologism', that takes into account both rules and consequences to assess the goodness of actions, and at the same time pays an adequate consideration to the absolute values of human dignity. In order to educate responsible professionals, however, this position should be complemented with a pedagogical approach to virtue ethics.

  1. Choosing Software for Children.

    ERIC Educational Resources Information Center

    Spencer, Mima

    This Digest points out characteristics of quality computer software for children, describes different kinds of software, and suggests ways to get software for preview. The need to consider the purpose for which the software is to be used and the degree to which the software meets its stated goals is noted. Desirable software characteristics and…

  2. Software Model Of Software-Development Process

    NASA Technical Reports Server (NTRS)

    Lin, Chi Y.; Synott, Debra J.; Levary, Reuven R.

    1990-01-01

    Collection of computer programs constitutes software tool for simulation of medium- to large-scale software-development projects. Necessary to include easily identifiable and more-readily quantifiable characteristics like costs, times, and numbers of errors. Mathematical model incorporating these and other factors of dynamics of software-development process implemented in the Software Life Cycle Simulator (SLICS) computer program. Simulates dynamics of software-development process. In combination with input and output expert software systems and knowledge-based management software system, develops information for use in managing large software-development project. Intended to aid managers in planning, managing, and controlling software-development processes by reducing uncertainties in budgets, required personnel, and schedules.

  3. Software attribute visualization for high integrity software

    SciTech Connect

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.
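
    The monitoring idea described above can be pictured with a small, hedged C++ sketch: while a stand-in program executes, each state update is checked against a prespecified requirement constraint. The SAGE language and the SAVAnT visualization are not reproduced here; the predicate and the monitored variable are hypothetical placeholders.

        // Minimal runtime-monitoring sketch (illustrative only, not SAVAnT).
        #include <cstdio>
        #include <functional>

        struct Monitor {
            std::function<bool(double)> requirement;   // prespecified constraint
            void check(double value, int step) const {
                if (!requirement(value))
                    std::printf("VIOLATION at step %d: value=%g\n", step, value);
            }
        };

        int main() {
            // Example constraint: the monitored quantity must stay below 100.0.
            Monitor m{[](double v) { return v < 100.0; }};
            double v = 90.0;
            for (int step = 0; step < 5; ++step) {   // stand-in for the executing program
                v += 3.0;
                m.check(v, step);
            }
        }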

  4. Global Software Engineering: A Software Process Approach

    NASA Astrophysics Data System (ADS)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  5. Report: Scientific Software.

    ERIC Educational Resources Information Center

    Borman, Stuart A.

    1985-01-01

    Discusses various aspects of scientific software, including evaluation and selection of commercial software products; program exchanges, catalogs, and other information sources; major data analysis packages; statistics and chemometrics software; and artificial intelligence. (JN)

  6. Space Flight Software Development Software for Intelligent System Health Management

    NASA Technical Reports Server (NTRS)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  7. Software Engineering Guidebook

    NASA Technical Reports Server (NTRS)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  8. Software Configuration Management Guidebook

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The growth in cost and importance of software to NASA has caused NASA to address the improvement of software development across the agency. One of the products of this program is a series of guidebooks that define a NASA concept of the assurance processes which are used in software development. The Software Assurance Guidebook, SMAP-GB-A201, issued in September, 1989, provides an overall picture of the concepts and practices of NASA in software assurance. Lower level guidebooks focus on specific activities that fall within the software assurance discipline, and provide more detailed information for the manager and/or practitioner. This is the Software Configuration Management Guidebook which describes software configuration management in a way that is compatible with practices in industry and at NASA Centers. Software configuration management is a key software development process, and is essential for doing software assurance.

  9. Software productivity improvement through software engineering technology

    NASA Technical Reports Server (NTRS)

    Mcgarry, F. E.

    1985-01-01

    It has been estimated that NASA expends anywhere from 6 to 10 percent of its annual budget on the acquisition, implementation and maintenance of computer software. Although researchers have produced numerous software engineering approaches over the past 5-10 years, each claiming to be more effective than the other, there is very limited quantitative information verifying the measurable impact that any of these technologies may have in a production environment. At NASA/GSFC, an extended research effort aimed at identifying and measuring software techniques that favorably impact productivity of software development has been active over the past 8 years. Specific, measurable, software development technologies have been applied and measured in a production environment. Resulting software development approaches have been shown to be effective in improving both quality and productivity in this one environment.

  10. Software distribution using xnetlib

    SciTech Connect

    Dongarra, J.J. |; Rowan, T.H.; Wade, R.C.

    1993-06-01

    Xnetlib is a new tool for software distribution. Whereas its predecessor netlib uses e-mail as the user interface to its large collection of public-domain mathematical software, xnetlib uses an X Window interface and socket-based communication. Xnetlib makes it easy to search through a large distributed collection of software and to retrieve requested software in seconds.

  11. Agile Software Development

    ERIC Educational Resources Information Center

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  12. Software Formal Inspections Standard

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This Software Formal Inspections Standard (hereinafter referred to as Standard) is applicable to NASA software. This Standard defines the requirements that shall be fulfilled by the software formal inspections process whenever this process is specified for NASA software. The objective of this Standard is to define the requirements for a process that inspects software products to detect and eliminate defects as early as possible in the software life cycle. The process also provides for the collection and analysis of inspection data to improve the inspection process as well as the quality of the software.

  13. Responsibility for unreliable software

    SciTech Connect

    Wahl, N.J.

    1994-12-31

    Unreliable software exposes software developers and distributors to legal risks. Under certain circumstances, the developer and distributor of unreliable software can be sued. To avoid lawsuits, software developers should do the following: determine what the risks are, understand the extent of the risks, and identify ways of avoiding the risks and lessening the consequences of the risks. Liability issues associated with unreliable software are explored in this article.

  14. Software Quality Assurance Metrics

    NASA Technical Reports Server (NTRS)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software Quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software Metrics help us understand the technical process that is used to develop a product. The process is measured to improve it and the product is measured to increase quality throughout the life cycle of software. Software Metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If Software Metrics are implemented in software development, they can save time and money and allow the organization to identify the causes of defects that have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA Metrics that have been used in other projects that are not currently being used by the SA team and report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.

  15. Payload software technology: Software technology development plan

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Programmatic requirements for the advancement of software technology are identified for meeting the space flight requirements in the 1980 to 1990 time period. The development items are described, and software technology item derivation worksheets are presented along with the cost/time/priority assessments.

  16. Software Engineering Program: Software Process Improvement Guidebook

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  17. Software Defined Radio with Parallelized Software Architecture

    NASA Technical Reports Server (NTRS)

    Heckler, Greg

    2013-01-01

    This software implements software-defined radio processing over multicore, multi-CPU systems in a way that maximizes the use of CPU resources in the system. The software treats each processing step in either a communications or navigation modulator or demodulator system as an independent, threaded block. Each threaded block is defined with a programmable number of input or output buffers; these buffers are implemented using POSIX pipes. In addition, each threaded block is assigned a unique thread upon block installation. A modulator or demodulator system is built by assembly of the threaded blocks into a flow graph, which assembles the processing blocks to accomplish the desired signal processing. This software architecture allows the software to scale effortlessly between single CPU/single-core computers or multi-CPU/multi-core computers without recompilation. NASA spaceflight and ground communications systems currently rely exclusively on ASICs or FPGAs. This software allows low- and medium-bandwidth (100 bps to approximately 50 Mbps) software defined radios to be designed and implemented solely in C/C++ software, while lowering development costs and facilitating reuse and extensibility.
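
    The flow-graph structure described above can be shown with a brief, hedged C++ sketch: one threaded block reads samples from an input POSIX pipe, performs a placeholder processing step, and writes results to an output pipe, while the main thread plays the roles of source and sink. This illustrates the pattern only, not the NASA code; the gain operation and the single-block graph are assumptions.

        #include <unistd.h>
        #include <thread>
        #include <cstdio>

        // One processing block: read from the input pipe, process, write on.
        static void gain_block(int in_fd, int out_fd) {
            float sample;
            while (read(in_fd, &sample, sizeof sample) == (ssize_t)sizeof sample) {
                sample *= 2.0f;                      // placeholder for real DSP work
                write(out_fd, &sample, sizeof sample);
            }
            close(out_fd);                           // propagate end-of-stream downstream
        }

        int main() {
            int src[2], dst[2];
            pipe(src);                               // source -> block
            pipe(dst);                               // block  -> sink
            std::thread block(gain_block, src[0], dst[1]);

            for (float s : {1.0f, 2.0f, 3.0f})       // feed the flow graph
                write(src[1], &s, sizeof s);
            close(src[1]);

            float out;
            while (read(dst[0], &out, sizeof out) == (ssize_t)sizeof out)
                std::printf("%g\n", out);
            block.join();
            return 0;
        }

    A real graph would chain many such blocks, each with its own thread and a programmable number of pipe buffers, which is what lets the same code spread across however many cores are available.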

  18. Software Defined Radio with Parallelized Software Architecture

    NASA Technical Reports Server (NTRS)

    Heckler, Greg

    2013-01-01

    This software implements software-defined radio processing over multi-core, multi-CPU systems in a way that maximizes the use of CPU resources in the system. The software treats each processing step in either a communications or navigation modulator or demodulator system as an independent, threaded block. Each threaded block is defined with a programmable number of input or output buffers; these buffers are implemented using POSIX pipes. In addition, each threaded block is assigned a unique thread upon block installation. A modulator or demodulator system is built by assembly of the threaded blocks into a flow graph, which assembles the processing blocks to accomplish the desired signal processing. This software architecture allows the software to scale effortlessly between single CPU/single-core computers or multi-CPU/multi-core computers without recompilation. NASA spaceflight and ground communications systems currently rely exclusively on ASICs or FPGAs. This software allows low- and medium-bandwidth (100 bps to approximately 50 Mbps) software defined radios to be designed and implemented solely in C/C++ software, while lowering development costs and facilitating reuse and extensibility.

  19. Space Station Software Recommendations

    NASA Technical Reports Server (NTRS)

    Voigt, S. (Editor)

    1985-01-01

    Four panels of invited experts and NASA representatives focused on the following topics: software management, software development environment, languages, and software standards. Each panel deliberated in private, held two open sessions with audience participation, and developed recommendations for the NASA Space Station Program. The major thrusts of the recommendations were as follows: (1) The software management plan should establish policies, responsibilities, and decision points for software acquisition; (2) NASA should furnish a uniform modular software support environment and require its use for all space station software acquired (or developed); (3) The language Ada should be selected for space station software, and NASA should begin to address issues related to the effective use of Ada; and (4) The space station software standards should be selected (based upon existing standards where possible), and an organization should be identified to promulgate and enforce them. These and related recommendations are described in detail in the conference proceedings.

  20. Software Engineering Improvement Plan

    NASA Technical Reports Server (NTRS)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  1. Commercial Data Mining Software

    NASA Astrophysics Data System (ADS)

    Zhang, Qingyu; Segall, Richard S.

    This chapter discusses selected commercial software for data mining, supercomputing data mining, text mining, and web mining. The selected software packages are compared on their features and are also applied to available data sets. The software for data mining are SAS Enterprise Miner, Megaputer PolyAnalyst 5.0, PASW (formerly SPSS Clementine), IBM Intelligent Miner, and BioDiscovery GeneSight. The software for supercomputing are Avizo by Visualization Science Group and JMP Genomics from SAS Institute. The software for text mining are SAS Text Miner and Megaputer PolyAnalyst 5.0. The software for web mining are Megaputer PolyAnalyst and SPSS Clementine. Background on related literature and software is presented. Screen shots of each of the selected software packages are presented, as are conclusions and future directions.

  2. Guidelines for software inspections

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Quality control inspections, software problem-finding procedures that provide defect removal as well as improvements in software functionality, maintenance, quality, and development and testing methodology, are discussed. The many side benefits include education, documentation, training, and scheduling.

  3. Software assurance standard

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This standard specifies the software assurance program for the provider of software. It also delineates the assurance activities for the provider and the assurance data that are to be furnished by the provider to the acquirer. In any software development effort, the provider is the entity or individual that actually designs, develops, and implements the software product, while the acquirer is the entity or individual who specifies the requirements and accepts the resulting products. This standard specifies at a high level an overall software assurance program for software developed for and by NASA. Assurance includes the disciplines of quality assurance, quality engineering, verification and validation, nonconformance reporting and corrective action, safety assurance, and security assurance. The application of these disciplines during a software development life cycle is called software assurance. Subsequent lower-level standards will specify the specific processes within these disciplines.

  4. The Problem of Software.

    ERIC Educational Resources Information Center

    Alexander, Wilma Jean

    1982-01-01

    Explains how schools can purchase computer software. Lists are presented of (1) sources of published evaluations of selected software, (2) publications which contain names and sources of programs, and (3) magazines providing program listings appropriate for classroom use. (CT)

  5. Design software for reuse

    NASA Technical Reports Server (NTRS)

    Tracz, Will

    1990-01-01

    Viewgraphs are presented on the designing of software for reuse. Topics include terminology, software reuse maxims, the science of programming, an interface design example, a modularization example, and reuse and implementation guidelines.

  6. DSS command software update

    NASA Technical Reports Server (NTRS)

    Stinnett, W. G.

    1980-01-01

    The modifications, additions, and testing results for a version of the Deep Space Station command software, generated for support of the Voyager Saturn encounter, are discussed. The software update requirements included efforts to: (1) recode portions of the software to permit recovery of approximately 2000 words of memory; (2) correct five Voyager Ground Data System liens; (3) provide capability to automatically turn off the command processor assembly local printer during periods of low activity; and (4) correct anomalies existing in the software.

  7. Software verification and testing

    NASA Technical Reports Server (NTRS)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.

  8. Astronomical Software Directory Service

    NASA Technical Reports Server (NTRS)

    Hanisch, R. J.; Payne, H.; Hayes, J.

    1998-01-01

    This is the final report on the development of the Astronomical Software Directory Service (ASDS), a distributable, searchable, WWW-based database of software packages and their related documentation. ASDS provides integrated access to 56 astronomical software packages, with more than 16,000 URLs indexed for full-text searching.

  9. Software Shopper. Revised.

    ERIC Educational Resources Information Center

    Davis, Sandra Hart, Comp.

    This annotated index describes and illustrates a wide selection of public domain instructional software that may be useful in the education of deaf students and provides educators with a way to order the listed programs. The software programs are designed for use on Apple computers and their compatibles. The software descriptions are presented in…

  10. Software Architecture Evolution

    ERIC Educational Resources Information Center

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  11. Java for flight software

    NASA Technical Reports Server (NTRS)

    Benowitz, E.; Niessner, A.

    2003-01-01

    This work involves developing representative mission-critical spacecraft software using the Real-Time Specification for Java (RTSJ). The work currently leverages actual flight software used in NASA's Deep Space 1 (DS1) mission, which flew in 1998.

  12. Problem-Solving Software

    NASA Technical Reports Server (NTRS)

    1992-01-01

    CBR Express software solves problems by adapting stored solutions to new problems specified by a user. It is applicable to a wide range of situations. The technology was originally developed by Inference Corporation for Johnson Space Center's Advanced Software Development Workstation. The project focused on the reuse of software designs, and Inference used CBR as part of the ACCESS prototype software. The commercial CBR Express is used as a "help desk" for customer support, enabling reuse of existing information when necessary. It has been adopted by several companies, among them American Airlines, which uses it to solve reservation system software problems.
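
    The case-based reasoning pattern the abstract describes can be sketched in a few lines of C++: keep a library of problem/solution cases and retrieve the one whose problem description best matches the new problem. The word-overlap similarity measure and the example cases below are hypothetical simplifications, not the CBR Express implementation.

        #include <iostream>
        #include <sstream>
        #include <set>
        #include <string>
        #include <vector>

        struct Case { std::string problem, solution; };

        // Similarity = number of distinct words shared by the two descriptions.
        static int similarity(const std::string& a, const std::string& b) {
            std::set<std::string> wa, wb;
            std::istringstream sa(a), sb(b);
            for (std::string w; sa >> w; ) wa.insert(w);
            for (std::string w; sb >> w; ) wb.insert(w);
            int shared = 0;
            for (const auto& w : wa) shared += (int)wb.count(w);
            return shared;
        }

        int main() {
            std::vector<Case> library = {
                {"printer offline after driver update", "roll back the printer driver"},
                {"reservation terminal locks up on login", "clear the terminal session cache"}};
            std::string query = "terminal locks up when agent tries to login";

            const Case* best = &library[0];
            int bestScore = -1;
            for (const auto& c : library) {
                int s = similarity(c.problem, query);
                if (s > bestScore) { bestScore = s; best = &c; }
            }
            std::cout << "Suggested fix: " << best->solution << "\n";
        }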

  13. Bioinformatics software resources.

    PubMed

    Gilbert, Don

    2004-09-01

    This review looks at internet archives, repositories and lists for obtaining popular and useful biology and bioinformatics software. Resources include collections of free software, services for the collaborative development of new programs, software news media and catalogues of links to bioinformatics software and web tools. Problems with such resources arise from needs for continued curator effort to collect and update these, combined with less than optimal community support, funding and collaboration. Despite some problems, the available software repositories provide needed public access to many tools that are a foundation for analyses in bioscience research efforts.

  14. Tracker 300 Software

    SciTech Connect

    Wysor, R. Wes

    2006-01-12

    The Tracker300 software is downloaded to an off-the-shelf product called RCM3400/RCM3410 made by Rabbit Semiconductor. The software is a closed-loop controller that computes the sun's position and provides stability compensation. Using the RCM3400/RCM3410 module, the software stores and retrieves parameters from the onboard flash. The software also allows for communication with a host. It will allow the parameters to be downloaded or uploaded, it will show the status of the controller, it will provide real-time feedback, and it will send command acknowledgements. The software will capture the GPS response and ensure the internal clock is set correctly.
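
    A hedged C++ sketch of the closed-loop pattern described above: compute where the sun should be, compare with the measured tracker angles, and command a proportional correction. The ephemeris, encoder, and drive functions are placeholder stubs rather than the Tracker300 firmware interface, and the gain and deadband values are invented for illustration.

        #include <cmath>
        #include <cstdio>

        struct Angles { double azimuth, elevation; };   // degrees

        // Placeholder ephemeris: the real firmware computes this from time and site.
        static Angles computeSunPosition(double /*utcHours*/) { return {180.0, 45.0}; }
        static Angles readTrackerAngles()                      { return {178.5, 44.2}; }
        static void commandDrives(double dAz, double dEl) {
            std::printf("drive correction: dAz=%.2f dEl=%.2f\n", dAz, dEl);
        }

        // One pass of the loop: proportional correction toward the sun, with a
        // small deadband standing in for the stability compensation mentioned above.
        static void controlStep(double utcHours) {
            const double kP = 0.5;
            Angles target = computeSunPosition(utcHours);
            Angles actual = readTrackerAngles();
            double errAz = target.azimuth   - actual.azimuth;
            double errEl = target.elevation - actual.elevation;
            if (std::fabs(errAz) < 0.1) errAz = 0.0;
            if (std::fabs(errEl) < 0.1) errEl = 0.0;
            commandDrives(kP * errAz, kP * errEl);
        }

        int main() { controlStep(12.0); }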

  15. Payload software technology

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A software analysis was performed of known STS sortie payload elements and their associated experiments. This provided basic data for STS payload software characteristics and sizes. A set of technology drivers was identified based on a survey of future technology needs and an assessment of current software technology. The results will be used to evolve a planned approach to software technology development. The purpose of this plan is to ensure that software technology is advanced at a pace and a depth sufficient to fulfill the identified future needs.

  16. Healthcare Software Assurance

    PubMed Central

    Cooper, Jason G.; Pauley, Keith A.

    2006-01-01

    Software assurance is a rigorous, lifecycle phase-independent set of activities which ensure completeness, safety, and reliability of software processes and products. This is accomplished by guaranteeing conformance to all requirements, standards, procedures, and regulations. These assurance processes are even more important when coupled with healthcare software systems, embedded software in medical instrumentation, and other healthcare-oriented life-critical systems. The current Food and Drug Administration (FDA) regulatory requirements and guidance documentation do not address certain aspects of complete software assurance activities. In addition, the FDA’s software oversight processes require enhancement to include increasingly complex healthcare systems such as Hospital Information Systems (HIS). The importance of complete software assurance is introduced, current regulatory requirements and guidance discussed, and the necessity for enhancements to the current processes shall be highlighted. PMID:17238324

  17. NASA software documentation standard software engineering program

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  18. COTS software selection process.

    SciTech Connect

    Watkins, William M. (Strike Wire Technologies, Louisville, CO); Lin, Han Wei; McClelland, Kelly (U.S. Security Associates, Livermore, CA); Ullrich, Rebecca Ann; Khanjenoori, Soheil; Dalton, Karen; Lai, Anh Tri; Kuca, Michal; Pacheco, Sandra; Shaffer-Gant, Jessica

    2006-05-01

    Today's need for rapid software development has generated a great interest in employing Commercial-Off-The-Shelf (COTS) software products as a way of managing cost, development time, and effort. With an abundance of COTS software packages to choose from, the problem now is how to systematically evaluate, rank, and select a COTS product that best meets the software project requirements and at the same time can leverage off the current corporate information technology architectural environment. This paper describes a systematic process for decision support in evaluating and ranking COTS software. Performed right after the requirements analysis, this process provides the evaluators with more concise, structural, and step-by-step activities for determining the best COTS software product with manageable risk. In addition, the process is presented in phases that are flexible to allow for customization or tailoring to meet various projects' requirements.

  19. Gammasphere software development

    SciTech Connect

    Piercey, R.B.

    1993-01-01

    Activities of the nuclear physics group are described. Progress was made in organizing the Gammasphere Software Working Group, establishing a nuclear computing facility, participating in software development at Lawrence Berkeley, developing a common data file format, and adapting the ORNL UPAK software to run at Gammasphere. A universal histogram object was developed that defines a file format and provides for an object-oriented programming model. An automated liquid nitrogen fill system was developed for Gammasphere (110 Ge detectors comprise the sphere).
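
    As a rough illustration of the "universal histogram object" idea mentioned above, the C++ sketch below defines a small histogram class with a fill operation and a plain-text save format. Both the class and the file layout are hypothetical stand-ins, not the Gammasphere or UPAK formats.

        #include <fstream>
        #include <string>
        #include <vector>

        class Histogram {
        public:
            Histogram(std::string name, int bins, double lo, double hi)
                : name_(std::move(name)), counts_(bins, 0), lo_(lo), hi_(hi) {}

            void fill(double x) {                       // increment the bin containing x
                if (x < lo_ || x >= hi_) return;
                int bin = (int)((x - lo_) / (hi_ - lo_) * counts_.size());
                ++counts_[bin];
            }

            void write(const std::string& path) const { // header line, then one bin per line
                std::ofstream out(path);
                out << name_ << ' ' << counts_.size() << ' ' << lo_ << ' ' << hi_ << '\n';
                for (std::size_t i = 0; i < counts_.size(); ++i)
                    out << i << ' ' << counts_[i] << '\n';
            }

        private:
            std::string name_;
            std::vector<long> counts_;
            double lo_, hi_;
        };

        int main() {
            Histogram h("ge_energy", 4096, 0.0, 4096.0);  // e.g. a Ge detector spectrum, keV
            h.fill(1173.2); h.fill(1332.5);               // Co-60 lines
            h.write("ge_energy.hist");
        }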

  20. Dtest Testing Software

    NASA Technical Reports Server (NTRS)

    Jain, Abhinandan; Cameron, Jonathan M.; Myint, Steven

    2013-01-01

    This software runs a suite of arbitrary software tests spanning various software languages and types of tests (unit level, system level, or file comparison tests). The dtest utility can be set to automate periodic testing of large suites of software, as well as running individual tests. It supports distributing multiple tests over multiple CPU cores, if available. The dtest tool is a utility program (written in Python) that scans through a directory (and its subdirectories) and finds all directories that match a certain pattern and then executes any tests in that directory as described in simple configuration files.
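
    The scan-and-run pattern described above can be sketched briefly. dtest itself is a Python utility; the C++ sketch below only illustrates the idea of walking a directory tree, finding per-directory test configuration files, and executing the commands they list. The DTESTDEFS file name and its one-command-per-line format are assumptions made for the example.

        #include <cstdlib>
        #include <filesystem>
        #include <fstream>
        #include <iostream>
        #include <string>

        namespace fs = std::filesystem;

        int main(int argc, char** argv) {
            fs::path root = (argc > 1) ? argv[1] : ".";
            int failures = 0;
            for (const auto& entry : fs::recursive_directory_iterator(root)) {
                // Assumed convention: a test directory holds a DTESTDEFS file whose
                // non-comment lines are shell commands to run.
                if (!entry.is_regular_file() || entry.path().filename() != "DTESTDEFS")
                    continue;
                std::ifstream cfg(entry.path());
                for (std::string cmd; std::getline(cfg, cmd); ) {
                    if (cmd.empty() || cmd[0] == '#') continue;
                    std::cout << "RUN " << cmd << "  (" << entry.path() << ")\n";
                    if (std::system(cmd.c_str()) != 0) ++failures;
                }
            }
            std::cout << failures << " test command(s) failed\n";
            return failures == 0 ? 0 : 1;
        }

    Distributing the discovered tests over multiple CPU cores, as dtest does, would add a worker pool around the inner loop; that detail is omitted here.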

  1. NASA's Software Safety Standard

    NASA Technical Reports Server (NTRS)

    Ramsay, Christopher M.

    2005-01-01

    NASA (National Aeronautics and Space Administration) relies more and more on software to control, monitor, and verify its safety critical systems, facilities and operations. Since the 1960's there has hardly been a spacecraft (manned or unmanned) launched that did not have a computer on board that provided vital command and control services. Despite this growing dependence on software control and monitoring, there has been no consistent application of software safety practices and methodology to NASA's projects with safety critical software. Led by the NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard (NASA-STD-8719.13B) has recently undergone a significant update in an attempt to provide that consistency. This paper will discuss the key features of the new NASA Software Safety Standard. It will start with a brief history of the use and development of software in safety critical applications at NASA. It will then give a brief overview of the NASA Software Working Group and the approach it took to revise the software engineering process across the Agency.

  2. Software Reuse Issues

    NASA Technical Reports Server (NTRS)

    Voigt, Susan J. (Editor); Smith, Kathryn A. (Editor)

    1989-01-01

    NASA Langley Research Center sponsored a Workshop on NASA Research in Software Reuse on November 17-18, 1988 in Melbourne, Florida, hosted by Software Productivity Solutions, Inc. Participants came from four NASA centers and headquarters, eight NASA contractor companies, and three research institutes. Presentations were made on software reuse research at the four NASA centers; on Eli, the reusable software synthesis system designed and currently under development by SPS; on Space Station Freedom plans for reuse; and on other reuse research projects. This publication summarizes the presentations made and the issues discussed during the workshop.

  3. On Software Compatibility.

    ERIC Educational Resources Information Center

    Ershov, Andrei P.

    The problem of compatibility of software hampers the development of computer applications. One solution lies in standardization of languages, terms, peripherals, operating systems and computer characteristics. (AB)

  4. Architecture for Verifiable Software

    NASA Technical Reports Server (NTRS)

    Reinholtz, William; Dvorak, Daniel

    2005-01-01

    Verifiable MDS Architecture (VMA) is a software architecture that facilitates the construction of highly verifiable flight software for NASA's Mission Data System (MDS), especially for smaller missions subject to cost constraints. More specifically, the purpose served by VMA is to facilitate aggressive verification and validation of flight software while imposing a minimum of constraints on overall functionality. VMA exploits the state-based architecture of the MDS and partitions verification issues into elements susceptible to independent verification and validation, in such a manner that scaling issues are minimized, so that relatively large software systems can be aggressively verified in a cost-effective manner.

  5. Statistical modelling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1991-01-01

    During the six-month period from 1 April 1991 to 30 September 1991 the following research papers in statistical modeling of software reliability appeared: (1) A Nonparametric Software Reliability Growth Model; (2) On the Use and the Performance of Software Reliability Growth Models; (3) Research and Development Issues in Software Reliability Engineering; (4) Special Issues on Software; and (5) Software Reliability and Safety.

  6. Teaching Social Software with Social Software

    ERIC Educational Resources Information Center

    Mejias, Ulises

    2006-01-01

    Ulises Mejias examines how social software--information and communications technologies that facilitate the collaboration and exchange of ideas--enables students to participate in distributed research, an approach to learning in which knowledge is collectively constructed and shared. During Fall 2005, Mejias taught a graduate seminar that provided…

  7. NASA's Software Safety Standard

    NASA Technical Reports Server (NTRS)

    Ramsay, Christopher M.

    2007-01-01

    NASA relies more and more on software to control, monitor, and verify its safety critical systems, facilities and operations. Since the 1960's there has hardly been a spacecraft launched that does not have a computer on board that will provide command and control services. There have been recent incidents where software has played a role in high-profile mission failures and hazardous incidents. For example, the Mars Orbiter, Mars Polar Lander, the DART (Demonstration of Autonomous Rendezvous Technology), and MER (Mars Exploration Rover) Spirit anomalies were all caused or contributed to by software. The Mission Control Centers for the Shuttle, ISS, and unmanned programs are highly dependent on software for data displays, analysis, and mission planning. Despite this growing dependence on software control and monitoring, there has been little to no consistent application of software safety practices and methodology to NASA's projects with safety critical software. Meanwhile, academia and private industry have been stepping forward with procedures and standards for safety critical systems and software, for example Dr. Nancy Leveson's book Safeware: System Safety and Computers. The NASA Software Safety Standard, originally published in 1997, was widely ignored due to its complexity and poor organization. It also focused on concepts rather than definite procedural requirements organized around a software project lifecycle. Led by NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard has recently undergone a significant update. This new standard provides the procedures and guidelines for evaluating a project for safety criticality and then lays out the minimum project lifecycle requirements to assure the software is created, operated, and maintained in the safest possible manner. This update of the standard clearly delineates the minimum set of software safety requirements for a project without detailing the implementation for those

  8. Communications Software for Microcomputers.

    ERIC Educational Resources Information Center

    Bruman, Janet L.

    Focusing on the use of microcomputers as "smart terminals" for accessing time-sharing systems for libraries, this document discusses the communications software needed to allow the microcomputer to appear as a terminal to the remote host. The functions which communications software programs are designed to perform are defined and explained,…

  9. Selecting the Right Software.

    ERIC Educational Resources Information Center

    Shearn, Joseph

    1987-01-01

    Selection of administrative software requires analyzing present needs and, to meet future needs, choosing software that will function with a more powerful computer system. Other important factors to include are a professional system demonstration, maintenance and training, and financial considerations that allow leasing or renting alternatives.…

  10. UWB Tracking Software Development

    NASA Technical Reports Server (NTRS)

    Gross, Julia; Arndt, Dickey; Ngo, Phong; Phan, Chau; Dusl, John; Ni, Jianjun; Rafford, Melinda

    2006-01-01

    An Ultra-Wideband (UWB) two-cluster Angle of Arrival (AOA) tracking prototype system is currently being developed and tested at NASA Johnson Space Center for space exploration applications. This talk discusses the software development efforts for this UWB two-cluster AOA tracking system. The role the software plays in this system is to take waveform data from two UWB radio receivers as an input, feed this input into an AOA tracking algorithm, and generate the target position as an output. The architecture of the software (Input/Output Interface and Algorithm Core) will be introduced in this talk. The development of this software has three phases. In Phase I, the software is mostly Matlab driven and calls C++ socket functions to provide the communication links to the radios. This is beneficial in the early stage when it is necessary to frequently test changes in the algorithm. Phase II of the development is to have the software mostly C++ driven and call a Matlab function for the AOA tracking algorithm. This is beneficial in order to send the tracking results to other systems and also to improve the tracking update rate of the system. The third phase is part of future work and is to have the software completely C++ driven with a graphics user interface. This software design enables the fine resolution tracking of the UWB two-cluster AOA tracking system.
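
    The split between the Input/Output Interface and the Algorithm Core can be pictured with a small, hedged C++ sketch: the interface supplies one angle estimate per receiver cluster, and the core converts the two bearings into a target position by intersecting the bearing lines. The planar geometry, the fixed example angles, and the function names are illustrative assumptions; the actual AOA tracking algorithm is not reproduced here.

        #include <cmath>
        #include <cstdio>

        struct Position { double x, y; };   // meters, in the plane of the two clusters

        // Algorithm core: intersect bearing lines from clusters at (0,0) and
        // (baseline,0). Angles are measured from the baseline (x) axis, in radians.
        static Position triangulate(double angleA, double angleB, double baseline) {
            double tA = std::tan(angleA), tB = std::tan(angleB);
            double x = baseline * tB / (tB - tA);
            return {x, tA * x};
        }

        int main() {
            // I/O interface stand-in: in the real system these angles are derived
            // from the waveform data of the two UWB radio receivers.
            const double deg = 3.14159265358979 / 180.0;
            Position p = triangulate(60.0 * deg, 120.0 * deg, 10.0);
            std::printf("target at (%.2f, %.2f) m\n", p.x, p.y);   // (5.00, 8.66)
        }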

  11. A User's Software Dilemma.

    ERIC Educational Resources Information Center

    Splittgerber, Fred; Stirzaker, N. A.

    1989-01-01

    Discusses several issues associated with purchasing computer software packages: (1) continual updates; (2) lack of industrial standards for software development; and (3) expense. Many packages fail to provide technical assistance from a local dealer or the package developer. Without standards, costs to business, education, and the general public…

  12. Plating Tank Control Software

    1998-03-01

    The Plating Tank Control Software is a graphical user interface that controls and records plating process conditions for plating in high aspect ratio channels that require use of low current and long times. The software is written for a Pentium II PC with an 8 channel data acquisition card, and the necessary shunt resistors for measuring currents in the milliampere range.
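
    The current measurement implied by the shunt-resistor setup is simple Ohm's-law arithmetic, I = V / R: the data acquisition card reads the voltage across each shunt, and dividing by the shunt resistance gives the plating current. The resistance and voltage values in the short C++ sketch below are invented examples, not figures from the actual system.

        #include <cstdio>

        int main() {
            const double shuntOhms = 10.0;                 // example shunt resistance
            const double channelVolts[8] = {0.0125, 0.0093, 0, 0, 0, 0, 0, 0};
            for (int ch = 0; ch < 8; ++ch) {               // one reading per DAQ channel
                double milliamps = channelVolts[ch] / shuntOhms * 1000.0;
                std::printf("channel %d: %.2f mA\n", ch, milliamps);
            }
        }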

  13. Software process assessments

    NASA Technical Reports Server (NTRS)

    Miller, Sharon E.; Tucker, George T.; Verducci, Anthony J., Jr.

    1992-01-01

    Software process assessments (SPA's) are part of an ongoing program of continuous quality improvements in AT&T. Their use was found to be very beneficial by software development organizations in identifying the issues facing the organization and the actions required to increase both quality and productivity in the organization.

  14. Reusable Software Technology

    NASA Technical Reports Server (NTRS)

    Morgan, Timothy E.

    1995-01-01

    The objective of the Reusable Software System (RSS) is to provide NASA Langley Research Center and its contractor personnel with a reusable software technology through the Internet. The RSS is easily accessible, provides information that is extractable, and the capability to submit information or data for the purpose of scientific research at NASA Langley Research Center within the Atmospheric Science Division.

  15. Cartographic applications software

    USGS Publications Warehouse

    U.S. Geological Survey

    1992-01-01

    The Office of the Assistant Division Chief for Research, National Mapping Division, develops computer software for the solution of geometronic problems in the fields of surveying, geodesy, remote sensing, and photogrammetry. Software that has been developed using public funds is available on request for a nominal charge to recover the cost of duplication.

  16. Learning from Software Localization.

    ERIC Educational Resources Information Center

    Guo, She-Sen

    2003-01-01

    Localization is the process of adapting a product to meet the language, cultural and other requirements of a specific target environment or market. This article describes ways in which software localization impacts upon curriculum, and discusses what students will learn from software localization. (AEF)

  17. Measuring software technology

    NASA Technical Reports Server (NTRS)

    Agresti, W. W.; Card, D. N.; Church, V. E.; Page, G.; Mcgarry, F. E.

    1983-01-01

    Results are reported from a series of investigations into the effectiveness of various methods and tools used in a software production environment. The basis for the analysis is a project data base, built through extensive data collection and process instrumentation. The project profiles become an organizational memory, serving as a reference point for an active program of measurement and experimentation on software technology.

  18. Software engineering ethics

    NASA Technical Reports Server (NTRS)

    Bown, Rodney L.

    1991-01-01

    Software engineering ethics is reviewed. The following subject areas are covered: lack of a system viewpoint; arrogance of PC DOS software vendors; violation of upward compatibility; internet worm; internet worm revisited; student cheating and company hiring interviews; computing practitioners and the commodity market; new projects and old programming languages; schedule and budget; and recent public domain comments.

  19. Software measurement guidebook

    NASA Technical Reports Server (NTRS)

    Bassman, Mitchell J.; Mcgarry, Frank; Pajerski, Rose

    1994-01-01

    This Software Measurement Guidebook presents information on the purpose and importance of measurement. It discusses the specific procedures and activities of a measurement program and the roles of the people involved. The guidebook also clarifies the roles that measurement can and must play in the goal of continual, sustained improvement for all software production and maintenance efforts.

  20. Cactus: Software Priorities

    ERIC Educational Resources Information Center

    Hyde, Hartley

    2009-01-01

    The early eighties saw a period of rapid change in computing and teachers lost control of how they used computers in their classrooms. Software companies produced computer tools that looked so good that teachers forgot about writing their own classroom materials and happily purchased software--that offered much more than teachers needed--from…

  1. Fastbus software progress

    SciTech Connect

    Gustavson, D.B.

    1982-01-01

    The current status of the Fastbus software development program of the Fastbus Software Working Group is reported, and future plans are discussed. A package of Fastbus interface subroutines has been prepared as a proposed standard, language support for diagnostics and bench testing has been developed, and new documentation to help users find these resources and use them effectively is being written.

  2. NASA Software Documentation Standard

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as "Standard") is designed to support the documentation of all software developed for NASA; its goal is to provide a framework and model for recording the essential information needed throughout the development life cycle and maintenance of a software system. The NASA Software Documentation Standard can be applied to the documentation of all NASA software. The Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. The basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  3. Cathodoluminescence Spectrum Imaging Software

    2011-04-07

    The software developed for spectrum imaging is applied to the analysis of the spectrum series generated by our cathodoluminescence instrumentation. This software provides advanced processing capabilities such as: reconstruction of photon intensity (resolved in energy) and photon energy maps, extraction of the spectrum from selected areas, quantitative imaging mode, pixel-to-pixel correlation spectrum line scans, ASCII output, filling routines, drift correction, etc.
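
    Two of the operations listed above, photon-intensity maps and spectra extracted from selected areas, reduce to sums over a spectrum-image cube indexed by (x, y, energy). The C++ sketch below uses a tiny synthetic cube with an assumed memory layout; it is illustrative only and does not reflect the instrument's actual data format.

        #include <cstdio>
        #include <vector>

        int main() {
            const int nx = 4, ny = 4, ne = 8;                  // tiny example cube
            std::vector<double> cube(nx * ny * ne, 1.0);       // counts[(x*ny + y)*ne + e]

            // Photon-intensity map: total counts at each pixel (sum over energy).
            std::vector<double> intensity(nx * ny, 0.0);
            for (int x = 0; x < nx; ++x)
                for (int y = 0; y < ny; ++y)
                    for (int e = 0; e < ne; ++e)
                        intensity[x * ny + y] += cube[(x * ny + y) * ne + e];

            // Spectrum of a selected area: sum the per-pixel spectra inside the ROI.
            std::vector<double> spectrum(ne, 0.0);
            for (int x = 1; x <= 2; ++x)                       // ROI: x and y in [1,2]
                for (int y = 1; y <= 2; ++y)
                    for (int e = 0; e < ne; ++e)
                        spectrum[e] += cube[(x * ny + y) * ne + e];

            std::printf("pixel(0,0) intensity = %g, ROI spectrum bin 0 = %g\n",
                        intensity[0], spectrum[0]);
        }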

  4. ITOUGH2 software qualification

    SciTech Connect

    Finsterle, S.; Pruess, K.; Fraser, P.

    1996-10-01

    The purpose of this report is to provide all software baseline documents necessary for the software qualification of ITOUGH2. ITOUGH2 is a computer program providing inverse modeling capabilities for TOUGH2. TOUGH2 is a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media.

  5. Measuring software design

    NASA Technical Reports Server (NTRS)

    1986-01-01

    An extensive series of studies of software design measures conducted by the Software Engineering Laboratory is described. Included are the objectives and results of the studies, the method used to perform the studies, and the problems encountered. The document should be useful to researchers planning similar studies as well as to managers and designers concerned with applying quantitative design measures.

  6. PERVAPORATION PERFORMANCE PREDICTION SOFTWARE

    EPA Science Inventory

    The Pervaporation Performance Prediction Software and Database (PPPS&D) computer software program is currently being developed within the USEPA, NRMRL. The purpose of the PPPS&D program is to educate and assist potential users in identifying opportunities for using pervaporati...

  7. Software Solution Saves Dollars

    ERIC Educational Resources Information Center

    Trotter, Andrew

    2004-01-01

    This article discusses computer software that can give classrooms and computer labs the capabilities of costly PC's at a small fraction of the cost. A growing number of cost-conscious school districts are finding budget relief in low-cost computer software known as "open source" that can do everything from manage school Web sites to equip…

  8. Software quality in 1997

    SciTech Connect

    Jones, C.

    1997-11-01

    For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to a parity with manufacturing quality levels. Since software is on the critical path for many engineered products, and for internal business systems as well, the new approaches are starting to affect global competition and attract widespread international interest. It can be hypothesized that success in mastering software quality will be a key strategy for dominating global software markets in the 21st century.

  9. NASA Software Safety Standard

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda

    1997-01-01

    If software is a critical element in a safety critical system, it is imperative to implement a systematic approach to software safety as an integral part of the overall system safety programs. The NASA-STD-8719.13A, "NASA Software Safety Standard", describes the activities necessary to ensure that safety is designed into software that is acquired or developed by NASA, and that safety is maintained throughout the software life cycle. A PDF version is available on the WWW from Lewis. A Guidebook that will assist in the implementation of the requirements in the Safety Standard is under development at the Lewis Research Center (LeRC). After completion, it will also be available on the WWW from Lewis.

  10. Software safety hazard analysis

    SciTech Connect

    Lawrence, J.D.

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper.

  11. Tracker 300 Software

    2006-01-12

    The Tracker300 software is downloaded to an off-the-shelf product called RCM3400/RCM3410 made by Rabbit Semiconductor. The software is a closed-loop controller that computes the sun's position and provides stability compensation. Using the RCM3400/RCM3410 module, the software stores and retrieves parameters from the onboard flash. The software also allows for communication with a host. It will allow the parameters to be downloaded or uploaded, it will show the status of the controller, it will provide real-time feedback, and it will send command acknowledgements. The software will capture the GPS response and ensure the internal clock is set correctly.

  12. Scientific Software Component Technology

    SciTech Connect

    Kohn, S.; Dykman, N.; Kumfert, G.; Smolinski, B.

    2000-02-16

    We are developing new software component technology for high-performance parallel scientific computing to address issues of complexity, re-use, and interoperability for laboratory software. Component technology enables cross-project code re-use, reduces software development costs, and provides additional simulation capabilities for massively parallel laboratory application codes. The success of our approach will be measured by its impact on DOE mathematical and scientific software efforts. Thus, we are collaborating closely with library developers and application scientists in the Common Component Architecture forum, the Equation Solver Interface forum, and other DOE mathematical software groups to gather requirements, write and adopt a variety of design specifications, and develop demonstration projects to validate our approach. Numerical simulation is essential to the science mission at the laboratory. However, it is becoming increasingly difficult to manage the complexity of modern simulation software. Computational scientists develop complex, three-dimensional, massively parallel, full-physics simulations that require the integration of diverse software packages written by outside development teams. Currently, the integration of a new software package, such as a new linear solver library, can require several months of effort. Current industry component technologies such as CORBA, JavaBeans, and COM have all been used successfully in the business domain to reduce software development costs and increase software quality. However, these existing industry component infrastructures will not scale to support massively parallel applications in science and engineering. In particular, they do not address issues related to high-performance parallel computing on ASCI-class machines, such as fast in-process connections between components, language interoperability for scientific languages such as Fortran, parallel data redistribution between components, and massively

  13. Software Measurement Guidebook

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This Software Measurement Guidebook is based on the extensive experience of several organizations that have each developed and applied significant measurement programs over a period of at least 10 years. The lessons derived from those experiences reflect not only successes but also failures. By applying those lessons, an organization can minimize, or at least reduce, the time, effort, and frustration of introducing a software measurement program. The Software Measurement Guidebook is aimed at helping organizations to begin or improve a measurement program. It does not provide guidance for the extensive application of specific measures (such as how to estimate software cost or analyze software complexity) other than by providing examples to clarify points. It does contain advice for establishing and using an effective software measurement program and for understanding some of the key lessons that other organizations have learned. Some of that advice will appear counterintuitive, but it is all based on actual experience. Although all of the information presented in this guidebook is derived from specific experiences of mature measurement programs, the reader must keep in mind that the characteristics of every organization are unique. Some degree of measurement is critical for all software development and maintenance organizations, and most of the key rules captured in this report will be generally applicable. Nevertheless, each organization must strive to understand its own environment so that the measurement program can be tailored to suit its characteristics and needs.

  14. CASE: Software design technologies

    SciTech Connect

    Kalyanov, G.N.

    1994-05-01

    CASE (Computer-Aided Software Engineering) is a set of methodologies for software design, development, and maintenance supported by a complex of interconnected automation tools. CASE is a set of tools for the programmer, analyst, and developer for the automation of software design and development. Today, CASE has become an independent discipline in software engineering that has given rise to a powerful CASE industry made up of hundreds of firms and companies of various kinds. They include companies that develop tools for software analysis and design and have a wide network of distributors and dealers, firms that develop specialized tools for narrow subject areas or for individual stages of the software life cycle, firms that organize seminars and courses for specialists, consulting firms, which demonstrate the practical power of CASE toolkits for specific applications, and companies specializing in the publication of periodicals and bulletins on CASE. The principal purchasers of CASE toolkits abroad are military organizations, data-processing centers, and commercial software developers.

  15. Software Formal Inspections Guidebook

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.

  16. Software packager user's guide

    NASA Technical Reports Server (NTRS)

    Callahan, John R.

    1995-01-01

    Software integration is a growing area of concern for many programmers and software managers because the need to build new programs quickly from existing components is greater than ever. This includes building versions of software products for multiple hardware platforms and operating systems, building programs from components written in different languages, and building systems from components that must execute on different machines in a distributed network. The goal of software integration is to make building new programs from existing components more seamless -- programmers should pay minimal attention to the underlying configuration issues involved. Libraries of reusable components and classes are important tools but only partial solutions to software development problems. Even though software components may have compatible interfaces, there may be other reasons, such as differences between execution environments, why they cannot be integrated. Often, components must be adapted or reimplemented to fit into another application because of implementation differences -- they are implemented in different programming languages, dependent on different operating system resources, or must execute on different physical machines. The software packager is a tool that allows programmers to deal with interfaces between software components and ignore complex integration details. The packager takes modular descriptions of the structure of a software system written in the package specification language and produces an integration program in the form of a makefile. If complex integration tools are needed to integrate a set of components, such as remote procedure call stubs, their use is implied by the packager automatically and stub generation tools are invoked in the corresponding makefile. The programmer deals only with the components themselves and not the details of how to build the system on any given platform.
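
    The abstract's key idea, turning a declarative description of a system's components into a generated makefile, can be sketched as below. The actual package specification language is not reproduced here; the dictionary-based spec, compiler mapping, and link flags are hypothetical stand-ins chosen only to show the spec-to-integration flow.

```python
# Illustrative only: a hypothetical dictionary-based package description is
# turned into a plain makefile, mimicking the spec-to-makefile idea above.
spec = {
    "program": "app",
    "components": [
        {"name": "main",   "lang": "c",       "src": "main.c"},
        {"name": "solver", "lang": "fortran", "src": "solver.f90"},
    ],
}

COMPILERS = {"c": "cc", "fortran": "gfortran"}   # hypothetical toolchain mapping

def emit_makefile(spec: dict) -> str:
    objs = [c["src"].rsplit(".", 1)[0] + ".o" for c in spec["components"]]
    lines = [f"{spec['program']}: {' '.join(objs)}",
             f"\tcc -o {spec['program']} {' '.join(objs)} -lgfortran"]
    for comp, obj in zip(spec["components"], objs):
        lines += [f"{obj}: {comp['src']}",
                  f"\t{COMPILERS[comp['lang']]} -c {comp['src']} -o {obj}"]
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    print(emit_makefile(spec))
```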

  17. Ascent/Descent Software

    NASA Technical Reports Server (NTRS)

    Brown, Charles; Andrew, Robert; Roe, Scott; Frye, Ronald; Harvey, Michael; Vu, Tuan; Balachandran, Krishnaiyer; Bly, Ben

    2012-01-01

    The Ascent/Descent Software Suite has been used to support a variety of NASA Shuttle Program mission planning and analysis activities, such as range safety, on the Integrated Planning System (IPS) platform. The Ascent/Descent Software Suite, containing Ascent Flight Design (ASC)/Descent Flight Design (DESC) Configuration Items (CIs), lifecycle documents, and data files used for shuttle ascent and entry modeling analysis and mission design, resides on IPS/Linux workstations. A list of tools in Navigation (NAV)/Prop Software Suite represents tool versions established during or after the IPS Equipment Rehost-3 project.

  18. Advanced fingerprint verification software

    NASA Astrophysics Data System (ADS)

    Baradarani, A.; Taylor, J. R. B.; Severin, F.; Maev, R. Gr.

    2016-05-01

    We have developed a fingerprint software package that can be used in a wide range of applications from law enforcement to public and private security systems, and to personal devices such as laptops, vehicles, and door locks. The software and processing units are a unique implementation of new and sophisticated algorithms that compete with the current best systems in the world. Development of the software package has been in line with the third generation of our ultrasonic fingerprinting machine [1]. Solid and robust performance is achieved in the presence of misplaced and low-quality fingerprints.

  19. Error Free Software

    NASA Technical Reports Server (NTRS)

    1985-01-01

    A mathematical theory for development of "higher order" software to catch computer mistakes resulted from a Johnson Space Center contract for Apollo spacecraft navigation. Two women who were involved in the project formed Higher Order Software, Inc. to develop and market the system of error analysis and correction. They designed software which is logically error-free, which, in one instance, was found to increase productivity by 600%. USE.IT defines its objectives using AXES -- a user can write in English and the system converts to computer languages. It is employed by several large corporations.

  20. CNEOST Control Software System

    NASA Astrophysics Data System (ADS)

    Wang, Xin; Zhao, Hai-bin; Xia, Yan; Lu, Hao; Li, Bin

    2016-01-01

    In 2013, CNEOST (China Near Earth Object Survey Telescope) adapted its hardware system for a new CCD camera. Based on the new system architecture, the control software was re-designed and implemented. The software system adopts a messaging mechanism based on the WebSocket protocol and offers good flexibility and extensibility. The user interface, built with responsive web design, enables remote observation from both desktop and mobile devices. The stable operation of the software system has greatly improved observing efficiency while reducing complexity, and has also served as a successful trial for the future design of telescope and telescope-cloud systems.
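
    A minimal sketch of WebSocket-based command/status messaging in the spirit of the architecture described above is shown below. It assumes a recent version of the third-party Python `websockets` package; the JSON fields ("cmd", "status") are hypothetical and do not reflect CNEOST's actual message schema.

```python
# Minimal WebSocket command/status server, assuming a recent `websockets`
# release (older releases pass an extra `path` argument to the handler).
# Message fields are hypothetical, not CNEOST's protocol.
import asyncio
import json
import websockets

async def handle_client(websocket):
    async for raw in websocket:
        msg = json.loads(raw)
        if msg.get("cmd") == "status":
            reply = {"status": "idle", "camera": "ready"}    # placeholder telescope state
        else:
            reply = {"error": f"unknown command {msg.get('cmd')!r}"}
        await websocket.send(json.dumps(reply))

async def main():
    async with websockets.serve(handle_client, "0.0.0.0", 8765):
        await asyncio.Future()    # serve until cancelled

if __name__ == "__main__":
    asyncio.run(main())
```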

  1. CNEOST Control Software System

    NASA Astrophysics Data System (ADS)

    Wang, X.; Zhao, H. B.; Xia, Y.; Lu, H.; Li, B.

    2015-03-01

    In 2013, CNEOST (China Near Earth Object Survey Telescope) adapted its hardware system for a new CCD camera. Based on the new system architecture, the control software was re-designed and implemented. The software system adopts a message-passing mechanism via the WebSocket protocol, which improves its flexibility, extensibility, and scalability. The user interface, built with responsive web design, enables remote operation from both desktop and mobile devices. The stable operation of the software system has greatly improved observing efficiency while reducing complexity, and has also served as a successful trial for the future design of telescope and telescope-cloud systems.

  2. Multiphase flow calculation software

    DOEpatents

    Fincke, James R.

    2003-04-15

    Multiphase flow calculation software and computer-readable media carrying computer executable instructions for calculating liquid and gas phase mass flow rates of high void fraction multiphase flows. The multiphase flow calculation software employs various given, or experimentally determined, parameters in conjunction with a plurality of pressure differentials of a multiphase flow, preferably supplied by a differential pressure flowmeter or the like, to determine liquid and gas phase mass flow rates of the high void fraction multiphase flows. Embodiments of the multiphase flow calculation software are suitable for use in a variety of applications, including real-time management and control of an object system.
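
    The sketch below shows the standard single-phase differential-pressure (orifice-meter) relation that such calculations build on. It is illustrative only: the patent's multiphase correlations and experimentally determined parameters are not reproduced, and all numbers in the example are hypothetical.

```python
# Single-phase mass flow from a differential pressure reading, ISO 5167-style
# relation. Illustrative only; the patent's multiphase method is not shown.
import math

def orifice_mass_flow(dp_pa: float, rho: float, d_throat: float, d_pipe: float,
                      c_discharge: float = 0.61, expansibility: float = 1.0) -> float:
    """Mass flow rate (kg/s) from differential pressure across an orifice."""
    beta = d_throat / d_pipe
    area = math.pi / 4.0 * d_throat ** 2
    return (c_discharge / math.sqrt(1.0 - beta ** 4)) * expansibility * area * math.sqrt(2.0 * dp_pa * rho)

if __name__ == "__main__":
    # Hypothetical example: water (1000 kg/m^3), 25 kPa across a 50 mm throat in a 100 mm pipe.
    print(f"{orifice_mass_flow(25e3, 1000.0, 0.05, 0.10):.2f} kg/s")
```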

  3. Towards a software profession

    NASA Technical Reports Server (NTRS)

    Berard, Edward V.

    1986-01-01

    An increasing number of programmers have attempted to change their image. They have made it plain that they wish not only to be taken seriously, but they also wish to be regarded as professionals. Many programmers now wish to referred to as software engineers. If programmers wish to be considered professionals in every sense of the word, two obstacles must be overcome: the inability to think of software as a product, and the idea that little or no skill is required to create and handle software throughout its life cycle. The steps to be taken toward professionalization are outlined along with recommendations.

  4. Software quality assurance handbook

    SciTech Connect

    Not Available

    1990-09-01

    There are two important reasons for Software Quality Assurance (SQA) at Allied-Signal Inc., Kansas City Division (KCD): First, the benefits from SQA make good business sense. Second, the Department of Energy has requested SQA. This handbook is one of the first steps in a plant-wide implementation of Software Quality Assurance at KCD. The handbook has two main purposes. The first is to provide information that you will need to perform software quality assurance activities. The second is to provide a common thread to unify the approach to SQA at KCD. 2 figs.

  5. Speakeasy software development

    NASA Astrophysics Data System (ADS)

    Baskinger, Patricia J.; Ozarow, Larry; Chruscicki, Mary C.

    1993-08-01

    The Speakeasy Software Development Project had three primary objectives. The first objective was to perform Independent Verification and Validation (IV & V) of the software and documentation associated with the signal processor being developed by Hazeltine and TRW under the Speakeasy program. The IV & V task also included an analysis and assessment of the ability of the signal processor software to provide LPI communications functions. The second objective was to assist in the enhancement and modification of an existing Rome Lab signal processor workstation. Finally, TASC developed project management support tools and provided program management support to the Speakeasy Program Office.

  6. Orbit Software Suite

    NASA Technical Reports Server (NTRS)

    Osgood, Cathy; Williams, Kevin; Gentry, Philip; Brownfield, Dana; Hallstrom, John; Stuit, Tim

    2012-01-01

    Orbit Software Suite is used to support a variety of NASA/DM (Dependable Multiprocessor) mission planning and analysis activities on the IPS (Integrated Planning System) platform. The suite of Orbit software tools (Orbit Design and Orbit Dynamics) resides on IPS/Linux workstations, and is used to perform mission design and analysis tasks corresponding to trajectory/launch window, rendezvous, and proximity operations flight segments. A list of tools in Orbit Software Suite represents tool versions established during/after the Equipment Rehost-3 Project.

  7. ACS: ALMA Common Software

    NASA Astrophysics Data System (ADS)

    Chiozzi, Gianluca; Šekoranja, Matej

    2013-02-01

    ALMA Common Software (ACS) provides a software infrastructure common to all ALMA partners and consists of a documented collection of common patterns and components which implement those patterns. The heart of ACS is based on a distributed Component-Container model, with ACS Components implemented as CORBA objects in any of the supported programming languages. ACS provides common CORBA-based services such as logging, error and alarm management, configuration database and lifecycle management. Although designed for ALMA, ACS can be, and is being, used in other control systems and distributed software projects, since it implements proven design patterns using state-of-the-art, reliable technology. Through the use of well-known standard constructs and components, it also allows team members who are not authors of ACS to easily understand the architecture of software modules, making maintenance affordable even on a very large project.

  8. Software Design Analyzer System

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1985-01-01

    The CRISP80 software design analyzer system is a set of programs that supports top-down, hierarchic, modular, structured design and programming methodologies. CRISP80 allows a design to be expressed as a picture of the program.

  9. Software interface verifier

    NASA Technical Reports Server (NTRS)

    Soderstrom, Tomas J.; Krall, Laura A.; Hope, Sharon A.; Zupke, Brian S.

    1994-01-01

    A Telos study of 40 recent subsystem deliveries into the DSN at JPL found software interface testing to be the single most expensive and error-prone activity, and the study team suggested creating an automated software interface test tool. The resulting Software Interface Verifier (SIV), which was funded by NASA/JPL and created by Telos, employed 92 percent software reuse to quickly create an initial version which incorporated early user feedback. SIV is now successfully used by developers for interface prototyping and unit testing, by test engineers for formal testing, and by end users for non-intrusive data flow tests in the operational environment. Metrics, including cost, are included. Lessons learned include the need for early user training. SIV is ported to many platforms and can be successfully used or tailored by other NASA groups.

  10. A Symphony of Software.

    ERIC Educational Resources Information Center

    Currents, 2002

    2002-01-01

    Offers a descriptive table of databases that help higher education institutions orchestrate advancement operations. Information includes vendor, contact, software, price, database engine/server platform, recommended reporting tools, record capacity, and client type. (EV)

  11. HOMER® Energy Modeling Software

    2000-12-31

    The HOMER® energy modeling software is a tool for designing and analyzing hybrid power systems, which contain a mix of conventional generators, cogeneration, wind turbines, solar photovoltaics, hydropower, batteries, fuel cells, biomass, and other inputs.

  12. Writing Instructional Software.

    ERIC Educational Resources Information Center

    Lorenz, Marian; Moose, Allan

    1983-01-01

    Discusses the main categories of instructional software, including drill/practice, tutorials, simulation/problem solving, games, and management, along with factors involved in their design. (Author/MBR)

  13. Software Solutions for ICME

    NASA Astrophysics Data System (ADS)

    Schmitz, G. J.; Engstrom, A.; Bernhardt, R.; Prahl, U.; Adam, L.; Seyfarth, J.; Apel, M.; de Saracibar, C. Agelet; Korzhavyi, P.; Ågren, J.; Patzak, B.

    2016-01-01

    The Integrated Computational Materials Engineering expert group (ICMEg), a coordination activity of the European Commission, aims at developing a global and open standard for information exchange between the heterogeneous varieties of numerous simulation tools. The ICMEg consortium coordinates respective developments by a strategy of networking stakeholders in the first International Workshop on Software Solutions for ICME, compiling identified and relevant software tools into the Handbook of Software Solutions for ICME, discussing strategies for interoperability between different software tools during a second (planned) international workshop, and eventually proposing a scheme for standardized information exchange in a future book or document. The present article summarizes these respective actions to provide the ICME community with some additional insights and resources from which to help move this field forward.

  14. Computer Center: Software Review.

    ERIC Educational Resources Information Center

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

    Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with colorgraphics capabilities. Describes the documentation, presentation and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  15. Economics of software utilization

    SciTech Connect

    Sidorov, N.A.

    1995-01-01

    The application of the reuse principle to software (use of methods, concepts, or system components in a context or a situation which is different from that originally envisaged in the development phase) requires solving many problems of technical, economic, organizational, and legal nature. At present, it is the technical problems of reuse that are receiving the greater attention. Economic aspects of reuse, which are the subject of this paper, are only beginning to be studied. In our analysis, an integrated approach to the economics of software recycling suggests three models that can be applied to examine reusability. Section 1 characterizes the application of the reuse principle in software systems. Section 2 identifies the factors which are relevant for reuse. Section 3 briefly describes the main processes of reuse. Section 4 presents the metrics for the evaluation of reuse models. Section 5 examines the reuse models, and Section 6 presents some recommendations for reducing the development costs of reusable software.

  16. Error-Free Software

    NASA Technical Reports Server (NTRS)

    1989-01-01

    001 is an integrated tool suited for automatically developing ultra reliable models, simulations and software systems. Developed and marketed by Hamilton Technologies, Inc. (HTI), it has been applied in engineering, manufacturing, banking and software tools development. The software provides the ability to simplify the complex. A system developed with 001 can be a prototype or fully developed with production quality code. It is free of interface errors, consistent, logically complete and has no data or control flow errors. Systems can be designed, developed and maintained with maximum productivity. Margaret Hamilton, President of Hamilton Technologies, also directed the research and development of USE.IT, an earlier product which was the first computer aided software engineering product in the industry to concentrate on automatically supporting the development of an ultrareliable system throughout its life cycle. Both products originated in NASA technology developed under a Johnson Space Center contract.

  17. Astronomical Software Directory Service

    NASA Technical Reports Server (NTRS)

    Hanisch, Robert J.; Payne, Harry; Hayes, Jeffrey

    1997-01-01

    With the support of NASA's Astrophysics Data Program (NRA 92-OSSA-15), we have developed the Astronomical Software Directory Service (ASDS): a distributed, searchable, WWW-based database of software packages and their related documentation. ASDS provides integrated access to 56 astronomical software packages, with more than 16,000 URLs indexed for full-text searching. Users are performing about 400 searches per month. A new aspect of our service is the inclusion of telescope and instrumentation manuals, which prompted us to change the name to the Astronomical Software and Documentation Service. ASDS was originally conceived to serve two purposes: to provide a useful Internet service in an area of expertise of the investigators (astronomical software), and as a research project to investigate various architectures for searching through a set of documents distributed across the Internet. Two of the co-investigators were then installing and maintaining astronomical software as their primary job responsibility. We felt that a service which incorporated our experience in this area would be more useful than a straightforward listing of software packages. The original concept was for a service based on the client/server model, which would function as a directory/referral service rather than as an archive. For performing the searches, we began our investigation with a decision to evaluate the Isite software from the Center for Networked Information Discovery and Retrieval (CNIDR). This software was intended as a replacement for Wide-Area Information Service (WAIS), a client/server technology for performing full-text searches through a set of documents. Isite had some additional features that we considered attractive, and we enjoyed the cooperation of the Isite developers, who were happy to have ASDS as a demonstration project. We ended up staying with the software throughout the project, making modifications to take advantage of new features as they came along, as well as

  18. Biological Imaging Software Tools

    PubMed Central

    Eliceiri, Kevin W.; Berthold, Michael R.; Goldberg, Ilya G.; Ibáñez, Luis; Manjunath, B.S.; Martone, Maryann E.; Murphy, Robert F.; Peng, Hanchuan; Plant, Anne L.; Roysam, Badrinath; Stuurman, Nico; Swedlow, Jason R.; Tomancak, Pavel; Carpenter, Anne E.

    2013-01-01

    Few technologies are more widespread in modern biological laboratories than imaging. Recent advances in optical technologies and instrumentation are providing hitherto unimagined capabilities. Almost all these advances have required the development of software to enable the acquisition, management, analysis, and visualization of the imaging data. We review each computational step that biologists encounter when dealing with digital images, the challenges in that domain, and the overall status of available software for bioimage informatics, focusing on open source options. PMID:22743775

  19. LIGA Scanner Control Software

    1999-02-01

    The LIGA Scanner Software is a graphical user interface package that facilitates controlling the scanning operation of x-rays from a synchrotron and sample manipulation for making LIGA parts. The process requires scanning of the LIGA mask and the PMMA resist through a stationary x-ray beam to provide an evenly distributed x-ray exposure over the wafer. This software package has been written specifically to interface with Aerotech motor controllers.

  20. Public Key FPGA Software

    SciTech Connect

    Hymel, Ross

    2013-07-25

    The Public Key (PK) FPGA software performs asymmetric authentication using the 163-bit Elliptic Curve Digital Signature Algorithm (ECDSA) on an embedded FPGA platform. A digital signature is created on user-supplied data, and communication with a host system is performed via a Serial Peripheral Interface (SPI) bus. Software includes all components necessary for signing, including custom random number generator for key creation and SHA-256 for data hashing.
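
    The host-side sign/verify flow described above can be sketched with the Python `cryptography` package, as below. The FPGA design uses a 163-bit curve, which this example replaces with NIST P-256 purely to show the flow; the SPI transport and the custom random number generator are not modeled.

```python
# ECDSA sign/verify flow sketch using the `cryptography` package. P-256 and
# SHA-256 stand in for the FPGA's 163-bit curve; transport and key storage are
# not modeled.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.exceptions import InvalidSignature

private_key = ec.generate_private_key(ec.SECP256R1())
data = b"user-supplied payload to authenticate"

signature = private_key.sign(data, ec.ECDSA(hashes.SHA256()))   # hash, then sign

try:
    private_key.public_key().verify(signature, data, ec.ECDSA(hashes.SHA256()))
    print("signature verified")
except InvalidSignature:
    print("signature rejected")
```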

  1. Image Processing Software

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The Ames digital image velocimetry technology has been incorporated in a commercially available image processing software package that allows motion measurement of images on a PC alone. The software, manufactured by Werner Frei Associates, is IMAGELAB FFT. IMAGELAB FFT is a general purpose image processing system with a variety of other applications, among them image enhancement of fingerprints and use by banks and law enforcement agencies for analysis of videos run during robberies.

  2. Engineering and Software Engineering

    NASA Astrophysics Data System (ADS)

    Jackson, Michael

    The phrase ‘software engineering' has many meanings. One central meaning is the reliable development of dependable computer-based systems, especially those for critical applications. This is not a solved problem. Failures in software development have played a large part in many fatalities and in huge economic losses. While some of these failures may be attributable to programming errors in the narrowest sense—a program's failure to satisfy a given formal specification—there is good reason to think that most of them have other roots. These roots are located in the problem of software engineering rather than in the problem of program correctness. The famous 1968 conference was motivated by the belief that software development should be based on “the types of theoretical foundations and practical disciplines that are traditional in the established branches of engineering.” Yet after forty years of currency the phrase ‘software engineering' still denotes no more than a vague and largely unfulfilled aspiration. Two major causes of this disappointment are immediately clear. First, too many areas of software development are inadequately specialised, and consequently have not developed the repertoires of normal designs that are the indispensable basis of reliable engineering success. Second, the relationship between structural design and formal analytical techniques for software has rarely been one of fruitful synergy: too often it has defined a boundary between competing dogmas, at which mutual distrust and incomprehension deprive both sides of advantages that should be within their grasp. This paper discusses these causes and their effects. Whether the common practice of software development will eventually satisfy the broad aspiration of 1968 is hard to predict; but an understanding of past failure is surely a prerequisite of future success.

  3. Software for the EVLA

    NASA Astrophysics Data System (ADS)

    Butler, Bryan J.; van Moorsel, Gustaaf; Tody, Doug

    2004-09-01

    The Expanded Very Large Array (EVLA) project is the next generation instrument for high resolution long-millimeter to short-meter wavelength radio astronomy. It is currently funded by NSF, with completion scheduled for 2012. The EVLA will upgrade the VLA with new feeds, receivers, data transmission hardware, correlator, and a new software system to enable the instrument to achieve its full potential. This software includes both that required for controlling and monitoring the instrument and that involved with the scientific dataflow. We concentrate here on a portion of the dataflow software, including: proposal preparation, submission, and handling; observation preparation, scheduling, and remote monitoring; data archiving; and data post-processing, including both automated (pipeline) and manual processing. The primary goals of the software are: to maximize the scientific return of the EVLA; provide ease of use, for both novices and experts; exploit commonality amongst all NRAO telescopes where possible. This last point is both a bane and a blessing: we are not at liberty to do whatever we want in the software, but on the other hand we may borrow from other projects (notably ALMA and GBT) where appropriate. The software design methodology includes detailed initial use-cases and requirements from the scientists, intimate interaction between the scientists and the programmers during design and implementation, and a thorough testing and acceptance plan.

  4. Star Wars software debate

    SciTech Connect

    Myers, W.

    1986-02-01

    David L. Parnas, Lansdowne Professor of Computer Science at the University of Victoria, resigned from the SDI Organization's Panel on Computing in Support of Battle Management on June 28, 1985. Parnas, with 20 years of research on software engineering plus 8 years of work on military aircraft real-time software, says the software portion of SDI cannot be built error-free and he doesn't expect the next 20 years of research to change that fact. Since Parnas resigned, there have been several public debates on Star Wars software questions. In November 1985 the SDIO panel from which Parnas resigned released a draft of its report, reflecting its effort to respond to critics of the project. While one might think that errors could be entirely eliminated with enough care and checking, most software professionals believe there will always be some residue of errors in a system of this size and complexity. The general line of the critics' argument is that the larger the amount of software in a single, unified system, the higher the percentage of errors it will contain. Proponents counter that the one very large system can be divided into a number of smaller, relatively independent pieces, thus reducing the proportionate number of errors in each separate piece. This approach is in turn countered by those who point to the intricate relations between these pieces, which themselves contribute to error.

  5. Software Process Assessment (SPA)

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda H.; Sheppard, Sylvia B.; Butler, Scott A.

    1994-01-01

    NASA's environment mirrors the changes taking place in the nation at large, i.e. workers are being asked to do more work with fewer resources. For software developers at NASA's Goddard Space Flight Center (GSFC), the effects of this change are that we must continue to produce quality code that is maintainable and reusable, but we must learn to produce it more efficiently and less expensively. To accomplish this goal, the Data Systems Technology Division (DSTD) at GSFC is trying a variety of both proven and state-of-the-art techniques for software development (e.g., object-oriented design, prototyping, designing for reuse, etc.). In order to evaluate the effectiveness of these techniques, the Software Process Assessment (SPA) program was initiated. SPA was begun under the assumption that the effects of different software development processes, techniques, and tools, on the resulting product must be evaluated in an objective manner in order to assess any benefits that may have accrued. SPA involves the collection and analysis of software product and process data. These data include metrics such as effort, code changes, size, complexity, and code readability. This paper describes the SPA data collection and analysis methodology and presents examples of benefits realized thus far by DSTD's software developers and managers.
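
    As a minimal illustration of the kind of product data SPA collects, the sketch below derives two common metrics (defect density and change rate) from per-project records. The exact SPA metric definitions are not given in the abstract, so the field names and values here are hypothetical.

```python
# Illustrative only: hypothetical per-project records summarized into two
# common product metrics; not SPA's actual metric definitions or data.
projects = [
    {"name": "proj_a", "ksloc": 42.0, "effort_hours": 3100, "changes": 180, "defects": 55},
    {"name": "proj_b", "ksloc": 18.5, "effort_hours": 1400, "changes": 95,  "defects": 12},
]

for p in projects:
    defect_density = p["defects"] / p["ksloc"]          # defects per KSLOC
    change_rate = p["changes"] / p["effort_hours"]      # code changes per staff-hour
    print(f"{p['name']}: {defect_density:.2f} defects/KSLOC, {change_rate:.3f} changes/hour")
```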

  6. Encyclopedia of software components

    NASA Technical Reports Server (NTRS)

    Vanwarren, Lloyd (Inventor); Beckman, Brian C. (Inventor)

    1991-01-01

    Intelligent browsing through a collection of reusable software components is facilitated with a computer having a video monitor and a user input interface such as a keyboard or a mouse for transmitting user selections, by presenting a picture of encyclopedia volumes with respective visible labels referring to types of software, in accordance with a metaphor in which each volume includes a page having a list of general topics under the software type of the volume and pages having lists of software components for each one of the generic topics, altering the picture to open one of the volumes in response to an initial user selection specifying the one volume to display on the monitor a picture of the page thereof having the list of general topics and altering the picture to display the page thereof having a list of software components under one of the general topics in response to a next user selection specifying the one general topic, and then presenting a picture of a set of different informative plates depicting different types of information about one of the software components in response to a further user selection specifying the one component.

  7. Encyclopedia of Software Components

    NASA Technical Reports Server (NTRS)

    Warren, Lloyd V. (Inventor); Beckman, Brian C. (Inventor)

    1997-01-01

    Intelligent browsing through a collection of reusable software components is facilitated with a computer having a video monitor and a user input interface such as a keyboard or a mouse for transmitting user selections, by presenting a picture of encyclopedia volumes with respective visible labels referring to types of software, in accordance with a metaphor in which each volume includes a page having a list of general topics under the software type of the volume and pages having lists of software components for each one of the generic topics, altering the picture to open one of the volumes in response to an initial user selection specifying the one volume to display on the monitor a picture of the page thereof having the list of general topics and altering the picture to display the page thereof having a list of software components under one of the general topics in response to a next user selection specifying the one general topic, and then presenting a picture of a set of different informative plates depicting different types of information about one of the software components in response to a further user selection specifying the one component.

  8. Parallel Fortran-MPI software for numerical inversion of the Laplace transform and its application to oscillatory water levels in groundwater environments

    USGS Publications Warehouse

    Zhan, X.

    2005-01-01

    A parallel Fortran-MPI (Message Passing Interface) software package for numerical inversion of the Laplace transform based on a Fourier series method is developed to meet the need of solving computationally intensive problems involving the response of oscillatory water levels to hydraulic tests in a groundwater environment. The software is a parallel version of ACM (The Association for Computing Machinery) Transactions on Mathematical Software (TOMS) Algorithm 796. Running 38 test examples indicated that implementation of MPI techniques with a distributed-memory architecture speeds up the processing and improves efficiency. Applications to oscillatory water levels in a well during aquifer tests are presented to illustrate how this package can be applied to solve complicated environmental problems involving differential and integral equations. The package is free and is easy to use for people with little or no previous experience in using MPI but who wish to get off to a quick start in parallel computing. © 2004 Elsevier Ltd. All rights reserved.
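
    The Fourier-series approach mentioned above can be sketched as a trapezoidal approximation of the Bromwich inversion integral; the sum over series terms is the part that parallelizes naturally across MPI ranks. The sketch below is not TOMS Algorithm 796 itself, and its parameter choices (number of terms, period scale, contour shift) are conventional values rather than the algorithm's published defaults.

```python
# Illustrative sketch, not TOMS Algorithm 796: a standard Fourier-series
# (trapezoidal Bromwich) approximation of the inverse Laplace transform.
import cmath
import math

def invert_laplace(F, t, n_terms=5000, t_scale=2.0, a_param=8.0):
    """Approximate f(t) from its Laplace transform F(s) by a Fourier-series sum."""
    T = t_scale * t                     # half-period of the approximating series (must exceed t)
    sigma = a_param / (2.0 * T)         # contour shift controlling the discretization error
    total = 0.5 * F(complex(sigma, 0.0)).real
    for k in range(1, n_terms + 1):
        s = complex(sigma, k * math.pi / T)
        total += (F(s) * cmath.exp(1j * k * math.pi * t / T)).real
    return math.exp(sigma * t) / T * total

if __name__ == "__main__":
    # Check against a known pair: F(s) = 1/(s + 1)  <->  f(t) = exp(-t).
    print(invert_laplace(lambda s: 1.0 / (s + 1.0), t=1.0), math.exp(-1.0))
    # In an MPI version, each rank would sum a slice of the k terms and the
    # partial sums would be combined with comm.allreduce(..., op=MPI.SUM).
```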

  9. Software management at Fermilab

    SciTech Connect

    Robert M. Harris

    1998-10-01

    We describe the structure and performance of a software management system in wide use at Fermilab. The system provides software version control with Concurrent Versions System (CVS) configured in a client-server mode. Management and building of software is provided by Software Release Tools (SoftRelTools) originally developed by the BaBar collaboration. Support for SoftRelTools, the heart of the system, is organized by the Fermilab computing division in close communication with the end users: CDF, D0, BTeV and CMS. Unix Product Support (UPS) is used to initialize environmental variables for multiple versions of software on multiple platforms. Distribution of frozen releases is currently handled by internally developed scripts, but will soon be performed by Unix Product Distribution (UPD). At CDF the development version of the software is also distributed daily and built in place on 18 different machines, with new machines added weekly. Although primarily intended for UNIX platforms, including Linux, the system is also supported for Windows NT by D0. This system handles the version control, management, building, and distribution of code written in Fortran, C, and C++. A single executable can call routines written in all three languages. A distinguishing feature of the system is its ability to allow rapid asynchronous development of package versions, which can be easily integrated into complete consistent releases of the entire offline software. Daily rebuilds of all the software, along with automatic mailings of build errors to developers, test robustness and allow speedy integration. This system has been used since January 1997 by CDF, D0 and BTeV for the development and release of software for the next run of the Tevatron Collider. At CDF it has been used by roughly 30 developers to make over a dozen frozen releases of a million lines of software. D0's use is similar to CDF, and the system is just beginning to be used by CMS. The cooperative maintenance and

  10. Computing and software

    USGS Publications Warehouse

    White, Gary C.; Hines, J.E.

    2004-01-01

    The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementation of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model's likelihood must be programmed specifically for that model. They wishfully think that future software might allow the user to combine pieces of likelihood

  11. Impact of Agile Software Development Model on Software Maintainability

    ERIC Educational Resources Information Center

    Gawali, Ajay R.

    2012-01-01

    Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burdens tightly budgeted information technology (IT) organizations. Agile software development approach delivers business value early, but implications on software maintainability are still unknown. The purpose of this quantitative study…

  12. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 14

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  13. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 15

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  14. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 13

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  15. The software engineering laboratory: An approach to measuring software technology

    NASA Technical Reports Server (NTRS)

    Mcgarry, F.

    1980-01-01

    The investigations of the software evaluation laboratory into the software development process at NASA/Goddard are described. A data collection process for acquiring detailed histories of software development projects is outlined. The application of different sets of software methodologies to specific applications projects is summarized. The effect of the development methodology on productivity is discussed.

  16. Software reliability perspectives

    NASA Technical Reports Server (NTRS)

    Wilson, Larry; Shen, Wenhui

    1987-01-01

    Software which is used in life-critical functions must be known to be highly reliable before installation. This requires a strong testing program to estimate the reliability, since neither formal methods, software engineering, nor fault-tolerant methods can guarantee perfection. Prior to the final testing, software goes through a debugging period, and many models have been developed to try to estimate reliability from the debugging data. However, the existing models are poorly validated and often give poor performance. This paper emphasizes the fact that part of their failures can be attributed to the random nature of the debugging data given to these models as input, and it poses the problem of correcting this defect as an area of future research.
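
    The sketch below shows the kind of debugging-data model the paper critiques: a Goel-Okumoto mean value function m(t) = a(1 - exp(-bt)) fitted to cumulative defect counts by least squares. The data and parameter values are synthetic, and this is one common model among many, not the paper's own method.

```python
# Illustrative reliability-growth fit: Goel-Okumoto mean value function fitted
# to synthetic cumulative defect counts with least squares (not the paper's method).
import numpy as np
from scipy.optimize import curve_fit

def goel_okumoto(t, a, b):
    return a * (1.0 - np.exp(-b * t))

weeks = np.arange(1, 11, dtype=float)                        # test weeks
cum_defects = np.array([9, 17, 23, 28, 31, 34, 36, 37, 38, 39], dtype=float)

(a_hat, b_hat), _ = curve_fit(goel_okumoto, weeks, cum_defects, p0=[40.0, 0.3])
remaining = a_hat - cum_defects[-1]                          # expected defects still latent
print(f"a={a_hat:.1f}, b={b_hat:.3f}, expected remaining defects ~ {remaining:.1f}")
```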

  17. SAR Product Control Software

    NASA Astrophysics Data System (ADS)

    Meadows, P. J.; Hounam, D.; Rye, A. J.; Rosich, B.; Börner, T.; Closa, J.; Schättler, B.; Smith, P. J.; Zink, M.

    2003-03-01

    As SAR instruments and their operating modes become more complex, as new applications place more and more demands on image quality and as our understanding of their imperfections becomes more sophisticated, there is increasing recognition that SAR data quality has to be controlled more completely to keep pace. The SAR product CONtrol software (SARCON) is a comprehensive SAR product control software suite tailored to the latest generation of SAR sensors. SARCON profits from the most up-to-date thinking on SAR image performance derived from other spaceborne and airborne SAR projects and is based on the newest applications. This paper gives an overview of the structure and the features of this new software tool, which is a product of a co-operation between teams at BAE SYSTEMS Advanced Technology Centre and DLR under contract to ESA (ESRIN). Work on SARCON began in 1999 and is continuing.

  18. The Ettention software package.

    PubMed

    Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp

    2016-02-01

    We present a novel software package for the problem "reconstruction from projections" in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. PMID:26686659
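
    The block-iterative methods mentioned above build on the basic Kaczmarz update, sketched below in plain NumPy. Ettention itself targets GPUs and many-core hardware; this tiny example only illustrates the core row-projection iteration, with a random matrix standing in for a projection operator.

```python
# Plain Kaczmarz iteration: cycle through rows of A and project x onto each
# row's hyperplane. Illustrative only; a random matrix stands in for the
# projection geometry of a real tomographic reconstruction.
import numpy as np

def kaczmarz(A, b, sweeps=100, relaxation=1.0):
    """Solve A x ~= b by cycling through rows and projecting onto each hyperplane."""
    x = np.zeros(A.shape[1])
    for _ in range(sweeps):
        for i in range(A.shape[0]):
            a_i = A[i]
            residual = b[i] - a_i @ x
            x += relaxation * residual / (a_i @ a_i) * a_i
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 10))
    x_true = rng.standard_normal(10)
    x_rec = kaczmarz(A, A @ x_true)
    print("reconstruction error:", np.linalg.norm(x_rec - x_true))
```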

  19. Astronomers as Software Developers

    NASA Astrophysics Data System (ADS)

    Pildis, Rachel A.

    2016-01-01

    Astronomers know that their research requires writing, adapting, and documenting computer software. Furthermore, they often have to learn new computer languages and figure out how existing programs work without much documentation or guidance and with extreme time pressure. These are all skills that can lead to a software development job, but recruiters and employers probably won't know that. I will discuss all the highly useful experience that astronomers may not know that they already have, and how to explain that knowledge to others when looking for non-academic software positions. I will also talk about some of the pitfalls I have run into while interviewing for jobs and working as a developer, and encourage you to embrace the curiosity employers might have about your non-standard background.

  20. Secure software practices among Malaysian software practitioners: An exploratory study

    NASA Astrophysics Data System (ADS)

    Mohamed, Shafinah Farvin Packeer; Baharom, Fauziah; Deraman, Aziz; Yahya, Jamaiah; Mohd, Haslina

    2016-08-01

    Secure software practices are increasingly gaining importance among software practitioners and researchers due to the rise of computer crime in the software industry. They have become one of the determining factors in producing high-quality software. Even though their importance is recognized, their practice in the software industry is still scarce, particularly in Malaysia. Thus, an exploratory study was conducted among software practitioners in Malaysia to study their experiences and practices in real-world projects. This paper discusses the findings from the study, which involved 93 software practitioners. A structured questionnaire was used for data collection, while statistical methods such as frequency, mean, and cross tabulation were used for data analysis. Outcomes from this study reveal that software practitioners are becoming increasingly aware of the importance of secure software practices; however, they lack appropriate implementation, which could affect the quality of the produced software.

  1. The EOSDIS software challenge

    NASA Astrophysics Data System (ADS)

    Jaworski, Allan

    1993-08-01

    The Earth Observing System (EOS) Data and Information System (EOSDIS) will serve as a major resource for the earth science community, supporting both command and control of complex instruments onboard the EOS spacecraft and the archiving, distribution, and analysis of data. The scale of EOSDIS and the volume of multidisciplinary research to be conducted using EOSDIS resources will produce unparalleled needs for technology transparency, data integration, and system interoperability. The scale of this effort far exceeds that of any previous scientific data system in its breadth and its operational and performance needs. Modern hardware technology can meet the EOSDIS technical challenge. Multiprocessing speeds of many gigaflops are being realized by modern computers. Online storage disk, optical disk, and videocassette libraries with storage capacities of many terabytes are now commercially available. Radio frequency and fiber optics communications networks with gigabit rates are demonstrable today. It remains, of course, to perform the system engineering to establish the requirements, architectures, and designs that will implement the EOSDIS systems. Software technology, however, has not enjoyed the price/performance advances of hardware. Although we have learned to engineer hardware systems which have several orders of magnitude greater complexity and performance than those built in the 1960's, we have not made comparable progress in dramatically reducing the cost of software development. This lack of progress may significantly reduce our capabilities to achieve economically the types of highly interoperable, responsive, integrated, and productive environments which are needed by the earth science community. This paper describes some of the EOSDIS software requirements and current activities in the software community which are applicable to meeting the EOSDIS challenge. Some of these areas include intelligent user interfaces, software reuse libraries, and domain engineering

  2. Automated software development workstation

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Engineering software development was automated using an expert system (rule-based) approach. The use of this technology offers benefits not available from current software development and maintenance methodologies. A workstation was built with a library or program data base with methods for browsing the designs stored; a system for graphical specification of designs including a capability for hierarchical refinement and definition in a graphical design system; and an automated code generation capability in FORTRAN. The workstation was then used in a demonstration with examples from an attitude control subsystem design for the space station. Documentation and recommendations are presented.

  3. Software Management System

    NASA Technical Reports Server (NTRS)

    1994-01-01

    A software management system, originally developed for Goddard Space Flight Center (GSFC) by Century Computing, Inc. has evolved from a menu and command oriented system to a state-of-the art user interface development system supporting high resolution graphics workstations. Transportable Applications Environment (TAE) was initially distributed through COSMIC and backed by a TAE support office at GSFC. In 1993, Century Computing assumed the support and distribution functions and began marketing TAE Plus, the system's latest version. The software is easy to use and does not require programming experience.

  4. Star Atlases and Software

    NASA Astrophysics Data System (ADS)

    Brazell, Owen; Argyle, R. W.

    In the 7 years or so that have passed since the first edition of this book was published, perhaps the area that has changed the most is that of charts and software. The realm of the paper chart has pretty much been taken over by software in all its guises. It would hardly have been possible to foresee 10 years ago that one could look up double stars and their data on a phone, as one can now do on many of today's smartphones. The popularity of tablets and netbooks also means that much more information is now available in the field than was before.

  5. TIA Software User's Manual

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott; Syed, Hazari I.

    1995-01-01

    This user's manual describes the installation and operation of TIA, the Thermal-Imaging acquisition and processing Application, developed by the Nondestructive Evaluation Sciences Branch at NASA Langley Research Center, Hampton, Virginia. TIA is a user-friendly graphical interface application for the Macintosh II and higher series computers. The software has been developed to interface with the Perceptics/Westinghouse Pixelpipe(TM) and PixelStore(TM) NuBus cards and the GW Instruments MacADIOS(TM) input-output (I/O) card for the Macintosh for imaging thermal data. The software is also capable of performing generic image-processing functions.

  6. Maintenance simulation: Software issues

    SciTech Connect

    Luk, C.H.; Jette, M.A.

    1995-07-01

    The maintenance of a distributed software system in a production environment involves: (1) maintaining software integrity, (2) maintaining database integrity, (3) adding new features, and (4) adding new systems. These issues will be discussed in general: what they are and how they are handled. This paper will present our experience with a distributed resource management system that accounts for resources consumed, in real time, on a network of heterogeneous computers. The simulated environments used to maintain this system will be presented as they relate to the four maintenance areas.

  7. Thermal Analysis Software

    NASA Technical Reports Server (NTRS)

    1990-01-01

    A new version of Space Payload Thermal Analyzer (SSPTA) that can be used on a 386 personal computer was developed by Nicholas Teti. SSPTA/386 software package includes the programs that Goddard has traditionally used in thermal design and analysis. The original programs were modified to run on the 386 system and automatic data transfer between programs was improved. SSPTA/386 includes all the features available in Goddard's VAX version of SSPTA. The software package is highly flexible in that it allows the user to run the programs interactively or in batch mode. It provides a menu system that allows the user to select a program or a combination of programs.

  8. Machine Tool Software

    NASA Technical Reports Server (NTRS)

    1988-01-01

    A NASA-developed software package has played a part in technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using (APT) Automatically Programmed Tool Software since 1969 in his CAD/CAM Computer Aided Design and Manufacturing curriculum. Professor Hack teaches the use of APT programming languages for control of metal cutting machines. Machine tool instructions are geometry definitions written in APT Language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  9. NASA's Software Bank (CLIPS)

    NASA Technical Reports Server (NTRS)

    1991-01-01

    C Language Integrated Production System (CLIPS), a software shell for building expert systems developed at NASA Johnson Space Center, is used by researchers at Ohio State University to determine solid waste disposal sites and to assist in historic preservation. The program has various other applications and has even been included in a widely used textbook.

  10. The FARE Software

    ERIC Educational Resources Information Center

    Pitarello, Adriana

    2015-01-01

    This article highlights the importance of immediate corrective feedback in tutorial software for language teaching in an academic learning environment. We aim to demonstrate that, rather than simply reporting on the performance of the foreign language learner, this feedback can act as a mediator of students' cognitive and metacognitive activity.…

  11. Basic Internet Software Toolkit.

    ERIC Educational Resources Information Center

    Buchanan, Larry

    1998-01-01

    Once schools are connected to the Internet, the next step is getting network workstations configured for Internet access. This article describes a basic toolkit comprising software currently available on the Internet for free or modest cost. Lists URLs for Web browser, Telnet, FTP, file decompression, portable document format (PDF) reader,…

  12. Communications Software Comparison Chart.

    ERIC Educational Resources Information Center

    Elia, Joseph J., Jr.

    1984-01-01

    This chart comparing communications software packages focuses on features important to novice users. Less obvious features covered in the chart are defined, including file transfer, protocols, baud rates, file types, capture methods, file editing, macro, tutorial unattended operation, document reading, and prices. A list of manufacturers of…

  13. Writing testable software requirements

    SciTech Connect

    Knirk, D.

    1997-11-01

    This tutorial identifies common problems in analyzing problem requirements and constructing a written specification of what the software is to do. It deals with two main problem areas: identifying and describing problem requirements, and analyzing and describing behavior specifications.

  14. Software Geometry in Simulations

    NASA Astrophysics Data System (ADS)

    Alion, Tyler; Viren, Brett; Junk, Tom

    2015-04-01

    The Long Baseline Neutrino Experiment (LBNE) involves many detectors. The experiment's near detector (ND) facility may ultimately involve several detectors. The far detector (FD) will be significantly larger than any other Liquid Argon (LAr) detector yet constructed; many prototype detectors are being constructed and studied to motivate a plethora of proposed FD designs. Whether it is a constructed prototype or a proposed ND/FD design, every design must be simulated and analyzed. This presents a considerable challenge to LBNE software experts; each detector geometry must be described to the simulation software in an efficient way that allows multiple authors to collaborate easily. Furthermore, different geometry versions must be tracked throughout their use. We present a framework called General Geometry Description (GGD), written and developed by LBNE software collaborators for managing software to generate geometries. Though GGD is flexible enough to be used by any experiment working with detectors, we present its first use in generating Geometry Description Markup Language (GDML) files to interface with LArSoft, a framework of detector simulations, event reconstruction, and data analyses written for all LAr technology users at Fermilab. Brett is the author of the framework discussed here, the General Geometry Description (GGD).
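
    The abstract above describes generating detector geometries as GDML for LArSoft. As a rough, hedged illustration of the underlying idea only (this is not the GGD API, and the material and volume names are invented for the example), the following Python sketch emits a minimal GDML-style XML description of a single box volume:

        # Sketch: emit a minimal GDML-style geometry description for one box volume.
        # Not the GGD framework; element layout is simplified and names are illustrative.
        import xml.etree.ElementTree as ET

        def make_box_geometry(name="world", dx=100.0, dy=100.0, dz=100.0):
            gdml = ET.Element("gdml")
            solids = ET.SubElement(gdml, "solids")
            ET.SubElement(solids, "box", name=f"{name}_solid",
                          x=str(dx), y=str(dy), z=str(dz), lunit="cm")
            structure = ET.SubElement(gdml, "structure")
            volume = ET.SubElement(structure, "volume", name=f"{name}_volume")
            ET.SubElement(volume, "materialref", ref="Air")        # assumed material name
            ET.SubElement(volume, "solidref", ref=f"{name}_solid")
            setup = ET.SubElement(gdml, "setup", name="Default", version="1.0")
            ET.SubElement(setup, "world", ref=f"{name}_volume")
            return ET.ElementTree(gdml)

        if __name__ == "__main__":
            make_box_geometry().write("world.gdml", xml_declaration=True, encoding="utf-8")

    Generating the description programmatically, rather than editing XML by hand, is what makes the geometry reproducible and versionable, which is the collaboration problem the abstract highlights.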

  15. JSATS Decoder Software Manual

    SciTech Connect

    Flory, Adam E.; Lamarche, Brian L.; Weiland, Mark A.

    2013-05-01

    The Juvenile Salmon Acoustic Telemetry System (JSATS) Decoder is a software application that converts a digitized acoustic signal (a waveform stored in the .bwm file format) into a list of potential JSATS Acoustic MicroTransmitter (AMT) tagcodes, along with other data about the signal including time of arrival and signal-to-noise ratios (SNR). The software is capable of decoding single files or entire directories and of viewing raw acoustic waveforms. When coupled with the JSATS Detector, the Decoder is capable of decoding in 'real-time' and can also provide statistical information about acoustic beacons placed within receive range of hydrophones within a JSATS array. This document details the features and functionality of the software. The document begins with software installation instructions (section 2), followed in order by instructions for decoder setup (section 3), decoding process initiation (section 4), and monitoring of beacons (section 5) using real-time decoding features. The last section in the manual describes the beacon, beacon statistics, and results file formats. This document does not cover the raw binary waveform file format.

  16. MRDIS Simulation Software

    SciTech Connect

    Pete Humphrey, Charles Babb

    2012-01-05

    The MRDIS Simulator is a software application that duplicates the TCP/IP output normally produced by a Mobile Radiation Detection and Identification System (MRDIS) radiation detector. The output simulates the data stream from TSA radiation detectors plus OCR data from AsiaVision OCR systems (used by the actual MRDIS).

  17. Software reliability report

    NASA Technical Reports Server (NTRS)

    Wilson, Larry

    1991-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Unfortunately, the models appear to be unable to account for the random nature of the data. If the same code is debugged multiple times and one of the models is used to make predictions, intolerable variance is observed in the resulting reliability predictions. It is believed that data replication can remove this variance in lab-type situations and that it is less than scientific to talk about validating a software reliability model without considering replication. It is also believed that data replication may prove to be cost effective in the real world; the research therefore centered on verifying the need for replication and on methodologies for generating replicated data in a cost-effective manner. The debugging-graph approach was pursued through simulation and experimentation. Simulation was done for the Basic model and the Log-Poisson model. Reasonable values of the parameters were assigned and used to generate simulated data, which was then processed by the models in order to determine limitations on their accuracy. These experiments exploit the existing software and program specimens in AIR-LAB to measure the performance of reliability models.
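
    To make the replication argument concrete, here is a minimal sketch (with assumed, illustrative parameter values; a Jelinski-Moranda-style hazard stands in for the Basic model) of how replicated inter-failure data can be simulated for the two model families named above:

        # Sketch: generate replicated inter-failure data under two common reliability
        # models. Parameter values are assumed for illustration only.
        import math
        import random

        def basic_model_times(n_faults=50, phi=0.02, rng=random):
            """Jelinski-Moranda-style hazard: rate = phi * (remaining faults)."""
            return [rng.expovariate(phi * (n_faults - i)) for i in range(n_faults)]

        def log_poisson_times(lam0=10.0, theta=0.05, n_events=50, rng=random):
            """Log-Poisson (Musa-Okumoto) NHPP: invert the mean-value function."""
            m, times = 0.0, []
            for _ in range(n_events):
                m += rng.expovariate(1.0)          # unit-rate arrivals in transformed time
                times.append((math.exp(theta * m) - 1.0) / (lam0 * theta))
            return times

        if __name__ == "__main__":
            totals = [sum(basic_model_times()) for _ in range(20)]
            print("total debug time over 20 replications ranges from",
                  round(min(totals), 1), "to", round(max(totals), 1))

    Running many such replications exposes the spread that a single debugging history hides, which is the variance the paragraph above refers to.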

  18. No Fail Software.

    ERIC Educational Resources Information Center

    Buckleitner, Warren

    1996-01-01

    Recommends and describes specific software programs for children based on a review of 1800 titles. Includes recommendations for preschoolers in the areas of language, math, logic, science, art, creativity, and all-purpose. Presents choices for early elementary students including reading, writing, math, logic, science, social studies, creativity,…

  19. Iterative software kernels

    SciTech Connect

    Duff, I.

    1994-12-31

    This workshop focuses on kernels for iterative software packages. Specifically, the three speakers discuss various aspects of sparse BLAS kernels. Their topics are: "Current status of user-level sparse BLAS"; "Current status of the sparse BLAS toolkit"; and "Adding matrix-matrix and matrix-matrix-matrix multiply to the sparse BLAS toolkit".
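
    As a point of reference for what a sparse BLAS kernel does, here is a minimal sketch (plain Python, not tied to any particular toolkit) of a sparse matrix-vector product with the matrix stored in compressed sparse row (CSR) form:

        # Sketch: y = A @ x with A held in compressed sparse row (CSR) form.
        def csr_matvec(indptr, indices, data, x):
            y = [0.0] * (len(indptr) - 1)
            for row in range(len(y)):
                s = 0.0
                for k in range(indptr[row], indptr[row + 1]):
                    s += data[k] * x[indices[k]]
                y[row] = s
            return y

        # 2x2 example: A = [[2, 0], [1, 3]]
        print(csr_matvec([0, 1, 3], [0, 0, 1], [2.0, 1.0, 3.0], [1.0, 1.0]))  # [2.0, 4.0]

    Production sparse BLAS kernels add blocking, data-type generality, and the matrix-matrix variants listed in the workshop topics.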

  20. Engaging New Software.

    ERIC Educational Resources Information Center

    Allen, Denise

    1994-01-01

    Reviews three educational computer software products: (1) a compact disc-read only memory (CD-ROM) bundle of five mathematics programs from the Apple Education Series; (2) "Sammy's Science House," with science activities for preschool through second grade (Edmark); and (3) "The Cat Came Back," an interactive CD-ROM game designed to build language…

  1. Software Tools: EPICUR.

    ERIC Educational Resources Information Center

    Abreu, Jose Luis; And Others

    EPICUR (Integrated Programing Environment for the Development of Educational Software) is a set of programming modules ranging from low level interfaces to high level algorithms aimed at the development of computer-assisted instruction (CAI) applications. The emphasis is on user-friendly interfaces and on multiplying productivity without loss of…

  2. Software Carpentry: lessons learned

    PubMed Central

    Wilson, Greg

    2016-01-01

    Since its start in 1998, Software Carpentry has evolved from a week-long training course at the US national laboratories into a worldwide volunteer effort to improve researchers' computing skills. This paper explains what we have learned along the way, the challenges we now face, and our plans for the future. PMID:24715981

  3. Software for the Classroom.

    ERIC Educational Resources Information Center

    Isenberg, Joan

    1987-01-01

    Reviews three computer software programs for classroom use: Muppet Word Book, for children three to six; Dragon Game Series, on parts of speech, for students in grades four through high school; and Chemistry Achievement I, for students who are 16 to 18 years old, or in grades 10 to 12. (BB)

  4. Software management issues

    SciTech Connect

    Kunz, P.F.

    1990-06-01

    The difficulty of managing the software in large HEP collaborations appears to be growing progressively worse with each new generation of detector. If one extrapolates to the SSC, it will become a major problem. This paper explores the possible causes of the difficulty and makes suggestions on what corrective actions should be taken.

  5. Software engineering tools.

    PubMed

    Wear, L L; Pinkert, J R

    1994-01-01

    We have looked at general descriptions and illustrations of several software development tools, such as tools for prototyping, developing DFDs, testing, and maintenance. Many others are available, and new ones are being developed. However, you have at least seen some examples of powerful CASE tools for systems development.

  6. Generic Kalman Filter Software

    NASA Technical Reports Server (NTRS)

    Lisano, Michael E., II; Crues, Edwin Z.

    2005-01-01

    The Generic Kalman Filter (GKF) software provides a standard basis for the development of application-specific Kalman-filter programs. Historically, Kalman filters have been implemented by customized programs that must be written, coded, and debugged anew for each unique application, then tested and tuned with simulated or actual measurement data. Total development times for typical Kalman-filter application programs have ranged from weeks to months. The GKF software can simplify the development process and reduce the development time by eliminating the need to re-create the fundamental implementation of the Kalman filter for each new application. The GKF software is written in the ANSI C programming language. It contains a generic Kalman-filter-development directory that, in turn, contains code for a generic Kalman filter function; more specifically, it contains a generically designed and generically coded implementation of linear, linearized, and extended Kalman filtering algorithms, including algorithms for state- and covariance-update and -propagation functions. The mathematical theory that underlies the algorithms is well known and has been reported extensively in the open technical literature. Also contained in the directory are a header file that defines generic Kalman-filter data structures and prototype functions and template versions of application-specific subfunction and calling navigation/estimation routine code and headers. Once the user has provided a calling routine and the required application-specific subfunctions, the application-specific Kalman-filter software can be compiled and executed immediately. During execution, the generic Kalman-filter function is called from a higher-level navigation or estimation routine that preprocesses measurement data and post-processes output data. The generic Kalman-filter function uses the aforementioned data structures and five implementation-specific subfunctions, which have been developed by the user on
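
    As a hedged illustration of the algorithms the abstract mentions (this is not the GKF C API; the function names and the toy constant-velocity measurement model are invented for the example), one propagate/update cycle of a linear Kalman filter looks like this:

        # Sketch of linear Kalman-filter state/covariance propagation and measurement
        # update; illustrative only, with an assumed 1-D constant-velocity model.
        import numpy as np

        def propagate(x, P, F, Q):
            """State and covariance propagation: x <- F x, P <- F P F^T + Q."""
            return F @ x, F @ P @ F.T + Q

        def update(x, P, z, H, R):
            """Measurement update with gain K = P H^T (H P H^T + R)^-1."""
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ (z - H @ x)
            P = (np.eye(len(x)) - K @ H) @ P
            return x, P

        F = np.array([[1.0, 1.0], [0.0, 1.0]])   # position/velocity state, dt = 1
        H = np.array([[1.0, 0.0]])               # measure position only
        x, P = np.zeros(2), np.eye(2)
        x, P = propagate(x, P, F, Q=0.01 * np.eye(2))
        x, P = update(x, P, z=np.array([1.2]), H=H, R=np.array([[0.5]]))
        print(x)

    A generic library such as GKF packages these steps once and leaves only the mission-specific models (F, H, Q, R, and the user-supplied subfunctions the abstract mentions) to each application.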

  7. Self-assembling software generator

    DOEpatents

    Bouchard, Ann M.; Osbourn, Gordon C.

    2011-11-25

    A technique to generate an executable task includes inspecting a task specification data structure to determine what software entities are to be generated to create the executable task, inspecting the task specification data structure to determine how the software entities will be linked after generating the software entities, inspecting the task specification data structure to determine logic to be executed by the software entities, and generating the software entities to create the executable task.

  8. Flight Software Math Library

    NASA Technical Reports Server (NTRS)

    McComas, David

    2013-01-01

    The flight software (FSW) math library is a collection of reusable math components that provides typical math utilities required by spacecraft flight software. These utilities are intended to increase flight software quality, reusability, and maintainability by providing a set of consistent, well-documented, and tested math utilities. This library only has dependencies on ANSI C, so it is easily ported. Prior to this library, each mission typically created its own math utilities using ideas/code from previous missions. Part of the reason for this is that math libraries can be written with different strategies in areas like error handling, parameter ordering, naming conventions, etc. Changing the utilities for each mission introduces risks and costs. The obvious risks and costs are that the utilities must be coded and revalidated. The hidden risks and costs arise in miscommunication between engineers. These utilities must be understood by both the flight software engineers and other subsystem engineers (primarily guidance, navigation, and control). The FSW math library is part of a larger goal to produce a library of reusable Guidance, Navigation, and Control (GN&C) FSW components. A GN&C FSW library cannot be created unless a standardized math basis is created. This library solves the standardization problem by defining a common feature set and establishing policies for the library's design. This allows the library to be maintained with the same strategy used in its initial development, which supports a library of reusable GN&C FSW components. The FSW math library is written for an embedded software environment in C. This places restrictions on the language features that can be used by the library. Another advantage of the FSW math library is that it can be used in the FSW as well as in other environments like the GN&C analysts' simulators. This helps communication between the teams because they can use the same utilities with the same feature set and syntax.

  9. Product-oriented Software Certification Process for Software Synthesis

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil

    2004-01-01

    The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.

  10. Statistical modeling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1992-01-01

    This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.

  11. Software Configurable Multichannel Transceiver

    NASA Technical Reports Server (NTRS)

    Freudinger, Lawrence C.; Cornelius, Harold; Hickling, Ron; Brooks, Walter

    2009-01-01

    Emerging test instrumentation and test scenarios increasingly require network communication to manage complexity. Adapting wireless communication infrastructure to accommodate challenging testing needs can benefit from reconfigurable radio technology. A fundamental requirement for a software-definable radio system is independence from carrier frequencies, one of the radio components that to date has seen only limited progress toward programmability. This paper overviews an ongoing project to validate the viability of a promising chipset that performs conversion of radio frequency (RF) signals directly into digital data for the wireless receiver and, for the transmitter, converts digital data into RF signals. The Software Configurable Multichannel Transceiver (SCMT) enables four transmitters and four receivers in a single unit the size of a commodity disk drive, programmable for any frequency band between 1 MHz and 6 GHz.

  12. Chemical recognition software

    SciTech Connect

    Wagner, J.S.; Trahan, M.W.; Nelson, W.E.; Hargis, P.H. Jr.; Tisone, G.C.

    1994-06-01

    We have developed a capability to make real time concentration measurements of individual chemicals in a complex mixture using a multispectral laser remote sensing system. Our chemical recognition and analysis software consists of three parts: (1) a rigorous multivariate analysis package for quantitative concentration and uncertainty estimates, (2) a genetic optimizer which customizes and tailors the multivariate algorithm for a particular application, and (3) an intelligent neural net chemical filter which pre-selects from the chemical database to find the appropriate candidate chemicals for quantitative analyses by the multivariate algorithms, as well as providing a quick-look concentration estimate and consistency check. Detailed simulations using both laboratory fluorescence data and computer synthesized spectra indicate that our software can make accurate concentration estimates from complex multicomponent mixtures, even when the mixture is noisy and contaminated with unknowns.
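
    As a hedged illustration of the core multivariate step only (the actual package adds a genetic optimizer and a neural-net pre-filter, neither of which is shown, and the synthetic spectra below are invented), component concentrations can be estimated by least-squares fitting a mixture spectrum to known pure-component spectra:

        # Sketch: least-squares unmixing of a noisy mixture spectrum against a small
        # library of synthetic pure-component spectra (illustrative data only).
        import numpy as np

        rng = np.random.default_rng(0)
        wavelengths = np.linspace(300.0, 400.0, 200)

        def peak(center, width):
            return np.exp(-((wavelengths - center) / width) ** 2)

        S = np.column_stack([peak(330, 8), peak(350, 10), peak(370, 6)])   # library spectra
        true_c = np.array([0.5, 1.5, 0.8])
        mixture = S @ true_c + 0.02 * rng.standard_normal(len(wavelengths))

        c_hat, *_ = np.linalg.lstsq(S, mixture, rcond=None)
        print("estimated concentrations:", np.round(c_hat, 2))

    The residual of such a fit is one way to obtain the kind of consistency check and uncertainty estimate the abstract describes.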

  13. Chemical recognition software

    SciTech Connect

    Wagner, J.S.; Trahan, M.W.; Nelson, W.E.; Hargis, P.J. Jr.; Tisone, G.C.

    1994-12-01

    We have developed a capability to make real time concentration measurements of individual chemicals in a complex mixture using a multispectral laser remote sensing system. Our chemical recognition and analysis software consists of three parts: (1) a rigorous multivariate analysis package for quantitative concentration and uncertainty estimates, (2) a genetic optimizer which customizes and tailors the multivariate algorithm for a particular application, and (3) an intelligent neural net chemical filter which pre-selects from the chemical database to find the appropriate candidate chemicals for quantitative analyses by the multivariate algorithms, as well as providing a quick-look concentration estimate and consistency check. Detailed simulations using both laboratory fluorescence data and computer synthesized spectra indicate that our software can make accurate concentration estimates from complex multicomponent mixtures, even when the mixture is noisy and contaminated with unknowns.

  14. Checking software contracts

    SciTech Connect

    Mitchell, R.; Maung, I.; Howse, J.; Heathcote, T.

    1995-12-31

    In object-oriented software, contracts between classes can be expressed in terms of preconditions, postconditions and invariants. In the programming language Eiffel, contracts can be checked at run-time. Within inheritance hierarchies, contracts can be used to control the redefinition of services. In Eiffel, the rules ensure that redefinition is safe in the presence of polymorphism and dynamic binding. This paper shows that Eiffel's support for contracts can be extended, both to cover the polymorphic case more fully, and to cover other uses of inheritance, including its use for selectively reusing code from a parent class. The paper proposes that software designers should state what properties they claim for any uses of inheritance, in order that extra checks on consistency can be applied, and shows that some extra debugging power could be obtained easily, by changing only the run-time system for Eiffel, rather than the language itself.
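
    For readers unfamiliar with design by contract, here is a minimal sketch in Python (not Eiffel, which supports contracts at the language level; the Account example and decorator are invented for illustration) of checking preconditions and postconditions at run time:

        # Sketch: run-time precondition/postcondition checking via a decorator.
        # Illustrative only; Eiffel expresses the same idea with require/ensure clauses.
        def contract(pre=lambda *a, **k: True, post=lambda result, *a, **k: True):
            def wrap(fn):
                def checked(*args, **kwargs):
                    assert pre(*args, **kwargs), f"precondition violated in {fn.__name__}"
                    result = fn(*args, **kwargs)
                    assert post(result, *args, **kwargs), f"postcondition violated in {fn.__name__}"
                    return result
                return checked
            return wrap

        class Account:
            def __init__(self, balance=0.0):
                self.balance = balance

            @contract(pre=lambda self, amount: amount > 0,
                      post=lambda result, self, amount: self.balance >= 0)
            def withdraw(self, amount):
                self.balance -= amount
                return self.balance

        acct = Account(10.0)
        acct.withdraw(5.0)      # satisfies both checks
        # acct.withdraw(20.0)   # would violate the postcondition (balance >= 0)

    Eiffel additionally constrains how a redefining class may weaken preconditions and strengthen postconditions, which is the inheritance issue the paper extends.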

  15. Software error detection

    NASA Technical Reports Server (NTRS)

    Buechler, W.; Tucker, A. G.

    1981-01-01

    Several methods were employed to detect both the occurrence and source of errors in the operational software of the AN/SLQ-32, a large embedded real-time electronic warfare command and control system for the ROLM 1606 computer. The ROLM computer provides information about invalid addressing, improper use of privileged instructions, stack overflows, and unimplemented instructions. Additionally, software techniques were developed to detect invalid jumps, indices out of range, infinite loops, stack underflows, and field size errors. Finally, data are saved to provide information about the status of the system when an error is detected. This information includes I/O buffers, interrupt counts, stack contents, and recently passed locations. The various errors detected, techniques to assist in debugging problems, and segment simulation on a non-target computer are discussed. These error detection techniques were a major factor in the success of finding the primary cause of error in 98% of over 500 system dumps.

  16. Unified Parallel Software

    SciTech Connect

    McKay, Mike

    2003-12-01

    UPS (Unified Parallel Software) is a collection of software tools (libraries, scripts, and executables) that assist in parallel programming. It consists of:
    o libups.a - C/Fortran callable routines for message passing (utilities written on top of MPI) and file I/O (utilities written on top of HDF).
    o libuserd-HDF.so - an EnSight user-defined reader for visualizing data files written with UPS file I/O.
    o ups_libuserd_query, ups_libuserd_prep.pl, ups_libuserd_script.pl - executables/scripts to get information from data files and to simplify the use of EnSight on those data files.
    o ups_io_rm/ups_io_cp - manipulate data files written with UPS file I/O.
    These tools are portable to a wide variety of Unix platforms.

  17. Unified Parallel Software

    2003-12-01

    UPS (Unified Parallel Software) is a collection of software tools (libraries, scripts, and executables) that assist in parallel programming. It consists of:
    o libups.a - C/Fortran callable routines for message passing (utilities written on top of MPI) and file I/O (utilities written on top of HDF).
    o libuserd-HDF.so - an EnSight user-defined reader for visualizing data files written with UPS file I/O.
    o ups_libuserd_query, ups_libuserd_prep.pl, ups_libuserd_script.pl - executables/scripts to get information from data files and to simplify the use of EnSight on those data files.
    o ups_io_rm/ups_io_cp - manipulate data files written with UPS file I/O.
    These tools are portable to a wide variety of Unix platforms.

  18. TOUGH2 software qualification

    SciTech Connect

    Pruess, K.; Simmons, A.; Wu, Y.S.; Moridis, G.

    1996-02-01

    TOUGH2 is a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media. It belongs to the MULKOM ("MULti-KOMponent") family of codes and is a more general version of the TOUGH simulator. The MULKOM family of codes was originally developed with a focus on geothermal reservoir simulation. They are suited to modeling systems which contain different fluid mixtures, with applications to flow problems arising in the context of high-level nuclear waste isolation, oil and gas recovery and storage, and groundwater resource protection. TOUGH2 is essentially a subset of MULKOM, consisting of a selection of the better tested and documented MULKOM program modules. The purpose of this package of reports is to provide all software baseline documents necessary for the software qualification of TOUGH2.

  19. Antenna Controller Replacement Software

    NASA Technical Reports Server (NTRS)

    Chao, Roger Y.; Morgan, Scott C.; Strain, Martha M.; Rockwell, Stephen T.; Shimizu, Kenneth J.; Tehrani, Barzia J.; Kwok, Jaclyn H.; Tuazon-Wong, Michelle; Valtier, Henry; Nalbandi, Reza; Wert, Michael; Leung, Patrick

    2010-01-01

    The Antenna Controller Replacement (ACR) software accurately points and monitors the Deep Space Network (DSN) 70-m and 34-m high-efficiency (HEF) ground-based antennas that are used to track primarily spacecraft and, periodically, celestial targets. To track a spacecraft, or other targets, the antenna must be accurately pointed at the spacecraft, which can be very far away with very weak signals. ACR's conical scanning capability collects the signal in a circular pattern around the target, calculates the location of the strongest signal, and adjusts the antenna pointing to point directly at the spacecraft. A real-time, closed-loop servo control algorithm performed every 0.02 second allows accurate positioning of the antenna in order to track these distant spacecraft. Additionally, this advanced servo control algorithm provides better antenna pointing performance in windy conditions. The ACR software provides high-level commands that provide a very easy user interface for the DSN operator. The operator only needs to enter two commands to start the antenna and subreflector, and Master Equatorial tracking. The most accurate antenna pointing is accomplished by aligning the antenna to the Master Equatorial, which because of its small size and sheltered location, has the most stable pointing. The antenna has hundreds of digital and analog monitor points. The ACR software provides compact displays to summarize the status of the antenna, subreflector, and the Master Equatorial. The ACR software has two major functions. First, it performs all of the steps required to accurately point the antenna (and subreflector and Master Equatorial) at the spacecraft (or celestial target). This involves controlling the antenna/subreflector/Master-Equatorial hardware, initiating and monitoring the correct sequence of operations, calculating the position of the spacecraft relative to the antenna, executing the real-time servo control algorithm to maintain the correct position, and
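
    As a hedged sketch of the conical-scan idea only (the gains, axes, and simulated scan below are assumed; this is not the ACR algorithm), the pointing error can be estimated from the first harmonic of the power modulation observed around the scan circle:

        # Sketch: estimate the direction of the strongest signal from power samples
        # taken around a small circle about boresight. Illustrative only.
        import math

        def pointing_offset(scan_angles, powers):
            n = len(powers)
            c = sum(p * math.cos(a) for p, a in zip(powers, scan_angles)) * 2.0 / n
            s = sum(p * math.sin(a) for p, a in zip(powers, scan_angles)) * 2.0 / n
            return c, s    # components of the first harmonic of the modulation

        angles = [2 * math.pi * k / 16 for k in range(16)]
        # Simulated scan with the true peak toward scan angle 30 degrees
        powers = [1.0 + 0.2 * math.cos(a - math.radians(30)) for a in angles]
        dx, dy = pointing_offset(angles, powers)
        print(round(math.degrees(math.atan2(dy, dx)), 1))   # ~30.0 degrees

    In a conical-scan system, an estimate of this kind feeds the pointing correction applied by the closed-loop servo.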

  20. The ALMA Software System

    NASA Astrophysics Data System (ADS)

    Schwarz, J.; Sommer, H.; Farris, A.

    2004-07-01

    Prospective users, instrumentation and location of the Atacama Large Millimeter Array (ALMA) all present its software developers with major challenges. The development of this software will be distributed among many institutes on two continents, mimicking the software itself, which will have to function in a distributed environment, spanning the 0.5-10 km baselines between antennas, as well as the much larger distances that will separate the array site at the 5000m-high Llano de Chajnantor, the Operations Support Facility in San Pedro de Atacama, the Santiago Central Office, and the ALMA Regional Centers in North America and Europe. To make distributed development successful, we have defined interfaces that allow separated groups to work independently of their counterparts at other locations as much as possible. We have defined a common architecture and infrastructure, so that work done at one location is not unnecessarily duplicated at another, and that similar tasks are done in a similar way throughout the project. A single, integrated Archive attends to the needs of all subsystems for persistent storage, and hides details of the underlying database technology. The separation of functional from technical concerns is built into the system architecture through the use of the Container-Component model: application developers can concentrate on implementing functionality in runtime-deployable components, which in turn depend on Containers to provide them with services such as access to remote resources, transparent serialization of value objects to XML, logging, error-handling and security. The resulting middleware, which forms part of the ALMA Common Software (ACS), is based on CORBA and XML.

  1. Addressing Software Security

    NASA Technical Reports Server (NTRS)

    Bailey, Brandon

    2015-01-01

    Historically, security within organizations was thought of as an IT function (web sites/servers, email, workstation patching, etc.). The threat landscape has evolved (script kiddies, hackers, advanced persistent threats (APT), nation states, etc.) and the attack surface has expanded as networks have become interconnected. Some security posture factors include the network layer (routers, firewalls, etc.), computer network defense (IPS/IDS, sensors, continuous monitoring, etc.), industrial control systems (ICS), and software security (COTS, FOSS, custom, etc.).

  2. Aviation Design Software

    NASA Technical Reports Server (NTRS)

    1997-01-01

    DARcorporation developed a General Aviation CAD package through a Small Business Innovation Research contract from Langley Research Center. This affordable, user-friendly preliminary design system for General Aviation aircraft runs on the popular 486 IBM-compatible personal computers. Individuals taking the home-built approach, small manufacturers of General Aviation airplanes, as well as students and others interested in the analysis and design of aircraft are possible users of the package. The software can cut design and development time in half.

  3. Computer software documentation

    NASA Technical Reports Server (NTRS)

    Comella, P. A.

    1973-01-01

    A tutorial in the documentation of computer software is presented. It presents a methodology for achieving an adequate level of documentation as a natural outgrowth of the total programming effort commencing with the initial problem statement and definition and terminating with the final verification of code. It discusses the content of adequate documentation, the necessity for such documentation and the problems impeding achievement of adequate documentation.

  4. Image Processing Software

    NASA Technical Reports Server (NTRS)

    1992-01-01

    To convert raw data into environmental products, the National Weather Service and other organizations use the Global 9000 image processing system marketed by Global Imaging, Inc. The company's GAE software package is an enhanced version of the TAE, developed by Goddard Space Flight Center to support remote sensing and image processing applications. The system can be operated in three modes and is combined with HP Apollo workstation hardware.

  5. Software sensors for bioprocesses.

    PubMed

    Bogaerts, Ph; Vande Wouwer, A

    2003-10-01

    State estimation is a significant problem in biotechnological processes, due to the general lack of hardware sensor measurements of the variables describing the process dynamics. The objective of this paper is to review a number of software sensor design methods, including extended Kalman filters, receding-horizon observers, asymptotic observers, and hybrid observers, which can be efficiently applied to bioprocesses. These methods are illustrated with simulation and real-life case studies.

  6. NASA's Software Bank (NETS)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    NETS (A Neural Network Development Tool) is a software system for mimicking the human brain. It is used in a University of Arkansas project in pattern matching of chemical systems. If successful, chemists would be able to identify mixtures of compounds without long and costly separation procedures. Using NETS, the group has trained the computer to recognize pattern relationships in a known compound and associate the results to an unknown compound. The research appears to be promising.

  7. System For Retrieving Reusable Software

    NASA Technical Reports Server (NTRS)

    Van Warren, Lloyd; Beckman, Brian C.

    1993-01-01

    Encyclopedia of Software Components (ESC) is information-retrieval system of computer hardware and software providing access to generic reusable software tools and parts. Core of ESC is central tool base, which is repository of reusable software. It receives queries and submissions from user through local browser subsystem and receives authorized updates from maintenance subsystem. Sends retrievals to local browser subsystem and user's submissions to maintenance subsystem. Future versions will provide for advanced media, including voice and video, and will link system to database-management system. Programmers will not only retrieve software, but also modify, execute, and cross-link with other software.

  8. Software Safety Progress in NASA

    NASA Technical Reports Server (NTRS)

    Radley, Charles F.

    1995-01-01

    NASA has developed guidelines for development and analysis of safety-critical software. These guidelines have been documented in a Guidebook for Safety Critical Software Development and Analysis. The guidelines represent a practical 'how to' approach to assist software developers and safety analysts in cost-effective methods for software safety. They provide guidance in the implementation of the recent NASA Software Safety Standard NSS-1740.13, which was released as an 'Interim' version in June 1994 and scheduled for formal adoption in late 1995. This paper is a survey of the methods in general use, resulting in the NASA guidelines for safety-critical software development and analysis.

  9. Wildlife software: procedures for publication of computer software

    USGS Publications Warehouse

    Samuel, M.D.

    1990-01-01

    Computers and computer software have become an integral part of the practice of wildlife science. Computers now play an important role in teaching, research, and management applications. Because of the specialized nature of wildlife problems, specific computer software is usually required to address a given problem (e.g., home range analysis). This type of software is not usually available from commercial vendors and therefore must be developed by those wildlife professionals with particular skill in computer programming. Current journal publication practices generally prevent a detailed description of computer software associated with new techniques. In addition, peer review of journal articles does not usually include a review of associated computer software. Thus, many wildlife professionals are usually unaware of computer software that would meet their needs or of major improvements in software they commonly use. Indeed most users of wildlife software learn of new programs or important changes only by word of mouth.

  10. Modular Software Performance Monitoring

    NASA Astrophysics Data System (ADS)

    Kruse, Daniele Francesco; Kruzelecki, Karol

    2011-12-01

    CPU clock frequency is not likely to increase significantly in the coming years, and data analysis speed can be improved by using more processors or buying new machines only if one is willing to change the programming paradigm to a parallel one. Therefore, performance monitoring procedures and tools are needed to help programmers optimize existing software running on current and future hardware. Low-level information from hardware performance counters is vital to spot specific performance problems slowing program execution. HEP software is often huge and complex, and existing tools are unable to give results with the required granularity. We report on the approach we have chosen to solve this problem, which involves decomposing the application into parts and monitoring each of them separately. Both counting and sampling methods are used to allow analysis with the required custom granularity: from the global level down to the function level. A set of tools (based on perfmon2, a software interface to hardware counters) for CMSSW, Gaudi and Geant4 has been developed and deployed. We show how this type of analysis has proven useful in spotting specific performance problems and effective in helping with code optimization.

  11. Peppy: Proteogenomic Search Software

    PubMed Central

    Risk, Brian A.; Spitzer, Wendy J.; Giddings, Morgan C.

    2014-01-01

    Proteogenomic searching is a useful method for identifying novel proteins, annotating genes and detecting peptides unique to an individual genome. The approach, however, can be laborious, as it often requires search segmentation and the use of several unintegrated tools. Furthermore, many proteogenomic efforts have been limited to small genomes, as large genomes can prove impractical due to the required amount of computer memory and computation time. We present Peppy, a software tool designed to perform every necessary task of proteogenomic searches quickly, accurately and automatically. The software generates a peptide database from a genome, tracks peptide loci, matches peptides to MS/MS spectra and assigns confidence values to those matches. Peppy automatically performs a decoy database generation, search and analysis to return identifications at the desired false discovery rate threshold. Written in Java for cross-platform execution, the software is fully multithreaded for enhanced speed. The program can run on regular desktop computers, opening the doors of proteogenomic searching to a wider audience of proteomics and genomics researchers. Peppy is available at http://geneffects.com/peppy. PMID:23614390
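
    To make one step of such a pipeline concrete, here is a hedged sketch (not Peppy's implementation; the cleavage rule and length cutoffs are simplified assumptions) of enumerating tryptic peptides from a translated protein sequence:

        # Sketch: in-silico tryptic digestion (cleave after K or R, but not before P).
        # Peppy's actual database construction, locus tracking, and scoring are richer.
        def tryptic_peptides(protein, min_len=6, max_len=30):
            peptides, start = [], 0
            for i, aa in enumerate(protein):
                cleave = aa in "KR" and (i + 1 == len(protein) or protein[i + 1] != "P")
                if cleave or i + 1 == len(protein):
                    pep = protein[start:i + 1]
                    if min_len <= len(pep) <= max_len:
                        peptides.append(pep)
                    start = i + 1
            return peptides

        print(tryptic_peptides("MKWVTFISLLLLFSSAYSRGVFRRDTHKSEIAHRFK"))

    In a proteogenomic search this kind of enumeration is applied to translations of the whole genome, which is what drives the memory and run-time costs the abstract mentions.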

  12. The ALMA software architecture

    NASA Astrophysics Data System (ADS)

    Schwarz, Joseph; Farris, Allen; Sommer, Heiko

    2004-09-01

    The software for the Atacama Large Millimeter Array (ALMA) is being developed by many institutes on two continents. The software itself will function in a distributed environment, from the 0.5-14 km baselines that separate antennas to the larger distances that separate the array site at the Llano de Chajnantor in Chile from the operations and user support facilities in Chile, North America and Europe. Distributed development demands 1) interfaces that allow separated groups to work with minimal dependence on their counterparts at other locations; and 2) a common architecture to minimize duplication and ensure that developers can always perform similar tasks in a similar way. The Container/Component model provides a blueprint for the separation of functional from technical concerns: application developers concentrate on implementing functionality in Components, which depend on Containers to provide them with services such as access to remote resources, transparent serialization of entity objects to XML, logging, error handling and security. Early system integrations have verified that this architecture is sound and that developers can successfully exploit its features. The Containers and their services are provided by a system-oriented development team as part of the ALMA Common Software (ACS), middleware that is based on CORBA.
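
    As a hedged, language-agnostic sketch of the Container/Component idea only (this is not the ACS API; the class and component names are invented), a container can supply technical services so that a component holds only functional code:

        # Sketch: a container provides a shared service (logging); components are
        # deployed into it and use that service. Conceptual illustration only.
        import logging

        class Container:
            """Hosts components and provides them with common services."""
            def __init__(self, name):
                logging.basicConfig(level=logging.INFO)
                self.logger = logging.getLogger(name)
                self._components = {}

            def deploy(self, name, component_cls):
                component = component_cls(services=self)
                self._components[name] = component
                self.logger.info("deployed component %s", name)
                return component

        class CorrelatorComponent:
            """Functional code only; technical concerns come from the container."""
            def __init__(self, services):
                self.services = services

            def configure(self, n_antennas):
                self.services.logger.info("configuring correlator for %d antennas", n_antennas)

        container = Container("example-container")
        container.deploy("correlator", CorrelatorComponent).configure(50)

    The separation means application developers need not touch remote-access, serialization, or error-handling plumbing directly, which is the point the abstract makes about ACS.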

  13. Balloon Design Software

    NASA Technical Reports Server (NTRS)

    Farley, Rodger

    2007-01-01

    PlanetaryBalloon Version 5.0 is a software package for the design of meridionally lobed planetary balloons. It operates in a Windows environment, and programming was done in Visual Basic 6. By including the effects of circular lobes with load tapes, skin mass, hoop and meridional stress, and elasticity in the structural elements, a more accurate balloon shape of practical construction can be determined as well as the room-temperature cut pattern for the gore shapes. The computer algorithm is formulated for sizing meridionally lobed balloons for any generalized atmosphere or planet. This also covers zero-pressure, over-pressure, and super-pressure balloons. Low circumferential loads with meridionally reinforced load tapes will produce shapes close to what are known as the "natural shape." The software allows for the design of constant angle, constant radius, or constant hoop stress balloons. It uses the desired payload capacity for given atmospheric conditions and determines the required volume, allowing users to design exactly to their requirements. The formulations are generalized to use any lift gas (or mixture of gases), any atmosphere, or any planet as described by the local acceleration of gravity. PlanetaryBalloon software has a comprehensive user manual that covers features ranging from, but not limited to, buoyancy and super-pressure, convenient design equations, shape formulation, and orthotropic stress/strain.

  14. Evidence of Absence software

    USGS Publications Warehouse

    Dalthorp, Daniel; Huso, Manuela M. P.; Dail, David; Kenyon, Jessica

    2014-01-01

    Evidence of Absence software (EoA) is a user-friendly application used for estimating bird and bat fatalities at wind farms and designing search protocols. The software is particularly useful in addressing whether the number of fatalities has exceeded a given threshold and what search parameters are needed to give assurance that thresholds were not exceeded. The software is applicable even when zero carcasses have been found in searches. Depending on the effectiveness of the searches, such an absence of evidence of mortality may or may not be strong evidence that few fatalities occurred. Under a search protocol in which carcasses are detected with nearly 100 percent certainty, finding zero carcasses would be convincing evidence that overall mortality rate was near zero. By contrast, with a less effective search protocol with low probability of detecting a carcass, finding zero carcasses does not rule out the possibility that large numbers of animals were killed but not detected in the searches. EoA uses information about the search process and scavenging rates to estimate detection probabilities to determine a maximum credible number of fatalities, even when zero or few carcasses are observed.
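
    A much-simplified illustration of the zero-carcass reasoning (EoA itself uses a fuller Bayesian treatment with uncertainty in the detection probability; the numbers below are assumed): if each fatality is found with probability g, the chance of observing no carcasses when M animals were killed is (1 - g)^M, so the largest M still consistent with zero finds can be bounded as follows:

        # Sketch: largest fatality count still consistent with finding zero carcasses,
        # given a per-fatality detection probability g. Illustrative only.
        import math

        def max_credible_fatalities(g, credibility=0.8):
            """Largest M with P(zero carcasses | M, g) >= 1 - credibility."""
            alpha = 1.0 - credibility
            return math.floor(math.log(alpha) / math.log(1.0 - g))

        for g in (0.1, 0.5, 0.9):
            print(f"g = {g:.1f}: up to {max_credible_fatalities(g)} fatalities are "
                  "still consistent with zero carcasses found")

    The contrast between the g = 0.1 and g = 0.9 cases is exactly the point the abstract makes about search effectiveness.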

  15. Geologic Sequestration Software Suite

    2013-11-04

    GS3 is the bundling of the Geological Sequestration Software Suite domain tools with the Velo wiki user interface, rich client interface, and data store. Velo is an application domain independent collaborative user environment for modeling and simulation. Velo has a web browser based wiki interface integrated with a sophisticated content management system supporting data and knowledge management required for large-scale scientific modeling projects. GS3 adds tools and capability specifically in the area of modeling subsurface reservoirs for the purpose of carbon sequestration. Velo is a core software framework to create scientific domain user environments. Velo is not tied to a specific domain although it provides novel capability needed by many application areas. A well-defined Velo integration layer allows custom applications such as GS3 to leverage the core Velo components to reduce development cost/time and ultimately provide a more capable software product. Compared with previous efforts like ECCE and SALSSA, Velo is a major advancement being a web browser based interface, having a more comprehensive data management architecture, and having intrinsic support for collaboration through the wiki. GS3 adds specific domain tools for looking at site data, developing conceptual and numerical models, building simulation input files, launching and monitoring the progress of those simulations and being able to look at and interpret simulation output.

  16. Geologic Sequestration Software Suite

    SciTech Connect

    Black, Gary; Bonneville, Alain; Sivaramakrishnan, Chandrika; Purohit, Sumit; White, Signe; Lansing, Carina; Gosink, Luke; Guillen, Zoe; Moeglein, William; Gorton, Ian

    2013-11-04

    GS3 is the bundling of the Geological Sequestration Software Suite domain tools with the Velo wiki user interface, rich client interface, and data store. Velo is an application domain independent collaborative user environment for modeling and simulation. Velo has a web browser based wiki interface integrated with a sophisticated content management system supporting data and knowledge management required for large-scale scientific modeling projects. GS3 adds tools and capability specifically in the area of modeling subsurface reservoirs for the purpose of carbon sequestration. Velo is a core software framework to create scientific domain user environments. Velo is not tied to a specific domain although it provides novel capability needed by many application areas. A well-defined Velo integration layer allows custom applications such as GS3 to leverage the core Velo components to reduce development cost/time and ultimately provide a more capable software product. Compared with previous efforts like ECCE and SALSSA, Velo is a major advancement being a web browser based interface, having a more comprehensive data management architecture, and having intrinsic support for collaboration through the wiki. GS3 adds specific domain tools for looking at site data, developing conceptual and numerical models, building simulation input files, launching and monitoring the progress of those simulations and being able to look at and interpret simulation output.

  17. Software reliability studies

    NASA Technical Reports Server (NTRS)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.

  18. Terra Harvest software architecture

    NASA Astrophysics Data System (ADS)

    Humeniuk, Dave; Klawon, Kevin

    2012-06-01

    Under the Terra Harvest Program, the DIA has the objective of developing a universal Controller for the Unattended Ground Sensor (UGS) community. The mission is to define, implement, and thoroughly document an open architecture that universally supports UGS missions, integrating disparate systems, peripherals, etc. The Controller's inherent interoperability with numerous systems enables the integration of both legacy and future UGS System (UGSS) components, while the design's open architecture supports rapid third-party development to ensure operational readiness. The successful accomplishment of these objectives by the program's Phase 3b contractors is demonstrated via integration of the companies' respective plug-'n'-play contributions that include controllers, various peripherals, such as sensors, cameras, etc., and their associated software drivers. In order to independently validate the Terra Harvest architecture, L-3 Nova Engineering, along with its partner, the University of Dayton Research Institute, is developing the Terra Harvest Open Source Environment (THOSE), a Java Virtual Machine (JVM) running on an embedded Linux Operating System. The Use Cases on which the software is developed support the full range of UGS operational scenarios such as remote sensor triggering, image capture, and data exfiltration. The Team is additionally developing an ARM microprocessor-based evaluation platform that is both energy-efficient and operationally flexible. The paper describes the overall THOSE architecture, as well as the design decisions for some of the key software components. Development process for THOSE is discussed as well.

  19. Software engineering methodologies and tools

    NASA Technical Reports Server (NTRS)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to the industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  20. NASA PC software evaluation project

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Kuan, Julie C.

    1986-01-01

    The USL NASA PC software evaluation project is intended to provide a structured framework for facilitating the development of quality NASA PC software products. The project will assist NASA PC development staff to understand the characteristics and functions of NASA PC software products. Based on the results of the project teams' evaluations and recommendations, users can judge the reliability, usability, acceptability, maintainability and customizability of all the PC software products. The objective here is to provide initial, high-level specifications and guidelines for NASA PC software evaluation. The primary tasks to be addressed in this project are as follows: to gain a strong understanding of what software evaluation entails and how to organize a structured software evaluation process; to define a structured methodology for conducting the software evaluation process; to develop a set of PC software evaluation criteria and evaluation rating scales; and to conduct PC software evaluations in accordance with the identified methodology. The software categories covered include Communication Packages, Network System Software, Graphics Support Software, Environment Management Software, and General Utilities. This report represents one of the 72 attachment reports to the University of Southwestern Louisiana's Final Report on NASA Grant NGT-19-010-900. Accordingly, appropriate care should be taken in using this report out of context of the full Final Report.

  1. Model-Based Software Testing for Object-Oriented Software

    ERIC Educational Resources Information Center

    Biju, Soly Mathew

    2008-01-01

    Model-based testing is one of the best solutions for testing object-oriented software. It has a better test coverage than other testing styles. Model-based testing takes into consideration behavioural aspects of a class, which are usually unchecked in other testing methods. An increase in the complexity of software has forced the software industry…

  2. Library Software: Directory of Microcomputer Software for Libraries.

    ERIC Educational Resources Information Center

    Walton, Robert A.

    The availability of appropriate software for library applications is a continuing problem, and this directory is designed to reduce the frustration of librarians in their search for library software for a microcomputer by providing profiles of software packages designed specifically for libraries. Each profile describes the purpose of the program,…

  3. Software To Go: A Catalog of Software Available for Loan.

    ERIC Educational Resources Information Center

    Kurlychek, Ken, Comp.

    This catalog lists the holdings of the Software To Go software lending library and clearinghouse for programs and agencies serving students or clients who are deaf or hard of hearing. An introduction describes the clearinghouse and its collection of software, much of it commercial and copyrighted material, for Apple, Macintosh, and IBM (MS-DOS)…

  4. Software security checklist for the software life cycle

    NASA Technical Reports Server (NTRS)

    Gilliam, D. P.; Wolfe, T. L.; Sherif, J. S.

    2002-01-01

    A formal approach to security in the software life cycle is essential to protect corporate resources. However, little thought has been given to this aspect of software development. Due to its criticality, security should be integrated as a formal approach in the software life cycle.

  5. Taking a Hard Look at Software. What about Wimpy Software?

    ERIC Educational Resources Information Center

    Evans, Ron

    Much of the computer software currently available for English teachers fails to assess adequately computer strengths and weaknesses. Labeled "wimpy software," these products are often little more than animated textbooks whose lesson formats exercise little higher-order reasoning. The future for good quality software, therefore, rests with English…

  6. Real-time software receiver

    NASA Technical Reports Server (NTRS)

    Ledvina, Brent M. (Inventor); Psiaki, Mark L. (Inventor); Powell, Steven P. (Inventor); Kintner, Jr., Paul M. (Inventor)

    2007-01-01

    A real-time software receiver that executes on a general purpose processor. The software receiver includes data acquisition and correlator modules that perform, in place of hardware correlation, baseband mixing and PRN code correlation using bit-wise parallelism.
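
    As a hedged sketch of the bit-wise parallelism idea only (a real receiver also handles carrier mixing, code-phase search, and multi-bit samples; the 16-bit code below is invented), PRN correlation can be performed by packing sign bits into words, XORing, and counting set bits:

        # Sketch: bit-wise parallel correlation of a PRN code against sign-quantized
        # samples. Agreements minus disagreements gives the correlation value.
        def correlate(code_bits: int, sample_bits: int, n: int) -> int:
            mismatches = bin((code_bits ^ sample_bits) & ((1 << n) - 1)).count("1")
            return n - 2 * mismatches

        code = 0b1011001110001011
        aligned = code                                     # same sequence: full correlation
        shifted = ((code << 1) | (code >> 15)) & 0xFFFF    # misaligned replica
        print(correlate(code, aligned, 16), correlate(code, shifted, 16))

    Because each machine word processes many code chips per XOR, this is the kind of trick that lets the correlation keep up in real time on a general-purpose processor.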

  7. Software Engineering for Human Spaceflight

    NASA Technical Reports Server (NTRS)

    Fredrickson, Steven E.

    2014-01-01

    The Spacecraft Software Engineering Branch of NASA Johnson Space Center (JSC) provides world-class products, leadership, and technical expertise in software engineering, processes, technology, and systems management for human spaceflight. The branch contributes to major NASA programs (e.g. ISS, MPCV/Orion) with in-house software development and prime contractor oversight, and maintains the JSC Engineering Directorate CMMI rating for flight software development. Software engineering teams work with hardware developers, mission planners, and system operators to integrate flight vehicles, habitats, robotics, and other spacecraft elements. They seek to infuse automation and autonomy into missions, and apply new technologies to flight processor and computational architectures. This presentation will provide an overview of key software-related projects, software methodologies and tools, and technology pursuits of interest to the JSC Spacecraft Software Engineering Branch.

  8. Real-time software receiver

    NASA Technical Reports Server (NTRS)

    Ledvina, Brent M. (Inventor); Psiaki, Mark L. (Inventor); Powell, Steven P. (Inventor); Kintner, Jr., Paul M. (Inventor)

    2006-01-01

    A real-time software receiver that executes on a general purpose processor. The software receiver includes data acquisition and correlator modules that perform, in place of hardware correlation, baseband mixing and PRN code correlation using bit-wise parallelism.

  9. Software Quality Assurance Audits Guidebooks

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The growth in cost and importance of software to NASA has caused NASA to address the improvement of software development across the agency. One of the products of this program is a series of guidebooks that define a NASA concept of the assurance processes that are used in software development. The Software Assurance Guidebook, NASA-GB-A201, issued in September, 1989, provides an overall picture of the NASA concepts and practices in software assurance. Second level guidebooks focus on specific activities that fall within the software assurance discipline, and provide more detailed information for the manager and/or practitioner. This is the second level Software Quality Assurance Audits Guidebook that describes software quality assurance audits in a way that is compatible with practices at NASA Centers.

  10. The Software Management Environment (SME)

    NASA Technical Reports Server (NTRS)

    Valett, Jon D.; Decker, William; Buell, John

    1988-01-01

    The Software Management Environment (SME) is a research effort designed to utilize the past experiences and results of the Software Engineering Laboratory (SEL) and to incorporate this knowledge into a tool for managing projects. SME provides the software development manager with the ability to observe, compare, predict, analyze, and control key software development parameters such as effort, reliability, and resource utilization. The major components of the SME, the architecture of the system, and examples of the functionality of the tool are discussed.

  11. Gammasphere software development. Progress report

    SciTech Connect

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information.

  12. Software-Design-Analyzer System

    NASA Technical Reports Server (NTRS)

    Tausworthe, Robert C.

    1991-01-01

    CRISP-90 software-design-analyzer system, update of CRISP-80, is set of computer programs constituting software tool for design and documentation of other software and supporting top-down, hierarchical, modular, structured methodologies for design and programming. Written in Microsoft QuickBasic.

  13. Inequalities in Classroom Computer Software.

    ERIC Educational Resources Information Center

    Biraimah, Karen

    Biases based on gender and ethnicity in computer software available to schools were investigated in this study. A random sample of 15 software programs was selected and evaluated on the bases of gender and ethnicity. Data were gathered on the number of male and female characters portrayed and on the cross-cultural dimensions of the software in…

  14. Free Software and Free Textbooks

    ERIC Educational Resources Information Center

    Takhteyev, Yuri

    2012-01-01

    Some of the world's best and most sophisticated software is distributed today under "free" or "open source" licenses, which allow the recipients of such software to use, modify, and share it without paying royalties or asking for permissions. If this works for software, could it also work for educational resources, such as books? The economics of…

  15. Software Verification and Validation Procedure

    SciTech Connect

    Olund, Thomas S.

    2008-09-15

    This Software Verification and Validation procedure provides the action steps for the Tank Waste Information Network System (TWINS) testing process. The primary objective of the testing process is to provide assurance that the software functions as intended, and meets the requirements specified by the client. Verification and validation establish the primary basis for TWINS software product acceptance.

  16. Software Fault Tolerance: A Tutorial

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2000-01-01

    Because of our present inability to produce error-free software, software fault tolerance is and will continue to be an important consideration in software systems. The root cause of software design errors is the complexity of the systems. Compounding the problems in building correct software is the difficulty in assessing the correctness of software for highly complex systems. After a brief overview of the software development processes, we note how hard-to-detect design faults are likely to be introduced during development and how software faults tend to be state-dependent and activated by particular input sequences. Although component reliability is an important quality measure for system level analysis, software reliability is hard to characterize and the use of post-verification reliability estimates remains a controversial issue. For some applications software safety is more important than reliability, and fault tolerance techniques used in those applications are aimed at preventing catastrophes. Single version software fault tolerance techniques discussed include system structuring and closure, atomic actions, inline fault detection, exception handling, and others. Multiversion techniques are based on the assumption that software built differently should fail differently and thus, if one of the redundant versions fails, it is expected that at least one of the other versions will provide an acceptable output. Recovery blocks, N-version programming, and other multiversion techniques are reviewed.
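
    Two of the techniques named above lend themselves to a small illustration. The sketch below, with invented redundant versions and an invented acceptance test, shows a recovery block and exact-match N-version voting; it is not drawn from the tutorial's own examples.

      # Hedged sketch of a recovery block and N-version majority voting.
      # The redundant integer-square-root versions and acceptance test are invented.
      from collections import Counter
      import math

      def recovery_block(x, versions, acceptable):
          """Try alternates in order; return the first result passing the acceptance test."""
          for version in versions:
              result = version(x)
              if acceptable(x, result):
                  return result
          raise RuntimeError("no alternate produced an acceptable result")

      def n_version_vote(x, versions):
          """Run all versions and return the majority output (exact-match voting)."""
          value, count = Counter(v(x) for v in versions).most_common(1)[0]
          if count <= len(versions) // 2:
              raise RuntimeError("no majority among versions")
          return value

      v1 = lambda x: math.isqrt(x)
      v2 = lambda x: int(math.sqrt(x))                      # may disagree for very large x
      v3 = lambda x: next(i for i in range(x + 2) if (i + 1) ** 2 > x)
      ok = lambda x, r: r * r <= x < (r + 1) ** 2           # acceptance test
      print(recovery_block(17, [v1, v2, v3], ok), n_version_vote(17, [v1, v2, v3]))  # 4 4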

  17. Sources and Types of Software.

    ERIC Educational Resources Information Center

    Trautman, Rodes

    1987-01-01

    Describes possible sources of computer software, including hardware bundles, commercial suppliers, users groups, electronic bulletin boards, as payment for software review services, prototype contracts, and writing the software oneself. Advantages and disadvantages of each, especially in terms of technical support, are discussed. (CLB)

  18. Software engineering standards and practices

    NASA Technical Reports Server (NTRS)

    Durachka, R. W.

    1981-01-01

    Guidelines are presented for the preparation of a software development plan. The various phases of a software development project are discussed throughout its life cycle including a general description of the software engineering standards and practices to be followed during each phase.

  19. NASA's Approach to Software Assurance

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha

    2015-01-01

    NASA defines software assurance as: the planned and systematic set of activities that ensure conformance of software life cycle processes and products to requirements, standards, and procedures via quality, safety, reliability, and independent verification and validation. NASA's implementation of this approach to the quality, safety, reliability, security and verification and validation of software is brought together in one discipline, software assurance. Organizationally, NASA has software assurance at each NASA center, a Software Assurance Manager at NASA Headquarters, a Software Assurance Technical Fellow (currently the same person as the SA Manager), and an Independent Verification and Validation Organization with its own facility. An umbrella risk mitigation strategy for safety and mission success assurance of NASA's software, software assurance covers a wide area and is better structured to address the dynamic changes in how software is developed, used, and managed, as well as its increasingly complex functionality. Being flexible, risk-based, and prepared for challenges in software at NASA is essential, especially as much of our software is unique for each mission.

  20. Software Vulnerability Taxonomy Consolidation

    SciTech Connect

    Polepeddi, Sriram S.

    2004-12-07

    In today's environment, computers and networks are increasingly exposed to a number of software vulnerabilities. Information about these vulnerabilities is collected and disseminated via various large publicly available databases such as BugTraq, OSVDB and ICAT. Individually, none of these databases covers all aspects of a vulnerability, and they lack a common format, making it difficult for end-users to easily compare various vulnerabilities. A central database of vulnerabilities has not been available until today for a number of reasons, such as the non-uniform methods by which current vulnerability database providers receive information, disagreement over which features of a particular vulnerability are important and how best to present them, and the non-utility of the information presented in many databases. The goal of this software vulnerability taxonomy consolidation project is to address the need for a universally accepted vulnerability taxonomy that classifies vulnerabilities in an unambiguous manner. A consolidated vulnerability database (CVDB) was implemented that coalesces and organizes vulnerability data from disparate data sources. Based on the work done in this paper, there is strong evidence that a consolidated taxonomy encompassing and organizing all relevant data can be achieved. However, three primary obstacles remain: lack of a common "primary key" for cross-referencing, unstructured and free-form descriptions of necessary vulnerability data, and lack of data on all aspects of a vulnerability. This work has only considered data that can be unambiguously extracted from various data sources by straightforward parsers. It is felt that even with the use of more advanced information-mining tools, which can wade through the sea of unstructured vulnerability data, this current integration methodology would still provide repeatable, unambiguous, and exhaustive results. Though the goal of coalescing all available data, which would be of use to

  1. Software for batch farms

    SciTech Connect

    Ian Bird; Bryan Hess; Andy Kowalski

    2000-02-01

    Over the past few years, LSF has become a standard for job management on batch farms. However, there are many instances where it cannot be deployed for a variety of reasons. In large farms the cost may be prohibitive for the set of features actually used; small university groups who wish to clone the farms and software of larger laboratories often have constraints which preclude the use of LSF. This paper discusses a generic interface developed at Jefferson Lab to provide a set of common services to the user, while using any one of a variety of underlying batch management software products. Initially the system provides an interface to LSF and an alternative--Portable Batch System (PBS) developed by NASA and freely available in source form. It is straightforward to extend this to other systems. Such a generic interface allows users to move from one location to another and run their jobs with no modification, and by extension provides a framework for a "global" batch system where jobs submitted at one site may be transparently executed at another. The interface also provides additional features not found in the underlying batch software. Being written in Java, the client can be easily installed anywhere and allows for authenticated remote job submission and manipulation, including a web interface. This paper will also discuss the problem of keeping a large batch farm occupied with work without waiting for slow tape access. The use of file caching, pre-staging of files from tape and the interconnection with the batch system will be discussed. As well as automated techniques, the provision of appropriate information to the user to allow optimization should not be overlooked.
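
    The tool itself is written in Java; as a hedged sketch of the abstraction it describes, the Python below exposes one submit() call and hides whether LSF (bsub) or PBS (qsub) sits underneath. The job-description fields and the particular options chosen are illustrative assumptions, not the Jefferson Lab interface.

      # Hedged sketch of a generic batch-submission layer over LSF and PBS.
      # 'bsub' and 'qsub' are the standard submit commands; everything else
      # (field names, chosen options) is an illustrative assumption.
      import subprocess

      class LSFBackend:
          def submit(self, job):
              cmd = ["bsub", "-J", job["name"], "-n", str(job["cpus"]), job["script"]]
              return subprocess.run(cmd, capture_output=True, text=True).stdout

      class PBSBackend:
          def submit(self, job):
              cmd = ["qsub", "-N", job["name"], "-l", f"nodes=1:ppn={job['cpus']}", job["script"]]
              return subprocess.run(cmd, capture_output=True, text=True).stdout

      def submit(job, backend):
          """Single user-facing entry point; the backend hides site-specific details."""
          return backend.submit(job)

      # Example (hypothetical job): submit({"name": "reco-run42", "cpus": 4, "script": "run_reco.sh"}, PBSBackend())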

  2. Software Design Improvements. Part 1; Software Benefits and Limitations

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    Computer hardware and associated software have been used for many years to process accounting information, to analyze test data and to perform engineering analysis. Now computers and software also control everything from automobiles to washing machines and the number and type of applications are growing at an exponential rate. The size of individual programs has shown similar growth. Furthermore, software and hardware are used to monitor and/or control potentially dangerous products and safety-critical systems. These uses include everything from airplanes and braking systems to medical devices and nuclear plants. The question is: how can this hardware and software be made more reliable? Also, how can software quality be improved? What methodology needs to be provided for large and small software products to improve the design, and how can software be verified?

  3. Software for SARA

    SciTech Connect

    Quinn, P.

    1991-10-01

    This paper reports on an increasing number of consulting and software companies that offer PC-based information management programs designed to help companies comply with SARA Title III reporting requirements and the OSHA Hazard Communication Standard (HCS). These are available in a wide range of specific capabilities, from relatively simple systems that enable electronic filing of MSDSs to complex, multi-user systems that track and compute air and water emissions, maintain inventory and manifest records, and generate compliance reports. Prices range from less than $1,000 to more than $20,000.

  4. Misalignment estimation software system

    NASA Technical Reports Server (NTRS)

    Desjardins, R. L.

    1973-01-01

    A system of computer software, spacecraft, and ground system activity is described that enables spacecraft star trackers and inertial assemblies to be aligned and calibrated from the ground after the spacecraft has achieved orbit. The system generates in the uplink flow an exercise designed to render misalignments visible and sends it to the spacecraft, where the misalignments enter the returned attitude-sensor information as errors. This information is downlinked and processed into misalignment estimates used to correct the spacecraft model in the database.

  5. BLTC control system software

    SciTech Connect

    Logan, J.B., Fluor Daniel Hanford

    1997-02-10

    This is a direct revision to Rev. 0 of the BLTC Control System Software. The entire document is being revised and released as HNF-SD-FF-CSWD-025, Rev 1. The changes incorporated by this revision include addition of a feature to automate the sodium drain when removing assemblies from sodium wetted facilities. Other changes eliminate locked in alarms during cold operation and improve the function of the Oxygen Analyzer. See FCN-620498 for further details regarding these changes. Note the change in the document number prefix, in accordance with HNF-MD-003.

  6. Master Software Requirements Specification

    NASA Technical Reports Server (NTRS)

    Hu, Chaumin

    2003-01-01

    A basic function of a computational grid such as the NASA Information Power Grid (IPG) is to allow users to execute applications on remote computer systems. The Globus Resource Allocation Manager (GRAM) provides this functionality in the IPG and many other grids at this time. While the functionality provided by GRAM clients is adequate, GRAM does not support useful features such as staging several sets of files, running more than one executable in a single job submission, and maintaining historical information about execution operations. This specification is intended to provide the environmental and software functional requirements for the IPG Job Manager V2.0 being developed by AMTI for NASA.

  7. Software for surface analysis

    NASA Astrophysics Data System (ADS)

    Watson, D. G.; Doern, F. E.

    1985-04-01

    Two software packages designed to aid in the analysis of digitally stored Secondary Ion Mass Spectrometric (SIMS) and electron spectroscopic data are described. The first, MASS, is a program that normalizes, and allows the application of sensitivity coefficients to SIMS depth profiles. The second, DIP, is a digital image processor designed to enhance secondary, backscattered, and Auger electron spectroscopic (AES) maps. DIP can also provide quantitative area analysis of AES maps. The algorithms are currently optimized to handle data generated by Physical Electronics Industries data acquisition systems, but are generally applicable.
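
    As a hedged illustration of the kind of operation MASS performs, the snippet below applies a relative sensitivity factor to a raw SIMS depth profile and normalizes it to the matrix signal (C = RSF x I_impurity / I_matrix). The count values, element names, and RSF are invented, not taken from the paper.

      # Hedged illustration of RSF normalization of a SIMS depth profile
      # (in the spirit of MASS; all numbers are invented).
      raw_counts = {                      # counts per second vs. sputter cycle
          "Si": [1.0e6, 1.0e6, 1.0e6],    # matrix signal
          "B":  [2.0e3, 5.0e3, 1.0e3],    # impurity signal
      }
      rsf = {"B": 1.5e22}                 # relative sensitivity factor, atoms/cm^3

      def concentration_profile(impurity, matrix="Si"):
          """Convert impurity counts to concentration: C = RSF * I_imp / I_matrix."""
          return [rsf[impurity] * i / m
                  for i, m in zip(raw_counts[impurity], raw_counts[matrix])]

      print(concentration_profile("B"))   # [3e+19, 7.5e+19, 1.5e+19] atoms/cm^3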

  8. DRAMA: Instrumentation software environment

    NASA Astrophysics Data System (ADS)

    Bailey, Jeremy; Shortridge, Keith; Farrell, Tony

    2015-07-01

    DRAMA is a fast, distributed environment for writing instrumentation control systems. It allows low level instrumentation software to be controlled from user interfaces running on UNIX, MS Windows or VMS machines in a consistent manner. Such instrumentation tasks can run either on these machines or on real time systems such as VxWorks. DRAMA uses techniques developed by the AAO while using the Starlink-ADAM environment, but is optimized for the requirements of instrumentation control, portability, embedded systems and speed. A special program is provided which allows seamless communication between ADAM and DRAMA tasks.

  9. Robust Software Architecture for Robots

    NASA Technical Reports Server (NTRS)

    Aghazanian, Hrand; Baumgartner, Eric; Garrett, Michael

    2009-01-01

    Robust Real-Time Reconfigurable Robotics Software Architecture (R4SA) is the name of both a software architecture and software that embodies the architecture. The architecture was conceived in the spirit of current practice in designing modular, hard, realtime aerospace systems. The architecture facilitates the integration of new sensory, motor, and control software modules into the software of a given robotic system. R4SA was developed for initial application aboard exploratory mobile robots on Mars, but is adaptable to terrestrial robotic systems, real-time embedded computing systems in general, and robotic toys.

  10. CSAM Metrology Software Tool

    NASA Technical Reports Server (NTRS)

    Vu, Duc; Sandor, Michael; Agarwal, Shri

    2005-01-01

    CSAM Metrology Software Tool (CMeST) is a computer program for analysis of false-color CSAM images of plastic-encapsulated microcircuits. (CSAM signifies C-mode scanning acoustic microscopy.) The colors in the images indicate areas of delamination within the plastic packages. Heretofore, the images have been interpreted by human examiners. Hence, interpretations have not been entirely consistent and objective. CMeST processes the color information in image-data files to detect areas of delamination without incurring inconsistencies of subjective judgement. CMeST can be used to create a database of baseline images of packages acquired at given times for comparison with images of the same packages acquired at later times. Any area within an image can be selected for analysis, which can include examination of different delamination types by location. CMeST can also be used to perform statistical analyses of image data. Results of analyses are available in a spreadsheet format for further processing. The results can be exported to any data-base-processing software.
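
    A minimal sketch of one analysis CMeST is described as performing, estimating delaminated area from a false-color image. It assumes delamination is flagged by strongly red pixels; the real color coding and thresholds used by CMeST are not given in the abstract, so both are assumptions here.

      # Hedged sketch of delamination-area estimation from a false-color CSAM image.
      # Assumption: delaminated regions appear as strongly red pixels.
      import numpy as np

      def delamination_fraction(rgb, red_min=200, other_max=80):
          """Fraction of pixels whose color marks them as delaminated."""
          r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
          mask = (r >= red_min) & (g <= other_max) & (b <= other_max)
          return mask.mean()

      image = np.zeros((4, 4, 3), dtype=np.uint8)
      image[:2, :2] = (255, 10, 10)                 # a delaminated corner of the package
      print(delamination_fraction(image))           # 0.25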

  11. Gemini Scout Control Software

    SciTech Connect

    Clinton Hobart, Justin Garretson

    2010-11-23

    The Gemini Scout Control Software consists of two Windows applications that allow the Gemini Scout vehicle to be controlled by an operator. The Embedded application runs on the vehicle's onboard computer and controls the vehicle's various motors and sensors. This application reports the vehicle's status and receives vehicle commands over the local-area network. The Embedded application also allows the user to control the vehicle using a USB game-pad connected directly to the vehicle. The Operator Control Unit (OCU) application runs on an external PC and communicates with the vehicle via an Ethernet connection. The OCU application sends commands to and receives data from the Embedded application running on the vehicle. The OCU application also communicates directly with the digital video encoders and radios in order to display video from the vehicle's cameras and the status of the radio link. The OCU application has a graphical user interface (GUI) that displays the vehicle's status and allows the user to change various vehicle settings. Finally, the OCU application receives input from a USB game-pad connected to the PC in order to control the vehicle's functions.

  12. Software for Checking Statecharts

    NASA Technical Reports Server (NTRS)

    Pingree, Paula; Mikk, Erich

    2004-01-01

    HiVy is a software tool set that enables verification through model checking of designs represented as finite-state machines or statecharts. HiVy provides automated translation of (1) statecharts created by use of the MathWorks Stateflow program to (2) Promela, the input language of the Spin model checker, which can then be used to verify, or trace logical errors in, distributed software systems. HiVy can operate directly on Stateflow models, or its abstract syntax of hierarchical sequential automata (HSA) can be used independently as an intermediate format for translation to Promela. In a typical design application, HiVy parses and reformats Stateflow model file data using the programs SfParse and sf2hsa, respectively. If the parsing effort is successful, an abstract syntax tree is delivered into a file named with the extension .hsa. If the design comprises several model files, they may be merged into one .hsa file before translation into Promela. Stateflow scope is preserved, and name clashes are avoided in the merge process. The HiVy program hsa2pr translates the model from the intermediate HSA format into Promela. Additionally, HiVy provides through translation a list of all statechart model propositions that are the means for formalizing linear temporal logic (LTL) properties about the model for Spin verification.
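
    A hedged sketch of driving the tool chain named above from a script. The program names (SfParse, sf2hsa, hsa2pr, spin) come from the description; the exact command-line arguments are assumptions made only for illustration.

      # Hedged sketch of the HiVy translation chain; argument forms are assumed.
      import subprocess

      def run(cmd):
          print("+", " ".join(cmd))
          subprocess.run(cmd, check=True)

      def verify_stateflow_model(model_file, out="model"):
          run(["SfParse", model_file])                 # parse the Stateflow model file
          run(["sf2hsa", model_file, f"{out}.hsa"])    # emit hierarchical sequential automata
          run(["hsa2pr", f"{out}.hsa", f"{out}.pml"])  # translate HSA to Promela
          run(["spin", "-a", f"{out}.pml"])            # generate the Spin verifier source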

  13. Gemini Scout Control Software

    2010-11-23

    The Gemini Scout Control Software consists of two Windows applications that allow the Gemini Scout vehicle to be controlled by an operator. The Embedded application runs on the vehicle's onboard computer and controls the vehicle's various motors and sensors. This application reports the vehicle's status and receives vehicle commands over the local-area network. The Embedded application also allows the user to control the vehicle using a USB game-pad connected directly to the vehicle. The Operator Control Unit (OCU) application runs on an external PC and communicates with the vehicle via an Ethernet connection. The OCU application sends commands to and receives data from the Embedded application running on the vehicle. The OCU application also communicates directly with the digital video encoders and radios in order to display video from the vehicle's cameras and the status of the radio link. The OCU application has a graphical user interface (GUI) that displays the vehicle's status and allows the user to change various vehicle settings. Finally, the OCU application receives input from a USB game-pad connected to the PC in order to control the vehicle's functions.

  14. Software analysis handbook: Software complexity analysis and software reliability estimation and prediction

    NASA Technical Reports Server (NTRS)

    Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron

    1994-01-01

    This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.

  15. Managing the Software Development Process

    NASA Technical Reports Server (NTRS)

    Lubelczky, Jeffrey T.; Parra, Amy

    1999-01-01

    The goal of any software development project is to produce a product that is delivered on time, within the allocated budget, and with the capabilities expected by the customer; unfortunately, this goal is rarely achieved. However, a properly managed project in a mature software engineering environment can consistently achieve this goal. In this paper we provide an introduction to three project success factors: a properly managed project, a competent project manager, and a mature software engineering environment. We will also present an overview of the benefits of a mature software engineering environment based on 24 years of data from the Software Engineering Lab, and suggest some first steps that an organization can take to begin benefiting from this environment. The depth and breadth of software engineering exceeds the scope of this paper, so various references are cited with the goal of raising awareness and encouraging further investigation into software engineering and project management practices.

  16. Modernization of software quality assurance

    NASA Technical Reports Server (NTRS)

    Bhaumik, Gokul

    1988-01-01

    The customer's satisfaction depends not only on functional performance but also on the quality characteristics of the software products. An examination of this quality aspect of software products will provide a clear, well-defined framework for quality assurance functions, which improve the life-cycle activities of software development. Software developers must be aware of the following aspects which have been expressed by many quality experts: quality cannot be added on; the level of quality built into a program is a function of the quality attributes employed during the development process; and finally, quality must be managed. These concepts have guided our development of the following definition for a Software Quality Assurance function: Software Quality Assurance is a formal, planned approach of actions designed to evaluate the degree of an identifiable set of quality attributes present in all software systems and their products. This paper is an explanation of how this definition was developed and how it is used.

  17. Software developments for gammasphere

    SciTech Connect

    Lauritsen, T.; Ahmad, I.; Carpenter, M.P.

    1995-08-01

    This was the year in which data acquisition development for Gammasphere evolved from planning to accomplishment, both in hardware and software. Two VME crates now contain about 10 crate-processors which are used to handle the data from VXI processors - which in turn collect the data from germanium and BGO detectors in the array. The signals from the detectors are processed and digitized in custom-built electronics boards. The processing power in the VME crates is used to digitally filter the data before they are written to tape. The goal is to have highly processed data flowing to tape, eliminating the off-line filtering and manipulation of data that was standard procedure in earlier experiments.

  18. RETScreen Plus Software Tutorial

    NASA Technical Reports Server (NTRS)

    Ganoe, Rene D.; Stackhouse, Paul W., Jr.; DeYoung, Russell J.

    2014-01-01

    Greater emphasis is being placed on reducing both the carbon footprint and energy cost of buildings. A building's energy usage depends upon many factors; one of the most important is the local weather and climate conditions to which its electrical, heating, and air conditioning systems must respond. Incorporating renewable energy systems, including solar systems, to supplement energy supplies and increase energy efficiency is important to saving costs and reducing emissions. Also, retrofitting technologies to buildings requires knowledge of building performance in its current state, potential future climate state, projection of potential savings with capital investment, and then monitoring the performance once the improvements are made. RETScreen Plus is a performance analysis software module that supplies the needed functions of monitoring current building performance, targeting projected energy efficiency improvements and verifying improvements once completed. This tutorial defines the functions of RETScreen Plus as well as outlines the general procedure for monitoring and reporting building energy performance.

  19. Energy Tracking Software Platform

    SciTech Connect

    Ryan Davis; Nathan Bird; Rebecca Birx; Hal Knowles

    2011-04-04

    Acceleration has created an interactive energy tracking and visualization platform that supports decreasing electric, water, and gas usage. Homeowners have access to tools that allow them to gauge their use and track progress toward a smaller energy footprint. Real estate agents have access to consumption data, allowing for sharing a comparison with potential home buyers. Home builders have the opportunity to compare their neighborhood's energy efficiency with competitors. Home energy raters have a tool for gauging the progress of their clients after efficiency changes. And, social groups are able to help encourage members to reduce their energy bills and help their environment. EnergyIT.com is the business umbrella for all energy tracking solutions and is designed to provide information about our energy tracking software and promote sales. CompareAndConserve.com (Gainesville-Green.com) helps homeowners conserve energy through education and competition. ToolsForTenants.com helps renters factor energy usage into their housing decisions.

  20. The LSST Software Stack

    NASA Astrophysics Data System (ADS)

    Jenness, Timothy; LSST Data Management Team

    2016-01-01

    The Large Synoptic Survey Telescope (LSST) is an 8-m optical ground-based telescope being constructed on Cerro Pachon in Chile. LSST will survey half the sky every few nights in six optical bands. The data will be transferred to the data center in North America and within 60 seconds it will be reduced using difference imaging and an alert list generated for the community. Additionally, annual data releases will be constructed from all the data during the 10-year mission, producing catalogs and deep co-added images with unprecedented time resolution for such a large region of sky. In this paper we present the current status of the LSST stack including the data processing components, Qserv database and data visualization software, describe how to obtain it, and provide a summary of the development road map.

  1. Wrapper Induction Software

    2011-08-18

    Wrapper Induction is a software package that allows for unsupervised, semi-supervised, and manual extraction of social media data independent of language or site architecture. A large range of blog formats is available to individuals as means of publishing data to the internet. Blogs are a source of rich information for analysts. With a growing volume of information and blog engines, there is an increased need for automatic or semi-automatic extraction of that data for processing to help deliver results to analysts. Wrapper Induction is designed to automatically or semi-automatically create a template that can be used to harvest blog data from websites. Blogs are in a variety of formats and languages. Wrapper Induction creates a template and extracts blog data in a way that is independent of a specified blog format or language.

  2. Wrapper Induction Software

    SciTech Connect

    2011-08-18

    Wrapper Induction is a software package that allows for unsupervised, semi-supervised, and manual extraction of social media data independent of language or site architecture. A large range of blog formats is available to individuals as means of publishing data to the internet. Blogs are a source of rich information for analysts. With a growing volume of information and blog engines, there is an increased need for automatic or semi-automatic extraction of that data for processing to help deliver results to analysts. Wrapper Induction is designed to automatically or semi-automatically create a template that can be used to harvest blog data from websites. Blogs are in a variety of formats and languages. Wrapper Induction creates a template and extracts blog data in a way that is independent of a specified blog format or language.

  3. Assessing multizone airflow software

    SciTech Connect

    Lorenzetti, D.M.

    2001-12-01

    Multizone models form the basis of most computer simulations of airflow and pollutant transport in buildings. In order to promote computational efficiency, some multizone simulation programs, such as COMIS and CONTAM, restrict the form that their flow models may take. While these tools allow scientists and engineers to explore a wide range of building airflow problems, increasingly their use has led to new questions not answerable by the current generation of programs. This paper, directed at software developers working on the next generation of building airflow models, identifies structural aspects of COMIS and related programs that prevent them from easily incorporating desirable new airflow models. The paper also suggests criteria for evaluating alternate simulation environments for future modeling efforts.

  4. Software development without languages

    NASA Technical Reports Server (NTRS)

    Osborne, Haywood S.

    1988-01-01

    Automatic programming generally involves the construction of a formal specification; i.e., one which allows unambiguous interpretation by tools for the subsequent production of the corresponding software. Previous practical efforts in this direction have focused on the serious problems of: (1) designing the optimum specification language; and (2) mapping (translating or compiling) from this specification language to the program itself. The approach proposed bypasses the above problems. It postulates that the specification proper should be an intermediate form, with the sole function of containing information sufficient to facilitate construction of programs and also of matching documentation. Thus, the means of forming the intermediary becomes a human factors task rather than a linguistic one; human users will read documents generated from the specification, rather than the specification itself.

  5. Evolvable Neural Software System

    NASA Technical Reports Server (NTRS)

    Curtis, Steven A.

    2009-01-01

    The Evolvable Neural Software System (ENSS) is composed of sets of Neural Basis Functions (NBFs), which can be totally autonomously created and removed according to the changing needs and requirements of the software system. The resulting structure is both hierarchical and self-similar in that a given set of NBFs may have a ruler NBF, which in turn communicates with other sets of NBFs. These sets of NBFs may function as nodes to a ruler node, which are also NBF constructs. In this manner, the synthetic neural system can exhibit the complexity, three-dimensional connectivity, and adaptability of biological neural systems. An added advantage of ENSS over a natural neural system is its ability to modify its core genetic code in response to environmental changes as reflected in needs and requirements. The neural system is fully adaptive and evolvable and is trainable before release. It continues to rewire itself while on the job. The NBF is a unique, bilevel intelligence neural system composed of a higher-level heuristic neural system (HNS) and a lower-level, autonomic neural system (ANS). Taken together, the HNS and the ANS give each NBF the complete capabilities of a biological neural system to match sensory inputs to actions. Another feature of the NBF is the Evolvable Neural Interface (ENI), which links the HNS and ANS. The ENI solves the interface problem between these two systems by actively adapting and evolving from a primitive initial state (a Neural Thread) to a complicated, operational ENI and successfully adapting to a training sequence of sensory input. This simulates the adaptation of a biological neural system in a developmental phase. Within the greater multi-NBF and multi-node ENSS, self-similar ENIs provide the basis for inter-NBF and inter-node connectivity.

  6. PROMOTIONS: PROper MOTION Software

    NASA Astrophysics Data System (ADS)

    Caleb Wherry, John; Sahai, R.

    2009-05-01

    We report on the development of a software tool (PROMOTIONS) to streamline the process of measuring proper motions of material in expanding nebulae. Our tool makes use of IDL's widget programming capabilities to design a unique GUI that is used to compare images of the objects from two epochs. The software allows us to first orient and register the images to a common frame of reference and pixel scale, using field stars in each of the images. We then cross-correlate specific morphological features in order to determine their proper motions, which consist of the proper motion of the nebula as a whole (PM-neb), and expansion motions of the features relative to the center. If the central star is not visible (quite common in bipolar nebulae with dense dusty waists), point-symmetric expansion is assumed and we use the average motion of high-quality symmetric pairs of features on opposite sides of the nebular center to compute PM-neb. This is then subtracted out to determine the individual movements of these and additional features relative to the nebular center. PROMOTIONS should find wide applicability in measuring proper motions in astrophysical objects such as the expanding outflows/jets commonly seen around young and dying stars. We present first results from using PROMOTIONS to successfully measure proper motions in several pre-planetary nebulae (transition objects between the red giant and planetary nebula phases), using images taken 7-10 years apart with the WFPC2 and ACS instruments on board HST. The authors are grateful to NASA's Undergraduate Scholars Research Program (USRP) for supporting this research.
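
    A hedged numerical sketch of the decomposition described above: the bulk proper motion of the nebula is estimated from the mean motion of point-symmetric feature pairs and then subtracted to leave each feature's expansion motion. The feature positions below are invented for illustration, not measured data.

      # Hedged sketch of the point-symmetric proper-motion decomposition;
      # feature positions (pixels) at two epochs are invented.
      import numpy as np

      pairs_epoch1 = [((-10.0,  4.0), (10.0, -4.0)),     # symmetric feature pairs
                      (( -6.0, -8.0), ( 6.0,  8.0))]
      pairs_epoch2 = [((-12.5,  5.2), ( 9.5, -2.8)),
                      (( -7.0, -9.4), ( 5.0,  9.6))]

      def bulk_motion(p1, p2):
          """PM-neb = mean displacement of the midpoints of symmetric pairs."""
          mids1 = np.array([np.add(a, b) / 2 for a, b in p1])
          mids2 = np.array([np.add(a, b) / 2 for a, b in p2])
          return (mids2 - mids1).mean(axis=0)

      pm_neb = bulk_motion(pairs_epoch1, pairs_epoch2)
      print("PM of nebula (pixels):", pm_neb)
      for (a1, b1), (a2, b2) in zip(pairs_epoch1, pairs_epoch2):
          for f1, f2 in ((a1, a2), (b1, b2)):
              print("expansion motion:", np.subtract(f2, f1) - pm_neb)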

  7. Shuttle mission simulator software conceptual design

    NASA Technical Reports Server (NTRS)

    Burke, J. F.

    1973-01-01

    Software conceptual designs (SCD) are presented for meeting the simulator requirements for the shuttle missions. The major areas of the SCD discussed include: malfunction insertion, flight software, applications software, systems software, and computer complex.

  8. HEAVENS system for software artifacts

    NASA Technical Reports Server (NTRS)

    Matthews, Paul

    1990-01-01

    The HEAVENS system is a workstation-based collection of software for analyzing, organizing, and viewing software artifacts. As a prototype, the system was used for visualizing source code structure, analyzing dependencies, and reconstructing to simplify maintenance. The system was also used in the early stages of software design to organize and relate design objects, maintain design documentation, and provide a ready-made framework for later coding.

  9. Gammasphere software development. Progress report

    SciTech Connect

    Piercey, R.B.

    1993-05-01

    Activities of the nuclear physics group are described. Progress was made in organizing the Gammasphere Software Working Group, establishing a nuclear computing facility, participating in software development at Lawrence Berkeley, developing a common data file format, and adapting the ORNL UPAK software to run at Gammasphere. A universal histogram object was developed that defines a file format and provides for an object-oriented programming model. An automated liquid nitrogen fill system was developed for Gammasphere (110 Ge detectors comprise the sphere).

  10. Planetarium software in the classroom

    NASA Astrophysics Data System (ADS)

    Persson, J. R.; Eriksson, U.

    2016-03-01

    Students often find astronomy and astrophysics to be most interesting and exciting, but the Universe is difficult to access using only one’s eyes or simple equipment available at different educational settings. To open up the Universe and enhance learning astronomy and astrophysics different planetarium software can be used. In this article we discuss the usefulness of such simulation software and give four examples of how such software can be used for teaching and learning astronomy and astrophysics.

  11. Culture shock: Improving software quality

    SciTech Connect

    de Jong, K.; Trauth, S.L.

    1988-01-01

    The concept of software quality can represent a significant shock to an individual who has been developing software for many years and who believes he or she has been doing a high quality job. The very idea that software includes lines of code and associated documentation is foreign and difficult to grasp, at best. Implementation of a software quality program hinges on the concept that software is a product whose quality needs improving. When this idea is introduced into a technical community that is largely "self-taught" and has been producing "good" software for some time, a fundamental understanding of the concepts associated with software is often weak. Software developers can react as if to say, "What are you talking about? What do you mean I'm not doing a good job? I haven't gotten any complaints about my code yet!" Coupling such surprise and resentment with the shock that software really is a product and software quality concepts do exist can fuel the volatility of these emotions. In this paper, we demonstrate that the concept of software quality can indeed pose a culture shock to developers. We also show that a "typical" quality assurance approach, that of imposing a standard and providing inspectors and auditors to assure its adherence, contributes to this shock and detracts from the very goal the approach should achieve. We offer an alternative, adopted through experience, to implement a software quality program: cooperative assistance. We show how cooperation, education, consultation and friendly assistance can overcome this culture shock. 3 refs.

  12. Developing safety-critical software.

    PubMed

    Garnsworthy, J

    1996-05-01

    The role of safety-critical software in the development of medical devices is becoming increasingly important and ever more exacting demands are being made of software developers. This article considers safety issues and a software development life-cycle based on the safety life-cycle described in IEC 1508, "Safety-Related Systems: Functional Safety." It identifies relevant standards, both emerging and published, and provides guidance on methods that could be used to meet those standards.

  13. Modeling software systems by domains

    NASA Technical Reports Server (NTRS)

    Dippolito, Richard; Lee, Kenneth

    1992-01-01

    The Software Architectures Engineering (SAE) Project at the Software Engineering Institute (SEI) has developed engineering modeling techniques that both reduce the complexity of software for domain-specific computer systems and result in systems that are easier to build and maintain. These techniques allow maximum freedom for system developers to apply their domain expertise to software. We have applied these techniques to several types of applications, including training simulators operating in real time, engineering simulators operating in non-real time, and real-time embedded computer systems. Our modeling techniques result in software that mirrors both the complexity of the application and the domain knowledge requirements. We submit that the proper measure of software complexity reflects neither the number of software component units nor the code count, but the locus of and amount of domain knowledge. As a result of using these techniques, domain knowledge is isolated by fields of engineering expertise and removed from the concern of the software engineer. In this paper, we will describe kinds of domain expertise, describe engineering by domains, and provide relevant examples of software developed for simulator applications using the techniques.

  14. On-Orbit Software Analysis

    NASA Technical Reports Server (NTRS)

    Moran, Susanne I.

    2004-01-01

    The On-Orbit Software Analysis Research Infusion Project was done by Intrinsyx Technologies Corporation (Intrinsyx) at the National Aeronautics and Space Administration (NASA) Ames Research Center (ARC). The Project was a joint collaborative effort between NASA Codes IC and SL, Kestrel Technology (Kestrel), and Intrinsyx. The primary objectives of the Project were: discovery and verification of software program properties and dependencies; detection and isolation of software defects across different versions of software; and compilation of historical data and technical expertise for future applications.

  15. Confined Space Imager (CSI) Software

    SciTech Connect

    Karelilz, David

    2013-07-03

    The software provides real-time image capture, enhancement, and display, and sensor control for the Confined Space Imager (CSI) sensor system. The software captures images over a Cameralink connection and provides the following image enhancements: camera pixel-to-pixel non-uniformity correction, optical distortion correction, image registration and averaging, and illumination non-uniformity correction. The software communicates with the custom CSI hardware over USB to control sensor parameters and is capable of saving enhanced sensor images to an external USB drive. The software provides sensor control, image capture, enhancement, and display for the CSI sensor system. It is designed to work with the custom hardware.
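
    Two of the enhancement steps listed above are easy to illustrate: pixel-to-pixel non-uniformity (flat-field) correction and averaging of registered frames. The sketch below uses invented dark and flat frames; it is not the CSI calibration procedure itself.

      # Hedged sketch of flat-field correction and frame averaging; the dark and
      # flat frames are invented stand-ins, not CSI calibration data.
      import numpy as np

      def flat_field_correct(frame, dark, flat):
          """Classic non-uniformity correction: (frame - dark) / normalized gain."""
          gain = (flat - dark) / np.mean(flat - dark)
          return (frame - dark) / np.clip(gain, 1e-6, None)

      def average_frames(frames):
          """Average a stack of already-registered frames to suppress noise."""
          return np.mean(np.stack(frames), axis=0)

      rng = np.random.default_rng(0)
      dark  = np.full((8, 8), 10.0)
      flat  = dark + rng.uniform(80, 120, (8, 8))      # uneven illumination / gain
      scene = 50.0
      raw = [dark + scene * (flat - dark) / 100.0 + rng.normal(0, 1, (8, 8))
             for _ in range(16)]
      corrected = flat_field_correct(average_frames(raw), dark, flat)
      print(round(float(corrected.mean()), 1))         # close to the 50.0 scene level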

  16. Development methodology for scientific software

    SciTech Connect

    Cort, G.; Goldstone, J.A.; Nelson, R.O.; Poore, R.V.; Miller, L.; Barrus, D.M.

    1985-01-01

    We present the details of a software development methodology that addresses all phases of the software life cycle, yet is well suited for application by small projects with limited resources. The methodology has been developed at the Los Alamos Weapons Neutron Research (WNR) Facility and was utilized during the recent development of the WNR Data Acquisition Command Language. The methodology emphasizes the development and maintenance of comprehensive documentation for all software components. The impact of the methodology upon software quality and programmer productivity is assessed.

  17. Software quality: Process or people

    NASA Technical Reports Server (NTRS)

    Palmer, Regina; Labaugh, Modenna

    1993-01-01

    This paper will present data related to software development processes and personnel involvement from the perspective of software quality assurance. We examine eight years of data collected from six projects. Data collected varied by project but usually included defect and fault density with limited use of code metrics, schedule adherence, and budget growth information. The data are a blend of AFSCP 800-14 and suggested productivity measures in Software Metrics: A Practitioner's Guide to Improved Product Development. A software quality assurance database tool, SQUID, was used to store and tabulate the data.

  18. Empirically Driven Software Engineering Research

    NASA Astrophysics Data System (ADS)

    Rombach, Dieter

    Software engineering is a design discipline. As such, its engineering methods are based on cognitive instead of physical laws, and their effectiveness depends highly on context. Empirical methods can be used to observe the effects of software engineering methods in vivo and in vitro, to identify improvement potentials, and to validate new research results. This paper summarizes both the current body of knowledge and further challenges with respect to empirical methods in software engineering, as well as empirically derived evidence regarding typical software engineering methods. Finally, future challenges with respect to education, research, and technology transfer will be outlined.

  19. Resource utilization during software development

    NASA Technical Reports Server (NTRS)

    Zelkowitz, Marvin V.

    1988-01-01

    This paper discusses resource utilization over the life cycle of software development and discusses the role that the current 'waterfall' model plays in the actual software life cycle. Software production in the NASA environment was analyzed to measure these differences. The data from 13 different projects were collected by the Software Engineering Laboratory at NASA Goddard Space Flight Center and analyzed for similarities and differences. The results indicate that the waterfall model is not very realistic in practice, and that as technology introduces further perturbations to this model with concepts like executable specifications, rapid prototyping, and wide-spectrum languages, we need to modify our model of this process.

  20. Jitter Controller Software

    NASA Technical Reports Server (NTRS)

    Lansdowne, Chatwin; Schlensinger, Adam

    2011-01-01

    Sinusoidal jitter is produced by simply modulating a clock frequency sinusoidally with a given frequency and amplitude. But this can be expressed as phase jitter, frequency jitter, or cycle-to-cycle jitter, rms or peak, absolute units, or normalized to the base clock frequency. Jitter using other waveforms requires calculating and downloading these waveforms to an arbitrary waveform generator, and helping the user manage relationships among phase jitter crest factor, frequency jitter crest factor, and cycle-to-cycle jitter (CCJ) crest factor. Software was developed for managing these relationships, automatically configuring the generator, and saving test results documentation. Tighter management of clock jitter and jitter sensitivity is required by new codes that further extend the already high performance of space communication links, completely correcting symbol error rates higher than 10 percent, and therefore typically requiring demodulation and symbol synchronization hardware to operate at signal-to-noise ratios of less than one. To accomplish this, greater demands are also made on transmitter performance, and measurement techniques are needed to confirm performance. It was discovered early that sinusoidal jitter can be stepped on a grid such that one can connect points by constant phase jitter, constant frequency jitter, or constant cycle-to-cycle jitter. The tool automates adherence to a grid while also allowing adjustments off-grid. Also, the jitter can be set by the user on any dimension and the others are calculated. The calculations are all recorded, allowing the data to be rapidly plotted or re-plotted against different interpretations just by changing pointers to columns. A key advantage is taking data on a carefully controlled grid, which allowed a single data set to be post-analyzed many different ways. Another innovation was building a software tool to provide very tight coupling between the generator and the recorded data product, and the operator
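
    A hedged worked example of the relationships the description alludes to for purely sinusoidal jitter, using the usual definitions of period jitter and cycle-to-cycle jitter as first and second differences of edge times. The clock and modulation numbers are invented, and this is not the NASA tool's own code.

      # Hedged worked example: peak frequency, period, and cycle-to-cycle jitter
      # implied by a sinusoidal phase (time) jitter of peak amplitude a_time.
      import math

      f_clock = 10e6      # base clock frequency, Hz (invented)
      f_mod   = 50e3      # jitter modulation frequency, Hz (invented)
      a_time  = 2e-9      # peak phase jitter expressed as time displacement, s (invented)

      T = 1.0 / f_clock
      peak_freq_jitter   = 2 * math.pi * f_mod * f_clock * a_time         # Hz
      peak_period_jitter = 2 * a_time * math.sin(math.pi * f_mod * T)     # s
      peak_ccj           = 4 * a_time * math.sin(math.pi * f_mod * T)**2  # s

      print(f"peak frequency jitter     : {peak_freq_jitter:.3e} Hz")
      print(f"peak period jitter        : {peak_period_jitter:.3e} s")
      print(f"peak cycle-to-cycle jitter: {peak_ccj:.3e} s")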

  1. Software Development Group. Software Review Center. Microcomputing Working Paper Series.

    ERIC Educational Resources Information Center

    Perkey, Nadine; Smith, Shirley C.

    Two papers describe the roles of the Software Development Group (SDG) and the Software Review Center (SRC) at Drexel University. The first paper covers the primary role of the SDG, which is designed to assist Drexel faculty with the technical design and programming of courseware for the Apple Macintosh microcomputer; the relationship of the SDG…

  2. Software metrics: Software quality metrics for distributed systems. [reliability engineering

    NASA Technical Reports Server (NTRS)

    Post, J. V.

    1981-01-01

    Software quality metrics was extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.

  3. The Software Engineering Laboratory: An operational software experience factory

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Caldiera, Gianluigi; Mcgarry, Frank; Pajerski, Rose; Page, Gerald; Waligora, Sharon

    1992-01-01

    For 15 years, the Software Engineering Laboratory (SEL) has been carrying out studies and experiments for the purpose of understanding, assessing, and improving software and software processes within a production software development environment at NASA/GSFC. The SEL comprises three major organizations: (1) NASA/GSFC, Flight Dynamics Division; (2) University of Maryland, Department of Computer Science; and (3) Computer Sciences Corporation, Flight Dynamics Technology Group. These organizations have jointly carried out several hundred software studies, producing hundreds of reports, papers, and documents, all of which describe some aspect of the software engineering technology that was analyzed in the flight dynamics environment at NASA. The studies range from small, controlled experiments (such as analyzing the effectiveness of code reading versus that of functional testing) to large, multiple project studies (such as assessing the impacts of Ada on a production environment). The organization's driving goal is to improve the software process continually, so that sustained improvement may be observed in the resulting products. This paper discusses the SEL as a functioning example of an operational software experience factory and summarizes the characteristics of and major lessons learned from 15 years of SEL operations.

  4. What's New in Software? Selecting Software for Student Use.

    ERIC Educational Resources Information Center

    Ellsworth, Nancy J.; Hedley, Carolyn N.

    1993-01-01

    Suggests that, in reviewing software, educators should apply the following criteria within the context of their objectives and the students' needs: content; instructional presentation; demands placed on the learner; technical features; and documentation and management features. Concludes that integration of computer software with traditional means…

  5. Software for Acoustic Rendering

    NASA Technical Reports Server (NTRS)

    Miller, Joel D.

    2003-01-01

    SLAB is a software system that can be run on a personal computer to simulate an acoustic environment in real time. SLAB was developed to enable computational experimentation in which one can exert low-level control over a variety of signal-processing parameters, related to spatialization, for conducting psychoacoustic studies. Among the parameters that can be manipulated are the number and position of reflections, the fidelity (that is, the number of taps in finite-impulse-response filters), the system latency, and the update rate of the filters. Another goal in the development of SLAB was to provide an inexpensive means of dynamic synthesis of virtual audio over headphones, without need for special-purpose signal-processing hardware. SLAB has a modular, object-oriented design that affords the flexibility and extensibility needed to accommodate a variety of computational experiments and signal-flow structures. SLAB's spatial renderer has a fixed signal-flow architecture corresponding to a set of parallel signal paths from each source to a listener. This fixed architecture can be regarded as a compromise that optimizes efficiency at the expense of complete flexibility. Such a compromise is necessary, given the design goal of enabling computational psychoacoustic experimentation on inexpensive personal computers.

  6. Maneuver Automation Software

    NASA Technical Reports Server (NTRS)

    Uffelman, Hal; Goodson, Troy; Pellegrin, Michael; Stavert, Lynn; Burk, Thomas; Beach, David; Signorelli, Joel; Jones, Jeremy; Hahn, Yungsun; Attiyah, Ahlam; Illsley, Jeannette

    2009-01-01

    The Maneuver Automation Software (MAS) automates the process of generating commands for maneuvers to keep the spacecraft of the Cassini-Huygens mission on a predetermined prime mission trajectory. Before MAS became available, a team of approximately 10 members had to work about two weeks to design, test, and implement each maneuver in a process that involved running many maneuver-related application programs and then serially handing off data products to other parts of the team. MAS enables a three-member team to design, test, and implement a maneuver in about one-half hour after Navigation has processed tracking data. MAS accepts more than 60 parameters and 22 files as input directly from users. MAS consists of Practical Extraction and Reporting Language (PERL) scripts that link, sequence, and execute the maneuver-related application programs: "Pushing a single button" on a graphical user interface causes MAS to run navigation programs that design a maneuver; programs that create sequences of commands to execute the maneuver on the spacecraft; and a program that generates predictions about maneuver performance and generates reports and other files that enable users to quickly review and verify the maneuver design. MAS can also generate presentation materials, initiate electronic command request forms, and archive all data products for future reference.

  7. Parallel time integration software

    SciTech Connect

    2014-07-01

    This package implements an optimal-scaling multigrid solver for the (non)linear systems that arise from the discretization of problems with evolutionary behavior. Typically, solution algorithms for evolution equations are based on a time-marching approach, solving sequentially for one time step after the other. Parallelism in these traditional time-integration techniques is limited to spatial parallelism. However, current trends in computer architectures are leading towards systems with more, but not faster, processors. Therefore, faster compute speeds must come from greater parallelism. One approach to achieve parallelism in time is with multigrid, but extending classical multigrid methods for elliptic operators to this setting is a significant achievement. In this software, we implement a non-intrusive, optimal-scaling time-parallel method based on multigrid reduction techniques. The examples in the package demonstrate optimality of our multigrid-reduction-in-time algorithm (MGRIT) for solving a variety of parabolic equations in two and three spatial dimensions. These examples can also be used to show that MGRIT can achieve significant speedup in comparison to sequential time marching on modern architectures.
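
    As a rough illustration of the parallel-in-time idea described above, the sketch below implements a two-level, parareal-style iteration (the simplest relative of multigrid reduction in time) for the scalar test problem y' = -y; the propagators, step counts, and test problem are invented for illustration and are not the package's actual interface.

      # Toy parareal-style iteration for y' = lam * y on [0, T]: a cheap serial
      # coarse sweep plus accurate fine propagations over each time slice that
      # could run concurrently. Illustrative only; not the package's API.
      import math

      T, N, lam = 4.0, 8, -1.0          # interval, number of coarse time slices, decay rate
      dt = T / N
      times = [i * dt for i in range(N + 1)]

      def fine(y, t0, t1, steps=64):
          """Accurate propagator: many small forward-Euler steps (parallelizable per slice)."""
          h = (t1 - t0) / steps
          for _ in range(steps):
              y = y + h * lam * y
          return y

      def coarse(y, t0, t1):
          """Cheap propagator: a single forward-Euler step (always applied serially)."""
          return y + (t1 - t0) * lam * y

      y = [1.0] * (N + 1)
      for n in range(N):                 # initial serial coarse sweep
          y[n + 1] = coarse(y[n], times[n], times[n + 1])

      for k in range(5):                 # correction iterations
          F = [fine(y[n], times[n], times[n + 1]) for n in range(N)]   # parallel in principle
          y_new = [y[0]]
          for n in range(N):             # serial coarse correction
              y_new.append(coarse(y_new[n], times[n], times[n + 1])
                           + F[n] - coarse(y[n], times[n], times[n + 1]))
          y = y_new

      print("computed y(T) =", y[-1], " exact =", math.exp(lam * T))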

  8. Evaluating software testing strategies

    NASA Technical Reports Server (NTRS)

    Selby, R. W., Jr.; Basili, V. R.; Page, J.; Mcgarry, F. E.

    1984-01-01

    The strategies of code reading, functional testing, and structural testing are compared in three aspects of software testing: fault detection effectiveness, fault detection cost, and classes of faults detected. The major results are the following: (1) Code readers detected more faults than did those using the other techniques, while functional testers detected more faults than did structural testers; (2) Code readers had a higher fault detection rate than did those using the other methods, while there was no difference between functional testers and structural testers; (3) Subjects testing the abstract data type detected the most faults and had the highest fault detection rate, while individuals testing the database maintainer found the fewest faults and spent the most effort testing; (4) Subjects of intermediate and junior expertise were not different in number or percentage of faults found, fault detection rate, or fault detection effort; (5) Subjects of advanced expertise found a greater number of faults than did the others, found a greater percentage of faults than did just those of junior expertise, and were not different from the others in either fault detection rate or effort; and (6) Code readers and functional testers both detected more omission faults and more control faults than did structural testers, while code readers detected more interface faults than did those using the other methods.

  10. Software verification plan for GCS. [guidance and control software

    NASA Technical Reports Server (NTRS)

    Dent, Leslie A.; Shagnea, Anita M.; Hayhurst, Kelly J.

    1990-01-01

    This verification plan is written as part of an experiment designed to study the fundamental characteristics of the software failure process. The experiment will be conducted using several implementations of software that were produced according to industry-standard guidelines, namely the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, Software Considerations in Airborne Systems and Equipment Certification, for the development of flight software. This plan fulfills the DO-178A requirements for providing instructions on the testing of each implementation of software. The plan details the verification activities to be performed at each phase in the development process, contains a step-by-step description of the testing procedures, and discusses all of the tools used throughout the verification process.

  11. Should software hold data hostage?

    SciTech Connect

    Wiley, H S.; Michaels, George S.

    2004-08-01

    Software tools have become an indispensable part of modern biology, but issues surrounding proprietary file formats and closed software architectures threaten to stunt the growth of this rapidly expanding area of research. In an effort to ensure continuous software upgrades to provide a continuous income stream, some software companies have resorted to holding the user's data hostage by locking them into proprietary file and data formats. Although this might make sense from a business perspective, it violates fundamental principles of data ownership and control. Such tactics should not be tolerated by the scientific community. The future of data-intensive biology depends on ensuring open data standards and freely exchangeable file formats. Compared to the engineering and chemistry fields, computers are a relatively recent addition to the arsenal of biological tools. Thus the pool of potential users of biology-oriented software is comparatively small. Biology itself is a broad field with many sub-disciplines, such as neurobiology, biochemistry, genomics and cell biology. This creates the need for task-oriented software tools that necessarily have a small user base. Simultaneously, the task of developing software has become more complex with the need for multi-platform software and increasing user expectations of sophisticated interfaces and a high degree of usability. Writing successful software in such an environment is very challenging, but progress in biology will increasingly depend on the success of companies and individuals in creating powerful new software tools. The trend to open source software could have an enormous impact on biology by providing the large number of specialized analysis tools that are required. Indeed, in the field of bioinformatics, open source software has become pervasive, largely because of the high degree of computer skill necessary for workers in this field. For these tools to be usable by non-specialists, however, requires the

  12. Avoiding Pedagogically Naive "Captive" Software.

    ERIC Educational Resources Information Center

    Walbert, Mark S.

    An increasing number of non-statistical software packages are being written as supplementary instructional materials provided free or at low cost for economics principles textbooks. This paper reviews the software programs currently available as ancillary material to several major texts and compares what is available as a group against what should…

  13. X-Ray Analysis Software

    2007-03-09

    XRAYS is a suite of programs related to scientific software in general and x-ray and neutron scattering problems in particular. It is expected to be an ongoing program involving collaboration with other facilities. It is expected to include legacy and other existing software as well as new applications and new interfaces for existing applications.

  14. Music Software for Special Needs.

    ERIC Educational Resources Information Center

    McCord, Kimberly

    2001-01-01

    Discusses the use of computer software for students with special needs in the music classroom. Focuses on software programs that are appropriate for children with special needs such as: "Musicshop," "Band-in-a-Box," "Rock Rap'n Roll," "Music Mania," "Music Ace" and "Music Ace 2," and "Children's Songbook." (CMK)

  15. Educational Software: A Developer's Perspective.

    ERIC Educational Resources Information Center

    Armstrong, Timothy C.; Loane, Russell F.

    1994-01-01

    Examines the current status and short-term future of computer software development in higher education. Topics discussed include educational advantages of software; current program development techniques, including object oriented programming; and market trends, including IBM versus Macintosh and multimedia programs. (LRW)

  16. Department-Generated Microcomputer Software.

    ERIC Educational Resources Information Center

    Mantei, Erwin J.

    1986-01-01

    Explains how self-produced software can be used to perform rapid number analysis or number-crunching duties in geology classes. Reviews programs in mineralogy and petrology and identifies areas in geology where computers can be used effectively. Discusses the advantages and benefits of integrating department-generated software into a geology…

  17. Elementary Keyboarding Software Product Reports.

    ERIC Educational Resources Information Center

    Northwest Regional Educational Lab., Portland, OR.

    This report provides detailed product descriptions of 45 software programs designed to teach or improve the keyboarding skills of elementary school students that were identified by the MicroSIFT (Microcomputer Information and Software for Teachers) staff. The descriptions include program titles, producer names, costs, grade levels, hardware,…

  18. Putting Safety in the Software

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha S.; Berens, Kalynnda M.; Hardy, Sandra (Technical Monitor)

    2001-01-01

    Software is a vital component of nearly every piece of modern technology. It is not a 'sub-system', able to be separated out from the system as a whole, but a 'co-system' that controls, manipulates, or interacts with the hardware and with the end user. Software has its fingers into all the pieces of the pie. If that 'pie', the system, can lead to injury, death, loss of major equipment, or impact your business bottom line, then software safety becomes vitally important. Learning to think about software from a safety perspective is the focus of this paper. We want you to think of software as part of the safety critical system, a major part. This requires 'system thinking' - being able to grasp the whole picture. Software's contribution to modern technology is both good and potentially bad. Software allows more complex and useful devices to be built. It can also contribute to plane crashes and power outages. We want you to see software in a whole new light, see it as a contributor to system hazards, and also as a possible fix or mitigation to some of those hazards.

  19. A Guide to Software Evaluation.

    ERIC Educational Resources Information Center

    Leonard, Rex; LeCroy, Barbara

    Arguing that software evaluation is crucial to the quality of courseware available in a school, this paper begins by discussing reasons why microcomputers are making such a tremendous impact on education, and notes that, although the quality of software has improved over the years, the challenge for teachers to integrate computing into the…

  20. Software Development at Belle II

    NASA Astrophysics Data System (ADS)

    Kuhr, Thomas; Hauth, Thomas

    2015-12-01

    Belle II is a next generation B-factory experiment that will collect 50 times more data than its predecessor Belle. This requires not only a major upgrade of the detector hardware, but also of the simulation, reconstruction, and analysis software. The challenges of the software development at Belle II and the tools and procedures to address them are reviewed in this article.

  1. Analyzing Software Piracy in Education.

    ERIC Educational Resources Information Center

    Lesisko, Lee James

    This study analyzes the controversy of software piracy in education. It begins with a real world scenario that presents the setting and context of the problem. The legalities and background of software piracy are explained and true court cases are briefly examined. Discussion then focuses on explaining why individuals and organizations pirate…

  2. Computer Software for Process Control.

    ERIC Educational Resources Information Center

    Spector, Alfred Z.

    1984-01-01

    Computer software for process control has the primary function of communicating with and governing physical devices. The structure of such software, process-control systems, multitask systems, message passing, problems of deadlock, distributed computer systems, and protection against failure in process-control systems are among the areas examined.…

  3. Managers Handbook for Software Development

    NASA Technical Reports Server (NTRS)

    Agresti, W.; Mcgarry, F.; Card, D.; Page, J.; Church, V.; Werking, R.

    1984-01-01

    Methods and aids for the management of software development projects are presented. The recommendations are based on analyses and experiences with flight dynamics software development. The management aspects of organizing the project, producing a development plan, estimating costs, scheduling, staffing, preparing deliverable documents, using management tools, monitoring the project, conducting reviews, auditing, testing, and certifying are described.

  4. Assistive Software for Disabled Learners

    ERIC Educational Resources Information Center

    Clark, Sharon; Baggaley, Jon

    2004-01-01

    Previous reports in this series (#32 and 36) have discussed online software features of value to disabled learners in distance education. The current report evaluates four specific assistive software products with useful features for visually and hearing impaired learners: "ATutor", "ACollab", "Natural Voice", and "Just Vanilla". The evaluative…

  5. NASA Software Engineering Benchmarking Study

    NASA Technical Reports Server (NTRS)

    Rarick, Heather L.; Godfrey, Sara H.; Kelly, John C.; Crumbley, Robert T.; Wifl, Joel M.

    2013-01-01

    To identify best practices for the improvement of software engineering on projects, NASA's Offices of Chief Engineer (OCE) and Safety and Mission Assurance (OSMA) formed a team led by Heather Rarick and Sally Godfrey to conduct this benchmarking study. The primary goals of the study are to identify best practices that: Improve the management and technical development of software intensive systems; Have a track record of successful deployment by aerospace industries, universities [including research and development (R&D) laboratories], and defense services, as well as NASA's own component Centers; and Identify candidate solutions for NASA's software issues. Beginning in the late fall of 2010, focus topics were chosen and interview questions were developed, based on the NASA top software challenges. Between February 2011 and November 2011, the Benchmark Team interviewed a total of 18 organizations, consisting of five NASA Centers, five industry organizations, four defense services organizations, and four university or university R and D laboratory organizations. A software assurance representative also participated in each of the interviews to focus on assurance and software safety best practices. Interviewees provided a wealth of information on each topic area that included: software policy, software acquisition, software assurance, testing, training, maintaining rigor in small projects, metrics, and use of the Capability Maturity Model Integration (CMMI) framework, as well as a number of special topics that came up in the discussions. NASA's software engineering practices compared favorably with the external organizations in most benchmark areas, but in every topic, there were ways in which NASA could improve its practices. Compared to defense services organizations and some of the industry organizations, one of NASA's notable weaknesses involved communication with contractors regarding its policies and requirements for acquired software. One of NASA's strengths

  6. Enhancing Instruction through Software Infusion.

    ERIC Educational Resources Information Center

    Sia, Archie P.

    The presence of the computer in the classroom is no longer considered an oddity; it has become an ordinary resource for teachers to use for the enhancement of instruction. This paper presents an examination of software infusion, i.e., the use of computer software to enrich instruction in an academic curriculum. The process occurs when a chosen…

  7. Records Inventory Data Collection Software

    1995-03-01

    DATALINK was created to provide an easy-to-use data collection program for records management software products. It provides several useful tools for capturing and validating record index data in the field. It also allows users to easily create a comma-delimited, ASCII text file for data export into most records management software products.

  8. Many Paths to Learning Software.

    ERIC Educational Resources Information Center

    Harp, Candice; And Others

    1997-01-01

    Respondents drawn from a sample of licensed software users rated experimenting and asking coworkers as the most useful ways to learn new software. Clerical workers preferred interaction with trainers; knowledge workers/managers relied on experience and coworkers. Dependent learners (n=49) preferred an instructor-directed approach; self-directed…

  9. Software Development Standard Processes (SDSP)

    NASA Technical Reports Server (NTRS)

    Lavin, Milton L.; Wang, James J.; Morillo, Ronald; Mayer, John T.; Jamshidian, Barzia; Shimizu, Kenneth J.; Wilkinson, Belinda M.; Hihn, Jairus M.; Borgen, Rosana B.; Meyer, Kenneth N.; Crean, Kathleen A.; Rinker, George C.; Smith, Thomas P.; Lum, Karen T.; Hanna, Robert A.; Erickson, Daniel E.; Gamble, Edward B., Jr.; Morgan, Scott C.; Kelsay, Michael G.; Newport, Brian J.; Lewicki, Scott A.; Stipanuk, Jeane G.; Cooper, Tonja M.; Meshkat, Leila

    2011-01-01

    A JPL-created set of standard processes is to be used throughout the lifecycle of software development. These SDSPs cover a range of activities, from management and engineering activities, to assurance and support activities. These processes must be applied to software tasks per a prescribed set of procedures. JPL's Software Quality Improvement Project is currently working at the behest of the JPL Software Process Owner to ensure that all applicable software tasks follow these procedures. The SDSPs are captured as a set of 22 standards in JPL's software process domain. They were developed in-house at JPL by a number of Subject Matter Experts (SMEs) residing primarily within the Engineering and Science Directorate, but also from the Business Operations Directorate and Safety and Mission Success Directorate. These practices include not only currently performed best practices, but also JPL-desired future practices in key thrust areas like software architecting and software reuse analysis. Additionally, these SDSPs conform to many standards and requirements to which JPL projects are beholden.

  10. Special Education Microcomputer Software Directory.

    ERIC Educational Resources Information Center

    Anderson, Nancy; And Others

    Intended to help special educators select software appropriate for their classroom use, the directory contains evaluations of about 300 instructional software packages. An index lists titles by computer type and the following subject areas: computer literacy, critical thinking, grammar, life skills, literature, math computation, math problem…

  11. Reflight certification software design specifications

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The PDSS/IMC Software Design Specification for the Payload Development Support System (PDSS)/Image Motion Compensator (IMC) is contained. The PDSS/IMC is to be used for checkout and verification of the IMC flight hardware and software by NASA/MSFC.

  12. Choosing Software for Text Processing.

    ERIC Educational Resources Information Center

    Mason, Robert M.

    1983-01-01

    Review of text processing software for microcomputers covers data entry, text editing, document formatting, and spelling and proofreading programs including "Wordstar," "PeachText," "PerfectWriter," "Select," and "The Word Plus." "The Whole Earth Software Catalog" and a new terminal to be manufactured for OCLC by IBM are mentioned. (EJS)

  13. Revamp your software selection process.

    PubMed

    Allen, D J

    1999-11-01

    Very few software implementations fail from lack of functionality. More often, the failure results from other factors. Yet most companies continue to focus primarily on software functionality during the selection and evaluation process. By expanding the scope of your evaluation process to include other important factors, your probability of successful implementation and future happiness with your vendor can be dramatically enhanced.

  14. Software Solutions for Better Administration.

    ERIC Educational Resources Information Center

    Kazanjian, Edward

    1997-01-01

    The CO/OP (founded in 1973 as the Massachusetts Association of School Business Officials Cooperative Corporation) has created and produced administrative software for schools. Describes two areas in which software can increase revenue and provide protection for personnel: (1) invoice/accounts receivable for the rental of school space; and (2) an…

  15. Future of Software Engineering Standards

    NASA Technical Reports Server (NTRS)

    Poon, Peter T.

    1997-01-01

    In the new millennium, software engineering standards are expected to continue to influence the process of producing software-intensive systems which are cost-effective and of high quality. These systems may range from ground and flight systems used for planetary exploration to educational support systems used in schools as well as consumer-oriented systems.

  16. CMS Simulation Software

    NASA Astrophysics Data System (ADS)

    Banerjee, S.

    2012-12-01

    The CMS simulation, based on the Geant4 toolkit, has been operational within the new CMS software framework for more than four years. The description of the detector including the forward regions has been completed, and detailed investigation of detector positioning and material budget has been carried out using collision data. Detailed modeling of detector noise has been performed and validated with the collision data. In view of the high-luminosity runs of the Large Hadron Collider, simulation of pile-up events has become a key issue. Challenges have arisen in providing a realistic luminosity profile and modeling out-of-time pileup events, as well as in computing issues regarding memory footprint and I/O access. These will be especially severe in the simulation of collision events for the LHC upgrades; a new pileup simulation architecture has been introduced to cope with these issues. The CMS detector has observed anomalous energy deposits in the calorimeters, and there has been a substantial effort to understand these anomalous signal events present in the collision data. Emphasis has also been given to validation of the simulation code, including the physics of the underlying models of Geant4. Test beam as well as collision data are used for this purpose. Measurements of mean response, resolution, energy sharing between the electromagnetic and hadron calorimeters, and shower shapes for single hadrons are directly compared with predictions from Monte Carlo. A suite of performance analysis tools has been put in place and has been used to drive several optimizations to allow the code to fit the constraints posed by the CMS computing model.

  17. Software thermal imager simulator

    NASA Astrophysics Data System (ADS)

    Le Noc, Loic; Pancrati, Ovidiu; Doucet, Michel; Dufour, Denis; Debaque, Benoit; Turbide, Simon; Berthiaume, Francois; Saint-Laurent, Louis; Marchese, Linda; Bolduc, Martin; Bergeron, Alain

    2014-10-01

    A software application, SIST, has been developed for the simulation of the video at the output of a thermal imager. The approach offers a more suitable representation than current identification (ID) range predictors do: the end user can evaluate the adequacy of a virtual camera as if he were using it in real operating conditions. In particular, the ambiguity in the interpretation of ID range is cancelled. The application also allows for a cost-efficient determination of the optimal design of an imager and of its subsystems without over- or under-specification: the performances are known early in the development cycle for the targets, scene, and environmental conditions of interest. The simulated image is also a powerful method for testing processing algorithms. Finally, the display, which can be a severe system limitation, is also fully considered in the system by the use of real hardware components. The application consists of MATLAB routines that simulate the effects of the subsystems: atmosphere, optical lens, detector, and image processing algorithms. Calls to MODTRAN® for the atmosphere modeling and to Zemax for the optical modeling have been implemented. The realism of the simulation depends on the adequacy of the input scene for the application and on the accuracy of the subsystem parameters. For high-accuracy results, measured imager characteristics such as noise can be used with SIST instead of less accurate models. The ID ranges of potential imagers were assessed for various targets, backgrounds, and atmospheric conditions. The optimal specifications for an optical design were determined by varying the Seidel aberration coefficients to find the worst MTF that still respects the desired ID range.
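
    Purely as an illustration of the kind of subsystem chain such a simulator models, the sketch below pushes a synthetic scene through stand-ins for atmospheric attenuation, an optical blur, and detector noise; all parameter values are invented placeholders, and the real tool drives MODTRAN and Zemax models instead.

      # Hypothetical image chain: scene -> atmosphere -> optics -> detector noise.
      import numpy as np
      from scipy.ndimage import gaussian_filter

      rng = np.random.default_rng(0)
      scene = np.zeros((64, 64))
      scene[28:36, 28:36] = 1.0                                   # hot square target

      transmission = 0.7                                          # stand-in for a MODTRAN result
      blurred = gaussian_filter(scene * transmission, sigma=1.5)  # stand-in for the lens MTF
      noisy = blurred + rng.normal(0.0, 0.02, blurred.shape)      # detector temporal noise

      print("peak target signal:", round(float(noisy[28:36, 28:36].mean()), 3))
      print("background noise std:", round(float(noisy[:16, :16].std()), 3))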

  18. Toward Intelligent Software Defect Detection

    NASA Technical Reports Server (NTRS)

    Benson, Markland J.

    2011-01-01

    Source code level software defect detection has gone from state of the art to a software engineering best practice. Automated code analysis tools streamline many of the aspects of formal code inspections but have the drawback of being difficult to construct and either prone to false positives or severely limited in the set of defects that can be detected. Machine learning technology provides the promise of learning software defects by example, easing construction of detectors and broadening the range of defects that can be found. Pinpointing software defects with the same level of granularity as prominent source code analysis tools distinguishes this research from past efforts, which focused on analyzing software engineering metrics data with granularity limited to that of a particular function rather than a line of code.
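
    The sketch below is a toy version of the idea described above, learning to flag defect-prone lines of code by example at line rather than function granularity, using a generic text classifier; the labeled sample lines and the feature choice are invented and are not the paper's models or data.

      # Line-granularity defect flagging by example (tiny invented training set, illustrative only).
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      lines = [
          "ptr = malloc(n);",                           # missing NULL check
          "if (ptr != NULL) { use(ptr); }",
          "strcpy(buf, user_input);",                   # unbounded copy
          "strncpy(buf, user_input, sizeof(buf) - 1);",
          "fd = open(path, O_RDONLY);",                 # missing error check
          "if (fd < 0) { return -1; }",
      ]
      labels = [1, 0, 1, 0, 1, 0]                       # 1 = defect-prone, 0 = ok

      model = make_pipeline(TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
                            LogisticRegression(max_iter=1000))
      model.fit(lines, labels)

      for candidate in ["gets(buf);", "if (p != NULL) free(p);"]:
          score = float(model.predict_proba([candidate])[0][1])
          print(candidate, "-> defect-prone probability", round(score, 2))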

  19. Technology transfer in software engineering

    NASA Technical Reports Server (NTRS)

    Bishop, Peter C.

    1989-01-01

    The University of Houston-Clear Lake is the prime contractor for the AdaNET Research Project under the direction of NASA Johnson Space Center. AdaNET was established to promote the principles of software engineering to the software development industry. AdaNET will contain not only environments and tools, but also concepts, principles, models, standards, guidelines and practices. Initially, AdaNET will serve clients from the U.S. government and private industry who are working in software development. It will seek new clients from those who have not yet adopted the principles and practices of software engineering. Some of the goals of AdaNET are to become known as an objective, authoritative source of new software engineering information and parts, to provide easy access to information and parts, and to keep abreast of innovations in the field.

  20. An experiment in software reliability

    NASA Technical Reports Server (NTRS)

    Dunham, J. R.; Pierce, J. L.

    1986-01-01

    The results of a software reliability experiment conducted in a controlled laboratory setting are reported. The experiment was undertaken to gather data on software failures and is one in a series of experiments being pursued by the Fault Tolerant Systems Branch of NASA Langley Research Center to find a means of credibly performing reliability evaluations of flight control software. The experiment tests a small sample of implementations of radar tracking software having ultra-reliability requirements and uses n-version programming for error detection, and repetitive run modeling for failure and fault rate estimation. The experiment results agree with those of Nagel and Skrivan in that the program error rates suggest an approximate log-linear pattern and the individual faults occurred with significantly different error rates. Additional analysis of the experimental data raises new questions concerning the phenomenon of interacting faults. This phenomenon may provide one explanation for software reliability decay.
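
    To make the error-detection mechanism mentioned above concrete, the sketch below runs several independently written "versions" of a function on the same input and treats any disagreement as a suspected fault; the three versions and the seeded fault are hypothetical stand-ins, not the radar-tracking implementations from the experiment.

      # Minimal n-version error detection: majority vote plus a disagreement flag.
      from collections import Counter

      def version_a(x):
          return x * x

      def version_b(x):
          return x ** 2

      def version_c(x):
          return x * x if x < 100 else x * x + 1     # deliberately seeded fault

      def n_version_vote(x, versions):
          outputs = [v(x) for v in versions]
          value, votes = Counter(outputs).most_common(1)[0]
          disagreement = votes < len(versions)       # any mismatch flags a suspected fault
          return value, disagreement, outputs

      for x in (3, 250):
          value, disagreement, outputs = n_version_vote(x, [version_a, version_b, version_c])
          print(x, "-> voted result:", value, "| disagreement detected:", disagreement, outputs)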

  1. Fermilab Software Tools Program: Fermitools

    SciTech Connect

    Pordes, R.

    1995-10-01

    The Fermilab Software Tools Program (Fermitools) was established in 1994 as an initiative under which Fermilab provides software it has developed to outside collaborators. During the year and a half since its start, ten software products have been packaged and made available on the official Fermilab anonymous ftp site, and backup support and information services have been made available for them. During the past decade, institutions outside the Fermilab physics experiment user community have in general only been able to obtain and use Fermilab-developed software on an ad hoc or informal basis. With the Fermitools program the Fermilab Computing Division has instituted an umbrella under which software that is regarded by its internal user community as useful and of high quality can be provided to users outside of High Energy Physics experiments. The main thrust of the Fermitools program is stimulating collaborative use and further development of the software. Establishing a minimal umbrella bureaucracy makes collaborative development and support easier. The published caveat given to people who take the software includes the statement "Provision of the software implies no commitment of support by Fermilab. The Fermilab Computing Division is open to discussing other levels of support for use of the software with responsible and committed users and collaborators." There have been no negative comments in response to this, and the policy has not given rise to any questions or complaints. In this paper we present the goals and strategy of the program and introduce some of the software made available through it. We discuss our experiences to date and mention the perceived benefits of the Program.

  2. Higher order software - A methodology for defining software

    NASA Technical Reports Server (NTRS)

    Hamilton, M.; Zeldin, S.

    1976-01-01

    Higher order software (HOS) is concerned only with computable functions and relationships. The HOS methodology can be used for the definition of software for multiprogrammed, multiprocessor, or multicomputer systems. A description of HOS methodology is presented, giving attention to questions of formulation, interface correctness, specification language principles, and HOS analyzers. Aspects of system design are considered, and details of software management are discussed. Attention is given to modularity as defined by HOS, frozen module management, the assembly control supervisor, and aspects of reliability and efficiency.

  3. Software requirements: Guidance and control software development specification

    NASA Technical Reports Server (NTRS)

    Withers, B. Edward; Rich, Don C.; Lowman, Douglas S.; Buckland, R. C.

    1990-01-01

    The software requirements for an implementation of Guidance and Control Software (GCS) are specified. The purpose of the GCS is to provide guidance and engine control to a planetary landing vehicle during its terminal descent onto a planetary surface and to communicate sensory information about that vehicle and its descent to some receiving device. The specification was developed using the structured analysis for real time system specification methodology by Hatley and Pirbhai and was based on a simulation program used to study the probability of success of the 1976 Viking Lander missions to Mars. Three versions of GCS are being generated for use in software error studies.

  4. Self-assembled software and method of overriding software execution

    SciTech Connect

    Bouchard, Ann M.; Osbourn, Gordon C.

    2013-01-08

    A computer-implemented software self-assembled system and method for providing an external override and monitoring capability to dynamically self-assembling software containing machines that self-assemble execution sequences and data structures. The method provides an external override machine that can be introduced into a system of self-assembling machines while the machines are executing such that the functionality of the executing software can be changed or paused without stopping the code execution and modifying the existing code. Additionally, a monitoring machine can be introduced without stopping code execution that can monitor specified code execution functions by designated machines and communicate the status to an output device.

  5. Software process improvement in the NASA software engineering laboratory

    NASA Technical Reports Server (NTRS)

    Mcgarry, Frank; Pajerski, Rose; Page, Gerald; Waligora, Sharon; Basili, Victor; Zelkowitz, Marvin

    1994-01-01

    The Software Engineering Laboratory (SEL) was established in 1976 for the purpose of studying and measuring software processes with the intent of identifying improvements that could be applied to the production of ground support software within the Flight Dynamics Division (FDD) at the National Aeronautics and Space Administration (NASA)/Goddard Space Flight Center (GSFC). The SEL has three member organizations: NASA/GSFC, the University of Maryland, and Computer Sciences Corporation (CSC). The concept of process improvement within the SEL focuses on the continual understanding of both process and product as well as goal-driven experimentation and analysis of process change within a production environment.

  6. The SIFT hardware/software systems. Volume 2: Software listings

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.

    1985-01-01

    This document contains software listings of the SIFT operating system and application software. The software is coded for the most part in a variant of the Pascal language, Pascal*. Pascal* is a cross-compiler running on the VAX and Eclipse computers. The output of Pascal* is BDX-390 assembler code. When necessary, modules are written directly in BDX-390 assembler code. The listings in this document supplement the description of the SIFT system found in Volume 1 of this report, A Detailed Description.

  7. Modern Tools for Modern Software

    SciTech Connect

    Kumfert, G; Epperly, T

    2001-10-31

    This is a proposal for a new software configure/build tool for building, maintaining, deploying, and installing software. At its completion, this new tool will replace current standard tool suites such as ''autoconf'', ''automake'', ''libtool'', and the de facto standard build tool, ''make''. This ambitious project is born out of the realization that as scientific software has grown in size and complexity over the years, the difficulty of configuring and building software has increased as well. For high performance scientific software, additional complexities often arises from the need for portability to multiple platforms (including many one-of-a-kind platforms), multilanguage implementations, use of third party libraries, and a need to adapt algorithms to the specific features of the hardware. Development of scientific software is being hampered by the quality of configuration and build tools commonly available. Inordinate amounts of time and expertise are required to develop and maintain the configure and build system for a moderately complex project. Better build and configure tools will increase developer productivity. This proposal is a first step in a process of shoring up the foundation upon which DOE software is created and used.

  8. Packaging Software Assets for Reuse

    NASA Astrophysics Data System (ADS)

    Mattmann, C. A.; Marshall, J. J.; Downs, R. R.

    2010-12-01

    The reuse of existing software assets such as code, architecture, libraries, and modules in current software and systems development projects can provide many benefits, including reduced costs, in time and effort, and increased reliability. Many reusable assets are currently available in various online catalogs and repositories, usually broken down by disciplines such as programming language (Ibiblio for Maven/Java developers, PyPI for Python developers, CPAN for Perl developers, etc.). The way these assets are packaged for distribution can play a role in their reuse - an asset that is packaged simply and logically is typically easier to understand, install, and use, thereby increasing its reusability. A well-packaged asset has advantages in being more reusable and thus more likely to provide benefits through its reuse. This presentation will discuss various aspects of software asset packaging and how they can affect the reusability of the assets. The characteristics of well-packaged software will be described. A software packaging domain model will be introduced, and some existing packaging approaches examined. An example case study of a Reuse Enablement System (RES), currently being created by near-term Earth science decadal survey missions, will provide information about the use of the domain model. Awareness of these factors will help software developers package their reusable assets so that they can provide the most benefits for software reuse.

  9. Software reuse environment user's guide

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This document describes the services provided by the prototype Software Reuse Environment, which was developed by CTA for NASA Goddard Space Flight Center, Code 520. This is one of three guides delivered by CTA as part of the environment. The other two guides are: Software Generation and Installation Guide; and SEMANTX--Defining the Schema. The Software Generation and Installation Guide describes the software source modules that make up the Reuse Environment, with instructions on how to generate and install an executable system from the source code. SEMANTX--Defining the Schema describes how a reuse database is created. Actually this guide is more general than the reuse database, as it describes how to generate a SEMANTX database. SEMANTX is an off-the-shelf tool that we have used to implement the reuse database. It is a product of Semantyk Systems, Inc. The Software Reuse Environment is built upon SEMANTX as well as on the IDE Structured Analysis Integrated Environment. (IDE is Interactive Development Environments, Inc.) SEMANTX itself is built on top of the Unify Database Management System. To use the Software Reuse Environment you should have the User's Manuals for SEMANTX, for Unify, and for the IDE software. CTA has provided all of these with the environment.

  10. HEASARC Software Archive

    NASA Technical Reports Server (NTRS)

    White, Nicholas (Technical Monitor); Murray, Stephen S.

    2003-01-01

    (1) Chandra Archive: SAO has maintained the interfaces through which HEASARC gains access to the Chandra Data Archive. At HEASARC's request, we have implemented an anonymous ftp copy of a major part of the public archive and we keep that archive up-to-date. SAO has participated in the ADEC interoperability working group, establishing guidelines for interoperability standards and prototyping such interfaces. We have provided an NVO-based prototype interface, intending to serve the HEASARC-led NVO demo project. HEASARC's Astrobrowse interface was maintained and updated. In addition, we have participated in design discussions surrounding HEASARC's Caldb project. We have attended the HEASARC Users Group meeting and presented CDA status and developments. (2) Chandra CALDB: SAO has maintained and expanded the Chandra CALDB by including four new data file types, defining the corresponding CALDB keyword/identification structures. We have provided CALDB upgrades for the public (CIAO) and for Standard Data Processing. Approximately 40 new files have been added to the CALDB in these version releases. There have been in the past year ten of these CALDB upgrades, each with unique index configurations. In addition, with the inputs from software, archive, and calibration scientists, as well as CIAO/SDP software developers, we have defined a generalized expansion of the existing CALDB interface and indexing structure. The purpose of this is to make the CALDB more generally applicable and useful in new and future missions that will be supported archivally by HEASARC. The generalized interface will identify additional configurational keywords and permit more extensive calibration parameter and boundary condition specifications for unique file selection. HEASARC scientists and developers from SAO and GSFC have become involved in this work, which is expected to produce a new interface for general use within the current year. (3) DS9: One of the decisions that came from last year

  12. Software for Optimizing Quality Assurance of Other Software

    NASA Technical Reports Server (NTRS)

    Feather, Martin; Cornford, Steven; Menzies, Tim

    2004-01-01

    Software assurance is the planned and systematic set of activities that ensures that software processes and products conform to requirements, standards, and procedures. Examples of such activities are the following: code inspections, unit tests, design reviews, performance analyses, construction of traceability matrices, etc. In practice, software development projects have only limited resources (e.g., schedule, budget, and availability of personnel) to cover the entire development effort, of which assurance is but a part. Projects must therefore select judiciously from among the possible assurance activities. At its heart, this can be viewed as an optimization problem; namely, to determine the allocation of limited resources (time, money, and personnel) to minimize risk or, alternatively, to minimize the resources needed to reduce risk to an acceptable level. The end result of the work reported here is a means to optimize quality-assurance processes used in developing software.
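
    The sketch below illustrates the optimization framing described above, choosing assurance activities under a limited budget so as to reduce as much risk as possible, with a simple greedy heuristic; the activity names, costs, and risk-reduction figures are made up, and the actual tool's models are far richer.

      # Greedy selection of assurance activities by risk reduction per unit cost.
      activities = [
          # (name, cost in person-days, estimated risk reduction)
          ("code inspection",      10, 0.30),
          ("unit tests",           15, 0.35),
          ("design review",         8, 0.20),
          ("performance analysis", 12, 0.15),
          ("traceability matrix",   5, 0.10),
      ]
      budget = 25

      chosen, spent, reduced = [], 0, 0.0
      for name, cost, gain in sorted(activities, key=lambda a: a[2] / a[1], reverse=True):
          if spent + cost <= budget:
              chosen.append(name)
              spent += cost
              reduced += gain

      print("selected activities:", chosen)
      print("cost:", spent, "person-days, estimated risk reduction:", round(reduced, 2))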

  13. Writing software for the clinic.

    PubMed

    Rosen, I I

    1998-03-01

    Medical physicists often write computer programs to support scientific, educational, and clinical endeavors. Errors in scientific and educational software can waste time and effort by producing meaningless results, but errors in clinical software can contribute to patient injuries. Although the ultimate goal of error-free software is impossible to achieve except in very small programs, there are many good design, implementation, and testing practices that can be used by small development groups to significantly reduce errors, improve quality, and reduce maintenance. The software development process should include four basic steps: specifications, design, implementation, and testing. A specifications document defining what the software is intended to do is valuable for clearly delimiting the scope of the project and providing a benchmark for evaluating the final product. Keep the software design simple and straightforward. Document assumptions, and check them. Emphasize maintainability, portability, and reliability rather than speed. Use layers to isolate the application from hardware and the operating system. Plan for upgrades. Expect the software to be used in unplanned ways. Whenever possible, be generous with RAM and disk storage; hardware is cheaper than development and maintenance. During implementation, use well-known algorithms whenever possible. Use prototypes to try out ideas. Use generic modules, version numbering, unique file names, defensive programming, and operating system and language/compiler defaults. Avoid binary data files and clever tricks. Remember that real numbers are not exact in a computer. Get it right before making it faster. Document the software extensively. Test continuously during development; the later a problem is found, the more it costs to fix. Use a written procedure to test the final product exactly as a typical user would run it. Allow no changes after clinical release. Expect to spend at least an additional 50% of the initial
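
    One of the abstract's reminders, that real numbers are not exact in a computer, is easy to demonstrate; the snippet below shows why clinical code should compare floating-point results against a tolerance rather than testing for exact equality.

      # Floating-point results are approximations; compare with a tolerance, not ==.
      import math

      total = 0.1 + 0.2
      print(total == 0.3)               # False: 0.1 and 0.2 have no exact binary representation
      print(math.isclose(total, 0.3))   # True: tolerance-based comparison is the safer pattern
      print(f"{total:.17f}")            # 0.30000000000000004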

  14. High order software - A methodology for defining software

    NASA Technical Reports Server (NTRS)

    Hamilton, M.; Zeldin, S.

    1975-01-01

    Higher order software (HOS) is a formal methodology for reliable systems specification and development. HOS is concerned only with computable functions and their relationships for any given system. Questions of methodology are considered, taking into account aspects of formulation, meta-language principles, and HOS analyzers. Details of system design are discussed, giving attention to aspects of immediate self-control and indirect self-control. A description is given of the approaches used for software management.

  15. National software exchange - An electronic marketplace for software

    NASA Technical Reports Server (NTRS)

    Johnson, Marjory J.

    1991-01-01

    A potential application is outlined for high-speed and high-performance networks of the future. A discussion is presented of the possible contents of the NSE (National Software Exchange), services to be offered by the NSE, mechanisms for controlling software quality, and types of collaboration that might involve the NSE. Services that must be provided by the NREN (National Research and Education Network) to provide adequate support for the NSE are identified.

  16. PIV Data Validation Software Package

    NASA Technical Reports Server (NTRS)

    Blackshire, James L.

    1997-01-01

    A PIV data validation and post-processing software package was developed to provide semi-automated data validation and data reduction capabilities for Particle Image Velocimetry data sets. The software provides three primary capabilities including (1) removal of spurious vector data, (2) filtering, smoothing, and interpolating of PIV data, and (3) calculations of out-of-plane vorticity, ensemble statistics, and turbulence statistics information. The software runs on an IBM PC/AT host computer working either under Microsoft Windows 3.1 or Windows 95 operating systems.
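
    As a rough sketch of two of the operations listed above, the code below flags spurious vectors by comparing each one against the median of its neighborhood and computes out-of-plane vorticity with finite differences; the array shapes, threshold, and grid spacing are assumed for illustration and are not taken from the package.

      # Toy PIV post-processing: local-median outlier flagging and vorticity.
      import numpy as np
      from scipy.ndimage import median_filter

      def flag_spurious(u, v, threshold=2.0):
          """Mark vectors that deviate strongly from the median of their 3x3 neighborhood."""
          resid = np.hypot(u - median_filter(u, size=3), v - median_filter(v, size=3))
          return resid > threshold * np.median(resid + 1e-12)

      def vorticity(u, v, dx=1.0, dy=1.0):
          """Out-of-plane vorticity dv/dx - du/dy on a uniform grid."""
          return np.gradient(v, dx, axis=1) - np.gradient(u, dy, axis=0)

      rng = np.random.default_rng(0)
      u, v = rng.normal(size=(32, 32)), rng.normal(size=(32, 32))
      print("spurious vectors flagged:", int(flag_spurious(u, v).sum()))
      print("vorticity field shape:", vorticity(u, v).shape)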

  17. Reconfigurable Software for Mission Operations

    NASA Technical Reports Server (NTRS)

    Trimble, Jay

    2014-01-01

    We developed software that provides flexibility to mission organizations through modularity and composability. Modularity enables removal and addition of functionality through the installation of plug-ins. Composability enables users to assemble software from pre-built reusable objects, thus reducing or eliminating the walls associated with traditional application architectures and enabling unique combinations of functionality. We have used composable objects to reduce display build time, create workflows, and build scenarios to test concepts for lunar roving operations. The software is open source, and may be downloaded from https://github.com/nasa/mct.
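
    The sketch below illustrates the plug-in style of modularity described above, where functionality is added or removed by registering objects rather than by editing a monolithic application; the registry and plug-in names are hypothetical and are not the project's actual API.

      # Hypothetical plug-in registry: install/uninstall functionality without code edits.
      class PluginRegistry:
          def __init__(self):
              self._plugins = {}

          def install(self, name, plugin):
              self._plugins[name] = plugin

          def uninstall(self, name):
              self._plugins.pop(name, None)

          def run_all(self, context):
              for name, plugin in self._plugins.items():
                  print(f"[{name}]", plugin(context))

      registry = PluginRegistry()
      registry.install("telemetry-display", lambda ctx: f"showing {ctx['channel']}")
      registry.install("timeline", lambda ctx: f"plotting {ctx['channel']} over time")
      registry.run_all({"channel": "battery_voltage"})
      registry.uninstall("timeline")     # functionality removed without touching other plug-ins
      registry.run_all({"channel": "battery_voltage"})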

  18. Formal verification of mathematical software

    NASA Technical Reports Server (NTRS)

    Sutherland, D.

    1984-01-01

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  19. SSL: A software specification language

    NASA Technical Reports Server (NTRS)

    Austin, S. L.; Buckles, B. P.; Ryan, J. P.

    1976-01-01

    SSL (Software Specification Language) is a new formalism for the definition of specifications for software systems. The language provides a linear format for the representation of the information normally displayed in a two-dimensional module inter-dependency diagram. In comparing SSL to FORTRAN or ALGOL, it is found to be largely complementary to the algorithmic (procedural) languages. SSL is capable of representing explicitly module interconnections and global data flow, information which is deeply imbedded in the algorithmic languages. On the other hand, SSL is not designed to depict the control flow within modules. The SSL level of software design explicitly depicts intermodule data flow as a functional specification.
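
    SSL has its own linear textual syntax; purely as a stand-in illustration of what a specification of module interconnections and global data flow captures, the sketch below models modules, the data they produce and consume, and one consistency check (every consumed datum must have a producer). The module and data names are hypothetical.

      # Hypothetical module interconnection / data-flow model with a consistency check.
      modules = {
          "sensor_io": {"produces": {"raw_track"},      "consumes": set()},
          "filter":    {"produces": {"smoothed_track"}, "consumes": {"raw_track"}},
          "guidance":  {"produces": {"engine_command"}, "consumes": {"smoothed_track"}},
          "telemetry": {"produces": set(),              "consumes": {"smoothed_track", "ground_clock"}},  # ground_clock is deliberately undeclared
      }

      produced = set().union(*(m["produces"] for m in modules.values()))
      for name, m in modules.items():
          missing = m["consumes"] - produced
          if missing:
              print(f"specification error: {name} consumes undeclared data {missing}")
          else:
              print(f"{name}: data flow consistent")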

  20. Annotated bibliography of Software Engineering Laboratory literature

    NASA Technical Reports Server (NTRS)

    Morusiewicz, Linda; Valett, Jon

    1993-01-01

    This document is an annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. Nearly 200 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: the Software Engineering Laboratory; the Software Engineering Laboratory: software development documents; software tools; software models; software measurement; technology evaluations; Ada technology; and data collection. This document contains an index of these publications classified by individual author.

  1. Annotated bibliography of software engineering laboratory literature

    NASA Technical Reports Server (NTRS)

    Groves, Paula; Valett, Jon

    1990-01-01

    An annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory is given. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: the Software Engineering Laboratory; the Software Engineering Laboratory-software development documents; software tools; software models; software measurement; technology evaluations; Ada technology; and data collection. Subject and author indexes further classify these documents by specific topic and individual author.

  2. Annotated bibliography of Software Engineering Laboratory literature

    NASA Technical Reports Server (NTRS)

    Morusiewicz, Linda; Valett, Jon D.

    1991-01-01

    An annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory is given. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. All materials have been grouped into eight general subject areas for easy reference: The Software Engineering Laboratory; The Software Engineering Laboratory: Software Development Documents; Software Tools; Software Models; Software Measurement; Technology Evaluations; Ada Technology; and Data Collection. Subject and author indexes further classify these documents by specific topic and individual author.

  3. Improving Software Engineering on NASA Projects

    NASA Technical Reports Server (NTRS)

    Crumbley, Tim; Kelly, John C.

    2010-01-01

    The Software Engineering Initiative reduces the risk of software failure and increases mission safety, produces more predictable software cost estimates and delivery schedules, makes NASA a smarter buyer of contracted-out software, finds and removes more defects earlier, reduces duplication of effort between projects, and increases the ability to meet the challenges of evolving software technology.

  4. Proceedings of Tenth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Papers are presented on the following topics: measurement of software technology, recent studies of the Software Engineering Lab, software management tools, expert systems, error seeding as a program validation technique, software quality assurance, software engineering environments (including knowledge-based environments), the Distributed Computing Design System, and various Ada experiments.

  5. An introduction to software obfuscation.

    SciTech Connect

    Campbell, Philip LaRoche

    2004-06-01

    Obfuscation protects software by making the code more difficult to understand. We review a collection of obfuscation techniques. We then consider what would constitute a theory of obfuscation. Several possibilities that could lead to such a theory are explored.

  6. Software Systems: Consequence versus Functionality

    SciTech Connect

    Berg, Ray; Winter, Victor L.

    1999-08-05

    The purpose of this panel is to present different perspectives and opinions regarding the issues surrounding why software should or shouldn't be entrusted with critical (high consequence) functionality.

  7. A software surety analysis process

    SciTech Connect

    Trauth, S.; Tempel, P.

    1995-11-01

    As part of the High Consequence System Surety project, this work was undertaken to explore one approach to conducting a surety theme analysis for a software-driven system. Originally, plans were to develop a theoretical approach to the analysis and then to validate and refine this process by applying it to the software being developed for the Weight and Leak Check System (WALS), an automated nuclear weapon component handling system. As with the development of the higher-level High Consequence System Surety Process, this work was not completed because of changes in funding levels. This document describes the software analysis process, discusses its application in a software environment, and outlines next steps that could be taken to further develop and apply the approach to projects.

  8. Journal of Chemical Education: Software.

    ERIC Educational Resources Information Center

    Journal of Chemical Education, 1988

    1988-01-01

    Describes a chemistry software program that emulates a modern binary gradient HPLC system with reversed-phase column behavior. Allows for solvent selection, adjustment of the gradient program, column selection, detector selection, handling of computer sample data, and sample preparation. (MVL)

  9. Confined Space Imager (CSI) Software

    2013-07-03

    The software provides real-time image capture, enhancement, and display, and sensor control for the Confined Space Imager (CSI) sensor system. The software captures images over a Camera Link connection and provides the following image enhancements: camera pixel-to-pixel non-uniformity correction, optical distortion correction, image registration and averaging, and illumination non-uniformity correction. The software communicates with the custom CSI hardware over USB to control sensor parameters and is capable of saving enhanced sensor images to an external USB drive. The software provides sensor control, image capture, enhancement, and display for the CSI sensor system. It is designed to work with the custom hardware.
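    The enhancement steps listed above are standard image-processing operations. A minimal NumPy sketch of two of them, pixel non-uniformity (flat-field) correction and frame averaging, is given below; the calibration frames and array sizes are illustrative assumptions, not the CSI implementation.

      import numpy as np

      def flat_field_correct(frame, dark, flat):
          """Correct pixel-to-pixel non-uniformity using dark and flat calibration frames."""
          gain = np.mean(flat - dark) / (flat - dark)   # per-pixel gain map
          return (frame - dark) * gain

      def average_frames(frames):
          """Reduce temporal noise by averaging a stack of registered frames."""
          return np.mean(np.stack(frames), axis=0)

      # Illustrative use with synthetic frames (sizes and values are assumptions)
      dark = np.full((480, 640), 2.0)
      flat = np.random.uniform(180, 220, (480, 640))
      raw = [np.random.uniform(0, 255, (480, 640)) for _ in range(4)]
      corrected = [flat_field_correct(f, dark, flat) for f in raw]
      enhanced = average_frames(corrected)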

  10. HOMER® Energy Modeling Software 2003

    2003-12-31

    The HOMER® energy modeling software is a tool for designing and analyzing hybrid power systems, which contain a mix of conventional generators, cogeneration, wind turbines, solar photovoltaics, hydropower, batteries, fuel cells, biomass, and other inputs.

  11. Toxicity Estimation Software Tool (TEST)

    EPA Science Inventory

    The Toxicity Estimation Software Tool (TEST) was developed to allow users to easily estimate the toxicity of chemicals using Quantitative Structure Activity Relationships (QSARs) methodologies. QSARs are mathematical models used to predict measures of toxicity from the physical c...

  12. Credible Software and Simulation Uncertainty

    NASA Technical Reports Server (NTRS)

    Mehta, Unmeel B.; Nixon, David (Technical Monitor)

    1998-01-01

    The utility of software depends primarily on its reliability and performance, whereas its significance depends solely on its credibility for the intended use. The credibility of simulations confirms the credibility of software. The level of veracity and the level of validity of simulations determine the degree of credibility of the simulations. The process of assessing this credibility in fields such as computational mechanics (CM) differs from that followed by the Defense Modeling and Simulation Office in operations research. Verification and validation (V&V) of CM simulations is not the same as V&V of CM software. Uncertainty is the measure of simulation credibility. Designers who use software are concerned with the management of simulation uncertainty. Terminology and concepts are presented with a few examples from computational fluid dynamics.

  13. Managing risk in software systems

    SciTech Connect

    Fletcher, S.K.; Jansma, R.M.; Murphy, M.D.

    1995-07-01

    A methodology for risk management in the design of software systems is presented. It spans security, safety, and correct operation of software within the context of its environment, and produces a risk analysis and documented risk management strategy. It is designed to be iteratively applied, to attain appropriate levels of detail throughout the analysis. The methodology and supporting tools are discussed. The methodology is critiqued relative to other research in the field. Some sample applications of the methodology are presented.

  14. National Software Reference Library (NSRL)

    National Institute of Standards and Technology Data Gateway

    National Software Reference Library (NSRL) (PC database for purchase) A collaboration of the National Institute of Standards and Technology (NIST), the National Institute of Justice (NIJ), the Federal Bureau of Investigation (FBI), the Defense Computer Forensics Laboratory (DCFL), the U.S. Customs Service, software vendors, and state and local law enforcement organizations, the NSRL is a tool to assist in fighting crime involving computers.

  15. Programming software for usability evaluation

    SciTech Connect

    Edwards, T.L.; Allen, H.W.

    1997-01-01

    This report provides an overview of the work completed for a portion of the User Interface Testbed for Technology Packaging (UseIT) project. The authors present software methods for programming systems to record and view interactions with a graphical user interface. A brief description of the human factors design process is presented. The software methods exploit features available in the X Window System and in the Windows 95 and Windows NT operating systems.

  16. Computer systems and software engineering

    NASA Technical Reports Server (NTRS)

    Mckay, Charles W.

    1988-01-01

    The High Technologies Laboratory (HTL) was established in the fall of 1982 at the University of Houston Clear Lake. Research conducted at the High Tech Lab is focused upon computer systems and software engineering. There is a strong emphasis on the interrelationship of these areas of technology and the United States' space program. In Jan. of 1987, NASA Headquarters announced the formation of its first research center dedicated to software engineering. Operated by the High Tech Lab, the Software Engineering Research Center (SERC) was formed at the University of Houston Clear Lake. The High Tech Lab/Software Engineering Research Center promotes cooperative research among government, industry, and academia to advance the edge-of-knowledge and the state-of-the-practice in key topics of computer systems and software engineering which are critical to NASA. The center also recommends appropriate actions, guidelines, standards, and policies to NASA in matters pertinent to the center's research. Results of the research conducted at the High Tech Lab/Software Engineering Research Center have given direction to many decisions made by NASA concerning the Space Station Program.

  17. Experimental research control software system

    NASA Astrophysics Data System (ADS)

    Cohn, I. A.; Kovalenko, A. G.; Vystavkin, A. N.

    2014-05-01

    A software system intended for the automation of small-scale research has been developed. The software allows one to control equipment and to acquire and process data by means of simple scripts. The main purpose of the development is to make experiment automation easier, significantly reducing the effort needed to automate an experimental setup. In particular, minimal programming skills are required, and supervisors can review the scripts without difficulty. Interactions between scripts and equipment are managed automatically, which allows multiple scripts to run simultaneously. Unlike well-known commercial data acquisition software systems, the control is performed through an imperative scripting language. This approach eases the implementation of complex control and data acquisition algorithms. A modular interface library performs interaction with external interfaces. While the most widely used interfaces are already implemented, a simple framework has been developed for fast implementation of new software and hardware interfaces. While the software is in continuous development with new features being implemented, it is already used in our laboratory for automation of helium-3 cryostat control and data acquisition. The software is open source and distributed under the GNU General Public License.
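    The scripting interface itself is not described in the record; the sketch below only illustrates, with invented class and method names, the kind of short imperative measurement script the abstract refers to: set a parameter, wait for settling, read back data, and log the result.

      import time

      class Cryostat:
          """Stand-in for an instrument driver; the real system talks to hardware interfaces."""
          def __init__(self):
              self._setpoint = 0.3
          def set_temperature(self, kelvin):
              self._setpoint = kelvin
          def read_temperature(self):
              return self._setpoint  # a real driver would query the sensor here

      # An imperative measurement script: sweep the setpoint and log readings.
      cryostat = Cryostat()
      log = []
      for setpoint in (0.3, 0.5, 0.7):
          cryostat.set_temperature(setpoint)
          time.sleep(0.1)                      # settle time (shortened for illustration)
          log.append((setpoint, cryostat.read_temperature()))
      print(log)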

  18. Collected software engineering papers, volume 2

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Topics addressed include: summaries of the software engineering laboratory (SEL) organization, operation, and research activities; results of specific research projects in the areas of resource models and software measures; and strategies for data collection for software engineering research.

  19. Matching software practitioner needs to researcher activities

    NASA Technical Reports Server (NTRS)

    Feather, M. S.; Menzies, T.; Connelly, J. R.

    2003-01-01

    We present an approach to matching software practitioners' needs to software researchers' activities. It uses an accepted taxonomical software classification scheme as an intermediary, in terms of which practitioners express needs and researchers express activities.

  20. A Quantitative Software Risk Assessment Model

    NASA Technical Reports Server (NTRS)

    Lee, Alice

    2002-01-01

    This slide presentation reviews a risk assessment model as applied to software development. The presentation uses graphs to demonstrate basic concepts of software reliability. It also discusses the application of the risk model to the software development life cycle.

  1. Software and the future of programming languages.

    PubMed

    Aho, Alfred V

    2004-02-27

    Although software is the key enabler of the global information infrastructure, the amount and extent of software in use in the world today are not widely understood, nor are the programming languages and paradigms that have been used to create the software. The vast size of the embedded base of existing software and the increasing costs of software maintenance, poor security, and limited functionality are posing significant challenges for the software R&D community.

  2. 48 CFR 227.7202-3 - Rights in commercial computer software or commercial computer software documentation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... computer software or commercial computer software documentation. 227.7202-3 Section 227.7202-3 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202-3 Rights in commercial computer software or commercial computer software...

  3. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation....

  4. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation....

  5. 48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation....

  6. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation....

  7. 48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation....

  8. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation....

  9. 48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation....

  10. 48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation....

  11. 48 CFR 227.7202-3 - Rights in commercial computer software or commercial computer software documentation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... computer software or commercial computer software documentation. 227.7202-3 Section 227.7202-3 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202-3 Rights in commercial computer software or commercial computer software...

  12. 48 CFR 227.7202-3 - Rights in commercial computer software or commercial computer software documentation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... computer software or commercial computer software documentation. 227.7202-3 Section 227.7202-3 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202-3 Rights in commercial computer software or commercial computer software...

  13. 48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation....

  14. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation....

  15. 48 CFR 227.7202-3 - Rights in commercial computer software or commercial computer software documentation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... computer software or commercial computer software documentation. 227.7202-3 Section 227.7202-3 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202-3 Rights in commercial computer software or commercial computer software...

  16. 48 CFR 227.7202-3 - Rights in commercial computer software or commercial computer software documentation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... computer software or commercial computer software documentation. 227.7202-3 Section 227.7202-3 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202-3 Rights in commercial computer software or commercial computer software...

  17. Strengthening Software Authentication with the ROSE Software Suite

    SciTech Connect

    White, G

    2006-06-15

    Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden "backdoors" is crucial to a project's success. In this context, "authentication" is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, and C++ languages, with current collaborations to support Fortran 90). We propose to extend ROSE to address a number of security-specific requirements and apply it to software authentication for nonproliferation and arms control projects.
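    ROSE's API is not shown in this record. As a loose analogy for the rule-based flagging of suspicious constructs described above, the following Python sketch uses the standard ast module to flag calls on a watch list; it illustrates the idea of an extensible, compiler-based rule check, not ROSE itself.

      import ast

      SUSPICIOUS_CALLS = {"eval", "exec", "system"}   # a project-defined watch list

      def flag_suspicious_calls(source, filename="<source>"):
          """Report calls whose target name is on the watch list."""
          findings = []
          for node in ast.walk(ast.parse(source, filename)):
              if isinstance(node, ast.Call):
                  func = node.func
                  name = func.id if isinstance(func, ast.Name) else getattr(func, "attr", None)
                  if name in SUSPICIOUS_CALLS:
                      findings.append((filename, node.lineno, name))
          return findings

      print(flag_suspicious_calls("import os\nos.system('ls')\n"))
      # [('<source>', 2, 'system')]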

  18. WMAP C&DH Software

    NASA Technical Reports Server (NTRS)

    Cudmore, Alan; Leath, Tim; Ferrer, Art; Miller, Todd; Walters, Mark; Savadkin, Bruce; Wu, Ji-Wei; Slegel, Steve; Stagmer, Emory

    2007-01-01

    The command-and-data-handling (C&DH) software of the Wilkinson Microwave Anisotropy Probe (WMAP) spacecraft functions as the sole interface between (1) the spacecraft and its instrument subsystem and (2) ground operations equipment. This software includes a command-decoding and -distribution system, a telemetry/data-handling system, and a data-storage-and-playback system. This software performs onboard processing of attitude sensor data and generates commands for attitude-control actuators in a closed-loop fashion. It also processes stored commands and monitors health and safety functions for the spacecraft and its instrument subsystems. The basic functionality of this software is the same as that of the older C&DH software of the Rossi X-Ray Timing Explorer (RXTE) spacecraft, the main difference being the addition of the attitude-control functionality. Previously, the C&DH and attitude-control computations were performed by different processors because a single RXTE processor did not have enough processing power. The WMAP spacecraft includes a more-powerful processor capable of performing both computations.

  19. Software Complexity Threatens Performance Portability

    SciTech Connect

    Gamblin, T.

    2015-09-11

    Modern HPC software packages are rarely self-contained. They depend on a large number of external libraries, and many spend large fractions of their runtime in external subroutines. Performance portability depends not only on the effort of application teams, but also on the availability of well-tuned libraries. At most sites, the burden of maintaining libraries is shared by code teams and facilities. Facilities typically provide well-tuned default versions, but code teams frequently build with bleeding-edge compilers to achieve high performance. For this reason, HPC has no “standard” software stack, unlike other domains where performance is not critical. Incompatibilities among compilers and software versions force application teams and facility staff to re-build custom versions of libraries for each new toolchain. Because the number of potential configurations is combinatorial, and because HPC software is notoriously difficult to port to new machines [3, 7, 8], the tuning effort required to support and maintain performance-portable libraries outstrips the available manpower at most sites. Software complexity is a growing obstacle to performance portability for HPC.
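    The combinatorial growth referred to above is easy to make concrete. With purely illustrative numbers (not taken from any particular site), the count of distinct build configurations for a single package multiplies quickly:

      from math import prod

      # Illustrative axes of variation at one HPC site (the counts are assumptions).
      compilers     = 4   # e.g., GCC, Clang, Intel, vendor compiler
      mpi_libraries = 3
      build_types   = 2   # optimized, debug
      dependencies  = 5   # versions of one heavily used math library

      configurations = prod([compilers, mpi_libraries, build_types, dependencies])
      print(configurations)   # 120 distinct builds of a single package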

  20. Space-Shuttle Emulator Software

    NASA Technical Reports Server (NTRS)

    Arnold, Scott; Askew, Bill; Barry, Matthew R.; Leigh, Agnes; Mermelstein, Scott; Owens, James; Payne, Dan; Pemble, Jim; Sollinger, John; Thompson, Hiram; Thompson, James C.; Walter, Patrick; Brummel, David; Weismuller, Steven P.; Aadsen, Ron; Hurley, Keith; Ruhle, Chris

    2007-01-01

    A package of software has been developed to execute a raw binary image of the space shuttle flight software for simulation of the computational effects of operation of space shuttle avionics. This software can be run on inexpensive computer workstations. Heretofore, it was necessary to use real flight computers to perform such tests and simulations. The package includes a program that emulates the space shuttle orbiter general-purpose computer [consisting of a central processing unit (CPU), input/output processor (IOP), master sequence controller, and bus-control elements]; an emulator of the orbiter display electronics unit and models of the associated cathode-ray tubes, keyboards, and switch controls; computational models of the data-bus network; computational models of the multiplexer-demultiplexer components; an emulation of the pulse-code modulation master unit; an emulation of the payload data interleaver; a model of the master timing unit; a model of the mass memory unit; and a software component that ensures compatibility of telemetry and command services between the simulated space shuttle avionics and a mission control center. The software package is portable to several host platforms.

  1. Software Design for Smile Analysis

    PubMed Central

    Sodagar, A.; Rafatjoo, R.; Gholami Borujeni, D.; Noroozi, H.; Sarkhosh, A.

    2010-01-01

    Introduction: Esthetics and attractiveness of the smile is one of the major demands in contemporary orthodontic treatment. In order to improve a smile design, it is necessary to record the “posed smile” as an intentional, non-pressured, static, natural and reproducible smile. The record should then be analyzed to determine its characteristics. In this study, we intended to design and introduce software to analyze the smile rapidly and precisely in order to produce an attractive smile for the patients. Materials and Methods: For this purpose, a practical study was performed to design the multimedia software “Smile Analysis,” which can receive patients’ photographs and videographs. After giving records to the software, the operator should mark the points and lines which are displayed on the system’s guide and also define the correct scale for each image. Thirty-three variables are measured by the software and displayed on the report page. Reliability of measurements in both image and video was significantly high (α=0.7–1). Results: In order to evaluate intra-operator and inter-operator reliability, five cases were selected randomly. Statistical analysis showed that calculations performed in the Smile Analysis software were both valid and highly reliable (for both video and photo). Conclusion: The results obtained from smile analysis could be used in diagnosis, treatment planning and evaluation of treatment progress. PMID:21998792
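    The measurement step described above (marking landmarks and defining a per-image scale) amounts to converting pixel distances into physical units. A minimal sketch, with hypothetical landmark names and calibration values rather than the published software's variables, is:

      from math import dist

      def measure_mm(p1, p2, pixels_per_mm):
          """Distance between two marked landmarks, converted with the image's scale."""
          return dist(p1, p2) / pixels_per_mm

      # Hypothetical landmarks (pixel coordinates) and a per-image calibration
      left_commissure = (212.0, 340.0)
      right_commissure = (468.0, 338.0)
      scale = 4.0   # pixels per millimetre, set by the operator for this image

      print(round(measure_mm(left_commissure, right_commissure, scale), 1))  # ~64.0 mm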

  2. Deindividuation and Internet software piracy.

    PubMed

    Hinduja, Sameer

    2008-08-01

    Computer crime has increased exponentially in recent years as hardware, software, and network resources become more affordable and available to individuals from all walks of life. Software piracy is one prevalent type of cybercrime and has detrimentally affected the economic health of the software industry. Moreover, piracy arguably represents a rent in the moral fabric associated with respect for intellectual property and reduces the financial incentive for product creation and innovation. Deindividuation theory, originating from the field of social psychology, argues that individuals are extricated from responsibility for their actions simply because they no longer have an acute awareness of the identity of self and of others. That is, external and internal constraints that would typically regulate questionable behavior are rendered less effective via certain anonymizing and disinhibiting conditions of the social and environmental context. This exploratory piece seeks to establish the role of deindividuation in liberating individuals to commit software piracy by testing the hypothesis that persons who prefer the anonymity and pseudonymity associated with interaction on the Internet are more likely to pirate software. Through this research, it is hoped that the empirical identification of such a social psychological determinant will help further illuminate the phenomenon.

  3. Fully Employing Software Inspections Data

    NASA Technical Reports Server (NTRS)

    Shull, Forrest; Feldmann, Raimund L.; Seaman, Carolyn; Regardie, Myrna; Godfrey, Sally

    2009-01-01

    Software inspections provide a proven approach to quality assurance for software products of all kinds, including requirements, design, code, test plans, among others. Common to all inspections is the aim of finding and fixing defects as early as possible, and thereby providing cost savings by minimizing the amount of rework necessary later in the lifecycle. Measurement data, such as the number and type of found defects and the effort spent by the inspection team, provide not only direct feedback about the software product to the project team but are also valuable for process improvement activities. In this paper, we discuss NASA's use of software inspections and the rich set of data that has resulted. In particular, we present results from analysis of inspection data that illustrate the benefits of fully utilizing that data for process improvement at several levels. Examining such data across multiple inspections or projects allows team members to monitor and trigger cross project improvements. Such improvements may focus on the software development processes of the whole organization as well as improvements to the applied inspection process itself.
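    As a minimal illustration of the kind of cross-inspection aggregation discussed above (the record fields and values below are invented), defect counts can be rolled up by type across projects:

      from collections import Counter

      # Hypothetical inspection records: (project, defect_type, count)
      records = [
          ("flight_sw", "requirements", 7),
          ("flight_sw", "logic", 12),
          ("ground_sw", "requirements", 3),
          ("ground_sw", "interface", 5),
      ]

      by_type = Counter()
      for _project, defect_type, count in records:
          by_type[defect_type] += count

      print(by_type.most_common())
      # [('logic', 12), ('requirements', 10), ('interface', 5)]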

  4. Understanding software faults and their role in software reliability modeling

    NASA Technical Reports Server (NTRS)

    Munson, John C.

    1994-01-01

    This study is a direct result of an on-going project to model the reliability of a large real-time control avionics system. In previous modeling efforts with this system, hardware reliability models were applied in modeling the reliability behavior of this system. In an attempt to enhance the performance of the adapted reliability models, certain software attributes were introduced in these models to control for differences between programs and also sequential executions of the same program. As the basic nature of the software attributes that affect software reliability becomes better understood in the modeling process, this information begins to have important implications for the software development process. A significant problem arises when raw attribute measures are to be used in statistical models as predictors, for example, of measures of software quality. This is because many of the metrics are highly correlated. Consider the two attributes: lines of code, LOC, and number of program statements, Stmts. In this case, it is quite obvious that a program with a high value of LOC probably will also have a relatively high value of Stmts. In the case of low-level languages, such as assembly language programs, there might be a one-to-one relationship between the statement count and the lines of code. When there is a complete absence of linear relationship among the metrics, they are said to be orthogonal or uncorrelated. Usually the lack of orthogonality is not serious enough to affect a statistical analysis. However, for the purposes of some statistical analyses such as multiple regression, the software metrics are so strongly interrelated that the regression results may be ambiguous and possibly even misleading. Typically, it is difficult to estimate the unique effects of individual software metrics in the regression equation. The estimated values of the coefficients are very sensitive to slight changes in the data and to the addition or deletion of variables in the equation.
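    The collinearity between metrics such as LOC and Stmts is straightforward to check. A short NumPy sketch with synthetic data (the coefficients and noise level are assumptions chosen only to mimic the described relationship) illustrates how strongly correlated the two predictors can be:

      import numpy as np

      rng = np.random.default_rng(0)
      loc = rng.integers(50, 2000, size=200).astype(float)   # lines of code
      stmts = 0.8 * loc + rng.normal(0, 20, size=200)        # statement count tracks LOC closely

      r = np.corrcoef(loc, stmts)[0, 1]
      print(f"correlation(LOC, Stmts) = {r:.3f}")   # typically > 0.99 for this synthetic data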

  5. Software interoperability for energy simulation

    SciTech Connect

    Hitchcock, Robert J.

    2002-07-31

    This paper provides an overview of software interoperability as it relates to the energy simulation of buildings. The paper begins with a discussion of the difficulties in using sophisticated analysis tools like energy simulation at various stages in the building life cycle, and the potential for interoperability to help overcome these difficulties. An overview of the Industry Foundation Classes (IFC), a common data model for supporting interoperability under continuing development by the International Alliance for Interoperability (IAI) is then given. The process of creating interoperable software is described next, followed by specific details for energy simulation tools. The paper closes with the current status of, and future plans for, the ongoing efforts to achieve software interoperability.

  6. Structural Analysis and Design Software

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Collier Research and Development Corporation received a one-of-a-kind computer code for designing exotic hypersonic aircraft, called ST-SIZE, in the first-ever Langley Research Center software copyright license agreement. Collier transformed the NASA computer code into a commercial software package called HyperSizer, which integrates with other private-sector finite element modeling and finite element analysis structural analysis programs. ST-SIZE was chiefly conceived as a means to improve and speed the structural design of a future aerospace plane for the Langley Hypersonic Vehicles Office. Including the NASA computer code in HyperSizer has enabled the company to also apply the software to applications other than aerospace, including improved design and construction for offices, marine structures, cargo containers, commercial and military aircraft, rail cars, and a host of everyday consumer products.

  7. Software Updates: Web Design--Software that Makes It Easy!

    ERIC Educational Resources Information Center

    Pattridge, Gregory C.

    2002-01-01

    This article discusses Web design software that provides an easy-to-use interface. The "Netscape Communicator" is highlighted for beginning Web page construction and step-by-step instructions are provided for starting out, page colors and properties, indents, bulleted lists, tables, adding links, navigating long documents, creating e-mail links,…

  8. Software as a service approach to sensor simulation software deployment

    NASA Astrophysics Data System (ADS)

    Webster, Steven; Miller, Gordon; Mayott, Gregory

    2012-05-01

    Traditionally, military simulation has been problem-domain specific. Executing an exercise currently requires multiple simulation software providers to specialize, deploy, and configure their respective implementations, integrate the collection of software to achieve a specific system behavior, and then execute for the purpose at hand. This approach leads to rigid system integrations which require simulation expertise for each deployment due to changes in location, hardware, and software. Our alternative is Software as a Service (SaaS), predicated on the virtualization of Night Vision and Electronic Sensors Directorate (NVESD) sensor simulations as an exemplary case. Management middleware elements layer self-provisioning, configuration, and integration services onto the virtualized sensors to present a system of services at run time. Given an Infrastructure as a Service (IaaS) environment, the enabled and managed system of simulations yields a durable SaaS delivery without requiring user simulation expertise. Persistent SaaS simulations would provide on-demand availability to connected users, decrease integration costs and timelines, and benefit the domain community through immediate deployment of lessons learned.

  9. Continuous Software Integration and Quality Control during Software Development

    NASA Astrophysics Data System (ADS)

    Ettl, M.; Neidhardt, A.; Brisken, W.; Dassing, R.

    2012-12-01

    Modern software has to be stable, portable, fast, and reliable. This requires a sophisticated infrastructure supporting and providing the developers with additional information about the state and the quality of the project. That is why we have created a centralized software repository, where the whole code base is managed and version controlled on a central server. Based on this, a hierarchical build system has been developed in which each project and its sub-projects can be compiled by simply calling the top-level Makefile. On top of this, a nightly build system has been created in which the top-level Makefiles of each project are called every night. The results of the build, including the compiler warnings, are reported to the developers using generated HTML pages. In addition, all the source code is automatically checked using a static code analysis tool called "cppcheck". This tool produces warnings similar to those of a compiler, but more pedantic. The reports of this analysis are translated to HTML and reported to the developers in the same way as the nightly builds. Armed with this information, the developers can discover issues in their projects at an early development stage. In combination, these checks reduce the number of possible issues in our software and help ensure the quality of our projects at different development stages. These checks are also offered to the community. They are currently used within the DiFX software correlator project.
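    A stripped-down sketch of the nightly loop described above, building each project via its top-level Makefile, running cppcheck, and collecting the output for a report, might look like the following; the project paths are assumptions, only widely documented cppcheck options are used, and the real system additionally generates HTML pages.

      import subprocess
      from pathlib import Path

      # Assumed project locations; the real repository layout is not published here.
      PROJECTS = [Path("~/src/correlator").expanduser(),
                  Path("~/src/fieldsystem").expanduser()]

      def nightly(projects):
          report = []
          for proj in projects:
              build = subprocess.run(["make", "-C", str(proj)],
                                     capture_output=True, text=True)
              check = subprocess.run(["cppcheck", "--enable=all", str(proj)],
                                     capture_output=True, text=True)
              # cppcheck writes its findings to stderr by default
              report.append((proj.name, build.returncode, check.stderr))
          return report

      for name, status, warnings in nightly(PROJECTS):
          print(name, "OK" if status == 0 else "BUILD FAILED")
          print(warnings)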

  10. Software engineering and automatic continuous verification of scientific software

    NASA Astrophysics Data System (ADS)

    Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.

    2011-12-01

    Software engineering of scientific code is challenging for a number of reasons, including pressure to publish and a lack of awareness of the pitfalls of software engineering by scientists. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employ best-practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and is coupled to a number of other codes including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code and has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people, for which we use Bazaar for revision control, making good use of its strong branching and merging capabilities. Using features of Canonical's Launchpad platform, such as code review, blueprints for designing features, and bug reporting, gives the group, partners, and other Fluidity users an easy-to-use platform to collaborate and allows the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification, and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK National HPC services. Testing the code in this manner leads to a continuous verification process, not a discrete event performed once development has ceased. Much of the code verification is done via the "gold standard" of comparisons to analytical solutions.
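    One of the test categories mentioned, code verification against analytical solutions, reduces to comparing a computed field with a closed-form answer to within a tolerance. The generic sketch below (not Fluidity's actual test harness; the solver is a stand-in) shows the pattern:

      import numpy as np

      def solve_diffusion_numerically(x, t, kappa):
          """Placeholder for a numerical solver; here it just adds a small 'discretisation' error."""
          return np.exp(-kappa * np.pi**2 * t) * np.sin(np.pi * x) + 1e-6

      def analytical_solution(x, t, kappa):
          return np.exp(-kappa * np.pi**2 * t) * np.sin(np.pi * x)

      def test_against_analytical():
          x = np.linspace(0.0, 1.0, 101)
          numerical = solve_diffusion_numerically(x, t=0.1, kappa=1.0)
          exact = analytical_solution(x, t=0.1, kappa=1.0)
          assert np.max(np.abs(numerical - exact)) < 1e-3   # "gold standard" comparison

      test_against_analytical()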

  11. Navigation/Prop Software Suite

    NASA Technical Reports Server (NTRS)

    Bruchmiller, Tomas; Tran, Sanh; Lee, Mathew; Bucker, Scott; Bupane, Catherine; Bennett, Charles; Cantu, Sergio; Kwong, Ping; Propst, Carolyn

    2012-01-01

    Navigation (Nav)/Prop software is used to support shuttle mission analysis, production, and some operations tasks. The Nav/Prop suite containing configuration items (CIs) resides on IPS/Linux workstations. It features lifecycle documents and data files used for shuttle navigation and propellant analysis for all flight segments. This suite also includes trajectory server, archive server, and RAT software residing on MCC/Linux workstations. Navigation/Prop represents tool versions established during or after IPS Equipment Rehost-3 or after the MCC Rehost.

  12. Software to Manage the Unmanageable

    NASA Technical Reports Server (NTRS)

    2005-01-01

    In 1995, NASA's Jet Propulsion Laboratory (JPL) contracted Redmond, Washington-based Lucidoc Corporation to design a technology infrastructure to automate the intersection between policy management and operations management with advanced software that automates document workflow, document status, and uniformity of document layout. JPL had very specific parameters for the software. It expected to store and catalog over 8,000 technical and procedural documents integrated with hundreds of processes. The project ended in 2000, but NASA still uses the resulting highly secure document management system, and Lucidoc has managed to help other organizations, large and small, with integrating document flow and operations management to ensure a compliance-ready culture.

  13. Revision and product generation software

    USGS Publications Warehouse

    ,

    1997-01-01

    The U.S. Geological Survey (USGS) developed revision and product generation (RevPG) software for updating digital line graph (DLG) data and producing maps from such data. This software is based on ARC/INFO, a geographic information system from the Environmental Systems Research Institute (ESRI). RevPG consists of ARC/INFO Arc Macro Language (AML) programs, C routines, and interface menus that permit operators to collect vector data using aerial images, to symbolize the data on-screen, and to produce plots and color-separated files for use in printing maps.

  14. Revision and Product Generation Software

    USGS Publications Warehouse

    ,

    1999-01-01

    The U.S. Geological Survey (USGS) developed revision and product generation (RevPG) software for updating digital line graph (DLG) data and producing maps from such data. This software is based on ARC/INFO, a geographic information system from the Environmental Systems Research Institute (ESRI). RevPG consists of ARC/INFO Arc Macro Language (AML) programs, C routines, and interface menus that permit operators to collect vector data using aerial images, to symbolize the data on-screen, and to produce plots and color-separated files for use in printing maps.

  15. Software design by reusing architectures

    NASA Technical Reports Server (NTRS)

    Bhansali, Sanjay; Nii, H. Penny

    1992-01-01

    Abstraction fosters reuse by providing a class of artifacts that can be instantiated or customized to produce a set of artifacts meeting different specific requirements. It is proposed that significant leverage can be obtained by abstracting software system designs and the design process. The result of such an abstraction is a generic architecture and a set of knowledge-based customization tools that can be used to instantiate the generic architecture. An approach for designing software systems based on the above idea is described. The approach is illustrated through an implemented example, and the advantages and limitations of the approach are discussed.

  16. Software synthesis using generic architectures

    NASA Technical Reports Server (NTRS)

    Bhansali, Sanjay

    1993-01-01

    A framework for synthesizing software systems based on abstracting software system designs and the design process is described. The result of such an abstraction process is a generic architecture and the process knowledge for customizing the architecture. The customization process knowledge is used to assist a designer in customizing the architecture, as opposed to completely automating the design of systems. Our approach is illustrated with an implemented example of a generic tracking architecture that was customized in two different domains. We also describe how the designs produced using KASE compare with the original designs of the two systems, as well as current work and plans for extending KASE to other application areas.

  17. Light duty utility arm software requirements specification

    SciTech Connect

    Kiebel, G.R.

    1995-12-18

    This document defines the software requirements for the integrated control and data acquisition system of the Light Duty Utility Arm (LDUA) System. It is intended to be used to guide the design of the application software, to be a basis for assessing the application software design, and to establish what is to be tested in the finished application software product.

  18. Bioboxes: standardised containers for interchangeable bioinformatics software.

    PubMed

    Belmann, Peter; Dröge, Johannes; Bremges, Andreas; McHardy, Alice C; Sczyrba, Alexander; Barton, Michael D

    2015-01-01

    Software is now both central and essential to modern biology, yet lack of availability, difficult installations, and complex user interfaces make software hard to obtain and use. Containerisation, as exemplified by the Docker platform, has the potential to solve the problems associated with sharing software. We propose bioboxes: containers with standardised interfaces to make bioinformatics software interchangeable.

  19. Bioboxes: standardised containers for interchangeable bioinformatics software.

    PubMed

    Belmann, Peter; Dröge, Johannes; Bremges, Andreas; McHardy, Alice C; Sczyrba, Alexander; Barton, Michael D

    2015-01-01

    Software is now both central and essential to modern biology, yet lack of availability, difficult installations, and complex user interfaces make software hard to obtain and use. Containerisation, as exemplified by the Docker platform, has the potential to solve the problems associated with sharing software. We propose bioboxes: containers with standardised interfaces to make bioinformatics software interchangeable. PMID:26473029

  20. Software Development as Music Education Research

    ERIC Educational Resources Information Center

    Brown, Andrew R.

    2007-01-01

    This paper discusses how software development can be used as a method for music education research. It explains how software development can externalize ideas, stimulate action and reflection, and provide evidence to support the educative value of new software-based experiences. Parallels between the interactive software development process and…

  1. 48 CFR 12.212 - Computer software.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Computer software. 12.212... ACQUISITION OF COMMERCIAL ITEMS Special Requirements for the Acquisition of Commercial Items 12.212 Computer software. (a) Commercial computer software or commercial computer software documentation shall be...

  2. 48 CFR 12.212 - Computer software.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Computer software. 12.212... ACQUISITION OF COMMERCIAL ITEMS Special Requirements for the Acquisition of Commercial Items 12.212 Computer software. (a) Commercial computer software or commercial computer software documentation shall be...

  3. 48 CFR 12.212 - Computer software.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Computer software. 12.212... ACQUISITION OF COMMERCIAL ITEMS Special Requirements for the Acquisition of Commercial Items 12.212 Computer software. (a) Commercial computer software or commercial computer software documentation shall be...

  4. 48 CFR 12.212 - Computer software.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Computer software. 12.212... ACQUISITION OF COMMERCIAL ITEMS Special Requirements for the Acquisition of Commercial Items 12.212 Computer software. (a) Commercial computer software or commercial computer software documentation shall be...

  5. 48 CFR 12.212 - Computer software.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Computer software. 12.212... ACQUISITION OF COMMERCIAL ITEMS Special Requirements for the Acquisition of Commercial Items 12.212 Computer software. (a) Commercial computer software or commercial computer software documentation shall be...

  6. Sandia software guidelines, Volume 4: Configuration management

    SciTech Connect

    Not Available

    1992-06-01

    This volume is one in a series of Sandia Software Guidelines for use in producing quality software within Sandia National Laboratories. This volume is based on the IEEE standard and guide for software configuration management. The basic concepts and detailed guidance on implementation of these concepts are discussed for several software project types. Example planning documents for both projects and organizations are included.

  7. A multiple node software development environment

    SciTech Connect

    Heinicke, P.; Nicinski, T.; Constanta-Fanourakis, P.; Petravick, D.; Pordes, R.; Ritchie, D.; White, V.

    1987-06-01

    Experimenters on over 30 DECnet nodes at Fermilab use software developed, distributed, and maintained by the Data Acquisition Software Group. A general methodology and set of tools have been developed to distribute, use and manage the software on different sites. The methodology and tools are of interest to any group developing and using software on multiple nodes.

  8. Software Piracy, Ethics, and the Academician.

    ERIC Educational Resources Information Center

    Bassler, Richard A.

    The numerous software programs available for easy, low-cost copying raise ethical questions. The problem can be examined from the viewpoints of software users, teachers, authors, vendors, and distributors. Software users might hesitate to purchase or use software which prevents the making of back-up copies for program protection. Teachers in…

  9. Ten recommendations for software engineering in research.

    PubMed

    Hastings, Janna; Haug, Kenneth; Steinbeck, Christoph

    2014-01-01

    Research in the context of data-driven science requires a backbone of well-written software, but scientific researchers are typically not trained at length in software engineering, the principles for creating better software products. To address this gap, in particular for young researchers new to programming, we give ten recommendations to ensure the usability, sustainability and practicality of research software.

  10. The cost of software fault tolerance

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1982-01-01

    The proposed use of software fault tolerance techniques as a means of reducing software costs in avionics, and as a means of addressing system unreliability due to faults in software, is examined. A model is developed to provide a view of the relationships among cost, redundancy, and reliability; the model suggests unconventional strategies for software development and maintenance.
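    The record does not give the model itself, but the qualitative relationship it examines can be sketched with a simple illustrative calculation: adding independent software versions raises estimated reliability while cost grows roughly linearly. The independence assumption is strong and is made here only for illustration, not as the paper's model.

      def n_version_estimates(per_version_reliability, per_version_cost, max_versions=4):
          """Illustrative trade-off: reliability vs. cost as redundant versions are added."""
          rows = []
          for n in range(1, max_versions + 1):
              reliability = 1 - (1 - per_version_reliability) ** n   # assumes independent failures
              rows.append((n, reliability, n * per_version_cost))
          return rows

      for n, rel, cost in n_version_estimates(0.99, 1.0):
          print(f"{n} version(s): reliability ~ {rel:.6f}, relative cost ~ {cost:.1f}")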

  11. Requirements Engineering in Building Climate Science Software

    ERIC Educational Resources Information Center

    Batcheller, Archer L.

    2011-01-01

    Software has an important role in supporting scientific work. This dissertation studies teams that build scientific software, focusing on the way that they determine what the software should do. These requirements engineering processes are investigated through three case studies of climate science software projects. The Earth System Modeling…

  12. Software engineering as an engineering discipline

    NASA Technical Reports Server (NTRS)

    Gibbs, Norman

    1988-01-01

    The goals of the Software Engineering Institute's Education Program are as follows: to increase the number of highly qualified software engineers--new software engineers and existing practitioners; and to be the leading center of expertise for software engineering education and training. A discussion of these goals is presented in vugraph form.

  13. Software engineering as an engineering discipline

    NASA Technical Reports Server (NTRS)

    Berard, Edward V.

    1988-01-01

    The following topics are discussed in the context of software engineering: early use of the term; the 1968 NATO conference; Barry Boehm's definition; four requirements for software engineering; and additional criteria for software engineering. Additionally, the four major requirements for software engineering--computer science, mathematics, engineering disciplines, and excellent communication skills--are discussed. The presentation is given in vugraph form.

  14. User systems guidelines for software projects

    SciTech Connect

    Abrahamson, L.

    1986-04-01

    This manual presents guidelines for software standards which were developed so that software project-development teams and management involved in approving the software could have a generalized view of all phases in the software production procedure and the steps involved in completing each phase. Guidelines are presented for six phases of software development: project definition, building a user interface, designing software, writing code, testing code, and preparing software documentation. The discussions for each phase include examples illustrating the recommended guidelines. 45 refs. (DWL)

  15. Annotated bibliography of software engineering laboratory literature

    NASA Technical Reports Server (NTRS)

    Buhler, Melanie; Valett, Jon

    1989-01-01

    An annotated bibliography is presented of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. The bibliography was updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials were grouped into eight general subject areas for easy reference: (1) The Software Engineering Laboratory; (2) The Software Engineering Laboratory: Software Development Documents; (3) Software Tools; (4) Software Models; (5) Software Measurement; (6) Technology Evaluations; (7) Ada Technology; and (8) Data Collection. Subject and author indexes further classify these documents by specific topic and individual author.

  16. Software Engineering Improvement Activities/Plan

    NASA Technical Reports Server (NTRS)

    2003-01-01

    bd Systems personnel accomplished the technical responsibilities for this reporting period, as planned. A close working relationship was maintained with personnel of the MSFC Avionics Department Software Group (ED14). Work accomplishments included development, evaluation, and enhancement of a software cost model, performing a literature search and evaluation of software tools available for code analysis and requirements analysis, and participating in other relevant software engineering activities. Monthly reports were submitted. This support was provided to the Flight Software Group/ED14 in accomplishing the software engineering improvement activities of the Marshall Space Flight Center (MSFC) Software Engineering Improvement Plan.

  17. GridOPTICS Software System

    2014-02-24

    GridOPTICS Software System (GOSS) is a middleware that facilitates the creation of new, modular, and flexible operational and planning platforms that can meet the challenges of the next-generation power grid. GOSS enables the Department of Energy, power system utilities, and vendors to build better tools faster. GOSS makes it possible to integrate Future Power Grid Initiative software products/prototypes into existing power grid software systems, including the PNNL PowerNet and EIOC environments. GOSS is designed to allow power grid applications developed for different underlying software platforms installed in different utilities to communicate with ease. This can be done in compliance with existing security and data sharing policies between the utilities. GOSS not only supports one-to-one data transfer between applications, but also a publisher/subscriber scheme. To support the interoperability requirements of future EMS, GOSS is designed for CIM compliance. In addition, it supports authentication and authorization capabilities to protect the system from cyber threats. In summary, the contributions of the GOSS middleware are as follows: • A platform to support future EMS development. • A middleware that promotes interoperability between power grid applications. • A distributed architecture that separates data sources from power grid applications. • Support for data exchange with either one-to-one or publisher/subscriber interfaces. • An authentication and authorization scheme for limiting the access to data between utilities.
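    GOSS's own interfaces are not reproduced in this record. The publisher/subscriber exchange it supports can be illustrated with a minimal in-process sketch; the topic names and payload fields below are invented, and GOSS itself is a distributed middleware rather than a single-process broker.

      from collections import defaultdict

      class MessageBus:
          """Minimal publish/subscribe broker, for illustration only."""
          def __init__(self):
              self._subscribers = defaultdict(list)
          def subscribe(self, topic, callback):
              self._subscribers[topic].append(callback)
          def publish(self, topic, message):
              for callback in self._subscribers[topic]:
                  callback(message)

      bus = MessageBus()
      bus.subscribe("pmu/measurements", lambda msg: print("app received:", msg))
      bus.publish("pmu/measurements", {"bus_id": 42, "frequency_hz": 59.98})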

  18. Journal of Chemical Education: Software.

    ERIC Educational Resources Information Center

    Journal of Chemical Education, 1989

    1989-01-01

    Described are two reviews of a computer software package: "The Periodic Table Stack," a Hypercard stack that operates on a database of information about the properties and reactions of the elements and designed to help users of Macintosh computers access data found on "KC? Discoverer." Discussed are features of the program and hardware…

  19. Analog Input Data Acquisition Software

    NASA Technical Reports Server (NTRS)

    Arens, Ellen

    2009-01-01

    DAQ Master Software allows users to easily set up a system to monitor up to five analog input channels and save the data after acquisition. This program was written in LabVIEW 8.0, and requires the LabVIEW runtime engine 8.0 to run the executable.
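    DAQ Master itself is a LabVIEW executable, so no source is shown in the record. As a language-shifted sketch of the same pattern, polling a handful of analog input channels and saving the samples afterwards, the following Python snippet uses an invented read_channel stub in place of real hardware access:

      import csv, random, time

      NUM_CHANNELS = 5

      def read_channel(channel):
          """Stand-in for an analog input read; real hardware would be queried here."""
          return random.uniform(-10.0, 10.0)

      samples = []
      for _ in range(100):                                  # acquisition loop
          samples.append([time.time()] + [read_channel(ch) for ch in range(NUM_CHANNELS)])
          time.sleep(0.01)

      with open("acquisition.csv", "w", newline="") as f:   # save after acquisition
          writer = csv.writer(f)
          writer.writerow(["timestamp"] + [f"ch{ch}" for ch in range(NUM_CHANNELS)])
          writer.writerows(samples)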

  20. ERP Software Implementation Best Practices.

    ERIC Educational Resources Information Center

    Frantz, Pollyanne S.; Southerland, Arthur R.; Johnson, James T.

    2002-01-01

    Studied the perceptions of chief financial and information officers of enterprise resource planning (ERP) software implementation best practices. Usable responses from 159 respondents show consensus for the most part between the perceptions of the two groups and describe some best practices that represent common ground. (SLD)