Software Engineering Improvement Activities/Plan
NASA Technical Reports Server (NTRS)
2003-01-01
bd Systems personnel accomplished the technical responsibilities for this reporting period, as planned. A close working relationship was maintained with personnel of the MSFC Avionics Department Software Group (ED14). Work accomplishments included development, evaluation, and enhancement of a software cost model, performing a literature search and evaluation of software tools available for code analysis and requirements analysis, and participating in other relevant software engineering activities. Monthly reports were submitted. This support was provided to the Flight Software Group/ED14 in accomplishing the software engineering improvement activities of the Marshall Space Flight Center (MSFC) Software Engineering Improvement Plan.
Progress in the Development of a Prototype Reuse Enablement System
NASA Astrophysics Data System (ADS)
Marshall, J. J.; Downs, R. R.; Gilliam, L. J.; Wolfe, R. E.
2008-12-01
An important part of promoting software reuse is to ensure that reusable software assets are readily available to the software developers who want to use them. Through dialogs with the community, the NASA Earth Science Data Systems Software Reuse Working Group has learned that the lack of a centralized, domain-specific software repository or catalog system addressing the needs of the Earth science community is a major barrier to software reuse within the community. The Working Group has proposed the creation of such a reuse enablement system, which would provide capabilities for contributing and obtaining reusable software, to remove this barrier. The Working Group has recommended the development of a Reuse Enablement System to NASA and has performed a trade study to review systems with similar capabilities and to identify potential platforms for the proposed system. This was followed by an architecture study to determine an expeditious and cost-effective solution for this system. A number of software packages and systems were examined, both by creating prototypes and by examining existing systems that use the same software packages and systems. Based on the results of the architecture study, the Working Group developed a prototype of the proposed system using the recommended software package, through an iterative process of identifying needed capabilities and improving the system to provide those capabilities. Policies for the operation and maintenance of the system are being established, and the identification of system policies also has contributed to the development process. Additionally, a test plan is being developed for formal testing of the prototype, to ensure that it meets all of the requirements previously developed by the Working Group. This poster summarizes the results of our work to date, focusing on the most recent activities.
Software Technology Transfer and Export Control.
1981-01-01
development projects of their own. By analogy, a Soviet team might be able to repeat the learning experience of the ADEPT-50 junior staff...recommendations concerning product form and further study. The posture of this group has been to consider software technology and its transfer as a process...and views of the Software Subgroup of Technical Working Group 7 (Computers) of the Critical Technologies Project. The work reported
Software Engineering Improvement Plan
NASA Technical Reports Server (NTRS)
2006-01-01
In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.
Software Development Group. Software Review Center. Microcomputing Working Paper Series.
ERIC Educational Resources Information Center
Perkey, Nadine; Smith, Shirley C.
Two papers describe the roles of the Software Development Group (SDG) and the Software Review Center (SRC) at Drexel University. The first paper covers the primary role of the SDG, which is designed to assist Drexel faculty with the technical design and programming of courseware for the Apple Macintosh microcomputer; the relationship of the SDG…
Enhancing Collaborative Learning through Group Intelligence Software
NASA Astrophysics Data System (ADS)
Tan, Yin Leng; Macaulay, Linda A.
Employers increasingly demand not only academic excellence from graduates but also excellent interpersonal skills and the ability to work collaboratively in teams. This paper discusses the role of Group Intelligence software in helping to develop these higher order skills in the context of an enquiry based learning (EBL) project. The software supports teams in generating ideas, categorizing, prioritizing, voting and multi-criteria decision making and automatically generates a report of each team session. Students worked in a Group Intelligence lab designed to support both face to face and computer-mediated communication and employers provided feedback at two key points in the year long team project. Evaluation of the effectiveness of Group Intelligence software in collaborative learning was based on five key concepts of creativity, participation, productivity, engagement and understanding.
Software tools for interactive instruction in radiologic anatomy.
Alvarez, Antonio; Gold, Garry E; Tobin, Brian; Desser, Terry S
2006-04-01
To promote active learning in an introductory Radiologic Anatomy course through the use of computer-based exercises. DICOM datasets from our hospital PACS system were transferred to a networked cluster of desktop computers in a medical school classroom. Medical students in the Radiologic Anatomy course were divided into four small groups and assigned to work on a clinical case for 45 minutes. The groups used iPACS viewer software, a free DICOM viewer, to view images and annotate anatomic structures. The classroom instructor monitored and displayed each group's work sequentially on the master screen by running SynchronEyes, a software tool for controlling PC desktops remotely. Students were able to execute the assigned tasks using the iPACS software with minimal oversight or instruction. Course instructors displayed each group's work on the main display screen of the classroom as the students presented the rationale for their decisions. The interactive component of the course received high ratings from the students and overall course ratings were higher than in prior years when the course was given solely in lecture format. DICOM viewing software is an excellent tool for enabling students to learn radiologic anatomy from real-life clinical datasets. Interactive exercises performed in groups can be powerful tools for stimulating students to learn radiologic anatomy.
Scientific Software - the role of best practices and recommendations
NASA Astrophysics Data System (ADS)
Fritzsch, Bernadette; Bernstein, Erik; Castell, Wolfgang zu; Diesmann, Markus; Haas, Holger; Hammitzsch, Martin; Konrad, Uwe; Lähnemann, David; McHardy, Alice; Pampel, Heinz; Scheliga, Kaja; Schreiber, Andreas; Steglich, Dirk
2017-04-01
In Geosciences - like in most other communities - scientific work strongly depends on software. For big data analysis, existing (closed or open source) program packages are often mixed with newly developed codes. Different versions of software components and varying configurations can influence the result of data analysis. This often makes reproducibility of results and reuse of codes very difficult. Policies for publication and documentation of used and newly developed software, along with best practices, can help tackle this problem. Within the Helmholtz Association a Task Group "Access to and Re-use of scientific software" was implemented by the Open Science Working Group in 2016. The aim of the Task Group is to foster the discussion about scientific software in the Open Science context and to formulate recommendations for the production and publication of scientific software, ensuring open access to it. As a first step, a workshop gathered interested scientists from institutions across Germany. The workshop brought together various existing initiatives from different scientific communities to analyse current problems, share established best practices and come up with possible solutions. The subjects in the working groups covered a broad range of themes, including technical infrastructures, standards and quality assurance, citation of software and reproducibility. Initial recommendations are presented and discussed in the talk. They are the foundation for further discussions in the Helmholtz Association and the Priority Initiative "Digital Information" of the Alliance of Science Organisations in Germany. The talk aims to inform about the activities and to link with other initiatives on the national or international level.
Introduction of the UNIX International Performance Management Work Group
NASA Technical Reports Server (NTRS)
Newman, Henry
1993-01-01
In this paper we present the planned direction of the UNIX International Performance Management Work Group. This group consists of concerned system developers and users who have organized to synthesize recommendations for standard UNIX performance management subsystem interfaces and architectures. The purpose of these recommendations is to provide a core set of performance management functions that can be used to build tools by hardware system developers, vertical application software developers, and performance application software developers.
From LPF to eLISA: new approach in payload software
NASA Astrophysics Data System (ADS)
Gesa, Ll.; Martin, V.; Conchillo, A.; Ortega, J. A.; Mateos, I.; Torrents, A.; Lopez-Zaragoza, J. P.; Rivas, F.; Lloro, I.; Nofrarias, M.; Sopuerta, C. F.
2017-05-01
eLISA will be the first observatory in space to explore the Gravitational Universe. It will gather revolutionary information about the dark universe. This requires robust and reliable embedded control software and hardware working together. Taking the lessons learned from the LISA Pathfinder payload software as a baseline, this short article introduces the key concepts and new approaches that our group is working on in terms of software: multiprocessing, self-modifying-code strategies, 100% hardware and software monitoring, embedded scripting, and time and space partitioning, among others.
NASA Technical Reports Server (NTRS)
Hamel, Gary P.; Wijesinghe, R.
1996-01-01
Groupware is a term describing an emerging computer software technology that enhances the ability of people to work together as a group (a software-driven 'group support system'). This project originated at the beginning of 1992, and reports were issued describing the activity through May 1995. These reports stressed the need for process as well as technology. That is, while the technology represented a computer-assisted method for groups to work together, the Group Support System (GSS) technology also required an understanding of the facilitation process that electronic meetings demand. Even people trained in traditional facilitation techniques did not necessarily adopt groupware techniques readily. The latest phase of this activity attempted to (1) improve the facilitation process by developing training support for a portable groupware computer system, and (2) explore settings and uses for the portable groupware system using different software, such as Lotus Notes.
NASA Technical Reports Server (NTRS)
Garcia, Janette
2016-01-01
The National Aeronautics and Space Administration (NASA) is creating a way to send humans beyond low Earth orbit, and later to Mars. Kennedy Space Center (KSC) is working to make this possible by developing the Spaceport Command and Control System (SCCS), which will allow the launch of the Space Launch System (SLS). This paper focuses on the work performed by the author during the first and second parts of her internship as a remote application software developer. During the first part of her internship, the author worked on the SCCS software application layer by assisting multiple ground subsystem teams, including Launch Accessories (LACC) and the Environmental Control System (ECS), with the design, development, integration, and testing of remote control software applications. During the second part of the internship, the author worked on the development of robot software at the Swamp Works Laboratory, a research and technology development group that focuses on inventing new technology to support future In-Situ Resource Utilization (ISRU) missions.
HEP Software Foundation Community White Paper Working Group - Detector Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Apostolakis, J.
A working group on detector simulation was formed as part of the high-energy physics (HEP) Software Foundation's initiative to prepare a Community White Paper that describes the main software challenges and opportunities to be faced in the HEP field over the next decade. The working group met over a period of several months in order to review the current status of the Full and Fast simulation applications of HEP experiments and the improvements that will need to be made in order to meet the goals of future HEP experimental programmes. The scope of the topics covered includes the main components of a HEP simulation application, such as MC truth handling, geometry modeling, particle propagation in materials and fields, physics modeling of the interactions of particles with matter, the treatment of pileup and other backgrounds, as well as signal processing and digitisation. The resulting work programme described in this document focuses on the need to improve both the software performance and the physics of detector simulation. The goals are to increase the accuracy of the physics models and expand their applicability to future physics programmes, while achieving large factors in computing performance gains consistent with projections on available computing resources.
Working Group 1: Software System Design and Implementation for Environmental Modeling
ISCMEM Working Group One presentation, with the purpose of fostering the exchange of information about environmental modeling tools, modeling frameworks, and environmental monitoring databases.
Parker, Steve; Mayner, Lidia; Michael Gillham, David
2015-12-01
Undergraduate nursing students are often confused by multiple understandings of critical thinking. In response to this situation, the Critiique for critical thinking (CCT) project was implemented to provide consistent structured guidance about critical thinking. This paper introduces Critiique software, describes initial validation of the content of this critical thinking tool and explores wider applications of the Critiique software. Critiique is flexible, authorable software that guides students step-by-step through critical appraisal of research papers. The spelling of Critiique was deliberate, so as to acquire a unique web domain name and associated logo. The CCT project involved implementation of a modified nominal focus group process with academic staff working together to establish common understandings of critical thinking. Previous work established a consensus about critical thinking in nursing and provided a starting point for the focus groups. The study was conducted at an Australian university campus with the focus group guided by open ended questions. Focus group data established categories of content that academic staff identified as important for teaching critical thinking. This emerging focus group data was then used to inform modification of Critiique software so that students had access to consistent and structured guidance in relation to critical thinking and critical appraisal. The project succeeded in using focus group data from academics to inform software development while at the same time retaining the benefits of broader philosophical dimensions of critical thinking.
Parker, Steve; Mayner, Lidia; Michael Gillham, David
2015-01-01
Background: Undergraduate nursing students are often confused by multiple understandings of critical thinking. In response to this situation, the Critiique for critical thinking (CCT) project was implemented to provide consistent structured guidance about critical thinking. Objectives: This paper introduces Critiique software, describes initial validation of the content of this critical thinking tool and explores wider applications of the Critiique software. Materials and Methods: Critiique is flexible, authorable software that guides students step-by-step through critical appraisal of research papers. The spelling of Critiique was deliberate, so as to acquire a unique web domain name and associated logo. The CCT project involved implementation of a modified nominal focus group process with academic staff working together to establish common understandings of critical thinking. Previous work established a consensus about critical thinking in nursing and provided a starting point for the focus groups. The study was conducted at an Australian university campus with the focus group guided by open ended questions. Results: Focus group data established categories of content that academic staff identified as important for teaching critical thinking. This emerging focus group data was then used to inform modification of Critiique software so that students had access to consistent and structured guidance in relation to critical thinking and critical appraisal. Conclusions: The project succeeded in using focus group data from academics to inform software development while at the same time retaining the benefits of broader philosophical dimensions of critical thinking. PMID:26835469
Delay Tolerant Networking on NASA's Space Communication and Navigation Testbed
NASA Technical Reports Server (NTRS)
Johnson, Sandra; Eddy, Wesley
2016-01-01
This presentation covers the status of the implementation of open source software that implements the specifications developed by the CCSDS Working Group. Interplanetary Overlay Network (ION) is open source software, and it implements specifications that have been developed by two international working groups through the IETF and CCSDS. ION was implemented on the SCaN Testbed, a testbed located on an external pallet on the ISS, by the GRC team. The presentation will cover the architecture of the system, high-level implementation details, and issues porting ION to VxWorks.
Software And Systems Engineering Risk Management
2010-04-01
Excerpted slide content: a timeline of risk management standards (RSKM 2004; COSO Enterprise RSKM Framework 2006; ISO/IEC 16085 Risk Management Process 2008; ISO/IEC 12207 Software Lifecycle Processes 2009; ISO/IEC ...). Presenter: John Walz, VP Technical and Conferences Activities, IEEE Computer Society; Vice-Chair Planning ... Software & Systems Engineering Standards Committee, IEEE Computer Society; US TAG to ISO TMB Risk Management Working Group, Systems and Software
Reflecting Indigenous Culture in Educational Software Design.
ERIC Educational Resources Information Center
Fleer, Marilyn
1989-01-01
Discusses research on Australian Aboriginal cognition which relates to the development of appropriate educational software. Describes "Tinja," a software program using familiar content and experiences, Aboriginal characters and cultural values, extensive graphics and animation, peer and group work, and open-ended design to help young…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Habib, Salman; Roser, Robert
Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Habib, Salman; Roser, Robert; LeCompte, Tom
2015-10-29
Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.
Sharing Research Models: Using Software Engineering Practices for Facilitation
Bryant, Stephanie P.; Solano, Eric; Cantor, Susanna; Cooley, Philip C.; Wagener, Diane K.
2011-01-01
Increasingly, researchers are turning to computational models to understand the interplay of important variables on systems’ behaviors. Although researchers may develop models that meet the needs of their investigation, application limitations—such as nonintuitive user interface features and data input specifications—may limit the sharing of these tools with other research groups. By removing these barriers, other research groups that perform related work can leverage these work products to expedite their own investigations. The use of software engineering practices can enable managed application production and shared research artifacts among multiple research groups by promoting consistent models, reducing redundant effort, encouraging rigorous peer review, and facilitating research collaborations that are supported by a common toolset. This report discusses three established software engineering practices— the iterative software development process, object-oriented methodology, and Unified Modeling Language—and the applicability of these practices to computational model development. Our efforts to modify the MIDAS TranStat application to make it more user-friendly are presented as an example of how computational models that are based on research and developed using software engineering practices can benefit a broader audience of researchers. PMID:21687780
Program Model Checking as a New Trend
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Visser, Willem; Clancy, Daniel (Technical Monitor)
2002-01-01
This paper introduces a special section of STTT (International Journal on Software Tools for Technology Transfer) containing a selection of papers that were presented at the 7th International SPIN workshop, Stanford, August 30 - September 1, 2000. The workshop was named SPIN Model Checking and Software Verification, with an emphasis on model checking of programs. The paper outlines the motivation for stressing software verification, rather than only design and model verification, by presenting the work done in the Automated Software Engineering group at NASA Ames Research Center within the last 5 years. This includes work in software model checking, testing-like technologies, and static analysis.
Using Qualitative Data Analysis Software in Teaching about Group Work Practice
ERIC Educational Resources Information Center
Macgowan, Mark J.; Beaulaurier, Richard L.
2005-01-01
Courses on social group work have traditionally relied on in-class role plays to teach group work skills. The most common technological aid in such courses has been analog videotape. In recent years new technologies have emerged that allow the instructor to customize and tailor didactic experiences to individual classes and individual learners.…
Irmak, A; Bumin, G; Irmak, R
2012-01-01
In direct proportion to current technological developments, computer usage in the workplace has increased, while the need for an office worker to leave the desk in order to photocopy a document or send or receive an e-mail has decreased. Therefore, office workers stay in the same postures during long periods of keyboard usage. In recent years, with intent to reduce the incidence of work-related musculoskeletal disorders, several exercise reminder software programs have been developed. The purpose of this study is to evaluate the effectiveness of an exercise reminder software program on office workers' perceived pain level, work performance and quality of life. 39 healthy office workers agreed to participate in the study. Participants were randomly split into two groups, a control group (n = 19) and an intervention group (n = 20). A Visual Analogue Scale to evaluate perceived pain was administered to all of the participants at the beginning and at the end of the study. The intervention group used the program for 10 weeks. Findings showed that the control group VAS scores remained the same, but the intervention group VAS scores decreased in a statistically significant way (p < 0.01). Results support that such exercise reminder software programs may help to reduce perceived pain among office workers. Further long-term studies with more subjects are needed to describe the effects of these programs and the mechanism underlying these effects.
[Development of ophthalmologic software for handheld devices].
Grottone, Gustavo Teixeira; Pisa, Ivan Torres; Grottone, João Carlos; Debs, Fernando; Schor, Paulo
2006-01-01
The formulas for calculation of intraocular lenses have evolved since the first theoretical formulas by Fyodorov. Among the second-generation formulas, the SRK-I formula involves a simple calculation that takes into account only the anteroposterior (axial) length, the IOL constant, and average keratometry. With the evolution of those formulas, complexity increased, making the reconfiguration of parameters in special situations impracticable. In this way the production and development of software for such a purpose can help surgeons to recalculate those values if needed. To idealize, develop and test a Brazilian software program for calculation of IOL dioptric power for handheld computers. For the development and programming of the IOL calculation software, we used the PocketC program (OrbWorks Concentrated Software, USA). We compared the results collected from a gold-standard device (Ultrascan/Alcon Labs) with the simulation of 100 fictitious patients, using the same IOL parameters. The results were grouped as ULTRASCAN data and SOFTWARE data. Using the SRK/T formula, the range of those parameters included keratometry varying between 35 and 55 D, axial length between 20 and 28 mm, and IOL constants of 118.7, 118.3 and 115.8. Using the Wilcoxon test, it was shown that the groups do not differ (p=0.314). We had a variation in the Ultrascan sample between 11.82 and 27.97. In the tested program sample the variation was practically identical (11.83-27.98). The average of the Ultrascan group was 20.93. The software group had a similar average. The standard deviation of the samples was also similar (4.53). The precision of the IOL software for handheld devices was similar to that of the standard device using the SRK/T formula. The software worked properly and was stable, without bugs, in the tested models of the operating system.
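The study's comparisons used the SRK/T formula, which is considerably longer; purely as an illustration of the simpler second-generation calculation described above, the classic SRK-I regression combines exactly the three parameters cited (the lens A constant, axial length, and average keratometry). A minimal Python sketch, with an example input chosen only to fall inside the ranges simulated in the study:

```python
def srk_i_iol_power(a_constant: float, axial_length_mm: float, mean_keratometry_d: float) -> float:
    """SRK-I regression formula: P = A - 2.5*L - 0.9*K (IOL power in diopters)."""
    return a_constant - 2.5 * axial_length_mm - 0.9 * mean_keratometry_d

# Hypothetical example within the parameter ranges simulated in the study
print(srk_i_iol_power(118.7, 23.5, 44.0))  # 118.7 - 58.75 - 39.6 = 20.35 D
```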
Report on the Third Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE3)
NASA Astrophysics Data System (ADS)
Katz, Daniel S.; Choi, Sou-Cheng T.; Niemeyer, Kyle E.; Hetherington, James; Löffler, Frank; Gunter, Dan; Idaszak, Ray; Brandt, Steven R.; Miller, Mark A.; Gesing, Sandra; Jones, Nick D.; Weber, Nic; Marru, Suresh; Allen, Gabrielle; Penzenstadler, Birgit; Venters, Colin C.; Davis, Ethan; Hwang, Lorraine; Todorov, Ilian; Patra, Abani; de Val-Borro, Miguel
2016-02-01
This report records and discusses the Third Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE3). The report includes a description of the keynote presentation of the workshop, which served as an overview of sustainable scientific software. It also summarizes a set of lightning talks in which speakers highlighted to-the-point lessons and challenges pertaining to sustaining scientific software. The final and main contribution of the report is a summary of the discussions, future steps, and future organization for a set of self-organized working groups on topics including developing pathways to funding scientific software; constructing useful common metrics for crediting software stakeholders; identifying principles for sustainable software engineering design; reaching out to research software organizations around the world; and building communities for software sustainability. For each group, we include a point of contact and a landing page that can be used by those who want to join that group's future activities. The main challenge left by the workshop is to see if the groups will execute these activities that they have scheduled, and how the WSSSPE community can encourage this to happen.
Educational Software Evaluation Form for Teachers
ERIC Educational Resources Information Center
Kara, Yilmaz
2007-01-01
The purpose of the study was to develop an educational software evaluation form to provide an evaluation and selection instrument of educational software that met the requirements of some balance between mechanics, content and pedagogy that is user friendly. The subjects for the study comprised a group of 32 biology teachers working in secondary…
Perspectives on Group Work in Distance Learning
ERIC Educational Resources Information Center
Hausstatter, Rune Sarromaa; Nordkvelle, Yngve Troye
2007-01-01
Current distance education benefits greatly from educational software that makes group work possible for students who are separated in time and space. However, some students prefer distance education because they can work on their own. This paper explores how students react to expectations on behalf of the course provider to do their assignments…
NASA Astrophysics Data System (ADS)
Kang, Won-Seok; Son, Chang-Sik; Lee, Sangho; Choi, Rock-Hyun; Ha, Yeong-Mi
2017-07-01
In this paper, we introduce a wellness software platform called WellnessHumanCare, a semi-automatic wellness management software platform that provides functions for complex wellness data acquisition (mental, physical and environmental) with smart wearable devices, complex wellness condition analysis, privacy-aware online/offline recommendation, real-time monitoring apps (smartphone-based and web-based), and so on. We have demonstrated a wellness management service over 3 months with 79 participants (experimental group: 39, at H Corp.; control group: 40, at K Corp.) in Korea in order to show the efficiency of WellnessHumanCare.
NASA Technical Reports Server (NTRS)
1989-01-01
At their March 1988 meeting, members of the National Aeronautics and Space Administration (NASA) Information Resources Management (IRM) Council expressed concern that NASA may not have the infrastructure necessary to support the use of Ada for major NASA software projects. Members also observed that the agency has no coordinated strategy for applying its experiences with Ada to subsequent projects (Hinners, 27 June 1988). To deal with these problems, the IRM Council chair appointed an intercenter Ada and Software Management Assessment Working Group (ASMAWG). They prepared a report (McGarry et al., March 1989) entitled, 'Ada and Software Management in NASA: Findings and Recommendations'. That report presented a series of recommendations intended to enable NASA to develop better software at lower cost through the use of Ada and other state-of-the-art software engineering technologies. The purpose here is to describe the steps (called objectives) by which this goal may be achieved, to identify the NASA officials or organizations responsible for carrying out the steps, and to define a schedule for doing so. This document sets forth four goals: adopt agency-wide software standards and policies; use Ada as the programming language for all mission software; establish an infrastructure to support software engineering, including the use of Ada, and to leverage the agency's software experience; and build the agency's knowledge base in Ada and software engineering. A schedule for achieving the objectives and goals is given.
NASA Technical Reports Server (NTRS)
Dunham, J. R. (Editor); Knight, J. C. (Editor)
1982-01-01
The state of the art in the production of crucial software for flight control applications was addressed. The association between reliability metrics and software is considered. Thirteen software development projects are discussed. A short-term need for research in the areas of tool development and software fault tolerance was indicated. For the long term, research in formal verification or proof methods was recommended. Formal specification and software reliability modeling were recommended as topics for both short- and long-term research.
The purpose of the Interagency Steering Committee on Multimedia Environmental Modeling (ISCMEM) is to foster the exchange of information about environmental modeling tools, modeling frameworks, and environmental monitoring databases that are all in the public domain. It is compos...
Tools to Support the Reuse of Software Assets for the NASA Earth Science Decadal Survey Missions
NASA Technical Reports Server (NTRS)
Mattmann, Chris A.; Downs, Robert R.; Marshall, James J.; Most, Neal F.; Samadi, Shahin
2011-01-01
The NASA Earth Science Data Systems (ESDS) Software Reuse Working Group (SRWG) is chartered with the investigation, production, and dissemination of information related to the reuse of NASA Earth science software assets. One major current objective is to engage the NASA decadal missions in areas relevant to software reuse. In this paper we report on the current status of these activities. First, we provide some background on the SRWG in general and then discuss the group's flagship recommendation, the NASA Reuse Readiness Levels (RRLs). We continue by describing areas in which mission software may be reused in the context of NASA decadal missions. We conclude the paper with pointers to future directions.
Avionics Simulation, Development and Software Engineering
NASA Technical Reports Server (NTRS)
2002-01-01
During this reporting period, all technical responsibilities were accomplished as planned. A close working relationship was maintained with personnel of the MSFC Avionics Department Software Group (ED14), the MSFC EXPRESS Project Office (FD31), and the Huntsville Boeing Company. Accomplishments included: performing special tasks; supporting Software Review Board (SRB), Avionics Test Bed (ATB), and EXPRESS Software Control Panel (ESCP) activities; participating in technical meetings; and coordinating issues between the Boeing Company and the MSFC Project Office.
Sequence System Building Blocks: Using a Component Architecture for Sequencing Software
NASA Technical Reports Server (NTRS)
Streiffert, Barbara A.; O'Reilly, Taifun
2005-01-01
Over the last few years software engineering has made significant strides in making more flexible architectures and designs possible. However, at the same time, spacecraft have become more complex and flight software has become more sophisticated. Typically spacecraft are often one-of-a-kind entities that have different hardware designs, different capabilities, different instruments, etc. Ground software has become more complex and operations teams have had to learn a myriad of tools that all have different user interfaces and represent data in different ways. At Jet Propulsion Laboratory (JPL) these themes have collided to require a new approach to producing ground system software. Two different groups have been looking at tackling this particular problem. One group is working for the JPL Mars Technology Program in the Mars Science Laboratory (MSL) Focused Technology area. The other group is the JPL Multi-Mission Planning and Sequencing Group. The major concept driving these two approaches on a similar path is to provide software that can be a more cohesive flexible system that provides a set of planning and sequencing services. This paper describes the efforts that have been made to date to create a unified approach from these disparate groups.
Sequencing System Building Blocks: Using a Component Architecture for Sequencing Software
NASA Technical Reports Server (NTRS)
Streiffert, Barbara A.; O'Reilly, Taifun
2006-01-01
Over the last few years software engineering has made significant strides in making more flexible architectures and designs possible. However, at the same time, spacecraft have become more complex and flight software has become more sophisticated. Typically spacecraft are often one-of-a-kind entities that have different hardware designs, different capabilities, different instruments, etc. Ground software has become more complex and operations teams have had to learn a myriad of tools that all have different user interfaces and represent data in different ways. At Jet Propulsion Laboratory (JPL) these themes have collided to require a new approach to producing ground system software. Two different groups have been looking at tackling this particular problem. One group is working for the JPL Mars Technology Program in the Mars Science Laboratory (MSL) Focused Technology area. The other group is the JPL Multi-Mission Planning and Sequencing Group. The major concept driving these two approaches on a similar path is to provide software that can be a more cohesive flexible system that provides a set of planning and sequencing services. This paper describes the efforts that have been made to date to create a unified approach from these disparate groups.
Sailer, Irena; Benic, Goran I; Fehmer, Vincent; Hämmerle, Christoph H F; Mühlemann, Sven
2017-07-01
Clinical studies are needed to evaluate the entire digital and conventional workflows in prosthetic dentistry. The purpose of the second part of this clinical study was to compare the laboratory production time for tooth-supported single crowns made with 4 different digital workflows and 1 conventional workflow and to compare these crowns clinically. For each of 10 participants, a monolithic crown was fabricated in lithium disilicate-reinforced glass ceramic (IPS e.max CAD). The computer-aided design and computer-aided manufacturing (CAD-CAM) systems were Lava C.O.S. CAD software and centralized CAM (group L), Cares CAD software and centralized CAM (group iT), Cerec Connect CAD software and lab side CAM (group CiL), and Cerec Connect CAD software with centralized CAM (group CiD). The conventional fabrication (group K) included a wax pattern of the crown and heat pressing according to the lost-wax technique (IPS e.max Press). The time for the fabrication of the casts and the crowns was recorded. Subsequently, the crowns were clinically evaluated and the corresponding treatment times were recorded. The Paired Wilcoxon test with the Bonferroni correction was applied to detect differences among treatment groups (α=.05). The total mean (±standard deviation) active working time for the dental technician was 88 ±6 minutes in group L, 74 ±12 minutes in group iT, 74 ±5 minutes in group CiL, 92 ±8 minutes in group CiD, and 148 ±11 minutes in group K. The dental technician spent significantly more working time for the conventional workflow than for the digital workflows (P<.001). No statistically significant differences were found between group L and group CiD or between group iT and group CiL. No statistical differences in time for the clinical evaluation were found among groups, indicating similar outcomes (P>.05). Irrespective of the CAD-CAM system, the overall laboratory working time for a digital workflow was significantly shorter than for the conventional workflow, since the dental technician needed less active working time. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
2011-05-27
Excerpted slide content: quality assurance frameworks (CMMI-DEV; IEEE/ISO/IEC 15288 and 12207); IEEE life cycle processes and artifacts; systems life cycle processes ... US TAG to ISO TC 176 Quality Management; presenter background in quality (ASQ, work experience), software (three books, consulting, work experience), and systems (Telecom & DoD) ... and IEEE 730 SQA need to align. The P730 IEEE standards working group has expanded the scope of the SQA process standard to align with ISO/IEC 12207.
V & V Within Reuse-Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward A.
1996-01-01
Verification and validation (V&V) is used to increase the level of assurance of critical software, particularly that of safety-critical and mission critical software. This paper describes the working group's success in identifying V&V tasks that could be performed in the domain engineering and transition levels of reuse-based software engineering. The primary motivation for V&V at the domain level is to provide assurance that the domain requirements are correct and that the domain artifacts correctly implement the domain requirements. A secondary motivation is the possible elimination of redundant V&V activities at the application level. The group also considered the criteria and motivation for performing V&V in domain engineering.
Health software: a new CEI Guide for software management in medical environment.
Giacomozzi, Claudia; Martelli, Francesco
2016-01-01
The increasing spread of software components in the healthcare context renders explanatory guides relevant and mandatory to interpret laws and standards, and to support safe management of software products in healthcare. In 2012 a working group was established for the above purposes at the Italian Electrotechnical Committee (CEI), composed of experts from the Italian National Institute of Health (ISS), representatives of industry, and representatives of the healthcare organizations. As a first outcome of the group's activity, Guide CEI 62-237 was published in February 2015. The Guide incorporates an innovative approach based on the proper contextualization of software products, either medical devices or not, to the specific healthcare scenario, and addresses the risk management of IT systems. The Guide provides operators and manufacturers with an interpretative support with many detailed examples to facilitate the proper contextualization and management of health software, in compliance with related European and international regulations and standards.
A Stigmergy Approach for Open Source Software Developer Community Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Xiaohui; Beaver, Justin M; Potok, Thomas E
2009-01-01
The stigmergy collaboration approach provides a hypothesized explanation of how online groups work together. In this research, we present a stigmergy approach for building an agent-based open source software (OSS) developer community collaboration simulation. We used a group of actors who collaborate on OSS projects as our frame of reference and investigated how the choices actors make in contributing their work to the projects determine the global status of the whole OSS project. In our simulation, the forum posts and project codes served as the digital pheromone, and the modified Pierre-Paul Grasse pheromone model is used for computing developer agent behavior selection probabilities.
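The abstract does not give the exact equations of the modified Grasse model, but pheromone-driven choice is commonly modeled as selection with probability proportional to accumulated pheromone. A minimal Python sketch under that assumption, with hypothetical task names and pheromone values that are not taken from the paper:

```python
import random

def choose_task(pheromone: dict[str, float], bias: float = 1.0) -> str:
    """Pick a task with probability proportional to its pheromone level.

    pheromone: digital pheromone per task, e.g. accumulated forum posts
               and code contributions (hypothetical values below).
    bias: exponent controlling how strongly agents follow the pheromone.
    """
    weights = {task: level ** bias for task, level in pheromone.items()}
    total = sum(weights.values())
    r = random.uniform(0, total)
    cumulative = 0.0
    for task, weight in weights.items():
        cumulative += weight
        if r <= cumulative:
            return task
    return task  # fallback for floating-point edge cases

# Hypothetical example: an agent choosing among three OSS project tasks
print(choose_task({"fix_bug_123": 5.0, "write_docs": 1.0, "new_feature": 3.0}))
```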
Software Design Improvements. Part 2; Software Quality and the Design and Inspection Process
NASA Technical Reports Server (NTRS)
Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom
1997-01-01
The application of assurance engineering techniques improves the duration of failure-free performance of software. The totality of features and characteristics of a software product is what determines its ability to satisfy customer needs. Software in safety-critical systems is very important to NASA. We follow the System Safety Working Group's definition for system safety software: 'The optimization of system safety in the design, development, use and maintenance of software and its integration with safety-critical systems in an operational environment.' 'If it is not safe, say so' has become our motto. This paper goes over methods that have been used by NASA to make software design improvements by focusing on software quality and the design and inspection process.
Hamilton, Ryan; Tamminana, Krishna; Boyd, John; Sasaki, Gen; Toda, Alex; Haskell, Sid; Danbe, Elizabeth
2013-04-01
We present a software platform developed by Genentech and MathWorks Consulting Group that allows arbitrary MATLAB (MATLAB is a registered trademark of The MathWorks, Inc.) functions to perform supervisory control of process equipment (in this case, fermentors) via the OLE for Process Control (OPC) communication protocol, under the direction of an industrial automation layer. The software features automated synchronization and deployment of server control code and has been proven to be tolerant of OPC communication interruptions. Since deployment in the spring of 2010, this software has successfully performed supervisory control of more than 700 microbial fermentations in the Genentech pilot plant and has enabled significant reductions in the time required to develop and implement novel control strategies (months reduced to days). The software is available for download at the MathWorks File Exchange Web site at http://www.mathworks.com/matlabcentral/fileexchange/36866.
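The published code is MATLAB-based and available at the File Exchange link above. Purely as a language-neutral illustration of what supervisory control means in this setting (read process values, compute corrections, write setpoints back through the automation layer), the following Python sketch substitutes stand-in read/write functions for an actual OPC client and uses a made-up pH control rule; none of it reflects the Genentech implementation:

```python
import time

def read_process_value(tag: str) -> float:
    """Stand-in for an OPC read; a real system would query the OPC server."""
    return 6.8  # pretend measured pH

def write_setpoint(tag: str, value: float) -> None:
    """Stand-in for an OPC write back to the automation layer."""
    print(f"{tag} <- {value:.2f}")

def supervise(cycles: int = 3, target_ph: float = 7.0) -> None:
    """Simple supervisory loop: read a measurement, compute a correction,
    and write a new setpoint each cycle (hypothetical control rule)."""
    for _ in range(cycles):
        ph = read_process_value("FERM01.PH")
        base_flow = max(0.0, (target_ph - ph) * 0.5)  # proportional correction
        write_setpoint("FERM01.BASE_FLOW_SP", base_flow)
        time.sleep(1)  # a real loop would pace itself against the process

supervise()
```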
Characterizing the scientific potential of satellite sensors. [San Francisco, California
NASA Technical Reports Server (NTRS)
1984-01-01
Eleven thematic mapper (TM) radiometric calibration programs were tested and evaluated in support of the task to characterize the potential of LANDSAT TM digital imagery for scientific investigations in the Earth sciences and terrestrial physics. Three software errors related to integer overflow, division by zero, and a nonexistent file group were found and fixed. Raw, calibrated, and corrected image groups that were created and stored on the Barker2 disk are enumerated. Black-and-white pixel print files were created for various subscenes of a San Francisco scene (ID 40392-18152). The development of linear regression software is discussed. The output of the software and its function are described. Future work in TM radiometric calibration, image processing, and software development is outlined.
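The report does not detail the linear regression procedure, but radiometric calibration of this kind typically fits a gain and offset relating raw digital numbers to known radiance. A minimal Python sketch with made-up sample values, offered only as an illustration of that idea:

```python
import numpy as np

# Hypothetical calibration samples: raw digital numbers (DN) vs. known radiance
dn = np.array([12.0, 55.0, 98.0, 141.0, 184.0])
radiance = np.array([1.1, 5.0, 8.9, 12.8, 16.7])  # e.g. W m^-2 sr^-1 um^-1

# Least-squares fit of radiance = gain * DN + offset
gain, offset = np.polyfit(dn, radiance, 1)
print(f"gain={gain:.4f}, offset={offset:.4f}")

# Apply the fit to convert a raw pixel value to calibrated radiance
print(gain * 70.0 + offset)
```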
A User’s Guide for the Software Technology Economic Impact Model
1991-10-01
IDA Document D-971, A User's Guide for the Software Technology Economic Impact Model (AD-A248 023). ... studied, and they are released by the President of IDA. Group Reports record the findings and results of IDA-established working groups and ... the senior individuals responsible for the project and others as selected by IDA to ensure their high quality and relevance to the problems studied
Ada and software management in NASA: Assessment and recommendations
NASA Technical Reports Server (NTRS)
1989-01-01
Recent NASA missions have required software systems that are larger, more complex, and more critical than NASA software systems of the past. The Ada programming language and the software methods and support environments associated with it are seen as potential breakthroughs in meeting NASA's software requirements. The findings of a study by the Ada and Software Management Assessment Working Group (ASMAWG) are presented. The study was chartered to perform three tasks: (1) assess the agency's ongoing and planned Ada activities; (2) assess the infrastructure (standards, policies, and internal organizations) supporting software management and the Ada activities; and (3) present an Ada implementation and use strategy appropriate for NASA over the next 5 years.
Reuse of Software Assets for the NASA Earth Science Decadal Survey Missions
NASA Technical Reports Server (NTRS)
Mattmann, Chris A.; Downs, Robert R.; Marshall, James J.; Most, Neal F.; Samadi, Shahin
2010-01-01
Software assets from existing Earth science missions can be reused for the new decadal survey missions that are being planned by NASA in response to the 2007 Earth Science National Research Council (NRC) Study. The new missions will require the development of software to curate, process, and disseminate the data to science users of interest and to the broader NASA mission community. In this paper, we discuss new tools and a blossoming community that are being developed by the Earth Science Data System (ESDS) Software Reuse Working Group (SRWG) to improve capabilities for reusing NASA software assets.
Data format standard for sharing light source measurements
NASA Astrophysics Data System (ADS)
Gregory, G. Groot; Ashdown, Ian; Brandenburg, Willi; Chabaud, Dominique; Dross, Oliver; Gangadhara, Sanjay; Garcia, Kevin; Gauvin, Michael; Hansen, Dirk; Haraguchi, Kei; Hasna, Günther; Jiao, Jianzhong; Kelley, Ryan; Koshel, John; Muschaweck, Julius
2013-09-01
Optical design requires accurate characterization of light sources for computer aided design (CAD) software. Various methods have been used to model sources, from accurate physical models to measurement of light output. It has become common practice for designers to include measured source data for design simulations. Typically, a measured source will contain rays which sample the output distribution of the source. The ray data must then be exported to various formats suitable for import into optical analysis or design software. Source manufacturers are also making measurements of their products and supplying CAD models along with ray data sets for designers. The increasing availability of data has been beneficial to the design community but has caused a large expansion in storage needs for the source manufacturers since each software program uses a unique format to describe the source distribution. In 2012, the Illuminating Engineering Society (IES) formed a working group to understand the data requirements for ray data and recommend a standard file format. The working group included representatives from software companies supplying the analysis and design tools, source measurement companies providing metrology, source manufacturers creating the data and users from the design community. Within one year the working group proposed a file format which was recently approved by the IES for publication as TM-25. This paper will discuss the process used to define the proposed format, highlight some of the significant decisions leading to the format and list the data to be included in the first version of the standard.
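TM-25 itself defines the exact fields and layout of the standardized file; purely as a generic illustration of what a measured ray data set contains (sampled rays, each with a start position, a direction, and the flux it carries), here is a minimal Python sketch using hypothetical field names that are not taken from the standard:

```python
from dataclasses import dataclass

@dataclass
class Ray:
    """One sampled ray from a measured source (illustrative fields only,
    not the TM-25 layout)."""
    x: float      # start position, mm
    y: float
    z: float
    dx: float     # unit direction vector components
    dy: float
    dz: float
    flux: float   # radiant flux carried by this ray, W

# Two hypothetical rays sampling a source's output distribution
rays = [
    Ray(0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1e-6),
    Ray(0.1, 0.0, 0.0, 0.05, 0.0, 0.9987, 1e-6),
]
total_flux = sum(r.flux for r in rays)
print(f"{len(rays)} rays, total flux {total_flux:.2e} W")
```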
The Assistant for Specifying the Quality Software (ASQS) Operational Concept Document. Volume 1
1990-09-01
Assistant in which the manager supplies system-specific characteristics and needs and the Assistant fills in the software quality concepts and methods. The ... member(s) of the Computer Resources Working Group (CRWG) to aid in performing a software quality engineering study. Figure 3.4-1 outlines the ... need to recover from faults more likely than need to provide alternative functions or interfaces), and more on Autonomy
First year of ALMA site software deployment: where everything comes together
NASA Astrophysics Data System (ADS)
González, Víctor; Mora, Matias; Araya, Rodrigo; Arredondo, Diego; Bartsch, Marcelo; Burgos, Pablo; Ibsen, Jorge; Reveco, Johnny; Sáez, Norman; Schemrl, Anton; Sepulveda, Jorge; Shen, Tzu-Chiang; Soto, Rubén; Troncoso, Nicolás; Zambrano, Mauricio; Barriga, Nicolás; Glendenning, Brian; Raffi, Gianni; Kern, Jeff
2010-07-01
Starting in 2009, the ALMA project initiated one of its most exciting phases within construction: the first antenna from one of the vendors was delivered to the Assembly, Integration and Verification team. With this milestone and the closure of the ALMA Test Facility in New Mexico, the JAO Computing Group in Chile found itself in the front line of the project's software deployment and integration effort. Among the group's main responsibilities are the deployment, configuration and support of the observation systems, in addition to infrastructure administration, all of which needs to be done in close coordination with the development groups in Europe, North America and Japan. Software support has been the primary interaction key with the current users (mainly scientists, operators and hardware engineers), as the software is normally the most visible part of the system. During this first year of work with the production hardware, three consecutive software releases have been deployed and commissioned. Also, the first three antennas have been moved to the Array Operations Site, at 5,000 meters elevation, and the complete end-to-end system has been successfully tested. This paper shares the experience of this 15-person group as part of the construction team at the ALMA site, working together with the Computing IPT, describing the achievements and problems overcome during this period. It explores the excellent results of teamwork, and also some of the troubles that such a complex and geographically distributed project can run into. Finally, it approaches the challenges still to come, with the transition to the ALMA operations plan.
PBL-SEE: An Authentic Assessment Model for PBL-Based Software Engineering Education
ERIC Educational Resources Information Center
dos Santos, Simone C.
2017-01-01
The problem-based learning (PBL) approach has been successfully applied to teaching software engineering thanks to its principles of group work, learning by solving real problems, and learning environments that match the market realities. However, the lack of well-defined methodologies and processes for implementing the PBL approach represents a…
Vahle-Hinz, K; Rybczynski, A; Jakstat, H; Ahlers, M O
2009-01-01
Condylar position analysis facilitates a quantitative comparison of the condylar position with and without a bite record, different records and changed influencing factors. Handling by the examiner when positioning the model is a significant factor with regard to the accuracy of the examination. Measurement accuracy could be improved when positioning the models by using special working bites, hence the objective of the experiments described in this study consisted in examining the extent to which the measuring results are influenced by different examiners and by using working bites. In the first trial, one examiner performed ten measurements without and with an interposed working bite for five model pairs in each case. In the second trial, nine examiners (three specialized dentists, three dental assistants, three students) performed ten measurements in each case without and with an interposed working bite. The three-dimensional position was read digitally with the E-CPM (Gamma Dental, Klosterneuburg/Vienna, Austria), recorded by means of spreadsheet software (Microsoft Excel) and diagnostic software (CMDfact, CMD3D module, dentaConcept, Hamburg), and evaluated with graphing software (Sigma Plot, Systat Software, USA). In the first trial, it was shown that the reproducibility of mounting was improved markedly (p <0.01) by using bite records in the form of working bites. In the second trial, it was shown that the mean error increased significantly (p <0.01) when several examiners performed the measurements compared with the results of one examiner alone. No significantly different results occurred (p < 0.01) in the comparison of the different groups of examiners with different educational and training backgrounds. This applied for the mounting methods without and with working bite. On the other hand, the reproducibility of mounting improved distinctly (p<0.01) in every group of examiners when working bites were used. Reproducibility of condylar position analysis was improved significantly by mounting the models with special working bites. This applied for operators of different professional background (dentists, dental assistants and dental students), while there were no significant differences between results of the three groups.
A process improvement model for software verification and validation
NASA Technical Reports Server (NTRS)
Callahan, John; Sabolish, George
1994-01-01
We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.
NASA Technical Reports Server (NTRS)
2003-01-01
bd Systems personnel accomplished the technical responsibilities for this reporting period, as planned. A close working relationship was maintained with personnel of the MSFC Avionics Department Software Group (ED 14), the MSFC EXPRESS Project Office (FD31), and the Huntsville Boeing Company. Work accomplishments included the support of SRB activities, ATB activities, ESCP activities, participating in technical meetings, coordinating issues between the Boeing Company and the MSFC Project Office, and performing special tasks as requested.
The Software Distribution for Gemini Observatory's Science Operations Group
NASA Astrophysics Data System (ADS)
Hoenig, M. D.; Clarke, M.; Pohlen, M.; Hirst, P.
2014-05-01
Gemini Observatory consists of two telescopes in different hemispheres. It also operates mostly on a queue observing model, meaning observations are performed by staff working shifts as opposed to PIs. For these two reasons alone, maintaining and distributing a diverse software suite is not a trivial matter. We present a way to make the appropriate tools available to staff at Gemini North and South, whether they are working on the summit or from our base facility offices in Hilo, Hawai'i and La Serena, Chile.
Software design for analysis of multichannel intracardial and body surface electrocardiograms.
Potse, Mark; Linnenbank, André C; Grimbergen, Cornelis A
2002-11-01
Analysis of multichannel ECG recordings (body surface maps (BSMs) and intracardial maps) requires special software. We created a software package and a user interface on top of a commercial data analysis package (MATLAB) by a combination of high-level and low-level programming. Our software was created to satisfy the needs of a diverse group of researchers. It can handle a large variety of recording configurations. It allows for interactive usage through a fast and robust user interface, and batch processing for the analysis of large amounts of data. The package is user-extensible, includes routines for both common and experimental data processing tasks, and works on several computer platforms. The source code is made intelligible using software for structured documentation and is available to the users. The package is currently used by more than ten research groups analysing ECG data worldwide.
Using Modern Methodologies with Maintenance Software
NASA Technical Reports Server (NTRS)
Streiffert, Barbara A.; Francis, Laurie K.; Smith, Benjamin D.
2014-01-01
Jet Propulsion Laboratory uses multi-mission software produced by the Mission Planning and Sequencing (MPS) team to process, simulate, translate, and package the commands that are sent to a spacecraft. MPS works under the auspices of the Multi-Mission Ground Systems and Services (MGSS). This software consists of nineteen applications that are in maintenance. The MPS software is classified as either class B (mission critical) or class C (mission important). The scheduling of tasks is difficult because mission needs must be addressed prior to performing any other tasks, and those needs often spring up unexpectedly. Keeping track of the tasks that everyone is working on is also difficult because each person is working on a different software component. Recently the group adopted the Scrum methodology for planning and scheduling tasks. Scrum is one of the newer methodologies typically used in agile development. In the Scrum development environment, teams pick the tasks that are to be completed within a sprint based on priority. The team specifies the sprint length, usually a month or less. Scrum is typically used for new development of one application. In the Scrum methodology there is a scrum master, a facilitator who tries to make sure that everything moves smoothly; a product owner, who represents the user(s) of the software; and the team. MPS is not the traditional environment for the Scrum methodology. MPS has many software applications in maintenance, team members who are working on disparate applications, and many users, and its work is interruptible based on mission needs, issues and requirements. In order to use Scrum, the methodology needed to be adapted to MPS. Scrum was chosen because it is adaptable. This paper is about the development of the process for using Scrum, a new development methodology, with a team that works on disparate interruptible tasks on multiple software applications.
NASA Astrophysics Data System (ADS)
Arif Shah, Muhammad; Hashim, Rathiah; Shah, Adil Ali; Farooq Khattak, Umar
2016-11-01
Developing software through Global Software Development (GSD) has become very common nowadays in the software industry. Pakistan is one of the countries where projects are taken on and developed for clients from different countries, including Afghanistan. The purpose of this paper is to identify and analyze several communication barriers that can have a negative impact on a project, and to provide management guidelines for medium-size software organizations working in Pakistan with clients from Afghanistan, so that they can overcome the communication barriers and challenges they face when coordinating with clients. Initially we performed a literature review to identify different communication barriers and to check whether any standardized communication-management guidelines for medium-size software houses had been provided in the past. The second stage of the research develops guidelines from the vendor's perspective, based on interviews and focus group discussions with different stakeholders and employees of software houses with clients from Afghanistan. Based on those interviews and discussions we established communication management guidelines in order to overcome the communication problems and barriers encountered when working with clients from Afghanistan. As a result of the literature review, we identified that barriers such as cultural and language differences were among the main reasons behind project failure, and we suggest that software organizations working in Pakistan should follow certain defined communication guidelines in order to overcome communication barriers that affect the project directly.
HEP Software Foundation Community White Paper Working Group - Data Analysis and Interpretation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bauerdick, Lothar
At the heart of experimental high energy physics (HEP) is the development of facilities and instrumentation that provide sensitivity to new phenomena. Our understanding of nature at its most fundamental level is advanced through the analysis and interpretation of data from sophisticated detectors in HEP experiments. The goal of data analysis systems is to realize the maximum possible scientific potential of the data within the constraints of computing and human resources in the least time. To achieve this goal, future analysis systems should empower physicists to access the data with a high level of interactivity, reproducibility and throughput capability. As part of the HEP Software Foundation Community White Paper process, a working group on Data Analysis and Interpretation was formed to assess the challenges and opportunities in HEP data analysis and develop a roadmap for activities in this area over the next decade. In this report, the key findings and recommendations of the Data Analysis and Interpretation Working Group are presented.
Evaluation of open source data mining software packages
Bonnie Ruefenacht; Greg Liknes; Andrew J. Lister; Haans Fisk; Dan Wendt
2009-01-01
Since 2001, the USDA Forest Service (USFS) has used classification and regression-tree technology to map USFS Forest Inventory and Analysis (FIA) biomass, forest type, forest type groups, and National Forest vegetation. This prior work used Cubist/See5 software for the analyses. The objective of this project, sponsored by the Remote Sensing Steering Committee (RSSC),...
Reference datasets for bioequivalence trials in a two-group parallel design.
Fuglsang, Anders; Schütz, Helmut; Labes, Detlew
2015-03-01
In order to help companies qualify and validate the software used to evaluate bioequivalence trials with two parallel treatment groups, this work aims to define datasets with known results. This paper puts a total of 11 datasets into the public domain, along with a proposed consensus obtained via evaluations from six different software packages (R, SAS, WinNonlin, OpenOffice Calc, Kinetica, EquivTest). Insofar as possible, datasets were evaluated with and without the assumption of equal variances for the construction of a 90% confidence interval. Not all software packages provide functionality for the assumption of unequal variances (EquivTest, Kinetica), and not all packages can handle datasets with more than 1000 subjects per group (WinNonlin). Where results could be obtained across all packages, one showed questionable results when datasets contained unequal group sizes (Kinetica). A proposal is made for the results that should be used as validation targets.
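As a rough illustration of the kind of computation such reference datasets are meant to validate, the following sketch builds a 90% confidence interval for the test/reference ratio of geometric means in a two-group parallel design, with and without the equal-variance assumption. The data values and function name are hypothetical and not taken from the paper or from any of the six packages evaluated; the t quantile comes from SciPy.

# Minimal sketch of a two-group parallel-design bioequivalence evaluation
# (hypothetical data, not the reference datasets from the paper).
# The 90% CI for the test/reference ratio is built on log-transformed data,
# using either a pooled-variance t interval or a Welch (unequal-variance) one.
import math
from statistics import mean, variance
from scipy import stats

def ci_ratio(test, ref, equal_var=True, level=0.90):
    lt, lr = [math.log(x) for x in test], [math.log(x) for x in ref]
    diff = mean(lt) - mean(lr)
    vt, vr, nt, nr = variance(lt), variance(lr), len(lt), len(lr)
    if equal_var:
        sp2 = ((nt - 1) * vt + (nr - 1) * vr) / (nt + nr - 2)  # pooled variance
        se = math.sqrt(sp2 * (1 / nt + 1 / nr))
        df = nt + nr - 2
    else:  # Welch-Satterthwaite degrees of freedom
        se = math.sqrt(vt / nt + vr / nr)
        df = (vt / nt + vr / nr) ** 2 / (
            (vt / nt) ** 2 / (nt - 1) + (vr / nr) ** 2 / (nr - 1))
    t = stats.t.ppf(1 - (1 - level) / 2, df)
    return math.exp(diff - t * se), math.exp(diff + t * se)

if __name__ == "__main__":
    test = [95.2, 110.4, 88.7, 102.3, 97.5, 105.1]  # made-up AUC values
    ref = [100.1, 99.8, 92.4, 108.0, 103.7, 96.2]
    print("pooled:", ci_ratio(test, ref, equal_var=True))
    print("Welch: ", ci_ratio(test, ref, equal_var=False))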
Implementing large projects in software engineering courses
NASA Astrophysics Data System (ADS)
Coppit, David
2006-03-01
In software engineering education, large projects are widely recognized as a useful way of exposing students to the real-world difficulties of team software development. But large projects are difficult to put into practice. First, educators rarely have additional time to manage software projects. Second, classrooms have inherent limitations that threaten the realism of large projects. Third, quantitative evaluation of individuals who work in groups is notoriously difficult. As a result, many software engineering courses compromise the project experience by reducing the team sizes, project scope, and risk. In this paper, we present an approach to teaching a one-semester software engineering course in which 20 to 30 students work together to construct a moderately sized (15KLOC) software system. The approach combines carefully coordinated lectures and homeworks, a hierarchical project management structure, modern communication technologies, and a web-based project tracking and individual assessment system. Our approach provides a more realistic project experience for the students, without incurring significant additional overhead for the instructor. We present our experiences using the approach over the last 2 years for the software engineering course at The College of William and Mary. Although the approach has some weaknesses, we believe that they are strongly outweighed by the pedagogical benefits.
Job satisfaction, job stress and psychosomatic health problems in software professionals in India
Madhura, Sahukar; Subramanya, Pailoor; Balaram, Pradhan
2014-01-01
This questionnaire-based study investigates the correlation between job satisfaction, job stress and psychosomatic health in Indian software professionals. It also examines how yoga-practicing Indian software professionals cope with stress and psychosomatic health problems. The sample consisted of yoga-practicing and non-yoga-practicing Indian software professionals working in India. The findings of this study have shown that there is a significant correlation among job satisfaction, job stress and health. In yoga practitioners, job satisfaction is not significantly related to psychosomatic health, whereas in the non-yoga group psychosomatic health symptoms showed a significant relationship with job satisfaction. PMID:25598623
Student project of optical system analysis API-library development
NASA Astrophysics Data System (ADS)
Ivanova, Tatiana; Zhukova, Tatiana; Dantcaranov, Ruslan; Romanova, Maria; Zhadin, Alexander; Ivanov, Vyacheslav; Kalinkina, Olga
2017-08-01
In the paper, an API library developed by students of the Applied and Computer Optics Department (ITMO University) for optical system design is presented. The library performs paraxial and real ray tracing, calculates third-order (Seidel) aberrations and real-ray aberrations of axial and off-axis beams (wave, lateral, longitudinal, coma, distortion, etc.) and, finally, approximates the wave aberration by Zernike polynomials. The real aperture can be calculated by detecting real-ray tracing failures on each surface. So far we assume the optical system is centered, with spherical or second-order aspherical surfaces. Optical glasses can be specified directly by refractive index or by dispersion coefficients. The library can be used for educational or research purposes in the optical system design area. It provides ready-to-use software functions for optical system simulation and analysis that developers can simply plug into their own software for different purposes, for example for specific synthesis tasks or for the investigation of new optimization modes. In the paper we present an example of using the library to develop cemented-doublet synthesis software based on Slusarev's methodology. The library is used in a course on optical system optimization for in-depth study of the optimization model and its application to optical system design. Developing such software is an excellent experience for students and helps them understand optical image modeling and quality analysis. The development is organized as a joint student group project, run like a real research and development project, so that each student has a role in the project and then uses the full library functionality in his or her own master's or bachelor's thesis. Working in such a group gives students useful experience and the opportunity to work as research and development engineers of scientific software in the future.
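As a rough illustration of the paraxial ray tracing such a library performs, the sketch below propagates a paraxial ray through a sequence of spherical surfaces using the standard refraction and transfer equations and estimates an effective focal length. All class and function names and the lens prescription are hypothetical; this is not the students' actual API.

# Minimal paraxial ray-trace sketch (hypothetical, not the ITMO library API).
# Each surface: radius of curvature R, index after the surface, distance to next.
# Paraxial refraction:  n2*u2 = n1*u1 - y*(n2 - n1)/R
# Paraxial transfer:    y_next = y + t*u2
from dataclasses import dataclass

@dataclass
class Surface:
    R: float        # radius of curvature (float("inf") for a plane)
    n_after: float  # refractive index after the surface
    t_after: float  # distance to the next surface

def trace_paraxial(surfaces, y0, u0, n0=1.0):
    """Trace a paraxial ray (height y, angle u) through the system."""
    y, u, n = y0, u0, n0
    for s in surfaces:
        power = (s.n_after - n) / s.R if s.R != float("inf") else 0.0
        u = (n * u - y * power) / s.n_after  # refraction at the surface
        y = y + s.t_after * u                # transfer to the next surface
        n = s.n_after
    return y, u

if __name__ == "__main__":
    # A single biconvex lens in air (made-up prescription).
    lens = [Surface(R=50.0, n_after=1.5168, t_after=5.0),
            Surface(R=-50.0, n_after=1.0, t_after=0.0)]
    y, u = trace_paraxial(lens, y0=10.0, u0=0.0)  # ray parallel to the axis
    efl = -10.0 / u                               # effective focal length
    print(f"exit height {y:.3f} mm, exit angle {u:.5f}, EFL ~ {efl:.2f} mm")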
Software tool for physics chart checks.
Li, H Harold; Wu, Yu; Yang, Deshan; Mutic, Sasa
2014-01-01
Physics chart check has long been a central quality assurance (QA) measure in radiation oncology. The purpose of this work is to describe a software tool that aims to accomplish simplification, standardization, automation, and forced functions in the process. Nationally recognized guidelines, including American College of Radiology and American Society for Radiation Oncology guidelines and technical standards, and the American Association of Physicists in Medicine Task Group reports were identified, studied, and summarized. Meanwhile, the reported events related to the physics chart check service were analyzed using an event reporting and learning system. A number of shortfalls in the chart check process were identified. To address these problems, a software tool was designed and developed under Microsoft .NET in C# to hardwire as many components as possible at each stage of the process. The software consists of the following 4 independent modules: (1) chart check management; (2) pretreatment and during-treatment chart check assistant; (3) posttreatment chart check assistant; and (4) quarterly peer-review management. The users were a large group of physicists in the author's radiation oncology clinic. During more than 1 year of use, the tool has proven very helpful in chart check management, communication, documentation, and maintaining consistency. The software tool presented in this work aims to assist physicists at each stage of the physics chart check process. The software tool is potentially useful for any radiation oncology clinic that is either pursuing or maintaining American College of Radiology accreditation.
Physically Based Rendering in the Nightshade NG Visualization Platform
NASA Astrophysics Data System (ADS)
Berglund, Karrie; Larey-Williams, Trystan; Spearman, Rob; Bogard, Arthur
2015-01-01
This poster describes our work on creating a physically based rendering model in Nightshade NG planetarium simulation and visualization software (project website: NightshadeSoftware.org). We discuss techniques used for rendering realistic scenes in the universe and dealing with astronomical distances in real time on consumer hardware. We also discuss some of the challenges of rewriting the software from scratch, a project which began in 2011. Nightshade NG can be a powerful tool for sharing data and visualizations. The desktop version of the software is free for anyone to download, use, and modify; it runs on Windows and Linux (and eventually Mac). If you are looking to disseminate your data or models, please stop by to discuss how we can work together. Nightshade software is used in literally hundreds of digital planetarium systems worldwide. Countless teachers and astronomy education groups run the software on flat screens. This wide use makes Nightshade an effective tool for dissemination to educators and the public. Nightshade NG is an especially powerful visualization tool when projected on a dome. We invite everyone to enter our inflatable dome in the exhibit hall to see this software in a 3D environment.
Current Practice in Software Development for Computational Neuroscience and How to Improve It
Gewaltig, Marc-Oliver; Cannon, Robert
2014-01-01
Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research. PMID:24465191
CAD/CAM approach to improving industry productivity gathers momentum
NASA Technical Reports Server (NTRS)
Fulton, R. E.
1982-01-01
Recent results and planning for the NASA/industry Integrated Programs for Aerospace-Vehicle Design (IPAD) program for improving productivity with CAD/CAM methods are outlined. The industrial group work is being done mainly by Boeing, and progress has been made in defining the designer work environment, developing requirements and a preliminary design for a future CAD/CAM system, and developing CAD/CAM technology. The work environment was defined by conducting a detailed study of a reference design process, and key software elements for a CAD/CAM system have been defined, specifically for interactive design or experiment control processes. Further work is proceeding on executive, data management, geometry and graphics, and general utility software, and dynamic aspects of the programs being developed are outlined.
JPL Facilities and Software for Collaborative Design: 1994 - Present
NASA Technical Reports Server (NTRS)
DeFlorio, Paul A.
2004-01-01
The viewgraph presentation provides an overview of the history of the JPL Project Design Center (PDC) and, since 2000, the Center for Space Mission Architecture and Design (CSMAD). The discussion includes PDC objectives and scope; mission design metrics; distributed design; a software architecture timeline; facility design principles; optimized design for group work; CSMAD plan view, facility design, and infrastructure; and distributed collaboration tools.
NASA's Software Safety Standard
NASA Technical Reports Server (NTRS)
Ramsay, Christopher M.
2005-01-01
NASA (National Aeronautics and Space Administration) relies more and more on software to control, monitor, and verify its safety critical systems, facilities and operations. Since the 1960's there has hardly been a spacecraft (manned or unmanned) launched that did not have a computer on board that provided vital command and control services. Despite this growing dependence on software control and monitoring, there has been no consistent application of software safety practices and methodology to NASA's projects with safety critical software. Led by the NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard (NASA-STD-8719.13B) has recently undergone a significant update in an attempt to provide that consistency. This paper will discuss the key features of the new NASA Software Safety Standard. It will start with a brief history of the use and development of software in safety critical applications at NASA. It will then give a brief overview of the NASA Software Working Group and the approach it took to revise the software engineering process across the Agency.
NASA Technical Reports Server (NTRS)
1976-01-01
Only a few efforts are currently underway to develop an adequate technology base for the various themes. Particular attention must be given to software commonality and evolutionary capability, to increased system integrity and autonomy, and to improved communications among the program users, the program developers, and the programs themselves. There is a need for quantum improvement in software development methods and for increasing the awareness of software by all concerned. Major thrusts identified include: (1) data and systems management; (2) software technology for autonomous systems; (3) technology and methods for improving the software development process; (4) advances related to systems of software elements including their architecture, their attributes as systems, and their interfaces with users and other systems; and (5) applications of software including both the basic algorithms used in a number of applications and the software specific to a particular theme or discipline area. The impact of each theme on software is assessed.
Interweaving Objects, Gestures, and Talk in Context
ERIC Educational Resources Information Center
Brassac, Christian; Fixmer, Pierre; Mondada, Lorenza; Vinck, Dominique
2008-01-01
In a large French hospital, a group of professional experts (including physicians and software engineers) are working on the computerization of a blood-transfusion traceability device. By focusing on a particular moment in this slow process of design, we analyze their collaborative practices during a work session. The analysis takes a…
The TSO Logic and G2 Software Product
NASA Technical Reports Server (NTRS)
Davis, Derrick D.
2014-01-01
This internship assignment for spring 2014 was at John F. Kennedy Space Center (KSC), in NASA's Engineering and Technology (NE) group, in support of the Control and Data Systems Division (NE-C) within the Systems Hardware Engineering Branch (NE-C4). The primary focus was on system integration and benchmarking utilizing two separate computer software products. The first half of this 2014 internship was spent assisting NE-C4's Electronics and Embedded Systems Engineer, Kelvin Ruiz, and fellow intern Scott Ditto with the evaluation of a new piece of software called G2. It is developed by the Gensym Corporation and was introduced to the group as a tool for monitoring launch environments. All fellow interns and employees of the G2 group have been working together in order to better understand the significance of the G2 application and how KSC can benefit from its capabilities. The second stage of this spring project was to assist with the ongoing integration of a benchmarking tool developed by a group of engineers from a Canadian-based organization known as TSO Logic. Guided by NE-C4's Computer Engineer, Allen Villorin, the NASA 2014 interns put forth great effort in helping to integrate TSO's software into the Spaceport Processing Systems Development Laboratory (SPSDL) for further testing and evaluation. The TSO Logic group claims that their software, designed for monitoring and reducing energy consumption at in-house server farms and large data centers, allows data centers to control the power state of servers without impacting availability or performance and without changes to infrastructure, and the focus of the assignment was to test this claim. TSO's founder and CEO, Aaron Rallo, and CTO, Chris Tivel, both came to KSC to assist with the installation of their software in the SPSDL laboratory. TSO's software was installed onto 24 individual workstations running three different operating systems. The workstations were divided into three groups of 8, with each group having its own operating system: the first group ran Ubuntu, a Debian-based Linux; the second group ran Windows 7 Professional; and the third group ran Red Hat Linux. The highlight of this portion of the assignment was to compose documentation expressing the overall impression of the software and its capabilities.
NASA's Intelligent Robotics Group
2017-01-06
Shareable video highlighting the Intelligent Robotics Group's 25 years of experience developing tools to allow humans and robots to work as teammates. It highlights the VERVE software, which allows researchers to see a 3D representation of the robot's world, and mentions how Nissan is using a version of VERVE in its autonomous vehicle research.
Supporting Executive Functions during Children's Preliteracy Learning with the Computer
ERIC Educational Resources Information Center
Van de Sande, E.; Segers, E.; Verhoeven, L.
2016-01-01
The present study examined how embedded activities to support executive functions helped children to benefit from a computer intervention that targeted preliteracy skills. Three intervention groups were compared on their preliteracy gains in a randomized controlled trial design: an experimental group that worked with software to stimulate early…
Fourth Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE4)
NASA Astrophysics Data System (ADS)
Katz, Daniel S.; Niemeyer, Kyle E.; Gesing, Sandra; Hwang, Lorraine; Bangerth, Wolfgang; Hettrick, Simon; Idaszak, Ray; Salac, Jean; Hong, Neil Chue; Núñez-Corrales, Santiago; Allen, Alice; Geiger, R. Stuart; Miller, Jonah; Chen, Emily; Dubey, Anshu; Lago, Patricia
This article summarizes motivations, organization, and activities of the Fourth Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE4). The WSSSPE series promotes sustainable research software by positively impacting principles and best practices, careers, learning, and credit. This article discusses the code of conduct; the mission and vision statements that were drafted at the workshop and finalized shortly after it; the keynote and idea papers, position papers, experience papers, demos, and lightning talks presented during the workshop; and a panel discussion on best practices. The main part of the article discusses the set of working groups that formed during the meeting, along with contact information for readers who may want to join a group. Finally, it discusses a survey of the workshop attendees.
Present status and future of the sophisticated work station
NASA Astrophysics Data System (ADS)
Ishida, Haruhisa
The advantages of the workstation are explained by comparing the functions of its software and hardware with those of the personal computer. As one example of utilizing the functions of a workstation, desktop publishing is explained. The future of UNIX as a workstation OS is predicted by describing the competition between the AT&T/Sun Microsystems group, which intends to take the leadership by integrating the currently most popular Berkeley version with the System V version, and the group led by IBM. The development of RISC processors, the TRON Plan, and MITI's Sigma Project are also mentioned as background.
UTChem - A Program for Ab Initio Quantum Chemistry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yanai, Takeshi; Nakano, Haruyuki; Nakajima, Takahito
2003-06-18
UTChem is a quantum chemistry software package developed by Hirao's group at the University of Tokyo. UTChem is a research product of our work to develop new and better theoretical methods in quantum chemistry.
Characterization of Morphology using MAMA Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gravelle, Julie
The MAMA (Morphological Analysis for Material Attribution) software was developed at the Los Alamos National Laboratory, funded through the National Technical Nuclear Forensics Center in the Department of Homeland Security. The software allows images to be analysed and quantified. The largest project I worked on was to quantify images of plutonium oxides and ammonium diuranates prepared by the group with the software and provide analyses on the particles of each sample. Images were quantified through MAMA, with a color analysis, a lexicon description and powder x-ray diffraction. Through this we were able to visually see a difference between some of the syntheses. An additional project was to revise the manual for MAMA to help streamline training and provide useful tips to users to more quickly become acclimated to using the software. The third project investigated expanding the scope of MAMA and finding a statistically relevant baseline for the particulates through the analysis of maps in the software and using known measurements to compare the error associated with the software. During this internship, I worked on several different projects dealing with the MAMA software. The revision of the user manual for the MAMA software was the first project I was able to work and collaborate on. I first learned how to use the software by getting instruction from a skilled user at the laboratory, Dan Schwartz, and by using the existing user manual and examples. After becoming accustomed to the program, I started to go over the manual to correct and change items that were not as useful or descriptive as they could have been. I also added in tips that I learned as I explored the software. The updated manual was also worked on by several others who have been developing the program. The goal of these revisions was to ensure the most concise and simple directions to the software were available to future users. By incorporating tricks and shortcuts that I discovered and picked up from watching other users into the user guide, I believe that anyone who utilizes the software will be able to quickly understand the best way to analyze their image and use the tools the program offers to achieve useful results.
Software engineering and automatic continuous verification of scientific software
NASA Astrophysics Data System (ADS)
Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.
2011-12-01
Software engineering of scientific code is challenging for a number of reasons, including pressure to publish and a lack of awareness of the pitfalls of software engineering by scientists. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employ best practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and is coupled to a number of other codes including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code and has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people, for which we use Bazaar for revision control, making good use of the strong branching and merging capabilities. Using features of Canonical's Launchpad platform, such as code review, blueprints for designing features and bug reporting, gives the group, partners and other Fluidity users an easy-to-use platform to collaborate and allows the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK National HPC services. The testing of code in this manner leads to a continuous verification process, not a discrete event performed once development has ceased. Much of the code verification is done via the "gold standard" of comparisons to analytical solutions via the method of manufactured solutions. By developing and verifying code in tandem we avoid a number of pitfalls in scientific software development and advocate similar procedures for other scientific code applications.
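As a minimal illustration of the "gold standard" style of verification against an analytical solution described above, the sketch below checks that a forward Euler integration of u' = -k*u converges at the expected first order. This is a generic example under assumed names, not part of Fluidity's actual test suite.

# Minimal sketch of automated verification against an analytical solution
# (not Fluidity's test harness; the problem and tolerances are invented).
import math

def euler_decay(k, u0, t_end, n_steps):
    """Forward Euler solution of u' = -k*u at t_end."""
    dt = t_end / n_steps
    u = u0
    for _ in range(n_steps):
        u -= dt * k * u
    return u

def error(n_steps, k=2.0, u0=1.0, t_end=1.0):
    exact = u0 * math.exp(-k * t_end)
    return abs(euler_decay(k, u0, t_end, n_steps) - exact)

def test_first_order_convergence():
    # Halving the step size should roughly halve the error (order ~1).
    e_coarse, e_fine = error(100), error(200)
    order = math.log(e_coarse / e_fine) / math.log(2.0)
    assert 0.9 < order < 1.1, f"unexpected convergence order {order:.2f}"

if __name__ == "__main__":
    test_first_order_convergence()
    print("convergence test passed")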
Strategy for a DOD Software Initiative. Volume 2. Appendices
1983-10-01
Whole earth modeling: developing and disseminating scientific software for computational geophysics.
NASA Astrophysics Data System (ADS)
Kellogg, L. H.
2016-12-01
Historically, a great deal of specialized scientific software for modeling and data analysis has been developed by individual researchers or small groups of scientists working on their own specific research problems. As the magnitude of available data and computer power has increased, so has the complexity of scientific problems addressed by computational methods, creating both a need to sustain existing scientific software and to expand its development to take advantage of new algorithms, new software approaches, and new computational hardware. To that end, communities like the Computational Infrastructure for Geodynamics (CIG) have been established to support the use of best practices in scientific computing for solid earth geophysics research and teaching. Working as a scientific community enables computational geophysicists to take advantage of technological developments, improve the accuracy and performance of software, build on prior software development, and collaborate more readily. The CIG community, and others, have adopted an open-source development model, in which code is developed and disseminated by the community in an open fashion, using version control and software repositories like Git. One emerging issue is how to adequately identify and credit the intellectual contributions involved in creating open source scientific software. The traditional method of disseminating scientific ideas, peer-reviewed publication, was not designed for reviewing or crediting scientific software, although emerging publication strategies such as software journals are attempting to address the need. We are piloting an integrated approach in which authors are identified and credited as scientific software is developed and run. Successful software citation requires integration with the scholarly publication and indexing mechanisms as well, to assign credit, ensure discoverability, and provide provenance for software.
C-C1-04: Building a Health Services Information Technology Research Environment
Gehrum, David W; Jones, JB; Romania, Gregory J; Young, David L; Lerch, Virginia R; Bruce, Christa A; Donkochik, Diane; Stewart, Walter F
2010-01-01
Background: The electronic health record (EHR) has opened a new era for health services research (HSR) where information technology (IT) is used to re-engineer care processes. While the EHR provides one means of advancing novel solutions, a promising strategy is to develop tools (e.g., online questionnaires, visual display tools, decision support) distinct from, but which interact with, the EHR. Development of such software tools outside the EHR offers an advantage in flexibility, sophistication, and ultimately in portability to other settings. However, institutional IT departments have an imperative to protect patient data and to standardize IT processes to ensure system-level security and support traditional business needs. Such imperatives usually present formidable process barriers to testing novel software solutions. We describe how, in collaboration with our IT department, we are creating an environment and a process that allows for routine and rapid testing of novel software solutions. Methods: We convened a working group consisting of IT and research personnel with expertise in information security, database design/management, web design, EHR programming, and health services research. The working group was tasked with developing a research IT environment to accomplish two objectives: maintain network/data security and regulatory compliance; allow researchers working with external vendors to rapidly prototype and, in a clinical setting, test web-based tools. Results: Two parallel solutions, one focused on hardware, the second on oversight and management, were developed. First, we concluded that three separate, staged development environments were required to allow external vendor access for testing software and for transitioning software to be used in a clinic. In parallel, the extant oversight process for approving/managing access to internal/external personnel had to be altered to reflect the scope and scale of discrete research projects, as opposed to an enterprise-level approach to IT management. Conclusions: Innovation in health services software development requires a flexible, scalable IT environment adapted to the unique objectives of a HSR software development model. In our experience, implementing the hardware solution is less challenging than the cultural change required to implement such a model and the modifications to administrative and oversight processes to sustain an environment for rapid product development and testing.
Evaluation of Open-Source Hard Real Time Software Packages
NASA Technical Reports Server (NTRS)
Mattei, Nicholas S.
2004-01-01
Reliable software is, at times, hard to find. No piece of software can be guaranteed to work in every situation that may arise during its use here at Glenn Research Center or in space. The job of the Software Assurance (SA) group in the Risk Management Office is to rigorously test the software in an effort to ensure it matches the contract specifications. In some cases the SA team also researches new alternatives for selected software packages. This testing and research is an integral part of the department of Safety and Mission Assurance. Real time operation, in reference to a computer system, is a particular style of handling the timing and manner in which inputs and outputs are processed. A real time system executes these commands and appropriate processing within a defined timing constraint. Within this definition there are two classifications of real time systems: hard and soft. A soft real time system is one in which, if the particular timing constraints are not rigidly met, there will be no critical results. On the other hand, a hard real time system is one in which, if the timing constraints are not met, the results could be catastrophic. An example of a soft real time system is a DVD decoder. If the particular piece of data from the input is not decoded and displayed to the screen at exactly the correct moment, nothing critical will become of it; the user may not even notice it. However, a hard real time system is needed to control the timing of fuel injections or steering on the Space Shuttle; a delay of even a fraction of a second could be catastrophic in such a complex system. The current real time system employed by most NASA projects is Wind River's VxWorks operating system. This is a proprietary operating system that can be configured to work with many of NASA's needs and it provides very accurate and reliable hard real time performance. The downside is that since it is a proprietary operating system it is also costly to implement. The prospect of replacing this somewhat costly implementation is the focus of one of the SA group's current research projects. The explosion of open source software in the last ten years has led to the development of a multitude of software solutions which were once only produced by major corporations. The benefits of these open projects include faster release and bug patching cycles as well as inexpensive if not free software solutions. The main packages for hard real time solutions under Linux are Real Time Application Interface (RTAI) and two varieties of Real Time Linux (RTL), RTLFree and RTLPro. During my time here at NASA I have been testing various hard real time solutions operating as layers on the Linux operating system. All testing is being run on an Intel SBC 2590, which is a common embedded hardware platform. The test plan was provided to me by the Software Assurance group at the start of my internship, and my job has been to test the systems by developing and executing the test cases on the hardware. These tests are constructed so that the Software Assurance group can get hard test data for a comparison between the open source and proprietary implementations of hard real time solutions.
Epos Working Group 10 Infrastructure for Georesources
NASA Astrophysics Data System (ADS)
Orlecka-Sikora, Beata; Lasocki, Stanisław; Kwiatek, Grzegorz
2013-04-01
Working Group 10 "Infrastructure for Georesources" deals primarily with induced seismicity (IS) infrastructure. Established during the EPOS Annual Meeting in Utrecht, November 2011, WG10 aims to integrate the research infrastructure in the area of seismicity induced by human activity: tremors and rockbursts in underground mines, seismicity associated with conventional and unconventional oil and gas production, induced by geothermal energy extraction and by underground reposition and storage of liquids (e.g. water disposal associated with energy extraction) and gases (CO2 sequestration, inter alia) and triggered by filling surface water reservoirs, etc. Until now the research in the area of IS has been organized around induced technologies rather than physical problems, common for these shallow seismic processes. This has hampered the integration of IS research community and the research progress. WG10 intends to work out a first step towards changing the IS research perspective from the present, technology-oriented, to physical problems-oriented without, however, losing touch with technological conditions of IS generation. This will be achieved by the integration of IS Research Infrastructure (ISRI) and the creation of Induced Seismicity Node within EPOS. The ISRI to be integrated has three components: data, software and reports. The IS data consists of seismic data and auxiliary data: geological, displacement, geomechanical, geodetic, etc, and last, but by no means least, technological data. A research in the field of IS cannot do without this last data class. The IS software comprises common software tools for data handling and visualisation, standard and advanced software for research and software based on newly proposed algorithms for tests and development. The IS reports are both peer reviewed and unreviewed as well as an internet forum. In addition to that the IS Node will play a significant role in integrating IS community and accelerating research, it will help to develop a synergy between research community and industrial partners. WG10 is working out the strategic solutions for integration and core services provided by future IS node for the European and other research groups, industrial partners, educational centers, central and local administration bodies. Measurable benefit of the integrated ISRI will be the intensification of studies on hazard and risk associated with anthropogenic seismicity and on methods of anthropogenic seismic risk mitigation. Best practices will be disseminated to industrial partners and relevant bodies of public administration. It is also planned to have an information node for the public use.
E-Books Mediator: Nicholas Bogaty--Open Ebook Forum, New York
ERIC Educational Resources Information Center
Library Journal, 2004
2004-01-01
This article is about the work of Nick Bogaty, executive director of the Open eBook Forum. Nick Bogaty is not a librarian, but he plays nicely with them, along with publishers, hardware manufacturers, software producers, database vendors, and disability rights advocates. All are groups that share an interest in making e-books work for their…
Analysis of Multilayered Printed Circuit Boards using Computed Tomography
2014-05-01
complex PCBs that present a challenge for any testing or fault analysis. Set-to-work testing and fault analysis of any electronic circuit require...Electronic Warfare and Radar Division in December 2010. He is currently in the Electro-Optic Countermeasures Group. Samuel works on embedded system design...and software optimisation of complex electro-optical systems, including the set-to-work and characterisation of these systems. He has a Bachelor of
Weaver, Charlotte; O'Brien, Ann
2016-01-01
In 2014, a group of diverse informatics leaders from practice, academia, and the software industry formed to address how best to transform electronic documentation to provide knowledge at the point of care and to deliver value to front-line nurses and nurse leaders. This presentation reports the recommendations from this Working Group geared towards a 2020 framework. The recommendations propose redesign to optimize nurses' documentation efficiency while contributing to knowledge generation and attaining a balance that ensures the capture of nursing's impact on safety and quality, yet minimizes "death by data entry."
Collected software engineering papers, volume 9
NASA Technical Reports Server (NTRS)
1991-01-01
This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from November 1990 through October 1991. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. This is the ninth such volume of technical papers produced by the SEL. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. For the convenience of this presentation, the eight papers contained here are grouped into three major categories: (1) software models studies; (2) software measurement studies; and (3) Ada technology studies. The first category presents studies on reuse models, including a software reuse model applied to maintenance and a model for an organization to support software reuse. The second category includes experimental research methods and software measurement techniques. The third category presents object-oriented approaches using Ada and object-oriented features proposed for Ada. The SEL is actively working to understand and improve the software development process at GSFC.
Toxmatch-a new software tool to aid in the development and evaluation of chemically similar groups.
Patlewicz, G; Jeliazkova, N; Gallegos Saliner, A; Worth, A P
2008-01-01
Chemical similarity is a widely used concept in toxicology, and is based on the hypothesis that similar compounds should have similar biological activities. This forms the underlying basis for performing read-across, forming chemical groups and developing (Quantitative) Structure-Activity Relationships ((Q)SARs). Chemical similarity is often perceived as structural similarity, but in fact there are a number of other approaches that can be used to assess similarity. A systematic similarity analysis usually comprises two main steps. Firstly, the chemical structures to be compared need to be characterised in terms of relevant descriptors which encode their physicochemical, topological, geometrical and/or surface properties. A second step involves a quantitative comparison of those descriptors using similarity (or dissimilarity) indices. This work outlines the use of chemical similarity principles in the formation of endpoint-specific chemical groupings. Examples are provided to illustrate the development and evaluation of chemical groupings using a new software application called Toxmatch that was recently commissioned by the European Chemicals Bureau (ECB) of the European Commission's Joint Research Centre. Insights from using this software are highlighted with specific focus on the prospective application of chemical groupings under the new chemicals legislation, REACH.
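As a rough illustration of the quantitative comparison step described above, the sketch below scores similarity between binary structural fingerprints with a Tanimoto (Jaccard) index. The fingerprints and compound names are made up, and this generic example does not reproduce Toxmatch's actual descriptors or indices.

# Minimal sketch of descriptor-based similarity scoring (generic example,
# not Toxmatch's implementation; the fingerprints below are invented).
def tanimoto(fp_a, fp_b):
    """Tanimoto (Jaccard) similarity between two binary fingerprints."""
    a = {i for i, bit in enumerate(fp_a) if bit}
    b = {i for i, bit in enumerate(fp_b) if bit}
    union = a | b
    return len(a & b) / len(union) if union else 1.0

if __name__ == "__main__":
    # Hypothetical structural-key fingerprints for a query and two candidates.
    query = [1, 1, 0, 1, 0, 0, 1, 0]
    candidates = {"cmpd_A": [1, 1, 0, 1, 0, 1, 1, 0],
                  "cmpd_B": [0, 0, 1, 0, 1, 1, 0, 1]}
    for name, fp in candidates.items():
        print(name, round(tanimoto(query, fp), 3))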
Team Software Process (TSP) Body of Knowledge (BOK)
2010-07-01
NanoDesign: Concepts and Software for a Nanotechnology Based on Functionalized Fullerenes
NASA Technical Reports Server (NTRS)
Globus, Al; Jaffe, Richard; Chancellor, Marisa K. (Technical Monitor)
1996-01-01
Eric Drexler has proposed a hypothetical nanotechnology based on diamond and investigated the properties of such molecular systems. While attractive, diamondoid nanotechnology is not physically accessible with straightforward extensions of current laboratory techniques. We propose a nanotechnology based on functionalized fullerenes and investigate carbon-nanotube-based gears with teeth added via a benzyne reaction known to occur with C60. The gears are single-walled carbon nanotubes with appended coenzyme groups for teeth. Fullerenes are in widespread laboratory use and can be functionalized in many ways. Companion papers computationally demonstrate the properties of these gears (they appear to work) and the accessibility of the benzyne/nanotube reaction. This paper describes the molecular design techniques and rationale as well as the software that implements these design techniques. The software is a set of persistent C++ objects controlled by Tcl command scripts. The C++/Tcl interface is automatically generated by a software system called tcl_c++, developed by the author and described here. The objects keep track of different portions of the molecular machinery to allow different simulation techniques and boundary conditions to be applied as appropriate. This capability has been required to demonstrate (computationally) our gear's feasibility. A new distributed software architecture featuring a WWW universal client, CORBA distributed objects, and agent software is under consideration. The software architecture is intended to eventually enable a widely dispersed group to develop complex simulated molecular machines.
Distributed agile software development for the SKA
NASA Astrophysics Data System (ADS)
Wicenec, Andreas; Parsons, Rebecca; Kitaeff, Slava; Vinsen, Kevin; Wu, Chen; Nelson, Paul; Reed, David
2012-09-01
The SKA software will most probably be developed by many groups distributed across the globe and coming from different backgrounds, like industries and research institutions. The SKA software subsystems will have to cover a very wide range of different areas, but still they have to react and work together like a single system to achieve the scientific goals and satisfy the challenging data flow requirements. Designing and developing such a system in a distributed fashion requires proper tools and the setup of an environment to allow for efficient detection and tracking of interface and integration issues, in particular in a timely way. Agile development can provide much faster feedback mechanisms and also much tighter collaboration between the customer (scientist) and the developer. Continuous integration and continuous deployment, on the other hand, can provide much faster feedback of integration issues from the system level to the subsystem developers. This paper describes the results obtained from trialing a potential SKA development environment based on existing science software development processes like ALMA, the expected distribution of the groups potentially involved in the SKA development, and experience gained in the development of large scale commercial software projects.
Software Reuse Within the Earth Science Community
NASA Technical Reports Server (NTRS)
Marshall, James J.; Olding, Steve; Wolfe, Robert E.; Delnore, Victor E.
2006-01-01
Scientific missions in the Earth sciences frequently require cost-effective, highly reliable, and easy-to-use software, which can be a challenge for software developers to provide. The NASA Earth Science Enterprise (ESE) spends a significant amount of resources developing software components and other software development artifacts that may also be of value if reused in other projects requiring similar functionality. In general, software reuse is often defined as utilizing existing software artifacts. Software reuse can improve productivity and quality while decreasing the cost of software development, as documented by case studies in the literature. Since large software systems are often the results of the integration of many smaller and sometimes reusable components, ensuring reusability of such software components becomes a necessity. Indeed, designing software components with reusability as a requirement can increase the software reuse potential within a community such as the NASA ESE community. The NASA Earth Science Data Systems (ESDS) Software Reuse Working Group is chartered to oversee the development of a process that will maximize the reuse potential of existing software components while recommending strategies for maximizing the reusability potential of yet-to-be-designed components. As part of this work, two surveys of the Earth science community were conducted. The first was performed in 2004 and distributed among government employees and contractors. A follow-up survey was performed in 2005 and distributed among a wider community, to include members of industry and academia. The surveys were designed to collect information on subjects such as the current software reuse practices of Earth science software developers, why they choose to reuse software, and what perceived barriers prevent them from reusing software. In this paper, we compare the results of these surveys, summarize the observed trends, and discuss the findings. The results are very similar, with the second, larger survey confirming the basic results of the first, smaller survey. The results suggest that reuse of ESE software can drive down the cost and time of system development, increase flexibility and responsiveness of these systems to new technologies and requirements, and increase effective and accountable community participation.
NASA Technical Reports Server (NTRS)
Wilber, George F.
2017-01-01
This Software Description Document (SDD) captures the design for developing the Flight Interval Management (FIM) system Configurable Graphics Display (CGD) software. Specifically this SDD describes aspects of the Boeing CGD software and the surrounding context and interfaces. It does not describe the Honeywell components of the CGD system. The SDD provides the system overview, architectural design, and detailed design with all the necessary information to implement the Boeing components of the CGD software and integrate them into the CGD subsystem within the larger FIM system. Overall system and CGD system-level requirements are derived from the CGD SRS (in turn derived from the Boeing System Requirements Design Document (SRDD)). Display and look-and-feel requirements are derived from Human Machine Interface (HMI) design documents and working group recommendations. This Boeing CGD SDD is required to support the upcoming Critical Design Review (CDR).
Abstraction, ethics and software: Why don`t the rules work?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warwick, S.
1994-12-31
A theory is presented that one of the reasons why the use of unlicensed software is so widespread and unstigmatized is that legislatures, courts and other bodies which create policy operate at a higher level of abstraction than do individuals, and that abstraction is a key factor in the divergence of societal behavior from that condoned by legal statute. This theory is explored through a pilot study consisting of medium-depth interviews with two volunteers who had used unlicensed software. Their attitudes, understanding of the law, and characterization of their use of unlicensed software as based on "need" is reported. In addition, the concept of face is examined, and how it is maintained while violating law. It is suggested that further studies, using multiple methodologies (in-depth interviews, focus groups, and surveys), be conducted prior to developing further policy or legislation regarding intellectual property protection for software.
Implementing Kanban for agile process management within the ALMA Software Operations Group
NASA Astrophysics Data System (ADS)
Reveco, Johnny; Mora, Matias; Shen, Tzu-Chiang; Soto, Ruben; Sepulveda, Jorge; Ibsen, Jorge
2014-07-01
After the inauguration of the Atacama Large Millimeter/submillimeter Array (ALMA), the Software Operations Group in Chile has refocused its objectives to: (1) providing software support to tasks related to System Integration, Scientific Commissioning and Verification, as well as Early Science observations; (2) testing the remaining software features, still under development by the Integrated Computing Team across the world; and (3) designing and developing processes to optimize and increase the level of automation of operational tasks. Due to their different stakeholders, each of these tasks presents a wide diversity of importances, lifespans and complexities. Aiming to provide the proper priority and traceability for every task without stressing our engineers, we introduced the Kanban methodology in our processes in order to balance the demand on the team against the throughput of the delivered work. The aim of this paper is to share experiences gained during the implementation of Kanban in our processes, describing the difficulties we have found, solutions and adaptations that led us to our current but still evolving implementation, which has greatly improved our throughput, prioritization and problem traceability.
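The core mechanism behind this approach can be illustrated with a minimal model: a board whose columns carry explicit work-in-progress (WIP) limits, so new work is refused once a column is full. The Python sketch below is hypothetical and for illustration only; it is not the tooling actually used by the Software Operations Group, and the column names and limits are invented.

    # Hypothetical model of a WIP-limited Kanban board, for illustration only.
    class KanbanBoard:
        def __init__(self, wip_limits):
            self.wip_limits = wip_limits                      # e.g. {"In Progress": 3}
            self.columns = {name: [] for name in wip_limits}

        def add(self, column, task):
            if len(self.columns[column]) >= self.wip_limits[column]:
                raise RuntimeError(f"WIP limit reached for '{column}'")
            self.columns[column].append(task)

        def move(self, task, src, dst):
            self.columns[src].remove(task)
            self.add(dst, task)                               # respects the WIP limit

    board = KanbanBoard({"Backlog": 100, "In Progress": 3, "Done": 1000})
    board.add("Backlog", "Automate antenna health report")
    board.move("Automate antenna health report", "Backlog", "In Progress")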
NASA Technical Reports Server (NTRS)
1987-01-01
A hypersonic transport aircraft design project was selected as a result of interactions with NASA Lewis Research Center personnel and fits the Presidential concept of the Orient Express. The Graduate Teaching Assistant (GTA) and an undergraduate student worked at the NASA Lewis Research Center during the 1986 summer conducting a literature survey, and relevant literature and useful software were collected. The computer software was implemented in the Computer Aided Design Laboratory of the Mechanical and Aerospace Engineering Department. In addition to the lectures by the three instructors, a series of guest lectures was conducted. The first of these lectures 'Anywhere in the World in Two Hours' was delivered by R. Luidens of NASA Lewis Center. In addition, videotaped copies of relevant seminars obtained from NASA Lewis were also featured. The first assignment was to individually research and develop the mission requirements and to discuss the findings with the class. The class in consultation with the instructors then developed a set of unified mission requirements. Then the class was divided into three design groups (1) Aerodynamics Group, (2) Propulsion Group, and (3) Structures and Thermal Analyses Group. The groups worked on their respective design areas and interacted with each other to finally come up with an integrated conceptual design. The three faculty members and the GTA acted as the resource persons for the three groups and aided in the integration of the individual group designs into the final design of a hypersonic aircraft.
CILogon: An Integrated Identity and Access Management Platform for Science
NASA Astrophysics Data System (ADS)
Basney, J.
2016-12-01
When scientists work together, they use web sites and other software to share their ideas and data. To ensure the integrity of their work, these systems require the scientists to log in and verify that they are part of the team working on a particular science problem. Too often, the identity and access verification process is a stumbling block for the scientists. Scientific research projects are forced to invest time and effort into developing and supporting Identity and Access Management (IAM) services, distracting them from the core goals of their research collaboration. CILogon provides an IAM platform that enables scientists to work together to meet their IAM needs more effectively so they can allocate more time and effort to their core mission of scientific research. The CILogon platform enables federated identity management and collaborative organization management. Federated identity management enables researchers to use their home organization identities to access cyberinfrastructure, rather than requiring yet another username and password to log on. Collaborative organization management enables research projects to define user groups for authorization to collaboration platforms (e.g., wikis, mailing lists, and domain applications). CILogon's IAM platform serves the unique needs of research collaborations, namely the need to dynamically form collaboration groups across organizations and countries, sharing access to data, instruments, compute clusters, and other resources to enable scientific discovery. CILogon provides a software-as-a-service platform to ease integration with cyberinfrastructure, while making all software components publicly available under open source licenses to enable re-use. Figure 1 illustrates the components and interfaces of this platform. CILogon has been operational since 2010 and has been used by over 7,000 researchers from more than 170 identity providers to access cyberinfrastructure including Globus, LIGO, Open Science Grid, SeedMe, and XSEDE. The "CILogon 2.0" platform, launched in 2016, adds support for virtual organization (VO) membership management, identity linking, international collaborations, and standard integration protocols, through integration with the Internet2 COmanage collaboration software.
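Federated login of the kind CILogon enables typically rests on standard OAuth2/OpenID Connect flows. The sketch below shows the generic authorization-code token exchange in Python; the endpoint URL, client credentials, and parameter values are placeholders for illustration and are not CILogon's actual configuration.

    # Generic OAuth2/OIDC authorization-code exchange (RFC 6749, section 4.1).
    # Endpoint, credentials, and code values below are placeholders.
    import requests

    TOKEN_ENDPOINT = "https://idp.example.org/oauth2/token"   # hypothetical endpoint

    def exchange_code_for_tokens(auth_code, client_id, client_secret, redirect_uri):
        """Exchange an authorization code for access and ID tokens."""
        resp = requests.post(
            TOKEN_ENDPOINT,
            data={
                "grant_type": "authorization_code",
                "code": auth_code,
                "redirect_uri": redirect_uri,
                "client_id": client_id,
                "client_secret": client_secret,
            },
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()    # typically contains access_token and id_token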
Technology Dollars from Pennies Saved.
ERIC Educational Resources Information Center
Anderson, Mary Alice
1996-01-01
Suggests ways to stretch media center budgets for technology, based on experiences at Winona Middle School (Minnesota). Topics include keeping statistics, hardware purchase and warranty information, centralizing purchases, planning for the reallocation of hardware and software, creative financing, working with business and community groups, staff…
Systems biology driven software design for the research enterprise.
Boyle, John; Cavnor, Christopher; Killcoyne, Sarah; Shmulevich, Ilya
2008-06-25
In systems biology, and many other areas of research, there is a need for the interoperability of tools and data sources that were not originally designed to be integrated. Due to the interdisciplinary nature of systems biology, and its association with high throughput experimental platforms, there is an additional need to continually integrate new technologies. As scientists work in isolated groups, integration with other groups is rarely a consideration when building the required software tools. We illustrate an approach, through the discussion of a purpose-built software architecture, which allows disparate groups to reuse tools and access data sources in a common manner. The architecture allows for: the rapid development of distributed applications; interoperability, so it can be used by a wide variety of developers and computational biologists; development using standard tools, so that it is easy to maintain and does not require a large development effort; extensibility, so that new technologies and data types can be incorporated; and non-intrusive development, insofar as researchers need not adhere to a pre-existing object model. By using a relatively simple integration strategy, based upon a common identity system and dynamically discovered interoperable services, a light-weight software architecture can become the focal point through which scientists can both get access to and analyse the plethora of experimentally derived data.
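The integration strategy described above (a common identity plus dynamically discovered services) can be sketched in a few lines. The Python example below is a hypothetical illustration of that pattern, not the authors' architecture; all service names and data types are invented.

    # Hypothetical registry in which services are discovered by the data type
    # (a shared identity) they handle; all names are invented for illustration.
    class ServiceRegistry:
        def __init__(self):
            self._services = {}                       # data_type -> list of handlers

        def register(self, data_type, handler):
            self._services.setdefault(data_type, []).append(handler)

        def discover(self, data_type):
            return self._services.get(data_type, [])

    registry = ServiceRegistry()
    registry.register("microarray", lambda record: f"normalised {record['id']}")
    registry.register("proteomics", lambda record: f"searched spectra for {record['id']}")

    record = {"id": "EXP-0042", "type": "microarray"}
    for service in registry.discover(record["type"]):
        print(service(record))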
Virtual building environments (VBE) - Applying information modeling to buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bazjanac, Vladimir
2004-06-21
A Virtual Building Environment (VBE) is a "place" where building industry project staffs can get help in creating Building Information Models (BIM) and in the use of virtual buildings. It consists of a group of industry software that is operated by industry experts who are also experts in the use of that software. The purpose of a VBE is to facilitate expert use of appropriate software applications in conjunction with each other to efficiently support multidisciplinary work. This paper defines BIM and virtual buildings, and describes VBE objectives, set-up and characteristics of operation. It informs about the VBE Initiative and the benefits from a couple of early VBE projects.
The 2006 NESCent Phyloinformatics Hackathon: A Field Report
Lapp, Hilmar; Bala, Sendu; Balhoff, James P.; Bouck, Amy; Goto, Naohisa; Holder, Mark; Holland, Richard; Holloway, Alisha; Katayama, Toshiaki; Lewis, Paul O.; Mackey, Aaron J.; Osborne, Brian I.; Piel, William H.; Kosakovsky Pond, Sergei L.; Poon, Art F.Y.; Qiu, Wei-Gang; Stajich, Jason E.; Stoltzfus, Arlin; Thierer, Tobias; Vilella, Albert J.; Vos, Rutger A.; Zmasek, Christian M.; Zwickl, Derrick J.; Vision, Todd J.
2007-01-01
In December, 2006, a group of 26 software developers from some of the most widely used life science programming toolkits and phylogenetic software projects converged on Durham, North Carolina, for a Phyloinformatics Hackathon, an intense five-day collaborative software coding event sponsored by the National Evolutionary Synthesis Center (NESCent). The goal was to help researchers to integrate multiple phylogenetic software tools into automated workflows. Participants addressed deficiencies in interoperability between programs by implementing “glue code” and improving support for phylogenetic data exchange standards (particularly NEXUS) across the toolkits. The work was guided by use-cases compiled in advance by both developers and users, and the code was documented as it was developed. The resulting software is freely available for both users and developers through incorporation into the distributions of several widely-used open-source toolkits. We explain the motivation for the hackathon, how it was organized, and discuss some of the outcomes and lessons learned. We conclude that hackathons are an effective mode of solving problems in software interoperability and usability, and are underutilized in scientific software development.
Collaborative Data Publication Utilizing the Open Data Repository's (ODR) Data Publisher
NASA Technical Reports Server (NTRS)
Stone, N.; Lafuente, B.; Bristow, T.; Keller, R. M.; Downs, R. T.; Blake, D.; Fonda, M.; Dateo, C.; Pires, A.
2017-01-01
Introduction: For small communities in diverse fields such as astrobiology, publishing and sharing data can be a difficult challenge. While large, homogeneous fields often have repositories and existing data standards, small groups of independent researchers have few options for publishing standards and data that can be utilized within their community. In conjunction with teams at NASA Ames and the University of Arizona, the Open Data Repository's (ODR) Data Publisher has been conducting ongoing pilots to assess the needs of diverse research groups and to develop software to allow them to publish and share their data collaboratively. Objectives: The ODR's Data Publisher aims to provide an easy-to-use, easy-to-implement software tool that will allow researchers to create and publish database templates and related data. The end product will facilitate both human-readable interfaces (web-based with embedded images, files, and charts) and machine-readable interfaces utilizing semantic standards. Characteristics: The Data Publisher software runs on the standard LAMP (Linux, Apache, MySQL, PHP) stack to provide the widest server base available. The software is based on Symfony (www.symfony.com) which provides a robust framework for creating extensible, object-oriented software in PHP. The software interface consists of a template designer where individual or master database templates can be created. A master database template can be shared by many researchers to provide a common metadata standard that will set a compatibility standard for all derivative databases. Individual researchers can then extend their instance of the template with custom fields, file storage, or visualizations that may be unique to their studies. This allows groups to create compatible databases for data discovery and sharing purposes while still providing the flexibility needed to meet the needs of scientists in rapidly evolving areas of research. Research: As part of this effort, a number of ongoing pilot and test projects are currently in progress. The Astrobiology Habitable Environments Database Working Group is developing a shared database standard using the ODR's Data Publisher and has a number of example databases where astrobiology data are shared. Soon these databases will be integrated via the template-based standard. Work with this group helps determine what data researchers in these diverse fields need to share and archive. Additionally, this pilot helps determine what standards are viable for sharing these types of data, from internally developed standards to existing open standards such as the Dublin Core (http://dublincore.org) and Darwin Core (http://rs.tdwg.org) metadata standards. Further studies are ongoing with the University of Arizona Department of Geosciences where a number of mineralogy databases are being constructed within the ODR Data Publisher system. Conclusions: Through the ongoing pilots and discussions with individual researchers and small research teams, a definition of the tools desired by these groups is coming into focus. As the software development moves forward, the goal is to meet the publication and collaboration needs of these scientists in an unobtrusive and functional way.
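The template idea described above can be sketched as follows: a shared master template fixes the common fields, and an individual researcher extends it with custom fields before records are validated. This Python sketch is illustrative only; the field names are hypothetical and do not reflect the actual ODR or AHED schemas.

    # Hypothetical master template plus a researcher-specific extension.
    MASTER_TEMPLATE = ["title", "creator", "date", "sample_id"]    # shared standard

    def make_record(template, extra_fields, **values):
        fields = template + extra_fields
        missing = [f for f in fields if f not in values]
        if missing:
            raise ValueError(f"missing required fields: {missing}")
        return {f: values[f] for f in fields}

    record = make_record(
        MASTER_TEMPLATE,
        extra_fields=["raman_shift_cm1"],              # researcher-specific extension
        title="Olivine spectrum", creator="J. Doe",
        date="2017-03-01", sample_id="AZ-117", raman_shift_cm1=823.5,
    )
    print(record)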
Quantifying parametric uncertainty in the Rothermel model
S. Goodrick
2008-01-01
The purpose of the present work is to quantify parametric uncertainty in the Rothermel wildland fire spread model (implemented in software such as fire spread models in the United States). This model consists of a non-linear system of equations that relates environmental variables (input parameter groups...
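A parametric uncertainty analysis of this kind is often carried out by Monte Carlo sampling of the input parameter groups and propagating them through the spread model. The sketch below illustrates that workflow with a simplified stand-in response function; it is not the Rothermel equation system, and the parameter ranges are assumptions chosen only for illustration.

    # Monte Carlo propagation through a simplified stand-in spread function.
    # NOT the Rothermel equations; ranges are assumptions for illustration.
    import numpy as np

    rng = np.random.default_rng(0)

    def toy_spread_rate(wind_speed, fuel_moisture, slope):
        """Placeholder monotone response, not the real model."""
        return 0.5 * (1 + 0.3 * wind_speed) * np.exp(-4.0 * fuel_moisture) * (1 + 0.1 * slope)

    n = 10_000
    wind = rng.uniform(0.0, 10.0, n)        # m/s, assumed range
    moisture = rng.uniform(0.05, 0.25, n)   # fraction, assumed range
    slope = rng.uniform(0.0, 0.4, n)        # rise/run, assumed range

    rates = toy_spread_rate(wind, moisture, slope)
    print(f"mean spread rate: {rates.mean():.2f}; "
          f"5th-95th percentile: {np.percentile(rates, 5):.2f}-{np.percentile(rates, 95):.2f}")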
1993-11-01
Eliezer N. Solomon, Steve Sedrel, Westinghouse Electronic Systems Group, P.O. Box 746, MS 432, Baltimore, Maryland 21203-0746, USA. SUMMARY: The United States...subset of the Joint Integrated Avionics Working Group (JIAWG)...NewAgentCollection which has four Performance parameters: Acceptor, of type Task._D...Published November 1993. AGARD (Advisory Group for Aerospace Research & Development), 7 Rue Ancelle, 92200.
Using synchronous software in Web-based nursing courses.
Little, Barbara Battin; Passmore, Denise; Schullo, Shauna
2006-01-01
To promote learning and enhance immediacy and satisfaction, a college of nursing at a large research I southern university undertook a pilot project to incorporate synchronous classroom software into an ongoing online program. Two synchronous class sessions using voice over Internet protocol were offered in the Community/Public Health Nursing course through Elluminate Live! Upon conclusion of the lecture, students were divided into breakout groups to work on group projects. Surveys were administered to the students and faculty before and after the class sessions. Evaluation of the pedagogical strategies used in the synchronous sessions was conducted by instructional technology faculty. Students in the pilot group reported higher levels of satisfaction with the Web-based course with synchronous sessions. In addition, students reported that group time at the end of the session was helpful for completing group projects. A majority responded that synchronous session activities and assignments facilitated their understanding of course content. This article presents a description of the synchronous classroom pilot project along with recommendations for implementation and pedagogical approaches.
HEP Community White Paper on Software Trigger and Event Reconstruction: Executive Summary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Albrecht, Johannes; et al.
Realizing the physics programs of the planned and upgraded high-energy physics (HEP) experiments over the next 10 years will require the HEP community to address a number of challenges in the area of software and computing. For this reason, the HEP software community has engaged in a planning process over the past two years, with the objective of identifying and prioritizing the research and development required to enable the next generation of HEP detectors to fulfill their full physics potential. The aim is to produce a Community White Paper which will describe the community strategy and a roadmap for software and computing research and development in HEP for the 2020s. The topics of event reconstruction and software triggers were considered by a joint working group and are summarized together in this document.
HEP Community White Paper on Software Trigger and Event Reconstruction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Albrecht, Johannes; et al.
Realizing the physics programs of the planned and upgraded high-energy physics (HEP) experiments over the next 10 years will require the HEP community to address a number of challenges in the area of software and computing. For this reason, the HEP software community has engaged in a planning process over the past two years, with the objective of identifying and prioritizing the research and development required to enable the next generation of HEP detectors to fulfill their full physics potential. The aim is to produce a Community White Paper which will describe the community strategy and a roadmap for software and computing research and development in HEP for the 2020s. The topics of event reconstruction and software triggers were considered by a joint working group and are summarized together in this document.
The use of hypermedia to increase the productivity of software development teams
NASA Technical Reports Server (NTRS)
Coles, L. Stephen
1991-01-01
Rapid progress in low-cost commercial PC-class multimedia workstation technology will potentially have a dramatic impact on the productivity of distributed work groups of 50-100 software developers. Hypermedia/multimedia involves the seamless integration in a graphical user interface (GUI) of a wide variety of data structures, including high-resolution graphics, maps, images, voice, and full-motion video. Hypermedia will normally require the manipulation of large dynamic files for which relational database technology and SQL servers are essential. Basic machine architecture, special-purpose video boards, video equipment, optical memory, software needed for animation, network technology, and the anticipated increase in productivity that will result from the introduction of hypermedia technology are covered. It is suggested that the cost of the hardware and software to support an individual multimedia workstation will be on the order of $10,000.
The Computational Infrastructure for Geodynamics as a Community of Practice
NASA Astrophysics Data System (ADS)
Hwang, L.; Kellogg, L. H.
2016-12-01
Computational Infrastructure for Geodynamics (CIG), geodynamics.org, originated in 2005 out of community recognition that the efforts of individual or small groups of researchers to develop scientifically-sound software is impossible to sustain, duplicates effort, and makes it difficult for scientists to adopt state-of-the art computational methods that promote new discovery. As a community of practice, participants in CIG share an interest in computational modeling in geodynamics and work together on open source software to build the capacity to support complex, extensible, scalable, interoperable, reliable, and reusable software in an effort to increase the return on investment in scientific software development and increase the quality of the resulting software. The group interacts regularly to learn from each other and better their practices formally through webinar series, workshops, and tutorials and informally through listservs and hackathons. Over the past decade, we have learned that successful scientific software development requires at a minimum: collaboration between domain-expert researchers, software developers and computational scientists; clearly identified and committed lead developer(s); well-defined scientific and computational goals that are regularly evaluated and updated; well-defined benchmarks and testing throughout development; attention throughout development to usability and extensibility; understanding and evaluation of the complexity of dependent libraries; and managed user expectations through education, training, and support. CIG's code donation standards provide the basis for recently formalized best practices in software development (geodynamics.org/cig/dev/best-practices/). Best practices include use of version control; widely used, open source software libraries; extensive test suites; portable configuration and build systems; extensive documentation internal and external to the code; and structured, human readable input formats.
Virtualization of System of Systems Test and Evaluation
2012-06-04
computers and is the primary enabler for virtualization. 2. Virtualization System Elements Parmalee, Peterson, Tillman, & Hatfield (1972) outlined the...The work of Abu-Taieh and El Sheikh, based on the work of Balci (1994, 1995), and Balci et al. (1996), seeks to organize types of tests and to...and testing. In A. Dasso & A. Funes (Eds.), Verification, validation, and testing in software engineering (pp. 155–184). Hershey, PA: Idea Group
Working Group 1: Software System Design and Implementation for Environmental Modeling (presentation)
Background: Nine Federal agencies have been cooperating under a Memorandum of Understanding (MOU) on the research and development of multimedia environmental models. The MOU, which was revised in 2012, continues an effort that began in 2001. It establishes a framework for facilit...
Selvester scoring in patients with strict LBBB using the QUARESS software.
Xia, Xiaojuan; Chaudhry, Uzma; Wieslander, Björn; Borgquist, Rasmus; Wagner, Galen S; Strauss, David G; Platonov, Pyotr; Ugander, Martin; Couderc, Jean-Philippe
2015-01-01
Estimation of the infarct size from body-surface ECGs in post-myocardial infarction patients has become possible using the Selvester scoring method. Automation of this scoring has been proposed in order to speed up the measurement of the score and to improve the inter-observer variability in computing a score that requires strong expertise in electrocardiography. In this work, we evaluated the quality of the QuAReSS software for delivering correct Selvester scoring in a set of standard 12-lead ECGs. Standard 12-lead ECGs were recorded in 105 post-MI patients prescribed implantation of an implantable cardioverter-defibrillator (ICD). Amongst the 105 patients with standard clinical left bundle branch block (LBBB) patterns, 67 had an LBBB pattern meeting the strict criteria. The QuAReSS software was applied to these 67 tracings by two independent groups of cardiologists (from a clinical group and an ECG core laboratory) to measure the Selvester score semi-automatically. Using various levels of agreement metrics, we compared the scores between groups and when automatically measured by the software. The average of the absolute difference in Selvester scores measured by the two independent groups was 1.4±1.5 score points, whereas the differences between the automatic method and the two manual adjudications were 1.2±1.2 and 1.3±1.2 points. Eighty-two percent score agreement was observed between the two independent measurements when the difference of score was within two points, while 90% and 84% score agreements were reached using the automatic method compared to the two manual adjudications. The study confirms that the QuAReSS software provides valid measurements of the Selvester score in patients with strict LBBB with minimal correction from cardiologists. Copyright © 2015 Elsevier Inc. All rights reserved.
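The agreement statistics quoted above (mean absolute score difference and the fraction of cases agreeing within two points) can be computed as in the following sketch. The scores used here are made-up illustrative values, not data from the study.

    # Mean absolute difference and percent agreement within a tolerance.
    # Scores below are invented illustrative values, not study data.
    import numpy as np

    def score_agreement(scores_a, scores_b, tolerance=2):
        a, b = np.asarray(scores_a, float), np.asarray(scores_b, float)
        diff = np.abs(a - b)
        return diff.mean(), diff.std(), (diff <= tolerance).mean()

    reader_1 = [4, 7, 2, 9, 5, 3]
    reader_2 = [5, 6, 2, 11, 4, 3]
    mean_d, sd_d, pct = score_agreement(reader_1, reader_2)
    print(f"absolute difference {mean_d:.1f}+/-{sd_d:.1f} points; {pct:.0%} within 2 points")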
Ondersma, Steven J; Martin, Joanne; Fortson, Beverly; Whitaker, Daniel J; Self-Brown, Shannon; Beatty, Jessica; Loree, Amy; Bard, David; Chaffin, Mark
2017-11-01
Early home visitation (EHV) for child maltreatment prevention is widely adopted but has received inconsistent empirical support. Supplementation with interactive software may facilitate attention to major risk factors and use of evidence-based approaches. We developed eight 20-min computer-delivered modules for use by mothers during the course of EHV. These modules were tested in a randomized trial in which 413 mothers were assigned to software-supplemented e-Parenting Program (ePP), services as usual (SAU), or community referral conditions, with evaluation at 6 and 12 months. Outcomes included satisfaction, working alliance, EHV retention, child maltreatment, and child maltreatment risk factors. The software was well-received overall. At the 6-month follow-up, working alliance ratings were higher in the ePP condition relative to the SAU condition (Cohen's d = .36, p < .01), with no differences at 12 months. There were no between-group differences in maltreatment or major risk factors at either time point. Despite good acceptability and feasibility, these findings provide limited support for use of this software within EHV. These findings contribute to the mixed results seen across different models of EHV for child maltreatment prevention.
ERIC Educational Resources Information Center
Li, Xiaoming; Atkins, Melissa S.; Stanton, Bonita
2006-01-01
Data from 122 Head Start children were analyzed to examine the impact of computer use on school readiness and psychomotor skills. Children in the experimental group were given the opportunity to work on a computer for 15-20 minutes per day with their choice of developmentally appropriate educational software, while the control group received a…
My World Is Your World: Web Portal Design For Environmental Data
NASA Astrophysics Data System (ADS)
Laney, C.; Cody, R. P.; Gaylord, A. G.; Kassin, A.; Manley, W. F.; Score, R.; Tweedie, C. E.
2013-12-01
In the environmental sciences, researchers are increasingly relying on automated sensors as necessary components of their work. There are many software packages available that will help users download data from internet-connected data loggers; process, store, document, and analyze the data; or provide web-based geoportals for visualization and sharing of both spatial and time-series data. However, few (if any) software packages provide a complete, end-to-end system that will meet all of the needs of any given research group. Such systems often need to be designed and built as needed. Our group specializes in creating such systems. Our portals provide rapid data discovery and contextualization, and promote collaboration. We work at multiple scales, from a small lab working at a single site in the Chihuahuan desert (SEL-Jornada), to a community portal for environmental data from Barrow, Alaska (Barrow Area Information Database Information Management System [BAID-IMS]), to a project-tracking system for US Arctic research efforts (Arctic Research Mapping Application/Arctic Observing Viewer [ARMAP/AON]). Here, we share our experiences of creating scalable systems and improving practices that address both user community and research needs.
Determining the explosion risk level and the explosion hazard area for a group of natural gas wells
NASA Astrophysics Data System (ADS)
Gligor, A.; Petrescu, V.; Deac, C.; Bibu, M.
2016-11-01
Starting from the fact that the natural gas engineering profession is generally associated with a high occupational risk, the current paper aims to help increase the safety of natural gas wells and reduce the risk of work-related accidents, as well as the occurrence of occupational illnesses, by applying an assessment method that has proven its efficiency in other industrial areas in combination with computer-aided design software. More specifically, the paper focuses on two main research directions: assessing the explosion risk for employees working at natural gas wells and indicating areas with a higher explosion hazard by using modern software that allows their presentation in 3D. The appropriate zoning of industrial areas allows the various functional areas to be grouped according to the probability of the occurrence of a dangerous element, such as an explosive atmosphere, and subsequently allows the correct selection of the electrical and mechanical equipment that will be used in that area, since electrical apparatuses that are otherwise found in normal work environments cannot generally be used in areas with an explosion hazard, because of the risk that an electric spark, an electrostatic discharge, etc. could ignite the explosive atmosphere.
Singularity: Scientific containers for mobility of compute.
Kurtzer, Gregory M; Sochat, Vanessa; Bauer, Michael W
2017-01-01
Here we present Singularity, software developed to bring containers and reproducibility to scientific computing. Using Singularity containers, developers can work in reproducible environments of their choosing and design, and these complete environments can easily be copied and executed on other platforms. Singularity is an open source initiative that harnesses the expertise of system and software engineers and researchers alike, and integrates seamlessly into common workflows for both of these groups. As its primary use case, Singularity brings mobility of computing to both users and HPC centers, providing a secure means to capture and distribute software and compute environments. This ability to create and deploy reproducible environments across these centers, a previously unmet need, makes Singularity a game changing development for computational science.
[Development of integrated support software for clinical nutrition].
Siquier Homar, Pedro; Pinteño Blanco, Manel; Calleja Hernández, Miguel Ángel; Fernández Cortés, Francisco; Martínez Sotelo, Jesús
2015-09-01
To develop an integrated computer software application for specialized nutritional support, integrated into the electronic clinical record, which automatically and early detects patients who are undernourished or at risk of developing undernourishment, determining points of opportunity for improvement and evaluation of the results. The quality standards published by the Nutrition Work Group of the Spanish Society of Hospital Pharmacy (SEFH) and the recommendations of the Pharmacy Group of the Spanish Society of Parenteral and Enteral Nutrition (SENPE) have been taken into account. According to these quality standards, nutritional support has to include the following healthcare stages or sub-processes: nutritional screening, nutritional assessment, plan for nutritional care, prescription, preparation and administration. This software allows conducting, in an automated way, a specific nutritional assessment for patients at nutritional risk, implementing, if necessary, a nutritional treatment plan, conducting follow-up and traceability of outcomes derived from the implementation of improvement actions, and quantifying to what extent our practice approaches the established standard. This software allows the specialized nutritional support to be standardized from a multidisciplinary point of view, introducing the concept of quality control per process, and including the patient as the main customer. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
NASA Astrophysics Data System (ADS)
Waller, Lewis G.; Shortridge, Keith; Farrell, Tony J.; Vuong, Minh; Muller, Rolf; Sheinis, Andrew I.
2014-07-01
The new HERMES spectrograph represents the first foray by AAO into the use of commercial off-the-shelf industrial field bus technology for instrument control, and we regard the final system, with its relatively simple wiring requirements, as a great success. However, both software and hardware teams had to work together to solve a number of problems integrating the chosen CANopen/CAN bus system into our normal observing systems. A Linux system running in an industrial PC chassis ran the HERMES control software, using a PCI CAN bus interface connected to a number of distributed CANopen/CAN bus I/O devices and servo amplifiers. In the main, the servo amplifiers performed impressively, although some experimentation with homing algorithms was required, and we hit a significant hurdle when we discovered that we needed to disable some of the encoders used during observations; we learned a lot about how servo amplifiers respond when their encoders are turned off, and about how encoders react to losing power. The software was based around a commercial CANopen library from Copley Controls. Early worries about how this heavily multithreaded library would work with our standard data acquisition system led to the development of a very low-level CANopen software simulator to verify the design. This also enabled the software group to develop and test almost all the control software well in advance of the construction of the hardware. In the end, the instrument went from initial installation at the telescope to successful commissioning remarkably smoothly.
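For readers unfamiliar with CANopen, the following sketch shows how a basic network-management (NMT) "start remote node" command is sent using the open-source python-can package. This is only an illustration of the bus-level protocol; it is not the commercial Copley Controls library used for HERMES, and the channel name and node ID are assumptions.

    # Sending a CANopen NMT "start remote node" command with python-can.
    # Channel and node ID are assumptions; this is not the Copley library.
    import can

    def start_remote_node(node_id, channel="can0"):
        """NMT start (command 0x01) is broadcast on CAN-ID 0x000 per CiA 301."""
        bus = can.interface.Bus(bustype="socketcan", channel=channel)
        msg = can.Message(arbitration_id=0x000,
                          data=[0x01, node_id],
                          is_extended_id=False)
        bus.send(msg)
        bus.shutdown()

    start_remote_node(node_id=0x20)    # hypothetical servo amplifier node ID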
SRA Real Math Building Blocks PreK. What Works Clearinghouse Intervention Report
ERIC Educational Resources Information Center
What Works Clearinghouse, 2007
2007-01-01
"SRA Real Math Building Blocks PreK" (also referred to as "Building Blocks for Math") is a supplemental mathematics curriculum designed to develop preschool children's early mathematical knowledge through various individual and small- and large-group activities. It uses "Building Blocks for Math PreK" software,…
SOFTWARE SYSTEM DESIGN AND IMPLEMENTATION FOR ENVIRONMENTAL MODELING: A MOU WORKING GROUP
A workgroup was formed in conjunction with a formal Memorandum of Understanding (MOU) among six Federal Agencies to pursue collaborative research in technical areas related to environmental modeling. Among the primary objectives of the MOU are to 1) provide a mechanism for the c...
Supporting Effective Collaboration: Using a Rearview Mirror to Look Forward
ERIC Educational Resources Information Center
McManus, Margaret M.; Aiken, Robert M.
2016-01-01
Our original research, to design and develop an Intelligent Collaborative Learning System (ICLS), yielded the creation of a Group Leader Tutor software system which utilizes a Collaborative Skills Network to monitor students working collaboratively in a networked environment. The Collaborative Skills Network was a conceptualization of…
48 CFR 227.7205 - Contracts for special works.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Computer Software and Computer Software Documentation 227.7205 Contracts for special works. (a) Use the... a specific need to control the distribution of computer software or computer software documentation..., modification, reproduction, release, performance, display, or disclosure of such software or documentation. Use...
48 CFR 227.7205 - Contracts for special works.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Computer Software and Computer Software Documentation 227.7205 Contracts for special works. (a) Use the... a specific need to control the distribution of computer software or computer software documentation..., modification, reproduction, release, performance, display, or disclosure of such software or documentation. Use...
The ALMA Common Software as a Basis for a Distributed Software Development
NASA Astrophysics Data System (ADS)
Raffi, Gianni; Chiozzi, Gianluca; Glendenning, Brian
The Atacama Large Millimeter Array (ALMA) is a joint project involving astronomical organizations in Europe, North America and Japan. ALMA will consist of 64 12-m antennas operating in the millimetre and sub-millimetre wavelength range, with baselines of more than 10 km. It will be located at an altitude above 5000 m in the Chilean Atacama desert. The ALMA Computing group is a joint group with staff scattered across three continents and is responsible for all the control and data flow software related to ALMA, including tools ranging from support of proposal preparation to archive access of automatically created images. Early in the project it was decided that an ALMA Common Software (ACS) would be developed as a way to provide a common software platform to all partners involved in the development. The original assumption was that some key middleware like communication via CORBA and the use of XML and Java would be part of the project. It was intended from the beginning to develop this software in an incremental way based on releases, so that it would then evolve into an essential embedded part of all ALMA software applications. In this way we would build a basic unity and coherence into a system that will have been developed in a distributed fashion. This paper evaluates our progress after 1.5 years of work, following a few tests and preliminary releases. It analyzes the advantages and difficulties of such an ambitious approach, which creates an interface across all the various control and data flow applications.
Dynamic Modelling with "MLE-Energy Dynamic" for Primary School
NASA Astrophysics Data System (ADS)
Giliberti, Enrico; Corni, Federico
In recent years, simulation and modelling have become increasingly prominent in science education. In primary school, however, the main use of software is simulation, due to the lack of modelling software tools specially designed to meet the needs of primary education. In particular, primary school teachers need to use simulation in a framework that is both consistent and simple enough to be understandable by children.
NASA Astrophysics Data System (ADS)
Downs, R. R.; Lenhardt, W. C.; Robinson, E.
2014-12-01
Science software is integral to the scientific process and must be developed and managed in a sustainable manner to ensure future access to scientific data and related resources. Organizations that are part of the scientific enterprise, as well as members of the scientific community who work within these entities, can contribute to the sustainability of science software and to practices that improve scientific community capabilities for science software sustainability. As science becomes increasingly digital and, therefore, dependent on software, improving community practices for sustainable science software will contribute to the sustainability of science. Members of the Earth science informatics community, including scientific data producers and distributors, end-user scientists, system and application developers, and data center managers, use science software regularly and face the challenges and the opportunities that science software presents for the sustainability of science. To gain insight on practices needed for the sustainability of science software from the science software experiences of the Earth science informatics community, an interdisciplinary group of 300 community members were asked to engage in simultaneous roundtable discussions and report on their answers to questions about the requirements for improving scientific software sustainability. This paper will present an analysis of the issues reported and the conclusions offered by the participants. These results provide perspectives for science software sustainability practices and have implications for actions that organizations and their leadership can initiate to improve the sustainability of science software.
Report of AAPM Task Group 162: Software for planar image quality metrology.
Samei, Ehsan; Ikejimba, Lynda C; Harrawood, Brian P; Rong, John; Cunningham, Ian A; Flynn, Michael J
2018-02-01
The AAPM Task Group 162 aimed to provide a standardized approach for the assessment of image quality in planar imaging systems. This report offers a description of the approach as well as the details of the resultant software bundle to measure detective quantum efficiency (DQE) as well as its basis components and derivatives. The methodology and the associated software include the characterization of the noise power spectrum (NPS) from planar images acquired under specific acquisition conditions, modulation transfer function (MTF) using an edge test object, the DQE, and effective DQE (eDQE). First, a methodological framework is provided to highlight the theoretical basis of the work. Then, a step-by-step guide is included to assist in proper execution of each component of the code. Lastly, an evaluation of the method is included to validate its accuracy against model-based and experimental data. The code was built using a Macintosh OSX operating system. The software package contains all the source codes to permit an experienced user to build the suite on a Linux or other *nix type system. The package further includes manuals and sample images and scripts to demonstrate use of the software for new users. The results of the code are in close alignment with theoretical expectations and published results of experimental data. The methodology and the software package offered in AAPM TG162 can be used as baseline for characterization of inherent image quality attributes of planar imaging systems. © 2017 American Association of Physicists in Medicine.
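As an illustration of one of the basis components named above, the sketch below estimates a two-dimensional noise power spectrum from flat-field regions of interest using the standard textbook formulation. It is a generic example for orientation only, not the TG-162 software itself.

    # Simple 2-D noise power spectrum estimate from flat-field ROIs
    # (generic textbook formulation, not the TG-162 software).
    import numpy as np

    def nps_2d(rois, pixel_pitch_mm):
        """NPS(u, v) ~ (dx*dy / (Nx*Ny)) * mean over ROIs of |FFT(ROI - mean)|^2."""
        rois = np.asarray(rois, float)
        n_roi, ny, nx = rois.shape
        detrended = rois - rois.mean(axis=(1, 2), keepdims=True)   # remove the mean
        spectra = np.abs(np.fft.fft2(detrended)) ** 2
        return spectra.mean(axis=0) * (pixel_pitch_mm ** 2) / (nx * ny)

    rng = np.random.default_rng(1)
    rois = rng.normal(0.0, 5.0, size=(64, 128, 128))   # synthetic white noise
    nps = nps_2d(rois, pixel_pitch_mm=0.2)
    print(nps.shape, float(nps.mean()))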
Introducing the AAS Working Group on Astroinformatics and Astrostatistics
NASA Astrophysics Data System (ADS)
Ivezic, Zeljko
2014-01-01
In response to two White Papers submitted to the Astro2010 Decadal Survey (1,2), a new AAS Working Group on Astroinformatics and Astrostatistics (WGAA) has been approved by the AAS Council at the 220th Meeting, June 2012, in Anchorage. The motivation for this WG is the growing importance of the interface between astronomy and various branches of applied mathematics, computer science and the emerging field of data science. With the new data-intensive projects envisioned for the coming decade, the need for advice derived from the focused attention of a group of AAS members who work in these areas is bound to increase. The Working Group is charged with spreading awareness of rapidly advancing computational techniques, sophisticated statistical methods, and highly capable software to further the goals of astronomical and astrophysical research. The three main strategic goals adopted by the WGAA Steering Committee for the next few years are to: (i) develop, organize and maintain methodological resources (such as software tools, papers, books, and lectures); (ii) enhance human resources (such as foster the creation of career paths, establish a Speakers' Bureau, establish and maintain an archived discussion forum, enable periodic news distribution); and (iii) organize topical meetings. The WGAA Steering Committee at this time includes twelve members: Kirk Borne, George Djorgovski, Eric Feigelson, Eric Ford, Alyssa Goodman, Joe Hilbe, Zeljko Ivezic (chair), Ashish Mahabal, Aneta Siemiginowska, Alex Szalay, Rick White, and Padma Yanamandra-Fisher. I will summarize our accomplishments since July 2012. (1) Astroinformatics: A 21st Century Approach to Astronomy (Borne & 90 coauthors), (2) The Astronomical Information Sciences: A Keystone for 21st-Century Astronomy (Loredo & 72 coauthors)
CCSDS SOIS Subnetwork Services: A First Reference Implementation
NASA Astrophysics Data System (ADS)
Gunes-Lasnet, S.; Notebaert, O.; Farges, P.-Y.; Fowell, S.
2008-08-01
The CCSDS SOIS working groups are developing a range of standards for spacecraft onboard interfaces with the intention of promoting reuse of hardware and software designs across a range of missions while enabling interoperability of onboard systems from diverse sources. The CCSDS SOIS working groups released in June 2007 their red books for both Subnetwork and application support layers. In order to allow the verification of these recommended standards and to pave the way for future implementation on board spacecraft, it is essential for these standards to be prototyped on a representative spacecraft platform, to provide valuable feedback to the SOIS working group. A first reference implementation of both Subnetwork and Application Support SOIS services over SpaceWire and Mil-Std-1553 bus is thus being realised by SciSys Ltd and Astrium under an ESA contract.
Sánchez Cuervo, Marina; Muñoz García, María; Gómez de Salazar López de Silanes, María Esther; Bermejo Vicedo, Teresa
2015-03-01
To describe the features of a computer program for the management of drugs in special situations (off-label and compassionate use) in a Department of Hospital Pharmacy (PD). To describe the methodology followed for its implementation in the Medical Services. To evaluate its use after 2 years of practice. The design was carried out by pharmacists of the PD. The stages of the process were: selection of a software development company, establishment of a working group, selection of a development platform, design of an interactive viewer, definition of functionality and data processing, creation of databases, connection, installation and configuration, application testing and improvements development. A directed sequential strategy was used for implementation in the Medical Services. The program's utility and experience of use were evaluated after 2 years. A multidisciplinary working group was formed and developed Pk_Usos®. The program works in a web environment with a common viewer for all users, enabling real-time checking of the status of request files, and adapts to the procedure for managing medications in special situations. Pk_Usos® was introduced first in the Oncology Department, with 15 oncologists as users of the program. 343 patients had 384 treatment requests managed, of which 363 were authorized over the two years. Pk_Usos® is the first software designed for the management of drugs in special situations in the PD. It is a dynamic and efficient tool for all professionals involved in the process, optimizing turnaround times. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
User-centered design of multi-gene sequencing panel reports for clinicians.
Cutting, Elizabeth; Banchero, Meghan; Beitelshees, Amber L; Cimino, James J; Fiol, Guilherme Del; Gurses, Ayse P; Hoffman, Mark A; Jeng, Linda Jo Bone; Kawamoto, Kensaku; Kelemen, Mark; Pincus, Harold Alan; Shuldiner, Alan R; Williams, Marc S; Pollin, Toni I; Overby, Casey Lynnette
2016-10-01
The objective of this study was to develop a high-fidelity prototype for delivering multi-gene sequencing panel (GS) reports to clinicians that simulates the user experience of a final application. The delivery and use of GS reports can occur within complex and high-paced healthcare environments. We employ a user-centered software design approach in a focus group setting in order to facilitate gathering rich contextual information from a diverse group of stakeholders potentially impacted by the delivery of GS reports relevant to two precision medicine programs at the University of Maryland Medical Center. Responses from focus group sessions were transcribed, coded and analyzed by two team members. Notification mechanisms and information resources preferred by participants from our first phase of focus groups were incorporated into scenarios and the design of a software prototype for delivering GS reports. The goal of our second phase of focus group, to gain input on the prototype software design, was accomplished through conducting task walkthroughs with GS reporting scenarios. Preferences for notification, content and consultation from genetics specialists appeared to depend upon familiarity with scenarios for ordering and delivering GS reports. Despite familiarity with some aspects of the scenarios we proposed, many of our participants agreed that they would likely seek consultation from a genetics specialist after viewing the test reports. In addition, participants offered design and content recommendations. Findings illustrated a need to support customized notification approaches, user-specific information, and access to genetics specialists with GS reports. These design principles can be incorporated into software applications that deliver GS reports. Our user-centered approach to conduct this assessment and the specific input we received from clinicians may also be relevant to others working on similar projects. Copyright © 2016 Elsevier Inc. All rights reserved.
Developing Engineering and Science Process Skills Using Design Software in an Elementary Education
NASA Astrophysics Data System (ADS)
Fusco, Christopher
This paper examines the development of process skills through an engineering design approach to instruction in an elementary lesson that combines Science, Technology, Engineering, and Math (STEM). The study took place with 25 fifth graders in a public, suburban school district. Students worked in groups of five to design and construct model bridges based on research involving bridge building design software. The assessment was framed around individual student success as well as overall group processing skills. These skills were assessed through an engineering design packet rubric (student work), student surveys of learning gains, observation field notes, and pre- and post-assessment data. The results indicate that students can successfully utilize design software to inform constructions of model bridges, develop science process skills through problem-based learning, and understand academic concepts through a design project. The final result of this study shows that design engineering is effective for developing cooperative learning skills. The study suggests that an engineering program offered as an elective or as part of the mandatory curriculum could be beneficial for developing students' critical thinking and inter- and intra-personal skills, along with increasing their understanding of and appreciation for scientific phenomena. In conclusion, combining a design approach to instruction with STEM can increase efficiency in these areas, generate meaningful learning, and influence student attitudes throughout their education.
[Job stressors in software developers--a comparison with other occupations].
Kadokura, M
1997-09-01
The aim of this study is to investigate the difference in job stressors among software developers, the sales staff and the clerical staff (n = 2,079) in two companies (A Co. and B Co.) using a self-administered questionnaire that included a job stressor scale and the 30-item General Health Questionnaire (GHQ). We developed the job stressor scale based on interviews with out-patients engaged in software development and on previous studies of job stressors. Factor analysis with a seven-factor solution showed that seven subscales were abstracted from the job stressor scale, namely, quantitative load of work, dissatisfaction with work, demanding work, uneasiness about work, human relations, ambiguity of work and shortage of private time. Each subscale was significantly (r = .313-.442, p < 0.0001) correlated with the GHQ score and proved to be a reliable instrument, as indicated by a Cronbach's alpha of greater than 0.73. Stepwise multiple regression analysis revealed that quantitative load of work and shortage of private time subscale scores were significantly high in software developers in A Co. Software developers in A Co. also tended to score higher (p < 0.10) than the others on the demanding work and ambiguity of work subscales. All subscale scores were significantly low in the clerical staff in B Co. There was no significant difference between the sales staff and software developers in B Co. Results of the interviews with out-patients showed that demanding work, hard deadlines, ambiguity of work and precarious work would cause trouble for software developers. The implications of these findings with respect to occupational issues related to software developers are discussed.
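For readers unfamiliar with the reliability statistic quoted above, the following is a minimal sketch of how Cronbach's alpha can be computed for a subscale in Python; the item-score matrix is hypothetical and is not drawn from the study's data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the subscale
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical 5-item subscale answered by 6 respondents on a 1-5 Likert scale.
scores = np.array([
    [4, 5, 4, 4, 5],
    [2, 2, 3, 2, 2],
    [5, 4, 5, 5, 4],
    [3, 3, 2, 3, 3],
    [1, 2, 1, 2, 1],
    [4, 4, 5, 4, 4],
])
print(f"alpha = {cronbach_alpha(scores):.3f}")
```

A value above 0.7, as reported for each subscale in the study, is conventionally taken to indicate acceptable internal consistency.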
NASA Technical Reports Server (NTRS)
Benowitz, E.; Niessner, A.
2003-01-01
This work involves developing representative mission-critical spacecraft software using the Real-Time Specification for Java (RTSJ). The work currently leverages actual flight software used in NASA's Deep Space 1 (DS1) mission, which flew in 1998.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-02
... Machines (IBM), Software Group Business Unit, Quality Assurance Group, San Jose, California; Notice of... workers of International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA... February 2, 2011 (76 FR 5832). The subject worker group supplies acceptance testing services, design...
KETCindy--Collaboration of Cinderella and KETpic Reports on CADGME 2014 Conference Working Group
ERIC Educational Resources Information Center
Kaneko, Masataka; Yamashita, Satoshi; Kitahara, Kiyoshi; Maeda, Yoshifumi; Nakamura, Yasuyuki; Kortenkamp, Ulrich; Takato, Setsuo
2015-01-01
Dynamic Geometry Software (DGS) is a powerful tool which enables students to move geometric objects interactively. Through experimental simulations with DGS, mathematical facts and background mechanisms are accessible to students. However, especially when those facts and mechanisms are complicated, it is not so easy for some students to record and…
NASA Technical Reports Server (NTRS)
Pourmal, Elena
2016-01-01
The HDF Group maintains and evolves the HDF software used by the NASA ESDIS program to manage remote sensing data. In this talk we will discuss new features of HDF (Virtual Datasets, Single-Writer/Multiple-Reader (SWMR) access, and community-supported HDF5 compression filters) that address the storage and I/O performance requirements of applications that work with ESDIS data products.
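To make the Single-Writer/Multiple-Reader (SWMR) pattern concrete, here is a minimal sketch using the h5py bindings; the file name and dataset layout are invented for the example and do not correspond to any ESDIS data product.

```python
import h5py
import numpy as np

# --- Writer: create the file in SWMR mode and append data ---
with h5py.File("example_swmr.h5", "w", libver="latest") as f:
    dset = f.create_dataset("observations", shape=(0,), maxshape=(None,), dtype="f8")
    f.swmr_mode = True          # readers may now open the file while we append
    for i in range(5):
        dset.resize((i + 1,))
        dset[i] = float(i)
        dset.flush()            # make the new element visible to readers

# --- Reader (would normally run concurrently in another process) ---
with h5py.File("example_swmr.h5", "r", libver="latest", swmr=True) as f:
    dset = f["observations"]
    dset.refresh()              # pick up data flushed by the writer
    print(dset[...])
```

Virtual Datasets work in a similar spirit: several source files are mapped into one logical dataset, so readers see a single array without the data being copied.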
Participation Through Gaze Controlled Computer for Children with Severe Multiple Disabilities.
Holmqvist, Eva; Derbring, Sandra; Wallin, Sofia
2017-01-01
This paper presents work on developing methodology material for use of gaze controlled computers. The target group is families and professionals around children with severe multiple disabilities. The material includes software grids for children at various levels, aimed for communication, leisure and learning and will be available for download.
2005 5th Annual CMMI Technology Conference and User Group. Volume 1: Monday
2005-11-17
Slide excerpt ("By the numbers: the impact of requirements"): of total effort, 59% was work and 41% rework; figures attributed to Dion, McConnell, Davis, and Novorita (Rational Software Corporation, 1996); a requirements management bibliography is cited at http://mozart.uccs.edu/adavis/reqbib.html.
ERIC Educational Resources Information Center
Rostad, John
1997-01-01
Describes the production of news broadcasts on video by a high school class in Le Center, Minnesota. Topics include software for Apple computers, equipment used, student responsibilities, class curriculum, group work, communication among the production crew, administrative and staff support, and future improvements. (LRW)
The Quality of Talk in Children's Joint Activity at the Computer.
ERIC Educational Resources Information Center
Mercer, Neil
1994-01-01
Describes findings of the Spoken Language and New Technology (SLANT) research project which studied the talk of primary school children in the United Kingdom who were working in small groups at computers with various kinds of software. Improvements in the quality of talk and collaboration during computer-based activities are suggested. (Contains…
Agile Manifesto for Teaching and Learning
ERIC Educational Resources Information Center
Krehbiel, Timothy C.; Salzarulo, Peter A.; Cosmah, Michelle L.; Forren, John; Gannod, Gerald; Havelka, Douglas; Hulshult, Andrea R.; Merhout, Jeffrey
2017-01-01
A group of faculty members representing six colleges at a public university formed a learning community to study the Agile Way of Working--a method of workplace collaboration widely used in software development--and to determine whether the concepts, practices, and benefits of Agile are applicable to higher education settings. After more than two…
NASA Technical Reports Server (NTRS)
1997-01-01
Cogent Software, Inc. was formed in January 1995 by David Atkinson and Irene Woerner, both former employees of the Jet Propulsion Laboratory (JPL). Several other Cogent employees also worked at JPL. Atkinson headed JPL's Information Systems Technology section and Woerner led the Advanced User Interfaces Group. Cogent's mission is to help companies organize and manage their online content by developing advanced software for the next generation of online directories and information catalogs. The company offers a complete range of Internet solutions, including Internet access, Web site design, local and wide-area networks, and custom software for online commerce applications. Cogent also offers DesignSphere Online, an electronic community for the communications arts industry. Customers range from small offices to manufacturers with thousands of employees, including Chemi-Con, one of the largest manufacturers of capacitors in the world.
ConcreteWorks v3 training/user manual (P1) : ConcreteWorks software (P2).
DOT National Transportation Integrated Search
2017-04-01
ConcreteWorks is designed to be a user-friendly software package that can help concrete : professionals optimize concrete mixture proportioning, perform a concrete thermal analysis, and : increase the chloride diffusion service life. The software pac...
NASA Astrophysics Data System (ADS)
Kwon, So Young
Using a quasi-experimental design, the researcher investigated the comparative effects of individually-generated and collaboratively-generated computer-based concept mapping on middle school science concept learning. Qualitative data were analyzed to explain quantitative findings. One hundred sixty-one students (74 boys and 87 girls) in eight, seventh grade science classes at a middle school in Southeast Texas completed the entire study. Using prior science performance scores to assure equivalence of student achievement across groups, the researcher assigned the teacher's classes to one of the three experimental groups. The independent variable, group, consisted of three levels: 40 students in a control group, 59 students trained to individually generate concept maps on computers, and 62 students trained to collaboratively generate concept maps on computers. The dependent variables were science concept learning as demonstrated by comprehension test scores, and quality of concept maps created by students in experimental groups as demonstrated by rubric scores. Students in the experimental groups received concept mapping training and used their newly acquired concept mapping skills to individually or collaboratively construct computer-based concept maps during study time. The control group, the individually-generated concept mapping group, and the collaboratively-generated concept mapping group had equivalent learning experiences for 50 minutes during five days, excepting that students in a control group worked independently without concept mapping activities, students in the individual group worked individually to construct concept maps, and students in the collaborative group worked collaboratively to construct concept maps during their study time. Both collaboratively and individually generated computer-based concept mapping had a positive effect on seventh grade middle school science concept learning but neither strategy was more effective than the other. However, the students who collaboratively generated concept maps created significantly higher quality concept maps than those who individually generated concept maps. The researcher concluded that the concept mapping software, Inspiration(TM), fostered construction of students' concept maps individually or collaboratively for science learning and helped students capture their evolving creative ideas and organize them for meaningful learning. Students in both the individual and the collaborative concept mapping groups had positive attitudes toward concept mapping using Inspiration(TM) software.
NASA Astrophysics Data System (ADS)
Hwang, L.; Kellogg, L. H.
2017-12-01
Curation of software promotes discoverability and accessibility and works hand in hand with scholarly citation to ascribe value to, and provide recognition for software development. To meet this challenge, the Computational Infrastructure for Geodynamics (CIG) maintains a community repository built on custom and open tools to promote discovery, access, identification, credit, and provenance of research software for the geodynamics community. CIG (geodynamics.org) originated from recognition of the tremendous effort required to develop sound software and the need to reduce duplication of effort and to sustain community codes. CIG curates software across 6 domains and has developed and follows software best practices that include establishing test cases, documentation, and a citable publication for each software package. CIG software landing web pages provide access to current and past releases; many are also accessible through the CIG community repository on github. CIG has now developed abc - attribution builder for citation to enable software users to give credit to software developers. abc uses zenodo as an archive and as the mechanism to obtain a unique identifier (DOI) for scientific software. To assemble the metadata, we searched the software's documentation and research publications and then requested the primary developers to verify. In this process, we have learned that each development community approaches software attribution differently. The metadata gathered is based on guidelines established by groups such as FORCE11 and OntoSoft. The rollout of abc is gradual as developers are forward-looking, rarely willing to go back and archive prior releases in zenodo. Going forward all actively developed packages will utilize the zenodo and github integration to automate the archival process when a new release is issued. How to handle legacy software, multi-authored libraries, and assigning roles to software remain open issues.
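As a hedged illustration of how a DOI minted through the Zenodo/GitHub integration can later be located programmatically, the sketch below queries Zenodo's public records API with the requests library; the search term is hypothetical and the response fields reflect the public API as commonly documented, so they should be verified against Zenodo's current documentation before use.

```python
import requests

def find_zenodo_doi(software_name: str) -> list[tuple[str, str]]:
    """Search Zenodo's public records API and return (title, DOI) pairs."""
    resp = requests.get(
        "https://zenodo.org/api/records",
        params={"q": software_name, "size": 5},
        timeout=30,
    )
    resp.raise_for_status()
    hits = resp.json().get("hits", {}).get("hits", [])
    return [(h["metadata"]["title"], h.get("doi", "")) for h in hits]

# Hypothetical lookup for a geodynamics code archived on Zenodo.
for title, doi in find_zenodo_doi("ASPECT geodynamics"):
    print(doi, "-", title)
```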
NASA Astrophysics Data System (ADS)
Herbuś, K.; Ociepka, P.
2017-08-01
This work analyses a sequential control system of a machine for separating and grouping workpieces for processing. The problem considered concerns verification of the operation of the actuator system of an electro-pneumatic control system equipped with a PLC controller; specifically, the behaviour of the actuators is verified against the logic relationships assumed in the control system. The actuators of the considered control system are three linear-motion drives (pneumatic cylinders), and the logical structure of the control system's operation is based on a signal flow graph. The tested logical structure of the electro-pneumatic control system was implemented in the Automation Studio software from B&R, which is used to create programs for PLC controllers. Next, a model of the machine's actuator system was created in the FluidSIM software. To verify the PLC program by simulating the operation of the created model, the two programs were integrated using an OPC server as the data-exchange tool.
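The data exchange described above relies on an OPC server between Automation Studio and FluidSIM. The classic OPC DA interface typically used by such tools is Windows/COM-based; purely as a rough analogue, the sketch below uses the open-source python-opcua (OPC UA) client to write a start signal and read back a cylinder state. The endpoint URL and node identifiers are invented for illustration and do not correspond to the authors' setup.

```python
from opcua import Client  # python-opcua package (OPC UA, not classic OPC DA)

# Hypothetical endpoint and node ids; a real setup would browse the server's
# address space to find the tags exposed by the PLC program.
ENDPOINT = "opc.tcp://localhost:4840"

client = Client(ENDPOINT)
client.connect()
try:
    start_button = client.get_node("ns=2;s=Line.StartButton")
    cylinder_a = client.get_node("ns=2;s=Line.CylinderA.Extend")

    start_button.set_value(True)          # simulate pressing the start button
    print("Cylinder A extend signal:", cylinder_a.get_value())
finally:
    client.disconnect()
```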
Verification and Validation of Neural Networks for Aerospace Systems
NASA Technical Reports Server (NTRS)
Mackall, Dale; Nelson, Stacy; Schumann, Johann; Clancy, Daniel (Technical Monitor)
2002-01-01
The Dryden Flight Research Center V&V working group and NASA Ames Research Center Automated Software Engineering (ASE) group collaborated to prepare this report. The purpose is to describe V&V processes and methods for certification of neural networks for aerospace applications, particularly adaptive flight control systems like Intelligent Flight Control Systems (IFCS) that use neural networks. This report is divided into the following two sections: 1) Overview of Adaptive Systems; and 2) V&V Processes/Methods.
Sustaining Open Source Communities through Hackathons - An Example from the ASPECT Community
NASA Astrophysics Data System (ADS)
Heister, T.; Hwang, L.; Bangerth, W.; Kellogg, L. H.
2016-12-01
The ecosystem surrounding a successful scientific open source software package combines both social and technical aspects. Much thought has been given to the technology side of writing sustainable software for large infrastructure projects and software libraries, but less about building the human capacity to perpetuate scientific software used in computational modeling. One effective format for building capacity is regular multi-day hackathons. Scientific hackathons bring together a group of science domain users and scientific software contributors to make progress on a specific software package. Innovation comes through the chance to work with established and new collaborations. Especially in the domain sciences with small communities, hackathons give geographically distributed scientists an opportunity to connect face-to-face. They foster lively discussions amongst scientists with different expertise, promote new collaborations, and increase transparency in both the technical and scientific aspects of code development. ASPECT is an open source, parallel, extensible finite element code to simulate thermal convection, that began development in 2011 under the Computational Infrastructure for Geodynamics. ASPECT hackathons for the past 3 years have grown the number of authors to >50, training new code maintainers in the process. Hackathons begin with leaders establishing project-specific conventions for development, demonstrating the workflow for code contributions, and reviewing relevant technical skills. Each hackathon expands the developer community. Over 20 scientists add >6,000 lines of code during the >1 week event. Participants grow comfortable contributing to the repository and over half continue to contribute afterwards. A high return rate of participants ensures continuity and stability of the group as well as mentoring for novice members. We hope to build other software communities on this model, but anticipate each to bring their own unique challenges.
Scheduling System Assessment, and Development and Enhancement of Re-engineered Version of GPSS
NASA Technical Reports Server (NTRS)
Loganantharaj, Rasiah; Thomas, Bushrod; Passonno, Nicole
1996-01-01
The objective of this project is two-fold. First to provide an evaluation of a commercially developed version of the ground processing scheduling system (GPSS) for its applicability to the Kennedy Space Center (KSC) ground processing problem. Second, to work with the KSC GPSS development team and provide enhancement to the existing software. Systems reengineering is required to provide a sustainable system for the users and the software maintenance group. Using the LISP profile prototype code developed by the GPSS reverse reengineering groups as a building block, we have implemented the resource deconfliction portion of GPSS in common LISP using its object oriented features. The prototype corrects and extends some of the deficiencies of the current production version, plus it uses and builds on the classes from the development team's profile prototype.
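The prototype described above was written in Common Lisp using its object-oriented (CLOS) features; purely to illustrate the underlying idea of resource deconfliction, the sketch below shows a simple interval-overlap check in Python, with task and resource names that are hypothetical rather than taken from the GPSS data model.

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    resource: str   # resource required for the whole duration
    start: int      # start time (e.g., minutes from shift start)
    end: int        # end time

def conflicts(tasks: list[Task]) -> list[tuple[str, str]]:
    """Return pairs of tasks that request the same resource at overlapping times."""
    pairs = []
    for i, a in enumerate(tasks):
        for b in tasks[i + 1:]:
            if a.resource == b.resource and a.start < b.end and b.start < a.end:
                pairs.append((a.name, b.name))
    return pairs

schedule = [
    Task("leak-check", "crane-1", 0, 60),
    Task("panel-install", "crane-1", 45, 90),   # overlaps leak-check on crane-1
    Task("inspection", "bay-2", 0, 30),
]
print(conflicts(schedule))   # [('leak-check', 'panel-install')]
```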
Benchmarking the ATLAS software through the Kit Validation engine
NASA Astrophysics Data System (ADS)
De Salvo, Alessandro; Brasolin, Franco
2010-04-01
The measurement of the experiment software performance is a very important metric in order to choose the most effective resources to be used and to discover the bottlenecks of the code implementation. In this work we present the benchmark techniques used to measure the ATLAS software performance through the ATLAS offline testing engine Kit Validation and the online portal Global Kit Validation. The performance measurements, the data collection, the online analysis and display of the results will be presented. The results of the measurement on different platforms and architectures will be shown, giving a full report on the CPU power and memory consumption of the Monte Carlo generation, simulation, digitization and reconstruction of the most CPU-intensive channels. The impact of the multi-core computing on the ATLAS software performance will also be presented, comparing the behavior of different architectures when increasing the number of concurrent processes. The benchmark techniques described in this paper have been used in the HEPiX group since the beginning of 2008 to help defining the performance metrics for the High Energy Physics applications, based on the real experiment software.
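As a generic sketch of the kind of per-process measurement such benchmarking involves (not the Kit Validation tooling itself), the snippet below times a child process and samples its peak resident memory using the psutil package; the workload is a placeholder.

```python
import sys
import time
import psutil

def benchmark(cmd: list[str], poll_s: float = 0.1) -> tuple[float, int]:
    """Run cmd and return (wall-clock seconds, peak resident set size in bytes)."""
    start = time.monotonic()
    proc = psutil.Popen(cmd)
    peak_rss = 0
    while proc.poll() is None:          # still running
        try:
            peak_rss = max(peak_rss, proc.memory_info().rss)
        except psutil.NoSuchProcess:    # process exited between poll() and sampling
            break
        time.sleep(poll_s)
    return time.monotonic() - start, peak_rss

# Placeholder workload; a real benchmark would launch a simulation or reconstruction job.
elapsed, peak = benchmark([sys.executable, "-c", "x = [i * i for i in range(10**6)]"])
print(f"wall time: {elapsed:.2f} s, peak RSS: {peak / 1e6:.1f} MB")
```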
A Roadmap for Using Agile Development in a Traditional Environment
NASA Technical Reports Server (NTRS)
Streiffert, Barbara; Starbird, Thomas; Grenander, Sven
2006-01-01
One of the newer classes of software engineering techniques is called 'Agile Development'. In Agile Development software engineers take small implementation steps and, in some cases, they program in pairs. In addition, they develop automatic tests prior to implementing their small functional piece. Agile Development focuses on rapid turnaround, incremental planning, customer involvement and continuous integration. Agile Development is not the traditional waterfall method or even a rapid prototyping method (although this methodology is closer to Agile Development). At the Jet Propulsion Laboratory (JPL) a few groups have begun Agile Development software implementations. The difficulty with this approach becomes apparent when Agile Development is used in an organization that has specific criteria and requirements handed down for how software development is to be performed. The work at the JPL is performed for the National Aeronautics and Space Agency (NASA). Both organizations have specific requirements, rules and processes for developing software. This paper will discuss some of the initial uses of the Agile Development methodology, the spread of this method and the current status of the successful incorporation into the current JPL development policies and processes.
Selection criteria and facilitation training for the study of groupware
NASA Technical Reports Server (NTRS)
Robichaux, Barry P.
1993-01-01
Computer support for planning and decision-making groups is a growing trend in the 90s. Groupware is a name often applied to group software and has been defined as 'computer-based systems that support groups engaged in a common task (or goal) and that provide an interface to a shared environment'. Unlike most single-user software, groupware assists user groups in their collaboration, coordination, and communication efforts. This paper focuses on groupware to support the meeting process. These systems are often called group decision support systems (GDSS), electronic meeting systems (EMS), or group support systems (GSS). The term 'meeting support groupware' is used here to include any computer-based system to support meetings. In order to understand this technology, one must first understand groups, what they do and the problems they face, and groupware, a wide range of technology to support group work. Guidelines for selecting groups for study as part of an overall research plan are provided in this document. These were taken from the literature and from the persons at whom the information in this paper is targeted. Also, guidelines for facilitation training are discussed. Familiarity with known and accepted techniques is among the principal duties of the facilitator, and any form of training must include practice in using these techniques.
Medical device software: defining key terms.
Pashkov, Vitalii; Gutorova, Nataliya; Harkusha, Andrii
One of the areas of significant growth in medical devices has been the role of software, as an integral component of a medical device, as a standalone device, and more recently as applications on mobile devices. The risk related to a malfunction of standalone software used within healthcare is in itself not a criterion for its qualification, or not, as a medical device. It is therefore necessary to clarify the criteria for the qualification of standalone software as a medical device. Materials and methods: Ukrainian, European Union, and United States legislation; guidelines developed by the European Commission and the Food and Drug Administration; recommendations of an international voluntary group; and scientific works. This article is based on dialectical, comparative, analytic, synthetic and comprehensive research methods. Results: The legal regulation of software used for medical purposes in Ukraine is limited to a single definition. The European Union and the United States have developed and apply specific guidelines that help developers, manufacturers and end users distinguish software types based on medical-purpose criteria. Software is becoming more and more incorporated into medical devices, and developers and manufacturers may not have initially appreciated the potential risks to patients and users; such a situation could have dangerous consequences for patients or users. Conclusions: It is necessary to develop and adopt legislation that defines the criteria for the qualification of medical device software and the application of classification criteria to such software, and that provides illustrative examples and step-by-step recommendations for qualifying software as a medical device.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bridge, Pete, E-mail: pete.bridge@qut.edu.au; Gunn, Therese; Kastanis, Lazaros
A novel realistic 3D virtual reality (VR) application has been developed to allow medical imaging students at Queensland University of Technology to practice radiographic techniques independently outside the usual radiography laboratory. A flexible agile development methodology was used to create the software rapidly and effectively. A 3D gaming environment and realistic models were used to engender presence in the software while tutor-determined gold standards enabled students to compare their performance and learn in a problem-based learning pedagogy. Students reported high levels of satisfaction and perceived value and the software enabled up to 40 concurrent users to prepare for clinical practice. Student feedback also indicated that they found 3D to be of limited value in the desktop version compared to the usual 2D approach. A randomised comparison between groups receiving software-based and traditional practice measured performance in a formative role play with real equipment. The results of this work indicated superior performance with the equipment for the VR trained students (P = 0.0366) and confirmed the value of VR for enhancing 3D equipment-based problem-solving skills. Students practising projection techniques virtually performed better at role play assessments than students practising in a traditional radiography laboratory only. The application particularly helped with 3D equipment configuration, suggesting that teaching 3D problem solving is an ideal use of such medical equipment simulators. Ongoing development work aims to establish the role of VR software in preparing students for clinical practice with a range of medical imaging equipment.
Cölfen, Helmut; Laue, Thomas M; Wohlleben, Wendel; Schilling, Kristian; Karabudak, Engin; Langhorst, Bradley W; Brookes, Emre; Dubbs, Bruce; Zollars, Dan; Rocco, Mattia; Demeler, Borries
2010-02-01
Progress in analytical ultracentrifugation (AUC) has been hindered by obstructions to hardware innovation and by software incompatibility. In this paper, we announce and outline the Open AUC Project. The goals of the Open AUC Project are to stimulate AUC innovation by improving instrumentation, detectors, acquisition and analysis software, and collaborative tools. These improvements are needed for the next generation of AUC-based research. The Open AUC Project combines on-going work from several different groups. A new base instrument is described, one that is designed from the ground up to be an analytical ultracentrifuge. This machine offers an open architecture, hardware standards, and application programming interfaces for detector developers. All software will use the GNU Public License to assure that intellectual property is available in open source format. The Open AUC strategy facilitates collaborations, encourages sharing, and eliminates the chronic impediments that have plagued AUC innovation for the last 20 years. This ultracentrifuge will be equipped with multiple and interchangeable optical tracks so that state-of-the-art electronics and improved detectors will be available for a variety of optical systems. The instrument will be complemented by a new rotor, enhanced data acquisition and analysis software, as well as collaboration software. Described here are the instrument, the modular software components, and a standardized database that will encourage and ease integration of data analysis and interpretation software.
Specialty Engineering Supplement to IEEE-15288.1
2015-05-15
receiver required to work in a dense EMI environment. (15) Any RF receiver with a burnout level of less than 30 dBm (1 mW). b. A summary of all...Context 2.1 ISO-IEC-IEEE-15288: 2015, Systems and Software Engineering — System life cycle processes ISO-IEC-IEEE 15288 is the DOD-adopted standard for...to ISO-15288 for application of systems engineering on defense programs that was developed by a joint services working group under the auspices of the
1983-11-01
INSTRUMENTATION ;(U) FORKLIFT VEHICLES ;(U) EXPERIMENTAL DATA IDENTIFIERS: OBJECTIVE: (U) SUPPORT INHOUSE RESEARCH FOR- ACQUISITION AND ANALYSIS OF...ROBOTIC RECONNAISSANCE VEHICLE DEMONSTRATOR WITH TERRAIN ANALYSIS . THIS WORK WILL SPECIFY THE BASE LINE HARDWARE, SOFTWARE, DATA BASE, AND SYSTEM...THE DATA ANALYSIS . THIS IS ALSO TRUE OF INFLIGMT DATA THAT THE PILOT IS REQUIRED TO ANALYZE. THIS RESEARCH IS CONCERNED WITH THE REPORT NO. CX7419
Spaceport Command and Control System Automation Testing
NASA Technical Reports Server (NTRS)
Plano, Tom
2017-01-01
The goal of automated testing is to create and maintain a cohesive infrastructure of robust tests that could be run independently on a software package in its entirety. To that end, the Spaceport Command and Control System (SCCS) project at the National Aeronautics and Space Administration's (NASA) Kennedy Space Center (KSC) has brought in a large group of interns to work side-by-side with full time employees to do just this work. Thus, our job is to implement the tests that will put SCCS through its paces.
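A minimal illustration of an independent, automated test of the kind described above is given below as a generic pytest sketch; the module and function names are hypothetical and are not taken from SCCS.

```python
# test_telemetry.py -- run with `pytest` from the project root.
import pytest

def scale_sensor_reading(raw: int, gain: float = 0.5) -> float:
    """Hypothetical unit under test: convert a raw count to engineering units."""
    if raw < 0:
        raise ValueError("raw count must be non-negative")
    return raw * gain

def test_nominal_scaling():
    assert scale_sensor_reading(100) == 50.0

def test_negative_count_rejected():
    with pytest.raises(ValueError):
        scale_sensor_reading(-1)
```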
1997-06-01
situations. The USAMH population includes 32,793 family members, many of whom are young spouses with small children. A recent study of USAMH emergency room...software is based on the work of Sheila Q. Wheeler, considered a pioneer in triage nursing and author of Telephone Triage: Theory, Practice and Protocol...personnel at USAMH established personnel levels at the Table of Distribution and Allowances (TDA) level with no overhires authorized. The working group
A Cochlear Implant Signal Processing Lab: Exploration of a Problem-Based Learning Exercise
ERIC Educational Resources Information Center
Bhatti, P. T.; McClellan, J. H.
2011-01-01
This paper presents an introductory signal processing laboratory and examines this laboratory exercise in the context of problem-based learning (PBL). Centered in a real-world application, a cochlear implant, the exercise challenged students to demonstrate a working software-based signal processor. Partnering in groups of two or three, second-year…
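For concreteness, a heavily simplified sketch of the core of such a software-based processor, a bandpass filter bank followed by envelope extraction, is shown below; the sampling rate, band edges, and channel count are illustrative choices rather than the lab's specification.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

FS = 16_000                             # sampling rate (Hz), illustrative
EDGES = [300, 700, 1500, 3000, 6000]    # 4 analysis bands, illustrative

def channel_envelopes(x: np.ndarray) -> np.ndarray:
    """Return per-channel envelopes (n_channels, n_samples) of signal x."""
    envs = []
    for lo, hi in zip(EDGES[:-1], EDGES[1:]):
        b, a = butter(4, [lo / (FS / 2), hi / (FS / 2)], btype="band")
        band = filtfilt(b, a, x)                 # band-limit the input
        envs.append(np.abs(hilbert(band)))       # envelope via the analytic signal
    return np.array(envs)

# 50 ms synthetic test tone at 1 kHz.
t = np.arange(int(0.05 * FS)) / FS
tone = np.sin(2 * np.pi * 1000 * t)
print(channel_envelopes(tone).shape)   # (4, 800)
```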
RICIS Symposium 1992: Mission and Safety Critical Systems Research and Applications
NASA Technical Reports Server (NTRS)
1992-01-01
This conference deals with computer systems that control systems whose failure to operate correctly could result in the loss of life and/or property, that is, mission- and safety-critical systems. Topics covered are: the work of standards groups, computer systems design and architecture, software reliability, process control systems, knowledge-based expert systems, and computer and telecommunication protocols.
The Use of Technology in a Model of Formative Assessment
ERIC Educational Resources Information Center
García López, Alfonsa; García Mazarío, Francisco
2016-01-01
This work describes a formative assessment model for a Mathematical Analysis course taken by engineering students. It includes online quizzes with feedback, a portfolio with weekly assignments, exams involving the use of mathematical software and a project to be completed in small groups of two or three students. The model has been perfected since…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1994-06-01
This progress report summarizes our research activities under our consensus grant. In year five, we devoted much of our activities to completing fundamental research projects delayed because of the considerably stepped-up effort in consensus processes during development of DOE's Five Year Waste Plan (FYWP). Following our work on various procedures for bringing together groups such as the State and Tribal Government Working Group and the Stakeholders' Forum (both of which provide input to the Five Year Waste Plan), we compiled a literature overview of small-group consensus gaining and a handbook for consensus decision making. We also tested the effectiveness of group decision support software, and designed a structured observation process and its related hard- and software. We completed studies on experts and the role of personality characteristics in consensus group influence. Results of these studies are included in this final report. In consensus processes research, we were unable to continue studying consensus groups in action. However, we did study ways to improve DOE's technological information exchange effectiveness. We also studied how a new administration identifies what its strategic mission is and how it gets support from existing EM managers. We identified selection criteria for locating the EM exhibit, and tested our audience selection model. We also further calibrated our consensus measure. Additional conference papers and papers for journal submission were completed during year five.
Habibi, Ehsanollah; Soury, Shiva
2015-01-01
Background: The prevalence of work-related musculoskeletal disorders (WMSDs) is high among computer users. This study investigates the effect of three ergonomic interventions (training, exercise, and installation of software) on the incidence of musculoskeletal disorders among the staff of Isfahan Province Gas Company. Materials and Methods: The study was performed in the summer of 2013 on 75 (52 men, 23 women) Isfahan Province Gas Company employees in three phases (phase 1: evaluation of the present situation, phase 2: performing interventions, and phase 3: re-evaluation). Participants were divided into three groups (training, exercise, and software). The Nordic Musculoskeletal Questionnaire (NMQ) and rapid upper limb assessment (RULA) were used. Collected data were analyzed using SPSS software with the McNemar test, t-test, and Chi-square test. Results: Based on the evaluations, there was a decrease in musculoskeletal symptoms among the trained group participants after they received the training. The McNemar test showed that the lower rate of pain in the low back, neck, knee, and wrist was significant (P < 0.05). The results obtained from the RULA method for evaluation of posture showed an average 25-point decrease on the right side of the body and a 20-point decrease on the left side of the body in the group subjected to training. Based on the t-test, the decrease was significant. Conclusion: The study demonstrated that the majority of the participants accepted the interventions, which indicates that most of the people were unsatisfied with the work settings and sought improvement in the workplace. Overall, the findings show that training, chair adjustment, and workplace arrangement could decrease musculoskeletal disorders. PMID:26430692
Detailed Design Documentation, without the Pain
NASA Astrophysics Data System (ADS)
Ramsay, C. D.; Parkes, S.
2004-06-01
Producing detailed forms of design documentation, such as pseudocode and structured flowcharts, to describe the procedures of a software system: (1) allows software developers to model and discuss their understanding of a problem and the design of a solution free from the syntax of a programming language, (2) facilitates deeper involvement of non-technical stakeholders, such as the customer or project managers, whose influence ensures the quality, correctness and timeliness of the resulting system, and (3) forms comprehensive documentation of the system for its future maintenance, reuse and/or redeployment. However, such forms of documentation require effort to create and maintain. This paper describes a software tool currently being developed within the Space Systems Research Group at the University of Dundee which aims to improve the utility of, and the incentive for, creating detailed design documentation for the procedures of a software system. The rationale for creating such a tool is briefly discussed, followed by a description of the tool itself, a summary of its perceived benefits, and plans for future work.
Ye, Xin
2018-01-01
The awareness of others’ activities has been widely recognized as essential in facilitating coordination in a team among Computer-Supported Cooperative Work communities. Several field studies of software developers in large software companies such as Microsoft have shown that coworker and artifact awareness are the most common information needs for software developers; however, they are also two of the seven most frequently unsatisfied information needs. To address this problem, we built a workspace awareness tool named TeamWATCH to visualize developer activities using a 3-D city metaphor. In this paper, we discuss the importance of awareness in software development, review existing workspace awareness tools, present the design and implementation of TeamWATCH, and evaluate how it could help detect and resolve conflicts earlier and better maintain group awareness via a controlled experiment. The experimental results showed that the subjects using TeamWATCH performed significantly better with respect to early conflict detection and resolution. PMID:29558519
Implementation of Task-Tracking Software for Clinical IT Management.
Purohit, Anne-Maria; Brutscheck, Clemens; Prokosch, Hans-Ulrich; Ganslandt, Thomas; Schneider, Martin
2017-01-01
Often in clinical IT departments, many different methods and IT systems are used for task-tracking and project organization. Based on managers' personal preferences and knowledge about project management methods, tools differ from team to team and even from employee to employee. This causes communication problems, especially when tasks need to be done in cooperation with different teams. Monitoring tasks and resources becomes impossible: there are no defined deliverables, which prevents reliable deadlines. Because of these problems, we implemented task-tracking software which is now in use across all seven teams at the University Hospital Erlangen. Over a period of seven months, a working group defined types of tasks (project, routine task, etc.), workflows, and views to monitor the tasks of the 7 divisions, 20 teams and 340 different IT services. The software has been in use since December 2016.
Ribu, Kirsten; Patel, Tulpesh
2016-01-01
People with development disorders, for instance autism, need structured plans to help create predictability in their daily lives. Digital plans can facilitate enhanced independency, learning, and quality of life, but existing apps are largely general purpose and lack the flexibility required by this specific but heterogeneous user group. Universal design is both a goal and a process and should be based on a holistic approach and user-centered design, interacting with the users in all stages of the development process. At Oslo and Akershus University College (HiOA) we conducted a research-based teaching project in co-operation with the Department of Neuro-habilitation at Oslo University Hospital (OUS) with two employees acting as project managers and students as developers. Three groups of Computer Science bachelor students developed digital prototypes for a planning tool for young adults with pervasive development disorders, who live either with their families or in supervised residences, and do not receive extensive public services. The students conducted the initial planning phase of the software development process, focusing on prototyping the system requirements, whilst a professional software company programmed the end solution. The goal of the project was to develop flexible and adaptive user-oriented and user-specific app solutions for tablets that can aid this diverse user group in structuring daily life, whereby, for example, photos of objects and places known to the individual user replace general pictures or drawings, and checklists can be elaborate or sparse as necessary. The three student groups worked independently of each other and created interactive working prototypes based on tests, observations and short interviews with end users (both administrators and residents) and regular user feedback from the project managers. Three very different solutions were developed that were of high enough quality that an external software company were able to continue the work and create a beta version of the app. The first phase in software development process is always challenging and time consuming. Using a research-based teaching approach allowed us to not only save time and expense in the development phase, but, importantly, allowed us to thoroughly investigate a variety of aspects of the problem to create an accessible solution, whilst leveraging our students' knowledge, competencies and creativity. The next stage will be to evaluate the beta version of the app and study its impact on the user's quality of life. Although the end solution is designed for a specific user group, the built-in flexibility of its structure and function means there is the inherent potential to open it up to all users. The universal benefit lies in the flexibility of the solution.
Ada Software Design Methods Formulation.
1982-10-01
cycle organization is also appropriate for another reason. The source material for the case studies is the work of the two contractors who participated in... working version of the system exist. The integration phase takes the pieces developed and combines them into a single working system. Interfaces...hardware, developed separately from the software, is united with the software, and further testing is performed until the system is a working whole
Code of Federal Regulations, 2012 CFR
2012-01-01
... technology and software to destinations in Country Group D:1. 770.3 Section 770.3 Commerce and Foreign Trade... technology and software to destinations in Country Group D:1. (a) Introduction. This section is intended to provide you additional guidance on how to determine whether your technology or software would be eligible...
Code of Federal Regulations, 2011 CFR
2011-01-01
... technology and software to destinations in Country Group D:1. 770.3 Section 770.3 Commerce and Foreign Trade... technology and software to destinations in Country Group D:1. (a) Introduction. This section is intended to provide you additional guidance on how to determine whether your technology or software would be eligible...
Code of Federal Regulations, 2010 CFR
2010-01-01
... technology and software to destinations in Country Group D:1. 770.3 Section 770.3 Commerce and Foreign Trade... technology and software to destinations in Country Group D:1. (a) Introduction. This section is intended to provide you additional guidance on how to determine whether your technology or software would be eligible...
Code of Federal Regulations, 2014 CFR
2014-01-01
... technology and software to destinations in Country Group D:1. 770.3 Section 770.3 Commerce and Foreign Trade... technology and software to destinations in Country Group D:1. (a) Introduction. This section is intended to provide you additional guidance on how to determine whether your technology or software would be eligible...
Code of Federal Regulations, 2013 CFR
2013-01-01
... technology and software to destinations in Country Group D:1. 770.3 Section 770.3 Commerce and Foreign Trade... technology and software to destinations in Country Group D:1. (a) Introduction. This section is intended to provide you additional guidance on how to determine whether your technology or software would be eligible...
Behind Linus's Law: Investigating Peer Review Processes in Open Source
ERIC Educational Resources Information Center
Wang, Jing
2013-01-01
Open source software has revolutionized the way people develop software, organize collaborative work, and innovate. The numerous open source software systems that have been created and adopted over the past decade are influential and vital in all aspects of work and daily life. The understanding of open source software development can enhance its…
Free Software and Free Textbooks
ERIC Educational Resources Information Center
Takhteyev, Yuri
2012-01-01
Some of the world's best and most sophisticated software is distributed today under "free" or "open source" licenses, which allow the recipients of such software to use, modify, and share it without paying royalties or asking for permissions. If this works for software, could it also work for educational resources, such as books? The economics of…
Shah, Hemant; Allard, Raymond D; Enberg, Robert; Krishnan, Ganesh; Williams, Patricia; Nadkarni, Prakash M
2012-03-09
A large body of work in the clinical guidelines field has identified requirements for guideline systems, but there are formidable challenges in translating such requirements into production-quality systems that can be used in routine patient care. Detailed analysis of requirements from an implementation perspective can be useful in helping define sub-requirements to the point where they are implementable. Further, additional requirements emerge as a result of such analysis. During such an analysis, study of examples of existing, software-engineering efforts in non-biomedical fields can provide useful signposts to the implementer of a clinical guideline system. In addition to requirements described by guideline-system authors, comparative reviews of such systems, and publications discussing information needs for guideline systems and clinical decision support systems in general, we have incorporated additional requirements related to production-system robustness and functionality from publications in the business workflow domain, in addition to drawing on our own experience in the development of the Proteus guideline system (http://proteme.org). The sub-requirements are discussed by conveniently grouping them into the categories used by the review of Isern and Moreno 2008. We cite previous work under each category and then provide sub-requirements under each category, and provide example of similar work in software-engineering efforts that have addressed a similar problem in a non-biomedical context. When analyzing requirements from the implementation viewpoint, knowledge of successes and failures in related software-engineering efforts can guide implementers in the choice of effective design and development strategies.
The development of a program analysis environment for Ada
NASA Technical Reports Server (NTRS)
Brown, David B.; Carlisle, Homer W.; Chang, Kai-Hsiung; Cross, James H.; Deason, William H.; Haga, Kevin D.; Huggins, John R.; Keleher, William R. A.; Starke, Benjamin B.; Weyrich, Orville R.
1989-01-01
A unit level, Ada software module testing system, called Query Utility Environment for Software Testing of Ada (QUEST/Ada), is described. The project calls for the design and development of a prototype system. QUEST/Ada design began with a definition of the overall system structure and a description of component dependencies. The project team was divided into three groups to resolve the preliminary designs of the parser/scanner: the test data generator, and the test coverage analyzer. The Phase 1 report is a working document from which the system documentation will evolve. It provides history, a guide to report sections, a literature review, the definition of the system structure and high level interfaces, descriptions of the prototype scope, the three major components, and the plan for the remainder of the project. The appendices include specifications, statistics, two papers derived from the current research, a preliminary users' manual, and the proposal and work plan for Phase 2.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bevins, N; Vanderhoek, M; Lang, S
2014-06-15
Purpose: Medical display monitor calibration and quality control present challenges to medical physicists. The purpose of this work is to demonstrate and share experiences with an open source package that allows for both initial monitor setup and routine performance evaluation. Methods: A software package, pacsDisplay, has been developed over the last decade to aid in the calibration of all monitors within the radiology group in our health system. The software is used to calibrate monitors to follow the DICOM Grayscale Standard Display Function (GSDF) via lookup tables installed on the workstation. Additional functionality facilitates periodic evaluations of both primary and secondary medical monitors to ensure satisfactory performance. This software is installed on all radiology workstations, and can also be run as a stand-alone tool from a USB disk. Recently, a database has been developed to store and centralize the monitor performance data and to provide long-term trends for compliance with internal standards and various accrediting organizations. Results: Implementation and utilization of pacsDisplay has resulted in improved monitor performance across the health system. Monitor testing is now performed at regular intervals and the software is being used across multiple imaging modalities. Monitor performance characteristics such as maximum and minimum luminance, ambient luminance and illuminance, color tracking, and GSDF conformity are loaded into a centralized database for system performance comparisons. Compliance reports for organizations such as MQSA, ACR, and TJC are generated automatically and stored in the same database. Conclusion: An open source software solution has simplified and improved the standardization of displays within our health system. This work serves as an example method for calibrating and testing monitors within an enterprise health system.
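As a rough sketch of the calibration step mentioned above, building a lookup table so that a display's measured response follows the target GSDF curve, consider the following; the measured luminance values are placeholders, and the target curve used here is a stand-in that a real tool would compute from the DICOM PS3.14 Grayscale Standard Display Function between the display's measured minimum and maximum luminance.

```python
import numpy as np

# Placeholder measured luminance (cd/m^2) at a handful of digital driving levels;
# a real calibration uses a photometer and many more measurement points.
measured_ddl = np.array([0, 64, 128, 192, 255])
measured_lum = np.array([0.5, 12.0, 60.0, 180.0, 400.0])

def build_gsdf_lut(gsdf_target: np.ndarray) -> np.ndarray:
    """Map each of 256 target luminance values to the DDL that best produces it.

    gsdf_target: desired luminance for presentation values 0..255, computed from
    the DICOM GSDF (PS3.14) between the display's measured Lmin and Lmax;
    assumed here to be supplied by the caller.
    """
    # Dense model of the display response via interpolation of the measurements.
    ddl_dense = np.arange(256)
    lum_dense = np.interp(ddl_dense, measured_ddl, measured_lum)
    # For each target luminance, pick the DDL whose measured output is closest.
    return np.array([int(np.abs(lum_dense - L).argmin()) for L in gsdf_target])

# Placeholder target curve (NOT the real GSDF): log-spaced between Lmin and Lmax.
fake_target = np.logspace(np.log10(0.5), np.log10(400.0), 256)
lut = build_gsdf_lut(fake_target)
print(lut[:8], lut[-8:])
```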
Workflow-Based Software Development Environment
NASA Technical Reports Server (NTRS)
Izygon, Michel E.
2013-01-01
The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process used to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment).
Bent, F; Ahlbrandt, J; Wagner, A; Weigand, M A; Hofer, S; Lichtenstern, C
2016-05-01
In the hospital, human resource planning has to consider the needs and preferences of personnel and planners as well as the financial interest of the hospital. Additionally, staff planning has become more complex due to a growing number of part-time doctors as well as a variety of working shifts. The aim of the study was to describe existing human resource planning in German anesthesiology departments. Furthermore, we evaluated existing software solutions supporting human resource planning. Anesthesiology departments of German university hospitals were enrolled in the study. The aspects covered were tools and time needed for planning, amount of conflicts while planning, components of the software solutions and the efficiency and satisfaction according to the users. This was evaluated for short-, intermediate- and long-term planning. Two groups were compared: departments with and without software exchanging information among the three planning periods. Out of 35 university anesthesiology departments, 23 took part in the survey. On average they employed 105.8 ± 27.8 doctors who had to cover 13.5 ± 6.3 different shifts during a weekday. Personnel planning is mostly done by senior physicians. In some departments, other staff, such as residents and junior doctors, were involved as well. Software that exchanged information between short-, intermediate- and long-term planning was used in 53 % of the departments (12 out of 23). Five departments used commercially available planning software: Polypoint Deutschland (PolypointDeutschland), Atoss (Atoss AG) and SP Expert (Interflex Datensysteme). The time needed for short-term planning was slightly reduced in the exchanging software group. No difference was shown for the intermediate planning period. The use of this software led to a slight reduction in planning conflicts and increased the self-estimated efficiency of the users (p = 0.02). Throughout all groups, the major complaint was missing interfaces, for example between the software and human resources department. The ideal planning software should reduce time needed for planning and prevent planning conflicts according to the interviewed physicians. Furthermore it should be flexible and transparent for all involved staff. This study analyzed structures established in human resource planning in the anesthesiology departments for the first time. Time for planning varies significantly in comparable departments indicating suboptimal processes. Throughout Germany, the requirements for human resources planning are similar; for example, the software should integrate all aspects of HR planning. Different approaches are under evaluation but so far no software solution has prevailed. The used solutions vary substantially and therefore a comparison is difficult. There is no software solution with wide adoption.
Repository-Based Software Engineering Program: Working Program Management Plan
NASA Technical Reports Server (NTRS)
1993-01-01
Repository-Based Software Engineering Program (RBSE) is a National Aeronautics and Space Administration (NASA) sponsored program dedicated to introducing and supporting common, effective approaches to software engineering practices. The process of conceiving, designing, building, and maintaining software systems by using existing software assets that are stored in a specialized operational reuse library or repository, accessible to system designers, is the foundation of the program. In addition to operating a software repository, RBSE promotes (1) software engineering technology transfer, (2) academic and instructional support of reuse programs, (3) the use of common software engineering standards and practices, (4) software reuse technology research, and (5) interoperability between reuse libraries. This Program Management Plan (PMP) is intended to communicate program goals and objectives, describe major work areas, and define a management report and control process. This process will assist the Program Manager, University of Houston at Clear Lake (UHCL) in tracking work progress and describing major program activities to NASA management. The goal of this PMP is to make managing the RBSE program a relatively easy process that improves the work of all team members. The PMP describes work areas addressed and work efforts being accomplished by the program; however, it is not intended as a complete description of the program. Its focus is on providing management tools and management processes for monitoring, evaluating, and administering the program; and it includes schedules for charting milestones and deliveries of program products. The PMP was developed by soliciting and obtaining guidance from appropriate program participants, analyzing program management guidance, and reviewing related program management documents.
Assessment Environment for Complex Systems Software Guide
NASA Technical Reports Server (NTRS)
2013-01-01
This Software Guide (SG) describes the software developed to test the Assessment Environment for Complex Systems (AECS) by the West Virginia High Technology Consortium (WVHTC) Foundation's Mission Systems Group (MSG) for the National Aeronautics and Space Administration (NASA) Aeronautics Research Mission Directorate (ARMD). This software is referred to as the AECS Test Project throughout the remainder of this document. AECS provides a framework for developing, simulating, testing, and analyzing modern avionics systems within an Integrated Modular Avionics (IMA) architecture. The purpose of the AECS Test Project is twofold. First, it provides a means to test the AECS hardware and system developed by MSG. Second, it provides an example project upon which future AECS research may be based. This Software Guide fully describes building, installing, and executing the AECS Test Project as well as its architecture and design. The design of the AECS hardware is described in the AECS Hardware Guide. Instructions on how to configure, build and use the AECS are described in the User's Guide. Sample AECS software, developed by the WVHTC Foundation, is presented in the AECS Software Guide. The AECS Hardware Guide, AECS User's Guide, and AECS Software Guide are authored by MSG. The requirements set forth for AECS are presented in the Statement of Work for the Assessment Environment for Complex Systems authored by NASA Dryden Flight Research Center (DFRC). The intended audience for this document includes software engineers, hardware engineers, project managers, and quality assurance personnel from WVHTC Foundation (the suppliers of the software), NASA (the customer), and future researchers (users of the software). Readers are assumed to have general knowledge in the field of real-time, embedded computer software development.
Voge, Catherine; Hirvela, Kari; Jarzemsky, Paula
2012-01-01
To create an opportunity for students to connect with the Quality and Safety Education for Nurses competencies and demonstrate learning via knowledge transference, the authors piloted a digital media assignment. Students worked in small groups to create an unfolding patient care scenario with embedded decision points, using presentation software. The authors discuss the assignment and its outcomes.
Assurance Policy Evaluation - Spacecraft and Strategic Systems
2014-09-17
electromechanical (EEE) parts, software, design and workmanship, work instructions, manufacturing and tooling, cleanrooms, electrostatic discharge ...T9001B. An external group, called the Evaluation and Assessment Team, made up of product assurance subject matter experts from NSWC Corona performs...NSWC, Corona and SSP Technical Branch(es). The FTPE, performed every 3 years, is an objective evaluation of facility performance to assure proper
Complexity and Chaos - State-of-the-Art; Formulations and Measures of Complexity
2007-09-01
Systems (SoS) Section. His research interests are oriented toward the study, design and engineering of military complex systems through the lens of the... Approved for release by... This work is part of project 15bp01 – Defensive Software Design. © Her Majesty the Queen in Right of Canada...
2015-04-16
Summary of Questions Specific to Work and Organization... Limitations include assumptions that the work identified in the software center's mission and functions manual (10-1; CECOM, 2011) as well as in public...that produced RDECOM. The focus was on the movement of positions based on the position job series, not on the work that was actually being performed
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-01
... Packard Company Business Critical Systems, Mission Critical Business Software Division, OpenVMS Operating System Development Group, Including an Employee Operating Out of the...
New CFD tools to evaluate nasal airflow.
Burgos, M A; Sanmiguel-Rojas, E; Del Pino, C; Sevilla-García, M A; Esteban-Ortega, F
2017-08-01
Computational fluid dynamics (CFD) is a mathematical tool to analyse airflow. As CFD is not currently a usual tool for rhinologists, a group of engineers in collaboration with experts in Rhinology has developed very intuitive CFD software. The program MECOMLAND ® requires only the patient's cross-sectional (tomographic) images as input; its output comprises the results produced by CFD, such as airflow distributions, velocity profiles, pressure, temperature, and wall shear stress. This is useful complementary information to support diagnosis, prognosis, and follow-up of nasal pathologies based on quantitative magnitudes linked to airflow. In addition, the user-friendly environment NOSELAND ® significantly helps the medical assessment in the post-processing phase with dynamic reports using a 3D endoscopic view. Specialists in Rhinology have asked for more intuitive, simple, and powerful CFD software to bring more quality and precision to their evaluation of nasal airflow. We present MECOMLAND ® and NOSELAND ®, which have all the expected characteristics to fulfil this demand and offer a proper assessment with maximum quality and safety for the patient. These programs represent a non-invasive, low-cost (as the CT scan is already performed in every patient) alternative for the functional study of the difficult rhinologic case. To validate the software, we studied two groups of patients from the Ear Nose Throat clinic, a first group with normal noses and a second group presenting septal deviations. Wall shear stresses are lower in the cases of normal noses than in those with septal deviation. In addition, velocity field distributions, the pressure drop between the nasopharynx and the ambient air, and the flow rates in each nostril differed among the nasal cavities in the two groups. These software modules open up a promising future for simulating nasal airflow behaviour in virtual surgery intervention scenarios under different pressure or temperature conditions, to understand the effects on nasal airflow.
Towards a Better Understanding of CMMI and Agile Integration - Multiple Case Study of Four Companies
NASA Astrophysics Data System (ADS)
Pikkarainen, Minna
The amount of software is increasing in different domains in Europe. This provides the industries in smaller countries good opportunities to work in the international markets. Success in the global markets, however, demands the rapid production of high-quality, error-free software. Both CMMI and agile methods seem to provide a ready solution for quality and lead-time improvements. There is not, however, much empirical evidence available either about 1) how the integration of these two aspects can be done in practice or 2) what it actually demands from assessors and software process improvement groups. The goal of this paper is to increase the understanding of CMMI and agile integration, in particular focusing on the research question: how to use a ‘lightweight’ style of CMMI assessment in agile contexts. This is done via four case studies in which assessments were conducted using the goals of the CMMI integrated project management and collaboration and coordination with relevant stakeholder process areas, together with practices from XP and Scrum. The study shows that the use of agile practices may support the fulfilment of the goals of CMMI process areas, but there are still many challenges for the agile teams to be solved within the continuous improvement programs. It also identifies practical advice for assessors and improvement groups to take into consideration when conducting assessments in the context of agile software development.
The software development process at the Chandra X-ray Center
NASA Astrophysics Data System (ADS)
Evans, Janet D.; Evans, Ian N.; Fabbiano, Giuseppina
2008-08-01
Software development for the Chandra X-ray Center Data System began in the mid 1990's, and the waterfall model of development was mandated by our documents. Although we initially tried this approach, we found that a process with elements of the spiral model worked better in our science-based environment. High-level science requirements are usually established by scientists, and provided to the software development group. We follow with review and refinement of those requirements prior to the design phase. Design reviews are conducted for substantial projects within the development team, and include scientists whenever appropriate. Development follows agreed upon schedules that include several internal releases of the task before completion. Feedback from science testing early in the process helps to identify and resolve misunderstandings present in the detailed requirements, and allows review of intangible requirements. The development process includes specific testing of requirements, developer and user documentation, and support after deployment to operations or to users. We discuss the process we follow at the Chandra X-ray Center (CXC) to develop software and support operations. We review the role of the science and development staff from conception to release of software, and some lessons learned from managing CXC software development for over a decade.
NASA Astrophysics Data System (ADS)
Belloni, V.; Ravanelli, R.; Nascetti, A.; Di Rita, M.; Mattei, D.; Crespi, M.
2018-05-01
In the last few decades, there has been a growing interest in studying non-contact methods for full-field displacement and strain measurement. Among such techniques, Digital Image Correlation (DIC) has received particular attention, thanks to its ability to provide this information by comparing digital images of a sample surface before and after deformation. The method is now commonly adopted in the fields of civil, mechanical and aerospace engineering, and various companies and research groups have implemented 2D and 3D DIC software. In this work, a review of the status of DIC software is given first. Moreover, a free and open-source 2D DIC software package is presented, named py2DIC and developed in Python at the Geodesy and Geomatics Division of DICEA of the University of Rome "La Sapienza"; its capabilities were evaluated by processing the images captured during tensile tests performed in the Structural Engineering Lab of the University of Rome "La Sapienza" and comparing the results with those obtained using the commercial software Vic-2D, developed by Correlated Solutions Inc., USA. The agreement of these results at the one-hundredth-of-a-millimetre level demonstrates that this open-source software can be used as a valuable 2D DIC tool to measure full-field displacements on the investigated sample surface.
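The core matching step that DIC builds on can be sketched briefly: a small subset of the reference image is located in the deformed image by maximizing a zero-normalized cross-correlation score. The sketch below is a deliberately simplified, integer-pixel illustration of that idea; it is not the algorithm implemented in py2DIC or Vic-2D, which add sub-pixel interpolation and subset shape functions.

```python
# Minimal sketch of the core DIC idea: locate a reference subset in the
# deformed image by maximizing zero-normalized cross-correlation (ZNCC).
# Integer-pixel displacements only.

import numpy as np

def zncc(a, b):
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def track_subset(ref_img, def_img, top, left, size=21, search=10):
    """Return the integer (dy, dx) displacement of one square subset."""
    subset = ref_img[top:top + size, left:left + size]
    best, best_dy, best_dx = -np.inf, 0, 0
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            cand = def_img[y:y + size, x:x + size]
            if cand.shape != subset.shape:
                continue            # candidate window fell off the image edge
            score = zncc(subset, cand)
            if score > best:
                best, best_dy, best_dx = score, dy, dx
    return best_dy, best_dx

# Synthetic test: shift an image by (3, 5) pixels and recover the displacement.
rng = np.random.default_rng(0)
ref = rng.random((200, 200))
deformed = np.roll(ref, shift=(3, 5), axis=(0, 1))
print(track_subset(ref, deformed, top=80, left=80))   # expected: (3, 5)
```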
The development and evaluation of a medical imaging training immersive environment
Bridge, Pete; Gunn, Therese; Kastanis, Lazaros; Pack, Darren; Rowntree, Pamela; Starkey, Debbie; Mahoney, Gaynor; Berry, Clare; Braithwaite, Vicki; Wilson-Stewart, Kelly
2014-01-01
Introduction A novel realistic 3D virtual reality (VR) application has been developed to allow medical imaging students at Queensland University of Technology to practice radiographic techniques independently outside the usual radiography laboratory. Methods A flexible agile development methodology was used to create the software rapidly and effectively. A 3D gaming environment and realistic models were used to engender presence in the software while tutor-determined gold standards enabled students to compare their performance and learn in a problem-based learning pedagogy. Results Students reported high levels of satisfaction and perceived value and the software enabled up to 40 concurrent users to prepare for clinical practice. Student feedback also indicated that they found 3D to be of limited value in the desktop version compared to the usual 2D approach. A randomised comparison between groups receiving software-based and traditional practice measured performance in a formative role play with real equipment. The results of this work indicated superior performance with the equipment for the VR trained students (P = 0.0366) and confirmed the value of VR for enhancing 3D equipment-based problem-solving skills. Conclusions Students practising projection techniques virtually performed better at role play assessments than students practising in a traditional radiography laboratory only. The application particularly helped with 3D equipment configuration, suggesting that teaching 3D problem solving is an ideal use of such medical equipment simulators. Ongoing development work aims to establish the role of VR software in preparing students for clinical practice with a range of medical imaging equipment. PMID:26229652
McGreevy, P D; Della Torre, P K; Evans, D L
2003-01-01
Interactive software has been developed on CD-ROM to facilitate learning of problem formulation, diagnostic methodology, and therapeutic options in dog and cat behavior problems. Students working in small groups are presented with a signalment, a case history, and brief description of the problem behavior as perceived by the client. Students then navigate through the case history by asking the client questions from an icon-driven question pad. Animated video responses to the questions are provided. Students are then required to rate the significance of the questions and answers with respect to the development of the unwelcome behavior. Links to online self-assessments and to resource materials about causation and treatment options are provided to assist students in their decision-making process. The activity concludes with a software-generated e-mail submission that includes the recorded history, diagnosis, and recommended treatment for assessment purposes.
Coarse-Grained Structural Modeling of Molecular Motors Using Multibody Dynamics
Parker, David; Bryant, Zev; Delp, Scott L.
2010-01-01
Experimental and computational approaches are needed to uncover the mechanisms by which molecular motors convert chemical energy into mechanical work. In this article, we describe methods and software to generate structurally realistic models of molecular motor conformations compatible with experimental data from different sources. Coarse-grained models of molecular structures are constructed by combining groups of atoms into a system of rigid bodies connected by joints. Contacts between rigid bodies enforce excluded volume constraints, and spring potentials model system elasticity. This simplified representation allows the conformations of complex molecular motors to be simulated interactively, providing a tool for hypothesis building and quantitative comparisons between models and experiments. In an example calculation, we have used the software to construct atomically detailed models of the myosin V molecular motor bound to its actin track. The software is available at www.simtk.org. PMID:20428469
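A minimal sketch can illustrate the two energy terms described above, harmonic springs for system elasticity and a soft repulsion enforcing excluded volume between sites. The geometry, constants and functional forms below are illustrative assumptions, not the published software.

```python
# Toy sketch of the energy terms described above: harmonic springs model
# elasticity between connected sites, and a soft quadratic repulsion
# penalizes overlap (excluded volume). Not the published software itself.

import numpy as np

def spring_energy(positions, bonds, k=1.0, rest_length=1.0):
    """Sum of 0.5*k*(|ri - rj| - L0)^2 over bonded site pairs."""
    e = 0.0
    for i, j in bonds:
        d = np.linalg.norm(positions[i] - positions[j])
        e += 0.5 * k * (d - rest_length) ** 2
    return e

def excluded_volume_energy(positions, radius=0.5, k_rep=10.0):
    """Quadratic penalty whenever two site spheres of the given radius overlap."""
    e = 0.0
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            d = np.linalg.norm(positions[i] - positions[j])
            overlap = 2 * radius - d
            if overlap > 0:
                e += 0.5 * k_rep * overlap ** 2
    return e

# Three sites in a line, with the first bond slightly compressed.
pos = np.array([[0.0, 0.0, 0.0], [0.9, 0.0, 0.0], [1.9, 0.0, 0.0]])
bonds = [(0, 1), (1, 2)]
print(spring_energy(pos, bonds), excluded_volume_energy(pos))
```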
ERIC Educational Resources Information Center
Muller, Eugene W.
1985-01-01
Develops generalizations for empirical evaluation of software based upon suitability of several research designs--pretest posttest control group, single-group pretest posttest, nonequivalent control group, time series, and regression discontinuity--to type of software being evaluated, and on circumstances under which evaluation is conducted. (MBR)
Iwata, Kazuhiko; Matsuda, Yasuhiro; Sato, Sayaka; Furukawa, Shunichi; Watanabe, Yukako; Hatsuse, Norifumi; Ikebuchi, Emi
2017-03-01
Cognitive impairment is common in schizophrenia and is associated with poor psychosocial functioning. Previous studies have shown inconsistent improvement in cognitive functions with cognitive remediation therapy. This study examined whether cognitive remediation is effective in improving both cognitive and social functions in schizophrenia in outpatient settings that provide learning-based psychiatric rehabilitation. This study is the first randomized controlled trial of cognitive remediation in Japan. Study participants were individuals with schizophrenia from 6 outpatient psychiatric medical facilities who were randomly assigned to either a cognitive remediation program or treatment as usual. The cognitive remediation intervention included computer-based cognitive training (CogPack; Japanese version) administered twice a week plus a weekly group session over 12 weeks, and was based on the Thinking Skills for Work program. Most study participants were attending day treatment services where social skills training, psychoeducation about schizophrenia, group activities such as recreation and sport, and other psychosocial treatments were offered. Cognitive and social functioning were assessed using the Brief Assessment of Cognition in Schizophrenia (BACS) and the Life Assessment Scale for the Mentally Ill (LASMI) at pre- and postintervention. Of the 60 people with schizophrenia enrolled, 29 were allocated to the cognitive remediation group and 31 to the treatment-as-usual group. Processing speed, executive function, and the composite score of the BACS showed significantly greater improvement for the cognitive remediation group than for the treatment-as-usual group. In addition, there was significant improvement in interpersonal relationships and work skills on the LASMI for the cognitive remediation group compared with the treatment-as-usual group. Changes from pretreatment to posttreatment in verbal fluency and interpersonal relationships were significantly correlated, as were changes in attention and work skills. The present findings showed that providing cognitive remediation in addition to psychiatric rehabilitation contributed to greater improvement in both cognitive and social functioning than psychiatric rehabilitation alone. Cognitive remediation may enhance the efficacy of psychiatric rehabilitation in improving social functioning. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
From Data-Sharing to Model-Sharing: SCEC and the Development of Earthquake System Science (Invited)
NASA Astrophysics Data System (ADS)
Jordan, T. H.
2009-12-01
Earthquake system science seeks to construct system-level models of earthquake phenomena and use them to predict emergent seismic behavior—an ambitious enterprise that requires a high degree of interdisciplinary, multi-institutional collaboration. This presentation will explore model-sharing structures that have been successful in promoting earthquake system science within the Southern California Earthquake Center (SCEC). These include disciplinary working groups to aggregate data into community models; numerical-simulation working groups to investigate system-specific phenomena (process modeling) and further improve the data models (inverse modeling); and interdisciplinary working groups to synthesize predictive system-level models. SCEC has developed a cyberinfrastructure, called the Community Modeling Environment, that can distribute the community models; manage large suites of numerical simulations; vertically integrate the hardware, software, and wetware needed for system-level modeling; and promote the interactions among working groups needed for model validation and refinement. Various socio-scientific structures contribute to successful model-sharing. Two of the most important are “communities of trust” and collaborations between government and academic scientists on mission-oriented objectives. The latter include improvements of earthquake forecasts and seismic hazard models and the use of earthquake scenarios in promoting public awareness and disaster management.
Programming Makes Software; Support Makes Users
NASA Astrophysics Data System (ADS)
Batcheller, A. L.
2010-12-01
Skilled software engineers may build fantastic software for climate modeling, yet fail to achieve their project’s objectives. Software support and related activities are just as critical as writing software. This study followed three different software projects in the climate sciences, using interviews, observation, and document analysis to examine the value added by support work. Supporting the project and interacting with users was a key task for software developers, who often spent 50% of their time on it. Such support work most often involved replying to questions on an email list, but also included talking to users on teleconference calls and in person. Software support increased adoption by building the software’s reputation and showing individuals how the software could meet their needs. In the process of providing support, developers often learned of new requirements as users reported features they desired and bugs they found. As software matures and gains widespread use, support work often increases. In fact, such increases can be one signal that the software has achieved broad acceptance. Maturing projects also find demand for instructional classes, online tutorials and detailed examples of how to use the software. The importance of support highlights the fact that building software systems involves both social and technical aspects. Yes, we need to build the software, but we also need to “build” the users and practices that can take advantage of it.
Software for enhanced video capsule endoscopy: challenges for essential progress.
Iakovidis, Dimitris K; Koulaouzidis, Anastasios
2015-03-01
Video capsule endoscopy (VCE) has revolutionized the diagnostic work-up in the field of small bowel diseases. Furthermore, VCE has the potential to become the leading screening technique for the entire gastrointestinal tract. Computational methods that can be implemented in software can enhance the diagnostic yield of VCE both in terms of efficiency and diagnostic accuracy. Since the appearance of the first capsule endoscope in clinical practice in 2001, information technology (IT) research groups have proposed a variety of such methods, including algorithms for detecting haemorrhage and lesions, reducing the reviewing time, localizing the capsule or lesion, assessing intestinal motility, enhancing the video quality and managing the data. Even though research is prolific (as measured by publication activity), the progress made during the past 5 years can only be considered marginal with respect to clinically significant outcomes. One thing is clear: parallel pathways of medical and IT scientists exist, each publishing in their own area, but where do these research pathways meet? Could the proposed IT plans have any clinical effect, and do clinicians really understand the limitations of VCE software? In this Review, we present an in-depth critical analysis that aims to inspire and align the agendas of the two scientific groups.
An Update on the VAMOS Extremes Working Group Activities
NASA Technical Reports Server (NTRS)
Schubert, Siegfried; Cavalcanti, Iracema
2011-01-01
We review here the progress of the Variability of the American MOnsoon Systems (VAMOS) extremes working group since it was formed in February of 2010. The goals of the working group are to 1) develop an atlas of warm-season extremes over the Americas, 2) evaluate existing and planned simulations, and 3) suggest new model runs to address mechanisms and predictability of extremes. Substantial progress has been made in the development of an extremes atlas based on gridded observations and several reanalysis products, including the Modern Era Retrospective-Analysis for Research and Applications (MERRA) and the Climate Forecast System Reanalysis (CFSR). The status of the atlas, remaining issues, and plans for its expansion to include model data will be discussed. This includes the possibility of adding a companion atlas based on station observations, built using the software developed under the World Climate Research Programme (WCRP) Expert Team on Climate Change Detection and Indices (ETCCDI) activity. We will also review progress on relevant research and plans for the use and validation of the atlas results.
Abstract of talk for Silicon Valley Linux Users Group
NASA Technical Reports Server (NTRS)
Clanton, Sam
2003-01-01
The use of Linux for research at NASA Ames is discussed. Topics include: work with the Atmospheric Physics branch on software for a spectrometer to be used in the CRYSTAL-FACE mission this summer; and work in the Neuroengineering Lab with code IC, including an introduction to the extension of the human senses project, advantages of using Linux for real-time biological data processing, algorithms utilized on a Linux system, goals of the project, slides of people wearing Neuroscan caps, and the progress that has been made and how Linux has helped.
Remote sensing information sciences research group: Browse in the EOS era
NASA Technical Reports Server (NTRS)
Estes, John E.; Star, Jeffrey L.
1989-01-01
The problem of science data browse was examined. Given the tremendous data volumes that are planned for future space missions, particularly the Earth Observing System in the late 1990's, the need for access to large spatial databases must be understood. Work was continued to refine the concept of data browse. Further, software was developed to provide a testbed of the concepts, both to locate possibly interesting data, as well as view a small portion of the data. Build II was placed on a minicomputer and a PC in the laboratory, and provided accounts for use in the testbed. Consideration of the testbed software as an element of in-house data management plans was begun.
Final Technical Report - Center for Technology for Advanced Scientific Component Software (TASCS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sussman, Alan
2014-10-21
This is a final technical report for the University of Maryland work in the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS). The Maryland work focused on software tools for coupling parallel software components built using the Common Component Architecture (CCA) APIs. Those tools are based on the Maryland InterComm software framework that has been used in multiple computational science applications to build large-scale simulations of complex physical systems that employ multiple separately developed codes.
Evan Weaver, Researcher III - Software Engineering. He works as a software engineer developing whole-building energy modeling tools. Prior to joining NREL, he worked in the biomedical industry as a software engineer, specializing in graphical user...
The Future of Software Engineering for High Performance Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, G
DOE ASCR requested that, from May through mid-July 2015, a study group identify issues and recommend solutions, from a software engineering perspective, for transitioning into the next generation of High Performance Computing. The approach used was to ask some of the DOE complex experts who will be responsible for doing this work to contribute to the study group. The technique used was to solicit elevator speeches: a short and concise write-up done as if the author were a speaker with only a few minutes to convince a decision maker of their top issues. Pages 2-18 contain the original texts of the contributed elevator speeches and end notes identifying the 20 contributors. The study group also ranked the importance of each topic, and those scores are displayed with each topic heading. A perfect score (and highest priority) is three, two is medium priority, and one is lowest priority. The highest scoring topic areas were software engineering and testing resources; the lowest scoring area was compliance to DOE standards. The following two paragraphs are an elevator speech summarizing the contributed elevator speeches. Each sentence or phrase in the summary is hyperlinked to its source via a numeral embedded in the text. A risk one-liner has also been added to each topic to allow future risk tracking and mitigation.
An ontology based trust verification of software license agreement
NASA Astrophysics Data System (ADS)
Lu, Wenhuan; Li, Xiaoqing; Gan, Zengqin; Wei, Jianguo
2017-08-01
When we install or download software, a large document is displayed stating the rights and obligations, which many people do not have the patience to read or understand. This may make users distrust the software. In this paper, we propose an ontology-based verification for Software License Agreements. First of all, this work proposes an ontology model for the domain of Software License Agreements. The domain ontology is constructed by the proposed methodology according to copyright laws and 30 software license agreements. The License Ontology can act as a part of a generalized copyright law knowledge model, and can also serve as a visualization of software licenses. Based on this proposed ontology, a software-license-oriented text summarization approach is proposed, whose performance shows that it can improve the accuracy of summarizing software licenses. Based on the summarization, the underlying purpose of the software license can be explicitly explored for trust verification.
Selection of software for mechanical engineering undergraduates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheah, C. T.; Yin, C. S.; Halim, T.
A major problem with the undergraduate mechanical course is the limited exposure of students to software packages, coupled with the long learning curve of the existing software packages. This work proposes the use of appropriate software packages across the entire mechanical engineering curriculum to ensure students get sufficient exposure to real-life design problems. A variety of software packages are highlighted as being suitable for undergraduate work in mechanical engineering, e.g. simultaneous non-linear equations; uncertainty analysis; 3-D modeling software with FEA; and analysis tools for the solution of problems in thermodynamics, fluid mechanics, mechanical system design, and solid mechanics.
Griffiths, Silja Torvik; Gundersen, Hilde; Neto, Emanuel; Elgen, Irene; Markestad, Trond; Aukland, Stein M; Hugdahl, Kenneth
2013-08-01
Extremely preterm (EPT)/extremely low-birth-weight (ELBW) children attaining school age and adolescence often have problems with executive functions such as working memory and selective attention. Our aim was to investigate a hypothesized difference in blood oxygen level-dependent (BOLD) activation during a selective attention-working memory task in EPT/ELBW children as compared with term-born controls. A regional cohort of 28 EPT/ELBW children and 28 term-born controls underwent functional magnetic resonance imaging (fMRI) scanning at 11 y of age while performing a combined Stroop n-back task. Group differences in BOLD activation were analyzed with Statistical Parametric Mapping 8 analysis software package, and reaction times (RTs) and response accuracy (RA) were compared in a multifactorial ANOVA test. The BOLD activation pattern in the preterm group involved the same areas (cingulate, prefrontal, and parietal cortexes), but all areas displayed significantly less activation than those in the control group, particularly when the cognitive load was increased. The RA results corresponded with the activation data in that the preterm group had significantly fewer correct responses. No group difference was found regarding RTs. Children born EPT/ELBW displayed reduced working memory and selective attention capacity as compared with term-born controls. These impairments had neuronal correlates with reduced BOLD activation in areas responsible for online stimulus monitoring, working memory, and cognitive control.
NASA Astrophysics Data System (ADS)
Eslinger, Eric Martin
Metacognitive skills are a crucial component of a successful learning career. We define metacognition as the ability to plan, monitor progress toward a goal, reflect on the quality of work and process, and revise the work or plan accordingly. By explicitly addressing certain metacognitive practices in classrooms, researchers have observed improved learning outcomes in both science and mathematical problem solving. Although these efforts were successful, they were also limited in the range of skills that could be addressed at one time and the methods used to address them due to the static nature inherent in traditional pencil-and-paper format. We wished to address these skills in a more dynamic, continuous representation such as that afforded by a computerized learning environment. This paper outlines such an environment and describes pedagogical activities afforded by the system. The ThinkerTools group developed and tested a software scaffold for inquiry projects in a middle-school classroom. By analyzing student use of the software tool, three forms of self-assessment activity were noted: integrated, task and project self-assessment. Each assessment form was related to the degree of interleaving between assessment and work the students engaged in as they developed their inquiry products. I argue that the integrated forms of assessment are more beneficial to student learning, and show that there is a significant relationship between active self-assessment forms and measures of student achievement and product quality. Through the use of case studies including video analysis, I address specific student self-assessment activity that utilized the software as well as self-assessment that took place outside of the software. A model of student self-assessment activity was created, highlighting aspects of activity that afford more productive self-assessment episodes.
NASA Astrophysics Data System (ADS)
Neighbour, Gordon
2013-04-01
In 2012 Computing and Information Technology was disapplied from the English National Curriculum and therefore no longer has a compulsory programme of study. Data logging and data modelling are still essential components of the curriculum in the Computing and Information Technology classroom. Once the students have mastered the basics of both spreadsheet and information handling software, they need to be further challenged. All too often the data used in relation to data-logging and data-handling is not realistic enough to really challenge very able students. However, using data from seismology allows students to manipulate "real" data and enhances their experience of geo-science, developing their skills and then allowing them to build on this work in both the science and geography classroom. This new scheme of work, "Seismology at School", has allowed the students to work and develop skills beyond those normally expected for their age group and has allowed them to better appreciate their learning experience of "Natural Hazards" in the science and geography classroom in later years. The students undertake research to help them develop their understanding of earthquakes. This includes using materials from other nations within the European Economic Area, to also develop and challenge their use of Modern Foreign Languages. They are then challenged to create their own seismometers using simple kits and 'free' software - this "problem-solving" approach to their work is designed to enhance team-work and to extend the challenge they experience in the classroom. The students are then asked to manipulate a "real" set of data using international earthquake data from the most recent whole year. This allows the students to make use of many of the analytical and statistical functions of both spreadsheet software and information handling software in a meaningful way. The students will need to have developed a hypothesis which their work should either support or refute. They are required to document their progress throughout the project and submit their work as an electronic portfolio for marking, which also challenges their organisational abilities. Finally, through the project it is hoped to develop and extend partnerships with other schools in the European Economic Area so that the students are able to work with students from these areas to further appreciate the teaching of "Natural Hazards" in other cultures within the EEA.
A Brief Survey of the Team Software ProcessSM (TSPSM)
2011-10-24
spent more than 20 years in industry as a software engineer, system designer, project leader, and development manager working on control systems...InnerWorkings, Inc. Instituto Tecnologico y de Estudios Superiores de Monterrey Siemens AG SILAC Ingenieria de Software S.A. de C.V
NASA Astrophysics Data System (ADS)
Babik, M.; Chudoba, J.; Dewhurst, A.; Finnern, T.; Froy, T.; Grigoras, C.; Hafeez, K.; Hoeft, B.; Idiculla, T.; Kelsey, D. P.; López Muñoz, F.; Martelli, E.; Nandakumar, R.; Ohrenberg, K.; Prelz, F.; Rand, D.; Sciabà, A.; Tigerstedt, U.; Traynor, D.; Wartel, R.
2017-10-01
IPv4 network addresses are running out and the deployment of IPv6 networking in many places is now well underway. Following the work of the HEPiX IPv6 Working Group, a growing number of sites in the Worldwide Large Hadron Collider Computing Grid (WLCG) are deploying dual-stack IPv6/IPv4 services. The aim of this is to support the use of IPv6-only clients, i.e. worker nodes, virtual machines or containers. The IPv6 networking protocols, while they do contain features aimed at improving security, also bring new challenges for operational IT security. The lack of maturity of IPv6 implementations, together with the increased complexity of some of the protocol standards, raises many new issues for operational security teams. The HEPiX IPv6 Working Group is producing guidance on best practices in this area. This paper considers some of the security concerns for WLCG in an IPv6 world and presents the HEPiX IPv6 Working Group guidance for the system administrators who manage IT services on the WLCG distributed infrastructure, for their related site security and networking teams, and for developers and software engineers working on WLCG applications.
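As a small illustration of what "dual-stack" means in practice, the sketch below asks the standard resolver whether a host publishes both IPv4 (A) and IPv6 (AAAA) records. It uses only Python's standard library; the hostname is a placeholder, not a real WLCG endpoint.

```python
# Quick dual-stack check: does a host resolve to both IPv4 and IPv6 addresses?
# Standard library only; the hostname below is a placeholder.

import socket

def address_families(hostname, port=443):
    families = set()
    try:
        for family, *_, sockaddr in socket.getaddrinfo(hostname, port):
            families.add("IPv6" if family == socket.AF_INET6 else "IPv4")
    except socket.gaierror as err:
        print(f"lookup failed for {hostname}: {err}")
    return families

host = "storage.example.org"   # placeholder service name
found = address_families(host)
status = "dual-stack" if found == {"IPv4", "IPv6"} else (", ".join(sorted(found)) or "no records")
print(f"{host}: {status}")
```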
Visiting Vehicle Ground Trajectory Tool
NASA Technical Reports Server (NTRS)
Hamm, Dustin
2013-01-01
The International Space Station (ISS) Visiting Vehicle Group needed a targeting tool for vehicles that rendezvous with the ISS. The Visiting Vehicle Ground Trajectory targeting tool provides the ability to perform both realtime and planning operations for the Visiting Vehicle Group. This tool provides a highly reconfigurable base, which allows the Visiting Vehicle Group to perform their work. The application is composed of a telemetry processing function, a relative motion function, a targeting function, a vector view, and 2D/3D world map type graphics. The software tool provides the ability to plan a rendezvous trajectory for vehicles that visit the ISS. It models these relative trajectories using planned and realtime data from the vehicle. The tool monitors ongoing rendezvous trajectory relative motion, and ensures visiting vehicles stay within agreed corridors. The software provides the ability to update or re-plan a rendezvous to support contingency operations. Previously, new parameters could not be added and incorporated into the system on the fly; if the need for a capability was not discovered until the vehicle was flying, there was no way to update the tool.
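For readers unfamiliar with rendezvous relative-motion modeling, the sketch below propagates a chaser state with the standard Clohessy-Wiltshire (Hill) closed-form solution for motion about a circular orbit. This is the textbook model behind many rendezvous planning tools, not a description of this tool's actual algorithm, and the orbit period and initial offsets are illustrative.

```python
# Standard Clohessy-Wiltshire (Hill) closed-form relative motion about a
# circular orbit. Frame: x radial (outward), y along-track, z cross-track;
# n is the target's mean motion. Illustrative only.

import math

def cw_propagate(x0, y0, z0, vx0, vy0, vz0, n, t):
    s, c = math.sin(n * t), math.cos(n * t)
    x = (4 - 3 * c) * x0 + (s / n) * vx0 + (2 / n) * (1 - c) * vy0
    y = (6 * (s - n * t)) * x0 + y0 - (2 / n) * (1 - c) * vx0 + ((4 * s - 3 * n * t) / n) * vy0
    z = c * z0 + (s / n) * vz0
    return x, y, z

# ISS-like orbit: period ~92.9 min gives the mean motion n (rad/s).
n = 2 * math.pi / (92.9 * 60.0)

# Chaser 100 m above (radial) and 1 km behind the station, at rest in the
# rotating frame, propagated for 10 minutes: the radial offset induces drift.
print(cw_propagate(100.0, -1000.0, 0.0, 0.0, 0.0, 0.0, n, 600.0))
```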
Using Group Explorer in teaching abstract algebra
NASA Astrophysics Data System (ADS)
Schubert, Claus; Gfeller, Mary; Donohue, Christopher
2013-04-01
This study explores the use of Group Explorer in an undergraduate mathematics course in abstract algebra. The visual nature of Group Explorer in representing concepts in group theory is an attractive incentive to use this software in the classroom. However, little is known about students' perceptions of this technology in learning concepts in abstract algebra. A total of 26 participants in an undergraduate course studying group theory were surveyed regarding their experiences using Group Explorer. Findings indicate that all participants believed that the software was beneficial to their learning and described their attitudes regarding the software in terms of using the technology and its helpfulness in learning concepts. A multiple regression analysis reveals that representational fluency of concepts with the software correlated significantly with participants' understanding of group concepts; yet participants' attitudes about Group Explorer and technology in general were not significant factors.
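A tiny example of the kind of object Group Explorer visualizes is a Cayley table. The sketch below builds one for the cyclic group Z_n under addition mod n; it is unrelated to Group Explorer's own code and is included only to make the classroom object concrete.

```python
# Cayley table of the cyclic group Z_n under addition mod n.

def cayley_table_zn(n):
    """Return the n x n Cayley table of (Z_n, +)."""
    return [[(a + b) % n for b in range(n)] for a in range(n)]

for row in cayley_table_zn(4):
    print(row)
# Each element appears exactly once in every row and column (the Latin-square
# property), one of the visual patterns students explore in the software.
```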
DOE Office of Scientific and Technical Information (OSTI.GOV)
Childers, L.; Liming, L.; Foster, I.
2008-10-15
This report summarizes the methodology and results of a user perspectives study conducted by the Community Driven Improvement of Globus Software (CDIGS) project. The purpose of the study was to document the work-related goals and challenges facing today's scientific technology users, to record their perspectives on Globus software and the distributed-computing ecosystem, and to provide recommendations to the Globus community based on the observations. Globus is a set of open source software components intended to provide a framework for collaborative computational science activities. Rather than attempting to characterize all users or potential users of Globus software, our strategy has been to speak in detail with a small group of individuals in the scientific community whose work appears to be the kind that could benefit from Globus software, learn as much as possible about their work goals and the challenges they face, and describe what we found. The result is a set of statements about specific individuals' experiences. We do not claim that these are representative of a potential user community, but we do claim to have found commonalities and differences among the interviewees that may be reflected in the user community as a whole. We present these as a series of hypotheses that can be tested by subsequent studies, and we offer recommendations to Globus developers based on the assumption that these hypotheses are representative. Specifically, we conducted interviews with thirty technology users in the scientific community. We included both people who have used Globus software and those who have not. We made a point of including individuals who represent a variety of roles in scientific projects, for example, scientists, software developers, engineers, and infrastructure providers. The following material is included in this report: (1) A summary of the reported work-related goals, significant issues, and points of satisfaction with the use of Globus software; (2) A method for characterizing users according to their technology interactions, and identification of four user types among the interviewees using the method; (3) Four profiles that highlight points of commonality and diversity in each user type; (4) Recommendations for technology developers and future studies; (5) A description of the interview protocol and overall study methodology; (6) An anonymized list of the interviewees; and (7) Interview writeups and summary data. The interview summaries in Section 3 and transcripts in Appendix D illustrate the value of distributed computing software--and Globus in particular--to scientific enterprises. They also document opportunities to make these tools still more useful both to current users and to new communities. We aim our recommendations at developers who intend their software to be used and reused in many applications. (This kind of software is often referred to as 'middleware.') Our two core recommendations are as follows. First, it is essential for middleware developers to understand and explicitly manage the multiple user products in which their software components are used. We must avoid making assumptions about the commonality of these products and, instead, study and account for their diversity. Second, middleware developers should engage in different ways with different kinds of users. Having identified four general user types in Section 4, we provide specific ideas for how to engage them in Section 5.
Imai, Shungo; Yamada, Takehiro; Ishiguro, Nobuhisa; Miyamoto, Takenori; Kagami, Keisuke; Tomiyama, Naoki; Niinuma, Yusuke; Nagasaki, Daisuke; Suzuki, Koji; Yamagami, Akira; Kasashi, Kumiko; Kobayashi, Masaki; Iseki, Ken
2017-01-01
Based on the predictive performance in our previous study, we switched the therapeutic drug monitoring (TDM) analysis software for dose setting of vancomycin (VCM) from "Vancomycin MEEK TDM analysis software Ver2.0" (MEEK) to "SHIONOGI-VCM-TDM ver.2009" (VCM-TDM) in January 2015. In the present study, our aim was to validate the effectiveness of changing the VCM TDM analysis software for initial dose setting of VCM. The enrolled patients were divided into two groups, each comprising 162 patients, who received VCM with the initial dose set using MEEK (MEEK group) or VCM-TDM (VCM-TDM group). We compared the rates of attaining the therapeutic range (trough value 10-20 μg/mL) of serum VCM concentration between the groups. Multivariate logistic regression analysis was performed to confirm that changing the VCM TDM analysis software was an independent factor related to attaining the therapeutic range. Switching the VCM TDM analysis software from MEEK to VCM-TDM improved the rate of attaining the therapeutic range by 21.6% (MEEK group: 42.6% vs. VCM-TDM group: 64.2%, p<0.01). Patient age ≥65 years, concomitant furosemide, and use of VCM-TDM as the TDM analysis software were considered to be independent factors for attaining the therapeutic range. These results demonstrate the effectiveness of switching the VCM TDM analysis software from MEEK to VCM-TDM for initial dose setting of VCM.
NASA Technical Reports Server (NTRS)
2000-01-01
Oak Grove Reactor, developed by Oak Grove Systems, is a new software program that allows users to integrate workflow processes. It can be used with portable communication devices. The software can join e-mail, calendar/scheduling, and legacy applications into one interactive system via the web. Priority tasks and due dates are organized and highlighted to keep the user up to date with developments. Reactor works with existing software, and few new skills are needed to use it. Using a web browser, a user can work on something while other users work on the same procedure or view its status while it is being worked on at another site. The software was developed by the Jet Propulsion Lab and originally put to use at Johnson Space Center.
Fernandez, Elizabeth; Bergado Rosado, Jorge A.; Rodriguez Perez, Daymi; Salazar Santana, Sonia; Torres Aguilar, Maydane; Bringas, Maria Luisa
2017-01-01
Many training programs have been designed using modern software to restore impaired cognitive functions in patients with acquired brain damage (ABD). The objective of this study was to evaluate the effectiveness of a computer-based training program for attention and memory in patients with ABD, using a two-armed parallel-group design in which the experimental group (n = 50) received cognitive stimulation using RehaCom software and the control group (n = 30) received standard (non-computerized) cognitive stimulation for eight weeks. In order to assess possible cognitive changes after the treatment, a pre-post experimental design was employed using the following neuropsychological tests: the Wechsler Memory Scale (WMS) and the Trail Making Test A and B. The effectiveness of the training procedure was statistically significant (p < 0.05) when performance on these scales before and after the training period was compared, within each patient and between the two groups. The training group had statistically significant (p < 0.001) changes in focused attention (Trail A), two subtests (digit span and logical memory), and the overall score of the WMS. Finally, we discuss the advantages of computerized rehabilitation training and future directions of this line of work. PMID:29301194
Design of polarized infrared athermal telephoto objective for penetrating the fog
NASA Astrophysics Data System (ADS)
Gao, Duorui; Fu, Qiang; Zhao, Zhao; Zhao, Bin; Zhong, Lijun; Zhan, Juntong
2014-11-01
Polarized infrared imaging is a new detection technique with the abilities to see through fog, highlight targets, and recognize forgeries; these characteristics give it a clear advantage for increasing the working distance in fog. Compared with traditional infrared imaging, polarized infrared imaging can distinguish the background from the target easily, which is its most distinguishing feature. Owing to the large refractive index of infrared materials, temperature changes cause serious defocus, so an athermal infrared objective is necessary. On the other hand, a conventional athermal objective has a large total length and is hard to integrate because of its large volume, whereas a telephoto objective has the advantages of small volume and short total length. This paper introduces a design method for a polarized, athermal infrared telephoto objective that can see through fog. First, the optical power of the fore group and the rear group is assigned on the basis of the telephoto principle, with positive power in the fore group and negative power in the rear group; then the optical power is distributed within each group to achieve athermalization; finally, computer-aided software is used to correct aberrations. To prove the feasibility of the scheme, an athermal optical system working at 8~12 µm was designed with the ZEMAX software, with a focal length of 150 mm, an F-number of 2, and a total telephoto length of 120 mm. Analysis over the environmental temperature range shows that the optical system has stable imaging quality, with MTF close to the diffraction limit. This telephoto objective is suitable for polarized infrared imaging.
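The fore/rear power split can be illustrated with the thin-lens relations for a two-group telephoto: the combined focal length is f = f1*f2/(f1 + f2 - d), and the package length is the group separation plus the back focal distance. The numbers in the sketch below are illustrative choices that happen to give a 150 mm focal length; they are not the prescription of this design.

```python
# Thin-lens sketch of a two-group telephoto: a positive fore group and a
# negative rear group give a total length shorter than the focal length.
# Illustrative values only, not the paper's actual lens prescription.

def telephoto(f1, f2, d):
    """Two thin lenses of focal lengths f1, f2 separated by d (same units)."""
    f = f1 * f2 / (f1 + f2 - d)             # combined effective focal length
    bfd = f2 * (f1 - d) / (f1 + f2 - d)     # back focal distance (rear lens to image)
    total_length = d + bfd
    return f, total_length, total_length / f

f, length, ratio = telephoto(f1=75.0, f2=-70.0, d=40.0)   # millimetres
print(f"EFL = {f:.0f} mm, total length = {length:.0f} mm, telephoto ratio = {ratio:.2f}")
# A 150 mm focal length in a 110 mm package: ratio < 1 is the defining
# property of a telephoto objective.
```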
Pioche, Mathieu; Rivory, Jérôme; Nishizawa, Toshihiro; Uraoka, Toshio; Touzet, Sandrine; O'Brien, Marc; Saurin, Jean-Christophe; Ponchon, Thierry; Denis, Angélique; Yahagi, Naohisa
2016-12-01
Background and study aim: Endoscopic submucosal dissection (ESD) is currently the reference method to achieve an en bloc resection for large lesions; however, the technique is difficult and risky, with a long learning curve. In order to reduce the morbidity, training courses that use animal models are recommended. Recently, self-learning software has been developed to assist students in their training. The aim of this study was to evaluate the impact of this tool on the ESD learning curve. Methods: A prospective, randomized, comparative study enrolled 39 students who were experienced in interventional endoscopy. Each student was randomized to one of two groups and performed 30 ESDs of 30 mm standardized lesions in a bovine colon model. The software group used the self-learning software whereas the control group only observed an ESD procedure video. The primary outcome was the rate of successful ESD procedures, defined as complete en bloc resection without any perforation and performed in less than 75 minutes. Results: A total of 39 students performed 1170 ESDs. Success was achieved in 404 (70.9 %) in the software group and 367 (61.2 %) in the control group ( P = 0.03). Among the successful procedures, there were no significant differences between the software and control groups in terms of perforation rate (22 [4.0 %] vs. 29 [5.1 %], respectively; P = 0.27) and mean (SD) procedure duration (34.1 [13.4] vs. 32.3 [14.0] minutes, respectively; P = 0.52). For the 30th procedure, the rate of complete resection was superior in the software group (84.2 %) compared with the control group (50.0 %; P = 0.01). Conclusion: ESD self-learning software was effective in improving the quality of resection compared with a standard teaching method using procedure videos. This result suggests the benefit of incorporating such software into teaching programs. © Georg Thieme Verlag KG Stuttgart · New York.
Computer work duration and its dependence on the used pause definition.
Richter, Janneke M; Slijper, Harm P; Over, Eelco A B; Frens, Maarten A
2008-11-01
Several ergonomic studies have estimated computer work duration using registration software. In these studies, an arbitrary pause definition (Pd; the minimal time between two computer events to constitute a pause) is chosen and the resulting duration of computer work is estimated. In order to uncover the relationship between the pause definition used and the computer work duration (PWT), we used registration software to record usage patterns of 571 computer users across almost 60,000 working days. For a large range of Pds (1-120 s), we found a shallow, log-linear relationship between PWT and Pd. For keyboard and mouse use, a second-order function fitted the data best. We found that these relationships were dependent on the amount of computer work and on subject characteristics. Comparison of exposure durations from studies using different pause definitions should take this into account, since ignoring it could lead to misclassification. Software manufacturers and ergonomists assessing computer work duration could use the relationships found here for software design and study comparison.
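The dependence on the pause definition is easy to reproduce in miniature: given timestamps of input events, every inter-event gap shorter than Pd counts as work and every longer gap counts as a pause. The sketch below uses synthetic event data, not the study's recordings, and simply shows how the estimated work time grows as Pd is relaxed.

```python
# How the estimated computer work time depends on the pause definition (Pd):
# gaps between successive input events shorter than Pd count as work,
# longer gaps count as pauses. Event timestamps below are synthetic.

import numpy as np

rng = np.random.default_rng(1)
# One synthetic day of input events: mostly short gaps, occasionally a long break.
gaps = rng.exponential(scale=3.0, size=5000)             # seconds between events
breaks = rng.random(gaps.size) < 0.02                     # ~2% of gaps are breaks
gaps[breaks] = rng.uniform(60.0, 600.0, breaks.sum())
event_times = np.cumsum(gaps)

def work_time(times, pd_seconds):
    """Total computer work: sum of inter-event gaps shorter than the pause definition."""
    g = np.diff(times)
    return g[g < pd_seconds].sum()

for pd in (1, 5, 15, 30, 60, 120):
    print(f"Pd = {pd:>3d} s -> estimated work = {work_time(event_times, pd) / 3600:.2f} h")
```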
Development of Software to Model AXAF-I Image Quality
NASA Technical Reports Server (NTRS)
Ahmad, Anees; Hawkins, Lamar
1996-01-01
This draft final report describes the work performed under the delivery order number 145 from May 1995 through August 1996. The scope of work included a number of software development tasks for the performance modeling of AXAF-I. A number of new capabilities and functions have been added to the GT software, which is the command mode version of the GRAZTRACE software, originally developed by MSFC. A structural data interface has been developed for the EAL (old SPAR) finite element analysis FEA program, which is being used by MSFC Structural Analysis group for the analysis of AXAF-I. This interface utility can read the structural deformation file from the EAL and other finite element analysis programs such as NASTRAN and COSMOS/M, and convert the data to a suitable format that can be used for the deformation ray-tracing to predict the image quality for a distorted mirror. There is a provision in this utility to expand the data from finite element models assuming 180 degrees symmetry. This utility has been used to predict image characteristics for the AXAF-I HRMA, when subjected to gravity effects in the horizontal x-ray ground test configuration. The development of the metrology data processing interface software has also been completed. It can read the HDOS FITS format surface map files, manipulate and filter the metrology data, and produce a deformation file, which can be used by GT for ray tracing for the mirror surface figure errors. This utility has been used to determine the optimum alignment (axial spacing and clocking) for the four pairs of AXAF-I mirrors. Based on this optimized alignment, the geometric images and effective focal lengths for the as built mirrors were predicted to cross check the results obtained by Kodak.
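The metrology-processing step described above can be sketched in a few lines: read a FITS surface map, remove the piston (mean) term, and smooth measurement noise before the figure errors are handed to a ray-trace. The sketch assumes astropy and scipy are available, the file name is a placeholder, and this is not the original GT/MSFC utility.

```python
# Sketch of surface-map preprocessing for figure-error ray tracing: read a
# FITS surface map, remove the piston term and smooth high-frequency noise.
# Placeholder file name; not the original GT/MSFC metrology interface.

import numpy as np
from astropy.io import fits
from scipy.ndimage import gaussian_filter

def load_surface_map(path, smooth_sigma=2.0):
    surface = fits.getdata(path).astype(float)   # surface heights, e.g. in nm
    surface -= surface.mean()                     # remove the piston (mean) term
    return gaussian_filter(surface, sigma=smooth_sigma)

deformation = load_surface_map("mirror_p1_surface.fits")   # placeholder file
print("RMS figure error:", np.sqrt(np.mean(deformation ** 2)))
```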
First steps of processing VLBI data of space probes with VieVS
NASA Astrophysics Data System (ADS)
Plank, L.; Böhm, J.; Schuh, H.
2011-07-01
Since 2008 the VLBI group at the Institute of Geodesy and Geophysics (IGG) of the Vienna University of Technology has been developing the Vienna VLBI Software VieVS, which is capable of processing geodetic VLBI data in NGS format. We are constantly upgrading the software, e.g. by developing a scheduling tool and by extending it from single-session solutions to a so-called global solution, allowing the joint analysis of many sessions covering several years. In this presentation we report on first steps to enable the processing of space VLBI data with the software. Driven by the recently increasing number of space VLBI applications, our goal is the geodetic use of such data, primarily concerning frame ties between various reference frames, e.g. by connecting the dynamic reference frame of a space probe with the kinematically defined International Celestial Reference Frame (ICRF). The main parts of the software extension w.r.t. the existing VieVS are the treatment of fast-moving targets, the implementation of a delay model for radio emitters at finite distances, and the adequate mathematical model and adjustment of the particular unknowns. Actual work has been done for two mission scenarios so far: on the one hand, differential VLBI (D-VLBI) data from the two sub-satellites of the Japanese lunar mission Selene were processed; on the other hand, VLBI observations of GNSS satellites were modelled in VieVS. Besides some general aspects, we give details on the calculation of the theoretical delay (delay model for moving sources at finite distances) and its realization in VieVS. First results with real data and comparisons with best-fit mission orbit data are also presented.
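The abstract names a finite-distance delay model without giving its form; purely as an illustration of the geometric core of such a model (a real VieVS implementation would add relativistic, tropospheric and clock terms), the near-field delay is the light-travel-time difference of the two station-to-source paths. The function below is a hedged sketch, not the VieVS code:

```python
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def near_field_geometric_delay(source_m, station1_m, station2_m):
    """Geometric delay (seconds) of a finite-distance radio source between two stations.

    All positions are 3-vectors in a common Earth-centred frame; for a source at
    infinite distance this reduces to the familiar plane-wave baseline delay.
    """
    d1 = np.linalg.norm(np.asarray(source_m) - np.asarray(station1_m))
    d2 = np.linalg.norm(np.asarray(source_m) - np.asarray(station2_m))
    return (d2 - d1) / C
```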
A Novel Approach for Collaborative Pair Programming
ERIC Educational Resources Information Center
Goel, Sanjay; Kathuria, Vanshi
2010-01-01
The majority of an engineer's time in the software industry is spent working with other programmers. Agile methods of software development like eXtreme Programming strongly rely upon practices like daily meetings and pair programming. Hence, the need to learn the skill of working collaboratively is of primary importance for software developers.…
Leader Delegation and Trust in Global Software Teams
ERIC Educational Resources Information Center
Zhang, Suling
2008-01-01
Virtual teams are an important work structure in global software development. The distributed team structure enables access to a diverse set of expertise which is often not available in one location, to a cheaper labor force, and to a potentially accelerated development process that uses a twenty-four hour work structure. Many software teams…
ERIC Educational Resources Information Center
Prvinchandar, Sunita; Ayub, Ahmad Fauzi Mohd
2014-01-01
This study compared the effectiveness of two types of computer software for improving the English writing skills of pupils in a Malaysian primary school. Sixty students who participated in the seven-week training course were divided into two groups, with the experimental group using the StyleWriter software and the control group using the…
Cooperative Work and Sustainable Scientific Software Practices in R
NASA Astrophysics Data System (ADS)
Weber, N.
2013-12-01
Most scientific software projects are dependent on the work of many diverse people, institutions and organizations. Incentivizing these actors to cooperatively develop software that is both reliable and sustainable is complicated by the fact that the reward structures of these various actors differ greatly: research scientists want results from a software or model run in order to publish papers, produce new data, or test a hypothesis; software engineers and research centers want compilable, well-documented code that is refactorable, reusable and reproducible in future research scenarios. While much research has been done on incentives and motivations for participating in open source software projects or cyberinfrastructure development, little work has been done on what motivates or incentivizes developers to maintain scientific software projects beyond their original application. This poster will present early results of research into the incentives and motivation for cooperative scientific software development. In particular, this work focuses on motivations for the maintenance and repair of libraries on the software platform R. Our work here uses a sample of R packages that were created by research centers, or are specific to earth, environmental and climate science applications. We first mined 'check' logs from the Comprehensive R Archive Network (CRAN) to determine the amount of time a package has existed, the number of versions it has gone through over this time, the number of releases, and finally the contact information for each official package 'maintainer'. We then sent a survey to each official maintainer, asking them questions about what role they played in developing the original package, and what their motivations were for sustaining the project over time. We will present early results from this mining and our survey of R maintainers.
NASA Astrophysics Data System (ADS)
Da Silva, A.; Sánchez Prieto, S.; Polo, O.; Parra Espada, P.
2013-05-01
Because of the tough robustness requirements in space software development, it is imperative to carry out verification tasks at a very early development stage to ensure that the implemented exception mechanisms work properly. All this should be done long before the real hardware is available. But even when real hardware is available, the verification of software fault tolerance mechanisms can be difficult, since real faulty situations must be brought about systematically and artificially, which can be impossible on real hardware. To solve this problem, the Alcala Space Research Group (SRG) has developed a LEON2 virtual platform (Leon2ViP) with fault injection capabilities. This way it is possible to run the exact same target binary software as runs on the physical system in a more controlled and deterministic environment, allowing stricter requirements verification. Leon2ViP enables unmanned and tightly focused fault injection campaigns, not possible otherwise, in order to expose and diagnose flaws in the software implementation early. Furthermore, the use of a virtual hardware-in-the-loop approach makes it possible to carry out preliminary integration tests with the spacecraft emulator or the sensors. The use of Leon2ViP has meant a significant improvement, in both time and cost, in the development and verification processes of the Instrument Control Unit boot software on board Solar Orbiter's Energetic Particle Detector.
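Leon2ViP's actual interface is not described in the abstract; as a generic, hedged illustration of the kind of software-implemented fault injection campaign such a virtual platform enables, one might flip a single bit in a copy of the emulated memory image per experiment and record the verdict. The helper names and the `run_simulation` callback below are assumptions for illustration only:

```python
import random

def inject_bit_flip(memory: bytearray, address: int, bit: int) -> None:
    """Flip a single bit in an emulated memory image (hypothetical campaign helper)."""
    memory[address] ^= (1 << bit)

def run_campaign(run_simulation, memory_image: bytearray, n_faults: int, seed: int = 0):
    """Run n_faults experiments, each with one random single-bit fault injected.

    `run_simulation(memory)` is assumed to execute the target binary against the
    supplied memory image and return a verdict such as 'detected', 'silent' or 'crash'.
    """
    rng = random.Random(seed)
    results = []
    for _ in range(n_faults):
        faulty = bytearray(memory_image)  # fresh copy per experiment, so faults do not accumulate
        addr = rng.randrange(len(faulty))
        bit = rng.randrange(8)
        inject_bit_flip(faulty, addr, bit)
        results.append((addr, bit, run_simulation(faulty)))
    return results
```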
Data Management Working Group report
NASA Technical Reports Server (NTRS)
Filardo, Edward J.; Smith, David B.
1986-01-01
The current flight qualification program lags technology insertion by 6 to 10 years. The objective is to develop an integrated software engineering and development environment assisted by expert system technology. An operating system needs to be developed that is portable to the on-board computers of the year 2000. The use of Ada versus a high-order language, fault tolerance, fiber-optic networks, communication protocols, and security are also examined and outlined.
A strategy for electronic dissemination of NASA Langley technical publications
NASA Technical Reports Server (NTRS)
Roper, Donna G.; Mccaskill, Mary K.; Holland, Scott D.; Walsh, Joanne L.; Nelson, Michael L.; Adkins, Susan L.; Ambur, Manjula Y.; Campbell, Bryan A.
1994-01-01
To demonstrate NASA Langley Research Center's relevance and to transfer technology to external customers in a timely and efficient manner, Langley has formed a working group to study and recommend a course of action for the electronic dissemination of technical reports (EDTR). The working group identified electronic report requirements (e.g., accessibility, file format, search requirements) of customers in U.S. industry through numerous site visits and personal contacts. Internal surveys were also used to determine commonalities in document preparation methods. From these surveys, a set of requirements for an electronic dissemination system was developed. Two candidate systems were identified and evaluated against the set of requirements: the Full-Text Electronic Documents System (FEDS), which is a full-text retrieval system based on the commercial document management package Interleaf, and the Langley Technical Report Server (LTRS), which is a Langley-developed system based on the publicly available World Wide Web (WWW) software system. Factors that led to the selection of LTRS as the vehicle for electronic dissemination included searching and viewing capability, current system operability, and client software availability for multiple platforms at no cost to industry. This report includes the survey results, evaluations, a description of the LTRS architecture, recommended policy statement, and suggestions for future implementations.
CONTACT: An Air Force technical report on military satellite control technology
NASA Astrophysics Data System (ADS)
Weakley, Christopher K.
1993-07-01
This technical report focuses on Military Satellite Control Technologies and their application to the Air Force Satellite Control Network (AFSCN). This report is a compilation of articles that provide an overview of the AFSCN and the Advanced Technology Program, and discusses relevant technical issues and developments applicable to the AFSCN. Among the topics covered are articles on Future Technology Projections; Future AFSCN Topologies; Modeling of the AFSCN; Wide Area Communications Technology Evolution; Automating AFSCN Resource Scheduling; Health & Status Monitoring at Remote Tracking Stations; Software Metrics and Tools for Measuring AFSCN Software Performance; Human-Computer Interface Working Group; Trusted Systems Workshop; and the University Technical Interaction Program. In addition, Key Technology Area points of contact are listed in the report.
NASA's MERBoard: An Interactive Collaborative Workspace Platform. Chapter 4
NASA Technical Reports Server (NTRS)
Trimble, Jay; Wales, Roxana; Gossweiler, Rich
2003-01-01
This chapter describes the ongoing process by which a multidisciplinary group at NASA's Ames Research Center is designing and implementing a large interactive work surface called the MERBoard Collaborative Workspace. A MERBoard system involves several distributed, large, touch-enabled plasma display systems with custom MERBoard software. A centralized server and database back the system. We are continually tuning MERBoard to support over two hundred scientists and engineers during the surface operations of the Mars Exploration Rover missions. These scientists and engineers come from various disciplines and are working both in small and large groups over a span of space and time. We describe the multidisciplinary, human-centered process by which this MERBoard system is being designed, the usage patterns and social interactions that we have observed, and issues we are currently facing.
Mato Abad, Virginia; García-Polo, Pablo; O'Daly, Owen; Hernández-Tamames, Juan Antonio; Zelaya, Fernando
2016-04-01
The method of Arterial Spin Labeling (ASL) has experienced a significant rise in its application to functional imaging, since it is the only technique capable of measuring blood perfusion in a truly non-invasive manner. Currently, there are no commercial packages for processing ASL data and there is no recognized standard for normalizing ASL data to a common frame of reference. This work describes a new Automated Software for ASL Processing (ASAP) that can automatically process several ASL datasets. ASAP includes functions for all stages of image pre-processing: quantification, skull-stripping, co-registration, partial volume correction and normalization. To assess the applicability and validity of the toolbox, this work shows its application in the study of hypoperfusion in a sample of healthy subjects at risk of progressing to Alzheimer's disease. ASAP requires limited user intervention, minimizing the possibility of random and systematic errors, and produces cerebral blood flow maps that are ready for statistical group analysis. The software is easy to operate and results in excellent quality of spatial normalization. The results found in this evaluation study are consistent with previous studies that find decreased perfusion in Alzheimer's patients in similar regions and demonstrate the applicability of ASAP. Copyright © 2015 Elsevier Inc. All rights reserved.
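The abstract does not state which labeling scheme or quantification model ASAP implements; purely as an illustration of what the "quantification" stage of such a pipeline computes, a hedged sketch of the widely used single-compartment pseudo-continuous ASL formula, with typical literature parameter values, might look like this (all parameter values and names are illustrative assumptions, not ASAP's defaults):

```python
import numpy as np

def cbf_pcasl(delta_m, m0, pld_s=1.8, tau_s=1.8,
              t1_blood_s=1.65, alpha=0.85, blood_brain_lambda=0.9):
    """Voxel-wise CBF estimate in ml/100 g/min from a single-compartment pCASL model.

    delta_m : control - label difference image (numpy array)
    m0      : equilibrium magnetization image
    pld_s   : post-labeling delay; tau_s : label duration; alpha : labeling efficiency
    """
    num = 6000.0 * blood_brain_lambda * delta_m * np.exp(pld_s / t1_blood_s)
    den = 2.0 * alpha * t1_blood_s * m0 * (1.0 - np.exp(-tau_s / t1_blood_s))
    return num / den
```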
A Mechanized Decision Support System for Academic Scheduling.
1986-03-01
…an operational system called software. The first step in the development phase is design. Designers distribute software control by factoring the data… Subject terms: scheduling, decision support system, software design. …scheduling system. It will also examine software-design techniques to identify the most appropriate methodology for this problem. Chapter 3 will…
Simulink/PARS Integration Support
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vacaliuc, B.; Nakhaee, N.
2013-12-18
The state of the art for signal processor hardware has far out-paced the development tools for placing applications on that hardware. In addition, signal processors are available in a variety of architectures, each uniquely capable of handling specific types of signal processing efficiently. With these processors becoming smaller and demanding less power, it has become possible to group multiple processors, a heterogeneous set of processors, into single systems. Different portions of the desired problem set can be assigned to different processor types as appropriate. As software development tools do not keep pace with these processors, especially when multiple processors of different types are used, a method is needed to enable software code portability among multiple processors and multiple types of processors along with their respective software environments. Sundance DSP, Inc. has developed a software toolkit called “PARS”, whose objective is to provide a framework that uses suites of tools provided by different vendors, along with modeling tools and a real time operating system, to build an application that spans different processor types. The software language used to express the behavior of the system is a very high level modeling language, “Simulink”, a MathWorks product. ORNL has used this toolkit to effectively implement several deliverables. This CRADA describes this collaboration between ORNL and Sundance DSP, Inc.
Control software and electronics architecture design in the framework of the E-ELT instrumentation
NASA Astrophysics Data System (ADS)
Di Marcantonio, P.; Coretti, I.; Cirami, R.; Comari, M.; Santin, P.; Pucillo, M.
2010-07-01
During the last years the European Southern Observatory (ESO), in collaboration with other European astronomical institutes, has started several feasibility studies for the E-ELT (European Extremely Large Telescope) instrumentation and post-focal adaptive optics. The goal is to create a flexible suite of instruments to deal with the wide variety of scientific questions astronomers would like to see solved in the coming decades. In this framework the INAF-Astronomical Observatory of Trieste (INAF-AOTs) is currently responsible for carrying out the analysis and the preliminary study of the architecture of the electronics and control software of three instruments: CODEX (control software and electronics) and OPTIMOS-EVE/OPTIMOS-DIORAMAS (control software). To cope with the increased complexity and the new requirements for stability, precision, real-time latency and communications among sub-systems imposed by these instruments, new solutions have been investigated by our group. In this paper we present the proposed software and electronics architecture, based on a distributed common framework centered on the Component/Container model that uses OPC Unified Architecture as a standard layer to communicate with COTS components from three different vendors. We describe three working prototypes that have been set up in our laboratory and discuss their performance, integration complexity and ease of deployment.
Caesy: A software tool for computer-aided engineering
NASA Technical Reports Server (NTRS)
Wette, Matt
1993-01-01
A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation as they currently stand are presented along with a description of work in progress and areas of future work.
The Transition to a Many-core World
NASA Astrophysics Data System (ADS)
Mattson, T. G.
2012-12-01
The need to increase performance within a fixed energy budget has pushed the computer industry to many-core processors. This is grounded in the physics of computing and is not a trend that will just go away. It is hard to overestimate the profound impact of many-core processors on software developers. Virtually every facet of the software development process will need to change to adapt to these new processors. In this talk, we will look at many-core hardware and consider its evolution from a perspective grounded in the CPU. We will show that the number of cores will inevitably increase, but in addition, a quest to maximize performance per watt will push these cores to be heterogeneous. We will show that the inevitable result of these changes is a computing landscape where the distinction between the CPU and the GPU is blurred. We will then consider the much more pressing problem of software in a many-core world. Writing software for heterogeneous many-core processors is well beyond the ability of current programmers. One solution is to support a software development process where programmer teams are split into two distinct groups: a large group of domain-expert productivity programmers and a much smaller team of computer-scientist efficiency programmers. The productivity programmers work in terms of high-level frameworks to express the concurrency in their problems while avoiding any details of how that concurrency is exploited. The second group, the efficiency programmers, map applications expressed in terms of these frameworks onto the target many-core system. In other words, we can solve the many-core software problem by creating a software infrastructure that only requires a small subset of programmers to become master parallel programmers. This is different from the discredited dream of automatic parallelism. Note that productivity programmers still need to define the architecture of their software in a way that exposes the concurrency inherent in their problem. We submit that domain-expert programmers understand "what is concurrent". The parallel programming problem emerges from the complexity of "how that concurrency is utilized" on real hardware. The research described in this talk was carried out in collaboration with the ParLab at UC Berkeley. We use a design pattern language to define the high-level frameworks exposed to domain-expert productivity programmers. We then use tools from the SEJITS project (Selective Embedded Just-In-Time Specializers) to build the software transformation tool chains that turn these framework-oriented designs into highly efficient code. The final ingredient is a software platform to serve as a target for these tools. One such platform is the OpenCL industry standard for programming heterogeneous systems. We will briefly describe OpenCL and show how it provides a vendor-neutral software target for current and future many-core systems, whether CPU-based, GPU-based, or heterogeneous combinations of the two.
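The talk names OpenCL as one vendor-neutral target but shows no code; as a minimal, hedged sketch of what efficiency-layer code on such a platform looks like (here through the pyopencl bindings, with a trivial element-wise kernel standing in for framework-generated code), one might write:

```python
import numpy as np
import pyopencl as cl

a = np.random.rand(1024).astype(np.float32)
b = np.random.rand(1024).astype(np.float32)

ctx = cl.create_some_context()          # picks an available CPU or GPU device
queue = cl.CommandQueue(ctx)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

prg = cl.Program(ctx, """
__kernel void add(__global const float *a, __global const float *b, __global float *out) {
    int gid = get_global_id(0);
    out[gid] = a[gid] + b[gid];
}
""").build()

prg.add(queue, a.shape, None, a_buf, b_buf, out_buf)  # one work-item per element

out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)
assert np.allclose(out, a + b)
```

The same kernel source runs unchanged on CPU or GPU devices, which is the vendor-neutrality point the talk makes.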
Johnson, Kevin B; Ravich, William J; Cowan, John A
2004-09-01
Computer-based software to record histories, physical exams, and progress or procedure notes, known as computer-based documentation (CBD) software, has been touted as an important addition to the electronic health record. The functionality of CBD systems has remained static over the past 30 years, which may have contributed to the limited adoption of these tools. Early users of this technology, who have tried multiple products, may have insight into important features to be considered in next-generation CBD systems. We conducted a cross-sectional, observational study of the clinical working group membership of the American Medical Informatics Association (AMIA) to generate a set of features that might improve adoption of next-generation systems. The study was conducted online over a 4-month period; 57% of the working group members completed the survey. As anticipated, CBD tool use was higher (53%) in this population than in US physician offices. The most common methods of data entry employed keyboard and mouse, with agreement that these modalities worked well. Many respondents had experience with pre-printed data collection forms before interacting with a CBD system. Respondents noted that CBD improved their ability to document large amounts of information, allowed timely sharing of information, enhanced patient care, and improved the exchange of medical information with other clinicians (all P < 0.001). Respondents also noted some important but absent features in CBD, including the ability to add images, get help, and generate billing information. The latest generation of CBD systems is being used successfully by early adopters, who find that these tools confer many advantages over the approaches to documentation that they replaced. These users provide insights that may improve successive generations of CBD tools. Additional surveys of CBD non-users and failed adopters will be necessary to provide other useful insights that can address barriers to the adoption of CBD by less computer-literate physicians.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, G M
2005-05-03
For a number of years I had the pleasure of teaching Testing Seminars all over the world and meeting and learning from others in our field. Over a twelve year period, I always asked the following questions to Software Developers, Test Engineers, and Managers who took my two or three day seminar on Software Testing: 'When was the first time you heard the word test'? 'Where were you when you first heard the word test'? 'Who said the word test'? 'How did the word test make you feel'? Most of the thousands of responses were similar to 'It was my third grade teacher at school, and I felt nervous and afraid'. Now there were a few exceptions like 'It was my third grade teacher, and I was happy and excited to show how smart I was'. But by and large, my informal survey found that 'testing' is a word to which most people attach negative meanings, based on its historical context. So why is this important to those of us in the software development business? Because I have found that a preponderance of software developers do not get real excited about hearing that the software they just wrote is going to be 'tested' by the Test Group. Typical reactions I have heard over the years run from: 'I'm sure there is nothing wrong with the software, so go ahead and test it, better you find defects than our customers' to these extremes: 'There is no need to test my software because there is nothing wrong with it'. 'You are not qualified to test my software because you don't know as much as I do about it'. 'If any Test Engineers come into our office again to test our software we will throw them through the third floor window'. So why is there such a strong negative reaction to testing? It is primitive. It goes back to grade school for many of us. It is a negative word that conjures up negative emotions. In other words, 'test' is a four letter word. How many of us associate 'Joy' with 'Test'? Not many. It is hard for most of us to reprogram associations learned at an early age. So what can we do about it (short of hypnotic therapy for software developers)? Well one concept I have used (and still use) is to not call testing 'testing'. Call it something else. Ever wonder why most of the Independent Software Testing groups are called Software Quality Assurance groups? Now you know. Software Quality Assurance is not such a negatively charged phrase, even though Software Quality Assurance is much more than simply testing. It was a real blessing when the concept of Validation and Verification came about for software. Now I define Validation to mean assuring that the product produced does the right thing (usually what the customer wants it to do), and verification means that the product was built the right way (in accordance with some good design principles and practices). So I have deliberately called the System Test Group the Verification and Validation Group, or V&V Group, as a way of avoiding the negative image problem. I remember once having a conversation with a developer colleague who said, in the heat of battle, that it was fine to V&V his code, just don't test it! Once again V&V includes many things besides testing, but it just doesn't sound like an onerous thing to do to software. In my current job, working at a highly regarded national laboratory with world renowned physicists, I have again encountered the negativity about testing software. Except here they don't take kindly to Software Quality Assurance or Software Verification and Validation either.
After all, software is just a trivial tool to automate algorithms that implement physics models. Testing, SQA, and V&V take time and get in the way of completing ground-breaking science experiments. So I have again had to change the name of software testing to something less negative in the physics world. I found (the hard way) that if I requested more time to do software experimentation, the physicists' resistance melted. And so the conversation continues, 'We have time to run more software experiments. Just don't waste any time testing the software'! In case the concept of not calling testing 'testing' appeals to you, and there may be an opportunity for you to take the sting out of the name at your place of employment, I have compiled a table of things that testing could be called besides 'testing'. Of course we can embellish this by adding some good-sounding prefixes and suffixes also. To come up with alternate names for testing, pick a word from columns A, B, and C in the table below. For instance Unified Acceptance Trials (A2,B7,C3) or Tailored Observational Demonstration (A6,B5,C5) or Agile Criteria Scoring (A3,B8,C8) or Rapid Requirement Proof (A1,B9,C7) or Satisfaction Assurance (B10,C1). You can probably think of some additional combinations appropriate for your industry.
A survey of Canadian medical physicists: software quality assurance of in-house software.
Salomons, Greg J; Kelly, Diane
2015-01-05
This paper reports on a survey of medical physicists who write and use in-house written software as part of their professional work. The goal of the survey was to assess the extent of in-house software usage and the desire or need for related software quality guidelines. The survey contained eight multiple-choice questions, a ranking question, and seven free text questions. The survey was sent to medical physicists associated with cancer centers across Canada. The respondents to the survey expressed interest in having guidelines to help them in their software-related work, but also demonstrated extensive skills in the area of testing, safety, and communication. These existing skills form a basis for medical physicists to establish a set of software quality guidelines.
Research of real-time communication software
NASA Astrophysics Data System (ADS)
Li, Maotang; Guo, Jingbo; Liu, Yuzhong; Li, Jiahong
2003-11-01
Real-time communication has been playing an increasingly important role in our work, our lives, and ocean monitoring. With the rapid progress of computer and communication techniques, as well as the miniaturization of communication systems, adaptable and reliable real-time communication software needs to be developed for ocean monitoring systems. This paper presents research on real-time communication software based on a point-to-point satellite intercommunication system. An object-oriented design method is adopted, and the software can transmit and receive video, audio, and engineering data over a satellite channel. Several software modules have been developed that realize point-to-point satellite intercommunication in the ocean monitoring system. The real-time communication software offers three advantages. First, it increases the operational reliability of the point-to-point satellite intercommunication system. Second, a number of optional parameters can be configured, which greatly increases the flexibility of the system. Third, some hardware is replaced by the real-time communication software, which not only reduces the cost of the system and promotes the miniaturization of the communication system, but also increases the system's agility.
Web-Based Environment for Maintaining Legacy Software
NASA Technical Reports Server (NTRS)
Tigges, Michael; Thompson, Nelson; Orr, Mark; Fox, Richard
2007-01-01
Advanced Tool Integration Environment (ATIE) is the name of both a software system and a Web-based environment created by the system for maintaining an archive of legacy software and of the expertise involved in developing the legacy software. ATIE can also be used in modifying legacy software and developing new software. The information that can be encapsulated in ATIE includes experts' documentation, input and output data of test cases, source code, and compilation scripts. All of this information is available within a common environment and retained in a database for ease of access and recovery by use of powerful search engines. ATIE also accommodates the embedding of supporting software that users require for their work, and even enables access to supporting commercial off-the-shelf (COTS) software within the flow of the expert's work. The flow of work can be captured by saving the sequence of computer programs that the expert uses. A user gains access to ATIE via a Web browser. A modern Web-based graphical user interface promotes efficiency in the retrieval, execution, and modification of legacy code. Thus, ATIE saves time and money in the support of new and pre-existing programs.
Software quality for 1997 - what works and what doesn't?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, C.
1997-11-01
This presentation provides a view of software quality for 1997 - what works and what doesn't. For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to parity with manufacturing quality levels.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Jun
Our group has been working with ANL collaborators on the topic of bridging the gap between parallel file systems and local file systems during the course of this project period. We visited Argonne National Laboratory -- Dr. Robert Ross's group -- for one week in summer 2007. We reviewed our project progress and planned the activities for the coming years 2008-09. The PI met Dr. Robert Ross several times, for example at the HEC FSIO workshop 2008, SC08, and SC10. We explored opportunities to develop a production system by porting our current prototype (SOGP+PVFS) to a new PVFS version. We delivered the SOGP+PVFS code to the ANL PVFS2 group in 2008. We also discussed exploring a potential project on developing new parallel programming models and runtime systems for data-intensive scalable computing (DISC). The methodology is to evolve MPI towards DISC by incorporating some functions of the Google MapReduce parallel programming model. More recently, we have together been exploring how to leverage existing work to perform (1) coordination/aggregation of local I/O operations prior to movement over the WAN, (2) efficient bulk data movement over the WAN, and (3) latency-hiding techniques for latency-intensive operations. Since 2009, we have been applying Hadoop/MapReduce to some HEC applications with LANL scientists John Bent and Salman Habib. Another ongoing effort is to improve checkpoint performance at the I/O forwarding layer for the Roadrunner supercomputer with James Nunez and Gary Grider at LANL. Two senior undergraduates from our research group did summer internships on high-performance file and storage system projects at LANL for three consecutive years starting in 2008. Both of them are now pursuing Ph.D. degrees in our group, will be in the fourth year of the Ph.D. program in Fall 2011, and will go to LANL to advance the two above-mentioned works during this winter break. Since 2009, we have been collaborating with several computer scientists (Gary Grider, John Bent, Parks Fields, James Nunez, Hsing-Bung Chen, etc.) from HPC-5 and with James Ahrens from the Advanced Computing Laboratory at Los Alamos National Laboratory. We hold a weekly conference call and/or video meeting to advance work on two fronts: the hardware/software infrastructure for building large-scale data-intensive clusters, and research publications. Our group members assist in constructing several onsite LANL data-intensive clusters. The two parties have been developing software code and research papers together using resources from both sides.
Evaluating Sustainability Models for Interoperability through Brokering Software
NASA Astrophysics Data System (ADS)
Pearlman, Jay; Benedict, Karl; Best, Mairi; Fyfe, Sue; Jacobs, Cliff; Michener, William; Nativi, Stefano; Powers, Lindsay; Turner, Andrew
2016-04-01
Sustainability of software and research support systems is an element of innovation that is not often discussed. Yet, sustainment is essential if we expect research communities to make the time investment to learn and adopt new technologies. As the Research Data Alliance (RDA) is developing new approaches to interoperability, the question of uptake and sustainability is important. Brokering software sustainability is one of the areas that is being addressed in RDA. The Business Models Team of the Research Data Alliance Brokering Governance Working Group examined several support models proposed to promote the long-term sustainability of brokering middleware. The business model analysis includes examination of funding source, implementation frameworks and challenges, and policy and legal considerations. Results of this comprehensive analysis highlight advantages and disadvantages of the various models with respect to the specific requirements for brokering services. We offer recommendations based on the outcomes of this analysis that suggest that hybrid funding models present the most likely avenue to long term sustainability.
Requirements Engineering in Building Climate Science Software
ERIC Educational Resources Information Center
Batcheller, Archer L.
2011-01-01
Software has an important role in supporting scientific work. This dissertation studies teams that build scientific software, focusing on the way that they determine what the software should do. These requirements engineering processes are investigated through three case studies of climate science software projects. The Earth System Modeling…
Electronic imaging of the human body
NASA Astrophysics Data System (ADS)
Vannier, Michael W.; Yates, Randall E.; Whitestone, Jennifer J.
1992-09-01
The Human Engineering Division of the Armstrong Laboratory (USAF); the Mallinckrodt Institute of Radiology; the Washington University School of Medicine; and the Lister-Hill National Center for Biomedical Communication, National Library of Medicine are sponsoring a working group on electronic imaging of the human body. Electronic imaging of the surface of the human body has been pursued and developed by a number of disciplines including radiology, forensics, surgery, engineering, medical education, and anthropometry. The applications range from reconstructive surgery to computer-aided design (CAD) of protective equipment. Although these areas appear unrelated, they have a great deal of commonality. All the organizations working in this area are faced with the challenges of collecting, reducing, and formatting the data in an efficient and standard manner; storing this data in a computerized database to make it readily accessible; and developing software applications that can visualize, manipulate, and analyze the data. This working group is being established to encourage effective use of the resources of all the various groups and disciplines involved in electronic imaging of the human body surface by providing a forum for discussing progress and challenges with these types of data.
Student perceptions of drill-and-practice mathematics software in primary education
NASA Astrophysics Data System (ADS)
Kuiper, Els; de Pater-Sneep, Martie
2014-06-01
Drill-and-practice mathematics software offers teachers a relatively simple way to use technology in the classroom. One of the reasons to use the software may be that it motivates children, working on the computer being more "fun" than doing regular school work. However, students' own perceptions of such software are seldom studied. This article reports on a study on the opinions of Grade 5 and 6 students regarding two mathematics drill-and-practice software packages. In total, 329 students from ten Dutch primary schools took part in the study. The results show that a majority of the students preferred to work in their exercise book, for various reasons. Especially the rigid structure of the software is mentioned as a negative aspect by students. The elaborate arguments students used illustrate the importance of taking their opinions into account already at the primary level. Students' perceptions also show that the idea of ICT as naturally motivating for students may need modification.
NASA Technical Reports Server (NTRS)
Wallace, Robert
1986-01-01
A major impediment to a systematic attack on Ada software reusability is the lack of an effective taxonomy for software component functions. The scope of all possible applications of Ada software is considered too great to allow the practical development of a working taxonomy. Instead, for the purposes herein, the scope of Ada software application is limited to device and subsystem control in real-time embedded systems. A functional approach is taken in constructing the taxonomy tree for the identified Ada domain. The use of modular software functions as a starting point fits well with the object-oriented programming philosophy of Ada. Examples of the types of functions represented within the working taxonomy are real-time kernels, interrupt service routines, synchronization and message passing, data conversion, digital filtering and signal conditioning, and device control. The constructed taxonomy is proposed as a framework from which a needs analysis can be performed to reveal voids in current Ada real-time embedded programming efforts for the Space Station.
ERIC Educational Resources Information Center
Sawtelle, Sara
2008-01-01
Proving that technology works is not as simple as proving that a new vendor for art supplies is more cost effective. Technology effectiveness requires both the right software and the right implementation. Just having the software is not enough. Proper planning, training, leadership, support, pedagogy, and software use--along with many other…
Giancarlo, R; Scaturro, D; Utro, F
2015-02-01
The prediction of the number of clusters in a dataset, in particular microarrays, is a fundamental task in biological data analysis, usually performed via validation measures. Unfortunately, it has received very little attention and in fact there is a growing need for software tools/libraries dedicated to it. Here we present ValWorkBench, a software library consisting of eleven well known validation measures, together with novel heuristic approximations for some of them. The main objective of this paper is to provide the interested researcher with the full software documentation of an open source cluster validation platform having the main features of being easily extendible in a homogeneous way and of offering software components that can be readily re-used. Consequently, the focus of the presentation is on the architecture of the library, since it provides an essential map that can be used to access the full software documentation, which is available at the supplementary material website [1]. The mentioned main features of ValWorkBench are also discussed and exemplified, with emphasis on software abstraction design and re-usability. A comparison with existing cluster validation software libraries, mainly in terms of the mentioned features, is also offered. It suggests that ValWorkBench is a much needed contribution to the microarray software development/algorithm engineering community. For completeness, it is important to mention that previous accurate algorithmic experimental analysis of the relative merits of each of the implemented measures [19,23,25], carried out specifically on microarray data, gives useful insights on the effectiveness of ValWorkBench for cluster validation to researchers in the microarray community interested in its use for the mentioned task. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
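ValWorkBench's own API is not shown in the abstract; as an illustration of the general pattern such internal validation measures follow (score each candidate number of clusters, then pick the best), a minimal Python sketch using scikit-learn's silhouette width is given below. Silhouette may or may not be among ValWorkBench's eleven measures; the sketch only shows the scoring pattern:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def best_k_by_silhouette(X, k_range=range(2, 11), random_state=0):
    """Return the candidate k with the highest mean silhouette width, plus all scores."""
    scores = {}
    for k in k_range:
        labels = KMeans(n_clusters=k, n_init=10, random_state=random_state).fit_predict(X)
        scores[k] = silhouette_score(X, labels)
    return max(scores, key=scores.get), scores

# Example with a synthetic "expression matrix" of 100 samples x 20 features
X = np.random.rand(100, 20)
k_best, all_scores = best_k_by_silhouette(X)
print(k_best, all_scores)
```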
Practical Software Measurement: Measuring for Process Management and Improvement,
1997-04-01
Ishikawa, Kaoru. Guide to Quality Control, Second Revised Edition. White Plains, N.Y.: UNIPUB-Kraus International Publications, 1986. CMU/SEI-97… begin, you may want to assemble a group of people who work within the process to brainstorm possible reasons for the unusual behavior. Ishikawa charts… control limits and center line. • Cause-and-effect diagrams (also known as Ishikawa charts) allow you to probe for, map, and prioritize a set of factors
Virtualization Technology for System of Systems Test and Evaluation
2012-06-01
Peterson, Tillman, & Hatfield (1972) outlined the capabilities of virtualization in the early days of VM with some guiding principles. The following… Sheikh, based on the work of Balci (1994, 1995), and Balci et al. (1996), seeks to organize types of tests and to align requirements to the appropriate… Verification, validation, and testing in software engineering (pp. 155–184). Hershey, PA: Idea Group. Adair, R. J., Bayles, R. U., Comeau, L. W
Association between the Type of Workplace and Lung Function in Copper Miners
Gruszczyński, Leszek; Wojakowska, Anna; Ścieszka, Marek; Turczyn, Barbara; Schmidt, Edward
2016-01-01
The aim of the analysis was to retrospectively assess changes in lung function in copper miners depending on the type of workplace. In groups of 225 operators, 188 welders, and 475 representatives of other jobs, spirometry was performed at the start of employment and subsequently after 10, 20, and 25 years of work. Spirometry Longitudinal Data Analysis software was used to estimate changes in group means for FEV1 and FVC. Multiple linear regression analysis was used to assess the association between workplace and lung function. Lung function assessed on the basis of the calculated longitudinal FEV1 (FVC) decline was similar in all studied groups. However, the multiple linear regression model used in the cross-sectional analysis revealed an association between workplace and lung function. In the group of welders, FEF75 was lower in comparison to operators and other miners as early as after 10 years of work. Simultaneously, in smoking welders, the FEV1/FVC ratio was lower than in nonsmokers (p < 0.05). Interactions between type of workplace and smoking (p < 0.05) were shown in their effect on FVC, FEV1, PEF, and FEF50. Among underground-working copper miners, the group of smoking welders is especially threatened by impairment of lung ventilatory function. PMID:27274987
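The abstract does not give the exact regression specification; as a hedged sketch of the kind of cross-sectional multiple linear regression described (lung function against workplace, with a workplace-by-smoking interaction and typical covariates), using an assumed data file and hypothetical column names:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical dataset: one row per miner, with columns
# fev1 (L), workplace ('operator'/'welder'/'other'), smoker (0/1), age (yr), height_cm
df = pd.read_csv("miners_spirometry.csv")  # illustrative file name

model = smf.ols("fev1 ~ C(workplace) * smoker + age + height_cm", data=df).fit()
print(model.summary())  # coefficients for workplace, smoking, and their interaction
```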
Internet teleconferencing as a clinical tool for anesthesiologists.
Ruskin, K J; Palmer, T E; Hagenouw, R R; Lack, A; Dunnill, R
1998-04-01
Internet teleconferencing software can be used to hold "virtual" meetings, during which participants around the world can share ideas. A core group of anesthetic medical practitioners, largely consisting of the Society for Advanced Telecommunications in Anesthesia (SATA), has begun to hold regularly scheduled "virtual grand rounds." This paper examines currently available software and offers impressions of our own early experiences with this technology. Two teleconferencing systems have been used: White Pine Software CU-SeeMe and Microsoft NetMeeting. While both provided acceptable results, each had specific advantages and disadvantages. CU-SeeMe is easier to use when conferences include more than two participants. NetMeeting provides higher quality audio and video signals under crowded network conditions, and is better for conferences with only two participants. While some effort is necessary to get these teleconferencing systems to work well, we have been using desktop conferencing for six months to hold virtual Internet meetings. The sound and video images produced by Internet teleconferencing software are inferior to dedicated point-to-point teleconferencing systems. However, low cost, wide availability, and ease of use make this technology a potentially valuable tool for clinicians and researchers.
The ALMA software architecture
NASA Astrophysics Data System (ADS)
Schwarz, Joseph; Farris, Allen; Sommer, Heiko
2004-09-01
The software for the Atacama Large Millimeter Array (ALMA) is being developed by many institutes on two continents. The software itself will function in a distributed environment, from the 0.5-14 km baselines that separate antennas to the larger distances that separate the array site at the Llano de Chajnantor in Chile from the operations and user support facilities in Chile, North America and Europe. Distributed development demands 1) interfaces that allow separated groups to work with minimal dependence on their counterparts at other locations; and 2) a common architecture to minimize duplication and ensure that developers can always perform similar tasks in a similar way. The Container/Component model provides a blueprint for the separation of functional from technical concerns: application developers concentrate on implementing functionality in Components, which depend on Containers to provide them with services such as access to remote resources, transparent serialization of entity objects to XML, logging, error handling and security. Early system integrations have verified that this architecture is sound and that developers can successfully exploit its features. The Containers and their services are provided by a system-oriented development team as part of the ALMA Common Software (ACS), middleware that is based on CORBA.
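The separation of functional from technical concerns that the Container/Component model prescribes can be shown in a few lines. The sketch below is a hedged, language-agnostic illustration only, not the actual ACS/CORBA API: the container owns the technical services (here just logging), and the component receives them by injection and contains only functional code.

```python
import logging
from dataclasses import dataclass, field

logging.basicConfig(level=logging.INFO)

@dataclass
class Container:
    """Hypothetical container: supplies technical services so components stay purely functional."""
    name: str
    logger: logging.Logger = field(default=None)

    def __post_init__(self):
        self.logger = self.logger or logging.getLogger(self.name)
        self._components = {}

    def activate(self, component_cls, name):
        comp = component_cls(name=name, services=self)  # inject the container's services
        self._components[name] = comp
        self.logger.info("activated component %s", name)
        return comp

class AntennaControl:
    """Hypothetical component: functional code only, no logging or remote-access plumbing of its own."""
    def __init__(self, name, services):
        self.name, self.services = name, services

    def point(self, az_deg, el_deg):
        self.services.logger.info("%s pointing to az=%.2f el=%.2f", self.name, az_deg, el_deg)
        return {"az": az_deg, "el": el_deg}

container = Container("control-container")
antenna = container.activate(AntennaControl, "DV01")
antenna.point(180.0, 45.0)
```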
A Method for Assessing the Accuracy of a Photogrammetry System for Precision Deployable Structures
NASA Technical Reports Server (NTRS)
Moore, Ashley
2005-01-01
The measurement techniques used to validate analytical models of large deployable structures are an integral part of the technology development process and must be precise and accurate. Photogrammetry and videogrammetry are viable, accurate, and unobtrusive methods for measuring such large structures. Photogrammetry uses software to determine the three-dimensional position of a target from camera images. Videogrammetry is based on the same principle, except that a series of timed images is analyzed. This work addresses the accuracy of a digital photogrammetry system used for measurement of large, deployable space structures at JPL. First, photogrammetry tests are performed on a precision space truss test article, and the images are processed using Photomodeler software. The accuracy of the Photomodeler results is determined through comparison with measurements of the test article taken by an external testing group using the VSTARS photogrammetry system. These two measurements are then compared with results from the Australis photogrammetry software, which simulates a measurement test to predict its accuracy. The software is then used to study how particular factors, such as camera resolution and placement, affect system accuracy, to help design the setup for the videogrammetry system that will offer the highest level of accuracy for measurement of deploying structures.
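The internals of Photomodeler, VSTARS and Australis are not described in the abstract; as a hedged illustration of the basic multi-view principle all such systems build on, the sketch below triangulates one target from two calibrated camera views by linear (DLT-style) least squares. It is a minimal textbook form, not any of the named packages' algorithms:

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear triangulation of one target point from two calibrated camera views.

    P1, P2 : 3x4 camera projection matrices; uv1, uv2 : (u, v) image coordinates.
    Returns the estimated 3-D point in the world frame.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    A = np.vstack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)   # least-squares solution is the last right singular vector
    X = Vt[-1]
    return X[:3] / X[3]           # de-homogenize
```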
Bootstrapping Methods Applied for Simulating Laboratory Works
ERIC Educational Resources Information Center
Prodan, Augustin; Campean, Remus
2005-01-01
Purpose: The aim of this work is to implement bootstrapping methods into software tools, based on Java. Design/methodology/approach: This paper presents a category of software e-tools aimed at simulating laboratory works and experiments. Findings: Both students and teaching staff use traditional statistical methods to infer the truth from sample…
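The paper's e-tools are Java-based; as a language-neutral, hedged illustration of the resampling idea they rely on, a percentile bootstrap for the mean of a simulated laboratory measurement can be written in a few lines (the data below are synthetic):

```python
import numpy as np

def bootstrap_ci(sample, stat=np.mean, n_boot=10_000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for a statistic of one sample."""
    rng = np.random.default_rng(seed)
    sample = np.asarray(sample, dtype=float)
    boot = np.array([stat(rng.choice(sample, size=sample.size, replace=True))
                     for _ in range(n_boot)])
    return np.quantile(boot, [alpha / 2, 1 - alpha / 2])

# Example: interval for the mean of 25 simulated titration results in a virtual lab exercise
measurements = np.random.normal(loc=0.102, scale=0.004, size=25)
print(bootstrap_ci(measurements))
```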
NASA Astrophysics Data System (ADS)
Orngreen, Rikke; Clemmensen, Torkil; Pejtersen, Annelise Mark
The boundaries and work processes of virtual team interaction are changing, from a tool and stand-alone application orientation to the use of multiple generic platforms chosen and redesigned for the specific context. These platforms are often designed at the same time both by professional software developers and by the individual members of the virtual teams, rather than determined at a single organizational level. There may be no impact of the technology per se on individuals, groups or organizations, as the technology for virtual teams rather enhances situation ambiguity and disrupts existing task-artifact cycles. This ambiguous situation calls for new methods of empirical work analysis and interaction design that can help us understand how organizations, teams and individuals learn to organize, design and work in virtual teams in various networked contexts.
Development of a phantom to test fully automated breast density software - A work in progress.
Waade, G G; Hofvind, S; Thompson, J D; Highnam, R; Hogg, P
2017-02-01
Mammographic density (MD) is an independent risk factor for breast cancer and may have a future role in stratified screening. Automated software can estimate MD, but the relationship between breast thickness reduction and MD is not fully understood. Our aim is to develop a deformable breast phantom to assess automated density software and the impact of breast thickness reduction on MD. Several different configurations of poly(vinyl alcohol) (PVAL) phantoms were created. Three methods were used to estimate their density. Raw mammographic image data were processed using Volpara to estimate volumetric breast density (VBD%); Hounsfield units (HU) were measured on CT images; and physical density (g/cm3) was calculated using a formula involving mass and volume. Phantom volume versus contact area and phantom volume versus phantom thickness were compared to values for real breasts. Volpara recognized all deformable phantoms as female breasts. However, reducing the phantom thickness caused a change in phantom density, and the phantoms were not able to tolerate the same level of compression and thickness reduction experienced by female breasts during mammography. Our results are promising, as all phantoms yielded valid data for automated breast density measurement. Further work should be conducted on PVAL and other materials to produce deformable phantoms that mimic female breast structure and density and that can be compressed to the same level as female breasts. We are the first group to have produced deformable phantoms that are recognized as breasts by Volpara software. Copyright © 2016 The College of Radiographers. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-02
... DEPARTMENT OF LABOR Employment and Training Administration [TA-W-74,554] International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA, San Jose, CA; Notice of... determination of the TAA petition filed on behalf of workers at International Business Machines (IBM), Software...
The effect of advertising in clinical software on general practitioners' prescribing behaviour.
Henderson, Joan; Miller, Graeme; Pan, Ying; Britt, Helena
2008-01-07
To assess the effect of pharmaceutical advertising embedded in clinical software on the prescribing behaviour of general practitioners. Secondary analysis of data from a random sample of 1336 Australian GPs who participated in Bettering the Evaluation and Care of Health, a national continuous cross-sectional survey of general practice activity, between November 2003 and March 2005. The prescribing behaviour of participants who used the advertising software was compared with that of participants who did not, for seven pharmaceutical products advertised continually throughout the study period. Prescription for advertised product as a proportion (%) of prescriptions for all pharmaceutical products in the same generic class or group. GP age, practice location, accreditation status, patient bulk-billing status and hours worked were significantly associated (P < 0.05) with use of advertising software. We found no significant differences, either before or after adjustment for these confounders, in the prescribing rate of Lipitor (adjusted odds ratio [AOR], 0.90; P = 0.26); Micardis (AOR, 0.98; P = 0.91); Mobic (AOR, 1.02; P = 0.89); Norvasc (AOR, 1.02; P = 0.91); Natrilix (AOR, 0.80; P = 0.32); or Zanidip (AOR, 0.88; P = 0.47). GPs using advertising software prescribed Nexium significantly less often than those not using advertising software (AOR, 0.78; P = 0.02). When all advertised products were combined and compared with products that were not advertised, no difference in the overall prescribing behaviour was demonstrated (AOR, 0.96; P = 0.42). Exposure to advertisements in clinical software has little influence on the prescribing behaviour of GPs.
Digital Geological Mapping for Earth Science Students
NASA Astrophysics Data System (ADS)
England, Richard; Smith, Sally; Tate, Nick; Jordan, Colm
2010-05-01
This SPLINT (SPatial Literacy IN Teaching) supported project is developing pedagogies for the introduction of the teaching of digital geological mapping to Earth Science students. Traditionally, students are taught to make geological maps on a paper basemap with a notebook to record their observations. Learning to use a tablet PC with GIS-based software for mapping and data recording requires emphasis on training staff and students in specific GIS and IT skills, and beneficial adjustments to the way in which geological data are recorded in the field. A set of learning and teaching materials is under development to support this learning process. Following the release of the British Geological Survey's Sigma software, we have been developing generic methodologies for the introduction of digital geological mapping to students that already have experience of mapping by traditional means. The teaching materials introduce the software to the students through a series of structured exercises. The students learn the operation of the software in the laboratory by entering existing observations, preferably data that they have collected. Through this the students benefit from being able to reflect on their previous work, consider how it might be improved and plan new work. Following this they begin fieldwork in small groups using both methods simultaneously. They are able to practise what they have learnt in the classroom and review the differences, advantages and disadvantages of the two methods, while adding to the work that has already been completed. Once the field exercises are completed, students use the data that they have collected in the production of high-quality map products and are introduced to the use of integrated digital databases, which they learn to search and extract information from. The relatively recent development of the technologies which underpin digital mapping also means that many academic staff require training before they are able to deliver the course materials. Consequently, a set of staff training materials is being developed in parallel to those for the students. These focus on the operation of the software and an introduction to the structure of the exercises. The presentation will review the teaching exercises and student and staff responses to their introduction.
Portability scenarios for intelligent robotic control agent software
NASA Astrophysics Data System (ADS)
Straub, Jeremy
2014-06-01
Portability scenarios are critical in ensuring that a piece of AI control software will run effectively across the collection of craft that it is required to control. This paper presents scenarios for control software that is designed to control multiple craft with heterogeneous movement and functional characteristics. For each prospective target-craft type, its capabilities, mission function, location, communications capabilities and power profile are presented, and performance characteristics are reviewed. This work will inform future decision making related to software capabilities, hardware control capabilities and processing requirements.
Tisdall, M Dylan; Reuter, Martin; Qureshi, Abid; Buckner, Randy L; Fischl, Bruce; van der Kouwe, André J W
2016-02-15
Recent work has demonstrated that subject motion produces systematic biases in the metrics computed by widely used morphometry software packages, even when the motion is too small to produce noticeable image artifacts. In the common situation where the control population exhibits different behaviors in the scanner when compared to the experimental population, these systematic measurement biases may produce significant confounds for between-group analyses, leading to erroneous conclusions about group differences. While previous work has shown that prospective motion correction can improve perceived image quality, here we demonstrate that, in healthy subjects performing a variety of directed motions, the use of the volumetric navigator (vNav) prospective motion correction system significantly reduces the motion-induced bias and variance in morphometry.
Toward Stronger Ties: The AAS Working Group on Professional-Amateur Collaboration
NASA Astrophysics Data System (ADS)
Beatty, J. K.; White, J. C.
2004-05-01
Experienced amateur astronomers represent a unique resource for their professional counterparts. Many knowledgeable amateurs now have telescopes in the 0.2- to 0.5-m class equipped with high-grade CCDs and software. To foster stronger ties between these observers and astronomical researchers, the AAS Council established a Working Group for Professional-Amateur Collaboration (WGPAC) during the Society's 193rd meeting in January 1999. Initially given a five-year charter, the WGPAC was made permanent at the 202nd Council meeting in May 2003. Since its creation the WGPAC has coordinated its activities with major amateur-astronomy organizations, sponsored a tutorial workshop at the annual meeting of the Astronomical League, laid the groundwork for a national registry of highly qualified amateur observers, and promoted pro-am collaborations through articles in the AAS Newsletter and leading amateur-astronomy publications.
Software validation applied to spreadsheets used in laboratories working under ISO/IEC 17025
NASA Astrophysics Data System (ADS)
Banegas, J. M.; Orué, M. W.
2016-07-01
Several documents deal with software validation. Nevertheless, most are too complex to be applied to validate spreadsheets - surely the most used software in laboratories working under ISO/IEC 17025. The method proposed in this work is intended to be directly applied to validate spreadsheets. It includes a systematic way to document requirements, operational aspects regarding validation, and a simple method to keep records of validation results and modification history. The method is currently in use in an accredited calibration laboratory, where it has proved to be practical and efficient.
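A single validation check of the kind described might compare a spreadsheet's cached result against an independent reference calculation. The following is a minimal sketch assuming the openpyxl library and a hypothetical workbook layout; it is an illustration, not the paper's method.

```python
# Minimal sketch of one spreadsheet validation check, assuming a hypothetical
# workbook "uncertainty.xlsx" in which column B holds repeated measurements
# and cell D2 holds the spreadsheet's computed sample standard deviation.
import math
from openpyxl import load_workbook

wb = load_workbook("uncertainty.xlsx", data_only=True)  # read cached results
ws = wb["Sheet1"]

values = [row[0] for row in ws.iter_rows(min_row=2, min_col=2, max_col=2,
                                         values_only=True) if row[0] is not None]

# Independent reference calculation (sample standard deviation).
mean = sum(values) / len(values)
ref_sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (len(values) - 1))

sheet_sd = ws["D2"].value
tolerance = 1e-9
status = "PASS" if abs(sheet_sd - ref_sd) <= tolerance else "FAIL"
print(f"spreadsheet SD = {sheet_sd}, reference SD = {ref_sd:.12g} -> {status}")
```

Each such check, its tolerance, and its outcome would then be recorded as part of the validation history the method calls for.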
[Computer-assisted phacoemulsification for hard cataracts].
Zemba, M; Papadatu, Adriana-Camelia; Sîrbu, Laura-Nicoleta; Avram, Corina
2012-01-01
To evaluate the efficiency of new torsional phacoemulsification software (the Ozil IP system) in hard nucleus cataract extraction, 45 eyes with hard senile cataract (degree III and IV) underwent phacoemulsification performed by the same surgeon, using the same technique (stop and chop). The Infiniti (Alcon) platform was used, with Ozil IP software and a Kelman mini-flared 45-degree phaco tip. The nucleus was split into two halves; the first half was phacoemulsified with IP on (group 1) and the second half with IP off (group 2). For every group we measured: cumulative dissipated energy (CDE), the number of tip occlusions that needed manual desobstruction, and the amount of balanced salt solution (BSS) used. The mean CDE was the same in group 1 and in group 2 (between 6.2 and 14.9). The incidence of occlusions that needed manual desobstruction was lower in group 1 (5 times) than in group 2 (13 times). Group 2 used more BSS than group 1. The new torsional software (IP system) significantly decreased occlusion time and balanced salt solution use compared with standard torsional software, particularly with denser cataracts.
The Future of Earthquake Relocation Tools
NASA Astrophysics Data System (ADS)
Lecocq, T.; Caudron, C.
2010-12-01
Many scientists around the world use earthquake relocation software for their research. Some use well-known software like HYPODD or COMPLOC, while others use their own algorithms and codes. Often, beginners struggle to get one tool running or to properly configure input parameters. This poster will document debates taking place during the meeting, for example addressing questions like "Which program for which application?"; "Standardized in/outs?"; "Tectonic / Volcanic / Other?"; "All programs inside one single Super-Package?"; "Common/base bibliography for the relocation beginner?"; "Continuous or layered velocity model?"; etc. We will also present the scheme of a Super-Package we are working on, grouping HYPODD [Waldhauser 2001], COMPLOC [Lin & Shearer 2006] and LOTOS [Koulakov 2009], allowing standard in/outs for the three programs and thus the comparison of their outputs.
NASA Technical Reports Server (NTRS)
Briones, Janette C.; Handler, Louis M.; Hall, Steve C.; Reinhart, Richard C.; Kacpura, Thomas J.
2009-01-01
The Space Telecommunication Radio System (STRS) standard is a Software Defined Radio (SDR) architecture standard developed by NASA. The goal of STRS is to reduce NASA's dependence on custom, proprietary architectures with unique and varying interfaces and hardware, and to support reuse of waveforms across platforms. The STRS project worked with members of the Object Management Group (OMG), the Software Defined Radio Forum, and industry partners to leverage existing standards and knowledge. This collaboration included investigating the use of the OMG's Platform-Independent Model (PIM) SWRadio as the basis for an STRS PIM. This paper details the influence of the OMG technologies on the STRS update effort, presents findings in the STRS/SWRadio mapping, and provides a summary of the SDR Forum recommendations.
Software Past, Present, and Future: Views from Government, Industry and Academia
NASA Technical Reports Server (NTRS)
Holcomb, Lee; Page, Jerry; Evangelist, Michael
2000-01-01
Views presented by the NASA CIO at the NASA Software Engineering Workshop on software development past, present, and future are summarized. The topics include: 1) Software Past; 2) Software Present; 3) NASA's Largest Software Challenges; 4) 8330 Software Projects in Industry (Standish Group's 1994 Report); 5) Software Future; 6) Capability Maturity Model (CMM): Software Engineering Institute (SEI) Levels; 7) System Engineering Quality Also Part of the Problem; 8) University Environment Trends Will Increase the Problem in Software Engineering; and 9) NASA Software Engineering Goals.
Better software, better research: the challenge of preserving your research and your reputation
NASA Astrophysics Data System (ADS)
Chue Hong, N.
2017-12-01
Software is fundamental to research. From short, thrown-together temporary scripts, through an abundance of complex spreadsheets analysing collected data, to the hundreds of software engineers and millions of lines of code behind international efforts such as the Large Hadron Collider and the Square Kilometre Array, software has made an invaluable contribution to advancing our research knowledge. Within the earth and space sciences, data is being generated, collected, processed and analysed in ever greater amounts and detail. However, the pace of this improvement leads to challenges around the persistence of research outputs and artefacts. A specific challenge in this field is that often experiments and measurements cannot be repeated, yet the infrastructure used to manage, store and process this data must be continually updated and developed: constant change just to stay still. The UK-based Software Sustainability Institute (SSI) aims to improve research software sustainability, working with researchers, funders, research software engineers, managers, and other stakeholders across the research spectrum. In this talk, I will present lessons learned and good practice based on the work of the Institute and its collaborators. I will summarise some of the work that is being done to improve the integration of infrastructure for managing research outputs, including around software citation and reward, extending data management plans, and improving researcher skills: "better software, better research". Ultimately, being a modern researcher in the geosciences requires you to efficiently balance the pursuit of new knowledge with making your work reusable and reproducible. And as scientists are placed under greater scrutiny about whether others can trust their results, the preservation of your artefacts has a key role in the preservation of your reputation.
Software Engineering Guidebook
NASA Technical Reports Server (NTRS)
Connell, John; Wenneson, Greg
1993-01-01
The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.
Corridor-based forecasts of work-zone impacts for freeways.
DOT National Transportation Integrated Search
2011-08-09
This project developed an analysis methodology and associated software implementation for the evaluation of : significant work zone impacts on freeways in North Carolina. The FREEVAL-WZ software tool allows the analyst : to predict the operational im...
Federal Highway Administration (FHWA) work zone driver model software
DOT National Transportation Integrated Search
2016-11-01
FHWA and the U.S. Department of Transportation (USDOT) Volpe Center are developing a work zone car-following model and simulation software that interfaces with existing microsimulation tools, enabling more accurate simulation of car-following through...
Data-driven traffic impact assessment tool for work zones.
DOT National Transportation Integrated Search
2017-03-01
Traditionally, traffic impacts of work zones have been assessed using planning software such as Quick Zone, custom spreadsheets, and others. These software programs generate delay, queuing, and other mobility measures but are difficult to validate du...
How Knowledge Organisations Work: The Case of Software Firms
ERIC Educational Resources Information Center
Gottschalk, Petter
2007-01-01
Knowledge workers in software firms solve client problems in sequential and cyclical work processes. Sequential and cyclical work takes place in the value configuration of a value shop. While typical examples of value chains are manufacturing industries such as paper and car production, typical examples of value shops are law firms and medical…
Scientific Data Analysis and Software Support: Geodynamics
NASA Technical Reports Server (NTRS)
Klosko, Steven; Sanchez, B. (Technical Monitor)
2000-01-01
The support on this contract centers on development of data analysis strategies, geodynamic models, and software codes to study four-dimensional geodynamic and oceanographic processes, as well as studies and mission support for near-Earth and interplanetary satellite missions. SRE had a subcontract to maintain the optical laboratory for the LTP, where instruments such as MOLA and GLAS are developed. NVI performed work on a Raytheon laser altimetry task through a subcontract, providing data analysis and final data production for distribution to users. HBG had a subcontract for specialized digital topography analysis and map generation. Over the course of this contract, Raytheon ITSS staff have supported over 60 individual tasks. Some tasks have remained in place during this entire interval whereas others have been completed and were of shorter duration. Over the course of events, task numbers were changed to reflect changes in the character of the work or new funding sources. The description presented below will detail the technical accomplishments that have been achieved according to their science and technology areas. What will be shown is a brief overview of the progress that has been made in each of these investigative and software development areas. Raytheon ITSS staff members have received many awards for their work on this contract, including GSFC Group Achievement Awards for TOPEX Precision Orbit Determination and the Joint Gravity Model One Team. NASA JPL gave the TOPEX/POSEIDON team a medal commemorating the completion of the primary mission and a Certificate of Appreciation. Raytheon ITSS has also received a Certificate of Appreciation from GSFC for its extensive support of the Shuttle Laser Altimeter Experiment.
NASA Astrophysics Data System (ADS)
1992-06-01
The House Committee on Science, Space, and Technology asked NASA to study software development issues for the space station, in particular how well NASA has implemented key software engineering practices. Specifically, the objectives were to determine: (1) whether independent verification and validation techniques are being used to ensure that critical software meets specified requirements and functions; (2) whether NASA has incorporated software risk management techniques into the program; (3) whether standards are in place that will prescribe a disciplined, uniform approach to software development; and (4) whether software support tools will help, as intended, to maximize efficiency in developing and maintaining the software. To meet the objectives, the work proceeded by: (1) reviewing and analyzing software development objectives and strategies contained in NASA conference publications; (2) reviewing and analyzing NASA, other government, and industry guidelines for establishing good software development practices; (3) reviewing and analyzing technical proposals and contracts; (4) reviewing and analyzing software management plans, risk management plans, and program requirements; (5) reviewing and analyzing reports prepared by NASA and contractor officials that identified key issues and challenges facing the program; (6) obtaining expert opinions on what constitutes appropriate independent verification and validation and software risk management activities; (7) interviewing program officials at NASA headquarters in Washington, DC; at the Space Station Program Office in Reston, Virginia; and at the three work package centers: Johnson in Houston, Texas; Marshall in Huntsville, Alabama; and Lewis in Cleveland, Ohio; and (8) interviewing contractor officials doing work for NASA at Johnson and Marshall. The audit work was performed in accordance with generally accepted government auditing standards between April 1991 and May 1992.
The Many Faces of a Software Engineer in a Research Community
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marinovici, Maria C.; Kirkham, Harold
2013-10-14
The ability to gather, analyze and make decisions based on real world data is changing nearly every field of human endeavor. These changes are particularly challenging for software engineers working in a scientific community, designing and developing large, complex systems. To avoid the creation of a communications gap (almost a language barrier), the software engineers should possess an 'adaptive' skill. In the science and engineering research community, the software engineers must be responsible for more than creating mechanisms for storing and analyzing data. They must also develop a fundamental scientific and engineering understanding of the data. This paper looks at the many faces that a software engineer should have: developer, domain expert, business analyst, security expert, project manager, tester, user experience professional, etc. Observations made during work on a power-systems scientific software development are analyzed and extended to describe more generic software development projects.
Charter for Systems Engineer Working Group
NASA Technical Reports Server (NTRS)
Suffredini, Michael T.; Grissom, Larry
2015-01-01
This charter establishes the International Space Station Program (ISSP) Mobile Servicing System (MSS) Systems Engineering Working Group (SEWG). The MSS SEWG is established to provide a mechanism for Systems Engineering for the end-to-end MSS function. The MSS end-to-end function includes the Space Station Remote Manipulator System (SSRMS), the Mobile Remote Servicer (MRS) Base System (MBS), Robotic Work Station (RWS), Special Purpose Dexterous Manipulator (SPDM), Video Signal Converters (VSC), and Operations Control Software (OCS), the Mobile Transporter (MT), and by interfaces between and among these elements, and United States On-Orbit Segment (USOS) distributed systems, and other International Space Station Elements and Payloads, (including the Power Data Grapple Fixtures (PDGFs), MSS Capture Attach System (MCAS) and the Mobile Transporter Capture Latch (MTCL)). This end-to-end function will be supported by the ISS and MSS ground segment facilities. This charter defines the scope and limits of the program authority and document control that is delegated to the SEWG and it also identifies the panel core membership and specific operating policies.
Developing a Software for Fuzzy Group Decision Support System: A Case Study
ERIC Educational Resources Information Center
Baba, A. Fevzi; Kuscu, Dincer; Han, Kerem
2009-01-01
The complex nature and uncertain information in social problems required the emergence of fuzzy decision support systems in social areas. In this paper, we developed user-friendly Fuzzy Group Decision Support Systems (FGDSS) software. The software can be used for multi-purpose decision making processes. It helps the users determine the main and…
Use of a Wiki-Based Software to Manage Research Group Activities
ERIC Educational Resources Information Center
Wang, Ting; Vezenov, Dmitri V.; Simboli, Brian
2014-01-01
This paper discusses use of the wiki software Confluence to organize research group activities and lab resources. Confluence can serve as an electronic lab notebook (ELN), as well as an information management and collaboration tool. The article provides a case study in how researchers can use wiki software in "home-grown" fashion to…
PACS/information systems interoperability using Enterprise Communication Framework.
alSafadi, Y; Lord, W P; Mankovich, N J
1998-06-01
Interoperability among healthcare applications goes beyond connectivity to allow components to exchange structured information and work together in a predictable, coordinated fashion. To facilitate building an interoperability infrastructure, an Enterprise Communication Framework (ECF) was developed by the members of the Andover Working Group for Healthcare Interoperability (AWG-OHI). The ECF consists of four models: 1) Use Case Model, 2) Domain Information Model (DIM), 3) Interaction Model, and 4) Message Model. To realize this framework, a software component called the Enterprise Communicator (EC) is used. In this paper, we will demonstrate the use of the framework in interoperating a picture archiving and communication system (PACS) with a radiology information system (RIS).
Extending software repository hosting to code review and testing
NASA Astrophysics Data System (ADS)
Gonzalez Alvarez, A.; Aparicio Cotarelo, B.; Lossent, A.; Andersen, T.; Trzcinska, A.; Asbury, D.; Høimyr, N.; Meinhard, H.
2015-12-01
We will describe how CERN's services around Issue Tracking and Version Control have evolved, and what the plans for the future are. We will describe the services' main design, integration and structure, giving special attention to new requirements from the community of users for collaboration and integration tools, and to how we address this challenge when defining new services: GitLab for collaboration and code review, replacing our current Gitolite service, and Jenkins for continuous integration. These new services complement the existing ones to create a new global "development tool stack" where each working group can place its particular development workflow.
ERIC Educational Resources Information Center
Laina, Vasiliki; Monaghan, John
2014-01-01
This paper reports on two students' work on geometry tasks in a dynamic geometry system. It augments prior work on students' instrumental geneses via a consideration of emergent goals that arise in students' work. It offers a way to interpret students' (working with new software) awareness of what software can and cannot do and students'…
Teaching Introductory Mineralogy With the GeoWall
NASA Astrophysics Data System (ADS)
Anderson, C. D.; Haymon, R. M.
2003-12-01
Mineralogy, like many topics in Earth Sciences, contains inherently three-dimensional topics that are difficult to teach. Concepts such as crystal symmetry and forms, Miller indices, the polymerization of silica tetrahedra and resulting structures of silicate mineral groups, and the interaction of light and minerals are particularly difficult. Two-dimensional diagrams are limited in their effectiveness, and physical models, while effective, are expensive and do not work as well in large class settings. The GeoWall system brings the effectiveness of physical models to the large classroom. In Fall 2003, we will integrate the GeoWall into our introductory mineralogy classes at UCSB using a combination of commercial software, atomic structure models available on the web, and custom content created in-house. The commercial software SHAPE (www.shapesoftware.com) allows users to build and display crystal shapes and their symmetry. Though not designed for the GeoWall, the software's stereopair display mode works perfectly on the system. Using the Chime web browser plug-in (www.mdl.com), computer models of silicate minerals available from the Virtual Museum of Minerals and Molecules (www.soils.umn.edu/virtual_museum) provide an interactive display of silicate mineral structure that illustrates the tetrahedral framework. Again, while not developed for the GeoWall, the Chime plug-in works seamlessly with the GeoWall hardware. 3-D GeoWall images that display light paths through minerals, and reveal relationships between crystal symmetry and optical indicatrix properties, have been developed in-house using a combination of SHAPE and 3D modeling software. The 3-D GeoWall images should convey in an instant these difficult concepts that students historically have struggled to visualize. Initial assessment of the GeoWall's effectiveness as a mineralogy teaching aid at UCSB in Fall 2003 will come from the instructor's impressions and from comparing test scores with classes from previous years.
Planning the Unplanned Experiment: Assessing the Efficacy of Standards for Safety Critical Software
NASA Technical Reports Server (NTRS)
Graydon, Patrick J.; Holloway, C. Michael
2015-01-01
We need well-founded means of determining whether software is fit for use in safety-critical applications. While software in industries such as aviation has an excellent safety record, the fact that software flaws have contributed to deaths illustrates the need for justifiably high confidence in software. It is often argued that software is fit for safety-critical use because it conforms to a standard for software in safety-critical systems. But little is known about whether such standards 'work.' Reliance upon a standard without knowing whether it works is an experiment; without collecting data to assess the standard, this experiment is unplanned. This paper reports on a workshop intended to explore how standards could practicably be assessed. Planning the Unplanned Experiment: Assessing the Efficacy of Standards for Safety Critical Software (AESSCS) was held on 13 May 2014 in conjunction with the European Dependable Computing Conference (EDCC). We summarize and elaborate on the workshop's discussion of the topic, including both the presented positions and the dialogue that ensued.
Seven Processes that Enable NASA Software Engineering Technologies
NASA Technical Reports Server (NTRS)
Housch, Helen; Godfrey, Sally
2011-01-01
This slide presentation reviews seven processes that NASA uses to ensure that software is developed, acquired and maintained as specified in the NPR 7150.2A requirement. The requirement is to ensure that all software be appraised for the Capability Maturity Model Integration (CMMI). The enumerated processes are: (7) Product Integration, (6) Configuration Management, (5) Verification, (4) Software Assurance, (3) Measurement and Analysis, (2) Requirements Management and (1) Planning & Monitoring. Each of these is described, along with the group(s) responsible for it.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brower, Richard C.
This proposal is to develop the software and algorithmic infrastructure needed for the numerical study of quantum chromodynamics (QCD), and of theories that have been proposed to describe physics beyond the Standard Model (BSM) of high energy physics, on current and future computers. This infrastructure will enable users (1) to improve the accuracy of QCD calculations to the point where they no longer limit what can be learned from high-precision experiments that seek to test the Standard Model, and (2) to determine the predictions of BSM theories in order to understand which of them are consistent with the data that will soon be available from the LHC. Work will include the extension and optimizations of community codes for the next generation of leadership class computers, the IBM Blue Gene/Q and the Cray XE/XK, and for the dedicated hardware funded for our field by the Department of Energy. Members of our collaboration at Brookhaven National Laboratory and Columbia University worked on the design of the Blue Gene/Q, and have begun to develop software for it. Under this grant we will build upon their experience to produce high-efficiency production codes for this machine. Cray XE/XK computers with many thousands of GPU accelerators will soon be available, and the dedicated commodity clusters we obtain with DOE funding include growing numbers of GPUs. We will work with our partners in NVIDIA's Emerging Technology group to scale our existing software to thousands of GPUs, and to produce highly efficient production codes for these machines. Work under this grant will also include the development of new algorithms for the effective use of heterogeneous computers, and their integration into our codes. It will include improvements of Krylov solvers and the development of new multigrid methods in collaboration with members of the FASTMath SciDAC Institute, using their HYPRE framework, as well as work on improved symplectic integrators.
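As a point of reference for the solver work mentioned above, the sketch below shows the simplest Krylov method, conjugate gradient, on a small dense test problem. It is purely illustrative: production lattice-QCD solvers apply the same family of ideas to the Dirac operator, with sophisticated preconditioning and GPU-optimized kernels, none of which is shown here.

```python
# Illustrative only: the conjugate gradient method, the simplest Krylov
# solver, applied to a small symmetric positive-definite system.
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    x = np.zeros_like(b)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# Small SPD test problem.
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50 * np.eye(50)   # well-conditioned SPD matrix
b = rng.standard_normal(50)
x = conjugate_gradient(A, b)
print("residual norm:", np.linalg.norm(A @ x - b))
```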
A Quantitative Analysis of Open Source Software's Acceptability as Production-Quality Code
ERIC Educational Resources Information Center
Fischer, Michael
2011-01-01
The difficulty in writing defect-free software has been long acknowledged both by academia and industry. A constant battle occurs as developers seek to craft software that works within aggressive business schedules and deadlines. Many tools and techniques are used in attempt to manage these software projects. Software metrics are a tool that has…
Learning Science in the 21st century - a shared experience between schools
NASA Astrophysics Data System (ADS)
Pinto, Tânia; Soares, Rosa; Ruas, Fátima
2015-04-01
Problem Based Learning (PBL) is considered an innovative, inquiry-based teaching and learning methodology that is student centered, focused on the resolution of an authentic problem, and in which the teacher acts as a facilitator of work in small groups. In this process, students are expected to develop attitudinal, procedural and communication skills, in addition to the cognitive skills typically valued. PBL implementation also allows the use of multiple educational strategies, such as laboratory experiments, analogue modeling or ICT (video animations, electronic presentations or software simulations, for instance), which can foster a more interactive environment in the classroom. This study, carried out in three schools in the north of Portugal through the cooperation of three science teachers and involving a sample of 75 individuals, examined students' opinions about the main difficulties and strengths of the PBL methodology, with the common denominator being the use of a laboratory experiment followed by suitable digital software (e.g. EarthQuake, Virtual Quake, Stellarium) as an educational resource to interpret the results obtained and to make predictions. The data collection methods were based on direct observation and questionnaires. The results globally show that this educational approach motivates students towards science, helping them to solve problems from daily life, and that the use of software, as well as collaborative working, was relevant. The cognitive strand continues to be the most valued by pupils.
The CoreWall Project: An Update for 2007
NASA Astrophysics Data System (ADS)
Yu-Chung Chen, J.; Higgins, S.; Hur, H.; Ito, E.; Jenkins, C. J.; Johnson, A.; Leigh, J.; Morin, P.; Lee, J.
2007-12-01
The CoreWall Suite is an NSF-supported collaborative development of real-time core description (Corelyzer), stratigraphic correlation (Correlator), and data visualization (CoreNavigator) software to be used by the marine, terrestrial and Antarctic science communities. The overall goal of the CoreWall software development is to bring portable cross-platform tools to the broader drilling and coring communities to expand and enhance data visualization and enhance collaborative integration of multiple datasets. The CoreWall Project is now in its second year and significant progress has been made on all three software components. Corelyzer has undergone two field deployments and testing by the ANDRILL program in 2006 (and again in Fall 2007) and by ICDP's SAFOD project (summer 2007). In addition, the CoreWall group and ICDP are working together so that the core description (DIS) system can expose DIS core data directly into Corelyzer seamlessly and be available to future ICDP and IODP-Mission Specific Platform expeditions. Educators have also taken note of the software's ease of use and strong visualization capabilities and have begun exploring curriculum projects with Corelyzer software. To ensure that the software development is integrated with other community IT activities, including the development of the U.S. IODP-Phase 2 Scientific Ocean Drilling Vessel (SODV), a Steering Committee was constituted. It is composed of key U.S. IODP and related database (e.g., CHRONOS, SedDB) developers and users as well as representatives of other core-based enterprises (e.g., ANDRILL, ICDP, LacCore). Corelyzer (CoreWall's main visual core description tool) displays digital core images from one or more cores along with discrete data streams (e.g., physical properties, downhole logs) and nested images (e.g., thin sections, fossils) to provide a robust approach to the description of sediment cores. Corelyzer's digital image handling allows the cores to be viewed from micron to km scale, determined by the image resolution, along a sliding plane, effectively making it a "digital microscope". Detailed features such as lithologic variation, macroscopic grain size variation, bioturbation intensity, chemical composition and micropaleontology are easier to interpret and annotate. Significant new capabilities have been added to allow for importing multiple images and data types, sharing/exporting Corelyzer "work sessions" for multiple users, and enhanced annotations, as well as support for other activities like examining clasts and handling sample requests. The new Correlator software, the updated version of the Splicer/Sagan software used by ODP for over 10 years, has been ported into a single new analysis tool that will work across multiple platforms and interact seamlessly with JANUS (ODP's relational database), CHRONOS, PetDB, SedDB, dbSEABED and other databases. This functionality will result in a CoreWall Suite module that can be used and distributed anywhere for stratigraphic and age correlation tasks. CoreNavigator, a spatial data discovery tool, has taken on a virtual globe interface that allows users to enter Corelyzer from a geographic-visual standpoint.
Are Earth System model software engineering practices fit for purpose? A case study.
NASA Astrophysics Data System (ADS)
Easterbrook, S. M.; Johns, T. C.
2009-04-01
We present some analysis and conclusions from a case study of the culture and practices of scientists at the Met Office and Hadley Centre working on the development of software for climate and Earth System models using the MetUM infrastructure. The study examined how scientists think about software correctness, prioritize their requirements in making changes, and develop a shared understanding of the resulting models. We conclude that highly customized techniques, driven strongly by scientific research goals, have evolved for verification and validation of such models. In a formal software engineering context these represent costly, but invaluable, software integration tests with considerable benefits. The software engineering practices seen also exhibit recognisable features of both agile and open source software development projects - self-organisation of teams consistent with a meritocracy rather than top-down organisation, extensive use of informal communication channels, and software developers who are generally also users and science domain experts. We draw some general conclusions on whether these practices work well, and what new software engineering challenges may lie ahead as Earth System models become ever more complex and petascale computing becomes the norm.
NASA Technical Reports Server (NTRS)
Jones, Jeremy; Grosvenor, Sandy; Wolf, Karl; Li, Connie; Koratkar, Anuradha; Powers, Edward I. (Technical Monitor)
2001-01-01
In the Virtual Observatory (VO), software tools will perform the functions that have traditionally been performed by physical observatories and their instruments. These tools will not be adjuncts to VO functionality but will make up the very core of the VO. Consequently, the tradition of observatory and system independent tools serving a small user base is not valid for the VO. For the VO to succeed, we must improve software collaboration and code sharing between projects and groups. A significant goal of the Scientist's Expert Assistant (SEA) project has been promoting effective collaboration and code sharing between groups. During the past three years, the SEA project has been developing prototypes for new observation planning software tools and strategies. Initially funded by the Next Generation Space Telescope, parts of the SEA code have since been adopted by the Space Telescope Science Institute. SEA has also supplied code for SOFIA, the SIRTF planning tools, and the JSky Open Source Java library. The potential benefits of sharing code are clear. The recipient gains functionality for considerably less cost. The provider gains additional developers working with their code. If enough user groups adopt a set of common code and tools, de facto standards can emerge (as demonstrated by the success of the FITS standard). Code sharing also raises a number of challenges related to the management of the code. In this talk, we will review our experiences with SEA - both successes and failures - and offer some lessons learned that may promote further successes in collaboration and re-use.
NASA Technical Reports Server (NTRS)
Korathkar, Anuradha; Grosvenor, Sandy; Jones, Jeremy; Li, Connie; Mackey, Jennifer; Neher, Ken; Obenschain, Arthur F. (Technical Monitor)
2001-01-01
In the Virtual Observatory (VO), software tools will perform the functions that have traditionally been performed by physical observatories and their instruments. These tools will not be adjuncts to VO functionality but will make up the very core of the VO. Consequently, the tradition of observatory and system independent tools serving a small user base is not valid for the VO. For the VO to succeed, we must improve software collaboration and code sharing between projects and groups. A significant goal of the Scientist's Expert Assistant (SEA) project has been promoting effective collaboration and code sharing among groups. During the past three years, the SEA project has been developing prototypes for new observation planning software tools and strategies. Initially funded by the Next Generation Space Telescope, parts of the SEA code have since been adopted by the Space Telescope Science Institute. SEA has also supplied code for the SIRTF (Space Infrared Telescope Facility) planning tools, and the JSky Open Source Java library. The potential benefits of sharing code are clear. The recipient gains functionality for considerably less cost. The provider gains additional developers working with their code. If enough user groups adopt a set of common code and tools, de facto standards can emerge (as demonstrated by the success of the FITS standard). Code sharing also raises a number of challenges related to the management of the code. In this talk, we will review our experiences with SEA - both successes and failures - and offer some lessons learned that might promote further successes in collaboration and re-use.
NASA Technical Reports Server (NTRS)
Basili, V. R.
1981-01-01
Work on metrics is discussed. Factors that affect software quality are reviewed. Metrics is discussed in terms of criteria achievements, reliability, and fault tolerance. Subjective and objective metrics are distinguished. Product/process and cost/quality metrics are characterized and discussed.
NASA Technical Reports Server (NTRS)
Keller, Richard M. (Editor); Barstow, David; Lowry, Michael R.; Tong, Christopher H.
1992-01-01
The goal of this workshop is to identify different architectural approaches to building domain-specific software design systems and to explore issues unique to domain-specific (vs. general-purpose) software design. Some general issues that cut across the particular software design domain include: (1) knowledge representation, acquisition, and maintenance; (2) specialized software design techniques; and (3) user interaction and user interface.
Ontological Modeling for Integrated Spacecraft Analysis
NASA Technical Reports Server (NTRS)
Wicks, Erica
2011-01-01
Current spacecraft work as a cooperative group of a number of subsystems. Each of these requires modeling software for development, testing, and prediction. It is the goal of my team to create an overarching software architecture called the Integrated Spacecraft Analysis (ISCA) to aid in deploying the discrete subsystems' models. Such a plan has been attempted in the past, and has failed due to the excessive scope of the project. Our goal in this version of ISCA is to use new resources to reduce the scope of the project, including using ontological models to help link the internal interfaces of subsystems' models with the ISCA architecture. I have created an ontology of functions specific to the modeling system of the navigation system of a spacecraft. The resulting ontology not only links, at an architectural level, language-specific instantiations of the modeling system's code, but also is web-viewable and can act as a documentation standard. This ontology is proof of the concept that ontological modeling can aid in the integration necessary for ISCA to work, and can act as the prototype for future ISCA ontologies.
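As an illustration of the kind of linkage described above, the following sketch builds a tiny ontology with the rdflib library. The namespace, classes, and properties are hypothetical stand-ins, not the actual ISCA navigation ontology.

```python
# Minimal sketch of an ontology linking an abstract modeling function to a
# language-specific instantiation. All names here are hypothetical examples.
from rdflib import Graph, Literal, Namespace, RDF, RDFS

ISCA = Namespace("http://example.org/isca/navigation#")  # placeholder namespace
g = Graph()
g.bind("isca", ISCA)

# A function class and one concrete modeling function.
g.add((ISCA.NavigationFunction, RDF.type, RDFS.Class))
g.add((ISCA.PropagateOrbit, RDF.type, ISCA.NavigationFunction))
g.add((ISCA.PropagateOrbit, RDFS.label, Literal("Propagate spacecraft orbit state")))

# Link the abstract function to a language-specific instantiation of the code.
g.add((ISCA.implementedBy, RDF.type, RDF.Property))
g.add((ISCA.PropagateOrbit, ISCA.implementedBy,
       Literal("nav_model.propagate_orbit (Python)")))

# Serializing to Turtle gives a web-viewable, documentation-friendly form.
print(g.serialize(format="turtle"))
```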
NASA Astrophysics Data System (ADS)
Hucka, M.
2015-09-01
In common with many fields, including astronomy, a vast number of software tools for computational modeling and simulation are available today in systems biology. This wealth of resources is a boon to researchers, but it also presents interoperability problems. Despite working with different software tools, researchers want to disseminate their work widely as well as reuse and extend the models of other researchers. This situation led in the year 2000 to an effort to create a tool-independent, machine-readable file format for representing models: SBML, the Systems Biology Markup Language. SBML has since become the de facto standard for its purpose. Its success and general approach have inspired and influenced other community-oriented standardization efforts in systems biology. Open standards are essential for the progress of science in all fields, but it is often difficult for academic researchers to organize successful community-based standards. I draw on personal experiences from the development of SBML and summarize some of the lessons learned, in the hope that this may be useful to other groups seeking to develop open standards in a community-oriented fashion.
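For readers unfamiliar with SBML, the sketch below shows how a model encoded in the format might be read programmatically, assuming the python-libsbml bindings are installed and using a hypothetical model file name.

```python
# Illustrative sketch of reading a model in the tool-independent SBML format,
# assuming the python-libsbml bindings (pip install python-libsbml).
import libsbml

doc = libsbml.readSBML("model.xml")   # hypothetical SBML file
if doc.getNumErrors() > 0:
    doc.printErrors()
else:
    model = doc.getModel()
    print("model id:", model.getId())
    print("species:", [model.getSpecies(i).getId()
                       for i in range(model.getNumSpecies())])
    print("reactions:", [model.getReaction(i).getId()
                         for i in range(model.getNumReactions())])
```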
[Stressor and stress reduction strategies for computer software engineers].
Asakura, Takashi
2002-07-01
First, in this article we discuss 10 significant occupational stressors for computer software engineers, based on the review of the scientific literature on their stress and mental health. The stressors include 1) quantitative work overload, 2) time pressure, 3) qualitative work load, 4) speed and diffusion of technological innovation, and technological divergence, 5) low discretional power, 6) underdeveloped career pattern, 7) low earnings/reward from jobs, 8) difficulties in managing a project team for software development and establishing support system, 9) difficulties in customer relations, and 10) personality characteristics. In addition, we delineate their working and organizational conditions that cause such occupational stressors in order to find strategies to reduce those stressors in their workplaces. Finally, we suggest three stressor and stress reduction strategies for software engineers.
Developing Avionics Hardware and Software for Rocket Engine Testing
NASA Technical Reports Server (NTRS)
Aberg, Bryce Robert
2014-01-01
My summer was spent working as an intern at Kennedy Space Center in the Propulsion Avionics Branch of the NASA Engineering Directorate Avionics Division. The work that I was involved with was part of Rocket University's Project Neo, a small-scale liquid rocket engine test bed. I began by learning about the layout of Neo in order to more fully understand what was required of me. I then developed software in LabVIEW to gather and scale data from two flowmeters and integrated that code into the main control software. Next, I developed more LabVIEW code to control an igniter circuit and integrated that into the main software as well. Throughout the internship, I performed work that mechanics and technicians would do in order to maintain and assemble the engine.
Reinforcement and Drill by Microcomputer.
ERIC Educational Resources Information Center
Balajthy, Ernest
1984-01-01
Points out why drill work has a role in the language arts classroom, explores the possibilities of using a microcomputer to give children drill work, and discusses the characteristics of a good software program, along with faults found in many software programs. (FL)
Calibration of work zone impact analysis software for Missouri.
DOT National Transportation Integrated Search
2013-12-01
This project calibrated two software programs used for estimating the traffic impacts of work zones. The WZ Spreadsheet : and VISSIM programs were recommended in a previous study by the authors. The two programs were calibrated using : field data fro...
Evaluation of work zone enhancement software programs.
DOT National Transportation Integrated Search
2009-09-01
The Missouri Department of Transportation (MoDOT) is looking for software tools that can assist in : developing effective plans to manage and communicate work zone activities. QuickZone, CA4PRS, : VISSIM, and Spreadsheet models are the tools that MoD...
A survey of Canadian medical physicists: software quality assurance of in‐house software
Kelly, Diane
2015-01-01
This paper reports on a survey of medical physicists who write and use in‐house written software as part of their professional work. The goal of the survey was to assess the extent of in‐house software usage and the desire or need for related software quality guidelines. The survey contained eight multiple‐choice questions, a ranking question, and seven free text questions. The survey was sent to medical physicists associated with cancer centers across Canada. The respondents to the survey expressed interest in having guidelines to help them in their software‐related work, but also demonstrated extensive skills in the area of testing, safety, and communication. These existing skills form a basis for medical physicists to establish a set of software quality guidelines. PACS number: 87.55.Qr PMID:25679168
Operationalizing Cyberspace for Today’s Combat Air Force
2010-04-01
rootkit techniques to run inside common Windows services (sometimes bundled with fake antivirus software) or in Windows safe mode, and it can hide... has shifted to downloading other malware, with its main focus on fake alerts and rogue antivirus software. 5. TR/Dldr.Agent.JKH - Compromised U.S... patch, software update, or security breach away from failure. In short, what works today may not work tomorrow; this fact
Group Projects and the Computer Science Curriculum
ERIC Educational Resources Information Center
Joy, Mike
2005-01-01
Group projects in computer science are normally delivered with reference to good software engineering practice. The discipline of software engineering is rapidly evolving, and the application of the latest 'agile techniques' to group projects causes a potential conflict with constraints imposed by regulating bodies on the computer science…
The Effects of Multiple Linked Representations on Students' Learning of Linear Relationships
ERIC Educational Resources Information Center
Ozgun-Koca, S. Asli
2004-01-01
The focus of this study was on comparing three groups of Algebra I 9th-year students: one group using linked representation software, the second group using similar software but with semi-linked representations, and the control group in order to examine the effects on students' understanding of linear relationships. Data collection methods…
Facilitating the analysis of the multifocal electroretinogram using the free software environment R.
Bergholz, Richard; Rossel, Mirjam; Dutescu, Ralf M; Vöge, Klaas P; Salchow, Daniel J
2018-01-01
The large amount of data rendered by the multifocal electroretinogram (mfERG) can be analyzed and visualized in various ways. The evaluation and comparison of more than one examination is time-consuming and prone to errors. Using the free software environment R, we developed a solution to average the data of multiple examinations and to allow a comparison of different patient groups. Data from single mfERG recordings, as exported in .csv format from a RETIport 21 system (version 7/03, Roland Consult) or from manually compiled .csv files, are the basis for the calculations. The R software extracts response densities and implicit times of N1 and P1 for the sum response, each ring eccentricity, and each single hexagon. Averages can be calculated for as many subjects as needed. The mentioned parameters can then be compared to another group of patients or healthy subjects. Application of the software is illustrated by comparing 11 patients with chloroquine maculopathy to a control group of 7 healthy subjects. The software scripts display response density and implicit time 3D plots of each examination as well as of the group averages. Differences of the group averages are presented as 3D and grayscale 2D plots. Both groups are compared using the t-test with Bonferroni correction. The group comparison is furthermore illustrated by the average waveforms and by boxplots of each eccentricity. This software solution, based on the programming language R, facilitates the clinical and scientific use of the mfERG and aids in interpretation and analysis.
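The published scripts are written in R; the following Python sketch only illustrates the analysis pattern described (group averaging and Bonferroni-corrected t-tests per ring eccentricity), using hypothetical CSV exports rather than the authors' data format.

```python
# Illustrative re-implementation sketch (the paper's scripts are written in R).
# Assumes hypothetical CSV files, one row per subject and one column per ring
# eccentricity, holding P1 response densities.
import pandas as pd
from scipy import stats

patients = pd.read_csv("chloroquine_p1_rings.csv")   # hypothetical export
controls = pd.read_csv("controls_p1_rings.csv")      # hypothetical export

print("patient group mean per ring:\n", patients.mean())
print("control group mean per ring:\n", controls.mean())

n_tests = patients.shape[1]              # one test per ring eccentricity
for ring in patients.columns:
    t, p = stats.ttest_ind(patients[ring], controls[ring], equal_var=False)
    p_bonf = min(p * n_tests, 1.0)       # Bonferroni correction
    print(f"{ring}: t = {t:.2f}, corrected p = {p_bonf:.4f}")
```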
ERIC Educational Resources Information Center
What Works Clearinghouse, 2010
2010-01-01
The combination of "Carnegie Learning Curricula and Cognitive Tutor[R] Software" merges algebra textbooks with interactive software developed around an artificial intelligence model that identifies strengths and weaknesses in an individual student's mastery of mathematical concepts. The software customizes prompts to focus on areas in…
ERIC Educational Resources Information Center
Marson, Guilherme A.; Torres, Bayardo B.
2011-01-01
This work presents a convenient framework for developing interactive chemical education software to facilitate the integration of macroscopic, microscopic, and symbolic dimensions of chemical concepts--specifically, via the development of software for gel permeation chromatography. The instructional role of the software was evaluated in a study…
Formal assessment instrument for ensuring the security of NASA's networks, systems and software
NASA Technical Reports Server (NTRS)
Gilliam, D. P.; Powell, J. D.; Sherif, J.
2002-01-01
To address the problem of security for NASA's networks, systems and software, NASA has funded the Jet Propulsion Lab in conjunction with UC Davis to begin work on developing a software security assessment instrument for use in the software development and maintenance life cycle.
Converging Work-Talk Patterns in Online Task-Oriented Communities.
Xuan, Qi; Devanbu, Premkumar; Filkov, Vladimir
2016-01-01
Much of what we do is accomplished by working collaboratively with others, and a large portion of our lives is spent working and talking; the patterns embodied in the alternation of working and talking can provide much useful insight into task-oriented social behaviors. The available electronic traces of the different kinds of human activities in online communities are an empirical goldmine that can enable the holistic study and understanding of these social systems. Open Source Software (OSS) projects are prototypical examples of collaborative, task-oriented communities, depending on volunteers for high-quality work. Here, we use sequence analysis methods to identify the work-talk patterns of software developers in online communities of Open Source Software projects. We find that software developers prefer to persist in the same kinds of activities, i.e., a string of work activities followed by a string of talk activities and so forth, rather than switch them frequently; this tendency strengthens with time, suggesting that developers become more efficient and can work longer with fewer interruptions. This process is accompanied by the formation of community culture: developers' patterns in the same communities get closer with time while different communities get relatively more different. The emergence of community culture is apparently driven by both "talk" and "work". Finally, we also find that workers with a good balance between "work" and "talk" tend to produce just as much work as those that focus strongly on "work"; however, the former appear to be more likely to continue to be active contributors in the communities.
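One elementary step of such a sequence analysis is measuring how long developers persist in one kind of activity before switching. The sketch below, with made-up event strings, illustrates the idea; it is not the authors' code or data.

```python
# Minimal sketch of one sequence-analysis step: measuring how long developers
# persist in "work" (W) or "talk" (T) activities before switching. The event
# strings below are made-up illustrations, not project data.
from itertools import groupby
from statistics import mean

def run_lengths(events):
    """Lengths of maximal runs of identical activities, e.g. 'WWWTTW' -> [3, 2, 1]."""
    return [len(list(run)) for _, run in groupby(events)]

developer_histories = {
    "dev_a": "WWWWTTWWWWWWTTTWW",   # persists in long runs
    "dev_b": "WTWTWTWWTTWTWTWT",    # switches frequently
}

for dev, events in developer_histories.items():
    runs = run_lengths(events)
    switches = len(runs) - 1
    print(f"{dev}: mean run length = {mean(runs):.2f}, switches = {switches}")
```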
WLCG and IPv6 - The HEPiX IPv6 working group
Campana, S.; Chadwick, K.; Chen, G.; ...
2014-06-11
The HEPiX (http://www.hepix.org) IPv6 Working Group has been investigating the many issues which feed into the decision on the timetable for the use of IPv6 (http://www.ietf.org/rfc/rfc2460.txt) networking protocols in High Energy Physics (HEP) Computing, in particular in the Worldwide Large Hadron Collider (LHC) Computing Grid (WLCG). RIPE NCC, the European Regional Internet Registry (RIR), ran out of IPv4 addresses in September 2012. The North and South America RIRs are expected to run out soon. In recent months it has become more clear that some WLCG sites, including CERN, are running short of IPv4 address space, now without the possibility of applying for more. This has increased the urgency for the switch-on of dual-stack IPv4/IPv6 on all outward facing WLCG services to allow for the eventual support of IPv6-only clients. The activities of the group include the analysis and testing of the readiness for IPv6 and the performance of many required components, including the applications, middleware, management and monitoring tools essential for HEP computing. Many WLCG Tier 1/2 sites are participants in the group's distributed IPv6 testbed and the major LHC experiment collaborations are engaged in the testing. We are constructing a group web/wiki which will contain useful information on the IPv6 readiness of the various software components and a knowledge base (http://hepix-ipv6.web.cern.ch/knowledge-base). Furthermore, this paper describes the work done by the working group and its future plans.
WLCG and IPv6 - the HEPiX IPv6 working group
NASA Astrophysics Data System (ADS)
Campana, S.; Chadwick, K.; Chen, G.; Chudoba, J.; Clarke, P.; Eliáš, M.; Elwell, A.; Fayer, S.; Finnern, T.; Goossens, L.; Grigoras, C.; Hoeft, B.; Kelsey, D. P.; Kouba, T.; López Muñoz, F.; Martelli, E.; Mitchell, M.; Nairz, A.; Ohrenberg, K.; Pfeiffer, A.; Prelz, F.; Qi, F.; Rand, D.; Reale, M.; Rozsa, S.; Sciaba, A.; Voicu, R.; Walker, C. J.; Wildish, T.
2014-06-01
The HEPiX (http://www.hepix.org) IPv6 Working Group has been investigating the many issues which feed into the decision on the timetable for the use of IPv6 (http://www.ietf.org/rfc/rfc2460.txt) networking protocols in High Energy Physics (HEP) Computing, in particular in the Worldwide Large Hadron Collider (LHC) Computing Grid (WLCG). RIPE NCC, the European Regional Internet Registry (RIR), ran out of IPv4 addresses in September 2012. The North and South America RIRs are expected to run out soon. In recent months it has become more clear that some WLCG sites, including CERN, are running short of IPv4 address space, now without the possibility of applying for more. This has increased the urgency for the switch-on of dual-stack IPv4/IPv6 on all outward facing WLCG services to allow for the eventual support of IPv6-only clients. The activities of the group include the analysis and testing of the readiness for IPv6 and the performance of many required components, including the applications, middleware, management and monitoring tools essential for HEP computing. Many WLCG Tier 1/2 sites are participants in the group's distributed IPv6 testbed and the major LHC experiment collaborations are engaged in the testing. We are constructing a group web/wiki which will contain useful information on the IPv6 readiness of the various software components and a knowledge base (http://hepix-ipv6.web.cern.ch/knowledge-base). This paper describes the work done by the working group and its future plans.
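A basic dual-stack readiness check of the kind such testing implies can be written with the Python standard library alone. The sketch below is illustrative rather than part of the working group's tooling, and the host name is a placeholder.

```python
# Minimal sketch of a dual-stack readiness check: does a service resolve and
# accept TCP connections over both IPv4 and IPv6? The hostname is a placeholder.
import socket

def reachable(host, port, family):
    try:
        for _, _, _, _, sockaddr in socket.getaddrinfo(host, port, family,
                                                       socket.SOCK_STREAM):
            with socket.socket(family, socket.SOCK_STREAM) as s:
                s.settimeout(5)
                s.connect(sockaddr)
                return True
    except OSError:
        pass
    return False

host, port = "storage.example.org", 443   # placeholder service endpoint
print("IPv4 reachable:", reachable(host, port, socket.AF_INET))
print("IPv6 reachable:", reachable(host, port, socket.AF_INET6))
```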
Architectural Implementation of NASA Space Telecommunications Radio System Specification
NASA Technical Reports Server (NTRS)
Peters, Kenneth J.; Lux, James P.; Lang, Minh; Duncan, Courtney B.
2012-01-01
This software demonstrates a working implementation of the NASA STRS (Space Telecommunications Radio System) architecture specification. This is a developing specification of software architecture and required interfaces to provide commonality among future NASA and commercial software-defined radios for space, and allow for easier mixing of software and hardware from different vendors. It provides required functions, and supports interaction with STRS-compliant simple test plug-ins ("waveforms"). All of it is programmed in "plain C," except where necessary to interact with C++ plug-ins. It offers a small footprint, suitable for use in JPL radio hardware. Future NASA work is expected to develop into fully capable software-defined radios for use on the space station, other space vehicles, and interplanetary probes.
Annotated bibliography of Software Engineering Laboratory literature
NASA Technical Reports Server (NTRS)
Morusiewicz, Linda; Valett, Jon D.
1991-01-01
An annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory is given. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. All materials have been grouped into eight general subject areas for easy reference: The Software Engineering Laboratory; The Software Engineering Laboratory: Software Development Documents; Software Tools; Software Models; Software Measurement; Technology Evaluations; Ada Technology; and Data Collection. Subject and author indexes further classify these documents by specific topic and individual author.
Software Reviews: Programs Worth a Second Look.
ERIC Educational Resources Information Center
Classroom Computer Learning, 1989
1989-01-01
Reviews three software programs: (1) "Microsoft Works 2.0": word processing, data processing, and telecommunications, grades 7 and up; (2) "AppleWorks GS": word processor, database, spreadsheet, graphics, and telecommunications, grades 3-12, Apple IIGS; (3) "Choices, Choices: On the Playground, Taking Responsibility":…
Zielińska-Bliźniewska, Hanna; Sułkowski, Wiesław J; Pietkiewicz, Piotr; Miłoński, Jarosław; Mazurek, Agnieszka; Olszewski, Jurek
2012-06-01
The aim of this study was to compare the parameters of vocal acoustic and vocal efficiency analyses in medical students and academic teachers with use of the IRIS and DiagnoScope Specialist software and to evaluate their usefulness in prevention and certification of occupational disease. The study group comprised 40 women, including students and employees of the Military Medical Faculty, Medical University of Łodź. After informed consent had been obtained from the participating women, the primary medical history was taken, videolaryngoscopic and stroboscopic examinations were performed and diagnostic vocal acoustic analysis was carried out with the use of the IRIS and DiagnoScope Specialist software. Based on the results of the performed measurements, the statistical analysis evidenced the compatibility between the two software programs, IRIS and DiagnoScope Specialist, with the only exception of the F4 formant. The mean values of vocal acoustic parameters in medical students and academic teachers, obtained by means of the IRIS software, can be used as standards for the female population, which have not yet been developed by the producer. When using the DiagnoScope Specialist software, some mean values were higher and some lower than the standards specified by the producer. The study evidenced the compatibility between the two measurement software programs, IRIS and DiagnoScope Specialist, except for the F4 formant. It should be noted that the latter has an advantage over the former since the standard values of vocal acoustic parameters have been worked out by the producer. Moreover, they only slightly departed from the values obtained in our study and may be useful in diagnostics of occupational voice disorders.
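The comparison of paired measurements from two analysis programs, as described above, is commonly tested with a paired statistic. The sketch below is purely illustrative, using invented values rather than the study's data, and does not reproduce the authors' statistical procedure.

```python
import numpy as np
from scipy import stats

# Hypothetical paired measurements of one acoustic parameter (e.g. jitter, %)
# obtained for the same voices with two analysis programs; the numbers are
# invented for illustration only.
tool_a = np.array([0.42, 0.55, 0.38, 0.61, 0.47, 0.52, 0.44, 0.58])
tool_b = np.array([0.45, 0.53, 0.40, 0.63, 0.46, 0.55, 0.43, 0.60])

t_stat, p_value = stats.ttest_rel(tool_a, tool_b)  # paired t-test
print(f"paired t = {t_stat:.3f}, p = {p_value:.3f}")
# A non-significant p-value is consistent with the two programs agreeing on
# this parameter; a significant one flags a systematic difference (as was
# reported here for the F4 formant).
```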
NASA Technical Reports Server (NTRS)
Graydon, Patrick J.; Holloway, C. M.
2015-01-01
Safe use of software in safety-critical applications requires well-founded means of determining whether software is fit for such use. While software in industries such as aviation has a good safety record, little is known about whether standards for software in safety-critical applications 'work' (or even what that means). It is often (implicitly) argued that software is fit for safety-critical use because it conforms to an appropriate standard. Without knowing whether a standard works, such reliance is an experiment; without carefully collecting assessment data, that experiment is unplanned. To help plan the experiment, we organized a workshop to develop practical ideas for assessing software safety standards. In this paper, we relate and elaborate on the workshop discussion, which revealed subtle but important study design considerations and practical barriers to collecting appropriate historical data and recruiting appropriate experimental subjects. We discuss assessing standards as written and as applied, several candidate definitions for what it means for a standard to 'work,' and key assessment strategies and study techniques and the pros and cons of each. Finally, we conclude with thoughts about the kinds of research that will be required and how academia, industry, and regulators might collaborate to overcome the noted barriers.
Architectures and Evaluation for Adjustable Control Autonomy for Space-Based Life Support Systems
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Schreckenghost, Debra K.
2001-01-01
In the past five years, a number of automation applications for control of crew life support systems have been developed and evaluated in the Adjustable Autonomy Testbed at NASA's Johnson Space Center. This paper surveys progress on an adjustable autonomous control architecture for situations where software and human operators work together to manage anomalies and other system problems. When problems occur, the level of control autonomy can be adjusted, so that operators and software agents can work together on diagnosis and recovery. In 1997 adjustable autonomy software was developed to manage gas transfer and storage in a closed life support test. Four crewmembers lived and worked in a chamber for 91 days, with both air and water recycling. CO2 was converted to O2 by gas processing systems and wheat crops. With the automation software, significantly fewer hours were spent monitoring operations. System-level validation testing of the software by interactive hybrid simulation revealed problems both in software requirements and implementation. Since that time, we have been developing multi-agent approaches for automation software and human operators, to cooperatively control systems and manage problems. Each new capability has been tested and demonstrated in realistic dynamic anomaly scenarios, using the hybrid simulation tool.
NASA Technical Reports Server (NTRS)
Wichmann, Benjamin C.
2013-01-01
I work directly with the System Monitoring and Control (SMC) software engineers who develop, test and release custom and commercial software in support of the Kennedy Space Center Spaceport Command and Control System (SCCS). SMC uses Commercial Off-The-Shelf (COTS) Enterprise Management Systems (EMS) software which provides a centralized subsystem for configuring, monitoring, and controlling SCCS hardware and software used in the Control Rooms. There are multiple projects being worked on using the COTS EMS software. I am currently working with the HP Operations Manager for UNIX (OMU) software which allows Master Console Operators (MCO) to access, view and interpret messages regarding the status of the SCCS hardware and software. The OMU message browser gets cluttered with messages which can make it difficult for the MCO to manage. My main project involves determining ways to reduce the number of messages being displayed in the OMU message browser. I plan to accomplish this task in two different ways: (1) by correlating multiple messages into one single message being displayed and (2) by creating policies that will determine the significance of each message and whether or not it needs to be displayed to the MCO. The core idea is to lessen the number of messages being sent to the OMU message browser so the MCO can more effectively use it.
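The two reduction strategies described, correlating repeated messages into one and filtering by a significance policy, can be illustrated in a few lines. This is a hedged, generic sketch: it does not use HP OMU's actual message format, policy language, or APIs, and the message records and severity ranks below are invented.

```python
from collections import defaultdict

# Invented message records standing in for raw events.
raw_messages = [
    {"source": "gateway-01", "severity": "minor",    "text": "link flap"},
    {"source": "gateway-01", "severity": "minor",    "text": "link flap"},
    {"source": "gateway-01", "severity": "minor",    "text": "link flap"},
    {"source": "fd-server",  "severity": "critical", "text": "process down"},
    {"source": "fd-server",  "severity": "normal",   "text": "heartbeat ok"},
]

# (1) Correlation: collapse repeated identical messages into one with a count.
def correlate(messages):
    counts = defaultdict(int)
    for m in messages:
        counts[(m["source"], m["severity"], m["text"])] += 1
    return [
        {"source": s, "severity": sev, "text": t, "count": n}
        for (s, sev, t), n in counts.items()
    ]

# (2) Policy: only forward messages at or above a significance threshold.
SEVERITY_RANK = {"normal": 0, "minor": 1, "major": 2, "critical": 3}

def apply_policy(messages, min_severity="minor"):
    threshold = SEVERITY_RANK[min_severity]
    return [m for m in messages if SEVERITY_RANK[m["severity"]] >= threshold]

browser_view = apply_policy(correlate(raw_messages), min_severity="minor")
for m in browser_view:
    print(m)
```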
DOE Office of Scientific and Technical Information (OSTI.GOV)
Puckett, Elbridge Gerry; Miller, Gregory Hale
Much of the work conducted under the auspices of DE-FG02-03ER25579 was characterized by an exceptionally close collaboration with researchers at the Lawrence Berkeley National Laboratory (LBNL). For example, Andy Nonaka, one of Professor Miller's graduate students in the Department of Applied Science at U. C. Davis (UCD), wrote his PhD thesis in an area of interest to researchers in the Applied Numerical Algorithms Group (ANAG), which is a part of the National Energy Research Supercomputer Center (NERSC) at LBNL. Dr. Nonaka collaborated closely with these researchers and subsequently published the results of this collaboration jointly with them, one article in a peer-reviewed journal and one paper in the proceedings of a conference. Dr. Nonaka is now a research scientist in the Center for Computational Sciences and Engineering (CCSE), which is also part of the National Energy Research Supercomputer Center (NERSC) at LBNL. This collaboration with researchers at LBNL also included having one of Professor Puckett's graduate students in the Graduate Group in Applied Mathematics (GGAM) at UCD, Sarah Williams, spend the summer working with Dr. Ann Almgren, who is a staff scientist in CCSE. As a result of this visit Sarah decided to work on a problem suggested by the head of CCSE, Dr. John Bell, for her PhD thesis. Having finished all of the coursework and examinations required for a PhD, Sarah stayed at LBNL to work on her thesis under the guidance of Dr. Bell. Sarah finished her PhD thesis in June of 2007. Writing a PhD thesis while working at one of the University of California (UC) managed DOE laboratories is a long-established tradition at UC and Professor Puckett has always encouraged his students to consider doing this. Another one of Professor Puckett's graduate students in the GGAM at UCD, Christopher Algieri, was partially supported with funds from DE-FG02-03ER25579 while he wrote his MS thesis in which he analyzed and extended work originally published by Dr. Phillip Colella, the head of ANAG, and some of his colleagues. Chris Algieri is now employed as a staff member in Dr. Bill Collins' Climate Science Department in the Earth Sciences Division at LBNL working with computational models of climate change. Finally, it should be noted that the work conducted by Professor Puckett and his students Sarah Williams and Chris Algieri and described in this final report for DOE grant # DE-FC02-03ER25579 is closely related to work performed by Professor Puckett and his students under the auspices of Professor Puckett's DOE SciDAC grant DE-FC02-01ER25473, An Algorithmic and Software Framework for Applied Partial Differential Equations: A DOE SciDAC Integrated Software Infrastructure Center (ISIC). Dr. Colella was the lead PI for this SciDAC grant, which comprised several research groups from DOE national laboratories and five university PIs from five different universities. In theory Professor Puckett tried to use funds from the SciDAC grant to support work directly involved in implementing algorithms developed by members of his research group at UCD as software that might be of use to Puckett's SciDAC CoPIs. (For example, see the work reported in Section 2.2.2 of this final report.)
However, since there is considerable lead time spent developing such algorithms before they are ready to become 'software' and research plans and goals change as the research progresses, Professor Puckett supported each member of his research group partially with funds from the SciDAC APDEC ISIC DE-FC02-01ER25473 and partially with funds from this DOE MICS grant DE-FC02-03ER25579. This has necessarily resulted in a significant overlap of project areas that were funded by both grants. In particular, both Sarah Williams and Chris Algieri were supported partially with funds from grant # DE-FG02-03ER25579, for which this is the final report, and in part with funds from Professor Puckett's DOE SciDAC grant # DE-FC02-01ER25473. For example, Sarah Williams received support from DE-FC02-01ER25473 and DE-FC02-03ER25579, both while at UCD taking classes and writing her MS thesis and during the first year she was living in Berkeley and working at LBNL on her PhD thesis. In Chris Algieri's case he was at UCD during the entire time he received support from both grants. More specific details of their work are included in the report.
NASA-LaRc Flight-Critical Digital Systems Technology Workshop
NASA Technical Reports Server (NTRS)
Meissner, C. W., Jr. (Editor); Dunham, J. R. (Editor); Crim, G. (Editor)
1989-01-01
The outcome is documented of a Flight-Critical Digital Systems Technology Workshop held at NASA-Langley December 13 to 15 1988. The purpose of the workshop was to elicit the aerospace industry's view of the issues which must be addressed for the practical realization of flight-critical digital systems. The workshop was divided into three parts: an overview session; three half-day meetings of seven working groups addressing aeronautical and space requirements, system design for validation, failure modes, system modeling, reliable software, and flight test; and a half-day summary of the research issues presented by the working group chairmen. Issues that generated the most consensus across the workshop were: (1) the lack of effective design and validation methods with support tools to enable engineering of highly-integrated, flight-critical digital systems, and (2) the lack of high quality laboratory and field data on system failures especially due to electromagnetic environment (EME).
Markkanen, Pia; Quinn, Margaret; Galligan, Catherine; Sama, Susan; Brouillette, Natalie; Okyere, Daniel
2014-04-01
Home care (HC) aide is the fastest growing occupation, yet job hazards are under-studied. This study documents the context of HC aide work, characterizes occupational safety and health (OSH) hazards, and identifies preventive interventions using qualitative methods. We conducted 12 focus groups among aides and 26 in-depth interviews comprising 15 HC agency, union, and insurance company representatives as well as 11 HC recipients in Massachusetts. All focus groups and interviews were audio-recorded, transcribed, and coded with NVIVO software. Major OSH concerns were musculoskeletal disorders from client care tasks and verbal abuse. Performing tasks beyond specified job duties may be an OSH risk factor. HC aides' safety and clients' safety are closely linked. Client handling devices, client evaluation, care plan development, and training are key interventions for both aides' and clients' safety. Promoting OSH in HC is essential for maintaining a viable workforce. © 2013 Wiley Periodicals, Inc.
Vrijsen, Bart; Chatwin, Michelle; Contal, Oliver; Derom, Eric; Janssens, Jean-Paul; Kampelmacher, Mike J; Muir, Jean-Francois; Pinto, Susana; Rabec, Claudio; Ramsay, Michelle; Randerath, Winfried J; Storre, Jan H; Wijkstra, Peter J; Windisch, Wolfram; Testelmans, Dries
2015-09-01
During the last few decades, attention has increasingly focused on noninvasive ventilation (NIV) in the treatment of chronic respiratory failure. The University of Leuven and the University Hospitals Leuven therefore chose this topic for a 2-day working group session during their International Symposium on Sleep-Disordered Breathing. Numerous European experts took part in this session and discussed (1) NIV in amyotrophic lateral sclerosis (when to start NIV, NIV and sleep, secretion management, and what to do when NIV fails), (2) recent insights in NIV and COPD (high-intensity NIV, NIV in addition to exercise training, and NIV during exercise training), (3) monitoring of NIV (monitoring devices, built-in ventilator software, leaks, and asynchronies) and identifying events during NIV; and (4) recent and future developments in NIV (target-volume NIV, electromyography-triggered NIV, and autoregulating algorithms). Copyright © 2015 by Daedalus Enterprises.
NASA Astrophysics Data System (ADS)
Kristianti, Y.; Prabawanto, S.; Suhendra, S.
2017-09-01
This study aims to examine the critical thinking ability of students who learn mathematics through the ASSURE learning model assisted by Autograph software. The design of this study was an experimental pre-test and post-test control group design. The experimental group received mathematics learning with the ASSURE model assisted by Autograph software and the control group received mathematics learning with the conventional model. The data were obtained from tests of critical thinking skills. This research was conducted at the junior high school level, with the population consisting of students of one junior high school in Subang Regency in the 2016/2017 school year and a sample of two classes of grade VIII students from that school. The research data were analysed quantitatively: a one-way ANOVA test was applied to the normalized gain scores of the two sample groups. The results show that mathematics learning with the ASSURE model assisted by Autograph software can improve the critical thinking ability of junior high school students. Mathematical learning using the ASSURE model assisted by Autograph software is significantly better at improving the critical thinking skills of junior high school students than the conventional model.
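The analysis described, normalized gain compared between two groups with a one-way ANOVA, can be made concrete with a short worked example. The sketch below uses the usual normalized gain formula g = (post - pre) / (max - pre) and invented scores (assuming a maximum score of 100); it is illustrative only and does not use the study's data.

```python
import numpy as np
from scipy import stats

def normalized_gain(pre, post, max_score=100.0):
    """Normalized gain g = (post - pre) / (max - pre) per student."""
    pre, post = np.asarray(pre, float), np.asarray(post, float)
    return (post - pre) / (max_score - pre)

# Invented pre/post test scores for illustration; not the study's data.
experiment_pre  = [40, 35, 50, 45, 55, 38, 42, 48]
experiment_post = [75, 70, 85, 78, 88, 72, 80, 82]
control_pre     = [42, 36, 48, 44, 52, 40, 45, 47]
control_post    = [60, 55, 66, 63, 70, 58, 64, 65]

g_exp = normalized_gain(experiment_pre, experiment_post)
g_ctl = normalized_gain(control_pre, control_post)

# One-way ANOVA on the normalized gains of the two groups (with only two
# groups this is equivalent to an independent t-test).
f_stat, p_value = stats.f_oneway(g_exp, g_ctl)
print(f"mean gain (ASSURE + Autograph) = {g_exp.mean():.2f}")
print(f"mean gain (conventional)       = {g_ctl.mean():.2f}")
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```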
ERIC Educational Resources Information Center
Kendall, Leslie R.
2013-01-01
Individuals who have Asperger's Syndrome/High-Functioning Autism, as a group, are chronically underemployed and underutilized. Many in this group have abilities that are well suited for various roles within the practice of software development. Multiple studies have shown that certain organizational and management changes in the software…
AAS Publishing News: Astronomical Software Citation Workshop
NASA Astrophysics Data System (ADS)
Kohler, Susanna
2015-07-01
Do you write code for your research? Use astronomical software? Do you wish there were a better way of citing, sharing, archiving, or discovering software for astronomy research? You're not alone! In April 2015, AAS's publishing team joined other leaders in the astronomical software community in a meeting funded by the Sloan Foundation, with the purpose of discussing these issues and potential solutions. In attendance were representatives from academic astronomy, publishing, libraries, for-profit software sharing platforms, telescope facilities, and grantmaking institutions. The goal of the group was to establish “protocols, policies, and platforms for astronomical software citation, sharing, and archiving,” in the hopes of encouraging a set of normalized standards across the field. The AAS is now collaborating with leaders at GitHub to write grant proposals for a project to develop strategies for software discoverability and citation, in astronomy and beyond. If this topic interests you, you can find more details in this document released by the group after the meeting: http://astronomy-software-index.github.io/2015-workshop/ The group hopes to move this project forward with input and support from the broader community. Please share the above document, discuss it on social media using the hashtag #astroware (so that your conversations can be found!), or send private comments to julie.steffen@aas.org.
NASA Astrophysics Data System (ADS)
Ruby, Michael
In the last decades scanning probe microscopy and spectroscopy have become well-established tools in nanotechnology and surface science. This opened the market for many commercial manufacturers, each with different hardware and software standards. Besides the advantage of a wide variety of available hardware, this diversity can complicate, on the software side, the exchange of data between scientists and the analysis of data for groups working with hardware developed by different manufacturers. Not only does the file format differ between manufacturers, but the data also often require further numerical treatment before publication. SpectraFox is an open-source and independent tool which manages, processes, and evaluates scanning probe spectroscopy and microscopy data. It aims at simplifying the documentation in parallel to measurement, and it provides solid evaluation tools for large amounts of data.
Space life support engineering program
NASA Technical Reports Server (NTRS)
Seagrave, Richard C.
1992-01-01
A comprehensive study to develop software to simulate the dynamic operation of water reclamation systems in long-term closed-loop life support systems is being carried out as part of an overall program for the design of systems for a moon station or a Mars voyage. This project is being done in parallel with a similar effort in the Department of Chemistry to develop durable accurate low-cost sensors for monitoring of trace chemical and biological species in recycled water supplies. Aspen-Plus software is being used on a group of high-performance work stations to develop the steady state descriptions for a number of existing technologies. Following completion, a dynamic simulation package will be developed for determining the response of such systems to changes in the metabolic needs of the crew and to upsets in system hardware performance.
Recommendations for a service framework to access astronomical archives
NASA Technical Reports Server (NTRS)
Travisano, J. J.; Pollizzi, J.
1992-01-01
There are a large number of astronomical archives and catalogs on-line for network access, with many different user interfaces and features. Some systems are moving towards distributed access, supplying users with client software for their home sites which connects to servers at the archive site. Many of the issues involved in defining a standard framework of services that archive/catalog suppliers can use to achieve a basic level of interoperability are described. Such a framework would simplify the development of client and server programs to access the wide variety of astronomical archive systems. The primary services that are supplied by current systems include: catalog browsing, dataset retrieval, name resolution, and data analysis. The following issues (and probably more) need to be considered in establishing a standard set of client/server interfaces and protocols: Archive Access - dataset retrieval, delivery, file formats, data browsing, analysis, etc.; Catalog Access - database management systems, query languages, data formats, synchronous/asynchronous mode of operation, etc.; Interoperability - transaction/message protocols, distributed processing mechanisms (DCE, ONC/SunRPC, etc), networking protocols, etc.; Security - user registration, authorization/authentication mechanisms, etc.; Service Directory - service registration, lookup, port/task mapping, parameters, etc.; Software - public vs proprietary, client/server software, standard interfaces to client/server functions, software distribution, operating system portability, data portability, etc. Several archive/catalog groups, notably the Astrophysics Data System (ADS), are already working in many of these areas. In the process of developing StarView, which is the user interface to the Space Telescope Data Archive and Distribution Service (ST-DADS), these issues and the work of others were analyzed. A framework of standard interfaces for accessing services on any archive system which would benefit archive user and supplier alike is proposed.
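The framework proposed above is essentially a set of standard client/server interfaces behind which each archive can implement its own backend. The sketch below illustrates that idea as an abstract service interface with a toy in-memory implementation; the class and method names are hypothetical and are not those of StarView, ST-DADS, or the ADS.

```python
from abc import ABC, abstractmethod
from typing import Any, Dict, List

class ArchiveService(ABC):
    """Hypothetical client-side interface for a standard archive framework."""

    @abstractmethod
    def query_catalog(self, constraints: Dict[str, Any]) -> List[Dict[str, Any]]:
        """Catalog browsing: return catalog rows matching the constraints."""

    @abstractmethod
    def resolve_name(self, object_name: str) -> Dict[str, float]:
        """Name resolution: map an object name to coordinates (ra/dec, deg)."""

    @abstractmethod
    def retrieve_dataset(self, dataset_id: str, destination: str) -> str:
        """Dataset retrieval: fetch a dataset and return the local file path."""

class DemoArchive(ArchiveService):
    """Toy in-memory implementation showing how one archive could plug in."""
    _catalog = [{"id": "obs-001", "target": "M31", "ra": 10.684, "dec": 41.269}]

    def query_catalog(self, constraints):
        return [row for row in self._catalog
                if all(row.get(k) == v for k, v in constraints.items())]

    def resolve_name(self, object_name):
        hits = self.query_catalog({"target": object_name})
        return {"ra": hits[0]["ra"], "dec": hits[0]["dec"]} if hits else {}

    def retrieve_dataset(self, dataset_id, destination):
        path = f"{destination}/{dataset_id}.fits"
        # A real server would transfer the file; here we only report the path.
        return path

archive = DemoArchive()
print(archive.resolve_name("M31"))
```

A client written against `ArchiveService` alone would then interoperate with any archive that supplies a conforming implementation, which is the interoperability goal the abstract describes.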
Specification for Visual Requirements of Work-Centered Software Systems
2006-10-01
...work-aiding systems. Based on the design concept for a work-centered support system (WCSS), these software systems support user tasks and goals...through both direct and indirect aiding methods within the interface client. In order to ensure the coherent development and delivery of work-centered...
SU-F-P-04: Implementation of Dose Monitoring Software: Successes and Pitfalls
DOE Office of Scientific and Technical Information (OSTI.GOV)
Och, J
2016-06-15
Purpose: to successfully install a dose monitoring software (DMS) application to assist in CT protocol and dose management. Methods: Upon selecting the DMS, we began our implementation of the application. A working group composed of Medical Physics, Radiology Administration, Information Technology, and CT technologists was formed. On-site training in the application was supplied by the vendor. The decision was made to apply the process for all the CT protocols on all platforms at all facilities. Protocols were painstakingly mapped to the correct masters, and the system went ‘live’. Results: We are routinely using DMS as a tool in our Clinical Performance CT QA program. It is useful in determining the effectiveness of revisions to existing protocols, and establishing performance baselines for new units. However, the implementation was not without difficulty. We identified several pitfalls and obstacles which frustrated progress, including training deficiencies, nomenclature problems, communication, and DICOM variability. Conclusion: Dose monitoring software can be a potent tool for QA. However, implementation of the program can be problematic and requires planning, organization and commitment.
Orbiter Flying Qualities (OFQ) Workstation user's guide
NASA Technical Reports Server (NTRS)
Myers, Thomas T.; Parseghian, Zareh; Hogue, Jeffrey R.
1988-01-01
This project was devoted to the development of a software package, called the Orbiter Flying Qualities (OFQ) Workstation, for working with the OFQ Archives which are specially selected sets of space shuttle entry flight data relevant to flight control and flying qualities. The basic approach to creation of the workstation software was to federate and extend commercial software products to create a low cost package that operates on personal computers. Provision was made to link the workstation to large computers, but the OFQ Archive files were also converted to personal computer diskettes and can be stored on workstation hard disk drives. The primary element of the workstation developed in the project is the Interactive Data Handler (IDH) which allows the user to select data subsets from the archives and pass them to specialized analysis programs. The IDH was developed as an application in a relational database management system product. The specialized analysis programs linked to the workstation include a spreadsheet program, FREDA for spectral analysis, MFP for frequency domain system identification, and NIPIP for pilot-vehicle system parameter identification. The workstation also includes capability for ensemble analysis over groups of missions.
Baghban, Iran; Malekiha, Marziyeh; Fatehizadeh, Maryam
2010-01-01
Work-family conflict has many negative outcomes for organization and career and family life of each person. The aim of present study was to determine the relationship between work-family conflict and the level of self-efficacy in female nurses. In this cross-sectional descriptive research, the relationship between work-family conflict and the level of self-efficacy in female nurses of Alzahra Hospital was assessed. Questionnaire, demographic data form, work-family conflict scale and self-efficacy scale were the data collection tools. Content analysis and Cronbach's alpha were used for evaluating the validity and reliability of questionnaire. The study sample included 160 nurses (80 permanent nurses and 80 contract-based nurses) selected through simple random sampling from nurses working in different wards of Alzahra Hospital. Data analysis was done using SPSS software. There was significant difference in work-family conflict between the two groups of permanent and contract-based nurses (p = 0.02). Also, a significant difference in the level of self-efficacy was observed between the two groups of nurses (p = 0.03). The level of self-efficacy and work-family conflict in contract-based nurses was not acceptable. Therefore, it is suggested to arrange courses to train effective skills in the field of management of work-family conflicts in order to increase the level of self-efficacy for contract-based nurses.
Integrated IMA (Information Mission Areas) IC (Information Center) Guide
1989-06-01
[Contents fragment] Topics include: computer aided design / computer aided manufacture; liquid crystal display panels; artificial intelligence applied to VI; desktop publishing; intelligent copiers; electronic alternatives to printed documents; electronic forms generation; optical disk storage; image scanners; graphics output devices and software; and work group systems.
Sample EP Flow Analysis of Severely Damaged Networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Werley, Kenneth Alan; McCown, Andrew William
These are slides for a presentation at the working group meeting of the WESC SREMP Software Product Integration Team on sample EP flow analysis of severely damaged networks. The following topics are covered: ERCOT EP Transmission Model; Zoomed in to Houston and Overlaying StreetAtlas; EMPACT Solve/Dispatch/Shedding Options; QACS BaseCase Power Flow Solution; 3 Substation Contingency; Gen. & Load/100 Optimal Dispatch; Dispatch Results; Shed Load for Low V; Network Damage Summary; Estimated Service Areas (Potential); Estimated Outage Areas (potential).
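The contingency and load-shedding screening listed in the slide titles can be illustrated with a toy network analysis. The sketch below is a crude connectivity-based screen on an invented six-bus network; it is not the ERCOT model, the EMPACT solver, or an actual power-flow calculation, and all values are made up.

```python
import networkx as nx

# Toy transmission network: nodes carry generation and load in MW.
G = nx.Graph()
buses = {
    "gen_A": {"gen": 500, "load": 0},
    "gen_B": {"gen": 300, "load": 0},
    "sub_1": {"gen": 0,   "load": 200},
    "sub_2": {"gen": 0,   "load": 250},
    "sub_3": {"gen": 0,   "load": 150},
    "sub_4": {"gen": 0,   "load": 100},
}
G.add_nodes_from(buses.items())
G.add_edges_from([("gen_A", "sub_1"), ("sub_1", "sub_2"), ("sub_2", "sub_3"),
                  ("gen_B", "sub_3"), ("sub_3", "sub_4"), ("sub_1", "sub_4")])

def served_load(graph):
    """Load is 'served' only up to the generation available in its island;
    any excess island load is assumed shed (a crude screening rule)."""
    total = 0.0
    for island in nx.connected_components(graph):
        gen = sum(graph.nodes[n]["gen"] for n in island)
        load = sum(graph.nodes[n]["load"] for n in island)
        total += min(gen, load)
    return total

base = served_load(G)
damaged = G.copy()
damaged.remove_nodes_from(["sub_3"])        # a single-substation contingency
after = served_load(damaged)
print(f"served load: {base:.0f} MW -> {after:.0f} MW "
      f"({base - after:.0f} MW shed)")
```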
What Librarians Still Don't Know about Free Software
ERIC Educational Resources Information Center
Chudnov, Daniel
2009-01-01
Free software isn't about cost, and it isn't about hype, it isn't about taking business away from vendors. It's about four kinds of freedom--the freedom to use the software for any purpose, the freedom to study how the software works, the freedom to modify the software to adapt it to one's needs, and the freedom to copy and share copies of the…
2012-01-01
Background: Long terminal repeat (LTR) retrotransposons are a class of eukaryotic mobile elements characterized by a distinctive sequence similarity-based structure. Hence they are well suited for computational identification. Current software allows for a comprehensive genome-wide de novo detection of such elements. The obvious next step is the classification of newly detected candidates resulting in (super-)families. Such a de novo classification approach based on sequence-based clustering of transposon features has been proposed before, resulting in a preliminary assignment of candidates to families as a basis for subsequent manual refinement. However, such a classification workflow is typically split across a heterogeneous set of glue scripts and generic software (for example, spreadsheets), making it tedious for a human expert to inspect, curate and export the putative families produced by the workflow. Results: We have developed LTRsift, an interactive graphical software tool for semi-automatic postprocessing of de novo predicted LTR retrotransposon annotations. Its user-friendly interface offers customizable filtering and classification functionality, displaying the putative candidate groups, their members and their internal structure in a hierarchical fashion. To ease manual work, it also supports graphical user interface-driven reassignment, splitting and further annotation of candidates. Export of grouped candidate sets in standard formats is possible. In two case studies, we demonstrate how LTRsift can be employed in the context of a genome-wide LTR retrotransposon survey effort. Conclusions: LTRsift is a useful and convenient tool for semi-automated classification of newly detected LTR retrotransposons based on their internal features. Its efficient implementation allows for convenient and seamless filtering and classification in an integrated environment. Developed for life scientists, it is helpful in postprocessing and refining the output of software for predicting LTR retrotransposons up to the stage of preparing full-length reference sequence libraries. The LTRsift software is freely available at http://www.zbh.uni-hamburg.de/LTRsift under an open-source license. PMID:23131050
Steinbiss, Sascha; Kastens, Sascha; Kurtz, Stefan
2012-11-07
Long terminal repeat (LTR) retrotransposons are a class of eukaryotic mobile elements characterized by a distinctive sequence similarity-based structure. Hence they are well suited for computational identification. Current software allows for a comprehensive genome-wide de novo detection of such elements. The obvious next step is the classification of newly detected candidates resulting in (super-)families. Such a de novo classification approach based on sequence-based clustering of transposon features has been proposed before, resulting in a preliminary assignment of candidates to families as a basis for subsequent manual refinement. However, such a classification workflow is typically split across a heterogeneous set of glue scripts and generic software (for example, spreadsheets), making it tedious for a human expert to inspect, curate and export the putative families produced by the workflow. We have developed LTRsift, an interactive graphical software tool for semi-automatic postprocessing of de novo predicted LTR retrotransposon annotations. Its user-friendly interface offers customizable filtering and classification functionality, displaying the putative candidate groups, their members and their internal structure in a hierarchical fashion. To ease manual work, it also supports graphical user interface-driven reassignment, splitting and further annotation of candidates. Export of grouped candidate sets in standard formats is possible. In two case studies, we demonstrate how LTRsift can be employed in the context of a genome-wide LTR retrotransposon survey effort. LTRsift is a useful and convenient tool for semi-automated classification of newly detected LTR retrotransposons based on their internal features. Its efficient implementation allows for convenient and seamless filtering and classification in an integrated environment. Developed for life scientists, it is helpful in postprocessing and refining the output of software for predicting LTR retrotransposons up to the stage of preparing full-length reference sequence libraries. The LTRsift software is freely available at http://www.zbh.uni-hamburg.de/LTRsift under an open-source license.
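The feature-based grouping of candidates into putative (super-)families described above can be sketched with a simple clustering step. The example below clusters invented candidate feature vectors with hierarchical clustering; the feature choices, values, and clustering criterion are illustrative only and do not reproduce LTRsift's actual classification logic.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Invented feature vectors for de novo LTR retrotransposon candidates:
# [element length (kb), LTR length (kb), has reverse-transcriptase domain (0/1)].
candidates = {
    "cand_01": [9.8, 1.5, 1],
    "cand_02": [10.1, 1.6, 1],
    "cand_03": [5.2, 0.4, 0],
    "cand_04": [5.0, 0.5, 0],
    "cand_05": [9.9, 1.4, 1],
}
names = list(candidates)
X = np.array([candidates[n] for n in names], dtype=float)

# Standardize features so lengths (kb) and binary flags are comparable.
X = (X - X.mean(axis=0)) / X.std(axis=0)

# Average-linkage hierarchical clustering, cut into putative families.
Z = linkage(pdist(X), method="average")
labels = fcluster(Z, t=2, criterion="maxclust")

families = {}
for name, label in zip(names, labels):
    families.setdefault(label, []).append(name)
print(families)   # e.g. {1: ['cand_01', 'cand_02', 'cand_05'], 2: [...]}
```

The hierarchical tree produced by `linkage` is also what makes the kind of interactive reassignment and splitting described in the abstract natural: cutting the tree at a different level yields coarser or finer putative families.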
SIGKit: a New Data-based Software for Learning Introductory Geophysics
NASA Astrophysics Data System (ADS)
Zhang, Y.; Kruse, S.; George, O.; Esmaeili, S.; Papadimitrios, K. S.; Bank, C. G.; Cadmus, A.; Kenneally, N.; Patton, K.; Brusher, J.
2016-12-01
Students of diverse academic backgrounds take introductory geophysics courses to learn the theory of a variety of measurement and analysis methods with the expectation to be able to apply their basic knowledge to real data. Ideally, such data is collected in field courses and also used in lecture-based courses because they provide a critical context for better learning and understanding of geophysical methods. Each method requires a separate software package for the data processing steps, and the complexity and variety of professional software makes the path through data processing to data interpretation a strenuous learning process for students and a challenging teaching task for instructors. SIGKit (Student Investigation of Geophysics Toolkit) being developed as a collaboration between the University of South Florida, the University of Toronto, and MathWorks intends to address these shortcomings by showing the most essential processing steps and allowing students to visualize the underlying physics of the various methods. It is based on MATLAB software and offered as an easy-to-use graphical user interface and packaged so it can run as an executable in the classroom and the field even on computers without MATLAB licenses. An evaluation of the software based on student feedback from focus-group interviews and think-aloud observations helps drive its development and refinement. The toolkit provides a logical gateway into the more sophisticated and costly software students will encounter later in their training and careers by combining essential visualization, modeling, processing, and analysis steps for seismic, GPR, magnetics, gravity, resistivity, and electromagnetic data.
Birgiolas, Justas; Jernigan, Christopher M.; Gerkin, Richard C.; Smith, Brian H.; Crook, Sharon M.
2017-01-01
Many scientifically and agriculturally important insects use antennae to detect the presence of volatile chemical compounds and extend their proboscis during feeding. The ability to rapidly obtain high-resolution measurements of natural antenna and proboscis movements and assess how they change in response to chemical, developmental, and genetic manipulations can aid the understanding of insect behavior. By extending our previous work on assessing aggregate insect swarm or animal group movements from natural and laboratory videos using the video analysis software SwarmSight, we developed a novel, free, and open-source software module, SwarmSight Appendage Tracking (SwarmSight.org) for frame-by-frame tracking of insect antenna and proboscis positions from conventional web camera videos using conventional computers. The software processes frames about 120 times faster than humans, performs at better than human accuracy, and, using 30 frames per second (fps) videos, can capture antennal dynamics up to 15 Hz. The software was used to track the antennal response of honey bees to two odors and found significant mean antennal retractions away from the odor source about 1 s after odor presentation. We observed antenna position density heat map cluster formation and cluster and mean angle dependence on odor concentration. PMID:29364251
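The per-frame idea behind appendage tracking, finding where motion occurred between consecutive frames and expressing it as an angle about the head, can be sketched with plain array operations. The code below is a deliberately crude stand-in, run on two tiny synthetic frames; it is not SwarmSight's tracker, and the threshold and head position are arbitrary.

```python
import numpy as np

def motion_angle(prev_frame, frame, head_xy, threshold=30):
    """Angle (degrees) of the moving-pixel centroid about the head position.

    Differences two grayscale frames, keeps pixels that changed by more than
    `threshold`, and measures the angle of their centroid relative to
    `head_xy` (image y-axis points down).
    """
    diff = np.abs(frame.astype(int) - prev_frame.astype(int))
    ys, xs = np.nonzero(diff > threshold)
    if xs.size == 0:
        return None                       # nothing moved in this frame pair
    cx, cy = xs.mean(), ys.mean()
    return float(np.degrees(np.arctan2(cy - head_xy[1], cx - head_xy[0])))

# Two tiny synthetic 100x100 frames with a bright "antenna tip" that moves.
prev_frame = np.zeros((100, 100), dtype=np.uint8)
frame = np.zeros((100, 100), dtype=np.uint8)
prev_frame[30, 70] = 255     # tip at frame t-1
frame[25, 72] = 255          # tip at frame t

print(motion_angle(prev_frame, frame, head_xy=(50, 50)))
```

Accumulating such per-frame positions or angles over many frames is what yields the position-density heat maps and mean-angle measures reported in the abstract.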
UROKIN: A Software to Enhance Our Understanding of Urogenital Motion.
Czyrnyj, Catriona S; Labrosse, Michel R; Graham, Ryan B; McLean, Linda
2018-05-01
Transperineal ultrasound (TPUS) allows for objective quantification of mid-sagittal urogenital mechanics, yet current practice omits dynamic motion information in favor of analyzing only a rest and a peak motion frame. This work details the development of UROKIN, a semi-automated software which calculates kinematic curves of urogenital landmark motion. A proof-of-concept analysis was performed using UROKIN on TPUS videos recorded from 20 women with and 10 women without stress urinary incontinence (SUI) while they performed maximum voluntary contractions of the pelvic floor muscles. The anorectal angle and bladder neck were tracked while the motion of the pubic symphysis was used to compensate for the error incurred by TPUS probe motion during imaging. Kinematic curves of landmark motion were generated for each video and curves were smoothed, time normalized, and averaged within groups. Kinematic data yielded by the UROKIN software showed statistically significant differences between women with and without SUI in terms of magnitude and timing characteristics of the kinematic curves depicting landmark motion. Results provide insight into the ways in which UROKIN may be useful to study differences in pelvic floor muscle contraction mechanics between women with and without SUI and other pelvic floor disorders. The UROKIN software improves on methods described in the literature and provides unique capacity to further our understanding of urogenital biomechanics.
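Time normalization and within-group averaging of kinematic curves, as described above, can be shown in a brief sketch. The curves below are invented displacement traces of different durations, not the study's data, and the procedure is a generic one rather than UROKIN's exact implementation.

```python
import numpy as np

def time_normalize(curve, n_points=101):
    """Resample a kinematic curve onto 0-100% of the movement cycle."""
    curve = np.asarray(curve, dtype=float)
    old_t = np.linspace(0.0, 1.0, len(curve))
    new_t = np.linspace(0.0, 1.0, n_points)
    return np.interp(new_t, old_t, curve)

# Invented bladder-neck displacement curves (mm) of different durations,
# standing in for per-video output.
trials = [
    [0.0, 1.2, 2.8, 4.1, 4.9, 4.6, 3.0],
    [0.0, 0.9, 2.1, 3.5, 4.4, 4.8, 4.2, 2.7, 1.1],
    [0.0, 1.5, 3.2, 4.6, 5.1, 4.0],
]

normalized = np.vstack([time_normalize(t) for t in trials])
group_mean = normalized.mean(axis=0)          # within-group ensemble average
group_sd = normalized.std(axis=0, ddof=1)
print(f"peak of mean curve: {group_mean.max():.2f} mm "
      f"at {group_mean.argmax()}% of the contraction")
```

Because every trial is resampled onto the same 0-100% axis, both the magnitude and the timing of the peak become directly comparable between groups.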
Upper Secondary and Vocational Level Teachers at Social Software
ERIC Educational Resources Information Center
Valtonen, Teemu; Kontkanen, Sini; Dillon, Patrick; Kukkonen, Jari; Väisänen, Pertti
2014-01-01
This study focuses on upper secondary and vocational level teachers as users of social software i.e. what software they use during their leisure and work and for what purposes they use software in teaching. The study is theorised within a technological pedagogical content knowledge framework, the emphasis is especially on technological knowledge…
NASA Technical Reports Server (NTRS)
Shell, Elaine M.; Lue, Yvonne; Chu, Martha I.
1999-01-01
Flight software is a mission critical element of spacecraft functionality and performance. When ground operations personnel interface to a spacecraft, they are typically dealing almost entirely with the capabilities of onboard software. This software, even more than critical ground/flight communications systems, is expected to perform perfectly during all phases of spacecraft life. Because it can be reprogrammed on-orbit to accommodate degradations or failures in flight hardware, new insights into spacecraft characteristics, new control options which permit enhanced science options, etc., the on-orbit flight software maintenance team is usually significantly responsible for the long-term success of a science mission. Failure of flight software to perform as needed can result in very expensive operations work-around costs and lost science opportunities. There are three basic approaches to maintaining spacecraft software--namely using the original developers, using the mission operations personnel, or assembling a center of excellence for multi-spacecraft software maintenance. Not planning properly for flight software maintenance can lead to unnecessarily high on-orbit costs and/or unacceptably long delays, or errors, in patch installations. A common approach for flight software maintenance is to access the original development staff. The argument for utilizing the development staff is that the people who developed the software will be the best people to modify the software on-orbit. However, it can quickly become a challenge to obtain the services of these key people. They may no longer be available to the organization. They may have a more urgent job to perform, quite likely on another project under different project management. If they haven't worked on the software for a long time, they may need precious time for refamiliarization with the software, testbeds and tools. Further, a lack of insight into issues related to flight software in its on-orbit environment may find the developer unprepared for the challenges. The second approach is to train a member of the flight operations team to maintain the spacecraft software. This can prove to be a costly and inflexible solution. The person assigned to this duty may not have enough work to do during a problem-free period and may have too much to do when a problem arises. If the person is a talented software engineer, he/she may not enjoy the limited software opportunities available in this position and may eventually leave for newer technology computer science opportunities. Training replacement flight software personnel can be a difficult and lengthy process. The third approach is to assemble a center of excellence for on-orbit spacecraft software maintenance. Personnel in this specialty center can be managed to support flight software of multiple missions at once. The variety of challenges among a set of on-orbit missions can result in a dedicated, talented staff which is fully trained and available to support each mission's needs. Such staff are not software developers but are rather spacecraft software systems engineers. The cost to any one mission is extremely low because the software staff work and charge only minimally on missions with no current operations issues, and their professional insight into on-orbit software troubleshooting and maintenance methods ensures low risk, effective and minimal-cost solutions to on-orbit issues.
Open-source meteor detection software for low-cost single-board computers
NASA Astrophysics Data System (ADS)
Vida, D.; Zubović, D.; Šegon, D.; Gural, P.; Cupec, R.
2016-01-01
This work aims to overcome the current price threshold of meteor stations which can sometimes deter meteor enthusiasts from owning one. In recent years small card-sized computers became widely available and are used for numerous applications. To utilize such computers for meteor work, software which can run on them is needed. In this paper we present a detailed description of newly-developed open-source software for fireball and meteor detection optimized for running on low-cost single board computers. Furthermore, an update on the development of automated open-source software which will handle video capture, fireball and meteor detection, astrometry and photometry is given.
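The detection stage of such a pipeline typically differences consecutive frames and then decides whether the changed pixels form a streak. The sketch below is a simplified, generic stand-in for that idea, run on synthetic frames; it is not the software described in the paper, whose detection pipeline is considerably more involved, and all parameters are invented.

```python
import numpy as np

def detect_streak(prev_frame, frame, threshold=40, min_pixels=15, max_scatter=2.0):
    """Flag a frame pair as containing a meteor-like streak.

    Difference consecutive frames, threshold, and accept the event only if the
    changed pixels are numerous and nearly collinear (small scatter
    perpendicular to their principal axis).
    """
    diff = frame.astype(int) - prev_frame.astype(int)
    ys, xs = np.nonzero(diff > threshold)
    if xs.size < min_pixels:
        return False
    pts = np.column_stack([xs, ys]).astype(float)
    pts -= pts.mean(axis=0)
    # The smaller singular value measures spread perpendicular to the streak.
    _u, s, _vt = np.linalg.svd(pts, full_matrices=False)
    perpendicular_scatter = s[1] / np.sqrt(len(pts))
    return perpendicular_scatter < max_scatter

# Synthetic 120x160 frames: the second one carries a bright diagonal streak.
prev_frame = np.random.randint(0, 20, (120, 160)).astype(np.uint8)
frame = prev_frame.copy()
for i in range(40):
    frame[30 + i, 50 + i] = 200

print(detect_streak(prev_frame, frame))   # True for this synthetic streak
```

Keeping the per-frame arithmetic this simple (differences, thresholds, a small linear-algebra step) is what makes real-time operation plausible on low-cost single-board computers.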
The Diffuse Soft X-ray Background: Trials and Tribulations
NASA Astrophysics Data System (ADS)
Ulmer, Melville P.
2013-01-01
I joined the University of Wisconsin-Madison sounding rocket group at its inception. It was an exciting time, as nobody knew what the X-ray sky looked like. Our group focused on the soft X-ray background, and built proportional counters with super thin (2 micron thick) windows. As the internal gas pressure of the counters was about 1 atmosphere, it was no mean feat to get the payload to launch without the window bursting. On top of that we built all our own software from space solutions to unfolding the spectral data. For we did it then as now: Our computer code modeled the detector response and then folded various spectral shapes through the response and compared the results with the raw data. As far as interpretation goes, here are examples of how one can get things wrong: The Berkeley group published a paper of the soft X-ray background that disagreed with ours. Why? It turned out they had **assumed** the galactic plane was completely opaque to soft X-rays and hence corrected for detector background that way. It turns out that the ISM emits in soft X-rays! Another example was the faux pas of the Calgary group. They didn't properly shield their detector from the sounding rocket telemetry. Thus they got an enormous signal, which to our amusement some (ambulance chaser) theoreticians tried to explain! So back then as now, mistakes were made, but at least we all knew how our X-ray systems worked from soup (the detectors) to nuts (the data analysis code), whereas today “anybody” with a good idea but only a vague inkling of how detectors, mirrors and software work can be an X-ray astronomer. On the one hand, this has made the field accessible to all, and on the other, errors in interpretation can be made as the X-ray telescope user can fall prey to running black box software. Furthermore, with so much funding going into supporting observers, there is little left to make the necessary technology advances or to keep the core expertise in place even to stay even with today’s observatories. We will need a newly launched facility (or two) or the field will eventually die.
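The fold-and-compare procedure described above (fold candidate spectral shapes through a detector response, then compare with the raw counts) can be shown with a toy example. The response matrix, spectra, and numbers below are entirely invented for illustration and do not represent the Wisconsin counters or their calibration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instrument: 5 true-energy bins mapped onto 5 pulse-height channels.
# Each row says how counts from one true-energy bin are redistributed among
# the measured channels, a stand-in for a proportional-counter response.
response = np.array([
    [0.7, 0.2, 0.1, 0.0, 0.0],
    [0.1, 0.6, 0.2, 0.1, 0.0],
    [0.0, 0.1, 0.6, 0.2, 0.1],
    [0.0, 0.0, 0.2, 0.6, 0.2],
    [0.0, 0.0, 0.1, 0.2, 0.7],
])

def fold(model_spectrum):
    """Predicted counts per channel: model folded through the response."""
    return response.T @ model_spectrum

def chi_square(observed, predicted):
    return float(np.sum((observed - predicted) ** 2 / np.maximum(predicted, 1.0)))

# "True" sky spectrum and a simulated (Poisson) observation of it.
true_spectrum = np.array([300.0, 220.0, 160.0, 110.0, 80.0])
observed = rng.poisson(fold(true_spectrum)).astype(float)

# Compare two candidate spectral shapes by folding each and computing chi^2.
flat_model = np.full(5, true_spectrum.sum() / 5)
steep_model = true_spectrum            # the correct shape, for contrast
for name, model in [("flat", flat_model), ("steep", steep_model)]:
    print(name, round(chi_square(observed, fold(model)), 1))
```

The point of the anecdotes in the abstract is that the result of such a fit is only as good as the assumed response and background model, exactly the places where the Berkeley and Calgary groups went wrong.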
Applying Evolutionary Prototyping In Developing LMIS: A Spatial Web-Based System For Land Management
NASA Astrophysics Data System (ADS)
Agustiono, W.
2018-01-01
Software development projects are difficult undertakings. Especially for software designed to comply with regulations that are constantly being introduced or changed, it is almost impossible to make just one change during the development process; even when it is possible, the developers may face a large amount of work to fix the design to meet the specified needs. This iterative work also takes additional time and can lead to failing to meet the original schedule and budget. Given such inevitable changes, it is essential for developers to carefully consider and use an appropriate method to help them carry out the software development project. This research aims to examine the implementation of a software development method called evolutionary prototyping for developing regulatory-compliance software. It investigates the development of the Land Management Information System (pseudonym), initiated by the Australian government for use by farmers to meet the regulatory demands of the Soil and Land Conservation Act. By doing so, it seeks to provide an understanding of the efficacy of evolutionary prototyping in helping developers address frequently changing requirements and iterative work while still keeping to schedule. The findings also offer useful practical insights for other developers who seek to build similar regulatory-compliance software.
The Effect of Software Features on Software Adoption and Training in the Audit Profession
ERIC Educational Resources Information Center
Kim, Hyo-Jeong
2012-01-01
Although software has been studied with technology adoption and training research, the study of specific software features for professional groups has been limited. To address this gap, I researched the impact of software features of varying complexity on internal audit (IA) professionals. Two studies along with the development of training…
A Novel Coupling Pattern in Computational Science and Engineering Software
Computational science and engineering (CSE) software is written by experts of certain area(s). Due to the specialization, existing CSE software may need to integrate other CSE software systems developed by different groups of experts. The coupling problem is one of the challenges...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-12
... Packard Company, Business Critical Systems, Mission Critical Business Software Division, OpenVMS Operating... Business Software Division, OpenVMS Operating System Development Group, Including an Employee Operating Out... Company, Business Critical Systems, Mission Critical Business Software Division, OpenVMS Operating System...
Sowunmi, Olaperi Yeside; Misra, Sanjay; Fernandez-Sanz, Luis; Crawford, Broderick; Soto, Ricardo
2016-01-01
The importance of quality assurance in the software development process cannot be overemphasized because its adoption results in high reliability and easy maintenance of the software system and other software products. Software quality assurance includes different activities such as quality control, quality management, quality standards, quality planning, process standardization and improvement amongst others. The aim of this work is to further investigate the software quality assurance practices of practitioners in Nigeria. While our previous work covered areas on quality planning, adherence to standardized processes and the inherent challenges, this work has been extended to include quality control, software process improvement and international quality standard organization membership. It also makes a comparison based on a similar study carried out in Turkey. The goal is to generate more robust findings that can properly support decision making by the software community. The qualitative research approach, specifically the use of questionnaire research instruments, was applied to acquire data from software practitioners. In addition to the previous results, it was observed that quality assurance practices are quite neglected and this can be the cause of low patronage. Moreover, software practitioners are neither aware of international standards organizations nor of the required process improvement techniques; as such their claimed standards are not aligned to those of accredited bodies, and are only limited to their local experience and knowledge, which makes them questionable. The comparison with Turkey also yielded similar findings, making the results typical of developing countries. The research instrument used was tested for internal consistency using Cronbach's alpha, and it was proved reliable. For the software industry in developing countries to grow strong and be a viable source of external revenue, software assurance practices have to be taken seriously because their effect is evident in the final product. Moreover, quality frameworks and tools which require minimum time and cost are highly needed in these countries.
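Internal consistency of a questionnaire, reported here via Cronbach's alpha, follows the formula alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). The sketch below computes it on invented Likert-scale responses purely for illustration; it does not use the study's survey data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

# Invented Likert-scale responses (rows: practitioners, columns: items).
responses = [
    [4, 5, 4, 4],
    [3, 4, 3, 4],
    [5, 5, 4, 5],
    [2, 3, 2, 3],
    [4, 4, 4, 5],
]
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```

Values around 0.7 or higher are conventionally read as acceptable internal consistency, which is the sense in which the instrument is described as reliable.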
The Future of Digital Working: Knowledge Migration and Learning
ERIC Educational Resources Information Center
Malcolm, Irene
2014-01-01
Against the backdrop of intensified migration linked to globalisation, this article considers the implications of knowledge migration for future digital workers. It draws empirically on a socio-material analysis of the international software localisation industry. Localisers' work requires linguistic, cultural and software engineering skills to…
Accelerated Math[TM]. What Works Clearinghouse Intervention Report
ERIC Educational Resources Information Center
What Works Clearinghouse, 2011
2011-01-01
"Accelerated Math"[TM], published by Renaissance Learning, is a software tool used to customize assignments and monitor progress in math for students in grades 1-12. The "Accelerated Math"[TM] software creates individualized assignments aligned with state standards and national guidelines, scores student work, and generates…
Educational Software Employing Group Competition Using an Interactive Electronic Whiteboard
ERIC Educational Resources Information Center
Otsuki, Yoko; Bandoh, Hirokazu; Kato, Naoki; Indurkhya, Bipin; Nakagawa, Masaki
2004-01-01
This article presents a design of educational software employing group competition using a large interactive electronic whiteboard, and a report on its experimental use. Group competition and collaboration are useful methods to cultivate originality and communication skills. To share the same space, the same large screen, and face-to-face…
Development of Cross-Platform Software for Well Logging Data Visualization
NASA Astrophysics Data System (ADS)
Akhmadulin, R. K.; Miraev, A. I.
2017-07-01
Well logging data processing is one of the main sources of information in oil-gas field analysis and is of great importance in the process of field development and operation. Therefore, it is important to have software which would accurately and clearly provide the user with processed data in the form of well logs. In this work, a software product has been developed which not only has the basic functionality for this task (loading data from .las files, display of well log curves, etc.), but can also be run on different operating systems and devices. The article presents a subject field analysis and the task formulation, and considers the software design stage. At the end of the work the resulting software product's interface is described.
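The basic workflow named above, reading a .las file and displaying its curves as depth-indexed tracks, can be sketched with common Python libraries. This is an assumption-laden illustration rather than the authors' implementation: it assumes the third-party lasio package is installed, and the file path is a placeholder.

```python
import lasio                      # third-party LAS reader (assumed installed)
import matplotlib.pyplot as plt

# Placeholder path; substitute a real .las file exported from a logging tool.
las = lasio.read("example_well.las")

df = las.df()                     # curves as a DataFrame indexed by depth
print(df.columns.tolist())        # curve mnemonics in this file, e.g. GR, RHOB

# Plot one curve per track, with depth increasing downwards, log-viewer style.
curves = df.columns[:3]
fig, axes = plt.subplots(1, len(curves), figsize=(3 * len(curves), 8),
                         sharey=True, squeeze=False)
axes = axes[0]
for ax, mnemonic in zip(axes, curves):
    ax.plot(df[mnemonic], df.index)
    ax.set_xlabel(mnemonic)
    ax.grid(True)
axes[0].set_ylabel(f"Depth ({las.curves[0].unit})")
axes[0].invert_yaxis()
plt.tight_layout()
plt.show()
```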
Swarming Robot Design, Construction and Software Implementation
NASA Technical Reports Server (NTRS)
Stolleis, Karl A.
2014-01-01
In this paper an overview is presented of the hardware design, construction, software design, and software implementation for a small, low-cost robot to be used for swarming robot development. In addition to the work done on the robot, a full simulation of the robotic system was developed using the Robot Operating System (ROS) and its associated simulation tools. The eventual use of the robots will be the exploration of evolving behaviors via genetic algorithms, building on work done at the University of New Mexico Biological Computation Lab.
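A ROS-based robot of this kind is typically driven by nodes that publish velocity commands. The minimal rospy node below publishes a random-walk command stream in the spirit of a first swarming behaviour; the topic name, rates, and speeds are assumptions for illustration and are not the interfaces described in the paper.

```python
#!/usr/bin/env python
# Minimal ROS (rospy) node publishing random-walk velocity commands.
import random
import rospy
from geometry_msgs.msg import Twist

def random_walk():
    rospy.init_node("swarm_random_walk")
    pub = rospy.Publisher("/cmd_vel", Twist, queue_size=10)  # assumed topic
    rate = rospy.Rate(10)                    # 10 Hz command rate
    while not rospy.is_shutdown():
        cmd = Twist()
        cmd.linear.x = 0.1                        # slow forward creep (m/s)
        cmd.angular.z = random.uniform(-1.0, 1.0) # random heading change (rad/s)
        pub.publish(cmd)
        rate.sleep()

if __name__ == "__main__":
    try:
        random_walk()
    except rospy.ROSInterruptException:
        pass
```

Running the same node against a simulated robot (for example in the ROS simulation environment mentioned above) and against the physical robot is what makes the simulate-then-deploy workflow practical for evolving behaviours later.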
Introduction to Stand-up Meetings in Agile Methods
NASA Astrophysics Data System (ADS)
Hasnain, Eisha; Hall, Tracy
2009-05-01
In recent years, agile methods have become more popular in the software industry. Agile methods are a new approach compared to plan-driven approaches. One of the most important shifts in adopting an agile approach is the central focus given to people in the process. This is exemplified by the independence afforded to developers in the development work they do. This work investigates the opinions of practitioners about daily stand-up meetings in agile methods and the role of the developer in them. For our investigation we joined a Yahoo group called "Extreme Programming". Our investigation suggests that although trust is an important factor in agile methods, stand-ups are not the place to build it.
Handling knowledge via Concept Maps: a space weather use case
NASA Astrophysics Data System (ADS)
Messerotti, Mauro; Fox, Peter
Concept Maps (Cmaps) are powerful means for knowledge coding in graphical form. As flexible software tools exist to manipulate the knowledge embedded in Cmaps in machine-readable form, such complex entities are suitable candidates not only for the representation of ontologies and semantics in Virtual Observatory (VO) architectures, but also for knowledge handling and knowledge discovery. In this work, we present a use case relevant to space weather applications and we elaborate on its possible implementation and advanced use in Semantic Virtual Observatories dedicated to Sun-Earth Connections. This analysis was carried out in the framework of the Electronic Geophysical Year (eGY) and represents an achievement synergized by the eGY Virtual Observatories Working Group.
Return to work in multi-ethnic breast cancer survivors--a qualitative inquiry.
Tan, Foo Lan; Loh, Siew Yim; Su, Tin Tin; Veloo, V W; Ng, Lee Luan
2012-01-01
Return-to-work (RTW) can be a problematic occupational issue with detrimental impact on the quality of life of previously-employed breast cancer survivors. This study explored barriers and facilitators encountered during the RTW process in the area of cancer survivorship. Six focus groups were conducted using a semi-structured interview guide on 40 informants (employed multiethnic survivors). Survivors were stratified into three groups of survivors who had successfully returned to work, and another three groups of survivors who were unable to return to work. Each of the three groups was ethnically homogeneous. Thematic analysis using a constant comparative approach was aided by NVivo software. Participants shared numerous barriers and facilitators which directly or interactively affect RTW. Key barriers were physical-psychological after-effects of treatment, fear of potential environment hazards, high physical job demand, intrusive negative thoughts and overprotective family. Key facilitators were social support, employer support, and regard for financial independence. Across ethnic groups, the main facilitators were financial independence (for Chinese) and socialisation opportunity (for Malay). A key barrier was after-effects of treatment, expressed across all ethnic groups. Numerous barriers were identified in the non-RTW survivors. Health professionals and especially occupational therapists should be consulted to assist the increasing number of survivors by providing occupational rehabilitation to enhance RTW amongst employed survivors. Future research to identify prognostic factors can guide clinical efforts to restore cancer survivors to their desired level/type of occupational functioning for productivity and wellbeing.
Instructional Software and Attention Disorders: A Tool for Teachers.
ERIC Educational Resources Information Center
Bice, Joe E.; And Others
This handbook provides information on 31 software programs designed to instruct students with attention disorders in individual and group settings. The most successful applications of instructional software are identified, and six broad categories of instructional software are discussed. Twenty-one strategies for teaching students with attention…
Experimentation in software engineering
NASA Technical Reports Server (NTRS)
Basili, V. R.; Selby, R. W.; Hutchens, D. H.
1986-01-01
Experimentation in software engineering supports the advancement of the field through an iterative learning process. In this paper, a framework for analyzing most of the experimental work performed in software engineering over the past several years is presented. A variety of experiments in the framework is described and their contribution to the software engineering discipline is discussed. Some useful recommendations for the application of the experimental process in software engineering are included.
An Analysis of Mission Critical Computer Software in Naval Aviation
1991-03-01
...software development schedules were sustained without a milestone change being made. Also, software that was released to the fleet had no major...fleet contain any major defects? This research has revealed that only about half of the original software development schedules were sustained without a
Requirements Engineering in Building Climate Science Software
NASA Astrophysics Data System (ADS)
Batcheller, Archer L.
Software has an important role in supporting scientific work. This dissertation studies teams that build scientific software, focusing on the way that they determine what the software should do. These requirements engineering processes are investigated through three case studies of climate science software projects. The Earth System Modeling Framework assists modeling applications, the Earth System Grid distributes data via a web portal, and the NCAR (National Center for Atmospheric Research) Command Language is used to convert, analyze and visualize data. Document analysis, observation, and interviews were used to investigate the requirements-related work. The first research question is about how and why stakeholders engage in a project, and what they do for the project. Two key findings arise. First, user counts are a vital measure of project success, which makes adoption important and makes counting tricky and political. Second, despite the importance of quantities of users, a few particular "power users" develop a relationship with the software developers and play a special role in providing feedback to the software team and integrating the system into user practice. The second research question focuses on how project objectives are articulated and how they are put into practice. The team seeks to both build a software system according to product requirements but also to conduct their work according to process requirements such as user support. Support provides essential communication between users and developers that assists with refining and identifying requirements for the software. It also helps users to learn and apply the software to their real needs. User support is a vital activity for scientific software teams aspiring to create infrastructure. The third research question is about how change in scientific practice and knowledge leads to changes in the software, and vice versa. The "thickness" of a layer of software infrastructure impacts whether the software team or users have control and responsibility for making changes in response to new scientific ideas. Thick infrastructure provides more functionality for users, but gives them less control of it. The stability of infrastructure trades off against the responsiveness that the infrastructure can have to user needs.
The Production Data Approach for Full Lifecycle Management
NASA Astrophysics Data System (ADS)
Schopf, J.
2012-04-01
The amount of data generated by scientists is growing exponentially, and studies have shown [Koe04] that un-archived data sets have a resource half-life that is only a fraction of those resources that are electronically archived. Most groups still lack standard approaches and procedures for data management. Arguably, however, scientists know something about building software. A recent article in Nature [Mer10] stated that 45% of research scientists spend more time now developing software than they did 5 years ago, and 38% spent at least 1/5th of their time developing software. Fox argues [Fox10] that a simple release of data is not the correct approach to data curation. In addition, just as software is used in a wide variety of ways never initially envisioned by its developers, we're seeing this even to a greater extent with data sets. In order to address the need for better data preservation and access, we propose that data sets should be managed in a similar fashion to building production quality software. These production data sets are not simply published once, but go through a cyclical process, including phases such as design, development, verification, deployment, support, analysis, and then development again, thereby supporting the full lifecycle of a data set. The process involved in academically-produced software changes over time with respect to issues such as how much it is used outside the development group, but factors in aspects such as knowing who is using the code, enabling multiple developers to contribute to code development with common procedures, formal testing and release processes, developing documentation, and licensing. When we work with data, either as a collection source, as someone tagging data, or someone re-using it, many of the lessons learned in building production software are applicable. Table 1 shows a comparison of production software elements to production data elements.
Table 1: Comparison of production software and production data.
Production Software | Production Data
End-user considerations | End-user considerations
Multiple coders: repository with check-in procedures; coding standards | Multiple producers/collectors: local archive with check-in procedure; metadata standards
Formal testing | Formal testing
Bug tracking and fixes | Bug tracking and fixes, QA/QC
Documentation | Documentation
Formal release process | Formal release process to external archive
License | Citation/usage statement
The full presentation of this abstract will include a detailed discussion of these issues so that researchers can produce usable and accessible data sets as a first step toward reproducible science. By creating production-quality data sets, we extend the potential of our data, both in terms of usability and usefulness to ourselves and other researchers. The more we treat data with formal processes and release cycles, the more relevant and useful it can be to the scientific community.
NASA Astrophysics Data System (ADS)
Hampton, S. E.
2015-12-01
The science necessary to unravel complex environmental problems confronts severe computational challenges - coping with huge volumes of heterogeneous data, spanning vast spatial scales at high resolution, and requiring integration of disparate measurements from multiple disciplines. But as cyberinfrastructure advances to support such work, scientists in many fields lack sufficient computational skills to participate in interdisciplinary, data-intensive research. In response, we developed innovative training workshops for early-career scientists, in order to explore both the needs and solutions for training next-generation scientists in skills for data-intensive environmental research. In 2013 and 2014 we ran intensive 3-week training workshops for early-career researchers. One of the workshops was run concurrently in California and North Carolina, connected by virtual technologies and coordinated schedules. We attracted applicants to the workshop with the opportunity to pursue data-intensive small-group research projects that they proposed. This approach presented a realistic possibility that publishable products could result from 3 weeks of focused hands-on classroom instruction combined with self-directed group research in which instructors were present to assist trainees. Instruction addressed 1) collaboration modes and technologies, 2) data management, preservation, and sharing, 3) preparing data for analysis using scripting, 4) reproducible research, 5) sustainable software practices, 6) data analysis and modeling, and 7) communicating results to broad communities. The most dramatic improvements in technical skills were in data management, version control, and working with spatial data outside of proprietary software. In addition, participants built strong networks and collaborative skills that later resulted in a successful student-led grant proposal and published manuscripts, and participants reported that the training was a highly influential experience.
Thompson, Charee M; Crook, Brittani; Love, Brad; Macpherson, Catherine Fiona; Johnson, Rebecca
2015-04-27
We compared adolescent and young adult cancer patient and survivor language between mediated and face-to-face support communities in order to understand how the use of certain words frame conversations about family, friends, health, work, achievement, and leisure. We analyzed transcripts from an online discussion board (N = 360) and face-to-face support group (N = 569) for adolescent and young adults using Linguistic Inquiry and Word Count, a word-based computerized text analysis software that counts the frequency of words and word stems. There were significant differences between the online and face-to-face support groups in terms of content (e.g. friends, health) and style words (e.g. verb tense, negative emotion, and cognitive process). © The Author(s) 2015.
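LIWC's dictionaries are proprietary, but the counting idea it implements is simple. The sketch below, with tiny made-up category lists rather than LIWC's actual lexicon, illustrates how per-category word frequencies of the kind compared in this study can be computed.

```python
import re
from collections import Counter

# Tiny illustrative category lists (LIWC's real dictionaries are proprietary).
CATEGORIES = {
    "friends":          {"friend", "friends", "buddy", "pal"},
    "health":           {"cancer", "chemo", "doctor", "treatment", "pain"},
    "negative_emotion": {"sad", "afraid", "angry", "worried"},
}

def category_rates(text):
    """Return, per category, the percentage of words in the text that match it."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter()
    for w in words:
        for cat, vocab in CATEGORIES.items():
            if w in vocab:
                counts[cat] += 1
    total = max(len(words), 1)
    return {cat: 100.0 * counts[cat] / total for cat in CATEGORIES}

post = "I'm worried about my next treatment, but my friends keep me going."
print(category_rates(post))
```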
Recording Computer-Based Demonstrations and Board Work
ERIC Educational Resources Information Center
Spencer, Neil H.
2010-01-01
This article describes how a demonstration of statistical (or other) software can be recorded without expensive video equipment and saved as a presentation to be displayed with software such as Microsoft PowerPoint. Work carried out on a tablet PC, for example, can also be recorded in this fashion.
AEDT Software Requirements Documents - Draft
DOT National Transportation Integrated Search
2007-01-25
This software requirements document serves as the basis for designing and testing the Aviation Environmental Design Tool (AEDT) software. The intended audience for this document consists of the following groups: the AEDT designers, developers, and te...
Occupy Hard Drives: Making your work more valuable by giving it away
NASA Astrophysics Data System (ADS)
Weiner, Benjamin J.
2014-01-01
Astronomy is more than ever reliant on scientist-built software, but our systems of supporting research and giving credit for research work have failed to evolve with this reality. Both the perception of short term advantage, and an artificial distinction between "tools" and "science," lead to software and data remaining proprietary or unpublished. The lack of incentives to build and maintain software leads to both a decay of the software infrastructure, and a potential for growing class inequality, a pundit-technician divide. Top-down efforts to direct the field such as the recent US decadal survey have not adequately addressed this future. I argue that writing, freely releasing, and publishing your software is currently not adequately funded, rewarded, or credited, and that you should do it anyway. Writing your software as if you plan to release it is better for you and for the code. Releasing software can get credit from the rest of the community beyond your circle of collaborators or letter-writers, and it can benefit you and everyone else by making astronomy a better place to work. Building a culture of cooperation will be a more effective approach to reforming the system of credit than waiting for leadership from above or outside, but requires that each of us consciously encourage process, values, and behavior that support such a change.
Annotated bibliography of software engineering laboratory literature
NASA Technical Reports Server (NTRS)
Buhler, Melanie; Valett, Jon
1989-01-01
An annotated bibliography is presented of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. The bibliography was updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials were grouped into eight general subject areas for easy reference: (1) The Software Engineering Laboratory; (2) The Software Engineering Laboratory: Software Development Documents; (3) Software Tools; (4) Software Models; (5) Software Measurement; (6) Technology Evaluations; (7) Ada Technology; and (8) Data Collection. Subject and author indexes further classify these documents by specific topic and individual author.
Product-oriented Software Certification Process for Software Synthesis
NASA Technical Reports Server (NTRS)
Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil
2004-01-01
The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.
Teaching Fraunhofer diffraction via experimental and simulated images in the laboratory
NASA Astrophysics Data System (ADS)
Peinado, Alba; Vidal, Josep; Escalera, Juan Carlos; Lizana, Angel; Campos, Juan; Yzuel, Maria
2012-10-01
Diffraction is an important phenomenon introduced to Physics university students in a subject of Fundamentals of Optics. In addition, in the Physics Degree syllabus of the Universitat Autònoma de Barcelona, there is an elective subject in Applied Optics. In this subject, diverse diffraction concepts are discussed in-depth from different points of view: theory, experiments in the laboratory and computing exercises. In this work, we have focused on the process of teaching Fraunhofer diffraction through laboratory training. Our approach involves students working in small groups. They visualize and acquire some important diffraction patterns with a CCD camera, such as those produced by a slit, a circular aperture or a grating. First, each group calibrates the CCD camera, that is to say, they obtain the relation between the distances in the diffraction plane in millimeters and on the computer screen in pixels. Afterwards, they measure the significant distances in the diffraction patterns and, using the appropriate diffraction formalism, they calculate the size of the analyzed apertures. Concomitantly, students grasp the convolution theorem in the Fourier domain by analyzing the diffraction of 2-D gratings of elemental apertures. Finally, the learners use dedicated software to simulate diffraction patterns of different apertures. They can control several parameters: shape, size and number of apertures, 1-D or 2-D gratings, wavelength, focal lens or pixel size. Therefore, the program allows them to reproduce the images obtained experimentally, and generate others by changing certain parameters. This software has been created in our research group, and it is freely distributed to the students in order to help their learning of diffraction. We have observed that these hands-on experiments help students to consolidate their theoretical knowledge of diffraction in a pedagogical and stimulating learning process.
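The simulation part of such an exercise can be reproduced in a few lines of numerical code. The sketch below is not the group's own teaching program; it simply computes a Fraunhofer pattern as the squared magnitude of the 2-D FFT of a slit aperture, and applies the single-slit relation x_min = lambda*f/a that students use to recover the aperture size from a measured fringe distance (all numerical values are illustrative).

```python
import numpy as np

# Build a binary aperture mask: a single vertical slit (illustrative values).
N = 1024                               # grid size in pixels
slit_width, slit_height = 8, 200       # aperture size in pixels
aperture = np.zeros((N, N))
cy, cx = N // 2, N // 2
aperture[cy - slit_height // 2: cy + slit_height // 2,
         cx - slit_width // 2: cx + slit_width // 2] = 1.0

# Fraunhofer (far-field) pattern: intensity is |FFT of the aperture|^2.
field = np.fft.fftshift(np.fft.fft2(aperture))
intensity = np.abs(field) ** 2
intensity /= intensity.max()           # normalize for display

# For a slit of width a, the first minimum in the focal plane sits at
# x_min = lambda * f / a, which is how students recover the aperture size
# from measured fringe distances (values below are illustrative).
wavelength = 632.8e-9    # He-Ne laser, in meters
focal_length = 0.5       # lens focal length, in meters
x_min_measured = 3.2e-3  # measured distance to first minimum, in meters
slit_estimate = wavelength * focal_length / x_min_measured
print(f"Estimated slit width: {slit_estimate * 1e6:.1f} micrometers")
```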
Kim, Dong-Yeon; Kim, Eo-Bin; Kim, Hae-Young; Kim, Ji-Hwan; Kim, Woong-Chul
2017-12-01
To evaluate the fit of a three-unit metal framework of fixed dental prostheses made by subtractive and additive manufacturing. One master model of metal was fabricated. Twenty silicone impressions were made from the master die: working dies for 10 were poured with Type IV stone, and working dies for 10 were made of scannable stone. Ten three-unit wax frameworks were fabricated by wax-up on the Type IV working dies. Stereolithography files of 10 three-unit frameworks were obtained using a model scanner and three-dimensional design software on a scannable working die. The three-unit wax framework was fabricated using subtractive manufacturing (SM) by applying the prepared stereolithography file, and the resin framework was fabricated by additive manufacturing (AM); both were then cast in metal alloy to produce the metal frameworks. Marginal and internal gaps were measured using the silicone replica technique and a digital microscope. Measurement data were analyzed by the Kruskal-Wallis H test and Mann-Whitney U-test (α=.05). The lowest and highest gaps between premolar and molar margins were in the SM group and the AM group, respectively. There was a statistically significant difference in the marginal gap among the three groups (P<.001). In the marginal area where the pontic was present, the largest gap was 149.39 ± 42.30 µm in the AM group, and the lowest gap was 24.40 ± 11.92 µm in the SM group. Three-unit metal frameworks made by subtractive manufacturing are clinically applicable. However, additive manufacturing requires more research before it can be applied clinically.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lynch, Kevin; Popp, James
This DOE grant award was for the period June 1, 2013 to March 31, 2016. Popp was awarded an internship in the Visiting Faculty Program at FNAL in the summer of 2015; consequently, the unused portion of summer salary funds allowed us to apply for a no-cost extension with our remaining funds until March 31, 2017. That support furnished us with the means to carry out numerous successful projects for Mu2e for nearly four years. Up to now, the driving force of our work has been dictated primarily by the Mu2e Project cost and schedule needs. Our work has been under the purview of three of the Working Groups to which we belong: Target Station, Electron Tracker, and Stopping Target Monitor. We have carried out a mix of bench-top testing tasks locally, more elaborate work at Fermilab every summer, and extensive software development and simulation studies.
Using social marketing to understand the family dinner with working mothers.
Martinasek, Mary P; DeBate, Rita D; Walvoord, Ashley G; Melton, Stephanie T; Himmelgreen, David; Allen, Tammy D; McDermott, Robert J
2010-01-01
The family dinner is a valued tradition that affords opportunities for social interaction and attachment, as well as sharing events of the day, role modeling, connectedness, and problem solving. Guided by the social-marketing framework, this study explored factors associated with the frequency of the family dinner among working mothers with children ages 8-11 years. A qualitative design was used, employing focus groups and Atlas-ti software for thematic analysis. Lack of time, cost, and exhaustion/lack of energy emerged as barriers. Working mothers indicated that a youth-based organization operating as a community partner could increase the frequency of the family dinner by helping with homework completion during after-school care, thereby providing mothers with the time necessary to prepare dinner. This research identified both community partners and working mothers as valued resources for prevention strategies. Interventions developed to increase family dinner frequency should emphasize the perceived value while decreasing the costs/barriers.
Digital diagnosis of medical images
NASA Astrophysics Data System (ADS)
Heinonen, Tomi; Kuismin, Raimo; Jormalainen, Raimo; Dastidar, Prasun; Frey, Harry; Eskola, Hannu
2001-08-01
The popularity of digital imaging devices and PACS installations has increased during the last years. Still, images are analyzed and diagnosed using conventional techniques. Our research group began to study the requirements for digital image diagnostic methods to be applied together with PACS systems. The research was focused on various image analysis procedures (e.g., segmentation, volumetry, 3D visualization, image fusion, anatomic atlas, etc.) that could be useful in medical diagnosis. We have developed Image Analysis software (www.medimag.net) to enable several image-processing applications in medical diagnosis, such as volumetry, multimodal visualization, and 3D visualizations. We have also developed a commercial scalable image archive system (ActaServer, supports DICOM) based on component technology (www.acta.fi), and several telemedicine applications. All the software and systems operate in the NT environment and are in clinical use in several hospitals. The analysis software has been applied in clinical work and utilized in numerous patient cases (500 patients). This method has been used in the diagnosis, therapy and follow-up of various diseases of the central nervous system (CNS), respiratory system (RS) and human reproductive system (HRS). In many of these diseases, e.g., Systemic Lupus Erythematosus (CNS), nasal airway diseases (RS) and ovarian tumors (HRS), these methods have been used for the first time in clinical work. According to our results, digital diagnosis improves diagnostic capabilities, and together with PACS installations it will become a standard tool during the next decade by enabling more accurate diagnosis and patient follow-up.
NASA Astrophysics Data System (ADS)
Heffron, E.; Lurton, X.; Lamarche, G.; Brown, C.; Lucieer, V.; Rice, G.; Schimel, A.; Weber, T.
2015-12-01
Backscatter data acquired with multibeam sonars are now commonly used for the remote geological interpretation of the seabed. The systems' hardware, software, and processing methods and tools have grown in number and improved over the years, yet many issues linger: there are no standard procedures for acquisition, poor or absent calibration, limited understanding and documentation of processing methods, etc. A workshop organized at the GeoHab (a community of geoscientists and biologists around the topic of marine habitat mapping) annual meeting in 2013 was dedicated to seafloor backscatter data from multibeam sonars and concluded that there was an overwhelming need for better coherence and agreement on the topics of acquisition, processing and interpretation of data. The GeoHab Backscatter Working Group (BSWG) was subsequently created with the purpose of documenting and synthesizing the state-of-the-art in sensors and techniques available today and proposing methods for best practice in the acquisition and processing of backscatter data. Two years later, the resulting document "Backscatter measurements by seafloor-mapping sonars: Guidelines and Recommendations" was completed [1]. The document provides: an introduction to backscatter measurements by seafloor-mapping sonars; a background on the physical principles of sonar backscatter; a discussion on users' needs from a wide spectrum of community end-users; a review of backscatter measurement; an analysis of best practices in data acquisition; a review of data processing principles with details on present software implementation; and finally a synthesis and key recommendations. This presentation reviews the BSWG mandate, structure, and development of this document. It details the various chapter contents, its recommendations to sonar manufacturers, operators, data processing software developers and end-users, and its implication for the marine geology community. [1] Downloadable at https://www.niwa.co.nz/coasts-and-oceans/research-projects/backscatter-measurement-guidelines
ERIC Educational Resources Information Center
Guven, Bulent
2012-01-01
This study examines the effect of dynamic geometry software (DGS) on students' learning of transformation geometry. A pre- and post-test quasi-experimental design was used. Participants in the study were 68 eighth grade students (36 in the experimental group and 32 in the control group). While the experimental group students were studying the…
MOPEX: a software package for astronomical image processing and visualization
NASA Astrophysics Data System (ADS)
Makovoz, David; Roby, Trey; Khan, Iffat; Booth, Hartley
2006-06-01
We present MOPEX - a software package for astronomical image processing and display. The package is a combination of command-line driven image processing software written in C/C++ with a Java-based GUI. The main image processing capabilities include creating mosaic images, image registration, background matching, point source extraction, as well as a number of minor image processing tasks. The combination of the image processing and display capabilities allows for a much more intuitive and efficient way of performing image processing. The GUI allows the control over image processing and display to be closely intertwined. Parameter setting, validation, and specific processing options are entered by the user through a set of intuitive dialog boxes. Visualization feeds back into further processing by providing prompt feedback on the processing results. The GUI also allows for further analysis by accessing and displaying data from existing image and catalog servers using a virtual observatory approach. Even though the package was originally designed for the Spitzer Space Telescope mission, many of its functionalities are of general usefulness and can be used for working with existing astronomical data and for new missions. The software used in the package has undergone intensive testing and benefited greatly from effective software reuse. The visualization part has been used for observation planning for both the Spitzer and Herschel Space Telescopes as part of the tool Spot. The visualization capabilities of Spot have been enhanced and integrated with the image processing functionality of the command-line driven MOPEX. The image processing software is used in the Spitzer automated pipeline processing, which has been in operation for nearly 3 years. The image processing capabilities have also been tested in off-line processing by numerous astronomers at various institutions around the world. The package is multi-platform and includes automatic update capabilities. The software package has been developed by a small group of software developers and scientists at the Spitzer Science Center. It is available for distribution at the Spitzer Science Center web page.
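As an illustration of one of the mosaicking steps mentioned above, the sketch below shows a toy version of additive background matching between two overlapping tiles. It is not MOPEX's actual algorithm, and the data are synthetic; it only conveys the idea of estimating and removing a background offset from the overlap region.

```python
import numpy as np

def match_background(tile_a, tile_b, overlap_cols):
    """Toy additive background matching between two tiles that share
    `overlap_cols` columns (tile_a's right edge covers tile_b's left edge).
    Returns tile_b shifted so that the overlap medians agree."""
    overlap_a = tile_a[:, -overlap_cols:]
    overlap_b = tile_b[:, :overlap_cols]
    offset = np.median(overlap_a - overlap_b)   # robust background difference
    return tile_b + offset

rng = np.random.default_rng(0)
sky = rng.normal(100.0, 5.0, size=(64, 96))     # "true" sky
tile_a = sky[:, :64]                            # first pointing
tile_b = sky[:, 32:] + 12.5                     # second pointing, offset background
tile_b_matched = match_background(tile_a, tile_b, overlap_cols=32)
print("residual offset:", np.median(tile_a[:, -32:] - tile_b_matched[:, :32]))
```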
Todhunter, Fern
2015-06-01
Observations obtained through concurrent think-aloud and protocol analysis offer new understanding about the influence of social learning on student nurses' acquisition of Information and Communication Technology (ICT) knowledge and skills. The software used provides a permanent record of the underpinning study method, events and analyses. The emerging themes reflect the dimensions of social engagement, and the characteristics of positive and negative reactions to ICT. The evidence shows that given the right conditions, stronger learners will support and guide their peers. To explore the use of concurrent think-aloud and protocol analysis as a method to examine how student nurses approach ICT. To identify the benefits and challenges of using observational technology to capture learning behaviours. To show the influence of small group arrangement and student interactions on their ICT knowledge and skills development. Previous studies examining social interaction between students show how they work together and respond to interactive problem solving. Social interaction has been shown to enhance skills in both ICT and collaborative decision making. Structured observational analysis using concurrent think-aloud and protocol analysis. Students displayed varying degrees of pastoral support and emotional need, leadership, reflection, suggestion and experimentation skills. Encouraging student nurses to work in small mixed ability groups can be conducive for social and ICT skill and knowledge development. Observational software gives a permanent record of the proceedings. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Hut, R. W.; van de Giesen, N. C.; Drost, N.
2017-05-01
The suggestions by Hutton et al. might not be enough to guarantee reproducible computational hydrology: archiving software code and research data alone will not suffice. We add to the suggestion of Hutton et al. that hydrologists should not only document their (computer) work, but also adopt the latest best practices in designing research software, most notably the use of containers and open interfaces. To make sure hydrologists know of these best practices, we urge close collaboration with Research Software Engineers (RSEs).
Software for keratometry measurements using portable devices
NASA Astrophysics Data System (ADS)
Iyomasa, C. M.; Ventura, L.; De Groote, J. J.
2010-02-01
In this work we present image processing software for automatic astigmatism measurement, developed for a hand-held keratometer. The system projects 36 light spots from LEDs, arranged in a precise circle, onto the lachrymal film of the examined cornea. The displacement, size and deformation of the reflected image of these light spots are analyzed, providing the keratometry. The purpose of this research is to develop software that performs fast and precise calculations on mainstream mobile devices; in other words, software that can be implemented on portable computer systems that are low cost and easy to handle. This project makes keratometers portable and is preliminary work toward a portable corneal topographer.
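As an illustration of the measurement principle only, not of the authors' embedded software, the sketch below converts the radius of the reflected LED ring into a corneal radius using the standard small-angle keratometer approximation R ~ 2*d*m, where m is the image-to-object magnification of the convex corneal mirror; all numerical values and names are illustrative.

```python
import numpy as np

def corneal_radius(spot_xy_px, mm_per_px, ring_radius_obj_mm, working_dist_mm):
    """Estimate corneal radius of curvature from the reflected LED ring.

    spot_xy_px        : (N, 2) centroids of the reflected spots, in pixels
    mm_per_px         : CCD calibration factor (from a calibration step)
    ring_radius_obj_mm: radius of the physical LED ring (the 'object')
    working_dist_mm   : distance from the LED ring to the cornea

    Uses the small-angle keratometer approximation R ~ 2 * d * m,
    where m is the image/object magnification of the convex corneal mirror.
    """
    center = spot_xy_px.mean(axis=0)
    ring_radius_img_mm = np.mean(np.linalg.norm(spot_xy_px - center, axis=1)) * mm_per_px
    magnification = ring_radius_img_mm / ring_radius_obj_mm
    return 2.0 * working_dist_mm * magnification

# Synthetic example: 36 spots on a circle of radius 23.4 pixels,
# imaged at 0.01 mm/pixel (all numbers illustrative).
angles = np.linspace(0, 2 * np.pi, 36, endpoint=False)
spots = 23.4 * np.column_stack([np.cos(angles), np.sin(angles)]) + 300.0
R = corneal_radius(spots, mm_per_px=0.01, ring_radius_obj_mm=4.5, working_dist_mm=75.0)
print(f"Estimated corneal radius: {R:.2f} mm")   # about 7.8 mm, a typical cornea
```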
Annotated bibliography of software engineering laboratory literature
NASA Technical Reports Server (NTRS)
Groves, Paula; Valett, Jon
1990-01-01
An annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory is given. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: the Software Engineering Laboratory; the Software Engineering Laboratory-software development documents; software tools; software models; software measurement; technology evaluations; Ada technology; and data collection. Subject and author indexes further classify these documents by specific topic and individual author.
Annotated bibliography of Software Engineering Laboratory literature
NASA Technical Reports Server (NTRS)
Morusiewicz, Linda; Valett, Jon
1993-01-01
This document is an annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. Nearly 200 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: the Software Engineering Laboratory; the Software Engineering Laboratory: software development documents; software tools; software models; software measurement; technology evaluations; Ada technology; and data collection. This document contains an index of these publications classified by individual author.
Special Report: Part One. New Tools for Professionals.
ERIC Educational Resources Information Center
Liskin, Miriam; And Others
1984-01-01
This collection of articles includes an examination of word-processing software; project management software; new expert systems that turn microcomputers into logical, well-informed consultants; simulated negotiation software; telephone management systems; and the physical design of an efficient microcomputer work space. (MBR)
Science 101: How Does Speech-Recognition Software Work?
ERIC Educational Resources Information Center
Robertson, Bill
2016-01-01
This column provides background science information for elementary teachers. Many innovations with computer software begin with analysis of how humans do a task. This article takes a look at how humans recognize spoken words and explains the origins of speech-recognition software.
Living Design Memory: Framework, Implementation, Lessons Learned.
ERIC Educational Resources Information Center
Terveen, Loren G.; And Others
1995-01-01
Discusses large-scale software development and describes the development of the Designer Assistant to improve software development effectiveness. Highlights include the knowledge management problem; related work, including artificial intelligence and expert systems, software process modeling research, and other approaches to organizational memory;…
Zachary D. Barker: Final DHS HS-STEM Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barker, Z D
Working at Lawrence Livermore National Laboratory (LLNL) this summer has provided a very unique and special experience for me. I feel that the research opportunities given to me have allowed me to significantly benefit my research group, the laboratory, the Department of Homeland Security, and the Department of Energy. The researchers in the Single Particle Aerosol Mass Spectrometry (SPAMS) group were very welcoming and clearly wanted me to get the most out of my time in Livermore. I feel that my research partner, Veena Venkatachalam of MIT, and I have been extremely productive in meeting our research goals throughout this summer, and have learned much about working in research at a national laboratory such as Lawrence Livermore. I have learned much about the technical aspects of research while working at LLNL; however, I have also gained important experience and insight into how research groups at national laboratories function. I believe that this internship has given me valuable knowledge and experience which will certainly help my transition to graduate study and a career in engineering. My work with Veena Venkatachalam in the SPAMS group this summer has focused on two major projects. Initially, we were tasked with an analysis of data collected by the group this past spring in a large public environment. The SPAMS instrument was deployed for over two months, collecting information on many of the ambient air particles circulating through the area. Our analysis of the particle data collected during this deployment concerned several aspects, including finding groups, or clusters, of particles that seemed to appear more during certain times of day, analyzing the mass spectral data of clusters and comparing them with mass spectral data of known substances, and comparing the real-time detection capability of the SPAMS instrument with that of a commercially available biological detection instrument. This analysis was performed in support of a group report to the Department of Homeland Security on the results of the deployment. The analysis of the deployment data revealed some interesting applications of the SPAMS instrument to homeland security situations. Using software developed in-house by SPAMS group member Dr. Paul Steele, Veena and I were able to cluster a subset of data over a certain timeframe (ranging from a single hour to an entire week). The software used makes clusters based on the mass spectral characteristics of each particle in the data set, as well as other parameters. By looking more closely at the characteristics of individual clusters, including the mass spectra, conclusions could be made about what these particles are. This was achieved partially through examination and discussion of the mass spectral data with the members of the SPAMS group, as well as through comparison with known mass spectra collected from substances tested in the laboratory. In many cases, broad conclusions could be drawn about the identity of a cluster of particles.
Quantum Mechanics/Molecular Mechanics Modeling of Enzymatic Processes: Caveats and Breakthroughs.
Quesne, Matthew G; Borowski, Tomasz; de Visser, Sam P
2016-02-18
Nature has developed large groups of enzymatic catalysts with the aim of converting substrates into useful products, which enables biosystems to perform all their natural functions. As such, all biochemical processes in our body (we drink, we eat, we breathe, we sleep, etc.) are governed by enzymes. One of the problems associated with research on biocatalysts is that they react so fast that details of their reaction mechanisms cannot be obtained with experimental work. In recent years, major advances in computational hardware and software have been made, and now large (bio)chemical systems can be studied using accurate computational techniques. One such technique is the quantum mechanics/molecular mechanics (QM/MM) technique, which has gained major momentum in recent years. Unfortunately, it is not a black-box method that is easily applied, but requires careful set-up procedures. In this work we give an overview of the technical difficulties and caveats of QM/MM and discuss work-protocols developed in our groups for running successful QM/MM calculations. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
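QM/MM couplings come in several flavors. As a minimal, hedged illustration of the widely used subtractive (ONIOM-style) scheme only, and not of the authors' own set-up protocol, the sketch below combines stub QM and MM energy functions; the stubs stand in for real electronic-structure and force-field calculations.

```python
def qm_energy(atoms):
    """Stub standing in for a real QM engine (e.g. a DFT calculation)."""
    return -1.0 * len(atoms)          # placeholder value

def mm_energy(atoms):
    """Stub standing in for a real force-field evaluation."""
    return -0.1 * len(atoms)          # placeholder value

def subtractive_qmmm_energy(all_atoms, qm_region):
    """Subtractive (ONIOM-style) QM/MM coupling:
    E = E_MM(whole system) + E_QM(active region) - E_MM(active region).
    The MM description of the active region is replaced by the QM one,
    while the environment stays at the cheaper MM level."""
    return mm_energy(all_atoms) + qm_energy(qm_region) - mm_energy(qm_region)

enzyme = [f"atom{i}" for i in range(5000)]   # full enzyme plus solvent (illustrative)
active_site = enzyme[:40]                    # substrate plus catalytic residues
print(subtractive_qmmm_energy(enzyme, active_site))
```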
Multiagent Work Practice Simulation: Progress and Challenges
NASA Technical Reports Server (NTRS)
Clancey, William J.; Sierhuis, Maarten; Shaffe, Michael G. (Technical Monitor)
2001-01-01
Modeling and simulating complex human-system interactions requires going beyond formal procedures and information flows to analyze how people interact with each other. Such work practices include conversations, modes of communication, informal assistance, impromptu meetings, workarounds, and so on. To make these social processes visible, we have developed a multiagent simulation tool, called Brahms, for modeling the activities of people belonging to multiple groups, situated in a physical environment (geographic regions, buildings, transport vehicles, etc.) consisting of tools, documents, and a computer system. We are finding many useful applications of Brahms for system requirements analysis, instruction, implementing software agents, and as a workbench for relating cognitive and social theories of human behavior. Many challenges remain for representing work practices, including modeling: memory over multiple days, scheduled activities combining physical objects, groups, and locations on a timeline (such as a Space Shuttle mission), habitat vehicles with trajectories (such as the Shuttle), agent movement in 3D space (e.g., inside the International Space Station), agent posture and line of sight, coupled movements (such as carrying objects), and learning (mimicry, forming habits, detecting repetition, etc.).
Multiagent Work Practice Simulation: Progress and Challenges
NASA Technical Reports Server (NTRS)
Clancey, William J.; Sierhuis, Maarten
2002-01-01
Modeling and simulating complex human-system interactions requires going beyond formal procedures and information flows to analyze how people interact with each other. Such work practices include conversations, modes of communication, informal assistance, impromptu meetings, workarounds, and so on. To make these social processes visible, we have developed a multiagent simulation tool, called Brahms, for modeling the activities of people belonging to multiple groups, situated in a physical environment (geographic regions, buildings, transport vehicles, etc.) consisting of tools, documents, and computer systems. We are finding many useful applications of Brahms for system requirements analysis, instruction, implementing software agents, and as a workbench for relating cognitive and social theories of human behavior. Many challenges remain for representing work practices, including modeling: memory over multiple days, scheduled activities combining physical objects, groups, and locations on a timeline (such as a Space Shuttle mission), habitat vehicles with trajectories (such as the Shuttle), agent movement in 3d space (e.g., inside the International Space Station), agent posture and line of sight, coupled movements (such as carrying objects), and learning (mimicry, forming habits, detecting repetition, etc.).
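Brahms is a dedicated agent language, so the following is only a conceptual Python sketch of the ideas described above (agents belonging to groups, situated in locations, with informal interaction emerging from co-location); all names, activities and probabilities are illustrative rather than Brahms constructs.

```python
import random

class Agent:
    """Minimal work-practice agent: belongs to groups, sits in a location,
    and picks an activity (including informal ones) each simulated step."""
    def __init__(self, name, groups, location):
        self.name, self.groups, self.location = name, groups, location

    def step(self, minute, colleagues_here):
        # Informal interaction emerges when agents share a location.
        if colleagues_here and random.random() < 0.3:
            return f"{self.name} chats with {colleagues_here[0].name}"
        if "flight_controllers" in self.groups and minute % 60 == 0:
            return f"{self.name} checks the telemetry console"
        return f"{self.name} continues desk work"

agents = [
    Agent("Ana", ["flight_controllers"], "control_room"),
    Agent("Ben", ["flight_controllers"], "control_room"),
    Agent("Chloe", ["engineers"], "lab"),
]
for minute in range(0, 180, 60):
    for a in agents:
        others = [b for b in agents if b is not a and b.location == a.location]
        print(minute, a.step(minute, others))
```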
Fostering successful scientific software communities
NASA Astrophysics Data System (ADS)
Bangerth, W.; Heister, T.; Hwang, L.; Kellogg, L. H.
2016-12-01
Developing sustainable open source software packages for the sciences appears at first to be primarily a technical challenge: How can one create stable and robust algorithms, appropriate software designs, sufficient documentation, quality assurance strategies such as continuous integration and test suites, or backward compatibility approaches that yield high-quality software usable not only by the authors, but also the broader community of scientists? However, our experience from almost two decades of leading the development of the deal.II software library (http://www.dealii.org, a widely-used finite element package) and the ASPECT code (http://aspect.dealii.org, used to simulate convection in the Earth's mantle) has taught us that technical aspects are not the most difficult ones in scientific open source software. Rather, it is the social challenge of building and maintaining a community of users and developers interested in answering questions on user forums, contributing code, and jointly finding solutions to common technical and non-technical challenges. These problems are posed in an environment where project leaders typically have no resources to reward the majority of contributors, where very few people are specifically paid for the work they do on the project, and where there is frequent turnover of contributors as project members rotate into and out of jobs. In particular, much software work is done by graduate students who may become fluent enough with a software package only a year or two before they leave academia. We will discuss strategies we have found do and do not work in maintaining and growing communities around the scientific software projects we lead. Specifically, we will discuss the management style necessary to keep contributors engaged, ways to give credit where credit is due, and structuring documentation to decrease reliance on forums and thereby allow user communities to grow without straining those who answer questions.
In the right order of brush strokes: a sketch of a software philosophy retrospective.
Pyshkin, Evgeny
2014-01-01
This paper follows a discourse on software recognized as a product of art and human creativity, a discourse that has probably been progressing for as long as software has existed. A retrospective view on the development of computer science and software philosophy is introduced. In so doing we discover parallels between software and various branches of human creative manifestation. Aesthetic properties and the mutual dependency of the form and matter of art works are examined in their application to software programs. While exploring some philosophical and even artistic reflections on software, we consider an extended comprehension of the technical sciences of programming and software engineering within the realm of the liberal arts.
ERIC Educational Resources Information Center
Podany, Zita
This guide lists 19 software packages considered to be worthy of further consideration by other reviewing agencies and schools by a group of 17 computer coordinators from educational software preview centers and evaluation agencies. The following software is listed: (1) ASK-IT, an authoring tool; (2) Balance of the Planet, an environmental…
ERIC Educational Resources Information Center
Podany, Zita
This guide lists 21 software packages considered to be worthy of further consideration by other reviewing agencies and schools by a group of 12 computer coordinators from educational software preview centers and evaluation agencies. These software products have been selected as not being likely to appear in the reviews produced by major software…
NASA Technical Reports Server (NTRS)
Muniz, R.; Hochstadt, J.; Boelke J.; Dalton, A.
2011-01-01
The Content Documents are created and managed under the System Software group within the Launch Control System (LCS) project. The System Software product group is led by the NASA Engineering Control and Data Systems branch (NEC3) at Kennedy Space Center. The team is working on creating Operating System Images (OSI) for different platforms (i.e., AIX, Linux, Solaris and Windows). Before an OSI can be created, the team must create a Content Document which provides the information for a workstation or server, with the list of all the software that is to be installed on it and also the set to which the hardware belongs. This can be, for example, the LDS, the ADS or the FR-1. The objective of this project is to create a user-interface Web application that can manage the information in the Content Documents, with all the correct validations and filters for administrator purposes. For this project we used one of the best tools for agile development of applications, Ruby on Rails. This tool helps pragmatic programmers develop Web applications with the Rails framework and the Ruby programming language. It is amazing to see how a student can learn about OOP features with the Ruby language, manage the user interface with HTML and CSS, create associations and queries with gems, manage databases and run a server with MySQL, run shell commands with the command prompt and create Web frameworks with Rails. All of this in a real-world project and in just fifteen weeks!
Technical Concept Document. Central Archive for Reusable Defense Software (CARDS)
1994-02-28
February 1994. Informal technical report for the Software Technology for Adaptable, Reliable Systems (STARS) program: Technical Concept Document, Central Archive for Reusable Defense Software (CARDS). Developed in accordance with the DFARS Special Works Clause; this document was developed under the Software Technology for Adaptable, Reliable Systems program.
A Secure Architecture to Provide a Medical Emergency Dataset for Patients in Germany and Abroad.
Storck, Michael; Wohlmann, Jan; Krudwig, Sarah; Vogel, Alexander; Born, Judith; Weber, Thomas; Dugas, Martin; Juhra, Christian
2017-01-01
The ongoing fragmentation of medical care and mobility of patients severely restrains exchange of lifesaving information about patient's medical history in case of emergencies. Therefore, the objective of this work is to offer a secure technical solution to supply medical professionals with emergency-relevant information concerning the current patient via mobile accessibility. To achieve this goal, the official national emergency data set was extended by additional features to form a patient summary for emergencies, a software architecture was developed and data security and data protection issues were taken into account. The patient has sovereignty over his/her data and can therefore decide who has access to or can change his/her stored data, but the treating physician composes the validated dataset. Building upon the introduced concept, future activities are the development of user-interfaces for the software components of the different user groups as well as functioning prototypes for upcoming field tests.
ICT-Supported Education; Learning Styles for Individual Knowledge Building
NASA Astrophysics Data System (ADS)
Haugen, Harald; Ask, Bodil; Bjørke, Sven Åke
School surveys and reports on the integration of ICT in teaching and learning indicate that the technology is mainly used in traditional learning environments. Furthermore, the most frequently used software in classrooms consists of general tools like word processors, presentation tools and Internet browsers. Recent interest among youngsters in social software / Web 2.0, contemporary pedagogical approaches like social constructivism, and long experience with system dynamics and simulations seem to have a hard time being accepted by teachers and curriculum designers. How can teachers be trained to understand and make optimal use of the possibilities that are now available in the classroom and online, on broadband connections and with high-capacity computers? Some views on practices with the above-mentioned alternative approaches to learning are presented in this paper, focusing particularly on the options for online work and learning programmes. Here we have first-hand experience with adult and mature academics, but also some background with other target groups.
Artificial Intelligence Software Engineering (AISE) model
NASA Technical Reports Server (NTRS)
Kiss, Peter A.
1990-01-01
The American Institute of Aeronautics and Astronautics has initiated a committee on standards for Artificial Intelligence. Presented are the initial efforts of one of the working groups of that committee. A candidate model is presented for the development life cycle of knowledge based systems (KBSs). The intent is for the model to be used by the aerospace community and eventually be evolved into a standard. The model is rooted in the evolutionary model, borrows from the spiral model, and is embedded in the standard Waterfall model for software development. Its intent is to satisfy the development of both stand-alone and embedded KBSs. The phases of the life cycle are shown and detailed as are the review points that constitute the key milestones throughout the development process. The applicability and strengths of the model are discussed along with areas needing further development and refinement by the aerospace community.
AGScan: a pluggable microarray image quantification software based on the ImageJ library.
Cathelin, R; Lopez, F; Klopp, Ch
2007-01-15
Many different programs are available to analyze microarray images. Most programs are commercial packages, some are free. In the latter group only a few offer automatic grid alignment and batch mode. More often than not a program implements only one quantification algorithm. AGScan is an open source program that works on all major platforms. It is based on the ImageJ library [Rasband (1997-2006)] and offers a plug-in extension system to add new functions to manipulate images, align grids and quantify spots. It is appropriate for daily laboratory use and also as a framework for new algorithms. The program is freely distributed under the X11 Licence. The install instructions can be found in the user manual. The software can be downloaded from http://mulcyber.toulouse.inra.fr/projects/agscan/. Questions and plug-ins can be sent to the contact listed below.
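AGScan itself is written in Java on top of ImageJ; as a language-neutral illustration of the plug-in extension idea only, the sketch below registers processing steps in a small registry and applies them to an image. The function names and the toy quantification are illustrative, not AGScan's API.

```python
import numpy as np

PLUGINS = {}

def register(name):
    """Decorator: add a processing step to the plug-in registry under `name`."""
    def wrap(func):
        PLUGINS[name] = func
        return func
    return wrap

@register("flip_vertical")
def flip_vertical(image):
    """Toy image manipulation step."""
    return image[::-1, :]

@register("quantify_spots")
def quantify_spots(image, threshold=0.5):
    """Toy quantification: total intensity above a threshold."""
    return float(image[image > threshold].sum())

# A host program can look up and chain whatever plug-ins are installed.
image = np.random.default_rng(1).random((32, 32))
image = PLUGINS["flip_vertical"](image)
print("signal:", PLUGINS["quantify_spots"](image, threshold=0.8))
```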
Remote software upload techniques in future vehicles and their performance analysis
NASA Astrophysics Data System (ADS)
Hossain, Irina
Updating software in vehicle Electronic Control Units (ECUs) will become a mandatory requirement for a variety of reasons, for example, to update or fix the functionality of an existing system, add new functionality, remove software bugs, and to cope with ITS infrastructure. Software modules of advanced vehicles can be updated using the Remote Software Upload (RSU) technique. RSU employs an infrastructure-based wireless communication technique in which the software supplier sends the software to the targeted vehicle via a roadside Base Station (BS). However, security is critically important in RSU to avoid any disasters due to malfunctions of the vehicle or to protect the proprietary algorithms from hackers, competitors or people with malicious intent. In this thesis, a mechanism for secure software upload in advanced vehicles is presented which employs mutual authentication of the software provider and the vehicle using a pre-shared authentication key before sending the software. The software packets are sent encrypted with a secret key along with the Message Digest (MD). In order to increase the security level, it is proposed that the vehicle receive more than one copy of the software, along with the MD in each copy. The vehicle will install the new software only when it receives more than one identical copy of the software. In order to validate the proposition, analytical expressions for the average number of packet transmissions required for a successful software update are determined. Different cases are investigated depending on the vehicle's buffer size and verification methods. The analytical and simulation results show that it is sufficient to send two copies of the software to the vehicle to thwart any security attack while uploading the software. The above-mentioned unicast method for RSU is suitable when software needs to be uploaded to a single vehicle. Since multicasting is the most efficient method of group communication, updating software in an ECU of a large number of vehicles could benefit from it. However, as with unicast RSU, meeting the security requirements of multicast communication, i.e., authenticity, confidentiality and integrity of the software transmitted and access control of the group members, is challenging. In this thesis, an infrastructure-based mobile multicasting scheme for RSU in vehicle ECUs is proposed in which an ECU receives the software from a remote software distribution center using the roadside BSs as gateways. The Vehicular Software Distribution Network (VSDN) is divided into small regions administered by a Regional Group Manager (RGM). Two multicast Group Key Management (GKM) techniques are proposed based on the degree of trust in the BSs, named the Fully-trusted (FT) and Semi-trusted (ST) systems. Analytical models are developed to find the multicast session establishment latency and handover latency for these two protocols. The average latency to perform mutual authentication of the software vendor and a vehicle and to send the multicast session key during multicast session initialization, and the handoff latency during a multicast session, are calculated. Analytical and simulation results show that the link establishment latency per vehicle of our proposed schemes is in the range of a few seconds, with the ST system requiring a few milliseconds more than the FT system. The handoff latency is also in the range of a few seconds, and in some cases the ST system requires less handoff time than the FT system.
Thus, it is possible to build an efficient GKM protocol without putting too much trust on the BSs.
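A minimal sketch of the duplicate-copy acceptance rule described above follows. It uses a keyed digest (HMAC-SHA256) in place of the thesis's exact MD construction and omits the payload encryption step, so it should be read as an illustration of the verification logic rather than as the proposed protocol itself; all keys and payloads are illustrative.

```python
import hmac
import hashlib

PRE_SHARED_KEY = b"vehicle-123-shared-secret"   # illustrative only

def make_packet(software_image: bytes) -> dict:
    """Supplier side: attach a keyed message digest to the image.
    (A real RSU scheme would also encrypt the payload; omitted here.)"""
    digest = hmac.new(PRE_SHARED_KEY, software_image, hashlib.sha256).hexdigest()
    return {"payload": software_image, "md": digest}

def vehicle_accepts(copies: list) -> bool:
    """Vehicle side: install only if (a) every copy's digest verifies and
    (b) at least two received copies are identical, as proposed above."""
    for c in copies:
        expected = hmac.new(PRE_SHARED_KEY, c["payload"], hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, c["md"]):
            return False
    payloads = [c["payload"] for c in copies]
    return any(payloads.count(p) >= 2 for p in payloads)

image = b"ECU firmware v2.4"
good = [make_packet(image), make_packet(image)]
tampered = [make_packet(image), {"payload": b"ECU firmware vX", "md": "bogus"}]
print(vehicle_accepts(good))      # True: two identical, verified copies
print(vehicle_accepts(tampered))  # False: digest check fails
```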
Quadratic Blind Linear Unmixing: A Graphical User Interface for Tissue Characterization
Gutierrez-Navarro, O.; Campos-Delgado, D.U.; Arce-Santana, E. R.; Jo, Javier A.
2016-01-01
Spectral unmixing is the process of breaking down data from a sample into its basic components and their abundances. Previous work has been focused on blind unmixing of multi-spectral fluorescence lifetime imaging microscopy (m-FLIM) datasets under a linear mixture model and quadratic approximations. This method provides a fast linear decomposition and can work without a limitation in the maximum number of components or end-members. Hence this work presents an interactive software which implements our blind end-member and abundance extraction (BEAE) and quadratic blind linear unmixing (QBLU) algorithms in Matlab. The options and capabilities of our proposed software are described in detail. When the number of components is known, our software can estimate the constitutive end-members and their abundances. When no prior knowledge is available, the software can provide a completely blind solution to estimate the number of components, the end-members and their abundances. The characterization of three case studies validates the performance of the new software: ex-vivo human coronary arteries, human breast cancer cell samples, and in-vivo hamster oral mucosa. The software is freely available in a hosted webpage by one of the developing institutions, and allows the user a quick, easy-to-use and efficient tool for multi/hyper-spectral data decomposition. PMID:26589467
Lessons learned in deploying software estimation technology and tools
NASA Technical Reports Server (NTRS)
Panlilio-Yap, Nikki; Ho, Danny
1994-01-01
Developing a software product involves estimating various project parameters. This is typically done in the planning stages of the project when there is much uncertainty and very little information. Coming up with accurate estimates of effort, cost, schedule, and reliability is a critical problem faced by all software project managers. The use of estimation models and commercially available tools in conjunction with the best bottom-up estimates of software-development experts enhances the ability of a product development group to derive reasonable estimates of important project parameters. This paper describes the experience of the IBM Software Solutions (SWS) Toronto Laboratory in selecting software estimation models and tools and deploying their use to the laboratory's product development groups. It introduces the SLIM and COSTAR products, the software estimation tools selected for deployment to the product areas, and discusses the rationale for their selection. The paper also describes the mechanisms used for technology injection and tool deployment, and concludes with a discussion of important lessons learned in the technology and tool insertion process.
Digital Tools: Enhancing Painting Skills among Malaysian Secondary School Students
ERIC Educational Resources Information Center
Samah, Azimah A.; Putih, Abu Talib; Hussin, Zaharah
2016-01-01
Digital tools refer to software applications in the production of artworks particularly in painting. Digital art work is materialized by using computers, software and a combination of computer peripherals such as tablet support. With the aid of electronic equipment, digital artists manipulate pixels or coloring with light to compose the work and…
An investigation of the effects of interventions on problem-solving strategies and abilities
NASA Astrophysics Data System (ADS)
Cox, Charles Terrence, Jr.
Problem-solving has been described as being the "heart" of the chemistry classroom, and students' development of problem-solving skills is essential for their success in chemistry. Despite the importance of problem-solving, there has been little research within the chemistry domain, largely because of the lack of tools to collect data for large populations. Problem-solving was assessed using a software package known as IMMEX (for Interactive Multimedia Exercises) which has an HTML tracking feature that allows for collection of problem-solving data in the background as students work the problems. The primary goal of this research was to develop methods (known as interventions) that could promote improvements in students' problem-solving and most notably aid in their transition from the novice to competent level. Three intervention techniques were incorporated within the chemistry curricula: collaborative grouping (face-to-face and distance), concept mapping, and peer-led team learning. The face-to-face collaborative grouping intervention was designed to probe the factors affecting the quality of the group interaction. Students' logical reasoning abilities were measured using the Group Assessment of Logical Thinking (GALT) test which classifies students as formal, transitional, or concrete. These classifications essentially provide a basis for identifying scientific aptitude. These designations were used as the basis for forming collaborative groups of two students. The six possibilities (formal-formal, formal-transitional, etc.) were formed to determine how the group composition influences the gains in student abilities observed from collaborative grouping interventions. Students were given three assignments (an individual pre-collaborative, an individual post-collaborative, and a collaborative assignment), each requiring them to work an IMMEX problem set. Similar performance gains of approximately 10% were observed for each group, with two exceptions. The transitional students who were paired with concrete students had a 15% gain, and the concrete students paired with other concrete students had only a marginal gain. In fact, there was no statistical difference in the pre-collaborative and post-collaborative student abilities for concrete-concrete groups. The distance collaborative intervention was completed using a new interface for the IMMEX software designed to mimic face-to-face collaboration. A stereochemistry problem set which had a solved rate of 28% prior to collaboration was chosen for incorporation into this distance collaboration study. (Abstract shortened by UMI.)
Implementing Educational Software and Evaluating Its Academic Effectiveness: Part I.
ERIC Educational Resources Information Center
Jolicoeur, Karen; Berger, Dale E.
1988-01-01
This basic plan for implementing educational software in the classroom incorporates a research design for evaluating its effectiveness. A study of fifth grade classrooms using game and tutorial software for spelling and fractions is used as an example. Topics discussed include software selection, selecting groups of comparable ability, and use of…
Software Piracy in Research: A Moral Analysis.
Santillanes, Gary; Felder, Ryan Marshall
2015-08-01
Researchers in virtually every discipline rely on sophisticated proprietary software for their work. However, some researchers are unable to afford the licenses and instead procure the software illegally. We discuss the prohibition of software piracy by intellectual property laws, and argue that the moral basis for the copyright law offers the possibility of cases where software piracy may be morally justified. The ethics codes that scientific institutions abide by are informed by a rule-consequentialist logic: by preserving personal rights to authored works, people able to do so will be incentivized to create. By showing that the law has this rule-consequentialist grounding, we suggest that scientists who blindly adopt their institutional ethics codes will commit themselves to accepting that software piracy could be morally justified, in some cases. We hope that this conclusion will spark debate over important tensions between ethics codes, copyright law, and the underlying moral basis for these regulations. We conclude by offering practical solutions (other than piracy) for researchers.
Use of electronic information systems in nursing management.
Lammintakanen, Johanna; Saranto, Kaija; Kivinen, Tuula
2010-05-01
The purpose of this study is to describe nurse managers' perceptions of the use of electronic information systems in their daily work. Several kinds of software are used for administrative and information management purposes in health care organizations, but the issue has been studied less from nurse managers' perspective. The material for this qualitative study was acquired according to the principles of the focus group interview. Altogether eight focus groups were held with 48 nurse managers from both primary and specialized health care organizations. In the focus groups, the nurse managers were asked to describe the use of information systems in their daily work, in addition to some other themes. The material was analyzed by inductive content analysis using the ATLAS.ti computer program. The main category "pros and cons of using information systems in nursing management" summarized the nurse managers' perceptions of using electronic information systems. The main category consisted of three sub-categories: (1) nurse managers' perceptions of the use of information technology; (2) usability of management information systems; (3) development of personnel competencies and work processes. The nurse managers made several comments on the implementation of immature electronic information systems which caused inefficiencies in working processes. However, they considered electronic information systems to be essential elements of their daily work. Furthermore, the nurse managers' descriptions of the pros and cons of using information systems partly reflected the shortcomings of strategic management and lack of coordination in health care organizations. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
Hu, Zhouyang; Li, Xinhua; Cui, Jian; He, Xiaobo; Li, Cong; Han, Yingchao; Pan, Jie; Yang, Mingjie; Tan, Jun; Li, Lijun
2017-05-01
Preoperative planning software has been widely used in many other minimally invasive surgeries, but there is a lack of information describing the clinical benefits of existing software applied in percutaneous endoscopic lumbar discectomy (PELD). This study aimed to compare the clinical efficacy of preoperative planning software in puncture and channel establishment of PELD with routine methods in treating lumbar disc herniation (LDH). From June 2016 to October 2016, 40 patients who had single L4/5 or L5/S1 disc herniation were divided into two groups. Group A adopted the planning software for preoperative puncture simulation while Group B used routine case discussion to make puncture plans. The channel establishment time, operative time, fluoroscopic times and complications were compared between the two groups. The surgical efficacy was evaluated according to the Visual Analogue Scale (VAS), Oswestry Disability Index (ODI) and modified Macnab's criteria. The mean channel establishment time was 25.1 ± 4.2 min and 34.6 ± 5.4 min in Groups A and B, respectively (P < 0.05). The mean operative time was 80.8 ± 8.4 min and 92.1 ± 7.3 min in Groups A and B, respectively (P < 0.05). The fluoroscopic times were 21.5 ± 5.2 in Group A and 29.3 ± 5.5 in Group B (P < 0.05). There were no significant differences in VAS and ODI scores between the two groups either preoperatively or postoperatively (P > 0.05). The findings of the modified Macnab's criteria at each follow-up also showed no significant differences (P > 0.05). The application of preoperative planning software to puncture and cannula insertion planning in PELD was easy and reliable, and significantly reduced the channel establishment time, operative time and fluoroscopic times of PELD. Copyright © 2017 IJS Publishing Group Ltd. Published by Elsevier Ltd. All rights reserved.
Software engineering project management - A state-of-the-art report
NASA Technical Reports Server (NTRS)
Thayer, R. H.; Lehman, J. H.
1977-01-01
The management of software engineering projects in the aerospace industry was investigated. The survey assessed such features as contract type, specification preparation techniques, software documentation required by customers, planning and cost-estimating, quality control, the use of advanced program practices, software tools and test procedures, the education levels of project managers, programmers and analysts, work assignment, automatic software monitoring capabilities, design and coding reviews, production times, success rates, and organizational structure of the projects.
Math and science technology access and use in South Dakota public schools grades three through five
NASA Astrophysics Data System (ADS)
Schwietert, Debra L.
The development of K-12 technology standards, soon to be added to state testing of technology proficiency, and the increasing presence of computers in homes and classrooms reflect the growing importance of technology in current society. This study examined math and science teachers' responses on a survey of technology use in grades three through five in South Dakota. A researcher-developed survey instrument was used to collect data from a random sample of 100 public schools throughout South Dakota. Forced-choice and open-ended responses were recorded. Most teachers have access to computers, but they lack resources to purchase software for their content areas, especially in science. Three-fourths of teachers in this study reported multiple computers in their classrooms and 67% reported access to labs in other areas of the school building. These numbers are lower than the national average of 84% of teachers with computers in their classrooms and 95% with access to computers elsewhere in the building (USDOE, 2000). Almost eight out of 10 teachers noted time as a barrier to learning more about educational software. Additional barriers included lack of school funds (38%), access to relevant training (32%), personal funds (30%), and poor quality of training (7%). Teachers most often use math and science software as a supplement, with practice tutorials cited as another common use. The most common interest for software was math for both boys and girls. The second most common choice for boys was science and for girls, language arts. Teachers reported that there was no preference for either individual or group work on computers for girls or boys. Most teachers do not systematically evaluate software for gender preferences, but instead review software subjectively.
Proteomics Standards Initiative: Fifteen Years of Progress and Future Work.
Deutsch, Eric W; Orchard, Sandra; Binz, Pierre-Alain; Bittremieux, Wout; Eisenacher, Martin; Hermjakob, Henning; Kawano, Shin; Lam, Henry; Mayer, Gerhard; Menschaert, Gerben; Perez-Riverol, Yasset; Salek, Reza M; Tabb, David L; Tenzer, Stefan; Vizcaíno, Juan Antonio; Walzer, Mathias; Jones, Andrew R
2017-12-01
The Proteomics Standards Initiative (PSI) of the Human Proteome Organization (HUPO) has now been developing and promoting open community standards and software tools in the field of proteomics for 15 years. Under the guidance of the chair, cochairs, and other leadership positions, the PSI working groups are tasked with the development and maintenance of community standards via special workshops and ongoing work. Among the existing ratified standards, the PSI working groups continue to update PSI-MI XML, MITAB, mzML, mzIdentML, mzQuantML, mzTab, and the MIAPE (Minimum Information About a Proteomics Experiment) guidelines with the advance of new technologies and techniques. Furthermore, new standards are currently either in the final stages of completion (proBed and proBAM for proteogenomics results as well as PEFF) or in early stages of design (a spectral library standard format, a universal spectrum identifier, the qcML quality control format, and the Protein Expression Interface (PROXI) web services Application Programming Interface). In this work we review the current status of all of these aspects of the PSI, describe synergies with other efforts such as the ProteomeXchange Consortium, the Human Proteome Project, and the metabolomics community, and provide a look at future directions of the PSI.
Optics simulations: a Python workshop
NASA Astrophysics Data System (ADS)
Ghalila, H.; Ammar, A.; Varadharajan, S.; Majdi, Y.; Zghal, M.; Lahmar, S.; Lakshminarayanan, V.
2017-08-01
Numerical simulations allow teachers and students to indirectly perform sophisticated experiments that cannot be realized otherwise due to cost and other constraints. During the past few decades there has been an explosion in the development of numerical tools, concurrently with open source environments such as the Python software ecosystem. This availability of open source software offers an incredible opportunity for advancing teaching methodologies as well as research. More specifically, it is possible to correlate theoretical knowledge with experimental measurements using "virtual" experiments. We have been working on the development of numerical simulation tools using the Python programming language, and we have concentrated on geometric and physical optics simulations. The advantage of doing hands-on numerical experiments is that it allows the student learner to be an active participant in the pedagogical/learning process rather than playing a passive role as in the traditional lecture format. Even in laboratory classes, because of constraints of space, lack of equipment and often large numbers of students, many students play a passive role since they work in groups of 3 or more. Furthermore, these new tools help students get a handle on numerical methods as well as simulations and impart a "feel" for the physics under investigation.
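As an example of the kind of hands-on exercise such a workshop might include (a hypothetical illustration, not material from the workshop itself), a few lines of Python suffice to explore Snell's law of refraction numerically:

    import numpy as np

    def refraction_angle(theta_i_deg, n1, n2):
        # Snell's law: n1 * sin(theta_i) = n2 * sin(theta_t).
        s = n1 * np.sin(np.radians(theta_i_deg)) / n2
        if abs(s) > 1.0:
            return None                  # total internal reflection
        return np.degrees(np.arcsin(s))

    # Refraction from air (n = 1.0) into glass (n = 1.5):
    for theta in (0, 15, 30, 45, 60):
        print(theta, "->", refraction_angle(theta, 1.0, 1.5))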
Applications of Modeling and Simulation for Flight Hardware Processing at Kennedy Space Center
NASA Technical Reports Server (NTRS)
Marshall, Jennifer L.
2010-01-01
The Boeing Design Visualization Group (DVG) is responsible for the creation of highly-detailed representations of both on-site facilities and flight hardware using computer-aided design (CAD) software, with a focus on the ground support equipment (GSE) used to process and prepare the hardware for space. Throughout my ten weeks at this center, I have had the opportunity to work on several projects: the modification of the Multi-Payload Processing Facility (MPPF) High Bay, weekly mapping of the Space Station Processing Facility (SSPF) floor layout, kinematics applications for the Orion Command Module (CM) hatches, and the design modification of the Ares I Upper Stage hatch for maintenance purposes. The main goal of each of these projects was to generate an authentic simulation or representation using DELMIA V5 software. This allowed for evaluation of facility layouts, support equipment placement, and greater process understanding once it was used to demonstrate future processes to customers and other partners. As such, I have had the opportunity to contribute to a skilled team working on diverse projects with a central goal of providing essential planning resources for future center operations.
Proposal for a CLIPS software library
NASA Technical Reports Server (NTRS)
Porter, Ken
1991-01-01
This paper is a proposal to create a software library for the C Language Integrated Production System (CLIPS) expert system shell developed by NASA. Many innovative ideas for extending CLIPS were presented at the First CLIPS Users Conference, including useful user and database interfaces. CLIPS developers would benefit from a software library of reusable code. The CLIPS Users Group should establish a software library-- a course of action to make that happen is proposed. Open discussion to revise this library concept is essential, since only a group effort is likely to succeed. A response form intended to solicit opinions and support from the CLIPS community is included.
Design of the software development and verification system (SWDVS) for shuttle NASA study task 35
NASA Technical Reports Server (NTRS)
Drane, L. W.; Mccoy, B. J.; Silver, L. W.
1973-01-01
An overview of the Software Development and Verification System (SWDVS) for the space shuttle is presented. The design considerations, goals, assumptions, and major features of the design are examined. A scenario that shows three persons involved in flight software development using the SWDVS in response to a program change request is developed. The SWDVS is described from the standpoint of different groups of people with different responsibilities in the shuttle program to show the functional requirements that influenced the SWDVS design. The software elements of the SWDVS that satisfy the requirements of the different groups are identified.
A comparison of traditional textbook and interactive computer learning of neuromuscular block.
Ohrn, M A; van Oostrom, J H; van Meurs, W L
1997-03-01
We designed an educational software package, RELAX, for teaching first-year anesthesiology residents about the pharmacology and clinical management of neuromuscular blockade. The software uses an interactive, problem-based approach and moves the user through cases in an operating room environment. It can be run on personal computers with Microsoft Windows (Microsoft Corp., Redmond, WA) and combines video, graphics, and text with mouse-driven user input. We utilized test scores 1) to determine whether our software was beneficial to the educational progress of anesthesiology residents and 2) to compare computer-based learning with textbook learning. Twenty-three residents were divided into two groups matched for age and sex, and a pretest was administered to all 23 residents. There was no significant difference (P > 0.05) in the pretest scores of the two groups. Three weeks later, both groups were subjected to an educational intervention, one with our computer software and the other with selected textbooks. Both groups took a posttest immediately after the intervention. The test scores of the computer group improved significantly more (P < 0.05) than those of the textbook group. Although prior to the study the two groups showed no statistical difference in their familiarity with computers, the computer group reported much higher satisfaction with their learning experience than did the textbook group (P < 0.0001).
Albar, Juan Pablo; Binz, Pierre-Alain; Eisenacher, Martin; Jones, Andrew R; Mayer, Gerhard; Omenn, Gilbert S; Orchard, Sandra; Vizcaíno, Juan Antonio; Hermjakob, Henning
2015-01-01
Objective: To describe the goals of the Proteomics Standards Initiative (PSI) of the Human Proteome Organization, the methods that the PSI has employed to create data standards, the resulting output of the PSI, lessons learned from the PSI’s evolution, and future directions and synergies for the group. Materials and Methods: The PSI has 5 categories of deliverables that have guided the group. These are minimum information guidelines, data formats, controlled vocabularies, resources and software tools, and dissemination activities. These deliverables are produced via the leadership and working group organization of the initiative, driven by frequent workshops and ongoing communication within the working groups. Official standards are subjected to a rigorous document process that includes several levels of peer review prior to release. Results: We have produced and published minimum information guidelines describing what information should be provided when making data public, either via public repositories or other means. The PSI has produced a series of standard formats covering mass spectrometer input, mass spectrometer output, results of informatics analysis (both qualitative and quantitative analyses), reports of molecular interaction data, and gel electrophoresis analyses. We have produced controlled vocabularies that ensure that concepts are uniformly annotated in the formats and engaged in extensive software development and dissemination efforts so that the standards can efficiently be used by the community. Conclusion: In its first dozen years of operation, the PSI has produced many standards that have accelerated the field of proteomics by facilitating data exchange and deposition to data repositories. We look to the future to continue developing standards for new proteomics technologies and workflows and mechanisms for integration with other omics data types. Our products facilitate the translation of genomics and proteomics findings to clinical and biological phenotypes. The PSI website can be accessed at http://www.psidev.info. PMID:25726569
Rule groupings: A software engineering approach towards verification of expert systems
NASA Technical Reports Server (NTRS)
Mehrotra, Mala
1991-01-01
Currently, most expert system shells do not address software engineering issues for developing or maintaining expert systems. As a result, large expert systems tend to be incomprehensible, difficult to debug or modify and almost impossible to verify or validate. Partitioning rule based systems into rule groups which reflect the underlying subdomains of the problem should enhance the comprehensibility, maintainability, and reliability of expert system software. Attempts were made to semiautomatically structure a CLIPS rule base into groups of related rules that carry the same type of information. Different distance metrics that capture relevant information from the rules for grouping are discussed. Two clustering algorithms that partition the rule base into groups of related rules are given. Two independent evaluation criteria are developed to measure the effectiveness of the grouping strategies. Results of the experiment with three sample rule bases are presented.
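The general idea of rule grouping, a distance metric over rules followed by clustering, can be sketched in a few lines of Python. This is a hypothetical illustration using a Jaccard distance over the facts each rule references and a greedy single-link grouping; it is not one of the paper's metrics or algorithms.

    # Hypothetical rule base: each rule maps to the set of facts it references.
    rules = {
        "r1": {"valve", "pressure"},
        "r2": {"valve", "temperature"},
        "r3": {"telemetry", "downlink"},
        "r4": {"downlink", "antenna"},
    }

    def jaccard_distance(a, b):
        # 0.0 when two rules reference identical facts, 1.0 when disjoint.
        return 1.0 - len(a & b) / len(a | b)

    def group_rules(rules, threshold=0.8):
        # Greedy single-link clustering: a rule joins the first group that
        # already contains a sufficiently similar rule.
        groups = []
        for name, facts in rules.items():
            for group in groups:
                if any(jaccard_distance(facts, rules[m]) < threshold for m in group):
                    group.append(name)
                    break
            else:
                groups.append([name])
        return groups

    print(group_rules(rules))   # e.g. [['r1', 'r2'], ['r3', 'r4']]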
Ya'acob, Noor Afifah; Abidin, Emilia Zainal; Rasdi, Irniza; Rahman, Anita Abd; Ismail, Suriani
2018-05-01
Work tasks in pineapple plantations in Malaysia are characterised by non-ergonomic work postures, repetitive tasks, awkward postures and manual handling of work tools that contribute to the reporting of musculoskeletal symptoms (MSS). There have been very limited studies performed among pineapple plantation workers focusing on ergonomic intervention programs to specifically reduce MSS. The aim of this study was to assess the effects of a work improvement module using a Kiken Yochi participatory approach intervention in reducing MSS among male migrant pineapple farm plantation workers in Pontian, Johor. In this interventional study, a total of 68 male migrant workers from two plantation farms were invited to participate. In total, 45 participants were recruited: 27 workers in the intervention group and 18 workers in the control group. The background of workers and MSS were assessed using questionnaires. Ergonomic and postural risks were evaluated, and the work tasks with the highest risk were used as a basis for the development of the Kiken Yochi training module. An MSS education and training intervention that provided information on proper lifting techniques and education on body mechanics and ergonomics to reduce MSS was implemented in both groups of workers. Kiken Yochi training was given to the intervention group only. MSS were reassessed after the 2-month follow-up period. Data were entered into statistical software and analysed according to the objectives. In terms of the postural risk assessment, almost two-thirds of the participants (68.5%) had working postures categorized as high risk for MSS. Ergonomic risk assessment identified cultivation, manual weeding and harvesting of pineapples as the work tasks contributing the highest health risks to workers. The most commonly reported MSS between both groups of workers were at the knees, lower back and shoulder area. Upon completion of the delivery of the intervention module to both groups of workers, the MSS prevalence reported (after 2 months) was significantly lower for the ankles and feet area within the intervention group. This study suggests that the development and implementation of programs using effective participatory training methods can prevent selected musculoskeletal problems in this occupation. To enhance the effects of such trainings, modifications of work tools in this occupation are desirable.
Baghban, Iran; Malekiha, Marziyeh; Fatehizadeh, Maryam
2010-01-01
BACKGROUND: Work-family conflict has many negative outcomes for the organization and for the career and family life of each person. The aim of the present study was to determine the relationship between work-family conflict and the level of self-efficacy in female nurses. METHODS: In this cross-sectional descriptive research, the relationship between work-family conflict and the level of self-efficacy in female nurses of Alzahra Hospital was assessed. The data collection tools were a questionnaire comprising a demographic data form, a work-family conflict scale and a self-efficacy scale. Content analysis and Cronbach’s alpha were used to evaluate the validity and reliability of the questionnaire. The study sample included 160 nurses (80 permanent nurses and 80 contract-based nurses) selected through simple random sampling from nurses working in different wards of Alzahra Hospital. Data analysis was done using SPSS software. RESULTS: There was a significant difference in work-family conflict between the two groups of permanent and contract-based nurses (p = 0.02). Also, a significant difference in the level of self-efficacy was observed between the two groups of nurses (p = 0.03). CONCLUSIONS: The level of self-efficacy and work-family conflict in contract-based nurses was not acceptable. Therefore, it is suggested to arrange courses to train effective skills in the management of work-family conflicts in order to increase the level of self-efficacy of contract-based nurses. PMID:21589794
Advanced End-to-end Simulation for On-board Processing (AESOP)
NASA Technical Reports Server (NTRS)
Mazer, Alan S.
1994-01-01
Developers of data compression algorithms typically use their own software together with commercial packages to implement, evaluate and demonstrate their work. While convenient for an individual developer, this approach makes it difficult to build on or use another's work without intimate knowledge of each component. When several people or groups work on different parts of the same problem, the larger view can be lost. What's needed is a simple piece of software to stand in the gap and link together the efforts of different people, enabling them to build on each other's work, and providing a base for engineers and scientists to evaluate the parts as a cohesive whole and make design decisions. AESOP (Advanced End-to-end Simulation for On-board Processing) attempts to meet this need by providing a graphical interface to a developer-selected set of algorithms, interfacing with compiled code and standalone programs, as well as procedures written in the IDL and PV-Wave command languages. As a proof of concept, AESOP is outfitted with several data compression algorithms integrating previous work on different processors (AT&T DSP32C, TI TMS320C30, SPARC). The user can specify at run-time the processor on which individual parts of the compression should run. Compressed data is then fed through simulated transmission and uncompression to evaluate the effects of compression parameters, noise and error correction algorithms. The following sections describe AESOP in detail. Section 2 describes fundamental goals for usability. Section 3 describes the implementation. Sections 4 through 5 describe how to add new functionality to the system and present the existing data compression algorithms. Sections 6 and 7 discuss portability and future work.
Annotated bibliography of software engineering laboratory literature
NASA Technical Reports Server (NTRS)
Kistler, David; Bristow, John; Smith, Don
1994-01-01
This document is an annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. Nearly 200 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: (1) The Software Engineering Laboratory; (2) The Software Engineering Laboratory: Software Development Documents; (3) Software Tools; (4) Software Models; (5) Software Measurement; (6) Technology Evaluations; (7) Ada Technology; and (8) Data Collection. This document contains an index of these publications classified by individual author.
Academic Web Authoring Multimedia Development and Course Management Tools
ERIC Educational Resources Information Center
Halloran, Margaret E.
2005-01-01
Course management software enables faculty members to learn one software package for web-based curriculum, assessment, synchronous and asynchronous discussions, collaborative work, multimedia and interactive resource development. There are as many as 109 different course management software packages on the market and several studies have evaluated…
Novel Television-Based Cognitive Training Improves Working Memory and Executive Function
Shatil, Evelyn; Mikulecká, Jaroslava; Bellotti, Francesco; Bureš, Vladimír
2014-01-01
The main study objective was to investigate the effect of interactive television-based cognitive training on the cognitive performance of 119 healthy older adults, aged 60–87 years. Participants were randomly allocated to a cognitive training group or to an active control group in a single-blind controlled two-group design. Before and after the interactive television training, cognitive performance was assessed on well-validated tests of fluid, higher-order ability, and system usability was evaluated. The participants in the cognitive training group completed a television-based cognitive training programme, while the participants in the active control group completed a TV-based programme of personally benefiting activities. Significant improvements were observed in well-validated working memory and executive function tasks in the cognitive training group but not in the control group. Neither group showed statistically significant improvement in life satisfaction score. Participants' reports of “adequate” to “high” system usability testify to the successful development and implementation of the interactive television-based system and compliant cognitive training contents. The study demonstrates that cognitive training delivered by means of an interactive television system can generate genuine cognitive benefits in users, and these are measurable using well-validated cognitive tests. Thus, older adults who cannot use or afford a computer can easily use digital interactive television to benefit from advanced software applications designed to train cognition. PMID:24992187
Adaptive Integration of Nonsmooth Dynamical Systems
2017-10-11
controlled time stepping method to interactively design running robots. [1] John Shepherd, Samuel Zapolsky, and Evan M. Drumwright, “Fast multi-body...” Started working in simulation after attempting to use software like this to test software running on my robots. The libraries that produce these beautiful results have failed at simulating robotic manipulation. Postulate: It is easier to
ERIC Educational Resources Information Center
Eisen, Daniel
2013-01-01
This study explores how project managers, working for private federal IT contractors, experience and understand managing the development of software applications for U.S. federal government agencies. Very little is known about how they manage their projects in this challenging environment. Software development is a complex task and only grows in…
Performance testing of 3D point cloud software
NASA Astrophysics Data System (ADS)
Varela-González, M.; González-Jorge, H.; Riveiro, B.; Arias, P.
2013-10-01
LiDAR systems are being used widely in recent years for many applications in the engineering field: civil engineering, cultural heritage, mining, industry and environmental engineering. One of the most important limitations of this technology is the large computational requirement involved in data processing, especially for large mobile LiDAR datasets. Several software solutions for data management are available in the market, including open source suites; however, users often lack methodologies to verify their performance properly. In this work a methodology for LiDAR software performance testing is presented and four different suites are studied: QT Modeler, VR Mesh, AutoCAD 3D Civil and the Point Cloud Library running in software developed at the University of Vigo (SITEGI). The software based on the Point Cloud Library shows better results in the loading time of the point clouds and CPU usage. However, it is not as strong as the commercial suites in the working set and commit size tests.
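The loading-time measurements reported in the study can be reproduced in spirit with a small timing harness. The sketch below is a simplified Python illustration, not the study's test suite (which exercised the four suites named above); the file name is a placeholder.

    import time
    import numpy as np

    def time_point_cloud_load(path):
        # Measure wall-clock time to load an ASCII x y z point cloud file.
        start = time.perf_counter()
        points = np.loadtxt(path, usecols=(0, 1, 2))
        return points.shape[0], time.perf_counter() - start

    # Example usage (assumes a whitespace-separated file 'scan.xyz' exists):
    # n_points, seconds = time_point_cloud_load("scan.xyz")
    # print(f"{n_points} points loaded in {seconds:.2f} s")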
Descriptions of Free and Freeware Software in the Mathematics Teaching
NASA Astrophysics Data System (ADS)
Antunes de Macedo, Josue; Neves de Almeida, Samara; Voelzke, Marcos Rincon
2016-05-01
This paper presents the analysis and cataloging of free and freeware mathematical software available on the internet, a brief explanation of each, and the types of licenses for use in teaching and learning. The methodology is based on qualitative research. Among the different types of software found, Winmat stands out in algebra; it works with linear algebra, matrices and linear systems. In geometry, GeoGebra can be used in the study of functions, plane and spatial geometry, algebra and calculus. For graphing, one can cite Graph and Graphequation. With the Graphmatica software, it is possible to build various graphs of mathematical equations on the same screen, representing cartesian equations, inequalities and parametric functions, among others. Winplot allows the user to build two- and three-dimensional graphs of functions and mathematical equations. Thus, this work aims to present to teachers some free math software that can be used in the classroom.
The roles of the AAS Journals' Data Editors
NASA Astrophysics Data System (ADS)
Muench, August; NASA/SAO ADS, CERN/Zenodo.org, Harvard/CfA Wolbach Library
2018-01-01
I will summarize the community services provided by the AAS Journals' Data Editors to support authors when citing and preserving the software and data used in the published literature. In addition, I will describe the life of a piece of code as it passes through the current workflows for software citation in astronomy. Using this “lifecycle” I will detail the ongoing work funded by a grant from the Alfred P. Sloan Foundation to the American Astronomical Society to improve the citation of software in the literature. The funded development team and advisory boards, made up of non-profit publishers, literature indexers, and preservation archives, are implementing the Force11 software citation principles for astronomy journals. The outcome of this work will be new workflows for authors and developers that fit into their current practices while enabling versioned citation of software and granular credit for its creators.
NASA Technical Reports Server (NTRS)
Morusiewicz, Linda; Valett, Jon
1992-01-01
This document is an annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: (1) the Software Engineering Laboratory; (2) the Software Engineering Laboratory: Software Development Documents; (3) Software Tools; (4) Software Models; (5) Software Measurement; (6) Technology Evaluations; (7) Ada Technology; and (8) Data Collection. This document contains an index of these publications classified by individual author.
An application of machine learning to the organization of institutional software repositories
NASA Technical Reports Server (NTRS)
Bailin, Sidney; Henderson, Scott; Truszkowski, Walt
1993-01-01
Software reuse has become a major goal in the development of space systems, as a recent NASA-wide workshop on the subject made clear. The Data Systems Technology Division of Goddard Space Flight Center has been working on tools and techniques for promoting reuse, in particular in the development of satellite ground support software. One of these tools is the Experiment in Libraries via Incremental Schemata and Cobweb (ElvisC). ElvisC applies machine learning to the problem of organizing a reusable software component library for efficient and reliable retrieval. In this paper we describe the background factors that have motivated this work, present the design of the system, and evaluate the results of its application.
Astronomers as Software Developers
NASA Astrophysics Data System (ADS)
Pildis, Rachel A.
2016-01-01
Astronomers know that their research requires writing, adapting, and documenting computer software. Furthermore, they often have to learn new computer languages and figure out how existing programs work without much documentation or guidance and with extreme time pressure. These are all skills that can lead to a software development job, but recruiters and employers probably won't know that. I will discuss all the highly useful experience that astronomers may not know that they already have, and how to explain that knowledge to others when looking for non-academic software positions. I will also talk about some of the pitfalls I have run into while interviewing for jobs and working as a developer, and encourage you to embrace the curiosity employers might have about your non-standard background.
Tucker, P; Gaertner, J; Mason, C
2001-12-01
As with many forms of flexible working, Annualized Hours (AH) systems offer potential benefits to both the employer and the employee. However, the flexibility requirements of employers and employees often conflict. Therefore, when a large food manufacturing organization decided to redesign its AH system, it employed an independent consultancy to act as neutral third party. The consultancy provided technical expertise and assistance in developing an AH system that optimised productivity and was acceptable to the workforce. Data are presented, obtained from focus groups conducted throughout the organization, describing some of the potential difficulties of implementing an AH system. Drawing upon these data, a number of new AH systems were proposed and modelled using specialist software tools. The design process is described, together with the advantages and difficulties associated with use of the software tools. It is concluded that the key elements in the process of designing AH systems are centred around issues of trust and communication; the involvement of a broad range of interested parties, through a process of carefully managed group facilitation; and the need for adequate technical support in the development and evaluation of AH systems.
Mantini, Dante; Petrucci, Francesca; Pieragostino, Damiana; Del Boccio, Piero; Sacchetta, Paolo; Candiano, Giovanni; Ghiggeri, Gian Marco; Lugaresi, Alessandra; Federici, Giorgio; Di Ilio, Carmine; Urbani, Andrea
2010-01-03
Mass spectrometry (MS) is becoming the gold standard for biomarker discovery. Several MS-based bioinformatics methods have been proposed for this application, but the divergence of the findings by different research groups on the same MS data suggests that the definition of a reliable method has not been achieved yet. In this work, we propose an integrated software platform, MASCAP, intended for comparative biomarker detection from MALDI-TOF MS data. MASCAP integrates denoising and feature extraction algorithms, which have already shown to provide consistent peaks across mass spectra; furthermore, it relies on statistical analysis and graphical tools to compare the results between groups. The effectiveness in mass spectrum processing is demonstrated using MALDI-TOF data, as well as SELDI-TOF data. The usefulness in detecting potential protein biomarkers is shown comparing MALDI-TOF mass spectra collected from serum and plasma samples belonging to the same clinical population. The analysis approach implemented in MASCAP may simplify biomarker detection, by assisting the recognition of proteomic expression signatures of the disease. A MATLAB implementation of the software and the data used for its validation are available at http://www.unich.it/proteomica/bioinf. (c) 2009 Elsevier B.V. All rights reserved.
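A minimal illustration of the peak-extraction stage of such a pipeline (a synthetic Python sketch, not MASCAP's actual algorithms, which are implemented in MATLAB and also include denoising and cross-spectrum comparison) might look like this:

    import numpy as np
    from scipy.signal import find_peaks

    # Synthetic spectrum: two Gaussian peaks plus noise on an m/z axis.
    mz = np.linspace(1000, 1100, 2000)
    rng = np.random.default_rng(0)
    spectrum = (np.exp(-((mz - 1020) ** 2) / 4.0)
                + 0.6 * np.exp(-((mz - 1070) ** 2) / 4.0)
                + 0.02 * rng.normal(size=mz.size))

    # Simple prominence-based peak picking.
    peaks, _ = find_peaks(spectrum, prominence=0.2)
    for i in peaks:
        print(f"m/z {mz[i]:.1f}  intensity {spectrum[i]:.2f}")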
Application of Advanced Decision-Analytic Technology to Rapid Deployment Joint Task Force Problems
1981-06-01
...meetings: (1) To organize, display, and update the working group's judgements about the relative costs and benefits of each level of each variable in... benefit to the organization. (3) Assess costs - In the DESIGN software, there is one type of limited resource to be allocated to the variables. This
Development of the electronic health records for nursing education (EHRNE) software program.
Kowitlawakul, Yanika; Wang, Ling; Chan, Sally Wai-Chi
2013-12-01
This paper outlines preliminary research on an innovative software program that enables the use of an electronic health record in a nursing education curriculum. The software application program is called EHRNE, which stands for Electronic Health Record for Nursing Education. The aim of EHRNE is to enhance students' learning of health informatics when they are working in the simulation laboratory. Integrating EHRNE into the nursing curriculum exposes students to electronic health records before they go into the workplace. A qualitative study was conducted using focus group interviews of nine nursing students. Nursing students' perceptions of using the EHRNE application were explored. The interviews were audio-taped and transcribed verbatim. The data were analyzed following the Colaizzi (1978) guideline. Four main categories related to the EHRNE application were identified from the interviews: functionality, data management, timing and complexity, and accessibility. The analysis of the data revealed advantages and limitations of using EHRNE in the classroom setting. Integrating the EHRNE program into the curriculum will promote students' awareness of electronic documentation and enhance students' learning in the simulation laboratory. Preliminary findings suggested that before integrating the EHRNE program into the nursing curriculum, educational sessions for both students and faculty outlining the software's purpose, advantages, and limitations are needed. Following the educational sessions, further investigation of students' perceptions and learning using the EHRNE program is recommended. Copyright © 2012 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Dimech, C.
2013-12-01
In this contribution, I present a critical evaluation of my experience as a research student conducting an interdisciplinary project that bridges the world of geoscience with that of astronomy. The major challenge consists in studying and modifying existing geophysical software to work with synthetic solar data not obtained by direct measurement but useful for testing and evaluation, and data released from the satellite HINODE and the Solar Dynamics Observatory. I have been fortunate to collaborate closely with multiple geoscientists keen to share their software codes and help me understand their implementations so I can extend the methodology to solve problems in solar physics. Moreover, two additional experiences have helped me develop my research and collaborative skills. First was an opportunity to involve an undergraduate student, and secondly, my participation at the GNU Hackers Meeting in Paris. Three aspects that need particular attention to enhance the collective productivity of any group of individuals keen to extend existing codes to achieve further interdisciplinary goals have been identified. (1) The production of easily reusable code that users can study and modify even when large sets of computations are involved. (2) The transformation of solutions into tools that are 100% free software. (3) The harmonisation of collaborative interactions that effectively tackle the two aforementioned tasks. Each one will be discussed in detail during this session based on my experience as a research student.
Socio-Cultural Challenges in Global Software Engineering Education
ERIC Educational Resources Information Center
Hoda, Rashina; Babar, Muhammad Ali; Shastri, Yogeshwar; Yaqoob, Humaa
2017-01-01
Global software engineering education (GSEE) is aimed at providing software engineering (SE) students with knowledge, skills, and understanding of working in globally distributed arrangements so they can be prepared for the global SE (GSE) paradigm. It is important to understand the challenges involved in GSEE for improving the quality and…
Use of Software Tools in Teaching Relational Database Design.
ERIC Educational Resources Information Center
McIntyre, D. R.; And Others
1995-01-01
Discusses the use of state-of-the-art software tools in teaching a graduate, advanced, relational database design course. Results indicated a positive student response to the prototype of expert systems software and a willingness to utilize this new technology both in their studies and in future work applications. (JKP)
Leveraging Code Comments to Improve Software Reliability
ERIC Educational Resources Information Center
Tan, Lin
2009-01-01
Commenting source code has long been a common practice in software development. This thesis, consisting of three pieces of work, made novel use of the code comments written in natural language to improve software reliability. Our solution combines Natural Language Processing (NLP), Machine Learning, Statistics, and Program Analysis techniques to…
Evaluating Games-Based Learning
ERIC Educational Resources Information Center
Hainey, Thomas; Connolly, Thomas
2010-01-01
A highly important part of software engineering education is requirements collection and analysis, one of the initial stages of the Software Development Lifecycle. No other conceptual work is as difficult to rectify at a later stage or as damaging to the overall system if performed incorrectly. As software engineering is a field with a reputation…
Building Software Development Capacity to Advance the State of Educational Technology
ERIC Educational Resources Information Center
Luterbach, Kenneth J.
2013-01-01
Educational technologists may advance the state of the field by increasing capacity to develop software tools and instructional applications. Presently, few academic programs in educational technology require even a single computer programming course. Further, the educational technologists who develop software generally work independently or in…
Knowledge Sharing through Pair Programming in Learning Environments: An Empirical Study
ERIC Educational Resources Information Center
Kavitha, R. K.; Ahmed, M. S.
2015-01-01
Agile software development is an iterative and incremental methodology, where solutions evolve from self-organizing, cross-functional teams. Pair programming is a type of agile software development technique where two programmers work together with one computer for developing software. This paper reports the results of the pair programming…
E-learning environment as intelligent tutoring system
NASA Astrophysics Data System (ADS)
Nagyová, Ingrid
2017-07-01
The development of computers and artificial intelligence theory allows their application in the field of education. Intelligent tutoring systems reflect student learning styles and adapt the curriculum according to their individual needs. The building of intelligent tutoring systems requires not only the creation of suitable software, but especially the search for and application of the rules enabling ICT to individually adapt the curriculum. The main idea of this paper is to attempt to specify the rules for dividing students into systematically working students and more practically or pragmatically inclined students. The paper shows that monitoring the work of students in an e-learning environment and analysing their various approaches to educational materials and correspondence assignments show different results for the defined groups of students.
The VTIE telescope resource management system
NASA Astrophysics Data System (ADS)
Busschots, B.; Keating, J. G.
2005-06-01
The VTIE Telescope Resource Management System (TRMS) provides a framework for managing a distributed group of internet telescopes as a single "Virtual Observatory". The TRMS provides hooks which allow it to be connected to any Java-based web portal and for a Java-based scheduler to be added to it. The TRMS represents each telescope and observatory in the system with a software agent and then allows the scheduler and web portal to communicate with these distributed resources in a simple, transparent way, hence allowing the scheduler and portal designers to concentrate only on what they wish to do with these resources rather than how to communicate with them. This paper outlines the structure and implementation of this framework.
ERIC Educational Resources Information Center
Angeli, Charoula; Valanides, Nicos; Polemitou, Eirini; Fraggoulidou, Elena
2014-01-01
The study examined the interaction between field dependence-independence (FD/I) and learning with modeling software and simulations, and their effect on children's performance. Participants were randomly assigned into two groups. Group A first learned with a modeling tool and then with simulations. Group B learned first with simulations and then…
Evaluation of Morphological Plasticity in the Cerebella of Basketball Players with MRI
Park, In Sung; Han, Jong Woo; Lee, Kea Joo; Lee, Nam Joon; Lee, Won Teak; Park, Kyung Ah
2006-01-01
The cerebellum is a key structure involved in motor learning and coordination. In animal models, motor skill learning increased the volume of the molecular layer and the number of synapses on Purkinje cells in the cerebellar cortex. The aim of this study is to investigate whether an analogous change in cerebellar volume occurs in a human population who learn specialized motor skills and practice them intensively for a long time. Magnetic resonance image (MRI)-based cerebellar volumetry was performed in basketball players and matched controls with V-works image software. Total brain volume, absolute and relative cerebellar volumes were compared between the two groups. There was no significant group difference in total brain volume or in absolute or relative cerebellar volume. Thus we could not detect structural change in the cerebellum of this athlete group at the macroscopic level. PMID:16614526
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soubies, B.; Henry, J.Y.; Le Meur, M.
1300 MWe pressurised water reactors (PWRs), like the 1400 MWe reactors, operate with microprocessor-based safety systems. This is particularly the case for the Digital Integrated Protection System (SPIN), which trips the reactor in an emergency and sets in action the safeguard functions. The software used in these systems must therefore be highly dependable in the execution of its functions. In the case of SPIN, three players are working at different levels to achieve this goal: the protection system manufacturer, Merlin Gerin; the designer of the nuclear steam supply system, Framatome; and the operator of the nuclear power plants, Electricite de France (EDF), which is also responsible for the safety of its installations. Regulatory licenses are issued by the French safety authority, the Nuclear Installations Safety Directorate (French abbreviation DSIN), subsequent to a successful examination of the technical provisions adopted by the operator. This examination is carried out by the IPSN and the standing group on nuclear reactors. This communication sets out: the methods used by the manufacturer to develop SPIN software for the 1400 MWe PWRs (N4 series); the approach adopted by the IPSN to evaluate the safety software of the protection system for the N4 series of reactors.
Ayling, Pete; Hill, Robert; Jassam, Nuthar; Kallner, Anders; Khatami, Zahra
2017-11-01
Background: The introduction of robotics and high-capacity analysers has led to a consolidation of laboratories into larger units. This requires new structures and quality systems to ensure that laboratories deliver consistent and comparable results. Methods: A spreadsheet program was designed to accommodate results from up to 12 different instruments/laboratories and to present IQC data, i.e. Levey-Jennings and Youden plots and comprehensive numerical tables of the performance of each item. Data input was made possible by a 'data loader' through which IQC data from the individual instruments could be transferred to the spreadsheet program online. Results: A set of real data from laboratories is used to populate the data loader and the networking software program. Examples are presented from the analysis of variance components and the Levey-Jennings and Youden plots. Conclusions: This report presents a software package that allows the simultaneous management and detailed monitoring of the performance of up to 12 different instruments/laboratories in a fully interactive mode. The system allows a quality manager of networked laboratories to keep a continuously updated overview of performance. This software package has been made available on the ACB website.
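As a rough sketch of the kind of per-instrument monitoring such a package performs, the snippet below derives Levey-Jennings control limits (mean ± 2SD and ± 3SD) for each instrument from pooled IQC values; the data layout and values are invented for illustration and are not the published spreadsheet program.

```python
# Minimal sketch (assumed data layout): compute Levey-Jennings limits per
# instrument from pooled IQC results, as a networking spreadsheet might.
import statistics

# {instrument_id: [measured control values]}
iqc = {
    "lab_01": [4.9, 5.1, 5.0, 5.2, 4.8],
    "lab_02": [5.3, 5.4, 5.1, 5.5, 5.2],
}

for lab, values in iqc.items():
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    # Levey-Jennings plots flag points outside mean +/- 2SD and +/- 3SD
    print(f"{lab}: mean={mean:.2f}  2SD=({mean - 2*sd:.2f}, {mean + 2*sd:.2f})  "
          f"3SD=({mean - 3*sd:.2f}, {mean + 3*sd:.2f})")
```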
Architecture for Survivable System Processing (ASSP)
NASA Astrophysics Data System (ADS)
Wood, Richard J.
1991-11-01
The Architecture for Survivable System Processing (ASSP) Program is a multi-phase effort to implement Department of Defense (DOD) and commercially developed high-tech hardware, software, and architectures for reliable space avionics and ground-based systems. System configuration options provide processing capabilities to address Time Dependent Processing (TDP), Object Dependent Processing (ODP), and Mission Dependent Processing (MDP) requirements through Open System Architecture (OSA) alternatives that allow for the enhancement, incorporation, and capitalization of a broad range of development assets. High-technology developments in hardware, software, and networking models address technology challenges of long processor lifetimes, fault tolerance, reliability, throughput, memories, radiation hardening, size, weight, power (SWAP), and security. Hardware and software design, development, and implementation focus on the interconnectivity/interoperability of an open system architecture and are being developed to apply new technology in practical OSA components. To ensure a widely acceptable architecture capable of interfacing with various commercial and military components, the program provides for regular interaction with standardization working groups, e.g. the International Standards Organization (ISO), the American National Standards Institute (ANSI), the Society of Automotive Engineers (SAE), and the Institute of Electrical and Electronics Engineers (IEEE). Selection of a viable open architecture is based on the widely accepted standards that implement the ISO/OSI Reference Model.
NASA Astrophysics Data System (ADS)
Faden, J.; Vandegriff, J. D.; Weigel, R. S.
2016-12-01
Autoplot was introduced in 2008 as an easy-to-use plotting tool for the space physics community. It reads data from a variety of file resources, such as CDF and HDF files, and from a number of specialized data servers, such as the PDS/PPI's DIT-DOS, CDAWeb, and the University of Iowa's RPWG Das2Server. Each of these servers has optimized methods for transmitting data for display in Autoplot, but requires coordination and specialized software to work, limiting Autoplot's ability to access new servers and datasets. Likewise, groups who would like software to access their APIs must either write their own clients or publish a specification document in the hope that people will write clients. The HAPI specification was written so that a simple, standard API could be used by both Autoplot and server implementations, to remove these barriers to the free flow of time series data. Autoplot's software for communicating with HAPI servers is presented, showing the user interface scientists will use and how data servers might implement the HAPI specification to provide access to their data. This will also include instructions on how Autoplot is installed and used on desktop computers to view data from the RBSP, Juno, and other missions.
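For illustration, a minimal HAPI-style data request might look like the sketch below. The exact query-parameter names differ between HAPI versions; this assumes the earlier id/time.min/time.max style and a CSV response, and the server URL and dataset id are placeholders rather than a real service, so the network call itself is left commented out.

```python
# Sketch of a minimal HAPI data request (HAPI 2.x-style parameters assumed).
# SERVER and the dataset id are placeholders, not a real endpoint.
import csv
import io
import urllib.parse

SERVER = "https://example.org/hapi"              # hypothetical HAPI server
params = {
    "id": "example_dataset",                     # hypothetical dataset id
    "time.min": "2016-01-01T00:00:00Z",
    "time.max": "2016-01-02T00:00:00Z",
}
url = SERVER + "/data?" + urllib.parse.urlencode(params)
print("request:", url)

def parse_hapi_csv(text: str):
    """HAPI /data responses are time-ordered records; the first column is the time tag."""
    return [(row[0], row[1:]) for row in csv.reader(io.StringIO(text)) if row]

# with urllib.request.urlopen(url) as resp:      # run against a real HAPI server
#     records = parse_hapi_csv(resp.read().decode("utf-8"))

print(parse_hapi_csv("2016-01-01T00:00:00Z,1.5\n2016-01-01T00:01:00Z,1.7\n"))
```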
Architecture for Survivable System Processing (ASSP)
NASA Technical Reports Server (NTRS)
Wood, Richard J.
1991-01-01
The Architecture for Survivable System Processing (ASSP) Program is a multi-phase effort to implement Department of Defense (DOD) and commercially developed high-tech hardware, software, and architectures for reliable space avionics and ground-based systems. System configuration options provide processing capabilities to address Time Dependent Processing (TDP), Object Dependent Processing (ODP), and Mission Dependent Processing (MDP) requirements through Open System Architecture (OSA) alternatives that allow for the enhancement, incorporation, and capitalization of a broad range of development assets. High-technology developments in hardware, software, and networking models address technology challenges of long processor lifetimes, fault tolerance, reliability, throughput, memories, radiation hardening, size, weight, power (SWAP), and security. Hardware and software design, development, and implementation focus on the interconnectivity/interoperability of an open system architecture and are being developed to apply new technology in practical OSA components. To ensure a widely acceptable architecture capable of interfacing with various commercial and military components, the program provides for regular interaction with standardization working groups, e.g. the International Standards Organization (ISO), the American National Standards Institute (ANSI), the Society of Automotive Engineers (SAE), and the Institute of Electrical and Electronics Engineers (IEEE). Selection of a viable open architecture is based on the widely accepted standards that implement the ISO/OSI Reference Model.
Automatic thermographic image defect detection of composites
NASA Astrophysics Data System (ADS)
Luo, Bin; Liebenberg, Bjorn; Raymont, Jeff; Santospirito, SP
2011-05-01
Detecting defects, and especially reliably measuring defect sizes, are critical objectives in automatic NDT defect detection applications. In this work, the Sentence software is proposed for the analysis of pulsed thermography and near-IR images of composite materials. Furthermore, the Sentence software delivers an end-to-end, user-friendly platform for engineers to perform complete manual inspections, as well as tools that allow senior engineers to develop inspection templates and profiles, reducing the requisite thermographic skill level of the operating engineer. Finally, the Sentence software can also offer complete independence from operator decisions through its fully automated "Beep on Defect" detection functionality. The end-to-end automatic inspection system includes sub-systems for defining a panel profile, generating an inspection plan, controlling a robot arm, and capturing thermographic images to detect defects. A statistical model has been built to analyze the entire image, evaluate grey-scale ranges, import sentencing criteria, and automatically detect impact damage defects. A full-width-half-maximum algorithm has been used to quantify flaw sizes. The identified defects are imported into the sentencing engine, which then sentences the inspection (automatically compares analysis results against acceptance criteria) by comparing the most significant defect or group of defects against the inspection standards.
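As an illustration of full-width-half-maximum sizing, independent of the Sentence implementation, the sketch below measures the width of a synthetic one-dimensional contrast profile at half its peak value; the profile and units are invented.

```python
# Illustrative FWHM sizing on a 1-D thermal-contrast profile across a defect;
# the profile here is synthetic, and this is not the Sentence software's code.
import numpy as np

x = np.linspace(-10, 10, 401)                   # position across the defect, mm
profile = np.exp(-x**2 / (2 * 2.0**2))          # synthetic Gaussian contrast, sigma = 2 mm

half_max = profile.max() / 2.0
above = x[profile >= half_max]                  # positions where contrast exceeds half maximum
fwhm = above.max() - above.min()                # full width at half maximum

print(f"estimated defect width: {fwhm:.2f} mm") # ~4.7 mm (2.355 * sigma) for this profile
```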
Management Guidelines for Database Developers' Teams in Software Development Projects
NASA Astrophysics Data System (ADS)
Rusu, Lazar; Lin, Yifeng; Hodosi, Georg
The worldwide job market for database developers (DBDs) has been growing continually over the last several years. In some companies, DBDs are organized as a special team (DBD team) to support other projects and roles. A major problem facing this relatively new role is the lack of management guidelines: the team manager does not know which kinds of tasks should be assigned to this team and what practices should be used during its work. Therefore, in this paper we have developed a set of management guidelines, comprising 8 fundamental tasks and 17 practices from the software development process, using two methodologies, the Capability Maturity Model (CMM) and agile software development (in particular Scrum), in order to improve DBD team work. Moreover, the management guidelines developed here have been complemented with practices from the authors' experience in this area and have been evaluated in the case of a software company. The management guidelines for DBD teams presented in this paper could also be useful for other companies that use a DBD team and could contribute to increasing the efficiency of these teams in their work on software development projects.
Hiscox, Lucy; Leonavičiūtė, Erika; Humby, Trevor
2014-08-01
Dyslexia is associated with difficulties in language-specific skills such as spelling, writing and reading; the difficulty in acquiring literacy skills is not a result of low intelligence or the absence of learning opportunity, but these issues will persist throughout life and could affect long-term education. Writing is a complex process involving many different functions, integrated by the working memory system; people with dyslexia have a working memory deficit, which means that concentration on writing quality may be detrimental to understanding. We confirm impaired working memory in a sample of university students with (compensated) dyslexia, and using a within-subject design with three test conditions, we show that these participants demonstrated better understanding of a piece of text if they had used automatic spelling correction software during a dictation/transcription task. We hypothesize that the use of the autocorrecting software reduced demand on working memory, by allowing word writing to be more automatic, thus enabling better processing and understanding of the content of the transcriptions and improved recall. Long-term and regular use of autocorrecting assistive software should be beneficial for people with and without dyslexia and may improve confidence, written work, academic achievement and self-esteem, which are all affected in dyslexia. Copyright © 2014 John Wiley & Sons, Ltd.
Working conditions, socioeconomic factors and low birth weight: path analysis.
Mahmoodi, Zohreh; Karimlou, Masoud; Sajjadi, Homeira; Dejman, Masoumeh; Vameghi, Meroe; Dolatian, Mahrokh
2013-09-01
In recent years, with socioeconomic changes in society, the presence of women in the workplace has become inevitable. Differences in working conditions, especially for pregnant women, have adverse consequences such as low birth weight. This study was conducted with the aim of modelling the relationship between working conditions, socioeconomic factors, and birth weight. The study used a case-control design. The control group consisted of 500 women with normal-weight babies, and the case group of 250 women with low-birth-weight babies, from selected hospitals in Tehran. Data were collected using a researcher-made questionnaire on mothers' lifestyle during pregnancy, built around a social-determinants-of-health approach. This questionnaire investigated women's occupational lifestyle in terms of working conditions, activities, and job satisfaction. Data were analyzed with SPSS-16 and Lisrel-8.8 software using path analysis. The final path model fitted well (CFI = 1, RMSEA = 0.00) and showed that working conditions had the strongest direct effect (β = -0.032), household income the strongest indirect effect (β = -0.42), and having an unemployed spouse the strongest overall effect (β = -0.1828) on low birth weight; negative coefficients indicate a decreasing effect on birth weight. Based on the path analysis model, working conditions and socioeconomic status directly and indirectly influence birth weight. Thus, as well as attention to treatment and health care (the biological aspect), special attention must also be paid to mothers' socioeconomic factors.
Using computer software to improve group decision-making.
Mockler, R J; Dologite, D G
1991-08-01
This article provides a review of some of the work done in the area of knowledge-based systems for strategic planning. Since 1985, with the founding of the Center for Knowledge-based Systems for Business Management, the project has focused on developing knowledge-based systems (KBS) based on these models. In addition, the project also involves developing a variety of computer and non-computer methods and techniques for assisting both technical and non-technical managers and individuals in decision modelling and KBS development. This paper presents a summary of one segment of the project: a description of integrative groupware useful in strategic planning. The work described here is part of an ongoing research project. As part of this project, for example, over 200 non-technical and technical business managers, most of them working full-time during the project, developed over 160 KBS prototype systems in conjunction with an MBA course in strategic planning and management decision making. Based on replies to a survey of this test group, 28 per cent of the survey respondents reported their KBS were used at work, 21 per cent reportedly received promotions, pay rises or new jobs based on their KBS development work, and 12 per cent reported their work led to participation in other KBS development projects at work. All but two of the survey respondents reported that their work on the KBS development project led to a substantial increase in their job knowledge or performance.
NASA Astrophysics Data System (ADS)
Malik, T.; Foster, I.; Goodall, J. L.; Peckham, S. D.; Baker, J. B. H.; Gurnis, M.
2015-12-01
Research activities are iterative, collaborative, and now data- and compute-intensive. This means that even the many researchers who work in small laboratories must often create, acquire, manage, and manipulate diverse data and keep track of complex software. They face difficult data and software management challenges, and data sharing and reproducibility are neglected. There is significant federal investment in powerful cyberinfrastructure, in part to lessen the burden associated with modern data- and compute-intensive research. Similarly, geoscience communities are establishing research repositories to facilitate data preservation. Yet we observe that a large fraction of the geoscience community continues to struggle with data and software management. The reason, studies suggest, is not lack of awareness but rather that tools do not adequately support time-consuming data life cycle activities. Through the NSF/EarthCube-funded GeoDataspace project, we are building personalized, shareable dataspaces that help scientists connect their individual or research group efforts with the community at large. The dataspaces provide a lightweight, multiplatform research data management system with tools for recording research activities in what we call geounits, so that a geoscientist can at any time snapshot and preserve, both for their own use and to share with the community, all data and code required to understand and reproduce a study. A software-as-a-service (SaaS) deployment model enhances the usability of core components and integration with widely used software systems. In this talk we will present the open-source GeoDataspace project and demonstrate how it is enabling reproducibility across the geoscience domains of hydrology, space science, and modeling toolkits.
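To make the geounit idea concrete, the hypothetical sketch below snapshots the files behind a study into a single manifest with content hashes so the exact inputs can be verified later; the function and field names are illustrative and are not the GeoDataspace API.

```python
# Hypothetical sketch of the "geounit" idea: record the files and metadata
# needed to reproduce a study in one manifest. Names are illustrative only.
import hashlib
import json
import pathlib
import time


def snapshot_geounit(paths, label, out="geounit_manifest.json"):
    """Hash every file so the exact inputs of a study can be verified later."""
    entries = []
    for p in map(pathlib.Path, paths):
        digest = hashlib.sha256(p.read_bytes()).hexdigest()
        entries.append({"path": str(p), "sha256": digest, "bytes": p.stat().st_size})
    manifest = {
        "label": label,
        "created": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "files": entries,
    }
    pathlib.Path(out).write_text(json.dumps(manifest, indent=2))
    return manifest


# Example use: snapshot a model script and its input data before publishing results.
# snapshot_geounit(["run_model.py", "input_grid.nc"], label="hydrology-run-42")
```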
A Bibliography of Externally Published Works by the SEI Engineering Techniques Program
1992-08-01
Excerpt (fragmentary): topic areas include media and virtual reality, model-based engineering, programming languages, reuse, software architectures, and software engineering as a discipline. Sample entries: "...Knowledge-Based Engineering Environments." IEEE Expert 3, 2 (May 1988): 18-23, 26-32. Audience: Practitioner. [Klein89b] Klein, D.V. "Comparison of... Terms with Software Reuse Terminology: A Model-Based Approach." ACM SIGSOFT Software Engineering Notes 16, 2 (April 1991): 45-51. Audience: Practitioner.
System integration test plan for HANDI 2000 business management system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, D.
This document presents the system integration test plan for the commercial off-the-shelf (COTS) PassPort (PP) and PeopleSoft (PS) software and for the custom software created to work with the COTS products. The PP software is an integrated application for AP, Contract Management, Inventory Management, Purchasing, and Material Safety Data Sheets. The PS software is an integrated application for Project Costing, General Ledger, Human Resources/Training, Payroll, and Base Benefits.
NASA Technical Reports Server (NTRS)
Kincaid, D. R.; Young, D. M.
1984-01-01
Adapting and designing mathematical software to achieve optimum performance on the CYBER 205 is discussed. Comments and observations are made in light of recent work done on modifying the ITPACK software package and on writing new software for vector supercomputers. The goal was to develop very efficient vector algorithms and software for solving large sparse linear systems using iterative methods.
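The abstract does not name the specific ITPACK routines that were modified, but the flavor of the work can be suggested with a representative vectorizable iterative kernel: a Jacobi-type sweep for a sparse, diagonally dominant system, sketched here in Python/NumPy purely for illustration rather than in the original Fortran.

```python
# Representative example only: a vectorized Jacobi iteration for a sparse system,
# illustrating the kind of iterative kernel that vectorizes well on a vector
# supercomputer; this is not the ITPACK code itself.
import numpy as np
import scipy.sparse as sp


def jacobi(A, b, tol=1e-8, max_iter=500):
    d = A.diagonal()
    x = np.zeros_like(b)
    for _ in range(max_iter):
        # x_new = D^{-1} (b - (A - D) x); the sparse matrix-vector product is the vector kernel
        x_new = (b - A.dot(x) + d * x) / d
        if np.linalg.norm(x_new - x) < tol * np.linalg.norm(b):
            return x_new
        x = x_new
    return x


# 1-D model problem: tridiagonal and strictly diagonally dominant, so Jacobi converges.
n = 100
A = sp.diags([-1.0, 2.5, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)
x = jacobi(A, b)
print("residual norm:", np.linalg.norm(A.dot(x) - b))
```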
Lopez, Ramón Guisado; Polo, Isabel Ramirez; Berral, Jose Eduardo Arjona; Fernandez, Julia Guisado; Castelo-Branco, Camil
2015-04-01
To design software to assist health care providers with contraceptive counselling. The Model-View-Controller software architecture pattern was used. Decision logic was incorporated to automatically compute the safety category of each contraceptive option. Decisions are made according to the specific characteristics or known medical conditions of each potential contraceptive user. The software is an app designed for the iOS and Android platforms and is available in four languages. iContraception® facilitates presentation of visual data on medical eligibility criteria for contraceptive treatments. The use of this software was evaluated by a sample of 54 health care providers. General satisfaction with the use of the app was over 8 on a 0-10 visual analogue scale in 96.3% of cases. iContraception provides easy access to the medical eligibility criteria of contraceptive options and may help with contraceptive counselling.
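The decision logic described above can be pictured as a lookup that maps a user's conditions to the most restrictive eligibility category per method, as in this simplified sketch; the two rules shown are placeholders for illustration only and are not actual medical eligibility criteria or the iContraception rule base.

```python
# Sketch of the decision-logic idea only: take the most restrictive category
# triggered by a user's conditions. The rules are placeholders, NOT real
# medical eligibility criteria.
RULES = {
    "combined_oral_contraceptive": {
        "smoker_over_35": 4,      # placeholder category, illustrative only
        "migraine_with_aura": 4,  # placeholder category, illustrative only
    },
    "copper_iud": {
        "smoker_over_35": 1,      # placeholder category, illustrative only
    },
}


def eligibility(method: str, conditions: set) -> int:
    """Return the worst (highest) category triggered; default to 1 if none apply."""
    categories = [RULES.get(method, {}).get(c, 1) for c in conditions]
    return max(categories, default=1)


print(eligibility("combined_oral_contraceptive", {"smoker_over_35"}))  # -> 4
print(eligibility("copper_iud", {"smoker_over_35"}))                   # -> 1
```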
ESO Demonstration Project with the NRAO 12-m Antenna
NASA Astrophysics Data System (ADS)
Heald, R.; Karban, R.
2000-03-01
During the months of September through November 1999, an ALMA joint demonstration project between the European Southern Observatory (ESO) and the National Radio Astronomy Observatory (NRAO) was carried out in Socorro, New Mexico. During this period, Robert Karban (ESO) and Ron Heald (NRAO) worked together on the ESO Demonstration Project. The project integrated ESO software and existing NRAO software (a prototype for the future ALMA control software) to control the motion of the Kitt Peak 12-m antenna. ESO software from the VLT provided the operator interface and coordinate transformation software, while Pat Wallace's TPOINT provided the pointing-model software.
Tucker, D M; Wenckus, C S; Bentkover, S K
1997-03-01
Twenty-two mesial roots of extracted human mandibular molars were divided into two groups based on root curvature and length. The mesiolingual canals were instrumented using either Flexofiles with a step-back anticurvature filing method or engine-driven 0.02 taper nickel-titanium files. Ground sections were prepared at the 1-, 2.5-, and 5-mm levels from the working length. The mesiobuccal canal was used as an uninstrumented control for predentin character. Digitizing software was used to calculate the instrumented portion as a percentage of the total canal perimeter. The results indicated no significant difference in overall canal wall planing between the two groups and no significant difference at any of the three levels.
Extreme Energy Particle Astrophysics with ANITA-V
NASA Astrophysics Data System (ADS)
Wissel, Stephanie
This proposal is in collaboration with Peter Gorham at the University of Hawaii, who is the PI of the lead proposal. Co-I Wissel and her group at California Polytechnic State University (Cal Poly) will be responsible for calibration equipment upgrades, calibration equipment, and deployment of the calibration system. The Cal Poly group is planning to provide calibration hardware and software products in support of the analysis of ANITA-V data in search of ultra-high-energy (UHE) neutrinos and cosmic rays. Wissel (now at Cal Poly, a new collaborating institution for ANITA-V) brings significant experience in the detection of high-energy and ultra-high-energy particles to the collaboration, leveraging her thirteen years of experience in particle astrophysics and previous work on ANITA-III and ANITA-IV.
Kim, Dong-Yeon; Kim, Eo-Bin; Kim, Hae-Young; Kim, Ji-Hwan
2017-01-01
PURPOSE To evaluate the fit of a three-unit metal framework of fixed dental prostheses made by subtractive and additive manufacturing. MATERIALS AND METHODS One master model was fabricated in metal. Twenty silicone impressions were made of the master die; 10 working dies were poured with Type IV stone and 10 were made of scannable stone. Ten three-unit wax frameworks were fabricated by wax-up on the Type IV working dies. Stereolithography files of 10 three-unit frameworks were obtained using a model scanner and three-dimensional design software on the scannable working dies. The three-unit wax framework was fabricated using subtractive manufacturing (SM) by applying the prepared stereolithography file, and the resin framework was fabricated by additive manufacturing (AM); both were cast in metal alloy to produce the metal frameworks. Marginal and internal gaps were measured using the silicone replica technique and a digital microscope. Measurement data were analyzed by the Kruskal-Wallis H test and Mann-Whitney U test (α=.05). RESULTS The lowest and highest gaps between premolar and molar margins were in the SM group and the AM group, respectively. There was a statistically significant difference in the marginal gap among the 3 groups (P<.001). In the marginal area where the pontic was present, the largest gap was 149.39 ± 42.30 µm in the AM group, and the lowest gap was 24.40 ± 11.92 µm in the SM group. CONCLUSION Three-unit metal frameworks made by subtractive manufacturing are clinically applicable. However, additive manufacturing requires more research before it can be applied clinically. PMID:29279766
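The statistical comparison in the abstract (Kruskal-Wallis across the three groups, Mann-Whitney between pairs) can be reproduced on synthetic numbers with a few lines of SciPy; the gap values below are invented and carry no clinical meaning.

```python
# Illustration of the study's statistical tests on synthetic marginal-gap data
# (micrometres); the numbers are made up for demonstration only.
from scipy import stats

gap_sm = [26, 31, 24, 29, 33, 27, 30, 25, 28, 32]
gap_am = [140, 152, 148, 161, 137, 155, 149, 143, 158, 151]
gap_wax = [60, 72, 65, 70, 68, 63, 75, 66, 69, 71]

h, p_kw = stats.kruskal(gap_sm, gap_am, gap_wax)                       # across all groups
u, p_mw = stats.mannwhitneyu(gap_sm, gap_am, alternative="two-sided")  # pairwise comparison
print(f"Kruskal-Wallis H={h:.2f}, p={p_kw:.4f}")
print(f"Mann-Whitney U={u:.1f}, p={p_mw:.4f}")
```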
Commercial Aircraft Integrated Vehicle Health Management Study
NASA Technical Reports Server (NTRS)
Reveley, Mary S.; Briggs, Jeffrey L.; Evans, Joni K.; Jones, Sharon Monica; Kurtoglu, Tolga; Leone, Karen M.; Sandifer, Carl E.; Thomas, Megan A.
2010-01-01
Statistical data and literature from academia, industry, and other government agencies were reviewed and analyzed to establish requirements for future work in detection, diagnosis, prognosis, and mitigation for IVHM-related hardware and software. Around 15 to 20 percent of commercial aircraft accidents between 1988 and 2003 involved malfunctions or failures of some aircraft system or component. Engine and landing gear failures/malfunctions dominate both accidents and incidents. The IVHM Project research technologies were found to map to the Joint Planning and Development Office's National Research and Development Plan (RDP) as well as the Safety Working Group's National Aviation Safety Strategic Plan (NASSP). Future directions in aviation technology as related to IVHM were identified by reviewing papers from three conferences across a five-year time span. A total of twenty-one trend groups in propulsion, aeronautics, and aircraft categories were compiled. Current and future directions of IVHM-related technologies were gathered and classified according to eight categories: measurement and inspection, sensors, sensor management, detection, component and subsystem monitoring, diagnosis, prognosis, and mitigation.
Non-Grey Radiation Modeling using Thermal Desktop/Sindaworks TFAWS06-1009
NASA Technical Reports Server (NTRS)
Anderson, Kevin R.; Paine, Chris
2006-01-01
This paper provides an overview of the non-grey radiation modeling capabilities of Cullimore and Ring's Thermal Desktop® Version 4.8 SindaWorks software. The non-grey radiation analysis theory implemented by SindaWorks and the methodology used by the software are outlined. Representative results from a parametric trade study of a radiation shield composed of a series of V-groove-shaped deployable panels are used to illustrate the capabilities of the SindaWorks non-grey radiation thermal analysis software, using emissivities with temperature and wavelength dependency modeled via a Hagen-Rubens relationship.
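For reference, one commonly quoted form of the Hagen-Rubens relation for the normal spectral emissivity of a good electrical conductor at long wavelengths is shown below; the abstract does not state the exact expression SindaWorks uses, so this is illustrative only.

$$
\varepsilon_{n,\lambda}(\lambda,T)\;\approx\;2\sqrt{2\,\varepsilon_0\,\omega\,\rho_e(T)}\;=\;4\sqrt{\frac{\pi\,\varepsilon_0\,c\,\rho_e(T)}{\lambda}},\qquad \omega=\frac{2\pi c}{\lambda},
$$

where $\rho_e(T)$ is the DC electrical resistivity. The temperature dependence of $\rho_e$ and the explicit $\lambda^{-1/2}$ factor supply the temperature and wavelength dependency the abstract refers to.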
ERIC Educational Resources Information Center
Wulfson, Stephen, Ed.
1990-01-01
Reviewed are six computer software packages including "Lunar Greenhouse," "Dyno-Quest," "How Weather Works," "Animal Trackers," "Personal Science Laboratory," and "The Skeletal and Muscular Systems." Availability, functional, and hardware requirements are discussed. (CW)
Evaluation of ergonomic factors and postures that cause muscle pains in dentistry students' bodies.
Shirzaei, Masoumeh; Mirzaei, Ramazan; Khaje-Alizade, Ali; Mohammadi, Mahdi
2015-07-01
Work-related musculoskeletal disorders commonly experienced by dental professionals are one of the main occupational health problems affecting their health and well-being. This study was conducted to evaluate ergonomic factors and profession-related postures and also to investigate the relationship of demographic factors and working conditions with pain in dental students. Sixty freshman and sophomore dentistry students were randomly chosen as the control group, and 60 fifth- and sixth-year students were selected as the exposure group. Data on the subjects, such as sex, exercise habits, and severity of musculoskeletal pain, were obtained through a questionnaire. Students' postures were directly observed while treating patients and were scored by the REBA method. Data were analyzed with SPSS software using Mann-Whitney, Kruskal-Wallis, Spearman, and Kendall correlation tests. 80.8% of the subjects were not aware of the correct ergonomic postures for dental procedures. Severity of musculoskeletal pain in the exposure group (15.9 ± 4.2) was significantly higher than in the control group (10.5 ± 3.2) (p < 0.001). The risk level of most subjects (84%) was medium. Students who were more involved in clinical activities experienced more muscular pain. The musculoskeletal disorders are probably due to prolonged working hours in static positions, incorrect work postures, applying excessive force, and even the tools and instruments used. Therefore, students who are aware of the ergonomic principles of their own profession would be able to maintain their health throughout their professional activities and lifelong. Key words: Posture, dentistry, students, musculoskeletal pain.
Evaluation of ergonomic factors and postures that cause muscle pains in dentistry students’ bodies
Shirzaei, Masoumeh; Khaje-Alizade, Ali; Mohammadi, Mahdi
2015-01-01
Background Work-related musculoskeletal disorders commonly experienced by dental professionals are one of the main occupational health problems affecting their health and well-being. This study was conducted to evaluate ergonomic factors and profession-related postures and also to investigate the relationship of demographic factors and working conditions with pain in dental students. Material and Methods Sixty freshman and sophomore dentistry students were randomly chosen as the control group, and 60 fifth- and sixth-year students were selected as the exposure group. Data on the subjects, such as sex, exercise habits, and severity of musculoskeletal pain, were obtained through a questionnaire. Students' postures were directly observed while treating patients and were scored by the REBA method. Data were analyzed with SPSS software using Mann-Whitney, Kruskal-Wallis, Spearman, and Kendall correlation tests. Results 80.8% of the subjects were not aware of the correct ergonomic postures for dental procedures. Severity of musculoskeletal pain in the exposure group (15.9 ± 4.2) was significantly higher than in the control group (10.5 ± 3.2) (p < 0.001). The risk level of most subjects (84%) was medium. Students who were more involved in clinical activities experienced more muscular pain. Conclusions The musculoskeletal disorders are probably due to prolonged working hours in static positions, incorrect work postures, applying excessive force, and even the tools and instruments used. Therefore, students who are aware of the ergonomic principles of their own profession would be able to maintain their health throughout their professional activities and lifelong. Key words: Posture, dentistry, students, musculoskeletal pain. PMID:26330941
Design of a Software Configuration for Real-Time Multimedia Group Communication; HNUMTP
NASA Astrophysics Data System (ADS)
Park, Gil-Cheol
This paper presents the design and implementation of a multi-session, multi-channel transport protocol for real-time multimedia group communication. The protocol has two distinguishing features. First, it addresses the synchronization problem characteristic of multimedia communication by using a multi-channel approach. Conventional multimedia communication assigns one channel per media stream; this work reduces the time the receiver spends waiting for data to synchronize by assigning more than one channel to high-rate media such as video. The resulting inter-media synchronization problem is solved by sending temporal/spatial relationship data over an additional control channel. Second, the protocol provides integrated session management. Each session is one group-communication unit that supports an independent collaborative working environment. Participants within each session communicate independently, while the session manager handles all communication among groups and allows media sources connected across the network to be operated efficiently.
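As a purely conceptual sketch of the control-channel idea (the message format and field names below are hypothetical, not the HNUMTP wire format), the control channel could carry records that relate sequence numbers on the media channels to a common presentation instant:

```python
# Conceptual sketch only: media frames travel on their own channels, while a
# separate control channel carries the timing relations the receiver needs for
# inter-media synchronization. Field names are hypothetical.
import json

def control_message(session_id, video_seq, audio_seq, presentation_ms):
    """Relate a video frame and an audio frame to one presentation instant."""
    return json.dumps({
        "session": session_id,
        "video_seq": video_seq,        # sequence number on the video channel(s)
        "audio_seq": audio_seq,        # sequence number on the audio channel
        "present_at_ms": presentation_ms,
    })

# The receiver buffers media per channel and releases frames when the control
# channel indicates their presentation instants line up.
print(control_message("demo-session", video_seq=1042, audio_seq=877, presentation_ms=35000))
```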
NASA Astrophysics Data System (ADS)
Koparan, Timur
2016-02-01
In this study, the effect of dynamic statistics software on the achievement and attitudes of prospective teachers is examined. To this end, an achievement test, an attitude scale for statistics, and interviews were used as data collection tools. The achievement test comprises 8 problems based on statistical data, and the attitude scale comprises 13 Likert-type items. The study was carried out in the fall semester of the 2014-2015 academic year at a university in Turkey. The study, which employed a quasi-experimental pre-test/post-test control group design, was carried out on a group of 80 prospective teachers, 40 in the control group and 40 in the experimental group. Both groups had four-hour classes on descriptive statistics. The classes with the control group were carried out through traditional methods, while dynamic statistics software was used in the experimental group. Five prospective teachers from the experimental group were interviewed clinically after the intervention for a deeper examination of their views about it. The qualitative data gained are presented under various themes. At the end of the study, it was found that there is a significant difference in favour of the experimental group in terms of achievement and attitudes, and that the prospective teachers have a positive attitude towards the use of dynamic software and see it as an effective tool for enriching mathematics classes. In accordance with the findings of the study, it is suggested that dynamic software, which offers unique opportunities, be used in classes by teachers and students.
Development of electronic software for the management of trauma patients on the orthopaedic unit.
Patel, Vishal P; Raptis, Demitri; Christofi, T; Mathew, Rajeev; Horwitz, M D; Eleftheriou, K; McGovern, Paul D; Youngman, J; Patel, J V; Haddad, F S
2009-04-01
Continuity of patient care is an essential prerequisite for the successful running of a trauma surgery service. This is becoming increasingly difficult because of the new working arrangements of junior doctors. Handover is now central to ensuring continuity of care following shift changeover. The purpose of this study was to compare the quality of information handed over using the traditional ad hoc method of a handover sheet versus a web-based electronic software programme. It was hoped that, through improved quality of handover, the new system would have a positive impact on clinical care, risk, and time management. Data were prospectively collected and analyzed using the SPSS 14 statistical package. The handover data of 350 patients using a paper-based system were compared to the data of 357 cases using the web-based system. Key data included basic demographic data, responsible surgeon, location of patient, injury details including side and anatomical site, whether fractures were open or closed, concomitant injuries, and the treatment plan. A survey was conducted amongst health care providers to assess the impact of the new software. With the introduction of the electronic handover system, patients with missing demographic data reduced from 35.1% to 0.8% (p<0.0001) and missing patient location from 18.6% to 3.6% (p<0.0001). Missing consultant information and missing diagnosis dropped from 12.9% to 2.0% (p<0.0001) and from 11.7% to 0.8% (p<0.0001), respectively. Missing information regarding the side and anatomical site of the injury was reduced from 31.4% to 0.8% (p<0.0001) and from 13.7% to 1.1% (p<0.0001), respectively. In 96.6% of paper ad hoc handovers it was not stated whether the injury was 'closed' or 'open', whereas in the electronic group this information was evident in all 357 patients (p<0.0001). A treatment plan was included in only 52.3% of paper handovers compared to 94.7% (p<0.0001) of electronic handovers. A survey revealed that 96% of members of the trauma team felt handover had improved since the introduction of the software, and 94% of members were satisfied with the software. The findings of our study show that the use of web-based electronic software is effective in facilitating and improving the quality of information passed during handover. Structured software also aids in improving workflow amongst the trauma team. We argue that an improvement in the quality of handover is an improvement in clinical practice.
Taveira-Gomes, Tiago; Ferreira, Patrícia; Taveira-Gomes, Isabel; Severo, Milton; Ferreira, Maria Amélia
2016-08-01
Computer-based learning (CBL) has been widely used in medical education, and reports regarding its usage and effectiveness have ranged broadly. Most work has been done on the effectiveness of CBL approaches versus traditional methods, and little has been done on the comparative effects of CBL versus CBL methodologies. These findings urged other authors to recommend such studies in hopes of improving knowledge about which CBL methods work best in which settings. In this systematic review, we aimed to characterize recent studies of the development of software platforms and interventions in medical education, search for common points among studies, and assess whether recommendations for CBL research are being taken into consideration. We conducted a systematic review of the literature published from 2003 through 2013. We included studies written in English, specifically in medical education, regarding either the development of instructional software or interventions using instructional software, during training or practice, that reported learner attitudes, satisfaction, knowledge, skills, or software usage. We conducted 2 latent class analyses to group articles according to platform features and intervention characteristics. In addition, we analyzed references and citations for abstracted articles. We analyzed 251 articles. The number of publications rose over time, and they encompassed most medical disciplines, learning settings, and training levels, totaling 25 different platforms specifically for medical education. We uncovered 4 latent classes for educational software, characteristically making use of multimedia (115/251, 45.8%), text (64/251, 25.5%), Web conferencing (54/251, 21.5%), and instructional design principles (18/251, 7.2%). We found 3 classes for intervention outcomes: knowledge and attitudes (175/212, 82.6%), knowledge, attitudes, and skills (11.8%), and online activity (12/212, 5.7%). About a quarter of the articles (58/227, 25.6%) did not hold references or citations in common with other articles. The number of common references and citations increased in articles reporting instructional design principles (P=.03), articles measuring online activities (P=.01), and articles citing a review by Cook and colleagues on CBL (P=.04). There was an association between number of citations and studies comparing CBL versus CBL, independent of publication date (P=.02). Studies in this field vary highly, and a high number of software systems are being developed. It seems that past recommendations regarding CBL interventions are being taken into consideration. A move into a more student-centered model, a focus on implementing reusable software platforms for specific learning contexts, and the analysis of online activity to track and predict outcomes are relevant areas for future research in this field.
Software Engineering for Human Spaceflight
NASA Technical Reports Server (NTRS)
Fredrickson, Steven E.
2014-01-01
The Spacecraft Software Engineering Branch of NASA Johnson Space Center (JSC) provides world-class products, leadership, and technical expertise in software engineering, processes, technology, and systems management for human spaceflight. The branch contributes to major NASA programs (e.g. ISS, MPCV/Orion) with in-house software development and prime contractor oversight, and maintains the JSC Engineering Directorate CMMI rating for flight software development. Software engineering teams work with hardware developers, mission planners, and system operators to integrate flight vehicles, habitats, robotics, and other spacecraft elements. They seek to infuse automation and autonomy into missions, and apply new technologies to flight processor and computational architectures. This presentation will provide an overview of key software-related projects, software methodologies and tools, and technology pursuits of interest to the JSC Spacecraft Software Engineering Branch.
Statistical modeling of software reliability
NASA Technical Reports Server (NTRS)
Miller, Douglas R.
1992-01-01
This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.
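To give a flavor of the kind of simulation such an experiment involves, the sketch below draws inter-failure times from a Jelinski-Moranda-style reliability-growth model (a common model chosen here only for illustration; the working paper does not specify it) and shows the failure rate falling as faults are removed.

```python
# Illustrative reliability-growth simulation: inter-failure times are exponential
# with a hazard proportional to the number of faults remaining (Jelinski-Moranda
# form), so the failure rate drops as debugging proceeds. Parameters are assumed.
import random

random.seed(1)
N, phi = 30, 0.02          # assumed initial fault count and per-fault hazard

t = 0.0
for i in range(1, N + 1):
    rate = phi * (N - i + 1)          # remaining faults drive the current hazard
    t += random.expovariate(rate)     # exponential inter-failure time
    if i % 5 == 0:
        print(f"failure {i:2d} at t = {t:8.1f}   current rate = {rate:.3f}")
```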