Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-12
... Packard Company, Business Critical Systems, Mission Critical Business Software Division, OpenVMS Operating... Business Software Division, OpenVMS Operating System Development Group, Including an Employee Operating Out... Company, Business Critical Systems, Mission Critical Business Software Division, OpenVMS Operating System...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-01
... Packard Company Business Critical Systems, Mission Critical Business Software Division, OpenVMS Operating... Software Division, OpenVMS Operating System Development Group, Including an Employee Operating Out of the..., Mission Critical Business Software Division, OpenVMS Operating System Development Group, including...
IRAF and STSDAS under the new ALPHA architecture
NASA Technical Reports Server (NTRS)
Zarate, N. R.
1992-01-01
Digital's next generation RISC architecture, known as ALPHA, presents many IRAF system portability questions and challenges to both site managers and end users. DEC promises to support the ULTRIX, VMS, and OSF/1 operating systems, which should allow IRAF to be ported to the new architecture at either the program executable level (using VEST), or at the source level, where IRAF can be tuned for greater performance. These notes highlight some of the details of porting IRAF to OpenVMS on the ALPHA architecture.
Cartographic applications software
1992-01-01
The Office of the Assistant Division Chief for Research, National Mapping Division, develops computer software for the solution of geometronic problems in the fields of surveying, geodesy, remote sensing, and photogrammetry. Software that has been developed using public funds is available on request for a nominal charge to recover the cost of duplication.
ERIC Educational Resources Information Center
Kara, Yilmaz; Yesilyurt, Selami
2008-01-01
The purpose of this study was to investigate the effects of tutorial and edutainment design of instructional software programs related to the "cell division" topic on student achievements, misconceptions and attitudes. An experimental research design including the cell division achievement test (CAT), the cell division concept test (CCT) and…
Design and implementation of a Windows NT network to support CNC activities
NASA Technical Reports Server (NTRS)
Shearrow, C. A.
1996-01-01
The Manufacturing, Materials, & Processes Technology Division is undergoing dramatic changes to bring its manufacturing practices current with today's technological revolution. The Division is developing Computer Automated Design and Computer Automated Manufacturing (CAD/CAM) abilities. The development of resource tracking is underway in the form of an accounting software package called Infisy. These two efforts will bring the division into the 1980's in relationship to manufacturing processes. Computer Integrated Manufacturing (CIM) is the final phase of change to be implemented. This document is a qualitative study and application of a CIM application capable of finishing the changes necessary to bring the manufacturing practices into the 1990's. The documentation provided in this qualitative research effort includes discovery of the current status of manufacturing in the Manufacturing, Materials, & Processes Technology Division, including the software, hardware, network, and mode of operation. The proposed direction of research included a network design, computers to be used, software to be used, machine-to-computer connections, an estimated timeline for implementation, and a cost estimate. Recommendations for the division's improvement include actions to be taken, software to utilize, and computer configurations.
MDSplus quality improvement project
Fredian, Thomas W.; Stillerman, Joshua; Manduchi, Gabriele; ...
2016-05-31
MDSplus is a data acquisition and analysis system used worldwide, predominantly in the fusion research community. Development began 29 years ago on the OpenVMS operating system. Since that time there have been many new features added and the code has been ported to many different operating systems. There have been contributions to the MDSplus development from the fusion community in the way of feature suggestions, feature implementations, documentation, and porting to different operating systems. The bulk of the development and support of MDSplus, however, has been provided by a relatively small core developer group of three or four members. Given the size of the development team and the large number of users, much more effort was focused on providing new features for the community than on keeping the underlying code and documentation up to date with evolving software development standards. To ensure that MDSplus will continue to meet the needs of the community in the future, the MDSplus development team, along with other members of the MDSplus user community, has commenced a major quality improvement project. The planned improvements include changes to software build scripts to better use GNU Autoconf and Automake tools, refactoring many of the source code modules using new language features available in modern compilers, using GNU MinGW-w64 to create MS Windows distributions, migrating to a more modern source code management system, improvement of source documentation as well as improvements to the www.mdsplus.org web site documentation and layout, and the addition of more comprehensive test suites to apply to MDSplus code builds prior to releasing installation kits to the community. These improvements should lead to a much more robust product and establish a framework for maintaining stability as more enhancements and features are added. Finally, this paper describes these efforts, which are either in progress or planned for the near future.
NASA Astrophysics Data System (ADS)
Kara, Yılmaz; Yeşilyurt, Selami
2008-02-01
The purpose of this study was to investigate the effects of tutorial and edutainment design of instructional software programs related to the "cell division" topic on student achievements, misconceptions and attitudes. An experimental research design including the cell division achievement test (CAT), the cell division concept test (CCT) and biology attitude scale (BAS) was applied at the beginning and at the end of the research. After the treatment, general achievement in CAT increased in favor of the experimental groups. The instructional software programs also had a positive effect on students' awareness of the general functions of mitosis and meiosis. However, the current study revealed that some misconceptions persisted in the experimental groups even after the treatment. It was also noticed that only the edutainment software program significantly changed students' attitudes towards biology.
Path Searching Based Fault Automated Recovery Scheme for Distribution Grid with DG
NASA Astrophysics Data System (ADS)
Xia, Lin; Qun, Wang; Hui, Xue; Simeng, Zhu
2016-12-01
Path searching based on distribution network topology has proven effective in protection-setting software, and a path-searching method that accounts for DG power sources is also applicable to the automatic generation and division of planned islands after a fault. This paper applies the path-searching algorithm to the automatic division of planned islands after faults: starting from the fault-isolation switch and ending at each power source, it uses the line load traversed along the searched path and the important load integrated along the optimized search path to form an optimized division scheme of planned islands in which each DG serves as a power source balanced against the local important load. Finally, the COBASE setting software and the distribution network automation software in use are employed to illustrate the effectiveness of the automatic restoration scheme.
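As a rough illustration of the island-division idea described above, the following Python sketch performs a breadth-first path search from the fault-isolation switch toward each DG and grows an island along the path while the DG's capacity can still cover the accumulated load. The feeder graph, load values, and DG ratings are hypothetical placeholders, not data from the paper, and the sketch ignores refinements such as preventing overlapping islands.

    from collections import deque

    # Hypothetical feeder graph: adjacency list of buses/switches, per-bus load (kW),
    # and assumed DG ratings (kW). None of these values come from the paper.
    feeder = {
        "S_fault": ["B1"],
        "B1": ["S_fault", "B2", "B3"],
        "B2": ["B1", "DG1"],
        "B3": ["B1", "DG2"],
        "DG1": ["B2"],
        "DG2": ["B3"],
    }
    load = {"S_fault": 0, "B1": 40, "B2": 30, "B3": 25, "DG1": 0, "DG2": 0}
    dg_capacity = {"DG1": 80, "DG2": 50}

    def plan_islands(start):
        """For each DG, find a path from the fault-isolation switch to the DG by
        breadth-first search, then grow an island outward from the DG along that
        path while the accumulated load stays within the DG's capacity."""
        islands = {}
        for dg, cap in dg_capacity.items():
            queue, seen, path = deque([[start]]), {start}, None
            while queue:
                p = queue.popleft()
                if p[-1] == dg:
                    path = p
                    break
                for nxt in feeder[p[-1]]:
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(p + [nxt])
            if path is None:
                continue                      # DG unreachable from the fault switch
            served, total = [], 0.0
            for bus in reversed(path):        # grow the island from the DG outward
                if total + load[bus] > cap:
                    break
                total += load[bus]
                served.append(bus)
            islands[dg] = {"buses": served, "load_kW": total}
        return islands

    print(plan_islands("S_fault"))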
Animated software training via the internet: lessons learned
NASA Technical Reports Server (NTRS)
Scott, C. J.
2000-01-01
The Mission Execution and Automation Section, Information Technologies and Software Systems Division at the Jet Propulsion Laboratory, recently delivered an animated software training module for the TMOD UPLINK Consolidation Task for operator training at the Deep Space Network.
NASA Astrophysics Data System (ADS)
Wang, Fu; Liu, Bo; Zhang, Lijia; Jin, Feifei; Zhang, Qi; Tian, Qinghua; Tian, Feng; Rao, Lan; Xin, Xiangjun
2017-03-01
The wavelength-division multiplexing passive optical network (WDM-PON) is a potential technology to carry multiple services in an optical access network. However, it suffers from high cost and immature technology on the user side. A software-defined WDM/time-division multiplexing PON was proposed to meet the requirements of high bandwidth, high performance, and multiple services. A reasonable and effective uplink dynamic bandwidth allocation algorithm was proposed. A controller with dynamic wavelength and slot assignment was introduced, and different optical dynamic bandwidth management strategies were formulated flexibly for services of different priorities according to the network loading. The simulation compares the proposed algorithm with the interleaved polling with adaptive cycle time algorithm. The algorithm shows better performance in average delay, throughput, and bandwidth utilization. The results show that the delay is reduced to 62% and the throughput is improved by 35%.
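To make the dynamic bandwidth allocation idea concrete, here is a minimal Python sketch of a priority-aware grant calculation of the general kind the abstract describes: high-priority queues are served first, and the residual capacity is shared among lower-priority queues in proportion to demand. The frame capacity, service classes, and ONU queue reports are assumed for illustration and are not taken from the paper.

    # Hypothetical grant calculation for one polling cycle, in bytes.
    FRAME_CAPACITY = 100_000      # assumed usable upstream bytes per cycle

    def allocate(requests):
        """requests: {onu_id: {"high": bytes, "low": bytes}} queue-length reports.
        High-priority demand is served first; the remaining capacity is shared
        among low-priority queues in proportion to their requests."""
        grants = {onu: {"high": 0, "low": 0} for onu in requests}
        remaining = FRAME_CAPACITY

        # Pass 1: high-priority traffic, capped by what is left of the frame.
        for onu, q in requests.items():
            g = min(q["high"], remaining)
            grants[onu]["high"] = g
            remaining -= g

        # Pass 2: split the remainder among low-priority queues proportionally.
        total_low = sum(q["low"] for q in requests.values())
        if total_low > 0 and remaining > 0:
            for onu, q in requests.items():
                grants[onu]["low"] = min(q["low"], remaining * q["low"] // total_low)
        return grants

    demo = {"ONU1": {"high": 30_000, "low": 50_000},
            "ONU2": {"high": 20_000, "low": 10_000}}
    print(allocate(demo))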
Capability Maturity Model (CMM) for Software Process Improvements
NASA Technical Reports Server (NTRS)
Ling, Robert Y.
2000-01-01
This slide presentation reviews the Avionic Systems Division's implementation of the Capability Maturity Model (CMM) for improvements in the software development process. The presentation reviews the process involved in implementing the model and the benefits of using CMM to improve the software development process.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-28
... DEPARTMENT OF JUSTICE Antitrust Division United States, et al. v. Election Systems & Software, Inc... proposed Final Judgment in United States, et al. v. Election Systems & Software Inc., Case No. 1:10-00380... America, et al., Plaintiffs, v. Election Systems and Software, Inc., Defendant. Case No.: 1:10-cv-00380...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-15
... DEPARTMENT OF JUSTICE Antitrust Division United States, et al. v. Election Systems and Software... Columbia in United States, et al. v. Election Systems and Software Inc., Civil Action No. 10-00380. On... Systems and Software, Inc., (``ES&S'') of Premier Election Services, Inc., and PES Holdings, Inc. violated...
Software Management Environment (SME) concepts and architecture, revision 1
NASA Technical Reports Server (NTRS)
Hendrick, Robert; Kistler, David; Valett, Jon
1992-01-01
This document presents the concepts and architecture of the Software Management Environment (SME), developed for the Software Engineering Branch of the Flight Dynamics Division (FDD) of GSFC. The SME provides an integrated set of experience-based management tools that can assist software development managers in managing and planning flight dynamics software development projects. This document provides a high-level description of the types of information required to implement such an automated management tool.
Training Software Developers and Designers to Conduct Usability Evaluations
ERIC Educational Resources Information Center
Skov, Mikael Brasholt; Stage, Jan
2012-01-01
Many efforts to improve the interplay between usability evaluation and software development rely either on better methods for conducting usability evaluations or on better formats for presenting evaluation results in ways that are useful for software designers and developers. Both of these approaches depend on a complete division of work between…
An Exploratory Study of Software Cost Estimating at the Electronic Systems Division.
1976-07-01
...actions to improve the software cost estimating process. While this research was limited to the ESD environment, the same types of problems may exist... Methods in Social Science. New York: Random House, 1969. 57. Smith, Ronald L. Structured Programming Series (Vol. XI) - Estimating Software Project...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-12
... software and related services including quality assurance and learning products, marketing, product development, marketing and administration. The company reports that on-site leased workers from Managed..., Santa Clara, California, and the Everett, Washington locations of Agilent Technologies, EEsof Division...
NASA Technical Reports Server (NTRS)
Morgan, Timothy E.
1995-01-01
The objective of the Reusable Software System (RSS) is to provide NASA Langley Research Center and its contractor personnel with a reusable software technology through the Internet. The RSS is easily accessible, provides information that is extractable, and the capability to submit information or data for the purpose of scientific research at NASA Langley Research Center within the Atmospheric Science Division.
Software OT&E Guidelines. Volume 1. Software Test Manager’s Handbook
1981-02-01
The Software OT&E Guidelines is a set of handbooks prepared by the Computer/Support Systems... is one of a set of handbooks prepared by the Computer/Support Systems Division of the Test and Evaluation Directorate, Air Force Test and Evaluation... Contents include: E. Software Maintainability; F. Standard Questionnaires; 1. Operator-Computer Interface Evaluation.
ADP Analysis project for the Human Resources Management Division
NASA Technical Reports Server (NTRS)
Tureman, Robert L., Jr.
1993-01-01
The ADP (Automated Data Processing) Analysis Project was conducted for the Human Resources Management Division (HRMD) of NASA's Langley Research Center. The three major areas of work in the project were computer support, automated inventory analysis, and an ADP study for the Division. The goal of the computer support work was to determine automation needs of Division personnel and help them solve computing problems. The goal of automated inventory analysis was to find a way to analyze installed software and usage on a Macintosh. Finally, the ADP functional systems study for the Division was designed to assess future HRMD needs concerning ADP organization and activities.
Geologic Communications | Alaska Division of Geological & Geophysical
Improves a database for the Division's digital and map-based geological, geophysical, and geochemical data interfaces. DGGS metadata and digital data distribution: geospatial datasets published by DGGS are designed to be compatible with a broad variety of digital mapping software, to present DGGS's geospatial data...
Software Management Environment (SME) installation guide
NASA Technical Reports Server (NTRS)
Kistler, David; Jeletic, Kellyann
1992-01-01
This document contains installation information for the Software Management Environment (SME), developed for the Systems Development Branch (Code 552) of the Flight Dynamics Division of Goddard Space Flight Center (GSFC). The SME provides an integrated set of management tools that can be used by software development managers in their day-to-day management and planning activities. This document provides a list of hardware and software requirements as well as detailed installation instructions and trouble-shooting information.
Technical Support | Division of Cancer Prevention
To view the live webinar, you will need to have the software, Microsoft Live Meeting, downloaded onto your computer before the event. In most cases, the software will automatically download when you open the program on your system. However, in the event that you need to download it manually, you can access the software at the link below: Download the Microsoft Office Live
ERIC Educational Resources Information Center
Niguidula, David; Blumberg, Roger B.; van Dam, Andries
1999-01-01
Describes a seminar at Brown University where undergraduate students design and develop software for K-12 schools based on proposals of teachers in and around Providence (Rhode Island). Discusses seminar goals, working with schools, division of labor between teachers and seminar students, creating the software, student benefits, and using…
Space Station communications and tracking systems modeling and RF link simulation
NASA Technical Reports Server (NTRS)
Tsang, Chit-Sang; Chie, Chak M.; Lindsey, William C.
1986-01-01
In this final report, the effort spent on Space Station Communications and Tracking System Modeling and RF Link Simulation is described in detail. The effort is mainly divided into three parts: frequency division multiple access (FDMA) system simulation modeling and software implementation; a study on design and evaluation of a functional computerized RF link simulation/analysis system for Space Station; and a study on design and evaluation of simulation system architecture. This report documents the results of these studies. In addition, a separate User's Manual on the Space Communications Simulation System (SCSS) (Version 1) documents the software developed for the Space Station FDMA communications system simulation. The final report, the SCSS user's manual, and the software located on the NASA JSC Systems Analysis Division's VAX 750 computer together serve as the deliverables from LinCom for this project effort.
A proven approach for more effective software development and maintenance
NASA Technical Reports Server (NTRS)
Pajerski, Rose; Hall, Dana; Sinclair, Craig
1994-01-01
Modern space flight mission operations and associated ground data systems are increasingly dependent upon reliable, quality software. Critical functions such as command load preparation, health and status monitoring, communications link scheduling and conflict resolution, and transparent gateway protocol conversion are routinely performed by software. Given budget constraints and the ever-increasing capabilities of processor technology, the next generation of control centers and data systems will be even more dependent upon software across all aspects of performance. A key challenge now is to implement improved engineering, management, and assurance processes for the development and maintenance of that software; processes that cost less, yield higher quality products, and self-correct for continual improvement and evolution. The NASA Goddard Space Flight Center has a unique experience base that can be readily tapped to help solve the software challenge. Over the past eighteen years, the Software Engineering Laboratory within the Code 500 Flight Dynamics Division has evolved a software development and maintenance methodology that accommodates the unique characteristics of an organization while optimizing and continually improving the organization's software capabilities. This methodology relies upon measurement, analysis, and feedback, analogous to that of control loop systems. It is an approach with a time-tested track record proven through repeated applications across a broad range of operational software development and maintenance projects. This paper describes the software improvement methodology employed by the Software Engineering Laboratory and how it has been exploited within the Flight Dynamics Division, GSFC Code 500. Examples of specific improvements in the software itself and its processes are presented to illustrate the effectiveness of the methodology. Finally, the initial findings are given from applying this methodology across the mission operations and ground data systems software domains throughout Code 500.
PROFILE: Airfoil Geometry Manipulation and Display. User's Guide
NASA Technical Reports Server (NTRS)
Collins, Leslie; Saunders, David
1997-01-01
This report provides user information for program PROFILE, an aerodynamics design utility for plotting, tabulating, and manipulating airfoil profiles. A dozen main functions are available. The theory and implementation details for two of the more complex options are also presented. These are the REFINE option, for smoothing curvature in selected regions while retaining or seeking some specified thickness ratio, and the OPTIMIZE option, which seeks a specified curvature distribution. Use of programs QPLOT and BPLOT is also described, since all of the plots provided by PROFILE (airfoil coordinates, curvature distributions, pressure distributions) are achieved via the general-purpose QPLOT utility. BPLOT illustrates (again, via QPLOT) the shape functions used by two of PROFILE's options. These three utilities should be distributed as one package. They were designed and implemented for the Applied Aerodynamics Branch at NASA Ames Research Center, Moffett Field, California. They are all written in FORTRAN 77 and run on DEC and SGI systems under OpenVMS and IRIX.
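Since both the REFINE and OPTIMIZE options work on the airfoil's curvature distribution, a small sketch of how curvature can be computed from discrete surface coordinates may help. This is a generic finite-difference calculation in Python, not PROFILE's own FORTRAN implementation, and the toy thickness-form coordinates below are assumptions for illustration only.

    import numpy as np

    def curvature(x, y):
        """Discrete curvature kappa = (x'y'' - y'x'') / (x'^2 + y'^2)^(3/2)
        for an ordered set of (x, y) surface points, via finite differences."""
        dx, dy = np.gradient(x), np.gradient(y)
        ddx, ddy = np.gradient(dx), np.gradient(dy)
        return (dx * ddy - dy * ddx) / (dx**2 + dy**2) ** 1.5

    # Toy upper-surface coordinates from a standard four-digit thickness form
    # (t = 12% chord); illustrative only, not a PROFILE test case.
    x = np.linspace(0.0, 1.0, 51)
    t = 0.12
    y = 5 * t * (0.2969*np.sqrt(x) - 0.1260*x - 0.3516*x**2
                 + 0.2843*x**3 - 0.1015*x**4)
    print(curvature(x, y)[:5])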
Technology for Manufacturing Efficiency
NASA Technical Reports Server (NTRS)
1995-01-01
The Ground Processing Scheduling System (GPSS) was developed by Ames Research Center, Kennedy Space Center and divisions of the Lockheed Company to maintain the scheduling for preparing a Space Shuttle Orbiter for a mission. Red Pepper Software Company, now part of PeopleSoft, Inc., commercialized the software as their ResponseAgent product line. The software enables users to monitor manufacturing variables, report issues and develop solutions to existing problems.
Data and Analysis Center for Software: An IAC in Transition.
1983-06-01
Reviewed and approved for publication. APPROVED: Project Engineer, JOHN J. MARCINIAK, Colonel, USAF, Chief, Command and Control Division. RADC Project Engineer: John Palaimo (COEE). Keywords: Software Engineering, Software Technology, Information Analysis Center, Database, Scientific and Technical Information.
Data collection procedures for the Software Engineering Laboratory (SEL) database
NASA Technical Reports Server (NTRS)
Heller, Gerard; Valett, Jon; Wild, Mary
1992-01-01
This document is a guidebook to collecting software engineering data on software development and maintenance efforts, as practiced in the Software Engineering Laboratory (SEL). It supersedes the document entitled Data Collection Procedures for the Rehosted SEL Database, number SEL-87-008 in the SEL series, which was published in October 1987. It presents procedures to be followed on software development and maintenance projects in the Flight Dynamics Division (FDD) of Goddard Space Flight Center (GSFC) for collecting data in support of SEL software engineering research activities. These procedures include detailed instructions for the completion and submission of SEL data collection forms.
Software Management Environment (SME): Components and algorithms
NASA Technical Reports Server (NTRS)
Hendrick, Robert; Kistler, David; Valett, Jon
1994-01-01
This document presents the components and algorithms of the Software Management Environment (SME), a management tool developed for the Software Engineering Branch (Code 552) of the Flight Dynamics Division (FDD) of the Goddard Space Flight Center (GSFC). The SME provides an integrated set of visually oriented, experience-based tools that can assist software development managers in managing and planning software development projects. This document describes and illustrates the analysis functions that underlie the SME's project monitoring, estimation, and planning tools. 'SME Components and Algorithms' is a companion reference to 'SME Concepts and Architecture' and 'Software Engineering Laboratory (SEL) Relationships, Models, and Management Rules.'
Cyclic Voltammetry Simulations with DigiSim Software: An Upper-Level Undergraduate Experiment
ERIC Educational Resources Information Center
Messersmith, Stephania J.
2014-01-01
An upper-division undergraduate chemistry experiment is described which utilizes DigiSim software to simulate cyclic voltammetry (CV). Four mechanisms were studied: a reversible electron transfer with no subsequent or proceeding chemical reactions, a reversible electron transfer followed by a reversible chemical reaction, a reversible chemical…
Archiving a Software Development Project
2013-04-01
...an ongoing monitoring system that identifies attempts and requests for retrieval, and ensures that the attempts and requests cannot proceed without... Intelligence Division. Peter Fisher has worked as a consultant, systems analyst, software developer and project manager in Australia, Holland, the USA... 3.1.3 DRMS – Defence Records Management System
Software Management Environment (SME) release 9.4 user reference material
NASA Technical Reports Server (NTRS)
Hendrick, R.; Kistler, D.; Manter, K.
1992-01-01
This document contains user reference material for the Software Management Environment (SME) prototype, developed for the Systems Development Branch (Code 552) of the Flight Dynamics Division (FDD) of Goddard Space Flight Center (GSFC). The SME provides an integrated set of management tools that can be used by software development managers in their day-to-day management and planning activities. This document provides an overview of the SME, a description of all functions, and detailed instructions concerning the software's installation and use.
Sandia Compact Sensor Node (SCSN) v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
HARRINGTON, JOHN
2009-01-07
The SCSN communication protocol is implemented in software and incorporates elements of Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), and Carrier Sense Multiple Access (CSMA) to reduce radio message collisions, latency, and power consumption. Alarm messages are expeditiously routed to a central node as a 'star' network with minimum overhead. Other messages can be routed along network links between any two nodes so that peer-to-peer communication is possible. Broadcast messages can be composed that flood the entire network or just specific portions with minimal radio traffic and latency. Two-way communication with sensor nodes, which sleep most of the time to conserve battery life, can occur at seven-second intervals. SCSN software also incorporates special algorithms to minimize superfluous radio traffic that can result from excessive intrusion alarm messages. A built-in seismic detector is implemented with a geophone and software that distinguishes between pedestrian and vehicular targets. Other external sensors can be attached to a SCSN using supervised interface lines that are controlled by software. All software is written in the ANSI C language for ease of development, maintenance, and portability.
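As a toy illustration of the TDMA element of such a protocol, the Python sketch below computes when a node's transmit slot next begins within the seven-second wake cycle mentioned in the abstract. The slot width and node count are assumptions made for the example; they are not details of the SCSN design.

    # Toy TDMA slot timing within the seven-second wake cycle the abstract mentions.
    # The slot width and node count are assumed for illustration only.
    CYCLE_S = 7.0                  # nodes are reachable once per 7 s (per abstract)
    NUM_SLOTS = 70                 # assumed number of slots per cycle
    SLOT_S = CYCLE_S / NUM_SLOTS

    def next_tx_time(node_id, now):
        """Return the next absolute time (s) at which node_id's slot begins."""
        slot = node_id % NUM_SLOTS
        cycle_start = (now // CYCLE_S) * CYCLE_S
        t = cycle_start + slot * SLOT_S
        return t if t > now else t + CYCLE_S

    for node in (0, 3, 42):
        print(node, round(next_tx_time(node, now=123.4), 3))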
The Package-Based Development Process in the Flight Dynamics Division
NASA Technical Reports Server (NTRS)
Parra, Amalia; Seaman, Carolyn; Basili, Victor; Kraft, Stephen; Condon, Steven; Burke, Steven; Yakimovich, Daniil
1997-01-01
The Software Engineering Laboratory (SEL) has been operating for more than two decades in the Flight Dynamics Division (FDD) and has adapted to the constant movement of the software development environment. The SEL's Improvement Paradigm shows that process improvement is an iterative process. Understanding, Assessing, and Packaging are the three steps that are followed in this cyclical paradigm. As the improvement process cycles back to the first step, after having packaged some experience, the level of understanding will be greater. In the past, products resulting from the packaging step have been large process documents, guidebooks, and training programs. As the technical world moves toward more modularized software, we have made a move toward more modularized software development process documentation; as such, the products of the packaging step are becoming smaller and more frequent. In this manner, the QIP takes on a spiral approach rather than a waterfall approach. This paper describes the state of the FDD in the area of software development processes, as revealed through the understanding and assessing activities conducted by the COTS study team. The insights presented include: (1) a characterization of a typical FDD Commercial Off the Shelf (COTS) intensive software development life-cycle process, (2) lessons learned through the COTS study interviews, and (3) a description of changes in the SEL due to the changing and accelerating nature of software development in the FDD.
Overview of the CERT Resilience Management Model (CERT-RMM)
2014-01-23
Management Model (CERT®-RMM). Jim Cebula, Technical Manager, Cyber Risk Management, CERT® Division. Jim Cebula is the Technical Manager of the... Cyber Risk Management team in the Cyber Security Solutions Directorate of the CERT Division at the Software Engineering Institute (SEI), a unit of... Carnegie Mellon University. Cebula's current activities include risk management methods along with assessment and management of operational...
1989-05-01
Varsha P. Rao and Donna Roberts, May 1989. Approved for public release; distribution unlimited. Research & Studies Division, U.S... DAKF 15-87-0-0144, Subcontract Sub-Hi 88-12, DO No. 88-007, with the Research and Studies Division, Program Analysis and Evaluation Directorate of... Toomepuu, Chief, Research and Studies Division, for their helpful counsel.
By Stuart G. Baker, 2017. This software computes meta-analysis and extrapolation estimates for an instrumental variable meta-analysis of randomized trials or before-and-after studies (the latter also known as the paired availability design). The software also checks the assumptions if sufficient data are available.
ALSC 2011 Notable Videos, Recordings & Interactive Software
ERIC Educational Resources Information Center
School Library Journal, 2011
2011-01-01
This article presents the Notable Children's Videos, Recordings, and Interactive Software for Kids lists which are compiled annually by committees of the Association for Library Service to children (ALSC), a division of the American Library Association (ALA). These lists were released in January 2011 at the ALA Midwinter meeting in San Diego,…
An Open Avionics and Software Architecture to Support Future NASA Exploration Missions
NASA Technical Reports Server (NTRS)
Schlesinger, Adam
2017-01-01
The presentation describes an avionics and software architecture that has been developed through NASA's Advanced Exploration Systems (AES) division. The architecture is open-source, highly reliable with fault tolerance, and utilizes standard capabilities and interfaces, which are scalable and customizable to support future exploration missions. Specific focus areas of discussion will include command and data handling, software, human interfaces, communication and wireless systems, and systems engineering and integration.
Composite Linear Models | Division of Cancer Prevention
By Stuart G. Baker The composite linear models software is a matrix approach to compute maximum likelihood estimates and asymptotic standard errors for models for incomplete multinomial data. It implements the method described in Baker SG. Composite linear models for incomplete multinomial data. Statistics in Medicine 1994;13:609-622. The software includes a library of thirty
SEL's Software Process-Improvement Program
NASA Technical Reports Server (NTRS)
Basili, Victor; Zelkowitz, Marvin; McGarry, Frank; Page, Jerry; Waligora, Sharon; Pajerski, Rose
1995-01-01
The goals and operations of the Software Engineering Laboratory (SEL) are reviewed. For nearly 20 years the SEL has worked to understand, assess, and improve software and the development process within the production environment of the Flight Dynamics Division (FDD) of NASA's Goddard Space Flight Center. The SEL was established in 1976 with the goals of reducing: (1) the defect rate of delivered software, (2) the cost of software to support flight projects, and (3) the average time to produce mission-support software. After studying over 125 projects of the FDD, the results have guided the standards, management practices, technologies, and training within the division. The results of the studies have been a 75 percent reduction in defects, a 50 percent reduction in cost, and a 25 percent reduction in development time. Over time the goals of the SEL have been clarified. The goals are now stated as: (1) understand baseline processes and product characteristics, (2) assess improvements that have been incorporated into the development projects, (3) package and infuse improvements into the standard SEL process. The SEL improvement goal is to demonstrate continual improvement of the software process by carrying out analysis, measurement, and feedback to projects within the FDD environment. The SEL supports the understanding of the process by studying several processes, including the effort distribution and error detection rates. The SEL assesses and refines the processes. Once the assessment and refinement of a process is completed, the SEL packages the process by capturing it in standards, tools, and training.
Statistical modeling of software reliability
NASA Technical Reports Server (NTRS)
Miller, Douglas R.
1992-01-01
This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.
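Where the abstract speaks of simulating software execution to study reliability growth under random testing and debugging, a minimal Python sketch of one common reliability-growth formulation may help fix ideas. The Jelinski-Moranda-style model and the parameter values below are illustrative assumptions, not the models or data used in the GCS experiment.

    import random

    def simulate_failures(n_faults=30, phi=0.002, seed=1):
        """Simulate inter-failure times under a Jelinski-Moranda-style model:
        with k faults remaining, the time to the next failure is drawn from
        Exponential(rate = phi * k); each detected fault is then removed, so
        failures become rarer as debugging proceeds (reliability growth)."""
        rng = random.Random(seed)
        times, t = [], 0.0
        for remaining in range(n_faults, 0, -1):
            t += rng.expovariate(phi * remaining)
            times.append(t)
        return times

    failure_times = simulate_failures()
    print([round(t, 1) for t in failure_times[:5]], "...",
          round(failure_times[-1], 1))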
Modular Software for Spacecraft Navigation Using the Global Positioning System (GPS)
NASA Technical Reports Server (NTRS)
Truong, S. H.; Hartman, K. R.; Weidow, D. A.; Berry, D. L.; Oza, D. H.; Long, A. C.; Joyce, E.; Steger, W. L.
1996-01-01
The Goddard Space Flight Center Flight Dynamics and Mission Operations Divisions have jointly investigated the feasibility of engineering modular Global Positioning System (GPS) navigation software to support both real-time flight and ground postprocessing configurations. The goals of this effort are to define standard GPS data interfaces and to engineer standard, reusable navigation software components that can be used to build a broad range of GPS navigation support applications. The paper discusses the GPS modular software (GMOD) system and operations concepts, major requirements, candidate software architecture, feasibility assessment, and recommended software interface standards. In addition, ongoing efforts to broaden the scope of the initial study and to develop modular software to support autonomous navigation using GPS are addressed.
NASA Technical Reports Server (NTRS)
Waligora, Sharon; Bailey, John; Stark, Mike
1995-01-01
The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of applications software. The goals of the SEL are (1) to understand the software development process in the GSFC environment; (2) to measure the effects of various methodologies, tools, and models on this process; and (3) to identify and then to apply successful development practices. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.
Software process improvement in the NASA software engineering laboratory
NASA Technical Reports Server (NTRS)
Mcgarry, Frank; Pajerski, Rose; Page, Gerald; Waligora, Sharon; Basili, Victor; Zelkowitz, Marvin
1994-01-01
The Software Engineering Laboratory (SEL) was established in 1976 for the purpose of studying and measuring software processes with the intent of identifying improvements that could be applied to the production of ground support software within the Flight Dynamics Division (FDD) at the National Aeronautics and Space Administration (NASA)/Goddard Space Flight Center (GSFC). The SEL has three member organizations: NASA/GSFC, the University of Maryland, and Computer Sciences Corporation (CSC). The concept of process improvement within the SEL focuses on the continual understanding of both process and product as well as goal-driven experimentation and analysis of process change within a production environment.
Clandestine Transmissions and Operations of Embedded Software on Cellular Mobile Devices
2011-09-01
... Register; EMS, Enhanced Message Service; FDMA, Frequency Division Multiple Access; GMT, Greenwich Mean Time; GMSC, Gateway Mobile Switching Center... Message Switching Center; SMS-IWMSC, SMS-Interworking Mobile-Service Switching Center; TCH, Traffic Channels; TDMA, Time Division Multiple Access; TP... We assume the user will not attempt to re-program the device. Finally, we assume that the owner and user do not have root access and cannot disable any...
ERIC Educational Resources Information Center
Dance, Frank E. X.; And Others
This paper reports on the Futuristic Priorities Division members' recommendations and priorities concerning the impact of the future on communication and on the speech communication discipline. The recommendations and priorities are listed for two subgroups: The Communication Needs and Rights of Mankind; and Future Communication Technologies:…
Project management in the development of scientific software
NASA Astrophysics Data System (ADS)
Platz, Jochen
1986-08-01
This contribution is a rough outline of a comprehensive project management model for the development of software for scientific applications. The model was tested in the unique environment of the Siemens AG Corporate Research and Technology Division. Its focal points are the structuring of project content - the so-called phase organization, the project organization and the planning model used, and its particular applicability to innovative projects. The outline focuses largely on actual project management aspects rather than associated software engineering measures.
Precise Interval Timer for Software Defined Radio
NASA Technical Reports Server (NTRS)
Pozhidaev, Aleksey (Inventor)
2014-01-01
A precise digital fractional interval timer for software-defined radios that vary their waveform on a packet-by-packet basis. The timer allows for variable preamble length in the RF packet and allows the boundaries of the TDMA (Time Division Multiple Access) slots of an SDR receiver to be adjusted based on the reception of the RF packet of interest.
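To illustrate the general idea of a fractional interval timer (scheduling events a non-integer number of clock ticks apart without accumulating drift), here is a small Python sketch using a phase accumulator. The interval value and tick counts are arbitrary assumptions for the example and do not reflect the patented design.

    def fractional_timer(interval_ticks, n_events):
        """Return the integer clock tick at which each event fires, carrying the
        fractional remainder so the long-run average spacing stays exact."""
        acc = 0.0
        fire_ticks = []
        for _ in range(n_events):
            acc += interval_ticks
            fire_ticks.append(int(acc))   # fire on the enclosing integer tick
        return fire_ticks

    # A 2.375-tick interval fires at ticks 2, 4, 7, 9, 11, 14, 16, 19:
    # spacings of 2 and 3 ticks that average exactly 2.375 with no drift.
    print(fractional_timer(2.375, 8))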
A Multiple-star Combined Solution Program - Application to the Population II Binary μ Cas
NASA Astrophysics Data System (ADS)
Gudehus, D. H.
2001-05-01
A multiple-star combined-solution computer program which can simultaneously fit astrometric, speckle, and spectroscopic data, and solve for the orbital parameters, parallax, proper motion, and masses has been written and is now publicly available. Some features of the program are the ability to scale the weights at run time, hold selected parameters constant, handle up to five spectroscopic subcomponents for the primary and the secondary each, account for the light travel time across the system, account for apsidal motion, plot the results, and write the residuals in position to a standard file for further analysis. The spectroscopic subcomponent data can be represented by reflex velocities and/or by independent measurements. A companion editing program which can manage the data files is included in the package. The program has been applied to the Population II binary μ Cas to derive improved masses and an estimate of the primordial helium abundance. The source code, executables, sample data files, and documentation for OpenVMS and Unix, including Linux, are available at http://www.chara.gsu.edu/~gudehus/binary.html.
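The essence of a combined solution is that residuals from heterogeneous data types enter a single weighted least-squares problem whose per-dataset weights can be rescaled. The Python sketch below shows that pattern on synthetic radial-velocity and separation data sharing one period and phase; the model, weights, and data are invented placeholders and do not represent the program's actual formulation or μ Cas.

    import numpy as np
    from scipy.optimize import least_squares

    rng = np.random.default_rng(0)
    # Synthetic "spectroscopic" and "astrometric" data sharing a period and phase.
    t_rv = np.linspace(0, 20, 15)
    rv = 8.0 * np.sin(2*np.pi*t_rv/6.3 + 0.4) + rng.normal(0, 0.5, t_rv.size)
    t_ast = np.linspace(0, 20, 10)
    sep = 0.30 + 0.12 * np.cos(2*np.pi*t_ast/6.3 + 0.4) + rng.normal(0, 0.01, t_ast.size)

    w_rv, w_ast = 1/0.5, 1/0.01   # per-dataset weights (rescalable, as in the program)

    def residuals(p):
        P, phase, K, a, c = p
        r_rv = w_rv * (rv - K * np.sin(2*np.pi*t_rv/P + phase))
        r_ast = w_ast * (sep - (c + a * np.cos(2*np.pi*t_ast/P + phase)))
        return np.concatenate([r_rv, r_ast])  # both data types, one least-squares fit

    fit = least_squares(residuals, x0=[6.0, 0.3, 7.0, 0.1, 0.3])
    print("P, phase, K, a, c =", np.round(fit.x, 3))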
NASA Technical Reports Server (NTRS)
Antoine, Lisa
1992-01-01
An outline of the Project Operations Branch at Goddard Space Flight Center is presented that describes the management of the division and each subgroup's responsibility. The paper further describes the development of software tools for the Macintosh personal computer, and their impending implementation. A detailed step by step procedure is given for using these software tools.
Host computer software specifications for a zero-g payload manhandling simulator
NASA Technical Reports Server (NTRS)
Wilson, S. W.
1986-01-01
The HP PASCAL source code was developed for the Mission Planning and Analysis Division (MPAD) of NASA/JSC, and takes the place of detailed flow charts defining the host computer software specifications for MANHANDLE, a digital/graphical simulator that can be used to analyze the dynamics of onorbit (zero-g) payload manhandling operations. Input and output data for representative test cases are contained.
A Vision on the Status and Evolution of HEP Physics Software Tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Canal, P.; Elvira, D.; Hatcher, R.
2013-07-28
This paper represents the vision of the members of the Fermilab Scientific Computing Division's Computational Physics Department (SCD-CPD) on the status and the evolution of various HEP software tools such as the Geant4 detector simulation toolkit, the Pythia and GENIE physics generators, and the ROOT data analysis framework. The goal of this paper is to contribute ideas to the Snowmass 2013 process toward the composition of a unified document on the current status and potential evolution of the physics software tools which are essential to HEP.
Push for Cheese: A Metaphor for Software Usability
NASA Astrophysics Data System (ADS)
Radziwill, Nicole; Shelton, Amy
2005-12-01
At the National Radio Astronomy Observatory's (NRAO) Science Center in Green Bank, W. Va., visitors curious about radio astronomy and the observatory's history and operations will discover an educational, entertaining experience. Employees also visit the science center, but their thoughts are more on afternoon snacks than on distant galaxies. The employees of NRAO's Software Development Division in Green Bank have gained tremendous insight on the topic of software usability from many visits to the Science Center Café by pontificating upon the wisdom inherent in the design and use of the liquid cheese dispenser there.
What's Happening in the Software Engineering Laboratory?
NASA Technical Reports Server (NTRS)
Pajerski, Rose; Green, Scott; Smith, Donald
1995-01-01
Since 1976 the Software Engineering Laboratory (SEL) has been dedicated to understanding and improving the way in which one NASA organization, the Flight Dynamics Division (FDD) at Goddard Space Flight Center, develops, maintains, and manages complex flight dynamics systems. This paper presents an overview of recent activities and studies in the SEL, using as a framework the SEL's organizational goals and experience-based software improvement approach. It focuses on two SEL experience areas: (1) the evolution of the measurement program and (2) an analysis of three generations of Cleanroom experiments.
NASA Technical Reports Server (NTRS)
Condon, Steven; Hendrick, Robert; Stark, Michael E.; Steger, Warren
1997-01-01
The Flight Dynamics Division (FDD) of NASA's Goddard Space Flight Center (GSFC) recently embarked on a far-reaching revision of its process for developing and maintaining satellite support software. The new process relies on an object-oriented software development method supported by a domain-specific library of generalized components. This Generalized Support Software (GSS) Domain Engineering Process is currently in use at the NASA GSFC Software Engineering Laboratory (SEL). The key facets of the GSS process are (1) an architecture for rapid deployment of FDD applications, (2) a reuse asset library for FDD classes, and (3) a paradigm shift from developing software to configuring software for mission support. This paper describes the GSS architecture and process, results of fielding the first applications, lessons learned, and future directions.
Software Management for the NOνA Experiment
NASA Astrophysics Data System (ADS)
Davies, G. S.; Davies, J. P.; C Group; Rebel, B.; Sachdev, K.; Zirnstein, J.
2015-12-01
The NOvA software (NOνASoft) is written in C++ and built on the Fermilab Computing Division's art framework that uses ROOT analysis software. NOνASoft makes use of more than 50 external software packages, is developed by more than 50 developers, and is used by more than 100 physicists from over 30 universities and laboratories on 3 continents. The software builds are handled by Fermilab's custom version of Software Release Tools (SRT), a UNIX-based software management system for large, collaborative projects that is used by several experiments at Fermilab. The system provides software version control with SVN configured in a client-server mode and is based on the code originally developed by the BaBar collaboration. In this paper, we present efforts towards distributing the NOvA software via the CernVM File System distributed file system. We will also describe our recent work to use a CMake build system and Jenkins, the open source continuous integration system, for NOνASoft.
Towards understanding software: 15 years in the SEL
NASA Technical Reports Server (NTRS)
Mcgarry, Frank; Pajerski, Rose
1990-01-01
For 15 years, the Software Engineering Laboratory (SEL) at GSFC has been carrying out studies and experiments for the purpose of understanding, assessing, and improving software, and software processes within a production software environment. The SEL comprises three major organizations: (1) the GSFC Flight Dynamics Division; (2) the University of Maryland Computer Science Department; and (3) the Computer Sciences Corporation Flight Dynamics Technology Group. These organizations have jointly carried out several hundred software studies, producing hundreds of reports, papers, and documents: all describing some aspect of the software engineering technology that has undergone analysis in the flight dynamics environment. The studies range from small controlled experiments (such as analyzing the effectiveness of code reading versus functional testing) to large, multiple-project studies (such as assessing the impacts of Ada on a production environment). The key findings that NASA feels have laid the foundation for ongoing and future software development and research activities are summarized.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-29
.... Div., Back Office Customer Support, Primary Services & Inceed. 81,972 Pharmetrics, An IMS Health... Constellation Homebuilder Redmond, WA September 14, 2011. Systems, Fast Division, Constellation Software, Inc...
Conceptual design for the National Water Information System
Edwards, Melvin D.; Putnam, Arthur L.; Hutchison, Norman E.
1986-01-01
The Water Resources Division of the U.S. Geological Survey began the design and development of a National Water Information System (NWIS) in 1983. The NWIS will replace and integrate the existing data systems of the National Water Data Storage and Retrieval System, National Water Data Exchange, National Water-Use Information Program, and Water Resources Scientific Information Center. The NWIS has been designed as an interactive, distributed data system. The software system has been designed in a modular manner which integrates existing software functions and allows multiple use of software modules. The data base has been designed as a relational data model that allows integrated storage of the existing water data, water-use data, and water-data indexing information by using a common relational data base management system. The NWIS will be operated on microcomputers located in each of the Water Resources Division's District offices and many of its State, subdistrict, and field offices. The microcomputers will be linked together through a national telecommunication network maintained by the U. S. Geological Survey. The NWIS is scheduled to be placed in operation in 1990.
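As a toy illustration of what integrated storage "by using a common relational data base management system" means in practice, the Python/SQLite sketch below keys site records, streamflow observations, and water-use records to a shared site identifier so one query can join them. The table names, columns, and sample values are invented for the example and are not the actual NWIS schema.

    import sqlite3

    # Toy schema: site, streamflow, and water-use tables keyed by a common site_id.
    db = sqlite3.connect(":memory:")
    db.executescript("""
    CREATE TABLE site       (site_id TEXT PRIMARY KEY, name TEXT, state TEXT);
    CREATE TABLE streamflow (site_id TEXT REFERENCES site, obs_date TEXT, cfs REAL);
    CREATE TABLE water_use  (site_id TEXT REFERENCES site, year INTEGER,
                             category TEXT, mgal_per_day REAL);
    """)
    db.execute("INSERT INTO site VALUES ('0001', 'Example Creek', 'VA')")
    db.execute("INSERT INTO streamflow VALUES ('0001', '1986-10-01', 113.0)")
    db.execute("INSERT INTO water_use VALUES ('0001', 1985, 'public supply', 4.8)")

    # One query joins observation and water-use records for the same site.
    for row in db.execute("""
        SELECT s.name, f.obs_date, f.cfs, u.category, u.mgal_per_day
        FROM site s
        JOIN streamflow f USING (site_id)
        JOIN water_use  u USING (site_id)
    """):
        print(row)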
Commercial Mobile Alert Service (CMAS) Scenarios
2012-05-01
The WEA Project Team, May 2012. Special Report CMU/SEI-2012-SR-020, CERT® Division, Software... Homeland Security under Contract No. FA8721-05-C-0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally...
Intelligent and Adaptive Interface (IAI) for Cognitive Cockpit (CC)
2004-03-31
goals and plans and generating system plans would be incorporated as task knowledge. The Dialogue Model, which is currently undeveloped in LOCATE... pieces of software. Modularity can also serve to improve the organisational effectiveness of software, whereby a suitable division of labour among... a sophisticated tool in support of future combat aircraft acquisition. While CA can monitor similar activities in countries like the UK and USA we...
The Impact of Ada and Object-Oriented Design in NASA Goddard's Flight Dynamics Division
NASA Technical Reports Server (NTRS)
Waligora, Sharon; Bailey, John; Stark, Mike
1996-01-01
This paper presents the highlights and key findings of 10 years of use and study of Ada and object-oriented design in NASA Goddard's Flight Dynamics Division (FDD). In 1985, the Software Engineering Laboratory (SEL) began investigating how the Ada language might apply to FDD software development projects. Although they began cautiously using Ada on only a few pilot projects, they expected that, if the Ada pilots showed promising results, the FDD would fully transition its entire development organization from FORTRAN to Ada within 10 years. However, 10 years later, the FDD still produced 80 percent of its software in FORTRAN and had begun using C and C++, despite positive results on Ada projects. This paper presents the final results of a SEL study to quantify the impact of Ada in the FDD, to determine why Ada has not flourished, and to recommend future directions regarding Ada. Project trends in both languages are examined as are external factors and cultural issues that affected the infusion of this technology. The detailed results of this study were published in a formal study report in March of 1995. This paper supersedes the preliminary results of this study that were presented at the Eighteenth Annual Software Engineering Workshop in 1993.
SLS Flight Software Testing: Using a Modified Agile Software Testing Approach
NASA Technical Reports Server (NTRS)
Bolton, Albanie T.
2016-01-01
NASA's Space Launch System (SLS) is an advanced launch vehicle for a new era of exploration beyond Earth orbit (BEO). The world's most powerful rocket, SLS will launch crews of up to four astronauts in the agency's Orion spacecraft on missions to explore multiple deep-space destinations. Boeing is developing the SLS core stage, including the avionics that will control the vehicle during flight. The core stage will be built at NASA's Michoud Assembly Facility (MAF) in New Orleans, LA using state-of-the-art manufacturing equipment. At the same time, the rocket's avionics computer software is being developed at Marshall Space Flight Center in Huntsville, AL. At Marshall, the Flight and Ground Software Division provides comprehensive engineering expertise for development of flight and ground software. Within that division, the Software Systems Engineering Branch's test and verification (T&V) team uses an agile test approach in testing and verification of software. The agile software test method opens the door for regular short sprint release cycles. The idea or basic premise behind the concept of agile software development and testing is that it is iterative and developed incrementally. Agile testing has an iterative development methodology where requirements and solutions evolve through collaboration between cross-functional teams. With testing and development done incrementally, this allows for increased features and enhanced value for releases. This value can be seen throughout the T&V team processes that are documented in various work instructions within the branch. The T&V team produces procedural test results at a higher rate, resolves issues found in software with designers at an earlier stage versus at a later release, and team members gain increased knowledge of the system architecture by interfacing with designers. SLS Flight Software teams want to continue uncovering better ways of developing software in an efficient and project-beneficial manner. Through agile testing, there has been increased value through individuals and interactions over processes and tools, improved customer collaboration, and improved responsiveness to changes through controlled planning. The presentation will describe the agile testing methodology as practiced by the SLS FSW Test and Verification team at Marshall Space Flight Center.
ERIC Educational Resources Information Center
Bird, Bruce
This paper discusses the development of two World Wide Web sites at Anne Arundel Community College (Maryland). The criteria for the selection of hardware and software for Web site development that led to the decision to use Microsoft FrontPage 98 are described along with its major components and features. The discussion of the Science Division Web…
1983-06-01
LOSARDO, Project Engineer. APPROVED: MARCINIAK, Colonel, USAF, Chief, Command and Control Division. FOR THE COMMANDER: Acting Chief, Plans Office... General Dynamics Corporation, Data Systems Division, P.O. Box 748, Fort Worth TX 76101.
1980-11-01
Systems: A Raytheon Project History", RADC-TR-77-188, Final Technical Report, June 1977. 4. IBM Federal Systems Division, "Statistical Prediction of...147, June 1979. 4. W. D. Brooks, R. W. Motley, "Analysis of Discrete Software Reliability Models", IBM Corp., RADC-TR-80-84, RADC, New York, April 1980...J. C. King of IBM (Reference 9) and Lori A. Clark (Reference 10) of the University of Massachusetts. Programs, so exercised must be augmented so they
Federal COBOL Compiler Testing Service Compiler Validation Request Information.
1977-05-09
background of the Federal COBOL Compiler Testing Service, which was set up by a memorandum of agreement between the National Bureau of Standards and the... Federal Standard, and the requirement of COBOL compiler validation in the procurement process. It also contains a list of all software products... produced by the Software Development Division in support of the FCCTS, as well as the Validation Summary Reports produced as a result of discharging the...
Maximizing reuse: Applying common sense and discipline
NASA Technical Reports Server (NTRS)
Waligora, Sharon; Langston, James
1992-01-01
Computer Sciences Corporation (CSC)/System Sciences Division (SSD) has maintained a long-term relationship with NASA/Goddard, providing satellite mission ground-support software and services for 23 years. As a partner in the Software Engineering Laboratory (SEL) since 1976, CSC has worked closely with NASA/Goddard to improve the software engineering process. This paper examines the evolution of reuse programs in this uniquely stable environment and formulates certain recommendations for developing reuse programs as a business strategy and as an integral part of production. It focuses on the management strategy and philosophy that have helped make reuse successful in this environment.
Pulsed acoustic vortex sensing system volume III: PAVSS operation and software documentation
DOT National Transportation Integrated Search
1977-06-01
Avco Corporation's Systems Division designed and developed an engineered Pulsed Acoustic Vortex Sensing System (PAVSS). This system is capable of real-time detection, tracking, recording, and graphic display of aircraft trailing vortices. This volume...
Best Manufacturing Practices Survey Conducted at Litton Data Systems Division, Van Nuys, California
1988-10-01
Hardware and Software; Design Release; Engineering Change Order Processing and Analysis... The network is structured using bridges to isolate local traffic, and long-term plans call for a wide-band network.
By Stuart G. Baker, 2017. This software fits a zero-intercept random-effects linear model to data on surrogate and true endpoints from previous trials. Requirement: Mathematica Version 11 or later.
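The core calculation behind such a model can be pictured with a small sketch. The following Python fragment is not the Mathematica software described above; it is a minimal illustration, assuming each prior trial contributes one estimated surrogate-endpoint effect and one true-endpoint effect, and it fits only the fixed through-the-origin slope, leaving out the random-effects variance component.

    import numpy as np

    def fit_zero_intercept_slope(surrogate, true_endpoint, weights=None):
        # Least-squares slope for the zero-intercept model y_i = beta * x_i + error.
        x = np.asarray(surrogate, dtype=float)
        y = np.asarray(true_endpoint, dtype=float)
        w = np.ones_like(x) if weights is None else np.asarray(weights, dtype=float)
        beta = np.sum(w * x * y) / np.sum(w * x * x)
        return beta, y - beta * x

    # Hypothetical per-trial effect estimates (illustration only).
    surrogate_effects = [0.10, 0.25, 0.05, 0.30, 0.18]
    true_effects = [0.08, 0.22, 0.03, 0.35, 0.15]
    slope, residuals = fit_zero_intercept_slope(surrogate_effects, true_effects)
    print(f"estimated slope: {slope:.3f}")

In a full random-effects treatment, the between-trial variance would also be estimated and folded into the weights; the sketch keeps only the fixed-slope step.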
eCDL integration with commercial skills test information system (CSTIMS)
DOT National Transportation Integrated Search
2012-11-30
In coordination with the West Virginia Division of Motor Vehicles (WVDMV), the Rahall Transportation Institute (RTI) integrated the eCDL program with the CSTIMS, a software program owned by the American Motor Vehicles Administrators Association (AAMV...
NASA Technical Reports Server (NTRS)
1983-01-01
Drones, subscale vehicles like the Firebees, and full-scale retired military aircraft are used to test air defense missile systems. The DFCS (Drone Formation Control System) computer, developed by IBM (International Business Machines) Federal Systems Division, can track ten drones at once. A program called ORACLS is used to generate software to track and control the drones. It was originally developed by Langley and supplied by COSMIC (Computer Software Management and Information Center). The program saved the company both time and money.
1988-06-30
Figures include a line printer representation of roll solidification, the test casting model, and the division of the test casting... writing new casting analysis and design routines. The new routines would take advantage of advanced criteria for predicting casting soundness and cast... properties and technical advances in computer hardware and software. CONCLUSIONS: UPCAST, a comprehensive software package, has been developed for
Vehicle management and mission planning systems with shuttle applications
NASA Technical Reports Server (NTRS)
1972-01-01
A preliminary definition of a concept for an automated system is presented that will support the effective management and planning of space shuttle operations. It is called the Vehicle Management and Mission Planning System (VMMPS). In addition to defining the system and its functions, some of the software requirements of the system are identified and a phased and evolutionary method is recommended for software design, development, and implementation. The concept is composed of eight software subsystems supervised by an executive system. These subsystems are mission design and analysis, flight scheduler, launch operations, vehicle operations, payload support operations, crew support, information management, and flight operations support. In addition to presenting the proposed system, a discussion of the evolutionary software development philosophy that the Mission Planning and Analysis Division (MPAD) would propose to use in developing the required supporting software is included. A preliminary software development schedule is also included.
NASA Technical Reports Server (NTRS)
Mahmot, Ron; Koslosky, John T.; Beach, Edward; Schwarz, Barbara
1994-01-01
The Mission Operations Division (MOD) at Goddard Space Flight Center builds Mission Operations Centers which are used by Flight Operations Teams to monitor and control satellites. Reducing system life cycle costs through software reuse has always been a priority of the MOD. The MOD's Transportable Payload Operations Control Center (TPOCC) development team established an extensive library of 14 subsystems with over 100,000 delivered source instructions of reusable, generic software components. To date, nine TPOCC-based control centers support 11 satellites and have achieved an average software reuse level of more than 75 percent. This paper shares experiences of how the TPOCC building blocks were developed and how building-block developers, mission development teams, and users are all part of the process.
WILDFIRE IGNITION RESISTANCE ESTIMATOR WIZARD SOFTWARE DEVELOPMENT REPORT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phillips, M.; Robinson, C.; Gupta, N.
2012-10-10
This report describes the development of a software tool, entitled “WildFire Ignition Resistance Estimator Wizard” (WildFIRE Wizard, Version 2.10). This software was developed within the Wildfire Ignition Resistant Home Design (WIRHD) program, sponsored by the U. S. Department of Homeland Security, Science and Technology Directorate, Infrastructure Protection & Disaster Management Division. WildFIRE Wizard is a tool that enables homeowners to take preventive actions that will reduce their home’s vulnerability to wildfire ignition sources (i.e., embers, radiant heat, and direct flame impingement) well in advance of a wildfire event. This report describes the development of the software, its operation, its technical basis and calculations, and steps taken to verify its performance.
An application of machine learning to the organization of institutional software repositories
NASA Technical Reports Server (NTRS)
Bailin, Sidney; Henderson, Scott; Truszkowski, Walt
1993-01-01
Software reuse has become a major goal in the development of space systems, as a recent NASA-wide workshop on the subject made clear. The Data Systems Technology Division of Goddard Space Flight Center has been working on tools and techniques for promoting reuse, in particular in the development of satellite ground support software. One of these tools is the Experiment in Libraries via Incremental Schemata and Cobweb (ElvisC). ElvisC applies machine learning to the problem of organizing a reusable software component library for efficient and reliable retrieval. In this paper we describe the background factors that have motivated this work, present the design of the system, and evaluate the results of its application.
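A rough sense of this kind of automated library organization can be conveyed with a short sketch. The fragment below is not ElvisC and does not use the Cobweb incremental conceptual clustering named above; as a stand-in, it clusters hypothetical one-line component descriptions with TF-IDF features and k-means from scikit-learn so that similar components land in the same retrieval group.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.cluster import KMeans

    # Hypothetical one-line descriptions of reusable components (illustration only).
    descriptions = [
        "read telemetry frames from a satellite downlink file",
        "convert telemetry frames to engineering units",
        "plot orbit ground track on a world map",
        "propagate spacecraft orbit state vector",
        "write command loads to the uplink queue",
        "schedule command uplink windows",
    ]

    # Turn each description into a weighted term vector.
    features = TfidfVectorizer(stop_words="english").fit_transform(descriptions)

    # Group the components into a small number of retrieval categories.
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
    for label, text in sorted(zip(labels, descriptions)):
        print(label, text)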
17 CFR 39.18 - System safeguards.
Code of Federal Regulations, 2012 CFR
2012-04-01
... physical infrastructure or personnel necessary for it to conduct activities necessary to the clearing and... transportation, telecommunications, power, water, or other critical infrastructure components in a relevant area... Division of Clearing and Risk promptly of: (1) Any hardware or software malfunction, cyber security...
17 CFR 39.18 - System safeguards.
Code of Federal Regulations, 2014 CFR
2014-04-01
... physical infrastructure or personnel necessary for it to conduct activities necessary to the clearing and... transportation, telecommunications, power, water, or other critical infrastructure components in a relevant area... Division of Clearing and Risk promptly of: (1) Any hardware or software malfunction, cyber security...
17 CFR 39.18 - System safeguards.
Code of Federal Regulations, 2013 CFR
2013-04-01
... physical infrastructure or personnel necessary for it to conduct activities necessary to the clearing and... transportation, telecommunications, power, water, or other critical infrastructure components in a relevant area... Division of Clearing and Risk promptly of: (1) Any hardware or software malfunction, cyber security...
THE U.S. ENVIRONMENTAL PROTECTION AGENCY VISUAL PLUMES MODELING SOFTWARE
The U.S. Environmental Protection Agency's Center for Exposure Assessment Modeling (CEAM) at the Ecosystems Research Division in Athens, Georgia develops environmental exposure models, including plume models, and provides technical assistance to model users. The mixing zone and f...
Lessons learned in transitioning to an open systems environment
NASA Technical Reports Server (NTRS)
Boland, Dillard E.; Green, David S.; Steger, Warren L.
1994-01-01
Software development organizations, both commercial and governmental, are undergoing rapid change spurred by developments in the computing industry. To stay competitive, these organizations must adopt new technologies, skills, and practices quickly. Yet even for an organization with a well-developed set of software engineering models and processes, transitioning to a new technology can be expensive and risky. Current industry trends are leading away from traditional mainframe environments and toward the workstation-based, open systems world. This paper presents the experiences of software engineers on three recent projects that pioneered open systems development for NASA's Flight Dynamics Division of the Goddard Space Flight Center (GSFC).
Mobile Applications and Multi-User Virtual Reality Simulations
NASA Technical Reports Server (NTRS)
Gordillo, Orlando Enrique
2016-01-01
This is my third internship with NASA and my second one at the Johnson Space Center. I work within the engineering directorate in ER7 (Software Robotics and Simulations Division) at a graphics lab called IGOAL. We are a very well-rounded lab because we have dedicated software developers and dedicated 3D artists, and when you combine the two, you get the ability to create many different things such as interactive simulations, 3D models, animations, and mobile applications.
McDonald, James E; Kessler, Marcus M; Hightower, Jeremy L; Henry, Susan D; Deloney, Linda A
2013-12-01
With increasing volumes of complex imaging cases and rising economic pressure on physician staffing, timely reporting will become progressively challenging. Current and planned iterations of PACS and electronic medical record systems do not offer workflow management tools to coordinate delivery of imaging interpretations with the needs of the patient and ordering physician. The adoption of a server-based enterprise collaboration software system by our Division of Nuclear Medicine has significantly improved our efficiency and quality of service.
Argonne National Laboratory, High Energy Physics Division, Windows desktops: email on ANL Exchange (see the Windows Clients section; Outlook or Thunderbird recommended), web browsers for Windows desktops, and available software.
29 CFR 541.0 - Introductory statement.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Regulations Relating to Labor (Continued) WAGE AND HOUR DIVISION, DEPARTMENT OF LABOR REGULATIONS DEFINING AND... secondary schools), or in the capacity of an outside sales employee, as such terms are defined and delimited... requirements for computer systems analysts, computer programmers, software engineers, and other similarly...
29 CFR 541.0 - Introductory statement.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Regulations Relating to Labor (Continued) WAGE AND HOUR DIVISION, DEPARTMENT OF LABOR REGULATIONS DEFINING AND... secondary schools), or in the capacity of an outside sales employee, as such terms are defined and delimited... requirements for computer systems analysts, computer programmers, software engineers, and other similarly...
29 CFR 541.0 - Introductory statement.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Regulations Relating to Labor (Continued) WAGE AND HOUR DIVISION, DEPARTMENT OF LABOR REGULATIONS DEFINING AND... secondary schools), or in the capacity of an outside sales employee, as such terms are defined and delimited... requirements for computer systems analysts, computer programmers, software engineers, and other similarly...
29 CFR 541.0 - Introductory statement.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Regulations Relating to Labor (Continued) WAGE AND HOUR DIVISION, DEPARTMENT OF LABOR REGULATIONS DEFINING AND... secondary schools), or in the capacity of an outside sales employee, as such terms are defined and delimited... requirements for computer systems analysts, computer programmers, software engineers, and other similarly...
NASA Technical Reports Server (NTRS)
Clark, David A.
1998-01-01
In light of the escalation of terrorism, the Department of Defense spearheaded the development of new antiterrorist software for all Government agencies by issuing a Broad Agency Announcement to solicit proposals. This Government-wide competition resulted in a team that includes NASA Lewis Research Center's Computer Services Division, which will develop the graphical user interface (GUI) and test it in its usability lab. The team launched a program entitled Joint Sphere of Security (JSOS), crafted a design architecture, and is testing the interface. This software system has a state-of-the-art, object-oriented architecture, with a main kernel composed of the Dynamic Information Architecture System (DIAS) developed by Argonne National Laboratory. DIAS will be used as the software "breadboard" for assembling the components of explosions, such as blast and collapse simulations.
Impact of Requirements Quality on Project Success or Failure
NASA Astrophysics Data System (ADS)
Tamai, Tetsuo; Kamata, Mayumi Itakura
We are interested in the relationship between the quality of the requirements specifications for software projects and the subsequent outcome of the projects. To examine this relationship, we investigated 32 projects started and completed between 2003 and 2005 by the software development division of a large company in Tokyo. The company has collected reliable data on requirements specification quality, as evaluated by software quality assurance teams, and overall project performance data relating to cost and time overruns. The data for requirements specification quality were first converted into a multiple-dimensional space, with each dimension corresponding to an item of the recommended structure for software requirements specifications (SRS) defined in IEEE Std. 830-1998. We applied various statistical analysis methods to the SRS quality data and project outcomes.
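The study design lends itself to a simple numerical illustration. The sketch below assumes hypothetical data: each project is a vector of quality scores, one per selected section of the IEEE Std 830-1998 SRS outline, and each score is correlated with the project's cost overrun. The section names, scores, and overruns are invented for illustration and are not the study's data.

    import numpy as np
    from scipy.stats import pearsonr

    # Rows are projects; columns are 0-5 quality scores for selected SRS sections.
    srs_sections = ["purpose", "overall_description", "specific_requirements", "appendices"]
    quality = np.array([
        [4, 3, 2, 3],
        [5, 4, 4, 4],
        [2, 2, 1, 3],
        [3, 4, 3, 2],
        [5, 5, 4, 5],
    ])
    cost_overrun_pct = np.array([35.0, 5.0, 60.0, 20.0, 0.0])

    # Correlate each quality dimension with the observed cost overrun.
    for j, name in enumerate(srs_sections):
        r, p = pearsonr(quality[:, j], cost_overrun_pct)
        print(f"{name:25s} r = {r:+.2f}  (p = {p:.2f})")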
Developing Avionics Hardware and Software for Rocket Engine Testing
NASA Technical Reports Server (NTRS)
Aberg, Bryce Robert
2014-01-01
My summer was spent working as an intern at Kennedy Space Center in the Propulsion Avionics Branch of the NASA Engineering Directorate Avionics Division. The work that I was involved with was part of Rocket University's Project Neo, a small-scale liquid rocket engine test bed. I began by learning about the layout of Neo in order to more fully understand what was required of me. I then developed software in LabVIEW to gather and scale data from two flowmeters and integrated that code into the main control software. Next, I developed more LabVIEW code to control an igniter circuit and integrated that into the main software as well. Throughout the internship, I performed work that mechanics and technicians would do in order to maintain and assemble the engine.
NASA Technical Reports Server (NTRS)
Doland, Jerry; Valett, Jon
1994-01-01
This document discusses recommended practices and style for programmers using the C language in the Flight Dynamics Division environment. Guidelines are based on generally recommended software engineering techniques, industry resources, and local convention. The Guide offers preferred solutions to common C programming issues and illustrates them through examples of C code.
Cicconet, Marcelo; Gutwein, Michelle; Gunsalus, Kristin C; Geiger, Davi
2014-08-01
In this paper we report a database and a series of techniques related to the problem of tracking cells, and detecting their divisions, in time-lapse movies of mammalian embryos. Our contributions are (1) a method for counting embryos in a well, and cropping each individual embryo across frames, to create individual movies for cell tracking; (2) a semi-automated method for cell tracking that works up to the 8-cell stage, along with a software implementation available to the public (this software was used to build the reported database); (3) an algorithm for automatic tracking up to the 4-cell stage, based on histograms of mirror symmetry coefficients captured using wavelets; (4) a cell-tracking database containing 100 annotated examples of mammalian embryos up to the 8-cell stage; and (5) statistical analysis of various timing distributions obtained from those examples. Copyright © 2014 Elsevier Ltd. All rights reserved.
Impact of Ada in the Flight Dynamics Division: Excitement and frustration
NASA Technical Reports Server (NTRS)
Bailey, John; Waligora, Sharon; Stark, Mike
1993-01-01
In 1985, NASA Goddard's Flight Dynamics Division (FDD) began investigating how the Ada language might apply to their software development projects. Although they began cautiously using Ada on only a few pilot projects, they expected that, if the Ada pilots showed promising results, they would fully transition their entire development organization from FORTRAN to Ada within 10 years. However, nearly 9 years later, the FDD still produces 80 percent of its software in FORTRAN, despite positive results on Ada projects. This paper reports preliminary results of an ongoing study, commissioned by the FDD, to quantify the impact of Ada in the FDD, to determine why Ada has not flourished, and to recommend future directions regarding Ada. Project trends in both languages are examined as are external factors and cultural issues that affected the infusion of this technology. This paper is the first public report on the Ada assessment study, which will conclude with a comprehensive final report in mid 1994.
Integrated optomechanical analysis and testing software development at MIT Lincoln Laboratory
NASA Astrophysics Data System (ADS)
Stoeckel, Gerhard P.; Doyle, Keith B.
2013-09-01
Advanced analytical software capabilities are being developed to advance the design of prototypical hardware in the Engineering Division at MIT Lincoln Laboratory. The current effort is focused on the integration of analysis tools tailored to the work flow, organizational structure, and current technology demands. These tools are being designed to provide superior insight into the interdisciplinary behavior of optical systems and enable rapid assessment and execution of design trades to optimize the design of optomechanical systems. The custom software architecture is designed to exploit and enhance the functionality of existing industry standard commercial software, provide a framework for centralizing internally developed tools, and deliver greater efficiency, productivity, and accuracy through standardization, automation, and integration. Specific efforts have included the development of a feature-rich software package for Structural-Thermal-Optical Performance (STOP) modeling, advanced Line Of Sight (LOS) jitter simulations, and improved integration of dynamic testing and structural modeling.
Enhancement of computer system for applications software branch
NASA Technical Reports Server (NTRS)
Bykat, Alex
1987-01-01
Presented is the history of a two-month project concerned with a survey, evaluation, and specification of a new computer system for the Applications Software Branch of the Software and Data Management Division of the Information and Electronic Systems Laboratory at Marshall Space Flight Center, NASA. Information gathering consisted of discussions and surveys of branch activities, evaluation of computer manufacturer literature, and presentations by vendors, and was followed by evaluation of the candidate systems. The evaluation criteria were the (tentative) architecture selected for the new system, the type of network architecture supported, the software tools, and, to some extent, the price. The information received from the vendors, together with additional research, led to a detailed design of a suitable system. This design included considerations of hardware and software environments as well as personnel issues such as training. The design effort culminated in a recommendation for a new computing system for the Branch.
The optimal community detection of software based on complex networks
NASA Astrophysics Data System (ADS)
Huang, Guoyan; Zhang, Peng; Zhang, Bing; Yin, Tengteng; Ren, Jiadong
2016-02-01
The community structure is important to software in terms of understanding design patterns and controlling the development and maintenance process. In order to detect the optimal community structure in a software network, a method called Optimal Partition Software Network (OPSN) is proposed based on the dependency relationships among software functions. First, by analyzing the information in multiple execution traces of the software, we construct the Software Execution Dependency Network (SEDN). Second, based on the relationships among the function nodes in the network, we define Fault Accumulation (FA) to measure the importance of each function node and sort the nodes by this measure. Third, we select the top K (K = 1, 2, ...) nodes as the cores of the primal communities (each primal community contains only one core node). By comparing the dependency relationships between each remaining node and the K communities, we put the node into the existing community with which it has the closest relationship. Finally, we calculate the modularity for different initial values of K to obtain the optimal division. Experiments verify that OPSN efficiently detects the optimal community structure in various software systems.
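A minimal sketch of this kind of seeded community assignment is given below in Python using networkx. It is an illustration only: the call-dependency edges are hypothetical, and plain node degree stands in for the paper's Fault Accumulation (FA) measure, which the abstract does not define in detail.

    import networkx as nx
    from networkx.algorithms.community import modularity

    def opsn_like_partition(graph, k):
        # Seed k communities with the highest-degree nodes, then attach every
        # other node to the seeded community it is most strongly connected to.
        ranked = sorted(graph.nodes, key=graph.degree, reverse=True)
        cores = ranked[:k]
        communities = {core: {core} for core in cores}
        for node in ranked[k:]:
            best = max(cores, key=lambda c: sum(1 for nbr in graph.neighbors(node)
                                                if nbr in communities[c]))
            communities[best].add(node)
        return list(communities.values())

    # Hypothetical call-dependency edges extracted from execution traces.
    edges = [("main", "parse"), ("main", "run"), ("parse", "lex"), ("parse", "emit"),
             ("run", "step"), ("run", "log"), ("step", "log"), ("lex", "emit")]
    g = nx.Graph(edges)

    # Try several community counts and keep the one with the highest modularity.
    best_k = max(range(1, 5), key=lambda k: modularity(g, opsn_like_partition(g, k)))
    print("best K:", best_k, opsn_like_partition(g, best_k))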
The Software Engineering Laboratory: An operational software experience factory
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Caldiera, Gianluigi; Mcgarry, Frank; Pajerski, Rose; Page, Gerald; Waligora, Sharon
1992-01-01
For 15 years, the Software Engineering Laboratory (SEL) has been carrying out studies and experiments for the purpose of understanding, assessing, and improving software and software processes within a production software development environment at NASA/GSFC. The SEL comprises three major organizations: (1) NASA/GSFC, Flight Dynamics Division; (2) University of Maryland, Department of Computer Science; and (3) Computer Sciences Corporation, Flight Dynamics Technology Group. These organizations have jointly carried out several hundred software studies, producing hundreds of reports, papers, and documents, all of which describe some aspect of the software engineering technology that was analyzed in the flight dynamics environment at NASA. The studies range from small, controlled experiments (such as analyzing the effectiveness of code reading versus that of functional testing) to large, multiple project studies (such as assessing the impacts of Ada on a production environment). The organization's driving goal is to improve the software process continually, so that sustained improvement may be observed in the resulting products. This paper discusses the SEL as a functioning example of an operational software experience factory and summarizes the characteristics of and major lessons learned from 15 years of SEL operations.
Reuse at the Software Productivity Consortium
NASA Technical Reports Server (NTRS)
Weiss, David M.
1989-01-01
The Software Productivity Consortium is sponsored by 14 aerospace companies as a developer of software engineering methods and tools. Software reuse and prototyping are currently the major emphasis areas. The Methodology and Measurement Project in the Software Technology Exploration Division has developed some concepts for reuse which they intend to develop into a synthesis process. They have identified two approaches to software reuse: opportunistic and systematic. The assumptions underlying the systematic approach, phrased as hypotheses, are the following: the redevelopment hypothesis, i.e., software developers solve the same problems repeatedly; the oracle hypothesis, i.e., developers are able to predict variations from one redevelopment to others; and the organizational hypothesis, i.e., software must be organized according to behavior and structure to take advantage of the predictions that the developers make. The conceptual basis for reuse includes: program families, information hiding, abstract interfaces, uses and information hiding hierarchies, and process structure. The primary reusable software characteristics are black-box descriptions, structural descriptions, and composition and decomposition based on program families. Automated support can be provided for systematic reuse, and the Consortium is developing a prototype reuse library and guidebook. The software synthesis process that the Consortium is aiming toward includes modeling, refinement, prototyping, reuse, assessment, and new construction.
A software framework for developing measurement applications under variable requirements.
Arpaia, Pasquale; Buzio, Marco; Fiscarelli, Lucio; Inglese, Vitaliano
2012-11-01
A framework is proposed for easily developing software for measurement and test applications under highly variable, fast-changing requirements. The framework allows software quality, in terms of flexibility, usability, and maintainability, to be maximized. Furthermore, the development effort is reduced and better focused by relieving the test engineer of development details. The framework can be configured to satisfy a large set of measurement applications in a generic field for an industrial test division, a test laboratory, or a research center. As an experimental case study, the design, implementation, and assessment of its application to a magnet-testing measurement scenario at the European Organization for Nuclear Research are reported.
NASA Technical Reports Server (NTRS)
Stensrud, Kjell C.; Hamm, Dustin
2007-01-01
NASA's Johnson Space Center (JSC) / Flight Design and Dynamics Division (DM) has prototyped the use of Open Source middleware technology for building its next generation spacecraft mission support system. This is part of a larger initiative to use open standards and open source software as building blocks for future mission and safety critical systems. JSC is hoping to leverage standardized enterprise architectures, such as Java EE, so that its internal software development efforts can be focused on the core aspects of their problem domain. This presentation will outline the design and implementation of the Trajectory system and the lessons learned during the exercise.
Estimating the Overdiagnosis Fraction in Cancer Screening | Division of Cancer Prevention
By Stuart G. Baker, 2017. This software supports the mathematical investigation into estimating the fraction of cancers detected on screening that are overdiagnosed. Reference: Baker SG and Prorok PC, Estimating the overdiagnosis fraction in cancer screening. Requirement: Mathematica Version 11 or later.
NASA Technical Reports Server (NTRS)
Stephens, J. Briscoe; Grider, Gary W.
1992-01-01
These Earth Science and Applications Division-Data and Information System (ESAD-DIS) interoperability requirements are designed to quantify the Earth Science and Applications Division's hardware and software requirements in terms of communications among personal computers, visualization workstations, and mainframe computers. The electronic mail requirements and local area network (LAN) requirements are addressed. These interoperability requirements are top-level requirements framed around defining the existing ESAD-DIS interoperability and projecting known near-term requirements for both operational support and management planning. Detailed requirements will be submitted on a case-by-case basis. This document is also intended as an overview of ESAD-DIS interoperability for newcomers and for management not familiar with these activities, and as background documentation to support requests for resources and support requirements.
[Stress management in large-scale establishments].
Fukasawa, Kenji
2002-07-01
Due to a recent dramatic change in industrial structure in Japan, the role of large-scale enterprises is changing. Mass production used to be the major source of company income, but the emphasis has now shifted to high value-added products, including software development. As a consequence of highly competitive inter-corporate development, there are various sources of job stress that induce health problems in employees, especially those involved in development or management. Simply obeying the law or offering medical care is not enough to manage these problems. Occupational health staff need to act according to the type of disease and provide care with support from the supervisor and the Personnel Division; for the training, development, and consultation system, they must also work with the Personnel Division and the Safety Division and be approved by management supervisors.
Shared-resource computing for small research labs.
Ackerman, M J
1982-04-01
A real time laboratory computer network is described. This network is composed of four real-time laboratory minicomputers located in each of four division laboratories and a larger minicomputer in a centrally located computer room. Off the shelf hardware and software were used with no customization. The network is configured for resource sharing using DECnet communications software and the RSX-11-M multi-user real-time operating system. The cost effectiveness of the shared resource network and multiple real-time processing using priority scheduling is discussed. Examples of utilization within a medical research department are given.
Numerical simulation study on the distribution law of smoke flow velocity in horizontal tunnel fire
NASA Astrophysics Data System (ADS)
Liu, Yejiao; Tian, Zhichao; Xue, Junhua; Wang, Wencai
2018-02-01
According to fluid similarity theory, a simulation experiment system for mine tunnel fires is established. The grid division of the experimental model roadway is carried out with GAMBIT software. By setting the boundary and initial conditions of smoke flow during a fire in FLUENT software, and using the RNG k-ε two-equation turbulence model, the energy equation, and the SIMPLE algorithm, a steady-state numerical simulation of smoke flow velocity in the mine tunnel is performed to obtain the distribution law of smoke flow velocity in the tunnel during a fire.
Tools for 3D scientific visualization in computational aerodynamics at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Bancroft, Gordon; Plessel, Todd; Merritt, Fergus; Watson, Val
1989-01-01
Hardware, software, and techniques used by the Fluid Dynamics Division (NASA) for performing visualization of computational aerodynamics, which can be applied to the visualization of flow fields from computer simulations of fluid dynamics about the Space Shuttle, are discussed. Three visualization techniques applied, post-processing, tracking, and steering, are described, as well as the post-processing software packages used, PLOT3D, SURF (Surface Modeller), GAS (Graphical Animation System), and FAST (Flow Analysis software Toolkit). Using post-processing methods a flow simulation was executed on a supercomputer and, after the simulation was complete, the results were processed for viewing. It is shown that the high-resolution, high-performance three-dimensional workstation combined with specially developed display and animation software provides a good tool for analyzing flow field solutions obtained from supercomputers.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-23
... Morrisville, NC........ July 12, 2009. Headquarters. 74,407 Progress Software Austin, TX July 12, 2009.... Subject firm Location Impact date 74,153 Freescale Semiconductor, Austin, TX Inc., Quality Division..., Inc., Red Bank, NJ March 16, 2009. Research & Development, and General & Administrative, ESW, etc. 73...
Distributed Collaborative Homework Activities in a Problem-Based Usability Engineering Course
ERIC Educational Resources Information Center
Carroll, John M.; Jiang, Hao; Borge, Marcela
2015-01-01
Teams of students in an upper-division undergraduate Usability Engineering course used a collaborative environment to carry out a series of three distributed collaborative homework assignments. Assignments were case-based analyses structured using a jigsaw design; students were provided a collaborative software environment and introduced to a…
ERIC Educational Resources Information Center
Oskooie, Kamran Rezai
2012-01-01
This exploratory mixed methods study quantified and explored leadership interest in legacy-data conversion and information processing. Questionnaires were administered electronically to 92 individuals in design, manufacturing, and other professions from the manufacturing, processing, Internet, computing, software and technology divisions. Research…
Changes and challenges in the Software Engineering Laboratory
NASA Technical Reports Server (NTRS)
Pajerski, Rose
1994-01-01
Since 1976, the Software Engineering Laboratory (SEL) has been dedicated to understanding and improving the way in which one NASA organization, the Flight Dynamics Division (FDD), develops, maintains, and manages complex flight dynamics systems. The SEL is composed of three member organizations: NASA/GSFC, the University of Maryland, and Computer Sciences Corporation. During the past 18 years, the SEL's overall goal has remained the same: to improve the FDD's software products and processes in a measured manner. This requires that each development and maintenance effort be viewed, in part, as a SEL experiment which examines a specific technology or builds a model of interest for use on subsequent efforts. The SEL has undertaken many technology studies while developing operational support systems for numerous NASA spacecraft missions.
Pilot/Vehicle display development from simulation to flight
NASA Technical Reports Server (NTRS)
Dare, Alan R.; Burley, James R., II
1992-01-01
The Pilot Vehicle Interface Group, Cockpit Technology Branch, Flight Management Division, at the NASA Langley Research Center is developing display concepts for air combat in the next generation of highly maneuverable aircraft. The High-Alpha Technology Program, under which the research is being done, is involved in flight tests of many new control and display concepts on the High-Alpha Research Vehicle, a highly modified F-18 aircraft. In order to support display concept development through flight testing, a software/hardware system is being developed which will support each phase of the project with little or no software modifications, thus saving thousands of manhours in software development time. Simulation experiments are in progress now and flight tests are slated to begin in FY1994.
Sato, Kuniya; Ooba, Masahiro; Takagi, Tomohiko; Furukawa, Zengo; Komiya, Seiichi; Yaegashi, Rihito
2013-12-01
Agile software development obtains requirements through direct discussion between customers and development staff each time, and the customers evaluate the appropriateness of each requirement. If the customers divide a complicated requirement into individual requirements, the engineer in charge of software development can understand it easily. This is called division of requirements. However, customers do not know how much, or how, to divide their requirements. This paper proposes a method for dividing a complicated requirement into individual requirements. It also describes the development of a requirement specification editor that can describe the individual requirements, so that the engineer in charge of software development can understand the requirements easily.
Understanding and Predicting the Process of Software Maintenance Releases
NASA Technical Reports Server (NTRS)
Basili, Victor; Briand, Lionel; Condon, Steven; Kim, Yong-Mi; Melo, Walcelio L.; Valett, Jon D.
1996-01-01
One of the major concerns of any maintenance organization is to understand and estimate the cost of maintenance releases of software systems. Planning the next release so as to maximize the increase in functionality and the improvement in quality are vital to successful maintenance management. The objective of this paper is to present the results of a case study in which an incremental approach was used to better understand the effort distribution of releases and build a predictive effort model for software maintenance releases. This study was conducted in the Flight Dynamics Division (FDD) of NASA Goddard Space Flight Center(GSFC). This paper presents three main results: 1) a predictive effort model developed for the FDD's software maintenance release process; 2) measurement-based lessons learned about the maintenance process in the FDD; and 3) a set of lessons learned about the establishment of a measurement-based software maintenance improvement program. In addition, this study provides insights and guidelines for obtaining similar results in other maintenance organizations.
Software Engineering Laboratory Ada performance study: Results and implications
NASA Technical Reports Server (NTRS)
Booth, Eric W.; Stark, Michael E.
1992-01-01
The SEL is an organization sponsored by NASA/GSFC to investigate the effectiveness of software engineering technologies applied to the development of applications software. The SEL was created in 1977 and has three organizational members: NASA/GSFC, Systems Development Branch; The University of Maryland, Computer Sciences Department; and Computer Sciences Corporation, Systems Development Operation. The goals of the SEL are as follows: (1) to understand the software development process in the GSFC environments; (2) to measure the effect of various methodologies, tools, and models on this process; and (3) to identify and then to apply successful development practices. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that include the Ada Performance Study Report. This paper describes the background of Ada in the Flight Dynamics Division (FDD), the objectives and scope of the Ada Performance Study, the measurement approach used, the performance tests performed, the major test results, and the implications for future FDD Ada development efforts.
NASA Technical Reports Server (NTRS)
Pepe, J. T.
1972-01-01
A functional design of a software executive system for the space shuttle avionics computer is presented. Three primary functions of the executive are emphasized in the design: task management, I/O management, and configuration management. The executive system organization is based on the applications software and configuration requirements established during the Phase B definition of the Space Shuttle program. Although the primary features of the executive system architecture were derived from Phase B requirements, it was specified for implementation with the IBM 4 Pi EP aerospace computer and is expected to be incorporated into a breadboard data management computer system in the Information Systems Division at NASA's Manned Spacecraft Center. The executive system was structured for internal operation on the IBM 4 Pi EP system, with its external configuration and applications software assumed to be characteristic of the centralized quad-redundant avionics systems defined in Phase B.
Characterizing and Modeling the Cost of Rework in a Library of Reusable Software Components
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Condon, Steven E.; ElEmam, Khaled; Hendrick, Robert B.; Melo, Walcelio
1997-01-01
In this paper we characterize and model the cost of rework in a Component Factory (CF) organization. A CF is responsible for developing and packaging reusable software components. Data were collected on corrective maintenance activities for the Generalized Support Software reuse asset library located at the Flight Dynamics Division of NASA's GSFC. We then constructed a predictive model of the cost of rework using the C4.5 system for generating a logical classification model. The predictor variables for the model are measures of internal software product attributes. The model demonstrates good prediction accuracy and can be used by managers to allocate resources for corrective maintenance activities. Furthermore, we used the model to generate proscriptive coding guidelines to improve programming practices so that the cost of rework can be reduced in the future. The general approach we have used is applicable to other environments.
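The modeling step described above can be pictured with a small stand-in. The sketch below uses scikit-learn's DecisionTreeClassifier rather than C4.5 (which produced the actual model), and the component measures and labels are invented for illustration; only the general shape of the approach, classifying components into high- versus low-rework-cost, follows the paper.

    from sklearn.tree import DecisionTreeClassifier, export_text

    # Hypothetical component measures: size in SLOC, cyclomatic complexity,
    # and number of modules touched during rework (illustration only).
    features = [
        [200, 5, 1], [1500, 22, 4], [300, 8, 1], [2500, 30, 6],
        [120, 3, 1], [1800, 25, 5], [450, 10, 2], [900, 15, 3],
    ]
    labels = ["low", "high", "low", "high", "low", "high", "low", "high"]

    model = DecisionTreeClassifier(max_depth=2, random_state=0)
    model.fit(features, labels)

    # The learned rules can be read off the tree, much like C4.5 rule output.
    print(export_text(model, feature_names=["sloc", "complexity", "modules_changed"]))
    print(model.predict([[1000, 18, 3]]))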
Knowledge-based assistance in costing the space station DMS
NASA Technical Reports Server (NTRS)
Henson, Troy; Rone, Kyle
1988-01-01
The Software Cost Engineering (SCE) methodology developed over the last two decades at IBM Systems Integration Division (SID) in Houston is utilized to cost the NASA Space Station Data Management System (DMS). An ongoing project to capture this methodology, which is built on a foundation of experiences and lessons learned, has resulted in the development of an internal-use-only, PC-based prototype that integrates algorithmic tools with knowledge-based decision support assistants. This prototype Software Cost Engineering Automation Tool (SCEAT) is being employed to assist in the DMS costing exercises. At the same time, DMS costing serves as a forcing function and provides a platform for the continuing, iterative development, calibration, and validation and verification of SCEAT. The data that forms the cost engineering database is derived from more than 15 years of development of NASA Space Shuttle software, ranging from low criticality, low complexity support tools to highly complex and highly critical onboard software.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-26
... Austin and Other Cities, TX.... May 16, 2009. Company, Finance Division, Leased Workers and Teleworkers..., Nashville Business Center. 74,113 Serena Software, Bellevue, WA April 29, 2009. Inc., Research and...., Automotive Products Group. 73,473 Westar Wichita Falls, TX. Transportation, Inc. 73,487 Sonnie's Woodbury, MN...
Using the Multi-Display Teaching System to Lower Cognitive Load
ERIC Educational Resources Information Center
Cheng, Tsung-Sheng; Lu, Yu-Chun; Yang, Chu-Sing
2015-01-01
Multimedia plays a vital role in both learning systems and the actual education process. However, currently used presentation software is often not optimized and generates a great deal of clutter on the screen. Furthermore, there is often insufficient space on a single display, leading to the division of content. These limitations generally…
Efficient Model Posing and Morphing Software
2014-04-01
Air Force Research Laboratory, 711th Human Performance Wing, Human Effectiveness Directorate, Bioeffects Division, Radio Frequency... The absorption of electromagnetic energy within human tissue depends upon anatomical posture and body
Computer Managed Instruction at Arthur Andersen & Company: A Status Report.
ERIC Educational Resources Information Center
Dennis, Verl E.; Gruner, Dennis
1992-01-01
Computer managed instruction (CMI) based on the principle of mastery learning has been cost effective for job training in the tax division of Arthur Andersen & Company. The CMI software system, which uses computerized pretests and posttests to monitor training, has been upgraded from microcomputer use to local area networks. Success factors at…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-10
... software to Anschutz Entertainment Group, Inc., to divest Paciolan, Inc. to Comcast-Spectacor, L.P. or... Justice, Antitrust Division, Antitrust Documents Group, 450 Fifth Street, NW., Suite 1010, Washington, DC... Justice, Hoover Office Building- Second Floor, 1305 East Walnut Street, Des Moines, IA 50319; State of...
Gender DiVisions Across Technology Advertisements and the WWW: Implications for Educational Equity.
ERIC Educational Resources Information Center
Knupfer, Nancy Nelson
1998-01-01
Examines images and patterns of gender stereotypes within mediated and electronic advertisements that reach students online or when viewing computer software and educational television and questions decisions made in the construction of these images. The paper explains the importance of teachers, parents, and the community working together to…
Hutchison, N.E.; Harbaugh, A.W.; Holloway, R.A.; Merk, C.F.
1987-01-01
The Water Resources Division (WRD) of the U.S. Geological Survey is evaluating 32-bit microcomputers to determine how they can complement, and perhaps later replace, the existing network of minicomputers. The WRD is also designing a National Water Information System (NWIS) that will combine and integrate the existing National Water Data Storage and Retrieval System (WATSTORE), National Water Data Exchange (NAWDEX), and components of several other existing systems. The procedures and testing done in a market evaluation of 32-bit microcomputers are documented. The results of the testing are documented in the NWIS Project Office. The market evaluation was done to identify commercially available hardware and software that could be used for implementing early NWIS prototypes to determine the applicability of 32-bit microcomputers for data base and general computing applications. Three microcomputers will be used for these prototype studies. The results of the prototype studies will be used to compile requirements for a Request for Procurement (RFP) for hardware and software to meet the WRD's needs in the early 1990s. The identification of qualified vendors to provide the prototype hardware and software included reviewing industry literature and making telephone calls and personal visits to prospective vendors. Those vendors that appeared to meet general requirements were required to run benchmark tests. (Author's abstract)
The road to business process improvement--can you get there from here?
Gilberto, P A
1995-11-01
Historically, "improvements" within the organization have been frequently attained through automation by building and installing computer systems. Material requirements planning (MRP), manufacturing resource planning II (MRP II), just-in-time (JIT), computer aided design (CAD), computer aided manufacturing (CAM), electronic data interchange (EDI), and various other TLAs (three-letter acronyms) have been used as the methods to attain business objectives. But most companies have found that installing computer software, cleaning up their data, and providing every employee with training on how to best use the systems have not resulted in the level of business improvements needed. The software systems have simply made management around the problems easier but did little to solve the basic problems. The missing element in the efforts to improve the performance of the organization has been a shift in focus from individual department improvements to cross-organizational business process improvements. This article describes how the Electric Boat Division of General Dynamics Corporation, in conjunction with the Data Systems Division, moved its focus from one of vertical organizational processes to horizontal business processes. In other words, how we got rid of the dinosaurs.
NASA Technical Reports Server (NTRS)
Bordano, Aldo; Uhde-Lacovara, JO; Devall, Ray; Partin, Charles; Sugano, Jeff; Doane, Kent; Compton, Jim
1993-01-01
The Navigation, Control and Aeronautics Division (NCAD) at NASA-JSC is exploring ways of producing Guidance, Navigation and Control (GN&C) flight software faster, better, and cheaper. To achieve these goals, NCAD established two hardware/software facilities that take an avionics design project from initial inception through high-fidelity real-time hardware-in-the-loop testing. Commercially available software products are used to develop the GN&C algorithms in block diagram form and then automatically generate source code from these diagrams. A high-fidelity real-time hardware-in-the-loop laboratory provides users with the capability to analyze mass memory usage within the targeted flight computer, verify hardware interfaces, conduct system-level verification, performance, and acceptance testing, and perform mission verification using reconfigurable and mission-unique data. To evaluate these concepts and tools, NCAD embarked on a project to build a real-time 6-DOF simulation of the Soyuz Assured Crew Return Vehicle flight software. To date, a productivity increase of 185 percent has been seen over traditional NASA methods for developing flight software.
NASA Astrophysics Data System (ADS)
Ren, Danping; Wu, Shanshan; Zhang, Lijing
2016-09-01
In view of the global control and flexible monitoring capabilities of software-defined networking (SDN), we propose a new optical access network architecture dedicated to Wavelength Division Multiplexing-Passive Optical Network (WDM-PON) systems based on SDN. Network coding (NC) technology is also applied in this architecture to enhance the utilization of wavelength resources and reduce the cost of light sources. Simulation results show that this scheme can optimize the throughput of the WDM-PON network and greatly reduce system delay and energy consumption.
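The basic network-coding idea the abstract relies on can be shown with a small, hedged sketch: the OLT XORs two frames bound for different ONUs into one coded transmission, and each ONU recovers its own frame by XORing the coded frame with the frame it already holds. This is a generic Python illustration of the principle, not the paper's WDM-PON scheme, and the frame payloads are hypothetical.

    def xor_bytes(a: bytes, b: bytes) -> bytes:
        # XOR two byte strings, padding the shorter one with zero bytes.
        n = max(len(a), len(b))
        a, b = a.ljust(n, b"\x00"), b.ljust(n, b"\x00")
        return bytes(x ^ y for x, y in zip(a, b))

    # Hypothetical downstream frames destined for two different ONUs.
    frame_to_onu1 = b"payload for ONU-1"
    frame_to_onu2 = b"payload for ONU-2"

    # The OLT sends a single coded frame instead of two separate ones.
    coded = xor_bytes(frame_to_onu1, frame_to_onu2)

    # Each ONU recovers its own frame by XORing with the frame it already holds.
    assert xor_bytes(coded, frame_to_onu2) == frame_to_onu1
    assert xor_bytes(coded, frame_to_onu1) == frame_to_onu2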
Software Process Assessment (SPA)
NASA Technical Reports Server (NTRS)
Rosenberg, Linda H.; Sheppard, Sylvia B.; Butler, Scott A.
1994-01-01
NASA's environment mirrors the changes taking place in the nation at large, i.e., workers are being asked to do more work with fewer resources. For software developers at NASA's Goddard Space Flight Center (GSFC), the effect of this change is that we must continue to produce quality code that is maintainable and reusable, but we must learn to produce it more efficiently and less expensively. To accomplish this goal, the Data Systems Technology Division (DSTD) at GSFC is trying a variety of both proven and state-of-the-art techniques for software development (e.g., object-oriented design, prototyping, designing for reuse, etc.). In order to evaluate the effectiveness of these techniques, the Software Process Assessment (SPA) program was initiated. SPA was begun under the assumption that the effects of different software development processes, techniques, and tools on the resulting product must be evaluated in an objective manner in order to assess any benefits that may have accrued. SPA involves the collection and analysis of software product and process data. These data include metrics such as effort, code changes, size, complexity, and code readability. This paper describes the SPA data collection and analysis methodology and presents examples of benefits realized thus far by DSTD's software developers and managers.
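As a rough illustration of the kind of product measures listed above, the following Python sketch computes size, comment density, and a crude branching-based complexity proxy from a piece of source text. These definitions are assumptions chosen for illustration; they are not the SPA program's actual metric definitions.

    import re

    BRANCH_KEYWORDS = re.compile(r"\b(if|for|while|case|elif|except)\b")

    def simple_metrics(source_text):
        # Crude product measures: size, comment density, and a branching-based
        # complexity proxy; these definitions are illustrative only.
        lines = source_text.splitlines()
        code = [ln for ln in lines if ln.strip() and not ln.strip().startswith("#")]
        comments = [ln for ln in lines if ln.strip().startswith("#")]
        return {
            "sloc": len(code),
            "comment_ratio": len(comments) / max(len(lines), 1),
            "branch_points": sum(len(BRANCH_KEYWORDS.findall(ln)) for ln in code),
        }

    sample = "# read input\nfor x in data:\n    if x > 0:\n        total += x\n"
    print(simple_metrics(sample))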
Calibration of a COTS Integration Cost Model Using Local Project Data
NASA Technical Reports Server (NTRS)
Boland, Dillard; Coon, Richard; Byers, Kathryn; Levitt, David
1997-01-01
The software measures and estimation techniques appropriate to a Commercial Off the Shelf (COTS) integration project differ from those commonly used for custom software development. Labor and schedule estimation tools that model COTS integration are available. Like all estimation tools, they must be calibrated with the organization's local project data. This paper describes the calibration of a commercial model using data collected by the Flight Dynamics Division (FDD) of the NASA Goddard Spaceflight Center (GSFC). The model calibrated is SLIM Release 4.0 from Quantitative Software Management (QSM). By adopting the SLIM reuse model and by treating configuration parameters as lines of code, we were able to establish a consistent calibration for COTS integration projects. The paper summarizes the metrics, the calibration process and results, and the validation of the calibration.
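One way to picture the calibration device described above (treating configuration parameters as lines of code) is a small equivalent-size calculation. The sketch below is hypothetical: the reuse weight and the project numbers are illustrative assumptions, not the FDD's calibrated SLIM parameters.

    def equivalent_size(new_sloc, reused_sloc, config_params, reuse_weight=0.3):
        # Combine new code, down-weighted reused code, and COTS configuration
        # parameters (counted one-for-one as lines) into a single size figure.
        # The 0.3 reuse weight is an illustrative assumption, not the FDD value.
        return new_sloc + reuse_weight * reused_sloc + config_params

    # Hypothetical COTS integration project (numbers are illustrative).
    size = equivalent_size(new_sloc=4000, reused_sloc=20000, config_params=1500)
    effort_staff_months = 30.0
    print(f"equivalent size: {size:.0f} lines")
    print(f"productivity: {size / effort_staff_months:.0f} lines per staff-month")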
Multi-modal virtual environment research at Armstrong Laboratory
NASA Technical Reports Server (NTRS)
Eggleston, Robert G.
1995-01-01
One mission of the Paul M. Fitts Human Engineering Division of Armstrong Laboratory is to improve the user interface for complex systems through user-centered exploratory development and research activities. In support of this goal, many current projects attempt to advance and exploit user-interface concepts made possible by virtual reality (VR) technologies. Virtual environments may be used as a general purpose interface medium, an alternative display/control method, a data visualization and analysis tool, or a graphically based performance assessment tool. An overview is given of research projects within the division on prototype interface hardware/software development, integrated interface concept development, interface design and evaluation tool development, and user and mission performance evaluation tool development.
Spaceport Command and Control System Software Development
NASA Technical Reports Server (NTRS)
Glasser, Abraham
2017-01-01
The Spaceport Command and Control System (SCCS) is the National Aeronautics and Space Administration's (NASA) launch control system for the Orion capsule and Space Launch System, the next generation manned rocket currently in development. This large system requires a large amount of intensive testing that will properly measure the capabilities of the system. Automating the test procedures would save the project money from human labor costs, as well as making the testing process more efficient. Therefore, the Exploration Systems Division (formerly the Electrical Engineering Division) at Kennedy Space Center (KSC) has recruited interns for the past two years to work alongside full-time engineers to develop these automated tests, as well as innovate upon the current automation process.
An Overview of the Computational Physics and Methods Group at Los Alamos National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, Randal Scott
CCS Division was formed to strengthen the visibility and impact of computer science and computational physics research on strategic directions for the Laboratory. Both computer science and computational science are now central to scientific discovery and innovation. They have become indispensable tools for all other scientific missions at the Laboratory. CCS Division forms a bridge between external partners and Laboratory programs, bringing new ideas and technologies to bear on today’s important problems and attracting high-quality technical staff members to the Laboratory. The Computational Physics and Methods Group CCS-2 conducts methods research and develops scientific software aimed at the latest and emerging HPC systems.
ERIC Educational Resources Information Center
Bertozzi, N.; Hebert, C.; Rought, J.; Staniunas, C.
2007-01-01
Over the past decade the software products available for solid modeling, dynamic, stress, thermal, and flow analysis, and computer-aiding manufacturing (CAM) have become more powerful, affordable, and easier to use. At the same time it has become increasingly important for students to gain concurrent engineering design and systems integration…
Catalog of Wargaming and Military Simulation Models
1989-09-01
and newly developed software models. This system currently (and will in the near term) supports battle force architecture design and evaluation...aborted air refuelings, or replacement aircraft. PLANNED IMPROVEMENTS AND MODIFICATIONS: Completion of model. INPUT: Input fields are required to...vehicle mobility evaluation model). PROPONENT: Mobility Systems Division, Geotechnical Laboratory, U.S. Army Engineer Waterways Experiment Station
An Object-Oriented Software Reuse Tool
1989-04-01
Square, Cambridge, MA 02139. Controlling office: Advanced Research Projects Agency, 1400 Wilson Blvd.; Office of Naval Research, Information Systems, Arlington, VA 22217. Report date: April 1989. Unclassified. Distribution: Defense Technical Information Center; Computer Sciences Division, ONR, Code 1133; Navy Center for Applied Research in Artificial
Plotview Software For Retrieving Plot-Level Imagery and GIS Data Over The Web
Ken Boss
2001-01-01
The Minnesota Department of Natural Resources, Division of Forestry, Resource Assessment office has been cooperating with both the Forest Service's FIA and the Natural Resources Conservation Service's NRI inventory programs in researching methods to more tightly integrate the two programs. One aspect of these ongoing efforts has been to develop a prototype intranet...
Building an experience factory for maintenance
NASA Technical Reports Server (NTRS)
Valett, Jon D.; Condon, Steven E.; Briand, Lionel; Kim, Yong-Mi; Basili, Victor R.
1994-01-01
This paper reports the preliminary results of a study of the software maintenance process in the Flight Dynamics Division (FDD) of the National Aeronautics and Space Administration/Goddard Space Flight Center (NASA/GSFC). This study is being conducted by the Software Engineering Laboratory (SEL), a research organization sponsored by the Software Engineering Branch of the FDD, which investigates the effectiveness of software engineering technologies when applied to the development of applications software. This software maintenance study began in October 1993 and is being conducted using the Quality Improvement Paradigm (QIP), a process improvement strategy based on three iterative steps: understanding, assessing, and packaging. The preliminary results represent the outcome of the understanding phase, during which SEL researchers characterized the maintenance environment, product, and process. Findings indicate that a combination of quantitative and qualitative analysis is effective for studying the software maintenance process, that additional measures should be collected for maintenance (as opposed to new development), and that characteristics such as effort, error rate, and productivity are best considered on a 'release' basis rather than on a project basis. The research thus far has documented some basic differences between new development and software maintenance. It lays the foundation for further application of the QIP to investigate means of improving the maintenance process and product in the FDD.
Stiltner, G.J.
1990-01-01
In 1987, the Water Resources Division of the U.S. Geological Survey undertook three pilot projects to evaluate electronic report processing systems as a means to improve the quality and timeliness of reports pertaining to water resources investigations. The three projects selected for study included the use of the following configuration of software and hardware: Ventura Publisher software on an IBM model AT personal computer, PageMaker software on a Macintosh computer, and FrameMaker software on a Sun Microsystems workstation. The following assessment criteria were to be addressed in the pilot studies: the combined use of text, tables, and graphics; analysis of time; ease of learning; compatibility with the existing minicomputer system; and technical limitations. It was considered essential that the camera-ready copy produced be in a format suitable for publication. Visual improvement alone was not a consideration. This report consolidates and summarizes the findings of the electronic report processing pilot projects. Text and table files originating on the existing minicomputer system were successfully transformed to the electronic report processing systems in American Standard Code for Information Interchange (ASCII) format. Graphics prepared using a proprietary graphics software package were transferred to all the electronic report processing software through the use of Computer Graphic Metafiles. Graphics from other sources were entered into the systems by scanning paper images. Comparative analysis of time needed to process text and tables by the electronic report processing systems and by conventional methods indicated that, although more time is invested in creating the original page composition for an electronically processed report, substantial time is saved in producing subsequent reports because the format can be stored and re-used by electronic means as a template. Because of the more compact page layouts, costs of printing the reports were 15% to 25% less than costs of printing the reports prepared by conventional methods. Because the largest report workload in the offices conducting water resources investigations is preparation of Water-Resources Investigations Reports, Open-File Reports, and annual State Data Reports, the pilot studies only involved these projects. (USGS)
Process maturity progress at Motorola Cellular Systems Division
NASA Technical Reports Server (NTRS)
Borgstahl, Ron; Criscione, Mark; Dobson, Kim; Willey, Allan
1994-01-01
We believe that the key success elements are related to our recognition that Software Process Improvement (SPI) can and should be organized, planned, managed, and measured as if it were a project to develop a new process, analogous to a software product. We believe that our process improvements have come as the result of these key elements: use of a rigorous, detailed requirements set (Capability Maturity Model, CMM); use of a robust, yet flexible architecture (IEEE 1074); use of a SPI project, resourced and managed like other work, to produce the specifications and implement them; and development of both internal and external goals, with metrics to support them.
A code inspection process for security reviews
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garzoglio, Gabriele; /Fermilab
2009-05-01
In recent years, it has become more and more evident that software threat communities are taking an increasing interest in Grid infrastructures. To mitigate the security risk associated with the increased numbers of attacks, the Grid software development community needs to scale up effort to reduce software vulnerabilities. This can be achieved by introducing security review processes as a standard project management practice. The Grid Facilities Department of the Fermilab Computing Division has developed a code inspection process, tailored to reviewing security properties of software. The goal of the process is to identify technical risks associated with an application and their impact. This is achieved by focusing on the business needs of the application (what it does and protects), on understanding threats and exploit communities (what an exploiter gains), and on uncovering potential vulnerabilities (what defects can be exploited). The desired outcome of the process is an improvement of the quality of the software artifact and an enhanced understanding of possible mitigation strategies for residual risks. This paper describes the inspection process and lessons learned on applying it to Grid middleware.
A code inspection process for security reviews
NASA Astrophysics Data System (ADS)
Garzoglio, Gabriele
2010-04-01
In recent years, it has become more and more evident that software threat communities are taking an increasing interest in Grid infrastructures. To mitigate the security risk associated with the increased numbers of attacks, the Grid software development community needs to scale up effort to reduce software vulnerabilities. This can be achieved by introducing security review processes as a standard project management practice. The Grid Facilities Department of the Fermilab Computing Division has developed a code inspection process, tailored to reviewing security properties of software. The goal of the process is to identify technical risks associated with an application and their impact. This is achieved by focusing on the business needs of the application (what it does and protects), on understanding threats and exploit communities (what an exploiter gains), and on uncovering potential vulnerabilities (what defects can be exploited). The desired outcome of the process is an improvement of the quality of the software artifact and an enhanced understanding of possible mitigation strategies for residual risks. This paper describes the inspection process and lessons learned on applying it to Grid middleware.
NASA Astrophysics Data System (ADS)
Spitznagel, J. A.; Wood, Susan
1988-08-01
The Software Engineering Institute (SEI) is a federally funded research and development center sponsored by the Department of Defense (DOD). It was chartered by the Undersecretary of Defense for Research and Engineering on June 15, 1984. The SEI was established and is operated by Carnegie Mellon University (CMU) under contract F19628-C-0003, which was competitively awarded on December 28, 1984, by the Air Force Electronic Systems Division. The mission of the SEI is to provide the means to bring the ablest minds and the most effective technology to bear on the rapid improvement of the quality of operational software in mission-critical computer systems; to accelerate the reduction to practice of modern software engineering techniques and methods; to promulgate the use of modern techniques and methods throughout the mission-critical systems community; and to establish standards of excellence for the practice of software engineering. This report provides a summary of the programs and projects, staff, facilities, and service accomplishments of the Software Engineering Institute during 1987.
Recovery practices in Division 1 collegiate athletes in North America.
Murray, Andrew; Fullagar, Hugh; Turner, Anthony P; Sproule, John
2018-05-08
Establish current practice and attitudes towards recovery in a group of Division-1 Collegiate athletes from North America. A 16-item questionnaire was administered via custom software in an electronic format. 152 student athletes from a Division-1 Collegiate school across 3 sports (Basketball, American Football, Soccer). The approaches and attitudes to recovery in both training and competition. Sleep, cold water immersion (CWI) and nutrition were perceived to be the most effective modalities (88, 84 and 80% of the sample believed them to have a benefit respectively). Over half the sample did not believe in using compression for recovery. With regard to actual usage, CWI was the most used recovery modality and matched by athletes believing in, and using, the approach (65%). Only 24% of student athletes believed in, and used, sleep as a recovery modality despite it being rated and perceived as the most effective. Collectively, there is a discrepancy between perception and use of recovery modalities in Collegiate athletes. Copyright © 2018 Elsevier Ltd. All rights reserved.
Fault-tolerant reactor protection system
Gaubatz, Donald C.
1997-01-01
A reactor protection system having four divisions, with quad redundant sensors for each scram parameter providing input to four independent microprocessor-based electronic chassis. Each electronic chassis acquires the scram parameter data from its own sensor, digitizes the information, and then transmits the sensor reading to the other three electronic chassis via optical fibers. To increase system availability and reduce false scrams, the reactor protection system employs two levels of voting on a need for reactor scram. The electronic chassis perform software divisional data processing, vote 2/3 with spare based upon information from all four sensors, and send the divisional scram signals to the hardware logic panel, which performs a 2/4 division vote on whether or not to initiate a reactor scram. Each chassis makes a divisional scram decision based on data from all sensors. Each division performs independently of the others (asynchronous operation). All communications between the divisions are asynchronous. Each chassis substitutes its own spare sensor reading in the 2/3 vote if a sensor reading from one of the other chassis is faulty or missing. Therefore the presence of at least two valid sensor readings in excess of a set point is required before terminating the output to the hardware logic of a scram inhibition signal even when one of the four sensors is faulty or when one of the divisions is out of service.
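The following Python fragment is a minimal, illustrative sketch of the two-level voting logic this abstract describes (a 2/3-with-spare vote inside each division followed by a 2/4 vote across divisions); the function names, the numeric setpoint, and the use of None for a faulty reading are assumptions made for the example, not details of the patented system.

# Minimal sketch of the two-level voting described in the abstract.
# Names and thresholds below are illustrative only; the patent text is the authority.

def division_trip(readings, setpoint, own_index):
    """2/3-with-spare vote within one division.

    `readings` holds the four sensor values shared among divisions (None if
    faulty or missing); the division's own spare reading substitutes for a bad one.
    """
    own = readings[own_index]
    others = [r for i, r in enumerate(readings) if i != own_index]
    # Substitute the division's own reading for any faulty/missing peer value.
    votes = [(r if r is not None else own) for r in others]
    exceed = sum(1 for r in votes if r is not None and r > setpoint)
    return exceed >= 2  # at least two valid readings above the setpoint

def reactor_scram(readings, setpoint):
    """2/4 vote across the four divisions (the hardware logic panel in the patent)."""
    trips = [division_trip(readings, setpoint, i) for i in range(4)]
    return sum(trips) >= 2

# Example: one failed sensor, but enough valid exceedances remain -> scram.
print(reactor_scram([105.0, None, 108.0, 99.0], setpoint=100.0))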
Fault-tolerant reactor protection system
Gaubatz, D.C.
1997-04-15
A reactor protection system is disclosed having four divisions, with quad redundant sensors for each scram parameter providing input to four independent microprocessor-based electronic chassis. Each electronic chassis acquires the scram parameter data from its own sensor, digitizes the information, and then transmits the sensor reading to the other three electronic chassis via optical fibers. To increase system availability and reduce false scrams, the reactor protection system employs two levels of voting on a need for reactor scram. The electronic chassis perform software divisional data processing, vote 2/3 with spare based upon information from all four sensors, and send the divisional scram signals to the hardware logic panel, which performs a 2/4 division vote on whether or not to initiate a reactor scram. Each chassis makes a divisional scram decision based on data from all sensors. Each division performs independently of the others (asynchronous operation). All communications between the divisions are asynchronous. Each chassis substitutes its own spare sensor reading in the 2/3 vote if a sensor reading from one of the other chassis is faulty or missing. Therefore the presence of at least two valid sensor readings in excess of a set point is required before terminating the output to the hardware logic of a scram inhibition signal even when one of the four sensors is faulty or when one of the divisions is out of service. 16 figs.
libvaxdata: VAX data format conversion routines
Baker, Lawrence M.
2005-01-01
libvaxdata provides a collection of routines for converting numeric data (integer and floating-point) to and from the formats used on a Digital Equipment Corporation (DEC) VAX 32-bit minicomputer (Brunner, 1991). Since the VAX numeric data formats are inherited from those used on a DEC PDP-11 16-bit minicomputer, these routines can be used to convert PDP-11 data as well. VAX numeric data formats are also the default data formats used on DEC Alpha 64-bit minicomputers running OpenVMS. The libvaxdata routines are callable from Fortran or C. They require that the caller use two's-complement format for integer data and IEEE 754 format (ANSI/IEEE, 1985) for floating-point data. They also require that the 'natural' size of a C int type (integer) is 32 bits. That is the case for most modern 32-bit and 64-bit computer systems. Nevertheless, you may wish to consult the Fortran or C compiler documentation on your system to be sure. Some Fortran compilers support conversion of VAX numeric data on-the-fly when reading or writing unformatted files, either as a compiler option or a run-time I/O option. This feature may be easier to use than the libvaxdata routines. Consult the Fortran compiler documentation on your system to determine if this alternative is available to you. (DEC later became Compaq Computer Corporation, now Hewlett-Packard Company.)
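The report itself documents the actual libvaxdata routine names and calling conventions; purely as an illustration of the kind of conversion involved, the sketch below turns one VAX F_floating bit pattern (read from a file as a little-endian 32-bit word) into an IEEE 754 single, assuming the usual word-swapped layout and excess-128 exponent. Reserved operands and values that underflow the IEEE normal range are deliberately not handled.

import struct

def vax_f_to_ieee_single(raw: int) -> float:
    """Convert one VAX F_floating value to IEEE 754 single precision.

    `raw` is the 32-bit pattern as read from a VAX file, interpreted as a
    little-endian unsigned integer. Sketch only: VAX zero and ordinary
    normalized values are handled; reserved operands and IEEE underflow are not.
    """
    # VAX F_floating stores the sign/exponent/high-fraction bits in the low-order
    # 16-bit word, so swap the halfwords to get the sign into bit 31.
    swapped = ((raw & 0xFFFF) << 16) | ((raw >> 16) & 0xFFFF)
    exponent = (swapped >> 23) & 0xFF
    if exponent == 0:
        return 0.0  # VAX true zero; reserved operands are ignored in this sketch
    # VAX uses an excess-128 exponent with a 0.1f significand; IEEE uses
    # excess-127 with 1.f, so the same fraction bits need the exponent field
    # reduced by 2 (equivalently, the value divided by 4).
    ieee_bits = swapped - (2 << 23)
    return struct.unpack('<f', struct.pack('<I', ieee_bits))[0]

# Example: VAX F_floating 1.0 is stored as the little-endian word 0x00004080.
print(vax_f_to_ieee_single(0x00004080))  # expected 1.0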
Software Sharing Enables Smarter Content Management
NASA Technical Reports Server (NTRS)
2007-01-01
In 2004, NASA established a technology partnership with Xerox Corporation to develop high-tech knowledge management systems while providing new tools and applications that support the Vision for Space Exploration. In return, NASA provides research and development assistance to Xerox to progress its product line. The first result of the technology partnership was a new system called the NX Knowledge Network (based on Xerox DocuShare CPX). Created specifically for NASA's purposes, this system combines Netmark-practical database content management software created by the Intelligent Systems Division of NASA's Ames Research Center-with complementary software from Xerox's global research centers and DocuShare. NX Knowledge Network was tested at the NASA Astrobiology Institute, and is widely used for document management at Ames, Langley Research Center, within the Mission Operations Directorate at Johnson Space Center, and at the Jet Propulsion Laboratory, for mission-related tasks.
NASA Technical Reports Server (NTRS)
Nguyen, Tien Manh
1989-01-01
MT's algorithm was developed as an aid in the design of space telecommunications systems when utilized with simultaneous range/command/telemetry operations. This algorithm provides selection of modulation indices for: (1) suppression of undesired signals to achieve desired link performance margins and/or to allow for a specified performance degradation in the data channel (command/telemetry) due to the presence of undesired signals (interferers); and (2) optimum power division between the carrier, the range channel, and the data channel. A software program using this algorithm was developed for use with MathCAD software. This software program, called the MT program, provides the computation of optimum modulation indices for all possible cases that are recommended by the Consultative Committee for Space Data Systems (CCSDS) (with emphasis on the squarewave, NASA/JPL ranging system).
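The MT program itself is not reproduced here; as a rough illustration of the power-division bookkeeping such an algorithm performs, the sketch below applies the textbook relations for a residual carrier phase-modulated by two squarewave channels, under which the carrier, ranging, data, and intermodulation fractions of total power are cos²θr·cos²θd, sin²θr·cos²θd, cos²θr·sin²θd, and sin²θr·sin²θd. The indices and formulas are assumptions for the example only, not output of the MT program.

import math

def squarewave_power_split(theta_r, theta_d):
    """Fractions of total power in the carrier, ranging, data, and intermodulation
    components for a residual carrier phase-modulated by two squarewave channels.

    theta_r, theta_d: modulation indices in radians (ranging and data channels).
    Assumed textbook relations for squarewave subcarriers, for illustration only.
    """
    carrier = (math.cos(theta_r) * math.cos(theta_d)) ** 2
    ranging = (math.sin(theta_r) * math.cos(theta_d)) ** 2
    data = (math.cos(theta_r) * math.sin(theta_d)) ** 2
    intermod = (math.sin(theta_r) * math.sin(theta_d)) ** 2
    return carrier, ranging, data, intermod

# Example: 0.6 rad ranging index, 1.0 rad data index; the four fractions sum to 1.
c, r, d, i = squarewave_power_split(0.6, 1.0)
print(f"Pc/Pt={c:.3f}  Pr/Pt={r:.3f}  Pd/Pt={d:.3f}  Pim/Pt={i:.3f}  sum={c + r + d + i:.3f}")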
Implementation of Task-Tracking Software for Clinical IT Management.
Purohit, Anne-Maria; Brutscheck, Clemens; Prokosch, Hans-Ulrich; Ganslandt, Thomas; Schneider, Martin
2017-01-01
Often in clinical IT departments, many different methods and IT systems are used for task-tracking and project organization. Based on managers' personal preferences and knowledge about project management methods, tools differ from team to team and even from employee to employee. This causes communication problems, especially when tasks need to be done in cooperation with different teams. Monitoring tasks and resources becomes impossible: there are no defined deliverables, which prevents reliable deadlines. Because of these problems, we implemented task-tracking software which is now in use across all seven teams at the University Hospital Erlangen. Over a period of seven months, a working group defined types of tasks (project, routine task, etc.), workflows, and views to monitor the tasks of the 7 divisions, 20 teams and 340 different IT services. The software has been in use since December 2016.
Developments in Computer Aided Software Maintenance
1974-09-01
thinks in terms of one subdivision until one happens to move to another one. Another study (Tulving & Pearlstone, 1966) showed that failure to...them directions. (Wortman & Greenberg, 1971; Tulving & Pearlstone, 1966) In either case, hierarchical category formation is an economical...1969.) Tulving, E., and Pearlstone, Z. "Availability Versus Accessibility of Information in Memory for Words." Journal of Verbal Learning and
Generalized Support Software: Domain Analysis and Implementation
NASA Technical Reports Server (NTRS)
Stark, Mike; Seidewitz, Ed
1995-01-01
For the past five years, the Flight Dynamics Division (FDD) at NASA's Goddard Space Flight Center has been carrying out a detailed domain analysis effort and is now beginning to implement Generalized Support Software (GSS) based on this analysis. GSS is part of the larger Flight Dynamics Distributed System (FDDS), and is designed to run under the FDDS User Interface/Executive (UIX). The FDD is transitioning from a mainframe-based environment to systems running on engineering workstations. The GSS will be a library of highly reusable components that may be configured within the standard FDDS architecture to quickly produce low-cost satellite ground support systems. The estimate for the first release is that this library will contain approximately 200,000 lines of code. The main driver for developing generalized software is development cost and schedule improvement. The goal is to ultimately have at least 80 percent of all software required for a spacecraft mission (within the domain supported by the GSS) be configured from the generalized components.
NASA Technical Reports Server (NTRS)
Singhal, Surendra N.
2003-01-01
The SAE G-11 RMSL Division and Probabilistic Methods Committee meeting during October 6-8 at the Best Western Sterling Inn, Sterling Heights (Detroit), Michigan is co-sponsored by US Army Tank-automotive & Armaments Command (TACOM). The meeting will provide an industry/government/academia forum to review RMSL technology; reliability and probabilistic technology; reliability-based design methods; software reliability; and maintainability standards. With over 100 members including members with national/international standing, the mission of the G-11's Probabilistic Methods Committee is to "enable/facilitate rapid deployment of probabilistic technology to enhance the competitiveness of our industries by better, faster, greener, smarter, affordable and reliable product development."
NASA Astrophysics Data System (ADS)
Sikder, Somali; Ghosh, Shila
2018-02-01
This paper presents the construction of unipolar transposed modified Walsh code (TMWC) and analysis of its performance in optical code-division multiple-access (OCDMA) systems. Specifically, the signal-to-noise ratio, bit error rate (BER), cardinality, and spectral efficiency were investigated. The theoretical analysis demonstrated that the wavelength-hopping time-spreading system using TMWC was robust against multiple-access interference and more spectrally efficient than systems using other existing OCDMA codes. In particular, the spectral efficiency was calculated to be 1.0370 when TMWC of weight 3 was employed. The BER and eye pattern for the designed TMWC were also successfully obtained using OptiSystem simulation software. The results indicate that the proposed code design is promising for enhancing network capacity.
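The construction of the transposed modified Walsh code is given in the paper and is not reproduced here; as background only, the sketch below generates ordinary Walsh codes from the Sylvester Hadamard construction and maps them to unipolar (0/1) sequences of the kind used in OCDMA signature designs.

import numpy as np

def walsh_codes(order: int) -> np.ndarray:
    """Return a 2**order x 2**order matrix of bipolar (+1/-1) Walsh codes,
    built with the Sylvester Hadamard construction."""
    h = np.array([[1]])
    for _ in range(order):
        h = np.block([[h, h], [h, -h]])
    return h

# Background sketch only: ordinary Walsh codes, not the paper's transposed
# modified Walsh code (TMWC), whose construction is given in the reference.
codes = walsh_codes(3)                 # 8 codes of length 8
unipolar = (codes + 1) // 2            # map +1/-1 to 1/0 for an OCDMA-style signature
print(unipolar)
print("cross-correlation of bipolar codes 1 and 2:", int(codes[1] @ codes[2]))  # 0 (orthogonal)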
NASA Technical Reports Server (NTRS)
Rainbolt, Phillip
2016-01-01
During my internship at JSC for the summer 2016 session, the main project that I worked on dealt with hybrid reality simulations of the ISS. As an ER6 intern in the spacecraft software division, I worked alongside others on the Holodeck Virtual Reality Project, specifically the ISS experience, using the HTC Vive and its controllers.
Implementation and Testing of the JANUS Standard with SSC Pacific’s Software-Defined Acoustic Modem
2017-12-01
Communications Outpost (FDECO) Innovative Naval Prototype (INP) Program by the Advanced Photonic Technologies Branch (Code 55360), Space and Naval Warfare... Communications and Networks Division. EXECUTIVE SUMMARY: This report presents Space and Naval Warfare (SPAWAR) Systems Center Pacific's (SSC... Frequency-Hopped Binary Frequency Shift Keying; Office of Naval Research; Innovative Naval Prototype; Forward Deployed Energy and Communications Outpost
Efficient Craig Interpolation for Linear Diophantine (Dis)Equations and Linear Modular Equations
2008-02-01
Craig interpolants has enabled the development of powerful hardware and software model checking techniques. Efficient algorithms are known for computing...interpolants in rational and real linear arithmetic. We focus on subsets of integer linear arithmetic. Our main results are polynomial-time algorithms...congruences), and linear Diophantine disequations. We show the utility of the proposed interpolation algorithms for discovering modular/divisibility predicates
Netbook - A Toolset in Support of a Collaborative Learning.
1997-01-31
Netbook is a software development research project being conducted for the DARPA Computer Aided Training Initiative (CEATI). As a part of the Smart...Navigators to Access and Integrated Resources (SNAIR) division of CEATI, Netbook concerns itself with the management of Internet resources. More...specifically, Netbook is a toolset that enables students, teachers, and administrators to navigate the World Wide Web, collect resources found there, index
2012-02-01
Fuzzing: The State of the Art. Richard McNally, Ken Yiu, Duncan Grove and Damien Gerhardy. Command, Control, Communications and...Intelligence Division, Defence Science and Technology Organisation, DSTO–TN–1043. ABSTRACT: Fuzzing is an approach to software testing where the system being tested...features of fuzzers and recent advances in their development, in order to discern the current state of the art in fuzzing technologies, and to extrapolate
Customizing graphical user interface technology for spacecraft control centers
NASA Technical Reports Server (NTRS)
Beach, Edward; Giancola, Peter; Gibson, Steven; Mahmot, Ronald
1993-01-01
The Transportable Payload Operations Control Center (TPOCC) project is applying the latest in graphical user interface technology to the spacecraft control center environment. This project of the Mission Operations Division's (MOD) Control Center Systems Branch (CCSB) at NASA Goddard Space Flight Center (GSFC) has developed an architecture for control centers which makes use of a distributed processing approach and the latest in Unix workstation technology. The TPOCC project is committed to following industry standards and using commercial off-the-shelf (COTS) hardware and software components wherever possible to reduce development costs and to improve operational support. TPOCC's most successful use of commercial software products and standards has been in the development of its graphical user interface. This paper describes TPOCC's successful use and customization of four separate layers of commercial software products to create a flexible and powerful user interface that is uniquely suited to spacecraft monitoring and control.
Dam Failure Inundation Map Project
NASA Technical Reports Server (NTRS)
Johnson, Carl; Iokepa, Judy; Dahlman, Jill; Michaud, Jene; Paylor, Earnest (Technical Monitor)
2000-01-01
At the end of the first year, we remain on schedule. Property owners were identified and contacted for land access purposes. A prototype software package has been completed and was demonstrated to the Division of Land and Natural Resources (DLNR), National Weather Service (NWS) and Pacific Disaster Center (PDC). A field crew gathered data and surveyed the areas surrounding two dams in Waimea. (A field report is included in the annual report.) Data sensitivity analysis was initiated and completed. A user's manual has been completed. Beta testing of the software was initiated, but not completed. The initial TNK and property owner data collection for the additional test sites on Oahu and Kauai have been initiated.
Software Development Processes Applied to Computational Icing Simulation
NASA Technical Reports Server (NTRS)
Levinson, Laurie H.; Potapezuk, Mark G.; Mellor, Pamela A.
1999-01-01
The development of computational icing simulation methods is making the transition from research to commonplace use in design and certification efforts. As such, standards of code management, design validation, and documentation must be adjusted to accommodate the increased expectations of the user community with respect to accuracy, reliability, capability, and usability. This paper discusses these concepts with regard to current and future icing simulation code development efforts as implemented by the Icing Branch of the NASA Lewis Research Center in collaboration with the NASA Lewis Engineering Design and Analysis Division. With the application of the techniques outlined in this paper, the LEWICE ice accretion code has become a more stable and reliable software product.
NASA Astrophysics Data System (ADS)
Belloni, V.; Ravanelli, R.; Nascetti, A.; Di Rita, M.; Mattei, D.; Crespi, M.
2018-05-01
In the last few decades, there has been a growing interest in studying non-contact methods for full-field displacement and strain measurement. Among such techniques, Digital Image Correlation (DIC) has received particular attention, thanks to its ability to provide this information by comparing digital images of a sample surface before and after deformation. The method is now commonly adopted in the fields of civil, mechanical, and aerospace engineering, and several companies and research groups have implemented 2D and 3D DIC software. This work first reviews the status of DIC software. It then presents a free and open source 2D DIC software package, named py2DIC and developed in Python at the Geodesy and Geomatics Division of DICEA of the University of Rome "La Sapienza"; its capabilities were evaluated by processing the images captured during tensile tests performed in the Structural Engineering Lab of the University of Rome "La Sapienza" and comparing the results to those obtained using the commercial software Vic-2D developed by Correlated Solutions Inc, USA. The agreement of these results at the one-hundredth-of-a-millimetre level demonstrates that this open source software can be used as a valuable 2D DIC tool to measure full-field displacements on the investigated sample surface.
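py2DIC and Vic-2D are not reproduced here; the sketch below only illustrates the basic DIC idea of matching a speckle subset between a reference and a deformed image by normalized cross-correlation (integer-pixel displacement, no subpixel interpolation). The synthetic images, subset size, and imposed shift are invented for the example.

import cv2
import numpy as np

# Illustrative sketch of the basic DIC idea (subset matching by normalized
# cross-correlation); real DIC codes use refined correlation criteria and
# subpixel interpolation that are omitted here.
rng = np.random.default_rng(0)
reference = rng.integers(0, 256, size=(200, 200), dtype=np.uint8)   # synthetic speckle pattern
deformed = np.roll(reference, shift=(3, 5), axis=(0, 1))            # rigid shift: 3 px down, 5 px right

y0, x0, half = 100, 100, 15                                          # subset centre and half-size in the reference
subset = reference[y0 - half:y0 + half + 1, x0 - half:x0 + half + 1]

score = cv2.matchTemplate(deformed, subset, cv2.TM_CCOEFF_NORMED)
_, _, _, (best_x, best_y) = cv2.minMaxLoc(score)
u = best_x + half - x0                                               # horizontal displacement (pixels)
v = best_y + half - y0                                               # vertical displacement (pixels)
print(f"estimated displacement: u={u} px, v={v} px")                 # expect u=5, v=3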
A web implementation: the good and the not-so-good.
Bergsneider, C; Piraino, D; Fuerst, M
2001-06-01
E-commerce, e-mail, e-greeting, e-this, and e-that: everywhere you turn there is a new "e" word for an internet or Web application. We, at the Cleveland Clinic Foundation, have been "e-nlightened" and will discuss in this report the implementation of a web-based radiology information system (RIS) in our radiology division, or "e-radiology" division. The application, IDXRad Version 10.0 from IDX Corp, Burlington, VT, is in use at the Cleveland Clinic Foundation and has both intranet (for use in Radiology) and internet (referring physician viewing) modules. We will concentrate on the features of using a web browser for the application's front-end, including easy prototyping for screen review, easier mock-ups of demonstrations by vendors and developers, and easier training as more people become web-addicted. Project communication can be facilitated with an internal project web page, and use of the web browser can accommodate quicker turnaround of software upgrades as the software code is centrally located. Compared with other technologies, including client/server, there is a smaller rollout cost when using a standard web browser. However, the new technology requires a change, and changes are never implemented without challenges. A seasoned technologist using a legacy system can enter data quicker using function keys than using a graphical user interface and pointing and clicking through a series of pop-up windows. Also, effective use of a web browser depends on intuitive design for it to be easily implemented and accepted by the user. Some software packages will not work on both of the popular web browsers and are instead tailored to specific release levels. As computer-based patient records become a standard, patient confidentiality must be enforced. The technical design and application security features that support the web-based software package will be discussed. Also, web technologies have their own implementation issues.
Performance Evaluation of Various STL File Mesh Refining Algorithms Applied for FDM-RP Process
NASA Astrophysics Data System (ADS)
Ledalla, Siva Rama Krishna; Tirupathi, Balaji; Sriram, Venkatesh
2018-06-01
Layered manufacturing machines use the stereolithography (STL) file to build parts. When a curved surface is converted from a computer aided design (CAD) file to STL, it results in geometrical distortion and chordal error. Parts manufactured with this file might not satisfy geometric dimensioning and tolerance requirements due to the approximated geometry. Current algorithms built into CAD packages have export options to globally reduce this distortion, which leads to an increase in the file size and pre-processing time. In this work, different mesh subdivision algorithms are applied to the STL file of a part with complex geometric features using MeshLab software. The mesh subdivision algorithms considered in this work are the modified butterfly subdivision technique, the Loop subdivision technique, and the general triangular midpoint subdivision technique. A comparative study is made with respect to volume and build time using the above techniques. It is found that the triangular midpoint subdivision algorithm is more suitable for the geometry under consideration. Only the wheel cap part is then manufactured on a Stratasys MOJO FDM machine. The surface roughness of the part is measured on a Talysurf surface roughness tester.
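MeshLab's own filters are not shown here; the sketch below only illustrates one pass of the general triangular midpoint subdivision idea (each triangle split into four by inserting edge midpoints), with invented vertex data for the example.

import numpy as np

def midpoint_subdivide(vertices, faces):
    """One pass of triangular midpoint subdivision: each triangle is split into
    four by inserting edge midpoints. Illustrative sketch only; production mesh
    tools also handle normals, shared-edge bookkeeping, and smoothing variants."""
    vertices = list(map(tuple, vertices))
    midpoint_cache = {}

    def midpoint(i, j):
        key = (min(i, j), max(i, j))
        if key not in midpoint_cache:
            m = tuple((np.array(vertices[i]) + np.array(vertices[j])) / 2.0)
            midpoint_cache[key] = len(vertices)
            vertices.append(m)
        return midpoint_cache[key]

    new_faces = []
    for a, b, c in faces:
        ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
        new_faces += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
    return np.array(vertices), np.array(new_faces)

# Example: one triangle becomes four; the vertex count grows from 3 to 6.
v, f = midpoint_subdivide(np.eye(3), [(0, 1, 2)])
print(len(v), len(f))  # 6 4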
Improving the Automated Detection and Analysis of Secure Coding Violations
2014-06-01
eliminating software vulnerabilities and other flaws. The CERT Division produces books and courses that foster a security mindset in developers, and...website also provides a virtual machine containing a complete build of the Rosecheckers project on Linux. The Rosecheckers project leverages the...Compass/ROSE project developed at Lawrence Livermore National Laboratory. This project provides a high-level API for accessing the abstract syntax tree
Analysis of Multilayered Printed Circuit Boards using Computed Tomography
2014-05-01
complex PCBs that present a challenge for any testing or fault analysis. Set-to-work testing and fault analysis of any electronic circuit require...Electronic Warfare and Radar Division in December 2010. He is currently in the Electro-Optic Countermeasures Group. Samuel works on embedded system design...and software optimisation of complex electro-optical systems, including the set to work and characterisation of these systems. He has a Bachelor of
NASA Technical Reports Server (NTRS)
Duncan, Sharon L.
2011-01-01
The Enterprise Business Information Services Division (EBIS) supports the Laboratory and its functions through the implementation and support of business information systems on behalf of its business community. EBIS has five strategic focus areas: (1) improve project estimating, planning, and delivery capability; (2) improve maintainability and sustainability of the EBIS Application Portfolio; (3) leap forward in IT leadership; (4) comprehensive talent management; and (5) a continuous IT security program. Portfolio management is a strategy in which software applications are managed as assets.
Netbook - A Toolset in Support of a Collaborative and Cooperative Learning Environment.
1996-04-26
Netbook is a software development/research project being conducted for the DARPA computer aided training initiative (CEATI). As a part of the SNAIR...division of CEATI, Netbook concerns itself with the management of Internet resources. More specifically, Netbook is a toolset that allows students...a meaningful way. In addition Netbook provides the capacity for communication with peers and teachers, enabling students to collaborate while engaged
A Taxonomy of Operational Cyber Security Risks Version 2
2014-05-01
2014-TN-006, CERT® Division, Carnegie Mellon University. Unlimited distribution subject to the copyright. http://www.sei.cmu.edu. This material is based upon work funded and supported by DHS/DoD under Contract No. FA8721-05-C-0003 with Carnegie Mellon University.
Naval Air Warfare Center Aircraft Division Patent Portfolio
2012-01-01
the use of said composition to protect metal from corrosion and mildew. The composition comprises, in parts by weight, from about 20 to 60 parts of...composition (NAVGUARD™) Abstract: The invention relates to an oleaginous corrosion-inhibiting composition, and the use of said composition to protect...electric motors or actuators of the robotic device to thereby control same. In addition, a computer software program is provided for use in the gesture
RANS Simulations using OpenFOAM Software
2016-01-01
Averaged Navier-Stokes (RANS) simulations is described and illustrated by applying the simpleFoam solver to two case studies: two-dimensional flow...to run in parallel over large processor arrays. The purpose of this report is to illustrate and test the use of the steady-state Reynolds Averaged...Group in the Maritime Platforms Division he has been simulating fluid flow around ships and submarines using finite element codes, Lagrangian vortex
Some issues related to simulation of the tracking and communications computer network
NASA Technical Reports Server (NTRS)
Lacovara, Robert C.
1989-01-01
The Communications Performance and Integration branch of the Tracking and Communications Division has an ongoing involvement in the simulation of its flight hardware for Space Station Freedom. Specifically, the communication process between central processor(s) and orbital replaceable units (ORU's) is simulated with varying degrees of fidelity. The results of investigations into three aspects of this simulation effort are given. The most general area involves the use of computer assisted software engineering (CASE) tools for this particular simulation. The second area of interest is simulation methods for systems of mixed hardware and software. The final area investigated is the application of simulation methods to one of the proposed computer network protocols for space station, specifically IEEE 802.4.
Some issues related to simulation of the tracking and communications computer network
NASA Astrophysics Data System (ADS)
Lacovara, Robert C.
1989-12-01
The Communications Performance and Integration branch of the Tracking and Communications Division has an ongoing involvement in the simulation of its flight hardware for Space Station Freedom. Specifically, the communication process between central processor(s) and orbital replaceable units (ORU's) is simulated with varying degrees of fidelity. The results of investigations into three aspects of this simulation effort are given. The most general area involves the use of computer assisted software engineering (CASE) tools for this particular simulation. The second area of interest is simulation methods for systems of mixed hardware and software. The final area investigated is the application of simulation methods to one of the proposed computer network protocols for space station, specifically IEEE 802.4.
Peterson, Kylee M; Torii, Keiko U
2012-12-31
Imaging in vivo dynamics of cellular behavior throughout a developmental sequence can be a powerful technique for understanding the mechanics of tissue patterning. During animal development, key cell proliferation and patterning events occur very quickly. For instance, in Caenorhabditis elegans all cell divisions required for the larval body plan are completed within six hours after fertilization, with seven mitotic cycles (1); the sixteen or more mitoses of Drosophila embryogenesis occur in less than 24 hr (2). In contrast, cell divisions during plant development are slow, typically on the order of a day (3,4,5). This imposes a unique challenge and a need for long-term live imaging for documenting dynamic behaviors of cell division and differentiation events during plant organogenesis. Arabidopsis epidermis is an excellent model system for investigating signaling, cell fate, and development in plants. In the cotyledon, this tissue consists of air- and water-resistant pavement cells interspersed with evenly distributed stomata, valves that open and close to control gas exchange and water loss. Proper spacing of these stomata is critical to their function, and their development follows a sequence of asymmetric division and cell differentiation steps to produce the organized epidermis (Fig. 1). This protocol allows observation of cells and proteins in the epidermis over several days of development. This time frame enables precise documentation of stem-cell divisions and differentiation of epidermal cells, including stomata and epidermal pavement cells. Fluorescent proteins can be fused to proteins of interest to assess their dynamics during cell division and differentiation processes. This technique allows us to understand the localization of a novel protein, POLAR (6), during the proliferation stage of stomatal-lineage cells in the Arabidopsis cotyledon epidermis, where it is expressed in cells preceding asymmetric division events and moves to a characteristic area of the cell cortex shortly before division occurs. Images can be registered and streamlined video easily produced using public domain software to visualize dynamic protein localization and cell types as they change over time.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramos, Manuel I. Martin
1996-10-07
The goal of this work was to study the behavior of the angular distribution of the electron from the decay of the W boson in a specific rest-frame of the W, the Collins-Soper frame. This thesis consists of four major divisions, each dealing with closely related themes: (a) Physics Background, (b) Description of the Hardware and General Software Tools, (c) Description of the Analysis and Specific Tools, and (d) Results and Conclusions. Each division is comprised of one or more chapters and each chapter is divided into sections and subsections.
2011-01-01
Program Jointly Managed by the USA MRMC, NIH, NASA, and the Juvenile Diabetes Research Foundation and Combat Casualty Care Division, United States Army...were performed in the CP group (p = 0.0003), and nursing staff compliance with CP recommendations was greater (p < 0.0001). Conclusions—Glycemic...enhanced consistency in practice, providing standardization among nursing staff. Keywords Glycemic control; hypoglycemia; computer decision support
Integration and Interoperability: An Analysis to Identify the Attributes for System of Systems
2008-09-01
divisions of the enterprise. Examples of the current I2 are: • a nightly feed of elearning information is captured through an automated and...standardized process throughout the enterprise and • the LMS has been integrated with SkillSoft, a third-party elearning software system (http...Command (JITC) is responsible to test all programs that utilize standard interfaces to specific global nets or systems. Many times programs that
Evaluation of customer satisfaction level of different projects.
Das, Nandini; Samanta, Niladri
2005-01-01
Customer satisfaction as the key element for success in business is a major concern for any industry. In this paper we propose a customer satisfaction index using principal component analysis for a software solution company. This index was used as an input to the marketing division to identify their potential customers from their past experience. Since this is a very common problem for any industry, the same approach can be used in similar situations.
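The paper's survey items and weighting are not reproduced here; the sketch below only illustrates the general idea of a PCA-based satisfaction index, taking the first principal component of standardized item scores and rescaling it to 0-100. The item names and ratings are invented for the example.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Illustrative sketch of a PCA-based customer satisfaction index; the actual
# survey items, scaling, and weights used in the paper are not shown here.
rng = np.random.default_rng(1)
# Hypothetical survey: 50 customers rating 5 items (e.g., quality, delivery,
# support, price, documentation) on a 1-10 scale.
ratings = rng.integers(1, 11, size=(50, 5)).astype(float)

scores = StandardScaler().fit_transform(ratings)
pca = PCA(n_components=1)
index = pca.fit_transform(scores).ravel()

# Rescale the first principal component to 0-100 for reporting.
csi = 100 * (index - index.min()) / (index.max() - index.min())
print("explained variance of PC1:", round(pca.explained_variance_ratio_[0], 3))
print("example customer satisfaction indices:", np.round(csi[:5], 1))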
2001-12-01
and Lieutenant Namik Kaplan, Turkish Navy. Maj Tiefert's thesis, "Modeling Control Channel Dynamics of SAAM using NS Network Simulation", helped lay...DEC99] Deconinck, Dr. ir. Geert, Fault Tolerant Systems, ESAT/Division ACCA, Katholieke Universiteit Leuven, October 1999. [FRE00] Freed...Systems", Addison-Wesley, 1989. [KAP99] Kaplan, Namik, "Prototyping of an Active and Lightweight Router," March 1999. [KAT99] Kati, Effraim
Reactor protection system with automatic self-testing and diagnostic
Gaubatz, Donald C.
1996-01-01
A reactor protection system having four divisions, with quad redundant sensors for each scram parameter providing input to four independent microprocessor-based electronic chassis. Each electronic chassis acquires the scram parameter data from its own sensor, digitizes the information, and then transmits the sensor reading to the other three electronic chassis via optical fibers. To increase system availability and reduce false scrams, the reactor protection system employs two levels of voting on a need for reactor scram. The electronic chassis perform software divisional data processing, vote 2/3 with spare based upon information from all four sensors, and send the divisional scram signals to the hardware logic panel, which performs a 2/4 division vote on whether or not to initiate a reactor scram. Each chassis makes a divisional scram decision based on data from all sensors. Automatic detection and discrimination against failed sensors allows the reactor protection system to automatically enter a known state when sensor failures occur. Cross communication of sensor readings allows comparison of four theoretically "identical" values. This permits identification of sensor errors such as drift or malfunction. A diagnostic request for service is issued for errant sensor data. Automated self test and diagnostic monitoring, sensor input through output relay logic, virtually eliminate the need for manual surveillance testing. This provides an ability for each division to cross-check all divisions and to sense failures of the hardware logic.
Reactor protection system with automatic self-testing and diagnostic
Gaubatz, D.C.
1996-12-17
A reactor protection system is disclosed having four divisions, with quad redundant sensors for each scram parameter providing input to four independent microprocessor-based electronic chassis. Each electronic chassis acquires the scram parameter data from its own sensor, digitizes the information, and then transmits the sensor reading to the other three electronic chassis via optical fibers. To increase system availability and reduce false scrams, the reactor protection system employs two levels of voting on a need for reactor scram. The electronic chassis perform software divisional data processing, vote 2/3 with spare based upon information from all four sensors, and send the divisional scram signals to the hardware logic panel, which performs a 2/4 division vote on whether or not to initiate a reactor scram. Each chassis makes a divisional scram decision based on data from all sensors. Automatic detection and discrimination against failed sensors allows the reactor protection system to automatically enter a known state when sensor failures occur. Cross communication of sensor readings allows comparison of four theoretically "identical" values. This permits identification of sensor errors such as drift or malfunction. A diagnostic request for service is issued for errant sensor data. Automated self test and diagnostic monitoring, sensor input through output relay logic, virtually eliminate the need for manual surveillance testing. This provides an ability for each division to cross-check all divisions and to sense failures of the hardware logic. 16 figs.
Using software to predict occupational hearing loss in the mining industry.
Azman, A S; Li, M; Thompson, J K
2016-01-01
Powerful mining systems typically generate high-level noise that can damage the hearing ability of miners. Engineering noise controls are the most desirable and effective control for overexposure to noise. However, the effects of these noise controls on the actual hearing status of workers are not easily measured. A tool that can provide guidance in assigning workers to jobs based on the noise levels to which they will be exposed is highly desirable. Therefore, the Pittsburgh Mining Research Division (PMRD) of the U.S. National Institute for Occupational Safety and Health (NIOSH) developed a tool to estimate in a systematic way the hearing loss due to occupational noise exposure and to evaluate the effectiveness of developed engineering controls. This computer program is based on the ISO 1999 standard and can be used to estimate the loss of hearing ability caused by occupational noise exposures. In this paper, the functionalities of this software are discussed and several case studies related to mining machinery are presented to demonstrate the functionalities of this software.
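The NIOSH program implements the full ISO 1999 hearing-loss model; as a small related illustration only, the sketch below computes an 8-hour equivalent continuous exposure level from exposure segments using the energy-based 3-dB exchange rule. The example levels and durations are invented.

import math

def leq_8h(segments):
    """8-hour equivalent continuous level from (level_dBA, hours) pairs using the
    energy-based 3-dB exchange rule. Illustration only; the NIOSH software
    implements the full ISO 1999 hearing-loss model, not just this step."""
    total_energy = sum(hours * 10 ** (level / 10.0) for level, hours in segments)
    return 10.0 * math.log10(total_energy / 8.0)

# Example shift: 4 h at 95 dBA near a loader, 3 h at 85 dBA, 1 h at 75 dBA.
print(round(leq_8h([(95, 4), (85, 3), (75, 1)]), 1), "dBA")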
1995-05-01
Random House Business Division, 1986. Ishikawa, Kaoru. What is Total Quality Control?: The Japanese Way. Englewood Cliffs, NJ: Prentice-Hall...by drawing attention to the vital few truly important problems. Cause-and-effect diagrams. Also called fishbone and Ishikawa diagrams due to their...February 1994. 5.2 The Seeds [Aguayo 91] [Crosby 79] [Crosby 92] [Deming 86] [Fellers 92] [Gitlow 87] [Gluckman 93] [Imai 86] [Ishikawa 85] [Juran
Fungal bis-Naphthopyrones as Inhibitors of Botulinum Neurotoxin Serotype A
2012-04-02
Ashish G. Soman,§ Biren K. Joshi,§ Sara M. Hein,§ Donald T. Wicklow,∥ and Leonard A. Smith*,⊥ †Division of Integrated Toxicology, U.S. Army Medical...of chemicals for bacterial mutagenicity using electrotopological E-state indices and MDL QSAR software. Regul. Toxicol. Pharmacol. 2005, 43, 313−323...(12) Feng, J.; Lurati, L.; Ouyang, H.; Robinson, T.; Wang, Y.; Yuan, S.; Young, S. S. Predictive toxicology: Benchmarking molecular descriptors and
Army Corps of Engineers, Southwestern Division, Reservoir Control Center Annual Report 1988
1989-01-01
water control data system. This system includes the equipment and software used for the acquisition, transmission and processing of real-time hydrologic...transmission. The SWD system was installed at the Federal Center in Fort Worth, Texas, in September 1983. This is a Synergetics Model 10C Direct Readout Ground...reservoir projects under control of the Department of the Army in the area comprising all of Arkansas and Louisiana and portions of Missouri, Kansas
The right stuff ... meeting your customer needs.
Rubin, P; Carrington, S
1999-11-01
Meeting (and exceeding) your customers' needs is a requirement for competing in the current business world. New tools and techniques must be employed to deal with the rapidly changing global environment. This article describes the success of a global supply chain integration project for a division of a large multinational corporation. A state-of-the-art ERP software package was implemented in conjunction with major process changes to improve the organization's ability to promise and deliver product to their customers.
Airborne Navigation Remote Map Reader Evaluation.
1986-03-01
EVALUATION (James C. Byrd, Integrated Controls/Displays Branch, Avionics Systems Division, Directorate of Avionics Engineering, March 1986 Final Report)...contents include Resolution, Accuracy, Symbology, Video Standard, Simulator Control Box, Software, Display Performance, Reliability...can be selected depending on the detail required and will automatically be presented at his present position. The French RMR uses a Flying Spot Scanner
NASA Technical Reports Server (NTRS)
Lunsford, Myrtis Leigh
1998-01-01
The Army-NASA Virtual Innovations Laboratory (ANVIL) was recently created to provide virtual reality tools for performing Human Engineering and operations analysis for both NASA and the Army. The author's summer research project consisted of developing and refining these tools for NASA's Reusable Launch Vehicle (RLV) program. Several general simulations were developed for use by the ANVIL for the evaluation of the X34 Engine Changeout procedure. These simulations were developed with the software tool dVISE 4.0.0 produced by Division Inc. All software was run on an SGI Indigo2 High Impact. This paper describes the simulations, various problems encountered with the simulations, other summer activities, and possible work for the future. We first begin with a brief description of virtual reality systems.
NASA Technical Reports Server (NTRS)
Turner, B. J. (Principal Investigator); Baumer, G. M.; Myers, W. L.; Sykes, S. G.
1981-01-01
The Forest Pest Management Division (FPMD) of the Pennsylvania Bureau of Forestry has the responsibility for conducting annual surveys of the State's forest lands to accurately detect, map, and appraise forest insect infestations. A standardized, timely, and cost-effective method of accurately surveying forests and their condition should enhance the probability of suppressing infestations. The repetitive and synoptic coverage provided by LANDSAT (formerly ERTS) makes such satellite-derived data potentially attractive as a survey medium for monitoring forest insect damage over large areas. Forest Pest Management Division personnel have expressed keen interest in LANDSAT data and have informally cooperated with NASA/Goddard Space Flight Center (GSFC) since 1976 in the development of techniques to facilitate their use. The results of this work indicate that it may be feasible to use LANDSAT digital data to conduct annual surveys of insect defoliation of hardwood forests.
Generalized Nanosatellite Avionics Testbed Lab
NASA Technical Reports Server (NTRS)
Frost, Chad R.; Sorgenfrei, Matthew C.; Nehrenz, Matt
2015-01-01
The Generalized Nanosatellite Avionics Testbed (G-NAT) lab at NASA Ames Research Center provides a flexible, easily accessible platform for developing hardware and software for advanced small spacecraft. A collaboration between the Mission Design Division and the Intelligent Systems Division, the objective of the lab is to provide testing data and general test protocols for advanced sensors, actuators, and processors for CubeSat-class spacecraft. By developing test schemes for advanced components outside of the standard mission lifecycle, the lab is able to help reduce the risk carried by advanced nanosatellite or CubeSat missions. Such missions are often allocated very little time for testing, and too often the test facilities must be custom-built for the needs of the mission at hand. The G-NAT lab helps to eliminate these problems by providing an existing suite of testbeds that combines easily accessible, commercial-off-the-shelf (COTS) processors with a collection of existing sensors and actuators.
The Office of the Materials Division
NASA Technical Reports Server (NTRS)
Ramsey, Amanda J.
2004-01-01
I was assigned to the Materials Division, which consists of the following branches: the Advanced Metallics Branch/5120-RMM, the Ceramics Branch/5130-RMC, the Polymers Branch/5150-RMP, and the Durability and Protective Coatings Branch/5160-RMD. Mrs. Pamela Spinosi was my assigned mentor. She was assisted by Ms. Raysa Rodriguez/5100-RM and Mrs. Denise Prestien/5100-RM, who are both employed by InDyne, Inc. My primary assignment this past summer was working directly with Ms. Rodriguez, assisting her with setting up the Integrated Financial Management Program (IFMP) 5130-RMC/Branch procedures and logs. These duties consisted of creating various spreadsheets for each individual branch member, which were updated daily. It was not hard to familiarize myself with these duties since this was my second summer working with Ms. Rodriguez at NASA Glenn Research Center. I assisted RMC with ordering laboratory supplies and equipment for the Basic Materials Laboratory (Building 106) using the IFMP/Purchase Card (P-card), a NASA-wide software program. I entered new Travel Authorizations for the 5130-RMC Civil Servant Branch members into the IFMP/Travel and Requisitions System. I also entered and completed Travel Vouchers for the 5130-RMC Ceramics Branch. I assisted the Division Office in creating a new Emergency Contact list for the Materials Division. I worked with Dr. Hugh Gray, the Division Chief, and Dr. Ajay Misra, the 5130-RMC Branch Chief, on priority action items, with a close deadline, for a large NASA proposal. Another project was working closely with Ms. Rodriguez in organizing and preparing for Dr. Ajay K. Misra's SESCDP (two-year detail). This consisted of organizing files, file folders, and personal information, recording all data material onto CDs, and printing all presentations for display in binders. I attended numerous Branch meetings and observed many changes in the Branch Management organization.
A model-based approach for automated in vitro cell tracking and chemotaxis analyses.
Debeir, Olivier; Camby, Isabelle; Kiss, Robert; Van Ham, Philippe; Decaestecker, Christine
2004-07-01
Chemotaxis may be studied in two main ways: 1) counting cells passing through an insert (e.g., using Boyden chambers), and 2) directly observing cell cultures (e.g., using Dunn chambers), both in response to stationary concentration gradients. This article promotes the use of Dunn chambers and in vitro cell-tracking, achieved by video microscopy coupled with automatic image analysis software, in order to extract quantitative and qualitative measurements characterizing the response of cells to a diffusible chemical agent. Previously, we set up a videomicroscopy system coupled with image analysis software that was able to compute cell trajectories from in vitro cell cultures. In the present study, we introduce new software that extends the application of this system to chemotaxis studies. This software is based on an adapted version of the active contour methodology, enabling each cell to be efficiently tracked for hours and resulting in detailed descriptions of individual cell trajectories. The major advantages of this method come from an improved robustness with respect to variability in cell morphologies between different cell lines and dynamical changes in cell shape during cell migration. Moreover, the software includes a very small number of parameters which do not require overly sensitive tuning. Finally, the running time of the software is very short, allowing improved possibilities in acquisition frequency and, consequently, improved descriptions of complex cell trajectories, i.e., trajectories including cell division and cell crossing. We validated this software on several artificial and real cell culture experiments in Dunn chambers, including comparisons with manual (human-controlled) analyses. We developed new software and data analysis tools for automated cell tracking which enable cell chemotaxis to be efficiently analyzed. Copyright 2004 Wiley-Liss, Inc.
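The paper's adapted active contour method is not reproduced here; the sketch below only illustrates the basic snake idea on a single synthetic cell using scikit-image's active_contour, with invented image data and parameter values, and without the robustness to shape change, division, and crossing described in the abstract.

import numpy as np
from skimage.filters import gaussian
from skimage.segmentation import active_contour

# Toy illustration of the active-contour (snake) idea behind the tracking software.
yy, xx = np.mgrid[0:200, 0:200]
cell = np.exp(-((yy - 100) ** 2 + (xx - 100) ** 2) / (2 * 25.0 ** 2))  # one bright synthetic "cell"

# Initial contour: a circle of radius 60 around the blob centre.
t = np.linspace(0, 2 * np.pi, 100)
init = np.column_stack([100 + 60 * np.sin(t), 100 + 60 * np.cos(t)])

snake = active_contour(gaussian(cell, sigma=3), init, alpha=0.015, beta=10, gamma=0.001)
print("contour points:", snake.shape,
      "mean radius:", round(float(np.mean(np.hypot(snake[:, 0] - 100, snake[:, 1] - 100))), 1))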
1992-05-01
Clark, T.H. Fay, Multispectral Bathymetry Programs: A Users Guide, NTN 95. Myrick, S., M. Lohrenz, Data Base Design Document for the Digital Map...Computer Software in the A-12 Digital Map Set, NTN 162. Myrick, S., M. Lohrenz, P. Wischow, M. Trenchard, S. Tyskiewicz, J. Kaufman, MDFF HELP...Shaw, K, D. Byman, S. Carter, M. Kalcic, M. Clawson, M. Harris, A Summary of the Collected Data from a Survey of Navy Digital MC&G Requirements
High Temperature Properties and Aging-Stress Related Changes of FeCo Materials
2006-07-01
Performing organization: Power Generation Branch (AFRL/PRPG), Power Division, Propulsion Directorate...Sponsoring/monitoring agency: AFRL-PR-WP, Propulsion Directorate, Air Force...fcc), α-(bcc) and α'-(CsCl) phases (produced using TAPP software, ES Microware)
Crosstalk: The Journal of Defense Software Engineering. Volume 22, Number 1, January 2009
2009-01-01
contract project. They manage the project plan and its execution and ask: How do we get all of this accomplished? • The marketing manager...struggle with the pros and cons of technological toys at home, with the cell phone probably being the most divisive. It's great to keep our kids in...their technology behind? These concerns present us practitioners with many challenges. We must build trust with program managers, proving over and
Weapon System Software Acquisition and Support: A Theory of System Structure and Behavior.
1982-03-01
Labs, Raytheon, FMC Inorganic Chemicals Division, Motorola Military Electronics, Hughes Aircraft, General Dynamics/Fort Worth, Grumman and IBM, among...Organization Engineering Manpower M-16A Engineers Authorized ENGEN Expected Number of M-17 Engineers Engineers LEI Level of Engineers M-17A Engineers...productivity sector are listed below: ENGEX.K=ENGEX.J+DT(ENGCT.JK-REXEL.JK-RREA.JK) M-21,Level ENGEN=LEI M-17N,Initial ENGTOT=LEI M-18N,Initial ENGEX=LEI M
2011-10-31
designs with code division multiple access (CDMA). Analog chirp filters were used to produce an up-chirp, which is used as a radar waveform, coupled with...signals. A potential shortcoming of CDMA techniques is that the addition of two signals will result in a non-constant amplitude signal which will be...of low-frequency A/Ds. As an example, for a multiple carrier signal all the received signals from the multiple carriers are aliased onto the
CrossTalk: The Journal of Defense Software Engineering. Volume 25, Number 4, July/August 2012
2012-08-01
understand the interface between various code components. For example, consider a situation in which handwritten code produced by one team generates an...conclusively say that a division by zero will not occur. The abstract interpretation concept can be generalized as a tool set that can be used to determine...word what makes a good manager, I would say decisiveness. You can use the fanciest computers to gather the numbers, but in the end you have to set
Pulse Code Modulation (PCM) encoder handbook for Aydin Vector MMP-900 series system
NASA Technical Reports Server (NTRS)
Raphael, David
1995-01-01
This handbook explicates the hardware and software properties of a time division multiplex system. This system is used to sample analog and digital data. The data is then merged with frame synchronization information to produce a serial pulse coded modulation (PCM) bit stream. Information in this handbook is required by users to design a compatible interface and assure effective utilization of this encoder system. Aydin Vector provides all of the components for these systems to Goddard Space Flight Center/Wallops Flight Facility.
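As a rough illustration of the frame structure described above (a sync pattern followed by time-multiplexed channel words), the sketch below packs quantized samples into a minor-frame bit stream. The 24-bit sync word value and 8-bit word length are assumptions for the example, not values taken from the MMP-900 handbook.

```python
def build_pcm_frame(samples, sync_word=0xFAF320, word_bits=8, sync_bits=24):
    """Pack one minor frame: sync pattern followed by one word per channel.
    sync_word/word_bits are illustrative choices, not the MMP-900 settings."""
    bits = []
    for shift in range(sync_bits - 1, -1, -1):        # frame sync word, MSB first
        bits.append((sync_word >> shift) & 1)
    for value in samples:                              # time-multiplexed channel words
        for shift in range(word_bits - 1, -1, -1):
            bits.append((value >> shift) & 1)
    return bits

# Example: three analog channels already quantized to 8-bit words.
frame = build_pcm_frame([0x12, 0xA4, 0x7F])
print(len(frame), "bits:", "".join(map(str, frame)))
```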
1984-02-01
and is approved for publication. APPROVED: Project Engineer. APPROVED: RAYMOND P. URTZ, JR., Acting Technical Director, Command and Control ...Technical Director, Command and Control Division. FOR THE COMMANDER: JOHN A. RITZ, Acting Chief, Plans Office. If your address has changed or if you wish to be...Denver CO 80201. Controlling office name and address: Rome Air Development Center (COEE), Griffiss; report date: February 1984.
Development of a demand assignment/TDMA system for international business satellite communications
NASA Astrophysics Data System (ADS)
Nohara, Mitsuo; Takeuchi, Yoshio; Takahata, Fumio; Hirata, Yasuo; Yamazaki, Yoshiharu
An experimental IBS (international business satellite) communications system based on demand assignment and TDMA (time-division multiple-access) operation has been developed. The system utilizes a limited satellite resource efficiently and provides a full range of ISDN services. The IBS network configurations suitable for international communications are discussed, and the developed communications system is described from the viewpoint of hardware and software implementation. Performance in terms of transmission quality and call processing is also demonstrated.
Human Motion Tracking and Glove-Based User Interfaces for Virtual Environments in ANVIL
NASA Technical Reports Server (NTRS)
Dumas, Joseph D., II
2002-01-01
The Army/NASA Virtual Innovations Laboratory (ANVIL) at Marshall Space Flight Center (MSFC) provides an environment where engineers and other personnel can investigate novel applications of computer simulation and Virtual Reality (VR) technologies. Among the many hardware and software resources in ANVIL are several high-performance Silicon Graphics computer systems and a number of commercial software packages, such as Division MockUp by Parametric Technology Corporation (PTC) and Jack by Unigraphics Solutions, Inc. These hardware and software platforms are used in conjunction with various VR peripheral I/O (input / output) devices, CAD (computer aided design) models, etc. to support the objectives of the MSFC Engineering Systems Department/Systems Engineering Support Group (ED42) by studying engineering designs, chiefly from the standpoint of human factors and ergonomics. One of the more time-consuming tasks facing ANVIL personnel involves the testing and evaluation of peripheral I/O devices and the integration of new devices with existing hardware and software platforms. Another important challenge is the development of innovative user interfaces to allow efficient, intuitive interaction between simulation users and the virtual environments they are investigating. As part of his Summer Faculty Fellowship, the author was tasked with verifying the operation of some recently acquired peripheral interface devices and developing new, easy-to-use interfaces that could be used with existing VR hardware and software to better support ANVIL projects.
The SEL Adapts to Meet Changing Times
NASA Technical Reports Server (NTRS)
Pajerski, Rose S.; Basili, Victor R.
1997-01-01
Since 1976, the Software Engineering Laboratory (SEL) has been dedicated to understanding and improving the way in which one NASA organization, the Flight Dynamics Division (FDD) at Goddard Space Flight Center, develops, maintains, and manages complex flight dynamics systems. It has done this by developing and refining a continual process improvement approach that allows an organization such as the FDD to fine-tune its process for its particular domain. Experimental software engineering and measurement play a significant role in this approach. The SEL is a partnership of NASA Goddard, its major software contractor, Computer Sciences Corporation (CSC), and the University of Maryland's (LTM) Department of Computer Science. The FDD primarily builds software systems that provide ground-based flight dynamics support for scientific satellites. They fall into two sets: ground systems and simulators. Ground systems are midsize systems that average around 250 thousand source lines of code (KSLOC). Ground system development projects typically last 1 - 2 years. Recent systems have been rehosted to workstations from IBM mainframes, and also contain significant new subsystems written in C and C++. The simulators are smaller systems averaging around 60 KSLOC that provide the test data for the ground systems. Simulator development lasts up to 1 year. Most of the simulators have been built in Ada on workstations. The SEL is responsible for the management and continual improvement of the software engineering processes used on these FDD projects.
Using software to predict occupational hearing loss in the mining industry
Azman, A.S.; Li, M.; Thompson, J.K.
2017-01-01
Powerful mining systems typically generate high-level noise that can damage the hearing ability of miners. Engineering noise controls are the most desirable and effective control for overexposure to noise. However, the effects of these noise controls on the actual hearing status of workers are not easily measured. A tool that can provide guidance in assigning workers to jobs based on the noise levels to which they will be exposed is highly desirable. Therefore, the Pittsburgh Mining Research Division (PMRD) of the U.S. National Institute for Occupational Safety and Health (NIOSH) developed a tool to estimate in a systematic way the hearing loss due to occupational noise exposure and to evaluate the effectiveness of developed engineering controls. This computer program is based on the ISO 1999 standard and can be used to estimate the loss of hearing ability caused by occupational noise exposures. In this paper, the functionalities of this software are discussed and several case studies related to mining machinery are presented to demonstrate the functionalities of this software. PMID:28596700
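ISO 1999 expresses the hearing threshold level of a noise-exposed population by combining the age-associated threshold H with the noise-induced permanent threshold shift N. The sketch below shows only that combination rule; it is not the NIOSH program itself, and the example values are illustrative rather than derived from the standard's tables.

```python
def combined_threshold(h_age_db, nipts_db):
    """ISO 1999 combination of age-associated hearing threshold level H and
    noise-induced permanent threshold shift N (both in dB):
    H' = H + N - H*N/120."""
    return h_age_db + nipts_db - (h_age_db * nipts_db) / 120.0

# Example with illustrative (not standard-derived) values at a single audiometric
# frequency: an age-related threshold of 20 dB and an estimated NIPTS of 15 dB.
print(round(combined_threshold(20.0, 15.0), 1), "dB HL")
```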
Spaceport Command and Control System Software Development
NASA Technical Reports Server (NTRS)
Mahlin, Jonathan Nicholas
2017-01-01
There is an immense challenge in organizing personnel across a large agency such as NASA, or even over a subset of that, like a center's Engineering directorate. Workforce inefficiencies and challenges are bound to grow over time without oversight and management. It is also not always possible to hire new employees to fill workforce gaps; therefore, available resources must be utilized more efficiently. The goal of this internship was to develop software that improves organizational efficiency by aiding managers, making employee information viewable and editable in an intuitive manner. This semester I created an application for managers that aids in optimizing allocation of employee resources for a single division with the possibility of scaling upwards. My duties this semester consisted of developing frontend and backend software to complete this task. The application provides user-friendly information displays and documentation of the workforce to allow NASA to diligently track the status and skills of its workforce. This tool should help show whether current employees are being effectively utilized and whether new hires are necessary to fill skill gaps.
Department of Energy's Virtual Lab Infrastructure for Integrated Earth System Science Data
NASA Astrophysics Data System (ADS)
Williams, D. N.; Palanisamy, G.; Shipman, G.; Boden, T.; Voyles, J.
2014-12-01
The U.S. Department of Energy (DOE) Office of Biological and Environmental Research (BER) Climate and Environmental Sciences Division (CESD) produces a diversity of data, information, software, and model codes across its research and informatics programs and facilities. This information includes raw and reduced observational and instrumentation data, model codes, model-generated results, and integrated data products. Currently, most of this data and information are prepared and shared for program specific activities, corresponding to CESD organization research. A major challenge facing BER CESD is how best to inventory, integrate, and deliver these vast and diverse resources for the purpose of accelerating Earth system science research. This talk provides a concept for a CESD Integrated Data Ecosystem and an initial roadmap for its implementation to address this integration challenge in the "Big Data" domain. Towards this end, a new BER Virtual Laboratory Infrastructure will be presented, which will include services and software connecting the heterogeneous CESD data holdings, and constructed with open source software based on industry standards, protocols, and state-of-the-art technology.
Liu, Lei; Peng, Wei-Ren; Casellas, Ramon; Tsuritani, Takehiro; Morita, Itsuro; Martínez, Ricardo; Muñoz, Raül; Yoo, S J B
2014-01-13
Optical Orthogonal Frequency Division Multiplexing (O-OFDM), which transmits high speed optical signals using multiple spectrally overlapped lower-speed subcarriers, is a promising candidate for supporting future elastic optical networks. In contrast to previous works which focus on Coherent Optical OFDM (CO-OFDM), in this paper, we consider the direct-detection optical OFDM (DDO-OFDM) as the transport technique, which leads to simpler hardware and software realizations, potentially offering a low-cost solution for elastic optical networks, especially in metro networks, and short or medium distance core networks. Based on this network scenario, we design and deploy a software-defined networking (SDN) control plane enabled by extending OpenFlow, detailing the network architecture, the routing and spectrum assignment algorithm, OpenFlow protocol extensions and the experimental validation. To the best of our knowledge, it is the first time that an OpenFlow-based control plane is reported and its performance is quantitatively measured in an elastic optical network with DDO-OFDM transmission.
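The abstract mentions a routing and spectrum assignment (RSA) algorithm without detailing it. The sketch below shows a generic first-fit spectrum assignment over a fixed route, enforcing spectrum continuity and contiguity; it is an illustration of the general RSA idea under assumed data structures, not the algorithm implemented in the paper.

```python
def first_fit_spectrum(link_usage, path_links, demand_slots):
    """First-fit spectrum assignment over a fixed path (illustrative only).

    link_usage: dict link -> list of booleans (True = subcarrier slot occupied).
    path_links: links traversed by the route.
    demand_slots: number of contiguous subcarrier slots required.
    Returns the starting slot index, or None if the request is blocked.
    Enforces spectrum continuity (same slots on every link) and contiguity.
    """
    total_slots = len(next(iter(link_usage.values())))
    for start in range(total_slots - demand_slots + 1):
        window = range(start, start + demand_slots)
        if all(not link_usage[l][s] for l in path_links for s in window):
            for l in path_links:            # reserve the slots on every link
                for s in window:
                    link_usage[l][s] = True
            return start
    return None

# Example: two-link path, 8 slots per link, request needs 3 contiguous slots.
usage = {"A-B": [True, True, False, False, False, False, False, False],
         "B-C": [False, False, False, True, False, False, False, False]}
print(first_fit_spectrum(usage, ["A-B", "B-C"], 3))   # -> 4
```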
Development and Test of Robotically Assisted Extravehicular Activity Gloves
NASA Technical Reports Server (NTRS)
Rogers, Jonathan M.; Peters, Benjamin J.; Laske, Evan A.; McBryan, Emily R.
2017-01-01
Over the past two years, the High Performance EVA Glove (HPEG) project under NASA's Space Technology Mission Directorate (STMD) funded an effort to develop an electromechanically-assisted space suit glove. The project was a collaboration between the Johnson Space Center's Software, Robotics, and Simulation Division and the Crew and Thermal Systems division. The project sought to combine finger actuator technology developed for Robonaut 2 with the softgoods from the ILC Phase VI EVA glove. The Space Suit RoboGlove (SSRG) uses a system of three linear actuators to pull synthetic tendons attached to the glove's fingers to augment flexion of the user's fingers. To detect the user's inputs, the system utilizes a combination of string potentiometers along the back of the fingers and force sensitive resistors integrated into the fingertips of the glove cover layer. This paper discusses the development process from initial concepts through two major phases of prototypes, and the results of initial human testing. Initial work on the project focused on creating a functioning proof of concept, designing the softgoods integration, and demonstrating augmented grip strength with the actuators. The second year of the project focused on upgrading the actuators, sensors, and software with the overall goal of creating a system that moves with the user's fingers in order to reduce fatigue associated with the operation of a pressurized glove system. This paper also discusses considerations for a flight system based on this prototype development and address where further work is required to mature the technology.
NASA Technical Reports Server (NTRS)
Davis, Derrick D.
2014-01-01
This 2014 summer internship assignment at John F. Kennedy Space Center (K.S.C) was conducted with the National Aeronautics and Space Administration (NASA) Engineering and Technology (NE) group in support of the Control and Data Systems Division (NE-C) within the Test, Operations & Support Software Engineering Branch (NE-C2). The primary focus of this project was to assist Branch Chief Laurie B. Griffin, to support NASA's Small Payload Launch Integrated Testing Services (SPLITS) mission, by mastering the capabilities of 3-D modeling software called SketchUp. I used SketchUp to create a virtual environment for different laboratories of the NE-00 Division. My mission was to have these models uploaded into a K.S.C Partnerships Website and be used as a visual aid to viewers who browsed the site. The leads of this project were Kay L. Craig, Business and Industry Specialist (AD-A) and Steven E. Cain, (FA-C). I teamed with fellow intern Tait Sorenson of the Flight Structures and Thermal Protection Systems Branch (NE-M5) and met with many K.S.C lab managers willing to display their lab's structure and capabilities. The information collected during these lab tours was vital to the building of the K.S.C Partnerships Website. To accomplish this goal Sorenson and I later teamed with fellow Marketing intern Marlee Pereda-Ramos, of the Spaceport Planning Office In Center Planning And Development (AD-A) Along with Ramos, Tait and I toured an array of laboratories and got first hand exposure to their functions and capabilities.
Liu, Jun; Zou, Ling; Zhao, Zhi-he; Welburn, Neala; Yang, Pu; Tang, Tian; Li, Yu
2009-01-01
Aim To determine cephalometrically the mechanism of the treatment effects of non-extraction and multiloop edgewise archwire (MEAW) technique on postpeak Class II Division 1 patients. Methodology In this retrospective study, 16 postpeak Class II Division 1 patients successfully corrected using a non-extraction and MEAW technique were cephalometrically evaluated and compared with 16 matched control subjects treated using an extraction technique. Using CorelDRAW® software, standardized digital cephalograms pre- and post-active treatments were traced and a reference grid was set up. The superimpositions were based on the cranial base, the mandibular and the maxilla regions,and skeletal and dental changes were measured. Changes following treatment were evaluated using the paired-sample t-test. Student's t-test for unpaired samples was used to assess the differences in changes between the MEAW and the extraction control groups. Results The correction of the molar relationships comprised 54% skeletal change (mainly the advancement of the mandible) and 46% dental change. Correction of the anterior teeth relationships comprised 30% skeletal change and 70% dental change. Conclusion The MEAW technique can produce the desired vertical and sagittal movement of the tooth segment and then effectively stimulate mandibular advancement by utilizing the residual growth potential of the condyle. PMID:20690424
The MiAge Calculator: a DNA methylation-based mitotic age calculator of human tissue types.
Youn, Ahrim; Wang, Shuang
2018-01-01
Cell division is important in human aging and cancer. The estimation of the number of cell divisions (mitotic age) of a given tissue type in individuals is of great interest as it allows not only the study of biological aging (using a new molecular aging target) but also the stratification of prospective cancer risk. Here, we introduce the MiAge Calculator, a mitotic age calculator based on a novel statistical framework, the MiAge model. MiAge is designed to quantitatively estimate mitotic age (total number of lifetime cell divisions) of a tissue using the stochastic replication errors accumulated in the epigenetic inheritance process during cell divisions. With the MiAge model, the MiAge Calculator was built using the training data of DNA methylation measures of 4,020 tumor and adjacent normal tissue samples from eight TCGA cancer types and was tested using the testing data of DNA methylation measures of 2,221 tumor and adjacent normal tissue samples of five other TCGA cancer types. We showed that within each of the thirteen cancer types studied, the estimated mitotic age is universally accelerated in tumor tissues compared to adjacent normal tissues. Across the thirteen cancer types, we showed that worse cancer survivals are associated with more accelerated mitotic age in tumor tissues. Importantly, we demonstrated the utility of mitotic age by showing that the integration of mitotic age and clinical information leads to improved survival prediction in six out of the thirteen cancer types studied. The MiAge Calculator is available at http://www.columbia.edu/∼sw2206/softwares.htm .
Photo-realistic Terrain Modeling and Visualization for Mars Exploration Rover Science Operations
NASA Technical Reports Server (NTRS)
Edwards, Laurence; Sims, Michael; Kunz, Clayton; Lees, David; Bowman, Judd
2005-01-01
Modern NASA planetary exploration missions employ complex systems of hardware and software managed by large teams of engineers and scientists in order to study remote environments. The most complex and successful of these recent projects is the Mars Exploration Rover mission. The Computational Sciences Division at NASA Ames Research Center delivered a 3D visualization program, Viz, to the MER mission that provides an immersive, interactive environment for science analysis of the remote planetary surface. In addition, Ames provided the Athena Science Team with high-quality terrain reconstructions generated with the Ames Stereo-pipeline. The on-site support team for these software systems responded to unanticipated opportunities to generate 3D terrain models during the primary MER mission. This paper describes Viz, the Stereo-pipeline, and the experiences of the on-site team supporting the scientists at JPL during the primary MER mission.
Modular Mount Control System for Telescopes
NASA Astrophysics Data System (ADS)
Mooney, J.; Cleis, R.; Kyono, T.; Edwards, M.
The Space Observatory Control Kit (SpOCK) is the hardware, computers and software used to run small and large telescopes in the RDS division of the Air Force Research Laboratories (AFRL). The system is used to track earth satellites, celestial objects, terrestrial objects and aerial objects. The system will track general targets when provided with state vectors in one of five coordinate systems. Client-to-server and server-to-gimbals communication occurs via human-readable s-expressions that may be evaluated by the computer language called Racket. Software verification is achieved by scripts that exercise these expressions by sending them to the server, and receiving the expressions that the server evaluates. This paper describes the adaptation of a modular mount control system developed primarily for LEO satellite imaging on large and small portable AFRL telescopes with a goal of orbit determination and the generation of satellite metrics.
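Since the client-server exchange is described as human-readable s-expressions, the following sketch shows how such a command might be sent over a TCP connection from Python. The host, port, and command vocabulary are hypothetical assumptions for illustration; SpOCK's actual expressions are evaluated by Racket on the server side.

```python
import socket

def send_sexpr(host, port, expression):
    """Send one s-expression command string to a mount server and return the
    s-expression it sends back. The command vocabulary used in the example
    below is hypothetical, not SpOCK's actual API."""
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall((expression + "\n").encode("ascii"))
        reply = sock.makefile("r", encoding="ascii").readline()
    return reply.strip()

# Hypothetical usage: ask the server to track a target given a state vector.
# print(send_sexpr("mount-server.example", 5000,
#                  '(track-target (frame "ECI") (position 7000.0 0.0 0.0))'))
```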
Hardware Architecture Study for NASA's Space Software Defined Radios
NASA Technical Reports Server (NTRS)
Reinhart, Richard C.; Scardelletti, Maximilian C.; Mortensen, Dale J.; Kacpura, Thomas J.; Andro, Monty; Smith, Carl; Liebetreu, John
2008-01-01
This study defines a hardware architecture approach for software defined radios to enable commonality among NASA space missions. The architecture accommodates a range of reconfigurable processing technologies including general purpose processors, digital signal processors, field programmable gate arrays (FPGAs), and application-specific integrated circuits (ASICs) in addition to flexible and tunable radio frequency (RF) front-ends to satisfy varying mission requirements. The hardware architecture consists of modules, radio functions, and interfaces. The modules are a logical division of common radio functions that comprise a typical communication radio. This paper describes the architecture details, module definitions, and the typical functions on each module as well as the module interfaces. Trade-offs between a component-based, custom architecture and a functional-based, open architecture are described. The architecture does not specify the internal physical implementation within each module, nor does the architecture mandate the standards or ratings of the hardware used to construct the radios.
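To make the idea of a logical division into modules with defined interfaces concrete, the sketch below defines a common module interface and two placeholder modules chained together. The class, method, and parameter names are assumptions for illustration, not the interface definitions from the study.

```python
from abc import ABC, abstractmethod

class RadioModule(ABC):
    """Illustrative module boundary for a modular SDR architecture
    (names are assumptions, not the study's actual interface definitions)."""

    @abstractmethod
    def configure(self, parameters: dict) -> None:
        """Load a waveform-specific configuration."""

    @abstractmethod
    def process(self, samples: bytes) -> bytes:
        """Transform a block of samples and hand it to the next module."""

class RfFrontEnd(RadioModule):
    def configure(self, parameters):
        self.center_hz = parameters.get("center_hz", 2.2e9)
    def process(self, samples):
        return samples            # placeholder for tuning/filtering

class BasebandProcessor(RadioModule):
    def configure(self, parameters):
        self.modulation = parameters.get("modulation", "QPSK")
    def process(self, samples):
        return samples            # placeholder for demodulation on FPGA/DSP/GPP

# Chaining modules through the common interface keeps each one replaceable.
chain = [RfFrontEnd(), BasebandProcessor()]
for module in chain:
    module.configure({"center_hz": 8.4e9, "modulation": "QPSK"})
data = b"\x00\x01"
for module in chain:
    data = module.process(data)
```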
ATLAS Live: Collaborative Information Streams
NASA Astrophysics Data System (ADS)
Goldfarb, Steven; ATLAS Collaboration
2011-12-01
I report on a pilot project launched in 2010 focusing on facilitating communication and information exchange within the ATLAS Collaboration, through the combination of digital signage software and webcasting. The project, called ATLAS Live, implements video streams of information, ranging from detailed detector and data status to educational and outreach material. The content, including text, images, video and audio, is collected, visualised and scheduled using digital signage software. The system is robust and flexible, utilizing scripts to input data from remote sources, such as the CERN Document Server, Indico, or any available URL, and to integrate these sources into professional-quality streams, including text scrolling, transition effects, inter and intra-screen divisibility. Information is published via the encoding and webcasting of standard video streams, viewable on all common platforms, using a web browser or other common video tool. Authorisation is enforced at the level of the streaming and at the web portals, using the CERN SSO system.
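As a rough illustration of the script-driven content ingest described above, the sketch below pulls text from remote URLs and rotates through a display schedule. The URLs, dwell time, and loop structure are placeholders, not the ATLAS Live implementation.

```python
import time
import urllib.request

def fetch_text(url):
    """Pull remote content (e.g., a status page or document-server query) for display."""
    with urllib.request.urlopen(url, timeout=10) as response:
        return response.read().decode("utf-8", errors="replace")

def run_signage_loop(playlist, dwell_seconds=15, cycles=1):
    """Rotate through (label, url) items the way a signage schedule would.
    The sources here are placeholders, not the actual ATLAS Live feeds."""
    for _ in range(cycles):
        for label, url in playlist:
            try:
                content = fetch_text(url)
            except OSError as err:
                content = f"unavailable ({err})"
            print(f"=== {label} ===\n{content[:200]}")
            time.sleep(dwell_seconds)

# run_signage_loop([("Detector status", "https://example.org/status"),
#                   ("Outreach feed", "https://example.org/outreach")])
```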
2013-01-15
S48-E-007 (12 Sept 1991) --- Astronaut James F. Buchli, mission specialist, catches snack crackers as they float in the weightless environment of the earth-orbiting Discovery. This image was transmitted by the Electronic Still Camera, Development Test Objective (DTO) 648. The ESC is making its initial appearance on a Space Shuttle flight. Electronic still photography is a new technology that enables a camera to electronically capture and digitize an image with resolution approaching film quality. The digital image is stored on removable hard disks or small optical disks, and can be converted to a format suitable for downlink transmission or enhanced using image processing software. The Electronic Still Camera (ESC) was developed by the Man- Systems Division at the Johnson Space Center and is the first model in a planned evolutionary development leading to a family of high-resolution digital imaging devices. H. Don Yeates, JSC's Man-Systems Division, is program manager for the ESC. THIS IS A SECOND GENERATION PRINT MADE FROM AN ELECTRONICALLY PRODUCED NEGATIVE
NASA Technical Reports Server (NTRS)
Rickman, D.; Butler, K. A.; Laymon, C. A.
1994-01-01
The purpose of this document is to introduce Geographical Information System (GIS) terminology and summarize interviews conducted with scientists in the Earth Science and Applications Division (ESAD). There is a growing need in ESAD for GIS technology. With many different data sources available to the scientists comes the need to be able to process and view these data in an efficient manner. Since most of these data are stored in vastly different formats, specialized software and hardware are needed. Several ESAD scientists have been using a GIS, specifically the Man-computer Interactive Data Access System (MCIDAS). MCIDAS can solve many of the research problems that arise, but there are areas of research that need more powerful tools; one such example is the multispectral image analysis which is described in this document. Given the strong need for GIS in ESAD, we recommend that a requirements analysis and implementation plan be developed using this document as a basis for further investigation.
NASA Astrophysics Data System (ADS)
Wu, Bin; Yin, Hongxi; Qin, Jie; Liu, Chang; Liu, Anliang; Shao, Qi; Xu, Xiaoguang
2016-09-01
To meet the growing demand for diversified services and flexible bandwidth allocation in future access networks, a flexible passive optical network (PON) scheme combining time and wavelength division multiplexing (TWDM) with a point-to-point wavelength division multiplexing (PtP WDM) overlay is proposed in this paper for next-generation optical access networks. A novel software-defined optical distribution network (ODN) structure is designed based on wavelength selective switches (WSS), which can implement dynamic wavelength and bandwidth allocation and suits bursty traffic. The experimental results reveal that the TWDM-PON can provide 40 Gb/s downstream and 10 Gb/s upstream data transmission, while the PtP WDM-PON can support 10 GHz of point-to-point dedicated bandwidth as the overlay complement system. The wavelengths of the TWDM-PON and PtP WDM-PON are allocated dynamically based on the WSS, which verifies the feasibility of the proposed structure.
Mass and Reliability System (MaRS)
NASA Technical Reports Server (NTRS)
Barnes, Sarah
2016-01-01
The Safety and Mission Assurance (S&MA) Directorate is responsible for mitigating risk, providing system safety, and lowering risk for space programs from ground to space. The S&MA is divided into 4 divisions: The Space Exploration Division (NC), the International Space Station Division (NE), the Safety & Test Operations Division (NS), and the Quality and Flight Equipment Division (NT). The interns, myself and Arun Aruljothi, will be working with the Risk & Reliability Analysis Branch under the NC Division. The mission of this division is to identify, characterize, diminish, and communicate risk by implementing an efficient and effective assurance model. The team utilizes Reliability and Maintainability (R&M) and Probabilistic Risk Assessment (PRA) to ensure decisions concerning risks are informed, vehicles are safe and reliable, and program/project requirements are realistic and realized. This project pertains to the Orion mission, so it is geared toward long-duration human space flight programs. For space missions, payload is a critical concept; balancing what hardware can be replaced by components versus by Orbital Replacement Units (ORU) or subassemblies is key. For this effort a database was created that combines mass and reliability data, called Mass and Reliability System or MaRS. The U.S. International Space Station (ISS) components are used as reference parts in the MaRS database. Using ISS components as a platform is beneficial because of the historical context and the environment similarities to a space flight mission. MaRS uses a combination of systems: International Space Station PART for failure data, Vehicle Master Database (VMDB) for ORU & components, Maintenance & Analysis Data Set (MADS) for operation hours and other pertinent data, & Hardware History Retrieval System (HHRS) for unit weights. MaRS is populated using a Visual Basic Application. Once populated, the Excel spreadsheet contains information on ISS components including: operation hours, random/nonrandom failures, software/hardware failures, quantity, orbital replaceable units (ORU), date of placement, unit weight, frequency of part, etc. The motivation for creating such a database is the development of a mass/reliability parametric model to estimate mass required for replacement parts. Once complete, engineers working on future space flight missions will have access to mean-time-to-failure data on parts along with their mass; this will be used to make proper decisions for long-duration space flight missions.
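A minimal sketch of the kind of merge MaRS performs is shown below: joining operating hours, failure counts, and unit mass by part number and deriving a mean time between failures. The CSV layouts and field names are assumptions for illustration; the actual PART, VMDB, MADS, and HHRS schemas are not given in the abstract.

```python
import csv

def load_column(path, key_field, value_field, cast=float):
    """Read one source file into {part_number: value}."""
    with open(path, newline="") as f:
        return {row[key_field]: cast(row[value_field]) for row in csv.DictReader(f)}

def build_mars_table(hours_csv, failures_csv, weights_csv):
    """Merge operating hours, failure counts, and unit mass per part number.
    File layouts and field names are assumptions for illustration, not the
    actual PART/VMDB/MADS/HHRS schemas."""
    hours = load_column(hours_csv, "part_no", "operating_hours")
    failures = load_column(failures_csv, "part_no", "failure_count", int)
    weights = load_column(weights_csv, "part_no", "unit_mass_kg")
    table = {}
    for part in hours.keys() & weights.keys():
        n_fail = failures.get(part, 0)
        mtbf = hours[part] / n_fail if n_fail else None   # None = no failures observed
        table[part] = {"unit_mass_kg": weights[part],
                       "operating_hours": hours[part],
                       "failures": n_fail,
                       "mtbf_hours": mtbf}
    return table
```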
Amaya, N; Yan, S; Channegowda, M; Rofoee, B R; Shu, Y; Rashidi, M; Ou, Y; Hugues-Salas, E; Zervas, G; Nejabati, R; Simeonidou, D; Puttnam, B J; Klaus, W; Sakaguchi, J; Miyazawa, T; Awaji, Y; Harai, H; Wada, N
2014-02-10
We present results from the first demonstration of a fully integrated SDN-controlled bandwidth-flexible and programmable SDM optical network utilizing sliceable self-homodyne spatial superchannels to support dynamic bandwidth and QoT provisioning, infrastructure slicing and isolation. Results show that SDN is a suitable control plane solution for the high-capacity flexible SDM network. It is able to provision end-to-end bandwidth and QoT requests according to user requirements, considering the unique characteristics of the underlying SDM infrastructure.
NASA Technical Reports Server (NTRS)
Johnson, David W.
1991-01-01
The purpose was to study how manpower and projects are planned at the Facilities Engineering Division (FENGD) within the Systems Engineering and Operations Directorate of the LaRC and to make recommendations for improving the effectiveness and productivity of the tools that are used. The existing manpower and project planning processes (including the management plan for the FENGD, existing manpower planning reports, project reporting to LaRC and NASA Headquarters, employee time reporting, financial reporting, and coordination/tracking reports for procurement) were discussed with several people, and project planning software was evaluated.
New space sensor and mesoscale data analysis
NASA Technical Reports Server (NTRS)
Hickey, John S.
1987-01-01
The developed Earth Science and Application Division (ESAD) system/software provides the research scientist with the following capabilities: an extensive data base management capability to convert various experiment data types into a standard format; an interactive analysis and display package (AVE80); an interactive imaging/color graphics capability utilizing the Apple III and IBM PC workstations integrated into the ESAD computer system; and local and remote smart-terminal capability which provides color video, graphics, and Laserjet output. Recommendations for updating and enhancing the performance of the ESAD computer system are listed.
International interface design for Space Station Freedom - Challenges and solutions
NASA Technical Reports Server (NTRS)
Mayo, Richard E.; Bolton, Gordon R.; Laurini, Daniele
1988-01-01
The definition of interfaces for the International Space Station is discussed, with a focus on negotiations between NASA and ESA. The program organization and division of responsibilities for the Space Station are outlined; the basic features of physical and functional interfaces are described; and particular attention is given to the interface management and documentation procedures, architectural control elements, interface implementation and verification, and examples of Columbus interface solutions (including mechanical, ECLSS, thermal-control, electrical, data-management, standardized user, and software interfaces). Diagrams, drawings, graphs, and tables listing interface types are provided.
Parallel Event Analysis Under Unix
NASA Astrophysics Data System (ADS)
Looney, S.; Nilsson, B. S.; Oest, T.; Pettersson, T.; Ranjard, F.; Thibonnier, J.-P.
The ALEPH experiment at LEP, the CERN CN division and Digital Equipment Corp. have, in a joint project, developed a parallel event analysis system. The parallel physics code is identical to ALEPH's standard analysis code, ALPHA, only the organisation of input/output is changed. The user may switch between sequential and parallel processing by simply changing one input "card". The initial implementation runs on an 8-node DEC 3000/400 farm, using the PVM software, and exhibits a near-perfect speed-up linearity, reducing the turn-around time by a factor of 8.
National geochemical data base; PLUTO geochemical data base for the United States
Baedecker, Philip A.; Grossman, Jeffrey N.; Buttleman, Kim P.
1998-01-01
The PLUTO CD-ROM data base contains inorganic geochemical data obtained by the analytical laboratories of the Geologic Division of the U.S. Geological Survey (USGS) for the United States, including Hawaii and Alaska, in support of USGS program activities requiring chemical data. This CD-ROM was produced in accordance with the ISO 9660 standard and can be accessed by any computer system that has the appropriate software to read the ISO 9660 discs; however, the disc is intended for use in a DOS environment.
Using a Geographic Information System to Improve Childhood Lead-Screening Efforts
2013-01-01
The Idaho Division of Public Health conducted a pilot study to produce a lead-exposure–risk map to help local and state agencies better target childhood lead-screening efforts. Priority lead-screening areas, at the block group level, were created by using county tax assessor data and geographic information system software. A series of maps were produced, indicating childhood lead-screening prevalence in areas in which there was high potential for exposure to lead. These maps could enable development of more systematically targeted and cost-effective childhood lead-screening efforts. PMID:23764346
Pulse Code Modulation (PCM) encoder handbook for Aydin Vector MMP-600 series system
NASA Technical Reports Server (NTRS)
Currier, S. F.; Powell, W. R.
1986-01-01
The hardware and software characteristics of a time division multiplex system are described. The system is used to sample analog and digital data. The data is merged with synchronization information to produce a serial pulse coded modulation (PCM) bit stream. Information presented herein is required by users to design compatible interfaces and assure effective utilization of this encoder system. GSFC/Wallops Flight Facility has flown approximately 50 of these systems through 1984 on sounding rockets with no inflight failures. Aydin Vector manufactures all of the components for these systems.
Software Engineering Laboratory (SEL) Ada performance study report
NASA Technical Reports Server (NTRS)
Booth, Eric W.; Stark, Michael E.
1991-01-01
The goals of the Ada Performance Study are described. The methods used are explained. Guidelines for future Ada development efforts are given. The goals and scope of the study are detailed, and the background of Ada development in the Flight Dynamics Division (FDD) is presented. The organization and overall purpose of each test are discussed. The purpose, methods, and results of each test and analyses of these results are given. Guidelines for future development efforts based on the analysis of results from this study are provided. The approach used on the performance tests is discussed.
Quality assurance plan for discharge measurements using broadband acoustic Doppler current profilers
Lipscomb, S.W.
1995-01-01
The recent introduction of the Acoustic Doppler Current Profiler (ADCP) as an instrument for measuring velocities and discharge in the riverine and estuarine environment promises to revolutionize the way these data are collected by the U.S. Geological Survey. The ADCP and associated software, however, compose a complex system and should be used only by qualified personnel. Standard procedures should be rigorously followed to ensure that the quality of data collected is commensurate with the standards set by the Water Resources Division for all its varied activities in hydrologic investigations.
The Future is Hera! Analyzing Astronomical Data Over the Internet
NASA Technical Reports Server (NTRS)
Valencic, L. A.; Chai, P.; Pence, W.; Shafer, R.; Snowden, S.
2008-01-01
Hera is the data processing facility provided by the High Energy Astrophysics Science Archive Research Center (HEASARC) at the NASA Goddard Space Flight Center for analyzing astronomical data. Hera provides all the pre-installed software packages, local disk space, and computing resources needed to do general processing of FITS format data files residing on the user's local computer, and to do research using the publicly available data from the High Energy Astrophysics Division. Qualified students, educators and researchers may freely use the Hera services over the internet for research and educational purposes.
Mazerolle, Stephanie M; Bruening, Jennifer E; Casa, Douglas J
2008-01-01
Work-family conflict (WFC) involves discord that arises when the demands of work interfere with the demands of family or home life. Long work hours, minimal control over work schedules, and time spent away from home are antecedents to WFC. To date, few authors have examined work-family conflict within the athletic training profession. To investigate the occurrence of WFC in certified athletic trainers (ATs) and to identify roots and factors leading to quality-of-life issues for ATs working in the National Collegiate Athletic Association Division I-A setting. Survey questionnaire and follow-up, in-depth, in-person interviews. Division I-A universities sponsoring football. A total of 587 ATs (324 men, 263 women) responded to the questionnaire. Twelve ATs (6 men, 6 women) participated in the qualitative portion: 2 head ATs, 4 assistant ATs, 4 graduate assistant ATs, and 2 AT program directors. Multiple regression analysis was performed to determine whether workload and travel predicted levels of WFC. Analyses of variance were calculated to investigate differences among the factors of sex, marital status, and family status. Interviews were transcribed verbatim and then analyzed using computer software as well as member checks and peer debriefing. The triangulation of the data collection and multiple sources of qualitative analysis were utilized to limit potential researcher prejudices. Regression analyses revealed that long work hours and travel directly contributed to WFC. In addition to long hours and travel, inflexible work schedules and staffing patterns were discussed by the interview participants as antecedents to WFC. Regardless of sex (P = .142), marital status (P = .687), family status (P = .055), or age of children (P = .633), WFC affected Division I-A ATs. No matter their marital or family status, ATs employed at the Division I-A level experienced difficulties balancing their work and home lives. Sources of conflict primarily stemmed from the consuming nature of the profession, travel, inflexible work schedules, and lack of full-time staff members.
The NASA Mission Operations and Control Architecture Program
NASA Technical Reports Server (NTRS)
Ondrus, Paul J.; Carper, Richard D.; Jeffries, Alan J.
1994-01-01
The conflict between increases in space mission complexity and rapidly declining space mission budgets has created strong pressures to radically reduce the costs of designing and operating spacecraft. A key approach to achieving such reductions is through reducing the development and operations costs of the supporting mission operations systems. One of the efforts which the Communications and Data Systems Division at NASA Headquarters is using to meet this challenge is the Mission Operations Control Architecture (MOCA) project. Technical direction of this effort has been delegated to the Mission Operations Division (MOD) of the Goddard Space Flight Center (GSFC). MOCA is to develop a mission control and data acquisition architecture, and supporting standards, to guide the development of future spacecraft and mission control facilities at GSFC. The architecture will reduce the need for around-the-clock operations staffing, obtain a high level of reuse of flight and ground software elements from mission to mission, and increase overall system flexibility by enabling the migration of appropriate functions from the ground to the spacecraft. The end results are to be an established way of designing the spacecraft-ground system interface for GSFC's in-house developed spacecraft, and a specification of the end to end spacecraft control process, including data structures, interfaces, and protocols, suitable for inclusion in solicitation documents for future flight spacecraft. A flight software kernel may be developed and maintained in a condition that it can be offered as Government Furnished Equipment in solicitations. This paper describes the MOCA project, its current status, and the results to date.
Interaction design challenges and solutions for ALMA operations monitoring and control
NASA Astrophysics Data System (ADS)
Pietriga, Emmanuel; Cubaud, Pierre; Schwarz, Joseph; Primet, Romain; Schilling, Marcus; Barkats, Denis; Barrios, Emilio; Vila Vilaro, Baltasar
2012-09-01
The ALMA radio-telescope, currently under construction in northern Chile, is a very advanced instrument that presents numerous challenges. From a software perspective, one critical issue is the design of graphical user interfaces for operations monitoring and control that scale to the complexity of the system and to the massive amounts of data users are faced with. Early experience operating the telescope with only a few antennas has shown that conventional user interface technologies are not adequate in this context. They consume too much screen real-estate, require many unnecessary interactions to access relevant information, and fail to provide operators and astronomers with a clear mental map of the instrument. They increase extraneous cognitive load, impeding tasks that call for quick diagnosis and action. To address this challenge, the ALMA software division adopted a user-centered design approach. For the last two years, astronomers, operators, software engineers and human-computer interaction researchers have been involved in participatory design workshops, with the aim of designing better user interfaces based on state-of-the-art visualization techniques. This paper describes the process that led to the development of those interface components and to a proposal for the science and operations console setup: brainstorming sessions, rapid prototyping, joint implementation work involving software engineers and human-computer interaction researchers, feedback collection from a broader range of users, further iterations and testing.
NASA Astrophysics Data System (ADS)
Richter, Dale A.; Higdon, N. S.; Ponsardin, Patrick L.; Sanchez, David; Chyba, Thomas H.; Temple, Doyle A.; Gong, Wei; Battle, Russell; Edmondson, Mika; Futrell, Anne; Harper, David; Haughton, Lincoln; Johnson, Demetra; Lewis, Kyle; Payne-Baggott, Renee S.
2002-01-01
ITT's Advanced Engineering and Sciences Division and the Hampton University Center for Lidar and Atmospheric Sciences Students (CLASS) team have worked closely to design, fabricate and test an eye-safe, scanning aerosol-lidar system that can be safely deployed and used by students from a variety of disciplines. CLASS is a 5-year undergraduate research training program funded by NASA to provide hands-on atmospheric-science and lidar-technology education. The system is based on a 1.5 micron, 125 mJ, 20 Hz eye-safe optical parametric oscillator (OPO) and will be used by the HU researchers and students to evaluate the biological impact of aerosols, clouds, and pollution. The system design tasks we addressed cover a variety of systems issues, including the development of software to calculate eye-safety levels and to model lidar performance, implementation of eye-safety features in the lidar transmitter, optimization of the receiver using optical ray tracing software, evaluation of detectors and amplifiers in the near IR, testing of OPO and receiver technology, and development of hardware and software for laser and scanner control and video display of the scan region.
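The performance-modeling software mentioned above presumably evaluates some form of the single-scattering elastic lidar equation. The sketch below implements that textbook equation for a uniform atmosphere; all parameter values are illustrative assumptions and are not those of the CLASS instrument.

```python
import math

def lidar_return_power(range_m, pulse_energy_j=0.125, pulse_len_s=10e-9,
                       aperture_m2=0.03, efficiency=0.3,
                       backscatter=1e-6, extinction=1e-4):
    """Single-scattering elastic lidar equation for a uniform atmosphere:
    P(R) = P0 * (c*tau/2) * eta * A/R^2 * beta * exp(-2*alpha*R).
    Parameter values are illustrative, not those of the CLASS system."""
    c = 3.0e8
    p0 = pulse_energy_j / pulse_len_s            # peak transmitted power [W]
    geometric = efficiency * aperture_m2 / range_m**2
    return (p0 * (c * pulse_len_s / 2.0) * geometric * backscatter
            * math.exp(-2.0 * extinction * range_m))

for r in (500.0, 1000.0, 2000.0):
    print(f"{r:6.0f} m : {lidar_return_power(r):.3e} W")
```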
Wu, Xin; Liu, Guo-yuan; Jiang, Yong-lian
2015-10-01
To investigate the differences in anchorage effects between micro-implants and J hook in treating patients with Class II division 1 maxillary protrusion. Thirty-one cases of adult patients with Class II division 1 maxillary protrusion were treated. They were divided into 2 groups depending on their selection. The first group included 17 patients for micro-implant anchorage, who adopted micro-implant and sliding mechanism to close maxillary extraction space and depress the mandibular molar. The second group encompassed 14 cases for J hook, who adopted sliding mechanism, J hooks in high traction and Class II intermaxillary traction to close extraction space. X-ray lateral cephalometric radiographs were measured before and after treatment, and SPSS16.0 software package was employed to compare the differences in soft and hard tissue changes before and after treatment between 2 groups. There were statistically significant differences in SNB, ANB, MP-FH, U1-Y, U6-Y, L6-MP, NLA, and UL-Y between the 2 groups before and after treatment, while there was no significant difference in SNA, U1-SN, U1-X, and U6-X between the 2 groups. In treating patients with Class II division 1 maxillary protrusion, micro-implant has stronger anchorage effects than J hook, while at the same time depressing the mandibular molars, and making it more favorable to improve Class II faces.
NASA Technical Reports Server (NTRS)
Phillips, Veronica J.
2017-01-01
The Ames Engineering Directorate is the principal engineering organization supporting aerospace systems and spaceflight projects at NASA's Ames Research Center in California's Silicon Valley. The Directorate supports all phases of engineering and project management for flight and mission projects-from R&D to Close-out-by leveraging the capabilities of multiple divisions and facilities.The Mission Design Center (MDC) has full end-to-end mission design capability with sophisticated analysis and simulation tools in a collaborative concurrent design environment. Services include concept maturity level (CML) maturation, spacecraft design and trades, scientific instruments selection, feasibility assessments, and proposal support and partnerships. The Engineering Systems Division provides robust project management support as well as systems engineering, mechanical and electrical analysis and design, technical authority and project integration support to a variety of programs and projects across NASA centers. The Applied Manufacturing Division turns abstract ideas into tangible hardware for aeronautics, spaceflight and science applications, specializing in fabrication methods and management of complex fabrication projects. The Engineering Evaluation Lab (EEL) provides full satellite or payload environmental testing services including vibration, temperature, humidity, immersion, pressure/altitude, vacuum, high G centrifuge, shock impact testing and the Flight Processing Center (FPC), which includes cleanrooms, bonded stores and flight preparation resources. The Multi-Mission Operations Center (MMOC) is composed of the facilities, networks, IT equipment, software and support services needed by flight projects to effectively and efficiently perform all mission functions, including planning, scheduling, command, telemetry processing and science analysis.
Microcosm to Cosmos: The Growth of a Divisional Computer Network
Johannes, R.S.; Kahane, Stephen N.
1987-01-01
In 1982, we reported the deployment of a network of microcomputers in the Division of Gastroenterology[1]. This network was based upon Corvus Systems Omninet®. Corvus was one of the very first firms to offer networking products for PC's. This PC development occurred coincident with the planning phase of the Johns Hopkins Hospital's multisegment ethernet project. A rich communications infrastructure is now in place at the Johns Hopkins Medical Institutions[2,3]. Shortly after the hospital development under the direction of the Operational and Clinical Systems Division (OCS) began, the Johns Hopkins School of Medicine began an Integrated Academic Information Management Systems (IAIMS) planning effort. We now present a model that uses aspects of all three planning efforts (PC networks, Hospital Information Systems & IAIMS) to build a divisional computing facility. This facility is viewed as a terminal leaf on the institutional network diagram. Nevertheless, it is noteworthy that this leaf, the divisional resource in the Division of Gastroenterology (GASNET), has a rich substructure and functionality of its own, perhaps revealing the recursive nature of network architecture. The current status, design and function of the GASNET computational facility is discussed. Among the major positive aspects of this design are the sharing and centralization of MS-DOS software, the high-speed DOS/Unix link that makes available most of our institution's computing resources.
Influence of stapling the intersegmental planes on lung volume and function after segmentectomy.
Tao, Hiroyuki; Tanaka, Toshiki; Hayashi, Tatsuro; Yoshida, Kumiko; Furukawa, Masashi; Yoshiyama, Koichi; Okabe, Kazunori
2016-10-01
Dividing the intersegmental planes with a stapler during pulmonary segmentectomy leads to volume loss in the remnant segment. The aim of this study was to assess the influence of segment division methods on preserved lung volume and pulmonary function after segmentectomy. Using image analysis software on computed tomography (CT) images of 41 patients, the ratio of remnant segment and ipsilateral lung volume to their preoperative values (R-seg and R-ips) was calculated. The ratio of postoperative actual forced expiratory volume in 1 s (FEV1) and forced vital capacity (FVC) per those predicted values based on three-dimensional volumetry (R-FEV1 and R-FVC) was also calculated. Differences in actual/predicted ratios of lung volume and pulmonary function for each of the division methods were analysed. We also investigated the correlations of the actual/predicted ratio of remnant lung volume with that of postoperative pulmonary function. The intersegmental planes were divided by either electrocautery or with a stapler in 22 patients and with a stapler alone in 19 patients. Mean values of R-seg and R-ips were 82.7 (37.9-140.2) and 104.9 (77.5-129.2)%, respectively. The mean values of R-FEV1 and R-FVC were 103.9 (83.7-135.1) and 103.4 (82.2-125.1)%, respectively. There were no correlations between the actual/predicted ratio of remnant lung volume and pulmonary function based on the division method. Both R-FEV1 and R-FVC were correlated not with R-seg, but with R-ips. Stapling does not lead to less preserved volume or function than electrocautery in the division of the intersegmental planes. © The Author 2016. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
A reconfigurable multicarrier demodulator architecture
NASA Technical Reports Server (NTRS)
Kwatra, S. C.; Jamali, M. M.
1991-01-01
An architecture based on parallel and pipeline design approaches has been developed for the Frequency Division Multiple Access/Time Domain Multiplexed (FDMA/TDM) conversion system. The architecture has two main modules, namely the transmultiplexer and the demodulator. The transmultiplexer has two pipelined modules. These are the shared multiplexed polyphase filter and the Fast Fourier Transform (FFT). The demodulator consists of carrier, clock, and data recovery modules which are interactive. Progress on the design of the MultiCarrier Demodulator (MCD) using commercially available chips and Application Specific Integrated Circuits (ASICs) and simulation studies using Viewlogic software will be presented at the conference.
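The transmultiplexer pairs a shared polyphase filter with an FFT. The sketch below is a software illustration of that filter-bank idea, a critically sampled polyphase/FFT channelizer, and is not a model of the MCD hardware; the prototype filter length, channel count, and test tone are arbitrary choices for the example.

```python
import numpy as np

def polyphase_channelizer(x, M, taps_per_branch=8):
    """Critically sampled polyphase/FFT channelizer (illustrative sketch of the
    shared-filter + FFT transmultiplexer idea, not the MCD hardware design).
    Splits x into M baseband channel outputs, each decimated by M."""
    L = M * taps_per_branch
    n = np.arange(L)
    h = np.sinc((n - (L - 1) / 2) / M) * np.hamming(L)   # prototype lowpass, cutoff ~pi/M
    h /= h.sum()
    n_out = (len(x) - L) // M + 1
    Y = np.zeros((M, n_out), dtype=complex)
    for b in range(n_out):
        n0 = b * M + L - 1                      # newest input sample used for this output
        seg = x[n0 - L + 1 : n0 + 1][::-1]      # seg[p] = x[n0 - p]
        v = np.array([np.dot(h[m::M], seg[m::M]) for m in range(M)])  # polyphase branches
        # Inverse FFT across the branches recovers the M channels
        # (a constant per-channel phase offset is ignored here).
        Y[:, b] = M * np.fft.ifft(v)
    return Y

# Example: a complex tone at the center of channel 2 in an 8-channel bank.
M = 8
t = np.arange(4096)
x = np.exp(2j * np.pi * (2 / M) * t)
Y = polyphase_channelizer(x, M)
print(int(np.argmax(np.abs(Y).mean(axis=1))))   # expected: 2
```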
Data processing and analysis for 2D imaging GEM detector system
NASA Astrophysics Data System (ADS)
Czarski, T.; Chernyshova, M.; Pozniak, K. T.; Kasprowicz, G.; Byszuk, A.; Juszczyk, B.; Kolasinski, P.; Linczuk, M.; Wojenski, A.; Zabolotny, W.; Zienkiewicz, P.
2014-11-01
The Triple Gas Electron Multiplier (T-GEM) is presented as a soft X-ray (SXR) energy and position sensitive detector for high-resolution X-ray diagnostics of magnetic confinement fusion plasmas [1]. The multi-channel measurement system and the essential data processing for X-ray energy and position recognition are considered. Several modes of data acquisition are introduced, depending on how the processing is divided between hardware and software components. Typical measurement issues are discussed with a view to enhancing data quality. Fundamental output characteristics are presented for one- and two-dimensional detector structures. Representative results for a reference X-ray source and tokamak plasma are demonstrated.
2010-08-14
Jeffrey Beyon, lower right, and Paul Joseph Petzar, right, researchers from NASA's Langley Research Center, speak with Ramesh Kakar right, of the NASA Earth Science Division as they work with DAWN Air Data Acquisition and Processing software aboard NASA's DC-8 research aircraft, Sunday, Aug. 15, 2010, in support of the GRIP experiment at Fort Lauderdale International Airport in Fort Lauderdale, Fla. The Genesis and Rapid Intensification Processes (GRIP) experiment is a NASA Earth science field experiment in 2010 that is being conducted to better understand how tropical storms form and develop into major hurricanes. Photo Credit: (NASA/Paul E. Alers)
Lineage mapper: A versatile cell and particle tracker
NASA Astrophysics Data System (ADS)
Chalfoun, Joe; Majurski, Michael; Dima, Alden; Halter, Michael; Bhadriraju, Kiran; Brady, Mary
2016-11-01
The ability to accurately track cells and particles from images is critical to many biomedical problems. To address this, we developed Lineage Mapper, an open-source tracker for time-lapse images of biological cells, colonies, and particles. Lineage Mapper tracks objects independently of the segmentation method, detects mitosis in confluence, separates cell clumps mistakenly segmented as a single cell, provides accuracy and scalability even on terabyte-sized datasets, and creates division and/or fusion lineages. Lineage Mapper has been tested and validated on multiple biological and simulated problems. The software is available in ImageJ and Matlab at isg.nist.gov.
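For readers unfamiliar with track linking, the sketch below shows the simplest possible frame-to-frame association, greedy nearest-centroid matching. It is deliberately naive and is not Lineage Mapper's algorithm, which additionally handles mitosis detection, clump separation, and fusion events.

```python
import math

def link_frames(prev_cells, next_cells, max_distance=25.0):
    """Greedy nearest-centroid linking between two frames.
    A deliberately simple illustration of track linking, not the algorithm
    used by Lineage Mapper. Cells are given as {id: (x, y)} centroids."""
    links, used = {}, set()
    # Consider candidate pairs from closest to farthest.
    pairs = sorted(
        ((math.dist(p, q), i, j)
         for i, p in prev_cells.items() for j, q in next_cells.items()),
        key=lambda t: t[0])
    for dist, i, j in pairs:
        if dist > max_distance:
            break
        if i not in links and j not in used:
            links[i] = j
            used.add(j)
    return links   # {previous-frame id: next-frame id}

print(link_frames({1: (10, 10), 2: (50, 52)},
                  {7: (12, 11), 8: (49, 50), 9: (200, 200)}))
```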
The Ames Virtual Environment Workstation: Implementation issues and requirements
NASA Technical Reports Server (NTRS)
Fisher, Scott S.; Jacoby, R.; Bryson, S.; Stone, P.; Mcdowall, I.; Bolas, M.; Dasaro, D.; Wenzel, Elizabeth M.; Coler, C.; Kerr, D.
1991-01-01
This presentation describes recent developments in the implementation of a virtual environment workstation in the Aerospace Human Factors Research Division of NASA's Ames Research Center. Introductory discussions are presented on the primary research objectives and applications of the system and on the system's current hardware and software configuration. Principal attention is then focused on unique issues and problems encountered in the workstation's development, with emphasis on its ability to meet original design specifications for computational graphics performance and for associated human factors requirements necessary to provide a compelling sense of presence and efficient interaction in the virtual environment.
TSX-PLUS MULTI-TASKING UPGRADE FOR THE NICOLET L-11 POWDER DIFFRACTION SYSTEM.
Fitzpatrick, J.; Queen, David L.
1985-01-01
In August of 1982, a single-user, dual-translator, automated powder diffraction system was purchased by the Denver Research Institute for use on project work in the Chemical and Materials Sciences Division. Within a short period of time, the system had already become saturated with users. Scheduling conflicts arose. In view of these problems, an answer was sought in the form of hardware and software changes which would allow many users access to the system simultaneously. A low-cost, minimum impact solution was eventually found. The elements of the solution are reported.
Composite Design and Manufacturing Development for Human Spacecrafts
NASA Technical Reports Server (NTRS)
Litteken, Douglas; Lowry, David
2013-01-01
The Structural Engineering Division at the NASA Johnson Space Center (JSC) has begun work on lightweight, multi-functional pressurized composite structures. The first candidate vehicle for technology development is the Multi-Mission Space Exploration Vehicle (MMSEV) cabin, known as the Gen 2B cabin, which has been built at JSC by the Robotics Division. Of the habitable MMSEV vehicle prototypes designed to date, this is the first one specifically analyzed and tested to hold internal pressure and the only one made out of composite materials. This design uses a laminate base with zoned reinforcement and external stringers, intended to demonstrate certain capabilities, and to prepare for the next cabin design, which will be a composite sandwich panel construction with multi-functional capabilities. As part of this advanced development process, a number of new technologies were used to assist in the design and manufacturing process. One of the methods, new to JSC, was to build the Gen 2B cabin with Out of Autoclave technology to permit the creation of larger parts with fewer joints. An 8-ply pre-preg layup was constructed to form the cabin body. Prior to lay-up, a design optimization software called FiberSIM was used to create each ply pattern. This software is integrated with Pro/Engineer to allow for customized draping of each fabric ply over the complex tool surface. Slits and darts are made in the software model to create an optimal design that maintains proper fiber placement and orientation. The flat pattern of each ply is then exported and sent to an automated cutting table where the patterns are cut out of graphite material. Additionally, to assist in lay-up, a laser projection system (LPT) is used to project outlines of each ply directly onto the tool face for accurate fiber placement and ply build-up. Finally, as part of the OoA process, a large oven was procured to post-cure each part. After manufacturing complete, the cabin underwent modal and pressure testing (currently in progress at date of writing) and will go on to be outfitted and used for further ops usage.
Software defined multi-OLT passive optical network for flexible traffic allocation
NASA Astrophysics Data System (ADS)
Zhang, Shizong; Gu, Rentao; Ji, Yuefeng; Zhang, Jiawei; Li, Hui
2016-10-01
With the rapid growth of 4G mobile networks and vehicular network services, mobile terminal users have an increasing demand for data sharing among different radio remote units (RRUs) and roadside units (RSUs). Meanwhile, commercial video-streaming and video/voice conference applications delivered through peer-to-peer (P2P) technology keep stimulating the sharp increase of bandwidth demand from both business and residential subscribers. However, a significant issue is that, although wavelength division multiplexing (WDM) and orthogonal frequency division multiplexing (OFDM) technology have been proposed to fulfil the ever-increasing bandwidth demand in the access network, the bandwidth of optical fiber is not unlimited due to the restrictions of optical component properties and modulation/demodulation technology, and blindly adding wavelengths cannot meet the cost-sensitive nature of the access network. In this paper, we propose a software defined multi-OLT PON architecture to support efficient scheduling of access network traffic. By introducing software defined networking technology and a wavelength selective switch into the TWDM PON central office, multiple OLTs can be treated as a shared bandwidth resource pool that supports flexible traffic allocation for optical network units (ONUs). Moreover, under the configuration of the control plane, ONUs can change their affiliation between different OLTs under different traffic situations, so inter-OLT traffic can be localized and the data exchange pressure on the core network can be relieved. Because the architecture is designed to follow the TWDM PON specification as closely as possible, the existing optical distribution network (ODN) investment can be preserved and conventional EPON/GPON equipment remains compatible with the proposed architecture. Furthermore, based on this architecture, we propose a dynamic wavelength scheduling algorithm, which can be deployed as an application on the control plane and effectively schedules wavelength resources between different OLTs according to the traffic situation. Simulation results show that, by using the scheduling algorithm, network traffic between different OLTs can be optimized effectively, and the wavelength utilization of the multi-OLT system can be improved due to the flexible wavelength scheduling.
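The dynamic wavelength scheduling algorithm itself is not reproduced in the abstract. The following Python sketch shows one plausible control-plane policy of this kind, greedily handing ONUs from congested OLTs to the least-loaded OLT in the shared pool; the function names, capacities, and the 0.8 congestion threshold are illustrative assumptions, not the authors' algorithm.

    def rebalance(onus, olts, threshold=0.8):
        # onus: dict onu_id -> (serving_olt, demand_gbps); olts: dict olt_id -> capacity_gbps.
        # Returns a list of (onu, source_olt, target_olt) handover decisions.
        load = {o: 0.0 for o in olts}
        for onu, (olt, demand) in onus.items():
            load[olt] += demand
        moves = []
        for onu, (olt, demand) in sorted(onus.items(), key=lambda kv: -kv[1][1]):
            if load[olt] / olts[olt] <= threshold:
                continue                              # serving OLT is not congested
            target = min(olts, key=lambda o: load[o] / olts[o])
            if target != olt and (load[target] + demand) / olts[target] < threshold:
                load[olt] -= demand                   # hand the ONU over to the target OLT
                load[target] += demand
                moves.append((onu, olt, target))
        return moves

    onus = {"onu1": ("olt_a", 3.0), "onu2": ("olt_a", 4.0), "onu3": ("olt_b", 1.0)}
    olts = {"olt_a": 8.0, "olt_b": 8.0}
    print(rebalance(onus, olts))    # -> [('onu2', 'olt_a', 'olt_b')]

In a real deployment such a rule would run periodically on the SDN controller and be translated into wavelength-tuning commands for the affected ONUs.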
2012-11-08
S48-E-013 (15 Sept 1991) --- The Upper Atmosphere Research Satellite (UARS) in the payload bay of the earth-orbiting Discovery. UARS is scheduled for deployment on flight day three of the STS-48 mission. Data from UARS will enable scientists to study ozone depletion in the stratosphere, or upper atmosphere. This image was transmitted by the Electronic Still Camera (ESC), Development Test Objective (DTO) 648. The ESC is making its initial appearance on a Space Shuttle flight. Electronic still photography is a new technology that enables a camera to electronically capture and digitize an image with resolution approaching film quality. The digital image is stored on removable hard disks or small optical disks, and can be converted to a format suitable for downlink transmission or enhanced using image processing software. The Electronic Still Camera (ESC) was developed by the Man-Systems Division at the Johnson Space Center and is the first model in a planned evolutionary development leading to a family of high-resolution digital imaging devices. H. Don Yeates, JSC's Man-Systems Division, is program manager for the ESC. THIS IS A SECOND GENERATION PRINT MADE FROM AN ELECTRONICALLY PRODUCED NEGATIVE.
Load management as a smart grid concept for sizing and designing of hybrid renewable energy systems
NASA Astrophysics Data System (ADS)
Eltamaly, Ali M.; Mohamed, Mohamed A.; Al-Saud, M. S.; Alolah, Abdulrahman I.
2017-10-01
Optimal sizing of hybrid renewable energy systems (HRES) to satisfy load requirements with the highest reliability and lowest cost is a crucial step in building HRESs to supply electricity to remote areas. Applying smart grid concepts such as load management can reduce the size of HRES components and reduce the cost of generated energy considerably. In this article, sizing of HRES is carried out by dividing the load into high- and low-priority parts. The proposed system is formed by a photovoltaic array, wind turbines, batteries, fuel cells and a diesel generator as a back-up energy source. A smart particle swarm optimization (PSO) algorithm using MATLAB is introduced to determine the optimal size of the HRES. The simulation was carried out with and without division of the load to compare these concepts. HOMER software was also used to simulate the proposed system without dividing the loads to verify the results obtained from the proposed PSO algorithm. The results show that the percentage of division of the load is inversely proportional to the cost of the generated energy.
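The abstract does not list the PSO details; the short Python sketch below illustrates the general particle swarm optimization loop for a three-component sizing vector. The cost function, component prices, energy yields, and the unmet-load penalty are placeholder assumptions, not values from the study (which was implemented in MATLAB and verified against HOMER).

    import random

    def cost(x):
        n_pv, n_wt, n_batt = x
        capital = 300 * n_pv + 1200 * n_wt + 150 * n_batt    # illustrative unit costs
        supplied = 1.5 * n_pv + 4.0 * n_wt + 0.5 * n_batt     # illustrative kWh/day
        return capital + max(0.0, 100.0 - supplied) * 1e4     # penalize unmet high-priority load

    def pso(dim=3, n_particles=30, iters=200, lo=0.0, hi=200.0):
        pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
        vel = [[0.0] * dim for _ in range(n_particles)]
        pbest = [p[:] for p in pos]
        gbest = min(pbest, key=cost)
        for _ in range(iters):
            for i in range(n_particles):
                for d in range(dim):
                    r1, r2 = random.random(), random.random()
                    vel[i][d] = (0.7 * vel[i][d]
                                 + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                                 + 1.5 * r2 * (gbest[d] - pos[i][d]))
                    pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
                if cost(pos[i]) < cost(pbest[i]):
                    pbest[i] = pos[i][:]
            gbest = min(pbest + [gbest], key=cost)
        return gbest

    print(pso())    # best [n_pv, n_wt, n_batt] found for the toy cost function

Splitting the load into high- and low-priority parts would enter through the cost function, for example by penalizing only unmet high-priority demand.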
W-band radio-over-fiber propagation of two optically encoded wavelength channels
NASA Astrophysics Data System (ADS)
Eghbal, Morad Khosravi; Shadaram, Mehdi
2018-01-01
We propose a W-band wavelength-division multiplexing (WDM)-over-optical code-division multiple access radio-over-fiber system. This system offers capacity expansion by increasing the working frequency to the millimeter wave region and by introducing optical encoding and multiwavelength multiplexing. The system's functionality is investigated by software modeling, and the results are presented. The generated signals are data modulated at 10 Gb/s, optically encoded for two wavelength channels, and transmitted over a 20-km length of fiber. The received signals are optically decoded and detected. Also, encoding has improved the bit error rate (BER) versus the received optical power margin for the WDM setting by about 4 dB. In addition, the eye diagram shows that the difference between received optical power levels at BERs of 10^-12 to 10^-3 is about 1.3% between the two encoded channels. This method of capacity improvement is particularly important for the next generation of mobile communication, where millimeter wave signals will be widely used to deliver data to small cells.
Research on distributed optical fiber sensing data processing method based on LabVIEW
NASA Astrophysics Data System (ADS)
Li, Zhonghu; Yang, Meifang; Wang, Luling; Wang, Jinming; Yan, Junhong; Zuo, Jing
2018-01-01
The pipeline leak detection and leak location problem has received extensive attention in industry. In this paper, a distributed optical fiber sensing system is designed for a heat supply pipeline. The data processing method of distributed optical fiber sensing based on LabVIEW is studied in detail. The hardware system includes a laser, sensing optical fiber, wavelength division multiplexer, photoelectric detector, data acquisition card, and computer. The software system is developed using LabVIEW. The software system adopts a wavelet denoising method to process the temperature information, which improves the SNR. By extracting the characteristic value of the fiber temperature information, the system can realize temperature measurement, leak location, and measurement signal storage and query. Compared with the traditional negative pressure wave or acoustic signal methods, the distributed optical fiber temperature measuring system can measure several temperatures in one measurement and locate the leak point accurately. It has broad application prospects.
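The paper implements its processing chain in LabVIEW; the Python/PyWavelets sketch below only illustrates the generic wavelet soft-threshold denoising step on a synthetic temperature trace. The 'db4' wavelet, decomposition level, and universal threshold are assumptions for the example.

    import numpy as np
    import pywt

    def denoise(signal, wavelet="db4", level=4):
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate from finest details
        thr = sigma * np.sqrt(2 * np.log(len(signal)))        # universal threshold
        coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
        return pywt.waverec(coeffs, wavelet)[: len(signal)]

    # synthetic example: a localized hot spot (leak) buried in sensor noise
    x = np.linspace(0, 1000, 4000)                            # position along the fiber, m
    trace = 20 + 5 * np.exp(-((x - 600) ** 2) / 50) + np.random.normal(0, 1, x.size)
    clean = denoise(trace)
    print("estimated leak position (m):", x[np.argmax(clean)])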
Simscape Modeling Verification in the Simulink Development Environment
NASA Technical Reports Server (NTRS)
Volle, Christopher E. E.
2014-01-01
The purpose of the Simulation Product Group of the Control and Data Systems division of the NASA Engineering branch at Kennedy Space Center is to provide a real-time model and simulation of the Ground Subsystems participating in vehicle launching activities. The simulation software is part of the Spaceport Command and Control System (SCCS) and is designed to support integrated launch operation software verification and console operator training. Using MathWorks Simulink tools, modeling engineers currently build models from custom-built blocks to accurately represent ground hardware. This is time consuming and costly because of the rigorous testing and peer reviews required for each custom-built block. Using MathWorks Simscape tools, modeling time can be reduced since no custom code would need to be developed. After careful research, the group concluded that it is feasible to use Simscape blocks in MATLAB's Simulink. My project this fall was to verify the accuracy of the Crew Access Arm model developed using Simscape tools running in the Simulink development environment.
Activity Catalog Tool (ACT) user manual, version 2.0
NASA Technical Reports Server (NTRS)
Segal, Leon D.; Andre, Anthony D.
1994-01-01
This report comprises the user manual for version 2.0 of the Activity Catalog Tool (ACT) software program, developed by Leon D. Segal and Anthony D. Andre in cooperation with NASA Ames Aerospace Human Factors Research Division, FLR branch. ACT is a software tool for recording and analyzing sequences of activity over time that runs on the Macintosh platform. It was designed as an aid for professionals who are interested in observing and understanding human behavior in field settings, or from video or audio recordings of the same. Specifically, the program is aimed at two primary areas of interest: human-machine interactions and interactions between humans. The program provides a means by which an observer can record an observed sequence of events, logging such parameters as frequency and duration of particular events. The program goes further by providing the user with a quantified description of the observed sequence, through application of a basic set of statistical routines, and enables merging and appending of several files and more extensive analysis of the resultant data.
Technology for national asset storage systems
NASA Technical Reports Server (NTRS)
Coyne, Robert A.; Hulen, Harry; Watson, Richard
1993-01-01
An industry-led collaborative project, called the National Storage Laboratory, was organized to investigate technology for storage systems that will be the future repositories for our national information assets. Industry participants are IBM Federal Systems Company, Ampex Recording Systems Corporation, General Atomics DISCOS Division, IBM ADSTAR, Maximum Strategy Corporation, Network Systems Corporation, and Zitel Corporation. Industry members of the collaborative project are funding their own participation. Lawrence Livermore National Laboratory through its National Energy Research Supercomputer Center (NERSC) will participate in the project as the operational site and the provider of applications. The expected result is an evaluation of a high performance storage architecture assembled from commercially available hardware and software, with some software enhancements to meet the project's goals. It is anticipated that the integrated testbed system will represent a significant advance in the technology for distributed storage systems capable of handling gigabyte class files at gigabit-per-second data rates. The National Storage Laboratory was officially launched on 27 May 1992.
Concurrent and Accurate Short Read Mapping on Multicore Processors.
Martínez, Héctor; Tárraga, Joaquín; Medina, Ignacio; Barrachina, Sergio; Castillo, Maribel; Dopazo, Joaquín; Quintana-Ortí, Enrique S
2015-01-01
We introduce a parallel aligner with a work-flow organization for fast and accurate mapping of RNA sequences on servers equipped with multicore processors. Our software, HPG Aligner SA (an open-source application available at http://www.opencb.org), exploits a suffix array to rapidly map a large fraction of the RNA fragments (reads), and leverages the accuracy of the Smith-Waterman algorithm to deal with conflictive reads. The aligner is enhanced with a careful strategy to detect splice junctions based on an adaptive division of RNA reads into small segments (or seeds), which are then mapped onto a number of candidate alignment locations, providing crucial information for the successful alignment of the complete reads. The experimental results on a platform with Intel multicore technology report the parallel performance of HPG Aligner SA, on RNA reads of 100-400 nucleotides, which excels in execution time and sensitivity compared with state-of-the-art aligners such as TopHat 2+Bowtie 2, MapSplice, and STAR.
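As a conceptual illustration of the seed-based strategy described above (and not HPG Aligner SA itself), the Python sketch below splits a read into fixed-length seeds, looks each seed up in a toy k-mer index standing in for the suffix array, and votes for candidate start positions of the whole read; reads without a convincing candidate would then fall back to Smith-Waterman alignment. The reference, read, and seed length are made up.

    from collections import defaultdict

    def build_index(reference, k=8):
        index = defaultdict(list)
        for i in range(len(reference) - k + 1):
            index[reference[i:i + k]].append(i)
        return index

    def candidate_positions(read, index, k=8):
        votes = defaultdict(int)
        for offset in range(0, len(read) - k + 1, k):      # division of the read into seeds
            for pos in index.get(read[offset:offset + k], []):
                votes[pos - offset] += 1                    # implied start of the whole read
        return sorted(votes.items(), key=lambda kv: -kv[1])

    ref = "ACGTACGTTTGACCATGACGTTAGGCATTACGATCGATCGGGTACCAGT"
    idx = build_index(ref)
    print(candidate_positions("TTGACCATGACGTTAGGCAT", idx))   # -> [(8, 2)]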
Software-Implemented Fault Tolerance in Communications Systems
NASA Technical Reports Server (NTRS)
Gantenbein, Rex E.
1994-01-01
Software-implemented fault tolerance (SIFT) is used in many computer-based command, control, and communications (C(3)) systems to provide the nearly continuous availability that they require. In the communications subsystem of Space Station Alpha, SIFT algorithms are used to detect and recover from failures in the data and command link between the Station and its ground support. The paper presents a review of these algorithms and discusses how such techniques can be applied to similar systems found in applications such as manufacturing control, military communications, and programmable devices such as pacemakers. With support from the Tracking and Communication Division of NASA's Johnson Space Center, researchers at the University of Wyoming are developing a testbed for evaluating the effectiveness of these algorithms prior to their deployment. This testbed will be capable of simulating a variety of C(3) system failures and recording the response of the Space Station SIFT algorithms to these failures. The design of this testbed and the applicability of the approach in other environments are described.
NASA Astrophysics Data System (ADS)
Vanderka, Ales; Hajek, Lukas; Bednarek, Lukas; Latal, Jan; Vitasek, Jan; Hejduk, Stanislav; Vasinek, Vladimir
2016-09-01
In this article the authors deal with the use of Wavelength Division Multiplexing (WDM) for Free Space Optical (FSO) communications. FSO communication suffers from atmospheric effects (attenuation, fluctuation of the received power, and turbulence), and the WDM channels additionally suffer from interchannel crosstalk. Only one direction of transmission is considered. The behavior of the FSO link was tested for one or eight channels. The modulation schemes considered are OOK (On-Off Keying), QAM (Quadrature Amplitude Modulation) and Subcarrier Intensity Modulation (SIM) based on BPSK (Binary Phase Shift Keying). The simulation software OptiSystem 14 was used for the tests. Some simulation parameters were set according to a real FSO link, such as a data rate of 1.25 Gbps and a link range of 1.4 km. The simulated FSO link used a wavelength of 1550 nm with 0.8 nm channel spacing. The results show the influence of crosstalk and modulation format on the BER, depending on the amount of turbulence in the propagation medium.
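For orientation only, the small Python script below estimates how log-normal scintillation degrades the average BER of a BPSK-modulated link by Monte Carlo averaging the AWGN bit-error probability over random irradiance samples; it is not an OptiSystem model, and the mean SNR and scintillation indices are assumed values rather than results from the paper.

    import numpy as np
    from scipy.special import erfc

    def ber_bpsk_lognormal(mean_snr_db=14.0, scint_index=0.2, samples=200_000):
        sigma2 = np.log(1.0 + scint_index)                     # log-irradiance variance
        # unit-mean log-normal irradiance fluctuations
        h = np.random.lognormal(mean=-sigma2 / 2, sigma=np.sqrt(sigma2), size=samples)
        snr = 10 ** (mean_snr_db / 10) * h ** 2                # instantaneous electrical SNR
        return np.mean(0.5 * erfc(np.sqrt(snr)))               # BPSK bit-error probability

    for si in (0.05, 0.2, 0.5):
        print(f"scintillation index {si}: average BER ~ {ber_bpsk_lognormal(scint_index=si):.2e}")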
Higher-Order Mixed Finite Element Methods for Time Domain Electromagnetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, D; Stowell, M; Koning, J
This is the final report for LDRD 01-ERD-005. The Principal Investigator was Niel Madsen of the Defense Sciences Engineering Division (DSED). Collaborators included Daniel White, Joe Koning and Nathan Champagne of DSED, Mark Stowell of the Center for Applications Development and Software Engineering (CADSE), and Ph.D. students Rob Rieben and Aaron Fisher at the UC Davis Department of Applied Science. It should be noted that the students were partially supported by the LLNL Student-Employee Graduate Research Fellow program. We begin with an Introduction which provides background and motivation for this research effort. Section II contains a high-level description of our Approach, and Section III summarizes our key research Accomplishments. A description of the Software deliverables is provided in Section IV, and Section V includes simulation Validation and Results. It should be noted that we do not go into the mathematical details in this report; rather, these can be found in our publications, which are listed in Section III.
Guidelines for computer security in general practice.
Schattner, Peter; Pleteshner, Catherine; Bhend, Heinz; Brouns, Johan
2007-01-01
As general practice becomes increasingly computerised, data security becomes increasingly important for both patient health and the efficient operation of the practice. To develop guidelines for computer security in general practice based on a literature review, an analysis of available information on current practice and a series of key stakeholder interviews. While the guideline was produced in the context of Australian general practice, we have developed a template that is also relevant for other countries. Current data on computer security measures was sought from Australian divisions of general practice. Semi-structured interviews were conducted with general practitioners (GPs), the medical software industry, senior managers within government responsible for health IT (information technology) initiatives, technical IT experts, divisions of general practice and a member of a health information consumer group. The respondents were asked to assess both the likelihood and the consequences of potential risks in computer security being breached. The study suggested that the most important computer security issues in general practice were: the need for a nominated IT security coordinator; having written IT policies, including a practice disaster recovery plan; controlling access to different levels of electronic data; doing and testing backups; protecting against viruses and other malicious codes; installing firewalls; undertaking routine maintenance of hardware and software; and securing electronic communication, for example via encryption. This information led to the production of computer security guidelines, including a one-page summary checklist, which were subsequently distributed to all GPs in Australia. This paper maps out a process for developing computer security guidelines for general practice. The specific content will vary in different countries according to their levels of adoption of IT, and cultural, technical and other health service factors. Making these guidelines relevant to local contexts should help maximise their uptake.
Embracing Open Source for NASA's Earth Science Data Systems
NASA Technical Reports Server (NTRS)
Baynes, Katie; Pilone, Dan; Boller, Ryan; Meyer, David; Murphy, Kevin
2017-01-01
The overarching purpose of NASA's Earth Science program is to develop a scientific understanding of Earth as a system. Scientific knowledge is most robust and actionable when resulting from transparent, traceable, and reproducible methods. Reproducibility includes open access to the data as well as the software used to arrive at results. Additionally, software that is custom-developed for NASA should be open to the greatest degree possible, to enable re-use across Federal agencies, reduce overall costs to the government, remove barriers to innovation, and promote consistency through the use of uniform standards. Finally, Open Source Software (OSS) practices facilitate collaboration between agencies and the private sector. To best meet these ends, NASA's Earth Science Division promotes the full and open sharing of not only all data, metadata, products, information, documentation, models, images, and research results but also the source code used to generate, manipulate and analyze them. This talk focuses on the challenges to open sourcing NASA-developed software within ESD and the growing pains associated with establishing policies running the gamut of tracking issues, properly documenting build processes, engaging the open source community, maintaining internal compliance, and accepting contributions from external sources. This talk also covers the adoption of existing open source technologies and standards to enhance our custom solutions and our contributions back to the community. Finally, we will be introducing the most recent OSS contributions from the NASA Earth Science program and promoting these projects for wider community review and adoption.
Autonomous docking ground demonstration (category 3)
NASA Technical Reports Server (NTRS)
Lamkin, Steve L.; Eick, Richard E.; Baxter, James M.; Boyd, M. G.; Clark, Fred D.; Lee, Thomas Q.; Othon, L. T.; Prather, Joseph L.; Spehar, Peter T.; Teders, Rebecca J.
1991-01-01
The NASA Johnson Space Center (JSC) is involved in the development of an autonomous docking ground demonstration. The demonstration combines the technologies, expertise and facilities of the JSC Tracking and Communications Division (EE), Structures and Mechanics Division (ES), and the Navigation, Guidance and Control Division (EG) and their supporting contractors. The autonomous docking ground demonstration is an evaluation of the capabilities of the laser sensor system to support the docking phase (12 ft to contact) when operated in conjunction with the Guidance, Navigation and Control Software. The docking mechanism being used was developed for the Apollo Soyuz Test Program. This demonstration will be conducted using the Six-Degrees of Freedom (6-DOF) Dynamic Test System (DTS). The DTS environment simulates the Space Station Freedom as the stationary or target vehicle and the Orbiter as the active or chase vehicle. For this demonstration the laser sensor will be mounted on the target vehicle and the retroreflectors on the chase vehicle. This arrangement was used to prevent potential damage to the laser. The sensor system, GN&C, and 6-DOF DTS will be operated closed-loop. Initial conditions to simulate vehicle misalignments, translational and rotational, will be introduced within the constraints of the systems involved. A detailed description of each of the demonstration components (e.g., Sensor System, GN&C, 6-DOF DTS and supporting computer configuration), including their capabilities and limitations, will be given. A demonstration architecture drawing and photographs of the test configuration will be presented.
NASA Astrophysics Data System (ADS)
Lombardo, Luigi; Cigala, Valeria; Rizzi, Jonathan; Craciun, Iulya; Gain, Animesh Kumar; Albano, Raffaele
2017-04-01
Alongside other major EGU divisions, the Natural Hazards division has recently formed its Early Career Scientist (ECS) team, known as NhET. NhET was born in 2016 and its scope includes various activities for EGU members, the international scientific community, and the general public. We are a group of six early career researchers, either PhDs or Post-Docs, from different fields of Natural Hazards, keen to promote knowledge exchange and collaboration. This is done by organizing courses, training sessions and social activities, especially targeting ECSs, during this year's EGU General Assembly and the ones to come. Outside the timeframe of the EGU conference, we constantly promote EGU content for our division. This is done through the division website (http://www.egu.eu/nh), a mailing list (https://groups.google.com/forum/#!forum/nhet) and social media. With respect to the latter, a new Facebook page will be launched shortly and other platforms such as Twitter will be used to reach a broader audience. These platforms will foster the transmission of Natural Hazards topics to anyone who is interested. The main content will be researchers' interviews, information about open positions, training, open-source software and conferences, together with news on hazards and their anthropic and environmental impacts. We are NhET and we invite you all to follow and collaborate with us for a more dynamic, efficient and widespread scientific communication.
Autonomous docking ground demonstration (category 3)
NASA Astrophysics Data System (ADS)
Lamkin, Steve L.; Eick, Richard E.; Baxter, James M.; Boyd, M. G.; Clark, Fred D.; Lee, Thomas Q.; Othon, L. T.; Prather, Joseph L.; Spehar, Peter T.; Teders, Rebecca J.
The NASA Johnson Space Center (JSC) is involved in the development of an autonomous docking ground demonstration. The demonstration combines the technologies, expertise and facilities of the JSC Tracking and Communications Division (EE), Structures and Mechanics Division (ES), and the Navigation, Guidance and Control Division (EG) and their supporting contractors. The autonomous docking ground demonstration is an evaluation of the capabilities of the laser sensor system to support the docking phase (12 ft to contact) when operated in conjunction with the Guidance, Navigation and Control Software. The docking mechanism being used was developed for the Apollo Soyuz Test Program. This demonstration will be conducted using the Six-Degrees of Freedom (6-DOF) Dynamic Test System (DTS). The DTS environment simulates the Space Station Freedom as the stationary or target vehicle and the Orbiter as the active or chase vehicle. For this demonstration the laser sensor will be mounted on the target vehicle and the retroreflectors on the chase vehicle. This arrangement was used to prevent potential damage to the laser. The sensor system, GN&C, and 6-DOF DTS will be operated closed-loop. Initial conditions to simulate vehicle misalignments, translational and rotational, will be introduced within the constraints of the systems involved. A detailed description of each of the demonstration components (e.g., Sensor System, GN&C, 6-DOF DTS and supporting computer configuration), including their capabilities and limitations, will be given. A demonstration architecture drawing and photographs of the test configuration will be presented.
Mazerolle, Stephanie M; Bruening, Jennifer E; Casa, Douglas J
2008-01-01
Context: Work-family conflict (WFC) involves discord that arises when the demands of work interfere with the demands of family or home life. Long work hours, minimal control over work schedules, and time spent away from home are antecedents to WFC. To date, few authors have examined work-family conflict within the athletic training profession. Objective: To investigate the occurrence of WFC in certified athletic trainers (ATs) and to identify roots and factors leading to quality-of-life issues for ATs working in the National Collegiate Athletic Association Division I-A setting. Design: Survey questionnaire and follow-up, in-depth, in-person interviews. Setting: Division I-A universities sponsoring football. Patients or Other Participants: A total of 587 ATs (324 men, 263 women) responded to the questionnaire. Twelve ATs (6 men, 6 women) participated in the qualitative portion: 2 head ATs, 4 assistant ATs, 4 graduate assistant ATs, and 2 AT program directors. Data Collection and Analysis: Multiple regression analysis was performed to determine whether workload and travel predicted levels of WFC. Analyses of variance were calculated to investigate differences among the factors of sex, marital status, and family status. Interviews were transcribed verbatim and then analyzed using computer software as well as member checks and peer debriefing. The triangulation of the data collection and multiple sources of qualitative analysis were utilized to limit potential researcher prejudices. Results: Regression analyses revealed that long work hours and travel directly contributed to WFC. In addition to long hours and travel, inflexible work schedules and staffing patterns were discussed by the interview participants as antecedents to WFC. Regardless of sex (P = .142), marital status (P = .687), family status (P = .055), or age of children (P = .633), WFC affected Division I-A ATs. Conclusions: No matter their marital or family status, ATs employed at the Division I-A level experienced difficulties balancing their work and home lives. Sources of conflict primarily stemmed from the consuming nature of the profession, travel, inflexible work schedules, and lack of full-time staff members. PMID:18833313
Manyak, Kristin A.; Abdenour, Thomas E.; Rauh, Mitchell J.; Baweja, Harsimran S.
2016-01-01
Background As recently dictated by the American Medical Society, balance testing is an important component in the clinical evaluation of concussion. Despite this, previous research on the efficacy of balance testing for concussion diagnosis suggests low sensitivity (∼30%), based primarily on the popular Balance Error Scoring System (BESS). The Balance Tracking System (BTrackS, Balance Tracking Systems Inc., San Diego, CA, USA) consists of a force plate (BTrackS Balance Plate) and software (BTrackS Sport Balance) which can quickly (<2 min) perform concussion balance testing with gold standard accuracy. Purpose The present study aimed to determine the sensitivity of the BTrackS Balance Plate and Sports Balance Software for concussion diagnosis. Study Design Cross-Sectional Study Methods Preseason baseline balance testing of 519 healthy Division I college athletes playing sports with a relatively high risk for concussions was performed with the BTrackS Balance Test. Testing was administered by certified athletic training staff using the BTrackS Balance Plate and Sport Balance software. Of the baselined athletes, 25 later experienced a concussion during the ensuing sport season. Post-injury balance testing was performed on these concussed athletes within 48 hours of injury and the sensitivity of the BTrackS Balance Plate and Sport Balance software was estimated based on the number of athletes showing a balance decline according to the criteria specified in the Sport Balance software. This criterion is based on the minimal detectable change statistic with a 90% confidence level (i.e. 90% specificity). Results Of 25 athletes who experienced concussions, 16 had balance declines relative to baseline testing results according to the BTrackS Sport Balance software criteria. This corresponds to an estimated concussion sensitivity of 64%, which is twice as great as that reported previously for the BESS. Conclusions The BTrackS Balance Plate and Sport Balance software has the greatest concussion sensitivity of any balance testing instrument reported to date. Level of Evidence Level 2 (Individual cross sectional diagnostic study) PMID:27104048
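The decline criterion is stated only as a minimal detectable change (MDC) at a 90% confidence level; the Python fragment below shows how such a criterion is typically computed from test-retest reliability. The baseline standard deviation and ICC values are assumed for the example and are not the constants used inside the BTrackS software.

    import math

    def mdc(sd_baseline, icc, z=1.645):                  # z = 1.645 -> 90% confidence
        sem = sd_baseline * math.sqrt(1.0 - icc)         # standard error of measurement
        return z * math.sqrt(2.0) * sem

    def balance_declined(baseline_path_cm, postinjury_path_cm, sd_baseline=5.0, icc=0.83):
        return (postinjury_path_cm - baseline_path_cm) > mdc(sd_baseline, icc)

    print(balance_declined(baseline_path_cm=22.0, postinjury_path_cm=38.0))   # True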
Modernization of the NASA IRTF Telescope Control System
NASA Astrophysics Data System (ADS)
Pilger, Eric J.; Harwood, James V.; Onaka, Peter M.
1994-06-01
We describe the ongoing modernization of the NASA IR Telescope Facility Telescope Control System. A major mandate of this project is to keep the telescope available for observations throughout. Therefore, we have developed an incremental plan that will allow us to replace components of the software and hardware without shutting down the system. The current system, running under FORTH on a DEC LSI 11/23 minicomputer interfaced to a bus and boards developed in-house, will be replaced with a combination of a Sun SPARCstation running SunOS, a MicroSPARC based Single Board Computer running LynxOS, and various intelligent VME based peripheral cards. The software is based on a design philosophy originally developed by Pat Wallace for use on the Anglo Australian Telescope. This philosophy has gained wide acceptance, and is currently used in a number of observatories around the world. A key element of this philosophy is the division of the TCS into 'Virtual' and 'Real' parts. This will allow us to replace the higher level functions of the TCS with software running on the Sun, while still relying on the LSI 11/23 for performance of the lower level functions. Eventual transfer of lower level functions to the MicroSPARC system will then proceed incrementally through use of a Q-Bus to VME-Bus converter.
NASA Technical Reports Server (NTRS)
1979-01-01
The machinery pictured is a set of Turbodyne steam turbines which power a sugar mill at Bell Glade, Florida. A NASA-developed computer program called NASTRAN aided development of these and other turbines manufactured by Turbodyne Corporation's Steam Turbine Division, Wellsville, New York. An acronym for NASA Structural Analysis Program, NASTRAN is a predictive tool which advises development teams how a structural design will perform under service use conditions. Turbodyne uses NASTRAN to analyze the dynamic behavior of steam turbine components, achieving substantial savings in development costs. One of the most widely used spinoffs, NASTRAN is made available to private industry through NASA's Computer Software Management Information Center (COSMIC) at the University of Georgia.
Operator Station Design System - A computer aided design approach to work station layout
NASA Technical Reports Server (NTRS)
Lewis, J. L.
1979-01-01
The Operator Station Design System is resident in NASA's Johnson Space Center Spacecraft Design Division Performance Laboratory. It includes stand-alone minicomputer hardware and Panel Layout Automated Interactive Design and Crew Station Assessment of Reach software. The data base consists of the Shuttle Transportation System Orbiter Crew Compartment (in part), the Orbiter payload bay and remote manipulator (in part), and various anthropometric populations. The system is utilized to provide panel layouts, assess reach and vision, determine interference and fit problems early in the design phase, study design applications as a function of anthropometric and mission requirements, and to accomplish conceptual design to support advanced study efforts.
NASA Technical Reports Server (NTRS)
1998-01-01
Recom Technologies, Inc., was established in 1980 by Jack Lee, a former NASA contractor. After forming the new company, Recom was awarded NASA contracts, which eventually grew to 50 percent of the company's business. Two companies have spun off from Recom, both of which have their basis in NASA technology. The first is Attention Control Systems, Inc., which utilizes intelligent planning software that Recom developed for the NASA Ames Computational Sciences Division in a hand-held planner used as an aid in the cognitive rehabilitation of brain injury patients. The second is MiraNet, Inc., which uses CLIPS as the foundation for WEXpert, the first rules-based help system on the Web.
The PLAID graphics analysis impact on the space program
NASA Technical Reports Server (NTRS)
Nguyen, Jennifer P.; Wheaton, Aneice L.; Maida, James C.
1994-01-01
An ongoing project design often requires visual verification at various stages. These requirements are critically important because the subsequent phases of that project might depend on the complete verification of a particular stage. Currently, there are several software packages at JSC that provide such simulation capabilities. We present the simulation capabilities of the PLAID modeling system used in the Flight Crew Support Division for human factors analyses. We summarize some ongoing studies in kinematics, lighting, EVA activities, and discuss various applications in the mission planning of the current Space Shuttle flights and the assembly sequence of the Space Station Freedom with emphasis on the redesign effort.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pordes, R.; Anderson, J.; Berg, D.
1994-04-01
DART is the new data acquisition system designed and implemented for six Fermilab experiments by the Fermilab Computing Division and the experiments themselves. The complexity of the experiments varies greatly. Their data taking throughput and event filtering requirements range from a few (2-5) to tens (80) of CAMAC, FASTBUS and home built front end crates; from a few 100 KByte/sec to 160 MByte/sec front end data collection rates; and from 0-3000 Mips of level 3 processing. The authors report on the architecture and implementation of DART to this date, and the hardware and software components that are being developed and supported.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pordes, R.; Anderson, J.; Berg, D.
1994-12-31
DART is the new data acquisition system designed and implemented for six Fermilab experiments by the Fermilab Computing Division and the experiments themselves. The complexity of the experiments varies greatly. Their data taking throughput and event filtering requirements range from a few (2-5) to tens (80) of CAMAC, FASTBUS and home built front end crates; from a few 100 KByte/sec to 160 MByte/sec front end data collection rates; and from 0-3000 Mips of level 3 processing. The authors report on the architecture and implementation of DART to this date, and the hardware and software components that are being developed and supported.
NASA Technical Reports Server (NTRS)
1996-01-01
Released in 1995, the Trilogy cardiac pacemaker is the fourth generation of a unit developed in the 1970s by NASA, Johns Hopkins Applied Physics Laboratory and St. Jude Medical's Cardiac Rhythm Management Division (formerly known as Pacesetter Systems, Inc.). The new system incorporates the company's PDx diagnostic and programming software and a powerful microprocessor that allows more functions to be fully automatic and gives more detailed information on the patient's health and the performance of the pacing systems. The pacemaker incorporates bidirectional telemetry used for space communications for noninvasive communication with the implanted pacemaker, smaller implantable pulse generators from space microminiaturization, and longer-life batteries from technology for spacecraft electrical power systems.
1982-08-01
REPORT SHT..., ENGLAND. A W*nson Match Company, Safety and Protection Division, ISSUE 3 (contd). ... The 832 tm inspection...input data buffering and output data buffering. 2.2.2.3.1. Power-up Reset Circuit: To ensure correct system operation when power is first applied, the...act in conjunction with R2, R3 and two buffer sections of IC2. When power is first applied, C1 is discharged; via the pot chain divider of R2 and R3 the
Design and Implementation of CIA, the ISOCAM Interactive Analysis System
NASA Astrophysics Data System (ADS)
Ott, S.; Abergel, A.; Altieri, B.; Augueres, J.-L.; Aussel, H.; Bernard, J.-P.; Biviano, A.; Blommaert, J.; Boulade, O.; Boulanger, F.; Cesarsky, C.; Cesarsky, D. A.; Claret, A.; Delattre, C.; Delaney, M.; Deschamps, T.; Desert, F.-X.; Didelon, P.; Elbaz, D.; Gallais, P.; Gastaud, R.; Guest, S.; Helou, G.; Kong, M.; Lacombe, F.; Li, J.; Landriu, D.; Metcalfe, L.; Okumura, K.; Perault, M.; Pollock, A. M. T.; Rouan, D.; Sam-Lone, J.; Sauvage, M.; Siebenmorgen, R.; Starck, J.-L.; Tran, D.; van Buren, D.; Vigroux, L.; Vivares, F.
This paper presents an overview of the Interactive Analysis System for ISOCAM (CIA). With this system ISOCAM data can be analysed for calibration and engineering purposes, the ISOCAM pipeline software validated and refined, and astronomical data processing can be performed. The system is mainly IDL-based but contains Fortran, C, and C++ parts for special tasks. It represents an effort of 15 man-years and is comprised of over 1000 IDL and 200 Fortran, C, and C++ modules. CIA is a joint development by the ESA Astrophysics Division and the ISOCAM Consortium led by the ISOCAM PI, C. Cesarsky, Direction des Sciences de la Matiere, C.E.A., France.
Proceedings of the First NASA Ada Users' Symposium
NASA Technical Reports Server (NTRS)
1988-01-01
Ada has the potential to be a part of the most significant change in software engineering technology within NASA in the last twenty years. Thus, it is particularly important that all NASA centers be aware of Ada experience and plans at other centers. Ada activities across NASA are covered, with presenters representing five of the nine major NASA centers and the Space Station Freedom Program Office. Projects discussed included - Space Station Freedom Program Office: the implications of Ada on training, reuse, management and the software support environment; Johnson Space Center (JSC): early experience with the use of Ada, software engineering and Ada training and the evaluation of Ada compilers; Marshall Space Flight Center (MSFC): university research with Ada and the application of Ada to Space Station Freedom, the Orbital Maneuvering Vehicle, the Aero-Assist Flight Experiment and the Secure Shuttle Data System; Lewis Research Center (LeRC): the evolution of Ada software to support the Space Station Power Management and Distribution System; Jet Propulsion Laboratory (JPL): the creation of a centralized Ada development laboratory and current applications of Ada including the Real-time Weather Processor for the FAA; and Goddard Space Flight Center (GSFC): experiences with Ada in the Flight Dynamics Division and the Extreme Ultraviolet Explorer (EUVE) project and the implications of GSFC experience for Ada use in NASA. Despite the diversity of the presentations, several common themes emerged from the program: Methodology - NASA experience in general indicates that the effective use of Ada requires modern software engineering methodologies; Training - It is the software engineering principles and methods that surround Ada, rather than Ada itself, which requires the major training effort; Reuse - Due to training and transition costs, the use of Ada may initially actually decrease productivity, as was clearly found at GSFC; and real-time work at LeRC, JPL and GSFC shows that it is possible to use Ada for real-time applications.
Poisson-event-based analysis of cell proliferation.
Summers, Huw D; Wills, John W; Brown, M Rowan; Rees, Paul
2015-05-01
A protocol for the assessment of cell proliferation dynamics is presented. This is based on the measurement of cell division events and their subsequent analysis using Poisson probability statistics. Detailed analysis of proliferation dynamics in heterogeneous populations requires single cell resolution within a time series analysis and so is technically demanding to implement. Here, we show that by focusing on the events during which cells undergo division rather than directly on the cells themselves, a simplified image acquisition and analysis protocol can be followed, which maintains single cell resolution and reports on the key metrics of cell proliferation. The technique is demonstrated using a microscope with 1.3 μm spatial resolution to track mitotic events within A549 and BEAS-2B cell lines, over a period of up to 48 h. Automated image processing of the bright field images using standard algorithms within the ImageJ software toolkit yielded 87% accurate recording of the manually identified temporal and spatial positions of the mitotic event series. Analysis of the statistics of the interevent times (i.e., times between observed mitoses in a field of view) showed that cell division conformed to a nonhomogeneous Poisson process in which the rate of occurrence of mitotic events, λ, increased exponentially over time, and provided values of the mean intermitotic time of 21.1 ± 1.2 h for the A549 cells and 25.0 ± 1.1 h for the BEAS-2B cells. Comparison of the mitotic event series for the BEAS-2B cell line to that predicted by random Poisson statistics indicated that temporal synchronisation of the cell division process was occurring within 70% of the population and that this could be increased to 85% through serum starvation of the cell culture. © 2015 International Society for Advancement of Cytometry.
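To make the event-based model concrete, the Python sketch below simulates mitotic events as a nonhomogeneous Poisson process whose rate grows exponentially on the intermitotic time scale (Lewis-Shedler thinning) and reports the interevent times; the initial rate and observation window are illustrative, not the paper's measurements.

    import numpy as np

    def simulate_events(t_mitotic=21.1, t_end=48.0, lam0=2.0, rng=np.random.default_rng(0)):
        growth = np.log(2.0) / t_mitotic                  # rate doubles once per intermitotic time
        lam_max = lam0 * np.exp(growth * t_end)
        t, events = 0.0, []
        while True:                                       # Lewis-Shedler thinning
            t += rng.exponential(1.0 / lam_max)
            if t > t_end:
                break
            if rng.random() < lam0 * np.exp(growth * t) / lam_max:
                events.append(t)
        return np.array(events)

    ev = simulate_events()
    print("events observed:", ev.size)
    print("mean interevent time (h):", np.diff(ev).mean())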
NASA Technical Reports Server (NTRS)
Singhal, Surendra N.
2003-01-01
The SAE G-11 RMSL (Reliability, Maintainability, Supportability, and Logistics) Division activities include identification and fulfillment of joint industry, government, and academia needs for development and implementation of RMSL technologies. Four Projects in the Probabilistic Methods area and two in the area of RMSL have been identified. These are: (1) Evaluation of Probabilistic Technology - progress has been made toward the selection of probabilistic application cases. Future effort will focus on assessment of multiple probabilistic software packages in solving selected engineering problems using probabilistic methods. Relevance to Industry & Government - Case studies of typical problems encountering uncertainties, results of solutions to these problems run by different codes, and recommendations on which code is applicable for what problems; (2) Probabilistic Input Preparation - progress has been made in identifying problem cases such as those with no data, little data and sufficient data. Future effort will focus on developing guidelines for preparing input for probabilistic analysis, especially with no or little data. Relevance to Industry & Government - Too often, we get bogged down thinking we need a lot of data before we can quantify uncertainties. Not true. There are ways to do credible probabilistic analysis with little data; (3) Probabilistic Reliability - probabilistic reliability literature search has been completed along with what differentiates it from statistical reliability. Work on computation of reliability based on quantification of uncertainties in primitive variables is in progress. Relevance to Industry & Government - Correct reliability computations both at the component and system level are needed so one can design an item based on its expected usage and life span; (4) Real World Applications of Probabilistic Methods (PM) - A draft of volume 1 comprising aerospace applications has been released. Volume 2, a compilation of real world applications of probabilistic methods with essential information demonstrating application type and time/cost savings by the use of probabilistic methods for generic applications, is in progress. Relevance to Industry & Government - Too often, we say, 'The Proof is in the Pudding'. With help from many contributors, we hope to produce such a document. Problem is - not too many people are coming forward due to proprietary nature. So, we are asking to document only minimum information including problem description, what method was used, did it result in any savings, and how much?; (5) Software Reliability - software reliability concept, program, implementation, guidelines, and standards are being documented. Relevance to Industry & Government - software reliability is a complex issue that must be understood & addressed in all facets of business in industry, government, and other institutions. We address issues, concepts, ways to implement solutions, and guidelines for maximizing software reliability; (6) Maintainability Standards - maintainability/serviceability industry standards/guidelines and industry best practices and methodologies used in performing maintainability/serviceability tasks are being documented. Relevance to Industry & Government - Any industry or government process, project, and/or tool must be maintained and serviced to realize the life and performance it was designed for. We address issues and develop guidelines for optimum performance & life.
Chen, Xi-hua; Hua, Yong-mei; Xie, Xing-qian; Yu, Xiao-jia; Wang, Jian; Liu, Li-ming
2013-06-01
To evaluate and compare the treatment efficiency of Empower interactive self-ligating brackets and traditional brackets in Class II division I extraction patients. Forty patients with Class II division I malocclusion were randomly divided into 2 groups. Twenty patients received the Empower self-ligating technique (group A) and the other 20 patients received the MBT technique (group B). Four first premolars were extracted, without any other anchorage devices added, in both groups. The duration of treatment, the number of visits and the chair-side time were recorded. Cephalometric analysis was performed before and after treatment. The data were analyzed with the SPSS 13.0 software package using paired t tests. Treatment time and number of visits in group A were greater than in group B, but there was no significant difference between the 2 groups. Chair-side time in group A was reduced by 151.15 s on average compared with group B. Significant changes were observed in both groups after treatment. The upper and lower anterior teeth were retracted and the convex profile improved. U1-SN, U1-NA, L1-MP, L1-NB, UI-PTV, LI-PTV, UL-EP, LL-EP decreased. Significant differences were found in UM-PTV between the 2 groups (P<0.05). Compared with traditional brackets, Empower self-ligating brackets can save chair-side time and control anterior teeth torque and posterior teeth anchorage effectively, but cannot reduce the treatment time or number of visits. Supported by the Youth Research Project of Shanghai Municipal Health Bureau (2010Y155).
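The statistical comparison is a straightforward paired t test; the Python lines below reproduce that analysis step in outline only, with made-up cephalometric values standing in for the SPSS 13.0 data set.

    from scipy import stats

    u1_sn_pre  = [108.2, 112.5, 110.1, 115.3, 109.8]   # hypothetical pre-treatment angles (degrees)
    u1_sn_post = [101.4, 104.9, 103.2, 106.8, 102.5]   # hypothetical post-treatment angles

    t, p = stats.ttest_rel(u1_sn_pre, u1_sn_post)
    print(f"paired t = {t:.2f}, p = {p:.4f}")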
Computing Across the Physics and Astrophysics Curriculum
NASA Astrophysics Data System (ADS)
DeGioia Eastwood, Kathy; James, M.; Dolle, E.
2012-01-01
Computational skills are essential in today's marketplace. Bachelors entering the STEM workforce report that their undergraduate education does not adequately prepare them to use scientific software and to write programs. Computation can also increase student learning; not only are the students actively engaged, but computational problems allow them to explore physical problems that are more realistic than the few that can be solved analytically. We have received a grant from the NSF CCLI Phase I program to integrate computing into our upper division curriculum. Our language of choice is Matlab; this language had already been chosen for our required sophomore course in Computational Physics because of its prevalence in industry. For two summers we have held faculty workshops to help our professors develop the needed expertise, and we are now in the implementation and evaluation stage. The end product will be a set of learning materials in the form of computational modules that we will make freely available. These modules will include the assignment, pedagogical goals, Matlab code, samples of student work, and instructor comments. At this meeting we present an overview of the project as well as modules written for a course in upper division stellar astrophysics. We acknowledge the support of the NSF through DUE-0837368.
Bathymetric map of the south part of Great Salt Lake, Utah, 2005
Baskin, Robert L.; Allen, David V.
2005-01-01
The U.S. Geological Survey, in cooperation with the Utah Department of Natural Resources, Division of Wildlife Resources, collected bathymetric data for the south part of Great Salt Lake during 2002–04 using a single beam, high-definition fathometer and real-time differential global positioning system. Approximately 7.6 million depth readings were collected along more than 1,050 miles of survey transects for construction of this map. Sound velocities were obtained in conjunction with the bathymetric data to provide time-of-travel corrections to the depth calculations. Data were processed with commercial hydrographic software and exported into geographic information system (GIS) software for mapping. Because of the shallow nature of the lake and the limitations of the instrumentation, contours above an altitude of 4,193 feet were digitized from existing USGS 1:24,000 source-scale digital line graph data. For additional information on methods used to derive the bathymetric contours for this map, please see Baskin, Robert L., 2005, Calculation of area and volume for the south part of Great Salt Lake, Utah, U.S. Geological Survey Open-File Report OFR–2005–1327.
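The time-of-travel correction mentioned above amounts to converting each two-way echo time to depth with the locally measured sound speed; a minimal Python illustration, with hypothetical numbers rather than survey values, is:

    def corrected_depth(two_way_travel_s, sound_speed_m_per_s, transducer_draft_m=0.3):
        # depth below the surface = half the echo path length plus the transducer draft
        return sound_speed_m_per_s * two_way_travel_s / 2.0 + transducer_draft_m

    print(corrected_depth(0.010, 1447.0))   # -> about 7.5 m for an assumed 1447 m/s sound speed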
Composing, Analyzing and Validating Software Models
NASA Astrophysics Data System (ADS)
Sheldon, Frederick T.
1998-10-01
This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Grp). The principal work this summer has been to review and refine the agenda that was carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1993-07-01
The Accelerator System Model (ASM) is a computer program developed to model proton radiofrequency accelerators and to carry out system level trade studies. The ASM FORTRAN subroutines are incorporated into an intuitive graphical user interface which provides for the "construction" of the accelerator in a window on the computer screen. The interface is based on the Shell for Particle Accelerator Related Codes (SPARC) software technology written for the Macintosh operating system in the C programming language. This User Manual describes the operation and use of the ASM application within the SPARC interface. The Appendix provides a detailed description of the physics and engineering models used in ASM. ASM Version 1.0 is a joint project of G. H. Gillespie Associates, Inc. and the Accelerator Technology (AT) Division of the Los Alamos National Laboratory. Neither the ASM Version 1.0 software nor this ASM Documentation may be reproduced without the expressed written consent of both the Los Alamos National Laboratory and G. H. Gillespie Associates, Inc.
Accelerator controls at CERN: Some converging trends
NASA Astrophysics Data System (ADS)
Kuiper, B.
1990-08-01
CERN's growing services to the high-energy physics community using frozen resources has led to the implementation of "Technical Boards", mandated to assist the management by making recommendations for rationalizations in various technological domains. The Board on Process Control and Electronics for Accelerators, TEBOCO, has emphasized four main lines which might yield economy in resources. First, a common architecture for accelerator controls has been agreed between the three accelerator divisions. Second, a common hardware/software kit has been defined, from which the large majority of future process interfacing may be composed. A support service for this kit is an essential part of the plan. Third, high-level protocols have been developed for standardizing access to process devices. They derive from agreed standard models of the devices and involve a standard control message. This should ease application development and mobility of equipment. Fourth, a common software engineering methodology and a commercial package of application development tools have been adopted. Some rationalization in the field of the man-machine interface and in matters of synchronization is also under way.
Composing, Analyzing and Validating Software Models
NASA Technical Reports Server (NTRS)
Sheldon, Frederick T.
1998-01-01
This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Grp). The principal work this summer has been to review and refine the agenda that was carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.
Stamataki, Evangelia; Harich, Benjamin; Guignard, Léo; Preibisch, Stephan; Shorte, Spencer; Keller, Philipp J
2018-01-01
During development, coordinated cell behaviors orchestrate tissue and organ morphogenesis. Detailed descriptions of cell lineages and behaviors provide a powerful framework to elucidate the mechanisms of morphogenesis. To study the cellular basis of limb development, we imaged transgenic fluorescently-labeled embryos from the crustacean Parhyale hawaiensis with multi-view light-sheet microscopy at high spatiotemporal resolution over several days of embryogenesis. The cell lineage of outgrowing thoracic limbs was reconstructed at single-cell resolution with new software called Massive Multi-view Tracker (MaMuT). In silico clonal analyses suggested that the early limb primordium becomes subdivided into anterior-posterior and dorsal-ventral compartments whose boundaries intersect at the distal tip of the growing limb. Limb-bud formation is associated with spatial modulation of cell proliferation, while limb elongation is also driven by preferential orientation of cell divisions along the proximal-distal growth axis. Cellular reconstructions were predictive of the expression patterns of limb development genes including the BMP morphogen Decapentaplegic. PMID:29595475
Improved Algorithms Speed It Up for Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hazi, A
2005-09-20
Huge computers, huge codes, complex problems to solve. The longer it takes to run a code, the more it costs. One way to speed things up and save time and money is through hardware improvements--faster processors, different system designs, bigger computers. But another side of supercomputing can reap savings in time and speed: software improvements to make codes--particularly the mathematical algorithms that form them--run faster and more efficiently. Speed up math? Is that really possible? According to Livermore physicist Eugene Brooks, the answer is a resounding yes. "Sure, you get great speed-ups by improving hardware," says Brooks, the deputy leader for Computational Physics in N Division, which is part of Livermore's Physics and Advanced Technologies (PAT) Directorate. "But the real bonus comes on the software side, where improvements in software can lead to orders of magnitude improvement in run times." Brooks knows whereof he speaks. Working with Laboratory physicist Abraham Szoeke and others, he has been instrumental in devising ways to shrink the running time of what has, historically, been a tough computational nut to crack: radiation transport codes based on the statistical or Monte Carlo method of calculation. And Brooks is not the only one. Others around the Laboratory, including physicists Andrew Williamson, Randolph Hood, and Jeff Grossman, have come up with innovative ways to speed up Monte Carlo calculations using pure mathematics.
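As a hedged illustration of the statistical method the passage refers to (a toy, not the Laboratory's transport codes), the sketch below estimates the transmission of photons through a uniform slab by sampling exponential free paths; the attenuation coefficient and slab thickness are arbitrary.

```python
import math
import random

def transmission_mc(mu, thickness, n_photons=100_000, seed=0):
    """Fraction of photons whose sampled free path exceeds the slab thickness."""
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n_photons):
        # Exponentially distributed free path; 1 - U keeps the argument in (0, 1].
        free_path = -math.log(1.0 - rng.random()) / mu
        if free_path > thickness:
            transmitted += 1
    return transmitted / n_photons

mu, thickness = 0.5, 3.0          # arbitrary units
print("Monte Carlo :", transmission_mc(mu, thickness))
print("analytic    :", math.exp(-mu * thickness))   # Beer-Lambert check
```

The Monte Carlo estimate converges to the analytic value only as the square root of the sample count, which is exactly why algorithmic speed-ups of such calculations pay off.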
Strategic directions of computing at Fermilab
NASA Astrophysics Data System (ADS)
Wolbers, Stephen
1998-05-01
Fermilab computing has changed a great deal over the years, driven by the demands of the Fermilab experimental community to record and analyze larger and larger datasets, by the desire to take advantage of advances in computing hardware and software, and by the advances coming from the R&D efforts of the Fermilab Computing Division. The strategic directions of Fermilab Computing continue to be driven by the needs of the experimental program. The current fixed-target run will produce over 100 TBytes of raw data and systems must be in place to allow the timely analysis of the data. The collider run II, beginning in 1999, is projected to produce of order 1 PByte of data per year. There will be a major change in methodology and software language as the experiments move away from FORTRAN and into object-oriented languages. Increased use of automation and the reduction of operator-assisted tape mounts will be required to meet the needs of the large experiments and large data sets. Work will continue on higher-rate data acquisition systems for future experiments and projects. R&D projects will be pursued as necessary to provide software, tools, or systems which cannot be purchased or acquired elsewhere. A closer working relation with other high energy laboratories will be pursued to reduce duplication of effort and to allow effective collaboration on many aspects of HEP computing.
Replacing the IRAF/PyRAF Code-base at STScI: The Advanced Camera for Surveys (ACS)
NASA Astrophysics Data System (ADS)
Lucas, Ray A.; Desjardins, Tyler D.; STScI ACS (Advanced Camera for Surveys) Team
2018-06-01
IRAF/PyRAF are no longer viable on the latest hardware often used by HST observers; therefore, STScI no longer actively supports IRAF or PyRAF for most purposes. STScI instrument teams are in the process of converting all of our data processing and analysis code from IRAF/PyRAF to Python, including our calibration reference file pipelines and data reduction software. This is exemplified by our latest ACS Data Handbook, version 9.0, which was recently published in February 2018. Examples of IRAF and PyRAF commands have now been replaced by code blocks in Python, with references linked to documentation on how to download and install the latest Python software via Conda and AstroConda. With the temporary exception of the ACS slitless spectroscopy tool aXe, all ACS-related software is now independent of IRAF/PyRAF. A concerted effort has been made across STScI divisions to help the astronomical community transition from IRAF/PyRAF to Python, with tools such as Python Jupyter notebooks being made to give users workable examples. In addition to our code changes, the new ACS data handbook discusses the latest developments in charge transfer efficiency (CTE) correction, bias de-striping, and updates to the creation and format of calibration reference files, among other topics.
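As a hedged example of the kind of Python replacement the handbook describes, the snippet below opens an ACS calibrated exposure with astropy instead of an IRAF task and prints simple image statistics; the file name is hypothetical and would normally come from a MAST download.

```python
import numpy as np
from astropy.io import fits

# Hypothetical ACS/WFC calibrated file name; real data would be retrieved from MAST.
filename = "jabc01abq_flc.fits"

with fits.open(filename) as hdul:
    hdul.info()                       # list extensions, much like an IRAF header task
    sci = hdul["SCI", 1].data         # first science extension of the multi-extension FITS
    print("mean = %.3f, median = %.3f, std = %.3f"
          % (np.mean(sci), np.median(sci), np.std(sci)))
```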
The TSO Logic and G2 Software Product
NASA Technical Reports Server (NTRS)
Davis, Derrick D.
2014-01-01
This internship assignment for spring 2014 was at John F. Kennedy Space Center (KSC), in NASA's Engineering and Technology (NE) group in support of the Control and Data Systems Division (NE-C) within the Systems Hardware Engineering Branch (NE-C4). The primary focus was system integration and benchmarking utilizing two separate computer software products. The first half of this 2014 internship was spent assisting NE-C4's Electronics and Embedded Systems Engineer, Kelvin Ruiz, and fellow intern Scott Ditto with the evaluation of a new piece of software called G2. It is developed by the Gensym Corporation and was introduced to the group as a tool for monitoring launch environments. All fellow interns and employees of the G2 group have been working together to better understand the significance of the G2 application and how KSC can benefit from its capabilities. The second stage of this spring project was to assist with an ongoing integration of a benchmarking tool developed by a group of engineers from a Canadian-based organization known as TSO Logic. Guided by NE-C4's Computer Engineer, Allen Villorin, NASA 2014 interns put forth great effort in helping to integrate TSO's software into the Spaceport Processing Systems Development Laboratory (SPSDL) for further testing and evaluation. The TSO Logic group claims that their software is designed for monitoring and reducing energy consumption at in-house server farms and large data centers, and that it allows data centers to control the power state of servers without impacting availability or performance and without changes to infrastructure; the focus of the assignment was to test this claim. TSO's founder and CEO, Aaron Rallo, and CTO, Chris Tivel, both came to KSC to assist with the installation of their software in the SPSDL laboratory. TSO's software was installed onto 24 individual workstations running three different operating systems. The workstations were divided into three groups of 8, with each group having its own operating system: the first group ran Ubuntu's Debian-based Linux, the second group ran Windows 7 Professional, and the third group ran Red Hat Linux. The highlight of this portion of the assignment was to compose documentation expressing the overall impression of the software and its capabilities.
Testing and Performance Analysis of the Multichannel Error Correction Code Decoder
NASA Technical Reports Server (NTRS)
Soni, Nitin J.
1996-01-01
This report provides the test results and performance analysis of the multichannel error correction code decoder (MED) system for a regenerative satellite with asynchronous, frequency-division multiple access (FDMA) uplink channels. It discusses the system performance relative to various critical parameters: the coding length, data pattern, unique word value, unique word threshold, and adjacent-channel interference. Testing was performed under laboratory conditions and used a computer control interface with specifically developed control software to vary these parameters. Needed technologies - the high-speed Bose Chaudhuri-Hocquenghem (BCH) codec from Harris Corporation and the TRW multichannel demultiplexer/demodulator (MCDD) - were fully integrated into the mesh very small aperture terminal (VSAT) onboard processing architecture and were demonstrated.
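One of the parameters varied during testing, the unique-word threshold, controls how many bit mismatches are tolerated when locating the unique word that marks a burst. The following is a minimal sketch of that idea with a made-up unique word and threshold; nothing here reflects the actual MED hardware.

```python
import random

# Slide a known unique word across a received bit stream and report positions
# where the Hamming distance is at or below a detection threshold.
UNIQUE_WORD = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0]   # made-up pattern
THRESHOLD = 2                                         # tolerated bit mismatches

def find_unique_word(bits, uw=UNIQUE_WORD, threshold=THRESHOLD):
    hits = []
    for start in range(len(bits) - len(uw) + 1):
        mismatches = sum(b != u for b, u in zip(bits[start:start + len(uw)], uw))
        if mismatches <= threshold:
            hits.append((start, mismatches))
    return hits

# Demo: embed the unique word (with one flipped bit) inside random data.
random.seed(2)
stream = [random.randint(0, 1) for _ in range(40)]
corrupted = UNIQUE_WORD.copy()
corrupted[5] ^= 1
stream[20:20 + len(corrupted)] = corrupted
print(find_unique_word(stream))   # expect a hit at position 20 (others may occur by chance)
```

Raising the threshold makes acquisition more tolerant of channel errors but increases the false-detection rate, which is exactly the trade-off the test campaign characterizes.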
2014-04-11
CAPE CANAVERAL, Fla. -- At the Marriott Courtyard Hotel in Cocoa Beach, Fla., Greg Clements, chief of Kennedy's Control and Data Systems Division and lead for the Engineering and Technology's Small Payload Integrated Testing Services, or SPLITS, line of business, speaks to participants in the 4th International Workshop on Lunar and Planetary Compact and Cryogenic Science and Technology Applications. Scientists, engineers and entrepreneurs interested in research on the moon and other planetary surfaces, recently participated in the Workshop. Taking place April 8-11, 2014, the event was designed to foster collaborative work among those interested in solving the challenges of building hardware, software and businesses interested in going back to the moon and exploring beyond. Photo credit: NASA/Daniel Casper
Analyzing huge pathology images with open source software.
Deroulers, Christophe; Ameisen, David; Badoual, Mathilde; Gerin, Chloé; Granier, Alexandre; Lartaud, Marc
2013-06-06
Digital pathology images are increasingly used both for diagnosis and research, because slide scanners are nowadays broadly available and because the quantitative study of these images yields new insights in systems biology. However, such virtual slides pose a technical challenge, since the images often occupy several gigabytes and cannot be fully opened in a computer's memory. Moreover, there is no standard format. Therefore, most common open source tools such as ImageJ fail to handle them, and the others require expensive hardware while still being prohibitively slow. We have developed several cross-platform open source software tools to overcome these limitations. The NDPITools provide a way to transform microscopy images initially in the loosely supported NDPI format into one or several standard TIFF files, and to create mosaics (division of huge images into small ones, with or without overlap) in various TIFF and JPEG formats. They can be driven through ImageJ plugins. The LargeTIFFTools achieve similar functionality for huge TIFF images which do not fit into RAM. We test the performance of these tools on several digital slides and compare them, when applicable, to standard software. A statistical study of the cells in a tissue sample from an oligodendroglioma was performed on an average laptop computer to demonstrate the efficiency of the tools. Our open source software enables dealing with huge images with standard software on average computers. They are cross-platform, independent of proprietary libraries and very modular, allowing them to be used in other open source projects. They have excellent performance in terms of execution speed and RAM requirements. They open promising perspectives both to the clinician who wants to study a single slide and to the research team or data centre who do image analysis of many slides on a computer cluster. The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5955513929846272.
Analyzing huge pathology images with open source software
2013-01-01
Background Digital pathology images are increasingly used both for diagnosis and research, because slide scanners are nowadays broadly available and because the quantitative study of these images yields new insights in systems biology. However, such virtual slides build up a technical challenge since the images occupy often several gigabytes and cannot be fully opened in a computer’s memory. Moreover, there is no standard format. Therefore, most common open source tools such as ImageJ fail at treating them, and the others require expensive hardware while still being prohibitively slow. Results We have developed several cross-platform open source software tools to overcome these limitations. The NDPITools provide a way to transform microscopy images initially in the loosely supported NDPI format into one or several standard TIFF files, and to create mosaics (division of huge images into small ones, with or without overlap) in various TIFF and JPEG formats. They can be driven through ImageJ plugins. The LargeTIFFTools achieve similar functionality for huge TIFF images which do not fit into RAM. We test the performance of these tools on several digital slides and compare them, when applicable, to standard software. A statistical study of the cells in a tissue sample from an oligodendroglioma was performed on an average laptop computer to demonstrate the efficiency of the tools. Conclusions Our open source software enables dealing with huge images with standard software on average computers. They are cross-platform, independent of proprietary libraries and very modular, allowing them to be used in other open source projects. They have excellent performance in terms of execution speed and RAM requirements. They open promising perspectives both to the clinician who wants to study a single slide and to the research team or data centre who do image analysis of many slides on a computer cluster. Virtual slides The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5955513929846272 PMID:23829479
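The mosaic operation described above (division of a huge image into small, optionally overlapping tiles) boils down to generating tile coordinates that cover the full image. A minimal sketch of that bookkeeping, independent of the NDPITools implementation, is shown below; tile sizes and the example slide dimensions are arbitrary.

```python
def tile_boxes(width, height, tile_w, tile_h, overlap=0):
    """Yield (x, y, w, h) boxes that cover a width x height image.

    Consecutive tiles share `overlap` pixels; edge tiles are clipped so no box
    reads past the image borders (large overlaps can produce redundant edge tiles).
    """
    step_x = tile_w - overlap
    step_y = tile_h - overlap
    for y in range(0, height, step_y):
        for x in range(0, width, step_x):
            yield x, y, min(tile_w, width - x), min(tile_h, height - y)

# Example: a 120000 x 80000 pixel virtual slide cut into 4096-pixel tiles with a
# 256-pixel overlap; each box could be handed to a region reader one at a time,
# so only one tile ever needs to fit in RAM.
boxes = list(tile_boxes(120_000, 80_000, 4096, 4096, overlap=256))
print(len(boxes), "tiles, first:", boxes[0], "last:", boxes[-1])
```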
Project LITE - Light Inquiry Through Experiments
NASA Astrophysics Data System (ADS)
Brecher, K.
2004-12-01
Hands-on, inquiry-based, constructivist activity offers students a powerful way to explore, uncover and ultimately gain a feel for the nature of science. In order to make practicable a more genuine approach to learning astronomy, we have undertaken the development of hands-on (and eyes-on) materials that can be used in introductory undergraduate astronomy courses. These materials focus on light and optics. Over the past several years as part of Project LITE (Light Inquiry Through Experiments), we have developed a kit of optical materials that is integrated with a set of Java applets. The combined kit and software allows students to do actual experiments concerning geometrical optics, fluorescence, phosphorescence, polarization and other topics by making use of the photons that are emitted by their computer screens. We have also developed a suite of over 100 Flash applets that allow students to directly explore many aspects of visual perception. A major effort of the project concerns spectroscopy, since it is arguably the most important tool used by astronomers to disentangle the nature of the universe. It is also one of the most challenging subjects to teach in undergraduate astronomy courses. The spectroscopy component of Project LITE includes take-home laboratory materials and experiments that are integrated with web-based software. We have also developed a novel quantitative handheld binocular spectrometer (patent pending). Our major spectroscopic software is called the Spectrum Explorer (SPEX). It allows students to create, manipulate and explore all types of spectra including blackbody, power law, emission and absorption. We are now extending the SPEX capabilities to help students gain easy access to the astronomical spectra included in the NVO databases. All of the Project LITE software can be found at http://lite.bu.edu. Project LITE is supported by Grant #DUE-0125992 from the NSF Division of Undergraduate Education.
artdaq: DAQ software development made simple
NASA Astrophysics Data System (ADS)
Biery, Kurt; Flumerfelt, Eric; Freeman, John; Ketchum, Wesley; Lukhanin, Gennadiy; Rechenmacher, Ron
2017-10-01
For a few years now, the artdaq data acquisition software toolkit has provided numerous experiments with ready-to-use components which allow for rapid development and deployment of DAQ systems. Developed within the Fermilab Scientific Computing Division, artdaq provides data transfer, event building, run control, and event analysis functionality. This latter feature includes built-in support for the art event analysis framework, allowing experiments to run art modules for real-time filtering, compression, disk writing and online monitoring. As art, also developed at Fermilab, is used for offline analysis as well, a major advantage of artdaq is that it allows developers to easily switch between developing online and offline software. artdaq continues to be improved. Support for an alternate mode of running whereby data from some subdetector components are only streamed if requested has been added; this option will reduce unnecessary DAQ throughput. Real-time reporting of DAQ metrics has been implemented, along with the flexibility to choose the format through which experiments receive the reports; these formats include the Ganglia, Graphite and syslog software packages, along with flat ASCII files. Additionally, work has been performed investigating more flexible modes of online monitoring, including the capability to run multiple online monitoring processes on different hosts, each running its own set of art modules. Finally, a web-based GUI interface through which users can configure details of their DAQ system has been implemented, increasing the ease of use of the system. Already successfully deployed on the LArIAT, DarkSide-50, DUNE 35ton and Mu2e experiments, artdaq will be employed for SBND and is a strong candidate for use on ICARUS and protoDUNE. With each experiment comes new ideas for how artdaq can be made more flexible and powerful. The above improvements will be described, along with potential ideas for the future.
Ali, Dler; Mohammed, Hnd; Koo, Seung-Hwan; Kang, Kyung-Hwa; Kim, Sang-Cheol
2016-09-01
The aim of this study was to analyze tooth movement and arch width changes in maxillary dentition following nonextraction treatment with orthodontic mini-implant (OMI) anchorage in Class II division 1 malocclusions. Seventeen adult patients diagnosed with Angle's Class II division 1 malocclusion were treated by nonextraction with OMIs as anchorage for distalization of whole maxillary dentition. Three-dimensional virtual maxillary models were superimposed with the best-fit method at the pretreatment and post-treatment stages. Linear, angular, and arch width variables were measured using Rapidform 2006 software, and analyzed by the paired t-test. All maxillary teeth showed statistically significant movement posteriorly (p < 0.05). There were no significant changes in the vertical position of the maxillary teeth, except that the second molars were extruded (0.86 mm, p < 0.01). The maxillary first and second molars were rotated distal-in (4.5°, p < 0.001; 3.0°, p < 0.05, respectively). The intersecond molar width increased slightly (0.1 mm, p > 0.05) and the intercanine, interfirst premolar, intersecond premolar, and interfirst molar widths increased significantly (2.2 mm, p < 0.01; 2.2 mm, p < 0.05; 1.9 mm, p < 0.01; 2.0 mm, p < 0.01; respectively). Nonextraction treatment with OMI anchorage for Class II division 1 malocclusions could retract the whole maxillary dentition to achieve a Class I canine and molar relationship without a change in the vertical position of the teeth; however, the second molars were significantly extruded. Simultaneously, the maxillary arch was shown to be expanded with distal-in rotation of the molars.
Ali, Dler; Mohammed, Hnd; Koo, Seung-Hwan; Kang, Kyung-Hwa
2016-01-01
Objective The aim of this study was to analyze tooth movement and arch width changes in maxillary dentition following nonextraction treatment with orthodontic mini-implant (OMI) anchorage in Class II division 1 malocclusions. Methods Seventeen adult patients diagnosed with Angle's Class II division 1 malocclusion were treated by nonextraction with OMIs as anchorage for distalization of whole maxillary dentition. Three-dimensional virtual maxillary models were superimposed with the best-fit method at the pretreatment and post-treatment stages. Linear, angular, and arch width variables were measured using Rapidform 2006 software, and analyzed by the paired t-test. Results All maxillary teeth showed statistically significant movement posteriorly (p < 0.05). There were no significant changes in the vertical position of the maxillary teeth, except that the second molars were extruded (0.86 mm, p < 0.01). The maxillary first and second molars were rotated distal-in (4.5°, p < 0.001; 3.0°, p < 0.05, respectively). The intersecond molar width increased slightly (0.1 mm, p > 0.05) and the intercanine, interfirst premolar, intersecond premolar, and interfirst molar widths increased significantly (2.2 mm, p < 0.01; 2.2 mm, p < 0.05; 1.9 mm, p < 0.01; 2.0 mm, p < 0.01; respectively). Conclusions Nonextraction treatment with OMI anchorage for Class II division 1 malocclusions could retract the whole maxillary dentition to achieve a Class I canine and molar relationship without a change in the vertical position of the teeth; however, the second molars were significantly extruded. Simultaneously, the maxillary arch was shown to be expanded with distal-in rotation of the molars. PMID:27668191
NASA Astrophysics Data System (ADS)
Martins, T. M.; Kelman, R.; Metello, M.; Ciarlini, A.; Granville, A. C.; Hespanhol, P.; Castro, T. L.; Gottin, V. M.; Pereira, M. V. F.
2015-12-01
The hydroelectric potential of a river is proportional to its head and water flows. Selecting the best development alternative for greenfield watershed projects is a difficult task, since it must balance demands for infrastructure, especially in the developing world where a large potential remains unexplored, with environmental conservation. Discussions usually diverge into antagonistic views, as in recent projects in the Amazon forest, for example. This motivates the construction of a computational tool that will support a more qualified debate regarding development/conservation options. HERA provides the optimal head-division partition of a river considering technical, economic and environmental aspects. HERA has three main components: (i) GIS pre-processing of topographic and hydrologic data; (ii) automatic engineering and equipment design and budget estimation for candidate projects; (iii) translation of the division-partition problem into a mathematical programming model. By integrating automatic calculation with geoprocessing tools, cloud computation and optimization techniques, HERA makes it possible for countless head-partition alternatives to be intrinsically compared - a great advantage with respect to traditional field surveys followed by engineering design methods. Based on optimization techniques, HERA determines which hydro plants should be built, including location, design, technical data (e.g. water head, reservoir area and volume), engineering design (dam, spillways, etc.) and costs. The results can be visualized in the HERA interface and exported to GIS software, Google Earth or CAD systems. HERA has a global scope of application since the main input data are a Digital Terrain Model and water inflows at gauging stations. The objective is to contribute to an increased rationality of decisions by presenting to the stakeholders a clear and quantitative view of the alternatives, their opportunities and threats.
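The head-partition idea can be sketched as choosing dam sites along the river so that the value of the energy produced by the resulting heads, net of construction cost, is maximized while respecting simple restrictions. The dynamic-programming toy below only illustrates that formulation with invented elevations, flows, costs and a maximum-head constraint; it is not HERA's actual mathematical programming model.

```python
# Candidate dam sections ordered downstream; each dam's head is the drop from
# the previous dam upstream (or from the headwater). All data are invented.
sections = [
    {"elev": 500.0, "flow": 100.0, "dam_allowed": False},  # 0: headwater reference
    {"elev": 460.0, "flow": 120.0, "dam_allowed": True},
    {"elev": 430.0, "flow": 150.0, "dam_allowed": False},   # protected reach, no dam
    {"elev": 390.0, "flow": 180.0, "dam_allowed": True},
    {"elev": 340.0, "flow": 220.0, "dam_allowed": True},
]
RHO, G, EFFICIENCY = 1000.0, 9.81, 0.9
VALUE_PER_WATT, COST_PER_DAM = 3.0, 50.0e6     # invented economics
MAX_HEAD = 120.0                               # invented engineering limit (m)

def net_value(upstream, site):
    head = sections[upstream]["elev"] - sections[site]["elev"]
    power_w = RHO * G * sections[site]["flow"] * head * EFFICIENCY
    return VALUE_PER_WATT * power_w - COST_PER_DAM

# best[i]: maximum net value of a cascade whose most downstream dam is at i.
best, prev = {0: 0.0}, {0: None}
for i, sec in enumerate(sections):
    if i == 0 or not sec["dam_allowed"]:
        continue
    options = [(best[j] + net_value(j, i), j) for j in best
               if sections[j]["elev"] - sec["elev"] <= MAX_HEAD]
    if options:
        best[i], prev[i] = max(options)

# Trace back the most valuable cascade found.
site = max(best, key=best.get)
plan = []
while prev[site] is not None:
    plan.append(site)
    site = prev[site]
print("chosen dam sections (upstream to downstream):", sorted(plan))
```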
Do Over or Make Do? Climate Models as a Software Development Challenge (Invited)
NASA Astrophysics Data System (ADS)
Easterbrook, S. M.
2010-12-01
We present the results of a comparative study of the software engineering culture and practices at four different earth system modeling centers: the UK Met Office Hadley Centre, the National Center for Atmospheric Research (NCAR), The Max-Planck-Institut für Meteorologie (MPI-M), and the Institut Pierre Simon Laplace (IPSL). The study investigated the software tools and techniques used at each center to assess their effectiveness. We also investigated how differences in the organizational structures, collaborative relationships, and technical infrastructures constrain the software development and affect software quality. Specific questions for the study included 1) Verification and Validation - What techniques are used to ensure that the code matches the scientists’ understanding of what it should do? How effective are these at eliminating errors of correctness and errors of understanding? 2) Coordination - How are the contributions from across the modeling community coordinated? For coupled models, how are the differences in the priorities of different, overlapping communities of users addressed? 3) Division of responsibility - How are the responsibilities for coding, verification, and coordination distributed between different roles (scientific, engineering, support) in the organization? 4) Planning and release processes - How do modelers decide on priorities for model development, and how do they decide which changes to tackle in a particular release of the model? 5) Debugging - How do scientists debug the models, what types of bugs do they find in their code, and how do they find them? The results show that each center has evolved a set of model development practices that are tailored to their needs and organizational constraints. These practices emphasize scientific validity, but tend to neglect other software qualities, and all the centers struggle frequently with software problems. The testing processes are effective at removing software errors prior to release, but the code is hard to understand and hard to change. Software errors and model configuration problems are common during model development, and appear to have a serious impact on scientific productivity. These problems have grown dramatically in recent years with the growth in size and complexity of earth system models. Much of the success in obtaining valid simulations from the models depends on the scientists developing their own code, experimenting with alternatives, running frequent full system tests, and exploring patterns in the results. Blind application of generic software engineering processes is unlikely to work well. Instead, each center needs to learn how to balance the need for better coordination through a more disciplined approach with the freedom to explore, and the value of having scientists work directly with the code. This suggests that each center can learn a lot from comparing their practices with others, but that each might need to develop a different set of best practices.
Modeling Code Is Helping Cleveland Develop New Products
NASA Technical Reports Server (NTRS)
1998-01-01
Master Builders, Inc., is a 350-person company in Cleveland, Ohio, that develops and markets specialty chemicals for the construction industry. Developing new products involves creating many potential samples and running numerous tests to characterize the samples' performance. Company engineers enlisted NASA's help to replace cumbersome physical testing with computer modeling of the samples' behavior. Since the NASA Lewis Research Center's Structures Division develops mathematical models and associated computation tools to analyze the deformation and failure of composite materials, its researchers began a two-phase effort to modify Lewis' Integrated Composite Analyzer (ICAN) software for Master Builders' use. Phase I has been completed, and Master Builders is pleased with the results. The company is now working to begin implementation of Phase II.
Planning assistance for the NASA 30/20 GHz program. Network control architecture study.
NASA Technical Reports Server (NTRS)
Inukai, T.; Bonnelycke, B.; Strickland, S.
1982-01-01
Network Control Architecture for a 30/20 GHz flight experiment system operating in the Time Division Multiple Access (TDMA) mode was studied. Architecture development, identification of processing functions, and performance requirements for the Master Control Station (MCS), diversity trunking stations, and Customer Premises Service (CPS) stations are covered. Preliminary hardware and software processing requirements as well as budgetary cost estimates for the network control system are given. For the trunking system control, areas covered include on board SS-TDMA switch organization, frame structure, acquisition and synchronization, channel assignment, fade detection and adaptive power control, on board oscillator control, and terrestrial network timing. For the CPS control, they include on board processing and adaptive forward error correction control.
Basic characteristics of the information system of health insurance in FB&H.
Dzubur, Amela; Besić, Asim; Omanić, Ajnija; Dzubur, Alen; Niksić, Dragana
2004-10-01
Due to the territorial and administrative division during the war period, the post-war information system of health protection was divided into two systems, matching the organisation of health insurance in that period. Those information systems were incompatible, built on different hardware and software. Therefore, the Ministry of Health, within the project "Basic hospital services" financed through a World Bank loan, introduced a new, common information system for health insurance. The goal of this paper is to present the basic features of the health insurance information system in FB&H, as well as the way it functions with respect to the other institutions included in the system, the respective databases, and the sites where data are entered and updated, using data available from the Federal Bureau of Health Insurance.
Cost and schedule estimation study report
NASA Technical Reports Server (NTRS)
Condon, Steve; Regardie, Myrna; Stark, Mike; Waligora, Sharon
1993-01-01
This report describes the analysis performed and the findings of a study of the software development cost and schedule estimation models used by the Flight Dynamics Division (FDD), Goddard Space Flight Center. The study analyzes typical FDD projects, focusing primarily on those developed since 1982. The study reconfirms the standard SEL effort estimation model that is based on size adjusted for reuse; however, guidelines for the productivity and growth parameters in the baseline effort model have been updated. The study also produced a schedule prediction model based on empirical data that varies depending on application type. Models for the distribution of effort and schedule by life-cycle phase are also presented. Finally, this report explains how to use these models to plan SEL projects.
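As an illustration of the kind of size-adjusted-for-reuse estimation described above (the actual SEL coefficients are not given in this abstract, so every number below is a placeholder labeled as such), effort and schedule might be computed like this:

```python
def adjusted_size(new_sloc, reused_sloc, reuse_cost_factor=0.2):
    """Developed size where reused code counts at a reduced (hypothetical) weight."""
    return new_sloc + reuse_cost_factor * reused_sloc

def effort_hours(size, productivity_sloc_per_hour=3.5):
    """Hypothetical linear effort model: effort grows with size adjusted for reuse."""
    return size / productivity_sloc_per_hour

def schedule_months(effort, a=4.9, b=0.3):
    """Hypothetical power-law schedule model; a and b would vary by application type."""
    staff_months = effort / 152.0          # roughly 152 labor hours per staff-month
    return a * staff_months ** b

# Invented example project: mostly reused code, as in an AGSS built from a library.
new_code, reused_code = 30_000, 90_000
size = adjusted_size(new_code, reused_code)
eff = effort_hours(size)
print(f"adjusted size = {size:,.0f} SLOC, effort = {eff:,.0f} hours, "
      f"schedule = {schedule_months(eff):.1f} months")
```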
Command and Control Software Development
NASA Technical Reports Server (NTRS)
Wallace, Michael
2018-01-01
The future of the National Aeronautics and Space Administration (NASA) depends on its innovation and efficiency in the coming years. With ambitious goals to reach Mars and explore the vast universe, correct steps must be taken to ensure our space program reaches its destination safely. The interns in the Exploration Systems and Operations Division at the Kennedy Space Center (KSC) have been tasked with building command line tools to ease the process of managing and testing the data being produced by the ground control systems while its recording system is not in use. While working alongside full-time engineers, we were able to create multiple programs that reduce the cost and time it takes to test the subsystems that launch rockets to outer space.
The Virtual Solar Observatory and the Heliophysics Meta-Virtual Observatory
NASA Technical Reports Server (NTRS)
Gurman, Joseph B.
2007-01-01
The Virtual Solar Observatory (VSO) is now able to search for solar data ranging from the radio to gamma rays, obtained from space and groundbased observatories, from 26 sources at 12 data providers, and from 1915 to the present. The solar physics community can use a Web interface or an Application Programming Interface (API) that allows integrating VSO searches into other software, including other Web services. Over the next few years, this integration will be especially obvious as the NASA Heliophysics division sponsors the development of a heliophysics-wide virtual observatory (VO), based on existing VO's in heliospheric, magnetospheric, and ionospheric physics as well as the VSO. We examine some of the challenges and potential of such a "meta-VO."
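As one example of driving the VSO programmatically (the abstract only says an API exists, so treat the client below as an assumption), the community SunPy package wraps VSO queries behind its Fido search interface:

```python
import astropy.units as u
from sunpy.net import Fido, attrs as a

# Search the VSO for SOHO/EIT 195 A images over a one-hour window (dates arbitrary).
result = Fido.search(
    a.Time("2007-01-01 00:00", "2007-01-01 01:00"),
    a.Instrument("EIT"),
    a.Wavelength(195 * u.angstrom),
)
print(result)                  # tabulated summary of the matching records
# files = Fido.fetch(result)   # uncomment to download the matched files
```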
NASA Automated Rendezvous and Capture Review. Executive summary
NASA Technical Reports Server (NTRS)
1991-01-01
In support of the Cargo Transfer Vehicle (CTV) Definition Studies in FY-92, the Advanced Program Development division of the Office of Space Flight at NASA Headquarters conducted an evaluation and review of the United States capabilities and state-of-the-art in Automated Rendezvous and Capture (AR&C). This review was held in Williamsburg, Virginia on 19-21 Nov. 1991 and included over 120 attendees from U.S. government organizations, industries, and universities. One hundred abstracts were submitted to the organizing committee for consideration. Forty-two were selected for presentation. The review was structured to include five technical sessions. Forty-two papers addressed topics in the five categories below: (1) hardware systems and components; (2) software systems; (3) integrated systems; (4) operations; and (5) supporting infrastructure.
Scalable cluster administration - Chiba City I approach and lessons learned.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Navarro, J. P.; Evard, R.; Nurmi, D.
2002-07-01
Systems administrators of large clusters often need to perform the same administrative activity hundreds or thousands of times. Often such activities are time-consuming, especially the tasks of installing and maintaining software. By combining network services such as DHCP, TFTP, FTP, HTTP, and NFS with remote hardware control, cluster administrators can automate all administrative tasks. Scalable cluster administration addresses the following challenge: What systems design techniques can cluster builders use to automate cluster administration on very large clusters? We describe the approach used in the Mathematics and Computer Science Division of Argonne National Laboratory on Chiba City I, a 314-node Linux cluster; and we analyze the scalability, flexibility, and reliability benefits and limitations from that approach.
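In the same spirit of repeating one administrative action across hundreds of nodes, the sketch below fans a single command out over ssh in parallel; the node-naming scheme and command are invented and have nothing to do with Chiba City's actual tooling.

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

NODES = [f"ccn{i:03d}" for i in range(1, 315)]   # hypothetical names for 314 nodes

def run(node, command):
    # BatchMode avoids hanging on password prompts when a node is unreachable.
    proc = subprocess.run(
        ["ssh", "-o", "BatchMode=yes", node, command],
        capture_output=True, text=True, timeout=60)
    return node, proc.returncode, proc.stdout.strip()

with ThreadPoolExecutor(max_workers=32) as pool:
    for node, rc, out in pool.map(lambda n: run(n, "uname -r"), NODES):
        print(f"{node}: rc={rc} {out}")
```

The same fan-out pattern applies whether the payload is a kernel query, a package install, or a reboot issued through remote hardware control.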
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Schrenkenghost, Debra K.
2001-01-01
The Adjustable Autonomy Testbed (AAT) is a simulation-based testbed located in the Intelligent Systems Laboratory in the Automation, Robotics and Simulation Division at NASA Johnson Space Center. The purpose of the testbed is to support evaluation and validation of prototypes of adjustable autonomous agent software for control and fault management for complex systems. The AAT project has developed prototype adjustable autonomous agent software and human interfaces for cooperative fault management. This software builds on current autonomous agent technology by altering the architecture, components and interfaces for effective teamwork between autonomous systems and human experts. Autonomous agents include a planner, flexible executive, low level control and deductive model-based fault isolation. Adjustable autonomy is intended to increase the flexibility and effectiveness of fault management with an autonomous system. The test domain for this work is control of advanced life support systems for habitats for planetary exploration. The CONFIG hybrid discrete event simulation environment provides flexible and dynamically reconfigurable models of the behavior of components and fluids in the life support systems. Both discrete event and continuous (discrete time) simulation are supported, and flows and pressures are computed globally. This provides fast dynamic simulations of interacting hardware systems in closed loops that can be reconfigured during operations scenarios, producing complex cascading effects of operations and failures. Current object-oriented model libraries support modeling of fluid systems, and models have been developed of physico-chemical and biological subsystems for processing advanced life support gases. In FY01, water recovery system models will be developed.
Automatic breast tissue density estimation scheme in digital mammography images
NASA Astrophysics Data System (ADS)
Menechelli, Renan C.; Pacheco, Ana Luisa V.; Schiabel, Homero
2017-03-01
Cases of breast cancer have increased substantially each year. However, radiologists are subject to subjectivity and failures of interpretation which may affect the final diagnosis in this examination. High density in breast tissue is an important factor related to these failures. Thus, among their many functions, some CADx (Computer-Aided Diagnosis) schemes classify breasts according to the predominant density. In order to aid in such a procedure, this work describes automated software for classification and statistical information on the percentage change in breast tissue density, through analysis of sub-regions (ROIs) of the whole mammography image. Once the breast is segmented, the image is divided into regions from which texture features are extracted. An artificial neural network (MLP) was then used to categorize the ROIs. Experienced radiologists had previously determined the ROI density classification, which served as the reference for evaluating the software. Test results showed an average accuracy of 88.7% in ROI classification and 83.25% in classifying whole-breast density into the 4 BI-RADS density classes, over a set of 400 images. Furthermore, when considering only a simplified two-class division (high and low densities), the classifier accuracy reached 93.5%, with AUC = 0.95.
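A minimal sketch of the ROI-classification step, using scikit-learn's MLP in place of whatever network implementation the authors used (the feature matrix and labels below are random placeholders standing in for real texture features and radiologist-assigned classes):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# X: one row of texture features per ROI; y: density class labels (e.g. BI-RADS 1-4).
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 12))        # placeholder texture features
y = rng.integers(1, 5, size=400)      # placeholder radiologist labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000, random_state=0)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```

With real texture features the same pipeline yields per-ROI class predictions, which can then be aggregated into a whole-breast density class as the abstract describes.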
Bathymetric map of the north part of Great Salt Lake, Utah, 2006
Baskin, Robert L.; Turner, Jane
2006-01-01
The U.S. Geological Survey, in cooperation with the Utah Department of Natural Resources, Division of Forestry, Fire, and State Lands, collected bathymetric data for the north part of Great Salt Lake during the spring and early summer of 2006 using a single beam, high-definition fathometer and real-time differential global positioning system. Approximately 5.2 million depth readings were collected along more than 765 miles of survey transects for construction of this map. Sound velocities were obtained in conjunction with the bathymetric data to provide time-of-travel corrections to the depth calculations. Data were processed using commercial hydrographic software and exported into a geographic information system (GIS) software for mapping. Due to the shallow nature of the lake and the limitations of the instrumentation, contours above an altitude of 4,194 feet were digitized from existing USGS 1:24,000 source-scale digital line graph data. The Behrens Trench is approximately located. For additional information on methods used to derive the bathymetric contours for this map, please see Baskin, Robert L., 2006, Calculation of area and volume for the North Part of Great Salt Lake, Utah, U.S. Geological Survey Open-File Report OFR–2006–1359
Digital coherent receiver based transmitter penalty characterization.
Geisler, David J; Kaufmann, John E
2016-12-26
For optical communications links where receivers are signal-power-starved, such as through free-space, it is important to design transmitters and receivers that can operate as close as practically possible to theoretical limits. A total system penalty is typically assessed in terms of how far the end-to-end bit-error rate (BER) is from these limits. It is desirable, but usually difficult, to determine the division of this penalty between the transmitter and receiver. This paper describes a new rigorous and computationally based method that isolates which portion of the penalty can be assessed against the transmitter. There are two basic parts to this approach: (1) use of a coherent optical receiver to perform frequency down-conversion of a transmitter's optical signal waveform to the electrical domain, preserving both optical field amplitude and phase information, and (2) software-based analysis of the digitized electrical waveform. The result is a single numerical metric that quantifies how close a transmitter's signal waveform is to the ideal, based on its BER performance with a perfect software-defined matched-filter receiver demodulator. A detailed description of applying the proposed methodology to the waveform characterization of an optical burst-mode differential phase-shifted keying (DPSK) transmitter is experimentally demonstrated.
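As a toy illustration of the software-demodulation idea (one symbol per sample, additive noise only, nothing resembling the paper's full waveform analysis), the sketch below generates a DPSK bit stream, demodulates it with a differential detector in software, and counts bit errors:

```python
import numpy as np

rng = np.random.default_rng(0)
n_bits = 100_000
bits = rng.integers(0, 2, n_bits)

# DPSK modulation: each bit is carried by the phase *difference* between symbols.
phase = np.concatenate(([0.0], np.cumsum(np.pi * bits)))
tx = np.exp(1j * phase)

# Additive complex Gaussian noise; sigma sets the per-symbol SNR.
sigma = 0.4
noise = (rng.normal(size=tx.shape) + 1j * rng.normal(size=tx.shape)) / np.sqrt(2)
rx = tx + sigma * noise

# Software differential detector: compare each symbol's phase to the previous one.
decision = rx[1:] * np.conj(rx[:-1])
bits_hat = (decision.real < 0).astype(int)   # a phase flip of pi encodes a '1'

ber = np.mean(bits_hat != bits)
print(f"measured BER = {ber:.2e} over {n_bits} bits")
```

Comparing such a measured BER against the ideal curve for the same SNR is, in spirit, how a penalty can be attributed to the transmitted waveform rather than to the receiver.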
A method of LED free-form tilted lens rapid modeling based on scheme language
NASA Astrophysics Data System (ADS)
Dai, Yidan
2017-10-01
According to nonimaging optical principles and the traditional LED free-form surface lens, a new kind of LED free-form tilted lens was designed, and a method of rapid modeling based on the Scheme language was proposed. The mesh division method was applied to obtain the corresponding surface configuration according to the character of the light source and the desired energy distribution on the illumination plane. Then 3D modeling software and Scheme-language programming were each used to generate the lens model. With the help of optical simulation software, a light source measuring 1 mm × 1 mm × 1 mm is used in the experiment, the lateral migration distance of the illumination area is 0.5 m, and a total of one million rays are traced. Simulated results were obtained for both models. The simulated output shows that the Scheme-language approach prevents the model deformation problems caused by model transfer; the illumination uniformity reaches 82% and the offset angle is 26°. Also, the efficiency of the modeling process is greatly increased by using the Scheme language.
Micro-video display with ocular tracking and interactive voice control
NASA Technical Reports Server (NTRS)
Miller, James E.
1993-01-01
In certain space-restricted environments, many of the benefits resulting from computer technology have been foregone because of the size, weight, inconvenience, and lack of mobility associated with existing computer interface devices. Accordingly, an effort to develop a highly miniaturized and 'wearable' computer display and control interface device, referred to as the Sensory Integrated Data Interface (SIDI), is underway. The system incorporates a micro-video display that provides data display and ocular tracking on a lightweight headset. Software commands are implemented by conjunctive eye movement and voice commands of the operator. In this initial prototyping effort, various 'off-the-shelf' components have been integrated with a desktop computer and a customized menu-tree software application to demonstrate feasibility and conceptual capabilities. When fully developed as a customized system, the interface device will allow mobile, 'hands-free' operation of portable computer equipment. It will thus allow integration of information technology applications into those restrictive environments, both military and industrial, that have not yet taken advantage of the computer revolution. This effort is Phase 1 of Small Business Innovative Research (SBIR) Topic number N90-331 sponsored by the Naval Undersea Warfare Center Division, Newport. The prime contractor is Foster-Miller, Inc. of Waltham, MA.
CAD/CAM/AM applications in the manufacture of dental appliances.
Al Mortadi, Noor; Eggbeer, Dominic; Lewis, Jeffrey; Williams, Robert J
2012-11-01
The purposes of this study were to apply the latest developments in additive manufacturing (AM) construction and to evaluate the effectiveness of these computer-aided design and computer-aided manufacturing (CAD/CAM) techniques in the production of dental appliances. In addition, a new method of incorporating wire into a single build was developed. A scanner was used to capture 3-dimensional images of Class II Division 1 dental models that were translated onto a 2-dimensional computer screen. Andresen and sleep-apnea devices were designed in 3 dimensions by using FreeForm software (version 11; Geo Magics SensAble Group, Wilmington, Mass) and a phantom arm. The design was then exported and transferred to an AM machine for building. Copyright © 2012 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
Biomedical Computing Technology Information Center: introduction and report of early progress
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maskewitz, B.F.; Henne, R.L.; McClain, W.J.
1976-01-01
In July 1975, the Biomedical Computing Technology Information Center (BCTIC) was established by the Division of Biomedical and Environmental Research of the U. S. Energy Research and Development Administration (ERDA) at the Oak Ridge National Laboratory. BCTIC collects, organizes, evaluates, and disseminates information on computing technology pertinent to biomedicine, providing needed routes of communication between installations and serving as a clearinghouse for the exchange of biomedical computing software, data, and interface designs. This paper presents BCTIC's functions and early progress to the MUMPS Users' Group in order to stimulate further discussion and cooperation between the two organizations. (BCTIC services are available to its sponsors and their contractors and to any individual/group willing to participate in mutual exchange.)
Guidance, Navigation, and Control Technology Assessment for Future Planetary Science Missions
NASA Technical Reports Server (NTRS)
Beauchamp, Pat; Cutts, James; Quadrelli, Marco B.; Wood, Lincoln J.; Riedel, Joseph E.; McHenry, Mike; Aung, MiMi; Cangahuala, Laureano A.; Volpe, Rich
2013-01-01
Future planetary explorations envisioned by the National Research Council's (NRC's) report titled Vision and Voyages for Planetary Science in the Decade 2013-2022, developed for NASA Science Mission Directorate (SMD) Planetary Science Division (PSD), seek to reach targets of broad scientific interest across the solar system. This goal requires new capabilities such as innovative interplanetary trajectories, precision landing, operation in close proximity to targets, precision pointing, multiple collaborating spacecraft, multiple target tours, and advanced robotic surface exploration. Advancements in Guidance, Navigation, and Control (GN&C) and Mission Design in the areas of software, algorithm development and sensors will be necessary to accomplish these future missions. This paper summarizes the key GN&C and mission design capabilities and technologies needed for future missions pursuing SMD PSD's scientific goals.
Model MTF for the mosaic window
NASA Astrophysics Data System (ADS)
Xing, Zhenchong; Hong, Yongfeng; Zhang, Bao
2017-10-01
An electro-optical targeting system mounted either within an airframe or housed in separate pods requires a window to form an environmental barrier to the outside world. In current practice, such windows usually use a mosaic or segmented window. When scanning the target, internally gimbaled systems sweep over the window, which can affect the modulation transfer function (MTF) due to wave-front division and optical path differences arising from the thickness/wedge differences between panes. In this paper, a mathematical model of the MTF of the mosaic window is presented that allows an analysis of influencing factors; we show how the model may be integrated into ZEMAX® software for optical design. The model can be used to guide both the design and the tolerance analysis of optical systems that employ a mosaic window.
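To illustrate the wave-front-division effect in the simplest possible terms, one can compute the PSF and MTF numerically for a two-pane pupil with a pure piston step between panes. The sketch below is only a caricature of the paper's full model (square pupil, arbitrary quarter-wave step, no wedge or obscuration):

```python
import numpy as np

n = 512
x = np.linspace(-1.0, 1.0, n)
X, Y = np.meshgrid(x, x)
# Square pupil occupying the central part of the grid so the FFT is well padded.
aperture = (np.abs(X) <= 0.5) & (np.abs(Y) <= 0.5)

def mtf_cut(opd_waves):
    """Central MTF profile for a pupil split at X=0 with a piston OPD step (waves)."""
    phase = np.where(X < 0.0, 0.0, 2.0 * np.pi * opd_waves)
    pupil = aperture * np.exp(1j * phase)
    psf = np.abs(np.fft.fftshift(np.fft.fft2(pupil))) ** 2
    otf = np.fft.fft2(np.fft.ifftshift(psf))          # OTF = FT of the PSF
    mtf = np.abs(np.fft.fftshift(otf))
    return mtf[n // 2] / mtf.max()                     # horizontal cut, normalized

ideal = mtf_cut(0.0)
stepped = mtf_cut(0.25)        # quarter-wave piston difference between the two panes
print("worst-case MTF drop across frequency:", float(np.max(ideal - stepped)))
```

Even this piston-only toy shows the MTF degradation growing with the inter-pane optical path difference, which is the effect the full model quantifies for realistic thickness and wedge tolerances.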
Fronthaul evolution: From CPRI to Ethernet
NASA Astrophysics Data System (ADS)
Gomes, Nathan J.; Chanclou, Philippe; Turnbull, Peter; Magee, Anthony; Jungnickel, Volker
2015-12-01
It is proposed that using Ethernet in the fronthaul, between base station baseband unit (BBU) pools and remote radio heads (RRHs), can bring a number of advantages, from use of lower-cost equipment, shared use of infrastructure with fixed access networks, to obtaining statistical multiplexing and optimised performance through probe-based monitoring and software-defined networking. However, a number of challenges exist: ultra-high-bit-rate requirements from the transport of increased bandwidth radio streams for multiple antennas in future mobile networks, and low latency and jitter to meet delay requirements and the demands of joint processing. A new fronthaul functional division is proposed which can alleviate the most demanding bit-rate requirements by transport of baseband signals instead of sampled radio waveforms, and enable statistical multiplexing gains. Delay and synchronisation issues remain to be solved.
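To give a feel for why transporting sampled radio waveforms drives ultra-high bit rates, the arithmetic below reproduces a CPRI-style calculation for a single LTE-like carrier; the sample rate, sample width, and overhead factors are typical textbook values assumed for illustration, not figures taken from this paper.

```python
# CPRI-style fronthaul rate for one 20 MHz LTE-like carrier (typical assumed values).
sample_rate_hz = 30.72e6       # I/Q sample rate for a 20 MHz carrier
bits_per_sample = 15           # per I and per Q component
antennas = 2                   # 2x2 MIMO
control_overhead = 16 / 15     # one control word per 15 data words
line_coding = 10 / 8           # 8b/10b line coding

payload = sample_rate_hz * 2 * bits_per_sample * antennas
line_rate = payload * control_overhead * line_coding
print(f"payload  : {payload / 1e9:.3f} Gbit/s")
print(f"line rate: {line_rate / 1e9:.4f} Gbit/s")   # roughly 2.46 Gbit/s per carrier
```

Scaling the antenna count and carrier bandwidth multiplies this figure directly, which is why moving the functional split so that baseband signals, rather than raw samples, cross the fronthaul relieves the bit-rate requirement.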
The Physics of Information Technology
NASA Astrophysics Data System (ADS)
Gershenfeld, Neil
2000-10-01
The Physics of Information Technology explores the familiar devices that we use to collect, transform, transmit, and interact with electronic information. Many such devices operate surprisingly close to very many fundamental physical limits. Understanding how such devices work, and how they can (and cannot) be improved, requires deep insight into the character of physical law as well as engineering practice. The book starts with an introduction to units, forces, and the probabilistic foundations of noise and signaling, then progresses through the electromagnetics of wired and wireless communications, and the quantum mechanics of electronic, optical, and magnetic materials, to discussions of mechanisms for computation, storage, sensing, and display. This self-contained volume will help both physical scientists and computer scientists see beyond the conventional division between hardware and software to understand the implications of physical theory for information manipulation.
Run control techniques for the Fermilab DART data acquisition system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oleynik, G.; Engelfried, J.; Mengel, L.
1995-10-01
DART is the high speed, Unix based data acquisition system being developed by the Fermilab Computing Division in collaboration with eight High Energy Physics Experiments. This paper describes DART run-control which implements flexible, distributed, extensible and portable paradigms for the control and monitoring of data acquisition systems. We discuss the unique and interesting aspects of the run-control - why we chose the concepts we did, the benefits we have seen from the choices we made, as well as our experiences in deploying and supporting it for experiments during their commissioning and sub-system testing phases. We emphasize the software and techniques we believe are extensible to future use, and potential future modifications and extensions for those we feel are not.
NASA Technical Reports Server (NTRS)
Schmahl, Edward J.; Kundu, Mukul R.
1998-01-01
We have continued our previous efforts in studies of fourier imaging methods applied to hard X-ray flares. We have performed physical and theoretical analysis of rotating collimator grids submitted to GSFC (Goddard Space Flight Center) for the High Energy Solar Spectroscopic Imager (HESSI). We have produced simulation algorithms which are currently being used to test imaging software and hardware for HESSI. We have developed Maximum-Entropy, Maximum-Likelihood, and "CLEAN" methods for reconstructing HESSI images from count-rate profiles. This work is expected to continue through the launch of HESSI in July, 2000. Section 1 shows a poster presentation "Image Reconstruction from HESSI Photon Lists" at the Solar Physics Division Meeting, June 1998; Section 2 shows the text and viewgraphs prepared for "Imaging Simulations" at HESSI's Preliminary Design Review on July 30, 1998.
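Of the reconstruction methods named above, CLEAN is the simplest to sketch: iteratively locate the brightest residual pixel, record a fraction of it as a point-source component, and subtract the correspondingly shifted instrument response. The toy implementation below is a generic Högbom-style loop on synthetic data, not the HESSI code:

```python
import numpy as np

def hogbom_clean(dirty, beam, gain=0.1, n_iter=500, threshold=1e-3):
    """Toy Hogbom CLEAN. `beam` must be (2N-1)x(2N-1) with its peak at (N-1, N-1)."""
    n = dirty.shape[0]
    residual = dirty.astype(float).copy()
    components = np.zeros_like(residual)
    for _ in range(n_iter):
        iy, ix = np.unravel_index(np.argmax(np.abs(residual)), residual.shape)
        peak = residual[iy, ix]
        if abs(peak) < threshold:
            break
        components[iy, ix] += gain * peak
        # Beam slice whose peak lands on (iy, ix).
        residual -= gain * peak * beam[n - 1 - iy:2 * n - 1 - iy,
                                       n - 1 - ix:2 * n - 1 - ix]
    return components, residual

# Synthetic test: two point sources observed through a broad Gaussian beam.
n = 64
yy, xx = np.mgrid[-(n - 1):n, -(n - 1):n]
beam = np.exp(-(xx ** 2 + yy ** 2) / (2 * 3.0 ** 2))
sky = np.zeros((n, n))
sky[20, 30], sky[40, 15] = 1.0, 0.6
dirty = np.zeros((n, n))
for (sy, sx), flux in np.ndenumerate(sky):
    if flux:
        dirty += flux * beam[n - 1 - sy:2 * n - 1 - sy, n - 1 - sx:2 * n - 1 - sx]

comps, res = hogbom_clean(dirty, beam)
print("recovered components near:", np.argwhere(comps > 0.1))
```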
78 FR 60880 - Statement of Organization, Functions, and Delegations of Authority
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-02
... Commissioner (KFB) Division of Business and Resource Management (KFB2) Division of Customer Communications... Systems, Division of Business and Resource Management, Division of Customer Communications, Division of...: The Division of Management Services to the Division of Business and Resource Management; the Division...
Evolving the Reuse Process at the Flight Dynamics Division (FDD) Goddard Space Flight Center
NASA Technical Reports Server (NTRS)
Condon, S.; Seaman, C.; Basili, Victor; Kraft, S.; Kontio, J.; Kim, Y.
1996-01-01
This paper presents the interim results from the Software Engineering Laboratory's (SEL) Reuse Study. The team conducting this study has, over the past few months, been studying the Generalized Support Software (GSS) domain asset library and architecture, and the various processes associated with it. In particular, we have characterized the process used to configure GSS-based attitude ground support systems (AGSS) to support satellite missions at NASA's Goddard Space Flight Center. To do this, we built detailed models of the tasks involved, the people who perform these tasks, and the interdependencies and information flows among these people. These models were based on information gleaned from numerous interviews with people involved in this process at various levels. We also analyzed effort data in order to determine the cost savings in moving from actual development of AGSSs to support each mission (which was necessary before GSS was available) to configuring AGSS software from the domain asset library. While characterizing the GSS process, we became aware of several interesting factors which affect the successful continued use of GSS. Many of these issues fall under the subject of evolving technologies, which were not available at the inception of GSS, but are now. Some of these technologies could be incorporated into the GSS process, thus making the whole asset library more usable. Other technologies are being considered as an alternative to the GSS process altogether. In this paper, we outline some of the issues we will be considering in our continued study of GSS and the impact of evolving technologies.
Earthquake Analysis (EA) Software for The Earthquake Observatories
NASA Astrophysics Data System (ADS)
Yanik, K.; Tezel, T.
2009-04-01
There are many software packages that can be used to observe seismic signals and locate earthquakes, but some of them are commercial and come with technical support. For this reason, many seismological observatories have developed, and use, their own seismological software packages tailored to their networks. In this study, we introduce our software, which can read seismic signals, process them, and locate earthquakes. This software is used by the General Directorate of Disaster Affairs, Earthquake Research Department, Seismology Division (hereafter ERD) and will be improved as new requirements arise. The ERD network consists of 87 seismic stations: 63 are equipped with 24-bit digital Guralp CMG-3T seismometers, 16 with analogue short-period S-13 Geometrics seismometers, and 8 with 24-bit digital short-period S-13j-DR-24 Geometrics seismometers. Data are transmitted by satellite from the broadband stations, whereas leased lines are used for the short-period stations. The daily data archive capacity is 4 GB. In large networks, it is very important to observe seismic signals and locate earthquakes as soon as possible, and this is only possible with software developed with the network's properties in mind. When we started to develop software for a large network such as ours, we identified several requirements: all known seismic data formats should be readable without any conversion; only selected stations should be observable, directly on a map; seismic files should be added with an import command; relations should be established between P and S phase readings and location solutions; data should be stored in a database; and users should log into the program with a user name and password. In this way, we can prevent data disorder and repeated phase readings. Storing the data in a database brings many advantages: easy access to the data from anywhere over Ethernet, publication of bulletins and catalogues on the website, easy sending of short messages (SMS) and e-mail, reading data from anywhere with an Ethernet connection, and storing the results in the same centre. The Earthquake Analysis (EA) program was developed with these facilities in mind. Microsoft Visual Basic 6.0 and Microsoft GDI tools were used as the basis for program development. The EA program can display five different seismic formats (GCF, SUDS, SEISAN, SAC, Nanometrics-Y) without any conversion and provides the usual seismic processing facilities: filtering (band-pass, low-pass, high-pass), fast Fourier transform, offset adjustment, etc.
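The filtering and offset-adjustment steps listed at the end are standard signal-processing operations; a minimal sketch of how they might look with SciPy (synthetic data, arbitrary corner frequencies, unrelated to the Visual Basic implementation of EA) is shown below.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 100.0                                   # sampling rate, Hz
t = np.arange(0, 60.0, 1.0 / fs)
# Synthetic trace: a 5 Hz "event", low-frequency drift, a DC offset, and noise.
trace = (np.sin(2 * np.pi * 5.0 * t) * np.exp(-((t - 30.0) / 3.0) ** 2)
         + 0.5 * np.sin(2 * np.pi * 0.05 * t) + 2.0
         + 0.2 * np.random.default_rng(0).normal(size=t.size))

trace = trace - np.mean(trace)               # offset adjustment (remove DC)

# Zero-phase Butterworth band-pass between 1 and 10 Hz.
b, a = butter(4, [1.0, 10.0], btype="bandpass", fs=fs)
filtered = filtfilt(b, a, trace)

spectrum = np.abs(np.fft.rfft(filtered))     # FFT for a quick spectral check
freqs = np.fft.rfftfreq(filtered.size, d=1.0 / fs)
print("dominant frequency: %.2f Hz" % freqs[np.argmax(spectrum)])
```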
NASA Astrophysics Data System (ADS)
Latal, Jan; Vogl, Jan; Koudelka, Petr; Vitasek, Jan; Siska, Petr; Liner, Andrej; Papes, Martin; Vasinek, Vladimir
2012-01-01
Optical access networks are nowadays developing swiftly in the telecommunications field. These networks can provide higher data transfer rates and have great potential for the future in terms of transmission possibilities. Many local internet providers have responded to these facts and begun gradually installing optical access networks into their originally built networks, mostly based on wireless communication. This has enlarged the possibilities for end users in terms of high data rates and also new services such as Triple Play, IPTV (Internet Protocol television) etc. However, this expansion also raises the question of the achievable reach of these networks. Big cities, such as Prague, Brno, Ostrava or Olomouc, cannot be covered simply, because of their sizes and also because of internal regulations set by various organizations in each city. The standard logical and physical reach of an EPON (IEEE 802.3ah - Ethernet Passive Optical Network) optical access network is about 20 km. However, for networks based on Wavelength Division Multiplexing the reach can be up to 80 km if an optical-fiber amplifier is inserted into the network. This article deals with the simulation of different types of amplifiers for a WDM-PON (Wavelength Division Multiplexing-Passive Optical Network) network in the Optiwave OptiSystem software application, and the simulated values are then compared with real measurements.
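The quoted reach figures (roughly 20 km for a standard EPON, up to about 80 km for an amplified WDM-PON) come down to an optical power budget: launch power plus amplifier gain minus splitter, connector and fibre losses must stay above the receiver sensitivity. The sketch below is a back-of-the-envelope budget calculation in Python; every numeric value in it is an illustrative assumption, not a parameter taken from the OptiSystem simulations or the real measurements, so it will not reproduce the 20 km and 80 km figures exactly.

```python
# Rough optical power-budget estimate for a PON span (all values assumed).
launch_power_dbm = 3.0        # transmitter launch power
receiver_sens_dbm = -28.0     # receiver sensitivity
splitter_loss_db = 13.0       # e.g. a 1:16 passive splitter
margin_db = 3.0               # connectors, splices, ageing margin
fiber_atten_db_per_km = 0.25  # attenuation near 1550 nm

def max_reach_km(amplifier_gain_db=0.0):
    """Distance at which the received power just meets the sensitivity."""
    budget_db = (launch_power_dbm - receiver_sens_dbm + amplifier_gain_db
                 - splitter_loss_db - margin_db)
    return budget_db / fiber_atten_db_per_km

print("unamplified reach: %.0f km" % max_reach_km())            # ~60 km with these numbers
print("with a +20 dB amplifier: %.0f km" % max_reach_km(20.0))  # ~140 km with these numbers
```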
NASA Astrophysics Data System (ADS)
Almeida, W. G.; Ferreira, A. L.; Mendes, M. V.; Ribeiro, A.; Yoksas, T.
2007-05-01
CPTEC, a division of Brazil's INPE, has been using several open-source software packages for a variety of tasks in its Data Division. Among these tools are ones traditionally used in research and educational communities, such as GrADS (Grid Analysis and Display System, from the Center for Ocean-Land-Atmosphere Studies (COLA)), the Local Data Manager (LDM) and GEMPAK (from Unidata), and operational tools such as the Automatic File Distributor (AFD) that are popular among National Meteorological Services. In addition, some tools developed locally at CPTEC are also being made available as open-source packages. One package is being used to manage the data from the Automatic Weather Stations that INPE operates. This system uses only open-source tools such as the MySQL database, PERL scripts and Java programs for web access, and Unidata's Internet Data Distribution (IDD) system and AFD for data delivery. All of these packages are bundled into a low-cost, easy-to-install package called the Meteorological Data Operational System. Recently, in cooperation with the SICLIMAD project, this system has been modified for use by Portuguese-speaking countries in Africa to manage data from the many Automatic Weather Stations that are being installed in these countries under SICLIMAD sponsorship. In this presentation we describe the tools included in, and the architecture of, the Meteorological Data Operational System.
Design of dual ring wavelength filters for WDM applications
NASA Astrophysics Data System (ADS)
Sathyadevaki, R.; Shanmuga sundar, D.; Sivanantha Raja, A.
2016-12-01
Wavelength division multiplexing plays a prime role in optical communication due to advantages such as easy network expansion, longer span lengths etc. In this work, photonic-crystal-based filters with dual rings are proposed which act as band pass filters (BPF) and channel drop filters (CDF); such filters have found massive application in the C and L bands for wavelength selection and noise filtering at erbium-doped fiber amplifiers and in dense wavelength division multiplexing operation. These filters are formulated on a square lattice of silicon crystal rods (refractive index 3.4) embedded in air (refractive index 1). The dual-ring double filters (band pass filter and channel drop filter) on a single layout pass and drop bands of wavelengths in two distinct arrangements, with entire-band quality factors of 92.09523 & 505.263 and 124.85019 & 456.8633 for the pass and drop filters of the initial and amended setups, respectively. These filters have high quality factors with broad and narrow bandwidths of 16.8 nm & 3.04 nm and 12.85 nm & 3.3927 nm. The transmission spectra and band gaps of the designed filters are analyzed using the Optiwave software suite. The two dual-ring filters incorporated on a single layout occupy a footprint of 15×11 μm, which allows their use in integrated photonic chips for the ultra-compact unification of devices.
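The reported quality factors and bandwidths are linked by the usual definition Q = λc/Δλ (centre wavelength divided by the full width at half maximum). A quick consistency check in Python, assuming operation near 1550 nm (the abstract does not state the exact centre wavelengths), recovers values close to the quoted Q factors:

```python
# Q-factor check: Q = centre wavelength / FWHM bandwidth (centre wavelength assumed).
centre_wavelength_nm = 1550.0
for fwhm_nm in (16.8, 3.04, 12.85, 3.3927):     # bandwidths quoted in the abstract
    print("FWHM %.4f nm -> Q = %.1f" % (fwhm_nm, centre_wavelength_nm / fwhm_nm))
```

The computed values (roughly 92, 510, 121 and 457) are consistent with the reported quality factors for filters operating in the C and L bands.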
Development of the sphagnoid areolation pattern in leaves of Palaeozoic protosphagnalean mosses.
Ivanov, Oleg V; Maslova, Elena V; Ignatov, Michael S
2018-04-11
Protosphagnalean mosses constitute the largest group of extinct mosses of still uncertain affinity. Having the general morphology of the Bryopsida, some have leaves with an areolation pattern characteristic of modern Sphagna. This study describes the structure and variation of these patterns in protosphagnalean mosses and provides a comparison with those of modern Sphagna. Preparations of fossil mosses showing preserved leaf cell structure were obtained by dissolving rock, photographed, and the resulting images were transformed to graphical format and analysed with Areoana computer software. The sphagnoid areolation pattern is identical in its basic structure for both modern Sphagnum and Palaeozoic protosphagnalean mosses. However, in the former group the pattern develops through unequal oblique cell divisions, while in the latter the same pattern is a result of equal cell divisions taking place in a specific order with subsequent uneven cell growth. The protosphagnalean pathway leads to considerable variability in leaf structure. Protosphagnalean mosses had a unique ability to switch the development of leaf areolation between a pathway unique to Sphagnum and another one common to all other mosses. This developmental polyvariancy hinders attempts to classify these mosses, as characters previously considered to be of generic significance can be shown to co-occur in one individual leaf. New understanding of the ontogeny has allowed us to re-evaluate the systematic significance of such diagnostic characters in these Palaeozoic plants, showing that their similarity to Sphagnum is less substantial.
The iRoCS Toolbox--3D analysis of the plant root apical meristem at cellular resolution.
Schmidt, Thorsten; Pasternak, Taras; Liu, Kun; Blein, Thomas; Aubry-Hivet, Dorothée; Dovzhenko, Alexander; Duerr, Jasmin; Teale, William; Ditengou, Franck A; Burkhardt, Hans; Ronneberger, Olaf; Palme, Klaus
2014-03-01
To achieve a detailed understanding of processes in biological systems, cellular features must be quantified in the three-dimensional (3D) context of cells and organs. We described use of the intrinsic root coordinate system (iRoCS) as a reference model for the root apical meristem of plants. iRoCS enables direct and quantitative comparison between the root tips of plant populations at single-cell resolution. The iRoCS Toolbox automatically fits standardized coordinates to raw 3D image data. It detects nuclei or segments cells, automatically fits the coordinate system, and groups the nuclei/cells into the root's tissue layers. The division status of each nucleus may also be determined. The only manual step required is to mark the quiescent centre. All intermediate outputs may be refined if necessary. The ability to learn the visual appearance of nuclei by example allows the iRoCS Toolbox to be easily adapted to various phenotypes. The iRoCS Toolbox is provided as an open-source software package, licensed under the GNU General Public License, to make it accessible to a broad community. To demonstrate the power of the technique, we measured subtle changes in cell division patterns caused by modified auxin flux within the Arabidopsis thaliana root apical meristem. © 2014 The Authors The Plant Journal © 2014 John Wiley & Sons Ltd.
Constellation Training Facility Support
NASA Technical Reports Server (NTRS)
Flores, Jose M.
2008-01-01
The National Aeronautics and Space Administration is developing the next set of vehicles that will take men back to the moon under the Constellation Program. The Constellation Training Facility (CxTF) is a project in development that will be used to train astronauts, instructors, and flight controllers on the operation of Constellation Program vehicles. It will also be used for procedure verification and validation of flight software and console tools. The CxTF will have simulations for the Crew Exploration Vehicle (CEV), Crew Module (CM), CEV Service Module (SM), Launch Abort System (LAS), Spacecraft Adapter (SA), Crew Launch Vehicle (CLV), Pressurized Cargo Variant CM, Pressurized Cargo Variant SM, Cargo Launch Vehicle, Earth Departure Stage (EDS), and the Lunar Surface Access Module (LSAM). The facility will consist of part-task and full-task trainers, each with a specific set of mission training capabilities. Part-task trainers will be used for focused training on a single vehicle system or set of related systems. Full-task trainers will be used for training on complete vehicles and all of their subsystems. Support was provided in both the software development and project planning areas of the CxTF project. Simulation software was developed for the hydraulic system of the Thrust Vector Control (TVC) of the ARES I launch vehicle. The TVC system is in charge of actuating the nozzle gimbals for navigation control of the upper stage of the ARES I rocket. Software was also developed, following C standards, to send and receive data to and from hand controllers to be used in CxTF cockpit simulations. The hand controllers provided movement in all six rotational and translational axes. Under Project Planning & Control, support was provided for the development and maintenance of integrated schedules for both the Constellation Training Facility and the Missions Operations Facilities Division. These schedules maintain communication between projects at different levels. The support provided to CxTF requires continuous maintenance since the project is still in its initial development phases.
Demarcation of local neighborhoods to study relations between contextual factors and health
2010-01-01
Background Several studies have highlighted the importance of collective social factors for population health. One of the major challenges is an adequate definition of the spatial units of analysis which present properties potentially related to the target outcomes. Political and administrative divisions of urban areas are the most commonly used definition, although they suffer limitations in their ability to fully express the neighborhoods as social and spatial units. Objective This study presents a proposal for defining the boundaries of local neighborhoods in Rio de Janeiro city. Local neighborhoods are constructed by means of aggregation of contiguous census tracts which are homogeneous regarding socioeconomic indicators. Methodology Local neighborhoods were created using the SKATER method (TerraView software). Criteria used for socioeconomic homogeneity were based on four census tract indicators (income, education, persons per household, and percentage of population in the 0-4-year age bracket) considering a minimum population of 5,000 people living in each local neighborhood. The process took into account the geographic boundaries between administrative neighborhoods (a political-administrative division larger than a local neighborhood, but smaller than a borough) and natural geographic barriers. Results The original 8,145 census tracts were collapsed into 794 local neighborhoods, distributed along 158 administrative neighborhoods. Local neighborhoods contained a mean of 10 census tracts, and there were an average of five local neighborhoods per administrative neighborhood. The local neighborhood units demarcated in this study are less socioeconomically heterogeneous than the administrative neighborhoods and provide a means for decreasing the well-known statistical variability of indicators based on census tracts. The local neighborhoods were able to distinguish between different areas within administrative neighborhoods, particularly in relation to squatter settlements. Conclusion Although the literature on neighborhood and health is increasing, little attention has been paid to criteria for demarcating neighborhoods. The proposed method is well-structured, available in open-access software, and easily reproducible, so we expect that new experiments will be conducted to evaluate its potential use in other settings. The method is thus a potentially important contribution to research on intra-urban differentials, particularly concerning contextual factors and their implications for different health outcomes. PMID:20587046
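For readers interested in the mechanics of the aggregation step, SKATER-style regionalization can be summarized as: build a contiguity graph of census tracts, weight edges by socioeconomic dissimilarity, reduce the graph to a minimum spanning tree, and cut tree edges to form homogeneous regions subject to constraints such as the 5,000-person minimum. The Python sketch below, using networkx on toy data, is a simplified illustration of that idea and not the TerraView/SKATER implementation used in the study; the grid layout, indicator values, populations and the greedy cutting rule are all assumptions.

```python
import networkx as nx
import numpy as np

# Toy contiguity graph: nodes are census tracts with a small socioeconomic
# indicator vector and a population count (all values assumed).
rng = np.random.default_rng(0)
g = nx.grid_2d_graph(6, 6)
for n in g.nodes:
    g.nodes[n]["x"] = rng.normal(size=4)          # income, education, etc.
    g.nodes[n]["pop"] = int(rng.integers(800, 2500))

# Edge weight = dissimilarity between neighbouring tracts.
for u, v in g.edges:
    g.edges[u, v]["w"] = float(np.linalg.norm(g.nodes[u]["x"] - g.nodes[v]["x"]))

mst = nx.minimum_spanning_tree(g, weight="w")

# Greedily cut the heaviest tree edges whenever both resulting components still
# meet the minimum-population constraint (SKATER itself uses a heterogeneity
# criterion to choose cuts; this loop only conveys the general idea).
MIN_POP = 5000
for u, v, d in sorted(mst.edges(data=True), key=lambda e: -e[2]["w"]):
    mst.remove_edge(u, v)
    parts = list(nx.connected_components(mst))
    if min(sum(g.nodes[n]["pop"] for n in p) for p in parts) < MIN_POP:
        mst.add_edge(u, v, **d)                   # undo the cut

regions = list(nx.connected_components(mst))
print("%d tracts aggregated into %d local neighborhoods"
      % (g.number_of_nodes(), len(regions)))
```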
Validation of CFD/Heat Transfer Software for Turbine Blade Analysis
NASA Technical Reports Server (NTRS)
Kiefer, Walter D.
2004-01-01
I am an intern in the Turbine Branch of the Turbomachinery and Propulsion Systems Division. The division is primarily concerned with experimental and computational methods of calculating heat transfer effects of turbine blades during operation in jet engines and land-based power systems. These include modeling flow in internal cooling passages and film cooling, as well as calculating heat flux and peak temperatures to ensure safe and efficient operation. The branch is research-oriented, emphasizing the development of tools that may be used by gas turbine designers in industry. The branch has been developing a computational fluid dynamics (CFD) and heat transfer code called GlennHT to achieve the computational end of this analysis. The code was originally written in FORTRAN 77 and run on Silicon Graphics machines. However, the code has been rewritten and compiled in FORTRAN 90 to take advantage of more modern computer memory systems. In addition, the branch has made a switch in system architectures from SGIs to Linux PCs. The newly modified code therefore needs to be tested and validated. This is the primary goal of my internship. To validate the GlennHT code, it must be run using benchmark fluid mechanics and heat transfer test cases, for which there are either analytical solutions or widely accepted experimental data. From the solutions generated by the code, comparisons can be made to the correct solutions to establish the accuracy of the code. To design and create these test cases, there are many steps and programs that must be used. Before a test case can be run, pre-processing steps must be accomplished. These include generating a grid to describe the geometry, using a software package called GridPro. Also, various files required by the GlennHT code must be created, including a boundary condition file, a file for multi-processor computing, and a file to describe problem and algorithm parameters. A good deal of this internship will be to become familiar with these programs and the structure of the GlennHT code. Additional information is included in the original extended abstract.
Demarcation of local neighborhoods to study relations between contextual factors and health.
Santos, Simone M; Chor, Dora; Werneck, Guilherme Loureiro
2010-06-29
Several studies have highlighted the importance of collective social factors for population health. One of the major challenges is an adequate definition of the spatial units of analysis which present properties potentially related to the target outcomes. Political and administrative divisions of urban areas are the most commonly used definition, although they suffer limitations in their ability to fully express the neighborhoods as social and spatial units. This study presents a proposal for defining the boundaries of local neighborhoods in Rio de Janeiro city. Local neighborhoods are constructed by means of aggregation of contiguous census tracts which are homogeneous regarding socioeconomic indicators. Local neighborhoods were created using the SKATER method (TerraView software). Criteria used for socioeconomic homogeneity were based on four census tract indicators (income, education, persons per household, and percentage of population in the 0-4-year age bracket) considering a minimum population of 5,000 people living in each local neighborhood. The process took into account the geographic boundaries between administrative neighborhoods (a political-administrative division larger than a local neighborhood, but smaller than a borough) and natural geographic barriers. The original 8,145 census tracts were collapsed into 794 local neighborhoods, distributed along 158 administrative neighborhoods. Local neighborhoods contained a mean of 10 census tracts, and there were an average of five local neighborhoods per administrative neighborhood. The local neighborhood units demarcated in this study are less socioeconomically heterogeneous than the administrative neighborhoods and provide a means for decreasing the well-known statistical variability of indicators based on census tracts. The local neighborhoods were able to distinguish between different areas within administrative neighborhoods, particularly in relation to squatter settlements. Although the literature on neighborhood and health is increasing, little attention has been paid to criteria for demarcating neighborhoods. The proposed method is well-structured, available in open-access software, and easily reproducible, so we expect that new experiments will be conducted to evaluate its potential use in other settings. The method is thus a potentially important contribution to research on intra-urban differentials, particularly concerning contextual factors and their implications for different health outcomes.
Clime: analyzing and producing climate data in GIS environment
NASA Astrophysics Data System (ADS)
Cattaneo, Luigi; Rillo, Valeria; Mercogliano, Paola
2014-05-01
In recent years, the Impacts on Soil and Coasts Division (ISC) of CMCC (Euro-Mediterranean Center on Climate Change) has had several collaborations with impact communities, including the IS-ENES (FP7-INF) and SafeLand (FP7-ENV) projects, which involved a study of landslide risk in Europe, and is currently active in the GEMINA (FIRB) and ORIENTGATE (SEE Transnational Cooperation Programme) research projects. As a result, it has carried out research on different impacts of climate change, such as flood and landslide hazards, based on climate simulations obtained from the high-resolution regional climate model COSMO CLM, developed at CMCC as a member of the CLM Assembly consortium. ISC-Capua also collaborates with local institutions interested in atmospheric climate change and its impacts on the soil, such as river basin authorities in the Campania region, ARPA Emilia Romagna and ARPA Calabria. Impact models (e.g. hydraulic or stability models) are usually developed in a GIS environment, since they need an accurate description of the territory, so Clime has been designed to bridge the gap that usually exists between climate data - both observed and simulated - gathered from different sources and the impact communities. The main goal of Clime, a special-purpose Geographic Information System (GIS) software package integrated in ESRI ArcGIS Desktop 10, is to make it easy to evaluate multiple climate features and study climate changes over specific geographical domains, together with their related effects on the environment, including impacts on soil. Developed as an add-in tool, this software was conceived for the research activities of the ISC Division in order to provide a substantial contribution during the post-processing and validation phases. It is therefore possible to analyze and compare multiple datasets (observations, climate simulations, etc.) through processes involving statistical functions, percentiles, trend tests and the evaluation of extreme events, with a flexible system of temporal and spatial filtering, and to represent the results as maps, temporal and statistical plots (time series, seasonal cycles, PDFs, scatter plots, Taylor diagrams) or Excel tables; in addition, it features bias correction techniques for climate model results. Summarizing, Clime provides users a simple and fast way to retrieve analyses of simulated climate data and observations for any geographical site of interest (provinces, regions, countries, etc.).
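One widely used family of bias correction techniques of the kind mentioned above is empirical quantile mapping, which maps each simulated value onto the observed climatology at the same quantile. The sketch below is a generic NumPy illustration of that idea, not Clime's implementation, and the synthetic observed and modelled series are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
obs = rng.gamma(shape=2.0, scale=3.0, size=3650)     # observed daily series (assumed)
model = rng.gamma(shape=2.0, scale=4.0, size=3650)   # biased model output (assumed)

def quantile_map(values, model_ref, obs_ref, n_quantiles=100):
    """Map model values onto the observed distribution via empirical quantiles."""
    q = np.linspace(0.0, 1.0, n_quantiles)
    model_q = np.quantile(model_ref, q)
    obs_q = np.quantile(obs_ref, q)
    # Each value is located within the model climatology and replaced by the
    # observed value at the same quantile.
    return np.interp(values, model_q, obs_q)

corrected = quantile_map(model, model, obs)
print("model mean %.2f -> corrected mean %.2f (observed mean %.2f)"
      % (model.mean(), corrected.mean(), obs.mean()))
```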
Laboratory directed research and development program FY 1997
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-03-01
This report compiles the annual reports of Laboratory Directed Research and Development projects supported by the Berkeley Lab. Projects are arranged under the following topical sections: (1) Accelerator and fusion research division; (2) Chemical sciences division; (3) Computing Sciences; (4) Earth sciences division; (5) Environmental energy technologies division; (6) Life sciences division; (7) Materials sciences division; (8) Nuclear science division; (9) Physics division; (10) Structural biology division; and (11) Cross-divisional. A total of 66 projects are summarized.
49 CFR 177.841 - Division 6.1 and Division 2.3 materials.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 2 2010-10-01 2010-10-01 false Division 6.1 and Division 2.3 materials. 177.841... PUBLIC HIGHWAY Loading and Unloading § 177.841 Division 6.1 and Division 2.3 materials. (See also § 177...) or Division 6.1 (poisonous) materials. The transportation of a Division 2.3 (poisonous gas) or...
NASA Astrophysics Data System (ADS)
Bai, Yang; Chen, Shufen; Fu, Li; Fang, Wei; Lu, Junjun
2005-01-01
A device generating optical pulses at bit rates above 10 Gbit/s is the key to achieving a high-speed, broadband optical fiber communication network system. Here, we propose a novel high-speed optical transmission module (TM) consisting of a Ti:Er:LiNbO3 waveguide laser and a Mach-Zehnder-type encoding modulator on the same Er-doped substrate. Following the ITU-T standard, we design a 10 Gbit/s transmission module at 1.53 μm on a Z-cut, Y-propagation LiNbO3 slice. A dynamic model and the corresponding numerical code are used to analyze the waveguide laser, while the electro-optic effect is used to design the modulator. The working principle, key technology and typical characteristic parameters of the module are given. The transmission module has a high extinction ratio and a low driving voltage, which supplies an efficient, miniaturized light source for wavelength division multiplexing (WDM) systems. In addition, the relation of the laser gain to the cavity parameters, as well as the relation of the electro-optic modulator bandwidth to some key factors, are discussed. The designed module structure is simulated with BPM software and HFSS software.
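The extinction ratio and driving voltage mentioned above are governed by the Mach-Zehnder modulator's raised-cosine transfer function, P_out = P_in cos^2(pi V / (2 V_pi)). The short Python sketch below evaluates that relation; the half-wave voltage and the residual leakage term are assumed values for illustration, not the parameters of the designed module.

```python
import numpy as np

V_PI = 4.0          # half-wave voltage in volts (assumed)
P_IN = 1.0          # normalized optical input power
LEAKAGE = 0.03      # residual imbalance limiting the extinction (assumed)

def mzm_output(v_drive):
    """Ideal Mach-Zehnder transfer function with a small residual leakage."""
    phase = np.pi * v_drive / (2.0 * V_PI)
    return P_IN * (np.cos(phase) ** 2 + LEAKAGE)

p_on = mzm_output(0.0)        # 'mark': constructive interference
p_off = mzm_output(V_PI)      # 'space': destructive interference
print("extinction ratio: %.1f dB" % (10.0 * np.log10(p_on / p_off)))
```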
New Ecuadorian VLF and ELF receiver for studying the ionosphere
NASA Astrophysics Data System (ADS)
Lopez, Ericson; Montenegro, Jefferson; Vasconez, Michael; Vicente, Klever
Crucial physical phenomena occur in the equatorial atmosphere and ionosphere, which are currently understudied and poorly understood. Thus, scientific campaigns for monitoring the equatorial region are required in order to provide the necessary data for physical models. Ecuador is located in a strategic geographical position where these studies can be performed, providing quality data for the scientific community working on understanding the nature of these physical systems. The Quito Astronomical Observatory (QAO) of the National Polytechnic School is moving in this direction by promoting research in space sciences for the study of the equatorial zone. With the participation and valuable collaboration of international initiatives such as AWESOME, MAGDAS, SAVNET and CALLISTO, the Quito Observatory is establishing a new space physics division on the basis of the International Space Weather Initiative. As part of this project, a new system for acquiring and processing VLF and ELF signals propagating in the ionosphere has been designed at the QAO. The LabVIEW software is used to filter, process and condition the received signals, thereby avoiding 60 percent of the analog components present in a common receiver. The same software has been programmed to create the spectrograms and the amplitude and phase diagrams of the radio signals. The data are stored neatly in files that can be processed even with other applications.
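Spectrograms and amplitude/phase diagrams of the kind produced by the receiver can be computed with standard DSP routines. The sketch below does this in Python with SciPy rather than LabVIEW, on a synthetic VLF-band test tone; the sampling rate, tone frequency and segment length are assumptions.

```python
import numpy as np
from scipy.signal import spectrogram

fs = 50_000                                   # sampling rate in Hz (assumed)
t = np.arange(0, 2.0, 1.0 / fs)
# Synthetic VLF-band test signal: a 10 kHz carrier plus noise.
signal = np.sin(2 * np.pi * 10_000 * t) + 0.2 * np.random.randn(t.size)

freqs, times, sxx = spectrogram(signal, fs=fs, nperseg=1024, noverlap=512)
power_db = 10.0 * np.log10(sxx + 1e-12)       # spectrogram in dB
print("frequency bins:", freqs.size, "time bins:", times.size)

# Amplitude and phase of the carrier from one FFT segment.
segment = np.fft.rfft(signal[:1024])
k = np.argmin(np.abs(np.fft.rfftfreq(1024, 1.0 / fs) - 10_000))
print("amplitude: %.1f, phase: %.2f rad" % (np.abs(segment[k]), np.angle(segment[k])))
```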
Instrumentation, performance visualization, and debugging tools for multiprocessors
NASA Technical Reports Server (NTRS)
Yan, Jerry C.; Fineman, Charles E.; Hontalas, Philip J.
1991-01-01
The need for computing power has forced a migration from serial computation on a single processor to parallel processing on multiprocessor architectures. However, without effective means to monitor (and visualize) program execution, debugging and tuning parallel programs become intractably difficult as program complexity increases with the number of processors. Research on performance evaluation tools for multiprocessors is being carried out at ARC. Besides investigating new techniques for instrumenting, monitoring, and presenting the state of parallel program execution in a coherent and user-friendly manner, prototypes of software tools are being incorporated into the run-time environments of various hardware testbeds to evaluate their impact on user productivity. Our current tool set, the Ames Instrumentation Systems (AIMS), incorporates features from various software systems developed in academia and industry. The execution of FORTRAN programs on the Intel iPSC/860 can be automatically instrumented and monitored. Performance data collected in this manner can be displayed graphically on workstations supporting X-Windows. We have successfully compared various parallel algorithms for computational fluid dynamics (CFD) applications in collaboration with scientists from the Numerical Aerodynamic Simulation Systems Division. By performing these comparisons, we show that performance monitors and debuggers such as AIMS are practical and can illuminate the complex dynamics that occur within parallel programs.
An MRI Von Economo - Koskinas atlas.
Scholtens, Lianne H; de Reus, Marcel A; de Lange, Siemon C; Schmidt, Ruben; van den Heuvel, Martijn P
2018-04-15
The cerebral cortex displays substantial variation in cellular architecture, a regional patterning that has been of great interest to anatomists for centuries. In 1925, Constantin von Economo and George Koskinas published a detailed atlas of the human cerebral cortex, describing a cytoarchitectonic division of the cortical mantle into over 40 distinct areas. Von Economo and Koskinas accompanied their seminal work with large photomicrographic plates of their histological slides, together with tables containing for each described region detailed morphological layer-specific information on neuronal count, neuron size and thickness of the cortical mantle. Here, we aimed to make this legacy data accessible and relatable to in vivo neuroimaging data by constructing a digital Von Economo - Koskinas atlas compatible with the widely used FreeSurfer software suite. In this technical note we describe the procedures used for manual segmentation of the Von Economo - Koskinas atlas onto individual T1 scans and the subsequent construction of the digital atlas. We provide the files needed to run the atlas on new FreeSurfer data, together with some simple code of how to apply the atlas to T1 scans within the FreeSurfer software suite. The digital Von Economo - Koskinas atlas is easily applicable to modern day anatomical MRI data and is made publicly available online. Copyright © 2017 Elsevier Inc. All rights reserved.
Deconstructing Calculation Methods, Part 4: Division
ERIC Educational Resources Information Center
Thompson, Ian
2008-01-01
In the final article of a series of four, the author deconstructs the primary national strategy's approach to written division. The approach to division is divided into five stages: (1) mental division using partition; (2) short division of TU / U; (3) "expanded" method for HTU / U; (4) short division of HTU / U; and (5) long division.…
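As a concrete illustration of the progression from stage (3) to stage (4), here is a worked HTU / U example of my own (not taken from the article), first in the expanded form and then compressed into short division:

```latex
% Expanded method for 438 \div 6, followed by the compressed short division.
\begin{aligned}
438 \div 6 &= (420 + 18) \div 6\\
           &= 420 \div 6 \;+\; 18 \div 6\\
           &= 70 + 3 = 73
\end{aligned}
% Short division records the same steps compactly:
% 6 into 4 does not go; 6 into 43 gives 7 remainder 1; 6 into 18 gives 3.
% Hence 438 \div 6 = 73.
```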
NASA Technical Reports Server (NTRS)
Singhal, Surendra N.
2003-01-01
The SAE G-11 RMSL Division and Probabilistic Methods Committee meeting, sponsored by the Picatinny Arsenal during March 1-3, 2004 at the Westin Morristown, will report progress on projects for the probabilistic assessment of Army systems and launch an initiative for probabilistic education. The meeting features several Army and industry senior executives and an Ivy League professor to provide an industry/government/academia forum to review RMSL technology; reliability and probabilistic technology; reliability-based design methods; software reliability; and maintainability standards. With over 100 members, including members of national and international standing, the mission of the G-11's Probabilistic Methods Committee is to enable and facilitate rapid deployment of probabilistic technology to enhance the competitiveness of our industries through better, faster, greener, smarter, affordable and reliable product development.
ESOC - The satellite operation center of the European Space Agency
NASA Astrophysics Data System (ADS)
Dworak, H. P.
1980-04-01
The operation and individual functions of the European Space Operation Center (ESOC) that controls the flight of ESA satellites are presented. The main role of the ESOC is discussed and its division into three areas (telemetry, remote piloting, and tracking) is outlined. Attention is given to the manipulation of experimental data collected on board the satellites as well as to the functions of the individual ground stations. A block diagram of the information flow to the Meteosat receiving station is presented along with the network layout of data flow between the ground stations and the ESOC. The distribution of tasks among the ground operations manager, spacecraft operations manager, and flight dynamics software coordinator is discussed with reference to a mission team. A short description of the current missions, including COS-B, GEOS-1 and 2, Meteosat, OTS, and ISEE-B, is presented.
ECUT (Energy Conversion and Utilization Technologies) program: Biocatalysis Project
NASA Technical Reports Server (NTRS)
1988-01-01
Fiscal year 1987 research activities and accomplishments for the Biocatalysis Project of the U.S. Department of Energy, Energy Conversion and Utilization Technologies (ECUT) Division are presented. The project's technical activities were organized into three work elements. The Molecular Modeling and Applied Genetics work element includes modeling and simulation studies to verify a dynamic model of the enzyme carboxypeptidase; plasmid stabilization by chromosomal integration; growth and stability characteristics of plasmid-containing cells; and determination of optional production parameters for hyper-production of polyphenol oxidase. The Bioprocess Engineering work element supports efforts in novel bioreactor concepts that are likely to lead to substantially higher levels of reactor productivity, product yields, and lower separation energetics. The Bioprocess Design and Assessment work element attempts to develop procedures (via user-friendly computer software) for assessing the economics and energetics of a given biocatalyst process.
NASA Technical Reports Server (NTRS)
2005-01-01
KENNEDY SPACE CENTER, FLA. During an End-to-End (ETE) Mission Management Team (MMT) launch simulation at KSC, Mike Rein, division chief of Media Services, and Lisa Malone, director of External Relations and Business Development at KSC, work the consoles. In Firing Room 1 at KSC, Shuttle launch team members put the Shuttle system through an integrated simulation. The control room is set up with software used to simulate flight and ground systems in the launch configuration. The ETE MMT simulation included L-2 and L-1 day Prelaunch MMT meetings, an external tanking/weather briefing, and a launch countdown. The ETE transitioned to the Johnson Space Center for the flight portion of the simulation, with the STS-114 crew in a simulator at JSC. Such simulations are common before a launch to keep the Shuttle launch team sharp and ready for liftoff.
Optimizing Mars Airplane Trajectory with the Application Navigation System
NASA Technical Reports Server (NTRS)
Frumkin, Michael; Riley, Derek
2004-01-01
Planning complex missions requires a number of programs to be executed in concert. The Application Navigation System (ANS), developed in the NAS Division, can execute many interdependent programs in a distributed environment. We show that the ANS simplifies user effort and reduces time in optimizing the trajectory of a Martian airplane. We use a software package, Cart3D, to evaluate trajectories and a shortest-path algorithm to determine the optimal trajectory. ANS employs the GridScape to represent the dynamic state of the available computer resources. ANS then uses a scheduler to dynamically assign ready tasks to machine resources, and the GridScape for tracking available resources and forecasting the completion time of running tasks. We demonstrate the system's capability to schedule and run the trajectory optimization application with efficiency exceeding 60% on 64 processors.
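The optimization described couples trajectory-segment evaluations (here, Cart3D runs) with a shortest-path search over candidate waypoints. The sketch below is a generic Dijkstra search in Python over a tiny hypothetical waypoint graph; the graph, edge costs and names are illustrative assumptions and do not represent the ANS or Cart3D interfaces.

```python
import heapq

def dijkstra(graph, start, goal):
    """Shortest path over a dict-of-dicts graph {node: {neighbor: cost}}."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for nbr, edge_cost in graph.get(node, {}).items():
            if nbr not in visited:
                heapq.heappush(queue, (cost + edge_cost, nbr, path + [nbr]))
    return float("inf"), []

# Hypothetical waypoint graph; in the application each edge cost would come
# from an aerodynamic evaluation of that trajectory segment.
waypoints = {
    "start": {"A": 4.0, "B": 2.5},
    "A": {"C": 3.0},
    "B": {"A": 1.0, "C": 5.0},
    "C": {"goal": 2.0},
}
print(dijkstra(waypoints, "start", "goal"))   # (8.5, ['start', 'B', 'A', 'C', 'goal'])
```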
Acciarri, R.; Adamowski, M.; Artrip, D.; ...
2015-07-28
The second workshop to discuss the development of liquid argon time projection chambers (LArTPCs) in the United States was held at Fermilab on July 8-9, 2014. The workshop was organized under the auspices of the Coordinating Panel for Advanced Detectors, a body that was initiated by the American Physical Society Division of Particles and Fields. All presentations at the workshop were made in six topical plenary sessions: i) Argon Purity and Cryogenics, ii) TPC and High Voltage, iii) Electronics, Data Acquisition and Triggering, iv) Scintillation Light Detection, v) Calibration and Test Beams, and vi) Software. This document summarizes the current efforts in each of these areas. It primarily focuses on the work in the US, but also highlights work done elsewhere in the world.
Library reuse in a rapid development environment
NASA Technical Reports Server (NTRS)
Uhde, JO; Weed, Daniel; Gottlieb, Robert; Neal, Douglas
1995-01-01
The Aeroscience and Flight Mechanics Division (AFMD) established a Rapid Development Laboratory (RDL) to investigate and improve new 'rapid development' software production processes and refine the use of commercial, off-the-shelf (COTS) tools. These tools and processes take an avionics design project from initial inception through high fidelity, real-time, hardware-in-the-loop (HIL) testing. One central theme of a rapid development process is the use and integration of a variety of COTS tools. This paper discusses the RDL MATRIXx (R) libraries, as well as the techniques for managing and documenting these libraries. This paper also shows the methods used for building simulations with the Advanced Simulation Development System (ASDS) libraries, and provides metrics to illustrate the amount of reuse for five complete simulations. Combining ASDS libraries with MATRIXx (R) libraries is discussed.
Yao, Fei; Wang, Jian; Yao, Ju; Hang, Fangrong; Lei, Xu; Cao, Yongke
2017-03-01
The aim of this retrospective study was to evaluate the practice and the feasibility of Osirix, a free and open-source medical imaging software, in performing accurate video-assisted thoracoscopic lobectomy and segmentectomy. From July 2014 to April 2016, 63 patients received anatomical video-assisted thoracoscopic surgery (VATS), either lobectomy or segmentectomy, in our department. Three-dimensional (3D) reconstruction images of 61 (96.8%) patients were preoperatively obtained with contrast-enhanced computed tomography (CT). Preoperative resection simulations were accomplished with patient-individual reconstructed 3D images. For lobectomy, pulmonary lobar veins, arteries and bronchi were identified meticulously by carefully reviewing the 3D images on the display. For segmentectomy, the intrasegmental veins in the affected segment for division and the intersegmental veins to be preserved were identified on the 3D images. Patient preoperative characteristics, surgical outcomes and postoperative data were reviewed from a prospective database. The study cohort of 63 patients included 33 (52.4%) men and 30 (47.6%) women, of whom 46 (73.0%) underwent VATS lobectomy and 17 (27.0%) underwent VATS segmentectomy. There was 1 conversion from VATS lobectomy to open thoracotomy because of fibrocalcified lymph nodes. A VATS lobectomy was performed in 1 case after completing the segmentectomy because invasive adenocarcinoma was detected by intraoperative frozen-section analysis. There were no 30-day or 90-day operative mortalities. CONCLUSIONS: The free, simple, and user-friendly software program Osirix can provide a 3D anatomic structure of pulmonary vessels and a clear vision into the space between the lesion and adjacent tissues, which allows surgeons to make preoperative simulations and improve the accuracy and safety of actual surgery. Copyright © 2017 IJS Publishing Group Ltd. Published by Elsevier Ltd. All rights reserved.
Adalsteinsson, David; McMillen, David; Elston, Timothy C
2004-03-08
Intrinsic fluctuations due to the stochastic nature of biochemical reactions can have large effects on the response of biochemical networks. This is particularly true for pathways that involve transcriptional regulation, where generally there are two copies of each gene and the number of messenger RNA (mRNA) molecules can be small. Therefore, there is a need for computational tools for developing and investigating stochastic models of biochemical networks. We have developed the software package Biochemical Network Stochastic Simulator (BioNetS) for efficiently and accurately simulating stochastic models of biochemical networks. BioNetS has a graphical user interface that allows models to be entered in a straightforward manner, and allows the user to specify the type of random variable (discrete or continuous) for each chemical species in the network. The discrete variables are simulated using an efficient implementation of the Gillespie algorithm. For the continuous random variables, BioNetS constructs and numerically solves the appropriate chemical Langevin equations. The software package has been developed to scale efficiently with network size, thereby allowing large systems to be studied. BioNetS runs as a BioSpice agent and can be downloaded from http://www.biospice.org. BioNetS also can be run as a stand alone package. All the required files are accessible from http://x.amath.unc.edu/BioNetS. We have developed BioNetS to be a reliable tool for studying the stochastic dynamics of large biochemical networks. Important features of BioNetS are its ability to handle hybrid models that consist of both continuous and discrete random variables and its ability to model cell growth and division. We have verified the accuracy and efficiency of the numerical methods by considering several test systems.
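The discrete simulation engine described is the Gillespie stochastic simulation algorithm. A minimal, self-contained sketch of that algorithm in Python (not BioNetS code) for a one-species birth-death model of gene expression is shown below; the rate constants are assumed values.

```python
import numpy as np

def gillespie_birth_death(k_prod=2.0, k_deg=0.1, x0=0, t_end=100.0, seed=0):
    """Gillespie SSA for production (rate k_prod) and degradation (rate k_deg * x)."""
    rng = np.random.default_rng(seed)
    t, x = 0.0, x0
    times, counts = [t], [x]
    while t < t_end:
        rates = np.array([k_prod, k_deg * x])
        total = rates.sum()
        if total == 0.0:
            break
        t += rng.exponential(1.0 / total)        # exponentially distributed waiting time
        if rng.random() < rates[0] / total:      # choose which reaction fires
            x += 1                               # production
        else:
            x -= 1                               # degradation
        times.append(t)
        counts.append(x)
    return np.array(times), np.array(counts)

times, counts = gillespie_birth_death()
print("final molecule count:", counts[-1], "(steady-state mean ~ k_prod/k_deg = 20)")
```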
Guhathakurta, Debarpan; Dutta, Anirban
2016-01-01
Transcranial direct current stimulation (tDCS) modulates cortical neural activity and hemodynamics. Electrophysiological methods (electroencephalography-EEG) measure neural activity while optical methods (near-infrared spectroscopy-NIRS) measure hemodynamics coupled through neurovascular coupling (NVC). Assessment of NVC requires development of NIRS-EEG joint-imaging sensor montages that are sensitive to the tDCS affected brain areas. In this methods paper, we present a software pipeline incorporating freely available software tools that can be used to target vascular territories with tDCS and develop a NIRS-EEG probe for joint imaging of tDCS-evoked responses. We apply this software pipeline to target primarily the outer convexity of the brain territory (superficial divisions) of the middle cerebral artery (MCA). We then present a computational method based on Empirical Mode Decomposition of NIRS and EEG time series into a set of intrinsic mode functions (IMFs), and then perform a cross-correlation analysis on those IMFs from NIRS and EEG signals to model NVC at the lesional and contralesional hemispheres of an ischemic stroke patient. For the contralesional hemisphere, a strong positive correlation between IMFs of regional cerebral hemoglobin oxygen saturation and the log-transformed mean-power time-series of IMFs for EEG with a lag of about -15 s was found after a cumulative 550 s stimulation of anodal tDCS. It is postulated that system identification, for example using a continuous-time autoregressive model, of this coupling relation under tDCS perturbation may provide spatiotemporal discriminatory features for the identification of ischemia. Furthermore, portable NIRS-EEG joint imaging can be incorporated into brain computer interfaces to monitor tDCS-facilitated neurointervention as well as cortical reorganization.
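The coupling analysis described above reduces to a lagged cross-correlation between an IMF of the NIRS oxygenation series and an IMF of the log-transformed EEG band power. The Python sketch below shows only that cross-correlation step, on two synthetic, already-decomposed mode functions; the 1 Hz sampling rate, the signal shapes and the built-in 15 s offset are assumptions, and the EMD step itself is not reproduced here. With the sign convention used in the sketch, the peak falls near -15 samples, the same order as the lag reported in the abstract.

```python
import numpy as np

fs = 1.0                                     # sampling rate of both IMFs in Hz (assumed)
t = np.arange(0, 600, 1.0 / fs)
true_lag_s = 15.0
# Synthetic IMFs: the NIRS-derived mode lags the EEG-derived mode by 15 s.
imf_eeg = np.sin(2 * np.pi * 0.01 * t) + 0.1 * np.random.randn(t.size)
imf_nirs = np.sin(2 * np.pi * 0.01 * (t - true_lag_s)) + 0.1 * np.random.randn(t.size)

def lagged_xcorr(a, b, max_lag):
    """Normalized correlation of a against b for lags -max_lag..max_lag samples."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    lags = np.arange(-max_lag, max_lag + 1)
    corr = [np.mean(a[max(0, -k):len(a) - max(0, k)] * b[max(0, k):len(b) - max(0, -k)])
            for k in lags]
    return lags, np.array(corr)

lags, corr = lagged_xcorr(imf_nirs, imf_eeg, max_lag=60)
best = lags[np.argmax(corr)]
print("peak correlation at a lag of %d samples (%.0f s)" % (best, best / fs))
```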
Guhathakurta, Debarpan; Dutta, Anirban
2016-01-01
Transcranial direct current stimulation (tDCS) modulates cortical neural activity and hemodynamics. Electrophysiological methods (electroencephalography-EEG) measure neural activity while optical methods (near-infrared spectroscopy-NIRS) measure hemodynamics coupled through neurovascular coupling (NVC). Assessment of NVC requires development of NIRS-EEG joint-imaging sensor montages that are sensitive to the tDCS affected brain areas. In this methods paper, we present a software pipeline incorporating freely available software tools that can be used to target vascular territories with tDCS and develop a NIRS-EEG probe for joint imaging of tDCS-evoked responses. We apply this software pipeline to target primarily the outer convexity of the brain territory (superficial divisions) of the middle cerebral artery (MCA). We then present a computational method based on Empirical Mode Decomposition of NIRS and EEG time series into a set of intrinsic mode functions (IMFs), and then perform a cross-correlation analysis on those IMFs from NIRS and EEG signals to model NVC at the lesional and contralesional hemispheres of an ischemic stroke patient. For the contralesional hemisphere, a strong positive correlation between IMFs of regional cerebral hemoglobin oxygen saturation and the log-transformed mean-power time-series of IMFs for EEG with a lag of about −15 s was found after a cumulative 550 s stimulation of anodal tDCS. It is postulated that system identification, for example using a continuous-time autoregressive model, of this coupling relation under tDCS perturbation may provide spatiotemporal discriminatory features for the identification of ischemia. Furthermore, portable NIRS-EEG joint imaging can be incorporated into brain computer interfaces to monitor tDCS-facilitated neurointervention as well as cortical reorganization. PMID:27378836
49 CFR 1242.03 - Made by accounting divisions.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 9 2010-10-01 2010-10-01 false Made by accounting divisions. 1242.03 Section 1242... accounting divisions. The separation shall be made by accounting divisions, where such divisions are maintained, and the aggregate of the accounting divisions reported for the quarter and for the year. ...
Engineering physics and mathematics division
NASA Astrophysics Data System (ADS)
Sincovec, R. F.
1995-07-01
This report provides a record of the research activities of the Engineering Physics and Mathematics Division for the period 1 Jan. 1993 - 31 Dec. 1994. This report is the final archival record of the EPM Division. On 1 Oct. 1994, ORELA was transferred to Physics Division and on 1 Jan. 1995, the Engineering Physics and Mathematics Division and the Computer Applications Division reorganized to form the Computer Science and Mathematics Division and the Computational Physics and Engineering Division. Earlier reports in this series are identified on the previous pages, along with the progress reports describing ORNL's research in the mathematical sciences prior to 1984 when those activities moved into the Engineering Physics and Mathematics Division.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-02
... Manufacturing, Multi-Plastics, Inc., Division, Sipco, Inc., Division, Including Leased Workers of M-Ploy... Manufacturing, Multi-Plastics, Inc., Division and Sipco, Inc., Division, including leased workers of M-Ploy... applicable to TA-W-70,457 is hereby issued as follows: ``All workers of Core Manufacturing, Multi-Plastics...
Allen, Peg; Jacob, Rebekah R; Lakshman, Meenakshi; Best, Leslie A; Bass, Kathryn; Brownson, Ross C
2018-03-02
Evidence-based public health (EBPH) practice, also called evidence-informed public health, can improve population health and reduce disease burden in populations. Organizational structures and processes can facilitate capacity-building for EBPH in public health agencies. This study involved 51 structured interviews with leaders and program managers in 12 state health department chronic disease prevention units to identify factors that facilitate the implementation of EBPH. Verbatim transcripts of the de-identified interviews were consensus coded in NVIVO qualitative software. Content analyses of coded texts were used to identify themes and illustrative quotes. Facilitator themes included leadership support within the chronic disease prevention unit and division, unit processes to enhance information sharing across program areas and recruitment and retention of qualified personnel, training and technical assistance to build skills, and the ability to provide support to external partners. Chronic disease prevention leaders' role modeling of EBPH processes and expectations for staff to justify proposed plans and approaches were key aspects of leadership support. Leaders protected staff time in order to identify and digest evidence to address the common barrier of lack of time for EBPH. Funding uncertainties or budget cuts, lack of political will for EBPH, and staff turnover remained challenges. In conclusion, leadership support is a key facilitator of EBPH capacity building and practice. Section and division leaders in public health agencies with authority and skills can institute management practices to help staff learn and apply EBPH processes and spread EBPH with partners.
Baker, Richard M; Brasch, Megan E; Manning, M Lisa; Henderson, James H
2014-08-06
Understanding single and collective cell motility in model environments is foundational to many current research efforts in biology and bioengineering. To elucidate subtle differences in cell behaviour despite cell-to-cell variability, we introduce an algorithm for tracking large numbers of cells for long time periods and present a set of physics-based metrics that quantify differences in cell trajectories. Our algorithm, termed automated contour-based tracking for in vitro environments (ACTIVE), was designed for adherent cell populations subject to nuclear staining or transfection. ACTIVE is distinct from existing tracking software because it accommodates both variability in image intensity and multi-cell interactions, such as divisions and occlusions. When applied to low-contrast images from live-cell experiments, ACTIVE reduced error in analysing cell occlusion events by as much as 43% compared with a benchmark-tracking program while simultaneously tracking cell divisions and resulting daughter-daughter cell relationships. The large dataset generated by ACTIVE allowed us to develop metrics that capture subtle differences between cell trajectories on different substrates. We present cell motility data for thousands of cells studied at varying densities on shape-memory-polymer-based nanotopographies and identify several quantitative differences, including an unanticipated difference between two 'control' substrates. We expect that ACTIVE will be immediately useful to researchers who require accurate, long-time-scale motility data for many cells. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
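One of the standard physics-based metrics for this kind of trajectory data is the time-averaged mean-squared displacement (MSD) as a function of lag time, whose scaling distinguishes diffusive from directed motion. The sketch below computes it in Python with NumPy for a single synthetic random-walk trajectory; it is a generic illustration, not part of ACTIVE, and the frame interval and step sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic 2D cell trajectory, one position per frame (step sizes assumed).
steps = rng.normal(scale=1.5, size=(200, 2))       # micrometres per frame
trajectory = np.cumsum(steps, axis=0)

def mean_squared_displacement(xy, max_lag):
    """Time-averaged MSD of a single trajectory for lags 1..max_lag frames."""
    msd = np.empty(max_lag)
    for lag in range(1, max_lag + 1):
        disp = xy[lag:] - xy[:-lag]
        msd[lag - 1] = np.mean(np.sum(disp ** 2, axis=1))
    return msd

msd = mean_squared_displacement(trajectory, max_lag=50)
# For a purely diffusive walk the MSD grows roughly linearly with lag time.
print("MSD at lag 1: %.2f um^2, at lag 50: %.2f um^2" % (msd[0], msd[-1]))
```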
NASA Astrophysics Data System (ADS)
Crutchfield, J.
2016-12-01
The presentation will discuss the current status of the International Production Assessment Division of the USDA Foreign Agricultural Service for operational monitoring and forecasting of current crop conditions and anticipated production changes, to produce monthly, multi-source consensus reports on global crop conditions, including the use of Earth observations (EO) from satellite and in situ sources. The United States Department of Agriculture (USDA) Foreign Agricultural Service (FAS) International Production Assessment Division (IPAD) deals exclusively with global crop production forecasting and agricultural analysis in support of the USDA World Agricultural Outlook Board (WAOB) lockup process and contributions to the World Agricultural Supply and Demand Estimates (WASDE) report. Analysts are responsible for discrete regions or countries and conduct in-depth, long-term research into national agricultural statistics, farming systems, and the climatic, environmental, and economic factors affecting crop production. IPAD analysts become highly valued cross-commodity specialists over time, and are routinely sought out for specialized analyses to support governmental studies. IPAD is responsible for grain, oilseed, and cotton analysis on a global basis. IPAD is unique in the tools it uses to analyze crop conditions around the world, including custom weather analysis software and databases, satellite imagery and value-added image interpretation products. It also incorporates all traditional agricultural intelligence resources into its forecasting program, to make the fullest use of available information in its operational commodity forecasts and analysis. International travel and training play an important role in learning about foreign agricultural production systems and in developing analyst knowledge and capabilities.
A Software Designed For STP Data Plot and Analysis Based on Object-oriented Methodology
NASA Astrophysics Data System (ADS)
Lina, L.; Murata, K.
2006-12-01
In the present study, we design a system named "STARS (Solar-Terrestrial data Analysis and Reference System)". STARS provides a research environment in which researchers can refer to and analyse a variety of data with a single piece of software. The software design is based on the OMT (Object Modeling Technique). The OMT is one of the object-oriented techniques and has advantages for maintainability, reuse and long-term development of a system. At the Center for Information Technology, Ehime University, after designing STARS we have already started implementing it. The latest version, STARS5, was released in 2006. Any user can download the system from our WWW site (http://www.infonet.cite.ehime-u.ac.jp/STARS). The present paper is mainly devoted to the design of a data analysis software system. Throughout the design we took care that it remains flexible and applicable when other developers design software for similar purposes; if our model were tailored only to our own purpose, it would be useless to other developers. In designing the domain object model, we carefully removed the parts which depend on system resources, e.g. hardware and software, and put those dependent parts into the application object model. In the present design, therefore, the domain object model and the utility object model are independent of the computer resources. This helps another developer to construct his or her own system based on the present design: they simply modify their own application object models according to their system resources. This division of the design into resource-dependent and resource-independent parts across the three object models is one of the advantages of the OMT. If the design of the software is done completely along the lines of the OMT, implementation is rather simple and almost automatic: developers simply map their designs onto programs. If one creates "another STARS" in a different programming language such as Java, the programmer simply follows the present design, as long as the language is object-oriented. Researchers may want to add their own data to STARS. In this case, they simply add their own data class to the domain object model, because any satellite data set has properties such as time or date, which are inherited from the upper class. In this way, their effort is less than with older methodologies. In the OMT, the description format of the system is rather strictly standardized. When new developers take part in the STARS project, they only have to understand each model to obtain an overview of STARS; they then follow the designs and documents to implement the system. The OMT makes it easy for a newcomer to join a project that is already running.
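The inheritance arrangement described, in which a newly added satellite data class picks up common properties such as time or date from an upper class of the domain object model, can be sketched as below. This is a hypothetical Python illustration of the design idea only, not STARS code (STARS itself is not written in Python), and every class, attribute and value name is an assumption.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class ObservationData:
    """Upper class of the domain object model: properties shared by all data sets."""
    start: datetime
    end: datetime
    values: List[float] = field(default_factory=list)

    def duration_seconds(self) -> float:
        return (self.end - self.start).total_seconds()

@dataclass
class SatelliteData(ObservationData):
    """A researcher adding a new satellite only subclasses and adds the specifics."""
    satellite_name: str = "unknown"
    instrument: str = "unknown"

example = SatelliteData(
    start=datetime(2006, 1, 1, 0, 0),
    end=datetime(2006, 1, 1, 1, 0),
    values=[1.2, 1.4, 1.1],
    satellite_name="ExampleSat",
    instrument="magnetometer",
)
print(example.satellite_name, example.duration_seconds())
```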
An Overview of High Performance Computing and Challenges for the Future
Google Tech Talks
2017-12-09
In this talk we examine how high performance computing has changed over the last 10 years and look toward the future in terms of trends. These changes have had and will continue to have a major impact on our software. A new generation of software libraries and algorithms are needed for the effective and reliable use of (wide area) dynamic, distributed and parallel environments. Some of the software and algorithm challenges have already been encountered, such as management of communication and memory hierarchies through a combination of compile-time and run-time techniques, but the increased scale of computation, depth of memory hierarchies, range of latencies, and increased run-time environment variability will make these problems much harder. We will focus on the redesign of software to fit multicore architectures. Speaker: Jack Dongarra, University of Tennessee, Oak Ridge National Laboratory, University of Manchester. Jack Dongarra received a Bachelor of Science in Mathematics from Chicago State University in 1972 and a Master of Science in Computer Science from the Illinois Institute of Technology in 1973. He received his Ph.D. in Applied Mathematics from the University of New Mexico in 1980. He worked at the Argonne National Laboratory until 1989, becoming a senior scientist. He now holds an appointment as University Distinguished Professor of Computer Science in the Electrical Engineering and Computer Science Department at the University of Tennessee, has the position of a Distinguished Research Staff member in the Computer Science and Mathematics Division at Oak Ridge National Laboratory (ORNL), Turing Fellow in the Computer Science and Mathematics Schools at the University of Manchester, and an Adjunct Professor in the Computer Science Department at Rice University. He specializes in numerical algorithms in linear algebra, parallel computing, the use of advanced-computer architectures, programming methodology, and tools for parallel computers. His research includes the development, testing and documentation of high quality mathematical software. He has contributed to the design and implementation of the following open source software packages and systems: EISPACK, LINPACK, the BLAS, LAPACK, ScaLAPACK, Netlib, PVM, MPI, NetSolve, Top500, ATLAS, and PAPI. He has published approximately 200 articles, papers, reports and technical memoranda and he is coauthor of several books. He was awarded the IEEE Sid Fernbach Award in 2004 for his contributions in the application of high performance computers using innovative approaches. He is a Fellow of the AAAS, ACM, and the IEEE and a member of the National Academy of Engineering.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 49 Transportation 2 2014-10-01 2014-10-01 false Bulk packaging for certain pyrophoric liquids (Division 4.2), dangerous when wet (Division 4.3) materials, and poisonous liquids with inhalation hazards...), dangerous when wet (Division 4.3) materials, and poisonous liquids with inhalation hazards (Division 6.1...
Code of Federal Regulations, 2011 CFR
2011-10-01
... 49 Transportation 2 2011-10-01 2011-10-01 false Bulk packaging for certain pyrophoric liquids (Division 4.2), dangerous when wet (Division 4.3) materials, and poisonous liquids with inhalation hazards...), dangerous when wet (Division 4.3) materials, and poisonous liquids with inhalation hazards (Division 6.1...
Code of Federal Regulations, 2013 CFR
2013-10-01
... 49 Transportation 2 2013-10-01 2013-10-01 false Bulk packaging for certain pyrophoric liquids (Division 4.2), dangerous when wet (Division 4.3) materials, and poisonous liquids with inhalation hazards...), dangerous when wet (Division 4.3) materials, and poisonous liquids with inhalation hazards (Division 6.1...
Code of Federal Regulations, 2012 CFR
2012-10-01
... 49 Transportation 2 2012-10-01 2012-10-01 false Bulk packaging for certain pyrophoric liquids (Division 4.2), dangerous when wet (Division 4.3) materials, and poisonous liquids with inhalation hazards...), dangerous when wet (Division 4.3) materials, and poisonous liquids with inhalation hazards (Division 6.1...
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 2 2010-10-01 2010-10-01 false Bulk packaging for certain pyrophoric liquids (Division 4.2), dangerous when wet (Division 4.3) materials, and poisonous liquids with inhalation hazards...), dangerous when wet (Division 4.3) materials, and poisonous liquids with inhalation hazards (Division 6.1...
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 2 2010-10-01 2010-10-01 false Special requirements for Division 6.1 (poisonous) material and Division 6.2 (infectious substances) materials. 175.630 Section 175.630 Transportation Other... Classification of Material § 175.630 Special requirements for Division 6.1 (poisonous) material and Division 6.2...
1. Oblique view of 215 Division Street, looking southwest, showing ...
1. Oblique view of 215 Division Street, looking southwest, showing front (east) facade and north side, 213 Division Street is visible at left and 217 Division Street appears at right - 215 Division Street (House), Rome, Floyd County, GA
Code of Federal Regulations, 2010 CFR
2010-01-01
... Banking FEDERAL DEPOSIT INSURANCE CORPORATION PROCEDURE AND RULES OF PRACTICE PROCEDURES FOR CORPORATE...) Director means the Director of the Division of Finance (DOF), the Director of the Division of...) Division of Finance (DOF) means the Division of Finance of the FDIC. (n) Division of Resolutions and...
A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing
NASA Technical Reports Server (NTRS)
Takaki, Mitsuo; Cavalcanti, Diego; Gheyi, Rohit; Iyoda, Juliano; dAmorim, Marcelo; Prudencio, Ricardo
2009-01-01
The complexity of constraints is a major obstacle for constraint-based software verification. Automatic constraint solvers are fundamentally incomplete: input constraints often build on some undecidable theory or some theory the solver does not support. This paper proposes and evaluates several randomized solvers to address this issue. We compare the effectiveness of a symbolic solver (CVC3), a random solver, three hybrid solvers (i.e., mix of random and symbolic), and two heuristic search solvers. We evaluate the solvers on two benchmarks: one consisting of manually generated constraints and another generated with a concolic execution of 8 subjects. In addition to fully decidable constraints, the benchmarks include constraints with non-linear integer arithmetic, integer modulo and division, bitwise arithmetic, and floating-point arithmetic. As expected, symbolic solving (in particular, CVC3) subsumes the other solvers for the concolic execution of subjects that only generate decidable constraints. For the remaining subjects the solvers are complementary.
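As a rough illustration of the kind of purely random solver compared in this study, the sketch below samples integer assignments and checks a constraint numerically. It is not the paper's implementation; the variable ranges, trial count, and example constraint are illustrative assumptions.

```python
import random

def random_solver(constraint, variables, trials=20_000, lo=-100, hi=100):
    """Illustrative random solver: try random integer assignments until one
    satisfies the constraint predicate, or give up after `trials` attempts."""
    for _ in range(trials):
        assignment = {v: random.randint(lo, hi) for v in variables}
        try:
            if constraint(assignment):
                return assignment
        except (ZeroDivisionError, OverflowError):
            continue  # guard against constraints whose evaluation fails
    return None  # "unknown": randomized solving is incomplete

# Example: a non-linear constraint with integer division, outside the reach of
# a purely linear-arithmetic decision procedure.
print(random_solver(lambda a: a["x"] * a["x"] - a["y"] // 3 == 7, ["x", "y"]))
```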
NASA Astrophysics Data System (ADS)
Popov, V. N.; Botygin, I. A.; Kolochev, A. S.
2017-01-01
The approach represents data from the international codes for the exchange of meteorological information using a metadescription formalism associated with certain categories of resources. Development of the metadata components was based on an analysis of data from surface meteorological observations, vertical atmospheric sounding, atmospheric wind sounding, weather radar observations, satellite observations, and others. A common set of metadata components, including classes, divisions, and groups, was formed for a generalized description of the meteorological data. The structure and content of the main components of the generalized metadescription are presented in detail using the example of meteorological observations from land and sea stations. The functional structure of a distributed computing system is described; it allows the storage of large volumes of meteorological data to be organized for further processing in the analysis and forecasting of climatic processes.
CellCognition: time-resolved phenotype annotation in high-throughput live cell imaging.
Held, Michael; Schmitz, Michael H A; Fischer, Bernd; Walter, Thomas; Neumann, Beate; Olma, Michael H; Peter, Matthias; Ellenberg, Jan; Gerlich, Daniel W
2010-09-01
Fluorescence time-lapse imaging has become a powerful tool to investigate complex dynamic processes such as cell division or intracellular trafficking. Automated microscopes generate time-resolved imaging data at high throughput, yet tools for quantification of large-scale movie data are largely missing. Here we present CellCognition, a computational framework to annotate complex cellular dynamics. We developed a machine-learning method that combines state-of-the-art classification with hidden Markov modeling for annotation of the progression through morphologically distinct biological states. Incorporation of time information into the annotation scheme was essential to suppress classification noise at state transitions and confusion between different functional states with similar morphology. We demonstrate generic applicability in different assays and perturbation conditions, including a candidate-based RNA interference screen for regulators of mitotic exit in human cells. CellCognition is published as open source software, enabling live-cell imaging-based screening with assays that directly score cellular dynamics.
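The combination of frame-wise classification with hidden Markov smoothing described above can be illustrated with a minimal Viterbi decoder over per-frame class probabilities. The transition matrix, class probabilities, and three-state "mitotic progression" below are invented for the example and are not CellCognition's actual model.

```python
import numpy as np

def viterbi(obs_prob, trans, init):
    """Most likely state path given per-frame class probabilities.

    obs_prob: (T, S) per-frame probabilities from a frame-wise classifier.
    trans:    (S, S) state transition matrix (rows sum to 1).
    init:     (S,)  initial state distribution.
    Illustrative only; CellCognition's published model differs in detail."""
    T, S = obs_prob.shape
    logp = np.log(init * obs_prob[0] + 1e-12)
    back = np.zeros((T, S), dtype=int)
    for t in range(1, T):
        scores = logp[:, None] + np.log(trans + 1e-12)
        back[t] = scores.argmax(axis=0)
        logp = scores.max(axis=0) + np.log(obs_prob[t] + 1e-12)
    path = [int(logp.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Three ordered states; the noisy classifier briefly flips to state 2 at frame 2,
# but the HMM keeps the ordered progression 0 -> 1 -> 2.
obs = np.array([[.9, .05, .05], [.8, .1, .1], [.2, .3, .5],
                [.1, .8, .1], [.05, .9, .05], [.05, .15, .8]])
trans = np.array([[.8, .2, 0.], [0., .8, .2], [0., 0., 1.]])
print(viterbi(obs, trans, np.array([.9, .05, .05])))
```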
The effect of shape on drag: a physics exercise inspired by biology
NASA Astrophysics Data System (ADS)
Fingerut, Jonathan; Johnson, Nicholas; Mongeau, Eric; Habdas, Piotr
2017-07-01
As part of a biomechanics course aimed at upper-division biology and physics majors, but applicable to a range of student learning levels, this laboratory exercise provides an insight into the effect of shape on hydrodynamic performance, as well as an introduction to computer aided design (CAD) and 3D printing. Students use hydrodynamic modeling software and simple CAD programs to design a shape with the least amount of drag based on strategies gleaned from the study of natural forms. Students then print the shapes using a 3D printer and test their shapes against their classmates in a friendly competition. From this exercise, students gain a more intuitive sense of the challenges that organisms face when moving through fluid environments and the physical phenomena involved in moving through fluids at high Reynolds numbers, and observe how and why certain morphologies, such as streamlining, are common answers to the challenge of swimming at high speeds.
Computer Programs (Turbomachinery)
NASA Technical Reports Server (NTRS)
1978-01-01
NASA computer programs are extensively used in design of industrial equipment. Available from the Computer Software Management and Information Center (COSMIC) at the University of Georgia, these programs are employed as analysis tools in design, test and development processes, providing savings in time and money. For example, two NASA computer programs are used daily in the design of turbomachinery by Delaval Turbine Division, Trenton, New Jersey. The company uses the NASA spline interpolation routine for analysis of turbine blade vibration and the performance of compressors and condensers. A second program, the NASA print plot routine, analyzes turbine rotor response and produces graphs for project reports. The photos show examples of Delaval test operations in which the computer programs play a part. In the large photo below, a 24-inch turbine blade is undergoing test; in the smaller photo, a steam turbine rotor is being prepared for stress measurements under actual operating conditions; the "spaghetti" is wiring for test instrumentation.
Architectural development of an advanced EVA Electronic System
NASA Technical Reports Server (NTRS)
Lavelle, Joseph
1992-01-01
An advanced electronic system for future EVA missions (including zero gravity, the lunar surface, and the surface of Mars) is under research and development within the Advanced Life Support Division at NASA Ames Research Center. As a first step in the development, an optimum system architecture has been derived from an analysis of the projected requirements for these missions. The open, modular architecture centers around a distributed multiprocessing concept where the major subsystems independently process their own I/O functions and communicate over a common bus. Supervision and coordination of the subsystems is handled by an embedded real-time operating system kernel employing multitasking software techniques. A discussion of how the architecture most efficiently meets the electronic system functional requirements, maximizes flexibility for future development and mission applications, and enhances the reliability and serviceability of the system in these remote, hostile environments is included.
Development of a statewide Landsat digital data base for forest insect damage assessment
NASA Technical Reports Server (NTRS)
Williams, D. L.; Dottavio, C. L.; Nelson, R. F.
1983-01-01
A Joint Research Project (JRP) involving NASA/Goddard Space Flight Center and the Pennsylvania Bureau of Forestry/Division of Forest Pest Management demonstrates the utility of Landsat data for assessing forest insect damage. A major effort within the project has been the creation of a map-registered, statewide Landsat digital data base for Pennsylvania. The data base, developed and stored on computers at the Pennsylvania State University Computation Center, contains Landsat imagery, a Landsat-derived forest resource map, and digitized data layers depicting Forest Pest Management District boundaries and county boundaries. A data management front-end system was also developed to provide an interface between the various layers of information within the data base and image analysis software. This front-end system ensures that an automated assessment of defoliation damage can be conducted and summarized by geographic area or jurisdiction of interest.
NASA Technical Reports Server (NTRS)
Ketchum, E.
1988-01-01
The Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD) will be responsible for performing ground attitude determination for Gamma Ray Observatory (GRO) support. The study reported in this paper provides the FDD and the GRO project with ground attitude determination error information and illustrates several uses of the Generalized Calibration System (GCS). GCS, an institutional software tool in the FDD, automates the computation of the expected attitude determination uncertainty that a spacecraft will encounter during its mission. The GRO project is particularly interested in the uncertainty in the attitude determination using Sun sensors and a magnetometer when both star trackers are inoperable. In order to examine the expected attitude errors for GRO, a systematic approach was developed including various parametric studies. The approach identifies pertinent parameters and combines them to form a matrix of test runs in GCS. This matrix formed the basis for this study.
NASA Technical Reports Server (NTRS)
Reinhart, Richard C.; Kacpura, Thomas J.; Smith, Carl R.; Liebetreu, John; Hill, Gary; Mortensen, Dale J.; Andro, Monty; Scardelletti, Maximilian C.; Farrington, Allen
2008-01-01
This report defines a hardware architecture approach for software-defined radios to enable commonality among NASA space missions. The architecture accommodates a range of reconfigurable processing technologies including general-purpose processors, digital signal processors, field programmable gate arrays, and application-specific integrated circuits (ASICs) in addition to flexible and tunable radiofrequency front ends to satisfy varying mission requirements. The hardware architecture consists of modules, radio functions, and interfaces. The modules are a logical division of common radio functions that compose a typical communication radio. This report describes the architecture details, the module definitions, the typical functions on each module, and the module interfaces. Tradeoffs between component-based, custom architecture and a functional-based, open architecture are described. The architecture does not specify a physical implementation internally on each module, nor does the architecture mandate the standards or ratings of the hardware used to construct the radios.
Multiple Autonomous Discrete Event Controllers for Constellations
NASA Technical Reports Server (NTRS)
Esposito, Timothy C.
2003-01-01
The Multiple Autonomous Discrete Event Controllers for Constellations (MADECC) project is an effort within the National Aeronautics and Space Administration Goddard Space Flight Center's (NASA/GSFC) Information Systems Division to develop autonomous positioning and attitude control for constellation satellites. It will be accomplished using traditional control theory and advanced coordination algorithms developed by the Johns Hopkins University Applied Physics Laboratory (JHU/APL). This capability will be demonstrated in the discrete event control test-bed located at JHU/APL. This project will be modeled for the Leonardo constellation mission, but is intended to be adaptable to any constellation mission. To develop a common software architecture, the controllers will only model very high-level responses. For instance, after determining that a maneuver must be made, the MADECC system will output a (Delta)V (velocity change) value. Lower level systems must then decide which thrusters to fire and for how long to achieve that (Delta)V.
Multi-Algorithm Particle Simulations with Spatiocyte.
Arjunan, Satya N V; Takahashi, Koichi
2017-01-01
As quantitative biologists get more measurements of spatially regulated systems such as cell division and polarization, simulation of reaction and diffusion of proteins using the data is becoming increasingly relevant to uncover the mechanisms underlying the systems. Spatiocyte is a lattice-based stochastic particle simulator for biochemical reaction and diffusion processes. Simulations can be performed at single molecule and compartment spatial scales simultaneously. Molecules can diffuse and react in 1D (filament), 2D (membrane), and 3D (cytosol) compartments. The implications of crowded regions in the cell can be investigated because each diffusing molecule has spatial dimensions. Spatiocyte adopts multi-algorithm and multi-timescale frameworks to simulate models that simultaneously employ deterministic, stochastic, and particle reaction-diffusion algorithms. Comparison of light microscopy images to simulation snapshots is supported by Spatiocyte microscopy visualization and molecule tagging features. Spatiocyte is open-source software and is freely available at http://spatiocyte.org .
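As a loose illustration of the lattice-based particle scheme described above, the following sketch places molecules on a 2D square grid (Spatiocyte itself uses a 3D hexagonal close-packed lattice), lets them hop to neighbouring voxels with excluded volume, and applies an A + B -> C reaction on contact. All parameters are arbitrary and the code is not part of Spatiocyte.

```python
import random

def simulate(n_a=40, n_b=40, size=30, steps=2000, seed=7):
    """Toy lattice particle simulation in the spirit of Spatiocyte: each
    molecule occupies one voxel, diffuses by hopping to neighbour voxels,
    and A + B -> C when a hop lands on a voxel occupied by the partner."""
    random.seed(seed)
    occ = {}                                   # voxel -> species
    for sp, n in (("A", n_a), ("B", n_b)):
        while n:
            v = (random.randrange(size), random.randrange(size))
            if v not in occ:
                occ[v] = sp
                n -= 1
    for _ in range(steps):
        v = random.choice(list(occ))
        dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        w = ((v[0] + dx) % size, (v[1] + dy) % size)
        if w not in occ:
            occ[w] = occ.pop(v)                # free voxel: hop
        elif {occ[v], occ[w]} == {"A", "B"}:
            occ.pop(v)                         # occupied by the partner: react
            occ[w] = "C"
    print({s: list(occ.values()).count(s) for s in "ABC"})

simulate()
```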
DARPA TIMIT acoustic-phonetic continous speech corpus CD-ROM. NIST speech disc 1-1.1
NASA Astrophysics Data System (ADS)
Garofolo, J. S.; Lamel, L. F.; Fisher, W. M.; Fiscus, J. G.; Pallett, D. S.
1993-02-01
The Texas Instruments/Massachusetts Institute of Technology (TIMIT) corpus of read speech has been designed to provide speech data for the acquisition of acoustic-phonetic knowledge and for the development and evaluation of automatic speech recognition systems. TIMIT contains speech from 630 speakers representing 8 major dialect divisions of American English, each speaking 10 phonetically-rich sentences. The TIMIT corpus includes time-aligned orthographic, phonetic, and word transcriptions, as well as speech waveform data for each spoken sentence. The release of TIMIT contains several improvements over the Prototype CD-ROM released in December, 1988: (1) full 630-speaker corpus, (2) checked and corrected transcriptions, (3) word-alignment transcriptions, (4) NIST SPHERE-headered waveform files and header manipulation software, (5) phonemic dictionary, (6) new test and training subsets balanced for dialectal and phonetic coverage, and (7) more extensive documentation.
Zero-forcing pre-coding for MIMO WiMAX transceivers: Performance analysis and implementation issues
NASA Astrophysics Data System (ADS)
Cattoni, A. F.; Le Moullec, Y.; Sacchi, C.
Next generation wireless communication networks are expected to achieve ever increasing data rates. Multi-User Multiple-Input-Multiple-Output (MU-MIMO) is a key technique to obtain the expected performance, because such a technique combines the high capacity achievable using MIMO channel with the benefits of space division multiple access. In MU-MIMO systems, the base stations transmit signals to two or more users over the same channel, for this reason every user can experience inter-user interference. This paper provides a capacity analysis of an online, interference-based pre-coding algorithm able to mitigate the multi-user interference of the MU-MIMO systems in the context of a realistic WiMAX application scenario. Simulation results show that pre-coding can significantly increase the channel capacity. Furthermore, the paper presents several feasibility considerations for implementation of the analyzed technique in a possible FPGA-based software defined radio.
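For reference, the baseline zero-forcing pre-coder analyzed in this class of work is the channel-inversion matrix W = H^H (H H^H)^{-1}, which diagonalizes the effective downlink channel and so nulls inter-user interference. The NumPy sketch below is a generic textbook version with arbitrary dimensions and a random channel, not the authors' WiMAX/FPGA implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_tx, n_users = 4, 4                     # base-station antennas, single-antenna users

# Illustrative flat-fading MU-MIMO downlink channel (users x tx antennas).
H = (rng.standard_normal((n_users, n_tx)) +
     1j * rng.standard_normal((n_users, n_tx))) / np.sqrt(2)

# Zero-forcing (channel inversion) precoder: H @ W becomes diagonal.
W = H.conj().T @ np.linalg.inv(H @ H.conj().T)
W /= np.linalg.norm(W)                   # total transmit-power normalization

s = rng.choice(np.array([1+1j, 1-1j, -1+1j, -1-1j]) / np.sqrt(2), size=n_users)  # QPSK
x = W @ s                                # precoded transmit vector
y = H @ x                                # received symbols (noise omitted)
print(np.round(H @ W, 3))                # ~diagonal: inter-user interference nulled
```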
OBSIFRAC: database-supported software for 3D modeling of rock mass fragmentation
NASA Astrophysics Data System (ADS)
Empereur-Mot, Luc; Villemin, Thierry
2003-03-01
Under stress, fractures in rock masses tend to form fully connected networks. The mass can thus be thought of as a 3D series of blocks produced by fragmentation processes. A numerical model has been developed that uses a relational database to describe such a mass. The model, which assumes the fractures to be plane, allows data from natural networks to test theories concerning fragmentation processes. In the model, blocks are bordered by faces that are composed of edges and vertices. A fracture can originate from a seed point, its orientation being controlled by the stress field specified by an orientation matrix. Alternatively, it can be generated from a discrete set of given orientations and positions. Both kinds of fracture can occur together in a model. From an original simple block, a given fracture produces two simple polyhedral blocks, and the original block becomes compound. Compound and simple blocks created throughout fragmentation are stored in the database. Several fragmentation processes have been studied. In one scenario, a constant proportion of blocks is fragmented at each step of the process. The resulting distribution appears to be fractal, although seed points are random in each fragmented block. In a second scenario, division affects only one random block at each stage of the process, and gives a Weibull volume distribution law. This software can be used for a large number of other applications.
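A volume-only caricature of the first fragmentation scenario above (a constant proportion of blocks split at each step) can be written in a few lines. It ignores the 3D geometry, fracture-plane orientations, and the relational database that OBSIFRAC actually manages; the split-ratio range and step count are arbitrary.

```python
import random

def fragment(n_steps=12, proportion=0.5, seed=1):
    """Toy 'constant proportion' scenario: at each step a fixed fraction of the
    current blocks is split in two at a random volume ratio."""
    random.seed(seed)
    blocks = [1.0]                        # start from a single unit block
    for _ in range(n_steps):
        random.shuffle(blocks)
        n_split = max(1, int(proportion * len(blocks)))
        to_split, blocks = blocks[:n_split], blocks[n_split:]
        for v in to_split:
            r = random.uniform(0.2, 0.8)  # random position of the fracture plane
            blocks += [r * v, (1 - r) * v]
    return blocks

vols = fragment()
print(len(vols), "blocks; smallest/largest volume:", min(vols), max(vols))
```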
[Digital administrative maps - a tool for visualization of epidemiological data].
Niewiadomska, Ewa; Kowalska, Malgorzata; Czech, Elibieta; Skrzypek, Michal
2013-01-01
The aim of the study is to present the methods for visualization of epidemiological data using digital contour maps that take into account the administrative division of Poland. The possibilities of epidemiological data visualization in a geographical order, limited to the administrative levels of the country, voivodeships and poviats (counties), are presented. They are crucial for the process of identifying and undertaking adequate prophylactic activities directed towards decreasing the risk and improving the population's health. This paper presents tools and techniques available in the Geographic Information System ArcGIS and the statistical software package R. The work includes our own data reflecting: 1) the values of specific mortality rates due to respiratory diseases, Poland, 2010, based on the Central Statistical Office data, using the R statistical software package; 2) the averaged registered incidence rates of sarcoidosis in 2006-2010 for the population aged 19+ in the Silesian voivodeship, using Geographic Information System ArcGIS; and 3) the number of children with diagnosed respiratory diseases in the city of Legnica in 2009, taking into account their place of residence, using layered maps in Geographic Information System ArcGIS. The tools presented and described in this paper make it possible to visualize the results of research, to increase the attractiveness of courses for students, as well as to enhance the skills and competence of students and participants of courses.
Advanced Diagnostic System on Earth Observing One
NASA Technical Reports Server (NTRS)
Hayden, Sandra C.; Sweet, Adam J.; Christa, Scott E.; Tran, Daniel; Shulman, Seth
2004-01-01
In this infusion experiment, the Livingstone 2 (L2) model-based diagnosis engine, developed by the Computational Sciences Division at NASA Ames Research Center, has been uploaded to the Earth Observing One (EO-1) satellite. L2 is integrated with the Autonomous Sciencecraft Experiment (ASE), which provides an on-board planning capability and a software bridge to the spacecraft's 1773 data bus. Using a model of the spacecraft subsystems, L2 predicts nominal state transitions initiated by control commands, monitors the spacecraft sensors, and, in the case of failure, isolates the fault based on the discrepant observations. Fault detection and isolation is done by determining a set of component modes, including most likely failures, which satisfy the current observations. All mode transitions and diagnoses are telemetered to the ground for analysis. The initial L2 model is scoped to EO-1's imaging instruments and solid state recorder. Diagnostic scenarios for EO-1's nominal imaging timeline are demonstrated by injecting simulated faults on-board the spacecraft. The solid state recorder stores the science images and also hosts the experiment software. The main objective of the experiment is to mature the L2 technology to Technology Readiness Level (TRL) 7. Experiment results are presented, as well as a discussion of the challenging technical issues encountered. Future extensions may explore coordination with the planner, and model-based ground operations.
Mission Evaluation Room Intelligent Diagnostic and Analysis System (MIDAS)
NASA Technical Reports Server (NTRS)
Pack, Ginger L.; Falgout, Jane; Barcio, Joseph; Shnurer, Steve; Wadsworth, David; Flores, Louis
1994-01-01
The role of Mission Evaluation Room (MER) engineers is to provide engineering support for Space Shuttle systems during Space Shuttle missions. These engineers are concerned with ensuring that the systems for which they are responsible function reliably, and as intended. The MER is a central facility from which engineers may work in fulfilling this obligation. Engineers participate in real-time monitoring of shuttle telemetry data and provide a variety of analyses associated with the operation of the shuttle. The Johnson Space Center's Automation and Robotics Division is working to transfer advances in intelligent systems technology to NASA's operational environment. Specifically, the MER Intelligent Diagnostic and Analysis System (MIDAS) project provides MER engineers with software to assist them with monitoring, filtering and analyzing Shuttle telemetry data, during and after Shuttle missions. MIDAS off-loads the tasks of data gathering, filtering, and analysis to computers and software, and provides the engineers with information in a more concise and usable form to support decision making and engineering evaluation. Engineers are then able to concentrate on more difficult problems as they arise. This paper describes some, but not all, of the applications that have been developed for MER engineers under the MIDAS Project. The sampling described here was selected to show the range of tasks that engineers must perform for mission support, and to show the various levels of automation that have been applied to assist their efforts.
Biology Division progress report, October 1, 1991--September 30, 1993
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hartman, F.C.; Cook, J.S.
This Progress Report summarizes the research endeavors of the Biology Division of the Oak Ridge National Laboratory during the period October 1, 1991, through September 30, 1993. The report is structured to provide descriptions of current activities and accomplishments in each of the Division's major organizational units. Lists of information to convey the entire scope of the Division's activities are compiled at the end of the report.
28 CFR 0.1 - Organizational units.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Oriented Policing Services. Office on Violence Against Women. Office of the Federal Detention Trustee... Division. Civil Rights Division. Criminal Division. Environment and Natural Resources Division. National...
Annual Historical Report Calendar Year 1993
1994-04-01
Subject terms: Physical Training, Military Performance, Military Nutrition, Military Psychology. Contents include the Military Nutrition Division, the Military Performance and Neuroscience Division, and the Occupational Medicine Division. The Directorate, under Dr. James A. Vogel, Director, incorporates the Military Nutrition Division and the Military Performance and Neuroscience Division.
ERIC Educational Resources Information Center
Frieder, Laura L., Comp.; Fulks, Daniel L., Comp.
2007-01-01
Recent years have seen a number of National Collegiate Athletic Association (NCAA) Division II institutions seeking reclassification to Division I-AA and Division I-AA institutions moving to Division I-A. Yet, other schools that seem like natural candidates to reclassify have resisted. The purpose of this study is to investigate the impact of the…
Oriented cell division: new roles in guiding skin wound repair and regeneration
Yang, Shaowei; Ma, Kui; Geng, Zhijun; Sun, Xiaoyan; Fu, Xiaobing
2015-01-01
Tissue morphogenesis depends on precise regulation and timely co-ordination of cell division and also on the control of the direction of cell division. Establishment of the polarity of the division axis, correct alignment of the mitotic spindle, and equal or unequal segregation of fate determinants between daughter cells are essential for the realization of oriented cell division. Furthermore, oriented cell division is regulated by intrinsic cues, extrinsic cues and other cues, such as cell geometry and polarity. However, dysregulation of cell division orientation could lead to abnormal tissue development and function. In the present study, we review recent studies on the molecular mechanism of cell division orientation and explain their new roles in skin repair and regeneration. PMID:26582817
Fulton, James L.
1992-01-01
Spatial data analysis has become an integral component in many surface and sub-surface hydrologic investigations within the U.S. Geological Survey (USGS). Currently, one of the largest costs in applying spatial data analysis is the cost of developing the needed spatial data. Therefore, guidelines and standards are required for the development of spatial data in order to allow for data sharing and reuse; this eliminates costly redevelopment. In order to attain this goal, the USGS is expanding efforts to identify guidelines and standards for the development of spatial data for hydrologic analysis. Because of the variety of project and database needs, the USGS has concentrated on developing standards for documenting spatial data sets to aid in the assessment of data set quality and compatibility of different data sets. An interim data set documentation standard (1990) has been developed that provides a mechanism for associating a wide variety of information with a data set, including data about source material, data automation and editing procedures used, projection parameters, data statistics, descriptions of features and feature attributes, information on organizational contacts, lists of operations performed on the data, and free-form comments and notes about the data, made at various times in the evolution of the data set. The interim data set documentation standard has been automated using a commercial geographic information system (GIS) and data set documentation software developed by the USGS. Where possible, USGS-developed software is used to enter data into the data set documentation file automatically. The GIS software closely associates a data set with its data set documentation file; the documentation file is retained with the data set whenever it is modified, copied, or transferred to another computer system. The Water Resources Division of the USGS is continuing to develop spatial data and data processing standards, with emphasis on standards needed to support hydrologic analysis, hydrologic data processing, and publication of hydrologic thematic maps. There is a need for the GIS vendor community to develop data set documentation tools similar to those developed by the USGS, or to incorporate USGS-developed tools in their software.
NASA Astrophysics Data System (ADS)
Löwe, Peter; Klump, Jens; Robertson, Jesse
2015-04-01
Text mining is commonly employed as a tool in data science to investigate and chart emergent information from corpora of research abstracts, such as the Geophysical Research Abstracts (GRA) published by Copernicus. In this context current standards, such as persistent identifiers like DOI and ORCID, allow us to trace, cite and map links between journal publications, the underlying research data and scientific software. This network can be expressed as a directed graph which enables us to chart networks of cooperation and innovation, thematic foci and the locations of research communities in time and space. However, this approach of data science, focusing on the research process in a self-referential manner, rather than the topical work, is still in a developing stage. Scientific work presented at the EGU General Assembly is often the first step towards new approaches and innovative ideas for the geospatial community. It represents a rich, deep and heterogeneous source of geoscientific thought. This corpus is a significant data source for data science, which has not been analysed on this scale previously. In this work, the corpus of the Geophysical Research Abstracts is used for the first time as a data base for analyses of topical text mining. For this, we used a sturdy and customizable software framework, based on the work of Schmitt et al. [1]. For the analysis we used the High Performance Computing infrastructure of the German Research Centre for Geosciences GFZ in Potsdam, Germany. Here, we report on the first results from the analysis of the continuing spread of the use of Free and Open Source Software (FOSS) tools within the EGU communities, mapping the general increase of FOSS-themed GRA articles in the last decade and the developing spatial patterns of involved parties and FOSS topics. References: [1] Schmitt, L. M., Christianson, K. T., Gupta, R.: Linguistic Computing with UNIX Tools, in Kao, A., Poteet, S. R. (Eds.): Natural Language Processing and Text Mining, Springer, 2007. doi:10.1007/978-1-84628-754-1_12.
Industry and Academic Consortium for Computer Based Subsurface Geology Laboratory
NASA Astrophysics Data System (ADS)
Brown, A. L.; Nunn, J. A.; Sears, S. O.
2008-12-01
Twenty-two licenses for Petrel Software acquired through a grant from Schlumberger are being used to redesign the laboratory portion of Subsurface Geology at Louisiana State University. The course redesign is a cooperative effort between LSU's Geology and Geophysics and Petroleum Engineering Departments and Schlumberger's Technical Training Division. In spring 2008, two laboratory sections were taught with 22 students in each section. The class contained geology majors, petroleum engineering majors, and geology graduate students. Limited enrollments and 3-hour labs make it possible to incorporate hands-on visualization, animation, manipulation of data and images, and access to geological data available online. 24/7 access to the laboratory and step-by-step instructions for Petrel exercises strongly promoted peer instruction and individual learning. Goals of the course redesign include: enhancing visualization of earth materials; strengthening students' ability to acquire, manage, and interpret multifaceted geological information; fostering critical thinking and the scientific method; improving student communication skills; providing cross training between geologists and engineers; and increasing the quantity, quality, and diversity of students pursuing Earth Science and Petroleum Engineering careers. IT resources available in the laboratory provide students with sophisticated visualization tools, allowing them to switch between 2-D and 3-D reconstructions more seamlessly, and enabling them to manipulate larger integrated data-sets, thus permitting more time for critical thinking and hypothesis testing. IT resources also enable faculty and students to simultaneously work with the software to visually interrogate a 3D data set and immediately test hypotheses formulated in class. Preliminary evaluation of class results indicates that students found MS-Windows based Petrel easy to learn. By the end of the semester, students were able to not only map horizons and faults using seismic and well data but also compute volumetrics. Exam results indicated that while students could complete sophisticated exercises using the software, their understanding of key concepts such as conservation of volume in a palinspastic reconstruction or association of structures with a particular stress regime was limited. Future classes will incorporate more paper and pencil exercises to illustrate basic concepts. The equipment, software, and exercises developed will be used in additional upper level undergraduate and graduate classes.
Bhattacharya, Anindya; De, Rajat K
2010-08-01
Distance based clustering algorithms can group genes that show similar expression values under multiple experimental conditions. They are unable to identify a group of genes that have a similar pattern of variation in their expression values. Previously we developed an algorithm called the divisive correlation clustering algorithm (DCCA) to tackle this situation, which is based on the concept of correlation clustering. But this algorithm may also fail for certain cases. In order to overcome these situations, we propose a new clustering algorithm, called the average correlation clustering algorithm (ACCA), which is able to produce a better clustering solution than those produced by some other methods. ACCA is able to find groups of genes having more common transcription factors and similar patterns of variation in their expression values. Moreover, ACCA is more efficient than DCCA with respect to the time of execution. Like DCCA, ACCA uses the concept of correlation clustering introduced by Bansal et al. ACCA uses the correlation matrix in such a way that all genes in a cluster have the highest average correlation values with the genes in that cluster. We have applied ACCA and some well-known conventional methods including DCCA to two artificial and nine gene expression datasets, and compared the performance of the algorithms. The clustering results of ACCA are found to be more significantly relevant to the biological annotations than those of the other methods. Analysis of the results shows the superiority of ACCA over some others in determining a group of genes having more common transcription factors and with similar patterns of variation in their expression profiles. Availability of the software: The software has been developed using C and Visual Basic languages, and can be executed on the Microsoft Windows platforms. The software may be downloaded as a zip file from http://www.isical.ac.in/~rajat. Then it needs to be installed. Two word files (included in the zip file) need to be consulted before installation and execution of the software. Copyright 2010 Elsevier Inc. All rights reserved.
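The core idea stated above, that every gene ends up in the cluster with which it has the highest average correlation, can be sketched as a simple reassignment loop over a Pearson correlation matrix. This is an illustrative toy, not the published C/Visual Basic implementation; the cluster count, iteration cap, and synthetic data are assumptions.

```python
import numpy as np

def acca_like(expr, k=2, iters=100, seed=0):
    """Toy average-correlation clustering: repeatedly move each gene to the
    cluster with which it has the highest mean Pearson correlation.
    Illustrative only; the published ACCA differs in its details."""
    rng = np.random.default_rng(seed)
    corr = np.corrcoef(expr)                       # gene-by-gene correlations
    labels = rng.integers(0, k, size=expr.shape[0])
    for _ in range(iters):
        new = np.array([
            np.argmax([corr[g, labels == c].mean() if np.any(labels == c) else -np.inf
                       for c in range(k)])
            for g in range(expr.shape[0])
        ])
        if np.array_equal(new, labels):
            break
        labels = new
    return labels

# Two synthetic groups of genes with different patterns of variation.
np.random.seed(1)
t = np.linspace(0, 6, 20)
genes = np.vstack([np.sin(t) + 0.1 * np.random.randn(5, 20),
                   np.cos(t) + 0.1 * np.random.randn(5, 20)])
print(acca_like(genes, k=2))
```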
24 CFR 4.36 - Action by the Ethics Law Division.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 24 Housing and Urban Development 1 2011-04-01 2011-04-01 false Action by the Ethics Law Division... the Ethics Law Division. (a) After review of the Inspector General's report, the Ethics Law Division... that a violation of Section 103 or this subpart B has occurred. (b) If the Ethics Law Division...
24 CFR 4.36 - Action by the Ethics Law Division.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Action by the Ethics Law Division... the Ethics Law Division. (a) After review of the Inspector General's report, the Ethics Law Division... that a violation of Section 103 or this subpart B has occurred. (b) If the Ethics Law Division...
47 CFR 73.3617 - Information available on the Internet.
Code of Federal Regulations, 2013 CFR
2013-10-01
....fcc.gov/mb/; the Audio Division's address is http://www.fcc.gov/mmb/audio; the Video Division's address is http://www.fcc.gov/mb/video; the Policy Division's address is http://www.fcc.gov/mb/policy; the Engineering Division's address is http://www.fcc.gov/mb/engineering; and the Industry Analysis Division's...
47 CFR 73.3617 - Information available on the Internet.
Code of Federal Regulations, 2012 CFR
2012-10-01
....fcc.gov/mb/; the Audio Division's address is http://www.fcc.gov/mmb/audio; the Video Division's address is http://www.fcc.gov/mb/video; the Policy Division's address is http://www.fcc.gov/mb/policy; the Engineering Division's address is http://www.fcc.gov/mb/engineering; and the Industry Analysis Division's...
47 CFR 73.3617 - Information available on the Internet.
Code of Federal Regulations, 2014 CFR
2014-10-01
....fcc.gov/mb/; the Audio Division's address is http://www.fcc.gov/mmb/audio; the Video Division's address is http://www.fcc.gov/mb/video; the Policy Division's address is http://www.fcc.gov/mb/policy; the Engineering Division's address is http://www.fcc.gov/mb/engineering; and the Industry Analysis Division's...
PHOTOCOPY OF DRAWING NO. F860, DIVISION AVENUE STATION, EAST ELEVATION ...
PHOTOCOPY OF DRAWING NO. F-860, DIVISION AVENUE STATION, EAST ELEVATION AND DETAILS, DRAWN BY W.H.C., MAR. 22, 1915. COURTESY OF THE DEPARTMENT OF PUBLIC UTILITIES, DIVISION OF WATER, CITY OF CLEVELAND. - Division Avenue Pumping Station & Filtration Plant, West 45th Street and Division Avenue, Cleveland, Cuyahoga County, OH
24 CFR 4.36 - Action by the Ethics Law Division.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 24 Housing and Urban Development 1 2013-04-01 2013-04-01 false Action by the Ethics Law Division... the Ethics Law Division. (a) After review of the Inspector General's report, the Ethics Law Division... that a violation of Section 103 or this subpart B has occurred. (b) If the Ethics Law Division...
24 CFR 4.36 - Action by the Ethics Law Division.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 24 Housing and Urban Development 1 2014-04-01 2014-04-01 false Action by the Ethics Law Division... the Ethics Law Division. (a) After review of the Inspector General's report, the Ethics Law Division... that a violation of Section 103 or this subpart B has occurred. (b) If the Ethics Law Division...
24 CFR 4.36 - Action by the Ethics Law Division.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 24 Housing and Urban Development 1 2012-04-01 2012-04-01 false Action by the Ethics Law Division... the Ethics Law Division. (a) After review of the Inspector General's report, the Ethics Law Division... that a violation of Section 103 or this subpart B has occurred. (b) If the Ethics Law Division...
Physics division. Progress report, January 1, 1995--December 31, 1996
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, M.; Bacon, D.S.; Aine, C.J.
1997-10-01
This issue of the Physics Division Progress Report describes progress and achievements in Physics Division research during the period January 1, 1995-December 31, 1996. The report covers the five main areas of experimental research and development in which Physics Division serves the needs of Los Alamos National Laboratory and the nation in applied and basic sciences: (1) biophysics, (2) hydrodynamic physics, (3) neutron science and technology, (4) plasma physics, and (5) subatomic physics. Included in this report are a message from the Division Director, the Physics Division mission statement, an organizational chart, descriptions of the research areas of the five groups in the Division, selected research highlights, project descriptions, the Division staffing and funding levels for FY95-FY97, and a list of publications and presentations.
NASA Astrophysics Data System (ADS)
Kwok, Sun; Koo, Bon-Chul; Chu, You-Hua; Breitschwerdt, Dieter; De Gouveia Dal Pino, Elisabete; Yang, Ji; Tsuboi, Masato; Rozyczka, Michael; Juvela, Mika; Burton, Mike; Caselli, Paola; Lizano, Susana; Cabrit, Sylvie; Toth, Viktor
2016-04-01
Before 2012, Commission 34 was identical to Division VI. The organization and executive officers of the Division and the Commission were the same. At the 2012 General Assembly (GA) in Beijing, the IAU reformed the divisional structure, and the previous Division VI, under which Commission 34 fell, was combined with Division VII to form Division H: Interstellar Matter and Local Universe. Ewine van Dishoeck was named by the IAU executive committee as the President of the new Division. Since the Commission structure is to be reformed at the GA of 2015, Commission 34 retains its original name "Interstellar Matter" and joined Commissions 33 and 37 (formerly under Division VII) as commissions under the new Division H.
History of Division 29, 1993-2013: another 20 years of psychotherapy.
Williams, Elizabeth Nutt; Barnett, Jeffrey E; Canter, Mathilda B
2013-03-01
The history of Division 29 (Psychotherapy) of the American Psychological Association (APA) from 1993 to 2013 is reviewed. The 20 years of history can be traced via the Division's primary publications (the journal Psychotherapy and its newsletter Psychotherapy Bulletin) as well as the history of those who have served leadership roles in the Division and have won Divisional awards. Several recurring themes emerge related to the Division's articulations of its own identity, the Division's advocacy efforts vis-à-vis the profession and the APA, and the work of the Division on behalf of major social issues (such as disaster relief and the nation's health care).
Rosene, John M; Raksnis, Bryan; Silva, Brie; Woefel, Tyler; Visich, Paul S; Dompier, Thomas P; Kerr, Zachary Y
2017-09-01
Examinations related to divisional differences in the incidence of sports-related concussions (SRC) in collegiate ice hockey are limited. To compare the epidemiologic patterns of concussion in National Collegiate Athletic Association (NCAA) ice hockey by sex and division. Descriptive epidemiology study. A convenience sample of men's and women's ice hockey teams in Divisions I and III provided SRC data via the NCAA Injury Surveillance Program during the 2009-2010 to 2014-2015 academic years. Concussion counts, rates, and distributions were examined by factors including injury activity and position. Injury rate ratios (IRRs) and injury proportion ratios (IPRs) with 95% confidence intervals (CIs) were used to compare concussion rates and distributions, respectively. Overall, 415 concussions were reported for men's and women's ice hockey combined. The highest concussion rate was found in Division I men (0.83 per 1000 athlete-exposures [AEs]), followed by Division III women (0.78/1000 AEs), Division I women (0.65/1000 AEs), and Division III men (0.64/1000 AEs). However, the only significant IRR was that the concussion rate was higher in Division I men than Division III men (IRR = 1.29; 95% CI, 1.02-1.65). The proportion of concussions from checking was higher in men than women (28.5% vs 9.4%; IPR = 3.02; 95% CI, 1.63-5.59); however, this proportion was higher in Division I women than Division III women (18.4% vs 1.8%; IPR = 10.47; 95% CI, 1.37-79.75). The proportion of concussions sustained by goalkeepers was higher in women than men (14.2% vs 2.9%; IPR = 4.86; 95% CI, 2.19-10.77), with findings consistent within each division. Concussion rates did not vary by sex but differed by division among men. Checking-related concussions were less common in women than men overall but more common in Division I women than Division III women. Findings highlight the need to better understand the reasons underlying divisional differences within men's and women's ice hockey and the need to develop concussion prevention strategies specific to each athlete population.
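The rate ratios and confidence intervals quoted above follow the standard epidemiological recipe of a Wald interval on the log of the rate ratio. The sketch below shows that calculation; the event and athlete-exposure counts are chosen only to reproduce the reported rates of 0.83 and 0.64 per 1000 AEs (the study's actual counts are not given here, so the interval is close to, but not exactly, the published 1.02-1.65).

```python
import math

def rate_ratio_ci(events_a, exposure_a, events_b, exposure_b, z=1.96):
    """Injury rate ratio (group A vs group B) with a Wald 95% CI on the log scale."""
    irr = (events_a / exposure_a) / (events_b / exposure_b)
    se = math.sqrt(1 / events_a + 1 / events_b)        # SE of log(IRR)
    lo, hi = (irr * math.exp(s * z * se) for s in (-1, 1))
    return irr, lo, hi

# Hypothetical counts matching 0.83 and 0.64 concussions per 1000 AEs.
print(rate_ratio_ci(events_a=166, exposure_a=200_000, events_b=96, exposure_b=150_000))
```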
49 CFR 176.410 - Division 1.5 materials, ammonium nitrate and ammonium nitrate mixtures.
Code of Federal Regulations, 2010 CFR
2010-10-01
... (Oxidizers and Organic Peroxides), and Division 1.5 Materials § 176.410 Division 1.5 materials, ammonium...) Ammonium nitrate, Division 5.1 (oxidizer), UN1942. (3) Ammonium nitrate fertilizer, Division 5.1 (oxidizer), UN 2067. (b) This section does not apply to Ammonium nitrate fertilizer, Class 9, UN 2071 or to any...
Cell division cycle 45 promotes papillary thyroid cancer progression via regulating cell cycle.
Sun, Jing; Shi, Run; Zhao, Sha; Li, Xiaona; Lu, Shan; Bu, Hemei; Ma, Xianghua
2017-05-01
Cell division cycle 45 was reported to be overexpressed in some cancer-derived cell lines and was predicted to be a candidate oncogene in cervical cancer. However, the clinical and biological significance of cell division cycle 45 in papillary thyroid cancer has never been investigated. We determined the expression level and clinical significance of cell division cycle 45 using The Cancer Genome Atlas, quantitative real-time polymerase chain reaction, and immunohistochemistry. A great upregulation of cell division cycle 45 was observed in papillary thyroid cancer tissues compared with adjacent normal tissues. Furthermore, overexpression of cell division cycle 45 positively correlates with more advanced clinical characteristics. Silence of cell division cycle 45 suppressed proliferation of papillary thyroid cancer cells via G1-phase arrest and inducing apoptosis. The oncogenic activity of cell division cycle 45 was also confirmed in vivo. In conclusion, cell division cycle 45 may serve as a novel biomarker and a potential therapeutic target for papillary thyroid cancer.
Griffith, Megan E.; Mayer, Ulrike; Capron, Arnaud; Ngo, Quy A.; Surendrarao, Anandkumar; McClinton, Regina; Jürgens, Gerd; Sundaresan, Venkatesan
2007-01-01
Embryogenesis in Arabidopsis thaliana is marked by a predictable sequence of oriented cell divisions, which precede cell fate determination. We show that mutation of the TORMOZ (TOZ) gene yields embryos with aberrant cell division planes and arrested embryos that appear not to have established normal patterning. The defects in toz mutants differ from previously described mutations that affect embryonic cell division patterns. Longitudinal division planes of the proembryo are frequently replaced by transverse divisions and less frequently by oblique divisions, while divisions of the suspensor cells, which divide only transversely, appear generally unaffected. Expression patterns of selected embryo patterning genes are altered in the mutant embryos, implying that the positional cues required for their proper expression are perturbed by the misoriented divisions. The TOZ gene encodes a nucleolar protein containing WD repeats. Putative TOZ orthologs exist in other eukaryotes including Saccharomyces cerevisiae, where the protein is predicted to function in 18S rRNA biogenesis. We find that disruption of the Sp TOZ gene results in cell division defects in Schizosaccharomyces pombe. Previous studies in yeast and animal cells have identified nucleolar proteins that regulate the exit from M phase and cytokinesis, including factors involved in pre-rRNA processing. Our study suggests that in plant cells, nucleolar functions might interact with the processes of regulated cell divisions and influence the selection of longitudinal division planes during embryogenesis. PMID:17616738
ERIC Educational Resources Information Center
Guerrero, Lourdes; Rivera, Antonio
Fourteen third graders were given numerical computation and division-with-remainder (DWR) problems both before and after they were taught the division algorithm in classrooms. Their solutions were examined. The results show that students' initial acquisition of the division algorithm did improve their performance in numerical division computations…
Fiscal Law Course Deskbook (36th)
1993-05-01
Contract Law Division, The Judge Advocate General's School, United States Army, Charlottesville, Virginia, JA 506, May 1993. Contributors: LTC John T. Jones, Jr., Chief, Contract Law Division; LTC Harry L. Dorsey, Senior Instructor, Contract Law Division; MAJ Michael A. Killham, Instructor, Contract Law Division; MAJ Bobby D. Melvin, Instructor, Contract Law Division; MAJ Michael K. Cameron, Instructor, Contract Law Division.
Fuel management in the Subtropical and Savanna divisions
Kenneth W. Outcalt
2012-01-01
The Subtropical Division (230) and Savanna Division (410), both based on Bailey's (1996) ecoregions, are found in the Southern United States (http://www.na.fs.fed.us/fire/cwedocs/map%20new_divisions.pdf). The Subtropical Division occupies the southern Atlantic and Gulf coastal areas. It is characterized by a humid subtropical climate with hot humid summers (chapter 3...
Quantitative regulation of B cell division destiny by signal strength.
Turner, Marian L; Hawkins, Edwin D; Hodgkin, Philip D
2008-07-01
Differentiation to Ab secreting and isotype-switched effector cells is tightly linked to cell division and therefore the degree of proliferation strongly influences the nature of the immune response. The maximum number of divisions reached, termed the population division destiny, is stochastically distributed in the population and is an important parameter in the quantitative outcome of lymphocyte responses. In this study, we further assessed the variables that regulate B cell division destiny in vitro in response to T cell- and TLR-dependent stimuli. Both the concentration and duration of stimulation were able to regulate the average maximum number of divisions undergone for each stimulus. Notably, a maximum division destiny was reached during provision of repeated saturating stimulation, revealing that an intrinsic limit to proliferation exists even under these conditions. This limit was linked directly to division number rather than time of exposure to stimulation and operated independently of the survival regulation of the cells. These results demonstrate that a B cell population's division destiny is regulable by the stimulatory conditions up to an inherent maximum value. Division destiny is a crucial parameter in regulating the extent of B cell responses and thereby also the nature of the immune response mounted.
Campanoni, Prisca; Nick, Peter
2005-01-01
During exponential phase, the tobacco (Nicotiana tabacum) cell line cv Virginia Bright Italia-0 divides axially to produce linear cell files of distinct polarity. This axial division is controlled by exogenous auxin. We used exponential tobacco cv Virginia Bright Italia-0 cells to dissect early auxin signaling, with cell division and cell elongation as physiological markers. Experiments with 1-naphthaleneacetic acid (NAA) and 2,4-dichlorophenoxyacetic acid (2,4-D) demonstrated that these 2 auxin species affect cell division and cell elongation differentially; NAA stimulates cell elongation at concentrations that are much lower than those required to stimulate cell division. In contrast, 2,4-D promotes cell division but not cell elongation. Pertussis toxin, a blocker of heterotrimeric G-proteins, inhibits the stimulation of cell division by 2,4-D but does not affect cell elongation. Aluminum tetrafluoride, an activator of the G-proteins, can induce cell division at NAA concentrations that are not permissive for division and even in the absence of any exogenous auxin. The data are discussed in a model where the two different auxins activate two different pathways for the control of cell division and cell elongation. PMID:15734918
Biology Division progress report, October 1, 1993--September 30, 1995
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-10-01
This Progress Report summarizes the research endeavors of the Biology Division of the Oak Ridge National Laboratory during the period October 1, 1993, through September 30, 1995. The report is structured to provide descriptions of current activities and accomplishments in each of the Division's major organizational units. Lists of information to convey the entire scope of the Division's activities are compiled at the end of the report. Attention is focused on the following research activities: molecular, cellular, and cancer biology; mammalian genetics and development; genome mapping program; and educational activities.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Laboratories Complex Staff. Division of Engineering Services. Environment, Safety And Strategic Initiatives.... Office of Cellular, Tissue, and Gene Therapies. Regulatory Management Staff. Division of Cellular and Gene Therapies. Division of Clinical Evaluation and Pharmacology/Toxicology. Division of Human Tissues...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-12
..., Adecco Employment Services, Winchester, KY; Niles America Wintech, Inc., Assembly and Testing Division, a... former workers of Niles America Wintech, Inc., Warehousing Division and Assembly and Testing Division...
Surgery on Fetus Reduces Complications of Spina Bifida
MedlinePlus Videos and Cool Tools
Stationary Size Distributions of Growing Cells with Binary and Multiple Cell Division
NASA Astrophysics Data System (ADS)
Rading, M. M.; Engel, T. A.; Lipowsky, R.; Valleriani, A.
2011-10-01
Populations of unicellular organisms that grow under constant environmental conditions are considered theoretically. The size distribution of these cells is calculated analytically, both for the usual process of binary division, in which one mother cell always produces two daughter cells, and for the more complex process of multiple division, in which one mother cell can produce 2^n daughter cells with n=1,2,3,… . The latter mode of division is inspired by the unicellular alga Chlamydomonas reinhardtii. The uniform response of the whole population to different environmental conditions is encoded in the individual rates of growth and division of the cells. The analytical treatment of the problem is based on size-dependent rules for cell growth and stochastic transition processes for cell division. The comparison between binary and multiple division shows that these different division processes lead to qualitatively different results for the size distribution and the population growth rates.
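The binary-division case can be sampled numerically with a minimal stochastic simulation in the same spirit: exponential single-cell growth plus a size-dependent division hazard. The growth law, hazard, and parameters below are illustrative assumptions, not the paper's exact rules, and the paper's analytical distributions are not reproduced here.

```python
import random
import statistics

def simulate_sizes(n_cells=500, t_max=20.0, dt=0.02, growth=1.0, seed=3):
    """Toy binary-division model: each cell grows exponentially and splits into
    two equal halves with a hazard that increases with size above a threshold."""
    random.seed(seed)
    sizes = [random.uniform(0.5, 1.0) for _ in range(n_cells)]
    t = 0.0
    while t < t_max:
        new = []
        for s in sizes:
            s *= 1 + growth * dt                          # exponential growth
            hazard = 0.0 if s < 1.0 else 5.0 * (s - 1.0)  # size-dependent division rate
            if random.random() < hazard * dt:
                new += [s / 2, s / 2]                     # binary division
            else:
                new.append(s)
        # keep the population at a fixed size by random subsampling
        sizes = random.sample(new, n_cells) if len(new) > n_cells else new
        t += dt
    return sizes

sizes = simulate_sizes()
print("mean size:", round(statistics.fmean(sizes), 3), "max:", round(max(sizes), 3))
```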
Incorporating CLIPS into a personal-computer-based Intelligent Tutoring System
NASA Technical Reports Server (NTRS)
Mueller, Stephen J.
1990-01-01
A large number of Intelligent Tutoring Systems (ITS's) have been built since they were first proposed in the early 1970's. Research conducted on the use of the best of these systems has demonstrated their effectiveness in tutoring in selected domains. Computer Sciences Corporation, Applied Technology Division, Houston Operations has been tasked by the Spacecraft Software Division at NASA/Johnson Space Center (NASA/JSC) to develop a number of ITS's in a variety of domains and on many different platforms. This paper will address issues facing the development of an ITS on a personal computer using the CLIPS (C Language Integrated Production System) language. For an ITS to be widely accepted, not only must it be effective, flexible, and very responsive, it must also be capable of functioning on readily available computers. There are many issues to consider when using CLIPS to develop an ITS on a personal computer. Some of these issues are the following: when to use CLIPS and when to use a procedural language such as C, how to maximize speed and minimize memory usage, and how to decrease the time required to load your rule base once you are ready to deliver the system. Based on experiences in developing the CLIPS Intelligent Tutoring System (CLIPSITS) on an IBM PC clone and an intelligent Physics Tutor on a Macintosh 2, this paper reports results on how to address some of these issues. It also suggests approaches for maintaining a powerful learning environment while delivering robust performance within the speed and memory constraints of the personal computer.
Evaluating the Medical Kit System for the International Space Station(ISS) - A Paradigm Revisited
NASA Technical Reports Server (NTRS)
Hailey, Melinda J.; Urbina, Michelle C.; Hughlett, Jessica L.; Gilmore, Stevan; Locke, James; Reyna, Baraquiel; Smith, Gwyn E.
2010-01-01
Medical capabilities aboard the International Space Station (ISS) have been packaged to help astronaut crew medical officers (CMO) mitigate both urgent and non-urgent medical issues during their 6-month expeditions. Two ISS crewmembers are designated as CMOs for each 3-crewmember mission and are typically not physicians. In addition, the ISS may have communication gaps of up to 45 minutes during each orbit, necessitating medical equipment that can be reliably operated autonomously during flight. The retirement of the space shuttle combined with ten years of manned ISS expeditions led the Space Medicine Division at the NASA Johnson Space Center to reassess the current ISS Medical Kit System. This reassessment led to the system being streamlined to meet future logistical considerations with current Russian space vehicles and future NASA/commercial space vehicle systems. Methods: The JSC Space Medicine Division coordinated the development of requirements and the fabrication of prototypes, and conducted usability testing for the new ISS Medical Kit System in concert with implementing updated versions of the ISS Medical Check List and associated in-flight software applications. The teams constructed a medical kit system with the flexibility for use on the ISS, and resupply on the Russian Progress space vehicle and future NASA/commercial space vehicles. Results: Prototype systems were developed, reviewed, and tested for implementation. Completion of Preliminary and Critical Design Reviews resulted in a streamlined ISS Medical Kit System that is being used for training by ISS crews starting with Expedition 27 (June 2011). Conclusions: The team will present the process for designing, developing, implementing, and training with this new ISS Medical Kit System.
On the Unsteady-Motion Theory of Magnetic Forces for Maglev
1993-11-01
On the Unsteady-Motion Theory of Magnetic Forces for Maglev, by S. S. Chen, S. Zhu, and Y. Cai, Energy Technology Division, November 1993.
3. Oblique view of 215 Division Street, looking southeast, showing ...
3. Oblique view of 215 Division Street, looking southeast, showing rear (west) facade and north side, Fairbanks Company appears at left and 215 Division Street is visible at right - 215 Division Street (House), Rome, Floyd County, GA
2. Oblique view of 215 Division Street, looking northeast, showing ...
2. Oblique view of 215 Division Street, looking northeast, showing rear (west) facade and south side, 217 Division Street is visible at left and Fairbanks Company appears at right - 215 Division Street (House), Rome, Floyd County, GA
3. Oblique view of 213 Division Street, looking northeast, showing ...
3. Oblique view of 213 Division Street, looking northeast, showing rear (west) facade and south side, 215 Division Street is visible at left and Fairbanks Company appears at right - 213 Division Street (House), Rome, Floyd County, GA
Chromosome segregation drives division site selection in Streptococcus pneumoniae.
van Raaphorst, Renske; Kjos, Morten; Veening, Jan-Willem
2017-07-18
Accurate spatial and temporal positioning of the tubulin-like protein FtsZ is key for proper bacterial cell division. Streptococcus pneumoniae (pneumococcus) is an oval-shaped, symmetrically dividing opportunistic human pathogen lacking the canonical systems for division site control (nucleoid occlusion and the Min-system). Recently, the early division protein MapZ was identified and implicated in pneumococcal division site selection. We show that MapZ is important for proper division plane selection; thus, the question remains as to what drives pneumococcal division site selection. By mapping the cell cycle in detail, we show that directly after replication both chromosomal origin regions localize to the future cell division sites, before FtsZ. Interestingly, Z-ring formation occurs coincidently with initiation of DNA replication. Perturbing the longitudinal chromosomal organization by mutating the condensin SMC, by CRISPR/Cas9-mediated chromosome cutting, or by poisoning DNA decatenation resulted in mistiming of MapZ and FtsZ positioning and subsequent cell elongation. Together, we demonstrate an intimate relationship between DNA replication, chromosome segregation, and division site selection in the pneumococcus, providing a simple way to ensure equally sized daughter cells.
6. Contextual view of Fairbanks Company, looking south along Division ...
6. Contextual view of Fairbanks Company, looking south along Division Street, showing relationship of factory to surrounding area, 213, 215, & 217 Division Street appear on right side of street - Fairbanks Company, 202 Division Street, Rome, Floyd County, GA
Atmospheric and Geophysical Sciences Division Program Report, 1988--1989
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1990-06-01
In 1990, the Atmospheric and Geophysical Sciences Division begins its 17th year as a division. As the Division has grown over the years, its modeling capabilities have expanded to include a broad range of time and space scales ranging from hours to decades and from local to global. Our modeling is now reaching out from its atmospheric focus to treat linkages with the oceans and the land. In this report, we describe the Division's goal and organizational structure. We also provide tables and appendices describing the Division's budget, personnel, models, and publications. 2 figs., 1 tab.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-02
... DEPARTMENT OF JUSTICE [OMB Number 1124-0001] National Security Division; Agency Information..., 10th & Constitution Avenue, NW., National Security Division, Counterespionage Section/Registration Unit... Justice sponsoring the collection: Form Number: NSD- 1. National Security Division, U.S. Department of...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-02
... DEPARTMENT OF JUSTICE [OMB Number 1124-0006] National Security Division; Agency Information...), National Security Division (NSD), will be submitting the following information collection request to the..., 10th & Constitution Avenue, NW., National Security Division, Counterespionage Section/Registration Unit...
Code of Federal Regulations, 2014 CFR
2014-01-01
... applicant for chemical, physical, or microbiological analyses and tests at a Science and Technology Division... Science and Technology Division laboratory, or by a laboratory approved and recognized by the Division to... quality control of procedures. Official plant or Science and Technology Division laboratories can analyze...
Code of Federal Regulations, 2013 CFR
2013-01-01
... applicant for chemical, physical, or microbiological analyses and tests at a Science and Technology Division... Science and Technology Division laboratory, or by a laboratory approved and recognized by the Division to... quality control of procedures. Official plant or Science and Technology Division laboratories can analyze...
Code of Federal Regulations, 2012 CFR
2012-01-01
... applicant for chemical, physical, or microbiological analyses and tests at a Science and Technology Division... Science and Technology Division laboratory, or by a laboratory approved and recognized by the Division to... quality control of procedures. Official plant or Science and Technology Division laboratories can analyze...
Judge, Lawrence W; Craig, Bruce; Baudendistal, Steve; Bodey, Kimberly J
2009-07-01
Research supports the use of preactivity warm-up and stretching, and the purpose of this study was to determine whether college football programs follow these guidelines. Questionnaires designed to gather demographic, professional, and educational information, as well as specific pre- and postactivity practices, were distributed via e-mail to midwestern collegiate programs from NCAA Division I and III conferences. Twenty-three male coaches (12 from Division IA schools and 11 from Division III schools) participated in the study. Division I schools employed certified strength coaches (CSCS; 100%), whereas Division III schools used mainly strength coordinators (73%), with only 25% CSCS. All programs used preactivity warm-up, with the majority employing 2-5 minutes of sport-specific jogging/running drills. Preactivity stretching (5-10 minutes) was performed in 19 programs (91%), with 2 (9%) performing no preactivity stretching. Thirteen respondents used a combination of static/proprioceptive neuromuscular facilitation/ballistic and dynamic flexibility, 5 used only dynamic flexibility, and 1 used only static stretching. All 12 Division I coaches used stretching, whereas only 9 of the 11 Division III coaches did (p = 0.22). The results indicate that younger coaches did not use preactivity stretching (p = 0.30). The majority of the coaches indicated that they did use postactivity stretching, with 11 of the 12 Division I coaches using stretching, whereas only 5 of the 11 Division III coaches used stretching postactivity (p = 0.027). Divisional results show that the majority of Division I coaches use static-style stretching (p = 0.049). The results of this study indicate that divisional status, age, and certification may influence how well research guidelines are followed. Further research is needed to delineate how these factors affect coaching decisions.
2004-02-01
Potential new standard: ASME Boiler and Pressure Vessel Code, Section VIII (BPVC-VIII), Division 1, Rules for Construction of Pressure Vessels; published and available for sale. ASME BPVC-VIII, Division 2, Rules for Construction of Pressure Vessels, Division 2 (Gerry Eisenberg, ASME); published and available for sale. ASME BPVC-VIII, Division 3, Alternate Rules for Construction of Pressure Vessels, Division 3 (Gerry Eisenberg, ASME); published and available for sale.
TRADOC Annual Command History, 1 January to 31 December 1991
1992-06-01
24th Infantry Division (Mechanized), the French 6th Light Armored Division, the 3d Armored Cavalry Regiment, and other units assigned; and VII Corps...under Lt. Gen. Frederick M. Franks, Jr., with the U.S. 1st Infantry Division (Mechanized), 1st and 3d Armored Divisions, the 1st Cavalry Division, the...Alma Ata, Kazakhstan, the presidents of all the former republics except Georgia and the three seceding Baltic states, declared formation of the
Annual Historical Report Calendar Year 1992
1993-04-01
Environmental Stress, Exercise Physiology, Physical Training, Military Performance, Military Nutrition, Military Psychology. ... Occupational Health & Performance Directorate ... the Military Nutrition Division, the Military Performance and Neuroscience Division, the Occupational Medicine Division, and the Occupational Physiology ...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-03
...] International Business Machines Corporation, ITD Business Unit, Division 7, E-mail and Collaboration Group... Business Machines Corporation (IBM), ITD Business Unit, Division 7, E-mail and Collaboration Group... Business Unit, Division 7, E-mail and Collaboration...
14 CFR 139.205 - Amendment of Airport Certification Manual.
Code of Federal Regulations, 2010 CFR
2010-01-01
...) Under § 139.3, the Regional Airports Division Manager may amend any Airport Certification Manual... Airports Division Manager's own initiative, if the Regional Airports Division Manager determines that... proposed amendment to its Airport Certification Manual to the Regional Airports Division Manager at least...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-02
... DEPARTMENT OF JUSTICE [OMB Number 1124-0004] National Security Division: Agency Information...), National Security Division (NSD), will be submitting the following information collection request to the... write to U.S. Department of Justice, 10th & Constitution Avenue, NW., National Security Division...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-24
... Packard Company, Enterprise Business Division, Technical Services America, Global Parts Supply Chain Group... Business Division, Technical Services America, Global Parts Supply Chain Group, Including Leased Workers... Packard Company, Enterprise Business Division, Technical Services America, Global Parts Supply Chain Group...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-27
... of Jason Incorporated, Janesville Acoustics Division, Subsidiary of Jason Partners Holdings LLC... Incorporated, Janesville Acoustics Division, Subsidiary of Jason Partners Holdings LLC to be considered leased... Incorporated, Janesville Acoustics Division, Subsidiary of Jason Partners Holdings LLC. The amended notice...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-19
... Pine, Colville Tribal Enterprise Corporation, Wood Products Division, Including On-Site Contract... Tribal Enterprise Corporation Wood Products Division, Omak, Washington. The notice was published in the... Colville Indian Precision Pine, Colville Tribal Enterprise Corporation, Wood Products Division. The...
Waste minimization/pollution prevention study of high-priority waste streams
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ogle, R.B.
1994-03-01
Although waste minimization has been practiced by the Metals and Ceramics (M&C) Division in the past, the effort has not been uniform or formalized. To establish the groundwork for continuous improvement, the Division Director initiated a more formalized waste minimization and pollution prevention program. Formalization of the division's pollution prevention efforts in fiscal year (FY) 1993 was initiated by a more concerted effort to determine the status of waste generation from division activities. The goal for this effort was to reduce or minimize the wastes identified as having the greatest impact on human health, the environment, and costs. Two broad categories of division wastes were identified as solid/liquid wastes and those relating to energy use (primarily electricity and steam). This report presents information on the nonradioactive solid and liquid wastes generated by division activities. More specifically, the information presented was generated by teams of M&C staff members empowered by the Division Director to study specific waste streams.
Kovacevic, Ismar; Bao, Zhirong
2018-01-01
C. elegans cell divisions that produce an apoptotic daughter cell exhibit Daughter Cell Size Asymmetry (DCSA), producing a larger surviving daughter cell and a smaller daughter cell fated to die. Genetic screens for mutants with defects in apoptosis identified several genes that are also required for the ability of these divisions to produce daughter cells that differ in size. One of these genes, ham-1, encodes a putative transcription factor that regulates a subset of the asymmetric cell divisions that produce an apoptotic daughter cell. In a survey of C. elegans divisions, we found that ham-1 mutations affect primarily anterior/posterior divisions that produce a small anterior daughter cell. The affected divisions include those that generate an apoptotic cell as well as those that generate two surviving cells. Our findings suggest that HAM-1 primarily promotes DCSA in a certain class of asymmetric divisions. PMID:29668718
Wagman, Petra; Nordin, Maria; Alfredsson, Lars; Westerholm, Peter J M; Fransson, Eleonor I
2017-01-01
The amount and perception of domestic work may affect satisfaction with everyday life, but further knowledge is needed about the relationship between domestic work division and health and well-being. The aim was to describe the division of, and satisfaction with, domestic work and responsibility for home/family in adults living with a partner. A further aim was to investigate the associations between these aspects and self-rated life satisfaction and health. Data from the Work, Lipids and Fibrinogen survey collected in 2009 were used, comprising 4924 participants living with a partner. Data were analyzed using logistic regression. The majority shared domestic work and responsibility for home/family equally with their partner. However, more women conducted the majority of the domestic work and were less satisfied with its division. When both division and satisfaction with division were included in the analysis, only satisfaction with the division and the responsibility was associated with higher odds for good life satisfaction. Regarding health, higher odds for good self-rated health were seen in those who were satisfied with their division of responsibility. The results highlight the importance of taking into account not solely the actual division of domestic work but also the satisfaction with it.
Wang, Yi-Wen; Yuan, Jin-Qiang; Gao, Xin; Yang, Xian-Yu
2012-12-01
There are six micronuclear divisions during conjugation of Paramecium caudatum: three prezygotic and three postzygotic divisions. Four haploid nuclei are formed during the first two meiotic prezygotic divisions. Usually only one meiotic product is located in the paroral cone (PC) region at the completion of meiosis, which survives and divides mitotically to complete the third prezygotic division to yield a stationary and a migratory pronucleus. The remaining three located outside of the PC degenerate. The migratory pronuclei are then exchanged between two conjugants and fuse with the stationary pronuclei to form synkarya, which undergo three successive divisions (postzygotic divisions). However, little is known about the surviving mechanism of the PC nuclei. In the current study, stage-specific appearance of cytoplasmic microtubules (cMTs) was indicated during the third prezygotic division by immunofluorescence labeling with anti-alpha tubulin antibodies surrounding the surviving nuclei, including the PC nuclei and the two types of prospective pronuclei. This suggested that cMTs were involved in the formation of a physical barrier, whose function may relate to sequestering and protecting the surviving nuclei from the major cytoplasm, where degeneration of extra-meiotic products occurs, another important nuclear event during the third prezygotic division.
Space Station Freedom (SSF) Data Management System (DMS) performance model data base
NASA Technical Reports Server (NTRS)
Stovall, John R.
1993-01-01
The purpose of this document was originally to be a working document summarizing Space Station Freedom (SSF) Data Management System (DMS) hardware and software design, configuration, performance and estimated loading data from a myriad of source documents such that the parameters provided could be used to build a dynamic performance model of the DMS. The document is published at this time as a close-out of the DMS performance modeling effort resulting from the Clinton Administration mandated Space Station Redesign. The DMS as documented in this report is no longer a part of the redesigned Space Station. The performance modeling effort was a joint undertaking between the National Aeronautics and Space Administration (NASA) Johnson Space Center (JSC) Flight Data Systems Division (FDSD) and the NASA Ames Research Center (ARC) Spacecraft Data Systems Research Branch. The scope of this document is limited to the DMS core network through the Man Tended Configuration (MTC) as it existed prior to the 1993 Clinton Administration mandated Space Station Redesign. Data is provided for the Standard Data Processors (SDP's), Multiplexer/Demultiplexers (MDM's) and Mass Storage Units (MSU's). Planned future releases would have added the additional hardware and software descriptions needed to describe the complete DMS. Performance and loading data through the Permanent Manned Configuration (PMC) was to have been included as it became available. No future releases of this document are presently planned pending completion of the present Space Station Redesign activities and task reassessment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McGoldrick, P.R.; Allison, T.G.
The BASIC2 INTERPRETER was developed to provide a high-level, easy-to-use language for performing both control and computational functions in the MCS-80. The package is supplied as two alternative implementations, hardware and software. The "software" implementation provides the following capabilities: entry and editing of BASIC programs, device-independent I/O, special functions to allow access from BASIC to any I/O port, formatted printing, special INPUT/OUTPUT-and-proceed statements to allow I/O without interrupting BASIC program execution, full arithmetic expressions, limited string manipulation (10 or fewer characters), shorthand forms for common BASIC keywords, immediate-mode BASIC statement execution, and the capability of running a BASIC program that is stored in PROM. The allowed arithmetic operations are addition, subtraction, multiplication, division, and raising a number to a positive integral power. The second, or "hardware", implementation of BASIC2 requires an Am9511 Arithmetic Processing Unit (APU) interfaced to the 8080 microprocessor; arithmetic operations are then performed by the APU. The following additional built-in functions are available in this implementation: square root, sine, cosine, tangent, arcsine, arccosine, arctangent, exponential, logarithm base e, and logarithm base 10. Requirements: MCS-80/8080-based microcomputers; 8080 assembly language; approximately 8K bytes of RAM to store the assembled interpreter, plus additional user program space and the necessary peripheral devices. The hardware implementation requires an Am9511 Arithmetic Processing Unit and an interface board (reference 2).
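To make the two BASIC2 configurations concrete, here is a small Python sketch (not part of the original package) that mirrors the split described above: a core set of operators available in the software implementation, plus the extra built-in functions the hardware (Am9511 APU) implementation adds. The keyword names and structure are illustrative assumptions, not the interpreter's actual syntax.

import math
import operator

# Operations available in the "software" implementation of BASIC2:
# add, subtract, multiply, divide, and raise to a positive integral power.
SOFTWARE_OPS = {
    "+": operator.add,
    "-": operator.sub,
    "*": operator.mul,
    "/": operator.truediv,
    "^": lambda a, n: a ** int(n),   # positive integral exponent assumed
}

# Extra built-in functions provided when an Am9511 APU is present
# (the "hardware" implementation): sqrt, trig, inverse trig, exp, ln, log10.
HARDWARE_FUNCS = {
    "SQR": math.sqrt, "SIN": math.sin, "COS": math.cos, "TAN": math.tan,
    "ASN": math.asin, "ACS": math.acos, "ATN": math.atan,
    "EXP": math.exp, "LOG": math.log, "LGT": math.log10,
}

def apply_op(op, a, b):
    """Evaluate a binary operator from the software-implementation set."""
    if op in SOFTWARE_OPS:
        return SOFTWARE_OPS[op](a, b)
    raise ValueError(f"unsupported operator {op!r}")

def apply_func(name, x, apu_present=False):
    """Evaluate a built-in function, available only in the APU (hardware) build."""
    if apu_present and name in HARDWARE_FUNCS:
        return HARDWARE_FUNCS[name](x)
    raise ValueError(f"{name!r} requires the Am9511 APU implementation")

if __name__ == "__main__":
    print(apply_op("^", 2, 10))                        # 1024, allowed in both builds
    print(apply_func("SQR", 2.0, apu_present=True))    # only in the hardware build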
ANL site response for the DOE FY1994 information resources management long-range plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boxberger, L.M.
1992-03-01
Argonne National Laboratory's ANL Site Response for the DOE FY1994 Information Resources Management (IRM) Long-Range Plan (ANL/TM 500) is one of many contributions to the DOE information resources management long-range planning process and, as such, is an integral part of the DOE policy and program planning system. The Laboratory has constructed this response according to instructions in a Call issued in September 1991 by the DOE Office of IRM Policy, Plans and Oversight. As one of a continuing series, this Site Response is an update and extension of the Laboratory's previous submissions. The response contains both narrative and tabular material. It covers an eight-year period consisting of the base year (FY1991), the current year (FY1992), the budget year (FY1993), the plan year (FY1994), and the out years (FY1995-FY1998). This Site Response was compiled by Argonne National Laboratory's Computing and Telecommunications Division (CTD), which has the responsibility to provide leadership in optimizing computing and information services and disseminating computer-related technologies throughout the Laboratory. The Site Response consists of 5 parts: (1) a site overview, which describes the ANL mission, overall organization structure, the strategic approach to meeting information resource needs, the planning process, major issues, and points of contact; (2) a software plan for DOE contractors and, in Part 2B, an FMS plan for DOE organizations; (3) computing resources; (4) telecommunications; and (5) printing and publishing.
About Us | Alaska Division of Geological & Geophysical Surveys
Alaska Department of Natural Resources, Division of Geological & Geophysical Surveys (DGGS), 3354 College Road, Fairbanks, AK 99709. The Division also administers the 11-member Alaska Seismic Hazards Safety Commission.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-29
..., Inc., Automation and Control Solutions Division, Including On-Site Leased Workers From Manpower... International, Inc., Automation and Control Solutions Division, Rock Island, Illinois. The notice was published...., Automation and Control Solutions Division. The Department has determined that these workers were sufficiently...
28 CFR 3.2 - Assistant Attorney General, Criminal Division.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 28 Judicial Administration 1 2011-07-01 2011-07-01 false Assistant Attorney General, Criminal Division. 3.2 Section 3.2 Judicial Administration DEPARTMENT OF JUSTICE GAMBLING DEVICES § 3.2 Assistant Attorney General, Criminal Division. The Assistant Attorney General, Criminal Division, is authorized to...
28 CFR 3.2 - Assistant Attorney General, Criminal Division.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 28 Judicial Administration 1 2012-07-01 2012-07-01 false Assistant Attorney General, Criminal Division. 3.2 Section 3.2 Judicial Administration DEPARTMENT OF JUSTICE GAMBLING DEVICES § 3.2 Assistant Attorney General, Criminal Division. The Assistant Attorney General, Criminal Division, is authorized to...
28 CFR 3.2 - Assistant Attorney General, Criminal Division.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 28 Judicial Administration 1 2013-07-01 2013-07-01 false Assistant Attorney General, Criminal Division. 3.2 Section 3.2 Judicial Administration DEPARTMENT OF JUSTICE GAMBLING DEVICES § 3.2 Assistant Attorney General, Criminal Division. The Assistant Attorney General, Criminal Division, is authorized to...
28 CFR 3.2 - Assistant Attorney General, Criminal Division.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 28 Judicial Administration 1 2014-07-01 2014-07-01 false Assistant Attorney General, Criminal Division. 3.2 Section 3.2 Judicial Administration DEPARTMENT OF JUSTICE GAMBLING DEVICES § 3.2 Assistant Attorney General, Criminal Division. The Assistant Attorney General, Criminal Division, is authorized to...
Biorepositories | Division of Cancer Prevention
Carefully collected and controlled high-quality human biospecimens, annotated with clinical data and properly consented for investigational use, are available through the Division of Cancer Prevention Biorepositories listed in the charts below: Biorepositories Managed by the Division of Cancer Prevention, and Biorepositories Supported by the Division of Cancer Prevention.
Berkeley Lab - Materials Sciences Division
Division directory and links: people, investigators, Division staff, facilities and centers, staff jobs, publications database, events calendar, and newsletter archive. An outline of the Division structure is available at the Organization...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-23
... Services, Inc., Corporate/EIT/CTO Database Management Division, Hartford, CT; Notice of Negative... Services, Inc., Corporate/EIT/CTO Database Management Division, Hartford, Connecticut (The Hartford, Corporate/EIT/CTO Database Management Division). The negative determination was issued on August 19, 2011...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-26
...., a Division of Deluxe Entertainment Services Group, Inc. Hollywood, California; Notice of Revised... of Deluxe Laboratories, Inc., a division of Deluxe Entertainment Services Group, Inc., Hollywood... workers of Deluxe Laboratories, Inc., a division of Deluxe Entertainment Services Group, Inc., Hollywood...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-08
..., Printing & Personal System Americas Division, Marketing Services, Houston, Texas; Notice of Investigation... Division, Marketing Services, Houston, Texas. On January 25, 2013, the Department issued a Notice of... & Personal System Americas Division, Marketing Services, Houston, Texas) to be filed. Because the later-filed...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-19
... and Veneer Colville Tribal Enterprise Corporation Wood Products Division Including On-Site Contract... Veneer, Colville Tribal Enterprise Corporation Wood Products Division, Omak, Washington. The notice was... Enterprise Corporation Wood Products Division. The Department has determined that these workers were...
The Battle of Aschaffenburg: An Example of Late World War 2 Urban Combat in Europe
1989-06-02
U.S. units - 4th Armored Division (LTC Abrams); 1st Bn, 104 Infantry (attached to 4 AD); Corps (LTG Haislip); 44 Infantry Division (MG Dean); 45 Infantry Division (MG Fredericks). German units - Army: Seventh (Gen. der Inf. Felber; Gen. der Inf. Obstfelder, 26 March); Corps (Gen. der Inf. Hahn).
NASA Astrophysics Data System (ADS)
Anonymous
2013-07-01
The Geological Society of America's (GSA) new class of medal and award recipients and fellows includes many AGU members. Medal and award recipients are Stephen G. Pollock, University of Southern Maine: GSA Distinguished Service Award; John R. Wheaton, Montana Bureau of Mines and Geology: John C. Frye Award; Clifford A. Jacobs, National Science Foundation (NSF): Outstanding Contributions Award, Geoinformatics Division; Peter Bird, University of California, Los Angeles (emeritus): George P. Woollard Award, Geophysics Division; Chunmiao Zheng, University of Alabama: O. E. Meinzer Award, Hydrogeology Division; Gerhard Wörner, Georg August Universität Göttingen: Distinguished Geologic Career Award, Mineralogy, Geochemistry, Petrology, and Volcanology Division; Alan D. Howard, University of Virginia: G. K. Gilbert Award, Planetary Geology Division; Michael E. Perkins, University of Utah: Kirk Bryan Award for Research Excellence, Quaternary Geology and Geomorphology Division; and Peter J. Hudleston, University of Minnesota: Career Contribution Award, Structural Geology and Tectonics Division.
Code inspection instructional validation
NASA Technical Reports Server (NTRS)
Orr, Kay; Stancil, Shirley
1992-01-01
The Shuttle Data Systems Branch (SDSB) of the Flight Data Systems Division (FDSD) at Johnson Space Center contracted with Southwest Research Institute (SwRI) to validate the effectiveness of an interactive video course on the code inspection process. The purpose of this project was to determine if this course could be effective for teaching NASA analysts the process of code inspection. In addition, NASA was interested in the effectiveness of this unique type of instruction (Digital Video Interactive), for providing training on software processes. This study found the Carnegie Mellon course, 'A Cure for the Common Code', effective for teaching the process of code inspection. In addition, analysts prefer learning with this method of instruction, or this method in combination with other methods. As is, the course is definitely better than no course at all; however, findings indicate changes are needed. Following are conclusions of this study. (1) The course is instructionally effective. (2) The simulation has a positive effect on student's confidence in his ability to apply new knowledge. (3) Analysts like the course and prefer this method of training, or this method in combination with current methods of training in code inspection, over the way training is currently being conducted. (4) Analysts responded favorably to information presented through scenarios incorporating full motion video. (5) Some course content needs to be changed. (6) Some content needs to be added to the course. SwRI believes this study indicates interactive video instruction combined with simulation is effective for teaching software processes. Based on the conclusions of this study, SwRI has outlined seven options for NASA to consider. SwRI recommends the option which involves creation of new source code and data files, but uses much of the existing content and design from the current course. Although this option involves a significant software development effort, SwRI believes this option will produce the most effective results.
Reinventing User Applications for Mission Control
NASA Technical Reports Server (NTRS)
Trimble, Jay Phillip; Crocker, Alan R.
2010-01-01
In 2006, NASA Ames Research Center's (ARC) Intelligent Systems Division and NASA Johnson Space Center's (JSC) Mission Operations Directorate (MOD) began a collaboration to move user applications for JSC's mission control center to a new software architecture, intended to replace the existing user applications being used for the Space Shuttle and the International Space Station. It must also carry NASA/JSC mission operations forward to the future, meeting the needs of NASA's exploration programs beyond low Earth orbit. Key requirements for the new architecture, called Mission Control Technologies (MCT), are that end users must be able to compose and build their own software displays without the need for programming, or direct support and approval from a platform services organization. Developers must be able to build MCT components using industry standard languages and tools. Each component of MCT must be interoperable with other components, regardless of what organization develops them. For platform service providers and MOD management, MCT must be cost effective, maintainable, and evolvable. MCT software is built from components that are presented to users as composable user objects. A user object is an entity that represents a domain object such as a telemetry point, a command, a timeline, an activity, or a step in a procedure. User objects may be composed and reused; for example, a telemetry point may be used in a traditional monitoring display, and that same telemetry user object may be composed into a procedure step. In either display, that same telemetry point may be shown in different views, such as a plot, an alphanumeric, or a meta-data view, and those views may be changed live and in place. MCT presents users with a single unified user environment that contains all the objects required to perform applicable flight controller tasks. Users therefore do not have to use multiple applications; the traditional boundaries that exist between multiple heterogeneous applications disappear, leaving open the possibility of new operations concepts that are not constrained by the traditional applications paradigm.
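The composability idea described above can be sketched in a few lines. The following Python fragment is purely illustrative (it is not MCT code): it invents a telemetry user object that exposes several views and can be reused inside a procedure step, under the stated assumption that a "view" is just a named rendering of the same underlying object.

class UserObject:
    """Illustrative stand-in for an MCT-style composable user object."""
    def views(self):
        return {}

class TelemetryPoint(UserObject):
    def __init__(self, name, samples):
        self.name = name
        self.samples = samples          # e.g. a list of (time, value) pairs
    def views(self):
        # The same object can be shown as a plot, an alphanumeric value, or meta-data.
        return {
            "alpha": lambda: f"{self.name} = {self.samples[-1][1]}",
            "plot": lambda: [v for _, v in self.samples],
            "meta": lambda: {"name": self.name, "count": len(self.samples)},
        }

class ProcedureStep(UserObject):
    """A step that composes other user objects (here, a telemetry point)."""
    def __init__(self, text, *children):
        self.text = text
        self.children = list(children)
    def views(self):
        return {"step": lambda: (self.text, [c.views()["alpha"]() for c in self.children])}

if __name__ == "__main__":
    cabin_temp = TelemetryPoint("CABIN_TEMP", [(0, 21.0), (1, 21.4)])
    step = ProcedureStep("Verify cabin temperature is nominal", cabin_temp)
    print(cabin_temp.views()["alpha"]())   # alphanumeric view of the telemetry point
    print(step.views()["step"]())          # the same telemetry object reused in a procedure step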
Controlling Infrastructure Costs: Right-Sizing the Mission Control Facility
NASA Technical Reports Server (NTRS)
Martin, Keith; Sen-Roy, Michael; Heiman, Jennifer
2009-01-01
Johnson Space Center's Mission Control Center is a space vehicle, space program agnostic facility. The current operational design is essentially identical to the original facility architecture that was developed and deployed in the mid-90's. In an effort to streamline the support costs of the mission critical facility, the Mission Operations Division (MOD) of Johnson Space Center (JSC) has sponsored an exploratory project to evaluate and inject current state-of-the-practice Information Technology (IT) tools, processes and technology into legacy operations. The general push in the IT industry has been trending towards a data-centric computer infrastructure for the past several years. Organizations facing challenges with facility operations costs are turning to creative solutions combining hardware consolidation, virtualization and remote access to meet and exceed performance, security, and availability requirements. The Operations Technology Facility (OTF) organization at the Johnson Space Center has been chartered to build and evaluate a parallel Mission Control infrastructure, replacing the existing, thick-client distributed computing model and network architecture with a data center model utilizing virtualization to provide the MCC Infrastructure as a Service. The OTF will design a replacement architecture for the Mission Control Facility, leveraging hardware consolidation through the use of blade servers, increasing utilization rates for compute platforms through virtualization while expanding connectivity options through the deployment of secure remote access. The architecture demonstrates the maturity of the technologies generally available in industry today and the ability to successfully abstract the tightly coupled relationship between thick-client software and legacy hardware into a hardware agnostic "Infrastructure as a Service" capability that can scale to meet future requirements of new space programs and spacecraft. This paper discusses the benefits and difficulties that a migration to cloud-based computing philosophies has uncovered when compared to the legacy Mission Control Center architecture. The team consists of system and software engineers with extensive experience with the MCC infrastructure and software currently used to support the International Space Station (ISS) and Space Shuttle program (SSP).
Transition of NOAA's GPS-Met Data Acquisition and Processing System to the Commercial Sector
NASA Astrophysics Data System (ADS)
Jackson, M. E.; Holub, K.; Callahan, W.; Blatt, S.
2014-12-01
In April of 2014, NOAA/OAR/ESRL Global Systems Division (GSD) and Trimble, in collaboration with Earth Networks, Inc. (ENI), signed a Cooperative Research and Development Agreement (CRADA) to transfer the existing NOAA GPS-Met Data Acquisition and Processing System (GPS-Met DAPS) technology to a commercial Trimble/ENI partnership. NOAA's GPS-Met DAPS is currently operated in a pseudo-operational mode but has proven highly reliable, running at over 95% uptime. The DAPS uses the GAMIT software to ingest dual-frequency carrier phase GPS/GNSS observations and ancillary information such as real-time satellite orbits to estimate the zenith tropospheric delay (ZTD) and, where surface MET data are available, retrieve integrated precipitable water vapor (PWV). The NOAA data and products are made available to end users in near real-time. The Trimble/ENI partnership will use the Trimble Pivot™ software with the Atmosphere App to calculate zenith tropospheric delay (ZTD), tropospheric slant delays, and integrated precipitable water vapor (PWV). Evaluation of the Trimble software is underway, starting with a comparison of ZTD and PWV values determined from GPS stations located near NOAA Radiosonde Observation (Upper-Air Observation) launch sites. A success metric was established that requires Trimble's PWV estimates to match ESRL/GSD's to within 1.5 mm 95% of the time, which corresponds to a ZTD uncertainty of less than 10 mm 95% of the time. Initial results indicate that Trimble/ENI data meet and exceed the ZTD metric, but for some stations PWV estimates are out of specification. These discrepancies are primarily due to how offsets between MET and GPS stations are handled and are easily resolved. Additional test networks are proposed that include low terrain/high moisture variability stations, high terrain/low moisture variability stations, as well as high terrain/high moisture variability stations. We will present results from further testing along with a timeline for the transition of the GPS-Met DAPS to an operational commercial service.
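The success metric quoted above is easy to check mechanically. The sketch below (our own, not the CRADA software) takes paired PWV estimates from the two processing systems and reports the fraction of samples agreeing to within 1.5 mm against the 95% threshold; the function name and example data are hypothetical.

import numpy as np

def pwv_agreement(pwv_candidate, pwv_reference, tolerance_mm=1.5, required_fraction=0.95):
    """Return (fraction within tolerance, pass/fail) for paired PWV series in mm."""
    a = np.asarray(pwv_candidate, dtype=float)
    b = np.asarray(pwv_reference, dtype=float)
    within = np.abs(a - b) <= tolerance_mm
    frac = within.mean()
    return frac, frac >= required_fraction

if __name__ == "__main__":
    # Synthetic example data (for illustration only).
    rng = np.random.default_rng(0)
    reference = 20 + 5 * rng.standard_normal(1000)        # "truth" series, mm
    candidate = reference + 0.5 * rng.standard_normal(1000)  # candidate series with small noise
    frac, ok = pwv_agreement(candidate, reference)
    print(f"{frac:.1%} of samples within 1.5 mm -> {'PASS' if ok else 'FAIL'}")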
Smith, Rose; Ford, Kevin R; Myer, Gregory D; Holleran, Adam; Treadway, Erin; Hewett, Timothy E
2007-01-01
Context: The recent increase in women's varsity soccer participation has been accompanied by a lower extremity injury rate that is 2 to 6 times that of their male counterparts. Objective: To define the differences between lower extremity biomechanics (knee abduction and knee flexion measures) and performance (maximal vertical jump height) between National Collegiate Athletic Association Division I and III female soccer athletes during a drop vertical jump. Design: Mixed 2 × 2 design. Setting: Research laboratory. Patients or Other Participants: Thirty-four female collegiate soccer players (Division I: n = 19; Division III: n = 15) participated in the study. The groups were similar in height and mass. Intervention(s): Each subject performed a maximal vertical jump, followed by 3 drop vertical jumps. Main Outcome Measure(s): Kinematics (knee abduction and flexion angles) and kinetics (knee abduction and flexion moments) were measured with a motion analysis system and 2 force platforms during the drop vertical jumps. Results: Knee abduction angular range of motion and knee abduction external moments were not different between groups (P > .05). However, Division I athletes demonstrated decreased knee flexion range of motion (P = .038) and greater peak external knee flexion moment (P = .009) compared with Division III athletes. Division I athletes demonstrated increased vertical jump height compared with Division III (P = .008). Conclusions: Division I athletes demonstrated different sagittal-plane mechanics than Division III athletes, which may facilitate improved performance. The similarities in anterior cruciate ligament injury risk factors (knee abduction torques and angles) may correlate with the consistent incidence of anterior cruciate ligament injury across divisions. PMID:18174935
Cell division plane orientation based on tensile stress in Arabidopsis thaliana
Louveaux, Marion; Julien, Jean-Daniel; Mirabet, Vincent; Boudaoud, Arezki; Hamant, Olivier
2016-01-01
Cell geometry has long been proposed to play a key role in the orientation of symmetric cell division planes. In particular, the recently proposed Besson–Dumais rule generalizes Errera’s rule and predicts that cells divide along one of the local minima of plane area. However, this rule has been tested only on tissues with rather local spherical shape and homogeneous growth. Here, we tested the application of the Besson–Dumais rule to the divisions occurring in the Arabidopsis shoot apex, which contains domains with anisotropic curvature and differential growth. We found that the Besson–Dumais rule works well in the central part of the apex, but fails to account for cell division planes in the saddle-shaped boundary region. Because curvature anisotropy and differential growth prescribe directional tensile stress in that region, we tested the putative contribution of anisotropic stress fields to cell division plane orientation at the shoot apex. To do so, we compared two division rules: geometrical (new plane along the shortest path) and mechanical (new plane along maximal tension). The mechanical division rule reproduced the enrichment of long planes observed in the boundary region. Experimental perturbation of mechanical stress pattern further supported a contribution of anisotropic tensile stress in division plane orientation. Importantly, simulations of tissues growing in an isotropic stress field, and dividing along maximal tension, provided division plane distributions comparable to those obtained with the geometrical rule. We thus propose that division plane orientation by tensile stress offers a general rule for symmetric cell division in plants. PMID:27436908
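To contrast the two division rules compared above, here is a deliberately simplified sketch (ours, with made-up geometry): each candidate division plane is reduced to a length and an in-plane orientation, the geometrical rule picks the shortest plane, and the mechanical rule picks the plane best aligned with the direction of maximal tension. The candidate planes and tension direction are invented values.

import math

def geometrical_rule(planes):
    """Shortest-path rule: choose the candidate plane with minimal length."""
    return min(planes, key=lambda p: p["length"])

def mechanical_rule(planes, tension_angle):
    """Tension rule: choose the plane most nearly parallel to maximal tension."""
    def alignment(p):
        return abs(math.cos(p["angle"] - tension_angle))
    return max(planes, key=alignment)

if __name__ == "__main__":
    # Candidate planes for one cell (length in um, orientation in radians); invented values.
    planes = [
        {"name": "short transverse plane", "length": 5.0, "angle": math.pi / 2},
        {"name": "long longitudinal plane", "length": 9.0, "angle": 0.0},
    ]
    tension = 0.0  # suppose maximal tensile stress runs along the boundary axis (angle 0)
    print("geometrical rule picks:", geometrical_rule(planes)["name"])
    print("mechanical rule picks: ", mechanical_rule(planes, tension)["name"])

With these made-up numbers the two rules disagree, which mirrors the boundary-region behavior reported above: tension can favor a longer plane that the shortest-path rule would never select.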
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-02
... Chemical Corporation Wacker Polymers Division a Subsidiary of Wacker Chemie AG Including On-Site Leased.... and Yoh Managed Staffing South Brunswick, NJ; Wacker Chemical Corporation Wacker Polymers Division a... of Wacker Chemical Corporation, Wacker Polymers Division, a subsidiary of Wacker Chemie AG, including...
The Western Ecology Division (WED) is one of four ecological effects divisions of the National Health and Environmental Effects Research Laboratory. The four divisions are distributed bio-geographically. WED's mission is 1) to provide EPA with national scientific leadership for t...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-21
... Airlines, a Subsidiary of Skywest, Inc., Airport Customer Service Division, Including On-Site Leased... Airlines, a Subsidiary of Skywest, Inc., Airport Customer Service Division v. United States Secretary of... former workers of Atlantic Southeast Airlines, a Subsidiary of Skywest, Inc., Airport Customer Division...
12 CFR 1777.10 - Developments prompting supervisory response.
Code of Federal Regulations, 2010 CFR
2010-01-01
... less than the national HPI four quarters previously, or for any Census Division or Divisions in which... more than five percent less than the HPI for that Division or Divisions four quarters previously; (b...-half of its average quarterly net income for any four-quarter period during the prior eight quarters...
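Read literally, the trigger in this excerpt is a comparison of a house price index (HPI) against its value four quarters earlier. The following sketch is not regulatory text and is simplified to the 5% Census Division condition only; the function name and data are hypothetical.

def hpi_decline_trigger(quarterly_hpi, threshold=0.05, lag_quarters=4):
    """Return True if the latest HPI is more than `threshold` below its value
    `lag_quarters` earlier - a simplified reading of the 5% decline condition."""
    if len(quarterly_hpi) <= lag_quarters:
        return False                      # not enough history to compare
    latest = quarterly_hpi[-1]
    earlier = quarterly_hpi[-1 - lag_quarters]
    return latest < earlier * (1.0 - threshold)

if __name__ == "__main__":
    # Hypothetical quarterly HPI values for one Census Division.
    division_hpi = [210.0, 208.5, 206.0, 203.0, 198.0]
    print(hpi_decline_trigger(division_hpi))   # True: 198.0 < 210.0 * 0.95 = 199.5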
ERIC Educational Resources Information Center
Boote, Stacy K.
2016-01-01
Students' success with fourth-grade content standards builds on mathematical knowledge learned in third grade and creates a conceptual foundation for division standards in subsequent grades that focus on the division algorithm. The division standards in fourth and fifth grade are similar; but in fourth grade, division problem divisors are only one…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-23
... Medical Solutions USA, Inc., Oncology Care Systems Division, Concord, CA; Siemens Medical Solutions USA... Solutions USA, Inc. (Siemens), Oncology Care Systems Division, Concord, California (subject firm). The...., Oncology Care Systems Division, Concord, California (TA-W-73,158) and Siemens Medical Solutions USA, Inc...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-01
... Logistics Division, Including On-Site Leased Workers From Prologistix and Employment Staffing Solutions... 19, 2010, applicable to workers of CEVA Freight, LLC, Dell Logistics Division, including on-site..., North Carolina location of CEVA Freight, LLC, Dell Logistics Division. The Department has determined...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-08
... Division Customer Care, Morgan Hill, California; Notice of Negative Determination on Reconsideration On... Reconsideration for the workers and former workers of Comcast Cable, West Division Customer Care, Morgan Hill... the petition for group eligibility of Comcast Cable, West Division Customer Care, Morgan Hill...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-18
... Division of CareNetwork, Inc., Front End Operations and Account Installation-Product Testing Groups, De... a Division of Carenetwork, Inc. Front End Operations and Account Installation-Product Testing Groups..., a Division of CareNetwork, Inc., Front End Operations and Account Installation- Product Testing...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-04
... Directory LLC, San Francisco Division, Publishing Operations Group, YP Subsidiary Holdings LLC, YP LLC, YP... Directory LLC, San Francisco Division, Publishing Operations Group, YP Subsidiary Holdings LLC, YP LLC, YP... workers of YP Western Directory LLC, San Francisco Division, Publishing Operations Group, YP Subsidiary...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-04
...,846B; TA-W-81,846C; TA-W-81,846D] Goodman Networks, Inc. Core Network Engineering (Deployment Engineering) Division Alpharetta, GA; Goodman Networks, Inc. Core Network Engineering (Deployment Engineering) Division Hunt Valley, MD; Goodman Networks, Inc. Core Network Engineering (Deployment Engineering) Division...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-08
...; Power Technologies Group Division; Including On-Site Leased Workers From Manpower Milwaukee, WI; Notice... former workers of Dana Holding Company, Power Technologies Group Division, Milwaukee, Wisconsin (subject... reconsideration investigation, I determine that workers of Dana Holding Company, Power Technologies Group Division...
Division Quilts: A Measurement Model
ERIC Educational Resources Information Center
Pratt, Sarah S.; Lupton, Tina M.; Richardson, Kerri
2015-01-01
As teachers seek activities to assist students in understanding division as more than just the algorithm, they find many examples of division as fair sharing. However, teachers have few activities to engage students in a quotative (measurement) model of division. Efraim Fischbein and his colleagues (1985) defined two types of whole-number…
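The distinction the article draws can be shown with a quick worked example: partitive (fair-sharing) division asks how big each of a given number of equal shares is, while quotative (measurement) division asks how many groups of a fixed size fit, which is naturally modeled as repeated subtraction. The tiny sketch below is ours, for illustration only.

def partitive(total, num_shares):
    """Fair sharing: split `total` into `num_shares` equal groups; return the share size."""
    return total // num_shares

def quotative(total, group_size):
    """Measurement: repeatedly subtract `group_size`; return how many groups fit (and the leftover)."""
    groups = 0
    while total >= group_size:
        total -= group_size
        groups += 1
    return groups, total

if __name__ == "__main__":
    print(partitive(12, 3))    # 4      -> "3 children share 12 cookies; each gets 4"
    print(quotative(12, 3))    # (4, 0) -> "how many bags of 3 can be made from 12 cookies?"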
Should Reproductive Anatomy Be Taught in University Health Courses?
ERIC Educational Resources Information Center
Powell, Brent; Fletcher, J. Sue
2013-01-01
There has been little research on undergraduate reproductive anatomy education. This pilot study explores knowledge of anatomical reproductive anatomy among university students in a lower division and upper division health course. Using a Qualtrics survey program, a convenience sample of 120 lower division and 157 upper division students for a…
Fighting blind: why US Army Divisions Need a Dedicated Reconnaissance and Security Force
2017-05-25
dedicated reconnaissance and security force. By the 2003 invasion of Iraq, 3d Infantry Division employed a division cavalry squadron that integrated... 3-7 Cavalry conducted reconnaissance and security tasks ahead of and on the flank of 3d Infantry Division as the... division led V Corps' attack from Kuwait to Baghdad in 2003. 3-7 Cavalry's movements confused the enemy about 3d Infantry Division's location and intent
1991-03-15
General - and six ministerial divisions: the Budget Division, the Personnel Management Division, the Quartering, Real Estate and Construction Division, and... Management Division. 39,242 officers served in the Bundeswehr during the first half of the eighties: 26,102 regular line officers (Truppenoffiziere), 1,615... additionally attend a 6-month language course... Command and leadership doctrine, security policy, and armed forces and social sciences; in a fourth area, single
1998-01-01
... Missions of units to the immediate left and right of the division; missions of other units with a significant bearing on the division; attachments and detachments. ... Key inputs ... c. MISSION. d. EXECUTION: Intent of the division commander
Cell division is dispensable but not irrelevant in Streptomyces.
McCormick, Joseph R
2009-12-01
In part, members of the genus Streptomyces have been studied because they produce many important secondary metabolites with antibiotic activity and for the interest in their relatively elaborate life cycle. These sporulating filamentous bacteria are remarkably synchronous for division and genome segregation in specialized aerial hyphae. Streptomycetes share some, but not all, of the division genes identified in the historic model rod-shaped organisms. Curiously, normally essential cell division genes are dispensable for growth and viability of Streptomyces coelicolor. Mainly, cell division plays a more important role in the developmental phase of life than during vegetative growth. Dispensability provides an advantageous genetic system to probe the mechanisms of division proteins, especially those with functions that are poorly understood.
Cell and plastid division are coordinated through the prereplication factor AtCDT1
Raynaud, Cécile; Perennes, Claudette; Reuzeau, Christophe; Catrice, Olivier; Brown, Spencer; Bergounioux, Catherine
2005-01-01
The cell division cycle involves nuclear and cytoplasmic events, namely organelle multiplication and distribution between the daughter cells. Until now, plastid and plant cell division have been considered as independent processes because they can be uncoupled. Here, down-regulation of AtCDT1a and AtCDT1b, members of the prereplication complex, is shown to alter both nuclear DNA replication and plastid division in Arabidopsis thaliana. These data constitute molecular evidence for relationships between the cell-cycle and plastid division. Moreover, the severe developmental defects observed in AtCDT1-RNA interference (RNAi) plants underline the importance of coordinated cell and organelle division for plant growth and morphogenesis. PMID:15928083