ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (MASSCOMP VERSION)
NASA Technical Reports Server (NTRS)
Walters, D.
1994-01-01
The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS can subsequently reference these data geographically to dozens of standard, as well as user-created, projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general-purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. Over 230 modules are presently available to aid the user in performing a wide range of land cover analyses and manipulations. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files, etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display, true color display, and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new application modules to be easily integrated in the future. As a standard capability, ELAS can process data elements exceeding 8 bits in length, including floating-point (noninteger) elements and 16- or 32-bit integers; it is thus able to analyze and process "non-standard" nonimage data. The VAX (ERL-10017) and Concurrent (ERL-10013) versions of ELAS 9.0 are written in FORTRAN and ASSEMBLER for DEC VAX series computers running VMS and Concurrent computers running MTM. The Sun (SSC-00019), Masscomp (SSC-00020), and Silicon Graphics (SSC-00021) versions of ELAS 9.0 are written in FORTRAN 77 and C for Sun4 series computers running SunOS, Masscomp computers running UNIX, and Silicon Graphics IRIS computers running IRIX. The Concurrent version requires at least 15-bit addressing and a direct memory access channel. The VAX and Concurrent versions of ELAS both require floating-point hardware, at least 1Mb of RAM, and approximately 70Mb of disk space. Both versions also require a COMTAL display device in order to display images. For the Sun, Masscomp, and Silicon Graphics versions of ELAS, the disk storage required is approximately 115Mb, and a minimum of 8Mb of RAM is required for execution. The Sun version of ELAS requires either the X Window System Version 11 Revision 4 or Sun OpenWindows Version 2. The Masscomp version requires a GA1000 display device and the associated "gp" library. The Silicon Graphics version requires Silicon Graphics' GL library. ELAS display functions will not work with a monochrome monitor.
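The flexibility to handle elements wider than 8 bits is what lets ELAS treat non-image data uniformly with imagery. As an illustration only (this is not ELAS code; the element-type tags and function names are invented for the sketch), a C routine that stretches one scan line to an 8-bit display range might dispatch on the stored element width like this:

#include <stdint.h>
#include <stdio.h>

/* Hypothetical element-type tags; ELAS's own file headers are not shown here. */
enum elem_type { ELEM_U8, ELEM_I16, ELEM_I32, ELEM_F32 };

/* Scale one scan line to an 8-bit display range, dispatching on element width.
   'raw' points at the line as stored; 'n' is the pixel count. */
static void scale_line(const void *raw, enum elem_type t, int n,
                       double lo, double hi, uint8_t *out)
{
    for (int i = 0; i < n; i++) {
        double v;
        switch (t) {
        case ELEM_U8:  v = ((const uint8_t *)raw)[i]; break;
        case ELEM_I16: v = ((const int16_t *)raw)[i]; break;
        case ELEM_I32: v = ((const int32_t *)raw)[i]; break;
        case ELEM_F32: v = ((const float   *)raw)[i]; break;
        default:       v = 0.0; break;
        }
        double s = (v - lo) / (hi - lo) * 255.0;            /* linear stretch */
        out[i] = (uint8_t)(s < 0.0 ? 0.0 : (s > 255.0 ? 255.0 : s));
    }
}

int main(void)
{
    int16_t elev[4] = { -30, 0, 1200, 2400 };               /* e.g. topographic data */
    uint8_t disp[4];
    scale_line(elev, ELEM_I16, 4, -30.0, 2400.0, disp);
    for (int i = 0; i < 4; i++) printf("%d\n", (int)disp[i]);
    return 0;
}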
The standard distribution medium for the VAX version (ERL-10017) is a set of two 9-track 1600 BPI magnetic tapes in DEC VAX BACKUP format. This version is also available on a TK50 tape cartridge in DEC VAX BACKUP format. The standard distribution medium for the Concurrent version (ERL-10013) is a set of two 9-track 1600 BPI magnetic tapes in Concurrent BACKUP format. The standard distribution medium for the Sun version (SSC-00019) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The standard distribution medium for the Masscomp version (SSC-00020) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The standard distribution medium for the Silicon Graphics version (SSC-00021) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. Version 9.0 was released in 1991. Sun4, SunOS, and OpenWindows are trademarks of Sun Microsystems, Inc. The MIT X Window System is licensed by the Massachusetts Institute of Technology.
ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (SILICON GRAPHICS VERSION)
NASA Technical Reports Server (NTRS)
Walters, D.
1994-01-01
ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (CONCURRENT VERSION)
NASA Technical Reports Server (NTRS)
Pearson, R. W.
1994-01-01
ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (SUN VERSION)
NASA Technical Reports Server (NTRS)
Walters, D.
1994-01-01
ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (DEC VAX VERSION)
NASA Technical Reports Server (NTRS)
Junkin, B. G.
1994-01-01
1990-04-23
developed Ada Real-Time Operating System (ARTOS) for bare machine environments (Target), ACW 1.1I0. Subject terms: Ada programming language, Ada...configuration. Operating System: CSC-developed Ada Real-Time Operating System (ARTOS) for bare machine environments. Memory Size: 4MB. 2.2...Test Method: Testing of the MC Ada V1.2.beta/Concurrent Computer Corporation compiler and the CSC-developed Ada Real-Time Operating System (ARTOS) for
NASA Technical Reports Server (NTRS)
Comfort, R. H.; Horwitz, J. L.
1986-01-01
The temperature and density analyses in the Automated Analysis Program (for the global empirical model) were modified to use flow velocities produced by the flow velocity analysis. Revisions were started to construct an interactive version of the technique for temperature and density analysis used in the Automated Analysis Program. A study of ion and electron heating at high altitudes in the outer plasmasphere was initiated. Also, the analysis of the electron gun experiments on SCATHA was extended to include eclipse operations in order to test a hypothesis that there are interactions between the 50 to 100 eV beam and spacecraft-generated photoelectrons. The MASSCOMP software to be used in taking and displaying data in the two-ion plasma experiment was tested and is now working satisfactorily. Papers published during the report period are listed.
Wright Research and Development Center Test Facilities Handbook
1990-01-01
Variable Temperature (2-400K) and Field (0-5 Tesla) SQUID Susceptometer; Variable Temperature (10-80K) and Field (0-10 Tesla) Transport Current...determine products of combustion using extraction-type probes. INSTRUMENTATION: Minicomputer/data acquisition system; networking provides access to larger...data recorder, Masscomp MC-500 computer with acquisition digitizer, laser and ink-jet printers, low-pass filters, pulse code modulation. AVAILABILITY
1990-06-01
Layer Manipulator is placed; AP, differential pressure across the surface fence; e, IC, mean and turbulent viscous dissipation; Rt, absolute viscosity of...feet long. The zero point for the traversing system is situated 3.3 feet from the inlet end of the blockhouse and ranges over 90% of the semi-open...tenth the absolute air pressure in millimeters of water. A voltage divider further reduces the CD23 output voltage by one-half to accommodate the MASSCOMP
Development of a 32-bit UNIX-based ELAS workstation
NASA Technical Reports Server (NTRS)
Spiering, Bruce A.; Pearson, Ronnie W.; Cheng, Thomas D.
1987-01-01
A mini/microcomputer UNIX-based image analysis workstation has been designed and is being implemented to use the Earth Resources Laboratory Applications Software (ELAS). The hardware system includes a MASSCOMP 5600 computer, which is a 32-bit UNIX-based system (compatible with the AT&T System V and Berkeley 4.2 BSD operating systems), a floating point accelerator, a 474-megabyte fixed disk, a tri-density magnetic tape drive, and an 1152 by 910 by 12-plane color graphics/image interface. The software conversion includes reconfiguring the ELAS driver Master Task, then recompiling and testing the converted application modules. This hardware and software configuration is a self-sufficient image analysis workstation which can be used as a stand-alone system or networked with other compatible workstations.
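As a rough sketch of the kind of portability check such a conversion invites (illustrative only; these are not the conversion steps used in the ELAS port), a small C program can report the word sizes and byte order the recompiled modules will rely on:

#include <stdio.h>
#include <limits.h>
#include <stdint.h>

/* Report the word sizes and byte order a port will rely on.  These checks are
   illustrative only; the actual ELAS conversion steps (reconfiguring the
   driver Master Task, recompiling modules) are not shown. */
int main(void)
{
    printf("char bits     : %d\n", CHAR_BIT);
    printf("sizeof(int)   : %zu\n", sizeof(int));
    printf("sizeof(long)  : %zu\n", sizeof(long));
    printf("sizeof(float) : %zu\n", sizeof(float));

    uint32_t probe = 0x01020304u;
    unsigned char *b = (unsigned char *)&probe;
    printf("byte order    : %s\n", b[0] == 0x01 ? "big-endian" : "little-endian");
    return 0;
}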
NASA Technical Reports Server (NTRS)
Jensen, E. Douglas
1988-01-01
Alpha is a new kind of operating system that is unique in two highly significant ways. First, it is decentralized, transparently providing reliable resource management across physically dispersed nodes so that distributed applications programming can be done largely as though it were centralized. And second, it provides comprehensive, high technology support for real-time system integration and operation, an application area which consists predominantly of aperiodic activities having critical time constraints such as deadlines. Alpha is extremely adaptable so that it can be easily optimized for a wide range of problem-specific functionality, performance, and cost. Alpha is the first systems effort of the Archons Project, and the prototype was created at Carnegie-Mellon University directly on modified Sun multiprocessor workstation hardware. It has been demonstrated with a real-time C2 application. Continuing research is leading to a series of enhanced follow-ons to Alpha; these are portable but initially hosted on Concurrent's MASSCOMP line of multiprocessor products.
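A minimal C sketch of deadline-driven dispatching among aperiodic activities is given below; it is only an illustration of the idea, not Alpha's scheduler, which supports richer time constraints than a bare deadline:

#include <stdio.h>

/* A pending aperiodic activity with a critical time constraint. */
struct activity {
    const char *name;
    double      deadline;   /* absolute time by which it must complete */
};

/* Pick the activity with the earliest deadline (EDF-style selection).
   Illustration only; Alpha's actual scheduling policy is more general. */
static const struct activity *pick_next(const struct activity *a, int n)
{
    const struct activity *best = &a[0];
    for (int i = 1; i < n; i++)
        if (a[i].deadline < best->deadline)
            best = &a[i];
    return best;
}

int main(void)
{
    struct activity pending[] = {
        { "track-update",    12.5 },
        { "display-refresh",  9.0 },
        { "log-flush",       30.0 },
    };
    printf("dispatch: %s\n", pick_next(pending, 3)->name);
    return 0;
}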
The Real Time Display Builder (RTDB)
NASA Technical Reports Server (NTRS)
Kindred, Erick D.; Bailey, Samuel A., Jr.
1989-01-01
The Real Time Display Builder (RTDB) is a prototype interactive graphics tool that builds logic-driven displays. These displays reflect current system status, implement fault detection algorithms in real time, and incorporate the operational knowledge of experienced flight controllers. RTDB utilizes an object-oriented approach that integrates the display symbols with the underlying operational logic. This approach allows the user to specify the screen layout and the driving logic as the display is being built. RTDB is being developed under UNIX in C utilizing the MASSCOMP graphics environment with appropriate functional separation to ease portability to other graphics environments. RTDB grew from the need to develop customized real-time data-driven Space Shuttle systems displays. One display, using initial functionality of the tool, was operational during the orbit phase of STS-26 Discovery. RTDB is being used to produce subsequent displays for the Real Time Data System project currently under development within the Mission Operations Directorate at NASA/JSC. The features of the tool, its current state of development, and its applications are discussed.
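A minimal C sketch of the object-oriented idea described above, in which a display symbol carries the logic that drives it, follows; the structure, field names, and threshold are invented for illustration and are not RTDB's actual design:

#include <stdio.h>

/* A display symbol coupled with the logic that drives it, in the spirit of
   RTDB's object-oriented approach.  Names and fields are hypothetical. */
struct symbol {
    const char *label;
    int  (*fault_logic)(double value);  /* nonzero means show the alarm state */
    double value;                       /* latest telemetry sample */
};

static int over_limit(double v) { return v > 100.0; }

static void redraw(const struct symbol *s)
{
    printf("%-16s %8.2f  %s\n", s->label, s->value,
           s->fault_logic(s->value) ? "ALARM" : "ok");
}

int main(void)
{
    struct symbol cabin_press = { "CABIN dP/dt", over_limit, 42.0 };
    redraw(&cabin_press);               /* normal */
    cabin_press.value = 130.0;          /* new sample arrives */
    redraw(&cabin_press);               /* alarm state */
    return 0;
}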
Concurrent Image Processing Executive (CIPE)
NASA Technical Reports Server (NTRS)
Lee, Meemong; Cooper, Gregory T.; Groom, Steven L.; Mazer, Alan S.; Williams, Winifred I.
1988-01-01
The design and implementation of a Concurrent Image Processing Executive (CIPE), which is intended to become the support system software for a prototype high performance science analysis workstation, are discussed. The target machine for this software is a JPL/Caltech Mark IIIfp Hypercube hosted by either a MASSCOMP 5600 or a Sun-3, Sun-4 workstation; however, the design will accommodate other concurrent machines of similar architecture, i.e., local memory, multiple-instruction-multiple-data (MIMD) machines. The CIPE system provides both a multimode user interface and an applications programmer interface, and has been designed around four loosely coupled modules: (1) user interface, (2) host-resident executive, (3) hypercube-resident executive, and (4) application functions. The loose coupling between modules allows modification of a particular module without significantly affecting the other modules in the system. In order to enhance hypercube memory utilization and to allow expansion of image processing capabilities, a specialized program management method, incremental loading, was devised. To minimize data transfer between the host and the hypercube, a data management method that distributes, redistributes, and tracks data set information was implemented.
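As a small illustration of the bookkeeping such a data management method must do (not CIPE's actual decomposition code), the following C sketch distributes image rows as evenly as possible across the nodes of the machine:

#include <stdio.h>

/* Divide 'rows' image rows as evenly as possible across 'nodes' processors,
   the kind of bookkeeping a distributed data manager must track.  This is an
   illustration only, not CIPE's decomposition scheme. */
static void block_decompose(int rows, int nodes)
{
    int base = rows / nodes, extra = rows % nodes, start = 0;
    for (int p = 0; p < nodes; p++) {
        int count = base + (p < extra ? 1 : 0);   /* first 'extra' nodes get one more row */
        printf("node %2d: rows %4d..%4d (%d rows)\n", p, start, start + count - 1, count);
        start += count;
    }
}

int main(void)
{
    block_decompose(1000, 8);   /* e.g. a 1000-line image on an 8-node cube */
    return 0;
}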
RELAV - RELIABILITY/AVAILABILITY ANALYSIS PROGRAM
NASA Technical Reports Server (NTRS)
Bowerman, P. N.
1994-01-01
RELAV (Reliability/Availability Analysis Program) is a comprehensive analytical tool to determine the reliability or availability of any general system which can be modeled as embedded k-out-of-n groups of items (components) and/or subgroups. Both ground and flight systems at NASA's Jet Propulsion Laboratory have utilized this program. RELAV can assess current system performance during the later testing phases of a system design, as well as model candidate designs/architectures or validate and form predictions during the early phases of a design. Systems are commonly modeled as System Block Diagrams (SBDs). RELAV calculates the success probability of each group of items and/or subgroups within the system assuming k-out-of-n operating rules apply for each group. The program operates on a folding basis; i.e. it works its way towards the system level from the most embedded level by folding related groups into single components. The entire folding process involves probabilities; therefore, availability problems are performed in terms of the probability of success, and reliability problems are performed for specific mission lengths. An enhanced cumulative binomial algorithm is used for groups where all probabilities are equal, while a fast algorithm based upon "Computing k-out-of-n System Reliability", Barlow & Heidtmann, IEEE TRANSACTIONS ON RELIABILITY, October 1984, is used for groups with unequal probabilities. Inputs to the program include a description of the system and any one of the following: 1) availabilities of the items, 2) mean time between failures and mean time to repairs for the items from which availabilities are calculated, 3) mean time between failures and mission length(s) from which reliabilities are calculated, or 4) failure rates and mission length(s) from which reliabilities are calculated. The results are probabilities of success of each group and the system in the given configuration. RELAV assumes exponential failure distributions for reliability calculations and infinite repair resources for availability calculations. No more than 967 items or groups can be modeled by RELAV. If larger problems can be broken into subsystems of 967 items or less, the subsystem results can be used as item inputs to a system problem. The calculated availabilities are steady-state values. Group results are presented in the order in which they were calculated (from the most embedded level out to the system level). This provides a good mechanism to perform trade studies. Starting from the system result and working backwards, the granularity gets finer; therefore, system elements that contribute most to system degradation are detected quickly. RELAV is a C-language program originally developed under the UNIX operating system on a MASSCOMP MC500 computer. It has been modified, as necessary, and ported to an IBM PC compatible with a math coprocessor. The current version of the program runs in the DOS environment and requires a Turbo C vers. 2.0 compiler. RELAV has a memory requirement of 103 KB and was developed in 1989. RELAV is a copyrighted work with all copyright vested in NASA.
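The k-out-of-n group calculation RELAV performs can be illustrated with the standard dynamic-programming computation in the spirit of the Barlow & Heidtmann reference; the C sketch below is a textbook version of that calculation, not RELAV's own code:

#include <stdio.h>
#include <stdlib.h>

/* Probability that at least k of the n items work, given each item's
   individual success probability p[i]. */
static double k_out_of_n(int k, int n, const double *p)
{
    double *q = calloc((size_t)n + 1, sizeof *q);   /* q[j] = P(exactly j work so far) */
    if (!q) return -1.0;                            /* allocation failure sentinel */
    q[0] = 1.0;
    for (int i = 0; i < n; i++) {
        for (int j = i + 1; j >= 1; j--)            /* fold item i in, high j first */
            q[j] = q[j] * (1.0 - p[i]) + q[j - 1] * p[i];
        q[0] *= (1.0 - p[i]);
    }
    double success = 0.0;
    for (int j = k; j <= n; j++) success += q[j];
    free(q);
    return success;
}

int main(void)
{
    /* A 2-out-of-3 group of items with unequal availabilities. */
    double p[3] = { 0.95, 0.90, 0.85 };
    printf("P(2-out-of-3) = %.6f\n", k_out_of_n(2, 3, p));   /* prints 0.974000 */
    return 0;
}

Folding a group into a single "component," as RELAV's folding process does, then amounts to feeding the group's computed success probability back in as one of the p[i] values at the next level up.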
21 CFR 290.6 - Spanish-language version of required warning.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Spanish-language version of required warning. 290... (CONTINUED) DRUGS: GENERAL CONTROLLED DRUGS General Provisions § 290.6 Spanish-language version of required... of this drug to any person other than the patient for whom it was prescribed.” The Spanish version of...
Concurrent Image Processing Executive (CIPE). Volume 1: Design overview
NASA Technical Reports Server (NTRS)
Lee, Meemong; Groom, Steven L.; Mazer, Alan S.; Williams, Winifred I.
1990-01-01
The design and implementation of a Concurrent Image Processing Executive (CIPE), which is intended to become the support system software for a prototype high performance science analysis workstation are described. The target machine for this software is a JPL/Caltech Mark 3fp Hypercube hosted by either a MASSCOMP 5600 or a Sun-3, Sun-4 workstation; however, the design will accommodate other concurrent machines of similar architecture, i.e., local memory, multiple-instruction-multiple-data (MIMD) machines. The CIPE system provides both a multimode user interface and an applications programmer interface, and has been designed around four loosely coupled modules: user interface, host-resident executive, hypercube-resident executive, and application functions. The loose coupling between modules allows modification of a particular module without significantly affecting the other modules in the system. In order to enhance hypercube memory utilization and to allow expansion of image processing capabilities, a specialized program management method, incremental loading, was devised. To minimize data transfer between host and hypercube, a data management method which distributes, redistributes, and tracks data set information was implemented. The data management also allows data sharing among application programs. The CIPE software architecture provides a flexible environment for scientific analysis of complex remote sensing image data, such as planetary data and imaging spectrometry, utilizing state-of-the-art concurrent computation capabilities.
21 CFR 290.6 - Spanish-language version of required warning.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 4 2011-04-01 2011-04-01 false Spanish-language version of required warning. 290.6 Section 290.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) DRUGS: GENERAL CONTROLLED DRUGS General Provisions § 290.6 Spanish-language version of required...
21 CFR 201.16 - Drugs; Spanish-language version of certain required statements.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 4 2011-04-01 2011-04-01 false Drugs; Spanish-language version of certain...; Spanish-language version of certain required statements. An increasing number of medications restricted to... where Spanish is the predominant language. Such labeling is authorized under § 201.15(c). One required...
21 CFR 801.16 - Medical devices; Spanish-language version of certain required statements.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Medical devices; Spanish-language version of....16 Medical devices; Spanish-language version of certain required statements. If devices restricted to... Spanish is the predominant language, such labeling is authorized under § 801.15(c). ...
21 CFR 801.16 - Medical devices; Spanish-language version of certain required statements.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Medical devices; Spanish-language version of....16 Medical devices; Spanish-language version of certain required statements. If devices restricted to prescription use only are labeled solely in Spanish for distribution in the Commonwealth of Puerto Rico where...
21 CFR 201.16 - Drugs; Spanish-language version of certain required statements.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Drugs; Spanish-language version of certain...; Spanish-language version of certain required statements. An increasing number of medications restricted to prescription use only are being labeled solely in Spanish for distribution in the Commonwealth of Puerto Rico...
Kork, John O.
1983-01-01
Version 1.00 of the Asynchronous Communications Support supplied with the IBM Personal Computer must be modified to be used for communications with Multics. Version 2.00 can be used as supplied, but error checking and screen printing capabilities can be added by using modifications very similar to those required for Version 1.00. This paper describes and lists required programs on Multics and appropriate modifications to both Versions 1.00 and 2.00 of the programs supplied by IBM.
A 'Global Reference' Comparator for Biosimilar Development.
Webster, Christopher J; Woollett, Gillian R
2017-08-01
Major drug regulators have indicated in guidance their flexibility to accept some development data for biosimilars generated with reference product versions licensed outside their own jurisdictions, but most authorities require new bridging studies between these versions and the versions of them licensed locally. The costs of these studies are not trivial in absolute terms and, due to the multiplier effect of required repetition by each biosimilar sponsor, their collective costs are substantial. Yet versions of biologics licensed in different jurisdictions usually share the same development data, and any manufacturing changes between versions have been justified by a rigorous comparability process. The fact that a biosimilar is usually expected to be licensed in multiple jurisdictions, in each case as similar to the local reference product, confirms that minor analytical differences between versions of reference biologics are typically inconsequential for clinical outcomes and licensing. A greatly simplified basis for selecting a reference comparator, that does not require conducting new bridging studies, is proposed and justified based on the shared data of the reference product versions as well as the proof offered where biosimilars have already been approved. The relevance of this proposal to the interchangeability designation available in the US is discussed.
GWM-VI: groundwater management with parallel processing for multiple MODFLOW versions
Banta, Edward R.; Ahlfeld, David P.
2013-01-01
Groundwater Management–Version Independent (GWM–VI) is a new version of the Groundwater Management Process of MODFLOW. The Groundwater Management Process couples groundwater-flow simulation with a capability to optimize stresses on the simulated aquifer based on an objective function and constraints imposed on stresses and aquifer state. GWM–VI extends prior versions of Groundwater Management in two significant ways—(1) it can be used with any version of MODFLOW that meets certain requirements on input and output, and (2) it is structured to allow parallel processing of the repeated runs of the MODFLOW model that are required to solve the optimization problem. GWM–VI uses the same input structure for files that describe the management problem as that used by prior versions of Groundwater Management. GWM–VI requires only minor changes to the input files used by the MODFLOW model. GWM–VI uses the Joint Universal Parameter IdenTification and Evaluation of Reliability Application Programming Interface (JUPITER-API) to implement both version independence and parallel processing. GWM–VI communicates with the MODFLOW model by manipulating certain input files and interpreting results from the MODFLOW listing file and binary output files. Nearly all capabilities of prior versions of Groundwater Management are available in GWM–VI. GWM–VI has been tested with MODFLOW-2005, MODFLOW-NWT (a Newton formulation for MODFLOW-2005), MF2005-FMP2 (the Farm Process for MODFLOW-2005), SEAWAT, and CFP (Conduit Flow Process for MODFLOW-2005). This report provides sample problems that demonstrate a range of applications of GWM–VI and the directory structure and input information required to use the parallel-processing capability.
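The repeated, independent MODFLOW runs that GWM-VI parallelizes follow a familiar launch-and-wait pattern; the C sketch below shows that pattern with fork/exec (GWM-VI itself relies on the JUPITER-API rather than raw process calls, and the executable and name-file paths here are hypothetical):

#include <stdio.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

/* Launch several independent flow-model runs in parallel and wait for all of
   them.  The executable name and name-file paths are hypothetical. */
int main(void)
{
    const char *name_files[] = { "run01.nam", "run02.nam", "run03.nam" };
    int nruns = 3;

    for (int i = 0; i < nruns; i++) {
        pid_t pid = fork();
        if (pid == 0) {                               /* child: one model run */
            execlp("mf2005", "mf2005", name_files[i], (char *)NULL);
            perror("execlp");                         /* only reached on failure */
            _exit(127);
        } else if (pid < 0) {
            perror("fork");
            return 1;
        }
    }
    for (int i = 0; i < nruns; i++) {                 /* parent: collect results */
        int status;
        wait(&status);
    }
    puts("all model runs finished");
    return 0;
}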
10 CFR 431.203 - Materials incorporated by reference.
Code of Federal Regulations, 2012 CFR
2012-01-01
.... Environmental Protection Agency “ENERGY STAR Program Requirements for Exit Signs,” Version 2.0 issued January 1... Protection Agency “ENERGY STAR Program Requirements for Exit Signs,” Version 2.0, may be obtained from the...
10 CFR 431.203 - Materials incorporated by reference.
Code of Federal Regulations, 2014 CFR
2014-01-01
.... Environmental Protection Agency “ENERGY STAR Program Requirements for Exit Signs,” Version 2.0 issued January 1... Protection Agency “ENERGY STAR Program Requirements for Exit Signs,” Version 2.0, may be obtained from the...
10 CFR 431.203 - Materials incorporated by reference.
Code of Federal Regulations, 2011 CFR
2011-01-01
.... Environmental Protection Agency “ENERGY STAR Program Requirements for Exit Signs,” Version 2.0 issued January 1... Protection Agency “ENERGY STAR Program Requirements for Exit Signs,” Version 2.0, may be obtained from the...
10 CFR 431.203 - Materials incorporated by reference.
Code of Federal Regulations, 2013 CFR
2013-01-01
.... Environmental Protection Agency “ENERGY STAR Program Requirements for Exit Signs,” Version 2.0 issued January 1... Protection Agency “ENERGY STAR Program Requirements for Exit Signs,” Version 2.0, may be obtained from the...
10 CFR 431.203 - Materials incorporated by reference.
Code of Federal Regulations, 2010 CFR
2010-01-01
.... Environmental Protection Agency “ENERGY STAR Program Requirements for Exit Signs,” Version 2.0 issued January 1... Protection Agency “ENERGY STAR Program Requirements for Exit Signs,” Version 2.0, may be obtained from the...
Doing It Right: 366 answers to computing questions you didn't know you had
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herring, Stuart Davis
Slides include information on history: version control, version control: branches, version control: Git, releases, requirements, readability, readability control flow, global variables, architecture, architecture redundancy, processes, input/output, unix, etcetera.
NASA Astrophysics Data System (ADS)
Hanson, Robert M.
2003-06-01
ORBITAL requires the following software, which is available for free download from the Internet: Netscape Navigator, version 4.75 or higher, or Microsoft Internet Explorer, version 5.0 or higher; Chime Plug-in, version compatible with your OS and browser (available from MDL).
Development of U-Mart System with Plural Brands and Plural Markets
NASA Astrophysics Data System (ADS)
Akimoto, Yoshihito; Mori, Naoki; Ono, Isao; Nakajima, Yoshihiro; Kita, Hajime; Matsumoto, Keinosuke
In this paper, we first discuss the notion that artificial market systems should meet the requirements of fidelity, transparency, reproducibility, and traceability. Next, we introduce the history of the development of the artificial market system named the U-Mart system, which has been developed by the U-Mart project and meets these requirements well. We have already developed the U-Mart system called "U-Mart system version 3.0" to solve problems of older U-Mart systems. In the version 3.0 system, the trading process is modularized and a universal market system can be easily introduced.
However, U-Mart system version 3.0 simulates only a single-brand futures market. Simulation of plural brands and plural markets has been requested by many users. In this paper, we propose a novel U-Mart system called "U-Mart system version 4.0" to solve this problem of U-Mart system version 3.0. We improve the server system, the machine agents, and the GUI in order to simulate plural brands and plural markets in U-Mart system version 4.0. The effectiveness of the proposed system is confirmed by statistical analysis of the results of a spot market simulation with random agents.
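A toy C sketch of carrying several brands/markets side by side, the capability version 4.0 adds, is shown below; the brand names and the random price rule are invented for illustration and are far simpler than U-Mart's order-book matching:

#include <stdio.h>
#include <stdlib.h>

/* Several brands/markets held side by side and updated independently. */
struct market {
    const char *brand;
    double      price;
};

int main(void)
{
    struct market mkts[] = { { "brand-A-futures", 3000.0 }, { "brand-B-futures", 1500.0 } };
    int nmkt = 2, steps = 5;
    srand(42);

    for (int t = 0; t < steps; t++)
        for (int m = 0; m < nmkt; m++) {
            /* a random agent nudges each market's price independently */
            double shock = ((double)rand() / RAND_MAX - 0.5) * 10.0;
            mkts[m].price += shock;
        }
    for (int m = 0; m < nmkt; m++)
        printf("%-16s %.2f\n", mkts[m].brand, mkts[m].price);
    return 0;
}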
A Descriptive Evaluation of Automated Software Cost-Estimation Models,
1986-10-01
Version 1.03D), PCOC (Version 7.01), PRICE S, SLIM (Version 1.1), SoftCost (Version 5.1), SPQR/20 (Version 1.1), and WICOMO (Version 1.3). These...produce detailed Gantt and PERT charts. SPQR/20 is based on a cost model developed at ITT. In addition to cost, schedule, and staffing estimates, it...cases and test runs required, and the effectiveness of pre-test and test activities. SPQR/20 also predicts enhancement and maintenance activities. C
Lambeek, A F; De Hundt, M; Vlemmix, F; Akerboom, B M C; Bais, J M J; Papatsonis, D N M; Mol, B W J; Kok, M
2013-04-01
To evaluate the effect of successful external cephalic version on the incidence of developmental dysplasia of the hip (DDH) requiring treatment in singleton breech presentation at term. Observational cohort study. Three large teaching hospitals in the Netherlands. Women with a singleton breech presentation of 34 weeks of gestation or more, who underwent an external cephalic version attempt. We made a comparison of the incidence of DDH between children born in breech presentation and children born in cephalic presentation after a successful external cephalic version. The incidence of DDH requiring either conservative treatment, with a harness, or surgical treatment. A total of 498 newborns were included in the study, of which 40 (8%) were diagnosed with DDH and 35 required treatment. Multivariate analysis showed that female gender (OR 2.79, 95% CI 1.23-6.35) and successful external cephalic version (OR 0.29, 95% CI 0.09-0.95) were independently associated with DDH. A successful external cephalic version is associated with a lower incidence of DDH, although a high percentage of children born after a successful external cephalic version still appear to have DDH. A larger cohort study is needed to establish the definite nature of this relationship. Until then, we recommend the same screening policy for infants born in cephalic position after a successful external cephalic version as for infants born in breech position. © 2012 The Authors BJOG An International Journal of Obstetrics and Gynaecology © 2012 RCOG.
Separation of Acids, Bases, and Neutral Compounds
NASA Astrophysics Data System (ADS)
Fujita, Megumi; Mah, Helen M.; Sgarbi, Paulo W. M.; Lall, Manjinder S.; Ly, Tai Wei; Browne, Lois M.
2003-01-01
Separation of Acids, Bases, and Neutral Compounds requires the following software, which is available for free download from the Internet: Netscape Navigator, version 4.75 or higher, or Microsoft Internet Explorer, version 5.0 or higher; Chime plug-in, version compatible with your OS and browser (available from MDL); and Flash player, version 5 or higher (available from Macromedia).
Code of Federal Regulations, 2013 CFR
2013-01-01
... “ENERGY STAR Program Requirements for [Compact Fluorescent Lamps] CFLs,” Version dated August 9, 2001... DOE's “ENERGY STAR Program Requirements for [Compact Fluorescent Lamps] CFLs,” Version dated August 9...
Paperless Contract Folder’s (PCF) DoD 5015.2 Certification
2010-06-01
Draft Version Controls... i. Electronic Routing of Purchase Request (Funding) Documents...of electronic records, version control, robust search and retrieval, and automated disposition that is compliant with legal requirements. As shown...h. Draft Version Controls. Draft and versioning controls track the changes to the documents once they are saved. Draft numbers (0.1, 0.2, 0.3
Detection of faults and software reliability analysis
NASA Technical Reports Server (NTRS)
Knight, John C.
1987-01-01
Multi-version or N-version programming is proposed as a method of providing fault tolerance in software. The approach requires the separate, independent preparation of multiple versions of a piece of software for some application. These versions are executed in parallel in the application environment; each receives identical inputs and each produces its version of the required outputs. The outputs are collected by a voter and, in principle, they should all be the same. In practice there may be some disagreement. If this occurs, the results of the majority are taken to be the correct output, and that is the output used by the system. A total of 27 programs were produced. Each of these programs was then subjected to one million randomly-generated test cases. The experiment yielded a number of programs containing faults that are useful for general studies of software reliability as well as studies of N-version programming. Fault tolerance through data diversity and analytic models of comparison testing are discussed.
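The voting step described above can be sketched in a few lines of C; a production voter would also have to handle ties, timeouts, and tolerance bands for floating-point outputs:

#include <stdio.h>

/* Majority voter over the outputs of N independently developed versions.
   Returns 1 and sets *agreed if a strict majority of outputs agree. */
static int vote(const int *outputs, int n, int *agreed)
{
    for (int i = 0; i < n; i++) {
        int count = 0;
        for (int j = 0; j < n; j++)
            if (outputs[j] == outputs[i])
                count++;
        if (2 * count > n) {            /* strict majority */
            *agreed = outputs[i];
            return 1;
        }
    }
    return 0;                           /* no majority: flag a failure */
}

int main(void)
{
    int outputs[3] = { 7, 7, 9 };       /* three versions, one disagrees */
    int result;
    if (vote(outputs, 3, &result))
        printf("majority output: %d\n", result);
    else
        printf("no majority\n");
    return 0;
}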
2012-06-18
Bionetics Corporation, Newark Metrology Operations, complies with the requirements of the current version of ISO/IEC 17025 on the date of...requirements of the current version of ISO/IEC 17025 on the date of calibration. 2. This report may not be reproduced, except in full, without
48 CFR 3439.701 - Internet Protocol version 6.
Code of Federal Regulations, 2013 CFR
2013-10-01
... REGULATION SPECIAL CATEGORIES OF CONTRACTING ACQUISITION OF INFORMATION TECHNOLOGY Department Requirements for Acquisition of Information Technology 3439.701 Internet Protocol version 6. The contracting...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holter, Gregory M
2001-01-26
This Operational Requirements Document (ORD) describes the capabilities that need to be incorporated in the NCC interactive simulation system being developed under the auspices of the NCC development program. The ORD addresses the necessary capabilities (i.e., what the system needs to be able to do); it defines the envelope of situations and circumstances that the NCC system must be able to represent and operate within. The NCC system will be developed in modules over a period of several years. This ORD, Version 2, supersedes the previous version. Future updates of this ORD are anticipated to be issued as needed to guide the development of later versions of the NCC system.
NASA Technical Reports Server (NTRS)
Suhs, Norman E.; Dietz, William E.; Rogers, Stuart E.; Nash, Steven M.; Onufer, Jeffrey T.
2000-01-01
PEGASUS 5.1 is the latest version of the PEGASUS series of mesh interpolation codes. It is a fully three-dimensional code. The main purpose for the development of this latest version was to significantly decrease the number of user inputs required and to allow for easier operation of the code. This guide is to be used with the user's manual for version 4 of PEGASUS. A basic description of methods used in both versions is described in the Version 4 manual. A complete list of all user inputs used in version 5.1 is given in this guide.
48 CFR 3439.701 - Internet Protocol version 6.
Code of Federal Regulations, 2012 CFR
2012-10-01
... ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING ACQUISITION OF INFORMATION TECHNOLOGY Department Requirements for Acquisition of Information Technology 3439.701 Internet Protocol version 6. The contracting...
48 CFR 3439.701 - Internet Protocol version 6.
Code of Federal Regulations, 2011 CFR
2011-10-01
... ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING ACQUISITION OF INFORMATION TECHNOLOGY Department Requirements for Acquisition of Information Technology 3439.701 Internet Protocol version 6. The contracting...
ERIC Educational Resources Information Center
Hart, Joseph T.
Two basic tests for checking memory skills are included in these appendices. The first, the General Information Test, uses the same 150 items for each of its two versions. One version is a completion-type test which measures recall by requiring the examinee to supply a specific response. The other version supplements each of the 150 items with…
IDC System Specification Document Version 1.1.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, James M.; Lober, Randall R.
2015-02-01
This document contains the system specifications derived to satisfy the system requirements found in the IDC System Requirements Document for the IDC Reengineering Phase 2 project. Revisions:
Version  Date     Author/Team                     Revision Description          Authorized by
V1.0     12/2014  IDC Reengineering Project Team  Initial delivery              M. Harris
V1.1     2/2015   IDC Reengineering Project Team  Iteration I2 review comments  M. Harris
NASA Technical Reports Server (NTRS)
Bishop, Matt
1991-01-01
The Network Time Protocol is being used throughout the Internet to provide an accurate time service. The security requirements of such a service are examined, version 2 of the NTP protocol is analyzed to determine how well it meets these requirements, and improvements are suggested where appropriate.
2016 Pesticide General Permit - Pre-publication Version
EPA has posted a pre-publication version of the 2016 Pesticide General Permit (PGP) to help the regulated community become familiar with the permit requirements before the permit becomes effective on October 31, 2016.
docBUILDER - Building Your Useful Metadata for Earth Science Data and Services.
NASA Astrophysics Data System (ADS)
Weir, H. M.; Pollack, J.; Olsen, L. M.; Major, G. R.
2005-12-01
The docBUILDER tool, created by NASA's Global Change Master Directory (GCMD), assists the scientific community in efficiently creating quality data and services metadata. Metadata authors are asked to complete five required fields to ensure enough information is provided for users to discover the data and related services they seek. After the metadata record is submitted to the GCMD, it is reviewed for semantic and syntactic consistency. Currently, two versions are available - a Web-based tool accessible with most browsers (docBUILDERweb) and a stand-alone desktop application (docBUILDERsolo). The Web version is available through the GCMD website, at http://gcmd.nasa.gov/User/authoring.html. This version has been updated and now offers: personalized templates to ease entering similar information for multiple data sets/services; automatic population of Data Center/Service Provider URLs based on the selected center/provider; three-color support to indicate required, recommended, and optional fields; an editable text window containing the XML record, to allow for quick editing; and improved overall performance and presentation. The docBUILDERsolo version offers the ability to create metadata records on a computer wherever you are. Except for installation and the occasional update of keywords, data/service providers are not required to have an Internet connection. This freedom will allow users with portable computers (Windows, Mac, and Linux) to create records in field campaigns, whether in Antarctica or the Australian Outback. This version also offers a spell-checker, in addition to all of the features found in the Web version.
XTALOPT version r11: An open-source evolutionary algorithm for crystal structure prediction
NASA Astrophysics Data System (ADS)
Avery, Patrick; Falls, Zackary; Zurek, Eva
2018-01-01
Version 11 of XTALOPT, an evolutionary algorithm for crystal structure prediction, has now been made available for download from the CPC library or the XTALOPT website, http://xtalopt.github.io. Whereas the previous versions of XTALOPT were published under the Gnu Public License (GPL), the current version is made available under the 3-Clause BSD License, which is an open source license that is recognized by the Open Source Initiative. Importantly, the new version can be executed via a command line interface (i.e., it does not require the use of a Graphical User Interface). Moreover, the new version is written as a stand-alone program, rather than an extension to AVOGADRO.
Versioned distributed arrays for resilience in scientific applications: Global view resilience
Chien, A.; Balaji, P.; Beckman, P.; ...
2015-06-01
Exascale studies project reliability challenges for future high-performance computing (HPC) systems. We propose the Global View Resilience (GVR) system, a library that enables applications to add resilience in a portable, application-controlled fashion using versioned distributed arrays. We describe GVR’s interfaces to distributed arrays, versioning, and cross-layer error recovery. Using several large applications (OpenMC, the preconditioned conjugate gradient solver PCG, ddcMD, and Chombo), we evaluate the programmer effort to add resilience. The required changes are small (<2% LOC), localized, and machine-independent, requiring no software architecture changes. We also measure the overhead of adding GVR versioning and show that overheads below 2% are generally achieved. We conclude that GVR’s interfaces and implementation are flexible and portable and create a gentle-slope path to tolerate growing error rates in future systems.
Detection of faults and software reliability analysis
NASA Technical Reports Server (NTRS)
Knight, J. C.
1986-01-01
Multiversion or N-version programming was proposed as a method of providing fault tolerance in software. The approach requires the separate, independent preparation of multiple versions of a piece of software for some application. Specific topics addressed are: failure probabilities in N-version systems, consistent comparison in N-version systems, descriptions of the faults found in the Knight and Leveson experiment, analytic models of comparison testing, characteristics of the input regions that trigger faults, fault tolerance through data diversity, and the relationship between failures caused by automatically seeded faults.
48 CFR 3439.701 - Internet Protocol version 6.
Code of Federal Regulations, 2014 CFR
2014-10-01
48 CFR 3439.701 Internet Protocol version 6 -- Federal Acquisition Regulations System, Department of Education Acquisition Regulation, Special Categories of Contracting, Acquisition of Information Technology, Department Requirements for Acquisition of Information Technology. The contracting...
Effects of task complexity on activation of language areas in a semantic decision fMRI protocol.
Lopes, Tátila Martins; Yasuda, Clarissa Lin; de Campos, Brunno Machado; Balthazar, Marcio L F; Binder, Jeffrey R; Cendes, Fernando
2016-01-29
Language tasks used for clinical fMRI studies may be too complex for some patients with cognitive impairments, and "easier" versions are sometimes substituted, though the effects on brain activity of such changes in task complexity are largely unknown. To investigate these differences, we compared two versions of an fMRI language comprehension protocol, with different levels of difficulty, in 24 healthy right-handed adults. The protocol contrasted an auditory word comprehension task (semantic decision) with a nonspeech control task using tone sequences (tone decision). In the "complex" version (CV), the semantic decision task required two complex semantic decisions for each word, and the tone decision task required the participant to count the number of target tones in each sequence. In the "easy" version (EV), the semantic task required only a single easier decision, and the tone task required only detection of the presence or absence of a target tone in each sequence. The protocols were adapted for a Brazilian population. Typical left hemisphere language lateralization was observed in 92% of participants for both CV and EV using the whole-brain lateralization index, and typical language lateralization was also observed for other regions of interest. Task performance was superior on the EV compared to the CV (p=0.014). There were many common areas of activation across the two versions; however, the CV produced greater activation in the left superior and middle frontal gyri, angular gyrus, and left posterior cingulate gyrus compared to the EV, the majority of which are areas previously identified with language and semantic processing. The EV produced stronger activation only in a small area in the posterior middle temporal gyrus. These results reveal differences between the two versions of the protocol and provide evidence that both are useful for language lateralization and worked well for the Brazilian population. The complex version produces stronger activation in several nodes of the semantic network and is therefore preferred for participants who can perform these tasks well. Copyright © 2015 Elsevier Ltd. All rights reserved.
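For illustration only (not from the paper): the whole-brain lateralization index referred to above is commonly defined as LI = (L - R)/(L + R), computed from left- and right-hemisphere activation measures such as suprathreshold voxel counts. The following minimal C sketch uses that common definition with hypothetical voxel counts and an illustrative +/-0.2 cutoff.

#include <stdio.h>

/* Minimal sketch (not from the paper): a commonly used laterality index,
 * LI = (L - R) / (L + R), computed from left- and right-hemisphere activation
 * measures (e.g., suprathreshold voxel counts). The +/-0.2 cutoff for calling
 * a result lateralized is illustrative, not the paper's criterion. */
static double laterality_index(double left, double right)
{
    double denom = left + right;
    return (denom != 0.0) ? (left - right) / denom : 0.0;
}

int main(void)
{
    double li = laterality_index(420.0, 150.0);   /* hypothetical voxel counts */
    printf("LI = %.2f (%s)\n", li,
           li > 0.2 ? "left-lateralized" :
           li < -0.2 ? "right-lateralized" : "bilateral");
    return 0;
}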
Salvarzi, Elham; Choobineh, Alireza; Jahangiri, Mehdi; Keshavarzi, Sareh
2018-02-26
Craniometry is a subset of anthropometry that measures the anatomical dimensions of the head and face (craniofacial indicators). These dimensions are used in designing devices applied to the facial area, including respirators. This study was conducted to measure the craniofacial dimensions of Iranian male workers required for face protective equipment design. In this study, facial anthropometric dimensions of 50 randomly selected Iranian male workers were measured by a photographic method using Digimizer version 4.1.1.0. Ten facial dimensions were extracted from the photographs and measured with the software. The mean, standard deviation, and 5th, 50th, and 95th percentiles for each dimension were determined, and the relevant data bank was established. The anthropometric data bank for the 10 dimensions required for respirator design was thus provided for the target group using photo-anthropometric methods. The results showed that Iranian facial dimensions differ from those of other nations and ethnicities. In this pilot study, the anthropometric dimensions required for half-mask respirator design for Iranian male workers were measured with Digimizer version 4.1.1.0. The resulting anthropometric tables could be useful for the design of personal face protective equipment.
Space shuttle on-orbit flight control software requirements, preliminary version
NASA Technical Reports Server (NTRS)
1975-01-01
Software modules associated with various flight control functions for the space shuttle orbiter are described. Data flow, interface requirements, initialization requirements and module sequencing requirements are considered. Block diagrams and tables are included.
Checkpointing in speculative versioning caches
Eichenberger, Alexandre E; Gara, Alan; Gschwind, Michael K; Ohmacht, Martin
2013-08-27
Mechanisms for generating checkpoints in a speculative versioning cache of a data processing system are provided. The mechanisms execute code within the data processing system, wherein the code accesses cache lines in the speculative versioning cache. The mechanisms further determine whether a first condition occurs indicating a need to generate a checkpoint in the speculative versioning cache. The checkpoint is a speculative cache line which is made non-speculative in response to a second condition occurring that requires a roll-back of changes to a cache line corresponding to the speculative cache line. The mechanisms also generate the checkpoint in the speculative versioning cache in response to a determination that the first condition has occurred.
Lima, Alex Vieira; Rech, Cassiano Ricardo; Reis, Rodrigo Siqueira
2013-12-01
The objective of this study was to describe the process of translation and cultural adaptation of the Brazilian version of the Neighborhood Environment Walkability Scale for Youth (NEWS-Y). The original and the Portuguese versions were independently translated and back-translated into English. An expert panel performed semantic analysis and conceptual adaptations. The translated version of the NEWS-Y was applied to a sample of eight adolescents and showed adequate understanding. After minor changes identified in the translation processes, the expert panel considered the Brazilian version of the NEWS-Y semantically and conceptually equivalent. The translated version of the NEWS-Y required a few adjustments to ensure conceptual, item, and semantic adaptation. Further studies are recommended to examine other steps in the cross-cultural adaptation of the Portuguese-language NEWS-Y version in the Brazilian context.
The inherent weaknesses in industrial control systems devices; hacking and defending SCADA systems
NASA Astrophysics Data System (ADS)
Bianco, Louis J.
The North American Electric Reliability Corporation (NERC) is about to enforce its NERC Critical Infrastructure Protection (CIP) Version Five and Six requirements on July 1st, 2016. The NERC CIP requirements are a set of cyber security standards designed to protect cyber assets essential to the reliable operation of the electric grid. The new Version Five and Six requirements are a major revision to the Version Three requirements currently in force. The new requirements also bring substations into scope alongside Energy Control Centers. When the Version Five requirements were originally drafted they were vague, causing in-depth discussions throughout the industry. The ramifications of these requirements have made owners look at their systems in depth, questioning how much money it will take to meet them. Some owners saw backing down from routable networks to non-routable networks as a means to save money, as they would be held to fewer requirements within the standards. Other owners saw removing routable connections as a proper security move. The purpose of this research was to uncover the inherent weaknesses in Industrial Control Systems (ICS) devices, to show how ICS devices can be hacked, and to identify potential protections for these Critical Infrastructure devices. In addition, this research also aimed to validate the decision to move from externally routable connectivity to non-routable connectivity as a security measure rather than as a means of savings. The results reveal that, to ultimately protect Industrial Control Systems, they must be removed from the Internet and all bi-directional external routable connections must be removed. Furthermore, non-routable serial connections should be utilized, and these non-routable serial connections should be encrypted on different layers of the OSI model. The research concluded that most weaknesses in SCADA systems are due to the inherent weaknesses in ICS devices and, because of these weaknesses, human intervention is the biggest threat to SCADA systems.
HYDROLOGIC EVALUATION OF LANDFILL PERFORMANCE (HELP) MODEL - USER'S GUIDE FOR VERSION 3
This report documents the solution methods and process descriptions used in Version 3 of the HELP model. Program documentation, including program options, system and operating requirements, file structures, program structure, and variable descriptions, is provided in a separat...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-29
... deadlines, the current policy limits EAC's ability to address the rare situations that require swift action... Procedure No. 1: Procedures for Voting by Circulation Version 2.0. EAC's current Proposed Rule of Agency...
Development and application of GASP 2.0
NASA Technical Reports Server (NTRS)
Mcgrory, W. D.; Huebner, L. D.; Slack, D. C.; Walters, R. W.
1992-01-01
GASP 2.0 represents a major new release of the computational fluid dynamics code in wide use by the aerospace community. The authors have spent the last two years analyzing the strengths and weaknesses of the previous version of the finite-rate chemistry, Navier-Stokes solution algorithm. The result is a completely redesigned computer code that offers two to four times the performance of previous versions while requiring as little as one quarter of the memory. In addition to the improvements in efficiency over the original code, Version 2.0 contains many new features. A brief discussion of the improvements made to GASP and an application using GASP 2.0 that demonstrates some of the new features are presented.
NASA Technical Reports Server (NTRS)
Juang, Hann-Ming Henry; Tao, Wei-Kuo; Zeng, Xi-Ping; Shie, Chung-Lin; Simpson, Joanne; Lang, Steve
2004-01-01
The capability for massively parallel programming (MPP) using a message passing interface (MPI) has been implemented into a three-dimensional version of the Goddard Cumulus Ensemble (GCE) model. The design for the MPP with MPI uses the concept of maintaining a similar code structure between the whole domain and the portions after decomposition. Hence the model follows the same integration for single and multiple tasks (CPUs). It also requires minimal changes to the original code, so it is easily modified and/or managed by the model developers and users who have little knowledge of MPP. The entire model domain can be sliced into a one- or two-dimensional decomposition with a halo regime overlaid on the partial domains. The halo regime requires that no data be fetched across tasks during the computational stage, but it must be updated before the next computational stage through data exchange via MPI. For reproducibility, transposing data among tasks is required for the spectral transform (Fast Fourier Transform, FFT), which is used in the anelastic version of the model for solving the pressure equation. The performance of the MPI-implemented codes (i.e., the compressible and anelastic versions) was tested on three different computing platforms. The major results are: 1) both versions have speedups of about 99% up to 256 tasks but not for 512 tasks; 2) the anelastic version has better speedup and efficiency because it requires more computation than the compressible version; 3) equal or approximately equal numbers of slices in the x- and y-directions provide the fastest integration due to fewer data exchanges; and 4) one-dimensional slices in the x-direction result in the slowest integration due to the need for more memory relocation for computation.
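For illustration only (not the GCE model code): a minimal C sketch of the halo-exchange pattern described above for a one-dimensional decomposition, using MPI_Sendrecv so that a subsequent stencil update needs no further communication. The slice size and data values are arbitrary.

#include <mpi.h>
#include <stdio.h>

#define NLOC 8   /* interior points per task (illustrative) */

int main(int argc, char **argv)
{
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Local slice with one halo point on each side: u[0] and u[NLOC+1]. */
    double u[NLOC + 2];
    for (int i = 1; i <= NLOC; i++)
        u[i] = rank * NLOC + (i - 1);      /* fill interior with global index */
    u[0] = u[NLOC + 1] = 0.0;

    int left  = (rank > 0)        ? rank - 1 : MPI_PROC_NULL;
    int right = (rank < size - 1) ? rank + 1 : MPI_PROC_NULL;

    /* Update the halo before the next computational stage: send the rightmost
     * interior point to the right neighbour while receiving the left halo,
     * then the mirror exchange for the other side. */
    MPI_Sendrecv(&u[NLOC], 1, MPI_DOUBLE, right, 0,
                 &u[0],    1, MPI_DOUBLE, left,  0,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);
    MPI_Sendrecv(&u[1],        1, MPI_DOUBLE, left,  1,
                 &u[NLOC + 1], 1, MPI_DOUBLE, right, 1,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);

    /* A stencil update can now use u[0..NLOC+1] with no further communication. */
    if (rank == 0)
        printf("task %d: left halo=%.1f right halo=%.1f\n", rank, u[0], u[NLOC + 1]);

    MPI_Finalize();
    return 0;
}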
Chien, Andrew A.; Balaji, Pavan; Dun, Nan; ...
2016-09-08
Exascale studies project reliability challenges for future HPC systems. We present the Global View Resilience (GVR) system, a library for portable resilience. GVR begins with a subset of the Global Arrays interface, and adds new capabilities to create versions, name versions, and compute on version data. Applications can focus versioning where and when it is most productive, and customize for each application structure independently. This control is portable, and its embedding in application source makes it natural to express and easy to maintain. The ability to name multiple versions and “partially materialize” them efficiently makes ambitious forward recovery based on “data slices” across versions or data structures both easy to express and efficient. Using several large applications (OpenMC, the preconditioned conjugate gradient (PCG) solver, ddcMD, and Chombo), we evaluate the programming effort to add resilience. The required changes are small (< 2% lines of code (LOC)), localized and machine-independent, and, perhaps most important, require no software architecture changes. We also measure the overhead of adding GVR versioning and show that overheads below 2% are generally achieved. This suggests that GVR can be implemented in large-scale codes and support portable error recovery with modest investment and runtime impact. Our results are drawn from both IBM BG/Q and Cray XC30 experiments, demonstrating portability. We also present two case studies of flexible error recovery, illustrating how GVR can be used for multi-version rollback recovery and several different forward-recovery schemes. GVR’s multi-versioning enables applications to survive latent errors (silent data corruption) with significant detection latency, and forward recovery can make that recovery extremely efficient. Lastly, our results suggest that GVR is scalable, portable, and efficient. GVR interfaces are flexible, supporting a variety of recovery schemes, and altogether GVR embodies a gentle-slope path to tolerate growing error rates in future extreme-scale systems.
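For illustration only: the function and type names below are hypothetical and are not the GVR API. The sketch merely shows the application-controlled versioning pattern the abstract describes, in which the application decides when to snapshot a version of an array and which version to restore after an error is detected.

#include <stdlib.h>
#include <string.h>
#include <stdio.h>

/* Conceptual sketch only -- these names are hypothetical, NOT the GVR API. */
#define N        1000
#define MAX_VERS 8

typedef struct {
    double data[N];
    double versions[MAX_VERS][N];
    int    nvers;
} versioned_array_t;

/* Create a new version (snapshot) of the array; returns its index. */
static int va_snapshot(versioned_array_t *a)
{
    if (a->nvers == MAX_VERS) return -1;
    memcpy(a->versions[a->nvers], a->data, sizeof a->data);
    return a->nvers++;
}

/* Roll the working data back to an earlier version. */
static void va_restore(versioned_array_t *a, int v)
{
    memcpy(a->data, a->versions[v], sizeof a->data);
}

int main(void)
{
    versioned_array_t *a = calloc(1, sizeof *a);
    int v0 = va_snapshot(a);                 /* version before the risky phase  */

    for (int i = 0; i < N; i++) a->data[i] += 1.0;   /* "compute"               */

    int corrupted = 1;                       /* stand-in for an application check */
    if (corrupted) va_restore(a, v0);        /* application-controlled rollback */

    printf("a[0] = %f after recovery\n", a->data[0]);
    free(a);
    return 0;
}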
NASA Technical Reports Server (NTRS)
Lu, Yun-Chi; Chang, Hyo Duck; Krupp, Brian; Kumar, Ravindra; Swaroop, Anand
1992-01-01
Information on Earth Observing System (EOS) output data products and input data requirements that has been compiled by the Science Processing Support Office (SPSO) at GSFC is presented. Since Version 1.0 of the SPSO Report was released in August 1991, there have been significant changes in the EOS program. In anticipation of a likely budget cut for the EOS Project, NASA HQ restructured the EOS program. An initial program consisting of two large platforms was replaced by plans for multiple, smaller platforms, and some EOS instruments were either deselected or descoped. Updated payload information reflecting the restructured EOS program superseding the August 1991 version of the SPSO report is included. This report has been expanded to cover information on non-EOS data products, and consists of three volumes (Volumes 1, 2, and 3). Volume 1 provides information on instrument outputs and input requirements. Volume 2 is devoted to Interdisciplinary Science (IDS) outputs and input requirements, including the 'best' and 'alternative' match analysis. Volume 3 provides information about retrieval algorithms, non-EOS input requirements of instrument teams and IDS investigators, and availability of non-EOS data products at seven primary Distributed Active Archive Centers (DAAC's).
Supporting ontology adaptation and versioning based on a graph of relevance
NASA Astrophysics Data System (ADS)
Sassi, Najla; Jaziri, Wassim; Alharbi, Saad
2016-11-01
Ontologies have recently become a topic of interest in computer science, since they are seen as semantic support for making data models explicit and enriching them, as well as for ensuring interoperability of data. Moreover, supporting ontology adaptation becomes essential and extremely important, mainly when using ontologies in changing environments. An important issue when dealing with ontology adaptation is the management of several versions. Ontology versioning is a complex and multifaceted problem, as it must take into account change management, version storage and access, consistency issues, etc. The purpose of this paper is to propose an approach and tool for ontology adaptation and versioning. A series of techniques is proposed to 'safely' evolve a given ontology and produce a new consistent version. The ontology versions are ordered in a graph according to their relevance. The relevance is computed based on four criteria: conceptualisation, usage frequency, abstraction and completeness. The techniques to carry out the versioning process are implemented in the Consistology tool, which has been developed to assist users in expressing adaptation requirements and managing ontology versions.
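For illustration only (the paper's exact aggregation is not given here): a minimal C sketch that scores ontology versions on the four named criteria using a hypothetical equal-weight sum and orders them by relevance, which is the kind of ordering the version graph relies on.

#include <stdio.h>
#include <stdlib.h>

/* Illustrative sketch only: the four criteria come from the abstract, but the
 * equal weights and the sample scores below are hypothetical. */
typedef struct {
    const char *name;
    double conceptualisation, usage_frequency, abstraction, completeness;
    double relevance;
} ontology_version_t;

static double relevance(const ontology_version_t *v)
{
    const double w = 0.25;                 /* hypothetical equal weights */
    return w * v->conceptualisation + w * v->usage_frequency
         + w * v->abstraction       + w * v->completeness;
}

static int by_relevance_desc(const void *a, const void *b)
{
    double ra = ((const ontology_version_t *)a)->relevance;
    double rb = ((const ontology_version_t *)b)->relevance;
    return (rb > ra) - (rb < ra);
}

int main(void)
{
    ontology_version_t vers[] = {
        {"v1", 0.8, 0.3, 0.6, 0.7, 0.0},
        {"v2", 0.9, 0.6, 0.5, 0.9, 0.0},
        {"v3", 0.7, 0.9, 0.4, 0.8, 0.0},
    };
    int n = (int)(sizeof vers / sizeof vers[0]);
    for (int i = 0; i < n; i++) vers[i].relevance = relevance(&vers[i]);
    qsort(vers, n, sizeof vers[0], by_relevance_desc);  /* order the version graph */
    for (int i = 0; i < n; i++)
        printf("%s relevance=%.2f\n", vers[i].name, vers[i].relevance);
    return 0;
}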
NASA Technical Reports Server (NTRS)
Kent, James; Holdaway, Daniel
2015-01-01
A number of geophysical applications require the use of the linearized version of the full model. One such example is in numerical weather prediction, where the tangent linear and adjoint versions of the atmospheric model are required for the 4DVAR inverse problem. The part of the model that represents the resolved scale processes of the atmosphere is known as the dynamical core. Advection, or transport, is performed by the dynamical core. It is a central process in many geophysical applications and is a process that often has a quasi-linear underlying behavior. However, over the decades since the advent of numerical modelling, significant effort has gone into developing many flavors of high-order, shape preserving, nonoscillatory, positive definite advection schemes. These schemes are excellent in terms of transporting the quantities of interest in the dynamical core, but they introduce nonlinearity through the use of nonlinear limiters. The linearity of the transport schemes used in Goddard Earth Observing System version 5 (GEOS-5), as well as a number of other schemes, is analyzed using a simple 1D setup. The linearized version of GEOS-5 is then tested using a linear third order scheme in the tangent linear version.
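For illustration only (not the GEOS-5 transport scheme): a minimal C sketch showing why an unlimited advection update is exactly linear in the transported field, so its tangent-linear model is the same operator; adding a nonlinear flux limiter would break the property tested below.

#include <stdio.h>
#include <math.h>

#define N  16
#define PI 3.14159265358979323846

/* One first-order upwind step for dq/dt + c dq/dx = 0 (c > 0, periodic domain).
 * The update is purely linear in q, so its tangent-linear model is the same
 * operator; schemes with nonlinear limiters lose this property. Illustrative
 * only -- not the GEOS-5 scheme. */
static void upwind_step(double q[N], double cfl)
{
    double qn[N];
    for (int i = 0; i < N; i++) {
        int im1 = (i - 1 + N) % N;
        qn[i] = q[i] - cfl * (q[i] - q[im1]);
    }
    for (int i = 0; i < N; i++)
        q[i] = qn[i];
}

int main(void)
{
    double q[N], qp[N], dq[N], dqm[N];
    const double eps = 1e-6, cfl = 0.5;

    for (int i = 0; i < N; i++) {
        q[i]   = sin(2.0 * PI * i / N);
        dq[i]  = cos(2.0 * PI * i / N);   /* perturbation direction */
        qp[i]  = q[i] + eps * dq[i];
        dqm[i] = dq[i];
    }
    upwind_step(q, cfl);
    upwind_step(qp, cfl);
    upwind_step(dqm, cfl);                /* "tangent linear" = same operator */

    /* For a linear scheme, (M(q + eps*dq) - M(q)) / eps equals M(dq). */
    double err = 0.0;
    for (int i = 0; i < N; i++)
        err = fmax(err, fabs((qp[i] - q[i]) / eps - dqm[i]));
    printf("max deviation from linearity: %.2e\n", err);
    return 0;
}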
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cabello, Adan
We introduce an extended version of a previous all-versus-nothing proof of impossibility of Einstein-Podolsky-Rosen's local elements of reality for two photons entangled both in polarization and path degrees of freedom (A. Cabello, quant-ph/0507259), which leads to a Bell's inequality where the classical bound is 8 and the quantum prediction is 16. A simple estimation of the detection efficiency required to close the detection loophole using this extended version gives η > 0.69. This efficiency is lower than that required for previous proposals.
An implicit dispersive transport algorithm for the US Geological Survey MOC3D solute-transport model
Kipp, K.L.; Konikow, Leonard F.; Hornberger, G.Z.
1998-01-01
This report documents an extension to the U.S. Geological Survey MOC3D transport model that incorporates an implicit-in-time difference approximation for the dispersive transport equation, including source/sink terms. The original MOC3D transport model (Version 1) uses the method of characteristics to solve the transport equation on the basis of the velocity field. The original MOC3D solution algorithm incorporates particle tracking to represent advective processes and an explicit finite-difference formulation to calculate dispersive fluxes. The new implicit procedure eliminates several stability criteria required for the previous explicit formulation. This allows much larger transport time increments to be used in dispersion-dominated problems. The decoupling of advective and dispersive transport in MOC3D, however, is unchanged. With the implicit extension, the MOC3D model is upgraded to Version 2. A description of the numerical method of the implicit dispersion calculation, the data-input requirements and output options, and the results of simulator testing and evaluation are presented. Version 2 of MOC3D was evaluated for the same set of problems used for verification of Version 1. These test results indicate that the implicit calculation of Version 2 matches the accuracy of Version 1, yet is more efficient than the explicit calculation for transport problems that are characterized by a grid Peclet number less than about 1.0.
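For illustration only: the grid Peclet number quoted above is commonly defined as Pe = v*dx/D (advective velocity times cell size over the dispersion coefficient); the report's exact definition may differ. The small C sketch below applies that common definition to hypothetical field values to restate the criterion from the abstract.

#include <stdio.h>

/* Grid Peclet number in its common form, Pe = v * dx / D. The threshold of
 * about 1.0 is the one quoted in the abstract for problems where the implicit
 * dispersion scheme pays off; all numbers below are hypothetical. */
static double grid_peclet(double v, double dx, double D)
{
    return v * dx / D;
}

int main(void)
{
    double v = 0.5, dx = 10.0, D = 25.0;        /* hypothetical field values */
    double pe = grid_peclet(v, dx, D);
    printf("Pe = %.2f -> %s\n", pe,
           pe < 1.0 ? "dispersion-dominated: large implicit time steps are attractive"
                    : "advection-dominated: particle tracking still controls the step");
    return 0;
}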
Automated Instructional Management Systems (AIMS) Version III, Users Manual.
ERIC Educational Resources Information Center
New York Inst. of Tech., Old Westbury.
This document sets forth the procedures necessary to utilize and understand the operating characteristics of the Automated Instructional Management System - Version III, a computer-based system for management of educational processes. Directions for initialization, including internal and user files; system and operational input requirements;…
Conservation Reasoning Ability and Performance on BSCS Blue Version Examinations.
ERIC Educational Resources Information Center
Lawson, Anton E.; Nordland, Floyd H.
Twenty-three high school biology students were individually administered three conservation tasks (weight, volume, volume displacement). During one semester, they were examined over the course material using published Biological Sciences Curriculum Study (BSCS) Blue Version examination questions which were previously classified as requiring either…
77 FR 73302 - Extension of Dates for Certain Requirements and Amendment of Form 19b-4
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-10
... amending section A of the General Instructions for Form 19b-4 to state that blank electronic and PDF.... An electronic version of Form 19b-4 is available in EFFS. A PDF version of the Form is also available...
DOT National Transportation Integrated Search
2008-04-23
In order to improve data quality in the SAFER system, two major software changes have been made in the recent SAFER releases. SAFER version 4.9, released in October 2005, has implemented data rules (SAFER CR 131) to support the requirements for manda...
Identification of single-nucleotide variants in RNA-seq data. Current version focuses on detection of RNA editing sites without requiring genome sequence data. New version is under development to separately identify RNA editing sites and genetic variants using RNA-seq data alone.
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1991-01-01
The Finite Difference Time Domain Electromagnetic Scattering Code Version A is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). This manual provides a description of the code and corresponding results for the default scattering problem. In addition to that description, the manual covers the operation, resource requirements, and Version A code capabilities, and includes a description of each subroutine, a brief discussion of the radar cross section computations, and a discussion of the scattering results.
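For illustration only (not taken from the Version A code): a textbook one-dimensional Yee/FDTD update in C, showing the leapfrog E/H structure that underlies the FDTD technique described in these manuals.

#include <stdio.h>
#include <math.h>

#define NX    200
#define NSTEP 400

/* Textbook 1D Yee/FDTD update in free space (normalized units, Courant
 * number 0.5). Shown only to illustrate the leapfrog E/H structure of the
 * FDTD technique; it is not from the Penn State code set. */
int main(void)
{
    static double ez[NX], hy[NX];     /* static storage => initialized to zero */

    for (int n = 0; n < NSTEP; n++) {
        for (int i = 0; i < NX - 1; i++)          /* H update from the curl of E */
            hy[i] += 0.5 * (ez[i + 1] - ez[i]);
        for (int i = 1; i < NX; i++)              /* E update from the curl of H */
            ez[i] += 0.5 * (hy[i] - hy[i - 1]);
        ez[NX / 2] += exp(-((n - 30.0) * (n - 30.0)) / 100.0);  /* Gaussian source */
    }
    printf("ez[%d] = %g after %d time steps\n", NX / 4, ez[NX / 4], NSTEP);
    return 0;
}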
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1992-01-01
The Penn State Finite Difference Time Domain Electromagnetic Code Version B is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version B code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file, a discussion of radar cross section computations, a discussion of some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.
NASA Astrophysics Data System (ADS)
Varanasi, Rao; Mesawich, Michael; Connor, Patrick; Johnson, Lawrence
2017-03-01
Two versions of a specific 2 nm rated filter, with the filtration medium and all other components produced from high density polyethylene (HDPE), one subjected to standard cleaning and the other to specialized ultra-cleaning, were evaluated in terms of their cleanliness characteristics and the defectivity of wafers processed with photoresist filtered through each. With respect to inherent cleanliness, the ultraclean version exhibited a 70% reduction in total metal extractables and a 90% reduction in organics extractables compared to the standard clean version. In terms of particulate cleanliness, the ultraclean version achieved stability of effluent particles 30 nm and larger in about half the time required by the standard clean version, and also exhibited effluent levels at stability almost 90% lower. In evaluating the defectivity of blanket wafers processed with photoresist filtered through either version, initial defect density while using the ultraclean version was about half that observed when the standard clean version was in service, with defectivity also falling more rapidly during subsequent usage of the ultraclean version compared to the standard clean version. Similar behavior was observed for patterned wafers, where the enhanced defect reduction was primarily of bridging defects. The filter evaluation and process-oriented results demonstrate the considerable value of using filters designed to possess optimal intrinsic characteristics, with further improvements possible through enhanced cleaning processes.
HALE UAS Concept of Operations. Version 3.0
NASA Technical Reports Server (NTRS)
2006-01-01
This document is a system level Concept of Operations (CONOPS) from the perspective of future High Altitude Long Endurance (HALE) Unmanned Aircraft Systems (UAS) service providers and National Airspace System (NAS) users. It describes current systems (existing UAS), describes HALE UAS functions and operations to be performed (via sample missions), and offers insight into the user's environment (i.e., the UAS as a system of systems). It is intended to be a source document for NAS UAS operational requirements, and provides a construct for government agencies to use in guiding their regulatory decisions, architecture requirements, and investment strategies. Although it does not describe the technical capabilities of a specific HALE UAS system (which do, and will, vary widely), it is intended to aid in requirements capture and to be used as input to the functional requirements and analysis process. The document provides a basis for development of functional requirements and operational guidelines to achieve unrestricted access into the NAS. This document is an FY06 update to the FY05 Access 5 Project-approved Concept of Operations document previously published in the public domain on the Access 5 open website. This version is recommended for approval for public release as well. The updates are a reorganization of materials from the previous version, with the addition of an updated set of operational requirements, inclusion of sample mission scenarios, and identification of roles and responsibilities of interfaces within flight phases.
NASA Technical Reports Server (NTRS)
1986-01-01
The Johnson Space Center Management Information System (JSCMIS) is an interface to computer data bases at NASA Johnson which allows an authorized user to browse and retrieve information from a variety of sources with minimum effort. This issue gives requirements definition and design specifications for versions 2.1 and 2.1.1, along with documented test scenario environments, and security object design and specifications.
Optical Breath Gas Extravehicular Activity Sensor for the Advanced Portable Life Support System
NASA Technical Reports Server (NTRS)
Wood, William R.; Casias, Miguel E.; Pilgrim, Jeffrey S.; Chullen, Cinda; Campbell, Colin
2016-01-01
The infrared gas transducer used during extravehicular activity (EVA) in the extravehicular mobility unit (EMU) measures and reports the concentration of carbon dioxide (CO2) in the ventilation loop. It is nearing its end of life and there are a limited number remaining. Meanwhile, the next generation advanced portable life support system (PLSS) now being developed requires CO2 sensing technology with performance beyond that presently in use. A laser diode (LD) spectrometer based on wavelength modulation spectroscopy (WMS) is being developed to address both applications by Vista Photonics, Inc. Accommodation within space suits demands that optical sensors meet stringent size, weight, and power requirements. Version 1.0 devices were delivered to NASA Johnson Space Center (JSC) in 2011. The sensors incorporate a laser diode based CO2 channel that also includes an incidental water vapor (humidity) measurement. The prototypes are controlled digitally with a field-programmable gate array (FPGA)/microcontroller architecture. Version 2.0 devices with improved electronics and significantly reduced wetted volumes were delivered to JSC in 2012. A version 2.5 upgrade recently implemented wavelength stabilized operation, better humidity measurement, and much faster data analysis/reporting. A wholly reconfigured version 3.0 will maintain the demonstrated performance of earlier versions while being backwards compatible with the EMU and offering a radiation tolerant architecture.
User's guide for mapIMG 3--Map image re-projection software package
Finn, Michael P.; Mattli, David M.
2012-01-01
Version 0.0 (1995), Dan Steinwand, U.S. Geological Survey (USGS)/Earth Resources Observation Systems (EROS) Data Center (EDC)--Version 0.0 was a command line version for UNIX that required four arguments: the input metadata, the output metadata, the input data file, and the output destination path. Version 1.0 (2003), Stephen Posch and Michael P. Finn, USGS/Mid-Continent Mapping Center (MCMC)--Version 1.0 added a GUI interface that was built using the Qt library for cross platform development. Version 1.01 (2004), Jason Trent and Michael P. Finn, USGS/MCMC--Version 1.01 suggested bounds for the parameters of each projection. Support was added for larger input files, storage of the last used input and output folders, and for TIFF/GeoTIFF input images. Version 2.0 (2005), Robert Buehler, Jason Trent, and Michael P. Finn, USGS/National Geospatial Technical Operations Center (NGTOC)--Version 2.0 added resampling methods (Mean, Mode, Min, Max, and Sum), updated the GUI design, and added the viewer/pre-viewer. The metadata style was changed to XML and was switched to a new naming convention. Version 3.0 (2009), David Mattli and Michael P. Finn, USGS/Center of Excellence for Geospatial Information Science (CEGIS)--Version 3.0 brings optimized resampling methods, an updated GUI, support for less-than-global datasets, and UTM support; the whole codebase was ported to Qt4.
Fleming, Charles B; Mason, W Alex; Haggerty, Kevin P; Thompson, Ronald W; Fernandez, Kate; Casey-Goldstein, Mary; Oats, Robert G
2015-04-01
Engaging and retaining participants are crucial to achieving adequate implementation of parenting interventions designed to prevent problem behaviors among children and adolescents. This study examined predictors of engagement and retention in a group-based family intervention across two versions of the program: a standard version requiring only parent attendance for six sessions and an adapted version with two additional sessions that required attendance by the son or daughter. Families included a parent and an eighth grader who attended one of five high-poverty schools in an urban Pacific Northwest school district. The adapted version of the intervention had a higher rate of engagement than the standard version, a difference that was statistically significant after adjusting for other variables assessed at enrollment in the study. Higher household income and parent education, younger student age, and poorer affective quality in the parent-child relationship predicted greater likelihood of initial attendance. In the adapted version of the intervention, parents of boys were more likely to engage with the program than those of girls. The variables considered did not strongly predict retention, although retention was higher among parents of boys. Retention did not significantly differ between conditions. Asking for child attendance at workshops may have increased engagement in the intervention, while findings for other predictors of attendance point to the need for added efforts to recruit families who have less socioeconomic resources, as well as families who perceive they have less need for services.
ERMes: Open Source Simplicity for Your E-Resource Management
ERIC Educational Resources Information Center
Doering, William; Chilton, Galadriel
2009-01-01
ERMes, the latest version of an electronic resource management system (ERM), is a relational database; content in different tables connects to, and works with, content in other tables. ERMes requires Access 2007 (Windows) or Access 2008 (Mac) to operate, as the database utilizes functionality not available in previous versions of Microsoft Access. The…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-05
... Energy Agency Basic Safety Standards Version 3.0, Draft Safety Requirements DS379 AGENCY: Nuclear Regulatory Commission. ACTION: Notice of Public Meeting on the International Atomic Energy Agency Basic... development of U.S. Government comments on this International Atomic Energy Agency (IAEA) draft General Safety...
The Strategic WAste Minimization Initiative (SWAMI) Software, Version 2.0, is a tool that uses process analysis to identify waste minimization opportunities within an industrial setting. The software requires user-supplied information for process definition, as well as materia...
Using the CoRE Requirements Method with ADARTS. Version 01.00.05
1994-03-01
requirements; combining ADARTS processes and objects derived from CoRE requirements into an ADARTS software architecture design; and taking advantage of...CoRE's precision in the ADARTS process structuring, class structuring, and software architecture design activities. Object-oriented requirements and
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1991-01-01
The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version C is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into fourteen sections: introduction, description of the FDTD method, operation, resource requirements, Version C code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONC.FOR), a section briefly discussing Radar Cross Section (RCS) computations, a section discussing some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1991-01-01
The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version D is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into fourteen sections: introduction, description of the FDTD method, operation, resource requirements, Version D code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMOND.FOR), a section briefly discussing Radar Cross Section (RCS) computations, a section discussing some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1992-01-01
The Penn State Finite Difference Time Domain (FDTD) Electromagnetic Scattering Code Version A is a three dimensional numerical electromagnetic scattering code based on the Finite Difference Time Domain technique. The supplied version of the code is one version of our current three dimensional FDTD code set. The manual provides a description of the code and the corresponding results for the default scattering problem. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version A code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONA.FOR), a section briefly discussing radar cross section (RCS) computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1992-01-01
The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version C is a three-dimensional numerical electromagnetic scattering code based on the Finite Difference Time Domain (FDTD) technique. The supplied version of the code is one version of our current three-dimensional FDTD code set. The manual given here provides a description of the code and corresponding results for several scattering problems. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version C code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONC.FOR), a section briefly discussing radar cross section computations, a section discussing some scattering results, a new problem checklist, references, and figure titles.
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1991-01-01
The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version B is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into fourteen sections: introduction, description of the FDTD method, operation, resource requirements, Version B code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONB.FOR), a section briefly discussing Radar Cross Section (RCS) computations, a section discussing some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.
NASA Technical Reports Server (NTRS)
Carlson, H. W.
1994-01-01
This code was developed to aid design engineers in the selection and evaluation of aerodynamically efficient wing-canard and wing-horizontal-tail configurations that may employ simple hinged-flap systems. Rapid estimates of the longitudinal aerodynamic characteristics of conceptual airplane lifting surface arrangements are provided. The method is particularly well suited to configurations which, because of high speed flight requirements, must employ thin wings with highly swept leading edges. The code is applicable to wings with either sharp or rounded leading edges. The code provides theoretical pressure distributions over the wing, the canard or horizontal tail, and the deflected flap surfaces as well as estimates of the wing lift, drag, and pitching moments which account for attainable leading edge thrust and leading edge separation vortex forces. The wing planform information is specified by a series of leading edge and trailing edge breakpoints for a right hand wing panel. Up to 21 pairs of coordinates may be used to describe both the leading edge and the trailing edge. The code has been written to accommodate 2000 right hand panel elements, but can easily be modified to accommodate a larger or smaller number of elements depending on the capacity of the target computer platform. The code provides solutions for wing surfaces composed of all possible combinations of leading edge and trailing edge flap settings provided by the original deflection multipliers and by the flap deflection multipliers. Up to 25 pairs of leading edge and trailing edge flap deflection schedules may thus be treated simultaneously. The code also provides for an improved accounting of hinge-line singularities in determination of wing forces and moments. To determine lifting surface perturbation velocity distributions, the code provides for a maximum of 70 iterations. The program is constructed so that successive runs may be made with a given code entry. To make additional runs, it is necessary only to add an identification record and the namelist data that are to be changed from the previous run. This code was originally developed in 1989 in FORTRAN V on a CDC 6000 computer system, and was later ported to an MS-DOS environment. Both versions are available from COSMIC. There are only a few differences between the PC version (LAR-14458) and CDC version (LAR-14178) of AERO2S distributed by COSMIC. The CDC version has one main source code file while the PC version has two files which are easier to edit and compile on a PC. The PC version does not require a FORTRAN compiler which supports NAMELIST because a special INPUT subroutine has been added. The CDC version includes two MODIFY decks which can be used to improve the code and prevent the possibility of some infrequently occurring errors while PC-version users will have to make these code changes manually. The PC version includes an executable which was generated with the Ryan McFarland/FORTRAN compiler and requires 253K RAM and an 80x87 math co-processor. Using this executable, the sample case requires about four hours to execute on an 8MHz AT-class microcomputer with a co-processor. The source code conforms to the FORTRAN 77 standard except that it uses variables longer than six characters. With two minor modifications, the PC version should be portable to any computer with a FORTRAN compiler and sufficient memory. The CDC version of AERO2S is available in CDC NOS Internal format on a 9-track 1600 BPI magnetic tape. 
The PC version is available on a set of two 5.25 inch 360K MS-DOS format diskettes. IBM AT is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. CDC is a registered trademark of Control Data Corporation. NOS is a trademark of Control Data Corporation.
NASA Technical Reports Server (NTRS)
Darden, C. M.
1994-01-01
This code was developed to aid design engineers in the selection and evaluation of aerodynamically efficient wing-canard and wing-horizontal-tail configurations that may employ simple hinged-flap systems. Rapid estimates of the longitudinal aerodynamic characteristics of conceptual airplane lifting surface arrangements are provided. The method is particularly well suited to configurations which, because of high speed flight requirements, must employ thin wings with highly swept leading edges. The code is applicable to wings with either sharp or rounded leading edges. The code provides theoretical pressure distributions over the wing, the canard or horizontal tail, and the deflected flap surfaces as well as estimates of the wing lift, drag, and pitching moments which account for attainable leading edge thrust and leading edge separation vortex forces. The wing planform information is specified by a series of leading edge and trailing edge breakpoints for a right hand wing panel. Up to 21 pairs of coordinates may be used to describe both the leading edge and the trailing edge. The code has been written to accommodate 2000 right hand panel elements, but can easily be modified to accommodate a larger or smaller number of elements depending on the capacity of the target computer platform. The code provides solutions for wing surfaces composed of all possible combinations of leading edge and trailing edge flap settings provided by the original deflection multipliers and by the flap deflection multipliers. Up to 25 pairs of leading edge and trailing edge flap deflection schedules may thus be treated simultaneously. The code also provides for an improved accounting of hinge-line singularities in determination of wing forces and moments. To determine lifting surface perturbation velocity distributions, the code provides for a maximum of 70 iterations. The program is constructed so that successive runs may be made with a given code entry. To make additional runs, it is necessary only to add an identification record and the namelist data that are to be changed from the previous run. This code was originally developed in 1989 in FORTRAN V on a CDC 6000 computer system, and was later ported to an MS-DOS environment. Both versions are available from COSMIC. There are only a few differences between the PC version (LAR-14458) and CDC version (LAR-14178) of AERO2S distributed by COSMIC. The CDC version has one main source code file while the PC version has two files which are easier to edit and compile on a PC. The PC version does not require a FORTRAN compiler which supports NAMELIST because a special INPUT subroutine has been added. The CDC version includes two MODIFY decks which can be used to improve the code and prevent the possibility of some infrequently occurring errors while PC-version users will have to make these code changes manually. The PC version includes an executable which was generated with the Ryan McFarland/FORTRAN compiler and requires 253K RAM and an 80x87 math co-processor. Using this executable, the sample case requires about four hours to execute on an 8MHz AT-class microcomputer with a co-processor. The source code conforms to the FORTRAN 77 standard except that it uses variables longer than six characters. With two minor modifications, the PC version should be portable to any computer with a FORTRAN compiler and sufficient memory. The CDC version of AERO2S is available in CDC NOS Internal format on a 9-track 1600 BPI magnetic tape. 
The PC version is available on a set of two 5.25 inch 360K MS-DOS format diskettes. IBM AT is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. CDC is a registered trademark of Control Data Corporation. NOS is a trademark of Control Data Corporation.
An order (n) algorithm for the dynamics simulation of robotic systems
NASA Technical Reports Server (NTRS)
Chun, H. M.; Turner, J. D.; Frisch, Harold P.
1989-01-01
The formulation of an Order(n) algorithm for DISCOS (Dynamics Interaction Simulation of Controls and Structures), an industry-standard software package for simulation and analysis of flexible multibody systems, is presented. For systems involving many bodies, the new Order(n) version of DISCOS is much faster than the current version. Results of the experimental validation of the dynamics software are also presented. The experiment is carried out on a seven-joint robot arm at NASA's Goddard Space Flight Center. The algorithm used in the current version of DISCOS requires the inverse of a matrix whose dimension is equal to the number of constraints in the system. Generally, the number of constraints in a system is roughly proportional to the number of bodies in the system, and matrix inversion requires O(p^3) operations, where p is the dimension of the matrix. The current version of DISCOS is therefore considered an Order(n^3) algorithm. In contrast, the Order(n) algorithm requires inversion of matrices which are small, and the number of matrices to be inverted increases only linearly with the number of bodies. The newly developed Order(n) DISCOS is currently capable of handling chain and tree topologies as well as multiple closed loops. Continuing development will extend the capability of the software to deal with typical robotics applications such as pick-and-place, multi-arm hand-off, and surface sliding.
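For illustration only: a back-of-the-envelope C sketch of the scaling argument above, comparing one dense p x p constraint-matrix inversion (with p roughly proportional to the number of bodies) against n small fixed-size inversions. The constants (constraints per body, block size) are hypothetical.

#include <stdio.h>

/* Rough flop estimates for the two costs contrasted in the abstract. The
 * constants below are hypothetical; only the growth rates matter. */
int main(void)
{
    const double c = 5.0;   /* assumed constraints per body   */
    const double k = 6.0;   /* assumed small-block dimension  */
    for (int n = 10; n <= 80; n *= 2) {
        double p = c * n;
        double old_cost = p * p * p;        /* O(p^3) global inversion  */
        double new_cost = n * k * k * k;    /* O(n) small inversions    */
        printf("n=%3d bodies: O(p^3) ~ %.2e flops, O(n) ~ %.2e flops\n",
               n, old_cost, new_cost);
    }
    return 0;
}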
DEMAID - A DESIGN MANAGER'S AID FOR INTELLIGENT DECOMPOSITION (SUN VERSION)
NASA Technical Reports Server (NTRS)
Rogers, J. L.
1994-01-01
Many engineering systems are large and multi-disciplinary. Before the design of new complex systems such as large space platforms can begin, the possible interactions among subsystems and their parts must be determined. Once this is completed the proposed system can be decomposed to identify its hierarchical structure. DeMAID (A Design Manager's Aid for Intelligent Decomposition) is a knowledge-based system for ordering the sequence of modules and identifying a possible multilevel structure for the design problem. DeMAID displays the modules in an N x N matrix format (called a design structure matrix) where a module is any process that requires input and generates an output. (Modules which generate an output but do not require an input, such as an initialization process, are also acceptable.) Although DeMAID requires an investment of time to generate and refine the list of modules for input, it could save a considerable amount of money and time in the total design process, particularly in new design problems where the ordering of the modules has not been defined. The decomposition of a complex design system into subsystems requires the judgement of the design manager. DeMAID reorders and groups the modules based on the links (interactions) among the modules, helping the design manager make decomposition decisions early in the design cycle. The modules are grouped into circuits (the subsystems) and displayed in an N x N matrix format. Feedback links, which indicate an iterative process, are minimized and only occur within a subsystem. Since there are no feedback links among the circuits, the circuits can be displayed in a multilevel format. Thus, a large amount of information is reduced to one or two displays which are stored for later retrieval and modification. The design manager and leaders of the design teams then have a visual display of the design problem and the intricate interactions among the different modules. The design manager could save a substantial amount of time if circuits on the same level of the multilevel structure are executed in parallel. DeMAID estimates the time savings based on the number of available processors. In addition to decomposing the system into subsystems, DeMAID examines the dependencies of a problem with independent variables and dependant functions. A dependency matrix is created to show the relationship. DeMAID is based on knowledge base techniques to provide flexibility and ease in adding new capabilities. Although DeMAID was originally written for design problems, it has proven to be very general in solving any problem which contains modules (processes) which take an input and generate an output. For example, one group is applying DeMAID to gain understanding of the data flow of a very large computer program. In this example, the modules are the subroutines of the program. The design manager begins the design of a system by determining the level of modules which need to be ordered. The level is the "granularity" of the problem. For example, the design manager may wish to examine disciplines (a coarse model), analysis programs, or the data level (a fine model). Once the system is divided into these modules, each module's input and output is determined, creating a data file for input to the main program. DeMAID is executed through a system of menus. The user has the choice to plan, schedule, display the N x N matrix, display the multilevel organization, or examine the dependency matrix. 
The main program calls a subroutine which reads a rule file and a data file, asserts facts into the knowledge base, and executes the inference engine of the artificial intelligence/expert systems program, CLIPS (C Language Integrated Production System). To determine the effects of changes in the design process, DeMAID includes a trace effects feature. There are two methods available to trace the effects of a change in the design process. The first method traces forward through the outputs to determine the effects of an output with respect to a change in a particular input. The second method traces backward to determine what modules must be re-executed if the output of a module must be recomputed. DeMAID is available in three machine versions: a Macintosh version which is written in Symantec's Think C 3.01, a Sun version, and an SGI IRIS version, both of which are written in C language. The Macintosh version requires system software 6.0.2 or later and CLIPS 4.3. The source code for the Macintosh version will not compile under version 4.0 of Think C; however, a sample executable is provided on the distribution media. QuickDraw is required for plotting. The Sun version requires GKS 4.1 graphics libraries, OpenWindows 3, and CLIPS 4.3. The SGI IRIS version requires CLIPS 4.3. Since DeMAID is not compatible with CLIPS 5.0 or later, the source code for CLIPS 4.3 is included on the distribution media; however, the documentation for CLIPS 4.3 is not included in the documentation package for DeMAID. It is available from COSMIC separately as the documentation for MSC-21208. The standard distribution medium for the Macintosh version of DeMAID is a set of four 3.5 inch 800K Macintosh format diskettes. The standard distribution medium for the Sun version of DeMAID is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. The standard distribution medium for the IRIS version is a .25 inch IRIX compatible streaming magnetic tape cartridge in UNIX tar format. All versions include sample input. DeMAID was originally developed for use on VAX VMS computers in 1989. The Macintosh version of DeMAID was released in 1991 and updated in 1992. The Sun version of DeMAID was released in 1992 and updated in 1993. The SGI IRIS version was released in 1993.
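The circuit-grouping step described in the DeMAID abstract above corresponds, in graph terms, to finding the strongly connected components of the module-dependency graph and ordering them so that no feedback crosses a circuit boundary. The following rough sketch (Python, with made-up module names and links; DeMAID itself is a CLIPS/C knowledge-based system that also reorders modules within each circuit, which this sketch does not attempt) illustrates the idea.

    from collections import defaultdict

    # Hypothetical module links: module -> modules that consume its output.
    links = {
        "loads": ["structures"],
        "structures": ["aero", "weights"],
        "aero": ["loads", "performance"],   # feedback link: aero -> loads
        "weights": ["performance"],
        "performance": [],
    }

    def strongly_connected_components(graph):
        """Kosaraju's algorithm: each component is one 'circuit' (subsystem)."""
        order, seen = [], set()

        def dfs(g, v, out):
            seen.add(v)
            for w in g[v]:
                if w not in seen:
                    dfs(g, w, out)
            out.append(v)

        for v in graph:                      # first pass: record finish order
            if v not in seen:
                dfs(graph, v, order)

        reverse = defaultdict(list)          # build the reversed graph
        for v, ws in graph.items():
            for w in ws:
                reverse[w].append(v)

        seen.clear()
        components = []
        for v in reversed(order):            # second pass on the reversed graph
            if v not in seen:
                comp = []
                dfs(reverse, v, comp)
                components.append(comp)
        return components                    # emitted in an order with no feedback between circuits

    for number, circuit in enumerate(strongly_connected_components(links)):
        print(f"circuit {number}: {circuit}")

Sorting the rows and columns of the N x N matrix by the resulting circuit order confines all feedback marks to blocks on the diagonal, which is what allows the circuits to be displayed in a multilevel format.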
DEMAID - A DESIGN MANAGER'S AID FOR INTELLIGENT DECOMPOSITION (SGI IRIS VERSION)
NASA Technical Reports Server (NTRS)
Rogers, J. L.
1994-01-01
Many engineering systems are large and multi-disciplinary. Before the design of new complex systems such as large space platforms can begin, the possible interactions among subsystems and their parts must be determined. Once this is completed, the proposed system can be decomposed to identify its hierarchical structure. DeMAID (A Design Manager's Aid for Intelligent Decomposition) is a knowledge-based system for ordering the sequence of modules and identifying a possible multilevel structure for the design problem. DeMAID displays the modules in an N x N matrix format (called a design structure matrix) where a module is any process that requires input and generates an output. (Modules which generate an output but do not require an input, such as an initialization process, are also acceptable.) Although DeMAID requires an investment of time to generate and refine the list of modules for input, it could save a considerable amount of money and time in the total design process, particularly in new design problems where the ordering of the modules has not been defined. The decomposition of a complex design system into subsystems requires the judgement of the design manager. DeMAID reorders and groups the modules based on the links (interactions) among the modules, helping the design manager make decomposition decisions early in the design cycle. The modules are grouped into circuits (the subsystems) and displayed in an N x N matrix format. Feedback links, which indicate an iterative process, are minimized and only occur within a subsystem. Since there are no feedback links among the circuits, the circuits can be displayed in a multilevel format. Thus, a large amount of information is reduced to one or two displays which are stored for later retrieval and modification. The design manager and leaders of the design teams then have a visual display of the design problem and the intricate interactions among the different modules. The design manager could save a substantial amount of time if circuits on the same level of the multilevel structure are executed in parallel. DeMAID estimates the time savings based on the number of available processors. In addition to decomposing the system into subsystems, DeMAID examines the dependencies of a problem with independent variables and dependent functions. A dependency matrix is created to show the relationship. DeMAID is based on knowledge-based techniques to provide flexibility and ease in adding new capabilities. Although DeMAID was originally written for design problems, it has proven to be very general in solving any problem which contains modules (processes) which take an input and generate an output. For example, one group is applying DeMAID to gain understanding of the data flow of a very large computer program. In this example, the modules are the subroutines of the program. The design manager begins the design of a system by determining the level of modules which need to be ordered. The level is the "granularity" of the problem. For example, the design manager may wish to examine disciplines (a coarse model), analysis programs, or the data level (a fine model). Once the system is divided into these modules, each module's input and output is determined, creating a data file for input to the main program. DeMAID is executed through a system of menus. The user has the choice to plan, schedule, display the N x N matrix, display the multilevel organization, or examine the dependency matrix.
The main program calls a subroutine which reads a rule file and a data file, asserts facts into the knowledge base, and executes the inference engine of the artificial intelligence/expert systems program, CLIPS (C Language Integrated Production System). To determine the effects of changes in the design process, DeMAID includes a trace effects feature. There are two methods available to trace the effects of a change in the design process. The first method traces forward through the outputs to determine the effects of an output with respect to a change in a particular input. The second method traces backward to determine what modules must be re-executed if the output of a module must be recomputed. DeMAID is available in three machine versions: a Macintosh version which is written in Symantec's Think C 3.01, a Sun version, and an SGI IRIS version, both of which are written in C language. The Macintosh version requires system software 6.0.2 or later and CLIPS 4.3. The source code for the Macintosh version will not compile under version 4.0 of Think C; however, a sample executable is provided on the distribution media. QuickDraw is required for plotting. The Sun version requires GKS 4.1 graphics libraries, OpenWindows 3, and CLIPS 4.3. The SGI IRIS version requires CLIPS 4.3. Since DeMAID is not compatible with CLIPS 5.0 or later, the source code for CLIPS 4.3 is included on the distribution media; however, the documentation for CLIPS 4.3 is not included in the documentation package for DeMAID. It is available from COSMIC separately as the documentation for MSC-21208. The standard distribution medium for the Macintosh version of DeMAID is a set of four 3.5 inch 800K Macintosh format diskettes. The standard distribution medium for the Sun version of DeMAID is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. The standard distribution medium for the IRIS version is a .25 inch IRIX compatible streaming magnetic tape cartridge in UNIX tar format. All versions include sample input. DeMAID was originally developed for use on VAX VMS computers in 1989. The Macintosh version of DeMAID was released in 1991 and updated in 1992. The Sun version of DeMAID was released in 1992 and updated in 1993. The SGI IRIS version was released in 1993.
DEMAID - A DESIGN MANAGER'S AID FOR INTELLIGENT DECOMPOSITION (MACINTOSH VERSION)
NASA Technical Reports Server (NTRS)
Rogers, J. L.
1994-01-01
Many engineering systems are large and multi-disciplinary. Before the design of new complex systems such as large space platforms can begin, the possible interactions among subsystems and their parts must be determined. Once this is completed, the proposed system can be decomposed to identify its hierarchical structure. DeMAID (A Design Manager's Aid for Intelligent Decomposition) is a knowledge-based system for ordering the sequence of modules and identifying a possible multilevel structure for the design problem. DeMAID displays the modules in an N x N matrix format (called a design structure matrix) where a module is any process that requires input and generates an output. (Modules which generate an output but do not require an input, such as an initialization process, are also acceptable.) Although DeMAID requires an investment of time to generate and refine the list of modules for input, it could save a considerable amount of money and time in the total design process, particularly in new design problems where the ordering of the modules has not been defined. The decomposition of a complex design system into subsystems requires the judgement of the design manager. DeMAID reorders and groups the modules based on the links (interactions) among the modules, helping the design manager make decomposition decisions early in the design cycle. The modules are grouped into circuits (the subsystems) and displayed in an N x N matrix format. Feedback links, which indicate an iterative process, are minimized and only occur within a subsystem. Since there are no feedback links among the circuits, the circuits can be displayed in a multilevel format. Thus, a large amount of information is reduced to one or two displays which are stored for later retrieval and modification. The design manager and leaders of the design teams then have a visual display of the design problem and the intricate interactions among the different modules. The design manager could save a substantial amount of time if circuits on the same level of the multilevel structure are executed in parallel. DeMAID estimates the time savings based on the number of available processors. In addition to decomposing the system into subsystems, DeMAID examines the dependencies of a problem with independent variables and dependent functions. A dependency matrix is created to show the relationship. DeMAID is based on knowledge-based techniques to provide flexibility and ease in adding new capabilities. Although DeMAID was originally written for design problems, it has proven to be very general in solving any problem which contains modules (processes) which take an input and generate an output. For example, one group is applying DeMAID to gain understanding of the data flow of a very large computer program. In this example, the modules are the subroutines of the program. The design manager begins the design of a system by determining the level of modules which need to be ordered. The level is the "granularity" of the problem. For example, the design manager may wish to examine disciplines (a coarse model), analysis programs, or the data level (a fine model). Once the system is divided into these modules, each module's input and output is determined, creating a data file for input to the main program. DeMAID is executed through a system of menus. The user has the choice to plan, schedule, display the N x N matrix, display the multilevel organization, or examine the dependency matrix.
The main program calls a subroutine which reads a rule file and a data file, asserts facts into the knowledge base, and executes the inference engine of the artificial intelligence/expert systems program, CLIPS (C Language Integrated Production System). To determine the effects of changes in the design process, DeMAID includes a trace effects feature. There are two methods available to trace the effects of a change in the design process. The first method traces forward through the outputs to determine the effects of an output with respect to a change in a particular input. The second method traces backward to determine what modules must be re-executed if the output of a module must be recomputed. DeMAID is available in three machine versions: a Macintosh version which is written in Symantec's Think C 3.01, a Sun version, and an SGI IRIS version, both of which are written in C language. The Macintosh version requires system software 6.0.2 or later and CLIPS 4.3. The source code for the Macintosh version will not compile under version 4.0 of Think C; however, a sample executable is provided on the distribution media. QuickDraw is required for plotting. The Sun version requires GKS 4.1 graphics libraries, OpenWindows 3, and CLIPS 4.3. The SGI IRIS version requires CLIPS 4.3. Since DeMAID is not compatible with CLIPS 5.0 or later, the source code for CLIPS 4.3 is included on the distribution media; however, the documentation for CLIPS 4.3 is not included in the documentation package for DeMAID. It is available from COSMIC separately as the documentation for MSC-21208. The standard distribution medium for the Macintosh version of DeMAID is a set of four 3.5 inch 800K Macintosh format diskettes. The standard distribution medium for the Sun version of DeMAID is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. The standard distribution medium for the IRIS version is a .25 inch IRIX compatible streaming magnetic tape cartridge in UNIX tar format. All versions include sample input. DeMAID was originally developed for use on VAX VMS computers in 1989. The Macintosh version of DeMAID was released in 1991 and updated in 1992. The Sun version of DeMAID was released in 1992 and updated in 1993. The SGI IRIS version was released in 1993.
Vincent, Deborah; McEwen, Marylyn M; Pasvogel, Alice
2008-01-01
Translation of data collection instruments, paying careful attention to equivalency between the source and the target language, is important to obtain valid data collection instruments. To translate the Summary of Diabetes Self-Care Activities (SDSCA) questionnaire (English) into Spanish and to evaluate the reliability and validity of the Spanish version. Translation and back-translation were used to develop the Spanish version of the SDSCA. The Spanish version of the SDSCA was reviewed by an expert panel for conceptual and content equivalence to the English version. Psychometric properties were assessed further by combining data from three studies that used the Spanish version as a data collection instrument. Correlation of each item of the Spanish and English version of the SDSCA instrument ranged from .78 to 1.00, with no variability in the responses of 2 of the 12 items. Test-retest correlations for the SDSCA ranged from .51 to 1.00. Internal consistency (Cronbach's alpha) for the Spanish version was .68. Items loaded on three factors, with the factors accounting for 61% of the variance in SDSCA. The findings for the psychometric properties of the Spanish version of the SDSCA questionnaire suggest that it has conceptual and content equivalency with the original English version and is valid and reliable. However, further testing with larger samples is required.
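The internal-consistency figure quoted above is Cronbach's alpha, which can be computed directly from an item-response matrix. A minimal sketch follows, using a small hypothetical response matrix rather than the SDSCA data.

    import numpy as np

    # Hypothetical responses: rows = respondents, columns = questionnaire items.
    scores = np.array([
        [3, 4, 4, 5],
        [2, 2, 3, 3],
        [5, 4, 5, 5],
        [1, 2, 2, 1],
        [4, 3, 4, 4],
    ], dtype=float)

    def cronbach_alpha(items):
        """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    print(f"Cronbach's alpha = {cronbach_alpha(scores):.3f}")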
Omotosho, Tola B; Hardart, Anne; Rogers, Rebecca G; Schaffer, Joseph I; Kobak, William H; Romero, Audrey A
2009-06-01
The purpose of this study is to validate Spanish versions of the Pelvic Floor Distress Inventory (PFDI) and Pelvic Floor Impact Questionnaire (PFIQ). Spanish versions were developed using back translation and validation was performed by randomizing bilingual women to complete the Spanish or English versions of the questionnaires first. Weighted kappa statistics assessed agreement for individual questions; interclass correlation coefficients (ICC) compared primary and subscale scores. Cronbach's alpha assessed internal consistency of Spanish versions. To detect a 2.7 point difference in scores with 80% power and alpha of 0.05, 44 bilingual subjects were required. Individual questions showed good to excellent agreement (kappa > 0.6) for all but eight questions on the PFIQ. ICCs of primary and subscale scores for both questionnaires showed excellent agreement. (All ICC > 0.79). All Cronbach's alpha values were excellent (>0.84) for the primary scales of both questionnaires. Valid and reliable Spanish versions of the PFIQ and PFDI have been developed.
Study of a unified hardware and software fault-tolerant architecture
NASA Technical Reports Server (NTRS)
Lala, Jaynarayan; Alger, Linda; Friend, Steven; Greeley, Gregory; Sacco, Stephen; Adams, Stuart
1989-01-01
A unified architectural concept, called the Fault Tolerant Processor Attached Processor (FTP-AP), that can tolerate hardware as well as software faults is proposed for applications requiring ultrareliable computation capability. An emulation of the FTP-AP architecture, consisting of a breadboard Motorola 68010-based quadruply redundant Fault Tolerant Processor, four VAX 750s as attached processors, and four versions of a transport aircraft yaw damper control law, is used as a testbed in the AIRLAB to examine a number of critical issues. Solutions of several basic problems associated with N-Version software are proposed and implemented on the testbed. This includes a confidence voter to resolve coincident errors in N-Version software. A reliability model of N-Version software that is based upon the recent understanding of software failure mechanisms is also developed. The basic FTP-AP architectural concept appears suitable for hosting N-Version application software while at the same time tolerating hardware failures. Architectural enhancements for greater efficiency, software reliability modeling, and N-Version issues that merit further research are identified.
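The confidence voter itself is specified in the report rather than here; the sketch below only illustrates the general shape of such a voter: N redundant versions produce outputs, outputs that agree within a tolerance are grouped, and the largest group supplies both the voted value and a confidence score. The tolerance, the grouping rule, and the sample outputs are all assumptions for illustration.

    def vote(outputs, tol=1e-6):
        """Majority-vote the outputs of N software versions.

        Outputs within `tol` of each other are treated as agreeing; the function
        returns the mean of the largest agreement group and a confidence equal to
        the fraction of versions in that group.
        """
        groups = []                          # each entry: [representative value, member values]
        for x in outputs:
            for g in groups:
                if abs(x - g[0]) <= tol:
                    g[1].append(x)
                    break
            else:
                groups.append([x, [x]])
        best = max(groups, key=lambda g: len(g[1]))
        confidence = len(best[1]) / len(outputs)
        return sum(best[1]) / len(best[1]), confidence

    # Three versions of a control-law computation disagree on one frame:
    value, confidence = vote([0.412, 0.411, 0.530])
    print(f"voted output = {value:.3f}, confidence = {confidence:.2f}")

A low confidence (for example an even split) signals a coincident-error situation that a fuller voter would have to resolve with additional information.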
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-06
... for Version 1.1 of the Voluntary Voting System Guidelines (VVSG) AGENCY: United States Election... Voluntary Voting System Guidelines (VVSG). SUMMARY: The Help America Vote Act of 2002 (HAVA) (Pub. L. 107... (EAC). Section 202 of HAVA directs the EAC to adopt voluntary voting system guidelines (VVSG) and to...
NASA Technical Reports Server (NTRS)
Gladden, Roy
2007-01-01
Version 2.0 of the autogen software has been released. "Autogen" (automated sequence generation) signifies both a process and software used to implement the process of automated generation of sequences of commands in a standard format for uplink to spacecraft. Autogen requires fewer workers than are needed for older manual sequence-generation processes and reduces sequence-generation times from weeks to minutes.
Modified Mean-Pyramid Coding Scheme
NASA Technical Reports Server (NTRS)
Cheung, Kar-Ming; Romer, Richard
1996-01-01
Modified mean-pyramid coding scheme requires transmission of slightly fewer data. Data-expansion factor reduced from 1/3 to 1/12. Schemes for progressive transmission of image data transmitted in sequence of frames in such way coarse version of image reconstructed after receipt of first frame and increasingly refined version of image reconstructed after receipt of each subsequent frame.
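As a point of reference for the scheme summarized above, the sketch below builds an ordinary (unmodified) mean pyramid and replays the coarse-to-fine reconstructions a receiver would see frame by frame; it does not reproduce the modified scheme's reduction of the data-expansion factor.

    import numpy as np

    def mean_pyramid(img, levels):
        """Return [full resolution, ..., coarsest], each level the 2x2 block mean of the previous."""
        pyr = [img.astype(float)]
        for _ in range(levels):
            a = pyr[-1]
            a = a.reshape(a.shape[0] // 2, 2, a.shape[1] // 2, 2).mean(axis=(1, 3))
            pyr.append(a)
        return pyr

    def progressive_frames(pyr):
        """Yield coarse-to-fine reconstructions, pixel-replicated back to full size."""
        for level in reversed(pyr):
            scale = pyr[0].shape[0] // level.shape[0]
            yield np.kron(level, np.ones((scale, scale)))

    img = np.arange(64, dtype=float).reshape(8, 8)
    for i, frame in enumerate(progressive_frames(mean_pyramid(img, 3))):
        print(f"frame {i}: mean abs error = {np.abs(frame - img).mean():.2f}")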
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-02
... to replace the old version of the official sign with the revised official sign at required locations.... Additionally, a credit union must replace the old version of the official sign with the revised official sign... internet signs and deplete its stockpiles of other printed advertising materials. NCUA also believes that...
1982-05-01
insufficient need for a hard metric version of the ASME Boiler and Pressure Vessel Code and industry would not support the metric version. The Code Is not...aircraft industry is concerned with certification requirements in metric units. The inch-pound Boiler and Pressure Vessel Code is the current standard
ERIC Educational Resources Information Center
Sinharay, Sandip; Holland, Paul W.
2007-01-01
It is a widely held belief that anchor tests should be miniature versions (i.e., "minitests"), with respect to content and statistical characteristics, of the tests being equated. This article examines the foundations for this belief regarding statistical characteristics. It examines the requirement of statistical representativeness of…
10 CFR 431.223 - Materials incorporated by reference.
Code of Federal Regulations, 2013 CFR
2013-01-01
... procedures incorporated by reference. (1) Environmental Protection Agency, “ENERGY STAR Program Requirements... Agency “ENERGY STAR Program Requirements for Traffic Signals,” Version 1.1, may be obtained from the...
10 CFR 431.223 - Materials incorporated by reference.
Code of Federal Regulations, 2014 CFR
2014-01-01
... procedures incorporated by reference. (1) Environmental Protection Agency, “ENERGY STAR Program Requirements... Agency “ENERGY STAR Program Requirements for Traffic Signals,” Version 1.1, may be obtained from the...
10 CFR 431.223 - Materials incorporated by reference.
Code of Federal Regulations, 2012 CFR
2012-01-01
... procedures incorporated by reference. (1) Environmental Protection Agency, “ENERGY STAR Program Requirements... Agency “ENERGY STAR Program Requirements for Traffic Signals,” Version 1.1, may be obtained from the...
Gozzi, Marta; Cherubini, Paolo; Papagno, Costanza; Bricolo, Emanuela
2011-05-01
Previous studies found mixed results concerning the role of working memory (WM) in the gambling task (GT). Here, we aimed at reconciling inconsistencies by showing that the standard version of the task can be solved using intuitive strategies operating automatically, while more complex versions require analytic strategies drawing on executive functions. In Study 1, where good performance on the GT could be achieved using intuitive strategies, participants performed well both with and without a concurrent WM load. In Study 2, where analytical strategies were required to solve a more complex version of the GT, participants without WM load performed well, while participants with WM load performed poorly. In Study 3, where the complexity of the GT was further increased, participants in both conditions performed poorly. In addition to the standard performance measure, we used participants' subjective expected utility, showing that it differs from the standard measure in some important aspects.
Program Processes Thermocouple Readings
NASA Technical Reports Server (NTRS)
Quave, Christine A.; Nail, William, III
1995-01-01
Digital Signal Processor for Thermocouples (DART) computer program implements precise and fast method of converting voltage to temperature for large-temperature-range thermocouple applications. Written using LabVIEW software. DART available only as object code for use on Macintosh II FX or higher-series computers running System 7.0 or later and IBM PC-series and compatible computers running Microsoft Windows 3.1. Macintosh version of DART (SSC-00032) requires LabVIEW 2.2.1 or 3.0 for execution. IBM PC version (SSC-00031) requires LabVIEW 3.0 for Windows 3.1. LabVIEW software product of National Instruments and not included with program.
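DART is distributed only as LabVIEW object code, so the sketch below is not its implementation; it only illustrates the generic voltage-to-temperature step by piecewise-linear interpolation of a calibration table. The table values are approximate (roughly those of a type K thermocouple) and are for illustration only; a real application would use the published reference tables or polynomials for its thermocouple type and handle cold-junction compensation.

    import numpy as np

    # Approximate calibration table: measured EMF in millivolts vs. temperature in degrees C,
    # assuming the cold junction is held at 0 degrees C.
    emf_mv    = np.array([0.000, 4.096, 8.138, 12.209, 16.397, 20.644])
    temp_degc = np.array([0.0,   100.0, 200.0, 300.0,  400.0,  500.0])

    def emf_to_temperature(mv):
        """Piecewise-linear interpolation of the calibration table."""
        return np.interp(mv, emf_mv, temp_degc)

    readings_mv = np.array([1.0, 6.5, 15.0])
    print(emf_to_temperature(readings_mv))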
Bauer, A S; Timpe, J; Edmonds, E C; Bechara, A; Tranel, D; Denburg, N L
2013-02-01
It has been shown that older adults perform less well than younger adults on the Iowa Gambling Task (IGT), a real-world type decision-making task that factors together reward, punishment, and uncertainty. To explore the reasons behind this age-related decrement, we administered to an adult life span sample of 265 healthy participants (Mdn age = 62.00 +/- 16.17 years; range [23-88]) 2 versions of the IGT, which have different contingencies for successful performance: A'B'C'D' requires choosing lower immediate reward (paired with lower delayed punishment); E'F'G'H' requires choosing higher immediate punishment (paired with higher delayed reward). There was a significant negative correlation between age and performance on the A'B'C'D' version of the IGT (r = -.16, p = .01), while there was essentially no correlation between age and performance on the E'F'G'H' version (r = -.07, p = .24). In addition, the rate of impaired performance in older participants was significantly higher for the A'B'C'D' version (23%) compared with the E'F'G'H' version (13%). A parsimonious account of these findings is an age-related increase in hypersensitivity to reward, whereby the decisions of older adults are disproportionately influenced by prospects of receiving reward, irrespective of the presence or degree of punishment. PsycINFO Database Record (c) 2013 APA, all rights reserved.
The seasonal-cycle climate model
NASA Technical Reports Server (NTRS)
Marx, L.; Randall, D. A.
1981-01-01
The seasonal cycle run, which will become the control run for the comparison with runs utilizing codes and parameterizations developed by outside investigators, is discussed. The climate model currently exists in two parallel versions: one running on the Amdahl and the other running on the CYBER 203. These two versions are as nearly identical as machine capability and the requirement for high speed performance will allow. Developmental changes are made on the Amdahl/CMS version for ease of testing and rapidity of turnaround. The changes are subsequently incorporated into the CYBER 203 version using vectorization techniques where speed improvement can be realized. The 400 day seasonal cycle run serves as a control run for both medium and long range climate forecasts as well as sensitivity studies.
COMPPAP - COMPOSITE PLATE BUCKLING ANALYSIS PROGRAM (IBM PC VERSION)
NASA Technical Reports Server (NTRS)
Smith, J. P.
1994-01-01
The Composite Plate Buckling Analysis Program (COMPPAP) was written to help engineers determine buckling loads of orthotropic (or isotropic) irregularly shaped plates without requiring hand calculations from design curves or extensive finite element modeling. COMPPAP is a one-element finite element program that utilizes high-order displacement functions. The high order of the displacement functions enables the user to produce results more accurate than traditional h-finite elements. This program uses these high-order displacement functions to perform a plane stress analysis of a general plate followed by a buckling calculation based on the stresses found in the plane stress solution. The current version assumes a flat plate (constant thickness) subject to a constant edge load (normal or shear) on one or more edges. COMPPAP uses the power method to find the eigenvalues of the buckling problem. The power method provides an efficient solution when only one eigenvalue is desired. Once the eigenvalue is found, the eigenvector, which corresponds to the plate buckling mode shape, results as a by-product. A positive feature of the power method is that the dominant eigenvalue is the first found, which in this case is the plate buckling load. The reported eigenvalue represents the load factor required to induce plate buckling. COMPPAP is written in ANSI FORTRAN 77. Two machine versions are available from COSMIC: a PC version (MSC-22428), which is for IBM PC 386 series and higher computers and compatibles running MS-DOS; and a UNIX version (MSC-22286). The distribution medium for both machine versions includes source code for both single and double precision versions of COMPPAP. The PC version includes source code which has been optimized for implementation within DOS memory constraints as well as sample executables for both the single and double precision versions of COMPPAP. The double precision versions of COMPPAP have been successfully implemented on an IBM PC 386 compatible running MS-DOS, a Sun4 series computer running SunOS, an HP-9000 series computer running HP-UX, and a CRAY X-MP series computer running UNICOS. COMPPAP requires 1Mb of RAM and the BLAS and LINPACK math libraries, which are included on the distribution medium. The COMPPAP documentation provides instructions for using the commercial post-processing package PATRAN for graphical interpretation of COMPPAP output. The UNIX version includes two electronic versions of the documentation: one in LaTeX format and one in PostScript format. The standard distribution medium for the PC version (MSC-22428) is a 5.25 inch 1.2Mb MS-DOS format diskette. The standard distribution medium for the UNIX version (MSC-22286) is a .25 inch streaming magnetic tape cartridge (Sun QIC-24) in UNIX tar format. For the UNIX version, alternate distribution media and formats are available upon request. COMPPAP was developed in 1992.
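The power-method iteration named in the COMPPAP abstract is straightforward to sketch. The example below applies it to a small symmetric matrix standing in for the assembled buckling operator; in COMPPAP the underlying problem is a generalized eigenproblem arranged so that the critical load factor is the dominant eigenvalue, which this sketch does not reproduce.

    import numpy as np

    def power_method(a, iters=200, tol=1e-10):
        """Return the dominant eigenvalue and eigenvector of a square matrix."""
        x = np.ones(a.shape[0])
        lam = 0.0
        for _ in range(iters):
            y = a @ x
            x_new = y / np.linalg.norm(y)
            lam_new = x_new @ a @ x_new          # Rayleigh quotient estimate
            if abs(lam_new - lam) < tol:
                return lam_new, x_new
            lam, x = lam_new, x_new
        return lam, x

    # Small symmetric test matrix standing in for the assembled buckling operator.
    a = np.array([[4.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])
    lam, mode = power_method(a)
    print("dominant eigenvalue (load-factor analogue):", round(lam, 6))
    print("eigenvector (mode-shape analogue):", np.round(mode, 4))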
COMPPAP - COMPOSITE PLATE BUCKLING ANALYSIS PROGRAM (UNIX VERSION)
NASA Technical Reports Server (NTRS)
Smith, J. P.
1994-01-01
The Composite Plate Buckling Analysis Program (COMPPAP) was written to help engineers determine buckling loads of orthotropic (or isotropic) irregularly shaped plates without requiring hand calculations from design curves or extensive finite element modeling. COMPPAP is a one-element finite element program that utilizes high-order displacement functions. The high order of the displacement functions enables the user to produce results more accurate than traditional h-finite elements. This program uses these high-order displacement functions to perform a plane stress analysis of a general plate followed by a buckling calculation based on the stresses found in the plane stress solution. The current version assumes a flat plate (constant thickness) subject to a constant edge load (normal or shear) on one or more edges. COMPPAP uses the power method to find the eigenvalues of the buckling problem. The power method provides an efficient solution when only one eigenvalue is desired. Once the eigenvalue is found, the eigenvector, which corresponds to the plate buckling mode shape, results as a by-product. A positive feature of the power method is that the dominant eigenvalue is the first found, which in this case is the plate buckling load. The reported eigenvalue represents the load factor required to induce plate buckling. COMPPAP is written in ANSI FORTRAN 77. Two machine versions are available from COSMIC: a PC version (MSC-22428), which is for IBM PC 386 series and higher computers and compatibles running MS-DOS; and a UNIX version (MSC-22286). The distribution medium for both machine versions includes source code for both single and double precision versions of COMPPAP. The PC version includes source code which has been optimized for implementation within DOS memory constraints as well as sample executables for both the single and double precision versions of COMPPAP. The double precision versions of COMPPAP have been successfully implemented on an IBM PC 386 compatible running MS-DOS, a Sun4 series computer running SunOS, an HP-9000 series computer running HP-UX, and a CRAY X-MP series computer running UNICOS. COMPPAP requires 1Mb of RAM and the BLAS and LINPACK math libraries, which are included on the distribution medium. The COMPPAP documentation provides instructions for using the commercial post-processing package PATRAN for graphical interpretation of COMPPAP output. The UNIX version includes two electronic versions of the documentation: one in LaTeX format and one in PostScript format. The standard distribution medium for the PC version (MSC-22428) is a 5.25 inch 1.2Mb MS-DOS format diskette. The standard distribution medium for the UNIX version (MSC-22286) is a .25 inch streaming magnetic tape cartridge (Sun QIC-24) in UNIX tar format. For the UNIX version, alternate distribution media and formats are available upon request. COMPPAP was developed in 1992.
RMP*eSubmit facilitates secure online Risk Management Plan updates/resubmissions, required at least every 5 years. Reporting requirements have not changed since 2004, but the 2012 version of the North American Industry Classification System has been integrated.
Augmentation of Teaching Tools: Outsourcing the HSD Computing for SPSS Application
ERIC Educational Resources Information Center
Wang, Jianjun
2010-01-01
The widely-used Tukey's HSD index is not produced in the current version of SPSS (i.e., PASW Statistics, version 18), and a computer program named "HSD Calculator" has been chosen to amend this problem. In comparison to hand calculation, this program application does not require table checking, which eliminates potential concern on the size of a…
78 FR 24107 - Version 5 Critical Infrastructure Protection Reliability Standards
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-24
... native applications or print-to-PDF format and not a scanned format. Mail/Hand Delivery: Those unable to... criteria that characterize their impact for the application of cyber security requirements commensurate... recognition. Requirement R2 requires testing to verify response plan effectiveness and consistent application...
Code of Federal Regulations, 2013 CFR
2013-01-01
... for Testing” of DOE's “ENERGY STAR Program Requirements for [Compact Fluorescent Lamps] CFLs,” Version... Specifications for Qualifying Products” of the EPA's “ENERGY STAR Program Requirements for Residential Light... requirements specified in section 4, “CFL Requirements for Testing,” of the “ENERGY STAR Program Requirements...
Code of Federal Regulations, 2014 CFR
2014-01-01
... for Testing” of DOE's “ENERGY STAR Program Requirements for [Compact Fluorescent Lamps] CFLs,” Version... Specifications for Qualifying Products” of the EPA's “ENERGY STAR Program Requirements for Residential Light... requirements specified in section 4, “CFL Requirements for Testing,” of the “ENERGY STAR Program Requirements...
Code of Federal Regulations, 2012 CFR
2012-01-01
... for Testing” of DOE's “ENERGY STAR Program Requirements for [Compact Fluorescent Lamps] CFLs,” Version... Specifications for Qualifying Products” of the EPA's “ENERGY STAR Program Requirements for Residential Light... requirements specified in section 4, “CFL Requirements for Testing,” of the “ENERGY STAR Program Requirements...
Detailed analysis of the Japanese version of the Rapid Dementia Screening Test, revised version.
Moriyama, Yasushi; Yoshino, Aihide; Muramatsu, Taro; Mimura, Masaru
2017-11-01
The number-transcoding task on the Japanese version of the Rapid Dementia Screening Test (RDST-J) requires mutual conversion between Arabic and Chinese numerals (209 and 4054 are converted from Arabic to Chinese numerals, and the Chinese-numeral forms of 681 and 2027 are converted back to Arabic numerals). In this task, question and answer styles of Chinese numerals are written horizontally. We investigated the impact of changing the task so that Chinese numerals are written vertically. Subjects were 211 patients with very mild to severe Alzheimer's disease and 42 normal controls. Mini-Mental State Examination scores ranged from 26 to 12, and Clinical Dementia Rating scores ranged from 0.5 to 3. Scores of all four subtasks of the transcoding task significantly improved in the revised version compared with the original version. The sensitivity and specificity of total scores ≥9 on the RDST-J original and revised versions for discriminating between controls and subjects with Clinical Dementia Rating scores of 0.5 were 63.8% and 76.6% on the original and 60.1% and 85.8% on the revised version. The revised RDST-J total score had low sensitivity and high specificity compared with the original RDST-J for discriminating subjects with Clinical Dementia Rating scores of 0.5 from controls. © 2017 Japanese Psychogeriatric Society.
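The kanji (Chinese-numeral) forms of the four task items did not survive transcription above. As a rough illustration of the Arabic-to-kanji half of the transcoding task, the sketch below converts numbers under 10,000 using the common Japanese convention of omitting zero place values; the exact styling used on the RDST-J (for example 一千 versus 千) is an assumption.

    DIGITS = "〇一二三四五六七八九"
    UNITS = ["", "十", "百", "千"]

    def to_kanji(n):
        """Convert 1..9999 to kanji numerals, Japanese style (zero place values are simply omitted)."""
        parts = []
        for power in range(3, -1, -1):
            d = (n // 10 ** power) % 10
            if d == 0:
                continue
            digit = "" if (d == 1 and power > 0) else DIGITS[d]   # 十/百/千 rather than 一十/一百/一千
            parts.append(digit + UNITS[power])
        return "".join(parts)

    for n in (209, 4054, 681, 2027):     # the four items of the transcoding task
        print(n, "->", to_kanji(n))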
NASA Technical Reports Server (NTRS)
Susskind, Joel; Blaisdell, John M.; Iredell, Lena; Keita, Fricky
2009-01-01
This paper describes the AIRS Science Team Version 5 retrieval algorithm in terms of its three most significant improvements over the methodology used in the AIRS Science Team Version 4 retrieval algorithm. Improved physics in Version 5 allows for use of AIRS clear column radiances in the entire 4.3 micron CO2 absorption band in the retrieval of temperature profiles T(p) during both day and night. Tropospheric sounding 15 micron CO2 observations are now used primarily in the generation of clear column radiances R(sub i) for all channels. This new approach allows for the generation of more accurate values of R(sub i) and T(p) under most cloud conditions. Secondly, Version 5 contains a new methodology to provide accurate case-by-case error estimates for retrieved geophysical parameters and for channel-by-channel clear column radiances. Thresholds of these error estimates are used in a new approach for Quality Control. Finally, Version 5 also contains for the first time an approach to provide AIRS soundings in partially cloudy conditions that does not require use of any microwave data. This new AIRS Only sounding methodology, referred to as AIRS Version 5 AO, was developed as a backup to AIRS Version 5 should the AMSU-A instrument fail. Results are shown comparing the relative performance of the AIRS Version 4, Version 5, and Version 5 AO retrievals for the single day, January 25, 2003. The Goddard DISC is now generating and distributing products derived using the AIRS Science Team Version 5 retrieval algorithm. This paper also describes the Quality Control flags contained in the DISC AIRS/AMSU retrieval products and their intended use for scientific research purposes.
75 FR 60129 - Draft Guidance for Industry and Investigators on Safety Reporting Requirements for...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-29
... with the new requirements in the final rule entitled ``Investigational New Drug Safety Reporting...] Draft Guidance for Industry and Investigators on Safety Reporting Requirements for Investigational New... the agency considers your comment on this draft guidance before it begins work on the final version of...
Quantifying Heuristic Bias: Anchoring, Availability, and Representativeness.
Richie, Megan; Josephson, S Andrew
2018-01-01
Construct: Authors examined whether a new vignette-based instrument could isolate and quantify heuristic bias. Heuristics are cognitive shortcuts that may introduce bias and contribute to error. There is no standardized instrument available to quantify heuristic bias in clinical decision making, limiting future study of educational interventions designed to improve calibration of medical decisions. This study presents validity data to support a vignette-based instrument quantifying bias due to the anchoring, availability, and representativeness heuristics. Participants completed questionnaires requiring assignment of probabilities to potential outcomes of medical and nonmedical scenarios. The instrument randomly presented scenarios in one of two versions: Version A, encouraging heuristic bias, and Version B, worded neutrally. The primary outcome was the difference in probability judgments for Version A versus Version B scenario options. Of 167 participants recruited, 139 enrolled. Participants assigned significantly higher mean probability values to Version A scenario options (M = 9.56, SD = 3.75) than Version B (M = 8.98, SD = 3.76), t(1801) = 3.27, p = .001. This result remained significant analyzing medical scenarios alone (Version A, M = 9.41, SD = 3.92; Version B, M = 8.86, SD = 4.09), t(1204) = 2.36, p = .02. Analyzing medical scenarios by heuristic revealed a significant difference between Version A and B for availability (Version A, M = 6.52, SD = 3.32; Version B, M = 5.52, SD = 3.05), t(404) = 3.04, p = .003, and representativeness (Version A, M = 11.45, SD = 3.12; Version B, M = 10.67, SD = 3.71), t(396) = 2.28, p = .02, but not anchoring. Stratifying by training level, students maintained a significant difference between Version A and B medical scenarios (Version A, M = 9.83, SD = 3.75; Version B, M = 9.00, SD = 3.98), t(465) = 2.29, p = .02, but not residents or attendings. Stratifying by heuristic and training level, availability maintained significance for students (Version A, M = 7.28, SD = 3.46; Version B, M = 5.82, SD = 3.22), t(153) = 2.67, p = .008, and residents (Version A, M = 7.19, SD = 3.24; Version B, M = 5.56, SD = 2.72), t(77) = 2.32, p = .02, but not attendings. Authors developed an instrument to isolate and quantify bias produced by the availability and representativeness heuristics, and illustrated the utility of their instrument by demonstrating decreased heuristic bias within medical contexts at higher training levels.
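The comparisons reported above are t tests on mean probability judgments under the two wordings. A minimal sketch with hypothetical (randomly generated) judgments, assuming SciPy is available:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    # Hypothetical probability judgments for bias-encouraging (A) and neutrally worded (B) scenarios.
    version_a = rng.normal(9.6, 3.8, size=300)
    version_b = rng.normal(9.0, 3.8, size=300)

    t, p = stats.ttest_ind(version_a, version_b)    # independent-samples t test
    print(f"t = {t:.2f}, p = {p:.3f}, mean difference = {version_a.mean() - version_b.mean():.2f}")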
Numerical Arc Segmentation Algorithm for a Radio Conference-NASARC, Version 2.0: User's Manual
NASA Technical Reports Server (NTRS)
Whyte, Wayne A., Jr.; Heyward, Ann O.; Ponchak, Denise S.; Spence, Rodney L.; Zuzek, John E.
1987-01-01
The information contained in the NASARC (Version 2.0) Technical Manual (NASA TM-100160) and the NASARC (Version 2.0) User's Manual (NASA TM-100161) relates to the state of the Numerical Arc Segmentation Algorithm for a Radio Conference (NASARC) software development through October 16, 1987. The technical manual describes the NASARC concept and the algorithms which are used to implement it. The User's Manual provides information on computer system considerations, installation instructions, description of input files, and program operation instructions. Significant revisions have been incorporated in the Version 2.0 software over prior versions. These revisions have enhanced the modeling capabilities of the NASARC procedure while greatly reducing the computer run time and memory requirements. Array dimensions within the software have been structured to fit into the currently available 6-megabyte memory capacity of the International Frequency Registration Board (IFRB) computer facility. A piecewise approach to predetermined arc generation in NASARC (Version 2.0) allows worldwide scenarios to be accommodated within these memory constraints while at the same time reducing computer run time.
The Facilitative Effect of Positive Stimuli on 3-Year-Olds' Flexible Rule Use
ERIC Educational Resources Information Center
Qu, Li; Zelazo, Philip David
2007-01-01
This study examined the effect of emotional stimuli on 3- to 4-year old children's flexible rule use, as measured by the Dimensional Change Card Sort (DCCS). In Experiment 1, children in two countries (Canada and China) were given 2 versions of the DCCS. The Standard version required children to sort red and blue boats and rabbits first by shape…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-14
... installing software version 8.2.Q1 to the engine electronic control unit (ECU), which increases the engine's... proposed AD would require the removal of the affected ECUs from service. We are proposing this AD to... software version 8.2.Q1 to the ECU, which increases the engine's margin to flameout. That AD was prompted...
The multidimensional Self-Adaptive Grid code, SAGE, version 2
NASA Technical Reports Server (NTRS)
Davies, Carol B.; Venkatapathy, Ethiraj
1995-01-01
This new report on Version 2 of the SAGE code includes all the information in the original publication plus all upgrades and changes to the SAGE code since that time. The two most significant upgrades are the inclusion of a finite-volume option and the ability to adapt and manipulate zonal-matching multiple-grid files. In addition, the original SAGE code has been upgraded to Version 1.1 and includes all options mentioned in this report, with the exception of the multiple grid option and its associated features. Since Version 2 is a larger and more complex code, it is suggested (but not required) that Version 1.1 be used for single-grid applications. This document contains all the information required to run both versions of SAGE. The formulation of the adaption method is described in the first section of this document. The second section is presented in the form of a user guide that explains the input and execution of the code. The third section provides many examples. Successful application of the SAGE code in both two and three dimensions for the solution of various flow problems has proven the code to be robust, portable, and simple to use. Although the basic formulation follows the method of Nakahashi and Deiwert, many modifications have been made to facilitate the use of the self-adaptive grid method for complex grid structures. Modifications to the method and the simple but extensive input options make this a flexible and user-friendly code. The SAGE code can accommodate two-dimensional and three-dimensional, finite-difference and finite-volume, single grid, and zonal-matching multiple grid flow problems.
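SAGE follows the adaption method of Nakahashi and Deiwert; without reproducing that formulation, the one-dimensional sketch below shows the underlying equidistribution idea: redistribute grid points so that each interval carries roughly equal weight, with the weight built from the solution gradient. The weight definition and scaling used here are assumptions for illustration, not the SAGE formulation.

    import numpy as np

    def adapt_grid_1d(x, u, npts=None, strength=5.0):
        """Redistribute grid points so each interval carries roughly equal weight.

        The weight is w = 1 + strength * |du/dx| (normalized), so points cluster
        where the solution varies fastest.
        """
        npts = npts or len(x)
        dudx = np.abs(np.gradient(u, x))
        w = 1.0 + strength * dudx / dudx.max()
        cumulative = np.concatenate(([0.0], np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(x))))
        targets = np.linspace(0.0, cumulative[-1], npts)
        return np.interp(targets, cumulative, x)      # invert the cumulative weight

    x = np.linspace(0.0, 1.0, 41)
    u = np.tanh(20.0 * (x - 0.5))                     # sharp layer at x = 0.5
    x_new = adapt_grid_1d(x, u)
    print("smallest spacing after adaption:", np.diff(x_new).min())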
ENERGY STAR Certified Vending Machines
Certified models meet all ENERGY STAR requirements as listed in the Version 3.0 ENERGY STAR Program Requirements for Refrigerated Beverage Vending Machines that are effective as of March 1, 2013. A detailed listing of key efficiency criteria are available at
Surrogate oracles, generalized dependency and simpler models
NASA Technical Reports Server (NTRS)
Wilson, Larry
1990-01-01
Software reliability models require the sequence of interfailure times from the debugging process as input. It was previously illustrated that using data from replicated debugging could greatly improve reliability predictions. However, inexpensive replication of the debugging process requires the existence of a cheap, fast error detector. Laboratory experiments can be designed around a gold version which is used as an oracle or around an n-version error detector. Unfortunately, software developers cannot be expected to have an oracle or to bear the expense of n-versions. A generic technique is being investigated for approximating replicated data by using the partially debugged software as a difference detector. It is believed that the failure rate of each fault has significant dependence on the presence or absence of other faults. Thus, in order to discuss a failure rate for a known fault, the presence or absence of each of the other known faults needs to be specified. Also of interest are simpler models which use shorter input sequences without sacrificing accuracy. In fact, a possible gain in performance is conjectured. To investigate these propositions, NASA computers running LIC (RTI) versions are used to generate data. This data will be used to label the debugging graph associated with each version. These labeled graphs will be used to test the utility of a surrogate oracle, to analyze the dependent nature of fault failure rates, and to explore the feasibility of reliability models which use the data of only the most recent failures.
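The difference-detector idea can be sketched very simply: treat an earlier, partially debugged version as a surrogate oracle and count disagreements with the newer version over random inputs. The two versions below are hypothetical stand-ins, not the NASA software used in the study.

    import random

    def version_k(x):
        # Hypothetical partially debugged version used as the surrogate oracle.
        return x * x - x

    def version_k_plus_1(x):
        # Hypothetical newer version with a seeded fault that triggers for x > 0.9.
        return x * x - x if x <= 0.9 else x * x

    random.seed(1)
    trials = 10_000
    disagreements = 0
    for _ in range(trials):
        x = random.random()
        if abs(version_k(x) - version_k_plus_1(x)) > 1e-12:
            disagreements += 1

    print(f"estimated failure rate against the surrogate oracle: {disagreements / trials:.3f}")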
Program package for multicanonical simulations of U(1) lattice gauge theory-Second version
NASA Astrophysics Data System (ADS)
Bazavov, Alexei; Berg, Bernd A.
2013-03-01
A new version STMCMUCA_V1_1 of our program package is available. It eliminates compatibility problems of our Fortran 77 code, originally developed for the g77 compiler, with Fortran 90 and 95 compilers. New version program summary. Program title: STMC_U1MUCA_v1_1 Catalogue identifier: AEET_v1_1 Licensing provisions: Standard CPC license, http://cpc.cs.qub.ac.uk/licence/licence.html Programming language: Fortran 77 compatible with Fortran 90 and 95 Computers: Any capable of compiling and executing Fortran code Operating systems: Any capable of compiling and executing Fortran code RAM: 10 MB and up depending on lattice size used No. of lines in distributed program, including test data, etc.: 15059 No. of bytes in distributed program, including test data, etc.: 215733 Keywords: Markov chain Monte Carlo, multicanonical, Wang-Landau recursion, Fortran, lattice gauge theory, U(1) gauge group, phase transitions of continuous systems Classification: 11.5 Catalogue identifier of previous version: AEET_v1_0 Journal Reference of previous version: Computer Physics Communications 180 (2009) 2339-2347 Does the new version supersede the previous version?: Yes Nature of problem: Efficient Markov chain Monte Carlo simulation of U(1) lattice gauge theory (or other continuous systems) close to its phase transition. Measurements and analysis of the action per plaquette, the specific heat, Polyakov loops and their structure factors. Solution method: Multicanonical simulations with an initial Wang-Landau recursion to determine suitable weight factors. Reweighting to physical values using logarithmic coding and calculating jackknife error bars. Reasons for the new version: The previous version was developed for the g77 Fortran 77 compiler. Compiler errors were encountered with Fortran 90 and Fortran 95 compilers (specified below). Summary of revisions: epsilon=one/10**10 is replaced by epsilon=one/10.0D10 in the parameter statements of the subroutines u1_bmha.f, u1_mucabmha.f, u1wl_backup.f, u1wlread_backup.f of the folder Libs/U1_par. For the tested compilers, script files are added in the folder ExampleRuns and readme.txt files are now provided in all subfolders of ExampleRuns. The gnuplot driver files produced by the routine hist_gnu.f of Libs/Fortran are adapted to the syntax required by gnuplot version 4.0 and higher. Restrictions: Due to the use of explicit real*8 initialization, the conversion into real*4 will require extra changes besides replacing the implicit.sta file by its real*4 version. Unusual features: The programs have to be compiled using the script files like those contained in the folder ExampleRuns, as explained in the original paper. Running time: The prepared test runs took up to 74 minutes to execute on a 2 GHz PC.
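The package itself performs multicanonical simulations of U(1) lattice gauge theory in Fortran; the toy sketch below only demonstrates the Wang-Landau recursion used to determine multicanonical weights, on a system whose density of states is known exactly (the number of up spins among N independent two-state spins, g(k) = C(N, k)). The sweep length, flatness criterion, and stopping tolerance are arbitrary choices for illustration.

    import math, random

    N = 20                                   # independent two-state spins; "energy" = number of up spins
    ln_g = [0.0] * (N + 1)                   # running estimate of ln(density of states)
    hist = [0] * (N + 1)
    ln_f = 1.0                               # Wang-Landau modification factor
    spins = [0] * N
    k = 0                                    # current number of up spins
    random.seed(2)

    while ln_f > 1e-4:
        for _ in range(20000):
            i = random.randrange(N)          # propose flipping one spin
            k_new = k + (1 if spins[i] == 0 else -1)
            # Accept with probability min(1, g(k)/g(k_new)).
            if random.random() < math.exp(min(0.0, ln_g[k] - ln_g[k_new])):
                spins[i] ^= 1
                k = k_new
            ln_g[k] += ln_f                  # penalize the level just visited
            hist[k] += 1
        if min(hist) > 0.8 * (sum(hist) / len(hist)):   # crude flat-histogram check
            hist = [0] * (N + 1)
            ln_f *= 0.5                      # refine the modification factor

    exact = [math.lgamma(N + 1) - math.lgamma(j + 1) - math.lgamma(N - j + 1) for j in range(N + 1)]
    shift = exact[0] - ln_g[0]               # ln_g is only determined up to an additive constant
    print("max discrepancy vs. exact ln C(N,k):", max(abs(v + shift - e) for v, e in zip(ln_g, exact)))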
Automated Orbit Determination System (AODS) requirements definition and analysis
NASA Technical Reports Server (NTRS)
Waligora, S. R.; Goorevich, C. E.; Teles, J.; Pajerski, R. S.
1980-01-01
The requirements definition for the prototype version of the automated orbit determination system (AODS) is presented including the AODS requirements at all levels, the functional model as determined through the structured analysis performed during requirements definition, and the results of the requirements analysis. Also specified are the implementation strategy for AODS and the AODS-required external support software system (ADEPT), input and output message formats, and procedures for modifying the requirements.
The version control service for the ATLAS data acquisition configuration files
NASA Astrophysics Data System (ADS)
Soloviev, Igor
2012-12-01
The ATLAS experiment at the LHC in Geneva uses a complex and highly distributed Trigger and Data Acquisition system, involving a very large number of computing nodes and custom modules. The configuration of the system is specified by schema and data in more than 1000 XML files, with various experts responsible for updating the files associated with their components. Maintaining an error-free and consistent set of XML files proved a major challenge. Therefore a special service was implemented: to validate any modifications; to check the authorization of anyone trying to modify a file; to record who had made changes, plus when and why; and to provide tools to compare different versions of files and to go back to earlier versions if required. This paper provides details of the implementation and exploitation experience, which may be interesting for other applications using many human-readable files maintained by different people, where consistency of the files and traceability of modifications are key requirements.
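The ATLAS service is a production system integrated with the experiment's configuration infrastructure; the sketch below is only a minimal illustration of the bookkeeping it describes: validate a proposed change, check authorization, record who changed what, when and why, diff two versions, and roll back. The class and its API are hypothetical.

    import datetime, difflib
    import xml.etree.ElementTree as ET

    class ConfigArchive:
        """Keep every accepted version of a configuration file with author, time, and reason."""

        def __init__(self, authorized_users):
            self.authorized = set(authorized_users)
            self.versions = []                    # list of dicts: text, author, time, reason

        def commit(self, text, author, reason):
            if author not in self.authorized:
                raise PermissionError(f"{author} may not modify this file")
            ET.fromstring(text)                   # validation step: reject malformed XML
            self.versions.append({
                "text": text,
                "author": author,
                "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
                "reason": reason,
            })
            return len(self.versions) - 1         # version number

        def diff(self, old, new):
            return "".join(difflib.unified_diff(
                self.versions[old]["text"].splitlines(keepends=True),
                self.versions[new]["text"].splitlines(keepends=True),
                f"v{old}", f"v{new}"))

        def rollback(self, version):
            return self.versions[version]["text"]

    archive = ConfigArchive({"expert_a"})
    v0 = archive.commit("<segment name='L1'/>", "expert_a", "initial configuration")
    v1 = archive.commit("<segment name='L1' enabled='true'/>", "expert_a", "enable segment")
    print(archive.diff(v0, v1))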
Guan, Ng Chong; Seng, Loh Huai; Hway Ann, Anne Yee; Hui, Koh Ong
2015-03-01
This study was aimed at validating the simplified Chinese version of the Multidimensional Scale of Perceived Support (MSPSS-SCV) among a group of medical and dental students in University Malaya. Two hundred and two students who took part in this study were given the MSPSS-SCV, the Medical Outcome Study social support survey, the Malay version of the Beck Depression Inventory, the Malay version of the General Health Questionnaire, and the English version of the MSPSS. After 1 week, these students were again required to complete the MSPSS-SCV but with the item sequences shuffled. This scale displayed excellent internal consistency (Cronbach's α = .924), high test-retest reliability (.71), parallel form reliability (.92; Spearman's ρ, P < .01), and validity. In conclusion, the MSPSS-SCV demonstrated sound psychometric properties in measuring social support among a group of medical and dental students. It could therefore be used as a simple screening tool among young educated Malaysian adolescents. © 2013 APJPH.
Reliability and Validity of the Turkish Version of the Job Performance Scale Instrument.
Harmanci Seren, Arzu Kader; Tuna, Rujnan; Eskin Bacaksiz, Feride
2018-02-01
Objective measurement of the job performance of nursing staff using valid and reliable instruments is important in the evaluation of healthcare quality. A current, valid, and reliable instrument that specifically measures the performance of nurses is required for this purpose. The aim of this study was to determine the validity and reliability of the Turkish version of the Job Performance Instrument. This study used a methodological design and a sample of 240 nurses working at different units in four hospitals in Istanbul, Turkey. A descriptive data form, the Job Performance Scale, and the Employee Performance Scale were used to collect data. Data were analyzed using IBM SPSS Statistics Version 21.0 and LISREL Version 8.51. On the basis of the data analysis, the instrument was revised. Some items were deleted, and subscales were combined. The Turkish version of the Job Performance Instrument was determined to be valid and reliable to measure the performance of nurses. The instrument is suitable for evaluating current nursing roles.
MODIS information, data and control system (MIDACS) level 2 functional requirements
NASA Technical Reports Server (NTRS)
Han, D.; Salomonson, V.; Ormsby, J.; Sharts, B.; Folta, D.; Ardanuy, P.; Mckay, A.; Hoyt, D.; Jaffin, S.; Vallette, B.
1988-01-01
The MODIS Information, Data and Control System (MIDACS) Level 2 Functional Requirements Document establishes the functional requirements for MIDACS and provides a basis for the mutual understanding between the users and the designers of the EosDIS, including the requirements, operating environment, external interfaces, and development plan. In defining the requirements and scope of the system, this document describes how MIDACS will operate as an element of the EOS within the EosDIS environment. This version of the Level 2 Requirements Document follows an earlier release of a preliminary draft version. The sections on functional and performance requirements do not yet fully represent the requirements of the data system needed to achieve the scientific objectives of the MODIS instruments and science teams. Indeed, the team members have not yet been selected and the team has not yet been formed; however, it has been possible to identify many relevant requirements based on the present concept of EosDIS and through interviews and meetings with key members of the scientific community. These requirements have been grouped by functional component of the data system, and by function within each component. These requirements have been merged with the complete set of Level 1 and Level 2 context diagrams, data flow diagrams, and data dictionary.
Formal Analysis of Privacy Requirements Specifications for Multi-Tier Applications
2013-07-30
Requirements Engineering Lab and co- founder of the Requirements Engineering and Law Workshop and has several publications in ACM- and IEEE- sponsored journals...Advertising that serves the online ad “Buying Razors Sucks” in this game. Zynga also produces a version of this game for the Android and iPhone mobile
40 CFR 63.7342 - What records must I keep?
Code of Federal Regulations, 2010 CFR
2010-07-01
... malfunction. (3) Records of performance tests, performance evaluations, and opacity observations as required...) Monitoring data for COMS during a performance evaluation as required in § 63.6(h)(7)(i) and (ii). (3) Previous (that is, superceded) versions of the performance evaluation plan as required in § 63.8(d)(3). (4...
Calculating Trajectories And Orbits
NASA Technical Reports Server (NTRS)
Alderson, Daniel J.; Brady, Franklyn H.; Breckheimer, Peter J.; Campbell, James K.; Christensen, Carl S.; Collier, James B.; Ekelund, John E.; Ellis, Jordan; Goltz, Gene L.; Hintz, Gerarld R.;
1989-01-01
Double-Precision Trajectory Analysis Program, DPTRAJ, and Orbit Determination Program, ODP, developed and improved over years to provide highly reliable and accurate navigation capability for deep-space missions like Voyager. Each collection of programs working together to provide desired computational results. DPTRAJ, ODP, and supporting utility programs capable of handling massive amounts of data and performing various numerical calculations required for solving navigation problems associated with planetary fly-by and lander missions. Used extensively in support of NASA's Voyager project. DPTRAJ-ODP available in two machine versions. UNIVAC version, NPO-15586, written in FORTRAN V, SFTRAN, and ASSEMBLER. VAX/VMS version, NPO-17201, written in FORTRAN V, SFTRAN, PL/1 and ASSEMBLER.
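DPTRAJ and ODP are high-precision FORTRAN/assembler systems; the toy sketch below shows only the kind of numerical step at the core of trajectory propagation, integrating two-body motion with a fixed-step fourth-order Runge-Kutta scheme in normalized units.

    import numpy as np

    MU = 1.0                                   # gravitational parameter in normalized units

    def two_body(state):
        """Time derivative of [x, y, z, vx, vy, vz] under a single point mass."""
        r, v = state[:3], state[3:]
        return np.concatenate((v, -MU * r / np.linalg.norm(r) ** 3))

    def rk4_step(state, dt):
        k1 = two_body(state)
        k2 = two_body(state + 0.5 * dt * k1)
        k3 = two_body(state + 0.5 * dt * k2)
        k4 = two_body(state + dt * k3)
        return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

    # Circular orbit of radius 1: speed sqrt(MU/r) = 1, period 2*pi.
    state = np.array([1.0, 0.0, 0.0, 0.0, 1.0, 0.0])
    dt, period = 0.001, 2 * np.pi
    for _ in range(int(period / dt)):
        state = rk4_step(state, dt)
    print("radius after one period (should stay near 1):", np.linalg.norm(state[:3]))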
USGS library for S-PLUS for Windows -- Release 4.0
Lorenz, David L.; Ahearn, Elizabeth A.; Carter, Janet M.; Cohn, Timothy A.; Danchuk, Wendy J.; Frey, Jeffrey W.; Helsel, Dennis R.; Lee, Kathy E.; Leeth, David C.; Martin, Jeffrey D.; McGuire, Virginia L.; Neitzert, Kathleen M.; Robertson, Dale M.; Slack, James R.; Starn, J. Jeffrey; Vecchia, Aldo V.; Wilkison, Donald H.; Williamson, Joyce E.
2011-01-01
Release 4.0 of the U.S. Geological Survey S-PLUS library supersedes release 2.1. It comprises functions, dialogs, and datasets used in the U.S. Geological Survey for the analysis of water-resources data. This version does not contain ESTREND, which was in version 2.1. See Release 2.1 for information and access to that version. This library requires Release 8.1 or later of S-PLUS for Windows. S-PLUS is a commercial statistical and graphical analysis software package produced by TIBCO Corporation (http://www.tibco.com/). The USGS library is not supported by TIBCO or its technical support staff.
ERIC Educational Resources Information Center
College Store Journal, 1979
1979-01-01
Topics discussed by the NACS Store Planning/Renovation Committees in this updated version of the college store renovation manual include: short- and long-range planning, financial considerations, professional planning assistance, the store's image and business character, location considerations, building requirements, space requirements, fixtures,…
40 CFR Table 7 to Subpart Lllll of... - Applicability of General Provisions to Subpart LLLLL
Code of Federal Regulations, 2010 CFR
2010-07-01
... approval procedures 3. Performance audit requirements 4. Internal and external QA procedures for testing.... Keep old versions for 5 years after revisions No; § 63.8688 specifies the CMS requirements. § 63.8(e...
CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (UNIX VERSION)
NASA Technical Reports Server (NTRS)
Donnell, B.
1994-01-01
CLIPS, the C Language Integrated Production System, is a complete environment for developing expert systems -- programs which are specifically intended to model human expertise or knowledge. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. CLIPS 6.0 provides a cohesive tool for handling a wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming allows knowledge to be represented as heuristics, or "rules-of-thumb" which specify a set of actions to be performed for a given situation. Object-oriented programming allows complex systems to be modeled as modular components (which can be easily reused to model other systems or create new components). The procedural programming capabilities provided by CLIPS 6.0 allow CLIPS to represent knowledge in ways similar to those allowed in languages such as C, Pascal, Ada, and LISP. Using CLIPS 6.0, one can develop expert system software using only rule-based programming, only object-oriented programming, only procedural programming, or combinations of the three. CLIPS provides extensive features to support the rule-based programming paradigm including seven conflict resolution strategies, dynamic rule priorities, and truth maintenance. CLIPS 6.0 supports more complex nesting of conditional elements in the if portion of a rule ("and", "or", and "not" conditional elements can be placed within a "not" conditional element). In addition, there is no longer a limitation on the number of multifield slots that a deftemplate can contain. The CLIPS Object-Oriented Language (COOL) provides object-oriented programming capabilities. Features supported by COOL include classes with multiple inheritance, abstraction, encapsulation, polymorphism, dynamic binding, and message passing with message-handlers. CLIPS 6.0 supports tight integration of the rule-based programming features of CLIPS with COOL (that is, a rule can pattern match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous version of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. 
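CLIPS rules are written in CLIPS's own syntax and matched with the Rete algorithm, which is not reproduced here. The toy sketch below, with hypothetical facts and rule names, naive matching, and no conflict resolution, is only meant to illustrate the rule-based paradigm described above: rules pattern-match on a set of facts and fire actions that may assert new facts.

    # Toy analogue of the rule-based paradigm CLIPS supports; it is not CLIPS.
    facts = {("temperature", "high"), ("pressure", "rising")}

    rules = [
        # (name, condition over the fact set, action that may assert new facts)
        ("alarm", lambda f: ("temperature", "high") in f and ("pressure", "rising") in f,
                  lambda f: f.add(("alarm", "on"))),
        ("vent",  lambda f: ("alarm", "on") in f,
                  lambda f: f.add(("vent", "open"))),
    ]

    fired = True
    while fired:                        # naive forward chaining: loop until nothing new fires
        fired = False
        for name, condition, action in rules:
            before = set(facts)
            if condition(facts):
                action(facts)
                if facts != before:     # only count rules that actually changed the fact base
                    print("fired rule:", name)
                    fired = True

    print(sorted(facts))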
COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the command line version. For the UNIX version of CLIPS 6.0, the command line interface has been successfully implemented on a Sun4 running SunOS, a DECstation running DEC RISC ULTRIX, an SGI Indigo Elan running IRIX, a DEC Alpha AXP running OSF/1, and an IBM RS/6000 running AIX. Command line interface executables are included for Sun4 computers running SunOS 4.1.1 or later and for the DEC RISC ULTRIX platform. The makefiles may have to be modified slightly to be used on other UNIX platforms. The UNIX, Macintosh, and IBM PC versions of CLIPS 6.0 each have a platform specific interface. Source code, a makefile, and an executable for the Windows 3.1 interface version of CLIPS 6.0 are provided only on the IBM PC distribution diskettes. Source code, a makefile, and an executable for the Macintosh interface version of CLIPS 6.0 are provided only on the Macintosh distribution diskettes. Likewise, for the UNIX version of CLIPS 6.0, only source code and a makefile for an X-Windows interface are provided. The X-Windows interface requires MIT's X Window System, Version 11, Release 4 (X11R4), the Athena Widget Set, and the Xmu library. The source code for the Athena Widget Set is provided on the distribution medium. The X-Windows interface has been successfully implemented on a Sun4 running SunOS 4.1.2 with the MIT distribution of X11R4 (not OpenWindows), an SGI Indigo Elan running IRIX 4.0.5, and a DEC Alpha AXP running OSF/1 1.2. The VAX version of CLIPS 6.0 comes only with the generic command line interface. ASCII makefiles for the command line version of CLIPS are provided on all the distribution media for UNIX, VMS, and DOS. Four executables are provided with the IBM PC version: a windowed interface executable for Windows 3.1 built using Borland C++ v3.1, an editor for use with the windowed interface, a command line version of CLIPS for Windows 3.1, and a 386 command line executable for DOS built using Zortech C++ v3.1. All four executables are capable of utilizing extended memory and require an 80386 CPU or better. Users needing an 8086/8088 or 80286 executable must recompile the CLIPS source code themselves. Users who wish to recompile the DOS executable using Borland C++ or MicroSoft C must use a DOS extender program to produce an executable capable of using extended memory. The version of CLIPS 6.0 for IBM PC compatibles requires DOS v3.3 or later and/or Windows 3.1 or later. It is distributed on a set of three 1.4Mb 3.5 inch diskettes. A hard disk is required. The Macintosh version is distributed in compressed form on two 3.5 inch 1.4Mb Macintosh format diskettes, and requires System 6.0.5, or higher, and 1Mb RAM. The version for DEC VAX/VMS is available in VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard distribution medium) or a TK50 tape cartridge. The UNIX version is distributed in UNIX tar format on a .25 inch streaming magnetic tape cartridge (Sun QIC-24). For the UNIX version, alternate distribution media and formats are available upon request. 
The CLIPS 6.0 documentation includes a User's Guide and a three volume Reference Manual consisting of Basic and Advanced Programming Guides and an Interfaces Guide. An electronic version of the documentation is provided on the distribution medium for each version: in MicroSoft Word format for the Macintosh and PC versions of CLIPS, and in both PostScript format and MicroSoft Word for Macintosh format for the UNIX and DEC VAX versions of CLIPS. CLIPS was developed in 1986 and Version 6.0 was released in 1993.
CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (IBM PC VERSION)
NASA Technical Reports Server (NTRS)
Donnell, B.
1994-01-01
CLIPS, the C Language Integrated Production System, is a complete environment for developing expert systems -- programs which are specifically intended to model human expertise or knowledge. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. CLIPS 6.0 provides a cohesive tool for handling a wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming allows knowledge to be represented as heuristics, or "rules-of-thumb" which specify a set of actions to be performed for a given situation. Object-oriented programming allows complex systems to be modeled as modular components (which can be easily reused to model other systems or create new components). The procedural programming capabilities provided by CLIPS 6.0 allow CLIPS to represent knowledge in ways similar to those allowed in languages such as C, Pascal, Ada, and LISP. Using CLIPS 6.0, one can develop expert system software using only rule-based programming, only object-oriented programming, only procedural programming, or combinations of the three. CLIPS provides extensive features to support the rule-based programming paradigm including seven conflict resolution strategies, dynamic rule priorities, and truth maintenance. CLIPS 6.0 supports more complex nesting of conditional elements in the if portion of a rule ("and", "or", and "not" conditional elements can be placed within a "not" conditional element). In addition, there is no longer a limitation on the number of multifield slots that a deftemplate can contain. The CLIPS Object-Oriented Language (COOL) provides object-oriented programming capabilities. Features supported by COOL include classes with multiple inheritance, abstraction, encapsulation, polymorphism, dynamic binding, and message passing with message-handlers. CLIPS 6.0 supports tight integration of the rule-based programming features of CLIPS with COOL (that is, a rule can pattern match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous version of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. 
COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the command line version. For the UNIX version of CLIPS 6.0, the command line interface has been successfully implemented on a Sun4 running SunOS, a DECstation running DEC RISC ULTRIX, an SGI Indigo Elan running IRIX, a DEC Alpha AXP running OSF/1, and an IBM RS/6000 running AIX. Command line interface executables are included for Sun4 computers running SunOS 4.1.1 or later and for the DEC RISC ULTRIX platform. The makefiles may have to be modified slightly to be used on other UNIX platforms. The UNIX, Macintosh, and IBM PC versions of CLIPS 6.0 each have a platform specific interface. Source code, a makefile, and an executable for the Windows 3.1 interface version of CLIPS 6.0 are provided only on the IBM PC distribution diskettes. Source code, a makefile, and an executable for the Macintosh interface version of CLIPS 6.0 are provided only on the Macintosh distribution diskettes. Likewise, for the UNIX version of CLIPS 6.0, only source code and a makefile for an X-Windows interface are provided. The X-Windows interface requires MIT's X Window System, Version 11, Release 4 (X11R4), the Athena Widget Set, and the Xmu library. The source code for the Athena Widget Set is provided on the distribution medium. The X-Windows interface has been successfully implemented on a Sun4 running SunOS 4.1.2 with the MIT distribution of X11R4 (not OpenWindows), an SGI Indigo Elan running IRIX 4.0.5, and a DEC Alpha AXP running OSF/1 1.2. The VAX version of CLIPS 6.0 comes only with the generic command line interface. ASCII makefiles for the command line version of CLIPS are provided on all the distribution media for UNIX, VMS, and DOS. Four executables are provided with the IBM PC version: a windowed interface executable for Windows 3.1 built using Borland C++ v3.1, an editor for use with the windowed interface, a command line version of CLIPS for Windows 3.1, and a 386 command line executable for DOS built using Zortech C++ v3.1. All four executables are capable of utilizing extended memory and require an 80386 CPU or better. Users needing an 8086/8088 or 80286 executable must recompile the CLIPS source code themselves. Users who wish to recompile the DOS executable using Borland C++ or MicroSoft C must use a DOS extender program to produce an executable capable of using extended memory. The version of CLIPS 6.0 for IBM PC compatibles requires DOS v3.3 or later and/or Windows 3.1 or later. It is distributed on a set of three 1.4Mb 3.5 inch diskettes. A hard disk is required. The Macintosh version is distributed in compressed form on two 3.5 inch 1.4Mb Macintosh format diskettes, and requires System 6.0.5, or higher, and 1Mb RAM. The version for DEC VAX/VMS is available in VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard distribution medium) or a TK50 tape cartridge. The UNIX version is distributed in UNIX tar format on a .25 inch streaming magnetic tape cartridge (Sun QIC-24). For the UNIX version, alternate distribution media and formats are available upon request. 
The CLIPS 6.0 documentation includes a User's Guide and a three volume Reference Manual consisting of Basic and Advanced Programming Guides and an Interfaces Guide. An electronic version of the documentation is provided on the distribution medium for each version: in MicroSoft Word format for the Macintosh and PC versions of CLIPS, and in both PostScript format and MicroSoft Word for Macintosh format for the UNIX and DEC VAX versions of CLIPS. CLIPS was developed in 1986 and Version 6.0 was released in 1993.
CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (MACINTOSH VERSION)
NASA Technical Reports Server (NTRS)
Riley, G.
1994-01-01
CLIPS, the C Language Integrated Production System, is a complete environment for developing expert systems -- programs which are specifically intended to model human expertise or knowledge. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. CLIPS 6.0 provides a cohesive tool for handling a wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming allows knowledge to be represented as heuristics, or "rules-of-thumb" which specify a set of actions to be performed for a given situation. Object-oriented programming allows complex systems to be modeled as modular components (which can be easily reused to model other systems or create new components). The procedural programming capabilities provided by CLIPS 6.0 allow CLIPS to represent knowledge in ways similar to those allowed in languages such as C, Pascal, Ada, and LISP. Using CLIPS 6.0, one can develop expert system software using only rule-based programming, only object-oriented programming, only procedural programming, or combinations of the three. CLIPS provides extensive features to support the rule-based programming paradigm including seven conflict resolution strategies, dynamic rule priorities, and truth maintenance. CLIPS 6.0 supports more complex nesting of conditional elements in the if portion of a rule ("and", "or", and "not" conditional elements can be placed within a "not" conditional element). In addition, there is no longer a limitation on the number of multifield slots that a deftemplate can contain. The CLIPS Object-Oriented Language (COOL) provides object-oriented programming capabilities. Features supported by COOL include classes with multiple inheritance, abstraction, encapsulation, polymorphism, dynamic binding, and message passing with message-handlers. CLIPS 6.0 supports tight integration of the rule-based programming features of CLIPS with COOL (that is, a rule can pattern match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous version of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. 
COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the command line version. For the UNIX version of CLIPS 6.0, the command line interface has been successfully implemented on a Sun4 running SunOS, a DECstation running DEC RISC ULTRIX, an SGI Indigo Elan running IRIX, a DEC Alpha AXP running OSF/1, and an IBM RS/6000 running AIX. Command line interface executables are included for Sun4 computers running SunOS 4.1.1 or later and for the DEC RISC ULTRIX platform. The makefiles may have to be modified slightly to be used on other UNIX platforms. The UNIX, Macintosh, and IBM PC versions of CLIPS 6.0 each have a platform specific interface. Source code, a makefile, and an executable for the Windows 3.1 interface version of CLIPS 6.0 are provided only on the IBM PC distribution diskettes. Source code, a makefile, and an executable for the Macintosh interface version of CLIPS 6.0 are provided only on the Macintosh distribution diskettes. Likewise, for the UNIX version of CLIPS 6.0, only source code and a makefile for an X-Windows interface are provided. The X-Windows interface requires MIT's X Window System, Version 11, Release 4 (X11R4), the Athena Widget Set, and the Xmu library. The source code for the Athena Widget Set is provided on the distribution medium. The X-Windows interface has been successfully implemented on a Sun4 running SunOS 4.1.2 with the MIT distribution of X11R4 (not OpenWindows), an SGI Indigo Elan running IRIX 4.0.5, and a DEC Alpha AXP running OSF/1 1.2. The VAX version of CLIPS 6.0 comes only with the generic command line interface. ASCII makefiles for the command line version of CLIPS are provided on all the distribution media for UNIX, VMS, and DOS. Four executables are provided with the IBM PC version: a windowed interface executable for Windows 3.1 built using Borland C++ v3.1, an editor for use with the windowed interface, a command line version of CLIPS for Windows 3.1, and a 386 command line executable for DOS built using Zortech C++ v3.1. All four executables are capable of utilizing extended memory and require an 80386 CPU or better. Users needing an 8086/8088 or 80286 executable must recompile the CLIPS source code themselves. Users who wish to recompile the DOS executable using Borland C++ or MicroSoft C must use a DOS extender program to produce an executable capable of using extended memory. The version of CLIPS 6.0 for IBM PC compatibles requires DOS v3.3 or later and/or Windows 3.1 or later. It is distributed on a set of three 1.4Mb 3.5 inch diskettes. A hard disk is required. The Macintosh version is distributed in compressed form on two 3.5 inch 1.4Mb Macintosh format diskettes, and requires System 6.0.5, or higher, and 1Mb RAM. The version for DEC VAX/VMS is available in VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard distribution medium) or a TK50 tape cartridge. The UNIX version is distributed in UNIX tar format on a .25 inch streaming magnetic tape cartridge (Sun QIC-24). For the UNIX version, alternate distribution media and formats are available upon request. 
The CLIPS 6.0 documentation includes a User's Guide and a three volume Reference Manual consisting of Basic and Advanced Programming Guides and an Interfaces Guide. An electronic version of the documentation is provided on the distribution medium for each version: in MicroSoft Word format for the Macintosh and PC versions of CLIPS, and in both PostScript format and MicroSoft Word for Macintosh format for the UNIX and DEC VAX versions of CLIPS. CLIPS was developed in 1986 and Version 6.0 was released in 1993.
CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (DEC VAX VMS VERSION)
NASA Technical Reports Server (NTRS)
Donnell, B.
1994-01-01
CLIPS, the C Language Integrated Production System, is a complete environment for developing expert systems -- programs which are specifically intended to model human expertise or knowledge. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. CLIPS 6.0 provides a cohesive tool for handling a wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming allows knowledge to be represented as heuristics, or "rules-of-thumb" which specify a set of actions to be performed for a given situation. Object-oriented programming allows complex systems to be modeled as modular components (which can be easily reused to model other systems or create new components). The procedural programming capabilities provided by CLIPS 6.0 allow CLIPS to represent knowledge in ways similar to those allowed in languages such as C, Pascal, Ada, and LISP. Using CLIPS 6.0, one can develop expert system software using only rule-based programming, only object-oriented programming, only procedural programming, or combinations of the three. CLIPS provides extensive features to support the rule-based programming paradigm including seven conflict resolution strategies, dynamic rule priorities, and truth maintenance. CLIPS 6.0 supports more complex nesting of conditional elements in the if portion of a rule ("and", "or", and "not" conditional elements can be placed within a "not" conditional element). In addition, there is no longer a limitation on the number of multifield slots that a deftemplate can contain. The CLIPS Object-Oriented Language (COOL) provides object-oriented programming capabilities. Features supported by COOL include classes with multiple inheritance, abstraction, encapsulation, polymorphism, dynamic binding, and message passing with message-handlers. CLIPS 6.0 supports tight integration of the rule-based programming features of CLIPS with COOL (that is, a rule can pattern match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous version of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. 
COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the command line version. For the UNIX version of CLIPS 6.0, the command line interface has been successfully implemented on a Sun4 running SunOS, a DECstation running DEC RISC ULTRIX, an SGI Indigo Elan running IRIX, a DEC Alpha AXP running OSF/1, and an IBM RS/6000 running AIX. Command line interface executables are included for Sun4 computers running SunOS 4.1.1 or later and for the DEC RISC ULTRIX platform. The makefiles may have to be modified slightly to be used on other UNIX platforms. The UNIX, Macintosh, and IBM PC versions of CLIPS 6.0 each have a platform specific interface. Source code, a makefile, and an executable for the Windows 3.1 interface version of CLIPS 6.0 are provided only on the IBM PC distribution diskettes. Source code, a makefile, and an executable for the Macintosh interface version of CLIPS 6.0 are provided only on the Macintosh distribution diskettes. Likewise, for the UNIX version of CLIPS 6.0, only source code and a makefile for an X-Windows interface are provided. The X-Windows interface requires MIT's X Window System, Version 11, Release 4 (X11R4), the Athena Widget Set, and the Xmu library. The source code for the Athena Widget Set is provided on the distribution medium. The X-Windows interface has been successfully implemented on a Sun4 running SunOS 4.1.2 with the MIT distribution of X11R4 (not OpenWindows), an SGI Indigo Elan running IRIX 4.0.5, and a DEC Alpha AXP running OSF/1 1.2. The VAX version of CLIPS 6.0 comes only with the generic command line interface. ASCII makefiles for the command line version of CLIPS are provided on all the distribution media for UNIX, VMS, and DOS. Four executables are provided with the IBM PC version: a windowed interface executable for Windows 3.1 built using Borland C++ v3.1, an editor for use with the windowed interface, a command line version of CLIPS for Windows 3.1, and a 386 command line executable for DOS built using Zortech C++ v3.1. All four executables are capable of utilizing extended memory and require an 80386 CPU or better. Users needing an 8086/8088 or 80286 executable must recompile the CLIPS source code themselves. Users who wish to recompile the DOS executable using Borland C++ or MicroSoft C must use a DOS extender program to produce an executable capable of using extended memory. The version of CLIPS 6.0 for IBM PC compatibles requires DOS v3.3 or later and/or Windows 3.1 or later. It is distributed on a set of three 1.4Mb 3.5 inch diskettes. A hard disk is required. The Macintosh version is distributed in compressed form on two 3.5 inch 1.4Mb Macintosh format diskettes, and requires System 6.0.5, or higher, and 1Mb RAM. The version for DEC VAX/VMS is available in VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard distribution medium) or a TK50 tape cartridge. The UNIX version is distributed in UNIX tar format on a .25 inch streaming magnetic tape cartridge (Sun QIC-24). For the UNIX version, alternate distribution media and formats are available upon request. 
The CLIPS 6.0 documentation includes a User's Guide and a three volume Reference Manual consisting of Basic and Advanced Programming Guides and an Interfaces Guide. An electronic version of the documentation is provided on the distribution medium for each version: in MicroSoft Word format for the Macintosh and PC versions of CLIPS, and in both PostScript format and MicroSoft Word for Macintosh format for the UNIX and DEC VAX versions of CLIPS. CLIPS was developed in 1986 and Version 6.0 was released in 1993.
Method For Determining And Modifying Protein/Peptide Solubility
Waldo, Geoffrey S.
2005-03-15
A solubility reporter for measuring a protein's solubility in vivo or in vitro is described. The reporter, which can be used in a single living cell, gives a specific signal suitable for determining whether the cell bears a soluble version of the protein of interest. A pool of random mutants of an arbitrary protein, generated using error-prone in vitro recombination, may also be screened for more soluble versions using the reporter, and these versions may be recombined to yield variants having further-enhanced solubility. The method of the present invention includes "irrational" (random mutagenesis) methods, which do not require a priori knowledge of the three-dimensional structure of the protein of interest. Multiple sequences of mutation/genetic recombination and selection for improved solubility are demonstrated to yield versions of the protein which display enhanced solubility.
Rotary engine performance computer program (RCEMAP and RCEMAPPC): User's guide
NASA Technical Reports Server (NTRS)
Bartrand, Timothy A.; Willis, Edward A.
1993-01-01
This report is a user's guide for a computer code that simulates the performance of several rotary combustion engine configurations. It is intended to assist prospective users in getting started with RCEMAP and/or RCEMAPPC. RCEMAP (Rotary Combustion Engine performance MAP generating code) is the mainframe version, while RCEMAPPC is a simplified subset designed for the personal computer, or PC, environment. Both versions are based on an open, zero-dimensional combustion system model for the prediction of instantaneous pressures, temperature, chemical composition and other in-chamber thermodynamic properties. Both versions predict overall engine performance and thermal characteristics, including bmep, bsfc, exhaust gas temperature, average material temperatures, and turbocharger operating conditions. Required inputs include engine geometry, materials, constants for use in the combustion heat release model, and turbomachinery maps. Illustrative examples and sample input files for both versions are included.
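RCEMAP and RCEMAPPC compute their outputs from a full zero-dimensional combustion model; the sketch below only shows, with invented engine numbers, how two of the reported figures of merit (bmep and bsfc) follow from brake power, displacement, speed, and fuel flow, assuming the four-stroke convention of two revolutions per cycle.

    # Back-of-the-envelope versions of two figures of merit that RCEMAP reports.
    # All engine numbers below are invented for illustration.
    def bmep_kpa(brake_power_kw, displacement_l, rev_per_s, rev_per_cycle=2):
        """Brake mean effective pressure: work per cycle divided by displacement."""
        work_per_cycle_kj = brake_power_kw / (rev_per_s / rev_per_cycle)
        return work_per_cycle_kj / (displacement_l / 1000.0)      # kJ/m^3 equals kPa

    def bsfc_g_per_kwh(fuel_flow_kg_per_h, brake_power_kw):
        """Brake specific fuel consumption: fuel mass flow per unit brake power."""
        return 1000.0 * fuel_flow_kg_per_h / brake_power_kw

    print(f"bmep ~ {bmep_kpa(50.0, 1.3, 100.0):.0f} kPa")         # 50 kW, 1.3 L, 6000 rpm
    print(f"bsfc ~ {bsfc_g_per_kwh(15.0, 50.0):.0f} g/kWh")       # 15 kg/h of fuel at 50 kW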
Eye movements as a function of response contingencies measured by blackout technique
Doran, Judith; Holland, James G.
1971-01-01
A program may have a low error rate but, at the same time, require little of the student and teach him little. A measure to supplement error rate in evaluating a program has recently been developed. This measure, called the blackout ratio, is the percentage of material that may be deleted without increasing the error rate. In high blackout-ratio programs, obtaining a correct answer is contingent upon only a small portion of the item. The present study determined whether such weakly response-contingent material was read less thoroughly than programmed material that is heavily response-contingent. Eye movements were compared for two versions of the same program that differed only in the choice of the omitted words. The alteration of the required responses resulted in a version with a higher blackout ratio than the original version, which had a low blackout ratio. Eighteen undergraduates received half of their material from the high and half from the low blackout-ratio version. The order was counterbalanced. Location and duration of all eye fixations in each item were recorded by a Mackworth Eye Marker Camera. On high blackout-ratio material, subjects used fewer fixations, shorter fixation times, and shorter scanning times. High blackout-ratio material failed to evoke the students' attention. PMID:16795275
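As a worked illustration of the blackout-ratio measure defined above (the word counts are invented, not taken from the study), the ratio is simply the share of a frame's material that can be deleted without raising the error rate:

    # Worked blackout-ratio arithmetic with invented word counts.
    def blackout_ratio(deletable_words, total_words):
        """Percentage of a program's material that can be blacked out
        without increasing the learner's error rate."""
        return 100.0 * deletable_words / total_words

    print(f"high blackout-ratio version: {blackout_ratio(32, 40):.0f}% deletable")
    print(f"low blackout-ratio version:  {blackout_ratio(6, 40):.0f}% deletable")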
Zhang, Qian; Ge, Yan; Qu, Weina; Zhang, Kan; Sun, Xianghong
2018-04-01
Traffic safety climate is defined as road users' attitudes and perceptions of traffic in a specific context at a given point in time. The current study aimed to validate the Chinese version of the Traffic Climate Scale (TCS) and to explore its relation to drivers' personality and dangerous driving behavior. A sample of 413 drivers completed the Big Five Inventory (BFI), the Chinese version of the TCS, the Dula Dangerous Driving Index (DDDI) and a demographic questionnaire. Exploratory factor analysis and confirmatory factor analysis were performed to confirm a three-factor (external affective demands, internal requirements and functionality) solution of the TCS. The reliability and validity of the Chinese version of the TCS were verified. More importantly, the results showed that the effect of personality on dangerous driving behavior was mediated by traffic climate. Specifically, the functionality of the TCS mediated the effect of neuroticism on negative cognitive/emotional driving and drunk driving, while openness had an indirect impact on aggressive driving, risky driving and drunk driving based on the internal requirements of the TCS. Additionally, agreeableness had a negative direct impact on four factors of the DDDI, while neuroticism had a positive direct impact on negative cognitive/emotional driving, drunk driving and risky driving. In conclusion, the Chinese version of the TCS will be useful to evaluate drivers' attitudes towards and perceptions of the requirements of the traffic environment in which they participate and will also be valuable for comparing traffic cultures and environments in different countries. Copyright © 2018 Elsevier Ltd. All rights reserved.
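The published analysis used exploratory and confirmatory factor analysis within a structural equation modelling framework; the sketch below is only a minimal, hedged illustration of the mediation logic reported (personality to traffic climate to dangerous driving), estimated with ordinary least squares on simulated data rather than the study's dataset.

    # Minimal simple-mediation sketch on simulated data (not the study data).
    import numpy as np

    rng = np.random.default_rng(1)
    n = 413
    x = rng.normal(size=n)                         # e.g. a personality trait score
    m = 0.5 * x + rng.normal(size=n)               # mediator: a traffic-climate factor
    y = 0.4 * m + 0.1 * x + rng.normal(size=n)     # outcome: dangerous-driving score

    def ols(design, target):
        coef, *_ = np.linalg.lstsq(design, target, rcond=None)
        return coef

    ones = np.ones(n)
    a = ols(np.column_stack([ones, x]), m)[1]              # path X -> M
    coef_y = ols(np.column_stack([ones, x, m]), y)
    c_prime, b = coef_y[1], coef_y[2]                      # direct effect and path M -> Y
    print(f"indirect effect a*b = {a * b:.2f}, direct effect = {c_prime:.2f}")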
40 CFR Table 10 to Subpart Uuuu of... - Applicability of General Provisions to Subpart UUUU
Code of Federal Regulations, 2010 CFR
2010-07-01
... requirements; internal and external QA procedures for testing No. § 63.7(d) Testing Facilities Requirements for....; must keep quality control plan on record for 5 years; keep old versions for 5 years after revisions No...
40 CFR Table 12 to Subpart Eeee of... - Applicability of General Provisions to Subpart EEEE
Code of Federal Regulations, 2010 CFR
2010-07-01
... requirements; internal and external QA procedures for testing Yes. § 63.7(d) Testing Facilities Requirements... 5 years; keep old versions for 5 years after revisions Yes, but only applies for CEMS. 40 CFR part...
User's guide for Skylab dynamics program, SKYDYN
NASA Technical Reports Server (NTRS)
Hopkins, M. S.
1980-01-01
This user's manual describes the capabilities, required input data, and resulting output of SKYDYN, a version of the six-degree-of-freedom digital program REENTR that was extensively modified for the Honeywell CP-V system and tailored to the specific requirements of SKYLAB.
Topographic data requirements for EOS global change research
Gesch, Dean B.
1994-01-01
This document is a result of Earth Observing System Data and Information System (EOSDIS) Version 0 activities of the Land Processes Distributed Active Archive Center at the U.S. Geological Survey's EROS Data Center. A relatively small part of the Version 0 funding provided by NASA is used to address topographic data issues related to EOS. These issues include identifying and improving access to existing sources of topographic data, data generation, facilitating the use of topographic data in global change research by demonstrating derivative products, and inventorying the specific topographic data requirements of EOS investigators. There is a clear need for global topographic data in EOSDIS. Only 10 percent of the global land surface is covered by high-resolution data that are available to the global change science community. Alternative sources for new data exist or have been proposed; however, none of them alone can fulfill the data requirements by the launch of the first EOS platform in 4 years. There is no operational provider of all the source data that are required. Even if appropriate global source data existed, a concerted production effort would be necessary to ensure the existence of the requisite topographic data before EOS launch. Additionally, no funding from NASA or any other agency has been appropriated for a new mapping mission or for other means of data acquisition. This effort to document requirements is an initial step toward understanding the severity of the data shortage. It is well beyond the scope of Version 0 funding and capabilities to provide the required data in the proper timeframe. The shortage of data and the lack of a plan for providing the necessary topographic data through EOSDIS in time for launch are issues that must be addressed by the EOS program.
Validation and cross cultural adaptation of the Italian version of the Harris Hip Score.
Dettoni, Federico; Pellegrino, Pietro; La Russa, Massimo R; Bonasia, Davide E; Blonna, Davide; Bruzzone, Matteo; Castoldi, Filippo; Rossi, Roberto
2015-01-01
The Harris Hip Score (HHS) is one of the most widely used health-related quality of life (HRQOL) measures for the assessment of hip pathology; in spite of this, a validation study and an official Italian version have not yet been provided. The aim of this study was to create a valid and reliable Italian version of the HHS. The score was translated into Italian and adapted; 103 patients with different hip pathologies were then evaluated using this HHS version as well as the WOMAC and SF-12 questionnaires. Content, construct, and criterion validities were tested, as were interobserver reliability, test-retest reliability, and internal consistency. Cross-cultural adaptation was straightforward, and only minor changes were required in the translation process. Construct and criterion validity of the Italian version of the HHS were confirmed by satisfactory values of Spearman's rho for the correlations between specific domains of the HHS and the WOMAC and SF-12 scores. Interobserver and test-retest reliabilities were 0.996 and 0.975, respectively; Cronbach's alpha for internal consistency was 0.816. Statistical and clinical analysis showed that the HHS is highly valid and reliable in this new Italian version.
Di Riso, Daniela; Salcuni, Silvia; Lis, Adriana; Delvecchio, Elisa
2017-01-01
The Affect in Play Scale-Preschool (APS-P) is one of the few standardized tools for measuring pretend play. The APS-P is an effective measure of symbolic play, able to detect both the cognitive and the affective dimensions that classically characterize play in children but that are often evaluated separately and rarely integrated. The scale uses a 5-minute standardized play task with a set of toys. Recently the scale was extended to children aged 6 to 10 years and validated in Italian preschool and school-aged children. The main limitations of this measure are that it requires videotaping, verbatim transcripts, and extensive scoring training, which could compromise its clinical utility. For these reasons, a Brief version of the measure was developed by the original authors. This paper focuses on the APS-P Brief Version and its Extended Version across ages (6–10 years), which consists of "in vivo" coding. This study aimed to evaluate the construct and external validity of the APS-P Brief Version and its Extended Version in a sample of 538 Italian children aged 4 to 10 years. Confirmatory factor analysis yielded a two-correlated-factor structure including an affective and a cognitive factor. APS-P-BR and its Extended Version factor scores were strongly related to APS-P Extended Version factor scores. Significant relationships were found with a divergent thinking task. The results suggest that the APS-P-BR and its Extended Version are an encouraging brief measure for assessing pretend play using toys. They could easily substitute for the APS-P and its Extended Version in clinical and research settings, reducing the time and difficulty of scoring procedures while maintaining the same strengths. PMID:28553243
Validation of an electronic version of the Mini Asthma Quality of Life Questionnaire.
Olajos-Clow, J; Minard, J; Szpiro, K; Juniper, E F; Turcotte, S; Jiang, X; Jenkins, B; Lougheed, M D
2010-05-01
The Mini Asthma Quality of Life Questionnaire (MiniAQLQ) is a validated disease-specific quality of life (QOL) paper (p) questionnaire. Electronic (e) versions enable inclusion of asthma QOL in electronic medical records and research databases. The objectives were to validate an e-version of the MiniAQLQ, to compare the time required to complete the e- and p-versions, and to determine which version participants prefer. Adults with stable asthma were randomized to complete either the e- or p-MiniAQLQ, followed by a 2-h rest period before completing the other version. Agreement between versions was measured using the intraclass correlation coefficient (ICC) and Bland-Altman analysis. Two participants with incomplete p-MiniAQLQ responses were excluded. Forty participants (85% female; age 47.7 +/- 14.9 years; asthma duration 22.6 +/- 16.1 years; FEV(1) 87.1 +/- 21.6% predicted) with both AQLQ scores <6.0 completed the study. Agreement between e- and p-versions for the overall score was acceptable (ICC=0.95) with no bias (difference (Delta) p-e=0.1; P=0.21). ICCs for the symptom, activity limitation, emotional function and environmental stimuli domains were 0.94, 0.89, 0.90, and 0.91, respectively. A small but significant bias (Delta=0.3; P=0.004) was noted in the activity limitation domain. Completion time was significantly longer for the e-version (3.8 +/- 1.9 min versus 2.7 +/- 1.1 min; P<0.0001). The majority of patients (57.5%) preferred the e-MiniAQLQ; 35% had no preference. This e-version of the MiniAQLQ is valid and was preferred by most participants despite taking slightly longer to complete. Generalizability may be limited in younger (12-17) and older (>65) adults.
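The two agreement statistics named above are straightforward to compute. The sketch below, using simulated paired scores for 40 hypothetical participants rather than the study's data, computes a single-measure absolute-agreement intraclass correlation (ICC(2,1)) and the Bland-Altman bias with its limits of agreement.

    # Agreement statistics on simulated paper vs. electronic scores (not the study data).
    import numpy as np

    def icc_agreement(scores):
        """Two-way random, single-measure, absolute-agreement ICC (ICC(2,1))
        for an n-subjects x k-methods matrix."""
        n, k = scores.shape
        grand = scores.mean()
        row_means = scores.mean(axis=1)
        col_means = scores.mean(axis=0)
        ss_rows = k * ((row_means - grand) ** 2).sum()
        ss_cols = n * ((col_means - grand) ** 2).sum()
        ss_err = ((scores - grand) ** 2).sum() - ss_rows - ss_cols
        msr = ss_rows / (n - 1)
        msc = ss_cols / (k - 1)
        mse = ss_err / ((n - 1) * (k - 1))
        return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

    rng = np.random.default_rng(2)
    true_qol = rng.uniform(2.0, 6.0, size=40)                 # 40 hypothetical participants
    paper = true_qol + rng.normal(scale=0.2, size=40)
    electronic = true_qol + rng.normal(scale=0.2, size=40)

    diff = paper - electronic
    bias, sd = diff.mean(), diff.std(ddof=1)
    print(f"ICC(2,1) = {icc_agreement(np.column_stack([paper, electronic])):.2f}")
    print(f"Bland-Altman bias = {bias:.2f}, limits of agreement = "
          f"{bias - 1.96 * sd:.2f} to {bias + 1.96 * sd:.2f}")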
Development of an hp-version finite element method for computational optimal control
NASA Technical Reports Server (NTRS)
Hodges, Dewey H.; Warner, Michael S.
1993-01-01
The purpose of this research effort was to begin the study of the application of hp-version finite elements to the numerical solution of optimal control problems. Under NAG-939, the hybrid MACSYMA/FORTRAN code GENCODE was developed which utilized h-version finite elements to successfully approximate solutions to a wide class of optimal control problems. In that code the means for improvement of the solution was the refinement of the time-discretization mesh. With the extension to hp-version finite elements, the degrees of freedom include both nodal values and extra interior values associated with the unknown states, co-states, and controls, the number of which depends on the order of the shape functions in each element. One possible drawback is the increased computational effort within each element required in implementing hp-version finite elements. We are trying to determine whether this computational effort is sufficiently offset by the reduction in the number of time elements used and improved Newton-Raphson convergence so as to be useful in solving optimal control problems in real time. Because certain of the element interior unknowns can be eliminated at the element level by solving a small set of nonlinear algebraic equations in which the nodal values are taken as given, the scheme may turn out to be especially powerful in a parallel computing environment. A different processor could be assigned to each element. The number of processors, strictly speaking, is not required to be any larger than the number of sub-regions which are free of discontinuities of any kind.
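The optimal-control formulation itself is not reproduced here. As a hedged illustration of the underlying h- versus p-refinement trade-off, the sketch below compares, on a plain interpolation problem with a smooth function, the error of adding more linear elements (h-version) against raising the polynomial order on a single element (p-version).

    # Hedged h- vs p-refinement comparison on a simple interpolation problem.
    import numpy as np

    f = lambda t: np.exp(-t) * np.sin(5 * t)
    t_fine = np.linspace(0.0, 1.0, 2001)

    def h_version_error(num_elements):
        """Max error of piecewise-linear interpolation on equal-size elements."""
        nodes = np.linspace(0.0, 1.0, num_elements + 1)
        approx = np.interp(t_fine, nodes, f(nodes))
        return np.abs(approx - f(t_fine)).max()

    def p_version_error(order):
        """Max error of a single element with a degree-`order` interpolant
        at Chebyshev-Lobatto points."""
        k = np.arange(order + 1)
        nodes = 0.5 * (1.0 - np.cos(np.pi * k / order))
        poly = np.polynomial.Polynomial.fit(nodes, f(nodes), order)
        return np.abs(poly(t_fine) - f(t_fine)).max()

    for n in (2, 4, 8, 16):
        print(f"h-version, {n:2d} linear elements:     max error = {h_version_error(n):.2e}")
    for p in (2, 4, 8, 16):
        print(f"p-version, one element of order {p:2d}: max error = {p_version_error(p):.2e}")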
ERIC Educational Resources Information Center
Keaton, Patrick
2014-01-01
This documentation is for the provisional version 1a file of the National Center for Education Statistics' (NCES) Common Core of Data (CCD) Local Education Agency (LEA) Universe Survey for SY 2011-12. It contains a brief description of the data collection, along with information required to understand and access the data file. The CCD is a…
ERIC Educational Resources Information Center
Keaton, Patrick
2013-01-01
The documentation for this provisional version 1a file of the National Center for Education Statistics' (NCES) Common Core of Data (CCD) Public Elementary/Secondary School Universe Survey for SY 2011-12, contains a brief description of the data collection, along with information required to understand and access the data file. The SY 2011-12…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-04
... high-pressure compressor (HPC) of both engines. That AD also requires removing from service any engine... monitoring of EGT margin deterioration on engines in service to prevent two engines on an airplane from... 75 [deg]C; Removes FADEC software version 5.B.Q and earlier versions from the engine as mandatory...
IDC System Specification Document.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clifford, David J.
2014-12-01
This document contains the system specifications derived to satisfy the system requirements found in the IDC System Requirements Document for the IDC Reengineering Phase 2 project. Revision history: Version V1.0, dated 12/2014, by the IDC Reengineering Project Team, revision description "Initial delivery", authorized by M. Harris.
NASA Astrophysics Data System (ADS)
Ward, A. J.; Pendry, J. B.
2000-06-01
In this paper we present an updated version of our ONYX program for calculating photonic band structures using a non-orthogonal finite difference time domain method. This new version employs the same transparent formalism as the first version, with the same capabilities for calculating photonic band structures or causal Green's functions, but also includes extra subroutines for the calculation of transmission and reflection coefficients. Both the electric and magnetic fields are placed onto a discrete lattice by approximating the spatial and temporal derivatives with finite differences. This results in discrete versions of Maxwell's equations which can be used to integrate the fields forwards in time. The time required for a calculation using this method scales linearly with the number of real-space points used in the discretization, so the technique is ideally suited to handling systems with large and complicated unit cells.
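ONYX works on general non-orthogonal three-dimensional lattices; the sketch below is only a minimal one-dimensional analogue of the leapfrog finite-difference update described, in normalised units (c = 1) with a Courant number of 0.5, to show how the discretised E and H fields are integrated forwards in time.

    # Minimal 1-D FDTD leapfrog update in normalised units (not the ONYX formalism).
    import numpy as np

    nx, nt = 400, 800
    ez = np.zeros(nx)            # electric field at integer grid points
    hy = np.zeros(nx - 1)        # magnetic field at half-integer grid points
    courant = 0.5                # c * dt / dx

    for n in range(nt):
        hy += courant * np.diff(ez)                        # update H from the curl of E
        ez[1:-1] += courant * np.diff(hy)                  # update E from the curl of H
        ez[nx // 4] += np.exp(-((n - 60) / 20.0) ** 2)     # soft Gaussian source

    print("peak |Ez| after propagation:", np.abs(ez).max())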
[Fetal version as ambulatory intervention].
Nohe, G; Hartmann, W; Klapproth, C E
1996-06-01
The external cephalic version (ECV) of the fetus at term reduces the maternal and fetal risks of intrapartum breech presentation and Caesarean delivery. Since 1986, over 800 external cephalic versions have been performed in the outpatient Department of Obstetrics and Gynaecology of the Städtische Frauenklinik Stuttgart; 60.5% were successful. No severe complications occurred. Sufficient amniotic fluid and the mobility of the fetal breech are major criteria for the success of the ECV. Management requires a technique that is safe for mother and fetus. This includes ultrasonography, electronic fetal monitoring, and the ability to perform an immediate Caesarean delivery, as well as the performance of ECV without analgesics and sedatives. More than 70% of the ECVs were successful without tocolysis. In unsuccessful cases, the additional use of tocolysis improves the success rate only slightly. Therefore, routine use of tocolysis does not appear necessary. External cephalic version can be recommended as an outpatient treatment without tocolysis.
MPI-Defrost: Extension of Defrost to MPI-based Cluster Environment
NASA Astrophysics Data System (ADS)
Amin, Mustafa A.; Easther, Richard; Finkel, Hal
2011-06-01
MPI-Defrost extends Frolov’s Defrost to an MPI-based cluster environment. This version has been restricted to a single field. Restoring two-field support should be straightforward, but will require some code changes. Some output options may also not be fully supported under MPI. This code was produced to support our own work, and has been made available for the benefit of anyone interested in either oscillon simulations or an MPI capable version of Defrost, and it is provided on an "as-is" basis. Andrei Frolov is the primary developer of Defrost and we thank him for placing his work under the GPL (GNU Public License), and thus allowing us to distribute this modified version.
On the falsifiability of matching theory.
McDowell, J J
1986-01-01
Herrnstein's matching theory requires the parameter, k, which appears in the single-alternative form of the matching equation, to remain invariant with respect to changes in reinforcement parameters like magnitude or immediacy. Recent experiments have disconfirmed matching theory by showing that the invariant-k requirement does not hold. However, the theory can be asserted in a purely algebraic form that does not require an invariant k and that is not disconfirmed by the recent findings. In addition, both the original and the purely algebraic versions of matching theory can be asserted in forms that allow for commonly observed deviations from matching (bias, undermatching, and overmatching). The recent finding of a variable k does not disconfirm these versions of matching theory either. As a consequence, matching remains a viable theory of behavior, the strength of which lies in its general conceptualization of all behavior as choice, and in its unified mathematical treatment of single- and multialternative environments. PMID:3950535
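For reference, the two forms of the theory discussed above can be written out numerically: Herrnstein's single-alternative hyperbola B = kR/(R + Re), in which k is the parameter required to remain invariant, and the generalized matching law log(B1/B2) = a log(R1/R2) + log b, whose exponent a and bias b accommodate undermatching, overmatching, and bias. The parameter values in the sketch below are illustrative only.

    # The two forms of the matching law written out numerically; parameter
    # values are illustrative, not fitted to any data set.
    import numpy as np

    def herrnstein_single(r, k=100.0, r_e=20.0):
        """Herrnstein's single-alternative hyperbola: B = k*R / (R + Re)."""
        return k * r / (r + r_e)

    def generalized_matching(r1, r2, a=0.9, log_b=0.1):
        """Generalized matching law: log(B1/B2) = a*log(R1/R2) + log b.
        a < 1 is undermatching, a > 1 is overmatching, log b != 0 is bias."""
        return a * np.log10(r1 / r2) + log_b

    print(herrnstein_single(np.array([10.0, 40.0, 160.0])))   # response rates vs reinforcement
    print(generalized_matching(60.0, 20.0))                   # log behavior ratio for 3:1 reinforcement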
TAE+ 5.2 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.2 (HP9000 SERIES 700/800 VERSION)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
TAE (Transportable Applications Environment) Plus is an integrated, portable environment for developing and running interactive window, text, and graphical object-based application systems. The program allows both programmers and non-programmers to easily construct their own custom application interface and to move that interface and application to different machine environments. TAE Plus makes both the application and the machine environment transparent, with noticeable improvements in the learning curve. The main components of TAE Plus are as follows: (1) the WorkBench, a What You See Is What You Get (WYSIWYG) tool for the design and layout of a user interface; (2) the Window Programming Tools Package (WPT), a set of callable subroutines that control an application's user interface; and (3) TAE Command Language (TCL), an easy-to-learn command language that provides an easy way to develop an executable application prototype with a run-time interpreted language. The WorkBench tool allows the application developer to interactively construct the layout of an application's display screen by manipulating a set of interaction objects including input items such as buttons, icons, and scrolling text lists. User interface interactive objects include data-driven graphical objects such as dials, thermometers, and strip charts as well as menubars, option menus, file selection items, message items, push buttons, and color loggers. The WorkBench user specifies the windows and interaction objects that will make up the user interface, then specifies the sequence of the user interface dialogue. The description of the designed user interface is then saved into resource files. For those who desire to develop the designed user interface into an operational application, the WorkBench tool also generates source code (C, C++, Ada, and TCL) which fully controls the application's user interface through function calls to the WPTs. The WPTs are the runtime services used by application programs to display and control the user interfaces. Since the WPTs access the workbench-generated resource files during each execution, details such as color, font, location, and object type remain independent from the application code, allowing changes to the user interface without recompiling and relinking. In addition to WPTs, TAE Plus can control interaction of objects from the interpreted TAE Command Language. TCL provides a means for the more experienced developer to quickly prototype an application's use of TAE Plus interaction objects and add programming logic without the overhead of compiling or linking. TAE Plus requires MIT's X Window System and the Open Software Foundation's Motif. The HP 9000 Series 700/800 version of TAE 5.2 requires Version 11 Release 5 of the X Window System. All other machine versions of TAE 5.2 require Version 11, Release 4 of the X Window System. The Workbench and WPTs are written in C++ and the remaining code is written in C. TAE Plus is available by license for an unlimited time period. The licensed program product includes the TAE Plus source code and one set of supporting documentation. Additional documentation may be purchased separately at the price indicated below. The amount of disk space required to load the TAE Plus tar format tape is between 35Mb and 67Mb depending on the machine version. The recommended minimum memory is 12Mb. 
Each TAE Plus platform delivery tape includes pre-built libraries and executable binary code for that particular machine, as well as source code, so users do not have to do an installation. Users wishing to recompile the source will need both a C compiler and either GNU's C++ Version 1.39 or later, or a C++ compiler based on AT&T 2.0 cfront. TAE Plus was developed in 1989 and version 5.2 was released in 1993. TAE Plus 5.2 is available on media suitable for five different machine platforms: (1) IBM RS/6000 series workstations running AIX (.25 inch tape cartridge in UNIX tar format), (2) DEC RISC workstations running ULTRIX (TK50 cartridge in UNIX tar format), (3) HP9000 Series 700/800 computers running HP-UX 9.x and X11/R5 (HP 4mm DDS DAT tape cartridge in UNIX tar format), (4) Sun4 (SPARC) series computers running SunOS (.25 inch tape cartridge in UNIX tar format), and (5) SGI Indigo computers running IRIX (.25 inch IRIS tape cartridge in UNIX tar format). Please contact COSMIC to obtain detailed information about the supported operating system and OSF/Motif releases required for each of these machine versions. An optional Motif Object Code License is available for the Sun4 version of TAE Plus 5.2. Version 5.1 of TAE Plus remains available for DEC VAX computers running VMS, HP9000 Series 300/400 computers running HP-UX, and HP 9000 Series 700/800 computers running HP-UX 8.x and X11/R4. Please contact COSMIC for details on these versions of TAE Plus.
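For readers unfamiliar with resource-file-driven interfaces, the following minimal Python sketch illustrates the separation the abstract describes: widget appearance lives in a workbench-style resource file while the application supplies only callbacks keyed by widget name. The file format and function names are invented for illustration and are not the TAE Plus WPT API.

```python
# Minimal sketch of the resource-file idea behind the WPTs: the look of the
# interface (widget types, labels, colors) lives in a data file, while the
# application only supplies callbacks keyed by widget name.  The file format
# and function names here are hypothetical illustrations, not the TAE Plus API.
import json

RESOURCE_TEXT = """
{
  "panel": "main",
  "items": [
    {"name": "run",  "type": "pushbutton", "label": "Run",  "color": "green"},
    {"name": "quit", "type": "pushbutton", "label": "Quit", "color": "red"}
  ]
}
"""

def load_resources(text):
    """Parse the designer-produced resource description."""
    return json.loads(text)

def build_panel(resources, callbacks):
    """Pair each described item with the application callback of the same name."""
    panel = {}
    for item in resources["items"]:
        panel[item["name"]] = (item, callbacks.get(item["name"]))
    return panel

def dispatch(panel, event_name):
    """Deliver a user event to the application without exposing layout details."""
    item, callback = panel[event_name]
    if callback:
        callback(item)

if __name__ == "__main__":
    callbacks = {
        "run":  lambda item: print(f"running (button labelled {item['label']!r})"),
        "quit": lambda item: print("quitting"),
    }
    panel = build_panel(load_resources(RESOURCE_TEXT), callbacks)
    dispatch(panel, "run")   # labels, colors, and layout can change in the
    dispatch(panel, "quit")  # resource file without touching this code
```

Because the application code touches only names and callbacks, cosmetic changes stay in the resource description, which is the property the abstract attributes to the workbench-generated resource files and the WPT runtime.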
TAE+ 5.2 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.2 (IBM RS/6000 VERSION)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
TAE (Transportable Applications Environment) Plus is an integrated, portable environment for developing and running interactive window, text, and graphical object-based application systems. The program allows both programmers and non-programmers to easily construct their own custom application interface and to move that interface and application to different machine environments. TAE Plus makes both the application and the machine environment transparent, with noticeable improvements in the learning curve. The main components of TAE Plus are as follows: (1) the WorkBench, a What You See Is What You Get (WYSIWYG) tool for the design and layout of a user interface; (2) the Window Programming Tools Package (WPT), a set of callable subroutines that control an application's user interface; and (3) TAE Command Language (TCL), an easy-to-learn command language that provides an easy way to develop an executable application prototype with a run-time interpreted language. The WorkBench tool allows the application developer to interactively construct the layout of an application's display screen by manipulating a set of interaction objects including input items such as buttons, icons, and scrolling text lists. User interface interactive objects include data-driven graphical objects such as dials, thermometers, and strip charts as well as menubars, option menus, file selection items, message items, push buttons, and color loggers. The WorkBench user specifies the windows and interaction objects that will make up the user interface, then specifies the sequence of the user interface dialogue. The description of the designed user interface is then saved into resource files. For those who desire to develop the designed user interface into an operational application, the WorkBench tool also generates source code (C, C++, Ada, and TCL) which fully controls the application's user interface through function calls to the WPTs. The WPTs are the runtime services used by application programs to display and control the user interfaces. Since the WPTs access the workbench-generated resource files during each execution, details such as color, font, location, and object type remain independent from the application code, allowing changes to the user interface without recompiling and relinking. In addition to WPTs, TAE Plus can control interaction of objects from the interpreted TAE Command Language. TCL provides a means for the more experienced developer to quickly prototype an application's use of TAE Plus interaction objects and add programming logic without the overhead of compiling or linking. TAE Plus requires MIT's X Window System and the Open Software Foundation's Motif. The HP 9000 Series 700/800 version of TAE 5.2 requires Version 11 Release 5 of the X Window System. All other machine versions of TAE 5.2 require Version 11, Release 4 of the X Window System. The Workbench and WPTs are written in C++ and the remaining code is written in C. TAE Plus is available by license for an unlimited time period. The licensed program product includes the TAE Plus source code and one set of supporting documentation. Additional documentation may be purchased separately at the price indicated below. The amount of disk space required to load the TAE Plus tar format tape is between 35Mb and 67Mb depending on the machine version. The recommended minimum memory is 12Mb. 
Each TAE Plus platform delivery tape includes pre-built libraries and executable binary code for that particular machine, as well as source code, so users do not have to do an installation. Users wishing to recompile the source will need both a C compiler and either GNU's C++ Version 1.39 or later, or a C++ compiler based on AT&T 2.0 cfront. TAE Plus was developed in 1989 and version 5.2 was released in 1993. TAE Plus 5.2 is available on media suitable for five different machine platforms: (1) IBM RS/6000 series workstations running AIX (.25 inch tape cartridge in UNIX tar format), (2) DEC RISC workstations running ULTRIX (TK50 cartridge in UNIX tar format), (3) HP9000 Series 700/800 computers running HP-UX 9.x and X11/R5 (HP 4mm DDS DAT tape cartridge in UNIX tar format), (4) Sun4 (SPARC) series computers running SunOS (.25 inch tape cartridge in UNIX tar format), and (5) SGI Indigo computers running IRIX (.25 inch IRIS tape cartridge in UNIX tar format). Please contact COSMIC to obtain detailed information about the supported operating system and OSF/Motif releases required for each of these machine versions. An optional Motif Object Code License is available for the Sun4 version of TAE Plus 5.2. Version 5.1 of TAE Plus remains available for DEC VAX computers running VMS, HP9000 Series 300/400 computers running HP-UX, and HP 9000 Series 700/800 computers running HP-UX 8.x and X11/R4. Please contact COSMIC for details on these versions of TAE Plus.
TAE+ 5.2 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.2 (SUN4 VERSION WITH MOTIF)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
TAE (Transportable Applications Environment) Plus is an integrated, portable environment for developing and running interactive window, text, and graphical object-based application systems. The program allows both programmers and non-programmers to easily construct their own custom application interface and to move that interface and application to different machine environments. TAE Plus makes both the application and the machine environment transparent, with noticeable improvements in the learning curve. The main components of TAE Plus are as follows: (1) the WorkBench, a What You See Is What You Get (WYSIWYG) tool for the design and layout of a user interface; (2) the Window Programming Tools Package (WPT), a set of callable subroutines that control an application's user interface; and (3) TAE Command Language (TCL), an easy-to-learn command language that provides an easy way to develop an executable application prototype with a run-time interpreted language. The WorkBench tool allows the application developer to interactively construct the layout of an application's display screen by manipulating a set of interaction objects including input items such as buttons, icons, and scrolling text lists. User interface interactive objects include data-driven graphical objects such as dials, thermometers, and strip charts as well as menubars, option menus, file selection items, message items, push buttons, and color loggers. The WorkBench user specifies the windows and interaction objects that will make up the user interface, then specifies the sequence of the user interface dialogue. The description of the designed user interface is then saved into resource files. For those who desire to develop the designed user interface into an operational application, the WorkBench tool also generates source code (C, C++, Ada, and TCL) which fully controls the application's user interface through function calls to the WPTs. The WPTs are the runtime services used by application programs to display and control the user interfaces. Since the WPTs access the workbench-generated resource files during each execution, details such as color, font, location, and object type remain independent from the application code, allowing changes to the user interface without recompiling and relinking. In addition to WPTs, TAE Plus can control interaction of objects from the interpreted TAE Command Language. TCL provides a means for the more experienced developer to quickly prototype an application's use of TAE Plus interaction objects and add programming logic without the overhead of compiling or linking. TAE Plus requires MIT's X Window System and the Open Software Foundation's Motif. The HP 9000 Series 700/800 version of TAE 5.2 requires Version 11 Release 5 of the X Window System. All other machine versions of TAE 5.2 require Version 11, Release 4 of the X Window System. The Workbench and WPTs are written in C++ and the remaining code is written in C. TAE Plus is available by license for an unlimited time period. The licensed program product includes the TAE Plus source code and one set of supporting documentation. Additional documentation may be purchased separately at the price indicated below. The amount of disk space required to load the TAE Plus tar format tape is between 35Mb and 67Mb depending on the machine version. The recommended minimum memory is 12Mb. 
Each TAE Plus platform delivery tape includes pre-built libraries and executable binary code for that particular machine, as well as source code, so users do not have to do an installation. Users wishing to recompile the source will need both a C compiler and either GNU's C++ Version 1.39 or later, or a C++ compiler based on AT&T 2.0 cfront. TAE Plus was developed in 1989 and version 5.2 was released in 1993. TAE Plus 5.2 is available on media suitable for five different machine platforms: (1) IBM RS/6000 series workstations running AIX (.25 inch tape cartridge in UNIX tar format), (2) DEC RISC workstations running ULTRIX (TK50 cartridge in UNIX tar format), (3) HP9000 Series 700/800 computers running HP-UX 9.x and X11/R5 (HP 4mm DDS DAT tape cartridge in UNIX tar format), (4) Sun4 (SPARC) series computers running SunOS (.25 inch tape cartridge in UNIX tar format), and (5) SGI Indigo computers running IRIX (.25 inch IRIS tape cartridge in UNIX tar format). Please contact COSMIC to obtain detailed information about the supported operating system and OSF/Motif releases required for each of these machine versions. An optional Motif Object Code License is available for the Sun4 version of TAE Plus 5.2. Version 5.1 of TAE Plus remains available for DEC VAX computers running VMS, HP9000 Series 300/400 computers running HP-UX, and HP 9000 Series 700/800 computers running HP-UX 8.x and X11/R4. Please contact COSMIC for details on these versions of TAE Plus.
TAE+ 5.2 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.2 (SILICON GRAPHICS VERSION)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
TAE (Transportable Applications Environment) Plus is an integrated, portable environment for developing and running interactive window, text, and graphical object-based application systems. The program allows both programmers and non-programmers to easily construct their own custom application interface and to move that interface and application to different machine environments. TAE Plus makes both the application and the machine environment transparent, with noticeable improvements in the learning curve. The main components of TAE Plus are as follows: (1) the WorkBench, a What You See Is What You Get (WYSIWYG) tool for the design and layout of a user interface; (2) the Window Programming Tools Package (WPT), a set of callable subroutines that control an application's user interface; and (3) TAE Command Language (TCL), an easy-to-learn command language that provides an easy way to develop an executable application prototype with a run-time interpreted language. The WorkBench tool allows the application developer to interactively construct the layout of an application's display screen by manipulating a set of interaction objects including input items such as buttons, icons, and scrolling text lists. User interface interactive objects include data-driven graphical objects such as dials, thermometers, and strip charts as well as menubars, option menus, file selection items, message items, push buttons, and color loggers. The WorkBench user specifies the windows and interaction objects that will make up the user interface, then specifies the sequence of the user interface dialogue. The description of the designed user interface is then saved into resource files. For those who desire to develop the designed user interface into an operational application, the WorkBench tool also generates source code (C, C++, Ada, and TCL) which fully controls the application's user interface through function calls to the WPTs. The WPTs are the runtime services used by application programs to display and control the user interfaces. Since the WPTs access the workbench-generated resource files during each execution, details such as color, font, location, and object type remain independent from the application code, allowing changes to the user interface without recompiling and relinking. In addition to WPTs, TAE Plus can control interaction of objects from the interpreted TAE Command Language. TCL provides a means for the more experienced developer to quickly prototype an application's use of TAE Plus interaction objects and add programming logic without the overhead of compiling or linking. TAE Plus requires MIT's X Window System and the Open Software Foundation's Motif. The HP 9000 Series 700/800 version of TAE 5.2 requires Version 11 Release 5 of the X Window System. All other machine versions of TAE 5.2 require Version 11, Release 4 of the X Window System. The Workbench and WPTs are written in C++ and the remaining code is written in C. TAE Plus is available by license for an unlimited time period. The licensed program product includes the TAE Plus source code and one set of supporting documentation. Additional documentation may be purchased separately at the price indicated below. The amount of disk space required to load the TAE Plus tar format tape is between 35Mb and 67Mb depending on the machine version. The recommended minimum memory is 12Mb. 
Each TAE Plus platform delivery tape includes pre-built libraries and executable binary code for that particular machine, as well as source code, so users do not have to do an installation. Users wishing to recompile the source will need both a C compiler and either GNU's C++ Version 1.39 or later, or a C++ compiler based on AT&T 2.0 cfront. TAE Plus was developed in 1989 and version 5.2 was released in 1993. TAE Plus 5.2 is available on media suitable for five different machine platforms: (1) IBM RS/6000 series workstations running AIX (.25 inch tape cartridge in UNIX tar format), (2) DEC RISC workstations running ULTRIX (TK50 cartridge in UNIX tar format), (3) HP9000 Series 700/800 computers running HP-UX 9.x and X11/R5 (HP 4mm DDS DAT tape cartridge in UNIX tar format), (4) Sun4 (SPARC) series computers running SunOS (.25 inch tape cartridge in UNIX tar format), and (5) SGI Indigo computers running IRIX (.25 inch IRIS tape cartridge in UNIX tar format). Please contact COSMIC to obtain detailed information about the supported operating system and OSF/Motif releases required for each of these machine versions. An optional Motif Object Code License is available for the Sun4 version of TAE Plus 5.2. Version 5.1 of TAE Plus remains available for DEC VAX computers running VMS, HP9000 Series 300/400 computers running HP-UX, and HP 9000 Series 700/800 computers running HP-UX 8.x and X11/R4. Please contact COSMIC for details on these versions of TAE Plus.
TAE+ 5.2 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.2 (SUN4 VERSION)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
TAE (Transportable Applications Environment) Plus is an integrated, portable environment for developing and running interactive window, text, and graphical object-based application systems. The program allows both programmers and non-programmers to easily construct their own custom application interface and to move that interface and application to different machine environments. TAE Plus makes both the application and the machine environment transparent, with noticeable improvements in the learning curve. The main components of TAE Plus are as follows: (1) the WorkBench, a What You See Is What You Get (WYSIWYG) tool for the design and layout of a user interface; (2) the Window Programming Tools Package (WPT), a set of callable subroutines that control an application's user interface; and (3) TAE Command Language (TCL), an easy-to-learn command language that provides an easy way to develop an executable application prototype with a run-time interpreted language. The WorkBench tool allows the application developer to interactively construct the layout of an application's display screen by manipulating a set of interaction objects including input items such as buttons, icons, and scrolling text lists. User interface interactive objects include data-driven graphical objects such as dials, thermometers, and strip charts as well as menubars, option menus, file selection items, message items, push buttons, and color loggers. The WorkBench user specifies the windows and interaction objects that will make up the user interface, then specifies the sequence of the user interface dialogue. The description of the designed user interface is then saved into resource files. For those who desire to develop the designed user interface into an operational application, the WorkBench tool also generates source code (C, C++, Ada, and TCL) which fully controls the application's user interface through function calls to the WPTs. The WPTs are the runtime services used by application programs to display and control the user interfaces. Since the WPTs access the workbench-generated resource files during each execution, details such as color, font, location, and object type remain independent from the application code, allowing changes to the user interface without recompiling and relinking. In addition to WPTs, TAE Plus can control interaction of objects from the interpreted TAE Command Language. TCL provides a means for the more experienced developer to quickly prototype an application's use of TAE Plus interaction objects and add programming logic without the overhead of compiling or linking. TAE Plus requires MIT's X Window System and the Open Software Foundation's Motif. The HP 9000 Series 700/800 version of TAE 5.2 requires Version 11 Release 5 of the X Window System. All other machine versions of TAE 5.2 require Version 11, Release 4 of the X Window System. The Workbench and WPTs are written in C++ and the remaining code is written in C. TAE Plus is available by license for an unlimited time period. The licensed program product includes the TAE Plus source code and one set of supporting documentation. Additional documentation may be purchased separately at the price indicated below. The amount of disk space required to load the TAE Plus tar format tape is between 35Mb and 67Mb depending on the machine version. The recommended minimum memory is 12Mb. 
Each TAE Plus platform delivery tape includes pre-built libraries and executable binary code for that particular machine, as well as source code, so users do not have to do an installation. Users wishing to recompile the source will need both a C compiler and either GNU's C++ Version 1.39 or later, or a C++ compiler based on AT&T 2.0 cfront. TAE Plus was developed in 1989 and version 5.2 was released in 1993. TAE Plus 5.2 is available on media suitable for five different machine platforms: (1) IBM RS/6000 series workstations running AIX (.25 inch tape cartridge in UNIX tar format), (2) DEC RISC workstations running ULTRIX (TK50 cartridge in UNIX tar format), (3) HP9000 Series 700/800 computers running HP-UX 9.x and X11/R5 (HP 4mm DDS DAT tape cartridge in UNIX tar format), (4) Sun4 (SPARC) series computers running SunOS (.25 inch tape cartridge in UNIX tar format), and (5) SGI Indigo computers running IRIX (.25 inch IRIS tape cartridge in UNIX tar format). Please contact COSMIC to obtain detailed information about the supported operating system and OSF/Motif releases required for each of these machine versions. An optional Motif Object Code License is available for the Sun4 version of TAE Plus 5.2. Version 5.1 of TAE Plus remains available for DEC VAX computers running VMS, HP9000 Series 300/400 computers running HP-UX, and HP 9000 Series 700/800 computers running HP-UX 8.x and X11/R4. Please contact COSMIC for details on these versions of TAE Plus.
TAE+ 5.2 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.2 (DEC RISC ULTRIX VERSION)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
TAE (Transportable Applications Environment) Plus is an integrated, portable environment for developing and running interactive window, text, and graphical object-based application systems. The program allows both programmers and non-programmers to easily construct their own custom application interface and to move that interface and application to different machine environments. TAE Plus makes both the application and the machine environment transparent, with noticeable improvements in the learning curve. The main components of TAE Plus are as follows: (1) the WorkBench, a What You See Is What You Get (WYSIWYG) tool for the design and layout of a user interface; (2) the Window Programming Tools Package (WPT), a set of callable subroutines that control an application's user interface; and (3) TAE Command Language (TCL), an easy-to-learn command language that provides an easy way to develop an executable application prototype with a run-time interpreted language. The WorkBench tool allows the application developer to interactively construct the layout of an application's display screen by manipulating a set of interaction objects including input items such as buttons, icons, and scrolling text lists. User interface interactive objects include data-driven graphical objects such as dials, thermometers, and strip charts as well as menubars, option menus, file selection items, message items, push buttons, and color loggers. The WorkBench user specifies the windows and interaction objects that will make up the user interface, then specifies the sequence of the user interface dialogue. The description of the designed user interface is then saved into resource files. For those who desire to develop the designed user interface into an operational application, the WorkBench tool also generates source code (C, C++, Ada, and TCL) which fully controls the application's user interface through function calls to the WPTs. The WPTs are the runtime services used by application programs to display and control the user interfaces. Since the WPTs access the workbench-generated resource files during each execution, details such as color, font, location, and object type remain independent from the application code, allowing changes to the user interface without recompiling and relinking. In addition to WPTs, TAE Plus can control interaction of objects from the interpreted TAE Command Language. TCL provides a means for the more experienced developer to quickly prototype an application's use of TAE Plus interaction objects and add programming logic without the overhead of compiling or linking. TAE Plus requires MIT's X Window System and the Open Software Foundation's Motif. The HP 9000 Series 700/800 version of TAE 5.2 requires Version 11 Release 5 of the X Window System. All other machine versions of TAE 5.2 require Version 11, Release 4 of the X Window System. The Workbench and WPTs are written in C++ and the remaining code is written in C. TAE Plus is available by license for an unlimited time period. The licensed program product includes the TAE Plus source code and one set of supporting documentation. Additional documentation may be purchased separately at the price indicated below. The amount of disk space required to load the TAE Plus tar format tape is between 35Mb and 67Mb depending on the machine version. The recommended minimum memory is 12Mb. 
Each TAE Plus platform delivery tape includes pre-built libraries and executable binary code for that particular machine, as well as source code, so users do not have to do an installation. Users wishing to recompile the source will need both a C compiler and either GNU's C++ Version 1.39 or later, or a C++ compiler based on AT&T 2.0 cfront. TAE Plus was developed in 1989 and version 5.2 was released in 1993. TAE Plus 5.2 is available on media suitable for five different machine platforms: (1) IBM RS/6000 series workstations running AIX (.25 inch tape cartridge in UNIX tar format), (2) DEC RISC workstations running ULTRIX (TK50 cartridge in UNIX tar format), (3) HP9000 Series 700/800 computers running HP-UX 9.x and X11/R5 (HP 4mm DDS DAT tape cartridge in UNIX tar format), (4) Sun4 (SPARC) series computers running SunOS (.25 inch tape cartridge in UNIX tar format), and (5) SGI Indigo computers running IRIX (.25 inch IRIS tape cartridge in UNIX tar format). Please contact COSMIC to obtain detailed information about the supported operating system and OSF/Motif releases required for each of these machine versions. An optional Motif Object Code License is available for the Sun4 version of TAE Plus 5.2. Version 5.1 of TAE Plus remains available for DEC VAX computers running VMS, HP9000 Series 300/400 computers running HP-UX, and HP 9000 Series 700/800 computers running HP-UX 8.x and X11/R4. Please contact COSMIC for details on these versions of TAE Plus.
78 FR 66365 - Proposed Information Collection Activity; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-05
... for Needy Families (TANF) program, it imposed a new data requirement that States prepare and submit data verification procedures and replaced other data requirements with new versions including: the TANF Data Report, the SSP-MOE Data Report, the Caseload Reduction Documentation Process, and the Reasonable...
Psychometric testing of the Italian and French versions of the Care Dependency Scale.
Zürcher, Simeon Joel; Vangelooven, Christa; Borter, Natalie; Schnyder, Daniel; Hahn, Sabine
2016-12-01
The aim of this study was to test psychometrically the Italian and French versions of the Care Dependency Scale. The Care Dependency Scale assesses changes in patients' level of care dependency including important functional and mental dimensions. Evaluation of the psychometric properties of the Italian version is still ongoing. The French version has to date not been validated. Nationwide cross-sectional point prevalence study. Data were extracted from the national, annual prevalence survey of hospital-acquired pressure ulcers and inpatient falls in Swiss acute care hospitals in 2011. A total of 799 Italian and 1068 French-speaking patients were included in the analysis. For the evaluation, the psychometric properties were tested for each language both separately and conjointly. The scales revealed high internal consistency. Factor analysis presented a one-factor solution for both versions separately as well as combined. Comparison of internal structure revealed an excellent degree of equivalence between the versions. Highly significant Spearman correlations between the Care Dependency Scale and the Braden Scale sum scores indicated satisfactory criterion validity. Both the Italian and the French versions of the Care Dependency Scale showed satisfactory psychometric properties and a high level of equivalence. Further psychometric testing, using modern test theory approaches, is required. However, the scale is recommended as a valid instrument for further use in Italian and French. © 2016 John Wiley & Sons Ltd.
ImSET: Impact of Sector Energy Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roop, Joseph M.; Scott, Michael J.; Schultz, Robert W.
2005-07-19
This version of the Impact of Sector Energy Technologies (ImSET) model represents the "next generation" of the previously developed Visual Basic model (ImBUILD 2.0) that was developed in 2003 to estimate the macroeconomic impacts of energy-efficient technology in buildings. More specifically, a special-purpose version of the 1997 benchmark national Input-Output (I-O) model was designed specifically to estimate the national employment and income effects of the deployment of Office of Energy Efficiency and Renewable Energy (EERE)-developed energy-saving technologies. In comparison with the previous versions of the model, this version allows for more complete and automated analysis of the essential features of energy efficiency investments in buildings, industry, transportation, and the electric power sectors. This version also incorporates improvements in the treatment of operations and maintenance costs, and improves the treatment of financing of investment options. ImSET is also easier to use than extant macroeconomic simulation models and incorporates information developed by each of the EERE offices as part of the requirements of the Government Performance and Results Act.
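The core calculation in any input-output framework of this kind is the Leontief total-requirements relation x = (I - A)^-1 d, where A holds technical coefficients and d is the change in final demand induced by an efficiency investment. The sketch below uses invented three-sector numbers; they are not ImSET's 1997 benchmark accounts or its employment coefficients.

```python
# Toy Leontief input-output calculation of the kind an I-O based model such as
# ImSET rests on: given a technical-coefficients matrix A and a change in final
# demand d (e.g. spending shifted by an efficiency investment), total output
# requirements are x = (I - A)^-1 d.  All numbers are invented for illustration.
import numpy as np

A = np.array([            # A[i, j]: input from sector i per $1 of sector j output
    [0.10, 0.20, 0.05],
    [0.15, 0.05, 0.10],
    [0.05, 0.10, 0.15],
])
d = np.array([10.0, -5.0, 2.0])   # change in final demand, $ millions (assumed)

leontief_inverse = np.linalg.inv(np.eye(3) - A)
x = leontief_inverse @ d           # total (direct + indirect) output change

jobs_per_million = np.array([8.0, 12.0, 6.0])   # assumed employment coefficients
print("output change by sector:", np.round(x, 3))
print("employment change:", round(float(jobs_per_million @ x), 1), "jobs")
```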
Numerical arc segmentation algorithm for a radio conference-NASARC (version 2.0) technical manual
NASA Technical Reports Server (NTRS)
Whyte, Wayne A., Jr.; Heyward, Ann O.; Ponchak, Denise S.; Spence, Rodney L.; Zuzek, John E.
1987-01-01
The information contained in the NASARC (Version 2.0) Technical Manual (NASA TM-100160) and NASARC (Version 2.0) User's Manual (NASA TM-100161) relates to the state of NASARC software development through October 16, 1987. The Technical Manual describes the Numerical Arc Segmentation Algorithm for a Radio Conference (NASARC) concept and the algorithms used to implement the concept. The User's Manual provides information on computer system considerations, installation instructions, description of input files, and program operating instructions. Significant revisions have been incorporated in the Version 2.0 software. These revisions have enhanced the modeling capabilities of the NASARC procedure while greatly reducing the computer run time and memory requirements. Array dimensions within the software have been structured to fit within the currently available 6-megabyte memory capacity of the International Frequency Registration Board (IFRB) computer facility. A piecewise approach to predetermined arc generation in NASARC (Version 2.0) allows worldwide scenarios to be accommodated within these memory constraints while at the same time effecting an overall reduction in computer run time.
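The memory-bounding idea behind the piecewise approach can be illustrated generically: generate and write out arcs in fixed-size pieces rather than holding a worldwide scenario in memory at once. The arc computation below is a placeholder, not NASARC's arc-segmentation algorithm.

```python
# Generic illustration of piecewise processing to bound memory: instead of
# holding every service area's predetermined arc in memory at once, the
# scenario is processed in fixed-size pieces and results are flushed between
# pieces.  generate_arc() is a stand-in, not NASARC's actual computation.
def generate_arc(service_area):
    # placeholder: pretend the allowable orbital arc is +/- 10 deg of a nominal
    nominal = service_area["nominal_longitude_deg"]
    return (nominal - 10.0, nominal + 10.0)

def process_piecewise(service_areas, piece_size, write_out):
    buffer = []
    for area in service_areas:                # at most `piece_size` arcs held
        buffer.append((area["name"], generate_arc(area)))
        if len(buffer) == piece_size:
            write_out(buffer)
            buffer.clear()
    if buffer:
        write_out(buffer)

if __name__ == "__main__":
    areas = [{"name": f"area{i}", "nominal_longitude_deg": 10.0 * i} for i in range(7)]
    process_piecewise(areas, piece_size=3, write_out=lambda b: print("flush:", b))
```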
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-03
... Reference to the Required Assessment Tool for State Nursing Homes Receiving Per Diem Payments From VA AGENCY... resident assessment tool for State homes that receive per diem from VA for providing nursing home care to veterans. It requires State nursing homes receiving per diem from VA to use the most recent version of the...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-10
... Reference to the Required Assessment Tool for State Nursing Homes Receiving Per Diem Payments From VA AGENCY... State homes that receive per diem from VA for providing nursing home care to veterans. The proposed rule would require State nursing homes receiving per diem from VA to use the most recent version of the...
ERIC Educational Resources Information Center
Cornman, Stephen Q.; Zhou, Lei; Nakamoto, Nanae
2012-01-01
This documentation is for the revised file (Version 1b) of the National Center for Education Statistics' (NCES) Common Core of Data (CCD) National Public Education Financial Survey (NPEFS) for school year 2008-2009, fiscal year 2009 (FY 09). It contains a brief description of the data collection along with information required to understand and…
Modulation of habit formation by levodopa in Parkinson's disease.
Marzinzik, Frank; Wotka, Johann; Wahl, Michael; Krugel, Lea K; Kordsachia, Catarina; Klostermann, Fabian
2011-01-01
Dopamine promotes the execution of positively reinforced actions, but its role for the formation of behaviour when feedback is unavailable remains open. To study this issue, the performance of treated/untreated patients with Parkinson's disease and controls was analysed in an implicit learning task, hypothesising dopamine-dependent adherence to hidden task rules. Sixteen patients on/off levodopa and fourteen healthy subjects engaged in a Go/NoGo paradigm comprising four equiprobable stimuli. One of the stimuli was defined as target which was first consistently preceded by one of the three non-target stimuli (conditioning), whereas this coupling was dissolved thereafter (deconditioning). Two task versions were presented: in a 'Go version', only the target cue required the execution of a button press, whereas non-target stimuli were not instructive of a response; in a 'NoGo version', only the target cue demanded the inhibition of the button press which was demanded upon any non-target stimulus. Levodopa influenced in which task version errors grew from conditioning to deconditioning: in unmedicated patients just as controls errors only rose in the NoGo version with an increase of incorrect responses to target cues. Contrarily, in medicated patients errors went up only in the Go version with an increase of response omissions to target cues. The error increases during deconditioning can be understood as a perpetuation of reaction tendencies acquired during conditioning. The levodopa-mediated modulation of this carry-over effect suggests that dopamine supports habit conditioning under the task demand of response execution, but dampens it when inhibition is required. However, other than in reinforcement learning, supporting dopaminergic actions referred to the most frequent, i. e., non-target behaviour. Since this is passive whenever selective actions are executed against an inactive background, dopaminergic treatment could in according scenarios contribute to passive behaviour in patients with Parkinson's disease.
NASA Technical Reports Server (NTRS)
Hou, Gene
1998-01-01
Sensitivity analysis is a technique for determining derivatives of system responses with respect to design parameters. Among many methods available for sensitivity analysis, automatic differentiation has been proven through many applications in fluid dynamics and structural mechanics to be an accurate and easy method for obtaining derivatives. Nevertheless, the method can be computationally expensive and can require a large amount of memory. This project will apply an automatic differentiation tool, ADIFOR, to a p-version finite element code to obtain first- and second-order thermal derivatives. The focus of the study is on the implementation process and the performance of the ADIFOR-enhanced codes for sensitivity analysis in terms of memory requirement, computational efficiency, and accuracy.
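ADIFOR works by transforming Fortran source, but the principle of forward-mode automatic differentiation it relies on can be shown compactly with dual numbers, where every arithmetic operation carries a derivative along with its value. This is a generic sketch of the technique, not ADIFOR or the p-version code in question.

```python
# Forward-mode automatic differentiation with dual numbers: every operation
# propagates the derivative alongside the value, so derivatives come out exact
# to machine precision rather than from finite differencing.  Generic
# illustration of the technique only; the response function is made up.
import math

class Dual:
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)
    __rmul__ = __mul__

def sin(x):
    return Dual(math.sin(x.value), math.cos(x.value) * x.deriv)

def response(design_param):
    """Stand-in for a system response computed by an analysis code."""
    return design_param * design_param + 3.0 * sin(design_param)

x = Dual(2.0, 1.0)                         # seed derivative d(x)/d(x) = 1
r = response(x)
print("response:", r.value)                # 4 + 3 sin(2)
print("d(response)/d(param):", r.deriv)    # 2*2 + 3 cos(2)
```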
ENERGY STAR Certified Audio Video
Certified models meet all ENERGY STAR requirements as listed in the Version 3.0 ENERGY STAR Program Requirements for Audio Video Equipment that are effective as of May 1, 2013. A detailed listing of key efficiency criteria is available at http://www.energystar.gov/index.cfm?c=audio_dvd.pr_crit_audio_dvd
ENERGY STAR Certified Data Center Storage
Certified models meet all ENERGY STAR requirements as listed in the Version 1.0 ENERGY STAR Program Requirements for Data Center Storage that are effective as of December 2, 2013. A detailed listing of key efficiency criteria is available at http://www.energystar.gov/certified-products/detail/data_center_storage
ENERGY STAR Certified Ventilating Fans
Certified models meet all ENERGY STAR requirements as listed in the Version 4.0 ENERGY STAR Program Requirements for Ventilating Fans that are effective as of October 1, 2015. A detailed listing of key efficiency criteria is available at http://www.energystar.gov/index.cfm?c=vent_fans.pr_crit_vent_fans
ENERGY STAR Certified Ceiling Fans
Certified models meet all ENERGY STAR requirements as listed in the Version 3.0 ENERGY STAR Program Requirements for Ceiling Fans that are effective as of April 1, 2012. A detailed listing of key efficiency criteria is available at http://www.energystar.gov/index.cfm?c=ceiling_fans.pr_crit_ceiling_fans
78 FR 26573 - Federal Acquisition Regulation; Irrevocable Letters of Credit
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-07
... requirements that require the approval of the Office of Management and Budget under the Paperwork Reduction Act... Practice (UCP) for Documentary Credits, 2006 Edition, International Chamber of Commerce Publication No. 600... Documentary Credits, International Chamber of Commerce Publication No. (Insert version in effect at the time...
76 FR 20989 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-14
...) program, it imposed a new data requirement that States prepare and submit data verification procedures and replaced other data requirements with new versions including: the TANF Data Report, the SSP-MOE Data Report... Yearly hours per total annual respondents submittals response burden hours Preparation and Submission of...
76 FR 9020 - Proposed Information Collection Activity; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-16
... -------- Preparation and Submission of Data 54 1 640 34,560 Verification Procedures--Sec. Sec. 261.60-261.63 Caseload... for Needy Families (TANF) program, it imposed a new data requirement that States prepare and submit data verification procedures and replaced other data requirements with new versions including: the TANF...
NASA Astrophysics Data System (ADS)
Miszczak, Jarosław Adam
2013-01-01
The presented package for the Mathematica computing system allows the harnessing of quantum random number generators (QRNG) for investigating the statistical properties of quantum states. The described package implements a number of functions for generating random states. The new version of the package adds the ability to use the on-line quantum random number generator service and implements new functions for retrieving lists of random numbers. Thanks to the introduced improvements, the new version provides faster access to high-quality sources of random numbers and can be used in simulations requiring large amounts of random data.
New version program summary
Program title: TRQS
Catalogue identifier: AEKA_v2_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKA_v2_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 18 134
No. of bytes in distributed program, including test data, etc.: 2 520 49
Distribution format: tar.gz
Programming language: Mathematica, C.
Computer: Any supporting Mathematica in version 7 or higher.
Operating system: Any platform supporting Mathematica; tested with GNU/Linux (32 and 64 bit).
RAM: Case-dependent
Supplementary material: Fig. 1 mentioned below can be downloaded.
Classification: 4.15.
External routines: Quantis software library (http://www.idquantique.com/support/quantis-trng.html)
Catalogue identifier of previous version: AEKA_v1_0
Journal reference of previous version: Comput. Phys. Comm. 183 (2012) 118
Does the new version supersede the previous version?: Yes
Nature of problem: Generation of random density matrices and utilization of high-quality random numbers for the purpose of computer simulation.
Solution method: Use of a physical quantum random number generator and an on-line service providing access to a source of true random numbers generated by a quantum random number generator.
Reasons for new version: Added support for the high-speed on-line quantum random number generator and improved methods for retrieving lists of random numbers.
Summary of revisions: The presented version provides two significant improvements. The first is the ability to use the on-line Quantum Random Number Generation service developed by PicoQuant GmbH and the Nano-Optics groups at the Department of Physics of Humboldt University. The on-line service supported in version 2.0 of the TRQS package provides faster access to true randomness sources constructed using the laws of quantum physics. The service is freely available at https://qrng.physik.hu-berlin.de/. The use of this service allows the presented package to be used without the need for a physical quantum random number generator. The second improvement introduced in this version is the ability to retrieve arrays of random data directly from the used source. This increases the speed of random number generation, especially in the case of the on-line service, where it reduces the time needed to establish the connection. Thanks to the speed improvement of the presented version, the package can now be used in simulations requiring larger amounts of random data. Moreover, the functions for generating random numbers provided by the current version of the package more closely follow the pattern of the functions for generating pseudo-random numbers provided in Mathematica.
Additional comments: Speed comparison: The implementation of the support for the QRNG on-line service provides a noticeable improvement in the speed of random number generation. For samples of real numbers of size 10^1, 10^2, ..., 10^7, the times required to generate these samples using the Quantis USB device and the QRNG service are compared in Fig. 1. The presented results show that the use of the on-line service provides faster access to random numbers. One should note, however, that the speed gain can increase or decrease depending on the connection speed between the computer and the server providing random numbers.
Running time: Depends on the used source of randomness and the amount of random data used in the experiment.
References: [1] M. Wahl, M. Leifgen, M. Berlin, T. Röhlicke, H.-J. Rahn, O. Benson, An ultrafast quantum random number generator with provably bounded output bias based on photon arrival time measurements, Applied Physics Letters 98, 171105 (2011). http://dx.doi.org/10.1063/1.3578456.
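A common way such packages generate a random density matrix, once a stream of random numbers is available, is the Ginibre construction: draw a complex Gaussian matrix G and normalize the product of G with its conjugate transpose to unit trace. The sketch below uses numpy's pseudo-random generator as a stand-in for the hardware or on-line quantum source and is not the TRQS implementation.

```python
# Standard Ginibre construction of a random density matrix, of the kind a
# package like TRQS automates: draw a complex Gaussian matrix G from the chosen
# randomness source and normalise G @ G.conj().T to unit trace.  numpy's PRNG
# here is only a stand-in for a hardware or on-line quantum random number source.
import numpy as np

def random_density_matrix(dim, rng):
    g = rng.standard_normal((dim, dim)) + 1j * rng.standard_normal((dim, dim))
    rho = g @ g.conj().T
    return rho / np.trace(rho).real

rng = np.random.default_rng(0)            # replace with a QRNG-backed source
rho = random_density_matrix(4, rng)
print("trace:", np.trace(rho).real)                       # 1.0
print("hermitian:", np.allclose(rho, rho.conj().T))       # True
print("positive semidefinite:", np.linalg.eigvalsh(rho).min() >= -1e-12)
```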
Neuraxial blockade for external cephalic version: a systematic review.
Sultan, P; Carvalho, B
2011-10-01
The desire to decrease the number of cesarean deliveries has renewed interest in external cephalic version. The rationale for using neuraxial blockade to facilitate external cephalic version is to provide abdominal muscular relaxation and reduce patient discomfort during the procedure, so permitting successful repositioning of the fetus to a cephalic presentation. This review systematically examined the current evidence to determine the safety and efficacy of neuraxial anesthesia or analgesia when used for external cephalic version. A systematic literature review of studies that examined success rates of external cephalic version with neuraxial anesthesia was performed. Published articles written in English between 1945 and 2010 were identified using the Medline, Cochrane, EMBASE and Web of Sciences databases. Six, randomized controlled studies were identified. Neuraxial blockade significantly improved the success rate in four of these six studies. A further six non-randomized studies were identified, of which four studies with control groups found that neuraxial blockade increased the success rate of external cephalic version. Despite over 850 patients being included in the 12 studies reviewed, placental abruption was reported in only one patient with a neuraxial block, compared with two in the control groups. The incidence of non-reassuring fetal heart rate requiring cesarean delivery in the anesthesia groups was 0.44% (95% CI 0.15-1.32). Neuraxial blockade improved the likelihood of success during external cephalic version, although the dosing regimen that provides optimal conditions for successful version is unclear. Anesthetic rather than analgesic doses of local anesthetics may improve success. The findings suggest that neuraxial blockade does not compromise maternal or fetal safety during external cephalic version. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.
Assessment of Version 4 of the SMAP Passive Soil Moisture Standard Product
NASA Technical Reports Server (NTRS)
O'neill, P. O.; Chan, S.; Bindlish, R.; Jackson, T.; Colliander, A.; Dunbar, R.; Chen, F.; Piepmeier, Jeffrey R.; Yueh, S.; Entekhabi, D.;
2017-01-01
NASA's Soil Moisture Active Passive (SMAP) mission launched on January 31, 2015 into a sun-synchronous 6 am-6 pm orbit with an objective to produce global mapping of high-resolution soil moisture and freeze-thaw state every 2-3 days. The SMAP radiometer began acquiring routine science data on March 31, 2015 and continues to operate nominally. SMAP's radiometer-derived standard soil moisture product (L2SMP) provides soil moisture estimates posted on a 36-km fixed Earth grid using brightness temperature observations and ancillary data. A beta-quality version of L2SMP was released to the public in October 2015, Version 3 validated L2SMP soil moisture data were released in May 2016, and Version 4 L2SMP data were released in December 2016. Version 4 data are processed using the same soil moisture retrieval algorithms as previous versions, but now include retrieved soil moisture from both the 6 am descending orbits and the 6 pm ascending orbits. Validation of 19 months of the standard L2SMP product was done for both AM and PM retrievals using in situ measurements from global core calibration/validation (cal/val) sites. Accuracy of the soil moisture retrievals averaged over the core sites showed that SMAP accuracy requirements are being met.
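Site-based validation of a soil moisture retrieval typically reports bias, RMSE, unbiased RMSE, and correlation against the in situ record. The sketch below computes these metrics on synthetic values; the numbers are not SMAP retrievals and the metric set is illustrative rather than the mission's formal requirement definition.

```python
# Typical site-averaged skill metrics used to check a retrieval against in situ
# soil moisture: bias, RMSE, unbiased RMSE, and correlation.  The values below
# are synthetic; they are not SMAP L2SMP retrievals or core-site measurements.
import numpy as np

retrieved = np.array([0.21, 0.25, 0.30, 0.18, 0.27, 0.33])   # m^3/m^3 (made up)
in_situ   = np.array([0.20, 0.27, 0.28, 0.17, 0.29, 0.31])   # m^3/m^3 (made up)

diff = retrieved - in_situ
bias = diff.mean()
rmse = np.sqrt((diff ** 2).mean())
ubrmse = np.sqrt(rmse ** 2 - bias ** 2)        # RMSE with the mean offset removed
corr = np.corrcoef(retrieved, in_situ)[0, 1]

print(f"bias={bias:.3f}  RMSE={rmse:.3f}  ubRMSE={ubrmse:.3f}  R={corr:.3f}")
```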
Corno, Giulia; Molinari, Guadalupe; Baños, Rosa Maria
2016-01-01
The aim of this study is to explore the psychometric properties of an affect scale, the Scale of Positive and Negative Experience (SPANE), in an Italian-speaking population. The results of this study demonstrate that the Italian version of the SPANE has psychometric properties similar to those shown by the original and previous versions, and it presents satisfactory reliability and factorial validity. The results of the Confirmatory Factor Analysis support the expected two-factor structure, positive and negative feeling, which characterized the previous versions. As expected, measures of negative affect, anxiety, negative future expectances, and depression correlated positively with the negative experiences SPANE subscale, and negatively with the positive experiences SPANE subscale. The use of this instrument provides clinically useful information about a person’s overall emotional experience and it is an indicator of well-being. Although further studies are required to confirm the psychometric characteristics of the scale, the SPANE Italian version is expected to improve theoretical and empirical research on the well-being of the Italian population.
Are Shorter Versions of the Positive and Negative Syndrome Scale (PANSS) Doable? A Critical Review.
Lindenmayer, Jean-Pierre
2017-12-01
The Positive and Negative Syndrome Scale (PANSS) is a well-established assessment tool for measuring symptom severity in schizophrenia. Researchers and clinicians have been interested in the development of a short version of the PANSS that could reduce the burden of its administration for patients and raters. The author presents a comprehensive overview of existing brief PANSS measures, including their strengths and limitations, and discusses some possible next steps. There are two available scales that offer a reduced number of original PANSS items: PANSS-14 and PANSS-19; and two shorter versions that include six items: Brief PANSS and PANSS-6. The PANSS-6 has been tested quite extensively in established trials and appears to demonstrate high sensitivity to change and an established cut off definition for remission. Prospective testing in new antipsychotic treatment trials is still required for these shorter versions of PANSS. In addition, they need to be supplemented with interview guides, as well as provide conversion formulas to translate total scores from the short PANSS versions to the PANSS-30. Both short versions of the PANSS are essentially designed to evaluate response to antipsychotic treatment. Future PANSS scale development needs to address specific measurement of treatment-responsive positive symptoms by including treatment-sensitive items, as well as illness-phase specific PANSS tools.
Douglas, Kevin S
2014-09-01
The conditional release of insanity acquittees requires decisions both about community risk level and the contextual factors that may mitigate or aggravate risk. This article discusses the potential role of the newly revised Historical-Clinical-Risk Management-20 (HCR-20, Version 3) within the conditional release context. A brief review of the structured professional judgment (SPJ) approach to violence risk assessment and management is provided. Version 2 of the HCR-20, which has been broadly adopted and evaluated, is briefly described. New features of Version 3 of the HCR-20 with particular relevance to conditional release decision-making are reviewed, including: item indicators; ratings of the relevance of risk factors to an individual's violence; risk formulation; scenario planning; and risk management planning. Version 3 of the HCR-20 includes a number of features that should assist evaluators and decision-makers to determine risk level, as well as to anticipate and specify community conditions and contexts that may mitigate or aggravate risk. Research on the HCR-20 Version 3 using approximately 800 participants across three settings (forensic psychiatric, civil psychiatric, correctional) and eight countries is reviewed. Copyright © 2014 John Wiley & Sons, Ltd.
Wang, Yingying; Holland, Scott K
2014-05-01
Comprehension of narrative stories plays an important role in the development of language skills. In this study, we compared brain activity elicited by a passive-listening version and an active-response (AR) version of a narrative comprehension task by using independent component (IC) analysis on functional magnetic resonance imaging data from 21 adolescents (ages 14-18 years). Furthermore, we explored differences in functional network connectivity engaged by two versions of the task and investigated the relationship between the online response time and the strength of connectivity between each pair of ICs. Despite similar brain region involvements in auditory, temporoparietal, and frontoparietal language networks for both versions, the AR version engages some additional network elements including the left dorsolateral prefrontal, anterior cingulate, and sensorimotor networks. These additional involvements are likely associated with working memory and maintenance of attention, which can be attributed to the differences in cognitive strategic aspects of the two versions. We found significant positive correlation between the online response time and the strength of connectivity between an IC in left inferior frontal region and an IC in sensorimotor region. An explanation for this finding is that longer reaction time indicates stronger connection between the frontal and sensorimotor networks caused by increased activation in adolescents who require more effort to complete the task.
Gross, Daniel J; Golijanin, Petar; Dumont, Guillaume D; Parada, Stephen A; Vopat, Bryan G; Reinert, Steven E; Romeo, Anthony A; Provencher, C D R Matthew T
2016-01-01
Computed tomography (CT) scans of the shoulder are often not well aligned to the axis of the scapula and glenoid. The purpose of this paper was to determine the effect of sagittal rotation of the glenoid on axial measurements of anterior-posterior (AP) glenoid width and glenoid version attained by standard CT scan. In addition, we sought to define the angle of rotation required to correct the CT scan to optimal positioning. A total of 30 CT scans of the shoulder were reformatted using OsiriX software multiplanar reconstruction. The uncorrected (UNCORR) and corrected (CORR) CT scans were compared for measurements of both (1) axial AP glenoid width and (2) glenoid version at 5 standardized axial cuts. The mean difference in glenoid version was 2.6% (2° ± 0.1°; P = .0222) and the mean difference in AP glenoid width was 5.2% (1.2 ± 0.42 mm; P = .0026) in comparing the CORR and UNCORR scans. The mean angle of correction required to align the sagittal plane was 20.1° of rotation (range, 9°-39°; standard error of mean, 1.2°). These findings demonstrate that UNCORR CT scans of the glenohumeral joint do not correct for the sagittal rotation of the glenoid, and this affects the characteristics of the axial images. Failure to align the sagittal image to the 12-o'clock to 6-o'clock axis results in measurement error in both glenoid version and AP glenoid width. Use of UNCORR CT images may have notable implications for decision-making and surgical treatment. Copyright © 2016 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.
Andrews, Suzanne; Leeman, Lawrence; Yonke, Nicole
2017-09-01
Breech presentation affects 3-4% of pregnancies at term and malpresentation is the primary indication for 10-15% of cesarean deliveries. External cephalic version is an effective intervention that can decrease the need for cesarean delivery; however, timely identification of breech presentation is required. We hypothesized that women with a fetus in a breech presentation that is diagnosed after 38 weeks' estimated gestational age have a decreased likelihood of external cephalic version attempted and an increased likelihood of cesarean delivery. This was a retrospective cohort study. A chart review was performed for 251 women with breech presentation at term presenting to our tertiary referral university hospital for external cephalic version, cesarean for breech presentation, or vaginal breech delivery. Vaginal delivery was significantly more likely (31.1% vs 12.5%; P<.01) in women with breech presentation diagnosed before 38 weeks' estimated gestational age as external cephalic version was offered, and subsequently attempted in a greater proportion of women diagnosed before 38 weeks. External cephalic version was more successful when performed by physicians with greater procedural volume during the 3.5 year period of the study (59.1% for providers performing at least 10 procedures vs 31.3% if performing fewer than 10 procedures, P<.01). Results support the need for interventions to increase timely diagnosis of breech presentation as well as improved patient counseling and use of experienced providers for external cephalic version. © 2017 Wiley Periodicals, Inc.
1991-10-28
Implementation dependencies: The following 185 tests have floating-point type declarations requiring more digits than SYSTEM.MAX_DIGITS: C24113F..Y (20 tests), C35705F..Y (20 tests), C35706F..Y (20 tests), C35707F..Y (20 tests), C35708F..Y (20 tests), C35802F..Z (21 tests), ...
2015-12-01
the MIS System/Subsystem Specification (SSS), and supplementary BAA document. On June 26, 2014, the SEI provided a draft interim report of the...findings and issues. The SEI team also received the July 3, 2014, versions of the MIS Stakeholder Requirements, MIS SSS, and build plan, and the July 17, 2014...versions of the MIS SSS together with the MIS system model. On July 14–15, 2014, the SEI presented a summary of the issues at the two contractors
Aquarius Salinity Retrieval Algorithm: Final Pre-Launch Version
NASA Technical Reports Server (NTRS)
Wentz, Frank J.; Le Vine, David M.
2011-01-01
This document provides the theoretical basis for the Aquarius salinity retrieval algorithm. The inputs to the algorithm are the Aquarius antenna temperature (T(sub A)) measurements along with a number of NCEP operational products and pre-computed tables of space radiation coming from the galaxy and sun. The output is sea-surface salinity and many intermediate variables required for the salinity calculation. This revision of the Algorithm Theoretical Basis Document (ATBD) is intended to be the final pre-launch version.
A p-version finite element method for steady incompressible fluid flow and convective heat transfer
NASA Technical Reports Server (NTRS)
Winterscheidt, Daniel L.
1993-01-01
A new p-version finite element formulation for steady, incompressible fluid flow and convective heat transfer problems is presented. The steady-state residual equations are obtained by considering a limiting case of the least-squares formulation for the transient problem. The method circumvents the Babuska-Brezzi condition, permitting the use of equal-order interpolation for velocity and pressure, without requiring the use of arbitrary parameters. Numerical results are presented to demonstrate the accuracy and generality of the method.
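For context, the least-squares construction referred to above can be stated generically; this is the standard least-squares finite element form, given here only as an illustration and not necessarily the paper's exact functional. Writing the steady incompressible equations as a first-order system with momentum residual R_mom and continuity residual R_cont, the discrete problem seeks the p-version approximation (u_h, p_h) that minimizes

J(\mathbf{u}_h, p_h) = \tfrac{1}{2} \int_{\Omega} \left( \lvert \mathbf{R}_{\mathrm{mom}}(\mathbf{u}_h, p_h) \rvert^2 + \lvert R_{\mathrm{cont}}(\mathbf{u}_h) \rvert^2 \right) \, d\Omega .

Setting the first variation of J to zero yields a symmetric, positive-definite algebraic system, which is why equal-order interpolation of velocity and pressure is admissible without satisfying the Babuska-Brezzi (inf-sup) condition and without tunable stabilization parameters.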
OSMEAN - OSCULATING/MEAN CLASSICAL ORBIT ELEMENTS CONVERSION (HP9000/7XX VERSION)
NASA Technical Reports Server (NTRS)
Guinn, J. R.
1994-01-01
OSMEAN is a sophisticated FORTRAN algorithm that converts between osculating and mean classical orbit elements. Mean orbit elements are advantageous for trajectory design and maneuver planning since they can be propagated very quickly; however, mean elements cannot describe the exact orbit at any given time. Osculating elements enable the engineer to describe an orbit exactly; however, computation costs are significantly higher due to the numerical integration procedure required for propagation. By calculating accurate conversions between osculating and mean orbit elements, OSMEAN allows the engineer to exploit the advantages of each approach for orbital trajectory design and maneuver planning. OSMEAN is capable of converting mean elements to osculating elements or vice versa. The conversion is based on modelling of all first-order aspherical and lunar-solar gravitation perturbations as well as a second-order aspherical term based on the second-degree central-body zonal perturbation. OSMEAN is written in FORTRAN 77 for HP 9000 series computers running HP-UX (NPO-18796) and DEC VAX series computers running VMS (NPO-18741). The HP version requires 388K of RAM for execution and the DEC VAX version requires 254K of RAM for execution. Sample input and output are listed in the documentation. Sample input is also provided on the distribution medium. The standard distribution medium for the HP 9000 series version is a .25 inch streaming magnetic IOTAMAT tape cartridge in UNIX tar format. It is also available on a .25 inch streaming magnetic tape cartridge in UNIX tar format or on a 3.5 inch diskette in UNIX tar format. The standard distribution medium for the DEC VAX version is a 1600 BPI 9-track magnetic tape in DEC VAX BACKUP format. It is also available on a TK50 tape cartridge in DEC VAX BACKUP format. OSMEAN was developed on a VAX 6410 in 1989, and was ported to the HP 9000 series platform in 1991. It is a copyrighted work with all copyright vested in NASA.
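As a rough illustration of why mean elements are cheap to propagate, the sketch below (Python, not OSMEAN itself, which also models lunar-solar and higher-order terms) evaluates the closed-form secular drift rates of the ascending node and argument of perigee under the dominant J2 zonal term; the constants and the sample orbit are chosen only for the example.

import math

MU = 398600.4418      # km^3/s^2, Earth gravitational parameter (assumed value)
RE = 6378.137         # km, Earth equatorial radius
J2 = 1.08262668e-3    # Earth J2 zonal coefficient

def j2_secular_rates(a_km, e, i_deg):
    """Return (dRAAN/dt, dargp/dt) in deg/day for a set of mean classical elements."""
    i = math.radians(i_deg)
    n = math.sqrt(MU / a_km**3)               # rad/s, mean motion
    p = a_km * (1.0 - e**2)                   # km, semi-latus rectum
    k = 1.5 * n * J2 * (RE / p) ** 2
    raan_dot = -k * math.cos(i)                          # nodal regression
    argp_dot = 0.5 * k * (5.0 * math.cos(i) ** 2 - 1.0)  # apsidal rotation
    to_deg_per_day = math.degrees(1.0) * 86400.0
    return raan_dot * to_deg_per_day, argp_dot * to_deg_per_day

# Example: a 700 km, near-sun-synchronous orbit; the node drifts by about +0.99 deg/day.
print(j2_secular_rates(RE + 700.0, 0.001, 98.2))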
OSMEAN - OSCULATING/MEAN CLASSICAL ORBIT ELEMENTS CONVERSION (VAX VMS VERSION)
NASA Technical Reports Server (NTRS)
Guinn, J. R.
1994-01-01
OSMEAN is a sophisticated FORTRAN algorithm that converts between osculating and mean classical orbit elements. Mean orbit elements are advantageous for trajectory design and maneuver planning since they can be propagated very quickly; however, mean elements cannot describe the exact orbit at any given time. Osculating elements enable the engineer to describe an orbit exactly; however, computation costs are significantly higher due to the numerical integration procedure required for propagation. By calculating accurate conversions between osculating and mean orbit elements, OSMEAN allows the engineer to exploit the advantages of each approach for orbital trajectory design and maneuver planning. OSMEAN is capable of converting mean elements to osculating elements or vice versa. The conversion is based on modelling of all first-order aspherical and lunar-solar gravitation perturbations as well as a second-order aspherical term based on the second-degree central-body zonal perturbation. OSMEAN is written in FORTRAN 77 for HP 9000 series computers running HP-UX (NPO-18796) and DEC VAX series computers running VMS (NPO-18741). The HP version requires 388K of RAM for execution and the DEC VAX version requires 254K of RAM for execution. Sample input and output are listed in the documentation. Sample input is also provided on the distribution medium. The standard distribution medium for the HP 9000 series version is a .25 inch streaming magnetic IOTAMAT tape cartridge in UNIX tar format. It is also available on a .25 inch streaming magnetic tape cartridge in UNIX tar format or on a 3.5 inch diskette in UNIX tar format. The standard distribution medium for the DEC VAX version is a 1600 BPI 9-track magnetic tape in DEC VAX BACKUP format. It is also available on a TK50 tape cartridge in DEC VAX BACKUP format. OSMEAN was developed on a VAX 6410 in 1989, and was ported to the HP 9000 series platform in 1991. It is a copyrighted work with all copyright vested in NASA.
NASA Technical Reports Server (NTRS)
Susskind, Joel; Blaisdell, John; Iredell, Lena
2010-01-01
AIRS was launched on EOS Aqua on May 4, 2002 together with AMSU-A and HSB to form a next generation polar orbiting infrared and microwave atmosphere sounding system (Pagano et al 2003). The theoretical approach used to analyze AIRS/AMSU/HSB data in the presence of clouds in the AIRS Science Team Version 3 at-launch algorithm, and that used in the Version 4 post-launch algorithm, have been published previously. Significant theoretical and practical improvements have been made in the analysis of AIRS/AMSU data since the Version 4 algorithm. Most of these have already been incorporated in the AIRS Science Team Version 5 algorithm (Susskind et al 2010), now being used operationally at the Goddard DISC. The AIRS Version 5 retrieval algorithm contains three significant improvements over Version 4. Improved physics in Version 5 allowed for use of AIRS clear column radiances (R(sub i)) in the entire 4.3 micron CO2 absorption band in the retrieval of temperature profiles T(p) during both day and night. Tropospheric sounding 15 micron CO2 observations were used primarily in the generation of clear column radiances (R(sub i)) for all channels. This new approach allowed for the generation of accurate Quality Controlled values of R(sub i) and T(p) under more stressing cloud conditions. Secondly, Version 5 contained a new methodology to provide accurate case-by-case error estimates for retrieved geophysical parameters and for channel-by-channel clear column radiances. Thresholds of these error estimates are used in a new approach for Quality Control. Finally, Version 5 contained for the first time an approach to provide AIRS soundings in partially cloudy conditions that does not require use of any microwave data. This new AIRS Only sounding methodology was developed as a backup to AIRS Version 5 should the AMSU-A instrument fail. Susskind et al 2010 shows that Version 5 AIRS Only soundings are only slightly degraded from the AIRS/AMSU soundings, even at large fractional cloud cover.
Application of Aeroelastic Solvers Based on Navier-Stokes Equations
NASA Technical Reports Server (NTRS)
Keith, Theo G., Jr.; Srivastava, Rakesh
1998-01-01
A pre-release version of the Navier-Stokes solver (TURBO) was obtained from MSU. Along with Dr. Milind Bakhle of the University of Toledo, subroutines for aeroelastic analysis were developed and added to the TURBO code to create versions 1 and 2 of the TURBO-AE code. For a specified mode shape, frequency, and inter-blade phase angle, the code calculates the work done by the fluid on the rotor for a prescribed sinusoidal motion. Positive work on the rotor indicates instability of the rotor. Version 1 of the code calculates the work for in-phase blade motions only. Version 2 added the capability to analyze all possible inter-blade phase angles. Version 2 of the TURBO-AE code was validated and delivered to NASA and the industry partners of the AST project. The capabilities and features of the code are summarized in Refs. [1] & [2]. To release version 2 of TURBO-AE, a workshop was organized at NASA Lewis in October 1996 by Dr. Srivastava and Dr. M. A. Bakhle, both of the University of Toledo, for the industry partners of NASA Lewis. The workshop provided potential users of TURBO-AE with the information required to prepare the input data, execute the code, interpret the results, and benchmark the code on their computer systems. After the code was delivered to the industry partners, user support was also provided. A new version of the Navier-Stokes solver (TURBO) was later released by MSU. This version had significant changes and upgrades over the previous version and was merged with the TURBO-AE code. In addition, new boundary conditions for 3-D unsteady non-reflecting boundaries were developed by researchers from UTRC, Ref. [3]. Time was spent on understanding, familiarizing with, executing, and implementing the new boundary conditions in the TURBO-AE code. Work was started on the phase-lagged (time-shifted) boundary condition version (version 4) of the code, which will allow users to calculate non-zero inter-blade phase angles using only one blade passage for analysis.
SEGY to ASCII: Conversion and Plotting Program
Goldman, Mark R.
1999-01-01
This report documents a computer program to convert standard 4-byte, IBM floating-point SEGY files to ASCII xyz format. The program then optionally plots the seismic data using the GMT plotting package. The material for this publication is contained in a standard tar file (of99-126.tar) that is uncompressed and 726 K in size. It can be downloaded to any Unix machine. Move the tar file to the directory you wish to use it in, then type 'tar xvf of99-126.tar'. The archive files (and diskette) contain a NOTE file, a README file, a version-history file, source code, a makefile for easy compilation, and an ASCII version of the documentation. The archive files (and diskette) also contain example test files, including a typical SEGY file along with the resulting ASCII xyz and postscript files. Compiling the source code into an executable requires a C++ compiler. The program has been successfully compiled using Gnu's g++ version 2.8.1; use of other compilers may require modifications to the existing source code. The g++ compiler is a free, high-quality C++ compiler and may be downloaded from the ftp site: ftp://ftp.gnu.org/gnu Plotting the seismic data requires the GMT plotting package, which may be downloaded from the web site: http://www.soest.hawaii.edu/gmt/
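As background for the conversion step described above, the sketch below (Python, not part of the USGS distribution) shows how a 4-byte IBM hexadecimal floating-point word of the kind stored in classic SEGY trace samples can be decoded into a native float before being written out as ASCII; the sample word 0x42640000 is chosen only for illustration.

import struct

def ibm32_to_float(b: bytes) -> float:
    """Decode 4 bytes in IBM System/360 hexadecimal floating-point format."""
    (word,) = struct.unpack(">I", b)          # big-endian 32-bit word
    sign = -1.0 if word & 0x80000000 else 1.0
    exponent = (word >> 24) & 0x7F            # excess-64, base-16 exponent
    fraction = word & 0x00FFFFFF              # 24-bit fraction, no hidden bit
    if fraction == 0:
        return 0.0
    return sign * (fraction / float(1 << 24)) * 16.0 ** (exponent - 64)

# Example: 0x42640000 encodes 100.0 in IBM format.
print(ibm32_to_float(bytes.fromhex("42640000")))   # -> 100.0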
Retinoblastoma—Health Professional Version
Retinoblastoma is a pediatric cancer. For patients with extraocular retinoblastoma, intensive chemotherapy is required, including high-dose chemotherapy and autologous hematopoietic stem cell rescue. Find evidence-based information on retinoblastoma treatment.
Corrosion guidelines : Version 1.0.
DOT National Transportation Integrated Search
2003-09-01
These guidelines outline the corrosion evaluation and recommendation aspects of site investigations for California Department of Transportation (Department) projects. The guidelines list the requirements for field investigations related to corrosion,...
Felisbino, Manuela Brisot; Steidle, Leila John Marques; Gonçalves-Tavares, Michelle; Pizzichini, Marcia Margaret Menezes; Pizzichini, Emilio
2014-01-01
Objective: To translate the Leicester Cough Questionnaire (LCQ) to Portuguese and adapt it for use in Brazil. Methods: Cross-cultural adaptation of a quality of life questionnaire requires a translated version that is conceptually equivalent to the original version and culturally acceptable in the target country. The protocol used consisted of the translation of the LCQ to Portuguese by three Brazilian translators who were fluent in English and its back-translation to English by another translator who was a native speaker of English and fluent in Portuguese. The back-translated version was evaluated by one of the authors of the original questionnaire in order to verify its equivalence. Later in the process, a provisional Portuguese-language version was thoroughly reviewed by an expert committee. In 10 patients with chronic cough, cognitive debriefing was carried out in order to test the understandability, clarity, and acceptability of the translated questionnaire in the target population. On that basis, the final Portuguese-language version of the LCQ was produced and approved by the committee. Results: Few items were questioned by the source author and revised by the committee of experts. During the cognitive debriefing phase, the Portuguese-language version of the LCQ proved to be well accepted and understood by all of the respondents, which demonstrates the robustness of the process of translation and cross-cultural adaptation. Conclusions: The final version of the LCQ adapted for use in Brazil was found to be easy to understand and easily applied. PMID:25029643
40 CFR 93.159 - Procedures for conformity determinations of general Federal actions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... based on the applicable air quality models, data bases, and other requirements specified in the most... applicable air quality models, data bases, and other requirements specified in the most recent version of the... data are available, such as actual stack test data from stationary sources which are part of the...
Total quality management: managing the human dimension in natural resource agencies
Denzil Verardo
1995-01-01
Stewardship in an era of dwindling human resources requires new approaches to the way business is conducted in the public sector, and Total Quality Management (TQM) can be the avenue for this transformation. Resource agencies are no exception to this requirement, although modifications to "traditional" private enterprise versions of TQM implementation...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-21
... on seaLandings, a consolidated electronic means of reporting landings and production of commercial... submitting required reports and logbooks using seaLandings. NMFS will provide a demonstration of the new version of seaLandings for at-sea catcher/processors and motherships, and training on how to submit daily...
NASA Technical Reports Server (NTRS)
2005-01-01
This document provides a study of the technical literature related to Command and Control (C2) link security for Unmanned Aircraft Systems (UAS) for operation in the National Airspace System (NAS). Included is a preliminary set of functional requirements for C2 link security.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-27
... new certification for licensees to certify that their advertising sales agreements do not discriminate... word ``as'' has been replaced with the word ``if.'' The old version stated that stations are required... advertising sales agreements do not discriminate on the basis of race or ethnicity and that all such...
DOT National Transportation Integrated Search
2000-07-14
This is a draft document for the Surface Transportation Weather Decision Support Requirements (STWDSR) project. The STWDSR project is being conducted for the FHWA's Office of Transportation Operations (HOTO) Road Weather Management Program by Mitre...
48 CFR 752.7005 - Submission requirements for development experience documents.
Code of Federal Regulations, 2011 CFR
2011-10-01
..., technologies, management, research, results and experience as outlined in the Agency's ADS Chapter 540, section... paragraph (b)(1)(i) of this clause. (2) Format. (i) Descriptive information is required for all Contractor... descriptive information: (A) Name and version of the application software used to create the file, e.g., Word...
48 CFR 752.7005 - Submission requirements for development experience documents.
Code of Federal Regulations, 2012 CFR
2012-10-01
..., technologies, management, research, results and experience as outlined in the Agency's ADS Chapter 540, section... paragraph (b)(1)(i) of this clause. (2) Format. (i) Descriptive information is required for all Contractor... descriptive information: (A) Name and version of the application software used to create the file, e.g., Word...
48 CFR 752.7005 - Submission requirements for development experience documents.
Code of Federal Regulations, 2013 CFR
2013-10-01
..., technologies, management, research, results and experience as outlined in the Agency's ADS Chapter 540, section... paragraph (b)(1)(i) of this clause. (2) Format. (i) Descriptive information is required for all Contractor... descriptive information: (A) Name and version of the application software used to create the file, e.g., Word...
48 CFR 752.7005 - Submission requirements for development experience documents.
Code of Federal Regulations, 2014 CFR
2014-10-01
..., technologies, management, research, results and experience as outlined in the Agency's ADS Chapter 540, section... paragraph (b)(1)(i) of this clause. (2) Format. (i) Descriptive information is required for all Contractor... descriptive information: (A) Name and version of the application software used to create the file, e.g., Word...
48 CFR 752.7005 - Submission requirements for development experience documents.
Code of Federal Regulations, 2010 CFR
2010-10-01
..., technologies, management, research, results and experience as outlined in the Agency's ADS Chapter 540, section... paragraph (b)(1)(i) of this clause. (2) Format. (i) Descriptive information is required for all Contractor... descriptive information: (A) Name and version of the application software used to create the file, e.g., Word...
ENERGY STAR Certified Geothermal Heat Pumps
Certified models meet all ENERGY STAR requirements as listed in the Version 3.0 ENERGY STAR Program Requirements for Geothermal Heat Pumps that are effective as of January 1, 2012. A detailed listing of key efficiency criteria is available at http://www.energystar.gov/index.cfm?c=geo_heat.pr_crit_geo_heat_pumps
ENERGY STAR Certified Commercial Hot Food Holding Cabinet
Certified models meet all ENERGY STAR requirements as listed in the Version 2.0 ENERGY STAR Program Requirements for Commercial Hot Food Holding Cabinets that are effective as of October 1, 2011. A detailed listing of key efficiency criteria is available at http://www.energystar.gov/index.cfm?c=hfhc.pr_crit_hfhc
NASA Technical Reports Server (NTRS)
Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil
2010-01-01
This users manual provides in-depth information concerning installation and execution of LAURA, version 5. LAURA is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 LAURA code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multi-physics coupling. As a result, LAURA now shares gas-physics modules, MPI modules, and other low-level modules with the FUN3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.
NASA Technical Reports Server (NTRS)
Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, William L.
2013-01-01
This users manual provides in-depth information concerning installation and execution of LAURA, version 5. LAURA is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 LAURA code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent recompilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multi-physics coupling. As a result, LAURA now shares gas-physics modules, MPI modules, and other low-level modules with the Fun3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.
NASA Technical Reports Server (NTRS)
Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil
2011-01-01
This users manual provides in-depth information concerning installation and execution of Laura, version 5. Laura is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 Laura code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multi-physics coupling. As a result, Laura now shares gas-physics modules, MPI modules, and other low-level modules with the Fun3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.
NASA Technical Reports Server (NTRS)
Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil
2009-01-01
This users manual provides in-depth information concerning installation and execution of LAURA, version 5. LAURA is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 LAURA code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multiphysics coupling. As a result, LAURA now shares gas-physics modules, MPI modules, and other low-level modules with the FUN3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.
Reference manual for a Requirements Specification Language (RSL), version 2.0
NASA Technical Reports Server (NTRS)
Fisher, Gene L.; Cohen, Gerald C.
1993-01-01
This report is a Reference Manual for a general-purpose Requirements Specification Language, RSL. The purpose of RSL is to specify precisely the external structure of a mechanized system and to define requirements that the system must meet. A system can be comprised of a mixture of hardware, software, and human processing elements. RSL is a hybrid of features found in several popular requirements specification languages and includes constructs for formal mathematical specification.
Long wavelength propagation capacity, version 1.1 (computer diskette)
NASA Astrophysics Data System (ADS)
1994-05-01
File Characteristics: software and data file (72 files); ASCII character set. Physical Description: 2 computer diskettes; 3 1/2 in.; high density; 1.44 MB. System Requirements: PC compatible; Digital Equipment Corp. VMS; PKZIP (included on diskette). This report describes a revision of the Naval Command, Control and Ocean Surveillance Center RDT&E Division's Long Wavelength Propagation Capability (LWPC). The first version of this capability was a collection of separate FORTRAN programs linked together in operation by a command procedure written for an operating system unique to the Digital Equipment Corporation (Ferguson & Snyder, 1989a, b). A FORTRAN computer program named Long Wavelength Propagation Model (LWPM) was developed to replace the VMS control system (Ferguson & Snyder, 1990; Ferguson, 1990). This was designated version 1 (LWPC-1). This program implemented all the features of the original VMS version plus a number of auxiliary programs that provided summaries of the files and graphical displays of the output files. This report describes a revision of the LWPC, designated version 1.1 (LWPC-1.1).
Flens, Gerard; Smits, Niels; Terwee, Caroline B; Dekker, Joost; Huijbrechts, Irma; Spinhoven, Philip; de Beurs, Edwin
2017-12-01
We used the Dutch-Flemish version of the USA PROMIS adult V1.0 item bank for Anxiety as input for developing a computerized adaptive test (CAT) to measure the entire latent anxiety continuum. First, psychometric analysis of a combined clinical and general population sample ( N = 2,010) showed that the 29-item bank has psychometric properties that are required for a CAT administration. Second, a post hoc CAT simulation showed efficient and highly precise measurement, with an average number of 8.64 items for the clinical sample, and 9.48 items for the general population sample. Furthermore, the accuracy of our CAT version was highly similar to that of the full item bank administration, both in final score estimates and in distinguishing clinical subjects from persons without a mental health disorder. We discuss the future directions and limitations of CAT development with the Dutch-Flemish version of the PROMIS Anxiety item bank.
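For readers unfamiliar with how a computerized adaptive test shortens administration, the sketch below illustrates the generic item-selection step: administer the not-yet-used item with maximum Fisher information at the current trait estimate. It assumes a two-parameter logistic model and made-up item parameters for brevity; the PROMIS bank is calibrated with a graded response model, so this is background rather than the study's implementation.

import math

def item_information(theta, a, b):
    """Fisher information of a 2PL item (discrimination a, difficulty b) at trait level theta."""
    p = 1.0 / (1.0 + math.exp(-a * (theta - b)))   # probability of endorsing the item
    return a * a * p * (1.0 - p)

def select_next_item(theta_hat, items, administered):
    """Pick the unadministered item with maximum information at the current estimate."""
    candidates = [i for i in range(len(items)) if i not in administered]
    return max(candidates, key=lambda i: item_information(theta_hat, *items[i]))

# Hypothetical (a, b) parameters for a tiny 5-item bank.
bank = [(1.2, -1.0), (0.8, 0.0), (1.5, 0.5), (1.1, 1.2), (0.9, -0.3)]
print(select_next_item(theta_hat=0.4, items=bank, administered={2}))   # -> 3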
SHABERTH - ANALYSIS OF A SHAFT BEARING SYSTEM (CRAY VERSION)
NASA Technical Reports Server (NTRS)
Coe, H. H.
1994-01-01
The SHABERTH computer program was developed to predict operating characteristics of bearings in a multibearing load support system. Lubricated and non-lubricated bearings can be modeled. SHABERTH calculates the loads, torques, temperatures, and fatigue life for ball and/or roller bearings on a single shaft. The program also allows for an analysis of the system reaction to the termination of lubricant supply to the bearings and other lubricated mechanical elements. SHABERTH has proven to be a valuable tool in the design and analysis of shaft bearing systems. The SHABERTH program is structured with four nested calculation schemes. The thermal scheme performs steady state and transient temperature calculations which predict system temperatures for a given operating state. The bearing dimensional equilibrium scheme uses the bearing temperatures, predicted by the temperature mapping subprograms, and the rolling element raceway load distribution, predicted by the bearing subprogram, to calculate bearing diametral clearance for a given operating state. The shaft-bearing system load equilibrium scheme calculates bearing inner ring positions relative to the respective outer rings such that the external loading applied to the shaft is brought into equilibrium by the rolling element loads which develop at each bearing inner ring for a given operating state. The bearing rolling element and cage load equilibrium scheme calculates the rolling element and cage equilibrium positions and rotational speeds based on the relative inner-outer ring positions, inertia effects, and friction conditions. The ball bearing subprograms in the current SHABERTH program have several model enhancements over similar programs. These enhancements include an elastohydrodynamic (EHD) film thickness model that accounts for thermal heating in the contact area and lubricant film starvation; a new model for traction combined with an asperity load sharing model; a model for the hydrodynamic rolling and shear forces in the inlet zone of lubricated contacts, which accounts for the degree of lubricant film starvation; modeling normal and friction forces between a ball and a cage pocket, which account for the transition between the hydrodynamic and elastohydrodynamic regimes of lubrication; and a model of the effect on fatigue life of the ratio of the EHD plateau film thickness to the composite surface roughness. SHABERTH is intended to be as general as possible. The models in SHABERTH allow for the complete mathematical simulation of real physical systems. Systems are limited to a maximum of five bearings supporting the shaft, a maximum of thirty rolling elements per bearing, and a maximum of one hundred temperature nodes. The SHABERTH program structure is modular and has been designed to permit refinement and replacement of various component models as the need and opportunities develop. A preprocessor is included in the IBM PC version of SHABERTH to provide a user friendly means of developing SHABERTH models and executing the resulting code. The preprocessor allows the user to create and modify data files with minimal effort and a reduced chance for errors. Data is utilized as it is entered; the preprocessor then decides what additional data is required to complete the model. Only this required information is requested. The preprocessor can accommodate data input for any SHABERTH compatible shaft bearing system model. The system may include ball bearings, roller bearings, and/or tapered roller bearings. 
SHABERTH is written in FORTRAN 77, and two machine versions are available from COSMIC. The CRAY version (LEW-14860) has a RAM requirement of 176K of 64 bit words. The IBM PC version (MFS-28818) is written for IBM PC series and compatible computers running MS-DOS, and includes a sample MS-DOS executable. For execution, the PC version requires at least 1Mb of RAM and an 80386 or 486 processor machine with an 80x87 math co-processor. The standard distribution medium for the IBM PC version is a set of two 5.25 inch 360K MS-DOS format diskettes. The contents of the diskettes are compressed using the PKWARE archiving tools. The utility to unarchive the files, PKUNZIP.EXE, is included. The standard distribution medium for the CRAY version is also a 5.25 inch 360K MS-DOS format diskette, but alternate distribution media and formats are available upon request. The original version of SHABERTH was developed in FORTRAN IV at Lewis Research Center for use on a UNIVAC 1100 series computer. The Cray version was released in 1988, and was updated in 1990 to incorporate fluid rheological data for Rocket Propellant 1 (RP-1), thereby allowing the analysis of bearings lubricated with RP-1. The PC version is a port of the 1990 CRAY version and was developed in 1992 by SRS Technologies under contract to NASA Marshall Space Flight Center.
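For context on the fatigue-life output mentioned above, the sketch below shows only the textbook basic rating life relation (the Lundberg-Palmgren/ISO form). SHABERTH's life model is considerably more detailed, adjusting life for the ratio of EHD film thickness to surface roughness among other factors, so this is background rather than the program's method; the numbers in the example are made up.

def l10_life_mrev(dynamic_capacity_c, equivalent_load_p, ball_bearing=True):
    """Basic rating life in millions of revolutions: L10 = (C/P)^p, p = 3 for balls, 10/3 for rollers."""
    exponent = 3.0 if ball_bearing else 10.0 / 3.0
    return (dynamic_capacity_c / equivalent_load_p) ** exponent

def l10_life_hours(c, p, rpm, ball_bearing=True):
    """Convert basic rating life to operating hours at a given shaft speed."""
    return l10_life_mrev(c, p, ball_bearing) * 1.0e6 / (rpm * 60.0)

# Example: a ball bearing with C = 30 kN carrying P = 5 kN at 10,000 rpm.
print(l10_life_hours(c=30_000.0, p=5_000.0, rpm=10_000.0))   # -> 360.0 hours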
INDIPAY Financial Data Request Forms
The INDIPAY financial data request form requires the individual to provide financial information to support their claim of inability to pay the civil penalty. Both an English and a Spanish version are provided.
Abdin, Edimansyah; Vaingankar, Janhavi Ajit; Picco, Louisa; Chua, Boon Yiang; Prince, Martin; Chong, Siow Ann; Subramaniam, Mythily
2017-04-21
To validate the short version of the 10/66 dementia diagnosis against the standard version of the 10/66 dementia diagnosis and clinical diagnosis and examine concurrent validity with the World Health Organisation Disability Assessment Schedule and care needs in a multiethnic Asian older adult population in Singapore. Data from the Well-being of the Singapore Elderly study, a nationally representative survey of the older Singapore Resident population aged 60 years and above, was used. The validity of the short version of the 10/66 dementia diagnostic criteria derived from the Community Screening Instrument for Dementia, the modified Consortium to Establish a Registry of Alzheimer's Disease 10-word list delayed recall and the EURO-D depression screen was examined against the standard version of the 10/66 dementia diagnosis and clinician diagnosis as a gold standard. Concurrent validity was tested by examining the relationships between the short version 10/66 dementia diagnosis, disability and care needs. A total of 2373 respondents who had completed data on the short version diagnosis were included in this study. The majority (82.63%) of respondents were of Chinese descent, 9.86% were Malays, 6.12% were of Indian descent and 1.39% belonged to other ethnic groups. We found the short version 10/66 dementia diagnosis showed almost perfect agreement with the standard version 10/66 dementia diagnosis (kappa = 0.90, AUC = 0.96) and substantial agreement with clinical diagnosis (kappa = 0.70, AUC = 0.87). The weighted prevalence of dementia in the population was slightly higher based on the short version diagnosis than the standard version diagnosis (10.74% vs. 10.04%). We also found that those with the short version 10/66 dementia were significantly associated with higher disability (β = 28.90, 95% CI = 23.62, 9.62) and needed care occasionally (OR = 35.21, 95% CI = 18.08, 68.59) or much of the time (OR = 9.02, 95% CI = 5.21, 15.61). The study found that the short version 10/66 dementia diagnosis has excellent validity to diagnose dementia in a multiethnic Asian population in Singapore. Further research is required to determine the usefulness of this diagnosis in clinical practice or institutional settings to aid early detection and intervention for dementia.
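The agreement statistic reported above (kappa = 0.90 against the standard 10/66 diagnosis) is Cohen's kappa; the sketch below shows the computation on a made-up 2x2 agreement table, purely to make the statistic concrete.

def cohens_kappa(a, b, c, d):
    """Kappa from a 2x2 table: a = both positive, b and c = disagreements, d = both negative."""
    n = a + b + c + d
    p_observed = (a + d) / n
    p_chance = ((a + b) / n) * ((a + c) / n) + ((c + d) / n) * ((b + d) / n)
    return (p_observed - p_chance) / (1.0 - p_chance)

# Hypothetical counts for two diagnostic rules applied to 2373 respondents.
print(round(cohens_kappa(a=200, b=15, c=10, d=2148), 3))   # -> 0.935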
Haunsberger, Stefan J; Connolly, Niamh M C; Prehn, Jochen H M
2017-02-15
The miRBase database is the central and official repository for miRNAs and the current release is miRBase version 21.0. Name changes in different miRBase releases cause inconsistencies in miRNA names from version to version. When working with only a small number of miRNAs the translation can be done manually. However, with large sets of miRNAs, the necessary correction of such inconsistencies becomes burdensome and error-prone. We developed miRNAmeConverter, available as a Bioconductor R package and web interface, that addresses the challenges associated with mature miRNA name inconsistencies. The main algorithm implemented enables high-throughput automatic translation of species-independent mature miRNA names to user-selected miRBase versions. The web interface enables users less familiar with R to translate miRNA names given in the form of a list or embedded in text and to download the results. The miRNAmeConverter R package is open source under the Artistic-2.0 license. It is freely available from Bioconductor (http://bioconductor.org/packages/miRNAmeConverter). The web interface is based on R Shiny and can be accessed under the URL http://www.systemsmedicineireland.ie/tools/mirna-name-converter/. The database that miRNAmeConverter depends on is provided by the annotation package miRBaseVersions.db and can be downloaded from Bioconductor (http://bioconductor.org/packages/miRBaseVersions.db). Minimum R version 3.3.0 is required. stefanhaunsberger@rcsi.ie. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
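To make the name-inconsistency problem concrete, the toy sketch below translates a mature miRNA name between two miRBase releases using a hand-written alias table; the real package instead queries the bundled miRBaseVersions.db annotation database, and the accessions and names in the table are illustrative assumptions.

from typing import Optional

# Hypothetical alias table: each mature miRNA is keyed by a stable accession,
# with the name it carried in two different miRBase releases.
ALIASES = {
    "MIMAT0000062": {"17.0": "hsa-let-7a", "21.0": "hsa-let-7a-5p"},
    "MIMAT0000063": {"17.0": "hsa-let-7b", "21.0": "hsa-let-7b-5p"},
}

def translate(name: str, from_version: str, to_version: str) -> Optional[str]:
    """Translate one mature miRNA name between miRBase releases, if the table knows it."""
    for versions in ALIASES.values():
        if versions.get(from_version) == name:
            return versions.get(to_version)
    return None   # name not present in the (toy) table

print(translate("hsa-let-7a", "17.0", "21.0"))   # -> hsa-let-7a-5p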
Henninger, Heath B; Barg, Alexej; Anderson, Andrew E; Bachus, Kent N; Tashjian, Robert Z; Burks, Robert T
2012-04-01
No clear recommendations exist regarding optimal humeral component version and deltoid tension in reverse total shoulder arthroplasty (TSA). A biomechanical shoulder simulator tested humeral versions (0°, 10°, 20° retroversion) and implant thicknesses (-3, 0, +3 mm from baseline) after reverse TSA in human cadavers. Abduction and external rotation ranges of motion as well as abduction and dislocation forces were quantified for native arms and arms implanted with 9 combinations of humeral version and implant thickness. Resting abduction angles increased significantly (up to 30°) after reverse TSA compared with native shoulders. With constant posterior cuff loads, native arms externally rotated 20°, whereas no external rotation occurred in implanted arms (20° net internal rotation). Humeral version did not affect rotational range of motion but did alter resting abduction. Abduction forces decreased 30% vs native shoulders but did not change when version or implant thickness was altered. Humeral center of rotation was shifted 17 mm medially and 12 mm inferiorly after implantation. The force required for lateral dislocation was 60% less than anterior and was not affected by implant thickness or version. Reverse TSA reduced abduction forces compared with native shoulders and resulted in limited external rotation and abduction ranges of motion. Because abduction force was reduced for all implants, the choice of humeral version and implant thickness should focus on range of motion. Lateral dislocation forces were less than anterior forces; thus, levering and inferior/posterior impingement may be a more probable basis for dislocation (laterally) than anteriorly directed forces. Copyright © 2012 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Mosby, Inc. All rights reserved.
Zitser, Jennifer; Peretz, Chava; Ber David, Aya; Shabtai, Herzl; Ezra, Adi; Kestenbaum, Meir; Brozgol, Marina; Rosenberg, Alina; Herman, Talia; Balash, Yakov; Gadoth, Avi; Thaler, Avner; Stebbins, Glenn T; Goetz, Christopher G; Tilley, Barbara C; Luo, Sheng T; Liu, Yuanyuan; Giladi, Nir; Gurevich, Tanya
2017-12-01
The Movement Disorders Society (MDS) published the new English-language Unified Parkinson's Disease Rating Scale (MDS-UPDRS) as the official benchmark scale for Parkinson's disease (PD) in 2008. We aimed to validate the Hebrew version of the MDS-UPDRS, explore its dimensionality, and compare it to the original English version. The MDS-UPDRS questionnaire was translated to Hebrew and was tested on 389 patients with PD treated at the Movement Disorders Unit at Tel-Aviv Medical Center. The MDS-UPDRS is made up of four sections; the higher the score, the worse the patient's clinical condition. Confirmatory and exploratory factor analyses were applied to determine whether the factor structure of the English version could be confirmed in the Hebrew version. The Hebrew version of the MDS-UPDRS showed satisfactory clinimetric properties. The internal consistency of the Hebrew version was satisfactory, with Cronbach's alpha values of 0.79, 0.90, 0.93, and 0.80 for parts 1 to 4, respectively. In the confirmatory factor analysis, all four parts had a high (greater than 0.90) comparative fit index (CFI) in comparison to the original English MDS-UPDRS factor structure (0.96, 0.99, 0.94, and 1.00, respectively), thus confirming the pre-specified English factor structure. Exploratory factor analysis showed that the Hebrew responses differed from the English ones only within an acceptable range: in isolated item-level differences in factor structure and in a few items cross-loading on multiple factors. The Hebrew version of the MDS-UPDRS meets the requirements to be designated as the Official Hebrew Version of the MDS-UPDRS. Copyright © 2017 Elsevier Ltd. All rights reserved.
Aktas, Emine; Esin, Melek Nihal
2016-03-01
Occupational skin diseases (OSDs) represent 10-40% of all occupational diseases in many industrialized countries. Young workers are frequently exposed to toxic substances and chemicals in the workplace. The occupational conditions of young workers can impose a high level of risk for the occurrence of OSDs. The Nordic Occupational Skin Questionnaire (NOSQ-2002) was developed in English as a new, comprehensive, standardized tool with which to screen for OSDs. The purpose of this study was to translate the NOSQ-2002 into Turkish and to culturally adapt the long version of the instrument for use with young workers in jobs with high risk for the occurrence of OSDs. Forward and back translations were carried out. Problematic items were modified until the Turkish-language version achieved a satisfactory consensus with the original version of the NOSQ-2002. The final Turkish version was tested in 40 randomly selected young workers with and without OSDs who were studying in the fields of hairdressing, jewelry making, and car mechanics at vocational training schools run by the National Education Ministry. When the original questionnaire had been translated into the target language, a first consensus version was evaluated by an expert panel. The expert panel determined that 36 questions (63.2%) in the Turkish version required some level of modification in order to facilitate clear understanding. Cognitive interviews were then performed. After some modification, the final Turkish version was established and tested among young workers. The new Turkish version of the NOSQ is a comprehensible, reliable, and useful tool that can be applied to young workers in specific occupations. © 2015 The International Society of Dermatology.
Advanced coal gasifier-fuel cell power plant systems design
NASA Technical Reports Server (NTRS)
Heller, M. E.
1983-01-01
Two advanced, high-efficiency coal-fired power plants were designed, one utilizing a phosphoric acid fuel cell and one utilizing a molten carbonate fuel cell. Both incorporate a TRW Catalytic Hydrogen Process gasifier and regenerator. Both plants operate without an oxygen plant and without requiring water feed; they instead require makeup dolomite. Neither plant requires a shift converter; neither plant has heat exchangers operating above 1250 F. Both plants have attractive efficiencies and costs. While the molten carbonate version has a higher efficiency (52%) than the phosphoric acid version (48%), it also has a higher ten-year levelized cost of electricity ($0.078/kWh versus $0.072/kWh). The phosphoric acid fuel cell power plant is probably feasible to build in the near term; questions about the TRW process need to be answered experimentally, such as whether it can operate on caking coals and how effective the catalyzed carbon-dioxide acceptor will be at pilot scale, both in removing carbon dioxide and in removing sulfur from the gasifier.
Development of a new version of the Vehicle Protection Factor Code (VPF3)
NASA Astrophysics Data System (ADS)
Jamieson, Terrance J.
1990-10-01
The Vehicle Protection Factor (VPF) Code is an engineering tool for estimating radiation protection afforded by armoured vehicles and other structures exposed to neutron and gamma ray radiation from fission, thermonuclear, and fusion sources. A number of suggestions for modifications have been offered by users of early versions of the code. These include: implementing some of the more advanced features of the air transport rating code, ATR5, used to perform the air over ground radiation transport analyses; allowing the ability to study specific vehicle orientations within the free field; implementing an adjoint transport scheme to reduce the number of transport runs required; investigating the possibility of accelerating the transport scheme; and upgrading the computer automated design (CAD) package used by VPF. The generation of radiation free field fluences for infinite air geometries as required for aircraft analysis can be accomplished by using ATR with the air over ground correction factors disabled. Analysis of the effects of fallout bearing debris clouds on aircraft will require additional modelling of VPF.
Examining an emotion enhancement effect in working memory: evidence from age-related differences.
Mammarella, Nicola; Borella, Erika; Carretti, Barbara; Leonardi, Gloria; Fairfield, Beth
2013-01-01
The aim of the present study was to examine age-related differences between young, young-old and old-old adults in an affective version of the classical Working Memory Operation Span Test. The affective version of the Working Memory Operation Span Test included neutral words (as in the classical version) as well as negative and positive ones. Results showed that while young adults performed better than the young-old and old-old with neutral words, age-related differences between young and young-old with positive words were no longer significant, and age-related differences were nullified with negative ones. Altogether, results indicate that emotional words can reduce age-related decline when maintenance and manipulation of information in working memory in older adults are required.
Konkolÿ Thege, Barna; Ham, Elke; Ball, Laura C
2017-12-01
Recovery is understood as living a life with hope, purpose, autonomy, productivity, and community engagement despite a mental illness. The aim of this study was to provide further information on the psychometric properties of the Person-in-Recovery and Provider versions of the Revised Recovery Self-Assessment (RSA-R), a widely used measure of recovery orientation. Data from 654 individuals were analyzed, 519 of whom were treatment providers (63.6% female), while 135 were inpatients (10.4% female) of a Canadian tertiary-level psychiatric hospital. Confirmatory and exploratory techniques were used to investigate the factor structure of both versions of the instrument. Results of the confirmatory factor analyses showed that none of the four theoretically plausible models fit the data well. Principal component analyses could not replicate the structure obtained by the scale developers either and instead resulted in a five-component solution for the Provider and a four-component solution for the Person-in-Recovery version. When considering the results of a parallel analysis, the number of components to retain dropped to two for the Provider version and one for the Person-in-Recovery version. We can conclude that the RSA-R requires further revision to become a psychometrically sound instrument for assessing recovery-oriented practices in an inpatient mental health-care setting.
Holland, Scott K.
2014-01-01
Comprehension of narrative stories plays an important role in the development of language skills. In this study, we compared brain activity elicited by a passive-listening version and an active-response (AR) version of a narrative comprehension task by using independent component (IC) analysis on functional magnetic resonance imaging data from 21 adolescents (ages 14–18 years). Furthermore, we explored differences in functional network connectivity engaged by two versions of the task and investigated the relationship between the online response time and the strength of connectivity between each pair of ICs. Despite similar brain region involvements in auditory, temporoparietal, and frontoparietal language networks for both versions, the AR version engages some additional network elements including the left dorsolateral prefrontal, anterior cingulate, and sensorimotor networks. These additional involvements are likely associated with working memory and maintenance of attention, which can be attributed to the differences in cognitive strategic aspects of the two versions. We found significant positive correlation between the online response time and the strength of connectivity between an IC in left inferior frontal region and an IC in sensorimotor region. An explanation for this finding is that longer reaction time indicates stronger connection between the frontal and sensorimotor networks caused by increased activation in adolescents who require more effort to complete the task. PMID:24689887
A requirements index for information processing in hospitals.
Ammenwerth, E; Buchauer, A; Haux, R
2002-01-01
Reference models describing typical information processing requirements in hospitals do not currently exist. This leads to high hospital information system (HIS) management expenses, for example, during tender processes for the acquisition of software application programs. Our aim was, therefore, to develop a comprehensive, lasting, technology-independent, and sufficiently detailed index of requirements for information processing in hospitals in order to reduce respective expenses. Two dozen German experts established an index of requirements for information processing in university hospitals. This was done in a consensus-based, top-down, cyclic manner. Each functional requirement was derived from information processing functions and sub-functions of a hospital. The result is the first official German version of a requirements index, containing 233 functional requirements and 102 function-independent requirements, focusing on German needs. The functional requirements are structured according to the primary care process from admission to discharge and supplemented by requirements for handling patient records, work organization and resource planning, hospital management, research and education. Both the German version and its English translation are available on the Internet. The index of requirements contains general information processing requirements in hospitals which are formulated independently of information processing tools or of HIS architectures. It aims at supporting HIS management, especially HIS strategic planning, HIS evaluation, and tender processes. The index can be regarded as a draft, which must, however, be refined according to the specific aims of a particular project. Although focused on German needs, we expect that it can also be useful in other countries. The high level of interest shown in the index supports its usefulness.
CMMI(Registered) for Services, Version 1.3
2010-11-01
[ISO 2008b] ISO/IEC 27001:2005 Information technology – Security techniques – Information Security Management Systems – Requirements [ISO/IEC 2005...Commission. ISO/IEC 27001 Information Technology – Security Techniques – Information Security Management Systems – Requirements, 2005. http...CMM or International Organization for Standardization (ISO) 9001, you will immediately recognize many similarities in their structure and content
18 CFR 34.7 - Number of copies to be filed.
Code of Federal Regulations, 2010 CFR
2010-04-01
..., § 34.7 was revised, effective at the time of the next e-filing release during the Commission's next fiscal year. For the convenience of the user, the revised text follows: § 34.7 Filing requirements. Each...) and (2) of this chapter. As a qualified document, no paper copy version of the filing is required...
77 FR 10950 - Airworthiness Directives; General Electric Company (GE) Turbofan Engines
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-24
...) for all GE CF6-80C2B series turbofan engines. That AD currently requires installing software version 8.... This new AD requires the removal of the affected ECUs from service. This AD was prompted by two reports... ECUs from service. Comments We gave the public the opportunity to participate in developing this AD...
40 CFR 63.7842 - What records must I keep?
Code of Federal Regulations, 2010 CFR
2010-07-01
..., performance evaluations, and opacity observations as required in § 63.10(b)(2)(viii). (b) For each COMS, you... in § 63.10(b)(2)(vi) through (xi). (2) Monitoring data for a performance evaluation as required in § 63.6(h)(7)(i) and (ii). (3) Previous (that is, superceded) versions of the performance evaluation...
Atmospheric Science Data Center
2018-06-25
... several atmospheric quantities including cloud mask and aerosol optical depth (AOD) required for atmospheric correction. The parameters ... Project Title: DSCOVR; Discipline: Aerosol, Clouds; Version: V1; Level: L2 ...
Photovoltaic Module Encapsulation Design and Materials Selection, Volume 1, Abridged
NASA Technical Reports Server (NTRS)
Cuddihy, E. F.
1982-01-01
This is a summary version of Volume 1, presenting the basic encapsulation systems, their purposes and requirements, and the characteristics of the most promising candidate systems and materials, as identified and evaluated by the Flat-Plate Solar Array Project. In this summary version, considerable detail and much supporting and experimental information have necessarily been omitted. A reader interested in references and literature citations, and in more detailed information on specific topics, should consult Reference 1, JPL Document No. 5101-177, JPL Publication 81-102, DOE/JPL-1012-60 (JPL), June 1, 1982.
Development of a Mars Surface Imager
NASA Technical Reports Server (NTRS)
Squyres, Steve W.
1994-01-01
The Mars Surface Imager (MSI) is a multispectral, stereoscopic, panoramic imager that allows imaging of the full scene around a Mars lander from the lander body to the zenith. It has two functional components: panoramic imaging and sky imaging. In the most recent version of the MSI, called PIDDP-cam, a very long multi-line color CCD, an innovative high-performance drive system, and a state-of-the-art wavelet image compression code have been integrated into a single package. The requirements for the flight version of the MSI and the current design are presented.
JANIS 4: An Improved Version of the NEA Java-based Nuclear Data Information System
NASA Astrophysics Data System (ADS)
Soppera, N.; Bossant, M.; Dupont, E.
2014-06-01
JANIS is software developed to facilitate the visualization and manipulation of nuclear data, giving access to evaluated data libraries, and to the EXFOR and CINDA databases. It is stand-alone Java software, downloadable from the web and distributed on DVD. Used offline, the system also makes use of an internet connection to access the NEA Data Bank database. It is now also offered as a full web application, only requiring a browser. The features added in the latest version of the software and this new web interface are described.
DoD Needs to Reinitiate Migration to Internet Protocol Version 6 (REDACTED)
2014-12-01
whether DoD was effectively migrating to Internet Protocol Version 6 (IPv6). Finding: Although DoD satisfied the requirement to demonstrate IPv6 on the...enterprise network to IPv6. This occurred because: • DoD Chief Information Officer (CIO) and U.S. Cyber Command (USCYBERCOM) did not make IPv6 a...resources to further DoD-wide transition toward IPv6; and • DoD CIO did not have a current plan of action and milestones to advance DoD IPv6 migration
JANIS 4: An Improved Version of the NEA Java-based Nuclear Data Information System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soppera, N., E-mail: nicolas.soppera@oecd.org; Bossant, M.; Dupont, E.
JANIS is software developed to facilitate the visualization and manipulation of nuclear data, giving access to evaluated data libraries, and to the EXFOR and CINDA databases. It is stand-alone Java software, downloadable from the web and distributed on DVD. Used offline, the system also makes use of an internet connection to access the NEA Data Bank database. It is now also offered as a full web application, only requiring a browser. The features added in the latest version of the software and this new web interface are described.
NASA Technical Reports Server (NTRS)
Gladden, Roy E.; Khanampornpan, Teerapat; Fisher, Forest W.
2010-01-01
Version 5.0 of the AutoGen software has been released. Previous versions, variously denoted Autogen and autogen, were reported in two articles: Automated Sequence Generation Process and Software (NPO-30746), Software Tech Briefs (Special Supplement to NASA Tech Briefs), September 2007, page 30, and Autogen Version 2.0 (NPO-41501), NASA Tech Briefs, Vol. 31, No. 10 (October 2007), page 58. To recapitulate: AutoGen (now signifying automatic sequence generation) automates the generation of sequences of commands in a standard format for uplink to spacecraft. AutoGen requires fewer workers than are needed for older manual sequence-generation processes, and greatly reduces sequence-generation times. The sequences are embodied in spacecraft activity sequence files (SASFs). AutoGen automates generation of SASFs by use of another previously reported program called APGEN. AutoGen encodes knowledge of different mission phases and of how the resultant commands must differ among the phases. AutoGen also provides means for customizing sequences through use of configuration files. The approach followed in developing AutoGen has involved encoding the behaviors of a system into a model and encoding algorithms for context-sensitive customizations of the modeled behaviors. This version of AutoGen addressed the Mars Reconnaissance Orbiter (MRO) primary science phase (PSP), which on previous Mars missions has more commonly been referred to as the mapping phase. This version addressed the unique aspects of sequencing orbital operations and, specifically, the mission-specific adaptation of orbital operations for MRO. This version also includes capabilities for MRO's role in Mars relay support for UHF relay communications with the MER rovers and the Phoenix lander.
Towards seamless workflows in agile data science
NASA Astrophysics Data System (ADS)
Klump, J. F.; Robertson, J.
2017-12-01
Agile workflows are a response to projects with requirements that may change over time. They prioritise rapid and flexible responses to change, preferring to adapt to changes in requirements rather than predict them before a project starts. This suits the needs of research very well because research is inherently agile in its methodology. The adoption of agile methods has made collaborative data analysis much easier in a research environment fragmented across institutional data stores, HPC, personal and lab computers and more recently cloud environments. Agile workflows use tools that share a common worldview: in an agile environment, there may be more than one valid version of data, code or environment in play at any given time. All of these versions need references and identifiers. For example, a team of developers following the git-flow conventions (github.com/nvie/gitflow) may have several active branches, one for each strand of development. These workflows allow rapid and parallel iteration while maintaining identifiers pointing to individual snapshots of data and code and allowing rapid switching between strands. In contrast, the current focus of versioning in research data management is geared towards managing data for reproducibility and long-term preservation of the record of science. While both are important goals in the persistent curation domain of the institutional research data infrastructure, current tools emphasise planning over adaptation and can introduce unwanted rigidity by insisting on a single valid version or point of truth. In the collaborative curation domain of a research project, things are more fluid. However, there is no equivalent to the "versioning iso-surface" of the git protocol for the management and versioning of research data. At CSIRO we are developing concepts and tools for the agile management of software code and research data for virtual research environments, based on our experiences of actual data analytics projects in the geosciences. We use code management that allows researchers to interact with the code through tools like Jupyter Notebooks while data are held in an object store. Our aim is an architecture allowing seamless integration of code development, data management, and data processing in virtual research environments.
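The abstract above turns on giving every snapshot of data and code its own identifier. A minimal sketch of that idea, assuming nothing about CSIRO's actual tooling: record a content hash of a data file next to the git commit of the code that produced it (file names and manifest layout here are illustrative only).

```python
# Minimal sketch (not CSIRO's implementation): tie a data snapshot to the code
# version that produced it by recording a content hash alongside the current
# git commit. File names and manifest layout are illustrative assumptions.
import hashlib
import json
import subprocess
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Content-based identifier for a data snapshot."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def current_commit() -> str:
    """Identifier for the code version (assumes the script runs inside a git repo)."""
    return subprocess.check_output(["git", "rev-parse", "HEAD"], text=True).strip()

def write_manifest(data_file: Path, manifest: Path) -> None:
    record = {
        "data_file": str(data_file),
        "data_sha256": sha256_of(data_file),
        "code_commit": current_commit(),
    }
    manifest.write_text(json.dumps(record, indent=2))

if __name__ == "__main__":
    write_manifest(Path("observations.csv"), Path("snapshot_manifest.json"))
```

Anything that reproduces the analysis can then cite the pair (data_sha256, code_commit) rather than a mutable file path.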
Cultural adaptation of a pediatric functional assessment for rehabilitation outcomes research.
Arestad, Kristen E; MacPhee, David; Lim, Chun Y; Khetani, Mary A
2017-09-15
Significant racial and ethnic health care disparities experienced by Hispanic children with special health care needs (CSHCN) create barriers to enacting culturally competent rehabilitation services. One way to minimize the impact of disparities in rehabilitation is to equip practitioners with culturally relevant functional assessments to accurately determine service needs. Current approaches to culturally adapting assessments have three major limitations: use of inconsistent translation processes; current processes assess for some, but not all, elements of cultural equivalence; and limited evidence to guide decision making about whether to undertake cultural adaptation with and without language translation. The aims of this observational study are (a) to examine similarities and differences of culturally adapting a pediatric functional assessment with and without language translation, and (b) to examine the feasibility of cultural adaptation processes. The Young Children's Participation and Environment Measure (YC-PEM), a pediatric functional assessment, underwent cultural adaptation (i.e., language translation and cognitive testing) to establish Spanish and English pilot versions for use by caregivers of young CSHCN of Mexican descent. Following language translation to develop a Spanish YC-PEM pilot version, 7 caregivers (4 Spanish-speaking; 3 English-speaking) completed cognitive testing to inform decisions regarding content revisions to English and Spanish YC-PEM versions. Participant responses were content coded to established cultural equivalencies. Coded data were summed to draw comparisons on the number of revisions needed to achieve cultural equivalence between the two versions. Feasibility was assessed according to process data and data quality. Results suggest more revisions are required to achieve cultural equivalence for the translated (Spanish) version of the YC-PEM. However, issues around how the participation outcome is conceptualized were identified in both versions. Feasibility results indicate that language translation processes require high resource investment, but may increase translation quality. However, use of questionnaires versus interview methods for cognitive testing may have limited data saturation. Results lend preliminary support to the need for and feasibility of cultural adaptation with and without language translation. Results inform decisions surrounding cultural adaptations with and without language translation and thereby enhance cultural competence and quality assessment of healthcare need within pediatric rehabilitation.
Andersson, J E; Odén, A
2001-08-01
The aim of this study was to evaluate the frequency and type of hip-joint instability and the frequency of hip dislocation requiring treatment in neonates who had been lying in the breech presentation and were delivered vaginally after an external version or by caesarean section, and to compare them with neonates who were naturally in the vertex presentation. Breech presentations without ongoing labour were subjected to an attempted external version and, in cases where this proved unsuccessful or where labour had started, to delivery by caesarean section. None of the breech presentations was vaginally delivered. The anterior-dynamic ultrasound method was used to assess the hip-joint status of the neonates. Out of 6,571 foetuses, 257 were in breech presentation after 36 wk of pregnancy. Sixty-two were vaginally delivered following an external version to vertex presentation and 195 were delivered by caesarean section, 75 of these following unsuccessful attempts to perform a version. Treatment for congenital hip-joint dislocation was performed on 0.2%. Out of the breech presentations, 1.0% of those delivered by caesarean section were treated, while in those with vaginal delivery following an external version the treatment frequency was 3.2%. No case of late diagnosed hip dislocation was recorded. Significant differences in frequency of hip-joint instability and treatment were found between (i) neonates delivered in breech presentation and those delivered with vertex presentation, (ii) infants delivered in vertex presentation, naturally or after successful version, and (iii) those delivered by caesarean section with or without attempted external version and those delivered with vertex presentation. Breech presentation predisposes to increased hip instability. The instability is present prior to delivery and is certainly not a primary result of delivery forces. Both breech and vertex presentations following an external or spontaneous version should be considered as risk factors for neonatal hip instability.
Validation Test Report for the Automated Optical Processing System (AOPS) Version 4.8
2013-06-28
be familiar with UNIX; BASH shell programming; and remote sensing, particularly regarding computer processing of satellite data. The system memory...and storage requirements are difficult to gauge. The amount of memory needed is dependent upon the amount and type of satellite data you wish to...process; the larger the area, the larger the memory requirement. For example, the entire Atlantic Ocean will require more processing power than the
PDF Version of EPA Communication Product Standards Stylebook
This stylebook provides style and format guidance for most media, including print documents, audiovisual, broadcast, presentation and exhibit work. Also find templates and samples, copyright requirements, publishing information, and logo use standards.
Surface Transportation Weather Decision Support Requirements - Executive Summary, Version 1.0
DOT National Transportation Integrated Search
1999-12-16
Weather: it affects the visibility, tractability, maneuverability, vehicle stability, exhaust emissions and structural integrity of the surface transportation system. Thereby weather affects the safety, mobility, productivity and environmental impact...
Cognitive Load Mediates the Effect of Emotion on Analytical Thinking.
Trémolière, Bastien; Gagnon, Marie-Ève; Blanchette, Isabelle
2016-11-01
Although the detrimental effect of emotion on reasoning has been demonstrated many times, the cognitive mechanism underlying this effect remains unclear. In the present paper, we explore the cognitive load hypothesis as a potential explanation. In an experiment, participants solved syllogistic reasoning problems with either neutral or emotional contents. Participants were also presented with a secondary task, whose difficult version requires the mobilization of cognitive resources to be solved correctly. Participants performed overall worse and took longer on emotional problems than on neutral problems. Performance on the secondary task, in the difficult version, was poorer when participants were reasoning about emotional, compared with neutral, contents, consistent with the idea that processing emotion requires more cognitive resources. Taken together, the findings afford evidence that the deleterious effect of emotion on reasoning is mediated by cognitive load.
LARCRIM user's guide, version 1.0
NASA Technical Reports Server (NTRS)
Davis, John S.; Heaphy, William J.
1993-01-01
LARCRIM is a relational database management system (RDBMS) which performs the conventional duties of an RDBMS with the added feature that it can store attributes which consist of arrays or matrices. This makes it particularly valuable for scientific data management. It is accessible as a stand-alone system and through an application program interface. The stand-alone system may be executed in two modes: menu or command. The menu mode prompts the user for the input required to create, update, and/or query the database. The command mode requires the direct input of LARCRIM commands. Although LARCRIM is an update of an old database family, its performance on modern computers is quite satisfactory. LARCRIM is written in FORTRAN 77 and runs under the UNIX operating system. Versions have been released for the following computers: SUN (3 & 4), Convex, IRIS, Hewlett-Packard, CRAY 2 & Y-MP.
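LARCRIM's distinguishing feature is relational storage of array- and matrix-valued attributes. The sketch below is not LARCRIM's FORTRAN interface; it only illustrates, with sqlite3 and NumPy, how an array-valued attribute can be kept in a relational table by serializing it to a binary column.

```python
# Hypothetical sketch (not the LARCRIM API): storing an array-valued attribute
# in a relational table by serializing the array to bytes.
import sqlite3
import numpy as np

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE spectra (id INTEGER PRIMARY KEY, label TEXT, samples BLOB)")

values = np.linspace(0.0, 1.0, 8)          # the array attribute
conn.execute("INSERT INTO spectra (label, samples) VALUES (?, ?)",
             ("calibration", values.tobytes()))

label, blob = conn.execute("SELECT label, samples FROM spectra").fetchone()
restored = np.frombuffer(blob, dtype=values.dtype)   # recover the array attribute
print(label, restored)
```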
NASA Technical Reports Server (NTRS)
Brauer, G. L.; Cornick, D. E.; Stevenson, R.
1977-01-01
The capabilities and applications of the three-degree-of-freedom (3DOF) version and the six-degree-of-freedom (6DOF) version of the Program to Optimize Simulated Trajectories (POST) are summarized. The document supplements the detailed program manuals by providing additional information that motivates and clarifies basic capabilities, input procedures, applications and computer requirements of these programs. The information will enable prospective users to evaluate the programs, and to determine if they are applicable to their problems. Enough information is given to enable managerial personnel to evaluate the capabilities of the programs, and the report describes the POST structure, formulation, input and output procedures, sample cases, and computer requirements. It also provides answers to basic questions concerning planet and vehicle modeling, simulation accuracy, optimization capabilities, and general input rules. Several sample cases are presented.
NETS - A NEURAL NETWORK DEVELOPMENT TOOL, VERSION 3.0 (MACINTOSH VERSION)
NASA Technical Reports Server (NTRS)
Phillips, T. A.
1994-01-01
NETS, A Tool for the Development and Evaluation of Neural Networks, provides a simulation of Neural Network algorithms plus an environment for developing such algorithms. Neural Networks are a class of systems modeled after the human brain. Artificial Neural Networks are formed from hundreds or thousands of simulated neurons, connected to each other in a manner similar to brain neurons. Problems which involve pattern matching readily fit the class of problems which NETS is designed to solve. NETS uses the back propagation learning method for all of the networks which it creates. The nodes of a network are usually grouped together into clumps called layers. Generally, a network will have an input layer through which the various environment stimuli are presented to the network, and an output layer for determining the network's response. The number of nodes in these two layers is usually tied to some features of the problem being solved. Other layers, which form intermediate stops between the input and output layers, are called hidden layers. NETS allows the user to customize the patterns of connections between layers of a network. NETS also provides features for saving the weight values of a network during the learning process, which allows for more precise control over the learning process. NETS is an interpreter. Its method of execution is the familiar "read-evaluate-print" loop found in interpreted languages such as BASIC and LISP. The user is presented with a prompt which is the simulator's way of asking for input. After a command is issued, NETS will attempt to evaluate the command, which may produce more prompts requesting specific information or an error if the command is not understood. The typical process involved when using NETS consists of translating the problem into a format which uses input/output pairs, designing a network configuration for the problem, and finally training the network with input/output pairs until an acceptable error is reached. NETS allows the user to generate C code to implement the network loaded into the system. This permits the placement of networks as components, or subroutines, in other systems. In short, once a network performs satisfactorily, the Generate C Code option provides the means for creating a program separate from NETS to run the network. Other features: files may be stored in binary or ASCII format; multiple input propagation is permitted; bias values may be included; capability to scale data without writing scaling code; quick interactive testing of network from the main menu; and several options that allow the user to manipulate learning efficiency. NETS is written in ANSI standard C language to be machine independent. The Macintosh version (MSC-22108) includes code for both a graphical user interface version and a command line interface version. The machine independent version (MSC-21588) only includes code for the command line interface version of NETS 3.0. The Macintosh version requires a Macintosh II series computer and has been successfully implemented under System 7. Four executables are included on these diskettes, two for floating point operations and two for integer arithmetic. It requires Think C 5.0 to compile. A minimum of 1Mb of RAM is required for execution. Sample input files and executables for both the command line version and the Macintosh user interface version are provided on the distribution medium. The Macintosh version is available on a set of three 3.5 inch 800K Macintosh format diskettes. 
The machine independent version has been successfully implemented on an IBM PC series compatible running MS-DOS, a DEC VAX running VMS, a SunIPC running SunOS, and a CRAY Y-MP running UNICOS. Two executables for the IBM PC version are included on the MS-DOS distribution media, one compiled for floating point operations and one for integer arithmetic. The machine independent version is available on a set of three 5.25 inch 360K MS-DOS format diskettes (standard distribution medium) or a .25 inch streaming magnetic tape cartridge in UNIX tar format. NETS was developed in 1989 and updated in 1992. IBM PC is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. DEC, VAX, and VMS are trademarks of Digital Equipment Corporation. SunIPC and SunOS are trademarks of Sun Microsystems, Inc. CRAY Y-MP and UNICOS are trademarks of Cray Research, Inc.
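The NETS abstracts describe back-propagation training of layered networks on input/output pairs. As a point of reference only (NETS itself is an ANSI C interpreter with many more features), a minimal back-propagation network with one hidden layer can be written as follows.

```python
# Minimal back-propagation sketch in NumPy (illustrative only; not NETS itself).
# One hidden layer of sigmoid units, trained on XOR-style input/output pairs.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)   # inputs
T = np.array([[0], [1], [1], [0]], dtype=float)               # target outputs

W1 = rng.normal(scale=0.5, size=(2, 4))   # input -> hidden weights
b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1))   # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 2.0
for epoch in range(10000):
    # forward pass
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)
    # backward pass: gradients of the mean squared error
    dY = (Y - T) * Y * (1 - Y)
    dH = (dY @ W2.T) * H * (1 - H)
    W2 -= lr * H.T @ dY / len(X); b2 -= lr * dY.mean(axis=0)
    W1 -= lr * X.T @ dH / len(X); b1 -= lr * dH.mean(axis=0)

# After training, the outputs approach [0, 1, 1, 0] for most random seeds.
print(np.round(Y, 3))
```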
NETS - A NEURAL NETWORK DEVELOPMENT TOOL, VERSION 3.0 (MACHINE INDEPENDENT VERSION)
NASA Technical Reports Server (NTRS)
Baffes, P. T.
1994-01-01
NETS, A Tool for the Development and Evaluation of Neural Networks, provides a simulation of Neural Network algorithms plus an environment for developing such algorithms. Neural Networks are a class of systems modeled after the human brain. Artificial Neural Networks are formed from hundreds or thousands of simulated neurons, connected to each other in a manner similar to brain neurons. Problems which involve pattern matching readily fit the class of problems which NETS is designed to solve. NETS uses the back propagation learning method for all of the networks which it creates. The nodes of a network are usually grouped together into clumps called layers. Generally, a network will have an input layer through which the various environment stimuli are presented to the network, and an output layer for determining the network's response. The number of nodes in these two layers is usually tied to some features of the problem being solved. Other layers, which form intermediate stops between the input and output layers, are called hidden layers. NETS allows the user to customize the patterns of connections between layers of a network. NETS also provides features for saving the weight values of a network during the learning process, which allows for more precise control over the learning process. NETS is an interpreter. Its method of execution is the familiar "read-evaluate-print" loop found in interpreted languages such as BASIC and LISP. The user is presented with a prompt which is the simulator's way of asking for input. After a command is issued, NETS will attempt to evaluate the command, which may produce more prompts requesting specific information or an error if the command is not understood. The typical process involved when using NETS consists of translating the problem into a format which uses input/output pairs, designing a network configuration for the problem, and finally training the network with input/output pairs until an acceptable error is reached. NETS allows the user to generate C code to implement the network loaded into the system. This permits the placement of networks as components, or subroutines, in other systems. In short, once a network performs satisfactorily, the Generate C Code option provides the means for creating a program separate from NETS to run the network. Other features: files may be stored in binary or ASCII format; multiple input propagation is permitted; bias values may be included; capability to scale data without writing scaling code; quick interactive testing of network from the main menu; and several options that allow the user to manipulate learning efficiency. NETS is written in ANSI standard C language to be machine independent. The Macintosh version (MSC-22108) includes code for both a graphical user interface version and a command line interface version. The machine independent version (MSC-21588) only includes code for the command line interface version of NETS 3.0. The Macintosh version requires a Macintosh II series computer and has been successfully implemented under System 7. Four executables are included on these diskettes, two for floating point operations and two for integer arithmetic. It requires Think C 5.0 to compile. A minimum of 1Mb of RAM is required for execution. Sample input files and executables for both the command line version and the Macintosh user interface version are provided on the distribution medium. The Macintosh version is available on a set of three 3.5 inch 800K Macintosh format diskettes. 
The machine independent version has been successfully implemented on an IBM PC series compatible running MS-DOS, a DEC VAX running VMS, a SunIPC running SunOS, and a CRAY Y-MP running UNICOS. Two executables for the IBM PC version are included on the MS-DOS distribution media, one compiled for floating point operations and one for integer arithmetic. The machine independent version is available on a set of three 5.25 inch 360K MS-DOS format diskettes (standard distribution medium) or a .25 inch streaming magnetic tape cartridge in UNIX tar format. NETS was developed in 1989 and updated in 1992. IBM PC is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. DEC, VAX, and VMS are trademarks of Digital Equipment Corporation. SunIPC and SunOS are trademarks of Sun Microsystems, Inc. CRAY Y-MP and UNICOS are trademarks of Cray Research, Inc.
NASA Technical Reports Server (NTRS)
McHenry, M. Q.; Angelaki, D. E.
2000-01-01
To maintain binocular fixation on near targets during fore-aft translational disturbances, largely disjunctive eye movements are elicited, the amplitude and direction of which should be tuned to the horizontal and vertical eccentricities of the target. The eye movements generated during this task have been investigated here as trained rhesus monkeys fixated isovergence targets at different horizontal and vertical eccentricities during 10 Hz fore-aft oscillations. The elicited eye movements complied with the geometric requirements for binocular fixation, although not ideally. First, the corresponding vergence angle for which the movement of each eye would be compensatory was consistently less than that dictated by the actual fixation parameters. Second, the eye position with zero sensitivity to translation was not straight ahead, as geometrically required, but rather exhibited a systematic dependence on viewing distance and vergence angle. Third, responses were asymmetric, with gains being larger for abducting and downward compared with adducting and upward gaze directions, respectively. As frequency was varied between 4 and 12 Hz, responses exhibited high-pass filter properties with significant differences between abduction and adduction responses. As a result of these differences, vergence sensitivity increased as a function of frequency with a steeper slope than that of version. Despite largely undercompensatory version responses, vergence sensitivity was closer to ideal. Moreover, the observed dependence of vergence sensitivity on vergence angle, which was varied between 2.5 and 10 MA, was largely linear rather than quadratic (as geometrically predicted). We conclude that the spatial tuning of eye velocity sensitivity as a function of gaze and viewing distance follows the general geometric dependencies required for the maintenance of foveal visual acuity. However, systematic deviations from ideal behavior exist that might reflect asymmetric processing of abduction/adduction responses, perhaps because of different functional dependencies of version and vergence eye movement components during translation.
Scheduling System Assessment, and Development and Enhancement of Re-engineered Version of GPSS
NASA Technical Reports Server (NTRS)
Loganantharaj, Rasiah; Thomas, Bushrod; Passonno, Nicole
1996-01-01
The objective of this project is two-fold. First, to provide an evaluation of a commercially developed version of the ground processing scheduling system (GPSS) for its applicability to the Kennedy Space Center (KSC) ground processing problem. Second, to work with the KSC GPSS development team and provide enhancements to the existing software. Systems reengineering is required to provide a sustainable system for the users and the software maintenance group. Using the LISP profile prototype code developed by the GPSS reverse reengineering groups as a building block, we have implemented the resource deconfliction portion of GPSS in Common LISP using its object-oriented features. The prototype corrects and extends some of the deficiencies of the current production version, plus it uses and builds on the classes from the development team's profile prototype.
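Resource deconfliction, the portion of GPSS reimplemented in the prototype, amounts to finding activities that claim the same resource over overlapping time windows. The toy check below illustrates that idea only; it is not the KSC/GPSS code, and the task names are invented.

```python
# Toy resource-deconfliction check (illustrative; not the GPSS/KSC implementation):
# two tasks conflict if they request the same resource over overlapping time windows.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    resource: str
    start: float   # hours
    end: float

def conflicts(tasks):
    """Return pairs of tasks that contend for the same resource at the same time."""
    clashes = []
    for i, a in enumerate(tasks):
        for b in tasks[i + 1:]:
            if a.resource == b.resource and a.start < b.end and b.start < a.end:
                clashes.append((a.name, b.name))
    return clashes

schedule = [
    Task("orbiter power-up", "crane-1", 0.0, 4.0),
    Task("payload lift", "crane-1", 3.0, 6.0),
    Task("tile inspection", "bay-2", 1.0, 5.0),
]
print(conflicts(schedule))   # [('orbiter power-up', 'payload lift')]
```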
Penev, Lyubomir; Hagedorn, Gregor; Mietchen, Daniel; Georgiev, Teodor; Stoev, Pavel; Sautter, Guido; Agosti, Donat; Plank, Andreas; Balke, Michael; Hendrich, Lars; Erwin, Terry
2011-01-01
Abstract Scholarly publishing and citation practices have developed largely in the absence of versioned documents. The digital age requires new practices to combine the old and the new. We describe how the original published source and a versioned wiki page based on it can be reconciled and combined into a single citation reference. We illustrate the citation mechanism by way of practical examples focusing on journal and wiki publishing of taxon treatments. Specifically, we discuss mechanisms for permanent cross-linking between the static original publication and the dynamic, versioned wiki, as well as for automated export of journal content to the wiki, to reduce the workload on authors, for combining the journal and the wiki citation and for integrating it with the attribution of wiki contributors. PMID:21594104
Reading Pictures for Story Comprehension Requires Mental Imagery Skills
Boerma, Inouk E.; Mol, Suzanne E.; Jolles, Jelle
2016-01-01
We examined the role of mental imagery skills on story comprehension in 150 fifth graders (10- to 12-year-olds), when reading a narrative book chapter with alternating words and pictures (i.e., text blocks were alternated by one- or two-page picture spreads). A parallel group design was used, in which we compared our experimental book version, in which pictures were used to replace parts of the corresponding text, to two control versions, i.e., a text-only version and a version with the full story text and all pictures. Analyses showed an interaction between mental imagery and book version: children with higher mental imagery skills outperformed children with lower mental imagery skills on story comprehension after reading the experimental narrative. This was not the case for both control conditions. This suggests that children’s mental imagery skills significantly contributed to the mental representation of the story that they created, by successfully integrating information from both words and pictures. The results emphasize the importance of mental imagery skills for explaining individual variability in reading development. Implications for educational practice are that we should find effective ways to instruct children how to “read” pictures and how to develop and use their mental imagery skills. This will probably contribute to their mental models and therefore their story comprehension. PMID:27822194
Kim, Stella H; Strutt, Adriana M; Olabarrieta-Landa, Laiene; Lequerica, Anthony H; Rivera, Diego; De Los Reyes Aragon, Carlos Jose; Utria, Oscar; Arango-Lasprilla, Juan Carlos
2018-02-23
The Boston Naming Test (BNT) is a widely used measure of confrontation naming ability that has been criticized for its questionable construct validity for non-English speakers. This study investigated item difficulty and construct validity of the Spanish version of the BNT to assess cultural and linguistic impact on performance. Subjects were 1298 healthy Spanish speaking adults from Colombia. They were administered the 60- and 15-item Spanish version of the BNT. A Rasch analysis was computed to assess dimensionality, item hierarchy, targeting, reliability, and item fit. Both versions of the BNT satisfied requirements for unidimensionality. Although internal consistency was excellent for the 60-item BNT, order of difficulty did not increase consistently with item number and there were a number of items that did not fit the Rasch model. For the 15-item BNT, a total of 5 items changed position on the item hierarchy with 7 poor fitting items. Internal consistency was acceptable. Construct validity of the BNT remains a concern when it is administered to non-English speaking populations. Similar to previous findings, the order of item presentation did not correspond with increasing item difficulty, and both versions were inadequate at assessing high naming ability.
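For readers unfamiliar with the method, the Rasch analysis referred to above models the probability of a correct naming response from a single ability parameter per person and a single difficulty parameter per item. A minimal sketch of that item response function (not the authors' analysis code; the abilities and difficulties below are hypothetical) follows.

```python
# Minimal sketch of the dichotomous Rasch model underlying such item analyses
# (illustrative; not the authors' analysis code). The probability that a person
# with ability theta names an item of difficulty b correctly is
#   P(X = 1) = 1 / (1 + exp(-(theta - b))).
import numpy as np

def rasch_probability(theta, b):
    return 1.0 / (1.0 + np.exp(-(theta - b)))

abilities = np.array([-1.0, 0.0, 1.5])        # hypothetical respondent abilities (logits)
difficulties = np.array([-0.5, 0.3, 1.2])     # hypothetical item difficulties (logits)

# Probability matrix: rows are respondents, columns are items.
P = rasch_probability(abilities[:, None], difficulties[None, :])
print(np.round(P, 2))
```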
NSRD-10: Leak Path Factor Guidance Using MELCOR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Louie, David; Humphries, Larry L.
Estimates of the source term from a U.S. Department of Energy (DOE) nuclear facility require that the analysts know how to apply the simulation tools used, such as the MELCOR code, particularly for a complicated facility that may include an air ventilation system and other active systems that can influence the environmental pathway of the materials released. DOE has designated MELCOR 1.8.5, an unsupported version, as a DOE ToolBox code in its Central Registry, which includes a leak-path-factor guidance report written in 2004 that did not include experimental validation data. To continue to use this MELCOR version requires additional verification and validation, which may not be feasible from a project cost standpoint. Instead, a more recent version of MELCOR should be used. Without developer support and experimental data validation, it is difficult to convince regulators that the calculated source term from the DOE facility is accurate and defensible. This research replaces the obsolete version in the 2004 DOE leak path factor guidance report by using MELCOR 2.1 (the latest version of MELCOR with continuing modeling development and user support) and by including applicable experimental data from the reactor safety arena and from applicable experimental data used in the DOE-HDBK-3010. This research provides best practice values used in MELCOR 2.1 specifically for the leak path determination. With these enhancements, the revised leak-path-guidance report should provide confidence to the DOE safety analyst who would be using MELCOR as a source-term determination tool for mitigated accident evaluations.
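For context, the leak path factor enters the usual DOE five-factor source-term formulation, of the kind codified in DOE-HDBK-3010, roughly as below; this is a generic textbook form, not a result from the report itself.

\[
\mathrm{ST} = \mathrm{MAR} \times \mathrm{DR} \times \mathrm{ARF} \times \mathrm{RF} \times \mathrm{LPF},
\qquad
\mathrm{LPF} = \frac{\text{mass released to the environment}}{\text{mass made airborne at the source}}
\]

Here MAR is the material at risk, DR the damage ratio, ARF the airborne release fraction, and RF the respirable fraction; MELCOR is used in the guidance to estimate the LPF term for the facility's release pathways.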
Methods for Environments and Contaminants: Drinking Water
EPA’s Safe Drinking Water Information System Federal Version (SDWIS/FED) includes information on populations served and violations of maximum contaminant levels or required treatment techniques by the nation’s 160,000 public water systems.
SOLVENT WASTE REDUCTION ALTERNATIVES
This publication contains edited versions of presentations on this subject made at five Technology Transfer seminars in 1988. Chapters are included on land disposal regulations and requirements; waste solvent disposal alternatives from various industries such as process equipment...
An Interactive Version of MULR04 With Enhanced Graphic Capability
ERIC Educational Resources Information Center
Burkholder, Joel H.
1978-01-01
An existing computer program for computing multiple regression analyses is made interactive in order to alleviate core storage requirements. Also, some improvements in the graphics aspects of the program are included. (JKS)
To Your Health: NLM Update—MedlinePlus
... are using the current version of iTunes' client software on your computer and if you have an ... on other podcast distribution sites. If a podcast software client requires you to cut and paste a ...
Aviation Environmental Design Tool (AEDT): Version 2d: Installation Guide
DOT National Transportation Integrated Search
2017-09-01
Aviation Environmental Design Tool (AEDT) is a software system that models aircraft performance in space and time to estimate fuel consumption, emissions, noise, and air quality consequences. AEDT facilitates environmental review activities required ...
An Evaluation Methodology for Protocol Analysis Systems
2007-03-01
Main Memory Requirement NS: Needham-Schroeder NSL: Needham-Schroeder-Lowe OCaml: Objective Caml POSIX: Portable Operating System...methodology is needed. A. PROTOCOL ANALYSIS FIELD As with any field, there is a specialized language used within the protocol analysis community. Figure...ProVerif requires that Objective Caml (OCaml) be installed on the system; OCaml version 3.09.3 was installed. C. WINDOWS CONFIGURATION OS
NASA Astrophysics Data System (ADS)
Pugnaghi, Sergio; Guerrieri, Lorenzo; Corradini, Stefano; Merucci, Luca
2016-07-01
Volcanic plume removal (VPR) is a procedure developed to retrieve the ash optical depth, effective radius and mass, and sulfur dioxide mass contained in a volcanic cloud from the thermal radiance at 8.7, 11, and 12 µm. It is based on an estimation of a virtual image representing what the sensor would have seen in a multispectral thermal image if the volcanic cloud were not present. Ash and sulfur dioxide were retrieved by the first version of the VPR using a very simple atmospheric model that ignored the layer above the volcanic cloud. This new version takes into account the layer of atmosphere above the cloud as well as thermal radiance scattering along the line of sight of the sensor. In addition to improved results, the new version also offers an easier and faster preliminary preparation and includes other types of volcanic particles (andesite, obsidian, pumice, ice crystals, and water droplets). As in the previous version, a set of parameters regarding the volcanic area, particle types, and sensor is required to run the procedure. However, in the new version, only the mean plume temperature is required as input data. In this work, a set of parameters to compute the volcanic cloud transmittance in the three quoted bands, for all the aforementioned particles, for both Mt. Etna (Italy) and Eyjafjallajökull (Iceland) volcanoes, and for the Terra and Aqua MODIS instruments is presented. Three types of tests are carried out to verify the results of the improved VPR. The first uses all the radiative transfer simulations performed to estimate the above mentioned parameters. The second one makes use of two synthetic images, one for Mt. Etna and one for Eyjafjallajökull volcanoes. The third one compares VPR and Look-Up Table (LUT) retrievals analyzing the true image of Eyjafjallajökull volcano acquired by MODIS aboard the Aqua satellite on 11 May 2010 at 14:05 GMT.
ASTROP2 Users Manual: A Program for Aeroelastic Stability Analysis of Propfans
NASA Technical Reports Server (NTRS)
Reddy, T. S. R.; Lucero, John M.
1996-01-01
This manual describes the input data required for using the second version of the ASTROP2 (Aeroelastic STability and Response Of Propulsion systems - 2 dimensional analysis) computer code. In ASTROP2, version 2.0, the program is divided into two modules: 2DSTRIP, which calculates the structural dynamic information; and 2DASTROP, which calculates the unsteady aerodynamic force coefficients from which the aeroelastic stability can be determined. In the original version of ASTROP2, these two aspects were performed in a single program. The improvements in version 2.0 include an option to account for counter rotation, improved numerical integration, accommodation for non-uniform inflow distribution, and an iterative scheme for flutter frequency convergence. ASTROP2 can be used for flutter analysis of multi-bladed structures such as those found in compressors, turbines, counter-rotating propellers, or propfans. The analysis combines a two-dimensional, unsteady cascade aerodynamics model and a three-dimensional, normal mode structural model using strip theory. The flutter analysis is formulated in the frequency domain, resulting in an eigenvalue determinant. The flutter frequency and damping can be inferred from the eigenvalues.
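The frequency-domain formulation mentioned in the abstract leads to an eigenvalue problem of the general form below (a generic flutter eigenproblem of the V-g type, given for orientation only; it is not ASTROP2's exact equations).

\[
\det\!\left[\, -\omega^{2}[M] + (1 + i g)[K] - \tfrac{1}{2}\rho V^{2}[A(k)] \,\right] = 0
\]

Here [M] and [K] are the modal mass and stiffness matrices, [A(k)] the unsteady aerodynamic matrix at reduced frequency k, and g an artificial structural damping; flutter onset corresponds to the condition at which the required g crosses zero.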
Design of a diagnostic encyclopaedia using AIDA.
van Ginneken, A M; Smeulders, A W; Jansen, W
1987-01-01
Diagnostic Encyclopaedia Workstation (DEW) is the name of a digital encyclopaedia constructed to contain reference knowledge with respect to the pathology of the ovary. Comparing DEW with the common sources of reference knowledge (i.e. books) leads to the following advantages of DEW: it contains more verbal knowledge, pictures and case histories, and it offers information adjusted to the needs of the user. Based on an analysis of the structure of this reference knowledge we have chosen AIDA to develop a relational database and we use a video-disc player to contain the pictorial part of the database. The system consists of a database input version and a read-only run version. The design of the database input version is discussed. Reference knowledge for ovary pathology requires 1-3 Mbytes of memory. At present 15% of this amount is available. The design of the run version is based on an analysis of which information must necessarily be specified to the system by the user to access a desired item of information. Finally, the use of AIDA in constructing DEW is evaluated.
Revision of FMM-Yukawa: An adaptive fast multipole method for screened Coulomb interactions
NASA Astrophysics Data System (ADS)
Zhang, Bo; Huang, Jingfang; Pitsianis, Nikos P.; Sun, Xiaobai
2010-12-01
FMM-YUKAWA is a mathematical software package primarily for rapid evaluation of the screened Coulomb interactions of N particles in three dimensional space. Since its release, we have revised and re-organized the data structure, software architecture, and user interface, for the purpose of enabling more flexible, broader and easier use of the package. The package and its documentation are available at http://www.fastmultipole.org/, along with a few other closely related mathematical software packages.
New version program summary
Program title: FMM-Yukawa
Catalogue identifier: AEEQ_v2_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEQ_v2_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GNU GPL 2.0
No. of lines in distributed program, including test data, etc.: 78 704
No. of bytes in distributed program, including test data, etc.: 854 265
Distribution format: tar.gz
Programming language: FORTRAN 77, FORTRAN 90, and C. Requires gcc and gfortran version 4.4.3 or later
Computer: All
Operating system: Any
Classification: 4.8, 4.12
Catalogue identifier of previous version: AEEQ_v1_0
Journal reference of previous version: Comput. Phys. Comm. 180 (2009) 2331
Does the new version supersede the previous version?: Yes
Nature of problem: To evaluate the screened Coulomb potential and force field of N charged particles, and to evaluate a convolution type integral where the Green's function is the fundamental solution of the modified Helmholtz equation.
Solution method: The new version of the fast multipole method (FMM) that diagonalizes the multipole-to-local translation operator is applied with the tree structure adaptive to sample particle locations.
Reasons for new version: To handle much larger particle ensembles, to enable the iterative use of the subroutines in a solver, and to remove potential contention in assignments for parallelization.
Summary of revisions: The software package FMM-Yukawa has been revised and re-organized in data structure, software architecture, programming methods, and user interface. The revision enables more flexible use of the package and economic use of memory resources. It consists of five stages. The initial stage (stage 1) determines, based on the accuracy requirement and FMM theory, the length of multipole expansions and the number of quadrature points for diagonalization, and loads the quadrature nodes and weights that are computed off line. Stage 2 constructs the oct-tree and interaction lists, with adaptation to the sparsity or density of particles and employing a dynamic memory allocation scheme at every tree level. Stage 3 executes the core FMM subroutine for numerical calculation of the particle interactions. The subroutine can now be used iteratively as in a solver, while the particle locations remain the same. Stage 4 releases the memory allocated in Stage 2 for the adaptive tree and interaction lists. The user can modify the iterative routine easily. When the particle locations are changed, such as in a molecular dynamics simulation, stages 2 to 4 can also be used together repeatedly. The final stage releases the memory space used for the quadrature and other remaining FMM parameters. Programs at the stage level and at the user interface are re-written in the C programming language, while most of the translation and interaction operations remain in FORTRAN.
As a result of the change in data structures and memory allocation, the revised package can accommodate much larger particle ensembles while maintaining the same accuracy-efficiency performance. The new version is also developed as an important precursor to its parallel counterpart on multi-core or many-core processors in a shared memory programming environment. In particular, to ensure mutual exclusion in concurrent updates without incurring extra latency, we have replaced all the assignment statements at a source box that put its data to multiple target boxes with assignments at every target box that gather data from source boxes. This amounts to replacing the column version of matrix-vector multiplication with the row version. The matrix here, however, is in compressive representation. Sufficient care is taken in the revision not to alter the algorithmic complexity or numerical behavior, as concurrent writing potentially takes place in the upward calculation of the multipole expansion coefficients, interactions at every level of the FMM tree, and downward calculation of the local expansion coefficients. The software modules and their compositions are also organized according to the stages in which they are used. Demonstration files and makefiles for merging the user routines and the library routines are provided. Restrictions: Accuracy requirement is described in terms of three or six digits. Higher multiples of three digits will be allowed in a later version. Finer decimation in digits for accuracy specification may or may not be necessary. Unusual features: Ready and friendly for customized use and instrumental in expression of concurrency and dependency for efficient parallelization. Running time: The running time depends linearly on the number N of particles and varies with the characteristics of the particle distribution. It also depends on the accuracy requirement; a higher accuracy requirement takes relatively more time. The code outperforms the direct summation method when N⩾750.
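The quantity the package evaluates is the pairwise screened Coulomb (Yukawa) potential. A brute-force O(N²) reference evaluation, useful for checking small cases but not the FMM algorithm itself, can be written as follows (the decay parameter name beta is an arbitrary choice here).

```python
# Brute-force O(N^2) reference for the screened Coulomb (Yukawa) potential that
# FMM-Yukawa accelerates (a sketch for checking small cases, not the FMM itself):
#   phi_i = sum_{j != i} q_j * exp(-beta * r_ij) / r_ij
import numpy as np

def yukawa_direct(positions, charges, beta):
    n = len(charges)
    phi = np.zeros(n)
    for i in range(n):
        d = positions - positions[i]
        r = np.sqrt((d * d).sum(axis=1))
        r[i] = np.inf                      # exclude the self-interaction term
        phi[i] = np.sum(charges * np.exp(-beta * r) / r)
    return phi

rng = np.random.default_rng(1)
pts = rng.random((500, 3))
q = rng.random(500) - 0.5
print(yukawa_direct(pts, q, beta=2.0)[:5])
```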
Kim, Myong; Lee, Hahn Ey; Kim, Sung Han; Cho, Sung Yong; Jeong, Seong Jin; Oh, Seung June; Cookson, Michael S; Ku, Ja Hyeon
2014-11-30
To develop a Korean version of the Functional Assessment of Cancer Therapy (FACT)-Vanderbilt Cystectomy Index (VCI) from the original English version, with subsequent linguistic validation in Korean patients who underwent radical cystectomy with urinary diversion. Translation and linguistic validation were carried out between January and May of 2013 and consisted of the following stages: (1) permission for translation; (2) forward translation; (3) reconciliation; (4) backward translation; (5) cognitive debriefing; and (6) final proof-reading. During the forward translation phase, words such as "bother", "spend time", "support", "coping" and "concern" were adjusted to be more comprehensible to the target population. The reconciled Korean version was accepted without notable objections because the original version and the backward translation were almost congruent except for minor differences in a subset of questions. The translation was tested using 5 Korean-speaking subjects. The subjects took an average of 8.2 minutes to complete the questionnaire without difficulty and found the questionnaire clear and easy to understand. The panel discussed each of the issues raised by subjects, and most terms were judged by the panel not to require further changes because the overall comprehension levels were relatively high and because the translated terms were accurately rendered in the target language. This report has demonstrated that despite translation difficulties, the linguistic validation of the FACT-VCI in the Korean language was successful. The next step is to assess the psychometric properties of the Korean version of the FACT-VCI.
Freitas, N O; Forero, C G; Alonso, J; Caltran, M P; Dantas, R A S; Farina, J A; Rossi, L A
2017-01-01
Burn patients may encounter social barriers and stigmatization. The objectives of this study were to adapt the Social Comfort Questionnaire (SCQ) into Brazilian Portuguese and to assess the psychometric properties of the adapted version. Cross-cultural adaptation of the 8 items of the SCQ followed international guidelines. We interviewed 240 burn patients and verified the SCQ internal consistency, test-retest reliability and construct validity, correlating the scores with depression [Beck Depression Inventory (BDI)], affect/body image and interpersonal relationships [Burns Specific Health Scale-Revised (BSHS-R)] and self-esteem [Rosenberg's Self-Esteem Scale (RSES)]. We also performed a confirmatory factor analysis (CFA). The cross-cultural adaptation resulted in minor semantic modifications to the original SCQ version. After CFA, a reduced 6-item version showed satisfactory fit to the one-factor model (RMSEA = 0.05, CFI = 0.99, TLI = 0.99). Cronbach's alpha was 0.80, and the test-retest intraclass correlation coefficient was 0.86. The final version presented a strong negative correlation with depression (BDI), and strong positive correlations with affect/body image (BSHS-R), interpersonal relationships (BSHS-R) and self-esteem (RSES) (all p < 0.001). The results showed that the SCQ Brazilian Portuguese adapted version complies with the validity and reliability criteria required for an instrument assessing social comfort in Brazilian burn patients. The Brazilian version yields a single score that is easy to interpret and well understood by patients.
Translation and validation of the Malay version of the Stroke Knowledge Test.
Sowtali, Siti Noorkhairina; Yusoff, Dariah Mohd; Harith, Sakinah; Mohamed, Monniaty
2016-04-01
To date, there is a lack of published studies on assessment tools to evaluate the effectiveness of stroke education programs. This study developed and validated the Malay language version of the Stroke Knowledge Test research instrument. This study involved translation, validity, and reliability phases. The instrument underwent backward and forward translation of the English version into the Malay language. Nine experts reviewed the content for consistency, clarity, difficulty, and suitability for inclusion. Perceived usefulness and utilization were obtained from experts' opinions. Later, face validity assessment was conducted with 10 stroke patients to determine appropriateness of sentences and grammar used. A pilot study was conducted with 41 stroke patients to determine the item analysis and reliability of the translated instrument using the Kuder Richardson 20 or Cronbach's alpha. The final Malay version Stroke Knowledge Test included 20 items with good content coverage, acceptable item properties, and positive expert review ratings. Psychometric investigations suggest that Malay version Stroke Knowledge Test had moderate reliability with Kuder Richardson 20 or Cronbach's alpha of 0.58. Improvement is required for Stroke Knowledge Test items with unacceptable difficulty indices. Overall, the average rating of perceived usefulness and perceived utility of the instruments were both 72.7%, suggesting that reviewers were likely to use the instruments in their facilities. Malay version Stroke Knowledge Test was a valid and reliable tool to assess educational needs and to evaluate stroke knowledge among participants of group-based stroke education programs in Malaysia.
Cultural adaptation of the Test of Narrative Language (TNL) into Brazilian Portuguese.
Rossi, Natalia Freitas; Lindau, Tâmara de Andrade; Gillam, Ronald Bradley; Giacheti, Célia Maria
To accomplish the translation and cultural adaptation of the Test of Narrative Language (TNL) into Brazilian Portuguese. The TNL is a formal instrument which assesses narrative comprehension and oral narration of children between the ages of 5-0 and 11-11 (years-months). The TNL translation and adaptation process had the following steps: (1) translation into the target language; (2) summary of the translated versions; (3) back-translation; (4) checking of the conceptual, semantics and cultural equivalence process and (5) pilot study (56 children within the test age range and from both genders). The adapted version maintained the same structure as the original version: number of tasks (both, three comprehension and oral narration), narrative formats (no picture, sequenced pictures and single picture) and scoring system. There were no adjustments to the pictures. The "McDonald's Story" was replaced by the "Snack Bar History" to meet the semantic and experiential equivalence of the target population. The other stories had semantic and grammatical adjustments. Statistically significant difference was found when comparing the raw score (comprehension, narration and total) of age groups from the adapted version. Adjustments were required to meet the equivalence between the original and the translated versions. The adapted version showed it has the potential to identify differences in oral narratives of children in the age range provided by the test. Measurement equivalence for validation and test standardization are in progress and will be able to supplement the study outcomes.
Stimulus discriminability may bias value-based probabilistic learning.
Schutte, Iris; Slagter, Heleen A; Collins, Anne G E; Frank, Michael J; Kenemans, J Leon
2017-01-01
Reinforcement learning tasks are often used to assess participants' tendency to learn more from the positive or more from the negative consequences of one's action. However, this assessment often requires comparison in learning performance across different task conditions, which may differ in the relative salience or discriminability of the stimuli associated with more and less rewarding outcomes, respectively. To address this issue, in a first set of studies, participants were subjected to two versions of a common probabilistic learning task. The two versions differed with respect to the stimulus (Hiragana) characters associated with reward probability. The assignment of character to reward probability was fixed within version but reversed between versions. We found that performance was highly influenced by task version, which could be explained by the relative perceptual discriminability of characters assigned to high or low reward probabilities, as assessed by a separate discrimination experiment. Participants were more reliable in selecting rewarding characters that were more discriminable, leading to differences in learning curves and their sensitivity to reward probability. This difference in experienced reinforcement history was accompanied by performance biases in a test phase assessing ability to learn from positive vs. negative outcomes. In a subsequent large-scale web-based experiment, this impact of task version on learning and test measures was replicated and extended. Collectively, these findings imply a key role for perceptual factors in guiding reward learning and underscore the need to control stimulus discriminability when making inferences about individual differences in reinforcement learning.
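A toy version of the kind of value-learning model commonly fit to such tasks, a delta-rule learner choosing between two stimuli with unequal reward probabilities, is sketched below. It is illustrative only, not the authors' task or model; the reward probabilities and parameters are invented.

```python
# Toy delta-rule (Rescorla-Wagner / Q-learning) agent on a two-choice
# probabilistic selection task, illustrative of the kind of task described
# (not the authors' experiment or model).
import numpy as np

rng = np.random.default_rng(0)
p_reward = {"A": 0.8, "B": 0.2}     # hypothetical reward probabilities
Q = {"A": 0.5, "B": 0.5}            # initial value estimates
alpha, beta = 0.1, 3.0              # learning rate, softmax inverse temperature

choices = []
for trial in range(200):
    # softmax choice between the two stimuli
    pA = 1.0 / (1.0 + np.exp(-beta * (Q["A"] - Q["B"])))
    choice = "A" if rng.random() < pA else "B"
    reward = 1.0 if rng.random() < p_reward[choice] else 0.0
    Q[choice] += alpha * (reward - Q[choice])   # prediction-error update
    choices.append(choice)

print("final values:", {k: round(v, 2) for k, v in Q.items()})
print("P(choose A) over last 50 trials:", choices[-50:].count("A") / 50)
```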
2008-06-13
technology developments. 2. This new-issue SMC standard comprises the text of The Aerospace Corporation report number TOR-2005(8583)-1. 3...issues of the documents are the current versions. 1. Aerospace Report No. TOR-2005(8583)-2, Electrical Power Systems, Direct Current, Space Vehicle...Design Requirements, The Aerospace Corp., 13 January 2005. 2. Aerospace Report No. TR-2004(8583)-1 (proposed MIL-STD-1540E), Test Requirements for
Design and analysis of the Collider SPXA/SPRA spool piece vacuum barrier
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cruse, G.; Aksel, G.
1993-04-01
A design for the Collider SPXA/SPRA spool piece vacuum barrier was developed to meet a variety of thermal and structural performance requirements. Both composite and stainless steel alternatives were investigated using detailed finite-element analysis before selecting an optimized version of the ASST SPR spool vacuum barrier design. This design meets the structural requirements and will be able to meet the thermal performance requirements by using some newer thermal strapping configurations.
ARC2D - EFFICIENT SOLUTION METHODS FOR THE NAVIER-STOKES EQUATIONS (DEC RISC ULTRIX VERSION)
NASA Technical Reports Server (NTRS)
Biyabani, S. R.
1994-01-01
ARC2D is a computational fluid dynamics program developed at the NASA Ames Research Center specifically for airfoil computations. The program uses implicit finite-difference techniques to solve two-dimensional Euler equations and thin layer Navier-Stokes equations. It is based on the Beam and Warming implicit approximate factorization algorithm in generalized coordinates. The methods are either time accurate or accelerated non-time accurate steady state schemes. The evolution of the solution through time is physically realistic; good solution accuracy is dependent on mesh spacing and boundary conditions. The mathematical development of ARC2D begins with the strong conservation law form of the two-dimensional Navier-Stokes equations in Cartesian coordinates, which admits shock capturing. The Navier-Stokes equations can be transformed from Cartesian coordinates to generalized curvilinear coordinates in a manner that permits one computational code to serve a wide variety of physical geometries and grid systems. ARC2D includes an algebraic mixing length model to approximate the effect of turbulence. In cases of high Reynolds number viscous flows, thin layer approximation can be applied. ARC2D allows for a variety of solutions to stability boundaries, such as those encountered in flows with shocks. The user has considerable flexibility in assigning geometry and developing grid patterns, as well as in assigning boundary conditions. However, the ARC2D model is most appropriate for attached and mildly separated boundary layers; no attempt is made to model wake regions and widely separated flows. The techniques have been successfully used for a variety of inviscid and viscous flowfield calculations. The Cray version of ARC2D is written in FORTRAN 77 for use on Cray series computers and requires approximately 5Mb memory. The program is fully vectorized. The tape includes variations for the COS and UNICOS operating systems. Also included is a sample routine for CONVEX computers to emulate Cray system time calls, which should be easy to modify for other machines as well. The standard distribution media for this version is a 9-track 1600 BPI ASCII Card Image format magnetic tape. The Cray version was developed in 1987. The IBM ES/3090 version is an IBM port of the Cray version. It is written in IBM VS FORTRAN and has the capability of executing in both vector and parallel modes on the MVS/XA operating system and in vector mode on the VM/XA operating system. Various options of the IBM VS FORTRAN compiler provide new features for the ES/3090 version, including 64-bit arithmetic and up to 2 GB of virtual addressability. The IBM ES/3090 version is available only as a 9-track, 1600 BPI IBM IEBCOPY format magnetic tape. The IBM ES/3090 version was developed in 1989. The DEC RISC ULTRIX version is a DEC port of the Cray version. It is written in FORTRAN 77 for RISC-based Digital Equipment platforms. The memory requirement is approximately 7Mb of main memory. It is available in UNIX tar format on TK50 tape cartridge. The port to DEC RISC ULTRIX was done in 1990. COS and UNICOS are trademarks and Cray is a registered trademark of Cray Research, Inc. IBM, ES/3090, VS FORTRAN, MVS/XA, and VM/XA are registered trademarks of International Business Machines. DEC and ULTRIX are registered trademarks of Digital Equipment Corporation.
ARC2D - EFFICIENT SOLUTION METHODS FOR THE NAVIER-STOKES EQUATIONS (CRAY VERSION)
NASA Technical Reports Server (NTRS)
Pulliam, T. H.
1994-01-01
ARC2D is a computational fluid dynamics program developed at the NASA Ames Research Center specifically for airfoil computations. The program uses implicit finite-difference techniques to solve two-dimensional Euler equations and thin layer Navier-Stokes equations. It is based on the Beam and Warming implicit approximate factorization algorithm in generalized coordinates. The methods are either time accurate or accelerated non-time accurate steady state schemes. The evolution of the solution through time is physically realistic; good solution accuracy is dependent on mesh spacing and boundary conditions. The mathematical development of ARC2D begins with the strong conservation law form of the two-dimensional Navier-Stokes equations in Cartesian coordinates, which admits shock capturing. The Navier-Stokes equations can be transformed from Cartesian coordinates to generalized curvilinear coordinates in a manner that permits one computational code to serve a wide variety of physical geometries and grid systems. ARC2D includes an algebraic mixing length model to approximate the effect of turbulence. In cases of high Reynolds number viscous flows, thin layer approximation can be applied. ARC2D allows for a variety of solutions to stability boundaries, such as those encountered in flows with shocks. The user has considerable flexibility in assigning geometry and developing grid patterns, as well as in assigning boundary conditions. However, the ARC2D model is most appropriate for attached and mildly separated boundary layers; no attempt is made to model wake regions and widely separated flows. The techniques have been successfully used for a variety of inviscid and viscous flowfield calculations. The Cray version of ARC2D is written in FORTRAN 77 for use on Cray series computers and requires approximately 5Mb memory. The program is fully vectorized. The tape includes variations for the COS and UNICOS operating systems. Also included is a sample routine for CONVEX computers to emulate Cray system time calls, which should be easy to modify for other machines as well. The standard distribution media for this version is a 9-track 1600 BPI ASCII Card Image format magnetic tape. The Cray version was developed in 1987. The IBM ES/3090 version is an IBM port of the Cray version. It is written in IBM VS FORTRAN and has the capability of executing in both vector and parallel modes on the MVS/XA operating system and in vector mode on the VM/XA operating system. Various options of the IBM VS FORTRAN compiler provide new features for the ES/3090 version, including 64-bit arithmetic and up to 2 GB of virtual addressability. The IBM ES/3090 version is available only as a 9-track, 1600 BPI IBM IEBCOPY format magnetic tape. The IBM ES/3090 version was developed in 1989. The DEC RISC ULTRIX version is a DEC port of the Cray version. It is written in FORTRAN 77 for RISC-based Digital Equipment platforms. The memory requirement is approximately 7Mb of main memory. It is available in UNIX tar format on TK50 tape cartridge. The port to DEC RISC ULTRIX was done in 1990. COS and UNICOS are trademarks and Cray is a registered trademark of Cray Research, Inc. IBM, ES/3090, VS FORTRAN, MVS/XA, and VM/XA are registered trademarks of International Business Machines. DEC and ULTRIX are registered trademarks of Digital Equipment Corporation.
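ARC2D's Beam and Warming approximate factorization in generalized coordinates is beyond a short sketch, but the basic idea of an implicit finite-difference step, advancing the solution by solving a banded linear system rather than updating points explicitly, can be shown in one dimension. The toy below uses backward Euler for linear advection-diffusion and is not ARC2D's algorithm.

```python
# One-dimensional toy of an implicit finite-difference step (backward Euler for
# linear advection-diffusion). Illustrates the idea behind implicit schemes:
# each time step solves a banded linear system. Not ARC2D's Beam-Warming
# approximate factorization in generalized coordinates.
import numpy as np

nx, L = 101, 1.0
dx = L / (nx - 1)
x = np.linspace(0.0, L, nx)
a, nu, dt = 1.0, 0.01, 0.02          # advection speed, viscosity, time step

u = np.exp(-200.0 * (x - 0.3) ** 2)  # initial pulse

# Build (I - dt*A) once: central differences for advection and diffusion.
A = np.zeros((nx, nx))
for i in range(1, nx - 1):
    A[i, i - 1] = a / (2 * dx) + nu / dx**2
    A[i, i]     = -2 * nu / dx**2
    A[i, i + 1] = -a / (2 * dx) + nu / dx**2
M = np.eye(nx) - dt * A              # boundary rows stay identity (u held at its ends)

for step in range(25):
    u = np.linalg.solve(M, u)        # implicit update: M u^{n+1} = u^n

print("peak moved to x ~", x[np.argmax(u)])
```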
Al Zoubi, Fadi M; Eilayyan, Owis; Mayo, Nancy E; Bussières, André E
2017-10-01
The purpose of this systematic review was to investigate the extent to which the STarT Back Screening Tool (SBST) has been evaluated for (1) the quality of translation of evidence for cross-cultural adaptation and (2) the measurement properties in languages other than English. A systematic search of 8 databases, including Medline, Embase, CINAHL, PsycINFO, AMED, Scopus, PubMed, and Web of Science, was performed. Electronic databases were searched for the period between 2008 and December 27, 2016. We included studies related to cross-cultural adaptation, including translation and assessment of the measurement properties of SBST. Study selection, translation, methodologic and quality assessments, and data extraction were performed independently by 2 reviewers. Of the 1566 citations retrieved, 17 studies were admissible, representing 11 different SBST versions in 10 languages. The quadratic weighted κ statistics of the 2 reviewers, for the translation, methodologic assessment, and quality assessment were 0.85, 0.76, and 0.83, respectively. For translation, only 2 versions (Belgian-French and Mandarin) fulfilled all requirements. None of the versions had tested all the measurement properties, and when performed, these were found to have been conducted inadequately. With regard to quality assessment, overall, the included versions had a "Poor" total summary score except 2 (Persian and Swiss-German), which were rated as "Fair." Few versions fully met the standard criteria for valid translation, and none of the versions tested all the measurement properties. There is a clear need for more accurate cross-cultural adaptation of SBST and greater attention to the quality of psychometric evaluation of the adapted versions of SBST. At this time, caution is recommended when using SBST in languages other than English. Copyright © 2017. Published by Elsevier Inc.
Efanov, J I; Shine, J J; Darwich, R; Besner Morin, C; Arsenault, J; Harris, P G; Danino, A M; Izadpanah, A
2018-04-01
Patient-Reported Outcome Measures (PROMs) are important clinical devices for evaluating injuries and surgeries of the hand. However, some of the most widely used questionnaires, such as the MHQ and bMHQ, are currently unavailable in French, which prevents them from being used in the French Canadian province of Quebec as well as in other French-speaking nations. We therefore intend to develop valid and culturally adapted French translations of the aforementioned questionnaires. Two independent bilingual translators converted all English questionnaires to French. Two distinct translators then translated the French versions back to English in reverse-blinded fashion. Discrepancies between the original and second English versions were examined by a committee of four bilingual healthcare professionals before final French translations of all documents were produced. Thirty patients bilingual in French and English were then asked to complete the original and French versions of the MHQ and bMHQ. Their answers were compared in order to assess the accuracy of our translation. In light of these findings, revised French versions were produced. The French versions of the MHQ and bMHQ questionnaires showed metrological qualities of validity and fidelity, with an inter-class correlation greater than 0.90 and a kappa coefficient of 0.81 to 1. Clinical applicability revealed that the distribution of scores according to disease process was reproducible between the English and French versions. PROM translation requires a rigorous process in order to achieve strong metrological qualities in both the original and translated versions. We produced French translations of the MHQ and bMHQ by abiding by the Beaton method of cross-cultural adaptation of self-reported measures. Copyright © 2017 SFCM. Published by Elsevier Masson SAS. All rights reserved.
VizieR Online Data Catalog: Opacities from the Opacity Project (Seaton+, 1995)
NASA Astrophysics Data System (ADS)
Seaton, M. J.; Yan, Y.; Mihalas, D.; Pradhan, A. K.
1997-08-01
1 CODES. ***** 1.1 Code rop.for ************ This code reads opacity files written in standard OP format. Its main purpose is to provide documentation on the contents of the files. This code, like the other codes provided, prompts for the name of the file (or files) to be read. The file names read in response to the prompt may have up to 128 characters. 1.2 Code opfit.for ************** This code reads opacity files in standard OP format, and provides for interpolation of opacities to any required values of temperature and mass-density. The method used is described in OPF. The code prompts for the name of a file giving all required control parameters. As an example, the file opfit.dat is provided (users will need to change directory names and file names). The use of opfit.for is illustrated using opfit.dat. Most users will probably want to adapt opfit.for for use as a subroutine in other codes. Timings for DEC 7000 ALPHA: 0.3 sec for data read and initialisations; then 0.0007 sec for each temperature-density point. Users who like OPAL formats should note that opfit.for has a facility to produce files of OP data in OPAL-type formats. 1.3 Code ixz.for ************ This code provides for interpolations to any required values of X and Z. See IXZ. It prompts for the name of a file giving all required control parameters. An example of such a file is provided, ixz.dat (the user will need to change directory and file names). The output files have names s92INT.'nnn'. The user specifies the first value of nnn, and the number of files to be produced. 2. DATA FILES ********** 2.1 Data files for solar metal-mix ****************************** Data for solar metal-mix s92 as defined in SYMP. These files are from version 2 runs of December 1994 (see IXZ for details on Version 2). There are 213 files with names s92.'nnn', 'nnn'=201 to 413. Each file occupies 83762 bytes. The file s92.version2 gives values of X (hydrogen mass-fraction) and Z (metals mass-fraction) for each value of 'nnn'. The user can get s92.version2, select the values of 'nnn' required, then get the required files s92.'nnn'. The user can see the file in ftp, displayed on the screen, by typing "get s92.version2 -". The files s92.'nnn' can be used with opfit.for to obtain opacities for any required values of temperature and mass density. Files for other metal-mixtures will be added in due course. Send requests to mjs@star.ucl.ac.uk. 2.2 Files for interpolation in X and Z ********************************** The data files have names s92xz.'mmm', where 'mmm'=001 to 096. They differ from the standard OP files (such as s92.'nnn' --- section 2.1 above) in that they contain information giving derivatives of opacities with respect to X and Z. Each file s92xz.'mmm' occupies 148241 bytes. The interpolations to any required values of X and Z are made using ixz.for. Timings: on DEC 7000 ALPHA, 2.16 sec for each new-mixture file. For interpolations to some specified values of X and Z, one requires just 4 files s92xz.'mmm'. Most users will not require the complete set of files s92xz.'mmm'. The file s92xz.index includes a table (starting on line 3) giving values, for each 'mmm' file, of x,y,z (abundances by number-fractions) and X,Y,Z (abundances by mass-fractions). Users are advised to get the file s92xz.index, select values of 'mmm' for the files required, then get those files. The files produced by ixz.for are in standard OP format and can be used with opfit.for to obtain opacities for any required values of temperature and mass density.
3 RECOMMENDED PROCEDURE FOR USE OF OPACITY FILES ********************************************** (1) Get the file s92.version2. (2) If the values of X and Z you require are available in the files s92.'nnn' then get those files. (3) If not, get the file s92xz.index. (4) Select from s92xz.index the values of 'mmm' which cover the range of X and Z in which you are interested. Get those files and use ixz.for to generate files for your exact required values of X and Z. (5) Note that the exact abundance mixtures used are specified in each file (see rop.for). Also each run of opfit.for produces a table of abundances. (6) If you want a metal-mix different from that of s92, contact mjs@star.ucl.ac.uk. 4 FUTURE DEVELOPMENTS ******************* (1) Data for the calculation of radiative forces are provided as the CDS catalog
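As a convenience for step (2), the sketch below shows one way the index could be scanned to list the s92.'nnn' files matching a requested composition. The three-column layout (nnn, X, Z) assumed here is hypothetical; rop.for remains the authoritative documentation of the OP file formats.

# Hypothetical reader for an index laid out as: nnn  X  Z  (one file per row).
def matching_files(index_path, x_req, z_req, tol=1e-4):
    matches = []
    with open(index_path) as index:
        for line in index:
            fields = line.split()
            if len(fields) < 3:
                continue                      # skip headers and blank lines
            try:
                nnn, x, z = fields[0], float(fields[1]), float(fields[2])
            except ValueError:
                continue                      # skip non-numeric rows
            if abs(x - x_req) < tol and abs(z - z_req) < tol:
                matches.append("s92." + nnn)
    return matches

# Example: files for X = 0.70, Z = 0.02; if the list is empty, fall back to
# the s92xz.'mmm' files and ixz.for as described in step (4).
print(matching_files("s92.version2", 0.70, 0.02))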
Human factors research plan for instrument procedures : FY12 version 1.1
DOT National Transportation Integrated Search
2012-06-19
This research will support the development of instrument procedures for performance-based navigation (PBN) operations. These procedures include, but are not limited to, area navigation (RNAV) and required navigation performance (RNP) operations. The ...
75 FR 22577 - Proposed Notice and Comment Policy Version 2.0
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-29
... current policy limits EAC's ability to address the rare situations that require swift action. The proposed... Proposed Notice and Comment Policy 2.0. EAC's current Notice and Comment Policy is to provide effective...
Abbreviated Version Resource Conservation and Recovery Act (RCRA) Statutory Checklist
The RCRA Statutory Checklist is provided to aid attorneys and others in reviewing and documenting statutory provisions required for authorization under Section 3006(b) of the Resource Conservation and Recovery Act (RCRA), as amended.
Aviation Environmental Design Tool (AEDT): Version 2b: Installation Guide : [December 2015
DOT National Transportation Integrated Search
2015-12-01
Aviation Environmental Design Tool (AEDT) is a software system that models aircraft performance in space and time to estimate fuel consumption, emissions, noise, and air quality consequences. AEDT facilitates environmental review activities required ...
Aviation Environmental Design Tool (AEDT): Version 2b: Installation Guide : [June 2016
DOT National Transportation Integrated Search
2016-06-01
Aviation Environmental Design Tool (AEDT) is a software system that models aircraft performance in space and time to estimate fuel consumption, emissions, noise, and air quality consequences. AEDT facilitates environmental review activities required ...
Aviation Environmental Design Tool (AEDT): Version 2b: Installation Guide : [July 2015
DOT National Transportation Integrated Search
2015-07-01
Aviation Environmental Design Tool (AEDT) is a software system that models aircraft performance in space and time to estimate fuel consumption, emissions, noise, and air quality consequences. AEDT facilitates environmental review activities required ...
Highway Economic Requirements System - v. IV. Technical Report (Version 2)
DOT National Transportation Integrated Search
1999-05-04
This paper presents background information for evaluating a possible relationship between the geographic extent of broadband telecommunications infrastructure available for general use, and the level of control exercised by the public right-of-way (RO...
Discrimination of tonal and atonal music in congenital amusia: The advantage of implicit tasks.
Tillmann, Barbara; Lalitte, Philippe; Albouy, Philippe; Caclin, Anne; Bigand, Emmanuel
2016-05-01
Congenital amusia is a neurodevelopmental disorder of music perception and production, which has been attributed to a major deficit in pitch processing. While most studies and diagnosis tests have used explicit investigation methods, recent studies using implicit investigation approaches have revealed some unimpaired pitch structure processing in congenital amusia. The present study investigated amusic individuals' processing of tonal structures (e.g., musical structures respecting the Western tonal system) via three different questions. Amusic participants and their matched controls judged tonal versions (original musical excerpts) and atonal versions (with manipulated pitch content to remove tonal structures) of 12 musical pieces. For each piece, participants answered three questions that required judgments from different perspectives: an explicit structural one, a personal, emotional one and a more social one (judging the perception of others). Results revealed that amusic individuals' judgments differed between tonal and atonal versions. However, the question type influenced the extent of the revealed structure processing: while amusic individuals were impaired for the question requiring explicit structural judgments, they performed as well as their matched controls for the two other questions. Together with other recent studies, these findings suggest that congenital amusia might be related to a disorder of the conscious access to music processing rather than music processing per se. Copyright © 2016 Elsevier Ltd. All rights reserved.
Machado, Inês Maria de Jesus; Bandeira, Marina Bittencourt; Pinheiro, Hélady Sanders; Dutra, Nathália Dos Santos
2015-10-01
Treatment adherence in hemodialysis is important for guaranteeing better results for patients, but Brazil still lacks validated assessment tools for this purpose. The current study aimed to perform a cross-cultural adaptation of the Renal Adherence Behaviour Questionnaire (RABQ) and the Renal Adherence Attitudes Questionnaire (RAAQ). The two questionnaires were submitted to the following cross-cultural adaptation procedures: translation, back-translation, expert panel review, and pilot study. Changes were made in the items' wording and application, which requires a face-to-face interview. It was not necessary to change the choices of answers. The Brazilian versions of the RABQ and RAAQ showed semantic and cultural equivalence to the original versions and are easy for the target population to understand. The two scales still require validity and reliability studies before use in the field.
Tool for Rapid Analysis of Monte Carlo Simulations
NASA Technical Reports Server (NTRS)
Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.
2011-01-01
Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The Tool for Rapid Analysis of Monte Carlo simulations (TRAM) has been used in recent design and analysis work for the Orion vehicle, greatly decreasing the time it takes to evaluate performance requirements. A previous version of this tool was developed to automatically identify driving design variables in Monte Carlo data sets. This paper describes a new, parallel version of TRAM implemented on a graphics processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
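The core idea of flagging driving design variables can be sketched in a few lines: compare the distribution of each Monte Carlo input between failed and successful runs and rank the variables by how strongly they separate the two groups. The snippet below is only an illustration of that idea, not the TRAM (or GPU) implementation; the column names are hypothetical.

import pandas as pd
from scipy.stats import ks_2samp

def rank_driving_variables(df, fail_col="failed"):
    # df: one row per Monte Carlo run; fail_col is a boolean failure flag.
    failed, passed = df[df[fail_col]], df[~df[fail_col]]
    scores = {}
    for col in df.columns:
        if col == fail_col:
            continue
        result = ks_2samp(failed[col], passed[col])
        scores[col] = result.statistic    # larger statistic = stronger separation
    return pd.Series(scores).sort_values(ascending=False)

# Usage: runs = pd.read_csv("monte_carlo_runs.csv"); print(rank_driving_variables(runs).head())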
Angle-of-Attack-Modulated Terminal Point Control for Neptune Aerocapture
NASA Technical Reports Server (NTRS)
Queen, Eric M.
2004-01-01
An aerocapture guidance algorithm based on a calculus of variations approach is developed, using angle of attack as the primary control variable. Bank angle is used as a secondary control to alleviate angle of attack extremes and to control inclination. The guidance equations are derived in detail. The controller has very small onboard computational requirements and is robust to atmospheric and aerodynamic dispersions. The algorithm is applied to aerocapture at Neptune. Three versions of the controller are considered with varying angle of attack authority. The three versions of the controller are evaluated using Monte Carlo simulations with expected dispersions.
Wang, Pengran; Benhenda, Shirine; Wu, Haiyan; Lallemand-Breitenbach, Valérie; Zhen, Tao; Jollivet, Florence; Peres, Laurent; Li, Yuwen; Chen, Sai-Juan; Chen, Zhu; de Thé, Hugues; Meng, Guoyu
2018-05-04
In the originally published version of this Article, the authors Sai-Juan Chen and Zhu Chen were incorrectly listed as being affiliated with 'University Paris Diderot, Sorbonne Paris Cité, INSERM U944, CNRS UMR7212, Equipe labellisée LNCC, Hôpital St. Louis 1, Paris 75475, France', and the affiliation 'Institute of Health Sciences, Shanghai Institutes for Biological Sciences and Graduate School, Chinese Academy of Sciences, 320 Yueyang Road, Shanghai 200031, China' was inadvertently omitted. These errors have now been corrected in both the PDF and HTML versions of the Article.
User's Manual for the Langley Aerothermodynamic Upwind Relaxation Algorithm (LAURA)
NASA Technical Reports Server (NTRS)
Gnoffo, Peter A.; Cheatwood, F. McNeil
1996-01-01
This user's manual provides detailed instructions for the installation and the application of version 4.1 of the Langley Aerothermodynamic Upwind Relaxation Algorithm (LAURA). LAURA provides simulation of the flow field in thermochemical nonequilibrium around vehicles traveling at hypersonic velocities through the atmosphere. Earlier versions of LAURA were predominantly research codes, and they had minimal (or no) documentation. This manual describes UNIX-based utilities for customizing the code for special applications that also minimize system resource requirements. The algorithm is reviewed, and the various program options are related to specific equations and variables in the theoretical development.
MODIS information, data and control system (MIDACS) operations concepts
NASA Technical Reports Server (NTRS)
Han, D.; Salomonson, V.; Ormsby, J.; Ardanuy, P.; Mckay, A.; Hoyt, D.; Jaffin, S.; Vallette, B.; Sharts, B.; Folta, D.
1988-01-01
The MODIS Information, Data, and Control System (MIDACS) Operations Concepts Document provides a basis for the mutual understanding between the users and the designers of the MIDACS, including the requirements, operating environment, external interfaces, and development plan. In defining the concepts and scope of the system, how the MIDACS will operate as an element of the Earth Observing System (EOS) within the EosDIS environment is described. This version follows an earlier release of a preliminary draft version. The individual operations concepts for planning and scheduling, control and monitoring, data acquisition and processing, calibration and validation, data archive and distribution, and user access do not yet fully represent the requirements of the data system needed to achieve the scientific objectives of the MODIS instruments and science teams. The teams are not yet formed; however, it is possible to develop the operations concepts based on the present concept of EosDIS, the level 1 and level 2 Functional Requirements Documents, and through interviews and meetings with key members of the scientific community. The operations concepts were exercised through the application of representative scenarios.
From stable to unstable anomaly-induced inflation
NASA Astrophysics Data System (ADS)
Netto, Tibério de Paula; Pelinson, Ana M.; Shapiro, Ilya L.; Starobinsky, Alexei A.
2016-10-01
Quantum effects derived through the conformal anomaly lead to an inflationary model that can be either stable or unstable. The unstable version requires a large dimensionless coefficient of about 5×10^8 in front of the R^2 term, which results in the inflationary regime of the R+R^2 ("Starobinsky") model being a generic intermediate attractor. In this case the non-local terms in the effective action are practically irrelevant, and there is a 'graceful exit' to a low-curvature, matter-like dominated stage driven by high-frequency oscillations of R (scalarons), which later decay to pairs of all particles and antiparticles, with the amount of primordial scalar (density) perturbations required by observations. The stable version is a genuine generic attractor, so there is no exit from it. We discuss a possible transition from stable to unstable phases of inflation. It is shown that this transition is automatic if the sharp cut-off approximation is assumed for quantum corrections in the period of transition. Furthermore, we describe two different quantum mechanisms that may provide the required large R^2 term in the transition period.
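For orientation only (a sketch in one common normalization, not necessarily the conventions of the paper), the quoted coefficient corresponds to writing the gravitational action as

S = \int d^4x \, \sqrt{-g} \left[ \frac{M_P^2}{2} R + c\,R^2 \right], \qquad c = \frac{M_P^2}{12 M^2} \approx 5\times 10^{8},

where M_P is the reduced Planck mass and M \approx 1.3\times 10^{-5} M_P is the scalaron mass fixed by the observed amplitude of primordial scalar perturbations.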
Masquillier, Caroline; Wouters, Edwin; Loos, Jasna; Nöstlinger, Christiana
2012-01-01
Background and Objectives Access to antiretroviral treatment among adolescents living with HIV (ALH) is increasing. Health-related quality of life (HRQOL) is relevant for monitoring the impact of the disease on both well-being and treatment outcomes. However, adequate screening tools to assess HRQOL in low-resource settings are scarce. This study aims to fill this research gap, by 1) assessing the psychometric properties and reliability of an Eastern African English version of a European HRQOL scale for adolescents (KIDSCREEN) and 2) determining which version of the KIDSCREEN (52-, 27- and 10-item version) is most suitable for low-resource settings. Methods The KIDSCREEN was translated into Eastern African English, Luganda (Uganda) and Dholuo (Kenya) according to standard procedures. The reconciled version was administered in 2011 to ALH aged 13–17 in Kenya (n = 283) and Uganda (n = 299). All three KIDSCREEN versions were fitted to the data with confirmatory factor analysis (CFA). After comparison, the most suitable version was adapted based on the CFA outcomes utilizing the results of previous formative research. In order to develop a general HRQOL factor, a second-order measurement model was fitted to the data. Results The CFA results showed that without adjustments, the KIDSCREEN cannot be used for measuring the HRQOL of HIV-positive adolescents. After comparison, the most suitable version for low-resource settings - the 27-item version - was adapted further. The introduction of a negative wording factor was required for the Dholuo model. The Dholuo (CFI: 0.93; RMSEA: 0.039) and the Luganda model (CFI: 0.90; RMSEA: 0.052) showed a good fit. All Cronbach's alphas of the factors were 0.70 or above. The alpha values of the Dholuo and Luganda HRQOL second-order factors were 0.84 and 0.87, respectively. Conclusions The study showed that the adapted KIDSCREEN-27 is an adequate tool for measuring HRQOL in low-resource settings with high HIV prevalence. PMID:22815776
SIENA Customer Problem Statement and Requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. Sauer; R. Clay; C. Adams
2000-08-01
This document describes the problem domain and functional requirements of the SIENA framework. The software requirements and system architecture of SIENA are specified in separate documents (called SIENA Software Requirement Specification and SIENA Software Architecture, respectively). While currently this version of the document describes the problems and captures the requirements within the Analysis domain (concentrating on finite element models), it is our intention to subsequently expand this document to describe problems and capture requirements from the Design and Manufacturing domains. In addition, SIENA is designed to be extendible to support and integrate elements from the other domains (see the SIENA Software Architecture document).
CMMI (Trademark) for Acquisition, Version 1.2
2007-11-01
Operational concepts and scenarios are used to derive more detailed and precise contractual requirements; the level of detail of contractual requirements is based on the acquisition strategy and on the product components acquired from suppliers. Requirements analysis examines the impact the intended operational environment will have on the ability to satisfy stakeholder needs, including considerations such as constraints and interfaces, and translates these into contractual requirements (SP 3.1, Establish Operational Concepts and Scenarios; Validate Requirements).
JMFA2—a graphically interactive Java program that fits microfibril angle X-ray diffraction data
Steve P. Verrill; David E. Kretschmann; Victoria L. Herian
2006-01-01
X-ray diffraction techniques have the potential to decrease the time required to determine microfibril angles dramatically. In this paper, we discuss the latest version of a curve-fitting tool that permits us to reduce the time required to evaluate MFA X-ray diffraction patterns. Further, because this tool reflects the underlying physics more accurately than existing...
Reference Model for Project Support Environments Version 1.0
1993-02-28
These services have a relationship with the framework's Process Support services and with the Lifecycle Process Engineering services. Examples: ORCA (Object-based Requirements Capture and Analysis); RETRAC (REquirements TRACeability). Section 4.3, Life-Cycle Process services, covers operations beyond "traditional" computer tools; examples of audio and video processing operations include: create, modify, and delete sound and video data.
Sarge, Melanie A; Knobloch-Westerwick, Silvia
2013-01-01
Health information search is among the most popular Internet activities, requiring health campaigns to attract attention in a context of unprecedented competition with alternative content. The present study reconstructs a similar context that allows selective avoidance and exposure in order to examine which health message characteristics foster particular message impacts. Drawing on social cognitive theory, a 3-session study examined short-term and delayed impacts of efficacy and exemplification as characteristics of a weight loss online message, offered for selective reading among other content, on weight management self-efficacy, satisfaction, and personal importance. Short-term impacts and impacts 2 weeks after exposure reflect that the high-efficacy exemplar version increased self-efficacy and satisfaction, while the high-efficacy base-rate version lowered them. However, the exemplar and base-rate versions of the low-efficacy message increased importance of body weight management.
Methadone disrupts performance on the working memory version of the Morris water task.
Hepner, Ilana J; Homewood, Judi; Taylor, Alan J
2002-05-01
The aim of the study was to examine if administration of the mu-opiate agonist methadone hydrochloride resulted in deficits in performance on the Morris water task, a widely used test of spatial cognition. To this end, after initial training on the task, Long-Evans rats were administered saline or methadone at either 1.25, 2.5 or 5 mg/kg ip 15 min prior to testing. The performance of the highest-dose methadone group was inferior to that of the controls on the working memory version of the Morris task. There were also differences between the groups on the reference memory version of the task, but this result cannot be considered reliable. These data show that methadone has its most profound effect on cognition in rats when efficient performance on the task requires attention to and retention of new information, in this case, the relationship between platform location and the extramaze cues.
Brown, J E; Alfonso, B; Avila, R; Beresford, N A; Copplestone, D; Hosseini, A
2016-03-01
A new version of the ERICA Tool (version 1.2) was released in November 2014; this constitutes the first major update of the Tool since release in 2007. The key features of the update are presented in this article. Of particular note are new transfer databases extracted from an international compilation of concentration ratios (CRwo-media) and the modification of 'extrapolation' approaches used to select transfer data in cases where information is not available. Bayesian updating approaches have been used in some cases to draw on relevant information that would otherwise have been excluded in the process of deriving CRwo-media statistics. All of these efforts have in turn led to the requirement to update Environmental Media Concentration Limits (EMCLs) used in Tier 1 assessments. Some of the significant changes with regard to EMCLs are highlighted. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
Production Experiences with the Cray-Enabled TORQUE Resource Manager
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ezell, Matthew A; Maxwell, Don E; Beer, David
High performance computing resources utilize batch systems to manage the user workload. Cray systems are uniquely different from typical clusters due to Cray's Application Level Placement Scheduler (ALPS). ALPS manages binary transfer, job launch and monitoring, and error handling. Batch systems require special support to integrate with ALPS using an XML protocol called BASIL. Previous versions of Adaptive Computing's TORQUE and Moab batch suite integrated with ALPS from within Moab, using PERL scripts to interface with BASIL. This would occasionally lead to problems when all the components would become unsynchronized. Version 4.1 of the TORQUE Resource Manager introduced new features that allow it to directly integrate with ALPS using BASIL. This paper describes production experiences at Oak Ridge National Laboratory using the new TORQUE software versions, as well as ongoing and future work to improve TORQUE.
European guidelines for workplace drug testing in urine.
Taskinen, Sanna; Beck, Olof; Bosch, Tessa; Brcak, Michaela; Carmichael, Duncan; Fucci, Nadia; George, Claire; Piper, Mark; Salomone, Alberto; Schielen, Wim; Steinmeyer, Stefan; Weinmann, Wolfgang
2017-06-01
These European Guidelines for Workplace Drug Testing in Urine have been prepared and updated by the European Workplace Drug Testing Society (EWDTS). The first version of these urine guidelines was published in 2002. Since then, the guidelines have been followed by many laboratories in different European countries and their role has been essential particularly in countries lacking legislation for workplace drug testing. In 2014, the EWDTS started a guidelines updating project and published a new version of the urine guidelines in 2015. Here we present this updated version of the urine guidelines. The European Guidelines are designed to establish best practice procedures whilst allowing individual countries to operate within the requirements of national customs and legislation. The EWDTS recommends that all European laboratories that undertake legally defensible workplace drug testing should use these guidelines as a template for accreditation. Copyright © 2017 John Wiley & Sons, Ltd.
Integrated Medical Model (IMM) 4.0 Enhanced Functionalities
NASA Technical Reports Server (NTRS)
Young, M.; Keenan, A. B.; Saile, L.; Boley, L. A.; Walton, M. E.; Shah, R. V.; Kerstman, E. L.; Myers, J. G.
2015-01-01
The Integrated Medical Model is a probabilistic simulation model that uses input data on 100 medical conditions to simulate expected medical events, the resources required to treat them, and the resulting impact to the mission for specific crew and mission characteristics. The newest development version of IMM, IMM v4.0, adds capabilities that remove some of the conservative assumptions that underlie the current operational version, IMM v3. While IMM v3 provides the framework to simulate whether a medical event occurred, IMM v4.0 also simulates when the event occurred during a mission timeline. This allows for more accurate estimation of mission time lost and resource utilization. In addition to the mission timeline, IMM v4.0 features two enhancements that address IMM v3 assumptions regarding medical event treatment. Medical events in IMM v3 are assigned the untreated outcome if any resource required to treat the event was unavailable. IMM v4.0 allows for partially treated outcomes that are proportional to the amount of required resources available, thus removing the dichotomous treatment assumption. An additional capability of IMM v4.0 is the use of an alternative medical resource when the primary resource assigned to the condition is depleted, more accurately reflecting the real-world system. The additional capabilities defining IMM v4.0 (the mission timeline, partial treatment, and the alternate resource) result in more realistic predicted mission outcomes. The primary model outcomes of IMM v4.0 for the ISS6 mission, including mission time lost, probability of evacuation, and probability of loss of crew life, are compared to those produced by the current operational version of IMM to showcase the enhanced prediction capabilities.
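A heavily simplified toy of the partial-treatment and alternate-resource rules described above might look like the sketch below. It is a hypothetical illustration only, not the IMM; the resource names and quantities are made up.

def treat(condition_needs, inventory, alternates=None):
    # condition_needs: {resource: units required}; inventory is mutated in place.
    # Returns the fraction of the required treatment actually delivered:
    # 1.0 = fully treated, 0.0 = untreated, anything in between = partial.
    delivered, needed = 0, sum(condition_needs.values())
    for resource, units in condition_needs.items():
        source = resource
        if inventory.get(source, 0) == 0 and alternates:
            source = alternates.get(resource, source)   # alternate-resource rule
        used = min(units, inventory.get(source, 0))
        inventory[source] = inventory.get(source, 0) - used
        delivered += used
    return delivered / needed

inventory = {"drug_A": 1, "drug_B": 5}
print(treat({"drug_A": 3}, inventory, alternates={"drug_A": "drug_B"}))  # partial outcome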
Process design and economic analysis of the zinc selenide thermochemical hydrogen cycle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Otsuki, H.H.; Krikorian, O.H.
1978-09-06
A detailed preliminary design for a hydrogen production plant has been developed based on an improved version of the ZnSe thermochemical cycle for decomposing water. In the latest version of the cycle, ZnCl2 is converted directly to ZnO through high-temperature steam hydrolysis. This eliminates the need for first converting ZnCl2 to ZnSO4 and also slightly reduces the overall heat requirement. Moreover, it broadens the temperature range over which prime heat is required and improves the coupling of the cycle with a nuclear reactor heat source. The ZnSe cycle is driven by a very-high-temperature nuclear reactor (VHTR) proposed by Westinghouse that provides a high-temperature (1283 K) helium working gas for process heat and power. The plant is sized to produce 27.3 Mg H2/h (60,000 lb H2/h) and requires specially designed equipment to perform the critical reaction steps in the cycle. We have developed conceptual designs for several of the important process steps to make cost estimates, and have obtained a cycle efficiency of about 40% and a hydrogen production cost of about $14/GJ. We believe that the cost is high because input data on reaction rates and equipment lifetimes have been conservatively estimated and the cycle parameters have not been optimized. Nonetheless, this initial analysis serves an important function in delineating areas in the cycle where additional research is needed to increase efficiency and reduce costs in a more advanced version of the cycle.
By Stuart G. Baker, 2017. This software fits a zero-intercept random effects linear model to data on surrogate and true endpoints in previous trials. Requirement: Mathematica Version 11 or later.
LAYER DEPENDENT ADVECTION IN CMAQ
The advection methods used in CMAQ require that the Courant-Friedrichs-Lewy (CFL) condition be satisfied for numerical stability and accuracy. In CMAQ prior to version 4.3, the ADVSTEP algorithm established CFL-safe synchronization and advection timesteps that were uniform throu...
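As an illustration of that constraint (and not the CMAQ ADVSTEP code itself), a CFL-safe advection sub-step can be picked from the gridded winds as in the sketch below; the variable names and the safety factor are assumptions.

import numpy as np

def cfl_safe_substep(u_wind, v_wind, dx, dy, sync_step, safety=0.75):
    # u_wind, v_wind: 2-D horizontal wind components (m/s); dx, dy: spacings (m)
    # sync_step: the outer synchronization timestep (s) to be subdivided.
    courant_rate = np.abs(u_wind) / dx + np.abs(v_wind) / dy   # per second
    dt_max = safety / courant_rate.max()      # largest step keeping the Courant number below 1
    n_sub = max(1, int(np.ceil(sync_step / dt_max)))
    return sync_step / n_sub                  # an integer number of equal sub-steps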
2013 Vehicle Theft Prevention Quick Reference Guide for the Law Enforcement Community
DOT National Transportation Integrated Search
2013-08-01
"This and future versions of the Vehicle TheftPrevention Quick Reference Guide for the Law Enforcement Community will provide comprehensive information for vehicle lines. The parts-marking requirements have been : extended to include: : all passe...
Additional Guidance for Evaluating and Calculating Degradation Kinetics in Environmental Media
EFED compiled examples in which results from PestDF (version 0.8.4), the tool most commonly used by USEPA to conduct kinetic analysis following the NAFTA guidance, required additional interpretation. Some of these examples are presented here.
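For context, the single first-order (SFO) model that such kinetic analyses usually start from can be fit in a few lines. The sketch below is illustrative only: the observations are made-up placeholders and the code does not reproduce PestDF (version 0.8.4).

import numpy as np
from scipy.optimize import curve_fit

def sfo(t, c0, k):
    return c0 * np.exp(-k * t)                      # C(t) = C0 * exp(-k*t)

t_obs = np.array([0.0, 7.0, 14.0, 30.0, 60.0, 90.0])     # days (hypothetical)
c_obs = np.array([100.0, 81.0, 66.0, 42.0, 18.0, 8.0])   # % applied (hypothetical)

(c0_fit, k_fit), _ = curve_fit(sfo, t_obs, c_obs, p0=(100.0, 0.02))
dt50 = np.log(2.0) / k_fit                          # half-life implied by the fit
print(f"C0 = {c0_fit:.1f}, k = {k_fit:.4f} per day, DT50 = {dt50:.1f} days")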
Aviation Environmental Design Tool (AEDT): Version 2c Service Pack 2: Installation Guide
DOT National Transportation Integrated Search
2017-03-01
Aviation Environmental Design Tool (AEDT) is a software system that models aircraft performance in space and time to estimate fuel consumption, emissions, noise, and air quality consequences. AEDT facilitates environmental review activities required ...
NASA Technical Reports Server (NTRS)
1987-01-01
A conceptual design study of an aeroassisted orbital transfer vehicle is discussed. Nicknamed TAXI, it will ferry personnel and cargo: (1) between low Earth orbit and a spacecraft circling around the Sun in permanent orbit intersecting gravitational fields of Earth and Mars, and (2) between the cycling spacecraft and a Mars orbiting station, co-orbiting with Phobos. Crew safety and mission flexibility (in terms of ability to provide a wide range of delta-V) were given high priority. Three versions were considered, using the same overall configuration based on a low L/D aerobrake with the geometry of a raked off elliptical cone with ellipsoidal nose and a toroidal skirt. The propulsion system consists of three gimballed LOX/LH2 engines firing away from the aerobrake. The versions differ mainly in the size of the aeroshields and propellant tanks. TAXI A version resulted from an initial effort to design a single transfer vehicle able to meet all delta-V requirements during the 15-year period (2025 to 2040) of Mars mission operations. TAXI B is designed to function with the cycling spacecraft moving in a simplified, nominal trajectory. On Mars missions, TAXI B would be able to meet the requirements of all the missions with a relative approach velocity near Mars of less than 9.3 km/sec. Finally, TAXI C is a revision of TAXI A, a transfer vehicle designed for missions with a relative velocity near Mars larger than 9.3 km/sec. All versions carry a crew of 9 (11 with modifications) and a cargo of 10000 lbm. Trip duration varies from 1 day for transfer from LEO to the cycling ship to nearly 5 days for transfer from the ship to the Phobos orbit.
Kaneko, Mei; Sato, Iori; Soejima, Takafumi; Kamibeppu, Kiyoko
2014-09-01
The purpose of the study is to develop a Japanese version of the Pediatric Quality of Life Inventory (PedsQL) Generic Core Scales Young Adult Version (PedsQL-YA-J) and determine the feasibility, reliability, and validity of the scales. Translation equivalence and content validity were verified using back-translation and cognitive debriefing tests. A total of 428 young adults recruited from one university, two vocational schools, or five companies completed questionnaires. We determined questionnaire feasibility, internal consistency, and test-retest reliability; checked concurrent validity against the Center for Epidemiologic Studies Depression Scale (CES-D); determined convergent and discriminant validity with the Medical Outcome Study 36-item Short Form Health Survey (SF-36); described known-groups validity with regard to subjective symptoms, illness or injury requiring regular medical visits, and depression; and verified factorial validity. All scales were internally consistent (Cronbach's coefficient alpha = 0.77-0.86); test-retest reliability was acceptable (intraclass correlation coefficient = 0.57-0.69); and all scales were concurrently valid with depression (Pearson's correlation coefficient = 0.43-0.57). The scales' convergent and discriminant validity with the SF-36 and CES-D were acceptable. Evaluation of known-groups validity confirmed that the Physical Functioning scale was sensitive for subjective symptoms, the Emotional Functioning scale for depression, and the Work/School Functioning scale for illness or injury requiring regular medical visits. Exploratory factor analysis found a six-factor structure consistent with the assumed structure (cumulative proportion = 57.0%). The PedsQL-YA-J is suitable for assessing health-related quality of life in young adults in education, employment, or training, and for clinical trials and epidemiological research.
International Spinal Cord Injury Core Data Set (version 2.0)-including standardization of reporting.
Biering-Sørensen, F; DeVivo, M J; Charlifue, S; Chen, Y; New, P W; Noonan, V; Post, M W M; Vogel, L
2017-08-01
The study design includes expert opinion, feedback, revisions and final consensus. The objective of the study was to present the new knowledge obtained since the International Spinal Cord Injury (SCI) Core Data Set (Version 1.0) published in 2006, and describe the adjustments made in Version 2.0, including standardization of data reporting. International. Comments received from the SCI community were discussed in a working group (WG); suggestions from the WG were reviewed and revisions were made. All suggested revisions were considered, and a final version was circulated for final approval. The International SCI Core Data Set (Version 2.0) consists of 25 variables. Changes made to this version include the deletion of one variable 'Total Days Hospitalized' and addition of two variables 'Date of Rehabilitation Admission' and 'Date of Death.' The variable 'Injury Etiology' was extended with six non-traumatic categories, and corresponding 'Date of Injury' for non-traumatic cases, was defined as the date of first physician visit for symptoms related to spinal cord dysfunction. A category reflecting transgender was added. A response category was added to the variable on utilization of ventilatory assistance to document the use of continuous positive airway pressure for sleep apnea. Other clarifications were made to the text. The reporting of the pediatric SCI population was updated as age groups 0-5, 6-12, 13-14, 15-17 and 18-21. Collection of the core data set should be a basic requirement of all studies of SCI to facilitate accurate descriptions of patient populations and comparison of results across published studies from around the world.
Validation study of the Leeds Dyspepsia Questionnaire in a multi-ethnic Asian population.
Mahadeva, Sanjiv; Chan, Wah-Kheong; Mohazmi, Mohammed; Sujarita, Ramanujam; Goh, Khean-Lee
2011-11-01
Outcome measures for clinical trials in dyspepsia require an assessment of symptom response. There is a lack of validated instruments assessing dyspepsia symptoms in the Asian region. We aimed to translate and validate the Leeds Dyspepsia Questionnaire (LDQ) in a multi-ethnic Asian population. A Malay and a culturally adapted English version of the LDQ were developed according to established protocols. Psychometric evaluation was performed by assessing the validity, internal consistency, test-retest reliability and responsiveness of the instruments in both primary and secondary care patients. Between April and September 2010, both Malay (n=166) and Malaysian English (n=154) versions were assessed in primary and secondary care patients. Both language versions were found to be reliable (internal consistency was 0.80 and 0.74 (Cronbach's α) for Malay and English, respectively; Spearman's correlation coefficient for test-retest reliability was 0.98 for both versions), valid (area under the receiver operating curve for accuracy of diagnosing dyspepsia was 0.71 and 0.77 for the Malay and English versions, respectively), discriminative (median LDQ score discriminated between primary and secondary care patients in Malay (11.0 vs 20.0, P<0.0001) and English (10.0 vs 14.0, P=0.001)), and responsive (median LDQ score was reduced after treatment for dyspepsia in Malay (17.0 to 14.0, P=0.08) and English (18.0 to 11.0, P=0.008)). The Malaysian versions of the LDQ are valid, reliable and responsive instruments for assessing symptoms in a multi-ethnic Asian population with dyspepsia. © 2011 Journal of Gastroenterology and Hepatology Foundation and Blackwell Publishing Asia Pty Ltd.
Ahn, SangNam; Smith, Matthew Lee; Altpeter, Mary; Belza, Basia; Post, Lindsey; Ory, Marcia G.
2015-01-01
Maintaining intervention fidelity should be part of any programmatic quality assurance (QA) plan and is often a licensure requirement. However, fidelity checklists designed by original program developers are often lengthy, which makes compliance difficult once programs become widely disseminated in the field. As a case example, we used Stanford’s original Chronic Disease Self-Management Program (CDSMP) fidelity checklist of 157 items to demonstrate heuristic procedures for generating shorter fidelity checklists. Using an expert consensus approach, we sought feedback from active master trainers registered with the Stanford University Patient Education Research Center about which items were most essential to, and also feasible for, assessing fidelity. We conducted three sequential surveys and one expert group-teleconference call. Three versions of the fidelity checklist were created using different statistical and methodological criteria. In a final group-teleconference call with seven national experts, there was unanimous agreement that all three final versions (e.g., a 34-item version, a 20-item version, and a 12-item version) should be made available because the purpose and resources for administering a checklist might vary from one setting to another. This study highlights the methodology used to generate shorter versions of a fidelity checklist, which has potential to inform future QA efforts for this and other evidence-based programs (EBP) for older adults delivered in community settings. With CDSMP and other EBP, it is important to differentiate between program fidelity as mandated by program developers for licensure, and intervention fidelity tools for providing an “at-a-glance” snapshot of the level of compliance to selected program indicators. PMID:25964941
Yang, Nan; Waddington, Gordon; Adams, Roger; Han, Jia
2018-05-01
Quantitative assessments of handedness and footedness are often required in studies of human cognition and behaviour, yet no reliable Chinese versions of commonly used handedness and footedness questionnaires are available. Accordingly, the objective of the present study was to translate the Edinburgh Handedness Inventory (EHI) and the Waterloo Footedness Questionnaire-Revised (WFQ-R) into Mandarin Chinese and to evaluate the reliability and validity of these translated versions in healthy Chinese people. In the first stage of the study, Chinese versions of the EHI and WFQ-R were produced from a process of translation, back translation and examination, with necessary cultural adaptations. The second stage involved determining the reliability and validity of the translated EHI and WFQ-R for the Chinese population. One hundred and ten Chinese participants were tested online, and the results showed that the Cronbach's alpha coefficient of internal consistency was 0.877 for the translated EHI and 0.855 for the translated WFQ-R. Another 170 Chinese participants were tested and re-tested after a 30-day interval. The intra-class correlation coefficients showed high reliability, 0.898 for the translated EHI and 0.869 for the translated WFQ-R. This preliminary validation study found the translated versions to be reliable and valid tools for assessing handedness and footedness in this population.
The use of self checks and voting in software error detection - An empirical study
NASA Technical Reports Server (NTRS)
Leveson, Nancy G.; Cha, Stephen S.; Knight, John C.; Shimeall, Timothy J.
1990-01-01
The results of an empirical study of software error detection using self checks and N-version voting are presented. Working independently, each of 24 programmers first prepared a set of self checks using just the requirements specification of an aerospace application, and then each added self checks to an existing implementation of that specification. The modified programs were executed to measure the error-detection performance of the checks and to compare this with error detection using simple voting among multiple versions. The analysis of the checks revealed that there are great differences in the ability of individual programmers to design effective checks. It was found that some checks that might have been effective failed to detect an error because they were badly placed, and there were numerous instances of checks signaling nonexistent errors. In general, specification-based checks alone were not as effective as specification-based checks combined with code-based checks. Self checks made it possible to identify faults that had not been detected previously by voting 28 versions of the program over a million randomly generated inputs. This appeared to result from the fact that the self checks could examine the internal state of the executing program, whereas voting examines only final results of computations. If internal states had to be identical in N-version voting systems, then there would be no reason to write multiple versions.
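The two error-detection mechanisms compared in the study can be contrasted with a toy sketch. The code below is purely illustrative (the function names and interfaces are hypothetical): voting compares only the final results of the N versions, while a self check can also inspect whatever internal state the version exposes to it.

from collections import Counter

def vote(outputs):
    # Majority vote over the final results of N versions; returns the consensus
    # value (or None if there is no majority) and the set of dissenting versions.
    value, count = Counter(outputs).most_common(1)[0]
    if count <= len(outputs) // 2:
        return None, set(range(len(outputs)))
    return value, {i for i, out in enumerate(outputs) if out != value}

def run_with_self_check(version_fn, self_check, x):
    # version_fn returns (result, internal_state); the self check may flag an
    # error from the internal state even when the final result looks plausible.
    result, internal_state = version_fn(x)
    error_detected = not self_check(x, result, internal_state)
    return result, error_detected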
Continuous-time quantum Monte Carlo impurity solvers
NASA Astrophysics Data System (ADS)
Gull, Emanuel; Werner, Philipp; Fuchs, Sebastian; Surer, Brigitte; Pruschke, Thomas; Troyer, Matthias
2011-04-01
Continuous-time quantum Monte Carlo impurity solvers are algorithms that sample the partition function of an impurity model using diagrammatic Monte Carlo techniques. The present paper describes codes that implement the interaction expansion algorithm originally developed by Rubtsov, Savkin, and Lichtenstein, as well as the hybridization expansion method developed by Werner, Millis, Troyer, et al. These impurity solvers are part of the ALPS-DMFT application package and are accompanied by an implementation of dynamical mean-field self-consistency equations for (single orbital single site) dynamical mean-field problems with arbitrary densities of states. Program summary: Program title: dmft Catalogue identifier: AEIL_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEIL_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: ALPS LIBRARY LICENSE version 1.1 No. of lines in distributed program, including test data, etc.: 899 806 No. of bytes in distributed program, including test data, etc.: 32 153 916 Distribution format: tar.gz Programming language: C++ Operating system: The ALPS libraries have been tested on the following platforms and compilers: Linux with GNU Compiler Collection (g++ version 3.1 and higher), and Intel C++ Compiler (icc version 7.0 and higher) MacOS X with GNU Compiler (g++ Apple-version 3.1, 3.3 and 4.0) IBM AIX with Visual Age C++ (xlC version 6.0) and GNU (g++ version 3.1 and higher) compilers Compaq Tru64 UNIX with Compaq C++ Compiler (cxx) SGI IRIX with MIPSpro C++ Compiler (CC) HP-UX with HP C++ Compiler (aCC) Windows with Cygwin or coLinux platforms and GNU Compiler Collection (g++ version 3.1 and higher) RAM: 10 MB-1 GB Classification: 7.3 External routines: ALPS [1], BLAS/LAPACK, HDF5 Nature of problem: (See [2].) Quantum impurity models describe an atom or molecule embedded in a host material with which it can exchange electrons. They are basic to nanoscience as representations of quantum dots and molecular conductors and play an increasingly important role in the theory of "correlated electron" materials as auxiliary problems whose solution gives the "dynamical mean field" approximation to the self-energy and local correlation functions. Solution method: Quantum impurity models require a method of solution which provides access to both high and low energy scales and is effective for wide classes of physically realistic models. The continuous-time quantum Monte Carlo algorithms for which we present implementations here meet this challenge. Continuous-time quantum impurity methods are based on partition function expansions of quantum impurity models that are stochastically sampled to all orders using diagrammatic quantum Monte Carlo techniques. For a review of quantum impurity models and their applications and of continuous-time quantum Monte Carlo methods for impurity models we refer the reader to [2]. Additional comments: Use of dmft requires citation of this paper. Use of any ALPS program requires citation of the ALPS [1] paper. Running time: 60 s-8 h per iteration.
NASA Technical Reports Server (NTRS)
Manning, R. M.
1994-01-01
The frequency and intensity of rain attenuation affecting the communication between a satellite and an earth terminal is an important consideration in planning satellite links. The NASA Lewis Research Center Satellite Link Attenuation Model Program (LeRC-SLAM) provides a static and dynamic statistical assessment of the impact of rain attenuation on a communications link established between an earth terminal and a geosynchronous satellite. The program is designed for use in the specification, design and assessment of satellite links for any terminal location in the continental United States. The basis for LeRC-SLAM is the ACTS Rain Attenuation Prediction Model, which uses a log-normal cumulative probability distribution to describe the random process of rain attenuation on satellite links. The derivation of the statistics for the rainrate process at the specified terminal location relies on long term rainfall records compiled by the U.S. Weather Service during time periods of up to 55 years in length. The theory of extreme value statistics is also utilized. The user provides 1) the longitudinal position of the satellite in geosynchronous orbit, 2) the geographical position of the earth terminal in terms of latitude and longitude, 3) the height above sea level of the terminal site, 4) the yearly average rainfall at the terminal site, and 5) the operating frequency of the communications link (within 1 to 1000 GHz, inclusive). Based on the yearly average rainfall at the terminal location, LeRC-SLAM calculates the relevant rain statistics for the site using an internal data base. The program then generates rain attenuation data for the satellite link. This data includes a description of the static (i.e., yearly) attenuation process, an evaluation of the cumulative probability distribution for attenuation effects, and an evaluation of the probability of fades below selected fade depths. In addition, LeRC-SLAM calculates the elevation and azimuth angles of the terminal antenna required to establish a link with the satellite, the statistical parameters that characterize the rainrate process at the terminal site, the length of the propagation path within the potential rain region, and its projected length onto the local horizontal. The IBM PC version of LeRC-SLAM (LEW-14979) is written in Microsoft QuickBASIC for an IBM PC compatible computer with a monitor and printer capable of supporting an 80-column format. The IBM PC version is available on a 5.25 inch MS-DOS format diskette. The program requires about 30K RAM. The source code and executable are included. The Macintosh version of LeRC-SLAM (LEW-14977) is written in Microsoft Basic, Binary (b) v2.00 for Macintosh II series computers running MacOS. This version requires 400K RAM and is available on a 3.5 inch 800K Macintosh format diskette, which includes source code only. The Macintosh version was developed in 1987 and the IBM PC version was developed in 1989. IBM PC is a trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. Macintosh is a registered trademark of Apple Computer, Inc.
LERC-SLAM - THE NASA LEWIS RESEARCH CENTER SATELLITE LINK ATTENUATION MODEL PROGRAM (IBM PC VERSION)
NASA Technical Reports Server (NTRS)
Manning, R. M.
1994-01-01
The frequency and intensity of rain attenuation affecting the communication between a satellite and an earth terminal is an important consideration in planning satellite links. The NASA Lewis Research Center Satellite Link Attenuation Model Program (LeRC-SLAM) provides a static and dynamic statistical assessment of the impact of rain attenuation on a communications link established between an earth terminal and a geosynchronous satellite. The program is designed for use in the specification, design and assessment of satellite links for any terminal location in the continental United States. The basis for LeRC-SLAM is the ACTS Rain Attenuation Prediction Model, which uses a log-normal cumulative probability distribution to describe the random process of rain attenuation on satellite links. The derivation of the statistics for the rainrate process at the specified terminal location relies on long term rainfall records compiled by the U.S. Weather Service during time periods of up to 55 years in length. The theory of extreme value statistics is also utilized. The user provides 1) the longitudinal position of the satellite in geosynchronous orbit, 2) the geographical position of the earth terminal in terms of latitude and longitude, 3) the height above sea level of the terminal site, 4) the yearly average rainfall at the terminal site, and 5) the operating frequency of the communications link (within 1 to 1000 GHz, inclusive). Based on the yearly average rainfall at the terminal location, LeRC-SLAM calculates the relevant rain statistics for the site using an internal data base. The program then generates rain attenuation data for the satellite link. This data includes a description of the static (i.e., yearly) attenuation process, an evaluation of the cumulative probability distribution for attenuation effects, and an evaluation of the probability of fades below selected fade depths. In addition, LeRC-SLAM calculates the elevation and azimuth angles of the terminal antenna required to establish a link with the satellite, the statistical parameters that characterize the rainrate process at the terminal site, the length of the propagation path within the potential rain region, and its projected length onto the local horizontal. The IBM PC version of LeRC-SLAM (LEW-14979) is written in Microsoft QuickBASIC for an IBM PC compatible computer with a monitor and printer capable of supporting an 80-column format. The IBM PC version is available on a 5.25 inch MS-DOS format diskette. The program requires about 30K RAM. The source code and executable are included. The Macintosh version of LeRC-SLAM (LEW-14977) is written in Microsoft Basic, Binary (b) v2.00 for Macintosh II series computers running MacOS. This version requires 400K RAM and is available on a 3.5 inch 800K Macintosh format diskette, which includes source code only. The Macintosh version was developed in 1987 and the IBM PC version was developed in 1989. IBM PC is a trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. Macintosh is a registered trademark of Apple Computer, Inc.
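The log-normal attenuation statistics the model is built on can be illustrated with a short exceedance-probability calculation. The sketch below is not LeRC-SLAM and uses placeholder numbers rather than ACTS model coefficients for any real site.

import math

def exceedance_probability(a_db, median_db, sigma_ln):
    # P(A > a) when ln(A) is normally distributed, given the median attenuation
    # in dB and the standard deviation of ln(A).
    z = (math.log(a_db) - math.log(median_db)) / sigma_ln
    return 0.5 * math.erfc(z / math.sqrt(2.0))      # Gaussian Q-function

# e.g. probability that rain attenuation exceeds 10 dB when the median is 2 dB
print(exceedance_probability(10.0, 2.0, 1.2))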
ERIC Educational Resources Information Center
Coles, Mike; Nelms, Rick
1996-01-01
Describes a study that explores the depth and breadth of scientific facts, principles, and procedures which are required in the Advanced General National Vocational Qualifications (GNVQ) science through comparison with GCE Advanced level. The final report takes account of the updated 1996 version of GNVQ science. (DDR)
40 CFR Table 10 to Subpart Dddd of... - Applicability of General Provisions to Subpart DDDD
Code of Federal Regulations, 2010 CFR
2010-07-01
... plan approval procedures; performance audit requirements; internal and external QA procedures for... control plan on record for 5 years. Keep old versions for 5 years after revisions Yes. § 63.8(e) CMS...
40 CFR Table 10 to Subpart Ddddd... - Applicability of General Provisions to Subpart DDDDD
Code of Federal Regulations, 2010 CFR
2010-07-01
... requirements; and internal and external QA procedures for testing Yes. § 63.7(d) Testing Facilities... must keep quality control plan on record for the life of the affected source. Keep old versions for 5...
40 CFR Table 3 to Subpart Cccccc... - Applicability of General Provisions
Code of Federal Regulations, 2010 CFR
2010-07-01
... procedures; performance audit requirements; internal and external QA procedures for testing Yes. § 63.7(d... quality control plan on record for 5 years; keep old versions for 5 years after revisions No. § 63.8(e...
40 CFR Table 7 to Subpart Ppppp of... - Applicability of General Provisions to Subpart PPPPP
Code of Federal Regulations, 2010 CFR
2010-07-01
... Yes. 3. Performance audit requirements Yes. 4. Internal and external QA procedures for testing Yes... keep quality control plan on record for 5 years. Keep old versions for 5 years after revisions Yes...
ERIC Educational Resources Information Center
Hargrave, Lou Ann
2003-01-01
Temporary Assistance to Needy Families offers welfare recipients a second chance for success. How good that chance will be depends on the next version of the legislation. If its work requirements do not allow for sufficient training, achieving success may be more difficult. (Author/JOW)
REQUIREMENTS FOR HAZARDOUS WASTE LANDFILL DESIGN, CONSTRUCTION AND CLOSURE
This publication contains edited versions of the material presented at ten Technology Transfer seminars conducted in 1988 on this subject. Sections are included on design of clay and flexible membrane liners, leachate collector systems, and landfill covers. Construction quality a...
48 CFR 1815.604 - Agency points of contact. (NASA supplements paragraph (a))
Code of Federal Regulations, 2010 CFR
2010-10-01
... Internet at http://ec.msfc.nasa.gov/hq/library/unSol-Prop.html. A deviation is required for use of any modified or summarized version of the Internet information or for alternate means of general dissemination...
Vehicle information exchange needs for mobility applications : version 3.0.
DOT National Transportation Integrated Search
1996-06-01
The Evaluatory Design Document provides a unifying set of assumptions for other evaluations to utilize. Many of the evaluation activities require the definition of an actual implementation in order to be performed. For example, to cost the elements o...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vigil,Benny Manuel; Ballance, Robert; Haskell, Karen
Cielo is a massively parallel supercomputer funded by the DOE/NNSA Advanced Simulation and Computing (ASC) program, and operated by the Alliance for Computing at Extreme Scale (ACES), a partnership between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL). The primary Cielo compute platform is physically located at Los Alamos National Laboratory. This Cielo Computational Environment Usage Model documents the capabilities and the environment to be provided for the Q1 FY12 Level 2 Cielo Capability Computing (CCC) Platform Production Readiness Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory, or Sandia National Laboratories, but also addresses the needs of users working in the unclassified environment. The Cielo Computational Environment Usage Model maps the provided capabilities to the tri-Lab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the Production Readiness Milestone user environment capabilities of the ASC community. A description of ACE requirements met, and those requirements that are not met, is included in each section of this document. The Cielo Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the tri-Lab community.
Loads and low frequency dynamics data base: Version 1.1 November 8, 1985. [Space Shuttles
NASA Technical Reports Server (NTRS)
Garba, J. A. (Editor)
1985-01-01
Structural design data for the Shuttle are presented in the form of a data base. The data can be used by designers of Shuttle experiments to assure compliance with Shuttle safety and structural verification requirements. A glossary of Shuttle design terminology is given, and the principal safety requirements of Shuttle are summarized. The Shuttle design data are given in the form of load factors.
Air Force Research Laboratory Wright Site Guide to Technical Publishing
2005-04-01
Scientific and Technical Reports—Elements, Organization, and Design manual (and a version modified for documents generated for AFRL) • Merriam-Webster’s...notice page --SF 298 --original graphics/halftones Indicate the following on the letter of transmittal sheet: --quantity of copies required for...Elements, Organization and Design? The WRS CDRL for a final report requires that the standard be followed. The only exception is SBIR Phase 1
ERIC Educational Resources Information Center
Bridgeman, Brent; Laitusis, Cara Cahalan; Cline, Frederick
2007-01-01
The current study used three data sources to estimate time requirements for different item types on the now current SAT Reasoning Test™. First, we estimated times from a computer-adaptive version of the SAT® (SAT CAT) that automatically recorded item times. Second, we observed students as they answered SAT questions under strict time limits and…
Lessons Learned Implementing DOORS in a Citrix Environment
NASA Technical Reports Server (NTRS)
Bussman, Marie
2005-01-01
NASA's James Webb Space Telescope (JWST) Project is a large multi-national project with geographically dispersed contractors that all need access to the Project's requirements database. Initially, the project utilized multiple DOORS databases with the built-in partitions feature to exchange modules amongst the various contractor sites. As the requirements databases matured, the use of partitions became extremely difficult. There have been many issues, such as incompatible versions of DOORS, an inefficient mechanism for sharing modules, security concerns, performance issues, and inconsistent document import and export formats. Deployment of the client software with limited IT resources available was also an issue. The solution chosen by JWST was to integrate the use of a Citrix environment with the DOORS database to address most of the project concerns. The Citrix solution allowed a single requirements database to be maintained in a secure environment and accessed via a web interface. The Citrix environment allows JWST to upgrade to the most current version of DOORS without having to coordinate multiple sites and user upgrades. The single requirements database eliminates a multitude of configuration management concerns and facilitates the standardization of documentation formats. This paper discusses the obstacles and the lessons learned throughout the installation, implementation, usage, and deployment of a centralized DOORS database solution.
Caplan, David; Michaud, Jennifer; Hufford, Rebecca
2015-01-01
Sixty-one people with aphasia (pwa) and 41 matched controls were tested for the ability to understand sentences that required the ability to process particular syntactic elements and assign particular syntactic structures. Participants paced themselves word-by-word through 20 examples of 11 spoken sentence types and indicated which of two pictures corresponded to the meaning of each sentence. Sentences were developed in pairs such that comprehension of the experimental version of a pair required an aspect of syntactic processing not required in the corresponding baseline sentence. The need for the syntactic operations required only in the experimental version was triggered at a "critical word" in the experimental sentence. Listening times for critical words in experimental sentences were compared to those for corresponding words in the corresponding baseline sentences. The results were consistent with several models of syntactic comprehension deficits in pwa: resource reduction, slowed lexical and/or syntactic processing, abnormal susceptibility to interference from thematic roles generated non-syntactically. They suggest that a previously unidentified disturbance limiting the duration of parsing and interpretation may lead to these deficits, and that this mechanism may lead to structure-specific deficits in pwa. The results thus point to more than one mechanism underlying syntactic comprehension disorders both across and within pwa.
Step 1: Human System Interface (HSI) Functional Requirements Document (FRD). Version 2
NASA Technical Reports Server (NTRS)
2006-01-01
This Functional Requirements Document (FRD) establishes a minimum set of Human System Interface (HSI) functional requirements to achieve the Access 5 Vision of "operating High Altitude, Long Endurance (HALE) Unmanned Aircraft Systems (UAS) routinely, safely, and reliably in the National Airspace System (NAS)". Basically, it provides what functions are necessary to fly UAS in the NAS. The framework used to identify the appropriate functions was the "Aviate, Navigate, Communicate, and Avoid Hazards" structure identified in the Access 5 FRD. As a result, fifteen high-level functional requirements were developed. In addition, several of them have been decomposed into low-level functional requirements to provide more detail.
Modular reweighting software for statistical mechanical analysis of biased equilibrium data
NASA Astrophysics Data System (ADS)
Sindhikara, Daniel J.
2012-07-01
Here a simple, useful, modular approach and software suite designed for statistical reweighting and analysis of equilibrium ensembles is presented. Statistical reweighting is useful and sometimes necessary for analysis of equilibrium enhanced sampling methods, such as umbrella sampling or replica exchange, and also in experimental cases where biasing factors are explicitly known. Essentially, statistical reweighting allows extrapolation of data from one or more equilibrium ensembles to another. Here, the fundamental separable steps of statistical reweighting are broken up into modules - allowing for application to the general case and avoiding the black-box nature of some “all-inclusive” reweighting programs. Additionally, the programs included are, by design, written with few dependencies. The compilers required are either pre-installed on most systems, or freely available for download with minimal trouble. Examples of the use of this suite applied to umbrella sampling and replica exchange molecular dynamics simulations will be shown along with advice on how to apply it in the general case. New version program summary: Program title: Modular reweighting version 2. Catalogue identifier: AEJH_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJH_v2_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License, version 3. No. of lines in distributed program, including test data, etc.: 179 118. No. of bytes in distributed program, including test data, etc.: 8 518 178. Distribution format: tar.gz. Programming language: C++, Python 2.6+, Perl 5+. Computer: Any. Operating system: Any. RAM: 50-500 MB. Supplementary material: An updated version of the original manuscript (Comput. Phys. Commun. 182 (2011) 2227) is available. Classification: 4.13. Catalogue identifier of previous version: AEJH_v1_0. Journal reference of previous version: Comput. Phys. Commun. 182 (2011) 2227. Does the new version supersede the previous version?: Yes. Nature of problem: While equilibrium reweighting is ubiquitous, there are no public programs available to perform the reweighting in the general case. Further, specific programs often suffer from many library dependencies and numerical instability. Solution method: This package is written in a modular format that allows for easy applicability of reweighting in the general case. Modules are small, numerically stable, and require minimal libraries. Reasons for new version: Some minor bugs, some upgrades needed, error analysis added. analyzeweight.py/analyzeweight.py2 has been replaced by “multihist.py”. This new program performs all the functions of its predecessor while being versatile enough to handle other types of histograms and probability analysis. “bootstrap.py” was added. This script performs basic bootstrap resampling allowing for error analysis of data. “avg_dev_distribution.py” was added. This program computes the averages and standard deviations of multiple distributions, making error analysis (e.g. from bootstrap resampling) easier to visualize. WRE.cpp was slightly modified purely for cosmetic reasons. The manual was updated for clarity and to reflect version updates. Examples were removed from the manual in favor of online tutorials (packaged examples remain). Examples were updated to reflect the new format. An additional example is included to demonstrate error analysis. Running time: Preprocessing scripts 1-5 minutes, WHAM engine <1 minute, postprocess script ∼1-5 minutes.
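For readers unfamiliar with the underlying operation, the following minimal Python sketch shows single-ensemble statistical reweighting: samples drawn under a known bias potential are reweighted by exp(+U_bias/kT) to recover an unbiased ensemble average. It is an illustration of the general idea only, with assumed array names; it is not part of the Modular reweighting package, and combining multiple ensembles (WHAM/MBAR-style) is not shown.

import numpy as np

def reweighted_average(obs, bias_energy, kT):
    # obs[i]: observable sampled from the biased ensemble
    # bias_energy[i]: bias potential U_bias evaluated at sample i (same units as kT)
    w = np.exp(np.asarray(bias_energy) / kT)   # undo the bias: weight by exp(+U_bias/kT)
    w /= w.sum()
    return float(np.sum(w * np.asarray(obs)))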
Siegel, Michael; Kurland, Rachel P; Castrini, Marisa; Morse, Catherine; de Groot, Alexander; Retamozo, Cynthia; Roberts, Sarah P; Ross, Craig S; Jernigan, David H
No previous paper has examined alcohol advertising on the internet versions of television programs popular among underage youth. To assess the volume of alcohol advertising on web sites of television networks which stream television programs popular among youth. Multiple viewers analyzed the product advertising appearing on 12 television programs that are available in full episode format on the internet. During a baseline period of one week, six coders analyzed all 12 programs. For the nine programs that contained alcohol advertising, three underage coders (ages 10, 13, and 18) analyzed the programs to quantify the extent of that advertising over a four-week period. Alcohol advertisements are highly prevalent on these programs, with nine of the 12 shows carrying alcohol ads, and six programs averaging at least one alcohol ad per episode. There was no difference in alcohol ad exposure for underage and legal age viewers. There is a substantial potential for youth exposure to alcohol advertising on the internet through internet-based versions of television programs. The Federal Trade Commission should require alcohol companies to report the underage youth and adult audiences for internet versions of television programs on which they advertise.
AEOSS runtime manual for system analysis on Advanced Earth-Orbital Spacecraft Systems
NASA Technical Reports Server (NTRS)
Lee, Hwa-Ping
1990-01-01
The Advanced Earth Orbital Spacecraft System (AEOSS) program enables users to project the required power, weight, and cost for a generic earth-orbital spacecraft system. These variables are calculated at the component and subsystem levels, and then at the system level. The six subsystems included are electric power, thermal control, structure, auxiliary propulsion, attitude control, and communication, command, and data handling. The costs are computed using statistically determined models that were derived from previously flown spacecraft categorized into classes according to their functions and structural complexity. Selected design and performance analyses for essential components and subsystems are also provided. AEOSS permits a user to enter known values of these parameters, in whole or in part, at all levels. All of this information is of vital importance to project managers of subsystems or of a spacecraft system. AEOSS is specially tailored software built on the ACIUS 4th Dimension relational database program for the Macintosh. Because of the licensing agreements, two versions of the AEOSS documents were prepared. This version, the AEOSS Runtime Manual, is permitted to be distributed with a finite number of copies of the restrictive 4D Runtime version, which can perform all contained applications without any programming alterations.
AIRS Science Accomplishments Version 4.0/Plans for Version 5
NASA Technical Reports Server (NTRS)
Pagano, Thomas S.; Aumann, Hartmut; Elliott, Denis; Granger, Stephanie; Kahn, Brain; Eldering, Annmarie; Irion, Bill; Fetzer, Eric; Olsen, Ed; Lee, Sung-Yung;
2006-01-01
This talk covers accomplishments with AIRS data: what we have learned from almost three years of data, what part of this is emerging in Version 4.0, what part we would like to see filtering into Version 5.0, and what part constitutes limitations in the AIRS requirements, such as spectral and spatial resolution, which have to be deferred to the wish list for the next generation hyperspectral sounder. The AIRS calibration accuracy at the 100 mK level and stability at the 6 mK/year level are amazing, and they establish the unique capability of a cooled grating array spectrometer in Earth orbit for climate research. Data which are sufficiently clear to match the radiometric accuracy of the instrument have a yield of less than 1%; this is acceptable for calibration. The 2616 cm-1 window channel combined with the RTG.SST for the tropical ocean allows excellent assessment of radiometric calibration accuracy and stability. For absolute calibration verification, 100 mK is the limit due to cloud contamination. The 10 micron window channels can be used for stability assessment, but accuracy is limited to 300 mK due to water continuum absorption uncertainties.
Eslami, Ahmad Ali; Amidi Mazaheri, Maryam; Mostafavi, Firoozeh; Abbasi, Mohamad Hadi; Noroozi, Ensieh
2014-01-01
Assessment of social skills is a necessary requirement for developing and evaluating the effectiveness of cognitive and behavioral interventions. This paper reports the cultural adaptation and psychometric properties of the Farsi version of the Social Skills Rating System-Secondary Students form (SSRS-SS) questionnaire (Gresham and Elliot, 1990) in a normative sample of secondary school students. A two-phase design was used: phase 1 consisted of the linguistic adaptation, and in phase 2, using cross-sectional sample survey data, the construct validity and reliability of the Farsi version of the SSRS-SS were examined in a sample of 724 adolescents aged 13 to 19 years. The content validity index was excellent, and the floor/ceiling effects were low. After deleting five of the original SSRS-SS items, the findings gave support for item convergent and divergent validity. Factor analysis revealed four subscales. Results showed good internal consistency (0.89) and temporal stability (0.91) for the total scale score. Findings demonstrated support for the use of the 27-item Farsi version in the school setting. Directions for future research regarding the applicability of the scale in other settings and populations of adolescents are discussed.
Do dichromats see colours in this way? Assessing simulation tools without colorimetric measurements.
Lillo Jover, Julio A; Álvaro Llorente, Leticia; Moreira Villegas, Humberto; Melnikova, Anna
2016-11-01
Simulcheck evaluates Colour Simulation Tools (CSTs), which transform colours to mimic those seen by colour vision deficients. Two CSTs (Variantor and Coblis) were used to determine whether the standard Simulcheck version (direct measurement based, DMB) can be substituted by another (RGB values based) that does not require sophisticated measurement instruments. Ten normal trichromats performed the two psychophysical tasks included in the Simulcheck method. The Pseudoachromatic Stimuli Identification task provided the h_uv (hue angle) values of the pseudoachromatic stimuli: colours seen as red or green by normal trichromats but as grey by colour deficient people. The Minimum Achromatic Contrast task was used to compute the L_R (relative luminance) values of the pseudoachromatic stimuli. The Simulcheck DMB version showed that Variantor was accurate in simulating protanopia, but neither Variantor nor Coblis was accurate in simulating deuteranopia. The Simulcheck RGB version provided accurate h_uv values, so this variable can be adequately estimated when a colorimeter (an expensive and uncommon apparatus) is not available. In contrast, the inaccuracy of the L_R estimations provided by the Simulcheck RGB version makes it advisable to compute this variable from measurements performed with a photometer, a cheap and readily available apparatus.
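To make the two reported variables concrete, the sketch below estimates h_uv (the CIE 1976 u*v* hue angle) and relative luminance directly from 8-bit RGB values, assuming sRGB primaries and a D65 white point. This mirrors the idea behind the RGB-based Simulcheck version but is an independent illustration; the function name, the sRGB assumption, and the fixed white point are mine, not the authors'.

import math

def huv_and_luminance(r, g, b):
    # Inverse sRGB companding, then sRGB-to-XYZ (D65); Y is the relative luminance.
    lin = lambda c: c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    R, G, B = (lin(c / 255.0) for c in (r, g, b))
    X = 0.4124 * R + 0.3576 * G + 0.1805 * B
    Y = 0.2126 * R + 0.7152 * G + 0.0722 * B
    Z = 0.0193 * R + 0.1192 * G + 0.9505 * B
    d = X + 15.0 * Y + 3.0 * Z
    if d == 0.0:
        return 0.0, 0.0
    up, vp = 4.0 * X / d, 9.0 * Y / d           # CIE 1976 u', v'
    un, vn = 0.1978, 0.4683                     # D65 white point (u'n, v'n)
    L = 116.0 * Y ** (1.0 / 3.0) - 16.0 if Y > (6.0 / 29.0) ** 3 else (29.0 / 3.0) ** 3 * Y
    u_star, v_star = 13.0 * L * (up - un), 13.0 * L * (vp - vn)
    return math.degrees(math.atan2(v_star, u_star)) % 360.0, Y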
Conscientious refusals and reason-giving.
Marsh, Jason
2014-07-01
Some philosophers have argued for what I call the reason-giving requirement for conscientious refusal in reproductive healthcare. According to this requirement, healthcare practitioners who conscientiously object to administering standard forms of treatment must have arguments to back up their conscience, arguments that are purely public in character. I argue that such a requirement, though attractive in some ways, faces an overlooked epistemic problem: it is either too easy or too difficult to satisfy in standard cases. I close by briefly considering whether a version of the reason-giving requirement can be salvaged despite this important difficulty. © 2013 John Wiley & Sons Ltd.
Evaluation of SNS Beamline Shielding Configurations using MCNPX Accelerated by ADVANTG
DOE Office of Scientific and Technical Information (OSTI.GOV)
Risner, Joel M; Johnson, Seth R.; Remec, Igor
2015-01-01
Shielding analyses for the Spallation Neutron Source (SNS) at Oak Ridge National Laboratory pose significant computational challenges, including highly anisotropic high-energy sources, a combination of deep penetration shielding and an unshielded beamline, and a desire to obtain well-converged nearly global solutions for mapping of predicted radiation fields. The majority of these analyses have been performed using MCNPX with manually generated variance reduction parameters (source biasing and cell-based splitting and Russian roulette) that were largely based on the analyst's insight into the problem specifics. Development of the variance reduction parameters required extensive analyst time, and was often tailored to specific portions of the model phase space. We previously applied a developmental version of the ADVANTG code to an SNS beamline study to perform a hybrid deterministic/Monte Carlo analysis and showed that we could obtain nearly global Monte Carlo solutions with essentially uniform relative errors for mesh tallies that cover extensive portions of the model with typical voxel spacing of a few centimeters. The use of weight window maps and consistent biased sources produced using the FW-CADIS methodology in ADVANTG allowed us to obtain these solutions using substantially less computer time than the previous cell-based splitting approach. While those results were promising, the process of using the developmental version of ADVANTG was somewhat laborious, requiring user-developed Python scripts to drive much of the analysis sequence. In addition, limitations imposed by the size of weight-window files in MCNPX necessitated the use of relatively coarse spatial and energy discretization for the deterministic Denovo calculations that we used to generate the variance reduction parameters. We recently applied the production version of ADVANTG to this beamline analysis, which substantially streamlined the analysis process. We also tested importance function collapsing (in space and energy) capabilities in ADVANTG. These changes, along with the support for parallel Denovo calculations using the current version of ADVANTG, give us the capability to improve the fidelity of the deterministic portion of the hybrid analysis sequence, obtain improved weight-window maps, and reduce both the analyst and computational time required for the analysis process.
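For context, the weight-window maps mentioned above drive a standard splitting/Russian-roulette rule in the Monte Carlo transport step. The sketch below shows that generic rule only, with hypothetical parameter names and an arbitrary splitting cap; it is not ADVANTG, Denovo, or MCNPX code.

import random

def apply_weight_window(weight, w_low, w_up, survival_weight):
    # Returns the list of particle weights that continue after the check.
    if weight > w_up:                            # above the window: split
        n = min(int(weight / w_up) + 1, 10)      # cap on splits is illustrative
        return [weight / n] * n
    if weight < w_low:                           # below the window: Russian roulette
        if random.random() < weight / survival_weight:
            return [survival_weight]             # survivor carries the survival weight
        return []                                # particle terminated
    return [weight]                              # inside the window: unchanged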
Cumulative versus rapid introduction of new information.
Gleason, M; Carnine, D; Vala, N
1991-02-01
This study investigated the way new information is presented to students. Subjects were 60 elementary and middle school students, most with learning disabilities. Students used two versions of a specially designed computer-assisted instruction (CAI) program. One version rapidly presented students with seven pieces of information (rapid-introduction group); the other cumulatively presented smaller "chunks" of information (cumulative-introduction group). Both groups worked to mastery level successfully but students in the cumulative group spent one-third the time, required fewer responses, showed less frustration, and made fewer errors in the process. Results suggest that students with learning disabilities need much more practice than most commercial CAI programs supply.
Ada technology support for NASA-GSFC
NASA Technical Reports Server (NTRS)
1986-01-01
Utilization of the Ada programming language and environments to perform directorate functions was reviewed. The Mission and Data Operations Directorate Network (MNET) conversion effort was chosen as the first task for evaluation and assistance. The MNET project required the rewriting of the existing Network Control Program (NCP) in the Ada programming language. The DEC Ada compiler running on the VAX under VMS was used for the initial development efforts. Stress tests on the newly delivered version of the DEC Ada compiler were performed. The new Alsys Ada compiler was purchased for the IBM PC AT. A prevalidated version of the compiler was obtained. The compiler was then validated.
On the Global and Linear Convergence of the Generalized Alternating Direction Method of Multipliers
2012-08-01
the less exact one is solved later (assigned as step 4 of Algorithm 2) because at each iteration the ADM updates the variables in the Gauss-Seidel ... O(1/k) and that of an accelerated version descends at O(1/k^2). Then, work [14] establishes the same rates on a Gauss-Seidel version and requires only one... iteration. [Figure residue: Fig. 5.1, "Convergence curves of ADM for the elastic net problem"; plotted quantity ||u^(k+1) - u*||_G^2 / ||u^k ...; axis ticks omitted.]
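Since the snippet above survives only in fragments, the following sketch restates the kind of computation it refers to: ADMM (the alternating direction method, ADM) applied to an elastic net problem, with the x- and z-subproblems updated in Gauss-Seidel order. Variable names, the penalty parameter, and the iteration count are assumptions for illustration; this is not the cited report's code.

import numpy as np

def admm_elastic_net(A, b, lam1, lam2, rho=1.0, iters=200):
    # Minimize (1/2)||Ax - b||^2 + lam1*||z||_1 + (lam2/2)*||z||^2 subject to x = z.
    n = A.shape[1]
    AtA, Atb = A.T @ A, A.T @ b
    L = np.linalg.cholesky(AtA + rho * np.eye(n))   # factor once, reuse every iteration
    x = z = u = np.zeros(n)
    soft = lambda v, k: np.sign(v) * np.maximum(np.abs(v) - k, 0.0)
    for _ in range(iters):
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))  # x-update
        z = soft(x + u, lam1 / rho) / (1.0 + lam2 / rho)                   # z-update uses the new x
        u = u + x - z                                                      # dual update
    return z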
Continuous Cooling from 10 K to 4 K Using a Toroidal ADR
NASA Technical Reports Server (NTRS)
DiPirro, Michael J.; Canavan, Edgar R.; Shirron, Peter J.; Tuttle, James G.
2003-01-01
Future large infrared space telescopes will require cooling to 4 K to achieve background limited performance for submillimeter wavelengths. These observatories will require lifetimes of many years and will have relatively large cooling requirements, making stored helium dewars impractical. We have designed and are building an adiabatic demagnetization refrigerator (ADR) for use in cooling relatively large loads (10-100 mW) at 4 K and rejecting that heat to a cryocooler operating at 10 K. Cryocoolers below 10 K have poor thermodynamic efficiency, and ADRs can operate in this temperature range with an efficiency of 75% of Carnot or better. Overall, this can save as much as 2/3 of the input power required to operate a 4 K cryocooler. The ADR magnet consists of 8 short coils wired in series and arranged in a toroid to provide self-shielding of its magnetic field. This will save mass (about 30% of the mass, or about 1.5 kg, in our small version; higher percentages in higher cooling power, larger versions) that would have been used for passive or active shields in an ordinary solenoid. The toroid has a 100 mm outer diameter and will produce an approximately 3 T average field. In the initial demonstration model the toroid coils will be wound with ordinary NbTi wire and operated at 4 K. A second version will then use Nb3Sn wire to provide complete 10 K operation. As a refrigerant for this temperature range we will use either GdLiF4 or GdF3 crystals, pending tests of these crystals' cooling capacity per field and thermal conductance. Preliminary indications are that these materials are superior to GGG. We will use gas gap heat switches to alternately connect the toroid to the cold load and the warm heat sink. A small continuous stage will maintain the cold end at 4 K while the main toroid is recycled.
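The quoted efficiency figures translate into a simple back-of-envelope power budget: at 75% of Carnot between 4 K and 10 K the coefficient of performance is 0.75 x 4/(10 - 4) = 0.5, so lifting a 50 mW load costs about 0.1 W of input work and delivers about 0.15 W to the 10 K cryocooler. The sketch below encodes only that arithmetic; whether the overall 2/3 input-power saving is realized depends on the specific power of the 4 K and 10 K cryocoolers, which is not given here and is left as an assumed input.

def adr_stage_power(q_load_w, t_cold=4.0, t_hot=10.0, frac_carnot=0.75):
    # COP = eta_fraction * Tc / (Th - Tc); work in = Q / COP; rejected heat = Q + W.
    cop = frac_carnot * t_cold / (t_hot - t_cold)
    w_in = q_load_w / cop
    return {"work_in_W": w_in, "heat_rejected_at_Th_W": q_load_w + w_in}

# Example: adr_stage_power(0.050) -> {'work_in_W': 0.1, 'heat_rejected_at_Th_W': 0.15}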
Four Stories about National Goals for American Education.
ERIC Educational Resources Information Center
Cuban, Larry
1990-01-01
Presents four versions of American educational history highlighting centralization/decentralization issues, American faith in schooling, and cascading national and international changes requiring extraordinary reforms. These diverse stories all arrive at the same conclusion--a need for national goals and performance standards to guide…
GAPIT version 2: an enhanced integrated tool for genomic association and prediction
USDA-ARS?s Scientific Manuscript database
Most human diseases and agriculturally important traits are complex. Dissecting their genetic architecture requires continued development of innovative and powerful statistical methods. Corresponding advances in computing tools are critical to efficiently use these statistical innovations and to enh...
NASA Technical Reports Server (NTRS)
Bochsler, Daniel C.
1988-01-01
A revised version of the expert knowledge for the onboard navigation (ONAV) entry system is given. Brief background information is included, together with a description of the knowledge that the system contains.
NASA Technical Reports Server (NTRS)
1991-01-01
The Reusable Reentry Satellite (RRS) System is composed of the payload segment (PS), vehicle segment (VS), and mission support (MS) segments. This specification establishes the performance, design, development, and test requirements for the RRS Rodent Module (RM).
Use the available information on the relationship between juvenile body weights and energetic requirements to develop a general approach for calculating juvenile dietary exposure doses appropriate for a range of avian species.
Characterizing and Mapping of Ecosystem Services (CMESs) Literature Database Version 1.0
Ecosystem services (ESs) represent an ecosystem’s capacity for satisfying essential human needs, directly or indirectly, above that required to maintain ecosystem integrity (structure, function and processes). The spatial characterization and mapping of ESs is an essential first ...
Analgesia/anesthesia for external cephalic version.
Weiniger, Carolyn F
2013-06-01
Professional society guidelines recommend that women with breech presentation be delivered surgically due to a higher incidence of fetal risks compared with vaginal delivery. An alternative is attempted external cephalic version, which if successful, enables attempted vaginal delivery. Attitudes towards external cephalic version (ECV) will be considered in this review, along with pain relief methods and their impact on ECV success rates. Articles suggest that ECV is infrequently offered, due to both physician and patient factors. Success of ECV is higher in multiparous women, complete breech, posterior placenta, or smaller fetus. Preterm ECV performance does not increase vaginal delivery rates. Neuraxial techniques (spinal or epidural) significantly increase ECV success rates, as do moxibustion and hypnosis. Four reviews summarized studies considering ECV and neuraxial techniques. These reviews suggest that neuraxial techniques using high (surgical) doses of local anesthetic are efficacious compared with control groups not using anesthesia, whereas techniques using low-doses are not. Low-dose versus high-dose neuraxial analgesia/anesthesia has not been directly compared in a single study. Based on currently available data, the rate of cephalic presentation is not increased using neuraxial techniques, but vaginal delivery rates are higher. ECV appears to be a low-risk procedure. The logistics of routine ECV and provision of optimal neuraxial techniques for successful ECV require additional research. Safety aspects of neuraxial anesthesia for ECV require further investigation.
Vickers, Douglas; Bovet, Pierre; Lee, Michael D; Hughes, Peter
2003-01-01
The planar Euclidean version of the travelling salesperson problem (TSP) requires finding a tour of minimal length through a two-dimensional set of nodes. Despite the computational intractability of the TSP, people can produce rapid, near-optimal solutions to visually presented versions of such problems. To explain this, MacGregor et al (1999, Perception 28 1417-1428) have suggested that people use a global-to-local process, based on a perceptual tendency to organise stimuli into convex figures. We review the evidence for this idea and propose an alternative, local-to-global hypothesis, based on the detection of least distances between the nodes in an array. We present the results of an experiment in which we examined the relationships between three objective measures and performance measures of optimality and response uncertainty in tasks requiring participants to construct a closed tour or an open path. The data are not well accounted for by a process based on the convex hull. In contrast, results are generally consistent with a locally focused process based initially on the detection of nearest-neighbour clusters. Individual differences are interpreted in terms of a hierarchical process of constructing solutions, and the findings are related to a more general analysis of the role of nearest neighbours in the perception of structure and motion.
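As a concrete rendering of the local-to-global idea discussed above, the sketch below builds a closed tour by repeatedly moving to the nearest unvisited node. It is a generic nearest-neighbour heuristic offered for illustration, not the authors' perceptual model or analysis code.

import math

def nearest_neighbour_tour(points, start=0):
    # points: list of (x, y) tuples; returns (visiting order, closed-tour length).
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    unvisited = set(range(len(points))) - {start}
    tour, current = [start], start
    while unvisited:
        current = min(unvisited, key=lambda j: dist(points[current], points[j]))
        unvisited.remove(current)
        tour.append(current)
    length = sum(dist(points[tour[i]], points[tour[(i + 1) % len(tour)]])
                 for i in range(len(tour)))
    return tour, length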
Prietula, M J; Feltovich, P J; Marchak, F
2000-01-01
We propose that considering four categories of task factors can facilitate knowledge elicitation efforts in the analysis of complex cognitive tasks: materials, strategies, knowledge characteristics, and goals. A study was conducted to examine the effects of altering aspects of two of these task categories on problem-solving behavior across skill levels: materials and goals. Two versions of an applied engineering problem were presented to expert, intermediate, and novice participants. Participants were to minimize the cost of running a steam generation facility by adjusting steam generation levels and flows. One version was cast in the form of a dynamic, computer-based simulation that provided immediate feedback on flows, costs, and constraint violations, thus incorporating key variable dynamics of the problem context. The other version was cast as a static computer-based model, with no dynamic components, cost feedback, or constraint checking. Experts performed better than the other groups across material conditions, and, when required, the presentation of the goal assisted the experts more than the other groups. The static group generated richer protocols than the dynamic group, but the dynamic group solved the problem in significantly less time. Little effect of feedback was found for intermediates, and none for novices. We conclude that demonstrating differences in performance in this task requires different materials than explicating underlying knowledge that leads to performance. We also conclude that substantial knowledge is required to exploit the information yielded by the dynamic form of the task or the explicit solution goal. This simple model can help to identify the contextual factors that influence elicitation and specification of knowledge, which is essential in the engineering of joint cognitive systems.
Implementing security in a distributed web-based EHCR.
Sucurovic, Snezana
2007-01-01
In many countries there are initiatives for building an integrated patient-centric electronic health record, as well as initiatives for transnational integration. These growing demands for integration arise because integration can improve healthcare treatment and reduce the cost of healthcare services. While in highly developed European countries computerisation of the healthcare sector began in the 1970s and has reached a high level, some developing countries, Serbia among them, have started computerisation only recently. This is why MEDIS (MEDical Information System) is aimed at integration from the very beginning, rather than at integrating heterogeneous information systems on a middle layer or via the HL7 protocol. The implementation of a national healthcare information system requires the use of standards as integrated and widely accepted solutions. Therefore, we have started building MEDIS to meet the requirements of the CEN ENV 13606 and CEN ENV 13729 standards. The prototype version has a distributed, component-based architecture with modern security solutions applied. MEDIS has been implemented as a federated system in which the central server hosts basic EHCR information about a patient, and clinical servers contain their own parts of patients' EHCRs. At present, an initial prototype version is planned to be deployed, at first in a small community. In particular, an open source API for X.509 authentication and authorisation has been developed. Our project meets the requirements for education in health informatics, including appropriate knowledge and skills on the EHCR. The points included in this article have been presented at several national conferences and widely discussed. MEDIS has explored a federated, component-based EHCR architecture and related security aspects. In its initial version it shows acceptable performance and administrative simplicity. It emphasizes the importance of using standards in building an EHCR in our country, in order to prepare it for future integrations.
A feasibility assessment of magnetic bearings for free-piston Stirling space power converters
NASA Technical Reports Server (NTRS)
Curwen, Peter W.; Rao, Dantam K.; Wilson, Donald R.
1992-01-01
This report describes a design and analysis study performed by Mechanical Technology Incorporated (MTI) under NASA Contract NAS3-26061. The objective of the study was to assess the feasibility and efficacy of applying magnetic bearings to free-piston Stirling-cycle power conversion machinery of the type currently being evaluated for possible use in long-term space missions. The study was performed for a 50-kWe Reference Stirling Space Power Converter (RSSPC) system consisting of two 25-kWe free-piston Stirling engine modules. Two different versions of the RSSPC engine modules have been defined under NASA Contract NAS3-25463. These modules currently use hydrostatic gas bearings to support the reciprocating displacer and power piston assemblies. Results of this study show that active magnetic bearings of the attractive electromagnetic type are technically feasible for RSSPC application provided that wire insulation with 60,000-hr life capability at 300 C can be developed for the bearing coils. From a design integration standpoint, both versions of the RSSPC were found to be conceptually amenable to magnetic support of the power piston assembly. However, only one version of the RSSPC was found to be amenable to magnetic support of the displacer assembly. Unacceptable changes to the basic engine design would be required to incorporate magnetic displacer bearings into the second version. Complete magnetic suspension of the RSSPC can potentially increase overall efficiency of the Stirling cycle power converter by 0.53 to 1.4 percent (0.15 to 0.4 efficiency points). Magnetic bearings will also overcome several operational concerns associated with hydrostatic gas bearing systems. However, these advantages are accompanied by a 5 to 8 percent increase in specific mass of the RSSPC, depending on the RSSPC version employed. Additionally, magnetic bearings are much more complex, both mechanically and particularly electronically, than hydrostatic bearings. Accordingly, long-term stability and reliability represent areas of uncertainty for magnetic bearings. Considerable development effort will be required to establish the long-term suitability of these bearings for Stirling space power applications.
The Surface Ocean CO2 Atlas: Stewarding Underway Carbon Data from Collection to Archival
NASA Astrophysics Data System (ADS)
O'Brien, K.; Smith, K. M.; Pfeil, B.; Landa, C.; Bakker, D. C. E.; Olsen, A.; Jones, S.; Shrestha, B.; Kozyr, A.; Manke, A. B.; Schweitzer, R.; Burger, E. F.
2016-02-01
The Surface Ocean CO2 Atlas (SOCAT, www.socat.info) is a quality controlled, global surface ocean carbon dioxide (CO2) data set gathered on research vessels, SOOP and buoys. To the degree feasible SOCAT is comprehensive; it draws together and applies uniform QC procedures to all such observations made across the international community. The first version of SOCAT (version 1.5) was publicly released September 2011 (Bakker et al., 2011) with 6.3 million observations. This was followed by the release of SOCAT version 2, expanded to over 10 million observations, in June 2013 (Bakker et al., 2013). Most recently, in September 2015 SOCAT version 3 was released containing over 14 million observations spanning almost 60 years! The process of assembling, QC'ing and publishing V1.5 and V2 of SOCAT required an unsustainable level of manual effort. To ease the burden on data managers and data providers, the SOCAT community agreed to embark on an automated data ingestion process which would create a streamlined workflow to improve data stewardship from ingestion to quality control and from publishing to archival. To that end, for version 3 and beyond, the SOCAT automation team created a framework which was based upon standards and conventions, yet at the same time allows scientists to work in the data formats they feel most comfortable with (i.e., csv files). This automated workflow provides several advantages: 1) data ingestion into uniform and standards-based file formats; 2) ease of data integration into a standard quality control system; 3) data ingestion and quality control can be performed in parallel; 4) a uniform method of archiving carbon data and generating digital object identifiers (DOIs). In this presentation, we will discuss and demonstrate the SOCAT data ingestion dashboard and the quality control system. We will also discuss the standards, conventions, and tools that were leveraged to create a workflow that allows scientists to work in their own formats, yet provides a framework for creating high quality data products on an annual basis, while meeting or exceeding data requirements for access, documentation and archival.
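The ingestion step described above is, at its core, a mapping from whatever column headers a contributor uses onto a controlled vocabulary before uniform QC can be applied. The sketch below shows that idea in Python; the file name, column headers, and the mapping dictionary are hypothetical examples, not the SOCAT dashboard's actual vocabulary or code.

import csv

# Hypothetical header-to-standard-name mapping; SOCAT's real vocabulary differs.
STANDARD_NAMES = {"fco2": "fCO2_water_uatm", "sst": "temperature_C",
                  "sal": "salinity_psu", "lon": "longitude_degE", "lat": "latitude_degN"}

def ingest_csv(path):
    # Read a contributed csv and return rows keyed by standardized column names,
    # silently dropping columns that are not in the controlled vocabulary.
    with open(path, newline="") as f:
        return [{STANDARD_NAMES[k.strip().lower()]: v
                 for k, v in row.items() if k.strip().lower() in STANDARD_NAMES}
                for row in csv.DictReader(f)]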
Conducted Transients on Spacecraft Primary Power Lines
NASA Technical Reports Server (NTRS)
Mc Closkey, John; Dimov, Jen
2017-01-01
One of the sources of potential interference on spacecraft primary power lines is that of conducted transients resulting from equipment being switched on and off of the bus. Susceptibility to such transients is addressed by some version of the CS06 requirement of MIL-STD-461/462. This presentation provides a summary of the history of the CS06 requirement and test method, a basis for understanding the sources of these transients, analysis techniques for determining their worst-case characteristics, and guidelines for minimizing their magnitudes and applying the requirement appropriately.
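One common back-of-envelope estimate for this class of transient is the inductive voltage developed when a load current is interrupted through the harness inductance, V = L di/dt. The sketch below encodes only that rule of thumb with placeholder values; the actual CS06 waveform, amplitude, and test method come from the standard itself and are not reproduced here.

def switching_spike_volts(load_current_a, harness_inductance_h, interrupt_time_s):
    # Worst-case inductive "kick" when a load is switched off: V = L * di/dt.
    return harness_inductance_h * load_current_a / interrupt_time_s

# Example (placeholder numbers): breaking 5 A through 2 uH in 1 us is on the order of 10 V.
# print(switching_spike_volts(5.0, 2e-6, 1e-6))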
General Recommendations on Fatigue Risk Management for the Canadian Forces
2010-04-01
missions performed in aviation require an individual(s) to process large amounts of information in a short period of time and to do this on a continuous...information processing required during sustained operations can degrade an individual’s ability to perform a task. Given the high operational tempo...memory, which, in turn, is utilized to perform human thought processes (Baddeley, 2003). While various versions of this theory exist, they all share
International Commission on Zoological Nomenclature
2012-01-01
The International Commission on Zoological Nomenclature has voted in favour of a revised version of the amendment to the International Code of Zoological Nomenclature that was proposed in 2008. The purpose of the amendment is to expand and refine the methods of publication allowed by the Code, particularly in relation to electronic publication. The amendment establishes an Official Register of Zoological Nomenclature (with ZooBank as its online version), allows electronic publication after 2011 under certain conditions, and disallows publication on optical discs after 2012. The requirements for electronic publications are that the work be registered in ZooBank before it is published, that the work itself state the date of publication and contain evidence that registration has occurred, and that the ZooBank registration state both the name of an electronic archive intended to preserve the work and the ISSN or ISBN associated with the work. Registration of new scientific names and nomenclatural acts is not required. The Commission has confirmed that ZooBank is ready to handle the requirements of the amendment. PMID:22977348
Lightweight two-stroke cycle aircraft diesel engine technology enablement program, volume 1
NASA Technical Reports Server (NTRS)
Freen, P. D.; Berenyi, S. G.; Brouwers, A. P.; Moynihan, M. E.
1985-01-01
An experimental Single Cylinder Test Engine Program is conducted to confirm the analytically projected performance of a two-stroke cycle diesel engine for aircraft applications. The test engine delivered 78 kW of indicated power from 1007 cc of displacement, operating at 3500 RPM on a Schnuerle loop-scavenged two-stroke cycle. Testing confirms the ability of a proposed 4-cylinder version of such an engine to reach the target power at altitude, in a highly turbocharged configuration. The experimental program defines all necessary parameters to permit design of a multicylinder engine for eventual flight applications, including injection system requirements, turbocharging, heat rejection, breathing, scavenging, and structural requirements. The multicylinder engine concept is configured to operate with an augmented turbocharger, but with no primary scavenge blower. The test program is oriented toward achieving a balanced turbocharger compressor-to-turbine power match without an auxiliary scavenging system. Engine cylinder heat rejection to the ambient air has been significantly reduced, and the minimum overall turbocharger efficiency required is within the range of commercially available turbochargers. Analytical studies and finite element modeling are made of insulated configurations of the engine, including both ceramic and metallic versions. A second generation test engine is designed based on current test results.
Managing high-bandwidth real-time data storage
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bigelow, David D.; Brandt, Scott A; Bent, John M
2009-09-23
There exist certain systems which generate real-time data at high bandwidth, but do not necessarily require the long-term retention of that data in normal conditions. In some cases, the data may not actually be useful, and in others, there may be too much data to permanently retain in long-term storage whether it is useful or not. However, certain portions of the data may be identified as being vitally important from time to time, and must therefore be retained for further analysis or permanent storage without interrupting the ongoing collection of new data. We have developed a system, Mahanaxar, intended to address this problem. It provides quality of service guarantees for incoming real-time data streams and simultaneous access to already-recorded data on a best-effort basis utilizing any spare bandwidth. It has built-in mechanisms for reliability and indexing, can scale upwards to meet increasing bandwidth requirements, and handles both small and large data elements equally well. We will show that a prototype version of this system provides better performance than a flat file (traditional filesystem) based version, particularly with regard to quality of service guarantees and hard real-time requirements.
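The guarantee-plus-best-effort policy described above can be pictured as a simple admission and allocation rule: reserve bandwidth for the real-time streams first, then hand whatever is left to readers of already-recorded data. The sketch below illustrates only that rule with hypothetical units and names; it is not Mahanaxar's scheduler.

def allocate_bandwidth(device_bw, realtime_streams, best_effort_request):
    # device_bw and all rates in the same units (e.g. MB/s).
    reserved = sum(realtime_streams)
    if reserved > device_bw:
        raise ValueError("real-time reservations exceed device bandwidth")
    spare = device_bw - reserved
    return {"reserved": reserved, "best_effort_granted": min(best_effort_request, spare)}

# Example: a 400 MB/s device with three 100 MB/s streams leaves 100 MB/s for best-effort reads.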
Computer versus paper--does it make any difference in test performance?
Karay, Yassin; Schauber, Stefan K; Stosch, Christoph; Schüttpelz-Brauns, Katrin
2015-01-01
CONSTRUCT: In this study, we examine the differences in test performance between the paper-based and the computer-based version of the Berlin formative Progress Test. In this context it is the first study that allows controlling for students' prior performance. Computer-based tests make possible a more efficient examination procedure for test administration and review. Although university staff will benefit largely from computer-based tests, the question arises if computer-based tests influence students' test performance. A total of 266 German students from the 9th and 10th semester of medicine (comparable with the 4th-year North American medical school schedule) participated in the study (paper = 132, computer = 134). The allocation of the test format was conducted as a randomized matched-pair design in which students were first sorted according to their prior test results. The organizational procedure, the examination conditions, the room, and seating arrangements, as well as the order of questions and answers, were identical in both groups. The sociodemographic variables and pretest scores of both groups were comparable. The test results from the paper and computer versions did not differ. The groups remained within the allotted time, but students using the computer version (particularly the high performers) needed significantly less time to complete the test. In addition, we found significant differences in guessing behavior. Low performers using the computer version guess significantly more than low-performing students in the paper-pencil version. Participants in computer-based tests are not at a disadvantage in terms of their test results. The computer-based test required less processing time. The reason for the longer processing time when using the paper-pencil version might be due to the time needed to write the answer down, controlling for transferring the answer correctly. It is still not known why students using the computer version (particularly low-performing students) guess at a higher rate. Further studies are necessary to understand this finding.
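Because the allocation was a randomized matched-pair design, the natural analysis is a paired comparison of each matched pair's scores. The sketch below shows one such comparison (mean difference plus a paired t-test) for illustration; the array names are placeholders and this is not the study's own analysis code.

import numpy as np
from scipy import stats

def compare_matched_pairs(paper_scores, computer_scores):
    # Arrays ordered so that element i of each array belongs to the same matched pair.
    paper = np.asarray(paper_scores, dtype=float)
    comp = np.asarray(computer_scores, dtype=float)
    t, p = stats.ttest_rel(paper, comp)
    return {"mean_difference": float(np.mean(paper - comp)), "t": float(t), "p": float(p)}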
NASA Astrophysics Data System (ADS)
Brzuszek, Marcin; Daniluk, Andrzej
2006-11-01
Writing a concurrent program can be more difficult than writing a sequential program: the programmer needs to think about synchronisation, race conditions, and shared variables. Transactions help reduce the inconvenience of using threads. A transaction is an abstraction which allows programmers to group a sequence of actions on the program into a logical, higher-level computation unit. This paper presents multithreaded versions of the GROWTH program, which allow calculation of the layer coverages during the growth of thin epitaxial films and the corresponding RHEED intensities according to the kinematical approximation. The presented programs also contain graphical user interfaces, which enable displaying program data at run-time. New version program summary: Titles of programs: GROWTHGr, GROWTH06. Catalogue identifier: ADVL_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVL_v2_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Catalogue identifier of previous version: ADVL. Does the new version supersede the original program: No. Computer for which the new version is designed and others on which it has been tested: Pentium-based PC. Operating systems or monitors under which the new version has been tested: Windows 9x, XP, NT. Programming language used: Object Pascal. Memory required to execute with typical data: More than 1 MB. Number of bits in a word: 64 bits. Number of processors used: 1. No. of lines in distributed program, including test data, etc.: 20 931. Number of bytes in distributed program, including test data, etc.: 1 311 268. Distribution format: tar.gz. Nature of physical problem: The programs compute the RHEED intensities during the growth of thin epitaxial structures prepared using molecular beam epitaxy (MBE). The computations are based on the use of kinematical diffraction theory [P.I. Cohen, G.S. Petrich, P.R. Pukite, G.J. Whaley, A.S. Arrott, Surf. Sci. 216 (1989) 222. [1
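For orientation, the kinematical approximation used by GROWTH relates the RHEED intensity to the exposed coverage of each growing layer interfering with a level-dependent phase. A minimal sketch of the commonly quoted form is given below; the symbol names, the normalization, and the treatment of the substrate level are assumptions for illustration, and the cited Cohen et al. reference gives the full treatment actually implemented.

import cmath

def rheed_intensity(coverages, phase):
    # coverages: theta_n for layers n = 1, 2, ... (fractions in [0, 1]; theta_0 = 1 is the substrate).
    # phase: interlayer scattering phase (pi at the anti-Bragg condition).
    theta = [1.0] + list(coverages) + [0.0]
    exposed = [theta[n] - theta[n + 1] for n in range(len(theta) - 1)]   # exposed fraction of level n
    amplitude = sum(e * cmath.exp(1j * n * phase) for n, e in enumerate(exposed))
    return abs(amplitude) ** 2                                           # arbitrary normalization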
5 CFR 470.311 - Final project approval.
Code of Federal Regulations, 2010 CFR
2010-01-01
... MANAGEMENT RESEARCH PROGRAMS AND DEMONSTRATIONS PROJECTS Regulatory Requirements Pertaining to Demonstration Projects § 470.311 Final project approval. (a) The Office of Personnel Management will consider all timely...) The Office of Personnel Management shall provide a copy of the final version of the project plan to...
Simulation of a Moving Elastic Beam Using Hamilton’s Weak Principle
2006-03-01
versions were limited to two-dimensional systems with open tree configurations (where a cut in any component separates the system in half) [48]. This...whose components experienced large angular rotations (turbomachinery, camshafts, flywheels, etc.). More complex systems required the simultaneous
Analyticity without Differentiability
ERIC Educational Resources Information Center
Kirillova, Evgenia; Spindler, Karlheinz
2008-01-01
In this article we derive all salient properties of analytic functions, including the analytic version of the inverse function theorem, using only the most elementary convergence properties of series. Not even the notion of differentiability is required to do so. Instead, analytical arguments are replaced by combinatorial arguments exhibiting…
78 FR 4100 - Connect America Fund
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-18
... submit along with their acceptance notices. Should such carriers be required to specify the technology or... Wireline Competition Bureau (Bureau) the task of developing a forward-looking cost model to determine... recently announced the availability of version one of the Connect America Cost Model, which provides the...
Consolidated List of Lists under EPCRA/CERCLA/CAA §112(r) (March 2015 Version)
List of Lists was prepared to help firms handling chemicals determine, for a specific chemical, whether they may be subject to the following reporting requirements under Emergency Planning and Community Right-to-Know, CERCLA, and Clean Air Act.
48 CFR 935.010 - Scientific and technical reports.
Code of Federal Regulations, 2013 CFR
2013-10-01
... information. The DOE Order 241.1B Scientific and Technical Information Management, or its successor version... conveyed in scientific and technical information (STI) shall include an instruction requiring the.... Department of Energy (DOE), Office of Scientific and Technical Information (OSTI), using the DOE Energy Link...
48 CFR 935.010 - Scientific and technical reports.
Code of Federal Regulations, 2011 CFR
2011-10-01
... information. The DOE Order 241.1B Scientific and Technical Information Management, or its successor version... conveyed in scientific and technical information (STI) shall include an instruction requiring the.... Department of Energy (DOE), Office of Scientific and Technical Information (OSTI), using the DOE Energy Link...
48 CFR 935.010 - Scientific and technical reports.
Code of Federal Regulations, 2010 CFR
2010-10-01
... information. The DOE Order 241.1B Scientific and Technical Information Management, or its successor version... conveyed in scientific and technical information (STI) shall include an instruction requiring the.... Department of Energy (DOE), Office of Scientific and Technical Information (OSTI), using the DOE Energy Link...
48 CFR 935.010 - Scientific and technical reports.
Code of Federal Regulations, 2012 CFR
2012-10-01
... information. The DOE Order 241.1B Scientific and Technical Information Management, or its successor version... conveyed in scientific and technical information (STI) shall include an instruction requiring the.... Department of Energy (DOE), Office of Scientific and Technical Information (OSTI), using the DOE Energy Link...
48 CFR 935.010 - Scientific and technical reports.
Code of Federal Regulations, 2014 CFR
2014-10-01
... information. The DOE Order 241.1B Scientific and Technical Information Management, or its successor version... conveyed in scientific and technical information (STI) shall include an instruction requiring the.... Department of Energy (DOE), Office of Scientific and Technical Information (OSTI), using the DOE Energy Link...
Calibration of HERS-ST for estimating traffic impact on pavement deterioration in Texas.
DOT National Transportation Integrated Search
2012-08-01
The Highway Economic Requirements System-State Version (or the HERS-ST) is a software package which was developed by the Federal Highway Administration as a tool for evaluating the performance of state highway systems. HERS-ST has the capabilities of...
NASA Technical Reports Server (NTRS)
Patt, Frederick S.; Hoisington, Charles M.; Gregg, Watson W.; Coronado, Patrick L.; Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Indest, A. W. (Editor)
1993-01-01
An analysis of orbit propagation models was performed by the Mission Operations element of the Sea-viewing Wide Field-of-View Sensor (SeaWiFS) Project, which has overall responsibility for the instrument scheduling. The orbit propagators selected for this analysis are widely available general perturbations models. The analysis includes both absolute accuracy determination and comparisons of different versions of the models. The results show that all of the models tested meet accuracy requirements for scheduling and data acquisition purposes. For internal Project use the SGP4 propagator, developed by the North American Air Defense (NORAD) Command, has been selected. This model includes atmospheric drag effects and, therefore, provides better accuracy. For High Resolution Picture Transmission (HRPT) ground stations, which have less stringent accuracy requirements, the publicly available Brouwer-Lyddane models are recommended. The SeaWiFS Project will make available portable source code for a version of this model developed by the Data Capture Facility (DCF).
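For readers who want to experiment with the kind of general-perturbations propagation discussed above, a minimal sketch using the open-source Python 'sgp4' package (not the SeaWiFS Project's FORTRAN code) follows; the two-line element set shown is only an illustrative placeholder and should be replaced with current elements.

    from sgp4.api import Satrec, jday

    # Placeholder two-line element set (ISS-like); replace with current elements.
    line1 = "1 25544U 98067A   19343.69339541  .00001764  00000-0  40967-4 0  9997"
    line2 = "2 25544  51.6439 211.2001 0007417  17.6667  85.6398 15.50103472202482"

    sat = Satrec.twoline2rv(line1, line2)     # parse the element set
    jd, fr = jday(2019, 12, 9, 12, 0, 0.0)    # time of interest (UTC)
    err, r, v = sat.sgp4(jd, fr)              # err == 0 means success

    if err == 0:
        print("Position (km, TEME frame):", r)
        print("Velocity (km/s):", v)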
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-31
...We are giving notice of changes to the Program Standards for the chronic wasting disease (CWD) herd certification program. The CWD herd certification program is a voluntary, cooperative program that establishes minimum requirements for the interstate movement of farmed or captive cervids, provisions for participating States to administer Approved State CWD Herd Certification Programs, and provisions for participating herds to become certified as having a low risk of being infected with CWD. The Program Standards provide optional guidance, explanation, and clarification on how to meet the requirements for interstate movement and for the Herd Certification Programs. Recently, we convened a group of State, laboratory, and industry representatives to discuss possible changes to the current Program Standards. The revised Program Standards reflect these discussions, and we believe the revised version will improve understanding of the program among State and industry cooperators. We are making the revised version of the Program Standards available for review and comment.
Bralet, Marie-Cécile; Falissard, Bruno; Neveu, Xavier; Lucas-Ross, Margaret; Eskenazi, Anne-Marie; Keefe, Richard S E
2007-09-01
Schizophrenic patients demonstrate impairments in several key dimensions of cognition. These impairments are correlated with important aspects of functional outcome. While assessment of these cognitive disorders is increasingly becoming a part of clinical and research practice in schizophrenia, there is no standard, easily administered test battery. The BACS (Brief Assessment of Cognition in Schizophrenia) has been validated in English [Keefe RSE, Goldberg TE, Harvey PD, Gold JM, Poe MP, Coughenour L. The Brief Assessment of Cognition in Schizophrenia: reliability, sensitivity, and comparison with a standard neurocognitive battery. Schizophr. Res 2004;68:283-97], and was found to be as sensitive to cognitive dysfunction as a standard battery of tests, with the advantage of requiring less than 35 min to complete. We developed a French adaptation of the BACS, and this study tested its ease of administration and concurrent validity. Correlation analyses between the BACS (version A) and a standard battery were performed. A sample of 50 stable schizophrenic patients received the French version A of the BACS in a first session and a standard battery in a second session. All patients completed each of the subtests of the French BACS. The mean duration of completion for the BACS French version was 36 min (S.D.=5.56). A correlation analysis between the BACS (version A) global score and the standard battery global score showed a significant result (r=0.81, p<0.0001). The correlation analysis between the BACS (version A) sub-scores and the standard battery sub-scores showed significant results for verbal memory, working memory, verbal fluency, attention and speed of information processing, and executive functions (p<0.001), and for motor speed (p<0.05). The French version of the BACS is easier to use with French schizophrenic patients than a standard battery (shorter administration and better completion rate), and its good psychometric properties suggest that it may be a useful tool for assessing cognition in schizophrenic patients with French as their primary language.
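As an illustration of the concurrent-validity computation reported above (correlating BACS global scores with a standard-battery global score), a minimal sketch using SciPy is shown below; the score arrays are hypothetical, not data from the study.

    from scipy.stats import pearsonr

    # Hypothetical global scores for eight patients (placeholders, not study data).
    bacs_global     = [42.0, 35.5, 50.1, 28.7, 44.3, 39.0, 31.2, 47.8]
    standard_global = [40.2, 33.1, 52.4, 30.0, 45.9, 37.5, 29.8, 49.1]

    r, p = pearsonr(bacs_global, standard_global)   # concurrent validity check
    print(f"r = {r:.2f}, p = {p:.4f}")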
SHABERTH - ANALYSIS OF A SHAFT BEARING SYSTEM (CRAY VERSION)
NASA Technical Reports Server (NTRS)
Coe, H. H.
1994-01-01
The SHABERTH computer program was developed to predict operating characteristics of bearings in a multibearing load support system. Lubricated and non-lubricated bearings can be modeled. SHABERTH calculates the loads, torques, temperatures, and fatigue life for ball and/or roller bearings on a single shaft. The program also allows for an analysis of the system reaction to the termination of lubricant supply to the bearings and other lubricated mechanical elements. SHABERTH has proven to be a valuable tool in the design and analysis of shaft bearing systems. The SHABERTH program is structured with four nested calculation schemes. The thermal scheme performs steady state and transient temperature calculations which predict system temperatures for a given operating state. The bearing dimensional equilibrium scheme uses the bearing temperatures, predicted by the temperature mapping subprograms, and the rolling element raceway load distribution, predicted by the bearing subprogram, to calculate bearing diametral clearance for a given operating state. The shaft-bearing system load equilibrium scheme calculates bearing inner ring positions relative to the respective outer rings such that the external loading applied to the shaft is brought into equilibrium by the rolling element loads which develop at each bearing inner ring for a given operating state. The bearing rolling element and cage load equilibrium scheme calculates the rolling element and cage equilibrium positions and rotational speeds based on the relative inner-outer ring positions, inertia effects, and friction conditions. The ball bearing subprograms in the current SHABERTH program have several model enhancements over similar programs. These enhancements include an elastohydrodynamic (EHD) film thickness model that accounts for thermal heating in the contact area and lubricant film starvation; a new model for traction combined with an asperity load sharing model; a model for the hydrodynamic rolling and shear forces in the inlet zone of lubricated contacts, which accounts for the degree of lubricant film starvation; modeling normal and friction forces between a ball and a cage pocket, which account for the transition between the hydrodynamic and elastohydrodynamic regimes of lubrication; and a model of the effect on fatigue life of the ratio of the EHD plateau film thickness to the composite surface roughness. SHABERTH is intended to be as general as possible. The models in SHABERTH allow for the complete mathematical simulation of real physical systems. Systems are limited to a maximum of five bearings supporting the shaft, a maximum of thirty rolling elements per bearing, and a maximum of one hundred temperature nodes. The SHABERTH program structure is modular and has been designed to permit refinement and replacement of various component models as the need and opportunities develop. A preprocessor is included in the IBM PC version of SHABERTH to provide a user friendly means of developing SHABERTH models and executing the resulting code. The preprocessor allows the user to create and modify data files with minimal effort and a reduced chance for errors. Data is utilized as it is entered; the preprocessor then decides what additional data is required to complete the model. Only this required information is requested. The preprocessor can accommodate data input for any SHABERTH compatible shaft bearing system model. The system may include ball bearings, roller bearings, and/or tapered roller bearings. 
SHABERTH is written in FORTRAN 77, and two machine versions are available from COSMIC. The CRAY version (LEW-14860) has a RAM requirement of 176K of 64 bit words. The IBM PC version (MFS-28818) is written for IBM PC series and compatible computers running MS-DOS, and includes a sample MS-DOS executable. For execution, the PC version requires at least 1Mb of RAM and an 80386 or 486 processor machine with an 80x87 math co-processor. The standard distribution medium for the IBM PC version is a set of two 5.25 inch 360K MS-DOS format diskettes. The contents of the diske
Pérez-Llamas, F; Garaulet, M; Torralba, C; Zamora, S
2012-01-01
The aim of this paper is to describe a new version of the software application GRUNUMUR, a useful tool for human nutrition studies designed by the Nutrition Research Group of the University of Murcia. Like the first version, this second version can address different types of study: dietary habits (24 h recall, 7-day dietary record and Food Frequency Questionnaire), epidemiological, anthropometrical and clinical studies. The new version, called GRUNUMUR 2.0 and compatible with the first one, has an online help system for all functions of the application to guide the user through its tasks; allows safe storage of a virtually unlimited number of results in an orderly, organized way, retrievable when required through a system of scheduled, unattended backups and maintenance (tasks performed by a server); is fully accessible both from the university intranet (www.um.es) and from the internet, working via a web browser (http://senver.inf.um.es/esen); and, finally, allows data to be exported to Excel for further processing with other applications, as well as reports to be published in PDF for delivery to study participants if necessary. The new version has been validated by comparing its results with those obtained from the other software, with no significant differences for any of the variables analyzed. GRUNUMUR 2.0 is an improved, useful and reliable tool for addressing human nutrition studies.
MOLECULAR DESIGNER: an interactive program for the display of protein structure on the IBM-PC.
Hannon, G J; Jentoft, J E
1985-09-01
A BASIC interactive graphics program has been developed for the IBM-PC which utilizes the graphics capabilities of that computer to display and manipulate protein structure from coordinates. Structures may be generated from typed files, or from Brookhaven National Laboratories' Protein Data Bank data tapes. Once displayed, images may be rotated, translated and expanded to any desired size. Figures may be viewed as ball-and-stick or space-filling models. Calculated multiple-point perspective may also be added to the display. Docking manipulations are possible since more than a single figure may be displayed and manipulated simultaneously. Further, stereo images and red/blue three-dimensional images may be generated using the accompanying DESIPLOT program and an HP-7475A plotter. A version of the program is also currently available for the Apple Macintosh. Full implementation on the Macintosh requires 512 K and at least one disk drive. Otherwise this version is essentially identical to the IBM-PC version described herein.
Thistlethwaite, Jill; Dallest, Kathy; Moran, Monica; Dunston, Roger; Roberts, Chris; Eley, Diann; Bogossian, Fiona; Forman, Dawn; Bainbridge, Lesley; Drynan, Donna; Fyfe, Sue
2016-07-01
The individual Teamwork Observation and Feedback Tool (iTOFT) was devised by a consortium of seven universities in recognition of the need for a means of observing and giving feedback to individual learners undertaking an interprofessional teamwork task. It was developed through a literature review of the existing teamwork assessment tools, a discussion of accreditation standards for the health professions, Delphi consultation and field-testing with an emphasis on its feasibility and acceptability for formative assessment. There are two versions: the Basic tool is for use with students who have little clinical teamwork experience and lists 11 observable behaviours under two headings: 'shared decision making' and 'working in a team'. The Advanced version is for senior students and junior health professionals and has 10 observable behaviours under four headings: 'shared decision making', 'working in a team', 'leadership', and 'patient safety'. Both versions include a comprehensive scale and item descriptors. Further testing is required to focus on its validity and educational impact.
Analysis of CrIS-ATMS Data Using an AIRS Science Team Version 6 - Like Retrieval Algorithm
NASA Technical Reports Server (NTRS)
Susskind, Joel; Kouvaris, Louis C.
2013-01-01
CrIS/ATMS is flying on NPP and is scheduled to fly on JPSS-1. CrIS/ATMS has roughly equivalent capabilities to AIRS/AMSU. The AIRS Science Team Version 6 retrieval algorithm is currently producing very high quality level-3 Climate Data Records (CDRs) that will be critical for understanding climate processes. AIRS CDRs should eventually cover the period September 2002 through at least 2020. CrIS/ATMS is the only scheduled follow-on to AIRS/AMSU. I have been asked by Ramesh Kakar whether CrIS/ATMS can be counted on to adequately continue the AIRS/AMSU CDRs beyond 2020, or whether something better is needed. This research is being done to answer that question. A minimum requirement for a yes answer is that CrIS/ATMS be analyzed using an AIRS Version 6-like algorithm. NOAA is currently generating CrIS/ATMS products using two algorithms: IDPS and NUCAPS.
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1992-01-01
The Penn State Finite Difference Time Domain Electromagnetic Scattering Code version D is a 3-D numerical electromagnetic scattering code based upon the finite difference time domain technique (FDTD). The manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into 14 sections: introduction; description of the FDTD method; operation; resource requirements; version D code capabilities; a brief description of the default scattering geometry; a brief description of each subroutine; a description of the include file; a section briefly discussing Radar Cross Section computations; a section discussing some scattering results; a sample problem setup section; a new problem checklist; references and figure titles. The FDTD technique models transient electromagnetic scattering and interactions with objects of arbitrary shape and/or material composition. In the FDTD method, Maxwell's curl equations are discretized in time-space and all derivatives (temporal and spatial) are approximated by central differences.
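To make the central-difference discretization concrete, a minimal one-dimensional FDTD (leapfrog) update loop is sketched below; it is a generic textbook illustration in Python, not code from the Penn State program.

    import numpy as np

    # 1-D free-space FDTD in normalized units: Maxwell's curl equations are
    # discretized with central differences, and E and H leapfrog in time.
    nx, nt = 200, 500
    ez = np.zeros(nx)              # electric field
    hy = np.zeros(nx)              # magnetic field
    src = nx // 2                  # soft-source location

    for n in range(nt):
        hy[:-1] += 0.5 * (ez[1:] - ez[:-1])     # update H from the curl of E
        ez[1:] += 0.5 * (hy[1:] - hy[:-1])      # update E from the curl of H
        ez[src] += np.exp(-0.5 * ((n - 40) / 12.0) ** 2)   # Gaussian pulse source

    print("peak |Ez| after", nt, "steps:", np.max(np.abs(ez)))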
Tewes, Susanne; Mokov, Nikolaj; Hartung, Dagmar; Schick, Volker; Peters, Inga; Schedl, Peter; Pertschy, Stefanie; Wacker, Frank; Voshage, Götz; Hueper, Katja
2016-01-01
Introduction: Objective of our study was to determine the agreement between version 1 (v1) and v2 of the Prostate Imaging Reporting and Data System (PI-RADS) for evaluation of multiparametric prostate MRI (mpMRI) and to compare their diagnostic accuracy, their inter-observer agreement and practicability. Material and Methods: mpMRI including T2-weighted imaging, diffusion-weighted imaging (DWI) and dynamic contrast-enhanced imaging (DCE) of 54 consecutive patients, who subsequently underwent MRI-guided in-bore biopsy, were re-analyzed according to PI-RADS v1 and v2 by two independent readers. Diagnostic accuracy for detection of prostate cancer (PCa) was assessed using ROC-curve analysis. Agreement between PI-RADS versions and observers was calculated and the time needed for scoring was determined. Results: MRI-guided biopsy revealed PCa in 31 patients. Diagnostic accuracy for detection of PCa was equivalent with both PI-RADS versions for reader 1 with sensitivities and specificities of 84%/91% (AUC = 0.91 95%CI[0.8–1]) for PI-RADS v1 and 100%/74% (AUC = 0.92 95% CI[0.8–1]) for PI-RADS v2. Reader 2 achieved similar diagnostic accuracy with sensitivity and specificity of 74%/91% (AUC = 0.88 95%CI[0.8–1]) for PI-RADS v1 and 81%/91% (AUC = 0.91 95%CI[0.8–1]) for PI-RADS v2. Agreement between scores determined with different PI-RADS versions was good (reader 1: κ = 0.62, reader 2: κ = 0.64). Inter-observer agreement was moderate with PI-RADS v2 (κ = 0.56) and fair with v1 (κ = 0.39). The time required for building the PI-RADS score was significantly lower with PI-RADS v2 compared to v1 (24.7±2.3 s vs. 41.9±2.6 s, p<0.001). Conclusion: Agreement between PI-RADS versions was high and both versions revealed high diagnostic accuracy for detection of PCa. Due to better inter-observer agreement for malignant lesions and less time demand, the new PI-RADS version could be more practicable for clinical routine. PMID:27657729
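For readers who wish to reproduce the kind of agreement statistic quoted above (Cohen's kappa between readers or between PI-RADS versions), a minimal sketch with scikit-learn follows; the category labels are hypothetical.

    from sklearn.metrics import cohen_kappa_score

    # Hypothetical PI-RADS categories assigned by two readers to ten lesions.
    reader1 = [4, 3, 5, 2, 4, 3, 5, 4, 2, 3]
    reader2 = [4, 3, 4, 2, 5, 3, 5, 4, 3, 3]

    kappa = cohen_kappa_score(reader1, reader2)
    print(f"Cohen's kappa = {kappa:.2f}")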
HALE UAS Command and Control Communications: Step 1 - Functional Requirements Document. Version 4.0
NASA Technical Reports Server (NTRS)
2006-01-01
The High Altitude Long Endurance (HALE) unmanned aircraft system (UAS) communicates with an off-board pilot-in-command in all flight phases via the C2 data link, making that link a critical component for the unmanned aircraft (UA) to fly in the NAS safely and routinely. This represents a new requirement for current FAA communications planning and monitoring processes. This document provides a set of comprehensive C2 communications functional requirements and performance guidelines to help facilitate the future FAA certification process for civil UAS operating in the NAS. The objective of the guidelines is to provide a basis for validating the functional requirements and, in the future, for developing performance-level requirements.
IDC Re-Engineering Phase 2 System Specification Document Version 1.5
DOE Office of Scientific and Technical Information (OSTI.GOV)
Satpathi, Meara Allena; Burns, John F.; Harris, James M.
This document contains the system specifications derived to satisfy the system requirements found in the IDC System Requirements Document for the IDC Re-Engineering Phase 2 project. This System Specification Document (SSD) defines waveform data processing requirements for the International Data Centre (IDC) of the Comprehensive Nuclear Test Ban Treaty Organization (CTBTO). The routine processing includes characterization of events with the objective of screening out events considered to be consistent with natural phenomena or non-nuclear, man-made phenomena. This document does not address requirements concerning acquisition, processing and analysis of radionuclide data but does include requirements for the dissemination of radionuclide data and products.
NASA Technical Reports Server (NTRS)
Bullington, Stanley F.
1992-01-01
The following list of requirements specifies the proposed revisions to the Experiment Scheduling Program (ESP2) which deal with schedule repair. These requirements are divided into those which are general in nature, those which relate to measurement and analysis functions of the software, those which relate specifically to conflict resolution, and those relating directly to the user interface. (This list is not a complete list of requirements for the user interface, but only a list of those schedule repair requirements which relate to the interface.) Some of the requirements relate only to uses of the software in real-time operations. Others are clearly for future versions of the software, beyond the upcoming revision. In either case, the fact will be clearly stated.
Superconducting multiport antenna arrays
NASA Astrophysics Data System (ADS)
Chaloupka, H.
1993-10-01
Applications of HTS to radiating elements and beam-forming networks of multibeam and/or multifrequency arrays are discussed. This includes radiating elements which meet special requirements with respect to size and frequency response. Realized versions of both a three-port HTS array and a 4 x 4 Butler matrix are presented.
77 FR 34123 - Information Collection Available for Public Comments and Recommendations
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-08
... from that office. SUPPLEMENTARY INFORMATION: Title of Collection: Monthly Report of Ocean Shipments... responsibilities under Public Resolution 17, to ensure compliance of ocean shipping requirements operating under.... An electronic version of this document is available on the World Wide Web at http://regulations.gov...
Steps toward Promoting Consistency in Educational Decisions
ERIC Educational Resources Information Center
Klein, Joseph
2010-01-01
Purpose: The literature indicates the advantages of decisions formulated through intuition, as well as the limitations, such as lack of consistency in similar situations. The principle of consistency (invariance), requiring that two equivalent versions of choice-problems will produce the same preference, is violated in intuitive judgment. This…
Estimating the Overdiagnosis Fraction in Cancer Screening | Division of Cancer Prevention
By Stuart G. Baker, 2017. Introduction: This software supports the mathematical investigation into estimating the fraction of cancers detected on screening that are overdiagnosed. References: Baker SG and Prorok PC. Estimating the overdiagnosis fraction in cancer screening. Requirement: Mathematica Version 11 or later.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mix, Scott R.; Kirkham, Harold; Silverstein, Alison
Compliance with the NERC requirements for Critical Infrastructure Protection (CIP) for synchrophasor systems in the Version 5 paradigm seems to be a matter of some uncertainty for those in the synchrophasor user community. This report aims to provide clarification and guidance in the form of case studies based on methods seen in the industry.
DOT National Transportation Integrated Search
1997-09-19
This report gives an overview of the National Intelligent Transportation Infrastructure Initiative (NITI). NITI refers to the integrated electronics, communications, and hardware and software elements that are available to support Intelligent Transpo...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-17
... Supply Schedules) AGENCY: Office of Acquisition Policy, General Services Administration (GSA). ACTION... collection requirement regarding the Modifications (Federal Supply Schedule) clause. DATES: Submit comments... (GSAR) to add clause 552.243-81 Modifications (Federal Supply Schedule) and an Alternate I version of...
New York City's fight over calorie labeling.
Farley, Thomas A; Caffarelli, Anna; Bassett, Mary T; Silver, Lynn; Frieden, Thomas R
2009-01-01
In 2006, New York City's Health Department amended the city Health Code to require the posting of calorie counts by chain restaurants on menus, menu boards, and item tags. This was one element of the city's response to rising obesity rates. Drafting the rule involved many decisions that affected its impact and its legal viability. The restaurant industry argued against the rule and twice sued to prevent its implementation. An initial version of the rule was found to be preempted by federal law, but a revised version was implemented in January 2008. The experience shows that state and local health departments can use their existing authority over restaurants to combat obesity and, indirectly, chronic diseases.
Chen, Alice P; Setser, Ann; Anadkat, Milan J; Cotliar, Jonathan; Olsen, Elise A; Garden, Benjamin C; Lacouture, Mario E
2012-11-01
Dermatologic adverse events to cancer therapies have become more prevalent and may lead to dose modifications or discontinuation of life-saving or life-prolonging treatments. This has resulted in a new collaboration between oncologists and dermatologists, which requires accurate cataloging and grading of side effects. The Common Terminology Criteria for Adverse Events Version 4.0 is a descriptive terminology and grading system that can be used for uniform reporting of adverse events. A proper understanding of this standardized classification system is essential for dermatologists to properly communicate with all physicians caring for patients with cancer. Copyright © 2012 American Academy of Dermatology, Inc. Published by Mosby, Inc. All rights reserved.
CFD Based Computations of Flexible Helicopter Blades for Stability Analysis
NASA Technical Reports Server (NTRS)
Guruswamy, Guru P.
2011-01-01
As a collaborative effort among government aerospace research laboratories, an advanced version of a widely used computational fluid dynamics code, OVERFLOW, was recently released. This latest version includes additions to model flexible rotating multiple blades. In this paper, the OVERFLOW code is applied to improve the accuracy of airload computations from the linear lifting-line theory that uses displacements from a beam model. Data transfers required at every revolution are managed through a Unix-based script that runs jobs on large super-cluster computers. Results are demonstrated for the 4-bladed UH-60A helicopter. Deviations of computed data from flight data are evaluated. Fourier analysis post-processing suitable for aeroelastic stability computations is performed.
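The Fourier post-processing mentioned above amounts to extracting per-revolution harmonics from a periodic airload signal; the sketch below is a generic NumPy illustration on a synthetic signal, not part of the OVERFLOW tool chain.

    import numpy as np

    # Synthetic periodic airload over one rotor revolution: mean + 1/rev and
    # 4/rev content plus noise.  Harmonic amplitudes are read off the FFT.
    n_azimuth = 360
    psi = np.linspace(0.0, 2.0 * np.pi, n_azimuth, endpoint=False)
    rng = np.random.default_rng(0)
    load = 1.0 + 0.5 * np.cos(psi) + 0.2 * np.sin(4 * psi) \
           + 0.01 * rng.standard_normal(n_azimuth)

    spectrum = np.fft.rfft(load) / n_azimuth
    print(f"mean load: {spectrum[0].real:.3f}")
    for k in range(1, 6):                       # first five harmonics
        print(f"{k}/rev amplitude: {2.0 * np.abs(spectrum[k]):.3f}")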
Further investigation on "A multiplicative regularization for force reconstruction"
NASA Astrophysics Data System (ADS)
Aucejo, M.; De Smet, O.
2018-05-01
We have recently proposed a multiplicative regularization to reconstruct mechanical forces acting on a structure from vibration measurements. This method does not require any selection procedure for choosing the regularization parameter, since the amount of regularization is automatically adjusted throughout an iterative resolution process. The proposed iterative algorithm has been developed with performance and efficiency in mind, but it is actually a simplified version of a full iterative procedure not described in the original paper. The present paper aims at introducing the full resolution algorithm and comparing it with its simplified version in terms of computational efficiency and solution accuracy. In particular, it is shown that both algorithms lead to very similar identified solutions.
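As a rough illustration of the idea of adjusting the amount of regularization inside the iteration (rather than selecting a regularization parameter beforehand), a generic sketch follows; the transfer matrix, data and update rule are illustrative assumptions, and this is not the authors' algorithm.

    import numpy as np

    rng = np.random.default_rng(1)
    H = rng.standard_normal((60, 40))                 # assumed transfer matrix
    f_true = np.zeros(40)
    f_true[[5, 20]] = [1.0, -0.7]                     # two localized forces
    x = H @ f_true + 0.01 * rng.standard_normal(60)   # noisy "measurements"

    f = np.linalg.lstsq(H, x, rcond=None)[0]          # start from the plain fit
    for _ in range(30):
        residual = x - H @ f
        # Regularization strength re-estimated from the current solution, so no
        # regularization parameter is chosen by hand (illustrative rule only).
        lam = (residual @ residual) / (f @ f + 1e-12)
        f = np.linalg.solve(H.T @ H + lam * np.eye(H.shape[1]), H.T @ x)

    print("reconstruction error:", np.linalg.norm(f - f_true))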
Revised and extended UTILITIES for the RATIP package
NASA Astrophysics Data System (ADS)
Nikkinen, J.; Fritzsche, S.; Heinäsmäki, S.
2006-09-01
In recent years, the RATIP package has been found useful for calculating the excitation and decay properties of free atoms. Based on the (relativistic) multiconfiguration Dirac-Fock method, this program is used to obtain accurate predictions of atomic properties and to analyze many recent experiments. Daily work with this package made an extension of its UTILITIES [S. Fritzsche, Comput. Phys. Comm. 141 (2001) 163] desirable in order to facilitate the data handling and interpretation of complex spectra. For this purpose, we make available an enlarged version of the UTILITIES which mainly supports the comparison with experiment as well as large Auger computations. Altogether 13 additional tasks have been appended to the program, together with a new menu structure to improve the interactive control of the program. Program summary: Title of program: RATIP. Catalogue identifier: ADPD_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADPD_v2_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Licensing provisions: none. Reference in CPC to previous version: S. Fritzsche, Comput. Phys. Comm. 141 (2001) 163. Catalogue identifier of previous version: ADPD. Authors of previous version: S. Fritzsche, Department of Physics, University of Kassel, Heinrich-Plett-Strasse 40, D-34132 Kassel, Germany. Does the new version supersede the original program?: yes. Computer for which the new version is designed and others on which it has been tested: IBM RS 6000, PC Pentium II-IV. Installations: University of Kassel (Germany), University of Oulu (Finland). Operating systems: IBM AIX, Linux, Unix. Program language used in the new version: ANSI standard Fortran 90/95. Memory required to execute with typical data: 300 kB. No. of bits in a word: All real variables are parameterized by a selected kind parameter and, thus, can be adapted to any required precision if supported by the compiler. Currently, the kind parameter is set to double precision (two 32-bit words) as used also for other components of the RATIP package [S. Fritzsche, C.F. Fischer, C.Z. Dong, Comput. Phys. Comm. 124 (2000) 341; G. Gaigalas, S. Fritzsche, Comput. Phys. Comm. 134 (2001) 86; S. Fritzsche, Comput. Phys. Comm. 141 (2001) 163; S. Fritzsche, J. Elec. Spec. Rel. Phen. 114-116 (2001) 1155]. No. of lines in distributed program, including test data, etc.: 231 813. No. of bytes in distributed program, including test data, etc.: 3 977 387. Distribution format: tar.gzip file. Nature of the physical problem: In order to describe atomic excitation and decay properties quantitatively, large-scale computations are often needed. In the framework of the RATIP package, the UTILITIES support a variety of (small) tasks. For example, these tasks facilitate the file and data handling in large-scale applications or the interpretation of complex spectra. Method of solution: The revised UTILITIES now support a total of 29 subtasks which are mainly concerned with the manipulation of output data as obtained from other components of the RATIP package. Each of these tasks is realized by one or several subprocedures which have access to the corresponding modules of the main components. While the main menu defines seven groups of subtasks for data manipulations and computations, a particular task is selected from one of these group menus. This allows the program to be enlarged later if technical support for further tasks becomes necessary.
For each selected task, an interactive dialog about the required input and output data, as well as some additional information, is printed during the execution of the program. Reasons for the new version: The requirement for enlarging the previous version of the UTILITIES [S. Fritzsche, Comput. Phys. Comm. 141 (2001) 163] arose from the recent application of the RATIP package for large-scale radiative and Auger computations. A number of new subtasks now refer to the handling of Auger amplitudes and their proper combination in order to facilitate the interpretation of complex spectra. A few further tasks, such as direct access to the one-electron matrix elements for some given set of orbital functions, have also been found useful in the analysis of data. Summary of revisions: extraction and handling of atomic data within the framework of RATIP. With the revised version, we now 'add' another 13 tasks which refer to the manipulation of data files, the generation and interpretation of Auger spectra, the computation of various one- and two-electron matrix elements, as well as the evaluation of momentum densities and grid parameters. Owing to the rather large number of subtasks, the main menu has been divided into seven groups from which the individual tasks can be selected in much the same way as before. Typical running time: The program responds promptly for most of the tasks. The response time for some tasks, such as the generation of a relativistic momentum density, strongly depends on the size of the corresponding data files and the number of grid points. Unusual features of the program: A total of 29 different tasks are supported by the program. Starting from the main menu, the user is guided interactively through the program by a dialog and a few additional explanations. For each task, a short summary about its function is displayed before the program prompts for all the required input data.
2002-10-01
[Fragment of a standards listing: SAE J1013 (1992), "...the Seated Operator of Off-Highway Work Machines" (http://www.sae.org/servlets/index); the NASA Preferred Technical Standards index (http://standards.nasa.gov/NPTS/login.taf), where Public Access permits users to view the index and download NASA-developed standards free of charge; http://www.techstreet.com/; ergonomic requirements for the design of displays and control actuators.]
Friendly Extensible Transfer Tool Beta Version
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, William P.; Gutierrez, Kenneth M.; McRee, Susan R.
2016-04-15
Often, data transfer software is designed to meet specific requirements or to apply to specific environments. Frequently, this requires source code integration for added functionality. An extensible data transfer framework is needed to incorporate new capabilities more easily, in modular fashion. Using the FrETT framework, functionality may be incorporated (in many cases without the need for source code) to handle new platform capabilities: I/O methods (e.g., platform-specific data access), network transport methods, and data processing (e.g., data compression).
NASA Technical Reports Server (NTRS)
Tikidjian, Raffi; Mackey, Ryan
2008-01-01
The DSN Array Simulator (wherein 'DSN' signifies NASA's Deep Space Network) is an updated version of software previously denoted the DSN Receive Array Technology Assessment Simulation. This software is used for computational modeling of a proposed DSN facility comprising user-defined arrays of antennas and transmitting and receiving equipment for microwave communication with spacecraft on interplanetary missions. The simulation includes variations in the spacecraft tracked and in communication demand for up to several decades of future operation. Such modeling is performed to estimate facility performance, evaluate requirements that govern facility design, and evaluate proposed improvements in hardware and/or software. The updated version of this software affords enhanced capability for characterizing facility performance against user-defined mission sets. The software includes a Monte Carlo simulation component that enables rapid generation of key mission-set metrics (e.g., numbers of links, data rates, and data volumes), and statistical distributions thereof as functions of time. The updated version also offers expanded capability for mixed-asset network modeling--for example, for running scenarios that involve user-definable mixtures of antennas having different diameters (in contradistinction to a fixed number of antennas having the same fixed diameter). The improved version also affords greater simulation fidelity, sufficient for validation by comparison with actual DSN operations and analytically predictable performance metrics.
Siegel, Michael; Kurland, Rachel P.; Castrini, Marisa; Morse, Catherine; de Groot, Alexander; Retamozo, Cynthia; Roberts, Sarah P.; Ross, Craig S.; Jernigan, David H.
2015-01-01
Background: No previous paper has examined alcohol advertising on the internet versions of television programs popular among underage youth. Objectives: To assess the volume of alcohol advertising on web sites of television networks which stream television programs popular among youth. Methods: Multiple viewers analyzed the product advertising appearing on 12 television programs that are available in full episode format on the internet. During a baseline period of one week, six coders analyzed all 12 programs. For the nine programs that contained alcohol advertising, three underage coders (ages 10, 13, and 18) analyzed the programs to quantify the extent of that advertising over a four-week period. Results: Alcohol advertisements are highly prevalent on these programs, with nine of the 12 shows carrying alcohol ads, and six programs averaging at least one alcohol ad per episode. There was no difference in alcohol ad exposure for underage and legal age viewers. Conclusions: There is a substantial potential for youth exposure to alcohol advertising on the internet through internet-based versions of television programs. The Federal Trade Commission should require alcohol companies to report the underage youth and adult audiences for internet versions of television programs on which they advertise. PMID:27212891
Curran, Patrick J.
2009-01-01
The following manuscript is the final accepted manuscript. It has not been subjected to the final copyediting, fact-checking, and proofreading required for formal publication. It is not the definitive, publisher-authenticated version. The American Psychological Association and its Council of Editors disclaim any responsibility or liabilities for errors or omissions of this manuscript version, any version derived from this manuscript by NIH, or other third parties. The published version is available at www.apa.org/journals/met. The goal of any empirical science is to pursue the construction of a cumulative base of knowledge upon which the future of the science may be built. However, there is mixed evidence that the science of psychology can accurately be characterized by such a cumulative progression. Indeed, some argue that the development of a truly cumulative psychological science is not possible using the current paradigms of hypothesis testing in single-study designs. The author explores this controversy as a framework to introduce the six papers that make up this special issue that is focused on the integration of data and empirical findings across multiple studies. The author proposes that the methods and techniques described in this set of papers can significantly propel us forward in our ongoing quest to build a cumulative psychological science. PMID:19485622
A theoretical basis for the analysis of redundant software subject to coincident errors
NASA Technical Reports Server (NTRS)
Eckhardt, D. E., Jr.; Lee, L. D.
1985-01-01
Fundamental to the development of redundant software techniques, known as fault-tolerant software, is an understanding of the impact of multiple joint occurrences of coincident errors. A theoretical basis for the study of redundant software is developed which provides a probabilistic framework for empirically evaluating the effectiveness of the general (N-Version) strategy when component versions are subject to coincident errors, and permits an analytical study of the effects of these errors. The basic assumptions of the model are: (1) independently designed software components are chosen in a random sample; and (2) in the user environment, the system is required to execute on a stationary input series. The intensity of coincident errors has a central role in the model. This function describes the propensity to introduce design faults in such a way that software components fail together when executing in the user environment. The model is used to give conditions under which an N-Version system is a better strategy for reducing system failure probability than relying on a single version of software. A condition which limits the effectiveness of a fault-tolerant strategy is studied, and the question is posed whether system failure probability varies monotonically with increasing N or whether an optimal choice of N exists.
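The comparison posed above (a single version versus an N-version majority vote when failures can coincide) can be illustrated numerically; the sketch below is a toy Monte Carlo with an assumed input-dependent failure intensity, not the paper's analytical model.

    import numpy as np

    # Each input draws a failure "intensity" theta (the propensity of an
    # arbitrary version to fail on it); the three versions then fail
    # independently given theta, so failures can coincide across versions.
    rng = np.random.default_rng(0)
    n_inputs, N = 200_000, 3
    theta = rng.beta(0.05, 5.0, size=n_inputs)        # assumed intensity law
    fails = rng.random((n_inputs, N)) < theta[:, None]

    p_single = fails[:, 0].mean()                     # one version alone
    p_majority = (fails.sum(axis=1) > N // 2).mean()  # 2-of-3 majority fails

    print(f"single-version failure probability: {p_single:.4f}")
    print(f"3-version majority failure probability: {p_majority:.4f}")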
Environmental Adaptations Improve Everyday Action in Schizophrenia.
Kessler, Rachel K; Rhodes, Emma; Giovannetti, Tania
2015-05-01
Cognitive functioning, particularly executive functioning, is a strong predictor of functional outcomes in people with schizophrenia. Cognitive remediation has been shown to improve specific cognitive processes, but adjunctive interventions are required for meaningful gains in adaptive functioning, particularly in people with chronic illness. This study examined whether (and how) environmental adaptations, used without training, may circumvent cognitive difficulties and facilitate everyday task performance in individuals with chronic schizophrenia. Forty-two individuals with chronic schizophrenia/schizoaffective disorder were administered cognitive measures and two versions of the Naturalistic Action Test (NAT)-a standard version (ST-NAT), and a user-centered version (UC-NAT) that incorporated environmental adaptations designed to facilitate task performance. The NAT conditions were counterbalanced across participants. Analyses compared performance between the NAT versions and examined the cognitive correlates of each NAT condition. Individuals with schizophrenia made fewer errors on the UC-NAT as compared to the ST-NAT; this between-group difference was significant for all error types. Compared to the ST-NAT, the UC-NAT performance was not significantly associated with an executive function measure of planning. Environmental adaptations may be implemented without extensive training to improve everyday action in individuals with chronic schizophrenia. Environmental adaptations that reduce planning demands may be most effective in this population.
Development of alternative versions of the Logical Memory subtest of the WMS-R for use in Brazil
Bolognani, Silvia Adriana Prado; Miranda, Monica Carolina; Martins, Marjorie; Rzezak, Patricia; Bueno, Orlando Francisco Amodeo; de Camargo, Candida Helena Pires; Pompeia, Sabine
2015-01-01
The logical memory test of the Wechsler Memory Scale is one of the most frequently used standardized tests for assessing verbal memory and consists of two separate short stories, each containing 25 idea units. Problems with practice effects arise with re-testing a patient, as these stories may be remembered from previous assessments. Therefore, alternative versions of the test stimuli should be developed to minimize learning effects when repeated testing is required for longitudinal evaluations of patients. Objective: To present three alternative stories for each of the original stories frequently used in Brazil (Ana Soares and Roberto Mota) and to show their similarity in terms of content, structure and linguistic characteristics. Methods: The alternative stories were developed according to the following criteria: overall structure or thematic content (presentation of the character, conflict, aggravation or complements and resolution); specific structure (sex of the character, location and occupation, details of what happened); formal structure (number of words, characters, verbs and nouns); and readability. Results: The alternative stories and scoring criteria are presented in comparison to the original WMS stories (Brazilian version). Conclusion: The alternative stories presented here correspond well thematically and structurally to the Brazilian versions of the original stories. PMID:29213955
Belfort, Tatiana; Bramham, Jessica; Simões Neto, José Pedro; Sousa, Maria Fernanda Barroso de; Santos, Raquel Luiza dos; Nogueira, Marcela Moreira Lima; Torres, Bianca; Rosa, Rachel Dias Lopes da; Dourado, Marcia Cristina Nascimento
2015-01-01
Impairments in social and emotional functioning may affect the communication skills and interpersonal relationships of people with dementia and their caregivers. This study had the aim of presenting the steps involved in the cross-cultural adaptation of the Social and Emotional Questionnaire (SEQ) for the Brazilian population. Cross-cultural adaptation study, conducted at the Center for Alzheimer's Disease and Related Disorders in a public university. The process adopted in this study required six consecutive steps: initial translation, translation synthesis, back translation, committee of judges, pretesting of final version and submission to the original author. In general, the items had semantic, idiomatic, conceptual and experiential equivalence. During the first pretest, people with dementia and their caregivers had difficulties in understanding some items relating to social skills, which were interpreted ambiguously. New changes were made to allow better adjustment to the target population and, following this, a new pretest was performed. This pre-test showed that the changes were relevant and gave rise to the final version of the instrument. There was no correlation between education level and performance in the questionnaire, among people with dementia (P = 0.951). The Brazilian Portuguese version of the Social and Emotional Questionnaire was well understood and, despite the cultural and linguistic differences, the constructs of the original version were maintained.
Sadegh Moghadam, Leila; Foroughan, Mahshid; Mohammadi Shahboulaghi, Farahnaz; Ahmadi, Fazlollah; Sajjadi, Moosa; Farhadi, Akram
2016-01-01
Background: Perceptions of aging refer to individuals’ understanding of aging within their sociocultural context. Proper measurement of this concept in various societies requires accurate tools. Objective: The present study was conducted with the aim to translate and validate the Brief Aging Perceptions Questionnaire (B-APQ) and assess its psychometric features in Iranian older adults. Method: In this study, the Persian version of the B-APQ was validated for 400 older adults. The questionnaire was translated into Persian according to the model of Wild et al. The Persian version was validated using content, face, and construct (confirmatory factor analysis) validity, and then its internal consistency and test–retest reliability were measured. Data were analyzed using the statistical software programs SPSS 18 and EQS-6.1. Results: The confirmatory factor analysis confirmed the construct validity and five subscales of the B-APQ. Test–retest reliability with a 3-week interval produced r=0.94. Cronbach’s alpha was found to be 0.75 for the whole questionnaire, and from 0.53 to 0.77 for the five factors. Conclusion: The Persian version of the B-APQ showed favorable validity and reliability, and thus it can be used for measuring different dimensions of perceptions of aging in Iranian older adults. PMID:27194907
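The internal-consistency statistic reported above (Cronbach's alpha) can be computed directly from an item-by-respondent score matrix; a minimal NumPy sketch with simulated item scores follows.

    import numpy as np

    def cronbach_alpha(scores: np.ndarray) -> float:
        """Cronbach's alpha for a (respondents x items) score matrix."""
        k = scores.shape[1]
        item_vars = scores.var(axis=0, ddof=1)         # per-item variances
        total_var = scores.sum(axis=1).var(ddof=1)     # variance of total score
        return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

    rng = np.random.default_rng(0)
    trait = rng.normal(size=(100, 1))                       # simulated latent trait
    scores = trait + rng.normal(scale=1.0, size=(100, 5))   # five noisy items
    print(f"alpha = {cronbach_alpha(scores):.2f}")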
Glauser, Bianca F; Vairo, Bruno C; Oliveira, Stephan-Nicollas M C G; Cinelli, Leonardo P; Pereira, Mariana S; Mourão, Paulo A S
2012-02-01
Patent protection for enoxaparin has expired. Generic preparations are developed and approved for clinical use in different countries. However, there is still skepticism about the possibility of making an exact copy of the original drug due to the complex processes involved in generating low-molecular-weight heparins. We have undertaken a careful analysis of generic versions of enoxaparin available for clinical use in Brazil. Thirty-three batches of active ingredient and 70 of the final pharmaceutical product were obtained from six different suppliers. They were analysed for their chemical composition, molecular size distribution, in vitro anticoagulant activity and pharmacological effects on animal models of experimental thrombosis and bleeding. Clearly, the generic versions of enoxaparin available for clinical use in Brazil are similar to the original drug. Only three out of 33 batches of active ingredient from one supplier showed differences in molecular size distribution, resulting from a low percentage of tetrasaccharide or the presence of a minor component eluted as monosaccharide. Three out of 70 batches of the final pharmaceutical products contained lower amounts of the active ingredient than that declared by the suppliers. Our results suggest that the generic versions of enoxaparin are a viable therapeutic option, but their use requires strict regulations to ensure accurate standards.
African Games of Strategy: A Teaching Manual. African Outreach Series, No. 2.
ERIC Educational Resources Information Center
Crane, Louise
Appreciation of African games has increased in this country, especially of board games, which have been popularized through commercial versions. African games are invaluable resources for studying subjects requiring mathematical concepts, as well as social studies, history, geography, and languages. This manual presents some of the better known…
Jobs: Finding and Keeping = Empleos: Buscandolos y Manteniendolos
ERIC Educational Resources Information Center
Private Industry Council of Lehigh Valley, Inc., Allentown, PA.
This document consists of the English and Spanish versions of a booklet to aid individuals in finding and keeping jobs for which they are best suited. Topics covered include analyzing personal requirements (abilities, interests), where to look for jobs, letters of application, resumes, application forms, employment interviews, and job keeping…
CD-ROM in a High School Library Media Center.
ERIC Educational Resources Information Center
Barlow, Diane; And Others
1987-01-01
Describes the experiences of high school students using microcomputers to access an electronic version of an encyclopedia in the school's media center. The topics discussed include hardware and software requirements of the CD-ROM format, information seeking strategies and problems observed, student satisfaction with the system, and recommendations…
From "Ritual" to "Mindfulness": Policy and Pedagogic Positioning
ERIC Educational Resources Information Center
Adams, Paul
2011-01-01
Schools and professionals respond to statute in different ways. However, professional activity is more than mediated response to policy. Versions of pedagogy are not simply envisaged on high and enacted in the workplace. This paper examines how professional views formulate policy imperatives. It proposes that to understand pedagogy requires an…
Microcomputer Applications in Local Assessment Systems.
ERIC Educational Resources Information Center
Harnisch, Delwyn L.; And Others
The capabilities and hardware requirements of four microcomputer software packages produced by the Office of Educational Testing, Research and Service at the University of Illinois at Urbana-Champaign are described. These programs are: (1) the Scan-Tron Forms Analysis Package Version 2.0, an interface between an IBM-compatible and a Scan-Tron…
NASA Technical Reports Server (NTRS)
Zawadzki, M.
2001-01-01
Presented is a description of the single stacked element, and measured and calculated results at 2.56 GHz. Also included are measured results for the array, and calculated results of a stacked element for the required frequency-scaled version at 32 GHz.
ERIC Educational Resources Information Center
Lane, W. Brian
2014-01-01
The traditional introductory-level meterstick-balancing lab assumes that students already know what torque is and that they readily identify it as a physical quantity of interest. We propose a modified version of this activity in which students qualitatively and quantitatively measure the amount of force required to keep the meterstick level. The…
Visual Elements and Container Metaphors for Multi-Media.
ERIC Educational Resources Information Center
Howarth, Mike
1997-01-01
An interactive version of an educational radio program can be developed quickly and easily with a main menu interface that takes into account physical classroom conditions; interactive learning interfaces that accommodate eye and vision requirements of children; and a story interface design informed by the "container" metaphor and the 2-D…
Code of Federal Regulations, 2010 CFR
2010-10-01
... TRANSPORTATION Foreign Travel 947.7001 Policy. Contractor foreign travel shall be conducted pursuant to the requirements contained in DOE Order 551.1C, or its successor, Official Foreign Travel, or any subsequent version of the order in effect at the time of award. [65 FR 81007, Dec. 22, 2000, as amended at 74 FR...
Code of Federal Regulations, 2012 CFR
2012-10-01
... TRANSPORTATION Foreign Travel 947.7001 Policy. Contractor foreign travel shall be conducted pursuant to the requirements contained in DOE Order 551.1C, or its successor, Official Foreign Travel, or any subsequent version of the order in effect at the time of award. [65 FR 81007, Dec. 22, 2000, as amended at 74 FR...
Code of Federal Regulations, 2013 CFR
2013-10-01
... TRANSPORTATION Foreign Travel 947.7001 Policy. Contractor foreign travel shall be conducted pursuant to the requirements contained in DOE Order 551.1C, or its successor, Official Foreign Travel, or any subsequent version of the order in effect at the time of award. [65 FR 81007, Dec. 22, 2000, as amended at 74 FR...
Code of Federal Regulations, 2011 CFR
2011-10-01
... TRANSPORTATION Foreign Travel 947.7001 Policy. Contractor foreign travel shall be conducted pursuant to the requirements contained in DOE Order 551.1C, or its successor, Official Foreign Travel, or any subsequent version of the order in effect at the time of award. [65 FR 81007, Dec. 22, 2000, as amended at 74 FR...
Code of Federal Regulations, 2014 CFR
2014-10-01
... TRANSPORTATION Foreign Travel 947.7001 Policy. Contractor foreign travel shall be conducted pursuant to the requirements contained in DOE Order 551.1C, or its successor, Official Foreign Travel, or any subsequent version of the order in effect at the time of award. [65 FR 81007, Dec. 22, 2000, as amended at 74 FR...
The Military Language Tutor (MILT)
1998-11-01
interactive tutor in a Pentium-based laptop computer. The first version of MILT with keyboard input was designed for Spanish and Arabic and can recognize... NLP). The goal of the MILT design team was an authoring system which would require no formal external training and which could be learned within four
77 FR 47076 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-07
... five-year funding period. Pre-test and follow-up versions of the survey are expected to require... from the respondents pre-test (e.g., demographics, agency type) in order to further expedite completion... respondents respondent per response hours IAATP: Trainee Survey Pre-Test Administration... 1,200 1 0.15 180...
48 CFR 1801.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2010 CFR
2010-10-01
... title 48, CFR. (iii) The single official NASA-maintained version of the NFS is on the Internet (http://www.hq.nasa.gov/office/procurement/regs/nfstoc.htm). [69 FR 21762, Apr. 22, 2004] ... require public comment. NASA personnel must comply with all regulatory and internal guidance and...
48 CFR 1801.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2014 CFR
2014-10-01
... title 48, CFR. (iii) The single official NASA-maintained version of the NFS is on the Internet (http://www.hq.nasa.gov/office/procurement/regs/nfstoc.htm). [69 FR 21762, Apr. 22, 2004] ... require public comment. NASA personnel must comply with all regulatory and internal guidance and...
48 CFR 1801.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2013 CFR
2013-10-01
... title 48, CFR. (iii) The single official NASA-maintained version of the NFS is on the Internet (http://www.hq.nasa.gov/office/procurement/regs/nfstoc.htm). [69 FR 21762, Apr. 22, 2004] ... require public comment. NASA personnel must comply with all regulatory and internal guidance and...
48 CFR 1801.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2012 CFR
2012-10-01
... title 48, CFR. (iii) The single official NASA-maintained version of the NFS is on the Internet (http://www.hq.nasa.gov/office/procurement/regs/nfstoc.htm). [69 FR 21762, Apr. 22, 2004] ... require public comment. NASA personnel must comply with all regulatory and internal guidance and...
48 CFR 1801.105-1 - Publication and code arrangement.
Code of Federal Regulations, 2011 CFR
2011-10-01
... title 48, CFR. (iii) The single official NASA-maintained version of the NFS is on the Internet (http://www.hq.nasa.gov/office/procurement/regs/nfstoc.htm). [69 FR 21762, Apr. 22, 2004] ... require public comment. NASA personnel must comply with all regulatory and internal guidance and...
Improved Coulomb-Friction Damper
NASA Technical Reports Server (NTRS)
Campbell, G. E.
1985-01-01
Equal damping is provided on forward and reverse strokes. The improved damper has springs and wedge rings symmetrically placed on both ends of the piston wedge, so the friction force is the same in both directions of travel. Unlike the conventional automotive shock absorbers they resemble on the outside, both versions require no viscous liquid and operate over a wide temperature range.
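The key property described above, a friction force of equal magnitude on forward and reverse strokes, corresponds to the standard Coulomb friction law; the snippet below is a generic numerical illustration, not a model of the specific hardware.

    import numpy as np

    def coulomb_damper_force(velocity: float, friction_force: float = 50.0) -> float:
        """Damper force (N) opposing motion: constant magnitude either way."""
        return -friction_force * np.sign(velocity)

    for v in (-0.2, -0.01, 0.01, 0.2):                 # m/s, forward and reverse
        print(f"v = {v:+.2f} m/s -> F = {coulomb_damper_force(v):+.1f} N")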
75 FR 17052 - Issuance of Electronic Documents and Related Recordkeeping Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-05
... BIS to eliminate the paper versions of most export and reexport licenses, notices of denial of license applications, notices of return of a license application without action, notices of results of classification requests, License Exception AGR notification results, and encryption review request results. This rule also...
Crime in the Classroom Part IV: Conclusions
ERIC Educational Resources Information Center
Harpp, David N.
2008-01-01
In 1990, the McGill University Senate established regulations governing how multiple-choice exams are to be conducted. The primary rules require multiple-version exams (scrambled question positions and, if possible, scrambled answer positions) as well as assigned seating or seating with alternating rows of students from different courses. In recent years, we…
Astronomical Catalogues - Definition Elements and Afterlife
NASA Astrophysics Data System (ADS)
Jaschek, C.
1984-09-01
Based on a look at the different meanings of the term catalogue (or catalog), a definition is proposed. In an analysis of the main elements, a number of requirements that catalogues should satisfy are pointed out. A section is devoted to problems connected with computer-readable versions of printed catalogues.
Investigating Evolutionary Biology in the Laboratory.
ERIC Educational Resources Information Center
McComas, William F., Ed.
This document presents a collection of useful laboratory-based activities for teaching about evolution. Some of the activities in this monograph are previously unpublished exercises, some are new versions of well-known labs, a few make useful classroom demonstrations, and several require somewhat sophisticated equipment. As a group, the activities…
DOT National Transportation Integrated Search
2003-09-01
Electronic Flight Bags (EFBs) are coming into the flight deck, bringing along with them a wide range of human factors considerations. In order to understand and assess the full impact of an EFB, designers and evaluators require an understanding of ho...
Development of the PEBL Traveling Salesman Problem Computerized Testbed
ERIC Educational Resources Information Center
Mueller, Shane T.; Perelman, Brandon S.; Tan, Yin Yin; Thanasuan, Kejkaew
2015-01-01
The traveling salesman problem (TSP) is a combinatorial optimization problem that requires finding the shortest path through a set of points ("cities") that returns to the starting point. Because humans provide heuristic near-optimal solutions to Euclidean versions of the problem, it has sometimes been used to investigate human visual…
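As a hedged illustration of the kind of heuristic, near-optimal tour construction the abstract refers to, the short Python sketch below builds a nearest-neighbour tour for a toy Euclidean instance; the coordinates and function names are invented for illustration and are not part of the PEBL testbed.

    # Minimal nearest-neighbour heuristic for a Euclidean TSP instance.
    # Illustrative sketch only; this is not code from the PEBL testbed.
    import math

    def tour_length(points, order):
        """Total length of the closed tour visiting points in the given order."""
        return sum(math.dist(points[order[i]], points[order[(i + 1) % len(order)]])
                   for i in range(len(order)))

    def nearest_neighbour_tour(points, start=0):
        """Greedy tour: always move to the closest unvisited city, then return home."""
        unvisited = set(range(len(points))) - {start}
        order = [start]
        while unvisited:
            last = points[order[-1]]
            nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
            order.append(nxt)
            unvisited.remove(nxt)
        return order

    # Hypothetical five-city instance
    cities = [(0, 0), (2, 1), (5, 2), (6, 6), (1, 4)]
    order = nearest_neighbour_tour(cities)
    print(order, round(tour_length(cities, order), 2))

On Euclidean instances such a greedy rule typically yields tours only modestly longer than optimal, which is the sense in which human and heuristic solutions are described as near-optimal.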
Extra-Rhetorical Restraints on Writing in Accounting.
ERIC Educational Resources Information Center
Lund, Donna
Each discourse community teaches and uses a particular version of reality. In the field of accounting, the interpretation of reality is an objective one. Such an interpretation is a constraint on writing, as are accounting's reliance on exclusionary language, its need to meet legal and professional requirements, its adherence to stylistic and…
STEVE -- User Guide and Reference Manual
NASA Astrophysics Data System (ADS)
Fish, Adrian
This document describes an extended version of the EVE editor that has been tailored to the general Starlink user's requirements. This extended editor is STarlink Eve or STEve, and this document (along with its introductory companion SUN/125) describes this editor and offers additional help, advice, and tips on general EVE usage.
Design Manual: Removal of Fluoride from Drinking Water Supplies by Activated Alumina
This document is an updated version of the Design Manual: Removal of Fluoride from Drinking Water Supplies by Activated Alumina (Rubel, 1984). The manual is an in-depth presentation of the steps required to design and operate a fluoride removal plant using activated alumina (AA)...
NASA Technical Reports Server (NTRS)
Baldwin, John; Zendejas, Silvino; Gutheinz, Sandy; Borden, Chester; Wang, Yeou-Fang
2009-01-01
Mission and Assets Database (MADB) Version 1.0 is an SQL database system with a Web user interface to centralize information. The database stores flight project support resource requirements, view periods, antenna information, schedule, and forecast results for use in mid-range and long-term planning of Deep Space Network (DSN) assets.
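As a hedged sketch of how such planning data might be laid out, the Python fragment below creates a small in-memory SQLite table; the table and column names are hypothetical and are not taken from the actual MADB schema.

    # Hypothetical sketch of a support-requirements table, loosely inspired by the
    # description above; names and columns are illustrative, not the MADB design.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE support_requirement (
            mission        TEXT,   -- flight project name
            antenna        TEXT,   -- DSN asset identifier (illustrative)
            view_start_utc TEXT,   -- start of the view period
            view_end_utc   TEXT,   -- end of the view period
            hours_needed   REAL    -- forecast tracking hours
        )
    """)
    conn.execute("INSERT INTO support_requirement VALUES (?, ?, ?, ?, ?)",
                 ("ExampleMission", "DSS-EX", "2009-01-01T00:00", "2009-01-01T08:00", 6.5))
    for row in conn.execute("SELECT mission, antenna, hours_needed FROM support_requirement"):
        print(row)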
NASA Technical Reports Server (NTRS)
2005-01-01
The purpose of this document is to present the findings that resulted from a high-level analysis and evaluation of the following documents: (1) The OEP (Operational Evolution Plan) Version 7 -- a 10-year plan for operational improvements to increase capacity and efficiency in U.S. air travel and transport and other use of domestic airspace. The OEP is the FAA commitment to operational improvements. It is outcome driven, with clear lines of accountability within FAA organizations. The OEP concentrates on operational solutions and integrates safety, certification, procedures, staffing, equipment, avionics, and research; (2) The Draft Flight Plan 2006 through 2010 -- a multi-year strategic effort, setting a course for the FAA through 2010, to provide the safest and most efficient air transportation system in the world; (3) The NAS System Architecture Version 5 -- a blueprint for modernizing the NAS and improving NAS services and capabilities through the year 2015; and (4) The NAS-SR-1000 System Requirements Specification (NASSRS) -- a compilation of requirements which describe the operational capabilities for the NAS. The analysis is particularly focused on examining the documents for relevance to existing and/or planned future UAV operations. The evaluation specifically focuses on potential factors that could materially affect the development of a commercial ROA industry, such as (1) design limitations of the CNS/ATM system and (2) human limitations. The information presented was taken from program specifications or from program office lead personnel.
Heberton, C.I.; Russell, T.F.; Konikow, Leonard F.; Hornberger, G.Z.
2000-01-01
This report documents the U.S. Geological Survey Eulerian-Lagrangian Localized Adjoint Method (ELLAM) algorithm that solves an integral form of the solute-transport equation, incorporating an implicit-in-time difference approximation for the dispersive and sink terms. Like the algorithm in the original version of the U.S. Geological Survey MOC3D transport model, ELLAM uses a method of characteristics approach to solve the transport equation on the basis of the velocity field. The ELLAM algorithm, however, is based on an integral formulation of conservation of mass and uses appropriate numerical techniques to obtain global conservation of mass. The implicit procedure eliminates several stability criteria required for an explicit formulation. Consequently, ELLAM allows large transport time increments to be used. ELLAM can produce qualitatively good results using a small number of transport time steps. A description of the ELLAM numerical method, the data-input requirements and output options, and the results of simulator testing and evaluation are presented. The ELLAM algorithm was evaluated for the same set of problems used to test and evaluate Version 1 and Version 2 of MOC3D. These test results indicate that ELLAM offers a viable alternative to the explicit and implicit solvers in MOC3D. Its use is desirable when mass balance is imperative or a fast, qualitative model result is needed. Although accurate solutions can be generated using ELLAM, its efficiency relative to the two previously documented solution algorithms is problem dependent.
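For readers unfamiliar with the underlying model, one standard textbook form of the solute-transport (advection-dispersion) equation that ELLAM and the other MOC3D solvers address is

    \frac{\partial(\theta C)}{\partial t} = \nabla\cdot(\theta \mathbf{D}\nabla C) - \nabla\cdot(\theta \mathbf{v} C) + q'C'

where C is solute concentration, theta the porosity, D the dispersion tensor, v the pore-water velocity, and q'C' a fluid source/sink term. This is a generic textbook form, not a transcription of the report's notation; ELLAM's implicit-in-time treatment of the dispersive and sink terms is what removes the explicit-scheme stability limits mentioned above.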
Guimarães, Marcelo Pinto
2016-01-01
This paper presents a critical analysis of the development and implementation of the Brazilian national standard for accessibility of the built environment, NBR9050. The current 2015 version resembles an encyclopaedia, incorporating a variety of disparate contributions gathered historically from different sources; as a result it works like a puzzle that keeps alive prejudices about users' needs and disabilities. In addition, there are conflicts between newly recommended ideas and requirements carried over from older versions. A definition of Universal Design has been included since 2004, but there is still no indication of how to make its principles work in practice. It is therefore very hard for city officials to assess the quality of environments, and professionals face serious constraints in applying their skills to users' diverse needs. The current NBR9050 clearly requires further editorial work, but a more fundamental decision is also needed: information should be organized so that, in each topic, readers can identify whether Universal Design can actually be achieved, or whether the proposed technical solution leads only to limited spatial adaptation and poor accommodation of users with uncommon needs. By presenting examples in the context of socially inclusive environments, a newly revised version of NBR9050 should explain the pitfalls of poor accessibility design for disabled users who face discrimination. Readers should then be able to establish conceptual links between the best ideas so that Universal Design can be readily understood.
JAMI: a Java library for molecular interactions and data interoperability.
Sivade Dumousseau, M; Koch, M; Shrivastava, A; Alonso-López, D; De Las Rivas, J; Del-Toro, N; Combe, C W; Meldal, B H M; Heimbach, J; Rappsilber, J; Sullivan, J; Yehudi, Y; Orchard, S
2018-04-11
A number of different molecular interactions data download formats now exist, designed to allow access to these valuable data by diverse user groups. These formats include the PSI-XML and MITAB standard interchange formats developed by the Molecular Interaction workgroup of the HUPO-PSI, in addition to other, use-specific downloads produced by other resources. The onus is currently on the user to ensure that a piece of software is capable of reading and writing all necessary versions of each format. This problem may increase as data providers strive to meet ever more sophisticated user demands and data types. A collaboration between EMBL-EBI and the University of Cambridge has produced JAMI, a single library that unifies standard molecular interaction data formats such as PSI-MI XML and PSI-MITAB. The free, open-source JAMI library enables the development of molecular interaction computational tools and pipelines without the need to produce different versions of software to read different versions of the data formats. Software and tools developed on top of the JAMI framework are able to integrate and support both PSI-MI XML and PSI-MITAB. The use of JAMI avoids the requirement to chain conversions between formats in order to reach a desired output format, and prevents code and unit-test duplication as the code becomes more modular. JAMI's model interfaces are abstracted from the underlying format, hiding the complexity and requirements of each data format from developers using JAMI as a library.
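As a hedged, language-agnostic sketch of the general pattern the abstract describes (one in-memory interaction model with format-specific readers and writers behind it), consider the Python fragment below; the class and method names are hypothetical illustrations and are not JAMI's actual Java API.

    # Illustrative pattern only: a common in-memory model decouples analysis code
    # from the serialization format. Names are invented, not the real JAMI API.
    from dataclasses import dataclass

    @dataclass
    class Interaction:
        participant_a: str
        participant_b: str
        detection_method: str

    def parse_mitab_line(line):
        """Toy MITAB-like reader: three tab-separated columns (illustrative only)."""
        a, b, method = line.rstrip("\n").split("\t")[:3]
        return Interaction(a, b, method)

    def write_mitab_line(interaction):
        """Serialize the common model back to the toy tab-separated form."""
        return f"{interaction.participant_a}\t{interaction.participant_b}\t{interaction.detection_method}"

    # Analysis tools work against Interaction objects, so adding another reader
    # (e.g. an XML parser) would not require touching the code that consumes them.
    record = parse_mitab_line("P12345\tQ67890\tpsi-mi:two hybrid")
    print(write_mitab_line(record))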
CWG - MUTUAL COUPLING PROGRAM FOR CIRCULAR WAVEGUIDE-FED APERTURE ARRAY (IBM PC VERSION)
NASA Technical Reports Server (NTRS)
Bailey, M. C.
1994-01-01
Mutual Coupling Program for Circular Waveguide-fed Aperture Array (CWG) was developed to calculate the electromagnetic interaction between elements of an antenna array of circular apertures with specified aperture field distributions. The field distributions were assumed to be a superposition of the modes which could exist in a circular waveguide. Various external media were included to provide flexibility of use, for example, the flexibility to determine the effects of dielectric covers (i.e., thermal protection system tiles) upon the impedance of aperture type antennas. The impedance and radiation characteristics of planar array antennas depend upon the mutual interaction between all the elements of the array. These interactions are influenced by several parameters (e.g., the array grid geometry, the geometry and excitation of each array element, the medium outside the array, and the internal network feeding the array). For the class of array antenna whose radiating elements consist of small holes in a flat conducting plate, the electromagnetic problem can be divided into two parts, the internal and the external. In solving the external problem for an array of circular apertures, CWG will compute the mutual interaction between various combinations of circular modal distributions and apertures. CWG computes the mutual coupling between various modes assumed to exist in circular apertures that are located in a flat conducting plane of infinite dimensions. The apertures can radiate into free space, a homogeneous medium, a multilayered region or a reflecting surface. These apertures are assumed to be excited by one or more modes corresponding to the modal distributions in circular waveguides of the same cross sections as the apertures. The apertures may be of different sizes and also of different polarizations. However, the program assumes that each aperture field contains the same modal distributions, and calculates the complex scattering matrix between all mode and aperture combinations. The scattering matrix can then be used to determine the complex modal field amplitudes for each aperture with a specified array excitation. CWG is written in VAX FORTRAN for DEC VAX series computers running VMS (LAR-15236) and IBM PC series and compatible computers running MS-DOS (LAR-15226). It requires 360K of RAM for execution. To compile the source code for the PC version, the NDP Fortran compiler and linker will be required; however, the distribution medium for the PC version of CWG includes a sample MS-DOS executable which was created using NDP Fortran with the -vms compiler option. The standard distribution medium for the PC version of CWG is a 3.5 inch 1.44Mb MS-DOS format diskette. The standard distribution medium for the VAX version of CWG is a 1600 BPI 9-track magnetic tape in DEC VAX BACKUP format. The VAX version is also available on a TK50 tape cartridge in DEC VAX BACKUP format. Both machine versions of CWG include an electronic version of the documentation in Microsoft Word for Windows format. CWG was developed in 1993 and is a copyrighted work with all copyright vested in NASA.
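To illustrate the last computational step described above (using the scattering matrix to obtain modal amplitudes for a given excitation), the short numpy sketch below applies a complex scattering matrix to an incident-amplitude vector; the matrix values are invented for illustration and have nothing to do with CWG's actual output.

    # Hedged illustration: given a complex scattering matrix S (one row/column per
    # mode-aperture combination) and incident modal amplitudes a, the coupled
    # amplitudes are b = S @ a. Values below are made up for the example.
    import numpy as np

    S = np.array([[0.10 + 0.05j, 0.02 - 0.01j],
                  [0.02 - 0.01j, 0.12 + 0.03j]])   # 2 mode/aperture combinations
    a = np.array([1.0 + 0.0j, 0.0 + 0.0j])          # excite the first mode only
    b = S @ a                                        # resulting modal amplitudes
    print(b)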
CWG - MUTUAL COUPLING PROGRAM FOR CIRCULAR WAVEGUIDE-FED APERTURE ARRAY (VAX VMS VERSION)
NASA Technical Reports Server (NTRS)
Bailey, M. C.
1994-01-01
Mutual Coupling Program for Circular Waveguide-fed Aperture Array (CWG) was developed to calculate the electromagnetic interaction between elements of an antenna array of circular apertures with specified aperture field distributions. The field distributions were assumed to be a superposition of the modes which could exist in a circular waveguide. Various external media were included to provide flexibility of use, for example, the flexibility to determine the effects of dielectric covers (i.e., thermal protection system tiles) upon the impedance of aperture type antennas. The impedance and radiation characteristics of planar array antennas depend upon the mutual interaction between all the elements of the array. These interactions are influenced by several parameters (e.g., the array grid geometry, the geometry and excitation of each array element, the medium outside the array, and the internal network feeding the array). For the class of array antenna whose radiating elements consist of small holes in a flat conducting plate, the electromagnetic problem can be divided into two parts, the internal and the external. In solving the external problem for an array of circular apertures, CWG will compute the mutual interaction between various combinations of circular modal distributions and apertures. CWG computes the mutual coupling between various modes assumed to exist in circular apertures that are located in a flat conducting plane of infinite dimensions. The apertures can radiate into free space, a homogeneous medium, a multilayered region or a reflecting surface. These apertures are assumed to be excited by one or more modes corresponding to the modal distributions in circular waveguides of the same cross sections as the apertures. The apertures may be of different sizes and also of different polarizations. However, the program assumes that each aperture field contains the same modal distributions, and calculates the complex scattering matrix between all mode and aperture combinations. The scattering matrix can then be used to determine the complex modal field amplitudes for each aperture with a specified array excitation. CWG is written in VAX FORTRAN for DEC VAX series computers running VMS (LAR-15236) and IBM PC series and compatible computers running MS-DOS (LAR-15226). It requires 360K of RAM for execution. To compile the source code for the PC version, the NDP Fortran compiler and linker will be required; however, the distribution medium for the PC version of CWG includes a sample MS-DOS executable which was created using NDP Fortran with the -vms compiler option. The standard distribution medium for the PC version of CWG is a 3.5 inch 1.44Mb MS-DOS format diskette. The standard distribution medium for the VAX version of CWG is a 1600 BPI 9-track magnetic tape in DEC VAX BACKUP format. The VAX version is also available on a TK50 tape cartridge in DEC VAX BACKUP format. Both machine versions of CWG include an electronic version of the documentation in Microsoft Word for Windows format. CWG was developed in 1993 and is a copyrighted work with all copyright vested in NASA.
The implementation and use of Ada on distributed systems with high reliability requirements
NASA Technical Reports Server (NTRS)
Knight, J. C.
1987-01-01
A preliminary analysis of the Ada implementation of the Advanced Transport Operating System (ATOPS), an experimental computer control system developed at NASA Langley for a modified Boeing 737 aircraft, is presented. The criteria determined for the evaluation of this approach are described. A preliminary version of the requirements for ATOPS is included. This requirements specification is not a formal document, but rather a description of certain aspects of the ATOPS system at a level of detail that best suits the needs of the research. A survey of backward error recovery techniques is also presented.
The 25 kW power module evolution study. Part 1: Payload requirements and growth scenarios
NASA Technical Reports Server (NTRS)
1978-01-01
Payload power level requirements and their general impact on the baseline and growth versions of the 25 kW power module during the 1983 to 1990 period are discussed. Extended-duration Orbiter sortie flights supported by a power module, with increased payload power requirements per flight, and free-flyer payload missions are included. Other payload disciplines considered, but not emphasized for the 1983 to 1986 period, include astrophysics/astronomy, earth observations, the solar power satellite, and life sciences. Of these, only the solar power satellite is a prime driver for the power module.
NASA Technical Reports Server (NTRS)
Oza, Nikunji C.
2005-01-01
Bagging and boosting are two of the most well-known ensemble learning methods due to their theoretical performance guarantees and strong experimental results. However, these algorithms have been used mainly in batch mode, i.e., they require the entire training set to be available at once and, in some cases, require random access to the data. In this paper, we present online versions of bagging and boosting that require only one pass through the training data. We build on previously presented work by presenting some theoretical results. We also compare the online and batch algorithms experimentally in terms of accuracy and running time.
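A minimal sketch of the online-bagging idea described above is given below: as commonly presented, each incoming example is shown to each base model k ~ Poisson(1) times, which approximates the bootstrap resampling of batch bagging in a single pass over the data. The tiny majority-class base learner and its partial_fit/predict interface are assumptions made only so the sketch is self-contained.

    # Minimal online-bagging sketch: each example is weighted k ~ Poisson(1) times
    # per base model, approximating a bootstrap sample without storing the data.
    import numpy as np
    from collections import Counter

    class MajorityClassLearner:
        """Toy incremental learner used only for illustration (ignores features)."""
        def __init__(self):
            self.counts = Counter()
        def partial_fit(self, x, y):
            self.counts[y] += 1
        def predict(self, x):
            return self.counts.most_common(1)[0][0] if self.counts else None

    def online_bagging_update(models, x, y, rng):
        for model in models:
            for _ in range(rng.poisson(1.0)):   # present the example k ~ Poisson(1) times
                model.partial_fit(x, y)

    def online_bagging_predict(models, x):
        votes = [m.predict(x) for m in models]  # simple majority vote over the ensemble
        return Counter(votes).most_common(1)[0][0]

    rng = np.random.default_rng(0)
    ensemble = [MajorityClassLearner() for _ in range(10)]
    for x, y in [(0.1, "a"), (0.2, "a"), (0.9, "b"), (0.15, "a")]:
        online_bagging_update(ensemble, x, y, rng)
    print(online_bagging_predict(ensemble, 0.5))

The single-pass property follows from the update rule: only the running state of each base model is kept, never the training examples themselves.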
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
ES-doc-errata: an issue tracker platform for CMIP6
NASA Astrophysics Data System (ADS)
Ben Nasser, Atef; Levavasseur, Guillaume; Greenslade, Mark; Denvil, Sébastien
2017-04-01
In the context of overseeing data quality, and as a result of the inherent complexity of projects such as CMIP5/6, it is a mandatory task to keep track of the status of datasets and of the version evolution they undergo during their life-cycle. The ES-doc-errata project aims to keep track of the issues affecting specific versions of datasets and files. It enables users to resolve the history tree of each dataset/file, allowing a better choice of the data used in their work based on data status. The ES-doc-errata project has been designed and built on top of the Parent-IDentifiers handle service that will be deployed in the next iteration of the CMIP project, ensuring maximum usability of the ESGF ecosystem, and is encapsulated in the ES-doc structure. Consuming PIDs from the handle service is guided by a purpose-built algorithm that extracts metadata regarding the issues that may or may not affect the quality of datasets/files and that cause newer versions to be published, replacing older deprecated versions. This algorithm is able to deduce the nature of the flaws down to file granularity, which is of high value to the end user. The new platform has been designed with usability in mind, both for end users specialized in the data publishing process and for other scientists requiring feedback on the reliability of the data needed for their work. To this end, a specific set of rules and a code of conduct have been defined. A validation process ensures the quality of the newly introduced errata metadata, an authentication safeguard was implemented to prevent tampering with the archived data, and a wide variety of tools were put at users' disposal to interact safely with the platform, including a command-line client and a dedicated front end.
Development of Responder Definitions for Fibromyalgia Clinical Trials
Arnold, Lesley M.; Williams, David A.; Hudson, James I.; Martin, Susan A.; Clauw, Daniel J.; Crofford, Leslie J.; Wang, Fujun; Emir, Birol; Lai, Chinglin; Zablocki, Rong; Mease, Philip J.
2011-01-01
Objective: To develop responder definitions for fibromyalgia clinical trials using key symptom and functional domains. Methods: 24 candidate responder definitions were developed by expert consensus and evaluated in 12 randomized, placebo-controlled fibromyalgia trials of 4 medications. For each definition, treatment effects of the medication compared with placebo were analyzed using the Cochran-Mantel-Haenszel test or chi-square test. A meta-analysis of the pooled results for the 4 medications established risk ratios to determine the definitions that best favored medication over placebo. Results: Two definitions performed best in the analyses. Both definitions included ≥ 30% reduction in pain and ≥ 10% improvement in physical function. They differed in that one (FM30 short version) included ≥ 30% improvement in sleep or fatigue, and the other (FM30 long version) required ≥ 30% improvement in 2 of the following symptoms: sleep, fatigue, depression, anxiety, or cognition. In the analysis of both versions, the response rate was ≥ 15% for each medication and significantly greater than placebo. The risk ratio favoring drug over placebo (95% CI) in the pooled analysis for the FM30 short version was 1.50 (1.24, 1.82), P ≤ 0.0001; for the FM30 long version it was 1.60 (1.31, 1.96), P ≤ 0.00001. Conclusion: Among the 24 responder definitions tested, 2 were identified as most sensitive in identifying response to treatment. The identification of responder definitions for fibromyalgia clinical trials that include assessments of key symptom and functional domains may improve the sensitivity of clinical trials to identify meaningful improvements, leading to improved management of fibromyalgia. PMID:21953205
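To make the two definitions concrete, the hedged Python sketch below encodes them as simple boolean checks on per-patient percent-improvement values; the variable names and input format are illustrative and are not taken from the trial analyses.

    # Illustrative encoding of the FM30 short and long responder definitions above.
    # Inputs are percent improvements from baseline (pain = percent reduction in pain).

    def fm30_short(pain, physical_function, sleep, fatigue):
        return (pain >= 30 and physical_function >= 10
                and (sleep >= 30 or fatigue >= 30))

    def fm30_long(pain, physical_function, sleep, fatigue, depression, anxiety, cognition):
        other = [sleep, fatigue, depression, anxiety, cognition]
        return (pain >= 30 and physical_function >= 10
                and sum(v >= 30 for v in other) >= 2)

    # Example: 35% less pain, 12% better function, 40% better sleep, 10% better fatigue
    print(fm30_short(35, 12, 40, 10))            # True
    print(fm30_long(35, 12, 40, 10, 5, 0, 20))   # False (only one symptom reaches 30%)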
Arias, María Luisa Flores; Champion, Jane Dimmitt; Soto, Norma Elva Sáenz
2017-08-01
The objective was the development of a Spanish Version Contraceptive Self-efficacy Scale for use among heterosexual Mexican populations of reproductive age (18-35 years). Use of family planning methods has decreased in Mexico, which may lead to an increase in unintended pregnancies. Contraceptive self-efficacy is considered a predictor and precursor of family planning method use. A cross-sectional, descriptive study design was used to assess contraceptive self-efficacy among a heterosexual Mexican population (N=160) of reproductive age (18-35 years). Adaptation of a Spanish Version Contraceptive Self-efficacy Scale was conducted prior to instrument administration. Exploratory and confirmatory factorial analyses identified seven factors with a variance of 72.812%. The adapted scale had a Cronbach alpha of 0.771. A significant correlation between the Spanish Version Contraceptive Self-efficacy Scale and the use of family planning methods was identified. The Spanish Version Contraceptive Self-efficacy Scale has an acceptable Cronbach alpha. Exploratory factor analysis identified 7 components. A positive correlation between self-reported contraceptive self-efficacy and family planning method use was identified. This scale may be used among heterosexual Mexican men and women of reproductive age. The factor analysis (7 factors versus 4 factors for the original scale) identified a discrepancy in interpretation between the Spanish and English language versions. Findings obtained via the Spanish version among heterosexual Mexican men and women of reproductive age should therefore be interpreted in light of the differences identified in these analyses. Copyright © 2017 Elsevier Inc. All rights reserved.
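For reference, the Cronbach's alpha reported above is conventionally computed as

    \alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma_i^2}{\sigma_T^2}\right)

where k is the number of scale items, sigma_i^2 the variance of item i, and sigma_T^2 the variance of the total score; this is the standard textbook formula, not a reproduction of the authors' computation.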
TRMM Version 7 Near-Realtime Data Products
NASA Technical Reports Server (NTRS)
Tocker, Erich Franz; Kelley, Owen
2012-01-01
The TRMM data system has been providing near-realtime (NRT) data products to the community since late 1999. While the TRMM project never had near-realtime production requirements, the science and applications communities had a great interest in receiving TRMM data as quickly as possible. As a result, these NRT data are provided under a best-effort scenario, but with the objective of having the swath data products available within three hours of data collection 90% of the time. In July of 2011 the Joint Precipitation Measurement Missions Science Team (JPST) authorized the reprocessing of TRMM mission data using the new Version 7 algorithms. The reprocessing of the 14+ years of the mission was concluded within 30 days. Version 7 algorithms had substantial changes in the data product file formats, both for data and metadata. In addition, the algorithms themselves had major modifications and improvements. The general approach to versioning up the NRT is to wait for the regular production algorithms to have run for a while and shake out any issues that might arise from the new version before updating the NRT products. Because of the substantial changes in data/metadata formats as well as the algorithm improvements themselves, the update of NRT to V7 followed an even more conservative path than usual. This was done to ensure that applications agencies and other users of the TRMM NRT would not be faced with short timeframes for conversion to the new format. This paper will describe the process by which the TRMM NRT was updated to V7 and the V7 data products themselves.
IDC Re-Engineering Phase 2 System Requirements Document Version 1.4
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, James M.; Burns, John F.; Satpathi, Meara Allena
This System Requirements Document (SRD) defines waveform data processing requirements for the International Data Centre (IDC) of the Comprehensive Nuclear Test Ban Treaty Organization (CTBTO). The IDC applies, on a routine basis, automatic processing methods and interactive analysis to raw International Monitoring System (IMS) data in order to produce, archive, and distribute standard IDC products on behalf of all States Parties. The routine processing includes characterization of events with the objective of screening out events considered to be consistent with natural phenomena or non-nuclear, man-made phenomena. This document does not address requirements concerning acquisition, processing and analysis of radionuclide data, but includes requirements for the dissemination of radionuclide data and products.
PSD Review Requirements for Modified Petroleum Refineries
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
Interim Implementation of NSR Requirements for PM2.5
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
Applicability of NSPS and PSD Requirements to a Proposed Fuel Conversion
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
Underwater Inspection of Navigation Structures with an Acoustic Camera
2013-08-01
the camera with a slow angular speed while recording the images. 5. After the scanning has been performed, review recorded data to determine the... Hardware requirements: ...Core x86) or newer, 2GB RAM, 120GB disc space. Operating system requirements: Windows XP, Vista, Windows 7, 32/64 bit. Java requirements: Sun... Java JDK, Version 1.6, Update 16 or newer, for installation. Limitations and tips for proper scanning: Best results are achieved when scanning in
Letter Addressing EPA's Requirements for Nonattainment Areas
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
RACT Requirements in Ozone Nonattainment Areas
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
Offsets Required Prior to Permit Issuance
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
Agreement that the PSD Regulations Require a Source to Commence Construction
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
PSD Permit Requirements and Applicability of New and Revised NAAQS
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
Guidance On Enforcement of PSD Requirements Under the Clean Air Act
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
Applicability of PSD Permitting Requirements, Wellcraft Marine Corporation, Sarasota, Florida
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
Applicability of PSD Requirements to Asphalt Plants
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
An Analysis of USMC Facilities Support Contracts with a Focus on Base Maintenance
2017-06-01
for goods and services. In 2016, the Marine Corps was authorized $5.2 billion in Operating Forces (BA-1) funding (Office of the Secretary of the Navy... Association (NCMA, Version 1.0, n.d.) describes pre-award phase activities as "shaping the customer requirements for products or services, and then... will almost never go above the base requirement without an incentive to do so. The DOD's Guidebook for the Acquisition of Services offers sage advice
PSD Requirements for Reactivated Sources
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
PSD Reconstruction Requirements American Cyanamid Company
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
Federal Land Managers Notification and Visibility Assessment Requirements for PSD Permitting
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
Offset Requirements for U.S. Steel's Fairfield Modernization
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
TOPEX Project Radar Altimeter Development Requirements and Specifications, Version 6.0
NASA Technical Reports Server (NTRS)
Rossi, Laurence C.
2003-01-01
This document provides the guidelines by which the TOPEX Radar Altimeter hardware development effort for the TOPEX flight project shall be implemented and conducted. The conduct of this activity shall take maximum advantage of the efforts expended during the TOPEX Radar Altimeter Advanced Technology Model development program and other related Radar Altimeter development efforts. This document complies with the TOPEX Project Office document 633-420 (D-2218), entitled, "TOPEX Project Requirements and Constraints for the NASA Radar Altimeter" dated December 1987.
2011-01-01
Background: Recent studies indicate that blue-yellow colour discrimination is impaired in ADHD individuals. However, the relationship between colour and performance has not been investigated. This paper describes the development and testing of a virtual environment that is capable of quantifying the influence of red-green versus blue-yellow colour stimuli on people's performance in a fun and interactive way, appropriate for the target audience. Methods: An interactive computer game based on virtual reality was developed to evaluate the performance of the players. The game's storyline was based on the story of an old pirate who runs across islands and dangerous seas in search of a lost treasure. Within the game, the player must find and interpret the hints scattered in different scenarios. Two versions of this game were implemented. In the first, hints and information boards were painted using red and green colours. In the second version, these objects were painted using blue and yellow colours. For modelling, texturing, and animating virtual characters and objects, the three-dimensional computer graphics tool Blender 3D was used. The textures were created with the GIMP editor to provide visual effects increasing the realism and immersion of the players. The games were tested on 20 non-ADHD volunteers who were divided into two subgroups (A1 and A2) and 20 volunteers with ADHD who were divided into subgroups B1 and B2. Subgroups A1 and B1 used the first version of the game with the hints painted in green-red colours, and subgroups A2 and B2 the second version with the same hints painted in blue-yellow. The time spent to complete each task of the game was measured. Results: Data analyzed with two-way ANOVA and post hoc Tukey LSD showed that the use of blue/yellow instead of green/red colours decreased the game performance of all participants. However, a greater decrease in performance was observed in ADHD participants, where tasks that require attention were most affected. Conclusions: The game proved to be a user-friendly tool capable of detecting and quantifying the influence of colour on the performance of people executing tasks that require attention, and showed itself to be attractive to people with ADHD. PMID:21854630
Silva, Alessandro P; Frère, Annie F
2011-08-19
Recent studies indicate that blue-yellow colour discrimination is impaired in ADHD individuals. However, the relationship between colour and performance has not been investigated. This paper describes the development and testing of a virtual environment that is capable of quantifying the influence of red-green versus blue-yellow colour stimuli on people's performance in a fun and interactive way, appropriate for the target audience. An interactive computer game based on virtual reality was developed to evaluate the performance of the players. The game's storyline was based on the story of an old pirate who runs across islands and dangerous seas in search of a lost treasure. Within the game, the player must find and interpret the hints scattered in different scenarios. Two versions of this game were implemented. In the first, hints and information boards were painted using red and green colours. In the second version, these objects were painted using blue and yellow colours. For modelling, texturing, and animating virtual characters and objects, the three-dimensional computer graphics tool Blender 3D was used. The textures were created with the GIMP editor to provide visual effects increasing the realism and immersion of the players. The games were tested on 20 non-ADHD volunteers who were divided into two subgroups (A1 and A2) and 20 volunteers with ADHD who were divided into subgroups B1 and B2. Subgroups A1 and B1 used the first version of the game with the hints painted in green-red colours, and subgroups A2 and B2 the second version with the same hints painted in blue-yellow. The time spent to complete each task of the game was measured. Data analyzed with two-way ANOVA and post hoc Tukey LSD showed that the use of blue/yellow instead of green/red colours decreased the game performance of all participants. However, a greater decrease in performance was observed in ADHD participants, where tasks that require attention were most affected. The game proved to be a user-friendly tool capable of detecting and quantifying the influence of colour on the performance of people executing tasks that require attention, and showed itself to be attractive to people with ADHD.