Optical Design Using Small Dedicated Computers
NASA Astrophysics Data System (ADS)
Sinclair, Douglas C.
1980-09-01
Since the time of the 1975 International Lens Design Conference, we have developed a series of optical design programs for Hewlett-Packard desktop computers. The latest programs in the series, OSLO-25G and OSLO-45G, have most of the capabilities of general-purpose optical design programs, including optimization based on exact ray-trace data. The computational techniques used in the programs are similar to ones used in other programs, but the creative environment experienced by a designer working directly with these small dedicated systems is typically much different from that obtained with shared-computer systems. Some of the differences are due to the psychological factors associated with using a system having zero running cost, while others are due to the design of the program, which emphasizes graphical output and ease of use, as opposed to computational speed.
"Hour of Code": Can It Change Students' Attitudes toward Programming?
ERIC Educational Resources Information Center
Du, Jie; Wimmer, Hayden; Rada, Roy
2016-01-01
The Hour of Code is a one-hour introduction to computer science organized by Code.org, a non-profit dedicated to expanding participation in computer science. This study investigated the impact of the Hour of Code on students' attitudes towards computer programming and their knowledge of programming. A sample of undergraduate students from two…
ERIC Educational Resources Information Center
Gercek, Gokhan; Saleem, Naveed
2006-01-01
Providing adequate computing lab support for Management Information Systems (MIS) and Computer Science (CS) programs is a perennial challenge for most academic institutions in the US and abroad. Factors, such as lack of physical space, budgetary constraints, conflicting needs of different courses, and rapid obsolescence of computing technology,…
NASA Astrophysics Data System (ADS)
Balac, Stéphane; Fernandez, Arnaud
2016-02-01
The computer program SPIP is aimed at solving the Generalized Non-Linear Schrödinger equation (GNLSE), which arises in optics, e.g. in the modelling of light-wave propagation in an optical fibre, by the Interaction Picture method, an efficient alternative to the Symmetric Split-Step method. In the SPIP program a dedicated, computationally costless adaptive step-size control based on a 4th-order embedded Runge-Kutta method is implemented in order to speed up the resolution.
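The embedded Runge-Kutta idea behind this kind of step-size control — a second, lower-order solution computed from the same stages yields an error estimate essentially for free — can be sketched in Python. This is an illustrative adaptive integrator using the Bogacki-Shampine 3(2) pair, not SPIP's actual 4th-order scheme; all function names are hypothetical.

```python
import math

def rk_embedded_step(f, t, y, h):
    """One Bogacki-Shampine 3(2) step: returns a 3rd-order solution and an
    error estimate obtained 'for free' from the embedded 2nd-order result."""
    k1 = f(t, y)
    k2 = f(t + 0.5 * h, y + 0.5 * h * k1)
    k3 = f(t + 0.75 * h, y + 0.75 * h * k2)
    y3 = y + h * (2.0 * k1 / 9.0 + k2 / 3.0 + 4.0 * k3 / 9.0)        # 3rd order
    k4 = f(t + h, y3)
    y2 = y + h * (7.0 * k1 / 24.0 + k2 / 4.0 + k3 / 3.0 + k4 / 8.0)  # 2nd order
    return y3, abs(y3 - y2)

def integrate(f, t0, y0, t_end, tol=1e-8, h=0.1):
    """Adaptive integration: shrink h when the embedded error estimate
    exceeds tol, grow it (capped) when the step is over-accurate."""
    t, y = t0, y0
    while t < t_end:
        h = min(h, t_end - t)
        y_new, err = rk_embedded_step(f, t, y, h)
        if err <= tol:
            t, y = t + h, y_new  # accept the step
        # standard controller for a 3rd-order method: exponent 1/3
        h *= min(2.0, max(0.1, 0.9 * (tol / max(err, 1e-16)) ** (1.0 / 3.0)))
    return y

# Example: dy/dt = -y with y(0) = 1; exact solution is exp(-t).
y_end = integrate(lambda t, y: -y, 0.0, 1.0, 1.0)
```

The same accept/reject controller carries over to higher-order pairs; only the coefficients and the controller exponent change.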
Distributed computing feasibility in a non-dedicated homogeneous distributed system
NASA Technical Reports Server (NTRS)
Leutenegger, Scott T.; Sun, Xian-He
1993-01-01
The low cost and availability of clusters of workstations have led researchers to re-explore distributed computing using independent workstations. This approach may provide better cost/performance than tightly coupled multiprocessors. In practice, this approach often utilizes wasted cycles to run parallel jobs. The feasibility of such a non-dedicated parallel processing environment, assuming workstation processes have preemptive priority over parallel tasks, is addressed. An analytical model is developed to predict parallel job response times. Our model provides insight into how significantly workstation owner interference degrades parallel program performance. A new term, task ratio, which relates the parallel task demand to the mean service demand of non-parallel workstation processes, is introduced. We propose that the task ratio is a useful metric for determining how large the demand of a parallel application must be in order to make efficient use of a non-dedicated distributed system.
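The task-ratio metric can be illustrated with a toy calculation. The formulas below are a simplified stand-in, not the paper's analytical model: `toy_response_time` merely assumes the parallel task receives only the idle fraction of the CPU left over by preemptive owner processes.

```python
def task_ratio(parallel_task_demand, mean_owner_demand):
    """Ratio of a parallel task's service demand to the mean demand of the
    workstation owner's (preemptive) local processes."""
    return parallel_task_demand / mean_owner_demand

def toy_response_time(parallel_task_demand, owner_utilization):
    """Toy estimate, not the paper's model: owner processes preempt the
    parallel task, so it effectively runs on the idle fraction
    (1 - utilization) of the CPU."""
    return parallel_task_demand / (1.0 - owner_utilization)

# Hypothetical numbers: a 100 s parallel task on a workstation whose owner
# processes average 5 s of demand and keep the CPU 25% busy.
r = task_ratio(parallel_task_demand=100.0, mean_owner_demand=5.0)
t = toy_response_time(parallel_task_demand=100.0, owner_utilization=0.25)
```

A large task ratio means owner interruptions are small relative to each parallel task, which is the regime where the non-dedicated cluster is used efficiently.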
Real time software for a heat recovery steam generator control system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valdes, R.; Delgadillo, M.A.; Chavez, R.
1995-12-31
This paper describes the development and successful implementation of real-time software for the Heat Recovery Steam Generator (HRSG) control system of a Combined Cycle Power Plant. The real-time software for the HRSG control system physically resides in a Control and Acquisition System (SAC), a programmable controller that is a component of a distributed control system (DCS). The DCS installed at the Gomez Palacio power plant in Mexico accomplishes the functions of logic, analog and supervisory control. The DCS is based on microprocessors, and its architecture consists of workstations operating as a Man-Machine Interface (MMI), linked to SAC controllers by means of a communication system. The HRSG real-time software is composed of an operating system, drivers, dedicated computer programs and application computer programs. The operating system used for the development of this software was the MultiTasking Operating System (MTOS). The application software developed at IIE for the HRSG control system basically consists of a set of digital algorithms for the regulation of the main process variables of the HRSG. By using the multitasking feature of MTOS, the algorithms are executed pseudo-concurrently; in this way, the application programs continuously use the resources of the operating system to perform their functions through a uniform service interface. The application software of the HRSG consists of three tasks, each with dedicated responsibilities. The drivers were developed to handle the hardware resources of the SAC controller, which in turn allows signal acquisition and data communication with the MMI. The dedicated programs were developed for hardware diagnostics, task initialization, access to the data base and fault tolerance. The application software and the dedicated software for the HRSG control system were developed in the C programming language for its compactness, portability and efficiency.
Business aspects of cardiovascular computed tomography: tackling the challenges.
Bateman, Timothy M
2008-01-01
The purpose of this article is to provide a comprehensive understanding of the business issues surrounding provision of dedicated cardiovascular computed tomographic imaging. Some of the challenges include high up-front costs, current low utilization relative to scanner capability, and inadequate payments. Cardiovascular computed tomographic imaging is a valuable clinical modality that should be offered by cardiovascular centers-of-excellence. With careful consideration of the business aspects, moderate-to-large size cardiology programs should be able to implement an economically viable cardiovascular computed tomographic service.
Propulsion/flight control integration technology (PROFIT) software system definition
NASA Technical Reports Server (NTRS)
Carlin, C. M.; Hastings, W. J.
1978-01-01
The Propulsion Flight Control Integration Technology (PROFIT) program is designed to develop a flying testbed dedicated to controls research. The control software for PROFIT is defined. Maximum flexibility, needed for long-term use of the flight facility, is achieved through a modular design. The Host program processes inputs from the telemetry uplink, aircraft central computer, cockpit computer control and plant sensors to form an input data base for use by the control algorithms. The control algorithms, programmed as application modules, process the input data to generate an output data base. The Host program formats the data for output to the telemetry downlink, the cockpit computer control, and the control effectors. Two application modules are defined: the bill-of-materials F-100 engine control and the bill-of-materials F-15 inlet control.
The Next Wave: Humans, Computers, and Redefining Reality
NASA Technical Reports Server (NTRS)
Little, William
2018-01-01
The Augmented/Virtual Reality (AVR) Lab at KSC is dedicated to "exploration into the growing computer fields of Extended Reality and the Natural User Interface (it is) a proving ground for new technologies that can be integrated into future NASA projects and programs." The topics of Human Computer Interface, Human Computer Interaction, Augmented Reality, Virtual Reality, and Mixed Reality are defined; examples of work being done in these fields in the AVR Lab are given. Current and future work in Computer Vision, Speech Recognition, and Artificial Intelligence is also outlined.
Real-time data reduction capabilities at the Langley 7 by 10 foot high speed tunnel
NASA Technical Reports Server (NTRS)
Fox, C. H., Jr.
1980-01-01
The 7 by 10 foot high speed tunnel performs a wide range of tests employing a variety of model installation methods. To support the reduction of static data from this facility, a generalized wind tunnel data reduction program had been developed for use on the Langley central computer complex. The capabilities of a version of this generalized program adapted for real time use on a dedicated on-site computer are discussed. The input specifications, instructions for the console operator, and full descriptions of the algorithms are included.
Efficient Process Migration for Parallel Processing on Non-Dedicated Networks of Workstations
NASA Technical Reports Server (NTRS)
Chanchio, Kasidit; Sun, Xian-He
1996-01-01
This paper presents the design and preliminary implementation of MpPVM, a software system that supports process migration for PVM application programs in a non-dedicated heterogeneous computing environment. The new concepts of migration points, migration point analysis, and necessary data analysis are introduced. In MpPVM, process migrations occur only at previously inserted migration points. Migration point analysis determines appropriate locations to insert migration points, whereas necessary data analysis provides a minimum set of variables to be transferred at each migration point. A new methodology to perform reliable point-to-point data communications in a migration environment is also discussed. Finally, a preliminary implementation of MpPVM and its experimental results are presented, showing the correctness and promising performance of our process migration mechanism in a scalable non-dedicated heterogeneous computing environment. While MpPVM is developed on top of PVM, the process migration methodology introduced in this study is general and can be applied to any distributed software environment.
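The migration-point mechanism can be sketched as follows. This is a hedged illustration, not MpPVM code: Python's pickle stands in for PVM message passing, and `at_migration_point` and `worker` are hypothetical names.

```python
import pickle

MIGRATE_REQUESTED = False  # would be set asynchronously by the runtime

def at_migration_point(live_vars):
    """Inserted at each compile-time-chosen migration point: if migration
    was requested, serialize only the 'necessary data' (the minimal set of
    live variables identified by analysis) so the task can resume elsewhere."""
    if MIGRATE_REQUESTED:
        return pickle.dumps(live_vars)
    return None

def worker(n):
    total = 0
    for i in range(n):
        total += i * i
        # Migration point: only {'i': i, 'total': total} would need to move.
        snapshot = at_migration_point({"i": i, "total": total})
        if snapshot is not None:
            return snapshot  # state shipped to the new host
    return total

result = worker(10)  # no migration requested: runs to completion
```

Restricting migration to these points is what keeps the transferred state small: between points, temporaries and dead variables never need to be serialized.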
NASA/FAA North Texas Research Station Overview
NASA Technical Reports Server (NTRS)
Borchers, Paul F.
2012-01-01
NTX Research Station: NASA research assets embedded in an interesting operational air transport environment. Seven personnel (2 civil servants, 5 contractors). ARTCC, TRACON, Towers, 3 air carrier AOCs (American, Eagle and Southwest), and 2 major airports all within 12 miles. Supports NASA Airspace Systems Program with research products at all levels (fundamental to system level). NTX Laboratory: 5000 sq ft purpose-built, dedicated air traffic management research facility. Established data links to ARTCC, TRACON, Towers, air carriers, airport and NASA facilities. Re-configurable computer labs, dedicated radio tower, state-of-the-art equipment.
The Lister Hill National Center for Biomedical Communications.
Smith, K A
1994-09-01
On August 3, 1968, a Joint Resolution of the Congress established the program and construction of the Lister Hill National Center for Biomedical Communications. The facility, dedicated in 1980, contains the latest in computer and communications technologies. The history, program requirements, construction management, and general planning are discussed, including technical issues regarding cabling, systems functions, the heating, ventilation, and air conditioning (HVAC) system, fire suppression, and research and development laboratories, among others.
Operating Dedicated Data Centers - Is It Cost-Effective?
NASA Astrophysics Data System (ADS)
Ernst, M.; Hogue, R.; Hollowell, C.; Strecker-Kellog, W.; Wong, A.; Zaytsev, A.
2014-06-01
The advent of cloud computing centres such as Amazon's EC2 and Google's Computing Engine has elicited comparisons with dedicated computing clusters. Discussions on appropriate usage of cloud resources (both academic and commercial) and costs have ensued. This presentation discusses a detailed analysis of the costs of operating and maintaining the RACF (RHIC and ATLAS Computing Facility) compute cluster at Brookhaven National Lab and compares them with the cost of cloud computing resources under various usage scenarios. An extrapolation of likely future cost effectiveness of dedicated computing resources is also presented.
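A back-of-the-envelope version of such a cost comparison can be sketched as follows. All figures are hypothetical and the model ignores many real factors (data movement, staffing overlap, spot pricing); it only illustrates why utilization dominates the dedicated-versus-cloud break-even point.

```python
def dedicated_cost_per_core_hour(capex, lifetime_years, annual_opex,
                                 cores, utilization):
    """Amortized cost of a dedicated cluster: hardware spread over its
    lifetime plus yearly operating costs, divided by core-hours actually
    consumed (idle capacity still costs money)."""
    hours = lifetime_years * 365 * 24
    total = capex + annual_opex * lifetime_years
    return total / (cores * hours * utilization)

# All figures below are hypothetical, for illustration only.
dedicated = dedicated_cost_per_core_hour(
    capex=2_000_000.0, lifetime_years=4, annual_opex=400_000.0,
    cores=10_000, utilization=0.9)
cloud = 0.05  # assumed on-demand price per core-hour
cheaper = "dedicated" if dedicated < cloud else "cloud"
```

At high sustained utilization the amortized dedicated price per core-hour in this toy example falls well below the assumed on-demand rate; at low utilization the comparison reverses, which is the usage-scenario dependence the presentation analyzes.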
Microcomputer programming skills
NASA Technical Reports Server (NTRS)
Barth, C. W.
1979-01-01
Some differences in the skills and techniques required for conversion from programmer to microprogrammer are discussed. The primary things with which the programmer should work are hardware architecture, hardware/software trade-offs, and interfacing. The biggest differences, however, will stem more from differences in applications than from differences in machine size. The change to real-time programming is the most important of these differences, particularly on dedicated microprocessors. Another primary change is programming with a more computer-naive user in mind, and dealing with his limitations and expectations.
Analysis on laser plasma emission for characterization of colloids by video-based computer program
NASA Astrophysics Data System (ADS)
Putri, Kirana Yuniati; Lumbantoruan, Hendra Damos; Isnaeni
2016-02-01
Laser-induced breakdown detection (LIBD) is a sensitive technique for the characterization of colloids of small size and low concentration. There are two types of detection, optical and acoustic. Optical LIBD employs a CCD camera to capture the plasma emission and uses the information to quantify the colloids. This technique requires sophisticated technology which is often pricey. In order to build a simple, home-made LIBD system, a dedicated computer program based on MATLAB™ for analyzing laser plasma emission was developed. The analysis is conducted by counting the number of plasma emissions (breakdowns) during a certain period of time. The breakdown probability provides information on colloid size and concentration. A validation experiment showed that the computer program performed well in analyzing the plasma emissions. A graphical user interface (GUI) was also developed to make the program more user-friendly.
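The breakdown-counting analysis can be sketched in a few lines. This is an illustrative Python/NumPy version under the assumption that a breakdown is any frame whose peak pixel exceeds a fixed threshold; the actual program is MATLAB-based and its detection criterion may differ.

```python
import numpy as np

def count_breakdowns(frames, intensity_threshold):
    """Count frames in which a plasma emission (breakdown) occurred, taken
    here as any frame whose brightest pixel exceeds a fixed threshold."""
    return sum(1 for frame in frames if frame.max() > intensity_threshold)

def breakdown_probability(frames, intensity_threshold):
    """Breakdown probability = breakdowns per laser shot; in LIBD this
    rises with colloid size and concentration."""
    return count_breakdowns(frames, intensity_threshold) / len(frames)

# Synthetic data: 100 dark noise frames, 20 of which contain a bright spark.
rng = np.random.default_rng(0)
frames = [rng.normal(10.0, 1.0, size=(32, 32)) for _ in range(100)]
for k in range(20):
    frames[k][16, 16] = 255.0  # simulated breakdown flash
p = breakdown_probability(frames, intensity_threshold=100.0)
```

Calibrating the measured probability against reference colloid suspensions of known size and concentration is then what turns the count into a quantitative characterization.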
Specialized computer architectures for computational aerodynamics
NASA Technical Reports Server (NTRS)
Stevenson, D. K.
1978-01-01
In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relatively high cost of performing these computations on commercially available general-purpose computers, high both in dollar expenditure and in elapsed time. Today's computing technology will support a program designed to create specialized computing facilities dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.
Nearly Interactive Parabolized Navier-Stokes Solver for High Speed Forebody and Inlet Flows
NASA Technical Reports Server (NTRS)
Benson, Thomas J.; Liou, May-Fun; Jones, William H.; Trefny, Charles J.
2009-01-01
A system of computer programs is being developed for the preliminary design of high speed inlets and forebodies. The system comprises four functions: geometry definition, flow grid generation, flow solver, and graphics post-processor. The system runs on a dedicated personal computer using the Windows operating system and is controlled by graphical user interfaces written in MATLAB (The Mathworks, Inc.). The flow solver uses the Parabolized Navier-Stokes equations to compute millions of mesh points in several minutes. Sample two-dimensional and three-dimensional calculations are demonstrated in the paper.
Company's Data Security - Case Study
NASA Astrophysics Data System (ADS)
Stera, Piotr
This paper describes the computer network and data security problems of an existing company. Two main issues are pointed out: data loss protection and uncontrolled data copying. A security system was designed and implemented; it consists of many dedicated programs. The system protects against data loss and detects unauthorized file copying from the company's server by a dishonest employee.
ERIC Educational Resources Information Center
Lenne, Dominique; Abel, Marie-Helene; Trigano, Philippe; Leblanc, Adeline
2008-01-01
In Technology Enhanced Learning Environments, self-regulated learning (SRL) partly relies on the features of the technological tools. The authors present two environments they designed in order to facilitate SRL: the first one (e-Dalgo) is a website dedicated to the learning of algorithms and computer programming. It is structured as a classical…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Langer, S; Rotman, D; Schwegler, E
The Institutional Computing Executive Group (ICEG) review of FY05-06 Multiprogrammatic and Institutional Computing (M and IC) activities is presented in the attached report. In summary, we find that the M and IC staff does an outstanding job of acquiring and supporting a wide range of institutional computing resources to meet the programmatic and scientific goals of LLNL. The responsiveness and high quality of support given to users and the programs investing in M and IC reflect the dedication and skill of the M and IC staff. M and IC has successfully managed serial capacity, parallel capacity, and capability computing resources. Serial capacity computing supports a wide range of scientific projects which require access to a few high performance processors within a shared memory computer. Parallel capacity computing supports scientific projects that require a moderate number of processors (up to roughly 1000) on a parallel computer. Capability computing supports parallel jobs that push the limits of simulation science. M and IC has worked closely with Stockpile Stewardship, and together they have made LLNL a premier institution for computational and simulation science. Such a standing is vital to the continued success of laboratory science programs and to the recruitment and retention of top scientists. This report provides recommendations to build on M and IC's accomplishments and improve simulation capabilities at LLNL.
We recommend that the institution fully fund (1) operation of the atlas cluster purchased in FY06 to support a few large projects; (2) operation of the thunder and zeus clusters to enable 'mid-range' parallel capacity simulations during normal operation and a limited number of large simulations during dedicated application time; (3) operation of the new yana cluster to support a wide range of serial capacity simulations; (4) improvements to the reliability and performance of the Lustre parallel file system; (5) support for the new GDO petabyte-class storage facility on the green network for use in data-intensive external collaborations; and (6) continued support for visualization and other methods for analyzing large simulations. We also recommend that M and IC begin planning in FY07 for the next upgrade of its parallel clusters. LLNL investments in M and IC have resulted in a world-class simulation capability leading to innovative science. We thank the LLNL management for its continued support and thank the M and IC staff for its vision and dedicated efforts to make it all happen.
Program For A Pushbutton Display
NASA Technical Reports Server (NTRS)
Busquets, Anthony M.; Luck, William S., Jr.
1989-01-01
The Programmable Display Pushbutton (PDP) is a pushbutton device, available from Micro Switch, having a programmable 16x35 matrix of light-emitting diodes on the pushbutton surface. Any desired legend can be displayed on a PDP, producing user-friendly applications and reducing the need for dedicated manual controls. It interacts with the operator, calling for the correct response before transmitting the next message, and serves as both a simple manual control and a sophisticated programmable link between operator and host system. The Programmable Display Pushbutton Legend Editor (PDPE) computer program is used to create the light-emitting-diode (LED) displays for the pushbuttons. Written in FORTRAN.
Methods and principles for determining task dependent interface content
NASA Technical Reports Server (NTRS)
Shalin, Valerie L.; Geddes, Norman D.; Mikesell, Brian G.
1992-01-01
Computer generated information displays provide a promising technology for offsetting the increasing complexity of the National Airspace System. To realize this promise, however, we must extend and adapt the domain-dependent knowledge that informally guides the design of traditional dedicated displays. In our view, the successful exploitation of computer generated displays revolves around the idea of information management, that is, the identification, organization, and presentation of relevant and timely information in a complex task environment. The program of research that is described leads to methods and principles for information management in the domain of commercial aviation. The multi-year objective of the proposed program of research is to develop methods and principles for determining task dependent interface content.
Programming for energy monitoring/display system in multicolor lidar system research
NASA Technical Reports Server (NTRS)
Alvarado, R. C., Jr.; Allen, R. J.
1982-01-01
The Z80 microprocessor based computer program that directs and controls the operation of the six channel energy monitoring/display system that is a part of the NASA Multipurpose Airborne Differential Absorption Lidar (DIAL) system is described. The program is written in the Z80 assembly language and is located on EPROM memories. All source and assembled listings of the main program, five subroutines, and two service routines along with flow charts and memory maps are included. A combinational block diagram shows the interfacing (including port addresses) between the six power sensors, displays, front panel controls, the main general purpose minicomputer, and this dedicated microcomputer system.
A Dedicated NEO Follow-up Program for the Southern Hemisphere
NASA Astrophysics Data System (ADS)
van Altena, W. F.; Bailyn, C. D.; Girard, T. M.; Rabinowitz, D.; Branham, R. L.; Hicks, M.; Lopez, C. E.
2001-11-01
We describe an ongoing program dedicated to the observation of NEOs found by the northern discovery programs whose tracks carry them into the Southern Hemisphere. We observe the NEOs to determine their positions, compute improved orbits and submit them to the Minor Planet Center over the Internet. Alerts of needed observations are monitored on relevant Web pages and in e-mail messages from our collaborators at the northern discovery programs. The observations are made at the Cesco Observatory at El Leoncito, Argentina with the 0.5-meter double astrograph and/or at CTIO with the 1.0-meter YALO telescope, depending on the magnitude of the NEO and the photometric requirements for the specific NEO. The double astrograph at El Leoncito obtains simultaneous CCD B and V photometry and astrometry for those NEOs brighter than magnitude 20, while the YALO observes those brighter than 21.5. YALO also provides simultaneous V and IR photometry and astrometry. All YALO observations are ftp'd to San Juan for astrometric reduction; a revised orbit is then computed in Mendoza from the new and existing observations, and a decision is made on whether to retarget our observations. If so, the El Leoncito and/or YALO observers are notified and provided with an improved ephemeris. The final positions and photometry are then forwarded to the MPC, MPEC and our collaborators. To date, we have reported the positions of over 2000 asteroids, 61 comets and 142 NEOs.
Heterogeneous Hardware Parallelism Review of the IN2P3 2016 Computing School
NASA Astrophysics Data System (ADS)
Lafage, Vincent
2017-11-01
Parallel and hybrid Monte Carlo computation. The Monte Carlo method is the main workhorse for the computation of particle physics observables. This paper provides an overview of various HPC technologies that can be used today: multicore (OpenMP, HPX) and manycore (OpenCL). The rewrite of a twenty-year-old Fortran 77 Monte Carlo program illustrates the various programming paradigms in use beyond the language implementation. The problem of parallel random number generation is also addressed. We give a short report on the one-week school dedicated to these recent approaches, which took place at École Polytechnique in May 2016.
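The parallel random number generation problem — naive per-worker reseeding (e.g. seed + rank) can produce correlated or overlapping streams — has a standard remedy in NumPy's SeedSequence spawning, sketched here on a toy Monte Carlo estimate of pi. This is a generic illustration, not code from the school.

```python
import numpy as np

def make_worker_rngs(root_seed, n_workers):
    """Derive statistically independent generators for each parallel worker
    from a single root seed, instead of naively reseeding with seed+rank."""
    children = np.random.SeedSequence(root_seed).spawn(n_workers)
    return [np.random.default_rng(s) for s in children]

def estimate_pi(rng, n=100_000):
    """Each worker estimates pi from its own stream: fraction of uniform
    points in the unit square that fall inside the quarter circle."""
    pts = rng.random((n, 2))
    return 4.0 * np.mean(pts[:, 0] ** 2 + pts[:, 1] ** 2 <= 1.0)

rngs = make_worker_rngs(root_seed=2016, n_workers=4)
estimates = [estimate_pi(rng) for rng in rngs]
pi_hat = sum(estimates) / len(estimates)
```

Because each child stream is derived independently, the per-worker estimates can be averaged as if they came from a single long sequential run.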
Use of commercial grade item dedication to reduce procurement costs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosch, F.
1995-09-01
In the mid-1980s, the Nuclear Regulatory Commission (NRC) began inspecting utility practices for procuring and dedicating commercial grade items intended for plant safety-related applications. As a result of industry efforts to address NRC concerns, nuclear utilities have enhanced existing programs and procedures for the dedication of commercial grade items. Though these programs were originally enhanced to meet NRC concerns, utilities have discovered that the dedication of commercial grade items can also reduce overall procurement costs. This paper discusses the enhancement of utility dedication programs and demonstrates how utilities have used them to reduce procurement costs.
High Performance Computer Cluster for Theoretical Studies of Roaming in Chemical Reactions
2016-08-30
Final report: High-Performance Computer Cluster for Theoretical Studies of Roaming in Chemical Reactions. A dedicated high-performance computer cluster was acquired for theoretical studies of roaming in chemical reactions. Sponsoring/monitoring agency: U.S. Army Research Office, Research Triangle Park, NC. Papers were published in peer-reviewed journals.
NASA Astrophysics Data System (ADS)
Cai, Han-Jie; Zhang, Zhi-Lei; Fu, Fen; Li, Jian-Yang; Zhang, Xun-Chao; Zhang, Ya-Ling; Yan, Xue-Song; Lin, Ping; Xv, Jian-Ya; Yang, Lei
2018-02-01
The dense granular flow spallation target is a new target concept chosen for the Accelerator-Driven Subcritical (ADS) project in China. For the R&D of this kind of target concept, a dedicated Monte Carlo (MC) program named GMT was developed to perform the simulation study of the beam-target interaction. Owing to the complexities of the target geometry, the computational cost of the MC simulation of particle tracks is highly expensive. Thus, improvement of computational efficiency will be essential for the detailed MC simulation studies of the dense granular target. Here we present the special design of the GMT program and its high efficiency performance. In addition, the speedup potential of the GPU-accelerated spallation models is discussed.
Instrumentino: An Open-Source Software for Scientific Instruments.
Koenka, Israel Joel; Sáiz, Jorge; Hauser, Peter C
2015-01-01
Scientists often need to build dedicated computer-controlled experimental systems. For this purpose, it is becoming common to employ open-source microcontroller platforms, such as the Arduino. These boards and associated integrated software development environments provide affordable yet powerful solutions for the implementation of hardware control of transducers and acquisition of signals from detectors and sensors. It is, however, a challenge to write programs that allow interactive use of such arrangements from a personal computer. This task is particularly complex if some of the included hardware components are connected directly to the computer and not via the microcontroller. A graphical user interface framework, Instrumentino, was therefore developed to allow the creation of control programs for complex systems with minimal programming effort. By writing a single code file, a powerful custom user interface is generated, which enables the automatic running of elaborate operation sequences and observation of acquired experimental data in real time. The framework, which is written in Python, allows extension by users, and is made available as an open source project.
2011-08-13
CAPE CANAVERAL, Fla. -- Thousands of space shuttle workers and their families watch a Starfire Night Skyshow at the “We Made History! Shuttle Program Celebration,” Aug. 13, at the Kennedy Space Center Visitor Complex, Fla. The event was held to honor shuttle workers’ dedication to NASA’s Space Shuttle Program and to celebrate 30 years of space shuttle achievements. The show featured spectacular night aerobatics with special computer-controlled lighting and firework effects on a plane flown by experienced pilot Bill Leff. The event also featured food, music, entertainment, astronaut appearances, educational activities and giveaways. Photo credit: Jim Grossmann
2011-08-13
CAPE CANAVERAL, Fla. -- A Starfire Night Skyshow takes place above the Kennedy Space Center Visitor Complex in Florida during the “We Made History! Shuttle Program Celebration” on Aug. 13. The event was held to honor shuttle workers’ dedication to NASA’s Space Shuttle Program and to celebrate 30 years of space shuttle achievements. The show featured spectacular night aerobatics with special computer-controlled lighting and firework effects on a plane flown by experienced pilot Bill Leff. The event also featured food, music, entertainment, astronaut appearances, educational activities and giveaways. Photo credit: Jim Grossmann
47 CFR 69.125 - Dedicated signalling transport.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 47 Telecommunication 3 2010-10-01 2010-10-01 false Dedicated signalling transport. 69.125 Section... (CONTINUED) ACCESS CHARGES Computation of Charges § 69.125 Dedicated signalling transport. (a) Dedicated signalling transport shall consist of two elements, a signalling link charge and a signalling transfer point...
47 CFR 69.125 - Dedicated signalling transport.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 47 Telecommunication 3 2011-10-01 2011-10-01 false Dedicated signalling transport. 69.125 Section... (CONTINUED) ACCESS CHARGES Computation of Charges § 69.125 Dedicated signalling transport. (a) Dedicated signalling transport shall consist of two elements, a signalling link charge and a signalling transfer point...
47 CFR 69.125 - Dedicated signalling transport.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 47 Telecommunication 3 2012-10-01 2012-10-01 false Dedicated signalling transport. 69.125 Section... (CONTINUED) ACCESS CHARGES Computation of Charges § 69.125 Dedicated signalling transport. (a) Dedicated signalling transport shall consist of two elements, a signalling link charge and a signalling transfer point...
47 CFR 69.125 - Dedicated signalling transport.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 47 Telecommunication 3 2013-10-01 2013-10-01 false Dedicated signalling transport. 69.125 Section... (CONTINUED) ACCESS CHARGES Computation of Charges § 69.125 Dedicated signalling transport. (a) Dedicated signalling transport shall consist of two elements, a signalling link charge and a signalling transfer point...
Models@Home: distributed computing in bioinformatics using a screensaver based approach.
Krieger, Elmar; Vriend, Gert
2002-02-01
Due to the steadily growing computational demands in bioinformatics and related scientific disciplines, one is forced to make optimal use of the available resources. A straightforward solution is to build a network of idle computers and let each of them work on a small piece of a scientific challenge, as done by Seti@Home (http://setiathome.berkeley.edu), the world's largest distributed computing project. We developed a generally applicable distributed computing solution that uses a screensaver system similar to Seti@Home. The software exploits the coarse-grained nature of typical bioinformatics projects. Three major considerations for the design were: (1) often, many different programs are needed, while the time is lacking to parallelize them. Models@Home can run any program in parallel without modifications to the source code; (2) in contrast to the Seti project, bioinformatics applications are normally more sensitive to lost jobs. Models@Home therefore includes stringent control over job scheduling; (3) to allow use in heterogeneous environments, Linux and Windows based workstations can be combined with dedicated PCs to build a homogeneous cluster. We present three practical applications of Models@Home, running the modeling programs WHAT IF and YASARA on 30 PCs: force field parameterization, molecular dynamics docking, and database maintenance.
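The stringent job-scheduling control mentioned in point (2) can be sketched as a master that re-queues any job whose result fails to arrive within a deadline. This is a minimal illustrative sketch, not Models@Home code; all class and method names are hypothetical.

```python
class JobScheduler:
    """Minimal master that tracks outstanding jobs and re-queues any job
    whose worker fails to report back within a timeout (a 'lost' job)."""

    def __init__(self, jobs, timeout):
        self.pending = list(jobs)  # jobs not yet handed out
        self.running = {}          # job -> dispatch time
        self.done = set()
        self.timeout = timeout

    def dispatch(self, now):
        # First reclaim jobs from workers that went silent.
        for job, started in list(self.running.items()):
            if now - started > self.timeout:
                del self.running[job]
                self.pending.append(job)  # re-queue the lost job
        if self.pending:
            job = self.pending.pop(0)
            self.running[job] = now
            return job
        return None

    def complete(self, job):
        self.running.pop(job, None)
        self.done.add(job)

sched = JobScheduler(jobs=["a", "b"], timeout=5.0)
j1 = sched.dispatch(now=0.0)   # "a" handed to a worker that then vanishes
j2 = sched.dispatch(now=1.0)   # "b" handed out and completed normally
sched.complete(j2)
j3 = sched.dispatch(now=10.0)  # "a" reclaimed after its timeout, re-issued
```

For screensaver-style volunteers that may disappear at any moment (the user touches the mouse, the PC shuts down), this kind of timeout-and-requeue loop is what makes job loss survivable.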
Automatic translation of digraph to fault-tree models
NASA Technical Reports Server (NTRS)
Iverson, David L.
1992-01-01
The author presents a technique for converting digraph models, including those models containing cycles, to a fault-tree format. A computer program which automatically performs this translation using an object-oriented representation of the models has been developed. The fault-trees resulting from translations can be used for fault-tree analysis and diagnosis. Programs to calculate fault-tree and digraph cut sets and perform diagnosis with fault-tree models have also been developed. The digraph to fault-tree translation system has been successfully tested on several digraphs of varying size and complexity. Details of some representative translation problems are presented. Most of the computation performed by the program is dedicated to finding minimal cut sets for digraph nodes in order to break cycles in the digraph. Fault-trees produced by the translator have been successfully used with NASA's Fault-Tree Diagnosis System (FTDS) to produce automated diagnostic systems.
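The report notes that most of the translator's computation goes into finding minimal cut sets. As a flavor of that computation (not the NASA program's actual algorithm), here is a minimal Python sketch that expands a small fault tree into its cut sets top-down and discards non-minimal ones; the tree encoding is invented for illustration.

```python
from itertools import product

# A fault-tree node is either a basic event (a string) or a gate:
# ("AND", [children]) or ("OR", [children]).
def cut_sets(node):
    """Top-down expansion of a fault tree into cut sets (MOCUS-style)."""
    if isinstance(node, str):              # basic event
        return [frozenset([node])]
    gate, children = node
    child_sets = [cut_sets(c) for c in children]
    if gate == "OR":                       # union of the children's cut sets
        return [cs for sets in child_sets for cs in sets]
    # AND: every combination of one cut set per child, merged
    return [frozenset().union(*combo) for combo in product(*child_sets)]

def minimal_cut_sets(node):
    """Discard any cut set that is a proper superset of another."""
    sets = set(cut_sets(node))
    return {s for s in sets if not any(t < s for t in sets)}

tree = ("OR", ["A", ("AND", ["B", ("OR", ["A", "C"])])])
print(minimal_cut_sets(tree))   # {A} absorbs {A, B}; {B, C} survives
```

The absorption step is what keeps the result minimal, which matters when cut sets are used, as in the report, to break cycles in a digraph.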
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fernandes, Ana; Pereira, Rita C.; Sousa, Jorge
Fernandes, Ana; Pereira, Rita C.; Sousa, Jorge
The Instituto de Plasmas e Fusao Nuclear (IPFN) has developed dedicated re-configurable modules based on field programmable gate array (FPGA) devices for several nuclear fusion machines worldwide. Moreover, new Advanced Telecommunication Computing Architecture (ATCA) based modules developed by IPFN are already included in the ITER catalogue. One of the requirements for re-configurable modules operating in future nuclear environments, including ITER, is remote update capability. Accordingly, this work presents an alternative method for remote FPGA programming to be implemented in new ATCA-based re-configurable modules. FPGAs are volatile devices, and their programming code is usually stored in dedicated flash memories for proper configuration during module power-on. The presented method can store new FPGA codes in Serial Peripheral Interface (SPI) flash memories using the PCI Express (PCIe) network established on the ATCA backplane, linking data acquisition endpoints and the data switch blades. The method is based on the Xilinx Quick Boot application note, adapted to the PCIe protocol and ATCA-based modules. (authors)
Torres, Daniel; Gugala, Zbigniew; Lindsey, Ronald W
2015-04-01
Programs seek to expose trainees to research during residency. However, little is known in any formal sense regarding how to do this effectively, or whether these efforts result in more or better-quality research output. The objective of our study was to evaluate a dedicated resident research program in terms of the quantity and quality of resident research peer-reviewed publications. Specifically we asked: (1) Did residents mentored through a dedicated resident research program have more peer-reviewed publications in higher-impact journals with higher citation rates compared with residents who pursued research projects under a less structured approach? (2) Did this effect continue after graduation? In 2006, our department of orthopaedic surgery established a dedicated resident research program, which consisted of a new research policy and a research committee to monitor quality and compliance with this policy. Peer-reviewed publications (determined from PubMed) of residents who graduated 6 years before establishing the dedicated resident research program were compared with publications from an equal period of the research-program-directed residents. The data were assessed using descriptive statistics and regression analysis. Twenty-four residents graduated from 2001 to 2006 (before implementation of the dedicated resident research program); 27 graduated from 2007 to 2012 (after implementation of the dedicated resident research program). There were 74 eligible publications as defined by the study inclusion and exclusion criteria. Residents who trained after implementation of the dedicated resident research program published more papers during residency than did residents who trained before the program was implemented (1.15 versus 0.79 publications per resident; 95% CI [0.05,0.93]; p = 0.047) and the journal impact factor was greater in the group that had the research program (1.25 versus 0.55 per resident; 95% CI [0.2,1.18]; p = 0.005). 
There were no differences between postresidency publications by trainees who graduated with versus without the research program in the number of publications, citations, and average journal impact factor per resident. A regression analysis showed no difference in citation rates of the residents' published papers before and since implementation of the research program. Currently in the United States, there are no standard policies or requirements that dictate how research should be incorporated in orthopaedic surgery residency training programs. The results of our study suggest that implementation of a dedicated resident research program improves the quantity and to some extent quality of orthopaedic resident research publications, but this effect did not persist after graduation.
Declarative Programming with Temporal Constraints, in the Language CG.
Negreanu, Lorina
2015-01-01
Specifying and interpreting temporal constraints are key elements of knowledge representation and reasoning, with applications in temporal databases, agent programming, and ambient intelligence. We present and formally characterize the language CG, which tackles this issue. In CG, users are able to develop time-dependent programs, in a flexible and straightforward manner. Such programs can, in turn, be coupled with evolving environments, thus empowering users to control the environment's evolution. CG relies on a structure for storing temporal information, together with a dedicated query mechanism. Hence, we explore the computational complexity of our query satisfaction problem. We discuss previous implementation attempts of CG and introduce a novel prototype which relies on logic programming. Finally, we address the issue of consistency and correctness of CG program execution, using the Event-B modeling approach.
Heegaard, P M; Holm, A; Hagerup, M
1993-01-01
A personal computer program for the conversion of linear amino acid sequences to multiple, small, overlapping peptide sequences has been developed. Peptide lengths and "jumps" (the distance between two consecutive overlapping peptides) are defined by the user. To facilitate the use of the program for parallel solid-phase chemical peptide syntheses for the synchronous production of multiple peptides, amino acids at each acylation step are laid out by the program in a convenient standard multi-well setup. Also, the total number of equivalents, as well as the derived amount in milligrams (depending on user-defined equivalent weights and molar surplus), of each amino acid is given. The program facilitates the implementation of multipeptide synthesis, e.g., for the elucidation of polypeptide structure-function relationships, and greatly reduces the risk of introducing mistakes at the planning step. It is written in Pascal and runs on any DOS-based personal computer. No special graphic display is needed.
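The core windowing computation the program performs (the paper's implementation is in Pascal) can be sketched in a few lines of Python; the function and parameter names here are illustrative:

```python
def overlapping_peptides(sequence, length, jump):
    """Slice a linear amino acid sequence into overlapping peptides.

    length: number of residues per peptide
    jump: distance between start positions of consecutive peptides
    """
    peptides = []
    for start in range(0, len(sequence) - length + 1, jump):
        peptides.append(sequence[start:start + length])
    return peptides

# Example: 8-mers with a jump of 2 (so consecutive peptides overlap by 6)
print(overlapping_peptides("MKTAYIAKQR", length=8, jump=2))
# ['MKTAYIAK', 'TAYIAKQR']
```

The per-well amino-acid layout the paper describes would then follow by reading these peptides column-wise, one acylation step at a time.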
High resolution image processing on low-cost microcomputers
NASA Technical Reports Server (NTRS)
Miller, R. L.
1993-01-01
Recent advances in microcomputer technology have resulted in systems that rival the speed, storage, and display capabilities of traditionally larger machines. Low-cost microcomputers can provide a powerful environment for image processing. A new software program which offers sophisticated image display and analysis on IBM-based systems is presented. Designed specifically for a microcomputer, this program provides a wide-range of functions normally found only on dedicated graphics systems, and therefore can provide most students, universities and research groups with an affordable computer platform for processing digital images. The processing of AVHRR images within this environment is presented as an example.
The role of dedicated data computing centers in the age of cloud computing
NASA Astrophysics Data System (ADS)
Caramarcu, Costin; Hollowell, Christopher; Strecker-Kellogg, William; Wong, Antonio; Zaytsev, Alexandr
2017-10-01
Brookhaven National Laboratory (BNL) anticipates significant growth in scientific programs with large computing and data storage needs in the near future and has recently reorganized support for scientific computing to meet these needs. A key component is the enhanced role of the RHIC-ATLAS Computing Facility (RACF) in support of high-throughput and high-performance computing (HTC and HPC) at BNL. This presentation discusses the evolving role of the RACF at BNL, in light of its growing portfolio of responsibilities and its increasing integration with cloud (academic and for-profit) computing activities. We also discuss BNL’s plan to build a new computing center to support the new responsibilities of the RACF and present a summary of the cost benefit analysis done, including the types of computing activities that benefit most from a local data center vs. cloud computing. This analysis is partly based on an updated cost comparison of Amazon EC2 computing services and the RACF, which was originally conducted in 2012.
Parallel-aware, dedicated job co-scheduling within/across symmetric multiprocessing nodes
Jones, Terry R.; Watson, Pythagoras C.; Tuel, William; Brenner, Larry; Caffrey, Patrick; Fier, Jeffrey
2010-10-05
In a parallel computing environment comprising a network of SMP nodes, each having at least one processor, a parallel-aware co-scheduling method and system for improving the performance and scalability of a dedicated parallel job having synchronizing collective operations. The method and system use a global co-scheduler and an operating system kernel dispatcher adapted to coordinate interfering system and daemon activities on a node and across nodes to promote intra-node and inter-node overlap of said interfering system and daemon activities, as well as intra-node and inter-node overlap of said synchronizing collective operations. In this manner, the impact of random short-lived interruptions, such as timer-decrement processing and periodic daemon activity, on synchronizing collective operations is minimized for large processor-count SPMD bulk-synchronous programs.
NASA Astrophysics Data System (ADS)
DiSalvo, Elizabeth Betsy
The implementation of a learning environment for young African American males, called the Glitch Game Testers, was launched in 2009. The development of this program was based on formative work that looked at the contrasting use of digital games between young African American males and individuals who chose to become computer science majors. Through analysis of cultural values and digital game play practices, the program was designed to intertwine authentic game development practices and computer science learning. The resulting program employed 25 African American male high school students to test pre-release digital games full-time in the summer and part-time in the school year, with an hour of each day dedicated to learning introductory computer science. Outcomes for persisting in computer science education are remarkable; of the 16 participants who had graduated from high school as of 2012, 12 have gone on to school in computing-related majors. These outcomes, and the participants' enthusiasm for engaging in computing, are in sharp contrast to the crisis in African American male education and learning motivation. The research presented in this dissertation discusses the formative research that shaped the design of Glitch, the evaluation of the implementation of Glitch, and a theoretical investigation of the way in which participants navigated conflicting motivations in learning environments.
NASA Technical Reports Server (NTRS)
Crane, J. M.; Boucek, G. P., Jr.; Smith, W. D.
1986-01-01
A flight management computer (FMC) control display unit (CDU) test was conducted to compare two types of input devices: a fixed-legend (dedicated) keyboard and a programmable-legend (multifunction) keyboard. The task used for comparison was operation of the flight management computer for the Boeing 737-300. The same tasks were performed by twelve pilots on the FMC control display unit configured with a programmable-legend keyboard and with the currently used B737-300 dedicated keyboard. Flight simulator work activity levels and input task complexity were varied during each pilot session. Half of the pilots tested were previously familiar with the B737-300 dedicated keyboard CDU and half had no prior experience with it. The data collected included simulator flight parameters, keystroke times and sequences, and pilot questionnaire responses. A timeline analysis was also used for evaluation of the two keyboard concepts.
A study of application of remote sensing to river forecasting. Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
1975-01-01
A project is described whose goal was to define, implement and evaluate a pilot demonstration test to show the practicability of applying remotely sensed data to operational river forecasting in gaged or previously ungaged watersheds. A secondary objective was to provide NASA with documentation describing the computer programs that comprise the streamflow forecasting simulation model used. A computer-based simulation model was adapted to a streamflow forecasting application and implemented in an IBM System/360 Model 44 computer, operating in a dedicated mode, with operator interactive control through a Model 2250 keyboard/graphic CRT terminal. The test site whose hydrologic behavior was simulated is a small basin (365 square kilometers) designated Town Creek near Geraldine, Alabama.
Automated quantitative muscle biopsy analysis system
NASA Technical Reports Server (NTRS)
Castleman, Kenneth R. (Inventor)
1980-01-01
An automated system to aid the diagnosis of neuromuscular diseases by producing fiber size histograms utilizing histochemically stained muscle biopsy tissue. Televised images of the microscopic fibers are processed electronically by a multi-microprocessor computer, which isolates, measures, and classifies the fibers and displays the fiber size distribution. The architecture of the multi-microprocessor computer, which is iterated to any required degree of complexity, features a series of individual microprocessors P_n, each receiving data from a shared memory M_(n-1) and outputting processed data to a separate shared memory M_(n+1) under control of a program stored in dedicated memory M_n.
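The iterated pipeline described in this abstract, with each processor P_n reading from shared memory M_(n-1) and writing to M_(n+1), can be sketched with threads and queues standing in for the processors and shared memories. The stage functions below are placeholder arithmetic, not the actual isolate/measure/classify programs:

```python
import threading
import queue

def stage(program, m_in, m_out):
    """One processor P_n: read from shared memory M_(n-1), run its
    dedicated program, write results to M_(n+1)."""
    while True:
        item = m_in.get()
        if item is None:          # end-of-stream marker
            m_out.put(None)
            return
        m_out.put(program(item))

# Iterate the stage to the required depth: isolate -> measure -> classify
memories = [queue.Queue() for _ in range(4)]   # M_0 .. M_3
programs = [lambda x: x * 2,    # stand-in for "isolate fibers"
            lambda x: x + 1,    # stand-in for "measure"
            lambda x: x ** 2]   # stand-in for "classify"
threads = [threading.Thread(target=stage, args=(p, memories[i], memories[i + 1]))
           for i, p in enumerate(programs)]
for t in threads:
    t.start()
for v in [1, 2, None]:          # feed two work items, then terminate
    memories[0].put(v)
results = []
while (r := memories[-1].get()) is not None:
    results.append(r)
for t in threads:
    t.join()
print(results)   # [9, 25]
```

Because each stage only touches its two adjacent "memories", the chain can be lengthened to any required degree of complexity without changing the stage code, which mirrors the iterated architecture the patent describes.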
Computer vision camera with embedded FPGA processing
NASA Astrophysics Data System (ADS)
Lecerf, Antoine; Ouellet, Denis; Arias-Estrada, Miguel
2000-03-01
Traditional computer vision is based on a camera-computer system in which the image understanding algorithms are embedded in the computer. To circumvent the computational load of vision algorithms, low-level processing and imaging hardware can be integrated in a single compact module where a dedicated architecture is implemented. This paper presents a computer vision camera based on an open architecture implemented in an FPGA. The system is targeted at real-time computer vision tasks where low-level processing and feature extraction tasks can be implemented in the FPGA device. The camera integrates a CMOS image sensor, an FPGA device, two memory banks, and an embedded PC for communication and control tasks. The FPGA is a medium-sized device equivalent to 25,000 logic gates. The device is connected to two high-speed memory banks, an IS interface, and an imager interface. The camera can be accessed for architecture programming, data transfer, and control through an Ethernet link from a remote computer. A hardware architecture can be defined in a hardware description language (such as VHDL), simulated, and synthesized into digital structures that can be programmed into the FPGA and tested on the camera. The architecture of a classical multi-scale edge detection algorithm based on a Laplacian of Gaussian convolution has been developed to show the capabilities of the system.
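As an illustration of the kind of low-level processing such an architecture implements, here is a minimal pure-Python sketch (software only, not VHDL) of single-scale Laplacian-of-Gaussian edge detection: convolve with a discrete LoG kernel, then mark zero crossings of the response. The 5x5 kernel is a common textbook approximation, not taken from the paper:

```python
# 5x5 Laplacian of Gaussian kernel (a common discrete approximation)
LOG = [[ 0,  0, -1,  0,  0],
       [ 0, -1, -2, -1,  0],
       [-1, -2, 16, -2, -1],
       [ 0, -1, -2, -1,  0],
       [ 0,  0, -1,  0,  0]]

def convolve(image, kernel):
    """Valid-mode 2D convolution of a list-of-lists image."""
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = len(image) - kh + 1, len(image[0]) - kw + 1
    return [[sum(kernel[i][j] * image[r + i][c + j]
                 for i in range(kh) for j in range(kw))
             for c in range(ow)]
            for r in range(oh)]

def zero_crossings(response):
    """Edges lie where the LoG response changes sign between neighbours."""
    edges = set()
    rows, cols = len(response), len(response[0])
    for r in range(rows):
        for c in range(cols):
            v = response[r][c]
            if c + 1 < cols and v * response[r][c + 1] < 0:
                edges.add((r, c))
            if r + 1 < rows and v * response[r + 1][c] < 0:
                edges.add((r, c))
    return edges

# A vertical step edge: dark left half, bright right half
img = [[0] * 4 + [10] * 4 for _ in range(8)]
resp = convolve(img, LOG)
print(sorted(zero_crossings(resp)))   # [(0, 1), (1, 1), (2, 1), (3, 1)]
```

In the camera, this per-pixel multiply-accumulate structure is exactly what maps well onto FPGA logic, which is why convolution-based feature extraction is a natural fit for the device.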
NASA Technical Reports Server (NTRS)
Farley, Douglas L.
2005-01-01
NASA's Aviation Safety and Security Program is pursuing research in on-board Structural Health Management (SHM) technologies for purposes of reducing or eliminating aircraft accidents due to system and component failures. Under this program, NASA Langley Research Center (LaRC) is developing a strain-based structural health-monitoring concept that incorporates a fiber optic-based measuring system for acquiring strain values. This fiber optic-based measuring system provides for the distribution of thousands of strain sensors embedded in a network of fiber optic cables. The resolution of strain value at each discrete sensor point requires a computationally demanding data reduction software process that, when hosted on a conventional processor, is not suitable for near real-time measurement. This report describes the development and integration of an alternative computing environment using dedicated computing hardware for performing the data reduction. Performance comparison between the existing and the hardware-based system is presented.
Ma, Andrew; Clegg, Daniel; Fugit, Randolph V.; Pepe, Anthony; Goetz, Matthew Bidwell; Graber, Christopher J.
2015-01-01
Background: Stewardship of antimicrobial agents is an essential function of hospital pharmacies. The ideal pharmacist staffing model for antimicrobial stewardship programs is not known. Objective: To inform staffing decisions for antimicrobial stewardship teams, we aimed to compare an antimicrobial stewardship program with a dedicated Infectious Diseases (ID) pharmacist (Dedicated ID Pharmacist Hospital) to a program relying on ward pharmacists for stewardship activities (Geographic Model Hospital). Methods: We reviewed a randomly selected sample of 290 cases of inpatient parenteral antibiotic use. The electronic medical record was reviewed for compliance with indicators of appropriate antimicrobial stewardship. Results: At the hospital staffed by a dedicated ID pharmacist, 96.8% of patients received initial antimicrobial therapy that adhered to local treatment guidelines compared to 87% of patients at the hospital that assigned antimicrobial stewardship duties to ward pharmacists (P < .002). Therapy was modified within 24 hours of availability of laboratory data in 86.7% of cases at the Dedicated ID Pharmacist Hospital versus 72.6% of cases at the Geographic Model Hospital (P < .03). When a patient’s illness was determined not to be caused by a bacterial infection, antibiotics were discontinued in 78.0% of cases at the Dedicated ID Pharmacist Hospital and in 33.3% of cases at the Geographic Model Hospital (P < .0002). Conclusion: An antimicrobial stewardship program with a dedicated ID pharmacist was associated with greater adherence to recommended antimicrobial therapy practices when compared to a stewardship program that relied on ward pharmacists. PMID:26405339
Perspectives on an education in computational biology and medicine.
Rubinstein, Jill C
2012-09-01
The mainstream application of massively parallel, high-throughput assays in biomedical research has created a demand for scientists educated in Computational Biology and Bioinformatics (CBB). In response, formalized graduate programs have rapidly evolved over the past decade. Concurrently, there is increasing need for clinicians trained to oversee the responsible translation of CBB research into clinical tools. Physician-scientists with dedicated CBB training can facilitate such translation, positioning themselves at the intersection between computational biomedical research and medicine. This perspective explores key elements of the educational path to such a position, specifically addressing: 1) evolving perceptions of the role of the computational biologist and the impact on training and career opportunities; 2) challenges in and strategies for obtaining the core skill set required of a biomedical researcher in a computational world; and 3) how the combination of CBB with medical training provides a logical foundation for a career in academic medicine and/or biomedical research.
HELAC-Onia 2.0: An upgraded matrix-element and event generator for heavy quarkonium physics
NASA Astrophysics Data System (ADS)
Shao, Hua-Sheng
2016-01-01
We present an upgraded version (denoted as version 2.0) of the program HELAC-ONIA for the automated computation of heavy-quarkonium helicity amplitudes within non-relativistic QCD framework. The new code has been designed to include many new and useful features for practical phenomenological simulations. It is designed for job submissions under cluster environment for parallel computations via PYTHON scripts. We have interfaced HELAC-ONIA to the parton shower Monte Carlo programs PYTHIA 8 and QEDPS to take into account the parton-shower effects. Moreover, the decay module guarantees that the program can perform the spin-entangled (cascade-)decay of heavy quarkonium after its generation. We have also implemented a reweighting method to automatically estimate the uncertainties from renormalization and/or factorization scales as well as parton-distribution functions to weighted or unweighted events. A further update is the possibility to generate one-dimensional or two-dimensional plots encoded in the analysis files on the fly. Some dedicated examples are given at the end of the writeup.
Neuropeptide Signaling Networks and Brain Circuit Plasticity.
McClard, Cynthia K; Arenkiel, Benjamin R
2018-01-01
The brain is a remarkable network of circuits dedicated to sensory integration, perception, and response. The computational power of the brain is estimated to dwarf that of most modern supercomputers, but perhaps its most fascinating capability is to structurally refine itself in response to experience. In the language of computers, the brain is loaded with programs that encode when and how to alter its own hardware. This programmed "plasticity" is a critical mechanism by which the brain shapes behavior to adapt to changing environments. The expansive array of molecular commands that help execute this programming is beginning to emerge. Notably, several neuropeptide transmitters, previously best characterized for their roles in hypothalamic endocrine regulation, have increasingly been recognized for mediating activity-dependent refinement of local brain circuits. Here, we discuss recent discoveries that reveal how local signaling by corticotropin-releasing hormone reshapes mouse olfactory bulb circuits in response to activity and further explore how other local neuropeptide networks may function toward similar ends.
Visual Navigation - SARE Mission
NASA Technical Reports Server (NTRS)
Alonso, Roberto; Kuba, Jose; Caruso, Daniel
2007-01-01
The SARE Earth Observing and Technological Mission is part of the Argentinean Space Agency (CONAE - Comision Nacional de Actividades Espaciales) Small and Technological Payloads Program. The Argentinean National Space Program requires the SARE mission to test several units, assemblies, and components in a real environment, in order to reduce the risk of using this equipment in more expensive space missions. The objective is to use components with acceptable maturity in design or development but without any heritage in space. From the application point of view, this mission offers new products in the Earth observation data market, which are listed in the present paper. One of the technological payloads on board the SARE satellite is the Ground Tracker sensor. It computes the satellite attitude and orbit in real time (goal) and/or by ground processing. For the first operating mode, a dedicated computer and mass memory must be part of the sensor. For the second operating mode, the hardware and software are much simpler.
Computer Synthesis Approaches of Hyperboloid Gear Drives with Linear Contact
NASA Astrophysics Data System (ADS)
Abadjiev, Valentin; Kawasaki, Haruhisa
2014-09-01
Computer-aided design has advanced through the development of various software for scientific research in the field of gearing theory, as well as for providing adequate scientific support to gear drive manufacture. Such computer programs are based on mathematical models resulting from scientific research. Modern gear transmissions require the construction of new mathematical approaches to their geometric, technological, and strength analysis. The process of optimization, synthesis, and design is based on adequate iteration procedures to find an optimal solution by varying definite parameters. This study is dedicated to the methodology adopted in creating software for the synthesis of a class of high-reduction hyperboloid gears - Spiroid and Helicon ones (Spiroid and Helicon are trademarks registered by the Illinois Tool Works, Chicago, Ill). The developed basic computer products are software based on original mathematical models. They rest on two mathematical models for the synthesis: "upon a pitch contact point" and "upon a mesh region". Computer programs are worked out on the basis of the described mathematical models, and the relations between them are shown. The application of these approaches to the synthesis of the discussed gear drives is illustrated.
Multiphasic Health Testing in the Clinic Setting
LaDou, Joseph
1971-01-01
The economy of automated multiphasic health testing (AMHT) activities patterned after the high-volume Kaiser program can be realized in low-volume settings. AMHT units have been operated at daily volumes of 20 patients in three separate clinical environments. These programs have displayed economics entirely compatible with cost figures published by the established high-volume centers. This experience, plus the expanding capability of small, general purpose, digital computers (minicomputers), indicates that a group of six or more physicians generating 20 laboratory appraisals per day can economically justify a completely automated multiphasic health testing facility. This system would reside in the clinic or hospital where it is used and can be configured to do analyses such as electrocardiography and generate laboratory reports, and communicate with large computer systems in university medical centers. Experience indicates that the most effective means of implementing these benefits of automation is to make them directly available to the medical community with the physician playing the central role. Economic justification of a dedicated computer through low-volume health testing then allows, as a side benefit, automation of administrative as well as other diagnostic activities—for example, patient billing, computer-aided diagnosis, and computer-aided therapeutics. PMID:4935771
Using OSG Computing Resources with (iLC)Dirac
NASA Astrophysics Data System (ADS)
Sailer, A.; Petric, M.; CLICdp Collaboration
2017-10-01
CPU cycles for small experiments and projects can be scarce, so making use of all available resources, whether dedicated or opportunistic, is mandatory. While enabling uniform access to the LCG computing elements (ARC, CREAM), the DIRAC grid interware was not able to use OSG computing elements (GlobusCE, HTCondor-CE) without dedicated support at the grid site through so-called ‘SiteDirectors’, which submit directly to the local batch system. This in turn requires additional dedicated effort for small experiments at the grid site. Adding interfaces to the OSG CEs through the respective grid middleware therefore allows accessing them within the DIRAC software without additional site-specific infrastructure. This enables greater use of opportunistic resources with the DIRAC software for experiments and projects without dedicated clusters or an established computing infrastructure. To allow sending jobs to HTCondor-CE and legacy Globus computing elements inside DIRAC, the required wrapper classes were developed. Not only is the usage of these types of computing elements now completely transparent for all DIRAC instances, which makes DIRAC a flexible solution for OSG-based virtual organisations, but it also allows LCG grid sites to move to the HTCondor-CE software without shutting DIRAC-based VOs out of their site. In these proceedings we detail how we interfaced the DIRAC system to the HTCondor-CE and Globus computing elements, explain the obstacles encountered and the solutions developed, and describe how the linear collider community uses resources in the OSG.
An extensive coronagraphic simulation applied to LBT
NASA Astrophysics Data System (ADS)
Vassallo, D.; Carolo, E.; Farinato, J.; Bergomi, M.; Bonavita, M.; Carlotti, A.; D'Orazi, V.; Greggio, D.; Magrin, D.; Mesa, D.; Pinna, E.; Puglisi, A.; Stangalini, M.; Verinaud, C.; Viotto, V.
2016-08-01
In this article we report the results of a comprehensive simulation program aimed at investigating the coronagraphic capabilities of SHARK-NIR, a camera selected to proceed to the final design phase at the Large Binocular Telescope. For this purpose, we developed a dedicated simulation tool based on physical optics propagation. The code propagates wavefronts through the SHARK optical train in an end-to-end fashion and can implement any kind of coronagraph. Detection limits can finally be computed, exploring a wide range of Strehl values and observing conditions.
Scalable cloud without dedicated storage
NASA Astrophysics Data System (ADS)
Batkovich, D. V.; Kompaniets, M. V.; Zarochentsev, A. K.
2015-05-01
We present a prototype of a scalable computing cloud. It is intended to be deployed on the basis of a cluster without the separate dedicated storage. The dedicated storage is replaced by the distributed software storage. In addition, all cluster nodes are used both as computing nodes and as storage nodes. This solution increases utilization of the cluster resources as well as improves fault tolerance and performance of the distributed storage. Another advantage of this solution is high scalability with a relatively low initial and maintenance cost. The solution is built on the basis of the open source components like OpenStack, CEPH, etc.
Krueger, Chad A; Hoffman, Jeffery D; Balazs, George C; Johnson, Anthony E; Potter, Benjamin K; Belmont, Philip J
The effect of dedicated resident research time in terms of residency program research productivity remains largely unknown. We hypothesize that the quantity and quality of a residency program's peer-reviewed publications (PRPs) increase proportionately with the amount of dedicated research time given to residents. Three residency programs (P1, P2, and P3) were examined. P1 has a mandatory research year for all residents between postgraduate years 3 and 4. P2 has an elective research year for 1 resident between postgraduate years 2 and 3. P3 has no dedicated research time for residents. All publications produced by residents and staff at each program from January 2007 through December were recorded from PUBMED. SCImago Journal Rankings were used as a proxy to measure research quality. There was no significant difference in the number of publications produced between the institutions on a per-staff (p = 0.27) and per-resident (p = 0.80) basis. There were no residents at P3 who graduated without at least 1 PRP, whereas there were 7 residents from P1 and 8 residents from P2 who graduated without a PRP. There were no significant differences between programs in terms of the SCImago Journal Ranking for the journals containing their publications (p = 0.135). Residency programs with dedicated research time did not produce significantly (p > 0.05) more, or higher quality, PRPs than residencies without dedicated research time. It may be that the quantity and quality of PRPs is related more to faculty engagement, research interest, and mentorship at individual programs rather than the number of residents given dedicated time to complete research. Level 3. Published by Elsevier Inc.
NASA Astrophysics Data System (ADS)
Sausen, Tania Maria
The initial activities in space education began right after World War II, in the early 1950s, when the USA and USSR started the Space Race. At that time, space education was exclusively available to researchers and technicians working directly in space programs. This new area was restricted to post-graduate programs (basically master's and doctoral degrees) or to very specific training programs dedicated to beginners. In South America there was at that time no activity in space education, simply because there was no activity in space research. At the beginning of the 1970s, Brazil, through INPE, created master's and doctoral courses in several space areas, such as remote sensing and meteorology. Only in the mid-1980s did Brazil, after a UN request, create its specialisation course on remote sensing dedicated to Latin American professionals. In the same period, the Agustin Codazzi Institute (Bogota, Colombia) began to offer specialisation courses in remote sensing. In South America, educational space programs are currently being created for elementary schools, high schools and universities, but the author personally estimates that 90% of these educational programs still use traditional educational materials such as books, tutorials, maps and graphics. There is little educational material that uses multimedia resources, advanced computing or communication methods, and these are basically the materials best suited to teaching remote sensing, GIS, meteorology and astronomy.
A communications model for an ISAS to NASA span link
NASA Technical Reports Server (NTRS)
Green, James L.; Mcguire, Robert E.; Lopez-Swafford, Brian
1987-01-01
The authors propose that an initial computer-to-computer communication link use the public packet-switched networks (PPSN): Venus-P in Japan and TELENET in the US. When traffic warrants it, this link would be upgraded to a dedicated leased line that connects directly into the Space Physics Analysis Network (SPAN). The proposed system of hardware and software will easily support migration to such a dedicated link, and therefore provides a cost-effective approach to the network problem. Once a dedicated line becomes operational, it is suggested that the public-network link continue to coexist with it, providing a backup capability.
Cosmological coherent state expectation values in loop quantum gravity I. Isotropic kinematics
NASA Astrophysics Data System (ADS)
Dapor, Andrea; Liegener, Klaus
2018-07-01
This is the first paper of a series dedicated to loop quantum gravity (LQG) coherent states and cosmology. The concept is based on the effective-dynamics program of loop quantum cosmology, where the classical dynamics generated by the expectation value of the Hamiltonian on semiclassical states is found to be in agreement with the quantum evolution of such states. We ask whether this expectation value agrees with the one obtained in the full theory. The answer is in the negative (Dapor and Liegener 2017, arXiv:1706.09833). This series of papers is dedicated to detailing the computations that lead to that surprising result. In the current paper, we construct the family of coherent states in LQG which represent flat (k = 0) Robertson–Walker spacetimes, and present the tools needed to compute expectation values of polynomial operators in holonomy and flux on such states. These tools will be applied to the LQG Hamiltonian operator (in Thiemann regularization) in the second paper of the series. The third paper will present an extension to cosmologies and a comparison with alternative regularizations of the Hamiltonian.
Simulations of Observations with the Far-Infrared Surveyor: Design Overview and Current Status
NASA Astrophysics Data System (ADS)
Jeong, W.; Pak, S.; Lee, H. M.; Kim, S.; Matsuura, M.; Nakagawa, T.; Yamamura, I.; Murakami, H.; Matsuura, S.; Kawada, M.; Kaneda, H.; Shibai, H.
2000-12-01
The Far-Infrared Surveyor (FIS) is one of the instruments on board the ASTRO-F satellite, which will be launched in early 2004. The first half year of its 500-day mission period is dedicated to an all-sky survey in four bands between 50 and 200 microns. On the basis of the present hardware specifications and configurations of the FIS, we have written a computer program to simulate the instrument. The program can be used to evaluate the performance of the instrument as well as to produce input for the data-reduction system. In this paper, we describe the current status of the program. As an example of its usage, we present the expected observing data for three different detector sampling rates. Functions to be implemented in the program in the future are also enumerated.
Patel, Parth; Khanna, Sarika; McLellan, Beth; Krishnamurthy, Karthik
2017-01-01
Background Inadequate dermoscopy training represents a major barrier to proper dermoscopy use. Objective To better understand the status of dermoscopy training in US residency programs. Methods A survey was sent to 417 dermatology residents and 118 program directors of dermatology residency programs. Results Comparing different training times for the same training type, residents with 1–10 hours of dedicated training had similar confidence using dermoscopy in general (p = 1.000) and satisfaction with training (p = .3224) to residents with >10 hours of dedicated training. Comparing similar training times for different training types, residents with 1–10 hours of dedicated training reported significantly greater confidence using dermoscopy in general (p = .0105) and satisfaction with training (p = .0066) than residents with 1–10 hours of only bedside training. Lastly, residents with 1–10 hours and >10 hours of dedicated training reported significantly greater confidence using dermoscopy in general (p = .0002, p = .2471) and satisfaction with training (p < .0001, p < .0001) than residents with no dermoscopy training at all. Conclusions Dermoscopy training in residency should include formal dermoscopy training that is overseen by the program director and possibly supplemented by outside dermoscopy training. PMID:28515987
Trimarchi, Matteo; Lund, Valerie J; Nicolai, Piero; Pini, Massimiliano; Senna, Massimo; Howard, David J
2004-04-01
The Neoplasms of the Sinonasal Tract software package (NSNT v 1.0) implements a complete visual database for patients with sinonasal neoplasia, facilitating standardization of data and statistical analysis. The software, which is compatible with the Macintosh and Windows platforms, provides a multiuser application with a dedicated server (on Windows NT or 2000, or Macintosh OS 9 or X, with a network of clients), together with web access if required. The system hardware consists of an Apple Power Macintosh G4 500-MHz computer with PCI bus, 256 MB of RAM and a 60-GB hard disk, or any IBM-compatible computer with a Pentium II processor. Image acquisition may be performed with different frame-grabber cards for analog or digital video input of different standards (PAL, SECAM, or NTSC) and levels of quality (VHS, S-VHS, Betacam, Mini DV, DV). The visual database is based on 4th Dimension by 4D Inc., and video compression is performed in real time in MPEG format. Six sections have been developed: demographics, symptoms, extent of disease, radiology, treatment, and follow-up. Acquired data include computed tomography and magnetic resonance imaging, histology, and endoscopy images, allowing sequential comparison. Statistical analysis integral to the program provides Kaplan-Meier survival curves. The development of a dedicated, user-friendly database for sinonasal neoplasia facilitates a multicenter network and has obvious clinical and research benefits.
Propulsion/flight control integration technology (PROFIT) design analysis status
NASA Technical Reports Server (NTRS)
Carlin, C. M.; Hastings, W. J.
1978-01-01
The propulsion flight control integration technology (PROFIT) program was designed to develop a flying testbed dedicated to controls research. The preliminary design, analysis, and feasibility studies conducted in support of the PROFIT program are reported. The PROFIT system was built around existing IPCS hardware. In order to achieve the desired system flexibility and capability, additional interfaces between the IPCS hardware and F-15 systems were required. The requirements for additions and modifications to the existing hardware were defined. Those interfaces involving the more significant changes were studied. The DCU memory expansion to 32K with flight qualified hardware was completed on a brassboard basis. The uplink interface breadboard and a brassboard of the central computer interface were also tested. Two preliminary designs and corresponding program plans are presented.
Computer systems and software engineering
NASA Technical Reports Server (NTRS)
Mckay, Charles W.
1988-01-01
The High Technologies Laboratory (HTL) was established in the fall of 1982 at the University of Houston Clear Lake. Research conducted at the High Tech Lab is focused upon computer systems and software engineering. There is a strong emphasis on the interrelationship of these areas of technology and the United States' space program. In Jan. of 1987, NASA Headquarters announced the formation of its first research center dedicated to software engineering. Operated by the High Tech Lab, the Software Engineering Research Center (SERC) was formed at the University of Houston Clear Lake. The High Tech Lab/Software Engineering Research Center promotes cooperative research among government, industry, and academia to advance the edge-of-knowledge and the state-of-the-practice in key topics of computer systems and software engineering which are critical to NASA. The center also recommends appropriate actions, guidelines, standards, and policies to NASA in matters pertinent to the center's research. Results of the research conducted at the High Tech Lab/Software Engineering Research Center have given direction to many decisions made by NASA concerning the Space Station Program.
1979-09-01
joint orientation and joint slippage than to failure of the intact rock mass. Dixon (1971) noted the importance of including the confining influence of...dedicated computer. The area of research not covered by this investigation which holds promise for a future study is a detailed comparison of the results of...block data, type key "W". The program writes this data on LINC tapes for future retrieval. This feature can be used to store the consolidated block
The F-18 systems research aircraft facility
NASA Technical Reports Server (NTRS)
Sitz, Joel R.
1992-01-01
To help ensure that new aerospace initiatives rapidly transition to competitive U.S. technologies, NASA Dryden Flight Research Facility has dedicated a systems research aircraft facility. The primary goal is to accelerate the transition of new aerospace technologies to commercial, military, and space vehicles. Key technologies include more-electric aircraft concepts, fly-by-light systems, flush airdata systems, and advanced computer architectures. Future aircraft that will benefit are the high-speed civil transport and the National AeroSpace Plane. This paper describes the systems research aircraft flight research vehicle and outlines near-term programs.
Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources
NASA Astrophysics Data System (ADS)
Evans, D.; Fisk, I.; Holzman, B.; Melo, A.; Metson, S.; Pordes, R.; Sheldon, P.; Tiradani, A.
2011-12-01
Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data-reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long-term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly "on-demand", as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a university, and we conclude that it is most cost-effective to purchase dedicated resources for the baseline needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during usage spikes.
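The baseline-versus-burst conclusion is amortization arithmetic; a sketch with made-up figures (none of the prices below are CMS's or Amazon's):

```python
# Illustrative cost comparison (all figures hypothetical): dedicated
# hardware amortized over its lifetime vs. on-demand cloud core-hours.
def dedicated_cost_per_core_hour(capex, annual_opex, lifetime_years,
                                 cores, utilization):
    """Total cost of ownership divided by the core-hours actually used."""
    total = capex + annual_opex * lifetime_years
    usable_hours = lifetime_years * 365 * 24 * cores * utilization
    return total / usable_hours

owned = dedicated_cost_per_core_hour(
    capex=200_000, annual_opex=30_000, lifetime_years=4,
    cores=512, utilization=0.80)
cloud = 0.10  # hypothetical on-demand price per core-hour

# A well-utilized owned cluster beats on-demand pricing for steady
# baseline load; the cloud pays off only for transient bursts.
assert owned < cloud
```

The crossover moves with utilization: an owned cluster that sits mostly idle quickly becomes more expensive per useful core-hour than renting.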
Quality controls for gamma cameras and PET cameras: development of a free open-source ImageJ program
NASA Astrophysics Data System (ADS)
Carlier, Thomas; Ferrer, Ludovic; Berruchon, Jean B.; Cuissard, Regis; Martineau, Adeline; Loonis, Pierre; Couturier, Olivier
2005-04-01
Acquisition data and treatments for quality control of gamma cameras and positron emission tomography (PET) cameras are commonly performed with dedicated program packages that run only on the manufacturers' computers and differ from each other depending on the camera company and program version. The aim of this work was to develop a free open-source program (written in the Java language) to analyze quality-control data for gamma cameras and PET cameras. The program is based on the free application software ImageJ and can easily be loaded on any computer operating system (OS), and thus on any type of computer in every nuclear medicine department. Based on standard quality-control parameters, this program includes: (1) for gamma cameras, a rotation-center control (extracted from the American Association of Physicists in Medicine, AAPM, norms) and two uniformity controls (extracted from the Institute of Physics and Engineering in Medicine, IPEM, and National Electrical Manufacturers Association, NEMA, norms); (2) for PET systems, three quality controls recently defined by the French Medical Physicist Society (SFPM), i.e. spatial resolution and uniformity in a reconstructed slice, and scatter fraction. The determination of spatial resolution (from a Point Spread Function, PSF, acquisition) allows computation of the Modulation Transfer Function (MTF) for both camera modalities. All the control functions are included in a toolbox that is a free ImageJ plugin. In addition, the program offers the possibility to save the uniformity quality-control results in HTML format, and a warning can be set to automatically inform users of abnormal results. The architecture of the program allows users to easily add any other specific quality-control function.
Finally, this toolkit is an easy and robust tool to perform quality control of gamma cameras and PET cameras based on standard computation parameters; it is free, runs on any type of computer, and will soon be downloadable from the Internet (http://rsb.info.nih.gov/ij/plugins or http://nucleartoolkit.free.fr).
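The MTF computation mentioned above is the normalized Fourier magnitude of the PSF; a minimal one-dimensional sketch in pure Python (not the plugin's actual code):

```python
import cmath

def mtf_from_psf(psf):
    """Modulation transfer function as the normalized magnitude of the
    discrete Fourier transform of a one-dimensional point spread function."""
    n = len(psf)
    spectrum = []
    for k in range(n):
        s = sum(p * cmath.exp(-2j * cmath.pi * k * i / n)
                for i, p in enumerate(psf))
        spectrum.append(abs(s))
    dc = spectrum[0]  # zero-frequency component: normalize so MTF(0) = 1
    return [m / dc for m in spectrum]

# A broad PSF (worse spatial resolution) rolls off faster than a narrow one.
narrow = [0, 0, 1, 8, 1, 0, 0, 0]
broad = [0, 1, 4, 8, 4, 1, 0, 0]
assert mtf_from_psf(narrow)[0] == 1.0
assert mtf_from_psf(broad)[2] < mtf_from_psf(narrow)[2]
```

Real quality-control code works on 2-D PSF images and uses an FFT, but the normalization and the narrow-vs-broad behaviour are the same.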
7 CFR 1491.4 - Program requirements.
Code of Federal Regulations, 2011 CFR
2011-01-01
... capability to acquire, manage, and enforce easements; (3) Sufficient number of staff dedicated to monitoring... of easement management, monitoring, and enforcement where such fund is sufficiently capitalized in accordance with NRCS standards. The dedicated fund must be dedicated to the purposes of managing, monitoring...
Implementation of a commercial-grade dedication program - Benefits and lessons learned
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrington, M.; MacFarlane, I.
1991-01-01
The recent issuance of industry guidelines, the Nuclear Management and Resources Council (NUMARC) procurement initiative, and a US Nuclear Regulatory Commission (NRC) generic letter on commercial-grade item dedication (CGD) have been viewed by many utility managers and executives as only adding to the existing burden of compliance with regulatory requirements. While the incorporation of these documents into existing CGD programs has created additional costs, the resulting enhanced dedication programs have also produced benefits beyond regulatory compliance, and some lessons have been learned. This paper discusses the benefits and the lessons learned during implementation of an enhanced CGD program at New Hampshire Yankee's (NHY's) Seabrook nuclear plant. Based on NHY's experience, it is believed that the benefits described in this paper can be realized by other utilities implementing CGD programs.
Abreu, Rui Mv; Froufe, Hugo Jc; Queiroz, Maria João Rp; Ferreira, Isabel Cfr
2010-10-28
Virtual screening of small molecules using molecular docking has become an important tool in drug discovery. However, large-scale virtual screening is time-demanding and usually requires dedicated computer clusters. A number of software tools perform virtual screening using AutoDock4, but they require access to dedicated Linux computer clusters, and no software is available for performing virtual screening with Vina using computer clusters. In this paper we present MOLA, an easy-to-use graphical-user-interface tool that automates parallel virtual screening using AutoDock4 and/or Vina on bootable, non-dedicated computer clusters. MOLA automates several tasks, including ligand preparation, distribution of parallel AutoDock4/Vina jobs, and result analysis. When the virtual-screening project finishes, an OpenOffice spreadsheet file opens with the ligands ranked by binding energy and distance to the active site. All result files can automatically be recorded on a USB flash drive or on the hard-disk drive using VirtualBox. MOLA works inside a customized live-CD GNU/Linux operating system, developed by us, that bypasses the operating system installed on the computers used in the cluster. This operating system boots from a CD on the master node and then clusters other computers as slave nodes via Ethernet connections. MOLA is an ideal virtual-screening tool for non-experienced users with a limited number of multi-platform heterogeneous computers available and no access to dedicated Linux computer clusters. When a virtual-screening project finishes, the computers can simply be restarted to their original operating system. The originality of MOLA lies in the fact that any platform-independent computer available can be added to the cluster, without ever using the computer's hard-disk drive and without interfering with the installed operating system.
With a cluster of 10 processors, and a potential maximum speed-up of 10×, the parallel algorithm of MOLA achieved a speed-up of 8.64× using AutoDock4 and 8.60× using Vina.
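The reported figures translate directly into parallel efficiency, and Amdahl's law can be inverted to estimate the parallelizable fraction of the workload; a small check using the abstract's own numbers:

```python
def speedup(serial_time, parallel_time):
    """Classic speed-up: serial runtime over parallel runtime."""
    return serial_time / parallel_time

def efficiency(speedup_value, processors):
    """Fraction of ideal linear speed-up actually achieved."""
    return speedup_value / processors

def amdahl_parallel_fraction(speedup_value, processors):
    """Invert Amdahl's law, S = 1 / ((1 - p) + p/N), to estimate the
    parallelizable fraction p of the workload."""
    return (1 - 1 / speedup_value) / (1 - 1 / processors)

# Figures from the abstract: 10 processors, measured speed-ups of
# 8.64x (AutoDock4) and 8.60x (Vina) -> ~86% parallel efficiency.
assert round(efficiency(8.64, 10), 3) == 0.864
assert round(efficiency(8.60, 10), 2) == 0.86
assert 0.98 < amdahl_parallel_fraction(8.64, 10) < 0.99
```

An estimated parallel fraction near 98% is consistent with docking's embarrassingly parallel structure, with the residual serial time spent on job distribution and result collection.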
[Intranet applications in radiology].
Knopp, M V; von Hippel, G M; Koch, T; Knopp, M A
2000-01-01
The aim of this paper is to present the conceptual basis and capabilities of intranet applications in radiology. The intranet, the local counterpart of the Internet, can be readily realized using existing computer components and a network. All current computer operating systems support intranet applications, which allow hardware- and software-independent communication of text, images, video and sound through browser software, without dedicated programs on the individual personal computers. Radiological applications include text communication (e.g., department-specific bulletin boards and access to examination protocols); image communication for viewing, limited processing, and documentation of radiological images on decentralized PCs; and speech communication for dictation, distribution of dictation, and speech recognition. The intranet helps to optimize organizational efficiency and cost effectiveness in the daily work of radiological departments in outpatient and hospital settings. The general interest in Internet and intranet technology will guarantee its continuous development.
NASA Astrophysics Data System (ADS)
Zaborowicz, M.; Przybył, J.; Koszela, K.; Boniecki, P.; Mueller, W.; Raba, B.; Lewicki, A.; Przybył, K.
2014-04-01
The aim of the project was to develop software that extracts the characteristics of a greenhouse tomato from its image. Data gathered during image analysis and processing were used to build learning sets for artificial neural networks. The program can process pictures in JPEG format, acquire statistical information about the picture, and export it to an external file. The software is intended for batch analysis of collected research material, with the obtained information saved as a CSV file. The program computes 33 independent parameters to describe each tested image. The application is dedicated to processing and image analysis of greenhouse tomatoes, but it can also be used to analyze other fruits and vegetables of spherical shape.
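The batch pipeline described, per-image parameters exported to a CSV file, might be sketched as follows; the parameters and pixel data here are illustrative stand-ins, not the 33 parameters the program actually computes:

```python
import csv
import statistics

def describe_image(pixels):
    """Extract a few per-image parameters (mean, standard deviation,
    min, max per RGB channel). The parameter set is hypothetical, far
    smaller than the 33 parameters used by the authors."""
    row = {}
    for c, name in enumerate(("r", "g", "b")):
        channel = [p[c] for p in pixels]
        row[f"{name}_mean"] = statistics.mean(channel)
        row[f"{name}_std"] = statistics.pstdev(channel)
        row[f"{name}_min"] = min(channel)
        row[f"{name}_max"] = max(channel)
    return row

def export_csv(path, rows):
    """Write one row of parameters per analyzed image."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0]))
        writer.writeheader()
        writer.writerows(rows)

pixels = [(200, 40, 30), (210, 50, 35), (190, 45, 25)]  # a tomato-red patch
row = describe_image(pixels)
assert row["r_mean"] == 200
```

Rows like these, one per image, form the learning sets fed to the neural networks mentioned in the abstract.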
Illuminator, a desktop program for mutation detection using short-read clonal sequencing.
Carr, Ian M; Morgan, Joanne E; Diggle, Christine P; Sheridan, Eamonn; Markham, Alexander F; Logan, Clare V; Inglehearn, Chris F; Taylor, Graham R; Bonthron, David T
2011-10-01
Current methods for sequencing clonal populations of DNA molecules yield several gigabases of data per day, typically comprising reads of < 100 nt. Such datasets permit widespread genome resequencing and transcriptome analysis or other quantitative tasks. However, this huge capacity can also be harnessed for the resequencing of smaller (gene-sized) target regions, through the simultaneous parallel analysis of multiple subjects, using sample "tagging" or "indexing". These methods promise to have a huge impact on diagnostic mutation analysis and candidate gene testing. Here we describe a software package developed for such studies, offering the ability to resolve pooled samples carrying barcode tags and to align reads to a reference sequence using a mutation-tolerant process. The program, Illuminator, can identify rare sequence variants, including insertions and deletions, and permits interactive data analysis on standard desktop computers. It facilitates the effective analysis of targeted clonal sequencer data without dedicated computational infrastructure or specialized training. Copyright © 2011 Elsevier Inc. All rights reserved.
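The pooled-sample resolution step the abstract describes, assigning tagged reads back to their subjects while tolerating sequencing errors, can be sketched as follows (barcode length, sample names and reads are made up, not Illuminator's actual scheme):

```python
def hamming(a, b):
    """Number of mismatching positions between two equal-length strings."""
    return sum(x != y for x, y in zip(a, b))

def demultiplex(reads, barcodes, max_mismatch=1):
    """Assign each read to the sample whose barcode tag best matches the
    read's leading bases, tolerating up to `max_mismatch` errors.
    A simplified version of pooled-sample resolution."""
    buckets = {sample: [] for sample in barcodes}
    unassigned = []
    for read in reads:
        prefix = read[:6]
        hits = [(hamming(prefix, bc), sample)
                for sample, bc in barcodes.items()]
        dist, sample = min(hits)
        if dist <= max_mismatch:
            buckets[sample].append(read[6:])   # strip the tag
        else:
            unassigned.append(read)
    return buckets, unassigned

barcodes = {"patient1": "ACGTAC", "patient2": "TTGCAA"}
reads = ["ACGTACGGGTTT", "ACGAACGGGTTT",  # exact tag and 1-mismatch tag
         "TTGCAACCCAAA", "GGGGGGCCCAAA"]  # exact tag and unassignable
buckets, unassigned = demultiplex(reads, barcodes)
assert buckets["patient1"] == ["GGGTTT", "GGGTTT"]
assert unassigned == ["GGGGGGCCCAAA"]
```

After demultiplexing, each bucket's reads would go through the mutation-tolerant alignment step against the reference sequence.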
Interactive brain shift compensation using GPU based programming
NASA Astrophysics Data System (ADS)
van der Steen, Sander; Noordmans, Herke Jan; Verdaasdonk, Rudolf
2009-02-01
Processing large image files or real-time video streams requires intense computational power. Driven by the gaming industry, the processing power of graphics processing units (GPUs) has increased significantly. With pixel shader model 4.0, the GPU can be used for image processing 10× faster than the CPU. Dedicated software was developed to deform 3D MR and CT image sets for real-time brain-shift correction during navigated neurosurgery, using landmarks or cortical surface traces defined by the navigation pointer. Feedback was given using orthogonal slices and an interactively ray-traced 3D brain image. GPU-based programming enables real-time processing of high-definition image datasets, and various applications can be developed in medicine, optics and the image sciences.
NASA Technical Reports Server (NTRS)
Dodson, D. W.; Shields, N. L., Jr.
1978-01-01
The Experiment Computer Operating System (ECOS) of the Spacelab will allow the onboard Payload Specialist to command experiment devices and display information relative to the performance of experiments. Three candidate ECOS command and control service concepts were reviewed and laboratory data on operator performance was taken for each concept. The command and control service concepts evaluated included a dedicated operator's menu display from which all command inputs were issued, a dedicated command key concept with which command inputs could be issued from any display, and a multi-display concept in which command inputs were issued from several dedicated function displays. Advantages and disadvantages are discussed in terms of training, operational errors, task performance time, and subjective comments of system operators.
Variability and Limits of US State Laws Regulating Workplace Wellness Programs.
Pomeranz, Jennifer L; Garcia, Andrea M; Vesprey, Randy; Davey, Adam
2016-06-01
We examined variability in state laws related to workplace wellness programs for public and private employers. We conducted legal research using LexisNexis and Westlaw to create a master list of US state laws that existed in 2014 dedicated to workplace wellness programs. The master list was then divided into laws focusing on public employers and private employers. We created 2 codebooks to describe the variables used to examine the laws. Coders used LawAtlas(SM) Workbench to code the laws related to workplace wellness programs. Thirty-two states and the District of Columbia had laws related to workplace wellness programs in 2014. Sixteen states and the District of Columbia had laws dedicated to public employers, and 16 states had laws dedicated to private employers. Nine states and the District of Columbia had laws that did not specify employer type. State laws varied greatly in their methods of encouraging or shaping wellness program requirements. Few states have comprehensive requirements or incentives to support evidence-based workplace wellness programs.
NRC assessment of improvements in licensee procurement and dedication programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grimes, B.K.; Potapovs, U.; Campbell, L.L.
1991-01-01
The US Nuclear Regulatory Commission (NRC) recently conducted a series of on-site assessments at selected nuclear plants to review improvements that the licensees had made to their procurement and dedication activities. The assessments included an evaluation of the progress made by licensees to strengthen their commercial-grade dedication programs to comply with 10 CFR 50, Appendix B, and to implement the comprehensive procurement improvements suggested in Nuclear Management and Resources Council (NUMARC) publication 90-13. This paper discusses the overall purpose of the assessments, the procurement areas assessed, the major strengths and weaknesses identified in licensee procurement programs, and the NRC's perspective on the industry's response to the NUMARC procurement initiatives.
Scientific Services on the Cloud
NASA Astrophysics Data System (ADS)
Chapman, David; Joshi, Karuna P.; Yesha, Yelena; Halem, Milt; Yesha, Yaacov; Nguyen, Phuong
Scientific computing was one of the first applications of parallel and distributed computation. To this day, scientific applications remain some of the most compute-intensive, and have inspired the creation of petaflop compute infrastructure such as Oak Ridge's Jaguar and Los Alamos's Roadrunner. Large dedicated hardware infrastructure has become both a blessing and a curse to the scientific community. Scientists are interested in cloud computing for much the same reasons as businesses and other professionals: the hardware is provided, maintained, and administered by a third party; software abstraction and virtualization provide reliability and fault tolerance; and graduated fees allow for multi-scale prototyping and execution. Cloud computing resources are only a few clicks away, and are by far the easiest high-performance distributed platform to gain access to. There may still be dedicated infrastructure for ultra-scale science, but the cloud can easily play a major part in the scientific computing initiative.
High-Performance Computing Systems and Operations | Computational Science | NREL
NREL operates high-performance computing (HPC) systems dedicated to advancing energy efficiency and renewable energy technologies.
Using Mosix for Wide-Area Computational Resources
Maddox, Brian G.
2004-01-01
One of the problems with traditional Beowulf-type distributed processing clusters is that they require an investment in dedicated computer resources, usually needed in addition to pre-existing ones such as desktop computers and file servers. Mosix is a series of modifications to the Linux kernel that creates a virtual computer, featuring automatic load balancing that migrates processes from heavily loaded nodes to less-used ones. An extension of the Beowulf concept is to run a Mosix-enabled Linux kernel on a large number of computer resources in an organization. This configuration would provide a very large amount of computational resources based on pre-existing equipment. The advantage of this method is that it provides much more processing power than a traditional Beowulf cluster without the added cost of dedicating resources.
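The automatic load balancing described, migrating processes from heavily loaded nodes to less-used ones, can be illustrated with a greedy toy model (a simplification, not Mosix's actual kernel algorithm):

```python
def rebalance(loads, threshold=1):
    """Greedy sketch of migration-based load balancing: move one unit
    of work from the busiest node to the idlest while the imbalance
    exceeds a threshold. Node names and loads are illustrative."""
    loads = dict(loads)
    while max(loads.values()) - min(loads.values()) > threshold:
        busiest = max(loads, key=loads.get)
        idlest = min(loads, key=loads.get)
        loads[busiest] -= 1   # migrate one process away
        loads[idlest] += 1
    return loads

balanced = rebalance({"a": 9, "b": 1, "c": 2})
assert max(balanced.values()) - min(balanced.values()) <= 1
assert sum(balanced.values()) == 12  # no work is lost, only moved
```

Mosix performs this continuously and transparently at the kernel level, using runtime load measurements rather than fixed work units.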
A survey of 100 community colleges on student substance use, programming, and collaborations.
Chiauzzi, Emil; Donovan, Elizabeth; Black, Ryan; Cooney, Elizabeth; Buechner, Allison; Wood, Mollie
2011-01-01
The objective was to survey community college personnel about student substance use, and infrastructure (staff and funding), programs, and collaborations dedicated to substance use prevention. The sample included 100 administrators, faculty, and health services staff at 100 community colleges. Participants completed a Web-based survey. Participants reported a number of alcohol and other drug (AOD) related concerns. Despite limited staff and funding dedicated to AOD, institutions are implementing a number of programs, although many are not implementing some of the programs popular at traditional 4-year colleges. They are also collaborating with a number of on- and off-campus groups. The availability of staff and funding dedicated to AOD, and the presence of residence halls, is associated with health programming and substance abuse collaborations. Results suggest that there is a need for increased research to understand the most effective AOD prevention strategies for community colleges.
Dedicated training in adult education among otolaryngology faculty.
McMains, Kevin C; Peel, Jennifer
2014-12-01
Most faculty members undergo ad hoc training in education. This survey was developed to assess the prevalence and type of dedicated training in education received by academic otolaryngology-head and neck surgery (OTO-HNS) faculty in the United States. Survey. An 11-item survey was developed to assess the prevalence of dedicated instruction in education theory and practice, the types of instruction received, and the barriers to receiving instruction. The survey was sent to all OTO-HNS program directors for distribution among their respective faculty. A total of 216 responses were received. Seventy respondents (32.7%) serve as program director, associate program director, or assistant program director in their respective programs. Forty-six respondents (21.8%) had received dedicated training in education. Of the respondents who described the type of education training received, 48.7% participated in didactics/seminar, 35.9% in degree/certificate programs, 10.3% in multimodality training, and 5.1% online training. Among the barriers encountered to participation in instruction in education, time/productivity pressures was the most commonly cited reason (60.2%), followed by not knowing about the opportunity to receive training (36.4%), lack of departmental support (26.2%), lack of available training (22.3%), and the perception that such training would not be useful (7.8%). Presently, only a minority of surveyed academic otolaryngologists in the United States have received any dedicated instruction in the theory and practice of education. Personal, departmental, and institutional barriers exist in many practice environments that hinder otolaryngology faculty from participating in education training. N/A. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.
An Analysis of Research Quality and Productivity at Six Academic Orthopaedic Residencies.
Osborn, Patrick M; Ames, S Elizabeth; Turner, Norman S; Caird, Michelle S; Karam, Matthew D; Mormino, Matthew A; Krueger, Chad A
2018-06-06
It remains largely unknown what factors affect the research productivity of residency programs. We hypothesized that dedicated resident research time would not affect the quantity or quality of a program's peer-reviewed publications within orthopedic residencies. These findings may help programs structure their residencies to maximize core competencies. Three hundred fifty-nine residents and 240 staff from six different US orthopedic residency programs were analyzed. All publications by residents and faculty at each program from January 2007 to December 2015 were recorded, and SCImago Journal Rankings (SJR) were found for each journal. There were no significant differences in publications by residents at each program (p > 0.05). Faculty with 10+ years on staff had significantly more publications than those with less than 10 years (p < 0.01). Programs with increased resident research time did not consistently produce publications with higher SJR than those without dedicated research time. Increased dedicated resident research time did not increase resident publication rates or lead to publications with higher SJR. Copyright © 2018 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Alloul, Adam; Christensen, Neil D.; Degrande, Céline; Duhr, Claude; Fuks, Benjamin
2014-06-01
The program FEYNRULES is a MATHEMATICA package developed to facilitate the implementation of new physics theories into high-energy physics tools. Starting from a minimal set of information, such as the model gauge symmetries, its particle content, parameters and Lagrangian, FEYNRULES provides all necessary routines to extract automatically from the Lagrangian (which can also be computed semi-automatically for supersymmetric theories) the associated Feynman rules. These can be further exported to several Monte Carlo event generators through dedicated interfaces, as well as translated into a PYTHON library under the so-called UFO model format, which is agnostic of the model complexity, especially in terms of the Lorentz and/or color structures appearing in the vertices or the number of external legs. In this work, we briefly report on the most recent new features that have been added to FEYNRULES, including full support for spin-3/2 fermions, a new module allowing for the automated diagonalization of the particle spectrum, and a new set of routines dedicated to decay width calculations.
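The UFO format mentioned above represents a model as a set of plain Python objects that any event generator can import. The sketch below is a simplified illustration of that idea; the class name and attributes are modeled loosely on UFO conventions, not copied from the actual specification.

```python
# Simplified illustration of the UFO idea: physics model content expressed
# as plain Python objects. Class and attribute names are illustrative only.

class Particle:
    def __init__(self, pdg_code, name, spin, color, charge):
        self.pdg_code = pdg_code  # PDG numbering scheme identifier
        self.name = name
        self.spin = spin          # 2S+1: 1 = scalar, 2 = fermion, 3 = vector
        self.color = color        # SU(3) rep: 1 = singlet, 3 = triplet, 8 = octet
        self.charge = charge      # electric charge

# A tiny "particle content" list, as a UFO model module would expose one:
all_particles = [
    Particle(pdg_code=6, name="t", spin=2, color=3, charge=2/3),
    Particle(pdg_code=21, name="g", spin=3, color=8, charge=0),
]

# A generator can query the model without knowing its internal complexity:
vector_names = [p.name for p in all_particles if p.spin == 3]
```

Because the model is just data, generators of arbitrary sophistication can consume it uniformly, which is the sense in which the format is agnostic of model complexity.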
Research into display sharing techniques for distributed computing environments
NASA Technical Reports Server (NTRS)
Hugg, Steven B.; Fitzgerald, Paul F., Jr.; Rosson, Nina Y.; Johns, Stephen R.
1990-01-01
The X-based Display Sharing solution for distributed computing environments is described. The Display Sharing prototype includes the base functionality for telecast and display copy requirements. Since the prototype implementation is modular and the system design provides flexibility for Mission Control Center Upgrade (MCCU) operational considerations, the prototype implementation can serve as the baseline for a production Display Sharing implementation. To facilitate this process, the following discussions are presented: theory of operation; system architecture; using the prototype; software description; research tools; prototype evaluation; and outstanding issues. The prototype is based on the concept of a dedicated central host performing the majority of the Display Sharing processing, allowing minimal impact on each individual workstation. Each workstation participating in Display Sharing hosts programs that facilitate the user's access to Display Sharing on the host machine.
Integrated digital flight-control system for the space shuttle orbiter
NASA Technical Reports Server (NTRS)
1973-01-01
The integrated digital flight control system is presented which provides rotational and translational control of the space shuttle orbiter in all phases of flight: from launch ascent through orbit to entry and touchdown, and during powered horizontal flights. The program provides a versatile control system structure while maintaining uniform communications with other programs, sensors, and control effectors by using an executive routine/functional subroutine format. The program reads all external variables at a single point, copies them into its dedicated storage, and then calls the required subroutines in the proper sequence. As a result, the flight control program is largely independent of other programs in the GN&C computer complex and is equally insensitive to the characteristics of the processor configuration. The integrated structure of the control system and the DFCS executive routine which embodies that structure are described along with the input and output. The specific estimation and control algorithms used in the various mission phases are given.
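The executive-routine/functional-subroutine structure described here can be sketched in miniature; all names, values, and control laws below are hypothetical placeholders, not the shuttle's actual algorithms.

```python
# Sketch of an executive routine: read all external variables at a single
# point, copy them into dedicated storage, then call the required
# subroutines in a fixed sequence. Everything here is illustrative.

def read_external_variables():
    # Stand-in for sensor and GN&C inputs polled from the computer complex.
    return {"attitude": 0.5, "rate": 0.01}

def estimate_state(store):
    # Hypothetical estimation subroutine.
    return store["attitude"] + store["rate"]

def compute_command(store):
    # Hypothetical control subroutine (simple proportional law).
    return -2.0 * store["attitude"]

class FlightControlExecutive:
    def __init__(self, subroutines):
        self.storage = {}               # dedicated storage for this program
        self.subroutines = subroutines  # fixed calling sequence

    def cycle(self):
        # Single input point: copy external variables into dedicated
        # storage, so subroutines never touch other programs' data.
        self.storage = dict(read_external_variables())
        return {sub.__name__: sub(self.storage) for sub in self.subroutines}

executive = FlightControlExecutive([estimate_state, compute_command])
outputs = executive.cycle()
```

Isolating all external reads at one point is what makes such a program largely independent of the rest of the computer complex, as the abstract notes.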
A new approach in the design of an interactive environment for teaching Hamiltonian digraphs
NASA Astrophysics Data System (ADS)
Iordan, A. E.; Panoiu, M.
2014-03-01
In this article the authors present the necessary steps in the object-oriented design of an interactive environment dedicated to the process of knowledge assimilation in the domain of Hamiltonian graph theory, especially the simulation of algorithms that determine Hamiltonian trails and circuits. The modelling of the interactive environment is achieved through specific UML diagrams representing the steps of analysis, design and implementation. This interactive environment is very useful for both students and professors, because the computer programming domain, and digraph theory in particular, is comprehended and assimilated with difficulty by students.
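As an illustration of the kind of algorithm such an environment simulates, a straightforward backtracking search for a Hamiltonian trail in a digraph might look as follows (a sketch, not code from the article's environment):

```python
# Backtracking search for a Hamiltonian trail (a directed path visiting
# every vertex exactly once) in a digraph given as an adjacency dict.

def hamiltonian_path(graph):
    n = len(graph)

    def extend(path, visited):
        if len(path) == n:          # all vertices visited: trail found
            return path
        for nxt in graph[path[-1]]:
            if nxt not in visited:  # try each unvisited successor
                found = extend(path + [nxt], visited | {nxt})
                if found:
                    return found
        return None                 # dead end: backtrack

    for start in graph:             # a trail may start at any vertex
        found = extend([start], {start})
        if found:
            return found
    return None

# Example digraph: 0 -> 1 -> 2 -> 3, plus a distracting edge 0 -> 3.
g = {0: [1, 3], 1: [2], 2: [3], 3: []}
path = hamiltonian_path(g)  # [0, 1, 2, 3]
```

The exponential worst case of this search is exactly what makes step-by-step visual simulation valuable for students.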
On critical exponents without Feynman diagrams
NASA Astrophysics Data System (ADS)
Sen, Kallol; Sinha, Aninda
2016-11-01
In order to achieve a better analytic handle on the modern conformal bootstrap program, we re-examine and extend Polyakov's pioneering 1974 work, which was based on consistency between the operator product expansion and unitarity. As in the bootstrap approach, this method does not depend on evaluating Feynman diagrams. We show how this approach can be used to compute the anomalous dimensions of certain operators in the O(n) model at the Wilson-Fisher fixed point in 4−ε dimensions up to O(ε²). AS dedicates this work to the loving memory of his mother.
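For reference, the best-known quantity of this kind, the anomalous dimension η of the fundamental field in the O(n) model at the Wilson-Fisher fixed point, reads at this order:

```latex
\eta = \frac{n+2}{2(n+8)^{2}}\,\epsilon^{2} + O(\epsilon^{3})
```

This is the standard ε-expansion result that diagram-free methods of the kind discussed above are expected to reproduce.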
Dedication file preparation for commercial-grade electric components
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendricks, J.R.; Farwell, C.R. Jr.
1988-01-01
Dedication is the process of making a commercial-grade item into a basic component that can be installed in safety systems. This process ensures that the commercially manufactured items are of the same or equivalent form, fit, function, and materials as the originally provided safety item. This process must ensure that the original utility's equipment qualification program is maintained per licensing commitments to 10CFR50.49 and general design criterion No. 4. Today, utilities recognize the need for establishing a dedication program to provide the flexibility of obtaining replacement items directly from the original manufacturers. This need has arisen because (a) most system houses, large manufacturers, and component manufacturers will sell their products only through distributors as straight commercial-grade items or only service former clients, and (b) lack of competition for specific safety-related items has resulted in excessive hardware cost and very long delivery schedules, which could affect plant availability. The vehicle for utilities to obtain safety-related items is to establish and manage a comprehensive dedication program for their own use or provide the direction for a nuclear supplier to follow. This paper provides both utilities and nuclear suppliers insight into the complexities of a dedication program. This insight is provided from our experience as a utility's agent and as a third-party nuclear supplier.
The ACI-REF Program: Empowering Prospective Computational Researchers
NASA Astrophysics Data System (ADS)
Cuma, M.; Cardoen, W.; Collier, G.; Freeman, R. M., Jr.; Kitzmiller, A.; Michael, L.; Nomura, K. I.; Orendt, A.; Tanner, L.
2014-12-01
The ACI-REF program, Advanced Cyberinfrastructure - Research and Education Facilitation, represents a consortium of academic institutions seeking to further advance the capabilities of their respective campus research communities through an extension of the personal connections and educational activities that underlie the unique and often specialized cyberinfrastructure at each institution. This consortium currently includes Clemson University, Harvard University, University of Hawai'i, University of Southern California, University of Utah, and University of Wisconsin. Working together in a coordinated effort, the consortium is dedicated to the adoption of models and strategies which leverage the expertise and experience of its members with a goal of maximizing the impact of each institution's investment in research computing. The ACI-REFs (facilitators) are tasked with making connections and building bridges between the local campus researchers and the many different providers of campus, commercial, and national computing resources. Through these bridges, ACI-REFs assist researchers from all disciplines in understanding their computing and data needs and in mapping these needs to existing capabilities or providing assistance with development of these capabilities. From the Earth sciences perspective, we will give examples of how this assistance improved methods and workflows in geophysics, geography and atmospheric sciences. We anticipate that this effort will expand the number of researchers who become self-sufficient users of advanced computing resources, allowing them to focus on making research discoveries in a more timely and efficient manner.
High-Performance Computing Data Center | Energy Systems Integration Facility | NREL
The Energy Systems Integration Facility's High-Performance Computing Data Center is home to Peregrine, the largest high-performance computing system in the world exclusively dedicated to advancing…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hasenkamp, Daren; Sim, Alexander; Wehner, Michael
Extensive computing power has been used to tackle issues such as climate change, fusion energy, and other pressing scientific challenges. These computations produce a tremendous amount of data; however, many of the data analysis programs currently run on only a single processor. In this work, we explore the possibility of using the emerging cloud computing platform to parallelize such sequential data analysis tasks. As a proof of concept, we wrap a program for analyzing trends of tropical cyclones in a set of virtual machines (VMs). This approach allows the user to keep their familiar data analysis environment in the VMs, while we provide the coordination and data transfer services to ensure the necessary input and output are directed to the desired locations. This work extensively exercises the networking capability of the cloud computing systems and has revealed a number of weaknesses in the current cloud system software. In our tests, we are able to scale the parallel data analysis job to a modest number of VMs and achieve a speedup comparable to running the same analysis task using MPI. However, compared to MPI-based parallelization, the cloud-based approach has a number of advantages. The cloud-based approach is more flexible because the VMs can capture arbitrary software dependencies without requiring the user to rewrite their programs. The cloud-based approach is also more resilient to failure: as long as a single VM is running it can make progress, whereas as soon as one MPI node fails the whole analysis job fails. In short, this initial work demonstrates that a cloud computing system is a viable platform for distributed scientific data analyses traditionally conducted on dedicated supercomputing systems.
Basrowi, Ray W; Sulistomo, Astrid B; Adi, Nuri Purwito; Vandenplas, Yvan
2015-06-01
A mother's working environment is believed to be a major determinant of exclusive breastfeeding (EBF) practice. We aimed to define the influence of a facility dedicated to breastfeeding and a breastfeeding support program at the workplace on breastfeeding practice. A cross-sectional study was performed in five workplaces. The inclusion criteria were female workers whose last child was between 6 and 36 months old. Observational data were obtained and a questionnaire was filled out. The World Health Organization definition for EBF was used. Data from 186 subjects (74 office workers and 112 factory workers) were collected. Just over half (52%) of the mothers were between 20 and 46 years old, 75.3% had graduated from high school or university, 12.9% had more than two children and 36.0% owned a house. The prevalence of EBF during the last 6 months was 32.3%. A proper dedicated breastfeeding facility was available for 21.5% of the mothers, but only 7.5% had been in contact with a breastfeeding support program. The presence of a dedicated breastfeeding facility increased EBF practice almost threefold, by an odds ratio (OR) of 2.74 and a 95% confidence interval (CI) of 1.34-5.64 (p<0.05). Knowledge of the breastfeeding support program increased EBF practice by almost six times (OR, 5.93; 95% CI, 1.78-19.79) (p<0.05). Our findings suggest that governments should make it obligatory for employers to offer a breastfeeding support program and a dedicated breastfeeding facility at the workplace, as these simple measures significantly increase EBF.
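For illustration, an odds ratio like OR = 2.74 (95% CI 1.34-5.64) comes from standard 2×2 contingency-table arithmetic with a log-scale (Woolf) confidence interval. The counts below are hypothetical, not the study's actual cell counts.

```python
import math

# Odds ratio and 95% CI (Woolf/log method) from a hypothetical 2x2 table.
# Counts are illustrative only, NOT the study's data.
#                 EBF   no EBF
# facility         a      b
# no facility      c      d
a, b, c, d = 20, 20, 40, 106

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)       # SE of ln(OR)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
```

An OR above 1 with a CI excluding 1, as reported in the study, indicates a statistically significant positive association.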
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grulke, Eric; Stencel, John
2011-09-13
The KY DOE EPSCoR Program supports two research clusters. The Materials Cluster uses unique equipment and computational methods that involve research expertise at the University of Kentucky and University of Louisville. This team determines the physical, chemical and mechanical properties of nanostructured materials and examines the dominant mechanisms involved in the formation of new self-assembled nanostructures. State-of-the-art parallel computational methods and algorithms are used to overcome current limitations of processing that otherwise are restricted to small system sizes and short times. The team also focuses on developing and applying advanced microtechnology fabrication techniques and the application of microelectromechanical systems (MEMS) for creating new materials, novel microdevices, and integrated microsensors. The second research cluster concentrates on High Energy and Nuclear Physics. It connects research and educational activities at the University of Kentucky, Eastern Kentucky University and national DOE research laboratories. Its vision is to establish world-class research status dedicated to experimental and theoretical investigations in strong interaction physics. The research provides a forum, facilities, and support for scientists to interact and collaborate in subatomic physics research. The program enables increased student involvement in fundamental physics research through the establishment of graduate fellowships and collaborative work.
ERIC Educational Resources Information Center
Stiegler, Sam
2008-01-01
This interview-based essay explores how a teacher-training program, while ostensibly dedicated to the idea of teaching for social justice, completely neglected issues of homophobia and heterosexism. How did silence around queer issues leave a dedicated group of young, queer teachers-in-training without the academic, intellectual, or psychological…
Santos, Jonathan; Chaudhari, Abhijit J; Joshi, Anand A; Ferrero, Andrea; Yang, Kai; Boone, John M; Badawi, Ramsey D
2014-09-01
Dedicated breast CT and PET/CT scanners provide detailed 3D anatomical and functional imaging data sets and are currently being investigated for applications in breast cancer management such as diagnosis, monitoring response to therapy and radiation therapy planning. Our objective was to evaluate the performance of the diffeomorphic demons (DD) non-rigid image registration method to spatially align 3D serial (pre- and post-contrast) dedicated breast computed tomography (CT), and longitudinally-acquired dedicated 3D breast CT and positron emission tomography (PET)/CT images. The algorithmic parameters of the DD method were optimized for the alignment of dedicated breast CT images using training data and then held fixed. The performance of the method for image alignment was quantitatively evaluated using three separate data sets: (1) serial breast CT pre- and post-contrast images of 20 women, (2) breast CT images of 20 women acquired before and after repositioning the subject on the scanner, and (3) dedicated breast PET/CT images of 7 women undergoing neo-adjuvant chemotherapy acquired pre-treatment and after 1 cycle of therapy. The DD registration method outperformed no registration (p < 0.001) and conventional affine registration (p ≤ 0.002) for serial and longitudinal breast CT and PET/CT image alignment. In spite of the large size of the imaging data, the computational cost of the DD method was found to be reasonable (3-5 min). Co-registration of dedicated breast CT and PET/CT images can be performed rapidly and reliably using the DD method. This is the first study evaluating the DD registration method for the alignment of dedicated breast CT and PET/CT images. Copyright © 2014 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Dedicated short-range communications roadside unit specifications.
DOT National Transportation Integrated Search
2017-04-28
The Intelligent Transportation Systems (ITS) Program definition of connected vehicles includes both 5.9 Gigahertz (GHz) Dedicated Short Range Communications (DSRC) and non-DSRC technologies as means of facilitating communication for vehicle-to-vehicl...
Press Site Auditorium dedicated to John Holliman
NASA Technical Reports Server (NTRS)
1999-01-01
A ceremony dedicated the KSC Press Site auditorium as the John Holliman Auditorium to honor the correspondent for his enthusiastic, dedicated coverage of America's space program. The auditorium was built in 1980 and has been the focal point for news coverage of Space Shuttle launches. The ceremony followed the 94th launch of a Space Shuttle, on mission STS-96, earlier in the morning.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alvarez, R.A.; Yost, D.M.
1995-11-01
A technology demonstration program of dedicated compressed natural gas (CNG) original equipment manufacturer (OEM) vehicles was conducted at Ft. Bliss, Texas, to demonstrate the use of CNG as an alternative fuel. The demonstration program at Ft. Bliss was the first Army initiative with CNG-fueled vehicles under the legislated Alternative Motor Fuels Act. This Department of Energy (DOE)-supported fleet demonstration consisted of 48 General Services Administration (GSA)-owned, Army-leased 1992 dedicated CNG General Motors (GM) 3/4-ton pickup trucks and four 1993 gasoline-powered Chevrolet 3/4-ton pickup trucks.
Preface for the special issue of Mathematical Biosciences and Engineering, BIOCOMP 2012.
Buonocore, Aniello; Di Crescenzo, Antonio; Hastings, Alan
2014-04-01
The International Conference "BIOCOMP2012 - Mathematical Modeling and Computational Topics in Biosciences" was held in Vietri sul Mare (Italy), June 4-8, 2012. It was dedicated to the memory of Professor Luigi M. Ricciardi (1942-2011), who was a visionary and tireless promoter of the three previous editions of the BIOCOMP conference series. We thought that the best way to honor his memory was to continue the BIOCOMP program. Over the years, this conference promoted scientific activities related to his wide interests and scientific expertise, which ranged over various areas of application of mathematics, probability and statistics to biosciences and cybernetics, with emphasis also on computational problems. We are pleased that many of his friends and colleagues, as well as many other scientists, were attracted by the goals of this recent event and offered to contribute to its success.
Measurement system for nitrous oxide based on amperometric gas sensor
NASA Astrophysics Data System (ADS)
Siswoyo, S.; Persaud, K. C.; Phillips, V. R.; Sneath, R.
2017-03-01
Nitrous oxide is well known to be an important greenhouse gas, so monitoring and control of its concentration and emission are very important. In this work a nitrous oxide measurement system has been developed, consisting of an amperometric sensor and an appropriate lab-made potentiostat capable of measuring currents in the picoampere range. The sensor was constructed using a gold microelectrode as the working electrode surrounded by a silver wire as a quasi-reference electrode, with tetraethylammonium perchlorate and dimethylsulphoxide as the supporting electrolyte and solvent, respectively. The lab-made potentiostat was built incorporating a transimpedance amplifier capable of picoampere measurements, together with a microcontroller-based data acquisition system controlled by a host personal computer running a dedicated computer program. The system was capable of detecting N2O concentrations down to 0.07% v/v.
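The transimpedance stage mentioned above converts a tiny sensor current into a measurable voltage. The sketch below shows only the conversion arithmetic; the feedback resistance is a hypothetical value (the abstract does not state the actual gain), chosen to illustrate why picoampere measurements demand a very large feedback resistor.

```python
# Transimpedance readout: the amplifier produces v_out = -i_in * R_f,
# so the sensor current is recovered as i_in = -v_out / R_f.
# R_F_OHMS is a hypothetical value for illustration only.
R_F_OHMS = 1e9  # assumed 1 gigaohm feedback resistor for pA-range currents

def current_from_voltage(v_out):
    return -v_out / R_F_OHMS

# With this gain, a -50 mV amplifier output corresponds to +50 pA of
# sensor current, comfortably within a picoammeter's working range.
i_sensor = current_from_voltage(-0.05)
```

The larger the feedback resistance, the larger the voltage produced per picoampere, which is why such front ends can resolve currents far below what a plain shunt resistor could.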
Optimizing R with SparkR on a commodity cluster for biomedical research.
Sedlmayr, Martin; Würfl, Tobias; Maier, Christian; Häberle, Lothar; Fasching, Peter; Prokosch, Hans-Ulrich; Christoph, Jan
2016-12-01
Medical researchers are challenged today by the enormous amount of data collected in healthcare. Analysis methods such as genome-wide association studies (GWAS) are often computationally intensive and thus require enormous resources to be performed in a reasonable amount of time. While dedicated clusters and public clouds may deliver the desired performance, their use requires upfront financial efforts or anonymous data, which is often not possible for preliminary or occasional tasks. We explored the possibilities of building a private, flexible cluster for processing scripts in R based on commodity, non-dedicated hardware of our department. For this, a GWAS calculation in R on a single desktop computer, a Message Passing Interface (MPI) cluster, and a SparkR cluster were compared with regard to performance, scalability, quality, and simplicity. The original script had a projected runtime of three years on a single desktop computer. Optimizing the script in R already yielded a significant reduction in computing time (2 weeks). By using R-MPI and SparkR, we were able to parallelize the computation and reduce the time to less than three hours (2.6 h) on already available, standard office computers. While MPI is a proven approach in high-performance clusters, it requires rather static, dedicated nodes. SparkR and its Hadoop siblings allow for a dynamic, elastic environment with automated failure handling. SparkR also scales better with the number of nodes in the cluster than MPI due to optimized data communication. R is a popular environment for clinical data analysis. The new SparkR solution offers elastic resources and allows supporting big data analysis using R even on non-dedicated resources with minimal change to the original code. To unleash the full potential, additional efforts should be invested to customize and improve the algorithms, especially with regard to data distribution. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
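The parallelization pattern behind the SparkR speedup is a map of an independent per-marker computation across workers. The sketch below illustrates only that pattern using the Python standard library; it is not SparkR code, and the per-marker statistic is a stand-in for a real association test.

```python
from concurrent.futures import ThreadPoolExecutor

def analyze_marker(genotypes):
    # Stand-in statistic (mean genotype dosage); a real GWAS would fit a
    # regression model per marker.
    return sum(genotypes) / len(genotypes)

# Each inner list is the genotype vector for one marker (illustrative data).
markers = [[0, 1, 2, 1], [2, 2, 1, 0], [0, 0, 1, 1]]

# The same map that Spark distributes across cluster nodes, run here on
# local worker threads. Because markers are independent, results keep
# their order and tasks can be retried individually on failure.
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(analyze_marker, markers))
```

Independence between tasks is what lets frameworks like Spark add elasticity and failure handling without any change to the per-marker logic.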
Proceedings of RIKEN BNL Research Center Workshop: The Physics of p ↑+A Collisions at RHIC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bai, Mei; Goto, Yuji; Heppelmann, Steve
The RIKEN BNL Research Center (RBRC) was established in April 1997 at Brookhaven National Laboratory. It is funded by the "Rikagaku Kenkyusho" (RIKEN, The Institute of Physical and Chemical Research) of Japan. The Memorandum of Understanding between RIKEN and BNL, initiated in 1997, has been renewed in 2002, 2007 and again in 2012. The Center is dedicated to the study of strong interactions, including spin physics, lattice QCD, and RHIC physics, through the nurturing of a new generation of young physicists. The RBRC has both a theory and an experimental component. The RBRC Theory Group and the RBRC Experimental Group consist of a total of 25-30 researchers. Positions include the following: full-time RBRC Fellow, half-time RHIC Physics Fellow, and full-time postdoctoral Research Associate. The RHIC Physics Fellows hold joint appointments with RBRC and other institutions and have tenure-track positions at their respective universities or BNL. To date, RBRC has over 95 graduates (Fellows and Postdocs), of which approximately 40 theorists and 20 experimenters have already attained tenure positions at major institutions worldwide. Beginning in 2001 a new RIKEN Spin Program (RSP) category was implemented at RBRC. These appointments are joint positions of RBRC and RIKEN and include the following positions in theory and experiment: RSP Researchers, RSP Research Associates, and Young Researchers, who are mentored by senior RBRC scientists. A number of RIKEN Jr. Research Associates and Visiting Scientists also contribute to the physics program at the Center. RBRC has an active workshop program on strong interaction physics, with each workshop focused on a specific physics problem. In most cases all the talks are made available on the RBRC website. In addition, highlights of each speaker's presentation are collected to form proceedings, which can therefore be made available within a short time after the workshop.
To date there are over one hundred proceedings volumes available. A 10-teraflops RBRC QCDOC computer funded by RIKEN, Japan, was unveiled at a dedication ceremony at BNL on May 26, 2005. This supercomputer was designed and built by individuals from Columbia University, IBM, BNL, RBRC, and the University of Edinburgh, with the U.S. D.O.E. Office of Science providing infrastructure support at BNL. Physics results were reported at the RBRC QCDOC Symposium following the dedication. QCDSP, a 0.6-teraflops parallel processor dedicated to lattice QCD, was begun at the Center on February 19, 1998, was completed on August 28, 1998, and was decommissioned in 2006. It was awarded the Gordon Bell Prize for price performance in 1998. QCDOC was decommissioned in May 2012. The next generation computer in this sequence, QCDCQ (600 teraflops), is currently operational and is expected to produce many more interesting discoveries in the future.
Is There Computer Graphics after Multimedia?
ERIC Educational Resources Information Center
Booth, Kellogg S.
Computer graphics has been driven by the desire to generate real-time imagery subject to constraints imposed by the human visual system. The future of computer graphics, when off-the-shelf systems have full multimedia capability and when standard computing engines render imagery faster than real-time, remains to be seen. A dedicated pipeline for…
Portable Microcomputer Utilization for On-Line Pulmonary Testing
Pugh, R.; Fourre, J.; Karetzky, M.
1981-01-01
A host-remote pulmonary function testing system is described that is flexible, non-dedicated, inexpensive, and readily upgradable. It is applicable for laboratories considering computerization as well as for those which have converted to one of the already available but restricted systems. The remote unit has an 8 slot bus for memory, input-output boards, and an A-D converter. It has its own terminal for manual input and display of computed and measured data which is transmitted via an acoustic modem to a larger microcomputer. The program modules are written in Pascal-Z and/or the supplied Z-80 macro assembler as external procedures.
Samant, Sanjiv S; Xia, Junyi; Muyan-Ozcelik, Pinar; Owens, John D
2008-08-01
The advent of readily available temporal imaging or time series volumetric (4D) imaging has become an indispensable component of treatment planning and adaptive radiotherapy (ART) at many radiotherapy centers. Deformable image registration (DIR) is also used in other areas of medical imaging, including motion-corrected image reconstruction. Due to long computation times, clinical applications of DIR in radiation therapy and elsewhere have been limited and consequently relegated to offline analysis. With the recent advances in hardware and software, graphics processing unit (GPU) based computing is an emerging technology for general purpose computation, including DIR, and is suitable for highly parallelized computing. However, traditional general purpose computation on the GPU is limited because of the constraints of the available programming platforms. As well, compared to CPU programming, the GPU currently has reduced dedicated processor memory, which can limit the useful working data set for parallelized processing. We present an implementation of the demons algorithm using the NVIDIA 8800 GTX GPU and the new CUDA programming language. The GPU performance will be compared with single-threading and multithreading CPU implementations on an Intel dual core 2.4 GHz CPU using the C programming language. CUDA provides a C-like programming interface and allows for direct access to the highly parallel compute units in the GPU. Comparisons for volumetric clinical lung images acquired using 4DCT were carried out. Computation times for 100 iterations in the range of 1.8-13.5 s were observed for the GPU with image sizes ranging from 2.0 x 10^6 to 14.2 x 10^6 pixels. The GPU registration was 55-61 times faster than the CPU for the single-threading implementation, and 34-39 times faster for the multithreading implementation. For CPU-based computing, the computational time generally has a linear dependence on image size for medical imaging data.
Computational efficiency is characterized in terms of time per megapixel per iteration (TPMI), with units of seconds per megapixel per iteration (spmi). For the demons algorithm, our CPU implementation yielded largely invariant values of TPMI. The mean TPMIs were 0.527 spmi and 0.335 spmi for the single-threading and multithreading cases, respectively, with <2% variation over the considered image data range. For GPU computing, we achieved TPMI = 0.00916 spmi with 3.7% variation, indicating optimized memory handling under CUDA. The paradigm of GPU-based real-time DIR opens up a host of clinical applications for medical imaging.
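The TPMI figure of merit is simple arithmetic, and the reported numbers can be cross-checked from the quoted timings (here using the largest GPU case in the text: 13.5 s for 100 iterations on a 14.2-megapixel image):

```python
def tpmi(seconds, megapixels, iterations):
    # Time per megapixel per iteration, in seconds (spmi).
    return seconds / (megapixels * iterations)

# Largest reported GPU case: 13.5 s, 100 iterations, 14.2 megapixels.
gpu_tpmi = tpmi(13.5, 14.2, 100)  # ~0.0095 spmi, close to the 0.00916 mean
```

That the per-case value lands near the reported mean is consistent with the stated ~3.7% variation of TPMI on the GPU.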
Scintillator performance considerations for dedicated breast computed tomography
NASA Astrophysics Data System (ADS)
Vedantham, Srinivasan; Shi, Linxi; Karellas, Andrew
2017-09-01
Dedicated breast computed tomography (BCT) is an emerging clinical modality that can eliminate tissue superposition and has the potential for improved sensitivity and specificity for breast cancer detection and diagnosis. It is performed without physical compression of the breast. Most of the dedicated BCT systems use large-area detectors operating in cone-beam geometry and are referred to as cone-beam breast CT (CBBCT) systems. The large-area detectors in CBBCT systems are energy-integrating, indirect-type detectors employing a scintillator that converts x-ray photons to light, followed by detection of optical photons. A key consideration that determines the image quality achieved by such CBBCT systems is the choice of scintillator and its performance characteristics. In this work, a framework for analyzing the impact of the scintillator on CBBCT performance and its use for task-specific optimization of CBBCT imaging performance is described.
ERIC Educational Resources Information Center
Milton, Penny
2008-01-01
The Canadian Education Association (CEA) was commissioned by Hewlett-Packard Canada to create a case study describing the development, implementation and outcomes of New Brunswick's Dedicated Notebook Research Project. The New Brunswick Department of Education designed its research project to assess impacts on teaching and learning of dedicated…
Crew interface analysis: Selected articles on space human factors research, 1987 - 1991
NASA Technical Reports Server (NTRS)
Bagian, Tandi (Compiler)
1993-01-01
As part of the Flight Crew Support Division at NASA, the Crew Interface Analysis Section is dedicated to the study of human factors in the manned space program. It assumes a specialized role that focuses on answering operational questions pertaining to NASA's Space Shuttle and Space Station Freedom Programs. One of the section's key contributions is to provide knowledge and information about human capabilities and limitations that promote optimal spacecraft and habitat design and use to enhance crew safety and productivity. The section provides human factors engineering for the ongoing missions as well as proposed missions that aim to put human settlements on the Moon and Mars. Research providing solutions to operational issues is the primary objective of the Crew Interface Analysis Section. The studies represent such subdisciplines as ergonomics, space habitability, man-computer interaction, and remote operator interaction.
Integrated Digital Flight Control System for the Space Shuttle Orbiter
NASA Technical Reports Server (NTRS)
1973-01-01
The objective of the integrated digital flight control system (DFCS) is to provide rotational and translational control of the space shuttle orbiter in all phases of flight: from launch ascent through orbit to entry and touchdown, and during powered horizontal flights. The program provides a versatile control system structure while maintaining uniform communications with other programs, sensors, and control effectors by using an executive routine/functional subroutine format. The program reads all external variables at a single point, copies them into its dedicated storage, and then calls the required subroutines in the proper sequence. As a result, the flight control program is largely independent of other programs in the computer complex and is equally insensitive to characteristics of the processor configuration. The integrated structure of the control system is described, along with the DFCS executive routine that embodies that structure. The input and output, including jet selection, are covered. Specific estimation and control algorithms are shown for the various mission phases: cruise (including horizontal powered flight), entry, on-orbit, and boost. Attitude maneuver routines that interface with the DFCS are included.
Impacts: NIST Building and Fire Research Laboratory (technical and societal)
NASA Astrophysics Data System (ADS)
Raufaste, N. J.
1993-08-01
The Building and Fire Research Laboratory (BFRL) of the National Institute of Standards and Technology (NIST) is dedicated to the life cycle quality of constructed facilities. The report describes major effects of BFRL's program on building and fire research. Contents of the document include: structural reliability; nondestructive testing of concrete; structural failure investigations; seismic design and construction standards; rehabilitation codes and standards; alternative refrigerants research; HVAC simulation models; thermal insulation; residential equipment energy efficiency; residential plumbing standards; computer image evaluation of building materials; corrosion-protection for reinforcing steel; prediction of the service lives of building materials; quality of construction materials laboratory testing; roofing standards; simulating fires with computers; fire safety evaluation system; fire investigations; soot formation and evolution; cone calorimeter development; smoke detector standards; standard for the flammability of children's sleepwear; smoldering insulation fires; wood heating safety research; in-place testing of concrete; communication protocols for building automation and control systems; computer simulation of the properties of concrete and other porous materials; cigarette-induced furniture fires; carbon monoxide formation in enclosure fires; halon alternative fire extinguishing agents; turbulent mixing research; materials fire research; furniture flammability testing; standard for the cigarette ignition resistance of mattresses; support of navy firefighter trainer program; and using fire to clean up oil spills.
Salvador, R; Luque, M P; Ciudin, A; Paño, B; Buñesch, L; Sebastia, C; Nicolau, C
2016-01-01
To prospectively evaluate the usefulness of dual-energy computed tomography (DECT), with and without dedicated software, in identifying uric acid kidney stones in vivo. We studied 65 kidney stones in 63 patients. All stones were analyzed in vivo by DECT and ex vivo by spectrophotometry. We evaluated the diagnostic performance in identifying uric acid stones with DECT by analyzing the radiologic densities with dedicated software and without it (through manual measurements), as well as by analyzing the attenuation ratios of the stones at the two energies with and without the dedicated software. The six uric acid stones included were correctly identified by evaluating the attenuation ratios with a cutoff of 1.21, both with the dedicated software and without it, yielding perfect diagnostic performance without false positives or false negatives. The study of the attenuations of the stones obtained the following areas under the receiver operating characteristic curves in the classification of the uric acid stones: 0.92 for the measurements done with the software and 0.89 for the manual measurements; a cutoff of 538 HU yielded 84% (42/50) diagnostic accuracy for the software and 83.1% (54/65) for the manual measurements. DECT enabled the uric acid stones to be identified correctly through the calculation of the ratio of the attenuations at the two energies. The results obtained with the dedicated software were similar to those obtained manually. Copyright © 2015 SERAM. Published by Elsevier España, S.L.U. All rights reserved.
76 FR 61717 - Government-Owned Inventions; Availability for Licensing
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-05
... computer science based technology that may provide the capability of detecting untoward events such as... is comprised of a dedicated computer server that executes specially designed software with input data... computer assisted clinical ordering. J Biomed Inform. 2003 Feb-Apr;36(1-2):4-22. [PMID 14552843...
Using Flash Technology for Motivation and Assessment
ERIC Educational Resources Information Center
Deal, Walter F., III
2004-01-01
A visit to most any technology education laboratory or classroom will reveal that computers, software, and multimedia software are rapidly becoming a mainstay in learning about technology and technological literacy. Almost all technology labs have at least several computers dedicated to specialized software or hardware such as Computer-aided…
Genomics to feed a switchgrass breeding program
USDA-ARS?s Scientific Manuscript database
Development of improved cultivars is one of three pillars, along with sustainable production and efficient conversion, required for dedicated cellulosic bioenergy crops to succeed. Breeding new cultivars is a long, slow process requiring patience, dedication, and motivation to realize gains and adva...
Resource Isolation Method for Program's Performance on CMP
NASA Astrophysics Data System (ADS)
Guan, Ti; Liu, Chunxiu; Xu, Zheng; Li, Huicong; Ma, Qiang
2017-10-01
Data centers and cloud computing are increasingly popular, benefiting both customers and providers. However, in a data center or cluster there is commonly more than one program running on a server, and programs may interfere with each other. The interference may have only a small effect, but it may also cause a serious drop in performance. To avoid this performance interference problem, isolating resources for different programs is a better choice. In this paper we propose a lightweight resource isolation method to improve a program's performance. The method uses Cgroups to set dedicated CPU and memory resources for a program, aiming to guarantee the program's performance. Three engines realize this method: the Program Monitor Engine tracks the program's CPU and memory usage and transfers the information to the Resource Assignment Engine; the Resource Assignment Engine calculates the amount of CPU and memory resources that should be allocated to the program; and the Cgroups Control Engine partitions resources using the Linux tool Cgroups and places the program in a control group for execution. Experimental results show that with the proposed resource isolation method, a program's performance can be improved.
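The Resource Assignment Engine's role can be sketched as a pure function from observed usage to the values the Cgroups Control Engine would write into the control group. This is a minimal sketch under stated assumptions: the 1.5x headroom factor and the cgroups-v1 file names (`cpuset.cpus`, `memory.limit_in_bytes`) are illustrative choices, not details from the paper:

```python
import math

def assign_resources(observed_cpu_cores: float, observed_mem_bytes: int,
                     headroom: float = 1.5) -> dict:
    """Resource Assignment Engine (sketch): from the peak CPU and memory
    usage reported by the Program Monitor Engine, compute the dedicated
    allocation to apply through Cgroups. Headroom factor is an assumption."""
    cpus = max(1, math.ceil(observed_cpu_cores * headroom))
    mem = int(observed_mem_bytes * headroom)
    return {
        # cpuset.cpus expects a core list such as "0-3" (cgroups v1)
        "cpuset.cpus": "0-{}".format(cpus - 1) if cpus > 1 else "0",
        # memory.limit_in_bytes caps the group's memory (cgroups v1)
        "memory.limit_in_bytes": str(mem),
    }

# A program observed using 2.4 cores and 1 GiB of memory:
settings = assign_resources(observed_cpu_cores=2.4,
                            observed_mem_bytes=1_073_741_824)
print(settings)
```

In a real deployment these values would be written under `/sys/fs/cgroup/` (root required) before moving the program's PID into the group.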
A Sign Language Screen Reader for Deaf
NASA Astrophysics Data System (ADS)
El Ghoul, Oussama; Jemni, Mohamed
Screen reader technology first appeared to allow blind people and people with reading difficulties to use computers and to access digital information. Until now, this technology has been exploited mainly to help the blind community. During our work with deaf people, we noticed that a screen reader can facilitate the manipulation of computers and the reading of textual information. In this paper, we propose a novel screen reader dedicated to deaf users. The output of the reader is a visual translation of the text into sign language. The screen reader is composed of two essential modules: the first is designed to capture the activities of users (mouse and keyboard events). For this purpose, we adopted the Microsoft MSAA application programming interfaces. The second module, which in classical screen readers is a text-to-speech (TTS) engine, is replaced by a novel text-to-sign (TTSign) engine. This module converts text into sign language animation based on avatar technology.
Real-time control of the robotic lunar observatory telescope
Anderson, J.M.; Becker, K.J.; Kieffer, H.H.; Dodd, D.N.
1999-01-01
The US Geological Survey operates an automated observatory dedicated to the radiometry of the Moon with the objective of developing a multispectral, spatially resolved photometric model of the Moon to be used in the calibration of Earth-orbiting spacecraft. Interference filters are used with two imaging instruments to observe the Moon in 32 passbands from 350-2500 nm. Three computers control the telescope mount and instruments with a fourth computer acting as a master system to control all observation activities. Real-time control software has been written to operate the instrumentation and to automate the observing process. The observing software algorithms use information including the positions of objects in the sky, the phase of the Moon, and the times of evening and morning twilight to decide how to observe program objects. The observatory has been operating in a routine mode since late 1995 and is expected to continue through at least 2002 without significant modifications.
NASA Astrophysics Data System (ADS)
Barbasiewicz, Adrianna; Widerski, Tadeusz; Daliga, Karol
2018-01-01
This article was created as a result of research conducted within a master's thesis. The purpose of the measurements was to analyze the accuracy of the positioning of points by computer programs. The selected software was specialized computer software dedicated to photogrammetric work. For comparative purposes it was decided to use tools with similar functionality. The resolution of the photos on which the key points were searched was selected as the basic parameter affecting the results. To determine the location of the measured points, it was decided to follow the photogrammetric resection rule. To automate the measurement, measurement session planning was omitted. The coordinates of the points collected by tachymetric measurement were used as the reference system. The resulting deviations and linear displacements are on the order of millimeters. The visual aspects of the point clouds have also been briefly analyzed.
[Physical activity, sedentary leisure, short sleeping and childhood overweight].
Amigo Vázquez, Isaac; Busto Zapico, Raquel; Herrero Díez, Javier; Fernández Rodríguez, Concepción
2008-11-01
In this study, using path analysis, the relations among physical activity, non-regulated activity, sedentary leisure, hours of sleep, and the body mass index (BMI) were analyzed. The sample was made up of 103 students, 59 girls and 44 boys, aged between 9 and 10 1/2 years. An individual interview was performed in which the children were asked about the TV programs they watched each day of the week, the time they played with the console and the computer, and the time dedicated to sports, games, and other activities. The results showed that sedentary leisure (number of hours of TV, computer, and console) maintains a significant and inverse relation with hours of sleep, non-regulated activity (games and other activities), and physical sport activity. The difference between the results of this study and the previous one is discussed, taking into account the recruitment procedure of the participants.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCoy, M; Kissel, L
2002-01-29
We are experimenting with a new computing model to be applied to a new computer dedicated to that model. Several LLNL science teams now have computational requirements, evidenced by the mature scientific applications that have been developed over the past five-plus years, that far exceed the capability of the institution's computing resources. Thus, there is increased demand for dedicated, powerful parallel computational systems. Computation can, in the coming year, potentially field a capability system that is low cost because it will be based on a model that employs open source software and because it will use PC (IA32-P4) hardware. This incurs significant computer science risk regarding stability and system features but also presents great opportunity. We believe the risks can be managed, but the existence of risk cannot be ignored. In order to justify the budget for this system, we need to make the case that it serves science and, through serving science, serves the institution. That is the point of the meeting and the White Paper that we are proposing to prepare. The questions are listed and the responses received are in this report.
Commercial grade item (CGI) dedication of MDR relays for nuclear safety related applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Das, R.K.; Julka, A.; Modi, G.
1994-08-01
MDR relays manufactured by Potter and Brumfield (P and B) have been used in various safety-related applications in commercial nuclear power plants. These include emergency safety features (ESF) actuation systems, emergency core cooling systems (ECCS) actuation, and reactor protection systems. MDR relays manufactured prior to May 1990 showed signs of generic failure due to corrosion and outgassing of coil varnish. P and B has made design changes to correct these problems in relays manufactured after May 1990. However, P and B does not manufacture the relays under any 10CFR50 Appendix B quality assurance (QA) program. They manufacture the relays under their commercial QA program and supply them as commercial grade items. This necessitates CGI dedication of these relays for use in nuclear-safety-related applications. This paper presents a CGI dedication program that has been used to dedicate the MDR relays manufactured after May 1990. The program is in compliance with current Nuclear Regulatory Commission (NRC) and Electric Power Research Institute (EPRI) guidelines and applicable industry standards; it specifies the critical characteristics of the relays, provides the tests and analyses required to verify the critical characteristics along with the acceptance criteria for the test results, performs source verification to qualify P and B for its control of the critical characteristics, and provides documentation. The program provides reasonable assurance that the new MDR relays will perform their intended safety functions.
HoloNetwork: communicating science through holography
NASA Astrophysics Data System (ADS)
Pombo, Pedro; Santos, Emanuel; Magalhães, Carolina
2017-03-01
Since 1997 a program dedicated to holography has been developed and implemented in Portugal. The program started with a focus on schools and science education. The HoloNetwork was created and has spread at a national level, involving a group of thirty schools and hundreds of students and teachers. In 2009 the network began working toward a new target, the general public. With this goal, a larger program was developed, focused on science and society and on science communication through holography. For the implementation of this new program, special holography outreach activities were built, dedicated to informal learning, and seven science centers around Portugal were added to the HoloNetwork. In recent years, our work on holography has followed two main branches: one dedicated to schools, aimed at promoting physics teaching and teaching how to make holograms, and another dedicated to society, aimed at promoting holography and increasing scientific literacy. This paper analyzes the educational program, all holography outreach activities, exhibitions and events, and all equipment, materials, and setups used, and presents the holographic techniques explored with students and with the public. Finally, the results obtained in this work are presented and explored, with a focus on student impact and outcomes, taking into account public engagement with holography and its effect on scientific culture, and analyzing the quality of the holograms made by students and by the general public.
Innovative approach for low-cost quick-access small payload missions
NASA Astrophysics Data System (ADS)
Friis, Jan W., Jr.
2000-11-01
A significant part of the burgeoning commercial space industry is placing an unprecedented number of satellites into low earth orbit for a variety of new applications and services. By some estimates the commercial space industry now exceeds that of government space activities. Yet the two markets remain largely separate, with each deploying dedicated satellites and infrastructure for their respective missions. One commercial space firm, Final Analysis, has created a new program wherein government, scientific, or new-technology payloads can be integrated on commercial satellites for a variety of mission scenarios at a fraction of the cost of a dedicated mission. NASA has recognized the advantage of this approach, and has awarded the Quick Ride program to provide frequent, low cost flight opportunities for small independent payloads aboard the Final Analysis constellation, and investigators are rapidly developing science programs that conform to the proposed payload accommodations envelope. Missions that were not feasible using dedicated launches are now receiving approval under the lower cost Quick Ride approach. Final Analysis has dedicated ten of its thirty-eight satellites in support of the Quick Ride effort. The benefits of this type of space access extend beyond NASA science programs. Commercial space firms can now gain valuable flight heritage for new technology and satellite product offerings. Further, emerging international space programs can now place a payload in orbit, enabling a country to allocate its resources to the payload and mission requirements rather than the increased launch costs of a dedicated spacecraft. Finally, the low cost nature provides university-based research and educational opportunities previously out of the reach of most space-related budgets. This paper describes the motivation, benefits, technical features, and program costs of the Final Analysis secondary payload program.
Payloads can be accommodated on up to thirty-eight separate satellites. Since the secondary payloads will fly on satellites designed for global wireless data services, each user can utilize the low cost communication system already in place for sending and retrieving digital information from its payload.
Casella, Ivan Benaduce; Fukushima, Rodrigo Bono; Marques, Anita Battistini de Azevedo; Cury, Marcus Vinícius Martins; Presti, Calógero
2015-03-01
To compare a new dedicated software program (IMTPC) and Adobe Photoshop for gray-scale median (GSM) analysis of B-mode images of carotid plaques. A series of 42 carotid plaques generating ≥50% diameter stenosis was evaluated by a single observer. The best segment for visualization of internal carotid artery plaque was identified on a single longitudinal view and images were recorded in JPEG format. Plaque analysis was performed by both programs. After normalization of image intensity (blood = 0, adventitial layer = 190), histograms were obtained after manual delineation of the plaque. Results were compared with the nonparametric Wilcoxon signed rank test and Kendall tau-b correlation analysis. GSM ranged from 0 to 100 with Adobe Photoshop and from 0 to 96 with IMTPC, with a high degree of similarity between image pairs and a highly significant correlation (R = 0.94, p < .0001). The IMTPC software appears suitable for GSM analysis of carotid plaques. © 2014 Wiley Periodicals, Inc.
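The normalization step described above (blood mapped to 0, adventitia to 190) followed by the median of the plaque pixels can be sketched in a few lines. A minimal stdlib sketch; the pixel values and reference intensities are hypothetical, and real implementations operate on the manually delineated plaque region of the image:

```python
from statistics import median

def normalized_gsm(plaque_pixels, blood_ref, adventitia_ref):
    """Linearly rescale intensities so that blood -> 0 and the adventitial
    layer -> 190, clamp to the 8-bit range, and return the gray-scale
    median (GSM) of the plaque pixels."""
    scale = 190.0 / (adventitia_ref - blood_ref)
    normalized = [(p - blood_ref) * scale for p in plaque_pixels]
    clamped = [min(255.0, max(0.0, v)) for v in normalized]
    return median(clamped)

# Hypothetical plaque intensities, with blood measured at 10 and the
# adventitia at 200 on the raw image:
gsm = normalized_gsm([30, 55, 70, 95, 120], blood_ref=10, adventitia_ref=200)
print(gsm)
```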
Applications of artificial intelligence to mission planning
NASA Technical Reports Server (NTRS)
Ford, Donnie R.; Rogers, John S.; Floyd, Stephen A.
1990-01-01
The scheduling problem facing NASA-Marshall mission planning is extremely difficult for several reasons. The most critical factor is the computational complexity involved in developing a schedule. The size of the search space is large along some dimensions and infinite along others. Because of this and other difficulties, many conventional operations research techniques are infeasible or inadequate to solve the problems by themselves. Therefore, the purpose is to examine various artificial intelligence (AI) techniques to assist or replace conventional techniques. The specific tasks performed were as follows: (1) to identify mission planning applications for object-oriented and rule-based programming; (2) to investigate interfacing AI-dedicated hardware (Lisp machines) to VAX hardware; (3) to demonstrate how Lisp may be called from within FORTRAN programs; (4) to investigate and report on programming techniques used in some commercial AI shells, such as the Knowledge Engineering Environment (KEE); and (5) to study and report on algorithmic methods to reduce complexity as related to AI techniques.
GPU based framework for geospatial analyses
NASA Astrophysics Data System (ADS)
Cosmin Sandric, Ionut; Ionita, Cristian; Dardala, Marian; Furtuna, Titus
2017-04-01
Parallel processing on multiple CPU cores is already used at large scale in geocomputing, but parallel processing on graphics cards is just beginning. Being able to use a simple laptop with a dedicated graphics card for advanced and very fast geocomputation is an advantage that every scientist wants to have. The need for high-speed computation in the geosciences has increased in the last 10 years, mostly due to the increase in available datasets. These datasets are becoming more and more detailed and hence require more space to store and more time to process. Distributed computation on multicore CPUs and GPUs plays an important role by processing these big datasets in small parts. This way of computing speeds up the process, because instead of using just one process for each dataset, the user can use all the cores of a CPU or up to hundreds of cores of a GPU. The framework provides the end user with standalone tools for morphometric analyses at multiple scales. An important part of the framework is dedicated to uncertainty propagation in geospatial analyses. The uncertainty may come from data collection, may be induced by the model, or may have many other sources. These uncertainties play an important role when a spatial delineation of a phenomenon is modelled. Uncertainty propagation is implemented inside the GPU framework using Monte Carlo simulations. The GPU framework with the standalone tools proved to be a reliable tool for modelling complex natural phenomena. The framework is based on NVidia Cuda technology and is written in the C++ programming language. The source code will be available on github at https://github.com/sandricionut/GeoRsGPU Acknowledgement: GPU framework for geospatial analysis, Young Researchers Grant (ICUB-University of Bucharest) 2016, director Ionut Sandric
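Monte Carlo uncertainty propagation of the kind described, applied to a morphometric variable, can be sketched compactly. A minimal illustrative sketch, not the framework's CUDA implementation: the two-point slope formula, the 0.5 m elevation error, and all numbers are assumptions for demonstration:

```python
import random
from statistics import mean, stdev

def slope_percent(z1: float, z2: float, run: float) -> float:
    """Finite-difference slope between two elevation samples, in percent."""
    return abs(z2 - z1) / run * 100.0

def monte_carlo_slope(z1, z2, run, sigma, n=10_000, seed=42):
    """Propagate Gaussian elevation error (std dev `sigma`, an assumed
    value) into the slope by repeatedly perturbing both elevations."""
    rng = random.Random(seed)
    samples = [slope_percent(z1 + rng.gauss(0.0, sigma),
                             z2 + rng.gauss(0.0, sigma), run)
               for _ in range(n)]
    return mean(samples), stdev(samples)

# 8 m rise over a 30 m run with an assumed 0.5 m elevation error:
m, s = monte_carlo_slope(z1=100.0, z2=108.0, run=30.0, sigma=0.5)
print(round(m, 2), round(s, 2))
```

The GPU version runs many such perturbed realizations in parallel, one thread block per tile of the raster, which is what makes the approach tractable on large DEMs.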
Virtual Computing Laboratories: A Case Study with Comparisons to Physical Computing Laboratories
ERIC Educational Resources Information Center
Burd, Stephen D.; Seazzu, Alessandro F.; Conway, Christopher
2009-01-01
Current technology enables schools to provide remote or virtual computing labs that can be implemented in multiple ways ranging from remote access to banks of dedicated workstations to sophisticated access to large-scale servers hosting virtualized workstations. This paper reports on the implementation of a specific lab using remote access to…
Billoud, B; Kontic, M; Viari, A
1996-01-01
At the DNA/RNA level, biological signals are defined by a combination of spatial structures and sequence motifs. Until now, few attempts have been made to write general-purpose search programs that take into account both sequence and structure criteria. Indeed, the most successful structure scanning programs are usually dedicated to particular structures and are written in general-purpose programming languages through a complex and time-consuming process in which the biological problem of defining the structure and the computer engineering problem of looking for it are intimately intertwined. In this paper, we describe a general representation of structures, suitable for database scanning, together with a programming language, Palingol, designed to manipulate it. Palingol has specific data types corresponding to structural elements (basically helices) that can be arranged in any way to form a complex structure. As a consequence of the declarative approach used in Palingol, the user need only focus on 'what to search for' while the language engine takes care of 'how to look for it'. Therefore, it becomes simpler to write a scanning program, and the structural constraints that define the required structure are more clearly identified. PMID:8628670
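A toy version of the kind of structure scan Palingol expresses declaratively (here written imperatively, which is exactly the tedium Palingol avoids) is a search for a simple stem-loop: a stem whose 5' side is the reverse complement of its 3' side across a loop of allowed size. All parameters and the example sequence are illustrative:

```python
COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def reverse_complement(seq: str) -> str:
    return "".join(COMPLEMENT[b] for b in reversed(seq))

def find_hairpins(seq: str, stem_len: int = 4,
                  loop_min: int = 3, loop_max: int = 8):
    """Return (stem5_start, loop_start, stem3_start) triples wherever a
    stem of `stem_len` bases pairs with its reverse complement across a
    loop of allowed size. Deliberately naive, for illustration."""
    hits = []
    n = len(seq)
    for i in range(n - 2 * stem_len - loop_min + 1):
        stem5 = seq[i:i + stem_len]
        for loop in range(loop_min, loop_max + 1):
            j = i + stem_len + loop
            if j + stem_len > n:
                break
            if seq[j:j + stem_len] == reverse_complement(stem5):
                hits.append((i, i + stem_len, j))
    return hits

# GGGG (stem) AAAA (loop) CCCC (closing stem):
print(find_hairpins("GGGGAAAACCCC"))
```

In Palingol the same constraint would be stated declaratively (one helix with given stem and loop bounds) and the engine would perform the scan.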
Overview of NASA communications infrastructure
NASA Technical Reports Server (NTRS)
Arnold, Ray J.; Fuechsel, Charles
1991-01-01
The infrastructure of NASA communications systems for effecting coordination across NASA offices and with the national and international research and technological communities is discussed. The offices and networks of the communication system include the Office of Space Science and Applications (OSSA), which manages all NASA missions, and the Office of Space Operations, which furnishes communication support through the NASCOM, the mission critical communications support network, and the Program Support Communications network. The NASA Science Internet was established by OSSA to centrally manage, develop, and operate an integrated computer network service dedicated to NASA's space science and application research. Planned for the future is the National Research and Education Network, which will provide communications infrastructure to enhance science resources at a national level.
Administrative Assistant | Center for Cancer Research
We are looking for a pleasant, organized, dependable person to serve as an administrative assistant at the National Cancer Institute on the campus of the National Institutes of Health (NIH). The work supports a busy clinical program in the world's largest dedicated research hospital, which patients call the "House of Hope." Tasks involve calendar management, arranging travel, scheduling conferences and meetings, drafting and handling correspondence, timekeeping, placing purchase requests, office property management, greeting visitors, and office work such as copying, filing, and scanning. Ability to work with basic computer office software (such as Word, Excel, and PowerPoint) is required. Some administrative experience, including calendar management, is preferred. Full-time position, business hours. NIH is metro accessible.
Seismo-Live: Training in Seismology with Jupyter Notebooks
NASA Astrophysics Data System (ADS)
Krischer, Lion; Tape, Carl; Igel, Heiner
2016-04-01
Seismological training tends to occur within the isolation of a particular institution with a limited set of tools (codes, libraries) that are often not transferable outside. Here, we propose to overcome these limitations with a community-driven library of Jupyter notebooks dedicated to training on any aspect of seismology for purposes of education and outreach, on-site or archived tutorials for codes, classroom instruction, and research. A Jupyter notebook (jupyter.org) is an open-source interactive computational environment that allows combining code execution, rich text, mathematics, and plotting. It can be considered a platform that supports reproducible research, as all inputs and outputs may be stored. Text, external graphics, and equations can be handled using Markdown (incl. LaTeX) format. Jupyter notebooks run in standard web browsers, can be easily exchanged in text format, and can be converted to other documents (e.g. PDF, slide shows). They provide an ideal format for practical training in seismology. A pilot platform was set up with a dedicated server such that the Jupyter notebooks can be run in any browser (PC, notepad, smartphone). We show the functionalities of the Seismo-Live platform with examples from computational seismology, seismic data access and processing using the ObsPy library, seismic inverse problems, and others. The current examples all use the Python programming language, but any free language can be used. Potentially, such community platforms could be integrated with the EPOS-IT infrastructure and extended to other fields of Earth sciences.
Fuad, Anis; Sanjaya, Guardian Yoki; Lazuardi, Lutfan; Rahmanti, Annisa Ristya; Hsu, Chien-Yeh
2013-01-01
Public health informatics has been defined as the systematic application of information and computer science and technology to public health practice, research, and learning [1]. Unfortunately, limited reports exist concerning capacity-building strategies to improve the public health informatics workforce in limited-resource settings. In Indonesia, only three universities, including Universitas Gadjah Mada (UGM), offer a master's degree program in a related public health informatics discipline. UGM started a new dedicated master's program on Health Management Information Systems in 2005, under the auspices of the Graduate Program of Public Health at the Faculty of Medicine. This is the first tracer study of the alumni, aiming to a) identify the gaps between the curriculum and their current jobs and b) describe their perception of public health informatics competencies. We distributed questionnaires to 114 alumni with a 36.84% response rate. Despite the low response rate, this study provided valuable input for setting up appropriate competencies, curricula, and capacity-building strategies for the public health informatics workforce in Indonesia.
Monitoring of the electrical parameters in off-grid solar power system
NASA Astrophysics Data System (ADS)
Idzkowski, Adam; Leoniuk, Katarzyna; Walendziuk, Wojciech
2016-09-01
The aim of this work was to develop a monitoring system dedicated to an off-grid installation. A laboratory set-up built for that purpose was equipped with a PV panel, a battery, a charge controller and a load. Additionally, the electrical parameters of the installation were monitored using a LabJack data acquisition module, a self-built measuring module, and a computer running a program that measures and presents the off-grid installation parameters. The program was written in the G language using LabVIEW software. The designed system enables analysis of the currents and voltages of the PV panel, battery and load. It also makes it possible to visualize them on charts and to generate reports from the recorded data. The monitoring system was verified both in a laboratory test and in real conditions. The results of this verification are also presented.
Ören, Ünal; Hiller, Mauritius; Andersson, M
2017-04-28
A Monte Carlo-based stand-alone program, IDACstar (Internal Dose Assessment by Computer), was developed to perform radiation dose calculations using complex voxel simulations. To test the program, two irradiation situations were simulated: a hypothetical contamination case with 600 MBq of 99mTc and an extravasation case involving 370 MBq of 18F-FDG. The effective dose was estimated to be 0.042 mSv for the contamination case and 4.5 mSv for the extravasation case. IDACstar demonstrated that dosimetry results for contamination or extravasation cases can be obtained with great ease. IDACstar provides an effective tool for radiation protection applications, allowing physicists at nuclear medicine departments to easily quantify the radiation risk of stochastic effects after a radiation accident. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
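For readers unfamiliar with how an effective dose figure such as those above is formed, it is the tissue-weighted sum E = Σ_T w_T H_T over organ equivalent doses. A minimal sketch, assuming a subset of the ICRP 103 tissue weighting factors and invented organ equivalent doses (not values from the IDACstar study):

```python
# Effective dose as the tissue-weighted sum E = sum_T w_T * H_T.
# The weights below are a subset of ICRP 103 values, listed for
# illustration; the equivalent doses (mSv) are invented numbers.
TISSUE_WEIGHTS = {"lung": 0.12, "stomach": 0.12, "gonads": 0.08,
                  "bladder": 0.04, "thyroid": 0.04, "skin": 0.01}

def effective_dose(equivalent_doses_msv):
    """Weighted sum over the tissues present in the input (mSv)."""
    return sum(TISSUE_WEIGHTS[t] * h for t, h in equivalent_doses_msv.items())

doses = {"lung": 0.10, "stomach": 0.05, "thyroid": 0.30}  # hypothetical
e = effective_dose(doses)  # 0.12*0.10 + 0.12*0.05 + 0.04*0.30 = 0.030 mSv
```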
Planning Communication Networks to Deliver Educational Services.
ERIC Educational Resources Information Center
Ballard, Richard J.; Eastwood, Lester F., Jr.
As a companion to the more general document "Telecommunications Media for the Delivery of Educational Programming," this report concentrates on the technical and economic factors affecting the design of only one class of educational networks, dedicated coaxial cable systems. To provide illustrations, possible single and dual dedicated cable networks…
WARP: Weight Associative Rule Processor. A dedicated VLSI fuzzy logic megacell
NASA Technical Reports Server (NTRS)
Pagni, A.; Poluzzi, R.; Rizzotto, G. G.
1992-01-01
During the last five years, Fuzzy Logic has gained enormous popularity in the academic and industrial worlds. The success of this new methodology has led the microelectronics industry to create a new class of machines, called Fuzzy Machines, to overcome the limitations of traditional computing systems when utilized as Fuzzy Systems. This paper gives an overview of the methods by which Fuzzy Logic data structures are represented in such machines (each with its own advantages and inefficiencies). Next, the paper introduces WARP (Weight Associative Rule Processor), a dedicated VLSI megacell allowing the realization of a fuzzy controller suitable for a wide range of applications. WARP represents an innovative approach to VLSI fuzzy controllers by utilizing different types of data structures to characterize the membership functions during the various stages of fuzzy processing. WARP's dedicated architecture has been designed to achieve high performance by exploiting the computational advantages offered by the different data representations.
SeedVicious: Analysis of microRNA target and near-target sites.
Marco, Antonio
2018-01-01
Here I describe seedVicious, a versatile microRNA target site prediction software that can be easily fitted into annotation pipelines and run over custom datasets. SeedVicious finds microRNA canonical sites plus other, less efficient, target sites. Among other novel features, seedVicious can compute evolutionary gains/losses of target sites using maximum parsimony, and can also detect near-target sites, which differ from a canonical site by one nucleotide. Near-target sites are important for studying population variation in microRNA regulation. Some analyses suggest that near-target sites may also be functional, although there is no conclusive evidence for that, and they may actually be target alleles segregating in a population. SeedVicious does not aim to outperform but to complement existing microRNA prediction tools. For instance, the precision of TargetScan is almost doubled (from 11% to ~20%) when we filter predictions by the distance between target sites using this program. Interestingly, two adjacent canonical target sites are more likely to be present in bona fide target transcripts than pairs of target sites at slightly longer distances. The software is written in Perl and runs on 64-bit Unix computers (Linux and MacOS X). Users with no computing experience can also run the program on a dedicated web server by uploading custom data, or browse pre-computed predictions. SeedVicious and its associated web server and database (SeedBank) are distributed under the GPL/GNU license.
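A minimal sketch of the canonical-versus-near-target distinction described above, assuming the common 7-nt seed (miRNA nucleotides 2-8) definition; the function names and the simple one-mismatch criterion are illustrative, not seedVicious's actual implementation:

```python
def revcomp(rna):
    """Reverse complement of an RNA string."""
    comp = {"A": "U", "U": "A", "G": "C", "C": "G"}
    return "".join(comp[b] for b in reversed(rna))

def find_sites(mirna, utr):
    """Scan a UTR for canonical 7-nt seed matches and for near-target
    sites (exactly one mismatch from the canonical site)."""
    site = revcomp(mirna[1:8])  # seed region: miRNA nucleotides 2-8
    canonical, near = [], []
    for i in range(len(utr) - len(site) + 1):
        mismatches = sum(a != b for a, b in zip(utr[i:i + len(site)], site))
        if mismatches == 0:
            canonical.append(i)
        elif mismatches == 1:
            near.append(i)
    return canonical, near

let7 = "UGAGGUAGUAGGUUGUAUAGUU"  # human let-7a; seed site is "CUACCUC"
utr = "AAAA" + "CUACCUC" + "GGGG" + "CUACGUC" + "AA"  # toy UTR
canonical, near = find_sites(let7, utr)
```

On this toy UTR, the exact site is reported at position 4 and the one-mismatch variant at position 15.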
2016-05-05
Following a naming dedication ceremony May 5, 2016 - the 55th anniversary of Alan Shepard's historic rocket launch - NASA Langley Research Center's newest building is known as the Katherine G. Johnson Computational Research Facility, honoring the "human computer" who successfully calculated the trajectories for America's first space flights.
Kumar, P.; Collett, Timothy S.; Vishwanath, K.; Shukla, K.M.; Nagalingam, J.; Lall, M.V.; Yamada, Y; Schultheiss, P.; Holland, M.
2016-01-01
The India National Gas Hydrate Program Expedition 02 (NGHP-02) was conducted from 3-March-2015 to 28-July-2015 off the eastern coast of India using the deepwater drilling vessel Chikyu. The primary goal of this expedition was to explore for highly saturated gas hydrate occurrences in sand reservoirs that would become targets for future production tests. The first two months of the expedition were dedicated to logging-while-drilling (LWD) operations, with a total of 25 holes drilled and logged. The next three months were dedicated to coring operations at 10 of the most promising sites. With a total of five months of continuous field operations, the expedition was the most comprehensive dedicated gas hydrate investigation ever undertaken.
The fourth International Conference on Information Science and Cloud Computing
NASA Astrophysics Data System (ADS)
This book comprises the papers accepted by the fourth International Conference on Information Science and Cloud Computing (ISCC), which was held from 18-19 December, 2015 in Guangzhou, China. It contains 70 papers divided into four parts. The first part focuses on Information Theory, with 20 papers; the second part emphasizes Machine Learning, with 21 papers; the third part contains 21 papers in the area of Control Science; and the last part, with 8 papers, is dedicated to Cloud Science. Each part can serve as an excellent reference for engineers, researchers and students who need to build a knowledge base of the most current advances and state of practice in the topics covered by the ISCC conference. Special thanks go to Professor Deyu Qi, General Chair of ISCC 2015, for his leadership in supervising the organization of the entire conference; to Professor Tinghuai Ma, Program Chair, and the members of the program committee for evaluating all the submissions and ensuring the selection of only the highest quality papers; and to the authors for sharing their ideas, results and insights. We sincerely hope that you enjoy reading the papers included in this book.
Economic feasibility analysis of conventional and dedicated energy crop production
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, R.G.; Langemeier, M.R.; Krehbiel, L.R.
Economic feasibilities (net return per acre) of conventional agricultural crop production versus dedicated bioenergy crop (herbaceous energy crop) production were investigated for northeastern Kansas. Conventional agricultural crops examined were corn, soybeans, wheat, sorghum and alfalfa; dedicated herbaceous energy crops included big bluestem/indiangrass, switchgrass, eastern gamagrass, brome, fescue and cane hay. Costs, prices and government program information from public and private sources were used to project the net return per acre over a six-year period beginning in 1997. Three soil productivity levels (low, average and high), which had a direct effect on the net return per acre, were used to model differences in expected yield. In all three soil productivity cases, big bluestem/indiangrass, switchgrass and brome hay provided a higher net return per acre than conventional crops grown on both program and non-program acres. Eastern gamagrass, fescue hay and cane hay had returns similar to or less than those of conventional crops.
A software tool for modeling and simulation of numerical P systems.
Buiu, Catalin; Arsene, Octavian; Cipu, Corina; Patrascu, Monica
2011-03-01
A P system represents a distributed and parallel bio-inspired computing model in which the basic data structures are multi-sets or strings. Numerical P systems have been recently introduced; they use numerical variables and local programs (or evolution rules), usually in a deterministic way. They may find interesting applications in areas such as computational biology, process control or robotics. The first simulator of numerical P systems (SNUPS) has been designed, implemented and made available to the scientific community by the authors of this paper. SNUPS allows a wide range of applications, from modeling and simulation of ordinary differential equations, to the use of membrane systems as computational blocks of cognitive architectures, and as controllers for autonomous mobile robots. This paper describes the functioning of a numerical P system and presents an overview of SNUPS capabilities together with an illustrative example. SNUPS is freely available to researchers as a standalone application and may be downloaded from a dedicated website, http://snups.ics.pub.ro/, which includes a user manual and sample membrane structures. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
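A minimal sketch of one evolution step of a numerical P system, simplified to a single membrane with a single program (a production function plus a repartition protocol); this illustrates the formalism only and is not SNUPS code:

```python
def np_step(variables, production, repartition):
    """One evolution step of a one-membrane numerical P system with a
    single program: evaluate the production function on the current
    values, reset the consumed variables, then distribute the produced
    value among the targets in proportion to integer coefficients."""
    value = production(variables)
    total = sum(repartition.values())
    for name in variables:           # variables read by the program are consumed
        variables[name] = 0.0
    for name, coeff in repartition.items():
        variables[name] += value * coeff / total
    return variables

# Hypothetical program: 2*x + y  ->  1|x + 3|y
state = {"x": 2.0, "y": 4.0}
np_step(state, lambda v: 2 * v["x"] + v["y"], {"x": 1, "y": 3})
# production = 8; x receives 8*1/4 = 2, y receives 8*3/4 = 6
```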
ANOPP/VMS HSCT ground contour system
NASA Technical Reports Server (NTRS)
Rawls, John, Jr.; Glaab, Lou
1992-01-01
This viewgraph shows the integration of the Visual Motion Simulator with ANOPP. ANOPP is an acronym for the Aircraft NOise Prediction Program. It is a computer code consisting of dedicated noise prediction modules for jet, propeller, and rotor powered aircraft along with flight support and noise propagation modules, all executed under the control of an executive system. The Visual Motion Simulator (VMS) is a ground based motion simulator with six degrees of freedom. The transport-type cockpit is equipped with conventional flight and engine-thrust controls and with flight instrument displays. Control forces on the wheel, column, and rudder pedals are provided by a hydraulic system coupled with an analog computer. The simulator provides variable-feel characteristics of stiffness, damping, coulomb friction, breakout forces, and inertia. The VMS provides a wide range of realistic flight trajectories necessary for computing accurate ground contours. The NASA VMS will be discussed in detail later in this presentation. An equally important part of the system for both ANOPP and VMS is the engine performance. This will also be discussed in the presentation.
MultiPhyl: a high-throughput phylogenomics webserver using distributed computing
Keane, Thomas M.; Naughton, Thomas J.; McInerney, James O.
2007-01-01
With the number of fully sequenced genomes increasing steadily, there is greater interest in performing large-scale phylogenomic analyses from large numbers of individual gene families. Maximum likelihood (ML) has been shown repeatedly to be one of the most accurate methods for phylogenetic construction. Recently, there have been a number of algorithmic improvements in maximum-likelihood-based tree search methods. However, it can still take a long time to analyse the evolutionary history of many gene families using a single computer. Distributed computing refers to a method of combining the computing power of multiple computers in order to perform some larger overall calculation. In this article, we present the first high-throughput implementation of a distributed phylogenetics platform, MultiPhyl, capable of using the idle computational resources of many heterogeneous non-dedicated machines to form a phylogenetics supercomputer. MultiPhyl allows a user to upload hundreds or thousands of amino acid or nucleotide alignments simultaneously and perform computationally intensive tasks such as model selection, tree searching and bootstrapping of each of the alignments using many desktop machines. The program implements a set of 88 amino acid models and 56 nucleotide maximum likelihood models and a variety of statistical methods for choosing between alternative models. A MultiPhyl webserver is available for public use at: http://www.cs.nuim.ie/distributed/multiphyl.php. PMID:17553837
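As an illustration of the model-selection step mentioned above, one widely used criterion is the AIC, 2k − 2 ln L, where k is the number of free model parameters and ln L the maximized log-likelihood; the model names, log-likelihoods and parameter counts below are invented, not MultiPhyl output:

```python
def aic(log_likelihood, n_params):
    """Akaike information criterion: 2k - 2 ln L (lower is better)."""
    return 2 * n_params - 2 * log_likelihood

# Hypothetical ML scores for three nucleotide models of one alignment
# (k counts only substitution-model parameters, for simplicity).
candidates = {
    "JC69":  {"lnL": -10234.5, "k": 0},
    "HKY85": {"lnL": -10105.2, "k": 4},
    "GTR+G": {"lnL": -10098.7, "k": 9},
}
best = min(candidates, key=lambda m: aic(candidates[m]["lnL"], candidates[m]["k"]))
```

Here the extra parameters of GTR+G are justified by the likelihood gain, so it wins despite being the most complex model.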
NASA Astrophysics Data System (ADS)
Fiala, L.; Lokajicek, M.; Tumova, N.
2015-05-01
This volume of the IOP Conference Series is dedicated to scientific contributions presented at the 16th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2014), whose motto this year was "bridging disciplines". The conference took place on September 1-5, 2014, at the Faculty of Civil Engineering, Czech Technical University in Prague, Czech Republic. The 16th edition of ACAT explored the boundaries of computing system architectures, data analysis algorithmics, automatic calculations, and theoretical calculation technologies. It provided a forum for confronting and exchanging ideas among these fields, where new approaches in computing technologies for scientific research were explored and promoted. This year's edition of the workshop brought together over 140 participants from all over the world. The workshop's 16 invited speakers presented key topics on advanced computing and analysis techniques in physics. During the workshop, 60 talks and 40 posters were presented in three tracks: Computing Technology for Physics Research, Data Analysis - Algorithms and Tools, and Computations in Theoretical Physics: Techniques and Methods. The round table enabled discussions on expanding software, knowledge sharing and scientific collaboration in the respective areas. ACAT 2014 was generously sponsored by Western Digital, Brookhaven National Laboratory, Hewlett Packard, DataDirect Networks, M Computers, Bright Computing, Huawei and PDV-Systemhaus. Special appreciation goes to the track liaisons Lorenzo Moneta, Axel Naumann and Grigory Rubtsov for their work on the scientific program and the publication preparation. ACAT's IACC would also like to express its gratitude to all referees for their work on making sure the contributions are published in the proceedings.
Our thanks extend to the conference liaisons Andrei Kataev and Jerome Lauret, who worked with the local contacts and made this conference possible, as well as to the program coordinator Federico Carminati and the conference chair Denis Perret-Gallix for their overall supervision. Further information on ACAT 2014 can be found at http://www.particle.cz/acat2014
Spuler, Martin
2015-08-01
A Brain-Computer Interface (BCI) makes it possible to control a computer by brain activity alone, without the need for muscle control. In this paper, we present an EEG-based BCI system based on code-modulated visual evoked potentials (c-VEPs) that enables the user to work with arbitrary Windows applications. Other BCI systems, like the P300 speller or BCI-based browsers, allow control of one dedicated application designed for use with a BCI. In contrast, the system presented in this paper does not consist of one dedicated application, but lets the user control mouse cursor and keyboard input at the level of the operating system, thereby making it possible to use arbitrary applications. As the c-VEP BCI method has been shown to enable very fast communication speeds (writing more than 20 error-free characters per minute), the presented system is the next step in replacing the traditional mouse and keyboard and enabling complete brain-based control of a computer.
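A minimal sketch of template-matching c-VEP decoding, in which each target flickers with a circularly shifted copy of one binary code and the attended target is identified as the shift whose template correlates best with the recorded response; the code length, shifts and noise level are invented, and a random ±1 code stands in for the m-sequence a real system would use:

```python
import numpy as np

rng = np.random.default_rng(0)
code = rng.integers(0, 2, 63) * 2 - 1   # +/-1 code (placeholder for an m-sequence)
shifts = [0, 8, 16, 24]                 # one circular shift per on-screen target

def classify(response):
    """Return the index of the target whose shifted code correlates
    best with the response (template-matching c-VEP decoding)."""
    scores = [np.dot(response, np.roll(code, s)) for s in shifts]
    return int(np.argmax(scores))

# Simulated trial: the user attends target 2 (shift 16), plus noise.
trial = np.roll(code, 16) + 0.3 * rng.standard_normal(63)
predicted = classify(trial)
```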
D'Alessandro, M P; Ackerman, M J; Sparks, S M
1993-11-01
Educational Technology Network (ET Net) is a free, easy to use, on-line computer conferencing system organized and funded by the National Library of Medicine that is accessible via the SprintNet (SprintNet, Reston, VA) and Internet (Merit, Ann Arbor, MI) computer networks. It is dedicated to helping bring together, in a single continuously running electronic forum, developers and users of computer applications in the health sciences, including radiology. ET Net uses the Caucus computer conferencing software (Camber-Roth, Troy, NY) running on a microcomputer. This microcomputer is located in the National Library of Medicine's Lister Hill National Center for Biomedical Communications and is directly connected to the SprintNet and the Internet networks. The advanced computer conferencing software of ET Net allows individuals who are separated in space and time to unite electronically to participate, at any time, in interactive discussions on applications of computers in radiology. A computer conferencing system such as ET Net allows radiologists to maintain contact with colleagues on a regular basis when they are not physically together. Topics of discussion on ET Net encompass all applications of computers in radiological practice, research, and education. ET Net has been in successful operation for 3 years and has a promising future aiding radiologists in the exchange of information pertaining to applications of computers in radiology.
The Seismic Tool-Kit (STK): an open source software for seismology and signal processing.
NASA Astrophysics Data System (ADS)
Reymond, Dominique
2016-04-01
We present an open source software project (GNU public license), named STK: Seismic ToolKit, dedicated mainly to seismology and signal processing. The STK project, started in 2007, is hosted by SourceForge.net and counts more than 19,500 downloads at the time of writing. The STK project is composed of two main branches. The first is a graphical interface dedicated to signal processing in the SAC format (SAC_ASCII and SAC_BIN), where the signal can be plotted, zoomed, filtered, integrated, differentiated, etc. (a large variety of IIR and FIR filters is proposed). The spectral density of the signal is estimated via the Fourier transform, with visualization of the Power Spectral Density (PSD) in linear or log scale, and also of the evolutive time-frequency representation (or sonagram). Three-component signals can also be processed to estimate their polarization properties, either for a given window or for evolutive windows along the time axis. This polarization analysis is useful for extracting polarized noise and for differentiating P waves, Rayleigh waves, Love waves, etc. The second branch is a panel of utility programs for working in terminal mode, with basic programs for computing azimuth and distance in spherical geometry, inter/auto-correlation, spectral density, time-frequency representations for an entire directory of signals, focal planes and main component axes, radiation patterns of P waves, polarization analysis of different waves (including noise), under/over-sampling of signals, cubic-spline smoothing, and linear/non-linear regression analysis of data sets. A MINimum library of Linear AlGebra (MIN-LINAG) is also provided for the main matrix operations: QR/QL decomposition, Cholesky solution of linear systems, finding eigenvalues/eigenvectors, QR-solve/Eigen-solve of linear equation systems, etc. STK is developed in C/C++, mainly under Linux OS, and has also been partially implemented under MS-Windows.
Useful links: http://sourceforge.net/projects/seismic-toolkit/ http://sourceforge.net/p/seismic-toolkit/wiki/browse_pages/
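As a stand-alone illustration of one of the utility computations listed above (azimuth and distance in spherical geometry), here are the standard haversine-distance and forward-azimuth formulas; this is a sketch of the textbook formulas, not STK's actual code:

```python
import math

R_EARTH_KM = 6371.0  # mean Earth radius, spherical approximation

def distance_azimuth(lat1, lon1, lat2, lon2):
    """Great-circle distance (km) and forward azimuth (degrees
    clockwise from north) between two points given in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    # Haversine formula for the central angle
    a = math.sin((p2 - p1) / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    dist = 2 * R_EARTH_KM * math.asin(math.sqrt(a))
    # Forward azimuth at the first point
    az = math.degrees(math.atan2(math.sin(dlon) * math.cos(p2),
                                 math.cos(p1) * math.sin(p2)
                                 - math.sin(p1) * math.cos(p2) * math.cos(dlon)))
    return dist, az % 360.0

d, az = distance_azimuth(0.0, 0.0, 0.0, 90.0)  # a quarter turn east along the equator
```

For this case the azimuth is due east (90°) and the distance is a quarter of the Earth's circumference, about 10 007 km.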
NASA Astrophysics Data System (ADS)
Ewald, Mary Lou
2002-10-01
As a land-grant institution, Auburn University is committed to serving the citizens of Alabama through extension services and outreach programs. In following this outreach focus, the College of Sciences and Mathematics (COSAM) at AU has dedicated considerable resources to science and math related K-12 outreach programs, including two of our newest student-aimed programs: Youth Experiences in Science (YES) and Alabama BEST. Youth Experiences in Science (YES) is a Saturday enrichment program for middle school students. It includes a Fall and Spring Saturday component and a Summer camp experience. Activities include: LEGO's with Computers; Blood, Diseases & Forensics; Geometry of Models & Games; GPS Mapping; Polymer Chemistry; Electronics; and Genetics. Last year (2001-02), over 400 students attended a YES program on our campus. Alabama BEST (Boosting Engineering, Science & Technology) is a middle and high school robotics competition co-sponsored by COSAM and the College of Engineering at AU. Teams of students design and build robots and compete in a game format, with a new game theme introduced each year. This year, sixty teams from across Alabama and Georgia will have six weeks to design, build and perfect their robots before competition on October 18 and 19.
Computation Directorate Annual Report 2003
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crawford, D L; McGraw, J R; Ashby, S F
Big computers are icons: symbols of the culture, and of the larger computing infrastructure that exists at Lawrence Livermore. Through the collective effort of Laboratory personnel, they enable scientific discovery and engineering development on an unprecedented scale. For more than three decades, the Computation Directorate has supplied the big computers that enable the science necessary for Laboratory missions and programs. Livermore supercomputing is uniquely mission driven. The high-fidelity weapon simulation capabilities essential to the Stockpile Stewardship Program compel major advances in weapons codes and science, compute power, and computational infrastructure. Computation's activities align with this vital mission of the Department of Energy. Increasingly, non-weapons Laboratory programs also rely on computer simulation. World-class achievements have been accomplished by LLNL specialists working in multi-disciplinary research and development teams. In these teams, Computation personnel employ a wide array of skills, from desktop support expertise, to complex applications development, to advanced research. Computation's skilled professionals make the Directorate the success that it has become. These individuals know the importance of the work they do and the many ways it contributes to Laboratory missions. They make appropriate and timely decisions that move the entire organization forward. They make Computation a leader in helping LLNL achieve its programmatic milestones. I dedicate this inaugural Annual Report to the people of Computation in recognition of their continuing contributions. I am proud that we perform our work securely and safely. Despite increased cyber attacks on our computing infrastructure from the Internet, advanced cyber security practices ensure that our computing environment remains secure. Through Integrated Safety Management (ISM) and diligent oversight, we address safety issues promptly and aggressively.
The safety of our employees, whether at work or at home, is a paramount concern. Even as the Directorate meets today's supercomputing requirements, we are preparing for the future. We are investigating open-source cluster technology, the basis of our highly successful Multiprogrammatic Capability Resource (MCR). Several breakthrough discoveries have resulted from MCR calculations coupled with theory and experiment, prompting Laboratory scientists to demand ever-greater capacity and capability. This demand is being met by a new 23-TF system, Thunder, with architecture modeled on MCR. In preparation for the "after-next" computer, we are researching technology even farther out on the horizon: cell-based computers. Assuming that the funding and the technology hold, we will acquire the cell-based machine BlueGene/L within the next 12 months.
Development of a dedicated peptide tandem mass spectral library for conservation science.
Fremout, Wim; Dhaenens, Maarten; Saverwyns, Steven; Sanyova, Jana; Vandenabeele, Peter; Deforce, Dieter; Moens, Luc
2012-05-30
In recent years, the use of liquid chromatography tandem mass spectrometry (LC-MS/MS) on tryptic digests of cultural heritage objects has attracted much attention. It allows for unambiguous identification of peptides and proteins, and even in complex mixtures species-specific identification becomes feasible with minimal sample consumption. Determination of the peptides is commonly based on theoretical cleavage of known protein sequences and on comparison of the expected peptide fragments with those found in the MS/MS spectra. In this approach, complex computer programs such as Mascot perform well at identifying known proteins, but fail when protein sequences are unknown or incomplete. Often, when trying to distinguish the evolutionarily well-preserved collagens of different species, Mascot lacks the required specificity. Complementary and often more accurate information on the proteins can be obtained using a reference library of MS/MS spectra of species-specific peptides. Therefore, a library dedicated to the various sources of proteins in works of art was set up, with an initial focus on collagen-rich materials. This paper discusses the construction and advantages of this spectral library for conservation science, and its application to a number of samples from historical works of art. Copyright © 2012 Elsevier B.V. All rights reserved.
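A minimal sketch of the theoretical cleavage step described above, assuming the usual trypsin rule (cut C-terminal to K or R, but not when the next residue is P) and no missed cleavages; the test sequence is invented:

```python
def tryptic_digest(protein):
    """In-silico trypsin digest: cut after K or R unless the next
    residue is P (no missed cleavages, a simplification)."""
    peptides, start = [], 0
    for i, residue in enumerate(protein):
        cut = residue in "KR" and (i + 1 == len(protein) or protein[i + 1] != "P")
        if cut:
            peptides.append(protein[start:i + 1])
            start = i + 1
    if start < len(protein):
        peptides.append(protein[start:])  # C-terminal remainder
    return peptides

peptides = tryptic_digest("AKRPCKGR")  # R before P is not cleaved
```

Note how the R followed by P stays inside its peptide, yielding "AK", "RPCK", "GR".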
COMAN: a web server for comprehensive metatranscriptomics analysis.
Ni, Yueqiong; Li, Jun; Panagiotou, Gianni
2016-08-11
Microbiota-oriented studies based on metagenomic or metatranscriptomic sequencing have revolutionised our understanding of microbial ecology and of the roles of both clinical and environmental microbes. The analysis of massive metatranscriptomic data requires extensive computational resources, a collection of bioinformatics tools and expertise in programming. We developed COMAN (Comprehensive Metatranscriptomics Analysis), a web-based tool dedicated to automatically and comprehensively analysing metatranscriptomic data. The COMAN pipeline includes quality control of raw reads and removal of reads derived from non-coding RNA, followed by functional annotation, comparative statistical analysis, pathway enrichment analysis, co-expression network analysis and high-quality visualisation. The essential data generated by COMAN are also provided in tabular format for additional analysis and integration with other software. The web server has an easy-to-use interface and detailed instructions, and is freely available at http://sbb.hku.hk/COMAN/. COMAN is an integrated web server dedicated to comprehensive functional analysis of metatranscriptomic data, translating massive amounts of reads into data tables and high-standard figures. It is expected to help researchers with less expertise in bioinformatics answer microbiota-related biological questions, and to increase the accessibility and interpretation of microbiota RNA-Seq data.
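A minimal sketch of the pathway enrichment step mentioned above, using the common hypergeometric over-representation test as implemented in SciPy; the counts are invented and this is not COMAN's actual implementation:

```python
from scipy.stats import hypergeom

def enrichment_pvalue(n_universe, n_pathway, n_selected, n_overlap):
    """P(overlap >= observed) under the hypergeometric null: the
    usual over-representation test in pathway enrichment analysis."""
    return hypergeom.sf(n_overlap - 1, n_universe, n_pathway, n_selected)

# Invented numbers: 5000 annotated transcripts, 100 in the pathway,
# 200 differentially expressed, 15 of which fall in the pathway
# (expected overlap under the null is only 200*100/5000 = 4).
p = enrichment_pvalue(5000, 100, 200, 15)
```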
48 CFR 970.2703-2 - Patent rights clause provisions for management and operating contractors.
Code of Federal Regulations, 2010 CFR
2010-10-01
...-exempted areas of technology or in operation of DOE facilities primarily dedicated to naval nuclear... for-profit, large business firm and the contract does not have a technology transfer mission or if... dedicated to naval nuclear propulsion or weapons related programs. That clause provides for DOE's statutory...
48 CFR 970.2703-2 - Patent rights clause provisions for management and operating contractors.
Code of Federal Regulations, 2011 CFR
2011-10-01
...-exempted areas of technology or in operation of DOE facilities primarily dedicated to naval nuclear... for-profit, large business firm and the contract does not have a technology transfer mission or if... dedicated to naval nuclear propulsion or weapons related programs. That clause provides for DOE's statutory...
Design Principles for a Comprehensive Library System.
ERIC Educational Resources Information Center
Uluakar, Tamer; And Others
1981-01-01
Describes an online design featuring circulation control, catalog access, and serial holdings that uses an incremental approach to system development. Utilizing a dedicated computer, this second of three releases pays particular attention to present and predicted computing capabilities as well as trends in library automation. (Author/RAA)
"TIS": An Intelligent Gateway Computer for Information and Modeling Networks. Overview.
ERIC Educational Resources Information Center
Hampel, Viktor E.; And Others
TIS (Technology Information System) is being used at the Lawrence Livermore National Laboratory (LLNL) to develop software for Intelligent Gateway Computers (IGC) suitable for the prototyping of advanced, integrated information networks. Dedicated to information management, TIS leads the user to available information resources, on TIS or…
Cognitive Load Theory vs. Constructivist Approaches: Which Best Leads to Efficient, Deep Learning?
ERIC Educational Resources Information Center
Vogel-Walcutt, J. J.; Gebrim, J. B.; Bowers, C.; Carper, T. M.; Nicholson, D.
2011-01-01
Computer-assisted learning, in the form of simulation-based training, is heavily focused upon by the military. Because computer-based learning offers highly portable, reusable, and cost-efficient training options, the military has dedicated significant resources to the investigation of instructional strategies that improve learning efficiency…
Kogon, S; Arnold, J; Wood, R; Merner, L
2010-04-15
DIP3, a computerized aid to dental identification, was integrated into the RESOLVE INITIATIVE, a joint endeavour by the Ontario Provincial Police and the Office of the Chief Coroner for Ontario to resolve cases of missing persons (MP) and unidentified remains (UNID). Dental data from the UNID, collected by the coroner, and the dental records of MP, provided by the investigating police, are streamed separately for input into a dedicated computer program. All dental data management is provided by forensic dentists. The advantage of having experienced dentists manage these data is explained. A description of the RESOLVE INITIATIVE and DIP3, including the method used for record transmission, is provided. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.
Prototyping machine vision software on the World Wide Web
NASA Astrophysics Data System (ADS)
Karantalis, George; Batchelor, Bruce G.
1998-10-01
Interactive image processing is a proven technique for analyzing industrial vision applications and building prototype systems. Several previous implementations have used dedicated hardware to perform the image processing, with a top layer of software providing a convenient user interface. More recently, self-contained software packages have been devised that run on a standard computer. The advent of the Java programming language has made it possible to write platform-independent software operating over the Internet, or a company-wide intranet. Thus arises the possibility of designing at least some shop-floor inspection/control systems without the vision engineer ever entering the factories where they will be used. If successful, this project will have a major impact on the productivity of vision systems designers.
Comedy, Yolanda L.; Gilbert, Juan E.; Pun, Suzie H.
2017-01-01
Inventors help solve all kinds of problems. The AAAS-Lemelson Invention Ambassador program celebrates inventors who have an impact on global challenges, making our communities and the globe better, one invention at a time. In this paper, we introduce two of these invention ambassadors: Dr. Suzie Pun and Dr. Juan Gilbert. Dr. Suzie Pun is the Robert F. Rushmer Professor of Bioengineering, an adjunct professor of chemical engineering, and a member of the Molecular Engineering and Sciences Institute at the University of Washington. Dr. Juan Gilbert is the Andrew Banks Family Preeminence Endowed Professor and chair of the Computer & Information Science & Engineering Department at the University of Florida. Both have a passion for solving problems and are dedicated to teaching their students to change the world. PMID:29527271
Utilising Raspberry Pi as a cheap and easy do it yourself streaming device for astronomy
NASA Astrophysics Data System (ADS)
Maulana, F.; Soegijoko, W.; Yamani, A.
2016-11-01
Recent developments in personal computing platforms have been revolutionary. With the advent of the Raspberry Pi and Arduino series, sub-USD 100 computing platforms have changed the playing field altogether. It used to be that you would need a PC or an FPGA platform costing thousands of USD to create a dedicated device for a dedicated task. Combining a PiCam with the Raspberry Pi allows observers on small budgets to stream live images to the Internet and to the public in general. This paper traces our path in designing and adapting the PiCam to a common-sized eyepiece and telescope in preparation for the total solar eclipse (TSE) in Indonesia this past March.
G-cueing microcontroller (a microprocessor application in simulators)
NASA Technical Reports Server (NTRS)
Horattas, C. G.
1980-01-01
A g-cueing microcontroller is described which consists of a tandem pair of microprocessors dedicated to the task of simulating pilot-sensed cues caused by gravity effects. This task includes execution of a g-cueing model which drives actuators that alter the configuration of the pilot's seat. The g-cueing microcontroller receives acceleration commands from the aerodynamics model in the main computer and creates the stimuli that produce the physical acceleration effects of the aircraft seat on the pilot's anatomy. One of the two microprocessors is a fixed-instruction processor that performs all control and interface functions. The other, a specially designed bipolar bit-slice microprocessor, is a microprogrammable processor dedicated to all arithmetic operations. The two processors communicate with each other through a shared memory. The g-cueing microcontroller contains its own dedicated I/O conversion modules for interfacing with the seat actuators and controls, and a DMA controller for interfacing with the simulation computer. Any application that can be microcoded within the available memory, real time, and I/O channels could be implemented in the same controller.
LOX/LH2 vane pump for auxiliary propulsion systems
NASA Technical Reports Server (NTRS)
Hemminger, J. A.; Ulbricht, T. E.
1985-01-01
Positive displacement pumps offer potential efficiency advantages over centrifugal pumps for future low-thrust space missions. Low-flow-rate applications, such as space station auxiliary propulsion or dedicated low-thrust orbit transfer vehicles, are typical of missions where low flow and high head rise challenge centrifugal pumps. The positive displacement vane pump for pumping LOX and LH2 is investigated. This effort has included: (1) a testing program in which pump performance was investigated for differing pump clearances and materials while pumping LN2, LOX, and LH2; and (2) an analysis effort, in which a comprehensive pump performance analysis computer code was developed and exercised. An overview of the theoretical framework of the performance analysis computer code is presented, along with a summary of analysis results. Experimental results are presented for the pump operating in liquid nitrogen, including data on the effects of pump clearance, speed, and pressure rise on pump performance. Pump suction performance is also presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doerfler, Douglas; Austin, Brian; Cook, Brandon
There are many potential issues associated with deploying the Intel Xeon Phi™ (code-named Knights Landing [KNL]) manycore processor in a large-scale supercomputer. One in particular is the ability to fully utilize the high-speed communications network, given that the serial performance of a Xeon Phi core is a fraction of that of a Xeon® core. In this paper, we take a look at the trade-offs associated with allocating enough cores to fully utilize the Aries high-speed network versus cores dedicated to computation, e.g., the trade-off between MPI and OpenMP. In addition, we evaluate new features of Cray MPI in support of KNL, such as internode optimizations. We also evaluate one-sided programming models such as Unified Parallel C. We quantify the impact of the above trade-offs and features using a suite of National Energy Research Scientific Computing Center applications.
NASA Technical Reports Server (NTRS)
Malachowski, M. J.
1990-01-01
Laser beam positioning and beam rider modules were incorporated into the long hollow flexible segment of an articulated robot manipulator (ARM). Using a single laser beam, the system determined the position of the distal ARM endtip, with millimetric precision, in six degrees of freedom, at distances of up to 10 meters. Preliminary designs, using space rated technology for the critical systems, of a two segmented physical ARM, with a single and a dual degree of freedom articulation, were developed, prototyped, and tested. To control the positioning of the physical ARM, an indirect adaptive controller, which used the mismatch between the position of the laser beam under static and dynamic conditions, was devised. To predict the behavior of the system and test the concept, a computer simulation model was constructed. A hierarchical artificially intelligent real time ADA operating system program structure was created. The software was designed for implementation on a dedicated VME bus based Intel 80386 administered parallel processing multi-tasking computer system.
Dynamic Positioning Capability Analysis for Marine Vessels Based on A DPCap Polar Plot Program
NASA Astrophysics Data System (ADS)
Wang, Lei; Yang, Jian-min; Xu, Sheng-wen
2018-03-01
Dynamic positioning capability (DPCap) analysis is essential in the selection of thrusters, in their configuration, and during preliminary investigation of the positioning ability of a newly designed vessel dynamic positioning system. DPCap analysis can help determine the maximum environmental forces that the DP system can counteract in given headings. The accuracy of the DPCap analysis is determined by the precise estimation of the environmental forces as well as the effectiveness of the thrust allocation logic. This paper is dedicated to developing an effective and efficient software program for DPCap analysis of marine vessels. Estimation of the environmental forces can be obtained by model tests, hydrodynamic computation, and empirical formulas. A quadratic programming method is adopted to allocate the total thrust among the thrusters of the vessel. A detailed description of the thrust allocation logic of the software program is given. The effectiveness of the new program, DPCap Polar Plot (DPCPP), was validated by a DPCap analysis for a supply vessel. The present study indicates that the developed program can be used in DPCap analysis for marine vessels. Moreover, DPCap analysis considering the thruster failure mode might give guidance to designers of vessels whose thrusters need to be safer.
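The thrust-allocation step described in this abstract can be illustrated with a minimal sketch. The paper specifies a quadratic programming method; in the unconstrained minimum-effort case the same quadratic objective has a closed-form solution via a weighted pseudo-inverse, shown here for a hypothetical two-thruster layout (all positions, demands, and weights are invented for illustration, and thrust limits and forbidden azimuth zones are omitted):

```python
import numpy as np

def allocation_matrix(positions):
    """Configuration matrix B mapping per-thruster (Fx, Fy) components to
    total surge force, sway force, and yaw moment about midships."""
    cols = []
    for (lx, ly) in positions:
        cols.append([1.0, 0.0, -ly])   # Fx contributes -ly to yaw moment
        cols.append([0.0, 1.0,  lx])   # Fy contributes +lx to yaw moment
    return np.array(cols).T            # shape (3, 2 * n_thrusters)

def allocate(B, tau, W=None):
    """Minimum-effort allocation: minimise u^T W u subject to B u = tau.
    Closed-form Lagrangian solution via a weighted pseudo-inverse."""
    m = B.shape[1]
    W = np.eye(m) if W is None else W
    Winv = np.linalg.inv(W)
    return Winv @ B.T @ np.linalg.solve(B @ Winv @ B.T, tau)

# hypothetical azimuth thrusters at bow and stern (positions in metres)
B = allocation_matrix([(30.0, 0.0), (-30.0, 0.0)])
tau = np.array([500.0, 200.0, 1000.0])   # demanded surge, sway, yaw
u = allocate(B, tau)
assert np.allclose(B @ u, tau)           # the demand is met exactly
```

A production allocator would add inequality constraints for thruster saturation and azimuth rate limits, which is where a full quadratic programming solver, as used in DPCPP, becomes necessary.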
Artificial Neural Network Metamodels of Stochastic Computer Simulations
1994-08-10
Kilmer, Robert Allen
Microbial community analysis using MEGAN.
Huson, Daniel H; Weber, Nico
2013-01-01
Metagenomics, the study of microbes in the environment using DNA sequencing, depends upon dedicated software tools for processing and analyzing very large sequencing datasets. One such tool is MEGAN (MEtaGenome ANalyzer), which can be used to interactively analyze and compare metagenomic and metatranscriptomic data, both taxonomically and functionally. To perform a taxonomic analysis, the program places the reads onto the NCBI taxonomy, while functional analysis is performed by mapping reads to the SEED, COG, and KEGG classifications. Samples can be compared taxonomically and functionally, using a wide range of different charting and visualization techniques. PCoA analysis and clustering methods allow high-level comparison of large numbers of samples. Different attributes of the samples can be captured and used within analysis. The program supports various input formats for loading data and can export analysis results in different text-based and graphical formats. The program is designed to work with very large samples containing many millions of reads. It is written in Java and installers for the three major computer operating systems are available from http://www-ab.informatik.uni-tuebingen.de. © 2013 Elsevier Inc. All rights reserved.
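MEGAN's placement of reads onto the NCBI taxonomy is based on the lowest-common-ancestor (LCA) principle: a read matching several taxa is assigned to the most specific node that contains all of them. A toy sketch over a parent-pointer taxonomy follows; the taxon names and tree here are invented for illustration, whereas the real program works against the full NCBI taxonomy:

```python
def lca(taxonomy, taxa):
    """Lowest common ancestor in a parent-pointer taxonomy.
    taxonomy: dict mapping child -> parent; the root's parent is None."""
    def path(t):
        chain = []
        while t is not None:
            chain.append(t)
            t = taxonomy.get(t)
        return chain[::-1]               # root ... leaf
    paths = [path(t) for t in taxa]
    anc = None
    for level in zip(*paths):            # walk down while all paths agree
        if len(set(level)) == 1:
            anc = level[0]
        else:
            break
    return anc

# toy taxonomy: a read hitting two strains is assigned at species level
tax = {"E.coli K12": "E.coli", "E.coli O157": "E.coli",
       "E.coli": "Escherichia", "Escherichia": "Bacteria",
       "Bacteria": None}
assert lca(tax, ["E.coli K12", "E.coli O157"]) == "E.coli"
assert lca(tax, ["E.coli K12", "Escherichia"]) == "Escherichia"
```

The practical appeal of the LCA rule is that ambiguous reads are placed conservatively higher in the tree rather than being forced onto one arbitrary leaf.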
Educating the next generation of explorers at an historically Black University
NASA Astrophysics Data System (ADS)
Chaudhury, S.; Rodriguez, W. J.
2003-04-01
This paper describes the development of an innovative undergraduate research training model based at an Historically Black University in the USA that involves students with majors in diverse scientific disciplines in authentic Earth Systems Science research. Educating those who will be the next generation of explorers of Earth and space poses several challenges at smaller academic institutions that might lack dedicated resources for this area of study. Over a 5-year span, Norfolk State University has been developing a program that has afforded the opportunity for students majoring in biology, chemistry, mathematics, computer science, physics, engineering, and science education to work collaboratively in teams on research projects that emphasize the use of scientific visualization in studying the environment. Recently, a hands-on component has been added through partnerships with local K-12 school teachers in data collection and reporting for the GLOBE Program (Global Learning and Observations to Benefit the Environment). The successes and challenges of this program, along with some innovative uses of technology to promote inquiry learning, will be presented in this paper.
Joshua Smith, Jesse; Patel, Ravi K; Chen, Xi; Tarpley, Margaret J; Terhune, Kyla P
2014-01-01
Many residents supplement general surgery training with years of dedicated research, and an increasing number at our institution pursue additional degrees. We sought to determine whether it was worth the financial cost for residency programs to support degrees. We reviewed graduating chief residents (n = 69) in general surgery at Vanderbilt University from 2001 to 2010 and collected data including research time and additional degrees obtained. We then compared this information with the following parameters: (1) total papers, (2) first-author papers, (3) Journal Citation Reports impact factors of journals in which papers were published, and (4) first job after residency or fellowship training. The general surgery resident training program at Vanderbilt University is an academic program, approved to finish training 7 chief residents yearly during the time period studied. Participants were chief residents in general surgery at Vanderbilt who finished their training from 2001 through 2010. We found that completion of a degree during residency was significantly associated with more total and first-author publications as compared with those by residents with only dedicated research time (p = 0.001 and p = 0.017, respectively). Residents completing a degree also produced publications of a higher caliber and level of authorship, as determined by an adjusted resident impact factor score, as compared with those by residents with laboratory research time only (p = 0.005). Degree completion was also significantly correlated with a first job in academia as compared with dedicated research time only (p = 0.046). Our data support the utility of degree completion, when economically feasible, and of dedicated research time as effective ways to significantly increase research productivity and retain graduates in academic surgery.
Aggregating data from other academic surgery programs would allow us to further determine association of funding of additional degrees as a means to encourage academic productivity and retention. Copyright © 2014 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
The State of Communication Education in Family Medicine Residencies.
Jansen, Kate L; Rosenbaum, Marcy E
2016-06-01
Communication skills are essential to medical training and have lasting effects on patient satisfaction and adherence rates. However, relatively little is reported in the literature identifying how communication is taught in the context of residency education. Our goal was to determine current practices in communication curricula across family medicine residency programs. Behavioral scientists and program directors in US family medicine residencies were surveyed via email and professional organization listservs. Questions included whether programs use a standardized communication model, methods used for teaching communication, hours devoted to teaching communication, as well as strengths and areas for improvement in their program. Analysis identified response frequencies and ranges complemented by analysis of narrative comments. A total of 204 programs out of 458 family medicine residency training sites responded (45%), with 48 out of 50 US states represented. The majority of respondents were behavioral scientists. Seventy-five percent of programs identified using a standard communication model; Mauksch's patient-centered observation model (34%) was most often used. Training programs generally dedicated more time to experiential teaching methods (video review, work with simulated patients, role plays, small groups, and direct observation of patient encounters) than to lectures (62% of time and 24% of time, respectively). The amount of time dedicated to communication education varied across programs (average of 25 hours per year). Respondent comments suggest that time dedicated to communication education and having a formal curriculum in place are most valued by educators. This study provides a picture of how communication skills teaching is conducted in US family medicine residency programs. 
These findings can provide a comparative reference and rationale for residency programs seeking to evaluate their current approaches to communication skills teaching and develop new or enhanced curricula.
NASA's X2000 Program: An Institutional Approach to Enabling Smaller Spacecraft
NASA Technical Reports Server (NTRS)
Deutsch, Leslie J.; Salvo, Chris; Woerner, David
2000-01-01
The number of NASA science missions per year is increasing from less than one to more than six. At the same time, individual mission budgets are smaller and cannot afford their own dedicated technology developments. In response to this, NASA has formed the X2000 Program. This program, which is divided into a set of subsequent "deliveries," will provide the basic avionics, power, communications, and software capability for future science missions. X2000 First Delivery, which will be completed in early 2001, will provide a fully functional, 1-Mrad-tolerant flight computer, power switching electronics, a highly efficient radioisotope power source, and a transponder that provides high-level services in both the 8.4 GHz and 32 GHz bands. The X2000 Second Delivery, which will be completed in the 2003 time frame, will enable complete spacecraft in the 10-50 kg class. All capabilities delivered by the X2000 program will be commercialized within the US and therefore will be available for others to use. Although the immediate customers for these technologies are deep space missions, most of the capabilities being delivered are generic in nature and will be equally applicable to Earth Observation missions.
King, Andrew J.; Fisher, Arielle M.; Becich, Michael J.; Boone, David N.
2017-01-01
The University of Pittsburgh's Department of Biomedical Informatics and Division of Pathology Informatics created a Science, Technology, Engineering, and Mathematics (STEM) pipeline in 2011 dedicated to providing cutting-edge informatics research and career preparatory experiences to a diverse group of highly motivated high-school students. In this third editorial installment describing the program, we provide a brief overview of the pipeline, report on achievements of the past scholars, and present results from self-reported assessments by the 2015 cohort of scholars. The pipeline continues to expand with the 2015 addition of the innovation internship, and the introduction of a program in 2016 aimed at offering first-time research experiences to undergraduates who are underrepresented in pathology and biomedical informatics. Achievements of program scholars include authorship of journal articles, symposium and summit presentations, and attendance at top 25 universities. All of our alumni matriculated into higher education and 90% remain in STEM majors. The 2015 high-school program had ten participating scholars who self-reported gains in confidence in their research abilities and understanding of what it means to be a scientist. PMID:28400991
2017-08-01
principles for effective Computer-Based Training (CBT) that can be applied broadly to Army courses to build and evaluate exemplar CBT for Army advanced...individual training courses. To assist cadre who do not have a dedicated instructional design team, the Computer-Based Training Principles Guide was...document is the resulting contents, organization, and presentation style of the Computer-Based Training Principles Guide and its companion User's Guide
Creating Dedicated Local and State Revenue Sources for Youth Programs
ERIC Educational Resources Information Center
Sherman, Rachel H.; Deich, Sharon G.; Langford, Barbara Hanson
2007-01-01
This publication is part of a series of tools and resources on financing and sustaining youth programming. These tools and resources are intended to assist policy makers, program developers and community leaders in developing innovative strategies for implementing, financing and sustaining effective programs and policies. This brief highlights six…
Reactor Operations Monitoring System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, M.M.
1989-01-01
The Reactor Operations Monitoring System (ROMS) is a VME-based, parallel-processor data acquisition and safety action system designed by the Equipment Engineering Section and Reactor Engineering Department of the Savannah River Site. The ROMS will be analyzing over 8 million signal samples per minute. Sixty-eight microprocessors are used in the ROMS in order to achieve real-time data analysis. The ROMS is composed of multiple computer subsystems. Four redundant computer subsystems monitor 600 temperatures with 2400 thermocouples. Two computer subsystems share the monitoring of 600 reactor coolant flows. Additional computer subsystems are dedicated to monitoring 400 signals from assorted process sensors. Data from these computer subsystems are transferred to two redundant process display computer subsystems which present process information to reactor operators and to reactor control computers. The ROMS is also designed to carry out safety functions based on its analysis of process data. The safety functions include initiating a reactor scram (shutdown), the injection of neutron poison, and the loadshed of selected equipment. A complete development Reactor Operations Monitoring System has been built. It is located in the Program Development Center at the Savannah River Site and is currently being used by the Reactor Engineering Department in software development. The Equipment Engineering Section is designing and fabricating the process interface hardware. Upon proof of hardware and design concept, orders will be placed for the final five systems, located in the three reactor areas, the reactor training simulator, and the hardware maintenance center.
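The redundant-sensor arrangement described above (four thermocouples per temperature channel feeding a safety decision) can be sketched abstractly. The m-out-of-n voting rule, the limit value, and the readings below are all hypothetical and are not taken from the actual ROMS design:

```python
def vote(readings, limit, min_agree=2):
    """Hypothetical m-out-of-n redundancy logic: request a safety action
    only when at least `min_agree` of the redundant thermocouple
    readings for one channel exceed `limit`."""
    over = sum(1 for r in readings if r > limit)
    return over >= min_agree

# a single failed sensor does not trip the channel ...
assert vote([310.0, 312.0, 640.0, 309.0], limit=600.0) is False
# ... but agreement among the redundant sensors does
assert vote([612.0, 615.0, 640.0, 609.0], limit=600.0) is True
```

Voting of this kind is the standard way to keep a single failed sensor from causing a spurious scram while still acting promptly on a genuine excursion.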
Makerspaces: The Next Iteration for Educational Technology in K-12 Schools
ERIC Educational Resources Information Center
Strycker, Jesse
2015-01-01
With the continually growing number of computers and mobile devices available in K-12 schools, the need is dwindling for dedicated computer labs and media centers. Some schools are starting to repurpose those facilities into different kinds of exploratory learning environments known as "makerspaces". This article discusses this next…
ERIC Educational Resources Information Center
van den Bogaart, Antoine C. M.; Bilderbeek, Richel J. C.; Schaap, Harmen; Hummel, Hans G. K.; Kirschner, Paul A.
2016-01-01
This article introduces a dedicated, computer-supported method to construct and formatively assess open, annotated concept maps of Personal Professional Theories (PPTs). These theories are internalised, personal bodies of formal and practical knowledge, values, norms and convictions that professionals use as a reference to interpret and acquire…
Kepper, Nick; Ettig, Ramona; Dickmann, Frank; Stehr, Rene; Grosveld, Frank G; Wedemann, Gero; Knoch, Tobias A
2010-01-01
Especially in the life-science and health-care sectors, IT requirements are immense, owing to the large and complex systems to be analysed and simulated. Grid infrastructures play a rapidly increasing role here for research, diagnostics, and treatment, since they provide the necessary large-scale resources efficiently. Whereas grids were first used for huge number-crunching of trivially parallelizable problems, parallel high-performance computing is increasingly required. Here, we show for the prime example of molecular dynamics simulations how the presence of large grid clusters with very fast network interconnects within grid infrastructures now allows efficient parallel high-performance grid computing, and thus combines the benefits of dedicated supercomputing centres and grid infrastructures. The demands for this service class are the highest, since the user group has very heterogeneous requirements: i) two to many thousands of CPUs, ii) different memory architectures, iii) huge storage capabilities, and iv) fast communication via network interconnects are all needed in different combinations and must be considered in a highly dedicated manner to reach the highest performance efficiency. Beyond that, advanced and dedicated i) interaction with users, ii) management of jobs, iii) accounting, and iv) billing not only combine classic with parallel high-performance grid usage, but more importantly can also increase the efficiency of IT resource providers. Consequently, the mere "yes we can" becomes a huge opportunity for sectors such as life science and health care, as well as for grid infrastructures, by reaching a higher level of resource efficiency.
Capabilities and application of a dedicated conventional bomber force in 1993. Student report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomits, J.R.
1988-04-01
The removal of intermediate-range ballistic missiles as a result of the INF treaty presents conventional balance-of-force implications that will be difficult for NATO to redress in the short term. This study evaluates how a dedicated conventional B-52 force, updated with presently available or programmed technologies, could be applied to overcome the conventional-force imbalance.
ERIC Educational Resources Information Center
Pitner, Ronald O.; Priester, Mary Ann; Lackey, Richard; Duvall, Deborah
2018-01-01
The Council on Social Work Education requires schools of social work to meet diversity and social justice competencies. Many MSW programs meet these standards by having either a dedicated diversity and social justice course, or by using some form of diversity and social justice curricular infusion. The current study explored which of these…
Press Site Auditorium dedicated to John Holliman
NASA Technical Reports Server (NTRS)
1999-01-01
NASA Administrator Daniel S. Goldin hands Mrs. Dianne Holliman a plaque honoring her late husband, John Holliman, a CNN national correspondent. Standing behind Goldin is Center Director Roy Bridges. At right is Tom Johnson, news group chairman of CNN. A ceremony dedicated the KSC Press Site auditorium as the John Holliman Auditorium to honor the correspondent for his enthusiastic, dedicated coverage of America's space program. The auditorium was built in 1980 and has been the focal point for news coverage of Space Shuttle launches. The ceremony followed the 94th launch of a Space Shuttle, on mission STS-96, earlier this morning.
NASA Technical Reports Server (NTRS)
Carson, John C. (Inventor); Indin, Ronald J. (Inventor); Shanken, Stuart N. (Inventor)
1994-01-01
A computer module is disclosed in which a stack of glued-together IC memory chips is structurally integrated with a microprocessor chip. The memory provided by the stack is dedicated to the microprocessor chip. The microprocessor and its memory stack may be connected either by glue and/or by solder bumps. The solder bumps can perform three functions: electrical interconnection, mechanical connection, and heat transfer. The electrical connections in some versions are provided by wire bonding.
Seismic waveform modeling over cloud
NASA Astrophysics Data System (ADS)
Luo, Cong; Friederich, Wolfgang
2016-04-01
With fast-growing computational technologies, numerical simulation of seismic wave propagation has achieved huge successes. Obtaining synthetic waveforms through numerical simulation is receiving an increasing amount of attention from seismologists. However, computational seismology is a data-intensive research field, and the numerical packages usually come with a steep learning curve. Users are expected to master a considerable amount of computer knowledge and data-processing skills. Training users to use the numerical packages and to access and utilize computational resources correctly is a difficult task. In addition, access to HPC is a common difficulty for many users. To solve these problems, a cloud-based solution dedicated to shallow seismic waveform modeling has been developed with state-of-the-art web technologies. It is a web platform integrating both software and hardware in a multilayer architecture: a well-designed SQL database serves as the data layer, while HPC and a dedicated pipeline for it form the business layer. Through this platform, users no longer need to compile and manipulate various packages on a local machine within a local network to perform a simulation. By giving users professional access to the computational code through its interfaces and delivering our computational resources to users over the cloud, the platform lets users customize simulations at expert level and submit and run jobs through it.
NASA Technical Reports Server (NTRS)
Shepherd, J. Marshall
1998-01-01
The Tropical Rainfall Measuring Mission (TRMM) is the first mission dedicated to measuring tropical and subtropical rainfall using a variety of remote sensing instrumentation, including the first spaceborne rain-measuring radar. Since the energy released when tropical rainfall occurs is a primary "fuel" supply for the weather and climate "engine," improvements in computer models which predict future weather and climate states may depend on better measurements of global tropical rainfall and its energy. In support of the STANYS conference theme of Education and Space, this presentation focuses on one aspect of NASA's Earth Systems Science Program. We present an overview of the TRMM mission, discussing the scientific motivation for TRMM, the TRMM instrument package, and recent images from tropical rainfall systems and hurricanes. The presentation also targets educational components of the TRMM mission in the areas of weather, mathematics, technology, and geography that can be used by secondary school and high school educators in the classroom.
Kennedy Space Center's Command and Control System - "Toasters to Rocket Ships"
NASA Technical Reports Server (NTRS)
Lougheed, Kirk; Mako, Cheryle
2011-01-01
This slide presentation reviews the history of the development of the command and control system at Kennedy Space Center, from a system that could be brought to Florida in the trunk of a car in the 1950s, through the larger and more complex launch vehicles of the Apollo program, where human launch controllers managed the launch process with a hardware-only system that required a dedicated human interface to perform every function until the Apollo vehicle lifted off from the pad, to the digital computers that interfaced with ground launch processing systems during the Space Shuttle program. Finally, it shows the future control room being developed to control missions returning to the Moon and Mars, which will maximize the use of Commercial-Off-The-Shelf (COTS) hardware and software that is standards-based and not tied to a single vendor. The system is designed to be flexible and adaptable to support the requirements of future spacecraft and launch vehicles.
Investigation of iterative image reconstruction in low-dose breast CT
NASA Astrophysics Data System (ADS)
Bian, Junguo; Yang, Kai; Boone, John M.; Han, Xiao; Sidky, Emil Y.; Pan, Xiaochuan
2014-06-01
There is interest in developing computed tomography (CT) dedicated to breast-cancer imaging. Because breast tissues are radiation-sensitive, the total radiation exposure in a breast-CT scan is kept low, often comparable to a typical two-view mammography exam, resulting in a challenging low-dose data-reconstruction problem. In recent years, evidence has accumulated that iterative reconstruction may yield images of improved quality from low-dose data. In this work, based upon the constrained image total-variation (TV) minimization program and its numerical solver, the adaptive steepest descent-projection onto convex sets (ASD-POCS) algorithm, we investigate and evaluate iterative image reconstruction from low-dose breast-CT data of patients, with a focus on identifying and determining key reconstruction parameters, devising surrogate utility metrics for characterizing reconstruction quality, and tailoring the program and ASD-POCS to the specific reconstruction task under consideration. The ASD-POCS reconstructions appear to outperform the corresponding clinical FDK reconstructions in terms of both subjective visualization and surrogate utility metrics.
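As a rough illustration of the kind of iteration ASD-POCS performs, the sketch below alternates a data-consistency step (the POCS side) with a steepest-descent step on the image total variation (the ASD side), applied to a toy linear system. The system matrix, phantom, and step-size schedule are all invented for illustration; this is not the authors' implementation.

```python
import numpy as np

# Toy ASD-POCS-style iteration: alternate a data-consistency step with a
# steepest-descent step on total variation (TV). Everything here is a
# made-up miniature stand-in for a real CT forward model.

rng = np.random.default_rng(0)
n = 16
A = rng.normal(size=(150, n * n))     # toy, underdetermined "scanner"
x_true = np.zeros((n, n))
x_true[4:12, 4:12] = 1.0              # piecewise-constant phantom suits TV
y = A @ x_true.ravel()                # noiseless toy projection data

def tv_grad(img, eps=1e-8):
    """Gradient of a smoothed isotropic total variation of a 2-D image."""
    dx = np.diff(img, axis=0, append=img[-1:, :])
    dy = np.diff(img, axis=1, append=img[:, -1:])
    mag = np.sqrt(dx**2 + dy**2 + eps)
    gx, gy = dx / mag, dy / mag
    # adjoint of the forward differences (divergence-like term)
    return (np.roll(gx, 1, axis=0) - gx) + (np.roll(gy, 1, axis=1) - gy)

x = np.zeros(n * n)
step = 1.0 / np.linalg.norm(A, ord=2) ** 2
for it in range(200):
    x = x + step * (A.T @ (y - A @ x))       # move toward data consistency
    x = np.clip(x, 0.0, None)                # non-negativity constraint
    g = tv_grad(x.reshape(n, n)).ravel()     # TV steepest-descent step,
    gn = np.linalg.norm(g)                   # scaled down over iterations
    if gn > 0:
        x = x - 0.005 * 0.98**it * np.linalg.norm(x) * g / gn

residual = np.linalg.norm(y - A @ x) / np.linalg.norm(y)
```

The interplay shown here — pulling toward the measured data while reducing TV — is the essence of the method; a real implementation adds careful step-size adaptation and stopping rules.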
Enlarging the STEM pipeline working with youth-serving organizations
NASA Astrophysics Data System (ADS)
Porro, I.
2005-12-01
The After-School Astronomy Project (ASAP) is a comprehensive initiative to promote the pursuit of science learning among underrepresented youth. To this end, ASAP specifically aims at building the capacity of urban community-based centers to deliver innovative out-of-school science programming to their youth. ASAP makes use of a modular curriculum consisting of a combination of hands-on activities and youth-led explorations of the night sky using MicroObservatory. Through project-based investigations, students reinforce learning in astronomy and develop an understanding of science as inquiry, while also developing communication and computer skills. Through MicroObservatory, students gain access to a network of educational telescopes that they control over the Internet, software analysis tools, and an online community of users. An integral part of ASAP is to provide professional development opportunities for after-school workers. This promotes a self-sustainable long-term implementation of ASAP and fosters the creation of a cadre of after-school professionals dedicated to facilitating science-based programs.
Being Dedicated in the Film The American Nurse.
Baumann, Steven L; Ganzer, Christine Anne
2016-01-01
The focus of this humanbecoming hermeneutic study of graduate nursing students' reflections is on being dedicated as portrayed in the documentary film The American Nurse. Nursing students were invited to a public screening of the film with the director, Carolyn Jones, and asked to write a brief reflective essay on the meaning of being dedicated as depicted in the film. The theoretical perspective is the humanbecoming school of thought. The participants were 20 nurses enrolled in either a graduate or doctoral nursing program at the time of the study. The emergent meanings of the study are offered to enhance knowledge and understanding of being dedicated. The use of documentary film to expand graduate nursing students' awareness of global issues is also considered. © The Author(s) 2015.
The Pisgah Astronomical Research Institute
NASA Astrophysics Data System (ADS)
Cline, J. Donald; Castelaz, M.
2009-01-01
Pisgah Astronomical Research Institute (PARI) is a not-for-profit foundation located at a former NASA tracking station in the Pisgah National Forest in western North Carolina. PARI is celebrating its 10th year. During its ten years, PARI has developed and implemented innovative science education programs. The science education programs are hands-on and experimentally based, mixing disciplines in astronomy, computer science, earth and atmospheric science, engineering, and multimedia. The basic tools for the educational programs include a 4.6-m radio telescope accessible via the Internet, a StarLab planetarium, the Astronomical Photographic Data Archive (APDA), a distributed computing online environment to classify stars called SCOPE, and remotely accessible optical telescopes. The PARI 200-acre campus has a 4.6-m, a 12-m, and two 26-m radio telescopes, optical solar telescopes, a Polaris monitoring telescope, 0.4-m and 0.35-m optical research telescopes, and earth and atmospheric science instruments. PARI is also the home of APDA, a repository for astronomical photographic plate collections which will eventually be digitized and made available online. PARI has collaborated with visiting scientists who have developed their research with PARI telescopes and lab facilities. Current experiments include: the Dedicated Interferometer for Rapid Variability (Dennison et al. 2007, Astronomical and Astrophysical Transactions, 26, 557); the Plate Boundary Observatory operated by UNAVCO; the Clemson University Fabry-Perot interferometers (Meriwether 2008, Journal of Geophysical Research, submitted) measuring high-velocity winds and temperatures in the thermosphere; and the Western Carolina University-PARI variable star program. The current status of the education and research programs and instruments will be presented, and development plans will be reviewed. Development plans include the greening of PARI with the installation of solar panels to power the optical telescopes, a new distance learning center, and enhancements to the atmospheric and earth science suite of instrumentation.
Whole-School Reform. ERIC Digest, Number 124.
ERIC Educational Resources Information Center
McChesney, Jim
This Digest describes several programs designed to foster successful school reform, and examines the Comprehensive School Reform Demonstration (CSRD) Program, recently approved by Congress. Whole-school (or comprehensive) reform includes a cross-disciplinary set of nationwide and local programs, dedicated to the intellectual and personal nurturing…
Srikanthan, Amirrtha; Amir, Eitan; Warner, Ellen
2016-06-01
To assess whether a dedicated program for young breast cancer patients, including a nurse navigator, improves the frequency of: a) fertility discussion documentation and b) fertility preservation (FP) referrals. A retrospective chart review and prospective survey were undertaken of breast cancer patients diagnosed at age 40 or younger between 2011 and 2013 who received adjuvant or neo-adjuvant chemotherapy at two academic cancer centers in Toronto, Canada. The Odette Cancer Centre (OCC) has a dedicated program for young breast cancer patients, while Princess Margaret Cancer Centre (PM) does not. Patient demographics, tumor pathology, treatment, and fertility discussion documentation prior to systemic chemotherapy administration were extracted from patient records. Prospective surveys were administered to the same cohort to corroborate the data collected. Eighty-one patient charts were reviewed at each of OCC and PM. Forty-seven patients at OCC and 49 at PM returned surveys, for response rates of 58% and 60%, respectively. Chart reviews demonstrated no difference in the frequency of fertility discussion documentation (78% versus 75% for OCC and PM, p = 0.71); however, surveys demonstrated higher rates of recall of fertility discussion at OCC (96% versus 80%, p = 0.02). A greater proportion of women were offered FP referrals at OCC, as observed in chart reviews (56% versus 41%, p = 0.09) and surveys (73% versus 51%, p = 0.04). Time to initiation of chemotherapy did not differ between women who underwent FP and those who did not. A dedicated program for young breast cancer patients is associated with a higher frequency of FP referrals without delaying systemic therapy. Copyright © 2016 Elsevier Ltd. All rights reserved.
Weaving Culture and Core Content into FLEX Programs
ERIC Educational Resources Information Center
Schultz, Kennedy M.
2012-01-01
While immersion programs provide some of the greatest benefits to children learning a new language, many school systems have yet to dedicate the financial and personnel resources necessary to plan and implement such programs on a wide scale. In areas where immersion or formal FLES programming does not exist in the schools, often opportunities to…
Curbing Teen Dating Violence: Evidence from a School Prevention Program. RB-9194-CDC
ERIC Educational Resources Information Center
Jaycox, Lisa H.; McCaffrey, Daniel F.; Weidmer Ocampo, Beverly; Marshall, Grant N.; Collins, Rebecca L.; Hickman, Laura J.; Quigley, Denise D.
2006-01-01
This research brief summarizes a survey about the effectiveness of programs from Break the Cycle, a nonprofit organization dedicated to developing and fielding dating-violence prevention programs. The study evaluated "Ending Violence," a three-class-session prevention program. Developed by a Los Angeles-based nonprofit group called Break…
Slimani, Faiçal A A; Hamdi, Mahdjoub; Bentourkia, M'hamed
2018-05-01
Monte Carlo (MC) simulation is widely recognized as an important technique for studying the physics of particle interactions in nuclear medicine and radiation therapy. Different codes dedicated to dosimetry applications are widely used today in research and clinical settings, such as MCNP, EGSnrc, and Geant4. While such codes make the physics easier, however, the programming remains a tedious task even for physicists familiar with computer programming. In this paper we report the development of a new interface, GEANT4 Dose And Radiation Interactions (G4DARI), based on GEANT4 for absorbed dose calculation and for particle tracking in humans, small animals, and complex phantoms. The calculation of the absorbed dose is performed from 3D CT human or animal images in DICOM format, from images of phantoms, or from solid volumes which can be made from any pure or composite material specified by its molecular formula. G4DARI offers menus to the user and tabs to be filled with values or chemical formulas. The interface is described and, as an application, we show results obtained in a lung tumor in a digital mouse irradiated with seven energy beams and in a patient with glioblastoma irradiated with five photon beams. In conclusion, G4DARI can be easily used by any researcher without the need to be familiar with computer programming, and it will be freely available as an application package. Copyright © 2018 Elsevier Ltd. All rights reserved.
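The Monte Carlo principle these codes build on can be shown in miniature: photon free path lengths in a medium follow an exponential law governed by the linear attenuation coefficient. The coefficient and slab thickness below are illustrative numbers, not values from the paper.

```python
import numpy as np

# Minimal Monte Carlo illustration: sample photon interaction depths from
# the exponential attenuation law and estimate slab transmission.
# mu and the slab thickness are illustrative, not from any real material.

rng = np.random.default_rng(4)
mu = 0.2          # linear attenuation coefficient, 1/cm (illustrative)
n = 200_000

# Invert the exponential CDF: depth = -ln(u)/mu, u uniform in (0, 1]
depths = -np.log(1.0 - rng.random(n)) / mu

# Fraction of photons traversing a 5 cm slab without interacting;
# analytically this is exp(-mu * 5) = exp(-1)
transmitted = float((depths > 5.0).mean())
```

Full transport codes such as Geant4 repeat this sampling step for every interaction type and secondary particle, which is precisely the bookkeeping that interfaces like G4DARI aim to hide from the user.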
NASA Technical Reports Server (NTRS)
Rodgers, T. E.; Johnson, J. F.
1977-01-01
The logic and methodology for a preliminary grouping of Spacelab and mixed-cargo payloads is proposed in a form that can be readily coded into a computer program by NASA. The logic developed for this preliminary cargo grouping analysis is summarized. Principal input data include the NASA Payload Model, payload descriptive data, Orbiter and Spacelab capabilities, and NASA guidelines and constraints. The first step in the process is a launch interval selection in which the time interval for payload grouping is identified. Logic flow steps are then taken to group payloads and define flight configurations based on criteria that include dedication, volume, area, orbital parameters, pointing, g-level, mass, center of gravity, energy, power, and crew time.
NASA Technical Reports Server (NTRS)
Davarian, Faramaz (Editor)
1991-01-01
The NASA Propagation Experimenters Meeting (NAPEX), supported by the NASA Propagation Program, is convened annually to discuss studies made on radio wave propagation by investigators from domestic and international organizations. The meeting was organized into three technical sessions. The first session was dedicated to Olympus and ACTS studies and experiments, the second session was focused on the propagation studies and measurements, and the third session covered computer-based propagation model development. In total, sixteen technical papers and some informal contributions were presented. Following NAPEX 15, the Advanced Communications Technology Satellite (ACTS) miniworkshop was held on 29 Jun. 1991, to review ACTS propagation activities, with emphasis on ACTS hardware development and experiment planning. Five papers were presented.
Sandia Technology engineering and science accomplishments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This report briefly discusses the following research being conducted at Sandia Laboratories: Advanced Manufacturing -- Sandia technology helps keep US industry in the lead; Microelectronics -- Sandia's unique facilities transform research advances into manufacturable products; Energy -- Sandia's energy programs focus on strengthening industrial growth and political decisionmaking; Environment -- Sandia is a leader in environmentally conscious manufacturing and hazardous waste reduction; Health Care -- New biomedical technologies help reduce cost and improve quality of health care; Information & Computation -- Sandia aims to help make the information age a reality; Transportation -- This new initiative at the Labs will help improve transportation safety, efficiency, and economy; Nonproliferation -- Dismantlement and arms control are major areas of emphasis at Sandia; and Awards and Patents -- Talented, dedicated employees are the backbone of Sandia's success.
Results and current status of the NPARC alliance validation effort
NASA Technical Reports Server (NTRS)
Towne, Charles E.; Jones, Ralph R.
1996-01-01
The NPARC Alliance is a partnership between the NASA Lewis Research Center (LeRC) and the USAF Arnold Engineering Development Center (AEDC) dedicated to the establishment of a national CFD capability, centered on the NPARC Navier-Stokes computer program. The three main tasks of the Alliance are user support, code development, and validation. The present paper is a status report on the validation effort. It describes the validation approach being taken by the Alliance. Representative results are presented for laminar and turbulent flat plate boundary layers, a supersonic axisymmetric jet, and a glancing shock/turbulent boundary layer interaction. Cases scheduled to be run in the future are also listed. The archive of validation cases is described, including information on how to access it via the Internet.
WIWS: a protein structure bioinformatics Web service collection.
Hekkelman, M L; Te Beek, T A H; Pettifer, S R; Thorne, D; Attwood, T K; Vriend, G
2010-07-01
The WHAT IF molecular-modelling and drug design program is widely distributed in the world of protein structure bioinformatics. Although originally designed as an interactive application, its highly modular design and inbuilt control language have recently enabled its deployment as a collection of programmatically accessible web services. We report here a collection of WHAT IF-based protein structure bioinformatics web services: these relate to structure quality, the use of symmetry in crystal structures, structure correction and optimization, adding hydrogens and optimizing hydrogen bonds, and a series of geometric calculations. The freely accessible web services are based on the industry-standard WS-I profile and the EMBRACE technical guidelines, and are available via both REST and SOAP paradigms. The web services run on a dedicated computational cluster; their function and availability are monitored daily.
Project SUN (Students Understanding Nature)
NASA Technical Reports Server (NTRS)
Curley, T.; Yanow, G.
1995-01-01
Project SUN is part of NASA's 'Mission to Planet Earth' education outreach effort. It is based on the development of low-cost, scientifically accurate instrumentation and computer interfacing, coupled with Apple II computers as dedicated data loggers. The project comprises instruments, interfacing, software, curriculum, a detailed operating manual, and a system of training at the school sites.
NASA Technical Reports Server (NTRS)
1999-01-01
Outside of Building 4200 at Marshall Space Flight Center, a courtyard was constructed in memory of Dr. Wernher von Braun and his contributions to the U.S. space program. In the middle of the courtyard a fountain was built, which was made operational prior to the 30th anniversary celebration of the Apollo 11 lunar landing. Attending the dedication ceremony were visiting Apollo astronauts and NASA's Safety and Assurance Director Rothenberg.
Teacher Education Accreditation Council Brochure
ERIC Educational Resources Information Center
Teacher Education Accreditation Council, 2009
2009-01-01
The Teacher Education Accreditation Council (TEAC), founded in 1997, is dedicated to improving academic degree programs for professional educators--those who teach and lead in schools, pre-K through grade 12. TEAC accredits undergraduate and graduate programs, including alternate route programs, based on (1) the evidence they have that they…
NASA Astrophysics Data System (ADS)
La Riviere, P. J.; Pan, X.; Penney, B. C.
1998-06-01
Scintimammography, a nuclear-medicine imaging technique that relies on the preferential uptake of Tc-99m-sestamibi and other radionuclides in breast malignancies, has the potential to provide differentiation of mammographically suspicious lesions, as well as outright detection of malignancies in women with radiographically dense breasts. In this work we use the ideal-observer framework to quantify the detectability of a 1-cm lesion using three different imaging geometries: the planar technique that is the current clinical standard, conventional single-photon emission computed tomography (SPECT), in which the scintillation cameras rotate around the entire torso, and dedicated breast SPECT, in which the cameras rotate around the breast alone. We also introduce an adaptive smoothing technique for the processing of planar images and of sinograms that exploits Fourier transforms to achieve effective multidimensional smoothing at a reasonable computational cost. For the detection of a 1-cm lesion with a clinically typical 6:1 tumor-background ratio, we find ideal-observer signal-to-noise ratios (SNR) that suggest that the dedicated breast SPECT geometry is the most effective of the three, and that the adaptive, two-dimensional smoothing technique should enhance lesion detectability in the tomographic reconstructions.
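The adaptive details of the smoothing technique are specific to the paper, but its Fourier-domain core can be sketched as a frequency-space filter applied to a noisy planar image. The image, Poisson noise model, and Gaussian cutoff below are illustrative choices, not the authors' adaptive scheme.

```python
import numpy as np

# Sketch of Fourier-domain smoothing of a noisy planar image: a plain
# Gaussian low-pass filter applied in frequency space. The paper's method
# chooses its smoothing adaptively, which is not reproduced here.

rng = np.random.default_rng(1)
img = np.zeros((64, 64))
img[24:40, 24:40] = 6.0                        # "lesion" on flat background
noisy = rng.poisson(img + 10.0).astype(float)  # Poisson counting noise

# Gaussian transfer function on the FFT frequency grid (cutoff illustrative)
fy = np.fft.fftfreq(64)[:, None]
fx = np.fft.fftfreq(64)[None, :]
H = np.exp(-(fx**2 + fy**2) / (2 * 0.05**2))

smoothed = np.real(np.fft.ifft2(np.fft.fft2(noisy) * H))
```

Because the transfer function is unity at zero frequency, the total counts (and hence the mean) are preserved while high-frequency noise is suppressed — the property that matters when the filtered image feeds a quantitative detectability calculation.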
A microbased shared virtual world prototype
NASA Technical Reports Server (NTRS)
Pitts, Gerald; Robinson, Mark; Strange, Steve
1993-01-01
Virtual reality (VR) allows sensory immersion and interaction with a computer-generated environment. The user adopts a physical interface with the computer, through input/output devices such as a head-mounted display, data glove, mouse, keyboard, or monitor, to experience an alternate universe. What this means is that the computer generates an environment which, in its ultimate extension, becomes indistinguishable from the real world. 'Imagine a wraparound television with three-dimensional programs, including three-dimensional sound, and solid objects that you can pick up and manipulate, even feel with your fingers and hands.... 'Imagine that you are the creator as well as the consumer of your artificial experience, with the power to use a gesture or word to remold the world you see and hear and feel. That part is not fiction... three-dimensional computer graphics, input/output devices, computer models that constitute a VR system make it possible, today, to immerse yourself in an artificial world and to reach in and reshape it.' The goal of our research was to propose a feasibility experiment in the construction of a networked virtual reality system, making use of current personal computer (PC) technology. The prototype was built using the Borland C compiler, running on an IBM 486 33 MHz and a 386 33 MHz. Each game is currently represented as an IPX client on a non-dedicated Novell server. We initially posed two questions: (1) Is there a need for networked virtual reality? (2) In what ways can the technology be made available to the most people possible?
Cosmological N-body Simulation
NASA Astrophysics Data System (ADS)
Lake, George
1994-05-01
The "N" in N-body calculations has doubled every year for the last two decades. To continue this trend, the UW N-body group is working on algorithms for the fast evaluation of gravitational forces on parallel computers and establishing rigorous standards for the computations. In these algorithms, the computational cost per time step is ~10^3 pairwise forces per particle. A new adaptive time integrator enables us to perform high quality integrations that are fully temporally and spatially adaptive. SPH (smoothed particle hydrodynamics) will be added to simulate the effects of dissipating gas and magnetic fields. The importance of these calculations is two-fold. First, they determine the nonlinear consequences of theories for the structure of the Universe. Second, they are essential for the interpretation of observations. Every galaxy has six coordinates of velocity and position. Observations determine two sky coordinates and a line-of-sight velocity that bundles universal expansion (distance) together with a random velocity created by the mass distribution. Simulations are needed to determine the underlying structure and masses. The importance of simulations has moved from ex post facto explanation to an integral part of planning large observational programs. I will show why high quality simulations with "large N" are essential to accomplish our scientific goals. This year, our simulations have N > ~10^7. This is sufficient to tackle some niche problems, but well short of our 5-year goal: simulating the Sloan Digital Sky Survey using a few billion particles (a Teraflop-year simulation). Extrapolating past trends, we would have to "wait" 7 years for this hundred-fold improvement. Like past gains, significant changes in the computational methods are required for these advances. I will describe new algorithms, algorithmic hacks, and a dedicated computer to perform billion-particle simulations. Finally, I will describe research that can be enabled by Petaflop computers. This research is supported by the NASA HPCC/ESS program.
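The pairwise-force evaluation whose cost dominates such simulations can be sketched directly. The snippet below computes softened Newtonian accelerations by direct summation (an O(N^2) operation); the particle count, softening length, and units (G = 1) are illustrative, and tree codes exist precisely to cut this sum down to the ~10^3 interactions per particle quoted in the abstract.

```python
import numpy as np

# Direct-summation gravitational accelerations with Plummer-style
# softening, in units where G = 1. Illustrative sketch only.

def accelerations(pos, mass, soft=1e-2):
    """Softened gravitational acceleration on every particle."""
    # pairwise separation vectors r_j - r_i, shape (N, N, 3)
    dr = pos[None, :, :] - pos[:, None, :]
    r2 = (dr**2).sum(axis=-1) + soft**2
    np.fill_diagonal(r2, np.inf)          # exclude self-interaction
    inv_r3 = r2 ** -1.5
    # a_i = sum_j m_j (r_j - r_i) / (|r_ij|^2 + soft^2)^(3/2)
    return (dr * (mass[None, :, None] * inv_r3[:, :, None])).sum(axis=1)

rng = np.random.default_rng(2)
N = 64
pos = rng.normal(size=(N, 3))
mass = np.full(N, 1.0 / N)
acc = accelerations(pos, mass)
```

By Newton's third law the pairwise terms cancel in opposite pairs, so the total momentum change summed over all particles is zero — a standard sanity check for any force kernel.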
Perennial plants for biofuel production: bridging genomics and field research.
Alves, Alexandre Alonso; Laviola, Bruno G; Formighieri, Eduardo F; Carels, Nicolas
2015-04-01
Development of dedicated perennial crops has been indicated as a strategic action to meet the growing demand for biofuels. Breeding of perennial crops, however, is often time- and resource-consuming. As genomics offers a platform from which to learn more about the relationships of genes and phenotypes, its operational use in the context of breeding programs, through strategies such as genomic selection, promises to foster the development of perennial crops dedicated to biodiesel production by increasing the efficiency of breeding programs and by shortening the length of the breeding cycles. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Strengthening moral reasoning through dedicated ethics training in dietetic preparatory programs.
Hewko, Sarah J; Cooper, Sarah L; Cummings, Greta G
2015-01-01
Moral reasoning skills, associated with the ability to make ethical decisions effectively, must be purposively fostered. Among health professionals, enhanced moral reasoning is linked to superior clinical performance. Research demonstrates that moral reasoning is enhanced through dedicated, discussion-based ethics education offered over a period of 3-12 weeks. Current dietetic students and practicing dietitians seeking to strengthen their moral reasoning skills can undertake elective ethics education. Further research within dietetic preparatory programs is warranted to better inform the development and implementation of ethics courses. Copyright © 2015 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.
Road weather management program performance metrics : implementation and assessment.
DOT National Transportation Integrated Search
2009-08-31
Since the late 1990s, the U.S. Department of Transportation (USDOT), Federal Highway Administration (FHWA) has managed a program dedicated to improving the safety, mobility and productivity of the nation's surface transportation modes by integra...
NASA Technical Reports Server (NTRS)
Rummel, J. A.
1982-01-01
The Mission Science Requirements Document (MSRD) for the First Dedicated Life Sciences Mission (LS-1) represents the culmination of thousands of hours of experiment selection and science requirement definition activities. NASA life sciences has never before attempted to integrate, both scientifically and operationally, a single mission dedicated to life sciences research, and the complexity of the planning required for such an endeavor should be apparent. This set of requirements completes the first phase of a continual process which will attempt to optimize (within available programmatic and mission resources) the science accomplished on this mission.
ERIC Educational Resources Information Center
Rursch, Julie A.; Luse, Andy; Jacobson, Doug
2010-01-01
The IT-Adventures program is dedicated to increasing interest in and awareness of information technology among high school students using inquiry-based learning focused on three content areas: cyber defense, game design programming, and robotics. The program combines secondary, post-secondary, and industry partnerships in educational programming,…
TEAC's Accreditation Process at a Glance, 2009-2011
ERIC Educational Resources Information Center
Teacher Education Accreditation Council, 2011
2011-01-01
The Teacher Education Accreditation Council (TEAC), founded in 1997, is dedicated to improving academic degree programs for professional educators--those who teach and lead in schools, pre-K through grade 12. TEAC accredits undergraduate and graduate programs, including alternate route programs, based on (1) the evidence they have that they…
Implementation and Testing of VLBI Software Correlation at the USNO
NASA Technical Reports Server (NTRS)
Fey, Alan; Ojha, Roopesh; Boboltz, Dave; Geiger, Nicole; Kingham, Kerry; Hall, David; Gaume, Ralph; Johnston, Ken
2010-01-01
The Washington Correlator (WACO) at the U.S. Naval Observatory (USNO) is a dedicated VLBI processor built on special-purpose hardware of ASIC design. The WACO is currently over 10 years old and is nearing the end of its expected lifetime. Plans for implementation and testing of software correlation at the USNO are currently being considered. The VLBI correlation process is, by its very nature, well suited to a parallelized computing environment. Commercial off-the-shelf computer hardware has advanced in processing power to the point where software correlation is now both economically and technologically feasible. The advantages of software correlation are manifold, and include flexibility, scalability, and easy adaptability to changing environments and requirements. We discuss our experience with and plans for the use of software correlation at USNO, with emphasis on the use of the DiFX software correlator.
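The core of an FX-style software correlator of the kind DiFX implements can be sketched in a few lines: channelize each station's voltage stream with an FFT ("F"), then cross-multiply and accumulate ("X"). The synthetic signals, block sizes, and delay below are invented for illustration; a real correlator also handles fringe rotation, multiple subbands, and station clock models, none of which appear here.

```python
import numpy as np

# Minimal FX correlation sketch: FFT each block of two station streams,
# accumulate the cross-spectrum, and recover the inter-station delay
# from the peak of the lag spectrum. All signals are synthetic.

rng = np.random.default_rng(3)
nfft, nblocks, delay = 256, 64, 5

common = rng.normal(size=nfft * nblocks + delay)   # sky signal seen by both
s1 = common[delay:] + 0.1 * rng.normal(size=nfft * nblocks)
s2 = common[:-delay] + 0.1 * rng.normal(size=nfft * nblocks)  # delayed copy

cross = np.zeros(nfft, dtype=complex)
for k in range(nblocks):
    seg1 = np.fft.fft(s1[k * nfft:(k + 1) * nfft])   # "F" step
    seg2 = np.fft.fft(s2[k * nfft:(k + 1) * nfft])
    cross += np.conj(seg1) * seg2                    # "X" step: accumulate

# The lag spectrum peaks at the geometric delay between the two stations
lags = np.abs(np.fft.ifft(cross))
est_delay = int(np.argmax(lags))
```

Each block is an independent FFT-multiply-accumulate, which is why the workload parallelizes so naturally across commodity cluster nodes.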
Provenance-aware optimization of workload for distributed data production
NASA Astrophysics Data System (ADS)
Makatun, Dzmitry; Lauret, Jérôme; Rudová, Hana; Šumbera, Michal
2017-10-01
Distributed data processing in High Energy and Nuclear Physics (HENP) is a prominent example of big data analysis. With petabytes of data being processed at tens of computational sites with thousands of CPUs, standard job scheduling approaches either do not address the problem's complexity well or are dedicated to one specific aspect of the problem only (CPU, network, or storage). Previously we developed a new job scheduling approach dedicated to distributed data production, an essential part of data processing in HENP (preprocessing in big data terminology). In this contribution, we discuss load balancing with multiple data sources and data replication, present recent improvements made to our planner, and provide results of simulations which demonstrate the advantage over standard scheduling policies for the new use case. Multiple data sources (provenance) are common in the computing models of many applications, where the data may be copied to several destinations. The initial input data set would hence already be partially replicated to multiple locations, and the task of the scheduler is to maximize overall computational throughput considering possible data movements and CPU allocation. The studies have shown that our approach can provide a significant gain in overall computational performance in a wide scope of simulations considering realistic sizes of a computational Grid and various input data distributions.
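The trade-off the planner navigates — use a site that already holds a replica of the input, or pay a transfer cost to reach idle CPUs — can be illustrated with a deliberately tiny greedy placement sketch. The site names, bandwidths, slot counts, and cost model below are all hypothetical and much simpler than the paper's planner.

```python
# Hypothetical sketch of locality-aware job placement: each job's input
# may already be replicated at some sites; a greedy rule weighs transfer
# time against CPU queueing. All numbers are illustrative.

SITES = {
    # site: [free CPU slots, inbound bandwidth in MB/s]
    "siteA": [4, 100.0],
    "siteB": [2, 1000.0],
}
REPLICAS = {"job1": {"siteA"}, "job2": {"siteA", "siteB"}, "job3": set()}
INPUT_MB = {"job1": 5000.0, "job2": 5000.0, "job3": 5000.0}
CPU_SECONDS = 3600.0

def placement_cost(job, site):
    """Estimated completion cost: input transfer plus a queueing penalty."""
    slots, bw = SITES[site]
    transfer = 0.0 if site in REPLICAS[job] else INPUT_MB[job] / bw
    queue_penalty = float("inf") if slots == 0 else CPU_SECONDS / slots
    return transfer + queue_penalty

def schedule(jobs):
    plan = {}
    for job in jobs:
        site = min(SITES, key=lambda s: placement_cost(job, s))
        plan[job] = site
        SITES[site][0] = max(SITES[site][0] - 1, 0)   # consume a slot
    return plan

plan = schedule(["job1", "job2", "job3"])
```

Under these numbers, jobs with a local replica land where their data already sit, while the job with no replica is sent to the site whose fast link makes the transfer cheap — the qualitative behavior the planner optimizes globally rather than greedily.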
DOT National Transportation Integrated Search
2014-03-01
The Planning Assistance for Rural Areas (PARA) program, sponsored by the Arizona Department of Transportation's (ADOT) Multimodal Planning Division (MPD), dedicates funding and staff to conduct multimodal transportation planning studies for loc...
First in-flight results of Pleiades 1A innovative methods for optical calibration
NASA Astrophysics Data System (ADS)
Kubik, Philippe; Lebègue, Laurent; Fourest, Sébastien; Delvit, Jean-Marc; de Lussy, Françoise; Greslou, Daniel; Blanchet, Gwendoline
2017-11-01
The PLEIADES program is a space Earth Observation system led by France under the leadership of the French Space Agency (CNES). Since it was successfully launched on December 17th, 2011, the Pleiades 1A high resolution optical satellite has been thoroughly tested and validated during the commissioning phase led by CNES. The whole system has been designed to deliver submetric optical images to users whose needs were taken into account very early in the design process. This satellite opens a new era in Europe, since its off-nadir viewing capability delivers worldwide 2-day access, and its great agility makes it possible to image numerous targets, strips, and stereo coverage from the same orbit. Its imaging capability of more than 450 images of 20 km x 20 km per day can fulfill a broad spectrum of applications for both civilian and defence users. For an earth observing satellite with no on-board calibration source, the commissioning phase is a critical quest for well-characterized earth landscapes and ground patterns that have to be imaged by the camera in order to compute or fit the parameters of the viewing models. It may take a long time to get the required scenes with no cloud, whilst atmosphere corrections need simultaneous measurements that are not always possible. The paper focuses on new in-flight calibration methods that were prepared before the launch in the framework of the PLEIADES program: they take advantage of the satellite agility, which can deeply relax the operational constraints and may improve calibration accuracy. Many performance characteristics of the camera were assessed thanks to a dedicated innovative method that was successfully validated during the commissioning period: Modulation Transfer Function (MTF), refocusing, absolute calibration, and line-of-sight stability were estimated on stars and on the Moon. Detector normalization and radiometric noise were computed from specific pictures of the Earth with a dedicated guidance profile. The geometric viewing frame was determined with a particular image acquisition combining different views of the same target. All these new methods are expected to play a key role in the future, when active optics will need a sophisticated in-flight calibration strategy.
Probabilistic simulation of concurrent engineering of propulsion systems
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Singhal, S. N.
1993-01-01
Technology readiness and the available infrastructure are assessed for timely computational simulation of concurrent engineering for propulsion systems. Results for initial coupled multidisciplinary, fabrication-process, and system simulators are presented, including uncertainties inherent in various facets of engineering processes. An approach is outlined for computationally formalizing the concurrent engineering process from cradle to grave via discipline-dedicated workstations linked with a common database.
Peregrine System Configuration | High-Performance Computing | NREL
Compute nodes and storage are connected by a high-speed InfiniBand network. Compute nodes are diskless; home directories are mounted on all nodes, along with a file system dedicated to shared projects. Nodes have processors with 64 GB of memory.
Fault Tolerance for VLSI Multicomputers
1985-08-01
that consists of hundreds or thousands of VLSI computation nodes interconnected by dedicated links. Some important applications of high-end computers ... technology, and intended applications. A proposed fault-tolerance scheme combines hardware that performs error detection and system-level protocols for ... order to recover from the error and resume correct operation, a valid system state must be restored. A low-overhead, application-transparent error
Expert overseer for mass spectrometer system
Filby, Evan E.; Rankin, Richard A.
1991-01-01
An expert overseer for the operation and real-time management of a mass spectrometer and associated laboratory equipment. The overseer is a computer-based expert diagnostic system implemented on a computer separate from the dedicated computer used to control the mass spectrometer and produce the analysis results. An interface links the overseer to components of the mass spectrometer, components of the laboratory support system, and the dedicated control computer. Periodically, the overseer polls these devices as well as itself. These data are fed into the expert portion of the system for real-time evaluation. The knowledge base used for the evaluation includes both heuristic rules and precise operating parameters. The overseer also compares current readings to a long-term database to detect any developing trends, using a combination of statistical and heuristic rules to evaluate the results. The overseer can alert lab personnel whenever questionable readings or trends are observed, provide a background review of the problem, and suggest root causes and potential solutions, or appropriate additional tests that could be performed. The overseer can change the sequence or frequency of the polling to respond to an observation in the current data.
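The polling-and-evaluation cycle described above can be sketched in miniature: poll readings, apply heuristic limit rules, and flag a developing trend by fitting a slope to recent history. The channel names, limits, and rules below are invented for illustration; the actual overseer combines a far richer knowledge base with statistical tests.

```python
# Toy overseer cycle: limit checks plus trend detection.
# Channel names and limits are hypothetical, for illustration only.
LIMITS = {"ion_gauge_torr": (1e-9, 1e-6), "magnet_temp_c": (10.0, 45.0)}

def check_limits(reading):
    """Apply simple heuristic limit rules to one polled reading."""
    alerts = []
    for channel, value in reading.items():
        lo, hi = LIMITS[channel]
        if not lo <= value <= hi:
            alerts.append(f"{channel} out of range: {value}")
    return alerts

def trend_slope(history):
    """Least-squares slope of a channel's recent readings (per poll)."""
    n = len(history)
    mx, my = (n - 1) / 2, sum(history) / n
    num = sum((x - mx) * (y - my) for x, y in enumerate(history))
    den = sum((x - mx) ** 2 for x in range(n))
    return num / den

# One polling cycle: the gauge reading violates its limit, and the
# magnet temperature history shows a steadily rising trend.
alerts = check_limits({"ion_gauge_torr": 5e-6, "magnet_temp_c": 30.0})
slope = trend_slope([30.0, 30.5, 31.1, 31.6, 32.0])
```

A real system would attach suggested root causes to each rule and escalate the polling frequency when a trend is detected, as the abstract describes.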
A Dedicated Computational Platform for Cellular Monte Carlo T-CAD Software Tools
2015-07-14
computer that establishes an encrypted Virtual Private Network (OpenVPN [44]) based on the Secure Socket Layer (SSL) paradigm. Each user is given a ... security certificate for each device used to connect to the computing nodes. Stable OpenVPN clients are available for Linux, Microsoft Windows, Apple OSX ... platform is granted by an encrypted connection based on the Secure Socket Layer (SSL) protocol, and implemented in the OpenVPN Virtual Private Network
An Analysis of Cloud Computing with Amazon Web Services for the Atmospheric Science Data Center
NASA Astrophysics Data System (ADS)
Gleason, J. L.; Little, M. M.
2013-12-01
NASA science and engineering efforts rely heavily on compute and data handling systems. The nature of NASA science data is such that it is not restricted to NASA users; instead, it is widely shared across a globally distributed user community including scientists, educators, policy makers, and the public. NASA science computing is therefore a candidate use case for cloud computing, where compute resources are outsourced to an external vendor. Amazon Web Services (AWS) is a commercial cloud computing service developed to use excess computing capacity at Amazon, and it potentially provides an alternative to costly and potentially underutilized dedicated acquisitions whenever NASA scientists or engineers require additional data processing. AWS aims to provide a simplified avenue for NASA scientists and researchers to share large, complex data sets with external partners and the public. AWS has been used extensively by JPL for a wide range of computing needs and was previously tested on a NASA Agency basis during the Nebula testing program. Its ability to support the needs of the Langley Science Directorate must still be evaluated by integrating it with real-world operational workloads across NASA and building the associated maturity. The strengths and weaknesses of this architecture and its ability to support general science and engineering applications were demonstrated during the previous testing. The Langley Office of the Chief Information Officer, in partnership with the Atmospheric Science Data Center (ASDC), has established a pilot business interface to utilize AWS cloud computing resources on an organization- and project-level pay-per-use model. This poster discusses an effort to evaluate the feasibility of the pilot business interface from a project-level perspective, specifically using a processing scenario involving the Clouds and the Earth's Radiant Energy System (CERES) project.
1982-10-01
spent in preparing this document. EXECUTIVE SUMMARY: The O'Hare Runway Configuration Management System (CMS) is an interactive multi-user computer ... MITRE Washington's Computer Center. Currently, CMS is housed in an IBM 4341 computer with the VM/SP operating system. CMS employs IBM's Display ... At O'Hare, it will operate on a dedicated mini-computer which permits multi-tasking (that is, multiple users
LRC-Katherine-Johnson-interview-2017-0914
2017-09-14
Sept. 14, 2017: An interview with Katherine Johnson discussing her career and her reaction to the dedication of the Katherine G. Johnson Computational Research Facility at NASA's Langley Research Center in Hampton, Va., in her honor.
76 FR 71861 - America Recycles Day, 2011
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-18
... families have advanced the common good of our Nation by recycling regularly and promoting conservation... then, we have bolstered recycling programs through individual action, community engagement, and... today, we must update and expand existing recycling programs and dedicate ourselves to devising new...
Simulation requirements for the Large Deployable Reflector (LDR)
NASA Technical Reports Server (NTRS)
Soosaar, K.
1984-01-01
Simulation tools for the Large Deployable Reflector (LDR) are discussed. These tools are often of the transfer-function variety. However, transfer functions are inadequate to represent time-varying systems with multiple control systems of overlapping bandwidths and multi-input, multi-output features. Frequency-domain approaches are useful design tools, but a full-up simulation is needed. Because a dedicated computer would be needed for the high-frequency, multi-degree-of-freedom components encountered, non-real-time simulation is preferred. Large numerical-analysis software programs are useful only to receive inputs and provide output to the next block, and should be kept out of the direct simulation loop. The following blocks make up the simulation. The thermal-model block is a classical, non-steady-state heat transfer program. The quasistatic block deals with problems associated with rigid-body control of reflector segments. The steady-state block assembles data into equations of motion and dynamics. A differential ray trace is obtained to establish the change in wave aberrations. The observation scene is described. The focal-plane module converts the photon intensity impinging on it into electron streams or into permanent film records.
IEEE International Symposium on Biomedical Imaging.
2017-01-01
The IEEE International Symposium on Biomedical Imaging (ISBI) is a scientific conference dedicated to mathematical, algorithmic, and computational aspects of biological and biomedical imaging, across all scales of observation. It fosters knowledge transfer among different imaging communities and contributes to an integrative approach to biomedical imaging. ISBI is a joint initiative of the IEEE Signal Processing Society (SPS) and the IEEE Engineering in Medicine and Biology Society (EMBS). The 2018 meeting will include tutorials and a scientific program composed of plenary talks, invited special sessions, challenges, and oral and poster presentations of peer-reviewed papers. High-quality papers containing original contributions to the topics of interest are solicited, including image formation and reconstruction; computational and statistical image processing and analysis; dynamic imaging; visualization; image quality assessment; and physical, biological, and statistical modeling. Accepted 4-page regular papers will appear in the symposium proceedings, published by IEEE and included in IEEE Xplore. To encourage attendance by a broader audience of imaging scientists and to offer additional presentation opportunities, ISBI 2018 will continue to have a second track featuring posters selected from 1-page abstract submissions without subsequent archival publication.
The Theoretical Astrophysical Observatory: Cloud-based Mock Galaxy Catalogs
NASA Astrophysics Data System (ADS)
Bernyk, Maksym; Croton, Darren J.; Tonini, Chiara; Hodkinson, Luke; Hassan, Amr H.; Garel, Thibault; Duffy, Alan R.; Mutch, Simon J.; Poole, Gregory B.; Hegarty, Sarah
2016-03-01
We introduce the Theoretical Astrophysical Observatory (TAO), an online virtual laboratory that houses mock observations of galaxy survey data. Such mocks have become an integral part of the modern analysis pipeline. However, building them requires expert knowledge of galaxy modeling and simulation techniques, significant investment in software development, and access to high performance computing. These requirements make it difficult for a small research team or individual to quickly build a mock catalog suited to their needs. To address this, TAO offers access to multiple cosmological simulations and semi-analytic galaxy formation models from an intuitive and clean web interface. Results can be funnelled through science modules and sent to a dedicated supercomputer for further processing and manipulation. These modules include the ability to (1) construct custom observer light cones from the simulation data cubes; (2) generate the stellar emission from star formation histories, apply dust extinction, and compute absolute and/or apparent magnitudes; and (3) produce mock images of the sky. All of TAO's features can be accessed without any programming requirements. The modular nature of TAO opens it up for further expansion in the future.
Optics derotator servo control system for SONG Telescope
NASA Astrophysics Data System (ADS)
Xu, Jin; Ren, Changzhi; Ye, Yu
2012-09-01
The Stellar Oscillations Network Group (SONG) is an initiative that aims at designing and building a ground-based network of 1 m telescopes dedicated to the study of phenomena occurring in the time domain. The Chinese standard node of SONG is a 1 m diameter Alt-Az telescope at F/37. The optics derotator control system of the SONG telescope adopts the development model of "Industrial Computer + UMAC Motion Controller + Servo Motor". The industrial computer is the core processing part of the motion control; the motion control card (UMAC) is in charge of the details of the motion control; and the servo amplifier accepts control commands from UMAC and drives the servo motor. Position feedback comes from the encoder, forming a closed-loop control system. This paper describes in detail the hardware and software design of the optics derotator servo control system. In terms of hardware design, the principle, structure, and control algorithm of the derotator-based servo system are analyzed and explored. In terms of software design, the paper proposes a system software architecture based on object-oriented programming.
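The closed loop described above (a controller computes a command from encoder feedback, the amplifier drives the motor, and the encoder closes the loop) can be sketched as a minimal discrete PID position loop. The gains and the first-order motor model below are illustrative assumptions, not parameters of the actual SONG derotator servo.

```python
# Minimal discrete PID position loop driving a toy first-order motor model.
# All gains and the plant dynamics are illustrative assumptions.
def simulate_pid(target, steps=3000, dt=0.001, kp=40.0, ki=5.0, kd=4.0):
    """Drive a toy motor toward `target` position; return final position."""
    pos, vel = 0.0, 0.0
    integral, prev_err = 0.0, target
    for _ in range(steps):
        err = target - pos                 # "encoder" feedback
        integral += err * dt
        deriv = (err - prev_err) / dt
        prev_err = err
        u = kp * err + ki * integral + kd * deriv   # controller command
        vel += (u - 2.0 * vel) * dt        # first-order motor dynamics (toy)
        pos += vel * dt                    # integrate position
    return pos

final = simulate_pid(1.0)                  # settles near the 1.0 rad target
```

In the real system this loop is split across the UMAC controller (which runs the servo algorithm) and the amplifier/motor hardware; the sketch collapses all of that into one simulated process.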
NASA Astrophysics Data System (ADS)
Chen, Xiuhong; Huang, Xianglei; Jiao, Chaoyi; Flanner, Mark G.; Raeker, Todd; Palen, Brock
2017-01-01
The suites of numerical models used for simulating the climate of our planet are usually run on dedicated high-performance computing (HPC) resources. This study investigates an alternative to the usual approach, i.e. carrying out climate model simulations in a commercially available cloud computing environment. We test the performance and reliability of running the CESM (Community Earth System Model), a flagship climate model in the United States developed by the National Center for Atmospheric Research (NCAR), on Amazon Web Services (AWS) EC2, the cloud computing environment by Amazon.com, Inc. StarCluster is used to create a virtual computing cluster on the AWS EC2 for the CESM simulations. The wall-clock time for one year of CESM simulation on the AWS EC2 virtual cluster is comparable to the time spent for the same simulation on a local dedicated high-performance computing cluster with InfiniBand connections. The CESM simulation can be efficiently scaled with the number of CPU cores on the AWS EC2 virtual cluster environment up to 64 cores. For the standard configuration of the CESM at a spatial resolution of 1.9° latitude by 2.5° longitude, increasing the number of cores from 16 to 64 reduces the wall-clock running time by more than 50% and the scaling is nearly linear. Beyond 64 cores, the communication latency starts to outweigh the benefit of distributed computing and the parallel speedup levels off.
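The scaling figures quoted above reduce to simple speedup and parallel-efficiency arithmetic. The wall-clock times below are made-up illustrative numbers (consistent with a more-than-50% reduction from 16 to 64 cores), not measurements from the CESM/EC2 experiment.

```python
# Speedup and parallel efficiency from wall-clock times.
def speedup(t_base, t_scaled):
    """How many times faster the scaled run is than the base run."""
    return t_base / t_scaled

def efficiency(t_base, cores_base, t_scaled, cores_scaled):
    """Fraction of ideal (linear) scaling achieved by the scaled run."""
    ideal = cores_scaled / cores_base
    return speedup(t_base, t_scaled) / ideal

# Hypothetical wall-clock hours for one simulated model year:
t16, t64 = 10.0, 4.0                  # a 60% reduction, i.e. ">50%"
s = speedup(t16, t64)                 # 2.5x faster on 4x the cores
e = efficiency(t16, 16, t64, 64)      # 0.625 of perfectly linear scaling
```

An efficiency well below 1 at higher core counts is exactly the plateau the abstract reports: past 64 cores, communication latency eats the remaining gains.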
The influence of computer-generated path on the robot’s effector stability of motion
NASA Astrophysics Data System (ADS)
Foit, K.; Banaś, W.; Gwiazda, A.; Ćwikła, G.
2017-08-01
The off-line trajectory planning is often carried out for economic and practical reasons: the robot is not excluded from the production process, and the operator can benefit from testing programs in a virtual environment. On the other hand, dedicated off-line programming and simulation software is often limited in features and is intended only to roughly check the program. It should be expected that the arm of the real robot's manipulator will realize the trajectory in a different manner: the acceleration and deceleration phases may trigger vibrations of the kinematic chain that could affect the precision of effector positioning and degrade the quality of the process realized by the robot. The purpose of this work is the analysis of selected cases in which the robot's effector has been moved along a programmed path. The off-line generated test trajectories have different arrangements of points: this approach allowed evaluating the time needed to complete each of the tasks, as well as measuring the level of vibration of the robot's wrist. All tests were performed without load. The conclusions of the experiment may be useful during trajectory planning in order to avoid critical configurations of points.
The Robotic Hugo E. Schwarz Telescope | CTIO
... of a new electronic drive system for the mount, and the second, dedicated to re-designing the dome
Tamarisk coalition - native riparian plant materials program
Stacy Kolegas
2012-01-01
The Tamarisk Coalition (TC), a nonprofit organization dedicated to riparian restoration in the western United States, has created a Native Plant Materials Program to address the identified need for native riparian plant species for use in revegetation efforts on the Colorado Plateau. The specific components of the Native Plant Materials Program include: 1) provide seed...
The Role of Teachers' Guided Reflection in Effecting Positive Program Change.
ERIC Educational Resources Information Center
Vogt, Lynn Allington; Au, Kathryn H. P.
Kamehameha Elementary Education Program (KEEP), in Hawaii, and Rough Rock (which serves Navajo students in Arizona) are dedicated to strengthening the school success of students who have not thrived in traditional mainstream school settings. Both programs have rooted change efforts in the belief that students would experience improved school…
Clinical Investigator Development Program | Center for Cancer Research
Clinical Investigator Development Program Application Deadline: September 30, 2018 Program Starts: July 1, 2019 The NCI Center for Cancer Research (CCR) is pleased to announce our annual call for applications for an exciting training opportunity intended for physicians interested in dedicating their careers to clinical research. Come join a vibrant, multidisciplinary research
Performances and first science results with the VEGA/CHARA visible instrument
NASA Astrophysics Data System (ADS)
Mourard, D.; Tallon, M.; Bério, Ph.; Bonneau, D.; Chesneau, O.; Clausse, J. M.; Delaa, O.; Nardetto, N.; Perraut, K.; Spang, A.; Stee, Ph.; Tallon-Bosc, I.; McAlister, H.; ten Brummelaar, T.; Sturmann, J.; Sturmann, L.; Turner, N.; Farrington, C.; Goldfinger, P. J.
2010-07-01
This paper presents the current status of the VEGA (Visible spEctroGraph and polArimeter) instrument installed at the coherent focus of the CHARA Array, Mount Wilson, CA. Installed in September 2007, the instrument began its first science programs in summer 2008, and the first science results are now published. Dedicated to high angular resolution (0.3 mas) and high spectral resolution (R = 30000) astrophysical studies, VEGA's main objectives are the study of circumstellar environments of hot active stars and interacting binary systems, along with a large palette of new programs dedicated to fundamental stellar parameters. We present successively the main characteristics of the instrument and its current performance in the CHARA environment, a short summary of two science programs, and finally some studies showing the potential and difficulties of the three-telescope mode of VEGA/CHARA.
ERIC Educational Resources Information Center
Alotaibi, Khalid Abdullah
2014-01-01
The success of civilization is determined by the excellent education system and excellent education program. It is believed that the aims of producing outstanding and dedicated students are determined by the quality of education program. Students chose the education program based on many criteria, like job guarantee, reputation of the university,…
Glendive Migrant Program. Dedicated to Meeting the Needs of Migrant Children and Their Families.
ERIC Educational Resources Information Center
Trangmoe, John
The Glendive Migrant Program, a 1989 exemplary Chapter 1 program, is a 5-week summer project serving the children of migrant families working in a 60-mile area along the Yellowstone River valley, Montana. The program serves approximately 110 students, ages 1-18. Instructors, supervisors, and aides work with nursery, preschool, and elementary-age…
Damaris: Addressing performance variability in data management for post-petascale simulations
Dorier, Matthieu; Antoniu, Gabriel; Cappello, Franck; ...
2016-10-01
With exascale computing on the horizon, reducing performance variability in data management tasks (storage, visualization, analysis, etc.) is becoming a key challenge in sustaining high performance. This variability significantly impacts overall application performance at scale and its predictability over time. In this article, we present Damaris, a system that leverages dedicated cores in multicore nodes to offload data management tasks, including I/O, data compression, scheduling of data movements, in situ analysis, and visualization. We evaluate Damaris with the CM1 atmospheric simulation and the Nek5000 computational fluid dynamics simulation on four platforms, including NICS's Kraken and NCSA's Blue Waters. Our results show that (1) Damaris fully hides the I/O variability as well as all I/O-related costs, thus making simulation performance predictable; (2) it increases the sustained write throughput by a factor of up to 15 compared with standard I/O approaches; (3) it allows almost perfect scalability of the simulation up to over 9,000 cores, as opposed to state-of-the-art approaches that fail to scale; and (4) it enables a seamless connection to the VisIt visualization software to perform in situ analysis and visualization in a way that impacts neither the performance of the simulation nor its variability. In addition, we extended our implementation of Damaris to also support the use of dedicated nodes and conducted a thorough comparison of the two approaches (dedicated cores and dedicated nodes) for I/O tasks with the aforementioned applications.
Aletraris, Lydia; Roman, Paul M
2015-10-01
The provision of HIV education and testing in substance use disorder (SUD) treatment programs is an important public health strategy for reducing HIV incidence. For many at-risk individuals, SUD treatment represents the primary point of access for testing and receiving HIV-related services. This study uses two waves of nationally representative data of 265 privately-funded SUD treatment programs in the U.S. to examine organizational and patient characteristics associated with offering a dedicated HIV/AIDS treatment track, onsite HIV/AIDS support groups, and onsite HIV testing. Our longitudinal analysis indicated that the majority of treatment programs reported providing education and prevention services, but there was a small, yet significant, decline in the number of programs providing these services. Programs placed more of an emphasis on providing information on the transmission of HIV rather than on acquiring risk-reduction skills. There was a notable and significant increase (from 26.0% to 31.7%) in programs that offered onsite HIV testing, including rapid HIV testing, and an increase in the percentage of patients who received testing in the programs. Larger programs were more likely to offer a dedicated HIV/AIDS treatment track and to offer onsite HIV/AIDS support groups, while accredited programs and programs with a medical infrastructure were more likely to provide HIV testing. The percentage of injection drug users was positively linked to the availability of specialized HIV/AIDS tracks and HIV/AIDS support groups, and the percentage of female clients was associated with the availability of onsite support groups. The odds of offering HIV/AIDS support groups were also greater in programs that had a dedicated LGBT track. 
The findings suggest that access to hospitals and medical care services is an effective way to facilitate adoption of HIV services and that programs are providing a needed service among a group of patients who have a heightened risk of HIV transmission. Nonetheless, the fact that fewer than one third of programs offered onsite testing, and, of the ones that did, fewer than one third of their patients received testing, raises concern in light of federal guidelines. Copyright © 2015 Elsevier Inc. All rights reserved.
Press Site Auditorium dedicated to John Holliman
NASA Technical Reports Server (NTRS)
1999-01-01
From left, Center Director Roy Bridges and NASA Administrator Daniel S. Goldin applaud as Jay Holliman, with the help of his mother, Mrs. Dianne Holliman, unveils a plaque honoring his father, the late John Holliman. At right is Tom Johnson, news group chairman of CNN. The occasion was the dedication of the KSC Press Site auditorium as the John Holliman Auditorium, honoring the CNN national correspondent for his enthusiastic, dedicated coverage of America's space program. The auditorium was built in 1980 and has been the focal point for news coverage of Space Shuttle launches. The ceremony followed the 94th launch of a Space Shuttle, on mission STS-96, earlier this morning.
Synthesizing parallel imaging applications using the CAP (computer-aided parallelization) tool
NASA Astrophysics Data System (ADS)
Gennart, Benoit A.; Mazzariol, Marc; Messerli, Vincent; Hersch, Roger D.
1997-12-01
Imaging applications such as filtering, image transforms, and compression/decompression require vast amounts of computing power when applied to large data sets. These applications would potentially benefit from parallel processing. However, dedicated parallel computers are expensive, and their per-node processing power lags behind that of the most recent commodity components. Furthermore, developing parallel applications remains a difficult task: writing and debugging the application is hard (deadlocks), programs may not be portable from one parallel architecture to another, and performance often falls short of expectations. To facilitate the development of parallel applications, we propose the CAP computer-aided parallelization tool, which enables application programmers to specify, at a high level of abstraction, the flow of data between pipelined-parallel operations. In addition, the CAP tool supports the programmer in developing parallel imaging and storage operations. CAP enables efficiently combining parallel storage-access routines with sequential image-processing operations. This paper shows how processing- and I/O-intensive imaging applications must be implemented to take advantage of parallelism and of pipelining between data access and processing. This paper's contribution is (1) to show how such implementations can be compactly specified in CAP, and (2) to demonstrate that CAP-specified applications achieve the performance of custom parallel code. The paper analyzes theoretically the performance of CAP-specified applications and demonstrates the accuracy of the theoretical analysis through experimental measurements.
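The pipelining idea, overlapping data access with processing, can be sketched as a two-stage thread pipeline with a bounded buffer between the stages. This illustrates only the concept; CAP itself generates such schedules from a high-level specification, and the stage functions here are invented placeholders.

```python
# Two-stage pipeline: an I/O thread reads blocks while the main thread
# processes the previously read block. The bounded queue lets the stages
# overlap without unbounded buffering.
import queue
import threading

def pipeline(blocks, read_block, process_block):
    """Run read_block (stage 1) and process_block (stage 2) in a pipeline."""
    q = queue.Queue(maxsize=2)          # bounded buffer between stages
    results = []

    def reader():
        for b in blocks:
            q.put(read_block(b))        # data-access stage
        q.put(None)                     # end-of-stream marker

    t = threading.Thread(target=reader)
    t.start()
    while (item := q.get()) is not None:
        results.append(process_block(item))   # compute stage
    t.join()
    return results

# Toy stages: "reading" doubles a block id, "processing" adds one.
out = pipeline([1, 2, 3], lambda b: b * 2, lambda x: x + 1)
```

With real I/O in `read_block`, the compute stage runs while the next block is being fetched, which is the overlap the paper exploits.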
Race "Still" Matters: Preparing Culturally Relevant Teachers
ERIC Educational Resources Information Center
Durden, Tonia; Dooley, Caitlin McMunn; Truscott, Diane
2016-01-01
This qualitative study explores racial identity development of teacher candidates during a teacher preparation program dedicated to preparing teachers for diverse classrooms. Two black teacher candidates in the US demonstrate their racial identity development through critical reflections offered throughout the program. Findings suggest that…
Code of Federal Regulations, 2010 CFR
2010-01-01
... Office grant program managers. (i) Automated systems referred to in this instruction refers to the loan accounting systems; e.g., Program Loan Accounting System, Automated Multi-Housing Accounting System, and Dedicated Loan Origination System, from which loan and grant disbursements are ordered. (j) This subpart...
Historic Rust College: Fulfilling a Mission.
ERIC Educational Resources Information Center
Hoffman, Carl
1989-01-01
Describes Rust College, a Mississippi college dedicated to educating Blacks from economically and educationally impoverished backgrounds. Discusses the college's financial management, recent fund-raising efforts, building program, and academic programs. Examines the role of the predominantly Black college and Rust's mission to help students…
Unobtrusive Software and System Health Management with R2U2 on a Parallel MIMD Coprocessor
NASA Technical Reports Server (NTRS)
Schumann, Johann; Moosbrugger, Patrick
2017-01-01
Dynamic monitoring of software and system health of a complex cyber-physical system requires observers that continuously monitor variables of the embedded software in order to detect anomalies and reason about root causes. There exists a variety of techniques for code instrumentation, but instrumentation might change runtime behavior and could require costly software re-certification. In this paper, we present R2U2E, a novel realization of our real-time, Realizable, Responsive, and Unobtrusive Unit (R2U2). The R2U2E observers are executed in parallel on a dedicated 16-core EPIPHANY co-processor, thereby avoiding additional computational overhead to the system under observation. A DMA-based shared memory access architecture allows R2U2E to operate without any code instrumentation or program interference.
VizieR Online Data Catalog: Spitzer photometric time series of HD 97658 (Van Grootel+, 2014)
NASA Astrophysics Data System (ADS)
Van Grootel, V.; Gillon, M.; Valencia, D.; Madhusudhan, N.; Dragomir, D.; Howe, A. R.; Burrows, A. S.; Demory, B.-O.; Deming, D.; Ehrenreich, D.; Lovis, C.; Mayor, M.; Pepe, F.; Queloz, D.; Scuflaire, R.; Seager, S.; Segransan, D.; Udry, S.
2017-07-01
We monitored HD 97658 with Spitzer's IRAC camera on 2013 August 10 from 13:01:00 to 18:27:00 UT, corresponding to a transit window as computed from the MOST transit ephemeris (Dragomir et al. 2013, J/ApJ/772/L2). These Spitzer data were acquired in the context of the Cycle 9 program 90072 (PI: M. Gillon) dedicated to the search for the transits of RV-detected low-mass planets. They consist of 2320 sets of 64 individual subarray images obtained at 4.5 μm with an integration time of 0.08 s. They are available in the Spitzer Heritage Archive database in the form of 2320 Basic Calibrated Data files calibrated by the standard Spitzer reduction pipeline (version S19.1.0). (1 data file).
Morgantown People Mover Collision Avoidance System Design Summary
DOT National Transportation Integrated Search
1980-09-01
The Morgantown People Mover (MPM) is an automated two-mode (schedule and demand) transit system that consists of a fleet of electrically powered, rubber-tired, passenger-carrying vehicles operating on a dedicated guideway under computer control. The ...
NASA Technical Reports Server (NTRS)
Mitchell, C. M.
1982-01-01
The NASA-Goddard Space Flight Center is responsible for the control and ground support of all of NASA's unmanned near-earth satellites. Traditionally, each satellite had its own dedicated mission operations room. In the mid-seventies, an integration of some of these dedicated facilities was begun with the primary objective of reducing costs. To this end, the Multi-Satellite Operations Control Center (MSOCC) was designed. MSOCC is currently a labor-intensive operation. Recently, Goddard has become increasingly aware of human factors and human-machine interface issues. A summary is provided of some of the attempts to apply human factors considerations in the design of command and control environments. Current and future activities with respect to human factors and systems design are discussed, giving attention to the allocation of tasks between human and computer, and the interface for the human-computer dialogue.
Directional view interpolation for compensation of sparse angular sampling in cone-beam CT.
Bertram, Matthias; Wiegert, Jens; Schafer, Dirk; Aach, Til; Rose, Georg
2009-07-01
In flat detector cone-beam computed tomography and related applications, sparse angular sampling frequently leads to characteristic streak artifacts. To overcome this problem, it has been suggested to generate additional views by means of interpolation. The practicality of this approach is investigated in combination with a dedicated method for angular interpolation of 3-D sinogram data. For this purpose, a novel dedicated shape-driven directional interpolation algorithm based on a structure tensor approach is developed. Quantitative evaluation shows that this method clearly outperforms conventional scene-based interpolation schemes. Furthermore, the image quality trade-offs associated with the use of interpolated intermediate views are systematically evaluated for simulated and clinical cone-beam computed tomography data sets of the human head. It is found that utilization of directionally interpolated views significantly reduces streak artifacts and noise, at the expense of small introduced image blur.
Dedicated heterogeneous node scheduling including backfill scheduling
Wood, Robert R [Livermore, CA; Eckert, Philip D [Livermore, CA; Hommes, Gregg [Pleasanton, CA
2006-07-25
A method and system for job backfill scheduling of dedicated heterogeneous nodes in a multi-node computing environment. Heterogeneous nodes are grouped into homogeneous node sub-pools. For each sub-pool, a free node schedule (FNS) is created to chart the number of free nodes over time. For each prioritized job, the FNS of the sub-pools having nodes usable by that job is used to determine the earliest time range (ETR) capable of running the job, and the job is then scheduled to run in that ETR. If the ETR determined for a lower-priority job (LPJ) has a start time earlier than that of a higher-priority job (HPJ), the LPJ is scheduled in that ETR only if it would not disturb the anticipated start times of any HPJ previously scheduled for a future time. Thus, efficient utilization and throughput of such computing environments may be increased by using resources that would otherwise remain idle.
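The backfill rule in the abstract can be sketched as follows. This is a minimal single-sub-pool sketch with discrete time slots and hypothetical job tuples; the actual invention handles multiple heterogeneous sub-pools and continuous time.

```python
# Backfill scheduling sketch: jobs are (name, nodes_needed, duration_slots)
# in priority order; fns[t] is the number of free nodes in time slot t.

def earliest_time_range(fns, nodes, duration):
    """Earliest start slot whose whole window has enough free nodes."""
    for start in range(len(fns) - duration + 1):
        if all(fns[t] >= nodes for t in range(start, start + duration)):
            return start
    return None

def schedule(jobs, fns):
    """Schedule jobs by priority. A lower-priority job may start earlier
    (backfill) because higher-priority reservations are already carved out
    of the FNS and therefore cannot be disturbed."""
    placements = {}
    for name, nodes, duration in jobs:          # priority order
        start = earliest_time_range(fns, nodes, duration)
        if start is None:
            continue
        for t in range(start, start + duration):
            fns[t] -= nodes                     # reserve the nodes
        placements[name] = start
    return placements

fns = [4, 4, 4, 4, 4, 4]                        # 4 free nodes in each slot
jobs = [("A", 3, 2), ("B", 4, 2), ("C", 1, 2)]  # highest priority first
print(schedule(jobs, fns))                      # → {'A': 0, 'B': 2, 'C': 0}
```

Here the low-priority job C backfills into slot 0 alongside A, starting before the higher-priority B without delaying B's anticipated start.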
Arduino: a low-cost multipurpose lab equipment.
D'Ausilio, Alessandro
2012-06-01
Typical experiments in psychological and neurophysiological settings often require the accurate control of multiple input and output signals. These signals are often generated or recorded via computer software and/or external dedicated hardware. Dedicated hardware is usually very expensive and requires additional software to control its behavior. In the present article, I present some accuracy tests on a low-cost and open-source I/O board (Arduino family) that may be useful in many lab environments. One of the strengths of Arduinos is the possibility they afford to load the experimental script on the board's memory and let it run without interfacing with computers or external software, thus granting complete independence, portability, and accuracy. Furthermore, a large community has arisen around the Arduino idea and offers many hardware add-ons and hundreds of free scripts for different projects. Accuracy tests show that Arduino boards may be an inexpensive tool for many psychological and neurophysiological labs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simunovic, Srdjan; Piro, Markus H.A.
Thermochimica is a software library that determines a unique combination of phases and their compositions at thermochemical equilibrium. Thermochimica can be used for stand-alone calculations or it can be directly coupled to other codes. This release of the software does not have a graphical user interface (GUI) and it can be executed from the command line or from an Application Programming Interface (API). Also, it is not intended for thermodynamic model development or for constructing phase diagrams. The main purpose of the software is to be directly coupled with a multi-physics code to provide material properties and boundary conditions for various physical phenomena. Significant research efforts have been dedicated to enhancing computational performance through advanced algorithm development, such as improved estimation techniques and non-linear solvers. Various useful parameters can be provided as output from Thermochimica, such as: determination of which phases are stable at equilibrium, the mass of solution species and phases at equilibrium, mole fractions of solution phase constituents, thermochemical activities (which are related to partial pressures for gaseous species), chemical potentials of solution species and phases, and integral Gibbs energy (referenced relative to standard state). The overall goal is to provide an open source computational tool to enhance the predictive capability of multi-physics codes without significantly impeding computational performance.
NASA Astrophysics Data System (ADS)
Cai, Yong; Cui, Xiangyang; Li, Guangyao; Liu, Wenyang
2018-04-01
The edge-smooth finite element method (ES-FEM) can improve the computational accuracy of triangular shell elements and the mesh partition efficiency of complex models. In this paper, an approach is developed to perform explicit finite element simulations of contact-impact problems with a graphical processing unit (GPU) using a special edge-smooth triangular shell element based on ES-FEM. Of critical importance for this problem is achieving finer-grained parallelism to enable efficient data loading and to minimize communication between the device and host. Four kinds of parallel strategies are then developed to efficiently solve these ES-FEM based shell element formulas, and various optimization methods are adopted to ensure aligned memory access. Special focus is dedicated to developing an approach for the parallel construction of edge systems. A parallel hierarchy-territory contact-searching algorithm (HITA) and a parallel penalty function calculation method are embedded in this parallel explicit algorithm. Finally, the program flow is well designed, and a GPU-based simulation system is developed, using Nvidia's CUDA. Several numerical examples are presented to illustrate the high quality of the results obtained with the proposed methods. In addition, the GPU-based parallel computation is shown to significantly reduce the computing time.
NASA Astrophysics Data System (ADS)
Werthimer, Dan; Anderson, David; Bowyer, Stuart; Cobb, Jeff; Demorest, Paul
2002-01-01
We summarize results from two radio and two optical SETI programs based at the University of California, Berkeley. We discuss the most promising candidate signals from these searches and present plans for future SETI searches, including SERENDIP V and SETI@home II. The ongoing SERENDIP sky survey searches for radio signals at the 300 meter Arecibo Observatory. SERENDIP IV uses a 168 million channel spectrum analyser and a dedicated receiver to take data 24 hours a day, year round. The sky survey covers a 100 MHz band centered at the 21 cm line (1420 MHz) and declinations from -2 to +38 degrees. SETI@home uses the desktop computers of 3.5 million volunteers to analyse 50 Terabytes of data taken at Arecibo. The SETI@home sky survey is 10 times more sensitive and searches a much wider variety of signal types than SERENDIP IV, but covers only a 2.5 MHz band. SETI@home is the planet's largest supercomputer, averaging 25 Tflops. SETI@home participants have contributed over a million years of computing time so far. The SEVENDIP optical pulse search looks for ns-timescale pulses at optical wavelengths. It utilizes an automated 30 inch telescope, three ultra-fast photomultiplier tubes, and a coincidence detector. The target list includes F, G, K, and M stars, globular clusters, and galaxies. The SPOCK optical SETI program searches for narrow-band continuous signals using spectra taken by Marcy and his colleagues in their planet search at Keck observatory.
U.S. Department of Energy Isotope Program
None
2018-01-16
The National Isotope Development Center (NIDC) interfaces with the User Community and manages the coordination of isotope production across the facilities and business operations involved in the production, sale, and distribution of isotopes. A virtual center, the NIDC is funded by the Isotope Development and Production for Research and Applications (IDPRA) subprogram of the Office of Nuclear Physics in the U.S. Department of Energy Office of Science. PNNL's Isotope Program operates in a multi-program category-2 nuclear facility, the Radiochemical Processing Laboratory (RPL), that contains 16 hot cells and 20 gloveboxes. As part of the DOE Isotope Program, the Pacific Northwest National Laboratory dispenses strontium-90, neptunium-237, radium-223, and thorium-227. PNNL's Isotope Program uses a dedicated hot cell for strontium-90 dispensing and a dedicated glovebox for radium-223 and thorium-227 dispensing. PNNL's Isotope Program has access to state-of-the-art analytical equipment in the RPL to support its research and production activities. DOE Isotope Program-funded research at PNNL has advanced the application of automated radiochemistry for isotopes such as zirconium-89 and astatine-211 in partnership with the University of Washington.
The Quest for Mastery: Positive Youth Development through Out-of-School Programs
ERIC Educational Resources Information Center
Intrator, Sam M.; Siegel, Don
2014-01-01
In "The Quest for Mastery," Sam M. Intrator and Don Siegel investigate an emerging trend: the growth of out-of-school programs dedicated to helping underserved youth develop the personal qualities and capacities that will help them succeed in school, college, and beyond. Intensive programs from rowing to youth radio, from lacrosse to…
The Philosopher's Stone: How Basic Skills Programs Fare in Troubled Financial Times
ERIC Educational Resources Information Center
Ray, Thomas P.
2012-01-01
This mixed methods study examined the relative position of basic skills programs with transfer and career technical programs in a large suburban community college in California during the three-year period of budget reductions from 2009-2010 through 2011-2012. The budget line dedicated to part-time or non-contract instruction was analyzed along…
Constructing a Self-Funded Program Takes More than Just Dollars and Cents
ERIC Educational Resources Information Center
Burke, Scott
2012-01-01
With a little ingenuity and a lot of dedication, the author created a self-funded construction program that is weathering the ups and downs of school funding; it enjoys great support from the community, is accomplishing more with less, and collaborative efforts between teachers are paying off. Creating such a program takes time, vision,…
USDA Forest Service, Pacific Southwest Research Station Sudden Oak Death Research Program: 2001-2005
Patrick J. Shea
2006-01-01
The Pacific Southwest Research Station (PSW), U.S. Department of Agriculture (USDA) Forest Service initiated the Sudden Oak Death Research (SOD) Program in late 2000. The program was prompted by late fiscal year funding dedicated directly to begin research on this newly discovered disease. The history of discovery of Phytophthora ramorum, the...
Transformative Change Initiative
ERIC Educational Resources Information Center
Bragg, D. D.; Kirby, C.; Witt, M. A.; Richie, D.; Mix, S.; Feldbaum, M.; Liu, S.; Mason, M.
2014-01-01
The Transformative Change Initiative (TCI) is dedicated to assisting community colleges to scale up innovation in the form of guided pathways, programs of study, and evidence-based strategies to improve student outcomes and program, organization, and system performance. The impetus for TCI is the Trade Adjustment Assistance Community College and…
HPC USER WORKSHOP - JUNE 12TH | High-Performance Computing | NREL
Topics included the upgrade to CentOS 7, changes to modules management, and Singularity and containers on Peregrine, with the remaining two hours dedicated to demos and one-on-one interaction as needed.
International Society for Technology in Education.
ERIC Educational Resources Information Center
Knox-Quinn, Carolyn
1992-01-01
Provides information about the International Society for Technology in Education (ISTE), an organization dedicated to improving education throughout the world by facilitating communication among instructors, media specialists, computer coordinators, information resource managers (IRMs), and administrative users of technology. Publications and the…
Grant, Michael C; Hanna, Andrew; Benson, Andrew; Hobson, Deborah; Wu, Christopher L; Yuan, Christina T; Rosen, Michael; Wick, Elizabeth C
2018-03-01
Our aim was to determine whether the establishment of a dedicated operating room team leads to improved process measure compliance and clinical outcomes in an Enhanced Recovery after Surgery (ERAS) program. Enhanced Recovery after Surgery programs involve the application of bundled best practices to improve the value of perioperative care. Successful implementation and sustainment of ERAS programs has been linked to compliance with protocol elements. Development of dedicated teams of anesthesia providers was a component of ERAS implementation. Intraoperative provider team networks (surgeons, anesthesiologists, and certified registered nurse anesthetists) were developed for all cases before and after implementation of colorectal ERAS. Four measures of centrality were analyzed in each network based on case assignments, and these measures were correlated with both rates of process measure compliance and clinical outcomes. Enhanced Recovery after Surgery provider teams led to a decrease in the closeness of anesthesiologists (p = 0.04) and a significant increase in the clustering coefficient of certified registered nurse anesthetists (p = 0.005) compared with the pre-ERAS network. There was no significant change in centrality among surgeons (p = NS for all measures). Enhanced Recovery after Surgery designation among anesthesiologists and nurse anesthetists (whereby individual providers received an in-service on protocol elements and received compliance data) was strongly associated with high compliance (>0.6 of measures; p < 0.001 for each group). In addition, high compliance was associated with a significant reduction in length of stay (p < 0.01), surgical site infection (p < 0.002), and morbidity (p < 0.009). Dedicated operating room teams led to increased centrality among anesthesia providers, which in turn not only increased compliance but also improved several clinical outcomes. Copyright © 2017 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
Optimal one-way and roundtrip journeys design by mixed-integer programming
NASA Astrophysics Data System (ADS)
Ribeiro, Isabel M.; Vale, Cecília
2017-12-01
The introduction of multimodal/intermodal networks in transportation problems, especially when considering roundtrips, adds complexity to the models. This article presents two models for the optimization of intermodal trips as a contribution to the integration of transport modes in networks. The first model is devoted to one-way trips while the second one is dedicated to roundtrips. The original contribution of this research to transportation is mainly the consideration of roundtrips in the optimization process of intermodal transport, especially because the transport mode between two nodes on the return trip should be the same as the one on the outward trip if both nodes are visited on the return trip, which is a valuable aspect for transport companies. The mathematical formulations of both models lead to mixed binary linear programs, which is not a common approach for this type of problem. In this article, as well as the model description, computational experience is included to highlight the importance and efficiency of the proposed models, which may provide a valuable tool for transport managers.
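The roundtrip consistency rule can be illustrated with a brute-force toy example: the network, edge costs, and routes below are hypothetical, and the article itself formulates the problem as a mixed binary linear program rather than by enumeration.

```python
# Toy roundtrip A-B-C: the return may retrace C-B-A (reusing both edges,
# so the consistency rule forces the same modes as the outward trip) or
# go directly C-A on a fresh edge, where any mode is allowed.
from itertools import product

MODES = ("road", "rail")
cost = {("AB", "road"): 5, ("AB", "rail"): 3,   # hypothetical per-edge costs
        ("BC", "road"): 2, ("BC", "rail"): 4,
        ("CA", "road"): 6, ("CA", "rail"): 7}

def best_roundtrip():
    best = None
    for out_modes in product(MODES, repeat=2):  # modes for legs AB, BC
        out = cost[("AB", out_modes[0])] + cost[("BC", out_modes[1])]
        # Return via B reuses both edges: modes must match the outward trip.
        via_b = out + out
        # Direct return C-A uses a new edge: mode chosen freely.
        direct = out + min(cost[("CA", m)] for m in MODES)
        for total, route in ((via_b, "CBA"), (direct, "CA")):
            if best is None or total < best[0]:
                best = (total, out_modes, route)
    return best

print(best_roundtrip())  # → (10, ('rail', 'road'), 'CBA')
```

In a MIP formulation the same rule becomes a linear constraint equating the outward and return mode-selection binaries on every edge shared by both trips.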
Parallel computing on Unix workstation arrays
NASA Astrophysics Data System (ADS)
Reale, F.; Bocchino, F.; Sciortino, S.
1994-12-01
We have tested arrays of general-purpose Unix workstations used as MIMD systems for massive parallel computations. In particular we have solved numerically a demanding test problem with a 2D hydrodynamic code, generally developed to study astrophysical flows, by executing it on arrays either of DECstations 5000/200 on Ethernet LAN, or of DECstations 3000/400, equipped with powerful Alpha processors, on FDDI LAN. The code is appropriate for data-domain decomposition, and we have used a library for parallelization previously developed in our Institute, and easily extended to work on Unix workstation arrays by using the PVM software toolset. We have compared the parallel efficiencies obtained on arrays of several processors to those obtained on a dedicated MIMD parallel system, namely a Meiko Computing Surface (CS-1), equipped with Intel i860 processors. We discuss the feasibility of using non-dedicated parallel systems and conclude that the convenience depends essentially on the size of the computational domain as compared to the relative processor power and network bandwidth. We point out that for future perspectives a parallel development of processor and network technology is important, and that the software still offers great opportunities for improvement, especially in terms of latency times in the message-passing protocols. In conditions of significant gain in terms of speedup, such workstation arrays represent a cost-effective approach to massive parallel computations.
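The speedup and parallel-efficiency figures used in such comparisons are simple ratios; a quick reference implementation with illustrative (not the paper's) timings:

```python
# Parallel speedup and efficiency: efficiency is the fraction of ideal
# linear speedup actually achieved on n processors.

def speedup(t_serial, t_parallel):
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_procs):
    return speedup(t_serial, t_parallel) / n_procs

# Hypothetical timings: 400 s serially, 80 s on 8 workstations.
print(speedup(400, 80))        # → 5.0
print(efficiency(400, 80, 8))  # → 0.625
```

On a slow LAN the efficiency drops as communication time grows relative to per-node compute time, which is the trade-off the authors quantify against domain size.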
NASA Astrophysics Data System (ADS)
Makino, Junichiro
2002-12-01
We overview our GRAvity PipE (GRAPE) project to develop special-purpose computers for astrophysical N-body simulations. The basic idea of GRAPE is to attach a custom-built computer dedicated to the calculation of gravitational interaction between particles to a general-purpose programmable computer. By this hybrid architecture, we can achieve both a wide range of applications and very high peak performance. Our newest machine, GRAPE-6, achieved a peak speed of 32 Tflops, and sustained performance of 11.55 Tflops, for a total budget of about 4 million USD. We also discuss the relative advantages of special-purpose and general-purpose computers and the future of high-performance computing for science and technology.
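The computation GRAPE hardwires is the O(N²) direct pairwise force sum; written out in plain Python for illustration (units with G = 1 and a softening length to avoid singularities; the particle values are hypothetical):

```python
# Direct-summation gravitational accelerations: the inner pairwise loop is
# exactly the kernel the GRAPE pipeline evaluates in hardware.

def accelerations(masses, positions, eps=1e-3):
    """Acceleration on every particle from all others (G = 1)."""
    n = len(masses)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        xi, yi, zi = positions[i]
        for j in range(n):
            if i == j:
                continue
            dx = positions[j][0] - xi
            dy = positions[j][1] - yi
            dz = positions[j][2] - zi
            r2 = dx * dx + dy * dy + dz * dz + eps * eps  # softened |r|^2
            f = masses[j] / (r2 ** 1.5)                   # m_j / |r|^3
            acc[i][0] += f * dx
            acc[i][1] += f * dy
            acc[i][2] += f * dz
    return acc

two_body = accelerations([1.0, 1.0], [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)])
print(two_body)   # equal and opposite accelerations along x
```

In the hybrid architecture, the host computer handles time integration and I/O while this force loop, the dominant cost, is offloaded to the pipeline.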
Space shuttle and life sciences
NASA Technical Reports Server (NTRS)
Mason, J. A.
1977-01-01
During the 1980's, some 200 Spacelab missions will be flown on space shuttle in earth-orbit. Within these 200 missions, it is planned that at least 20 will be dedicated to life sciences research, projects which are yet to be outlined by the life sciences community. Objectives of the Life Sciences Shuttle/Spacelab Payloads Program are presented. Also discussed are major space life sciences programs including space medicine and physiology, clinical medicine, life support technology, and a variety of space biology topics. The shuttle, spacelab, and other life sciences payload carriers are described. Concepts for carry-on experiment packages, mini-labs, shared and dedicated spacelabs, as well as common operational research equipment (CORE) are reviewed. Current NASA planning and development includes Spacelab Mission Simulations, an Announcement of Planning Opportunity for Life Sciences, and a forthcoming Announcement of Opportunity for Flight Experiments which will together assist in forging a Life Science Program in space.
Neural-network dedicated processor for solving competitive assignment problems
NASA Technical Reports Server (NTRS)
Eberhardt, Silvio P. (Inventor)
1993-01-01
A neural-network processor for solving first-order competitive assignment problems consists of a matrix of N x M processing units, each of which corresponds to the pairing of a first number of elements of (R sub i) with a second number of elements (C sub j), wherein limits of the first number are programmed in row control superneurons, and limits of the second number are programmed in column superneurons as MIN and MAX values. The cost (weight) W sub ij of the pairings is programmed separately into each PU. For each row and column of PUs, a dedicated constraint superneuron ensures that the number of active neurons within the associated row or column falls within a specified range. Annealing is provided by gradually increasing the PU gain for each row and column, by increasing positive feedback to each PU (the latter being effective to increase the hysteresis of each PU), or by combining both of these techniques.
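A software sketch of the annealing idea, using softassign-style alternating row/column normalization in place of the patented analog constraint superneurons (the cost matrix and annealing schedule below are hypothetical, and this illustrates only the one-to-one special case):

```python
# Annealed soft assignment: low-cost pairings are sharpened with gain beta,
# while row/column normalization enforces the competitive constraints.
import math

def soft_assign(weights, betas=(1, 5, 25, 125), sweeps=50):
    n = len(weights)
    a = [[1.0 / n] * n for _ in range(n)]
    for beta in betas:                        # annealing schedule: raise gain
        a = [[a[i][j] * math.exp(-beta * weights[i][j])
              for j in range(n)] for i in range(n)]
        for _ in range(sweeps):               # alternate row/col normalization
            a = [[v / sum(row) for v in row] for row in a]
            col = [sum(a[i][j] for i in range(n)) for j in range(n)]
            a = [[a[i][j] / col[j] for j in range(n)] for i in range(n)]
    return a

# Hypothetical 3x3 cost matrix; the cheap diagonal pairing should win.
cost = [[0.1, 0.9, 0.8],
        [0.9, 0.2, 0.7],
        [0.8, 0.7, 0.3]]
a = soft_assign(cost)
print([max(range(3), key=lambda j: a[i][j]) for i in range(3)])  # → [0, 1, 2]
```

As the gain grows, the soft assignment matrix hardens toward a permutation, mirroring how increasing PU gain drives the hardware toward a definite pairing.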
The Sociology of the Deceased Harvard Medical Unit at Boston City Hospital.
Tishler, Peter V
2015-12-01
Many graduates of the Harvard Medical Unit (HMU) at Boston City Hospital, in either the clinical training/residency program or the research program at the Thorndike Memorial Laboratory, contributed in major ways to the HMU and constantly relived their HMU experiences. The HMU staff physicians, descending from founder and mentor physicians Francis W. Peabody, Soma Weiss, and George R. Minot, were dedicated to the teaching, development, and leadership of its clinical and research trainees, whose confidence and dedication to patient care as a result of their mentorship led many to lifelong achievements as clinicians, teachers, and mentors. Their experience also led to a lifelong love of the HMU (despite its loss), camaraderie, happiness, and intense friendships with their associates.
ERIC Educational Resources Information Center
Teacher Education Accreditation Council, 2010
2010-01-01
The Teacher Education Accreditation Council (TEAC), founded in 1997, is dedicated to improving academic degree and certificate programs for professional educators--those who teach and lead in schools, pre-K through grade 12, and to assuring the public of their quality. TEAC accredits undergraduate and graduate programs, including alternate route…
Creating a Structured Support System for Postsecondary Success
ERIC Educational Resources Information Center
White, Carol Cutler
2018-01-01
For numerous reasons, it can be difficult for foster youth to succeed in postsecondary education. This chapter offers insight into state-level policies and programs, community college programs dedicated to supporting foster youth, and a framework for creating a structured support system to increase student success.
Coyote Community College Case Study.
ERIC Educational Resources Information Center
National Inst. of Standards and Technology, Gaithersburg, MD.
The Malcolm Baldrige National Quality Award (MBNQA) was created in 1987 to foster the success of the Baldrige National Quality Program. The award program, sponsored by the United States Department of Education, is a public-private partnership dedicated to improving national competitiveness. The National Institute of Standards and Technology…
Eurogrid: a new glideinWMS based portal for CDF data analysis
NASA Astrophysics Data System (ADS)
Amerio, S.; Benjamin, D.; Dost, J.; Compostella, G.; Lucchesi, D.; Sfiligoi, I.
2012-12-01
The CDF experiment at Fermilab ended its Run-II phase in September 2011 after 11 years of operations and 10 fb-1 of collected data. The CDF computing model is based on a Central Analysis Farm (CAF) consisting of local computing and storage resources, supported by OSG and LCG resources accessed through dedicated portals. At the beginning of 2011 a new portal, Eurogrid, was developed to effectively exploit computing and disk resources in Europe: a dedicated farm and storage area at the TIER-1 CNAF computing center in Italy, and additional LCG computing resources at different TIER-2 sites in Italy, Spain, Germany and France, are accessed through a common interface. The goal of this project is to develop a portal that is easy to integrate into the existing CDF computing model, completely transparent to the user, and requiring a minimum amount of maintenance support from the CDF collaboration. In this paper we review the implementation of this new portal and its performance in the first months of usage. Eurogrid is based on the glideinWMS software, a glidein-based Workload Management System (WMS) that works on top of Condor. As the CDF CAF is based on Condor, the choice of the glideinWMS software was natural and the implementation seamless. Thanks to the pilot jobs, user-specific requirements and site resources are matched in a very efficient way, completely transparent to the users. In official use since June 2011, Eurogrid effectively complements and supports CDF computing resources, offering an optimal solution for the future in terms of the manpower required for administration, support and development.
Delzell, Patricia B; Boyle, Alex; Schneider, Erika
2015-06-01
The purpose of this study was to define and report on the effect of a comprehensive musculoskeletal sonography training program to improve accuracy (sensitivity and specificity) for the diagnosis of rotator cuff tears in relatively inexperienced operators. Before the training program was implemented, radiologists (n = 12) had a mean of 2 years (range, <1-12 years) of experience performing and interpreting musculoskeletal sonography. Pre- and post-training shoulder sonographic results were compared to surgical reports or, in their absence, to shoulder magnetic resonance imaging or computed tomographic arthrographic results if within 2 months of the sonographic examination. A total of 82 patients were included in the pre-training group (January 2010-December 2011), and 50 patients were included in the post-training group (January 2012-June 2013). The accuracy, sensitivity, specificity, and positive and negative predictive values were determined for the presence or absence of supraspinatus and infraspinatus tendon tears. After implementation of the training program, the sensitivity of sonography for detecting full-thickness rotator cuff tears increased by 14%, and the sensitivity for detecting partial-thickness rotator cuff tears increased by 3%. Quality improvement programs and acquisition standardization along with ongoing, focused case conferences for the entire care team increased the sensitivity of shoulder sonography for diagnosing both full- and partial-thickness rotator cuff tears, independent of the years of operator experience. © 2015 by the American Institute of Ultrasound in Medicine.
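The accuracy metrics quoted above are the standard confusion-matrix quantities; a quick reference implementation with hypothetical counts (not the study's data):

```python
# Diagnostic accuracy metrics from true/false positives and negatives.

def sensitivity(tp, fn):
    """Fraction of actual tears correctly detected."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of intact cuffs correctly called negative."""
    return tn / (tn + fp)

def predictive_values(tp, fp, fn, tn):
    """Positive and negative predictive values."""
    return {"PPV": tp / (tp + fp), "NPV": tn / (tn + fn)}

print(sensitivity(45, 5))                # → 0.9
print(specificity(40, 10))               # → 0.8
print(predictive_values(45, 10, 5, 40))  # PPV ≈ 0.818, NPV ≈ 0.889
```

A post-training sensitivity gain of 14%, as reported, corresponds to converting false negatives into true positives against the surgical or imaging reference standard.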
The ALICE Software Release Validation cluster
NASA Astrophysics Data System (ADS)
Berzano, D.; Krzewicki, M.
2015-12-01
One of the most important steps of the software lifecycle is Quality Assurance: this process comprises both automatic tests and manual reviews, and all of them must pass successfully before the software is approved for production. Some tests, such as source code static analysis, are executed on a single dedicated service; in High Energy Physics, a full simulation and reconstruction chain on a distributed computing environment, backed with a sample “golden” dataset, is also necessary for the quality sign-off. The ALICE experiment uses dedicated and virtualized computing infrastructures for the Release Validation in order not to taint the production environment (i.e. CVMFS and the Grid) with non-validated software and validation jobs: the ALICE Release Validation cluster is a disposable virtual cluster appliance based on CernVM and the Virtual Analysis Facility, capable of deploying on demand, and with a single command, a dedicated virtual HTCondor cluster with an automatically scalable number of virtual workers on any cloud supporting the standard EC2 interface. Input and output data are externally stored on EOS, and a dedicated CVMFS service is used to provide the software to be validated. We will show how the Release Validation Cluster deployment and disposal are completely transparent for the Release Manager, who simply triggers the validation from the ALICE build system's web interface. CernVM 3, based entirely on CVMFS, makes it possible to boot any snapshot of the operating system in time: we will show how this allows us to certify each ALICE software release for an exact CernVM snapshot, addressing the problem of Long Term Data Preservation by ensuring a consistent environment for software execution and data reprocessing in the future.
Cloud@Home: A New Enhanced Computing Paradigm
NASA Astrophysics Data System (ADS)
Distefano, Salvatore; Cunsolo, Vincenzo D.; Puliafito, Antonio; Scarpa, Marco
Cloud computing is a distributed computing paradigm that mixes aspects of Grid computing ("… hardware and software infrastructure that provides dependable, consistent, pervasive, and inexpensive access to high-end computational capabilities" (Foster, 2002)), Internet computing ("… a computing platform geographically distributed across the Internet" (Milenkovic et al., 2003)), Utility computing ("a collection of technologies and business practices that enables computing to be delivered seamlessly and reliably across multiple computers, ... available as needed and billed according to usage, much like water and electricity are today" (Ross & Westerman, 2004)), Autonomic computing ("computing systems that can manage themselves given high-level objectives from administrators" (Kephart & Chess, 2003)), Edge computing ("… provides a generic template facility for any type of application to spread its execution across a dedicated grid, balancing the load …" (Davis, Parikh, & Weihl, 2004)), and Green computing (a new frontier of Ethical computing starting from the assumption that in the near future energy costs will be tied to environmental pollution).
Virginia's monitoring goals and programs: eastern state perspective
Dana Bradshaw
1993-01-01
Unlike the federal ownership patterns of the western United States, the eastern states are still largely in the hands of the private landowner. As a result, the implementation of the Partners in Flight program in the East will depend a great deal on the motivation and dedication of individual states. Monitoring programs in particular are in a position to benefit from...
ERIC Educational Resources Information Center
Ishiyama, John; Miles, Tom; Balarezo, Christine
2010-01-01
In this article, we investigate the graduate curricula of 122 Ph.D.-granting political science programs in the United States and how they seek to prepare political science teachers. We first investigate whether the department offers a dedicated graduate-level course on college teaching, and…
Laser Programs, the first 25 years, 1972-1997
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campbell, E.M.
1998-03-04
Welcome to Laser Programs. I am pleased that you can share in the excitement of 25 years of history, from our beginnings as a small program of 125 people to our current status as a world premier laser and applied science research team of over 1700 members. It is fitting that this program, which was founded on the dream of developing inertial confinement fusion technology, should celebrate this anniversary the same year that the ground is broken for the National Ignition Facility (NIF). Also at the same time, we are feeling the excitement of moving the Atomic Vapor Laser Isotope Separation (AVLIS) technology forward toward private sector use and developing many alternate scientific applications and technologies derived from our core programs. It is through the hard work of many dedicated scientists, engineers, technicians, and administrative team members that we have been able to accomplish the remarkable internationally recognized achievements highlighted here. I hope this brochure will help you enjoy the opportunity to share in the celebration and pride of our scientific accomplishments; state-of-the-art facilities; and diligent, dedicated people that together make our Laser Programs and Lawrence Livermore National Laboratory the best in the world.
TU-A-18C-01: ACR Accreditation Updates in CT, Ultrasound, Mammography and MRI
DOE Office of Scientific and Technical Information (OSTI.GOV)
Price, R; Berns, E; Hangiandreou, N
2014-06-15
A goal of an imaging accreditation program is to ensure adequate image quality, verify appropriate staff qualifications, and assure patient and personnel safety. Currently, more than 35,000 facilities in 10 modalities have been accredited by the American College of Radiology (ACR), making the ACR program one of the most prolific accreditation options in the U.S. In addition, the ACR is one of the accepted accreditations required by some state laws, CMS/MIPPA insurance and others. Familiarity with the ACR accreditation process is therefore essential to clinical diagnostic medical physicists. Maintaining sufficient knowledge of the ACR program must include keeping up-to-date as the various modality requirements are refined to better serve the goals of the program and to accommodate newer technologies and practices. This session consists of presentations from authorities in four ACR accreditation modality programs, including magnetic resonance imaging, mammography, ultrasound, and computed tomography. Each speaker will discuss the general components of the modality program and address any recent changes to the requirements. Learning Objectives: To understand the requirements of the ACR MR accreditation program. The discussion will include accreditation of whole-body general-purpose magnets and dedicated extremity systems, as well as breast MRI accreditation. Anticipated updates to the ACR MRI Quality Control Manual will also be reviewed. To understand the current ACR MAP Accreditation requirement and present the concepts and structure of the forthcoming ACR Digital Mammography QC Manual and Program. To understand the new requirements of the ACR ultrasound accreditation program, and the roles the physicist can play in annual equipment surveys and in setting up and supervising the routine QC program. To understand the requirements of the ACR CT accreditation program, including updates to the QC manual as well as updates through the FAQ process.
MO-AB-207-02: ACR Update in MR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Price, R.
2015-06-15
A goal of an imaging accreditation program is to ensure adequate image quality, verify appropriate staff qualifications, and assure patient and personnel safety. Currently, more than 35,000 facilities in 10 modalities have been accredited by the American College of Radiology (ACR), making the ACR program one of the most prolific accreditation options in the U.S. In addition, ACR is one of the accepted accreditations required by some state laws, CMS/MIPPA insurance and others. Familiarity with the ACR accreditation process is therefore essential to clinical diagnostic medical physicists. Maintaining sufficient knowledge of the ACR program must include keeping up-to-date as the various modality requirements are refined to better serve the goals of the program and to accommodate newer technologies and practices. This session consists of presentations from authorities in four ACR accreditation modality programs, including magnetic resonance imaging, computed tomography, nuclear medicine, and mammography. Each speaker will discuss the general components of the modality program and address any recent changes to the requirements. Learning Objectives: To understand the requirements of the ACR MR Accreditation program. The discussion will include accreditation of whole-body general-purpose magnets and dedicated extremity systems, as well as breast MRI accreditation. Anticipated updates to the ACR MRI Quality Control Manual will also be reviewed. To understand the requirements of the ACR CT accreditation program, including updates to the QC manual as well as updates through the FAQ process. To understand the requirements of the ACR nuclear medicine accreditation program, and the role of the physicist in annual equipment surveys and the setup and supervision of the routine QC program. To understand the current ACR MAP Accreditation requirement and present the concepts and structure of the forthcoming ACR Digital Mammography QC Manual and Program.
MO-AB-207-04: ACR Update in Mammography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berns, E.
2015-06-15
MO-AB-207-01: ACR Update in CT
DOE Office of Scientific and Technical Information (OSTI.GOV)
McNitt-Gray, M.
2015-06-15
MO-AB-207-00: ACR Update in MR, CT, Nuclear Medicine, and Mammography
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
MO-AB-207-03: ACR Update in Nuclear Medicine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harkness, B.
Kaiser, W; Faber, T S; Findeis, M
1996-01-01
The authors developed a computer program that detects myocardial infarction (MI) and left ventricular hypertrophy (LVH) in two steps: (1) by extracting parameter values from a 10-second, 12-lead electrocardiogram, and (2) by classifying the extracted parameter values with rule sets. Every disease has its dedicated set of rules. Hence, there are separate rule sets for anterior MI, inferior MI, and LVH. If at least one rule is satisfied, the disease is said to be detected. The computer program automatically develops these rule sets. A database (learning set) of healthy subjects and patients with MI, LVH, and mixed MI+LVH was used. After defining the rule type, initial limits, and expected quality of the rules (positive predictive value, minimum number of patients), the program creates a set of rules by varying the limits. The general rule type is defined as: disease = (lim1l < p1 ≤ lim1u) and (lim2l < p2 ≤ lim2u) and … and (limnl < pn ≤ limnu). When defining the rule types, only the parameters (p1 … pn) that are known as clinical electrocardiographic criteria (amplitudes [mV] of Q, R, and T waves and ST-segment; duration [ms] of Q wave; frontal angle [degrees]) were used. This allowed for submitting the learned rule sets to an independent investigator for medical verification. It also allowed the creation of explanatory texts with the rules. These advantages are not offered by the neurons of a neural network. The learned rules were checked against a test set and the following results were obtained: MI: sensitivity 76.2%, positive predictive value 98.6%; LVH: sensitivity 72.3%, positive predictive value 90.9%. The specificity ratings for MI are better than 98%; for LVH, better than 90%.
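The rule-evaluation scheme described above can be sketched as follows. The parameter names, interval limits, and rule set here are illustrative placeholders, not the learned rules from the paper:

```python
# Sketch of the rule-set classifier: each rule is a list of
# (parameter, lower, upper) interval constraints, and a disease is
# reported if at least one rule in its dedicated set is satisfied.
# Names and limits below are hypothetical, for illustration only.

def rule_satisfied(rule, params):
    """A rule holds when every named parameter lies in (lower, upper]."""
    return all(lo < params[name] <= hi for name, lo, hi in rule)

def detect(rule_set, params):
    """Disease detected if at least one rule in its set is satisfied."""
    return any(rule_satisfied(rule, params) for rule in rule_set)

# Hypothetical rule set for anterior MI using ECG-style parameters
anterior_mi_rules = [
    [("q_amplitude_mV", 0.1, 1.0), ("q_duration_ms", 30.0, 120.0)],
    [("st_amplitude_mV", 0.2, 1.0)],
]

ecg = {"q_amplitude_mV": 0.15, "q_duration_ms": 45.0, "st_amplitude_mV": 0.05}
print(detect(anterior_mi_rules, ecg))  # True: the first rule is satisfied
```

Because each rule is a conjunction of clinically meaningful interval tests, the learned set can be read, verified, and annotated by a physician, which is the advantage over a neural network the abstract points out.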
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-23
... departments and agencies, and would include an office dedicated to expanding foreign investment and assisting... Government's trade, foreign investment, export, and business programs and functions. Accordingly, to further... sustainable economic growth through trade and foreign investment, and to ensure the effective...
As regulatory pressure to reduce the environmental impact of urban stormwater intensifies, U.S. municipalities increasingly seek a dedicated source of funding for stormwater programs, such as a stormwater utility. In rare instances, single family residences are eligible for utili...
Literacy as Social Action in City Debate
ERIC Educational Resources Information Center
Cridland-Hughes, Susan
2012-01-01
This study examines critical literacy and the intersections of oral, aural, written, and performative literate practices in City Debate, an afterschool program dedicated to providing debate instruction to students in a major Southeastern city. Previous research into definitions and beliefs about literacy in an urban debate program over its twenty…
A Survey of 100 Community Colleges on Student Substance Use, Programming, and Collaborations
ERIC Educational Resources Information Center
Chiauzzi, Emil; Donovan, Elizabeth; Black, Ryan; Cooney, Elizabeth; Buechner, Allison; Wood, Mollie
2011-01-01
Objective: The objective was to survey community college personnel about student substance use, and infrastructure (staff and funding), programs, and collaborations dedicated to substance use prevention. Participants: The sample included 100 administrators, faculty, and health services staff at 100 community colleges. Methods: Participants…
The RSZ BASIC programming language manual
NASA Technical Reports Server (NTRS)
Stattel, R. J.; Niswander, J. K.; Kochhar, A. K.
1980-01-01
The RSZ BASIC interactive language is described. The RSZ BASIC interpreter is resident in the Telemetry Data Processor, a system dedicated to the processing and displaying of PCM telemetry data. A series of working examples teaches the fundamentals of RSZ BASIC and shows how to construct, edit, and manage storage of programs.
Missouri School Improvement Program: Support and Intervention
ERIC Educational Resources Information Center
Missouri Department of Elementary and Secondary Education, 2016
2016-01-01
The Missouri State Board of Education and the Department of Elementary and Secondary Education are dedicated to ensuring that all children have access to good schools that prepare them for college and career success. The Missouri School Improvement Program: Support and Intervention Plan takes a differentiated approach to state support based on…
Animals and Inmates: A Sharing Companionship behind Bars.
ERIC Educational Resources Information Center
Moneymaker, James M.; Strimle, Earl O.
1991-01-01
Describes People, Animals and Love (PAL), organization dedicated to bringing people and pets together and PAL program implemented in one correctional facility. Notes that PAL program has given prisoners opportunity to learn vocational trade while improving their quality of life by showing compassion and understanding to animals. (Author/NB)
Family, School, and Community Partnerships: Practical Strategies for Afterschool Programs
ERIC Educational Resources Information Center
Finn-Stevenson, Matia
2014-01-01
Much attention is given today to the importance of forging family, school, and community partnerships. Growing numbers of schools, many of them with afterschool programs, are dedicating resources to support and sustain relationships with families and community-based organizations. And, among government agencies and the philanthropic sector, there…
Access to Care: Overcoming the Rural Physician Shortage.
ERIC Educational Resources Information Center
Baldwin, Fred D.
1999-01-01
Describes three state-initiated programs that address the challenge of providing access to health care for Appalachia's rural residents: a traveling pediatric diabetes clinic serving eastern Kentucky; a telemedicine program operated out of Knoxville, Tennessee; and a new medical school in Kentucky dedicated to training doctors from Appalachia for…
CEEB Campus to Prospective Student Programs.
ERIC Educational Resources Information Center
Kirkman, Kay
Johnson County Community College, one of 20 institutions of higher education in the greater Kansas City metropolitan area, has developed a comprehensive communications program which works. Close to eight percent of the college's current operating budget is dedicated to the communication process. Most publications are printed on campus by the Word…
Special issue of Computers and Fluids in honor of Cecil E. (Chuck) Leith
Zhou, Ye; Herring, Jackson
2017-05-12
Here, this special issue of Computers and Fluids is dedicated to Cecil E. (Chuck) Leith in honor of his research contributions and leadership in the areas of statistical fluid mechanics, computational fluid dynamics, and climate theory. Leith's contribution to these fields emerged from his interest in solving complex fluid flow problems, even those at high Mach numbers, in an era well before large-scale supercomputing became the dominant mode of inquiry into these fields. Yet the issues raised and solved by his research effort are still of vital interest today.
Special issue of Computers and Fluids in honor of Cecil E. (Chuck) Leith
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Ye; Herring, Jackson
Assessment of multiple DWI offender restrictions
DOT National Transportation Integrated Search
1989-12-01
This report discusses nine new approaches for reducing recidivism among multiple DWI offenders: dedicated detention facilities, diversion programs, electronic monitoring, ignition interlock systems, intensive probation supervision, publishing offende...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Mi-Ae; Moore, Stephen C.; McQuaid, Sarah J.
Purpose: The authors have previously reported the advantages of high-sensitivity single-photon emission computed tomography (SPECT) systems for imaging structures located deep inside the brain. DaTscan (ioflupane I-123) is a dopamine transporter (DaT) imaging agent that has shown potential for early detection of Parkinson disease (PD), as well as for monitoring progression of the disease. Realizing the full potential of DaTscan requires efficient estimation of striatal uptake from SPECT images. They have evaluated two SPECT systems, a conventional dual-head gamma camera with low-energy high-resolution collimators (conventional) and a dedicated high-sensitivity multidetector cardiac imaging system (dedicated), for imaging tasks related to PD. Methods: Cramer-Rao bounds (CRB) on the precision of estimates of striatal and background activity concentrations were calculated from high-count, separate acquisitions of the compartments (right striata, left striata, background) of a striatal phantom. CRB on striatal and background activity concentration were calculated from essentially noise-free projection datasets, synthesized by scaling and summing the compartment projection datasets, for a range of total detected counts. They also calculated variances of estimates of specific-to-nonspecific binding ratios (BR) and asymmetry indices from these values using propagation-of-error analysis, as well as the precision of measuring changes in BR on the order of the average annual decline in early PD. Results: Under typical clinical conditions, the conventional camera detected 2 M counts while the dedicated camera detected 12 M counts. Assuming a normal BR of 5, the standard deviation of BR estimates was 0.042 and 0.021 for the conventional and dedicated system, respectively. For an 8% decrease to BR = 4.6, the signal-to-noise ratios were 6.8 (conventional) and 13.3 (dedicated); for a 5% decrease, they were 4.2 (conventional) and 8.3 (dedicated).
Conclusions: This implies that PD can be detected earlier with the dedicated system than with the conventional system; therefore, earlier identification of PD progression should be possible with the high-sensitivity dedicated SPECT camera.
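The reported signal-to-noise ratios are consistent with treating a change in BR as the difference of two independent measurements, so the standard deviation of the change is √2 times the single-measurement standard deviation. A quick check, using the σ values quoted in the abstract (the small differences from the reported 6.8, 13.3, and 8.3 are consistent with rounding of σ):

```python
# Sanity check of the SNR figures: SNR = |delta BR| / (sqrt(2) * sigma),
# assuming the change is the difference of two independent measurements.
from math import sqrt

def change_snr(delta_br, sigma_single):
    """SNR of a BR change measured as the difference of two noisy estimates."""
    return abs(delta_br) / (sqrt(2) * sigma_single)

br = 5.0
for label, sigma in (("conventional", 0.042), ("dedicated", 0.021)):
    print(label,
          round(change_snr(0.08 * br, sigma), 1),   # 8% decline, BR 5 -> 4.6
          round(change_snr(0.05 * br, sigma), 1))   # 5% decline
# conventional 6.7 4.2
# dedicated 13.5 8.4
```

Halving σ (from 0.042 to 0.021) doubles every SNR, which is why the dedicated system can resolve a given annual decline in roughly half the follow-up interval.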
The New Generation of Information Systems.
ERIC Educational Resources Information Center
Grunwald, Peter
1990-01-01
A new generation of home-use electronic information systems could help transform American schooling. These services reach beyond computer enthusiasts, using various combinations of mass marketing techniques, attractive graphics, easy-to-use controls, localized information, low-cost access, and dedicated terminals. Representative samples include…
CAM: A high-performance cellular-automaton machine
NASA Astrophysics Data System (ADS)
Toffoli, Tommaso
1984-01-01
CAM is a high-performance machine dedicated to the simulation of cellular automata and other distributed dynamical systems. Its speed is about one thousand times greater than that of a general-purpose computer programmed to do the same task; in practical terms, this means that CAM can show the evolution of cellular automata on a color monitor with an update rate, dynamic range, and spatial resolution comparable to those of a Super-8 movie, thus permitting intensive interactive experimentation. Machines of this kind can open up novel fields of research, and in this context it is important that results be easy to obtain, reproduce, and transmit. For these reasons, in designing CAM it was important to achieve functional simplicity, high flexibility, and moderate production cost. We expect that many research groups will be able to own their own copy of the machine to do research with it.
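As a sense of the kind of computation CAM accelerates, here is a minimal one-dimensional cellular-automaton update (Wolfram rule 110) in plain Python, the sort of general-purpose implementation the abstract's thousand-fold speedup is measured against. The lattice size and rule number are arbitrary choices for illustration:

```python
# One-dimensional binary cellular automaton on a ring. The rule number's
# bit k gives the next state for a neighborhood whose 3 cells encode k
# in binary (left cell = 4, center = 2, right = 1).

def step(cells, rule=110):
    """Advance one generation on a ring of binary cells."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

row = [0] * 15 + [1]  # single live cell on a 16-cell ring
for _ in range(4):
    row = step(row)
print(sum(row))  # 5 live cells after 4 generations
```

A dedicated machine like CAM performs exactly this local lookup, but for every cell of a large lattice in parallel on each video frame, which is where the speed advantage over the sequential loop above comes from.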
Benefits and costs of low thrust propulsion systems
NASA Technical Reports Server (NTRS)
Robertson, R. I.; Rose, L. J.; Maloy, J. E.
1983-01-01
The results of costs/benefits analyses of three chemical propulsion systems that are candidates for transferring high density, low volume STS payloads from LEO to GEO are reported. Separate algorithms were developed for benefits and costs of primary propulsion systems (PPS) as functions of the required thrust levels. The life cycle costs of each system were computed based on the developmental, production, and deployment costs. A weighted criteria rating approach was taken for the benefits, with each benefit assigned a value commensurate with its relative worth to the overall system. Support costs were included in the costs modeling. Reference missions from NASA, commercial, and DoD catalog payloads were examined. The program was concluded to be reliable and flexible for evaluating benefits and costs of launch and orbit transfer for any catalog mission, with the most beneficial PPS being a dedicated low thrust configuration using the RL-10 system.
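The weighted criteria rating described above reduces to a weight-times-rating sum over the benefit criteria. A minimal sketch, with hypothetical criteria, weights, and scores rather than values from the study:

```python
# Weighted-criteria benefit rating: each criterion carries a weight
# reflecting its relative worth; a candidate's score is the sum of
# weight * rating. All names and numbers below are hypothetical.

def weighted_score(weights, ratings):
    """Sum of weight * rating over the shared criteria."""
    return sum(weights[c] * ratings[c] for c in weights)

weights = {"payload_mass": 40, "reliability": 35, "reusability": 25}
dedicated_low_thrust = {"payload_mass": 9, "reliability": 8, "reusability": 6}
shared_stage = {"payload_mass": 7, "reliability": 8, "reusability": 5}

print(weighted_score(weights, dedicated_low_thrust))  # 790
print(weighted_score(weights, shared_stage))          # 685
```

Ranking candidates by this score, alongside separately modeled life cycle costs, is the benefits half of the cost/benefit comparison the abstract describes.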
NASA Astrophysics Data System (ADS)
Nagy, Tamás; Vadai, Gergely; Gingl, Zoltán
2017-09-01
Modern measurement of physical signals is based on the use of sensors, electronic signal conditioning, analog-to-digital conversion and digital signal processing carried out by dedicated software. The same signal chain is used in many devices such as home appliances, automotive electronics, medical instruments, and smartphones. Teaching the theoretical, experimental, and signal processing background must be an essential part of improving the standard of higher education, and it fits well with the increasingly multidisciplinary nature of physics and engineering too. In this paper, we show how digital phonocardiography can be used in university education as a universal, highly scalable, exciting, and inspiring laboratory practice and as a demonstration at various levels and complexity. We have developed open-source software templates in modern programming languages to support immediate use and to serve as a basis of further modifications using personal computers, tablets, and smartphones.
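As a minimal sketch of the software end of the signal chain described above (sensor, conditioning, ADC, digital processing), the following pure-Python moving-average filter smooths a synthetic heart-sound-like signal. The sampling rate, frequencies, and window size are illustrative assumptions; a real phonocardiograph would use a proper band-pass filter:

```python
# Crude low-pass stage of a digital signal chain: a moving-average
# filter applied to sampled data. Parameters are illustrative.
from math import sin, pi

FS = 1000  # sampling rate in Hz (assumed)

def moving_average(samples, window=8):
    """Average each sample with up to window-1 predecessors."""
    out = []
    acc = 0.0
    for i, x in enumerate(samples):
        acc += x
        if i >= window:
            acc -= samples[i - window]
        out.append(acc / min(i + 1, window))
    return out

# Synthetic signal: a 30 Hz "heart sound" component plus 300 Hz noise
sig = [sin(2 * pi * 30 * t / FS) + 0.5 * sin(2 * pi * 300 * t / FS)
       for t in range(FS)]
smoothed = moving_average(sig)
print(len(smoothed) == len(sig))  # True
```

The 8-sample window spans over two full periods of the 300 Hz component, so that component averages nearly to zero while the 30 Hz component passes through, which is the qualitative behavior any low-pass stage in such a teaching setup demonstrates.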
DOE Office of Scientific and Technical Information (OSTI.GOV)
David, G.; Rapp, R.; Ruan, L.
The RIKEN BNL Research Center (RBRC) was established in April 1997 at Brookhaven National Laboratory. It is funded by the ''Rikagaku Kenkyusho'' (RIKEN, The Institute of Physical and Chemical Research) of Japan and the U.S. Department of Energy's Office of Science. The RBRC is dedicated to the study of strong interactions, including spin physics, lattice QCD, and RHIC physics, through the nurturing of a new generation of young physicists. The RBRC has theory, lattice gauge computing and experimental components. It is presently exploring the possibility of an astrophysics component being added to the program. The primary theme of this workshop was sharing the latest experimental and theoretical developments in the area of low transverse momentum (pT) dielectrons and photons. All the presentations given at the workshop are included in this proceedings, primarily as PowerPoint presentations.
Sixteenth ARPA Systems and Technology Symposium
1993-06-22
10:1 weight reduction over existing MILSTAR feed networks. In addition, EMS has demonstrated their dedication to ARPA and this technology by cost... Corporation, Computing Devices International, DynCorp-Meridian, COMSAT Laboratories, E-Systems Inc., Context Systems, Eastman Kodak Company, Contraves Inc., EG&G, CTA... were outstanding mathematicians and said, "Your first project is to compute how much volume and weight of water would fill the light bulb." He gave
Developing inclusive employment: lessons from Telenor Open Mind.
Kalef, Laura; Barrera, Magda; Heymann, Jody
2014-01-01
Despite significant gains in legal rights for people with disabilities, the employment rate for individuals with disabilities in many countries remains extremely low. Programs to promote the inclusion of people with disabilities in the workforce can have an important impact on individuals' economic and social prospects, as well as yield societal benefits. This article explores Telenor Open Mind, a job training program at Norway's largest telecommunications company, run with financial support from Norway's Labor and Welfare Organization (NAV), which acts as a springboard into the workplace for individuals with disabilities. A qualitative case study design was utilized to explore the Telenor Open Mind Program. Drawing on field research conducted in Oslo during 2011, this article explores the subjective experiences of individuals involved with the program through interviews and program observations. Telenor Open Mind's two-year program consists of a three-month training period, in which individuals participate in computer and self-development courses, followed by a 21-month paid internship in which participants gain hands-on experience. The program has an average 75% rate of employment upon completion and a high rate of participant satisfaction. Participation in the program led to increased self-confidence and social development. The company experienced benefits from greater workplace satisfaction and reductions in sick-leave rates. The Telenor Open Mind program has provided benefits for participants, the company, and society as a whole. Participants gain training, work experience, and increased employability. Telenor gains dedicated and trained employees, in addition to reducing sick-leave absences among all employees. Finally, society benefits from the Open Mind program as the individuals who gain employment become taxpayers and no longer need to receive benefits from the government.
NASA Astrophysics Data System (ADS)
Angeli, C.; Cimiraglia, R.
2013-02-01
A symbolic program performing the Formal Reduction of Density Operators (FRODO), formerly developed in the MuPAD computer algebra system with the purpose of evaluating the matrix elements of the electronic Hamiltonian between internally contracted functions in a complete active space (CAS) scheme, has been rewritten in Mathematica. New version program summary. Program title: FRODO. Catalogue identifier: ADVY_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVY_v2_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 3878. No. of bytes in distributed program, including test data, etc.: 170729. Distribution format: tar.gz. Programming language: Mathematica. Computer: Any computer on which the Mathematica computer algebra system can be installed. Operating system: Linux. Classification: 5. Catalogue identifier of previous version: ADVY_v1_0. Journal reference of previous version: Comput. Phys. Comm. 171 (2005) 63. Does the new version supersede the previous version?: No. Nature of problem: In order to improve on the CAS-SCF wavefunction one can resort to multireference perturbation theory or configuration interaction based on internally contracted functions (ICFs), which are obtained by application of the excitation operators to the reference CAS-SCF wavefunction. The previous formulation of such matrix elements in the MuPAD computer algebra system has been rewritten using Mathematica. Solution method: The method adopted consists in successively eliminating all occurrences of inactive orbital indices (core and virtual) from the products of excitation operators which appear in the definition of the ICFs and in the electronic Hamiltonian expressed in the second-quantization formalism.
Reasons for new version: Some years ago we published in this journal a couple of papers [1, 2], hereafter referred to as papers I and II, respectively, dedicated to the automated evaluation of the matrix elements of the molecular electronic Hamiltonian between internally contracted functions [3] (ICFs). In paper II the program FRODO (after Formal Reduction Of Density Operators) was presented, with the purpose of providing working formulas for each occurrence of the ICFs. The original FRODO program was written in the MuPAD computer algebra system [4] and was actively used in our group for the generation of the matrix elements employed in third-order n-electron valence state perturbation theory (NEVPT) [5-8] as well as in internally contracted configuration interaction (IC-CI) [9]. We present a new version of the program FRODO written in the Mathematica system [10]. The reason for rewriting the program is that, on the one hand, MuPAD no longer seems to be available as a stand-alone system and, on the other hand, Mathematica, due to its ubiquity, appears to be the most widely used computer algebra system nowadays. Restrictions: The program is limited to no more than doubly excited ICFs. Running time: The examples described in the Readme file take a few seconds to run. References: [1] C. Angeli, R. Cimiraglia, Comput. Phys. Comm. 166 (2005) 53. [2] C. Angeli, R. Cimiraglia, Comput. Phys. Comm. 171 (2005) 63. [3] H.-J. Werner, P. J. Knowles, Adv. Chem. Phys. 89 (1988) 5803. [4] B. Fuchssteiner, W. Oevel: http://www.mupad.de, MuPAD research group, University of Paderborn. MuPAD version 2.5.3 for Linux. [5] C. Angeli, R. Cimiraglia, S. Evangelisti, T. Leininger, J.-P. Malrieu, J. Chem. Phys. 114 (2001) 10252. [6] C. Angeli, R. Cimiraglia, J.-P. Malrieu, J. Chem. Phys. 117 (2002) 9138. [7] C. Angeli, B. Bories, A. Cavallini, R. Cimiraglia, J. Chem. Phys. 124 (2006) 054108. [8] C. Angeli, M. Pastore, R. Cimiraglia, Theor. Chem. Acc. 117 (2007) 743. [9] C. Angeli, R. Cimiraglia, Mol. Phys., in press, DOI:10.1080/00268976.2012.689872. [10] http://www.wolfram.com/Mathematica. Mathematica version 8 for Linux.
Efficient iterative image reconstruction algorithm for dedicated breast CT
NASA Astrophysics Data System (ADS)
Antropova, Natalia; Sanchez, Adrian; Reiser, Ingrid S.; Sidky, Emil Y.; Boone, John; Pan, Xiaochuan
2016-03-01
Dedicated breast computed tomography (bCT) is currently being studied as a potential screening method for breast cancer. The X-ray exposure is set low to achieve an average glandular dose comparable to that of mammography, yielding projection data that contain high levels of noise. Iterative image reconstruction (IIR) algorithms may be well suited to the system, since they can reduce the effects of noise in the reconstructed images. However, IIR outcomes can be difficult to control, since the algorithm parameters do not correspond directly to image properties. IIR algorithms are also computationally demanding, and their optimal parameter settings depend on the size and shape of the breast and the positioning of the patient. In this work, we design an efficient IIR algorithm whose parameters have a well-defined meaning and which can be used on a large, diverse sample of bCT cases. The flexibility and efficiency of the method come from producing the final image as a linear combination of two separately reconstructed images: one containing gray-level information and the other with enhanced high-frequency components. Both images result from a few iterations of separate IIR algorithms. The proposed algorithm depends on two parameters, both of which have a well-defined impact on image quality. The algorithm is applied to numerous bCT cases from a dedicated bCT prototype system developed at the University of California, Davis.
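The final step the abstract describes, blending a gray-level reconstruction with an edge-enhanced one, can be sketched as follows. This is an illustrative sketch only: the function name and the single blending weight `beta` are assumptions, not the authors' actual parameterization.

```python
import numpy as np

# Hypothetical sketch of the two-image combination described above: the
# final image is a linear combination of a smooth, gray-level reconstruction
# and a high-frequency (edge-enhanced) reconstruction, each produced by a
# few iterations of its own IIR algorithm (not shown here).
def combine_reconstructions(gray_image, high_freq_image, beta):
    """Blend a low-noise gray-level image with a high-frequency image.

    beta = 0 keeps only the smooth reconstruction; larger beta adds
    progressively more edge detail (and noise).
    """
    return gray_image + beta * high_freq_image

rng = np.random.default_rng(0)
gray = rng.random((4, 4))                   # stand-in for the smooth image
edges = rng.random((4, 4))                  # stand-in for the edge image
combined = combine_reconstructions(gray, edges, beta=0.5)
print(combined.shape)  # (4, 4)
```

The appeal of this structure is that each parameter acts on one visible image property (overall gray levels vs. edge sharpness), which is what makes the outcome controllable.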
Line-by-line spectroscopic simulations on graphics processing units
NASA Astrophysics Data System (ADS)
Collange, Sylvain; Daumas, Marc; Defour, David
2008-01-01
We report here on software that performs line-by-line spectroscopic simulations on gases. Elaborate models (such as narrow-band and correlated-K) are accurate and efficient for bands where the various components are not simultaneously and significantly active. Line-by-line is probably the most accurate model in the infrared for blends of gases that contain high proportions of H2O and CO2, as was the case for our prototype simulation. Our implementation on graphics processing units sustains a speedup close to 330 on computation-intensive tasks and 12 on memory-intensive tasks compared with implementations on one core of high-end processors. This speedup is due to data parallelism, efficient memory access for specific patterns, and some dedicated hardware operators available only in graphics processing units. It is obtained while leaving most processor resources available, and it would scale linearly with the number of graphics processing units in parallel machines. Line-by-line simulation coupled with simulation of fluid dynamics was long believed to be economically intractable, but our work shows that it can be done with affordable additional resources compared with what is necessary to perform simulations of fluid dynamics alone. Program summary. Program title: GPU4RE. Catalogue identifier: ADZY_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADZY_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 62 776. No. of bytes in distributed program, including test data, etc.: 1 513 247. Distribution format: tar.gz. Programming language: C++. Computer: x86 PC. Operating system: Linux, Microsoft Windows. Compilation requires either gcc/g++ under Linux or Visual C++ 2003/2005 and Cygwin under Windows.
It has been tested using gcc 4.1.2 under Ubuntu Linux 7.04 and using Visual C++ 2005 with Cygwin 1.5.24 under Windows XP. RAM: 1 gigabyte. Classification: 21.2. External routines: OpenGL (http://www.opengl.org). Nature of problem: Simulating radiative transfer in high-temperature, high-pressure gases. Solution method: Line-by-line Monte Carlo ray tracing. Unusual features: Parallel computations are moved to the GPU. Additional comments: An nVidia GeForce 7000 or ATI Radeon X1000 series graphics processing unit is required. Running time: A few minutes.
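The core of a line-by-line computation is summing one profile per spectral line over a fine wavenumber grid, which is exactly the kind of embarrassingly data-parallel loop the GPU implementation above exploits. Here is a minimal CPU sketch using Lorentzian profiles; the line centers, strengths, and half-width are invented for illustration and are not data from the paper.

```python
import numpy as np

# Line-by-line absorption: sum one Lorentzian profile per spectral line
# over a wavenumber grid. Every grid point / line pair is independent,
# which is what makes the computation map well onto GPUs.
def absorption(nu, line_centers, strengths, gamma):
    """Sum of Lorentzian line profiles evaluated on a wavenumber grid."""
    # Broadcasting: (n_nu, 1) against (n_lines,) gives an (n_nu, n_lines) array.
    d = nu[:, None] - line_centers[None, :]
    profiles = strengths / np.pi * gamma / (d**2 + gamma**2)
    return profiles.sum(axis=1)

nu = np.linspace(2000.0, 2100.0, 10_000)        # wavenumber grid, cm^-1
centers = np.array([2020.0, 2050.0, 2075.0])    # hypothetical line centers
strengths = np.array([1.0, 0.5, 2.0])           # hypothetical line strengths
k = absorption(nu, centers, strengths, gamma=0.1)
print(k.shape)  # (10000,)
```

Real codes use Voigt profiles and millions of lines from databases such as HITRAN, but the structure of the inner loop is the same.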
Sorensen, Mads Solvsten; Mosegaard, Jesper; Trier, Peter
2009-06-01
Existing virtual simulators for middle ear surgery are based on 3-dimensional (3D) models from computed tomographic or magnetic resonance imaging data, in which image quality is limited by the lack of detail (maximum, approximately 50 voxels/mm3), natural color, and texture of the source material. Virtual training often requires the purchase of a program, a customized computer, and expensive peripherals dedicated exclusively to this purpose. The Visible Ear freeware library of digital images from a fresh-frozen human temporal bone was segmented and real-time volume rendered as a 3D model of high fidelity, true color, and great anatomic detail and realism of the surgically relevant structures. A haptic drilling model was developed for surgical interaction with the 3D model. Realistic visualization in high fidelity (approximately 125 voxels/mm3) and true color, in 2D or optional anaglyph stereoscopic 3D, was achieved on a standard Core 2 Duo personal computer with a GeForce 8800 GTX graphics card, and surgical interaction was provided through a relatively inexpensive (approximately $2,500) Phantom Omni haptic 3D pointing device. This prototype is published for download (approximately 120 MB) as freeware at http://www.alexandra.dk/ves/index.htm. With increasing personal computer performance, future versions may include enhanced resolution (up to 8,000 voxels/mm3) and realistic interaction with deformable soft tissue components such as skin, tympanic membrane, dura, and cholesteatomas, features some of which are not possible with computed tomographic/magnetic resonance imaging-based systems.
Analysis hierarchical model for discrete event systems
NASA Astrophysics Data System (ADS)
Ciortea, E. M.
2015-11-01
This paper presents a hierarchical model based on discrete-event networks for robotic systems. In the hierarchical approach, the Petri net is analysed as a network spanning from the highest conceptual level down to the lowest level of local control, and extended Petri nets are used for modelling and control of complex robotic systems. Such a system is structured, controlled, and analysed in this paper using the Visual Object Net++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented on computers and analysed using specialized programs. Implementation of the hierarchical discrete-event model as a real-time operating system on a computer network connected via a serial bus is possible, where each computer is dedicated to the local Petri model of one subsystem of the global robotic system. Since Petri models are simple enough to run on general-purpose computers, the analysis, modelling, and control of complex manufacturing systems can be achieved using Petri nets. Discrete-event systems are a pragmatic modelling tool for industrial systems, and Petri nets are used here because the system under study is discrete-event in nature. To capture auxiliary times, the Petri model of the transport stream is divided into hierarchical levels whose sections are analysed successively. Simulation of the proposed robotic system using timed Petri nets offers the opportunity to view the timing of the robotic system: by measuring transport and transmission times on the spot, graphics are obtained showing the average time for the transport activity, using the parameter sets of the finished products.
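The kind of discrete-event model discussed above can be made concrete with a minimal Petri net simulator. The sketch below is illustrative only: the place names and the toy transport transition are invented for this example, not taken from the paper.

```python
# A minimal discrete-event Petri net: places hold tokens, and a transition
# fires when every input place holds enough tokens, consuming its inputs
# and producing its outputs.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)          # place -> token count
        self.transitions = {}                 # name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking[p] >= n for p, n in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p, n in inputs.items():
            self.marking[p] -= n
        for p, n in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n

# Toy transport subsystem: a robot moves a part from a buffer to a machine.
net = PetriNet({"buffer": 2, "robot_free": 1, "machine": 0})
net.add_transition("load",
                   {"buffer": 1, "robot_free": 1},
                   {"machine": 1, "robot_free": 1})
net.fire("load")
print(net.marking)  # {'buffer': 1, 'robot_free': 1, 'machine': 1}
```

A timed Petri net, as used in the paper, additionally attaches a delay to each transition; the firing rule itself is the same.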
Scientific Visualization & Modeling for Earth Systems Science Education
NASA Technical Reports Server (NTRS)
Chaudhury, S. Raj; Rodriguez, Waldo J.
2003-01-01
Providing research experiences for undergraduate students in Earth Systems Science (ESS) poses several challenges at smaller academic institutions that may lack dedicated resources for this area of study. This paper describes the development of an innovative model that involves students with majors in diverse scientific disciplines in authentic ESS research. In studying global climate change, experts typically use scientific visualization techniques applied to remote sensing data collected by satellites. In particular, many problems related to environmental phenomena can be quantitatively addressed through investigations based on datasets from scientific endeavours such as the Earth Radiation Budget Experiment (ERBE). Working with data products stored at NASA's Distributed Active Archive Centers, visualization software specifically designed for students, and an advanced, immersive Virtual Reality (VR) environment, students engage in guided research projects during a structured 6-week summer program. Over its 5-year span, this program has afforded the opportunity for students majoring in biology, chemistry, mathematics, computer science, physics, engineering, and science education to work collaboratively in teams on research projects that emphasize the use of scientific visualization in studying the environment. Recently, a hands-on component has been added through science student partnerships with schoolteachers in data collection and reporting for the GLOBE Program (Global Learning and Observations to Benefit the Environment).
Visual gene developer: a fully programmable bioinformatics software for synthetic gene optimization.
Jung, Sang-Kyu; McDonald, Karen
2011-08-16
Direct gene synthesis is becoming more popular owing to decreases in gene synthesis pricing. Compared with using natural genes, gene synthesis provides a good opportunity to optimize gene sequence for specific applications. In order to facilitate gene optimization, we have developed a stand-alone software called Visual Gene Developer. The software not only provides general functions for gene analysis and optimization along with an interactive user-friendly interface, but also includes unique features such as programming capability, dedicated mRNA secondary structure prediction, artificial neural network modeling, network & multi-threaded computing, and user-accessible programming modules. The software allows a user to analyze and optimize a sequence using main menu functions or specialized module windows. Alternatively, gene optimization can be initiated by designing a gene construct and configuring an optimization strategy. A user can choose several predefined or user-defined algorithms to design a complicated strategy. The software provides expandable functionality as platform software supporting module development using popular script languages such as VBScript and JScript in the software programming environment. Visual Gene Developer is useful for both researchers who want to quickly analyze and optimize genes, and those who are interested in developing and testing new algorithms in bioinformatics. The software is available for free download at http://www.visualgenedeveloper.net.
Visual gene developer: a fully programmable bioinformatics software for synthetic gene optimization
2011-01-01
Background Direct gene synthesis is becoming more popular owing to decreases in gene synthesis pricing. Compared with using natural genes, gene synthesis provides a good opportunity to optimize gene sequence for specific applications. In order to facilitate gene optimization, we have developed a stand-alone software called Visual Gene Developer. Results The software not only provides general functions for gene analysis and optimization along with an interactive user-friendly interface, but also includes unique features such as programming capability, dedicated mRNA secondary structure prediction, artificial neural network modeling, network & multi-threaded computing, and user-accessible programming modules. The software allows a user to analyze and optimize a sequence using main menu functions or specialized module windows. Alternatively, gene optimization can be initiated by designing a gene construct and configuring an optimization strategy. A user can choose several predefined or user-defined algorithms to design a complicated strategy. The software provides expandable functionality as platform software supporting module development using popular script languages such as VBScript and JScript in the software programming environment. Conclusion Visual Gene Developer is useful for both researchers who want to quickly analyze and optimize genes, and those who are interested in developing and testing new algorithms in bioinformatics. The software is available for free download at http://www.visualgenedeveloper.net. PMID:21846353
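One of the simplest optimization strategies a tool like this supports is back-translation with the host organism's preferred codons. The sketch below illustrates that idea only; the tiny codon table is a hypothetical example and is not taken from Visual Gene Developer or any real codon-usage database.

```python
# Toy illustration of one common gene-optimization step: back-translating a
# protein sequence using the single most-used codon per amino acid in a
# hypothetical expression host. Real tools also weigh mRNA secondary
# structure, GC content, restriction sites, etc.
PREFERRED_CODON = {          # amino acid -> preferred codon (assumed values)
    "M": "ATG",
    "K": "AAA",
    "L": "CTG",
    "*": "TAA",              # stop
}

def back_translate(protein):
    """Build a coding sequence from the host's preferred codon for each residue."""
    return "".join(PREFERRED_CODON[aa] for aa in protein)

print(back_translate("MKL*"))  # ATGAAACTGTAA
```

A user-defined strategy in the software would chain several such steps, each scoring or rewriting the sequence according to a different objective.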
Army Hearing Program Talking Points Calendar Year 2016
2017-09-12
Reserve ARMY HEARING PROGRAM TALKING POINTS CALENDAR YEAR 2016 TIP No. 51-065-0817 2 BACKGROUND Hearing health in the Army has improved...over time, largely due to the dedicated work of hearing health experts. However, noise-induced hearing loss and associated problems have not been...eliminated. The Army Hearing Program continually evolves to address hearing health challenges, and maintains the momentum to build iteratively upon
1988-09-01
maintenance programs. They use "a dedicated age exploration technique and actuarial analyses (31:847)" to justify any changes to programs. RAAF. The...A066593). 8. Coffin, M.D. and C.F. Tiffany. "New Air Force Requirements for Structural Safety, Durability and Life Management," AIAA/ASME/SAE 16th
Adult Congenital Cardiac Care.
Kogon, Brian E; Miller, Kati; Miller, Paula; Alsoufi, Bahaaldin; Rosenblum, Joshua M
2017-03-01
The Adult Congenital Heart Association (ACHA) is dedicated to supporting patients with congenital heart disease. To guide patients to qualified providers and programs, it maintains a publicly accessible directory of dedicated adult congenital cardiac programs. We analyzed the directory in 2006 and 2015, aiming to evaluate the growth of the directory as a whole and to evaluate the growth of individual programs within the directory. We also hope this raises awareness of the growing opportunities that exist in adult congenital cardiology and cardiac surgery. Data in the directory are self-reported. Only data from US programs were collected and analyzed. By the end of 2015, compared to 2006, there were more programs reporting to the directory in more states (107 programs across 42 states vs 57 programs across 33 states), with higher overall clinical volume (591 vs 164 half-day clinics per week, 96,611 vs 34,446 patient visits). On average, each program was busier (5 vs 2 half-day clinics per week per program). Over the time period, the number of reported annual operations performed nearly doubled (4,346 operations by 210 surgeons vs 2,461 operations by 125 surgeons). Access to ancillary services including specific clinical diagnostic and therapeutic services also expanded. Between 2006 and 2015, the clinical directory and the individual programs have grown. Current directory data may provide benchmarks for staffing and services for newly emerging and existing programs. Verifying the accuracy of the information and inclusion of all programs will be important in the future.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Drugan, C.
2009-12-07
The word 'breakthrough' aptly describes the transformational science and milestones achieved at the Argonne Leadership Computing Facility (ALCF) throughout 2008. The number of research endeavors undertaken at the ALCF through the U.S. Department of Energy's (DOE) Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program grew from 9 in 2007 to 20 in 2008. The allocation of computer time awarded to researchers on the Blue Gene/P also spiked significantly, from nearly 10 million processor hours in 2007 to 111 million in 2008. To support this research, we expanded the capabilities of Intrepid, an IBM Blue Gene/P system at the ALCF, to 557 teraflops (TF) for production use. Furthermore, we enabled breakthrough levels of productivity and capability in visualization and data analysis with Eureka, a powerful installation of NVIDIA Quadro Plex S4 external graphics processing units. Eureka delivered a quantum leap in visual compute density, providing more than 111 TF and more than 3.2 terabytes of RAM. On April 21, 2008, the dedication of the ALCF realized DOE's vision to bring the power of the Department's high performance computing to open scientific research. In June, the IBM Blue Gene/P supercomputer at the ALCF debuted as the world's fastest for open science and third fastest overall. No question that the science benefited from this growth and system improvement. Four research projects spearheaded by Argonne National Laboratory computer scientists and ALCF users were named to the list of top ten scientific accomplishments supported by DOE's Advanced Scientific Computing Research (ASCR) program. Three of the top ten projects used extensive grants of computing time on the ALCF's Blue Gene/P to model the molecular basis of Parkinson's disease, design proteins at atomic scale, and create enzymes. As the year came to a close, the ALCF was recognized with several prestigious awards at SC08 in November.
We provided resources for Linear Scaling Divide-and-Conquer Electronic Structure Calculations for Thousand Atom Nanostructures, a collaborative effort between Argonne, Lawrence Berkeley National Laboratory, and Oak Ridge National Laboratory that received the ACM Gordon Bell Prize Special Award for Algorithmic Innovation. The ALCF also was named a winner in two of the four categories in the HPC Challenge best performance benchmark competition.
Teaching Cybersecurity Using the Cloud
ERIC Educational Resources Information Center
Salah, Khaled; Hammoud, Mohammad; Zeadally, Sherali
2015-01-01
Cloud computing platforms can be highly attractive to conduct course assignments and empower students with valuable and indispensable hands-on experience. In particular, the cloud can offer teaching staff and students (whether local or remote) on-demand, elastic, dedicated, isolated, (virtually) unlimited, and easily configurable virtual machines.…
The anomaly data base of screwworm information
NASA Technical Reports Server (NTRS)
Giddings, L. E.
1976-01-01
Standard statistical processing of anomaly data in the screwworm eradication data system is possible from data compiled on magnetic tapes with the Univac 1108 computer. The format and organization of the data in the data base, which is also available on dedicated disc storage, are described.
Cone-beam micro computed tomography dedicated to the breast.
Sarno, Antonio; Mettivier, Giovanni; Di Lillo, Francesca; Cesarelli, Mario; Bifulco, Paolo; Russo, Paolo
2016-12-01
We developed a scanner for micro computed tomography dedicated to the breast (BµCT) with a high-resolution flat-panel detector and a microfocus X-ray tube. We evaluated the system spatial resolution via the 3D modulation transfer function (MTF). In addition to conventional absorption-based X-ray imaging, the prototype showed capabilities for propagation-based phase contrast and related edge-enhancement effects in 3D imaging. The system limiting spatial resolution is 6.2 mm⁻¹ (MTF at 10%) in the vertical direction and 3.8 mm⁻¹ in the radial direction, values which compare favorably with the spatial resolution reached by the mini-focus breast CT scanners of other groups. The BµCT scanner was able to detect both microcalcification clusters and masses in an anthropomorphic breast phantom at a dose comparable to that of two-view mammography. The use of a breast holder is proposed in order to allow 1-2 min long scan times without breast motion artifacts. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.
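A limiting spatial resolution quoted as "MTF at 10%" is read off the measured MTF curve as the frequency where the curve falls to 10% of its zero-frequency value. The sketch below shows that readout on a synthetic curve; the exponential falloff and all numbers are illustrative assumptions, not the scanner's measured data.

```python
import numpy as np

# Read the limiting spatial frequency off an MTF curve: the frequency at
# which the MTF drops to 10% of its zero-frequency value, with linear
# interpolation between the two bracketing samples.
def limiting_frequency(freqs, mtf, threshold=0.10):
    mtf = np.asarray(mtf) / mtf[0]          # normalize so MTF(0) = 1
    idx = np.argmax(mtf < threshold)        # first sample below threshold
    f0, f1 = freqs[idx - 1], freqs[idx]
    m0, m1 = mtf[idx - 1], mtf[idx]
    return f0 + (threshold - m0) * (f1 - f0) / (m1 - m0)

freqs = np.linspace(0, 10, 101)             # spatial frequency, cycles/mm
mtf = np.exp(-freqs / 2.5)                  # synthetic exponential falloff
print(round(limiting_frequency(freqs, mtf), 2))  # about 5.76 (= 2.5 * ln 10)
```

On a real scanner the MTF would come from an edge or wire measurement rather than an analytic curve, but the 10% readout is the same.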
The Grad Cohort Workshop: Evaluating an Intervention to Retain Women Graduate Students in Computing
Stout, Jane G.; Tamer, Burçin; Wright, Heather M.; Clarke, Lori A.; Dwarkadas, Sandhya; Howard, Ayanna M.
2017-01-01
Women engaged in computing career tracks are vastly outnumbered by men and often must contend with negative stereotypes about their innate technical aptitude. Research suggests women's marginalized presence in computing may result in women psychologically disengaging, and ultimately dropping out, perpetuating women's underrepresentation in computing. To combat this vicious cycle, the Computing Research Association's Committee on the Status of Women in Computing Research (CRA-W) runs a multi-day mentorship workshop for women graduate students called Grad Cohort, which consists of a speaker series and networking opportunities. We studied the long-term impact of Grad Cohort on women Ph.D. students' (a) dedication to becoming well-known in one's field, and giving back to the community (professional goals), (b) the degree to which one feels computing is an important element of “who they are” (computing identity), and (c) beliefs that computing skills are innate (entity beliefs). Of note, entity beliefs are known to be demoralizing and can lead to disengagement from academic endeavors. We compared a propensity score matched sample of women and men Ph.D. students in computing programs who had never participated in Grad Cohort to a sample of past Grad Cohort participants. Grad Cohort participants reported interest in becoming well-known in their field to a greater degree than women non-participants, and to an equivalent degree as men. Also, Grad Cohort participants reported stronger interest in giving back to the community than their peers. Further, whereas women non-participants identified with computing to a lesser degree than men and held stronger entity beliefs than men, Grad Cohort participants' computing identity and entity beliefs were equivalent to men. Importantly, stronger entity beliefs predicted a weaker computing identity among students, with the exception of Grad Cohort participants. 
This latter finding suggests Grad Cohort may shield students' computing identity from the damaging nature of entity beliefs. Together, these findings suggest Grad Cohort may fortify women's commitment to pursuing computing research careers and move the needle toward greater gender diversity in computing. PMID:28119657
The Grad Cohort Workshop: Evaluating an Intervention to Retain Women Graduate Students in Computing.
Stout, Jane G; Tamer, Burçin; Wright, Heather M; Clarke, Lori A; Dwarkadas, Sandhya; Howard, Ayanna M
2016-01-01
Women engaged in computing career tracks are vastly outnumbered by men and often must contend with negative stereotypes about their innate technical aptitude. Research suggests women's marginalized presence in computing may result in women psychologically disengaging, and ultimately dropping out, perpetuating women's underrepresentation in computing. To combat this vicious cycle, the Computing Research Association's Committee on the Status of Women in Computing Research (CRA-W) runs a multi-day mentorship workshop for women graduate students called Grad Cohort, which consists of a speaker series and networking opportunities. We studied the long-term impact of Grad Cohort on women Ph.D. students' (a) dedication to becoming well-known in one's field, and giving back to the community (professional goals), (b) the degree to which one feels computing is an important element of "who they are" (computing identity), and (c) beliefs that computing skills are innate (entity beliefs). Of note, entity beliefs are known to be demoralizing and can lead to disengagement from academic endeavors. We compared a propensity score matched sample of women and men Ph.D. students in computing programs who had never participated in Grad Cohort to a sample of past Grad Cohort participants. Grad Cohort participants reported interest in becoming well-known in their field to a greater degree than women non-participants, and to an equivalent degree as men. Also, Grad Cohort participants reported stronger interest in giving back to the community than their peers. Further, whereas women non-participants identified with computing to a lesser degree than men and held stronger entity beliefs than men, Grad Cohort participants' computing identity and entity beliefs were equivalent to men. Importantly, stronger entity beliefs predicted a weaker computing identity among students, with the exception of Grad Cohort participants.
This latter finding suggests Grad Cohort may shield students' computing identity from the damaging nature of entity beliefs. Together, these findings suggest Grad Cohort may fortify women's commitment to pursuing computing research careers and move the needle toward greater gender diversity in computing.
Addition of a Hydrological Cycle to the EPIC Jupiter Model
NASA Astrophysics Data System (ADS)
Dowling, T. E.; Palotai, C. J.
2002-09-01
We present a progress report on the development of the EPIC atmospheric model to include clouds, moist convection, and precipitation. Two major goals are: i) to study the influence that convective water clouds have on Jupiter's jets and vortices, such as those to the northwest of the Great Red Spot, and ii) to predict ammonia-cloud evolution for direct comparison to visual images (instead of relying on surrogates for clouds like potential vorticity). Data structures in the model are now set up to handle the vapor, liquid, and solid phases of the most common chemical species in planetary atmospheres. We have adapted the Prather conservation-of-second-order-moments advection scheme to the model, which yields high accuracy in dealing with cloud edges. In collaboration with computer scientists H. Dietz and T. Mattox at the University of Kentucky, we have built a dedicated 40-node parallel computer that achieves 34 Gflops (double precision) at 74 cents per Mflop, and have updated the EPIC-model code to use cache-aware memory layouts and other modern optimizations. The latest test-case results of cloud evolution in the model will be presented. This research is funded by NASA's Planetary Atmospheres and EPSCoR programs.
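The abstract's emphasis on an advection scheme that is accurate at cloud edges can be appreciated with a much simpler scheme. The sketch below is first-order upwind advection in 1-D, shown as an illustrative stand-in (it is not the Prather scheme): it conserves tracer mass exactly but smears sharp edges, which is precisely the defect higher-order moment-conserving schemes are designed to avoid.

```python
import numpy as np

# 1-D first-order upwind advection of a tracer on a periodic domain.
# Mass is conserved exactly, but numerical diffusion smears sharp edges,
# motivating higher-order schemes (e.g. Prather's) for cloud boundaries.
def upwind_step(q, u, dx, dt):
    """Advance tracer q one time step for constant wind u > 0."""
    return q - u * dt / dx * (q - np.roll(q, 1))

nx, dx, u, dt = 100, 1.0, 1.0, 0.5          # CFL = u*dt/dx = 0.5 (stable)
q = np.zeros(nx)
q[40:60] = 1.0                              # square "cloud" of total mass 20
for _ in range(100):
    q = upwind_step(q, u, dx, dt)
print(round(q.sum(), 6))  # 20.0 -- mass conserved, but the edges have smeared
```

Moment-conserving schemes carry sub-grid-cell information (means, slopes, curvatures) along with each cell so that a sharp cloud edge survives many advection steps.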
Salloum, Alison; Crawford, Erika A; Lewin, Adam B; Storch, Eric A
2015-01-01
Computer-assisted cognitive behavioral therapy (CCBT) programs for childhood anxiety are being developed, although research about factors that contribute to implementation of CCBT in community mental health centers (CMHC) is limited. The purpose of this mixed-methods study was to explore consumers' and providers' perceptions of utilizing a CCBT for childhood anxiety in CMHC in an effort to identify factors that may impact implementation of CCBT in CMHC. Focus groups and interviews occurred with 7 parents, 6 children, 3 therapists, 3 project coordinators, and 3 administrators who had participated in CCBT for childhood anxiety. Surveys of treatment satisfaction and treatment barriers were administered to consumers. Results suggest that both consumers and providers were highly receptive to participation in and implementation of CCBT in CMHC. Implementation themes included positive receptiveness, factors related to therapists, treatment components, applicability of treatment, treatment content, initial implementation challenges, resources, dedicated staff, support, outreach, opportunities within the CMHC, payment, and treatment availability. As studies continue to demonstrate the effectiveness of CCBT for childhood anxiety, research needs to continue to examine factors that contribute to the successful implementation of such treatments in CMHC.
Position Paper on College-Sponsored Child Care Programs.
ERIC Educational Resources Information Center
National Coalition for Campus Child Care, Inc., Milwaukee, WI.
Universities must be prepared to provide quality child care not only to accommodate their changing student population, but also to help attract and retain competent and dedicated employees. Campus child care programs should be: (1) models to the community, to early education specialists, to parents, and to policymakers; (2) an integral part of the…
77 FR 32959 - Request for Information on Strategies for Improving Outcomes for Disconnected Youth
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-04
...? What is the best evidence to support these program designs (e.g., correlational or longitudinal... approval to blend funds from multiple funding sources and obtain waivers, such as for program design... 2013 budget does not request dedicated funding for Performance Partnership Pilots as they are designed...
ERIC Educational Resources Information Center
Womble, Myra J.; Adams, J. Elaine; Stitt-Gohdes, Wanda L.
2000-01-01
Focus groups with 25 business and 18 marketing teachers and 6 business/industry representatives elicited the following opinions: the primary purpose of business/marketing education is work force preparation; dedicated faculty and administrative support are ideal features; a strong voice for vocational education is needed; and important skill areas…
Another Perspective: El Sistema--A Perspective for North American Music Educators
ERIC Educational Resources Information Center
Tunstall, Tricia
2013-01-01
Herein, Tricia Tunstall presents a critique of the article by Melissa Lesniak published in the December 2012 "Music Educators Journal," and offers a new perspective on the Venezuelan youth orchestra program known as "El Sistema." The program, which began in Caracas thirty-eight years ago, is dedicated to changing the lives of…
Learning from Elsewhere: Portrayal of Holistic Educators in Ecuador and Vietnam.
ERIC Educational Resources Information Center
Yihong, Fan
A phenomenological research project examined a holistic school in Ecuador and a creativity methodology program in Vietnam. The educators in these programs have dedicated themselves to implementing a holistic and humanistic vision and philosophy of education in their teaching practice. The study demonstrates how they have successfully created a…
Sugarcane Genotype Selection on Muck and Sand Soils in Florida — a Case for Dedicated Environments
USDA-ARS?s Scientific Manuscript database
Traditionally, the cooperative sugarcane (Saccharum spp.) breeding program located at Canal Point has selected genotypes exclusively on muck soils in the early to middle stages of the program. Only about 0.20% of genotypes are ever tested on sand, resulting in the possibility that many sand-adapted ...
Sugarcane Genotype Selection on Muck and Sand Soils in Florida — a Case for Dedicated Environments
USDA-ARS?s Scientific Manuscript database
Traditionally, the cooperative sugarcane breeding program at Canal Point, Florida has selected genotypes exclusively on muck soils in the early to middle stages of the program, resulting in the possibility that many genotypes adapted to sand soils are discarded. The objective of this study was to de...
REL Pacific Program Guide: Working Together to Answer Education Questions That Matter
ERIC Educational Resources Information Center
Regional Educational Laboratory Pacific, 2014
2014-01-01
The Regional Educational Laboratory of the Pacific (REL Pacific) program at McREL International partners with schools, state departments of education, and other education stakeholders to use data and research to drive data-informed decisions. Dedicated, experienced staff work closely with local experts, who provide support for ongoing regional…
Edith de Nancrede at Hull-House: Theatre Programs for Youth.
ERIC Educational Resources Information Center
Hecht, Stuart J.
1991-01-01
Describes the work of Edith de Nancrede in developing theater programs for youth at Chicago's Hull-House during the early part of the twentieth century. Describes how her intense dedication to theater and education contributed to the success of Hull-House and to the achievements of its leader, Jane Addams. (PRA)
Actuarial Science at One Four-Year Comprehensive University
ERIC Educational Resources Information Center
Charlwood, Kevin E.
2014-01-01
Building an Actuarial Science program designated as advanced requires dedicated faculty, support from the administration, and a core group of strong students. Washburn University may serve as a model for those wishing to start or enhance such a program at their institution. We face three main ongoing challenges: first, the hiring and retention of…
Clinical Investigator Development Program | Center for Cancer Research
The Center for Cancer Research (CCR), a division of the National Cancer Institute (NCI), National Institutes of Health (NIH), Department of Health and Human Services (DHHS), is pleased to announce its annual call for applications for the Clinical Investigator Development Program (CIDP). This is an exciting training opportunity intended for physicians interested in dedicating
Assessing Virtue: Measurement in Moral Education at Home and Abroad
ERIC Educational Resources Information Center
Alexander, Hanan A.
2016-01-01
How should we assess programs dedicated to education in virtue? One influential answer draws on quantitative research designs. By measuring the inputs and processes that produce the highest levels of virtue among participants according to some reasonable criterion, in this view, we can determine which programs engender the most desired results.…
Electron Microscopy-Data Analysis Specialist | Center for Cancer Research
PROGRAM DESCRIPTION The Cancer Research Technology Program (CRTP) develops and implements emerging technology, cancer biology expertise and research capabilities to accomplish NCI research objectives. The CRTP is an outward-facing, multi-disciplinary hub purposed to enable the external cancer research community and provides dedicated support to NCI’s intramural Center for
New Program Aims $300-Million at Young Biomedical Researchers
ERIC Educational Resources Information Center
Goodall, Hurley
2008-01-01
Medical scientists just starting at universities have been, more and more often, left empty-handed when the federal government awards grants. To offset this, the Howard Hughes Medical Institute, a nonprofit organization dedicated to medical research, announced a new program that will award $300-million to as many as 70 young scientists. The Early…
Roofs--Their Problems and Solutions.
ERIC Educational Resources Information Center
Swentkofske, Carl J.
Most roofs are meant to withstand the elements for a period of 20 years; to achieve this goal, however, school officials must believe in a dedicated maintenance program and sell it to their superiors and school boards. Establishment of a school district roof maintenance program is explained. Job qualifications and training methods for an inhouse…
MassMutual Partners with EP for a Dynamic Double Play
ERIC Educational Resources Information Center
Exceptional Parent, 2008
2008-01-01
In 2002 "Exceptional Parent" (EP) magazine had a vision--a vision of a dynamic, community outreach program that would raise the public's awareness about the special needs community. This program, now known as Disability Awareness Night, or DAN, would enlighten the public by calling attention to the dedication, perseverance, and the extraordinary…
This article is the preface or editors note to the dedicated issue of the Journal of the Air & Waste Management Association for a selection of scientific papers from the specialty conference entitled, "Particulate Matter Supersites Program and Related Studies," that was...
Quality Control Statistics. CIDR is dedicated to producing the highest quality data for our investigators. These cumulative quality control statistics are based on data from 419 released CIDR Program
The politics of Medicaid: 1980-1989.
Cohen, S S
1990-01-01
Grim statistics on infant mortality and women's health alone are not enough to keep Medicaid funded. What is also needed is a strong, vociferous lobby dedicated to protecting these important programs.
Using Remotely Sensed Data to Automate and Improve Census Bureau Update Activities
NASA Astrophysics Data System (ADS)
Desch, A., IV
2017-12-01
Location of established and new housing structures is fundamental to the Census Bureau's planning and execution of each decennial census. Past Census address list compilation and update programs have involved sending more than 100,000 workers into the field to find and verify housing units. The 2020 Census program has introduced an imagery-based In-Office Address Canvassing Interactive Review (IOAC-IR) program in an attempt to reduce the in-field workload. The human-analyst-driven, aerial-image-based IOAC-IR operation has proven to be a cost-effective and accurate substitute for a large portion of the expensive in-field address canvassing operations. However, the IOAC-IR still required more than a year to complete and over 100 full-time dedicated employees. Much of the basic image analysis work done in IOAC-IR can be handled with established remote sensing and computer vision techniques. The experience gained from the Interactive Review phase of In-Office Address Canvassing has led to the development of a prototype geo-processing tool to automate much of this process for future and ongoing Address Canvassing operations. This prototype utilizes high-resolution aerial imagery and LiDAR to identify structures and compare their location to existing Census geographic information. In this presentation, we report on the comparison of this exploratory system's results to the human-based IOAC-IR. The experimental image- and LiDAR-based change detection approach has itself led to very promising follow-on experiments utilizing very current, high-repeat datasets and scalable cloud computing. We will discuss how these new techniques can be used both to help the US Census Bureau meet its goal of identifying all the housing units in the US and to help developing countries better identify where their population is currently distributed.
A Needs Assessment for a Longitudinal Emergency Medicine Intern Curriculum.
Shappell, Eric; Ahn, James
2017-01-01
A key task of emergency medicine (EM) training programs is to develop a consistent knowledge of core content in recruits with heterogeneous training backgrounds. The traditional model for delivering core content is lecture-based weekly conference; however, a growing body of literature finds this format less effective and less appealing than alternatives. We sought to address this challenge by conducting a needs assessment for a longitudinal intern curriculum for millennial learners. We surveyed all residents from the six EM programs in the greater Chicago area regarding the concept, format, and scope of a longitudinal intern curriculum. We received 153 responses from the 300 residents surveyed (51% response rate). The majority of respondents (80%; 82% of interns) agreed or strongly agreed that a dedicated intern curriculum would add value to residency education. The most positively rated teaching method was simulation sessions (91% positive responses), followed by dedicated weekly conference time (75% positive responses) and dedicated asynchronous resources (71% positive responses). Less than half of respondents (47%; 26% of interns) supported use of textbook readings in the curriculum. There is strong learner interest in a longitudinal intern curriculum. This needs assessment can serve to inform the development of a universal intern curriculum targeting the millennial generation.
Digital Waveguide Architectures for Virtual Musical Instruments
NASA Astrophysics Data System (ADS)
Smith, Julius O.
Digital sound synthesis has become a standard staple of modern music studios, videogames, personal computers, and hand-held devices. As processing power has increased over the years, sound synthesis implementations have evolved from dedicated chip sets, to single-chip solutions, and ultimately to software implementations within processors used primarily for other tasks (such as for graphics or general purpose computing). With the cost of implementation dropping closer and closer to zero, there is increasing room for higher quality algorithms.
Towards Effective Non-Invasive Brain-Computer Interfaces Dedicated to Gait Rehabilitation Systems
Castermans, Thierry; Duvinage, Matthieu; Cheron, Guy; Dutoit, Thierry
2014-01-01
In the last few years, significant progress has been made in the field of walk rehabilitation. Motor cortex signals in bipedal monkeys have been interpreted to predict walk kinematics. Epidural electrical stimulation in rats and in one young paraplegic has been used to partially restore motor control after spinal cord injury. However, these experimental trials are far from being applicable to all patients suffering from motor impairments. Therefore, simpler rehabilitation systems are thought to be desirable in the meantime. The goal of this review is to describe and summarize the progress made in the development of non-invasive brain-computer interfaces dedicated to motor rehabilitation systems. In the first part, the main principles of human locomotion control are presented. The paper then focuses on the mechanisms of supra-spinal centers active during gait, including results from electroencephalography, functional brain imaging technologies [near-infrared spectroscopy (NIRS), functional magnetic resonance imaging (fMRI), positron-emission tomography (PET), single-photon emission-computed tomography (SPECT)] and invasive studies. The first brain-computer interface (BCI) applications to gait rehabilitation are then presented, with a discussion of the different strategies developed in the field. The challenges facing future systems are identified and discussed. Finally, we present some proposals to address these challenges, in order to contribute to the improvement of BCI for gait rehabilitation. PMID:24961699
24 CFR 891.315 - Prohibited facilities.
Code of Federal Regulations, 2014 CFR
2014-04-01
... advances under the Section 811 Program, as well as loans financed under subpart E of this part. Project facilities may not include infirmaries, nursing stations, spaces dedicated to the delivery of medical...
24 CFR 891.315 - Prohibited facilities.
Code of Federal Regulations, 2010 CFR
2010-04-01
... advances under the Section 811 Program, as well as loans financed under subpart E of this part. Project facilities may not include infirmaries, nursing stations, spaces dedicated to the delivery of medical...
24 CFR 891.315 - Prohibited facilities.
Code of Federal Regulations, 2011 CFR
2011-04-01
... advances under the Section 811 Program, as well as loans financed under subpart E of this part. Project facilities may not include infirmaries, nursing stations, spaces dedicated to the delivery of medical...
24 CFR 891.315 - Prohibited facilities.
Code of Federal Regulations, 2012 CFR
2012-04-01
... advances under the Section 811 Program, as well as loans financed under subpart E of this part. Project facilities may not include infirmaries, nursing stations, spaces dedicated to the delivery of medical...
24 CFR 891.315 - Prohibited facilities.
Code of Federal Regulations, 2013 CFR
2013-04-01
... advances under the Section 811 Program, as well as loans financed under subpart E of this part. Project facilities may not include infirmaries, nursing stations, spaces dedicated to the delivery of medical...
Testing of electrical equipment for a commercial grade dedication program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, J.L.; Srinivas, N.
1995-10-01
The availability of qualified safety related replacement parts for use in nuclear power plants has decreased over time. This has caused many nuclear power plants to purchase commercial grade items (CGI) and utilize the commercial grade dedication process to qualify the items for use in nuclear safety related applications. The laboratories of Technical and Engineering Services (the testing facility of Detroit Edison) have been providing testing services for verification of critical characteristics of these items. This paper presents an overview of the experience in testing electrical equipment with an emphasis on fuses.
49 CFR 700.3 - Availability of documents, assistance, and information.
Code of Federal Regulations, 2012 CFR
2012-10-01
... segments dedicated to the following topics: Amtrak's computer system and its communication codes; interline... concerns in a letter or other written communication directed to the appropriate vice president or to the Director of Corporate Communications. Amtrak will bring such communications to the attention of the...
49 CFR 700.3 - Availability of documents, assistance, and information.
Code of Federal Regulations, 2013 CFR
2013-10-01
... segments dedicated to the following topics: Amtrak's computer system and its communication codes; interline... concerns in a letter or other written communication directed to the appropriate vice president or to the Director of Corporate Communications. Amtrak will bring such communications to the attention of the...
49 CFR 700.3 - Availability of documents, assistance, and information.
Code of Federal Regulations, 2011 CFR
2011-10-01
... segments dedicated to the following topics: Amtrak's computer system and its communication codes; interline... concerns in a letter or other written communication directed to the appropriate vice president or to the Director of Corporate Communications. Amtrak will bring such communications to the attention of the...
49 CFR 700.3 - Availability of documents, assistance, and information.
Code of Federal Regulations, 2014 CFR
2014-10-01
... segments dedicated to the following topics: Amtrak's computer system and its communication codes; interline... concerns in a letter or other written communication directed to the appropriate vice president or to the Director of Corporate Communications. Amtrak will bring such communications to the attention of the...
Cyberspace: Devolution and Recovery
2011-03-23
time of the source of the burst and we do not know if it was accidental, an act of God, or a malicious attack. The remainder of a speech like...Security Mailing List, Federal Vulnerability Knowledgebase (VKB), US-CERT Portal, US-CERT Einstein Program, Internet Health and Status Service...The US-CERT portal is a website dedicated to sharing relevant information with participants. The Einstein Program is a program that allows for the
Noonan, Rita K; Gibbs, Deborah
2009-01-01
This special issue captures several threads in the ongoing evolution of sexual violence prevention. The articles that follow examine an empowerment evaluation process with four promising programs dedicated to preventing first-time male perpetration of sexual violence, as well as evaluation findings. Both the evaluation approach and the programs examined shed light on how sexual violence prevention can continue to be improved in the future.
NASA Technical Reports Server (NTRS)
Smith, Llwyn
1995-01-01
This paper describes the Space Test Program (STP) which provides access to space for the DOD-wide space research and development (R&D) community. STP matches a ranked list of sanctioned experiments with available budgets and searches for the most cost effective mechanisms to get the experiments into space. The program has successfully flown over 350 experiments, using dedicated freeflyer spacecraft, secondary space on the Space Shuttle, and various host satellites.
Bistatic passive radar simulator with spatial filtering subsystem
NASA Astrophysics Data System (ADS)
Hossa, Robert; Szlachetko, Boguslaw; Lewandowski, Andrzej; Górski, Maksymilian
2009-06-01
The purpose of this paper is to briefly introduce the structure and features of the developed virtual passive FM radar implemented in the Matlab numerical computation system, and to present alternative modes of its operation. The idea of the proposed solution is based on an analytic representation of transmitted direct signals and reflected echo signals. As a spatial filtering subsystem, a beamforming network of ULA and UCA dipole configurations dedicated to the bistatic radar concept is considered, and computationally efficient procedures are presented in detail. Finally, exemplary results of computer simulations of the elaborated virtual simulator are provided and discussed.
Distributed-Memory Computing With the Langley Aerothermodynamic Upwind Relaxation Algorithm (LAURA)
NASA Technical Reports Server (NTRS)
Riley, Christopher J.; Cheatwood, F. McNeil
1997-01-01
The Langley Aerothermodynamic Upwind Relaxation Algorithm (LAURA), a Navier-Stokes solver, has been modified for use in a parallel, distributed-memory environment using the Message-Passing Interface (MPI) standard. A standard domain decomposition strategy is used in which the computational domain is divided into subdomains with each subdomain assigned to a processor. Performance is examined on dedicated parallel machines and a network of desktop workstations. The effect of domain decomposition and frequency of boundary updates on performance and convergence is also examined for several realistic configurations and conditions typical of large-scale computational fluid dynamic analysis.
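The decomposition strategy described in the abstract — divide the computational domain into subdomains and assign one to each processor — can be illustrated with a toy sketch. This is not LAURA or MPI code; it is a minimal, hypothetical 1-D illustration of contiguous block partitioning with remainder balancing.

```python
def decompose(n_cells: int, n_procs: int) -> list:
    """Split n_cells contiguous cells into n_procs subdomains,
    handing any remainder out one extra cell at a time so that
    subdomain sizes differ by at most one (simple load balance)."""
    base, extra = divmod(n_cells, n_procs)
    bounds, start = [], 0
    for p in range(n_procs):
        size = base + (1 if p < extra else 0)  # first `extra` ranks get one more cell
        bounds.append(range(start, start + size))
        start += size
    return bounds

# 10 cells over 3 processors -> subdomain sizes 4, 3, 3
print([len(r) for r in decompose(10, 3)])
```

In a real distributed-memory solver, each rank would then exchange boundary (halo) values with its neighbors at some update frequency, which is the trade-off the abstract examines.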
Users' guide to new approaches and sanctions for multiple DWI offenders
DOT National Transportation Integrated Search
1989-12-01
This guide describes nine new approaches for reducing recidivism among multiple DWI offenders: dedicated detention facilities, diversion programs, electronic monitoring, ignition interlock systems, intensive probation supervision, publishing offender...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waldron, G.A.
1973-01-01
The symposium was convened to dedicate the new Kansas Geological Survey building at the University of Kansas and conduct an exchange of ideas on the elements of a national energy policy. Dr. William W. Hambleton presented the introductory speech. Papers presented were: The elements of a national energy policy, Merrill W. Haas; A national energy policy - what should it include, Dr. Wilson M. Laird; Elements of a national energy policy, John D. Emerson; National energy policy and environmental quality, Dr. Beatrice E. Willard; Energy and the environment, Jerome H. Svore; A congressional point of view on energy policy, Senator Clifford P. Hansen; and The time element in a national energy policy, Governor Robert D. Ray of Iowa. The dedication program followed. (MCW)
Hall, Katherine C; Diffenderfer, Sandy K; Stidham, April; Mullins, Christine M
2018-04-19
In the 1990s, dedicated education units transformed undergraduate preceptorships, but graduate preceptorships remain static. The dyadic nurse practitioner preceptorship model supports an environment where faculty, students, and preceptors may overlook nuances that affect the teaching-learning process. This article describes an innovative clinical education model, Student and Preceptor Advancement in a Dedicated Education Site, designed to improve preceptorships for advanced practice nurses. The focus is on adaptations made to facilitate use in advanced practice nursing programs. This is an open-access article distributed under the terms of the Creative Commons Attribution-Non Commercial-No Derivatives License 4.0 (CC BY-NC-ND), where it is permissible to download and share the work provided it is properly cited. The work cannot be changed in any way or used commercially without permission from the journal.
Radiotherapy Monte Carlo simulation using cloud computing technology.
Poole, C M; Cornelius, I; Trapp, J V; Langton, C M
2012-12-01
Cloud computing allows for vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware, as a proof of principle.
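The cost observation in the abstract follows from whole-hour billing: a job of T machine-hours split across n machines is billed n × ceil(T/n) hours, which equals T exactly only when n divides T. The sketch below illustrates that arithmetic; the per-hour billing model is an assumption for illustration, not the authors' code.

```python
import math

def relative_cost(total_hours: int, n_machines: int) -> float:
    """Relative cost of splitting a serial job of `total_hours`
    across `n_machines`, assuming each machine is billed in whole
    hours. A value of 1.0 means no rounding overhead versus serial."""
    wall_hours = math.ceil(total_hours / n_machines)  # billed hours per machine
    return n_machines * wall_hours / total_hours

# A 12-hour simulation: divisors of 12 (2, 3, 4, 6, 12) cost 1.0;
# non-divisors such as 5 or 8 pay for partially idle final hours.
for n in (2, 3, 4, 5, 6, 8, 12):
    print(n, relative_cost(12, n))
```

Completion time still falls as 1/n throughout, so the choice of n trades wall-clock time against this rounding overhead.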
10. CUT OF THE CHURCH BUILDING AS IT APPEARED ON THE PRINTED PROGRAM OF THE JANUARY 1, 1893 DEDICATION SERVICE. - Second Presbyterian Church, Pontatoc Avenue & Hernando Street, Memphis, Shelby County, TN
2014-01-16
Zack Jones and Jim Lydon of MSFC's Advanced Manufacturing Team, with MSFC's M2 Selective Laser Melting system. The M2 is currently dedicated to advanced copper material development for the Low Cost Upper Stage program.
EMC Aspects of Turbulence Heating ObserveR (THOR) Spacecraft
NASA Astrophysics Data System (ADS)
Soucek, J.; Ahlen, L.; Bale, S.; Bonnell, J.; Boudin, N.; Brienza, D.; Carr, C.; Cipriani, F.; Escoubet, C. P.; Fazakerley, A.; Gehler, M.; Genot, V.; Hilgers, A.; Hanock, B.; Jannet, G.; Junge, A.; Khotyaintsev, Y.; De Keyser, J.; Kucharek, H.; Lan, R.; Lavraud, B.; Leblanc, F.; Magnes, W.; Mansour, M.; Marcucci, M. F.; Nakamura, R.; Nemecek, Z.; Owen, C.; Phal, Y.; Retino, A.; Rodgers, D.; Safrankova, J.; Sahraoui, F.; Vainio, R.; Wimmer-Schweingruber, R.; Steinhagen, J.; Vaivads, A.; Wielders, A.; Zaslavsky, A.
2016-05-01
Turbulence Heating ObserveR (THOR) is a spacecraft mission dedicated to the study of plasma turbulence in near-Earth space. The mission is currently under study for implementation as part of the ESA Cosmic Vision program. THOR will involve a single spinning spacecraft equipped with state-of-the-art instruments capable of sensitive measurements of electromagnetic fields and plasma particles. The sensitive electric and magnetic field measurements require that spacecraft-generated emissions are restricted and strictly controlled; therefore a comprehensive EMC program has been put in place already during the study phase. The THOR study team and a dedicated EMC working group are formulating the mission EMC requirements in the earliest phase of the project to avoid later delays and cost increases related to EMC. This article introduces the THOR mission and reviews the current state of its EMC requirements.
The first dedicated life sciences Spacelab mission
NASA Technical Reports Server (NTRS)
Perry, T. W.; Rummel, J. A.; Griffiths, L. D.; White, R. J.; Leonard, J. I.
1984-01-01
It is pointed out that the Shuttle-borne Spacelab provides the capability to fly large numbers of life sciences experiments, to retrieve and rescue experimental equipment, and to undertake multiple-flight studies. A NASA Life Sciences Flight Experiments Program has been organized with the aim of taking full advantage of this capability. A description is provided of the scientific aspects of the most ambitious Spacelab mission currently being conducted in connection with this program, taking into account the First Dedicated Life Sciences Spacelab Mission. The payload of this mission will contain the equipment for 24 separate investigations. It is planned to perform the mission on two separate seven-day Spacelab flights, the first of which is currently scheduled for early 1986. Some of the mission objectives are related to the study of human and animal responses which occur promptly upon achieving weightlessness.
Dedicated computer system AOTK for image processing and analysis of horse navicular bone
NASA Astrophysics Data System (ADS)
Zaborowicz, M.; Fojud, A.; Koszela, K.; Mueller, W.; Górna, K.; Okoń, P.; Piekarska-Boniecka, H.
2017-07-01
The aim of the research was to create the dedicated application AOTK (Polish: Analiza Obrazu Trzeszczki Kopytowej) for image processing and analysis of the horse navicular bone. The application was produced using specialized software, namely Visual Studio 2013 and the .NET platform. The AForge.NET libraries were used to implement the image processing and analysis algorithms. The implemented algorithms enable accurate extraction of the characteristics of navicular bones and saving of the data to external files. The modules implemented in AOTK allow calculation of user-selected distances and a preliminary assessment of the structural preservation of the examined objects. The application interface is designed to give the user the best possible view of the analyzed images.
A USNRC perspective on the use of commercial-off-shelf software (COTS) in advanced reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, J.C.
1997-12-01
The use of commercially available digital computer systems and components in safety critical systems (nuclear power plant, military, and commercial applications) is increasing rapidly. While this paper focuses on the software aspects of the application, most of these comments are applicable to the hardware aspects as well. Commercial dedication (the process of assuring that a commercial grade item will perform its intended safety function) has demonstrated benefits in cost savings and a wide base of user experience; however, care must be taken to avoid difficulties with some aspects of the dedication process, such as access to vendor development information, configuration management, long-term support, and system integration.
Development of clinical contents model markup language for electronic health records.
Yun, Ji-Hyun; Ahn, Sun-Ju; Kim, Yoon
2012-09-01
To develop a dedicated markup language for clinical contents models (CCM) to facilitate the active use of CCM in electronic health record systems. Based on analysis of the structure and characteristics of CCM in the clinical domain, we manually designed an extensible markup language (XML)-based CCM markup language (CCML) schema. CCML faithfully reflects CCM in both the syntactic and semantic aspects. As this language is based on XML, it can be expressed and processed in computer systems and can be used in a technology-neutral way. CCML has the following strengths: it is machine-readable and highly human-readable, it does not require a dedicated parser, and it can be applied to existing electronic health record systems.
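The claim that an XML-based language needs no dedicated parser can be illustrated with any standard XML toolchain. The sketch below uses Python's standard library; the element and attribute names are hypothetical, since the actual CCML schema is not reproduced in the abstract.

```python
import xml.etree.ElementTree as ET

# Hypothetical CCML-style fragment; the real CCML element names may differ.
ccml_fragment = """
<clinicalContentModel id="BloodPressure">
  <element name="systolic" type="quantity" unit="mmHg"/>
  <element name="diastolic" type="quantity" unit="mmHg"/>
</clinicalContentModel>
"""

# Any generic XML parser suffices -- no CCML-specific tooling required.
root = ET.fromstring(ccml_fragment)
model_id = root.get("id")
elements = [(e.get("name"), e.get("unit")) for e in root.iter("element")]
print(model_id, elements)
```

Because the document is plain XML, the same fragment could equally be consumed by any XML-capable system, which is the technology-neutrality the abstract emphasizes.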
ERIC Educational Resources Information Center
Weiss, Michael J.
2017-01-01
This is a testimony from Michael Weiss, a senior researcher at Manpower Demonstration Research Corporation (MDRC), a nonprofit, nonpartisan social policy research organization that is dedicated to learning what works to improve policies and programs that affect the poor. Founded in 1974, MDRC evaluates existing programs and develops new solutions…
ERIC Educational Resources Information Center
Rundle-Thiele, Sharyn R.; Wymer, Walter
2010-01-01
This article analyzes the extent to which Australian and New Zealand marketing educators use dedicated or stand-alone courses to equip students with alternative views of business. A census of marketing programs in degree-granting universities was conducted. Program brochures were obtained via the Internet and were content analyzed. This study…
ERIC Educational Resources Information Center
Gates, Susan M.; Hamilton, Laura S.; Martorell, Paco; Burkhauser, Susan; Heaton, Paul; Pierson, Ashley; Baird, Matthew; Vuollo, Mirka; Li, Jennifer J.; Lavery, Diana Catherine; Harvey, Melody; Gu, Kun
2014-01-01
New Leaders is dedicated to promoting student achievement by developing outstanding school leaders to serve in urban schools. RAND Corporation researchers conducted a formative and summative external evaluation of the New Leaders program, its theory of action, and its implementation from 2006 through 2013. This document presents technical…
ERIC Educational Resources Information Center
Tuleja, Elizabeth A.; Greenhalgh, Anne M.
2008-01-01
Educating undergraduate business students in the 21st century requires more than addressing the quantitative side of business; rather, it calls for including the more qualitative "soft skills," such as speaking and writing. This article examines the design, delivery, and effectiveness of an undergraduate program dedicated to leadership,…
California's Early Learning & Development System: A Review of Funding Streams and Programs
ERIC Educational Resources Information Center
Miller, Kate; Perez, Giannina S.
2010-01-01
California's public early learning and development programs and related services are funded through a range of federal, state and local sources. The purpose and scope of these funding streams vary broadly: some sources are dedicated primarily to serving children, birth to age five, and their families, while others can also be utilized for…
ERIC Educational Resources Information Center
Raths, David
2010-01-01
In the tug-of-war between researchers and IT for supercomputing resources, a centralized approach can help both sides get more bang for their buck. As 2010 began, the University of Washington was preparing to launch its first shared high-performance computing cluster, a 1,500-node system called Hyak, dedicated to research activities. Like other…
14 CFR 1214.108 - Termination.
Code of Federal Regulations, 2011 CFR
2011-01-01
... NASA. (1) The termination fee for dedicated flights will be computed as a percentage of the Shuttle... Space Shuttle Flights of Payloads for Non-U.S. Government, Reimbursable Customers § 1214.108 Termination... termination occurs Termination fee, percent of Shuttle standard flight price 18 or more 10 17 or more but less...
14 CFR 1214.108 - Termination.
Code of Federal Regulations, 2013 CFR
2013-01-01
... NASA. (1) The termination fee for dedicated flights will be computed as a percentage of the Shuttle... Space Shuttle Flights of Payloads for Non-U.S. Government, Reimbursable Customers § 1214.108 Termination... termination occurs Termination fee, percent of Shuttle standard flight price 18 or more 10 17 or more but less...
14 CFR 1214.108 - Termination.
Code of Federal Regulations, 2010 CFR
2010-01-01
... NASA. (1) The termination fee for dedicated flights will be computed as a percentage of the Shuttle... Space Shuttle Flights of Payloads for Non-U.S. Government, Reimbursable Customers § 1214.108 Termination... termination occurs Termination fee, percent of Shuttle standard flight price 18 or more 10 17 or more but less...
14 CFR 1214.108 - Termination.
Code of Federal Regulations, 2012 CFR
2012-01-01
... NASA. (1) The termination fee for dedicated flights will be computed as a percentage of the Shuttle... Space Shuttle Flights of Payloads for Non-U.S. Government, Reimbursable Customers § 1214.108 Termination... termination occurs Termination fee, percent of Shuttle standard flight price 18 or more 10 17 or more but less...
14 CFR § 1214.108 - Termination.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Provisions Regarding Space Shuttle Flights of Payloads for Non-U.S. Government, Reimbursable Customers § 1214... standard services to NASA. (1) The termination fee for dedicated flights will be computed as a percentage of the Shuttle standard flight price and will be based on the table below. Months before scheduled...
NASA Technical Reports Server (NTRS)
DeChant, Lawrence Justin
1998-01-01
In spite of rapid advances in both scalar and parallel computational tools, the large number of variables involved in both design and inverse problems makes the use of sophisticated fluid flow models impractical. With this restriction, it is concluded that an important family of methods for mathematical/computational development are reduced or approximate fluid flow models. In this study a combined perturbation/numerical modeling methodology is developed which provides a rigorously derived family of solutions. The mathematical model is computationally more efficient than classical boundary-layer models but provides important two-dimensional information not available using quasi-1-d approaches. An additional strength of the current methodology is its ability to locally predict static pressure fields in a manner analogous to more sophisticated parabolized Navier-Stokes (PNS) formulations. To resolve singular behavior, the model utilizes classical analytical solution techniques. Hence, analytical methods have been combined with efficient numerical methods to yield an efficient hybrid fluid flow model. In particular, the main objective of this research has been to develop a system of analytical and numerical ejector/mixer nozzle models which require minimal empirical input. A computer code, DREA (Differential Reduced Ejector/mixer Analysis), has been developed with the ability to run sufficiently fast that it may be used either as a subroutine or called by a design optimization routine. The models are of direct use to the High Speed Civil Transport Program (a joint government/industry project seeking to develop an economically viable U.S. commercial supersonic transport vehicle) and are currently being adopted by both NASA and industry. Experimental validation of these models is provided by comparison to results obtained from the open literature and Limited Exclusive Right Distribution (LERD) sources, as well as dedicated experiments performed at Texas A&M.
These experiments have been performed using a hydraulic/gas flow analog. Results of comparisons of DREA computations with experimental data, which include entrainment, thrust, and local profile information, are overall good. Computational time studies indicate that DREA provides considerably more information at a lower computational cost than contemporary ejector nozzle design models. Finally, physical limitations of the method, deviations from experimental data, potential improvements, and alternative formulations are described. This report represents closure to the NASA Graduate Researchers Program. Versions of the DREA code and a user's guide may be obtained from the NASA Lewis Research Center.
Comparative Implementation of High Performance Computing for Power System Dynamic Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jin, Shuangshuang; Huang, Zhenyu; Diao, Ruisheng
Dynamic simulation for transient stability assessment is one of the most important, but computationally intensive, tasks in power system planning and operation. Present commercial software is mainly designed for sequential computation to run a single simulation, which is very time consuming on a single processor. The application of High Performance Computing (HPC) to dynamic simulations is very promising for accelerating the computing process by parallelizing its kernel algorithms while maintaining the same level of computational accuracy. This paper describes the comparative implementation of four parallel dynamic simulation schemes in two state-of-the-art HPC environments: Message Passing Interface (MPI) and Open Multi-Processing (OpenMP). These implementations serve to match the application with dedicated multi-processor computing hardware and maximize the utilization and benefits of HPC during the development process.
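The paper's actual implementations use MPI and OpenMP; purely as an illustration of the parallelization idea (independent contingency runs distributed across workers, analogous to MPI ranks), here is a minimal Python sketch with a stand-in solver function:

```python
from multiprocessing import Pool

def simulate_contingency(cid):
    """Stand-in for one transient-stability run; a real solver would
    integrate the machine swing equations for this contingency case."""
    stability_index = (cid * cid + 1) % 97  # dummy deterministic result
    return cid, stability_index

if __name__ == "__main__":
    contingencies = list(range(8))
    # 4 worker processes play the role of 4 MPI ranks / OpenMP threads
    with Pool(4) as pool:
        results = dict(pool.map(simulate_contingency, contingencies))
    print(sorted(results))
```

Because each contingency run is independent, speedup is close to the worker count until per-case runtime variation causes load imbalance, which is one of the trade-offs such comparative studies examine.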
A multiarchitecture parallel-processing development environment
NASA Technical Reports Server (NTRS)
Townsend, Scott; Blech, Richard; Cole, Gary
1993-01-01
A description is given of the hardware and software of a multiprocessor test bed - the second generation Hypercluster system. The Hypercluster architecture consists of a standard hypercube distributed-memory topology, with multiprocessor shared-memory nodes. By using standard, off-the-shelf hardware, the system can be upgraded to use rapidly improving computer technology. The Hypercluster's multiarchitecture nature makes it suitable for researching parallel algorithms in computational field simulation applications (e.g., computational fluid dynamics). The dedicated test-bed environment of the Hypercluster and its custom-built software allows experiments with various parallel-processing concepts such as message passing algorithms, debugging tools, and computational 'steering'. Such research would be difficult, if not impossible, to achieve on shared, commercial systems.
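The hypercube topology mentioned above has a convenient addressing property: node IDs are d-bit integers, and two nodes are directly linked exactly when their IDs differ in one bit. A short sketch (illustrative, not Hypercluster code):

```python
def hypercube_neighbors(node, dim):
    """Neighbors of `node` in a dim-dimensional hypercube: flip each bit."""
    return [node ^ (1 << b) for b in range(dim)]

# Node 0 in a 4-D (16-node) hypercube connects to nodes 1, 2, 4, and 8.
print(hypercube_neighbors(0, 4))  # [1, 2, 4, 8]
```

Message routing can then proceed by correcting one differing address bit per hop, so any two of the 2^d nodes are at most d hops apart.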
Computational and mathematical methods in brain atlasing.
Nowinski, Wieslaw L
2017-12-01
Brain atlases have a wide range of use from education to research to clinical applications. Mathematical methods as well as computational methods and tools play a major role in the process of brain atlas building and developing atlas-based applications. Computational methods and tools cover three areas: dedicated editors for brain model creation, brain navigators supporting multiple platforms, and atlas-assisted specific applications. Mathematical methods in atlas building and developing atlas-aided applications deal with problems in image segmentation, geometric body modelling, physical modelling, atlas-to-scan registration, visualisation, interaction and virtual reality. Here I overview computational and mathematical methods in atlas building and developing atlas-assisted applications, and share my contribution to and experience in this field.
Observing Ben Wyckoff: From Basic Research to Programmed Instruction and Social Issues
Escobar, Rogelio; Lattal, Kennon A
2011-01-01
L. Benjamin Wyckoff's seminal contributions to both psychological theory and application are the subject of this review. Wyckoff started his academic career as a graduate student at Indiana University, where he developed the observing-response procedure under the guidance of B. F. Skinner and C. J. Burke. At the University of Wisconsin–Madison, Wyckoff refined his mathematical theory of secondary reinforcement. This theory was the impetus for his creation of an electronic simulation of a rat running a T maze, one of the first “computer models” of learning. Wyckoff next went to Emory University, leaving there to help create two of the most successful companies dedicated to the advancement of programmed instruction and teaching machines: Teaching Machines, Inc. and the Human Development Institute. Wyckoff's involvement in these companies epitomizes the application of basic behavior-analytic principles in the development of technology to improve education and human relationships. The emergent picture of Wyckoff is that of a man who, through his research, professional work in educational applications of behavioral principles, and active involvement in the civil rights movement of the 1960s, was strongly committed to applying behavioral science to positively influence human behavior change. PMID:22532737
MO-E-18A-01: Imaging: Best Practices In Pediatric Imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willis, C; Strauss, K; MacDougall, R
This imaging educational program will focus on solutions to common pediatric imaging challenges. The speakers will present collective knowledge on best practices in pediatric imaging from their experience at dedicated children's hospitals. Areas of focus will include general radiography, the use of manual and automatic dose management in computed tomography, and enterprise-wide radiation dose management in the pediatric practice. The educational program will begin with a discussion of the complexities of exposure factor control in pediatric projection radiography. Following this introduction will be two lectures addressing the challenges of computed tomography (CT) protocol optimization in the pediatric population. The first will address manual CT protocol design in order to establish a managed radiation dose for any pediatric exam on any CT scanner. The second CT lecture will focus on the intricacies of automatic dose modulation in pediatric imaging, with an emphasis on getting reliable results in algorithm-based technique selection. The fourth and final lecture will address the key elements needed to develop a comprehensive radiation dose management program for the pediatric environment, with particular attention paid to new regulations and obligations of practicing medical physicists. Learning Objectives: To understand how general radiographic techniques can be optimized using exposure indices in order to improve pediatric radiography. To learn how to establish diagnostic dose reference levels for pediatric patients as a function of the type of examination, patient size, and individual design characteristics of the CT scanner. To learn how to predict the patient's radiation dose prior to the exam and manually adjust technique factors if necessary to match the patient's dose to the department's established dose reference levels.
To learn how to utilize manufacturer-provided automatic dose modulation technology to consistently achieve patient doses within the department's established size-based diagnostic reference range. To understand the key components of an enterprise-wide pediatric dose management program that integrates the expanding responsibilities of medical physicists in the new era of dose monitoring.
The November 1, 2017 issue of Cancer Research is dedicated to a collection of computational resource papers in genomics, proteomics, animal models, imaging, and clinical subjects for non-bioinformaticists looking to incorporate computing tools into their work. Scientists at Pacific Northwest National Laboratory have developed P-MartCancer, an open, web-based interactive software tool that enables statistical analyses of peptide or protein data generated from mass-spectrometry (MS)-based global proteomics experiments.
Master/Programmable-Slave Computer
NASA Technical Reports Server (NTRS)
Smaistrla, David; Hall, William A.
1990-01-01
Unique modular computer features compactness, low power, mass storage of data, multiprocessing, and choice of various input/output modes. Master processor communicates with user via usual keyboard and video display terminal. Coordinates operations of as many as 24 slave processors, each dedicated to different experiment. Each slave circuit card includes slave microprocessor and assortment of input/output circuits for communication with external equipment, with master processor, and with other slave processors. Adaptable to industrial process control with selectable degrees of automatic control, automatic and/or manual monitoring, and manual intervention.
1982-02-23
... segregate the computer and storage from the outside world; 2. Administrative security to control access to secure computer facilities; 3. Network security to ... [Figure A-2. Dedicated Switching Architecture Alternative: classification diagram showing NETWORK, KG, GENSER, DSSCS, AMPE, and TERMINALS; TP No. 022-4668-A] ... communications protocol with the network and GENSER message transmission to the I-S/A AMPE processor. 7. DSSCS TPU - Handles communications protocol with ...
Indian National Gas Hydrate Program Expedition 01 report
Collett, Timothy S.; Riedel, M.; Boswell, R.; Presley, J.; Kumar, P.; Sathe, A.; Sethi, A.; Lall, M.V.; ,
2015-01-01
The Indian National Gas Hydrate Program Expedition 01 was designed to study the gas-hydrate occurrences off the Indian Peninsula and along the Andaman convergent margin with special emphasis on understanding the geologic and geochemical controls on the occurrence of gas hydrate in these two diverse settings. During Indian National Gas Hydrate Program Expedition 01, dedicated gas-hydrate coring, drilling, and downhole logging operations were conducted from 28 April 2006 to 19 August 2006.
DOT National Transportation Integrated Search
1998-03-01
Ten years ago the Federal Aviation Administration (FAA) Office of Aviation Medicine embarked on a research and development program dedicated to human factors in aviation maintenance and inspection. Since 1989 FAA has invested over $12M in maintenance...
Steiding, Christian; Kolditz, Daniel; Kalender, Willi A
2014-03-01
Thousands of cone-beam computed tomography (CBCT) scanners for vascular, maxillofacial, neurological, and body imaging are in clinical use today, but there is no consensus on uniform acceptance and constancy testing for image quality (IQ) and dose yet. The authors developed a quality assurance (QA) framework for fully automated and time-efficient performance evaluation of these systems. In addition, the dependence of objective Fourier-based IQ metrics on direction and position in 3D volumes was investigated for CBCT. The authors designed a dedicated QA phantom 10 cm in length consisting of five compartments, each with a diameter of 10 cm, and an optional extension ring 16 cm in diameter. A homogeneous section of water-equivalent material allows measuring CT value accuracy, image noise and uniformity, and multidimensional global and local noise power spectra (NPS). For the quantitative determination of 3D high-contrast spatial resolution, the modulation transfer function (MTF) of centrally and peripherally positioned aluminum spheres was computed from edge profiles. Additional in-plane and axial resolution patterns were used to assess resolution qualitatively. The characterization of low-contrast detectability as well as CT value linearity and artifact behavior was tested by utilizing sections with soft-tissue-equivalent and metallic inserts. For an automated QA procedure, a phantom detection algorithm was implemented. All tests used in the dedicated QA program were initially verified in simulation studies and experimentally confirmed on a clinical dental CBCT system. The automated IQ evaluation of volume data sets of the dental CBCT system was achieved with the proposed phantom requiring only one scan for the determination of all desired parameters. Typically, less than 5 min were needed for phantom set-up, scanning, and data analysis. Quantitative evaluation of system performance over time by comparison to previous examinations was also verified. 
The maximum percentage interscan variation of repeated measurements was less than 4%, and 1.7% on average, for all investigated quality criteria. The NPS-based image noise differed by less than 5% from the conventional standard deviation approach, and spatially selective 10% MTF values were well comparable to subjective results obtained with the 3D resolution patterns. Determining only transverse spatial resolution and global noise behavior in the central field of measurement turned out to be insufficient. The proposed framework transfers QA routines employed in conventional CT, in an advanced version, to CBCT for fully automated and time-efficient evaluation of technical equipment. With the modular phantom design, a routine as well as an expert version for assessing IQ is provided. The QA program can be used for arbitrary CT units to evaluate 3D imaging characteristics automatically.
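The edge-profile MTF computation described above follows a standard pattern: differentiate the edge spread function (ESF) measured across the aluminum sphere's edge to obtain the line spread function (LSF), then take the normalized magnitude of its Fourier transform. A minimal pure-Python sketch of that pipeline (not the authors' implementation, which also handles sampling, windowing, and noise):

```python
import cmath

def mtf_from_edge(esf):
    """Edge-profile MTF: differentiate the edge spread function (ESF) to get
    the line spread function (LSF), then take the normalized DFT magnitude."""
    lsf = [esf[i + 1] - esf[i] for i in range(len(esf) - 1)]
    n = len(lsf)
    # Magnitude of the DFT at each frequency bin (naive O(n^2) transform)
    mags = [abs(sum(lsf[k] * cmath.exp(-2j * cmath.pi * f * k / n)
                    for k in range(n))) for f in range(n)]
    return [m / mags[0] for m in mags]  # normalize so MTF(0) = 1

# A perfectly sharp edge has a delta-function LSF, hence MTF = 1 at all
# frequencies; any blur makes the MTF fall off toward high frequencies.
print(mtf_from_edge([0, 0, 0, 1, 1, 1]))
```

The "10% MTF" figure of merit reported above is then the spatial frequency at which this curve drops to 0.10.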
The Birth of the Cosmic Frontier
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kolb, Rocky; Turner, Mike
Scientists Rocky Kolb and Mike Turner recount the time they first proposed that Fermilab, dedicated to the study of the universe's smallest constituents, expand its program to include the stars, galaxies, and the cosmos.
Pregnancy in the Military: Importance of Psychosocial Health to Birth Outcomes
2016-05-11
Pearson et al., 2013). Studies within the military community are limited. Purpose: Describe findings across a program of research dedicated to the relationship of prenatal maternal psychosocial health to birth outcomes for a military population.
2014-01-16
Quincy Bean, Jim Lydon, and Zack Jones of MSFC's Advanced Manufacturing Team, with MSFC's M2 selective laser melting system. The M2 is currently dedicated to advanced copper material development for the Low Cost Upper Stage program.
Luján, J L; Crago, P E
2004-11-01
Neuroprosthetic systems can be used to restore hand grasp and wrist control in individuals with C5/C6 spinal cord injury. A computer-based system was developed for the implementation, tuning and clinical assessment of neuroprosthetic controllers, using off-the-shelf hardware and software. The computer system turned a Pentium III PC running Windows NT into a non-dedicated, real-time system for the control of neuroprostheses. Software execution (written using the high-level programming languages LabVIEW and MATLAB) was divided into two phases: training and real-time control. During the training phase, the computer system collected input/output data by stimulating the muscles and measuring the muscle outputs in real time, analysed the recorded data, generated a set of training data and trained an artificial neural network (ANN)-based controller. During real-time control, the computer system stimulated the muscles using stimulus pulsewidths predicted by the ANN controller in response to a sampled input from an external command source, to provide independent control of hand grasp and wrist posture. System timing was stable, reliable and capable of providing muscle stimulation at frequencies up to 24 Hz. To demonstrate the application of the test-bed, an ANN-based controller was implemented with three inputs and two independent channels of stimulation. The ANN controller's ability to control hand grasp and wrist angle independently was assessed by quantitative comparison of the outputs of the stimulated muscles with a set of desired grasp or wrist postures determined by the command signal. Controller performance results were mixed, but the platform provided the tools to implement and assess future controller designs.
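The two-phase structure described above (fit a model offline from recorded input/output pairs, then use it online to map commands to stimulus pulsewidths) can be sketched in Python. This is not the authors' LabVIEW/MATLAB ANN; for brevity the model here is a least-squares linear fit, and the training data and clipping limits are hypothetical:

```python
def fit_linear(samples):
    """Training phase: least-squares fit pulsewidth = a*command + b
    from recorded (command, pulsewidth) pairs."""
    n = len(samples)
    sx = sum(c for c, _ in samples)
    sy = sum(p for _, p in samples)
    sxx = sum(c * c for c, _ in samples)
    sxy = sum(c * p for c, p in samples)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def controller(a, b, lo=0.0, hi=250.0):
    """Real-time phase: map a sampled command to a clipped pulsewidth (us)."""
    return lambda cmd: min(hi, max(lo, a * cmd + b))

training = [(0.0, 10.0), (0.5, 110.0), (1.0, 210.0)]  # hypothetical data
a, b = fit_linear(training)
predict = controller(a, b)
print(predict(0.25))  # 60.0 for this perfectly linear training set
```

In the real system, the prediction step runs inside a fixed-rate loop (up to 24 Hz here), so the model evaluation must complete within one stimulation period.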
cudaMap: a GPU accelerated program for gene expression connectivity mapping.
McArt, Darragh G; Bankhead, Peter; Dunne, Philip D; Salto-Tellez, Manuel; Hamilton, Peter; Zhang, Shu-Dong
2013-10-11
Modern cancer research often involves large datasets and the use of sophisticated statistical techniques. Together these add a heavy computational load to the analysis, which is often coupled with issues surrounding data accessibility. Connectivity mapping is an advanced bioinformatic and computational technique dedicated to therapeutics discovery and drug re-purposing around differential gene expression analysis. On a normal desktop PC, it is common for the connectivity mapping task with a single gene signature to take more than 2 h to complete using sscMap, a popular Java application that runs on standard CPUs (Central Processing Units). Here, we describe new software, cudaMap, which has been implemented using CUDA C/C++ to harness the computational power of NVIDIA GPUs (Graphics Processing Units) to greatly reduce processing times for connectivity mapping. cudaMap can identify candidate therapeutics from the same signature in just over thirty seconds when using an NVIDIA Tesla C2050 GPU. Results from the analysis of multiple gene signatures, which would previously have taken several days, can now be obtained in as little as 10 minutes, greatly facilitating candidate therapeutics discovery with high throughput. We are able to demonstrate dramatic speed differentials between GPU-assisted performance and CPU executions as the computational load increases for high-accuracy evaluation of statistical significance. Emerging 'omics' technologies are constantly increasing the volume of data and information to be processed in all areas of biomedical research. Embracing the multicore functionality of GPUs represents a major avenue of local accelerated computing. cudaMap will make a strong contribution in the discovery of candidate therapeutics by enabling speedy execution of heavy-duty connectivity mapping tasks, which are increasingly required in modern cancer research. cudaMap is open source and can be freely downloaded from http://purl.oclc.org/NET/cudaMap.
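The kernel that sscMap and cudaMap parallelize is, at its core, a signed-rank comparison between a query gene signature and each reference expression profile. The Python sketch below is a deliberately simplified connection score, not the published sscMap statistic (which also involves significance estimation over many random signatures, the part that dominates the computational load):

```python
def connection_score(signature, ranked_genes):
    """Simplified connectivity score: each signed query gene contributes its
    centred rank in the reference profile, flipped by the signature sign,
    and the total is normalized to the range [-1, 1]."""
    n = len(ranked_genes)
    rank = {g: i for i, g in enumerate(ranked_genes)}  # 0 = most up-regulated
    centre = (n - 1) / 2.0
    # sign: +1 for up-regulated, -1 for down-regulated in the query signature
    score = sum(sign * (centre - rank[gene]) for gene, sign in signature)
    return score / (centre * len(signature))

# Hypothetical drug profile ranked most-up to most-down, and a query whose
# up-gene sits at the top and down-gene at the bottom (perfect agreement).
profile = ["g1", "g2", "g3", "g4", "g5"]
query = [("g1", +1), ("g5", -1)]
print(connection_score(query, profile))  # 1.0: maximal positive connection
```

Scoring one signature against thousands of profiles, repeated over many random signatures for significance testing, is embarrassingly parallel, which is why the GPU port yields the speedups reported above.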
1999-05-27
NASA Administrator Daniel S. Goldin hands Mrs. Dianne Holliman a plaque honoring her late husband, John Holliman, a CNN national correspondent. Standing behind Goldin is Center Director Roy Bridges. At right is Tom Johnson, news group chairman of CNN. A ceremony dedicated the KSC Press Site auditorium as the John Holliman Auditorium to honor the correspondent for his enthusiastic, dedicated coverage of America's space program. The auditorium was built in 1980 and has been the focal point for news coverage of Space Shuttle launches. The ceremony followed the 94th launch of a Space Shuttle, on mission STS-96, earlier this morning.
Clinical peer mentoring: partnering BSN seniors and sophomores on a dedicated education unit.
Harmer, Bonnie McKay; Huffman, Jaime; Johnson, Barbara
2011-01-01
The authors describe a clinical peer mentoring (CPM) program that partnered 16 pairs of senior (mentors) and sophomore (novices) BSN students to provide patient care on a dedicated education unit at a VA Medical Center. Situated learning theory and Tanner's Clinical Judgment Model provided frameworks for CPM implementation. Survey findings suggested novices and mentors perceived improvements in self-confidence, prioritization, time management, clinical judgment, and evidence-based practice use. Many mentors spontaneously expressed an interest in becoming a preceptor or nurse educator. Copyright © 2011 Wolters Kluwer Health | Lippincott Williams & Wilkins
2014-06-30
VANDENBERG AIR FORCE BASE, Calif. – A memorial plaque honoring Laurie K. Walls is affixed to the umbilical tower on Space Launch Complex 2 at Vandenberg Air Force Base in California for the launch of NASA's Orbiting Carbon Observatory-2, or OCO-2. Walls, a thermal analysis engineer with the Launch Services Program, or LSP, at NASA's Kennedy Space Center, died June 4. This dedication to Walls from the members of the launch team was read during the OCO-2 countdown commentary: "The OCO-2 mission has special meaning to NASA's Launch Services Program as we have dedicated it to one of our LSP Teammates, Laurie Walls. Laurie began her career over 30 years ago as a thermal engineer for McDonnell Douglas in Huntsville, Alabama, supporting NASA's Marshall Space Flight Center. She moved to Florida in 1985. Shortly after coming to Florida, Laurie became a civil servant working on the Shuttle program return to flight effort post-Challenger. In 1998, Laurie joined the newly formed Launch Services Program as one of the founding members of the flight analysis group. She served in LSP as the thermal discipline expert until her untimely death earlier this month. Laurie worked thermal issues for numerous NASA Delta II and Atlas V missions. Additionally, she provided key thermal support for both Delta II Heavy development and Atlas V Certification. Laurie was an integral member of LSP's family and she was truly dedicated to NASA and the LSP team. She will be greatly missed. We honor Laurie with a special memorial placed on the SLC-2 umbilical tower, and we thank ULA for helping to make this happen." Launch of OCO-2 is scheduled for 5:56 a.m. EDT on July 1. To learn more about NASA's Launch Services Program, visit http://www.nasa.gov/centers/kennedy/launchingrockets/index.html. Photo credit: NASA/Randy Beaudoin
Dedication of emergency diesel generators' control air subsystem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrington, M.; Myers, G.; Palumbo, M.
1994-12-31
In the spring of 1993, the need to upgrade Seabrook Station's emergency diesel generators' (EDGs') control air system from nonsafety related to safety related was identified. This need was identified as a result of questions raised by the US Nuclear Regulatory Commission, which was conducting an Electrical Distribution Safety Functional Inspection at Seabrook at that time. The specific reason for the reassignment of safety classification was recognition that failure of the control air supply to the EDGs' jacket cooling water temperature control valves could cause overcooling of the EDGs, which potentially could result in EDG failure during long-term operation. This paper addresses how the installed control air system was upgraded to safety related using Seabrook's Commercial Grade Dedication (CGD) Program and how, by using the dedication skills obtained over the past few years, it was done at minimal cost.
Systematic and deliberate orientation and instruction for dedicated education unit staff.
Smyer, Tish; Tejada, Marianne Bundalian; Tan, Rhigel Alforque
2015-03-01
On the basis of increasing complexity of the health care environment and recommended changes in how nurses are educated to meet these challenges, the University of Nevada Las Vegas, School of Nursing established an academic-practice partnership with Summerlin Hospital Medical Center to develop a dedicated education unit (DEU). When the DEU model was implemented, variables that were not discussed in the literature needed to be addressed. One such challenge was how to impart pedagogy related to clinical teaching to the DEU nursing staff who would be acting as clinical dedicated unit instructors (CDIs). Of chief concern was the evaluation and monitoring of the quality of CDI-student interactions to ensure optimal student learning outcomes. This article addresses the development of a deliberate, systematic approach to the orientation and continued education of CDIs in the DEU. This information will assist other nursing programs as they begin to implement DEUs. Copyright 2015, SLACK Incorporated.
ERIC Educational Resources Information Center
Kees, Michelle; Risk, Brittany; Meadowbrooke, Chrysta; Nellett, Timothy; Spinner, Jane
2017-01-01
Student veterans have been attending college in greater numbers since the passing of the Post-9/11 GI Bill. Although similar to other nontraditional students, student veterans face unique transition challenges that can affect their pursuit of higher education. Many student veterans could benefit from dedicated programs to help them succeed in…
ERIC Educational Resources Information Center
Wolf, Patrick J.
2011-01-01
In 2006 Wisconsin policymakers identified the School Choice Demonstration Project (SCDP) as the organization to help answer lingering questions about the effects of the MPCP [Milwaukee Parental Choice Program]. The SCDP is a national research organization, based in the University of Arkansas' Department of Education Reform, dedicated to the…
ERIC Educational Resources Information Center
Bennett, Troy
2016-01-01
Academic researchers dedicated to contributing to quality improvement in professional practice need to find more effective ways to bridge the gap between research and practice. Bridging the gap requires understanding what encourages the use of research findings by practitioners and what discourages it, as well as understanding how practitioners…
ERIC Educational Resources Information Center
Taylor, Josephine A.; Gomez, Julio Cesar; Quintero, Gloria; Nausa, Ricardo; Rey, Luz Libia
2011-01-01
This study examines a group of approximately 1,100 English as a foreign language students who attended a tutoring program dedicated to training learners in study skills and language learning strategies. The study covers a five-year period of time during which the tutoring program remained consistent in its focus and organization. Students…
Bringing out Children's Wonderful Ideas in Teaching Chinese as a Foreign Language.
ERIC Educational Resources Information Center
Yang, Yi
This paper describes one after-school program at the Cambridge Chinese School, dedicated to teaching Chinese literacy to Chinese K-12 students in the Boston, Massachusetts area. In 1998, the school initiated the "Chinese as a Foreign Language" program to cater to the needs of U.S. families with an interest in the Chinese language and culture…
ERIC Educational Resources Information Center
Grúnová, Markéta; Brandlová, Karolína; Svitálek, Jan; Hejcmanová, Pavla
2017-01-01
Local communities play a key role in the sustainability of any conservation program. We evaluated the impact of an environmental education program for school children in the surroundings of the Delta du Saloum Biosphere reserve (Senegal) dedicated to the conservation of African charismatic fauna with the critically endangered Western Derby eland…
Mentoring by Modem. Connect for Kids: Guidance for Grown-Ups.
ERIC Educational Resources Information Center
Newberger, Julee
This article describes the use of an eMentoring program by the Orphan Foundation of America, a nonprofit organization dedicated to helping young people move out of foster care and into adult life. The eMentoring program, which allows youth and their mentors to communicate via e-mail, helps foster youth prepare for work life by matching them with…
ERIC Educational Resources Information Center
Randolph, Gayle C., II; McCarthy, Karen V.
Families whose primary or sole means of financial support is derived from the welfare system are attempting to meet immediate survival needs in the same manner as families outside of the system. Project Self-Sufficiency is a program which dedicates time to building trusting relationships based on mutual respect and the belief that, with support,…
ERIC Educational Resources Information Center
Koffer Miller, Kaitlin H.; Mathew, Mary; Nonnemacher, Stacy L.; Shea, Lindsay L.
2018-01-01
A growing number of individuals with autism spectrum disorder are aging into adulthood. In the United States, Medicaid is the primary payer for services for adults with autism spectrum disorder, yet there are few funded programs that provide dedicated supports to this population. This study examined the experiences of adults with autism spectrum…
ERIC Educational Resources Information Center
Smith, Laura K.
In the year 2000, news and entertainment programs dedicated a great deal of comedic attention to the presidential election. Taking a Uses and Gratifications approach, this paper examines the role of comedy among the young electorate (undergraduate students at a Texas university). It concludes comedic programs, while popular, are among many sources…
A Needs Assessment for a Longitudinal Emergency Medicine Intern Curriculum
Shappell, Eric; Ahn, James
2017-01-01
Introduction: A key task of emergency medicine (EM) training programs is to develop a consistent knowledge of core content in recruits with heterogeneous training backgrounds. The traditional model for delivering core content is lecture-based weekly conference; however, a growing body of literature finds this format less effective and less appealing than alternatives. We sought to address this challenge by conducting a needs assessment for a longitudinal intern curriculum for millennial learners. Methods: We surveyed all residents from the six EM programs in the greater Chicago area regarding the concept, format, and scope of a longitudinal intern curriculum. Results: We received 153 responses from the 300 residents surveyed (51% response rate). The majority of respondents (80%; 82% of interns) agreed or strongly agreed that a dedicated intern curriculum would add value to residency education. The most positively rated teaching method was simulation sessions (91% positive responses), followed by dedicated weekly conference time (75% positive responses) and dedicated asynchronous resources (71% positive responses). Less than half of respondents (47%; 26% of interns) supported use of textbook readings in the curriculum. Conclusion: There is strong learner interest in a longitudinal intern curriculum. This needs assessment can serve to inform the development of a universal intern curriculum targeting the millennial generation. PMID:28116005
2000-08-01
The International Medical Informatics Association (IMIA) agreed on international recommendations in health informatics/medical informatics education. These should help to establish courses, course tracks or even complete programs in this field, to further develop existing educational activities in the various nations and to support international initiatives concerning education in health and medical informatics (HMI), particularly international activities in educating HMI specialists and the sharing of courseware. The IMIA recommendations centre on educational needs for healthcare professionals to acquire knowledge and skills in information processing and information and communication technology. The educational needs are described as a three-dimensional framework. The dimensions are: 1) professionals in healthcare (physicians, nurses, HMI professionals, ...), 2) type of specialisation in health and medical informatics (IT users, HMI specialists) and 3) stage of career progression (bachelor, master, ...). Learning outcomes are defined in terms of knowledge and practical skills for healthcare professionals in their role (a) as IT user and (b) as HMI specialist. Recommendations are given for courses/course tracks in HMI as part of educational programs in medicine, nursing, healthcare management, dentistry, pharmacy, public health, health record administration, and informatics/computer science as well as for dedicated programs in HMI (with bachelor, master or doctor degree). To support education in HMI, IMIA offers to award a certificate for high quality HMI education and supports information exchange on programs and courses in HMI through a WWW server of its Working Group on Health and Medical Informatics Education (http://www.imia.org/wg1).
Wiseman, James E; Ituarte, Philip H G; Ro, Kevin; Pasternak, Jesse D; Quach, Chi A; Tillou, Areti K; Hines, O Joe; Hiatt, Jonathan R; Yeh, Michael W
2012-06-01
The endocrine surgery program was established at the University of California, Los Angeles, in 2006 to enhance the educational experience of surgical residents in this area. The impact of this program on subjective and objective measures of resident education was prospectively tracked. Resident case logs, American Board of Surgery In-Training Examination scores, self-assessment surveys, and annual rotation evaluations from July 2005 to June 2009 were reviewed. The mean number of endocrine cases reported by graduates doubled during the study period (from 18 to 36, P < .001). Self-assessment scores increased for thyroid (from 4.53 to 5.76, P = .04) and parathyroid (from 4.46 to 5.90, P = .03) disorders. The mean rating for the endocrine rotation (from 3.23 to 3.95, P = .005) improved, with specific increases in the quantity (from 3.05 to 3.74, P = .02) and quality (from 3.25 to 3.95, P = .002) of operative experience. Since 2006, trainees have coauthored 17 peer-reviewed reports and 3 textbook chapters on endocrine topics. The establishment of a dedicated endocrine surgery program has a measurable impact on resident education within this core content area. Copyright © 2012. Published by Elsevier Inc.
Final Technical Report - DE-EE0003542
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haley, James D
Wind has provided energy for thousands of years: some of the earliest windmill engineering designs date back to ancient Babylonia and India, where wind was used as a source of irrigation power. Today, wind is the fastest-growing resource in America's expanding energy infrastructure. However, to continue to positively diversify America's energy portfolio and further reduce the country's reliance on foreign oil, the industry must grow substantially over the next two decades in both turbine installations and the skilled industrial manpower to support them. The wind sector is still an emergent industry requiring maturation and development of its labor force: dedicated training is needed to provide the hard and soft skills to support the increasingly complex wind turbine generators as the technology evolves. Furthermore, the American workforce is facing a steep decline in available labor resources as the baby boomer generation enters retirement age. It is therefore vital that a process is quickly created for supporting the next generation of wind technicians. However, the manpower growth must incorporate three key components. First, the safety and technical training curriculum must be standardized across the industry - current wind educational programs are disparate, and dedicated standardization programs must be further refined and implemented. Second, it is essential that the wind sector avoid disrupting other energy production industries by cannibalizing workers, which would indirectly affect the rest of America's energy portfolio. The future wind workforce must be created organically, utilizing either young people entering the workforce or trained personnel emerging from careers outside of energy production. Third, the training must be quick and efficient, as large numbers of wind turbines are being erected each year and this growth is expected to continue until at least 2035.
One source that matches these three requirements is personnel transitioning from military service to the civilian sector. Utilizing the labor pool of transitioning military personnel and a dedicated training program specifically tailored to military hard and soft skills, the wind workforce can rapidly expand with highly skilled personnel. A tailored training program also provides career opportunities to an underutilized labor force as the personnel return from active military duty. This project's goal was to create a Wind Workforce Development Program that streamlines the wind technician training process using industry-leading safety programs and building on existing military experience. The approach used was to gather data from the wind industry, develop the curriculum and test the process to ensure it provides adequate training to equip the technicians as they transition from the military into wind. The platform for the curriculum development is called Personal Qualification Standards (PQS), which is based on the program of the same name from the United States Navy. Not only does the program provide multiple delivery methods of training (including classroom, computer-based training and on-the-job training), but it also is a familiar style of training to many military men and women. By incorporating a familiar method of training, it encourages active participation in the training and reduces the time for personnel to grasp the concept and flow of the training requirements. The program was tested for thoroughness, schedule and efficacy using a 5-person pilot phase during the last two years. The results of the training were a reduction in time to complete training and increased customer satisfaction on client project sites. However, obstacles surfaced that required adaptation throughout the project, including method of delivery, curriculum development and project schedules; these are discussed in detail throughout the report.
There are several key recommendations in the report that discuss additional training infrastructure, scalability within additional alternative energy markets and organizational certification through standardization committees.
Volunteer Clouds and Citizen Cyberscience for LHC Physics
NASA Astrophysics Data System (ADS)
Aguado Sanchez, Carlos; Blomer, Jakob; Buncic, Predrag; Chen, Gang; Ellis, John; Garcia Quintas, David; Harutyunyan, Artem; Grey, Francois; Lombrana Gonzalez, Daniel; Marquina, Miguel; Mato, Pere; Rantala, Jarno; Schulz, Holger; Segal, Ben; Sharma, Archana; Skands, Peter; Weir, David; Wu, Jie; Wu, Wenjing; Yadav, Rohit
2011-12-01
Computing for the LHC, and for HEP more generally, is traditionally viewed as requiring specialized infrastructure and software environments, and therefore not compatible with the recent trend in "volunteer computing", where volunteers supply free processing time on ordinary PCs and laptops via standard Internet connections. In this paper, we demonstrate that with the use of virtual machine technology, at least some standard LHC computing tasks can be tackled with volunteer computing resources. Specifically, by presenting volunteer computing resources to HEP scientists as a "volunteer cloud", essentially identical to a Grid or dedicated cluster from a job submission perspective, LHC simulations can be processed effectively. This article outlines both the technical steps required for such a solution and the implications for LHC computing as well as for LHC public outreach and for participation by scientists from developing regions in LHC research.
Nuclear Safety via Commercial Grade Dedication - Hitting the Right Target - 12163
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kindred, Greg
2012-07-01
S.A.Technology has developed and implemented a highly effective Commercial Grade Dedication program that has been used to qualify a variety of equipment to the rigorous requirements of ASME NQA-1. In specific cases, S.A.Technology personnel have worked closely with clients to develop complex Commercial Grade Dedication plans that satisfy the scrutiny of the US Department of Energy. These projects have been as simple as passive mechanical systems, and as complicated as active mechanical and electrical systems. S.A.Technology's Commercial Grade Dedication plans have even been used as presentation materials to a client's internal departments encompassing Engineering, Quality and Procurement. This is the new target of today's CGD: exposing the reasoning behind the dedication process. Previously, only test and inspection results were expected. Today's CGD now needs to show how the decisions presented are the right decisions to make. We must be willing to undergo the process of learning how each new piece of equipment is affected by the system it is placed into, as well as understanding how that equipment can affect the system itself. It is a much more complicated and time-consuming endeavor to undertake. On top of it all, we must be able to voice those discoveries and rationalizations in a clear and concise manner. Unless we effectively communicate our intentions to the reader, we will not be understood. If researched correctly and presented properly, today's Commercial Grade Dedication plans will answer the appropriate questions before they are asked. (authors)
Visualization and processing of computed solid-state NMR parameters: MagresView and MagresPython.
Sturniolo, Simone; Green, Timothy F G; Hanson, Robert M; Zilka, Miri; Refson, Keith; Hodgkinson, Paul; Brown, Steven P; Yates, Jonathan R
2016-09-01
We introduce two open source tools to aid the processing and visualisation of ab-initio computed solid-state NMR parameters. Both tools support the Magres file format for computed NMR parameters (as implemented in CASTEP v8.0 and QuantumEspresso v5.0.0). MagresView is built upon the widely used Jmol crystal viewer and provides an intuitive environment to display computed NMR parameters. It can provide simple pictorial representations of one- and two-dimensional NMR spectra as well as output a selected spin system for exact simulations with dedicated spin-dynamics software. MagresPython provides a simple scripting environment to manipulate large numbers of computed NMR parameters to search for structural correlations. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Massport Air Emission Reduction Efforts and Community Enhancement Projects
This page describes efforts at Massport to reduce their emissions, including their Clean Truck Replacement program at Conley Terminal, their rubber tire gantry crane repower, idle and truck trip reductions, park creation, and dedicated freight corridor.
Semiannual Report: Oct 1, 2010 - Mar 31, 2011
Semiannual Report #EPA-350-R-11-005, May, 2011. The dedicated staff of the OIG will continue to do its best to ensure that Agency programs achieve their intended results and that its funds are properly expended.
U.S. intelligence system: model for corporate chiefs?
Gilad, B
1991-01-01
A fully dedicated intelligence support function for senior management is no longer a luxury but a necessity. Companies can enhance their intelligence capabilities by using the government model as a rough blueprint to structure such a program.
Implementation of transportation asset management in Grandview, Missouri : final report.
DOT National Transportation Integrated Search
2017-02-01
The successful implementation of transportation asset management (TAM) by local governments facilitates the optimization of limited resources. The use of a data-driven TAM program helps to identify and prioritize needs, identify and dedicate resource...
Information granules in image histogram analysis.
Wieclawek, Wojciech
2018-04-01
A concept of granular computing employed in intensity-based image enhancement is discussed. First, a weighted granular computing idea is introduced. Then, the implementation of this term in the image processing area is presented. Finally, multidimensional granular histogram analysis is introduced. The proposed approach is dedicated to digital images, especially to medical images acquired by Computed Tomography (CT). As the histogram equalization approach, this method is based on image histogram analysis. Yet, unlike the histogram equalization technique, it works on a selected range of the pixel intensity and is controlled by two parameters. Performance is tested on anonymous clinical CT series. Copyright © 2017 Elsevier Ltd. All rights reserved.
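The range-restricted enhancement described above can be illustrated with a short sketch. This is a generic illustration of histogram equalization confined to a selected intensity band, not the authors' weighted-granule algorithm: the two control parameters are assumed here to be the bounds `lo` and `hi` of the selected range, and pixels outside that range pass through unchanged.

```python
def equalize_range(pixels, lo, hi, levels=256):
    """Histogram-equalize only the intensities in [lo, hi].

    Generic sketch (assumes 0 <= lo <= hi < levels); out-of-range
    pixels are left untouched, mirroring enhancement restricted to a
    selected slice of the histogram.
    """
    in_range = [p for p in pixels if lo <= p <= hi]
    if not in_range:
        return list(pixels)  # nothing falls in the selected band

    # Histogram restricted to the selected intensity band.
    hist = [0] * levels
    for p in in_range:
        hist[p] += 1

    # Cumulative distribution over all levels.
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total)

    n = len(in_range)
    cdf_min = min(cdf[p] for p in set(in_range))

    out = []
    for p in pixels:
        if lo <= p <= hi:
            # Stretch the band's CDF back onto [lo, hi].
            out.append(lo + round((hi - lo) * (cdf[p] - cdf_min)
                                  / max(n - cdf_min, 1)))
        else:
            out.append(p)
    return out
```

For example, with `lo=0, hi=100` the two in-band intensities 5 and 6 are spread to the ends of the band while the out-of-band value 200 is untouched: `equalize_range([5, 5, 6, 200], 0, 100)` yields `[0, 0, 100, 200]`.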
Instrument Systems Analysis and Verification Facility (ISAVF) users guide
NASA Technical Reports Server (NTRS)
Davis, J. F.; Thomason, J. O.; Wolfgang, J. L.
1985-01-01
The ISAVF facility is primarily an interconnected system of computers, special-purpose real-time hardware, and associated generalized software systems, which will permit Instrument System Analysts, Design Engineers, and Instrument Scientists to perform trade-off studies, specification development, instrument modeling, and verification of instrument hardware performance. It is not the intent of the ISAVF to duplicate or replace existing special-purpose facilities such as the Code 710 Optical Laboratories or the Code 750 Test and Evaluation facilities. The ISAVF will provide data acquisition and control services for these facilities, as needed, using remote computer stations attached to the main ISAVF computers via dedicated communication lines.
Operational plans for life science payloads - From experiment selection through postflight reporting
NASA Technical Reports Server (NTRS)
Mccollum, G. W.; Nelson, W. G.; Wells, G. W.
1976-01-01
Key features of operational plans developed in a study of the Space Shuttle era life science payloads program are presented. The data describes the overall acquisition, staging, and integration of payload elements, as well as program implementation methods and mission support requirements. Five configurations were selected as representative payloads: (a) carry-on laboratories - medical emphasis experiments, (b) mini-laboratories - medical/biology experiments, (c) seven-day dedicated laboratories - medical/biology experiments, (d) 30-day dedicated laboratories - Regenerative Life Support Evaluation (RLSE) with selected life science experiments, and (e) Biomedical Experiments Scientific Satellite (BESS) - extended duration primate (Type I) and small vertebrate (Type II) missions. The recommended operational methods described in the paper are compared to the fundamental data which has been developed in the life science Spacelab Mission Simulation (SMS) test series. Areas assessed include crew training, experiment development and integration, testing, data-dissemination, organization interfaces, and principal investigator working relationships.
2003-10-15
KENNEDY SPACE CENTER, FLA. - Before the start of the kickoff presentation for Spaceport Super Safety and Health Day, Center Director Jim Kennedy (left) chats with guest speaker Capt. Charles Plumb (USNR retired) and United Space Alliance Vice President and Deputy Program Manager, Florida Operations, Bill Pickavance. Spaceport Super Safety and Health Day is an annual event at KSC and Cape Canaveral Air Force Station dedicated to reinforcing safe and healthful behaviors in the workforce. Safety Awards were also given to individuals and groups.
Dedicated education unit: implementing an innovation in replication sites.
Moscato, Susan R; Nishioka, Vicki M; Coe, Michael T
2013-05-01
An important measure of an innovation is the ease of replication and achievement of the same positive outcomes. The dedicated education unit (DEU) clinical education model uses a collaborative academic-service partnership to develop an optimal learning environment for students. The University of Portland adapted this model from Flinders University, Australia, to increase the teaching capacity and quality of nursing education. This article identifies DEU implementation essentials and reports on the outcomes of two replication sites that received consultation support from the University of Portland. Program operation information, including education requirements for clinician instructors, types of patient care units, and clinical faculty-to-student ratios is presented. Case studies of the three programs suggest the DEU model is adaptable to a range of different clinical settings and continues to show promise as one strategy for addressing the nurse faculty shortage and strengthening academic-clinical collaborations while maintaining quality clinical education for students. Copyright 2013, SLACK Incorporated.
NASA Technical Reports Server (NTRS)
Hodge, Kenneth E. (Compiler); Kellogg, Yvonne (Editor)
1996-01-01
A technical symposium, aircraft display dedication, and pilots' panel discussion were held on May 27, 1992, to commemorate the 20th anniversary of the first flights of the F-8 Digital Fly-By-Wire (DFBW) and Supercritical Wing (SCW) research aircraft. The symposium featured technical presentations by former key government and industry participants in the advocacy, design, aircraft modification, and flight research program activities. The DFBW and SCW technical contributions are cited. A dedication ceremony marked permanent display of both program aircraft. The panel discussion participants included eight of the eighteen research and test pilots who flew these experimental aircraft. Pilots' remarks include descriptions of their most memorable flight experiences. The report also includes a survey of the Gulf Air War, an after-dinner presentation by noted aerospace author and historian Dr. Richard Hallion.
Canale, Maria Laura; Camerini, Andrea; Magnacca, Massimo; Del Meglio, Jacopo; Lilli, Alessio; Donati, Sara; Belli, Lucia; Lencioni, Stefania; Amoroso, Domenico; Casolo, Giancarlo
2017-11-01
The burden of cardiac side effects in oncology patients will dramatically increase in the near future as a result of the widespread use of anticancer agents affecting the cardiovascular system, the aging of the general population, the heightened attention to the detection of cardiac toxicity, and the absolute gain in terms of overall survival. The relationship between cardiologists and oncologists should therefore be closer, leading to the definition of cardio-oncology. The increased number of such patients requires the creation of a dedicated patient assistance program in order to guarantee every patient the possibility of an interdisciplinary and multiprofessional approach. A dedicated care pathway needs a reorganization of internal resources to ensure high standards of care. The proposed pathway is currently active at our institution and has been implemented taking into account available facilities and planned workload. Our cardio-oncology program could be adapted with minimal changes to different hospitals.
CIRIR Programs: Drilling and Research Opportunities at the Rochechouart Impact Structure
NASA Technical Reports Server (NTRS)
Lambert, P.; Alwmark, C.; Baratoux, D.; Brack, A.; Bruneton, P.; Buchner, E.; Claeys, P.; Dence, M.; French, B.; Hoerz, F
2017-01-01
Owing to its size, accessibility and erosional level, the Rochechouart impact structure, dated at 203 +/- 2 Ma (recalc.), is a unique reservoir of knowledge within the population of the rare terrestrial analogues of the large impact craters observed on planetary surfaces. The site gives direct access to fundamental mechanisms both in impact-related geology (origin and evolution of planets) and biology (habitability of planets, emergence and evolution of life). For the last decade P. Lambert has been establishing Rochechouart as an International Natural Laboratory for studying impact processes and collateral effects on planetary surfaces. For this purpose the Center for International Research on Impacts and on Rochechouart (CIRIR) was installed on site in 2016 with twofold objectives and activities. The first are scientific and dedicated to the scientific community; the second are cultural and educational and are dedicated to the public sensu lato. We present here the CIRIR, its scientific programs and the related research opportunities.
Themba-Nixon, Makani; Sutton, Charyn D; Shorty, Lawrence; Lew, Rod; Baezconde-Garbanati, Lourdes
2004-07-01
This article examines state Master Settlement Agreement (MSA) funding of tobacco control in communities of color. The primary research question was whether MSA monies resulted in dedicated funding for communities of color at the state level. This article also explores some of the historical factors that shape the relationship of communities of color to MSA funding as well as some of the institutional barriers to implementing comprehensive tobacco control programs in these communities. Three model approaches to funding parity in tobacco control programs were examined as case studies. Because of the limited amount of research available in this area, the data on tobacco control funding for communities of color was collected in interviews with state tobacco control agencies during October 2003. Findings supported our hypothesis that there were few dedicated resources at the state level for tobacco control and prevention in communities of color.
Astronomers Without Borders: An IYA2009 Organizational Node Dedicated to Connecting Groups Worldwide
NASA Astrophysics Data System (ADS)
Simmons, M.
2008-11-01
Astronomers Without Borders (AWB) is a new global organization and IYA2009 Organizational Node dedicated to furthering understanding and goodwill across national and cultural boundaries using the universal appeal of astronomy. The AWB network of Affiliates will bring together up to 1000 astronomy clubs, magazines and other organizations involved in astronomy. IYA2009 projects include The World at Night, a Special IYA2009 Project, and coordination of the 100 Hours of Astronomy Global Cornerstone Project. Sharing Telescopes and Resources (STAR) gathers surplus and new equipment in developed countries and donates them to clubs in undeveloped countries, with follow-up programs meant to ensure the best use of the equipment. The AWB website will serve as the basis for all programs including forums, galleries, video conferences and other relationship-building activities. AWB will continue and grow for many years beyond the end of IYA2009.
Cost Effective Computer-Assisted Legal Research, or When Two Are Better Than One.
ERIC Educational Resources Information Center
Griffith, Cary
1986-01-01
An analysis of pricing policies and costs of LEXIS and WESTLAW indicates that it is less expensive to subscribe to both using a PC microcomputer rather than a dedicated terminal. Rules for when to use each database are essential to lowering the costs of online legal research. (EM)
Going Paperless: How One School Board Made the Move to Electronic Agendas.
ERIC Educational Resources Information Center
Mills, Nancy V.
2000-01-01
An effort to improve communications between school board members and the superintendent and administrators of the Katy (Texas) Independent School District has evolved into electronic board agendas and paperless board meetings. Installation of laptop computers, printers, fax machines, and dedicated phone lines in board members' homes was key. (MLH)
2005-12-01
Data are collected via on-board instrumentation on a VxWorks-based computer. Each instrument produces a continuous time-history record of up to 250...data in multidimensional hierarchies and views. UGC 2005. Institute a high-performance data warehouse: PostgreSQL 7.4 installed on a dedicated filesystem
OCLC Research: 2012 Activity Report
ERIC Educational Resources Information Center
OCLC Online Computer Library Center, Inc., 2013
2013-01-01
The mission of the Online Computer Library Center (OCLC) Research is to expand knowledge that advances OCLC's public purposes of furthering access to the world's information and reducing library costs. OCLC Research is dedicated to three roles: (1)To act as a community resource for shared research and development (R&D); (2) To provide advanced…
Artese, Anna; Alcaro, Stefano; Moraca, Federica; Reina, Rocco; Ventura, Marzia; Costantino, Gabriele; Beccari, Andrea R; Ortuso, Francesco
2013-05-01
During the first edition of the Computationally Driven Drug Discovery meeting, held in November 2011 at Dompé Pharma (L'Aquila, Italy), a questionnaire on the diffusion and use of computational tools for drug-design purposes in both academia and industry was distributed among all participants. This is a follow-up of a previously reported investigation carried out among a few companies in 2007. The new questionnaire comprised five sections dedicated to: research group identification and classification; 18 different computational techniques; software information; hardware data; and economic and business considerations. In this article, together with a detailed history of the different computational methods, we report a statistical analysis of the survey results that identifies the prevalent computational techniques adopted in drug-design projects, and we sketch a profile of the computational medicinal chemist currently working in academia and in pharmaceutical companies in Italy.
Kim, Tane; Hao, Weilong
2014-09-27
The study of discrete characters is crucial for understanding evolutionary processes. Even though great advances have been made in the analysis of nucleotide sequences, computer programs for non-DNA discrete characters are often dedicated to specific analyses and lack flexibility. Discrete characters often have different transition rate matrices, variable rates among sites, and sometimes contain unobservable states. To estimate a variety of discrete characters accurately, programs with sophisticated methodologies and flexible settings are needed. DiscML performs maximum likelihood estimation of evolutionary rates of discrete characters on a provided phylogeny, with options to correct for unobservable data, rate variation among sites, and unknown prior root probabilities estimated from the empirical data. It lets users customize the instantaneous transition rate matrices, or choose pre-determined matrices from models such as birth-and-death (BD), birth-death-and-innovation (BDI), equal rates (ER), symmetric (SYM), general time-reversible (GTR) and all rates different (ARD). Moreover, we show application examples of DiscML on gene family data and on intron presence/absence data. DiscML was developed as a unified R program for estimating evolutionary rates of discrete characters, with no restriction on the number of character states and with the flexibility to use different transition models. DiscML is ideal for analyses of binary (1/0) patterns, multi-gene families, and multistate discrete morphological characteristics.
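The rate-matrix machinery described above can be illustrated with a minimal pure-Python sketch (not DiscML's actual R interface): a two-state equal-rates (ER) model, whose matrix exponential has a simple closed form, plugged into Felsenstein pruning for one site on a hypothetical two-leaf tree.

```python
import math

def er_transition_probs(rate, t):
    """Transition probabilities for a 2-state equal-rates (ER) model.

    Closed form of exp(Q*t) for Q = [[-r, r], [r, -r]]:
    P(same state) = 0.5 + 0.5*exp(-2*r*t).
    """
    same = 0.5 + 0.5 * math.exp(-2.0 * rate * t)
    diff = 0.5 - 0.5 * math.exp(-2.0 * rate * t)
    return [[same, diff], [diff, same]]

def site_likelihood(rate, t1, t2, state1, state2, root_prior=(0.5, 0.5)):
    """Likelihood of one observed site pattern on a two-leaf tree,
    summing over the unobserved root state (Felsenstein pruning)."""
    p1 = er_transition_probs(rate, t1)
    p2 = er_transition_probs(rate, t2)
    total = 0.0
    for root_state in (0, 1):
        total += (root_prior[root_state]
                  * p1[root_state][state1]
                  * p2[root_state][state2])
    return total
```

Maximizing this likelihood over `rate` across many sites is the core of what programs like DiscML do, with richer matrices (GTR, ARD, etc.) replacing the closed form above.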
Smyser, Christopher D; Tam, Emily W Y; Chang, Taeun; Soul, Janet S; Miller, Steven P; Glass, Hannah C
2016-10-01
Neonatal neurocritical care is a growing and rapidly evolving medical subspecialty, with increasing numbers of dedicated multidisciplinary clinical, educational, and research programs established at academic institutions. The growth of these programs has provided trainees in neurology, neonatology, and pediatrics with increased exposure to the field, sparking interest in dedicated fellowship training in fetal-neonatal neurology. To meet this rising demand, increasing numbers of training programs are being established to provide trainees with the requisite knowledge and skills to independently deliver care for infants with neurological injury or impairment from the fetal care center and neonatal intensive care unit to the outpatient clinic. This article provides an initial framework for standardization of training across these programs. Recommendations include goals and objectives for training in the field; core areas where clinical competency must be demonstrated; training activities and neuroimaging and neurodiagnostic modalities which require proficiency; and programmatic requirements necessary to support a comprehensive and well-rounded training program. With consistent implementation, the proposed model has the potential to establish recognized standards of professional excellence for training in the field, provide a pathway toward Accreditation Council for Graduate Medical Education certification for program graduates, and lead to continued improvements in medical and neurological care provided to patients in the neonatal intensive care unit.
NASA Astrophysics Data System (ADS)
Lubin, Pierre; Vincent, Stéphane; Caltagirone, Jean-Paul
2005-04-01
The scope of this Note is to present results obtained by simulating the two-dimensional head-on collision of two solitary waves, solving the Navier-Stokes equations in both air and water. The work is dedicated to the numerical investigation of the hydrodynamics associated with this highly nonlinear flow configuration, and the first numerical results are analyzed. The original numerical model proves efficient and accurate in predicting the main features described in experiments found in the literature. This Note also highlights the value of this configuration as a test case for numerical models dedicated to computational fluid mechanics. To cite this article: P. Lubin et al., C. R. Mecanique 333 (2005).
NASA Technical Reports Server (NTRS)
Bremmer, D. A.
1986-01-01
The feasibility of some off-the-shelf microprocessors and state-of-the-art software is assessed (1) as a development system for the principal investigator (PI) in the design of the experiment model, (2) as an example of available technology applicable to future PIs' experiments, (3) as a system capable of interacting with the PCTC's simulation of the dedicated experiment processor (DEP), preferably by bringing the PI's DEP software directly into the simulation model, (4) as a system having bus compatibility with host VAX simulation computers, (5) as a system readily interfaced with mock-up panels and information displays, and (6) as a functional system for post-mission data analysis.
Development of Clinical Contents Model Markup Language for Electronic Health Records
Yun, Ji-Hyun; Kim, Yoon
2012-01-01
Objectives To develop a dedicated markup language for clinical contents models (CCM) to facilitate the active use of CCM in electronic health record systems. Methods Based on an analysis of the structure and characteristics of CCM in the clinical domain, we manually designed an extensible markup language (XML) based CCM markup language (CCML) schema. Results CCML faithfully reflects CCM in both syntactic and semantic aspects. Because the language is based on XML, it can be expressed and processed by computer systems and used in a technology-neutral way. Conclusions CCML has the following strengths: it is both machine-readable and highly human-readable, it does not require a dedicated parser, and it can be applied to existing electronic health record systems. PMID:23115739
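As a rough illustration of the "no dedicated parser" claim, a CCM-like document can be handled with a stock XML parser. The element names below are invented for the example and are not the actual CCML schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical CCM fragment; element names are illustrative only,
# not the published CCML schema.
SAMPLE = """
<clinicalContentModel id="ccm-bp-001">
  <name>Blood Pressure</name>
  <element code="systolic" unit="mmHg" type="quantity"/>
  <element code="diastolic" unit="mmHg" type="quantity"/>
</clinicalContentModel>
"""

def summarize(xml_text):
    """Return (model name, list of element codes) from a CCM-like document,
    using only the standard-library XML parser."""
    root = ET.fromstring(xml_text)
    name = root.findtext("name")
    codes = [e.get("code") for e in root.findall("element")]
    return name, codes
```

Any XML-capable system can process such documents the same way, which is what makes the format technology-neutral.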
Anderson, Allison C; Mackey, Tim K; Attaran, Amir; Liang, Bryan A
2016-01-01
Illicit online pharmacies are a growing global public health concern. Stakeholders have started to engage in health promotion activities to educate the public, yet the scope and impact of these activities have not been examined. We wished to identify health promotion activities focused on consumer awareness regarding the risks of illicit online pharmacies. Organizations engaged on the issue were first identified using a set of engagement criteria. We then reviewed these organizations for health promotion programs, educational components, public service announcements, and social media engagement. Our review identified 13 organizations across a wide spectrum of stakeholders. Of these organizations, 69.2% (n = 9) had at least one type of health promotion activity targeting consumers. Although the vast majority of these organizations were active on Facebook or Twitter, many did not have dedicated content regarding online pharmacies (Facebook: 45.5%, Twitter: 58.3%). An online survey administered to 6 respondents employed by organizations identified in this study found that all organizations had dedicated programs on the issue, but only half had media planning strategies in place to measure the effectiveness of their programs. Overall, our results indicate that though some organizations are actively engaged on the issue, communication and education initiatives have had questionable effectiveness in reaching the public. We note that only a few organizations offered comprehensive and dedicated content to raise awareness on the issue and were effective in social media communications. In response, more robust collaborative efforts between stakeholders are needed to educate and protect the consumer about this public health and patient safety danger.
Life Cycle Analysis of Dedicated Nano-Launch Technologies
NASA Technical Reports Server (NTRS)
Zapata, Edgar; McCleskey, Carey; Martin, John; Lepsch, Roger; Hernani, Tosoc
2014-01-01
Recent technology advancements have enabled the development of small, cheap satellites that can perform useful functions in the space environment. Currently, the only low cost option for getting these payloads into orbit is through ride share programs. As a result, these launch opportunities await primary payload launches and a backlog exists. An alternative option would be dedicated nano-launch systems built and operated to provide more flexible launch services, higher availability, and affordable prices. The potential customer base that would drive requirements or support a business case includes commercial, academia, civil government and defense. Further, NASA technology investments could enable these alternative game-changing options. With this context, in 2013 the Game Changing Development (GCD) program funded a NASA team to investigate the feasibility of dedicated nano-satellite launch systems with a recurring cost of less than $2 million per launch for a 5 kg payload to low Earth orbit. The team products would include potential concepts, technologies and factors for enabling the ambitious cost goal, exploring the nature of the goal itself, and informing the GCD program technology investment decision making process. This paper provides an overview of the life cycle analysis effort that was conducted in 2013 by an inter-center NASA team. This effort included the development of reference nano-launch system concepts, developing analysis processes and models, establishing a basis for cost estimates (development, manufacturing and launch) suitable to the scale of the systems, and especially, understanding the relationship of potential game-changing technologies to life cycle costs, as well as other factors, such as flights per year.
DOE Office of Scientific and Technical Information (OSTI.GOV)
The system is developed to collect, process, store and present the information provided by radio frequency identification (RFID) devices. The system contains three parts: the application software, the database and the web page. The application software manages multiple RFID devices, such as readers and portals, simultaneously. It communicates with the devices through the application programming interface (API) provided by the device vendor. The application software converts data collected by the RFID readers and portals to readable information. It is capable of encrypting data using the 256-bit Advanced Encryption Standard (AES). The application software has a graphical user interface (GUI). The GUI mimics the configurations of the nuclear material storage sites or transport vehicles. The GUI gives the user and system administrator an intuitive way to read the information and/or configure the devices. The application software is capable of sending the information to a remote, dedicated and secured web and database server. Two captured screen samples, one for storage and one for transport, are attached. The database is constructed to handle a large number of RFID tag readers and portals. A SQL server is employed for this purpose. An XML script is used to update the database once the information is sent from the application software. The design of the web page imitates the design of the application software. The web page retrieves data from the database and presents it in different panels. The user needs a user name combined with a password to access the web page. The web page is capable of sending e-mail and text messages based on preset criteria, such as when alarm thresholds are exceeded. A captured screen sample is attached. The application software is designed to be installed on a local computer. The local computer is directly connected to the RFID devices and can be controlled locally or remotely.
There are multiple local computers managing different sites or transport vehicles. Control from remote sites, and transmission of information to a central database server, is through a secured internet connection. The information stored in the central database server is shown on the web page, which users can view over the internet. A dedicated and secured web and database server (HTTPS) is used to provide information security.
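The XML-driven database update described above can be sketched roughly as follows; the message schema and table layout are invented for illustration, and SQLite stands in for the SQL server.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical update message; the actual XML schema used by the
# RFID system is not described in the source.
UPDATE_XML = """
<update site="storage-A">
  <tag id="TAG-0042" reader="portal-1" status="present"/>
  <tag id="TAG-0043" reader="portal-1" status="moved"/>
</update>
"""

def apply_update(conn, xml_text):
    """Parse one XML update message and insert each tag reading
    into the central readings table."""
    root = ET.fromstring(xml_text)
    site = root.get("site")
    rows = [(site, t.get("id"), t.get("reader"), t.get("status"))
            for t in root.findall("tag")]
    conn.executemany(
        "INSERT INTO readings (site, tag_id, reader, status) VALUES (?,?,?,?)",
        rows)
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE readings (site TEXT, tag_id TEXT, reader TEXT, status TEXT)")
n = apply_update(conn, UPDATE_XML)
```

A web front end would then query the same table to populate its panels and trigger alarm notifications.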
NASA Astrophysics Data System (ADS)
Sextos, Anastasios G.
2014-01-01
This paper presents the structure of an undergraduate course entitled 'programming techniques and the use of specialised software in structural engineering', which is offered to the fifth (final) year students of the Civil Engineering Department of Aristotle University Thessaloniki in Greece. The aim of this course is to demonstrate the use of new information technologies in the field of structural engineering and to teach modern programming and finite element simulation techniques that the students can in turn apply in both research and everyday design of structures. The course also focuses on the physical interpretation of structural engineering problems, so that the students become familiar with computational tools without losing sight of the engineering problem studied. For this purpose, a wide variety of structural engineering problems are studied in class, involving structural statics, dynamics, earthquake engineering, design of reinforced concrete and steel structures, as well as data and information management. The main novelty of the course is that it is taught and examined solely in the computer laboratory, ensuring that each student can accomplish the prescribed 'hands-on' training on a dedicated computer, at a strict one-student-per-computer ratio. Significant effort has also been put into utilising modern educational techniques and tools to offer the course in an essentially paperless mode. This involves electronic educational material, video tutorials, student information in real time, and exams given and assessed electronically through an ad hoc developed, personalised, electronic system. The positive feedback received from the students reveals that the concept of a paperless course is not only applicable in real academic conditions but is also a promising approach that significantly increases student productivity and engagement.
The question, however, is whether such an investment in educational technology is indeed timely during economic recession, where the academic priorities are rapidly changing. In the light of this unfavourable and unstable financial environment, a critical overview of the strengths, the weaknesses, the opportunities and the threats of this effort is presented herein, hopefully contributing to the discussion on the future of higher education in the time of crisis.
Agricultural Research Service: biodefense research.
Gay, C G
2013-01-01
The National Animal Health Program at the Agricultural Research Service (ARS), United States Department of Agriculture (USDA), includes research programs dedicated to the defense of animal agriculture against the threat of biological agents with the potential to cause significant economic harm and/or public health consequences. This article provides a summary of the program and identifies its relevance to national initiatives to protect livestock and poultry as well as global food security. An introduction to setting research priorities and a selection of research accomplishments that define the scope of the biodefense research program is provided.
The Birth of the Cosmic Frontier
Kolb, Rocky; Turner, Mike
2018-05-31
Scientists Rocky Kolb and Mike Turner recount the time they first proposed that Fermilab, dedicated to the study of the universe's smallest constituents, expand its program to include the stars, galaxies and the cosmos.
SS-Wrapper: a package of wrapper applications for similarity searches on Linux clusters.
Wang, Chunlin; Lefkowitz, Elliot J
2004-10-28
Large-scale sequence comparison is a powerful tool for biological inference in modern molecular biology. Comparing new sequences to those in annotated databases is a useful source of functional and structural information about these sequences. Using software such as the basic local alignment search tool (BLAST) or HMMPFAM to identify statistically significant matches between newly sequenced segments of genetic material and those in databases is an important task for most molecular biologists. Searching algorithms are intrinsically slow and data-intensive, especially in light of the rapid growth of biological sequence databases due to the emergence of high throughput DNA sequencing techniques. Thus, traditional bioinformatics tools are impractical on PCs and even on dedicated UNIX servers. To take advantage of larger databases and more reliable methods, high performance computation becomes necessary. We describe the implementation of SS-Wrapper (Similarity Search Wrapper), a package of wrapper applications that can parallelize similarity search applications on a Linux cluster. Our wrapper utilizes a query segmentation-search (QS-search) approach to parallelize sequence database search applications. It takes into consideration load balancing between each node on the cluster to maximize resource usage. QS-search is designed to wrap many different search tools, such as BLAST and HMMPFAM using the same interface. This implementation does not alter the original program, so newly obtained programs and program updates should be accommodated easily. Benchmark experiments using QS-search to optimize BLAST and HMMPFAM showed that QS-search accelerated the performance of these programs almost linearly in proportion to the number of CPUs used. We have also implemented a wrapper that utilizes a database segmentation approach (DS-BLAST) that provides a complementary solution for BLAST searches when the database is too large to fit into the memory of a single node. 
Used together, QS-search and DS-BLAST provide a flexible solution to adapt sequential similarity searching applications in high performance computing environments. Their ease of use and their ability to wrap a variety of database search programs provide an analytical architecture to assist both the seasoned bioinformaticist and the wet-bench biologist.
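The query-segmentation idea behind QS-search can be sketched in a few lines. This is a simplified illustration, not the package's actual code, and round-robin assignment is only one possible load-balancing rule; each chunk would then be dispatched to a cluster node running BLAST or HMMPFAM unchanged.

```python
def split_fasta(fasta_text, n_chunks):
    """Split FASTA query records round-robin into n_chunks query sets.

    Round-robin assignment balances load by record count; the real
    wrapper may use a more sophisticated strategy.
    """
    records, current = [], []
    for line in fasta_text.strip().splitlines():
        if line.startswith(">") and current:
            records.append("\n".join(current))  # close previous record
            current = []
        current.append(line)
    if current:
        records.append("\n".join(current))
    chunks = [[] for _ in range(n_chunks)]
    for i, rec in enumerate(records):
        chunks[i % n_chunks].append(rec)
    return ["\n".join(c) for c in chunks]
```

Because the search program itself is untouched, the same splitter wraps any tool that reads FASTA queries, which is the portability property the authors emphasize.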
Research on Spectroscopy, Opacity, and Atmospheres
NASA Technical Reports Server (NTRS)
Kurucz, Robert L.
1999-01-01
To make my calculations more readily accessible I have set up a web site, cfaku5.harvard.edu, that can also be accessed by FTP. It has five 9-GB disks that hold all of my atomic and diatomic molecular data, my tables of distribution-function opacities, my grids of model atmospheres, colors, fluxes, etc., my programs that are ready for distribution, and most of my recent papers. Atlases and computed spectra will be added as they are completed, as will new atomic and molecular calculations. I got my atomic programs, which had been running on a Cray at the San Diego Supercomputer Center, to run on my Vaxes and Alpha. I started with Ni and Co because there were new laboratory analyses that included isotopic and hyperfine splitting. Those calculations are described in the appended abstract for the 6th Atomic Spectroscopy and Oscillator Strengths meeting in Victoria last summer. A surprising finding is that quadrupole transitions have been grossly in error because mixing with higher levels has not been included. I now have enough memory in my Alpha to treat 3000 x 3000 matrices. I now include all levels up through n=9 for Fe I and Fe II, the spectra for which the most information is available, and I am finishing those calculations right now. After Fe I and Fe II, all other spectra are "easy", and I will be in mass production. ATLAS12, my opacity-sampling program for computing models with arbitrary abundances, has been put on the web server. I wrote a new distribution-function opacity program for workstations that replaces the one I used on the Cray at the San Diego Supercomputer Center. Each set of abundances would take 100 Cray hours costing $100,000; I ran 25 cases. Each of my opacity CDs contains three abundances. I have a new program running on the Alpha that takes about a week. I am going to have to get a faster processor or I will have to dedicate a whole workstation just to opacities.
A programming framework for data streaming on the Xeon Phi
NASA Astrophysics Data System (ADS)
Chapeland, S.;
2017-10-01
ALICE (A Large Ion Collider Experiment) is the dedicated heavy-ion detector studying the physics of strongly interacting matter and the quark-gluon plasma at the CERN LHC (Large Hadron Collider). After the second long shut-down of the LHC, the ALICE detector will be upgraded to cope with an interaction rate of 50 kHz in Pb-Pb collisions, producing in the online computing system (O2) a sustained throughput of 3.4 TB/s. This data will be processed on the fly so that the stream to permanent storage does not exceed 90 GB/s peak, the raw data being discarded. In the context of assessing different computing platforms for the O2 system, we have developed a framework for the Intel Xeon Phi processors (MIC). It provides the components to build a processing pipeline streaming the data from the PC memory to a pool of permanent threads running on the MIC, and back to the host after processing. It is based on explicit offloading mechanisms (data transfer, asynchronous tasks) and basic building blocks (FIFOs, memory pools, C++11 threads). The user only needs to implement the processing method to be run on the MIC. We present in this paper the architecture, implementation, and performance of this system.
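The FIFO/persistent-thread pipeline described above can be mimicked in a short Python sketch, standing in for the actual C++11/offload implementation; the summation "processing" step is a placeholder for the user-supplied method run on the MIC.

```python
import queue
import threading

def run_pipeline(blocks, n_workers=4):
    """Stream data blocks through a pool of persistent worker threads.

    Mirrors the host-to-accelerator design: an input FIFO feeds
    permanent workers, and results return on an output FIFO.
    """
    in_fifo, out_fifo = queue.Queue(), queue.Queue()

    def worker():
        while True:
            item = in_fifo.get()
            if item is None:          # sentinel: shut this worker down
                break
            idx, data = item
            out_fifo.put((idx, sum(data)))  # placeholder "processing"

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for idx, data in enumerate(blocks):
        in_fifo.put((idx, data))
    for _ in threads:                 # one sentinel per worker
        in_fifo.put(None)
    results = [out_fifo.get() for _ in blocks]
    for t in threads:
        t.join()
    return [v for _, v in sorted(results)]  # restore block order
```

The indices carried with each block play the role the framework's bookkeeping plays in reassembling an ordered output stream from asynchronously completed tasks.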
Neurocritical care education during neurology residency: AAN survey of US program directors.
Sheth, K N; Drogan, O; Manno, E; Geocadin, R G; Ziai, W
2012-05-29
Limited information is available regarding the current state of neurocritical care education for neurology residents. The goal of our survey was to assess the need for and current state of neurocritical care training for neurology residents. A survey instrument was developed and, with the support of the American Academy of Neurology, distributed to residency program directors of 132 accredited neurology programs in the United States in 2011. A response rate of 74% (98 of 132) was achieved. A dedicated neuroscience intensive care unit (neuro-ICU) existed in 64% of programs. Fifty-six percent of residency programs offer a dedicated rotation in the neuro-ICU, lasting 4 weeks on average. Where available, the neuro-ICU rotation was required in the vast majority (91%) of programs. Neurology residents' exposure to the fundamental principles of neurocritical care was obtained through a variety of mechanisms. Of program directors, 37% indicated that residents would be interested in performing away rotations in a neuro-ICU. From 2005 to 2010, the proportion of programs sending at least one resident into a neuro-ICU fellowship increased from 14% to 35%. Despite the expansion of neurocritical care, large proportions of US neurology residents have limited exposure to a neuro-ICU and neurointensivists. Formal training in the principles of neurocritical care may be highly variable. The results of this survey suggest a charge to address the variability of resident education and to develop standardized curricula in neurocritical care for neurology residents.
NASA Astrophysics Data System (ADS)
Berdychowski, Piotr P.; Zabolotny, Wojciech M.
2010-09-01
The main goal of the C to VHDL compiler project is to make the FPGA platform more accessible to scientists and software developers. The FPGA platform offers the unique ability to configure the hardware to implement virtually any dedicated architecture, and modern devices provide a sufficient number of hardware resources to implement parallel execution platforms with complex processing units. All this makes the FPGA platform very attractive for those looking for an efficient heterogeneous computing environment. The current industry standard for developing digital systems on the FPGA platform is based on HDLs. Although very effective and expressive in the hands of hardware development specialists, these languages require specific knowledge and experience that are out of reach for most scientists and software programmers. The C to VHDL compiler project attempts to remedy that by creating an application that derives an initial VHDL description of a digital system (for further compilation and synthesis) from a purely algorithmic description in the C programming language. This idea itself is not new, and the C to VHDL compiler combines the best approaches from existing solutions developed over many previous years, with the introduction of some new unique improvements.
NASA Technical Reports Server (NTRS)
Mohling, Robert A.; Marquardt, Eric D.; Fusilier, Fred C.; Fesmire, James E.
2003-01-01
The Cryogenic Information Center (CIC) is a not-for-profit corporation dedicated to preserving and distributing cryogenic information to government, industry, and academia. The heart of the CIC is a uniform source of cryogenic data including analyses, design, materials and processes, and test information traceable back to the Cryogenic Data Center of the former National Bureau of Standards. The electronic database is a national treasure containing over 146,000 specific bibliographic citations of cryogenic literature and thermophysical property data dating back to 1829. A new technical/bibliographic inquiry service can perform searches and technical analyses. The Cryogenic Material Properties (CMP) Program consists of computer codes using empirical equations to determine thermophysical material properties, with emphasis on the 4 to 300 K range. CMP's objective is to develop a user-friendly standard material property database using the best available data so government and industry can conduct more accurate analyses. The CIC serves to benefit researchers, engineers, and technologists in cryogenics and cryogenic engineering, whether they are new or experienced in the field.
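Empirical property codes of this kind commonly express the base-10 logarithm of a property as a polynomial in the base-10 logarithm of temperature, the form used in NIST-style cryogenic fits. A minimal evaluator, with placeholder coefficients rather than actual CMP values, might look like:

```python
import math

def cryo_property(temp_k, coeffs):
    """Evaluate a NIST-style cryogenic material-property fit:

        log10(y) = sum_i a_i * (log10 T)^i

    `coeffs` are illustrative placeholders, not values from the
    CMP database; real fits are only valid over a stated T range.
    """
    x = math.log10(temp_k)
    log_y = sum(a * x**i for i, a in enumerate(coeffs))
    return 10.0 ** log_y
```

With published coefficient sets for a given material and property (e.g. thermal conductivity of an aluminum alloy), such a one-liner reproduces the tabulated curve over the fit's validity range.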
On Stellar Flash Echoes from Circular Rings
NASA Astrophysics Data System (ADS)
Nemiroff, Robert; Mukherjee, Oindabi
2018-01-01
A flash -- or any episode of variability -- that occurs in the vicinity of a circular ring might be seen several times later, simultaneously, as echoes on the ring. Effective images of the flash are created and annihilated in pairs, with as many as four flash images visible concurrently. Videos detailing sequences of image pair creation, tandem motion, and subsequent image annihilation are shown, given simple opacity and scattering assumptions. It is proven that, surprisingly, images from a second pair creation event always annihilate with images from the first. Caustic surfaces between flash locations yielding two and four images are computed. Although such ring echoes surely occur, their practical detection might be difficult as it could require dedicated observing programs involving sensitive photometry of extended objects. Potential flash sources include planetary and interstellar gas and dust rings near and around variable stars, flare stars, novae, supernovae, and GRBs. Potentially recoverable information includes size, distance, temporal history, and angular isotropy of both the ring and flash.
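The timing of such echoes follows the standard light-echo relation dt = (r - z)/c, where r is the flash-to-scatterer distance and z the scatterer's offset toward the observer. The sketch below assumes, purely for illustration, a ring centered on the flash and a distant observer along the z-axis; the paper's geometry is more general.

```python
import math

C = 299792458.0  # speed of light, m/s

def echo_delays(radius_m, incl_rad, n_points=8):
    """Extra light-travel time for echoes from points on a circular ring
    of radius `radius_m` centered on the flash, inclined by `incl_rad`
    (0 = face-on), for a distant observer along +z.

    Each ring point at azimuth phi has line-of-sight offset
    z = R*sin(phi)*sin(i), giving delay dt = (R - z)/c.
    """
    delays = []
    for k in range(n_points):
        phi = 2.0 * math.pi * k / n_points
        z = radius_m * math.sin(phi) * math.sin(incl_rad)
        delays.append((radius_m - z) / C)
    return delays
```

For a face-on ring every point echoes at the same delay R/c, while inclination spreads the delays between roughly 0 and 2R/c, which is one way echo timing encodes ring size and orientation.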
Software architecture for intelligent image processing using Prolog
NASA Astrophysics Data System (ADS)
Jones, Andrew C.; Batchelor, Bruce G.
1994-10-01
We describe a prototype system for interactive image processing using Prolog, implemented by the first author on an Apple Macintosh computer. This system is inspired by Prolog+, but differs from it in two particularly important respects. The first is that whereas Prolog+ assumes the availability of dedicated image processing hardware, with which the Prolog system communicates, our present system implements image processing functions in software using the C programming language. The second difference is that although our present system supports Prolog+ commands, these are implemented in terms of lower-level Prolog predicates which provide a more flexible approach to image manipulation. We discuss the impact of the Apple Macintosh operating system upon the implementation of the image-processing functions, and the interface between these functions and the Prolog system. We also explain how the Prolog+ commands have been implemented. The system described in this paper is a fairly early prototype, and we outline how we intend to develop the system, a task which is expedited by the extensible architecture we have implemented.
Numerical simulation of controlled directional solidification under microgravity conditions
NASA Astrophysics Data System (ADS)
Holl, S.; Roos, D.; Wein, J.
Computer-assisted simulation of solidification processes influenced by gravity has gained importance in recent years for both ground-based and microgravity research. Depending on the specific needs of the investigator, the simulation model ideally covers a broad spectrum of applications, primarily the optimization of furnace design in interaction with selected process parameters to meet the desired crystallization conditions. Different approaches to the complexity of the simulation models, as well as their dedicated applications, will be discussed in this paper. Special emphasis will be put on the potential of software tools to increase the scientific quality and cost-efficiency of microgravity experimentation. The results gained so far in the context of the TEXUS, FSLP, D-1, and D-2 (preparatory program) experiments will be discussed, highlighting their simulation-supported preparation and evaluation. An outlook will then be given on the possibilities for enhancing the efficiency of pre-industrial research in the Columbus era through the incorporation of suitable simulation methods and tools.
NASA Technical Reports Server (NTRS)
Werthimer, D.; Tarter, J.; Bowyer, S.
1985-01-01
Serendip II is an automated system designed to perform a real-time search for narrow-band radio signals in the spectra of sources in a regularly scheduled, non-SETI astronomical observing program. Because Serendip II is expected to run continuously without requiring dedicated observing time, it is hoped that a large portion of the sky will be surveyed at high sensitivity and low cost. Serendip II will compute the power spectrum using a 65,536-channel fast Fourier transform processor with a real-time bandwidth of 128 kHz and 2 Hz per channel resolution. After searching for peaks in a 100 kHz portion of the radio telescope's IF band, Serendip II will move to the next 100 kHz portion using a programmable frequency synthesizer; when the whole IF band has been scanned, the process will start again. Unidentified peaks in the power spectra are candidates for further study, and their celestial coordinates will be recorded along with the time and power, IF and RF frequency, and bandwidth of the peak.
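The quoted figures are self-consistent; a one-line arithmetic check (assumed arithmetic only, not SERENDIP code):

```python
# A 65,536-channel FFT spanning a 128 kHz real-time bandwidth gives
# 128000 / 65536 = 1.953125 Hz per channel, quoted above as 2 Hz.
N_CHANNELS = 65_536
BANDWIDTH_HZ = 128_000.0

resolution_hz = BANDWIDTH_HZ / N_CHANNELS
print(round(resolution_hz, 2))  # -> 1.95
```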
ERIC Educational Resources Information Center
Dewhurst, Marit; Desai, Dipti
2016-01-01
Purpose: The rise of out-of-school youth arts organizations, especially those dedicated to addressing social issues with young people, suggests a growing need for spaces in which we prepare young people to creatively and critically shape their communities. While the popularity of these programs is certainly positive, it does little to tell us what…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nutini, Irene
2017-09-20
A short overview of the Liquid Argon In A Testbeam (LArIAT) experiment hosted at Fermilab is reported. This program supports the Liquid Argon Time Projection Chamber (LArTPC) Neutrino Experiments at Fermilab. The LArIAT program consists of a calibration of a LArTPC in a dedicated charged particle beamline. The first total pion interaction cross section measurement ever made on argon is presented here (preliminary result).
Chinas Future SSBN Command and Control Structure
2016-11-01
China’s ongoing modernization program is transforming the country’s nuclear arsenal from one consisting of a few...also be mediated by bureaucratic politics, including the time-honored tradition of interservice rivalry. The emergent SSBN fleet may represent a prime...represent a reliable source of resources. China has dedicated substantial resources to undertaking a nuclear modernization program designed to ensure
Trombetti, A; Hars, M; Herrmann, F; Rizzoli, R; Ferrari, S
2013-03-01
This controlled intervention study in hospitalized oldest old adults showed that a multifactorial fall-and-fracture risk assessment and management program, applied in a dedicated geriatric hospital unit, was effective in improving fall-related physical and functional performances and the level of independence in activities of daily living in high-risk patients. Hospitalization affords a major opportunity for interdisciplinary cooperation to manage fall-and-fracture risk factors in older adults. This study aimed at assessing the effects on physical performances and the level of independence in activities of daily living (ADL) of a multifactorial fall-and-fracture risk assessment and management program applied in a geriatric hospital setting. A controlled intervention study was conducted among 122 geriatric inpatients (mean ± SD age, 84 ± 7 years) admitted with a fall-related diagnosis. Among them, 92 were admitted to a dedicated unit and enrolled into a multifactorial intervention program, including intensive targeted exercise. Thirty patients who received standard usual care in a general geriatric unit formed the control group. Primary outcomes included gait and balance performances and the level of independence in ADL measured 12 ± 6 days apart. Secondary outcomes included length of stay, incidence of in-hospital falls, hospital readmission, and mortality rates. Compared to the usual care group, the intervention group had significant improvements in Timed Up and Go (adjusted mean difference [AMD] = -3.7 s; 95 % CI = -6.8 to -0.7; P = 0.017), Tinetti (AMD = -1.4; 95 % CI = -2.1 to -0.8; P < 0.001), and Functional Independence Measure (AMD = 6.5; 95 % CI = 0.7 to 12.3; P = 0.027) test performances, as well as in several gait parameters (P < 0.05). Furthermore, this program favorably impacted adverse outcomes, including hospital readmission (hazard ratio = 0.3; 95 % CI = 0.1 to 0.9; P = 0.02).
A multifactorial fall-and-fracture risk-based intervention program, applied in a dedicated geriatric hospital unit, was effective and more beneficial than usual care in improving physical parameters related to the risk of fall and disability among high-risk oldest old patients.
A fast algorithm for computer aided collimation gamma camera (CACAO)
NASA Astrophysics Data System (ADS)
Jeanguillaume, C.; Begot, S.; Quartuccio, M.; Douiri, A.; Franck, D.; Pihet, P.; Ballongue, P.
2000-08-01
The computer-aided collimation gamma camera is aimed at breaking the resolution-sensitivity trade-off of the conventional parallel-hole collimator. It uses larger and longer holes, with an added linear movement during the acquisition sequence. A dedicated algorithm including shift and sum, deconvolution, parabolic filtering, and rotation is described. Examples of reconstruction are given. This work shows that a simple and fast algorithm, based on a diagonally dominant approximation of the problem, can be derived. It gives a practical solution to the CACAO reconstruction problem.
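The shift-and-sum step named above can be sketched as follows; the function name and data layout are illustrative assumptions, not the CACAO implementation:

```python
def shift_and_sum(projections, shifts):
    """Realign projections acquired at successive collimator offsets.

    projections: list of equal-length lists of counts.
    shifts: per-projection acquisition offset, in bins.
    Each projection is shifted back by its offset and accumulated,
    so counts from the same source position land in the same bin.
    """
    n = len(projections[0])
    out = [0.0] * n
    for proj, s in zip(projections, shifts):
        for i, v in enumerate(proj):
            j = i - s                 # undo the acquisition shift
            if 0 <= j < n:            # discard counts shifted off the frame
                out[j] += v
    return out

# Two projections of the same impulse, recorded at shifts 0 and 1,
# realign onto a single bin:
result = shift_and_sum([[0, 1, 0, 0], [0, 0, 1, 0]], [0, 1])
```

In the full algorithm this accumulated image would then feed the deconvolution, parabolic-filtering, and rotation stages the abstract lists.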
News on Seeking Gaia's Astrometric Core Solution with AGIS
NASA Astrophysics Data System (ADS)
Lammers, U.; Lindegren, L.
We report on recent new developments around the Astrometric Global Iterative Solution (AGIS) system. These include the availability of an efficient conjugate-gradient solver and the Generic Astrometric Calibration scheme proposed some time ago. The number of primary stars to be included in the core solution is now believed to be significantly higher than the 100 million that served as the baseline until now. Cloud computing services are being studied as a possible cost-effective alternative to running AGIS on dedicated computing hardware at ESAC during the operational phase.
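The conjugate-gradient method mentioned here is the textbook iterative solver for large symmetric positive-definite systems such as astrometric normal equations; a minimal sketch on a tiny dense system (ESA's actual AGIS solver is far more elaborate and operates on sparse data):

```python
def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
    """Textbook conjugate gradient for a symmetric positive-definite A."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                          # residual b - A x, with x = 0 initially
    p = r[:]                          # first search direction
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:              # residual small enough: converged
            break
        p = [r[i] + (rs_new / rs) * p[i] for i in range(n)]
        rs = rs_new
    return x

# Solve 4x + y = 1, x + 3y = 2; exact answer x = 1/11, y = 7/11.
solution = conjugate_gradient([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0])
```

Its appeal for problems at AGIS scale is that each iteration needs only matrix-vector products, never a factorization of the full system.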
Automated method for structural segmentation of nasal airways based on cone beam computed tomography
NASA Astrophysics Data System (ADS)
Tymkovych, Maksym Yu.; Avrunin, Oleg G.; Paliy, Victor G.; Filzow, Maksim; Gryshkov, Oleksandr; Glasmacher, Birgit; Omiotek, Zbigniew; DzierŻak, RóŻa; Smailova, Saule; Kozbekova, Ainur
2017-08-01
The work is dedicated to the problem of segmenting human nasal airways from cone beam computed tomography. We propose a specialized approach to the structural segmentation of nasal airways that uses spatial information and symmetrization of the structures. The proposed stages can be used to construct a virtual three-dimensional model of the nasal airways and to produce full-scale personalized atlases. We built such a virtual model of the nasal airways, which can be used to construct specialized medical atlases and for aerodynamics research.
The Merits of the Continued Instruction of ADA as a First Language at the Naval Postgraduate School
1994-09-01
H., Ada as a Second Language, McGraw-Hill Book Company, 1986. Deitel, H.M. and Deitel, P.J., C++ How to Program, Prentice Hall, 1994. Fastrack...complex language whose success is dependent on its users knowing how to utilize all of its features to yield good programs. (Fastrack) The addition of...Ada in particular as the first programming language at the Naval Postgraduate School. The catch-22 of industry’s dedication to C++ and the Department
2011-08-13
CAPE CANAVERAL, Fla. -- NASA’s Space Shuttle Program Launch Integration Manager Mike Moses speaks to current and former space shuttle workers and their families during the “We Made History! Shuttle Program Celebration,” Aug. 13, at the Kennedy Space Center Visitor Complex, Fla. The event was held to honor shuttle workers’ dedication to the agency’s Space Shuttle Program and to celebrate 30 years of space shuttle achievements. The event featured food, music, entertainment, astronaut appearances, educational activities, giveaways, and Starfire Night Skyshow. Photo credit: Gianni Woods
Data Mining and Knowledge Discovery - IBM Cognitive Alternatives for NASA KSC
NASA Technical Reports Server (NTRS)
Velez, Victor Hugo
2016-01-01
Skillful tools in cognitive computing to transform industries have been found favorable and profitable for different directorates at NASA KSC. This study shows how cognitive computing systems can be useful for NASA when computers are trained, much as humans are, to gain knowledge over time. The applications created by IBM empower the artificial intelligence in a cognitive computing system by increasing knowledge through senses, learning, and the accumulation of events. NASA has explored and applied artificial intelligence, specifically cognitive computing, in a few projects over recent decades, adopting models similar to those proposed for IBM Watson. However, the semantic technologies used by IBM's dedicated business unit allow these cognitive computing applications to outperform in-house tools and to deliver analyses that facilitate decision making for managers and leads in a management information system.
Data communications in a parallel active messaging interface of a parallel computer
Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E
2013-11-12
Data communications in a parallel active messaging interface (`PAMI`) of a parallel computer composed of compute nodes that execute a parallel application, each compute node including application processors that execute the parallel application and at least one management processor dedicated to gathering information regarding data communications. The PAMI is composed of data communications endpoints, each endpoint composed of a specification of data communications parameters for a thread of execution on a compute node, including specifications of a client, a context, and a task, the compute nodes and the endpoints coupled for data communications through the PAMI and through data communications resources. Embodiments function by gathering call site statistics describing data communications resulting from execution of data communications instructions and identifying in dependence upon the call site statistics a data communications algorithm for use in executing a data communications instruction at a call site in the parallel application.
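The endpoint triple and call-site statistics described in the abstract can be sketched roughly as follows; all field names, the thresholds, and the toy eager/rendezvous policy are illustrative assumptions, not the patented implementation:

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass(frozen=True)
class Endpoint:
    """An endpoint bundles the client, context, and task the abstract names."""
    client: str    # application entity owning the endpoint
    context: int   # threading/progress context on the compute node
    task: int      # rank-like identity within the parallel job

@dataclass
class CallSiteStats:
    """Per-call-site counters used to pick a communications algorithm."""
    calls: int = 0
    bytes_sent: int = 0

stats = defaultdict(CallSiteStats)

def record_send(call_site: str, nbytes: int) -> None:
    """Gather statistics for one data-communications call at a call site."""
    s = stats[call_site]
    s.calls += 1
    s.bytes_sent += nbytes

def choose_algorithm(call_site: str) -> str:
    """Toy policy: small average messages favor an eager protocol,
    large ones a rendezvous protocol (illustrative threshold)."""
    s = stats[call_site]
    avg = s.bytes_sent / s.calls if s.calls else 0
    return "eager" if avg < 4096 else "rendezvous"
```

The point of the patent's scheme is exactly this feedback loop: statistics gathered at a call site steer which algorithm later executions of that same call site use.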
ERIC Educational Resources Information Center
Carrington, Michal; Chen, Richard; Davies, Martin; Kaur, Jagjit; Neville, Benjamin
2011-01-01
An argument map visually represents the structure of an argument, outlining its informal logical connections and informing judgments as to its worthiness. Argument mapping can be augmented with dedicated software that aids the mapping process. Empirical evidence suggests that semester-length subjects using argument mapping along with dedicated…
Cooley building opens in Houston. Demonstrates value of fully integrated marketing communications.
Rees, Tom
2002-01-01
The Texas Heart Institute at St. Luke's Episcopal Hospital in Houston dedicated its new 10-story Denton A. Cooley Building in January. The structure opened with a fanfare, thanks to a well-integrated marketing communications program.
36 CFR 59.3 - Conversion requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... AND WATER CONSERVATION FUND PROGRAM OF ASSISTANCE TO STATES; POST-COMPLETION COMPLIANCE... not been dedicated or managed for recreation/conservation use may be used as replacement land even if... proposed conversion and substitution constitute significant changes to the original Land and Water...